# proc
Run child processes and work with async iterables in Deno—with the fluent Array API you already know.
```typescript
import * as proc from "jsr:@j50n/proc";

// Run processes and capture output
const lines = await proc.run("ls", "-la").lines.collect();

// Chain processes like a shell pipeline
const result = await proc.run("cat", "data.txt")
  .run("grep", "error")
  .run("wc", "-l")
  .lines.first;

// Work with async iterables using familiar Array methods
const commits = await proc.run("git", "log", "--oneline")
  .lines
  .map((line) => line.trim())
  .filter((line) => line.includes("fix"))
  .take(5)
  .collect();

// Errors propagate naturally - handle once at the end
try {
  await proc.run("npm", "test")
    .lines
    .map((line) => line.toUpperCase())
    .filter((line) => line.includes("FAIL"))
    .forEach((line) => console.log(line));
} catch (error) {
  console.error(`Tests failed: ${error.code}`);
}
```

## Why proc?
- **Errors that just work** — Errors propagate through pipelines naturally, just like data. No edge cases, no separate error channels, no callbacks. One try-catch at the end handles everything. JavaScript streaming is fast, but error handling shouldn't break your brain.
- **Powerful process management** — Run commands, pipe between processes, capture output, and control execution with a clean, composable API.
- **Async iterables that feel like Arrays** — Use `map`, `filter`, `reduce`, `flatMap`, `take`, `drop`, and more on any async iterable. No more wrestling with streams.
- **Type-safe and ergonomic** — Full TypeScript support with intuitive APIs that guide you toward correct usage.
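To illustrate what "Array methods over async iterables" means in practice, here is a deliberately tiny, self-contained sketch of the chaining idea, built from plain async generators. The `Chain` class and `lines()` generator are hypothetical stand-ins for illustration only; proc's real `Enumerable` is far richer than this.

```typescript
// Illustrative sketch only - NOT proc's implementation. Shows how Array-style
// methods can chain lazily over an async iterable.
class Chain<T> {
  constructor(private src: AsyncIterable<T>) {}

  map<U>(fn: (t: T) => U): Chain<U> {
    const src = this.src;
    return new Chain<U>((async function* () {
      for await (const t of src) yield fn(t);
    })());
  }

  filter(fn: (t: T) => boolean): Chain<T> {
    const src = this.src;
    return new Chain<T>((async function* () {
      for await (const t of src) if (fn(t)) yield t;
    })());
  }

  async collect(): Promise<T[]> {
    const out: T[] = [];
    for await (const t of this.src) out.push(t);
    return out;
  }
}

// A toy async source standing in for process output.
async function* lines(): AsyncIterableIterator<string> {
  yield "ok: started";
  yield "error: disk full";
  yield "ok: retrying";
}

const errors = await new Chain(lines())
  .filter((l) => l.startsWith("error"))
  .map((l) => l.toUpperCase())
  .collect();
console.log(errors); // ["ERROR: DISK FULL"]
```

Each `map`/`filter` step wraps the previous iterable lazily, so nothing runs until `collect()` consumes the chain.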
## Key Concepts
- **Properties vs Methods:** Some APIs are properties (`.lines`, `.status`, `.first`) and some are methods (`.collect()`, `.map()`, `.filter()`). Properties don't use parentheses.
- **Resource Management:** Always consume process output via `.lines.collect()`, `.lines.forEach()`, or similar. Unconsumed output causes resource leaks.
- **Error Handling:** Processes that exit with non-zero codes throw `ExitCodeError` when you consume their output. Use try-catch to handle failures.
- **Enumeration:** `enumerate()` wraps iterables but doesn't add indices. Call `.enum()` on the result to get `[item, index]` tuples.
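The `[item, index]` tuple shape that `.enum()` produces can be sketched with a plain async generator. The `enumPairs` helper below is a hypothetical stand-in used only to show the tuple shape, not the library's code.

```typescript
// Illustrative sketch: yields [item, index] tuples, the same shape that
// proc's .enum() produces. Not the library's implementation.
async function* enumPairs<T>(
  items: Iterable<T> | AsyncIterable<T>,
): AsyncIterableIterator<[T, number]> {
  let i = 0;
  for await (const item of items) {
    yield [item, i++];
  }
}

const pairs: [string, number][] = [];
for await (const pair of enumPairs(["a", "b", "c"])) {
  pairs.push(pair);
}
console.log(pairs); // [["a", 0], ["b", 1], ["c", 2]]
```

Note the ordering: the item comes first and the index second, which is why the Quick Examples below destructure as `([fruit, i])`.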
## Quick Examples
### Stream and process large compressed files
```typescript
import { read } from "jsr:@j50n/proc";

// Read, decompress, and count lines - all streaming, no temp files!
const lineCount = await read("war-and-peace.txt.gz")
  .transform(new DecompressionStream("gzip"))
  .lines
  .count();

console.log(`${lineCount} lines`); // 23,166 lines
```

### Run a command and capture output
```typescript
const result = await proc.run("git", "rev-parse", "HEAD").lines.first;

console.log(`Current commit: ${result?.trim()}`);
```

### Handle errors gracefully
```typescript
try {
  // Errors propagate through the entire pipeline -
  // no need for error handling at each step.
  await proc.run("npm", "test")
    .lines
    .map((line) => line.toUpperCase())
    .filter((line) => line.includes("FAIL"))
    .forEach((line) => console.log(line));
} catch (error) {
  // Handle all errors in one place
  if (error.code) {
    console.error(`Tests failed with code ${error.code}`);
  }
}
```

### Transform async iterables
```typescript
import { enumerate } from "jsr:@j50n/proc";

const data = ["apple", "banana", "cherry"];

const numbered = await enumerate(data)
  .enum()
  .map(([fruit, i]) => `${i + 1}. ${fruit}`)
  .collect();

console.log(numbered); // ["1. apple", "2. banana", "3. cherry"]
```

### Process large files efficiently
```typescript
import { read } from "jsr:@j50n/proc";

const errorCount = await read("app.log")
  .lines
  .filter((line) => line.includes("ERROR"))
  .reduce((count) => count + 1, 0);

console.log(`Found ${errorCount} errors`);
```

### Parallel processing with concurrency control
```typescript
import { enumerate } from "jsr:@j50n/proc";

const urls = ["url1", "url2", "url3", /* ... */];

await enumerate(urls)
  .concurrentMap(async (url) => {
    const response = await fetch(url);
    return { url, status: response.status };
  }, { concurrency: 5 })
  .forEach((result) => console.log(result));
```

## Features
- **Process execution** — `run()`, `pipe()`, `result()`, `toStdout()`
- **Array-like methods** — `map`, `filter`, `reduce`, `flatMap`, `forEach`, `some`, `every`, `find`
- **Slicing & sampling** — `take`, `drop`, `slice`, `first`, `last`, `nth`
- **Concurrent operations** — `concurrentMap`, `concurrentUnorderedMap` with concurrency control
- **Utilities** — `enumerate`, `zip`, `range`, `read` (for files)
- **Caching** — `cache()` to replay iterables
- **Writable iterables** — Push values into async iterables programmatically
## Documentation

Full documentation and API reference: https://j50n.github.io/deno-proc/
## Installation

```typescript
import * as proc from "jsr:@j50n/proc";
```

Or import specific functions:

```typescript
import { run, enumerate, read } from "jsr:@j50n/proc";
```

## License
MIT