proc
Unlock Deno's secret AsyncIterable superpowers!
A simpler, saner alternative to JavaScript streams. Built on async iterables, a more standard JavaScript primitive, proc eliminates backpressure problems, produces cleaner code, and is easier to work with. Run processes, transform data between formats, and use Array methods on async iterables.
Full Documentation | Quick Start | Performance Guide
import { enumerate, read, run } from "jsr:@j50n/proc";
import { fromCsvToRows, toTsv } from "jsr:@j50n/proc/transforms";
// Run processes and capture output
const lines = await run("ls", "-la").lines.collect();
// Chain processes like a shell pipeline
const result = await run("cat", "data.txt")
.run("grep", "error")
.run("wc", "-l")
.lines.first;
// Transform data between formats
await read("sales.csv")
.transform(fromCsvToRows())
.filter((row) => parseFloat(row[3]) > 1000)
.transform(toTsv())
.writeTo("high-value.tsv");
// Work with async iterables using familiar Array methods
const commits = await run("git", "log", "--oneline")
.lines
.map((line) => line.trim())
.filter((line) => line.includes("fix"))
.take(5)
.collect();
// Errors propagate naturally - handle once at the end
try {
await run("npm", "test")
.lines
.filter((line) => line.includes("FAIL"))
.toStdout();
} catch (error) {
console.error(`Tests failed: ${error.code}`);
}
Why proc?
Simpler than streams – Async iterables are a more standard JavaScript primitive than streams, and pull-based iteration is easier to reason about than push-based streams. No complex coordination, no buffering logic, no backpressure headaches.
Backpressure solved – Traditional streams require careful coordination between producers and consumers. Async iterators eliminate this entirely: the consumer pulls when ready. No memory pressure, no dropped data, no complexity.
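To see the pull model concretely, here is a minimal sketch using a plain async generator (standard JavaScript, no proc APIs): the producer suspends at each yield until the consumer requests the next value, so a slow consumer never forces buffering.
// The producer pauses at each `yield` until `for await` pulls again.
async function* produce() {
  for (let i = 0; i < 3; i++) {
    yield i;
  }
}
for await (const n of produce()) {
  // Simulate a slow consumer; nothing piles up upstream.
  await new Promise((resolve) => setTimeout(resolve, 100));
  console.log(n);
}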
Cleaner, more intuitive code – Use map, filter, reduce, flatMap, take, drop, and more, just like Arrays. Errors propagate naturally through pipelines. One try-catch at the end handles everything.
WASM-powered data transforms – Convert between CSV, TSV, JSON, and Record formats with WebAssembly-accelerated parsing. For maximum throughput, use the flatdata CLI for multi-process streaming.
Powerful process management – Run commands, pipe between processes, capture output, and control execution with a clean, composable API. Shell-like pipelines with proper error handling.
Type-safe and ergonomic – Full TypeScript support with intuitive APIs that guide you toward correct usage.
Features at a Glance
Process Management
- Run commands – run(), pipe(), result(), toStdout()
- Chain processes – Shell-like pipelines with .run()
- Capture output – Lines, bytes, or full output
- Error handling – Natural propagation through pipelines
Async Iterables
- Array-like methods – map, filter, reduce, flatMap, forEach, some, every, find
- Slicing & sampling – take, drop, slice, first, last, nth
- Concurrent operations – concurrentMap, concurrentUnorderedMap with concurrency control
- Utilities – enumerate, zip, range, cache()
Data Transforms
- Format conversion – CSV ↔ TSV ↔ JSON ↔ Record
- Streaming processing – Constant memory usage for any file size
- LazyRow optimization – Faster parsing with binary backing
- flatdata CLI – WASM-powered tool for multi-process streaming
Installation
import * as proc from "jsr:@j50n/proc";
Or import specific functions:
import { enumerate, read, run } from "jsr:@j50n/proc";
Data Transforms (Optional)
Data transforms are in a separate module to keep the core library lightweight:
⚠️ Experimental (v0.24.0+): Data transforms are under active development. The API may change as we improve correctness and streaming performance.
// Core library - process management and async iterables
import { enumerate, read, run } from "jsr:@j50n/proc";
// Data transforms - CSV, TSV, JSON, Record conversions
import { fromCsvToRows, toTsv } from "jsr:@j50n/proc/transforms";
See the Data Transforms Guide for details.
Key Concepts
Properties vs Methods: Some APIs are properties (.lines, .status, .first) and some are methods (.collect(), .map(), .filter()). Properties don't use parentheses.
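A quick illustration using run from the examples below (ls is just a placeholder command):
import { run } from "jsr:@j50n/proc";
// .lines and .first are properties - no parentheses.
const firstLine = await run("ls").lines.first;
// .map() and .collect() are methods - called with parentheses.
const upper = await run("ls").lines.map((l) => l.toUpperCase()).collect();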
Resource Management: Always consume process output via .lines.collect(),
.lines.forEach(), or similar. Unconsumed output causes resource leaks.
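A minimal sketch of the rule (the command here is arbitrary): drain the output even when you don't need it.
import { run } from "jsr:@j50n/proc";
// Draining stdout with .forEach() lets the process finish cleanly,
// even though each line is ignored.
await run("echo", "done").lines.forEach(() => {});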
Error Handling: Processes that exit with non-zero codes throw
ExitCodeError when you consume their output. Use try-catch to handle failures.
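For instance, a sketch of catching a failing command (false always exits with code 1; the error.code field follows the examples elsewhere in this README):
import { run } from "jsr:@j50n/proc";
try {
  // The error is thrown when the output is consumed, not when run() is called.
  await run("false").lines.collect();
} catch (error) {
  console.error(`exited with code ${error.code}`);
}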
Enumeration: enumerate() wraps iterables but doesn't add indices. Call .enum() on the result to get [item, index] tuples.
Quick Examples
Stream and process large compressed files
import { read } from "jsr:@j50n/proc";
// Read, decompress, and count lines - all streaming, no temp files!
const lineCount = await read("war-and-peace.txt.gz")
.transform(new DecompressionStream("gzip"))
.lines
.count();
console.log(`${lineCount} lines`); // 23,166 lines
Transform data between formats
import { read } from "jsr:@j50n/proc";
import { fromCsvToRows, toJson } from "jsr:@j50n/proc/transforms";
// Convert CSV to JSON Lines with filtering
await read("sales.csv")
.transform(fromCsvToRows())
.filter((row) => parseFloat(row[3]) > 1000)
.map((row) => ({
id: row[0],
customer: row[1],
amount: parseFloat(row[3]),
}))
.transform(toJson())
  .writeTo("high-value.jsonl");
Run a command and capture output
import { run } from "jsr:@j50n/proc";
const result = await run("git", "rev-parse", "HEAD").lines.first;
console.log(`Current commit: ${result?.trim()}`);
Handle errors gracefully
import { run } from "jsr:@j50n/proc";
try {
// Errors propagate through the entire pipeline
// No need for error handling at each step
await run("npm", "test")
.lines
.map((line) => line.toUpperCase())
.filter((line) => line.includes("FAIL"))
.toStdout();
} catch (error) {
// Handle all errors in one place
if (error.code) {
console.error(`Tests failed with code ${error.code}`);
}
}
Transform async iterables
import { enumerate } from "jsr:@j50n/proc";
const data = ["apple", "banana", "cherry"];
const numbered = await enumerate(data)
.enum()
.map(([fruit, i]) => `${i + 1}. ${fruit}`)
.collect();
console.log(numbered); // ["1. apple", "2. banana", "3. cherry"]
Process large files efficiently
import { read } from "jsr:@j50n/proc";
const errorCount = await read("app.log")
.lines
.filter((line) => line.includes("ERROR"))
.reduce((count) => count + 1, 0);
console.log(`Found ${errorCount} errors`);
Parallel processing with concurrency control
import { enumerate } from "jsr:@j50n/proc";
const urls = ["url1", "url2", "url3" /* ... */];
await enumerate(urls)
.concurrentMap(async (url) => {
const response = await fetch(url);
return { url, status: response.status };
}, { concurrency: 5 })
  .forEach((result) => console.log(result));
Features
- Process execution – run(), pipe(), result(), toStdout()
- Array-like methods – map, filter, reduce, flatMap, forEach, some, every, find
- Slicing & sampling – take, drop, slice, first, last, nth
- Concurrent operations – concurrentMap, concurrentUnorderedMap with concurrency control
- Data transforms – CSV, TSV, JSON, Record format conversions with streaming
- Utilities – enumerate, zip, range, read (for files)
- Caching – cache() to replay iterables
- Writable iterables – Push values into async iterables programmatically (see the concept sketch after this list)
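As a concept sketch of that last item, here is a small push-queue in plain TypeScript: values pushed on one side come out as an async iterable on the other. This is illustrative only, not proc's actual writable-iterable API; see the API reference for the real interface.
// Illustrative only - not proc's API. A queue you push into
// programmatically and consume as an AsyncIterable.
class PushQueue<T> implements AsyncIterable<T> {
  private buffer: T[] = [];
  private waiters: ((r: IteratorResult<T>) => void)[] = [];
  private closed = false;
  push(value: T): void {
    const waiter = this.waiters.shift();
    if (waiter) waiter({ value, done: false });
    else this.buffer.push(value);
  }
  close(): void {
    this.closed = true;
    for (const waiter of this.waiters.splice(0)) {
      waiter({ value: undefined, done: true });
    }
  }
  [Symbol.asyncIterator](): AsyncIterator<T> {
    return {
      next: (): Promise<IteratorResult<T>> => {
        if (this.buffer.length > 0) {
          return Promise.resolve({ value: this.buffer.shift()!, done: false });
        }
        if (this.closed) {
          return Promise.resolve({ value: undefined, done: true });
        }
        return new Promise((resolve) => this.waiters.push(resolve));
      },
    };
  }
}
// Usage: push values in, then iterate them out.
const queue = new PushQueue<number>();
queue.push(1);
queue.push(2);
queue.close();
for await (const n of queue) console.log(n); // 1, then 2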
Documentation
- Getting Started – Your first proc script in 5 minutes
- Process Management – Run commands, chain pipelines, handle errors
- Async Iterables – Array-like methods for streaming data
- Data Transforms – CSV, TSV, JSON, Record conversions
- Performance Guide – Benchmarks and optimization tips
- Recipes – Copy-paste solutions for common tasks
- API Reference – Complete API documentation
Contributing
Contributions are welcome! See the contributor guide for details on:
- Project architecture
- Coding standards
- Testing strategy
- Documentation guidelines
License
MIT
Building Documentation
The WASM book (labs/wasm/docs/) generates HTML, EPUB, and PDF outputs.
Prerequisites (Debian/Ubuntu)
# Rust toolchain
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
# mdbook
cargo install mdbook
# Document generation tools
sudo apt install pandoc weasyprint ghostscript imagemagick
# WebAssembly Binary Toolkit (for WASM verification)
sudo apt install wabt
Build
./build-site.sh
Outputs:
- labs/wasm/docs/book/ – HTML
- labs/wasm/docs/book/book.epub – EPUB with cover
- labs/wasm/docs/book/book.pdf – PDF with cover