Connecting clients
A client is the application that calls your inferlet. Three official client libraries cover the common cases: Python (pie-client), JavaScript (pie-client), and Rust (the pie-client crate). This page is organized by task: connect, launch, stream, and close. The full API surface lives in the per-language references. Read this after Run a server.
Install
- Rust
- Python
- JavaScript
[dependencies]
pie-client = "*"
tokio = { version = "1", features = ["full"] }
futures = "0.3"
serde_json = "1"
anyhow = "1"
pip install pie-client
npm install pie-client
Connect and authenticate
- Rust
- Python
- JavaScript
use pie_client::{Client, ParsedPrivateKey};
#[tokio::main]
async fn main() -> anyhow::Result<()> {
let client = Client::connect("ws://127.0.0.1:8080").await?;
// Public-key auth (Ed25519 / RSA / ECDSA).
let key = ParsedPrivateKey::from_file("~/.ssh/id_ed25519")?;
client.authenticate("alice", &Some(key)).await?;
// ... use the client ...
client.close().await?;
Ok(())
}
import asyncio
from pie_client import PieClient, ParsedPrivateKey
async def main():
async with PieClient("ws://127.0.0.1:8080") as client:
key = ParsedPrivateKey.from_file("~/.ssh/id_ed25519")
await client.authenticate("alice", key)
# ... use the client ...
asyncio.run(main())
import { PieClient } from 'pie-client';
const client = new PieClient('ws://127.0.0.1:8080');
await client.connect();
// JS client today only supports token auth.
await client.authByToken(process.env.PIE_TOKEN!);
// ... use the client ...
await client.close();
If the engine is running with --no-auth, the authenticate call is a no-op in Python and Rust; in JavaScript you skip the auth call entirely.
Launch a process
launch_process(name@version, input) starts an inferlet. The returned handle streams events.
- Rust
- Python
- JavaScript
let input = serde_json::json!({"question": "What is 2+2?"}).to_string();
let mut process = client
.launch_process(
"research-agent@0.1.0".into(),
input,
true, // capture stdout/stderr
None, // token budget (None = unlimited)
)
.await?;
process = await client.launch_process(
"research-agent@0.1.0",
input={"question": "What is 2+2?"},
)
const proc = await client.launchProcess('research-agent@0.1.0', {
question: 'What is 2+2?',
});
In Rust you serialize input to a JSON string yourself; the Python and JavaScript clients serialize the dict/object for you. Either way, input becomes the inferlet's typed Input struct. The launch resolves once the engine has confirmed the inferlet started; events stream after that.
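The name@version identifier is simple to take apart if you need the pieces separately, for example for logging. A stdlib-only sketch; the real clients do their own parsing and validation:

```python
def parse_program_id(program_id: str) -> tuple[str, str]:
    """Split an identifier like "research-agent@0.1.0" into (name, version)."""
    name, sep, version = program_id.partition("@")
    if not (sep and name and version):
        raise ValueError(f"expected name@version, got {program_id!r}")
    return name, version
```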
Stream events
The handle yields events one by one. Loop until Return (success) or Error (failure).
- Rust
- Python
- JavaScript
use pie_client::ProcessEvent;
loop {
match process.recv().await? {
ProcessEvent::Stdout(s) => print!("{s}"),
ProcessEvent::Stderr(s) => eprint!("{s}"),
ProcessEvent::Message(m) => println!("[message] {m}"),
ProcessEvent::File(b) => save_blob(&b),
ProcessEvent::Return(v) => {
let answer: String = serde_json::from_str(&v)?;
println!("{answer}");
break;
}
ProcessEvent::Error(e) => anyhow::bail!("{e}"),
}
}
import json
import sys

from pie_client import Event
while True:
event, value = await process.recv()
if event == Event.Stdout:
print(value, end="", flush=True)
elif event == Event.Stderr:
print(value, end="", flush=True, file=sys.stderr)
elif event == Event.Message:
handle_message(value)
elif event == Event.File:
save_blob(value)
elif event == Event.Return:
answer = json.loads(value)
print(answer)
break
elif event == Event.Error:
raise RuntimeError(value)
while (true) {
const { event, value } = await proc.recv();
if (event === 'stdout') process.stdout.write(value);
else if (event === 'stderr') process.stderr.write(value);
else if (event === 'message') handleMessage(value);
else if (event === 'file') saveBlob(value);
else if (event === 'return') {
const answer = JSON.parse(value);
console.log(answer);
break;
} else if (event === 'error') {
throw new Error(value);
}
}
Events arrive in the order the inferlet emitted them, and that ordering is preserved across stdout, stderr, and messages.
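If the if/elif chain grows unwieldy, a dispatch table keeps one handler per event kind. This is only a sketch: event names are plain strings here (the Python client exposes an Event enum), and the stream is faked with a list:

```python
def drain(events, handlers):
    # Route each (event, value) pair to its handler; terminal events end the loop.
    for event, value in events:
        handlers[event](value)
        if event in ("return", "error"):
            break

out = []
drain(
    [("stdout", "thinking...\n"), ("return", '"4"')],
    {"stdout": out.append, "return": lambda v: out.append(("done", v))},
)
```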
Send signals and files
The client side mirrors the inferlet's session surface.
- Rust
- Python
- JavaScript
process.signal("stop").await?;
process.transfer_file(&pdf_bytes).await?;
await process.signal("stop")
await process.transfer_file(pdf_bytes)
await proc.signal('stop');
await proc.transferFile(pdfBytes);
The inferlet receives signals via session.receive_signal() and files via session.receive_file().
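A common use of signals is a watchdog that requests a stop after a deadline. This sketch substitutes a fake handle with the same async signal shape as the real process handle, so it runs without an engine:

```python
import asyncio

class FakeProcess:
    # Minimal stand-in: records signals instead of talking to an engine.
    def __init__(self):
        self.signals = []

    async def signal(self, name):
        self.signals.append(name)

async def stop_after(proc, seconds):
    # Give the inferlet a bounded head start, then ask it to stop.
    await asyncio.sleep(seconds)
    await proc.signal("stop")

proc = FakeProcess()
asyncio.run(stop_after(proc, 0.01))
```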
Concurrent calls on one connection
A single connection multiplexes many processes: launch several, then drain their event streams concurrently.
- Rust
- Python
- JavaScript
let mut handles = Vec::new();
for q in questions {
let input = serde_json::json!({"question": q}).to_string();
let p = client.launch_process(
"research-agent@0.1.0".into(), input, true, None,
).await?;
handles.push(p);
}
// Drain concurrently with join_all (from the futures crate).
let answers = futures::future::join_all(
handles.into_iter().map(|mut p| async move {
loop {
match p.recv().await? {
ProcessEvent::Return(v) => return Ok::<_, anyhow::Error>(v),
ProcessEvent::Error(e) => anyhow::bail!("{e}"),
_ => {}
}
}
})
).await;
async def run_one(q):
proc = await client.launch_process(
"research-agent@0.1.0", input={"question": q},
)
while True:
event, value = await proc.recv()
if event == Event.Return:
return json.loads(value)
if event == Event.Error:
raise RuntimeError(value)
answers = await asyncio.gather(*(run_one(q) for q in questions))
async function runOne(q: string) {
const proc = await client.launchProcess('research-agent@0.1.0', { question: q });
while (true) {
const { event, value } = await proc.recv();
if (event === 'return') return JSON.parse(value);
if (event === 'error') throw new Error(value);
}
}
const answers = await Promise.all(questions.map(runOne));
The engine batches forward passes from concurrent processes. Three runs in parallel against the same engine take roughly the same wall-clock time as one run, plus per-step overhead.
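For large batches it can help to cap how many processes are in flight at once. A sketch with asyncio.Semaphore; run_one here is a stand-in for the launch-and-drain helper above, with a sleep simulating engine work:

```python
import asyncio

async def run_one(q):
    # Stand-in for launch_process + the recv loop.
    await asyncio.sleep(0.01)
    return f"answer:{q}"

async def run_bounded(questions, limit=8):
    # The semaphore is the only addition over a plain asyncio.gather:
    # at most `limit` runs are in flight at any moment.
    sem = asyncio.Semaphore(limit)

    async def guarded(q):
        async with sem:
            return await run_one(q)

    # gather preserves input order regardless of completion order.
    return await asyncio.gather(*(guarded(q) for q in questions))

answers = asyncio.run(run_bounded(["a", "b", "c"], limit=2))
```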
Upload a local build
For client-uploaded builds (no registry):
- Rust
- Python
- JavaScript
use std::path::Path;
if !client.check_program("research-agent@0.1.0", None, None).await? {
client.add_program(
Path::new("./research-agent.wasm"),
Path::new("./Pie.toml"),
false,
).await?;
}
if not await client.check_program("research-agent@0.1.0"):
await client.install_program("./research-agent.wasm", "./Pie.toml")
if (!(await client.checkProgram('research-agent@0.1.0'))) {
await client.installProgram('./research-agent.wasm', './Pie.toml');
}
The upload is sent in 256 KiB chunks. The build stays loaded for as long as the engine runs.
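The 256 KiB chunking is easy to picture. A sketch of just the slicing step; the real client handles framing and acknowledgements on top of this:

```python
CHUNK = 256 * 1024  # the upload chunk size noted above

def chunks(data: bytes, size: int = CHUNK):
    # Yield successive fixed-size slices; only the last may be shorter.
    for off in range(0, len(data), size):
        yield data[off:off + size]

parts = list(chunks(b"\x00" * (CHUNK * 2 + 5)))
```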
Detach and reattach
Long-running processes can be detached and reattached later by ID.
- Rust
- Python
- JavaScript
let id = process.id().to_string();
drop(process); // detach
// later, on a new client connection:
let mut process = client.attach_process(&id).await?;
pid = process.process_id
del process
# later, on a new client connection:
process = await client.attach_process(pid)
const pid = proc.id;
// later, on a new client connection:
const reattached = await client.attachProcess(pid);
The process keeps running on the engine while detached. Reattaching resumes the event stream where it left off, delivering any events the engine buffered in the meantime.
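To reattach from a later run of your program, persist the process ID somewhere durable. A minimal sketch using a JSON file; the file layout and key name are illustrative, not part of the client API:

```python
import json
from pathlib import Path

def save_pid(path, pid):
    # Record the detached process's ID so a later run can reattach.
    Path(path).write_text(json.dumps({"process_id": pid}))

def load_pid(path):
    return json.loads(Path(path).read_text())["process_id"]
```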
Next
- Profiling: measure end-to-end performance from the client side.
- Python client reference: full API surface.
- Rust client reference: full API surface.
- JavaScript client reference: full API surface.