DEPLOY

Mesh Network

Connect machines into a distributed inference network.

How it works

Every TARX node advertises its capabilities to the mesh. When a request arrives, the mesh routes it to the best available peer.

Peer discovery

Nodes find each other automatically on the local network or via relay.

Smart routing

Requests route to the peer with the best latency, capacity, and model match.

Zero trust

All peer communication is encrypted. No data leaves your network.
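The routing behavior above can be sketched as a scoring function. This is an illustrative sketch only, not the TARX implementation: the field names (`latencyMs`, `freeSlots`, `totalSlots`, `models`) and the weights are assumptions.

```javascript
// Hypothetical peer-scoring sketch: lower latency, more idle capacity,
// and an exact model match each improve a peer's score. The weights
// are illustrative, not the actual TARX routing policy.
function scorePeer(peer, wantedModel) {
  const latencyScore = 1 / (1 + peer.latencyMs);          // lower latency scores higher
  const capacityScore = peer.freeSlots / peer.totalSlots; // fraction of idle capacity
  const modelScore = peer.models.includes(wantedModel) ? 1 : 0;
  return 3 * modelScore + latencyScore + capacityScore;   // model match dominates
}

// Pick the highest-scoring peer for a given model.
function pickPeer(peers, wantedModel) {
  return peers.reduce((best, p) =>
    scorePeer(p, wantedModel) > scorePeer(best, wantedModel) ? p : best);
}
```

A peer hosting the requested model wins even against a faster peer that lacks it, which matches the "model match" priority described above.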

Port 11436

SuperComputer service

The SuperComputer service runs on port 11436. It is a standalone Rust binary that manages peer connections, job routing, and credit accounting.

# Check mesh status
curl http://localhost:11436/mesh/health

# List connected peers
curl http://localhost:11436/mesh/peers

# Check credits balance
curl http://localhost:11436/mesh/credits

Distributed inference

When local inference is busy or a peer has a better model, the mesh transparently routes requests to available nodes. The API is identical — your application code doesn't change.

1. Request arrives: Client sends inference request to localhost:11435
2. Mesh check: Daemon checks local capacity and peer availability
3. Route decision: Best peer selected by latency, model, and reputation
4. Execute: Inference runs on selected node, response streams back
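The local-vs-remote decision in steps 2 and 3 can be sketched as follows. The daemon's actual logic is internal; every name here (`busy`, `models`, `latencyMs`, the `queued` flag) is an assumption for illustration.

```javascript
// Sketch of the routing decision: prefer local inference when the
// local node is free and has the model; otherwise pick the
// lowest-latency peer that serves the model. Hypothetical shape.
function decideRoute(local, peers, model) {
  if (!local.busy && local.models.includes(model)) {
    return { target: "local" };
  }
  const candidates = peers.filter(p => p.models.includes(model));
  if (candidates.length === 0) {
    // no peer serves this model: queue locally instead
    return { target: "local", queued: true };
  }
  const best = candidates.reduce((a, b) => (a.latencyMs < b.latencyMs ? a : b));
  return { target: best.id };
}
```

Because the decision happens inside the daemon, the client still talks only to localhost:11435 either way, which is why application code doesn't change.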

MCP tools

Nine MCP tools for mesh management:

mesh_health

Check SuperComputer service status

mesh_status

Active jobs and network info

mesh_peers

List connected peers

mesh_query

Run distributed inference query

mesh_hot_models

Models available on the network

mesh_credits

Credit balance and earnings

mesh_peer_capabilities

Hardware specs of peers

mesh_device_score

Local device compute score

mesh_reputation

Node reputation and trust
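Each tool above is invoked through a standard MCP `tools/call` request. The JSON-RPC envelope below follows the Model Context Protocol spec; the assumption here is that `mesh_peers` takes no arguments.

```javascript
// Build an MCP tools/call request for one of the mesh tools.
// Envelope shape per the Model Context Protocol (JSON-RPC 2.0).
function buildToolCall(id, name, args = {}) {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

// Example: list connected peers (assumes no arguments are required)
const req = buildToolCall(1, "mesh_peers");
```

An MCP client library normally builds this envelope for you; it is shown here only to make the tool names above concrete.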

Autonomous

Research agent

Run autonomous experiment loops distributed across mesh nodes. One machine runs a hundred experiments overnight. Ten machines run a thousand.

tarx_research({
  experiment_file: "train.py",
  program_file: "instructions.md",
  metric: "val_loss",
  budget_minutes: 300,
  parallel_nodes: 4
})