Automate multi-view dataset generation for NeRF, 3DGS, and 3D reconstruction using AI agents
With a single natural-language prompt, your AI agent can generate complete multi-view datasets. No scripting required. Describe what you want in plain English — the agent handles the API calls.
Add this to your MCP client configuration (Windsurf, Claude Desktop, Cursor, or any MCP-compatible host):
```json
{
  "mcpServers": {
    "dxgl": {
      "serverUrl": "https://mcp.dx.gl/",
      "headers": {
        "Authorization": "Bearer dxgl_sk_..."
      }
    }
  }
}
```
Replace `dxgl_sk_...` with your API key from the Portal → API Keys.
Paste this into your AI agent to verify everything works:
```
List my models and tell me how many I have.
```
The agent will call `list_models` and return a summary. If this works, you're connected.
```
Generate a 196×1024 hemisphere dataset for model Ab3kF9x2qL1m.
```
The agent calls `create_render` with `output: "dataset"`, `datasetQuality: "196x1024"`, `coverage: "hemisphere"`, then polls `get_render` until done, and returns the download URL.
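The poll-until-done step is the same pattern regardless of tooling. Here is a minimal generic sketch; `fetch_status` is a hypothetical stand-in for whatever wrapper your agent uses around `get_render`, not part of the dxgl API:

```python
import time

def poll_until_done(fetch_status, interval=5.0, timeout=600.0):
    """Call fetch_status() repeatedly until it reports a terminal state."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = fetch_status()
        if status in ("done", "failed"):
            return status
        time.sleep(interval)
    raise TimeoutError("render did not finish within the timeout")

# Stand-in status source that reaches "done" on the third poll.
states = iter(["queued", "rendering", "done"])
result = poll_until_done(lambda: next(states), interval=0.0)
print(result)  # done
```

In practice the agent runs this loop for you; the sketch just makes the terminal states and timeout behavior explicit.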
```
Generate 100×800 hemisphere datasets for every model tagged "validation-set". Quote the cost first and wait for my approval before proceeding.
```
The agent will:
1. `list_models` with `tags: "validation-set"` to find matching models
2. `quote` with the dataset configuration to show the total credit cost
3. `create_batch_renders` to submit all datasets at once
4. `get_render` for each until complete

```
Create a 400×2048 full sphere dataset for my cash register model. I need maximum coverage for reconstruction.
```
The agent uses `coverage: "sphere"` for a ±80° elevation range instead of the default hemisphere.
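To make the two coverage modes concrete, here is a small sketch of evenly spaced camera elevations. The ±80° sphere range comes from the text above; the 0°–80° hemisphere range is an assumption for illustration, not a documented value:

```python
def view_elevations(n, coverage="hemisphere"):
    """Evenly spaced camera elevations in degrees for n views.

    Sphere coverage spans ±80° (per the docs); the hemisphere
    range used here (0°..80°) is an illustrative assumption.
    """
    lo, hi = (-80.0, 80.0) if coverage == "sphere" else (0.0, 80.0)
    if n == 1:
        return [lo]
    step = (hi - lo) / (n - 1)
    return [lo + i * step for i in range(n)]

print(view_elevations(5, "sphere"))  # [-80.0, -40.0, 0.0, 40.0, 80.0]
```

Full-sphere coverage captures the underside of the object, which matters for reconstruction targets that will be viewed from below.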
```
Show me all models that already have dataset renders, and which ones don't have any yet.
```
The agent calls `list_models`, then `get_model` for each to inspect render history, and categorizes them.
```
I have 200 models tagged "catalog". How many credits would it cost to generate 196×1024 hemisphere datasets for all of them?
```
The agent calls `quote` with 200 dataset renders at 4 credits each (800 credits total), checks your balance, and reports whether you have enough.
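The arithmetic behind the quote is simple enough to check by hand. A minimal sketch using the per-tier credit costs from the pricing table:

```python
# Credit cost per dataset render, keyed by tier (from the pricing table).
TIER_CREDITS = {"100x800": 1, "196x1024": 4, "400x2048": 16}

def batch_cost(n_models, tier):
    """Total credits to render one dataset per model at the given tier."""
    return n_models * TIER_CREDITS[tier]

cost = batch_cost(200, "196x1024")
print(cost)  # 800
```

The `quote` tool does this server-side and also checks it against your balance; the sketch is just for sanity-checking its answer.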
```
Import this model from https://example.com/scan.glb, tag it "street-furniture", then generate a 196×1024 sphere dataset.
```
The agent chains `ingest_model` → waits for the system preview → `create_render` with dataset settings → polls until done → returns the download URL.
| Tier | Views | Resolution | Credits | Best for |
|---|---|---|---|---|
| `100x800` | 100 | 800×800 | 1 | Quick experiments, proof-of-concept |
| `196x1024` | 196 | 1024×1024 | 4 | Production training, best quality-to-cost ratio |
| `400x2048` | 400 | 2048×2048 | 16 | Maximum fidelity, large-scale reconstruction |
Every dataset ZIP contains:
- `images/` — RGB PNG frames (composited on white background)
- `depth/` — 8-bit grayscale depth PNGs
- `depth_16bit/` — 16-bit grayscale depth PNGs (65,536 levels)
- `normals/` — world-space normal map PNGs
- `masks/` — foreground/background alpha masks
- `transforms.json` — camera intrinsics + per-frame 4×4 transform matrices (nerfstudio / instant-ngp format)
- `overview.webp` — 4-quadrant contact sheet

| Tool | What it does |
|---|---|
| `ingest_model` | Import a 3D model from a URL |
| `list_models` | List models with tag and pagination filters |
| `get_model` | Get model details and all its renders |
| `create_render` | Create a single dataset render |
| `create_batch_renders` | Create multiple dataset renders atomically |
| `get_render` | Poll render status until complete |
| `download_render` | Get the download URL for a completed dataset ZIP |
| `get_account` | Check your credit balance |
| `quote` | Estimate credit cost before committing |
Tag models before generating datasets. This lets you target subsets efficiently:
```
Generate datasets for all models tagged "batch-3" — skip any that already have a 196×1024 dataset.
```
The `quote` tool prevents surprises. Ask the agent to quote before any batch operation:
```
Quote 196×1024 datasets for my 50 furniture models, then proceed only if it's under 200 credits.
```
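The "proceed only if under budget" gate reduces to a single comparison. A sketch using the 4-credit price of the 196×1024 tier; note that 50 models at 4 credits each lands exactly at the limit, so this particular prompt would stop:

```python
def approve_batch(quoted_cost, budget):
    """True only if the quoted credit cost is strictly under budget,
    mirroring the 'proceed only if under X credits' prompt pattern."""
    return quoted_cost < budget

# 50 furniture models at 4 credits each (196x1024 tier).
quoted = 50 * 4
print(approve_batch(quoted, 200))  # False: 200 is not *under* 200
```

If you intend "up to and including 200", say so in the prompt; agents tend to take "under" literally.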
After downloading dataset ZIPs, train with nerfstudio:
```bash
unzip dataset.zip -d mymodel
ns-train splatfacto --data ./mymodel \
  --max-num-iterations 15000 \
  --pipeline.model.sh-degree 3 \
  --pipeline.model.background-color white
```
The `--pipeline.model.background-color white` flag is required — images are composited on a white background.
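If you re-composite frames yourself from the `masks/` alpha maps, the per-channel math matches the standard over-white blend. A minimal sketch in plain Python (no imaging library), showing why the trainer must assume the same white background:

```python
def composite_on_white(value, alpha):
    """Alpha-composite one 0-255 channel value over a white background,
    the same blend the shipped RGB frames have already had applied."""
    return round(alpha * value + (1.0 - alpha) * 255)

# A half-transparent black pixel lands at mid-gray over white.
print(composite_on_white(0, 0.5))  # 128
```

Training with a black background assumption against white-composited frames bakes a white halo into the reconstruction, which is what the required flag prevents.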
Your agent can orchestrate the entire workflow in one prompt:
```
Import all GLB files from these URLs, tag them "experiment-7", generate 196×1024 sphere datasets for each, and give me the download URLs when done.
```
The agent handles import → tagging → rendering → polling → URL collection without any manual steps.