Managing Managed Tables

This guide walks through the full Managed Tables API flows: database provisioning, manual creation, file upload with schema inference, and row CRUD.

When to use it: when your application needs to store the company's own structured data (spreadsheets, computed datasets, operational lists) without depending on an external database. When not to use it: if you want to expose a saved visualization as an API (use the Data API).

Prerequisites

  • Console JWT (regular login; it does not use an API key).
  • Assigned IAM permissions:
    • managed-tables:list, managed-tables:create, managed-tables:update, managed-tables:delete
    • managed-tables:upload, managed-tables:data-read, managed-tables:data-write
  • Managed roles that already bundle everything: ConnectionManager, DataEngineer.

Flow overview

Workflow 1: Provisioning

Every company must provision its storage area once. The call is idempotent: subsequent calls return the existing state.

export TOKEN="eyJ..."
export API_BASE_URL="http://localhost:3100"

# 1. Check status
curl "$API_BASE_URL/api/v1/managed-tables/status" \
-H "Authorization: Bearer $TOKEN"
# → { "provisioned": false, "connection_id": null }

# 2. Provision (idempotent)
curl -X POST "$API_BASE_URL/api/v1/managed-tables/provision" \
-H "Authorization: Bearer $TOKEN"
# → { "provisioned": true, "connection_id": "507f..." }
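The status check and the provision call can be combined into a guard that only provisions when needed, in line with the best practices at the end of this guide. A minimal sketch, assuming TOKEN and API_BASE_URL are exported as above; JSON is scraped with grep/sed for portability, but prefer jq in real code:

```shell
# Sketch: a provision-if-needed guard combining the two calls above.
ensure_provisioned() {
  local status_body
  status_body=$(curl -s "$API_BASE_URL/api/v1/managed-tables/status" \
    -H "Authorization: Bearer $TOKEN")
  if printf '%s\n' "$status_body" | grep -q '"provisioned": *true'; then
    # Already provisioned: reuse the existing connection_id.
    printf '%s\n' "$status_body" |
      sed -n 's/.*"connection_id": *"\([^"]*\)".*/\1/p'
  else
    curl -s -X POST "$API_BASE_URL/api/v1/managed-tables/provision" \
      -H "Authorization: Bearer $TOKEN" |
      sed -n 's/.*"connection_id": *"\([^"]*\)".*/\1/p'
  fi
}
```

`ensure_provisioned` prints the connection_id either way, so callers do not need to care which branch ran.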

Workflow 2: Manual table creation

Use when you already know the schema. No file required.

# 1. Create table
curl -X POST "$API_BASE_URL/api/v1/managed-tables" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d '{
"display_name": "monthly_sales",
"columns": [
{ "name": "product", "display_name": "Product", "type": "text", "nullable": false },
{ "name": "amount", "display_name": "Amount", "type": "decimal", "nullable": true },
{ "name": "sale_date", "display_name": "Date", "type": "date", "nullable": false }
]
}'

# Response:
# { "_id": "507f...", "display_name": "monthly_sales", "columns": [...], ... }

# 2. Insert rows
curl -X POST "$API_BASE_URL/api/v1/managed-tables/507f.../data" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d '{
"rows": [
{ "product": "Shirt", "amount": 89.90, "sale_date": "2026-01-15" },
{ "product": "Pants", "amount": 159.90, "sale_date": "2026-01-15" }
]
}'
# → 201 { "inserted": 2 }

Naming rules
  • display_name and column names match ^[a-z_][a-z0-9_]*$.
  • display_name is unique per company among active tables.
  • A conflict returns 400 with error_code: MANAGED_TABLE_DUPLICATE_NAME.
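Checking the naming regex client-side avoids a round trip that would only end in a 422. A minimal sketch:

```shell
# Sketch: validate names against the documented regex before calling the API.
valid_name() {
  printf '%s\n' "$1" | grep -Eq '^[a-z_][a-z0-9_]*$'
}

valid_name "monthly_sales" && echo "monthly_sales: ok"
valid_name "2026_sales"    || echo "2026_sales: rejected (starts with a digit)"
valid_name "MonthlySales"  || echo "MonthlySales: rejected (uppercase)"
```

Note that this only catches regex violations; a duplicate name still has to be caught server-side via the 400 MANAGED_TABLE_DUPLICATE_NAME response.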

Workflow 3: File upload (CSV/Excel/ZIP)

A three-step flow with schema inference: the API processes the file asynchronously, and the client confirms the inferred schema before the final insert.

Step 1: upload

curl -X POST "$API_BASE_URL/api/v1/managed-tables/upload" \
-H "Authorization: Bearer $TOKEN" \
-F "files=@sales_2026.csv"
# → 201 { "job_id": "abc123", "status": "queued" }

Limits: up to 10 files per call, 100MB total. Accepts .csv, .xlsx, .xls, .zip (zip is unpacked automatically).

Step 2: status polling

curl "$API_BASE_URL/api/v1/managed-tables/upload-jobs/abc123" \
-H "Authorization: Bearer $TOKEN"

States you will observe (in order):

| Status | Meaning | Client action |
| --- | --- | --- |
| queued | Waiting for processing | Keep polling |
| analisando | API is processing the file (the status string is Portuguese for "analyzing") | Show spinner |
| waiting_confirm | Inferred schema is ready | Render confirmation UI from inferred_schema |
| inserting | Insert in progress | Show progress (progress.processed_rows) |
| done | Table created/updated | Redirect to user_table_id |
| failed | Error | Show error |

When status === "waiting_confirm", the body contains:

{
"status": "waiting_confirm",
"inferred_schema": {
"suggested_table_name": "sales_2026",
"columns": [
{ "name": "product", "display_name": "Product", "type": "text", "nullable": false },
{ "name": "amount", "display_name": "Amount", "type": "decimal", "nullable": true }
],
"sample_rows": [{ "product": "Shirt", "amount": "89.90" }],
"total_row_count_estimate": 1500
}
}

Step 3: confirm schema

The user can edit column names, types, and the table display_name before confirming:

curl -X POST "$API_BASE_URL/api/v1/managed-tables/upload-jobs/abc123/confirm" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d '{
"display_name": "sales_2026",
"columns": [
{ "name": "product", "display_name": "Product", "type": "text", "nullable": false },
{ "name": "amount", "display_name": "Amount", "type": "decimal", "nullable": true }
]
}'
# → 200 { "status": "inserting" }

Keep polling until status === "done". The job's user_table_id points to the created table.
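The polling described above can be sketched with the exponential backoff recommended in the best practices (start at 1s, double up to 10s). fetch_job_status is a hypothetical helper that wraps the Step 2 curl call and scrapes the "status" field:

```shell
# Sketch: poll an upload job with exponential backoff (1s doubling to 10s).
fetch_job_status() {
  curl -s "$API_BASE_URL/api/v1/managed-tables/upload-jobs/$1" \
    -H "Authorization: Bearer $TOKEN" |
    sed -n 's/.*"status": *"\([^"]*\)".*/\1/p'
}

poll_job() {
  local job_id=$1 delay=1 status
  while :; do
    status=$(fetch_job_status "$job_id")
    case "$status" in
      # States that require client action end the loop.
      waiting_confirm|done|failed)
        printf '%s\n' "$status"
        return 0
        ;;
    esac
    sleep "$delay"
    delay=$((delay * 2))
    [ "$delay" -gt 10 ] && delay=10
  done
}
```

`poll_job abc123` prints the first actionable state; call it once to reach waiting_confirm, then again after POST .../confirm to wait for done.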

Workflow 4: Append/Replace into an existing table

Adds new rows to an existing table (or replaces the entire content).

# Append into existing table
curl -X POST "$API_BASE_URL/api/v1/managed-tables/507f.../upload" \
-H "Authorization: Bearer $TOKEN" \
-F "files=@sales_february.csv" \
-F "mode=append"

# Replace (overwrites everything)
curl -X POST "$API_BASE_URL/api/v1/managed-tables/507f.../upload" \
-H "Authorization: Bearer $TOKEN" \
-F "files=@sales_full.csv" \
-F "mode=replace"

Append/replace differences
  • The existing table display_name is reused; the regex ^[a-z_][a-z0-9_]*$ is not enforced (this supports legacy tables).
  • The confirmed schema must match the existing table schema; extra columns cause the insert to fail.
  • replace is atomic: the entire content is swapped in a single operation, so no partial state is ever visible.

Workflow 5: Row CRUD

Row-level create, read, update, and delete operations on a table's data.

# List (paginated)
curl "$API_BASE_URL/api/v1/managed-tables/507f.../data?page=1&per_page=50&sort_by=-sale_date" \
-H "Authorization: Bearer $TOKEN"
# → { "total": 1500, "quantity": 50, "records": [{ "id": "uuid", "product": "X", ... }] }

# Insert
curl -X POST "$API_BASE_URL/api/v1/managed-tables/507f.../data" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d '{ "rows": [{ "product": "Z", "amount": 50 }] }'

# Update row
curl -X PUT "$API_BASE_URL/api/v1/managed-tables/507f.../data/<rowId>" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d '{ "amount": 99.90 }'

# Delete
curl -X DELETE "$API_BASE_URL/api/v1/managed-tables/507f.../data/<rowId>" \
-H "Authorization: Bearer $TOKEN"
# → 204 No Content

rowId is the row's internal identifier (typically a UUID generated on insert). It appears as id in each item under records.
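To drive per-row updates or deletes, those ids can be scraped out of a list response. A minimal sketch with grep/sed (prefer a real JSON parser such as jq in production):

```shell
# Sketch: print every record id from a list response, one per line.
extract_row_ids() {
  grep -o '"id": *"[^"]*"' | sed 's/.*"\([^"]*\)"$/\1/'
}

printf '%s\n' '{ "records": [{ "id": "uuid-1" }, { "id": "uuid-2" }] }' |
  extract_row_ids
# → uuid-1
#   uuid-2
```

Each printed id can then be substituted for <rowId> in the PUT and DELETE calls above.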

Workflow 6: Soft-delete a table

Listings (GET /api/v1/managed-tables) already exclude archived tables, so a deleted table disappears immediately. Restoration is not exposed by the API; open a support ticket if you need to revert.
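A minimal sketch of the delete call itself. The route below is an assumption modeled on the row-delete endpoint; confirm the exact path in the Managed Tables API Reference before relying on it:

```shell
# Sketch: archive (soft-delete) a table by id. Hypothetical route,
# modeled on the DELETE pattern used for rows.
delete_table() {
  # Print only the HTTP status code; a 2xx means the table was archived.
  curl -s -o /dev/null -w '%{http_code}' -X DELETE \
    "$API_BASE_URL/api/v1/managed-tables/$1" \
    -H "Authorization: Bearer $TOKEN"
}
```

After a successful call, the table no longer appears in listings.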

Supported column types

| Type | PostgreSQL DDL | Example |
| --- | --- | --- |
| text | TEXT | "abc" |
| integer | BIGINT | 42 |
| decimal | NUMERIC | 89.90 |
| boolean | BOOLEAN | true |
| date | DATE | "2026-01-15" |
| datetime | TIMESTAMPTZ | "2026-01-15T10:30:00Z" |
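For reference, a hypothetical create-table body exercising all six types (table and column names are illustrative):

```json
{
  "display_name": "type_demo",
  "columns": [
    { "name": "label", "display_name": "Label", "type": "text", "nullable": false },
    { "name": "qty", "display_name": "Quantity", "type": "integer", "nullable": true },
    { "name": "amount", "display_name": "Amount", "type": "decimal", "nullable": true },
    { "name": "active", "display_name": "Active", "type": "boolean", "nullable": true },
    { "name": "sale_date", "display_name": "Date", "type": "date", "nullable": true },
    { "name": "created_at", "display_name": "Created at", "type": "datetime", "nullable": true }
  ]
}
```

A matching insert row would serialize date and datetime values as JSON strings and the rest as bare literals, e.g. { "label": "abc", "qty": 42, "amount": 89.90, "active": true, "sale_date": "2026-01-15", "created_at": "2026-01-15T10:30:00Z" }.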

Error handling

| HTTP | Code | Common cause | Action |
| --- | --- | --- | --- |
| 400 | MANAGED_TABLE_DUPLICATE_NAME | display_name already active | Ask the user for another name |
| 400 | - | Database not provisioned | Call POST /provision first |
| 400 | - | Job not in waiting_confirm | Keep polling before confirming |
| 401 | - | Missing or expired JWT | Refresh token |
| 403 | - | Missing IAM permission | Check the user's role |
| 404 | - | Table/row/job does not exist | Re-fetch resource state |
| 422 | - | Invalid schema (regex, type, missing field) | Show specific error to the user |
| 500 | - | Internal error | Retry with backoff; log request_id |

Best practices

  • Don't loop POST /provision: it is idempotent, but each call is more expensive than a status read. Check GET /status first.
  • Use exponential-backoff polling: start at 1s and double up to 10s. Upload jobs may take seconds to minutes depending on file size.
  • Show progress.processed_rows / progress.total_rows during inserting to give the user feedback on large files.
  • Validate file sizes on the frontend before upload; the backend rejects anything over 100MB total.
  • Reuse mode=append for incremental uploads; do not create a new table every time.
  • For custom queries on top of managed tables, expose them as a Visualization and publish them via the Data API. Managed Tables does not expose free-form SQL.
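The client-side size check from the list above can be sketched as follows (total_bytes and MAX_TOTAL_BYTES are illustrative names):

```shell
# Sketch: mirror the backend's 100MB total limit before uploading.
MAX_TOTAL_BYTES=$((100 * 1024 * 1024))

total_bytes() {  # usage: total_bytes FILE...
  local sum=0 f
  for f in "$@"; do
    [ -f "$f" ] || continue   # skip missing files
    sum=$((sum + $(wc -c < "$f")))
  done
  printf '%s\n' "$sum"
}

# Example guard before the upload call:
# [ "$(total_bytes sales_2026.csv)" -le "$MAX_TOTAL_BYTES" ] || echo "too large" >&2
```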

Full reference

For each endpoint's detailed contract (all fields, status codes, error examples), see the Managed Tables API Reference.