Check the status and retrieve results of parsing jobs
curl -X 'GET' \
'https://prod.visionapi.unsiloed.ai/parse/04a7a6d8-5ef7-465a-b22a-8a98e7104dd9' \
-H 'accept: application/json' \
-H 'api-key: your-api-key'
{
  "job_id": "04a7a6d8-5ef7-465a-b22a-8a98e7104dd9",
  "status": "Starting",
  "created_at": "2025-10-22T06:51:16.870302Z",
  "metadata": {}
}
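The same status check can be made from Python with the requests library; this is a minimal sketch mirroring the curl call above (a single request, no polling):

```python
import requests

BASE_URL = "https://prod.visionapi.unsiloed.ai"

def status_url(job_id: str) -> str:
    """Endpoint URL for checking a single parse job."""
    return f"{BASE_URL}/parse/{job_id}"

def get_parse_job(job_id: str, api_key: str) -> dict:
    """One-shot fetch of the job record (job_id, status, created_at, metadata, ...)."""
    response = requests.get(
        status_url(job_id),
        headers={"accept": "application/json", "api-key": api_key},
        timeout=30,
    )
    response.raise_for_status()  # raises on 404 / 500
    return response.json()
```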
Starting
Processing
Succeeded
Failed
Cancelled
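Starting and Processing are in-flight states; Succeeded, Failed, and Cancelled are terminal. A small helper (illustrative only, not part of the API) makes that split explicit for polling code:

```python
# Terminal statuses: once a job reaches one of these, it will not change again.
TERMINAL_STATUSES = {"Succeeded", "Failed", "Cancelled"}

def is_terminal(status: str) -> bool:
    """True once the job can no longer change state."""
    return status in TERMINAL_STATUSES
```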
import requests
import time

def poll_parse_job(job_id, api_key, max_wait_time=300, poll_interval=5):
    """Poll a parsing job until it reaches a terminal status or the wait times out."""
    start_time = time.time()
    headers = {"api-key": api_key}
    while time.time() - start_time < max_wait_time:
        response = requests.get(
            f"https://prod.visionapi.unsiloed.ai/parse/{job_id}",
            headers=headers,
            timeout=30,
        )
        if response.status_code == 200:
            job = response.json()
            if job['status'] == 'Succeeded':
                return job
            elif job['status'] == 'Failed':
                raise Exception(f"Job failed: {job.get('message', 'Unknown error')}")
            elif job['status'] == 'Cancelled':
                raise Exception("Job was cancelled")
            elif job['status'] in ['Starting', 'Processing']:
                print(f"Job status: {job['status']} - waiting...")
                time.sleep(poll_interval)
            else:
                print(f"Unknown status: {job['status']}")
                time.sleep(poll_interval)
        else:
            print(f"Error checking status: {response.status_code}")
            time.sleep(poll_interval)
    raise Exception("Job polling timed out")

# Usage
try:
    result = poll_parse_job("04a7a6d8-5ef7-465a-b22a-8a98e7104dd9", "your-api-key")
    print("Job completed successfully!")
    print(f"Total chunks: {result['total_chunks']}")
except Exception as e:
    print(f"Error: {e}")
Returns a 404 response or a 500 response on error.

Request parameters:
api-key: API key for authentication. Use 'Bearer <your_api_key>'.
job_id: Job ID returned by POST /parse.
Return segment images as base64-encoded data URIs instead of S3 presigned URLs. Defaults to false.
Include the chunks array in the response. Defaults to true.
Return a presigned S3 URL to the raw output JSON file instead of inlining the full response body. Defaults to false.
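Passed as query parameters, the three options above might be wired up as follows. Note that the parameter names used here (base64_urls, include_chunks, return_url) are hypothetical placeholders, since the text gives only descriptions and defaults; check the API reference for the real names:

```python
import requests

# Hypothetical query-parameter names; the source documents only the
# descriptions and defaults, not the actual parameter names.
DEFAULT_OPTIONS = {
    "base64_urls": "false",    # segment images as base64 data URIs vs. presigned URLs
    "include_chunks": "true",  # include the chunks array in the response
    "return_url": "false",     # presigned URL to raw output JSON vs. inline body
}

def check_job(job_id: str, api_key: str, options: dict = DEFAULT_OPTIONS) -> dict:
    """Fetch a job record with the given (assumed) query options applied."""
    response = requests.get(
        f"https://prod.visionapi.unsiloed.ai/parse/{job_id}",
        headers={"api-key": api_key},
        params=options,
        timeout=30,
    )
    response.raise_for_status()
    return response.json()
```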
Job status and results. Output fields (chunks, total_chunks, page_count, pdf_url) are present only when status is Succeeded.
Response body for GET /parse/{job_id}.
Fields marked as optional appear only when the job has reached the relevant status.
created_at: ISO 8601 timestamp when the job was created.
job_id: Job identifier.
metadata: Citation or job metadata. Populated when xml_citation is enabled or from the job record.
status: Current job status: Starting, Processing, Succeeded, Failed, or Cancelled.
chunks: Array of document chunks with segments and extracted content. Present when status is Succeeded.
Configuration used for this job (mirrors the parameters submitted at creation time).
Credits used for this job.
Original file name from the job record.
MIME type of the uploaded file.
S3 URL of the original uploaded file.
ISO 8601 timestamp when processing completed. Present when status is Succeeded or Failed.
Internal Supabase job record ID.
Whether table merging was enabled for this job.
message: Error or status detail message. Present when status is Failed.
page_count: Number of pages in the document. Present when status is Succeeded.
pdf_url: Presigned S3 URL to the generated PDF. Present when status is Succeeded.
ISO 8601 timestamp when processing started. Present when status is not Starting.
total_chunks: Total number of document chunks. Present when status is Succeeded.
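Once a job reports Succeeded, the output fields described above (chunks, total_chunks, page_count, pdf_url) can be read straight off the job record; a minimal sketch:

```python
def handle_result(job: dict) -> None:
    """Consume a Succeeded job record returned by GET /parse/{job_id}."""
    if job["status"] != "Succeeded":
        raise ValueError(f"Job not finished: {job['status']}")

    # These output fields are only present once status is Succeeded.
    print(f"Pages: {job['page_count']}, chunks: {job['total_chunks']}")

    for chunk in job.get("chunks", []):
        pass  # process each chunk's segments and extracted content here

    # job["pdf_url"] is a presigned S3 link to the generated PDF;
    # fetch it promptly, since presigned URLs expire.
```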