How to create multiple objects from a single file upload in Sanity?

6 replies
Last updated: Jan 22, 2026
Hi! ☕
I'd like to upload a file and generate several new objects based on its contents (with the number of objects depending on the size of the uploaded file). I can already do this for one output object (based on the tutorial), but I don't know how to make it flexible enough to create any number of objects.
I'd be very grateful for any tips 🤗
AI Update

Based on your question, it sounds like you want to create multiple documents in Sanity programmatically based on the contents of a file. Here's how you can approach this flexibly:

Using the Sanity Client for Bulk Document Creation

The Sanity Client is the official JavaScript/TypeScript SDK that you'll need for creating multiple documents. Here's a flexible approach that can handle any number of documents:

import {createClient} from '@sanity/client'

const client = createClient({
  projectId: 'your-project-id',
  dataset: 'your-dataset',
  apiVersion: '2024-01-01',
  token: 'your-write-token', // Required for creating documents
  useCdn: false
})

// Example: Load and parse your file, then create documents
async function createDocumentsFromFile(fileData) {
  // Parse your file data into an array of objects
  const items = parseYourFile(fileData) // Your parsing logic here
  
  // Create a transaction for multiple documents
  const transaction = client.transaction()
  
  items.forEach(item => {
    transaction.create({
      _type: 'yourDocumentType',
      // Map your file data to document fields
      title: item.title,
      description: item.description,
      // ... other fields from your file
    })
  })
  
  // Commit all documents at once (atomic operation)
  try {
    const result = await transaction.commit()
    console.log(`Created ${result.results.length} documents`)
    return result
  } catch (error) {
    console.error('Transaction failed:', error)
    throw error
  }
}
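The parseYourFile call above is a placeholder for your own parsing logic. As a minimal sketch, assuming the uploaded file contains a JSON array (the function name and field names here are illustrative, not part of any Sanity API), it could look like:

```javascript
// Hypothetical parser: turns a JSON file's text content into an
// array of plain objects, one per document to create.
// Assumes the file contains a JSON array like:
//   [{"title": "A", "description": "..."}, ...]
function parseYourFile(fileText) {
  const parsed = JSON.parse(fileText)
  if (!Array.isArray(parsed)) {
    throw new Error('Expected the file to contain a JSON array')
  }
  // Keep only the fields your schema expects
  return parsed.map(item => ({
    title: item.title,
    description: item.description
  }))
}
```

For CSV or Excel you would swap in a parsing library (e.g. a CSV parser) but keep the same contract: return an array of plain objects ready to become documents.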

For Large Files: Batch Processing

If your file contains many items, process them in batches to avoid rate limits:

async function bulkCreateDocuments(items) {
  const batchSize = 100 // Adjust based on your needs
  
  for (let i = 0; i < items.length; i += batchSize) {
    const batch = items.slice(i, i + batchSize)
    const transaction = client.transaction()
    
    batch.forEach(item => {
      transaction.create({
        _type: 'yourDocumentType',
        ...item
      })
    })
    
    await transaction.commit()
    console.log(`Processed batch ${Math.floor(i / batchSize) + 1}`)
  }
}
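The slicing arithmetic in the loop above can be factored into a small pure helper, which is easy to test independently of the Sanity client (the chunk name is just for illustration):

```javascript
// Split an array into consecutive batches of at most `size` items.
// The last batch may be smaller; an empty input yields no batches.
function chunk(items, size) {
  const batches = []
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size))
  }
  return batches
}

// With a helper like this, the batch loop becomes:
//   for (const batch of chunk(items, 100)) { /* commit one transaction */ }
```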

If you want this to happen server-side, Sanity Functions are the recommended approach. The sketch below shows the general shape; the exact blueprint and handler API depends on your SDK version, so check the Sanity Functions documentation:

// sanity.blueprint.ts (illustrative; see the Sanity Functions docs)
import {defineBlueprint} from '@sanity/blueprints'

export default defineBlueprint({
  functions: [
    {
      name: 'process-file',
      handler: async (event, context) => {
        const {fileData} = event.body
        const client = context.getClient()
        
        // Parse your file
        const items = parseFile(fileData)
        
        // Create documents in batches
        const batchSize = 100
        for (let i = 0; i < items.length; i += batchSize) {
          const batch = items.slice(i, i + batchSize)
          const transaction = client.transaction()
          
          batch.forEach(item => {
            transaction.create({
              _type: 'yourType',
              ...item
            })
          })
          
          await transaction.commit()
        }
        
        return {success: true, count: items.length}
      }
    }
  ]
})

Key Points for Flexibility

  1. Transactions are atomic - all mutations in a single transaction succeed or fail together (good for data integrity); note that with batching, each batch commits as its own transaction
  2. Dynamic document count - the number of documents created depends entirely on your file contents
  3. Batch processing - handles large files by processing in chunks (recommended: 100-1000 per batch)
  4. Error handling - wrap in try/catch to handle failures gracefully
  5. Authentication - requires a token with write permissions

The flexibility comes from how you parse your file and map the data to your schema. Just iterate through your parsed data and add each item to the transaction. The number of output documents will automatically match the number of items parsed from your input file.
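As a sketch of that mapping step (the field names and the skip-invalid-rows policy are assumptions for illustration, not part of the Sanity API), you might validate each parsed item before adding it to a transaction:

```javascript
// Map parsed file rows to create() payloads, dropping rows that are
// missing required fields instead of failing the whole import run.
function toCreatePayloads(items, type) {
  return items
    .filter(item => typeof item.title === 'string' && item.title.length > 0)
    .map(item => ({
      _type: type,
      title: item.title,
      description: item.description ?? ''
    }))
}
```

Each payload can then be passed straight to `transaction.create(...)` as in the examples above.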

Would you like help with a specific file format (CSV, JSON, Excel) or guidance on mapping your data to a particular schema?

