how are people managing staging vs production datasets? is there an easy way to push a staging dataset to production?

9 replies
Last updated: Oct 3, 2020
AI Update

Managing staging vs production datasets in Sanity is a common workflow challenge, and there are several approaches depending on your plan level and needs.

Common Approaches

Most teams use separate datasets for different environments (like development, staging, and production). These datasets are completely isolated from each other within your project, which is great for testing changes without affecting live content.

Pushing Staging to Production

Unfortunately, there's no one-click "push to production" button in Sanity. Here are your options:

1. CLI Export/Import (Available on all plans)

The standard approach is using the Sanity CLI to export and import datasets:

# Export from staging
sanity dataset export staging staging.tar.gz

# Import to production (use --replace to overwrite)
sanity dataset import staging.tar.gz production --replace

You can also export/import specific document types:

sanity dataset export staging staging.tar.gz --types products,articles

This works but can be slow for large datasets since you're downloading and re-uploading all the data.
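Wrapped in a small script, the two CLI steps above become a repeatable promotion step. This is a sketch only: the dataset names come from the example above, while the `run` dry-run guard (defaulting to dry-run so nothing touches your project by accident) is an illustration, not part of the Sanity CLI.

```shell
#!/usr/bin/env sh
# promote.sh -- sketch: promote the staging dataset to production.
# Set DRY_RUN=0 to actually execute the commands.
set -eu

SRC=staging
DEST=production
ARCHIVE="$SRC-$(date +%Y%m%d).tar.gz"

# Dry-run guard (illustrative, not a Sanity CLI feature): prints
# the command instead of running it unless DRY_RUN=0.
run() {
  if [ "${DRY_RUN:-1}" = "1" ]; then
    echo "would run: $*"
  else
    "$@"
  fi
}

# Export everything (documents + assets) from staging...
run sanity dataset export "$SRC" "$ARCHIVE"

# ...then import into production, overwriting documents that
# already exist there.
run sanity dataset import "$ARCHIVE" "$DEST" --replace
```

Keeping the archive name date-stamped also gives you a rough backup of what was promoted and when.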

2. Cloud Clone (Enterprise only)

If you're on an Enterprise plan, you get access to Advanced Dataset Management which includes Cloud Clone. This lets you duplicate datasets directly in the cloud without the export/import dance. It's much faster and more efficient, especially for large datasets.

Enterprise customers also get Hot Swap functionality, which lets you use aliases to switch between datasets seamlessly - super useful for testing migrations before going live.
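For Enterprise plans, the clone-then-swap flow combines the two features above. The sketch below assumes your plan has access to the `sanity dataset copy` and `sanity dataset alias` subcommands; the alias name, the `production-next` dataset, and the `run` dry-run guard are illustrative assumptions.

```shell
#!/usr/bin/env sh
# Sketch: cloud clone + hot swap (Enterprise features; names here
# are assumptions for illustration). Set DRY_RUN=0 to execute.
run() { if [ "${DRY_RUN:-1}" = "1" ]; then echo "would run: $*"; else "$@"; fi; }

# Server-side clone of production into a scratch dataset -- no
# local download/re-upload involved.
run sanity dataset copy production production-next

# Point an alias at the live dataset; API clients query the alias
# (prefixed with ~) instead of a hard-coded dataset name.
run sanity dataset alias create active-content production

# After editing and verifying production-next, repoint the alias:
# an atomic switch, with no client redeploy needed.
run sanity dataset alias link active-content production-next
```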

3. Cross Dataset Duplicator Plugin

The Cross Dataset Duplicator plugin provides a UI-based way to migrate documents and assets between datasets from within Studio. It's more selective than full dataset exports - useful when you want to copy specific documents rather than entire datasets.

npm i @sanity/cross-dataset-duplicator

How People Actually Manage This

From community discussions, here are common patterns:

  • Content flows production β†’ staging/dev: Many teams regularly export production data and import it to staging/dev environments so developers work with realistic data. This is the opposite of what you asked, but it's the more common workflow.

  • Schema changes go staging β†’ production: Developers test schema changes in staging first, then deploy the same schema to production. The content itself usually originates in production.

  • Separate workflows: Content editors work directly in production datasets, while developers work in development datasets. Schema changes are version-controlled and deployed through your normal CI/CD process.
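A minimal CI sketch of that last pattern: deploy the same version-controlled schema against each dataset in turn. `SANITY_STUDIO_DATASET` is the conventional Studio environment variable for selecting a dataset at build time; the dataset list and the dry-run `run` guard are assumptions for illustration.

```shell
#!/usr/bin/env sh
# Sketch: one CI job deploying the same schema to every
# environment. Set DRY_RUN=0 to actually deploy.
run() { if [ "${DRY_RUN:-1}" = "1" ]; then echo "would run: $*"; else "$@"; fi; }

for DATASET in staging production; do
  # Build and deploy the Studio against each dataset, so schema
  # changes reach staging and production from one pipeline.
  run env SANITY_STUDIO_DATASET="$DATASET" sanity deploy
done
```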

Important Considerations

  • Datasets can have different schemas: Each dataset is schemaless, so your staging and production datasets can technically have different content models, though you'll usually want to keep them in sync.

  • Cross-dataset references: A cross-dataset reference records which dataset it points into, so documents moved via export/import may still reference the original dataset; plan to check and fix these references after migrating.

  • Assets are included: When you export/import, assets (images, files) are included in the process.

The reality is that most teams don't regularly "push" staging content to production. Instead, they use staging to test schema changes and new features, while actual content is created directly in production. If you need frequent dataset synchronization, the Enterprise plan's Cloud Clone feature would be your best bet.

What would be the use case for that πŸ˜„ Normally staging is just for testing/previewing. Data on production is always the source of truth
I can see why you'd wanna copy production data over to staging though
i mean depends on how your team/co use staging.. in this case the client wants to use staging to update content and circulate internally for approval before going live
they could obviously just replicate the changes in production when they're ready
probably better
Mike - sounds like content workflows are a much better fit for that vs having two separate data sources. They can run through an edit stage, and even preview before pushing it live.
Aah forgot about that, good one πŸ˜„ https://www.sanity.io/docs/custom-workflows
will definitely look at this thanks y'all
