Testing Sanity Studio

Continuous Integration and Test Strategy

Move tests from local development into CI pipelines that verify changes before they reach production. Configure automated test runs on pull requests, report test results directly in GitHub, and develop a strategic framework for deciding what to test. Learn to prioritize test coverage based on business impact, complexity, and change frequency—balancing protection with development velocity.

You've been running tests locally with pnpm test in watch mode. This provides instant feedback while developing. But tests become more valuable when integrated into your workflow at key points:

  • Pull requests - Automated checks prevent broken code from being merged
  • Before deployment - Tests catch issues before they reach content editors
  • Scheduled runs - Detect drift from dependencies or external changes

Running tests as part of your continuous integration (CI) ensures every code change is validated automatically, regardless of who wrote it or what they tested locally.

The Architecture & DevOps course covers setting up schema validation, linting, and preview deployments for your pull requests.

When tests fail in CI, Vitest's non-zero exit code marks the check as failed on your pull request. On GitHub, for example, a failing pull request shows:

  • ❌ Red X next to the commit
  • Detailed logs showing which tests failed
  • Line numbers and error messages
  • Option to re-run failed tests

This prevents merging broken code and makes code review more efficient. Reviewers can focus on logic and design, trusting that tests verify correctness.

Not all code is equally important to test. Focus coverage on high-value targets:

  • Validation functions - Protect data integrity
  • Data transformation - Shape content for display
  • Critical business logic - Features that could break revenue or user experience
  • Custom input components - When they have non-trivial logic
  • Schema structure helpers - When they involve logic

And skip code with little to protect:

  • Simple schema definitions - No logic to test
  • Thin wrappers - Just pass through to libraries
  • UI-only components - Styling with no behavior

Testing strategy is about making intentional trade-offs. Perfect coverage isn't the goal—protecting critical business logic while maintaining development velocity is.

As your test suite grows, maintain structure:

apps/studio/
├── schemaTypes/
│   ├── validation/
│   │   ├── eventValidation.ts
│   │   └── eventValidation.test.ts
│   ├── components/
│   │   ├── DoorsOpenInput.tsx
│   │   └── DoorsOpenInput.test.tsx
│   └── eventType.ts
└── __tests__/
    ├── fixtures/
    │   ├── validation.ts
    │   ├── client.ts
    │   └── providers.tsx
    └── setup.ts

Key principles:

  • Co-locate tests with the code they test
  • Share fixtures in a central location (__tests__/fixtures) - see the sketch after this list
  • Name tests after the file being tested (DoorsOpenInput.test.tsx)
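
For example, the shared validation fixture might look something like this. This is a minimal sketch: the exact fields your mock context needs depend on what your validators read from it, and the cast reflects that only part of the context is implemented.

// __tests__/fixtures/validation.ts
import type {ValidationContext} from 'sanity'

// Builds a partial ValidationContext around a document stub. Validators in
// this lesson only read `context.document`, so the rest is left unimplemented.
export function createMockValidationContext(
  document: Record<string, unknown>,
): ValidationContext {
  return {document} as unknown as ValidationContext
}

The createEventContext helper later in this lesson builds on this kind of fixture.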

Tests require maintenance like production code. Follow these practices:

// ❌ Complex test with multiple concerns
it('handles everything', async () => {
  const result1 = await validateVenue(venue1, context1)
  const result2 = await validateVenue(venue2, context2)
  const result3 = await validateVenue(venue3, context3)
  expect(result1).toBe(true)
  expect(result2).toBe(false)
  expect(result3).toBe(true)
})

// ✅ Focused tests, one concept each
it('allows venue for in-person events', async () => {
  const result = await validateVenue(venue, inPersonContext)
  expect(result).toBe(true)
})

it('rejects venue for virtual events', async () => {
  const result = await validateVenue(venue, virtualContext)
  expect(result).toBe('Only in-person events can have a venue')
})

// ❌ Vague
it('works', () => {})
it('test1', () => {})

// ✅ Clear intent
it('allows venue for in-person events', () => {})
it('rejects venue for virtual events', () => {})
it('calculates doors open time 60 minutes before event', () => {})

// ❌ Repeated setup in every test
it('test 1', () => {
  const context = {document: {_id: '1', _type: 'event', eventType: 'in-person'}}
  // ... test
})
it('test 2', () => {
  const context = {document: {_id: '2', _type: 'event', eventType: 'virtual'}}
  // ... test
})

// ✅ Reusable fixture
function createEventContext(eventType: string) {
  return createMockValidationContext({
    _id: `event-${eventType}`,
    _type: 'event',
    eventType,
  })
}

it('allows venue for in-person events', () => {
  const context = createEventContext('in-person')
  // ... test
})

Building a test suite is an investment. Start small and expand strategically:

Begin with functions that protect data integrity:

  • Required field validation
  • Business rule enforcement
  • Data consistency checks

Goal: Prevent content editors from creating invalid documents
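
For example, keeping a business rule like validateVenue (from the examples above) as a named export makes it unit-testable, while the schema wires it in with rule.custom. A sketch; field and file names follow the event schema used in this lesson's examples:

// schemaTypes/eventType.ts (excerpt)
import {defineField, defineType} from 'sanity'
import {validateVenue} from './validation/eventValidation'

export const eventType = defineType({
  name: 'event',
  type: 'document',
  fields: [
    defineField({
      name: 'venue',
      type: 'reference',
      to: [{type: 'venue'}],
      // The rule is a named, imported function, so unit tests can call it
      // directly without booting the Studio.
      validation: (rule) => rule.custom(validateVenue),
    }),
  ],
})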

Add tests for helper functions with non-trivial logic:

  • Date/time calculations
  • Formatting utilities
  • Data transformations

Goal: Catch bugs in commonly-used utilities
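
For instance, the doors-open calculation from the naming examples above could be covered like this. A sketch: calculateDoorsOpen is an illustrative helper defined inline here, where a real project would import it from a utility module.

import {expect, it} from 'vitest'

// Illustrative helper: subtracts a lead time from the event start time.
function calculateDoorsOpen(eventStart: string, minutesBefore = 60): string {
  const start = new Date(eventStart)
  return new Date(start.getTime() - minutesBefore * 60_000).toISOString()
}

it('calculates doors open time 60 minutes before event', () => {
  expect(calculateDoorsOpen('2025-06-01T20:00:00.000Z')).toBe('2025-06-01T19:00:00.000Z')
})

it('supports a custom lead time', () => {
  expect(calculateDoorsOpen('2025-06-01T20:00:00.000Z', 30)).toBe('2025-06-01T19:30:00.000Z')
})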

Test custom inputs with complex behavior:

  • Components with conditional rendering
  • Components with user interactions
  • Components that query the dataset

Goal: Ensure editor UI works correctly
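
A sketch of such a test with React Testing Library, with two loudly-labeled assumptions: TestProviders is a hypothetical wrapper exported from the providers fixture shown earlier, and DoorsOpenInput is assumed to render a human-readable doors-open time.

// schemaTypes/components/DoorsOpenInput.test.tsx
import {render, screen} from '@testing-library/react'
import {expect, it} from 'vitest'
import {TestProviders} from '../../__tests__/fixtures/providers'
import {DoorsOpenInput} from './DoorsOpenInput'

it('shows the computed doors-open time to editors', () => {
  // Stubbed loosely: real Studio inputs receive a full set of input props
  // from Sanity, which the shared fixture can assemble.
  const props = {value: '2025-06-01T19:00:00.000Z'} as any

  render(
    <TestProviders>
      <DoorsOpenInput {...props} />
    </TestProviders>,
  )

  expect(screen.getByText(/doors open/i)).toBeDefined()
})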

Add GitHub Actions to run tests automatically:

  • On every pull request
  • Before merging to main
  • Before deployments

Goal: Prevent untested code from reaching production
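
A minimal workflow sketch, for example in .github/workflows/test.yml. Versions, branch names, and the monorepo layout are assumptions to adapt; pnpm/action-setup resolves the pnpm version from the packageManager field in package.json.

name: test

on:
  pull_request:
  push:
    branches: [main]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: pnpm/action-setup@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
          cache: pnpm
      - run: pnpm install --frozen-lockfile
      # GitHub Actions sets CI=true, so Vitest runs once instead of
      # entering watch mode.
      - run: pnpm test

With branch protection requiring this check, the red X described earlier blocks merging until the tests pass.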

Review your current Studio code. Categorize your validation functions, helpers, and components into high/medium/low priority based on business impact and complexity.

As your suite grows, periodically review test quality. One discipline that helps is test-driven development (TDD), which naturally produces better-designed code: writing tests first forces you to think about interfaces and edge cases before implementation. The cycle is:

  1. Red - Write a failing test
  2. Green - Make it pass with minimal code
  3. Refactor - Improve both test and production code

This discipline prevents over-engineering and keeps tests focused.

When you remove features or refactor code, delete tests that no longer serve a purpose. Dead code in tests is still maintenance burden.

Strategic investment

  • Tests pay dividends through confident refactoring and faster debugging
  • Start with high-value tests (validation, critical logic)
  • Grow your suite incrementally as complexity increases

Technical implementation

  • Pure functions are easiest to test
  • Mock external dependencies (clients, contexts)
  • Test user-facing behavior, not implementation details

Workflow integration

  • Watch mode for instant local feedback
  • CI runs for automated validation
  • Coverage reports to find gaps

Sustainable testing

  • Keep tests simple and focused
  • Co-locate tests with code
  • Delete obsolete tests
  • Prioritize readability over cleverness