Schema Evolution
As your application grows, your database schema must evolve. Adding features means adding columns, renaming fields for clarity, and occasionally changing data types. This chapter teaches you how to make these changes safely with D1 and Drizzle.
Schema Evolution Patterns
Before diving into specific operations, let’s understand the common patterns for evolving schemas:
```mermaid
flowchart TD
    subgraph SafeChanges["✅ Safe Changes (Non-Breaking)"]
        A["Add new column<br/>(nullable or with default)"]
        B["Add new table"]
        C["Add new index"]
        D["Add new constraint<br/>(on new data only)"]
    end
    subgraph CarefulChanges["⚠️ Careful Changes (Potentially Breaking)"]
        E["Rename column<br/>(requires data migration)"]
        F["Change column type<br/>(requires data transformation)"]
        G["Add NOT NULL<br/>(requires default or backfill)"]
    end
    subgraph DangerousChanges["🚨 Dangerous Changes (Breaking)"]
        H["Drop column<br/>(data loss)"]
        I["Drop table<br/>(data loss)"]
        J["Change primary key<br/>(foreign key issues)"]
    end
    A --> M["Generate migration"]
    B --> M
    C --> M
    D --> M
    E --> N["Plan carefully"]
    F --> N
    G --> N
    N --> M
    H --> O["Consider alternatives"]
    I --> O
    J --> O
    O --> M
    M --> P["Test locally"]
    P --> Q["Apply to staging"]
    Q --> R["Apply to production"]
```
Adding Columns
Adding columns is the most common schema change. It’s straightforward when done correctly.
Adding a Nullable Column
The safest way to add a column is to make it nullable (no notNull() constraint):
```ts
import { sqliteTable, text, integer } from 'drizzle-orm/sqlite-core';

export const users = sqliteTable('users', {
  id: integer('id').primaryKey({ autoIncrement: true }),
  name: text('name').notNull(),
  email: text('email').notNull().unique(),
  bio: text('bio'), // New nullable column - safe to add
  website: text('website'), // Another nullable column
});
```

Generate the migration:
```bash
pnpm drizzle-kit generate
```

Expected generated SQL:
```sql
ALTER TABLE `users` ADD `bio` text;
ALTER TABLE `users` ADD `website` text;
```

This migration is safe because:
- Existing rows get `NULL` for the new columns (see the sketch below)
- No existing data is affected
- No application code changes are required immediately
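Because the new columns are nullable, their inferred type is `string | null`, so application code can adopt them at its own pace. A minimal sketch of reading the new field, assuming a Worker-style environment with a D1 binding named `DB` and the `users` schema above (names are illustrative):

```ts
import { drizzle } from 'drizzle-orm/d1';
import { users } from './schema'; // hypothetical path to the schema above

export async function getUserBios(env: { DB: D1Database }) {
  const db = drizzle(env.DB);

  // Rows created before the migration simply return null for the new columns
  const rows = await db.select({ name: users.name, bio: users.bio }).from(users);

  // bio is typed as string | null, so handle pre-migration rows explicitly
  return rows.map((row) => ({
    name: row.name,
    bio: row.bio ?? 'This user has not written a bio yet.',
  }));
}
```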
Adding a Column with a Default Value
When you need a new column that shouldn’t be null, provide a default value:
```ts
import { sql } from 'drizzle-orm'; // needed for the raw default expression

export const users = sqliteTable('users', {
  id: integer('id').primaryKey({ autoIncrement: true }),
  name: text('name').notNull(),
  email: text('email').notNull().unique(),
  role: text('role').notNull().default('user'), // New column with default
  createdAt: integer('created_at', { mode: 'timestamp' })
    .notNull()
    .default(sql`(unixepoch())`), // Default to current timestamp
});
```

Generated SQL:
```sql
ALTER TABLE `users` ADD `role` text NOT NULL DEFAULT 'user';
ALTER TABLE `users` ADD `created_at` integer NOT NULL DEFAULT (unixepoch());
```

This is safe because:
- Existing rows automatically get the default value
- New rows also get the default if no value is provided (as the sketch below shows)
- No data gaps or null issues
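For example, a minimal sketch of an insert that omits `role`, assuming a `db` instance built from the schema above (function and import paths are illustrative):

```ts
import { eq } from 'drizzle-orm';
import type { DrizzleD1Database } from 'drizzle-orm/d1';
import { users } from './schema'; // hypothetical path to the schema above

export async function createDefaultRoleUser(db: DrizzleD1Database) {
  // role is omitted, so SQLite fills in the column default 'user';
  // created_at likewise falls back to unixepoch()
  await db.insert(users).values({
    name: 'Ada Lovelace',
    email: 'ada@example.com',
  });

  const [ada] = await db
    .select()
    .from(users)
    .where(eq(users.email, 'ada@example.com'));

  return ada.role; // 'user'
}
```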
Adding a Required Column to an Existing Table
If you need to add a NOT NULL column without a sensible default, you have two options:
Option 1: Add as nullable, backfill, then add constraint
```ts
export const users = sqliteTable('users', {
  // ... existing columns
  avatarUrl: text('avatar_url'), // Start as nullable
});
```

Then write a custom migration to backfill:
```sql
-- Step 1: Add nullable column
ALTER TABLE `users` ADD `avatar_url` text;

-- Step 2: Backfill existing rows with a default value
UPDATE `users` SET `avatar_url` = 'https://example.com/default-avatar.png' WHERE `avatar_url` IS NULL;

-- Step 3: Add the NOT NULL constraint (requires table recreation in SQLite;
-- see "Adding or Removing NOT NULL" later in this chapter)
```

Option 2: Use a placeholder default (simpler but less clean)
```ts
export const users = sqliteTable('users', {
  // ... existing columns
  avatarUrl: text('avatar_url').notNull().default('pending'), // Temporary default
});
```

Adding Multiple Columns at Once
You can add multiple columns in a single schema update:
```ts
export const users = sqliteTable('users', {
  id: integer('id').primaryKey({ autoIncrement: true }),
  name: text('name').notNull(),
  email: text('email').notNull().unique(),
  // New profile fields
  bio: text('bio'),
  website: text('website'),
  location: text('location'),
  company: text('company'),
  twitterHandle: text('twitter_handle'),
});
```

Drizzle Kit will generate a single migration with all the new columns.
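One way to sanity-check the change before generating the migration is to inspect the inferred row type; a small sketch (the comment reflects the schema above):

```ts
import { users } from './schema'; // hypothetical path to the schema above

// The inferred select type now includes every new profile field as string | null
type UserRow = typeof users.$inferSelect;
// {
//   id: number;
//   name: string;
//   email: string;
//   bio: string | null;
//   website: string | null;
//   location: string | null;
//   company: string | null;
//   twitterHandle: string | null;
// }
```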
Renaming Columns Safely
Renaming columns in SQLite requires careful planning. D1 uses SQLite, which has limited ALTER TABLE support.
The Challenge with Renaming
SQLite versions before 3.25.0 don’t support ALTER TABLE RENAME COLUMN. While D1 uses a newer SQLite version that does support it, Drizzle Kit generates the safer “table recreation” approach to ensure compatibility.
Method 1: Direct Rename (Modern SQLite)
If you’re confident your D1 instance supports it, you can write a custom migration:
```sql
-- Direct column rename (SQLite 3.25+)
ALTER TABLE `users` RENAME COLUMN `name` TO `full_name`;
```

Then update your schema to match:
```ts
export const users = sqliteTable('users', {
  id: integer('id').primaryKey({ autoIncrement: true }),
  fullName: text('full_name').notNull(), // Updated column name
  email: text('email').notNull().unique(),
});
```

Method 2: Table Recreation (Safest)
The safest approach recreates the table with the new structure:
```sql
-- Step 1: Create new table with desired structure
CREATE TABLE `users_new` (
  `id` integer PRIMARY KEY AUTOINCREMENT NOT NULL,
  `full_name` text NOT NULL,
  `email` text NOT NULL
);

-- Step 2: Copy data from old table to new
INSERT INTO `users_new` (`id`, `full_name`, `email`)
SELECT `id`, `name`, `email` FROM `users`;

-- Step 3: Drop old table
DROP TABLE `users`;

-- Step 4: Rename new table to original name
ALTER TABLE `users_new` RENAME TO `users`;

-- Step 5: Recreate indexes
CREATE UNIQUE INDEX `users_email_unique` ON `users` (`email`);
```

Method 3: Add New Column, Migrate, Remove Old (Gradual)
For production systems with zero-downtime requirements:
Step 1: Add new column
```ts
export const users = sqliteTable('users', {
  id: integer('id').primaryKey({ autoIncrement: true }),
  name: text('name').notNull(), // Keep old column
  fullName: text('full_name'), // Add new column (nullable initially)
  email: text('email').notNull().unique(),
});
```

Step 2: Migrate data
```sql
ALTER TABLE `users` ADD `full_name` text;
UPDATE `users` SET `full_name` = `name`;
```

Step 3: Update application to use new column
Deploy your application to read/write from both columns, preferring the new one.
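What that deployment can look like in code, as a sketch only (the helper names and `db` wiring are assumptions, not part of the schema above): writes populate both columns, and reads prefer `full_name` while falling back to `name` for rows that have not been migrated yet.

```ts
import { eq } from 'drizzle-orm';
import type { DrizzleD1Database } from 'drizzle-orm/d1';
import { users } from './schema'; // hypothetical path to the schema above

// Write path: keep both columns in sync while old and new code coexist
export async function updateUserName(db: DrizzleD1Database, id: number, newName: string) {
  await db
    .update(users)
    .set({ name: newName, fullName: newName })
    .where(eq(users.id, id));
}

// Read path: prefer the new column, fall back to the old one
export async function getUserName(db: DrizzleD1Database, id: number) {
  const [row] = await db
    .select({ name: users.name, fullName: users.fullName })
    .from(users)
    .where(eq(users.id, id));

  return row?.fullName ?? row?.name ?? null;
}
```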
Step 4: Make new column required, drop old
```sql
-- This requires table recreation in SQLite
CREATE TABLE `users_new` (
  `id` integer PRIMARY KEY AUTOINCREMENT NOT NULL,
  `full_name` text NOT NULL,
  `email` text NOT NULL
);

INSERT INTO `users_new` SELECT `id`, `full_name`, `email` FROM `users`;
DROP TABLE `users`;
ALTER TABLE `users_new` RENAME TO `users`;
CREATE UNIQUE INDEX `users_email_unique` ON `users` (`email`);
```

Changing Column Types
Type changes are among the trickiest schema modifications. SQLite’s flexibility with types helps, but careful planning is still required.
Understanding SQLite’s Type System
SQLite uses dynamic typing with “type affinities”:
| Declared Type | Affinity | Stored As |
|---|---|---|
| `INTEGER`, `INT` | INTEGER | Integer |
| `TEXT`, `VARCHAR` | TEXT | String |
| `REAL`, `FLOAT` | REAL | Float |
| `BLOB` | BLOB | Binary |
| `NUMERIC` | NUMERIC | Any |
This means many “type changes” don’t actually require migration—SQLite is flexible about what you store.
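For reference, these affinities line up with the column builders exported by `drizzle-orm/sqlite-core`; a brief sketch (table and column names are illustrative only):

```ts
import { sqliteTable, text, integer, real, blob, numeric } from 'drizzle-orm/sqlite-core';

export const example = sqliteTable('example', {
  count: integer('count'),   // INTEGER affinity
  label: text('label'),      // TEXT affinity
  ratio: real('ratio'),      // REAL affinity
  payload: blob('payload'),  // BLOB affinity
  amount: numeric('amount'), // NUMERIC affinity
});
```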
Safe Type Changes (No Migration Needed)
Some changes only affect your TypeScript types:
```ts
// Before: the flag is stored as a plain integer (TypeScript type: number)
export const products = sqliteTable('products', {
  price: integer('price'), // Cents as integer
  isActive: integer('is_active'), // 0 or 1
});

// After: same integer storage, but Drizzle now exposes is_active as a boolean,
// so no migration is generated - only the TypeScript type changes
export const products = sqliteTable('products', {
  price: integer('price'),
  isActive: integer('is_active', { mode: 'boolean' }),
});
```

Changing from Text to Integer
When you need to convert string data to numbers:
```sql
-- Step 1: Create new table with correct type
CREATE TABLE `products_new` (
  `id` integer PRIMARY KEY AUTOINCREMENT NOT NULL,
  `name` text NOT NULL,
  `price` integer NOT NULL
);

-- Step 2: Copy and convert data
INSERT INTO `products_new` (`id`, `name`, `price`)
SELECT `id`, `name`, CAST(`price` AS INTEGER) FROM `products`;

-- Step 3: Swap tables
DROP TABLE `products`;
ALTER TABLE `products_new` RENAME TO `products`;
```

Changing Integer to Text
Converting numbers to strings is straightforward:
```sql
CREATE TABLE `items_new` (
  `id` integer PRIMARY KEY AUTOINCREMENT NOT NULL,
  `product_code` text NOT NULL
);

INSERT INTO `items_new` (`id`, `product_code`)
SELECT `id`, CAST(`product_code` AS TEXT) FROM `items`;

DROP TABLE `items`;
ALTER TABLE `items_new` RENAME TO `items`;
```

Adding or Removing NOT NULL
Adding NOT NULL to an existing column:
```sql
-- First, ensure no nulls exist
UPDATE `users` SET `bio` = '' WHERE `bio` IS NULL;

-- Then recreate table with constraint
CREATE TABLE `users_new` (
  `id` integer PRIMARY KEY AUTOINCREMENT NOT NULL,
  `name` text NOT NULL,
  `email` text NOT NULL,
  `bio` text NOT NULL -- Now required
);

INSERT INTO `users_new` SELECT * FROM `users`;
DROP TABLE `users`;
ALTER TABLE `users_new` RENAME TO `users`;
CREATE UNIQUE INDEX `users_email_unique` ON `users` (`email`);
```

Removing NOT NULL from an existing column:
```sql
CREATE TABLE `users_new` (
  `id` integer PRIMARY KEY AUTOINCREMENT NOT NULL,
  `name` text NOT NULL,
  `email` text NOT NULL,
  `bio` text -- Now optional
);

INSERT INTO `users_new` SELECT * FROM `users`;
DROP TABLE `users`;
ALTER TABLE `users_new` RENAME TO `users`;
CREATE UNIQUE INDEX `users_email_unique` ON `users` (`email`);
```

Changing Timestamp Modes
Drizzle offers different modes for storing timestamps:
```ts
// Mode 1: Store as Unix timestamp (integer seconds)
createdAt: integer('created_at', { mode: 'timestamp' }),

// Mode 2: Store as Unix timestamp in milliseconds
createdAt: integer('created_at', { mode: 'timestamp_ms' }),

// Mode 3: Store as ISO string
createdAt: text('created_at'),
```

Converting between these requires data transformation:
```sql
-- Convert from seconds to milliseconds
UPDATE `posts` SET `created_at` = `created_at` * 1000;
```

Best Practices for Schema Evolution
1. Plan Before You Change
Before modifying your schema, ask:
- Will this break existing queries?
- Does the application code need updating first?
- Can this be done without downtime?
2. Test Migrations Thoroughly
```bash
# Always test locally first
pnpm drizzle-kit generate
pnpm wrangler d1 migrations apply DB --local

# Then staging
pnpm wrangler d1 migrations apply DB --remote --env staging

# Finally production
pnpm wrangler d1 migrations apply DB --remote
```

3. Keep Migrations Small and Focused
```sql
-- ✅ Good: One focused change
ALTER TABLE `users` ADD `bio` text;

-- ❌ Avoid: Multiple unrelated changes
ALTER TABLE `users` ADD `bio` text;
CREATE TABLE `categories` (...);
ALTER TABLE `posts` DROP COLUMN `legacy_field`;
```

4. Document Breaking Changes
Add comments to complex migrations:
```sql
-- BREAKING CHANGE: Removes legacy_status column
-- Ensure application version 2.0+ is deployed before running
-- The status field was migrated to status_enum in migration 0008

CREATE TABLE `orders_new` (...);
-- ... rest of migration
```

5. Never Edit Applied Migrations
Once a migration has been applied to any environment:
- Never modify it
- Create a new migration to fix issues
- Applied migrations are immutable history
6. Have a Rollback Strategy
For critical changes, prepare a rollback migration before deploying:
```sql
-- Prepared rollback for migration 0011
-- Only run if 0011 causes issues

-- Reverse the changes...
```

Summary
Schema evolution is an essential skill for long-lived applications:
- Adding columns: Prefer nullable columns or columns with defaults
- Renaming columns: Use table recreation for safety, or direct rename for simplicity
- Changing types: Leverage SQLite’s flexibility, but test conversions carefully
- Always test: Local → Staging → Production
With these patterns, you can confidently evolve your D1 database schema as your application grows.