@repo/db is the central persistence layer. It uses Drizzle ORM for schema definitions and migrations, and Kysely as the runtime query builder for full SQL control with type safety. All database access goes through typed repository objects whose methods return Safe<T> results.
graph TD
db["@repo/db"] --> errors["@repo/errors"]
db --> logger["@repo/logger"]
db --> safe["@repo/safe"]
db -.-> typescript_config["@repo/typescript-config"]
db -.-> vitest_config["@repo/vitest-config"]
graph LR
subgraph schema [Schema Layer]
Drizzle["Drizzle ORM"]
Migrations["drizzle-kit migrations"]
end
subgraph runtime [Runtime Layer]
Kysely["Kysely Query Builder"]
Repos["Repository Objects"]
end
subgraph bridge [Type Bridge]
D2K["DrizzleToKyesely"]
end
Drizzle --> D2K
D2K --> Kysely
Kysely --> Repos
Drizzle --> Migrations
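The DrizzleToKyesely bridge in the diagram above has one essential job: expose the schema to Kysely under the snake_case column names Postgres actually uses. A minimal sketch of that idea follows; the type names and the `PatientModel` shape are illustrative assumptions, not the generated bridge code.

```typescript
// Illustrative sketch of the bridge idea: remap camelCase schema fields to
// the snake_case names Kysely queries against. Names here are assumptions
// for illustration, not the generated DrizzleToKyesely type.

// Type-level camelCase -> snake_case conversion.
type CamelToSnake<S extends string> = S extends `${infer Head}${infer Tail}`
  ? Head extends Uppercase<Head>
    ? `_${Lowercase<Head>}${CamelToSnake<Tail>}`
    : `${Head}${CamelToSnake<Tail>}`
  : S;

// Remap every key of a model type to its snake_case form.
type SnakeKeys<T> = { [K in keyof T & string as CamelToSnake<K>]: T[K] };

interface PatientModel {
  patientId: number;
  firstName: string;
}

// Kysely-side shape: { patient_id: number; first_name: string }
type PatientTable = SnakeKeys<PatientModel>;

// Runtime counterpart of the same renaming rule.
function camelToSnake(key: string): string {
  return key.replace(/[A-Z]/g, (ch) => `_${ch.toLowerCase()}`);
}
```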
- The Drizzle schema lives in src/drizzle/schema/. It also drives migrations via drizzle-kit.
- The db singleton (Kysely&lt;DrizzleToKyesely&gt;) connects to Postgres via pg.Pool.
- DrizzleToKyesely is an auto-generated type bridge that maps Drizzle table definitions to Kysely's type system (with snake_case column names matching Postgres).
- Repositories are created with createDbRepo(). Each method uses Kysely to run queries, and the wrapper automatically converts results into Safe&lt;T&gt;.

| Import | Resolves to | Description |
|---|---|---|
| @repo/db | src/repos/index.ts | All repository objects (patientsRepo, templatesRepo, etc.) |
| @repo/db/schema | src/drizzle/schema/schema.ts | Drizzle schema (tables, enums, relations) |
| @repo/db/db.types | src/db.types.ts | Branded ID types, TableRow, TableInsert, TableUpdate, zId |
| @repo/db/supabase | src/lib/supabase.ts | Supabase client factory |
| @repo/db/supabase.types | src/lib/supabase.types.ts | Generated Supabase TypeScript types |
import { patientsRepo } from "@repo/db";
const result = await patientsRepo.getById(patientId);
if (result.error) {
// handle error -- result.error is a CircleError
return;
}
const patient = result.data;
Repos follow a consistent pattern using createDbRepo. Here is the pattern in a minimal example (my-thing.repo.ts):
// src/repos/my-thing.repo.ts
import type { MyThingId, MyThingInsertParams } from "../drizzle/schema/tables/my-thing/my-thing.table";
import { createDbRepo } from "../lib/db-repo";
export const myThingRepo = createDbRepo("myThing", ({ db }) => ({
getById: async (id: MyThingId) => {
return await db
.selectFrom("my_thing")
.selectAll()
.where("id", "=", id)
.executeTakeFirstOrThrow();
},
create: async (params: MyThingInsertParams) => {
return await db
.insertInto("my_thing")
.values(params)
.returningAll()
.executeTakeFirstOrThrow();
},
}));
Then register it in src/repos/index.ts:
export * from "./my-thing.repo";
Key points:
createDbRepo("name", ({ db }) => ({ ... })) -- the db parameter is a typed Kysely instance.createDbRepo does that automatically and returns Safe<T>.Use executeTakeFirstOrThrow() -- if no row is found, createDbRepo converts the NoResultError into a DbNotFoundError.
getById: async (id: PatientId) => {
return await db
.selectFrom("patient")
.selectAll()
.where("patient_id", "=", id)
.executeTakeFirstOrThrow();
},
Use execute() for lists. Build queries conditionally for optional filters.
listInOrganization: async (params: ListFacilitiesParams) => {
const { entityId, search, limit = 50, offset = 0 } = params;
let query = db
.selectFrom("facilities")
.where("entity_id", "=", entityId)
.select(["id", "name"]);
if (search?.trim()) {
query = query.where("name", "ilike", `%${search.trim()}%`);
}
return await query.orderBy("name").limit(limit).offset(offset).execute();
},
create: async (params: AuditQuestionInsertParams) => {
return await db
.insertInto("audit_question")
.values(params)
.returningAll()
.executeTakeFirstOrThrow();
},
update: async (id: TemplateId, params: TemplateUpdateParams) => {
return await db
.updateTable("templates")
.set({ ...params, last_updated: parseDbDate(new Date().toISOString()) })
.where("id", "=", id)
.returningAll()
.executeTakeFirstOrThrow();
},
delete: async (id: AuditQuestionId) => {
return await db
.deleteFrom("audit_question")
.where("id", "=", id)
.returningAll()
.executeTakeFirstOrThrow();
},
Use leftJoin with onRef and explicit column selections via select and eb.ref(...).as(...):
getDetail: async (id: PatientId) => {
return await db
.selectFrom("patient")
.leftJoin("profiles", (join) =>
join.onRef("patient.primary_provider_id", "=", "profiles.id"))
.leftJoin("facilities", (join) =>
join.onRef("patient.facility_id", "=", "facilities.id"))
.where("patient.patient_id", "=", id)
.select(["patient.patient_id", "patient.first_name", "patient.last_name"])
.select((eb) => [
eb.ref("profiles.first_name").as("provider_first_name"),
eb.ref("facilities.name").as("facility_name"),
])
.executeTakeFirstOrThrow();
},
Use db.transaction().execute() for multi-step operations that must be atomic:
duplicate: async (id: TemplateId, userId: UserId) => {
return await db.transaction().execute(async (trx) => {
const original = await trx
.selectFrom("templates")
.selectAll()
.where("id", "=", id)
.executeTakeFirstOrThrow();
const newTemplate = await trx
.insertInto("templates")
.values({ title: `${original.title} Copy`, created_by: userId })
.returningAll()
.executeTakeFirstOrThrow();
const sections = await trx
.selectFrom("sections")
.selectAll()
.where("template_id", "=", id)
.execute();
if (sections.length > 0) {
await trx
.insertInto("sections")
.values(sections.map((s) => ({ ...s, template_id: newTemplate.id })))
.execute();
}
return newTemplate;
});
},
The package uses a branded ID system (@repo/db/db.types) to prevent accidentally mixing IDs from different tables:
import { type Id, createIdParser, zId } from "@repo/db/db.types";
// In a table definition:
export type PatientId = Id<"bigint", "PatientId">;
export const parsePatientId = createIdParser<PatientId>("bigint");
export const zPatientId = zId(parsePatientId);
- Id&lt;Type, TagName&gt; is a branded type that looks like a plain string or number at runtime but is distinct at the type level.
- createIdParser(idType) creates a parser function for "uuid", "bigint", "string", or "number" IDs.
- zId(parser) creates a Zod schema that validates and transforms raw input into the branded ID type -- useful in tRPC input schemas.
- Table definitions also export *InsertParams and *UpdateParams types derived from Kysely's Insertable&lt;T&gt; and Updateable&lt;T&gt;.
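The branding idea can be sketched as follows, assuming an intersection-based brand; the real Id, createIdParser, and zId helpers in @repo/db/db.types may be implemented differently.

```typescript
// Sketch of a branded ID: a plain number at runtime, distinct at the type
// level. Brand and parsePatientId are illustrative stand-ins.
type Brand<T, Tag extends string> = T & { readonly __brand: Tag };

type PatientId = Brand<number, "PatientId">;
type TemplateId = Brand<number, "TemplateId">;

// A "bigint"-style parser: validate an integer, then cast to the brand.
function parsePatientId(raw: unknown): PatientId {
  const n = typeof raw === "string" ? Number(raw) : raw;
  if (typeof n !== "number" || !Number.isInteger(n)) {
    throw new Error(`invalid PatientId: ${String(raw)}`);
  }
  return n as PatientId;
}

const pid = parsePatientId("42"); // plain number 42 at runtime
// const tid: TemplateId = pid;   // compile error: brands do not mix
```

Because PatientId and TemplateId carry different brands, passing one where the other is expected fails at compile time even though both are numbers at runtime.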
createDbRepo wraps every repo method so you never need to write try/catch in repo code:
- Successful results are returned as { data: T, error: null } (a SafeSuccess&lt;T&gt;).
- NoResultError (from executeTakeFirstOrThrow) is converted to DbNotFoundError.
- PostgrestError is converted to UnhandledPostgresError.
- DbConnectionError is rethrown (not wrapped in Safe) since the app cannot recover from these.
- Any other thrown error is converted to UnhandledError.

Callers handle errors through the Safe&lt;T&gt; pattern:
const result = await patientsRepo.getById(patientId);
if (result.error) {
if (result.error instanceof DbNotFoundError) {
// patient does not exist
}
// other error
return;
}
const patient = result.data;
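Callers that repeat this check can factor it into a type guard. The isOk helper below is hypothetical (not part of @repo/safe), sketched under the assumption that Safe&lt;T&gt; is a { data, error } discriminated union.

```typescript
// Hypothetical type guard for Safe<T> results; isOk is an assumed helper
// name, and Safe<T> here is a simplified stand-in for the real type.
type Safe<T> = { data: T; error: null } | { data: null; error: Error };

// Narrow a Safe result to its success arm.
function isOk<T>(result: Safe<T>): result is { data: T; error: null } {
  return result.error === null;
}

const ok: Safe<number> = { data: 7, error: null };
const fail: Safe<number> = { data: null, error: new Error("boom") };

if (isOk(ok)) {
  // inside this branch, ok.data is typed as number -- no null check needed
}
```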
| Script | Description |
|---|---|
| test | Runs Vitest tests (with .env.test loaded). |
| generate | Runs Drizzle Kit to generate SQL migrations. |
| migrate | Applies Drizzle migrations to the database. |
| codegen | Regenerates Supabase TypeScript types from the remote project. |
| check-types | Runs tsc --noEmit to typecheck the package. |