
1. Configure Your Source Database

PowerSync needs to connect to your source database (Postgres, MongoDB, MySQL or SQL Server) to replicate data. Before setting up PowerSync, you need to configure your database with the appropriate permissions and replication settings.
Configuring Postgres for PowerSync involves three main tasks:
  1. Enable logical replication: PowerSync reads the Postgres WAL using logical replication. Set wal_level = logical in your Postgres configuration.
  2. Create a PowerSync database user: Create a role with replication privileges and read-only access to your tables.
  3. Create a powersync publication: Create a logical replication publication named powersync to specify which tables to replicate.
-- 1. Enable logical replication (requires restart)
ALTER SYSTEM SET wal_level = logical;

-- 2. Create PowerSync database user/role with replication privileges and read-only access to your tables
CREATE ROLE powersync_role WITH REPLICATION BYPASSRLS LOGIN PASSWORD 'myhighlyrandompassword';

-- Set up permissions for the newly created role
-- Read-only (SELECT) access is required
GRANT SELECT ON ALL TABLES IN SCHEMA public TO powersync_role;

-- Optionally, grant SELECT on all future tables (to cater for schema additions)
ALTER DEFAULT PRIVILEGES IN SCHEMA public GRANT SELECT ON TABLES TO powersync_role;

-- 3. Create a publication to replicate tables. The publication must be named "powersync"
CREATE PUBLICATION powersync FOR ALL TABLES;
  • Version compatibility: PowerSync requires Postgres version 11 or greater.
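To confirm the configuration took effect, you can run a few quick checks against your database. A minimal sketch, using the role and publication names from the example above:

-- Should return 'logical' (after a restart)
SHOW wal_level;

-- Should include a row for the 'powersync' publication
SELECT pubname FROM pg_publication;

-- Should show rolreplication = true for the PowerSync role
SELECT rolname, rolreplication FROM pg_roles WHERE rolname = 'powersync_role';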
Learn More
  • For more details on Postgres setup, including provider-specific guides (Supabase, AWS RDS, MongoDB Atlas, etc.), see Source Database Setup.
  • Self-hosting PowerSync? See the Self-Host-Demo App for a complete working example of connecting a Postgres source database to PowerSync.

2. Set Up PowerSync Service Instance

PowerSync is available as a cloud-hosted service (PowerSync Cloud) or can be self-hosted (PowerSync Open Edition or PowerSync Enterprise Self-Hosted Edition).
If you haven’t yet, sign up for a free PowerSync Cloud account here. After signing up, you will be taken to the PowerSync Dashboard. Here, create a new project. Development and Production instances of the PowerSync Service will be created by default in the project.

3. Connect PowerSync to Your Source Database

The next step is to connect your PowerSync Service instance to your source database.
In the PowerSync Dashboard, select your project and instance, then go to Database Connections:
  1. Click Connect to Source Database
  2. Select the appropriate database type tab (Postgres, MongoDB, MySQL or SQL Server)
  3. Fill in your connection details:
    Note: Use the username (e.g., powersync_role) and password you created in Step 1: Configure your Source Database.
    • Postgres: Host, Port (5432), Database name, Username, Password, SSL Mode
    • MongoDB: Connection URI (e.g., mongodb+srv://user:password@your-cluster.mongodb.net/database)
    • MySQL: Host, Port (3306), Database name, Username, Password
    • SQL Server: Name, Host, Port (1433), Database name, Username, Password
  4. Click Test Connection to verify
  5. Click Save Connection
PowerSync will now deploy and configure an isolated cloud environment, which can take a few minutes.
Learn More
For more details on database connections, including provider-specific connection details (Supabase, AWS RDS, MongoDB Atlas, etc.), see Source Database Connection.

4. Define Basic Sync Rules

Sync Rules control which data gets synced to which users/devices. They consist of SQL-like queries organized into “buckets” (groupings of data). Each PowerSync Service instance has a Sync Rules definition in YAML format. The simplest way to get started is a global bucket that syncs data to all users:
bucket_definitions:
  global:
    data:
      - SELECT * FROM todos
      - SELECT * FROM lists WHERE archived = false
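The global bucket above syncs the same data to every user. Once you need per-user data, Sync Rules also support parameter queries. Here is a sketch of a parameterized bucket, assuming your lists table has an owner_id column (an assumption; not part of this guide's examples):

bucket_definitions:
  user_lists:
    # One bucket per list owned by the requesting user
    parameters: SELECT id AS list_id FROM lists WHERE owner_id = request.user_id()
    data:
      - SELECT * FROM lists WHERE id = bucket.list_id
      - SELECT * FROM todos WHERE list_id = bucket.list_id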

Deploy Sync Rules

In the PowerSync Dashboard:
  1. Select your project and instance
  2. Go to the Sync Rules view
  3. Edit the YAML directly in the dashboard
  4. Click Deploy to validate and deploy your Sync Rules
Note: Table/collection names within your Sync Rules must match the table names defined in your client-side schema (defined in a later step below).
Learn More
For more details on Sync Rules usage, see the Sync Rules documentation.

5. Generate a Development Token

For quick development and testing, you can generate a temporary development token instead of implementing full authentication. You’ll use this token for two purposes:
  • Testing with the Sync Diagnostics Client (in the next step) to verify your setup and Sync Rules
  • Connecting your app (in a later step) to test the client SDK integration
  1. In the PowerSync Dashboard, select your project and instance
  2. Go to the Client Auth view
  3. Check the Development tokens setting and save your changes
  4. Click the Connect button in the top bar
  5. Enter a token subject: This would normally be the user ID you want to test with, but since your Sync Rules currently use a simple global bucket that syncs all data to all users (as recommended in the previous step), any placeholder such as test-user will do.
  6. Click Generate token and copy the token
Development tokens expire after 12 hours.

6. [Optional] Test Sync with the Sync Diagnostics Client

Before implementing the PowerSync Client SDK in your app, you can validate that syncing is working correctly using our Sync Diagnostics Client (this hosted version works with both PowerSync Cloud and self-hosted setups). Use the development token you generated in the previous step to connect and verify your setup:
  1. Go to https://diagnostics-app.powersync.com
  2. Enter your development token (from the Generate a Development Token step above)
  3. Enter your PowerSync instance URL (found in PowerSync Dashboard - click Connect in the top bar)
  4. Click Connect
The Sync Diagnostics Client will connect to your PowerSync Service instance and display information about the synced data, and allow you to query the client-side SQLite database.
Checkpoint: Inspect your global bucket and synced tables in the Sync Diagnostics Client — these should match the Sync Rules you defined previously. This confirms your setup is working correctly before integrating the client SDK into your app.

7. Use the Client SDK

Now it’s time to integrate PowerSync into your app. This involves installing the SDK, defining your client-side schema, instantiating the database, connecting to your PowerSync Service instance, and reading/writing data.

Install the Client SDK

Add the PowerSync Client SDK to your app project. PowerSync provides SDKs for various platforms and frameworks.
Add the PowerSync React Native NPM package to your project:
npx expo install @powersync/react-native
Install peer dependencies
PowerSync requires a SQLite database adapter. Choose between OP-SQLite and React Native Quick SQLite (install commands are sketched below).
Using Expo Go? Our native database adapters (OP-SQLite and React Native Quick SQLite) are not compatible with Expo Go’s sandbox environment. To run PowerSync with Expo Go, install our JavaScript-based adapter @powersync/adapter-sql-js instead. See details here.
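For example, with Expo (a sketch; double-check the current package names in the React Native SDK README before installing):

# Option A: React Native Quick SQLite (PowerSync fork)
npx expo install @journeyapps/react-native-quick-sqlite

# Option B: OP-SQLite adapter
npx expo install @op-engineering/op-sqlite @powersync/op-sqlite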
Polyfills and additional notes:
  • For async iterator support with watched queries, additional polyfills are required. See the Babel plugins section in the README.
  • When using the OP-SQLite package, we recommend adding this metro config to avoid build issues.

Define Your Client-Side Schema

This refers to the schema for the managed SQLite database exposed by the PowerSync Client SDKs, which your app can read from and write to. The schema is applied when the database is instantiated (as we’ll show in the next step).
PowerSync Cloud: The easiest way to generate your schema is using the PowerSync Dashboard. Click the Connect button in the top bar to generate the client-side schema in your preferred language, based on your Sync Rules.
Here’s an example schema for a simple todos table:
import { column, Schema, Table } from '@powersync/react-native';

const todos = new Table(
  {
    list_id: column.text,
    created_at: column.text,
    completed_at: column.text,
    description: column.text,
    created_by: column.text,
    completed_by: column.text,
    completed: column.integer
  },
  { indexes: { list: ['list_id'] } }
);

export const AppSchema = new Schema({
  todos
});
Note: The schema does not explicitly specify an id column, since PowerSync automatically creates an id column of type text. PowerSync recommends using UUIDs.
Learn More
The client-side schema uses three column types: text, integer, and real. These map directly to values from your Sync Rules and are automatically cast if needed. For details on how backend database types map to SQLite types, see Types.
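Since the Sync Rules defined earlier also sync a lists table, your client-side schema needs a matching table. A sketch, assuming hypothetical name and owner_id columns alongside the archived flag used in the Sync Rules:

const lists = new Table({
  created_at: column.text,
  name: column.text,       // hypothetical column, for illustration
  owner_id: column.text,   // hypothetical column, for illustration
  archived: column.integer // booleans are stored as integers in SQLite
});

// Include lists alongside todos in the schema from above
export const AppSchema = new Schema({
  todos,
  lists
});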

Instantiate the PowerSync Database

Now that you have your client-side schema defined, instantiate the PowerSync database in your app. This creates the client-side SQLite database that will be kept in sync with your source database based on your Sync Rules configuration.
import { PowerSyncDatabase } from '@powersync/react-native';
import { AppSchema } from './Schema';

export const db = new PowerSyncDatabase({
  schema: AppSchema,
  database: {
    dbFilename: 'powersync.db'
  }
});

Connect to PowerSync Service Instance

Connect your client-side PowerSync database to the PowerSync Service instance you created in step 2 by defining a backend connector and calling connect(). The backend connector handles authentication and uploading mutations to your backend.
Note: This section assumes you want to use PowerSync to sync your backend source database with SQLite in your app. If you only want to use PowerSync to manage a local SQLite database without sync, instantiate the PowerSync database without calling connect(), and refer to our Local-Only guide.
You don’t have to worry about the backend connector implementation details right now — you can leave the boilerplate as-is and come back to it later.
For development, you can use the development token you generated in the Generate a Development Token step above. For production, you’ll implement proper JWT authentication as we’ll explain further below.
import { AbstractPowerSyncDatabase, PowerSyncBackendConnector, PowerSyncCredentials } from '@powersync/react-native';
import { db } from './Database';

class Connector implements PowerSyncBackendConnector {
  async fetchCredentials(): Promise<PowerSyncCredentials> {
    // for development: use development token
    return {
      endpoint: 'https://your-instance.powersync.com',
      token: 'your-development-token-here'
    };
  }

  async uploadData(database: AbstractPowerSyncDatabase) {
    const transaction = await database.getNextCrudTransaction();
    if (!transaction) return;

    for (const op of transaction.crud) {
      const record = { ...op.opData, id: op.id };
      // upload to your backend API
    }

    await transaction.complete();
  }
}

// connect the database to PowerSync Service
const connector = new Connector();
await db.connect(connector);
Once connected, you can read from and write to the client-side SQLite database. Changes from your source database will be automatically synced down into the SQLite database. For client-side mutations to be uploaded back to your source database, you need to complete the backend integration as we’ll explain below.
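After calling connect(), you may want to know when the initial sync has completed before rendering data. A minimal sketch using the SDK's sync status APIs:

// Wait for the first full sync to complete
await db.waitForFirstSync();

// Or observe sync status changes over time
db.registerListener({
  statusChanged: (status) => {
    console.log('connected:', status.connected, 'last synced at:', status.lastSyncedAt);
  }
});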

Read Data

Read data using SQL queries. The data comes from your client-side SQLite database:
// Get all todos
const todos = await db.getAll('SELECT * FROM todos');

// Get a single todo
const todo = await db.get('SELECT * FROM todos WHERE id = ?', [todoId]);

// Watch for changes (reactive query)
const stream = db.watch('SELECT * FROM todos WHERE list_id = ?', [listId]);
for await (const todos of stream) {
  // Update UI when data changes
  console.log(todos);
}

// Note: The above example requires async iterator support in React Native. 
// If you encounter issues, use one of these callback-based APIs instead:

// Option 1: Using onResult callback
// const abortController = new AbortController();
// db.watch(
//   'SELECT * FROM todos WHERE list_id = ?',
//   [listId],
//   {
//     onResult: (todos) => {
//       // Update UI when data changes
//       console.log(todos);
//     }
//   },
//   { signal: abortController.signal }
// );

// Option 2: Using the query builder API
// const query = db
//   .query({
//     sql: 'SELECT * FROM todos WHERE list_id = ?',
//     parameters: [listId]
//   })
//   .watch();
// query.registerListener({
//   onData: (todos) => {
//     // Update UI when data changes
//     console.log(todos);
//   }
// });

Write Data

Write data using SQL INSERT, UPDATE, or DELETE statements. PowerSync automatically queues these mutations and uploads them to your backend via the uploadData() function, once you’ve fully implemented your backend connector (as we’ll talk about below).
// Insert a new todo
await db.execute(
  'INSERT INTO todos (id, created_at, list_id, description) VALUES (uuid(), date(), ?, ?)',
  [listId, 'Buy groceries']
);

// Update a todo
await db.execute(
  'UPDATE todos SET completed = 1, completed_at = date() WHERE id = ?',
  [todoId]
);

// Delete a todo
await db.execute('DELETE FROM todos WHERE id = ?', [todoId]);
Best practice: Use UUIDs when inserting new rows on the client side. UUIDs can be generated offline/locally, allowing for unique identification of records created in the client database before they are synced to the server. See Client ID for more details.
Learn More
For more details, see the Writing Data page.
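When you're ready to replace the placeholder in the connector's uploadData() from earlier, the loop body typically switches on the operation type and forwards each mutation to your backend. A sketch, assuming a hypothetical REST endpoint at https://your-backend.example.com:

import { UpdateType, CrudEntry } from '@powersync/react-native';

// Hypothetical helper: forward a single queued mutation to your backend API
async function uploadOperation(op: CrudEntry) {
  const endpoint = `https://your-backend.example.com/api/${op.table}`;
  switch (op.op) {
    case UpdateType.PUT: // row created locally
      await fetch(endpoint, {
        method: 'PUT',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ ...op.opData, id: op.id })
      });
      break;
    case UpdateType.PATCH: // row updated locally
      await fetch(`${endpoint}/${op.id}`, {
        method: 'PATCH',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(op.opData)
      });
      break;
    case UpdateType.DELETE: // row deleted locally
      await fetch(`${endpoint}/${op.id}`, { method: 'DELETE' });
      break;
  }
}

You would then call this hypothetical helper for each op inside the for loop of uploadData() shown earlier, before transaction.complete().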

Next Steps

For production deployments, you’ll need to:
  1. Implement Authentication: Replace development tokens with proper JWT-based authentication. PowerSync supports various authentication providers including Supabase, Firebase Auth, Auth0, Clerk, and custom JWT implementations.
  2. Configure & Integrate Your Backend Application: Set up your backend to handle mutations uploaded from clients.
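For item 1, the main change on the client is replacing the hard-coded development token in fetchCredentials() with a token fetched from your own backend or auth provider. A sketch, assuming a hypothetical endpoint that returns a PowerSync JWT for the signed-in user:

// Inside your PowerSyncBackendConnector implementation
async fetchCredentials(): Promise<PowerSyncCredentials> {
  // Hypothetical backend endpoint that mints a PowerSync JWT for the current user
  const response = await fetch('https://your-backend.example.com/api/powersync-token');
  const { token } = await response.json();

  return {
    endpoint: 'https://your-instance.powersync.com',
    token
  };
}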

Questions?

Try “Ask AI” on this site, which is trained on all our documentation, repositories and Discord discussions. You can also join our community Discord server, where you can browse topics from the PowerSync community and chat with our team.