dbt Projects
You can easily make changes in dbt and see them updated in your Lightdash project.
Lightdash supports dbt v1.4.0 and above. If you are using an older version of dbt, you will need to upgrade before you can sync your project to Lightdash.
1. Automatically: deploy your changes to Lightdash using a GitHub action
If you've connected Lightdash to GitHub, you can set up a GitHub action and get Lightdash to deploy your project automatically. This is the easiest way to keep Lightdash in sync with your changes in dbt.
Step 1: add the credentials to GitHub secrets
We are going to add some secrets and config to GitHub Actions, but you don't want those to be public, so the best way to do this is to add them as secrets on GitHub.
If you already have a GitHub action for Lightdash, then you can use the same Lightdash secrets you created for your other action.
Go to your repo and click on Settings. In the left sidebar, click on Secrets under Security, then click on New repository secret.

We need to add the following secrets:
LIGHTDASH_API_KEY
Create a new personal access token in Lightdash by going to Settings > Personal Access Tokens. This is the token you'll use for LIGHTDASH_API_KEY.
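If you want to sanity-check the token before saving it as a secret, you can call the Lightdash API with it. A quick, illustrative check (this assumes the /api/v1/org endpoint and the ApiKey authorization scheme; swap in your own Lightdash URL):
curl -s -H "Authorization: ApiKey <your-personal-access-token>" https://app.lightdash.cloud/api/v1/org
If the token is valid, you'll get back a JSON response describing your organization rather than an authentication error.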

LIGHTDASH_PROJECT
The UUID for your project. For example, if your URL looks like https://eu1.lightdash.cloud/projects/3538ab33-dc90-45f0-aabb-e50bba3a5f69/tables, then 3538ab33-dc90-45f0-aabb-e50bba3a5f69 is your LIGHTDASH_PROJECT.
LIGHTDASH_URL
This is either https://eu1.lightdash.cloud or https://app.lightdash.cloud for Lightdash Cloud users (check the URL to your Lightdash project).
If you self-host, this should be your own custom domain.
DBT_PROFILES
Some tips for this bit:
- You might be able to copy a bunch of the information from your local profiles.yml file. You can see what's in there by typing cat ~/.dbt/profiles.yml in your terminal.
- If you have separate prod and dev profiles, you probably want to use the information from your prod profile for your GitHub action.
- If you want to have different connection settings depending on the user that opened the pull request (dev profiles), then check out this guide.
Find your data warehouse in the list below to get a profiles.yml file template. Fill out this template; the filled-out version is your DBT_PROFILES secret.
BigQuery
Step 1: create a secret called GOOGLE_APPLICATION_CREDENTIALS
Add the service account credentials (the JSON file) that you want to use for your GitHub action. It should look something like this:
{
  "type": "service_account",
  "project_id": "jaffle_shop",
  "private_key_id": "12345",
  "private_key": "-----BEGIN PRIVATE KEY----- ... -----END PRIVATE KEY-----\n",
  "client_email": "jaffle_shop@jaffle_shop.iam.gserviceaccount.com",
  "client_id": "12345",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/jaffle_shop"
}
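The workflow you'll set up in Step 2 is expected to write this secret to a file called keyfile.json before dbt runs, which is why the profile template below can simply point at keyfile: keyfile.json. Conceptually, that looks something like the following workflow step (illustrative only; the real step lives in the workflow file you'll copy later):
- name: Write BigQuery keyfile
  run: echo '${{ secrets.GOOGLE_APPLICATION_CREDENTIALS }}' > ./keyfile.json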
Step 2: create another secret called DBT_PROFILES
Copy-paste this template into the secret and fill out the details.
This will always use this project connection in your GitHub actions. If you want your preview projects to have different connection settings depending on the user that opened the pull request (dev profiles), then see what you need to add to your secret in this guide.
[my-bigquery-db]: # this is the name of your project
  target: dev
  outputs:
    dev:
      type: bigquery
      method: service-account
      keyfile: keyfile.json # no need to change this! We'll automatically use the keyfile you created in the last step.
      project: [GCP project id]
      dataset: [the name of your dbt dataset]
More info in dbt's profiles docs: https://docs.getdbt.com/reference/warehouse-profiles/bigquery-profile#service-account-file
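For example, a filled-out version of this secret might look like the following (all values here are illustrative and should be replaced with your own):
jaffle_shop: # must match the profile name in your dbt_project.yml
  target: dev
  outputs:
    dev:
      type: bigquery
      method: service-account
      keyfile: keyfile.json
      project: jaffle_shop
      dataset: dbt_jaffle_shop
      threads: 1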
Postgres
company-name:
  target: dev
  outputs:
    dev:
      type: postgres
      host: [hostname]
      user: [username]
      password: [password]
      port: [port]
      dbname: [database name]
      schema: [dbt schema]
      threads: [1 or more]
      keepalives_idle: 0
      connect_timeout: 10
      retries: 1
More info in dbt's profiles docs: https://docs.getdbt.com/reference/warehouse-profiles/postgres-profile#profile-configuration
This will always use this project connection in your GitHub actions. If you want your preview projects to have different connection settings depending on the user that opened the pull request (dev profiles), then see what you need to add to your secret in this guide.
Redshift
company-name:
  target: dev
  outputs:
    dev:
      type: redshift
      host: [hostname.region.redshift.amazonaws.com]
      user: [username]
      password: [password]
      port: 5439
      dbname: analytics
      schema: analytics
      threads: 4
      keepalives_idle: 240
      connect_timeout: 10
      ra3_node: true # enables cross-database sources
More info in dbt's profiles docs: https://docs.getdbt.com/reference/warehouse-profiles/redshift-profile#password-based-authentication
This will always use this project connection in your GitHub actions. If you want your preview projects to have different connection settings depending on the user that opened the pull request (dev profiles), then see what you need to add to your secret in this guide.
Snowflake
my-snowflake-db:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: [account id]
      # User/password auth
      user: [username]
      password: [password]
      role: [user role]
      database: [database name]
      warehouse: [warehouse name]
      schema: [dbt schema]
      threads: [1 or more]
      client_session_keep_alive: False
      query_tag: [anything]
More info in dbt's profiles docs: https://docs.getdbt.com/reference/warehouse-profiles/snowflake-profile#user--password-authentication
This will always use this project connection in your GitHub actions. If you want your preview projects to have different connection settings depending on the user that opened the pull request (dev profiles), then see what you need to add to your secret in this guide.
Databricks
your_profile_name:
  target: dev
  outputs:
    dev:
      type: databricks
      catalog: [optional catalog name, if you are using Unity Catalog; only available in dbt-databricks>=1.1.1]
      schema: [schema name]
      host: [yourorg.databrickshost.com]
      http_path: [/sql/your/http/path]
      token: [dapiXXXXXXXXXXXXXXXXXXXXXXX] # Personal Access Token (PAT)
      threads: [1 or more]
More info in dbt's profiles docs: https://docs.getdbt.com/reference/warehouse-profiles/databricks-profile
This will always use this project connection in your GitHub actions. If you want your preview projects to have different connection settings depending on the user that opened the pull request (dev profiles), then see what you need to add to your secret in this guide.
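Once you have all of these values, you can add them as repository secrets in the GitHub UI as described above, or from the command line with the GitHub CLI. A sketch, assuming gh is installed and authenticated against your repo, and that you've saved your filled-out profile template to a local file called dbt_profiles.yml (a placeholder filename):
gh secret set LIGHTDASH_API_KEY --body "<your-personal-access-token>"
gh secret set LIGHTDASH_PROJECT --body "<your-project-uuid>"
gh secret set LIGHTDASH_URL --body "https://app.lightdash.cloud"
gh secret set DBT_PROFILES < dbt_profiles.yml
gh secret set GOOGLE_APPLICATION_CREDENTIALS < keyfile.json # BigQuery only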
Step 2: Create a deploy.yml workflow in GitHub
Go to your repo and click on the Actions menu.
If you don't have any GitHub actions yet, you'll just need to click on Configure.

If you have some GitHub actions in your repo already, click on New workflow, then select set up a workflow yourself.

Now copy this file from our cli-actions repo.
Give it a nice name like deploy-lightdash.yml.
And commit this to your repo by clicking on Start commit.
You're done!
Every time you push a change to the main branch of your repo, it will automatically deploy your new configuration to your Lightdash project.
You can see the logs on the GitHub Actions page.
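For reference, here's a simplified sketch of the kind of workflow that file sets up. The file in our cli-actions repo is the source of truth, so the exact steps, action versions, and the dbt adapter you install (dbt-bigquery is just an example here) may differ:
name: deploy-lightdash
on:
  push:
    branches: [main] # deploy whenever main changes

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-node@v3
        with:
          node-version: 18
      - uses: actions/setup-python@v4
        with:
          python-version: "3.10"
      # Recreate the dbt profiles.yml from the secret you added in Step 1
      - name: Write profiles.yml
        run: echo "${{ secrets.DBT_PROFILES }}" > profiles.yml
      # Install dbt and the Lightdash CLI (swap dbt-bigquery for your warehouse's adapter)
      - name: Install dependencies
        run: |
          pip install dbt-core dbt-bigquery
          npm install -g @lightdash/cli
      # Compile the dbt project and push it to your Lightdash project
      - name: Deploy to Lightdash
        run: lightdash deploy --profiles-dir .
        env:
          LIGHTDASH_API_KEY: ${{ secrets.LIGHTDASH_API_KEY }}
          LIGHTDASH_PROJECT: ${{ secrets.LIGHTDASH_PROJECT }}
          LIGHTDASH_URL: ${{ secrets.LIGHTDASH_URL }}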

2. In the UI: Syncing your dbt changes using refresh dbt
Whenever you make changes to your YAML files, you can sync Lightdash and see these changes by clicking the refresh dbt button in the Explore view of the app.

If you're using a Git connection (like GitHub, GitLab, or Bitbucket), you'll need to push and merge your changes into the branch that your Lightdash project is connected to before you run refresh dbt.
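For example, a typical flow with GitHub might look like this (branch and file names are placeholders):
git checkout -b update-yml
git add models/schema.yml
git commit -m "Update dimension and metric definitions"
git push origin update-yml
Then open a pull request, merge it into the branch your Lightdash project is connected to, and click refresh dbt.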
3. From the command line: Syncing your dbt changes using lightdash deploy
If you're using the Lightdash CLI, you can use the lightdash deploy command to deploy your changes to your Lightdash project.
To read more about how to use lightdash deploy, check out our docs.
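A minimal sequence from your dbt project directory might look like this (assuming you install the CLI globally via npm and haven't authenticated yet):
npm install -g @lightdash/cli
lightdash login https://app.lightdash.cloud # or your own Lightdash URL
lightdash deploy # compiles your dbt project and pushes it to your Lightdash project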
Note: If you've made any changes to the underlying data, you need to run dbt first
If you've made any changes to the underlying data (for example, adding a new column in your model.sql file or changing the SQL logic of an existing dimension), then you need to run: dbt run -m yourmodel before you click refresh dbt in Lightdash.
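For example, if you added a column to a model called customers (a placeholder name), you'd run:
dbt run -m customers
and then click refresh dbt in Lightdash (or run lightdash deploy if you're using the CLI).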