What's covered in the exam?
- Understanding how to connect the warehouse
- Configuring IP whitelist
- Selecting adapter type
- Configuring OAuth
- Adding credentials to deployment environments to access warehouse for production / CI runs
- Connecting the git repo to dbt
- Understanding custom branches and which to configure to environments
- Creating a PR template
- Understanding version control basics
- Setting up integrations with git providers
- Understanding access control to different environments
- Determining when to use a service account
- Rotating key pair authentication via the API
- Understanding environment variables
- Upgrading dbt versions
- Deploying using a custom branch
- Creating new dbt Cloud deployment environment
- Setting default schema / dataset for environment
- Setting up a CI job with deferral
- Understanding steps within a dbt job
- Scheduling a job to run on a recurring schedule
- Implementing run commands in the correct order
- Creating new dbt Cloud job
- Configuring optional settings such as environment variable overrides, threads, deferral, target name, dbt version override etc.
- Generating documentation on a job that populates the project’s doc site
- Understanding events in audit log
- Understanding how to audit a DAG and use artifacts
- Using the model timing tab
- Reviewing job logs to find errors
- Creating Service tokens for API access
- Assigning permission sets
- Creating license mappings
- Understanding 3-pronged access control (RBAC in dbt, warehouse, git)
- Adding and removing users
- Adding SSO application for dbt Cloud enterprise
- Setting up email notifications
- Setting up Slack notifications
- Using Webhooks for event-driven integrations with other systems
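
On the webhooks topic: dbt Cloud signs webhook payloads so receivers can verify authenticity. A minimal sketch of that check, assuming HMAC-SHA256 signing of the raw request body with the webhook secret (the secret and payload below are made-up examples):

```python
import hashlib
import hmac

WEBHOOK_SECRET = "my-webhook-secret"  # hypothetical secret issued by dbt Cloud


def is_authentic(raw_body: bytes, auth_header: str) -> bool:
    """Compare the received signature against an HMAC-SHA256 of the body."""
    expected = hmac.new(WEBHOOK_SECRET.encode(), raw_body, hashlib.sha256).hexdigest()
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(expected, auth_header)


# Simulate an incoming event and its signature.
body = b'{"eventType": "job.run.completed"}'
signature = hmac.new(WEBHOOK_SECRET.encode(), body, hashlib.sha256).hexdigest()
print(is_authentic(body, signature))  # True for a matching signature
```

A receiver that rejects unverified payloads keeps the event-driven integration from acting on forged requests.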
Each package declares a supported dbt version interval. When you upgrade the dbt version in your dbt Cloud project, you need to check that the new version falls within the required interval of every installed package. This makes ‘You need to look for dbt version requirements on packages the project has installed’ the correct answer.
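
For example, a package declares its supported interval in its own `dbt_project.yml` via `require-dbt-version` (the range below is illustrative, not from any specific package):

```yaml
# In the installed package's dbt_project.yml
require-dbt-version: [">=1.3.0", "<2.0.0"]
```

Before upgrading the project's dbt version in dbt Cloud, confirm the target version satisfies every installed package's interval.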
Custom cron schedule matches with ‘A daily production data refresh that runs every other hour, Monday through Friday.’ Recurring jobs are defined in the job's Triggers settings, either with a custom cron schedule or with the day/time selection in the UI.
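
For instance, the scenario above (every other hour, Monday through Friday) can be expressed as the cron string:

```
0 */2 * * 1-5
```

The fields are minute, hour, day of month, month, and day of week: `0 */2` fires at the top of every second hour, and `1-5` restricts runs to Monday through Friday.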
Continuous integration run on pull requests matches with ‘A job to test code changes before they are merged with the main branch.’ Continuous integration jobs are set up to trigger when a pull request is created. When code changes are pushed and a PR is opened, dbt Cloud kicks off a job that runs your project to ensure a successful run before merging to the main branch.
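
A CI job of this kind typically builds only the models changed in the PR by comparing against production state, with deferral resolving unchanged upstream models to the production run's artifacts. One common run command (a typical setup, not the only valid one) is:

```
dbt build --select state:modified+
```

`state:modified+` selects modified models plus everything downstream of them, so the CI run stays fast while still testing affected dependencies.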
No trigger matches with ‘Ad hoc requests to fully refresh incremental models one to two times per month: run the job manually.’ Ad hoc requests are, by definition, one-off runs that are not scheduled, so they are kicked off manually in the UI.
dbt Cloud Admin API matches with ‘A near real-time update that needs to run immediately after an Airflow task loads the data.’ A job triggered by an action outside of dbt Cloud has to be configured using the dbt Cloud Administrative API.
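
A sketch of that trigger call, assuming the v2 job-run endpoint; the account ID, job ID, and token below are placeholders:

```python
import json

ACCOUNT_ID = 12345       # hypothetical dbt Cloud account ID
JOB_ID = 67890           # hypothetical job ID
API_TOKEN = "svc-token"  # a service token, not tied to any one user

url = f"https://cloud.getdbt.com/api/v2/accounts/{ACCOUNT_ID}/jobs/{JOB_ID}/run/"
headers = {
    "Authorization": f"Token {API_TOKEN}",
    "Content-Type": "application/json",
}
# "cause" describes why the run was triggered and appears in the run history.
payload = json.dumps({"cause": "Triggered by Airflow after load task"})

print(url)
```

An orchestrator such as an Airflow task would POST this request (for example with `requests.post(url, headers=headers, data=payload)`) as soon as the upstream load completes.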
dbt has two types of tokens: service account and user. User tokens are issued to users with a developer license and run on behalf of that user. Service account tokens run independently of any specific user. This makes ‘Service account tokens are used for system-level integrations that do not run on behalf of any one user’ the correct answer.