Merge branch 'current' into runleonarun-patch-9
runleonarun authored Jul 19, 2023
2 parents bdac8d1 + 34e7fdf commit 6b6db40
Showing 23 changed files with 252 additions and 89 deletions.
8 changes: 4 additions & 4 deletions .github/workflows/autoupdate.yml
@@ -2,11 +2,11 @@ name: Auto Update

on:
# This will trigger on all pushes to all branches.
push: {}
# push: {}
# Alternatively, you can only trigger if commits are pushed to certain branches, e.g.:
# push:
# branches:
# - current
push:
branches:
- current
# - unstable
jobs:
autoupdate:
8 changes: 6 additions & 2 deletions website/dbt-versions.js
@@ -1,7 +1,7 @@
exports.versions = [
{
version: "1.6",
EOLDate: "2024-07-20", // placeholder - need to confirm the final date
EOLDate: "2024-07-31",
isPrerelease: true
},
{
@@ -31,6 +31,10 @@ exports.versions = [
]

exports.versionedPages = [
{
"page": "reference/commands/clone",
"firstVersion": "1.6",
},
{
"page": "docs/collaborate/govern/project-dependencies",
"firstVersion": "1.6",
@@ -55,7 +59,7 @@ exports.versionedPages = [
"page": "docs/collaborate/govern/model-contracts",
"firstVersion": "1.5",
},
{
{
"page": "reference/commands/show",
"firstVersion": "1.5",
},
1 change: 1 addition & 0 deletions website/docs/docs/build/about-metricflow.md
@@ -60,6 +60,7 @@ Metrics, which is a key concept, are functions that combine measures, constraint

MetricFlow supports different metric types:

- [Cumulative](/docs/build/cumulative) — Aggregates a measure over a given window.
- [Derived](/docs/build/derived) — An expression of other metrics, which allows you to do calculations on top of metrics.
- [Ratio](/docs/build/ratio) — Create a ratio out of two measures, like revenue per customer.
- [Simple](/docs/build/simple) — Metrics that refer directly to one measure.
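
For orientation, a rough sketch of what a ratio metric might look like under this spec is shown below. The metric and measure names are hypothetical, and the exact YAML shape was still evolving during the 1.6 prerelease.

```yaml
metrics:
  name: revenue_per_customer   # hypothetical metric name
  description: Average revenue per customer, built from two measures
  type: ratio
  type_params:
    numerator: revenue         # hypothetical measure
    denominator: customers     # hypothetical measure
```
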
12 changes: 9 additions & 3 deletions website/docs/docs/build/cumulative-metrics.md
@@ -8,6 +8,12 @@ tags: [Metrics, Semantic Layer]

Cumulative metrics aggregate a measure over a given window. If no window is specified, the window is considered infinite and accumulates values over all time.

:::info MetricFlow time spine required

You will need to create the [time spine model](/docs/build/metricflow-time-spine) before you add cumulative metrics.

:::

```yaml
# Cumulative metrics aggregate a measure over a given window. The window is considered infinite if no window parameter is passed (accumulate the measure over all time)
metrics:
@@ -24,7 +30,7 @@ metrics:
### Window options
This section details examples for when you specify and don't specify window options.
This section details examples of when you specify and don't specify window options.
<Tabs>
@@ -56,7 +62,7 @@ metrics:
window: 7 days
```

From the sample yaml above, note the following:
From the sample YAML above, note the following:

* `type`: Specify cumulative to indicate the type of metric.
* `type_params`: Specify the measure you want to aggregate as a cumulative metric. You have the option of specifying a `window`, or a `grain to date`.
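
Putting those pieces together, a complete cumulative metric with a rolling window might look roughly like the sketch below. The metric and measure names are hypothetical, and the shape follows the beta examples shown elsewhere in this file.

```yaml
metrics:
  name: orders_last_7_days     # hypothetical metric name
  description: Rolling 7-day count of orders
  type: cumulative
  type_params:
    measures:
      - order_count            # hypothetical measure to accumulate
    window: 7 days             # omit window to accumulate over all time
```
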
@@ -142,7 +148,7 @@ metrics:
```yaml
metrics:
name: revenue_monthly_grain_to_date #For this metric, we use a monthly grain to date
description: Monthly revenue using a grain to date of 1 month (think of this as a monthly resetting point)
description: Monthly revenue using grain to date of 1 month (think of this as a monthly resetting point)
type: cumulative
type_params:
measures:
2 changes: 2 additions & 0 deletions website/docs/docs/build/incremental-models.md
@@ -57,6 +57,7 @@ from raw_app_data.events
{% if is_incremental() %}

-- this filter will only be applied on an incremental run
-- (uses > to include records whose timestamp occurred since the last run of this model)
where event_time > (select max(event_time) from {{ this }})

{% endif %}
@@ -137,6 +138,7 @@ from raw_app_data.events
{% if is_incremental() %}

-- this filter will only be applied on an incremental run
-- (uses >= to include records arriving later on the same day as the last run of this model)
where date_day >= (select max(date_day) from {{ this }})

{% endif %}
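
For context, the `is_incremental()` filters in these hunks only take effect when the model is configured as incremental. A minimal configuration sketch in YAML, with hypothetical project, folder, and key names:

```yaml
# dbt_project.yml (a sketch; these configs can also live in a config() block in the model)
models:
  my_project:                  # hypothetical project name
    events:                    # hypothetical folder containing the incremental models
      +materialized: incremental
      +unique_key: event_id    # hypothetical; optional, lets dbt update existing rows instead of only appending
```
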
7 changes: 4 additions & 3 deletions website/docs/docs/build/metrics-overview.md
@@ -25,10 +25,10 @@ This page explains the different supported metric types you can add to your dbt
- [Ratio](#ratio-metrics) — Create a ratio out of two measures.
-->

<!--not supported for this release

### Cumulative metrics

[Cumulative metrics](/docs/build/cumulative) aggregate a measure over a given window. Note that if no window is specified, the window would accumulate the measure over all time.
[Cumulative metrics](/docs/build/cumulative) aggregate a measure over a given window. If no window is specified, the window accumulates the measure over all time. **Note**: you will need to create the [time spine model](/docs/build/metricflow-time-spine) before you add cumulative metrics.

```yaml
# Cumulative metrics aggregate a measure over a given window. The window is considered infinite if no window parameter is passed (accumulate the measure over all time)
@@ -43,7 +43,6 @@ metrics:
#Omitting window will accumulate the measure over all time
window: 7 days
```
-->
### Derived metrics
[Derived metrics](/docs/build/derived) are defined as an expression of other metrics. Derived metrics allow you to do calculations on top of metrics.
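
A rough sketch of a derived metric under this spec, with hypothetical metric names and expression; the exact `type_params` shape was still in flux during the 1.6 beta:

```yaml
metrics:
  name: net_revenue            # hypothetical metric name
  description: Revenue minus refunds, calculated on top of two existing metrics
  type: derived
  type_params:
    expr: revenue - refunds
    metrics:
      - name: revenue
      - name: refunds
```
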
@@ -145,7 +144,9 @@ You can set more metadata for your metrics, which can be used by other tools lat
## Related docs

- [Semantic models](/docs/build/semantic-models)
- [Cumulative](/docs/build/cumulative)
- [Derived](/docs/build/derived)




@@ -49,7 +49,7 @@ Client Secret for use in dbt Cloud.
| **Application type** | internal | required |
| **Application name** | dbt Cloud | required |
| **Application logo** | Download the logo <a href="https://www.getdbt.com/ui/img/dbt-icon.png" target="_blank" rel="noopener noreferrer">here</a> | optional |
| **Authorized domains** | `getdbt.com` | If deploying into a VPC, use the domain for your deployment |
| **Authorized domains** | `getdbt.com` (US) `dbt.com` (EMEA or AU) | If deploying into a VPC, use the domain for your deployment |
| **Scopes** | `email, profile, openid` | The default scopes are sufficient |

<Lightbox src="/img/docs/dbt-cloud/dbt-cloud-enterprise/gsuite/gsuite-sso-consent-top.png" title="GSuite Consent Screen configuration"/>
6 changes: 3 additions & 3 deletions website/docs/docs/collaborate/govern/model-versions.md
@@ -37,7 +37,7 @@ Instead, for mature models at larger organizations, powering queries inside & ou

During that migration window, anywhere that model is being used downstream, it can continue to be referenced at a specific version.

In the future, dbt will also offer first-class support for **deprecating models** ([dbt-core#7433](https://github.com/dbt-labs/dbt-core/issues/7433)). Taken together, model versions and deprecation offer a pathway for model producers to _sunset_ old models, and consumers the time to _migrate_ across breaking changes. It's a way of managing change across an organization: develop a new version, bump the latest, slate the old version for deprecation, update downstream references, and then remove the old version.
dbt Core 1.6 introduced first-class support for **deprecating models** by specifying a [`deprecation_date`](/reference/resource-properties/deprecation_date). Taken together, model versions and deprecation offer a pathway for model producers to _sunset_ old models, and consumers the time to _migrate_ across breaking changes. It's a way of managing change across an organization: develop a new version, bump the latest, slate the old version for deprecation, update downstream references, and then remove the old version.

There is a real trade-off that exists here—the cost to frequently migrate downstream code, and the cost (and clutter) of materializing multiple versions of a model in the data warehouse. Model versions do not make that problem go away, but by setting a deprecation date, and communicating a clear window for consumers to gracefully migrate off old versions, they put a known boundary on the cost of that migration.
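
For readers skimming this diff, a minimal sketch of how a versioned model is declared; the model name and version numbers are hypothetical:

```yaml
# models/schema.yml (a sketch)
models:
  - name: dim_customers        # hypothetical model name
    latest_version: 2
    versions:
      - v: 2                   # by default defined in dim_customers_v2.sql
      - v: 1                   # older version that consumers can stay pinned to during migration
```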

@@ -73,7 +73,7 @@ As the **producer** of a versioned model:
- You keep track of all live versions in one place, rather than scattering them throughout the codebase
- You can reuse the model's configuration, and highlight just the diffs between versions
- You can select models to build (or not) based on whether they're a `latest`, `prerelease`, or `old` version
- dbt will notify consumers of your versioned model when new versions become available, or (in the future) when they are slated for deprecation
- dbt will notify consumers of your versioned model when new versions become available, or when they are slated for deprecation

As the **consumer** of a versioned model:
- You use a consistent `ref`, with the option of pinning to a specific live version
@@ -109,7 +109,7 @@ selectors:
</File>
Because dbt knows that these models are _actually the same model_, it can notify downstream consumers as new versions become available, and (in the future) as older versions are slated for deprecation.
Because dbt knows that these models are _actually the same model_, it can notify downstream consumers as new versions become available, and as older versions are slated for deprecation.
```bash
Found an unpinned reference to versioned model 'dim_customers'.
4 changes: 2 additions & 2 deletions website/docs/docs/collaborate/govern/project-dependencies.md
@@ -5,8 +5,8 @@ sidebar_label: "Project dependencies"
description: "Reference public models across dbt projects"
---

:::info
"Project" dependencies and cross-project `ref` is currently in closed beta and are features of dbt Cloud Enterprise. To access these features, please contact your account team.
:::caution Closed Beta - dbt Cloud Enterprise
"Project" dependencies and cross-project `ref` are features of dbt Cloud Enterprise, currently in Closed Beta. To access these features while they are in beta, please contact your account team at dbt Labs.
:::

For a long time, dbt has supported code reuse and extension by installing other projects as [packages](/docs/build/packages). When you install another project as a package, you are pulling in its full source code, and adding it to your own. This enables you to call macros and run models defined in that other project.
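
A minimal sketch of the `dependencies.yml` file this page introduces; the upstream project name is hypothetical:

```yaml
# dependencies.yml (a sketch)
projects:
  - name: jaffle_finance          # hypothetical upstream dbt Cloud project, referenced via cross-project ref
packages:
  - package: dbt-labs/dbt_utils   # ordinary package installs can be declared here as well
    version: 1.1.1
```
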
2 changes: 1 addition & 1 deletion website/docs/docs/dbt-cloud-apis/admin-cloud-api.md
@@ -10,7 +10,7 @@ The dbt Cloud Administrative API is enabled by default for [Team and Enterprise
- Manage your dbt Cloud account
- and more

Check out our dbt Cloud Admin API docs to help you access the API:
dbt Cloud currently supports two versions of the Administrative API: v2 and v3. In general, v3 is the recommended version, but not every v2 route has been upgraded to v3 yet; we're working on closing that gap. If you can't find an endpoint in the v3 docs, check the shorter list of v2 endpoints, because you might find it there.

<div className="grid--2-col">

19 changes: 19 additions & 0 deletions website/docs/docs/dbt-cloud-apis/service-tokens.md
@@ -3,6 +3,11 @@ title: "Service account tokens"
id: "service-tokens"
description: "Service account tokens help you define permissions for securing access to your dbt Cloud account and its projects."
---
:::info Important service account token update

If you have service tokens created on or before July 18, 2023, please read [this important update](/docs/dbt-cloud-apis/service-tokens#service-token-update).

:::

## About service tokens

@@ -92,3 +97,17 @@ Analyst admin service tokens have all the permissions listed in [Analyst](/docs/

**Stakeholder**<br/>
Stakeholder service tokens have all the permissions listed in [Stakeholder](/docs/cloud/manage-access/enterprise-permissions#stakeholder) on the Enterprise Permissions page.


## Service token update

On July 18, 2023, dbt Labs made critical infrastructure changes to service account tokens. These enhancements improve the security and performance of all tokens created after July 18, 2023. To ensure security best practices are in place, we recommend you rotate your service tokens created before this date.

To rotate your token:
1. Navigate to **Account settings** and click **Service tokens** on the left side pane.
2. Verify the **Created** date for the token is _on or before_ July 18, 2023.
<Lightbox src="/img/docs/dbt-cloud/cloud-configuring-dbt-cloud/service-token-date.png" title="Service token created date"/>
3. Click **+ New Token** on the top right side of the screen. Ensure the new token has the same permissions as the old one.
4. Copy the new token and replace the old one in your systems. Store it in a safe place, as it will not be available again once the creation screen is closed.
5. Delete the old token in dbt Cloud by clicking the **trash can icon**. _Only take this action after the new token is in place to avoid service disruptions_.

64 changes: 50 additions & 14 deletions website/docs/guides/migration/versions/01-upgrading-to-v1.6.md
@@ -1,17 +1,19 @@
---
title: "Upgrading to v1.6 (beta)"
title: "Upgrading to v1.6 (prerelease)"
description: New features and changes in dbt Core v1.6
---

:::warning Beta Functionality
:::warning Prerelease

dbt Core v1.6 is in beta, and the features and functionality on this page are subject to change.
dbt Core v1.6 is available as a release candidate. [Final release is planned for July 31.](https://github.com/dbt-labs/dbt-core/issues/7990)

Test it out, and [let us know](https://github.com/dbt-labs/dbt-core/issues/new/choose) if you run into any issues!

:::

## Resources

- [Changelog](https://github.com/dbt-labs/dbt-core/blob/main/CHANGELOG.md)
- [Changelog](https://github.com/dbt-labs/dbt-core/blob/1.6.latest/CHANGELOG.md)
- [CLI Installation guide](/docs/core/installation)
- [Cloud upgrade guide](/docs/dbt-versions/upgrade-core-in-cloud)
- [Release schedule](https://github.com/dbt-labs/dbt-core/issues/7481)
@@ -22,24 +24,58 @@ dbt Labs is committed to providing backward compatibility for all versions 1.x,

### Behavior changes

**Coming soon**
- dbt Core v1.6 does not support Python 3.7, which reached end-of-life on June 27, 2023. Supported Python versions are 3.8, 3.9, 3.10, and 3.11.
- As part of the dbt Semantic Layer re-launch (in beta), the spec for `metrics` has changed significantly. Migration guide coming soon: https://github.com/dbt-labs/docs.getdbt.com/pull/3705
- Manifest schema version is now v10, reflecting [TODO] changes

### For consumers of dbt artifacts (metadata)

The [manifest](/reference/artifacts/manifest-json) schema version has updated to `v10`. Specific changes:
- Addition of `semantic_models` and changes to `metrics` attributes
- Addition of `deprecation_date` as a model property
- Addition of `on_configuration_change` as default node configuration (to support materialized views)
- Small type changes to `contracts` and `constraints`
- Manifest `metadata` includes `project_name`

### For maintainers of adapter plugins

For more detailed information and to ask questions, please read and comment on the GH discussion: [dbt-labs/dbt-core#7958](https://github.com/dbt-labs/dbt-core/discussions/7958).

## New and changed documentation

[`dbt retry`](/reference/commands/retry) is a new command that executes the previously run command from the point of failure. This convenient command enables you to continue a failed command without rebuilding all upstream dependencies.
### Materialized views

**Materialized view** support (for model and project configs) has been added for three data warehouses:
- [Bigquery](/reference/resource-configs/bigquery-configs#materialized-view)
- [Postgres](/reference/resource-configs/postgres-configs#materialized-view)
- [Redshift](/reference/resource-configs/redshift-configs#materialized-view)
Supported on:
- [Postgres](/reference/resource-configs/postgres-configs#materialized-view)
- [Redshift](/reference/resource-configs/redshift-configs#materialized-view)
- Snowflake (docs forthcoming)

[**Namespacing:**](/faqs/Models/unique-model-names) Model names can be duplicated across different namespaces (packages/projects), so long as they are unique within each package/project. We strongly encourage using [two-argument `ref`](/reference/dbt-jinja-functions/ref#two-argument-variant) when referencing a model from a different package/project.
Support for BigQuery and Databricks forthcoming.
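
A hedged sketch of configuring a model folder to use materialized views on one of the supported adapters; the project and folder names are hypothetical:

```yaml
# dbt_project.yml (a sketch, assuming an adapter with materialized view support)
models:
  my_project:                  # hypothetical project name
    marts:                     # hypothetical folder
      +materialized: materialized_view
```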

[**Project dependencies**](/docs/collaborate/govern/project-dependencies): Introduces `dependencies.yml` and dependent `projects` as a feature of dbt Cloud Enterprise. Allows enforcing model access (public vs. protected/private) across project/package boundaries. Enables cross-project `ref` of public models, without requiring the installation of upstream source code.
### New commands for mature deployment

### Quick hits
[`dbt retry`](/reference/commands/retry) executes the previously run command from the point of failure. Rebuild just the nodes that errored or skipped in a previous run/build/test, rather than starting over from scratch.

[`dbt clone`](/reference/commands/clone) leverages each data platform's functionality for creating lightweight copies of dbt models from one environment into another. Useful when quickly spinning up a new development environment, or promoting specific models from a staging environment into production.

### Multi-project collaboration

More consistency and flexibility around packages! Resources defined in a package will respect variable and global macro definitions within the scope of that package.
[**Deprecation date**](/reference/resource-properties/deprecation_date): Models can declare a deprecation date that will warn model producers and downstream consumers. This enables clear migration windows for versioned models, and provides a mechanism to facilitate removal of immature or little-used models, helping to avoid project bloat.
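
A minimal sketch of declaring a deprecation date; the model name and date are hypothetical:

```yaml
# models/schema.yml (a sketch)
models:
  - name: fct_orders                 # hypothetical model name
    deprecation_date: 2024-06-01     # dbt warns producers and downstream consumers about the upcoming (or past) deprecation
```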

[Model names](/faqs/Models/unique-model-names) can be duplicated across different namespaces (projects/packages), so long as they are unique within each project/package. We strongly encourage using [two-argument `ref`](/reference/dbt-jinja-functions/ref#two-argument-variant) when referencing a model from a different package/project.

More consistency and flexibility around packages. Resources defined in a package will respect variable and global macro definitions within the scope of that package.
- `vars` defined in a package's `dbt_project.yml` are now available in the resolution order when compiling nodes in that package, though CLI `--vars` and the root project's `vars` will still take precedence. See ["Variable Precedence"](/docs/build/project-variables#variable-precedence) for details.
- `generate_x_name` macros (defining custom rules for database, schema, alias naming) follow the same pattern as other "global" macros for package-scoped overrides. See [macro dispatch](/reference/dbt-jinja-functions/dispatch) for an overview of the patterns that are possible.
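
As a sketch of the variable precedence described in the first bullet above, with hypothetical package, project, and variable names:

```yaml
# dbt_project.yml of the installed package (a sketch)
vars:
  start_date: '2020-01-01'     # used when compiling this package's nodes...
---
# dbt_project.yml of the root project
vars:
  start_date: '2021-06-30'     # ...unless the root project (as here) or CLI --vars overrides it
```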

:::caution Closed Beta - dbt Cloud Enterprise
[**Project dependencies**](/docs/collaborate/govern/project-dependencies): Introduces `dependencies.yml` and dependent `projects` as a feature of dbt Cloud Enterprise. Allows enforcing model access (public vs. protected/private) across project/package boundaries. Enables cross-project `ref` of public models, without requiring the installation of upstream source code.
:::

### Quick hits

- [`state:unmodified` and `state:old`](/reference/node-selection/methods#the-state-method) for [MECE](https://en.wikipedia.org/wiki/MECE_principle) stateful selection
- [`invocation_args_dict`](/reference/dbt-jinja-functions/flags#invocation_args_dict) includes full `invocation_command` as string
- [`dbt debug --connection`](/reference/commands/debug) to test just the data platform connection specified in a profile
- [`dbt docs generate --empty-catalog`](/reference/commands/cmd-docs) to skip catalog population while generating docs
- [`--defer-state`](/reference/node-selection/defer) enables more-granular control over the state artifacts used for deferral
1 change: 1 addition & 0 deletions website/docs/reference/artifacts/dbt-artifacts.md
@@ -39,6 +39,7 @@ All artifacts produced by dbt include a `metadata` dictionary with these propert

In the manifest, the `metadata` may also include:
- `send_anonymous_usage_stats`: Whether this invocation sent [anonymous usage statistics](/reference/global-configs/usage-stats) while executing.
- `project_name`: The `name` defined in the root project's `dbt_project.yml`. (Added in manifest v10 / dbt Core v1.6)
- `project_id`: Project identifier, hashed from `project_name`, sent with anonymous usage stats if enabled.
- `user_id`: User identifier, stored by default in `~/dbt/.user.yml`, sent with anonymous usage stats if enabled.
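
For reference, a rough sketch of the manifest `metadata` block with the new key, rendered here as YAML for readability (the artifact itself is JSON, and the values shown are hypothetical):

```yaml
metadata:
  dbt_schema_version: https://schemas.getdbt.com/dbt/manifest/v10.json
  dbt_version: 1.6.0
  project_name: my_dbt_project       # hypothetical; the name from the root dbt_project.yml, new in v10
  send_anonymous_usage_stats: true
  project_id: 1a2b3c...              # hashed from project_name (value truncated and hypothetical)
```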
