
Commit

Merge branch 'current' into mwong-sl-beta
mirnawong1 authored Jul 20, 2023
2 parents 15d0e30 + 4386b9c commit 0fc6a77
Showing 7 changed files with 416 additions and 7 deletions.
4 changes: 3 additions & 1 deletion website/docs/docs/build/about-metricflow.md
@@ -33,7 +33,9 @@ There are a few key principles:

- MetricFlow, as a part of the dbt Semantic Layer, allows organizations to define company metrics logic through YAML abstractions, as described in the following sections.

- You can install MetricFlow via PyPI as an extension of your [dbt adapter](/docs/supported-data-platforms) in the CLI. To install the adapter, run `pip install "dbt-metricflow[your_adapter_name]"` and add the adapter name at the end of the command. For example, for a Snowflake adapter run `pip install "dbt-metricflow[snowflake]"`.
- You can install MetricFlow using PyPI as an extension of your [dbt adapter](/docs/supported-data-platforms) in the CLI. To install the adapter, run `pip install "dbt-metricflow[your_adapter_name]"` and add the adapter name at the end of the command. For example, for a Snowflake adapter run `pip install "dbt-metricflow[snowflake]"`.

- To query metrics, dimensions, and dimension values, and to validate your configurations, install the [MetricFlow CLI](/docs/build/metricflow-cli).

### Semantic graph

2 changes: 2 additions & 0 deletions website/docs/docs/build/build-metrics-intro.md
@@ -18,11 +18,13 @@ To fully experience the dbt Semantic Layer, including the ability to query dbt m
:::

Before you start, keep the following considerations in mind:

- Define metrics in YAML and query them using the [new metric specifications](https://github.com/dbt-labs/dbt-core/discussions/7456).
- You must be on dbt v1.6 or higher to use MetricFlow. [Upgrade your dbt Cloud version](/docs/dbt-versions/upgrade-core-in-cloud) to get started.
- MetricFlow currently supports Snowflake, Postgres, BigQuery, Databricks, and Redshift.
- Unlock insights and query your metrics using the [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl) and its diverse range of [available integrations](/docs/use-dbt-semantic-layer/avail-sl-integrations).


<div className="grid--3-col">

<Card
27 changes: 24 additions & 3 deletions website/docs/docs/build/measures.md
@@ -15,6 +15,27 @@ Measures are aggregations performed on columns in your model. They can be used a
| [`agg`](#aggregation) | dbt supports the following aggregations: `sum`, `max`, `min`, `count_distinct`, and `sum_boolean`. | Required |
| [`expr`](#expr) | You can either reference an existing column in the table or use a SQL expression to create or derive a new one. | Optional |
| [`non_additive_dimension`](#non-additive-dimensions) | Non-additive dimensions can be specified for measures that cannot be aggregated over certain dimensions, such as bank account balances, to avoid producing incorrect results. | Optional |
| [`agg_params`](#measure-spec) | Specific aggregation properties, such as a percentile. | Optional |
| [`agg_time_dimension`](#measure-spec) | The time field. Defaults to the default aggregation time dimension for the semantic model. | Optional |
| [`label`](#measure-spec) | How the measure appears in project docs and downstream integrations. | Required |
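For orientation, the supported `agg` values correspond to standard SQL aggregation semantics. The following is an illustrative plain-Python sketch over a toy column (the data and variable names are hypothetical, not MetricFlow code):

```python
# Illustrative only: the semantics behind MetricFlow's `agg` values,
# applied to a toy column. SQL aggregates skip NULLs, modeled here as None.
values = [10, 20, 20, None, 5]
present = [v for v in values if v is not None]

agg_sum = sum(present)                  # 55
agg_max = max(present)                  # 20
agg_min = min(present)                  # 5
agg_count_distinct = len(set(present))  # 3 distinct values: 10, 20, 5

# `sum_boolean` sums a boolean column, i.e. counts TRUE values.
flags = [True, False, True, True]
agg_sum_boolean = sum(flags)            # 3
```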


## Measure spec

An example of the complete YAML measures spec is below. The actual configuration of your measures will depend on the aggregation you're using.

```yaml
measures:
  - name: the_measure_name # If `expr` isn't present, this is the expected name of the column, such as transaction_total. Required
    description: A description of the measure. # Optional
    agg: the_aggregation_type # For example, sum, max, min, or count_distinct. Required
    expr: the_field # An existing column, such as transaction_total, or a SQL expression. Optional
    agg_params: # Specific aggregation properties, such as a percentile. Optional
    agg_time_dimension: # The time field. Defaults to the semantic model's default aggregation time dimension. Optional
    non_additive_dimension: # Use when the measure can't be aggregated over certain dimensions. Optional
    label: The label # How the measure appears in project docs and downstream integrations. Required
```
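Filled in, a minimal measure might look like the following (the names `transaction_total` and `transaction_amount_usd` are hypothetical placeholders):

```yaml
measures:
  - name: transaction_total
    description: The total value of a transaction.
    agg: sum
    expr: transaction_amount_usd
    label: Transaction Total
```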

### Name

@@ -62,7 +83,7 @@ If you use the `dayofweek` function in the `expr` parameter with the legacy Snow
```yaml
semantic_models:
- name: transactions
description: A record for every transaction that takes place. Carts are considered multiple transactions for each SKU.
description: A record of every transaction that takes place. Carts are considered multiple transactions for each SKU.
model: ref('schema.transactions')
defaults:
agg_time_dimensions:
@@ -190,14 +211,14 @@ semantic_models:
name: metric_time
window_choice: min
- name: mrr_end_of_month
description: Aggregate by summing all users active subscription plans at end of month
description: Aggregate by summing all users' active subscription plans at the end of the month
expr: subscription_value
agg: sum
non_additive_dimension:
name: metric_time
window_choice: max
- name: mrr_by_user_end_of_month
description: Group by user_id to achieve each users MRR at the end of the month
description: Group by user_id to achieve each user's MRR at the end of the month
expr: subscription_value
agg: sum
non_additive_dimension:
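The `mrr_end_of_month` measure above uses `window_choice: max` because MRR is non-additive over time: summing every row across the month double-counts users. A toy Python sketch of the intended semantics (the dataset and names are hypothetical):

```python
# Sketch: why MRR is non-additive over time. Instead of summing every row,
# keep each user's row at the max metric_time (window_choice: max), then sum.
from datetime import date

rows = [  # (user_id, metric_time, subscription_value)
    ("u1", date(2023, 7, 1), 50),
    ("u1", date(2023, 7, 31), 60),  # u1 upgraded mid-month
    ("u2", date(2023, 7, 31), 40),
]

# Naive additive sum over all rows double-counts u1 (wrong for MRR):
naive = sum(v for _, _, v in rows)  # 150

# Non-additive: keep only each user's latest row, then sum.
latest = {}
for user, ts, value in rows:
    if user not in latest or ts > latest[user][0]:
        latest[user] = (ts, value)
mrr_end_of_month = sum(v for _, v in latest.values())  # 100
```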
