Get data insights from a contribution analysis model using a summable ratio metric
In this tutorial, you use a contribution analysis model to analyze which data segments contribute to changes in the cost of sales ratio in the Iowa liquor sales dataset. This tutorial guides you through performing the following tasks:
- Create an input table based on publicly available Iowa liquor data.
- Create a contribution analysis model that uses a summable ratio metric. This type of model sums the values of two numeric columns and determines the difference in their ratio between the control and test data for each segment of the data, as illustrated in the sketch after this list.
- Get the metric insights from the model by using the `ML.GET_INSIGHTS` function.
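The following sketch shows, outside of BigQuery ML, what a summable ratio metric measures for a single dimension. The table name (`my_dataset.my_table`) and the `numerator`, `denominator`, `segment_value`, and `is_test` columns are hypothetical placeholders, not part of this tutorial's dataset; the contribution analysis model automates this comparison across every combination of the dimensions you specify.

```sql
-- Illustration only, with hypothetical table and column names: a summable
-- ratio metric compares sum(numerator) / sum(denominator) between the test
-- rows and the control rows of each data segment.
SELECT
  segment_value,
  SAFE_DIVIDE(
    SUM(IF(is_test, numerator, 0)),
    SUM(IF(is_test, denominator, 0))) AS test_ratio,
  SAFE_DIVIDE(
    SUM(IF(NOT is_test, numerator, 0)),
    SUM(IF(NOT is_test, denominator, 0))) AS control_ratio
FROM my_dataset.my_table
GROUP BY segment_value;
```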
Before starting this tutorial, you should be familiar with the contribution analysis use case.
Required permissions
To create the dataset, you need the `bigquery.datasets.create` Identity and Access Management (IAM) permission.

To create the model, you need the following permissions:

- `bigquery.jobs.create`
- `bigquery.models.create`
- `bigquery.models.getData`
- `bigquery.models.updateData`
To run inference, you need the following permissions:

- `bigquery.models.getData`
- `bigquery.jobs.create`
Costs
In this document, you use the following billable components of Google Cloud:
- BigQuery ML: You incur costs for the data that you process in BigQuery.
To generate a cost estimate based on your projected usage,
use the pricing calculator.
For more information about BigQuery pricing, see BigQuery pricing in the BigQuery documentation.
Before you begin
- In the Google Cloud console, on the project selector page, select or create a Google Cloud project.
- Make sure that billing is enabled for your Google Cloud project.
- Enable the BigQuery API.
Create a dataset
Create a BigQuery dataset to store your ML model.
Console
In the Google Cloud console, go to the BigQuery page.
In the Explorer pane, click your project name.
Click View actions > Create dataset.

On the Create dataset page, do the following:

- For Dataset ID, enter `bqml_tutorial`.
- For Location type, select Multi-region, and then select US (multiple regions in United States).
- Leave the remaining default settings as they are, and click Create dataset.
bq
To create a new dataset, use the `bq mk` command with the `--location` flag. For a full list of possible parameters, see the `bq mk --dataset` command reference.
Create a dataset named `bqml_tutorial` with the data location set to `US` and a description of `BigQuery ML tutorial dataset`:

```bash
bq --location=US mk -d \
    --description "BigQuery ML tutorial dataset." \
    bqml_tutorial
```
Instead of using the `--dataset` flag, the command uses the `-d` shortcut. If you omit `-d` and `--dataset`, the command defaults to creating a dataset.

Confirm that the dataset was created:

```bash
bq ls
```
API
Call the `datasets.insert` method with a defined dataset resource.

```json
{
  "datasetReference": {
    "datasetId": "bqml_tutorial"
  }
}
```
Create a table of input data
Create a table that contains the test and control data to analyze. The following query builds two intermediate result sets, a test set with liquor data from 2021 and a control set with liquor data from 2020, and then unions them to produce a single table that contains both test and control rows with the same set of columns.
In the Google Cloud console, go to the BigQuery page.
In the query editor, run the following statement:
```sql
CREATE OR REPLACE TABLE bqml_tutorial.iowa_liquor_sales_data AS
(
  SELECT
    store_name,
    city,
    vendor_name,
    category_name,
    item_description,
    SUM(sale_dollars) AS total_sales,
    SUM(state_bottle_cost) AS total_bottle_cost,
    FALSE AS is_test
  FROM `bigquery-public-data.iowa_liquor_sales.sales`
  WHERE EXTRACT(YEAR FROM date) = 2020
  GROUP BY store_name, city, vendor_name, category_name, item_description, is_test
)
UNION ALL
(
  SELECT
    store_name,
    city,
    vendor_name,
    category_name,
    item_description,
    SUM(sale_dollars) AS total_sales,
    SUM(state_bottle_cost) AS total_bottle_cost,
    TRUE AS is_test
  FROM `bigquery-public-data.iowa_liquor_sales.sales`
  WHERE EXTRACT(YEAR FROM date) = 2021
  GROUP BY store_name, city, vendor_name, category_name, item_description, is_test
);
```
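Optionally, before you create the model, you can sanity-check the input table. The following query is not part of the original tutorial steps; it compares row counts and the overall cost-to-sales ratio for the control and test rows, which is the same ratio the model later analyzes per segment.

```sql
-- Optional check: compare the control (is_test = FALSE) and test
-- (is_test = TRUE) slices of the input table.
SELECT
  is_test,
  COUNT(*) AS num_rows,
  SAFE_DIVIDE(SUM(total_bottle_cost), SUM(total_sales)) AS cost_to_sales_ratio
FROM bqml_tutorial.iowa_liquor_sales_data
GROUP BY is_test
ORDER BY is_test;
```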
Create the model
Create a contribution analysis model:
In the Google Cloud console, go to the BigQuery page.
In the query editor, run the following statement:
```sql
CREATE OR REPLACE MODEL bqml_tutorial.liquor_sales_model
  OPTIONS (
    model_type = 'CONTRIBUTION_ANALYSIS',
    contribution_metric = 'sum(total_bottle_cost)/sum(total_sales)',
    dimension_id_cols = ['store_name', 'city', 'vendor_name', 'category_name', 'item_description'],
    is_test_col = 'is_test',
    min_apriori_support = 0.05
  ) AS
SELECT * FROM bqml_tutorial.iowa_liquor_sales_data;
```
The query takes approximately 35 seconds to complete, after which the model `liquor_sales_model` appears in the `bqml_tutorial` dataset in the Explorer pane. Because the query uses a `CREATE MODEL` statement to create a model, there are no query results.
Get insights from the model
Get insights generated by the contribution analysis model by using the `ML.GET_INSIGHTS` function.
In the Google Cloud console, go to the BigQuery page.
In the query editor, run the following statement to select columns from the output for a summable ratio metric contribution analysis model:
```sql
SELECT
  contributors,
  metric_test,
  metric_control,
  metric_test_over_metric_control,
  metric_test_over_complement,
  metric_control_over_complement,
  aumann_shapley_attribution,
  apriori_support,
  contribution
FROM
  ML.GET_INSIGHTS(MODEL `bqml_tutorial.liquor_sales_model`)
ORDER BY aumann_shapley_attribution DESC;
```
The first several rows of the output should look similar to the following. The values are truncated to improve readability.
| contributors | metric_test | metric_control | metric_test_over_metric_control | metric_test_over_complement | metric_control_over_complement | aumann_shapley_attribution | apriori_support | contribution |
|---|---|---|---|---|---|---|---|---|
| all | 0.069 | 0.071 | 0.969 | null | null | -0.00219 | 1.0 | 0.00219 |
| city=DES MOINES | 0.048 | 0.054 | 0.88 | 0.67 | 0.747 | -0.00108 | 0.08 | 0.00108 |
| vendor_name=DIAGEO AMERICAS | 0.064 | 0.068 | 0.937 | 0.917 | 0.956 | -0.0009 | 0.184 | 0.0009 |
| vendor_name=BACARDI USA INC | 0.071 | 0.082 | 0.857 | 1.025 | 1.167 | -0.00054 | 0.057 | 0.00054 |
| vendor_name=PERNOD RICARD USA | 0.068 | 0.077 | 0.89 | 0.988 | 1.082 | -0.0005 | 0.061 | 0.0005 |
In the output, you can see that the data segment `city=DES MOINES` contributes the most to the change in the cost of sales ratio. You can also see this difference in the `metric_test` and `metric_control` columns, which show that the ratio decreased in the test data compared to the control data. Other metrics, such as `metric_test_over_metric_control`, `metric_test_over_complement`, and `metric_control_over_complement`, compute additional statistics that describe the relationship between the control and test ratios and how they relate to the overall population. For more information, see Output for summable ratio metric contribution analysis models.
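To focus on the segments with the largest impact regardless of direction, you can sort by the absolute attribution. The following query is a sketch, not part of the original tutorial, and assumes the same model and output columns shown above.

```sql
-- Optional variation: list the five rows with the largest absolute
-- Aumann-Shapley attribution, together with their apriori support.
SELECT
  contributors,
  aumann_shapley_attribution,
  apriori_support
FROM
  ML.GET_INSIGHTS(MODEL `bqml_tutorial.liquor_sales_model`)
ORDER BY ABS(aumann_shapley_attribution) DESC
LIMIT 5;
```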
Clean up
- In the Google Cloud console, go to the Manage resources page.
- In the project list, select the project that you want to delete, and then click Delete.
- In the dialog, type the project ID, and then click Shut down to delete the project.
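If you prefer to keep the project and remove only the tutorial resources, deleting the dataset also deletes the model and input table it contains. This is a sketch of that alternative, not one of the documented clean-up steps.

```sql
-- Optional alternative: delete only the tutorial dataset and its contents.
DROP SCHEMA IF EXISTS bqml_tutorial CASCADE;
```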