ga4::bq kit — chrome extension
published 2025-09-25 · updated 2025-11-21 · bigquery chrome-extension · [link]
GA4::BQ Kit is a productivity tool for analysts and developers working with Google Analytics 4 data in BigQuery. It gives you instant access to a searchable library of SQL snippets directly in the BigQuery UI, helping you write accurate, complex queries faster.
Get the extension: Chrome Web Store
Documentation
The extension includes two kinds of snippets: component snippets (one-line building blocks) and query snippets (full queries). Use the in-extension search to filter by name or content and insert with a single action.
Table of contents
- Component snippets (🧩)
- ::parse date iso
- ::day of week iso
- ::day of week name
- ::extract iso week
- ::table suffix static
- ::table suffix dynamic
- ::timestamp seconds
- ::unnest event params string
- ::unnest event params int
- ::unnest event params float
- ::unnest event params double
- ::unnest event params coalesced
- ::row number
- ::aggregate event names
- Query snippets (🔗)
Component snippets (🧩)
Reusable one-line snippets for common SQL tasks on GA4 export tables.
::parse date iso
Parse the event_date string into a date type:
PARSE_DATE('%Y%m%d', event_date) AS event_date
::day of week iso
ISO day of week (1–7):
CAST(FORMAT_DATE('%u', SAFE.PARSE_DATE('%Y%m%d', event_date)) AS INT64) AS day_of_week_iso
::day of week name
Day name (e.g. Monday):
FORMAT_DATE('%A', PARSE_DATE('%Y%m%d', event_date)) AS day_of_week_name
::extract iso week
ISO week from event timestamp:
EXTRACT(ISOWEEK FROM TIMESTAMP_MICROS(event_timestamp)) AS iso_week
::table suffix static
Filter the sharded GA4 events table by a fixed date (single-day range shown; widen the BETWEEN bounds for a longer window):
_TABLE_SUFFIX BETWEEN '20260101' AND '20260101'
::table suffix dynamic
Filter by yesterday’s date:
_TABLE_SUFFIX BETWEEN FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)) AND FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
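The same pattern extends to a rolling lookback window. The sketch below is a variant, not part of the extension's snippet set, covering the trailing seven full days ending yesterday:

```sql
-- Variant: trailing 7 full days, ending yesterday
_TABLE_SUFFIX BETWEEN
  FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY))
  AND FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
```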
::timestamp seconds
Truncate event timestamp to seconds:
TIMESTAMP_TRUNC(TIMESTAMP_MICROS(event_timestamp), SECOND) AS event_ts_sec
::unnest event params string
Extract a string event parameter (e.g. page_title):
(SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_title') AS page_title
::unnest event params int
Extract an integer event parameter (e.g. ga_session_id):
(SELECT value.int_value FROM UNNEST(event_params) WHERE key = 'ga_session_id') AS ga_session_id
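These scalar subqueries can be stacked in a single scan. The sketch below combines the string and integer examples above into one query; the table id and dates are the same placeholders used by the query snippets further down:

```sql
-- Example: extract several event parameters in one pass (placeholder table and dates)
SELECT
  PARSE_DATE('%Y%m%d', event_date) AS event_date,
  (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_title') AS page_title,
  (SELECT value.int_value FROM UNNEST(event_params) WHERE key = 'ga_session_id') AS ga_session_id
FROM
  `project.analytics123456.events_*`
WHERE
  _TABLE_SUFFIX BETWEEN '20260101' AND '20260101';
```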
::unnest event params float
Extract a float event parameter:
(SELECT value.float_value FROM UNNEST(event_params) WHERE key = 'value') AS value_float
::unnest event params double
Extract a double event parameter:
(SELECT value.double_value FROM UNNEST(event_params) WHERE key = 'value') AS value_double
::unnest event params coalesced
Extract a numeric parameter (int/float/double or string cast to float):
(SELECT COALESCE(value.int_value, value.float_value, value.double_value, SAFE_CAST(value.string_value AS FLOAT64)) FROM UNNEST(event_params) WHERE key = 'value') AS numeric_value
::row number
Row number per user by event time (e.g. for “last event per user”):
ROW_NUMBER() OVER (PARTITION BY user_pseudo_id ORDER BY event_timestamp DESC) AS row_num
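Combined with BigQuery's QUALIFY clause, this component yields "last event per user" without a subquery. A minimal sketch against the placeholder table (dates are illustrative):

```sql
-- Example: latest event per user for a single day (placeholder table and dates)
SELECT
  user_pseudo_id,
  event_name,
  event_timestamp
FROM
  `project.analytics123456.events_*`
WHERE
  _TABLE_SUFFIX BETWEEN '20260101' AND '20260101'
QUALIFY
  ROW_NUMBER() OVER (PARTITION BY user_pseudo_id ORDER BY event_timestamp DESC) = 1;
```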
::aggregate event names
Aggregate distinct event names into a parenthesized, comma-separated list of quoted names:
CONCAT('(',STRING_AGG(DISTINCT TO_JSON_STRING(event_name)),')') AS event_names_agg
Query snippets (🔗)
Complete queries for common operations and analysis.
::insert rows
Insert all rows from a source table into a target table:
# Insert all rows from a source table into a target table
INSERT INTO `target_table_id`
SELECT
*
FROM
`source_table_id`;
::update rows
Update rows based on a condition:
# Update rows in a table based on a condition
UPDATE `table_id`
SET column_a = 'new_value'
WHERE column_b = 'match_value';
::delete rows
Delete rows based on a condition:
# Delete rows from a table based on a condition
DELETE FROM `table_id`
WHERE column_a = 'value_a';
::create table (basic)
Create a new table from an existing table:
# Create a new table from an existing table
CREATE OR REPLACE TABLE `target_table_id` AS
SELECT
*
FROM
`source_table_id`;
::create table (partitioned and clustered)
Create a partitioned and clustered table:
# Create a partitioned and clustered table from an existing table
CREATE OR REPLACE TABLE `target_table_id`
PARTITION BY DATE(event_date)
CLUSTER BY column_a, column_b AS
SELECT
*
FROM
`source_table_id`;
::cte
Common Table Expression (CTE) pattern:
# Use a Common Table Expression (CTE) to create a temporary result set
WITH base_cte AS (
SELECT
*
FROM
`source_table_id`
WHERE
column_a = 'filter_value'
)
SELECT
*
FROM
base_cte;
::count events by name
Count events by event name:
# Count the number of events by event name
SELECT
event_name,
COUNT(*) AS event_count
FROM
`project.analytics123456.events_*`
WHERE
_TABLE_SUFFIX BETWEEN '20260101' AND '20260101'
GROUP BY ALL;
::find duplicate transactions
Find duplicate ecommerce transactions by transaction_id:
# Find duplicate ecommerce transactions by transaction_id
SELECT
event_date,
event_name,
ecommerce.transaction_id,
COUNT(*) AS event_count
FROM
`project.analytics123456.events_*`
WHERE
_TABLE_SUFFIX BETWEEN '20260101' AND '20260101'
GROUP BY ALL
HAVING
event_count > 1;
::consent by event
Analyze consent status for selected events:
# Analyze consent status for selected events
SELECT
event_name,
privacy_info.ads_storage,
COUNT(*) AS counter,
ROUND((COUNT(*) * 100.0) / SUM(COUNT(*)) OVER (PARTITION BY event_name), 2) AS percentage_within_event
FROM
`project.analytics123456.events_*`
WHERE
_TABLE_SUFFIX BETWEEN '20260101' AND '20260101'
AND event_name IN ('page_view', 'add_to_cart', 'purchase')
GROUP BY ALL
ORDER BY
event_name, ads_storage;
::find duplicate rows
Find and list exact duplicate rows using a row hash:
# Find and display exact duplicate rows in a table
WITH keyed AS (
SELECT
FARM_FINGERPRINT(TO_JSON_STRING((SELECT AS STRUCT t.*))) AS row_hash,
t.*
FROM
`table_id` AS t
)
SELECT
*
FROM
keyed
QUALIFY
COUNT(*) OVER (PARTITION BY row_hash) > 1
ORDER BY
row_hash;
::get column names
Get all column names for a table from INFORMATION_SCHEMA:
# Get all column names for a specific table
SELECT ARRAY_AGG(column_name ORDER BY ordinal_position) AS column_names
FROM `project.dataset.INFORMATION_SCHEMA.COLUMNS`
WHERE table_name = 'your_table';