One of the exciting new features introduced with GA4 is the ability to export user events to BigQuery, available to all GA accounts (previously this was restricted to GA360 users). Unlike the GTM approach above, the GA4 BQ export includes both mobile and web events and requires no client-side changes. To take advantage of this feature, Cloud Retail natively supports importing historical user events via GA4 BQ.
To try this feature out, you will need to reach out to your support team to allowlist your project. After being allowlisted, the following steps are required:
Enable the Dataflow API (this is used to transform the data from GA4 into the Cloud Retail format).
Grant the following permissions to service accounts used by Cloud Retail:
service-{Project_Number}@google-cloud-sa-retail.iam.gserviceaccount.com has the iam.serviceAccounts.actAs permission on the Compute Engine default service account
If the BQ dataset is in a different project than your Retail solution, you will need to grant both the Retail and Compute Engine service accounts the BigQuery Admin role.
Call the Retail API import method with data_schema='ga4', or go to the Retail console Import tab and choose GA4 as your BQ user events schema.
The GA4 BQ export shards each day's events into a separate table, so you will need to import each day using a separate import call.
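As a rough sketch, the per-day import calls described above can be scripted by generating the sharded table name for each day and building the corresponding import request body. The project, dataset, and date values below are placeholders, and the request shape mirrors the Retail userEvents:import REST method with a BigQuery source (the data_schema value follows the 'ga4' setting mentioned above):

```python
from datetime import date, timedelta

def ga4_table_id(day: date) -> str:
    # The GA4 BQ export writes each day's events to a sharded table
    # named events_YYYYMMDD.
    return "events_" + day.strftime("%Y%m%d")

def build_import_request(project_id: str, dataset_id: str, day: date) -> dict:
    # Request body for the Retail userEvents:import method, pointing at
    # one day's GA4 export shard and using the GA4 data schema.
    return {
        "inputConfig": {
            "bigQuerySource": {
                "projectId": project_id,
                "datasetId": dataset_id,
                "tableId": ga4_table_id(day),
                "dataSchema": "ga4",
            }
        }
    }

# One import call per exported day, e.g. a 3-day backfill:
start = date(2021, 6, 1)
requests = [
    build_import_request("my-project", "analytics_123456", start + timedelta(days=i))
    for i in range(3)
]
```

Each dict in `requests` would then be posted as a separate import call, one per day's shard.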
Caveats: GA4 BQ export has limitations on what you can do at no charge, such as the number of events that can be exported (learn more). Dataflow also incurs costs on the end user; see Dataflow pricing. This cost is small relative to subsequent Retail model training and prediction costs.
Who is this for: Any customer wanting to import their historical events.
Who is this not for: This solution can work for most people, but it can only import events in bulk. You will need to use GTM or GA4 BQ Streaming to get real-time events. GA4 also does not allow backfilling data into BQ, so to have historical data you must have the BQ export set up before integrating with Cloud Retail.
GA4 BQ Streaming [Availability: Public Preview]
In addition to supporting a batch export to BQ, GA4 also supports streaming events into BQ in near real-time. Unlike the GTM approach above, the GA4 BQ export includes both mobile and web events and requires no client-side changes. To take advantage of this capability, we have architected a system on top of Cloud Functions and Cloud Scheduler that periodically polls data from GA4 BQ and streams it to the Cloud Retail API.
Cloud Scheduler will trigger a Cloud Function to run every minute, pull the most recent data, and send it to Cloud Retail. Detailed instructions are in our customer support GitHub.
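A minimal sketch of the polling step: the function queries the GA4 streaming export for rows newer than the last event it already forwarded. The table wildcard and checkpoint scheme here are assumptions for illustration (GA4 streaming export writes to sharded events_intraday_YYYYMMDD tables, and event_timestamp is microseconds since the epoch):

```python
def build_poll_query(project_id: str, dataset_id: str,
                     last_event_timestamp_us: int) -> str:
    # Poll everything newer than the last event we already forwarded.
    # Events may still arrive out of order, so the downstream consumer
    # must tolerate duplicates (see the caveat below).
    return (
        f"SELECT * FROM `{project_id}.{dataset_id}.events_intraday_*` "
        f"WHERE event_timestamp > {last_event_timestamp_us} "
        f"ORDER BY event_timestamp"
    )

query = build_poll_query("my-project", "analytics_123456", 1622505600000000)
```

The resulting SQL string would be run via the BigQuery client inside the Cloud Function, and the maximum event_timestamp seen would become the next checkpoint.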
By default, for home page view events, the import will ingest page views whose URL has no suffix after "/", e.g. myhomepage.com or myhomepage.com/. To ingest home page views that do not match these patterns, you will need to reach out to your support team.
To try this feature out you will currently need to reach out to your support team to allowlist your project for it.
Caveat: Given that events might arrive out of order and no timestamp exists for when the data was written, the function might pull duplicate events into Cloud Retail (our models can handle this). GA4 BQ streaming also provides no SLOs on end-to-end latency. That said, our own empirical analysis suggests latency on the order of less than one minute.
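If you do want to reduce duplicates before sending events to Cloud Retail, a simple de-duplication pass over the polled rows is enough. The key fields below come from the GA4 export schema, and treating their combination as unique is an assumption of this sketch:

```python
def dedup_events(rows):
    # Drop rows already seen in this batch, keyed on fields from the
    # GA4 export schema. This is best-effort: duplicates across batches
    # can still slip through, which the Retail models tolerate.
    seen = set()
    unique = []
    for row in rows:
        key = (row["user_pseudo_id"], row["event_name"], row["event_timestamp"])
        if key not in seen:
            seen.add(key)
            unique.append(row)
    return unique
```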
Who is this for: Customers who do not use GTM and don’t want to do client-side changes. This is also ideal for customers who have mobile apps, given that GTM won’t cover those.
Who is this not for: Customers who need SLO-backed real-time events and don’t want to deploy a server-side solution – GTM is a better option in this case.
Cloud Functions for Firebase [Availability: Public Preview]
From a functional point of view, there are situations where minimizing the time between an event being captured by GA4 and that event enriching the Retail AI models can be key, especially for a customer's native apps, where, for many retailers, application traffic represents a high percentage of global traffic.
To help guide you through this section, it is helpful to understand how the data will flow through the system:
The native Android or iOS app generates an analytics event.
The Cloud Functions for Firebase trigger for the analytics event is activated.
The Cloud Function configured to handle the event begins processing and sends the data to the Cloud Retail API.
Cloud Retail receives the data and processes it for consumption in models and serving systems.
In this section we are going to show an implementation strategy based on the principles of event-driven architecture. The key idea is to extract a GA4 event generated on the mobile app and produce a Recommendations AI event to inject into the Cloud Retail system in real time. This will be achieved through the following 5 steps:
Step 1: Adding Google Analytics for Firebase SDK
Step 2: Sending a Google Analytics event from your app, and marking it as a conversion.
Step 3: Configuring a Cloud Functions for Firebase trigger to send Analytics events to Google Cloud
Step 4: Using a Google Cloud Function to process the Analytics Events
Step 5: Deploying the Cloud Functions for Firebase
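The core of Step 4, converting a GA4 analytics event into a Cloud Retail user event, might look like the sketch below. The input shape is a simplified stand-in for the Firebase trigger payload, and the event-name mapping is illustrative (extend it for the events your app logs); the eventType values are standard Cloud Retail user event types:

```python
GA4_TO_RETAIL_EVENT_TYPE = {
    # Illustrative mapping from GA4 event names to Retail event types.
    "purchase": "purchase-complete",
    "add_to_cart": "add-to-cart",
    "view_item": "detail-page-view",
}

def to_retail_user_event(ga4_event: dict) -> dict:
    # Build the Cloud Retail userEvent request body for one GA4 event.
    # visitorId identifies the end user; here we reuse GA4's
    # user_pseudo_id, an assumption of this sketch.
    return {
        "eventType": GA4_TO_RETAIL_EVENT_TYPE[ga4_event["name"]],
        "visitorId": ga4_event["user_pseudo_id"],
        "eventTime": ga4_event["event_time"],
        "productDetails": [
            {"product": {"id": item["item_id"]},
             "quantity": item.get("quantity", 1)}
            for item in ga4_event.get("items", [])
        ],
    }
```

The Cloud Function would run this transform on each incoming analytics event and write the result to the Retail userEvents API.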
Let’s look at the setup steps in more detail.
Configuring Firebase to send Analytics events to Google Cloud
Before proceeding to this step, we need to log the event via the Google Analytics for Firebase SDK – learn more.
Once we have instrumented Firebase to log Analytics events, we can configure Firebase to send those events in real-time by going to the Firebase console and marking all relevant events as conversions, as shown in the following figure. Events not marked as a conversion are not sent in real time.