Google BigQuery

Overview

Google BigQuery is a fully managed analytics data warehouse that is part of the Google Cloud Platform.

Using the Lytics integration with Google BigQuery, you can import and export events, and also export user profiles and audience changes to BigQuery. Granular event data in BigQuery will allow for ease of auditing, querying, and reporting through other data visualization tools.

Authorization

If you haven't already done so, you will need to set up a Google BigQuery account before you begin the process described below. You can follow the Google quickstart guide and may also want to read the Google BigQuery documentation.

If you are using an IP allowlist for your Google BigQuery account, contact your account administrator to add Lytics' IP addresses to your allowlist. Reach out to your Lytics Account Manager for the current list of IP addresses.

There are two different options for authorization with Google BigQuery:
User Account and Service Account. See Google's authentication documentation for details on the differences between user and service accounts. See the end of this document for information on how to give third parties access to exported data in BigQuery.

If you are new to creating authorizations in Lytics, see the Authorizations documentation for more information.

  1. Select Google Cloud from the list of providers.
  2. Select the authorization method. Google BigQuery supports the User Account and Service Account methods described in the sections below; note that different methods may support different job types.
  3. Enter a Label to identify your authorization.
  4. (Optional) Enter a Description for further context on your authorization.
  5. Complete the configuration steps needed for your authorization. These steps will vary by method.
  6. Click Save Authorization.

User account

This option uses your Google user account to access your BigQuery data. Note that you must have the projectEditor role on the project you are importing from in order for BigQuery exports and imports to run properly with User Account authorization.

  1. Select Big Query OAuth.
  2. From the user selection window, select the Google account you want to use from the list of accounts.
  3. Click Allow.

Service account

This option uses a service account to access your Google BigQuery data.

  1. You will need a service account credential file to use this authorization type. If you do not have one, follow the instructions in Google's documentation to create one.
    • For imports and exports, the service account must be able to list projects, create and write to tables, and create jobs. The combination of the BigQuery Data Editor and BigQuery Job User roles satisfies these requirements. Learn more about BigQuery permissions in Google's access control documentation.
  2. In your Lytics account, select Big Query Service Account JWT. Note there are different options for Import and Export.
  3. In the Cert JSON text box, paste the contents of the service account's credential file.
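
Since all import and export jobs will authenticate with this JSON key, it can be useful to verify the key locally before saving the authorization. Below is a minimal sketch using Python and the google-cloud-bigquery library; the file path is a placeholder, and listing datasets plus running a trivial query roughly exercises the roles noted above. This is an illustration only, not part of the Lytics setup.

```python
# Hypothetical check that a service-account key file works for BigQuery.
# "service-account-key.json" is a placeholder path to the same file whose
# contents are pasted into the Cert JSON text box.
from google.cloud import bigquery
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file(
    "service-account-key.json"
)
client = bigquery.Client(credentials=credentials, project=credentials.project_id)

# Listing datasets and running a query loosely exercises the
# BigQuery Data Editor and BigQuery Job User roles.
print([d.dataset_id for d in client.list_datasets()])
print(list(client.query("SELECT 1 AS ok").result()))
```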

Give Third Parties Access to Exported Data in BigQuery

When exporting data from Lytics to BigQuery, you can specify the email addresses you would like to grant access to. If you later decide to grant access to someone else, you can do so from within BigQuery. See Google's documentation regarding access controls for datasets.
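
If you prefer to grant access programmatically rather than through the BigQuery console, the sketch below shows one way to add a reader to a dataset using Python and the google-cloud-bigquery library. The project, dataset, and email address are placeholders.

```python
# A sketch of granting an additional user read access to a dataset that
# Lytics exports into. All identifiers below are placeholders.
from google.cloud import bigquery

client = bigquery.Client()
dataset = client.get_dataset("my-project.lytics_export")

entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="READER",
        entity_type="userByEmail",
        entity_id="analyst@example.com",
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])
```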

Import Activity Data

Many applications let you write to tables in Google BigQuery, and you can easily import this custom data into Lytics. Once imported, Lytics data science can provide powerful insights on this custom data to drive your marketing efforts.

Integration Details

  • Implementation Type: Server-side Integration.
  • Implementation Technique: REST API Integration.
  • Frequency: Batch Integration.
  • Resulting data: User Profiles, User Fields, Raw Event Data.

This integration utilizes the Google BigQuery APIs to import user data. Once the import is started, the job will:

  1. Start a query job for data in Google BigQuery.
  2. Export the result to Google Cloud Storage.
  3. Import the file from Google Cloud Storage into the selected Lytics data stream.
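
Lytics performs these steps internally, but the first two map onto standard BigQuery API calls. The sketch below approximates them in Python with the google-cloud-bigquery library; all project, dataset, table, and bucket names are placeholders.

```python
# An approximation of steps 1-2 above: run a query job that materializes
# the rows to import, then extract the result table to Cloud Storage.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

# 1. Start a query job for the data to import.
job_config = bigquery.QueryJobConfig()
job_config.destination = "my-project.tmp.import_staging"
client.query(
    "SELECT * FROM `my-project.analytics.events`", job_config=job_config
).result()

# 2. Export the result to Google Cloud Storage as CSV.
client.extract_table(
    "my-project.tmp.import_staging",
    "gs://my-bucket/lytics-import/events-*.csv",
).result()
```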

Fields

Fields imported through Google BigQuery will require custom data mapping. For assistance mapping your custom data to Lytics user fields, please contact Lytics Support.

Configuration

Follow these steps to set up and configure an import of Google BigQuery events into the Lytics platform. If you are new to creating jobs in Lytics, see the Data Sources documentation for more information.

  1. Select Google Cloud from the list of providers.
  2. Select the Import Activity Data job type from the list.
  3. Select the Authorization you would like to use or create a new one.
  4. Enter a Label to identify this job you are creating in Lytics.
  5. (Optional) Enter a Description for further context on your job.
  6. In the Stream text input, enter the stream where imported data will go.
  7. From the BigQuery Project input, select the Google BigQuery Project you want to import data from.
  8. From the BigQuery Dataset input, select the Google BigQuery Dataset you want to import data from.
  9. From the BigQuery Table input, select the Google BigQuery table you would like to import. NOTE: "nested tables" are not supported by this import process.
  10. From the Timestamp Field input, select the name of the field that contains the event timestamp. On continuous imports, the most recent time in this field will be saved; on the next import, only rows with a timestamp greater than that value will be imported (see the sketch after these steps). The field must be of type DATETIME or TIMESTAMP in BigQuery.
  11. Select the Keep Updated checkbox to continuously run this import. If this option is selected, the Timestamp Field must be provided.
  12. From the Frequency input, select how often you want to run the import job. By default, the import job will run every 4 hours.
  13. (Optional) From the Time of Day input, select the time of day to start importing.
  14. (Optional) From the Time Zone input, select the time zone of the time of day above.
  15. Click Complete to start the import.
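
The incremental behavior in step 10 amounts to filtering on a saved high-water mark. A sketch of the equivalent query, assuming Python and google-cloud-bigquery, with placeholder table and field names:

```python
# Only rows newer than the timestamp saved by the previous run are read.
import datetime
from google.cloud import bigquery

client = bigquery.Client()
# High-water mark saved after the previous import (placeholder value).
last_ts = datetime.datetime(2024, 1, 1, tzinfo=datetime.timezone.utc)

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("last_ts", "TIMESTAMP", last_ts)
    ]
)
rows = client.query(
    "SELECT * FROM `my-project.analytics.events` WHERE event_ts > @last_ts",
    job_config=job_config,
).result()
```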

The time necessary to import a table depends on the table size. It can take from 30 minutes to several hours to complete the import.

Be aware that large imports can slow down your entire Lytics account as data is processed and audience memberships are updated.

Export Activity Data

Export event data from any Lytics data stream to Google BigQuery. NOTE: Unlike user fields, events are not represented within the Lytics dashboard. Events are the raw data received from integrations as seen in your Data Streams.

Integration Details

  • Implementation Type: Server-side Integration.
  • Implementation Technique: REST API Integration.
  • Frequency: Batch Integration.
  • Resulting data: Raw Event Data is stored in the resulting table.

This integration utilizes the Google BigQuery APIs to send user data. Once the export is started, the job will:

  1. Check if the dataset exists in Google BigQuery; if not, create it.
  2. Check if the table exists; if not, create one.
  3. Generate a CSV for each table to be created or updated in BigQuery by:
    1. Scanning through events on the data stream.
    2. Generating CSV row(s) for each scanned event.
    3. Writing the CSV row(s) to Google Cloud Storage.
  4. Upload the CSV(s) to BigQuery.
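
Step 4 corresponds to a standard BigQuery load job from Cloud Storage. A sketch in Python with google-cloud-bigquery, using placeholder names:

```python
# Load the generated CSVs from Cloud Storage into a BigQuery table,
# appending to the table if it already exists.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,  # header row
    autodetect=True,      # infer the schema from the CSV
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)
load_job = client.load_table_from_uri(
    "gs://my-bucket/lytics-export/web_default-*.csv",
    "my-project.lytics_export.web_default",
    job_config=job_config,
)
load_job.result()  # wait for the load to finish
```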

There are a few things to know when running an event export:

  • The event export scans the complete data stream from the first collected event to the current event, unless otherwise specified by the export configuration.
  • The export will run continuously throughout the day.
  • Field names in BigQuery will not be an exact match of field names in Lytics. Names will be formatted to fit BigQuery's field naming schema (see the sketch after this list).
  • If two field names conflict after transformation to BigQuery format, one will get an integer appended to the end. For example, if both first_name and First Name exist in Lytics and are exported, they'll be written to first_name and first_name1 in BigQuery.
  • Low-usage fields (fields populated in less than 0.1% of total volume for the exported stream) will not be included in the event export.
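
To illustrate the renaming and conflict behavior described above, here is a small Python sketch. It is not Lytics' actual implementation, only a plausible version of the same rules:

```python
# Reduce field names to BigQuery-safe characters and append an integer
# when two names collide after the transformation.
import re

def to_bigquery_names(fields):
    seen, out = {}, {}
    for field in fields:
        name = re.sub(r"[^A-Za-z0-9_]", "_", field).lower().strip("_")
        if re.match(r"\d", name):  # column names cannot start with a digit
            name = "_" + name
        if name in seen:
            seen[name] += 1
            name = f"{name}{seen[name]}"
        else:
            seen[name] = 0
        out[field] = name
    return out

print(to_bigquery_names(["first_name", "First Name"]))
# {'first_name': 'first_name', 'First Name': 'first_name1'}
```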

Fields

The fields included will depend on the raw event in Lytics data. All fields in the selected stream will be included in the exported table. To see a list of fields in a stream, select the stream name from Data > Data Streams in the Lytics platform.

Configuration

Follow these steps to set up and configure an export of events from Lytics to Google BigQuery. If you are new to creating jobs in Lytics, see the Destinations documentation for more information.

  1. Select Google Cloud from the list of providers.
  2. Select the Export Activity Data job type from the list.
  3. Select the Authorization you would like to use or create a new one.
  4. Enter a Label to identify this job you are creating in Lytics.
  5. (Optional) Enter a Description for further context on your job.
  6. From the BigQuery Project input, select the Google BigQuery Project you want to export data to.
  7. From the BigQuery Dataset input, select the Google BigQuery Dataset you want to export data to.
  8. From the Data Streams to Export input, select data streams to export. A stream is a single source/type of data. You may choose more than one. If none are selected, all streams will be exported.
  9. In the Maximum numeric field, enter the maximum number of events to export. If left blank, all events will be exported.
  10. In the Start Date text input, enter the date of the oldest event you want to export. Events from this date onwards will be exported. Use yyyy-mm-dd format (e.g. 2015-08-13).
  11. In the End Date text input, enter the end of the range of events you want to export. Events before, but NOT including, this date will be exported. Use yyyy-mm-dd format (e.g. 2015-08-13).
  12. Click on the Show Advanced Options tab to expand the advanced configuration.
  13. Select the Partitioned Table checkbox to use partitioned tables, which can help lower query costs by allowing you to target a specific date range of data instead of the whole data set. Tables will be partitioned by the event date (see the sketch after these steps).
  14. From the Partition Duration dropdown, select the duration of the table's partitions. This is ignored if Partitioned Table is not selected.
  15. Select the Flatten Streams checkbox to export all streams to one BigQuery table.
  16. Select the Keep Updated checkbox to continuously run this export.
  17. Select the Start Export From Now Onwards checkbox to only export events collected from now onwards. This will override any start or end date configuration made.
  18. Click Start Export.
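For context on the Partitioned Table option, the sketch below creates a day-partitioned table and runs a query that only scans one month of partitions. It uses Python and google-cloud-bigquery; the table name and schema are placeholders, not the schema Lytics produces.

```python
# Create a table partitioned by day on a timestamp column, then query a
# bounded date range so only matching partitions are scanned and billed.
from google.cloud import bigquery

client = bigquery.Client()

table = bigquery.Table(
    "my-project.lytics_export.web_default",
    schema=[
        bigquery.SchemaField("ts", "TIMESTAMP"),
        bigquery.SchemaField("payload", "STRING"),
    ],
)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="ts",
)
client.create_table(table)

query = """
SELECT COUNT(*) AS n
FROM `my-project.lytics_export.web_default`
WHERE ts >= TIMESTAMP('2024-01-01') AND ts < TIMESTAMP('2024-02-01')
"""
print(list(client.query(query).result()))
```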

Export Audience Changes

Export audience changes from Lytics to Google BigQuery.

Integration Details

  • Implementation Type: Server-side Integration.
  • Implementation Technique: REST API Integration.
  • Frequency: Batch Integration.
  • Resulting data: Audience segment change events.

This integration utilizes the Google BigQuery APIs to send user data. Once the export is started, the job will:

  1. Check if the dataset exists in Google BigQuery; if not, create it.
  2. Check if the table exists; if not, create one.
  3. Generate a CSV by:
    1. Receiving segment change events.
    2. Writing each change event to Google Cloud Storage as a CSV row.
  4. Every 10 minutes, upload the CSV to BigQuery and start writing events to a new CSV.

Fields

By default, Lytics exports the following fields to Google BigQuery:

Lytics User Field | Description | Google BigQuery Field | Type
email | Email Address | email | string
- | Audience Slug | segment_slug | string
- | Event Time | timestamp | TIMESTAMP
- | Whether this was an audience exit | exit | boolean
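
For reference, the schema implied by the table above could be declared as follows with google-cloud-bigquery; the project, dataset, and table names are placeholders. Note that the identity column depends on the ID Field chosen in the configuration below; email is shown as the default.

```python
# Placeholder declaration of the audience-change table described above.
from google.cloud import bigquery

schema = [
    bigquery.SchemaField("email", "STRING"),        # or the configured ID field
    bigquery.SchemaField("segment_slug", "STRING"),
    bigquery.SchemaField("timestamp", "TIMESTAMP"),
    bigquery.SchemaField("exit", "BOOLEAN"),
]
client = bigquery.Client()
client.create_table(
    bigquery.Table("my-project.lytics_export.audience_changes", schema=schema)
)
```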

Configuration

Follow these steps to set up and configure an export of audience change events from Lytics to Google BigQuery. If you are new to creating jobs in Lytics, see the Destinations documentation for more information.

  1. Select Google Cloud from the list of providers.
  2. Select the Export Audience Changes job type from the list.
  3. Select the Authorization you would like to use or create a new one.
  4. Enter a Label to identify this job you are creating in Lytics.
  5. (Optional) Enter a Description for further context on your job.
  6. From the BigQuery Project input, select the Google BigQuery Project you want to export data to.
  7. From the Audiences input, select the Lytics audience(s) you'd like to track. Leave empty to track all segment changes.
  8. From the BigQuery Dataset input, select the Google BigQuery Dataset you want to export data to.
  9. From the ID Field input, select the ID field in Lytics that will be mapped to an identifying field in BigQuery.
  10. Click Start Export.

Export Audience Definitions

Use the Export Audience Definitions job to export your Lytics audience definitions to Google BigQuery. You can use the exported definitions to segment users in other tools.

Integration Details

  • Implementation Type: Server-side Integration.
  • Implementation Technique: REST API Integration, File Based Transfer Integration.
  • Frequency: Batch Integration.
  • Resulting data: Audience Definitions.

The selected audience definitions are retrieved. If a definition uses any INCLUDE statements, the included audience is inlined into the exported audience's definition, replacing the INCLUDE statement.

Steps/Pattern:

  1. Check if the dataset exists in BigQuery; create it if it doesn't exist.
  2. Check if the table exists in BigQuery; if it does, check that the schemas match. If they do not, create a new table.
  3. For each selected audience:
    1. Retrieve the FilterQL for the audience.
    2. Inline any INCLUDE statements.
    3. Add a row to the CSV file.
  4. Start a Load Job in BigQuery (see the sketch below).
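
The check-then-load pattern in steps 1, 2, and 4 maps onto standard BigQuery API calls. A sketch in Python with google-cloud-bigquery, using placeholder names (lytics_12345 stands in for lytics_(AID)) and a pre-generated CSV file:

```python
# Ensure the dataset and table exist (creating a new table on a schema
# mismatch), then load the audience-definition CSV.
from google.cloud import bigquery
from google.cloud.exceptions import NotFound

client = bigquery.Client()
dataset_id = "my-project.lytics_12345"
table_id = f"{dataset_id}.lytics_12345_audience_definition_export"

schema = [
    bigquery.SchemaField("id", "STRING"),
    bigquery.SchemaField("name", "STRING"),
    bigquery.SchemaField("slug", "STRING"),
    bigquery.SchemaField("filter", "STRING"),
    bigquery.SchemaField("ly_exported_ts", "DATE"),
]

try:
    client.get_dataset(dataset_id)          # step 1: does the dataset exist?
except NotFound:
    client.create_dataset(dataset_id)

try:
    table = client.get_table(table_id)      # step 2: does the schema match?
    if [f.name for f in table.schema] != [f.name for f in schema]:
        table_id += "_v2"                   # mismatch: write to a new table
        client.create_table(bigquery.Table(table_id, schema=schema))
except NotFound:
    client.create_table(bigquery.Table(table_id, schema=schema))

# step 4: start the load job from the generated CSV.
with open("audience_definitions.csv", "rb") as f:
    client.load_table_from_file(
        f,
        table_id,
        job_config=bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.CSV, skip_leading_rows=1
        ),
    ).result()
```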

Fields

By default, Lytics exports the following fields to Google BigQuery:

Lytics Audience Field | Description | Google BigQuery Field | Type
ID | Audience ID (configurable) | id | string
Name | Audience name (configurable) | name | string
Slug | Audience slug (configurable) | slug | string
FilterQL | Filter statement | filter | string
ly_exported_ts | Time of export | ly_exported_ts | date
ly_created_ts | Time audience was created | ly_created_ts | date
ly_updated_ts | Time audience was updated | ly_updated_ts | date

Configuration

Follow these steps to set up and configure an export job for Google BigQuery in the Lytics platform. If you are new to creating jobs in Lytics, see the Destinations documentation for more information.

  1. Select Google Cloud from the list of providers.
  2. Select the Export Audience Definitions job type from the list.
  3. Select the Authorization you would like to use or create a new one.
  4. Enter a Label to identify this job you are creating in Lytics.
  5. (Optional) Enter a Description for further context on your job.
  6. From the BigQuery Project input, select the Google BigQuery Project you want to export data to.
  7. (Optional) From the BigQuery Dataset input, select the Google BigQuery Dataset you want to export data to. If no dataset is selected, the lytics_(AID) dataset will be used. If the selected dataset does not exist, it will be created.
  8. (Optional) In the BigQuery Table text box, enter the Google BigQuery table you want to export data to. If an existing table is not selected, one will be created. If no table name is entered, lytics_(AID)_audience_definition_export will be used.
  9. (Optional) Select the Include Audience ID checkbox to include the audience's ID in the row.
  10. (Optional) Select the Include Audience Name checkbox to include the audience's name in the row.
  11. (Optional) Select the Include Audience slug checkbox to include the audience's slug in the row.
  12. Click Start Export.

Export Audiences

Export user profiles including user fields and audience memberships from Lytics to Google BigQuery.

Integration Details

  • Implementation Type: Server-side Integration.
  • Implementation Technique: REST API Integration.
  • Frequency: Batch Integration.
  • Resulting data: User Fields will be exported to Google BigQuery as either a flat table or multiple tables with join keys for non-scalar fields (maps and sets).
    • Field names in BigQuery will not be an exact match of field names in Lytics. Names will be formatted to fit BigQuery's field naming schema.
    • If two field names conflict after transformation to BigQuery format, one will get an integer appended to the end of the field name, e.g. if both first_name and First Name exist in Lytics and are exported, they'll be written to first_name and first_name1 in BigQuery.
    • If non-scalar fields are exported, each field will be written to a separate table, with a unique identifier id field that can be used to join the main user table to the field tables.
    • This id is not an identifier that exists in Lytics; rather, it is a hash of all of the user's unique identifier fields, calculated for the express purpose of joining the non-scalar field tables to the main user table (see the query sketch after this list). This field's value may change between exports.
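
As an example of how the generated id can be used, the hypothetical query below joins a main user table to one non-scalar field table. Both table names and the field table's value column are assumptions for illustration, not the exact layout Lytics produces.

```python
# Join the flat user table to a non-scalar field table on the hashed id.
from google.cloud import bigquery

client = bigquery.Client()
query = """
SELECT u.email, i.value AS interest
FROM `my-project.lytics_export.user_my_audience` AS u
JOIN `my-project.lytics_export.user_my_audience_interests` AS i
  ON u.id = i.id
"""
for row in client.query(query).result():
    print(row.email, row.interest)
```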

This integration utilizes the Google BigQuery APIs to send user data. Once the export is started, the job will:

  1. Check if the dataset exists in Google BigQuery; if not, create it.
  2. Generate a CSV for each table to be created or updated in BigQuery by:
    1. Scanning the selected audience.
    2. Generating CSV row(s) for each scanned user.
    3. Writing the CSV row(s) to Google Cloud Storage.
  3. Upload the CSV(s) to BigQuery.

There are a few things to know when running a BigQuery user export:

  • The user export scans the complete Lytics audience and exports all entries into a BigQuery table.
  • The export will run at most once a day, but large audiences may take more than a day to complete.
  • For large audiences (more than a million users), users will first be written into a dated user table (user_ + audience slug + _ + date) as the table builds in BigQuery. As more batches of users are exported, they will be appended to the dated table. Once the complete audience has been sent to BigQuery, the table will be renamed with the date removed.
  • If continuous export is selected, the work will start running again on the next day at the time specified. A new temporary dated audience table will be created in BigQuery alongside the completed non-dated table.
  • If an audience is large enough to take over a day to export and continuous export is selected, the work will not run again until after the previous export completes.

Fields

The fields exported to Google BigQuery will depend on the Export Non-Scalar Fields and the Export Scalar Fields options in the workflow's configuration. Any user field in your Lytics account may be available for export.

Configuration

Follow these steps to set up and configure an export of users from Lytics to Google BigQuery. If you are new to creating jobs in Lytics, see the Destinations documentation for more information.

  1. Select Google Cloud from the list of providers.
  2. Select the Export Audiences job type from the list.
  3. Select the Authorization you would like to use or create a new one.
  4. Enter a Label to identify this job you are creating in Lytics.
  5. (Optional) Enter a Description for further context on your job.
  6. From the BigQuery Project input, select the Google BigQuery Project you want to export data to.
  7. From the BigQuery Dataset input, select the Google BigQuery Dataset you want to export data to.
  8. From the Audience input, select the audience you would like to export to BigQuery.
  9. In the Maximum numeric field, enter the maximum number of users to export. If left blank, all users will be exported.
  10. Select the Audience Membership checkbox to export the audiences each user is a member of.
  11. From the Export Non-Scalar Fields input, select a list of non-scalar fields to export. NOTE: these will be created in a separate table.
  12. From the Export Scalar Fields input, select a list of scalar fields to export. NOTE: these will be created in the same table. If none are selected, all fields are exported.
  13. Click on the Show Advanced Options tab to expand the advanced configuration.
  14. Select the Keep Updated checkbox to continuously run this export.
  15. Select the Keep Old checkbox to keep daily exports in dated tables.
  16. From the Time of Day input, select the time of day to start export each day.
  17. From the Timezone input, select the timezone for the time of day.
  18. Click Start Export.

Exporting table data to BigQuery

Please see the Google BigQuery documentation on Exporting Table Data for options and instructions for exporting table data out of BigQuery.

Google Cloud: Export All Users

Lytics Conductor users can export all profiles in their Lytics account to BigQuery via the Google Cloud: Export All Users workflow. This workflow is optimized for scanning the Lytics user table without needing to consider audience membership, and as a result offers superior performance over the audience export workflows. This makes it a great option for Lytics users who wish to maintain a daily sync of all their Lytics profiles to a BigQuery data warehouse of their choice.

Integration Details

  • Implementation Type: Server-side
  • Implementation Technique: REST API
  • Frequency: Batch
  • Resulting data: User Fields will be exported to Google BigQuery as a flat table by default. To export non-scalars to separate join tables, deselect Export Non-Scalars as JSON and see the Export Audiences workflow documentation for information on how the target export tables are formatted in this mode.
    • Field names in BigQuery will not be an exact match of field names in Lytics. Names will be formatted to fit BigQuery's field naming schema.
    • If two field names are transformed to BigQuery format and conflict, one will get an integer appended to the end of the field name, e.g. if both first_name and First Name exist in Lytics and are exported, they'll be written to first_name and first_name1 in BigQuery.
    • If non-scalar fields are exported, they will be written to BigQuery columns of type JSON by default (see the query sketch after this list). See the BigQuery documentation for more information on working with JSON data.
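
As an example of the JSON mode, the hypothetical query below extracts a single value from a non-scalar field exported as a JSON column using BigQuery's JSON_VALUE function. The table and column names are assumptions.

```python
# Read one key out of a JSON-typed column produced by the export.
from google.cloud import bigquery

client = bigquery.Client()
query = """
SELECT email, JSON_VALUE(scores, '$.propensity') AS propensity
FROM `my-project.lytics_export.users`
"""
for row in client.query(query).result():
    print(row.email, row.propensity)
```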

This integration utilizes the Google BigQuery APIs to send user data. Once the export is started, the job will:

  1. Check if the dataset exists in Google BigQuery; if not, create it.
  2. Generate a CSV for each table to be created or updated in BigQuery by:
    1. Scanning the Lytics user table.
    2. Generating a CSV row for each scanned profile.
    3. Writing the CSV row to Lytics-owned Google Cloud Storage.
  3. Upload the CSV(s) to BigQuery.

There are a few things to know when running a BigQuery user export:

  • The user export scans the complete Lytics user table and exports all entries into a BigQuery table.
  • The export will run at most once a day, but large accounts may take more than a day to complete.
  • For large accounts (with more than a million profiles), users will first be written into a dated user table user_ + audience slug + _ + date as the table builds in BigQuery. As more batches of users are exported, they will be appended to the dated table. Once the complete audience has been sent to BigQuery, the table will be renamed with the date removed.
  • If Keep Updated is selected, the work will start running again on the next day at the time specified. A new temporary dated audience table will be created in BigQuery alongside the completed non-dated table.
  • If an account is large enough to take over a day to export and continuous export is selected, the work will not run again until after the previous export completes.

Fields

The fields exported to Google BigQuery will depend on the Export Non-Scalar Fields and the Export Scalar Fields options in the workflow's configuration. Any user field in your Lytics account may be available for export.

Configuration

Follow these steps to set up and configure an export job for Google Cloud in the Lytics platform. If you are new to creating jobs in Lytics, see the Jobs Dashboard documentation for more information.

  1. Select Google Cloud from the list of providers.
  2. Select the Export All Users job type from the list.
  3. Select the Authorization you would like to use or create a new one.
  4. Enter a Label to identify this job you are creating in Lytics.
  5. (Optional) Enter a Description for further context on your job.
  6. Complete the configuration steps for your job.
  7. From the BigQuery Project input, select the Google BigQuery Project you want to export data to.
  8. (Optional) From the BigQuery Dataset input, select the Google BigQuery Dataset you want to export data to.
  9. (Optional) From the Export Non-Scalar Fields input, select a list of non-scalar fields to export.
  10. (Optional) Select the Export Non-Scalars as JSON checkbox to export non-scalar fields as JSON columns (the default). Deselect it to export non-scalar fields to separate join tables.
  11. (Optional) From the Export Scalar Fields input, select a list of scalar fields to export. Note: these will be created in the same table. If none are selected, all fields are exported.
  12. (Optional) Select the Keep Updated checkbox to continuously run this export.
  13. (Optional) Select the Keep Old checkbox to keep daily exports in dated tables.
  14. (Optional) From the Time of Day input, select the time of day to start the export each day.
  15. (Optional) From the Timezone input, select the time zone for the time of day.
  16. Click the Start job button to start the job.

Google Cloud: Export System Events

Export Lytics system events to a BigQuery table to monitor jobs, authorizations, audiences, and queries in your Lytics instance.

Integration Details

  • Implementation Type: Server-side Integration.
  • Implementation Technique: REST API Integration.
  • Frequency: Batch Integration.
  • Resulting data: Lytics system events will be stored as a record in a BigQuery table.

This integration utilizes the Google BigQuery APIs to send system events. Once the export is started, the job will:

  1. Check if the given dataset exists in Google BigQuery.
  2. Check if the given table exists; if not, create one. If a table name is given, it will be used to create the table; otherwise the new table will be named in the lytics_system_event_export_{AID} format.
  3. Store the system events in a CSV file.
  4. Upload the CSV file to BigQuery. The file is uploaded for every 100,000 events or when the file size reaches 1 GB. Once the file is uploaded, the export job sleeps for 15 minutes to let BigQuery process the file.
  5. Repeat steps 1-4 until all the events are sent. Once all the events are sent, the job sleeps for 24 hours and runs again if the job is continuous.

Fields

By default, Lytics exports the following fields to Google BigQuery:

Field | Description
ID | ID of the event
Date | Date when the event occurred
AID | Account AID
User ID | User associated with the event
Subject Type | What the event is about
Subject ID | ID associated with the subject
Verb | Action described by the event
Notes | Extra information about the event
Related | Any related information associated with the event
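
Once exported, the table can be queried for monitoring. The sketch below filters recent job-related events; the column names are assumed to be snake_case versions of the fields above, and the table name follows the default naming scheme with a placeholder AID. Verify both against your exported table.

```python
# Query recent system events for a given subject type.
from google.cloud import bigquery

client = bigquery.Client()
query = """
SELECT date, verb, notes
FROM `my-project.lytics_export.lytics_system_event_export_12345`
WHERE subject_type = 'work'
ORDER BY date DESC
LIMIT 100
"""
for row in client.query(query).result():
    print(row.date, row.verb, row.notes)
```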

Configuration

Follow these steps to set up and configure an export of Lytics system events to Google BigQuery. If you are new to creating jobs in Lytics, see the Destinations documentation for more information.

  1. Select Google Cloud from the list of providers.
  2. Select the BigQuery: Export System Events job type from the list.
  3. Select the Authorization you would like to use or create a new one.
  4. Enter a Label to identify this job you are creating in Lytics.
  5. (Optional) Enter a Description for further context on your job.
  6. From the BigQuery Project input, select the Google BigQuery Project you want to export system events to.
  7. From the BigQuery Dataset input, select the Google BigQuery Dataset you want to export system events to.
  8. (Optional) From the BigQuery Table input, select the Google BigQuery table you want to export system events to. If an existing table is not selected then one will be created. If no table name is entered lytics_system_event_export_{AID} will be used.
  9. (Optional) In the Start Date text box, enter a date (e.g. 2020-01-01); only system events after this date will be exported. If no start date is given, system events from the last 30 days will be exported.
  10. (Optional) In the End Date text box, enter a date (e.g. 2020-01-01); only system events up to this date will be exported. If no end date is given, system events up to the current date will be exported.
  11. (Optional) In the Subject Type text box, enter the Subject Type of system events you would like to export (work, segment, account). Only events specific to this Subject Type will be exported.
  12. (Optional) In the Subject ID text box, enter the Subject ID for the above system event subject type (work_id, segment_id, account_id). Only events specific to this Subject ID will be exported.
  13. Select the Keep Updated checkbox to continuously run this export. The export will run daily once it sends all the data.
  14. From the Time of Day input, select the time of day to start export each day.
  15. From the Timezone input, select the timezone for the time of day.
  16. Click Start Export.