Data Export to Cloud Storage (BigQuery / Amazon S3)

devtodev has an option to export user and event data to cloud storage. Event data is uploaded once every hour. User data is uploaded once a day, provided we have received at least one event in the last 24 hours.

To export your data to one of the supported cloud storage services, please send a request to info@devtodev.com.

Export to BigQuery

To export data to BigQuery you will need to:

  1. Create a service account, if it does not already exist.

  2. Get service account credentials.

  3. Create a dataset.

  4. Choose what kind of data you want to export to your dataset.

In the request specify the following details:

  1. Service account credentials;

  2. Name and location of the dataset in BigQuery;

  3. Export configuration (see below).

Creating a service account in BigQuery

  1. If you do not have a service account, create one by following the Google Cloud manual.

  2. Your service account must have permissions to create tables and upload data. Add bigquery.user or bigquery.admin and one of the following roles or permissions to your service account:

    • bigquery.dataEditor

    • bigquery.dataOwner

    • bigquery.jobs.create

    • bigquery.tables.create

  3. Create credentials for your service account, if there are none yet. Follow this manual to create access keys. To create keys, add the serviceAccountKeyAdmin role to your service account.
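
For reference, here is a minimal command-line sketch of these steps using the gcloud CLI. The project id my-gcp-project, the service account name devtodev-export, and the key file name are placeholders, not values required by devtodev:

# 1. Create the service account (skip if you already have one)
$ gcloud iam service-accounts create devtodev-export \
      --project=my-gcp-project

# 2. Grant the required roles (here: bigquery.user and bigquery.dataEditor)
$ gcloud projects add-iam-policy-binding my-gcp-project \
      --member="serviceAccount:devtodev-export@my-gcp-project.iam.gserviceaccount.com" \
      --role="roles/bigquery.user"
$ gcloud projects add-iam-policy-binding my-gcp-project \
      --member="serviceAccount:devtodev-export@my-gcp-project.iam.gserviceaccount.com" \
      --role="roles/bigquery.dataEditor"

# 3. Create a JSON key; this file holds the credentials to include in the request
$ gcloud iam service-accounts keys create devtodev-key.json \
      --iam-account=devtodev-export@my-gcp-project.iam.gserviceaccount.com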

Creating a dataset

Follow this manual to create a dataset in BigQuery.

Name your dataset devtodev so that we can send your data to BigQuery.

Also, while creating a dataset, keep the location in mind.

You cannot change the location of the dataset later! More on locations in BigQuery.
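
If you prefer the command line, the same dataset can be created with the bq tool. This is only a sketch; my-gcp-project is a placeholder, and the EU location is an example (choose yours deliberately, since it cannot be changed later):

# Create a dataset named devtodev in the EU location
$ bq mk --dataset --location=EU my-gcp-project:devtodev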

Export configuration

After creating a service account and a dataset, we need to configure the export in devtodev.

You can choose one of two ways to export your data:

Export data to one table — all event data will be uploaded to one common table named p<project id>_events.

Export data by event type — each event type will be uploaded to its own table. The list of event types is below.

Every event type will have a table with a name of the form p<project id>_<event type>[_<event name>]. For example:

  • p234_rp — this is a table for real payment events from a project with id 234.

  • p234_ce_mission_start — this is a table for a custom event named “mission_start” from a project with id 234.

You can match project names to project ids in the _projects table, which is filled automatically at the time of the first export.

Active user information will be uploaded to a separate table named p<project id>_users regardless of how you choose to upload event data.
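
After the first export has run, you can verify that mapping with a query like the following (a sketch; my-gcp-project is a placeholder for your GCP project id):

# List the id-to-name mapping of exported projects
$ bq query --use_legacy_sql=false \
      'SELECT * FROM `my-gcp-project.devtodev._projects`'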

Export to Amazon S3

To export data to Amazon S3 you will need to:

  1. Create an account, if it does not already exist.

  2. Get credentials (accessKey and secretKey).

  3. Create a bucket.

  4. Choose what kind of data you want to export to your bucket.

In the request specify the following details:

  1. Account credentials;

  2. Name and region of the bucket in Amazon S3;

  3. Export configuration (see below).

Creating an Amazon S3 account

If you do not already have an account, follow this AWS manual to create one.

Getting credentials

See this manual for more detail on how to find your credentials.

We need the accessKey and secretKey, which are located in the ~/.aws/credentials file. We also need your region, which is located in the ~/.aws/config file. Both files are populated when you run the aws configure command in the AWS CLI.

Example:

$ aws configure
AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
AWS Secret Access Key [None]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
Default region name [None]: us-west-2
Default output format [None]: json

Creating a bucket

Follow this manual to create a bucket in S3.

The name of the bucket must be globally unique; see more on bucket naming. Also, while creating a bucket, keep the region in mind.

Objects can never leave the region unless they are explicitly transferred! More on AWS regions.
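
As a sketch, the bucket can also be created from the AWS CLI; the bucket name my-devtodev-export is a placeholder (bucket names are globally unique), and us-west-2 is an example region:

# Create a bucket in the us-west-2 region
$ aws s3 mb s3://my-devtodev-export --region us-west-2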

Export configuration

After creating an account and a bucket, we need to configure the export in devtodev.

Your data will be stored in a bucket directory named p<project id>, which contains .csv files compressed with gzip. Each directory will also have a project_info.txt file with the project name and application id in the devtodev service.
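
For example, for a hypothetical project with id 234 in a bucket named my-devtodev-export, the exported layout can be inspected like this:

# List the export directory for project 234
$ aws s3 ls s3://my-devtodev-export/p234/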

You can choose one of two ways to export your data:

Export data to one table — all event data will be uploaded to one common table.

Example of such table: 2021_05_26_08_00_54common86ddf8a5-1e7f-4f2c-a4d3-22f4d6a8860c

Export data by event type — each event type will be uploaded to its own table. The list of event types is below.

Some examples:

  • 2021_05_26_08_08_28ce[editor_item_remove]a9413576-0a32-4a2a-ad84-940150e9a218 — this is a table for a custom event named “editor_item_remove”.

  • 2021_05_26_08_08_11rp556dbd8d-71c9-41b4-9564-d43b39ca1b7d — this is a table for real payment events.

Active user information will be uploaded to a separate users table regardless of how you choose to upload event data.

Example of such table: 2021_05_26_08_07_52users439c129f-d70b-4f98-ad86-4cb01054732b
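
To inspect one of these files locally, you can stream and decompress it with the AWS CLI. The bucket name is a placeholder, and the .csv.gz extension is an assumption based on the gzip-compressed .csv format described above:

# Stream a gzipped CSV from the bucket and preview the first rows
$ aws s3 cp s3://my-devtodev-export/p234/2021_05_26_08_07_52users439c129f-d70b-4f98-ad86-4cb01054732b.csv.gz - \
      | gunzip | head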

List of event types

During export configuration you can select which types of events you want to export. You can also select which projects should be exported and which should not.

The list below contains event types (with fields) available for export.

Common basic fields for all event types

devtodev_id — numeric user id
main_id — string user id
crossplatform_id — cross-platform user id, only for projects with user id identification set up
uc_createtime — user registration date
uc_first_paymenttime — first payment date
uc_last_paymenttime — last payment date
uc_payment_cnt — number of payments
uc_payment_sum — sum of payments
uc_level — user level
uc_country — country
uc_language — language
uc_age — age
uc_gender — gender
uc_cheater — cheater mark
uc_tester — tester mark
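
As an illustration of how these fields can be used, here is a sketch of a BigQuery query that counts real payment events by country; it assumes export by event type, a hypothetical project id 234, and the placeholder GCP project my-gcp-project:

# Count real payment events by country for project 234
$ bq query --use_legacy_sql=false \
      'SELECT uc_country, COUNT(*) AS payments
       FROM `my-gcp-project.devtodev.p234_rp`
       GROUP BY uc_country
       ORDER BY payments DESC'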

Event types
