Data Export to Cloud Storage (BigQuery / Amazon S3)


Cloud export is available for Business and Enterprise price plans.

devtodev can export user and event data to cloud storage. Event data is uploaded once every hour. User data is uploaded once a day, provided we have received at least one event in the last 24 hours.

To export your data to one of the supported cloud storages, please send a request to info@devtodev.com.

Export to BigQuery

To export data to BigQuery you will need to:

  1. Create a service account, if it does not already exist.

  2. Get service account credentials.

  3. Create a dataset.

  4. Choose what kind of data you want to export to your dataset.

In the request specify the following details:

  1. Service account credentials;

  2. Name and location of the dataset in BigQuery;

  3. Export configuration (see below).

Creating a service account in BigQuery

  1. If you do not have a service account, create one by following the Google Cloud manual.

  2. Your service account must have rights to create tables and upload data. Add bigquery.user or bigquery.admin, plus one of the following roles, to your service account:

    • bigquery.dataEditor

    • bigquery.dataOwner

    • bigquery.jobs.create

    • bigquery.tables.create

  3. Create credentials for your service account, if there are none yet. Follow the Google Cloud manual on service account keys to create access keys. To create keys, add the serviceAccountKeyAdmin role to your service account.
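The export request includes the service account credentials, i.e. the JSON key file created in step 3. A minimal sanity-check sketch (the function and constant names here are illustrative, not part of any devtodev API; the field names are the standard Google Cloud key-file fields):

```python
import json

# Standard fields of a Google Cloud service-account key file.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def looks_like_service_account_key(text: str) -> bool:
    """Return True if `text` parses as a service-account key JSON."""
    try:
        key = json.loads(text)
    except ValueError:
        return False
    return key.get("type") == "service_account" and REQUIRED_FIELDS <= key.keys()
```

This only checks the file shape before you send it; it does not validate the key against Google Cloud.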

Creating a dataset

Follow the Google Cloud manual to create a dataset in BigQuery.

Name your dataset devtodev; that way we can send your data to BigQuery.

Also, while creating the dataset, keep the location in mind: you cannot change the location of a dataset later (see the BigQuery documentation on dataset locations).

Export configuration

After creating a service account and a dataset we need to configure export in devtodev.

You can choose one of two ways to export your data:

Export data to one table — all event data will be uploaded to one common table named p<project id>_events.

Export data by event type — every event type will be uploaded to its own table. The list of event types is below.

Every event type will have a table with a name of the form p<project id>_events_<event type>[_<event name>]. For example:

  • p234_rp — a table for real payment events from a project with id 234.

  • p234_ce_mission_start — a table for a custom event named "mission_start" from a project with id 234.

You can match the project name to the project id in the _projects table, which is automatically filled at the time of the first export.

Active user information will be uploaded to a separate table named p<project id>_users regardless of how you choose to upload event data.

Export to Amazon S3

To export data to Amazon S3 you will need to:

  1. Create an account, if it does not already exist.

  2. Get credentials (accessKey and secretKey).

  3. Create a bucket.

  4. Choose what kind of data you want to export to your bucket.

In the request specify the following details:

  1. Account credentials;

  2. Name and region of the bucket in Amazon S3;

  3. Export configuration (see below).

Creating an Amazon S3 account

If you do not already have an account, follow the AWS manual to create one.

Getting credentials

See the AWS documentation for more detail on how to find your credentials.

We need the accessKey and secretKey, which are located in the ~/.aws/credentials file. We will also need your region, which is located in the ~/.aws/config file. Execute the aws configure command to get the accessKey and secretKey.

Example:

$ aws configure
AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
AWS Secret Access Key [None]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
Default region name [None]: us-west-2
Default output format [None]: json
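Both ~/.aws/credentials and ~/.aws/config use INI format, so the values devtodev needs can also be read programmatically with the Python standard library. A sketch using an in-memory sample instead of the real credentials file:

```python
import configparser

# Sample in the same INI layout as ~/.aws/credentials.
sample = """\
[default]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
"""

parser = configparser.ConfigParser()
parser.read_string(sample)  # for the real file: parser.read(path)
access_key = parser["default"]["aws_access_key_id"]
secret_key = parser["default"]["aws_secret_access_key"]
```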

Creating a bucket

Follow the AWS manual to create a bucket in S3.

The name of the bucket must be unique (see the AWS documentation on bucket naming). Also, while creating the bucket, keep the region in mind: objects never leave their region unless they are explicitly transferred (see the AWS documentation on regions).

Export configuration

After creating an account and a bucket we need to configure export in devtodev.

Your data will be stored in a bucket directory named p<project id>, which will contain .csv files compressed with gzip. Each directory will have a project_info.txt file with the project name and application id in the devtodev service.
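Since each exported object is a gzip-compressed .csv file, a downloaded file can be read with the Python standard library alone. A sketch that round-trips a tiny in-memory sample (the column names are taken from the common basic fields listed later on this page; the values are made up):

```python
import csv
import gzip
import io

# A tiny sample in the exported format: CSV text, gzip-compressed.
raw = "devtodev_id,main_id,uc_country,uc_payment_cnt\n1001,user-42,US,3\n"
blob = gzip.compress(raw.encode("utf-8"))  # what a downloaded file holds

# Decompress and parse each row into a dict keyed by column name.
text = gzip.decompress(blob).decode("utf-8")
rows = list(csv.DictReader(io.StringIO(text)))
```

For a real file, `gzip.open(path, "rt")` can be passed directly to `csv.DictReader`.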

You can choose one of two ways to export your data:

Export data to one table — all event data will be uploaded to one common table.

Example of such a table: 2021_05_26_08_00_54common86ddf8a5-1e7f-4f2c-a4d3-22f4d6a8860c

Export data by event type — every event type will be uploaded to its own table. The list of event types is below.

Some examples:

  • 2021_05_26_08_08_28ce[editor_item_remove]a9413576-0a32-4a2a-ad84-940150e9a218 — a table for a custom event named "editor_item_remove".

  • 2021_05_26_08_08_11rp556dbd8d-71c9-41b4-9564-d43b39ca1b7d — a table for real payment events.

Active user information will be uploaded to a separate users table regardless of how you choose to upload event data.

Example of such a table: 2021_05_26_08_07_52users439c129f-d70b-4f98-ad86-4cb01054732b
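The object names above follow the shape <timestamp><event code>[<event name>]<uuid>, where the bracketed event name appears only for custom events and the "common"/"users" markers take the place of the event code. A hypothetical helper (not a devtodev API; a sketch under that naming assumption) that splits such a name into its parts:

```python
import re

EXPORT_NAME = re.compile(
    r"^(?P<ts>\d{4}_\d{2}_\d{2}_\d{2}_\d{2}_\d{2})"   # e.g. 2021_05_26_08_08_28
    r"(?P<code>[a-z]+)"                                # event code: ce, rp, users, ...
    r"(?:\[(?P<event>[^\]]+)\])?"                      # custom event name, optional
    r"(?P<uuid>[0-9a-f]{8}(?:-[0-9a-f]{4}){3}-[0-9a-f]{12})$"
)

def parse_export_name(name: str) -> dict:
    """Split an exported object name into timestamp, event code, event name, uuid."""
    match = EXPORT_NAME.match(name)
    if match is None:
        raise ValueError(f"unrecognized export name: {name!r}")
    return match.groupdict()
```

Note the regex relies on backtracking to separate the event code from a uuid that starts with hex letters; it handles the documented examples but is not guaranteed for every conceivable name.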

List of event types

During export configuration you can select what types of events you want to export. You can also select which projects should be exported and which should not.

The list below contains event types (with fields) available for export.

Common basic fields for all event types

devtodev_id — numeric user id
main_id — string user id
crossplatform_id — cross-platform user id, only for projects with user id identification set up
uc_createtime — user registration date
uc_first_paymenttime — first payment date
uc_last_paymenttime — last payment date
uc_payment_cnt — number of payments
uc_payment_sum — sum of payments
uc_level — user level
uc_country — country
uc_language — language
uc_age — age
uc_gender — gender
uc_cheater — cheater mark
uc_tester — tester mark

Event types

| Event name | Event code | Additional fields |
| --- | --- | --- |
| EventTrackingStatus | ts | allow_tracking (is tracking allowed) |
| EventUserInfo | ui | language (device locale), custom_udid (custom user id) |
| EventDeviceInfo | di | device_version, device_os, display_resolution, display_dpi, androidid, idfa, idfv, advertisingid, serialid, manufacturer, model, device_model, offset (user timezone offset) |
| EventDeviceInfoV2 | di | device_version, device_os, display_resolution, display_dpi, display_diagonal, manufacturer, model, offset (user timezone offset), androidid, openudid, idfa, idfv, advertisingid, serialid, install_source, user_agent |
| EventRealPaymentEntry | rp | currency, product, payment_id, price_usd, payment_status |
| EventGamePurchase | ip | amount, item_type, item, inapp_currencies (structure with info on the currency type and the amount spent on the item purchase) |
| EventCustomColumnar | ce | event_name, event_params (structure with parameter names and values) |
| EventProgression | pe | location, spent, earned, source, difficulty, success, duration |
| EventTester | tstr | tester |
| EventCheater | ch | cheater |
| EventRegistrations | rg | this event only has basic fields |
| EventGender | gr | gender |
| EventAge | ag | age |
| EventGameSessionStart | ss | amount |
| EventUserEngagement | ue | duration |
| EventPeople | pl | name, email, phone, photo, event_params (other custom fields) |
| EventSocialNetworkPost | sp | network, reason |
| EventSocialNetworkConnect | sc | network |
| EventTutorial | tr | step |
| EventLevelUp | lu | local_duration, absolut_duration, spent, earned, balance, bought |
| EventApplicationInfo | ai | sdk_version, app_version, bundle_id, engine |
| EventWipe | wipe | save_cheater_tester, save_custom_props, save_paying_status, save_registration |
| EventAlive | al | this event only has basic fields |
| EventReferal | rf | publisher, sub_publisher, sub_ad, sub_ad_group, sub_campaign, sub_placement, sub_site, cost |
| EventSubscription | sbs | source, payment_type, start_time, expiry_time, event_type, price_usd, product, original_payment_id, payment_id, purchase_type, promo_code, promo_type, payment_status, eventlevel |
| EventAdRevenue | adrv | source, ad_unit, ad_network, placement, revenue |

