Data Export to Cloud Storage (BigQuery / Amazon S3)
Cloud export is available for Business and Enterprise price plans.
devtodev can export user and event data to cloud storage. Event data is uploaded once every hour. User data is uploaded once a day, provided at least one event has been received within the last 24 hours.
To export your data to one of the supported cloud storage services, please send a request to [email protected].
Export to BigQuery
To export data to BigQuery you will need to:
Create a service account, if it does not already exist.
Get service account credentials.
Create a dataset.
Choose what kind of data you want to export to your dataset.
In the request specify the following details:
Service account credentials;
Name and location of the dataset in BigQuery;
Export configuration (see below).
Creating a service account in BigQuery
If you do not have a service account, create one by following the Google Cloud manual.
Your service account must have rights to create tables and upload data. Add the bigquery.user or bigquery.admin role to your service account, and make sure the role includes these permissions:
bigquery.jobs.create
bigquery.tables.create
Create credentials for your service account if there are none yet. Follow this manual to create access keys. To create keys, add the serviceAccountKeyAdmin role to your service account.
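If you prefer the command line, the same setup can be sketched with the gcloud CLI. The project id my-gcp-project and the service account name devtodev-export below are placeholders, so substitute your own values:
# create the service account (the name is a placeholder)
gcloud iam service-accounts create devtodev-export --display-name="devtodev export"
# grant the BigQuery role on your project
gcloud projects add-iam-policy-binding my-gcp-project --member="serviceAccount:devtodev-export@my-gcp-project.iam.gserviceaccount.com" --role="roles/bigquery.user"
# create a JSON key file with the credentials to include in the request
gcloud iam service-accounts keys create devtodev-key.json --iam-account=devtodev-export@my-gcp-project.iam.gserviceaccount.com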
Creating a dataset
Follow this manual to create a dataset in BigQuery.
Name your dataset devtodev so that we can send your data to BigQuery.
Also, while creating the dataset, keep its location in mind.
You cannot change the location of the dataset later! More on locations in BigQuery.
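As an alternative sketch, the dataset can also be created with the bq command-line tool; US below is only an example location, pick the one you need:
# create the devtodev dataset in the chosen location
bq --location=US mk --dataset devtodev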
Export configuration
After creating a service account and a dataset, we need to configure the export in devtodev.
You can choose one of two ways to export your data:
Export data to one table — all event data will be uploaded to one common table named p<project id>_events.
Export data by event type — each event type will be uploaded to its own table. The list of event types is below.
Each event type will have a table with a name like p<project id>_<event type>[_<event name>]. For example:
p234_rp — this is a table for real payment events from a project with id 234.
p234_ce_mission_start — this is a table for a custom event named "mission_start" from a project with id 234.
You can match project names to project ids using the _projects table, which is filled automatically at the time of the first export.
Active user information will be uploaded to a separate table named p<project id>_users regardless of how you choose to upload event data.
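Once the first export has run, you can sanity-check the data with a couple of simple queries. The dataset name devtodev comes from the step above; the table names are illustrative and use project id 234 from the examples (p234_events exists only if you chose the single-table option):
# list exported projects and their ids
bq query --use_legacy_sql=false 'SELECT * FROM devtodev._projects'
# preview event data exported to the common table
bq query --use_legacy_sql=false 'SELECT * FROM devtodev.p234_events LIMIT 10'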
Export to Amazon S3
To export data to Amazon S3 you will need to:
Create an account, if it does not already exist.
Get credentials (accessKey and secretKey).
Create a bucket.
Choose what kind of data you want to export to your bucket.
In the request specify the following details:
Account credentials;
Name and region of the bucket in Amazon S3;
Export configuration (see below).
Creating an Amazon S3 account
If you do not already have an account, follow this AWS manual to create one.
Getting credentials
See this manual for more detail on how to find your credentials.
We need the accessKey and secretKey, which are stored in the ~/.aws/credentials file. We also need your region, which is stored in the ~/.aws/config file. You can create both files by running the aws configure command from the AWS CLI.
Example:
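Both files use the standard AWS CLI format. The values below are placeholders (the access keys are the sample keys from the AWS documentation), so yours will differ:
~/.aws/credentials
[default]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
~/.aws/config
[default]
region = eu-west-1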
Creating a bucket
Follow this manual to create a bucket in S3.
The name of the bucket must be unique; see more on bucket naming. Also, while creating the bucket, keep its region in mind.
Objects can never leave the region unless they are explicitly transferred! More on AWS regions.
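For reference, a bucket can also be created from the AWS CLI; the bucket name and region below are placeholders:
aws s3 mb s3://your-devtodev-export --region eu-west-1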
Export configuration
After creating an account and a bucket, we need to configure the export in devtodev.
Your data will be stored in a bucket directory named p<project id>, which contains .csv files compressed with gzip. Each directory also contains a project_info.txt file with the project name and application id in the devtodev service.
You can choose one of two ways to export your data:
Export data to one table — all event data will be uploaded to one common table.
Example of such a table: 2021_05_26_08_00_54common86ddf8a5-1e7f-4f2c-a4d3-22f4d6a8860c
Export data by event type — each event type will be uploaded to its own table. The list of event types is below.
Some examples:
2021_05_26_08_08_28ce[editor_item_remove]a9413576-0a32-4a2a-ad84-940150e9a218 — this is a table for a custom event named "editor_item_remove".
2021_05_26_08_08_11rp556dbd8d-71c9-41b4-9564-d43b39ca1b7d — this is a table for real payment events.
Active user information will be uploaded to a separate users table regardless of how you choose to upload event data.
Example of such a table: 2021_05_26_08_07_52users439c129f-d70b-4f98-ad86-4cb01054732b
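After the first export, you can list and download the files with the AWS CLI. The bucket name is a placeholder and project id 234 is taken from the examples above:
# list exported files for the project
aws s3 ls s3://your-devtodev-export/p234/
# download them locally; the .csv files are gzip-compressed
aws s3 cp s3://your-devtodev-export/p234/ ./p234/ --recursive
# unpack, assuming the exported files carry a .gz suffix
gunzip ./p234/*.gz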
List of event types
In the export configuration, you can select which event types you want to export. You can also select which projects to export.
The list below contains event types (with fields) available for export.
Common basic fields for all event types
Event types
EventTrackingStatus
ts
allow_tracking — is tracking allowed
EventUserInfo
ui
language — device locale
custom_udid — custom user id
EventDeviceInfo
di
device_version
device_os
display_resolution
display_dpi
androidid
idfa
idfv
advertisingid
serialid
manufacturer
model
device_model
offset — user timezone offset
EventDeviceInfoV2
di
device_version
device_os
display_resolution
display_dpi
display_diagonal
manufacturer
model
offset — user timezone offset
androidid
openudid
idfa
idfv
advertisingid
serialid
install_source
user_agent
EventRealPaymentEntry
rp
currency
product
payment_id
price_usd
payment_status
EventGamePurchase
ip
amount
item_type
item
inapp_currencies — structure with info on currency type and its amount spent on item purchase
EventCustomColumnar
ce
event_name
event_params — structure with parameter names and values
EventProgression
pe
location
spent
earned
source
difficulty
success
duration
EventTester
tstr
tester
EventCheater
ch
cheater
EventRegistrations
rg
this event only has basic fields
EventGameSessionStart
ss
amount
EventUserEngagement
ue
duration
EventPeople
pl
event_params — custom user property fields
EventSocialNetworkPost
sp
network
reason
EventSocialNetworkConnect
sc
network
EventTutorial
tr
step
EventLevelUp
lu
local_duration
absolut_duration
spent
earned
balance
bought
EventApplicationInfo
ai
sdk_version
app_version
bundle_id
engine
EventWipe
wipe
save_cheater_tester
save_custom_props
save_paying_status
save_registration
EventAlive
al
this event only has basic fields
EventReferal
rf
publisher
sub_publisher
sub_ad
sub_ad_group
sub_campaign
sub_placement
sub_site
cost
EventSubscription
sbs
source
payment_type
start_time
expiry_time
event_type
price_usd
product
original_payment_id
payment_id
purchase_type
promo_code
promo_type
payment_status
eventlevel
EventAdRevenue
adrv
source
ad_unit
ad_network
placement
revenue
User data
User data is uploaded once a day, provided at least one event has been received within the last 24 hours.
devtodev_id — numeric user id in the devtodev project
uc_user_id — main user identifier (device advertising id by default)
uc_custom_uid — custom user identifier set by the developer (crossplatform_id)
uc_platform_key — only for cross-platform projects; platform identifier. Stores a separate value for each platform where the user registered, plus "general" for the General section of the User card
uc_idfv
uc_idfa
uc_android_id
uc_advertising_id
uc_offset — user timezone offset
uc_publisher
uc_device — device model
uc_createtime — user registration date (first app launch)
uc_lastseen
uc_first_paymenttime
uc_last_paymenttime
uc_payment_sum — sum of payments
uc_payment_cnt — number of payments
uc_level — user level stored in the User Cards
uc_app_version — current app version
uc_sdk_version — current SDK version
uc_os_version — current OS version
uc_country
uc_language
uc_cheater — cheater mark
uc_tester — tester mark
uc_custom_props — list of Custom User Properties
uc_sub_publisher
uc_sub_campaign
uc_sub_keyword
uc_sub_placement
uc_sub_site
uc_sub_ad_group
uc_sub_ad
ad_tracker_id — additional identifier for acquisition
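For example, in BigQuery the user table for project 234 from the examples above would be named devtodev.p234_users, and the fields listed here can be queried as columns. The query below is only a sketch under that assumption:
bq query --use_legacy_sql=false 'SELECT devtodev_id, uc_user_id, uc_level FROM devtodev.p234_users LIMIT 10'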
