Import historical data via API

Historical data import overview
If you want to migrate to devtodev from another analytical system, move your data from your company's data collection mechanism, or use the data stored on your servers, feel free to use our Import Data Wizard.
We recommend starting the import of your historical data no later than one month after connecting your project to devtodev. Ideally, the end of the imported data interval should be as close as possible to the date you start importing the historical data into devtodev.
You can import any user or event data allowed by the devtodev data API. The most frequent tasks are importing an existing user database (with registration dates, player character levels, and other characteristics), payment data, and user sessions. You can also send other events, but it is not practical to cover a period longer than 90 days before the import start date.
The main condition for a successful historical data import is that the IDs currently sent for user/device identification by the devtodev SDK built into your project match the IDs in your possession, i.e. the IDs to which you can link the imported historical data.
The best option for importing historical data is to set up custom user ID tracking in the devtodev system. A custom user ID in devtodev is an ID assigned by the developer (see the setUserId method in the devtodev SDK integration documentation for the corresponding platform). This is usually the number of the user's record in your database, or a third-party ID by which you authorize and identify the user.
Attention! By default, devtodev uses device ID for identification. Switching the project to identification by user ID can be done by contacting your account manager or by writing a request to our technical support. Switching to identification by user ID is irreversible!
After the date of switching the project to identification by user ID, it is advisable to wait 7 days before the start of importing historical data.
There is one nuance to importing historical data: during data merging, data obtained from third parties will be lost (statistics from the markets, traffic source data from advertising trackers, and revenue data from advertising networks). If you have such data, contact your account manager after completing the import and they will try to reload the data for the required period.
To start the historical data import process, go to the settings of the project into which you want to load historical data and select Import Historical Data.

The process of loading historical data consists of several stages:

  1. Preparation stage
    Click the start button on the Historical data page. A temporary project will be created in devtodev (you can see it in the list of projects). It will have the same name as the original project, with the TMP suffix added. You will need to upload your historical data to this temporary project. For the upload, use the API key shown on the Import Historical data page. Check out the devtodev data API documentation. Prepare a script that will send the project's historical data to devtodev. It is extremely important that events are sent in chronological order, at least for each user individually (in a JSON file, the events should be ordered from the oldest at the beginning of the file to the newest at the end). Also, unlike data sent in real time via the API, each parcel of historical data must contain a line with a special flag:
    "options": {"use_specified_time": true}
  2. Data loading stage
    After you have prepared the data and are ready to start uploading it to devtodev, click the Start loading data button. At this moment, our server switches to historical data receiving mode. Load the prepared data. If there is a lot of data, the loading process can take up to several days; aim to keep it within two weeks.
  3. Processing the uploaded data
    After your script has finished uploading data to devtodev, click the Upload Finished button on the data upload step. After that, we start transferring the uploaded data to our database and calculating metrics for this period. The calculation can take up to several hours. When data processing finishes, the historical data loading interface automatically proceeds to the next step, data verification. We will also send you a bell notification once processing is completed.
  4. Reviewing the loaded data
    This is an extremely important step in the data loading process: it is here that you find out whether you did everything right and are satisfied with the result, or whether something went wrong and you need to make changes and try importing again. Open the devtodev interface and go through all the temporary project reports that can be built from the data you imported. Ideally, compare the metrics aggregated by devtodev after the import with the metrics aggregated by the analytical system you are migrating from. If you see incorrect data in the reports (data that does not match your previous analytical system's reports), try to find out what caused the problem, i.e. which data could have been loaded incorrectly. Contact devtodev support if you cannot determine the source of the problem. To re-upload the historical data, click the Clear uploaded data button. The data will be deleted and you can try again. If you don't want to make another attempt, click the Cancel process button. If devtodev shows the data you expected to see, hooray! You have succeeded and can complete the migration process.
Attention! If you agree with the result and complete the process of importing historical data (click the Verified button), then re-export or adding another chunk of historical data will be impossible. This action is irreversible!
  5. Historical data is loaded
    Well done, not everyone reaches this stage! Your temporary project has ceased to exist. From now on, only the project with the loaded historical data is available to you.

Learn more about the specifics of loading historical data using the API

Unlike data sent in real time via the API, each parcel of historical data in the JSON file you send to devtodev must contain a line with a special flag:
"options": {"use_specified_time": true }
You can group the events either as a single historical stream or by sending all events for each user individually. The main thing is that events must be ordered by date from the oldest to the newest, both within each individual parcel and across the entire data loading process.
As the first events, we recommend sending data about the user/device and the application, dated with the user's registration date. After that, you can send any other events.
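The recommendation above can be sketched as a small helper that builds the first parcel for one user. This is a minimal sketch: `build_first_parcel`, the field values, and the event tuples are illustrative placeholders; only the record keys (`ai`, `ui`, `di`, etc.) and the `use_specified_time` flag come from this document.

```python
def build_first_parcel(user_id, registration_ts, events):
    """events: list of (record_key, payload_dict) tuples gathered from your own database."""
    # Later events are sorted by timestamp, oldest first, as the import requires.
    ordered = sorted(events, key=lambda e: e[1].get("timestamp", registration_ts))
    user_block = {
        # Application, user, and device records dated with the registration timestamp.
        "ai": [{"timestamp": registration_ts, "appVersion": "1.0", "bundleId": ""}],
        "ui": [{"country": "GB", "language": "en", "crossUid": user_id}],
        "di": [{"timestamp": registration_ts, "deviceModel": "iOS"}],
    }
    for key, payload in ordered:
        user_block.setdefault(key, []).append(payload)
    return {
        "options": {"use_specified_time": True},  # required flag for historical data
        user_id: user_block,
    }

parcel = build_first_parcel(
    "user-42", 1386259227,
    [("lu", {"level": 2, "timestamp": 1386259338}),
     ("gs", {"timestamp": 1386259227, "length": 600, "level": 1})],
)
```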
This is an example of the data being sent:
{
  "options": {
    "use_specified_time": true
  },
  "put_user_custom_id_here": {
    "ai": [{
      "timestamp": 1386259227,
      "appVersion": "1.2",
      "bundleId": ""
    }],
    "ui": [{
      "country": "GB",
      "language": "en",
      "crossUid": "customuserid",
      "ip": "",
      "userAgent": "a lot of info"
    }],
    "di": [{
      "timestamp": 1386259227,
      "manufacturer": "Apple",
      "model": "iPhone 4,1",
      "screenResolution": "1024x768",
      "idfv": "87ASD-9A7SD-AD2G-Q26EO-AS7D",
      "deviceVersion": "11",
      "deviceModel": "iOS"
    }],
    "gs": [{
      "timestamp": 1386259227,
      "length": 600,
      "level": 1
    }],
    "tr": [{
      "step": -1,
      "timestamp": 1386259234,
      "level": 1
    }, {
      "step": -2,
      "timestamp": 1386259380,
      "level": 3
    }],
    "lu": [{
      "level": 2,
      "timestamp": 1386259338
    }, {
      "level": 3,
      "timestamp": 1386259365
    }]
  }
}
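Since ordering mistakes are the most common way to end up re-uploading everything, it can pay to self-check each parcel before sending it. The sketch below (a hypothetical helper, not part of the devtodev API) only verifies that timestamps inside each event array never decrease; the parcel layout follows the example above.

```python
def check_chronology(parcel: dict) -> bool:
    """Return True if every timestamped event array in the parcel is ordered oldest-first."""
    for key, block in parcel.items():
        if key == "options":
            continue  # the flag object carries no events
        for events in block.values():
            stamps = [e["timestamp"] for e in events if "timestamp" in e]
            if stamps != sorted(stamps):
                return False
    return True
```

Running this over every parcel before the Start loading data step is cheap compared to clearing the uploaded data and starting over.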