CRA IoT Cloud

Export and store LoRa Messages

Video tutorial

If you prefer a more detailed step-by-step guide, please follow the instructions below.

Create Data Source

After logging in to the Dataddo platform, your first step is to create a new Data Source. Click the Sources link in the main navigation to open the sources overview.

If you are creating your very first Data Source, a popup modal window will appear automatically. If you already have some Data Sources, click the New source button in the top right corner:

new_source

In both cases you will see a modal window with all available Data Sources. Use the search field to filter for CRA IoT Cloud:

source_selection

Authorize your account

When creating your first CRA IoT Cloud Data Source, you have to authorize your CRA IoT Cloud account with Dataddo. It is as simple as clicking the Authorize button:

authorize_account

In the following form, fill in all fields and click Connect an account. The username and password are the same credentials you use to log in to the CRA IoT Cloud Portal:

credentials

Data Source template

If the authorization process is successful, you will be automatically redirected to the CRA IoT Cloud template form. Some of the required fields are pre-filled, but others, such as TENANT ID and ACCOUNT, require your attention.

Related connectors

In this section you can choose from several source templates, each focused on a different type of data available from the CRA IoT Cloud Portal:

related_connectors

Label

The name of your Data Source is pre-filled, but you can rename it as you wish. We recommend giving each Source a unique name; it helps you keep your Sources organized across the Dataddo platform.

It is convenient to use the deviceId (devEUI) or deviceName in the Label:

label
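For illustration, a Label might combine the connector name with the device identifiers. The sketch below is purely illustrative; the device name and DevEUI are hypothetical example values, not taken from this tutorial:

    # Illustrative only: composing a Source label from device identifiers.
    # device_name and dev_eui are hypothetical example values.
    device_name = "warehouse-temp-01"
    dev_eui = "0018B20000001A2B"  # DevEUI: 64-bit device identifier, hex-encoded
    label = f"CRA IoT Cloud - {device_name} ({dev_eui})"
    print(label)  # CRA IoT Cloud - warehouse-temp-01 (0018B20000001A2B)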

Parameters

The TENANT ID field is required, as it is the main parameter used to extract data from CRA IoT Cloud. You can find the TENANT ID after logging in to the CRA IoT Cloud Portal and selecting the Account you want to extract data from:

tenantid

The Date Range parameter allows you to extract historical data for a selected period:

parameters
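To make the Date Range concrete, here is a minimal sketch of the interval a "last 7 days" selection would cover. The 7-day window is an arbitrary example, not a Dataddo default:

    # Illustrative only: computing a "last 7 days" historical window.
    from datetime import date, timedelta

    end = date.today()
    start = end - timedelta(days=7)
    print(f"Extracting history from {start.isoformat()} to {end.isoformat()}")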

Authorized Account

Select the authorized account related to CRA IoT Cloud:

authorized_account

Data

Select the data fields you want to include in your dataset:

data

Automatic data synchronization

The last step is setting up automatic data synchronization. We highly recommend enabling it, as it is required for extracting data on a regular basis; without automatic synchronization, the Dataddo platform will not extract new data automatically:

synchronization
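As an illustration of what a synchronization schedule implies, the sketch below computes the next run of a hypothetical daily 06:00 synchronization. Dataddo configures the schedule in its UI; the time here is an assumed example only:

    # Illustrative only: when would a hypothetical daily 06:00 sync run next?
    from datetime import datetime, time, timedelta

    now = datetime.now()
    todays_run = datetime.combine(now.date(), time(6, 0))
    next_run = todays_run if now < todays_run else todays_run + timedelta(days=1)
    print("Next synchronization:", next_run)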

When you finish the template form, click the Connect button and you will see a preview of the extracted data types:

preview_data

After clicking Save source, you will be redirected to the Data Flow.

Create Data Flow

The Data Flow provides a preview of the data that was just extracted from the CRA IoT Cloud Portal. At this point, the data is ready to be stored in your preferred Destination.

The main purpose of a Data Flow is to integrate a Source with a Destination:

flow

Your next step is to select the Destination where you want to store your data. Clicking Add Destination opens a popup modal window where you can create a new Destination or select an existing one.

If you prefer to use a Business Intelligence and dashboarding app, you can select from a list of services such as Google Data Studio, Tableau, Klipfolio, etc.

select_destination

Data Flow with Business Intelligence and Dashboarding Application

If you prefer to work with your data in a BI application, select the desired service from the list. In our example we will choose Google Data Studio. After selecting the Destination, you will see the Data Flow as follows:

data_to_gds

Since we selected a service (a BI application), no further settings are required, and we can save our Data Flow by clicking the Save Data Flow button in the popup modal window. Don't forget to give your Data Flow a proper name:

save_data_flow

You will be provided with extra information to finish the integration in Google Data Studio. In this case, click the Dataddo Google Data Studio connector link:

gds_connector

Google Data Studio expects an API KEY and an API ENDPOINT ID, both provided by the Dataddo platform. Simply copy them and click Connect.

gds_apikeys
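If you want to sanity-check the credentials before connecting, a hypothetical sketch follows. The endpoint URL format and the authorization header below are placeholders, not Dataddo's documented API; substitute the real values shown in the Dataddo UI:

    # Hypothetical sketch: probing a data endpoint with an API key.
    # The URL and header are placeholders, NOT Dataddo's documented API.
    import requests

    API_KEY = "<your API KEY>"
    API_ENDPOINT_ID = "<your API ENDPOINT ID>"
    url = f"https://dataddo.example.com/data/{API_ENDPOINT_ID}"  # placeholder URL

    resp = requests.get(url, headers={"Authorization": f"Bearer {API_KEY}"}, timeout=30)
    print(resp.status_code)  # 200 suggests the key and endpoint ID are valid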

The newly created Data Flow is available in the Data Flow overview:

lora_to_gds

Data Flow with Storage

If you prefer to store your data in Google Sheets, Google BigQuery, FTP, or a database, click Add New Storage and select a new destination from the list. In our case we will select Google Sheets:

gs_destination

Storage authorization

Your first step in storing your data in Google Sheets is to authorize Dataddo to work with your Google account. By clicking Authorize Google Sheets...

gs_authorize

...you will trigger a simple authorization process. Allow communication between Dataddo and your Google Sheets by clicking the Allow button:

gs_authorization

When the authorization process is complete, you will be automatically redirected back to the Dataddo platform, to the Google Sheets Destination creation step. As you can see, your Google Sheets account is already authorized and created, and you can select it from the select box:

my_google_sheets

By clicking Connect Dataddo to Google Sheets, you will create your Destination, and it will appear in your Destination overview:

gs_destination_created

Click your newly created Google Sheets Destination and it will be added to the Data Flow:

gs_in_dataflow

A popup modal will instantly appear, indicating that your Data Flow is complete and ready to be saved. Don't forget to give your Data Flow a proper name.

Please pay extra attention to the automation settings. This automation controls when your data is stored into your preferred Destination.

save_data_flow_2

save_automation

The final step is to click Save Data Flow.

Congratulations

You will see a confirmation window stating that your Data Flow has been created. In our case (Google Sheets), no extra settings are required.

For some other Destinations, such as Google BigQuery, you will be provided with extra instructions to follow to finish the storing process.

gs_configured

The newly created Data Flow is available in the Data Flow overview:

destinations_overview

Extra information

The Data Flow will remain in the CONNECTING status until the first scheduled storing process is triggered. Remember the Data Flow automation? It schedules this first run of the data storing process.

If you don't want to wait for the first run, you can trigger it immediately by clicking the Force run link:

dataflow_connecting

Your data has now been extracted and stored, and you will find it in your Destination, which in our example is Google Sheets.
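If you want to verify the stored rows programmatically, a minimal sketch using the gspread library follows. It assumes a Google service-account credential with access to the spreadsheet, and the sheet name "LoRa messages" is a hypothetical example; use your Destination's actual name:

    # Minimal sketch: reading the stored LoRa messages back from Google Sheets.
    # Assumes a service-account JSON key; the sheet name is a hypothetical example.
    import gspread

    gc = gspread.service_account(filename="service_account.json")
    worksheet = gc.open("LoRa messages").sheet1
    rows = worksheet.get_all_records()  # list of dicts, one per stored message
    print(f"{len(rows)} LoRa messages stored")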

In the Data Flow detail you will find information about the Last run and the Next run of the data storing process:

dataflow_next_run