What Is Data Streams?
Signing Up
Creating a Stream
Linking Your Data Extract Source & Data Destination
Schedule & Date Range/Filters & Segments
Metrics & Dimensions/Data Stream Name & Save
Managing Your Streams
Editing Your Streams
Pausing & Resuming Your Data Streams
Deleting a Stream
Viewing the Freshly-Imported Data in Tableau
What Is Data Streams?
Data Streams is an online data migration tool that allows users to bring data from external
platforms (Adobe Analytics, Bing Ads, DoubleClick for Advertisers, Facebook Pages, Facebook
Ads, Google Analytics, Google Ads, Google BigQuery, Kochava, Microsoft SQL Server, MySQL,
PostgreSQL, Redshift, Search Ads 360, Snowflake, Teradata, and YouTube) into various
destinations such as Amazon S3, Google BigQuery, FTP, MongoDB, MySQL,
PostgreSQL, Redshift, Snowflake, Microsoft SQL Server, and Teradata.
Signing Up
Open our Data Streams link in your browser and click the ‘Sign Up For Free’ button.
Fill in the fields with your details, check the consent box, and click the ‘Create my account’
button.
Once the account has been successfully created, the stream builder page is displayed.
Creating a Stream
Linking Your Data Extract Source And Data Destination
1. Under the ‘Data Extract Source’ section, select ‘Connect to a New Data Source.’
2. Choose a Data Connector from the drop-down list under the Data Connector tab.
3. Link your credentials.
To do so, click the Link button. This will open a new window that will prompt you to enter your
Adobe credentials (for the Adobe Connector).
Log in with your username and Adobe secret. For more information about linking your Adobe
Analytics credentials, visit our Support Center article.
4. Once your credentials are linked, you can select your Adobe credential and Report Suite.
To save this data source for future reference, name it, and click the SUBMIT button.
5. Data Destination.
Choose your data destination from the ‘Data Connector’ dropdown.
6. Link your Snowflake credentials by clicking the ‘Link’ button. This opens a new window
where you enter your login details along with server and port information. The same goes for
Amazon Redshift and other data warehouse destinations.
Once the credentials are linked, they will appear in the credentials list.
7. Before submitting, you need to choose an existing table or create a new table for the data
that is about to be migrated.
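Conceptually, the "existing table vs. new table" choice comes down to whether the destination table must be created before the first load. The sketch below illustrates that create-if-missing logic using SQLite as a stand-in for a real warehouse; the table and column names are hypothetical, not the ones Data Streams uses internally:

```python
import sqlite3

def ensure_destination_table(conn, table_name):
    """Create the destination table if it does not already exist."""
    # Hypothetical schema for an Adobe Analytics -> warehouse stream.
    conn.execute(
        f"CREATE TABLE IF NOT EXISTS {table_name} ("
        "  day TEXT,"        # dimension: date of the record
        "  page TEXT,"       # dimension: page name
        "  visits INTEGER"   # metric: visit count
        ")"
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
ensure_destination_table(conn, "adobe_daily_visits")  # new table is created
ensure_destination_table(conn, "adobe_daily_visits")  # existing table is reused
```

Picking an existing table in the UI corresponds to the second call: the schema is reused and new rows are appended to it.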
Schedule & Date Range/Filters & Segments
8. Schedule and Date Range.
Streams can be:
● One-time
● Recurring
The available date presets are found in the Date Preset field (Today, Yesterday, This Week,
This Month, Two Weeks Prior, Last Week, Last 13 Weeks, Last Month, Last 13 Months, Last 7
Days, Last 30 Days, Last 90 Days) as well as the ‘Custom’ option, which allows users to pick a
start date and an end date.
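Each preset resolves to a concrete start and end date relative to the run day. The sketch below shows how a few of the presets listed above could be computed; the exact boundary conventions (e.g. whether "Last 7 Days" includes today) are an assumption here, not documented Data Streams behavior:

```python
from datetime import date, timedelta

def preset_range(preset, today):
    """Return (start, end) dates for a subset of the date presets.
    Boundary rules are illustrative assumptions."""
    if preset == "Today":
        return today, today
    if preset == "Yesterday":
        y = today - timedelta(days=1)
        return y, y
    if preset == "Last 7 Days":
        # Assumed to mean the 7 full days before today.
        return today - timedelta(days=7), today - timedelta(days=1)
    if preset == "Last 30 Days":
        return today - timedelta(days=30), today - timedelta(days=1)
    raise ValueError(f"unknown preset: {preset}")

start, end = preset_range("Last 7 Days", date(2024, 3, 15))
# -> start 2024-03-08, end 2024-03-14
```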
Granularity options:
● Daily
● Weekly
● Monthly
When building a recurring stream, a schedule needs to be set.
The ‘Run this report until’ field allows you to pick a date in the future when you want this stream
to stop running.
Stream Frequency:
● Daily
● Weekly
● Monthly
Time of Day - Sets the exact time when the stream should run.
Timezone - Sets the timezone for the selected schedule.
The ‘Since when do you need your data?’ field sets the start limit for the data you are about to
query.
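Together, Stream Frequency, Time of Day, and Timezone determine when each run fires. The sketch below resolves the next run for a daily schedule; the function name and the daily-only scope are illustrative, and weekly/monthly frequencies would extend the same idea:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

def next_run(now, hour, minute, tz_name):
    """Next run of a 'Daily' stream at hour:minute local time in tz_name.
    (Hypothetical helper; Data Streams' own scheduler is not exposed.)"""
    tz = ZoneInfo(tz_name)
    local_now = now.astimezone(tz)
    candidate = local_now.replace(hour=hour, minute=minute,
                                  second=0, microsecond=0)
    if candidate <= local_now:
        # Today's slot has passed; schedule for tomorrow.
        candidate += timedelta(days=1)
    return candidate

now = datetime(2024, 3, 15, 18, 30, tzinfo=ZoneInfo("UTC"))
run = next_run(now, hour=6, minute=0, tz_name="America/New_York")
# 18:30 UTC is 14:30 in New York, so the next 06:00 run is on March 16.
```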
Calendar
● Standard calendar - Uses Gregorian calendar (the first day of the week is Sunday, and
months are determined by the calendar dates)
● Retail calendar - Uses the National Retail Federation calendar (the first day of the week
is Sunday, but the months are defined by the 4-5-4 NRF calendar).
● Custom calendar - Lets you select the day the week starts, which is useful when your
reporting week does not begin on Sunday. This comes in handy when pulling data from a
data source at a weekly or monthly granularity, because the data arrives already correctly
aggregated at those levels, with the Cognetik Cloud Connector doing all the heavy lifting.
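The practical difference between the calendar options is where a week begins. The sketch below snaps a date to the start of its week for a configurable first weekday; the function is a hypothetical illustration of the aggregation boundary, not part of any Data Streams API:

```python
from datetime import date, timedelta

def week_start(d, first_weekday=6):
    """Start of the week containing d.
    first_weekday uses Python's convention: Monday=0 ... Sunday=6.
    The default (Sunday) matches the Standard and Retail calendars;
    pass another value to mimic a Custom calendar."""
    offset = (d.weekday() - first_weekday) % 7
    return d - timedelta(days=offset)

# Wednesday 2024-03-13 in a Sunday-start vs. a Monday-start week:
sunday_start = week_start(date(2024, 3, 13))                    # 2024-03-10
monday_start = week_start(date(2024, 3, 13), first_weekday=0)   # 2024-03-11
```

With weekly granularity, all records between two consecutive week starts are rolled up into one row, so the choice of first weekday changes which rows the data lands in.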
9. The Filters & Segments section is optional.
This feature allows you to choose a metric or dimension to filter specific data or to segment
your data. Simply use the drop-down tab and select the data you would like to segment.
Metrics & Dimensions/Data Stream Name & Save
10. Parameter table.
Source - Allows you to choose a metric/dimension from the drop down.
Destination - Once the Source is set you can select your Destination. This sets a column in the
table.
Data Type - Populates automatically when choosing your Source and Destination.
NOTE:
● To add more metrics or dimensions, click the green ‘Add Metric or Dimension’ button
below the table.
● At least one metric is mandatory for your stream.
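The Source/Destination/Data Type rows can be thought of as a small mapping configuration, and the "at least one metric" rule as a validation over it. The sketch below illustrates that check; the row shape and field names are assumptions for illustration, not the format Data Streams uses:

```python
def validate_mappings(mappings):
    """Check a list of Source -> Destination rows.
    Each row is a dict with 'source', 'destination', and 'kind'
    ('metric' or 'dimension'); the shape is a hypothetical stand-in."""
    if not any(m["kind"] == "metric" for m in mappings):
        raise ValueError("at least one metric is mandatory for your stream")
    return True

rows = [
    {"source": "Page",   "destination": "page_name", "kind": "dimension"},
    {"source": "Visits", "destination": "visits",    "kind": "metric"},
]
ok = validate_mappings(rows)  # passes: one metric is present
```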
11. Before saving the stream, you can preview your data. To do so, simply click the ‘Preview
Data’ button.
12. Name and Save stream.
Once it's saved, the stream will appear at the top of the stream list.
Managing Your Streams
Stream list
Lists all the streams created within the selected space. Each stream entry provides the
following information:
● Source
● Destination
● Schedule Type
● Granularity
● Start Date
● End Date
● Created by
You can also search for your streams by name in the search bar.
Unless otherwise chosen, all your streams within the selected space will be displayed on the
homepage.
You have the option to select which streams get displayed using the buttons under the search
bar.
Filter by:
● All
● Running
● Paused
● Completed
● One-time
● Recurring
You have the option to sort your streams using the icon in the upper left corner.
Sort your streams by the following criteria, both ascending and descending:
● Date of creation
● Name
● Date of last run
● Source
● Destination
● Schedule type
● Granularity
● Start date
● End date
You can also filter streams by data source and data destination by clicking the icon next to the
sort icon.
Editing Your Streams
You can edit the streams in your list anytime. Go to the job you want to edit and click the pencil
icon shown in the upper right of each data stream.
Once you click the edit icon, the stream builder page is displayed. Here you can make your
changes, save, and return to your Job List.
Note:
Editing a one-time stream will not cause it to re-run, as one-time streams run only once by
definition.
Pausing & Resuming Your Data Streams
To Pause your stream, simply click the pause icon located next to the edit icon in the upper right
of the stream in your Job List.
Once you click the pause icon, the stream is paused and the icon changes to a play icon.
To resume the paused stream, click the play icon.
Deleting a Stream
To delete a stream, simply click the trash icon next to the edit icon.
Viewing the freshly-imported data in Tableau
If you want to analyze the data or build dashboards with it, it’s easy to connect any BI tool,
such as Tableau, Domo, or Power BI, to your Snowflake data.
To view your Snowflake data in Tableau:
● Open Tableau Desktop
● Go to "To a server"
● Choose "Snowflake" from the list of data warehouses
● Log in with your Snowflake Credentials and click ‘Sign In’
Select the Warehouse, Database, and Schema, then look for your table. Drag it onto the canvas
and update the preview; your data is now in Tableau and ready to be used in your dashboards!
Still have questions while working on your streams?
Please visit our website or Support Center for more information.
You can now access the Support Center from the home page by clicking the link in the upper
right corner.