Options for each connector

Updated by Hardik Chheda

Before you can create a dataset, make sure you have the correct configuration for your chosen connector.

As you create your connections, you need to enter specific information into the connection dialog box. You can find the details for each connection type below:

  1. In the left navigation bar, click the Data icon.
  2. Click the Connect tab.
  3. On the Connect tab, click the Create New button.
  4. On the New Datasource page, click the button for the required datasource.
  5. Enter the datasource connection details for the required datasource.

Amazon S3

  • AWS access key: The access key ID for your AWS account.
  • AWS secret key: The secret access key for your AWS account.
  • S3 bucket: The bucket from which you want to use the data for your dataset.
  • AWS region: The region where you deployed your Amazon S3 instance.
  • Browse Bucket: Tellius loads the files from your Amazon S3 bucket.
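
The fields above are the standard pieces of an S3 object address. As an illustration (the bucket and file names below are hypothetical, and this is not Tellius's internal implementation), they combine into an `s3://` URI like this:

```python
def s3_uri(bucket: str, key: str) -> str:
    # "S3 bucket" field above -> bucket; key is a file path that
    # Browse Bucket would list (both values here are placeholders).
    return f"s3://{bucket}/{key.lstrip('/')}"

uri = s3_uri("sales-data", "2023/orders.csv")
```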

MongoDB

  • Host: The hostname where you deployed your MongoDB.
  • Port: The port number for the configured host.
  • Database: The database that you want to connect to for creating a dataset.
  • Username: The username to access the specified host.
  • Password: The password for the specified username.
  • Collection: The collection from which you want to use the data for your dataset.
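
The host, port, database, and credential fields above correspond to the parts of a standard MongoDB connection string. A minimal sketch with placeholder values (how Tellius assembles the connection internally is not specified here):

```python
from urllib.parse import quote_plus

def mongodb_uri(host, port, database, username, password):
    # Percent-encode credentials so special characters in the
    # password do not break the URI.
    return (f"mongodb://{quote_plus(username)}:{quote_plus(password)}"
            f"@{host}:{port}/{database}")

# All values below are placeholders.
uri = mongodb_uri("db.example.com", 27017, "analytics", "reader", "p@ss")
```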

Cassandra

  • Table name: The table from which you want to use the data for your dataset.
  • Key space: The container for your data in Cassandra.
  • Partition key columns: The columns that determine how rows are distributed across nodes. Contact your administrator for details.
  • Clustering key columns: The columns that determine the sort order of rows within a partition. Contact your administrator for details.

Oracle

  • Host: The host name where you deployed your Oracle database.
  • User: The user name to access the specified host name.
  • Password: The password for the specified user name.
Note: Once you click Browse Host, you can select a table or enter custom SQL on a particular table you want to use for your dataset.

HDFS

  • Path: The path of the data file from which you want to use the data for your dataset.
Note: Once you click Load, set the Load Sample data for faster Transformations option to On or Off. Loading sample data ensures faster transformations.
  • SampleLoad: The number of records that you want to load as sample data.
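
The Path field is typically a fully qualified HDFS location. A small sketch of how such a path is composed (the NameNode address and file below are hypothetical; your cluster's actual values come from your Hadoop configuration):

```python
def hdfs_path(namenode: str, port: int, path: str) -> str:
    # namenode and port are placeholders; the "Path" field above would
    # hold the resulting location (or just the path portion, depending
    # on your cluster's default filesystem).
    return f"hdfs://{namenode}:{port}/{path.lstrip('/')}"

full_path = hdfs_path("namenode.example.com", 8020, "/data/sales.csv")
```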

MemSQL

  • Host: The host name where you deployed your MemSQL database. 
  • User: The user name to access the specified host name.
  • Password: The password for the specified user name.
Note: Once you click Browse Host, you can select a table or enter custom SQL on a particular table you want to use for your dataset.

Elastic Search

  • Path: The path of your data file.
  • ES Nodes: The IP address of the host where you deployed your Elastic Search database.
  • ES Port: The port number for the specified IP address.
Note: Once you click Browse Host, you can select a table or enter custom SQL on a particular table you want to use for your dataset.

MySQL

  • Host: The host name where you deployed your MySQL database. 
  • User: The user name to access the specified host name.
  • Password: The password for the specified user name.
Note: Once you click Browse Host, you can select a table or enter custom SQL on a particular table you want to use for your dataset.
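
For reference, the Host field corresponds to the server part of a standard MySQL JDBC URL; the user name and password are supplied separately. A sketch with placeholder values (Tellius may assemble the connection differently):

```python
def jdbc_mysql_url(host: str, port: int = 3306, database: str = "") -> str:
    # 3306 is MySQL's default port; credentials are passed separately,
    # not embedded in the URL.
    base = f"jdbc:mysql://{host}:{port}"
    return f"{base}/{database}" if database else base

url = jdbc_mysql_url("db.example.com", database="sales")
```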

PostgreSQL

  • Host: The host name where you deployed your PostgreSQL database.
  • User: The user name to access the specified host name.
  • Password: The password for the specified user name.
Note: Once you click Browse Host, you can select a table or enter custom SQL on a particular table you want to use for your dataset.

Redshift

  • Host: The host name where you deployed your Redshift database. 
  • User: The user name to access the specified host name.
  • Password: The password for the specified user name.
Note: Once you click Browse Host, you can select a table or enter custom SQL on a particular table you want to use for your dataset.

MS SQL

  • Host: The host name where you deployed your MS SQL database. 
  • User: The user name to access the specified host name.
  • Password: The password for the specified user name.
Note: Once you click Browse Host, you can select a table or enter custom SQL on a particular table you want to use for your dataset.

Salesforce

  • Host: The host name where you deployed your Salesforce instance.
  • ClientId: The client ID from the Salesforce portal.
  • Clientsecret: The client secret from the Salesforce portal.
  • Security Token: The security token associated with your Salesforce account.
  • User: The user name to access the specified host name.
  • Password: The password for the specified user name.
Note: Once you click Browse Host, you can select a table or enter custom SQL for the data you want to use for your dataset.
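
The client ID, client secret, user name, password, and security token above map onto Salesforce's OAuth 2.0 username-password flow, in which the security token is appended to the password. A sketch of the token request body with placeholder values (how Tellius authenticates internally is not documented here):

```python
from urllib.parse import urlencode

def sf_token_request_body(client_id, client_secret, user, password, security_token):
    # In Salesforce's username-password OAuth flow, the security token
    # is appended directly to the password. All values are placeholders.
    return urlencode({
        "grant_type": "password",
        "client_id": client_id,
        "client_secret": client_secret,
        "username": user,
        "password": password + security_token,
    })

body = sf_token_request_body("my-client-id", "my-secret",
                             "user@example.com", "pw", "TOKEN123")
```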

Google Analytics

Note: You need to authorize your Google Analytics application to proceed further.
  • Authorize: Give Tellius authority to access your account and application.
  • Access Code: Copy the code displayed on the page and paste it on the Google Analytics page.
  • Google Analytics Ids: Enter the IDs for which you want to create a dataset.
  • Time Range: Enter the time period for which you want to collect the data.
  • Dimension: Enter the dimension by which you want to view the data.
  • Metrics: Enter the metrics by which you want to measure the specified dimensions.
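
The IDs, time range, dimension, and metrics fields correspond to the parameters of a Google Analytics Core Reporting query. A sketch with placeholder values (the actual query Tellius issues is not documented here):

```python
from urllib.parse import urlencode

# Placeholder values; the "ga:" prefixes follow the Core Reporting
# API's naming convention.
params = {
    "ids": "ga:12345678",               # Google Analytics Ids
    "start-date": "2023-01-01",         # Time Range (start)
    "end-date": "2023-01-31",           # Time Range (end)
    "dimensions": "ga:country",         # Dimension
    "metrics": "ga:sessions,ga:users",  # Metrics
}
query = urlencode(params)
```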

Impala

  • URL: Enter the URL of the database.
  • Database Table: Select the database you are trying to access.
  • User Name: Enter the user name to access the specified database.
  • Password: Enter the password to access the specified database.

Snowflake

  • Host: Enter the host name where Snowflake is deployed.
  • User Name: Enter the user name to access the specified host.
  • Password: Enter the password to access the specified host.
  • Databases: Select the database you are trying to access.
  • Schema: Select the required database schema, then either select a table or write custom SQL.
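
For reference, the host, database, and schema fields correspond to the parts of a standard Snowflake JDBC URL. A sketch with placeholder values (Tellius may connect differently; credentials are supplied separately):

```python
from urllib.parse import urlencode

def snowflake_jdbc_url(host: str, database: str, schema: str) -> str:
    # host is the account's full hostname (placeholder below).
    return f"jdbc:snowflake://{host}/?{urlencode({'db': database, 'schema': schema})}"

url = snowflake_jdbc_url("myaccount.snowflakecomputing.com", "SALES", "PUBLIC")
```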

Azure

New connection

  • Azure Storage Account Name: The name of the Azure Storage account that you want to use.
  • Azure Storage Account Key: The secret key to your Azure Storage instance.
  • Azure Storage Container Name: The Azure container from which you want to use the data for your datasource.
  • Secure Protocol: Move the slider to Yes if you want to securely access the Azure account.
  • Relative path of the Blob file: The relative path of the blob file from which you want to take the data for your datasource.  
  • Datasource name: The name by which you want to save your datasource.
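
The storage account name, container name, relative blob path, and Secure Protocol slider together identify a blob at a standard Azure Blob Storage URL. A sketch with placeholder values (the account key is used for authentication separately and never appears in the URL):

```python
def blob_url(account: str, container: str, relative_path: str,
             secure: bool = True) -> str:
    # secure=True corresponds to the Secure Protocol slider set to Yes.
    scheme = "https" if secure else "http"
    return (f"{scheme}://{account}.blob.core.windows.net/"
            f"{container}/{relative_path.lstrip('/')}")

url = blob_url("mystorage", "raw-data", "2023/events.json")
```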

Existing Connection

Move the slider for Use validated datasource connection details to Yes.

Select an existing datasource and change only the required values:

  • Datasource
  • Account key 
  • File path
  • Protocol
  • Relative path of Blob file
  • Datasource name

BigQuery

New connection

  • GCS Bucket: The name of the Google Cloud Storage bucket where your projects are stored.  
  • Project Id: The identifier of the project from which you want to take the data. 
  • Datasource name: The name by which you want to save your datasource.  
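
The GCS Bucket field names a Google Cloud Storage bucket; for reference, objects in that bucket are addressed with `gs://` URIs. A sketch with placeholder values (not Tellius's internal implementation):

```python
def gcs_uri(bucket: str, object_path: str) -> str:
    # "GCS Bucket" field above -> bucket; the object path is hypothetical.
    return f"gs://{bucket}/{object_path.lstrip('/')}"

uri = gcs_uri("my-project-data", "exports/table.json")
```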

Existing Connection

Move the slider for Use validated datasource connection details to Yes.

  • Table Name: The name of the table where your data is stored.
  • Upload Config File: Upload the configuration file. Ensure that the configuration file is a valid JSON file.
