Resource

Resource enables you to connect to any REST API friendly data source and efficiently ETL the data into a warehouse of your choice, on an update frequency you determine, in an automated manner.

Configuring the Credentials

Select the credentials from the dropdown menu and click Next.

Credentials not listed in the dropdown?

Click on + Add New to add new credentials. There are slight variations in the way a REST API interacts with a given data source; this connector accommodates these variations through multiple authentication mechanisms. Select the authentication type from the dropdown menu, enter the details, and click Save.

Data Pipeline Details

Data Pipeline

Select Resource from the dropdown

Setting Parameters

Parameter Description Values

RequestURL

Required

Enter the complete URL of the REST API endpoint. URL parameters can be entered here or in the URL Parameters field.

String value

RequestMethod

Required

Select the request method type. Selecting POST enables an additional Body parameter.

{GET, POST}

Default Value: GET

Body

Dependent

Optional (shown only if POST is selected as the Request Method)

Define the body parameters of the POST request in JSON format

String value in JSON format
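For instance, a POST body for a hypothetical orders endpoint might look like this (the field names are illustrative, not part of the connector):

```json
{
  "status": "shipped",
  "limit": 100,
  "include_totals": true
}
```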

Headers

Headers to be included in the API request in JSON format

String value

URL Parameters

Specify the URL parameters in JSON format; alternatively, they can be entered directly in the URL.

String value in JSON format
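To illustrate the equivalence between the two entry styles, the sketch below (plain Python standard library, with a hypothetical endpoint) shows how JSON-format URL parameters map onto parameters embedded directly in the URL:

```python
from urllib.parse import urlencode

# Hypothetical endpoint and parameters, for illustration only.
base_url = "https://api.example.com/v1/orders"
url_params = {"status": "shipped", "limit": "100"}  # the JSON-format entry

# Entering the parameters as JSON is equivalent to appending them to the URL:
full_url = f"{base_url}?{urlencode(url_params)}"
print(full_url)  # https://api.example.com/v1/orders?status=shipped&limit=100
```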

Pagination Available

Required

If the API supports pagination, select Yes; otherwise select No.

{Yes, No}

Default Value: No

PaginationParameterName

Dependent

(Required only if Pagination Available is set to Yes)

Enter the case-sensitive name of the parameter used for pagination.

String value (e.g., Pages, Next)

PaginationParameterLocation

Dependent

(Required only if Pagination Available is set to Yes)

Specify where the pagination parameter should be placed in the API request.

{HEADERS, BODY, URL}

Default Value: URL

PaginationInResponse

Dependent

(Required only if Pagination Available is set to Yes)

If the value of the pagination parameter is received in the response itself, select Yes; otherwise select No.

{Yes, No}

Default Value: Yes

PaginationStepSize

Dependent

(Required only if Pagination Available is set to Yes and PaginationInResponse is set to No)

Step size by which to increment the pagination parameter value on each call.

Number value (e.g., if the pagination parameter name is "after" and each call returns 1000 elements, the step size should be 1000)

Default Value: 1
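The loop below is a minimal sketch (not the connector's actual implementation) of how a step-based pagination parameter advances when PaginationInResponse is set to No:

```python
def paginate(fetch_page, param_name="after", step_size=1000, max_calls=10):
    """Increment the pagination parameter by step_size until a page is empty."""
    value = 0
    records = []
    for _ in range(max_calls):
        page = fetch_page({param_name: value})
        if not page:
            break
        records.extend(page)
        value += step_size  # advance by PaginationStepSize on each call
    return records

# Fake API: 1000 elements at offsets 0 and 1000, nothing afterwards.
fake_pages = {0: ["row"] * 1000, 1000: ["row"] * 1000}
result = paginate(lambda params: fake_pages.get(params["after"], []))
print(len(result))  # 2000
```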

DataFieldName

Enter the field name that stores the data in the response JSON. If it is nested, enter it in dot-separated format. (e.g., if the data is inside the 'items' field inside 'data', use data.items)

String Value ({ "data": { "items": [Actual data to be stored in warehouse] } })
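The dot-separated path can be thought of as a walk into the response JSON; a minimal sketch:

```python
def extract(response_json, data_field_name):
    """Follow a dot-separated DataFieldName (e.g. 'data.items') into the response."""
    node = response_json
    for key in data_field_name.split("."):
        node = node[key]
    return node

response = {"data": {"items": [{"id": 1}, {"id": 2}]}}
print(extract(response, "data.items"))  # [{'id': 1}, {'id': 2}]
```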

Insert Mode

Required

Specifies how data will be updated in the data warehouse: Upsert inserts only new or changed records, Append inserts all fetched data at the end, and Replace drops the existing table and recreates a fresh one on each run.

{Upsert, Append, Replace}

Default Value: Upsert

Delete Keys

Dependent

(Required only if Upsert is chosen as the Insert Mode)

Enter the comma-separated column names on which data is to be upserted.

String value
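Upsert semantics over the Delete Keys columns can be sketched as follows (an in-memory illustration, not the warehouse implementation; the column names are hypothetical):

```python
def upsert(existing_rows, incoming_rows, delete_keys):
    """Rows sharing the Delete Keys columns are replaced; new keys are appended."""
    keyed = {tuple(row[k] for k in delete_keys): row for row in existing_rows}
    for row in incoming_rows:
        keyed[tuple(row[k] for k in delete_keys)] = row
    return list(keyed.values())

existing = [{"id": 1, "qty": 5}, {"id": 2, "qty": 3}]
incoming = [{"id": 2, "qty": 9}, {"id": 3, "qty": 1}]
print(upsert(existing, incoming, ["id"]))
# [{'id': 1, 'qty': 5}, {'id': 2, 'qty': 9}, {'id': 3, 'qty': 1}]
```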

HasDateParameters

Required

Select Yes if the API supports requesting data for a specific period of time; otherwise select No.

{Yes, No}

Default Value: No

Number of days

Dependent

(Required only if HasDateParameters is set to Yes)

Enter the number of days for which you wish to get the data in each run.

Number value

FromDateParameterName

Dependent

(Required only if HasDateParameters is set to Yes)

Enter the name of the start date parameter used in the API.

String value (e.g., fromdate, startdate, start)

ToDateParameterName

Dependent

(Required only if HasDateParameters is set to Yes)

Enter the name of the end date parameter used in the API.

String value (e.g., enddate, todate)

DateFormat

Dependent

(Required only if HasDateParameters is set to Yes)

Enter the date format used by the API for the FROM DATE and TO DATE parameter values.

%Y-%m-%d (refer to this link for details)

Default Value: %Y-%m-%d
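Together, Number of days, the date parameter names, and DateFormat describe a rolling window. A sketch of how such a window might be computed (the function and parameter names are illustrative):

```python
from datetime import date, timedelta

def date_window(run_date, number_of_days, date_format="%Y-%m-%d"):
    """Return (from_date, to_date) strings for a window ending on run_date."""
    from_date = run_date - timedelta(days=number_of_days)
    return from_date.strftime(date_format), run_date.strftime(date_format)

print(date_window(date(2024, 3, 15), 7))  # ('2024-03-08', '2024-03-15')
```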

DateParameterLocation

Dependent

(Required only if HasDateParameters is set to Yes)

Specify where the date parameters should be placed in the API request.

{HEADERS, BODY, URL}

Default Value: URL

InferSchema

Required

If Yes, value types will be fetched as-is (e.g., a float will be stored as a float). If No, everything will be fetched as a string irrespective of its type.

{Yes, No}

Default Value: Yes
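The effect of InferSchema = No can be illustrated as casting every fetched value to a string (a sketch, not the connector's internals; the record is made up):

```python
record = {"price": 19.99, "qty": 3, "sku": "A-1"}  # illustrative API record

infer_schema = False  # InferSchema = No
if not infer_schema:
    record = {key: str(value) for key, value in record.items()}

print(record)  # {'price': '19.99', 'qty': '3', 'sku': 'A-1'}
```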

Datapipeline Scheduling

Scheduling specifies the frequency with which data will get updated in the data warehouse. You can choose between Manual Run, Normal Scheduling or Advance Scheduling.

Manual Run

If scheduling is not required, you can use the toggle to run the pipeline manually.

Normal Scheduling

Use the dropdown to select an interval-based frequency: hourly, daily, weekly, or monthly.

Advance Scheduling

Set fine-grained schedules at the level of months, days, hours, and minutes.

Detailed explanation on scheduling of pipelines can be found here

Dataset & Name

Dataset Name

Key in the Dataset Name (this also serves as the table name in your data warehouse). Keep in mind that the name should be unique across the account and the data source. Special characters (except underscore _) and blank spaces are not allowed. It is best to follow a consistent naming scheme so the tables are easy to locate later.

Dataset Description

Optionally, enter a short description of the dataset fetched by this particular pipeline.

Notifications

Choose the events for which you’d like to be notified: whether "ERROR ONLY" or "ERROR AND SUCCESS".

Once you have finished, click Finish to save the pipeline. Read more about naming and saving your pipelines, including the option to save them as templates, here

Still have Questions?

We’ll be happy to help you with any questions you might have! Send us an email at info@datachannel.co.

Subscribe to our Newsletter for latest updates at DataChannel.