Connector API
A connector can be used to define the input source, where the imported data is coming from, or to determine the target, where the mapped and transformed data should be sent.
Currently, we support the following types of connectors: HTTP(S), AWS S3, (S)FTP, e-mail, and Azure Blob Storage.
You can send the following file types:
- XLS(X)
- CSV
- TSV
- XML
- JSON
Use this base URL and append the corresponding endpoint:
Base URL
api-gateway.getnuvo.com/dp/api/v1/
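As a sketch, a full request URL can be built by joining the base URL with the endpoint path (the https scheme is an assumption; the doc lists the base URL without one):

```python
from urllib.parse import urljoin

# Assumes the API is served over HTTPS at the base URL above.
BASE_URL = "https://api-gateway.getnuvo.com/dp/api/v1/"

def endpoint(path: str) -> str:
    """Join the base URL with a relative endpoint path."""
    return urljoin(BASE_URL, path.lstrip("/"))

print(endpoint("/connector/"))
# https://api-gateway.getnuvo.com/dp/api/v1/connector/
```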
Create
Endpoint
POST /connector/
Payload
Attributes
name
The name of the connector
type
Defines whether the connector is the source of the data to be imported or the target where the data should be sent:
INPUT
: Input connector (source of the data to be imported)
OUTPUT
: Output connector (target where the data should be sent)
node_type
Defines the type of connector:
HTTP
: Receives or sends data from/to a specific URL (e.g., web server or REST API)
S3
: Receives or sends data from/to an AWS S3 bucket
FTP
: Transfers files directly over the network
EMAIL
: Receives files from an email account
AZURE
: Receives or sends data from/to an Azure Blob Storage container
configuration
Defines the specific setup of your connector based on the type
and node_type
E-mail connectors
recipients
The list of e-mail addresses to which the mapped and transformed data should be sent
file_name
The name of the file, without the file extension, that will be sent to the recipient(s), e.g. "my_output"
file_extension
The file extension to use for the output file. Allowed file types are:
- XLSX
- XML
- CSV
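Putting the e-mail attributes together, an output e-mail connector payload might look like this sketch (name and recipient address are placeholders):

```python
# Hypothetical e-mail output connector payload; the recipient address
# and connector name are placeholders, not values from the docs.
email_connector = {
    "name": "Monthly report by e-mail",
    "type": "OUTPUT",
    "node_type": "EMAIL",
    "configuration": {
        "recipients": ["ops@example.com"],  # placeholder address
        "file_name": "my_output",           # extension is set separately
        "file_extension": "CSV",            # XLSX, XML, or CSV for e-mail connectors
    },
}
```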
(S)FTP connectors
host
The host address, e.g. "sftp.domain.com"
port
The port number
protocol
Defines the type of server that you’re using:
FTP
SFTP
username
The (S)FTP username to log into the server
password
The (S)FTP password to log into the server. For SFTP connectors you can also use an SSH private key instead of a password in secret_key
secret_key
The SFTP SSH private key, which can be used as an alternative to a password
directory_path
The path to the directory where files are stored on the (S)FTP server
file_name
The name of the file, without the file extension, that nuvo will send, e.g., "my_output"
file_extension
The file extension to use for the output file. Allowed file types are:
- XLSX
- XML
- CSV
- JSON
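For an SFTP server authenticated with an SSH key instead of a password, the configuration could be sketched as follows (host, username, and key are placeholders):

```python
# Hypothetical SFTP input connector; all credential values are placeholders.
sftp_connector = {
    "name": "Nightly SFTP drop",
    "type": "INPUT",
    "node_type": "FTP",  # node_type stays FTP; "protocol" selects SFTP
    "configuration": {
        "host": "sftp.domain.com",
        "port": 22,
        "protocol": "SFTP",
        "username": "nuvo-user",                 # placeholder
        "secret_key": "-----BEGIN OPENSSH PRIVATE KEY-----",  # used instead of "password"
        "directory_path": "/inbound/",
        "file_name": "my_output",
        "file_extension": "JSON",
    },
}
```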
HTTP(S) connectors
type
Defines whether an input connector proactively reads the data or if it reacts to the data sent by the user:
REACTIVE
: Data is sent to the connector during every pipeline execution
PROACTIVE
: Data is read automatically from the location during every pipeline execution
url
The endpoint that is called to receive data or where processed data is sent. For event-based input connectors, it’s the endpoint from which data is sent
method
REST API method, depending on the type of HTTP connector you’re creating. Currently, you can choose the following options:
GET
: The pipeline receives data from the specified endpoint via GET request
POST
: The pipeline sends/receives data to/from the specified endpoint via POST request
PUT
: The pipeline sends/receives data to/from the specified endpoint via PUT request
PATCH
: The pipeline sends/receives data to/from the specified endpoint via PATCH request
headers
The list of key-value pairs that define the headers for the request sent to receive data from the specified endpoint
type
Allowed types are:
REACTIVE
: The input connector listens for data sent by the user
PROACTIVE
: The input connector proactively gets the data
authentication
Defines your refresh token endpoint to obtain a new access token after the previous one has expired. This mechanism is more secure and allows you to re-authenticate every time you need to obtain a new access token
headers
The list of key-value pairs used for requests to the defined refresh endpoint
refresh_url
The endpoint that is called to receive the authentication token
method
REST API method. Currently, you can choose the following options:
GET
: The refresh_url endpoint is called via GET request
POST
: The refresh_url endpoint is called via POST request
PUT
: The refresh_url endpoint is called via PUT request
PATCH
: The refresh_url endpoint is called via PATCH request
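Combining these attributes, a proactive HTTP(S) input connector with a token-refresh endpoint might be sketched as follows (both URLs are placeholders):

```python
# Hypothetical proactive HTTP input connector; URLs and headers are placeholders.
http_connector = {
    "name": "Proactive HTTP import",
    "type": "INPUT",
    "node_type": "HTTP",
    "configuration": {
        "url": "https://example.com/export.json",  # placeholder data endpoint
        "method": "GET",
        "headers": {"Accept": "application/json"},
        "type": "PROACTIVE",  # data is fetched on every pipeline run
        "authentication": {
            "refresh_url": "https://example.com/token",  # placeholder token endpoint
            "method": "POST",
            "headers": {"Content-Type": "application/x-www-form-urlencoded"},
        },
    },
}
```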
AWS S3 connectors
aws_access_key
The AWS access key
aws_secret_access_key
The AWS secret access key
aws_bucket_name
The name of the S3 bucket
aws_region
The name of the region where the bucket is hosted
directory_path
The path in the S3 bucket where the input file is stored or where the output file should be stored
file_name
The name of the file, without the file extension, that nuvo will send, e.g., "my_output"
file_extension
The file extension to use for the output file. Allowed file types are:
- XLSX
- XML
- CSV
- JSON
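An S3 input connector configuration could look like this sketch (all credential and bucket values are placeholders):

```python
# Hypothetical AWS S3 input connector; credentials, bucket, and path are placeholders.
s3_connector = {
    "name": "S3 import bucket",
    "type": "INPUT",
    "node_type": "S3",
    "configuration": {
        "aws_access_key": "AKIA-PLACEHOLDER",       # placeholder credentials
        "aws_secret_access_key": "PLACEHOLDER",     # placeholder credentials
        "aws_bucket_name": "my-import-bucket",
        "aws_region": "eu-central-1",
        "directory_path": "incoming/",
        "file_name": "my_output",
        "file_extension": "CSV",
    },
}
```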
Azure Blob Storage connectors
azure_authentication_type
The authentication method used for the blob storage:
SAS
CONNECTION_STRING
azure_account
The name of the Azure account. This field is required if azure_authentication_type
is set to SAS
azure_sas
The SAS token necessary to authenticate the connector. This field is required if azure_authentication_type
is set to SAS
. Learn more about authentication types here
azure_connection_string
The string used to authenticate the connector. This field is required if azure_authentication_type
is set to CONNECTION_STRING
. Learn more about authentication types here
azure_container_name
The name of the Azure Blob Storage container
file_name
The name of the file, without the file extension, that nuvo will send, e.g., "my_output"
file_extension
The file extension to use for the output file. Allowed file types are:
- XLSX
- XML
- CSV
- JSON
directory_path
The path in the Azure Blob Storage container where the input file is stored or where the output file should be stored
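For SAS authentication, both azure_account and azure_sas are required; a sketch of such a configuration (account, token, and container are placeholders):

```python
# Hypothetical Azure Blob Storage output connector using SAS authentication;
# account name, SAS token, and container are placeholders.
azure_connector = {
    "name": "Blob storage export",
    "type": "OUTPUT",
    "node_type": "AZURE",
    "configuration": {
        "azure_authentication_type": "SAS",
        "azure_account": "mystorageaccount",  # required when using SAS
        "azure_sas": "sv=PLACEHOLDER",        # placeholder SAS token
        "azure_container_name": "exports",
        "directory_path": "processed/",
        "file_name": "my_output",
        "file_extension": "JSON",
    },
}
```

With azure_authentication_type set to CONNECTION_STRING, azure_connection_string would be supplied instead of azure_account and azure_sas.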
For AWS S3, (S)FTP, and Azure Blob Storage connectors
advanced_config
Add tags to specifically include or exclude files, and define how to manage processed input files after the pipeline has run
after_process_option
Define what to do with the processed input files after each pipeline run:
DELETE
: Delete the processed input file
RENAME
: Replace the current file name with the name defined in new_name
UNCHANGED
: Don’t do anything with the processed input file
inclusion_tag
The list of tags (as strings). Only files containing at least one inclusion tag will be processed. If no tags are defined, all files except those excluded by exclusion tags will be processed
exclusion_tag
The list of tags (as strings). Files containing any of the specified exclusion tags will be skipped and not processed. If no exclusion tags are defined and no inclusion tags are specified, all files will be processed
new_name
Define how processed input files should be renamed after the pipeline has run
permissions
Defines whether the connector should be available to all your sub-organizations or only for internal use
level
PUBLIC
: The connector can also be used by sub-organizations
PRIVATE
: The connector can only be used by users within your organization
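The tag and post-processing options above can be combined as in this sketch (the tag names are placeholders; note that the payload example below spells the key advance_config, while the attribute list calls it advanced_config):

```python
# Hypothetical advanced configuration for a file-storage connector;
# the tag names "invoice" and "draft" are placeholders.
advanced_config = {
    "inclusion_tag": ["invoice"],      # only files tagged "invoice" are processed
    "exclusion_tag": ["draft"],        # files tagged "draft" are skipped
    "after_process_option": "RENAME",  # rename each processed input file...
    "new_name": "processed",           # ...to this name after the pipeline run
}
```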
Payload
{
"name": "string",
"type": "string",
"node_type": "string",
"configuration": {
// EMAIL
"recipients": ["string"],
"file_extension": "string",
"file_name": "string",
// (S)FTP
"host": "string",
"port": 2222,
"protocol": "string",
"username": "string",
"password": "string",
"secret_key": "string",
"directory_path": "string",
"file_extension": "string",
"file_name": "string",
// HTTP(S)
"url": "string",
"method": "string",
"headers": {},
"type": "string",
"authentication": {
"refresh_url": "string",
"headers": {},
"method": "string"
},
// AWS
"aws_access_key": "string",
"aws_secret_access_key": "string",
"aws_bucket_name": "string",
"aws_region": "string",
"directory_path": "string",
"file_extension": "string",
"file_name": "string",
// AZURE
"azure_authentication_type": "string",
"azure_account": "string",
"azure_sas": "string",
"azure_connection_string": "string",
"azure_container_name": "string",
"directory_path": "string",
"file_extension": "string",
"file_name": "string",
// For AWS, AZURE & (S)FTP
"advance_config": {
"inclusion_tag": ["string"],
"exclusion_tag": ["string"],
"after_process_option": "string",
"new_name": "string"
}
},
"permissions": {
"level": "string"
}
}
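A minimal create request could then be assembled as follows. This is a sketch using only the Python standard library; the Bearer authorization scheme is an assumption, so substitute whatever authentication your account uses:

```python
import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder; assumed Bearer scheme

def build_create_request(payload: dict) -> urllib.request.Request:
    """Build (but do not send) a POST /connector/ request."""
    return urllib.request.Request(
        url="https://api-gateway.getnuvo.com/dp/api/v1/connector/",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",  # assumed auth scheme
        },
        method="POST",
    )

req = build_create_request({
    "name": "My e-mail connector",
    "type": "OUTPUT",
    "node_type": "EMAIL",
    "configuration": {
        "recipients": ["ops@example.com"],  # placeholder address
        "file_name": "my_output",
        "file_extension": "CSV",
    },
    "permissions": {"level": "PRIVATE"},
})
# urllib.request.urlopen(req) would send it
```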
Response
Attributes
id
The ID of the connector
name
The name of the connector
type
Defines whether the connector is the source of the data to be imported or the target where the data should be sent:
INPUT
: Input connector (source of the data to be imported)
OUTPUT
: Output connector (target where the data should be sent)
node_type
Defines the type of connector:
HTTP
: Receives or sends data from/to a specific URL (e.g., web server or REST API)
S3
: Receives or sends data from/to an AWS S3 bucket
FTP
: Transfers files directly over the network
EMAIL
: Receives files from an email account
AZURE
: Receives or sends data from/to an Azure Blob Storage container
configuration
Defines the specific setup of your connector based on the type
and node_type
E-mail connectors
recipients
The list of e-mail addresses to which the mapped and transformed data should be sent
file_name
The name of the file, without the file extension, that nuvo will send to the recipient(s), e.g., "my_output"
file_extension
The file extension to use for the output file. Allowed file types are:
- XLSX
- XML
- CSV
(S)FTP connectors
host
The host address, e.g., "sftp.domain.com"
port
The port number
protocol
Defines the type of server that you’re using:
FTP
SFTP
username
The (S)FTP username to log into the server
password
The (S)FTP password to log into the server. For SFTP connectors you can also use an SSH private key instead of a password in secret_key
secret_key
The SFTP SSH private key, which can be used as an alternative to a password
directory_path
The path to the directory where files are stored on the (S)FTP server
file_name
The name of the file, without the file extension, that nuvo will send, e.g., "my_output"
file_extension
The file extension to use for the output file. Allowed file types are:
- XLSX
- XML
- CSV
- JSON
HTTP(S) connectors
type
Defines whether an input connector proactively reads the data or if it reacts to the data sent by the user:
REACTIVE
: Data is sent to the connector during every pipeline execution
PROACTIVE
: Data is read automatically from the location during every pipeline execution
url
The endpoint that is called to receive data or where processed data is sent. For event-based input connectors, it’s the endpoint from which data is sent
method
REST API method, depending on the type of HTTP connector you’re creating. Currently, you can choose the following options:
GET
: The pipeline receives data from the specified endpoint via GET request
POST
: The pipeline sends/receives data to/from the specified endpoint via POST request
PUT
: The pipeline sends/receives data to/from the specified endpoint via PUT request
PATCH
: The pipeline sends/receives data to/from the specified endpoint via PATCH request
headers
The list of key-value pairs that define the headers for the request sent to receive data from the specified endpoint
authentication
Defines your refresh token endpoint to obtain a new access token after the previous one has expired. This mechanism is more secure and allows you to re-authenticate every time you need to obtain a new access token
headers
The list of key-value pairs used for requests to the defined refresh endpoint
refresh_url
The endpoint that is called to receive the authentication token
method
REST API method. Currently, you can choose the following options:
GET
: The refresh_url endpoint is called via GET request
POST
: The refresh_url endpoint is called via POST request
PUT
: The refresh_url endpoint is called via PUT request
PATCH
: The refresh_url endpoint is called via PATCH request
AWS S3 connectors
aws_access_key
The AWS access key
aws_secret_access_key
The AWS secret access key
aws_bucket_name
The name of the S3 bucket
aws_region
The name of the region where the bucket is hosted
directory_path
The path in the S3 bucket where the input file is stored or where the output file should be stored
file_name
The name of the file, without the file extension, that nuvo will send, e.g., "my_output"
file_extension
The file extension to use for the output file. Allowed file types are:
- XLSX
- XML
- CSV
- JSON
Azure Blob Storage connectors
azure_authentication_type
The authentication method used for the blob storage:
SAS
CONNECTION_STRING
azure_account
The name of the Azure account. This field is required if azure_authentication_type
is set to SAS
azure_sas
The SAS token necessary to authenticate the connector. This field is required if azure_authentication_type
is set to SAS
. Learn more about authentication types here
azure_connection_string
The string used to authenticate the connector. This field is required if azure_authentication_type
is set to CONNECTION_STRING
. Learn more about authentication types here
azure_container_name
The name of the Azure Blob Storage container
file_name
The name of the file, without the file extension, that nuvo will send, e.g., "my_output"
file_extension
The file extension to use for the output file. Allowed file types are:
- XLSX
- XML
- CSV
- JSON
directory_path
The path in the Azure Blob Storage container where the input file is stored or where the output file should be stored
For AWS S3, (S)FTP, and Azure Blob Storage connectors
advanced_config
Add tags to specifically include or exclude files, and define how to manage processed input files after the pipeline has run
after_process_option
Define what to do with the processed input files after each pipeline run:
DELETE
: Delete the processed input file
RENAME
: Replace the current file name with the name defined in new_name
UNCHANGED
: Don’t do anything with the processed input file
inclusion_tag
The list of tags (as strings). Only files containing at least one inclusion tag will be processed. If no tags are defined, all files except those excluded by exclusion tags will be processed.
exclusion_tag
The list of tags (as strings). Files containing any of the specified exclusion tags will be skipped and not processed. If no exclusion tags are defined and no inclusion tags are specified, all files will be processed.
new_name
Define how processed input files should be renamed after the pipeline has run
permissions
Defines whether the connector should be available to all your sub-organizations or only for internal use
level
PUBLIC
: The connector can also be used by sub-organizations
PRIVATE
: The connector can only be used by users within your organization
created_at
The date and time when the connector was first created
created_by
Information about who created the connector
id
The ID of the user or sub-organization who created the connector
name
The name of the user or sub-organization who created the connector
identifier
The identifier of the user or sub-organization who created the connector
type
Defines the type of user who created the connector:
USER
: A user of your organization
SUB_ORG
: A sub-organization that is part of your organization
updated_at
The date and time when the connector was last updated
updated_by
Information about who last updated the connector
id
The ID of the user or sub-organization who last updated the connector
name
The name of the user or sub-organization who last updated the connector
identifier
The identifier of the user or sub-organization who last updated the connector
type
Defines the type of user who last updated the connector:
USER
: A user of your organization
SUB_ORG
: A sub-organization that is part of your organization
Response
{
"data": {
"id": "string",
"name": "string",
"type": "string",
"node_type": "string",
"configuration": {
// EMAIL
"recipients": ["string"],
"file_extension": "string",
"file_name": "string",
// (S)FTP
"host": "string",
"port": 2222,
"protocol": "string",
"username": "string",
"password": "string",
"secret_key": "string",
"directory_path": "string",
"file_extension": "string",
"file_name": "string",
// HTTP(S)
"url": "string",
"method": "string",
"headers": {},
"type": "string",
"authentication": {
"refresh_url": "string",
"headers": {},
"method": "string"
},
// AWS
"aws_access_key": "string",
"aws_secret_access_key": "string",
"aws_bucket_name": "string",
"aws_region": "string",
"directory_path": "string",
"file_extension": "string",
"file_name": "string",
// AZURE
"azure_authentication_type": "string",
"azure_account": "string",
"azure_sas": "string",
"azure_connection_string": "string",
"azure_container_name": "string",
"directory_path": "string",
"file_extension": "string",
"file_name": "string",
// For AWS, AZURE & (S)FTP
"advance_config": {
"inclusion_tag": ["string"],
"exclusion_tag": ["string"],
"after_process_option": "string",
"new_name": "string"
}
},
"permissions": {
"level": "string"
},
"created_at": "string",
"created_by": {
"id": "string",
"name": "string",
"identifier": "string",
"type": "string"
},
"updated_at": "string",
"updated_by": {
"id": "string",
"name": "string",
"identifier": "string",
"type": "string"
}
}
}
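The response wraps the connector in a data envelope, so the new connector's ID can be read like this (the body below is a placeholder shaped like the response above):

```python
import json

# A body shaped like the create response; all values are placeholders.
body = '{"data": {"id": "conn_123", "name": "My e-mail connector", "type": "OUTPUT"}}'
connector = json.loads(body)["data"]
print(connector["id"])  # conn_123
```

The id returned here is what the Update and Read endpoints expect in their {id} path segment.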
Update
Endpoint
PUT /connector/{id}
Payload
Attributes
name
The name of the connector
type
Defines whether the connector is the source of the data to be imported or the target where the data should be sent:
INPUT
: Input connector (source of the data to be imported)
OUTPUT
: Output connector (target where the data should be sent)
node_type
Defines the type of connector:
HTTP
: Receives or sends data from/to a specific URL (e.g., web server or REST API)
S3
: Receives or sends data from/to an AWS S3 bucket
FTP
: Transfers files directly over the network
EMAIL
: Receives files from an email account
AZURE
: Receives or sends data from/to an Azure Blob Storage container
configuration
Defines the specific setup of your connector based on the type
and node_type
E-mail connectors
recipients
The list of e-mail addresses to which the mapped and transformed data should be sent
file_name
The name of the file, without the file extension, that nuvo will send to the recipient(s), e.g., "my_output"
file_extension
The file extension to use for the output file. Allowed file types are:
- XLSX
- XML
- CSV
(S)FTP connectors
host
The host address, e.g., "sftp.domain.com"
port
The port number
protocol
Defines the type of server that you’re using:
FTP
SFTP
username
The (S)FTP username to log into the server
password
The (S)FTP password to log into the server. For SFTP connectors you can also use an SSH private key instead of a password in secret_key
secret_key
The SFTP SSH private key, which can be used as an alternative to a password
directory_path
The path to the directory where files are stored on the (S)FTP server
file_name
The name of the file, without the file extension, that nuvo will send, e.g., "my_output"
file_extension
The file extension to use for the output file. Allowed file types are:
- XLSX
- XML
- CSV
- JSON
HTTP(S) connectors
type
Defines whether an input connector proactively reads the data or if it reacts to the data sent by the user:
REACTIVE
: Data is sent to the connector during every pipeline execution
PROACTIVE
: Data is read automatically from the location during every pipeline execution
url
The endpoint that is called to receive data or where processed data is sent. For event-based input connectors, it’s the endpoint from which data is sent
method
REST API method, depending on the type of HTTP connector you’re creating. Currently, you can choose the following options:
GET
: The pipeline receives data from the specified endpoint via GET request
POST
: The pipeline sends/receives data to/from the specified endpoint via POST request
PUT
: The pipeline sends/receives data to/from the specified endpoint via PUT request
PATCH
: The pipeline sends/receives data to/from the specified endpoint via PATCH request
headers
The list of key-value pairs that define the headers for the request sent to receive data from the specified endpoint
authentication
Defines your refresh token endpoint to obtain a new access token after the previous one has expired. This mechanism is more secure and allows you to re-authenticate every time you need to obtain a new access token
headers
The list of key-value pairs used for requests to the defined refresh endpoint
refresh_url
The endpoint that is called to receive the authentication token
method
REST API method. Currently, you can choose the following options:
GET
: The refresh_url endpoint is called via GET request
POST
: The refresh_url endpoint is called via POST request
PUT
: The refresh_url endpoint is called via PUT request
PATCH
: The refresh_url endpoint is called via PATCH request
AWS S3 connectors
aws_access_key
The AWS access key
aws_secret_access_key
The AWS secret access key
aws_bucket_name
The name of the S3 bucket
aws_region
The name of the region where the bucket is hosted
directory_path
The path in the S3 bucket where the input file is stored or where the output file should be stored
file_name
The name of the file, without the file extension, that nuvo will send, e.g., "my_output"
file_extension
The file extension to use for the output file. Allowed file types are:
- XLSX
- XML
- CSV
- JSON
Azure Blob Storage connectors
azure_authentication_type
The authentication method used for the blob storage:
SAS
CONNECTION_STRING
azure_account
The name of the Azure account. This field is required if azure_authentication_type
is set to SAS
azure_sas
The SAS token necessary to authenticate the connector. This field is required if azure_authentication_type
is set to SAS
. Learn more about authentication types here
azure_connection_string
The string used to authenticate the connector. This field is required if azure_authentication_type
is set to CONNECTION_STRING
. Learn more about authentication types here
azure_container_name
The name of the Azure Blob Storage container
file_name
The name of the file, without the file extension, that nuvo will send, e.g., "my_output"
file_extension
The file extension to use for the output file. Allowed file types are:
- XLSX
- XML
- CSV
- JSON
directory_path
The path in the Azure Blob Storage container where the input file is stored or where the output file should be stored
For AWS S3, (S)FTP, and Azure Blob Storage connectors
advanced_config
Add tags to specifically include or exclude files, and define how to manage processed input files after the pipeline has run
after_process_option
Define what to do with the processed input files after each pipeline run:
DELETE
: Delete the processed input file
RENAME
: Replace the current file name with the name defined in new_name
UNCHANGED
: Don’t do anything with the processed input file
inclusion_tag
The list of tags (as strings). Only files containing at least one inclusion tag will be processed. If no tags are defined, all files except those excluded by exclusion tags will be processed
exclusion_tag
The list of tags (as strings). Files containing any of the specified exclusion tags will be skipped and not processed. If no exclusion tags are defined and no inclusion tags are specified, all files will be processed
new_name
Define how processed input files should be renamed after the pipeline has run
permissions
Defines whether the connector should be available to all your sub-organizations or only for internal use
level
PUBLIC
: The connector can also be used by sub-organizations
PRIVATE
: The connector can only be used by users within your organization
Payload
{
"name": "string",
"type": "string",
"node_type": "string",
"configuration": {
// EMAIL
"recipients": ["string"],
"file_extension": "string",
"file_name": "string",
// (S)FTP
"host": "string",
"port": 2222,
"protocol": "string",
"username": "string",
"password": "string",
"secret_key": "string",
"directory_path": "string",
"file_extension": "string",
"file_name": "string",
// HTTP(S)
"url": "string",
"method": "string",
"headers": {},
"type": "string",
"authentication": {
"refresh_url": "string",
"headers": {},
"method": "string"
},
// AWS
"aws_access_key": "string",
"aws_secret_access_key": "string",
"aws_bucket_name": "string",
"aws_region": "string",
"directory_path": "string",
"file_extension": "string",
"file_name": "string",
// AZURE
"azure_authentication_type": "string",
"azure_account": "string",
"azure_sas": "string",
"azure_connection_string": "string",
"azure_container_name": "string",
"directory_path": "string",
"file_extension": "string",
"file_name": "string",
// For AWS, AZURE & (S)FTP
"advance_config": {
"inclusion_tag": ["string"],
"exclusion_tag": ["string"],
"after_process_option": "string",
"new_name": "string"
}
},
"permissions": {
"level": "string"
}
}
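An update request targets PUT /connector/{id} with the same payload shape. A standard-library sketch (the connector ID and name are placeholders, and authentication headers are omitted):

```python
import json
import urllib.request

def build_update_request(connector_id: str, payload: dict) -> urllib.request.Request:
    """Build (but do not send) a PUT /connector/{id} request."""
    return urllib.request.Request(
        url=f"https://api-gateway.getnuvo.com/dp/api/v1/connector/{connector_id}",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="PUT",
    )

# "conn_123" is a placeholder ID taken from a previous create response
req = build_update_request("conn_123", {"name": "Renamed connector"})
```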
Response
Attributes
id
The ID of the connector
name
The name of the connector
type
Defines whether the connector is the source of the data to be imported or the target where the data should be sent:
INPUT
: Input connector (source of the data to be imported)
OUTPUT
: Output connector (target where the data should be sent)
node_type
Defines the type of connector:
HTTP
: Receives or sends data from/to a specific URL (e.g., web server or REST API)
S3
: Receives or sends data from/to an AWS S3 bucket
FTP
: Transfers files directly over the network
EMAIL
: Receives files from an email account
AZURE
: Receives or sends data from/to an Azure Blob Storage container
configuration
Defines the specific setup of your connector based on the type
and node_type
E-mail connectors
recipients
The list of e-mail addresses to which the mapped and transformed data should be sent
file_name
The name of the file, without the file extension, that nuvo will send to the recipient(s), e.g., "my_output"
file_extension
The file extension to use for the output file. Allowed file types are:
- XLSX
- XML
- CSV
(S)FTP connectors
host
The host address, e.g., "sftp.domain.com"
port
The port number
protocol
Defines the type of server that you’re using:
FTP
SFTP
username
The (S)FTP username to log into the server
password
The (S)FTP password to log into the server. For SFTP connectors you can also use an SSH private key instead of a password in secret_key
secret_key
The SFTP SSH private key, which can be used as an alternative to a password
directory_path
The path to the directory where files are stored on the (S)FTP server
file_name
The name of the file, without the file extension, that nuvo will send, e.g., "my_output"
file_extension
The file extension to use for the output file. Allowed file types are:
- XLSX
- XML
- CSV
- JSON
HTTP(S) connectors
type
Defines whether an input connector proactively reads the data or if it reacts to the data sent by the user:
REACTIVE
: Data is sent to the connector during every pipeline execution
PROACTIVE
: Data is read automatically from the location during every pipeline execution
url
The endpoint that is called to receive data or where processed data is sent. For event-based input connectors, it’s the endpoint from which data is sent
method
REST API method, depending on the type of HTTP connector you’re creating. Currently, you can choose the following options:
GET
: The pipeline receives data from the specified endpoint via GET request
POST
: The pipeline sends/receives data to/from the specified endpoint via POST request
PUT
: The pipeline sends/receives data to/from the specified endpoint via PUT request
PATCH
: The pipeline sends/receives data to/from the specified endpoint via PATCH request
headers
The list of key-value pairs that define the headers for the request sent to receive data from the specified endpoint
authentication
Defines your refresh token endpoint to obtain a new access token after the previous one has expired. This mechanism is more secure and allows you to re-authenticate every time you need to obtain a new access token
headers
The list of key-value pairs used for requests to the defined refresh endpoint
refresh_url
The endpoint that is called to receive the authentication token
method
REST API method. Currently, you can choose the following options:
GET
: The refresh_url endpoint is called via GET request
POST
: The refresh_url endpoint is called via POST request
PUT
: The refresh_url endpoint is called via PUT request
PATCH
: The refresh_url endpoint is called via PATCH request
AWS S3 connectors
aws_access_key
The AWS access key
aws_secret_access_key
The AWS secret access key
aws_bucket_name
The name of the S3 bucket
aws_region
The name of the region where the bucket is hosted
directory_path
The path in the S3 bucket where the input file is stored or where the output file should be stored
file_name
The name of the file, without the file extension, that nuvo will send, e.g., "my_output"
file_extension
The file extension to use for the output file. Allowed file types are:
- XLSX
- XML
- CSV
- JSON
Azure Blob Storage connectors
azure_authentication_type
The authentication method used for the blob storage:
SAS
CONNECTION_STRING
azure_account
The name of the Azure account. This field is required if azure_authentication_type
is set to SAS
azure_sas
The SAS token necessary to authenticate the connector. This field is required if azure_authentication_type
is set to SAS
. Learn more about authentication types here
azure_connection_string
The string used to authenticate the connector. This field is required if azure_authentication_type
is set to CONNECTION_STRING
. Learn more about authentication types here
azure_container_name
The name of the Azure Blob Storage container
file_name
The name of the file, without the file extension, that nuvo will send, e.g., "my_output"
file_extension
The file extension to use for the output file. Allowed file types are:
- XLSX
- XML
- CSV
- JSON
directory_path
The path in the Azure Blob Storage container where the input file is stored or where the output file should be stored
For AWS S3, (S)FTP, and Azure Blob Storage connectors
advanced_config
Add tags to specifically include or exclude files, and define how to manage processed input files after the pipeline has run
after_process_option
Define what to do with the processed input files after each pipeline run:
DELETE
: Delete the processed input file
RENAME
: Replace the current file name with the name defined in new_name
UNCHANGED
: Don’t do anything with the processed input file
inclusion_tag
The list of tags (as strings). Only files containing at least one inclusion tag will be processed. If no tags are defined, all files except those excluded by exclusion tags will be processed
exclusion_tag
The list of tags (as strings). Files containing any of the specified exclusion tags will be skipped and not processed. If no exclusion tags are defined and no inclusion tags are specified, all files will be processed
new_name
Define how processed input files should be renamed after the pipeline has run
permissions
Defines whether the connector should be available to all your sub-organizations or only for internal use
level
PUBLIC
: The connector can also be used by sub-organizations
PRIVATE
: The connector can only be used by users within your organization
created_at
The date and time when the connector was first created
created_by
Information about who created the connector
id
The ID of the user or sub-organization who created the connector
name
The name of the user or sub-organization who created the connector
identifier
The identifier of the user or sub-organization who created the connector
type
Defines the type of user who created the connector:
USER
: A user of your organization
SUB_ORG
: A sub-organization that is part of your organization
updated_at
The date and time when the connector was last updated
updated_by
Information about who last updated the connector
id
The ID of the user or sub-organization who last updated the connector
name
The name of the user or sub-organization who last updated the connector
identifier
The identifier of the user or sub-organization who last updated the connector
type
Defines the type of user who last updated the connector:
USER
: A user of your organization
SUB_ORG
: A sub-organization that is part of your organization
Response
{
"data": {
"id": "string",
"name": "string",
"type": "string",
"node_type": "string",
"configuration": {
// EMAIL
"recipients": ["string"],
"file_extension": "string",
"file_name": "string",
// (S)FTP
"host": "string",
"port": 2222,
"protocol": "string",
"username": "string",
"password": "string",
"secret_key": "string",
"directory_path": "string",
"file_extension": "string",
"file_name": "string",
// HTTP(S)
"url": "string",
"method": "string",
"headers": {},
"type": "string",
"authentication": {
"refresh_url": "string",
"headers": {},
"method": "string"
},
// AWS
"aws_access_key": "string",
"aws_secret_access_key": "string",
"aws_bucket_name": "string",
"aws_region": "string",
"directory_path": "string",
"file_extension": "string",
"file_name": "string",
// AZURE
"azure_authentication_type": "string",
"azure_account": "string",
"azure_sas": "string",
"azure_connection_string": "string",
"azure_container_name": "string",
"directory_path": "string",
"file_extension": "string",
"file_name": "string",
// For AWS, AZURE & (S)FTP
"advance_config": {
"inclusion_tag": ["string"],
"exclusion_tag": ["string"],
"after_process_option": "string",
"new_name": "string"
}
},
"permissions": {
"level": "string"
},
"created_at": "string",
"created_by": {
"id": "string",
"name": "string",
"identifier": "string",
"type": "string"
},
"updated_at": "string",
"updated_by": {
"id": "string",
"name": "string",
"identifier": "string",
"type": "string"
}
}
}
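As a sketch, the Create call for an SFTP input connector could be assembled like this. The base URL follows the docs; the directory path and permissions level are illustrative placeholders, and the auth scheme for the request itself depends on your account setup:

```python
import json

# Base URL from the docs (add your account's auth headers when sending).
BASE_URL = "https://api-gateway.getnuvo.com/dp/api/v1"

def build_sftp_input_connector(name: str, host: str, username: str, password: str) -> dict:
    """Assemble the POST /connector/ payload for an SFTP input connector."""
    return {
        "name": name,
        "type": "INPUT",        # source of the data to be imported
        "node_type": "FTP",     # (S)FTP connectors use the FTP node type
        "configuration": {
            "host": host,
            "port": 22,
            "protocol": "SFTP",
            "username": username,
            "password": password,
            "directory_path": "/inbound",   # hypothetical path on the server
        },
        "permissions": {"level": "PRIVATE"},
    }

payload = build_sftp_input_connector("my-sftp-source", "sftp.domain.com", "user", "secret")
body = json.dumps(payload)  # request body for POST /connector/
```

For an SFTP connector authenticated with an SSH key, you would set secret_key in the configuration instead of password.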
Read (by ID)
Endpoint
GET /connector/{id}
Response
Attributes
id
The connector’s ID, which is, for example, set in pipeline templates to ensure that all pipelines created with this template use the same input/output connector
name
The name of the connector
type
Defines whether the connector is the source of the data to be imported or the target where the data should be sent:
INPUT
: Input connector (source of the data to be imported)
OUTPUT
: Output connector (target where the data should be sent)
node_type
Defines the type of connector:
HTTP
: Receives or sends data from/to a specific URL (e.g., web server or REST API)
S3
: Receives or sends data from/to an AWS S3 bucket
FTP
: Transfers files directly over the network
EMAIL
: Receives files from an email account
AZURE
: Receives or sends data from/to an Azure Blob Storage container
configuration
Defines the specific setup of your connector based on the type
and node_type
E-mail connectors
recipients
The list of e-mail addresses to which the mapped and transformed data should be sent
file_name
The name of the file, without the file extension, that nuvo will send to the recipient(s), e.g., "my_output"
file_extension
The file extension to use for the output file. Allowed file types are:
- XLSX
- XML
- CSV
(S)FTP connectors
host
The host address, e.g., "sftp.domain.com"
port
The port number
protocol
Defines the type of server that you’re using:
FTP
SFTP
username
The (S)FTP username to log into the server
password
The (S)FTP password to log into the server. For SFTP connectors you can also use an SSH private key instead of a password in secret_key
secret_key
The SFTP SSH private key, which can be used as an alternative to a password
directory_path
The path to the directory where files are stored on the (S)FTP server
file_name
The name of the file, without the file extension, that nuvo will send, e.g., "my_output"
file_extension
The file extension to use for the output file. Allowed file types are:
- XLSX
- XML
- CSV
- JSON
HTTP(S) connectors
type
Defines whether an input connector proactively reads the data or if it reacts to the data sent by the user:
REACTIVE
: Data is sent to the connector during every pipeline execution
PROACTIVE
: Data is read automatically from the location during every pipeline execution
url
The endpoint that is called to receive data or where processed data is sent. For event-based input connectors, it’s the endpoint from which data is sent
method
REST API method, depending on the type of HTTP connector you’re creating. Currently, you can choose the following options:
GET
: The pipeline receives data from the specified endpoint via GET request
POST
: The pipeline sends/receives data to/from the specified endpoint via POST request
PUT
: The pipeline sends/receives data to/from the specified endpoint via PUT request
PATCH
: The pipeline sends/receives data to/from the specified endpoint via PATCH request
headers
The list of key-value pairs that define the headers for the request sent to receive data from the specified endpoint
authentication
Defines your refresh token endpoint to obtain a new access token after the previous one has expired. This mechanism is more secure and allows you to re-authenticate every time you need to obtain a new access token
headers
The list of key-value pairs used for requests to the defined refresh endpoint
refresh_url
The endpoint that is called to receive the authentication token
method
REST API method. Currently, you can choose the following options:
GET
: The refresh_url endpoint is called via GET request
POST
: The refresh_url endpoint is called via POST request
PUT
: The refresh_url endpoint is called via PUT request
PATCH
: The refresh_url endpoint is called via PATCH request
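Putting the HTTP(S) attributes together, a configuration object with refresh-token authentication could look like the sketch below. All URLs and header values are placeholders, not part of the nuvo API itself:

```python
# Hypothetical `configuration` object for a REACTIVE HTTP connector that
# re-authenticates through a refresh endpoint before each run.
http_configuration = {
    "url": "https://example.com/api/orders",  # endpoint data is sent to / received from
    "method": "POST",
    "headers": {"Content-Type": "application/json"},
    "type": "REACTIVE",  # data is sent to the connector on every pipeline execution
    "authentication": {
        "refresh_url": "https://example.com/api/token",  # called to obtain a new access token
        "method": "POST",
        "headers": {"Content-Type": "application/x-www-form-urlencoded"},
    },
}
```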
AWS S3 connectors
account_id
The AWS account id
access_key
The AWS access key
bucket_name
The name of the S3 bucket
region
The name of the region where the bucket is hosted
directory_path
The path in the S3 bucket where the input file is stored or where the output file should be stored
file_name
The name of the file, without the file extension, that nuvo will send, e.g., "my_output"
file_extension
The file extension to use for the output file. Allowed file types are:
- XLSX
- XML
- CSV
- JSON
Azure Blob Storage connectors
azure_authentication_type
The authentication method used for the blob storage:
SAS
CONNECTION_STRING
azure_account
The name of the Azure account. This field is required if azure_authentication_type
is set to SAS
azure_sas
The SAS token necessary to authenticate the connector. This field is required if azure_authentication_type
is set to SAS
. Learn more about authentication types here
azure_connection_string
The string used to authenticate the connector. This field is required if azure_authentication_type
is set to CONNECTION_STRING
. Learn more about authentication types here
azure_container_name
The name of the Azure Blob Storage container
file_name
The name of the file, without the file extension, that nuvo will send, e.g., "my_output"
file_extension
The file extension to use for the output file. Allowed file types are:
- XLSX
- XML
- CSV
- JSON
directory_path
The path in the Azure Blob Storage container where the input file is stored or where the output file should be stored
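The two authentication types lead to two shapes of the Azure configuration object; the sketches below show one of each. Account names, tokens, container names, and paths are placeholders:

```python
# Hypothetical configuration using a SAS token: azure_account and azure_sas
# are required when azure_authentication_type is SAS.
azure_sas_config = {
    "azure_authentication_type": "SAS",
    "azure_account": "mystorageaccount",
    "azure_sas": "sv=2024-01-01&sig=PLACEHOLDER",
    "azure_container_name": "imports",
    "directory_path": "inbound/",
    "file_name": "my_output",
    "file_extension": "CSV",
}

# Hypothetical configuration using a connection string instead.
azure_cs_config = {
    "azure_authentication_type": "CONNECTION_STRING",
    "azure_connection_string": "DefaultEndpointsProtocol=https;AccountName=PLACEHOLDER;AccountKey=PLACEHOLDER",
    "azure_container_name": "imports",
    "directory_path": "inbound/",
    "file_name": "my_output",
    "file_extension": "CSV",
}
```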
For AWS S3, (S)FTP, and Azure Blob Storage connectors
advanced_config
Add tags to specifically include or exclude files, and define how to manage processed input files after the pipeline has run
after_process_option
Define what to do with the processed input files after each pipeline run:
DELETE
: Delete the processed input fileRENAME
: Replace the current file name with the name defined in new_name
UNCHANGED
: Don’t do anything with the processed input file
inclusion_tag
The list of tags (as strings). Only files containing at least one inclusion tag will be processed. If no tags are defined, all files except those excluded by exclusion tags will be processed
exclusion_tag
The list of tags (as strings). Files containing any of the specified exclusion tags will be skipped and not processed. If no exclusion tags are defined and no inclusion tags are specified, all files will be processed
new_name
Define how processed input files should be renamed after the pipeline has run
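Combining these options, the advanced configuration could look like the sketch below (the response examples spell the key advance_config; the tag values and new name are placeholders):

```python
# Hypothetical advanced configuration for a file-based connector: only files
# tagged "invoices" are processed, files tagged "draft" are skipped, and each
# processed input file is renamed afterwards.
advance_config = {
    "inclusion_tag": ["invoices"],
    "exclusion_tag": ["draft"],
    "after_process_option": "RENAME",
    "new_name": "processed",  # applied because after_process_option is RENAME
}
```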
permissions
Defines whether the connector should be available to all your sub-organizations or only for internal use
level
PUBLIC
: The connector can also be used by sub-organizations
PRIVATE
: The connector can only be used by users within your organization
created_at
The date and time when the connector was first created
created_by
Information about who created the connector
id
The ID of the user or sub-organization who created the connector
name
The name of the user or sub-organization who created the connector
identifier
The identifier of the user or sub-organization who created the connector
type
Defines the type of user who created the connector:
USER
: A user of your organization
SUB_ORG
: A sub-organization that is part of your organization
updated_at
The date and time when the connector was last updated
updated_by
Information about who last updated the connector
id
The ID of the user or sub-organization who last updated the connector
name
The name of the user or sub-organization who last updated the connector
identifier
The identifier of the user or sub-organization who last updated the connector
type
Defines the type of user who last updated the connector:
USER
: A user of your organization
SUB_ORG
: A sub-organization that is part of your organization
Response
{
"data": {
"id": "string",
"name": "string",
"type": "string",
"node_type": "string",
"configuration": {
// EMAIL
"recipients": ["string"],
"file_extension": "string",
"file_name": "string",
// (S)FTP
"host": "string",
"port": 2222,
"protocol": "string",
"username": "string",
"password": "string",
"secret_key": "string",
"directory_path": "string",
"file_extension": "string",
"file_name": "string",
// HTTP(S)
"url": "string",
"method": "string",
"headers": {},
"type": "string",
"authentication": {
"refresh_url": "string",
"headers": {},
"method": "string"
},
// AWS
"aws_access_key": "string",
"aws_secret_access_key": "string",
"aws_bucket_name": "string",
"aws_region": "string",
"directory_path": "string",
"file_extension": "string",
"file_name": "string",
// AZURE
"azure_authentication_type": "string",
"azure_account": "string",
"azure_sas": "string",
"azure_connection_string": "string",
"azure_container_name": "string",
"directory_path": "string",
"file_extension": "string",
"file_name": "string",
// For AWS, AZURE & (S)FTP
"advance_config": {
"inclusion_tag": ["string"],
"exclusion_tag": ["string"],
"after_process_option": "string",
"new_name": "string"
}
},
"permissions": {
"level": "string"
},
"created_at": "string",
"created_by": {
"id": "string",
"name": "string",
"identifier": "string",
"type": "string"
},
"updated_at": "string",
"updated_by": {
"id": "string",
"name": "string",
"identifier": "string",
"type": "string"
}
}
}
Read (all)
To further refine the response, you can use query parameters like sort, filters, pagination, and options. Look at a more detailed explanation here.
Endpoint
GET /connector/
Response
Attributes
id
The ID of the connector
name
The name of the connector
type
Defines whether the connector is the source of the data to be imported or the target where the data should be sent:
INPUT
: Input connector (source of the data to be imported)
OUTPUT
: Output connector (target where the data should be sent)
node_type
Defines the type of connector:
HTTP
: Receives or sends data from/to a specific URL (e.g., web server or REST API)
S3
: Receives or sends data from/to an AWS S3 bucket
FTP
: Transfers files directly over the network
EMAIL
: Receives files from an email account
AZURE
: Receives or sends data from/to an Azure Blob Storage container
permissions
Defines whether the connector should be available to all your sub-organizations or only for internal use
level
PUBLIC
: The connector can also be used by sub-organizations
PRIVATE
: The connector can only be used by users within your organization
created_at
The date and time when the connector was first created
created_by
Information about who created the connector
id
The ID of the user or sub-organization who created the connector
name
The name of the user or sub-organization who created the connector
identifier
The identifier of the user or sub-organization who created the connector
type
Defines the type of user who created the connector:
USER
: A user of your organization
SUB_ORG
: A sub-organization that is part of your organization
updated_at
The date and time when the connector was last updated
updated_by
Information about who last updated the connector
id
The ID of the user or sub-organization who last updated the connector
name
The name of the user or sub-organization who last updated the connector
identifier
The identifier of the user or sub-organization who last updated the connector
type
Defines the type of user who last updated the connector:
USER
: A user of your organization
SUB_ORG
: A sub-organization that is part of your organization
pagination
An object containing metadata about the result
total
The number of entries in the data array
offset
The offset set in the request parameters
limit
The limit set in the request parameters
Response
{
"data": [
{
"id": "string",
"name": "string",
"type": "string",
"node_type": "string",
"permissions": {
"level": "string"
},
"created_at": "string",
"created_by": {
"id": "string",
"name": "string",
"identifier": "string",
"type": "string"
},
"updated_at": "string",
"updated_by": {
"id": "string",
"name": "string",
"identifier": "string",
"type": "string"
}
}
],
"pagination": {
"total": 0,
"offset": 0,
"limit": 0
}
}
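As a sketch, the list URL with pagination parameters could be built like this. The offset and limit names mirror the fields echoed back in the response's pagination object; the exact syntax for sort, filters, and options follows the query-parameter reference linked above:

```python
from urllib.parse import urlencode

# Base URL from the docs.
BASE_URL = "https://api-gateway.getnuvo.com/dp/api/v1"

def list_connectors_url(offset: int = 0, limit: int = 20) -> str:
    """Build the GET /connector/ URL with pagination query parameters."""
    return f"{BASE_URL}/connector/?{urlencode({'offset': offset, 'limit': limit})}"
```

For example, list_connectors_url(40, 20) requests the third page of 20 results; the response's pagination object echoes the offset and limit back alongside the total count.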
Delete
Endpoint
DELETE /connector/{id}
Attributes
message
Message confirming the deletion of the connector or providing an error message
Response
{
"message": "string"
}
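A minimal sketch of the delete call, assuming a bearer-token Authorization header (the actual auth scheme depends on your account setup; the connector ID is a placeholder):

```python
import urllib.request

# Base URL from the docs.
BASE_URL = "https://api-gateway.getnuvo.com/dp/api/v1"

def build_delete_request(connector_id: str, api_key: str) -> urllib.request.Request:
    """Build (but do not send) the DELETE /connector/{id} request."""
    return urllib.request.Request(
        f"{BASE_URL}/connector/{connector_id}",
        method="DELETE",
        headers={"Authorization": f"Bearer {api_key}"},
    )

req = build_delete_request("conn_123", "YOUR_API_KEY")
# urllib.request.urlopen(req) would send it; the JSON body of the response
# carries the confirmation or error message described above.
```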