Bulk Uploading Queue Items Using a CSV File
This feature enables you to bulk upload queue items into a specific queue in Orchestrator using a CSV file. To do that, first upload your file into your Orchestrator instance for a specific queue. After the file is successfully processed, the contained items are uploaded into the queue according to the selected strategy. Please note that the file must be populated beforehand using predefined formats so that the upload operation is successful.
There is a series of predefined column headers that you can use when building your CSV file, but you can also use other custom headers. After successfully uploading the CSV file, information contained in columns with predefined headers is mapped to columns as found in Orchestrator. Information contained in columns with custom headers is placed under the Specific Data section of the corresponding item in Orchestrator.
The upload is limited to 15,000 items per file.
You can lower the limit using the Upload.Queues.MaxNumberOfItems app setting.
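For on-premises installations, this app setting would typically be defined in the appSettings section of the Orchestrator configuration file; the snippet below is only a sketch assuming the standard .NET appSettings format, with 5000 as an illustrative value:

```xml
<!-- Illustrative only: assumes the standard .NET appSettings mechanism used by on-premises Orchestrator -->
<appSettings>
  <add key="Upload.Queues.MaxNumberOfItems" value="5000" />
</appSettings>
```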
File Column Header | Orchestrator Field
---|---
Reference - Mandatory for unique reference queues. It supports all special characters, except double quotes ("). | Reference - Note: References are not compatible with v2016.2 or lower versions of Orchestrator or Robot.
Deadline - If used, it must be populated with a date in one of the supported date formats. | Deadline
Postpone - If used, it must be populated with a date in one of the supported date formats. | Postpone
Priority - If used, it must be populated with one of the supported priority values. If the priority is not specified in the file, the items are uploaded with a high priority, by default. | Priority - Mapped to the corresponding priority values in Orchestrator.
[Custom_Name] - The name can only be made of alphanumeric characters: 0-9, a-z, or underscores. | Specific Data - Mapped in JSON format. For example: On-Us Check: 5540
Please note that you cannot have empty column headers in your file. Predefined headers are also case-sensitive: a lowercase reference header, for example, is mapped to the Specific Data section instead of the Reference column.
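As an illustration of the Specific Data mapping mentioned in the table, a custom column named On-Us Check containing the value 5540 would end up in the item's Specific Data roughly as follows (a sketch; the exact rendering in Orchestrator may differ, and the value is shown here as a string):

```json
{
  "On-Us Check": "5540"
}
```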
There are two strategies for handling the upload:
- ProcessAllIndependently - processes all items individually and adds them to the queue, if successful. If not, it returns a list of those whose processing failed. You have the option to download the list into a CSV file populated with the same fields as in the initially uploaded file.
- AllOrNothing - adds the items only if all of them are successfully processed. Otherwise, none are added.
When uploading queue items to queues with schema definitions, there are times when you need Orchestrator to interpret string characters as integers or Booleans in order to match the schema definition. Dynamic typing is an option that allows you to control how Orchestrator parses string values inside the .csv files used to upload queue items.
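As an illustration (the Amount and Approved column names below are hypothetical, not part of the product documentation), the same CSV cells could be parsed into an item's Specific Data differently depending on this option:

```
Without dynamic typing:  { "Amount": "5540", "Approved": "true" }
With dynamic typing:     { "Amount": 5540,   "Approved": true }
```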
This section describes how to add items to a queue by uploading a CSV file from Orchestrator.
- In the Queues page, click the corresponding More Actions button for the desired queue, and then Upload Items. The Upload Items window is displayed.
- Click Browse and select the desired CSV file.
- Click Open. If your file is compliant with the supported formatting rules, the upload operation is successful.
- Select the upload strategy:
  - Process all independently - processes all items individually and adds them to the queue, if successful. If not, it returns a list of those whose processing failed. You have the option to download the list into a CSV file populated with the same fields as in the initially uploaded file.
  - All or nothing - adds the items only if all of them are successfully processed. Otherwise, none are added.
- Select the Dynamic typing checkbox if you want Orchestrator to interpret string characters in the CSV file as integers or Booleans. This is recommended when uploading queue items to queues with schema definitions, where values must match the schema definition. Leave it unselected if you want strings to be interpreted as strings.
- Click Upload. The items are added to your queue according to the previously selected strategy.
- If you used the Process All Independently option and there are unprocessed items, the Upload button becomes Download and enables you to download a CSV file containing the unprocessed items.
Let's say you upload the content of the following CSV file into a queue. An easy way to create such a file is to populate the data in an Excel file and save it as a CSV file:
Or you can download a CSV file with all the pre-filled column headers and customize it to your needs.
Notice that this file contains three predefined columns with the same names as in Orchestrator (Reference, Deadline, and Priority, highlighted in green) and two custom columns (Customer and Color, highlighted in red).
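As a sketch of what such a file could contain (all values below are illustrative; the dates are shown in ISO format and the priorities as High/Low as an assumption):

```csv
Reference,Deadline,Priority,Customer,Color
INV-001,2024-03-15,High,Contoso,Blue
INV-002,2024-03-20,Low,Fabrikam,Red
```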
- Click Upload Items for the desired queue, and select the CSV file. Orchestrator parses the file to confirm it meets formatting rules.
- Select the desired upload strategy and click Upload to finish the process.
- Select More Actions > View Transactions to view the uploaded queue items.
The Transactions page displays each item uploaded from your file. The information contained within the file's predefined columns is mapped to the columns with the same name in Orchestrator (Reference, Deadline, and Priority, highlighted in green):
The information in the custom columns is mapped in JSON format to the Specific Data section of each item. Click View Details for the desired item to see it:
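Based on the illustrative values above, the Specific Data of the first item would look roughly like this (a sketch; the exact JSON rendering in Orchestrator may differ):

```json
{
  "Customer": "Contoso",
  "Color": "Blue"
}
```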