Data-Driven Testing
Data-driven testing lets you validate RPA workflows and applications across multiple scenarios. To ensure your test cases use the prepared test data, debug and run data-driven test cases only from the Test Explorer panel. Visit Working With Test Explorer to learn about the actions available in that panel.
Data-driven testing requires Orchestrator version 2022.4 or later.
When you configure data sources for your data variations, you can choose from the following options:
- File, if you want to use Excel or CSV files.
- Generate with Autopilot™, for test data that you can refine using AI-powered capabilities.
- Data Service, if you use Data Service entities (Automation Cloud only).
- Existing data, for any type of data already stored in your project's Test Data folder.
- Test Data Queue, if you have configured a JSON schema and added a test data queue in Orchestrator (see the sketch after this list).
- Auto Generate, for test data that is automatically generated to cover as many execution paths as possible.
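To give a sense of the Test Data Queue option, the sketch below shows one plausible JSON schema for a queue. The property names (username, amount, expectedResult) are assumptions for this example, not fields required by Orchestrator; define the properties that match your own test data.

```json
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "type": "object",
  "properties": {
    "username": { "type": "string" },
    "amount": { "type": "number" },
    "expectedResult": { "type": "string" }
  },
  "required": ["username", "amount"]
}
```

Each queue item that conforms to the schema typically supplies one data variation when the test case runs.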
The following table lists the actions that you can take for test cases that contain data variation.
| Action | Description | Procedure |
| --- | --- | --- |
| Update test data | Update the imported test data by choosing whether to create a new file or overwrite the existing data. Use this after you change the Excel file; any new columns are added as arguments to the test case. For Data Service entities, you can update the test case using a new filter through the Query Builder. | 4. (Optional) Select Update all test cases using the same test data to update all test cases that use the test data you are editing. 5. Click Import to save the changes. Note: You can also perform this action in Test Explorer by right-clicking a file and choosing Update Test Data. |
| Remove test data | Remove test data from the test case. | Note: You can also perform this action in Test Explorer by right-clicking a file and choosing Remove Test Data. |
| Modify test data JSON file | Update test data directly in the JSON file. The file is created after you add test data to the test case (see the illustrative sketch after this table). Important: Available only for test data files. | |
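As an illustration of what the Modify test data JSON file action edits, the sketch below shows one plausible shape for the generated file, assuming a test case whose arguments (username, amount) were imported from a spreadsheet. The exact structure and field names can vary by Studio version, so treat this as an illustrative sketch rather than the definitive format.

```json
[
  { "username": "alice", "amount": 125.50 },
  { "username": "bob", "amount": 0 }
]
```

Each object corresponds to one data variation; editing a value here changes the data used the next time the test case runs with that variation.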