About Logs
The Logs page displays logs generated by Robots in all folders the user has access to, including logs generated for jobs started through remote debugging sessions.
To access it, navigate to the Automations tab in Orchestrator from a folder context, and select Logs from the displayed options.
The table below contains field descriptions for the Logs page.
Field | Description
---|---
Time | The timestamp when the log was registered. Note: You can sort and filter the log list by Time.
Level | The severity level at which the message was logged. The following options are available: Trace, Debug, Info, Warn, Error, and Fatal. Note: You can sort and filter the log list by Level.
Process | The name of the process that generated a given log message.
Hostname | The name of the workstation used for the process execution.
Host identity | The identity under which the execution takes place. The following values are possible: Note: For Robots older than 2021.10, the host identity is populated dynamically according to the account settings made in Orchestrator. Changing the domain\username for the account used to execute a job changes the host identity as well.
Message | The logged message. This can also be a message logged through the Log Message activity in Studio. Keep in mind that the content of this column is displayed in the language of the Robot, regardless of the language chosen for Orchestrator.
To view all logs generated by a Robot for an indicated job, navigate to the Jobs page.
To filter logs by the name of the host machine they have been generated on, use the Machine filter on the Logs page.
The Machine filter works retroactively for logs stored in Elasticsearch; for logs stored in the database, it applies only to new log entries.
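Logs can also be retrieved programmatically through the Orchestrator OData API. The sketch below only builds the query URL; the endpoint path and property names used here (`RobotLogs`, `MachineName`, `Level`, `TimeStamp`) are assumptions based on common Orchestrator OData conventions, so verify them against your instance's API reference before relying on them.

```python
from urllib.parse import urlencode

# Hypothetical Orchestrator host; replace with your own instance URL.
BASE = "https://orchestrator.example.com/odata/RobotLogs"

# OData query options: filter by host machine and severity level,
# newest entries first, capped at 100 results.
params = {
    "$filter": "MachineName eq 'WS-01' and Level eq 'Error'",
    "$orderby": "TimeStamp desc",
    "$top": "100",
}
url = BASE + "?" + urlencode(params)
print(url)
```

The resulting URL would be sent with a bearer token in the `Authorization` header; `urlencode` percent-escapes the `$` prefixes and quotes for you.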
Messages are logged on the following levels: Trace, Debug, Info, Warn, Error and Fatal.
Custom messages can also be sent to this page from Studio, with the Log Message activity. The messages can be logged at all the levels described above and should be used for diagnostic purposes.
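Because the six levels form an ordered hierarchy, messages below a chosen threshold are simply filtered out. A minimal sketch using Python's standard logging module as an analogy (the level mapping is an illustration, not the Robot's internal representation):

```python
import io
import logging

# Illustrative mapping of the six Robot log levels onto Python's
# standard logging severities (assumption, for demonstration only).
LEVELS = {
    "Trace": 5,
    "Debug": logging.DEBUG,
    "Info": logging.INFO,
    "Warn": logging.WARNING,
    "Error": logging.ERROR,
    "Fatal": logging.CRITICAL,
}

stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter("%(levelname)s: %(message)s"))
logger = logging.getLogger("robot-demo")
logger.addHandler(handler)
logger.setLevel(LEVELS["Warn"])  # suppress anything below Warn

logger.log(LEVELS["Info"], "transaction item processed")    # filtered out
logger.log(LEVELS["Fatal"], "unrecoverable selector error") # emitted

output = stream.getvalue()
print(output)
```

Only the Fatal message survives the threshold, which mirrors how a Level filter on the Logs page narrows what you see.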
For example, you can log a custom message at the Fatal severity level.
All logs can be exported to a .csv file by clicking the Export button. The filters applied to the page are taken into account when this file is generated. For example, if you filter the view to show only logs from the last 30 days with an Info severity level, only the entries that meet these criteria are downloaded.
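An exported file can be post-processed with any CSV tool. The snippet below uses a tiny inline stand-in for an export; the header names mirror the field table above but are an assumption, not the exact export schema.

```python
import csv
import io

# Stand-in for an exported log file (header names are hypothetical).
exported = """Time,Level,Process,Hostname,Message
2024-05-01 10:00:00.123,Info,InvoiceBot,WS-01,execution started
2024-05-01 10:00:01.456,Error,InvoiceBot,WS-01,selector not found
2024-05-01 10:00:02.789,Info,InvoiceBot,WS-01,execution ended
"""

# Keep only the Info-level rows, as the Level filter on the page would.
reader = csv.DictReader(io.StringIO(exported))
info_rows = [row for row in reader if row["Level"] == "Info"]
print(len(info_rows))  # 2
```

With a real export you would pass the file path to `open()` instead of wrapping a string in `io.StringIO`.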
To ensure the best performance, the exported entries are not guaranteed to be in reverse chronological order.
Logs may not be in the proper order only in the following scenario:

- There are two or more robot log entries with almost equal timestamps: they are equal up to the millisecond (the time expressed as `yyyy-MM-dd HH:mm:ss.fff` is the same), but differ in the millisecond's subunits (the last four digits in `yyyy-MM-dd HH:mm:ss.fffffff` are different).
- The logs are viewed in Orchestrator with the default sort order in the grid (sorted by Time, descending).

However, this does not affect the database or the exported .csv file.
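The sub-millisecond tie can be reproduced with a short sketch: two timestamps that share the same millisecond-truncated sort key are still distinct values, so a sort on the truncated key cannot guarantee their relative order.

```python
from datetime import datetime

# Two log entries whose timestamps agree down to the millisecond (.fff)
# but differ in the sub-millisecond digits kept by the database (.fffffff).
t1 = datetime(2024, 5, 1, 10, 0, 0, 123400)
t2 = datetime(2024, 5, 1, 10, 0, 0, 123900)

def grid_key(t):
    # The grid's sort key keeps only millisecond precision, so both
    # entries collapse to the same key.
    return t.strftime("%Y-%m-%d %H:%M:%S.") + f"{t.microsecond // 1000:03d}"

print(grid_key(t1) == grid_key(t2))  # True: identical up to the millisecond
print(t1 == t2)                      # False: they differ below it
```

Any stable or unstable sort on `grid_key` is free to emit these two entries in either order, which is exactly the grid behavior described above.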