Extracting data from receipts
The aim of this page is to help first-time users get familiar with UiPath® Document Understanding™.
For scalable production deployments, we strongly recommend using the Document Understanding Process available in UiPath® Studio under the Templates section.
This quickstart shows you how to extract data from receipts using the out-of-the-box Receipts ML model with its corresponding public endpoint.
Validation can be done either by presenting the Validation Station or by using the Validation Action in Action Center. Both options are described in the following sections.
In this section, we are going to validate the extraction results using Validation Station.
To create a basic workflow using the Receipts ML model, follow the steps below.
- Create a blank process
- Install the required activities packages
- Create a taxonomy
- Digitize the document
- Extract the data using the Receipts ML model
- Validate the results using Validation Station
- Export the extraction results
Now, let's go through each step in detail.
- Launch UiPath Studio.
- In the HOME backstage view, click Process to create a new project.
- The New Blank Process window is displayed. In this window, enter a name for the new project. If you want, you can also add a description to sort through your projects more easily.
- Click Create. The new project is opened in Studio.
From the Manage Packages button in the ribbon, besides the core activities packages (UiPath.Excel.Activities, UiPath.Mail.Activities, UiPath.System.Activities, UiPath.UIAutomation.Activities) that are added to the project by default, install the following activities packages:
- UiPath.IntelligentOCR.Activities
- UiPath.DocumentUnderstanding.ML.Activities
- UiPath.OCR.Activities
Once the activities packages are installed, identify the fields you need to extract. The Receipts ML model supports data extraction for the following fields (a plain data-model sketch follows the list):
- name - Text
- vendor-addr - Address
- total - Number
- date - Date
- phone - Text
- currency - Text
- expense-type - Text
- items - Table, with the following columns:
  - description - Text
  - line-amount - Number
  - unit-price - Number
  - quantity - Number
- description - Text
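Before creating the taxonomy, it can help to picture these fields as a plain data model. The sketch below is illustrative only: the class and property names are hypothetical and are not produced by the Receipts ML package, but this is the shape a downstream process might map validated results into.

```csharp
// Hypothetical C# model mirroring the Receipts ML fields (illustrative only).
using System;
using System.Collections.Generic;

public class ReceiptLineItem
{
    public string Description { get; set; }   // items.description
    public decimal LineAmount { get; set; }   // items.line-amount
    public decimal UnitPrice { get; set; }    // items.unit-price
    public decimal Quantity { get; set; }     // items.quantity
}

public class Receipt
{
    public string Name { get; set; }            // name
    public string VendorAddress { get; set; }   // vendor-addr
    public decimal Total { get; set; }          // total
    public DateTime Date { get; set; }          // date
    public string Phone { get; set; }           // phone
    public string Currency { get; set; }        // currency
    public string ExpenseType { get; set; }     // expense-type
    public string Description { get; set; }     // description
    public List<ReceiptLineItem> Items { get; } = new List<ReceiptLineItem>(); // items
}
```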
- Open Taxonomy Manager and create a group named Semi Structured Documents, a category named Finance, and a document type named Receipts. Create the fields listed above with user-friendly names and their respective data types.
- In the Main.xaml file, add a Load Taxonomy activity and create a variable for the taxonomy output.
- Add a Digitize Document activity with UiPath Document OCR. Provide the input property Document Path and create output variables for Document Text and Document Object Model.
- Remember to add the Document Understanding API Key in the UiPath Document OCR activity.
- Add a Data Extraction Scope activity and fill in the properties.
- Drag and drop a Machine Learning Extractor activity. A pop-up with three input parameters, Endpoint, ML Skill, and ApiKey, is displayed on the screen.
- Fill in the Endpoint parameter with the Receipts Public Endpoint, namely https://du.uipath.com/ie/receipts, and provide the Document Understanding API key.
- Click on Get Capabilities.
- The next step is to configure the extractor. Configuring the extractor means mapping the fields that you created in Taxonomy Manager to the fields available in the ML model, as shown in the image below:
- To use the Machine Learning Extractor with an ML Skill, choose the ML Skill from the dropdown and configure the extractor.
- You must have your robot assistant connected to the same tenant as your ML Skill.
To check the results through Validation Station, drag and drop the Present Validation Station activity and provide the input details.
To export the extraction results, add an Export Extraction Results activity. Its output is a DataSet that contains multiple tables, which can then be written to an Excel file or used directly in a downstream process, as sketched below.
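As a minimal sketch of what that DataSet looks like downstream, assuming it has been assigned to a variable named extractionResults and is consumed inside an Invoke Code activity or any other .NET context, the tables and rows can be enumerated like this:

```csharp
// Minimal sketch: enumerate the tables, rows, and columns of the exported DataSet.
// "extractionResults" is an assumed variable name for the Export Extraction
// Results output; adapt it to your own workflow.
using System;
using System.Data;

static void DumpExtractionResults(DataSet extractionResults)
{
    foreach (DataTable table in extractionResults.Tables)
    {
        Console.WriteLine($"Table: {table.TableName}");
        foreach (DataRow row in table.Rows)
        {
            foreach (DataColumn column in table.Columns)
            {
                Console.WriteLine($"  {column.ColumnName}: {row[column]}");
            }
        }
    }
}
```

Each table can also be passed to a Write Range activity to produce one Excel sheet per extracted table.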
Download this sample project using this link.
The example contains two workflows:
- Main.xaml - in this workflow, the extraction results are validated using Validation Station, as described in the section above
- Main - Unattended.xaml - in this workflow, the extraction results are validated using a Validation Action, as described in the following section
Now, let’s see how to use an Action Center Validation Action instead of presenting the Validation Station.
When an automation includes decisions that a human should make, such as approvals, escalations, and exceptions, UiPath Action Center makes it easy and efficient to hand off the process from robot to human, and back again.
Document Understanding Action Center activities come with the UiPath.IntelligentOCR.Activities package and the UiPath.Persistence.Activities package. Don't forget to enable Persistence activities from the General Settings in UiPath Studio.
Productivity can be increased by adding an orchestration process that creates document validation actions in Action Center, available in both on-premises Orchestrator and Automation Cloud. This removes the need to store documents locally, to have a robot installed on each human user's machine, or to have the robot wait for human users to finish validation.
More details here.
Repeat steps 1 to 5 described in the above section.
Then, instead of using the Present Validation Station activity, use the Create Document Validation Action and Wait for Document Validation Action and Resume activities.
The image below shows the Create Document Validation Action activity and its properties.
This creates a document validation action in Action Center. The output of the Create Document Validation Action activity can then be used with the Wait for Document Validation Action and Resume activity to suspend and resume orchestration workflows upon human action completion in Action Center.