Welcome to the third lesson on Web automation using Leapwork!
In the previous lessons, we looked at how to create a basic flow and how we could turn part of the flow into a reusable, parameterized sub-flow.
In this lesson, we will learn how to work with data from external data sources and make your flow data-driven in Leapwork.
Data-driven automation is an automation technique in which flows are designed to be executed with multiple sets of test data. This approach is used to improve the effectiveness and efficiency of automation by allowing a single flow to be executed with multiple inputs, thereby increasing automation coverage.
Data-driven testing is particularly useful in cases where the same functionality needs to be tested with different sets of data. For example, in a web application, a search functionality can be tested with different search queries, or a login functionality can be tested with different sets of username and password combinations.
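Since Leapwork is a visual tool, there is no code to show for the flow itself, but the underlying idea can be sketched in a few lines of Python: one fixed procedure executed against many data rows. Everything here (the `check_login` function, the accounts, the data) is invented purely to illustrate the concept.

```python
# A minimal sketch of data-driven testing: the same "flow" (here, a
# plain function) is executed once per data row. All names and data
# below are illustrative, not part of Leapwork.

def check_login(email: str, password: str) -> bool:
    # Stand-in for a real Sign In flow; accepts one known account.
    return (email, password) == ("user1@example.com", "pw1")

# Each row: (email, password, expected outcome)
test_data = [
    ("user1@example.com", "pw1", True),
    ("user2@example.com", "wrong", False),
    ("", "", False),
]

def run_data_driven(rows):
    # Execute the same check for every data row and record pass/fail.
    results = []
    for email, password, expected in rows:
        results.append(check_login(email, password) == expected)
    return results
```

One function, three scenarios: the coverage grows with the data, not with the number of flows.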
We will continue to use the flow we created in lesson 2.
Let us learn how to perform data-driven automation using Leapwork!
If we look at the flow we built in lesson 2, it has two Set Text blocks containing hard-coded values for the email and password.
On top of this, we have a Find Web Element block that looks for a specific name to verify that the Sign In was successful.
We can expect that changing the email and password will change the name of the signed-in user as well. So, if the email and password come from an external data source, the display name used for verification should come from Excel too.
In this lesson, we will use Microsoft Excel as the external data source. However, Leapwork has building blocks to handle more or less all types of external data, including databases, web services, PowerShell scripts, batch files, and more.
You can find more about this in the advanced section of our Learning Centre.
To read the data from an Excel sheet, we add a Read Excel block before the Sign In block.
In the Read Excel building block, we can choose to either upload an Excel file or point to an Excel sheet on the local computer or on a network path.
Click on the "File path" button. I have prepared a file in advance, and once the path is specified, I click "Define" to specify the data range we will use to drive the flow.
We have 3 columns in the sheet – "email", "password" and "display name" and 3 data rows.
I'll select all of them and check the "Use the first row as header" option.
When we click Save, we can see that the building block understands what fields or columns are available for the individual data rows.
All the headers in the Excel sheet are now exposed as individual properties on the building block, allowing us to easily use them as input in the flow.
We can now connect the email property on the Excel block to the email property on the sub-flow and do the same with the password field.
When the flow is running, the Excel block will read the data, and when the login block is about to execute it will get the data from the Excel sheet field by field.
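Conceptually, the Read Excel block with "use the first row as header" behaves like reading a table into rows keyed by column name. Here is a standard-library Python sketch of that idea, using CSV in place of Excel and made-up sample data that mirrors the lesson's three columns:

```python
import csv
import io

# CSV stands in for the Excel sheet in this sketch; the three columns
# and three data rows mirror the lesson, but the values are invented.
sheet = io.StringIO(
    "email,password,display name\n"
    "user1@example.com,pw1,User One\n"
    "user2@example.com,pw2,User Two\n"
    "user3@example.com,pw3,User Three\n"
)

# DictReader treats the first row as headers, so each data row becomes
# a mapping from column name to value -- analogous to the headers being
# exposed as individual properties on the building block.
rows = list(csv.DictReader(sheet))

first = rows[0]
# first["email"], first["password"], and first["display name"] can now
# be wired to the corresponding inputs of the Sign In sub-flow.
```

The key point is the same as in the tool: once headers are recognized, every column is addressable by name rather than by position.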
Very easy to configure and very easy to understand just by looking at the flow.
One more thing we can configure on the Excel block is the way the rows are provided.
The default method is to just read the first row in the selected data range.
The second option is to read a specific row, by specifying the row index.
This means you can keep all the available credentials in one Excel sheet, and then just specify a row number to use a specific set of credentials to Sign in.
The third option is Iterate, which runs through all the data rows in the selected data range and triggers the top connector for each data row.
The flow attached to the top connector will be executed for each data row in the data range, and once all data rows have been processed, the Completed connector is triggered, allowing the flow to continue.
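The three row-provision modes can be sketched as follows. Note that the function name, signature, and sample data are invented for illustration; this is not Leapwork's actual API, just the selection logic the three options describe.

```python
def provide_rows(rows, method="first", index=None):
    # Hypothetical sketch of the three row-provision modes.
    if method == "first":        # default: only the first data row
        return [rows[0]]
    if method == "index":        # a specific row, by 1-based row number
        return [rows[index - 1]]
    if method == "iterate":      # every data row, one trigger per row
        return list(rows)
    raise ValueError(f"unknown method: {method}")

# Invented sample data mirroring the lesson's credential sheet.
credentials = [
    {"email": "user1@example.com", "password": "pw1"},
    {"email": "user2@example.com", "password": "pw2"},
    {"email": "user3@example.com", "password": "pw3"},
]
```

For example, `provide_rows(credentials, "index", index=2)` corresponds to keeping all credentials in one sheet and picking row 2 to sign in, while `"iterate"` corresponds to running the attached flow once per row before the Completed connector fires.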
An example of using Iterate is generating new leases under Lease Management: the flow then continues from the Completed trigger and verifies the generated leases by moving to All Leases.
The last thing to do in this flow is to make the verification that the Sign In went well as dynamic as the credentials provided.
This brings us to the concept of locator strategies.
When we capture a web element, Leapwork generates a so-called strategy to find the captured element.
The strategy is based on one or more properties that uniquely identify the element when the flow runs. Examples are the ID of a field, the text inside a text box, the destination of a link, or a combination of these properties.
We can access and modify the strategies by clicking on the captured element and selecting "Edit element". This opens the strategy editor.
On the left we have a list of strategies.
When we capture an element, Leapwork creates several different strategies that all return the same element but do so in different ways.
On the right, we can see the actual element and the details for selecting the element.
The first strategy is based on selecting the SPAN tag on the Sign In page of D365.
The second strategy is also based on the SPAN. These are static strategies, but we need a strategy that matches on specific text, because that is the one we can make dynamic. Looking through the other strategies, we find one where the SPAN contains the text. We can use this strategy for the dynamic verification of the Sign In.
Instead of looking for hard-coded text inside the span, we can add a dynamic field allowing us to inject the text that we want as part of the strategy.
To add the dynamic field, we right-click on the text area, select "Insert token", and then select the "Add new field" option. The field is added.
Clicking on Save means that the selected locator strategy will be used and that the dynamic field is added as a property to the building block holding the element.
With the dynamic property available, it is easy to wire up the last field from the Excel sheet, making the automation flow entirely data-driven.
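Text-based locator strategies of this kind are typically expressed as selectors such as XPath. A hedged Python sketch of what injecting a dynamic value into a "SPAN contains text" strategy amounts to (the function and the selector shape are assumptions for illustration, not Leapwork's internal format):

```python
def span_contains_xpath(expected_text: str) -> str:
    # Build a "SPAN contains text" locator with the hard-coded text
    # replaced by an injected value -- analogous to the dynamic field
    # added in the strategy editor. The XPath shape is an assumption.
    return f"//span[contains(text(), '{expected_text}')]"

# The "display name" column from the Excel sheet can now drive the
# locator instead of a hard-coded name:
locator = span_contains_xpath("User Two")
```

Each data row thus produces its own verification locator, which is exactly what makes the Sign In check dynamic.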
Let us try running it with row two: I will select "Row index" as the method and use a Set Number block to specify that we want to use row number 2.
Let’s run the flow.
As we can see in the activity log, row number 2 was used, and the flow ended successfully.
In this lesson, we saw how to use Excel as a data source for a flow, and how to parameterize a sub-flow, making it even more reusable.
We ended by having a look at the strategy editor, which is used for selecting and modifying the selector strategies for captured web elements.
Before we leave you to perform a practice exercise on your application, here are some reasons why we suggest you use data-driven automation while designing flows in Leapwork:
Increased Coverage:
With data-driven automation, you can execute a single flow with multiple sets of test data. This means that you can test different scenarios and combinations of data, which increases the overall coverage.
Reduced Maintenance Effort:
Data-driven automation allows you to separate test data from flows, which makes it easier to update flow data without changing the flow. This results in reduced maintenance effort as you only need to update the flow data, and the flow can be used for multiple scenarios.
Improved Efficiency: Data-driven automation can save time and effort because you can reuse the same Leapwork flow for multiple scenarios. This means that you do not need to create new flows for every scenario, which can be time-consuming and error-prone.
Improved Accuracy:
By using a large dataset, data-driven automation allows you to test your application with a variety of input data, which can help identify errors and edge cases that may not have been found with a smaller dataset.
Better Reporting:
Data-driven automation provides more comprehensive and accurate test reports, as it allows you to test multiple scenarios and combinations of data, which can help you identify areas of the application that need improvement.
In summary, Leapwork-based data-driven automation is a valuable technique that can improve the efficiency and effectiveness of automation. By using this approach, you can increase automation coverage, reduce maintenance effort, improve efficiency, improve accuracy, and generate better test reports.
We suggest you try replicating this flow on your own by following the video.