Guides on browser automation and reducing repetitive work
Selenium is often used for automating web applications for testing purposes, but that is not all it can do. In this article, we’ll show you how to use Selenium for web scraping.
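As a minimal sketch of the idea, assuming Chrome and the Selenium Python bindings are installed, the snippet below opens a page and prints the text of its headings. The URL and tag name are placeholders for illustration.

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

# Start a Chrome session (Selenium 4.6+ can manage the driver itself;
# otherwise chromedriver must be on PATH).
driver = webdriver.Chrome()
try:
    # Placeholder URL for illustration.
    driver.get("https://example.com")

    # Grab the text of every <h1> heading on the page.
    for heading in driver.find_elements(By.TAG_NAME, "h1"):
        print(heading.text)
finally:
    driver.quit()
```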
Filling in numerous web forms manually lacks efficiency, accuracy, and consistency. Let’s learn how to fix this by filling them in automatically, using data from a CSV file with Python and Browserbear.
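As a rough illustration of the pattern, not Browserbear’s actual API, the sketch below reads rows from a CSV file and submits each one to a hypothetical form endpoint; the file name, URL, and column names are all assumptions.

```python
import csv
import requests

# Hypothetical endpoint standing in for whatever service or task
# actually fills the form.
SUBMIT_URL = "https://example.com/contact-form"

with open("contacts.csv", newline="") as f:
    for row in csv.DictReader(f):
        # Each row maps assumed column names (e.g. name, email, message)
        # to the values to enter into the form.
        response = requests.post(SUBMIT_URL, data=row)
        print(row.get("email"), response.status_code)
```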
While XPath expressions are incredibly useful for locating HTML elements, accurately navigating complex and nested HTML/XML structures can be challenging. If you’re struggling with writing XPath expressions, use this cheat sheet to prepare them effortlessly.
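For example, here are two common patterns evaluated with Python’s lxml library; the HTML snippet and class name are made up for illustration.

```python
from lxml import html

# A small made-up document to run the expressions against.
doc = html.fromstring("""
<div class="post">
  <h2><a href="/guide-1">Guide one</a></h2>
  <h2><a href="/guide-2">Guide two</a></h2>
</div>
""")

# All link texts inside the post container.
print(doc.xpath('//div[@class="post"]//a/text()'))

# The href of the second link only.
print(doc.xpath('(//a)[2]/@href'))
```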
Using cURL in Python provides developers with a powerful combination of versatility and functionality for data transfer tasks. By leveraging the PycURL library, you can seamlessly incorporate cURL into your existing Python projects.
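A minimal PycURL fetch might look like the sketch below; the URL is a placeholder.

```python
from io import BytesIO
import pycurl

buffer = BytesIO()
curl = pycurl.Curl()
curl.setopt(curl.URL, "https://example.com")   # placeholder URL
curl.setopt(curl.WRITEDATA, buffer)            # collect the response body
curl.setopt(curl.FOLLOWLOCATION, True)         # follow redirects
curl.perform()

print("Status:", curl.getinfo(curl.RESPONSE_CODE))
curl.close()

# First 200 characters of the fetched page.
print(buffer.getvalue().decode("utf-8")[:200])
```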
Not sure which automated testing tool you should use for your web application? Consider these two, each used by tens of thousands of developers around the world.
XPath functions allow you to perform various operations on XML or HTML documents by providing additional capabilities for selecting, manipulating, and extracting values from the document's nodes. There are over 20 XPath functions available, and here’s one you must know.
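As one example (not necessarily the function the article highlights), the widely used contains() function filters nodes by a substring match; the sketch below runs it with Python’s lxml against made-up markup.

```python
from lxml import html

doc = html.fromstring("""
<ul>
  <li><a href="/blog/xpath-basics">XPath basics</a></li>
  <li><a href="/docs/api">API docs</a></li>
</ul>
""")

# contains() matches nodes whose attribute includes a substring.
print(doc.xpath('//a[contains(@href, "blog")]/text()'))   # ['XPath basics']

# count() is another built-in XPath function.
print(doc.xpath('count(//a)'))                            # 2.0
```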
Automating tasks that require logging in can be a hassle, adding more work on top of the original task, but don’t let that stop you. This article demonstrates how to log in to a website easily using Browserbear so that you can run automation on websites that need user authentication.
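The general pattern, sketched here with Selenium rather than Browserbear’s own task builder, is to locate the username and password fields, type the credentials, and submit; the login URL, field names, and environment variables are placeholders.

```python
import os
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    # Placeholder login page and field names for illustration.
    driver.get("https://example.com/login")

    driver.find_element(By.NAME, "username").send_keys(os.environ["SITE_USER"])
    driver.find_element(By.NAME, "password").send_keys(os.environ["SITE_PASS"])
    driver.find_element(By.CSS_SELECTOR, "button[type=submit]").click()

    # From here on the session is authenticated, so further automation
    # steps can run against the logged-in pages.
    print(driver.title)
finally:
    driver.quit()
```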
Performing repetitive tasks every day can make you lose interest in your work and affect your productivity and performance. Learn how to automate boring tasks with Python and Browserbear to free up time and energy for more challenging and rewarding work.
R is a programming language that is widely used for data analysis, statistical computing, and visualization. In this article, let's see how you can also scrape a website in R to collect data for these tasks.