Bug Hunting on SAW

Web scraping is an automated method of obtaining large amounts of data from websites. The data is largely unstructured and can be refined further.
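To make the idea concrete, here is a minimal sketch of turning unstructured markup into structured data using only Python's standard library. The HTML snippet is invented for illustration; a real scraper would fetch it from a website first.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every anchor tag encountered."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# A stand-in for a page fetched from a website.
page = '<html><body><a href="/jobs">Jobs</a> <a href="/about">About</a></body></html>'

parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # unstructured HTML reduced to a structured list of links
```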

In this article, we will test the desktop application Scrape Any Website (SAW) as an exercise for the HNGi11 program.

Learning Objectives

At the end of this tutorial, you should:

  • Have an introductory understanding of the Scrape Any Website tool.

  • Be familiar with some bugs observed on the platform.


Bug Report

View the bug report via Bug Report

Bugs Observed

The following bugs were observed.

  1. On creating a new Scrape Job Name, the textbox font colour (white) is the same as the background colour (white). The text is not visible.

Image 1.1- Adding a Scrape Job

Image 1.2- Adding a Scrape Job with text highlighted

  2. URL Text Constraint not working

The condition that "x" must be part of the URL failed, as I was able to add a LinkedIn URL that did not satisfy it.

Image 2.1- URLText Constraint not working
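A constraint like this can be implemented as a simple substring check applied before a URL is saved. The sketch below is hypothetical (it is not SAW's actual code), with "x" standing in for the required text:

```python
def url_contains(url: str, required: str) -> bool:
    """Accept the URL only if it contains the required text (case-insensitive)."""
    return required.lower() in url.lower()

# Hypothetical constraint: "x" must appear somewhere in the URL.
print(url_contains("https://www.linkedin.com/feed", "x"))  # expected to be rejected
print(url_contains("https://example.com/x/page", "x"))     # expected to be accepted
```

Had SAW enforced a check like this, the LinkedIn URL would have been rejected at save time.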

  3. URL with text constraint scraped "unsuccessfully"

Following issue 2, the LinkedIn website was scraped, but no further information was provided.

Image 3.1- URL with text constraint scraped "unsuccessfully"

  4. Application shutdown on upload of URLs from file

In addition to no upload template being provided, the application shut down completely when given an Excel sheet containing URLs.

Image 4.1- Application shutdown on upload of URLs from file
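A more robust approach is to validate each row of the uploaded file and skip malformed entries instead of crashing. This is a hypothetical sketch, not SAW's implementation; it reads a one-column CSV (rather than Excel, to stay within the standard library):

```python
import csv
import io
from urllib.parse import urlparse

def load_urls(stream) -> list:
    """Read URLs from a one-column CSV, skipping malformed rows instead of crashing."""
    urls = []
    for row in csv.reader(stream):
        if not row:
            continue  # tolerate blank lines
        candidate = row[0].strip()
        parsed = urlparse(candidate)
        # Keep only well-formed http(s) URLs; bad rows are dropped, not fatal.
        if parsed.scheme in ("http", "https") and parsed.netloc:
            urls.append(candidate)
    return urls

data = io.StringIO("https://example.com\nnot a url\nhttps://example.org/page\n")
print(load_urls(data))  # the malformed line is skipped and the app keeps running
```

Defensive parsing like this turns a crash into, at worst, a shorter list of accepted URLs.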

  5. URL took too long to add

After restarting the application and trying to add a single URL, nothing appeared to happen. Only after closing the pop-up and clicking the "Save" button several times was the URL finally saved.

Image 5.1- URL took too long to add

  6. No result from the scraping


Although this article was written to meet the requirements of the HNGi11 Testing track, I hope you have learnt about the ScrapeAnyWebsite tool and a standard test-reporting template.