What’s this?

Not that long ago, I decided to develop a CLI-based web scraper for skroutz.gr, the leading e-commerce platform in Greece, because I wanted to explore one of Go’s biggest features: goroutines.

The app scrapes deals from the site’s dedicated section, skroutz.gr/price-drops, and presents them in a table on the command line.

In short, the app sends requests to https://www.skroutz.gr/price-drops, using goroutines to run them concurrently. It then parses the essential data from each product card: the product name, the original price (oldprice), the discounted price (newprice), and the product’s URL. The results are presented to the user in a well-organized table within the command line, built with jedib0t/go-pretty.

The scraping itself is done with gocolly/colly, a very popular scraping framework for Go.

Preview

GIF made using vhs

Installation using Make

Make sure you are running Go 1.20.x.

git clone https://github.com/petersid2022/skroutz-prosfores-scraper-go.git
cd skroutz-prosfores-scraper-go
make

The make command builds a binary for your platform, without any extra optimizations. Alternatively, you may use go install instead of make to build the binary, although you may have to rename it afterwards.

CLI Usage

There are a handful of CLI options already provided, which you can specify like so:

skroutz [Options]

Options:

-h, --help      Output usage information
-f [String]     Set the filtering option: Recommended, price_asc, price_desc, newest
-p [Number]     Set the number of pages to scrape (default: 5)
-n [Number]     Set the number of products to print when -f=Recommended (default: 5)
-w [Number]     Set the number of workers (default: 10)

Things I’ve worked on

  • Command-Line flags (change category, add filters, prices ascending/descending etc.) f155812

  • Use a different TUI Library

GitHub repo