Screen scraping with Colly in Go
Programming Snapshot – Colly

Lead Image © Hannu Viitanen, 123RF.com
The Colly scraper helps developers who work with the Go programming language to collect data off the web. Mike Schilli illustrates the capabilities of this powerful tool with a few practical examples.
As long as there are websites on the web for the masses of browser users to view, there will also be individuals on the consumer side who want the data in a different format and who write scraper scripts to extract it automatically to fit their needs.
Many sites do not like the idea of users scraping their data. Check the website's terms of service for more information, and be aware of the copyright laws for your jurisdiction. In general, as long as the scrapers do not republish or commercially exploit the data, or bombard the website too overtly with their requests, nobody is likely to get too upset about it.
Different languages offer different tools for this. Perl aficionados will probably appreciate the qualities of WWW::Mechanize as a scraping tool, while Python fans might prefer the selenium package [1]. In Go, there are several projects dedicated to scraping that attempt to woo developers.
One of the newer ones is Colly (possibly from "collect"). As usual in Go, it can be easily compiled and installed directly from its GitHub repository like this:
go get github.com/gocolly/colly
After installation, a program like Listing 1 [2], for example, can access the website of the CNN news channel, dig through the links hidden in its HTML, and display them for testing purposes.
Listing 1
linkfind.go
01 package main
02
03 import (
04   "fmt"
05   "github.com/gocolly/colly"
06 )
07
08 func main() {
09   c := colly.NewCollector()
10
11   c.OnHTML("a[href]",
12     func(e *colly.HTMLElement) {
13       fmt.Println(e.Attr("href"))
14     })
15
16   c.Visit("https://cnn.com")
17 }
Goodbye, Dependency Hell
As is customary in Go, go build linkfind.go creates a binary named linkfind, which weighs in at a hefty 14MB but already contains all its dependent libraries and runs standalone on similar architectures without further ado. What a concept! Shifting today's typical "Dependency Hell" from run time to compile time, and thus recruiting developers to do the heavy lifting instead of the end user, is probably one of Go's greatest contributions of recent times.
Listing 1 uses NewCollector() to initialize a new colly structure in line 9; its Visit() function later connects to the CNN website's URL and kicks off the OnHTML() callback as soon as the page's HTML has arrived. The "a[href]" selector passed as the first argument ensures that the subsequent func() code is only called for links in the format <A HREF=...>. The Colly library makes sure that each call receives a pointer to a structure of the colly.HTMLElement type, containing all relevant data of the matching HTML element, from which line 13 extracts the link URL as a string via e.Attr("href").
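OnHTML() is only one of several callbacks the collector offers. A minimal sketch along the lines of Listing 1 also registers OnRequest() and OnError() handlers to log outgoing requests and report fetch failures, and prints each link's anchor text via the HTML element's Text field:

package main

import (
  "fmt"

  "github.com/gocolly/colly"
)

func main() {
  c := colly.NewCollector()

  // Log every request before Colly sends it off.
  c.OnRequest(func(r *colly.Request) {
    fmt.Println("Fetching", r.URL)
  })

  // Report network or HTTP errors instead of failing silently.
  c.OnError(func(r *colly.Response, err error) {
    fmt.Println("Error:", err)
  })

  // For each link, print its anchor text along with the target URL.
  c.OnHTML("a[href]", func(e *colly.HTMLElement) {
    fmt.Printf("%q -> %s\n", e.Text, e.Attr("href"))
  })

  c.Visit("https://cnn.com")
}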
Moving on, how difficult would it be to determine which links branch off to external websites and which reference the site internally? Listing 2 is based on the same basic structure, but it also defines a counter structure, Stats, and no longer hardwires the URL to be examined in the code, instead accepting it as a parameter on the command line.
Listing 2
linkstats.go
01 package main
02
03 import (
04   "fmt"
05   "github.com/gocolly/colly"
06   "net/url"
07   "os"
08 )
09
10 type Stats struct {
11   external int
12   internal int
13 }
14
15 func main() {
16   c := colly.NewCollector()
17   baseURL := os.Args[1]
18
19   stats := Stats{}
20
21   c.OnHTML("a[href]",
22     func(e *colly.HTMLElement) {
23       link := e.Attr("href")
24       if linkIsExternal(link, baseURL) {
25         stats.external++
26       } else {
27         stats.internal++
28       }
29     })
30
31   c.Visit(baseURL)
32
33   fmt.Printf("%s has %d internal "+
34     "and %d external links.\n", baseURL,
35     stats.internal, stats.external)
36 }
37
38 func linkIsExternal(link string,
39   base string) bool {
40   u, err := url.Parse(link)
41   if err != nil {
42     panic(err)
43   }
44   ubase, _ := url.Parse(base)
45
46   if u.Scheme == "" ||
47     ubase.Host == u.Host {
48     return false
49   }
50   return true
51 }
To distinguish external links from internal ones, the linkIsExternal() function uses Parse() from the net/url package to split both the link URL and the base URL passed into the function into their component parts. It then checks the Scheme field to see whether the link is missing the typical http(s):// protocol, or whether the host is identical in both URLs; in either case, the link points to the original site, so it is internal.
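The classification hinges on how url.Parse() handles relative and absolute links. A minimal sketch, with made-up URLs for illustration, shows the two cases the function distinguishes:

package main

import (
  "fmt"
  "net/url"
)

func main() {
  // A relative link has neither a scheme nor a host: it stays internal.
  rel, _ := url.Parse("/news/article.html")
  fmt.Printf("scheme=%q host=%q\n", rel.Scheme, rel.Host)

  // An absolute link carries both; comparing its Host against the
  // base URL's Host reveals whether it leaves the site.
  abs, _ := url.Parse("https://example.com/page")
  base, _ := url.Parse("https://www.linux-magazine.com")
  fmt.Println("external:", abs.Host != base.Host)
}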
Structured by Default
Line 19 initializes an instance of the Stats structure defined previously in line 10, and, as usual in Go, all members are assigned their default values; in the case of the two integers, each starts out at 0 (see the short sketch at the end of this section). This means that lines 25 and 27 only need to increment the respective integer value by 1 for each link examined; at the end of the program, line 33 can then output the number of internal and external links. For the Linux Magazine home page, this results in:
$ ./linkstats https://www.linux-magazine.com
https://www.linux-magazine.com has 64 internal and 12 external links.
It's a pretty complex website! But collecting link stats does not exhaust Colly's usefulness. Colly's documentation [3] is still somewhat sparse, but the examples published there might give creative minds some ideas for future projects.
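The fact that Go zeroes out struct fields automatically is what makes explicit initialization of the counters unnecessary. A quick, Colly-independent sketch illustrates the guarantee:

package main

import "fmt"

// Stats mirrors the counter structure from Listing 2.
type Stats struct {
  external int
  internal int
}

func main() {
  stats := Stats{} // no field values supplied
  // Both integers start out at their zero value, 0.
  fmt.Println(stats.internal, stats.external) // prints: 0 0
  stats.internal++
  fmt.Println(stats.internal) // prints: 1
}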
Let's Go Surfing
For example, I frequently visit the website surfline.com, which shows the wave height at selected beaches around the world. Since I love surfing (at a casual level, mind you), I've always wanted a command-line tool that quickly checks the site for my local beach (Ocean Beach in San Francisco) and tells me whether there are any monster waves preventing me from paddling out, because, as a hobby surfer, anything more than eight feet has me shaking in my wetsuit. Figure 1 shows the relevant information as it's displayed on the web page; Figure 2 illustrates where the data is hidden in the page's HTML according to Chrome's DevTools. It is now the scraper's task to shimmy through the tags and tease out the numerical values indicating today's surf size.


Squinting at the HTML in Figure 2 reveals that the wave height is reported in a span tag of the quiver-surf-height class. However, this tag occurs several times in the document, because the page also displays the conditions at neighboring surf spots. The trick is to find a path from the document root to the data that is unique and therefore only matches the main spot's data. As Figure 2 shows, this path winds through an element of the sl-spot-forecast-summary class.
Programming this is an easy job; you just pass the two class names, separated by a space, as the first argument to the OnHTML() function in line 12 of Listing 3. The query processor digs down along this path into the document and finds exactly one match, and thus the correct wave height for the chosen spot.
Listing 3
surfline.go
01 package main
02
03 import (
04   "fmt"
05   "github.com/gocolly/colly"
06   "github.com/PuerkitoBio/goquery"
07 )
08
09 func main() {
10   c := colly.NewCollector()
11
12   c.OnHTML(".sl-spot-forecast-summary " +
13     ".quiver-surf-height",
14     func(e *colly.HTMLElement) {
15       e.DOM.Contents().Slice(0,1).Each(
16         func(_ int, s *goquery.Selection) {
17           fmt.Printf("%s\n", s.Text())
18         })
19     })
20
21   c.Visit("https://www.surfline.com/" +
22     "surf-report/ocean-beach-overview/" +
23     "5842041f4e65fad6a77087f8")
24 }
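The Contents().Slice(0,1) chain in line 15 is there because the matching span presumably nests further markup, such as a unit label, after the leading text node that holds the number; slicing off the first child node keeps just that number. A self-contained sketch against hard-coded, invented markup that merely mimics the structure from Figure 2 shows the same goquery chain at work without touching the network:

package main

import (
  "fmt"
  "strings"

  "github.com/PuerkitoBio/goquery"
)

func main() {
  // Invented markup that mimics the structure seen in Figure 2.
  html := `<div class="sl-spot-forecast-summary">
             <span class="quiver-surf-height">3-4<span class="units">FT</span></span>
           </div>`

  doc, err := goquery.NewDocumentFromReader(strings.NewReader(html))
  if err != nil {
    panic(err)
  }

  sel := doc.Find(".sl-spot-forecast-summary .quiver-surf-height")

  // Text() on the full selection would return "3-4FT"; keeping only
  // the first child node (the bare text node) drops the nested unit span.
  fmt.Println(sel.Contents().Slice(0, 1).Text()) // prints: 3-4
}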