Screen scraping with Colly in Go
Programming Snapshot – Colly
The Colly scraper helps developers who work with the Go programming language to collect data off the web. Mike Schilli illustrates the capabilities of this powerful tool with a few practical examples.
As long as websites serve content to the browsing masses, there will be individuals on the consumer side who want that data in a different format and who write scraper scripts to extract it automatically to fit their needs.
Many sites do not like the idea of users scraping their data. Check the website's terms of service for more information, and be aware of the copyright laws for your jurisdiction. In general, as long as the scrapers do not republish or commercially exploit the data, or bombard the website too overtly with their requests, nobody is likely to get too upset about it.
Different languages offer different tools for this. Perl aficionados will probably appreciate the qualities of WWW::Mechanize as a scraping tool, while Python fans might prefer the selenium package [1]. In Go, there are several projects dedicated to scraping that attempt to woo developers.
One of the newer ones is Colly (possibly from "collect"). As usual in Go, it can be easily compiled and installed directly from its GitHub repository like this:
go get github.com/gocolly/colly
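With more recent Go releases (1.16 and later), module mode is the default, so the install would typically happen inside a module instead; a minimal sketch, assuming a fresh project directory:

go mod init linkfind
go get github.com/gocolly/colly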
After installation, a program like Listing 1 [2], for example, can access the website of the CNN news channel, dig through the links hidden in its HTML, and display them for testing purposes.
Listing 1
linkfind.go
01 package main
02
03 import (
04   "fmt"
05   "github.com/gocolly/colly"
06 )
07
08 func main() {
09   c := colly.NewCollector()
10
11   c.OnHTML("a[href]",
12     func(e *colly.HTMLElement) {
13       fmt.Println(e.Attr("href"))
14     })
15
16   c.Visit("https://cnn.com")
17 }
Goodbye, Dependency Hell
As is customary in Go, go build linkfind.go creates a binary named linkfind, which weighs in at a hefty 14MB but already contains all the dependent libraries and runs standalone on similar architectures without further ado. What a concept! Go's shifting of today's typical "dependency hell" from run time to compile time, thus recruiting developers to do the heavy lifting instead of the end user, is probably one of the greatest ideas of recent times.
Listing 1 uses NewCollector() to initialize a new colly structure in line 9; its Visit() function later connects to the CNN website's URL and kicks off the OnHTML() callback as soon as the page's HTML has arrived. The "a[href]" selector passed as the first argument restricts the subsequent func() code to links of the form <a href=...>. The Colly library ensures that each call receives a pointer to a structure of type colly.HTMLElement, containing all relevant data of the matching HTML element, from which line 13 extracts the link URL as a string via e.Attr("href").
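The callback is not limited to attributes, by the way: The HTMLElement structure also exposes the matched element's text content. As a quick variation on Listing 1, the following sketch prints each link's anchor text next to its target URL:

package main

import (
  "fmt"
  "github.com/gocolly/colly"
)

func main() {
  c := colly.NewCollector()

  // For every matching <a href=...> element, print the anchor
  // text alongside the URL it points to.
  c.OnHTML("a[href]", func(e *colly.HTMLElement) {
    fmt.Printf("%s -> %s\n", e.Text, e.Attr("href"))
  })

  c.Visit("https://cnn.com")
}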
Moving on, how difficult would it be to determine which links branch off to external websites and which reference the site internally? Listing 2 is based on the same basic structure, but it also defines a counter structure, Stats, and no longer hardwires the URL to be examined in the code, instead accepting it as a parameter on the command line.
Listing 2
linkstats.go
01 package main
02
03 import (
04   "fmt"
05   "github.com/gocolly/colly"
06   "net/url"
07   "os"
08 )
09
10 type Stats struct {
11   external int
12   internal int
13 }
14
15 func main() {
16   c := colly.NewCollector()
17   baseURL := os.Args[1]
18
19   stats := Stats{}
20
21   c.OnHTML("a[href]",
22     func(e *colly.HTMLElement) {
23       link := e.Attr("href")
24       if linkIsExternal(link, baseURL) {
25         stats.external++
26       } else {
27         stats.internal++
28       }
29     })
30
31   c.Visit(baseURL)
32
33   fmt.Printf("%s has %d internal "+
34     "and %d external links.\n", baseURL,
35     stats.internal, stats.external)
36 }
37
38 func linkIsExternal(link string,
39   base string) bool {
40   u, err := url.Parse(link)
41   if err != nil {
42     panic(err)
43   }
44   ubase, _ := url.Parse(base)
45
46   if u.Scheme == "" ||
47     ubase.Host == u.Host {
48     return false
49   }
50   return true
51 }
To distinguish external links from internal ones, the linkIsExternal() function uses Parse() from the net/url package to split both the link URL and the original base URL passed into the function into their component parts. It then checks whether the link is missing the typical http(s):// protocol (an empty Scheme field) or whether the Host fields of both URLs are identical – in both cases, the link points to the original site, so it is internal.
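A quick standalone check shows what Parse() returns in each case; the following sketch uses example.com as a stand-in URL:

package main

import (
  "fmt"
  "net/url"
)

func main() {
  // A site-internal, relative link: Scheme and Host stay empty.
  u, _ := url.Parse("/about/contact")
  fmt.Printf("scheme=%q host=%q\n", u.Scheme, u.Host) // scheme="" host=""

  // An absolute link carries both components.
  u, _ = url.Parse("https://example.com/page")
  fmt.Printf("scheme=%q host=%q\n", u.Scheme, u.Host) // scheme="https" host="example.com"
}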
Structured by Default
Line 19 initializes an instance of the Stats structure defined previously in line 10, and – as usual in Go – all members are assigned default values; in the case of the two integers, each starts out at 0. This means that lines 25 and 27 only need to increment the integer value by 1 for each link examined; at the end of the program, line 33 can then output the number of internal and external links. For the Linux Magazine home page, this results in:
$ ./linkstats https://www.linux-magazine.com
https://www.linux-magazine.com has 64 internal and 12 external links.
It's a pretty complex website! But collecting link stats does not exhaust Colly's usefulness. Colly's documentation [3] is still somewhat sparse, but the examples published there might give creative minds some ideas for future projects.
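Incidentally, the zero-value behavior that Listing 2 relies on is easy to verify in isolation; a minimal sketch, independent of Colly:

package main

import "fmt"

type Stats struct {
  external int
  internal int
}

func main() {
  // Stats{} leaves every field at its type's zero value;
  // for int that is 0, so no explicit initialization is needed.
  stats := Stats{}
  fmt.Println(stats.internal, stats.external) // prints: 0 0
}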
Let's Go Surfing
For example, I frequently visit the website surfline.com, which shows the wave height at selected beaches around the world. Since I love surfing (at a casual level, mind you), I've always wanted a command-line tool that quickly checks the site for my local beach (Ocean Beach in San Francisco) and tells me whether there are any monster waves preventing me from paddling out, because – as a hobby surfer – anything more than eight feet has me shaking in my wetsuit. Figure 1 shows the relevant information as it's displayed on the web page; Figure 2 illustrates where the data is hidden in the page's HTML according to Chrome's DevTools. It is now the scraper's task to shimmy through the tags and tease out the numerical values indicating today's surf height.
Squinting at the HTML in Figure 2 reveals that the wave height is indicated in a span tag of the quiver-surf-height class. However, this tag occurs several times in the document, because the page also displays the conditions at neighboring surf spots. The trick now is to find a path from the document root to the data that is unique and therefore matches only the main spot's data. As Figure 2 shows, this path winds through an element of the sl-spot-forecast-summary class.
Programming this is an easy job: You just pass the two class names, as a single space-separated string, in the first argument to the OnHTML() function in line 12 of Listing 3. The query processor digs down along this path into the document and finds just one – and thus the correct – wave height for the currently selected spot.
Listing 3
surfline.go
01 package main
02
03 import (
04   "fmt"
05   "github.com/gocolly/colly"
06   "github.com/PuerkitoBio/goquery"
07 )
08
09 func main() {
10   c := colly.NewCollector()
11
12   c.OnHTML(".sl-spot-forecast-summary " +
13     ".quiver-surf-height",
14     func(e *colly.HTMLElement) {
15       e.DOM.Contents().Slice(0,1).Each(
16         func(_ int, s *goquery.Selection) {
17           fmt.Printf("%s\n", s.Text())
18         })
19     })
20
21   c.Visit("https://www.surfline.com/" +
22     "surf-report/ocean-beach-overview/" +
23     "5842041f4e65fad6a77087f8")
24 }
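The e.DOM field in line 15 deserves a closer look: Colly hands the callback a goquery selection, whose Contents() call returns all child nodes – including bare text nodes – and Slice(0,1) keeps only the first of them. The numerical height thus survives, while any nested child elements (a unit label, for example) are dropped. The following sketch demonstrates the same mechanism against a hypothetical HTML fragment, modeled loosely on the surfline markup:

package main

import (
  "fmt"
  "strings"

  "github.com/PuerkitoBio/goquery"
)

func main() {
  // Hypothetical fragment; the real surfline markup may differ.
  html := `<div class="sl-spot-forecast-summary">
    <span class="quiver-surf-height">3-4<span class="units">FT</span></span>
  </div>`

  doc, err := goquery.NewDocumentFromReader(strings.NewReader(html))
  if err != nil {
    panic(err)
  }

  // The space in the selector is CSS's descendant combinator:
  // match .quiver-surf-height only inside .sl-spot-forecast-summary.
  sel := doc.Find(".sl-spot-forecast-summary .quiver-surf-height")

  // Contents() returns child nodes including text nodes; Slice(0, 1)
  // keeps the first one ("3-4") and drops the nested units span.
  sel.Contents().Slice(0, 1).Each(
    func(_ int, s *goquery.Selection) {
      fmt.Println(s.Text()) // prints: 3-4
    })
}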