We ran this small test to find out which programming language is more efficient for web scraping in terms of speed, CPU, and RAM usage. We wrote all scraping scripts in the same manner and ran each of them in a single thread. Each scraper ran for 10 minutes on the same machine, almost at the same time: Ubuntu 14.04 (under VirtualBox), 1 CPU core, 4 GB RAM.
We compared the following programming languages (and frameworks): Golang + Diggernaut meta-language, Perl, PHP5, Python 2.7, Python + Scrapy, and Ruby. As the target we used the U.S. Department of Health & Human Services website.
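The actual scripts are attached at the end of the post; purely for illustration, here is a minimal sketch (in Go) of the kind of single-threaded fetch loop each script follows. The seed URL and the 10-minute budget come from the test description above; link extraction and parsing are omitted, and this is not the exact code we benchmarked.

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"time"
)

// A minimal single-threaded crawl loop: fetch pages from a queue one by
// one and count how many were downloaded within the time budget.
func main() {
	queue := []string{"https://www.hhs.gov/"} // seed URL (the target site from the test)
	deadline := time.Now().Add(10 * time.Minute)
	fetched := 0

	for len(queue) > 0 && time.Now().Before(deadline) {
		url := queue[0]
		queue = queue[1:]

		resp, err := http.Get(url)
		if err != nil {
			continue
		}
		_, _ = io.ReadAll(resp.Body) // read the page body (parsing omitted for brevity)
		resp.Body.Close()
		fetched++

		// In the real scripts, links found on the page would be appended
		// to the queue here.
	}

	fmt.Println("pages fetched:", fetched)
}
```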
Let’s look at the speed chart.
As you can see, there are three leaders: Golang + Diggernaut fetched almost 3K pages, Ruby approximately 2.5K, and Python + Scrapy approximately 1.5K. The other languages are noticeably slower.
However, if we look at the CPU usage chart, we see a somewhat different picture.
First place here goes to PHP5, which used just 2.5% of CPU, followed by Golang + Diggernaut with 3.5%, and Perl in third with approximately 4%. The other languages are close behind, except Python + Scrapy: 11% is far too much, we think.
The last parameter we measured is RAM usage:
The winner here is Golang + Diggernaut with 26 MB, then Perl with 29 MB, and PHP5 with 39 MB. Ruby is the outsider here with 154 MB of RAM usage.
To summarize the measurements, we scored each language on a 100-point scale. Each measure is scored separately (the best result gets 100 points, the worst gets 0), and the final score is the average across the three measures.
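For example, using the RAM figures above and assuming simple linear scaling between the worst and best result, the per-measure scoring looks roughly like this (a sketch of the idea, not the exact calculation sheet we used):

```go
package main

import "fmt"

// score maps a raw measurement onto a 0-100 scale: the best result in the
// group gets 100 points, the worst gets 0, and everything in between is
// scaled linearly. For CPU and RAM the best value is the smallest one,
// for speed it is the largest one.
func score(value, best, worst float64) float64 {
	if best == worst {
		return 100 // everyone tied
	}
	return 100 * (value - worst) / (best - worst)
}

func main() {
	// RAM usage example: best 26 MB (Golang + Diggernaut),
	// worst 154 MB (Ruby), Perl measured at 29 MB.
	fmt.Printf("Perl RAM score: %.1f\n", score(29, 26, 154)) // ~97.7
}
```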
Golang is the clear winner in this run.
We have attached the files we used for the test, so you can try it yourself: scripts