
Headless Chrome and the Puppeteer Library for Scraping and Testing the Web

With the advent of Single Page Applications, scraping pages for information and running automated user-interaction tests have become much harder because of their highly dynamic nature. The solution? Headless Chrome and the Puppeteer library.
While Selenium, PhantomJS and others have been around for years, and despite headless Chrome and Puppeteer arriving late to the party, they are valuable additions to the toolkit of web-testing automation tools that let developers simulate how real users interact with a website or application.
Headless Chrome can run without Puppeteer, as it can be controlled programmatically through the Chrome DevTools Protocol, typically by starting an instance that exposes a remote debugging port and attaching to it:
chrome --headless --disable-gpu --remote-debugging-port=9222
Loading the protocol's sidekick module 'chrome-remote-interface', which provides a simple abstraction of its commands and notifications through a straightforward JavaScript API, you can then drive the browser from scripts run under a local Node.js installation.
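As a rough sketch of what that looks like (not taken from the article; it assumes Chrome is already listening on port 9222 and that the 'chrome-remote-interface' package is installed locally), a Node.js script can attach to the running instance, navigate it and read a value back out of the page:

const CDP = require('chrome-remote-interface');

(async () => {
  let client;
  try {
    // connect to the Chrome instance listening on localhost:9222
    client = await CDP();
    const {Page, Runtime} = client;
    await Page.enable();
    await Page.navigate({url: 'https://example.com'});
    await Page.loadEventFired();
    // evaluate an expression inside the page and read back the result
    const result = await Runtime.evaluate({expression: 'document.title'});
    console.log(result.result.value);
  } catch (err) {
    console.error(err);
  } finally {
    if (client) {
      await client.close();
    }
  }
})();

Note how every step maps onto a raw protocol domain and command (Page.navigate, Runtime.evaluate and so on), which is exactly the level of detail Puppeteer hides.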
From the official Puppeteer documentation, here is an example that navigates to https://example.com and saves a screenshot as example.png:
const puppeteer = require('puppeteer');
(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto('https://example.com');
  await page.screenshot({path: 'example.png'});
  await browser.close();
})();
But since 'chrome-remote-interface' already exists, what does Puppeteer do differently? Puppeteer offers a higher-level API to the Chrome DevTools Protocol than the one exposed by 'chrome-remote-interface'.
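To make the difference concrete (a small illustrative sketch, not part of the original post), extracting a piece of text from a page is a one-liner with Puppeteer's page.$eval, whereas with the raw protocol you would compose Runtime.evaluate calls by hand as in the earlier sketch:

const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto('https://example.com');
  // $eval runs the selector and callback inside the page and returns the value
  const heading = await page.$eval('h1', el => el.textContent);
  console.log(heading);
  await browser.close();
})();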
