GraalVM Under The Covers

 At a very high level, GraalVM is a runtime that can compile Java bytecode into self-contained native executables, and that can also run programs written in languages other than Java. This detailed look attempts to put a highly technical and difficult subject into perspective.
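To make the "bytecode to native executable" part concrete, here is a sketch of the workflow using GraalVM's native-image tool. This assumes a GraalVM distribution with the native-image component installed; the class name is just an illustration.

```shell
# Ordinary bytecode compilation, as with any JDK:
javac Hello.java

# GraalVM's ahead-of-time compiler turns the bytecode into a
# standalone native binary (by default named after the class,
# lowercased):
native-image Hello

# The result runs on its own -- no JVM installation required:
./hello
```

The trade-off is that native-image performs a closed-world analysis at build time, so dynamic features such as reflection may need extra configuration.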


The starting point for this article was "Exploring Aspects of Polyglot High-Performance Virtual Machine GraalVM", a paper by M. Šipek, B. Mihaljević and A. Radovan of Rochester Institute of Technology Croatia, Zagreb, Croatia. The paper presents GraalVM's architecture and features, and examines how it resolves common interoperability and performance problems that arise when supporting multiple programming languages. I've extended it in order to provide a high-level overview in simple terms.


Virtual machines were invented to run programs independently of the platform and the underlying hardware. Examples include the .NET CLR; MoarVM, the modern virtual machine built for the Rakudo compiler implementing the Raku programming language; and, of course, the JVM. The JVM was initially built to make Java portable across platforms by running the bytecode emitted by the Java compiler. Soon enough, other languages that could emit JVM bytecode came along, such as Scala, Kotlin, Groovy and Clojure, extending the JVM's reach well beyond Java.

Full article on I Programmer:

https://www.i-programmer.info/programming/java/15129-graalvm-under-the-covers.html
