Dr. David J. Pearce

The Liquid Metal Project


One of the most interesting projects I came across at PLDI/ECOOP in Beijing was the Liquid Metal project being developed at IBM’s TJ Watson Research Center. From the Liquid Metal homepage:

The Liquid Metal project at IBM aims to address the difficulties that programmers face today when developing applications for computers that feature programmable accelerators (GPUs and FPGAs) alongside conventional multi-core processors.

There are a few demos on the Liquid Metal site, including an N-Body simulation. During the conference, I got to see the demo live … and it was pretty impressive!

Anyway, from my perspective, the most interesting part of the project is the LIME programming language being developed. This is similar to Java, but aims to enable easy migration of code onto a GPU or even an FPGA. Fundamental to this is the notion of “arrays that behave as values”, that is, arrays which are immutable and can’t be null. More specifically:

A value type represents a deeply immutable object type (e.g., data structure or array) declared using the value modifier on a type.
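
To make this concrete, here’s a small sketch in plain Java of how such a value behaves. This is only my own illustration of the idea (the class and method names are made up), not actual Lime syntax; in Lime the value modifier lets the compiler enforce deep immutability, so the defensive copying below isn’t needed:

    import java.util.Arrays;

    // A plain-Java approximation of an "array that behaves as a value":
    // the contents are fixed at construction time and can never be null.
    public final class ValueIntArray {
        private final int[] data;

        public ValueIntArray(int[] data) {
            if (data == null) {
                throw new IllegalArgumentException("value arrays cannot be null");
            }
            this.data = Arrays.copyOf(data, data.length); // defensive copy
        }

        public int get(int i) { return data[i]; }
        public int length() { return data.length; }

        // "Updating" produces a new value rather than mutating this one.
        public ValueIntArray with(int i, int v) {
            int[] copy = Arrays.copyOf(data, data.length);
            copy[i] = v;
            return new ValueIntArray(copy);
        }
    }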

The reason for these value types is that they represent data which can be safely transferred to/from a GPU or FPGA. At the language level, “tasks” are used to represent asynchronous computation performed by “workers”:

The worker methods input immutable (value types) arguments (if any) and must return values (if any). This ensures that data exchanged between tasks does not mutate in flight, and provides the compiler and runtime greater opportunities for optimizing communication between tasks without imposing undue burden on the compiler to infer invariants involving aliasing.
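
Again, purely as an illustration in plain Java (not Lime’s actual task syntax, and with names I’ve made up), here is roughly what that discipline looks like: the worker only sees an immutable input and communicates its result by returning a value, so nothing can mutate “in flight”:

    import java.util.List;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public class WorkerSketch {
        // Hypothetical worker: sums an immutable list of integers.
        static int sum(List<Integer> xs) {
            int total = 0;
            for (int x : xs) { total += x; }
            return total;
        }

        public static void main(String[] args) throws Exception {
            // List.of(...) gives an unmodifiable list, standing in for a value array.
            List<Integer> xs = List.of(1, 2, 3, 4);
            ExecutorService pool = Executors.newSingleThreadExecutor();
            // The worker runs asynchronously (here on another thread; in Liquid
            // Metal, potentially on a GPU or FPGA) and, because its input is
            // immutable, it cannot change underneath it.
            Future<Integer> result = pool.submit(() -> sum(xs));
            System.out.println(result.get()); // prints 10
            pool.shutdown();
        }
    }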

The project has already managed to demonstrate some impressive speedups using GPUs. The work on compiling down to FPGAs appears to be a little less developed; having spoken with David Bacon at length about this, it seems to have been significantly hampered by: a) the difficulty of getting reliable FPGA boards; and b) the large barrier to entry in working with FPGAs. But they seem to have been making some good progress despite this, and I think there’s some really exciting stuff going on there. Indeed, I noticed that Forbes recently had an article on using FPGAs to power financial computations … so maybe, finally, we are beginning to see the rise of the FPGA for general-purpose computing…

References

Some direct links for papers on LIME: