The role of the JVM

Tim Bray writes On Beyond Java: the JVM, thinking about the role the JVM could play with many languages built on top of it, not just Java. I think the comparison to mainframes is slightly misdirected, though; I'd compare the JVM not to the mainframe, but to the PC CPU.

Mainframes were bulky and complex, whereas PCs are (were) basically memory, video, and keyboard, all connected to the CPU. We used to have many different CPUs with different machine languages, and programs soon came to be written portably, using compilers. Nowadays we have mostly x86, yet programming languages stay CPU-neutral; no reason to go changing that.

So on to virtual machines (JVM, CLR): I'd say they are the new CPU. Way smarter (garbage collection is good, threads are good, etc.), but still basically machines that execute bytecode, the new machine language. So where we once had Intel vs. Motorola vs. IBM, we now have Sun vs. Microsoft. Unless the silicon makers start making successful CPUs with virtual machine capabilities (multicore CPUs should have no problem hosting a VM), they may go the way of the memory makers: I don't care who made the commodity chip I'm sticking into my computer.
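The "bytecode as the new machine language" point is easy to see directly. As a sketch (the class and method names here are mine, purely for illustration), compile a trivial Java class and disassemble it with the standard javap tool:

```java
// Add.java -- a trivial class whose JVM "machine code" we can inspect.
public class Add {
    // The JVM is a stack machine; this compiles to roughly:
    //   iload_0, iload_1, iadd, ireturn
    static int add(int a, int b) {
        return a + b;
    }

    public static void main(String[] args) {
        // Build and run:      javac Add.java && java Add
        // View the bytecode:  javap -c Add
        System.out.println(add(2, 3)); // prints 5
    }
}
```

The CLR plays the same role with its own instruction set (CIL), which is why the analogy to competing CPU architectures fits.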

So I'm waiting for the first language compiled both for the JVM and the CLR. This might involve a lot of base-library juggling, or perhaps standardization (I'm talking about java.lang, or java.*, or even javax.* here; dunno the MS equivalent). How about that for a next-gen programming language? 8-)

Posted at 0959 on Fri, Dec 2, 2005 in category Ideas
Comments

I'm not a language designer, but compiling to different target machines is pretty much a solved problem, AFAIK. The value, though, would be questionable: it's not the VM that is the problem, it's the libraries. Even if java.* were solved, javax.* is pretty much open-ended and not cleanly mappable to the .NET frameworks.

Posted by: Stefan Tilkov at December 2, 2005 11:06 PM

Stefan, I think the new, VM-independent language might have its own set of extension libraries, implemented over javax.* or the .NET equivalent, and that's the hard part, not the actual compilation.

Posted by: Jacek at December 3, 2005 2:55 PM

Hi Jacek,

I think there was such a "mistake" already: remember the CISC, RISC, and post-RISC generations of processors. A RISC architecture is even simpler than JVM bytecode. The result today: processors with a RISC core and a CISC layer over it. Why? I think experience proved it is too difficult (low efficiency, hard to use...) to implement an "all-in-one" architecture like the one you propose.

The next argument plays against your model. My laptop is a Dell. Well, how many "providers" created it? Surely many. My software is compiled using many compilers. My software is made by many authors...

Given this fact, and looking at the work of, e.g., the W3C, I see standardization as a dinosaur, often producing useless crap. (In fact, I think the only good "standardization" is done by the power of MS: everything on their own, and the others accept it silently.)

If a standardization process had been imposed on the CPU world ten years ago, we would be using calculators today (standardized ones, mind you).

And one more note:
Writing a .NET app (in C#), I took a Java library (using sockets and other java.* stuff) and was very surprised when the code compiled and ran without any problem.

Posted by: Tobbi at December 14, 2005 4:15 PM