
Surely one of the biggest announcements at JavaOne 2010 was the new roadmap for JavaFX, laying out the journey towards a 2.0 release that will be radically different from what had gone before -- not so much evolution, as total revolution. While the details, when they arrived, contained a fair few surprises, the overall radical nature of the roadmap was not totally unexpected; many had suspected some kind of upheaval was on the cards. Since its inception, JavaFX had struggled to define itself within the Java community, upsetting some and bamboozling others. Even the 2.0 roadmap announcement was met with some confusion on Twitter [1][2][3], although eventually most tweeters seemed to get the message.

It would seem the about-turn is a reflection of a shift in the way Oracle wants Java to approach Rich Internet Applications in future, combined with a reality check regarding the limitations of the technology for API coding. No doubt the love-it-or-hate-it feedback JFX seemed to garner from the Java community also played its part in the decision.

Two points in the JavaFX 2.0 roadmap particularly caught the imagination of the Twittersphere. Firstly, the JavaFX Script language is to be discontinued as an official Oracle project, and secondly, the key language features and API (scene graph, binding, sequences, etc.) are to be re-engineered into regular Java APIs, making them accessible to languages other than JavaFX Script.

I have to say I'm sad to see JavaFX Script get dropped (and not just for the obvious reason), although as an open source compiler this probably won't be the last we see of it. Dropping official support, however, does mean the language won't necessarily benefit from the full time expertise of so many developers skilled in the art of compiler writing. Going forward, for the language to survive it has to attract the attention of not only graphics enthusiasts, but language specialists too.

JavaFX Script was an attempt to create a DSL (domain specific language) for user interface coding and animation. For those readers who've never dabbled in the black arts of graphics and animation, let me explain: traditionally the GUI programmer uses a general purpose language (or, at least, a language designed for something other than creating GUIs) with a set of graphics/UI APIs sellotaped crudely onto the side. Unlike Java, or C++, or Groovy, or JavaScript, or [insert your favourite here], JavaFX Script was a purpose made language intended to address directly (and grow with) the needs of the GUI developer. Unfortunately its greatest strength was also its greatest weakness: JavaFX Script was not a great general purpose language, and you need a solid general purpose language to create APIs. So the opening up of JavaFX to Java and other JVM languages was a very positive step. Ideally the JavaFX APIs would be built in Java, leaving the JavaFX Script language free to do what it does best: create great user interfaces.

I doubt there were enough desktop programmers in the Java community to truly appreciate the benefit of a DSL like JavaFX Script (more on this point later), and so faced with a barrage of "why didn't you use Groovy?, why didn't you use Scala?", I guess JavaFX Script's days were numbered.

But in a wider sense the change in direction heralded by the new roadmap has, I think, ramifications far beyond just dropping JavaFX Script and re-purposing the APIs to work with Java. In this blog entry I want to examine why JavaFX 1.x divided so many, and consider how the new 2.0 roadmap fits into the current (and future) landscape of RIAs.

Ten ton gorillas

Since the introduction of the first Enterprise Edition, Java has been a heavily lopsided community. The Standard Edition and Micro Edition have always found themselves starved of heat and light, living as they do in the hard shadow of their Goliath enterprise sibling. The gravity field surrounding EE is so great, it even crushed ME, despite ME’s massive initial market penetration. Treated almost as a curious sideshow attraction by the majority of the Java community, ME's growing pains were never addressed in a timely fashion, and its future never seemed to be a priority. I can't think of any other technology that fell quite so far through events of its own making -- even Netscape had the excuse of anti-competitive practices by a major rival!

Likewise, desktop Java has its own woeful tale of neglect (although, to be honest, other complications also conspired to limit its success). JavaFX was supposed to be a brave attempt to address that, to create something new and future facing, ensuring Java’s place at the top table in the second decade of the 21st century. So, naturally, almost everyone in the Java community treated it with deep suspicion!

The EE guys looked at JavaFX and wondered "how can I build a web site out of that?!" (an understandable reaction, given some of them seem to be blissfully unaware anything exists outside of a web browser). The ME folks wanted to know when it would run on a device they might actually have heard of. And the SE clan were mad as hell that the limited Java desktop bodies at Sun/Oracle were not busying themselves adding to Swing.

Of course, of all of these groups it would be the EE people who would ultimately determine the fate of JavaFX, simply because they dominate the community. Naturally they have a highly myopic view of the way software should be written for the internet, a view that sees the browser at the heart of the user experience (no doubt reinforced by over fifteen years of everything being browser based). To sell the EE folks a technology that looked beyond their browser centric worldview was always going to be tough. Many, after all, laboured under the impression that HTML 5 had already been anointed as the platform for the next-gen Rich Internet Applications -- case closed!

Red Herrings

In terms of building application GUIs, the problem with the web browser is that (unlike proper UI/graphics toolkits) it only has one layout paradigm — the flow. Granted, it boasts the mother of all flow layout engines -- not so much "everything but the kitchen sink", more like "everything including the kitchen sink, plus a kitchen sink we robbed from next door when they took that two week vacation!". But there’s no getting away from the fact the browser sees everything as nested blocks of flowing content on a page, much like a desktop publisher; and the page/paragraph model is certainly not the way professional GUIs are created.

With its limited GUI capabilities, stateless protocols, and love of server round-trips, the web browser is an excellent choice for applications centred on a basic label/value form, wrapping a CRUD data model (create/read/update/delete, the record-based philosophy of all popular databases). Once you step away from basic forms and CRUD, however, you find the experience gets progressively less agreeable. The more sophisticated the UI, the greater the challenge to pull it off in HTML. And I say this not as an ignorant onlooker, but as someone who spends their working day coding with browser based visualisation tools like the Exhibit semantic web components from MIT's Simile project.

Many believe HTML 5 will elevate the browser to become a fully fledged RIA technology. They seem to think that, like some cheap televangelist, HTML 5 will magically raise the browser from its seat, invite it to throw away its crutches, and have it doing cartwheels around the room. Undoubtedly HTML 5 adds many useful app-centric features to the browser, but it’s still very immature and I suspect will always be a pain to get working across multiple browsers (the label "write once, debug everywhere" is more true here than anywhere else!). You may have been wowed by the latest IE9 showcase app [1][2], but the kind of effects IE9 is (just about) able to produce on a web page are already run of the mill for the purpose-made retained mode graphics engines driving Flash, Silverlight or JavaFX.

Ultimately the promise of HTML 5 and Canvas is that, at some unspecified point in the future, the browser may be able to offer the same kind of UI richness as already enjoyed today by the iPhone, Flash, JavaFX, etc. (Of course, by the time HTML 5 catches up, who knows where iPhone, Flash... will be?)

So will the browser be the future of RIAs? Despite its limitations, the browser has undeniably one major advantage: it is massively popular with end users. Every internet device has a web browser, and every user knows how to use it. More importantly, the web makes it easy to find and share applications: everything lives at the end of a URL. That familiarity has served the browser well, so many naturally assume this domination will extend into the future -- RIAs will be browser based, the web has won, end of story!

But one fateful day in July 2008 changed all that...

Clairvoyant octopuses

July 11th 2008 (so says Wikipedia) was the day the iPhone App Store opened its doors for the first time, and nothing was quite the same thereafter!

Previously, Apple CEO Steve Jobs had insisted the way to write apps for the company’s iPhone was via the Safari browser -- Jobs was firmly backing HTML 5 and web standards. (Ironic, given what later transpired.) After pressure from developers, however, Apple released an SDK for native iPhone apps, and in July 2008 the first apps started to arrive. iPhone owners quickly voted with their fingertips, and the App Store became an overnight smash hit. All talk of HTML 5 was abandoned, as the App Store became the undisputed USP (unique selling point) of the iPhone product range. And when it became known some developers were getting rich from their iPhone apps -- seriously rich! -- ever more programmers flooded onto the platform, creating a mini gold rush.

Android arrived, bringing the app experience to non-Apple devices, and it too proved to be a success. This was clearly not just a fluke!

What the App Store and Android Market proved was that the web browser was not the only way users wanted to consume their internet apps. The certainty that the browser, and HTML 5, were the sure-fire victors in the RIA race was challenged; now developers had to decide: which skill would be more in demand in five years' time, HTML 5/Canvas or iPhone/Android?

As a web developer it would be all too easy to dismiss the success of the app store model as being localised to smart phones. Sure, phones have apps, but laptops and desktops will always have the browser and HTML, right? But the rise of the tablet has muddied even these waters. Devices like the iPad and Galaxy Tab create bridges between the mobile and desktop space -- if the app store model takes hold on tablets, then how long will it be before the desktop falls for the charms of a store model? Today when you meet with a client to flesh out the requirements for their new online app you ask "which browsers would you like to support?". Tomorrow you may well be asking "do you want iOS, Android, or both?"

Of course the game is still in play, and nobody (except, perhaps, Paul the Psychic Octopus) knows for sure what the end result will be. Personally I will be paying close attention to the tablet space. If tablet users access Facebook and Twitter via a web browser then perhaps HTML 5 will have a future as an application platform. If they instead opt for downloading Facebook or Twitter apps, then those with a heavily web centric skillset had better watch out, as the web could start to revert once more to just a hypertext technology.

(Note: I'm not suggesting either technology will vanish without trace, but history has taught us that when two technologies are in direct competition, sooner or later one starts to dominate.)

Dead parrots (or are they just pining for the fjords?)

So where does all this leave JavaFX?

While the dropping of JavaFX Script and the moving of the JavaFX APIs to Java generated the most chatter, two other points in the 2.0 roadmap are probably more important for the future of the technology. First, JavaFX Mobile is to be shelved in favour of an update to Java ME. Second, the JavaFX graphics APIs are eventually going to be able to target HTML 5.

In light of events post July 2008, this seems like a decidedly odd decision.

JavaFX 1.0 was a technology firmly in the native apps mould. But 2.0 seems to be doing a 180-degree about-face to re-engineer JavaFX for the HTML 5 crowd. Yes, judging by the roadmap it seems JavaFX 2.0 will still be able to target native apps -- but without credible API support extending onto mobile devices (like smart phones and tablets), how can it possibly fit in with Google's Android or Apple's iOS? Surely this couldn't be yet another example of Enterprise Edition's massive gravitational field, sucking everything within close range towards its web centric worldview, while crushing anything that cannot adapt?

It's true JavaFX in its 1.0 form didn't acquire the traction Sun had wanted. (I'm too much of a fan to admit it failed, so let's just agree to say it wasn't as successful as some had hoped!) The dropping of JavaFX Mobile, and with it the "all the screens of your life" agenda, I'm sure was done for pragmatic reasons. But it leaves JavaFX looking decidedly anaemic in a possible future where apps may be expected to run across phones, tablets and desktops. Under 1.x there was still a hope JavaFX might find its way onto Android (once current difficulties had been sorted out), meaning it would span all three form factors. Under 2.0 it looks (assuming I've understood the details right) like we're retreating back to separate and distinct SE and ME platforms, with select JavaFX enhancements being added to each.

On the desktop JavaFX 2.0 will be able to work directly with the native graphics hardware (via the Prism stack) or indirectly via HTML 5. Rather than try to ride both horses at once, I think Oracle should choose whether it wants to back a native model of RIAs (a la Android/iOS), or the web model (a la HTML 5), then decisively engineer JavaFX towards that goal. What I'm particularly concerned about is that the effort to port the scene graph APIs onto HTML 5 will consume so much of the JFX team's resources, and prove so problematic (write once, debug everywhere), that it will act as a ball and chain to limit innovation and slow down growth.

Ultimately, for me, the 2.0 roadmap threw up more questions than it answered. If JavaFX is no longer to be for all the screens of our life, what is it good for? Is it merely for adding snazzy animations to Swing applications, or is it a replacement for Swing itself? (The answer is apparently "the latter".) If JavaFX is no longer required to run on phones, do we need a new UI? -- couldn't we just add a CSS skinnable PLaF to Swing? Of course the obvious answer is the Swing components don't lend themselves to being manipulated in a scene graph, but how often does one honestly need to spin a progress bar through 360 degrees, or apply a blur effect to a file selector?

With the announcement of the roadmap JavaFX has effectively been plunged back into pre-release beta. The transition from JavaFX 1.3 to 2.0 is far from clear; if I commit to 1.3 now, will I be able to move smoothly over to 2.0 (using the open source JavaFX Script compiler), or will the Java-ized APIs be sufficiently different from the JavaFX originals to make migration painful? Even if the answers to questions like these are favourable, I suspect the roadmap has killed JavaFX dead for the time being (how can anyone commit a major project to any technology destined for serious overhaul in less than 12 months?).

While I recognise the pragmatism behind the roadmap's decisions (even if I don't necessarily agree with all its conclusions), the most frustrating part is that we JavaFX fanboys (guilty!) are now back to the familiar wait-and-see game, biding our time and twiddling our thumbs before we can start over with the new re-imagined platform.

Although I've never pretended everything in the JavaFX garden was rosy, I have always been (and hope to remain) a JavaFX fan. I sincerely wish the development team all the best as they push forward into the uncharted realms of 2.0, and nothing would give me greater pleasure than to see a healthy JavaFX 2.0 dominating the RIA space, coupled with a strong and vibrant JavaFX Script community. But I have this uneasy feeling that the app store will win out (admittedly, no psychic octopus has yet ruled on the matter), and 2.0 seems to be a move away from that model (or, at the very least, a significant distraction in the form of supporting HTML 5).

That's all I have to say on the matter for now. But I'll leave you with one final thought: given Android doesn't really have a native retained mode graphics stack, and relies upon separate XML files for its declarative UI support, wouldn't it be a bittersweet irony (from Oracle's point of view) if an open sourced JavaFX Script found favour with Android developers, coupled with an Android specific scene graph, to create the kind of rich user interfaces that Chris Oliver originally intended for Java?

Now, does anyone know if Paul the Octopus is contactable on Twitter? I have a few questions... :)

Amidst all the hype of the Sun to Oracle transition over the last week, some of you may have missed a certain announcement by a Cupertino-based firm regarding the imminent release of a computing device they say will fill the gap between netbooks and laptops. The Apple iPad is not, as some onlookers first suspected, an innovative feminine hygiene product, but a tablet device promising to offer (to quote Apple CEO Steve Jobs) "the best browsing experience you've ever had".

But "the best browsing experience" does not include web plugins, it seems, as despite the iPad's 1024x768 screen, 1GHz processor, and support for wireless connectivity, it is destined to be bereft of Flash, Silverlight, and (everyone's favourite in this neck of the woods) Java.

These omissions, particularly Flash, have caused some degree of controversy online, but why is Apple thumbing its nose at plugins?

Since the iPad product announcement the Flash debate has been raging across the blogosphere, on both pro-Apple and pro-Adobe sites. As you'd expect, the Adobe fans point to the prominence of Flash on the web, its use for games, video, and rich interactive content. This is undeniably true -- even Jobs' own carefully crafted iPad introduction quickly stumbled across a couple of examples of missing Flash content as he demonstrated browsing the web, although he was careful to ensure the tell-tale blue Lego logos were quickly scrolled out of view.

Apple fans respond by echoing Steve Jobs' own alleged comments that Adobe are lazy and produce crash-prone software. The iPad is better off without Flash, they say, because data gathered by Apple's crash reporting tools has shown the plugin is the single greatest source of Safari crashes, and its video playback is far too CPU-hungry.

Besides, isn't HTML5 the future?

These responses ignore a few simple facts. For a start Flash's high ranking in the crash report statistics could surely be explained by its overwhelming popularity on the web. Plugin systems, by their very nature, seem to be particularly prone to crashing -- for reasons far too numerous to list here trying to embed one application inside another, seamlessly and without either tripping the other up, has always been a particularly difficult trick to pull off. This is especially true of desktop software, where the thick and complex code glue that usually binds the two independent GUIs together and keeps them in sync often has to keep up with the erratic actions of the user. The question is this: when the stats are adjusted to account for popularity, is Flash any worse than other plugins?
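To make "adjusted to account for popularity" concrete, here is a back-of-the-envelope normalisation sketched in Java. Every number below is invented purely for illustration; the point is only that raw crash counts mean little until divided by exposure:

```java
public class CrashRate {
    // Raw crash counts mean little until divided by how often the
    // plugin is actually loaded; normalise to crashes per million loads
    static double perMillionLoads(long crashes, long loads) {
        return crashes * 1_000_000.0 / loads;
    }

    public static void main(String[] args) {
        // A hugely popular plugin with a big raw crash count...
        System.out.println(perMillionLoads(90_000, 900_000_000L)); // 100.0
        // ...may still fail less often than a niche one with few crashes
        System.out.println(perMillionLoads(2_000, 10_000_000L));   // 200.0
    }
}
```

In this (made-up) scenario the "crashiest" plugin in the raw statistics is actually the more reliable one per page load.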

And as for Flash's video playback being slow, the reason for this, according to Adobe (click "Product Details"), is because Apple refuses to expose the required APIs for hardware optimised decoding -- presumably they would prefer everyone to take the QuickTime route for video.

But what of the final charge against Flash, that it's obsolete now HTML5 has arrived? This relates specifically to Flash's use on sites like YouTube to play video. Even if for argument's sake we accept this as true, it ignores Flash's other numerous features, and it doesn't explain why Java is also missing.

i-Apologists have a simple answer for this: Apple is trying to defend the freedom of the web by taking a stand against closed source. They paint Apple as a knight in shining armour, with HTML5 its Excalibur, engaging in an ideological crusade against the evil forces of proprietary software (non-Apple proprietary software, naturally!) that seek to enslave the good citizens of the web. To defend everyone from the tyranny of Flash and its fellow freedom-hating plugins, Apple (our hero!) has banished them into the wilderness -- anyone suggesting this is a bit like fighting totalitarianism by burning all books that don't advocate democracy will be thrown in the dungeons! (Although these being Apple i-Dungeons, they probably have nice shiny walls and brushed metal trim on the thumb screws...)

The iPhone's limited 3G connection at least presented a plausible excuse for its lack of Flash -- but on the iPad (a Wi-Fi device being promoted as "the best browsing experience you've ever had") its continued non-presence looks more than a little absurd. Is this really the result of Steve Jobs' desire to keep the web open? I, for one, doubt it!

Apple makes substantial profit from selling software through its app store; this revenue stream would be severely compromised by rival (and free) application and media playing platforms like Flash, Silverlight, or our own JavaFX. We've become accustomed to walled gardens on cell phones, simply because of the limited nature of the hardware and networks (although how long this will last in the age of the smart phone is anyone's guess), but the iPad is distinctly not a phone, it is being pitched into the space between netbooks and laptops, an arena where the consumer does not expect to encounter walled gardens. (And remember, this is an artificial barrier, not created by the technical limitations of the device but the desire of the manufacturer to drive content towards their revenue creating services -- perhaps the iPad should be renamed the i-Pay?!)

Ultimately it could be said that Apple is just exercising its right to put out products as it sees fit, and this is undeniably true. And consumers, after all, don't have to buy the iPad! But if we're talking about freedom of expression here, surely it is my (and others') right to exercise free speech by commenting on their decisions, or indeed asking questions about the wider issue: should companies like this be allowed to arbitrarily block software applications and platforms from their devices?

A decade ago Microsoft was dragged into court by the US Department of Justice for taking advantage of its privileged position as an operating system maker to heavily promote and favour its own products. But Microsoft at no point banned Netscape, Java, or other competing products from Windows computers. It tried to influence new users towards its own products by pre-installing them and giving them high visibility on the desktop, but ultimately the end user was free to ignore the Microsoft choice and install an alternative instead. Not so with the iPad, it seems!

I guess what I'm trying to say is, while I admire the style and innovation Apple bring to each product, the political baggage they're starting to accumulate is really beginning to turn me off. And perhaps I don't like the message it sends out should the iPad prove to be successful -- that manufacturers, not consumers, should choose which software platforms run on their devices. As the Java community finally seems to be on the verge of getting the client facing tools it needs (JavaFX and its associated designers/editors), perhaps I dread the thought of being locked out of the next generation of devices.

JavaOne is coming up, and with it no doubt a slew of enhancements to JavaFX. Many of you reading will have dipped your toe into the waters of Sun's new platform, but how well do you really understand the power of its domain-specific language, JavaFX Script?

I've had a pretty good excuse to write lots of JavaFX code of late, and to be exposed to some of the difficulties programmers first have when approaching the language. One of the common stumbling blocks seems to involve the shift in thinking from purely procedural source code (where the function/method is our chief currency) to the declarative source code supported by JavaFX Script (where code may be nested into a tree structure).

So, what I'm going to do in this post is take a problem ripe for declarative exploitation, and show how JavaFX Script might change the way we tackle the solution. For newbies struggling to understand the power of JavaFX, hopefully this will be an eye opener. For the uninitiated, perhaps it might demonstrate how JavaFX Script differs from Java and other languages.

The example I've chosen is the parsing of an XML file — a simple enough task one might think, but a task which highlights the stark contrasts between the structure of the document being processed, and the structure of the source code processing it. (Hopefully that last sentence will make a bit more sense by the time you've finished reading this posting!)

To parse XML in JavaFX we would typically reach for the javafx.data.pull.PullParser class. PullParser walks the document from start to finish, breaking the XML up into a stream of events: for opening tags, for closing tags, and for any loose text inside them. We register a function to receive the events as they come in, and do with them as is our wont. It all seems very straightforward, so where's the problem?
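(PullParser itself is JavaFX-only, but the same pull model exists in standard Java as StAX, in javax.xml.stream. A rough sketch, just to make the "stream of events" idea concrete — the class and method names here are mine:)

```java
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;
import java.io.StringReader;

public class PullDemo {
    // Walk the document start to finish, flattening it into a list of
    // events: one per opening tag, closing tag, and run of loose text
    static String events(String xml) {
        try {
            XMLInputFactory f = XMLInputFactory.newInstance();
            f.setProperty(XMLInputFactory.IS_COALESCING, Boolean.TRUE);
            XMLStreamReader r = f.createXMLStreamReader(new StringReader(xml));
            StringBuilder sb = new StringBuilder();
            while (r.hasNext()) {
                switch (r.next()) {
                    case XMLStreamConstants.START_ELEMENT:
                        sb.append("start:").append(r.getLocalName()).append(' ');
                        break;
                    case XMLStreamConstants.CHARACTERS:
                        if (!r.isWhiteSpace())
                            sb.append("text:").append(r.getText()).append(' ');
                        break;
                    case XMLStreamConstants.END_ELEMENT:
                        sb.append("end:").append(r.getLocalName()).append(' ');
                        break;
                }
            }
            return sb.toString().trim();
        } catch (javax.xml.stream.XMLStreamException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(events("<shop><title>Shop example</title></shop>"));
        // → start:shop start:title text:Shop example end:title end:shop
    }
}
```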

Consider the (slightly contrived) XML below:

<?xml version="1.0" ?>
<shop>
    <title>Shop example</title>
    <customers>
        <customer>
            <name>Joe Smith</name>
            <address>123 Java Street</address>
        </customer>
        <customer>
            <name>Fred Bloggs</name>
            <address>456 Duke Road</address>
        </customer>
    </customers>
    <products>
        <product>
            <name>Widget Maker</name>
            <price>20 groats</price>
        </product>
        <product>
            <name>Chocolate Fireguard</name>
            <price>99 credits</price>
        </product>
    </products>
</shop>

Both <customer> and <product> contain <name> elements, and both <name> elements are at the same level in the document. JavaFX's PullParser tells us about the element causing the event, but little of where the element is in the document as a whole (aside from its level). This results in long blocks of conditional code, along the lines of...

If the event is a 'text' event, and our immediate parent is <name>, and it was at level 3 in the document, and its parent was <customer>...

A bit of a mouthful, just to ensure we know which bit of text in the XML we're processing. And remember, the only way of knowing the lineage of an event is to keep track of where we are in the document ourselves.

Things get even messier when we want to populate objects as we process the XML. Suppose we created a couple of classes, one to hold customer details, the other to hold product details, and we intend to populate and store objects of these classes as we walk over the XML.

class Customer {
    var name:String;
    var address:String;
}

class Product {
    var name:String;
    var price:String;
}

var title:String;
var customers:Customer[];
var products:Product[];

Because each XML element is delivered as separate, discrete events we can't use a local Customer/Product object; any objects we create must persist over multiple event function calls, giving us little alternative but to use private instance variables for this temporary data.

For any non-trivial XML document the event handling code quickly grows in complexity and acquires a burgeoning collection of private variables for maintaining state. The code starts to become a tad spaghetti-like, and inflexible — even a simple change to the XML format may require a lot of code hacking. The problem is, of course, we're trying to apply a very linear way of working to a non-linear data structure. If only we could write our event handling code as a tree, so it would mirror the XML.
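For readers without a JavaFX compiler to hand, here is roughly what that flat, stateful style looks like in plain Java with StAX. ShopLoader and its fields are my own invention, and I've reduced customers/products to bare name strings to keep it short: the lineage is tracked by hand on a stack, and half-built values must live in fields between callbacks.

```java
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamException;
import javax.xml.stream.XMLStreamReader;
import java.io.StringReader;
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

public class ShopLoader {
    // State that has to persist between event callbacks
    private final Deque<String> path = new ArrayDeque<>(); // where are we?
    private String customerName, productName;              // half-built records
    final List<String> customers = new ArrayList<>();
    final List<String> products = new ArrayList<>();

    void load(String xml) {
        try {
            XMLStreamReader r = XMLInputFactory.newInstance()
                    .createXMLStreamReader(new StringReader(xml));
            while (r.hasNext()) {
                switch (r.next()) {
                    case XMLStreamConstants.START_ELEMENT:
                        path.push(r.getLocalName());
                        break;
                    case XMLStreamConstants.CHARACTERS:
                        if (r.isWhiteSpace()) break;
                        // The conditional mess: same tag name, different lineage
                        Object[] p = path.toArray(); // p[0]=self, p[1]=parent
                        if (p.length > 1 && "name".equals(p[0])) {
                            if ("customer".equals(p[1])) customerName = r.getText();
                            else if ("product".equals(p[1])) productName = r.getText();
                        }
                        break;
                    case XMLStreamConstants.END_ELEMENT:
                        if ("customer".equals(path.peek())) customers.add(customerName);
                        if ("product".equals(path.peek())) products.add(productName);
                        path.pop();
                        break;
                }
            }
        } catch (XMLStreamException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        ShopLoader s = new ShopLoader();
        s.load("<shop><customers><customer><name>Joe Smith</name></customer></customers>"
             + "<products><product><name>Widget Maker</name></product></products></shop>");
        System.out.println(s.customers + " " + s.products);
    }
}
```

Even in this toy version the structural knowledge is smeared across the switch statement; add a third record type and every branch needs revisiting.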

Expert JavaFX Script programmers will be able to guess what's coming next.

def shopEventTree = Element {
    name: "shop"
    content: [
        Element {
            name: "title"
            onText: function(ev:Event) { title=ev.text; }
        } ,
        Element {
            name: "customers"
            content: [
                Element {
                    name: "customer"
                    var c:Customer;
                    onStart: function(ev:Event) { c=Customer{}; }
                    content: [
                        Element {
                            name: "name"
                            onText: function(ev:Event) { c.name=ev.text; }
                        } ,
                        Element {
                            name: "address"
                            onText: function(ev:Event) { c.address=ev.text; }
                        }
                    ]
                    onEnd: function(ev:Event) { insert c into customers; }
                }
            ]
        } ,
        Element {
            name: "products"
            content: [
                Element {
                    name: "product"
                    var p:Product;
                    onStart: function(ev:Event) { p=Product{}; }
                    content: [
                        Element {
                            name: "name"
                            onText: function(ev:Event) { p.name=ev.text; }
                        } ,
                        Element {
                            name: "price"
                            onText: function(ev:Event) { p.price=ev.text; }
                        }
                    ]
                    onEnd: function(ev:Event) { insert p into products; }
                }
            ]
        }
    ]
}

Element is the first of two JavaFX classes designed to solve our problems. You can see it in action above, being used to create a tree which mimics the XML document we're loading. Each Element relates to a markup node in the document, with name representing the node's tag name. Three event functions (onStart(), onText() and onEnd()) fire on the opening tag, the closing tag, and any loose text inside.

Note how we now handle the creation of those Customer and Product objects (see the onStart and onEnd functions in the listing). The object is local to the part of the structure it belongs to: onStart() creates it, the corresponding onEnd() stuffs it into the sequence, and in-between the nested Elements populate it.

def xeh:XMLEventHandler = XMLEventHandler {
    eventTree: shopEventTree;
}

def parser = PullParser {
    documentType: PullParser.XML;
    input: (new java.net.URL("{__DIR__}test.xml")).openStream();
    onEvent: xeh.handleEvent;
}

The Element class has a companion, XMLEventHandler, which accepts the Element tree and provides a function for PullParser to call with its events. XMLEventHandler tracks the XML's progress and directs the flat PullParser event to the related Element event function in our event tree. XMLEventHandler can tell the difference between the <name> in <customer> and the <name> in <product>, so we no longer need to worry!

The above is only an example, and far from perfect. For large XML documents the Element tree grows to hundreds of lines — hardly ideal. A more professional solution might demand the breaking down of the XML into several sub-trees, which we could model independently and re-use as necessary. Still, Element and XMLEventHandler serve their purpose as a demonstration of the power of thinking declaratively.
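Neither Element nor XMLEventHandler exists outside JavaFX, but the dispatch idea translates. Here is a sketch in plain Java (Node and TreeParser are names I've made up): a tree of handler nodes mirrors the document, and a StAX loop keeps a stack of matched nodes so each event is routed to the handler sitting at the same position in the tree.

```java
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamException;
import javax.xml.stream.XMLStreamReader;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.Deque;
import java.util.LinkedList;
import java.util.List;
import java.util.function.Consumer;

// One node of the event tree, mirroring one element of the XML
class Node {
    final String name;
    Runnable onStart = () -> {};
    Consumer<String> onText = t -> {};
    Runnable onEnd = () -> {};
    final List<Node> content = new ArrayList<>();
    Node(String name) { this.name = name; }
    Node child(String n) {
        for (Node c : content) if (c.name.equals(n)) return c;
        return null; // element not modelled in the tree: ignore its events
    }
}

public class TreeParser {
    // The stack pairs the parser's position with a position in the tree, so
    // <name> under <customer> and <name> under <product> reach different
    // handlers without a single line of conditional code
    static void parse(String xml, Node root) {
        try {
            XMLStreamReader r = XMLInputFactory.newInstance()
                    .createXMLStreamReader(new StringReader(xml));
            Deque<Node> stack = new LinkedList<>(); // may hold nulls
            while (r.hasNext()) {
                switch (r.next()) {
                    case XMLStreamConstants.START_ELEMENT: {
                        Node cur;
                        if (stack.isEmpty())
                            cur = root.name.equals(r.getLocalName()) ? root : null;
                        else
                            cur = stack.peek() == null ? null
                                    : stack.peek().child(r.getLocalName());
                        stack.push(cur);
                        if (cur != null) cur.onStart.run();
                        break;
                    }
                    case XMLStreamConstants.CHARACTERS:
                        if (!r.isWhiteSpace() && stack.peek() != null)
                            stack.peek().onText.accept(r.getText());
                        break;
                    case XMLStreamConstants.END_ELEMENT: {
                        Node cur = stack.pop();
                        if (cur != null) cur.onEnd.run();
                        break;
                    }
                }
            }
        } catch (XMLStreamException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        List<String> names = new ArrayList<>();
        Node name = new Node("name");
        name.onText = names::add;
        Node customer = new Node("customer"); customer.content.add(name);
        Node customers = new Node("customers"); customers.content.add(customer);
        Node shop = new Node("shop"); shop.content.add(customers);
        parse("<shop><customers><customer><name>Joe</name></customer>"
            + "<customer><name>Fred</name></customer></customers></shop>", shop);
        System.out.println(names); // [Joe, Fred]
    }
}
```

Java's builder code is clearly clunkier than the JavaFX Script literal, which is rather the point of the whole article.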

The beauty of JavaFX Script lies in its ability to shape code around the data structures we work with. By utilising JavaFX Script's declarative syntax we were able to transform a (potentially troublesome) linear event handler into an event tree whose source code mirrored the document being parsed. The result was more readable, flexible, and easier to maintain. So, bottom line: don't just think about code to manipulate your data, start thinking about code which mimics (models) your data... :)

[Source code] (3k)


JavaFX in Style Blog

Posted by javakiddy Dec 31, 2008

One of the most touted parts of the new JavaFX API is the ability to skin UI controls using CSS-like stylesheets. However the current 1.0 release seems to be rather light on skin-aware controls, while documentation and examples seem to be rarer than a woman at a Star Trek convention. (That's my derogatory stereotyping quota used up for this year!)

Not that lack of documentation ever stopped anyone, of course!

A few days back I was playing about with some code, trying to unlock how the stylesheet support might work. All of a sudden a forum posting by tamerkarakan turned up, containing some hastily written (but insightful) notes — presumably the rewards of his own digging around in the code and trial-and-error testing.

I thought I'd work his findings up into a more complete, practical, example. What follows is a step-by-step guide to creating your own style-aware JavaFX control, including an external stylesheet which can transform the look of the component without need for recompilation of the JavaFX Script code.


Step one is to create a new type of control which supports a skin. For this example I thought we'd build something pretty simple, a basic progress bar.

package skintest;
import javafx.scene.control.Control;

public class Progress extends Control
{   public var minimum:Number = 0;
    public var maximum:Number = 100;
    public var value:Number = 50;

    init
    {   skin = ProgressSkin{};
    }
}

The class extends javafx.scene.control.Control rather than javafx.scene.CustomNode, giving us access to an inherited field, skin. This is where we will plug in our scene graph code, which actually draws the control. This class just houses our control's public properties.

Step two is to create the skin itself, giving our control a UI.

package skintest;
import javafx.scene.Group;
import javafx.scene.control.Skin;
import javafx.scene.input.MouseEvent;
import javafx.scene.layout.HBox;
import javafx.scene.paint.*;
import javafx.scene.shape.Rectangle;

public class ProgressSkin extends Skin
{   public var boxCount:Integer = 10;
    public var boxWidth:Number = 15;
    public var boxHeight:Number = 20;
    public var boxHGap:Number = 2;
    public var unsetHighColor:Color = Color.YELLOW;
    public var unsetMidColor:Color = Color.GREEN;
    public var unsetLowColor:Color = Color.DARKGREEN;
    public var setHighColor:Color = Color.ORANGE;
    public var setMidColor:Color = Color.RED;
    public var setLowColor:Color =  Color.DARKRED;

    def boxValue:Integer = bind
    {   var p:Progress = control as Progress;
        var v:Number = (p.value-p.minimum) / (p.maximum-p.minimum);
        (boxCount*v) as Integer;
    }

    init
    {   def border:Number = bind boxWidth/10;
        def arc:Number = bind boxWidth/2;
        def lgUnset:LinearGradient = bind makeLG
            (unsetHighColor,unsetMidColor,unsetLowColor);
        def lgSet:LinearGradient = bind makeLG
            (setHighColor,setMidColor,setLowColor);
        scene = HBox
        {   spacing: bind boxHGap;
            content: bind for(i in [0..<boxCount])
            {   Group
                {   content:
                    [   Rectangle
                        {   width: bind boxWidth;
                            height: bind boxHeight;
                            arcWidth: bind arc;
                            arcHeight: bind arc;
                            fill: bind
                                if(i<boxValue) setLowColor
                                else unsetLowColor;
                        } ,
                        Rectangle
                        {   x: bind border;
                            y: bind border;
                            width: bind boxWidth-border*2;
                            height: bind boxHeight-border*2;
                            arcWidth: bind arc;
                            arcHeight: bind arc;
                            fill: bind
                                if(i<boxValue) lgSet
                                else lgUnset;
                        }
                    ]
                }
            }
        }
    }

    function makeLG(c1:Color,c2:Color,c3:Color) : LinearGradient
    {   LinearGradient
        {   endX: 0;  endY: 1;
            proportional: true;
            stops:
            [   Stop { offset:0;    color: c2; } ,
                Stop { offset:0.25; color: c1; } ,
                Stop { offset:0.50; color: c2; } ,
                Stop { offset:0.85; color: c3; }
            ]
        }
    }
}
Our skin extends (unsurprisingly) Skin. This is where our UI code for the Progress control actually lives. We create our scene graph and plug it into the inherited scene variable. The public properties at the head of the file will be controllable through a stylesheet, as we'll see shortly.
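The only non-obvious part of the skin is the boxValue calculation, which maps the control's value onto a number of filled boxes. A quick restatement of that arithmetic in plain Java (the method and class names here are purely illustrative):

```java
// Plain-Java restatement of the skin's boxValue arithmetic: normalise
// the control's value into the range 0..1, then scale by the number of
// boxes and truncate to a whole box count.
public class BoxValueSketch {
    static int boxValue(double value, double min, double max, int boxCount) {
        double v = (value - min) / (max - min);  // 0.0 .. 1.0
        return (int) (boxCount * v);             // whole boxes filled
    }

    public static void main(String[] args) {
        System.out.println(boxValue(50, 0, 100, 10));   // 5
        System.out.println(boxValue(100, 0, 100, 10));  // 10
    }
}
```

So with the defaults (minimum 0, maximum 100, value 50, ten boxes), half the boxes light up.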


Step three creates a test application for our new Progress control.

package skintest;
import javafx.animation.*;
import javafx.scene.Scene;
import javafx.stage.Stage;
import javafx.scene.layout.VBox;
import javafx.scene.paint.Color;

var val:Number = 0;

Stage
{   scene: Scene
    {   content: VBox
        {   spacing: 10;
            translateX: 5;  translateY: 5;
            content:
            [   Progress
                {   minimum: 0;  maximum: 100;
                    value: bind val;
                } ,
                Progress
                {   id: "testId";
                    minimum: 0;  maximum: 100;
                    value: bind val;
                } ,
                Progress
                {   styleClass: "testClass";
                    minimum: 0;  maximum: 100;
                    value: bind val;
                }
            ]
        }
        stylesheets: [ "{__DIR__}../Test.css" ]
        fill:  Color.BLACK;
        width: 200;  height: 100;
    }
    title: "CSS Test";
    visible: true;
}

Timeline
{   repeatCount: Timeline.INDEFINITE;
    autoReverse: true;
    keyFrames:
    [   at(0s)   { val => 0 } ,
        at(0.1s) { val => 0 tween Interpolator.LINEAR } ,
        at(0.9s) { val => 100 tween Interpolator.LINEAR } ,
        at(1s)   { val => 100 }
    ]
}.play();

Here we create three instances of our control. The second instance has a specific id, and the third has been assigned a style class. The first has neither. The significance of this will be apparent when we look at the stylesheet, next.

Step four is to create a stylesheet.

"skintest.Progress"
{   boxWidth: 15;
    boxHGap: 2;
    setHighColor: yellow;
    setMidColor: red;
    setLowColor: darkred;
    unsetHighColor: cyan;
    unsetMidColor: blue;
    unsetLowColor: darkblue;
}

"skintest.Progress"#testId
{   boxWidth: 25;
    boxHeight: 30;
    boxCount: 7;
    boxHGap: 1;
    unsetHighColor: white;
    unsetMidColor: silver;
    unsetLowColor: dimgray;
}

"skintest.Progress".testClass
{   boxWidth: 7;
    boxHGap: 2;
    boxCount: 20;
    setHighColor: yellow;
    setMidColor: limegreen;
    setLowColor: darkgreen;
}
The above is an external stylesheet file, with three example style rules. The first rule will match any instance of the skintest.Progress control (all of them, in effect); the second rule will match only the Progress control which has the id "testId"; while the third rule matches any Progress control with the style class "testClass".

In each case we are able to manipulate the control's public properties, changing its size, the number of boxes, its colours, etc — all without recompiling the JavaFX Script code.

The source code for this mini-project is available here. Have fun (and don't forget to buy "JavaFX in Action" when it comes out! :)


No Future In Java Blog

Posted by javakiddy Nov 26, 2008
A C++ programmer walks into a Usenet newsgroup, "I don't see the point of Java!" he announces.
"It allows your code to work on many different platforms...", replies a local Java programmer.
The C++ programmer is unconvinced, "I can already do that with C++", he blusters.
"...without re-compiling your code for each platform", adds the Java programmer with a smile.
"What?!", shouts the C++ guy, "Why would I EVER need to do anything like THAT?!?"
Okay, so as jokes go it's pretty lame, but conversations not unlike the above were ten a penny back when Java first appeared in the mid 1990s. A lot of programmers just didn't get it — but then, why would they? They had a lot invested in their current tools, and C++ (Fortran, BASIC, whatever) was serving their needs just fine. The newfangled World Wide Web may have been a fun distraction, but it wasn't likely to change the programming landscape, was it..?

Roll forward almost a decade and a half, and it's blindingly obvious why the internet needed something more than C++. The software world evolves, but usually in slow baby-step increments. Very occasionally, however, the industry throws up an idea which tempts us with the promise of a giant leap forward, and a clutch of new and unusual tools emerge to explore the possibilities. Not all giant leaps deliver — let's not forget Java started life aimed at a smart device explosion which failed to materialise (until years later). A quick name change and a fresh lick of paint later, and Java became the internet-focused technology we know and love.

Some say we're rapidly approaching another seismic shift in the software world, at least from a desktop perspective. Whether you agree with the idea or not, Cloud Computing and Rich Internet Applications have many excited (guilty!) In anticipation of this brave new world we have the accompanying slew of strange new tools: Silverlight, AIR, and our very own JavaFX, which debuts in only a few days. They promise to change forever the way we think about user-facing software, yet already seasoned Java programmers (e.g. some respondents to Fabrizio Giudici's blog a couple of weeks back) are asking "What?! Why would I ever need to do something like that?!?"

As I see it, there are two questions here: firstly, will Cloud/RIA computing become a significant force on the desktop? Secondly, assuming it does, how can the Java community grab a share of this market?
The first one is easy to answer: a quick Gallic shrug and an apathetic "dunno" will suffice. It's the same answer we can confidently give for man-made global warming, or whether any of us will have jobs this time next year. Short of James Gosling becoming the eleventh Doctor Who, we can never say with confidence how things will turn out for JFX until some time after the market has already decided. Tipping points are never appreciated when they happen; it usually takes a hefty dose of hindsight to uncover why things ended up the way they did. All we can do is guess at a reasonable worst case scenario, and equip ourselves accordingly.

Let's put the first question to one side by assuming RIAs are the future — what should the Java community do about them? Giant leaps often benefit from new languages; do RIAs need one too? The temptation is to defend our current tools against any criticism, to pretend not to notice their blind spots, or to claim their deficiencies don't matter. Recall how our comical C++ coder was quick to equate the cross-platform ability of C++'s source code with the platform neutrality of Java's binaries, and was totally blinded to the benefits of Write Once Run Anywhere in a network-centric world.

Surely the most pragmatic answer is, "we don't know"? It seems reasonable to assume RIAs may benefit from some sort of DSL, although Swing and WebStart may also suffice. It all depends upon whether the informality associated with browser-based user interfaces is a passing fad, or a permanent change in end user tastes. But one thing we can say with certainty: no industry can progress unless someone takes a punt. The only reason we're talking now about "all the screens of your life" (to paraphrase the JavaFX motto) is because Sun took a gamble with Java. (Just how far would we have got with C++ and Perl...?)
There are indeed some similarities between the early days of Java and JavaFX: JFX, like Java, will be launched in the face of a well established (and entrenched) opposition, and doesn't truly innovate so much as integrate and focus established but less visible ideas into a tool suited for solving a particular type of problem. But there are also some pretty stark differences: in 1995 Java could ride on the guaranteed street cred of bringing eye candy colour to otherwise grey HTML pages, yet in 2008 JavaFX doesn't have the market to itself, and rippling lake applets no longer wow today's fickle users.

JavaFX's big trump card is its interoperability with Java. Rather than the client/server chasm inevitable with rival technologies — Java on the server, ActionScript on the client, for example — the developer can decide how far across the span of the application Java should reach. It's possible to develop a thin JavaFX Script veneer over heavyweight Java libraries, or (the other extreme) to restrict Java to only a server side role and let JFX rule the desktop — the option is left open. This demotes familiar chasm-straddling technologies (SOAP/JSON/REST/etc.) from mandatory requirement to implementation choice.

Only time will tell what the future has in store for JavaFX. If it turns out RIAs don't deserve a whole new platform, and Java will suffice, then we haven't lost much in hedging our bets. If the opposite is true, however, then at least JavaFX still gives us a fighting chance...
If you're following my recent adventures (do you have nothing better to do?) you'll know I've been spending a lot of time with JavaFX Script recently. It's a language which targets a wider audience than its bigger cousin — more JavaScript than Java. Thanks to The Java Posse I was recently alerted to Fabrizio Giudici's blog asking which scripting languages should be supported in his blueMarine project. And then there's our illustrious editor's own blog of a few days back, in which he concludes:
High-school age programmers probably don't want to write financial applications or transactional databases... but they might be interested in making devices see, hear, spin, roll, lift, and/or just blow up. Can they use Java for that? Should they?
All three of these, in their own way, address a common issue: how do we bring the complex world of software development within the reach of the novice (or occasional) programmer?

Okay, time for another "when I was a lad" digression: in days of yore to know about "computers" was to know about programming. Mainly BASIC, which was the language of choice on early home computers. Amazing as it seems, the computer actually "booted" straight into BASIC, and general housekeeping tasks like loading a program or deleting a file were done through BASIC direct mode instructions. The core of every high school "Computing" course (the ugly term "I.T." had yet to be invented) was coding software. Sure, you'd be shown a word processor, on the off chance you might encounter such an exotic beast in the outside world (and if you were really lucky you might have ten minutes on a spreadsheet) but mainly it was programming, programming, programming.

How times change! Not that I'm complaining; things can't stand still, and the BASIC front-end was doomed to be replaced almost from the moment it first appeared. And naturally this would be reflected in the classroom. But I sometimes wonder if removing all programming from the core I.T. curriculum, and parcelling it off into speciality modules, was a wise decision.

Increasingly the rich multimedia world we live in is exposing itself to the end user by way of programming APIs and software languages. At one end of the scale we have HTML: although not strictly a programming language it is a form of 'coding', which benefits from a familiarity with software development. At the other end of the scale we have an increased visibility (and importance?) of script-ability features, like those Fabrizio is considering for blueMarine. JavaFX Script is a domain specific language for graphics, in itself not an idea which one would consider controversial.
But the decision to make the syntax akin to scripting languages like ECMAScript did appear to raise a few eyebrows when JavaFX was first formally taken into the Java family a couple of JavaOnes back. Is this a case of some of the Java faithful adopting a rather snobbish attitude to a syntax clearly designed to court the likes of web designers and animators? (An aside: I urge anyone with mixed feelings to give JavaFX a second try when the full release arrives shortly — there's a lot of power hidden beneath that user-friendly facade.)

If I'd been born ten or fifteen years after I was, I probably wouldn't have become a programmer! Without an interest in financial applications or transactional databases (strange that!) I would presumably have given software development a miss. As Chris Adamson so succinctly noted, mainstream languages like Java don't seem to offer much that would fire the imagination of young programmers. Even platforms like Flash and phone MIDlets, once awash with games, now seem more concerned with boring middle-aged pursuits like database connectivity and web services.

What I see is a world in which every part of our lives — no longer just our work, but our play too — is being moved into the digital realm. Our music, our memories, our innermost secrets: increasingly they are being digitised and pushed onto the cloud (whether or not that's a good idea is an entirely separate debate!) Anyone with at least modest levels of coding skill is at an advantage in making these tools work and inter-operate. Yet the routes into programming seem dry and stuffy, devoid of any fun or instant gratification — a far cry from twenty years ago.

Technologies like JavaFX, with rapid code/compile/test cycles (for near instant gratification) and accessible multimedia features, may offer the answer. But do they risk being tainted by unwarranted criticism from hard-core coders who see their average-Joe syntax, and just don't 'get it'?
There must be a name for that particular form of programming masochism which involves wringing the maximum effect out of the minimum of code. If not, someone should invent one!

I first began coding when the Apple II and Commodore 64 (et al...) opened up computing to the masses. By modern standards their 32 or 64k of RAM is an impossibly small amount — I'd love to report it was more than enough back in the day, but I'm reminded of all those evenings I spent byte shaving. Still, as frustrating as it could sometimes be, there was a certain joy in coding those older systems, a certain pride in making everything as tight as possible to fit the most in. Shaving every byte, stripping away redundant clock cycles (zero page addressing, anyone?) and banking the Kernal ROM (Commodore's spelling) in and out of the 6510's address space to get at the RAM 'underneath'.

In this age of the gigabyte, it can still be a useful exercise to see how much one can do with a given technology under tight (if artificial) constraints. The legendary 64k intro scene is a modern example, but as long ago as the 16 bit days I seem to recall a STOS or AMOS BASIC sponsored competition for writing games in ridiculously few lines of code (ten..?) (Actually, wasn't Brainf**k the product of an Amiga owner?)

In recent weeks I've found myself needing to develop a lot of small JavaFX examples. I looked at what Joshua Marinacci had done with his 45 line 'preview' demo, setting the bar quite high, and wondered if I could do better...

The increase in power of our computers has necessitated layers of abstraction. By contrast the graphics on the old 'micros' involved nothing more than writing values into the right place in RAM. Of course graphics were very much more primitive in the 8 bit days, but for instant coding gratification I have to confess nothing has ever really beaten the thrill of poke-n-play.
Which is why domain specific technologies like JavaFX come as such a welcome change — not that they allow old fashioned bash the metal coding, but they certainly put some of the immediacy back into graphics work. After experimenting with JavaFX Script for a while I came up with the following self-imposed rules, seemingly offering a suitable balance between challenge and frustration: 
  1. One source code file.
  2. One hundred or fewer lines of code.
  3. No external data files. It's too easy to cheat with animated images and the like.
  4. Eighty columns maximum. No ridiculously long lines of code.
  5. Modestly readable source. Obviously there's a temptation to optimise the amount of code on each line, and ignore comments, but the source should still be indented and laid out in a fairly readable way.
Bouncing Balls

The bouncing balls was my opening gambit, which admittedly was a bit hit and miss. There are various effects on show here: the balls bouncing on a reflective surface, colour transitions as the mouse enters or leaves the scene, and faux lighting which shifts across the balls and background as the mouse pointer moves. To be honest, I think it's a bit of a dog's dinner of effects and tweens.

Fireworks

My second attempt was a bit more focused: a fireworks effect. At random intervals multi-coloured rockets shoot into the night sky, before exploding with a flash into sparks, drifting back towards the ground and fading as they go. It's a pleasing enough effect, but I get the feeling there's still more which can be done. I'm thinking perhaps something faux 3D, or involving parallax scrolling?

As you may be able to tell, it's quite addictive stuff. With only 100 lines to play with the mind races, searching for cunning ways to create big effects with small amounts of code. I can certainly see why the 64k guys do it. If you want to check out the demos for yourself the source code (all 193 lines, combined) is available here. It should build/run using the 1st Preview of JavaFX from a few weeks back, with Java 6 update 10.

And so, I hereby throw down the gauntlet to anyone who can come up with even cooler demos (or improve the two above). Do you think you can beat my fireworks, in 100 lines or fewer? Surely it can't be too hard?!? Go on, give it a try — like me, you may be surprised at just how much fun it is!

Watched Pots and JavaFX Blog

Posted by javakiddy Aug 28, 2008
In recent weeks I've been immersed in the strange and exciting world of the JavaFX Preview release. Some might say up to my neck, although sometimes it's felt more like drowning. JavaFX makes a lot of previously very complex graphics tasks very simple. At the same time it makes a lot of previously very simple tasks frustratingly hard! Of course, this is merely preview release one — another update is apparently scheduled for this November, so perhaps some headaches will be addressed by then. Here's hoping! But for now here's some random gripes, observations (and a little bit of interesting source code) relating to the current release.

One of the main problems of developing with the recent preview release is the mass of outdated documentation, tutorials, and other such matter floating around on the Internet. JavaFX has been through a lot of changes and redesigns, with the current preview release being very different in both API and language syntax to the original F3-inspired platform which made its debut eighteen months ago at JavaOne 2007. When searching for any JavaFX related query one has to remember to always check the date when the resulting pages were posted, as pages relating to the old JFX seem to far outnumber the new. While it's perfectly understandable that the various commentaries and examples scattered across the far flung corners of the internet are still available, some of the obsolete documentation seems to be hosted by Sun itself! [1][2] Surely they can spare a few moments to take it down, update it, or at least flag it as outdated?

Coding in JavaFX is really quite an enjoyable experience compared to hacking away at Swing. The declarative syntax is a real boon for getting stuff on screen quickly, and the timeline system takes all the pain out of getting things to animate. Perhaps this is why problems, when they do rear their ugly head, seem all the more acute? Event handling was one example.
JavaFX Script supports function types (variables which reference functions), and uses them for event handling. This means only one handler can be assigned to each event type. A clickable node (a button) might want to act on its own mouse clicks, for the purposes of animation, but also allow external classes to receive the click events too. The solution might be to create another event type, "action" for example, and republish the mouse click to the outside world as an action event. But if a subclass of the button then needs to know when action is fired, presumably yet another event type is needed to republish the action event(?)

I wondered whether I could chain event handlers, but this proved unworkable from within the declarative syntax. And besides, even if I'd gotten it to work, it would have been a clumsy solution. Some sort of event multiplexing would be useful, although quite how that would work I don't know. Sure, each event function type could become a sequence (array), but how does one append a function to said sequence declaratively?

It's not the end of the World — we can survive without an event mux! But can we survive without layout managers? The scene graph model seems to be orientated more towards nodes having control over their own position and dimensions. Certainly many drawing primitives (circle, rectangle, image) permit some control, but the top level Node class, and containers like Group, do not. You can find out how big they are, but (short of scaling or clipping) can't request a particular size. At the moment only a couple of very basic layout classes are included; one look at their source exposes plenty of hooks into undocumented low-level mechanics from their Group/Node parents.
package jfxgui.layout;

import javafx.scene.Group;

public class GridBox extends Group
{   public attribute columns:Integer=1
        on replace { impl_requestLayout(); }

    init
    {   impl_layout = doBoxLayout;
    }

    private function doBoxLayout(g:Group) : Void
    {   var maxWidth:Number=0.0;
        var maxHeight:Number=0.0;
        for(n in content)
        {   maxWidth = if(n.getBoundsWidth() > maxWidth)
                n.getBoundsWidth() else maxWidth;
            maxHeight = if(n.getBoundsHeight() > maxHeight)
                n.getBoundsHeight() else maxHeight;
        }
        var i:Integer=0;
        for(n in content)
        {   var x = i mod columns;
            var y = (i / columns).intValue();
            n.impl_layoutX = x*maxWidth;
            n.impl_layoutY = y*maxHeight;
            i++;
        }
    }
}
The GridBox class above uses the same techniques as the scene graph's existing layout classes to reproduce AWT's GridLayout. You can see all the references to the undocumented impl_ attributes and functions, required to control content layout. The AWT GridLayout sizes its contents to identical dimensions, but I'm not aware of a safe/recommended way to do this with Nodes.

So is JavaFX a waste of space? Absolutely not! Don't get me wrong, JavaFX is a great technology, and JavaFX Script is shaping up nicely as a really cool DSL for graphics and effects programming. It's precisely because it gives you such a feeling of liberation that the problems, when encountered, stand out even more. I've no doubt future releases of JFX will address much of the above, but with only a few months to go before the version 1.0 release, I just wonder how much of that (crucial, opinion forming) first release will be marked "coming soon"?
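As a footnote to the event-handling gripe earlier in this post: in a general purpose language the "event mux" is trivial, which is perhaps why its absence from the declarative syntax stings. A sketch in plain Java, with all names hypothetical (nothing below comes from the JavaFX API):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Sketch of the "event mux" idea: rather than a single function-typed
// event slot, the control holds a list of handlers, and republishing an
// event to another party is just a matter of appending to that list.
public class EventMuxSketch {
    static class Button {
        final List<Consumer<String>> onAction = new ArrayList<>();
        void click(String detail) {              // fire every subscriber in turn
            for (Consumer<String> h : onAction) h.accept(detail);
        }
    }

    static final List<String> seen = new ArrayList<>();

    public static void main(String[] args) {
        Button b = new Button();
        b.onAction.add(d -> seen.add("animate:" + d));   // the button's own use
        b.onAction.add(d -> seen.add("external:" + d));  // an outside observer
        b.click("left");
        System.out.println(seen);  // [animate:left, external:left]
    }
}
```

The awkward part in JavaFX Script wasn't the list itself but appending to it from inside an object literal — exactly the "how does one append declaratively?" question raised above.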

When Buzzwords Go Bad Blog

Posted by javakiddy Jul 31, 2008
I always assumed the word "jargon" was a reasonably recent addition to the English language, but a quick glance at the OED gives examples of its use dating back as far as Chaucer. It would seem that man has been uttering "... unintelligible or meaningless talk or writing; nonsense, gibberish" for centuries! Or perhaps that should be "...conversing by means of symbols otherwise meaningless; a cipher, or other system of characters or signs having an arbitrary meaning"?

Gibberish, or just a cipher? The dual meaning perhaps reflects the inclusive/exclusive nature of jargon — if you're part of the 'in group' jargon is useful shorthand, but to outsiders that same jargon is unintelligible and meaningless. If "jargon" is centuries old, I wonder how old is the practice of using it to confuse and bedazzle? The OED's earliest source for "management speak" ("[...]being obfuscatory, needlessly complex, or empty of useful meaning.") only dates back to a 1986 Sunday Times article, yet I suspect the practice is far, far older.

The problem is no sooner has a new term entered the lexicon than someone, somewhere, will start to abuse it for whatever reason. The unfortunate popularisation of the term "web" in place of "internet" was likely due to the ignorance of many politicians, journalists and other commentators during the early years of the fledgling technology. However, years later the wholesale abuse of the (far too sexy for its own good) phrase "Web 2.0" was more down to 'marketing' than anything else.

This is precisely what has happened to "RIA", Rich Internet Application, a piece of jargon now so diluted through multiple interpretation that it doesn't really mean anything any more — at least, so say the members of the Java Posse in their 24th July podcast. Is RIA meaningless? And if so, is it worth replacing it with a new term?

Really Indeterminate Acronym

First let us take apart the existing term: RIA — Rich Internet Application. What does it mean?

Attempt #1: "Any application dependent upon the internet for its function." Seems reasonable enough at first sight. By this measure Google Earth would make it onto the list because of its need to constantly draw data across the network, and iTunes would also make it because its catalogue is online. But hold on, what about Firefox? What about Internet Explorer? Windows Update? FTP? Telnet? All of these depend largely upon the internet for their operation. Okay, scratch that, it's clearly far too broad.

Attempt #2: "Any application which runs inside a web browser." This one surely has to be a winner! After all, we all know GMail is an RIA, right?!? So Google Search is an RIA too. And that Swedish Chef translator which was all the rage ten years ago. Hmm, somehow that just doesn't seem right. Pretty much any non-static web page would be an RIA by that token.

Attempt #3: "Any application which runs inside a web browser and looks cool!" So coolness of user interface is the benchmark by which we judge whether a web site is an RIA or not? But what if we use Java applets, or Flash, do they count? We'll soon be able to drag and drop an applet out of the browser window and onto the desktop, where it will run independent of the browser — is it still an RIA?

Attempt #4: "Any application which runs inside a web browser page, not outside a browser page (even if it started life inside a page!) and looks cool!" Aha, fine. But my Java applet (which is still on the page) opens a top-level Swing window on the desktop, complete with "this is an applet" warning banner.

Attempt #5: "Any application which runs inside a web browser page, not outside a browser page (even if it started life inside a page!), or at least has 'one foot' still 'planted' on a browser page, even if it opens other windows... and looks cool!"

Right, I Acquiesce!!

My own personal definition is this: "Any application which has the sophistication of a desktop application, but the omnipresence of a web site". I've focused on the word "internet" as being where the application lives, rather than what it does. So an image manipulation tool, which allows me to work on an image on my local hard disk (rather than on Flickr or wherever) would still be considered an RIA if it was accessible via a URL. The criteria are: the UI has the feel and finesse of a desktop application (rich), the application needs no formal installation — although it may be cached (internet), and it's software not data (application). Under this definition GMail is in, and so are applet and Flash applications regardless of whether they live inside a browser page. WebStart applications are also included. But Google Search is out (not rich) and so is Google Earth (requires installation).

The problem is the odds of anyone else agreeing with the above are less than those of Google bringing out a JavaFX version of GMail. The Java Posse suggested the term RIA be retired in favour of something new. Perhaps they are right! Much of the confusion seems to surround distinguishing browser RIAs from their virtual machine counterparts, and as the web has far more of the mind share, I suspect it falls to the VM community to make the first move. The Posse suggested various acronyms like IRIS or IRA as replacements, although I suspect the latter might not go down too well in some parts of Ireland! (I considered a joke about coding wearing a balaclava at this point, but too many angry Irishmen know where I live!)

Mulling it over, the best I could come up with was "RADICAL apps": Rich Animated Desktop Interface / Cloud Abstraction Layer.
Not the best acronym in the World, but it encompasses the key concepts: Rich Animated Desktop Interface (slick UI, but not in a browser!), Cloud Abstraction Layer (not installed, lives and works through network 'abstractions' like URIs and Web Services). Of course, all we're really doing is creating another buzzword, ripe for abuse and misuse. And this is the fundamental dilemma at the heart of naming any new concept (as opposed to something concrete, like a product or a web site) — the sexier the name the more it sticks in the imagination. The 'stickier' it is, however, the more likely it is to be mutated. Even so, I think it might be good to have a sexy new term for VM-based RIAs, if only to placate my curiosity as to how long it will take before references to "RADICAL applications" (or whatever) in conjunction with "GMail" start to materialise on Google! ;)

Flogging a Dead Horse Blog

Posted by javakiddy Jun 30, 2008
Today is apparently Bill Gates' first day away from Microsoft. As he leaves, some have suggested Microsoft's star is now in the descent, as Google's star climbs ever higher. Is this really the case, is Google destined to become the next Microsoft? When a company attains a certain dominance in the market, isn't it hard to unseat them? After all, they can afford to hire all the best people! Cast your mind back to IBM's nervous toe-dipping when it came to the fledgling micro computer market in the Seventies, or Microsoft's initial head-in-the-sand attitude towards the internet in the Nineties — being big doesn't always make you right. Indeed the larger the organisation, the better it gets at sustaining incorrect assumptions in the face of mounting contradictory evidence. (One wonders, for example, whether a concept like transubstantiation could ever have survived in a religion with only a handful of members?) There's safety in numbers, for sure, but only by way of passing the buck for a bad idea. Shared responsibility can often mean no responsibility at all. In the right environment bad memes can survive unchallenged, and humans seem particularly good at creating those environments. We believe because the people around us believe, not because we have given an idea careful contemplation or scrutiny. What's important is that the group has clear goals; how well those goals stand up to reality is of secondary concern. As the song says: "any dream will do!"

Google prides itself on hiring only the select few, the brightest of the bright. But a youthful company with an allegedly youthful workforce, all recruited from similar stock, doesn't leave much room for diversity of outlook. Those familiar with the tale of Apple Computer's first decade will know how a group of supposedly intelligent people can divorce themselves from reality when they're distracted by working on new and exciting technologies.
With so many developers recruited straight out of university, one wonders how many Google engineers really remember programming before the arrival of the web? Perhaps unsurprisingly Google is wedded to the web as a platform for Rich Internet Application development, yet is there any strong evidence that this is a fruitful avenue to pursue? Sure, Gmail is used by many, but what of Google's other web-app offerings? Docs? Spreadsheet? Does anyone, aside from the occasional curious soul, seriously use these applications? I don't think so! Yet Google continues to develop and promote the likes of GWT (Google Web Toolkit) and Gears, technologies designed to smooth over or work around the obvious shortcomings in the web platform. If I were being cruel I might wonder whether the true genius of Google lies in finding more inventive ways to flog a dead horse! When you consider the promise of Adobe's AIR, Microsoft's Silverlight, or indeed Java's own JFX, you wonder why we aren't seeing evidence of significant investment in these technologies — or at the very least in one of them. Silverlight and JavaFX may still be raw, but AIR is mature enough to start producing applications, if only exploratory beta releases. Perhaps behind the scenes there are indeed moves to examine these alternative RIA technologies? Maybe, as I write, the latest build of a JFX based Google Docs has just finished compiling, or an Adobe AIR Gmail has been handed off for internal testing...? But there's very little evidence of that from the agenda of the recent Google I/O conference. The problem with these non-web RIA platforms is they rely on a foundation of software already being installed on the user's desktop — in the case of JavaFX, for example, it's the JRE. Having to stop and install a plugin creates a hiccup in the user experience; by contrast the web browser is guaranteed to be present.
So the future of non-web RIAs depends upon breaking an old chicken/egg scenario: RIAs won't start being written until runtimes are ubiquitous, and runtimes won't be ubiquitous until there are enough RIAs to drive demand for them. Microsoft has the option of simply pushing Silverlight out like it did with IE 7, such are the perks of owning the OS which runs on 90% or so of the World's computers. While the roll out wouldn't reach every Windows user, it would cover enough to make Silverlight a lot more attractive. Can Google afford to wait for Microsoft to do this? The only vendor of RIAs which has the clout (in terms of brand recognition and trust) with the common user is Google. If Google announced an enhanced Adobe AIR version of Gmail you can bet your bottom dollar it would generate plenty of headlines. Yet Google seems content to keep prodding the web, in the hopes it will somehow transform itself, in the best Cinderella fashion, into an effective RIA vehicle. There's plenty of paranoia about Google around at the moment. People are whispering "the next Microsoft" every time Mountain View announces a new product. Yet I wonder if, by shackling itself only to the web and not dabbling with these other technologies, Google hasn't committed itself to something it will later regret? Of course Google could be right — maybe the future of the RIA is inside the browser. And even if they're wrong, there's still plenty of time to do a U-turn (although by waiting they could forfeit the opportunity to champion their preferred platform.) For the sake of us all, I sincerely hope they are wrong, and they'll start dabbling with JavaFX or AIR sooner rather than later!

Knock Knock Blog

Posted by javakiddy May 29, 2008
Allegedly invented by accident, the humble Post-it Note has likely been responsible for more potential breaches in computer security than any single virus, rootkit or keylogger. This handy little aide-mémoire is home to 'to do' lists, phone numbers, doodles, and (inevitably) passwords. Most people wouldn't tape their front door key to their front door, yet they'll happily stick their computer password to the front of their computer monitor. One time, in a book shop, I had to endure a customer loudly directing her workmate (via cell phone) to riffle through her desk drawer for the letter containing her bank PIN number. To this day I still cannot decide what was more brain-dead: the fact that she stuffed the letter into an unlocked drawer, the fact that said unlocked drawer was in a semi-public place, the fact that she revealed its existence to someone else, or the fact that she repeated the number loudly for all the shop to hear as it was read to her! Incidents like this might be amusing, if not for the fact that we're moving towards an age when all our data may be held remotely (on 'the cloud') and accessed via Rich Internet Applications. But solving this problem could open up another one: as focus shifts from physically protecting locally stored data, to asserting access permissions on remotely held data, will we need to lose our anonymity to protect our privacy?

Safe hex

The reason people don't get computer security is because it's largely intangible. They can touch their front door key, picture in their mind's eye the menacing stranger trespassing in their home, see the empty space where their beloved widescreen TV used to be — yet none of this really seems to apply to something as ethereal as a password, or the data on a hard disk. In the eight-bit days 'safe-sex' computing used to be so easy — the worst most malware could do was trash a floppy, so we simply avoided dubious software and kept valuable disks write protected by default. Later, viruses meant malware could infect and trash other disks, but opportunities for infection were rare, plus backups and virus checking reduced the risk almost to nothing. For the most part security was a minor concern; something to be aware of, but not paranoid about. Then the World Wide Web got itself invented. Suddenly software from the four corners of the World was passing through the average browser cache on a minute-by-minute basis, and malware was no longer content just trashing your data, now it wanted to steal it! Yet users still have to be urged to install and maintain good network security software. If left to their own devices many don't bother — try scanning for wireless access points from your home and see how many of your neighbours didn't even spend the extra few moments to secure their router. Even dropping a friendly hint over the garden fence won't work — you're just that craaaaazy techie guy from next door, babbling about sniffing someone's packets! The problem here is still a physical one though — the data still physically lives on devices you own, the thief is trying to duplicate it elsewhere. But once the data moves to 'the cloud' the problem shifts to one of identity. Your data is already elsewhere, the issue is do you have permission to access it?

On the internet, everyone knows you have blue eyes

Recently I just managed to stop a friend from logging into his webmail account via a public computer in a hostel's common room...with the "remember me" box ticked! Sheepishly he agreed it might be a little safer if he didn't give everyone using the PC after him free access to his mailbox. Last week I helped another friend FTP files to her new web site. Proudly she explained to me how she devises unique passwords for every internet account — first name plus an incrementing number (Jane36... Jane37... Jane38...) These people aren't stupid (indeed they represent the norm) but can they be trusted in an age when all their private data may be protected by merely a password? It got me wondering whether biometrics might be the way forward. We've already seen some laptops issued with fingerprint recognition instead of passwords to secure them, and face recognition using a built-in webcam is also possible. But how about using this technology to restrict access to the applications and data themselves? On the surface, it makes sense twice over. For the software industry it means customers can't get free applications by trading passwords. For the users it means their data is now protected (to potentially quite a high level) with zero effort on their part. It could even provide an effective replacement for Digital Rights Management. If iTunes used face/fingerprint scans to digitally tie my downloads to my identity, I should be able to freely play my music on every device I'll ever own, so long as it's configured to 'me'. But here's the problem: if permission to run my applications and access my data is tied to something so certain, unique and unchangeable, doesn't it pretty much blow any hope of anonymity out the water? Listening to a recent edition of the Leo Laporte podcast The Tech Guy I was amused to hear Leo explain how his teenage daughter had set her public profile on FaceBook to that of a 38 year old guy from New Jersey.
Smart kid — it presumably avoids a lot of unwanted attention. But if biometrics became the norm for accessing FaceBook this might become impossible, or at least tricky. If FaceBook gained access to a second independent data source it could compare the biometric reading and discover the inconsistency. The issue would then boil down to whether FaceBook would enforce its terms and conditions. Even if an individual RIA host had a policy of permitting bogus details, the biometric 'password' might still expose the real account owner to the Police or FBI, should they come knocking... So, as our digital lives move steadily with each passing month onto 'the cloud', it seems like we have a straight choice: carry on with user-unfriendly passwords and expose hundreds of millions of regular users to high risk of having their data stolen, or move towards a (supposedly) idiot-proof biometric system and surrender any hope of anonymity. Unless anyone has a better idea...?

Anti-Social Networking Blog

Posted by javakiddy Apr 17, 2008
For a while now I've been mulling over an idea for a new type of social network, one which is actually social in nature and not just name. The key to my idea is harnessing the ad-hoc connectivity of wireless mobile devices to move the network out into the real physical world. It's a curious little idea which, like most curious little ideas, involves a lot of unknowns which have to be worked out. The aim is to bring like-minded people together, be they fans of the same sports team, devotees of Opera, potential love interests or employees and their ideal employer. Oh, and once they've found each other it can recommend a restaurant all parties will enjoy (with a small referral fee, no doubt — even I know you need a revenue stream!) Mind you, everyone seems to be jumping onto the social networking bandwagon at the moment, and just because I find my idea interesting, doesn't mean the great unwashed masses will. If I had more killer business instinct I'd be land-grabbing a chirpy domain name in all its ".com", ".org", ".co.jp" variants, and readying my bank manager to receive the millions as they start rolling in. As it is I'm happier just to get a basic prototype working, to see if it actually works as a concept. As software increasingly moves on-line the consumer (we are told) benefits from the added flexibility. But the developer has to jump through more hoops just to get a basic prototype up and running. Time was when you compiled your binary and passed it around on a floppy to your friends. Now you have to acquire a server, install and configure it, register a domain, pay for bandwidth... It's a lot of messing just to test out an idea. Fortunately Google have come to the rescue with the launch of their Google App Engine, promising to get rid of the pain so I can concentrate on the code. But what am I giving up in return for this shortcut?

Microsoft patents one and zeros

A friend of mine was totally taken in by Virgle, one of this year's Google April Fools. It didn't help that he wasn't a fan of Richard Branson to begin with (despite the small debt the Pistols owe him) and is becoming increasingly weary of Google's ever expanding reach. Since the birth of the modern PC, prophets of doom have warned of dire consequences should 'X' storm off home, taking their ball with them — where the role of 'X' has been alternately filled by IBM, Intel, Microsoft, and (perhaps briefly) Netscape. That none of these companies ever showed any inkling of such actions did little to curb the paranoia; indeed within the developer community it grew powerful enough to propel Open Source to the position it has today. Now it's Google's turn in the firing line, I suppose. If I use proprietary software — without the capability to modify its source code — I am at the mercy of the vendor regarding the future course of that product. However, I am not at their mercy regarding the current version; Windows 95 will run long after being abandoned by Redmond, and (hardware permitting) I can still code in Commodore V2 BASIC a decade after the 'chicken head' breathed its last. Not so for the brave new world of on-line services! If my web mail provider decides to pull the plug, goes bankrupt, or gets taken over, I'm totally at their mercy — not only can they take their own ball home, they can take all my balls home too! (Erm, yes, perhaps that could have been worded a little better! :)

Write once, run anywhere... well, somewhere... maybe

My 'bright idea' application is fully distributed, needing no web site. When it came to choosing a platform I went with Android rather than JavaME, simply because Android seemed to promise a faster route to getting a prototype up and running. After a little bit of experimental coding I began to wonder if the app might benefit from an optional web interface, allowing the luxury of a full sized screen and keyboard when editing data. Naturally I don't want to waste time finding and configuring a host, so when Google's own new 'bright idea' arrived at just the right time, it seemed like an ideal solution. But after just spending a lunch hour with my friend kicking off about how Google were taking over the galaxy (literally!) I began to reflect on the long term consequences of that decision. What if my little side-project actually has some merit (big 'if'!), will I be too firmly dependent on one technology provider? What if Google pull their App Engine service, or sell it off to someone else? And this prompted a second question: wasn't the internet supposed to save us from this kind of platform dependent hell? Weren't we supposed to have standards for our code (like Java) and our data (like XML) which would allow us to run our stuff wherever we wanted, and with whatever data we wanted? Why, then, are we continually being asked to pick between incompatible platforms? It was bad enough when YahooIM, AIM, ICQ and MSN wouldn't talk to each other, but now we have MySpace, FaceBook and Bebo, various photo sharing sites, various music sites (and formats!) And in Java we have Android, JavaME and (soon) JavaFX Mobile in the mobile space alone, Swing vs SWT on the desktop, and I've lost track of how many rival technologies make up the JavaEE space. As for my curious little idea... well I'm going with Google App Engine anyway.
I have to be realistic — it's been hard enough over recent months to find the time to devote to this project, so unless someone wants to pay me full time to develop it (anyone?) I have to recognise there's a good chance it won't get finished, multiplied by a good chance it mightn't catch on anyway. And so, at the risk of disappointing my bank manager, it's probably safe to go with GAE. So for me the dilemma is averted — but if we could just get back to our dream of a network based on open standards, then it would never have arisen in the first place...! Hmm, I'm being naive again, aren't I...? :)  
Our illustrious editor Chris Adamson's blog on this year's Java Posse Roundup gave me pause for thought recently. Chris outlined a session discussing the evolution of the user interface, or lack thereof, in which innovations like the Wii Remote got a mention. I have to say, I've mixed feelings about Nintendo's latest console. While I recognise the innovation it brings, I don't see it as necessarily a catalyst for revitalising user/machine interaction. Indeed, if anything, it could herald a major step backwards. Yes, I realise this is heresy! But just put down those pitchforks and burning torches for a moment, and hear me out...

New wave

For all the brouhaha caused by the Nintendo Wii prior to and during its launch, for me its primary innovation had some stark limitations. The Wii Remote, with its capacity for tracking both motion and orientation, was certainly an impressive piece of hardware; however, close inspection of the games used to build pre-launch Press hype reveals a distinct pattern. Conference reports flaunted pictures of excited young gamers energetically playing Tennis, Tenpin Bowling, and Golf — what do these games have in common? Yes, they all major on waving one's arms about! What was less noticeable (or even absent) from the Wii's pre-launch hype were games like Soccer, motor sports and platformers (not even Nintendo's legendary Mario.) Very strange, that! Yes, games like these have subsequently appeared on the Wii, and developers have tried their hardest to include motion input within their gameplay, but for the most part arm waving is a gimmick — indeed in the case of Soccer it's precisely what the game is not about! Waving a hand up and down to make Lara Croft jog to the top of a ladder brings the player no closer to a true tomb raiding experience than waggling a joystick brings them closer to running the 100m Sprint. Perhaps Nintendo should invent a giant remote controlled ball which chases players down stairs, and employ a band of spear-wielding Aztecs to wait outside their front door? Not that I'm mocking the Wii — it has undoubtedly brought a much needed breath of fresh air to the console market. But for all its cleverness, arm waving aside, the device is at best a gimmick and at worst a step back from cohesive user interface design. The splash made by the Wii will no doubt inspire other device manufacturers to experiment with motion based input, and before long we'll be sending text messages with the flick of a wrist, or checking for Wifi hotspots by circling our PSPs over our head. But what benefit does this bring?

Nice touch

An example of innovation with more widespread application can be found in Apple's iPhone. While critics may point at the product's flaws (and admittedly it has a few), its touch screen interface can only be described as a simple idea, elegantly executed for maximum impact. The innovation differences between the iPhone and the Wii betray the likely creative dynamics which birthed them. In the case of the Wii, Nintendo presumably were reaching for a fresh gimmick to set them apart from the "ramp up the CPU, increase the polygons" mindset of its rivals. In the case of the iPhone, Apple clearly sought to overcome the cumbersome user interfaces which held mobile applications back. If one was being cruel it might be tempting to characterise Nintendo's approach as a solution in search of a problem. But this would be most unkind, as clearly Nintendo had identified a legitimate issue with traditional controllers — albeit one restricted to only a select band of games. But having alighted on its chosen 'gimmick' Nintendo then had to play it to the max — shoehorning arm waving into every game format (rumour has it they frown upon games not employing at least some measure of motion input.) Both Apple and Nintendo used gimmickry to boost what were otherwise rather anemic products. In the UK, after an initial burst of excitement, the iPhone apparently ended a little shy of expectations; while in Japan some are suggesting the Wii's popularity may not last forever. Already the novelty of the Wii Remote seems to have worn off for some Nintendo fanboys, and the company is now looking to extend its range of ground-breaking input devices. By contrast, Apple's 'problems' likely stem from the weak functional spec of the actual device behind the gloss (coupled with network lock-in) rather than a limited novelty value for its touch screen interface.

Can't think of a pun for this bit

Sometimes an idea seems so novel, it is hard to prevent oneself from getting carried away. I can recall back in the mid-Nineties when VRML (Virtual Reality Modelling Language) started to make an impact — today it's embarrassing to recall how many 'technology gurus' seriously predicted we'd be browsing the World Wide Web in 3D by now. Not just forward and back through a site, but up, down, left, right, in and out! Just imagine, instead of merely clicking a bookmark to be instantly transported to Java.net, you could instead guide your virtual avatar through a maze of corridors within a virtual library, to your own virtual bookcase, containing many virtual Java links... Strangely the idea failed to take off! Hmm... (Actually a good friend of mine — hello Martin! — participated in a project which involved navigating the web on a bicycle. Yep, a real bicycle, minus any wheels obviously. So now you know what universities do with your tax dollars!) A few years ago I witnessed a demo of experimental software in which an on-screen avatar guided users through items on a restaurant menu, with simulated body language and speech. The excited young researchers extolled the virtues of avatar based interfaces; but what, I wondered, was wrong with traditional printed menus? Low tech perhaps, but the printed word is cheap, efficient (no batteries), robust (doesn't crash), random access, and entirely sympathetic with the way the human brain (once taught to read) takes in such information. What the VRML web and the restaurant avatar demonstrate is the need to apply an Occam's razor approach to user interface innovation. (Occam's razor is a simple philosophical device which states, in effect, "the simplest solution to any problem always wins!") When adding innovation one question should be repeatedly asked: why am I doing this? Sure, I can hook a Wii Remote into my desktop app, but what added value does it give me? Why do I want a rumble pack in my mouse?
Is it really easier to navigate the Windows Start menu via voice commands? (And when are those damn Aztecs going to go home?) Yes, it may seem obvious, but it needs stating and re-stating, over and over and over! Because even skeptics like myself can get taken in by the thrill of a new gimmick. It's so tempting to see something like the Wii Remote and wish to find ways to include it in one's own software. But the urge should be resisted, because it is only by identifying and focusing on real problems (rather than gimmicks) that any real progress will be made. Thank you for listening. You may now continue with the pitchforks and flaming torches... :)  
You know you're getting old when you find yourself complaining about how English is being butchered, instead of inventing new ways to butcher it yourself. Languages change and evolve, they cannot stand still. This applies to programming languages just as much as natural written/spoken language. The difference is, of course, natural languages don't require backwards compatibility. It's this which causes the headaches — if we could just add keywords whenever we wanted, or retrofit the grammar at the drop of a hat, there wouldn't be a problem. But there's a huge body of Java code already out there, and it would be nice if none of it got broken by any changes. I wonder if this is why programming languages, unlike natural languages, seem to have shelf lives? In nature the success of a species is constrained by the size of its compatible habitat. As the environment changes a species may evolve, but only up to a point. Eventually its popularity fades and better adapted animals arrive to dominate the landscape. In programming, a dramatic environmental shift began at the end of the Seventies when the personal computer saw the end of the dinosaur Mainframes, and with them stalwart languages like COBOL, in favour of C and later C++. Fifteen years on and the internet saw C and C++ slowly retreat into niche markets, replaced by net-savvy new blood in the form of Java, JavaScript, PHP, etc. Why couldn't COBOL just evolve to become less 'records and batch processing', and more 'files and interactivity'? Why couldn't C just evolve to be less 'bytes and memory pointers', and more 'bytecode'? The answer is they could have — but there's only so much retro-fitting one can do to an established technology... So how far can Java continue to evolve before it too goes the way of the dinosaur?

Seeking closure (groan!)

[I'm taking my life in my hands with the below — language design isn't really my field. So treat this blog as a 'view from the trenches', rather than the considered opinion of someone who's spent decades developing compilers.] An issue came up on the comments for Kirill Grouchnikov's blog "Evolving the language" (a reply to his own "And so it begins...") relating to the use of the BGGA Closures for event handling. While the BGGA proposal is slightly more than just a lightweight replacement for anonymous inner classes, when it comes to event handling it's the removal of the inner class boilerplate which seems the biggest selling point. As Assamyron pointed out in the comments of Kirill's blog, the following code...
button.addActionListener(new ActionListener() {
    public void actionPerformed(ActionEvent e) {
        processAction(e);
    }
});
...would be translated into... 
button.addActionListener({ActionEvent e => processAction(e)});
...which, at first glance, certainly seems more compact. Although the closure syntax may be a little more cryptic to the untrained eye, this is outweighed by the reduced need for boilerplate. If your code is merely a short and snappy negative/positive/zero condition to sort a collection, or a filename filter for JPEGs, it's likely to be dwarfed by the enclosing anonymous inner class code. So a leaner syntax on the face of it certainly does seem useful. But event handlers typically have to load/save data, interact with objects, update the UI state, talk to databases, etc. — all this means they are rarely single statement affairs. A more realistic event handler might be (adapting the above example)...
// Sans Closures
button.addActionListener(new ActionListener()
{   public void actionPerformed(ActionEvent ev)
    {   doSomething();
        dontForgetToDoThis();
        // ...and the rest of a typically lengthy handler...
    }
});
...which would translate as... 
// Avec Closures
button.addActionListener({ActionEvent e => processAction(e)});

private void processAction(ActionEvent e)
{   doSomething();
    dontForgetToDoThis();
    // ...and the rest of a typically lengthy handler...
}
It basically comes down to this: when the code body size is short, a heavyweight wrapper (as provided by anonymous inner classes) seems unreasonable. But as the body expands beyond one line, the concept of a lightweight wrapper becomes more and more of an irrelevance. Indeed its impact becomes so trivial as to, some might argue, no longer justify its more cryptic syntax. The very fact that the original example hid the stodge behind a processAction() method betrays the ineffectiveness of the BGGA closure syntax for anything other than ultra-trivial blocks of code. While I recognise Closures have several strings to their bow, I personally wouldn't count 'event handling' as one of them. In a practical sense I doubt their tight syntax would have much of an impact given the size of most event handlers. It's a bit like when your girlfriend takes her socks off in a vain attempt to cheat the bathroom scales.
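To make the size imbalance concrete for the 'short and snappy' case mentioned above — a filename filter for JPEGs — here's a runnable sketch (the class and field names are my own invention, purely for illustration). Note how the single line of real logic is dwarfed by the anonymous inner class ceremony around it:

```java
import java.io.File;
import java.io.FilenameFilter;

public class FilterDemo {
    // One line of actual logic (the endsWith test), wrapped in
    // several lines of anonymous inner class boilerplate.
    static final FilenameFilter JPEG_ONLY = new FilenameFilter() {
        public boolean accept(File dir, String name) {
            return name.toLowerCase().endsWith(".jpg");
        }
    };

    public static void main(String[] args) {
        System.out.println(JPEG_ONLY.accept(new File("."), "photo.JPG")); // true
        System.out.println(JPEG_ONLY.accept(new File("."), "notes.txt")); // false
    }
}
```

Under a BGGA-style closure the wrapper would collapse to something like `{File d, String n => n.toLowerCase().endsWith(".jpg")}` — a clear win here, precisely because the body is only one line.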

The meaning of liff [*]

Of course everyone has their own favourite Java language enhancement. I, as a Swing coder, would like to see an end to the cumbersome boilerplate code for pushing stuff onto the Event Dispatch Thread. Perhaps a syntax for bundling arbitrary blocks of code into Runnables and posting them onto the end of queues? 
// Ugly
SwingUtilities.invokeLater(new Runnable()
{   public void run()
    {   // Your Swing code here
    }
});

// Nice!  'SwingUtilities' implements an event queue interface.  (But how
// to support invokeAndWait() ..?)
SwingUtilities
{   // Your Swing code here
}
Not that anything like the above is likely to make it into the language. :) While the elite coder welcomes more syntactic shortcuts, the novice merely sees a steeper learning curve. A clean syntax with familiar grammatical patterns employed over and over is easier to learn than one full of cryptic looking quirks and counter-intuitive side effects. The rather awkward way in which Generics were slotted into the existing language is already a big turn off for beginners. New programmers will always favour a syntax with a low barrier to entry for novices, rather than go-faster stripes for experts. They'll vote with their feet if a language takes a dramatic turn for the worse. The trick is to balance the two concerns — move the language forward, but not in such a way as to dramatically shorten its life expectancy. Unfortunately this can sometimes mean missing out on juicy additions, simply because the fallout would be too extreme. But do we really need every juicy addition? Languages are tools, and each has its own strengths and weaknesses. They say French is good for making love, German for giving orders, and English for apologising — unfortunately mix-and-matching spoken language to suit your mood isn't normally an option. But computer languages are not so tightly bound: we can if so desired pick Python to write one project, Java for the next, and Ruby for a third. We can even mix them within a single project. The Da Vinci Machine promises to re-invent Java's runtime core to suit a much wider range of languages. If this project bears fruit (and I sincerely hope it does) we may not have to worry so much about including so many new features into the Java language itself. But this will mean that while the Java runtime will go on, the Java language itself will eventually die out. Do we want Java to die out? An alternative is to seize the initiative and begin developing a Java replacement, a Mk. II if you like.
Going back to the drawing board, we could fix all the problems with Generics and slot in Closures and Properties in a way which makes sense. So how much life is there left in the current Java? We have a lot of time and effort invested in it, which has earned it a healthy reputation, but can it last much longer? When is it time to stop evolving, and start over with an 'intelligent re-design'?
One of the things which slightly unsettled me after the release of Update N was the frequent mention of how much better things would now get for Java applets. The subject cropped up again, in detail, during my post-holiday catch-up of Java Posse podcasts. To paraphrase Obi Wan Kenobi: "Applets, now there's a name I haven't heard in a long long time."

Like many who came to the platform in its early days, my first experience of Java was via web applets. Back then I was working on projects for Hewlett Packard; the only way to build Java code on HP-UX boxes was to call the JVM embedded in Netscape 2! Boy was I glad when HP released a proper SDK! While I recognised the potential of the Java platform (which it still hasn't fulfilled!) I wasn't mad keen on grafting it into web pages. Years later web designer friends would jokingly comment on Java's decline at the hands of Flash; no doubt they expected a robust defence of the applet platform, but that was the last thing I was ever likely to give.

The undoubted kudos and publicity Java received thanks to applets came at a heavy price: the term "Java" became synonymous with eye candy, as useful applets were few and far between compared to the glittering sea around them. It's an image which seems to have stuck in the minds of many, as I've found when pitching Java solutions to government and media clients. I sometimes think it's an easier sell if the client has never heard of "Java". All this talk of applets has got me thinking: isn't it about time we moved Java up the food chain?

Survival of the fittest

Perhaps some Java folks never really got over the humiliating defeat to Flash during the latter part of the Nineties. No sooner is there a renewed focus on the desktop than people are extolling the virtues of the Java Kernel for applets. My response is simple: "let it go!" Flash won fair and square: their technology was nimbler, better met the needs of the audience (who, if you hadn't noticed, were more at home with Photoshop than Visual Studio), and had all the key features within easy reach (read: video!). Java applets were, and still are, an application development technology in a graphic design world. What do we honestly expect, a revival?

Sure, there's nothing stopping us creating our own Flash-like development tool... except Flash's origins are very different from Java's. Born initially as a vector animation and graphics tool for the web, even now Flash retains the feel of a technology whose users are more at home with a paintbrush than a screwdriver. This isn't to say that Java doesn't have a place in the realm of graphics and animation, but there is a difference between low-level graphics/timing APIs and the applications which sit atop them. (Are you listening, Microsoft anti-trust lawyers? Winsock != Internet Explorer :) Besides, even if we came up with a development UI comparable to Flash, why would anyone want to switch? What must-have functionality would force Flash users to abandon their hard-earned skills en masse and join the Java camp?

By the pricking of my thumbs...

Ironically, while the Java camp plots its return to the web browser, Adobe has been steadily trying to get Flash away from the browser and onto the desktop. Increasingly, users are becoming restless with the limitations of the web as an application platform; Web 2.0 hasn't delivered the kind of rich interface experience people recall from their time with desktop applications. It is only natural that Adobe should wish to follow the herd as it moves away from the browser and back onto the desktop.

But this is where Flash's heritage starts to work against it. The desktop isn't the web (obviously!): while users do seem to favour more visually impressive UIs, they're also demanding a comprehensive widget library backed by plenty of functionality. Flash's UIs are still very much glorified web forms, and its API support for functionality outside of the web (crypto, heavy XML, desktop integration, file formats, network protocols, the list goes on...) is extremely feeble. It makes me wonder why, given Java is currently sitting pretty at ground zero of the likely next big thing, anyone would suggest upping sticks and muscling our way against the flow of traffic.

Fixing Java

At last steps are being taken to address the JRE's download size and start-up time, making the Java platform a far more attractive proposition for both the developer and the end user. But there are two key issues which still need addressing: one minor, one an 'elephant in the room'.

The first issue is branding, specifically the amount of it. No, this isn't a plea to increase Java's visibility through some clever marketing campaign; quite the opposite, we need less branding! The fact that a given application was written in Java needs to be hidden from the end user. From applet splash panels, to tray icons, to WebStart download dialogs, and so on: the end user mustn't have Java's 'alien' (non-native) nature thrust in their face. After all, who are we preaching to? Regular end users don't care what their software is written in, but software publishers do care when their products are littered with tags and logos for the tools they use.

The second issue (and the aforementioned elephant) is video support, or lack thereof. It can't have escaped anyone's notice that YouTube et al went with Flash for their embedded video component. Hardly a surprise, given the equivalent Java applet would have been a nightmare (even with the kernel). What is it with Java and audio/video? Why is it always such a struggle?

Top of the food chain

Whenever Java has lagged on the desktop, the culprit has often been its fetish for re-implementing native components as bytecode. Flawless native look-and-feels and web content rendering were impossible until we ditched our fixation with '100% pure' and reached down to wrap the native components. Java's attempt to re-write media codecs fizzled and died, and again the solution points towards leveraging native codecs rather than writing our own. What use is a 100% pure Java PDF viewer, when we can embed the native PDF viewer already available on the user's system? Can you see where this is leading us? :)

Well, I began by talking about the demise of the applet, and how some would like to re-animate it. Coming almost full circle, I want to propose we turn this idea on its head (ouch, mixed metaphor!). Instead of embedding Java inside other applications, why don't we focus on embedding other applications inside Java?

In the 'battle to come' for the Rich Internet Application space, Java is in a very strong position. We already have an extensive developer community, oodles of API support, and the age-old issues of deployment and startup are finally being addressed. Now we need only a few tweaks to WebStart, and a new media API, to really get the ball rolling! Compare that to the uphill job Adobe has ahead, turning Flash's web-centric functionality into a tool which could churn out a respectable RIA word processor or media editor. Flash is heading Java's way again, but this time it'll be fighting on Java's home turf. But let's not get complacent: many a superior technology has fallen to weaker opponents who somehow caught the imagination of the public (and if there's one thing Flash is good at...).

Let us assume, for argument's sake, Java reigns supreme... Even as the World Wide Web's star begins to wane, and newer stars twinkle ever more brightly in the night sky that is the Internet (who says I can't do poetry, eh?), Flash's future is still assured.
It will always retain its place as a handy 'embed' inside other applications. Can you imagine a desktop application in the not-too-distant future, where the help system consists not of boring text but rich vector/media animations? Those animations would be running inside an embedded Flash plugin, naturally... but what of the host app? Wouldn't it be spooky if it was written in Java? Hmm... Flash movies, running embedded inside Java Rich Internet Apps — now there's irony for you... :)  
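For the simplest cases, the "wrap the native component" approach is already sitting in the core libraries: java.awt.Desktop hands a file to whatever viewer the operating system has registered, rather than rendering the format in pure Java. A minimal sketch; the manual.pdf path is a made-up placeholder, and Desktop support is absent on headless systems, so the code checks before delegating:

```java
import java.awt.Desktop;
import java.io.File;

public class NativeViewer {
    public static void main(String[] args) throws Exception {
        File doc = new File("manual.pdf");  // hypothetical placeholder path

        if (doc.exists()
                && Desktop.isDesktopSupported()
                && Desktop.getDesktop().isSupported(Desktop.Action.OPEN)) {
            // Delegate to the OS: whichever PDF viewer the user already
            // has installed opens the file. No pure-Java renderer needed.
            Desktop.getDesktop().open(doc);
        } else {
            System.out.println("No desktop integration available here");
        }
    }
}
```

Launching an external viewer is of course the crudest form of integration; embedding a native component inside a Java window is harder work, but the principle is the same: let the platform do what it already does well.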
