It's been several months since my last blog! I have recently re-joined the GlassFish team at Oracle, and I'm currently looking at Web-tier technologies like Web sockets and HTML5. In this blog, I'd like to show you a simple Web application in which you can control an HTML5 video object remotely. The basic idea is to intercept video events like 'play', 'pause' and 'seeked' and remote them over Web sockets to control another player. Although there are some use cases for an application like this (such as coaching), the real objective of this exercise is to show the low latency of the Web sockets implementation in GlassFish. So let's get started!

Here is a screenshot of the application:

A Web socket is nothing more than a socket connection that can be established between a (modern) Web browser and a Web container. It provides a low-latency, bi-directional communication channel "parallel" to the HTTP channel. It is Ajax/Comet on steroids: no hacks, no jumping through hoops, nothing. Given its low latency and always-connected characteristics, it has the potential to redefine what we understand as a Web application lifecycle (but more about this in a future blog).

Our Web application consists of a single page and some JavaScript code. We use JSF 2.0 Facelets to define the main page:

 <html ...>
   <h:head>...</h:head>
   <h:body>
      <p>Let's share this video:</p>
      <p><h5:video height="360" controls="true" src="PBIRbest.m4v"/></p>
      <h:outputScript library="js" name="json2.js" target="head"/>
      <h:outputScript library="js" name="app.js" target="head"/>
   </h:body>
 </html>

The tag h5:video is defined by the following JSF composite component:

<html ...>
  <cc:interface>
    <cc:attribute name="src" required="true"/>
    <cc:attribute name="controls" required="false"/>
    <cc:attribute name="width" required="false"/>
    <cc:attribute name="height" required="false"/>
  </cc:interface>
  <cc:implementation>
   <video src="#{cc.attrs.src}" controls="#{cc.attrs.controls}" 
      width="#{cc.attrs.width}" height="#{cc.attrs.height}"></video>
  </cc:implementation> 
</html>

Defining a new h5:video tag (instead of using HTML5's video tag directly) is not a requirement for the application, but it provides a clean separation and makes the example more readable (a separate component can be used to insert additional JavaScript code, if needed). Note the inclusion of the JavaScript libraries json2.js and app.js. The former provides us with JSON.stringify() and JSON.parse(); the latter is where our client-side code resides. Let's have a look at that.
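As a quick illustration of the two functions json2.js supplies (modern browsers ship them natively; json2.js is only a fallback for older ones), here is a round trip of a command object through JSON, using the same shape of object the application puts on the wire:

```javascript
// Round-trip a command object through JSON, as the app does over the wire.
// JSON.stringify and JSON.parse come from json2.js on older browsers and
// are built in everywhere else.
var command = { type: "seeked", currentTime: 42.5 };

var wire = JSON.stringify(command);   // the string that travels on the socket
var parsed = JSON.parse(wire);        // what the receiving side works with

console.log(wire);                    // {"type":"seeked","currentTime":42.5}
console.log(parsed.currentTime);      // 42.5
```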

Let us define a JavaScript function that returns a network object (i.e., a closure). This object has two methods: initialize() and send(). The former opens a Web socket and registers listeners for incoming messages; the latter sends outgoing messages.

var network = function (websocket) {
    return {
        initialize: function () {
            // Derive the Web socket URL from the page's own location.
            var url = 'ws://' + document.location.host + document.location.pathname + 'videosharing';
            websocket = new WebSocket(url);
            websocket.name = APP.id;
            websocket.onopen = function () { };
            websocket.onmessage = function (evt) {
                var command = JSON.parse(evt.data);
                if (command.type === "pause") {
                    APP.pauseVideo();
                } else if (command.type === "play") {
                    APP.playVideo();
                } else if (command.type === "seeked") {
                    APP.seekVideo(command.currentTime);
                } else {
                    alert("Unknown command " + command.type);
                }
            };
            websocket.onclose = function () { };
        },
        send: function (message) {
            websocket.send(message);
        }
    };
};

First, a Web socket is created from a URL constructed from the main page's host and pathname plus "videosharing". Notice the use of the "ws://" protocol. Next, listeners for the onopen, onmessage and onclose events are registered. The onmessage listener parses an incoming command, determines its type and calls the appropriate method in our application object APP. This is how remoted UI events are replayed locally. The send() method is used to remote events that are local to this window instance.

The network's initialize() method is called by APP.initialize(). This method also registers listeners for all local video events for remoting purposes as shown next.

    initialize: function () {
        APP.network.initialize();

        var video = APP.getVideo();
        video.addEventListener('play', 
            function (event) { 
                var command = { type: "play" };
                APP.network.send(JSON.stringify(command));
            },
            false);
        video.addEventListener('pause',
            function (event) {
                var command = { type: "pause" };
                APP.network.send(JSON.stringify(command));
            },
            false);
        video.addEventListener('seeked',
            function (event) {
                var command = { type: "seeked",
                                currentTime: APP.getVideo().currentTime };
                APP.network.send(JSON.stringify(command));
            },
            false);
    }

Note how commands are constructed, stringified and remoted using the network object. All commands have a type field; the 'seeked' command also includes a currentTime field so that remote video objects can seek to the correct location. The rest of the client code is uninteresting, so let's look at the server-side pieces.
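Since every command carries a type field, the if/else chain in the onmessage listener could also be written as a dispatch table keyed by type. Here is a small sketch of that alternative; the handler bodies just return strings for illustration, where the real application would call APP methods:

```javascript
// Hypothetical dispatch table mapping command types to handler functions,
// an alternative to the if/else chain in the onmessage listener.
var handlers = {
    play:   function (cmd) { return "playing"; },
    pause:  function (cmd) { return "paused"; },
    seeked: function (cmd) { return "seeking to " + cmd.currentTime; }
};

function dispatch(json) {
    var cmd = JSON.parse(json);
    var handler = handlers[cmd.type];
    if (!handler) {
        throw new Error("Unknown command type: " + cmd.type);
    }
    return handler(cmd);
}
```

For example, dispatch('{"type":"seeked","currentTime":12.3}') runs the 'seeked' handler with the parsed command. Adding a new command type then means adding one entry to the table rather than another else-if branch.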

The server-side API provided in GlassFish 3.1 for Web sockets is straightforward. First, we extend the class WebSocketApplication and override the methods createSocket() and onMessage() as shown next.

public class VideoSharingApplication extends WebSocketApplication {

    @Override
    public WebSocket createSocket(NetworkHandler handler,
            WebSocketListener... listeners) throws IOException {
        return new VideoSharingWebSocket(handler, listeners);
    }

    @Override
    public void onMessage(WebSocket socket, DataFrame frame) throws IOException {
        final String data = frame.getTextPayload();
        for (WebSocket webSocket : getWebSockets()) {
            try {
                if (socket != webSocket) {
                    webSocket.send(data);
                }
            } catch (IOException e) {
                e.printStackTrace();
                webSocket.close();
            }
        }
    }
}

The onMessage() method handles socket frames by extracting the data in each frame and broadcasting it to all other Web socket clients. The createSocket() method simply creates an instance of our application's Web socket, which, in this application, is very simple.
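The fan-out rule itself is language-independent: relay a frame to every connected socket except the one it arrived on. Here is the same logic sketched in JavaScript for symmetry with the client code, using plain objects as stand-ins for the real server-side socket instances:

```javascript
// Sketch of the server's broadcast rule: send the data to every connected
// socket except the sender. The socket objects here are hypothetical
// stand-ins with a send() method, not the real Grizzly WebSocket class.
function broadcast(sockets, sender, data) {
    sockets.forEach(function (socket) {
        if (socket !== sender) {
            socket.send(data);
        }
    });
}
```

The identity check against the sender is what prevents an echo loop: without it, a 'play' command would bounce back to the window that originated it and re-trigger its own event listeners.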

public class VideoSharingWebSocket extends BaseServerWebSocket {
    public VideoSharingWebSocket(NetworkHandler handler,
            WebSocketListener... listeners) {
        super(handler, listeners);
    }
}

The last piece in this puzzle is the registration of our application in the WebSocketEngine. This can be accomplished using the init() method in a servlet as shown below.

public class WebSocketsServlet extends HttpServlet {

    private final VideoSharingApplication app = new VideoSharingApplication();

    @Override
    public void init(ServletConfig config) throws ServletException {
        super.init(config); // preserve HttpServlet's own initialization
        WebSocketEngine.getEngine().register(
            config.getServletContext().getContextPath() + "/videosharing", app);
    }
}
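For this registration to happen at deployment time rather than on the first request, the servlet should be eagerly initialized. A minimal web.xml entry might look like the following sketch (the servlet name, and the assumption that the class lives in the default package as in the listing above, are illustrative):

```xml
<servlet>
    <servlet-name>WebSocketsServlet</servlet-name>
    <servlet-class>WebSocketsServlet</servlet-class>
    <!-- load-on-startup ensures init() runs at deployment,
         registering the Web socket application before any client connects -->
    <load-on-startup>1</load-on-startup>
</servlet>
```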

Of note, this server-side API is still evolving and is subject to change and improvement in upcoming versions of GlassFish. A source bundle for this sample is attached and includes a NetBeans 6.9.1 project file. You must use GlassFish Server Open Source Edition 3.1 and a browser that supports Web sockets (as well as the codec for the video that you use!). In my project, I include an MPEG-4 video and use Safari 5.X on a Mac.