Over the past few years I have been working on a Java project that uses a native library written in C by a third party to process data. Recently the data format was changed and a new version of the native library was made available. Unfortunately, the native library, for very good reasons, is not backward compatible with the old data format, but my project, also for very good reasons, needs to be able to process both formats of data. Therefore my project needs to determine whether the data format is the old one or the new one and use the corresponding library to process the data. Unloading a native library in Java is considered unsafe because it depends on the garbage collector to run. It is impossible to force a GC run (you can only suggest a GC run to the JVM) and there is no guarantee that, when the GC has run, a library (or any other object in memory) has been garbage collected. However, the concept of dynamically loaded libraries in C allows for safely unloading a library and loading it, or a different version, again.
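To illustrate the point: the usual attempt to get a native library unloaded from Java is only a hint to the JVM. A minimal sketch (the class and field names are purely illustrative):

public class UnloadAttempt {

    private Object nativeLibraryProxy; // imagine this holds the JNA proxy for the native library

    public void tryToUnload() {
        nativeLibraryProxy = null; // drop the last strong reference
        System.gc();               // only a suggestion; the JVM is free to ignore it
        // Even if a GC cycle does run, there is no guarantee that the proxy has been
        // collected, let alone that the underlying .so has been unmapped.
    }
}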

An example library

In order to clarify how to unload a library and load a different version, this "library" will be used as an example. The header file version.h looks like this:

int getVersion(void);

So it will be a very simple piece of code returning a version number. Of course this is not a real-life example. The corresponding C code version.c is

#include <stdio.h>
#include "version.h"

int getVersion(void) {
    return 1;
}

To use the getVersion() function in C, this code can be used:

#include <stdlib.h>
#include <stdio.h>
#include "version/version.h"

int main(int argc, char **argv) {
    printf("%d\n", getVersion());
    return 0;
}

To build all this and separate the library with the getVersion() function from the code that uses it, I created a directory called "src" and in it a directory called "version". So, the directory structure is
  • src 
    • version 
      • version.c
      • version.h
    • test.c
Finally, everything needs to be built before we can execute it. I am not a Makefile guru, so I created a bash script:

#!/bin/bash

cd src/version
gcc -c -Wall -Werror -fpic version.c -o version.o && gcc -shared -o libVersion.so version.o
cd ..
gcc -I./version -L./version -Wall test.c -o test -lVersion
cd ..

export LD_LIBRARY_PATH=src/version:$LD_LIBRARY_PATH

echo "Calling library from C:"
./src/test

Running the script generates this output:

$ ./compile.sh
1
$

Calling the library from Java

Please refer to the JNA documentation for more info on JNA. To call the getVersion() function in the libVersion.so library from Java, I created this class:

package version;

import com.sun.jna.Library;
import com.sun.jna.Native;

public class VersionModule {

    private Version version;
    private static VersionModule vm;

    public static void main(String[] args) {
        vm = new VersionModule();
        vm.callVersion();
    }

    private void callVersion() {
        vm.init();
        System.out.println(vm.getVersion());
    }

    public void init() {
        version = (Version) Native.loadLibrary(Version.LIBRARY_NAME, Version.class);
    }

    public int getVersion() {
        return version.getVersion();
    }

    private interface Version extends Library {
        String LIBRARY_NAME = "Version";
        int getVersion();
    }
}

I put the Java file in the src/java/version directory, like this
  • src 
    • java 
      • version 
        • VersionModule.java
    • version 
      • version.c
      • version.h
    • test.c
and I modified the compile.sh bash script to call the Java class as well:

#!/bin/bash

cd src/version
gcc -c -Wall -Werror -fpic version.c -o version.o && gcc -shared -o libVersion.so version.o
cd ..
gcc -I./version -L./version -Wall test.c -o test -lVersion
cd ..

export LD_LIBRARY_PATH=src/version:$LD_LIBRARY_PATH

echo "Calling library from C:"
./src/test

cd src/java
javac -cp .:/usr/share/java/jna-3.2.7.jar version/VersionModule.java

echo "Calling library from Java:"
java -Djna.library.path=../version -cp .:/usr/share/java/jna-3.2.7.jar version.VersionModule

Running the script now generates this output:

$ ./compile.sh
Calling library from C:
1
Calling library from Java:
1

Using two versions of the library via a proxy library

Suppose now that we get a new version of the library. The header file didn't change but the implementation of the getVersion() function did. Here is the new version:

#include <stdio.h>
#include "version.h"

int getVersion(void) {
    return 2;
}

In order to keep both versions next to each other, I renamed the directory holding the old code to version1 and I put the new code in a separate folder called version2 (for want of better names):
  • src 
    • java 
      • version 
        • VersionModule.java
    • version1 
      • version.c
      • version.h
    • version2 
      • version.c
      • version.h
    • test.c
In order to switch from one version to another, we introduce yet another C code file. This code will act as a proxy library and will perform the actual loading and unloading of the correct version of the library as well as calling the corresponding functions in the loaded library. In order to minimize code changes in the test.c file, the Java code and the compile.sh bash script, the header and code files are placed in the src/version directory. The header file version.h looks like this:

void set_library_path(char *_library_path);
int getVersion(void);

The proxy C code looks like this:

#include <stdlib.h>
#include <stdio.h>
#include <dlfcn.h>

void *handle;

void set_library_path(char *_library_path) {
    if (handle) {
        dlclose(handle);
    }
    handle = dlopen(_library_path, RTLD_NOW);
    if (!handle) {
        fputs(dlerror(), stderr);
    }
}

int getVersion(void) {
    int (*getVersion)(void);
    char *error;

    getVersion = dlsym(handle, "getVersion");
    if ((error = dlerror()) != NULL) {
        fputs(error, stderr);
    }
    return (*getVersion)();
}

The set_library_path() function loads the requested version of the library using the functions from dlfcn.h (the dynamic loading API). Then each function in the API of the library for which different versions exist needs to be mapped to a proxy function that looks up and executes the required function, in a way similar to Java reflection. Getting the mapping right can be hard, depending on the definitions of the functions that you need to proxy for. More on dynamically loading libraries can be found here. The source tree now looks like this:
  • src 
    • java 
      • version 
        • VersionModule.java
    • version 
      • version.c
      • version.h
    • version1 
      • version.c
      • version.h
    • version2 
      • version.c
      • version.h
    • test.c
Please note that in order for the Java code to be able to load the proxy library, it needs to be called libVersion.so, or else JNA will not be able to find it! The test.c code that uses the old library and then the new one is this:

#include <stdlib.h>
#include <stdio.h>
#include "version/version.h"

int main(int argc, char **argv) {
    set_library_path("src/version1/libVersion.so");
    printf("%d\n", getVersion());
    set_library_path("src/version2/libVersion.so");
    printf("%d\n", getVersion());
    return 0;
}

Here is the compile.sh bash script to compile the code and execute the C test program:

#!/bin/bash

cd src/version1
gcc -c -Wall -Werror -fpic version.c -o version.o && gcc -shared -o libVersion.so version.o
cd ../version2
gcc -c -Wall -Werror -fpic version.c -o version.o && gcc -shared -o libVersion.so version.o
cd ../version
gcc -c -Wall -Werror -fpic -rdynamic version.c -o version.o && gcc -shared -o libVersion.so version.o
cd ..
gcc -I./version -L./version -Wall test.c -o test -lVersion -ldl
cd ..

export LD_LIBRARY_PATH=src/version:$LD_LIBRARY_PATH

echo "Calling library from C:"
./src/test

The script now generates this output:

$ ./compile.sh
Calling library from C:
1
2

Calling the proxy library from Java

With this new proxy library, the code changes to call the different versions from Java are quite small as well. Here is the new code:

package version;

import com.sun.jna.Library;
import com.sun.jna.Native;

public class VersionModule {

    private Version version;
    private static VersionModule vm;

    public static void main(String[] args) {
        vm = new VersionModule();
        vm.callVersion();
    }

    private void callVersion() {
        String pwd = System.getProperty("user.dir");
        vm.init("../version1/libVersion.so");
        System.out.println(vm.getVersion());
        vm.init("../version2/libVersion.so");
        System.out.println(vm.getVersion());
    }

    public void init(String path) {
        version = (Version) Native.loadLibrary(Version.LIBRARY_NAME, Version.class);
        version.set_library_path(path);
    }

    public int getVersion() {
        return version.getVersion();
    }

    private interface Version extends Library {
        String LIBRARY_NAME = "Version";
        int getVersion();
        void set_library_path(String _library_path);
    }
}

and the new compile.sh bash script:

#!/bin/bash

cd src/version1
gcc -c -Wall -Werror -fpic version.c -o version.o && gcc -shared -o libVersion.so version.o
cd ../version2
gcc -c -Wall -Werror -fpic version.c -o version.o && gcc -shared -o libVersion.so version.o
cd ../version
gcc -c -Wall -Werror -fpic -rdynamic version.c -o version.o && gcc -shared -o libVersion.so version.o
cd ..
gcc -I./version -L./version -Wall test.c -o test -lVersion -ldl
cd ..

export LD_LIBRARY_PATH=src/version:$LD_LIBRARY_PATH

echo "Calling library from C:"
./src/test

cd src/java
javac -cp .:/usr/share/java/jna-3.2.7.jar version/VersionModule.java

echo "Calling library from Java:"
java -Djna.library.path=../version -cp .:/usr/share/java/jna-3.2.7.jar version.VersionModule

Running the script produces this output:

$ ./compile.sh
Calling library from C:
1
2
Calling library from Java:
1
2

Performance impact

The third-party native library used by my project required me to create proxy functions for no fewer than 81 functions! Clearly this raises concerns about the performance of the proxy library compared to direct use of one version of the library, even though 72 of these functions are getters and setters. A simple test, calling the whole processing sequence several times in a row with the old library and then doing the same with the proxy library calling the old library, shows that there is no measurable overhead.
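A minimal sketch of such a comparison, where processData() is a hypothetical stand-in for the real processing sequence, could look like this:

public class ProxyOverheadTest {

    // Hypothetical stand-in for the real processing sequence that calls into the
    // native library (either linked directly or through the proxy library).
    private static void processData() {
    }

    public static void main(String[] args) {
        final int runs = 1000;
        long start = System.nanoTime();
        for (int i = 0; i < runs; i++) {
            processData();
        }
        long elapsed = System.nanoTime() - start;
        System.out.println("Average time per run: " + (elapsed / runs) + " ns");
    }
}

Running this once against the directly linked library and once against the proxy gives a rough idea of the overhead per call.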

Caveats

There are a few caveats to be aware of with this method. The most important one is what to do when the API changes. A function in the new version may have the same name but different arguments or different argument types, and functions may exist in one version of the library and not in the other. From the JNA point of view, all that matters is that the methods in the interface extending Library map one to one onto functions in the proxy library. It is perfectly OK to construct the proxy library in such a way that its functions call functions in the loaded library with an entirely different name. As long as those functions get called with the proper arguments, it will work fine. If you do such a thing, then of course make sure to document it well, either in inline comments or in a technical design document, or both! By the way, it is very tempting to delegate to the end user the decisions about where the libraries to load are located and which functions can and cannot be called, but this should be avoided as much as possible. You know the internals of your Java code and proxy library, so it is your responsibility to make sure that end users do not need to go to the trouble of making it all work. It is also very bad practice to hard-code the path to the libraries in source code. It is much better to determine the location of the library to load at runtime, for example via properties, command line flags, values in a database or whatever way you prefer, as sketched below. And again, make sure to document this well in your software user manual! Finally, loading and unloading libraries takes time, so it is important to minimize the need for it as much as possible. My project, for example, needs to deal with mixed content of data, so I have made sure to gather together all data of the old format and all data of the new format and then process them sequentially.
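As an example of the runtime approach, here is a minimal sketch that reads the library location from a system property; the property name version.library.path and the fallback value are assumptions for illustration only:

public class LibraryPathResolver {

    // Resolve the native library path at runtime, e.g. passed on the command line as
    // -Dversion.library.path=/opt/myapp/lib/version2/libVersion.so
    // The property name and the fallback below are illustrative assumptions.
    public static String resolveLibraryPath() {
        return System.getProperty("version.library.path", "../version1/libVersion.so");
    }
}

The resolved path can then be handed to the proxy library via set_library_path() instead of a hard-coded string.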

There are several ways to enable user authentication for web-based applications, such as .htaccess files, plain text files, databases, LDAP, etc. They all have their pros and cons. In case a central, flexible solution is needed, either a database or an LDAP solution can be used.

I chose an LDAP solution since it can be reused by many web and application servers and the applications that run on them, while many LDAP solutions provide excellent tooling and easy configuration. Since Yenlo strongly focuses on GlassFish and other Sun tools, I chose OpenDS to authenticate against. The idea is to create a number of users and put them into one or more groups. Each group gets access to one specific Subversion repository, and every user that needs access to a certain Subversion repository needs to be in the corresponding group.

This blog describes the basic steps I took to get Basic HTTP authentication for Subversion using Apache2 and OpenDS.

Setting up the server

The server is running on Debian Etch, for which most of the required software is available via the Debian package repository. The packages I installed are

apache2
apache2-mpm-prefork
libapache2-svn
subversion
sun-java6-bin
sun-java6-jdk
sun-java6-jre

Apart from these packages, I downloaded and installed the OpenDS 2.2 zip file from the OpenDS homepage. For this purpose I created a user and a group called “opends” and the directory /usr/local/opends, to which I gave both owner and group “opends”. Next I logged in as user opends and extracted the zip file into that directory. Finally, as root, I created a start and stop script and used update-rc.d to make sure OpenDS is automatically started and stopped when the server is started and stopped.

Setting up Subversion

Let’s suppose that our organisation has decided to create a Subversion repository for each development project that we do. This means that we would get a directory structure like this, with the leaves being the root directories of each repository:

/usr/share/subversion
                     /repository1
                     /repository2

The repositories were created as root using these commands in the /usr/share/subversion directory

# svnadmin create repository1
# svnadmin create repository2

That’s all the configuration that was done to the Subversion repositories.

Configuring OpenDS

First, a basic setup of OpenDS needed to be done using the command

$ /usr/local/opends/OpenDS-2.2.0/bin/setup

For more info about this, see Setting Up the Directory Server. Please note that if you are not running OpenDS as root user, it will listen on port 1389.

The next step was to configure OpenDS so it contains a few users and groups. The main configuration tool for OpenDS is called control-panel and, in my case, it can be started as user “opends” by issuing this command:

$ /usr/local/opends/OpenDS-2.2.0/bin/control-panel

With this tool you can, among other things, manage the entries in the directory. For my purpose I created the following layout

base dn (dc=yenlo,dc=nl)
  organisation (o=yenlo)
    organisational unit (ou=devel)
      user (uid=wouter1, password = wouter1)
      user (uid=wouter2, password = wouter2)
      user (uid=wouter3, password = wouter3)
      group (cn=group1, uniqueMember=wouter1,wouter2)
      group (cn=group2, uniqueMember=wouter2,wouter3)

which is a very simple layout but sufficient for now. Note that I created two Subversion repositories and therefore two LDAP groups, one for each Subversion repository.
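As a quick sanity check of these entries, independent of Apache2, a simple LDAP bind can be attempted from Java using plain JNDI. This is only a sketch; the DN is derived from the layout above and assumes a non-root OpenDS instance listening on port 1389:

import java.util.Hashtable;
import javax.naming.Context;
import javax.naming.NamingException;
import javax.naming.directory.InitialDirContext;

public class LdapBindCheck {

    public static void main(String[] args) throws NamingException {
        Hashtable<String, String> env = new Hashtable<String, String>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
        env.put(Context.PROVIDER_URL, "ldap://localhost:1389");
        env.put(Context.SECURITY_AUTHENTICATION, "simple");
        // DN based on the layout shown above
        env.put(Context.SECURITY_PRINCIPAL, "uid=wouter1,ou=devel,o=yenlo,dc=yenlo,dc=nl");
        env.put(Context.SECURITY_CREDENTIALS, "wouter1");

        new InitialDirContext(env).close(); // throws a NamingException if the bind fails
        System.out.println("Bind succeeded for uid=wouter1");
    }
}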

Setting up Apache2 to get access to the Subversion repositories

First I configured Apache2 so the two repositories were accessible to anyone using e.g. a web browser. After libapache2-svn is installed, the module should be enabled in the Apache2 configuration files automatically. To verify this, check the

/etc/apache2/mods-enabled/

directory and make sure these symbolic links exist

dav.load -> ../mods-available/dav.load
dav_svn.conf -> ../mods-available/dav_svn.conf
dav_svn.load -> ../mods-available/dav_svn.load

If the first link is missing, create it as root using

# a2enmod dav

If the second and third are missing, create them as root using

# a2enmod dav_svn

To make sure the modules are loaded by Apache2, restart the server as root using

# /etc/init.d/apache2 restart

Now some modifications need to be made to the dav_svn configuration file

/etc/apache2/mods-available/dav_svn.conf

The idea is that there are two Subversion repositories that need to be configured individually. So, these two entries need to be added to the above-mentioned configuration file

<Location /svn/repository1>
DAV svn
SVNPath /usr/share/subversion/repository1
SVNListParentPath On
SVNAutoversioning On
SVNReposName "Repository1 Subversion Repository"
</Location>
<Location /svn/repository2>
DAV svn
SVNPath /usr/share/subversion/repository2
SVNListParentPath On
SVNAutoversioning On
SVNReposName "Repository2 Subversion Repository"
</Location>

Now, if you point your browser to either http://<host>/svn/repository1 or http://<host>/svn/repository2 you should see the root of your repository.

Adding LDAP support using OpenDS

In order to use LDAP authentication with Apache2, these two entries should be present in /etc/apache2/mods-enabled:

authnz_ldap.load -> ../mods-available/authnz_ldap.load
ldap.load -> ../mods-available/ldap.load

If they aren’t present, you can enable them as root with these commands

# a2enmod authnz_ldap
# a2enmod ldap
# /etc/init.d/apache2 restart

Now, each <Location> entry in /etc/apache2/mods-available/dav_svn.conf should be extended with these lines in order to get LDAP authentication working with OpenDS. Please pay attention to the group you assign to each Subversion repository location; I am only showing the configuration addition for repository1 and leave it as an exercise for you to set up the second repository configuration. Also please make sure that the correct password for the Directory Manager is provided. For more info about these configuration settings, please consult the OpenDS wiki page on Apache Web Server.

AuthType Basic
AuthName "Repository1 Subversion Repository"
AuthBasicProvider ldap
AuthzLDAPAuthoritative off
AuthLDAPURL ldap://localhost:1389/dc=yenlo,dc=nl?uid
AuthLDAPBindDN "cn=Directory Manager"
AuthLDAPBindPassword mypassword
AuthLDAPGroupAttribute uniqueMember
AuthLDAPGroupAttributeIsDN on
Require ldap-group cn=group1,ou=devel,o=yenlo,dc=yenlo,dc=nl

Issue a final

# /etc/init.d/apache2 restart

and you should be prompted for a username and password when you try to access either repository. Using the correct uid and password combination should grant you access to the repository.
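If you prefer to verify this without a browser, a small Java program can perform the same Basic authentication check; a rough sketch, assuming the server runs on localhost and uses the uid/password pairs created earlier:

import java.io.IOException;
import java.net.Authenticator;
import java.net.HttpURLConnection;
import java.net.PasswordAuthentication;
import java.net.URL;

public class RepositoryAccessCheck {

    public static void main(String[] args) throws IOException {
        // Answer the HTTP Basic authentication challenge with an LDAP uid/password.
        Authenticator.setDefault(new Authenticator() {
            protected PasswordAuthentication getPasswordAuthentication() {
                return new PasswordAuthentication("wouter1", "wouter1".toCharArray());
            }
        });

        HttpURLConnection connection = (HttpURLConnection)
                new URL("http://localhost/svn/repository1/").openConnection();
        // 200 means access was granted; 401 means the credentials were rejected.
        System.out.println("HTTP status: " + connection.getResponseCode());
    }
}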

Next steps

The authentication mechanism we chose, Basic authentication, sends usernames and passwords in plain text, which could be harmful in case someone sniffs the connection. I would therefore strongly recommend setting up SSL-based connections. Moreover, this blog post only shows how to secure one repository layout using LDAP. Both Subversion and Apache are sufficiently flexible that other layouts are possible; those layouts will most likely require tweaking of the Apache2 configuration options. Finally, I am not an LDAP expert, so the directory structure I chose in OpenDS can most likely be much improved. However, following the steps in this article should get you on your way to using OpenDS in combination with Apache2.

 

This entry was originally posted on the Yenlo B.V. weblog.

According to the PrimeFaces website, "PrimeFaces is an open source component suite for Java Server Faces featuring 70+ Ajax powered rich set of JSF components. Additional TouchFaces module features a UI kit for developing mobile web applications." Since it is an open source JSF component suite that is very close to releasing JSF 2.0 compliant components, I figured it was time to try it out on GlassFish v3.

A very easy and powerful way of creating Java EE 6 compliant applications is to use NetBeans 6.8, which comes with great GlassFish v3 and Maven support. The first thing to do is to create a new Maven Web Application. The wizard that helps you create it allows you to specify the Java EE version, which in this case should be 6. To make sure the PrimeFaces libraries are included in your project, add the following dependency to your pom.xml file

<dependency>
 <groupId>org.primefaces</groupId>
 <artifactId>primefaces</artifactId>
 <version>2.0.0.RC</version>
</dependency>

Since the PrimeFaces jars are hosted on the PrimeFaces Maven repository, you'll need to add the repository as well:

<repository>
 <id>prime-repo</id>
 <name>Prime Technology Maven Repository</name>
 <url>http://repository.prime.com.tr/</url>
 <layout>default</layout>
</repository>

PrimeFaces makes use of a servlet to serve resources such as CSS and JavaScript. Therefore, you need to register the Resource Servlet in web.xml like this

<servlet>
 <servlet-name>Resource Servlet</servlet-name>
 <servlet-class>org.primefaces.resource.ResourceServlet</servlet-class>
</servlet>
<servlet-mapping>
 <servlet-name>Resource Servlet</servlet-name>
 <url-pattern>/primefaces_resource/*</url-pattern>
</servlet-mapping>

A split pane

A simple index.xhtml file with a split pane may then look like this

<?xml version="1.0" encoding="UTF-8"?>
 <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
 <html xmlns="http://www.w3.org/1999/xhtml"
       xmlns:h="http://java.sun.com/jsf/html"
       xmlns:p="http://primefaces.prime.com.tr/ui">
  <head>
   <meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>
   <title>PrimeFaces Test</title>
   <p:resources />
  </head>
  <body>
   <p>
    <p:layout style="width:400px;height:200px;">
     <p:layoutUnit position="west" size="100">Left Pane</p:layoutUnit>
     <p:layoutUnit position="center">Right Pane</p:layoutUnit>
    </p:layout>
   </p>
  </body>
 </html>

which looks like this

The left bar is the one being dragged by the mouse.

File upload

Another nice PrimeFaces component is the fileUpload component. It supports single and multiple file uploads. To be able to use the fileUpload component, a few Apache Commons dependencies need to be added:

<dependency>
 <groupId>commons-fileupload</groupId>
 <artifactId>commons-fileupload</artifactId>
 <version>1.2.1</version>
</dependency>
<dependency>
 <groupId>org.apache.commons</groupId>
 <artifactId>commons-io</artifactId>
 <version>1.3.2</version>
</dependency>

Next, a filter needs to be added to web.xml and the JavaServer Faces state saving method should be set to server. So, add these lines to web.xml

<context-param>
 <param-name>javax.faces.PROJECT_STAGE</param-name>
 <param-value>Development</param-value>
</context-param>
<context-param>
 <param-name>javax.faces.STATE_SAVING_METHOD</param-name>
 <param-value>server</param-value>
</context-param>
<filter>
 <filter-name>PrimeFaces FileUpload Filter</filter-name>
 <filter-class>org.primefaces.webapp.filter.FileUploadFilter</filter-class>
 <init-param>
  <param-name>thresholdSize</param-name>
  <param-value>51200</param-value>
 </init-param>
 <init-param>
  <param-name>uploadDirectory</param-name>
  <param-value>/tmp</param-value>
 </init-param>
</filter>
<filter-mapping>
 <filter-name>PrimeFaces FileUpload Filter</filter-name>
 <servlet-name>Faces Servlet</servlet-name>
</filter-mapping>

Then a JSF managed bean is needed to handle the file uploads. A very simple one, which doesn't do anything with the uploaded files at all, may look like this

import javax.faces.application.FacesMessage;
import javax.faces.bean.ManagedBean;
import javax.faces.bean.RequestScoped;
import javax.faces.context.FacesContext;
import javax.faces.event.AbortProcessingException;
// PrimeFaces upload event and listener types (assumed to live in org.primefaces.event in this release)
import org.primefaces.event.FileUploadEvent;
import org.primefaces.event.FileUploadListener;

@ManagedBean(name = "fileUploadController")
@RequestScoped
public class FileUploadController implements FileUploadListener {
 @Override
 public void processFileUpload(FileUploadEvent event) throws AbortProcessingException {
  System.out.println("Uploaded: " + event.getFile().getFileName());
  FacesMessage msg = new FacesMessage("Successful", event.getFile().getFileName() + " is uploaded.");
  FacesContext.getCurrentInstance().addMessage(null, msg);
 }
}

Finally, change index.xhtml to this

<?xml version="1.0" encoding="UTF-8"?>
 <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
 <html xmlns="http://www.w3.org/1999/xhtml"
       xmlns:h="http://java.sun.com/jsf/html"
       xmlns:p="http://primefaces.prime.com.tr/ui">
  <head>
   <meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>
   <title>PrimeFaces Test</title>
   <p:resources />
  </head>
  <body>
   <p>
    <h:form id="form" enctype="multipart/form-data" prependId="false">
     <p:growl id="messages" />
     <p:layout style="width:400px;height:200px;">
      <p:layoutUnit position="west" size="100">Left Pane</p:layoutUnit>
      <p:layoutUnit position="center">Right Pane</p:layoutUnit>
     </p:layout>
     <p:fileUpload fileUploadListener="#{fileUploadController.processFileUpload}" id="documentToUpload"
                   allowTypes="*.jpg;*.png;*.gif;" description="Images" update="messages"/>
    </h:form>
   </p>
  </body>
 </html>

Redeploy and the result looks like this

Conclusion

It is quite easy to get started with PrimeFaces using Maven and GlassFish. Please bear in mind that the fileUpload component is quite picky about the order in which filters are specified in web.xml when there is more than one. Please see this forum thread for more info.

In case you're having trouble following the instructions in this blog, here is my NetBeans project. Since it is a Maven project, you should be able to open it in Eclipse or any other IDE.

This entry was originally posted on the AMIS Technology Blog.

In the past few months several Java EE 6 related JSRs (Java Specification Requests) have been finalized. The final ballot for them ended on November 30 and all were approved. Today, December 10, 2009, Java EE 6 and GlassFish v3, THE reference implementation of Java EE 6, are released.

Four and a half years after the release of Java EE 5 we enter the next Java EE era. GlassFish v3 is the first application server that fully supports all Java EE 6 technologies. The list of supported technologies includes, but is not limited to,

  • Servlet 3.0
  • JSF 2.0
  • WebBeans
  • CDI (Contexts and Dependency Injection)
  • Bean Validation
  • EJB 3.1
  • JPA 2.0
  • JAX-RS

GlassFish v3 can be downloaded in two flavours. The first one contains all of the Java EE 6 technologies; the second one contains the technologies specified in the Java EE 6 Web Profile specification.

 

If there was only one GlassFish v3/Java EE 6 thing I'd like to mention, it would be that Java EE has become more powerful than ever before. Now we can use annotations instead of XML configuration, we can do without ear files because war files are sufficient, Ajax has been standardized, and asynchronous calls are supported all the way from the client almost down to the database. And now you can do all that with GlassFish v3. A reason to celebrate? You bet it is!
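To give a small taste of that, a Servlet 3.0 servlet can be registered by annotation alone and serve a request asynchronously; a minimal sketch, not tied to any particular application:

import java.io.IOException;
import javax.servlet.AsyncContext;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Registered purely by annotation -- no web.xml entry required in Servlet 3.0.
@WebServlet(urlPatterns = "/hello", asyncSupported = true)
public class HelloServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        final AsyncContext ctx = req.startAsync(); // handle the request asynchronously
        ctx.start(new Runnable() {
            public void run() {
                try {
                    ctx.getResponse().getWriter().println("Hello from Java EE 6");
                } catch (IOException e) {
                    // ignored in this sketch
                } finally {
                    ctx.complete();
                }
            }
        });
    }
}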


For those wanting to know all there is to know about GlassFish v3, please attend the Virtual Conference on December 15. For more info, see the conference flyer. You may also want to visit the GlassFish Community Home-page and the GlassFish Enterprise Home-page (should be live any moment now). Finally, you can download the Java EE 6 SDK and find even more info at the Java EE 6 homepage.

This blog entry was originally posted on the AMIS Technology Blog.
