
I had referenced Ted Wise's post Using Java 1.5 and Java 1.4 on Snow Leopard in a project, and happened to notice a comment from a week ago about someone having trouble getting Java 1.4 to work in Mac OS X 10.7/Lion, so I decided to check it out, since I might run into the same issue later.

I'm using Snow Leopard currently, so I used a free utility called unpkg to look at the Java for Mac OS X 10.7 release from Apple, as well as the two other latest non-preview releases: Java for Mac OS X 10.6 Update 4 and Java for Mac OS X 10.5 Update 9.

Even though Java for Mac OS X 10.5 Update 5 contained the src.jar for 1.4.2_22, it was axed in Java for Mac OS X 10.5 Update 6, with the statement "J2SE 1.4.2 is no longer being updated to fix issues and is therefore disabled by default in this update." (Note that Oracle's (formerly Sun's) Java 1.4 had reached EOL much earlier on October 30th, 2008.)

The current Java for Mac OS X 10.5 Update 9 contains a src.jar for Java 1.5/5.0 (1.5.0_28-b04) and one for 1.6/6.0 (1.6.0_24-b07-334). (Note that Oracle's (formerly Sun's) Java 1.5 reached its EOSL on November 3, 2009.) However, Apple's Java for OS X 10.6 and 10.7 do not contain Java 1.5/5.0 anymore; it is only a symlink to 1.6/6.0.

Java for Mac OS X 10.6 Update 4 only contains a src.jar for 1.6/6.0 (also 1.6.0_24-b07-334).

Java for Mac OS X 10.7 (which, as far as I can tell from the Apple Developer site, is not a preview release at the time of writing, so should be OK to discuss here) does not contain a src.jar even though it contains 1.6.0_24-b07-345 (according to its System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Info.plist).

I'm not sure why Java for Mac OS X 10.7 included a 1.6 build of Java without the associated src.jar. Was it an oversight, or intentional? Sure, it is the same version of Java, so perhaps it is very close to, or has the same source code as, Java for Mac OS X 10.6 Update 4; but still, the build number was incremented (by 1, so I doubt it was a big change; it might have just been a recompilation).

If you're interested in Java 1.7 (7.0), Apple announced the OpenJDK project for Mac OS X in November 2010. There is more info on the OpenJDK OS X port on its project page and wiki, and you can check out the source if you'd like.

If you need guaranteed fast response time and simple caching of an unruly RSS feed, some other resource accessible via HTTP GET, or, for that matter, just about any Java static or instance method call, check out Nerot.

Nerot uses Quartz, Rome, HttpClient, reflection, and a simple in-memory store to let you easily schedule some action, such as getting a Rome SyndFeed from a pre-configured or runtime-defined URL; then whenever the result is requested, it is just a matter of retrieving the cached value. If you're using Spring, you can define a bean in your webapp's root context to start the scheduled task, and then access it quickly and easily later in the same context or a child context.
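To make the pattern concrete, here is a minimal sketch of the idea using only java.util.concurrent rather than Nerot's actual API (the class and method names below are my own invention, not Nerot's): a task runs on a schedule, its latest result is cached, and readers always get the cached value immediately instead of waiting on a slow fetch.

```java
import java.util.concurrent.*;
import java.util.concurrent.atomic.AtomicReference;

public class ScheduledCache<T> {

    private final AtomicReference<T> latest = new AtomicReference<T>();
    private final ScheduledExecutorService scheduler =

    // Schedules the fetch to run immediately and then every periodSeconds thereafter.
    public void start(final Callable<T> fetch, long periodSeconds) {
        scheduler.scheduleAtFixedRate(new Runnable() {
            public void run() {
                try {
                    latest.set(;
                } catch (Exception e) {
                    // Keep the last good result on failure; a real implementation
                    // might log this or track staleness.
                }
            }
        }, 0, periodSeconds, TimeUnit.SECONDS);
    }

    // Returns instantly with whatever was last fetched (null until the first success).
    public T get() {
        return latest.get();
    }

    public void stop() {
        scheduler.shutdownNow();
    }
}
```

With Spring, the start(...) call would typically happen in a bean defined in the root context, so child contexts can read the cached value cheaply.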

For more information about Nerot, check out its site or its GitHub project.

First off, I want to thank Apple for being such big supporters of Java. You may have gone off and done your own thing, but as Java developers, we still have to give some serious kudos.

However, what has been going on with symlinks in the /System/Library/Frameworks/JavaVM.framework/Versions directory is wrong. If I remember correctly, I noticed this at first with the upgrade to Snow Leopard, and it seems to have continued to happen with each subsequent update since. (So this article is way late, but so be it.) Here's what it looks like: (I had to remove a few directories from the listing and others were there before from previous upgrades, so it may not look exactly the same as yours.)

May 24 09:15 1.3 -> 1.3.1
Jul 20  2009 1.3.1
May 24 09:15 1.4 -> CurrentJDK
May 24 09:15 1.4.2 -> CurrentJDK
May 24 09:15 1.5 -> CurrentJDK
May 24 09:15 1.5.0 -> CurrentJDK
May 24 09:15 1.6 -> 1.6.0
Nov 30  2009 1.6.0
May 24 09:15 A
May 24 09:15 Current -> A
May 24 09:15 CurrentJDK -> 1.6

Apple keeps pointing the symlinks for previous versions of Java at the Current/CurrentJDK directories which are Java 6.
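If you want to see for yourself where a version symlink really ends up, a quick sketch: Path.toRealPath() follows the whole chain (e.g. 1.4 -> CurrentJDK -> 1.6 -> 1.6.0), which is how you can confirm that "1.4" is actually Java 6 underneath. The paths in main() are Apple's layout from the listing above; the resolve() helper works on any symlink chain.

```java
import java.nio.file.*;

public class ResolveJavaVersion {

    // Follows the full symlink chain and returns the real directory.
    public static Path resolve(String versionDir) throws IOException {
        return Paths.get(versionDir).toRealPath();
    }

    public static void main(String[] args) throws Exception {
        String base = "/System/Library/Frameworks/JavaVM.framework/Versions/";
        for (String v : new String[] {"1.4", "1.5", "1.6", "CurrentJDK"}) {
            System.out.println(v + " -> " + resolve(base + v));
        }
    }
}
```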

My guess is that this is just for backwards compatibility: Apple wants older OS X apps that need older Java versions to keep running, without having to maintain those versions themselves. It is true that some older Java apps/libraries will work this way, though I had a few that didn't. So the bulk of OS X users who aren't Java developers get most of their Java apps running. But what about the large Java development community that uses OS X? What about the apps that don't work in Apple's Java 6?

I'm sure Apple feels that they are the benevolent dictator of this directory, and that developers and everyone else should kindly stay out and use the Java Preferences app. But with articles out there discussing related issues with Java and Snow Leopard, I was really hoping that by now Apple would have done something. As far as I can tell after my most recent upgrade to OS X 10.6.4, they haven't.

So what am I getting at? I think that in future versions of OS X, Apple should stop messing with any symlinks in here. If they have OS X components that require Java 6, have them point at the 1.6 symlink, for example. If a completely new version of Java needs to be installed, and new symlinks need to be set up, that is fine. Tell OS X app developers somewhere that they should be pointing to the 1.4, 1.5, or 1.6 directory, and not Current/CurrentJDK, if they want to make sure it stays stable. But please don't mess with older version directories and symlinks like this. Maybe also undo the harm you did by providing 1.4 and 1.5 versions that work in OS X 10.6.x. Even if it means having to give up control, contracting a bunch of developers to migrate OS X's Java to OpenJDK, donating most of the code to that project, and decoupling Java from OS X completely... please just do something to make this better.

Edit: Apple seems to have done it yet again with Mac OS X 10.6 Update 4, messing with my Java 1.5.0-leopard install. See Recovering Lost Java 1.5.0 for Snow Leopard After Java for Mac OS X 10.6 Update 4 Install for more info, and be sure to see Using Java 1.5 and Java 1.4 on Snow Leopard by Ted Wise.

Today I glanced into /usr/share/java on a CentOS release 5.2 (Final) server with Tomcat and Java installed via RPM. It literally burned my eyes, and hopefully you can see why.

My eyes first caught this: 

libgcj-4.1.2.jar -> libgcj-4.1.1.jar
libgcj-tools-4.1.2.jar -> libgcj-tools-4.1.1.jar

Owww!!! It burns! Regardless of how trivial a small version bump in a jar might be for one version of one application, since this is a shared area for jars, you don't know what some other application expects out of that jar. And if the person trying to track down an issue thinks they are using one version of a jar, but it really points at a different version... Aaaaarghh! A support issue that takes five months to diagnose, coming right up, says the evil one.
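A quick way to audit a directory like this for suspicious symlinks, as a sketch (the version-parsing regex below is my own rough assumption about jar naming, not any standard): flag every symlinked jar whose own version does not match its target's version.

```java
import java.nio.file.*;
import java.util.regex.*;

public class JarSymlinkAudit {

    // Extracts a trailing version like "4.1.1" from a jar file name, if present.
    static String version(String name) {
        Matcher m = Pattern.compile("-(\\d+(?:\\.\\d+)*[^/]*)\\.jar$").matcher(name);
        return m.find() ? : null;
    }

    // Prints symlinked jars whose own version differs from their target's version,
    // e.g. libgcj-4.1.2.jar -> libgcj-4.1.1.jar would be flagged.
    public static void audit(Path dir) throws IOException {
        try (DirectoryStream<Path> jars = Files.newDirectoryStream(dir, "*.jar")) {
            for (Path jar : jars) {
                if (!Files.isSymbolicLink(jar)) continue;
                Path target = Files.readSymbolicLink(jar);
                String linkVer = version(jar.getFileName().toString());
                String targetVer = version(target.getFileName().toString());
                if (linkVer != null && !linkVer.equals(targetVer)) {
                    System.out.println("SUSPICIOUS: " + jar.getFileName() + " -> " + target);
                }
            }
        }
    }
}
```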

Next up: 

xerces-j2.jar -> xerces-j2-2.7.1.jar

Jar names can just be all over the place, and it isn't uncommon at all to see a jar without a version in its name shipped with an app. But here, in a shared area for jars, having a symlink with a generic non-versioned name pointing at one version of a jar, which could later be changed to point somewhere else, is nearly as evil. Once that swap is made, how do you know which version of the jar you were supposed to have been using? Reinstall/redownload it? Was that really the version? Who knows?

How about something that is just wrong (yes this was actually in there): 

wsdl4j.jar -> qname-1.5.2.jar

Even if the two jars have equivalent classes in the same packages, unless it really was just a jar rename, you really don't know what classes someone else using the jar expects to be in there. Just because it works in one application doesn't mean it will work for all. I'd say this one gets the five-pitchfork seal of DISapproval.
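Here is a sketch of how you might actually verify the claim "these two jars are equivalent" before trusting a rename-style symlink like wsdl4j.jar -> qname-1.5.2.jar: compare the sets of class entries each jar contains. (The class and method names are mine, for illustration.)

```java
import java.nio.file.*;
import java.util.*;
import java.util.jar.*;

public class JarClassDiff {

    // Returns the names of the .class entries in a jar.
    public static Set<String> classEntries(Path jarPath) throws IOException {
        Set<String> classes = new TreeSet<String>();
        JarFile jar = new JarFile(jarPath.toFile());
        try {
            for (Enumeration<JarEntry> e = jar.entries(); e.hasMoreElements();) {
                String name = e.nextElement().getName();
                if (name.endsWith(".class")) {
                    classes.add(name);
                }
            }
        } finally {
            jar.close();
        }
        return classes;
    }

    // Classes present in expected but missing from actual -- any hit here means a
    // consumer of the symlink could get NoClassDefFoundError at runtime.
    public static Set<String> missingFrom(Path expected, Path actual) throws IOException {
        Set<String> missing = new TreeSet<String>(classEntries(expected));
        missing.removeAll(classEntries(actual));
        return missing;
    }
}
```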


servletapi5.jar -> tomcat5-servlet-2.4-api-5.5.23.jar

Be very wary of symlinks with generic names that point at a specific implementation. This is very likely also a pitchfork rapping at the door. If you are going to have symlinks, they had better darn well be named for the exact name and version of the jar they point to!

Ok. I'm sorry if I've ruffled any feathers on a Friday, and I didn't mean to scare the rest of you so much. But just know that there is a Jar Hell encrusted with symlinks out there. And it might just be right at your backdoor. 

P.S.- For those that just can't get enough, I'll leave you with the full list. I'm sure you'll find others: 


activation.jar -> classpathx-jaf-1.0.jar
ant.jar -> ant-1.6.5.jar
ant-launcher.jar -> ant-launcher-1.6.5.jar
antlr.jar -> antlr-2.7.6.jar
bcel.jar -> bcel-5.1.jar
catalina-ant5.jar -> catalina-ant-5.5.23.jar
classpathx-jaf.jar -> classpathx-jaf-1.0.jar
classpathx-mail-1.3.1-monolithic.jar -> classpathx-mail-1.3.1-monolithic-1.1.1.jar
commons-beanutils-1.7.0.jar -> jakarta-commons-beanutils-1.7.0.jar
commons-beanutils-bean-collections-1.7.0.jar -> jakarta-commons-beanutils-bean-collections-1.7.0.jar
commons-beanutils-bean-collections.jar -> commons-beanutils-bean-collections-1.7.0.jar
commons-beanutils-core-1.7.0.jar -> jakarta-commons-beanutils-core-1.7.0.jar
commons-beanutils-core.jar -> commons-beanutils-core-1.7.0.jar
commons-beanutils.jar -> commons-beanutils-1.7.0.jar
commons-collections-3.2.jar -> jakarta-commons-collections-3.2.jar
commons-collections.jar -> commons-collections-3.2.jar
commons-daemon-1.0.1.jar -> jakarta-commons-daemon-1.0.1.jar
commons-daemon.jar -> commons-daemon-1.0.1.jar
commons-dbcp-1.2.1.jar -> jakarta-commons-dbcp-1.2.1.jar
commons-dbcp.jar -> commons-dbcp-1.2.1.jar
commons-digester-1.7.jar -> jakarta-commons-digester-1.7.jar
commons-digester-1.7-rss.jar -> jakarta-commons-digester-1.7-rss.jar
commons-digester.jar -> commons-digester-1.7.jar
commons-digester-rss.jar -> commons-digester-1.7-rss.jar
commons-discovery-0.3.jar -> jakarta-commons-discovery-0.3.jar
commons-discovery.jar -> commons-discovery-0.3.jar
commons-el-1.0.jar -> jakarta-commons-el-1.0.jar
commons-el.jar -> commons-el-1.0.jar
commons-fileupload-1.0.jar -> jakarta-commons-fileupload-1.0.jar
commons-fileupload.jar -> commons-fileupload-1.0.jar
commons-httpclient-3.0.jar -> jakarta-commons-httpclient-3.0.jar
commons-httpclient3.jar -> commons-httpclient.jar
commons-httpclient.jar -> commons-httpclient-3.0.jar
commons-launcher-0.9.jar -> jakarta-commons-launcher-0.9.jar
commons-launcher.jar -> commons-launcher-0.9.jar
commons-logging-1.0.4.jar -> jakarta-commons-logging-1.0.4.jar
commons-logging-api-1.0.4.jar -> jakarta-commons-logging-api-1.0.4.jar
commons-logging-api.jar -> commons-logging-api-1.0.4.jar
commons-logging.jar -> commons-logging-1.0.4.jar
commons-modeler-1.1.jar -> jakarta-commons-modeler-1.1.jar
commons-modeler.jar -> commons-modeler-1.1.jar
commons-pool-1.3.jar -> jakarta-commons-pool-1.3.jar
commons-pool.jar -> commons-pool-1.3.jar
com-sun-javadoc.jar -> com-sun-javadoc-0.7.7.jar
com-sun-tools-doclets-Taglet.jar -> com-sun-tools-doclets-Taglet-0.7.7.jar
dom3-xerces-j2-2.7.1.jar -> xerces-j2-2.7.1.jar
dom3-xerces-j2.jar -> dom3-xerces-j2-2.7.1.jar
eclipse-ecj.jar -> /usr/share/eclipse/plugins/org.eclipse.jdt.core_3.2.1.v_677_R32x.jar
ejb.jar -> geronimo/spec-ejb-2.1.jar
gnu-classpath-tools-gjdoc.jar -> gnu-classpath-tools-gjdoc-0.7.7.jar
hibernate_jdbc_cache.jar -> /etc/alternatives/hibernate_jdbc_cache
j2ee-connector.jar -> geronimo/spec-j2ee-connector-1.5.jar
j2ee-deployment.jar -> geronimo/spec-j2ee-deployment-1.1.jar
j2ee-management.jar -> geronimo/spec-j2ee-management-1.0.jar
jacc.jar -> geronimo/spec-j2ee-jacc-1.0.jar
jaf-1.0.2.jar -> classpathx-jaf-1.0.jar
jaf.jar -> /etc/alternatives/jaf
jakarta-commons-beanutils-bean-collections.jar -> jakarta-commons-beanutils-bean-collections-1.7.0.jar
jakarta-commons-beanutils-core.jar -> jakarta-commons-beanutils-core-1.7.0.jar
jakarta-commons-beanutils.jar -> jakarta-commons-beanutils-1.7.0.jar
jakarta-commons-collections.jar -> jakarta-commons-collections-3.2.jar
jakarta-commons-daemon.jar -> jakarta-commons-daemon-1.0.1.jar
jakarta-commons-dbcp.jar -> jakarta-commons-dbcp-1.2.1.jar
jakarta-commons-digester.jar -> jakarta-commons-digester-1.7.jar
jakarta-commons-digester-rss.jar -> jakarta-commons-digester-1.7-rss.jar
jakarta-commons-discovery.jar -> jakarta-commons-discovery-0.3.jar
jakarta-commons-el.jar -> jakarta-commons-el-1.0.jar
jakarta-commons-fileupload.jar -> jakarta-commons-fileupload-1.0.jar
jakarta-commons-httpclient.jar -> jakarta-commons-httpclient-3.0.jar
jakarta-commons-launcher.jar -> jakarta-commons-launcher-0.9.jar
jakarta-commons-logging-api.jar -> jakarta-commons-logging-api-1.0.4.jar
jakarta-commons-logging.jar -> jakarta-commons-logging-1.0.4.jar
jakarta-commons-modeler.jar -> jakarta-commons-modeler-1.1.jar
jakarta-commons-pool.jar -> jakarta-commons-pool-1.3.jar
jasper5-compiler.jar -> jasper5-compiler-5.5.23.jar
jasper5-runtime.jar -> jasper5-runtime-5.5.23.jar
javamail.jar -> /etc/alternatives/javamail
jaxp_parser_impl.jar -> /etc/alternatives/jaxp_parser_impl
jaxp_transform_impl.jar -> /etc/alternatives/jaxp_transform_impl
jdtcore.jar -> /usr/share/java/eclipse-ecj.jar
jms.jar -> geronimo/spec-jms-1.1.jar
jmxri.jar -> /etc/alternatives/jmxri
jspapi.jar -> tomcat5-jsp-2.0-api-5.5.23.jar
jsp.jar -> /etc/alternatives/jsp
jta.jar -> geronimo/spec-jta-1.0.1B.jar
ldapbeans.jar -> ldapbeans-4.18.jar
ldapfilt.jar -> ldapfilt-4.18.jar
ldapjdk.jar -> ldapjdk-4.18.jar
ldapsp.jar -> ldapsp-4.18.jar
libgcj-4.1.2.jar -> libgcj-4.1.1.jar
libgcj-tools-4.1.2.jar -> libgcj-tools-4.1.1.jar
log4j.jar -> log4j-1.2.13.jar
regexp.jar -> regexp-1.4.jar
servletapi5.jar -> tomcat5-servlet-2.4-api-5.5.23.jar
servlet.jar -> /etc/alternatives/servlet
tomcat5-jsp-2.0-api.jar -> tomcat5-jsp-2.0-api-5.5.23.jar
tomcat5-servlet-2.4-api.jar -> tomcat5-servlet-2.4-api-5.5.23.jar
wsdl4j.jar -> qname-1.5.2.jar
xalan-j2.jar -> xalan-j2-2.7.0.jar
xalan-j2-serializer.jar -> xalan-j2-serializer-2.7.0.jar
xerces-j2.jar -> xerces-j2-2.7.1.jar
xml-commons-apis.jar -> xml-commons-apis-1.3.02.jar
xml-commons-resolver.jar -> xml-commons-resolver-1.1.jar

See the tomcat-users mailing list thread for more comments.

For several years now (at least since mobile devices were first able to browse the net in some form or fashion), companies and organizations have been increasingly interested in making their sites/web applications mobile-friendly.

But from what little I know, supporting mobile devices is not as simple as supporting a single alternate format.

Mobile devices support HTML, XHTML (various), WML, and other formats to varying degrees. Some tags are supported, some aren't. And that variation basically carved out a niche for companies to attempt to solve those sorts of issues by getting a lot of different mobile devices and then testing sites/content on each. (I used to work for a mobile content aggregator/provider, and that testing and their knowledge of what was supported on each device was the reason that they had relationships with major carriers, media companies, etc. to provide mobile content).

(Relatively) newer mobile devices (iPhones, etc.) are in general a lot better about supporting HTML than older devices (at least as far as I know), and WML has been on its way out for a good while (as has the number of devices in the wild that really use it). While there has been a bit of proliferation in mobile content markup languages, as can be seen in the timeline "Evolution of Mobile Web-Related Markup Languages" (from Wikipedia), from what I remember, it wasn't so much about the standards as it was about how devices supported, and/or failed to support, those standards.

In general, a "one size fits all" XSLT transform to support most mobile devices doesn't really work that well :) . And just because an application supports a mobile markup language that happened to work on the single device (or handful of devices) it was tested against doesn't mean its authors can put a big "mobile devices supported" stamp on their site, at least not without crossing their fingers behind their backs. :)

That said, should you remove the WML support from your app that you added 3 years ago when it seemed like a good idea? It probably isn't worth the time.

But if there is a question of whether you should improve the quality of your existing services/applications vs. spending a good deal of effort to partially support "mobile devices" in general (without assistance from a major company that does it all the time), I'd choose the former, not the latter. And yes, I'm perfectly aware of the many, many mobile devices out there and the huge market that entails.

My guess is that the way things are headed (granted, some years down the road), it will be less about the (generic) web content format provided from the server side and more about how a site's content should be rendered on the device side, as bandwidth becomes less of an issue. I'd also bet that there will be some major change in the format of all web content over the next 10 years to support that kind of thinking. But I could definitely be wrong on that.

As always, I welcome all to provide feedback, even if to say that I'm off my rocker, as I often am. ;)

As a follow-up from the previous article on the Interaction-Flow-Service-Model Architectural Pattern (IFSM), let's talk about suggestions for developing a more modular flow layer. 
  • All non-flow-related logic should be pushed to the service layer (or an additional service-logic layer).
  • Aim for simplicity in controllers and related contextual VOs. Unless something else is needed, lean towards simple generic constructs, such as a Map, for storing flow context.
  • If persistence is required for any data (including flow state), store and retrieve it via services, and minimize any handling required in the flow.
  • Avoid complex flow logic, assumptions about flow control, and assumptions about scopes, as they might be implementation-specific. The framework might offer 10 options for how to do something, but choose the one that keeps things as simple as possible to guard against future changes.
  • At times the services will need access to the original request object/external context. It is simple to wrap the request object/external context in another object that you can call to get the specific information you need from the request (like getting the remote user, getting a session attribute, checking a user's role, etc.), and it might save some headache later if you swap out the flow layer for another implementation that has a different concept of context.
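As a hedged sketch of that last bullet (the interface and names below are hypothetical, not from SWF2 or any framework): services depend only on a narrow context interface, so swapping the flow layer means writing one new adapter rather than touching the services.

```java
import java.util.*;

// The narrow view of the external context that services are allowed to see.
interface UserContext {
    String getRemoteUser();
    boolean isUserInRole(String role);
    Object getSessionAttribute(String name);
}

// A trivial Map-backed implementation, e.g. for unit tests or a non-servlet caller.
class MapUserContext implements UserContext {
    private final String user;
    private final Set<String> roles;
    private final Map<String, Object> session;

    MapUserContext(String user, Set<String> roles, Map<String, Object> session) {
        this.user = user;
        this.roles = roles;
        this.session = session;
    }

    public String getRemoteUser() { return user; }
    public boolean isUserInRole(String role) { return roles.contains(role); }
    public Object getSessionAttribute(String name) { return session.get(name); }
}
```

A servlet-backed adapter would delegate the same three calls to HttpServletRequest; the flow and services never need to know which one they got.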

Let's also talk about scopes in the flow layer. For the official description of how Spring Web Flow 2 handles flow scope see the Special EL variables section of their reference docs. 

In Spring Web Flow 2: 

  • flowScope - within the scope of a SWF2 "flow". There can be multiple flows per application. This is the primary scope used in SWF2. Smaller than web session scope, greater than web request scope.
  • viewScope - scope within a SWF2 "view-state". Smaller than web session scope, greater than web request scope.
  • requestScope - from the point the flow is called until it returns. Smaller than web session scope, greater than web request scope.
  • flashScope - the variable only lives for a single render within a flow. Similar to web request scope.
  • conversationScope - stored in the web session, but not equivalent to it; it only exists for as long as the top-level flow exists.

As you can see, most scopes don't really have a clear counterpart in the non-SWF2 world. The reason I mention this is that you might want to try to keep scope concepts in your controller implementation simple, and when implementing any flow framework, be careful not to tie yourself tightly to flow scopes and logic that couldn't be expressed more simply. That's not so much a "flow isolation from services" thing as a "make sure you understand that different flow layer solutions may handle scopes differently, so watch out" thing.

Again, I'm not an expert in Spring Web Flow 2 or the flow layer in general (I'm still learning!), so would appreciate any comments or thoughts you have on this or on implementation of IFSM.  

There is one thing that I had overlooked until today: the importance of dividing the controller into application "flow(s)" and application "service(s)". For a good while now, I had been keeping controller code separate from service code (which in turn called the DAOs, which used Spring DAO, which used Hibernate, which interacted with the DB, and so on). However, business logic that should probably be in those services can easily blur into the controller classes, which are sometimes also, at least in part, responsible for flow. Until you try to replace the part of the app handling the flow, maybe you're fine, as many are! However, there is a big wave headed your way that you've felt coming for a good while now (and some have already been dunked!), which is "flow".

A few days ago I started converting an application I had written to Spring Web Flow 2. While writing the flows, it stood out to me clearly how many methods were missing from the service layer that I had planned to call from the flows, and how much additional pain will come with moving that code out of the controller and other related classes (injected instances in the controller, and sometimes utility classes). Are we moving towards what could be called an "Interaction-Flow-Service-Model" (IFSM) architectural pattern? (Updating the lingo of model-view-controller a little, and separating the "controller" concept into "flow" and "service"?) Throwing this out there, and would appreciate any comments.

Lazy Testing in Java Blog

Posted by garysweaver Aug 28, 2008
Do you have almost no test coverage, or perhaps none at all? Join the crowd. Although no one wants to admit it, a good part of the world runs on untested code. Here are a few tips for "lazy testing" your Java application, for the many of us who have nowhere to go but up.


Unit test the easy-to-test stuff whose behavior you are unsure of

For example, if you have a utility class that is doing something funky, that would be a good place for one or more unit tests.
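As a minimal sketch of "test the funky part" (the utility method and its edge cases below are made up for illustration): a few assertions pin the behavior down cheaply. They're runnable from a plain main() here, though in practice you'd put them in a JUnit test class.

```java
public class StringUtil {

    // Collapses runs of whitespace to single spaces and trims the ends.
    public static String squeeze(String s) {
        if (s == null) {
            return "";
        }
        return s.trim().replaceAll("\\s+", " ");
    }

    public static void main(String[] args) {
        assertEquals("a b c", squeeze("  a\t b \n c "));
        assertEquals("", squeeze(null));
        assertEquals("", squeeze("   "));
        System.out.println("all assertions passed");
    }

    private static void assertEquals(String expected, String actual) {
        if (!expected.equals(actual)) {
            throw new AssertionError("expected <" + expected + "> but was <" + actual + ">");
        }
    }
}
```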


Keep tests fast

You've probably heard this before. For those few tests you actually do add, make sure they all run quickly and that they run as part of continuous integration or in some other automated fashion. Otherwise they'll probably either just slow down the build (wasting time as developers go read Slashdot, etc. while waiting on it), or, if they aren't run as part of the build at all, be ignored and not of much use.


Use tools that make it faster to develop and maintain tests

Do you leave out tests because you're afraid of getting fired or shunned for being too slow to develop something? You think you'll be better off without writing tests? Well, just a minute now: that's how we got into this "my code is poorly tested" position in the first place. If your code is buggy, ignoring that fact isn't going to help; it just means time spent later working out the kinks, which makes you slower.

There are all kinds of things out there to help you. Spend a bit of time researching via Google, and you'll be able to develop the tests you need with a lot less code. Beyond JUnit: if you're using Spring, the Spring base classes for unit tests (see the Spring documentation) are really helpful, and Unitils can help a lot with setting up Spring beans and DbUnit for testing. DbUnit can be used to test the DAO layer (albeit as more of a functional test), and I think H2 makes a good in-memory DB for testing (it's from the author of HSQL/Hypersonic). EasyMock/JMock are OK for mocking objects if the mock objects provided by your framework (Spring, etc.) don't help, though you can end up spending too much time learning to mock things, so it isn't always good for the lazy programmer.


Set up a continuous integration (CI) server

Having something tell you when you or someone else checked something into source control that broke the build (and if you don't have source control, good gracious! you are lazy, and stupid, I might add) is a really good idea, and lazy (in a good way)! Bamboo kicks butt for CI.


Functional tests vs. unit tests

If it is a matter of having quick functional tests that test out much of the application vs. only a few parts, go for the functional ones and run them as part of the build. 

For example, if you have an app that acts as an email reader, use GreenMail to set up an in-memory mail server as part of the build. Try not to place dependencies on internal/work servers being up, though for internal projects it might be easier to. For open-source projects, (almost) anyone (almost) anywhere should be able to run the build, so don't depend on internal servers unless they are public and you can justify it.

Unitils and DbUnit with H2 work well for DAO-checking functional tests, if you're really concerned/unsure about your DAO code.


Whenever possible, have a standardized build script (for example: Ant, Maven, or Maven 2) for all projects, and have it call the tests

This is a must for open-source projects, and should usually be a given for internal projects. If you only use the IDE project, or rely on it too heavily, you might be screwing your company or your fellow (or future) coworkers or replacement(s), who might want to build from the command line or use a continuous integration server (like Bamboo) to run the build. The continuous integration build will likely call a script or an Ant/Maven build anyway.

I totally welcome any and all comments, as I'm definitely not trying to speak as an authority on the subject. Hope this starts up a healthy conversation and helps get some of you lazy programmers (which includes me) into doing at least some testing.  

Looking for information on how to quickly develop new JSR-168 and JSR-286 compliant portlets in an IDE? There is an article called Developing Portlets with NetBeans Portal Pack 2.0.

Those instructions appear to mostly be geared for those looking to develop JSR-286 compliant portlets, so if that's your goal, those instructions may work for you. 

However, I had a hard time getting through them for the purpose of using NetBeans 6.1 to quickly generate JSR-168 compliant portlets. 

I really wanted to get it working as close to the original instructions as possible, so at first I attempted to add an "OpenPortal Portlet Container 1.0" server in NetBeans. However, I ran into a few problems with that. In the end, it was easiest for me to install portlet-container 1.0_02 (download) in Tomcat 5.5 using these instructions, then add that Tomcat 5.5 instance as a server in NetBeans 6.1 via Tools -> Servers -> Add Server, and then mostly follow the rest of the instructions in Developing Portlets with NetBeans Portal Pack 2.0 to create the portlet, with the exception of having to specify Tomcat 5.5 as the server.

When complete, I was a little surprised (I guess because I didn't understand the inner workings well enough) to find that the war I had "deployed" to Tomcat wasn't actually in Tomcat's webapps directory. However, with a little searching, I found the war under my NetBeans project dir.

It would have also been nice if it created a Maven 2 project, but it's still a great start if you just want to create a new portlet quickly! Note also that there are tools to do this in Eclipse. I couldn't find a way to do it in IntelliJ IDEA, though.

Many have used in-memory databases, such as HSQLDB (formerly called Hypersonic) or its newer, faster cousin H2, but fewer have heard of the in-memory mail server GreenMail, by IceGreen.

According to their site, "GreenMail is an open source, intuitive and easy-to-use test suite of email servers for testing purposes. Supports SMTP, POP3, IMAP with SSL socket support. GreenMail also provides a JBoss GreenMail Service. GreenMail is the first and only library that offers a test framework for both receiving and retrieving emails from Java." 

Examples of its use are available on their site, but here are a few things to note with the current version at the time of writing, GreenMail v1.3.

1. In v1.3, GreenMail's start() and stop() methods start/stop a server running on a different thread, so for this version at least, it makes sense to actually listen on the GreenMail server's port until it starts up or shuts down. You may not always have to worry about it, since it can start up fairly quickly, but be warned, unless this gets fixed in a later version. As a quick, incomplete example, you might use code like this to check whether the port is open (but this code is a bit messy!):


    // imports needed:,,,
    //,,

    // Checks whether something is listening on host:port by attempting to connect.
    public static boolean isPortOpen(String host, int port) {
        boolean result = false;
        Socket socket = null;
        PrintWriter out = null;
        BufferedReader in = null;
        try {
            socket = new Socket(host, port);
            out = new PrintWriter(socket.getOutputStream(), true);
            in = new BufferedReader(new InputStreamReader(socket.getInputStream()));
            result = true;
        } catch (UnknownHostException e) {
            System.err.println("problem with host " + host + ": " + e);
        } catch (IOException e) {
            result = false;
        } finally {
            try { if (out != null) out.close(); } catch (Throwable t) { /* ignore */ }
            try { if (in != null) in.close(); } catch (Throwable t) { /* ignore */ }
            try { if (socket != null) socket.close(); } catch (Throwable t) { /* ignore */ }
        }
        return result;
    }

    // Polls host:port until it matches the desired status (open when waiting for
    // start, closed when waiting for stop), or until maxWaitSeconds has elapsed.
    private static void waitForStatus(boolean status, String host,
            int port, int maxWaitSeconds, int waitPeriodMillis)
            throws Exception {

        long startTime = System.currentTimeMillis();
        int lastSec = 0;

        while (isPortOpen(host, port) != status &&
               lastSec < maxWaitSeconds) {
            long waited = System.currentTimeMillis() - startTime;
            if (lastSec != (int) (waited / 1000)) {
                lastSec = (int) (waited / 1000);
                if (status) {
                    System.err.println("Waited " + lastSec +
                        " sec total on GreenMail to start... attempting to connect to " +
                        host + ":" + port);
                } else {
                    System.err.println("Waited " + lastSec +
                        " sec total on GreenMail to stop... waiting for stop: " +
                        host + ":" + port);
                }
            }
            try {
                Thread.sleep(waitPeriodMillis);
            } catch (Throwable t) {
                System.err.println("sleep interrupted");
            }
        }

        if (isPortOpen(host, port) != status) {
            if (status) {
                throw new Exception("GREENMAIL NOT STARTED! (waited " +
                    (System.currentTimeMillis() - startTime) + " msec total)");
            } else {
                throw new Exception("GREENMAIL NOT STOPPED! (waited " +
                    (System.currentTimeMillis() - startTime) + " msec total)");
            }
        }

        if (status) {
            System.err.println("Waited " +
                (System.currentTimeMillis() - startTime) +
                " msec total on GreenMail to start, and it started, in theory (because it is listening on the socket)");
        } else {
            System.err.println("Waited " +
                (System.currentTimeMillis() - startTime) +
                " msec total on GreenMail to stop, and it stopped, in theory (because it is not listening on the socket)");
        }
    }

2. I had trouble receiving email from the secure servers (POP3S and IMAPS). I tried to use a slight modification of InstallCert that would automatically answer "Y" to install the cert from GreenMail, but that didn't work. Let me know if you can get it to work, and please post your code, or a link to it, if you do. TIA!

Despite some slight issues, GreenMail is a really great tool for testing, and I hope IceGreen continues to develop it! 

In the last couple of months, working with a custom portlet that uses Apache CXF (2.1.1) in uPortal, I was getting a weird issue:
Caused by: java.lang.ClassCastException:
        at org.apache.cxf.transport.https.HttpsURLConnectionFactory.createConnection(
        at org.apache.cxf.transport.http.HTTPConduit.prepare(
        at org.apache.cxf.interceptor.MessageSenderInterceptor.handleMessage(
        at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(
        at org.apache.cxf.endpoint.ClientImpl.invoke(
        at org.apache.cxf.endpoint.ClientImpl.invoke(
        at org.apache.cxf.frontend.ClientProxy.invokeSync(
        at org.apache.cxf.jaxws.JaxWsClientProxy.invoke(
        ... 36 more

This occurred with Apache CXF v2.1.1 where it was doing: 


        HttpsURLConnection connection =
            (HttpsURLConnection) (proxy != null
                                   ? url.openConnection(proxy)
                                   : url.openConnection());
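For context, url.openConnection() hands back whatever URLConnection subclass the active https protocol handler produces. With the stock JDK handlers that is a subclass of javax.net.ssl.HttpsURLConnection, so the cast succeeds; with an old handler package forced onto java.protocol.handler.pkgs it is a different class, and the cast above blows up. A quick probe of which class your JVM actually returns (no network connection is opened):

```java
import java.net.URL;
import java.net.URLConnection;

public class HandlerProbe {
    public static void main(String[] args) throws Exception {
        // openConnection() only instantiates the URLConnection chosen by the
        // protocol handler; it does not touch the network.
        URLConnection c = new URL("https://example.com/").openConnection();
        System.out.println(c.getClass().getName());
        // true with the stock JDK handlers; false is the symptom that makes
        // CXF's cast throw ClassCastException.
        System.out.println(c instanceof javax.net.ssl.HttpsURLConnection);
    }
}
```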

It turns out that uPortal has a configuration file, "uPortal/WEB-INF/classes/properties/", that contains the following: 


  # Protocol handler for https connections.  Set by default to the one provided with
  # Sun's JSSE - change to use your local JSSE implementation

This in turn is used by the following code in a static block in org.jasig.portal.utils.ResourceLoader: 


  static {
    f = DocumentBuilderFactory.newInstance();
    try {
      String handler = PropertiesManager.getProperty("org.jasig.portal.utils.ResourceLoader.HttpsHandler");
      if ((System.getProperty("java.protocol.handler.pkgs") != null) &&
          (!"".equals(System.getProperty("java.protocol.handler.pkgs")))) {
        handler = handler + "|" + System.getProperty("java.protocol.handler.pkgs");
      }
      System.setProperty("java.protocol.handler.pkgs", handler);
    } catch (Exception e) {
      log.error("Unable to set HTTPS Protocol handler", e);
    }
  }

As you can see, it puts what I think is the old, wrong protocol handler package (at least for Java 1.4+) at the front of the list of handlers. Doing this is not only unnecessary but can cause problems in Java 1.4+, from what I've read and have now experienced. 
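To make the ordering concrete, here is the effect of that static block with hypothetical package names standing in for the real ones; note the configured handler ends up first, so it wins the handler lookup:

```java
public class PkgsOrderDemo {
    public static void main(String[] args) {
        // Hypothetical package names, mirroring the static block's logic.
        System.setProperty("java.protocol.handler.pkgs", "existing.pkg");
        String handler = "old.ssl.pkg"; // stands in for the HttpsHandler property value
        if (System.getProperty("java.protocol.handler.pkgs") != null
                && !"".equals(System.getProperty("java.protocol.handler.pkgs"))) {
            handler = handler + "|" + System.getProperty("java.protocol.handler.pkgs");
        }
        System.setProperty("java.protocol.handler.pkgs", handler);
        // The JVM searches handler packages left to right, so the old
        // package is tried first.
        System.out.println(System.getProperty("java.protocol.handler.pkgs"));
        // prints: old.ssl.pkg|existing.pkg
    }
}
```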

While you could try removing the System property "java.protocol.handler.pkgs" in your code, that is a bad idea for obvious reasons (you're affecting the whole JVM, and you don't necessarily know when that property will be read), and it isn't guaranteed to work. The best thing to do is to find the offending code or config and axe it, so that it doesn't set old Sun protocol handlers in newer code. 
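If you really cannot touch the offending code or config, a sketch of that (fragile) in-code workaround would be to filter the stale package back out of the property value; the package name here is a hypothetical stand-in, not the real one:

```java
public class HandlerCleanup {

    // Remove one stale package from a '|'-separated handler-package list.
    public static String strip(String pkgs, String stale) {
        if (pkgs == null) {
            return null;
        }
        StringBuilder kept = new StringBuilder();
        String[] parts = pkgs.split("\\|");
        for (int i = 0; i < parts.length; i++) {
            if (parts[i].length() > 0 && !parts[i].equals(stale)) {
                if (kept.length() > 0) {
                    kept.append('|');
                }
                kept.append(parts[i]);
            }
        }
        return kept.toString();
    }

    public static void main(String[] args) {
        // "old.ssl.pkg" is a hypothetical stand-in for the stale handler package.
        System.out.println(strip("old.ssl.pkg|good.pkg", "old.ssl.pkg"));
        // prints: good.pkg
    }
}
```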

So, in short, the right way to fix this was to change uPortal/WEB-INF/classes/properties/ to leave the handler property blank: 


  org.jasig.portal.utils.ResourceLoader.HttpsHandler=

and then restart. 

Unfortunately, it looks like this property is also being set in the latest uPortal distribution (v3.1). If uPortal v3 (and the latest version of v2, for that matter) isn't meant to support Java versions under 1.3, shouldn't the default properties file list this property's value as blank by default? I emailed the uPortal guys, so I guess we'll find out. 

If you're a Java developer and you've been around for any length of time, you've likely run into the issue of wanting to write something that can deal with different versions of the same Java API that may be incompatible.

What got me thinking about this is that I recently saw a forum post in the Atlassian Confluence Developer forum where someone was asking how to get the version number of Confluence, so that they could write their plugin to be compatible with different versions of Confluence that have incompatible APIs. As background, Atlassian has historically been very quick to change their API (much quicker than Apache/Apache Jakarta projects!), sometimes leaving plugin developers needing to update just to work with the next minor version of Confluence.

Really briefly, here are some options you might want to think about in this situation:

  • Option 1: Just have different versions of your code for different versions of the API it uses. Look at most libraries and applications and, in general, this is what people do.
  • Option 2: Try to discover classes/methods via reflection when you know that they change and/or are non-existent between versions. Discovering via reflection in certain cases is common across many open-source libraries and applications, but reflection is slow, so you really shouldn't do this unless necessary.
  • Option 3: If you have control over the library/API that is changing constantly, write a library that doesn't change as much that contains interfaces, then in the library that does change, make it dependent on the interface library. Then use Spring or the like to swap out which class or classes implement the interface(s) you are using, and then just change the Spring config (or similar) in the original application in order to use a newer version of the library that changes so much. This isn't totally as great as it sounds because you have to have control over the projects, and it requires more work.
  • Option 4: Like Option 2, except you configure which version you are using and then use reflection or some other method to get the class. If it is just a matter of generating different XML or putting together a request object to some API differently, this might be ok. But in general, this is not a very robust solution for most cases, because if it does have to use reflection to instantiate classes etc. based on that version number, and you misconfigure it, you may not know about those incompatibilities until runtime. Of course you could misconfigure things using one of the other options mentioned, but at least there you will be dealing with the nitty-gritty configuration of implementing classes, not just a version number that hides what is really going on. However, with adequate error messages, this might be ok.
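As a sketch of Option 2, you can probe for a method that only exists in newer versions of an API and fall back when it is absent. The method chosen here, String.isEmpty() (added in Java 6), is just a convenient stand-in for a version-dependent API method:

```java
import java.lang.reflect.Method;

public class VersionProbe {

    // Option 2 sketch: detect a method via reflection and fall back if the
    // running API version does not have it.
    public static boolean isEmptyCompat(String s) {
        try {
            Method m = String.class.getMethod("isEmpty");
            return ((Boolean) m.invoke(s)).booleanValue();
        } catch (NoSuchMethodException e) {
            // Older API version: the method is absent, so use a fallback.
            return s.length() == 0;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(isEmptyCompat(""));      // prints: true
        System.out.println(isEmptyCompat("hello")); // prints: false
    }
}
```

In real use you would cache the Method lookup (reflection is slow, as noted above) and probe for the class or method of the actual third-party API you depend on.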

I'm sure people could think of others, but maybe this will give you a jump-start thinking about this, if you haven't recently.