Bean Validation is a nice API for validating Java objects and is included in Java EE 6. But it can also be used anywhere, regardless of the layer. It can be used with or without JPA, and in a standalone Java SE environment.
It formalizes and encourages the validation approach at the domain model level.
It helps de-duplicate the validation logic that we are accustomed to having all over the place - UI, business logic and elsewhere - and brings it back to the domain model, where it really belongs.
In the past, people used (and a lot of them still do) anemic model objects, without ever giving a thought to the fact that they were central to their domain. A lot of validation is central to the domain, but was written elsewhere. I have noticed that people are coming around to the idea that validation (and more) should be brought into the fold of the model objects. Bean Validation could serve as a selling point for Domain Driven Design. It worked for me.
Back to my point - I was designing a JavaFX application. Not one of those with animations - just a regular, boring app, but with a horrendous set of validations. Writing all of them in the UI would be such a pain, in addition to just not being right. Unknown to me, a backend was starting to take shape in another part of the world, where, among other things, validation was implemented as business logic in Spring POJOs wrapped in Session EJBs (you know, the typical hangover from the early Spring-J2EE days). There were JUnit tests in place. That emboldened me to offer a refactoring of the validation into the domain classes - which were already JPA entities anyway. Not surprisingly, I was met with resistance, but a bit of explanation around Domain Driven Design and the "insurance" of JUnit tests convinced the sponsors they should give it a shot. A few sessions of refactoring (based on guesswork) and JUnit regression tests later, the validation logic was sitting pretty in the domain model, as a combination of standard annotations and custom constraints. This impressed the client so much that they started taking an active interest in Domain Driven Design in other parts of the system.
With the validations in the domain model, I added the domain jar to the JavaFX bundle. The required validation libraries from Hibernate were also added. The JavaFX application jar was signed (to grant the JVM permissions the validator needs). The UI fields were bound to and from the domain model, and Hessian remoting over HTTP allowed the objects to be dispatched to the server. The validations were shared between the JavaFX app and the backend. A happy ending indeed.
Moral of the story:
You can easily use Bean Validation in JavaFX. Don't be scared. It would be a loss of opportunity and time if you don't.
You can use Bean Validation to sell the value of Domain Driven Design to your team and managers. It is one of the easy ways to convince them - because it is an easy concept to understand, even for non-technical managers.
Before you jump off and start doing everything with Bean Validation - remember that there are four types of validation:
Data Type validation
Basic Domain Value Validation - trivial checks such as NotNull, x less than y, etc.
Cross field Domain Validation
Complex Business Rules Validation crossing multiple domain objects not in the same object graph
Remember that the first one should be done in the UI. Verifying whether an input is a String or an Integer, and parsing the int, can only be done in the UI. It is better if you customize your input fields to accept only certain keys - thereby effectively eliminating this type of validation. Bean Validation is ideally suited for types 2 and 3. The fourth type should be evaluated case by case; it generally is really business logic that belongs in EJBs, or sometimes rule engines.
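To make types 2 and 3 concrete, here is a minimal sketch. The Booking class, its fields, and the @ValidDateRange constraint are all made up for illustration, and it assumes the javax.validation API from Java EE 6:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.util.Date;
import javax.validation.Constraint;
import javax.validation.ConstraintValidator;
import javax.validation.ConstraintValidatorContext;
import javax.validation.Payload;
import javax.validation.constraints.NotNull;
import javax.validation.constraints.Size;

// Type 3: a made-up cross-field constraint - "start must not be after end"
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Constraint(validatedBy = DateRangeValidator.class)
@interface ValidDateRange {
    String message() default "start date must not be after end date";
    Class<?>[] groups() default {};
    Class<? extends Payload>[] payload() default {};
}

class DateRangeValidator implements ConstraintValidator<ValidDateRange, Booking> {
    public void initialize(ValidDateRange constraint) { }

    public boolean isValid(Booking b, ConstraintValidatorContext ctx) {
        if (b.start == null || b.end == null) {
            return true; // leave null checks to @NotNull
        }
        return !b.start.after(b.end);
    }
}

// Type 2: basic value checks expressed as standard annotations
@ValidDateRange
class Booking {
    @NotNull @Size(min = 1, max = 40)
    String customerName;

    @NotNull Date start;
    @NotNull Date end;
}
```

At runtime, `Validation.buildDefaultValidatorFactory().getValidator().validate(booking)` (backed by an implementation such as Hibernate Validator) returns the set of violations, and the same jar works on both the client and the server.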
Struts is a very mature framework. Some may think it is old-fashioned or not the coolest kid on the block, but like it or not, it is a force to reckon with. If I were running a business requiring a solid web infrastructure, I would bet on Struts. After all, the bottom line for the business is project success, not playing with cool bleeding-edge frameworks. (That's the passion for us, developers.) And that's probably the reason why Struts is so popular.
Anyway, I have been using Struts ever since it came out. I have seen developers use Struts in many ways - some right, others blatantly incorrect. A bunch of best practices emerged in my mind out of common sense and experience. And so, I decided to document them.
What started as personal notes grew into a full-fledged book. And then, I decided to try my luck with self-publishing. About a year back, I self-published the book. The book, Struts Survival Guide: Basics to Best Practices, as I called it, was successful (by my yardstick). I did not make any profit in the process, but I did not incur any loss either. It was a labor of love and a very rewarding experience at the end of the day.
I sold off all the copies of the book. And now, the ebook is available free for download here.
What follows is my experience with self-publishing.
As soon as I started writing the full-fledged book, I realized that writing it was going to be tougher than my little notes. I had written articles before, but book authoring was a different ball game altogether.
For starters, I was my own editor, reviewer and graphics editor. That means I not only had to write, but also cross-check the facts, fix the grammar, and create graphics and illustrations. And I was doing all of this after my day job. (Needless to say, there are still a bunch of grammar mistakes, but no factual errors to my knowledge.) I spent countless weekends and evenings working on all aspects of the book. It was painful and rewarding at the same time. It was a tough job to get reviewers for a fledgling book with an uncertain future. However, I did get a few of my acquaintances to review some chapters.
Finally, the book was completed. Now came the task of getting it printed. I realized I had to launch a company to publish it and retain rights over it. The rule says that a book can be published only by an entity owning the ISBN for the book. And so, I registered an LLC in Texas over the Thanksgiving weekend of 2003. It was so easy to do over the internet; I didn't even have to get off my chair.
I also purchased ISBNs from R. R. Bowker. The ISBN is that number on the book one hardly notices. It is the equivalent of the UPC in the product world. Bowker sells ISBNs in chunks of 10.
With the ISBN in hand, I shopped for somebody who could create a cover page for me. Luckily, there are a lot of businesses out there who can create cover pages for a decent price. The number of pages in the book, the quality of paper to be printed upon, etc. have to be known before the cover page is created. With their software, the cover page creators feed in the page count, paper thickness and book size to create a template, draw some eye-catching pictures, and plug in the ISBN to create a bar code out of it.
Next came printing. The printing cost directly depends on the number of copies (or print run, as they call it). The higher the print run, the lower the per-copy price. I did not want to print a lot and let it rot in my warehouse (read: apartment ;-D). I did not want to print too few and pay too much per copy either. Finally, I printed just enough copies to break even when most of them were sold. And boy, was I lucky...
Next came copyright registration. It was pretty easy: fill out a form and send it to the Library of Congress.
Any book is of no use without a credible and established sales channel. For book publishers, the sales channel comes in three forms: third-party retailers, distributors, and direct sales. Large third-party retailers such as Amazon and Barnes and Noble tend to buy directly from the publisher. Other smaller retailers and libraries buy through the distributors. Then there are direct sales from the publisher's web site. Distributors and third-party retailers take as much as 60% of the total list price as their commission.
I tried like crazy to get a distributor, but in vain. Luckily for me, Amazon.com is very friendly to small publishers. I set up an account with them to sell my books. They take 55% of the sales price, but are very prompt with payments. Plus, they turned out to be my biggest source of sales. If it were not for Amazon, I would be sitting on a pile of books.
No good book will sell without marketing. I joined the Publishers Marketing Association (PMA), an entity that provides some marketing for small publishers. They hooked me up with Baker and Taylor, the largest book wholesaler in the US. A lot of libraries buy their books from Baker and Taylor. They accounted for my second-largest sales after Amazon.
All said and done, people buy a book only if they come to know about it. Here too, PMA provided me with options to bundle my flyers with other publishers' and mail them to libraries, colleges and so on for a small fee. I don't know definitively, but all my Baker and Taylor orders might have been from libraries.
Another source of marketing for me was the book promotion at JavaRanch (http://www.javaranch.com). Those folks organize book promotions every week, and I booked a slot in advance. During the week my book was up for promotion, I answered a bunch of questions posted by their readers. Four lucky readers got a free copy of the book. I think it is a great idea, and it worked out really well for me. A lucky Slashdot review of my book also worked its magic.
The toughest part was establishing credibility for the book. One needs reviews and forewords from well-known folks for that. It goes without saying that I did not get any. I sent out book copies and previews to a bunch of guys, but only one person was kind enough to review it. Jessica Sant (again from JavaRanch) did an independent review and gave it 9 out of 10 horseshoes (4 out of 5 stars on Amazon). And I thought getting reviewers for the initial chapters was tough.
One final piece of the puzzle was direct sales. I set up my web site and sold the ebook and paperback through it. Selling the paperback was easy. I hooked up with Paypal and linked my site to it. When a payment is made in Paypal, it sends me an email. At the end of the day, after my day job, I would reply to all of them and mail the books via USPS.
Selling the ebook was a challenge. It is the norm that people buying an ebook get it immediately. I did not have the infrastructure to set up my own credit card processing. I learnt that when a payment is made, Paypal not only sends me an email, but also posts (HTTP) the buyer data to a URL I provide (a poor man's web service). I signed up for Java hosting. My hosting provider gave me a Tomcat instance, where I deployed a Struts application (yeah... eating my own dog food) that would persist the Paypal-posted data to a MySQL database. The buyer could then immediately download the ebook. Problem solved.
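The flow can be sketched like so - not the original Struts app, but a simplified stand-in using the JDK's built-in HttpServer, with an in-memory map standing in for the MySQL table. The /ipn path and the txn_id/payer_email field names are my own illustrative choices, not necessarily what Paypal actually posts:

```java
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.InetSocketAddress;
import java.net.URLDecoder;
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Receives the buyer data that the payment provider posts back, and
// records it so the ebook download can be unlocked immediately.
class IpnReceiver {
    // txn_id -> payer email; in-memory stand-in for the MySQL table
    static final Map<String, String> paidBuyers = new ConcurrentHashMap<String, String>();

    static HttpServer start(int port) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/ipn", (HttpExchange ex) -> {
            String body = new String(readAll(ex.getRequestBody()), StandardCharsets.UTF_8);
            Map<String, String> form = parseForm(body);
            paidBuyers.put(form.get("txn_id"), form.get("payer_email"));
            ex.sendResponseHeaders(200, -1); // -1 = no response body
            ex.close();
        });
        server.start();
        return server;
    }

    static byte[] readAll(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        for (int n; (n = in.read(buf)) != -1; ) {
            out.write(buf, 0, n);
        }
        return out.toByteArray();
    }

    // Parses an application/x-www-form-urlencoded request body
    static Map<String, String> parseForm(String body) throws IOException {
        Map<String, String> map = new HashMap<String, String>();
        for (String pair : body.split("&")) {
            String[] kv = pair.split("=", 2);
            map.put(URLDecoder.decode(kv[0], "UTF-8"),
                    kv.length > 1 ? URLDecoder.decode(kv[1], "UTF-8") : "");
        }
        return map;
    }
}
```

In the real setup, the persisted record is what authorizes the buyer's immediate download link.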
Marketing, sales and customer support were difficult, and had it not been for my wife, I would have been left with a bunch of angry customers. These tasks were tougher than writing the book in the first place. If one counts time as money, I made a huge loss. But then, the experience was its own reward. There are some things money cannot buy. For everything else, well, you know.....
I often hear from readers: why not just a best-practices book? There are other books that explain the basics. From my perspective, it would be a non-seller. When readers buy a book, they expect complete coverage of the subject. And that's exactly what I did in my book.
Another thing I often hear is: What is the best practice for task XYZ? Why is it not covered in the book?
My answer is: it is simply impossible to cover all best practices in such a small book. Moreover, there are very few absolute best practices. The others are best practices relative to a project. The book lays the foundation and prepares your mindset about best practices. Use your judgement in all other cases.
Logging with Log4J is simple, seems trivial, and doesn't appear to warrant a blog. However, logging in enterprise projects raises interesting requirements and possibilities.
The first question is where you put your logging library. With JDK logging, you pretty much have no choice. It is always located in the classpath and loaded by the bootstrap classloader, the mother of all classloaders.
Log4J brings two choices to the table. You can put it in the application server's classpath, or package it as a dependency library along with the EAR.
If yours is the only application hosted on the server, either choice means the same thing. However, if there are multiple applications hosted on the same VM, care must be taken before putting the Log4J jar in the system classpath.
In Log4J, all Loggers are singletons. This means that if you have Loggers with the same name in multiple EARs, the Logger configuration defined later overwrites the earlier one.
In other words, you might find that the logs from your application end up in another application's logs.
This can be a problem even when no two loggers have the same name.
The catch-all root logger that exists in every log4j configuration can pose a threat.
If none of the defined logger categories is able to log the message, the burden falls on the root logger. The root logger might be configured by the "other" application hosted on the shared server. In other words, never rely on the root logger; always define a logical root logger. If you are using the fully qualified class name as your logger name, then define the top-level package name uniquely identifying your application as the logical root logger. For instance, "com.mycompany.myapplication" can be the logical root logger.
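For instance, a log4j.properties along these lines (names illustrative) pins the application's logs to its own appender, with additivity switched off so that nothing bubbles up to whatever root logger the shared server happens to configure:

```properties
# Logical root logger for this application (the top-level package name)
log4j.logger.com.mycompany.myapplication=INFO, myappFile
# Do not let events bubble up to the (possibly foreign) root logger
log4j.additivity.com.mycompany.myapplication=false

log4j.appender.myappFile=org.apache.log4j.RollingFileAppender
log4j.appender.myappFile.File=/var/log/myapplication.log
log4j.appender.myappFile.layout=org.apache.log4j.PatternLayout
# %x prints the NDC
log4j.appender.myappFile.layout.ConversionPattern=%d [%t] %-5p %c - %x - %m%n
```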
You might say, "Hey, I have Log4J packaged in each EAR. So the Loggers are singletons at the EAR level, and I don't care about the names I assign to them."
Before you go that route, consider how you aggregate messages on a per-user basis.
With Log4J, you are most probably using the Nested Diagnostic Context (NDC), aren't you? Chances are that you are using a Servlet Filter to set the session id as the NDC contextual identifier. If your application is standalone, then bundling Log4J in your EAR is the right option.
However, if your application collaborates with other applications (EARs), and tracking user activity across applications with NDC is important to you, then you are out of luck with Log4J bundled in the EAR. NDC manages a static stack of contextual information per thread. When your application makes a call into another application's EJBs, you are cutting across classloaders, and the NDC from the caller is not available in the callee. The only way it can be made available across applications is when Log4J is loaded by the parent classloader.
Well, I might say, "Put your Log4J library in the system classpath and your problems will be solved." But the reality is that you often have to live alongside other applications that have bundled Log4J in their EARs. Worse, you might have to collaborate with them. Most likely you will not have the liberty to change their logging logic or configuration.
One solution that comes to my mind is using AOP in conjunction with a ThreadLocal. For example, if yours is the calling application and the callee relies on NDC, you can store the identifier in a ThreadLocal variable. Using advice, you can then associate the ThreadLocal value with the callee's NDC.
And thus you have effectively carried the unique identifier for the user activity across the thread of execution. The class using the ThreadLocal should be loaded from the system classpath, though.
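A minimal sketch of the ThreadLocal piece (the class name and wiring are mine; the actual advice depends on your AOP framework of choice):

```java
// Carries the per-thread correlation identifier between EARs.
// Deploy this class on the system classpath so that the caller's and
// callee's classloaders both see the same copy (and the same statics).
final class DiagnosticContext {
    private static final ThreadLocal<String> CONTEXT = new ThreadLocal<String>();

    private DiagnosticContext() { }

    public static void set(String id) { CONTEXT.set(id); }

    public static String get() { return CONTEXT.get(); }

    public static void clear() { CONTEXT.remove(); }
}
```

A before-advice on the callee's entry points would then call NDC.push(DiagnosticContext.get()) against the callee's own Log4J copy, and the matching after-advice would call NDC.pop().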
No matter how you do logging in your system, you might have run into situations where you need to filter logs across multiple files, possibly across multiple applications, for a given user at various times, based on NDC. Utilities like Chainsaw or LogFactor5 do this for a single file. There is a need for a broad-based tool that does time-based NDC filtering across multiple files. Perhaps there is an open source tool out there satisfying my requirements.
Another question, perhaps outside the realm of Log4J itself, is: "How do you correlate logging that occurs across VMs?" It is a question that needs to be addressed in distributed environments. This may be impossible without encapsulating the correlation identifier in the invocation itself. The collaborating systems (caller and callee) should be able to interpret the correlation identifier. But then, it also results in tight coupling. I don't know if there is really a good solution to this problem.
Imagine you entered a retail outlet whose sign just says OPEN. Now, what is your reaction if something suddenly throws you out of the shop, no reasons given? And then you find the retail outlet with a sign saying CLOSED. You will be frustrated, won't you? You'd expect the outlet to let you finish shopping, since you entered it before the "CLOSED" sign was put up, right?
Guess what - a lot of J2EE systems in production might not be doing that at all.
When new deployments go out, active users on the system are unceremoniously ousted.
What we want is:
Some minutes before the site is brought down, stop new users from entering the system.
However, let the existing users continue to use the system, with a message that the system will go down in 15 minutes.
When 15 minutes pass, bring down the system.
As with any other problem, there are many solutions to this problem, at different levels.
At the hardware level, you can configure your router to not accept new requests for the application. You can also configure this at the web server level. In this blog, I am sketching a J2EE-based solution to this problem.
I'd like to call this the "Retail Outlet" pattern, since it mimics the real-world shopping experience. When OPEN becomes CLOSED, it lets existing shoppers continue their unfinished shopping for another 15 minutes, without letting any new shoppers in.
The implementation requires a JMX MBean that holds the information on whether the shop should be closed or not. Why JMX? Well, a lot of the people who start and stop production application servers are administrators, not developers. MBean properties can be edited from any SNMP console, which all of them are familiar with. Some application servers allow editing of MBeans from their own console. Most of all, changes in JMX attributes are reflected immediately. (Another area where I like to use JMX is to control the Log4J log level. More on that later.)
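A sketch of such an MBean with the JDK's built-in JMX support. The names are made up, and the types are nested only to keep the sketch in one file; in a real deployment they would be public top-level classes, in which case the plain ShopStatus/ShopStatusMBean naming convention works even without the StandardMBean wrapper:

```java
import java.lang.management.ManagementFactory;
import javax.management.ObjectName;
import javax.management.StandardMBean;

class RetailOutlet {
    // The management interface: what the admin sees in the JMX console
    public interface ShopStatusMBean {
        boolean isOpen();
        void setOpen(boolean open);
    }

    public static class ShopStatus implements ShopStatusMBean {
        private volatile boolean open = true;

        public boolean isOpen() { return open; }
        public void setOpen(boolean open) { this.open = open; }
    }

    // Register the switch so any JMX console can flip it at runtime
    static ShopStatus register() throws Exception {
        ShopStatus status = new ShopStatus();
        ManagementFactory.getPlatformMBeanServer().registerMBean(
                new StandardMBean(status, ShopStatusMBean.class),
                new ObjectName("com.mycompany:type=ShopStatus"));
        return status;
    }
}
```

The Servlet Filter then simply consults isOpen() on the shared instance for every request.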
Another component is a Servlet Filter that interacts with another object (a Session Lifecycle Listener) that counts the number of active users and keeps track of them. When the MBean's switch is flipped to indicate the CLOSED status, the Filter blocks all new requests (those without a session), but lets those with an active session continue for another (say) 15 minutes.
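Stripped of the servlet plumbing, the Filter's decision boils down to something like this (the class name and the exact grace-period handling are my own):

```java
// Core decision logic the Servlet Filter would delegate to.
class RetailOutletGuard {
    private final long gracePeriodMillis;
    private volatile boolean open = true;
    private volatile long closedAt;

    RetailOutletGuard(long gracePeriodMillis) {
        this.gracePeriodMillis = gracePeriodMillis;
    }

    // Called when the MBean switch is flipped to CLOSED
    void close() {
        open = false;
        closedAt = System.currentTimeMillis();
    }

    // Called per request; hasSession means the request carries an
    // already-active HttpSession.
    boolean admit(boolean hasSession) {
        if (open) {
            return true;                 // OPEN: everyone is welcome
        }
        if (!hasSession) {
            return false;                // CLOSED: no new shoppers
        }
        // Existing shoppers may finish within the grace period
        return System.currentTimeMillis() - closedAt < gracePeriodMillis;
    }
}
```

The Filter's doFilter would call admit(request.getSession(false) != null) and, when it returns false, forward to a "closed for maintenance" page.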
A word of caution, though: I have never used this approach before. However, I would like to try it out. Any thoughts from your experiences?