I'm working on an extension to a class project in FPGA design. We have boards sending black-and-white video at 15 fps over UDP.
Right now I'm trying to get a Java video viewer working. I've set up a loop that receives packets indefinitely and processes them on the fly. The video display is a simple Canvas in a JFrame. When I write 8x8 blocks in a single color, the video comes through just fine, albeit at a very low resolution. But when I do all of the necessary bit twiddling (which is much, much more painful in Java than in Verilog) to decode the additional information, the viewer chokes and doesn't render properly - it misses entire packets' worth of data.
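For reference, the per-pixel decode step I'm attempting looks roughly like this. The payload layout shown here (1 bit per pixel, MSB first, row-major) is just a simplified stand-in for our actual format, and the names are mine:

```java
public class BitUnpack {
    // Assumed payload layout (stand-in, not our real format):
    // 1 bit per pixel, MSB first, packed row-major into the UDP payload.
    static int[] unpackBits(byte[] payload, int pixelCount) {
        int[] pixels = new int[pixelCount];
        for (int i = 0; i < pixelCount; i++) {
            // Select byte i/8, then bit (7 - i%8) within it.
            int bit = (payload[i >> 3] >> (7 - (i & 7))) & 1;
            pixels[i] = (bit == 1) ? 0xFFFFFF : 0x000000; // white or black RGB
        }
        return pixels;
    }
}
```

The real decode extracts several fields per block, but the per-bit shifting and masking is the same flavor of work.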
What puzzles me is that CPU usage is relatively low - about 12% on a dual-core machine - and yet the performance is terrible. All of the packets clearly make it in, since if I reduce the amount of information displayed I can handle everything without issue; at that point CPU usage is around 4%.
Is there a more appropriate way to write video data to the screen than repeatedly calling "fillRect" on a Canvas's Graphics object? And is there a way to queue up incoming packets and buffer them for processing between frames?
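To make the second question concrete, what I have in mind is something like a BlockingQueue sitting between the socket thread and the render loop. This is only a sketch of the idea; the class name, capacity, and drop-oldest policy are all placeholders:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class PacketBuffer {
    // Placeholder capacity: a few frames' worth of packets at 15 fps.
    private final BlockingQueue<byte[]> queue = new ArrayBlockingQueue<>(256);

    // Called from the socket thread: if the renderer falls behind and the
    // queue fills up, drop the oldest packet rather than stall the receive loop.
    public void offerPacket(byte[] payload) {
        while (!queue.offer(payload)) {
            queue.poll();
        }
    }

    // Called from the render loop between frames: grab everything that has
    // arrived since the last frame and decode it in one batch.
    public List<byte[]> drain() {
        List<byte[]> batch = new ArrayList<>();
        queue.drainTo(batch);
        return batch;
    }
}
```

Is this the right general shape, or is there a more idiomatic way to do it in Swing/AWT?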