
Java Programming

Logging to a file without using Logger?

807591 — Mar 26 2008, edited Mar 27 2008
Hi all,
I've been thinking of adding logging support to my app, so I found the Logger class. It seems to be OK, but it is not exactly what I was looking for.

What I want to implement is a cyclic/rolling log. But I am not talking about cycling files (the Logger class works fine with that); I am talking about cycling lines within a single log file. I mean, to add a limit to the file (let's suppose it is a line-count limit, but it could be a file-size limit too), and once this limit is reached, the next log entry would erase the oldest line in the file and move the rest to the place where their previous line had been (just like erasing the first element of an array and shifting the remaining elements back one place). Finally, the new line is added at the end of the log.

I found the Logger class useful because a limit can be set on a log file. However, when the file reaches the limit and a new entry is being written, ALL previous log entries are erased, unnecessarily losing a lot of useful data.
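For reference, the limit and the multi-file cycling being referred to here come from java.util.logging's FileHandler, configured through its limit and count parameters. A minimal sketch, using placeholder file names and sizes:

import java.io.IOException;
import java.util.logging.FileHandler;
import java.util.logging.Logger;
import java.util.logging.SimpleFormatter;

public class JulRotationDemo {
    public static void main(String[] args) throws IOException {
        Logger log = Logger.getLogger("demo");

        // limit = maximum bytes per file, count = how many files to cycle through.
        // With count = 1 the single file is simply truncated when the limit is hit
        // (the behaviour complained about above); count > 1 rotates app.log.0,
        // app.log.1, ... instead, which means multiple files.
        FileHandler handler = new FileHandler("app.log.%g", 100000, 3, true);
        handler.setFormatter(new SimpleFormatter());
        log.addHandler(handler);

        log.info("hello");
    }
}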

I also took a look at Log4j (http://logging.apache.org/log4j/1.2/apidocs/org/apache/log4j/package-summary.html), but it doesn't seem to have what I am looking for.

Is there any library or any other class I've missed to perform this?
In case there isn't, what could be the best way to implement it? An ArrayList<String>?
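A minimal sketch of the behaviour described above, assuming a hypothetical LineBoundedLog class and using an ArrayDeque rather than an ArrayList; as the replies below point out, it has to rewrite the whole file on every append once the limit is reached:

import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.io.PrintWriter;
import java.util.ArrayDeque;
import java.util.Deque;

/** Hypothetical sketch: keeps only the last maxLines entries in a single file. */
public class LineBoundedLog {
    private final File file;
    private final int maxLines;
    private final Deque<String> lines = new ArrayDeque<String>();

    public LineBoundedLog(File file, int maxLines) throws IOException {
        this.file = file;
        this.maxLines = maxLines;
        if (file.exists()) {
            BufferedReader in = new BufferedReader(new FileReader(file));
            try {
                String line;
                while ((line = in.readLine()) != null) {
                    lines.addLast(line);
                }
            } finally {
                in.close();
            }
        }
    }

    public synchronized void log(String message) throws IOException {
        lines.addLast(message);
        while (lines.size() > maxLines) {
            lines.removeFirst();              // drop the oldest entry
        }
        // The whole file is rewritten on every call once the limit is reached.
        PrintWriter out = new PrintWriter(file);
        try {
            for (String line : lines) {
                out.println(line);
            }
        } finally {
            out.close();
        }
    }

    public static void main(String[] args) throws IOException {
        LineBoundedLog log = new LineBoundedLog(new File("app.log"), 500);
        log.log("application started");
    }
}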

Thanks for your time :)

Comments

796440
asdasdasdasdasdasdasdasd wrote:
What I want to implement is cyclic/rolling log. But I am not talking about cycling files (the Logger class works fine with that), I am talking about cycling lines in only one log file. I mean, to add a limit to the file (let's suppose it is a line count limit, but it can be implemented as a file size limit too) and once this line limit is reached the next log would erase the oldest line in the log file and move the rest to the place where their previous line had been (just like erasing the first element in an array and moving the rest of the elements one place back). Finally the new line is added at the end log.
That's a bad idea. Once you hit the limit, you'll be rewriting the entire log file for every log message. Why would you want to do that?
807591
jverd wrote:
That's a bad idea. Once you hit the limit, you'll be rewriting the entire log file for every log message. Why would you want to do that?
Besides, do you really want the length of time you keep log messages to be dependent on how many log messages you get? What if you need the log from last week, but your log only has items from the last day, because your app went screwy last night and dumped enough lines to push the older stuff out?
807591
Yes, I know you have to rewrite the whole log file... but is there any other way to keep track of the last X log entries? I don't want to truncate the file once the limit is reached, because in that case I would be erasing the most recent entries too. Besides, if X is about 100-500 lines, I don't think there is much of a performance loss, if that is what you meant. But yes, I agree with you: this is not the best solution.

Another comment: I am currently using Miranda (an instant messaging client), and I can set it to keep the last X events in its history. How can similar behaviour be implemented in Java?
807591
hunter009, your comment would also apply to the common limit you can set on the Logger class (cycling between MULTIPLE files). Not being able to get ALL the log entries doesn't depend on the number of files; it depends on cycling them. I chose to cycle them because I want them to have a "controlled" size. For me, the log file's size is much more important than what is actually logged (which in my case is not critical data).
I am rather certain that Log4j supports rolling files based on sequence numbers and size limits. Thus it goes from file_1 to file_10 and then starts over.
796440
asdasdasdasdasdasdasdasd wrote:
Yes, I know you have to rewrite the whole log file... but is there any other way to keep track of the last X log entries?
What's the value in keeping the last N entries, vs. the last N days or last N megabytes?
I don't want to truncate the file once the limit is reached because in that case I would be erasing the most recent entries too.
So use a rolling file appender in log4j. You can set max file size, or max days to keep, maybe both. It renames the old one with a time/date stamp or with a sequence number. When you hit the max number of files, or possibly other criteria, it deletes the oldest.
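As an illustration, a size-based log4j 1.2 RollingFileAppender configured programmatically might look like the sketch below; the file name, maximum size, and backup count are placeholder values, and the date-stamped variant mentioned above is DailyRollingFileAppender rather than this one.

import java.io.IOException;
import org.apache.log4j.Logger;
import org.apache.log4j.PatternLayout;
import org.apache.log4j.RollingFileAppender;

public class Log4jRollingDemo {
    public static void main(String[] args) throws IOException {
        // Size-based rolling: when app.log reaches MaxFileSize it is renamed to
        // app.log.1, existing backups shift up by one, and anything beyond
        // MaxBackupIndex is deleted.
        RollingFileAppender appender = new RollingFileAppender(
                new PatternLayout("%d %-5p %c - %m%n"), "app.log", true);
        appender.setMaxFileSize("1MB");
        appender.setMaxBackupIndex(10);

        Logger.getRootLogger().addAppender(appender);
        Logger.getLogger(Log4jRollingDemo.class).info("rolling appender configured");
    }
}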
807591
Log4j does support them. Actually, even the Logger class allows you to cycle files. However, I don't want to cycle multiple files; I just want to cycle log entries (or lines) within a single file.
You can always write your own appender then.

Note that for this to work the lines must be the same length every single time. And you can't log stack traces (because they are not the same length).

Given that, you might as well write your own debug-log class. It isn't clear what advantage another tool would give you with these requirements, and it isn't hard to implement yourself.
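A rough sketch of the fixed-length-record idea that implies, assuming an arbitrary record size and a hypothetical RingFileLog class: the file is treated as a ring of equal-sized slots and the oldest slot is overwritten in place, so nothing ever has to be shifted or rewritten.

import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

/** Hypothetical sketch: a single-file ring of fixed-size log records. */
public class RingFileLog implements AutoCloseable {
    private static final int RECORD_SIZE = 128;  // every record is padded/truncated to this many bytes
    private final RandomAccessFile file;
    private final int maxRecords;
    private long nextSlot;                       // index of the slot to overwrite next

    public RingFileLog(String path, int maxRecords) throws IOException {
        this.file = new RandomAccessFile(path, "rw");
        this.maxRecords = maxRecords;
        // Resume after the last complete record; once the file is full the true oldest
        // slot can no longer be inferred from the length alone (a real implementation
        // would persist nextSlot in a small header record).
        this.nextSlot = (file.length() / RECORD_SIZE) % maxRecords;
    }

    public synchronized void log(String message) throws IOException {
        byte[] record = new byte[RECORD_SIZE];
        Arrays.fill(record, (byte) ' ');         // pad short messages with spaces
        byte[] msg = message.getBytes(StandardCharsets.UTF_8);
        System.arraycopy(msg, 0, record, 0, Math.min(msg.length, RECORD_SIZE - 1));
        record[RECORD_SIZE - 1] = '\n';          // keep every record line-terminated
        file.seek(nextSlot * RECORD_SIZE);       // overwrite the oldest slot in place
        file.write(record);
        nextSlot = (nextSlot + 1) % maxRecords;
    }

    @Override
    public void close() throws IOException {
        file.close();
    }

    public static void main(String[] args) throws IOException {
        try (RingFileLog log = new RingFileLog("ring.log", 500)) {
            log.log("application started");
        }
    }
}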
807591
It would be the same to keep the last N entries or the last N megabytes. But the problem is that when the limit is reached the file is fully cleaned, so after that cleanup you are not storing the last N megabytes, only the log record that hit the limit. Anyway, I just wanted to know if there was any existing way of implementing similar behaviour with only one file. I think I'll have to use multiple files.

Thank you all for replying!
796440
asdasdasdasdasdasdasdasd wrote:
Log4j does support them. Actually even the Logger class allows you to cycle files. However i don't want to cycle multiple files, I just want to cycle log entries (or lines) in only one file.
Again, why? What weird requirement is driving this?

Cycling lines in a file is simply not a reasonable thing to do here. And probably not even attainable with acceptable performance.
It would be the same to keep the last N entries or megabytes
Those are not the same. If the lines are not all the same length, you cannot manage the data without ending up with an incomplete line at the boundary.
But the problem is that when the limit is reached the file is fully cleaned... so, after that cleanup you are not storing the last N megabytes...
Huh?

There is no "storing" if you start overwriting. And short of stopping logging, you can't grab the file while it is in use, or at least that really isn't a good idea.
807591
What about LimpidLog, which is a revolutionary logging system? You do not need to hard-code log statements.
It is open-source software at http://www.acelet.org/_limpidlog/.
Even better, you can use LimpidDebugger as a reader and tracer for your log data. It is a GUI debugger, just like a traditional debugger. It is free at http://www.acelet.com/super/LimpidDebugger/.
You do not need to hard code log statements.
Which would suggest a "logging" system that doesn't work the way that I want.

Presumably it is using debugging points, and if I want a debugger then I would use a debugger.
807591
LimpidLog which is a revolutionary logging system. You do not need to hard code log statements.
It is a completely different idea.

At runtime, there is no breakpoint. When you want to read the log data, you set a breakpoint. The breakpoint is a break for READING, not EXECUTION! New idea!
LimpidLog which is a revolutionary logging system. You do not need to hard code log statements.
It is a completely different idea.
Object-oriented databases were a "revolutionary" idea that proved, after a while (not soon enough), to have very little use in the marketplace.
At runtime, there is no breakpoint. When you want to read the log data, you set a breakpoint. The breakpoint is a break for READING, not EXECUTION! New idea!
Apparently you completely ignored what I said before - if I want to use a debugger then I will use a debugger.

Conversely I do not use a log library in place of a debugger and I never will.

So it doesn't matter to me how your tool works because it will not meet my needs.

Post Details

Locked on Apr 24 2008
Added on Mar 26 2008
16 comments
338 views