
MessageAppender extends RollingFileAppender. I override rollover() to notify an Observable; a simplified sketch of that part is just below (not the exact code). I'll also verify the part about the increasing threads, by using the RollingFileAppender directly in logback.xml...
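For illustration only (the helper Observable subclass and the getObservable() accessor are made up here; the real class does more):

import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.rolling.RollingFileAppender;

import java.util.Observable;

public class MessageAppender extends RollingFileAppender<ILoggingEvent> {

    // java.util.Observable only exposes setChanged() to subclasses,
    // hence this small helper
    private static class RolloverObservable extends Observable {
        void fire(Object arg) {
            setChanged();
            notifyObservers(arg);
        }
    }

    private final RolloverObservable observable = new RolloverObservable();

    public Observable getObservable() {
        return observable;
    }

    @Override
    public void rollover() {
        super.rollover();            // logback does the actual file switch
        observable.fire(getFile());  // tell observers which file is active now
    }
}

On 07/11/2012 04:46 PM, ceki wrote: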
Could it be that com.nexustelecom.minaclient.writer.MessageAppender is not closing files? There are 24 hours in a day, so you should not have more than 24 active appenders...
On 11.07.2012 16:13, Andreas Kruthoff wrote:
Hi
I'm trying to handle a scenario with the following sifting appender configuration, but I'm running into problems with too many open files.
I use logback to produce CSV files.
The ${hour} key is set based on the content (within my %msg).
Rollover is every minute.
<appender name="SIFT" class="ch.qos.logback.classic.sift.SiftingAppender">
  <discriminator>
    <key>hour</key>
    <defaultValue>unknown</defaultValue>
  </discriminator>
  <sift>
    <appender name="DATA-${hour}" class="com.nexustelecom.minaclient.writer.MessageAppender">
      <file>${java.io.tmpdir}/data-${hour}.csv</file>
      <encoder>
        <pattern>%msg</pattern>
      </encoder>
      <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
        <fileNamePattern>${java.io.tmpdir}/data-${hour}-%d{yyyyMMddHHmm}.csv</fileNamePattern>
        <cleanHistoryOnStart>true</cleanHistoryOnStart>
        <maxHistory>3</maxHistory>
      </rollingPolicy>
    </appender>
  </sift>
</appender>
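For context: since the <discriminator> above doesn't specify a class, logback uses the default MDC-based discriminator, which means the hour value has to be put into the MDC before each logging call. Roughly like this (a simplified sketch; the class and variable names here are made up):

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;

public class CsvEmitter {

    private static final Logger log = LoggerFactory.getLogger(CsvEmitter.class);

    void emit(String csvLine, String hourFromContent) {
        // the MDC-based discriminator reads "hour" from the MDC,
        // so it has to be set before the logging call
        MDC.put("hour", hourFromContent);
        try {
            log.info(csvLine);
        } finally {
            MDC.remove("hour");
        }
    }
}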
Suppose that I started the application at 11, and that current time is 16, so most of the data goes into data-16.csv, and some goes into data-15.csv, data-14.csv, but nothing goes into older data files.
After a while I accumulate more and more old files, both active and rolled ones, e.g. data-11.csv, data-12.csv, and data-11-201207111125.csv, data-11-201207111132.csv, and so on.
These old files (from 11 or 12) are still under logback's control, but since they no longer receive new data, they stay open forever and are never cleaned up; there are no new events for these files.
At the same time, the JVM shows an increasing number of active threads (probably the loggers), which causes problems after a few hours.
Is there a way to configure logback to handle my scenario?
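One thing I'm wondering about: I believe newer logback versions add timeout and maxAppenderCount options to SiftingAppender, so that nested appenders which haven't been used for a while are closed and discarded. I'm not sure which version introduced them, or whether they apply to my setup, but the idea would be something like:

<appender name="SIFT" class="ch.qos.logback.classic.sift.SiftingAppender">
  <!-- assumption: these two options exist in the logback version in use -->
  <timeout>30 minutes</timeout>
  <maxAppenderCount>30</maxAppenderCount>
  <discriminator>
    <key>hour</key>
    <defaultValue>unknown</defaultValue>
  </discriminator>
  <sift>
    ... same nested MessageAppender as above ...
  </sift>
</appender>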