
Hi, I am a new user of logback and would like to use it in my project. We are currently using log4j, but we have very strict performance requirements, so it would be great if logback performed better than log4j. I did some preliminary tests to compare the performance of the two, and I observed that in some places logback is certainly better, but I also observed behaviour that is contrary to what is described in the documentation. I am attaching the source code and my config files for the test. My questions are:

1. The documentation says that

  logbacklogger.debug("SEE IF THIS IS LOGGED {}.", i);

is faster than

  if (logbacklogger.isDebugEnabled()) logbacklogger.debug("SEE IF THIS IS LOGGED " + i + ".");

but I observed that this is only true if the logger is configured for debug. For any other logging level the second approach is still more performant. This means that in a production environment I must still guard the call with isDebugEnabled(). Is that intended, or am I missing something?

2. Something very strange that I observed in my test: if I run my outer loop 10 times, the 7th invocation of the logback test always takes much longer. I was able to reproduce this behaviour consistently.

For the time being I am attaching my source code and the config files below; maybe somebody can point out errors or ways to improve.

######################################## JAVA FILE ##################################################

package mandy.test.programs;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class Logtester {

  static Logger logbacklogger = LoggerFactory.getLogger(Logtester.class);
  static org.apache.log4j.Logger log4jlogger = org.apache.log4j.Logger.getLogger(Logtester.class);

  public static void main(String[] args) throws InterruptedException {
    for (int j = 0; j < 5; j++) {
      System.out.println("#######################################");
      System.out.println("TEST NUMBER " + j);
      System.out.println("#######################################");

      long start = System.nanoTime();
      for (int i = 0; i < 500; i++) {
        log4jlogger.debug("SEE IF THIS IS LOGGED " + i + ".");
      }
      System.out.println("time taken in nanoseconds in first log4j test is " + (System.nanoTime() - start));

      start = System.nanoTime();
      for (int i = 0; i < 500; i++) {
        if (log4jlogger.isDebugEnabled())
          log4jlogger.debug("SEE IF THIS IS LOGGED " + i + ".");
      }
      System.out.println("time taken in nanoseconds in second log4j test is " + (System.nanoTime() - start));

      Thread.sleep(1000);

      start = System.nanoTime();
      for (int i = 0; i < 500; i++) {
        logbacklogger.debug("SEE IF THIS IS LOGGED " + i + ".");
      }
      System.out.println("time taken in nanoseconds in first logback test is " + (System.nanoTime() - start));

      start = System.nanoTime();
      for (int i = 0; i < 500; i++) {
        if (logbacklogger.isDebugEnabled())
          logbacklogger.debug("SEE IF THIS IS LOGGED " + i + ".");
      }
      System.out.println("time taken in nanoseconds in second logback test is " + (System.nanoTime() - start));

      Thread.sleep(1000);

      start = System.nanoTime();
      for (int i = 0; i < 500; i++) {
        logbacklogger.debug("SEE IF THIS IS LOGGED {}.", i);
      }
      System.out.println("time taken in nanoseconds in third logback test is " + (System.nanoTime() - start));
    }
  }
}

################################################### LOG4J CONFIG ######################################

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE log4j:configuration SYSTEM "../resource/log4j.dtd">
<log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/" debug="false">
  <appender name="TestLogfile" class="org.apache.log4j.RollingFileAppender">
    <param name="File" value="d:/mandy/Test_Programs/testlog4j.log"/>
    <param name="MaxFileSize" value="1000KB"/>
    <param name="maxBackupIndex" value="1"/>
    <param name="Append" value="true"/>
    <layout class="org.apache.log4j.PatternLayout">
      <param name="ConversionPattern" value="%d{ISO8601} %5p [%t] %c{1} %X{transactionId} - %m%n"/>
    </layout>
  </appender>
  <logger name="mandy.test.programs" additivity="false">
    <level value="debug"/>
    <appender-ref ref="TestLogfile"/>
  </logger>
</log4j:configuration>

############################################# LOGBACK CONFIG ############################################

<configuration>
  <appender name="TestLogfile" class="ch.qos.logback.core.FileAppender">
    <File>d:/mandy/Test_Programs/testlogback.log</File>
    <Append>false</Append>
    <Encoding>UTF-8</Encoding>
    <BufferedIO>true</BufferedIO>
    <ImmediateFlush>true</ImmediateFlush>
    <layout class="ch.qos.logback.classic.PatternLayout">
      <Pattern>%-4relative [%thread] %-5level %logger{35} - %msg%n</Pattern>
    </layout>
  </appender>
  <root>
    <level value="debug" />
    <appender-ref ref="TestLogfile" />
  </root>
</configuration>

######################################### OUTPUT WITH LOG LEVEL SET TO INFO IN BOTH CONFIG FILES ############################

All times are in nanoseconds for 500 calls. The columns are the first and second log4j tests (direct call / isDebugEnabled-guarded call) and the first, second and third logback tests (direct / isDebugEnabled-guarded / parametrized).

Test  log4j 1st  log4j 2nd  logback 1st  logback 2nd  logback 3rd
0     1287035    425473     1988521      60622        1378387
1     2332139    115378     1550756      50565        374069
2     2050540    670755     846756       415974       1142044
3     321828     43581      358985       51962        127390
4     323784     43581      362337       45816        124597
5     326578     43581      5767493      25981        103365
6     261206     28775      239416       5867         59505
7     223492     12851      235784       5867         58667
8     220977     12851      238299       5867         61460
9     223212     12851      237461       5308         62019

######################################### OUTPUT WITH LOG LEVEL SET TO DEBUG IN BOTH CONFIG FILES ############################

Test  log4j 1st   log4j 2nd   logback 1st  logback 2nd  logback 3rd
0     85486004    35743649    22309540     7795963      19922365
1     39167827    33095826    3618616      2657601      6253309
2     12328941    13580218    2401422      2868521      4724902
3     12460243    11747583    2559543      2694477      4801727
4     11915481    12330618    2379073      2888076      2698667
5     15098288    11025144    8196013      2065067      2888076
6     10594922    9976687     2229892      2172902      2692800
7     10521728    10123354    2148597      2135746      2933893
8     10182579    10013563    2300851      2087137      2761803
9     9720230     10646605    2647264      2164800      2681905

Hello,

Thanks for your email and your observations. I've run your tests, and I think we should modify a few things to obtain better results. See the class that I've appended at the end of this email.

Since we want to measure the cost of *not* logging, there is no need for configuration files: setting the level of the two loggers to INFO and calling the debug() methods is enough. We will then see the time it takes for both frameworks to realize that the current logging request needs to be dropped.

Also, there will be a difference between the first and following results because of the just-in-time compiler, so we should first run the tests once without measuring, and only then output the results.

One more thing is the number of times we call the debug() method. I think a number around 100'000 is more appropriate than 500, which is really too low.

By running the modified test, I get these values [nanoseconds per call]:

Log4j direct debug call: 370
Log4j tested (isDebugEnabled) debug call: 10
Logback direct debug call: 354
Logback tested (isDebugEnabled) debug call: 3
Logback parametrized debug call: 50

Now, about your performance questions:

1. In several documents[1][2] we explain that parametrized logging is much faster than calling the debug method with one argument built by string concatenation. That is:

  logger.debug("Hi, my name is {}.", name);

is faster than:

  logger.debug("Hi, my name is " + name + ".");

But it is not true that parametrized logging is faster than calling isDebugEnabled() before doing anything. I would really like to know where you read that the parametrized call was faster than the tested call (i.e. calling the printing method inside an if statement that tests the level). If we wrote that somewhere, I am going to correct it.

However, we can see that logback performs 3 times better than log4j when calling the isDebugEnabled() method. This comes from the fact that logback loggers know their effective level (that is, their level once level inheritance has been taken into consideration). There is no need to walk up the logger hierarchy to find the effective level of a logger. So if performance is critical to you, you have here a good reason to switch :) That said, I was expecting the parametrized call to be faster than what came out of this test, so I'm going to dig some more on this point.

2. I've tried to run the tests several times, but did not experience an invocation running for a much longer time than all the others... and I must admit that it looks strange to me... You might want to run the class that I've attached to this email, so we can see how it performs in your environment.

Regards,
Sébastien

[1] http://slf4j.org/faq.html#logging_performance
[2] http://logback.qos.ch/manual/architecture.html#ParametrizedLogging

--
Sébastien Pennec
sebastien@qos.ch
Logback: The reliable, generic, fast and flexible logging framework for Java.
http://logback.qos.ch/

######################################################

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class PerformanceComparator {

  static Logger logbacklogger = LoggerFactory.getLogger(PerformanceComparator.class);
  static org.apache.log4j.Logger log4jlogger = org.apache.log4j.Logger.getLogger(PerformanceComparator.class);

  // How many times should we try to log:
  static int loop = 100000;

  public static void main(String[] args) throws InterruptedException {
    initConfig();

    // Let's run once for the Just In Time compiler
    log4jDirectDebugCall();
    log4jTestedDebugCall();
    logbackDirectDebugCall();
    logbackTestedDebugCall();
    logbackParametrizedDebugCall();

    // let's run the tests and display the results:
    long result1 = log4jDirectDebugCall();
    long result2 = log4jTestedDebugCall();
    long result3 = logbackDirectDebugCall();
    long result4 = logbackTestedDebugCall();
    long result5 = logbackParametrizedDebugCall();

    System.out.println("###############################################");
    System.out.println("Log4j direct debug call: " + result1);
    System.out.println("Log4j tested (isDebugEnabled) debug call: " + result2);
    System.out.println("Logback direct debug call: " + result3);
    System.out.println("Logback tested (isDebugEnabled) debug call: " + result4);
    System.out.println("Logback parametrized debug call: " + result5);
    System.out.println("###############################################");
  }

  private static long log4jDirectDebugCall() {
    long start = System.nanoTime();
    for (int i = 0; i < loop; i++) {
      log4jlogger.debug("SEE IF THIS IS LOGGED " + i + ".");
    }
    return (System.nanoTime() - start) / loop;
  }

  private static long log4jTestedDebugCall() {
    long start = System.nanoTime();
    for (int i = 0; i < loop; i++) {
      if (log4jlogger.isDebugEnabled())
        log4jlogger.debug("SEE IF THIS IS LOGGED " + i + ".");
    }
    return (System.nanoTime() - start) / loop;
  }

  private static long logbackDirectDebugCall() {
    long start = System.nanoTime();
    for (int i = 0; i < loop; i++) {
      logbacklogger.debug("SEE IF THIS IS LOGGED " + i + ".");
    }
    return (System.nanoTime() - start) / loop;
  }

  private static long logbackTestedDebugCall() {
    long start = System.nanoTime();
    for (int i = 0; i < loop; i++) {
      if (logbacklogger.isDebugEnabled())
        logbacklogger.debug("SEE IF THIS IS LOGGED " + i + ".");
    }
    return (System.nanoTime() - start) / loop;
  }

  private static long logbackParametrizedDebugCall() {
    long start = System.nanoTime();
    for (int i = 0; i < loop; i++) {
      logbacklogger.debug("SEE IF THIS IS LOGGED {}.", i);
    }
    return (System.nanoTime() - start) / loop;
  }

  private static void initConfig() {
    // We set the levels of both loggers to INFO,
    // so that we do not log anything.
    log4jlogger.setLevel(org.apache.log4j.Level.INFO);
    ch.qos.logback.classic.Logger log = (ch.qos.logback.classic.Logger) logbacklogger;
    log.setLevel(ch.qos.logback.classic.Level.INFO);
  }
}
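To make the effective-level point above concrete, here is a small hypothetical sketch (not the actual log4j or logback sources) contrasting a logger that must walk its parent chain on every level check with one that caches its effective level as a plain int; the cached form is essentially why the tested logback call above is so cheap. All class and field names here are illustrative assumptions.

import java.util.Arrays;

// Hypothetical sketch, not real log4j/logback code: contrasts walking the
// logger hierarchy on every check with caching the effective level.
public class EffectiveLevelSketch {

  static final int DEBUG = 10;
  static final int INFO = 20;

  // Style 1: the level may be inherited, so every check climbs the hierarchy.
  static class WalkingLogger {
    WalkingLogger parent;
    Integer level; // null means "inherit from parent"

    boolean isDebugEnabled() {
      for (WalkingLogger l = this; l != null; l = l.parent) {
        if (l.level != null) {
          return l.level <= DEBUG;
        }
      }
      return false; // no level set anywhere in the chain
    }
  }

  // Style 2: the effective level is recomputed only when a level changes,
  // so each check is a single int comparison.
  static class CachingLogger {
    int effectiveLevelInt = INFO;

    boolean isDebugEnabled() {
      return effectiveLevelInt <= DEBUG;
    }
  }

  public static void main(String[] args) {
    WalkingLogger root = new WalkingLogger();
    root.level = INFO;
    WalkingLogger child = new WalkingLogger();
    child.parent = root; // inherits INFO, so every check walks up to the root

    CachingLogger cached = new CachingLogger();

    System.out.println(child.isDebugEnabled());  // false, after walking the chain
    System.out.println(cached.isDebugEnabled()); // false, one comparison
  }
}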

Hello Mandeep,

After some research and performance optimizations, here are the results I've obtained. Times are shown in nanoseconds per request, averaged over 1'000'000 requests.

Log4j direct debug call: 442
Log4j tested (isDebugEnabled) debug call: 19
Logback direct debug call: 435
Logback tested (isDebugEnabled) debug call: 10
Logback parametrized debug call: 15

They are rather different from what I had witnessed previously. First of all, the procedure is closer to a real-life example: the PerformanceComparator class is nested in a 4-level-deep package rather than the top-level package, and only the root loggers have a level set explicitly. That means both logging frameworks have to take level inheritance into account when deciding whether to log. This shows how much faster logback is than log4j when relying on level inheritance: the tested call (using isDebugEnabled()) in logback is almost 2 times faster than its log4j counterpart.

Then, we've made a few optimizations in logback's Logger class. There are now fewer method calls, and smarter if statements when receiving the TurboFilter's reply. This has made the parametrized call much faster; this form is now faster than log4j's isDebugEnabled() form.

Since TurboFilters are a big feature of logback, I should mention that they are consulted when using the direct call or the parametrized call, but also when calling isDebugEnabled(). For example, you might want to set your levels to INFO or WARN in production, and add a TurboFilter that enables DEBUG logging for a certain user. With the username in the MDC, you can easily add a TurboFilter that always accepts log requests issued by that user's actions, although the logger would not accept such requests coming from any other user. isDebugEnabled() will then return false every time, except when the current user is the one allowed to log at debug level.

I've attached the class I used to get these results at the end of this email. You might want to try it in your environment and see if the performance satisfies your needs.
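A minimal sketch of such an MDC-based TurboFilter might look like the following. This is an illustration under stated assumptions, not code from the thread: the class name, package, "username" MDC key and the debugUser property are all hypothetical.

package com.example.filters;

import org.slf4j.MDC;
import org.slf4j.Marker;

import ch.qos.logback.classic.Level;
import ch.qos.logback.classic.Logger;
import ch.qos.logback.classic.turbo.TurboFilter;
import ch.qos.logback.core.spi.FilterReply;

// Hypothetical sketch of the per-user filter described above: accept any
// request (including isDebugEnabled() checks) when the MDC says the current
// work is being done on behalf of the "debug" user, stay neutral otherwise.
public class DebugUserTurboFilter extends TurboFilter {

  private String debugUser = "support"; // assumed default, configure as needed

  public FilterReply decide(Marker marker, Logger logger, Level level,
                            String format, Object[] params, Throwable t) {
    if (debugUser.equals(MDC.get("username"))) {
      return FilterReply.ACCEPT;  // bypass the logger's effective level
    }
    return FilterReply.NEUTRAL;   // fall back to the normal level check
  }

  public void setDebugUser(String debugUser) {
    this.debugUser = debugUser;
  }
}

Assuming this sketch, the filter would be attached to the logger context directly under <configuration> in logback.xml, roughly like this (element names follow the setter, exact spelling may vary):

<configuration>
  <turboFilter class="com.example.filters.DebugUserTurboFilter">
    <debugUser>alice</debugUser>
  </turboFilter>
  <!-- appenders and loggers as usual -->
</configuration>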
You will need to check out logback from its source repository[1][2].

Cheers,
Sébastien

[1] http://logback.qos.ch/repos.html
[2] http://logback.qos.ch/manual/introduction.html#BuildingLogback

package perfTest.ch.qos.logback;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class PerformanceComparator {

  static Logger logbacklogger = LoggerFactory.getLogger(PerformanceComparator.class);
  static org.apache.log4j.Logger log4jlogger = org.apache.log4j.Logger.getLogger(PerformanceComparator.class);

  // How many times should we try to log:
  static int loop = 1000000;

  public static void main(String[] args) throws InterruptedException {
    initConfig();

    // Let's run once for the Just In Time compiler
    log4jDirectDebugCall();
    log4jTestedDebugCall();
    logbackDirectDebugCall();
    logbackTestedDebugCall();
    logbackParametrizedDebugCall();

    // let's run the tests and display the results:
    long result1 = log4jDirectDebugCall();
    long result2 = log4jTestedDebugCall();
    long result3 = logbackDirectDebugCall();
    long result4 = logbackTestedDebugCall();
    long result5 = logbackParametrizedDebugCall();

    System.out.println("###############################################");
    System.out.println("Log4j direct debug call: " + result1);
    System.out.println("Log4j tested (isDebugEnabled) debug call: " + result2);
    System.out.println("Logback direct debug call: " + result3);
    System.out.println("Logback tested (isDebugEnabled) debug call: " + result4);
    System.out.println("Logback parametrized debug call: " + result5);
    System.out.println("###############################################");
  }

  private static long log4jDirectDebugCall() {
    Integer j = new Integer(2);
    long start = System.nanoTime();
    for (int i = 0; i < loop; i++) {
      log4jlogger.debug("SEE IF THIS IS LOGGED " + j + ".");
    }
    return (System.nanoTime() - start) / loop;
  }

  private static long log4jTestedDebugCall() {
    Integer j = new Integer(2);
    long start = System.nanoTime();
    for (int i = 0; i < loop; i++) {
      if (log4jlogger.isDebugEnabled()) {
        log4jlogger.debug("SEE IF THIS IS LOGGED " + j + ".");
      }
    }
    return (System.nanoTime() - start) / loop;
  }

  private static long logbackDirectDebugCall() {
    Integer j = new Integer(2);
    long start = System.nanoTime();
    for (int i = 0; i < loop; i++) {
      logbacklogger.debug("SEE IF THIS IS LOGGED " + j + ".");
    }
    return (System.nanoTime() - start) / loop;
  }

  private static long logbackTestedDebugCall() {
    Integer j = new Integer(2);
    long start = System.nanoTime();
    for (int i = 0; i < loop; i++) {
      if (logbacklogger.isDebugEnabled())
        logbacklogger.debug("SEE IF THIS IS LOGGED " + j + ".");
    }
    return (System.nanoTime() - start) / loop;
  }

  private static long logbackParametrizedDebugCall() {
    Integer j = new Integer(2);
    long start = System.nanoTime();
    for (int i = 0; i < loop; i++) {
      logbacklogger.debug("SEE IF THIS IS LOGGED {}.", j);
    }
    return (System.nanoTime() - start) / loop;
  }

  private static void initConfig() {
    org.apache.log4j.Logger log4jRoot = org.apache.log4j.Logger.getRootLogger();
    log4jRoot.setLevel(org.apache.log4j.Level.INFO);

    ch.qos.logback.classic.Logger lbRoot =
        (ch.qos.logback.classic.Logger) LoggerFactory.getLogger("ROOT");
    lbRoot.setLevel(ch.qos.logback.classic.Level.INFO);

    // create the loggers
    org.apache.log4j.Logger.getLogger("perfTest");
    org.apache.log4j.Logger.getLogger("perfTest.ch");
    org.apache.log4j.Logger.getLogger("perfTest.ch.qos");
    org.apache.log4j.Logger.getLogger("perfTest.ch.qos.logback");

    LoggerFactory.getLogger("perfTest");
    LoggerFactory.getLogger("perfTest.ch");
    LoggerFactory.getLogger("perfTest.ch.qos");
    LoggerFactory.getLogger("perfTest.ch.qos.logback");
  }
}

--
Sébastien Pennec
sebastien@qos.ch
Logback: The reliable, generic, fast and flexible logging framework for Java.
http://logback.qos.ch/

Nice work on the optimizations. There are a lot of things that may account for the strange results in the original benchmarks.

In general, microbenchmarks like this aren't very meaningful in Java. See Sun's Java Tuning White Paper <http://java.sun.com/performance/reference/whitepapers/tuning.html#section3.1> for a specific discussion of the problems of microbenchmarks. Ryan Lowe (I make no general endorsement of him or his blog) blogged a good example of how things can go very wrong <http://www.ryanlowe.ca/blog/archives/000447_java_microbenchmarks_are_evil.php>.

Because of HotSpot and other JVM features, you won't execute the same machine code every time you call a method. Depending on your JVM settings, methods may be inlined, optimized and even regressed to interpreted mode, all while your application is running.

The best thing to do is to put a library in your real application and run real benchmarks, or use something like JProfiler to see how much time you're spending in different libraries. Microbenchmarks will never tell you very much about how libraries perform in actual use. You just can't simulate the ways the VM will optimize code in the wild, which will often dwarf any differences you find in tight artificial loops.

If you insist on microbenchmarking, here are a few pointers (the sketch after this message illustrates the first two):

- Run each separate test in its own VM. This eliminates the influence of earlier tests (e.g., GC) on later tests. For example, run a separate VM for each of the five scenarios previously mentioned.
- Run several hundred thousand iterations before starting the timer. This helps make sure your code gets natively compiled before you start measuring. Most applications run long enough to make the initial compile time irrelevant, but it will skew your benchmarks.
- Use a benchmarking framework like Japex <https://japex.dev.java.net/>.

Hope this is helpful,
Andy Gerweck
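As a concrete illustration of the first two pointers, here is a hedged sketch of a harness that runs exactly one scenario per JVM invocation (chosen on the command line) and performs an untimed warm-up pass before the measured pass. The class name, argument values and scenario names are hypothetical, not from the thread.

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// Hypothetical sketch: one scenario per JVM invocation, selected via args[0],
// with an untimed warm-up pass before the measured pass.
public class SingleScenarioBenchmark {

  static Logger logger = LoggerFactory.getLogger(SingleScenarioBenchmark.class);
  static int loop = 1000000;

  public static void main(String[] args) {
    // e.g. "java SingleScenarioBenchmark tested" or "... parametrized";
    // start a fresh VM for each scenario so runs cannot influence each other.
    boolean tested = "tested".equals(args[0]);

    // warm-up pass: same work, result discarded, lets HotSpot compile the code
    if (tested) { testedCall(); } else { parametrizedCall(); }

    // measured pass
    long nsPerCall = tested ? testedCall() : parametrizedCall();
    System.out.println(args[0] + ": " + nsPerCall + " ns/call");
  }

  static long testedCall() {
    long start = System.nanoTime();
    for (int i = 0; i < loop; i++) {
      if (logger.isDebugEnabled()) logger.debug("SEE IF THIS IS LOGGED " + i + ".");
    }
    return (System.nanoTime() - start) / loop;
  }

  static long parametrizedCall() {
    long start = System.nanoTime();
    for (int i = 0; i < loop; i++) {
      logger.debug("SEE IF THIS IS LOGGED {}.", i);
    }
    return (System.nanoTime() - start) / loop;
  }
}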

Hello Andy,

I think that our current example is quite different from what Ryan Lowe tried to measure. Here, there is no Exception or stack trace to navigate, which is what skewed his measurements so significantly. It all comes down to measuring how quickly an int comparison can be done. Logback and log4j are not designed the same way: less work needs to be done before comparing the levels in logback than in log4j, and this is what makes logback faster.

Also, to be sure that the code is "hot" enough, all methods are run twice and only measured the second time. Since a method is run 1'000'000 times each time it is called, I guess that's enough to warm the code up :)

Anyway, after we studied and modified the code in the Logger class, it appears that the results I first posted were not strange, only logical: the code was simply less optimized than it is now. I even wrote earlier that we thought our parametrized calls could not be faster than log4j's or logback's isDebugEnabled() calls, and now they are faster :)

Until now, I've never tried JProfiler. I'll definitely take a look at it and run the tests again when I have some time.

Thanks for the links you provided and the advice,
Sébastien
-- Sébastien Pennec sebastien@qos.ch Logback: The reliable, generic, fast and flexible logging framework for Java. http://logback.qos.ch/

I agree that this is a different case from Ryan's, and I don't think there's anything really wrong with your test. As for the "strange" results, I was referring more to Mandeep's original post, where the numbers changed significantly over time; that was the real inspiration for my post. I try to say something when I see microbenchmarks because I think people tend to underestimate how much the JVM can distort performance. However, I would still caution against taking these results too seriously.

They do show that you've done a good job with logback; I'm personally appreciative of the work you and Ceki have put into a great project. With the log4j bridge and/or SLF4J, it should be pretty easy for people to post performance numbers from real, complex applications. If people have benchmark suites and time to post numbers, I'm sure we'd all be interested!

Thanks,
Andy Gerweck
participants (3)
- Gerweck Andy - agerwe
- mandeep singh
- Sebastien Pennec