This is not just an issue with Docker/daemon processes. I am seeing stack traces that indicate the same underlying issue on my local workstation, attached to a terminal (running on Linux). It hangs the process so badly that I have to kill -9 it to get rid of it. In one of the thread dumps, I had 12 threads parked on the lock held by the thread writing to the console (i.e. the thread below). This is on logback 1.2.3. So I would also concur that this is more than just slowness.

I wonder if there isn't a real deadlock here, or alternatively some circular logging, which would mean the "queue" of threads waiting to acquire the lock is never drained (if new log events arrive faster than they can be written out, this could potentially happen). Interestingly enough, I don't see any logging being written to STDOUT in this case, so if the latter holds true, that logging is likely going to a logger/log level that is discarded in the actual output. The next time this happens I'll try to edit my logback.groovy to enable all log output, to help debug this (a rough sketch of what I mean is below the thread dump).

Ceki, if you don't mind, please re-open this issue so we can discuss it further. Alternatively, please suggest some other way to work around this.

"Thread-3 (activemq-netty-threads)" #178 daemon prio=5 os_prio=0 cpu=9252.88ms elapsed=594609.39s tid=0x00007fa0dc00a000 nid=0x111e runnable [0x00007fa041cdb000]
   java.lang.Thread.State: RUNNABLE
    at java.io.FileOutputStream.writeBytes(java.base@11.0.4/Native Method)
    at java.io.FileOutputStream.write(java.base@11.0.4/FileOutputStream.java:354)
    at java.io.BufferedOutputStream.write(java.base@11.0.4/BufferedOutputStream.java:123)
    - locked <0x0000000500c96270> (a java.io.BufferedOutputStream)
    at java.io.PrintStream.write(java.base@11.0.4/PrintStream.java:559)
    - locked <0x0000000500c96248> (a java.io.PrintStream)
    at java.io.FilterOutputStream.write(java.base@11.0.4/FilterOutputStream.java:108)
    at ch.qos.logback.core.joran.spi.ConsoleTarget$1.write(ConsoleTarget.java:37)
    at ch.qos.logback.core.OutputStreamAppender.writeBytes(OutputStreamAppender.java:199)
    at ch.qos.logback.core.OutputStreamAppender.subAppend(OutputStreamAppender.java:231)
    at ch.qos.logback.core.OutputStreamAppender.append(OutputStreamAppender.java:102)
    at ch.qos.logback.core.UnsynchronizedAppenderBase.doAppend(UnsynchronizedAppenderBase.java:84)
    at ch.qos.logback.core.spi.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:51)
    at ch.qos.logback.classic.Logger.appendLoopOnAppenders(Logger.java:270)
    at ch.qos.logback.classic.Logger.callAppenders(Logger.java:257)
    at ch.qos.logback.classic.Logger.buildLoggingEventAndAppend(Logger.java:421)
    at ch.qos.logback.classic.Logger.filterAndLog_0_Or3Plus(Logger.java:383)
    at ch.qos.logback.classic.Logger.log(Logger.java:765)
    at org.apache.logging.slf4j.SLF4JLogger.logMessage(SLF4JLogger.java:232)
    at org.jboss.logging.Log4j2Logger.doLog(Log4j2Logger.java:54)
    at org.jboss.logging.Logger.logv(Logger.java:2256)
    at org.apache.activemq.artemis.core.client.ActiveMQClientLogger_$logger.connectionFailureDetected(ActiveMQClientLogger_$logger.java:356)
    at org.apache.activemq.artemis.core.protocol.core.impl.RemotingConnectionImpl.fail(RemotingConnectionImpl.java:204)
    at org.apache.activemq.artemis.spi.core.protocol.AbstractRemotingConnection.fail(AbstractRemotingConnection.java:218)
    at org.apache.activemq.artemis.core.remoting.server.impl.RemotingServiceImpl$DelegatingBufferHandler.bufferReceived(RemotingServiceImpl.java:646)
    at org.apache.activemq.artemis.core.remoting.impl.netty.ActiveMQChannelHandler.channelRead(ActiveMQChannelHandler.java:73)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
    at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:323)
    at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:297)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1408)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:930)
    at io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:799)
    at io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:427)
    at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:328)
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:905)
    at org.apache.activemq.artemis.utils.ActiveMQThreadFactory$1.run(ActiveMQThreadFactory.java:118)
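For reference, this is a minimal sketch of the logback.groovy I have in mind for the "enable everything" experiment: route all loggers to the console at TRACE and also print logback's own status messages. The appender name and pattern below are placeholders, not my actual config.

import ch.qos.logback.classic.encoder.PatternLayoutEncoder
import ch.qos.logback.core.ConsoleAppender
import ch.qos.logback.core.status.OnConsoleStatusListener

import static ch.qos.logback.classic.Level.TRACE

// Show logback's own status messages so configuration problems are visible too
statusListener(OnConsoleStatusListener)

// Placeholder console appender; the pattern is arbitrary
appender("STDOUT", ConsoleAppender) {
    encoder(PatternLayoutEncoder) {
        pattern = "%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n"
    }
}

// Most verbose level for everything, so nothing is silently discarded
root(TRACE, ["STDOUT"])

That should make it obvious whether some component is logging in a tight loop while the console lock is held.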