We had to extract text out of documents that people were uploading to the site so we could index them. As this was going to be just a text-extraction service, I didn't want to go for a full-blown Tomcat server, so I wrote a quick HTTP server using the JDK's built-in com.sun.net.httpserver.HttpServer.
After being live for 2-3 days the process started crashing, and it was a regular pattern. I added -verbose:gc -XX:+HeapDumpOnOutOfMemoryError to the startup script and had a heap dump after three days. Opening the dump in Eclipse Memory Analyzer and looking at the Histogram revealed that we had 90K HttpConnection objects. Wow, that sure looks like a leak.
I tried everything from closing the exchange to closing the responseBody, as shown below, but the connections would remain.
    class DocumentProcessHandler implements HttpHandler {
        @SuppressWarnings("unchecked")
        public void handle(HttpExchange exchange) throws IOException {
            PrintWriter responseBody = new PrintWriter(exchange.getResponseBody());
            try {
                // ... code removed for brevity of post ...
            } catch (Throwable t) {
                logger.error(t);
                throw new IOException(t);
            } finally {
                responseBody.flush();
                responseBody.close();
                exchange.close();
            }
        }
    }
To prove my theory I sent a few HTTP requests and ran jmap -histo:live 16315 | grep http, and kept seeing the HttpConnection count increase after every request.
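For what it's worth, one workaround I've seen suggested is to read the request body to EOF before closing, so no unread bytes are left on the connection. A minimal sketch (DrainingHandler is my name for it, not code from the actual service):

```java
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpHandler;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Sketch: drain the request body fully before closing, so the
// connection has no leftover bytes and can return to the idle pool.
class DrainingHandler implements HttpHandler {
    public void handle(HttpExchange exchange) throws IOException {
        try {
            InputStream in = exchange.getRequestBody();
            byte[] buf = new byte[4096];
            while (in.read(buf) != -1) {
                // discard: read until EOF
            }
            byte[] response = "ok".getBytes("UTF-8");
            exchange.sendResponseHeaders(200, response.length);
            OutputStream out = exchange.getResponseBody();
            out.write(response);
            out.close();
        } finally {
            exchange.close();
        }
    }
}
```

I can't claim this fixed the leak in our case, but draining before close is the pattern the HttpExchange docs lean towards.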
This is definitely a bug in the HttpServer code, so use it only for demos and not for production. I saw that lots of other people are facing the same issues, and Sun won't agree that it's a bug, so I am moving back to Tomcat :).
http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=6563368
http://forums.sun.com/thread.jspa?threadID=5380631
http://www.experts-exchange.com/Programming/Languages/Java/Q_24416845.html
Not adding all code for brevity, but the server was set up like this:

    this.httpServer = HttpServer.create(addr, 0);
    HttpContext context = this.httpServer.createContext("/", new DocumentProcessHandler());
    this.httpThreadPool = Executors.newFixedThreadPool(this.noOfThreads);
    this.httpServer.setExecutor(this.httpThreadPool);
    context.getFilters().add(new HttpParameterFilter());
    this.httpServer.start();
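I didn't include HttpParameterFilter in the post; a typical filter along these lines (a hypothetical reconstruction, not the real class) parses the query string into a map and stashes it on the exchange for the handler:

```java
import com.sun.net.httpserver.Filter;
import com.sun.net.httpserver.HttpExchange;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of a parameter-parsing filter: splits the raw
// query string into key=value pairs and exposes them as an attribute.
class HttpParameterFilter extends Filter {
    @Override
    public String description() {
        return "Parses query parameters into a map";
    }

    @Override
    public void doFilter(HttpExchange exchange, Chain chain) throws IOException {
        Map<String, String> params = new HashMap<String, String>();
        String query = exchange.getRequestURI().getRawQuery();
        if (query != null) {
            for (String pair : query.split("&")) {
                int eq = pair.indexOf('=');
                if (eq > 0) {
                    params.put(pair.substring(0, eq), pair.substring(eq + 1));
                }
            }
        }
        exchange.setAttribute("parameters", params);
        chain.doFilter(exchange); // hand off to the next filter / the handler
    }
}
```

A handler can then pick the map back up with exchange.getAttribute("parameters") — which is presumably why the real handler carried an @SuppressWarnings("unchecked").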
I found out that if the HTTP GET request has some extra bytes after the last 0x0d 0x0a in the HTTP header, it can lead to this leak, as the HttpConnection is not added to "idleConnections" in sun.net.httpserver.ServerImpl.
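To make the scenario in that comment concrete, here's a hypothetical client sketch (not from the original post) that sends a well-formed GET header followed by stray bytes over a raw socket; the server is a placeholder for wherever the HttpServer under test is listening:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.Socket;

// Hypothetical sketch: a GET request whose header ends with the usual
// \r\n\r\n but is followed by stray bytes the server does not expect.
class ExtraBytesClient {
    static String send(String host, int port) throws Exception {
        Socket socket = new Socket(host, port);
        OutputStream out = socket.getOutputStream();
        // Well-formed request line and headers, terminated by CRLF CRLF ...
        out.write(("GET / HTTP/1.1\r\nHost: " + host + "\r\n\r\n").getBytes("US-ASCII"));
        // ... followed by extra bytes after the last 0x0d 0x0a
        out.write("junk".getBytes("US-ASCII"));
        out.flush();
        BufferedReader in = new BufferedReader(
                new InputStreamReader(socket.getInputStream(), "US-ASCII"));
        String statusLine = in.readLine(); // first line of the response
        socket.close();
        return statusLine;
    }
}
```

The request still gets answered normally; per the comment, the damage is internal — the connection never makes it back into "idleConnections".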
Another relevant bug report: http://bugs.sun.com/view_bug.do;?bug_id=6946825