Discussion:
tomcat won't download large files -- out of memory error
DIGLLOYD INC
2008-09-26 05:01:01 UTC
Permalink
I have some large zip files I want to make available for download.
When I try to download a 70MB file, tomcat is trying to cache these
huge files (it seems). The result is that downloading them always
fails. I *want* caching for most everything, e.g. JPEGs, HTML, etc., and
I've set tomcat to use up to 1.5GB of memory.

Is there a way to limit the size of the file that will be cached? It's
regrettable that failure to cache a file can't gracefully degrade into
just not caching it.


Sep 25, 2008 9:50:17 PM org.apache.catalina.connector.CoyoteAdapter service
SEVERE: An exception or error occurred in the container during the request processing
java.lang.OutOfMemoryError: Java heap space
    at org.apache.naming.resources.ProxyDirContext.cacheLoad(ProxyDirContext.java:1571)
    at org.apache.naming.resources.ProxyDirContext.cacheLookup(ProxyDirContext.java:1449)
    at org.apache.naming.resources.ProxyDirContext.lookup(ProxyDirContext.java:283)
    at org.apache.tomcat.util.http.mapper.Mapper.internalMapWrapper(Mapper.java:782)
    at org.apache.tomcat.util.http.mapper.Mapper.internalMap(Mapper.java:626)
    at org.apache.tomcat.util.http.mapper.Mapper.map(Mapper.java:516)
    at org.apache.catalina.connector.CoyoteAdapter.postParseRequest(CoyoteAdapter.java:444)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:284)
    at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:844)
    at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:583)
    at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:447)
    at java.lang.Thread.run(Thread.java:613)


Lloyd Chambers
http://diglloyd.com

[Mac OS X 10.5.2 Intel, Tomcat 6.0.16]





DIGLLOYD INC
2008-09-26 05:34:43 UTC
Permalink
I came across the following:

-Dorg.apache.jasper.runtime.BodyContentImpl.LIMIT_BUFFER=true

at this page: http://hillert.blogspot.com/2008/05/if-tomcat-is-running-out-of-memory.html

I haven't tried it yet, and I don't know what it does (limits
something apparently).
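(For what it's worth, a system property like that would normally be handed to
Tomcat's JVM via CATALINA_OPTS or JAVA_OPTS, e.g. something like:

  CATALINA_OPTS="$CATALINA_OPTS -Dorg.apache.jasper.runtime.BodyContentImpl.LIMIT_BUFFER=true"

That is just the usual mechanism for passing -D properties; it says nothing
about what this particular one does.)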

Anyone know?


Lloyd
Pid
2008-09-26 07:26:04 UTC
Permalink
Post by DIGLLOYD INC
-Dorg.apache.jasper.runtime.BodyContentImpl.LIMIT_BUFFER=true
http://hillert.blogspot.com/2008/05/if-tomcat-is-running-out-of-memory.html
I haven't tried it yet, and I don't know what it does (limits something
apparently).
Anyone know?
Jasper is Tomcat's JSP component, so it's unlikely to have any bearing
on your problem if your files are static and are not generated by JSPs.

What are your java memory settings and how are you setting them?

p
Johnny Kewl
2008-09-26 08:47:47 UTC
Permalink
----- Original Message -----
From: "DIGLLOYD INC" <***@diglloyd.com>
To: "Tomcat Users List" <***@tomcat.apache.org>
Sent: Friday, September 26, 2008 7:01 AM
Subject: tomcat won't download large files -- out of memory error
I have some large zip files I want to make available for download. When I
try to download a 70MB file, tomcat is trying to cache these huge files
(it seems). The result is that downloading them always fails. I *want*
caching for most everything eg jpegs, html, etc and I've set tomcat to use
up to 1.5GB of memory.
Is there a way to limit the size of the file that will be cached? It's
regrettable that failure to cache a file can't gracefully degrade into
just not caching it.
Sep 25, 2008 9:50:17 PM org.apache.catalina.connector.CoyoteAdapter
service
SEVERE: An exception or error occurred in the container during the
request processing
java.lang.OutOfMemoryError: Java heap space
Lloyd Chambers
http://diglloyd.com
[Mac OS X 10.5.2 Intel, Tomcat 6.0.16]
Lloyd... I don't know, it's definitely running out of memory, but I'm not sure it's
because of a 70 MB file...
The default static cache size is 10 MB, so it's doubtful that TC is caching
that file...

We run video content on the LAN and some of that stuff is 700 MB on
standard settings... out-of-the-box settings.
We are still running TC 5.5.25 in production...

I don't know the Mac, but it's possible that it's just not setting enough
memory...
I guess the Mac has its own TC and its own JRE... so I have no idea how you
monitor stuff, or what the defaults are.
If it does have the later Sun JRE VisualVM tool... that's a nice way to watch
what your TC is doing.

Is it being served by Tomcat's default servlet or by a custom servlet? If
the latter, the coder may in fact be reading it all onto the heap, and it's very
likely to break then.
The default servlet in TC is smarter than that...

I don't think a 70 MB file should pose any problem served as a static
file...
I would try another TC version just to make sure... and I would ask
Java-***@lists.apple.com as well if you don't come right here.

I think you may be seeing a symptom of another problem... i.e. something else
has consumed all the memory, and the file download is the straw that broke
the camel's back. Maybe another test case is in order where you make a little
webapp with nothing else but the static file in it... if that works, then
something else in that webapp ate your memory...

Good luck...

---------------------------------------------------------------------------
HARBOR : http://www.kewlstuff.co.za/index.htm
The most powerful application server on earth.
The only real POJO Application Server.
See it in Action : http://www.kewlstuff.co.za/cd_tut_swf/whatisejb1.htm
---------------------------------------------------------------------------


Mark Thomas
2008-09-26 11:07:29 UTC
Permalink
Post by DIGLLOYD INC
I have some large zip files I want to make available for download. When
I try to download a 70MB file, tomcat is trying to cache these huge
files (it seems). The result is that downloading them always fails. I
*want* caching for most everything eg jpegs, html, etc and I've set
tomcat to use up to 1.5GB of memory.
Do you mean you have set cacheMaxSize="1500000" on the context?

Which JVM are you using? Particularly, are you using a 32bit or 64bit JVM?
Post by DIGLLOYD INC
Is there a way to limit the size of the file that will be cached?
Not at present. The maximum (cacheObjectMaxSize) is set to
(cacheMaxSize/20). I can see a case for making cacheObjectMaxSize
configurable. The cache should probably use the smaller of
(cacheMaxSize/20) and cacheObjectMaxSize.
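(As a worked example with the value asked about above, and assuming
cacheMaxSize is given in kilobytes: 1500000 KB / 20 = 75000 KB, roughly 73 MB,
so a 70 MB zip sits just under that per-file ceiling and is still eligible for
caching; if the heap is smaller than the cache is allowed to grow, an OOM is
exactly what you would expect.)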
Post by DIGLLOYD INC
It's regrettable that failure to cache a file can't gracefully degrade into
just not caching it.
It isn't possible to handle OOMs gracefully. Once they occur you have to
assume the JVM is toast and restart it.

Providing you have enough memory configured for the JVM to support the
cache size you have asked for plus the other memory you need to run Tomcat,
the cache will be fine and you won't see an OOM.

It appears in this case that the failure is that your JVM doesn't have
enough memory configured. With sufficient memory head room you should be
fine. The current cache implementation requires more headroom than is
ideal. Limiting cacheObjectMaxSize should reduce the headroom required.

Mark



Mark Thomas
2008-09-26 11:59:35 UTC
Permalink
Post by Mark Thomas
Post by DIGLLOYD INC
Is there a way to limit the size of the file that will be cached?
Not at present.
I have added a configuration option for this to trunk and proposed it for 6.0.x

Mark


DIGLLOYD INC
2008-09-26 18:07:38 UTC
Permalink
Thanks to multiple people responding to this!

My site diglloyd.com serves almost entirely static content, with many
large JPEG files.

I have set:
CATALINA_OPTS=-Xmx1024M

That's limiting the JVM to 1GB of memory. And in
tomcat/conf/context.xml, I set:

<Context cacheMaxSize="1500000" cacheTTL="60000"
cachingAllowed="true" >

Therein presumably lies the problem. I'll change CATALINA_OPTS to be
2GB or so and retry.
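
A side note on where that setting usually lives: with the stock catalina.sh
startup scripts, CATALINA_OPTS can go in bin/setenv.sh, which catalina.sh
sources automatically if it exists. A minimal sketch (the heap value here is
just an example to adjust):

  # $CATALINA_HOME/bin/setenv.sh
  # Heap for Tomcat; on a 32-bit JVM the usable maximum is typically below ~2GB.
  CATALINA_OPTS="-Xmx2048m"
  export CATALINA_OPTS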


Mark, it would be helpful to be able to say "don't cache anything
larger than N kilobytes/megabytes". I'd probably use a setting of 2MB
or so for that.

Lloyd

Lloyd Chambers
http://diglloyd.com

[Mac OS X 10.5.2 Intel, Tomcat 6.0.16]
Mark Thomas
2008-09-26 18:19:38 UTC
Permalink
Post by DIGLLOYD INC
Thanks to multiple people responding to this!
My site diglloyd.com serves almost entirely static content, with many
large JPEG files.
CATALINA_OPTS=-Xmx1024M
That's limiting the JVM to 1GB of memory. And in
<Context cacheMaxSize="1500000" cacheTTL="60000" cachingAllowed="true" >
Therein presumably lies the problem. I'll change CATALINA_OPTS to be
2GB or so and retry.
You'll probably need to be using a 64-bit JVM to set that to 2GB.
Post by DIGLLOYD INC
Mark, it would be helpful to be able to say "don't cache anything larger
the N kilobytes/megabytes". I'd probably use a setting of 2MB or so for
that.
cacheObjectMaxSize is now configurable (at least in trunk). The only
limitation is that it can't be greater than cacheMaxSize/20.
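
For illustration only, and assuming the new attribute takes a value in
kilobytes like cacheMaxSize does, the Context element could then look
something like:

  <Context cachingAllowed="true"
           cacheMaxSize="102400"
           cacheObjectMaxSize="2048"
           cacheTTL="60000" />

i.e. a 100MB cache that never holds any single file larger than about 2MB
(and 2048 is safely below 102400/20, so the constraint above is met).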

Mark



DIGLLOYD INC
2008-09-26 18:42:21 UTC
Permalink
Thanks Mark.

How to force the 64-bit JVM to run?

For now I'm using -Xmx2047M, which solves the immediate problem.

Lloyd Chambers
http://diglloyd.com

[Mac OS X 10.5.5 Intel, Tomcat 6.0.16]
Mark Thomas
2008-09-26 19:19:33 UTC
Permalink
Post by DIGLLOYD INC
Thanks Mark.
How to force the 64-bit JVM to run?
For now I'm using -Xmx2047M, which solves the immediate problem.
The latest Java updates for Mac include 32 and 64 bit 1.6.0_07.

You need to use the Java Preferences (Applications > Util > Java - or
something close to that) to change the default JVM.

Mark
Johnny Kewl
2008-09-26 20:12:31 UTC
Permalink
----- Original Message -----
From: "DIGLLOYD INC" <***@diglloyd.com>
To: "Tomcat Users List" <***@tomcat.apache.org>
Sent: Friday, September 26, 2008 8:07 PM
Subject: Re: tomcat won't download large files -- out of memory error
Post by DIGLLOYD INC
Thanks to multiple people responding to this!
My site diglloyd.com serves almost entirely static content, with many
large JPEG files.
CATALINA_OPTS=-Xmx1024M
That's limiting the JVM to 1GB of memory. And in tomcat/conf/ context.xml,
<Context cacheMaxSize="1500000" cacheTTL="60000" cachingAllowed="true" >
Therein presumably lies the problem. I'll change CATALINA_OPTS to be 2GB
or so and retry.
Mark, it would be helpful to be able to say "don't cache anything larger
the N kilobytes/megabytes". I'd probably use a setting of 2MB or so for
that.
Lloyd
Ah, I see what you're up to... memory management is tricky... there's a tool
called VisualGC that you may be able to get running on Apple, just for
interest's sake.
The memory management is quite complex and is split into aging heaps... so
you don't really get one big clump of memory, it's pro-rated out...
There are advanced settings if you read up on it... but you also need to see
it... to see what the JRE is actually up to...

That's just of interest... but you're into serious media, so maybe you should
try something like this...
Make a servlet and read the files from disk yourself... it's not hard...
something like this... but google for more:
http://greatwebguy.com/programming/java/java-image-resizer-servlet/


So it's pretty cool having your own servlet...

But here's what you can try, and it will give you a lot of control...

Map the servlet to, say... *.jpg

So... that means your servlet overrides the default servlet.

But now do this...

... look at the size of the file... if it's huge... you do it... and send
it...
If it's "normal static content"... forward it to the default servlet...

It's a kind of filter... and it will give you the ability to use TC's caching
and that very smart default servlet... or bypass it...

I think something like that will give you all the control you need...
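
Something along these lines (untested, just a sketch of the shape of it; the
2 MB cut-off is arbitrary, and the dispatcher name "default" matches the
default servlet declared in a stock Tomcat conf/web.xml, so check yours):

import java.io.BufferedInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Fronting servlet, mapped to e.g. *.jpg or *.zip in web.xml.
// Small files are forwarded to Tomcat's default servlet (so they stay
// cacheable); big files are streamed straight from disk in small chunks
// so they never go anywhere near the resource cache.
public class LargeFileServlet extends HttpServlet {

    private static final long THRESHOLD = 2L * 1024 * 1024; // 2 MB, pick your own

    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {

        String path = req.getServletPath();               // e.g. /zips/big.zip
        String real = getServletContext().getRealPath(path);
        File file = (real == null) ? null : new File(real);

        if (file == null || !file.isFile()) {
            resp.sendError(HttpServletResponse.SC_NOT_FOUND);
            return;
        }

        if (file.length() <= THRESHOLD) {
            // "Normal" static content: hand it back to the default servlet.
            getServletContext().getNamedDispatcher("default").forward(req, resp);
            return;
        }

        // Large file: stream it ourselves.
        String mime = getServletContext().getMimeType(path);
        resp.setContentType(mime != null ? mime : "application/octet-stream");
        resp.setContentLength((int) file.length());

        InputStream in = new BufferedInputStream(new FileInputStream(file));
        try {
            OutputStream out = resp.getOutputStream();
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        } finally {
            in.close();
        }
    }
}

Whether that's worth the extra moving parts, versus just giving the JVM more
headroom as Mark suggests, is a judgement call.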

It seems on Apple you don't really have too many choices when it comes to
JREs... the machine model is more or less going to determine whether you're
on 64 bit...
... but I think you can play a little and make it work efficiently anyway...
possibly just letting thumbnails cache... and you can control the browser-side
caching to the nth degree as well... by making a fronting servlet.

... maybe... have fun ;)

---------------------------------------------------------------------------
HARBOR : http://www.kewlstuff.co.za/index.htm
The most powerful application server on earth.
The only real POJO Application Server.
See it in Action : http://www.kewlstuff.co.za/cd_tut_swf/whatisejb1.htm
---------------------------------------------------------------------------


