One of my clients needed a lot of search collections, and even more data to put in them (especially massive archives of PDF files, XLS spreadsheets, Word documents, etc.), for a very large intranet. So I tried to plan ahead by installing Solr on a separate server (Windows 2008 R2 64-bit with a decent chunk of RAM). You can find the separate installer on Adobe's website. For our initial tests it worked out great, but when we started throwing more data at it, it quickly ran out of memory.
The first thing I did, before I even created the first collection, was to increase the memory available to the JVM. To do this, edit the solr.lax file and change the default setting of -Xmx256m to your preferred maximum memory setting (I also prefer to set the minimum to the same value; see below). What I quickly found was that the default Solr install comes with a 32-bit version of Jetty, and a 32-bit JVM maxes out at about 1.5GB of RAM. So even though we had plenty of RAM to spare and were running on a 64-bit OS, the JVM wasn't allowed to use the extra memory.
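As a rough sketch of that edit (the exact property name can vary between builds; I'm assuming the standard InstallAnywhere LAX layout here, where the JVM arguments live on the `lax.nl.java.option.additional` line):

```
# solr.lax -- extra arguments passed to the JVM at startup
# (property name assumed from the standard LAX file format)

# before:
lax.nl.java.option.additional=-Xmx256m

# after: raise the max heap, and pin the min heap to the same
# value so the JVM grabs its memory up front
lax.nl.java.option.additional=-Xms1024m -Xmx1024m
```

Pick whatever value fits your box; on the stock 32-bit JVM, anything much past 1.5GB will simply fail to start.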
The solution ended up being very simple (even for a non-Java person like myself). I downloaded a 64-bit JVM (in my case I just downloaded Sun's 64-bit JDK to test, though I'd suggest the JRE) and installed it on the box. I then edited the solr.lax file, commented out the single line where Jetty calls javaw.exe (on Windows it's that file; on other OSes the filename and path format will be slightly different), and instead pointed it at the new 64-bit JVM's javaw.exe.
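Sketching that change under the same assumption (the VM path sits on the `lax.nl.current.vm` line in a standard LAX file; the 64-bit install path below is just an example, so adjust it to wherever your JVM actually landed):

```
# solr.lax -- which JVM binary gets launched
# (property name assumed from the standard LAX format)

# before: the bundled 32-bit JVM
#lax.nl.current.vm=jre\bin\javaw.exe

# after: point at the 64-bit javaw.exe
# (example path -- use your actual 64-bit JVM install location)
lax.nl.current.vm=C:\Program Files\Java\jre6\bin\javaw.exe
```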
I then set the min and max memory settings to 3GB (-Xms3072m -Xmx3072m) and started Solr back up. No more out-of-memory errors (no more Java heap errors).
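If you want to sanity-check which JVM you ended up on, a tiny Java class can report the architecture and the heap ceiling actually in effect (`sun.arch.data.model` is a Sun/Oracle-specific property, so `os.arch` is included as the portable fallback):

```java
// Bitness.java -- print the JVM's data model and effective max heap.
public class Bitness {
    public static void main(String[] args) {
        // Sun/Oracle JVMs report "32" or "64" here; other vendors may not.
        String model = System.getProperty("sun.arch.data.model", "unknown");
        String arch = System.getProperty("os.arch");
        System.out.println("data model: " + model + " (os.arch=" + arch + ")");

        // The heap ceiling the JVM actually accepted (reflects -Xmx).
        long maxHeapMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("max heap: ~" + maxHeapMb + " MB");
    }
}
```

Run it with the same javaw.exe/java.exe you pointed solr.lax at; a 32-bit JVM will refuse to even start with -Xmx3072m, while the 64-bit one reports the full heap.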
---
I didn't figure it all out on my own, though. Several people helped me reach the final result, and I'd like to thank them (in order of me bugging them): Mark Mandel, Matt Woodward, and Sean Coyne.