problems with too many open files under linux
843811, Jan 13 2005 (edited Jan 21 2005)

We use a static hash pool with, e.g., 50 file handles open. Under Linux, however, it appears that every thread holds these 50 file handles; at least that is what the lsof command reports. With, e.g., 50 threads this means 50 x 50 files appear to be open, which leads to a "too many open files" exception in stdout.txt and several errors in WebSphere.
Normally I would have thought this impossible with threads, but I read somewhere that the 1.3 VM under Linux treats every thread like a separate process.
My question: is it really possible that under Linux we end up with "threads" x "open files" file descriptors in use? And is this handled differently on other operating systems?
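One way to check whether the handles are really duplicated per thread, rather than lsof simply reporting the same descriptors once for each thread, is to list /proc/self/fd, which shows the descriptor table of the whole process. A minimal sketch, assuming a Linux /proc filesystem (class name and the use of /proc/self/status as a test file are my own choices, not from the original setup):

```java
import java.io.File;
import java.io.FileInputStream;

public class FdCheck {
    // Count entries in /proc/self/fd: the open-descriptor table of the
    // whole process. All Java threads share this single table.
    static int countFds() {
        String[] entries = new File("/proc/self/fd").list();
        return entries == null ? -1 : entries.length;
    }

    public static void main(String[] args) throws Exception {
        final int[] seenByWorker = new int[1];

        // Hold one extra file open while both threads count.
        FileInputStream in = new FileInputStream("/proc/self/status");

        Thread worker = new Thread(new Runnable() {
            public void run() {
                // A second thread sees the same table, not a private copy.
                seenByWorker[0] = countFds();
            }
        });
        worker.start();
        worker.join();

        int seenByMain = countFds();
        in.close();

        // Both counts are equal: descriptors are per-process, so even if
        // lsof prints one line per thread, the files are only open once.
        System.out.println("main sees " + seenByMain
                + " fds, worker saw " + seenByWorker[0]);
    }
}
```

If the two counts agree, the 50 x 50 figure is an artifact of how lsof displays LinuxThreads-style threads, not 2500 actually open files.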
Thank you for your answers.