"Too many open files in system" errors are bringing the database down
Hello experts, I am very worried about the following problem and I really hope you can help me.
Server details:
OS: Suse Linux Enterprise 10
RAM: 32 GB
CPU: Intel quad-core
DB: Three RAC database instances (version 11.1.0.7) on the same host.
Problem: The database instances have started reporting this error: Linux-x86_64 Error: 23: Too many open files in system
together with these other error messages:
ORA-27505: IPC error destroying a port
ORA-27300: OS system dependent operation:close failed with status: 9
ORA-27301: OS failure message: Bad file descriptor
ORA-27302: failure occurred at: skgxpdelpt1
ORA-01115: IO error reading block from file 105 (block # 18845)
ORA-01110: data file 105: '+DATOS/dac/datafile/auditoria.519.738586803'
ORA-15081: failed to submit an I/O operation to a disk
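In case it helps, this is roughly how I have been counting the file descriptors each Oracle process holds (the PID below is just a placeholder, I run it against the real server process PIDs):

# count open file descriptors for a single process
ls /proc/<oracle_pid>/fd | wc -l
# or count everything held by the oracle user
lsof -u oracle | wc -l

Please tell me if this is not the right way to measure it.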
At the same time, looking in /var/log/messages as root, I see the same problem reported:
Feb 7 11:03:58 bls3-1-1 syslog-ng[3346]: Cannot open file /var/log/mail.err for writing (Too many open files in system)
Feb 7 11:04:56 bls3-1-1 kernel: VFS: file-max limit 131072 reached
Feb 7 11:05:05 bls3-1-1 kernel: oracle[12766]: segfault at fffffffffffffff0 rip 0000000007c76323 rsp 00007fff466dc780 error 4
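To see how close the whole system is to the limit, I check the system-wide counters like this (as far as I understand, the first number of file-nr is the allocated handles and the last one is the maximum, but please correct me if I am reading it wrong):

# allocated handles, free handles, and the system-wide maximum
cat /proc/sys/fs/file-nr
# the current value of the kernel parameter
sysctl fs.file-max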
I think the cause is clear: I probably need to increase the fs.file-max kernel parameter, but I do not know how to choose a good value. Here are my sysctl.conf and limits.conf files (after the listings I have also written out what I was planning to run):
sysctl.conf
kernel.shmall = 2097152
kernel.shmmax = 17179869184
kernel.shmmni = 4096
kernel.sem = 250 32000 100 128
fs.file-max = 6553600
net.ipv4.ip_local_port_range = 1024 65000
net.core.rmem_default = 4194304
net.core.rmem_max = 4194304
net.core.wmem_default = 262144
net.core.wmem_max = 4194304
limits.conf
oracle soft nproc 2047
oracle hard nproc 16384
oracle soft nofile 1024
oracle hard nofile 65536
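What confuses me is that sysctl.conf already contains fs.file-max = 6553600, yet the kernel message says the limit reached was 131072, so maybe the value from the file was never applied to the running kernel. If I do have to change it, I was planning something like the lines below, where 6553600 is only the example value copied from my own sysctl.conf, not a recommendation:

# check the value the running kernel is actually using
sysctl fs.file-max
# reload all the values from /etc/sysctl.conf without a reboot
sysctl -p
# or set this one parameter explicitly at runtime
sysctl -w fs.file-max=6553600

Is this the right approach, and how do I decide a safe value for 3 RAC instances on one host?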