Listener error, kindly help
Hi,
My system is a Linux ... machine with 2 GB of RAM; the load average from uptime is 0.14, 0.19, 0.18.
I have Oracle 10g installed.
I am getting the following errors in the listener log:
TNS-12518: TNS:listener could not hand off client connection
TNS-12564: TNS:connection refused
TNS-12602: TNS: Connection Pooling limit reached
TNS-00524: Current operation is still in progress
Linux Error: 115: Operation now in progress
There are no errors in the alert log.
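Since TNS-12602 mentions the connection pooling limit, I was planning to check how busy the dispatchers and shared servers actually get. This is only a rough sketch against the standard V$DISPATCHER, V$CIRCUIT and V$SHARED_SERVER views; I have not captured any numbers during the error window yet:

-- how busy each dispatcher has been since startup (BUSY and IDLE are in 1/100s)
SELECT name, status,
       ROUND(busy / NULLIF(busy + idle, 0) * 100, 2) AS pct_busy
  FROM v$dispatcher;

-- virtual circuits currently in use versus my CIRCUITS=1000 setting
SELECT COUNT(*) AS circuits_in_use FROM v$circuit;

-- shared server processes currently running
SELECT COUNT(*) AS shared_servers FROM v$shared_server;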
free
             total       used       free     shared    buffers     cached
Mem:       2055272    1920844     134428          0      18812    1608024
-/+ buffers/cache:     294008    1761264
Swap:      8193064     262960    7930104
SQL> show sga;
Total System Global Area 1073741824 bytes
Fixed Size                  1271468 bytes
Variable Size             364906836 bytes
Database Buffers          683671552 bytes
Redo Buffers               23891968 bytes
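In case the hand-off failures are resource related, I also want to check how close I get to the PROCESSES and SESSIONS limits. A quick sketch against V$RESOURCE_LIMIT (again, not yet run at the time the errors appear):

-- current and high-water utilisation versus the configured limits
SELECT resource_name, current_utilization, max_utilization, limit_value
  FROM v$resource_limit
 WHERE resource_name IN ('processes', 'sessions');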
My init file:
DBL.__db_cache_size=511705088
DBL.__java_pool_size=4194304
DBL.__large_pool_size=54525952
DBL.__shared_pool_size=473956352
DBL.__streams_pool_size=4194304
*._bloom_filter_enabled=FALSE
*._DB_BLOCK_LRU_LATCHES=8
*._spin_count=5000
*.archive_lag_target=1800
*.audit_file_dest='/dbhome/dbl/dbf/audit'
*.audit_sys_operations=TRUE
*.audit_trail='db_extended'
*.BACKGROUND_DUMP_DEST='/dbhome/dbl/dbf/dump'
*.circuits=1000
*.COMPATIBLE='10.2.0.4'
*.CONTROL_FILES='/dbhome/dbl/dbf/cntl1/DBL_cntl1_01.ctl','/dbhome/dbl/dbf/cntl2/DBL_cntl2_02.ctl','/dbhome/dbl/dbf/cntl3/DBL_cntl3_03.ctl'
*.CORE_DUMP_DEST='/dbhome/dbl/dbf/dump'
*.CURSOR_SHARING='similar'
*.DB_BLOCK_SIZE=8192
*.DB_CACHE_SIZE=100M
*.db_file_multiblock_read_count=64
*.DB_NAME='dbl'
*.db_writer_processes=4
*.DISPATCHERS='(PROTOCOL=TCP)(DISPATCHERS=10)'
*.FAST_START_MTTR_TARGET=300
*.INSTANCE_NAME='DBL'
*.JOB_QUEUE_PROCESSES=1
*.large_pool_size=50M#128MB as on 01.03.2005 for shared servers
*.LOCAL_LISTENER='(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(HOST=XXX.XX.XX.XXX)(PORT=1521))(ADDRESS=(PROTOCOL=TCP)(HOST=XXX.XX.XX.XXX)(PORT=1521)))'
*.LOG_ARCHIVE_DEST_1='LOCATION=/dbhome/dbl/arch/'
*.log_archive_dest_2='LOCATION=/orabackup1/dbhome/dbl/arch/'
*.log_archive_dest_state_2='ENABLE'
*.LOG_ARCHIVE_FORMAT='dbl%t_%r_%s.arc'
*.LOG_ARCHIVE_START=TRUE
*.log_buffer=20000000
*.max_dispatchers=50
*.MAX_SHARED_SERVERS=50
*.O7_DICTIONARY_ACCESSIBILITY=FALSE
*.OPEN_CURSORS=300
*.parallel_adaptive_multi_user=TRUE
*.parallel_automatic_tuning=TRUE
*.parallel_max_servers=20
*.parallel_min_servers=2
*.PROCESSES=500
*.QUERY_REWRITE_ENABLED='TRUE'
*.QUERY_REWRITE_INTEGRITY='ENFORCED'
*.recovery_parallelism=5
*.recyclebin='OFF'
*.REMOTE_LOGIN_PASSWORDFILE='EXCLUSIVE'
*.remote_os_authent=FALSE
*.sessions=1000
*.SGA_TARGET=1024M
*.SHARED_POOL_SIZE=100M
*.SHARED_SERVERS=10
*.sort_area_size=10485760
*.sql92_security=TRUE
*.statistics_level='TYPICAL'
*.TIMED_STATISTICS=TRUE
*.UNDO_MANAGEMENT='AUTO'
*.UNDO_TABLESPACE='undotbs'
*.USER_DUMP_DEST='/dbhome/dbl/dbf/dump'
How can I resolve these errors, which I am getting intermittently?
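One more thing I was wondering: the dispatchers are configured without connection pooling, so would adding POOL=ON help with TNS-12602? Just a sketch of the change I have in mind, not something I have applied yet:

*.DISPATCHERS='(PROTOCOL=TCP)(DISPATCHERS=10)(POOL=ON)'

Or is raising DISPATCHERS / SHARED_SERVERS the better direction here? Any pointers would be appreciated.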