Database Software

External UNIX script with background processes hanging

_Dylan_ — Aug 25 2011, edited Aug 25 2011

I'm revising a process that we use to export data from our Oracle databases and import it into one of our Netezza databases. Basically, we use a pair of UNIX commands: one to extract data to a pipe using a third-party app (FastReader), and another that is the bulk-loader command for Netezza (nzload).
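The extract-to-pipe pattern described above can be sketched with stand-in commands; `seq` plays the role of the extractor and `wc -l` plays the role of the loader (FastReader and nzload would take these roles in the real job, and `PIPE_NAME` here is a throwaway path, not the real pipe):

```shell
# Throwaway named pipe for the demo.
PIPE_NAME=/tmp/demo_pipe.$$
mkfifo "$PIPE_NAME"

# Producer writes into the pipe in the background
# (it blocks on open until a reader attaches)...
seq 1 1000 > "$PIPE_NAME" &
PRODUCER_PID=$!

# ...while the consumer reads from it in the foreground.
ROWS=$(wc -l < "$PIPE_NAME")

# Collect the producer's exit status, then clean up.
wait "$PRODUCER_PID"
echo "loaded $ROWS rows"
rm -f "$PIPE_NAME"
```

The key property is that neither side buffers the full data set on disk: the writer and reader must be running concurrently, which is why the real script has to background FastReader before starting nzload.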

I have a shell script that does what I want. I've tested it as the user that we run external jobs as. All filenames use full paths. Other external jobs doing similar things work fine, so the settings on all external-job-related files seem to be good.

Below is the block of code that seems to be having an issue. FastReader starts up in the background (I get the echoes for the FastReader PID, and one pass through the while loop). But once FastReader has fully started up, all other processing is suspended, as if it were running in the foreground again. If I kill the FastReader process, the script continues to execute.

What I find weird about this is that the current process I'm replacing starts multiple instances of FastReader in the background and waits for all of them to finish without any issue.

Anybody have any thoughts on what the issue may be?
# Start FastReader in the background and get its Process ID.
FastReader config=/usr/users/dw/netezza_loads/$1/$2.ini &
FASTREADER_PID=$!
echo "*** FastReader PID=$FASTREADER_PID ***"

# Wait for FastReader to create the pipe before allowing nzload to begin.
RETRY_COUNT=0
while [ ! -e "$PIPE_NAME" ]; do
  if [ "$RETRY_COUNT" -eq 5 ]; then
    echo "*** ERROR!!! Hit limit while waiting for FastReader to start ***"
    log_and_exit "$ERRC_NO_PIPE_DETECTED"
  else
    echo "*** Waiting for $PIPE_NAME to be created ***"
  fi
  sleep 5
  RETRY_COUNT=$((RETRY_COUNT + 1))
done
echo "*** Named pipe created. Starting nzload ***"
Edit:
Also, the above script is called from within a wrapper script that redirects stderr and stdout to files. I worked around the problem by breaking the above script into two and starting each of those shell scripts in the background. That makes passing exit codes around a pain, but it's workable. I'm still curious whether anybody knows why I can't get more than one background process running per script.
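For the exit-code headache mentioned above, one common approach is to keep both halves as background jobs of a single parent script and use `wait <pid>` to recover each job's exit status, so no temp files are needed to pass codes around. This is a minimal sketch: the two subshells (with `sleep` and hard-coded exit codes) are stand-ins for the real extract and load scripts, not their actual names:

```shell
# Stand-in for the extract half (succeeds).
( sleep 1; exit 0 ) &
EXTRACT_PID=$!

# Stand-in for the load half (fails with code 2).
( sleep 1; exit 2 ) &
LOAD_PID=$!

# wait <pid> returns that specific job's exit status.
wait "$EXTRACT_PID"; EXTRACT_RC=$?
wait "$LOAD_PID";    LOAD_RC=$?

echo "extract rc=$EXTRACT_RC, load rc=$LOAD_RC"
```

The parent can then combine the two codes however it likes (e.g. exit nonzero if either half failed), which keeps the error handling in one place.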
Post Details
Locked on Sep 22 2011
Added on Aug 25 2011
0 comments
275 views