This discussion is archived
6 Replies. Latest reply: Mar 22, 2012 3:52 PM by Dude!

What to do when not all pipes were consumed?

dbas_s2it Newbie
Hello,
I'm using 10 pipes to export a database while gzipping the files in the background.
Only 7 files were created by the export process, so only 7 pipes were consumed.
How can I get rid of the 3 pipes that weren't consumed?
Even after deleting the pipe files, the background processes are still there.
Thanks in advance.
Ricardo.
  • 1. Re: What to do when not all pipes were consumed?
    BillyVerreynne Oracle ACE
    Deleting pipes (or files) will seldom cause the processes using those pipes (or files) to fail and terminate. Stuck processes like that most often need to be dealt with by The Large Hammer that goes by the name of SIGKILL.

    Use the ps (process listing) command to find the stuck processes, then send a kill signal to each one using kill -9 <pid>.
  • 2. Re: What to do when not all pipes were consumed?
    dbas_s2it Newbie
    Hello, Billy.
    Thanks for the help.
    I was hoping there was an automatic way to terminate those processes.
    So I will need to use kill -9 <pid>, which is not simple to do inside a script.
  • 3. Re: What to do when not all pipes were consumed?
    Dude! Guru
    More help might be possible if you can show us the command or script and explain what you are trying to accomplish.
  • 4. Re: What to do when not all pipes were consumed?
    dbas_s2it Newbie
    Hello, Dude.
    My script is something like this:

    # go to work directory
    cd /export
    # create pipes to gzip
    mknod pipe_exp_1 p
    mknod pipe_exp_2 p
    mknod pipe_exp_3 p
    # gzip in background
    gzip < pipe_exp_1 > exp_full_1.dmp.gz &
    gzip < pipe_exp_2 > exp_full_2.dmp.gz &
    gzip < pipe_exp_3 > exp_full_3.dmp.gz &
    # export database
    exp system/passwd full=y direct=y consistent=y compress=n buffer=10485760 filesize=8192m file=pipe_exp_1,pipe_exp_2,pipe_exp_3

    When the export only uses one or two of the pipes, the remaining gzip process will continue to run in the background even after the script process ends.
  • 5. Re: What to do when not all pipes were consumed?
    Dude! Guru
    process will continue to run in background even after the script process ends up
    As far as I'm aware, all background processes should be killed once the parent shell executing the commands gets killed. If you source the script using (. ./), the background processes will be killed once you log out of your current session; otherwise, when the script exits.

    Btw, if you need to split the export dump file and don't know how many you will need, the following approach should be useful:
    # go to work directory
    cd /export
    rm -f compress_pipe
    rm -f export_pipe
    
    # Create new pipes
    mknod compress_pipe p
    mknod export_pipe p
    chmod +rw export_pipe compress_pipe
    
    # Split and compress background processes.
    nohup split -b 8192m < export_pipe &
    nohup gzip < compress_pipe > export_pipe &
    
    # export database
    exp system/passwd full=y direct=y consistent=y compress=n buffer=10485760 file=compress_pipe 
  • 6. Re: What to do when not all pipes were consumed?
    Dude! Guru
    I thought the use of such pipes was quite interesting and did some more testing that should give you the answer to your question. For example:

    <pre>
    mknod gunzip_pipe p
    gunzip < gunzip_pipe > testfile &
    [1] 12745
    jobs -l
    [1]+ 12745 Running gunzip < gunzip_pipe > testfile &

    echo "test" > gunzip_pipe
    gunzip: stdin: not in gzip format
    [1]+ Exit 1 gunzip < gunzip_pipe > testfile

    gzip < install.log > gunzip_pipe
    [1]+ Done gunzip < gunzip_pipe > testfile

    </pre>

    I think the above clearly demonstrates that the data sent through the pipe determines the "exit" status of the application accepting the input: bad input makes it exit with status 1, while a complete stream lets it finish with status 0.

    The reason why your process(es) are still active after the export is that not all pipes were used; the leftover processes are still blocked waiting for input and have no "exit" status yet.
