2 Replies Latest reply on Apr 9, 2014 6:54 AM by taohiko



      Hi All


      I created a shell script as below:

      /usr/sfw/bin/wget --http-user="$1" --http-passwd="$2" -P "$3" "$4"
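      As a side note, quoting the positional parameters matters here: if the download path or the URL ever contains a space, an unquoted $3 or $4 word-splits into separate arguments before wget sees them. A quick illustration (the path value is made up):

```shell
#!/bin/sh
# Show why "$3"/"$4" should be quoted: an unquoted variable whose value
# contains a space word-splits into multiple arguments.
path="/tmp/my downloads"        # made-up example path

count_args() { echo $#; }       # prints how many arguments it received

count_args $path      # unquoted: splits on the space -> prints 2
count_args "$path"    # quoted: stays one argument    -> prints 1
```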



      The script runs on the OS without any error (the download takes about 20 seconds). Then I created a job with DBMS_SCHEDULER as below:



      dbms_scheduler.create_job(
          job_name            => 'W',
          job_type            => 'EXECUTABLE',
          job_action          => '/usr/bin/bash',
          number_of_arguments => 1,
          enabled             => false,
          auto_drop           => false
      );






      Then I ran it, but it hung. I checked the processes on the server and found 2 processes; only 2 files had been downloaded and the job did not finish.

      So I killed the processes with kill -9, and when I re-checked the path afterwards, all files had been downloaded completely.


      I tried a 2nd time and waited more than 5 minutes: still the same, just 2 files in the path. I killed the processes again, and after that all files had again been downloaded completely.

      I tried 3 more times with the same result.


      I tested with only one command as below:

      /usr/sfw/bin/wget --http-user="$USERNAME" --http-passwd="$PASSWORD" -P "$FILE_PATH" "$URL1"


      and ran it with dbms_scheduler.run_job. It ran for a very long time; I checked the path and the file exists, but the Oracle session never returned and just kept running.


      Has anyone seen anything like this, and how can I resolve it?


      Thank you,


        • 1. Re: WGET and DBMS_SCHEDULER

          From my testing, the problem seems to be related to file size.


          If a file is larger than 10 MB, the wget session hangs. I tested downloading 5 files, each smaller than 10 MB, and that works.

          But when I downloaded just one file larger than 10 MB (about 11 MB), the wget session hung and I had to kill the process manually.
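          One possible explanation (my hypothesis, not confirmed in the thread): wget writes its progress display to stderr, and if the process spawned by the scheduler never drains that pipe, wget blocks as soon as the pipe buffer (commonly 16-64 KB) fills, which a larger download reaches sooner. A rough back-of-the-envelope estimate, assuming wget's default dot display (one dot per KB, 50 dots per line, roughly 80 bytes per rendered line):

```shell
#!/bin/sh
# Rough estimate of progress output per MB downloaded, under the assumed
# dot-display defaults above: ~21 lines per MB at ~80 bytes each.
bytes_per_mb=$(( (1024 / 50 + 1) * 80 ))
echo "per MB:    $bytes_per_mb bytes"
echo "per 10 MB: $(( bytes_per_mb * 10 )) bytes"
```

          Around 10 MB the accumulated output would pass roughly 16 KB, which would match the observed threshold if the pipe buffer on this system is in the 16 KB range; this is only an estimate under the assumptions above.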


          Does anyone know of a limitation or a wget parameter that controls this? I read the wget manual but could not find anything related to it.


          Thank you,


          • 2. Re: WGET and DBMS_SCHEDULER

            After adding the -O option to the wget command, and changing the "more" command I was using to read the log file to "cat", the problem is gone, which is very, very weird.
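            A possible reason capturing the output helped (my hypothesis, consistent with the symptoms above): a process whose console output goes to a file, rather than a pipe that nothing reads, can never block on a full pipe buffer. A minimal sketch with a stand-in for wget's progress chatter (all names here are made up):

```shell
#!/bin/sh
# chatter stands in for wget: it prints 128 KB of progress-style output on
# stderr, then reports success on stdout.
chatter() {
    i=0
    while [ "$i" -lt 128 ]; do
        printf '%1023s\n' . >&2    # one 1 KB "progress" line to stderr
        i=$((i + 1))
    done
    echo "download finished"
}

# With stderr sent to a log file, the writer always has a sink and the
# command completes no matter how much it prints.
chatter 2>/tmp/chatter.log
```

            If stderr were instead left on a pipe that the parent never reads, a writer like this would stop making progress once the pipe buffer filled, which matches the hang-until-killed behaviour described above.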