2 Replies, latest reply: Apr 9, 2014 1:54 AM by taohiko

    WGET and DBMS_SCHEDULER

    taohiko

      Hi All

       

I created the shell scripts below:

      =============================================

(w.sh)

      #!/bin/bash
      # USERNAME, PASSWORD, FILE_PATH and URL1..URL6 are assumed to be set before this point

      $FILE_PATH/wget_file.sh $USERNAME $PASSWORD $FILE_PATH $URL1

      $FILE_PATH/wget_file.sh $USERNAME $PASSWORD $FILE_PATH $URL2

      $FILE_PATH/wget_file.sh $USERNAME $PASSWORD $FILE_PATH $URL3

      $FILE_PATH/wget_file.sh $USERNAME $PASSWORD $FILE_PATH $URL4

      $FILE_PATH/wget_file.sh $USERNAME $PASSWORD $FILE_PATH $URL5

      $FILE_PATH/wget_file.sh $USERNAME $PASSWORD $FILE_PATH $URL6

      =============================================

      (wget_file.sh)

      #!/bin/bash

       

      /usr/sfw/bin/wget --http-user="$1"  --http-passwd="$2"  -P $3 $4

      =============================================

       

Both scripts run at the OS level without any error; the downloads finish in about 20 seconds. I then set up a job with DBMS_SCHEDULER as below:

begin
        dbms_scheduler.drop_job('W');
        dbms_scheduler.create_job(
          job_name            => 'W',
          job_type            => 'EXECUTABLE',
          job_action          => '/usr/bin/bash',
          number_of_arguments => 1,
          enabled             => false,
          auto_drop           => false);
        dbms_scheduler.set_job_argument_value('W', 1, argument_value => '/tmp/w.sh');
        dbms_scheduler.enable('W');
      end;
      /
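
      (At the OS level, the job above simply runs the equivalent of:)

      /usr/bin/bash /tmp/w.sh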

       

Then I ran the job, but it hung. Checking the processes on the server, I found 2 processes; only 2 files had been downloaded and the job did not finish.

So I killed the processes with kill -9, and when I re-checked the files in the path afterwards, all files had been downloaded completely.
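
      For reference, this is roughly how I found and killed them (the PIDs below are placeholders):

      # find the wget / wrapper processes left behind by the job
      ps -ef | grep wget
      # kill them by PID
      kill -9 <pid1> <pid2>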

       

I tried a 2nd time and waited more than 5 minutes; it was the same, just 2 files in the path. I killed the processes again, and again all files had been downloaded completely afterwards.

I tried 3 more times with the same result.

       

I tested with only a single command, as below:

      /usr/sfw/bin/wget  --http-user="$USERNAME"  --http-passwd="$PASSWORD" -P $FILE_PATH $URL1

       

and ran it with dbms_scheduler.run_job. It ran for a very long time; when I checked, the file existed in the path, but the Oracle session never returned and kept running.

       

Do you have any experience like this, and how can I resolve it?

       

      Thank you,

      Hiko

        • 1. Re: WGET and DBMS_SCHEDULER
          taohiko

From my tests, the problem seems to be related to the file size.

           

If the file size is more than 10 MB, the wget session hangs. I tested downloading 5 files, each less than 10 MB, and that works.

But when I downloaded just one file larger than 10 MB (about 11 MB), the wget session hung and I had to kill the process manually.

           

Does anyone know of a limitation, or a wget parameter, that controls this? I read the wget manual but could not find anything related to it.
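
          The only output-related switches I could see in the manual are below, but I am not sure whether they are related to this hang (the log file path is just an example):

          -q                  # turn off all of wget's output
          -nv                 # non-verbose: no progress bar, only basic messages
          -o /tmp/wget.log    # write wget's messages to a log file instead of the screen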

           

          Thank you,

          Hiko

          • 2. Re: WGET and DBMS_SCHEDULER
            taohiko

After adding the -O option to the wget command, and changing the "more" command I was using to read the log file to "cat", the problem is gone, which is very, very weird.
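
            For anyone hitting the same problem, the adjusted wget_file.sh is roughly as below (the output and log file names are just examples, not my exact final version):

            =============================================

            (wget_file.sh, adjusted)

            #!/bin/bash

            # -O writes the download to an explicit file name instead of letting wget pick one under -P
            # -o sends wget's messages to a log file instead of the job's output
            OUTFILE="$3/$(basename "$4")"
            /usr/sfw/bin/wget --http-user="$1" --http-passwd="$2" -O "$OUTFILE" -o "$3/wget.log" "$4"

            # read the log back with cat ("more" was the version that caused trouble)
            cat "$3/wget.log"

            =============================================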