> I tried to export a table that has 80,000 records and I got a timeout error.

You don't say what utility you are using for the export. I'm guessing the Data Unload feature in SQL Workshop — not really a tool I'd use for a robust recovery scheme. For what you describe, I'd create a script that exports the table(s) you need using Data Pump (expdp); a quick search will turn up plenty of sites covering the specifics of exporting tables and/or schemas. Once you have a script that exports the data you need, set up a job to run it in the early AM each day. Keep the most recent 5-10 dump files just in case you lose some data and don't notice for a few days.
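As a rough illustration of the Data Pump approach, a parameter file could look like the following. This is only a sketch: the directory object, table names, and file names here are assumptions you would replace with your own, and the `DATA_PUMP_DIR` directory object must already exist and be readable/writable by your account.

```
# backup.par -- hypothetical Data Pump parameter file
TABLES=EMP,DEPT
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=backup_%U.dmp
LOGFILE=backup.log
```

You would then invoke it from the OS with something like `expdp your_user@your_db parfile=backup.par`, and rotate the resulting dump files so the most recent few are always kept.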
> Could you please tell me how to do that?

I didn't realize you weren't using your own instance. The first rule is that if this is really critical data, it shouldn't be on apex.com at all. From there, you have no direct access to save data to a directory.
CREATE OR REPLACE PROCEDURE rough_backup_example AS
  v_csvdata  CLOB;
  v_email_id NUMBER;
  CURSOR c1 IS
    SELECT empno, ename, sal, e.deptno, dname
      FROM emp e, dept d
     WHERE e.deptno = d.deptno
     ORDER BY empno;
BEGIN
  -- Build the CSV payload one row at a time
  FOR c1_r IN c1 LOOP
    v_csvdata := v_csvdata || c1_r.empno  || ',' ||
                              c1_r.ename  || ',' ||
                              c1_r.sal    || ',' ||
                              c1_r.deptno || ',' ||
                              c1_r.dname  || CHR(13) || CHR(10);
  END LOOP;

  -- Mail the CSV out as an attachment
  v_email_id := APEX_MAIL.SEND(
      p_to   => 'firstname.lastname@example.org',
      p_from => 'email@example.com',
      p_subj => 'Backup file for ' || TO_CHAR(SYSDATE, 'DD-MON-YY'),
      p_body => 'Backup file for ' || TO_CHAR(SYSDATE, 'DD-MON-YY'));

  APEX_MAIL.ADD_ATTACHMENT(
      p_mail_id    => v_email_id,
      p_attachment => v_csvdata,
      p_filename   => 'Data_Backup_' || TO_CHAR(SYSDATE, 'MON_DD') || '.csv',
      p_mime_type  => 'text/csv');
END;
/
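To get the "run it in the early AM each day" part, the procedure could be wired up to a scheduler job along these lines. This is a sketch: the job name and the 2 AM schedule are assumptions, not anything prescribed by APEX.

```sql
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'NIGHTLY_CSV_BACKUP',       -- hypothetical name
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'ROUGH_BACKUP_EXAMPLE',     -- the procedure above
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=DAILY;BYHOUR=2',      -- early AM, once a day
    enabled         => TRUE);
END;
/
```

Note that on a hosted instance like apex.oracle.com you may not have the privileges to create scheduler jobs; on your own instance this is the standard way to automate it.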
> The biggest table has more than 80,000 records, we add records daily, and our team needs to back up our information.

You do know that apex.oracle.com is not intended for production applications, right?
> If so, should I truncate the table and import the backup file? What is the best way to do it?

In my book, the easiest way to import from a CSV file in a controlled fashion is to create an external table against the file. Then you can issue SQL against it and access just the data that you need, in whatever way you need it. For example, say you had an employee table and ran some DML meant to update one employee's last name to 'Smith', but forgot the WHERE clause and ended up setting every employee's last name to 'Smith'. If you had a CSV backup of the table as the target of an external table, you could use it to grab every employee's original last name by employee ID and fix the data without having to re-import everything.
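A sketch of that approach, matching the column layout the backup procedure above writes out: the `backup_dir` directory object and the file name are assumptions, and the directory must point at wherever you saved the CSV.

```sql
-- Hypothetical external table over the CSV backup file
CREATE TABLE emp_backup_ext (
  empno  NUMBER,
  ename  VARCHAR2(50),
  sal    NUMBER,
  deptno NUMBER,
  dname  VARCHAR2(50)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY backup_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('Data_Backup_JAN_15.csv')   -- assumed file name
);

-- Repair the botched update using only the damaged column:
-- no truncate, no full re-import.
UPDATE emp e
   SET e.ename = (SELECT b.ename
                    FROM emp_backup_ext b
                   WHERE b.empno = e.empno);
```

The point of the external table is exactly this selectivity: you query the backup like any other table and touch only the rows and columns that need repair.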
> P.S. Is there another way to export the data?

Without more detail on what you'd want to be different, I'm not going to try to think up an alternate export method. The method as supplied gives you a file that is very portable and a process that can be automated. It would be nicer to automate further so that, instead of being emailed to someone, the file was placed directly on a filesystem; however, apex.com has no means of accessing 'your' filesystem, so that's not practical.