
EXPDP and IMPDP via SQL*Net

Rodrigo Jorge DBA
Rodrigo Jorge DBA Member Posts: 86 Bronze Badge
edited Jan 11, 2016 6:12PM in Database Ideas - Ideas

IMPDP and EXPDP are undoubtedly amazing evolutions of the ancient IMP/EXP.

However, why did Oracle remove the capability of generating remote dumps via SQL*Net? I don't want to back up to a directory local to the database server!

Many places still use the original export simply because the DBA does not have access to the server, or the server does not have sufficient free space, or because of many other limitations related to the DB server.

Please, Oracle, give me the option to run expdp transferring the data via Oracle Net, and we can definitely say goodbye to exp! =]

31 votes


Comments

  • Franck Pachot
    Franck Pachot Member Posts: 912 Bronze Trophy

    Hi,

    Workaround: install an XE database on your client and create a database link (sketched below).

    I know... a lighter solution would be good...

    Regards,

    Franck.

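    A rough sketch of that workaround, assuming a local XE install with its default DATA_PUMP_DIR directory; the link name, credentials and the 'PROD' TNS alias are placeholders.

    In the local XE database, create a link pointing at the remote database:
    CREATE DATABASE LINK prod_link CONNECT TO system IDENTIFIED BY manager USING 'PROD';

    Then run expdp against the local XE, pulling the data over the link so the dump file lands on the client machine:
    expdp system/oracle@XE NETWORK_LINK=prod_link SCHEMAS=SCOTT DIRECTORY=DATA_PUMP_DIR DUMPFILE=scott.dmp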
  • carajandb
    carajandb Member Posts: 260 Bronze Badge

    Why do you want an entire export file on your local system? Wouldn't it be more helpful if you could export a single table, or several tables and their relations, directly on the client, e.g. with SQL Developer or SQL*Plus?

  • top.gun
    top.gun Member Posts: 3,666 Gold Crown

    "because the DBA does not have access to the server"

    Are you kidding?

    "server does not have sufficient free space"

    You can use a compressed datapump dump file, and remember compression isn't possible with the exp/imp utilities.

    Really, if you want a dump file as a backup, then you should be using RMAN for backups, and the backup can also be compressed (sketch below).
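    For what it's worth, a compressed backup along those lines is a one-liner in RMAN; a sketch, with no particular environment assumed:

    RMAN> BACKUP AS COMPRESSED BACKUPSET DATABASE;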

  • Rodrigo Jorge DBA
    Rodrigo Jorge DBA Member Posts: 86 Bronze Badge

    "because the DBA does not have access to the server"

    Are you kidding?

    "server does not have sufficient free space"

    You can use a compressed datapump dump file, and remember compression isn't possible with the exp/imp utilities.

    Really if you want a dumpfile as a backup, then you should be using RMAN for backups, and the backup can also be compressed.

    "Are you kidding?"
    
    

    No. This is called "Layered security Architecture". The Infra OS Team is not the same as the DBA Team. They have some constraints on OS layer, like avoiding any user (including the DBA) from accessing the FS (unless needed for maintenance).

    You can use a compressed datapump dump file, and remember compression isn't possible with the exp/imp utilities.
    

    Not natively, but it is possible using a pipe (see the sketch at the end of this post). Data Pump, however, does not work via SQL*Net.

    Really if you want a dumpfile as a backup, then you should be using RMAN for backups, and the backup can also be compressed.

    I have physical backups. The point of a logical backup is to quickly restore a single object or a single row of a table, or to check how the object looked the previous day.
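    The pipe trick, roughly; the path, credentials and file names are placeholders.

    Create a named pipe and compress whatever exp writes into it:
    mknod /tmp/exp_pipe p
    gzip < /tmp/exp_pipe > /backup/full_exp.dmp.gz &

    Point the old exp utility at the pipe:
    exp system/manager FULL=Y FILE=/tmp/exp_pipe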

  • top.gun
    top.gun Member Posts: 3,666 Gold Crown
    "Are you kidding?"
    
    

    No. This is called "Layered security Architecture". The Infra OS Team is not the same as the DBA Team. They have some constraints on OS layer, like avoiding any user (including the DBA) from accessing the FS (unless needed for maintenance).

    You can use a compressed datapump dump file, and remember compression isn't possible with the exp/imp utilities.
    

    Not native, but it is possible using pipe. Datapump does not work via SQL*Net.

    Really if you want a dumpfile as a backup, then you should be using RMAN for backups, and the backup can also be compressed.

    I have physical backups. The point of logical backup is to fast restore a single object, or a line of a table or to check how the object was in the previous day.

    1) The DBA needs access to the Unix account where Oracle is installed. Our Unix team allows us to use 'sudo su - oracle' to access the Oracle account. All DBAs should be allowed to do this.

    2) Native compression is available using the COMPRESSION=ALL clause in the Data Pump export (see the example below).

    3) To quickly restore a single object, the dump file needs to be on a local volume. If the dump file is located remotely, you'll get a slow restore.
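    A minimal example of point 2; the schema, directory and file names are only illustrative:

    expdp system/oracle SCHEMAS=SH DIRECTORY=DATA_PUMP_DIR DUMPFILE=sh_compressed.dmp COMPRESSION=ALL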

  • Franck Pachot
    Franck Pachot Member Posts: 912 Bronze Trophy

    Hi,

    Just a warning about the fact that COMPRESSION=ALL requires the Advanced Compression Option to be licensed.

    Regards,

    Franck. 

  • Pavel.Ruzicka
    Pavel.Ruzicka Member Posts: 12 Blue Ribbon

    You can use NETWORK_LINK-based Data Pump transfers (over a SQL*Net-based database link; see the sketch below). Mind the restrictions, such as LONG data types.

    Regards,

    Pavel

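    The NETWORK_LINK parameter expects an existing database link; a minimal sketch, where the link name, credentials and TNS alias are placeholders:

    CREATE DATABASE LINK PROD CONNECT TO system IDENTIFIED BY manager USING 'PROD';

    The expdp/impdp commands in the next reply assume such a link named PROD.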
  • ZedDBA
    ZedDBA Member Posts: 35

    You can do this:

    Export remotely using DB link:
    expdp system/oracle NETWORK_LINK=PROD SCHEMAS=SH DIRECTORY=EXPORTS DUMPFILE=prod_sh.dmp

    Import directly into the DB from the source without the need for a dump file:
    impdp system/oracle NETWORK_LINK=PROD SCHEMAS=SH REMAP_TABLESPACE=EXAMPLE:USERS DIRECTORY=EXPORTS

    Hope this helps

    ZedDBA

  • Gbenga Ajakaye
    Gbenga Ajakaye Member Posts: 3,422 Gold Trophy

    Pavel's NETWORK_LINK option will work fine. It doesn't require creating a local directory.

  • Please remember that there are lots of situations where expdp does NOT work over a database link.

    For example, when the source database uses LONG columns (I know LONGs are deprecated, but they still exist in lots of applications).

    With the old exp/imp I could "outsource" copies of application data to test databases, or hand them to the application owner for investigation (roughly as sketched at the end of this post).

    With Data Pump I don't know a way to do this without creating stored procedures, NFS mounts and so on.

    It's a lot more complicated.
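    For comparison, the old client-side way described above, as a rough sketch; the schema, TNS aliases and file name are placeholders.

    Pull the application schema, LONG columns included, across SQL*Net to a client-side dump:
    exp system/manager@PROD OWNER=APP FILE=app.dmp

    Push it into a test database, again across SQL*Net:
    imp system/manager@TEST FROMUSER=APP TOUSER=APP FILE=app.dmp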