Some Data Pump issues: DBMS_DATAPUMP Import via NETWORK_LINK fails + STATUS parameter giving bad performance

One of my dear Oracle ACS colleagues (thanks, Thomas!) highlighted this issue to me after one of his lead customers hit this pitfall a week ago.

DBMS_DATAPUMP Import Over NETWORK_LINK fails with ORA-39126 / ORA-31600

Symptoms are:

ORA-31600: invalid input value IN ('VIEWS_AS_TABLES/TABLE_DATA') for parameter VALUE in function SET_FILTER

This can be cured with the patch for bug 19501000 – but that patch can conflict with bug 18793246 (EXPDP slow: the base object lookup during Data Pump export causes a full table scan per object) and may therefore require a merge patch. Patch 21253883 is the one to go with in this case.
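For context, a network-mode import of this kind is typically driven through DBMS_DATAPUMP along the following lines – a minimal sketch, in which the db link name SOURCE_LINK and the schema SCOTT are placeholders, not names from the case above:

```sql
DECLARE
  h          NUMBER;
  job_state  VARCHAR2(30);
BEGIN
  -- Network-mode import: no dump file is written, rows are pulled over the db link
  h := DBMS_DATAPUMP.OPEN(
         operation   => 'IMPORT',
         job_mode    => 'SCHEMA',
         remote_link => 'SOURCE_LINK');   -- placeholder db link name

  -- Restrict the job to one schema on the source side
  DBMS_DATAPUMP.METADATA_FILTER(h, 'SCHEMA_EXPR', 'IN (''SCOTT'')');

  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.WAIT_FOR_JOB(h, job_state);
  DBMS_OUTPUT.PUT_LINE('Job finished with state: ' || job_state);
END;
/
```

The ORA-31600 above is raised internally (in the worker's SET_FILTER call) while such a job runs, not by anything you pass in yourself – which is why the fix is a patch rather than a parameter change.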

Another issue Roy just came across:

Data Pump is giving bad performance in Oracle 12c when the STATUS parameter option is used on the command line

Symptoms are:

It looks like the routines we are using to get status are significantly slower in 12c than in 11g. In 11g a STATUS call of expdp/impdp runs in 0.2-0.3 seconds, but in 12c it takes 0.8-1.6 seconds. As a result the client falls behind; it is taking about 0.5-0.8 seconds to put out each line in the logfile because it is getting the status each time. With over 9000 tables in a test, that half second really adds up. The result in this test case was that the Data Pump job completed in 35 minutes, but it took another 30-35 minutes to finish putting out messages on the client (the log file was already complete) and return control to the command line. This happens only when you use STATUS on the command line.

Recommendation is:

Don’t use the STATUS parameter on the expdp/impdp command line in Oracle 12c until the issue is fixed. This will be tracked under bug 21123545.
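In practice the workaround is simply to drop STATUS=&lt;n&gt; from the command line. If you need progress information, you can attach to the running job and request status from interactive command mode instead – in this sketch the directory, dump file, and job names are placeholders:

```
$ expdp system schemas=SCOTT directory=DP_DIR dumpfile=scott.dmp    # no STATUS=n here

# In a second session, attach to the running job and ask for status on demand:
$ expdp system attach=SYS_EXPORT_SCHEMA_01
Export> STATUS
```

Requesting status on demand this way incurs the slow call only when you ask for it, instead of on every logfile line.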



8 thoughts on “Some Data Pump issues: DBMS_DATAPUMP Import via NETWORK_LINK fails + STATUS parameter giving bad performance”

  1. Just a clarification that this is really only an issue when using the STATUS=<n> parameter on the command line. If you use the STATUS command from the data pump interactive command mode, the half-second won’t really be that noticeable.

  2. Hi

    Also, using RELY with constraints does not work when doing a Data Pump import.

    It is hitting bug:

    The same bug also causes problems when you are using dbms_redefinition.


  3. Hello,

    Where can the patch for bug 19501000 be found? I can’t find it on MOS.
    I’m having the same issue when trying to import over a db link from to . From to it is working fine. (The 12c DB is the one from which I issue IMPDP.)


  4. Hi, I am using an external table with the Data Pump access driver to unload data from the source and load it into the target. The process works if the file size is around 40 KB, but if the size goes beyond 40 KB it fails with the error below. Do you know how to specify the file size while creating the external table? I referred to the documentation but couldn’t find an option.

    ORA-29913: error in executing ODCIEXTTABLECLOSE callout
    ORA-31643: unable to close dump file "/jegnfs/ERPINT1_JEDIDEV1/JEDI_DPUMP_DIR/PA_COST_DS2.DMP"
    ORA-19510: failed to set size of 694809 blocks for file "/jegnfs/ERPINT1_JEDIDEV1/JEDI_DPUMP_DIR/PA_COST_DS2.DMP" (block size=4096)
    ORA-27037: unable to obtain file status
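    For reference, an unload through the ORACLE_DATAPUMP access driver looks roughly like the sketch below (the directory object, table, and file names here are placeholders). As far as I know there is no FILESIZE-style clause for the access driver itself; the usual way to keep individual dump files smaller is to list several files in LOCATION together with PARALLEL, so the data is spread across them:

    ```sql
    -- Unload: the CREATE TABLE ... AS SELECT writes the rows into the dump files
    CREATE TABLE pa_cost_unload
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_DATAPUMP
      DEFAULT DIRECTORY dpump_dir                      -- placeholder directory object
      LOCATION ('pa_cost_1.dmp', 'pa_cost_2.dmp')      -- several files instead of one
    )
    PARALLEL 2
    AS SELECT * FROM pa_cost;
    ```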

  5. @Rajesh:

    What are you doing exactly? I put the sequence of errors into MOS and received a good bunch of potential solutions. Can you please check as well by searching for:

    "ORA-29913 ORA-31643 ORA-19510: failed to set size of 4096 external table"


  6. Mike, many thanks for sharing this post – I’ve been working with MOS for over a year on this issue with STATUS= causing the delays, and no one has been able to provide any insight into what is really happening behind the scenes. As soon as I took out the STATUS parm, it worked like it should. Bingo! -Michele

  7. Pingback: Data Pump – Exclude Stats differently for TTS and FTEX | Upgrade your Database - NOW!
