Data Pump 12.1.0.2 – Wrong Dump File Version – ORA-39142

Again I’ll have to thank my colleague Roland Gräff from the German ACS Support team in Stuttgart for bringing this onto our radar. Roland alerted me a week ago to an issue with exports in Oracle 12.1.0.2 that occurs only on a certain patch level. I summarize the issue here under Data Pump 12.1.0.2 – Wrong Dump File Version – ORA-39142. In the blog post below you will learn about the actual issue, where and when it happens, and of course how to work around it. When does it happen? The issue I will describe below happens only with Oracle…
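If you run into it, the failing import typically reports something like this – a hedged sketch, where the internal version number and file path are illustrative:

  ORA-39142: incompatible version number 5.1 in dump file "/u01/dp/exp.dmp"

And a common way to sidestep it, assuming both source and target are 12.1.0.2, is to pin the dump file format explicitly on export – VERSION is a standard expdp parameter, while connect string, schema and file names here are placeholders:

  expdp system schemas=SCOTT directory=DP_DIR dumpfile=scott.dmp version=12.1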

Export with Data Pump and Long Identifiers

I blogged a few days ago about Long Identifiers in Oracle Database 12.2 and accessing such objects via database links from lower database versions. As this raised a few questions, I realized a bit more clarification may be necessary. One question was about what happens during export with Data Pump and Long Identifiers. That’s a pretty good question. Export with Data Pump and Long Identifiers I’d like to demonstrate the effect with a short example. I’m doing all my tests in a fresh PDB inside an Oracle 18.1.0 CDB from our Hands-On Lab. But you can repeat it with…
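As a hedged sketch of the kind of test you could repeat yourself (object, directory and file names are made up): create an identifier longer than 30 characters in a 12.2+ PDB, then export it while targeting a pre-12.2 dump file version:

  SQL> CREATE TABLE a_table_name_that_is_longer_than_thirty_characters (id NUMBER);

  $ expdp system schemas=SCOTT directory=DP_DIR dumpfile=longid.dmp version=12.1

Forcing VERSION=12.1 makes Data Pump decide what to do with the long identifier – which is the sort of effect the example in the post demonstrates.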

Upgrade and Migration Office Hours on March 21, 2018

The first Office Hours session was just a month ago. It was a very good experience, and based on the feedback we received from customers it was very helpful. You don’t believe it? Then you may watch the recording. But now it’s time for our second Upgrade and Migration Office Hours on March 21, 2018. What are “Office Hours”? This is a new interactive format we’d like to try out. Several groups in Oracle schedule Office Hours sessions. Roy and I held our first Office Hours session on February 21, 2018 for a full hour trying to answer your questions…

Data Pump’s amazingly useful METRICS=Y and LOGTIME=ALL parameters

Now that I am back from OpenWorld, I will hijack the Upgrade blog while Mike is traveling. 🙂 Thank you to everybody who came to our presentations or stopped at the demo booth to chat last week. We had a great many conversations, and we always learn from talking to customers! One of the common questions about Data Pump came in the form, “I have a Data Pump job that used to run in X minutes, but now takes <multiple of X> minutes. Can you tell me what might be happening?” Of course with that much information we can do…
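For reference, this is how the two parameters go onto the command line (schema, directory and file names are placeholders; METRICS and LOGTIME are the actual Data Pump parameters):

  expdp system schemas=HR directory=DP_DIR dumpfile=hr.dmp logfile=hr_exp.log metrics=y logtime=all

METRICS=Y adds per-object-type counts and timings to the log file, and LOGTIME=ALL prefixes every log line with a timestamp – together they make “it used to run in X minutes” comparisons much easier to diagnose.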

Data Pump – Exclude Stats differently for TTS and FTEX

Nice little best practice for statistics and Data Pump when doing either Transportable Tablespaces or Full Transportable Export-Import (credits to Roy and Dean Gagne). Transport Statistics via a Staging Table First of all, we always recommend excluding statistics from a Data Pump export, as importing such stats takes far longer than transporting them via a stats table. If you are unfamiliar with transporting stats between databases, please see the Oracle Performance Tuning Guide, which has a nice tutorial: Importing and Exporting Optimizer Statistics. The basic steps to transport statistics from one database to another fast and efficiently…
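A minimal sketch of the staging-table approach, assuming a schema SCOTT and a stats table name chosen for this example (the DBMS_STATS procedures are the documented ones):

  -- On the source: stage the schema statistics in a regular table
  EXEC DBMS_STATS.CREATE_STAT_TABLE('SCOTT','MY_STATS_TAB');
  EXEC DBMS_STATS.EXPORT_SCHEMA_STATS('SCOTT','MY_STATS_TAB');
  -- Export the staging table with Data Pump, and use EXCLUDE=STATISTICS on the main job
  -- On the target, after importing the staging table:
  EXEC DBMS_STATS.IMPORT_SCHEMA_STATS('SCOTT','MY_STATS_TAB');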

Some Data Pump issues: DBMS_DATAPUMP Import via NETWORK_LINK fails + STATUS parameter giving bad performance

One of my dear Oracle ACS colleagues (thanks, Thomas!) highlighted this issue to me, as one of his lead customers hit this pitfall a week ago. DBMS_DATAPUMP Import Over NETWORK_LINK fails with ORA-39126 / ORA-31600 Symptoms are:

  KUPW$WORKER.CONFIGURE_METADATA_UNLOAD [ESTIMATE_PHASE]
  ORA-31600: invalid input value IN ('VIEWS_AS_TABLES/TABLE_DATA') for parameter VALUE in function SET_FILTER

This can be cured with the patch for bug 19501000 – but this patch can conflict with bug 18793246 (EXPDP slow, showing base object lookup during Data Pump export, which causes a full table scan per object) and may therefore require a merge patch – patch 21253883 is the one to go with in…
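For context, a minimal sketch of the kind of network-mode import that hits this, using the documented DBMS_DATAPUMP calls (the database link SRC_LINK and the schema are placeholders):

  DECLARE
    h  NUMBER;
    js VARCHAR2(30);
  BEGIN
    -- network-mode import: data is pulled directly over the database link
    h := DBMS_DATAPUMP.OPEN(operation   => 'IMPORT',
                            job_mode    => 'SCHEMA',
                            remote_link => 'SRC_LINK');
    DBMS_DATAPUMP.METADATA_FILTER(h, 'SCHEMA_EXPR', 'IN (''SCOTT'')');
    DBMS_DATAPUMP.START_JOB(h);
    DBMS_DATAPUMP.WAIT_FOR_JOB(h, js);
  END;
  /

On an affected patch level a job like this fails during the estimate phase with the ORA-39126 / ORA-31600 combination quoted above.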

Parallel Index Creation with Data Pump Import

Here is a new capability that might be interesting to anybody who is performing a migration using Data Pump. Previously, Data Pump would create indexes one at a time, specifying the PARALLEL keyword for the CREATE INDEX statement to invoke parallel query for index creation. We used to recommend a workaround to create indexes in parallel, which involved a three-step process of importing without indexes, then creating a SQLFILE of the CREATE INDEX statements, and splitting that file across multiple sessions. Through extensive performance testing we found that it is faster to create multiple indexes in parallel (using a parallel…
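For comparison, the old three-step workaround looked roughly like this (directory and file names are placeholders; EXCLUDE, INCLUDE and SQLFILE are standard impdp parameters):

  # Step 1: import everything except the indexes
  impdp system directory=DP_DIR dumpfile=full.dmp exclude=index
  # Step 2: write the CREATE INDEX statements to a script instead of executing them
  impdp system directory=DP_DIR dumpfile=full.dmp include=index sqlfile=create_ind.sql
  # Step 3: split create_ind.sql into pieces and run them in concurrent sessions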

Full Transportable Export/Import White Paper

One of the really cool and helpful features in Oracle Database 12c is called Full Transportable Export/Import. It combines the basics of transportable tablespaces – of course cross-platform, cross-endianness, cross-version – with Data Pump taking care of everything not stored in tables and indexes, such as views, synonyms, triggers, packages etc. And you can even reduce downtime by combining the feature with incrementally converted RMAN backups. And the best thing: this will work from an Oracle Database 11.2.0.3/4 to Oracle Database 12c – so you don’t have to be on Oracle Database 12c in order to leverage from…
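As a sketch, the export side on an 11.2.0.3/4 source looks like this (connect string and file names are placeholders; FULL, TRANSPORTABLE and VERSION are the documented parameters that switch the feature on from a pre-12c source):

  expdp system full=y transportable=always version=12 directory=DP_DIR dumpfile=ftex.dmp logfile=ftex_exp.log

After copying – and, across endianness, converting – the datafiles, the import on the 12c side references them via TRANSPORT_DATAFILES.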

Data Pump: Consistent Export?

Ouch … I have to admit that I said in several workshops in the past weeks that a Data Pump export with expdp is per se consistent. Well … I thought it was … but it’s not. Thanks to a customer who is doing a large Unicode migration at the moment. We were discussing parameters in the expdp par file. And I did ask my colleagues after doing some research on MOS. And here are the results of my “research”: MOS Note 377218.1 has a nice example showing a Data Pump export of a partitioned table with DELETEs on…
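The usual remedy is to make the whole export consistent to a single point in time yourself – a minimal sketch with placeholder schema and file names (FLASHBACK_TIME and FLASHBACK_SCN are the documented expdp parameters):

  # consistent as of job start:
  expdp system schemas=APP directory=DP_DIR dumpfile=app.dmp flashback_time=systimestamp
  # or pin an explicit SCN instead:
  # expdp system schemas=APP directory=DP_DIR dumpfile=app.dmp flashback_scn=<scn>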

Exclude DBMS_SCHEDULER Jobs from expdp?

You have never thought about excluding DBMS_SCHEDULER jobs from a Data Pump export? Me neither, but I recently got a copy of an email about such a customer case from Roy, who owns Data Pump as well. And this is the code example from Dean Gagne:

exclude.par:

  exclude=procobj:"IN (SELECT NAME FROM sys.OBJ$ WHERE TYPE# IN (47,48,66,67,68,69,71,72,74))"

This will work only on export. It’s an all-or-nothing approach. Quite interesting, isn’t it?
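Usage would look like this (connect string, schema and file names are placeholders):

  expdp system schemas=APP directory=DP_DIR dumpfile=app.dmp parfile=exclude.par

The TYPE# list in the filter covers the scheduler-related procedural object types as a whole – which is why it works on export only and is an all-or-nothing approach.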

How to get the Master Table from a Data Pump expdp?

Interesting question a customer had last week during the Upgrade Workshop in Munich. He gets export dump files from several customers, often without much information describing the contents. So how can he find out what’s in there, what the source character set was, etc.? This seems to be a simple question, but it did cost me a few searches and tests to come back with some (hopefully) useful information. First attempt:

  $ strings expdp.dmp > outexpdp.txt

I bet there are better ways to do this, but in my case this will give me: "APP"."SYS_EXPORT_SCHEMA_01" x86_64/Linux 2.4.xx WE8ISO8859P15 LBB EMB GHC JWD…
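Beyond strings, a hedged alternative sketch: let impdp load the Master Table from the dump file and keep it for querying. SQLFILE makes the run metadata-only and KEEP_MASTER is a documented impdp parameter – directory, file and job names below are placeholders:

  impdp system directory=DP_DIR dumpfile=expdp.dmp sqlfile=ddl.sql keep_master=y

Afterwards the Master Table remains in the importing user’s schema, named after the job (e.g. SYS_SQL_FILE_FULL_01), and can be queried for object names, types, and similar metadata.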