Data Pump 12.1.0.2 – Wrong Dump File Version – ORA-39142

Again I’ll have to thank my colleague Roland Gräff from the German ACS Support team in Stuttgart for bringing this onto our radar. Roland alerted me a week ago to an issue with exports in Oracle 12.1.0.2 that occurs only when you are on a certain patch level. I summarize the issue here under Data Pump 12.1.0.2 – Wrong Dump File Version – ORA-39142. In the blog post below you will learn about the actual issue, where and when it happens, and of course how to work around it. When does it happen? The issue I will describe below happens only with Oracle…
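Since the excerpt above is cut off, here is only a heavily hedged sketch of the usual direction in ORA-39142 situations: when a dump file carries a higher internal version than the importing database understands, re-running the export with an explicit VERSION setting is the typical way out. Directory, file and schema names below are made up for illustration:

  expdp system DIRECTORY=mydir DUMPFILE=exp_app.dmp SCHEMAS=app VERSION=12.1

Whether VERSION is the right fix for this specific patch-level case is exactly what the full post discusses.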

Export with Data Pump and Long Identifiers

I blogged a few days ago about Long Identifiers in Oracle Database 12.2 and accessing such objects via database links from lower database versions. As this raised a few questions, I realized a bit more clarification may be necessary. One question was about what happens during an export with Data Pump and Long Identifiers. That’s a pretty good question. Export with Data Pump and Long Identifiers I’d like to demonstrate the effect with a short example. I’m doing all my tests in a fresh PDB inside an Oracle 18.1.0 CDB from our Hands-On Lab. But you can repeat it with…
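As a small sketch of the setup such a test needs (all names hypothetical): in 12.2 and later an identifier may be up to 128 bytes, so the interesting case is exporting such an object for a lower-version target:

  -- valid in 12.2 and later only: identifier longer than 30 bytes
  CREATE TABLE tab_with_a_name_longer_than_thirty_bytes (id NUMBER);
  -- then export it for a 12.1 target and see what Data Pump does with it
  expdp system DIRECTORY=mydir DUMPFILE=long_ids.dmp SCHEMAS=app VERSION=12.1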

Transportable Tablespaces and READ ONLY in Oracle Database 12c

We recently worked with a customer who noticed that they were not able to use transportable tablespaces to connect the same tablespace data files to two databases at the same time, even after setting the tablespaces READ ONLY in SQL*Plus. This is new behavior in 12c, and many customers are not yet aware of the change. Here are the details of what changed, why, and how you might want to deal with it if the change affects your environment. What Changed? Starting in 12.1, Data Pump sets tablespaces READ WRITE during the import phase of a transportable tablespace migration. This…
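For context, a minimal sketch of the classic transport flow this post refers to, with tablespace and file names assumed:

  -- on the source: set the tablespace read only before copying the data files
  ALTER TABLESPACE users_ts READ ONLY;
  -- on the target: plug it in; starting in 12.1 this import phase
  -- temporarily sets the tablespace READ WRITE
  impdp system DIRECTORY=mydir DUMPFILE=tts.dmp TRANSPORT_DATAFILES='/u01/oradata/users_ts01.dbf'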

Full Transportable Export/Import – PAR File Examples

Roy and I blogged about Full Transportable Export/Import in the past:
- Full Transportable Export/Import White Paper
- Full Transportable Export/Import – Things to Know
- Full Transportable Export/Import – Migrating an 11.2.0.4 database to Oracle Database 12c – into the Oracle Cloud
- Exclude Stats differently for TTS and FTEX
If you haven’t heard of this database feature, it allows you to migrate a full database by using Transportable Tablespaces as a base technology but letting Data Pump do all the manual steps for you in a one-command migration. And if needed, it works with RMAN Incremental Backups as well in order to decrease…
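As a flavor of what such par files look like – directory, dump file and data file paths below are placeholders, not taken from the post – the export side needs FULL=Y plus TRANSPORTABLE=ALWAYS, and the import side points at the copied data files:

  ftex_exp.par:
    FULL=Y
    TRANSPORTABLE=ALWAYS
    VERSION=12
    DIRECTORY=mydir
    DUMPFILE=ftex.dmp
    LOGFILE=ftex_exp.log

  ftex_imp.par:
    DIRECTORY=mydir
    DUMPFILE=ftex.dmp
    TRANSPORT_DATAFILES='/u01/oradata/users01.dbf'
    LOGFILE=ftex_imp.log

VERSION=12 is only required when the source database is an 11.2.0.3 or 11.2.0.4.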

Data Pump’s amazingly useful METRICS=Y and LOGTIME=ALL parameters

Now that I am back from OpenWorld, I will hijack the Upgrade blog while Mike is traveling. 🙂 Thank you to everybody who came to our presentations or stopped at the demo booth to chat last week. We had a great many conversations, and we always learn from talking to customers! One of the common questions about Data Pump came in the form, “I have a Data Pump job that used to run in X minutes, but now takes <multiple of X> minutes. Can you tell me what might be happening?” Of course with that much information we can do…
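Both parameters simply go on the command line or into the par file; directory and file names here are assumed. METRICS=Y adds object counts and elapsed times per object path to the log, and LOGTIME=ALL (new in 12c) prefixes every log line with a timestamp:

  expdp system SCHEMAS=app DIRECTORY=mydir DUMPFILE=app.dmp METRICS=Y LOGTIME=ALL LOGFILE=app_exp.log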

Full Transportable Export/Import – Migrating an 11.2.0.4 database to Oracle Database 12c – into the Oracle Cloud

Full Transportable Export/Import is one of the coolest features in Oracle Database 12c. We blogged about it a while back, and it is part of our Hands-On Lab exercise (Hands-On Lab – Upgrade, Migrate, Consolidate to Oracle Database 12c) as well. It utilizes the technique of Transportable Tablespaces – cross-platform, cross-endianness, cross-version – but lets Oracle Data Pump do all the “dirty” work of rebuilding everything kept in your SYSTEM and SYSAUX tablespaces, including views, synonyms, public objects, grants, sequences and so on. You can…
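The “one command” really is just a single impdp call over a database link; as a sketch with a hypothetical link name and data file path (this shows the shape, not a copy of the lab exercise):

  impdp system FULL=Y TRANSPORTABLE=ALWAYS VERSION=12 NETWORK_LINK=source_db \
        TRANSPORT_DATAFILES='/u02/oradata/users01.dbf' DIRECTORY=mydir METRICS=Y LOGFILE=ftex_imp.log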

Data Pump – Exclude Stats differently for TTS and FTEX

A nice little best practice for statistics and Data Pump when doing either Transportable Tablespaces or Full Transportable Export/Import (credits to Roy and Dean Gagne). Transport Statistics via a Staging Table First of all, we always recommend excluding statistics when doing a Data Pump export, as the import of such stats takes way longer than transporting them via a stats table. If you are unfamiliar with transporting stats between databases, please see the Oracle Performance Tuning Guide, which has a nice tutorial: Importing and Exporting Optimizer Statistics. The basic steps to transport statistics from one database to another fast and efficiently…
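The staging-table route uses standard DBMS_STATS calls; schema and table names below are placeholders:

  -- on the source: put the schema stats into a staging table
  EXEC DBMS_STATS.CREATE_STAT_TABLE('APP','MYSTATS');
  EXEC DBMS_STATS.EXPORT_SCHEMA_STATS('APP','MYSTATS');
  -- export with statistics excluded (e.g. EXCLUDE=STATISTICS in the par file),
  -- move the table APP.MYSTATS along with the data, then on the target:
  EXEC DBMS_STATS.IMPORT_SCHEMA_STATS('APP','MYSTATS');

How the EXCLUDE has to be phrased differs between TTS and FTEX – that detail is the point of the full post.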

Some Data Pump issues: DBMS_DATAPUMP Import via NETWORK_LINK fails + STATUS parameter giving bad performance

One of my dear Oracle ACS colleagues (Danke, Thomas!) highlighted this issue to me after one of his lead customers hit this pitfall a week ago. DBMS_DATAPUMP Import Over NETWORK_LINK fails with ORA-39126 / ORA-31600. Symptoms are: KUPW$WORKER.CONFIGURE_METADATA_UNLOAD [ESTIMATE_PHASE] ORA-31600: invalid input value IN ('VIEWS_AS_TABLES/TABLE_DATA') for parameter VALUE in function SET_FILTER. This can be cured with the patch for bug 19501000 – but this patch can conflict with Bug 18793246 (EXPDP slow, showing base object lookup during Data Pump export causing a full table scan per object) and may therefore require a merge patch – patch 21253883 is the one to go with in…
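For reference, the minimal shape of such a DBMS_DATAPUMP import over a database link – link and schema names are hypothetical, error handling omitted:

  DECLARE
    h  NUMBER;
    js VARCHAR2(30);
  BEGIN
    -- network import: no dump file, data comes straight through the DB link
    h := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'SCHEMA',
                            remote_link => 'SOURCE_DB');
    DBMS_DATAPUMP.METADATA_FILTER(h, 'SCHEMA_EXPR', 'IN (''APP'')');
    DBMS_DATAPUMP.START_JOB(h);
    DBMS_DATAPUMP.WAIT_FOR_JOB(h, js);
  END;
  /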

Parallel Index Creation with Data Pump Import

Here is a new capability that might be interesting to anybody who is performing a migration using Data Pump. Previously, Data Pump would create indexes one at a time, specifying the PARALLEL keyword for the CREATE INDEX statement to invoke parallel query for index creation. We used to recommend a workaround to create indexes in parallel, which involved a three-step process: importing without indexes, creating a SQLFILE of the CREATE INDEX statements, and splitting that file into multiple parts to run in separate sessions. Through extensive performance testing we found that it is faster to create multiple indexes in parallel (using a parallel…
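The old three-step workaround mentioned above, sketched with placeholder names:

  # step 1: import everything except the indexes
  impdp system DIRECTORY=mydir DUMPFILE=app.dmp EXCLUDE=INDEX
  # step 2: write the CREATE INDEX statements into a script instead of executing them
  impdp system DIRECTORY=mydir DUMPFILE=app.dmp INCLUDE=INDEX SQLFILE=create_ind.sql
  # step 3: split create_ind.sql into several scripts and run them in separate
  #         SQL*Plus sessions, each CREATE INDEX carrying its own PARALLEL clause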

Data Pump: Consistent Export?

Ouch … I have to admit that in several workshops over the past weeks I said that a Data Pump export with expdp is per se consistent. Well … I thought it was … but it’s not. Thanks to a customer who is doing a large Unicode migration at the moment. We were discussing parameters in the expdp par file, and I asked my colleagues after doing some research on MOS. Here are the results of my “research”: MOS Note 377218.1 has a nice example showing a Data Pump export of a partitioned table with DELETEs on…
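The usual way to pin an expdp to a single point in time is FLASHBACK_TIME or FLASHBACK_SCN (and in 12c the legacy CONSISTENT=Y is internally mapped to FLASHBACK_TIME); directory and file names below are assumed:

  expdp system SCHEMAS=app DIRECTORY=mydir DUMPFILE=app.dmp FLASHBACK_TIME=SYSTIMESTAMP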

Exclude DBMS_SCHEDULER Jobs from expdp?

You have never thought about excluding DBMS_SCHEDULER jobs from a Data Pump export? Me neither, but I recently got a copy of an email about such a customer case from Roy, who owns Data Pump as well. This is the code example from Dean Gagne:

exclude.par:
  exclude=procobj:"IN (SELECT NAME FROM sys.OBJ$ WHERE TYPE# IN (47,48,66,67,68,69,71,72,74))"

Two things to keep in mind: this will work only on export, and it is an all-or-nothing approach. Quite interesting, isn’t it?
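To use it, reference the par file on an otherwise normal export (directory and file names assumed):

  expdp system SCHEMAS=app DIRECTORY=mydir DUMPFILE=nojobs.dmp PARFILE=exclude.par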

How to get the Master Table from a Data Pump expdp?

Interesting question a customer had last week during the Upgrade Workshop in Munich. He’s getting export dump files from several customers, often with not much information describing the contents. So how can he find out what’s in there, what the source character set was, etc.? This seems to be a simple question, but it cost me a few searches and tests to come back with some (hopefully) useful information. First attempt:

$ strings expdp.dmp > outexpdp.txt

I bet there are better ways to do this, but in my case this will give me: "APP"."SYS_EXPORT_SCHEMA_01" x86_64/Linux 2.4.xx WE8ISO8859P15 LBB EMB GHC JWD…
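Beyond strings, Data Pump itself can reveal what is inside a dump: SQLFILE writes all the DDL into a script without importing anything, and MASTER_ONLY=YES together with KEEP_MASTER=YES loads just the master table so you can query it afterwards. Directory and file names here are assumed:

  impdp system DIRECTORY=mydir DUMPFILE=expdp.dmp SQLFILE=contents.sql
  impdp system DIRECTORY=mydir DUMPFILE=expdp.dmp MASTER_ONLY=YES KEEP_MASTER=YES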