Data Pump

Virtual Classroom Seminar #15: Data Pump Best Practices on April 5, 2023

We are quite busy these days, and I am lagging behind in answering your comments on the blog. So please be patient – nothing will be lost. Meanwhile, we are working hard on the slides for our next Virtual Classroom Seminar #15: Data Pump Best Practices on April 5, 2023.

What will be the topic?

Data Pump Best Practices and Real World Scenarios already tells you what we are planning to do. After our “Data Pump Deep Dive with Development” seminar last year (time flies so quickly), we decided that we should tackle …

Continue reading...

Data Pump Bundle Patches: You may need to download and apply again

The Data Pump Bundle Patches have been released on top of every RU for over a year now. But recently a real issue with missing files was detected. So in case you have applied Data Pump Bundle Patches: you may need to download and apply them again.

What is in the Data Pump Bundle Patch?

The Data Pump Bundle Patch is a convenient and very useful vehicle to apply Data Pump patches which are usually not included in a Release Update (RU). While RUs are by definition RAC-rolling and Standby-First, the Data Pump Bundle …
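
Applying it follows the usual one-off patch routine – a rough sketch, with 12345678 standing in for the actual bundle patch number:

# unzip the bundle patch and apply it on top of your RU
cd /tmp/12345678
$ORACLE_HOME/OPatch/opatch apply

# then load the modified Data Pump packages into every database in this home
$ORACLE_HOME/OPatch/datapatch -verbose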

Continue reading...

Data Pump Super Patch for Oracle 19.10 and newer

Just a few days ago we released a Data Pump Super Patch for Oracle 19.10 and newer. “Newer” refers to the fact that once 19.11.0 is available, the merge patch I’m writing about will be available right away as well (sorry, got delayed a bit). And if you work with Data Pump, you may want to consider this performance patch collection.

Why do you need this patch?

Many customers will know the issue that applying changes with datapatch may take a long time as soon as dpload.sql is involved. And this …

Continue reading...

Does Data Pump import only serially into PDBs?

Daniel and I did Web Seminars this week. One of them was about the different migration strategies, and we talked a lot about Data Pump. One attendee mentioned that Data Pump Import does not work in parallel into PDBs in Oracle 12.2.0.1 and Oracle 18c. We were skeptical, and today I tried to verify it: does Data Pump import only serially into PDBs?

Very simple test setup

For this test, I take a schema export from an 11.2.0.4 database in our Hands-On Lab: the TPCC user which HammerORA uses in the UPGR database.…
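
If you want to repeat such a check yourself, here is a minimal sketch (the PDB service, schema, and file names are my placeholders, not the lab's actual ones):

impdp system@PDB1 schemas=TPCC directory=DATA_PUMP_DIR dumpfile=tpcc%U.dmp parallel=4 metrics=y logfile=imp_tpcc.log

With METRICS=Y every worker reports under its own W-n prefix in the logfile – if the import really ran serially, you would only ever see W-1.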

Continue reading...

Transportable Tablespaces – Example and strange error with a PDB

Yesterday I was browsing around for a useful, simple example to test Transportable Tablespaces. A colleague had mailed me the other day with a strange error message: the attempt to import into a PDB in Oracle 19c failed. My first thought: oh, this is simple. But I failed, too. And even worse, I couldn’t find a single useful note in My Oracle Support (MOS) for ORA-31640, ORA-27037, Linux-x86_64 Error: 2 with Additional information: 7. So I decided to summarize this in Transportable Tablespaces – Example and strange error with a PDB.

A simple Transportable Tablespace
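
As a rough, hedged outline of such an example (tablespace, service, and file names are placeholders, not necessarily the ones used in the post):

SQL> alter tablespace TTS_TEST read only;

expdp system directory=DATA_PUMP_DIR transport_tablespaces=TTS_TEST dumpfile=tts_test.dmp logfile=tts_exp.log

Then copy the data file(s) plus the dump file to the destination and plug them in:

impdp system@PDB1 directory=DATA_PUMP_DIR dumpfile=tts_test.dmp transport_datafiles='/u02/oradata/CDB2/pdb1/tts_test01.dbf'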

Continue reading...

Database Migration from non-CDB to PDB – Migration with Data Pump

You may have realized that there are a few techniques missing describing how to do a Database Migration from non-CDB to PDB – Migration with Data Pump is one of them. I will explain the simplest approach to going Single- or Multitenant. It isn’t the coolest – and it isn’t very fast once your database has a significant size. But it is not complex. And it allows you to move even from very old versions directly into an Oracle 19c PDB – regardless of patch levels or source and destination platform.

High Level Overview

Endianness change
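
In essence the flow is just two commands – a minimal sketch, with all schema, service, and file names as my own placeholders:

# full export from the non-CDB source
expdp system full=y directory=DATA_PUMP_DIR dumpfile=mydb%U.dmp parallel=4 logfile=exp_mydb.log

# full import into the precreated target PDB
impdp system@PDB1 full=y directory=DATA_PUMP_DIR dumpfile=mydb%U.dmp parallel=4 logfile=imp_mydb.log
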
Continue reading...

Data Pump 12.1.0.2 – Wrong Dump File Version – ORA-39142

Again I’ll have to thank my colleague Roland Gräff from the German ACS Support team in Stuttgart for bringing this onto our radar. Roland alerted me a week ago about an issue with exports in Oracle 12.1.0.2 which occurs only when you are on a certain patch level. I summarize the issue here under Data Pump 12.1.0.2 – Wrong Dump File Version – ORA-39142.

In the blog post below you will learn about the actual issue, where and when it happens, and of course how to work around it.

When does it happen?

The issue I will describe below happens only with

Continue reading...

Export with Data Pump and Long Identifiers

I blogged a few days ago about Long Identifiers in Oracle Database 12.2 and accessing the objects via database links from a lower database version. As this raised a few questions, I realized a bit more clarification may be necessary. One question was about what happens during an export with Data Pump and Long Identifiers. That’s a pretty good question.

Export with Data Pump and Long Identifiers

I’d like to demonstrate the effect with a short example. I’m doing all my tests in a fresh PDB inside an Oracle 18.1.0 CDB from our Hands-On Lab. But you can repeat …
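
The core of such a test is simply an object whose name exceeds the old 30-byte limit, exported with a lower VERSION setting – a hedged sketch with made-up names:

SQL> create table this_is_a_table_name_longer_than_thirty_bytes (id number);

expdp mike@PDB1 schemas=MIKE directory=DATA_PUMP_DIR dumpfile=longid.dmp version=12.1

What Data Pump does with such objects when the dump is targeted at a release that does not support long identifiers is exactly what the post demonstrates.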

Continue reading...

Transportable Tablespaces and READ ONLY in Oracle Database 12c

We recently worked with a customer who noticed that they were not able to use transportable tablespaces to connect the same tablespace data files to two databases at the same time, even after setting the tablespaces READ ONLY in SQL*Plus. This is new behavior in 12c, and many customers are not yet aware of this change. Here are the details of what changed, why, and how you might want to deal with it if the changes affect your environment.

What Changed?

Starting in 12.1, Data Pump sets tablespaces read write during the import phase of a transportable tablespace …
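
If you want to verify this in your own environment, a quick look at DBA_TABLESPACES before and after the import shows the state change (the tablespace name is a placeholder):

SQL> select tablespace_name, status from dba_tablespaces where tablespace_name = 'TTS_TEST';

SQL> alter tablespace TTS_TEST read only;   -- set it back afterwards if you rely on READ ONLY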

Continue reading...

Full Transportable Export/Import – PAR File Examples

Roy and I blogged about Full Transportable Export/Import in the past.

If you haven’t heard of this database feature, it allows you to migrate a full database by using Transportable Tablespaces as a base technology but letting Data Pump do all the manual steps for you in a one-command migration. And if needed, it works with RMAN Incremental Backups as well in order to …
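
For reference, a minimal expdp PAR file for a full transportable export from an 11.2.0.4 source could look like this (directory and file names are my placeholders; VERSION=12 makes the 11.2.0.4 source produce a 12c-compatible full transportable dump):

full=y
transportable=always
version=12
directory=DATA_PUMP_DIR
dumpfile=ftex%U.dmp
logfile=ftex_exp.log
metrics=y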

Continue reading...

Data Pump’s amazingly useful METRICS=Y and LOGTIME=ALL parameters

Now that I am back from OpenWorld, I will hijack the Upgrade blog while Mike is traveling. 🙂

Thank you to everybody who came to our presentations or stopped at the demo booth to chat last week. We had a great many conversations, and we always learn from talking to customers! One of the common questions about Data Pump came in the form, “I have a Data Pump job that used to run in X minutes, but now takes <multiple of X> minutes. Can you tell me what might be happening?”

Of course with that much information we
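
Which is why the two parameters from the title are so useful: they make the logfile answer that question. A minimal sketch (connect string, schema, and file names are placeholders):

expdp system schemas=HR directory=DATA_PUMP_DIR dumpfile=hr.dmp metrics=y logtime=all logfile=exp_hr.log

LOGTIME=ALL stamps every logfile line with a timestamp, and METRICS=Y adds object counts and elapsed times per object type – exactly the data you need to compare a slow run against a fast one.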

Continue reading...

Full Transportable Export/Import – Migrating an 11.2.0.4 database to Oracle Database 12c – into the Oracle Cloud

Full Transportable Export/Import – one of the coolest features in Oracle Database 12c 

We blogged about Full Transportable Export/Import a while back. It is – no doubt – one of the coolest features in Oracle Database 12c. And it is part of our Hands-On Lab exercise (Hands On Lab – Upgrade, Migrate, Consolidate to Oracle Database 12c) as well.

It utilizes the technique of Transportable Tablespaces – cross-platform, cross-Endianness, cross-version – but lets Oracle Data Pump do all the “dirty” work of rebuilding everything kept in your SYSTEM and SYSAUX tablespaces, including views, synonyms, public objects, …
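
Combined with a database link, this even becomes a one-command migration on the destination side – a hedged sketch, with the link, service, directory, and file names as my placeholders:

impdp system@PDB1 network_link=SOURCE11G full=y transportable=always version=12 transport_datafiles='/u02/oradata/CDB1/pdb1/users01.dbf' metrics=y logfile=ftex_imp.log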

Continue reading...

Data Pump – Exclude Stats differently for TTS and FTEX

Nice little best practice for statistics and Data Pump when doing either Transportable Tablespaces or Full Transportable Export-Import (credits to Roy and Dean Gagne).

Transport Statistics via a Staging Table

First of all, we always recommend excluding statistics when doing a Data Pump export, as the import of such stats takes way longer than transporting them via a stats table. If you are unfamiliar with transporting stats between databases, please see the Oracle Performance Tuning Guide, which has a nice tutorial:

The basic steps to transport statistics from one database to another fast and
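
Sketched in hedged form, those steps look roughly like this (schema, staging table, and file names are placeholders). On the source, stage the schema stats in a regular table and export just that table:

SQL> exec DBMS_STATS.CREATE_STAT_TABLE('SYSTEM','MYSTATS')
SQL> exec DBMS_STATS.EXPORT_SCHEMA_STATS('HR','MYSTATS',statown=>'SYSTEM')
expdp system tables=SYSTEM.MYSTATS directory=DATA_PUMP_DIR dumpfile=stats.dmp

On the destination, import the staging table and publish the stats from it:

impdp system tables=SYSTEM.MYSTATS directory=DATA_PUMP_DIR dumpfile=stats.dmp
SQL> exec DBMS_STATS.IMPORT_SCHEMA_STATS('HR','MYSTATS',statown=>'SYSTEM')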

Continue reading...

Some Data Pump issues: DBMS_DATAPUMP Import via NETWORK_LINK fails + STATUS parameter giving bad performance

One of my dear Oracle ACS colleagues (thanks, Thomas!) highlighted this issue to me, as one of his lead customers hit this pitfall a week ago.

DBMS_DATAPUMP Import Over NETWORK_LINK fails with ORA-39126 / ORA-31600

Symptoms are:

KUPW$WORKER.CONFIGURE_METADATA_UNLOAD [ESTIMATE_PHASE]
ORA-31600: invalid input value IN ('VIEWS_AS_TABLES/TABLE_DATA') for parameter VALUE in function SET_FILTER

This can be cured with the patch for bug 19501000 – but this patch can conflict with bug 18793246 (EXPDP slow, showing base object lookup during Data Pump export causing a full table scan per object) and may therefore require a merge patch. Patch 21253883 is the one to go …
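
For context, a minimal DBMS_DATAPUMP network import – the kind of job that hits this – looks roughly like the following (the database link SOURCE_LINK and schema HR are placeholders):

declare
  h  number;
  js varchar2(30);
begin
  -- no dump file involved: metadata and rows come straight over the database link
  h := dbms_datapump.open(operation => 'IMPORT', job_mode => 'SCHEMA', remote_link => 'SOURCE_LINK');
  dbms_datapump.metadata_filter(h, 'SCHEMA_EXPR', 'IN (''HR'')');
  dbms_datapump.start_job(h);
  dbms_datapump.wait_for_job(h, js);
  dbms_output.put_line('Job state: ' || js);
end;
/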

Continue reading...

Parallel Index Creation with Data Pump Import

Here is a new capability that might be interesting to anybody who is performing a migration using Data Pump. Previously, Data Pump would create indexes one at a time, specifying the PARALLEL keyword for the CREATE INDEX statement to invoke parallel query for index creation. We used to recommend a workaround to create indexes in parallel, which involved a three-step process: importing without indexes, creating a SQLFILE of the CREATE INDEX statements, and splitting that file so it could be run from multiple sessions – see the sketch below.
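
For reference, that old workaround – sketched with placeholder file names – looked roughly like this:

# step 1: import everything except the indexes
impdp system directory=DATA_PUMP_DIR dumpfile=mydb%U.dmp exclude=index

# step 2: write the CREATE INDEX statements into a script instead of executing them
impdp system directory=DATA_PUMP_DIR dumpfile=mydb%U.dmp include=index sqlfile=create_indexes.sql

# step 3: split create_indexes.sql into pieces and run them in concurrent sessions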

Through extensive performance testing we found that it is faster to create multiple indexes in parallel (using a parallel …

Continue reading...

Data Pump: Consistent Export?

Ouch … I have to admit that I said in several workshops in the past weeks that a Data Pump export with expdp is per se consistent.

Well … I thought it was … but it’s not. Thanks to a customer who is doing a large Unicode migration at the moment: we were discussing parameters in the expdp PAR file, and after doing some research on MOS I asked my colleagues. Here are the results of my “research”:

  • MOS Note 377218.1 has a nice example showing a Data Pump export of a partitioned table with DELETEs on
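
The takeaway – hedged, with placeholder schema and file names – is to give expdp one single point in time to be consistent to, for example via FLASHBACK_TIME:

expdp system schemas=HR directory=DATA_PUMP_DIR dumpfile=hr.dmp flashback_time=systimestamp

FLASHBACK_SCN works the same way if you prefer to pin an exact SCN.
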
Continue reading...

Exclude DBMS_SCHEDULER Jobs from expdp?

You have never thought about excluding DBMS_SCHEDULER jobs from a Data Pump export? Me neither, but I recently got a copy of an email about such a customer case from Roy, who owns Data Pump as well. And this is the code example from Dean Gagne:

exclude.par:

exclude=procobj:"IN (SELECT NAME FROM sys.OBJ$ WHERE TYPE# IN
(47,48,66,67,68,69,71,72,74))"
  • This will work only on export
  • It’s an all-or-nothing approach
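
To try it out, you would simply reference the parameter file on the command line – a minimal sketch with placeholder schema and file names:

expdp system schemas=HR directory=DATA_PUMP_DIR dumpfile=hr.dmp parfile=exclude.par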

Quite interesting, isn’t it?…

Continue reading...

How to get the Master Table from a Data Pump expdp?

Interesting question a customer had last week during the Upgrade Workshop in Munich. He’s getting export dump files from several customers, often without much information describing the contents. So how can he find out what’s in there, what the source character set was, etc.?

This seems to be a simple question, but it did cost me a few searches and tests to come back with some (hopefully) useful information.

First attempt: $ strings expdp.dmp > outexpdp.txt

I bet there are better ways to do this, but in my case this will give me:

"APP"."SYS_EXPORT_SCHEMA_01"
x86_64/Linux 2.4.xx
WE8ISO8859P15
LBB EMB GHC JWD 
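
In newer releases there is – as far as I know – also a more structured option: let impdp load just the master table and stop, so you can query it afterwards (user and job names are placeholders, and the exact column set varies by version):

impdp system directory=DATA_PUMP_DIR dumpfile=expdp.dmp master_only=y

SQL> select object_type, object_name from system.SYS_IMPORT_FULL_01 where object_name is not null;
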
Continue reading...