Data Pump

Which releases allow you to upgrade to Oracle Database 21c?

As I wrote before, the release you should upgrade to is clearly Oracle Database 19c. Unless you need to use one of the new features in Oracle 21c. But as you have seen in the past, some of them get backported to 19c as well. Still, in case you can’t resist, or you love upgrades as much as we do, the question may be: Which releases allow you to upgrade to Oracle Database 21c?

Upgrade-supported Releases

The Database Upgrade Guide for Oracle 21c is your best friend here. Right at the beginning in the first chapter you will find this …

Continue reading...

aioug Webinar Series July 2021: Database Upgrade and Migrations

Many people in the northern hemisphere will escape into the summer holidays during July. But we won’t. Instead, thanks to Sai Penumuru, we will give seven webinars for the aioug. And we are looking forward to talking to you soon at the aioug Webinar Series July 2021: Database Upgrade and Migrations.

aioug Webinar Series July 2021: Database Upgrade and Migrations

When will it happen?

This morning I looked into my calendar. And I realized: Wow, July 2021 is just a few days away. Roy will start with the first session on Friday, July 2. And then myself, Rodrigo, Daniel, Fred and Bill will …

Continue reading...

Data Pump Super Patch for Oracle 19.10 and newer

Just a few days ago we released a Data Pump Super Patch for Oracle 19.10 and newer. “Newer” refers to the fact that once 19.11.0 is available, the merge patch I’m writing about will be available right away as well (sorry, got delayed a bit). And if you work with Data Pump, you may want to consider this performance patch collection.

Why do you need this patch?

Many customers will know the issue that applying changes with datapatch may take a long time as soon as dpload.sql is involved. And this …

Continue reading...

Sign up for the Upgrade 19c Virtual Classroom Seminars for the US, Canada and LAD

It took a while. And many of you asked whether we would run the Virtual Classroom Seminars, which we ran successfully for Australia and New Zealand in April and in EMEA from June until now, for the US, Canada and South and Latin America time zones as well. And we do. It’s time to Sign up for the Upgrade 19c Virtual Classroom Seminars for the US, Canada and LAD.

Everything you need to know about Upgrade, Migration and Consolidation

When we ran these events especially for EMEA, a lot of colleagues asked us whether …

Continue reading...

Does Data Pump import only serially into PDBs?

Daniel and I did Web Seminars this week. One of them was about the different migration strategies. We did talk a lot about Data Pump. One attendee mentioned that Data Pump Import does not work in parallel into PDBs in Oracle and Oracle 18c. We were skeptical, and today I tried it out to check: Does Data Pump import only serially into PDBs?

Very simple test setup

For this test, I take a schema export from a database in our Hands-On Lab: the TPCC user that HammerORA uses in the UPGR database.…
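A minimal sketch of such a test, with placeholder directory, dump file and service names (not the ones from the lab): export the schema with parallelism, import it into the PDB with PARALLEL set, and watch how many Data Pump workers are actually active.

$ expdp system@UPGR SCHEMAS=TPCC DIRECTORY=DATA_PUMP_DIR DUMPFILE=tpcc_%U.dmp PARALLEL=4 LOGFILE=exp_tpcc.log
$ impdp system@//localhost:1521/PDB1 SCHEMAS=TPCC DIRECTORY=DATA_PUMP_DIR DUMPFILE=tpcc_%U.dmp PARALLEL=4 LOGFILE=imp_tpcc.log

-- while the import runs, check the requested degree and the running Data Pump jobs
SQL> SELECT owner_name, job_name, state, degree FROM dba_datapump_jobs;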

Continue reading...

Virtual Classroom: Migration Strategies – Tips & Tricks & Secrets

October 13, 2020 – 11:00h CEST | 13:00 GST | 12:00 EEST | 10:00 BST
MIGRATION STRATEGIES – Tips and Tricks and Insights and Secrets

In the fourth summer webinar about Multitenant, we touched on some migration strategies. And now it’s time for us to dig deeper. We’d like to offer you further insights and a deep dive from a technical point of view, starting with Data Pump and Transportable Tablespaces, then Full Transportable Export/Import, and adding RMAN incremental backups to decrease the downtime. Best practices and real-world experience will round out this two-hour webinar.

Continue reading...

Upgrade to Oracle 19c – Virtual Classroom Series – Parts 5 and 6

October will be a very busy month. Daniel and I are putting the slides together at the moment for the upcoming Upgrade to Oracle 19c – Virtual Classroom Series – Parts 5 and 6. We will speak about “Database Migration Techniques” and about “Move to the Cloud – For Techies Only“. No worries if you missed Parts 1-4 as you can watch all of them online on-demand. Find all links below.

Register for the Virtual Classroom Part 5 and Part 6 events

Find the overview for both events here …

Continue reading...

Database Migration from non-CDB to PDB – Migration with Data Pump

You may have realized that a few techniques describing how to do a Database Migration from non-CDB to PDB are still missing – Migration with Data Pump is one of them. I will explain the simplest approach of going to Single- or Multitenant. It isn’t the coolest – and it isn’t very fast once your database has a significant size. But it is not complex. And it allows you to move even from very old versions directly into an Oracle 19c PDB – regardless of patch levels or source and destination platform.
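As a rough sketch of what this approach boils down to (directory, file and service names are placeholders, not taken from the post): a full export on the non-CDB source, followed by an import into the precreated, empty PDB.

# on the source (non-CDB)
$ expdp system FULL=Y DIRECTORY=DATA_PUMP_DIR DUMPFILE=src_full_%U.dmp PARALLEL=4 LOGFILE=exp_full.log

# on the target: import into the new 19c PDB
$ impdp system@//target-host:1521/MYPDB FULL=Y DIRECTORY=DATA_PUMP_DIR DUMPFILE=src_full_%U.dmp PARALLEL=4 LOGFILE=imp_full.log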

High Level Overview

Endianness change
Continue reading...

Behavior Change in Oracle 18c/19c: No symbolic links for Data Pump directories

Most of you may have noticed the desupport of UTL_FILE_DIR with Oracle Database 18c. The reason is mostly that UTL_FILE_DIR opens a lot of possibilities to do insecure things. But this has another effect which may not be obvious to you. There’s a behavior change in Oracle 18c/19c: No symbolic links for Data Pump directories. Read further to learn more about it.

UTL_FILE_DIR Desupport

In Oracle Database 18c we announced the desupport of the UTL_FILE_DIR initialization parameter. But as you can also read in the Database Upgrade Guide, there’s this remark:

UTL_FILE Package Symbolic Link in Directory Paths Not Supported
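In practice this means the path behind a Data Pump DIRECTORY object must be the real, physical path; a symbolic link somewhere in that path will no longer work. A minimal sketch, with a placeholder path:

-- point the directory object at the resolved physical path, not at a symlink
SQL> CREATE OR REPLACE DIRECTORY dp_dir AS '/u01/app/oracle/admin/MYDB/dpdump';

$ expdp system DIRECTORY=dp_dir DUMPFILE=mydb.dmp SCHEMAS=HR LOGFILE=exp_hr.log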

Continue reading...

Data Pump: The Time Zone Pitfalls

Last week a very experienced colleague called me. He had issues importing an export dump taken from an Oracle 20c database into 19c. It failed. So this blog post is about Data Pump: The Time Zone Pitfalls.

The Case

Usually you will see this issue only when you try to export from a higher version and then attempt to import into a lower one. But the same thing can happen when you have patched your databases only partially with a newer time zone patch, regardless of the version.
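A quick, hedged way to see whether you are heading for this pitfall is to compare the time zone file versions of source and target before the export/import – the importing database must be on the same or a newer time zone file version than the exporting one:

-- run this on both source and target
SQL> SELECT version FROM v$timezone_file;
SQL> SELECT property_value FROM database_properties WHERE property_name = 'DST_PRIMARY_TT_VERSION';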

My colleague saw this error:

impdp system/welcome1@//localhost:1521/MYDB 
Continue reading...

OOW 2018: How to Migrate to Oracle Autonomous Database Cloud

And last but not least, our fourth talk at OOW 2018: How to Migrate to Oracle Autonomous Database Cloud, on Tuesday, Oct 23, at a fairly late slot of 5:45 p.m. (or 17:45h), again in Moscone West 3004.

What is this talk all about?

It’s about the Oracle Autonomous Database cloud. But not so much about the Cloud offering itself. There are other talks covering this topic. We are the upgrade and migration guys. Hence, in this talk my boss, Roy Swonger, and my PM mate Bill Beauregard will show you in a live demo how to migrate …

Continue reading...

Data Pump – Wrong Dump File Version – ORA-39142

Again I’ll have to thank my colleague Roland Gräff from the German ACS Support team in Stuttgart for bringing this onto our radar. Roland alerted me a week ago about an issue with exports which happens only when you are on a certain patch level. I summarize the issue here under Data Pump – Wrong Dump File Version – ORA-39142.

In the blog post below you will learn about the actual issue, where and when it happens, and of course how to work around it.

When does it happen?

The issue I will describe below happens only with

Continue reading...

Export with Data Pump and Long Identifiers

I blogged a few days ago about Long Identifiers in Oracle Database 12.2 and accessing the objects via database links from a lower database version. As this raised a few questions, I realized that a bit more clarification may be necessary. One question was about what happens during an export with Data Pump and Long Identifiers. That’s a pretty good question.

I’d like to demonstrate the effect with a short example. I’m doing all my tests in a fresh PDB inside an Oracle 18.1.0 CDB from our Hands-On Lab. But you can repeat …

Continue reading...

Upgrade and Migration Office Hours on March 21, 2018

The first Office Hours session was just a month ago. It was a very good experience, and based on the feedback we received from customers it was very helpful. You don’t believe it? Then you may watch the recording. But now it’s time for our second Upgrade and Migration Office Hours on March 21, 2018.

What are “Office Hours”?

This is a new interactive format we’d like to try out. Several groups in Oracle schedule Office Hours sessions. Roy and I held our first Office Hours session on February 21, 2018 for a full hour trying to answer …

Continue reading...

Data Pump’s amazingly useful METRICS=Y and LOGTIME=ALL parameters

Now that I am back from OpenWorld, I will hijack the Upgrade blog while Mike is traveling. 🙂
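The two parameters from the title go straight onto the expdp or impdp command line (or into the parameter file); a minimal sketch with placeholder connect string and file names:

$ expdp system@MYDB SCHEMAS=HR DIRECTORY=DATA_PUMP_DIR DUMPFILE=hr.dmp METRICS=YES LOGTIME=ALL LOGFILE=exp_hr.log

# METRICS=YES adds object counts and timings per object path to the logfile,
# LOGTIME=ALL prefixes every log line with a timestamp – ideal for spotting the slow phase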

Thank you to everybody who came to our presentations or stopped at the demo booth to chat last week. We had a great many conversations, and we always learn from talking to customers! One of the common questions about Data Pump came in the form, “I have a Data Pump job that used to run in X minutes, but now takes <multiple of X> minutes. Can you tell me what might be happening?”

Of course with that much information we

Continue reading...

Data Pump – Exclude Stats differently for TTS and FTEX

Nice little best practice for statistics and Data Pump when doing either Transportable Tablespaces or Full Transportable Export-Import (credits to Roy and Dean Gagne).

Transport Statistics via a Staging Table

First of all, we always recommend excluding statistics when doing a Data Pump export, as the import of such stats takes way longer than transporting them via a stats table. If you are unfamiliar with transporting stats between databases, please see the Oracle Performance Tuning Guide, which has a nice tutorial:
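As a quick sketch of that idea (schema, table and file names are placeholders, not taken from the tutorial): exclude the statistics from the Data Pump job itself and move them separately via a staging table.

# exclude statistics from the Data Pump export
$ expdp system SCHEMAS=SCOTT DIRECTORY=DATA_PUMP_DIR DUMPFILE=scott.dmp EXCLUDE=STATISTICS LOGFILE=exp_scott.log

-- on the source: put the schema stats into a staging table (and export that table as well)
SQL> EXEC DBMS_STATS.CREATE_STAT_TABLE(ownname => 'SCOTT', stattab => 'MYSTATS');
SQL> EXEC DBMS_STATS.EXPORT_SCHEMA_STATS(ownname => 'SCOTT', stattab => 'MYSTATS');

-- on the target: after importing the staging table, load the stats from it
SQL> EXEC DBMS_STATS.IMPORT_SCHEMA_STATS(ownname => 'SCOTT', stattab => 'MYSTATS');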

The basic steps to transport statistics from one database to another fast and

Continue reading...

Some Data Pump issues: DBMS_DATAPUMP Import via NETWORK_LINK fails + STATUS parameter giving bad performance

One of my dear Oracle ACS colleagues (thanks, Thomas!) highlighted this issue to me as one of his lead customers hit this pitfall a week ago.

DBMS_DATAPUMP Import Over NETWORK_LINK fails with ORA-39126 / ORA-31600

Symptoms are:

ORA-31600: invalid input value IN ('VIEWS_AS_TABLES/TABLE_DATA') for parameter VALUE in function SET_FILTER

This can be cured with the patch for bug 19501000 – but this patch can conflict with bug 18793246 (EXPDP slow, showing a base object lookup during Data Pump export that causes a full table scan per object), and therefore may require a merge patch. Patch 21253883 is the one to go …
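For reference, this is roughly the shape of a DBMS_DATAPUMP import over a NETWORK_LINK that can run into the error above – a generic, hedged sketch with placeholder link, schema and directory names (it does not reproduce the VIEWS_AS_TABLES filter from the error message):

DECLARE
  h          NUMBER;
  job_state  VARCHAR2(30);
BEGIN
  -- no dump file involved: the data is pulled directly over the database link
  h := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'SCHEMA',
                          remote_link => 'SOURCE_DB_LINK');
  DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'net_imp.log',
                         directory => 'DATA_PUMP_DIR',
                         filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  DBMS_DATAPUMP.METADATA_FILTER(handle => h, name => 'SCHEMA_EXPR', value => q'[IN ('HR')]');
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.WAIT_FOR_JOB(h, job_state);
END;
/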

Continue reading...

Parallel Index Creation with Data Pump Import

Here is a new capability that might be interesting to anybody who is performing a migration using Data Pump. Previously, Data Pump would create indexes one at a time, specifying the PARALLEL keyword for the CREATE INDEX statement to invoke parallel query for index creation. We used to recommend a workaround to create indexes in parallel, which involved a three-step process: importing without indexes, then creating a SQLFILE of the CREATE INDEX statements, and splitting that file so the statements could run in multiple sessions.
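The old three-step workaround, as a hedged sketch with placeholder names (not the exact commands from the post):

# step 1: import everything except the indexes
$ impdp system SCHEMAS=HR DIRECTORY=DATA_PUMP_DIR DUMPFILE=hr.dmp EXCLUDE=INDEX LOGFILE=imp_noidx.log

# step 2: write the CREATE INDEX statements into a script instead of executing them
$ impdp system SCHEMAS=HR DIRECTORY=DATA_PUMP_DIR DUMPFILE=hr.dmp INCLUDE=INDEX SQLFILE=create_indexes.sql

# step 3: split create_indexes.sql into several scripts and run them in separate sessions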

Through extensive performance testing we found that it is faster to create multiple indexes in parallel (using a parallel …

Continue reading...

Full Transportable Export/Import White Paper

One of the really cool and helpful features in Oracle Database 12c is called Full Transportable Export/Import. It combines the basics of transportable tablespaces – of course cross-platform, cross-endianness, cross-version – with Data Pump taking care of all the stuff not stored in tables and indexes, such as views, synonyms, triggers, packages, etc. And you can even reduce downtime by combining the feature with incrementally converted RMAN backups.
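On the command line, a full transportable export is surprisingly compact – a minimal, hedged sketch with placeholder directory and file names (VERSION=12 is the bit that tells an older source database to produce a 12c-style full transportable dump):

$ expdp system FULL=Y TRANSPORTABLE=ALWAYS VERSION=12 DIRECTORY=DATA_PUMP_DIR DUMPFILE=ftex.dmp LOGFILE=ftex_exp.log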

And the best thing: this will work when migrating an Oracle Database to Oracle Database 12c – so you don’t have to be on Oracle Database 12c in order to …

Continue reading...

Data Pump: Consistent Export?

Ouch … I have to admit that in several workshops in the past weeks I did say that a Data Pump export with expdp is per se consistent.

Well … I thought it is … but it’s not. Thanks to a customer who is doing a large Unicode migration at the moment. We were discussing parameters in the expdp par file. And I did ask my colleagues after doing some research on MOS. And here are the results of my “research”:

  • MOS Note 377218.1 has a nice example showing a data pump export of a partitioned table with DELETEs on
Continue reading...
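To make the consistency point above concrete: by default each table in an expdp export is consistent in itself, but the export as a whole is not tied to a single point in time. If you need that, you add FLASHBACK_TIME or FLASHBACK_SCN – a minimal, hedged sketch with placeholder names (sufficient undo retention is assumed, and older releases need a TO_TIMESTAMP expression instead of SYSTIMESTAMP):

# make the whole export consistent as of the moment the job starts
$ expdp system SCHEMAS=HR DIRECTORY=DATA_PUMP_DIR DUMPFILE=hr.dmp FLASHBACK_TIME=SYSTIMESTAMP LOGFILE=exp_hr.log

# or pin it to an explicit SCN (the number is just a placeholder)
$ expdp system SCHEMAS=HR DIRECTORY=DATA_PUMP_DIR DUMPFILE=hr.dmp FLASHBACK_SCN=1234567 LOGFILE=exp_hr.log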

Exclude DBMS_SCHEDULER Jobs from expdp?

You have never thought about excluding DBMS_SCHEDULER jobs from a Data Pump export? Me neither, but I recently got a copy of an email about such a customer case from Roy, who owns Data Pump as well. And this is the code example from Dean Gagne:


exclude=procobj:"IN (SELECT NAME FROM sys.OBJ$ WHERE TYPE# IN
  • This will work only on export
  • It’s an all or nothing approach

Quite interesting, isn’t it?…

Continue reading...

How to get the Master Table from a Data Pump expdp?

Interesting question a customer had last week during the Upgrade Workshop in Munich. He’s getting export dump files from several customers, often with not much information describing the contents. So how can he find out what’s in there, what the source character set was, etc.?

This seems to be a simple question but it did cost me a few searches and tests to come back with some (hopefully) useful information.

First attempt: $strings expdp.dmp > outexpdp.txt

I bet there are better ways to do this but in my case this will give me:

x86_64/Linux 2.4.xx
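An alternative to running strings over the dump (a hedged sketch, not from the post above; names are placeholders): let impdp load only the Data Pump master table and query it to see what is in the dump.

# load just the master table from the dump, then stop the job
$ impdp system DIRECTORY=DATA_PUMP_DIR DUMPFILE=expdp.dmp MASTER_ONLY=YES KEEP_MASTER=YES

-- the master table is created in the importing schema and named after the job, e.g. SYS_IMPORT_FULL_01
SQL> SELECT object_type, object_schema, object_name
     FROM   system.SYS_IMPORT_FULL_01
     WHERE  object_name IS NOT NULL;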
Continue reading...