From time to time I deal with Transportable Tablespaces as an upgrade and migration approach. But TTS has one major flaw, especially when databases get really big, into the 3-digit-terabyte range or beyond: you will need to switch your tablespaces into read-only mode. But since Oracle Database 19c, you can Test Transportable Tablespaces without Read-Only Mode.

Transport Constraints
Transportable Tablespaces (TTS) is a very common technique, especially when you need to migrate databases across endianness. Since Oracle 10g, TTS even works from big-endian to little-endian platforms, and vice versa. Only if your tablespaces are TDE-encrypted may you need to decrypt them first. But the biggest constraint arises when you’d like to migrate a very large database: if you’d like to check how long the transport will take, you need to switch your tablespaces to read-only. And of course, this is not an option for a production database. Not everybody has enough space available to keep a duplicate copy of a 1.2 PB DWH.
Tablespaces with many partitions and subpartitions in particular can make the TTS export and import very time-consuming. So you may want to test this. And this is where this new Oracle 19c Data Pump feature comes in.
Test TTS without Read-Only Mode
In Oracle Database 19c, expdp now supports a new parameter, TTS_CLOSURE_CHECK. And it has a special value to test TTS: TEST_MODE. This allows you to do a test export for the transport without the need to take the tablespaces read-only. But the resulting dump file can’t be imported afterwards.
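For context, TEST_MODE is one of several values this parameter accepts. As documented for 19c Data Pump (double-check the exact list for your release), the choices are:

```
tts_closure_check=ON          # check that the tablespace set is self-contained (closure check)
tts_closure_check=OFF         # skip the closure check
tts_closure_check=FULL        # full bidirectional closure check
tts_closure_check=TEST_MODE   # test export only - tablespaces stay read-write, dump file unusable
```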
Just to clarify this (thanks Frits for your great feedback):
The following steps will only allow you to test the export part of Transportable Tablespaces. There is no such test for the import portion yet. Hence, this is meant especially for situations where you have no test environment but can’t acquire downtime to check whether you can export the tablespace’s metadata correctly, and measure how long this will take.
It is very simple – here is an example par file:
metrics=yes
logtime=all
directory=DATA_PUMP_DIR
dumpfile=tts.dmp
logfile=ttsexport.log
tts_closure_check=TEST_MODE
transport_tablespaces=(TTS)
I created a tablespace TTS in my PDB – and then ran expdp without taking the TTS tablespace read-only:
$ expdp system/oracle@pdb1 parfile=tts.par

Export: Release 19.0.0.0.0 - Production on Wed Sep 30 21:59:23 2020
Version 19.8.0.0.0

Copyright (c) 1982, 2019, Oracle and/or its affiliates. All rights reserved.

Connected to: Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
30-SEP-20 21:59:28.238: Starting "SYSTEM"."SYS_EXPORT_TRANSPORTABLE_01": system/********@pdb1 parfile=tts.par
30-SEP-20 21:59:28.803: W-1 Startup took 0 seconds
30-SEP-20 21:59:30.119: W-1 Processing object type TRANSPORTABLE_EXPORT/STATISTICS/TABLE_STATISTICS
30-SEP-20 21:59:30.223: W-1 Completed 3 TABLE_STATISTICS objects in 0 seconds
30-SEP-20 21:59:34.420: W-1 Processing object type TRANSPORTABLE_EXPORT/STATISTICS/MARKER
30-SEP-20 21:59:34.680: W-1 Completed 1 MARKER objects in 4 seconds
30-SEP-20 21:59:37.344: W-1 Processing object type TRANSPORTABLE_EXPORT/PLUGTS_BLK
30-SEP-20 21:59:37.370: W-1 Completed 1 PLUGTS_BLK objects in 0 seconds
30-SEP-20 21:59:37.399: W-1 Processing object type TRANSPORTABLE_EXPORT/POST_INSTANCE/PLUGTS_BLK
30-SEP-20 21:59:37.417: W-1 Completed 1 PLUGTS_BLK objects in 0 seconds
30-SEP-20 21:59:42.120: W-1 Processing object type TRANSPORTABLE_EXPORT/TABLE
30-SEP-20 21:59:42.141: W-1 Completed 3 TABLE objects in 2 seconds
30-SEP-20 21:59:46.277: W-1 Master table "SYSTEM"."SYS_EXPORT_TRANSPORTABLE_01" successfully loaded/unloaded
30-SEP-20 21:59:46.299: ******************************************************************************
30-SEP-20 21:59:46.299: Dump file set for SYSTEM.SYS_EXPORT_TRANSPORTABLE_01 is:
30-SEP-20 21:59:46.301:   /u01/app/oracle/admin/CDB2/dpdump/B08E2264E5651243E055000000000001/tts.dmp
30-SEP-20 21:59:46.302: Dump file set is unusable. TEST_MODE requested.
30-SEP-20 21:59:46.302: ******************************************************************************
30-SEP-20 21:59:46.304: Datafiles required for transportable tablespace TTS:
30-SEP-20 21:59:46.305:   /u02/oradata/CDB2/pdb1/tts01.dbf
30-SEP-20 21:59:46.328: Job "SYSTEM"."SYS_EXPORT_TRANSPORTABLE_01" successfully completed at Wed Sep 30 21:59:46 2020 elapsed 0 00:00:20
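To verify that the tablespace really stayed read-write throughout, you can query its status before and after the test export (a quick sketch; assumes the TTS tablespace from above):

```sql
-- ONLINE means read-write; a tablespace prepared for a real transport would show READ ONLY
SELECT tablespace_name, status
  FROM dba_tablespaces
 WHERE tablespace_name = 'TTS';
```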
Please note the message in the log output:
Dump file set is unusable. TEST_MODE requested.
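If you run such test exports from a script, this marker line is easy to check for. A small sketch (the log file name matches the par file above; the echo merely stands in for a log actually written by expdp):

```shell
# Hedged sketch: detect a TEST_MODE dump from the Data Pump export log.
# In a real run, expdp writes ttsexport.log; here one line is faked for illustration.
log=ttsexport.log
echo 'Dump file set is unusable. TEST_MODE requested.' > "$log"

if grep -q 'TEST_MODE requested' "$log"; then
  test_mode=yes
else
  test_mode=no
fi
echo "test-mode dump: $test_mode"
```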
And if you still want to import?
Well, this won’t work. Here is my import par file, imptts.par:
userid=system/oracle@pdb1
metrics=yes
logtime=all
directory=DATA_PUMP_DIR
dumpfile=tts.dmp
logfile=ttsimport.log
transport_datafiles='/u02/oradata/CDB2/pdb1/tts01.dbf'
When I call the import, I receive this error:
$ impdp parfile=imptts.par

Import: Release 19.0.0.0.0 - Production on Wed Sep 30 22:45:02 2020
Version 19.8.0.0.0

Copyright (c) 1982, 2019, Oracle and/or its affiliates. All rights reserved.

Connected to: Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-39398: Cannot load data. Data Pump dump file "/u01/app/oracle/admin/CDB2/dpdump/B08E2264E5651243E055000000000001/tts.dmp" was created in TEST_MODE.
So it does not work. The dump file is clearly marked as being created in TEST_MODE. And Data Pump prevents the import as files created with TEST_MODE aren’t consistent.
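For the real transport later on, the classic sequence still applies: switch the tablespace read-only, export without TEST_MODE, copy the datafiles, and switch back. A minimal sketch using the names from above:

```sql
ALTER TABLESPACE tts READ ONLY;
-- now run expdp with the same par file, but without tts_closure_check=TEST_MODE,
-- and copy /u02/oradata/CDB2/pdb1/tts01.dbf to the target system
ALTER TABLESPACE tts READ WRITE;
```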
–Mike
Hi Mike,
I know this feature (TTS_CLOSURE_CHECK=TEST_MODE) is available only in 19c, to predict the time taken for the export part. I have an 18c database as source, but the target is 19c. Is there any way I can test/predict the time to export the metadata in 18c without putting the database into read-only mode?
Hi Sanjay,
unfortunately there is only one way: You need to try it out.
And as a rule of thumb, we assume that the import takes 3-4 times as long as the export in most cases.
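Put into numbers (hypothetical figures, just illustrating the rule of thumb):

```shell
# Hedged sketch: derive a rough import window from a measured test-export time
export_min=120   # example: the TEST_MODE export took 2 hours
low=$(( export_min * 3 ))
high=$(( export_min * 4 ))
echo "rough import estimate: ${low}-${high} minutes"
```

With a 2-hour test export, this estimates an import window of 360-480 minutes.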
Cheers
Mike