Uh … my blog counter tells me I have 57 unanswered comments in the queue at the moment. Forgive me, but July and August so far have been really busy months with some travel activity as well. And another proof that I'm not used to traveling anymore: last week I forgot the power adapter for my Mac at home and only realized it when I pulled out my Mac to do emails. Anyhow, this is a short Monday vacation-time blog post about two recent issues you may need one-off patches for, since you may otherwise get trapped by them as well. Both happen only with more recent RUs.
Data Guard Broker configuration issue when you patch to 19.16.0
This issue was found by a Swiss customer who patched from 19.15.0 to 19.16.0 a few days after the RU got released. All worked fine, except that as soon as they switched the standby environment to 19.16.0 (before invoking datapatch), they received an "ORA-16705: internal error in Data Guard broker". Of course, they found this on their test environment first. But disabling and enabling the broker configuration did not help. And recreating the configuration was not an option since they operate over 200 Data Guard environments.
Luckily the MAA and Oracle Support reacted quickly – and a one-off patch is available. As far as I see, the fix may be included in a future RU. Meanwhile, please see:
- Bug 34446152 – Broker: 19.16 onward broker shows "ORA-16705: internal error in Data Guard broker"
- MOS Note: 2887535.1 – ORA-16705: Internal Error In Data Guard Broker After Applying Release Update 19.16 Patch
So if you have Data Guard environments, you should apply this one-off on top of your 19.16.0 installation before you switch them over to the new home.
The fix is confirmed to be included in the 19.18.0 RU in January 2023.
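If you want to verify whether a given home already contains the fix, you can search the OPatch inventory listing for the bug number. Here is a minimal sketch, assuming the patch number appears verbatim in the `opatch lsinventory` output (check the listing format on your OPatch version):

```python
# Sketch: look for a specific bug/patch number in an `opatch lsinventory`
# listing. Assumes the number appears as a whole token in the output of
# applied interim patches.
import re
import shutil
import subprocess

def bug_listed(inventory_text: str, bug_no: str) -> bool:
    """Return True if the bug/patch number appears as a whole token."""
    return re.search(rf"\b{re.escape(bug_no)}\b", inventory_text) is not None

if __name__ == "__main__" and shutil.which("opatch"):
    # Requires a configured ORACLE_HOME with OPatch on the PATH.
    out = subprocess.run(["opatch", "lsinventory"],
                         capture_output=True, text=True).stdout
    print("Fix for bug 34446152 present:", bug_listed(out, "34446152"))
```

If the grep comes up empty on your 19.16.0 home, request and apply the one-off before touching your Data Guard environments.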
Datapatch errors out with prereq: archived patch directory before 19.15.0
This is a really obscure issue. And it took me a bit to understand it, which required the help of our datapatch team (Thanks a lot, Santosh!). Again, in this case a VERY large customer hosting a lot of Oracle database environments made me aware of this issue (Thanks, Amit!!).
Before I explain the issue with my own words, let me point you to:
- MOS Note: 2235541.1 – datapatch -verbose Fails with Error: "Patch xxxxxx: Archived Patch Directory Is Empty"
- Bug 33557344 – 19.x: datapatch fails in out of place patching with prereq: archived patch directory
This problem happens when you patch to 19.12.0, 19.13.0 or 19.14.0 – and then later you go to 19.15.0 or newer with – and this is important – one-off patches in your source home. So those of you who run a plain 19.14.0 right now for example won’t be affected. But if you have one-off patches on top, then datapatch will attempt to roll them back at first when you jump to a higher home, for instance 19.16.0.
Even though the fix for Bug 33557344 is included in 19.15.0 and newer, the issue may be silently sleeping in your current home. So to clarify again: when you upgrade directly from an older release to 19.15.0, you will never see this issue. And in case you have no one-off patches applied, you won't see this issue either.
But since I know that many of you readers are patching experts, I assume that you first patched to 19.12.0, 19.13.0 or 19.14.0, and then needed to apply one-off patches on top as well.
Now, let me summarize the various scenarios as far as I understood them – with potential solutions of course where applicable:
- If you are on 19.12.0, 19.13.0 or 19.14.0 with one-off patches, you may see this issue when you patch to a newer RU.
- Solution 1:
Before you patch to 19.15.0 or newer, roll back the one-off patches in your current home manually with:
datapatch -rollback all -force
- Solution 2:
Copy the contents of your source $ORACLE_HOME/sqlpatch to your target $ORACLE_HOME/sqlpatch.
- If you used cloning to create your new home and then applied, let's say, 19.16.0 to it, you won't see this issue since you cloned the $ORACLE_HOME/sqlpatch directory as well.
- If you have no one-off patches in your 19.12.0, 19.13.0 or 19.14.0 homes, you won't be affected.
- If you are on 19.11.0 or lower at the moment, even with one-off patches, and you will patch to 19.15.0 or newer, you won’t see this issue since the fix is included from 19.15.0 on already.
- If you plan to patch to 19.12.0, 19.13.0 or 19.14.0, then please apply the one-off for Bug 33557344 BEFORE you patch or upgrade. If you do so, you won't see this issue and you don't have to juggle manual rollbacks or directory tree copies.