If you use Oracle Data Pump to back up tables containing LONG or LONG RAW columns, you might be in for a surprise when you try a recovery. Well, you tested it already, didn't you? ;-)
Right now I'm in the middle of a production migration. Earlier this week, while testing this migration, I noticed a couple of strange errors during Data Pump import:
ORA-31693: Table data object "OWNER"."TABLE" failed to load/unload and is being skipped due to error:
ORA-06502: PL/SQL: numeric or value error: Bulk Bind: Truncated Bind
At some point during verification, users notified me that they couldn't see Discoverer reports stored in the database. After some poking around, we figured out that the reports are stored in the EUL5_DOCUMENTS table (thanks to Rob Hamel for the help) and that this table contains a LONG RAW column. Guess what? The table was empty in the newly migrated database.
It turned out that Data Pump Import (impdp) doesn't load LONG data into LONG columns. Instead, it requires that LONG and LONG RAW columns be converted to the corresponding LOB datatypes (CLOB/BLOB); only then is it able to import those tables. Otherwise, the tables come across empty.
One may argue that LONG data types are supposed to be converted to LOBs anyway but, hey, let's get back to reality: either there is no time for that, or no appetite for the risk of touching working applications. In fact, in an HA environment no one will approve combining a data move and application refactoring in the same change; they are done one at a time to mitigate the risks. Oracle Data Pump doesn't give you that option, so the old Export/Import utilities have to be used. Even CTAS (Create Table As Select) over a database link doesn't work; it fails with ORA-00997: illegal use of LONG datatype.
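If converting really is on the table, the mechanics themselves are simple; here is a minimal sketch, assuming a made-up table APP_OWNER.DOCS with a LONG column (a table can hold at most one LONG or LONG RAW column, hence one ALTER per table):

```sql
-- In-place conversion (all names are invented for illustration):
alter table app_owner.docs modify (report_body clob);  -- LONG -> CLOB; use BLOB for LONG RAW

-- Or copy into a new table with TO_LOB, leaving the original untouched
-- (TO_LOB is only valid inside CTAS or INSERT ... SELECT):
create table app_owner.docs_lob as
  select doc_id, to_lob(report_body) as report_body
  from app_owner.docs;
```

Either way, that's an application change, which is exactly what you don't want bundled into a migration weekend.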
Now go back to your databases backed up with Data Pump Export and run:
select DECODE(count(*),0,'Not much','I''m screwed '||count(*)||' times') "What's up?"
from dba_tab_columns
where data_type like 'LONG%'
and owner not in ('SYS','SYSTEM','OUTLN','EXFSYS','SYSMAN','WMSYS',
'and whatever other standard schemas where Oracle ignores its own recommendations');
Today is Sunday and I don’t want to finish on the negative note. So here is a good one…
Yesterday I went to Produce Depot to grab some fruits and vegetables. By the way, if you live in Canada and don't know this shop, I strongly recommend finding one near you. It has a great selection of quality vegetables and fruits; they are very fresh, and the prices are surprisingly low.
So back to the story… I picked up a couple of heads of broccoli and was looking around for a bag. I must have looked really lost, because a man passing by offered me a bag he had prepared in advance… even a second one when my broccoli didn't fit into a single bag, and off he went to fetch another two for himself. How cool is that?! I don't remember anything even close to that back in Germany.
This is just a single example, but you know what? It's amazing how friendly and open people are here in Canada. Well, I can't say that people are unfriendly in Germany, but you always feel a distance. It's just cultural and, to be fair, as soon as you get to know someone closer, they are nice and charming people.
PS: In the meantime, the migration has finished. Everything works like a charm. That was a long on-call weekend and I, perhaps, deserve an unscheduled holiday tomorrow. That's more good news. Life is good!
17 Comments
Great post, Alex. I like produce depot too. Next spring, remind me to give you a guided tour of the Parkdale Market – I know the stands that are actual producers as opposed to retailers of others’ local farming. :-)
There are two other markets that try to limit what they sell to local produce:
Ottawa Farmers Market: https://www.spcottawa.on.ca/ofsc/en/buylocal/farm%20pages/ottawafarmers.html
Ottawa Organic Farmers Market:
https://www.spcottawa.on.ca/ofsc/en/buylocal/farm%20pages/ottawaorganic.html
For a guide on this matter:
https://www.spcottawa.on.ca/ofsc/en/buylocal/buy_local_home.htm
That's a great write-up, but it definitely leaves me concerned about how I will handle my huge tables with LONG columns when we upgrade.
Good old Export/Import are your friends.
You can also plan a migration to LOBs before the upgrade.
Another alternative: import as LOBs and transform back to LONGs.
Or just choose a regular upgrade (i.e. catproc etc.).
Our old friend exp/imp is getting a bit dotty, too.
Importing triggers into another schema changes the owner of the trigger, but leaves it defined on the original schema's table. Then if you try to create or replace…
From metalink:
Error: ORA 4095
Text: trigger already exists on another table, cannot replace
——————————————————————————-
Cause: An attempt was made to replace a trigger that exists on another table.
Action: Re-create the trigger on the other table using the CREATE OR REPLACE
TRIGGER statement.
From 9iR2:
$ oerr ora 4095
04095, 00000, "trigger '%s' already exists on another table, cannot replace it"
// *Cause: Cannot replace a trigger which already exists on a different table
// than the one being replaced.
// *Action: Drop the trigger with the same name and re-create it.
It just irritated me that create or replace would not be able to replace. Thank you for letting me vent.
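For what it's worth, the behavior is easy to reproduce by hand; a minimal sketch with invented names:

```sql
-- Two unrelated tables and one trigger name (all names invented):
create table t1 (n number);
create table t2 (n number);

create trigger trg_demo before insert on t1
for each row
begin
  null;
end;
/

-- Fails with ORA-04095: the trigger name is taken by the trigger on T1,
-- and CREATE OR REPLACE will not silently move it to T2.
create or replace trigger trg_demo before insert on t2
for each row
begin
  null;
end;
/
```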
Joel,
Thanks for your comment.
I meant using exp/imp as a workaround to move data in tables with LONG columns – that's what I did, as the volume was low. Anyway, I bet Data Pump is still far buggier than its older exp/imp counterpart. ;)
The trigger issue is irritating indeed. It seems explainable that moving a trigger to another schema leaves it created on the old table (if one exists in the target DB). It's the same as referencing a schema prefix in the trigger text – in no way will import change the source for you. It would be really nice if imp could track that the underlying table was moved to another schema during import. By the way, do you know if that's also an issue with Data Pump? I expect it might be, as this is a design thing.
Hi Alex,
I use Data Pump to move tables with LONGs from database to database using the Data Pump packages. The only drawback I found is that it cannot be done through a network link, but for that situation I created a procedure on the source database that performs the export; then I move the file to the target server with dbms_file_transfer.get_file and import the contents into the target database.
This is an excerpt of the code:
if table_has_longs(vtable_name, vschema) then
  -- Perform the Data Pump export at the source database and transfer the file
  execute immediate 'begin [email protected]'||vdblink||'(:1,:2,:3); end;'
    using vtable_name, vschema, vsys_chng_nmbr;
  if vsys_chng_nmbr = 0 then
    dbms_output.put_line('Error exporting table at the source');
    return;
  end if;
  dbms_file_transfer.get_file('DATA_PUMP_DIR', vschema||'_'||vtable_name||'.dmp',
    vdblink, 'DATA_PUMP_DIR', vschema||'_'||vtable_name||'.dmp');
  vhandle := dbms_datapump.open('IMPORT','TABLE');
  dbms_datapump.add_file(vhandle, vschema||'_'||vtable_name||'.dmp', 'DATA_PUMP_DIR');
else
  -- Perform a Data Pump import through a database link
  vhandle := dbms_datapump.open('IMPORT','TABLE', vdblink);
  dbms_datapump.metadata_filter(vhandle, 'EXCLUDE_PATH_LIST', '''TRIGGER'',''GRANT''');
  dbms_datapump.set_parameter(vhandle, 'FLASHBACK_SCN', vsys_chng_nmbr);
end if;
The LONG fields are populated with no issues. No LOB conversion has to be done.
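(The excerpt above opens the job but doesn't show it running; presumably the full package finishes along these lines – the variable names are carried over from the excerpt, and vjob_state is an assumed OUT variable:)

```sql
-- Hypothetical completion of the excerpt: restrict the job to the one table,
-- then start it and wait for it to finish before detaching.
dbms_datapump.metadata_filter(vhandle, 'NAME_LIST', ''''||vtable_name||'''', 'TABLE');
dbms_datapump.start_job(vhandle);
dbms_datapump.wait_for_job(vhandle, vjob_state);
dbms_datapump.detach(vhandle);
```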
Drop me an e-mail if you want me to send you the full package.
Have a great day,
Aldo
Hello Aldo,
I use impdp to import over the network a lot, but tables with LONG columns are an obvious problem.
I would really appreciate it if you could send me the full script. It sounds like an elegant solution.
John Goodhue
[email protected]
Would you please provide your valuable solution to me as well? Many thanks for your help.
mabosleem at gmail dot com
Any chance someone could ask Aldo Bravo to send me that package? I need to do something very similar and I'd rather not reinvent the wheel.
Thanks!
John Darrah
John, I forwarded your request and email address to Aldo.
Same here! I would like to have the package of Aldo Bravo, thank you!
Any news from Oracle on when they are going to fix this?! Or any link on their website about it?
Any chance I can get that Aldo as well?
John (Bari), I forwarded your request and email address to Aldo as well.
Please, Alex, forward the package to me as well.
khalil.wael at gmail dot com
If you could please provide me the entire package, it would be helpful.
Please also send me the process to impdp from a backup of a table with a LONG column.
Sincere thanks