
Export from Oracle dump file into PostgreSQL

I have a project to combine data from multiple Oracle servers into a single data warehouse based on PostgreSQL. The data from the Oracle servers comes in the form of dump files, since direct connections to some of them are not possible.

After examining one of these dump files in a hexadecimal editor, I found that the table definitions are stored there in XML format, so it is possible to extract them with some investigation. Unfortunately, the row data is stored in an unreadable form that can hardly be parsed.
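As a starting point for the metadata, the readable XML fragments can be pulled out of the binary file with a short script that scans for runs of printable characters, much like the `strings` utility does. This is only a sketch: the Data Pump file format is proprietary, the file name and the XML-ish filter are assumptions, and this recovers human-readable metadata sections only, not the encoded row data.

```python
import re

def extract_xml_fragments(path, min_len=20):
    """Scan a binary file and yield printable runs that look like XML.

    The Data Pump dump format is proprietary and undocumented; this
    only recovers human-readable metadata sections, not the row data.
    """
    with open(path, "rb") as f:
        blob = f.read()
    # Runs of printable ASCII, similar to what `strings` produces.
    for run in re.findall(rb"[\x20-\x7e]{%d,}" % min_len, blob):
        text = run.decode("ascii")
        if "<" in text and ">" in text:  # crude XML-ish filter
            yield text

# Example (the dump file name is an assumption):
# for frag in extract_xml_fragments("test22_schema_.dmp"):
#     print(frag)
```

The extracted fragments can then be inspected by hand or fed to an XML parser to reconstruct `CREATE TABLE` statements for PostgreSQL.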

Has anyone solved such a task before? Is there an application or API that could automate this project (at least part of it)?

By "dump", I assume you mean a backup taken with either the traditional exp utility or Data Pump (expdp). In my experience, there is no complete way to read these dumps directly.

You can use the strings command to see some of the readable characters inside a dump, but most of what appears is junk. For example:

[oranaxx@dtqlnxxx expdp]$ strings test22_schema_.dmp | more
"SYS"."SYS_EXPORT_SCHEMA_01"
x86_64/Linux 2.4.xx
AL32UTF8
LBB EMB GHC JWD SD EBE WMF DDG JG SJH SRH JGK CL EGM BJM RAP RLP RP KR PAR MS MRS JLS CET HLT
10.02.00.00.00
HDR>T

As @vercelli pointed out, one way is to import the dump into a temporary database and read the data from its tables there.
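That approach can be sketched as the pipeline below: restore the dump into a scratch Oracle instance with impdp, then migrate schema and data to PostgreSQL with a tool such as ora2pg. The file paths, connection names, schema name, and database names here are all assumptions for illustration; adjust them to your environment.

```shell
# 1. Make the dump file visible to the scratch Oracle instance
#    (DATA_PUMP_DIR is the default Data Pump directory object;
#    the filesystem path below is an assumption).
cp test22_schema_.dmp /u01/app/oracle/admin/ORCL/dpdump/

# 2. Import the schema into the scratch database.
impdp system@ORCL directory=DATA_PUMP_DIR \
      dumpfile=test22_schema_.dmp schemas=TEST22

# 3. Export table definitions and row data as PostgreSQL-compatible
#    SQL with ora2pg (ora2pg.conf points at the scratch instance).
ora2pg -c ora2pg.conf -t TABLE -o schema.sql   # DDL
ora2pg -c ora2pg.conf -t COPY  -o data.sql     # rows as COPY

# 4. Load the result into the warehouse database.
psql -d warehouse -f schema.sql
psql -d warehouse -f data.sql
```

Since the import step happens entirely on your side, this also works for servers you cannot connect to directly: each incoming dump is replayed into the scratch instance and then pushed into the warehouse.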

