Hi Folks,
Suppose I have a JSON file like this:
[{"ID":1,"NAME":"Asterix"},{"ID":2,"NAME":"Obelix"}]
And a table with the very same structure, like this:
create table aremorica (
id number,
name varchar2(256)
);
And I would like to import the data from the JSON file (with a known structure) into the table (with the very same structure). There are multiple tables, of course, but for each table there would be exactly one known file with a fixed (and flat, no nested structures) structure like the one shown above. The file is located on the database server's file system. Now, I do know that, using external tables, I could create an external table with a CLOB column and a JSON constraint, as mentioned here:
https://docs.oracle.com/en/database/oracle/oracle-database/12.2/adjsn/loading-external-json-data.html#GUID-52EFC452-5E65…
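
For reference, that approach would look roughly like this in my case (just a sketch; aremorica_ext, the directory object data_dir, and the 5000-byte field size are placeholders I made up):

create table aremorica_ext (
  json_document clob
)
organization external (
  type oracle_loader
  default directory data_dir  -- directory object pointing at the JSON files
  access parameters (
    records delimited by newline
    fields (json_document char(5000))
  )
  location ('aremorica.json')
)
reject limit unlimited;
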
but I was wondering if there isn't a more elegant way to solve my problem since the structure of my JSon document is flat and known.
The whole thing is used to export data (test data and the like) into a transportable format and load it again into a different database with the very same structure. Right now this is done via external tables and the datapump external table access driver: each target table has its external table, from which the data is simply merged via the primary key. This works like a charm, but I would like to put the exported files under version control (git or svn) so that all developers can import the very same test data into their databases (and add records if needed). The problem with datapump files is that they are binary and hard to merge, so conflicts are hard to resolve.
I was thinking about a home-grown external table access driver based on JSON, but it seems that it's not possible to write one yourself (or at least I am unable to find any information on this). I know that I could read the file contents into a CLOB and simply parse that CLOB with JSON_TABLE, and I could replace my (generated) create external table statements with (generated) create view statements, but as I said, I was wondering if there is a more elegant (read: less to code) way to do that, like the datapump access driver.
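
To make that concrete: the generated view on top of the external table from above would look something like this (again a sketch, reusing my placeholder names aremorica_ext and json_document; the view name aremorica_stage is made up too, and JSON_TABLE should be available from 12.1.0.2 on):

create or replace view aremorica_stage as
select jt.id, jt.name
from aremorica_ext et,
     json_table(
       et.json_document, '$[*]'  -- iterate over the top-level array
       columns (
         id   number        path '$.ID',
         name varchar2(256) path '$.NAME'
       )
     ) jt;
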
I would want to use JSON because... well, honestly, I am not sure. If there is an XML solution to this, I'd be happy to hear it as well. I am aware that I could export the data as CSV and use ORACLE_LOADER, but then I remember the headaches I had with that before I moved to datapump, so I would very much like to avoid CSV.
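
(If I went the XML route, I imagine it would end up looking much the same, e.g. an XMLTABLE over a CLOB column; the file layout and names below are purely hypothetical:)

-- assuming a file like <rows><row><ID>1</ID><NAME>Asterix</NAME></row>...</rows>
select x.id, x.name
from aremorica_xml_ext et,
     xmltable(
       '/rows/row'
       passing xmltype(et.xml_document)
       columns
         id   number        path 'ID',
         name varchar2(256) path 'NAME'
     ) x;
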
I'd also be happy to hear entirely different solutions to my problem. As said, I would like to interchange data between databases; the current process is:
- I export the data from the source table using CTAS with the datapump export driver
- I pull the file from the DB
- I get the DDL statement for my external table via DBMS_METADATA
- the file plus the DDL gets transported to another DB (they most certainly aren't even on the very same network)
- the DDL gets executed
- the data gets merged into the target table with a simple MERGE (sketched below)
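
For the example table, the DDL and MERGE steps are essentially this (a sketch; aremorica_stage is the placeholder view from above, with the external table it could just as well be the source directly):

-- get the DDL for the external table
select dbms_metadata.get_ddl('TABLE', 'AREMORICA_EXT') from dual;

-- merge the staged data into the target table via the primary key
merge into aremorica t
using aremorica_stage s
on (t.id = s.id)
when matched then
  update set t.name = s.name
when not matched then
  insert (id, name)
  values (s.id, s.name);
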
In short, my problem is that datapump files are binary, and, to put it bluntly, I don't want to use CSV.
The minimum version I'd have to support would be Oracle 12.1.2.0.2, but I'd be happy to hear solutions even if they require 20c...
thanks & regards
Christian