Hi,
I am facing a performance issue while inserting 100K records, received as a JSON input collection, and returning the details of the newly added rows. Can you suggest a bulk insert approach that can handle this volume of data in about 15 seconds?
Requirement:
1) Insert 100K rows into 3 tables. Each table has its own sequence as its PK. The FIRST_TABLE PK is stored in SECOND_TABLE, and similarly the SECOND_TABLE PK is stored in THIRD_TABLE.
2) Return the newly generated FIRST_TABLE.STORE_ID values for all 100K inserted rows.
What I have done:
I wrote a WHILE loop that inserts each record into the 3 tables in that order, to avoid FK violations. During each insert I capture the newly generated sequence value into a variable so that I can return the new rows' details as output. A rough sketch of the loop is below.
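(The g_input collection and its field names below are simplified placeholders for the parsed JSON payload, not my real code; the table and sequence names match the DDL further down.)

DECLARE
  -- placeholder input collection standing in for the parsed JSON
  TYPE t_rec IS RECORD (
    ft_col_1 NUMBER, ft_col_2 NUMBER,
    st_col_1 NUMBER, st_col_2 NUMBER,
    tt_col_1 NUMBER, tt_col_2 NUMBER);
  TYPE t_tab IS TABLE OF t_rec INDEX BY PLS_INTEGER;
  g_input     t_tab;

  TYPE t_num_tab IS TABLE OF NUMBER;
  l_store_ids t_num_tab := t_num_tab();  -- new STORE_IDs to return
  l_store_id  FIRST_TABLE.STORE_ID%TYPE;
  l_dept_id   SECOND_TABLE.DEPT_ID%TYPE;
  i           PLS_INTEGER := 1;
BEGIN
  WHILE i <= g_input.COUNT LOOP
    -- parent row first, capturing the generated key
    INSERT INTO FIRST_TABLE (STORE_ID, FT_COL_1, FT_COL_2)
    VALUES (SEQUENCE_ID_FIRST_TABLE.NEXTVAL,
            g_input(i).ft_col_1, g_input(i).ft_col_2)
    RETURNING STORE_ID INTO l_store_id;

    -- child row references the key just generated
    INSERT INTO SECOND_TABLE (DEPT_ID, STORE_ID, ST_COL_1, ST_COL_2)
    VALUES (SEQUENCE_ID_SECOND_TABLE.NEXTVAL, l_store_id,
            g_input(i).st_col_1, g_input(i).st_col_2)
    RETURNING DEPT_ID INTO l_dept_id;

    INSERT INTO THIRD_TABLE (PURCHASE_ID, DEPT_ID, TT_COL_1, TT_COL_2)
    VALUES (SEQUENCE_ID_THIRD_TABLE.NEXTVAL, l_dept_id,
            g_input(i).tt_col_1, g_input(i).tt_col_2);

    -- collect each new STORE_ID so the details can be returned at the end
    l_store_ids.EXTEND;
    l_store_ids(l_store_ids.COUNT) := l_store_id;
    i := i + 1;
  END LOOP;
END;
/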
CREATE TABLE FIRST_TABLE
(
  STORE_ID NUMBER(20) PRIMARY KEY, -- populated from SEQUENCE_ID_FIRST_TABLE
  FT_COL_1 NUMBER(20),
  FT_COL_2 NUMBER(20)
);

CREATE TABLE SECOND_TABLE
(
  DEPT_ID  NUMBER(20) PRIMARY KEY, -- populated from SEQUENCE_ID_SECOND_TABLE
  STORE_ID NUMBER(20) REFERENCES FIRST_TABLE (STORE_ID),
  ST_COL_1 NUMBER(20),
  ST_COL_2 NUMBER(20)
);

CREATE TABLE THIRD_TABLE
(
  PURCHASE_ID NUMBER(20) PRIMARY KEY, -- populated from SEQUENCE_ID_THIRD_TABLE
  DEPT_ID     NUMBER(20) REFERENCES SECOND_TABLE (DEPT_ID),
  TT_COL_1    NUMBER(20),
  TT_COL_2    NUMBER(20)
);
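What I am hoping is possible is one FORALL per table, with RETURNING BULK COLLECT INTO to capture the generated keys. A rough sketch of what I have in mind (again with placeholder collections for the parsed JSON; I use one scalar collection per column because, as far as I know, FORALL cannot reference fields of a record collection in many Oracle versions, raising PLS-00436). Would something like this be the right direction?

DECLARE
  -- per-column placeholder inputs standing in for the parsed JSON
  TYPE t_num_tab IS TABLE OF NUMBER;
  l_ft_col_1  t_num_tab := t_num_tab();
  l_ft_col_2  t_num_tab := t_num_tab();
  l_st_col_1  t_num_tab := t_num_tab();
  l_st_col_2  t_num_tab := t_num_tab();
  l_tt_col_1  t_num_tab := t_num_tab();
  l_tt_col_2  t_num_tab := t_num_tab();
  l_store_ids t_num_tab;  -- new FIRST_TABLE keys, to be returned
  l_dept_ids  t_num_tab;  -- new SECOND_TABLE keys
BEGIN
  -- one bulk insert per table; RETURNING BULK COLLECT fills the key
  -- collections in iteration order, so index i lines up across tables
  FORALL i IN 1 .. l_ft_col_1.COUNT
    INSERT INTO FIRST_TABLE (STORE_ID, FT_COL_1, FT_COL_2)
    VALUES (SEQUENCE_ID_FIRST_TABLE.NEXTVAL, l_ft_col_1(i), l_ft_col_2(i))
    RETURNING STORE_ID BULK COLLECT INTO l_store_ids;

  FORALL i IN 1 .. l_st_col_1.COUNT
    INSERT INTO SECOND_TABLE (DEPT_ID, STORE_ID, ST_COL_1, ST_COL_2)
    VALUES (SEQUENCE_ID_SECOND_TABLE.NEXTVAL, l_store_ids(i),
            l_st_col_1(i), l_st_col_2(i))
    RETURNING DEPT_ID BULK COLLECT INTO l_dept_ids;

  FORALL i IN 1 .. l_tt_col_1.COUNT
    INSERT INTO THIRD_TABLE (PURCHASE_ID, DEPT_ID, TT_COL_1, TT_COL_2)
    VALUES (SEQUENCE_ID_THIRD_TABLE.NEXTVAL, l_dept_ids(i),
            l_tt_col_1(i), l_tt_col_2(i));

  -- l_store_ids now holds the new FIRST_TABLE.STORE_ID values to return
END;
/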