Hello All,
I use Oracle 19c, and users upload .xlsx Excel files via APEX. Some files have more than 100k rows. I need to validate the data and insert it into a table. I know the APEX_DATA_PARSER and AS_READ_XLSX packages, but both are slow for files with more than 100k rows. For example, the following code snippet (using AS_READ_XLSX) takes more than 10 minutes; the same logic with APEX_DATA_PARSER takes about 3 minutes. Is there a faster way? Do you have any suggestions?
In other words, I need a faster way to convert the uploaded BLOB into rows, and it has to work from within APEX, since users upload the file through an APEX page.
declare
  rec           apex_application_temp_files%rowtype;
  v_cnt3        number;
  v_filter_data varchar2(100);
  v_item_name   varchar2(100) := 'P' || nv('APP_PAGE_ID') || '_FILE';
begin
  -- fetch the uploaded file from the APEX temporary-files view
  select *
  into rec
  from apex_application_temp_files
  where name = v(v_item_name);

  with
  q1 as (
    -- parse every cell of the workbook
    select x.row_nr, x.col_nr, x.string_val, x.number_val, x.date_val
    from table( as_read_xlsx.read(rec.blob_content) ) x
  ),
  q2 as (
    -- locate the column whose header (row 1) is DATA1
    select col_nr
    from q1
    where row_nr = 1
      and upper(string_val) = 'DATA1'
  )
  select count(distinct c1), max(c1)
  into v_cnt3, v_filter_data
  from (
    -- coalesce the typed cell values into one column
    -- (NUMBER and DATE values are implicitly converted to VARCHAR2)
    select nvl(nvl(string_val, number_val), date_val) c1
    from q1
    where row_nr != 1
      and col_nr = (select col_nr from q2)
    group by nvl(nvl(string_val, number_val), date_val)
  );
end;
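For reference, the APEX_DATA_PARSER variant I timed at about 3 minutes looks roughly like the sketch below. APEX_DATA_PARSER.PARSE returns rows with LINE_NUMBER plus generic COL001..COL300 columns; the p_file_name value and the assumption that DATA1 lands in COL001 are placeholders to adjust for the actual file layout:

```sql
declare
  rec           apex_application_temp_files%rowtype;
  v_cnt3        number;
  v_filter_data varchar2(100);
begin
  -- fetch the uploaded file from the APEX temporary-files view
  select *
  into rec
  from apex_application_temp_files
  where name = v('P' || nv('APP_PAGE_ID') || '_FILE');

  -- COL001 is assumed to hold the DATA1 column; the file-name
  -- extension tells the parser to treat the BLOB as xlsx
  select count(distinct col001), max(col001)
  into v_cnt3, v_filter_data
  from table( apex_data_parser.parse(
                p_content   => rec.blob_content,
                p_file_name => 'upload.xlsx' ) )
  where line_number > 1;  -- skip the header row
end;
```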
Thanks in advance