I'm using SQL to convert a JSON array into object types. We've discovered that when a string is longer than the attribute defined in the object type, it is just silently set to null. How can we catch this as an error? This is on Oracle 19.3.
Example below:
create table line_data
(
  text varchar2(3)
);

create type t_json_line as object
(
  text varchar2(3)
);
/

create type t_json_lines as table of t_json_line;
/
-- A procedure to process a json array as input:
create or replace procedure process_json_array(p_json_array in clob) is
  l_json_lines t_json_lines;
begin
  select json_value(p_json_array, '$' returning t_json_lines error on error)
    into l_json_lines
    from dual;
  dbms_output.put_line('Lines: ' || l_json_lines.count);
  for i in l_json_lines.first .. l_json_lines.last loop
    dbms_output.put_line(i || ': ' || l_json_lines(i).text);
    insert into line_data (text) values (l_json_lines(i).text);
  end loop;
end;
/
-- This works as expected:
begin
  process_json_array('[{"text":"abc"},{"text":"def"}]');
end;
/
select *
from line_data;
+----+
|TEXT|
+----+
|abc |
|def |
+----+
-- But when a string ("abcd") is too long for the attribute, it is just silently set to null:
begin
  process_json_array('[{"text":"abcd"},{"text":"efg"}]');
end;
/
select *
from line_data;
+----+
|TEXT|
+----+
|abc |
|def |
|null|
|efg |
+----+
-- I recreate the line type with a longer string type:
create or replace type t_json_line force as object
(
  text varchar2(4)
);
/
-- Rerun with text that is too long for the table column, and we get an error as expected:
begin
  process_json_array('[{"text":"abcd"},{"text":"efg"}]');
end;
/
ORA-12899: value too large for column "APPDATA"."LINE_DATA"."TEXT" (actual: 4, maximum: 3)
ORA-06512: at "APPDATA.PROCESS_JSON_ARRAY", line 10
ORA-06512: at line 2
-- cleanup
drop table line_data purge;
drop type t_json_lines;
drop type t_json_line;
drop procedure process_json_array;
So, in conclusion: converting strings to an object type via this SQL trick silently replaces too-long strings with null values, which might not be what we want.
Is there an option to enable stricter checking when converting to the object type? (I already tried ERROR ON ERROR in the example, as shown above.)
A workaround would be to declare all such string attributes with the maximum length, varchar2(4000), but I have yet to measure whether that affects memory consumption.
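To make the workaround concrete, it could be combined with an explicit length check in the procedure, so over-long values raise a clear error instead of relying on the insert to fail. A sketch (untested here; the limit 3 and the error code -20001 are arbitrary choices of mine):

-- Widen the attribute so json_value preserves the full string:
create or replace type t_json_line force as object
(
  text varchar2(4000)
);
/
-- Validate lengths explicitly before inserting:
create or replace procedure process_json_array(p_json_array in clob) is
  l_json_lines t_json_lines;
begin
  select json_value(p_json_array, '$' returning t_json_lines error on error)
    into l_json_lines
    from dual;
  for i in 1 .. l_json_lines.count loop
    if length(l_json_lines(i).text) > 3 then
      -- -20001 is an arbitrary user-defined error code
      raise_application_error(-20001,
        'text too long in element ' || i || ': "' || l_json_lines(i).text || '"');
    end if;
    insert into line_data (text) values (l_json_lines(i).text);
  end loop;
end;
/

This at least turns the silent null into an explicit error, at the cost of duplicating the length constraint in PL/SQL.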
Grateful for any insights from you!
Regards,
Vidar