Problem:
I have an XML file on a Unix server, and I read it through a DBA job that uses simple UTL_FILE functionality to load the file into a CLOB column of a table.
When I check the data from the front end in an Oracle 10g Forms text item, I see junk characters like ???????.
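For reference, the load step does roughly the following. This is a sketch, not our actual job: it uses DBMS_LOB.LOADCLOBFROMFILE instead of our UTL_FILE loop (because it lets you state the file's character set explicitly), and XML_DIR / input.xml are placeholder names:

```sql
DECLARE
  l_bfile    BFILE := BFILENAME('XML_DIR', 'input.xml');  -- XML_DIR is a placeholder directory object
  l_clob     CLOB;
  l_dest_off INTEGER := 1;
  l_src_off  INTEGER := 1;
  l_lang_ctx INTEGER := DBMS_LOB.DEFAULT_LANG_CTX;
  l_warn     INTEGER;
BEGIN
  DBMS_LOB.CREATETEMPORARY(l_clob, TRUE);
  DBMS_LOB.OPEN(l_bfile, DBMS_LOB.LOB_READONLY);
  -- bfile_csid tells Oracle the character set of the file on disk;
  -- NLS_CHARSET_ID('AR8MSWIN1256') assumes the file is Windows-1256 encoded
  DBMS_LOB.LOADCLOBFROMFILE(
    dest_lob     => l_clob,
    src_bfile    => l_bfile,
    amount       => DBMS_LOB.LOBMAXSIZE,
    dest_offset  => l_dest_off,
    src_offset   => l_src_off,
    bfile_csid   => NLS_CHARSET_ID('AR8MSWIN1256'),
    lang_context => l_lang_ctx,
    warning      => l_warn);
  DBMS_LOB.CLOSE(l_bfile);

  INSERT INTO tb_xml_in (xml_sr_no, message) VALUES ('123', l_clob);
  COMMIT;
END;
/
```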
To check whether the data stored in the database is correct, I ran the program below to write the message out to a text file (clob_to_file is an existing helper procedure):
BEGIN
  FOR x IN (SELECT message
              FROM tb_xml_in
             WHERE xml_sr_no = '123')
  LOOP
    clob_to_file('/test/debug/', 'test' || '.txt', x.message);
  END LOOP;
END;
/
I transferred test.txt from Unix and opened it in Notepad on my machine, where Arabic support is enabled, and it shows the Arabic characters correctly.
This means the data is stored correctly in the database, but it is not displayed correctly through the FMB.
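To separate a storage problem from a display problem, the raw bytes can also be inspected directly in SQL*Plus with DUMP (format 1016 prints the bytes in hex along with the character set name). This query is just an illustration against the same table:

```sql
-- Inspect the first 20 characters of the stored CLOB as raw hex bytes
SELECT xml_sr_no,
       DUMP(DBMS_LOB.SUBSTR(message, 20, 1), 1016) AS raw_bytes
  FROM tb_xml_in
 WHERE xml_sr_no = '123';
```

If the hex bytes are valid AR8MSWIN1256 Arabic codes (and not 0x3F, which is '?'), the data itself is intact and only the display conversion is wrong.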
Database details (NLS parameters):
NLS_LANGUAGE AMERICAN
NLS_TERRITORY AMERICA
NLS_NCHAR_CONV_EXCP FALSE
NLS_CHARACTERSET AR8MSWIN1256
NLS_NCHAR_CHARACTERSET AL16UTF16
NLS_LENGTH_SEMANTICS BYTE
NLS_RDBMS_VERSION 10.2.0.4.0
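In case it matters, the Forms client does its own conversion based on the NLS_LANG setting of the Windows machine it runs on (an environment/registry value, not one of the database parameters above). It can be inspected like this; the registry path is typical but varies per Oracle Home:

```
REM In a Windows command prompt on the Forms client machine:
echo %NLS_LANG%

REM Or search the Oracle registry keys for NLS_LANG:
reg query "HKLM\SOFTWARE\ORACLE" /s /f NLS_LANG
```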
select * from V$NLS_VALID_VALUES where parameter = 'CHARACTERSET' and value like '%UTF%';
PARAMETER VALUE ISDEPRECATED
CHARACTERSET AL24UTFFSS TRUE
CHARACTERSET UTF8 FALSE
CHARACTERSET UTFE FALSE
CHARACTERSET AL32UTF8 FALSE
CHARACTERSET AL16UTF16 FALSE
select * from V$version;
BANNER
----------------------------------------------------------------
Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
PL/SQL Release 10.2.0.4.0 - Production
CORE 10.2.0.4.0 Production
Changing the database character set is not an option at the moment, as the production database is very large.
Can you please help me resolve this issue?