
Character Set Migration in simple terms

Rocketdog, Aug 27 2013 (edited Aug 30 2013)

Hi all, apologies if I have the wrong area. I'm running 10g databases and have read so much on character set migration that my head is spinning, and I am by no means an XML expert.

Background

I have 2 application tables with single XMLTYPE columns that require the database character set (NLS_CHARACTERSET) to be changed from WE8MSWIN1252 to AL32UTF8 so that the encoding shows as UTF-8 on data sent to and from the database application.

Table1 is pre-filled with text strings to simulate data coming from an external client, and table2 has data created by the database application using Oracle's XML, ready to be sent back to an external client.

  --------------------------------------------------------------------------------------------------------------

CSSCAN - Have installed and run a full scan as suggested -

. Scan Summary says all data in the data dictionary and application is convertible to the new character set

. Data Dictionary Conversion Summary shows VARCHAR2 and CLOB totals in convertible, and a message that the data dictionary cannot be safely migrated using CSALTER

. Application Data Conversion Summary shows VARCHAR2 and CLOB totals in convertible

. ALL truncation and lossy totals are 0 and no indexes are listed for rebuild

Aside from the single columns on the two application tables, there are numerous references to SYS tables - JOB$, METASTYLESHEET, RULE$, SOURCE$, WRH$... WRI$...
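For completeness, the full scan was invoked roughly as below, run from the OS prompt as SYS (the LOG name and the ARRAY/PROCESS sizes are just values I happened to pick, nothing significant); the summaries above are taken from the .txt report it writes out:

    csscan \"sys as sysdba\" FULL=Y FROMCHAR=WE8MSWIN1252 TOCHAR=AL32UTF8 LOG=wecheck CAPTURE=Y ARRAY=1000000 PROCESS=2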

  --------------------------------------------------------------------------------------------------------------

Various docs I've read suggest building a new database and exporting/importing the data across, but I potentially have to do this on multiple databases, so I am looking for the quickest solution.

 

On a test instance on the same server, I copied rows (via insert as select) from another database (DEV1) with the same ORACLE_HOME and NLS_LANG setup.

Checks via PuTTY

echo $NLS_LANG - shows nothing set for the Linux environment.

sqlplus - select userenv('language') from dual; - shows ENGLISH_UNITED KINGDOM.WE8MSWIN1252

similarly - select value from v$nls_parameters where parameter in ('NLS_LANGUAGE','NLS_TERRITORY','NLS_CHARACTERSET'); shows the same
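(In case it helps anyone repeating the checks, the same information can be pulled in one go from any SQL*Plus session - nothing below is specific to my setup:)

    -- what the session itself reports
    select userenv('language') from dual;

    -- the database character set and related settings
    select parameter, value
      from nls_database_parameters
     where parameter in ('NLS_LANGUAGE','NLS_TERRITORY','NLS_CHARACTERSET','NLS_NCHAR_CHARACTERSET');

    -- the current session NLS values
    select parameter, value
      from v$nls_parameters
     where parameter in ('NLS_LANGUAGE','NLS_TERRITORY','NLS_CHARACTERSET');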

Existing table1 data is readable and shows info for column data (type XMLTYPE) with, e.g. -

    

     <?xml version="1.0" encoding="US-ASCII"?>

    <ChangeWorkOrderRequest xmlns="urn:np:

 

Existing table2 data is readable and shows info for column data (type XMLTYPE) with, e.g. -

       <?xml version="1.0" encoding="US-ASCII"?>

       <UpdateTradMeterRequest xmlns="urn:np:

Checks via PLSQL Dev – look at the same data rows listed as column type CLOB

Existing table1 data is readable and shows as -

      <?xml version="1.0" encoding="utf-8"?>

       <ChangeWorkOrderRequest xmlns="urn:np:

 

Existing table2 data is readable and shows as – (*A)

      <?xml version="1.0" encoding="WINDOWS-1252"?>

       <UpdateTradMeterRequest xmlns="urn:np

 

The same differences show up when using Oracle SQL*Plus; I'm not sure why this is.
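If it helps, the stored value can be inspected directly with something like the queries below (xml_data is a stand-in for the real column name, so adjust as needed):

    -- serialise the XMLTYPE to a CLOB so the prolog can be read as plain text
    select t.xml_data.getClobVal() from table1 t where rownum = 1;

    -- dump the first few characters to show the character set and raw bytes the session receives
    select dump(substr(t.xml_data.getStringVal(), 1, 40), 1016) from table1 t where rownum = 1;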

Testing

I've tried changing the character set using - update PROPS$ set value$ = 'AL32UTF8' where name = 'NLS_CHARACTERSET';

and confirmed it with - select value from NLS_DATABASE_PARAMETERS where parameter='NLS_CHARACTERSET';

and restarted the instance.
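Written out in full, the (unsupported) test sequence was along these lines, run as SYS in SQL*Plus:

    -- WARNING: editing PROPS$ directly is not a supported way of changing the character set;
    -- this was only tried on a test instance with a backup available
    update props$ set value$ = 'AL32UTF8' where name = 'NLS_CHARACTERSET';
    commit;

    -- confirm the new value is visible
    select value from nls_database_parameters where parameter = 'NLS_CHARACTERSET';

    -- bounce the instance
    shutdown immediate
    startup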

Re-check data with select * from table1/2

PuTTY - a select from table1 or table2 now gives OCI-31011: XML parsing failed errors

SQL*Plus – a select on either table lists all rows but the data column shows predominantly as upside-down question marks

PLSQL Dev – a select on either table shows the data as unreadable – it looks like Chinese/Japanese characters

 

NOTE - After updating the parameter back to WE8MSWIN1252, restarting and rechecking, all the data is as it was prior to updating to AL32UTF8.

Re-test with CSALTER – even though not suggested, I tried it out as a database backup is available

Re-updated from WE8MSWIN1252 to AL32UTF8

Shutdown and startup restrict

@$ORACLE_HOME/rdbms/admin/csalter.plb – ran ok with no errors

Shutdown and startup
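Written out, the CSALTER attempt looked roughly like this, again as SYS in SQL*Plus (I used the full $ORACLE_HOME path, but @?/rdbms/admin/csalter.plb points at the same script):

    shutdown immediate
    startup restrict

    -- the scanner results from the earlier CSSCAN run must already be present for this to run
    @?/rdbms/admin/csalter.plb

    shutdown immediate
    startup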

 

Re-check data again with select * from table1/2 – same as the test without running CSALTER

NOTE – Updated the parameter back to WE8MSWIN1252 again, restarted and rechecked, and all the data appears readable again.

I'm not sure what CSALTER did, or whether anything else has been corrupted as a result?

 

Additional data –

Note that after changing the character set from WE8MSWIN1252 to AL32UTF8 and allowing the application to run and add a new row to Table2, checks via PuTTY still show the data as –

      <?xml version="1.0" encoding="US-ASCII"?>

      <UpdateCustomerRequest xmlns="urn:np:e

 

But PLSQL Dev and SQL*Plus both now correctly show encoding UTF-8 (rather than WINDOWS-1252 as in *A above), which is the goal we are aiming for –

      <?xml version="1.0" encoding="UTF-8"?>

      <UpdateCustomerRequest xmlns="urn:np:es

 

Does anyone have an explanation for the differences between PuTTY, PLSQL Dev and SQL*Plus?

 

Aside from building new databases and exporting/importing the data, which appears to be the predominant suggestion from what I've researched, does anyone know a simple solution to this?
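For reference, my understanding of that route is: create a fresh database with AL32UTF8 as its character set, then export from the old database and import into the new one, keeping NLS_LANG set to the source character set so the dump file is written without conversion - roughly (the schema, file and log names below are placeholders):

    # on the source (WE8MSWIN1252) database
    export NLS_LANG="ENGLISH_UNITED KINGDOM.WE8MSWIN1252"
    exp system/<password> file=app_data.dmp owner=APPUSER log=exp_app.log

    # on the new AL32UTF8 database
    imp system/<password> file=app_data.dmp fromuser=APPUSER touser=APPUSER log=imp_app.log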

 

Any advice would be most welcome.

Regards

Lesley
