External Table UTF8
May 20 2008 — edited May 28 2008
When trying to read records that contain UTF8 characters, my external table produces the following error:
ORA-29913: error in executing ODCIEXTTABLEFETCH callout
ORA-30653: reject limit reached
ORA-06512: at "SYS.ORACLE_LOADER", line 14
ORA-06512: at line 1
29913. 00000 - "error in executing %s callout"
*Cause: The execution of the specified callout caused an error.
*Action: Examine the error messages and take appropriate action.
Error at Line: 3
Here is my external table definition:
REM WRK_POSTS
CREATE TABLE "WRK_POSTS"
( "ID" NUMBER,
"POST_CONTENT" VARCHAR2(4000 CHAR)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
DEFAULT DIRECTORY "MYSQL_DIR"
ACCESS PARAMETERS
( RECORDS DELIMITED BY NEWLINE
CHARACTERSET 'UTF8'
STRING SIZES ARE IN CHARACTERS
NOBADFILE
NODISCARDFILE
NOLOGFILE
FIELDS TERMINATED BY 0X'5E5E'
LRTRIM
MISSING FIELD VALUES ARE NULL
(
ID CHAR,
POST_CONTENT CHAR
)
)
LOCATION
( 'wrk_posts.txt'
)
)
;
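For reference, here is a variant of the access parameters with the bad file and log file re-enabled, which I could use to capture the rejected records and the loader's reason for rejecting them (the file names are just my guesses):

```sql
-- Same access parameters, but writing a bad file and a log file
-- into MYSQL_DIR so rejected rows and their causes are recorded:
ACCESS PARAMETERS
( RECORDS DELIMITED BY NEWLINE
  CHARACTERSET 'UTF8'
  STRING SIZES ARE IN CHARACTERS
  BADFILE MYSQL_DIR:'wrk_posts.bad'   -- rejected source records
  LOGFILE MYSQL_DIR:'wrk_posts.log'   -- per-row rejection reasons
  NODISCARDFILE
  FIELDS TERMINATED BY 0X'5E5E'
  LRTRIM
  MISSING FIELD VALUES ARE NULL
  ( ID CHAR,
    POST_CONTENT CHAR )
)
```

With NOBADFILE and NOLOGFILE in place, the ORA-30653 only says the reject limit was reached, not why each row was rejected.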
The table works fine when the data contains no multi-byte UTF8 characters. My server NLS character set is UTF8, as confirmed via:
select
value -- UTF8
from v$nls_parameters where parameter='NLS_CHARACTERSET';
select
value -- UTF8
from nls_database_parameters where parameter='NLS_CHARACTERSET';
The POST_CONTENT field can hold up to 4000 characters, which, because of multi-byte characters, is more than 4000 bytes. I thought that specifying CHARACTERSET 'UTF8' and STRING SIZES ARE IN CHARACTERS would take care of this.
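One thing I have not tried yet: the ORACLE_LOADER driver defaults an unsized CHAR field to a length of 255, so my POST_CONTENT CHAR field may be overflowing long before 4000 characters. A field list with an explicit length might look like this (a sketch, sizes are my assumption):

```sql
-- Give the field an explicit length; an unsized CHAR defaults to 255
-- in ORACLE_LOADER, which long post_content values would exceed:
( ID CHAR,
  POST_CONTENT CHAR(4000) )
```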
Any feedback is very much appreciated.
~tim