"Value too large for column" error - but it isn't.

Jon Coat, Aug 5 2016 (edited Aug 18 2016)

Hi All

I'm running a job which takes several CSV and fixed-format files as input, does some jiggery-pokery, and then writes data to an Oracle DB. I'm getting the following error (I've had to change the real names)...

WARNING: 05-Aug-2016 12:11:27: [EDQ-02813] Insert error for ZZZ Oracle DB: Problem writing insert batch to database: ORA-12899: value too large for column "ZZZ"."AAA"."BBB_NAME" (actual: 61, maximum: 60) (Code: 2,078)

SEVERE: 05-Aug-2016 12:11:27: [EDQ-06864] data batch 0 terminated by exception

com.datanomic.director.runtime.data.RecordWriteException: Error writing to export dbAAA to ZZZ Oracle DB: Problem writing insert batch to database: ORA-12899: value too large for column "ZZZ"."AAA"."BBB_NAME" (actual: 61, maximum: 60) (Code: 2,078) (Code: 203,051)

The value in BBB_NAME is an unmanipulated value from the input; no trimming, whitespace removal or any other processing is done at all. I profiled the input data and the maximum value length was 60 bytes. As a further check I altered the BBB_NAME column to length 70 and reran the job. It completed successfully. I then ran a query to get the maximum length of the data in BBB_NAME on the DB. It was also 60! Where on earth is 61 coming from? I'm just a little confused. Has anybody else encountered this behaviour? ANSI/Unicode/UTF related, maybe?
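One way the Unicode/UTF hypothesis above could play out: if the column is VARCHAR2(60) with BYTE length semantics and the database character set is a UTF-8 variant (e.g. AL32UTF8), a value that is 60 *characters* long but contains one multi-byte character needs 61 *bytes* of storage. A profiler or a query using Oracle's LENGTH() (which counts characters, unlike LENGTHB(), which counts bytes) would still report 60. A minimal sketch of the mismatch, using a made-up value:

```python
# Hypothetical 60-character value: one accented character plus 59 ASCII
# characters. The "é" encodes to 2 bytes in UTF-8, so the whole string
# needs 61 bytes of storage even though it is only 60 characters long.
name = "é" + "x" * 59

chars = len(name)                        # character count: 60
utf8_bytes = len(name.encode("utf-8"))   # byte count under UTF-8: 61

print(chars, utf8_bytes)  # → 60 61
```

If this is the cause, comparing LENGTH(BBB_NAME) against LENGTHB(BBB_NAME) on the widened column should show the discrepancy directly, and defining the column as VARCHAR2(60 CHAR) would be one way to avoid it.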

Cheers

Jon
