Hi all,
I'm running a job which takes several CSV and fixed-format files as input, does some jiggery-pokery, and then writes data to an Oracle DB. I'm getting the following error (I've had to change the real names)...
WARNING: 05-Aug-2016 12:11:27: [EDQ-02813] Insert error for ZZZ Oracle DB: Problem writing insert batch to database: ORA-12899: value too large for column "ZZZ"."AAA"."BBB_NAME" (actual: 61, maximum: 60) (Code: 2,078)
SEVERE: 05-Aug-2016 12:11:27: [EDQ-06864] data batch 0 terminated by exception
com.datanomic.director.runtime.data.RecordWriteException: Error writing to export dbAAA to ZZZ Oracle DB: Problem writing insert batch to database: ORA-12899: value too large for column "ZZZ"."AAA"."BBB_NAME" (actual: 61, maximum: 60) (Code: 2,078) (Code: 203,051)
The value in BBB_NAME is taken straight from the input, with no trimming, whitespace handling, or any other processing at all. I profiled the input data and the maximum value length was 60 bytes. As a further check I altered the BBB_NAME column to length 70 and reran the job; it completed successfully. I then ran a query against the DB to get the maximum length of the data now sitting in BBB_NAME, and that was also 60! So where on earth is 61 coming from? I'm just a little confused. Has anybody else encountered this behaviour? ANSI/Unicode/UTF related, maybe?
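
In case it's relevant, here's roughly what I'm planning to check next (just a sketch, assuming the real schema/table are ZZZ.AAA as in the sanitised log above; Oracle's LENGTH counts characters while LENGTHB counts bytes, so a single multibyte character could account for 60 vs 61):

    -- Compare character length with byte length; on a multibyte character
    -- set (e.g. AL32UTF8) a 60-character value can occupy 61+ bytes.
    SELECT BBB_NAME,
           LENGTH(BBB_NAME)  AS char_len,
           LENGTHB(BBB_NAME) AS byte_len
    FROM   ZZZ.AAA
    WHERE  LENGTHB(BBB_NAME) > LENGTH(BBB_NAME);

    -- Check whether the column was declared with BYTE or CHAR semantics
    -- (CHAR_USED: 'B' = BYTE, 'C' = CHAR).
    SELECT column_name, data_length, char_length, char_used
    FROM   all_tab_columns
    WHERE  owner = 'ZZZ' AND table_name = 'AAA' AND column_name = 'BBB_NAME';

    -- Confirm the database character set.
    SELECT value
    FROM   nls_database_parameters
    WHERE  parameter = 'NLS_CHARACTERSET';

If it does turn out to be a byte-vs-character issue, I assume redefining the column as VARCHAR2(60 CHAR) rather than VARCHAR2(60 BYTE) would be a tidier fix than widening it to 70, but I'd be glad to hear from anyone who has confirmed that.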
Cheers
Jon