I'm taking a .log file generated by an Oracle database query, opening it in Excel, removing the header and footer, and then using Data > Text to Columns to split the data. The .log file has commas between the values, so I set Text to Columns to comma-delimited, and everything looks fine in Excel.

When I save the file as a .csv, however, Excel adds a comma to the end of each row, which causes sqlldr to fail to load the data. This is odd because the same scripts have worked for years, but for some reason the Excel step is now appending an extra comma. Is there a way to tell sqlldr to ignore the last comma, or to treat it as the end of the row?
Example:
The generated .log file has thousands of rows like: abcd,efgh,ijkl
After delimiting the file in Excel and saving it as a .csv, I get this: abcd,efgh,ijkl, << comma added to the end
I have two sets of .log files that go through the same process; one comes out fine, while this one ends up with the extra comma.
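For now I can strip the comma myself before loading with a short preprocessing script; here is a minimal sketch of what I mean (the file names are placeholders for my actual files):

    # Strip the single trailing comma Excel appends to each row before
    # handing the file to sqlldr. File names here are placeholders.
    with open("export.csv") as src, open("export_clean.csv", "w") as dst:
        for line in src:
            row = line.rstrip("\n")
            # Remove exactly one trailing comma, if present, so rows
            # that are already clean pass through unchanged.
            if row.endswith(","):
                row = row[:-1]
            dst.write(row + "\n")

That works, but I'd rather not add an extra step to the process if sqlldr can be told to tolerate the trailing delimiter itself.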
Thanks for any help on this; I've been pulling my hair out all day.