SQL & PL/SQL

UTL_ENCODE.UUENCODE doesn't seem to follow the standard uuencode format

User_QX1CQ, Jul 8 2016 (edited Oct 5 2016)

Oracle 12.1.0.1.0 on Windows 8.1.  I am trying to uudecode long uuencoded strings in PL/SQL using UTL_ENCODE.uudecode, and to uuencode binary raw data using UTL_ENCODE.uuencode.  I have external data coming in uuencoded format that I need to convert to binary (raw), and I need to create uuencoded data from binary data inside PL/SQL.  I have confirmed that the incoming uuencoded data follows the standard uuencode formatting.  Rather than sharing my specific example, I'll show the problem using the simplest example from the uuencode standard.  The Oracle SQL:

select UTL_RAW.cast_to_varchar2(UTL_ENCODE.uuencode(UTL_RAW.cast_to_raw('Cat'), 1, 'cat.txt', 644)) from dual;

returns the following:

begin 644 cat.txt

$0V%T

end

But this does not appear to follow the uuencode standard. Try the same uuencode of the string 'Cat' here:

Free Usenet Tools - Online UUEncoder and UUDecoder (free web interface for UUencode and UUdecode)

And you get:

begin 644 webutils_pl

#0V%T

`

end

Which DOES follow the uuencode standard as specified here:

https://en.wikipedia.org/wiki/Uuencoding

Extracted from the uuencode Wikipedia page:

Each data line uses the format:

 <length character><formatted characters><newline> 

<length character> is a character indicating the number of data bytes which have been encoded on that line. This is an ASCII character determined by adding 32 to the actual byte count, with the sole exception of a grave accent "`" (ASCII code 96) signifying zero bytes. All data lines except the last (if the data was not divisible by 45) have 45 bytes of encoded data (60 characters after encoding). Therefore, the vast majority of length values are 'M' (32 + 45 = ASCII code 77, or "M").

<formatted characters> are encoded characters. See Formatting Mechanism for more details on the actual implementation.

The file ends with two lines:

`<newline>

end<newline>

The second-to-last line also uses a length character to indicate the line length, with the grave accent signifying zero bytes.

As a complete file, the uuencoded output for a plain text file named cat.txt containing only the characters Cat would be

begin 644 cat.txt

#0V%T

`

end
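
To make the length-character arithmetic concrete, here is a small SQL check of the rule above (only CHR is used; the column aliases are my own):

select chr(32 + 3) as len_3_bytes,   -- '#': a 3-byte line such as 'Cat'
       chr(32 + 45) as len_45_bytes  -- 'M': a full 45-byte line
from dual;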

Why is the Oracle SQL query using UTL_ENCODE.UUENCODE showing a '$' as the first character, which would indicate 4 characters in the string 'Cat'?  ASCII '$' = 36, 36 - 32 = 4.  Yet the encoded data that Oracle UTL_ENCODE provides (0V%T) is for 3 characters, as expected.  The '#' sign is the correct first character: ASCII '#' = 35, 35 - 32 = 3.  The '$' from Oracle UTL_ENCODE.uuencode is not correct.
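
The arithmetic can be verified directly in SQL:

select ascii('$') - 32 as oracle_count,   -- 4: what UTL_ENCODE.uuencode wrote
       ascii('#') - 32 as standard_count  -- 3: what the standard calls for
from dual;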

Is this a character set problem?  This is only one of the discrepancies from the uuencode standard that I have found in Oracle PL/SQL's UTL_ENCODE.  For longer input strings, UTL_ENCODE.uuencode produces data lines of 76 encoded characters with an 'l' (lowercase 'L') as the first character of each line (ASCII 'l' = 108, 108 - 32 = 76), rather than the standard 'M' first character, which represents an original line length of 45 bytes (ASCII 'M' = 77, 77 - 32 = 45).
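
This is easy to reproduce with any input longer than one line; for example (test.txt and the 644 permission mirror the earlier call, and the 200-character filler is an arbitrary choice):

select UTL_RAW.cast_to_varchar2(UTL_ENCODE.uuencode(UTL_RAW.cast_to_raw(rpad('A', 200, 'A')), 1, 'test.txt', 644)) from dual;

Each full data line in the output begins with 'l' rather than the 'M' a standard encoder would write.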

I am having similar problems with UTL_ENCODE.uudecode interpreting uuencoded text incorrectly.  Oracle's UTL_ENCODE uuencode and uudecode are consistent with each other in that they are reversible, but they don't seem to follow the standard that my incoming external data follows.
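
For example, this round trip returns 'Cat', so the pair is internally consistent, just not standard-compliant:

select UTL_RAW.cast_to_varchar2(UTL_ENCODE.uudecode(UTL_ENCODE.uuencode(UTL_RAW.cast_to_raw('Cat')))) from dual;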

I've attached a sample file of an external uuencoded string that I am trying to decode to a binary JPEG file in PL/SQL.  If I copy/paste the attached uuencoded string into the uuencoder/uudecoder site referenced above and save the file, the resulting JPEG is a valid binary file and shows the picture correctly.  But when I use PL/SQL UTL_ENCODE.uudecode and save the resulting binary (raw) output to a file, it is not a valid JPEG.  I have debugged and narrowed the issue down to the differences between Oracle UTL_ENCODE and the standard implementations discussed above.
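
For reference, this is a simplified sketch of how I save the decoded output to disk (DATA_DIR is a placeholder directory object, and the encoded input here is just Oracle's own encoding of 'Cat' so the sketch is self-contained; the real data comes from the attached file):

declare
  l_raw  raw(32767);
  l_file utl_file.file_type;
begin
  -- decode the uuencoded data (here: Oracle's own output, so no external file is needed)
  l_raw := UTL_ENCODE.uudecode(UTL_ENCODE.uuencode(UTL_RAW.cast_to_raw('Cat')));
  -- write the raw bytes in binary mode ('wb') so no character conversion occurs
  l_file := utl_file.fopen('DATA_DIR', 'cat.txt', 'wb', 32767);
  utl_file.put_raw(l_file, l_raw, true);
  utl_file.fclose(l_file);
end;
/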

This post has been answered by odie_63 on Jul 9 2016
Locked on Nov 2 2016 · Added on Jul 8 2016 · 18 comments · 3,830 views