I am trying the following Java code:
import java.security.spec.AlgorithmParameterSpec;
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

Cipher cipher = Cipher.getInstance("AES/CTR/NoPadding");
AlgorithmParameterSpec paramSpec = new IvParameterSpec(iv);
SecretKeySpec secretKeySpec = new SecretKeySpec(key, "AES");
cipher.init(Cipher.ENCRYPT_MODE, secretKeySpec, paramSpec);
return cipher.doFinal(data);
where,
iv bytes in hex string = 0100000000000001417AA4D720A0A4EF
key bytes in hex string = A744D2E994AFD02F5EADC83E614821BF
data = "hello world how r you?".getBytes();
And I get the following encrypted bytes in hex:
97AE911291D38A4B41F61CE2149E6432B9AA08F8D593
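For anyone who wants to reproduce this, here is the complete standalone program I am running (the hex helpers, the class name CtrTest, and the main method are just scaffolding I added for this post; the cipher calls are exactly the ones above):

import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

public class CtrTest {
    // Decode a hex string such as "0100...EF" into raw bytes
    static byte[] fromHex(String s) {
        byte[] out = new byte[s.length() / 2];
        for (int i = 0; i < out.length; i++) {
            out[i] = (byte) Integer.parseInt(s.substring(2 * i, 2 * i + 2), 16);
        }
        return out;
    }

    // Encode raw bytes as an upper-case hex string
    static String toHex(byte[] b) {
        StringBuilder sb = new StringBuilder();
        for (byte x : b) {
            sb.append(String.format("%02X", x & 0xFF));
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        byte[] iv = fromHex("0100000000000001417AA4D720A0A4EF");
        byte[] key = fromHex("A744D2E994AFD02F5EADC83E614821BF");
        byte[] data = "hello world how r you?".getBytes();

        Cipher cipher = Cipher.getInstance("AES/CTR/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE,
                new SecretKeySpec(key, "AES"), new IvParameterSpec(iv));
        System.out.println(toHex(cipher.doFinal(data)));
    }
}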
But when I encrypt the same data with the same IV and key using openssl, with the following command:
echo -n "hello world how r you?" | openssl enc -aes-128-ofb -K A744D2E994AFD02F5EADC83E614821BF -iv 0100000000000001417AA4D720A0A4EF | hexdump -C
I get the following encrypted bytes:
00000000 97 ae 91 12 91 d3 8a 4b 41 f6 1c e2 14 9e 64 32
00000010 a4 c8 94 3b 22 3c
So the first 16 bytes match in both cases, but after that JCE and openssl don't seem to get along. Can anyone tell me what could be going wrong here?
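One thing I notice while writing this up: the Java code asks for CTR mode while the openssl command asks for OFB. In case that mismatch is relevant, here is a sketch of the two ways I could make the modes line up (I am assuming SunJCE accepts the "AES/OFB/NoPadding" transformation string and that my openssl build supports -aes-128-ctr; I have not verified either):

// Java side switched to OFB, to match openssl's -aes-128-ofb
Cipher ofb = Cipher.getInstance("AES/OFB/NoPadding");
ofb.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(key, "AES"), new IvParameterSpec(iv));
byte[] ofbOut = ofb.doFinal(data);

or, going the other way on the openssl side:

echo -n "hello world how r you?" | openssl enc -aes-128-ctr -K A744D2E994AFD02F5EADC83E614821BF -iv 0100000000000001417AA4D720A0A4EF | hexdump -C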
I did find a reference to this post.
Is it correct that turning off L_ENDIAN in openssl will give the same result as JCE? If yes, what do I have to do on the JCE side to be compatible with openssl's L_ENDIAN mode?
I'd appreciate your response.
Thanks,
Sachin
P.S. Sorry, I posted this topic in the wrong forum. Can an admin move it to the "Security : Cryptography" forum?