Ran into an odd problem a bit ago that I'm hoping someone can shed some light on.
Our Oracle 10g database's timezone is GMT, which is also the timezone of the server it runs on. If I connect to the database and run

    SELECT SESSIONTIMEZONE FROM DUAL

from a Java 1.6 app, it returns -04:00 (US/Eastern). If I do the same thing from a C# app or a Perl script, it returns +00:00 (GMT). None of them explicitly sets the session timezone. It looks like Java is implicitly setting the session timezone to whatever the timezone is on the system executing the Java app.
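For reference, here's a minimal sketch of the Java side of the check. The connection URL, username, and password are placeholders, not our real values:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class SessionTzCheck {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details -- substitute your own.
            Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521:ORCL", "scott", "tiger");
            Statement stmt = conn.createStatement();
            ResultSet rs = stmt.executeQuery("SELECT SESSIONTIMEZONE FROM DUAL");
            if (rs.next()) {
                // From Java this prints -04:00 (US/Eastern), even though
                // the database server itself is on GMT.
                System.out.println("SESSIONTIMEZONE = " + rs.getString(1));
            }
            rs.close();
            stmt.close();
            conn.close();
        }
    }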
So why is the result of that query different over a Java connection? What does a Java connection do differently? Does the JDBC driver set the session timezone on connect? And is there a way to keep it from setting that value on its own (if that's what it's doing)?
This is causing issues with a couple of procs, but only when they're called from a Java app. I can work around it by issuing an ALTER SESSION SET TIME_ZONE... statement after connecting, but why is that necessary from Java and not from Perl or C#? What's different?
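For what it's worth, the workaround looks roughly like this. The '+00:00' value is an assumption on my part (our server is GMT), and the connection details are placeholders:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class AlterSessionWorkaround {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details -- substitute your own.
            Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521:ORCL", "scott", "tiger");
            Statement stmt = conn.createStatement();
            // Assuming GMT ('+00:00') is what the procs expect, reset the
            // session timezone that the driver apparently changed on connect.
            stmt.execute("ALTER SESSION SET TIME_ZONE = '+00:00'");
            stmt.close();
            // ... calls to the procs on this connection now run in GMT ...
            conn.close();
        }
    }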
Dave