Timestamps are not something I have thought much about in the past, but now I have reason to learn a bit more.
I'll keep this relatively short, and get right to my confusion.
If you store a Timestamp in a database (in my case, DB2), doesn't something on the back end convert it to UTC? And when you read it back out, doesn't it get converted to the local time zone? How does this happen?
Let's say you are using a computer in one time zone, but the database is running on a server in another time zone. What conversions take place upon storage and then retrieval?
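Part of my confusion, I think, is what a Timestamp even is. As far as I can tell it's just an instant (milliseconds since the Unix epoch), with no time zone attached; the zone only matters when you render it. A small self-contained sketch of what I mean (the class name and zones are just for illustration):

```java
import java.sql.Timestamp;
import java.text.SimpleDateFormat;
import java.util.TimeZone;

public class TimestampDemo {
    public static void main(String[] args) {
        // A Timestamp is just an instant: milliseconds since the Unix epoch.
        Timestamp ts = new Timestamp(0L); // the epoch, 1970-01-01T00:00:00Z

        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");

        // The same instant renders as different wall-clock times per zone.
        fmt.setTimeZone(TimeZone.getTimeZone("GMT"));
        System.out.println(fmt.format(ts)); // 1970-01-01 00:00:00

        fmt.setTimeZone(TimeZone.getTimeZone("America/New_York"));
        System.out.println(fmt.format(ts)); // 1969-12-31 19:00:00
    }
}
```

So I suspect the real question is which wall-clock rendering the driver writes into the database, and which zone it assumes when reading one back.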
And here is where it gets a little more Java specific. With a PreparedStatement or a ResultSet, you can supply a Calendar to control the time zone used when setting or retrieving a Timestamp.
Example:
Calendar calendar = Calendar.getInstance(TimeZone.getTimeZone("GMT"));
ps.setTimestamp(2, timestamp, calendar);
And you can read it out from a ResultSet in a similar fashion.
However, isn't using such methods technically wrong? If the database is trying to store your instant in time accurately, doesn't specifying a Calendar other than the one in your local time zone cause the database to actually store an incorrect instant in time?
So I need to read Timestamps from a database, and I've been told they are stored as GMT. This has caused me great confusion. I think what I need to do is read them out of the ResultSet using a Calendar set to the GMT time zone. But even though I think this will give the desired time, I realize I don't really understand any of it. It's hard to find good resources online that explain this, so I'm asking here. It's really hard for me to continue with other work when I don't understand something I'm doing; it has me pulling my hair out.
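To show why I think the Calendar matters, here is a sketch of the interpretation problem as I understand it, simulated without a database (the stored value and zones are hypothetical; I'm assuming the column holds a plain wall-clock value with no zone attached, which is how I understand DB2's TIMESTAMP type):

```java
import java.sql.Timestamp;
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.TimeZone;

public class GmtReadDemo {
    public static void main(String[] args) throws ParseException {
        // A hypothetical wall-clock value from the database, known to be GMT.
        String stored = "2013-06-01 12:00:00";

        // Interpreting it in GMT -- what I believe getTimestamp(col, gmtCalendar)
        // effectively does -- recovers the intended instant.
        SimpleDateFormat asGmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        asGmt.setTimeZone(TimeZone.getTimeZone("GMT"));
        Timestamp correct = new Timestamp(asGmt.parse(stored).getTime());

        // Interpreting the same digits in a local zone yields a different instant.
        SimpleDateFormat asNewYork = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        asNewYork.setTimeZone(TimeZone.getTimeZone("America/New_York"));
        Timestamp skewed = new Timestamp(asNewYork.parse(stored).getTime());

        // The two instants differ by the zone offset: 4 hours here (EDT in June).
        System.out.println((skewed.getTime() - correct.getTime()) / 3600000); // 4
    }
}
```

If that mental model is right, then reading with a GMT Calendar isn't "wrong" at all when the stored digits are known to be GMT; it's what keeps the instant correct. But I'd like someone to confirm that.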
Thanks for any help you can provide!