From: Ofnuts on
David J Taylor wrote:
> "Ofnuts" <> wrote in message news:4aef509d$0$12528$426a74cc(a)news.free.fr...
> []
>> So if a file has a June 21st 12:00 timestamp, the computer should
>> always display it as June 21st 12:00 as long as it is set up in that
>> time zone, whether DST applies or not. Of course, if the computer is
>> configured in the next timezone, then the same timestamp can be
>> displayed as June 21st, 11:00, and this will be the correct display in
>> summer and winter in that time zone.
> []
>> Bertrand
>
> Yes, I can see your argument, but if you take it to its logical
> conclusion, wouldn't you need to include the different time zone of the
> taking device in your calculation of displayed date as well as the time
> zone of the display device, so pictures taken at the same instant would
> show a different timestamp if they were taken in New York, London or Paris?

A timestamp is absolute. In computers it is stored as the number of
seconds (or milliseconds, or nanoseconds) since a reference time (00:00
UTC on January 1st, 1970 for Unix and a lot of web-related things)(*).
Whether it is later displayed as June 21st, 12:00 or June 21st, 09:00
depends only on the "reader". In other words the timezone is nothing
more than a parameter when displaying or parsing date representations,
and unless it is implicit (but with computers, "implicit" is seldom a
good idea) it should be part of the display or input string. Giving a
time as "2009/11/03 12:00" without a time zone is a bit like giving an
appointment at 12:00 but not telling which day. So, ideally, your three
cameras would attach the very same time stamp to the three photos.
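
To make this concrete, here is a small Python sketch (3.9 or later, for the
zoneinfo module; the timestamp value is just an example):

# One absolute instant, stored as seconds since 1970-01-01 00:00 UTC,
# rendered three different ways depending on the reader's time zone.
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

ts = 1257249600  # 2009-11-03 12:00 UTC as a Unix timestamp

print(datetime.fromtimestamp(ts, timezone.utc))                 # 2009-11-03 12:00:00+00:00
print(datetime.fromtimestamp(ts, ZoneInfo("Europe/Paris")))     # 2009-11-03 13:00:00+01:00
print(datetime.fromtimestamp(ts, ZoneInfo("America/New_York"))) # 2009-11-03 07:00:00-05:00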

However, the people who did the Exif standard overlooked the
timezone/DST problem. The timestamp is stored as a "YYYY:MM:DD HH:MM:SS"
character string, and doesn't include a time zone. So this information
is lost and if the three cameras above are on local time, they will
store a different timestamp string. If the standard had provided for a
time zone, they would have stored something like "2009/11/03 12:00
UTC+0" in London, "2009/11/03 13:00 UTC+1" in Paris, and "2009/11/03
07:00 UTC-5" in NY which are three different representations of the very
same timestamp.
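
A short Python sketch of what gets lost (hypothetical values, Python 3.9+):

# The Exif string parses to a "naive" datetime; it only becomes an
# absolute instant once you *guess* a time zone, and different guesses
# give different answers.
from datetime import datetime
from zoneinfo import ZoneInfo

exif = "2009:11:03 12:00:00"  # what a camera writes: no zone at all
naive = datetime.strptime(exif, "%Y:%m:%d %H:%M:%S")

for zone in ("Europe/London", "Europe/Paris", "America/New_York"):
    aware = naive.replace(tzinfo=ZoneInfo(zone))
    print(zone, aware.timestamp())  # three different Unix timestamps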

And the cherry on the cake is that the FAT/FAT32 file system used in
memory cards uses local time, and files are often transferred between
camera and computer using a card reader, so unless the camera is
carefully kept on local time things can get quite fun later, for
instance when geotagging the pictures.
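
As an illustration of the geotagging trap (a sketch only, with assumed
offsets), here is what a one-hour DST mistake does to the match against a
GPS track logged in UTC:

# The FAT timestamp is a bare local time.  If the software converting it
# to UTC applies the winter offset to a summer shot, every picture is
# matched against the GPS track one hour off.
from datetime import datetime, timedelta

fat_stamp = datetime(2009, 6, 21, 12, 0)   # naive local time from the card
applied_offset = timedelta(hours=1)        # CET, wrongly assumed
actual_offset  = timedelta(hours=2)        # CEST, actually in force in June

wrong_utc = fat_stamp - applied_offset     # 11:00 UTC
right_utc = fat_stamp - actual_offset      # 10:00 UTC
print(wrong_utc - right_utc)               # 1:00:00 of geotagging error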


(*) see "epoch" on Wikipedia
--
Bertrand
From: David J Taylor on

"Ofnuts" <> wrote in message
news:4af004ab$0$25364$426a74cc(a)news.free.fr...
[]
> A timestamp is absolute. In computers it is stored as the number of
> seconds (or milliseconds, or nanoseconds) since a reference time (00:00
> UTC on January 1st, 1970 for Unix and a lot of web-related things)(*).

... conveniently forgetting about things like leap seconds, perhaps?

> Whether it is later displayed as June 21st, 12:00 or June 21st, 09:00
> depends only on the "reader". In other words the timezone is nothing
> more than a parameter when displaying or parsing date representations,
> and unless it is implicit (but with computers, "implicit" is seldom a
> good idea) it should be part of the display or input string.

Agreed.

> Giving a time as "2009/11/03 12:00" without a time zone is a bit like
> giving an appointment at 12:00 but not telling which day. So, ideally,
> your three cameras would attach the very same time stamp to the three
> photos.

That would depend how the users have the cameras set, of course.

> However, the people who did the Exif standard overlooked the
> timezone/DST problem. The timestamp is stored as a "YYYY:MM:DD HH:MM:SS"
> character string, and doesn't include a time zone. So this information
> is lost and if the three cameras above are on local time, they will
> store a different timestamp string. If the standard had provided for a
> time zone, they would have stored something like "2009/11/03 12:00
> UTC+0" in London, "2009/11/03 13:00 UTC+1" in Paris, and "2009/11/03
> 07:00 UTC-5" in NY which are three different representations of the very
> same timestamp.

... and once lost, it's gone forever.

> And the cherry on the cake is that the FAT/FAT32 file system used in
> memory cards uses local time, and files are often transferred between
> camera and computer using a card reader, so unless the camera is
> carefully kept on local time things can get quite fun later, for
> instance when geotagging the pictures.
>
>
> (*) see "epoch" on Wikipedia
> --
> Bertrand

It can also be fun if:

- you take pictures before the time-change, and upload them to your
computer afterwards.

- you take pictures in one time-zone and upload them in another

and I'm sure you can think of even more circumstances. For those reasons,
I keep my camera set to UTC rather than local time, and I don't change it
between summer and winter. I don't bother about leap seconds on the
camera, as its clock isn't accurate enough to warrant it.

It is an important point to know that Windows displays the file time using
the current UTC offset (the one in effect today), rather than the offset
that was in effect on the taking date.
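
To illustrate the arithmetic (just a sketch of the behaviour as I
understand it, not the actual Windows code), assuming a UTC file time and
a UK machine:

from datetime import datetime, timedelta

file_utc = datetime(2009, 6, 21, 10, 0)   # file written in summer, UTC

offset_when_taken = timedelta(hours=1)    # BST, in force in June
offset_today      = timedelta(hours=0)    # GMT, in force when you look

print(file_utc + offset_when_taken)       # 2009-06-21 11:00 -- what the clock said
print(file_utc + offset_today)            # 2009-06-21 10:00 -- a current-offset display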

Cheers,
David

From: Ofnuts on
David J Taylor wrote:
>
> "Ofnuts" <> wrote in message news:4af004ab$0$25364$426a74cc(a)news.free.fr...
> []
>> A timestamp is absolute. In computers it is stored as the number of
>> seconds (or milliseconds, or nanoseconds) since a reference time
>> (00:00 UTC on January 1st, 1970 for Unix and a lot of web-related
>> things)(*).
>
> .. conveniently forgetting about things like leap seconds, perhaps?

No, they are taken into account, assuming the display and parsing code are
correct (mostly the display, since most serious computers self-synchronize
via NTP).

>> Whether it is later displayed as June 21st, 12:00 or June 21st, 09:00
>> depends only on the "reader". In other words the timezone is nothing
>> more than a parameter when displaying or parsing date representations,
>> and unless it is implicit (but with computers, "implicit" is seldom a
>> good idea) it should be part of the display or input string.
>
> Agreed.
>
>> Giving a time as "2009/11/03 12:00" without a time zone is a bit like
>> giving an appointment at 12:00 but not telling which day. So, ideally,
>> your three cameras would attach the very same time stamp to the three
>> photos.
>
> That would depend how the users have the cameras set, of course.

This assumes that the cameras would have a way to know the "absolute"
time (i.e., set to UTC, or given a time zone).

> It is an important point to know that Windows displays the file time
> using the current UTC offset (the one in effect today), rather than the
> offset that was in effect on the taking date.
>

Not even sure of the date part. If a file is stamped at 00:30 on June
21st, will Windows show it later as stamped at 23:30 on June 21st or at
23:30 on June 20th?
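
The plain arithmetic says it should roll back to the 20th (a quick Python
check, assuming a simple one-hour shift):

from datetime import datetime, timedelta

print(datetime(2009, 6, 21, 0, 30) - timedelta(hours=1))
# 2009-06-20 23:30:00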

--
Bertrand
From: Dave Cohen on
Ofnuts wrote:
> David J Taylor wrote:
>>
>> "Ofnuts" <> wrote in message
>> news:4af004ab$0$25364$426a74cc(a)news.free.fr...
>> []
>>> A timestamp is absolute. In computers it is stored as the number of
>>> seconds (or milliseconds, or nanoseconds) since a reference time
>>> (00:00 UTC on January 1st, 1970 for Unix and a lot of web-related
>>> things)(*).
>>
>> .. conveniently forgetting about things like leap seconds, perhaps?
>
> No, they are taken into account, assuming the display and parsing code
> are correct (mostly the display, since most serious computers
> self-synchronize via NTP).
>
>>> Whether it is later displayed as June 21st, 12:00 or June 21st, 09:00
>>> depends only on the "reader". In other words the timezone is nothing
>>> more than a parameter when displaying or parsing date
>>> representations, and unless it is implicit (but with computers,
>>> "implicit" is seldom a good idea) it should be part of the display or
>>> input string.
>>
>> Agreed.
>>
>>> Giving a time as "2009/11/03 12:00" without a time zone is a bit like
>>> giving an appointment at 12:00 but not telling which day. So,
>>> ideally, your three cameras would attach the very same time stamp to
>>> the three photos.
>>
>> That would depend how the users have the cameras set, of course.
>
> This assumes that the cameras would have a way to know the "absolute"
> time (ie, set to UTC, or given a timezone).
>
>> It is an important point to know that Windows displays the file time
>> using the current UTC offset (the one in effect today), rather than the
>> offset that was in effect on the taking date.
>>
>
> Not even sure of the date part. If a file is stamped at 00:30 on June
> 21st, will Windows show it later as stamped at 23:30 on June 21st or at
> 23:30 on June 20th?
>

There is no easy universal way around these problems. Fortunately, in the
grand scheme of things, it's not a big deal as long as you are aware of
what's happening. If one is that picky, set the camera to UTC, keep it
that way, and just do a mental adjustment as needed.
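
The "mental adjustment" is just this, for instance (a Python sketch,
assuming the camera clock was left on UTC and the date is hypothetical):

from datetime import datetime, timezone
from zoneinfo import ZoneInfo

shot = datetime.strptime("2009:06:21 10:00:00", "%Y:%m:%d %H:%M:%S")
shot = shot.replace(tzinfo=timezone.utc)           # camera was set to UTC
print(shot.astimezone(ZoneInfo("Europe/Paris")))   # 2009-06-21 12:00:00+02:00
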
From: bucky3 on
Thanks everyone, that was quite an educational thread.

Just curious, how do other operating systems (like Unix or Mac) handle
the file timestamp with regard to DST? Do they have the same problem,
or do they handle it better?