Last-Modified Header Timestamp Not Reliable? - http-headers

I was working on a project dealing with the Last-Modified times of photos from various websites. One of the websites drew my attention because every product photo I requested came back with the same Last-Modified header, Wed, 20 Jan 1988 04:20:42 GMT, in the response headers. (See photo below.)
The timestamp looks really strange to me because 1988 is before the web and the PNG/JPEG image formats (which the product photos use) even existed. I need to evaluate the validity of the timestamp for the project, but I really want to understand more before simply declaring it a fake time. I have several thoughts that can't be wrapped into one question, so I'm listing them out as follows:
At a high level, where does the Last-Modified timestamp come from? Is it the last-modified field from the file system?
The company started in 1986 according to its profile, so it's still possible they had some digital photos in storage in 1988. Would it be technically possible that the original photos were created in 1988, stored, and then served later once the web became available?
In the response headers, the Date field correctly reflects the current date at the time I requested the resource. Does that support the validity of the timestamp, or is it still possible that the server time was changed back and forth at some point?
The product photos are PNG or JPEG, with resolutions up to today's standards. If we assume the timestamp is valid, how did they manage that without altering the last-modified metadata? Note: the photos all seem to have precisely the same timestamp.
As mentioned, I'm still trying to understand what's going on under the hood, so any thoughts on how I could continue digging into this would be helpful as well. Thanks in advance.

There are no obvious answers to these questions. You’ll have to ask the people who built the site.
Photo metadata is 100% under the control of the developers/operators of the site. If you wanted to use file system timestamps, you could. If you wanted to substitute a timestamp from another source, or a made-up timestamp, you could.
You might be able to find a clue by downloading photos and examining the EXIF metadata that’s embedded in the image data. That’s another potential source of the timestamp data. But of course that could also be doctored.
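As a first step in digging, note that the Last-Modified value is just an RFC 7231 date string, so you can parse it and sanity-check it with the Python standard library. A minimal sketch (the header value here is the one from the question):

```python
from email.utils import parsedate_to_datetime

# Example Last-Modified value as it appears in the response headers.
last_modified = "Wed, 20 Jan 1988 04:20:42 GMT"

# parsedate_to_datetime handles the RFC 5322/7231 date format used by HTTP.
dt = parsedate_to_datetime(last_modified)

print(dt.year)         # 1988
print(dt.isoformat())  # 1988-01-20T04:20:42+00:00
```

Collecting the parsed values across many photos can itself be informative: many distinct files sharing the exact same second points toward a hard-coded or bulk-set value rather than genuine per-file mtimes.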

Related

Mailchimp Archive get more than 20 results

I am using Mailchimp's archive URL in PHP -- I am simply fetching the URL and displaying it as it sits, in order to white-label the funky URL, i.e.
https://us17.campaign-archive.com/home/?u=xxxxxyyyyyxxxxyyyy&id=xxxxyyyyyxxxxyy
In doing so I have read through both the Archive and API documentation and have found nothing on a parameter for row count. It defaults to 20, as stated in the Archive docs, but I know I have seen archives with a larger row count than that. Is anyone familiar enough with the URL parameters used by Mailchimp to increase the row count to, say, 100? I.e.
https://us17.campaign-archive.com/home/?u=xxx&id=yyy&count=100
It's been a problem for years. Even in 2022 there is still no known way for an end user to get more than the past 20 issues from Mailchimp; they simply refuse to add/allow that ability.
However, the newsletter creator can go into their backend and generate/enable a JavaScript embed that has a &show= parameter, which can be increased.
https://mailchimp.com/help/add-an-email-campaign-archive-to-your-website/
Again, only the campaign creator can do this, not a random end user/reader.

Timeset - UTC - Cameras - Traveling - a complete mess

I'm an amateur photographer. I shoot with video cameras, a mobile phone, a camera, a drone, etc. My goal is to sync all the data to the correct time and add GPS tags.
I ended up configuring everything to UTC (except the mobile), and afterwards adding GPS tags that are also in UTC.
The main problem is how to determine the correct timezone based on where the picture was taken, and whether I even have to, since all my mobile photos already have the timezone (and daylight saving) applied when taking the video/photo.
I'm now using an open API to get the correct GMT offset based on geolocation and date/time.
Am I doing it right? Am I interpreting date/time changes correctly? I saw afterwards that Lightroom adjusts based on the date/time info and the place where it's used (photos).
It's a mess.
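Keeping captures in UTC and converting only for display is a sound approach. Assuming you have already looked up the IANA zone name for the shot's GPS location (via your open API or a local database -- the zone name below is a hypothetical lookup result), the conversion itself handles daylight saving automatically. A minimal Python sketch:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # stdlib since Python 3.9

# Capture time as recorded by the camera, in UTC.
capture_utc = datetime(2023, 7, 1, 12, 0, 0, tzinfo=timezone.utc)

# Zone name assumed to come from a GPS -> timezone lookup (hypothetical result).
zone = ZoneInfo("Europe/Madrid")

# astimezone applies the zone's UTC offset, including DST for that date.
local = capture_utc.astimezone(zone)
print(local.isoformat())  # 2023-07-01T14:00:00+02:00 (CEST, DST applied)
```

The point is that you never store the local time; you store UTC plus the zone identifier, and DST rules are applied at conversion time from the zone database.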

See how much was donated to a particular Thing

Say you want to have a Kickstarter-like site using Flattr, where you create a Thing for each individual user. How would you be able to tell (with the API) how much each user's Thing received? It wouldn't update right away (it would have to update at the first of each month), but would it be possible at all? If there's no way to do this, could you tell how much a particular user Flattr'd to you in total?
In the details of each month's revenue report there's a link to download the report as a CSV file, which you can then import into your system to find out how much each thing has received. The information is not yet exposed through the API (see this question), so you have to download the files manually -- luckily there are only twelve months in a year, so it's not too much work :)
The CSV file contains these columns:
period - Date in the format of YYYY-MM
id - The internal numeric id of the flattr thing
flattr url - The URL to the thing on Flattr.com
url - The original URL that the thing is pointing to
title
clicks - The amount of clicks for the thing during the period
revenue - The revenue for the thing during the period
clicks total
So, as you can see, you can't tell how much each individual click was worth if a thing received more than one click, nor can you see which users clicked the thing. (You can see some of the users by asking the regular API for more information, but since some users are anonymous you can't identify them without asking for their permission.)
If you really need to know whether a user has flattred something, you can force them to flattr the thing through your system by having them authenticate with your system using the Flattr API. That way you will at least know whether the user made a flattr, but you will still have trouble figuring out how much their flattrs are worth. Then again, that's kind of the point of Flattr: people should spend whatever they personally feel comfortable with, rather than what others think they should be spending.
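Importing the CSV is straightforward. A minimal Python sketch that totals revenue per thing from a downloaded report (the file contents below are made up to match the columns above, and I'm assuming a comma delimiter -- adjust if the actual export differs):

```python
import csv
import io
from collections import defaultdict

# Made-up sample data matching the column layout of a Flattr revenue report.
report = """period,id,flattr url,url,title,clicks,revenue,clicks total
2012-03,123,https://flattr.com/thing/123,http://example.com/a,Thing A,4,1.20,10
2012-03,456,https://flattr.com/thing/456,http://example.com/b,Thing B,1,0.35,3
"""

# Sum revenue per thing id; with multiple monthly files, feed each one in.
totals = defaultdict(float)
for row in csv.DictReader(io.StringIO(report)):
    totals[row["id"]] += float(row["revenue"])

print(totals["123"])  # 1.2
```

In practice you would open each downloaded monthly file instead of the inline string, and accumulate into the same `totals` dict across months.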

mapping location to a time zone

I need to get the time zone for a given address/location. Assume that the address/location can be reverse geocoded (using google) to a lat/lng if necessary.
This means that I may not have a zip code.
I was really hoping that Google provided some kind of API for this, but it seems they don't. At a minimum you can Google search for "time in washington, dc" and get the time/TZ -- but then I'd have to screen-scrape that, which is not fun :(
I know there are databases available that map locations to time zones, but that'd have to be maintained. Has anyone come up with a tricky solution to this problem?
Thanks!
Google provides an API for this.
https://maps.googleapis.com/maps/api/timezone/<json or xml>?location=<lat>,<lng>&timestamp=<unix timestamp>&sensor=<true or false>
eg:
https://maps.googleapis.com/maps/api/timezone/json?location=39.6034810,-119.6822510&timestamp=1331161200&sensor=false
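The JSON response includes `rawOffset` and `dstOffset` (both in seconds) plus a `timeZoneId`. A minimal Python sketch of building the request URL and reading a response -- the response body here is a canned example of the documented shape, not a live call:

```python
import json
from urllib.parse import urlencode

# Build the request URL (same parameters as the example above).
params = {
    "location": "39.6034810,-119.6822510",
    "timestamp": 1331161200,
    "sensor": "false",
}
url = "https://maps.googleapis.com/maps/api/timezone/json?" + urlencode(params)

# A canned response body of the documented shape (not fetched live here).
body = json.loads(
    '{"dstOffset": 0, "rawOffset": -28800, "status": "OK",'
    ' "timeZoneId": "America/Los_Angeles",'
    ' "timeZoneName": "Pacific Standard Time"}'
)

# Total offset from UTC for that location at that timestamp, in seconds.
total_offset_s = body["rawOffset"] + body["dstOffset"]
print(body["timeZoneId"], total_offset_s)  # America/Los_Angeles -28800
```

Note that the `timestamp` parameter matters: `dstOffset` reflects whether daylight saving was in effect at that moment, so the same location can return different total offsets for different timestamps.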
http://www.geonames.org/ provides an API that returns the timezone given a lat/lng
The question needs to be clarified a bit more: what are you doing this for, and what language(s) are you using? But here is how I would approach the problem. First, create a table-like structure that translates longitudes into zones (much like this table). Next, query GMT, either locally or on the web somehow, and finally take the offset from the table and add it to GMT. This way there is no "maintenance" of the table, since these longitudes don't change.
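The longitude-to-offset idea boils down to dividing by 15 degrees per hour of offset. A minimal sketch (note this deliberately ignores political boundaries and DST, which is exactly why real zone databases exist and why this is only a rough approximation):

```python
def naive_utc_offset_hours(lng: float) -> int:
    """Approximate UTC offset from longitude alone: 15 degrees per hour."""
    return round(lng / 15)

print(naive_utc_offset_hours(-119.68))  # -8 (roughly US Pacific)
print(naive_utc_offset_hours(139.69))   # 9  (roughly Japan)
```

This can be off by an hour or more near zone borders (e.g. China uses a single zone across ~60 degrees of longitude), so treat it as a fallback, not a replacement for a timezone lookup.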
I've written a small Java class to do this, with the data structure embedded in the code. If you want to avoid using a web service, an accuracy of 22 km is good enough, and you're happy to assume that timezone boundaries don't change, then it could help.
https://sites.google.com/a/edval.biz/www/mapping-lat-lng-s-to-timezones

What are the effects of the Last-Modified header (LMH) changing too often on a dynamic site?

We have a web application with a defect: it updates every page's Last-Modified header to the date of the last publish. We are in the process of fixing the defect, but we wanted to know whether it might impact our search engine results for this site.
Basically, each time any page on the site gets updated, every page's last-modified date is updated, even if that page itself has not changed.
Is there any possibility of a search engine flagging the site as spam, since all pages appear to change too often? -- Theory
It's unlikely to change much, since all the search engines will notice that your content hasn't actually changed. They will crawl at a rate commensurate with the observed rate of content change, more or less regardless of what you tell them, and small changes like that won't be marked as content changes in the index.
Changing the last modified date too often will NOT have a negative impact with the big 3.
The only way you can affect crawl rate via metadata (and sitemap.xml) is to reduce it. The reason is that indicators that would increase ranking/indexing are too easily abused. However, reducing spider rate is still an option for the resource-conscious webmaster.
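For the reduce-only lever mentioned above, the sitemap protocol's `<changefreq>` and `<lastmod>` hints look like this (values are illustrative, and search engines treat them as hints, not commands):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/product/1</loc>
    <lastmod>2012-04-01</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

Once the defect is fixed, accurate `<lastmod>` values here give crawlers a truthful signal to recalibrate against.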