I have built a cookie consent module that is used on many sites, all running on the same server architecture, on the same cluster. Visitors of these sites can administer their cookie settings (e.g. no advertising cookies, but allow analytics cookies) on a central domain that keeps track of the user preferences (and the sites that have been visited).
When they change their settings, all sites the visitor has been to that use my module (a list kept in a cookie) are contacted by loading them, with a parameter, in hidden iframes. I have tried the same with images.
On those sites a rewrite rule is in place that detects the parameter, retracts the cookie (sets its expiry date in the past), and redirects to a page on the module site (or to an image on the module site).
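To illustrate, a minimal sketch of such a rule (the parameter name, cookie name, and module domain are all made up here; it assumes mod_rewrite plus mod_headers):

RewriteEngine On
# Detect the hypothetical retraction parameter in the query string
RewriteCond %{QUERY_STRING} (^|&)retract_consent=1(&|$)
# Flag the request and redirect to an image on the module domain
RewriteRule ^ http://consent-master.example.com/ack.gif [R=302,L,E=RETRACT:1]
# On that redirect response, send the consent cookie with a date in the past
Header always add Set-Cookie "cookie_consent=; Expires=Thu, 01 Jan 1970 00:00:00 GMT; Path=/" env=RETRACT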
This scheme works in all browsers except IE, which requires a P3P header before it will accept such third-party cookie operations (the reason it does not work for images is probably similar).
I also tried loading a non-existent image on the source domain (that is, the domain that is using the module) through an image tag, obviously resulting in a 404. This works in all browsers except Safari, which doesn't set cookies on 404s (at least, that is my conclusion).
My question is, how would it be possible to retract the cookie consent cookie on the connected domains, given that all I can change are the rewrite rules?
I hope that I have explained the problem well enough for you guys to give an answer, and that a solution is possible...
I am still not able to resolve this question, but looking at it the other way around, there is a solution. Using JSONP (for an example, see: Basic example of using .ajax() with JSONP?), the client domain can load information from the master server and compare it to local information.
Based on that, the client site can retract the cookie (or even replace it) and force a reload, which will trigger the rewrite rules...
A drawback of this solution is that it hits the server on every pageview, and in my case that's a real problem. Checking only every x minutes or so (by setting a temporary cookie) would mitigate this; a sketch of that follows.
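A rough sketch of that throttled check, assuming jQuery on the client sites and a hypothetical JSONP status endpoint on the master domain:

// Read a cookie by name (returns null if absent)
function getCookie(name) {
    var match = document.cookie.match(new RegExp('(^|; )' + name + '=([^;]*)'));
    return match ? decodeURIComponent(match[2]) : null;
}

// Only ask the master server every 10 minutes, tracked by a temporary cookie
if (!getCookie('consent_checked')) {
    var expiry = new Date(new Date().getTime() + 10 * 60 * 1000).toUTCString();
    document.cookie = 'consent_checked=1; expires=' + expiry + '; path=/';

    $.ajax({
        url: 'http://consent-master.example.com/status', // hypothetical endpoint
        dataType: 'jsonp'
    }).done(function (data) {
        // Compare the master's consent version with the local cookie
        if (data.version !== getCookie('cookie_consent')) {
            // Expire the local cookie and reload so the rewrite rules kick in
            document.cookie = 'cookie_consent=; expires=Thu, 01 Jan 1970 00:00:00 GMT; path=/';
            window.location.reload();
        }
    });
}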
Another, even simpler solution would be to expire all the cookies on the client site every x hours. This would force a revisit of the main domain as well.
I'm coming to you because I'm stuck on the following problem:
I have a website, hosted on a server on which I will be doing messy maintenance work (meaning I'm not sure what I'm doing, so I might crash everything).
I'd like to temporarily redirect all the traffic to a simple page stating the website is under maintenance and will be back soon.
So this page must be hosted on another server, since mine will be down.
To make matters more complicated, I have an SSL certificate on my whole website, so most of my users have the HTTPS address remembered in their browser (and that's also what Google has indexed).
I've tried hosting the simple page on a free host, and also on Microsoft Azure (because I already have an account for another web project). However, I've encountered the same problem in both cases: users coming to the website see big red flags from their browsers, saying that the connection isn't private (ERR_CERT_COMMON_NAME_INVALID).
What would be the proper way to proceed and redirect my users in a smooth way?
Thanks in advance!
Rouli
Can someone please help me find a solution for maintaining the session across subdomains?
The site uses fake subdomains for users, e.g. thisuser.mysite.com. All the fake subdomains map to the main site (mysite.com), so there's a common database for everything.
The subdomains are used only for a couple of components (com_xxx) on the site. For other components the user is redirected to the main site via .htaccess.
The problem is when a user is redirected to mysite.com from thisuser.mysite.com and vice versa: their session is not maintained, and the user has to log in again.
I have tried updating the cookie domain in php.ini to '.mysite.com' but it doesn't seem to help.
Is it possible for the site to have auto-login across all subdomains and the main domain without any core hacks, assuming the solution lies in making cookies readable from all subdomains, irrespective of where they are set?
Thanks all, for your time and suggestions!
I'm not sure how you could do this....
Here is just an idea; it would rely on JavaScript...
When a user logs in using your login form, a hidden iframe would exist and JavaScript would post the login data to the login page of each domain of your site, chaining them...
I don't feel it's a safe thing though... maybe I'm wrong...
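Purely to illustrate the mechanism (the login URL and field names are made up, and as said, posting credentials around like this is questionable security-wise), it could look roughly like this:

// Post the login data to a hidden iframe for each domain, one per domain
function chainLogin(hosts, username, password) {
    hosts.forEach(function (host, i) {
        var iframe = document.createElement('iframe');
        iframe.name = 'login_target_' + i;
        iframe.style.display = 'none';
        document.body.appendChild(iframe);

        var form = document.createElement('form');
        form.action = 'http://' + host + '/login'; // hypothetical login URL
        form.method = 'post';
        form.target = iframe.name;

        [['username', username], ['password', password]].forEach(function (field) {
            var input = document.createElement('input');
            input.type = 'hidden';
            input.name = field[0];
            input.value = field[1];
            form.appendChild(input);
        });

        document.body.appendChild(form);
        form.submit();
    });
}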
You could use Joomla's MooTools framework to send an AJAX request to each domain...
Otherwise, you might want to check how Joomla creates a session row in the database for each user on the site; maybe you can just create one for each domain with a single login. I'm going to check my MySQL...
Are you using Joomla 1.5 or 2.5?
Otherwise, I found this document for you:
http://docs.joomla.org/Multiple_Domains_and_Web_Sites_in_a_single_Joomla!_installation
Okay, this was easy. I was testing on the local machine, and it seems that if the cookie domain doesn't have the leading dot, the cookies are not handled well.
Just ensuring that the cookie domain is set to '.mysite.com' gets the job done.
1. It is also recommended that you use the same Joomla "secret" configuration value in the different websites, as it is used to check the data exchanged between the different domains.
2. Ensuring that the cookie domain is set to '.mysite.com' gets the job done.
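For reference, the setting in question, either in php.ini or per directory under mod_php:

; php.ini
session.cookie_domain = ".mysite.com"

# or in .htaccess (mod_php)
php_value session.cookie_domain .mysite.com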
I'm building a site with Umbraco, and there are a couple of pages that need to be visited over HTTPS instead of HTTP (e.g. a login page).
I've seen a couple of macros that get put on the page that needs to use HTTPS, and essentially just check the protocol used and do a Response.Redirect with the correct protocol if necessary. This seems like a poor way of achieving a fairly basic requirement - ideally I'd want Umbraco to render any links to these pages as <a href="https://...", not do a redirect when the user goes to the page.
With these redirecting macros, there's also the possibility of a browser displaying a warning if the user is on an HTTPS page and navigates to an HTTP one. If the links are relative, the user will be redirected from HTTPS to HTTP, and the browser may warn about this.
Is there a way to achieve this without modifying any Umbraco framework code?
There's currently no built-in way to make a few pages in Umbraco return an HTTPS URL.
The only way I can think of doing this at the moment is just by making sure that you set up your links correctly.
But there's no way of stopping people from entering the insecure link. That is where the redirects come in handy, though: they make sure you don't reach a secure page insecurely.
I would recommend running the whole site over HTTPS. In the past, performance would have been an objection to that; with modern servers, it really shouldn't be a problem any more.
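If you do go all-HTTPS, a minimal sketch of a site-wide redirect for IIS (which Umbraco typically runs on), using the URL Rewrite module, inside <system.webServer> in web.config:

<rewrite>
  <rules>
    <rule name="Force HTTPS" stopProcessing="true">
      <match url="(.*)" />
      <conditions>
        <add input="{HTTPS}" pattern="off" ignoreCase="true" />
      </conditions>
      <action type="Redirect" url="https://{HTTP_HOST}/{R:1}" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>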
One of YSlow's measurables is to use cookie-free domains to serve static files.
"When the browser requests a static
image and sends cookies with the
request, the server ignores the
cookies. These cookies are unnecessary
network traffic. To workaround this
problem, make sure that static
components are requested with
cookie-free requests by creating a
subdomain and hosting them there." --
Yahoo YSlow
I interpret this to mean that I could experience performance gains if I move www.example.com/images to static.example.com/images.
Although this is easy to do, I would lose the handy ability within my content management system (Joomla/WordPress) to easily reference and link to these images.
Is it possible to use .htaccess to redirect all requests for a particular folder on www.example.com to a folder on static.example.com instead? Would this method also fool the CMS into thinking the images were located in the default locations on its own domain?
Is it possible to use .htaccess to redirect all requests for a particular folder on www.example.com to a folder on static.example.com instead?
Possible, but counterproductive: the client would have to make an HTTP request, get the redirect response, then make another HTTP request.
This costs a lot more than the single line of cookie data saved!
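For completeness, the redirect being asked about would look something like this in .htaccess (domains are placeholders), and it is exactly this extra round trip that makes it counterproductive:

RewriteEngine On
# Send every request under /images to the static domain
RewriteRule ^images/(.*)$ http://static.example.com/images/$1 [R=301,L]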
Would this method also fool the CMS into thinking the images were located in the default locations on its own domain?
No.
"Although this is easy to do, I would lose the handy ability within my content management system (Joomla/WordPress) to easily reference and link to these images."
What you could try to do is create a plugin in Joomla that dynamically creates these references.
For example, you could have a plugin that, when you enter {dinamic_path path} in an article, prefixes the path provided with 'static.example.com/images'. So, every time you need to change the server path, you just change it in the plugin. For the links that are already in the database, you can try to use phpMyAdmin to change them to this structure.
It still loses the WYSIWYG ability in TinyMCE, but it is an alternative. A rough sketch of such a plugin follows.
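A very rough sketch of such a content plugin, assuming Joomla 2.5 conventions (Joomla 1.5 uses the older onPrepareContent event instead); the tag name and static domain are just examples:

<?php
// plugins/content/dinamicpath/dinamicpath.php
defined('_JEXEC') or die;

jimport('joomla.plugin.plugin');

class plgContentDinamicpath extends JPlugin
{
    // Replace {dinamic_path some/image.png} with the static-domain URL
    public function onContentPrepare($context, &$article, &$params, $page = 0)
    {
        $article->text = preg_replace(
            '/\{dinamic_path\s+([^\}]+)\}/',
            'http://static.example.com/images/$1',
            $article->text
        );
    }
}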
In theory you could create a virtual domain that points directly to the images folder, such as images.example.com. Then in your CMS (hopefully at the theme layer) you could replace any paths that point to the images folder with an absolute path to the subdomain.
The redirects would cause far more network traffic, and far more latency, than simply leaving things as they are.
It would redirect the request but the client would still be sending its cookies to the server, so really you accomplished nothing. You would have to directly access the files from a domain that isn't storing cookies for it to work.
What you really want to do is use staticexample.com/images instead of static.example.com/images, so that you don't pick up any cookies that you may have set on the example.com domain. If all you do is serve images from that domain with a simple Apache server or something, then you can configure that server not to return even a session cookie.
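A sketch of what that could look like as a dedicated static vhost (all names are placeholders); mod_headers strips cookie headers in both directions just in case:

<VirtualHost *:80>
    ServerName staticexample.com
    DocumentRoot /var/www/static
    # Belt and braces: drop cookie headers in both directions
    RequestHeader unset Cookie
    Header unset Set-Cookie
</VirtualHost>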
The redirects are a very bad idea. Cookies cause some performance hits but round trips to the server such as a redirect would cause are a much more serious performance issue.
I did the following and it worked (note that FilesMatch has no "!" negation operator, so a negative lookahead is needed to exclude the image files):

<FilesMatch "^(?!.*\.(gif|jpe?g|png)$)">
php_value session.cookie_domain example.com
</FilesMatch>

The idea is that the session cookie domain is set without a leading dot (so it is not shared with subdomains) for everything except image files, and the images are then served cookie-free.
Whenever I login to one Google service, I am automatically logged in all their other websites on different domains.
What I want to know is how they are able to access the disparate cookies and sessions that belong on another domain.
I tried searching online but I couldn't find any information. I could probably pull out firebug and try to find out but I am sure someone here knows.
A Google Login works like this:
1) You log in, normally at a login page that is under the Google.com/accounts domain.
1a) If you aren't on the Google.com/accounts domain, it will forward you there after you post the form. This can be seen on sites like Blogger.
Once you arrive at the Google.com/accounts domain, they do two things
2) They set cookies that are specific to the Google.com/accounts domain and that can only be sent over a secure connection. This is to verify your identity later on.
I say cookies, plural, because there are several cookies bound to the google.com/accounts domain. I believe that one of them is there to make sure everything doesn't fail if secure connections aren't allowed.
3) They set a cookie that spans all the subdomains, using .google.com as its domain, because this makes the cookie available to any google.com subdomain.
4) They forward you back.
5) If it is a site on a different domain, like Blogger, they send along an authorization key in the URL. The page sees it, verifies it, and sets the cookie for the different domain. A technique like this can be seen in Google's OAuth.
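Step 5, the key-in-the-URL handoff, could be sketched like this in PHP (every name here is hypothetical; a real implementation needs signed, expiring keys):

<?php
// e.g. reached as http://thirdparty.example.com/auth.php?key=...
$key = $_GET['key'];

// Hypothetical check of the key against the central accounts domain
if (verify_key($key)) {
    // Set the session cookie for this domain (and its subdomains)
    setcookie('session', session_for_key($key), 0, '/', '.thirdparty.example.com');
}

header('Location: /');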
Here is where that Secure Cookie comes in.
If you notice, whenever you go to a Google site after you close your browser, they forward you to the google.com/accounts path, where they reverify you over a secure connection and then reset the subdomain-wide cookie. Then they send you back.
Furthermore, some sites like Google AdSense use the same technique as Google.com/accounts: making a secure cookie on a specific path, and then using more global cookies to allow greater access.
Some of this is guessing, but given what a non-insider can see, I believe that is close to the truth.
Note: I literally spent like an entire month just browsing from Google Site to Google Site seeing how they did stuff. By upvoting this post, you are decreasing the sadness I have for having no life