What happens to old links if I change my custom URL/subdomain?

If I were to use a specific custom domain (e.g. a.example.com) for a while (creating links via the API, via the Marketing tab in the dashboard, etc.) and then change it later on (e.g. to b.example.com), is there defined behaviour for exactly what happens with the old a.example.com links that are out in the wild?
The docs here simply say:
switching can cause significant problems with your existing links
What exactly are those problems? From a technical perspective, if a.example.com is still pointing to custom.bnc.lt, it should still be possible to identify what app that's for and what links it'll resolve to.
Just curious to know if anyone has any experience with this or a definitive answer to whether or not the "old" links will be broken after changing the custom domain and what "significant problems" may be encountered in doing so.
Thanks in advance!

Alex from Branch.io here: if you change your link domain, all your existing links will stop working and give errors.
While you're right that the CNAME record on the old domain is still pointing to custom.bnc.lt and will still forward traffic there, our backend performs a lookup to make sure the domain of the incoming link is affiliated with a known app in the system. If it doesn't find a match, we don't complete the link routing process.
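Alex's explanation implies a lookup keyed on the Host header of the incoming request. Here is a minimal sketch of that idea (hypothetical TypeScript, not Branch's actual code): the CNAME still delivers the traffic, but once the old domain is no longer registered to any app, routing dead-ends.

```typescript
// Hypothetical sketch of the lookup described above; not Branch's real code.
const domainToApp = new Map<string, string>([
  ["b.example.com", "my-app-id"], // only the *current* link domain is registered
]);

function routeLink(requestHost: string, path: string): string | null {
  const appId = domainToApp.get(requestHost);
  if (appId === undefined) {
    // a.example.com lands here after the switch: no app match, no routing
    return null;
  }
  return `resolved ${path} for app ${appId}`;
}

console.log(routeLink("b.example.com", "/promo")); // resolves normally
console.log(routeLink("a.example.com", "/promo")); // null -> error page
```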
As far as I know, we don't have any plans to support multiple domains for each Branch app configuration in the future. So the advice still holds!
We recommend that you choose one domain or subdomain to use with Branch and stick with it, as switching can cause significant problems with your existing links.

Related

Workarounds for Safari ITP 2.3

I am very confused as to how Safari's ITP 2.3 works in certain respects, and why sites can't easily circumvent it. I don't understand under what circumstances limits are applied, what the exact limits are, what they are applied to, and for how long.
To clarify my question I broke it down into several cases. I will be referring to Apple’s official blog post about ITP 2.3 [1] which you can quote from, but feel free to link to any other authoritative or factually correct sources in your answer.
For third-party sites loaded in iframes:
Why can't they just use localStorage to store the values of cookies, and send this data along not as actual browser cookies, but as data in the body of the request? Similarly, they can parse the response to update localStorage. What limits does ITP actually place on localStorage in third-party iframes?
If the localStorage is frequently purged (see question 1), why can't they simply use postMessage to tell a script on the enclosing website to store some information (perhaps encrypted) and then spit it back whenever it loads an iframe? (A sketch of both ideas follows below.)
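For concreteness, questions 1 and 2 amount to something like the following browser-side sketch (hypothetical code; publisher.example and the message shapes are made up, and whether ITP purges this storage is exactly what's being asked):

```typescript
// Question 1: keep "cookie" values in localStorage and ship them in the
// request body instead of the Cookie header.
async function sendWithStoredId(url: string, payload: object): Promise<void> {
  const id = localStorage.getItem("pseudo-cookie") ?? crypto.randomUUID();
  localStorage.setItem("pseudo-cookie", id);
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ id, ...payload }), // data in the body, not a browser cookie
  });
  const { id: refreshed } = await res.json(); // parse the response to update localStorage
  localStorage.setItem("pseudo-cookie", refreshed);
}

// Question 2: if the iframe's storage gets purged, ask the embedding page to
// hold the value and hand it back on the next load.
window.parent.postMessage({ type: "store", value: "opaque-blob" }, "https://publisher.example");
window.addEventListener("message", (e: MessageEvent) => {
  if (e.origin === "https://publisher.example" && e.data?.type === "restore") {
    localStorage.setItem("pseudo-cookie", e.data.value);
  }
});
```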
For sites that use link decoration
I still don't understand what the limits on localStorage are for third-party sites in iframes which did NOT get classified as link-decorator sites. But let's say they are link-decorator sites. According to [1], Apple only starts limiting things further if there is a query string or fragment. But can't a website rather trivially store this information in the URL path before the query string, i.e. /in/here instead of ?in=here? Certainly large companies like Google can trivially choose to do that?
In the case where a site has been labeled a tracking site, does that mean all its non-cookie data is limited to 7 days? What about cookies set by the server, aren't they exempted? If so, simply make a request to your server to set the cookie instead of using JavaScript (sketch below); after all, the operator of the site is very likely to also have access to its HTTP server and app code.
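For question 4, the server-set cookie idea would look roughly like this (a TypeScript/Node sketch standing in for whatever the site's server actually runs; ITP's caps apply to cookies written from JavaScript via document.cookie, while this one arrives in a response header):

```typescript
import { createServer } from "http";

// Sketch: instead of writing document.cookie in script (which ITP caps),
// have your own first-party server set the cookie in the HTTP response.
createServer((req, res) => {
  res.setHeader(
    "Set-Cookie",
    "uid=abc123; Max-Age=31536000; Path=/; Secure; HttpOnly; SameSite=Lax"
  );
  res.end("cookie set server-side");
}).listen(8080);
```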
For all sites
Why can't a service like Google Analytics or Facebook's widgets simply convince a site to add a CNAME to their DNS, putting Google's or Facebook's servers under a subdomain like gmail.mysite.com or analytics.mysite.com (see the sketch after these questions)? And then boom, they can read and set cookies again, in some cases even on the top-level domain for website owners who don't know better. Doesn't this completely defeat the goals of Apple's ITP, since Google and Facebook have now become a "second party" in some sense?
Here on Stack Overflow, when we log out on iOS Safari, the Stack Overflow network is able to log us out of multiple sites at once. How is that even accomplished if no one can track users across websites? I have heard it said that "second party cookies" can still be stored, but what exactly makes a second-party cookie different from a third-party one?
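For question 5, the CNAME trick needs almost no code, which is what makes it attractive: once the site owner creates the DNS record, the tracker's script is served first-party (hypothetical names throughout):

```typescript
// DNS, configured by the site owner rather than by code:
//   analytics.mysite.com.  CNAME  tracker.bigadcompany.example.
// A script loaded from analytics.mysite.com now runs as first-party and can
// write a cookie scoped to the whole registrable domain:
document.cookie = "uid=abc123; domain=.mysite.com; path=/; max-age=31536000";
```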
My question is broken down into 6 cases but the overall theme is, in each case: how does Apple’s latest ITP work in that case, and how does it actually block all cases of potentially malicious tracking (to the point where a well-funded company can’t just do the workarounds above) while at the same time allowing legitimate use cases?
[1] https://webkit.org/blog/9521/intelligent-tracking-prevention-2-3/

Using Magento as the main, and creating a single sign on to integrate with other third party software

This is something I have been trying to work on for a good long time. It started with Prestashop, integrating other scripts and pieces of the puzzle I needed for an overall website. I am currently still using Prestashop as my webstore but have since switched to Magento.
I switched to Magento because of its flexibility and because, overall, I think it is the best solution, with the best backing, and the best overall eCommerce script to go with.
That being said, the issues I was having with Prestashop appear to be the same ones I will continue to have in any aspect where I try to integrate things together in perfect harmony.
I have Magento set up as the main portion of the website. Inside Magento, in subfolders, I have Wordpress installed in a folder called "articles", and I have gone with FluxBB as my message forums (in a subfolder called "forums") because of its simplicity in not having a load of bloated extra features that I couldn't care less about.
From this point, we know that Magento, Wordpress and FluxBB all have their own ways of managing users: creating, managing, and tracking them.
What I want to do is find the best way to fit these three (and more) together so that the experience for the customer is as smooth and as functional as possible. After emailing the ever talented and helpful Alan Storm, he told me the best solution he was aware of working was to make a third-party user management system that they all point to, which handles customer authentication. I do believe his thoughts may be the best, but I wanted to put this out there on Stack Overflow (and I may post it on the Magento forums as well) to get the broad scope of Magento developers and smart guys who like challenges.
I have several thoughts; none may work, some may work half-assed, or one may just be workable. But first let me tell you what I have accomplished so far. I have done the necessary steps to integrate my overall design for the header and footer, so essentially Wordpress and FluxBB are wrapped and contained inside Magento's outer design layer. With that done, I have also made it so Magento checks the session to see whether the user is logged in, greeting them with "Hello Guest" or "Hello User". This is where I have hit a stopping point, because I am out of my depth and would like assistance, whether it is something we create together out of pure love of a challenge or someone offers to help for pay; either way I would like this accomplished. If and when I get the code figured out, whether by paying for assistance or by group effort, I would like to make it freely available for others to use the concept in their own projects.
Brain Fart #1:
Adjust the user tables for both Wordpress and FluxBB to conform more to the structure of Magento, as far as the password and username/email login portions go. The rest of the fields (post counts and so on) can stay as they are.
From there, I would like to figure out which class in Magento does the actual insert into the database when a customer is created through registration. When I find that code, I would like to extend it with the ability to copy the user credentials into the other two tables in the database, for Wordpress and FluxBB; a sketch of the idea follows. If it seems like a better idea, it can just be a couple of added fields in Wordpress and FluxBB. And yes, I do mean the actual hashed password that Magento creates; I want this to be secure as well.
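In TypeScript-flavored pseudocode (Magento would really do this in a PHP observer; the `db` handle and the FluxBB column names are assumptions, though wp_users/user_login/user_pass are real Wordpress columns), the idea is:

```typescript
// Sketch of Brain Fart #1: after Magento saves a new customer, mirror the
// credentials into the Wordpress and FluxBB user tables.
interface MagentoCustomer {
  email: string;
  passwordHash: string; // the already-hashed password, never plaintext
}

type Db = { query(sql: string, args: unknown[]): Promise<void> };

async function onCustomerRegistered(db: Db, c: MagentoCustomer): Promise<void> {
  await db.query(
    "INSERT INTO wp_users (user_login, user_pass) VALUES (?, ?)",
    [c.email, c.passwordHash],
  );
  await db.query(
    "INSERT INTO users (username, password) VALUES (?, ?)", // FluxBB table, columns assumed
    [c.email, c.passwordHash],
  );
  // Caveat: Wordpress and FluxBB must then be patched to verify passwords
  // using Magento's hashing scheme, or logins outside Magento will fail.
}
```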
From there, once we know that when a customer registers with Magento the data is copied over to the other two tables, we have at least made progress; whether this progress will actually work is still to be determined.
We then disable the login/logout and registration links in Wordpress and FluxBB in any way we can, because they will no longer be needed; we want the user to register, log in, and log out through one location, which is Magento.
Then comes the fun part, in my eyes: keeping the damn session going throughout the entire website as users order products, read Wordpress articles (possibly leaving comments, sending to friends, and so on), and post topics and replies in FluxBB.
To me this is where creating the fields, or adding the data from Magento's customer registration, comes into play: I can make each app check whether the user is already logged in to Magento, and from there we may be able to have it validate itself. This may be overkill, or it may just be how it needs to be done. But to me, if the credentials are located in all three databases, then they should be able to be validated by changing or adding code in Wordpress and FluxBB. And yes, I am aware that we will also have to do something about profile editing and password editing, if a customer desires to change their information.
But that is my first thought on this, whether it is the right direction or not. I would like to hear from the people here who have more experience and knowledge than I have with Magento, PHP, and everything else.
Brain Fart #2
This illogical idea seems like an outside stretch entirely to me, because of the complexity of Magento and how it is set up overall.
The idea is to modify Wordpress and FluxBB (and any other third-party software) to pretty much ignore their own methods of registration, login, logout, and editing, and instead look to Magento for credentials and for establishing new customers, essentially making them oversized modules of Magento.
I just know that Magento is set up to be modularized, and its complexity makes it seem like this would take a lot more coding and troubleshooting.
Brain Fart #3
Dump both Wordpress and FluxBB and look towards modules in the Magento Connect store that have pretty much all of the functionality I need; I can add whatever is missing rather than mess with trying to integrate third-party software.
I love Wordpress, and after the hours I have spent looking at all of the CMS/news-related modules available, I think replicating it with a module is a tough call. FluxBB I could take or leave; if someone had an already viable solution using phpBB, vBulletin, or SimpleMachines, I would go with that. I'd rather it be free open source software, not because I am a cheapskate but because I support open source as much as I can.
Brain Fart #4
Could this be done with a cookie? That would only be effective if the user allows cookies. Or could we somehow add onto the session to let things pass through? But Magento sets up its own sessions (or lets you set them up), so things may crash against each other; this may not be an idea at all, or it may be one after all.
I know I am not giving examples of things I have tried or files I have looked at, and I apologize; I can provide some related links, but nothing I have found so far specifically matches what I am trying to accomplish. And I have tried to merge things together, with some fun, disastrous results.
Link examples:
http://www.magentocommerce.com/wiki/doc/webservices-api/api/customer#customer.create
http://www.magentogarden.com/blog/how-are-passwords-encrypted-in-magento.html
http://www.nicksays.co.uk/magento_events_cheat_sheet/
http://www.magentocommerce.com/wiki/5_-_modules_and_development/customers_and_accounts/registration_fields
How to access Magento customer's session from outside Magento?
Any assistance with this would be nice. I am trying to work on several parts of the website at once, and this one is troublesome; I would say that everyone is going to find it hard, or has found it hard. Anyone like challenges? :)
--------- EDIT:
I have got Magento and Wordpress working perfectly together with James Kemp's module from CodeCanyon (Single Sign-On for Magento and Wordpress), and I am going to adapt it to work for FluxBB or anything else I do.
Just passing along the information I have found since posting this. I see the question was edited; I don't know what was edited, and I don't care.
I am managing/customizing a combo of Magento + Vanilla Forums + a custom app built on the Yii framework, with the users "shared" between the apps. Neither of the two links is good. As Alan already replied to you, the correct SSO setup is an external user database/manager. But not everyone is up to recoding three apps just to get a one-post-a-week forum and a one-article-a-month blog working with Magento, so we are left with fewer options. First of all, unless you want to rewrite a good portion of an already-written open source project that is being updated and maintained, and then maintain your changes against the periodic updates (which you do want), you have to duplicate the user data over three databases. The exception would be a project that can manage user data via a plugin or external module; AFAIK, neither of your choices can.
So, how to implement it? Assuming you choose Magento as the mother of all, you need it to export an API for authentication. That API could work in the browser using cookies and JavaScript, but that is rather tricky; alternatively, you can use Magento's frontend cookie to validate sessions, doing server-to-server API requests from the child apps. This is the preferred option as far as "classical" SSO goes. Technically, this is what should happen: when your users open the forum or blog, the respective app detects Magento's cookie and checks whether the session is valid and who the user is. If the user is found, their data is copied to the blog or forum tables. Then you start an authenticated session in the blog or forum app using the newly created user record.
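A minimal sketch of the child app's side of that handshake (TypeScript pseudocode; Magento itself is PHP, and the endpoint, field names, and helper functions here are all made up):

```typescript
// 1. Read Magento's frontend session cookie from the incoming request.
// 2. Validate it server-to-server against a Magento authentication API.
// 3. Mirror the user locally and start the child app's own session.
async function ssoLogin(magentoSessionId: string): Promise<string | null> {
  const res = await fetch("https://shop.example.com/api/session-check", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ session: magentoSessionId }),
  });
  if (!res.ok) return null; // not logged in to Magento, so stay a guest

  const user = await res.json(); // e.g. { email, firstName, lastName }
  upsertLocalUser(user);         // copy into the forum/blog user table
  return startLocalSession(user.email);
}

// Stand-ins for the forum/blog specifics:
function upsertLocalUser(user: { email: string }): void { /* INSERT ... ON DUPLICATE KEY */ }
function startLocalSession(email: string): string { return `session-for-${email}`; }
```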
So far so good, but there is still some work. You need to disable user profile management in the child apps, or modify it so that the data held in Magento is always the authoritative copy, and you need to invent something to synchronize Magento's representation of the user profile down to the children. This is best hooked onto Magento's events, so that every time a user changes their profile, the data is updated in the child apps. But there is another caveat: you probably want to keep some data app-specific. A display name on the forum is not necessarily the FirstName+LastName from Magento, and some users would like to keep the latter private.
The above is just what I can recall as interesting facts about keeping it running. There are certainly many other things I've left out, more or less specific. But hopefully my comment can help your brain farting.
We've tried to evaluate other options, but anything without duplicated data seems too expensive to implement or to maintain. Maybe later, with budget and time.

Google and Mirror Websites

Which is the best way to manage a website with one or more mirrors so that:
Google doesn't consider it "duplicated content"
The website is correctly indexed
No inconsistencies or duplicated information are present in Google Analytics
The Google webmaster guidelines in general are respected
NOTE: I'm not sure if I should ask this question here or in ServerFault. It looks a bit in the middle between programming and server administration. Let me know if you think ServerFault represent a more appropriate place for this and I'll move it.
Thanks.
The official and simple solution is the canonical link tag, which is the approach recommended by Google:
http://googlewebmastercentral.blogspot.com/2009/02/specify-your-canonical.html
If you need to host on multiple servers, using round-robin DNS (or other load-balancing techniques) is a good idea. This lets you use a single host name and would generally not create problems with crawling and indexing on the search engine side (since the crawlers don't see multiple URLs for the content).
If you need to host using separate host or domain names (for whatever reason) it's best to pick one preferred version and to make sure that only that one is indexed. A way to do that could be to use rel=canonical link elements on the alternate versions. In general, however, I'd recommend working to prevent multiple host/domain names from being visible to the user & search engines by keeping the technical hosting issues (mirrored hosts) out of sight (as mentioned in the first part).
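As a sketch of that, every mirror can serve identical content while the canonical link element always names the preferred host (TypeScript/Node standing in for whatever actually renders your pages; the host name is an assumption):

```typescript
import { createServer } from "http";

const CANONICAL_HOST = "www.example.com"; // your one preferred domain

// Every mirror serves the same page body, but the canonical link element
// always points at the single host you want indexed.
createServer((req, res) => {
  const canonicalUrl = `https://${CANONICAL_HOST}${req.url ?? "/"}`;
  res.writeHead(200, { "Content-Type": "text/html" });
  res.end(
    `<!doctype html><html><head>` +
      `<link rel="canonical" href="${canonicalUrl}">` +
      `<title>Mirrored page</title></head>` +
      `<body>same content on every mirror</body></html>`,
  );
}).listen(8080);
```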
If you need to use multiple ccTLDs to host on country-specific domains then I'd strongly recommend making sure that you actually have country-specific content on each site (and not just mirroring one version). More about this is at http://googlewebmastercentral.blogspot.com/2010/03/working-with-multi-regional-websites.html

Is there an advantage to building my own tinyURL-like function/service over using third party resources?

I've seen the many different ways I can build a function/service to generate short URLs which I can then control via my own domain.
This sounds like a great idea; however, as I look at the advantages, such as being able to control these URLs long term, adjusting the end location if needed, and having more tracking over where they wind up, I'm wondering if there is already a service out there that provides this level of control without my needing to build/host/support the solution myself.
The exact features desired are as follows:
Control of where the URL points to AFTER it's generated (the underlying URL needs to change due to legal/regulatory issues)
More robust tracking of where the URL is used, as opposed to just doing a Google search for the tiny URL
The advantage would be that you own the links and are not dependent on a service that may go out of business. Also, if the shortened URL keeps your domain, it has SEO advantages for PageRank. It would also reduce friction for your users when clicking the link: when you use another domain for shortening, you are dependent on the trust the user has in that organization as well.
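For a sense of scale, the self-hosted version can be tiny. Here is a minimal sketch (TypeScript/Node, with in-memory maps standing in for a real database) covering both requested features, editable destinations and click tracking:

```typescript
import { createServer } from "http";

// slug -> destination, editable after creation (feature 1)
const links = new Map<string, string>([["promo", "https://example.com/spring-sale"]]);
// slug -> click count (feature 2; log referrer and user agent for more detail)
const clicks = new Map<string, number>();

createServer((req, res) => {
  const slug = (req.url ?? "/").slice(1);
  const target = links.get(slug);
  if (target === undefined) {
    res.writeHead(404);
    res.end("unknown link");
    return;
  }
  clicks.set(slug, (clicks.get(slug) ?? 0) + 1);
  res.writeHead(302, { Location: target }); // temporary redirect, so edits take effect
  res.end();
}).listen(8080);

// Re-pointing a link for a legal/regulatory change is one call:
links.set("promo", "https://example.com/compliance-notice");
```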
PURLs might be what you want, though they are not necessarily optimised for shortness; typically they look like http://purl.org/net/kin
PURLs (Persistent Uniform Resource Locators) are Web addresses that act as permanent identifiers in the face of a dynamic and changing Web infrastructure. Instead of resolving directly to Web resources, PURLs provide a level of indirection that allows the underlying Web addresses of resources to change over time without negatively affecting systems that depend on them. This capability provides continuity of references to network resources that may migrate from machine to machine for business, social or technical reasons.
purl.org
You have complete control over where they point if you use the centralised server; you can also download their software to run your own.

Is this a blackhat SEO technique?

I have a site which has been developed completely in Flash. The site owners do not want to shift to a more text/HTML-based site, so I am planning to create an alternative HTML/text version that the Googlebot will get redirected to (by checking the user agent). My question: is this officially allowed by Google?
If not, then how come there are many subscription-based sites which display a different set of data to Google than to their users? Is that allowed?
Thank you very much.
I've dealt with this exact scenario for a large ecommerce site, and Google essentially ignored the site. Google considers it cloaking, addresses it directly, and says:
Cloaking refers to the practice of presenting different content or URLs to users and search engines. Serving up different results based on user agent may cause your site to be perceived as deceptive and removed from the Google index.
Instead, create an ADA-compliant version of the website so that users with screen readers and vision aids can use your site. As long as there is a link from your home page to your ADA-compliant pages, Google will index them.
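To make the quoted definition concrete, the question's proposal amounts to user-agent branching like this (a hypothetical sketch to illustrate what gets flagged, not a recommendation):

```typescript
import { createServer } from "http";

// The pattern Google calls cloaking: branch on the crawler's user agent.
createServer((req, res) => {
  const ua = req.headers["user-agent"] ?? "";
  res.setHeader("Content-Type", "text/html");
  if (/Googlebot/i.test(ua)) {
    res.end("<html><body>text/HTML version served only to the crawler</body></html>");
  } else {
    res.end("<html><body><object>Flash site served to everyone else</object></body></html>");
  }
}).listen(8080);
```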
The official advice seems to be: offer a visible link to a non-Flash version of the site. Fooling the Googlebot is a surefire way to get into trouble. And remember, Google results will link to the matching page! Do not create useless results.
Google already indexes Flash content, so my suggestion would be to check how your site is being indexed. Maybe you don't have to do anything.
I don't think showing an alternate version of the site is good from a Google perspective.
If you serve up your page with the exact same address, then you're probably fine. For example, if you show 'http://www.somesite.com/' but direct googlebot to 'http://www.somesite.com/alt.htm', then Google might direct search users to alt.htm. You don't want that, right?
This is called cloaking. I'm not sure what the effects of it are, but it is certainly not whitehat. I am pretty sure Google is working on a way to crawl Flash now, so it might not even be a concern.
I'm assuming you're not really doing a redirect but instead a PHP import or something similar so it shows up as the same page. If you're actually redirecting then it's just going to index the other page like normal.
Some sites offer a different level of content: they LIMIT the content rather than offering alternative or additional content. Generally this is done so unrelated things don't get indexed.