I have a site that is available through 2 domains. One domain is one that I got for free with a hosting plan and don't want to promote. However, when I perform search queries, pages from my site on that domain pop up.
What techniques are there to prevent the one domain from being indexed while the other is indexed normally? Remember, it's one hosting space serving the exact same files for both domains.
I have already submitted it in Google webmaster tools but that only works for Google obviously.
I would set up a sitewide 301 redirect from the domain you don't want to use to the other one. That way you will remove it from the index as well as move people to the correct domain. You can probably do it in the .htaccess file (Apache server). I'm not at my computer at the moment so can't easily give you the commands.
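A minimal sketch of such a sitewide redirect, assuming an Apache server with mod_rewrite enabled; free-domain.example and www.main-domain.example are placeholders for your actual hostnames:

```apache
# .htaccess in the shared document root.
# Redirect every request arriving on the free domain to the main one.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?free-domain\.example$ [NC]
RewriteRule ^(.*)$ http://www.main-domain.example/$1 [R=301,L]
```

Because the 301 is permanent, search engines should gradually drop the free domain from their indexes and transfer the pages to the main one.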
How do you create a simple Apache rewrite rule for mod_rewrite that will redirect any Microsoft Edge browser to one fixed page? This is needed on an instance of Apache that runs on an intranet and serves one specific application that does NOT cater to Edge, only IE. When a user tries to go to the application with Edge, this rule should redirect them to a very specific URL on that server that will enlighten them as to how to find IE.
I am facing two issues:
I know the rule needs to act on the User-Agent, but I don't know what makes the Edge browser unique from all the others. Any thoughts on where the best place to figure this out might be? I have looked at Microsoft's web site and they share what the strings are, but it doesn't spell out how exactly to tell them apart. I am thinking it might be best to look at some open source library that has already figured it out.
How do I write a rule for any URL that hits the site EXCEPT the 'enlightenment' page?
In general, user agents are really crappy to deal with. It is best not to reinvent this yourself and to use a heavily tested library instead. One of the best is ua-parser, a collection of regexes for matching user agents, with flavors in most languages.
If you want to have this in the Apache logic itself, you can extract the relevant pattern from their list of regexes. The Edge token in the user agent string looks like Edge/12.246, so something along the lines of
Edge/(\d+)(?:\.(\d+))?
will match it.
How do I write a rule for any URL that hits the site EXCEPT the 'enlightenment' page?
RewriteRule, combined with RewriteCond, is what you want to look at.
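Putting the two parts together, a sketch of the rule set, assuming Apache with mod_rewrite and a placeholder path /edge-notice.html for the 'enlightenment' page:

```apache
RewriteEngine On
# Match the Edge token in the User-Agent (e.g. "Edge/12.10240").
RewriteCond %{HTTP_USER_AGENT} Edge/\d+ [NC]
# Skip the notice page itself, otherwise the redirect would loop forever.
RewriteCond %{REQUEST_URI} !^/edge-notice\.html$
# Send every other request to the notice page.
RewriteRule ^ /edge-notice.html [R=302,L]
```

A 302 is used here so browsers don't cache the redirect permanently; note that newer Chromium-based Edge identifies itself with "Edg/" rather than "Edge/", so the pattern would need adjusting for those versions.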
I'm moving a website from old.com to new.com/old, but I have to make sure it works before deleting old.com.
It's a very large legacy website that probably has links, images, scripts and other things hardcoded to old.com. The problem is that these references to old.com aren't obvious, because the site loads perfectly as long as old.com is still up.
Is there a way to block all requests to old.com from my local machine only, or some other tool to make finding these references simpler?
The former is done by updating the hosts file on your local machine to point old.com somewhere else; this overrides what public DNS says. The latter very much depends on how your application is built, and there is not enough info here to say.
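A sketch of the hosts file change; the file lives at /etc/hosts on Linux/macOS and C:\Windows\System32\drivers\etc\hosts on Windows:

```
# Point old.com at a non-routable address so every hardcoded
# reference to it fails visibly (e.g. in the browser's network tab).
0.0.0.0   old.com
0.0.0.0   www.old.com
```

With this in place, load the site from new.com/old and watch the browser's developer tools: every request that errors out is a leftover hardcoded reference to old.com. Remove the lines again when you're done testing.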
I think someone is stealing my website's bandwidth. To prevent this I enabled hotlink protection, but it only covers image-related extensions. How can I protect my other files, with extensions like .php or .asp? When I added the .php and .asp extensions, I was unable to access my own website.
Another thing: in my cPanel, the IP address of my website sometimes appears as dedicated and sometimes as shared. Why is this happening?
I also found static.reverse.softlayer.com in my visitor list, but which web pages it visited are not displayed. Please help me.
You can only protect secondary material from hotlinking, for example images, JavaScript files and CSS files. Because those are fetched from a page on your site, the server can determine whether they are being used correctly or not.
If you try to keep primary material (e.g. the actual web pages) from being hotlinked, you are actually keeping them from being fetched at all. Any resource that you want to be available directly can't be locked down that way.
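A sketch of what hotlink protection for secondary material looks like in Apache, assuming mod_rewrite and example.com as a placeholder for your own domain:

```apache
RewriteEngine On
# Allow requests with no Referer at all (some browsers and proxies strip it).
RewriteCond %{HTTP_REFERER} !^$
# Allow requests referred from our own pages.
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
# Everything else asking for these asset types gets a 403 Forbidden.
RewriteRule \.(jpe?g|png|gif|css|js)$ - [F,NC]
```

Note that the rule only lists asset extensions. Adding .php or .asp here is exactly what broke your site: those are the primary pages visitors request directly, so blocking them without a Referer blocks the visitors themselves.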
I've been scouring the web for the right terms for this question, but after a few hours I decided to post it here.
The scenario is:
We have a website running on two servers, so the files are synchronized between them. The second server is for internal purposes. Let's name the first server www and the second ww2. ww2 is automatically updated once the files are updated on www.
Now Google is indexing ww2, which I want to stop; only www should be crawled and indexed. My questions are:
1. How can I have the pages already crawled on ww2 removed from Google's index?
2. How can I stop Google from indexing ww2?
Thanks guys!
You can simply use robots.txt to disallow crawling, and there is also a robots meta tag (noindex) that Google obeys for keeping pages out of the index.
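A sketch of the robots.txt, which must be served from ww2 only (if it ends up synchronized to www as well, it will block crawling of your main site too):

```
# robots.txt on ww2 only - tell all compliant crawlers to stay out.
User-agent: *
Disallow: /
```

For pages Google has already indexed, the noindex meta tag is the more direct tool, since robots.txt stops crawling but does not by itself remove already-indexed URLs: <meta name="robots" content="noindex">.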
For your first question, Google has a removal tool in their Webmaster Tools; here is more info about it: Remove page or site from Google's search results.
For the second question, you can either use a robots.txt file to block Google from crawling that server (here is more info: Blocking Google) or you can restrict access to that server altogether.
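Restricting access is the most robust option, since it doesn't rely on crawlers behaving. A sketch for Apache 2.4, assuming ww2 should only be reachable from your internal network, with 192.168.0.0/16 as a placeholder range:

```apache
# .htaccess (or vhost config) on ww2 only.
# Deny everyone except the internal network; Googlebot gets a 403.
<RequireAny>
    Require ip 192.168.0.0/16
</RequireAny>
```

Make sure this file is not part of the www-to-ww2 sync, or it will lock visitors out of the public site as well.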
I have bought an iPad website and it's been moved to my server.
Now I have tried to make an addon domain, but it does not work on my first hosting account.
On my second hosting account it works, but there is another iPad website on that server, so I don't think this is smart to do because they would share the same IP address.
So adding an addon domain does not work and the site is down now!
I have filed a service ticket, but I think it will be at least 8 hours before I get an answer.
Can anyone tell me how bad this is for my SERP position in Google?
The website has always been on the first page.
Will this 404 error hurt my site? Or is it better to place the site on the same server as the other iPad website?
It is not ideal to serve 404s/timeouts; however, your rankings should recover. You mentioned that the sites are different. Moving the site to a different server/IP shouldn't matter too much as long as you can minimize the downtime of the move (and a move is probably preferable to extended downtime, if possible). To be clear: do NOT serve site #2 as site #1, even in the short term, as you will run into duplicate content issues.
If you don't already have one, you might open a Google Webmaster Tools account. It will provide you with some diagnostics about your outage (e.g. how many fetches Google attempted, the response codes returned, etc.), and if something major happens, which is unlikely, you can request re-inclusion.
I believe it is very bad if the 404 is the result of an internal link.
I cannot tell you anything about which server you should host it on, though, as I have no idea whether that scenario is bad. Could you possibly host it on the one server, then, when the other is up, host it from there?