How can I check how the Google robot sees my local website?
Is there an official Google tool? I've heard of one that checks online websites, but what about a local site that isn't available on the network?
I don't think Google Webmaster Tools allows crawling or fetching a local site.
Also, besides crawling the local site, you'll have to make sure its URLs don't get crawled and indexed by Google, as duplicate content could devalue your live site in the future.
Thanks
You can use Fetch as Googlebot in Webmaster Tools to see how Google views your site.
Could you help me? Recently my websites on the server were updated to HTTPS: vectorization-eu and pixsector-com. The problem is that Google's bots are, for some strange reason, indexing pages from pixsector under the vectorization-eu domain. vectorization-eu doesn't have an .htaccess file. Could this be the issue?
Request removal of the URLs that do not exist in Google Search Console.
Upload a sitemap to Google Search Console for each domain.
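If both domains resolve to the same server, a canonical-host rewrite can also stop pages from being served (and indexed) under the wrong domain. A minimal sketch, assuming mod_rewrite is enabled and that pixsector-com is the obfuscated form of pixsector.com (an assumption on my part), placed in that site's document root:

```apache
# Hypothetical .htaccess for pixsector.com's document root.
# Any request arriving under a different Host header (e.g. the other
# domain pointing at the same server) is 301-redirected to the
# canonical host, so only one domain can serve these pages.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^(www\.)?pixsector\.com$ [NC]
RewriteRule ^ https://pixsector.com%{REQUEST_URI} [R=301,L]
```

The 301 tells Googlebot the wrongly indexed URLs have permanently moved, which speeds up cleanup of the duplicate entries.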
I tried using Google Webmaster tools to re-crawl the site and it hasn't helped.
Does anyone know why the link on Google might say 'halalgems.com', but redirect to another site?
The description is also incorrect.
Unfortunately, it appears that your website has been hacked; this is a problem with your website, not with Google.
Looking at the response your website returns when someone visits it from that Google result, it is your website that does the redirecting.
I cannot determine the source of this issue, as that would likely require access to your server. Good luck with tracking it down!
I'm building a website for a company. I'm building it offline so Google won't index it; otherwise Google might see the content now and, at the official launch, conclude that we copied the text/code from another website.
Sadly, I've encountered a problem. I need to implement the Facebook social plugin, but this plugin only works when the site is online. As I said, putting it online could be dangerous for future Google SEO.
Is there another option that lets me test the Facebook plugin while the site is not yet live, or is it okay to just put it online already at (for example) www.example.com and later put the released product at (for example) www.released.com?
Any thoughts on this problem?
Why don't you place the website online in a folder that is blocked for Googlebot via robots.txt?
That way you can test all the online elements work and not have to worry about users and search engines finding it and listing it.
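For example, if the test copy lives under a /staging/ folder (the folder name here is hypothetical), the robots.txt at the domain root could be:

```
# robots.txt at the domain root -- /staging/ is a placeholder path
User-agent: *
Disallow: /staging/
```

Keep in mind that robots.txt only blocks crawling; a URL that is linked from elsewhere can still show up in the index, so IP or password protection is more watertight.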
An alternative could be to use the .htaccess file to limit access only to your IP address - that way, you'd be the only one to see the site live.
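A minimal sketch of that .htaccess, assuming Apache 2.4 and using a placeholder IP address:

```apache
# Allow only this client IP (203.0.113.7 is a placeholder --
# substitute your own); everyone else receives a 403.
Require ip 203.0.113.7

# On Apache 2.2 the equivalent would be:
#   Order deny,allow
#   Deny from all
#   Allow from 203.0.113.7
```

This blocks Googlebot and everyone else while still letting you test online-only features like the Facebook plugin.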
On sitemaps.org it says that it is possible to submit the sitemap.xml via an HTTP request to the search engine. However, I'm unable to find documentation on how to do this for Google; I can only find documentation on submitting it via Google Webmaster Tools.
Any ideas? Is this even possible?
You can ping the sitemap URL:
http://www.google.com/webmasters/sitemaps/ping?sitemap=URLOFSITEMAP.xml
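Note that the sitemap URL has to be percent-encoded when it's passed as a query parameter. A small sketch, using a placeholder sitemap URL, that builds the ping request:

```python
from urllib.parse import urlencode

# Placeholder sitemap URL -- substitute your own.
sitemap_url = "http://www.example.com/sitemap.xml"

# urlencode percent-encodes the sitemap URL for the query string.
ping_url = (
    "http://www.google.com/webmasters/sitemaps/ping?"
    + urlencode({"sitemap": sitemap_url})
)
print(ping_url)

# To actually submit, issue a plain GET request, e.g.:
#   from urllib.request import urlopen
#   urlopen(ping_url)  # an HTTP 200 response means the ping was received
```

Any HTTP client (curl, wget, a cron job after publishing) can issue the same GET request.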
Pinging the Google sitemap after every new article submission?
You don't need to submit and resubmit sitemap.xml to search engines. You can declare it in your robots.txt file, and web crawlers will find it and crawl it regularly.
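The robots.txt declaration looks like this; the sitemap URL is a placeholder and must be the full absolute URL:

```
# robots.txt at the site root -- the URL below is a placeholder
Sitemap: http://www.example.com/sitemap.xml
```

The Sitemap directive is independent of any User-agent group, so it can sit anywhere in the file, and crawlers that support sitemaps will pick it up on their next visit.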
My question is "How to integrate apache and google site via proxy"?
I found this tutorial, but it didn't work as I expected: it redirects to the Google site instead of keeping my domain in the address bar and changing only the content.
In my case, I want whenever people access to http://mydomain.com, they will see the content from https://sites.google.com/site/mydomain/
Thanks!
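For reference, a reverse-proxy configuration (as opposed to the redirect the tutorial apparently produced) would look roughly like the sketch below, assuming mod_proxy, mod_proxy_http, and mod_ssl are enabled. Be aware that ProxyPassReverse only rewrites HTTP response headers, not the absolute sites.google.com links inside the HTML, so the result is often broken in practice:

```apache
# Hypothetical virtual host -- domain and Google Sites path are
# placeholders taken from the question.
<VirtualHost *:80>
    ServerName mydomain.com
    # Required because the backend is HTTPS.
    SSLProxyEngine on
    # Proxy (don't redirect), so the address bar keeps mydomain.com.
    ProxyPass        / https://sites.google.com/site/mydomain/
    ProxyPassReverse / https://sites.google.com/site/mydomain/
</VirtualHost>
```

Even with this in place, in-page links and assets that Google Sites emits as absolute URLs will still point visitors at sites.google.com.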
I think it's not possible, because you can't access the database behind your Google Sites, and the URL of your Google Site can't be changed.