Robots.txt Error on Google Search Console

When I submit my website's robots.txt to Google Search Console, it shows an error like the screenshot below.

Just upload the robots.txt file to the root. Check it yourself by going to yourdomain.com/robots.txt.
If it works, then it ... works! It can take a while before Google updates the status in Search Console. Sometimes you need to lift your eyes from Search Console ;-)

It's not an error. It's telling you to upload the robots.txt file to the root directory of your website, and then ask Google to update the robots.txt file.
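For reference, a minimal robots.txt served from the document root might look like this; the rules and the Sitemap URL are placeholders to adapt to your own site:

    # Allow all crawlers everywhere (an empty Disallow means no restrictions)
    User-agent: *
    Disallow:

    # Optional: point crawlers at your sitemap (placeholder URL)
    Sitemap: https://yourdomain.com/sitemap.xml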

Related

Why does Live Test in Google Search Console return content that is not viewable in the browser?

According to Google Search Console, a Live Test on https://www.energycouncil.com.au/ returns text that does not exist on the viewable website.
Why is a completely different source returned when using Google Search Console with a Live Test?
I worked it out.
Code had been injected into a source file that redirects the site if the visitor is a website indexer. Ingeniously, this code did nothing for over a year, so I didn't spot the issue: no files appeared to have been recently modified.
To find the issue, download the live site and search it for "redirect".
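As a concrete sketch of that search, assuming the downloaded copy sits in a local folder named site-download (a hypothetical path):

    # Recursive (-r), case-insensitive (-i) search; -n prints line numbers
    grep -rin "redirect" ./site-download/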

Can we stop Googlebot crawling an old PDF URL?

My site has a button linking to a PDF. Say the current PDF URL on the button is http://www.abc.come/wp-content/uploads/2016/09/xyz.pdf, and Googlebot has crawled this URL. A month later, the administrator uploads a new PDF, say http://www.abc.come/wp-content/uploads/2016/09/xyz-latest.pdf, and updates the URL on the button.
The issue is that Googlebot is still crawling the old xyz.pdf URL and reporting a 404 in Webmaster Tools.
How can we make Googlebot stop crawling the old URL and crawl the new one?
Thanks.
Yup, you can.
In Webmaster Tools go to Google Index -> Remove URLs. Remove your URL from there and then from your app. Works for me.
I had the same issue; my solution was an entry in the .htaccess with a 410 ('Gone') status code. After some time Google stops crawling.
I also read that Google will stop crawling after a 404, but on my site it keeps crawling 404 pages.
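A minimal sketch of such an .htaccess entry, assuming Apache's mod_alias and the old PDF path from the question above:

    # Tell crawlers the old PDF is gone for good (HTTP 410)
    Redirect gone /wp-content/uploads/2016/09/xyz.pdf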

Advice regarding Google Webmaster 404 errors

I created a CMS for a website and integrated Google Analytics. The site changes its content every week (adding, editing, and removing pages and URLs), and I rewrite the sitemap every time one of these actions occurs.
The problem is that the web crawlers from Google detect a lot of 404 error pages.
What am I doing wrong?
Getting reports about 404s is perfectly normal, and there's generally no need to worry about them.
Check where Google finds those 404 URLs (you can see that in Search Console, formerly Webmaster Tools) and see if you can fix them. If you cannot, don't worry: if you have great content, sooner or later you'll get better links.
What you could do additionally is create custom 404 pages that link to content on your site that's similar to the missing page (if it's possible to determine that), or that's popular on the site.
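On Apache, for example, a custom 404 page can be wired up with a one-line .htaccess directive; a minimal sketch, assuming the custom page lives at /custom-404.html (a placeholder path):

    # Serve a custom page whenever a URL is missing
    ErrorDocument 404 /custom-404.html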
Also, if you feel the page is for content that won't be coming back to the site, you can remove the URL from their index using the Remove URL option.

Why can't Google Webmaster verify my website's subdomain?

Okay, so I have my main site, www.mydomain.com.au along with my mobile site which is within a subdomain, m.mydomain.com.au.
I am trying to submit my subdomain to Google Webmaster Tools, but it cannot seem to verify it. I have tried both the HTML file upload and the meta tag option.
I can verify that the Google HTML file is DEFINITELY uploaded since the link Google provides to check if it is there is working perfectly. Also, I have checked my source code and the meta tag is also definitely there.
My .htaccess file has a mobile redirect so I thought that may have been the problem, so I deleted that file to check if it would validate, but still no luck.
I have approximately 12 websites that are set up EXACTLY the same as this one, and all of their mobile sites/subdomains verified with Webmaster Tools perfectly.
This is my error message: We were unable to connect to your server.
Does anyone have any suggestions to why Google cannot verify my mobile site?
Google doesn't see subdomains... it sees the folder where the subdomain is located. Delete the site and add it again as mydomain.com.au/m.
Google will see it and add it to Webmaster Tools... I just did this for my subdomain. That's how I worked it out...
You can verify them from here https://www.google.com/webmasters/verification/home?hl=en
It is wrong that Google doesn't see subdomains. You can check it with site:yoururl; with your main domain you will get your subdomain as well.
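For example, with the domain from the question, running this query in Google search should list indexed pages from the subdomain alongside the main domain:

    site:mydomain.com.au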

Manually add a sitemap located in S3 into Google Webmaster Tools

I have an app running on Heroku.
I am using sitemap_generator to generate a sitemap and save it to S3.
I have added my sitemap location to robots.txt.
My questions are:
How can I know my sitemap is successfully found by search engines like Google?
How can I monitor my sitemap?
If my sitemap were located on my app server I could add it manually in Google Webmaster Tools for monitoring, but when I click on "Test/Add sitemap" in Google Webmaster Tools, it defaults to the same server.
Thanks for your help.
I got it to work.
Google has something called cross submission: http://googlewebmastercentral.blogspot.com/2007/10/dealing-with-sitemap-cross-submissions.html
You might want to visit this blog as well:
http://stanicblog.blogspot.sg/2012/02/how-to-add-your-sitemap-file-located-in.html
Thanks for your help, yacc.
Let me answer your first two questions, one at a time (I'm not sure what you mean by 'how can I monitor my sitemap', so I'll skip it):
Manually submit a sitemap to Google
If you can't use the Google Webmaster form to submit your sitemap, use an HTTP GET request to notify Google of your new sitemap.
If your sitemap is located at https://s3.amazonaws.com/sitemapbucket/sitemap.gz, first URL-encode your sitemap URL (you can use an online URL encoder/decoder for that), then use curl or wget to submit the encoded URL to Google:
    curl www.google.com/webmasters/tools/ping?sitemap=https%3A%2F%2Fs3.amazonaws.com%2Fsitemapbucket%2Fsitemap.gz
If your request is successful you'll get a 200 answer with a message like this:
    ... cut ...
    <body><h2>Sitemap Notification Received</h2>
    <br>
    Your Sitemap has been successfully added to our list of Sitemaps to crawl.
    ... cut ...
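The percent-encoded URL used in the curl command above can be produced with any URL encoder; as a sketch, assuming Python 3 is installed:

    # Percent-encode the sitemap URL so it is safe as a query-string value
    python3 -c "import urllib.parse; print(urllib.parse.quote('https://s3.amazonaws.com/sitemapbucket/sitemap.gz', safe=''))"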
Checking that Google knows about your new sitemap
Open Webmaster Tools, navigate to Site configuration -> Sitemaps, and there you should see the sitemaps that you've submitted. It might take some time for a new sitemap to show up there, so check frequently.