I have a site, www.megalim.co.il.
Recently, due to a version upgrade, I discovered that I had a robots.txt file that disallowed all search engines. My Google ranking dropped, and I couldn't find the site's main page anymore.
I changed the robots.txt file to one that allows everything, and now Webmaster Tools no longer
reports that the site is blocked from Google.
I did this about 5 days ago. I've also used Fetch as Google
and submitted www.megalim.co.il and all related pages for indexing,
but still, when I search site:www.megalim.co.il
I get a bunch of results from my site, but not the main page!
What else should I look for?
thanks!
Igal
You don't see your main page because of your old robots.txt. Five days is nothing for Google's bots to re-index your whole website.
Just wait a little and you will see your website fully indexed in Google's results.
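As a sanity check on the robots.txt fix, Python's standard `urllib.robotparser` can show the difference between the old blocking file and the new permissive one. This is just a sketch: the two rule sets below mirror the before/after situation described in the question, fed in as literal lines rather than fetched from the live site.

```python
from urllib import robotparser

# The state the site was in: a robots.txt disallowing everything.
blocked = robotparser.RobotFileParser()
blocked.parse(["User-agent: *", "Disallow: /"])

# The fixed state: an empty Disallow lets every crawler in.
allowed = robotparser.RobotFileParser()
allowed.parse(["User-agent: *", "Disallow:"])

print(blocked.can_fetch("Googlebot", "http://www.megalim.co.il/"))  # False
print(allowed.can_fetch("Googlebot", "http://www.megalim.co.il/"))  # True
```

To test the live file instead, call `set_url("http://www.megalim.co.il/robots.txt")` followed by `read()`, then run the same `can_fetch` check.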
Issue sorted out..
embarrassing...
apparently we (inexplicably) had a nofollow, noindex meta tag..
after a day we started reappearing in Google
thanks :)
Related
I have a website with content that is updated daily. I have two questions:
How does Google see this content? Do I need SEO in this case?
Does the 404 error page have an influence on ranking in the search engine? (I do not have a static page.)
So that Google can "know" the content is supposed to be updated daily, it may be useful (if you don't do it already) to implement a sitemap (and update it dynamically, if necessary). In this sitemap, you can specify an update period for each page.
This is not binding for Google, but it can help adjust how frequently the indexing robots visit.
If you do this, you must be "honest" with Google about update times. If Google realizes that the frequency declared in the sitemap does not correspond to the actual frequency, it can be bad for your rankings.
404 errors (and other HTTP errors) can indeed indirectly have an adverse effect on the ranking of the site. Of course, if the robot cannot access content at a given moment, that content cannot be indexed. But above all, if too many problems are encountered while web crawlers visit your site, Google will adjust the frequency of its visits downward.
You can get some personalized advice and monitor the indexing of your site using Google Webmaster Tools (and, to a lesser extent, Analytics or any other tool that can monitor web-crawler visits).
You can see the date and time when Google last visited your page, so you can tell whether Google has picked up your updated content or not. If you have a website with content updated daily, you can ping your website to many search engines and also submit your site to Google.
You can make a sitemap containing only the URLs whose content is updated daily and submit it to Google Webmaster Tools. You can record the date and time when each URL was last modified in the <lastmod> tag, give a hint about how frequently the page is likely to change in the <changefreq> tag, and set a high value in the <priority> tag for the pages that are modified daily.
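A minimal sitemap entry using those three tags could look like this (the URL and date are made up for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/daily-news.html</loc>
    <lastmod>2012-05-03</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.9</priority>
  </url>
</urlset>
```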
If you have pages returning a 404 error (file not found), put them in one directory and disallow it in your robots.txt file. Google will then not crawl those pages, so they will not be indexed, and they will not have any influence on your SERP ranking.
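For example, assuming the dead pages were gathered into a hypothetical /old-pages/ directory, the robots.txt rule would look like this:

```
User-agent: *
Disallow: /old-pages/
```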
I have submitted the sitemap to Google Webmaster Tools, but it is not getting indexed.
It has been almost a month since it was submitted. Webmaster Tools says "No data available." in almost every section.
As far as I can tell, there is nothing blocking Google from indexing: robots.txt, as you can see, is not blocking anything, and there are no meta tags blocking crawling.
Here is a screen shot of the webmaster tools for the sitemap:
http://www.2shared.com/photo/4HLbsOte/webmaster.html
I am not sure why it says "Processed May 3, 2012" when I submitted it earlier last month. Nothing has been indexed, but it looks like there are no issues with it either.
Any ideas?
Thanks for the help.
SOLVED Edit:
It looks like I had X-Robots-Tag: noindex, nofollow in my HTTP header.
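For anyone hitting the same problem, one way to spot this header is to inspect the HTTP response directly. The sketch below (Python standard library only) stands up a throwaway local server that reproduces the problem, then runs the same header check you would point at a live URL:

```python
import http.server
import threading
import urllib.request

# Minimal local server reproducing the problem: every response carries
# an "X-Robots-Tag: noindex, nofollow" header (hypothetical setup).
class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("X-Robots-Tag", "noindex, nofollow")
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<html><body>hello</body></html>")

    def log_message(self, *args):  # silence per-request logging
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The same check you would run against a live site's URL:
url = "http://127.0.0.1:%d/" % server.server_port
with urllib.request.urlopen(url) as resp:
    tag = resp.headers.get("X-Robots-Tag")

print(tag)  # -> noindex, nofollow : this header tells Google not to index
server.shutdown()
```

If `tag` comes back non-empty with `noindex` in it, that header (usually added in the server or framework configuration) is what keeps the pages out of the index.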
In the sitemap section of Webmaster Tools, does it say that there are any errors with the sitemap you submitted?
Also, how many pages are in that sitemap? If there are very few pages, you are likely to see very low indexing, because Google usually doesn't index all of your pages.
There may be some issues when you submit your website's sitemap to Google Webmaster Tools. Try adding the sitemap again: just delete the previous one and resubmit it. I hope it will work now.
So I have a website that I recently made changes to, and one of the changes was removing a page from the site. I deleted the page; it doesn't exist anymore.
However, when you search for my site, one of the results is the page that I deleted. People are clicking on it and getting an error.
How do I remove that page from the search results?
Here is the solution:
First, get your site on Google Webmaster Tools. Then go to Site configuration --> Crawler access --> Remove URL. Click on "New removal request", add the page you want to remove, and make sure you have also added that page to your site's robots.txt. Google will deindex the page within 24 hours.
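The robots.txt line mentioned in the steps above, for a hypothetical deleted page, would look like this:

```
User-agent: *
Disallow: /deleted-page.html
```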
Or you can simply wait for Google's robots to find out that it doesn't exist anymore.
A trick that used to work is to upload a sitemap to Google in which you add the URL of the deleted page, set it to top priority, and mark it as changing every day. That way Google's robots will prioritize that page and find out sooner that it's not there anymore.
There might be other ways but none that are known to me.
You can remove specific pages using the Webmaster Tools, I believe.
Yahoo's web tools offer a similar service, as I understand it.
This information was correct the last time I tried to do this, a little while ago.
You should go to https://www.google.com/webmasters/tools/removals and remove the pages that you want.
I have a blog built in WordPress, and my domain name is like example.com (I can't give you the original name, because sometimes the editors mark this kind of question as SPAM :( ; anyone who really wants to check the site directly can add it at the end of the question).
The site is http://example.com and the blog is at http://example.com/articles/,
and the sitemap.xml is available at http://example.com/sitemap.xml.
Google visits my site daily and all my new articles are crawled. If I search for "article title + example.com" I get the search result from Google, and it is my site, but the heading is not the actual one: it is taken from another article's data.
(I think I can give you a sample search query; please don't take this as spam.)
Installing Go Language in Ubuntu + tutorboy - but this lists with the proper title only after a result with the wrong one :( . I think now you understand what I am facing... please help me find out why this happens.
Edit:
How can I improve my SEO with WordPress?
When I search that query I don't get the "Installing Go..." page; I get the "PHP header types" article, which has the above text on the page (in the links at the right). So the titles showing in Google are correct.
Google has obviously not crawled that page yet, since it's quite new. Give it time, especially if your site is new and/or unpopular.
A couple of things I need to make clear:
Google crawled your site on 24 Nov 2009 12:01:01 GMT, so needless to say, Google does not actually visit your site (blog) every day.
When I queried the phrase you provided, the results were right. There are two URLs related to your site: one is the home page of your blog, the other is the page more closely related to your query. The query phrase is directly related to the page tutorboy.com/articles/php/use-php-functions-in-javascript.html, but your home page also contains some related keywords. That is why Google presents two pages on the results page.
Your second question is hard to answer, since it needs a complicated answer. Still, the following steps are crucial to your SEO.
Unique and good content. Content is king: it is the one element that stays constant while the other elements change as search engine technology evolves. Also keep your site's content fresh.
Back links. Part of the reason that Google does not revisit your site after you update it is that your site lacks enough back links.
Good structure. Properly use tags like <title>, the meta description, alt attributes, etc.
Use web analytics tools like Google Analytics. It's free, and you can see a lot of things that you missed.
Most importantly, grab some SEO books or spend a couple of minutes every day reading SEO articles.
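To illustrate the "good structure" point, a minimal sketch of those tags on a post page (the title, description text, and image filename here are made up):

```html
<head>
  <title>Installing Go Language in Ubuntu | tutorboy</title>
  <meta name="description" content="Step-by-step guide to installing the Go language on Ubuntu.">
</head>
<body>
  <!-- every image should describe itself to crawlers via alt text -->
  <img src="go-install.png" alt="Terminal output of a Go installation on Ubuntu">
</body>
```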
Good Luck,
Google said that I had malware on my site, so I deleted all the files and uploaded just a simple index.html with "hello world". I asked Google to check it again; they checked, but the answer was that I still had malware. The strange thing is that on the Webmaster Tools page, the icon for my site still shows an image of the old one, and in Google search my old pages keep appearing. It's like the site has not been updated in Google, but how can they recheck for malware without updating it first (indexing it again)? Thanks, guys.
To make it harder for people to work out how to game search result placings, Google uses a time lapse on their Webmaster Tools data. Some things (like the image of your home page) can take a month or more to update. Give it some time. As far as your pages still appearing in search results go, I get DNS errors in Webmaster Tools for pages I removed 6 months ago.