Closed 13 years ago as off-topic for Stack Overflow; not accepting answers.
I was wondering how (or whether) I should guide Googlebot through my blog. Should I only allow it to visit pages with single entries, or should it also crawl the main page (which also shows full entries)? My concern is that the main page changes whenever I add a new post, and Google keeps the old version for some time. I also find directing people to the main page annoying: you have to look through all the posts before you find the one you're interested in. What is the proper way to solve this?
Why not submit a sitemap with the appropriate <changefreq> tags -- if you set that to "always" for the homepage, the crawler will know that your homepage is very volatile (and you can have accurate change freq for other URLs too, of course). You can also give a lower priority to your homepage and a higher one to the pages you prefer to see higher in the index.
I do not recommend telling crawlers to avoid indexing your homepage completely, as that would throw away any link juice you might be getting from links to it from other sites -- tweaking change freq and priority seems preferable.
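To illustrate the suggestion above, here is a minimal sitemap sketch with a volatile homepage and a higher-priority post page (the URLs are placeholders, not the asker's real site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/</loc>
    <changefreq>always</changefreq>
    <priority>0.5</priority>
  </url>
  <url>
    <loc>http://example.com/posts/my-first-post</loc>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Note that `changefreq` and `priority` are hints to crawlers, not commands; Google may crawl on its own schedule regardless.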
Make a sitemap.xml and regenerate it periodically. Check out Google Webmaster Tools.
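If you regenerate the sitemap periodically, a small script helps. A minimal sketch in Python, assuming you can produce a list of (URL, changefreq, priority) tuples for your own pages (the entries below are placeholders):

```python
import xml.etree.ElementTree as ET

def build_sitemap(entries):
    """Return a sitemap XML string for (loc, changefreq, priority) tuples."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, changefreq, priority in entries:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "changefreq").text = changefreq
        ET.SubElement(url, "priority").text = priority
    return ET.tostring(urlset, encoding="unicode")

# Placeholder entries; replace with your blog's real URLs.
xml = build_sitemap([
    ("http://example.com/", "always", "0.5"),
    ("http://example.com/posts/my-first-post", "monthly", "0.8"),
])
```

Run this from a cron job (or your blog's publish hook) and write the result to sitemap.xml.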
Closed 8 years ago: not about programming within the scope defined in the help center; not accepting answers.
I am the developer of Infermap.com. We regularly monitor and work on SEO and our presence in Google SERPs. In the past 3-4 days we have seen a sudden, steep drop in the number of impressions on Google.
Can someone suggest possible reasons why this might happen and ways I can prevent it?
Also, I have submitted around 11k URLs to be indexed, of which only 1.5k have been indexed. What are the possible reasons for that?
(note: this question should probably be moved to Webmasters Stack Exchange)
Looks like your 11k new URLs have not been judged quality content by Google. You might even be cloaking: when I click on a result, I get completely different text on your site.
Ways to avoid it:
avoid cloaking
avoid adding similar looking pages without unique content, e.g. make sure your pages are unique enough before publishing them
feed similar-looking new content in gradually, e.g. start with 100 pages, wait a week or two, then add another 200. Once you are confident your pages are being picked up well, you can add everything at once.
Closed 9 years ago: not about programming within the scope defined in the help center; not accepting answers.
I want to optimize the rank of my site in search engines, especially Google. I have submitted a sitemap to Google; after about a week I see that 170 pages have been submitted but just one page has been indexed. Is there something wrong?
It isn't certain that there is something wrong.
Google first reads your sitemap. It is reporting that it found 170 URLs in your sitemap and has queued them up to be considered.
A week later it has decided to add one page to its index. One of two things has happened: Google has not yet gotten around to crawling (that is, reading) and considering all the pages in your sitemap, or Google has looked at your pages and decided not to add them to its index.
Look in Webmaster Tools under "Google Index" > "Index Status" > "Advanced", then select "Ever crawled". It should show how many URLs Google has crawled from your site. If your pages haven't been crawled yet, you may just have to wait.
If they have been crawled but are not being added to the index, consider improving your content, or try the "Fetch as Googlebot" feature to make sure that what you are sending to Google is what you think. Sometimes things are configured so they look good to users but are not visible to Googlebot, e.g. all your content is loaded via Ajax or Flash.
Also make sure that you aren't disallowing Google from crawling your site in robots.txt, and that you are allowing the pages to be indexed (check that you do not have a "noindex" tag in your HTML).
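Concretely, these are the two patterns to look for (illustrative fragments, not taken from the asker's site):

```text
# A robots.txt that blocks all crawling (what you do NOT want):
User-agent: *
Disallow: /

# A meta tag that blocks indexing (check the <head> of your pages for this):
<meta name="robots" content="noindex">
```

If either appears, Google may crawl the sitemap but never index the pages it lists.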
Closed 10 years ago as off-topic for Stack Overflow; not accepting answers.
Thanks for reading my question. I am building a site that will list products from each manufacturer. I'm planning to structure the URLs with the following variations:
www.mysite.com/manufacturer_name/product_name/product_id
www.mysite.com/product_name/product_id
www.mysite.com/manufacturer_name
There are millions of products and I want all the major search engines to crawl them. What is the best way to go about that?
Would simply submitting the site to all the search engines be enough? I assume that if I submit the manufacturer page, which lists all the manufacturer names as links, the search engines will follow each link and then follow all the product links within each manufacturer page (products will be paginated), so a search engine can keep crawling the site for more products within each manufacturer until it runs out of pages.
Would that be sufficient to get every product listed in every search engine, or is there a newer, better way to do this? Maybe there are SEO techniques I'm not aware of. I'm hoping you can point me in the right direction.
I've previously used robots.txt to tell search engines which pages to crawl, and that seemed to work fine.
Thanks,
bad_at_coding
Submit an XML sitemap. The easiest way to do this is to link to it in your robots.txt file.
Sample robots.txt file:
Sitemap: http://example.com/sitemap_location.xml
See Submitting Sitemaps for more on this topic from Google
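With millions of products, note that a single sitemap file is capped at 50,000 URLs, so you would split the URLs across multiple sitemap files and reference them from a sitemap index. A sketch, with hypothetical file names:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://example.com/sitemaps/products-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://example.com/sitemaps/products-2.xml</loc>
  </sitemap>
</sitemapindex>
```

The `Sitemap:` line in robots.txt would then point at the index file rather than an individual sitemap.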
Closed 10 years ago as off-topic for Stack Overflow; not accepting answers.
A site I worked on recently used to be Joomla-based and had a ton of articles in it, and the entire business is now different.
I cleared out the site over FTP, started fresh, and finally finished everything. However, the site's rankings on Google are plagued by old pages which no longer exist; Google seems to think these pages still exist.
I was under the impression that after recrawling the site (at whatever time it saw fit), Google would recognise that those pages are now nonexistent and drop them.
It's driving me insane. There are hundreds of pages, so I can't file removal requests for them all. Won't they ever be removed automatically?
It will take a while, but Google will eventually stop looking for those pages. It keeps trying for a little while on the assumption that their absence is an error and they will return. If you're not going to file removal requests, you will simply have to wait it out.
Make sure all old pages return a 404 or 410 status. If Googlebot encounters a 404/410 status multiple times, it will remove the page from the index.
Also check whether any of those pages have backlinks. If Googlebot keeps encountering backlinks to an outdated page, it might still keep that page in the search index. For pages with valid backlinks, 301-redirect them to valid pages.
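If the site runs on Apache, both cases can be handled in .htaccess with mod_alias. A sketch with hypothetical paths:

```apache
# 301-redirect an old article that has a close replacement on the new site
Redirect 301 /old-joomla-article /new-page

# Return 410 Gone for removed content with no replacement,
# which tells Google the removal is permanent
Redirect gone /obsolete-section
```

The 410 is a slightly stronger signal than a 404, since it states the page is gone for good rather than possibly missing in error.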
Try the answer using Google WebMaster Tools you may find here.
Closed 11 years ago as off-topic for Stack Overflow; not accepting answers.
If you google a specific entity, occasionally the website listed first is given a little listing of content, sort of like a mini site-map that the user can click on to navigate the linked site, bypassing the home page.
My question is this: Can I control this mini-sitemap when I am PR1? If so, how do I do so? I'm trying to build a list of relevant links so users can more effectively hit my site, but I'm not sure where to go about doing this.
Help?
No, you cannot turn this on. Google decides on its own whether or not to generate them, and for which search terms. If you sign up for Google Webmaster Tools you can see the status (whether Google has generated them for your site) and read more about their background.
Google generates the sitelinks itself, but only for certain sites. As for how it determines which sites get them and which don't, I'm not really sure, but I suspect it has something to do with the PageRank of the site and the amount of content you have.
For a while, I had sitelinks for my site (PR4 with about 40,000 pages indexed in Google) but then a while later, they went away. In my case it generated sitelinks for the main tabs on the site, probably because they are in the header navigation and therefore on every single page near the top of the page.
The only control you have over them is you can use the Google webmaster tools to remove sitelinks that you don't like, but you can't change the existing ones or suggest new ones.
They are called Sitelinks - there's a FAQ entry about them here.
You can't control them (except to remove ones you don't like) - the FAQ says "At the moment, sitelinks are completely automated."