Difference between pages submitted in my sitemap and pages indexed by Google [closed] - seo

I want to optimize the ranking of my site in search engines, especially Google. I submitted a sitemap to Google; after about a week I can see that 170 pages have been submitted but only one page has been indexed. Is there something wrong with it?

It isn't certain that there is something wrong.
Google first reads your sitemap. It is reporting that it found 170 URLs in your sitemap and has queued them up to be considered.
A week later it has decided to add one page to its index. One of two things has happened: either Google has not yet gotten around to crawling (that is, reading) and considering all the pages in your sitemap, or Google has looked at your pages and decided not to add them to its index.
Look in Webmaster Tools under "Google Index", "Index Status", "Advanced", then select "Ever crawled". It should show you how many URLs Google has crawled from your site. If they haven't been crawled yet, you may just have to wait.
If they have been crawled but not added to the index, consider improving your content, or try the "Fetch as Googlebot" feature to make sure that what you are sending to Google is what you think it is. Sometimes pages are configured so they look good to users but are not visible to Googlebot, e.g. if all your content is loaded via Ajax or rendered in Flash.
Also make sure that you aren't disallowing Google from crawling your site in robots.txt, and that you are allowing the pages to be indexed (check that you do not have a "noindex" tag in your HTML).
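For example, a robots.txt like the following (a deliberately broken illustration, not taken from the question) would block Googlebot from the whole site:

```
# robots.txt at the site root: this blocks ALL crawlers from EVERY page;
# narrow or remove the Disallow rule if you want your pages crawled
User-agent: *
Disallow: /
```

And a page carrying this meta tag can still be crawled but will be kept out of the index:

```html
<!-- remove this tag from any page you want indexed -->
<meta name="robots" content="noindex">
```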

Related

How to help search engines to find all the pages on my website [closed]

I am currently programming a website which gives information about food products.
The way the website works is that there is a search engine -> users search for the product they want to know something about -> the website shows all the products they may want to see, and every product has its own page with all the information about it.
So my question is: how will search engines, like Google, be able to find all the product pages?
Search engines use many different ways to find new pages. Most commonly their web crawlers follow (external as well as internal) hyperlinks.
While a typical informational website links to all available pages in its site-wide navigation (so web crawlers can reach all pages by following internal links), other websites don’t necessarily link to all their pages (maybe because you can only reach them via forms, or because it doesn’t make sense for them to provide all links, etc.).
To allow discovery/crawling of new pages of these sites, too, they can provide a site map. This is essentially just a page linking to all existing pages, but often with structured metadata that can help search engines.
So just make sure that all your pages are linked somehow. Either via "natural" internal links on your site, or by providing a sitemap (ideally following the sitemaps.org protocol), or both.
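For reference, a minimal sitemap following the sitemaps.org protocol is just an XML file listing your URLs; the domain and paths below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per product page you want discovered -->
  <url>
    <loc>https://www.example.com/products/apple-juice</loc>
    <lastmod>2018-06-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/products/oat-flakes</loc>
  </url>
</urlset>
```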
For questions about SEO advice (which is off-topic here on SO), see our sister site https://webmasters.stackexchange.com/.
Please add a sitemap to your site so Google can crawl all pages easily and index them properly; an XML sitemap is the usual format.
Your website needs an SEO process.

Will my page get unindexed by Google if I delete its link from my sitemap and resubmit it? [closed]

I have a website with over 400,000 pages. I have created 10 sitemaps with 40,000 links each, generated dynamically with PHP, and submitted them in my Google Webmasters account. I add 50-60 pages to my website daily, and I don't want to create another sitemap after every 40,000 links. The solution I have in mind is to dynamically generate a sitemap that contains only the links to pages created within the last 30 days and re-submit it once a day (with a cron job). But here's the problem: the pages created before the last 30 days will not be in any of the sitemaps. So what I want to know is: if those links are already indexed by Google, and after resubmitting the sitemap they are no longer listed in it, will they get unindexed? And if yes, I would really like to know the solution for this.
I am kind of a beginner in SEO, so if it's a bad question I am really sorry, but I searched a lot before posting this and couldn't find any solution.
You might want to look at the Sitemap index standard to see whether it helps you break your very large site into more manageable chunks for Google and other search engines to traverse. Since you are generating the sitemaps dynamically with PHP anyway, keep in mind that the "last updated" date and the assigned weight of each URL still factor into the crawl frequency.
To answer your question, though, I am fairly sure the answer is "No". Google has no reason to delete a page from its index unless you explicitly tell it to (using the relevant section in Webmaster Tools, or if your server responds with a 301 or 404 HTTP status code).
But I really do think you could benefit from using the Sitemap index schema described above; a sketch follows.
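To illustrate (the filenames are placeholders), a sitemap index is a small XML file that points at your individual sitemaps, so you can keep adding numbered sitemaps, or a rolling "last 30 days" sitemap regenerated by your cron job, without touching the older ones:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- each entry points to one of the existing 40,000-URL sitemaps -->
  <sitemap>
    <loc>https://www.example.com/sitemap-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-2.xml</loc>
  </sitemap>
  <!-- a rolling sitemap of recently added pages can simply be listed alongside them -->
  <sitemap>
    <loc>https://www.example.com/sitemap-last-30-days.xml</loc>
    <lastmod>2015-03-01</lastmod>
  </sitemap>
</sitemapindex>
```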

Improve dictionary's internal linking structure [closed]

What I want to achieve:
I have an online dictionary which works quite well, but the crawling by search engines (especially Google) could be better.
So I would like to improve the internal linking structure on my website so that Google can easily find (almost) all pages of the dictionary.
What I know yet:
The number of internal links per page should not exceed 100. Search engines don't like pages containing masses of links; it looks spammy. And a website should not be designed for search engines but for the users, so usability must not suffer from this optimization; in the best case, usability would even increase.
My ideas for improving the internal linking structure so far:
on each dictionary entry page: link to 25 similar words which could be mixed up
create an index: list of all dictionary entries (75 per page)
...
Can you help me to optimize the linking structure?
Thank you very much in advance!
You could link to synonyms and antonyms, which would be both user-friendly and crawler-friendly. But I think the biggest thing you could do to improve crawling, particularly by Google, would be to add a sitemap:
Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a Sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs in the site) so that search engines can more intelligently crawl the site.
Google has lots of information on Sitemaps and how to generate them on their webmaster help pages.

How does Google generate the formatted list of links under the #1 result on a Google search? [closed]

If you google a specific entity, occasionally the website listed first is given a little listing of content, sort of like a mini site-map that the user can click on to navigate the linked site, bypassing the home page.
My question is this: can I control this mini-sitemap when my site is PR1? If so, how do I do so? I'm trying to build a list of relevant links so users can more effectively hit my site, but I'm not sure how to go about doing this.
Help?
No, you cannot turn this on. Google decides on its own whether or not to generate them, and for which search terms. If you sign up for Google Webmaster Tools you can see the status (whether Google has generated some for your site) and read more about their background.
Google generates the sitelinks itself, but only for certain sites. As for how it determines which sites get it and which don't, I'm not really sure, but I suspect it has something to do with the pagerank of the site and the amount of content you have.
For a while, I had sitelinks for my site (PR4 with about 40,000 pages indexed in Google) but then a while later, they went away. In my case it generated sitelinks for the main tabs on the site, probably because they are in the header navigation and therefore on every single page near the top of the page.
The only control you have over them is you can use the Google webmaster tools to remove sitelinks that you don't like, but you can't change the existing ones or suggest new ones.
They are called Sitelinks - there's a FAQ entry about them here.
You can't control them (except to remove ones you don't like) - the FAQ says "At the moment, sitelinks are completely automated."

How should Google crawl my blog? [closed]

I was wondering how (or if) I should guide Googlebot through my blog. Should I only allow it to visit pages with single entries, or should it also crawl the main page (which also has full entries)? My concern is that the main page changes when I add a new post, and Google keeps the old version for some time. I also find directing people to the main page annoying: you have to look through all the posts before you find the one you're interested in. So what is the proper way to solve this issue?
Why not submit a sitemap with the appropriate <changefreq> tags? If you set that to "always" for the homepage, the crawler will know that your homepage is very volatile (and you can have an accurate change frequency for other URLs too, of course). You can also give a lower priority to your homepage and a higher one to the pages you prefer to see higher in the index.
I do not recommend telling crawlers to avoid indexing your homepage completely, as that would throw away any link juice you might be getting from links to it from other sites -- tweaking change freq and priority seems preferable.
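As a sketch of what such entries might look like (the URLs and values here are placeholders, not a recommendation):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- homepage: changes with every new post, so mark it as volatile but lower priority -->
  <url>
    <loc>https://blog.example.com/</loc>
    <changefreq>always</changefreq>
    <priority>0.3</priority>
  </url>
  <!-- an individual entry: rarely changes, but it is what you want visitors to land on -->
  <url>
    <loc>https://blog.example.com/posts/my-first-entry</loc>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```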
Make a sitemap.xml and regenerate it periodically. Check out Google Webmaster Tools.
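A minimal sketch of the "regenerate it periodically" part, assuming a PHP blog; the get_post_urls() helper and the file paths are hypothetical stand-ins for however your posts are actually stored:

```php
<?php
// generate_sitemap.php: rebuild sitemap.xml from the current list of posts.
// Could be run from cron, e.g.:  0 3 * * * php /var/www/blog/generate_sitemap.php

function get_post_urls(): array {
    // Hypothetical helper: in a real blog these URLs would come from the database.
    return [
        'https://blog.example.com/',
        'https://blog.example.com/posts/my-first-entry',
        'https://blog.example.com/posts/another-entry',
    ];
}

$xml = new SimpleXMLElement(
    '<?xml version="1.0" encoding="UTF-8"?>'
    . '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"/>'
);

foreach (get_post_urls() as $postUrl) {
    $url = $xml->addChild('url');
    $url->addChild('loc', htmlspecialchars($postUrl));
    $url->addChild('lastmod', date('Y-m-d'));
}

// Write the sitemap where the web server can serve it (assumed document root).
$xml->asXML('/var/www/blog/public/sitemap.xml');
```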