Sudden drop in Google impression on Google Webmasters [closed] - seo

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about programming within the scope defined in the help center.
Closed 8 years ago.
I am the developer of Infermap.com. We regularly monitor and work on our SEO and presence on Google SERPs. In the past 3-4 days we have seen a sudden, steep drop in the number of impressions on Google.
Can someone suggest possible reasons why this might happen, and ways I can prevent it?
I have also submitted around 11k URLs for indexing, of which only about 1.5k have been indexed. What are the possible reasons for that?

(note: this question should probably be moved to Webmasters Stack Exchange)
It looks like your 11k new URLs have not been judged quality content by Google. You might even be cloaking: when I click on a result, I get completely different text on your site.
Ways to avoid it:
avoid cloaking
avoid adding similar-looking pages without unique content, i.e. make sure your pages are unique enough before publishing them
feed new content that looks alike gradually, e.g. start with 100 pages, wait a week or two, then add another 200. Once you are confident your pages are being picked up well, you can add the rest at once.

Related

Seek advice from SEOs [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about programming within the scope defined in the help center.
Closed 9 years ago.
I have read that Google no longer uses meta tags to rank websites.
So what other ways are there to increase traffic or optimize my website for search engines, so that more customers are attracted to it? We run an e-commerce business that is confined to a fairly small area.
We launched our website only 5-6 months ago. Can I get any tips on optimizing it for search?
You could register your website with Google Webmaster Tools:
https://www.google.com/webmasters/tools/home?hl=en
Not only will you find a few SEO tips there, it will also warn you if the Google crawler had problems while visiting your website, which could be a reason your website is ranked poorly.
That is true about meta tags - they are no longer relevant.
There is no simple recipe for increasing PageRank and search engine position.
There are a huge number of guides on the web that can help. Professional companies offer positioning services for payment, and not every positioning practice is "fair" or legal.
But in general, to answer your question:
keep your web code clean, and if possible meet the W3C validator's requirements: http://validator.w3.org/
keep good-quality content
the thing that most increases your web position is having your page linked from other pages in a positive, good-quality context. Try to achieve that.

Search engine page creating [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about programming within the scope defined in the help center.
Closed 9 years ago.
I noticed that Google uses web page content when indexing pages for SEO purposes. So what I did was create web pages with a lot of keywords on them, then set those keywords to the page's background color so that users don't see them.
My question is: do they block this kind of page?
Search Engine Optimization (SEO) is something you really need an expert for these days. The days when some keywords and metadata alone were enough are long gone, so you need to keep up to date with current SEO techniques to move your site up the Google rankings. You can also check the Alexa ranking for your website.
Take a look at the SEO guidelines from Google here
Take a look at some pointers here and here, but you really need to invest some time and research into the best practices.
You should also make your site as accessible as possible; this will make it easier to spider. There are some tools here to look at, and there's a site here you can use.

SEO, Remove OLD pages from Google? [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Want to improve this question? Update the question so it's on-topic for Stack Overflow.
Closed 10 years ago.
A site I worked on recently used to be Joomla-based and had a ton of articles in it, and the business is now entirely different.
After clearing out the site over FTP, starting fresh, and finally finishing the rebuild, the site's rankings on Google are plagued by old pages which no longer exist. Google seems to think these pages still exist.
I was under the impression that after recrawling the site (at whatever time it saw fit), Google would recognise those pages are now non-existent and drop them.
It's driving me insane. There are hundreds of pages, so I can't put in requests to remove them all. Won't they ever be removed automatically?
It will take a while, but Google will eventually stop looking for those pages. It keeps trying for a little while under the assumption that the missing pages are an error and will return. If you're not going to file removal requests, you will simply have to wait it out.
Make sure all old pages return a 404 or 410 status. If Googlebot encounters a 404/410 status multiple times, it will remove those pages from the index.
I also suggest checking whether any of those pages have backlinks. If Googlebot keeps encountering backlinks to an outdated page, it may keep that page in its search index. If some pages have valid backlinks, 301-redirect them to valid pages.
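As a rough sketch, on an Apache server this can be done with mod_alias directives in the site config or .htaccess. The paths below are hypothetical placeholders, not the asker's actual URLs, and this assumes mod_alias is enabled:

```
# Old Joomla section with no replacement: tell crawlers it is gone for good (410)
Redirect gone /old-articles/

# Old page that still has backlinks: 301-redirect it to the closest new page
Redirect permanent /old-about-us.html /about
```

A 410 ("Gone") is a slightly stronger signal than a 404, since it states explicitly that the removal is intentional rather than an error.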
Also try the answer about using Google Webmaster Tools that you may find here.

How does google generate the formatted list of links under the #1 result on a google search? [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Want to improve this question? Update the question so it's on-topic for Stack Overflow.
Closed 11 years ago.
If you google a specific entity, occasionally the first-listed website is given a little listing of content, sort of like a mini sitemap that the user can click to navigate the linked site, bypassing the home page.
My question is this: Can I control this mini-sitemap when I am PR1? If so, how do I do so? I'm trying to build a list of relevant links so users can more effectively hit my site, but I'm not sure where to go about doing this.
Help?
No, you cannot turn this on. Google decides on its own whether to generate them and for which search terms. If you sign up for Google Webmaster Tools, you can see the status (whether Google has generated any for your site) and read more about their background.
Google generates the sitelinks itself, but only for certain sites. As for how it determines which sites get them and which don't, I'm not really sure, but I suspect it has something to do with the PageRank of the site and the amount of content you have.
For a while, I had sitelinks for my site (PR4 with about 40,000 pages indexed in Google) but then a while later, they went away. In my case it generated sitelinks for the main tabs on the site, probably because they are in the header navigation and therefore on every single page near the top of the page.
The only control you have over them is you can use the Google webmaster tools to remove sitelinks that you don't like, but you can't change the existing ones or suggest new ones.
They are called Sitelinks - there's a FAQ entry about them here.
You can't control them (except to remove ones you don't like) - the FAQ says "At the moment, sitelinks are completely automated."

How should google crawl my blog? [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Want to improve this question? Update the question so it's on-topic for Stack Overflow.
Closed 13 years ago.
I was wondering how (or if) I should guide Googlebot through my blog. Should I only allow it to visit pages with single entries, or should it also crawl the main page (which also shows full entries)? My concern is that the main page changes whenever I add a new post, and Google keeps the old version for some time. I also find directing people to the main page annoying - you have to look through all the posts before you find the one you're interested in. So what is the proper way to solve this?
Why not submit a sitemap with appropriate <changefreq> tags? If you set that to "always" for the homepage, the crawler will know that your homepage is very volatile (and you can give accurate change frequencies for other URLs too, of course). You can also give a lower priority to your homepage and a higher one to the pages you prefer to see higher in the index.
I do not recommend telling crawlers to avoid indexing your homepage completely, as that would throw away any link juice you might be getting from links to it from other sites -- tweaking change freq and priority seems preferable.
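Following the sitemap protocol, a minimal sitemap.xml along those lines might look like this (the URLs are placeholders, and the changefreq/priority values are just illustrative choices):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Homepage: changes with every new post, but given a lower priority -->
  <url>
    <loc>https://example.com/</loc>
    <changefreq>always</changefreq>
    <priority>0.5</priority>
  </url>
  <!-- Single-entry page: changes rarely, preferred as a landing page -->
  <url>
    <loc>https://example.com/posts/my-first-entry</loc>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Note that changefreq and priority are hints to crawlers, not commands; they influence crawl scheduling rather than guarantee it.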
Make a sitemap.xml and regenerate it periodically. Check out Google Webmaster Tools.