Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about programming within the scope defined in the help center.
Closed 9 years ago.
There is this link:
http://www.talentblend.com/projects/Female-Dancers-Needed-for-La-Bayadre-The-Royal-Ballet-Flanders/229
If you search for that URL in Google, it comes up first (no surprise there), which means Google has crawled and indexed that page.
But if you search for the title of that page, 'Female Dancers Needed for La Bayadère, The Royal Ballet Flanders', it does not come up anywhere. Instead you will see a different page from talentblend.com somewhere on the first page, one that is not relevant to the searched words (it just vaguely contains that text somewhere on the page).
This started when I updated the code on the site. Since then, all newly added content behaves like the example above. Old pages still rank high in Google (even the ones I have since deleted).
Google Webmaster Tools doesn't report any errors (crawl, security, robots). I also have Google Analytics running on the page.
Can somebody tell me why this is happening?
My guess is that there is very little actual content on this page. There's the one sentence and then a login form. Was there more content prior to your most recent update?
Closed 4 months ago.
We currently have a big user guide that exists either as a raw CHM file or hosted as a web page. We want to get to the point where Google indexes all the items inside the help guide, so that someone can just google a topic and it would come up.
Has anyone tried this type of mass SEO of their user guide/help guide? Any tips?
A long time ago I put some content online (web help created by FAR HTML), and a Google search found a match (see attached snapshot). So this is nothing new.
A table of contents, an index and, for example, a search button are recommended for web help too. Have a look at http://helpware.net/FAR/help/hh_start.htm and try the “Search” button, which is something you may already have online.
Uploading your web help content to a subdomain, e.g. www.knowledgebase.YourCompany.com, may be one part of a solution for you. Use Google’s Webmaster Tools to submit a sitemap for this subdomain’s content.
Google’s Custom Search is another idea. For further information please have a look at: https://developers.google.com/custom-search/
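As a rough sketch, the classic Custom Search embed looks like the snippet below. The `cx` value is a placeholder, not a real search engine ID; Google assigns the actual ID when you create the engine in the Custom Search console:

```html
<!-- Hypothetical embed of a Google Custom Search box on a help-guide page.
     Replace YOUR_SEARCH_ENGINE_ID with the cx value from your own engine. -->
<script async src="https://cse.google.com/cse.js?cx=YOUR_SEARCH_ENGINE_ID"></script>
<div class="gcse-search"></div>
```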
Closed 8 years ago.
I have two URLs with exactly the same content, because I'd like the subpage to be reachable easily (by a short name) as well as by a more descriptive URL:
domain.com/name_of_company
domain.com/name_of_company/address_of_company
And I have a third url:
domain.com/name_of_company/products_of_company
This page would be the same as domain.com/name_of_company but with more content at the top of the page.
Do you think this is a good approach, or should I forget it because I'll be punished by Google?
Thank you in advance!
It's perfectly fine to have identical pages, but then you should add a canonical link to the pages, which tells search engines which URL you consider to be the original. That way they can immediately see that the pages are intended to be identical.
This will not only avoid being punished for duplicate content, it will also allow for the search engines to count incoming links to both URLs as links to the original page.
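A canonical link is a `<link>` tag in the `<head>` of each duplicate page. Using the placeholder URLs from the question (a sketch; adjust the scheme and domain to your site), both pages would point at the short URL like this:

```html
<!-- Placed in the <head> of both domain.com/name_of_company
     and domain.com/name_of_company/address_of_company -->
<link rel="canonical" href="http://domain.com/name_of_company" />
```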
For the third page, you could consider if you need to repeat all the content from the other page. Having some content repeated from page to page is common (e.g. page footer), but too much repeated content is not valuable. Even if you are not directly punished for the repetition, there is a risk that some of the repeated content is simply ignored.
Closed 8 years ago.
I am the developer of Infermap.com. We are regularly monitoring and working on SEO and presence on Google SERPs. In the past 3-4 days we have seen a sudden steep drop in the number of Impressions on Google.
Can someone suggest possible reasons why this might happen, and ways I can prevent it?
Also, I have submitted around 11k URLs to be indexed, of which only 1.5k have been indexed. What are the possible reasons for this?
(note: this question should probably be moved to Webmasters Stack Exchange)
It looks like your 11k new URLs have not been picked up as quality content by Google. You might even be cloaking: when I click on a result, I get completely different text on your site.
Ways to avoid it:
avoid cloaking
avoid adding similar looking pages without unique content, e.g. make sure your pages are unique enough before publishing them
feed new content that looks alike gradually, e.g. start with 100 pages, wait a week or two, and add another 200. Once you are confident your pages are being picked up well, you can add everything at once.
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Questions asking for code must demonstrate a minimal understanding of the problem being solved. Include attempted solutions, why they didn't work, and the expected results. See also: Stack Overflow question checklist
Closed 9 years ago.
When I search for the website I've made for a customer (www.leadevreese.be) using "Lea De Vreese", I get an internal page instead of the home page.
How can I fix this?
It's very likely that Googlebot (Google's web crawler) found your internal webpage before the home page (because of a link found on another website, for example) and didn't follow the links on this internal page to index the other pages of the website. In general, this happens when a website is young and has few backlinks pointing to it.
To fix this, you can submit a sitemap.xml through the Google Webmaster Tools account managing the website, and put links pointing to the site around the web. Googlebot will then find the pages and index them.
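As a minimal sketch, a sitemap.xml for this site could look like the following. The `/contact.html` path is a hypothetical example, not a page known to exist; you would list every real page of the site:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- The home page, which should rank for the site's name -->
  <url>
    <loc>http://www.leadevreese.be/</loc>
    <priority>1.0</priority>
  </url>
  <!-- A hypothetical internal page; add one <url> entry per real page -->
  <url>
    <loc>http://www.leadevreese.be/contact.html</loc>
  </url>
</urlset>
```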
Closed. This question is off-topic. It is not currently accepting answers.
Want to improve this question? Update the question so it's on-topic for Stack Overflow.
Closed 11 years ago.
If you google a specific entity, occasionally the website listed first is given a little listing of content, sort of like a mini site-map that the user can click on to navigate the linked site, bypassing the home page.
My question is this: Can I control this mini-sitemap when I am PR1? If so, how do I do so? I'm trying to build a list of relevant links so users can more effectively hit my site, but I'm not sure where to go about doing this.
Help?
No, you cannot turn this on. Google decides on its own whether or not to generate them, and for which search terms. If you sign up for Google Webmaster Tools you can see their status (whether Google has generated some for your site) and read more about their background.
Google generates the sitelinks itself, but only for certain sites. As for how it determines which sites get it and which don't, I'm not really sure, but I suspect it has something to do with the pagerank of the site and the amount of content you have.
For a while, I had sitelinks for my site (PR4 with about 40,000 pages indexed in Google) but then a while later, they went away. In my case it generated sitelinks for the main tabs on the site, probably because they are in the header navigation and therefore on every single page near the top of the page.
The only control you have over them is you can use the Google webmaster tools to remove sitelinks that you don't like, but you can't change the existing ones or suggest new ones.
They are called Sitelinks - there's a FAQ entry about them here.
You can't control them (except to remove ones you don't like) - the FAQ says "At the moment, sitelinks are completely automated."