My question is: I had changed my website's keywords and submitted the site to Google, i.e. verified it with the Google meta tag, and Google indexed those keywords. I have now changed my keywords again, but Google is still indexing the older keywords, not the recent ones.
Will I need to re-verify with the Google meta tag, or can you suggest another solution that will get my new keywords indexed?
The Google crawler doesn't run every day. Sometimes it can take up to a week for new keywords to be indexed. It's just a matter of waiting; they'll show up.
If you have pages whose content changes frequently, you can indicate this when you provide Google with your sitemap. A simple tool that will auto-generate one for you is http://www.xml-sitemaps.com/
Just alter the <changefreq> value for those pages.
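For example (a minimal sketch; the URLs and values below are just placeholders, not anything from your site), a sitemap entry with a change frequency hint looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- A page whose content changes often: hint that crawlers should revisit daily -->
  <url>
    <loc>http://www.example.com/news.html</loc>
    <changefreq>daily</changefreq>
  </url>
  <!-- A mostly static page -->
  <url>
    <loc>http://www.example.com/about.html</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>

Note that <changefreq> is only a hint to the crawlers, not a command.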
The keywords meta tag is not important. Search engines don't pay any attention to it anymore. I wouldn't waste my time worrying about it.
Here's a quote from SEOmoz's Beginner's Guide to SEO (http://guides.seomoz.org/chapter-4-basics-of-search-engine-friendly-design-and-development): "The meta keywords tag had value at one time, but is no longer valuable or important to search engine optimization."
Instead, I would worry about making sure the content on your site is well-written, relevant to your topic, and keyword-focused (but not keyword-crammed; it should sound natural). Make sure your title tags are appropriate to what's on each page. That will get you farther than meta keywords.
Additionally, it takes time for search engines to index changes. Be patient on any changes you make to your site.
I'm working on improving the site for SEO purposes and hit an interesting issue. The site, among other things, includes a large directory of individual items (it doesn't really matter what these are). Each item has its own details page, which is accessed via
http://www.mysite.com/item.php?id=item_id
or
http://www.mysite.com/item.php/id/title
The directory is large, with about 100,000 items in it. Naturally, only a few items are listed on any given page. For example, the main site homepage links to about 5 or 6 items, some other page links to about a dozen different items, and so on.
When real users visit the site, they can use the search form to find items by keyword or location, and a list matching their search criteria is produced. However, when, for example, the Google crawler visits the site, it won't even attempt to put text into the keyword search field and submit the form. So as far as the bot is concerned, after indexing the entire site it has covered only a few dozen items at best. Naturally, I want it to index each individual item separately. What are my options here?
One thing I considered is to check the user agent and IP ranges and, if the requester is a bot (as best I can tell), add a div to the end of the most relevant page with links to each individual item. Yes, this would be a huge page to load, and I'm not sure how the Google bot would react to it.
Any other things I can do? What are best practices here?
Thanks in advance.
One thing I considered is to check the user agent and IP ranges and, if the requester is a bot (as best I can tell), add a div to the end of the most relevant page with links to each individual item. Yes, this would be a huge page to load, and I'm not sure how the Google bot would react to it.
That would be a very bad thing to do. Serving up different content to the search engines specifically for their benefit is called cloaking and is a great way to get your site banned. Don't even consider it.
Whenever a webmaster is concerned about getting their pages indexed, having an XML sitemap is an easy way to ensure the search engines are aware of the site's content. Sitemaps are very easy to create and update, too, if your site is database driven. The XML file does not have to be static, so you can dynamically produce it whenever the search engines request it (Google, Yahoo, and Bing all support XML sitemaps). You can find out more about XML sitemaps at sitemaps.org.
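As a rough illustration only (the database, table, and column names here are invented for the example, and your connection details will differ), a PHP script that emits the sitemap on the fly from an items table might look like this:

<?php
// sitemap.php - emits an XML sitemap built from the database on each request.
// The PDO connection details and the "items" table/columns are hypothetical.
header('Content-Type: application/xml; charset=utf-8');

$pdo = new PDO('mysql:host=localhost;dbname=mysite', 'user', 'password');

echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";

foreach ($pdo->query('SELECT id, title FROM items') as $row) {
    // Build the item URL in the /item.php/id/title format and escape it for XML.
    $url = 'http://www.mysite.com/item.php/' . (int) $row['id'] . '/' . rawurlencode($row['title']);
    echo '  <url><loc>' . htmlspecialchars($url) . "</loc></url>\n";
}

echo "</urlset>\n";

Keep in mind that a single sitemap file is limited to 50,000 URLs, so with roughly 100,000 items you would split it into several sitemaps listed in a sitemap index file.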
If you want to make your content available to search engines and want to benefit from semantic markup (i.e. HTML), you should also make sure all of your content can be reached through hyperlinks (in other words, not through form submissions or JavaScript). The reason for this is twofold:
The anchor text in the links to your items will contain the keywords you want to rank well for. This is one of the more heavily weighted ranking factors.
Links count as "votes", especially to Google. Links from external websites, especially related websites, are what you'll hear people recommend the most and for good reason. They're valuable to have. But internal links carry weight, too, and can be a great way to prop up your internal item pages.
(Bonus) Google has PageRank, which used to be a huge part of their ranking algorithm but plays only a small part now. It still has value, though, and links "pass" PageRank to each page they link to, increasing the PageRank of that page. When you have as many pages as you do, that's a lot of potential PageRank to pass around. If you built your site well, you could probably get your home page to a PageRank of 6 just from internal linking alone.
Having an HTML sitemap that somehow links to all of your products is a great way to ensure that search engines, and users, can easily find all of your products. It is also recommended that you structure your site so more important pages are closer to the root of your website (home page), branching out to sub pages (categories) and then to specific items. This gives search engines an idea of which pages are important and helps them organize them (which helps them rank them). It also helps them follow those links from top to bottom and find all of your content.
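As a sketch (the category and item names are made up), a simple HTML sitemap page that mirrors that home page -> category -> item hierarchy could be as plain as a nested list of links:

<!-- sitemap.html: a human-readable site map. Categories link to category pages,
     which in turn link to every item, so crawlers can reach all of the items. -->
<h1>Site map</h1>
<ul>
  <li><a href="/category/widgets">Widgets</a>
    <ul>
      <li><a href="/item.php/101/blue-widget">Blue widget</a></li>
      <li><a href="/item.php/102/red-widget">Red widget</a></li>
    </ul>
  </li>
  <li><a href="/category/gadgets">Gadgets</a></li>
</ul>

With around 100,000 items you would not put every link on one page; paginate the listing (for example, a page per category or per hundred items) so no single page carries an unreasonable number of links.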
Each item has its own details page, which is accessed via
http://www.mysite.com/item.php?id=item_id
or
http://www.mysite.com/item.php/id/title
This is also bad for SEO. When you can pull up the same page using two different URLs you have duplicate content on your website. Google is on a crusade to increase the quality of their index and they consider duplicate content to be low quality. Their infamous Panda Algorithm is partially out to find and penalize sites with low quality content. Considering how many products you have it is only a matter of time before you are penalized for this. Fortunately the solution is easy. You just need to specify a canonical URL for your product pages. I recommend the second format as it is more search engine friendly.
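A minimal example of that (the item id and title here are placeholders): whichever URL the page is requested under, its <head> contains the same canonical tag pointing at the preferred URL format.

<head>
  <!-- Tells search engines which URL is the authoritative version of this page,
       so item.php?id=123 and item.php/123/some-item-title are not treated as duplicates -->
  <link rel="canonical" href="http://www.mysite.com/item.php/123/some-item-title" />
</head>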
Read my answer to an SEO question on the Pro Webmasters site for even more information on SEO.
I would suggest, for starters, having an XML sitemap. Generate a list of all your pages and submit it to Google via Webmaster Tools. It wouldn't hurt to have a "friendly" HTML sitemap as well, linked from the front page, which lists all these pages, preferably organized by category.
If you're concerned with SEO, then having links to your pages is hugely important. Google could see a page and think "wow, awesome!" and give you lots of authority; this authority (some like to call it "link juice") is then passed down to the pages that are linked from it. You ought to build a hierarchy of pages, with the more important ones closer to the top, and make it wide rather than deep.
Also, showing different content to the Google crawler than to "normal" visitors can be harmful in some cases, if Google thinks you're trying to con it.
Sorry, a little Google bias here, but the other engines are similar.
My question is: I had changed my website's keywords and submitted the site to Google, i.e. verified it with the Google meta tag, and Google indexed those keywords. I have now changed my keywords again, but Google is still indexing the older keywords, not the recent ones. Will I need to re-verify with the Google meta tag, or can you suggest another solution that will get my new keywords indexed?
If you are talking about Webmaster Tools (the Google verification meta tag): no, you don't need to resubmit; Google will find the change on its own.
If you are talking about the robots meta tag: no, you don't need to resubmit; Google will find it on its own.
If you are talking about the "keywords" meta tag: don't use it (http://www.mattcutts.com/blog/keywords-meta-tag-in-web-search/).
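To make the distinction concrete, these are the three tags in question (the verification token and keyword values are placeholders):

<!-- 1. Webmaster Tools verification tag: only proves you own the site, nothing more -->
<meta name="google-site-verification" content="YOUR_VERIFICATION_TOKEN" />

<!-- 2. Robots meta tag: controls whether this page is indexed and its links followed -->
<meta name="robots" content="index, follow" />

<!-- 3. Keywords meta tag: ignored by Google for ranking web search results -->
<meta name="keywords" content="example, keywords, here" />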
It will take some time for your new keywords to be indexed. You can work on on-page factors such as fresh content, but be careful about keyword density.
I have made changes to my website's keywords, description, and title, but Google is not indexing the new keyword. Instead, I have found that Google is indexing the older one.
How can I get Google to index my site using the new keywords that I have added?
The time between crawls of a page varies a lot from page to page. A post to SO will be crawled and indexed by Google in seconds. Your personal page that hasn't changed its content in 20 years might not even be crawled as often as once a year.
Submitting a sitemap through Webmaster Tools will likely trigger a re-crawl of your website to validate the sitemap. You could use this to speed up the re-crawling.
However, as @Charles noted, the keywords meta tag is mostly ignored by Google, so it sounds like you're wasting your time.
Is it possible to fine-tune directives to Google to such an extent that it will ignore part of a page, yet still index the rest?
There are a couple of different issues we've come across which would be helped by this, such as:
RSS feed/news ticker-type text on a page displaying content from an external source
users entering contact details (phone numbers etc.) who want them visible on the site but would rather they not be Google-able
I'm aware that both of the above can be addressed via other techniques (such as writing the content with JavaScript), but am wondering if anyone knows if there's a cleaner option already available from Google?
I've been doing some digging on this and came across mentions of googleon and googleoff tags, but these seem to be exclusive to Google Search Appliances.
Does anyone know if there's a similar set of tags to which Googlebot will adhere?
Edit: Just to clarify, I don't want to go down the dangerous route of cloaking/serving up different content to Google, which is why I'm looking to see if there's a "legit" way of achieving what I'd like to do here.
What you're asking for can't really be done; Google either takes the entire page or none of it.
You could do some sneaky tricks, though, like putting the part of the page you don't want indexed into an iframe and using robots.txt to ask Google not to crawl the page loaded in that iframe.
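A minimal sketch of that trick, assuming the framed content lives at a made-up path /private/ticker.html: the main page embeds it in an iframe, and robots.txt blocks crawling of that path. (Blocking crawling isn't an absolute guarantee the URL never shows up in the index.)

<!-- On the main page: the content you don't want crawled is loaded from a separate URL -->
<iframe src="/private/ticker.html" width="400" height="100"></iframe>

And in robots.txt:

User-agent: *
Disallow: /private/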
In short, no, unless you use cloaking, which is discouraged by Google.
Please check out the official documentation here:
http://code.google.com/apis/searchappliance/documentation/46/admin_crawl/Preparing.html
Go to section "Excluding Unwanted Text from the Index"
<!--googleoff: index-->
here will be skipped
<!--googleon: index-->
I found a useful resource for marking certain (e.g. duplicate) content so that search engines do not index it:
<p>This is normal (X)HTML content that will be indexed by Google.</p>
<!--googleoff: index-->
<p>This (X)HTML content will NOT be indexed by Google.</p>
<!--googleon: index-->
On your server, detect the search bot by IP address using PHP or ASP. Then serve requests from those IP addresses the version of the page you wish to be indexed. In that search-engine-friendly version of your page, use the canonical link tag to tell the search engine which page version should be treated as the original.
This way the page containing the content you do want indexed is indexed under its own address, and only the content you wish to be indexed gets indexed. This method will not get you blocked by the search engines and is completely safe.
Yes, you can definitely stop Google from indexing some parts of your website by creating a custom robots.txt and listing the portions you don't want crawled, such as wp-admin, or a particular post or page. Before creating it, check your site's existing robots.txt, for example at www.yoursite.com/robots.txt.
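A minimal robots.txt along those lines (the paths are just examples for a typical WordPress install):

User-agent: *
Disallow: /wp-admin/
Disallow: /some-private-page/

Bear in mind robots.txt only asks crawlers not to fetch those URLs; for a page that must never appear in results, a noindex robots meta tag on the page itself is more reliable.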
All search engines either index or ignore the entire page. The only possible way to implement what you want is to:
(a) have two different versions of the same page
(b) detect the browser used
(c) If it's a search engine, serve the second version of your page.
This link might prove helpful.
There are meta-tags for bots, and there's also the robots.txt, with which you can restrict access to certain directories.
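For example, the per-page robots meta tag looks like this; note that, like robots.txt, it applies to a whole page, not to individual sections within it:

<!-- In the <head> of a page you don't want in the index at all -->
<meta name="robots" content="noindex, nofollow" />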
I have a blog built in WordPress, and my domain name is something like example.com (I can't give you the original name, because sometimes the editors will mark this question as SPAM :( , and if anyone really wants to check the site directly, I will add it at the end of the question).
The site is http://example.com and the blog is at http://example.com/articles/
The sitemap.xml is available at http://example.com/sitemap.xml
Google visits my site daily and all my new articles get crawled. If I search for "article title + example.com", I get a search result from Google for my site, but the heading is not the actual one; it is taken from another article's data.
(I think I can give you a sample search query; please don't take this as spam.)
Installing Go Language in Ubuntu + tutorboy. But this result appears with the proper title only after a long title :(. I think now you understand what I am facing... please help me find out why this happens.
Edit:
How can I improve my SEO with WordPress?
When I search that query I don't get the page "Installing Go...", I get the "PHP header types" article, which has the above text on the page (links at the right). So the titles showing in Google are correct.
Google has obviously not crawled that page yet since it's quite new. Give it time, especially if your site is new and/or unpopular.
A couple of things I need to make clear:
Google crawled your site on 24 Nov 2009 12:01:01 GMT, so needless to say Google does not actually visit your site (blog) every day.
When I queried the phrase you provided, the results are right. There are two URLs related to your site: one is the home page of your blog, the other is the page that is more closely related to your query. The reason is that the query phrase is directly related to the page tutorboy.com/articles/php/use-php-functions-in-javascript.html; however, your home page still contains some related keywords. That is why Google presents two pages on the results page.
Your second question is hard to answer since it needs a complicated answer. Still, the following steps are crucial to your SEO.
Unique and good content. Content is king; it is the one thing that has stayed consistent the whole time while other elements change as search engine technology evolves. Also keep your site content fresh.
Back links. Part of the reason that Google does not revisit your site after you update it is that your site lacks enough back links.
Good structure. Properly use tags like <title>, the meta description, and <alt> attributes (see the sketch after this list).
Use web analytics tools like Google Analytics. It's free, and you can see a lot of things that you have missed.
Most importantly, grab some SEO books or spend a couple of minutes every day reading SEO articles.
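For example (the values here are purely illustrative, loosely based on the article mentioned above), a page head and image markup with those tags filled in:

<head>
  <!-- Unique, descriptive title for this article -->
  <title>Installing the Go Language on Ubuntu | Tutorboy</title>
  <!-- Meta description shown as the snippet in search results -->
  <meta name="description" content="A step-by-step guide to installing the Go language on Ubuntu." />
</head>
...
<!-- Descriptive alt text for images -->
<img src="go-install.png" alt="Terminal output while installing Go on Ubuntu" />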
Good Luck,