SEO optimization error - improper crawling or improper indexing

I have a blog built in WordPress, and my domain is like example.com (I can't give you the original name, because sometimes the editors will mark this question as SPAM :( ; if anyone really wants to check the site directly, I will add it at the end of the question).
http://example.com is the domain and the blog is at http://example.com/articles/,
and the sitemap.xml is available at http://example.com/sitemap.xml.
Google visits my site daily and all my new articles get crawled. If I search for "article title + example.com" I get my site in the Google results, but the heading is not the actual one; it is taken from another article's data.
(I think I can give you a sample search query; please don't take this as spam.)
Installing Go Language in Ubuntu+tutorboy - but this lists with the proper title only after a long one :( I think now you understand what I am facing... please help me find out why this happens.
Edit:
How can I improve my SEO with WordPress?

When I search that query I don't get the page "Installing Go...", I get the "PHP header types" article, which has the above text on the page (links at the right). So the titles showing in Google are correct.
Google has obviously not crawled that page yet since it's quite new. Give it time, especially if your site is new and/or unpopular.

A couple of things I need to make clear:
Google crawled your site on 24 Nov 2009 12:01:01 GMT, so needless to say Google does not actually visit your site (blog) every day.
When I queried the phrase you provided, the results are right. There are two URLs related to your site: one is the home page of your blog, the other is the page more closely related to your query. The query phrase is directly related to the page tutorboy.com/articles/php/use-php-functions-in-javascript.html, but your home page also contains some related keywords. That is why Google presents two pages on the result page.
Your second question is hard to answer since it needs a complicated answer. Still, the following steps are crucial to your SEO.
Unique and good content. Content is king; it is the one element that stays constant while everything else changes as search engine technology evolves. Also keep your site content fresh.
Back links. Part of the reason Google does not visit your site after you update it is that your site lacks enough back links.
Good structure. Properly use tags like <title>, the meta description, alt attributes, etc.
Use web analytics tools like Google Analytics. It's free, and you can see a lot of things that you would otherwise miss.
Most importantly, grab some SEO books or spend a couple of minutes every day reading SEO articles.
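As a sketch of the structural tags mentioned above (the titles, file names and text here are invented for illustration):

```html
<head>
  <!-- Page title: a strong on-page relevance signal -->
  <title>Installing Go Language in Ubuntu | Example Blog</title>
  <!-- Meta description: often shown as the search-result snippet -->
  <meta name="description" content="A step-by-step guide to installing the Go language on Ubuntu.">
</head>

<!-- In the body, alt text describes images to crawlers (and screen readers) -->
<img src="go-install.png" alt="Go installation output in an Ubuntu terminal">
```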
Good Luck,

Related

Google displaying website title differently in search results

Google displays my website’s page title differently to how it is meant to be.
The page title should be:
Graphic Designer Brighton and Lewes | Lewis Wallis Graphic Design
It displays fine in Bing, Yahoo and on my actual website.
However, Google displays it differently:
Lewis Wallis Graphic Design: Graphic Designer Brighton and Lewes
This is annoying as I want my keywords "graphic designer brighton" to go before my name.
I am using the Yoast SEO plugin and my only suspicion is that there might be a conflict between that and my theme, Workality.
Has anyone got any suggestions as to why this might be happening?
Google Search may change webpage titles they show in the result page (since 2012-01):
We use many signals to decide which title to show to users, primarily the <title> tag if the webmaster specified one. But for some pages, a single title might not be the best one to show for all queries, and so we have algorithms that generate alternative titles to make it easier for our users to recognize relevant pages.
See also the documentation at http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35624:
Google's generation of page titles and descriptions (or "snippets") is completely automated and takes into account both the content of a page as well as references to it that appear on the web. The goal of the snippet and title is to best represent and describe each result and explain how it relates to the user's query.
[…]
While we can't manually change titles or snippets for individual sites, we're always working to make them as relevant as possible.
In my answer on Webmasters SE I linked to questions from people having the same issue.
Is it possible that you changed the title, or installed the plugin, and Google hasn't picked up the changes yet?
It can take a few weeks for Google to pick up changes to your site, depending on how often it spiders it. The HTML looks fine so I can only think that Google hasn't got round to picking up the changes yet.

SEO: Allowing crawler to index all pages when only few are visible at a time

I'm working on improving the site for the SEO purposes and hit an interesting issue. The site, among other things, includes a large directory of individual items (it doesn't really matter what these are). Each item has its own details page, which is accessed via
http://www.mysite.com/item.php?id=item_id
or
http://www.mysite.com/item.php/id/title
The directory is large - having about 100,000 items in it. Naturally, on any of the pages only a few items are listed. For example, on the main site homepage, there are links to about 5 or 6 items, from some other page there links to about a dozen different items, etc.
When real users visit the site, they can use the search form to find items by keyword or location, producing a list that matches their search criteria. However when, for example, a Google crawler visits the site, it won't even attempt to put text into the keyword search field and submit the form. Thus, as far as the bot is concerned, after indexing the entire site it has covered only a few dozen items at best. Naturally, I want it to index each individual item separately. What are my options here?
One thing I considered is to check the user agent and IP ranges and if the requestor is a bot (as best I can say), then add a div to the end of the most relevant page with links to each individual item. Yes, this would be a huge page to load - and I'm not sure how google bot would react to this.
Any other things I can do? What are best practices here?
Thanks in advance.
One thing I considered is to check the user agent and IP ranges and if
the requestor is a bot (as best I can say), then add a div to the end
of the most relevant page with links to each individual item. Yes,
this would be a huge page to load - and I'm not sure how google bot
would react to this.
That would be a very bad thing to do. Serving up different content to the search engines specifically for their benefit is called cloaking and is a great way to get your site banned. Don't even consider it.
Whenever a webmaster is concerned about getting their pages indexed, having an XML sitemap is an easy way to ensure the search engines are aware of your site's content. They're very easy to create and update, too, if your site is database driven. The XML file does not have to be static, so you can dynamically produce it whenever the search engines request it (Google, Yahoo, and Bing all support XML sitemaps). You can find out more about XML sitemaps at sitemaps.org.
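Since the sitemap can be produced dynamically, the generation itself is only a few lines. A minimal sketch in Python (the item slugs are made up; a real site would pull them from its database):

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Build an XML sitemap string from an iterable of absolute URLs."""
    entries = "\n".join(
        "  <url><loc>{}</loc></url>".format(escape(u)) for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries +
        "\n</urlset>"
    )

# Hypothetical slugs as they might come back from a database query.
items = ["1/red-widget", "2/blue-widget"]
xml = build_sitemap(
    "http://www.mysite.com/item.php/{}".format(slug) for slug in items
)
```

The same function can back a script that regenerates the file whenever a crawler requests /sitemap.xml.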
If you want to make your content available to search engines and benefit from semantic markup (i.e. HTML), you should also make sure all of your content can be reached through hyperlinks (in other words, not through form submissions or JavaScript). The reason for this is twofold:
The anchor text in the links to your items will contain the keywords you want to rank well for. This is one of the more heavily weighted ranking factors.
Links count as "votes", especially to Google. Links from external websites, especially related websites, are what you'll hear people recommend the most and for good reason. They're valuable to have. But internal links carry weight, too, and can be a great way to prop up your internal item pages.
(Bonus) Google has PageRank which used to be a huge part of their ranking algorithm but plays only a small part now. But it still has value and links "pass" PageRank to each page they link to increasing the PageRank of that page. When you have as many pages as you do that's a lot of potential PageRank to pass around. If you built your site well you could probably get your home page to a PageRank of 6 just from internal linking alone.
Having an HTML sitemap that somehow links to all of your products is a great way to ensure that search engines, and users, can easily find all of your products. It is also recommended that you structure your site so the more important pages are closer to the root of your website (home page), branching out to sub pages (categories) and then to specific items. This gives search engines an idea of which pages are important and helps them organize them (which helps them rank them). It also helps them follow those links from top to bottom and find all of your content.
Each item has its own details page, which is accessed via
http://www.mysite.com/item.php?id=item_id
or
http://www.mysite.com/item.php/id/title
This is also bad for SEO. When you can pull up the same page using two different URLs you have duplicate content on your website. Google is on a crusade to increase the quality of their index and they consider duplicate content to be low quality. Their infamous Panda Algorithm is partially out to find and penalize sites with low quality content. Considering how many products you have it is only a matter of time before you are penalized for this. Fortunately the solution is easy. You just need to specify a canonical URL for your product pages. I recommend the second format as it is more search engine friendly.
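A canonical link is a single tag in the <head> of each duplicate page pointing at the preferred URL; using the placeholder URLs from the question, it would look something like:

```html
<!-- Served on both http://www.mysite.com/item.php?id=item_id and the friendly URL; -->
<!-- tells search engines which version to index and credit: -->
<link rel="canonical" href="http://www.mysite.com/item.php/id/title">
```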
Read my answer to an SEO question at the Pro Webmaster's site for even more information on SEO.
I would suggest for starters having an XML sitemap. Generate a list of all your pages, and submit this to Google via Webmaster Tools. It wouldn't hurt to have a "friendly" sitemap either - linked to from the front page, listing all these pages, preferably by category too.
If you're concerned with SEO, then having links to your pages is hugely important. Google could see your page and think "wow, awesome!" and give you lots of authority -- this authority (some like to call it "link juice") is then passed down to pages that are linked from it. You ought to make a hierarchy of files, more important ones closer to the top, and/or make it wide instead of deep.
Also, showing different stuff to the Google crawler than the "normal" visitor can be harmful in some cases, if Google thinks you're trying to con it.
Sorry -- A little bias on Google here - but the other engines are similar.

Use of sitemaps

I've recently been involved in the redevelopment of a website (a search engine for health professionals: http://www.tripdatabase.com), and one of the goals was to make it more search engine "friendly", not through any black magic, but through better xhtml compliance, more keyword-rich urls, and a comprehensive sitemap (>500k documents).
Unfortunately, shortly after launching the new version of the site in October 2009, we saw site visits (primarily via organic searches from Google) drop substantially to 30% of their former glory, which wasn't the intention :)
We've brought in a number of SEO experts to help, but none have been able to satisfactorily explain the immediate drop in traffic, and we've heard conflicting advice on various aspects, which I'm hoping someone can help us with.
My questions are thus:
do pages present in sitemaps also need to be spiderable from other pages? We had thought the point of a sitemap was specifically to help spiders get to content not already "visible". But now we're getting the advice to make sure every page is also linked to from another page. Which prompts the question... why bother with sitemaps?
some months on, and only 1% of the sitemap (well-formatted, according to webmaster tools) seems to have been spidered - is this usual?
Thanks in advance,
Phil Murphy
The XML sitemap helps search engine spiders index all the pages of your site.
The sitemap is very useful if you frequently publish many pages, but it does not replace a correct linking structure for the site: every document should be linked from another related page.
Your site is very large, so pay attention to the number of URLs published in the sitemap: there is a limit of 50,000 URLs per XML file.
The full documentation is available at Sitemaps.org
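With >500k documents and the 50,000-URL cap, you need several sitemap files plus a sitemap index that lists them. A rough sketch (the file names and URLs are invented):

```python
def chunk(urls, size=50000):
    """Split a flat list of URLs into sitemap-sized chunks."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]

def build_index(sitemap_urls):
    """Build a sitemap index XML string listing each child sitemap."""
    entries = "\n".join(
        "  <sitemap><loc>{}</loc></sitemap>".format(u) for u in sitemap_urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries +
        "\n</sitemapindex>"
    )

# 120,000 hypothetical document URLs -> three sitemap files (50k + 50k + 20k).
all_urls = ["http://www.tripdatabase.com/doc/{}".format(i) for i in range(120000)]
parts = chunk(all_urls)
index = build_index(
    "http://www.tripdatabase.com/sitemap-{}.xml".format(n + 1)
    for n in range(len(parts))
)
```

The index file is what you submit to Webmaster Tools; the child sitemaps are then fetched from it.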
re: do pages present in sitemaps also need to be spiderable from other pages?
Yes; in fact this should be one of the first things you do. Make your website more usable to users before worrying about the search engines, and the search engines will love you for it. Heavy internal linking between pages is a must as a first step. Most of the time you can do this with internal sitemap pages or category pages, etc.
re: why bother with sitemaps?
Yes! A sitemap helps you set priorities for certain content on your site (like the homepage) and tell the search engines what to look at more often. NOTE: do not set all your pages to the highest priority; it confuses Google and doesn't help you.
re: some months on, and only 1% of the sitemap seems to have been spidered - is this usual?
YES! I have a site with 100k+ pages. Google has never indexed them all in a single month; it takes small chunks of about 20k at a time each month. If you use the priority settings properly, you can tell the spider which pages it should re-index on each visit.
As Rinzi mentioned, more documentation is available at Sitemaps.org.
Try building more backlinks and "trust" (links from quality sources).
That may help speed up indexing further :)

Search Engine Optimisation

My neighbour popped over last night to ask me for help with regards to his company's website. He said that it used to be ranked pretty high on Google but has since fallen off completely.
Now, I'm a Windows app programmer, hence my request for help. I took a look and the meta tags seem OK. I recommended that he add an <h1>heading</h1> to the pages with a page title to help reinforce the content.
I also suggested that finding related websites and getting them to link to his site was good for search ranking.
Are there any other general strategies / tools that could help?
His site is: http://www.colofinder.co.uk/
ps. BTW: this isn't just an attempt to have StackOverflow link to my neighbour's site - I'm aware that links from SO don't add to its ranking.
Go to http://ooyes.net/blog/a-step-by-step-15-minute-seo-audit-%28a-sample-from-seo-secrets%29 and read it. Then go to http://www.searchenginejournal.com/55-quick-seo-tips-even-your-mother-would-love/6760/ and read it. Then go to your friend's site and look at it with that information in mind. Off the top of my head, I would flip the company name and page title in the "title" tags. Look at the Google Analytics account and see how people are coming to the site. That will give you an idea of where you should start your efforts to build a workable base.
First of all, he needs to make sure that his website's content is well managed and to the point. The page title has to be pinpoint; the keywords meta tag is obsolete, so use the meta description instead. The main heading should be in an h1 tag, sub headings in h2, and further sub headings in h3. Try to update the website at least once a month.
Use community websites like Facebook, Twitter and LinkedIn and other related forums for posting updates about completed projects, and include inbound links: use the company name as a link to the primary website and a project name as a link to the relevant subpage of the company website.
Keep posting at least once a week. Submitting the website URL to online directories will also be a great help. Do not use black-hat SEO techniques like cloaking, and do not use any invisible text/divs on the website. Whenever you place a link anywhere, make it the most to-the-point and appropriate link you can,
with link text that matches what you are linking to. Make a section on the website for tag clouds; this can be attractive to search engines and associate your website with other popular sites.
Make sure these tags point to top-ranking pages with relevant material. I hope this helps. Feel free to ask if anything I have mentioned above is unclear. Best of luck.

Is a deep directory structure a bad thing for SEO?

A friend of mine told me that the company he works at is redoing the SEO for their large website. Large == both the number of pages and the traffic they get each day.
Currently they have a (quote) deeply nested site, which I'm assuming means /x/y/z/a/b/c... or something. I also know it's very unRESTful from some of the pages I've seen -> e.g. foo.blah?a=1&b=2&c=3......z=24 (yep, lots of crap in the URL).
So updating their SEO sounds like a much-needed thing.
But they are going flat. I mean -> totally flat. e.g. /foo-bar-pew-pew-abc-article1
This scares the bollox out of me.
From what he said (if I understood him right), each - character doesn't mean a new hierarchical level,
so /foo-bar-pew-pew-abc-article1 does not mean /foo/bar/pew/pew/abc/article1.
A space is replaced by a -. A + represents a space, but only if the two words are supposed to be one word (whatever that means), i.e. Jean-Luke becomes jean+luke, but a subject like 'hello world' would be listed as hello-world.
Excuse me while I blow my head up.
Is it just me, or is it totally silly to go completely flat? I was under the impression that when SEO people say keep it as flat as possible, they mean keep it to 1 or 2 levels, with 4 as the utter max.
Is it just me, or is a flat hierarchy a 'really really good thing' for SEO... for MEDIUM and LARGE sites (lots of resources, not necessarily lots of hits/page views)?
Well, let's take a step back and look at what SEO is supposed to accomplish; it's meant to help a search engine identify quality, relevant content for users based on key phrases and terms.
Take, for example, the following blog URLs:
* http://blog.example.com/articles/2010/01/20/how-to-improve-seo/
* http://blog.example.com/how-to-improve-seo/
Yes, one is deep and the other is flat; but the URL structure is important for two reasons:
URL terms and phrases are high-value targets for determining relevance of a page by a search engine
A confusing URL may immediately force a user to skip your link in the search results
Let's face it: Google and other search engines can associate even the worst URLs with relevant content.
Take, for example, a search for "sears kenmore white refrigerator" in Google: http://www.google.com/search?q=sears+kenmore+white+refrigerator&ie=utf-8&oe=utf-8&aq=t&rls=org.mozilla:en-US:official&client=firefox-a.
Notice the top hit? The URL is http://www.sears.com/shc/s/p_10153_12605_04665802000P , and yet Google replaces the lousy URL with www.sears.com › Refrigerators › Top Freezers. (Granted, 2 results down is the true URL.)
If your goal for SEO is optimized organic relevance, then I would wholeheartedly recommend generating either key/value pairs in the URL, like www.sears.com/category/refrigerators/company/kenmore (meh), or phrase-like URLs like www.sears.com/kenmore/refrigerators/modelNumber. You want to align your URLs with the user's search terms and phrases to maximize your effort.
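For what it's worth, the flat phrase-like slugs being discussed are cheap to generate; a minimal sketch (the exact normalisation rules in the company's scheme are their own, so this shows just one common convention):

```python
import re

def slugify(title):
    """Turn a page title into a flat, hyphen-separated URL slug."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # runs of non-alphanumerics -> one hyphen
    return slug.strip("-")

print(slugify("hello world"))          # hello-world
print(slugify("How to improve SEO!"))  # how-to-improve-seo
```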
In the end, if you offer valuable content and you structure your content and site properly, the search engines will accurately gather it. You just need to help them realize how specific and authoritative your content is. :)
Generally, the less navigation needed to reach content, the better. But with a logical breadcrumb strategy and well-thought-out deep linking, excess directory depth can be managed without hurting SEO or visibility in search.
Remember that Google is trying to return the most relevant link and the best user experience, so if your site has 3 URLs coming up for the same search term and it takes 2 or 3 clicks to find the appropriate content, Google will read that as bad and start lowering all of your URLs in the SERPs.
You have to consider how visitors will find your content - not just how they navigate it. Think content discovery, not just navigation.
HTH
Flat or deeply nested really shouldn't affect the SEO. The key part is how those individual pages are linked to; that determines how they get ranked. I did write some basic stuff on this years ago (see here), but essentially, as long as pages are not buried deeply within a site - i.e. it doesn't take several clicks (or links, from Google's perspective) to reach them - they should rank much the same in either case. Google used to put a lot more weight on keywords in URLs, but this has been scaled back in more recent algorithm changes. It helps to have keywords there, but it's no longer the be-all and end-all.
What you/they will need to consider are the following two important points:
1) How will the URL structure be perceived by the users of the site? Will they be able to easily navigate the site without having to rely on the URL structure in the address bar?
2) In making navigational changes such as this, it's vitally important to set up redirects from the old URLs. Google hates 404s; you should either return 410 (Gone) HTTP responses for pages that are no longer valid, or 301 (permanent redirect) responses pointing at the new URL.
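The redirect logic boils down to a lookup from old URL to either a new URL (301) or a gone marker (410); a sketch with invented paths:

```python
# Hypothetical mapping from old nested URLs to the new flat ones.
REDIRECTS = {
    "/x/y/z/article1": "/foo-bar-pew-pew-abc-article1",
}
GONE = {"/x/y/z/retired-page"}  # removed pages with no replacement

def respond(path):
    """Return (HTTP status, Location header or None) for a request path."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]  # permanent redirect to the new URL
    if path in GONE:
        return 410, None             # tell crawlers the page is gone for good
    return 200, None                 # serve the page normally
```

In practice this table would usually live in the web server's rewrite rules rather than application code, but the mapping is the same.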
In making any large changes such as this you can save loads of time getting the site indexed successfully by utilising XML sitemaps and Google's webmaster console.