Trying to understand Google Results and meta tags - seo

Note: this does NOT regard ranking, I just want the results to look better overall.
I'm working with a "news site" with a lot of articles, some dynamic, some static.
The developers haven't really given much thought to SEO, but now they want the Google results to look a bit prettier - which landed on my desk.
In the source code there are a few meta tags, for example:
<meta name="twitter:title" content="content">
<meta name="og:title" content="content">
Running it through the Google Structured Data Testing Tool shows what I'd expect, but the search result for that specific link doesn't seem to have the correct snippet.
It seems Google doesn't always want to pick the og:description content. Sometimes it does, and sometimes it also repeats the title in the snippet.
What I don't get: is Google using og:title for results, or is that only for e.g. Facebook sharing? Do I simply need the tag below, since that one is actually missing from the code?
The description itself would be the same as og:description since they contain the same content.
<meta name="description" content="content">
As far as I understand, it can be quite tricky to customize these sorts of things, but could it really be that hard to get any sort of consistency across the results for our pages?

There are two things you can do, but both come with a caveat.
Google treats anything from your site as a suggestion. There is no way to make it behave identically in all situations: if Google's algorithm believes there is a better way to present a result, it will ignore any direction you give it and auto-generate a new presentation for your page.
That said, here are the two things:
Add meta tags with the exact text you'd like to appear on the SERP. The page title may or may not have your brand/company name appended. If it already contains the company/brand name, Google is more likely to leave it as it is.
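For example, a snippet-friendly head might look like this (titles and text are placeholders; note that the Open Graph spec uses the property attribute rather than name):

```html
<head>
  <!-- The title already includes the brand, so Google is likely to keep it -->
  <title>Article Headline | Brand Name</title>
  <!-- The exact text you'd like to appear as the snippet -->
  <meta name="description" content="A one- or two-sentence summary of the article.">
  <!-- Open Graph tags are aimed at social sharing (Facebook etc.), not the SERP -->
  <meta property="og:title" content="Article Headline">
  <meta property="og:description" content="A one- or two-sentence summary of the article.">
</head>
```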
Google extracts text from the page based on what it considers most important/relevant to the search. For news, using HTML5 elements (nav, article, aside), or labelling your divs with classes using those keywords, helps Google understand what the real content is. Asides are less likely to be used, while articles will be focused on.
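Sketched out, such a page structure could look like this (class names and text are invented):

```html
<body>
  <nav><!-- site navigation: unlikely to be quoted in a snippet --></nav>
  <article>
    <h1>Article Headline</h1>
    <p>The main story text - the most likely source for a snippet.</p>
  </article>
  <aside class="related-articles"><!-- less likely to be quoted --></aside>
</body>
```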
I would also recommend having authors write their own custom descriptions and insert them via your CMS. They're likely much better at writing a good summary than Google or an auto-summarizing script. Google will occasionally experiment with alternative snippets, but once one proves popular in terms of click-through rate, it'll stick.

Related

How to create SEO-friendly paging for a grid?

I've got this grid (a list of products in an internet shop) for which I've no idea how big it can get. But I suppose a couple hundred items is quite realistic, especially for search results. Maybe even thousands, if we get a big client. :)
Naturally, I should use paging for such a grid. But how to do it so that search engine bots can crawl all the items too? I very much like this idea, but that only has first/last/prev/next links. If a search engine bot has to follow links 200 levels deep to get to the last page, I think it might give up pretty soon, and not enumerate all items.
What is the common(best?) practice for this?
Is it really the grid you want to have indexed by the search engine, or are you after the product detail pages? If the latter, you can provide a dynamic sitemap (XML) and the search engines will take it from there.
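Such a dynamic sitemap is just an XML file listing the product detail URLs, typically generated from the product database (URLs and dates here are invented):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/products/sony-lcd-tv-40</loc>
    <lastmod>2010-06-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/products/samsung-lcd-tv-32</loc>
    <lastmod>2010-06-02</lastmod>
  </url>
</urlset>
```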
I run a number of price comparison sites, and as such I've had the same issue as you before. I don't really have a concrete answer; I doubt anyone will have one, tbh.
The trick is to try and make each page as unique as possible. The more unique pages, the better. Think of it as each page in google is a lottery ticket, the more tickets the more chances you have of winning.
So, back to your question. We tend to display 20 products per page and then have pagination at the bottom. AFAIK Google and other bots will crawl all links on your site; they won't give up. What we have noticed, though, is that if your subsequent pages have the same SEO titles and H tags, and are basically the same page with a different result set, then Google will NOT add the pages to the index.
Likewise, I've looked at the site you mentioned and would suggest changing the layout to use text rather than images; an example of what I mean is on this site: http://www.shopexplorer.com/lcd-tv/index.html
Another point to remember: the more images etc. on the page, the longer the page takes to load and the worse your user experience will be. I've also heard it counts against you in SEO ranking algorithms.
Not sure if I've given you enough to go on, but to recap:
I would limit the results to 20-30 per page
I would use pagination, but with text links rather than images
I would make sure the paginated pages have distinct enough 'SEO markers' [title, h1, etc.] to count as unique pages.
i.e.
LCD TV results page 2 > is bad
LCD TV results from Sony to Samsung > Better
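In markup, each paginated page could then carry its own distinct markers, plus rel links so bots understand the sequence (URLs and text here are invented for illustration):

```html
<head>
  <title>LCD TVs from Sony to Samsung</title>
  <link rel="prev" href="http://www.example.com/lcd-tv/page-1">
  <link rel="next" href="http://www.example.com/lcd-tv/page-3">
</head>
<body>
  <h1>LCD TVs from Sony to Samsung</h1>
  <!-- 20-30 products, with plain-text pagination links, not images -->
</body>
```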
Hopefully I've helped a little.
EDIT:
Vlix, I've also seen your question re: sitemaps. If you're concerned about sitemap size, I wouldn't be; just split the feed into multiple separate feeds, maybe at category level, brand level, etc. I'm not sure, but I think Google wants as many pages as possible. It will ignore the ones it doesn't like and just add the unique ones.
That, at least, is how I understand it.
SEO is a dark art - nobody will be able to tell you exactly what to do and how to do it. However, I do have some general pointers.
Pleun is right - your objective should be to get the robots to your product detail page - that's likely to be the most keyword-rich, so optimize this page as much as you can! Semantic HTML, don't use images to show text, the usual.
Construct meaningful navigation schemes to lead the robots (and your visitors!) to your product detail pages. So, if you have 150K products, let's hope they are grouped into some kind of hierarchy, and that each (sub)category in that hierarchy has a manageable (<50 or so) number of products. If your users have to go through lots and lots of pages in a single category to find the product they're interested in, they're likely to get bored and leave. Turn this categorization into a navigation scheme, and make it SEO-friendly - e.g. by using friendly URLs.
Create a sitemap - robots will crawl the entire sitemap, though they may not decide to pay much attention to pages that are hard to reach through "normal" navigation, even if they are in the sitemap.xml.
Most robots don't parse more than the first 50-100K of HTML. If your navigation scheme (with a data grid) is too big, the robot may not pick up or follow links near the end of the page.
Hope this helps!

Google has all the wrong keywords

I hope stackoverflow is the right part of the trinity to ask this kind of question ...
Google webmaster tools shows the keywords it considers important for my blog (blog.schauderhaft.de). But among the top 20% are all the month names (you know january and so on).
I actually have a two part question about this:
Why does Google think these are important keywords?
How do I fix that?
It might have something to do with the whole list of archives in the head of your page: <link rel='archives' title='January 2008' and so on.
Do you think this will actually be a problem? These people don't seem to think so..
We used to have a big problem on one of our client websites with a similar issue: country names appearing as the most important keywords. On some pages we were running multiple forms where one could choose a country. Google was finding this all over the place and thus considered it important.
So if you have month names in archives or in article dates, that might very well be the cause. Make sure you tag each one properly: if it's a date, you can use the HTML5 markup for dates to identify it as such; for the archive list, you could load it via AJAX or build it with JavaScript instead.
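For example, a post date marked up with the HTML5 time element rather than left as plain text:

```html
<article>
  <h2>Blog post title</h2>
  <!-- machine-readable date, so it isn't mistaken for keyword content -->
  <time datetime="2008-01-15">January 15, 2008</time>
  <p>Post content…</p>
</article>
```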
In order to drop the country names, we had to use a jQuery trick to insert them dynamically into the page after page load (so Google no longer sees the list as important to our website).
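A minimal sketch of that trick, assuming jQuery is loaded; the element id and country list are invented:

```html
<select id="country-picker"></select>
<script src="jquery.js"></script>
<script>
  // Build the country list after page load, so crawlers that don't
  // execute JavaScript never see the option text as page content.
  $(function () {
    var countries = ['France', 'Germany', 'Spain'];
    $.each(countries, function (i, name) {
      $('#country-picker').append($('<option>').text(name));
    });
  });
</script>
```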

Does apparent filename affect SEO?

If I name my HTML file "Banks.html" located at www.example.com/Banks.html, but all the content is about Cats and all my other SEO tags are about Cats on the page, will it affect my page's SEO?
Can you name your files whatever you want, as long as you have the page title, description, and the rest of the SEO done properly?
Page names are often not very representative of the page content (I've seen pages named 7d57As09). Therefore search engines are not going to be particularly upset if the page names appear misleading. However, it's likely that the page name is one of many factors a search engine considers.
If there's no disadvantage in naming a page about cats, "cats.html", then do so! If it doesn't help your SEO, it will make it easier for your visitors!
If you want to rank better when someone searches for 'banks', then yes, it can help you. But unless you are creating pages about cats in banks, I'm sure this won't help you very much :)
It shouldn't affect your search engine ranking, but it may influence people who, having completed a search on Google (or some of the other great search engines, like um...uh...), are now scanning the results to decide where to click first. Someone faced with a URL like www.dummy.com/banks.html is more likely to click than someone faced with www.dummy.com/default.php?p_id=1&sessid=876492u942fgspw24z, because most people haven't a clue what the last part means. A nicely written URL is also more memorable and gives people greater faith in getting back to the same site. No one who isn't Dustin Hoffman can remember the second URL without a little intense memory training, while everyone can remember banks.html. Just make sure your URL generation is consistent and your rewriting is solid, so you don't end up with loads of page-not-found errors, which can hurt your search engine ranking.
Ideally, your page name should be relevant to the content of the page - so your ranking may improve if you call the page "cats.html", as that is effectively another occurrence of the keyword in the page.
Generally, this is fairly minor compared to the benefits of decent keywords, titles, etc on the page. For more information take a look at articles around Url Strategy, for example:
"I’ve heard that search engines give some weighting to pages which contain keywords users are searching for which are contained within the page URL?"
Naming your pages something meaningful is a good idea and does improve SEO. It's another hint to the search engines what the page is about, in addition to the title and content. You would be surprised if you opened a file on your computer called "Letter to Grandma.doc" and it was actually your tax return!
In general, the best URLs are those that simply give a page name and hierarchical structure, without extensions or ID numbers. Keep it lowercase and separate words with dashes, like this:
example.com/my-cats
or
example.com/cats/mittens
In your case you will probably want to keep the .html extension to avoid complexities with URL rewriting.
Under some circumstances this can be considered a black-hat SEO technique. Watch out not to get caught or reported by curious users.
Google's ranking algorithm weighs hundreds of variables and factors. From that point of view, you can be sure that the names of the files you use on your website will affect your ranking and/or your keyword targeting.
There are a few on-page elements that carry real significance. The URL, while it can be /234989782, is going to be more beneficial if it's named relevantly.
From any point of view, Google and all search engines like to see a coherence between everything: if you have a page named XYZ, then google will like it better if the text, meta, images, url, documents, etc, on the page to have XYZ in them. The bigger this synchronisation between the different elements on a page, the more the search engine sees how focused the content of that page is, resulting in more hits for you when someone looks up that focused search term.
If you have an image for example, you're better off having the same:
caption
description
name
alt text
(wordpress users will recognize that these are the four parameters that can be set for images on wordpress).
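Outside WordPress, keeping those parameters in sync in plain HTML might look like this (file name and text are invented):

```html
<figure>
  <!-- file name, alt text, title and caption all agree -->
  <img src="persian-cat.jpg" alt="Persian cat" title="Persian cat">
  <figcaption>Persian cat</figcaption>
</figure>
```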
The same goes for all files you have on your website. Any parameter that can be seen by a search engine is better off optimized with respect to the content that goes with it, in sync with all the other parameters of that same thing.
The question of how useful this all is arises afterwards. Will I really rank lower if my alt text is different than the name of my image? Probably not by a lot. But either way, taking advantage of small subtleties like these can take you a long way in SEO. There are so many things we can't control in SEO (or that we shouldn't be able to control, like backlinks), that we have to use what we can control in the best way possible, to compensate.
It's also hard to tell if it is all useful after the Google Panda and Penguin. It definitely has less of an impact ever since those reforms (back then, this kind of thing was crucial), the question is simply how much of an impact it still has. But all in all, as I said, whenever possible, name your files according to your content.
Today's algorithm is totally different from when SEO was first introduced. SEO today is about content and its quality: the page must read well and win followers, so the filename and description are no longer as important.
Page name doesn't affect SEO much, but naming a page well is still one of Google's ~200 ranking signals.
Naming a URL descriptively will surely reduce your bounce rate a little, because a user who comes to your site through organic search results otherwise doesn't understand what the page contains.
Even search engines love it when a page name is relevant to the topic of the page.

Hiding or Promoting specific content within a page to search engines

A bit of an SEO question here.
I've got a site with a ton of pages, of content. I know lots of the content is the same on each page.
I thought that Search Engines keyed off of the differences in page content so that they could promote the correct data, but when I look at the summary in google and bing, the summary shows my 'feedback' block (which is where I just ask for feedback).
Yahoo (and the summary in Facebook) shows my search options menu.
These aren't really things that are going to make a person want to click on the page.
So I'm wondering what the best way is to either hide this content from search engines, or improve the visibility of the other content that should get indexed.
The page structure is pretty consistent, so I thought it would have been easy for the search robots to pick this stuff out, but apparently not.
You may want to try using a meta tag like this.
<meta name="description" content="Here is a short summary of the page">
Search engines also prefer title and header tags over regular text.
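Putting both of those together, a sketch (the titles and summary text are placeholders):

```html
<head>
  <title>Title describing the real content</title>
  <meta name="description" content="A short, accurate summary of the page's main content.">
</head>
<body>
  <h1>The main topic of the page</h1>
  <!-- keep feedback blocks and search menus out of titles/headers,
       since engines weigh those elements heavily -->
</body>
```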
Meta is the best way to do that.
However, beware that the structure of your page is also important: search engines prefer to use the meta tag, but they also weigh the structure, keywords, headers, and things like that.
I ran into this kind of trouble a couple of months ago. I found Google was showing the price and download links rather than the meta description. I solved it by rewriting the meta description (more accurate and shorter, 177 characters), removing tags from the price and download elements, and making some slight adjustments to the structure. Now the Google summary is what I want.
Hope this helps you!

SEO for Ultraseek 5.7

We've got Ultraseek 5.7 indexing the content on our corporate intranet site, and we'd like to make sure our web pages are being optimized for it.
Which SEO techniques are useful for Ultraseek, and where can I find documentation about these features?
Features I've considered implementing:
Make the title and first H1 contain the most valuable information about the page
Implement a sitemap.xml file
Ping the Ultraseek xpa interface when new content is added
Use "SEO-Friendly" URL strings
Add Meta keywords to the HTML pages.
The most important bit of advice anyone can get when optimizing a website for search engines and indeed for tools like Ultraseek is this...
Write your web pages for your human audience first and foremost. Don't do anything odd to try and optimize your website for a search engine. Don't stuff keywords into your URL if it makes the URL less sensible. Think human first.
Having said this, the following techniques usually make things better for both the humans and the machines!
Use headings (h1 through h6) to give your page a structure. Imagine them being arranged in a tree view, with a h1 containing some h2 tags and h2 tags containing h3 tags and so on. I usually use the h1 tag (there should be only one h1 tag) for the site name and the h2 tag for the page name, with h3 tags as sub-headings where appropriate.
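For instance, that tree might look like this (the names are placeholders):

```html
<body>
  <h1>Intranet Site Name</h1>  <!-- only one h1 per page -->
  <h2>Page Name</h2>
  <h3>First sub-heading</h3>
  <p>Content…</p>
  <h3>Second sub-heading</h3>
  <p>More content…</p>
</body>
```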
Sitemaps are very useful as they contain a list of your pages; consider a sitemap a request for the pages you would like included in any index. They don't normally contain much context, though.
Friendly URL strings are great for humans. I'd much rather visit www.website.com/Category/Music/ than www.website.com?x=3489 - it does also mean that you give the machines some more context for your page. It especially helps if the URL matches your h1 and h2 tags. Like this:
www.website.com/Category/Music/
Website
Category: Music
Welcome to the music category!
Meta keywords (and description) are useful - but as per the above advice, you need to make sure that it all matches up. Use a small but targeted set of keywords that highlight what is specifically different about the page and make sure your description is a good summary of the page content. Imagine that it is used beneath the title in a list of search results (even though it might not be!)
Navigation! Providing clear navigation, as well as back links (such as bread-crumbs) will always help. If someone clicks on a search result, it might not be the exact page they are after, but it may well be very close. By highlighting where people have landed in your navigation and by providing a bread-crumb that tells them where they are, they will be able to traverse your pages easily even if the search hasn't taken them to the perfect location.
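A bread-crumb trail for that purpose can be as simple as the following (the page names and class are invented):

```html
<!-- shows visitors (and crawlers) where the current page sits -->
<nav class="breadcrumb">
  <a href="/">Intranet Home</a> &gt;
  <a href="/hr/">HR</a> &gt;
  <span>Holiday Policy</span>
</nav>
```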