Which of these URL structures is better practice for SEO?

I have a quick question about URL structures. Which of these is more commonly used, and which should be used as best practice? If you know of any other approach that is even better, that would be appreciated.
Create all add-type pages with a prefix of add- and then the rest:
http://example.com/add-account
or create a folder for all adding functionality
http://example.com/add/account

From an SEO point of view, I would think that add-account would be nice. But as Joël said, if you are focusing on a sane URL structure, /account/add would suffice, and I do not think it would be any worse for SEO.
It's not that important for an account page anyway. If it were a clothes shop, I would recommend example.com/products/women/dresses/red-dress-with-flowers as the URL, not women-dresses-red-dress-with-flowers. :)

What is the correct way to do REST URLs?

domain.com/blog/How-To-Code/3 (page 3)
domain.com/user/alicejohnson/comments
OR
domain.com/How-To-Code/3
domain.com/alicejohnson/comments
Facebook and Quora do it the second way: http://www.quora.com/Swimming/Can-one-swim-from-New-Zealand-to-Australia They eliminate the noun and go straight to the object.
Stack Overflow does it the first way: its question URLs take the form stackoverflow.com/questions/<id>/<slug>.
Which should I do?
Most importantly, how does this affect SEO?
Also, if I go with the second version, how do I go about writing the "router" for that?
Perhaps think of 'blog' and 'user' above as namespaces. If there are multiple uses for How-To-Code, you might have to add something to disambiguate.
Some people tend to be pedantic about "proper REST", but I try to avoid that. I would design your URL scheme so that it fits your needs, works well with your tools, and lets you simply paste URLs into a browser to test your code. A sketch of a router for the namespace-less version is below.
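As a minimal sketch of that idea, here is a catch-all router in Python/Flask (my choice of framework, purely for illustration; find_blog_post and find_user are hypothetical lookup functions). Without a /blog or /user namespace in the URL, the router has to probe each namespace in a fixed priority order:

```python
# Minimal sketch of a namespace-less router (Flask assumed; the lookup
# functions are hypothetical placeholders for real database queries).
from flask import Flask, abort

app = Flask(__name__)

def find_blog_post(slug):
    return {"How-To-Code": "a blog post"}.get(slug)  # placeholder lookup

def find_user(slug):
    return {"alicejohnson": "a user"}.get(slug)      # placeholder lookup

@app.route("/<slug>")
@app.route("/<slug>/<path:rest>")
def dispatch(slug, rest=""):
    # Probe each "namespace" in priority order, since the URL no longer
    # says whether the slug names a blog post or a user.
    post = find_blog_post(slug)
    if post is not None:
        return f"blog post {slug!r}, extra path {rest!r}"  # e.g. rest == "3"
    user = find_user(slug)
    if user is not None:
        return f"user {slug!r}, extra path {rest!r}"       # e.g. rest == "comments"
    abort(404)
```

The hidden cost, compared with the /blog/... form, is that every slug must be unique across all namespaces, or one namespace silently shadows the other.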

Hiding a page part from Google, does it hurt SEO?

We all know that showing nonexistent stuff to Google's bots is not allowed and will hurt your search positioning, but what about the other way around: showing visitors things that are not displayed to Google's bots?
I need to do this because I have photo pages, each with a short title and the photo, along with a textarea containing the embed HTML code. Googlebot is taking the embed code and using it as the page description in its search results, which is very ugly.
Please advise.
When you start playing with tricks like that, you need to consider several things.
... showing stuff to visitors that are not displayed for Google bots.
That approach is a bit tricky.
You can certainly check User-Agent headers to see whether a visitor is Googlebot, but Google can add any number of new spiders with different User-Agents, which will end up indexing your images anyway. You would have to monitor that constantly.
Testing each code release of your website will have to cover the "images and Googlebot" scenario, which will extend the testing phase and its cost.
It can also affect future development: all changes will have to be made with the "images and Googlebot" scenario in mind, which can introduce additional constraints into your system.
Personally, I would choose a slightly different approach:
First of all, review whether you can use any of the methods recommended by Google. Google provides a few good pages describing this problem, e.g. Blocking Google or Block or remove pages using a robots.txt file.
If that is not enough, maybe restructuring your HTML would help. Consider using JavaScript to build some customer-facing interfaces.
And whatever you do, try to keep it as simple as possible; overly complex solutions can turn around and bite you.
It is very difficult to give you good advice without knowledge of your system, constraints, and strategy, but I hope my answer helps you choose a good architecture/solution for your system.
Google does not judge you a cheater on appearances alone; it reviews your intent. As long as your purpose is the user experience rather than common cheating tactics, Google will not consider it cheating.
Just block these pages with robots.txt and you'll be fine. It is not cheating; that's why they came up with a solution like that in the first place. A minimal sketch is below.
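As a sketch of that suggestion (assuming, purely for illustration, that the photo pages live under a /photos/ path):

```
# Hypothetical robots.txt: keep well-behaved crawlers off the photo pages.
User-agent: *
Disallow: /photos/
```

Note that this blocks the pages from crawling entirely; if you want the photo pages indexed and only want a cleaner snippet, the restructuring approach above is the safer route.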

Category & product URL SEO

What is considered best practice for URL structuring these days?
For some reason I thought you only include an extension at the end of a URL once you get down to the 'lowest' part of your hierarchy, e.g.
/category/sub-category/product.html
and all category URLs would then be:
/category/sub-category/
without an extension at the end, because there is still further to go down the structure.
Looking forward to your thoughts.
Andy.
EDIT
Just for clarification purposes: I'm looking at this from an ecommerce perspective.
Your question is not very clear, but I'll reply as I understand it.
As to whether or not to use file extensions: according to Google's representative Matt Cutts, Google crawls .html, .php, or .asp, but you should keep away from .exe, .dll, and .bin. Those signify largely binary data, so they may be ignored by Googlebot.
Still, when designing SEO-friendly URLs, keep in mind that they should be short and descriptive, so you can use your keywords to rank higher. If you have good keywords in your category names, why not let them be visible in the URL?
Make sure you're using static instead of dynamic URLs: they are easier to remember, and they don't change. A rewrite-rule sketch is below.
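As a sketch of serving static-looking URLs from a dynamic backend (assuming Apache with mod_rewrite in an .htaccess file, and hypothetical category.php / product.php scripts):

```
RewriteEngine On
# /category/sub-category/ -> dynamic category listing
RewriteRule ^([^/]+)/([^/]+)/$ /category.php?cat=$1&sub=$2 [L,QSA]
# /category/sub-category/product.html -> dynamic product page
RewriteRule ^([^/]+)/([^/]+)/([^/]+)\.html$ /product.php?cat=$1&sub=$2&product=$3 [L,QSA]
```

Visitors and crawlers only ever see the clean hierarchy; the query-string version never has to appear in a link.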

Is it easier to rank well in the search engines for one domain or multiple (related) domains?

I plan to provide content/services across multiple (similar and related) subcategories. In general, users will only be interested in the one subcategory related to their needs.
Users will be searching for the term that would be part of the domain, subdomain or URL.
There are three possible strategies:
primary-domain.tld, with subdomains:
keyword-one.primary-domain.tld
keyword-two.primary-domain.tld
primary-domain.tld, with directories:
primary-domain.tld/keyword-one
primary-domain.tld/keyword-two
or each keyword gets its own domain:
keyword-one-foo.tld
keyword-two-foo.tld
From an SEO point of view, which is the best approach to take? I gather that having one overall domain would mean links to any of the subdomains or directories add weight to the whole site, helping the ranking of each subdomain/directory. However, supposedly if the domain, keywords, and title all match the content nicely, that would rank highly as well. So I'm unsure which approach is best.
The only answer I think anyone can give you here is that you can't know. Modern search engine algorithms are pretty sophisticated, and which marginally different naming methodology is better is impossible to determine without inside knowledge.
Even if you did know, it could change in the future. Or perhaps it doesn't come into the equation at all, precisely because it is open to abuse.
99% of the time it comes down to content: originality, quality, and so on.
As long as you provide the best-quality content and make your website SEO-friendly, the domain names don't matter much.
Personally, I prefer to create several domains and maintain them; when the content grows, you can map it out, which may help when you start thinking about content delivery networks.

Is listing all products on the homepage's footer making a real difference SEO-wise?

I'm working on a website where I've been asked to add, to the homepage's footer, a list of all the products sold on the site along with links to the products' detail pages.
The problem is that there are about 900 items to display.
Not only does that not look good, it also makes the page render a lot slower.
I've been told that such a technique would improve the website's visibility in search engines.
I've also heard that such techniques could have the opposite effect: Google seeing it as "spam".
My question is: is listing a website's products on its homepage really effective when it comes to becoming more visible in search engines?
That technique is called keyword stuffing and Google says that it's not a good idea:
"Keyword stuffing" refers to the practice of loading a webpage with keywords in an attempt to manipulate a site's ranking in Google's search results. Filling pages with keywords results in a negative user experience, and can harm your site's ranking. Focus on creating useful, information-rich content that uses keywords appropriately and in context.
Now you might ask: does their crawler really recognize that the list at the bottom of the page is just keyword stuffing? Well, that's a question only Google could answer (and I'm pretty sure they don't want to). In any case, even if you could build a keyword-stuffing block that is not recognized today, they will probably improve their algorithm and, sooner or later, discover the truth. My recommendation: don't do it.
If you want to optimize your search engine page ranking, do it "the right way" and read the Search Engine Optimization Guide published by Google.
Google is likely to see a huge list of keywords at the bottom of each page as spam. I'd highly recommend not doing this.
When is it ever a good idea to present 900 items to a user? Good practice dictates that large lists be paginated to avoid giving the user a huge blob of stuff to look through at once.
That's a good rule of thumb: if you're doing it to help the user, then it's probably good; if you're doing it purely to help a machine (i.e. Google/Bing), then it's probably a bad idea.
You can return different HTML to genuine users and to Google by inspecting the User-Agent of the web request.
That way you can provide the Google bot with a lot more text than you'd give a human user.
Update: people have pointed out that you shouldn't do this. I'm leaving this answer up, though, so that people know it's possible but bad. A sketch follows.
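For completeness, here is a minimal sketch of the User-Agent check described above (Flask is my assumption; render_home is a hypothetical renderer). To be clear, this is cloaking, which violates Google's guidelines and risks a penalty:

```python
# Cloaking sketch: serves extra keyword links only to Googlebot.
# Shown to illustrate the (discouraged) technique, not to recommend it.
from flask import Flask, request

app = Flask(__name__)

def render_home(include_product_list):
    # Hypothetical renderer; a real site would use templates.
    footer = "<ul><!-- ~900 product links --></ul>" if include_product_list else ""
    return f"<html><body>Home page{footer}</body></html>"

@app.route("/")
def home():
    ua = request.headers.get("User-Agent", "")
    # Naive check; as noted above, Google runs many crawlers under
    # different User-Agents, so this will not stay reliable.
    return render_home(include_product_list="Googlebot" in ua)
```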