What is considered best practice for URL structuring these days?
For some reason I thought you only included an extension at the end of a URL once you got down to the 'lowest' part of your hierarchy, e.g.
/category/sub-category/product.html
while all category URLs would be:
/category/sub-category/
with no extension at the end, because there is still further to go down the structure.
Looking forward to your thoughts.
Andy.
EDIT
Just for clarification purposes: I'm looking at this from an ecommerce perspective.
Your question is not very clear, but I'll reply as I understand it.
As to whether to use file extensions: according to Google's Matt Cutts, Google crawls .html, .php, and .asp URLs without trouble, but you should keep away from .exe, .dll, and .bin. Those extensions signify largely binary data, so they may be ignored by Googlebot.
Still, when designing SEO-friendly URLs, keep in mind that they should be short and descriptive, so you can use your keywords to rank higher. If you have good keywords in your category names, why not let them be visible in the URL?
Make sure you're using static rather than dynamic URLs: they are easier to remember, and they don't change.
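As a concrete sketch of what a static, descriptive URL can look like (Flask and all the names here are my own assumptions for illustration; the answer doesn't prescribe a stack), serving /category/sub-category/product-slug instead of /product.php?id=123 might be done like this:

    # Minimal sketch: static, keyword-bearing paths instead of ?id= query strings.
    # Flask is assumed; PRODUCTS stands in for a real product lookup.
    from flask import Flask, abort

    app = Flask(__name__)

    PRODUCTS = {("shoes", "running", "road-racer"): "Road Racer trainers"}

    @app.route("/<category>/<subcategory>/<product_slug>")
    def product(category, subcategory, product_slug):
        name = PRODUCTS.get((category, subcategory, product_slug))
        if name is None:
            abort(404)  # unknown path: return a real 404, not a soft error page
        return name

The URL stays stable and readable even if the underlying database IDs change.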
domain.com/blog/How-To-Code/3 (page 3)
domain.com/user/alicejohnson/comments
OR
domain.com/How-To-Code/3
domain.com/alicejohnson/comments
Facebook and Quora do it the second way: http://www.quora.com/Swimming/Can-one-swim-from-New-Zealand-to-Australia. They eliminate the noun and go straight to the object.
Stack Overflow does it the first way: 'What is the correct way to do REST URLs?'
Which should I do?
Most importantly, how does this affect SEO?
Also, if I do the 2nd version, how do I go about writing the "router" for that?
Perhaps think of 'blog' and 'user' above as namespaces. If How-To-Code could refer to more than one thing, you might need a namespace in front of it to disambiguate.
Some people tend to be pedantic about 'proper REST', but I try to avoid that. I would design your URL scheme so it fits your needs, works well with your tools, and lets you simply paste URLs into a browser to test your code.
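Since you also asked how to write the router for the prefix-free scheme, here is a minimal sketch under my own assumptions (Flask; USERS and BLOG_SLUGS stand in for real lookups). The point is that without a blog/ or user/ prefix, the router has to probe each namespace in a fixed priority order and handle collisions explicitly.

    # Sketch of a prefix-free router. Flask assumed; the data sets are hypothetical.
    from flask import Flask, abort

    app = Flask(__name__)

    USERS = {"alicejohnson"}        # stands in for a user table
    BLOG_SLUGS = {"How-To-Code"}    # stands in for a post table

    @app.route("/<name>/comments")
    def user_comments(name):
        if name not in USERS:
            abort(404)
        return f"comments by {name}"

    @app.route("/<slug>/<int:page>")
    def blog_page(slug, page):
        # No /blog prefix, so usernames and post slugs share one namespace;
        # decide up front which one wins when they collide.
        if slug in USERS or slug not in BLOG_SLUGS:
            abort(404)
        return f"{slug}, page {page}"

With the first scheme (domain.com/blog/..., domain.com/user/...), none of this collision handling is needed, which is the practical argument for keeping the namespaces.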
We all know that showing Googlebot content that doesn't exist for visitors is not allowed and will hurt your search positioning, but what about the other way around: showing visitors content that is not shown to Google's bots?
I need to do this because I have photo pages, each with a short title, the photo, and a textarea containing the embed HTML code. Googlebot is taking the embed code and using it as the page description in its search results, which is very ugly.
Please advise.
When you start playing with tricks like that, you need to consider several things.
... showing visitors content that is not shown to Google's bots.
That approach is a bit tricky.
You can certainly check User-agents to see if a visitor is Googlebot, but Google can add any number of new spiders with different User-agents, which will end up indexing your images anyway. You would have to monitor for new user-agents constantly (see the sketch after these points).
Every release of your website will have to be tested against the "images and Googlebot" scenario, which lengthens the testing phase and raises its cost.
It can also affect future development: every change will have to be made with the "images and Googlebot" scenario in mind, which introduces additional constraints on your system.
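To make that caveat concrete, the fragile check being warned against would look something like this sketch (Flask assumed; the route and markup are hypothetical). A robust Googlebot check would verify the visitor via reverse DNS rather than trusting the User-Agent string.

    # User-agent sniffing, shown only to illustrate why it is fragile.
    from flask import Flask, request

    app = Flask(__name__)

    @app.route("/photo/<photo_id>")
    def photo_page(photo_id):
        ua = request.headers.get("User-Agent", "")
        is_bot = "Googlebot" in ua  # breaks as soon as Google ships a new spider
        embed_box = "" if is_bot else "<textarea>...embed code...</textarea>"
        return f"<h1>Photo {photo_id}</h1><img src='/img/{photo_id}.jpg'>{embed_box}"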
Personally, I would choose a slightly different approach:
First of all, review whether you can use any of the methods recommended by Google. Google provides a few good pages on this problem, e.g. Blocking Google or Block or remove pages using a robots.txt file.
If that is not enough, restructuring your HTML might help. Consider using JavaScript to build some of the customer-facing interface.
And whatever you do, try to keep it as simple as possible; overly complex solutions can turn around and bite you.
It is difficult to give good advice without knowing your system, constraints, and strategy, but I hope this answer helps you choose a good architecture/solution for your system.
Google will not judge you to be cheating just because of this; it looks at intent. As long as your purpose is the user experience rather than the usual cheating tactics, Google will not consider it cheating.
Just block these pages with robots.txt and you'll be fine. It is not cheating; that's why they came up with that solution in the first place.
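For example, assuming (hypothetically) that the photo pages live under /photos/, the robots.txt at your site root would be just:

    User-agent: *
    Disallow: /photos/

Note that this keeps the pages out of compliant crawlers entirely, so only use it if you don't need those pages indexed at all.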
I have found that one of the keywords I would like to rank for in search engines has a domain available that I can register. It is not a good name for the overall project, and so not for the whole site, but it is a good definition or explanation of one part of it. Is it a good idea to register something like this just to point it at one section of the site? I mean, is this effective from an SEO point of view? But most importantly, is it good practice?
Interesting question. In terms of SEO, this is NOT a good practice, and Google can penalize your website (so I'd not recommend it), but...
...if this word is really easy to remember and you think users will type it to reach your site without needing to search for it, it may be "acceptable", because you won't lose those visits.
Anyway, you should avoid black-hat techniques.
Later versions of Google's "Panda" and "Penguin" updates discourage this type of technique: naming your domain to match a specific search query, like www.healthcaremedicine.com, so that when someone searches for "health care medicine" your website is shown at the top.
I think that is what you mean.
In past years people named their websites to match search queries, but now it is not recommended. It may take your site to the top for some time, but that will not last long. Your site has to provide what it promises its visitors; at the very least, they have to want to spend some time on it.
An Exact Match Domain (EMD) is what you are referring to. As answered above, it is not advisable, but if you find it useful you can register it and go ahead.
How to keep yourself clear of penalties:
Do not stuff keywords into titles and descriptions, as your domain already contains the keyword.
When doing off-page SEO, do not use your main keyword as anchor text; instead, use the URL itself as anchor text, plus generic anchor text like "click here", "more info", or "read more".
These steps can save you from a penalty. I still see a lot of EMDs ranking well just by being careful with keyword usage and anchor text.
I plan to provide content/services across multiple (similar and related) subcategories. In general, users will only be interested in the one subcategory related to their needs.
Users will be searching for the term that would be part of the domain, subdomain or URL.
There are three possible strategies:
primary-domain.tld, with subdomains:
keyword-one.primary-domain.tld
keyword-two.primary-domain.tld
primary-domain.tld, with directories:
primary-domain.tld/keyword-one
primary-domain.tld/keyword-two
or each keyword gets its own domain:
keyword-one-foo.tld
keyword-two-foo.tld
From an SEO point of view, which is the best approach to take? I gather that having one overall domain means links to any of the subdomains or directories add weight to the whole site, helping the ranking of each subdomain/directory. However, supposedly a domain, keywords, and title that all match the content nicely would rank highly as well. So I'm unsure of the best approach to take.
The only answer I think anyone could give you here is that you can't know. Modern search engine algorithms are pretty sophisticated, and which marginally different naming methodology is better is impossible to know without inside knowledge.
Even if you did know, it could change in the future. Or perhaps it doesn't enter the equation at all, since it is open to abuse.
99% of the time it comes down to content: originality, quality, and so on.
As long as you provide the best-quality content and make your website SEO-friendly, the domain names don't matter much.
I personally prefer to create several domains and maintain them; when the content grows, you can map it across them, which may help when you think of content delivery networks.
In SEO there are a few techniques that have been flagged as things to avoid at all costs. These were all perfectly acceptable once but are now taboo:
1. Spammy guest blogging: blowing up a page with guest posts is no longer a benefit.
2. Over-optimized anchors: these have become counterproductive; use safe, natural anchors instead.
3. Low-quality links: sites are often flooded with hyperlinks to low-quality Q&A sites; don't do this.
4. Keyword-heavy content: try to avoid cramming keywords in; use longer, well-written sections more liberally.
5. Backlink overuse: backlinks can be a great way to direct readers to your site, but oversaturation will make people feel trapped.
Content, Content, CONTENT! Create worthwhile content that other people will want to link to from their sites.
Google has the best tools for webmasters, but remember that they aren't the only search engine around. You should also look into Bing and Yahoo!'s webmaster tool offerings (here are the tools for Bing; here for Yahoo). Both of them also accept sitemap.xml files, so if you're going to make one for Google, then you may as well submit it elsewhere as well.
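For reference, a minimal sitemap.xml follows the sitemaps.org protocol and looks like this (example.com and the date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2012-01-01</lastmod>
      </url>
    </urlset>

The same file can be submitted to Google, Bing, and Yahoo!.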
Google Analytics is very useful for helping you tweak this sort of thing. It makes it easy to see the effect that your changes are having.
Google and Bing both have very useful SEO blogs. Here is Google's. Here is Bing's. Read through them--they have a lot of useful information.
Meta keywords and meta descriptions may or may not be useful these days. I don't see the harm in including them if they are applicable.
If your page can be reached at more than one URL (e.g., www.mysite.com/default.aspx versus mysite.com/default.aspx versus www.mysite.com/), be aware that this sometimes confuses search engines, and they may penalize you for what they perceive as duplicated content. Use the link rel="canonical" element to help avoid this problem.
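For example (with example.com standing in for your domain), each URL variant would carry the same element in its head:

    <!-- in the <head> of every variant, pointing at the one preferred URL -->
    <link rel="canonical" href="http://www.example.com/" />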
Adjust your site's layout so that the main content comes as early as possible in the HTML source.
Understand and utilize your robots.txt and meta robots tags.
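For instance, a meta robots tag that keeps a page out of the index while still letting crawlers follow its links looks like this:

    <!-- in the <head>: don't index this page, but do follow its links -->
    <meta name="robots" content="noindex, follow">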
When you register your domain name, go ahead and claim it for as long of a period of time as you can. If your domain name registration is set to expire ten years from now rather than one year from now, search engines will take you more seriously.
As you probably know already, having other reputable sites that link to your site is a good thing (as long as those links are legitimate).
I'm sure there are many more tips as well. Good luck!
In addition to having quality content, content should be added or updated regularly. I believe that Google (and likely others) has some bias toward the general "freshness" of content on your site.
Also, try to make sure that the content the crawler sees is as close as possible to what the user will see (this can be tricky for localized pages). If you're careless, your site may be blacklisted for "bait-and-switch" tactics.
Don't implement important text-based sections in Flash: Google will probably not see them, and if it does, it'll screw them up.
Google can index Flash. I don't know how well, but it can. :)
A well organized, easy to navigate, hierarchical site.
There are many SEO practices that work and that people should take into consideration. But fundamentally, I think it's important to remember that Google doesn't necessarily want people to be using SEO. More and more, Google is striving to create a search engine capable of ranking websites based on how good the content is, and solely on that. It wants to be able to recognize good content in ways we can't trick. At the very beginning of search engines, a site that repeated the same keyword 200 times on a page was sure to rank for that keyword, just as a site with any number of backlinks, regardless of the quality or PageRank of the sites they came from, was assured popularity on Google. We're past that now, but SEO is still, in a certain way, tricking a search engine into believing that your site has good content, whether by buying backlinks, or comments, or such things.
I'm not saying that SEO is a bad practice, far from it. But Google is taking more and more measures to make its search results independent of the regular SEO practices we use today. That is why I can't stress this enough: write good content. Content, content, content. Make it unique, make it new, add it as often as you can. A lot of it. That's what matters. Google will always rank a site well if it sees a lot of new content, and even more so if it sees content arriving in other ways, especially through commenting.
Common sense is uncommon. Things that appear obvious to me or you wouldn't be so obvious to someone else.
SEO is the process of effectively creating and promoting valuable content or tools, ensuring either is totally accessible to people and robots (search engine robots).
The SEO process includes and is far from being limited to such uncommon sense principles as:
Improving page load time (through minification, including a trailing slash in URLs, eliminating unnecessary code or db calls, etc.)
Canonicalization and redirection of broken links (organizing information and ensuring people/robots find what they're looking for; see the redirect sketch after this list)
Coherent, semantic use of language (from inclusion and emphasis of targeted keywords where they semantically make sense [and earn a rankings boost from SE's] all the way through semantic permalink architecture)
Mining search data to determine what people are going to be searching for before they do, and preparing awesome tools/content to serve their needs
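As a minimal sketch of the redirection point above (Flask assumed; both paths are hypothetical), a retired or duplicate URL should send a permanent 301 redirect to its canonical counterpart, so that people and robots both end up in the right place:

    # 301-redirect a retired URL to its canonical replacement.
    from flask import Flask, redirect

    app = Flask(__name__)

    @app.route("/old-article")
    def old_article():
        # 301 signals a permanent move, so search engines transfer the ranking
        return redirect("/new-article", code=301)

    @app.route("/new-article")
    def new_article():
        return "The content lives here now."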
SEO matters when you want your content to be found/accessed by people -- especially for topics/industries where many players compete for attention.
SEO does not matter if you do not want your content to be found/accessed, and there are times when SEO is inappropriate. Motives for not wanting your content found -- the only instances when SEO doesn't matter -- might vary, and include:
Privacy
When you want to hide content from the general public for some reason, you have no incentive to optimize a site for search engines.
Exclusivity
If you're offering something you don't want the general public to have, you need not necessarily optimize that.
Security
For example, say you're an SEO looking to improve your domain's page load time, so you serve static content through a cookieless domain. Although the cookieless domain is used to improve the SEO of another domain, the cookieless domain itself need not be optimized for search engines.
Testing In Isolation
Let's say you want to measure how many people link to a site within a year which is completely promoted with AdWords, and through no other medium.
When One's Business Doesn't Rely On The Web For Traffic (And Doesn't Want To)
Many local businesses, and businesses that rely on point-of-sale traffic or earn their traffic through some mechanism other than digital marketing, may not even consider optimizing their site for search engines, because they've already optimized it for some other system, such as people walking down a street after emptying out of bars or an amusement park.
When Competing Differently In A Saturated Market
Let's say you want to market entirely through social media, or internet cred & reputation here on SE. In such instances, you don't have to worry much about SEO.
Be genuine and build for users, not for robots, and you will reach success!
Thanks!