How search-engine friendly (SEF) do URLs have to be? [closed]

I'm really attracted to the webservice.js module. I'd like to use it as a real webserver, using only an HTML page with JS calling the webservice to retrieve the data.
The problem I see is with search engines, as I want my website to be search-engine optimized.
So I thought I could fall back to plain HTML when JS is not enabled (when the URL is requested directly, webservice.js sends back the data as plain HTML). For this, the links will be displayed in the HTML markup on the front page.
The question is: how search-engine friendly do the URLs have to be?
I mean, the webservice will allow me to have URLs of this kind: http://domain.com/content?get=title-uri-encoded.
Is it search-engine friendly? I know having http://domain.com/content/title-uri-encoded would be better, but is the kind I'm thinking of still friendly?
PS: I'm not sure whether this post belongs to SO or Programmers.se...

You probably want to look into progressive enhancement techniques or Google's proposed AJAX crawling scheme.
You may end up with a URL structure like this:
AJAX-enabled public version:
http://domain.com/content#!get=title-uri-encoded
Search-engine version (plain HTML):
http://domain.com/content?_escaped_fragment_=get=title-uri-encoded
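For reference, here is a minimal sketch of how a server might answer both kinds of URL under that scheme, assuming a Node.js server built with Express rather than the webservice.js module from the question. The /content route, the lookupContent() helper and the data shape are made up for illustration, and Google has since deprecated this AJAX crawling scheme.

// Crawler requests arrive as ?_escaped_fragment_=get=title-uri-encoded;
// regular visitors use #!get=title-uri-encoded on the client side.
const express = require('express');
const app = express();

// Hypothetical helper: look up the content identified by the slug.
function lookupContent(slug) {
  return { title: slug, body: 'Plain-HTML body for ' + slug };
}

app.get('/content', (req, res) => {
  const fragment = req.query._escaped_fragment_;
  if (fragment) {
    // Serve the same content as static HTML so the crawler can index it.
    const slug = decodeURIComponent(fragment.replace(/^get=/, ''));
    const page = lookupContent(slug);
    return res.send('<h1>' + page.title + '</h1><p>' + page.body + '</p>');
  }
  // Normal visitors get the JS shell, which reads location.hash (#!get=...)
  // and fetches the data from the web service on the client side.
  res.send('<!doctype html><div id="app"></div><script src="/app.js"></script>');
});

app.listen(3000);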

Related

GET vs POST in SEO [closed]

My web application retrieves a page for every request generated by a form submission. That form submits to the same URL as the page.
Each time, the page loads with a different title tag. Does this indicate different pages with the same URL?
How does this affect SEO, and how can I manage this situation?
Edit
This question is not purely SEO-related, nor does it require SEO-specific reasoning or answers; it can also be explained technically, in terms of how search engine robots work. If it still seems off-topic to moderators, I ask them to explain why.
Try using a rewrite rule to map each result to a unique URL. If you always load the same page, Google (or other search engines) will only index that single page.
http://www.seomoz.org/img/upload/anatomy-of-a-url.jpg
In addition to loading the page each time with a different title tag, you need to append some unique text to the URL, such as your GET variable data, as sketched below.
To get crawled by spiders, don't forget to submit a sitemap with the relevant URLs to the search engines.
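As a rough illustration of the unique-URL point, here is a minimal Post/Redirect/Get sketch, assuming a Node.js app built with Express; the /search route and the q parameter are hypothetical.

const express = require('express');
const app = express();
app.use(express.urlencoded({ extended: false })); // parse form bodies

// The form POSTs here; redirect to a unique, bookmarkable GET URL.
app.post('/search', (req, res) => {
  const query = encodeURIComponent(req.body.q || '');
  res.redirect('/search?q=' + query);
});

// Each /search?q=... URL is a distinct page with its own <title>,
// so search engines can index every result separately.
app.get('/search', (req, res) => {
  const q = req.query.q || '';
  res.send('<!doctype html><title>Results for ' + q + '</title><h1>Results for ' + q + '</h1>');
});

app.listen(3000);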

Is serving a bot-friendly page to Googlebot likely to adversely affect SEO? [closed]

I have a site whose homepage contains a great deal of JavaScript. I am conscious that this isn't great for mobile clients, JavaScript-less browsers and crawlers/bots. The page uses proper <noscript /> alternatives, alt attributes, etc.
The user agent can easily be sniffed to serve up the page content without JavaScript (there is a non-JavaScript version of the content already on the site), but I don't want to be seen as cheating by crawlers (Googlebot).
Humans using mobile clients and JavaScript-less browsers would surely appreciate a tailored version (given an option to switch back to the full version if they want). Bots might think they're being cheated.
Finally, the site has been indexed very well so far, so I am tempted not to tailor it for Googlebot, just for humans using mobile clients and JavaScript-less browsers. That seems like the safer option.
If you serve different content to search engines than you do to your users, you are cloaking and definitely in violation of Google's terms of service.
The proper way to handle content generated with JavaScript is to use progressive enhancement. This means that all of your content is available without JavaScript being required to fetch or display it. Then you enhance that content using JavaScript. This way everyone has access to the same content, but users with JavaScript get a better experience. This is good usability and good for SEO.
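As a rough illustration of that pattern, here is a minimal client-side sketch: the server-rendered HTML already carries the content for crawlers and no-JS browsers, and JavaScript only enhances it. The #comments element and the /comments.json endpoint are assumptions made up for this example.

// Progressive enhancement: the baseline HTML already shows the content;
// JavaScript refreshes it in place when available.
document.addEventListener('DOMContentLoaded', () => {
  const list = document.querySelector('#comments');
  if (!list) return; // nothing to enhance on this page

  fetch('/comments.json')
    .then((response) => response.json())
    .then((comments) => {
      // Replace the server-rendered list with the freshest data.
      list.innerHTML = comments.map((c) => '<li>' + c.text + '</li>').join('');
    })
    .catch(() => {
      // On failure, the server-rendered HTML stays visible unchanged.
    });
});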

Bot-resistant website [closed]

I'm building a website with lots of images and want to stop bots from accessing those images. So I'm looking for something beyond cookies, since bots can handle cookies. My idea is that all authentication should reside purely on the server side. Any ideas?
Someone suggested a website that makes a user visit a thumbnail page first. Somehow, visiting that page sets a server-side variable, which allows the main image to be displayed later. How can that be implemented?
http://www.google.com/search?hl=en&q=Robots.txt
That will stop some bots from spidering images on your site.
Also look into a .htaccess file:
http://www.google.com/search?hl=en&q=.htaccess
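As for the thumbnail-page idea in the question, here is a minimal sketch of a server-side gate, assuming a Node.js app using Express and express-session; the routes, the session flag name, and the image directory are hypothetical. Note that express-session still identifies the visitor with a session cookie, so a bot that handles cookies and visits the thumbnail page first would pass this check too.

const path = require('path');
const express = require('express');
const session = require('express-session');

const app = express();
app.use(session({ secret: 'replace-me', resave: false, saveUninitialized: false }));

// Visiting the thumbnail page sets a server-side flag for this visitor.
app.get('/thumbnails', (req, res) => {
  req.session.sawThumbnails = true;
  res.send('<a href="/images/full/photo.jpg">View full image</a>');
});

// Full-size images are only served if the flag was set first.
app.get('/images/full/:name', (req, res) => {
  if (!req.session.sawThumbnails) {
    return res.status(403).send('Please browse the thumbnail page first.');
  }
  // path.basename() keeps the lookup inside the private-images directory.
  res.sendFile(path.join(__dirname, 'private-images', path.basename(req.params.name)));
});

app.listen(3000);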

SEO: things to consider/implement for your website's content [closed]

Let's say I have a website that I am developing...
The site may have wallpapers, questions & answers, info (e.g. IMDb, Wikipedia, etc.).
What do I need to do so that when a search engine analyzes a particular page of my website for a particular term, let's say 'XYZ', it finds the 'XYZ' content if it is present on that page?
Please pardon my non-techy jargon; I am new to this...
The most important tips in SEO revolve around what not to do:
Keep Java and Flash as minimal as possible; web crawlers can't parse them. JavaScript can accomplish the vast majority of Flash-like animations, but it's generally best to avoid them altogether.
Avoid using images to replace text or headings. Remember that any text in images won't be parsed. If necessary, there are SEO-friendly ways of replacing text with images, but any time you have text not visible to the user, you risk the crawler thinking you're trying to cheat the system.
Don't try to be too clever. The best way to optimize your search results is to have quality content which engages your audience. Be wary of anyone who claims they can improve your results artificially; Google is usually smarter than they are.
Search engines (like Google) usually use the content of <h1> tags to work out what your page is about, and judge how relevant your page is to that content partly by the number of sites that link to your page.

What support does Google search have for HTML 5? [closed]

I'm wondering if Google search is aware of tags such as <nav>, <aside>, <section>, etc. that are being added by HTML5?
My navigation comes before my content, and I have too many links in it for good SEO. I'd like to use <nav> if Google recognised it, rather than using JS or a CSS workaround.
Thanks,
Denis
You can use HTML5 tags even now; see Mads Kjaer's article. Don't wait for Google to be ready, use them now!
But until Google recognises those tags, stick to current SEO rules: move your content to the beginning of the code and navigation to the end!
Probably nothing yet, since I don't believe any of those tags are implemented right now, and HTML5 is far from being a standard yet.
Has this now changed? Is it safe to add HTML5 to the start of the page and not impact SEO? Is Google 'aware' of these tags?