SEO: things to consider/implement for your website's content [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Closed 11 years ago.
Let's say I have a website that I am developing. The site may have wallpapers, questions & answers, and info pages (like IMDb, Wikipedia, etc.).
What do I need to do so that when a search engine analyzes a particular page of my website for a particular term, say 'XYZ', it finds the 'XYZ' content if it is present on that page?
Please pardon my non-techy jargon; I am new to this.

The most important tips in SEO revolve around what not to do:
Keep Java applets and Flash to a minimum, since web crawlers can't parse them. JavaScript can accomplish the vast majority of Flash-like animations, but it's generally best to avoid such animations altogether.
Avoid using images to replace text or headings. Remember that any text inside an image won't be parsed. If necessary, there are SEO-friendly ways of replacing text with images, but any time you have text that is not visible to the user, you risk the crawler thinking you're trying to cheat the system.
Don't try to be too clever. The best way to optimize your search results is to have quality content that engages your audience. Be wary of anyone who claims they can improve your results artificially; Google is usually smarter than they are.

Search engines (like Google) typically use the page title and heading tags such as <h1> to work out what your page is about, and they estimate how relevant and authoritative the page is partly from the number and quality of sites that link to it.
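As a sketch, a page about 'XYZ' might expose that topic through its title, headings, and meta description so crawlers can parse it as plain text (the content below is purely illustrative; the elements are standard HTML):

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <title>XYZ wallpapers, questions and answers</title>
  <!-- Often shown in search result snippets -->
  <meta name="description" content="Wallpapers, Q&amp;A and background info about XYZ.">
</head>
<body>
  <!-- One main heading that names the page's topic -->
  <h1>XYZ</h1>
  <h2>XYZ wallpapers</h2>
  <p>Descriptive body text about XYZ that crawlers can read as plain text,
     rather than text baked into images or Flash.</p>
</body>
</html>
```

The point is simply that the topic appears as real, visible text in the places crawlers weight most heavily: title, headings, and body copy.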


How to ensure search engines index specific version of site [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about programming within the scope defined in the help center.
Closed 9 years ago.
I am just starting work on my first responsive site, and I am realizing that some SEO elements that fit on the front page of the full-size site may not fit 'naturally' on the front page of the mobile version.
Is there any way to ensure search engines see the full-size site?
One complicating matter is that I am designing the site 'mobile first', so the site does not default to full size; it defaults to mobile sizes.
Assuming you deliver the same content to the end user regardless of device, and just show/hide or reformat based upon a media query, it really doesn’t matter. Google will still get the full content of the page so will index all of your content. What is visible in the viewport isn’t really significant to Google.
Google will, however, understand the use of media queries and give you some additional SEO benefits as a result. Google favours responsive design over separate sites/mobile specific pages. Responsive design also helps improve the indexing efficiency of the Googlebot.
One thing they do advise is not to block crawler access to any of your 'external' resources (CSS, JS, images, etc.).
Plenty of good information here
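To make this concrete: with a mobile-first responsive design, the same HTML is served to every device and a media query restyles it for larger viewports, so the crawler always fetches the full content. A minimal sketch (the class name and file path are hypothetical):

```css
/* styles.css — mobile-first: base rules target small screens */
.sidebar {
  display: none; /* hidden in the mobile layout, but still in the HTML */
}

/* Larger viewports get the full-size layout */
@media (min-width: 768px) {
  .sidebar {
    display: block;
  }
}
```

Because the sidebar content is present in the HTML regardless of viewport, Google indexes it either way. The corollary of the advice above is to make sure your robots.txt contains no rule like `Disallow: /css/` or `Disallow: /js/` that would stop Googlebot from fetching the stylesheet and recognising the responsive design.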

Search engine page creating [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about programming within the scope defined in the help center.
Closed 9 years ago.
I noted that Google uses web page content when it indexes pages for SEO purposes. Therefore, what I did was create web pages with a lot of keywords on them, and then I set those keywords to the background color so they are not visible to users.
The question is: do they block this kind of page?
To answer the question directly: yes, text hidden by matching it to the background colour is exactly what Google's guidelines call hidden text, and it can get a page penalised or removed from the index.
More generally, Search Engine Optimization (SEO) is something you really need an expert for these days. The days when a few keywords and some meta-data were enough are long gone, so you need to keep up to date with current SEO practices to move your site up the Google rankings. You can also check the Alexa rankings for your website.
Take a look at the SEO guidelines from Google here.
Take a look at some pointers here and here, but you really need to invest some time and research into the best practices.
You should also make your site as accessible as possible; this will make the site easier to spider. There are some tools here to look at, and there's a site here you can use.
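As one small example of accessibility that also helps crawlers, give every meaningful image a text alternative and keep headings as real text (the file name and wording below are illustrative):

```html
<!-- An image with a text alternative: readable by screen readers and crawlers -->
<img src="logo.png" alt="Acme Widgets company logo">

<!-- A heading as actual text, not an image of text -->
<h2>Our products</h2>
```

Unlike the hidden-keyword trick, `alt` text is a legitimate place for descriptive wording, because it serves users who cannot see the image rather than attempting to deceive the crawler.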

SEO with similar pages [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Closed 9 years ago.
Our company has created a "comparison" tool that uses unique urls to choose who you want to compare, example:
http://www.sportingcharts.com/nhl/2010-edmonton-oilers/vs/2008-calgary-flames/
http://www.sportingcharts.com/nhl/1993-carolina-hurricanes/vs/2008-dallas-stars/
Does anyone know if this is a recommended SEO strategy, or is it better to use query string parameters instead of completely different URLs? One advantage I was thinking of is that this could capture long-tail searches such as "2010 Edmonton Oilers vs 1995 Calgary Flames", but having this many URLs might also hurt the general SEO of these pages.
Does anyone have any experience in creating pages like this? What is the recommended strategy?
The style of URL is not going to matter much to search engines.
From a search engine perspective they are going to care more that:
You have 30 teams and 24 seasons. You are creating 30*24*30*24 = over 500,000 pages.
Each page has very little content. It's just two team names and some numerical stats.
The content that you do have is heavily duplicated across pages.
The search volume for your targeted keywords is going to be very low. Very few people search for two team names with two different years.
If I ran a search engine, I would not want to have my crawlers waste time crawling that site. I wouldn't want the pages in the index.
I expect that your site will suffer from "thin content", "duplicate content", and "excessive pages" issues because of this section.
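If you keep the comparison pages for users, one common mitigation is to tell crawlers not to index the low-value combination pages, for example with a robots meta tag (illustrative markup):

```html
<!-- On a thin comparison page: keep it available to users,
     but ask search engines not to index it (links are still followed) -->
<meta name="robots" content="noindex, follow">
```

You could apply this to all combinations except, say, a curated set of popular matchups, which keeps the long-tail feature for users without flooding the index with hundreds of thousands of near-duplicate pages.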

Is there any value in embedding hyperlinks in a PDF? [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Closed 9 years ago.
My company has given me a PDF that's already been mastered, packaged, etc. but of course the one thing they didn't do is add a linking hotspot over the 2" square ad space that they've set out for themselves. I have a hard time imagining people click links in PDFs with any regularity but can't find any knowledge to back that up. Are there any benefits to search? As of now the only place this linked version will exist is on the site that the ad itself links to.
Thanks.
I haven't seen anything definitive about whether search engines parse PDFs for links, or whether such links help SEO. But Google does read and index PDFs, and some Flash as well, so those links are probably being seen. If I were to speculate, I would say those links do have some SEO value, and all of the usual rules would apply to them, such as anchor text, link popularity (probably accruing to the PDF document), etc.
Most people seem to regard PDF as "Printable Document File" rather than "Portable Document Format". If the document has the website address on it, that is sufficient.

Improve dictionary's internal linking structure [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Closed 9 years ago.
What I want to achieve:
I have an online dictionary which works quite fine - but the crawling by search engines (especially Google) could be better.
So I would like to improve the internal linking structure on my website so that Google can easily find (almost) all pages of the dictionary.
What I know so far:
The number of internal links per page should not exceed about 100; search engines don't like pages containing masses of links, as it looks spammy. Also, a website should be designed for its users, not for search engines, so usability must not suffer from this optimization; in the best case it even increases.
My ideas for improving the internal linking structure so far:
on each dictionary entry page: link 25 similar words which could be mixed up
create an index: list of all dictionary entries (75 per page)
...
Can you help me to optimize the linking structure?
Thank you very much in advance!
You could link to synonyms and antonyms, which would be both user-friendly and crawler-friendly. But I think the biggest thing you could do to improve crawling, particularly by Google, would be to add a sitemap:
Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a Sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs in the site) so that search engines can more intelligently crawl the site.
Google has lots of information on Sitemaps and how to generate them on their webmaster help pages.
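In its simplest form, a sitemap for a dictionary could look like the fragment below. The URLs and dates are placeholders; the format is the standard sitemaps.org XML schema:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/dictionary/aardvark</loc>
    <lastmod>2014-01-15</lastmod>
    <changefreq>monthly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/dictionary/abacus</loc>
  </url>
</urlset>
```

A single sitemap file can list up to 50,000 URLs; larger dictionaries can split entries across several files referenced from a sitemap index. You then point Google at it via Webmaster Tools or a `Sitemap:` line in robots.txt. This lets the crawler discover every entry directly, independent of how deep it sits in your internal linking structure.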