Meta tags don't work [closed] - seo

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about programming within the scope defined in the help center.
Closed 8 years ago.
I created a simple website for one of my clients. I added meta tags in order to end up high in Google searches. However, if I enter the name of the site or some meta keywords, Google doesn't find my website.
The critical keywords that I want to be found: "Orquidea", "schoonheidssalon Westerlo", "Westerlo", "schoonheidssalon"
I uploaded the meta tags a week ago. I think that should be long enough for them to be scanned and recognized by Google, right?
Does anyone have a solution?
Here's the URL: http://www.orquidea.be

Although the question is off-topic, I'll still answer it!
FACT: Meta keywords have no effect on SEO. Instead, you should focus on generating quality content and getting backlinks to appropriate webpages.
Make sure to use only meta keywords that match the content of your client's website. And yes, don't worry about anything else.

Sure. Nowadays the meta keywords tag has no effect on search engine rankings; it is simply not considered when ranking your website. It was taken into account in the past, until people started stuffing it with keywords.
That made it hard for Google to validate websites, so they moved to relying only on meta descriptions, titles, and page content. So always be careful when writing your meta descriptions, titles, and content. Don't stuff your content with your target keywords. Write simple content that is readable and understandable, and your website will be indexed in the right place for the right keyword.
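As a rough sanity check on titles and descriptions, here is a minimal sketch using Python's standard `html.parser`. The ~60/~160 character thresholds are common rules of thumb for what fits in a search snippet, not documented Google limits:

```python
from html.parser import HTMLParser

class HeadAudit(HTMLParser):
    """Collects the <title> text and the description meta tag from a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def audit(html):
    """Return a list of likely problems with the page's title/description."""
    parser = HeadAudit()
    parser.feed(html)
    problems = []
    if not parser.title.strip():
        problems.append("missing <title>")
    elif len(parser.title) > 60:  # ~60 chars: common rule of thumb
        problems.append("title may be truncated in results")
    if parser.description is None:
        problems.append("missing meta description")
    elif len(parser.description) > 160:  # ~160 chars: common rule of thumb
        problems.append("description may be truncated")
    return problems
```

Running `audit()` over each page of the site is a quick way to catch missing or overlong titles and descriptions before worrying about anything else.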

Related

Meta keywords and google [closed]

I recall an article (by a Google employee) saying that keywords are now obsolete for SEO. This may be true, but is it possible that meta keywords determine the relevancy of AdSense ads? In other words, should meta keywords be ignored or used?
No, ignore the keyword meta tag.
Neither the Google crawler nor Google AdSense uses meta keywords, because they are completely useless. Create good content, use headers and structured content, and you will be fine.
"At least for Google's web search results currently (September 2009), the answer is no. Google doesn't use the "keywords" meta tag in our web search ranking."
Read this: Google does not use the keywords meta tag in web ranking
The Google employee you are thinking of is Matt Cutts; he said Google doesn't factor meta keywords into its search algorithm. You can completely ignore them.
Source: 7 Common SEO Mistakes To Avoid
codecrater.com/blog/7-common-seo-mistakes-avoid/
Even though Yahoo and Bing use this tag as a minor ranking signal, it's better to ignore the meta keywords tag.
Meta keywords don't have any impact on the SEO of a website or blog, but I recommend using them, as there is no disadvantage in doing so. I use meta keywords on every post published on my blog, and those posts still generate traffic.
Secondly, Google has announced that it will not use meta keywords or meta tags to rank a blog in search results, but it has never said that using them is forbidden.

How to help search engines to find all the pages on my website [closed]

I'm currently programming a website that gives information about food products.
The way the website works is that there's a search engine: users search for the product they want to know about, the website shows all the products they may want to see, and every product has its own page with all the information about it.
So my question is: how will search engines like Google be able to find all the product pages?
Search engines use many different ways to find new pages. Most commonly their web crawlers follow (external as well as internal) hyperlinks.
While a typical informational website links to all available pages in its site-wide navigation (so web crawlers can reach all pages by following internal links), other websites don’t necessarily link to all their pages (maybe because you can only reach them via forms, or because it doesn’t make sense for them to provide all links, etc.).
To allow discovery/crawling of new pages of these sites, too, they can provide a site map. This is essentially just a page linking to all existing pages, but often with structured metadata that can help search engines.
So just make sure that all your pages are linked somehow. Either via "natural" internal links on your site, or by providing a sitemap (ideally following the sitemaps.org protocol), or both.
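A minimal sketch of generating such a sitemap with Python's standard library, following the sitemaps.org protocol (the example.com URLs are placeholders):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a sitemaps.org-compliant XML document listing the given URLs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode", xml_declaration=True)

# Example: list every product page so crawlers can discover them all,
# even the ones only reachable through the search form.
xml = build_sitemap([
    "https://example.com/",
    "https://example.com/products/apple",
    "https://example.com/products/banana",
])
```

In practice you would generate the URL list from your product database and serve the result at a fixed path like `/sitemap.xml`.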
For questions about SEO advice (which is off-topic here on SO), see our sister site https://webmasters.stackexchange.com/.
Please add a sitemap to your site so Google can crawl all pages easily and index them properly. An XML sitemap is the usual format.
Beyond that, your website needs general SEO work.

SEO: things to consider/implement for your website's content [closed]

Let's say I have a website that I am developing.
The site may have wallpapers, questions & answers, info (e.g. IMDb, Wikipedia, etcetera).
What do I need to do so that when a search engine analyzes a particular page of my website for some term, let's say 'XYZ', it finds the 'XYZ' content if it is present on that page?
Please pardon my non-techy jargon; I am new to this.
The most important tips in SEO revolve around what not to do:
Keep Java and Flash as minimal as possible; web crawlers can't parse them. JavaScript can accomplish the vast majority of Flash-like animations, but it's generally best to avoid them altogether.
Avoid using images to replace text or headings. Remember that any text in images won't be parsed. If necessary, there are SEO-friendly ways of replacing text with images, but any time you have text not visible to the user, you risk the crawler thinking you're trying to cheat the system.
Don't try to be too clever. The best way to optimize your search results is to have quality content which engages your audience. Be wary of anyone who claims they can improve your results artificially; Google is usually smarter than they are.
Search engines (like Google) usually use the content of your <h1> tags to work out what your page is about, and they judge how relevant your page is to that topic partly by the number of sites that link to it.
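As an illustration of what a crawler can actually read, here is a minimal sketch using Python's standard `html.parser` to pull out the <h1> text (a crawler sees only text in the markup, never text baked into images or Flash):

```python
from html.parser import HTMLParser

class HeadingExtractor(HTMLParser):
    """Collects the text inside <h1>...</h1> tags, as a crawler might."""
    def __init__(self):
        super().__init__()
        self.depth = 0
        self.headings = []

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.depth += 1
            self.headings.append("")

    def handle_endtag(self, tag):
        if tag == "h1" and self.depth:
            self.depth -= 1

    def handle_data(self, data):
        if self.depth:
            # Accumulate text only while inside an <h1>.
            self.headings[-1] += data

def h1_texts(html):
    p = HeadingExtractor()
    p.feed(html)
    return [h.strip() for h in p.headings]
```

If `h1_texts()` returns nothing for a page, a crawler has no heading text to work with either, which is exactly the situation the image/Flash warnings above describe.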

How to get Google Sitelinks on a website? [closed]

There are a lot of websites that look professional in Google results. Try searching for 'stackoverflow' and you'll see at the top a result with a title, a description and a table of 8 links to stackoverflow categories. That's what I'm interested in producing for future websites.
So what must be done? Does it depend on the number of visitors? How long does it take until the results start looking like that?
I think you are referring to "sitelinks". Google generally does not make it public exactly how those are created (to prevent abuse, for example). I suspect you need the subpages to be very strongly linked, perhaps about the same amount or more than the top-level page. No way to know for sure. The best way to get your website looking good in Google is to make it as user-friendly and human-friendly as possible. I think Google typically looks for clues as to whether the website will be relevant to humans and very likely penalizes content that detracts from the interface just to become search-engine optimized.
Make sure that each page (not just your home page) has a title.
Include description meta information, which search engines may (or may not) use for snippets to display.
If an unordered list (<ul><li><a href="http://..">Home...) is used for navigation on the page, Google will pick that up and display it underneath the page listing when it is the #1 or #2 position listing.
Google may also use the description meta, or the first few lines of text that appear on the page, underneath the entry. It usually does this for searches in the other positions.

SEO blacklisting for cloaking [closed]

I am using postbacks to perform paging on a large amount of data. Since I did not have a sitemap for google to read, there will be products that google will never know about due to the fact that google does not push any buttons.
I am doing cloaking to spit out all the products with no paging if the user-agent is that of a search engine. There may be some work arounds for situations like this which include hidden buttons to paged urls.
What about information you want indexed by Google when you want to charge for the content? Imagine that I have articles that I want users to be able to find in Google, but when a user visits the page, only half the content is displayed and they have to pay for the rest.
I have heard that google may blacklist you for cloaking. I am not being evil, just helpful. Does google recognize the intention?
Here is a FAQ by Google on that topic. I suggest using CSS to hide some content: for example, give plain links to your products as an alternative to your buttons and use display:none; on them. The layout stays intact and the search engines will find your pages. Most search engines will not detect cloaking and similar techniques, but competitors may report you. Either way: don't risk it. Use sitemaps, use RSS feeds, use XML documents or even PDF files with links to offer your whole range of products. Good luck!
This is why Google supports a sitemap protocol. The sitemap file needs to render as XML, but can certainly be a code-generated file, so you can produce on-demand from the database. And then point to it from your robots.txt file, as well as telling Google about it explicitly from your Google Webmaster Console area.
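As a rough sketch of the robots.txt side of this (paths and URLs are placeholders), the pointer is a single `Sitemap:` directive, which you can render alongside your code-generated sitemap:

```python
def robots_txt(sitemap_url, disallow=()):
    """Render a robots.txt that advertises the sitemap location to crawlers."""
    lines = ["User-agent: *"]
    lines += [f"Disallow: {path}" for path in disallow]
    lines.append(f"Sitemap: {sitemap_url}")
    return "\n".join(lines) + "\n"

# Example: block a hypothetical admin area and point crawlers
# at the dynamically generated sitemap.
out = robots_txt("https://example.com/sitemap.xml", disallow=["/admin/"])
```

Because both files are just text served from fixed paths, producing them on demand from the database keeps them in sync with the actual product catalogue.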
Highly doubtful. If you are serving different content based on IP address or User-Agent from the same URL, it's cloaking, regardless of the intentions. How would a spider parse two sets of content and figure out the "intent"?
There is intense disagreement over whether "good" cloakers are even helping the user anyway.
Why not just add a sitemap?
I don't think G will recognize your intent, unfortunately. Have you considered creating a sitemap dynamically? http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=40318