Which is better for SEO: a div with its style set to hidden, or a title attribute for a jQuery tooltip?

Wondering what would be better for SEO and for a spider: a div with its style set to hidden, or a title attribute for a jQuery tooltip?
I am hesitant to use a div with its style set to hidden, because the Google spider might discard these kinds of divs and their content.
Thoughts?
-- Another note: another option could be to keep each tooltip-content div in the markup and hide them with jQuery on page load, something like the sketch below.
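A minimal sketch of that hide-on-load option (the .tooltip-content class is a hypothetical name for the tooltip divs):

$(function () {
  // Hide every tooltip div once the DOM is ready; the content stays in the markup.
  $(".tooltip-content").hide();
});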

SEO is quite an elusive subject, as you may know. You cannot draw a conclusion simply by asserting that hidden content is bad for SEO while content that is informative to users is good for SEO, even though that is roughly what Google claims. One simple example is images: a good image is more informative than several lines of words, but Google simply cannot read it.
My experience of SEO is that when you do not know whether something is good or bad for SEO, you need to consider at least two things. One, whether Google is smart enough to index it. Two, whether it is good for your end users.
Hiding one div is not so horrible SEO-wise. I did it for a long time (almost two years) and I have not experienced any negative result. I added a "learn more" button, though, just FYI. I think that if you give your users an option to see the whole content should you decide to hide it, then it is not a big deal. A caveat, however: this is just my opinion backed by my past experience, and I cannot guarantee it will hold true in the future.
In terms of the jQuery tooltip, I am not sure, since I barely use it. However, it is widely held that Google still does not crawl JavaScript, so you had better give jQuery a second thought. If possible, Ajax is a good choice, though that is another story. Hope this helps.

Title attributes are a useful way of adding information about a link for the user. Therefore, they are also good for SEO, because Google likes pages that are informative to users.
Information hidden in divs will probably still be picked up by Google, but might cost you penalty points, as hiding content is considered bad form in SEO.
So to answer your question, I'd go for the title attribute.
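For illustration, a minimal sketch of the title-attribute approach, assuming jQuery UI's tooltip widget (which reads the title attribute by default; the URL and text here are made up):

<a href="/widgets" title="Our full range of widgets, explained">Widgets</a>
<script>
$(function () {
  // jQuery UI turns title attributes into tooltips; crawlers still see plain markup.
  $(document).tooltip();
});
</script>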

Related

Hiding part of a page from Google, does it hurt SEO?

We all know that showing nonexistent content to Google's bots is not allowed and will hurt your search positioning, but what about the other way around: showing content to visitors that is not displayed to Google's bots?
I need to do this because I have photo pages, each with a short title and the photo, along with a textarea containing the embed HTML code. Googlebot is taking the embed code and using it as the page description in its search results, which is very ugly.
Please advise.
When you start playing with tricks like that, you need to consider several things.
... showing stuff to visitors that are not displayed for Google bots.
That approach is a bit tricky.
You can certainly check User-Agents to see whether a visitor is Googlebot, but Google can add any number of new spiders with different User-Agents, which will end up indexing your images anyway. You will have to monitor that constantly.
Testing of each code release will have to cover the "images and Googlebot" scenario. That will extend the testing phase and the testing cost.
It can also affect future development: all changes will have to be made with the "images and Googlebot" scenario in mind, which can introduce additional constraints to your system.
Personally, I would choose a slightly different approach:
First of all, review whether you can use any of the methods recommended by Google. Google provides a few helpful pages describing this problem, e.g. Blocking Google or Block or remove pages using a robots.txt file.
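For instance, a minimal robots.txt sketch along those lines (the /photos/embed/ path is a hypothetical location for the pages carrying the embed code):

User-agent: *
Disallow: /photos/embed/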
If that is not enough, maybe restructuring your HTML would help. Consider using JavaScript to build some of the customer-facing interfaces.
And whatever you do, try to keep it as simple as possible; otherwise a very complex solution can turn around and bite you.
It is very difficult to give good advice without knowledge of your system, constraints, and strategy, but I hope my answer helps you choose a good architecture / solution for your system.
Google does not judge you a cheater on the basis of one signal alone; it reviews your intent. As long as your purpose is the user experience, and you stay away from the common cheating tactics, Google will not consider it cheating.
Just block these pages with robots.txt and you'll be fine. It is not cheating; that's why they came up with a solution like that in the first place.

Keywords and domain

I have found that one of the keywords I would like to be found for in search engines has a matching domain I can register. It is not a good name for the overall project, and so not for the whole site, but it is a good definition or explanation of one part of it. Is it a good idea to register something like this just to point it at a section of the site? I mean, is this effective from an SEO point of view? But most importantly, is it good practice?
Interesting question. In terms of SEO this is NOT a good practice, and Google can punish your website (so I'd not recommend it), but...
...if this word is really easy to remember and you think users will type it to reach your site without needing to search for it, it may be "acceptable", because you won't lose visits.
Anyway, you should avoid black-hat techniques.
The later versions of Google's "Panda" and "Penguin" updates discourage this type of technique: naming your domain to match a specific search, like www.healthcaremedicine.com, so that if someone searches for health care medicine products, your website is shown at the top.
I think that is what you mean.
In past years people named their websites to match likely searches, but now it is not recommended. It may take your site to the top for a while, but that will not last long. Your site has to provide what it promises its visitors; at the least, they have to spend some time on it.
An Exact Match Domain (EMD) is what you are referring to. As answered above, it is not advisable, but if you find it useful you can register it and go ahead.
How to keep clear of penalties:
Do not stuff keywords into your titles and descriptions, as your domain already contains the keyword.
When doing off-page SEO, do not use your main keyword as anchor text; instead use the URL itself as anchor text, plus some generic anchor text like "click here", "more info", "read more".
These steps can save you from a penalty. I still see a lot of EMDs ranking, simply by being careful with the usage of keywords and anchor text.

Would lazy-loading img src negatively impact SEO?

I'm working on a shopping site. We display 40 images in our results. We're looking to reduce the onload time of our page, and since images block the onload event, I'm considering lazy loading them by initially setting img.src="" and then setting it after onload. Note that this is not Ajax loading of HTML fragments; the image HTML, along with the alt text, is present. It's just that the image src is deferred.
Does anyone have any idea whether this may harm SEO or land us in a Google penalty box, now that they are measuring site speed?
Images don't block anything; they are already loaded lazily. The onload event notifies you that all of the content has been downloaded, including images, but that is long after the document is ready.
It might hurt your ranking because of the lost keywords and empty src attributes. You'll probably lose more than you gain; you're better off optimizing your page in other ways, including your images. Gzip + fewer requests + proper Expires headers + a fast static server should go a long way. There is also a free CDN that might interest you.
I'm sure Google doesn't mean for the whole web to remove its images from the source code to gain a few points. And keep in mind that they consider anything under 3 s to be a good loading time; there's plenty of room to wiggle before resorting to voodoo techniques.
From a pure SEO perspective, you shouldn't be indexing search result pages. You should index your home page and your product detail pages, and have a spiderable method of getting to those pages (category pages, sitemap.xml, etc.)
Here's what Matt Cutts has to say on the topic, in a post from 2007:
In general, we’ve seen that users usually don’t want to see search results (or copies of websites via proxies) in their search results. Proxied copies of websites and search results that don’t add much value already fall under our quality guidelines (e.g. “Don’t create multiple pages, subdomains, or domains with substantially duplicate content.” and “Avoid “doorway” pages created just for search engines, or other “cookie cutter” approaches…”), so Google does take action to reduce the impact of those pages in our index.
http://www.mattcutts.com/blog/search-results-in-search-results/
This isn't to say that you're going to be penalised for indexing the search results; it's just that Google will place little value on them, so lazy-loading the images (or not) won't have much of an impact.
There are some different ways to approach this question.
Images don't block load. JavaScript does; stylesheets do to an extent (it's complicated); images do not. However, they do consume HTTP connections, of which the browser will only open two per domain at a time.
So, what you can do that should be worry-free and the "right thing" is to build a poor man's CDN and just serve the images from www1, www2, www3, etc. on your own site and servers. There are a number of ways to do that without much difficulty.
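A rough sketch of that poor man's CDN idea (www1/www2/www3 are hypothetical host aliases that must all serve the same files):

<!-- Spreading images across host aliases lets the browser open more parallel connections. -->
<img src="http://www1.example.com/images/shoe-01.jpg" alt="Red leather shoe">
<img src="http://www2.example.com/images/shoe-02.jpg" alt="Blue suede shoe">
<img src="http://www3.example.com/images/shoe-03.jpg" alt="Green canvas shoe">

Two to four aliases is usually the sweet spot, since each extra hostname costs another DNS lookup.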
On the other hand: no, it shouldn't affect your SEO. I don't think Google even bothers to load images, actually.
We display 40 images in our results.
First question: is this page even a landing page, i.e. is it targeted at a specific keyword? Internal search result pages are not automatically landing pages. If they are not landing pages, then do whatever you want with them (and make sure they do not get indexed by Google).
If they are landing pages (pages targeted at a specific keyword), the performance of the site is indeed important, for the conversion rate of these pages and indirectly (and to a smaller extent also directly) for Google as well. So some kind of lazy-load logic for pages with a lot of images is a good idea.
I would go for:
Load the first two (product?) images in an SEO-optimized way (as normal HTML, with targeted alt text and a targeted filename). For the rest of the images, build a lazy-load logic: not just setting src= to blank, but inserting the whole img tag onload (or onscroll, or whatever) into your code, as in the sketch below.
Having a lot of broken img tags in the HTML for non-JavaScript users (i.e. Google, old mobile devices, text browsers) is not a good idea (you will not get a penalty as long as the lazy-loaded images are not misleading), but shoddy markup is never a good idea.
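A minimal sketch of that insertion logic, assuming hypothetical placeholder elements that carry the image data in data-src/data-alt attributes:

<span class="lazy-img" data-src="/images/product-03.jpg" data-alt="Product 3"></span>
<script>
window.addEventListener("load", function () {
  // After window load, swap each placeholder for a complete img tag (src, alt and all),
  // so no empty-src img tags ever appear in the markup.
  var spots = document.querySelectorAll(".lazy-img");
  for (var i = 0; i < spots.length; i++) {
    var img = document.createElement("img");
    img.src = spots[i].getAttribute("data-src");
    img.alt = spots[i].getAttribute("data-alt");
    spots[i].parentNode.replaceChild(img, spots[i]);
  }
}, false);
</script>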
For general SEO questions, please visit https://webmasters.stackexchange.com/ (Stack Overflow is more for programming-related questions).
I have to disagree with Alex. Google recently updated its algorithm to account for page load time. According to the official Google blog:
...today we're including a new signal in our search ranking algorithms: site speed. Site speed reflects how quickly a website responds to web requests.
However, it is important to keep in mind that the most important aspect of SEO is original, quality content.
http://googlewebmastercentral.blogspot.com/2010/04/using-site-speed-in-web-search-ranking.html
I added lazy loading to my site (http://www.amphorashoes.ro) and I got a better PageRank from Google (maybe because the content loads faster) :)
First, don't use src=""; it may hurt your page. Use a small loading image instead.
Second, I think it won't affect SEO: we always use alt="imgDesc.." to describe the image, and the spider can catch that alt text even though it cannot analyse what the image really is.
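For instance (loading.gif, data-src, and the file names are hypothetical):

<!-- The tiny spinner keeps src non-empty; the real URL waits in data-src for a script to swap in. -->
<img src="/img/loading.gif" data-src="/img/product-detail.jpg" alt="imgDesc..">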
I found this tweet regarding Google and SEO:
There are various ways to lazy-load images, it's certainly worth thinking about how the markup could work for image search indexing (some work fine, others don't). We're looking into making some clearer recommendations too.
12:24 AM - 28 Feb 2018
John Mueller - Senior Webmaster Trends Analyst
From what I understand, it depends on how you implement your lazy loading, and Google has yet to recommend an approach that would be SEO-friendly.
Theoretically, Google should be running the scripts on websites, so it should be OK to lazy load. However, I can't find a source (from Google) that confirms this.
So it looks like crawling lazy-loaded or deferred images may not be foolproof yet. Here's an article I wrote about lazy loading, image deferring, and SEO that talks about it in detail.
Here's a working library that I authored, which focuses on lazy loading or deferring images in an SEO-friendly way.
What it basically does is cancel the image loading when the DOM is ready and resume loading the images after the window load event.
...
<div>My last DOM element</div>
<script>
(function() {
  // Remove all the sources! Stashing each src in a data attribute
  // cancels the pending downloads (a sketch of the approach described above).
  var imgs = document.getElementsByTagName("img");
  for (var i = 0; i < imgs.length; i++) {
    imgs[i].setAttribute("data-src", imgs[i].getAttribute("src"));
    imgs[i].removeAttribute("src");
  }
})();
window.addEventListener("load", function() {
  // Return all the sources! Restore each src once the window has loaded.
  var imgs = document.getElementsByTagName("img");
  for (var i = 0; i < imgs.length; i++) {
    imgs[i].setAttribute("src", imgs[i].getAttribute("data-src"));
  }
}, false);
</script>
</body>
You can cancel the loading of an image by removing its src value or replacing it with a placeholder image. You can test this approach with Google Fetch.
You have to make sure that you keep the correct src until the DOM is ready, so that Google Fetch will capture your imgs' original src.

Is listing all products in the homepage's footer making a real difference SEO-wise?

I'm working on a website to which I've been asked to add, in the homepage's footer, a list of all the products sold on the site, along with links to the products' detail pages.
The problem is that there are about 900 items to display.
Not only does that not look good, but it makes the page render a lot slower.
I've been told that such a technique would improve the website's visibility in search engines.
I've also heard that such techniques could have the opposite effect: Google seeing it as "spam".
My question is: is listing a website's products on its homepage really effective when it comes to becoming more visible in search engines?
That technique is called keyword stuffing and Google says that it's not a good idea:
"Keyword stuffing" refers to the practice of loading a webpage with keywords in an attempt to manipulate a site's ranking in Google's search results. Filling pages with keywords results in a negative user experience, and can harm your site's ranking. Focus on creating useful, information-rich content that uses keywords appropriately and in context.
Now you might want to ask: does their crawler really realize that the list at the bottom of the page is just keyword stuffing? Well, that's a question only Google could answer (and I'm pretty sure they don't want to). In any case: even if you could make a keyword-stuffing block that is not recognized, they will probably improve their algorithm and -- sooner or later -- discover the truth. My recommendation: don't do it.
If you want to optimize your search engine page ranking, do it "the right way" and read the Search Engine Optimization Guide published by Google.
Google is likely to see a huge list of keywords at the bottom of each page as spam. I'd highly recommend not doing this.
When is it ever a good idea to show a user 900 items? Good practice dictates that large lists are paginated to avoid giving the user a huge blob of stuff to look through at once.
That's a good rule of thumb: if you're doing it to help the user, then it's probably good; if you're doing it purely to help a machine (i.e. Google/Bing), then it might be a bad idea.
You can return different HTML to genuine users and to Google by inspecting the user agent of the web request.
That way you can provide the Google bot with a lot more text than you'd give a human user.
Update: people have pointed out that you shouldn't do this, as it amounts to cloaking. I'm leaving this answer up, though, so that people know it's possible but bad.
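To make the mechanism concrete, a hypothetical Node/Express sketch (the variable names are made up; again, this is cloaking, which search engines treat as spam):

var express = require("express");
var app = express();

var htmlForHumans = "<html>...normal page...</html>";
var htmlWithKeywordFooter = "<html>...page plus keyword list...</html>"; // bot-only variant

app.get("/", function (req, res) {
  var ua = req.get("User-Agent") || "";
  // Serve the keyword-stuffed variant only when the UA claims to be Googlebot.
  res.send(/Googlebot/i.test(ua) ? htmlWithKeywordFooter : htmlForHumans);
});

app.listen(3000);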

SEO Superstitions: Are <script> tags really bad? [closed]

We have an SEO team at my office, and one of their dictums is that having lots of <script> blocks inline with the HTML is apocalyptically bad. As a developer that makes no sense to me at all. Surely the Google search engineers, who are among the smartest people on the planet, know how to skip over such blocks?
My gut instinct is that minimizing script blocks is a superstition from the early days of search engine optimization, and that in today's world it means nothing. Does anyone have any insight on this?
Per our SEO guru, script blocks (especially those that are inline, or occur before actual content) are very, very bad and make the Google bots give up before processing your actual content. Seems like bull to me, but I'd like to see what others say.
It's been ages since I've played the game of reading Google's tea leaves, but there are a few reasons your SEO expert might be saying this:
Three or four years back there was a bit of conventional wisdom floating around that the search engine algorithms would give more weight to search terms that appeared earlier in the page. If all other things were equal on pages A and B, and page A mentioned widgets earlier in the HTML file than page B, page A "wins". It's not that Google's engineers and PhD employees couldn't skip over the blocks; it's that they found a valuable metric in their presence. Taking that into account, it's easy to see how, unless something "needs" (see #2 below) to be in the head of a document, an SEO-obsessed person would want it out.
The SEO people who aren't offering a quick fix tend to be proponents of well-crafted, validating/conforming HTML/XHTML structure. Inline JavaScript, particularly the kind that web-ignorant software engineers tend to favor, makes these people (I'm one) seethe. The bias against script tags themselves could also stem from some of the work Yahoo and others have done on optimizing Ajax applications (don't make the browser parse JavaScript until it has to). Not necessarily directly related to SEO, but a best practice that a white-hat SEO type will have picked up.
It's also possible you're misunderstanding each other. Content that's generated by JavaScript is considered controversial in the SEO world. It's not that Google can't "see" this content; it's that people are unsure how its presence will affect the page's ranking, as a lot of black-hat SEO games revolve around hiding and showing content with JavaScript.
SEO is at best Kremlinology and at worst a field that the black hats won long ago. My free, unsolicited advice is to stay out of the SEO game, present your managers with estimates of how long it will take to implement their SEO-related changes, and leave it at that.
There are several reasons to avoid inline/internal JavaScript:
HTML is for structure, not behavior or style. For the same reason you should not put CSS directly in HTML elements, you should not put JS there.
If your client does not support JS, you have just pushed a lot of junk to it. Wasted bandwidth.
External JS files are cached. That saves some bandwidth.
You'll have decentralized JavaScript, which leads to code repetition and all the known problems that come with it.
I don't know about the SEO aspect of this (because I can never tell the mumbo jumbo from the real deal). But as Douglas Crockford pointed out in one of his JavaScript webcasts, the browser always stops to parse the script at each script element. So, if possible, I'd rather deliver the whole document and enhance the page as late as possible with scripts anyway.
Something like
<head>
--stylesheets--
</head>
<body>
Lorem ipsum dolor
...
...
<script src="theFancyStuff.js"></script>
</body>
I've read in a few places that Google's spiders only index the first 100 KB of a page. 20 KB of JS at the top of your page would mean 20 KB of content further down that Google wouldn't see, etc.
Mind you, I have no idea whether this is still true, but when you combine it with the rest of the superstition/rumors/outright quackery you find in the dark underbelly of SEO forums, it starts to make a strange sort of sense.
This is in addition to the fact that inline JS is a Bad Thing with respect to the separation of presentation, content, and behavior, as mentioned in other answers.
Your SEO guru is slightly off the mark, but I understand the concern. This has nothing to do with whether the practice is proper, or whether a certain number of script tags is looked upon poorly by Google, and everything to do with page weight. Google stops caching after (I think) 150 KB. The more inline scripts your page contains, the greater the chance that important content will not be indexed because those scripts added too much weight.
I've spent some time working on search engines (not Google), but have never really done much from an SEO perspective.
Anyway, here are some factors that Google could reasonably use to penalise a page, all of which are inflated by including big blocks of JavaScript inline:
Overall page size.
Page download time (a mix of page size and download speed).
How early in the page the search terms occur (the crawler might ignore script tags, but that's a lot more processing).
Script tags with lots of inline JavaScript might be judged bad on their own. And if users frequently load a lot of pages from the site, they'd find it much faster if the script were in a single shared, cached file.
I would agree with all of the other comments, but would add that when a page has more than just <p> tags around the content, you are putting your faith in Google to interpret the markup correctly, and that is always a risky thing to do. Content is king, and if Google can't read the content perfectly, that's just another reason for Google not to show you the love.
This is an old question, but still pretty relevant!
In my experience, script tags are bad if they cause your site to load slowly. Site speed actually does have an impact on your appearance in SERPs, but script tags in and of themselves aren't necessarily bad for SEO.
A lot of activities in SEO are not recommended by the search engines themselves. You can use the <script> tag, just not excessively. Even the Google Analytics snippet lives in a <script> tag.
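For example, the (older) async analytics.js variant of that snippet looks roughly like this, with UA-XXXXX-Y standing in for a real property ID:

<script async src="https://www.google-analytics.com/analytics.js"></script>
<script>
// Queue ga() commands until analytics.js arrives, then track the page view.
window.ga = window.ga || function () { (ga.q = ga.q || []).push(arguments); };
ga.l = +new Date();
ga('create', 'UA-XXXXX-Y', 'auto');
ga('send', 'pageview');
</script>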