h1 modification with JavaScript - is it worth doing for SEO?

My online store script pulls the h1 tags on category pages from the category names. That's not always what I'd want them to be. Is it worth changing them with JavaScript, or will Google just not read the altered versions, leaving SEO unaffected?

Google will read the altered versions of the h1 tag and seo will be affected.
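A client-side rewrite along those lines can be sketched as follows. This is a minimal illustration, not the store script's actual code: the selector and the replacement text are placeholders you would adapt to your own markup, and note that Google only sees the change during its JavaScript rendering pass, which can lag behind plain crawling.

```javascript
// Sketch: replace a category page's H1 text once the DOM is ready.
// Selector and replacement text are illustrative placeholders.
function rewriteHeading(doc, newText) {
  const h1 = doc.querySelector('h1');
  if (h1) h1.textContent = newText;
  return h1; // null when the page has no <h1>
}

// Only run in a browser context.
if (typeof document !== 'undefined') {
  document.addEventListener('DOMContentLoaded', () => {
    rewriteHeading(document, 'Hand-picked Winter Jackets');
  });
}
```

If you can change the store script's templates instead, rendering the desired H1 server-side is the more reliable option, since it doesn't depend on Google's rendering pass at all.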

Related

Meta tag population

I would like to know about changing the content of the meta keywords tag using JavaScript on each page, to make the keywords different for each page. Is this good for SEO or not? Will Google be able to read the meta tag if I am changing the keywords with JavaScript?
Google no longer counts the meta keywords tag; they stopped doing so around 2009. Here is the official announcement:
http://googlewebmastercentral.blogspot.gr/2009/09/google-does-not-use-keywords-meta-tag.html
You can check the article below from Google Support about which types of meta tags their bots take into consideration when they crawl your page:
https://support.google.com/webmasters/answer/79812?hl=en
On the other hand, it is very good to have different pages for different keywords, since only one of them is going to receive a high ranking for a specific keyword.
But you have to do that without the meta keywords tag. You can use the title tag and the description tag instead, and you can get some more benefit by adding the keyword to your images' alt text too. But again, your meta keywords tag will have no value at all. So, in my honest opinion, try the methods I mentioned above and forget about managing keywords with JavaScript.
If you need more details, feel free to ask in the comments on my answer.
Google is on record saying they no longer look at the meta keywords tag. As for the meta description and title tag, place text there that you think will optimize your click-through rate on the SERPs, but don't cram keywords in.
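Put together, a per-page head along the lines the answers describe might look like this (the product name and copy are placeholders for illustration):

```html
<head>
  <!-- Unique per page; shown on the SERP, so write for click-through -->
  <title>Blue Widget 2000 – Acme Store</title>
  <meta name="description" content="Specs, photos and pricing for the Blue Widget 2000.">
  <!-- The meta keywords tag is ignored by Google and can be omitted entirely -->
</head>
```

The keyword can also appear in image alt text in the body, e.g. `<img src="blue-widget-2000.jpg" alt="Blue Widget 2000 front view">`.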

How to remove duplicate title and meta description tags if google indexed them

So, I have been building an ecommerce site for a small company.
The url structure is : www.example.com/product_category/product_name and the site has around 1000 products.
I've checked Google Webmaster Tools, and in the HTML Improvements section it shows that I have duplicate title and meta description tags for all the product pages. They all appear twice, because both:
-www.example.com/product_category/product_name
and
-www.example.com/product_category/product_name/ (with slash in the end)
got indexed as separate pages.
I've added a 301 redirect from every www.example.com/product_category/product_name/ to www.example.com/product_category/product_name, but that was almost two weeks ago. I have resubmitted my sitemap and asked Google to fetch the whole site a few times. Nothing has changed; GWT still shows the pages as having duplicate tags.
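For reference, a trailing-slash 301 like the one described can be sketched in an Apache .htaccess file; this assumes mod_rewrite is available and is only an illustration of the rule shape, not the asker's actual configuration:

```apache
RewriteEngine On
# Don't strip the slash from real directories
RewriteCond %{REQUEST_FILENAME} !-d
# 301-redirect /product_category/product_name/ to the slash-less URL
RewriteRule ^(.+)/$ /$1 [R=301,L]
```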
I did not get any manual action message.
So I have two questions:
-how can I accelerate the reindexing process, if that's possible?
-and do these duplicate tags hurt my organic search results? I've googled it, and some say they do and some say they don't.
An option is to set a canonical link on both URLs (with and without /) using the URL without a /. Little by little, Google will stop complaining. Keep in mind Google Webmaster Tools is slow to react, especially when you don't have much traffic or backlinks.
And yes, duplicate tags can influence your rankings negatively because users won't have proper and specific information for each page.
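The canonical link suggested above is a single tag in the head of both URL variants, pointing at the version you want indexed (example.com used here as in the question):

```html
<!-- Same tag on both the slash and slash-less variant of each product page -->
<link rel="canonical" href="http://www.example.com/product_category/product_name">
```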
Setting a canonical link on both URLs is a solution, but from my experience it takes time.
The fastest way is to block the old URL in the robots.txt file:
Disallow: /old_url
The canonical tag is an option, but why not add a different title and description for every page?
You can set up dynamic meta tags once and they will be generated automatically for all pages, so you don't have to worry about duplication.

Editing the head element on an old blog platform on a post-by-post basis. Is this impossible or am I missing something?

Sorry for being a total rookie.
I am trying to help my professor implement this advice:
Either as a courtesy to Forbes or a favor to yourself, you may want to include the rel="canonical" link element on your cross-posts. To do this, on the content you want to take the backseat in search engines, you add the canonical link element in the head of the page. The URL should be for the content you want to be favored by search engines. Otherwise, search engines see duplicate content, grow confused, and then get upset. You can read more about the canonical tag here: http://www.mattcutts.com/blog/canonical-link-tag/. Have a great day!
The problem is I am having trouble figuring out how to edit the head element on a post-by-post basis. We are currently on a super old blogging platform (Movable Type 3.2 from 2005), so maybe it is not possible. But I'd like to know if that is likely the reason, so I'm not missing out on a workaround.
If anyone could point me in the right direction, I would greatly appreciate it!
Without knowing much about your installation, I'll give a general description, and hopefully it matches what you see and helps.
In Movable Type, each blog has a "Design" section where you can see and edit the templates for the blog. On this page, the templates that are published once are listed under "Index Templates," and the templates published multiple times, once per entry, per category, etc., are listed under "Archive Templates."
There probably is an archive template called "Entry" (could be renamed) publishing to a path like category/sub-category/entry-basename.php. This is the main template that publishes each entry. Click on this to open the template editor.
This template could be an entire HTML document, or it might have "includes" that look like <MTInclude module=""> or <$mt:Include module=""$> (MT supports varying tag styles.).
You may find there is an included module that contains the <head> content, or it might just be right in that template. To "follow" the includes and see those templates, there should be links on the side of the included templates.
Once you find the <head> content, you can add a canonical link tag like this:
<mt:IfArchiveType type="Individual">
<mt:If tag="EntryPermalink">
<link rel="canonical" href="<$mt:EntryPermalink$>" />
</mt:If>
</mt:IfArchiveType>
Depending on your needs, you might want to customize this to output a specific URL structure for other types of content, like category listings. The above will just take care of telling search engines the preferred URL for each entry.
#Charlie: maybe I'm missing something, but your solution basically places a canonical link on each entry to… itself, which is a no-no for search engines (the link should point to another page that's considered the canonical one).
#user2359284 you need a way to define the canonical URL for the entries that need this link. As Shmuel suggested, either reuse an unused field or use a custom-field plugin. Then you simply add that link in the header of the archive template that outputs your entries. Assuming the Entry template includes the same header as the other templates, and that, say, you're using the Keywords field to store the URL, the following code should work (the mt:IfArchiveType test simply ensures it's output in the proper context, which you don't need if your Entry template has its own header code):
<mt:IfArchiveType type="Individual">
<link rel="canonical" href="<$mt:EntryKeywords$>" />
</mt:IfArchiveType>

Google SEO - duplicate content in web pages for submitting sitemaps

I hope my question is not too irrelevant to stackoverflow.
this is my website: http://www.rader.my
It's a car information website with dynamic content, so Google's crawler could not find all the car specification pages on my site.
I created a sitemap with all my car URLs in it (for instance, http://www.rader.my/Details.php?ID=13 is one car). I know I haven't made any mistakes in my .xml file's format and structure, but after submission Google indexed only one URL, which is my index.php.
I have also read about rel="canonical", but I don't think I should use it in my case, since all my pages ARE different, with different content; only the structure is the same.
Is there anything I missed? Why doesn't Google accept my URLs even though the contents are different? What can I do to fix this?
Thanks and regards,
Amin
I have a similar type of site. Google is good about figuring out dynamic sites. They'll crawl the pages and figure out the unique content as time goes on. Give it time.
You should do all the standard things:
Make sure each page has a unique H1 tag.
Make sure each page has substantial unique content
Unique keywords and description tags aren't as useful as they used to be but they can't hurt.
Cross-link internally. Create category pages that include links to all of one manufacturer and have each of the pages of that manufacturer link back to 'similar' pages.
Get links to your pages. Nothing helps getting indexed like external authority.
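Generating the sitemap from the database rather than by hand also keeps it complete as cars are added. A minimal sketch, assuming the ID list comes from your database (hard-coded here for illustration) and using the Details.php URL pattern from the question:

```javascript
// Build a sitemap <urlset> for dynamic detail pages.
// In practice the ID list would be queried from the database.
function buildSitemap(baseUrl, ids) {
  const entries = ids
    .map((id) => `  <url><loc>${baseUrl}/Details.php?ID=${id}</loc></url>`)
    .join('\n');
  return [
    '<?xml version="1.0" encoding="UTF-8"?>',
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    entries,
    '</urlset>',
  ].join('\n');
}

console.log(buildSitemap('http://www.rader.my', [13, 14, 15]));
```

Regenerating and resubmitting this file whenever inventory changes keeps the crawler's view of the dynamic pages current.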

Is googlebot indexing links in html comments?

I got a huge number of NOT FOUND links in Google Webmaster Tools; it looks like the links come from a section of code in the footer that was placed inside an HTML comment.
All pages have a NOARCHIVE tag, so it's probably not a cache issue.
Did this happen to anyone?
A quick Google (ironic, eh?) shows that while there is no official word on the subject, the general consensus (through anecdotal and experimental evidence) is that Google will process everything, including content in comment tags. This means that it will indeed index your links, even if they're in comment tags. However, it does not use the content as a source for keyword searches, i.e. anything in an HTML comment is not considered part of your page's visible content and is therefore not usable as part of search criteria.
HTML comments are designed simply to convey human-readable information about what your layout is doing, for example signifying where a particular include begins in a page output by a PHP script. You shouldn't be using HTML comments to disable large chunks of code on your site. I suggest that you remove the content.
If you don't want Google to follow a link, you can add rel="nofollow" to your hyperlink. You can also use robots.txt to specify directories or URL wildcards that you do not want Google to index.
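Both options are small fragments; the URL and path here are placeholders:

```html
<!-- Per-link: ask crawlers not to follow or pass weight through this link -->
<a href="http://example.com/page" rel="nofollow">a link</a>
```

```
# robots.txt at the site root: keep crawlers out of a whole directory
User-agent: *
Disallow: /private/
```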
References:
http://en.wikipedia.org/wiki/Nofollow
http://en.wikipedia.org/wiki/Robots.txt
http://www.webmasterworld.com/forum3/4270.htm
http://www.codingforums.com/archive/index.php/t-71686.html
If you are talking about links inside comment tags, I don't think Googlebot acts on them, as stated there and there.
Regards.