Important elements of a web page are "invisible" - SEO

I ran my website through a web tool that evaluates the SEO weight of page elements, and its report says that certain parts, like the description and other meta tags, are missing... Also, as a thumbnail of my site it shows a default server page. At the same time, it shows the list of other pages that are linked from the index page.
I checked, and this agent is not blocked in robots.txt.
Now, how can that be?

I think the description issue is caused by the fact that you are using "META" instead of "meta" in your meta tags. Browsers treat HTML tag names as case-insensitive, but XHTML does not, and some SEO checkers only match the lowercase form.
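For example, a lowercase description tag that checkers reliably pick up looks like this (the content text here is just a placeholder):

    <meta name="description" content="A short summary of this page for search results.">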

There are many sites out there that can run tests similar to the one you used; that site is just showing old data. You may want to submit your sitemap.xml to Bing and Google Webmaster Tools. If your site doesn't have a sitemap.xml file, you may want to consider creating one.
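If you do create one, the format is small. A minimal sketch, with example.com standing in for your domain and the two URLs as placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url><loc>http://www.example.com/</loc></url>
      <url><loc>http://www.example.com/about.html</loc></url>
    </urlset>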

Related

How to remove URLs with arguments in Google results

I have a website which I started recently, and I have submitted my sitemap in Google Webmaster Tools. My site got indexed within a short time, but whenever I search for my website on Google, I see two or three versions of the same pages with different URL arguments on each.
Suppose my site name is example.com; when I search for example.com on Google, I get results like the following:
www.example.com/?page=2
www.example.com/something/?page=3
www.example.com
As far as I know, results 1 and 3 are the same, so why are they being shown separately? I don't have any such URL in my sitemap, or in any of my HTML pages, so I am a little confused about why this is happening. I want to get rid of it.
Also, result 2 should be displayed simply as www.example.com/something
and not as www.example.com/something/?page=3
There is actually a setting in Google Webmaster Tools that helps with removing URLs with parameters. To access and configure it, navigate to Webmaster Tools --> Crawl --> URL Parameters and set the parameters according to your needs.
I also found the following article useful for understanding the concept behind those parameters and how to stop pages from getting crawled with unnecessary parameters:
http://www.shoutmeloud.com/google-webmaster-tool-added-url-parameter-option-seo.html
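Alongside the URL Parameters setting, a rel="canonical" link in the head of each page is a common complementary fix. A sketch using the question's example URL, assuming the parameterized versions show essentially the same content as the clean URL:

    <link rel="canonical" href="http://www.example.com/something">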

How to remove duplicate title and meta description tags if Google indexed them

So, I have been building an ecommerce site for a small company.
The URL structure is www.example.com/product_category/product_name and the site has around 1000 products.
I've checked google webmaster tools and in the HTML improvements section it shows that I have multiple title and meta description tags for all the product pages. They all appear two times, both:
-www.example.com/product_category/product_name
and
-www.example.com/product_category/product_name/ (with slash in the end)
got indexed as separate pages.
I've added a 301 redirect from every www.example.com/product_category/product_name/ to www.example.com/product_category/product_name, but that was almost two weeks ago. I have resubmitted my sitemap and asked Google to fetch the whole page a few times. Nothing has changed; GWT still shows the pages as having duplicate tags.
I did not get any manual action message.
So I have two questions:
-how can I accelerate the reindexing process, if that's possible?
-and do these tags hurt my organic search results? I've googled it, and some say they do and some say they don't.
An option is to set a canonical link on both URLs (with and without /) using the URL without a /. Little by little, Google will stop complaining. Keep in mind Google Webmaster Tools is slow to react, especially when you don't have much traffic or backlinks.
And yes, duplicate tags can influence your rankings negatively because users won't have proper and specific information for each page.
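As a sketch, both versions of a product page (with and without the trailing slash) would carry the same tag in their head, pointing at the version without the slash, using the question's example URL:

    <link rel="canonical" href="http://www.example.com/product_category/product_name">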
Setting a canonical link on both URLs is a solution, but in my experience it takes time.
The fastest way is to block the old URLs in your robots.txt file:
Disallow: /old_url
The canonical tag is an option, but why are you not adding a different title and description for each page?
You can add dynamic meta tags once and they will be generated automatically for every page, so you don't have to worry about duplication.
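A minimal sketch of what that could look like on a PHP product page. The get_product() helper and the field names are hypothetical placeholders for however your store actually loads product data:

    <?php
    // Hypothetical: load the current product however your store does it.
    $product = get_product((int) $_GET['id']);

    // Build a unique title and description from the product's own data.
    $title = $product['name'] . ' - ' . $product['category'] . ' | Example Store';
    $description = mb_substr($product['summary'], 0, 155); // keep it snippet-sized
    ?>
    <title><?php echo htmlspecialchars($title); ?></title>
    <meta name="description" content="<?php echo htmlspecialchars($description); ?>">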

How do I test a website in webmaster tools without indexing it

Suppose I have my live site at www.mywebsite.com, tracked and managed via Google Webmaster Tools. Then I want to add to the project list a subdomain like test.mywebsite.com which I use for testing purposes. Of course that subdomain shouldn't be tracked or indexed by Google, but I would like to use "fetch as Google" feature on it to see how the crawler manages the pages. Can I set up such a test environment without being indexed by Google?
I haven't had a chance to test this, but I think if you add noindex tags to your site, it should still be possible to register the site with Webmaster Tools, since Google can still see the site's content in order to verify ownership.
I believe "fetch as Google" returns live results rather than what is already indexed (it wouldn't be very useful if it didn't let you check new pages or re-check updated ones), so temporarily removing the noindex tag when you run it should allow this feature to be used (it may also return some useful information without removing it).
The fact that "fetch as" has a separate "submit" button suggests to me that it will not automatically index pages found via this method, so that should not be a concern.
Adding canonical tags pointing to your main content would provide an additional safeguard against the test site accidentally being listed.
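A sketch of the two tags mentioned above, placed in the head of every page on the test subdomain; the href is a placeholder for whatever the matching live page is:

    <meta name="robots" content="noindex">
    <link rel="canonical" href="http://www.mywebsite.com/current-page">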
Google can't provide any information about your website if it's not indexed.
In other words, you can use Google Webmaster Tools without your website being indexed, but it will be pretty much useless, since it will not provide any data.
Google Webmaster Tools won't let you do that, but you can test a website for an SEO checkup or other errors (missing search descriptions, missing image alt attributes, etc.) with Bing Webmaster Tools.

Google SEO - duplicate content in web pages for submitting sitemaps

I hope my question is not too off-topic for Stack Overflow.
this is my website: http://www.rader.my
It's a car information website. The content is dynamic, so Google's crawler could not find all the car specification pages on my website.
I created a sitemap with all my car URLs in it (for instance: http://www.rader.my/Details.php?ID=13 is for one car). I know I haven't made any mistakes in my .xml file format and structure. But after submission, Google only indexed one URL, which is my index.php.
I have also read about rel="canonical". But I don't think I should use it in my case, since all my pages ARE different, with different content; only the structure is the same.
Is there anything I missed? Why doesn't Google accept my URLs even though the contents are different? What can I do to fix this?
Thanks and regards,
Amin
I have a similar type of site. Google is good about figuring out dynamic sites. They'll crawl the pages and figure out the unique content as time goes on. Give it time.
You should do all the standard things:
Make sure each page has a unique H1 tag.
Make sure each page has substantial unique content
Unique keywords and description tags aren't as useful as they used to be, but they can't hurt.
Cross-link internally. Create category pages that link to all of one manufacturer's cars, and have each of that manufacturer's pages link back to 'similar' pages (see the sketch after this list).
Get links to your pages. Nothing helps getting indexed like external authority.
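A rough sketch of such a manufacturer category page in PHP, so the crawler gets plain links to every car. The database credentials, table, and column names are all hypothetical placeholders for your own schema:

    <?php
    // Hypothetical manufacturer page: list crawlable links to every car.
    $make = $_GET['make'] ?? '';
    $pdo = new PDO('mysql:host=localhost;dbname=cars', 'user', 'pass'); // placeholder credentials
    $stmt = $pdo->prepare('SELECT ID, name FROM cars WHERE manufacturer = ?');
    $stmt->execute([$make]);
    ?>
    <h1>All <?php echo htmlspecialchars($make); ?> models</h1>
    <ul>
    <?php foreach ($stmt as $car): ?>
      <li><a href="/Details.php?ID=<?php echo (int) $car['ID']; ?>"><?php echo htmlspecialchars($car['name']); ?></a></li>
    <?php endforeach; ?>
    </ul>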

What should I add to my site to make Google index the subpages as well

I am a beginner web developer and I have a site, JammuLinks.com, built on PHP. It is a local city listing search engine. Basically, I've written search pages which take a parameter, fetch the records from the database, and display them, so the content is generated dynamically. However, if you look at the bottom of the site, I have added many static links where I have hard-coded the parameters into the link, like searchresult.php?tablename='schools'. So my questions are:
Since Google crawls the page and also the links listed on the page, will it crawl the results page data as well? How can I tell if it has? So far I've tried site:www.jammulinks.com but it returns only the homepage and the blog.
What more can I add to make the static links be indexed as well?
The best way to do this is to create a sitemap document (you can even get the template from the webmasters portion of Google's site, www.google.com/webmasters/ I believe).
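For a database-driven site like this one, the sitemap can itself be generated by a small PHP script. A sketch; the file name, category list, and URL pattern are assumptions based on the hard-coded footer links in the question:

    <?php
    // sitemap.php - hypothetical generator for the hard-coded category pages.
    header('Content-Type: application/xml; charset=utf-8');
    $categories = ['schools', 'hospitals', 'restaurants']; // placeholder list

    echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
    echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
    foreach ($categories as $cat) {
        $loc = 'http://www.jammulinks.com/searchresult.php?tablename=' . urlencode($cat);
        echo '  <url><loc>' . htmlspecialchars($loc) . "</loc></url>\n";
    }
    echo '</urlset>';

Once it outputs valid XML, submit its URL in Webmaster Tools the same way you would a static sitemap.xml.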