Can Google detect structured data if using multiple sitemaps?

I have tested my web pages with the Google Structured Data Testing Tool. It's been several months now and Google still has not detected structured data on my site.
I have two sitemaps: sitemap.xml and us-sitemap.xml.
Google is detecting structured data from a link in sitemap.xml, but it is not detecting structured data in any link submitted in us-sitemap.xml. Does the sitemap have to be called sitemap.xml for it to work properly, or is there something else I need to do?
This is the site in question: http://www.findazan.info

If you have multiple sitemaps with different names, they should preferably be declared in your robots.txt file as follows:
Sitemap: http://yoursite.com/sitemap.xml
Sitemap: http://yoursite.com/us-sitemap.xml
Alternatively, you could submit both to every search engine individually, but that is extra work.
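Another option, if you would rather submit a single URL, is a sitemap index file that references both sitemaps. A minimal sketch in the standard sitemaps.org format, assuming both files sit at the site root (yoursite.com is a placeholder):
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each <sitemap> entry points at one of your existing sitemap files -->
  <sitemap>
    <loc>http://yoursite.com/sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://yoursite.com/us-sitemap.xml</loc>
  </sitemap>
</sitemapindex>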

How to remove duplicate title and meta description tags if Google indexed them

So, I have been building an ecommerce site for a small company.
The URL structure is www.example.com/product_category/product_name and the site has around 1000 products.
I've checked Google Webmaster Tools, and in the HTML Improvements section it shows that I have duplicate title and meta description tags for all the product pages. Each one appears twice, because both
-www.example.com/product_category/product_name
and
-www.example.com/product_category/product_name/ (with a trailing slash)
got indexed as separate pages.
I've added a 301 redirect from every www.example.com/product_category/product_name/ to www.example.com/product_category/product_name, but that was almost two weeks ago. I have resubmitted my sitemap and asked Google to fetch the pages a few times. Nothing has changed; GWT still flags the pages as having duplicate tags.
I did not get any manual action message.
So I have two questions:
-How can I accelerate the reindexing process, if that's possible?
-Do these duplicate tags hurt my organic search results? I've googled it, and some say they do and some say they don't.
One option is to set a canonical link on both URLs (with and without the trailing slash) pointing to the URL without the slash. Little by little, Google will stop complaining. Keep in mind that Google Webmaster Tools is slow to react, especially when you don't have much traffic or many backlinks.
And yes, duplicate tags can affect your rankings negatively, because users won't get proper, page-specific information for each page.
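For reference, the canonical link element goes in the <head> of both versions of each product page and points at the version without the slash (example.com here is just the placeholder from the question):
<!-- On both .../product_name and .../product_name/ -->
<link rel="canonical" href="http://www.example.com/product_category/product_name">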
Setting a canonical link on both URLs is a solution, but in my experience it takes time.
The fastest way is to block the old URLs in your robots.txt file:
Disallow: /old_url
The canonical tag is an option, but why not add a different title and description for every page?
You can set up dynamic meta tags once and they will be generated automatically for all pages, so you don't have to worry about duplication.
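As an illustration, a rough PHP sketch of such dynamic meta tags; getProductBySlug() and the product fields are hypothetical placeholders for whatever your own data layer provides:
<?php
// Hypothetical helper that loads one product record from the database.
$product = getProductBySlug($_GET['product_name'] ?? '');

// Build a unique title and description for this product page.
$title       = htmlspecialchars($product['name'] . ' | ' . $product['category']);
$description = htmlspecialchars(mb_substr($product['summary'], 0, 155));
?>
<head>
    <title><?= $title ?></title>
    <meta name="description" content="<?= $description ?>">
</head>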

SEO for search-only content

We have a ton of content on our website which a user can get to by performing a search on the website. For example, we have data for all Public companies, in the form of individual pages per company. So think like 10,000 pages in total. Now in order to get to these pages, a user needs to search for the company name and from the search results, click on the company name they are interested in.
How would a search bot find these pages? There is no page on the website that links to all 10,000 of them. Think of Amazon: you need to search for your product and then, from the search results, click on the product you are interested in to get to it.
The closest solution I could find was sitemap.xml; is that it? Is there anything that doesn't require adding 10,000 links to an XML file?
You need to link to a page, and ideally keep it close to the homepage, for it to stand a decent chance of getting indexed by Google.
A sitemap helps, sure, but a page still needs to exist in the menu / site structure. A sitemap reference alone does not guarantee a resource will be indexed.
Google - Webmaster Support on Sitemaps: "Google doesn't guarantee that we'll crawl or index all of your URLs. However, we use the data in your Sitemap to learn about your site's structure, which will allow us to improve our crawler schedule and do a better job crawling your site in the future. In most cases, webmasters will benefit from Sitemap submission, and in no case will you be penalized for it."
If you browse Amazon, you can reach 99% of the available products through the navigation alone. Amazon does a lot of interesting stuff with its faceted navigation; you could write a book on it.
Speak to an SEO or a usability/CRO expert; they will be able to tell you what to do, which is basically to create a user-friendly site with categories and links to all your products.
An XML sitemap pretty much is your only on-site option if you do not or cannot link to these products on your website. You could link to these pages from other websites but that doesn't seem like a likely scenario.
Adding 10,000 products to an XML sitemap is easy to do. Your sitemap can be dynamic, just like your web pages are. Generate it on the fly when it is requested, like you would a regular web page, and include whatever products you want found and indexed.
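For example, a minimal PHP sketch of a sitemap generated on the fly; the $pdo connection and the products table/column names are assumptions standing in for your real schema:
<?php
// sitemap.php - emits an XML sitemap built from the database on each request.
header('Content-Type: application/xml; charset=utf-8');

$stmt = $pdo->query('SELECT slug, updated_at FROM products');

echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";

while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    $loc     = 'https://www.example.com/products/' . rawurlencode($row['slug']);
    $lastmod = date('Y-m-d', strtotime($row['updated_at']));
    echo "  <url><loc>{$loc}</loc><lastmod>{$lastmod}</lastmod></url>\n";
}

echo '</urlset>';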

Google SEO - duplicate content in web pages for submitting sitemaps

I hope my question is not too off-topic for Stack Overflow.
this is my website: http://www.rader.my
It's a car information website. The content is dynamic. Therefore, google crawler could not find all the cars specification pages in my website.
I created a sitemap with all my cars URL in it (for instance: http://www.rader.my/Details.php?ID=13 is for one car). I know I haven't made any mistake in my .xml file format and structure. But after submission, google only indexed one URL which is my index.php.
I have also read about rel="canonical". But I don't think in my case I should use such a thing since all my pages ARE different with different content but only the structure is the same.
Is there anything that I missed? Why google doesn't accept my URLs even though the contents are different? What can I do to fix this?
Thanks and regards,
Amin
I have a similar type of site. Google is good about figuring out dynamic sites. They'll crawl the pages and figure out the unique content as time goes on. Give it time.
You should do all the standard things:
Make sure each page has a unique H1 tag.
Make sure each page has substantial unique content.
Unique keywords and description tags aren't as useful as they used to be, but they can't hurt.
Cross-link internally. Create category pages that link to all of one manufacturer's cars, and have each of that manufacturer's pages link back to 'similar' pages (see the sketch after this list).
Get links to your pages. Nothing helps getting indexed like external authority.
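As one way of doing that internal cross-linking, a rough PHP sketch of a manufacturer category page; the $pdo connection and the cars table/columns are hypothetical, and Details.php?ID= is the URL pattern from the question:
<?php
// manufacturer.php?name=Toyota - a crawlable page linking to every car
// detail page for one manufacturer. Table and column names are assumed.
$name = $_GET['name'] ?? '';

$stmt = $pdo->prepare('SELECT ID, model FROM cars WHERE manufacturer = ? ORDER BY model');
$stmt->execute([$name]);

echo '<h1>' . htmlspecialchars($name) . ' models</h1><ul>';
foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $car) {
    echo '<li><a href="Details.php?ID=' . (int) $car['ID'] . '">'
       . htmlspecialchars($car['model']) . '</a></li>';
}
echo '</ul>';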

Submit RSS feed as a Sitemap to Google?

Background
I work for an online media company that hosts a news site with over 75K pages. We currently use Google Sitemap Generator (installed on our server) to build dynamic XML sitemaps for our site. In fact, since we have a ton of content, we use a sitemap of sitemaps. (Google only allows a maximum of 50K URLs per sitemap file.)
Problem
The sitemaps are generated every 12 hours, and generation is driven by user behavior. That is, it parses the server log file, sees which pages are being fetched the most, and builds the sitemap based on that.
Since we cannot guarantee that NEW pages are being added to the sitemap, is it better to submit an RSS feed as a sitemap? That way, every time one of our editors creates a new page (or article) it is added to the feed and submitted to Google. But this raises the issue of pushing duplicate content to Google, as the sitemap and the RSS feed might contain the same URLs. Will Google penalize us for duplicate content? How do other content-rich or media sites notify Google that they are posting new content?
I understand that Googlebot only indexes pages that it deems important and relevant, but it would be great if it at least crawled any new article that we post.
Any help would be greatly appreciated.
Why not simply have every page in your sitemap? 75k pages isn't a huge number, plenty of sites have several sitemaps totalling millions of pages and Google will digest them all (although Google will only index those it deems important as you pointed out).
One technique for you would be to split the sitemaps up into new and archived content based on the publication date: for example, a single sitemap for all content from the previous 7 days, with the rest of the content split into other sitemap files as appropriate. This may help get your freshest content indexed quickly.
Back to your question about an RSS feed sitemap: don't worry about duplicate content, as this is not an issue when it comes to sitemaps. Duplicate content is only a problem if you published the same article several times on the site; sitemaps and RSS feeds are only links to the content, not the content itself. So if an RSS feed is the easiest way of reporting your fresh content to Google, go for it.
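For reference, a minimal sketch of the kind of RSS 2.0 feed you could submit as a sitemap; all URLs, titles, and dates are placeholders:
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example News</title>
    <link>https://www.example.com/</link>
    <description>Latest articles</description>
    <!-- One <item> per newly published article -->
    <item>
      <title>New article headline</title>
      <link>https://www.example.com/articles/new-article</link>
      <pubDate>Mon, 02 Jun 2014 09:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>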

What should I add to my site to make Google index the subpages as well?

I am a beginner web developer and I have a site, JammuLinks.com, built on PHP. It is a local city listing search engine. Basically, I've written search pages that take in a parameter, fetch the records from the database, and display them, so the content is generated dynamically. However, if you look at the bottom of the site, I have added many static links where I have hard-coded the parameters in the link, like searchresult.php?tablename='schools'. So my questions are:
Since Google crawls the page and also the links listed on the page, will it crawl the results page data as well? How can I tell if it has? So far I have tried site:www.jammulinks.com but it returns only the homepage and the blog.
What more can I add to make the statically linked pages get indexed as well?
The best way to do this is to create a sitemap document (you can even get the template from the webmaster portion of Google's site, www.google.com/webmasters/, I believe).
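For illustration, a minimal sitemap.xml in the standard sitemaps.org format; the entry uses the example URL from the question (shown without the quotes around the parameter value), and you would add one <url> entry per page you want crawled:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.jammulinks.com/searchresult.php?tablename=schools</loc>
    <changefreq>weekly</changefreq>
  </url>
</urlset>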