I have an ecommerce website that's PCI compliant, and the website does not have an XML sitemap. The previous webmaster stated that he removed the sitemap because it caused the website to fail PCI validation testing. (We use the McAfee SECURE service for automated testing.) I'd like to restore the sitemap for SEO, but I don't want to jeopardize our compliance.
Assuming that my new sitemap only includes links to relevant product and information pages, do I have anything to worry about?
t-nez,
I work for a merchant service provider, www.banckardclub.com, as the lead SEO. An XML sitemap will not cause your site to fail PCI compliance. We have an XML sitemap and submit it to Google and other search engines, and we pass our compliance tests without a problem.
Have your webmaster look into the problem some more.
Create an account on Webmaster Tools and, if you run into problems, search the help there; this should not be an issue.
I tried using Google Webmaster Tools to get the site re-crawled, and it hasn't helped.
Does anyone know why the link on Google might say 'halalgems.com', but redirect to another site?
The description is also incorrect.
Unfortunately, it appears as though your website has been hacked; this is not a problem with Google.
Looking at the response your website returns when someone arrives from that Google result, it is your website that does the redirecting.
I cannot determine the source of this issue, as that would likely require access to your server. Good luck with tracking it down!
I'm building a website for a company. I'm building it offline so Google won't index it, the reason being that I don't want Google to see the code yet and then conclude at the official launch that we copied the text/code from another website.
Sadly, I've encountered a problem. I need to implement the Facebook social plugin, but unfortunately the plugin only works when the site is online. As I said, putting the site online could be risky for future Google SEO.
Is there another option that lets me test the Facebook plugin before the site goes live, or is it okay to put it online already on (for example) www.example.com and later publish the released product on (for example) www.released.com?
Any thoughts on this problem?
Why don't you place the website online in a folder that is blocked for Googlebot via robots.txt?
That way you can test that all the online elements work without having to worry about users and search engines finding the site and listing it.
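For example, assuming the unreleased site lives in a /staging/ folder (a placeholder path for illustration), the robots.txt at the site root could look something like this:
# Keep crawlers out of the staging copy of the site
User-agent: *
Disallow: /staging/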
An alternative could be to use the .htaccess file to limit access only to your IP address - that way, you'd be the only one to see the site live.
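As a rough sketch, assuming Apache and using 203.0.113.10 as a stand-in for your own IP address, the .htaccess file could contain something like this:
# Deny everyone except the listed IP address (replace with your own)
Order deny,allow
Deny from all
Allow from 203.0.113.10
# On Apache 2.4+ the equivalent directive is: Require ip 203.0.113.10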
I have a business listing website which has links to the business owners' own websites.
Recently I've been getting requests to remove listings because the owners are getting warnings from Google that the links from my site to their sites are unnatural links.
How do I go about changing things on my site so I don't lose listings because Google is penalizing my outbound links?
The quickest solution is to make these links nofollow.
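For example, a listing link with the nofollow attribute (the URL and anchor text are placeholders) would look like this:
<a href="http://www.example-business.com/" rel="nofollow">Example Business</a>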
I have a website that has two different page structures: one for mobile visitors and one for desktop. That's why I have two sitemap files: one for mobile and one for desktop.
I want to create a robots.txt file that will "tell" search engine bots to scan the mobile sitemap for the mobile pages and the desktop sitemap for the desktop pages.
How can I do that?
I thought of creating a sitemap index file that points to both of those sitemaps, and adding the following directive to the robots.txt file:
sitemap: [sitemap-index-location]
Is this the right way?
I can't say for certain, but I believe the best practice is to declare both sitemaps in robots.txt. In the mobile sitemap you already have the <mobile:mobile/> markup, which indicates that it is the mobile version.
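For reference, a minimal mobile sitemap entry using Google's mobile sitemap namespace (the URL is a placeholder) looks like this:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
  <url>
    <loc>http://m.example.com/product.html</loc>
    <mobile:mobile/>
  </url>
</urlset>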
Another interesting option is to also create a sitemap index:
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://example.com/sitemap.desktop.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://m.example.com/sitemap.mobile.xml</loc>
  </sitemap>
</sitemapindex>
And your robots.txt will look like:
# Sitemap index
Sitemap: http://example.com/sitemap.xml
# Other sitemaps. I know they are already declared in the sitemap index, but I believe it does no harm to also declare them here
Sitemap: http://example.com/sitemap.desktop.xml
Sitemap: http://example.com/sitemap.mobile.xml
robots.txt does not tell the search engine which sitemap is for mobile and which is for desktop; it can only declare the sitemaps, and that is sufficient. I think you can also add annotations in the HTML page header: the desktop page points to the mobile page, and the mobile page points back to the desktop page. If there is also a link to the sitemap somewhere on the page, that further helps to get it included.
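For that header annotation, a common pattern (a sketch with placeholder URLs, following Google's bidirectional annotation for separate mobile URLs rather than anything stated in the answer above) is:
<!-- On the desktop page (http://www.example.com/page.html) -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page.html">
<!-- On the mobile page (http://m.example.com/page.html) -->
<link rel="canonical" href="http://www.example.com/page.html">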
I would recommend responsive website design.
With a responsive web design technique you adapt web pages using CSS3 media queries. There is one HTML document for the page regardless of the device accessing it, but its presentation changes through CSS media queries that specify which CSS rules the browser applies when displaying the page.
With responsive design you keep both the desktop and mobile content on a single URL. That makes it easier for your users to interact with, share, and link to, and for Google's algorithms to assign indexing properties to your content.
Besides, Google can crawl your content efficiently, and there is no need to crawl a page with a different Googlebot user agent.
You can simply define a single sitemap and reference it in your robots.txt file; it will cover both your desktop and mobile content.
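As a small illustration (the breakpoint and class name are arbitrary), a media query that restyles a layout for narrow screens looks like this:
/* Default (desktop) layout */
.product-grid { width: 960px; }
/* Applied only when the viewport is 640px wide or narrower (mobile) */
@media only screen and (max-width: 640px) {
  .product-grid { width: 100%; }
}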
In addition to stating the files in robots.txt, you should log into Google Webmaster Tools and submit the sitemaps there. That will tell you:
Whether the sitemap URL you submitted is correct
Whether the sitemap file has the correct syntax
How many of the files in the sitemap have been crawled
How many of the files in the sitemap have been indexed
I have an app running on Heroku.
I am using sitemap_generator to generate the sitemap and save it to S3.
I have added the robots.txt to contain my sitemap location.
My questions are:
How can I know my sitemap has been successfully found by search engines like Google?
How can I monitor my sitemap?
If my sitemap were located on my app server, I could add it manually in Google Webmaster Tools for monitoring, because when I click on "Test/Add sitemap" in Google Webmaster Tools, it defaults to the same server.
Thanks for your help.
I got it to work.
Google has something called cross submission: http://googlewebmastercentral.blogspot.com/2007/10/dealing-with-sitemap-cross-submissions.html
You might want to visit this blog as well:
http://stanicblog.blogspot.sg/2012/02/how-to-add-your-sitemap-file-located-in.html
Thanks for your help, yacc.
Let me answer your first two questions, one at a time (I'm not sure what you mean by 'how can I monitor my sitemap', so I'll skip that):
Manually submit a sitemap to Google
If you can't use the Google Webmaster Tools form to submit your sitemap, use an HTTP GET request to notify Google of your new sitemap.
If your sitemap is located at https://s3.amazonaws.com/sitemapbucket/sitemap.gz, first URL-encode your sitemap URL (you can use an online URL encoder/decoder for that), then use curl or wget to submit the encoded URL to Google:
curl www.google.com/webmasters/tools/ping?sitemap=https%3A%2F%2Fs3.amazonaws.com%2Fsitemapbucket%2Fsitemap.gz
If your request is successful you'll get a 200 response with a message like this:
... cut ...
<body><h2>Sitemap Notification Received</h2>
<br>
Your Sitemap has been successfully added to our list of Sitemaps to crawl.
... cut ...
Checking that Google knows about your new sitemap
Open Webmaster Tools and navigate to Site configuration -> Sitemaps; there you should see the sitemaps that you've submitted. It might take some time for a new sitemap to show up there, so check frequently.