Ajax Load Contact Form and SEO

I'm thinking of creating an Ajax-loaded contact form, so that when the user successfully submits the form, the Thanks! page loads via Ajax.
That said, if I did this, my SEO wouldn't show that the visitor reached the thanks page... would it?
I was thinking that if this were the case, I could add my Analytics code to the thanks page.
Thanks in advance

Your contact form is typically not a Search Engine Optimization concern. I've never heard of a site that gets a significant amount of search referral traffic to the contact form.
I believe that Googlebot views the presence of a contact form as a sign of a quality site (similar to having a privacy policy and an about page). For this purpose, I wouldn't imagine that having it load via AJAX would hurt, as long as there were a link on the home page that includes the word "contact".
It sounds like you might actually be worried about analytics, not SEO. Maybe you should rephrase the question?

Related

Scrapy: How to navigate, select and submit a form

I am trying to make a bot that simulates some human behaviors, and I found some instructions on using Scrapy to log in to a page like nike.com.br, but when I need to select some buttons and submit some forms, I could not find out how.
Can anyone help me with this?
For example, after logging in, I need to choose the size of the product and click "add to cart". Is there some way to do this using Scrapy?
It's hard to answer your question because it's too generic, and it will probably have different solutions for different pages.
Generally speaking, you need to check what the page does when you click to submit the form. It's most likely a POST request, so you will need to mimic that POST request with Scrapy (check FormRequest); there's a sketch below.
The same logic applies to adding an item to the cart.
I think the best way to approach this is to use the browser's network tool. The Scrapy docs have a few tips about using it for a similar purpose (here).
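Here's a minimal sketch of that approach. The URLs and form field names are hypothetical (not taken from nike.com.br); inspect the real requests in your browser's network tool and substitute what you actually see:

```python
# Sketch: log in via a form POST, then mimic the "add to cart" POST.
# All URLs and field names below are assumptions for illustration.
import scrapy


class CartSpider(scrapy.Spider):
    name = "cart"
    start_urls = ["https://www.example.com/login"]

    def parse(self, response):
        # FormRequest.from_response picks up hidden fields (e.g. CSRF
        # tokens) from the login form and merges in our credentials.
        yield scrapy.FormRequest.from_response(
            response,
            formdata={"username": "me@example.com", "password": "secret"},
            callback=self.after_login,
        )

    def after_login(self, response):
        # Mimic the POST the page sends when you pick a size and
        # click "add to cart" (endpoint and fields are assumptions).
        yield scrapy.FormRequest(
            "https://www.example.com/cart/add",
            formdata={"product_id": "12345", "size": "42"},
            callback=self.after_add,
        )

    def after_add(self, response):
        self.logger.info("Cart response status: %s", response.status)
```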

Handling SEO on Isomorphic React

I'm using React & Node.js to build universal apps. I'm also using react-helmet as a library to handle the page title, meta tags, description, etc.
But I have some trouble when loading content dynamically using Ajax: the Google crawler cannot fetch my site correctly because the content is loaded dynamically. Any suggestions to tackle this problem?
Thank you!
I had a similar situation, but with Django as the backend; I think it doesn't matter which backend you use.
First, let me get to the basics: the Google bots don't actually wait for your Ajax calls to complete. If you want to test this, register your page with Google Webmaster Tools and try Fetch as Google; you will see how your page is seen by the bots (mine was just an empty page with a loading icon). Since the calls don't complete, there is no data and the page is empty, which is bad for SEO, as bots read text.
So what you need to do is try server-side rendering. You can do this in two ways: either use prerender.io, or create templates on the backend that are served when the page is requested for the first time, after which your single-page app kicks in.
Prerender is paid, but internally it uses PhantomJS, which you can use directly. That did not work out very well for me, so I went with creating templates on the backend: when a bot or a user comes to the page for the first time (the first entry), the page is served from the backend; otherwise it is served by the frontend. A rough sketch of that idea follows.
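Since the answer mentions Django, here is a rough sketch of the backend-template idea; the view, model, and template names are all hypothetical:

```python
# Hypothetical Django view: render the initial page with the content
# already in the HTML so crawlers see text without running any JS.
from django.shortcuts import get_object_or_404, render

from .models import Article  # hypothetical content model


def article_page(request, slug):
    article = get_object_or_404(Article, slug=slug)
    # article.html renders the title, meta tags and body text
    # server-side, and also loads the React bundle, which takes
    # over on the client after the first paint.
    return render(request, "article.html", {"article": article})
```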
Feel free to ask in case of any questions :)

Are captchas needed for my online form?

I have an online form in ASP.NET which uses the jQuery wizard, and I am not sure whether the last stage needs a captcha control to prevent bots/crawlers.
So would I need a captcha for my online web form? Is it recommended?
A captcha is recommended if your application/form is being attacked by bots. If you feel there are attacks and you have sensitive information, you can opt for a captcha (or reCAPTCHA); a server-side verification sketch is included below.
If you are not going to make it public, i.e. you provide the form to invited users only, then you probably don't need a captcha. But if it is going to be open to the public (like Gmail, for example), then it is definitely recommended to put a captcha in the form.
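For reference, here is a minimal sketch of the server-side check. The question uses ASP.NET, but the same verification logic applies in any language; this sketch uses Python's requests library against Google's siteverify endpoint:

```python
# Minimal sketch of server-side reCAPTCHA verification.
import requests

VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"


def is_human(secret_key: str, recaptcha_response: str) -> bool:
    """Return True if Google confirms the captcha was solved.

    `recaptcha_response` is the g-recaptcha-response value posted
    along with your form; `secret_key` comes from your reCAPTCHA
    admin console.
    """
    reply = requests.post(
        VERIFY_URL,
        data={"secret": secret_key, "response": recaptcha_response},
        timeout=5,
    )
    return reply.json().get("success", False)
```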

SEO: Adding to Google other than submitting directly to Google's crawler - http://www.enshaeyah.webs.com

What are other ways of making your website searchable by Google, other than submitting the link directly to Google?
Submitting links to Yahoo is a breeze; they get crawled within a day or two... Google, though, takes a while...
Thanks...
If you add a link to your website on a site that's already indexed by Google, Google will follow it and reach your site without you needing to submit it to their page. It's actually not recommended to submit your site to their page, because then you're put at the end of the queue. But if you have a link on a page that Google indexes in the next minute, it will get to you much faster. The more links on many pages with higher ranking, the better. Cheers.
Add your site to DMOZ.org, and encourage everyone you know to link to your site. The more places that link to your site, the more likely it'll get indexed sooner (and more fully), and the better it will rank.
Also, if your site is very large, it is not unreasonable to sign up for their webmaster tools and submit a sitemap index (a minimal sketch of generating one is below). This is especially effective for fast indexing and showing up in obscure search results, but it will not help you rank for difficult terms.
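As an illustration, here is a minimal Python sketch that generates such a sitemap; the URLs are placeholders, and real sitemaps often also include optional tags like lastmod:

```python
# Hypothetical sketch: build a minimal sitemap.xml to submit through
# Google's webmaster tools. The URLs below are placeholders.
urls = [
    "http://www.example.com/",
    "http://www.example.com/about",
]

entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("sitemap.xml", "w") as f:
    f.write(sitemap)
```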
Also note that even if your site was visited by Googlebot, it doesn't necessarily end up in the Google index. Use this link to check: http://www.google.com/webmasters/tools/sitestatus

Is this a blackhat SEO technique?

I have a site which has been developed completely in Flash. The site owners do not want to shift to a more text/HTML-based site, so I am planning to create an alternative HTML/text-based site that Googlebot will be redirected to (by checking the user agent). My question is: is this officially allowed by Google?
If not, then how come there are many subscription-based sites which display a different set of data to Google than to their users? Is that allowed?
Thank you very much.
I've dealt with this exact scenario for a large ecommerce site and Google essentially ignored the site. Google considers it cloaking and addresses it directly here and says:
Cloaking refers to the practice of presenting different content or URLs to users and search engines. Serving up different results based on user agent may cause your site to be perceived as deceptive and removed from the Google index.
Instead, create an ADA-compliant version of the website so that users with screen readers and vision aids can use your site. As long as there is a link from your home page to your ADA-compliant pages, Google will index them.
The official advice seems to be: offer a visible link to a non-Flash version of the site. Fooling Googlebot is a surefire way to get into trouble. And remember, Google results will link to the matching page! Do not create useless results.
Google already indexes Flash content, so my suggestion would be to check how your site is being indexed. Maybe you don't have to do anything.
I don't think showing an alternate version of the site is good from a Google perspective.
If you serve up your page at the exact same address, then you're probably fine. For example, if you show 'http://www.somesite.com/' but direct Googlebot to 'http://www.somesite.com/alt.htm', then Google might direct search users to alt.htm. You don't want that, right?
This is called cloaking. I'm not sure what the effects of it are, but it is certainly not whitehat. I am pretty sure Google is working on a way to crawl Flash now, so it might not even be a concern.
I'm assuming you're not really doing a redirect but instead a PHP include or something similar, so it shows up as the same page. If you're actually redirecting, then it's just going to index the other page as normal.
Some sites offer a different level of content -- they LIMIT the content; they don't offer alternative or additional content. This is generally done so Google doesn't index unrelated things.