Pages not indexed in Google Search Console - development-environment

I recently tried to index 10,000 pages in Google Search Console, but more than two weeks later these pages are still not indexed. Could someone please help me fix this problem,
or share a method to get pages indexed quickly?
The site is: https://www.solutioninn.com/

As far as I can see, your website is built with AngularJS version 1, and there are a lot of known issues with AngularJS 1 and Google indexing.
Googlebot uses Chromium version 41 to crawl and render websites for the search engine. To get a clear understanding of where your website is failing to get indexed, download Chromium/Chrome version 41, load your website, and check what errors you get in the console. Even a single error can prevent your site from being indexed. You also need to implement proper meta tags.
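For illustration, "proper meta tags" usually means at least a unique title, description, and canonical URL per page. A minimal sketch (all values here are placeholders, not taken from the site in question):

```html
<head>
  <meta charset="utf-8">
  <!-- Unique, descriptive title for each page -->
  <title>Page title that describes this page</title>
  <!-- Short, unique summary shown in search result snippets -->
  <meta name="description" content="A short, unique summary of the page content.">
  <!-- Explicitly allow indexing and link-following -->
  <meta name="robots" content="index, follow">
  <!-- Canonical URL to avoid duplicate-content issues -->
  <link rel="canonical" href="https://www.example.com/this-page">
</head>
```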
I have worked extensively on this type of issue. It would be too long to write a whole solution here, so feel free to ping me directly; I may be able to help resolve your issue.

Related

Google Search Console API Results Two Days Behind the Site. How to Fix?

So I just started working with the Google Search Console API to see the top keywords that sent people to a site. Interestingly, when I use the API, the most recent data I can get is from two days ago, but if I go directly into the Search Console website as a user, I can get data from today.
Is there a way to fix this, or is it a known limitation? I know there's a 48-hour delay in getting Search Console data into Google Analytics, but I thought that applied only to the GA / Search Console connection.
Simple! You just have to set the dataState parameter to all to include the fresh data you're referring to.
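For example, with the google-api-python-client library the query body would look something like this. The site URL and dates are placeholders, and the `service` object is assumed to come from `googleapiclient.discovery.build("searchconsole", "v1", credentials=...)`:

```python
# Sketch of a Search Analytics query that includes fresh (not yet finalized)
# data by setting dataState to "all"; the default, "final", omits the most
# recent days. Dates, dimensions, and site URL below are placeholders.
request_body = {
    "startDate": "2024-01-01",
    "endDate": "2024-01-07",
    "dimensions": ["query"],
    "rowLimit": 25,
    "dataState": "all",  # include fresh data from the last ~2 days
}

# With an authorized `service` object the call would be:
# response = service.searchanalytics().query(
#     siteUrl="https://www.example.com/", body=request_body
# ).execute()
```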

Google SEO Using Search Console

I am writing my personal website, but it was not showing up in search engines, so I searched different forums. After a lot of searching on the internet I read about Google Search Console. I registered my site, and after about 8 hours it was successfully showing in Google via "site:personalwebsite.xyz", but after 24 hours my site was missing again (0 results returned). Guys, I am only a novice programmer, so please help me get my site to show in Google. Thank you in advance.
Google usually shows only the best and most-used content. Yours may not be matching the keywords, or there may be sites ranking more successfully than yours.
Google may also treat your site as a "spam site", and that is why it won't show.

Vaadin 7 make application SEO ready

Google Ajax-crawling instructions say the !# is actually transformed into ?_escaped_fragment_ by the google crawler.
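As a sketch of that transformation (a hypothetical helper, not part of any library): per the AJAX-crawling scheme, everything after #! is moved into a percent-encoded _escaped_fragment_ query parameter.

```python
from urllib.parse import quote

def escaped_fragment_url(url: str) -> str:
    """Rewrite a #! URL the way the crawler does under the old
    AJAX-crawling scheme. URLs without #! are returned unchanged."""
    base, sep, frag = url.partition("#!")
    if not sep:
        return url
    # Append with ? or &, depending on whether a query string already exists
    joiner = "&" if "?" in base else "?"
    # The fragment value is percent-encoded, including slashes
    return f"{base}{joiner}_escaped_fragment_={quote(frag, safe='')}"

print(escaped_fragment_url("https://example.com/#!/products/42"))
# https://example.com/?_escaped_fragment_=%2Fproducts%2F42
```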
I'd like to prepare my Vaadin 7 application to be SEO-ready for the Google search engine, so could you please tell me whether there is any out-of-the-box functionality that would simplify that process by handling requests with ?_escaped_fragment_?
If there is no out-of-the-box solution, what is the right way to implement this?
Or, another idea: is it possible to use Prerender.io together with Vaadin?
UPDATED
Looks like nowadays Google is able to crawl, render, and index the #! URLs.
Q: My site currently follows your recommendation and supports _escaped_fragment_. Would my site stop getting indexed now that you've deprecated your recommendation?
A: No, the site would still be indexed. In general, however, we recommend you implement industry best practices when you're making the next update for your site. Instead of the _escaped_fragment_ URLs, we'll generally crawl, render, and index the #! URLs.
https://webmasters.googleblog.com/2015/10/deprecating-our-ajax-crawling-scheme.html
Can someone please confirm that a Vaadin application can be successfully crawled by Googlebot?
Check out the Volga add-on and the related blog post:
Volga is a Vaadin add-on that helps you to add meta data to your Vaadin applications, which will help social media services and search engines better interpret your application.

Automatic Google Indexing

We have implemented Google Site Search on our company website, and we need to automate Google's indexing of the site.
For example, when our customers update the forum, we need the updated forum information to show up in our forum search.
Is there an option for this in a Google API, or any other API? Please help.
You can use an XML sitemap. It tells the search engines where your content is so they can find and crawl it. Keep in mind there is no way to make the search engines crawl your site exactly when you want them to; they crawl on a schedule they determine to be right for your site. (You can set a crawl rate in Google Webmaster Tools, but that rate is relative to the crawl rate Google has already set for you; setting it to fastest will not speed up their crawl.)
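For reference, a minimal sitemap following the sitemaps.org protocol looks like this (the URL and date are placeholders; in practice you would regenerate the file whenever forum content changes):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/forum/thread-123</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```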
Unfortunately, Google will only crawl your site when it feels like it. How often that happens depends on many variables (site ranking, standards compliance, and so on). The sitemap XML is a helpful way to tell Google which parts of your site to index; however, even without one, Google will find your pages by crawling links from other parts of your site and will update its index when pages change.
The more visitors you get, and the more often your site's links appear on other sites, the more frequently Google will index you.
To start, I'd suggest http://validator.w3.org/ to validate your site and get it as close as possible to zero errors. This makes it easier for Google to index your site, because it can find the information it expects without having to crawl over invalid markup. Also, chances are a site that validates with very few errors is more credible than one containing many: it tells the search engine that you keep your site updated so that most browsers can use it and that it is accessible.
Also validating your site gives you some bragging rights over those who don't meet W3 standards :)
Hope this helps!

Sitemap.xml - Google not indexing

I have created a sitemap for my site and it complies with the protocol set by http://www.sitemaps.org/
Google has been told about this sitemap via webmaster tools. It has tracked all the urls within the sitemap (500+ urls) but has only indexed 1 of them. The last time google downloaded the sitemap was on the 21st of Oct 2009.
When I do a google search for site:url it picks up 2500+ results.
Google says it can crawl the site.
Does anyone have any ideas as to why only 1 url is actually indexed?
Cheers,
James
First off, make sure Google hasn't been forbidden from those pages using robots.txt, etc. Also make sure those URLs are correct. :)
Second, Google doesn't just take your sitemap at face value. It uses other factors, such as inbound links, etc, to determine whether it wants to crawl all of the pages in your sitemap. The sitemap then serves mostly as a hint more than anything else (it helps Google know when pages are updated more quickly, for example). Get high-quality, relevant, useful links (inbound and outbound) and your site should start getting indexed.
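As a quick way to run the robots.txt check bdonlan mentions, Python's standard-library urllib.robotparser can evaluate rules against specific URLs (the rules and URLs below are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A URL under /private/ is blocked for all crawlers, including Googlebot
print(rp.can_fetch("Googlebot", "https://example.com/private/page"))
# A URL outside the disallowed path is allowed
print(rp.can_fetch("Googlebot", "https://example.com/forum/thread-123"))
```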
Your two statements seem to contradict one another.
but has only indexed 1 of them.
and
When I do a google search for site:url it picks up 2500+ results
bdonlan is correct in their logic (robots.txt and Google's limited trust in sitemaps), but I think the issue is what you "think" is true about your site.
That is, Google Webmaster Tools says you only have 1 page indexed, but site:yoursite.com shows 2.5k.
Google Webmaster Tools isn't very accurate. It is nice, but it is buggy and MIGHT help you learn about issues with your site. Trust the site: command: you're in Google's index if you search site:yoursite.com and see more than 1 result.
I'd trust site:yoursite.com. You have 2.5k pages in Google, indexed and searchable.
So, now optimize those pages and watch the traffic flow. :D
Sidenote: Google can crawl many kinds of sites, including Flash and JavaScript ones.