Is a KeystoneJS web app SEO-friendly? - keystonejs

I am building a new web app using KeystoneJS with hbs templates. Can Google crawl my website?
Thanks.

Yes, Google will be able to crawl your site.
Handlebars templates will be rendered on the server before being sent to the client.
You might want to set meta tags in your templates to help with SEO.
On the following web page, there is a section called "3.2. SEO and Social Optimization" that you might find helpful:
https://nodevision.com.au/blog/post/tutorial-blogging-with-nodejs-and-keystone-cms
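As a sketch, SEO meta tags in a Handlebars layout might look like the following (the `title` and `description` variables are assumptions; pass whatever your Keystone routes expose to the view locals). Because the template is rendered on the server, crawlers receive the final values in the HTML:

```handlebars
<head>
  <title>{{title}}</title>
  {{!-- Rendered server-side, so crawlers see the resolved values --}}
  <meta name="description" content="{{description}}">
  <meta property="og:title" content="{{title}}">
  <meta property="og:description" content="{{description}}">
</head>
```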

Related

Hosting login-gated content on a Jekyll blog

I'm hosting a static Jekyll website with written Markdown content on Netlify (e.g. cooking recipes or something). I'd like to integrate a login flow so that some of the HTML pages are visible to members only, but my understanding is that this is not possible on static Jekyll websites. Is that the case, or is there a simple way to add members-only content on Jekyll?
I can do this trivially by writing some Vue.js code, but I also want to preserve the ability to render markdown->html. Is there a way I can get login-gated content on Jekyll natively, or somehow get some other web framework to render Jekyll-compiled content correctly?

How to redirect indexing bots from a Vue.js SPA to server-rendered pages so they can be crawled and indexed?

I need to develop a Vue.js SPA where some of its pages need to be indexed by search engines.
I've read about multiple ways to make SPAs SEO-friendly and found the following solutions:
Server-rendered pages
Prerendering
Since we have a lot of dynamic content to index, generating a static page for each "row" in the database is not practical: we have hundreds, if not thousands, of content pages.
Creating multiple routes (one for users to visualize and one for bots to crawl)
This solution was proposed by my manager, and it interests me since it's more suitable for our case.
I found an article that illustrates the idea using another SPA framework.
My question is: how can I detect that a crawler or indexing bot has accessed our SPA so I can redirect it to our server-rendered web pages, and how do I actually achieve that in Vue.js 2 (Webpack)?
If you are concerned about SEO, the solution is to use SSR. Read through https://v2.vuejs.org/v2/guide/ssr.html#The-Complete-SSR-Guide.
If you already have server-rendered web pages, as you mentioned in your question, then using SSR would be less work than keeping the Vue SPA and server app in sync. Not to mention, if the Vue SPA and server app show different content, this could hurt SEO and frustrate users.
Any method you use to target web crawlers can be bypassed, so there is no generic solution for this. Instead, try to focus on specific web crawlers, like Google's.
For Google, start by registering your app at https://www.google.com/webmasters. If you need to determine whether the Google crawler visited your server, look at the user agent string; there are already other answers covering that: Is it possible to find when google bot is crawling any urls on my site and record the last access time to a text file on server, https://support.google.com/webmasters/answer/80553?hl=en.
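A minimal sketch of the user-agent check mentioned above. Note that UA strings can be spoofed, so this is a heuristic only; Google recommends a reverse-DNS lookup to verify that a request really came from Googlebot:

```javascript
// Naive Googlebot detection by user-agent string (heuristic only).
// The function name and bot list are illustrative, not a standard API.
function isGooglebot(userAgent) {
  return /Googlebot/i.test(userAgent || '');
}

// Example use in any Node request handler: req.headers['user-agent']
console.log(isGooglebot(
  'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)'
)); // true
console.log(isGooglebot('Mozilla/5.0 (Windows NT 10.0) Chrome/120.0')); // false
```

In an Express-style middleware you would branch on this result and hand bot traffic to the server-rendered route instead of the SPA shell.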

User registration in Gatsby.js

Is there a way to let a website developed using Gatsby.js have user registration, so a user can have a profile and edit it, and can add a new article or item to the website?
I know Gatsby is a static site generator, but I'm wondering if I can use Gatsby as a solution for such a web application, and if not, what tools do I need to include to make it work?
Thanks.
Not sure why you are getting downvoted.
Apart from the comment by @fabian-schultz, the keyword you are looking for, I believe, is CMS.
Basically, you just want some dynamic content in a website that is otherwise generated by a static site generator.
There are a lot of choices. For example, you can follow the official tutorial to use Netlify CMS with Gatsby.
Hope it's enough to get you started.
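To make the Netlify CMS suggestion concrete, a sketch of a `static/admin/config.yml` might look like this (the collection name, folders, and fields are assumptions for illustration; adapt them to your content model):

```yaml
backend:
  name: git-gateway   # commits edits back to your Git repo via Netlify
  branch: master

media_folder: static/images
public_folder: /images

collections:
  - name: articles
    label: Articles
    folder: content/articles
    create: true        # lets logged-in users add new articles
    fields:
      - { name: title, label: Title, widget: string }
      - { name: date, label: Date, widget: datetime }
      - { name: body, label: Body, widget: markdown }
```

Paired with Netlify Identity for login, this gives registered users an editing UI while Gatsby keeps generating the site statically.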

How to make React serve static pages for SEO

I have this reactjs webapp - http://52.26.51.120/
It loads a single page and then you can click links to dynamically load other boards on the same page... such as 52.26.51.120/#/b, 52.26.51.120/#/g
Now the thing is, I want each of these to be an individual page for the purpose of SEO. I want you to be able to google "my-site.com g" and see the result 52.26.51.120/#/g. This obviously can't work if you are using React and it is all on one page. Google will only render and cache that main page (can it even render that page? Does Google execute JavaScript?).
Even more specifically, I'd want something like my-site.com/g/1234 to load a thread. There could be a thousand threads every day, so I'm not sure how to handle that.
Are there any solutions to this, to allow React to serve up static pages as HTML so that they can be cached by Google? I am using Express and webpack as my server right now, and all the code lives in JSX files.
Thanks
You can use the Prerender.io service. In a nutshell, you connect your server app or web server with the prerender service and do a little setup (see the documentation). After that, search engines and crawlers will be served static (prerendered) content, while users will get the classic React.js (or any other single-page) app.
There is a hosted service with a free tier, as well as an open-source version.
I suspect you could do some smart configuration with nginx that converts a URL of domain/board into a redirect to domain/#/board, depending on whether the user agent looks like a bot or a typical user.
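A minimal nginx sketch of the user-agent approach described in the answers above (the bot list and file paths are assumptions): crawlers get a prerendered static copy of each board, while everyone else gets the SPA shell.

```nginx
# Sketch only: map known crawler user agents to a flag.
# The prerendered files are assumed to exist under /prerendered/.
map $http_user_agent $prerender {
    default       0;
    ~*googlebot   1;
    ~*bingbot     1;
}

server {
    listen 80;
    root /var/www/spa;

    location / {
        if ($prerender) {
            # Serve the static, prerendered copy to crawlers
            rewrite ^/(.*)$ /prerendered/$1.html last;
        }
        # Regular users get the single-page app shell
        try_files $uri /index.html;
    }
}
```

As with any UA-based scheme, this can be spoofed, and serving crawlers meaningfully different content than users risks being treated as cloaking, so the prerendered pages should mirror what users see.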

Post a link to my site on the Google+ page related to my site, via the app related to that page

I have my own application in the Google API console,
and a Google+ page for this application at https://cloud.google.com/console.
Is it possible to publish links to my website via the Google API on the Google+ page's wall?
The short answer is no.
While there is a Pages API that supports this functionality, currently only publishing services like HootSuite have access to it.