I am developing a PHP script that will be free for users to download. The script references CSS and JS files hosted on my own network, so every user who installs it will load those CSS and JS files from my website.
Will that improve the SEO of my website?
The answer is no.
It doesn't matter to Google and has no influence on SEO, because it doesn't create a backlink to your website.
CSS and JS file references are not links.
Yes, of course.
A CDN improves page load times for your users.
A CDN improves security for your users.
A CDN also benefits mobile users.
I don't know if this is the right place to ask this question. I'm planning to build a blog website as a SPA (Single Page Application) using Vue.js. As we all know, SPA websites are not SEO friendly, and we need SSR or prerendering to overcome this problem.
From some sources on the internet I found out that applying SSR to a website makes it consume more server resources, so I might need to upgrade my hosting plan, which would affect my website's operating cost.
Can you give me advice on what I should do?
From your experience, does SSR really cost that much in server resources?
Should I build my website without a JavaScript framework and use fully dynamic PHP pages instead?
I am building a new web app using Keystone.js with hbs (Handlebars) templates. Can Google crawl my website?
Thanks.
Yes, Google will be able to crawl your site.
Handlebars templates will be rendered on the server before being sent to the client.
You might want to set meta tags in your templates to help with SEO.
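For example, a minimal head section in one of your Handlebars templates could look like the sketch below (the page.title and page.description variables are assumptions about your own view data, not something Keystone provides out of the box):

<head>
  <title>{{page.title}}</title>
  <meta name="description" content="{{page.description}}">
</head>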
On the following web page, there is a section called 3.2. SEO AND SOCIAL OPTIMIZATION which you might find helpful:
https://nodevision.com.au/blog/post/tutorial-blogging-with-nodejs-and-keystone-cms
I have this reactjs webapp - http://52.26.51.120/
It loads a single page, and then you can click links to dynamically load other boards on the same page, such as 52.26.51.120/#/b and 52.26.51.120/#/g.
Now the thing is, I want each of these to be an individual page for the purpose of SEO. I want to be able to google "my-site.com g" and see the result 52.26.51.120/#/g. This obviously can't work if you are using React.js and everything is on one page. Google will only render and cache that main page (can it even render that page? Does Google execute JavaScript?).
Even more specifically, I'd want something like my-site.com/g/1234 to load a thread. There could be a thousand threads every day, so I'm not sure how to handle that.
Are there any solutions to this, to allow React to serve up static pages as HTML so that they can be cached by Google? I am using Express.js and webpack as my server right now, and all the code is in JSX files.
Thanks
You can use the Prerender.io service. In a nutshell, you connect your server app or web server to the Prerender service and do a little setup (see the documentation). After that, search engines and crawlers will be served static (prerendered) content, while users will get the classic React.js (or any other single-page) app.
There is a hosted service with a free tier, as well as an open-source version.
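Since you're already on Express, a rough sketch of wiring in the prerender-node middleware might look like this (the token, port, and dist directory are placeholders for your own setup):

// server.js - minimal sketch, assuming an Express app serving the webpack build
const express = require('express');
const prerender = require('prerender-node');

const app = express();

// Requests from known crawlers are proxied to the Prerender service;
// regular users still get the normal single-page app.
app.use(prerender.set('prerenderToken', 'YOUR_PRERENDER_TOKEN')); // placeholder token

app.use(express.static('dist')); // assumed webpack output directory
app.get('*', (req, res) => {
  res.sendFile(__dirname + '/dist/index.html');
});

app.listen(3000);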
I suspect you could do some smart configuration with nginx so that, when the user agent is detected as a bot instead of a typical user, a URL like domain/board is redirected to domain/#/board.
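A rough nginx sketch of that idea (the bot list, host name, and location pattern are only illustrative assumptions):

# in the http context: flag common crawler user agents
map $http_user_agent $is_bot {
    default                       0;
    ~*(googlebot|bingbot|yandex)  1;
}

server {
    listen 80;
    server_name my-site.example;  # placeholder host

    # /g, /b, ... -> send crawlers to the hash route the SPA understands
    location ~ ^/([a-z0-9]+)$ {
        if ($is_bot) {
            return 302 "/#/$1";
        }
        try_files $uri /index.html;
    }
}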
I have a website with two different page structures: one for mobile visitors and one for desktop. That's why I have two sitemap files, one for mobile and one for desktop.
I want to create a robots.txt file that will "tell" search engines bots to scan the mobile sitemap for mobile sites, and the desktop sitemap for desktop sites.
How can I do that?
I thought of creating a sitemap index file which will point to both of those sitemaps, and adding the following directive to the robots.txt file:
sitemap: [sitemap-index-location]
Is this the right way?
I can't say for certain, but I believe the best practice is to declare both sitemaps in robots.txt. In the mobile sitemap you already have the <mobile:mobile/> markup indicating that it is the mobile version.
Another interesting option is to also create a sitemap index:
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://example.com/sitemap.desktop.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://m.example.com/sitemap.mobile.xml</loc>
  </sitemap>
</sitemapindex>
And your robots.txt will look like:
# Sitemap index
Sitemap: http://example.com/sitemap.xml
# Other sitemaps. I know they are already declared in the sitemap index, but I believe it does no harm to also list them here
Sitemap: http://example.com/sitemap.desktop.xml
Sitemap: http://example.com/sitemap.mobile.xml
robots.txt does not tell the search engine which pages are the mobile version and which are the desktop version; it can only declare your sitemaps, and that is sufficient. What you can also do is add annotations in the HTML head of each page, so that the desktop page points to its mobile counterpart and the mobile page points back to the desktop one. Linking to the sitemap from the pages themselves can also help get them included.
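A sketch of those head annotations, following the rel="alternate"/rel="canonical" pattern Google documents for separate mobile URLs (the URLs and breakpoint are placeholders):

<!-- On the desktop page (http://example.com/page-1) -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page-1">

<!-- On the mobile page (http://m.example.com/page-1) -->
<link rel="canonical" href="http://example.com/page-1">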
I would recommend responsive web design instead.
With a responsive web design technique, you can adapt web pages using CSS3 media queries. There is one HTML document for the page regardless of the device accessing it, but its presentation changes through CSS media queries that specify which CSS rules the browser applies when displaying the page.
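For example, a minimal media query might look like this (the breakpoint and class names are purely illustrative):

/* Base (desktop) layout */
.sidebar { width: 30%; float: left; }
.content { width: 70%; float: right; }

/* Narrow screens: stack the columns instead (640px breakpoint is an assumption) */
@media only screen and (max-width: 640px) {
  .sidebar, .content { width: 100%; float: none; }
}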
With responsive web design, you can keep both the desktop and mobile content on a single URL. That makes it easier for your users to interact with, share, and link to, and easier for Google's algorithms to assign indexing properties to your content.
Besides, Google can crawl your content more efficiently, and there is no need to crawl the page with a different Googlebot user agent.
You can then define a single sitemap and reference it in the robots.txt file; it will cover both your desktop and mobile content.
In addition to stating the sitemaps in robots.txt, you should log into Google Webmaster Tools and submit them there. That will tell you:
whether the sitemap URL you submitted is correct
whether the sitemap file has the correct syntax
how many of the URLs in the sitemap have been crawled
how many of the URLs in the sitemap have been indexed
We are looking into the new bundling feature of ASP.NET MVC 4 and are wondering if there are any advantages to bundling CSS files that are served from a CDN?
Is there even a way to bundle multiple files served up from a CDN in ASP.NET MVC 4?
This doesn't work:
var cdnCssPath = "http://MyCdn/css/";
bundles.Add(new StyleBundle("~/Content/css", cdnCssPath)
.Include("~/Content/site.css")
.Include("~/Content/Test1.css")
.Include("~/Content/Test2.css")
.Include("~/Content/Test3.css")
);
Any ideas?
First of all, it depends on whether you have access to a CDN where you can upload your own files, or whether you're using, for example, Google's CDN to get external libraries like jQuery.
If you pull files from a CDN and bundle them, you lose the advantage of using a CDN unless you're able to upload the new bundled file back to the CDN.
For example, if you get jQuery and jQuery UI from Google's CDN and bundle them, you're no longer using Google's CDN; you're instead serving up local resources (the created bundle).
You may have reduced the number of requests, but instead of two requests to Google's CDN (which has a high probability of already being cached by the user's browser) there's one request to your server (which is not as likely to be cached).
So, in short, I would say there's no advantage to bundling files that come from a CDN; however, uploading your bundled files to a CDN is a different story.
Do note that it is possible to use a CDN for bundles, though:
look at the "Using a CDN" part of this article
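Roughly, that approach looks like the sketch below; the CDN URL and bundle names are just examples, so check the linked article for details:

public static void RegisterBundles(BundleCollection bundles)
{
    // Emit the CDN URL instead of the local bundle URL when optimizations are enabled
    bundles.UseCdn = true;

    // The local file is used in debug mode; the CDN URL is served in release mode
    bundles.Add(new ScriptBundle("~/bundles/jquery",
            "https://ajax.aspnetcdn.com/ajax/jQuery/jquery-1.10.2.min.js")
        .Include("~/Scripts/jquery-{version}.js"));
}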
Edit: Here's an article that explains when to use a CDN or not and why, a bit more in-depth than my answer: http://www.kendoui.com/blogs/teamblog/posts/13-11-07/know-when-to-cdn.aspx