Firstly, wow, BigCommerce's own forums are dead as heck, huh?
Anyway, I've been working on my theme locally with Stencil and for the most part it's been going fine. I added a grip of images to the /assets/img directory that category.js displays under certain conditions. That worked great while serving from localhost, but when I push the theme to BigCommerce from the command line (using 'stencil push'), those images (and a couple of custom icons, etc.) are broken. The docs warned me to make sure any added directories have the proper permissions, but 1) this is an existing directory, and 2) I'm getting a 404 anyway, so it's not that they can't be accessed; they're just not there at all. Clearly push isn't bundling them up, which means I'm doing something wrong.
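For context, the lookup in category.js is roughly like this (simplified; the category names and file names here are illustrative, not my exact code):

```js
// category.js (simplified): swap in a badge image for certain categories.
// These root-relative paths resolve fine under `stencil start` on localhost.
const badgeByCategory = {
  clearance: '/assets/img/badge-clearance.png',
  'new-arrivals': '/assets/img/badge-new.png',
};

export function badgeFor(categoryName) {
  return badgeByCategory[categoryName] || null;
}
```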
I cannot figure out how to make Nuxt generate a fully static website. It makes the API calls static, and that is awesome. But all images and download links still make requests to a remote server.
Is it possible to generate a fully static website where every link to an external file (<img src="remote.jpg">, <a href="remote.pdf">, background-image: url('remote.jpg')) is downloaded and placed in a local folder, and every URL is replaced to point at the local file? Or does Nuxt do SSG only for APIs?
You could totally optimize by putting all of your assets into the /static directory, indeed.
It will require some CI or some kind of build step to keep them properly updated and organized, but nothing impossible (this will keep everything in the same place). Meanwhile, keeping resources outside of your server isn't bad in principle either.
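As a sketch of what that build step could look like (the URL list here is hypothetical; in practice you'd collect it from your content), a small Node script that pulls the remote files into static/ before `nuxt generate` runs:

```js
// download-assets.js — run before `nuxt generate`,
// e.g. "node download-assets.js && nuxt generate".
const fs = require('fs');
const path = require('path');
const https = require('https');

// Hypothetical list of remote files your pages currently link to.
const remoteAssets = [
  'https://example.com/remote.jpg',
  'https://example.com/remote.pdf',
];

const outDir = path.join(__dirname, 'static', 'downloads');
fs.mkdirSync(outDir, { recursive: true });

for (const url of remoteAssets) {
  const dest = fs.createWriteStream(path.join(outDir, path.basename(url)));
  https.get(url, (res) => res.pipe(dest));
}
```

Your templates would then point at /downloads/remote.jpg and so on; rewriting the URLs automatically would be a second pass in the same script, since Nuxt itself won't do it for you.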
When I check my website URL in Google's URL Inspection tool, it shows that page resources could not be loaded, i.e. image, stylesheet, and script files. My website works perfectly on a live server, yet it is not rendered properly by Googlebot Smartphone. I have tried everything to remove these errors, but nothing has helped. I have also checked that these resources are not blocked in the robots.txt file.
Screenshot of page resources error
I've been struggling with this for a couple of days and finally found the only solution that has worked for me. In my case it wasn't a robots.txt problem, which I believe you'd already ruled out before posting.
The problem has to do with the number of resources Googlebot is willing to fetch before giving up. If your CSS and JS files are too many, or too big, Googlebot gives up before fetching all of the resources needed to render the page properly.
You can solve it by minifying your files via a server mod, or via plugins like WP Rocket or Autoptimize. If you have too many CSS and JS files and the problem persists after minifying, try combining these files as well by using the same plugins.
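Outside WordPress, even a crude build step can cut the request count. A naive concatenate-and-strip sketch in Node (file names are placeholders; a real minifier like cssnano handles the edge cases this ignores):

```js
// combine-css.js — concatenates stylesheets and strips comments/whitespace.
// Naive on purpose: it will mangle whitespace inside content strings, etc.
const fs = require('fs');

const files = ['reset.css', 'layout.css', 'theme.css']; // placeholders

const bundled = files
  .map((f) => fs.readFileSync(f, 'utf8'))
  .join('\n')
  .replace(/\/\*[\s\S]*?\*\//g, '')   // drop /* comments */
  .replace(/\s+/g, ' ')               // collapse runs of whitespace
  .replace(/\s*([{}:;,])\s*/g, '$1'); // tighten around punctuation

fs.writeFileSync('bundle.min.css', bundled);
```

Fewer, smaller files means Googlebot is less likely to give up partway through fetching the page's resources.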
My app is created in Next.js and works great on localhost. When I deployed it to Heroku, only the front page shows up; every other page path returns a 404 even though the URLs are correctly entered in the browser. The only page connected to the index.js file in my page paths is the front page. Do the other pages also need to be connected to the index.js file? I am terribly lost with this issue, since the site works perfectly on localhost but on Heroku every path besides / gives a 404. I didn't add any code to this question since no single file seems relevant to the issue. I've been searching all over for an answer but can't find anything relevant online; the app deploys successfully, it just won't render any path besides /.
Thank you in advance for any help you can offer. I really appreciate it!
Applications run differently on localhost than they do when deployed to a server. Since you added the react tag to the post, I assume you are trying to deploy a React app on Heroku; there is a lot of information on the internet about how to do it.
For example, this post.
Anyway, first you need to build your app correctly so that the production files are generated (you didn't mention how you are running the app).
To your question:
Relative paths are the same locally and on the server, but absolute paths will differ.
As for your 404 error, it looks like no content is being served for any path other than the root.
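For what it's worth, a common Heroku setup for Next.js (an assumption here, since you didn't show your scripts) is to let Heroku's Node buildpack run the production build and then start Next's server on the assigned port:

```json
{
  "scripts": {
    "dev": "next dev",
    "build": "next build",
    "start": "next start -p $PORT",
    "heroku-postbuild": "npm run build"
  }
}
```

If npm start ends up serving a plain static directory instead of next start, the deploy can still succeed while every route except / 404s, which matches what you're describing.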
Folks:
I'm creating an app using node-webkit. The purpose of this app is to display images and PDFs. The app needs to download those files from a central repository and cache them locally. When the app runs offline, the files should still be available and displayed.
On the face of it, this sounds like appcache is the answer - and that indeed is where I was heading when this was a pure webapp in a browser. However, now I've discovered node-webkit, and here we are.
node-webkit's GitHub wiki states:
"However, application cache is designed for browser use, for apps using node-webkit, it's less useful than the other two method, read HTML5 Application Cache if you want to use it."
But it doesn't say why.
I've also looked into the Node.js filesystem API, but that seems like an order of magnitude more complexity than I need.
Can anyone point me in a sensible direction?
Thanks.
It has to do with the nature of App Cache itself.
You specify a manifest file that lists all the static assets required for your app to run offline. You don't have any programmatic access to the cache to add and remove files via JS.
So for a node-webkit app, it'd make more sense to fetch those files and store them in the Application Support folder (or AppData, depending on the platform). That's where the Node.js part is really useful: the file I/O stuff.
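A rough sketch of that, assuming node-webkit's nw.gui module (App.dataPath resolves to the per-user data folder) and a made-up fetchAndCache helper:

```js
// cache.js — download a remote file into the app's data folder once,
// then serve the local copy (which keeps working offline).
var fs = require('fs');
var path = require('path');
var https = require('https');
var gui = require('nw.gui');

var cacheDir = path.join(gui.App.dataPath, 'file-cache');
if (!fs.existsSync(cacheDir)) fs.mkdirSync(cacheDir);

// Resolve a remote URL to a local path, downloading only on a cache miss.
function fetchAndCache(url, callback) {
  var localPath = path.join(cacheDir, path.basename(url));
  if (fs.existsSync(localPath)) return callback(null, localPath); // cache hit

  https.get(url, function (res) {
    var out = fs.createWriteStream(localPath);
    res.pipe(out);
    out.on('finish', function () { callback(null, localPath); });
  }).on('error', callback);
}

// Usage: point an <img> at the cached copy.
// fetchAndCache('https://repo.example.com/scan-001.png', function (err, p) {
//   if (!err) document.querySelector('img').src = p;
// });
```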
I am about to start a new project that requires a responsive design solution, and I'd also like to take advantage of LESS.js for the styling.
I downloaded the Responsive template from Initializr and set it up as a new site on my localhost. I began making minor tweaks to the index.html file to style the page up as my project requires. I started by adjusting the width of the .wrapper class and was frustrated that my changes didn't appear to make any difference in the browser.
However, after renaming the folder containing the files and amending the URL, my changes appear, so there is obviously some sort of caching issue when using LESS on localhost.
Does anyone know of a solution to this?
What worked for me was disabling, or deleting, cookies for the domain you're developing on.
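That probably works because many browsers wipe localStorage along with cookies, and client-side less.js caches its compiled CSS in localStorage. The documented way to skip that cache while developing is to set development mode before less.js loads (the script path here is whatever your template uses):

```html
<!-- Must come before the less.js <script> tag; development mode
     bypasses the localStorage cache so edits show up on refresh. -->
<script>
  var less = { env: "development" };
</script>
<script src="js/libs/less.js"></script>
```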