I cloned this repo: https://github.com/Code-Pop/pwa-with-vue-3/tree/09-End
Then I ran npm install and npm run build, and now I have a dist folder.
I'm trying to upload all the files from the dist folder to an S3 bucket configured to host static files.
Everything gets uploaded fine except service-worker.js. I don't have a detailed error, just "Network Error", and I have tried multiple times.
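For context, the full sequence is roughly the following (the upload is shown here as an AWS CLI sync only as one way to do it; the bucket name is a placeholder):

git clone https://github.com/Code-Pop/pwa-with-vue-3.git
cd pwa-with-vue-3 && git checkout 09-End
npm install
npm run build
# upload the build output; "my-static-site-bucket" is a placeholder
aws s3 sync dist/ s3://my-static-site-bucket/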
I'm having an issue installing an npm package from GCP Artifact Registry.
I was able to publish the package to the Artifact Registry by doing the following steps:
1. Log in to my Google account (gcloud auth application-default login)
2. Run:
gcloud artifacts print-settings npm \
  --project=[my-project] \
  --repository=[my-repo] \
  --location=us-east1 \
  --scope=@[my-scope]
3. Paste the output of the previous step into the .npmrc file located in the root of the project.
4. Refresh the access token to GCP (npx google-artifactregistry-auth ./.npmrc)
5. Run yarn publish
My .npmrc file looks like this:
@[my-scope]:registry=https://us-east1-npm.pkg.dev/[my-project]/[my-repo]/
//us-east1-npm.pkg.dev/[my-project]/[my-repo]/:_authToken="[auth-token]"
//us-east1-npm.pkg.dev/[my-project]/[my-repo]/:always-auth=true
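Put together, the publish flow I ran looks roughly like this (all bracketed names are placeholders, as above):

gcloud auth application-default login
gcloud artifacts print-settings npm \
  --project=[my-project] \
  --repository=[my-repo] \
  --location=us-east1 \
  --scope=@[my-scope]
# paste the printed settings into ./.npmrc, then refresh the token
npx google-artifactregistry-auth ./.npmrc
yarn publish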
However, when I try to install the package on another project by:
1. Executing steps 1-4 mentioned above
2. Running yarn add @[my-scope]/[my-package]
I get a 404 error.
It looks like Yarn is looking for the package in the default registry:
error An unexpected error occurred: "https://registry.yarnpkg.com/@[my-scope]/[my-package]/-/@[my-scope]/[my-package]-0.0.1.tgz: Request failed \"404 Not Found\"".
I simply followed the installation instructions in GCP, but somehow it's not working.
I encountered a similar issue in this post: Can't install a scoped package I published to a npm registry in GCP, but it's not the exact error I get.
I would appreciate any help regarding this issue.
Thanks in advance!
I just had this problem for a couple of days and the solution is simple: DO NOT USE YARN when publishing. That's it.
I don't know which part of Yarn causes this, but it basically ignores .npmrc, so the tarball ends up pointing at the wrong registry (you can check this by running yarn info). So when publishing to GCP Artifact Registry, use npm publish instead.
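In practice the publish flow then becomes something like this (same placeholder names and .npmrc as in the question; only the final command changes):

# refresh the Artifact Registry token as before
npx google-artifactregistry-auth ./.npmrc
# publish with npm, not yarn
npm publish
# optional sanity check: the tarball URL should point at pkg.dev, not registry.yarnpkg.com
npm view @[my-scope]/[my-package] dist.tarball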
In both "Setting up authentication for npm" and the "Obtaining an access token" section of "Managing Node.js packages", the command used is
npx google-artifactregistry-auth
In the same section there is a note that explains how to add flags if you need to change the path of the .npmrc file.
Note: If you need to store your repository settings and credentials in .npmrc files other than the defaults, you can run the credential helper with additional flags.
--repo-config is the .npmrc file with your repository settings. If you don't specify this flag, the default location is the current directory.
--credential-config is the path to the .npmrc file where you want to write the access token. The default is your user .npmrc file.
Instead of:
npx google-artifactregistry-auth ./.npmrc
It could be written as
npx google-artifactregistry-auth --repo-config=pathto/.npmrc --credential-config=pathto/.npmrc
If you are not sure where your file is, you can run npm config ls -l | grep config, as explained here.
Also check that you are specifying the correct .npmrc path if it is different from the default, as shown in Configuring npm, and confirm that you are installing the package from the Node.js package repository with the correct scope, package name, tag or version, so that the request is completely explicit.
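For example, on the consuming project you can check where Yarn resolves the tarball from and be fully explicit about scope and version when installing (placeholders as in the question):

yarn info @[my-scope]/[my-package] dist.tarball
yarn add @[my-scope]/[my-package]@0.0.1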
I'm using a Vue 2 CLI project with Vuex and the router (without history mode), but when I open index.html after running npm run build, the site is empty. How can I solve it? I need the site to open from index.html (like a common CDN project).
The dist directory is meant to be served by an HTTP server (unless you've configured publicPath to be a relative value), so it will not work if you open dist/index.html directly over file:// protocol.
Source
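If you really need dist/index.html to work when opened directly from the filesystem (and you are not using history mode), a minimal sketch of that relative publicPath configuration is a vue.config.js in the project root like:

// vue.config.js
module.exports = {
  publicPath: './'
}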
One way to test your build is to install serve:
# install serve globally
npm install -g serve
# run in the project root, with the folder name as input
serve -s dist
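If you'd rather not install anything globally, the same check works as a one-off (assuming npm 5.2+ so that npx is available):

npx serve -s dist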
I am trying to run npm install from inside AWS Lambda.
But I'm getting the below error.
Setting --prefix to "/tmp" doesn't work either.
{ Error: Command failed: npm install async
npm ERR! code EROFS
npm ERR! syscall mkdir
npm ERR! path /home/sbx_user1051
npm ERR! errno -30
npm ERR! rofs EROFS: read-only file system, mkdir '/home/sbx_userXXXX'
npm ERR! rofs Often virtualized file systems, or other file systems
npm ERR! rofs that don't support symlinks, give this error.
You cannot run npm install inside Lambda; you need to upload your modules in a ZIP file.
A deployment package is a ZIP archive that contains your function code and dependencies. You need to create a deployment package if you use the Lambda API to manage functions, or if you need to include libraries and dependencies other than the AWS SDK. You can upload the package directly to Lambda, or you can use an Amazon S3 bucket, and then upload it to Lambda. If the deployment package is larger than 50 MB, you must use Amazon S3.
https://docs.aws.amazon.com/lambda/latest/dg/nodejs-create-deployment-pkg.html
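A minimal sketch of that workflow from a local machine (the function name is a placeholder and this assumes the handler lives in index.js):

npm install async
zip -r function.zip index.js node_modules
aws lambda update-function-code \
  --function-name my-function \
  --zip-file fileb://function.zip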
I was able to work around this same npm issue by creating a .zip, then an AWS Layer, then finally configuring the Lambda function to use that Layer; specific steps below:
make a new empty directory:
mkdir newdir && cd newdir
install whatever npm things: npm install --save xyz
make a directory skeleton that matches the expected Lambda structure for Node14 (there's a different structure for Node12, or various other languages; see https://docs.aws.amazon.com/lambda/latest/dg/configuration-layers.html?icmpid=docs_lambda_help): mkdir -p nodejs/node14
copy the "node_modules" directory into that newly made directory skeleton:
cp -R node_modules nodejs/node14
zip the whole thing up (name it whatever you want):
zip -r custom-drivers-node14.zip nodejs
from there, go to AWS console, Lambda, then "Layers" and create a new layer. In the dialog, upload your .zip file ("custom-drivers-node14.zip").
finally, edit your Lambda function in AWS console, and add a new Layer – the interface might change, but as of now, this is under the main screen for a single function, then scroll way down to the bottom. Follow the "Add a layer" flow, choose the Layer you made, and then try your code.
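If you prefer the CLI to the console for those last two steps, a rough equivalent would be (layer name, function name, region and account id are placeholders):

aws lambda publish-layer-version \
  --layer-name custom-drivers-node14 \
  --zip-file fileb://custom-drivers-node14.zip \
  --compatible-runtimes nodejs14.x
aws lambda update-function-configuration \
  --function-name my-function \
  --layers arn:aws:lambda:<region>:<account-id>:layer:custom-drivers-node14:1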
One final note, this code structure worked:
const xyz = require('xyz');
exports.handler = async (event) => {
xyz.doSomething();
}
I need to continually build a create-react-app application and deploy it to an Amazon S3 bucket.
I have written the following CircleCI config.yml:
version: 2
jobs:
  build:
    docker:
      - image: circleci/node:7.10
    steps:
      - checkout
      - run: npm install
      - run: npm run build
deployment:
  prod:
    branch: circle-config-test
    commands:
      - aws s3 sync build/ s3://http://www.typing-coacher.net.s3-website.eu-central-1.amazonaws.com/ --delete
What I think should happen:
I have a Docker container; I install the application, build it, and the files sit ready in the build folder.
I run the command listed in the CircleCI docs and the build files move from the Docker machine to the S3 bucket.
To deploy a project to S3, you can use the following command in the
deployment section of circle.yml:
aws s3 sync <path-to-files> s3://<bucket-URL> --delete
What actually happens:
The application is installed and the build files are created, but nothing happens with the deployment. It doesn't even appear in the builds console.
What am I missing?
disclaimer: CircleCI Developer Advocate
Everything from the deployment: line and down shouldn't be there. That's syntax for CircleCI 1.0 while the rest of your config file is CircleCI 2.0.
You can either:
1. Create a new step and check for the branch name with Bash. If it's circle-config-test, then run the deployment commands. You'll also need to install the AWS CLI in that build.
2. Using CircleCI Workflows, create a deployment job with a branch filter for circle-config-test. You can use any image that contains the AWS CLI or install it yourself. The CI Builds: AWS Docker image contains this for you.
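A rough sketch of option 2 (the bucket URL is a placeholder, and installing the AWS CLI via pip is just one way to get it):

version: 2
jobs:
  build:
    docker:
      - image: circleci/node:7.10
    steps:
      - checkout
      - run: npm install
      - run: npm run build
      - persist_to_workspace:
          root: .
          paths:
            - build
  deploy:
    docker:
      - image: circleci/python:3.6
    steps:
      - attach_workspace:
          at: .
      - run: sudo pip install awscli
      - run: aws s3 sync build/ s3://<your-bucket> --delete
workflows:
  version: 2
  build-and-deploy:
    jobs:
      - build
      - deploy:
          requires:
            - build
          filters:
            branches:
              only: circle-config-test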
Is it possible to deploy a Nuxt.js SPA project on shared hosting like HostGator through cPanel?
I have tried several times to run the command below; it generates a dist folder, but it didn't work.
My command: npm run build
Can you help me please?
npm run build
builds files to be served by `npm run start`. It looks like you want static files that you can just upload to your hosting, so you should run:
npm run generate
Then in `/dist/` you will find the static SPA.
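If the goal is specifically a static SPA with no server-side rendering, it may also be worth confirming the project is in SPA mode before generating, e.g. in nuxt.config.js (mode: 'spa' is the Nuxt 2 option; newer versions use ssr: false instead):

// nuxt.config.js
export default {
  mode: 'spa' // Nuxt 2; on newer versions use `ssr: false`
}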