How to include static resources when publishing an Atom package

I'm working on an Atom package that requires some static resources. I was expecting the following to work:
1. Define scripts.prepublish in package.json to call a script that downloads the external resources.
2. Publish the Atom package on atom.io with apm publish VERSION.
3. Install the Atom package from atom.io with apm install PKGNAME.
Unfortunately, it seems the prepublish script is not executed, so the package gets installed without the static resources.
Curiously, if I install with apm install REPOREF instead of apm install PKGNAME, the prepublish step does get called, and the package ends up with the static resources.
The relevant parts from my package.json:
"name": "janos-ss-prepublish-demo",
"repository": "https://github.com/janos-ss/prepublish-demo",
"scripts": {
"prepublish": "node ./scripts/setup.js"
},
"files": [
"files"
],
The files entry here names the directory into which node ./scripts/setup.js downloads the static resources when the package is installed with npm install.
After publishing with apm, if I install the package with apm install janos-ss-prepublish-demo, it seems the prepublish step will not get executed and the package will not have the static resources.
If I install the package with apm install janos-ss/prepublish-demo, the prepublish step is executed and the package will have the static resources.
Note the difference between the two apm install commands: the first uses the name of the package as published on atom.io; the second uses a GitHub repository reference, made up of my GitHub username and the name of the repository.
Installing with apm install REPOREF is not an option, because it requires explicit user action, which I don't want.
I want users to be able to install the complete package from atom.io, using the natural way from the package explorer of Atom itself.
If I understand correctly, this pending pull request confirms that what I'm trying to do is currently not possible. I also tried a custom build of apm from the pull request, rebased on top of the official master, but it still did not work, so I'm not sure I'm understanding correctly what's going on here.
Is there something fundamentally wrong with what I'm trying to do?
Is there another way to include static resources in the package on atom.io?
My current workaround is to download the static resources after the package is installed and activated, but this adds unnecessary complexity at runtime, and it's ugly.
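For reference, a minimal sketch of that runtime workaround, assuming a hypothetical downloadResources() helper that fetches the files and returns a promise:
// In the package's main module; downloadResources() is a hypothetical helper
const fs = require('fs');
const path = require('path');
const { downloadResources } = require('./download'); // hypothetical module

module.exports = {
  activate() {
    const resourceDir = path.join(__dirname, '..', 'files');
    // Fetch the static resources on first activation only
    if (!fs.existsSync(resourceDir)) {
      downloadResources(resourceDir).catch(err =>
        atom.notifications.addError('Failed to download static resources: ' + err.message));
    }
  }
};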
A minimal reproducible package is on GitHub.
Repro steps:
apm publish patch
apm install janos-ss-prepublish-demo
ls ~/.atom/packages/janos-ss-prepublish-demo # error, doesn't exist
apm install janos-ss/prepublish-demo
ls ~/.atom/packages/janos-ss-prepublish-demo # exists

Related

Does npm install have an equivalent to pip install --no-deps?

I'm more familiar with the Python ecosystem at this point and have a question about how I can do something with npm that I'm used to doing with pip.
Let's say I have a wheel for a particular Python package, as well as a wheel file for each of the Python package's dependencies. And let's say I have all these wheel files in a folder called /path/to/wheel/files. To install this package and all of its dependencies, I could run something like pip install /path/to/wheel/files/*.whl --no-deps, where --no-deps keeps me from having to install the various dependencies in the proper order.
Does npm have an equivalent to this? I'm using npm-offline-packager to create a tarball that contains a Node package (as its own tarball) and all of its dependencies (as their own tarballs). I know I can tell npm install to install a particular tarball. However, when I do this, it tries pulling in the required dependencies from the online NPM registry instead of pulling in the dependencies from the tarballs I already have.
Ideally, I'd like npm install to use the tarballs to add the main package to my project's package.json while adding the package's dependencies to my project's package-lock.json. And of course, I'd also like the main package and all its dependencies to be installed to my project's node_modules directory as well.
TL;DR Does npm have something equivalent to pip install /path/to/wheel/files/*.whl --no-deps?
I'm answering my own question here, but note that my answer is specific to my particular use case and may not apply in general.
For my use case, I have access to two computers: one that has access to the internet and one that doesn't. For the machine that doesn't have access to the internet, I was attempting to use Verdaccio as a way of creating a self-hosted NPM registry. However, publishing packages to Verdaccio wasn't working because it kept trying to pull in the package's dependencies from the public NPM repository. The solution was to remove all references to "npmjs" in Verdaccio's config file (which, for me, Verdaccio created at ~/.config/verdaccio/config.yaml).
So, in case anyone needs to do development on a machine that doesn't have access to the internet, the process for setting up Verdaccio looks something like this:
1. On the machine that has access to the internet, create an NPM project using npm init (I called my project "verdaccio_runner"). The reason I did this is that, without already having an NPM registry on the machine that doesn't have access to the internet, it was hard to do a global install of Verdaccio.
2. Run npm install verdaccio to install Verdaccio into the project created in the previous step.
3. Transfer this project over to the machine that doesn't have access to the internet.
4. Once it's transferred over, run Verdaccio from the project like this: npx verdaccio.
5. Quit out of Verdaccio.
6. Remove all references to "npmjs" from the config file that Verdaccio created (again, mine was at ~/.config/verdaccio/config.yaml); see the sketch after this list.
7. Run Verdaccio again to pull in those changes.
8. Tell NPM where your private registry is: npm config set registry http://localhost:4873/.
9. Add yourself as a user by running npm adduser and filling out the information you're prompted for.
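For illustration, after stripping the npmjs references the config might look roughly like this (a sketch based on Verdaccio's default config layout; your generated file may differ):
storage: ./storage
auth:
  htpasswd:
    file: ./htpasswd
# the default "uplinks" section (which pointed at npmjs) is removed
packages:
  '**':
    access: $all
    publish: $authenticated
    # no "proxy: npmjs" line here, so nothing is fetched from the public registry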
And the process for publishing packages to Verdaccio on a machine that doesn't have access to the internet looks like this:
1. For the package you want to install, on the machine that has access to the internet, run npo fetch <package name> --no-cache (assuming you've already done a global install of npm-offline-packager on the machine that has internet access).
2. Bring the tarball that npo created for you over to the machine that doesn't have internet access.
3. Untar the tarball.
4. From the directory that's created, run for file in ./*.tgz; do npm publish "$file"; done.
The published packages can now be npm installed into projects on the machine that doesn't have internet access.
Note: in order for Verdaccio to be accessible to other machines on the private network, I also had to add the following to Verdaccio's config file:
listen:
  0.0.0.0:4873

Automatically downloading npm packages listed in package.json file

I'm working on creating a local repository that will contain all the packages I use in my project, so those packages can be installed on a machine that does not have access to the internet. The idea is a repository I could clone on that machine, then run yarn install to have all the packages available in the project from the local repository. How can I do that? A similar question was asked here: Using npm how can I download a package as a zip with all of its dependencies included in the package
There's not enough information in your question to fully understand your situation, but if you commit your node_modules directory to the repository, the modules will be there without the user having to run npm or yarn to install them. This assumes the user will run code from the repo workspace and that there aren't any modules that require a compilation step or other build step that may be platform-specific. But if they're all plain ol' JavaScript modules, you should be fine.
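If you go the checked-in node_modules route, the steps are roughly as follows (assuming node_modules is currently listed in .gitignore):
# remove the usual ignore rule so Git will track the installed modules
# (GNU sed syntax; on macOS use: sed -i '' ...)
sed -i '/node_modules/d' .gitignore
git add .gitignore node_modules
git commit -m "Vendor node_modules for offline use"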
If you want to have all the modules as a separate repo rather than checking in node_modules, I can offhand think of two ways this might work.
Have the packages repo be a check-in of a fully installed node_modules directory. Then make that repo a Git submodule of the main repo that gets cloned as node_modules in the main repo.
Use npm pack to create .tgz files for each package you need. Store those files in the packages repo. Clone that repo into a known path on your target machine. Have the main repo install via path names. For example, if you run npm install /var/packages/foo-1.0.0.tgz, it will add a line to your package.json that might look something like this: "foo": "file:../../../var/packages/foo-1.0.0.tgz". In that case, npm install will install from that path rather than over the network.
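A sketch of that second approach end to end, with hypothetical package and path names:
# on a machine with registry access: fetch and pack each needed package
cd /path/to/packages-repo
npm pack foo                 # downloads foo and writes foo-1.0.0.tgz here
git add foo-1.0.0.tgz
git commit -m "Add foo 1.0.0"

# on the target machine: clone the repo to a known path, install by path
git clone <packages-repo-url> /var/packages
cd /path/to/main-project
npm install /var/packages/foo-1.0.0.tgz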

Does npm or yarn clone from VCS and run the build script when installing a package?

I am studying npm and I have some questions.
1. Where does npm get the package from? That is, when I run npm install <package-name> or yarn add <package-name>.
2. When it gets the package, does npm fetch it raw or does it build it (like running the build script written in package.json)?
3. When publishing a package, is the repository field of package.json required?
4. Can the repository used for publishing be different from the repository in package.json?
To answer your questions:
1. npm gets them from the npm package registry, and so does Yarn, though Yarn fronts it with a proxy registry. In general, both tools fetch their packages from https://npmjs.com by default.
2. It gets the package as it was published (so, in short, the answer is "raw"). Building is up to the publisher and depends on the type of package. Often, some prepublish task builds something into dist/ (or any other location in the package), and these files are shipped with the package that others then download. Building rarely happens after installing a package (the exception being library-wrapping native packages built with node-gyp).
3. The repository field is not required, to my knowledge, but it is good practice to include it (it will be displayed on the npm website, for example).
4. Technically, yes: you can specify any repository in repository, but it wouldn't make much sense to point at one that isn't the source of the package.
If you want to read up more on how npm works in general, check out its documentation over at https://docs.npmjs.com/
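A quick way to see what "as it was published" means for any given package is to pack it locally and list the tarball's contents (package name and version here are illustrative):
npm pack some-package            # downloads and writes some-package-1.0.0.tgz
tar -tzf some-package-1.0.0.tgz  # lists exactly the files consumers will get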

Transpile with babel after npm install

I installed this package: https://github.com/feathersjs/feathers-authentication-local (the question is not related specifically to this package). This package's source code is in ./src, and npm run compile puts the babel-transpiled code into ./lib, which is the main entry point.
My question is: after I do npm install feathers-authentication-local, how does npm know that it needs to run npm run compile? I thought of a postinstall script in package.json, but this package doesn't have one.
Regarding what is uploaded to npm when publishing, there are two fields in package.json, files and directories, which are used to specify what should be uploaded.
Take a look as well at the "main" property; it points to the file that will be used when importing the module in your application, so:
import foo from 'foo'
will look into node_modules/foo/$(main), which in this case points to lib/.
The package actually isn't compiled after install on clients' machines; it's compiled on the maintainer's machine and then published to npm.
The compile step is triggered by the prepublish script in package.json.
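Putting it together, the relevant parts of such a package's package.json look roughly like this (a sketch; the real feathers package may differ in the details):
"main": "lib/index.js",
"scripts": {
  "compile": "babel src --out-dir lib",
  "prepublish": "npm run compile"
},
"files": [
  "lib"
]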

Running npm prepublish on an entire project

I have a weird set of "local" npm modules that use TypeScript and depend on each other similar to:
A -> B, C
B -> C
C -> D
I need to run npm install and have all of my TypeScript compiled in order, or it won't be able to find things properly. I'm under the impression I should use prepublish scripts to handle the TypeScript compile, but it doesn't seem to cascade the prepublish request to local dependencies.
How am I supposed to set up a bunch of local modules with prepublish scripts such that they all get resolved appropriately when running npm install?
Another way to word what I am asking: How do I maintain multiple, local node modules and modify them at the same time? The modules have varying dependencies on each other and it is extremely inconvenient to modify them in isolation.
Regarding "it doesn't seem to cascade the prepublish request for local dependencies": indeed, prepublish does not run for dependencies; only their install-time scripts run. Your dependencies should already be built (with prepublish) before being published to npm and installed.
I figured out how to do what I needed. After updating to npm 3.3.9 and TypeScript 1.6, I was able to use a postinstall script to build things on the fly. The prototype lives here: https://github.com/MrHen/TypeScriptNpm
But the important pieces are:
// In the module's package.json
"scripts": {
  "build": "gulp npmbuild",
  "postinstall": "npm run build"
},
And:
// In the server's package.json
"dependencies": {
  "hen-doodad": "file:../modules/hen-doodad",
  "hen-widget": "file:../modules/hen-widget"
}
And:
// In the gulpfile
var gulp = require('gulp');
var gulp_util = require('gulp-util');

gulp.task('npmbuild', function() {
  gulp_util.log('Detecting appropriate starting directory...', process.env.INIT_CWD);
  var out = process.env.INIT_CWD + '/app';
  var build = [process.env.INIT_CWD + '/**/*.ts', 'typings/tsd.d.ts', '!' + process.env.INIT_CWD + '/node_modules/**/*'];
  var typings = 'typings/tsd.d.ts';
  // ... do the TypeScript build using the above paths
});
Feels like a bit of a hack but this worked more consistently than prepublish. To run the whole thing, do an npm install inside of the server folder.
The gulp task is optional. Presumably you could use tsc directly.
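For instance, a tsc-based variant might look like this (a sketch, assuming each module carries its own tsconfig.json):
// In the module's package.json
"scripts": {
  "build": "tsc -p .",
  "postinstall": "npm run build"
},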
It should be noted that this is most certainly not how you should be packaging up npm modules. The reason I had to do this involved details from a pre-existing build system.
Pitfalls from doing it this way:
Does not fit the standard publishing patterns used by npm.
The built files only ever live inside of the server's node_modules which can be awkward depending on what else you do with them.
Running npm install twice will not grab latest changes. You need to either remove the installed modules or update the version number on the module.
Each module gets its own TypeScript build instead of building en masse. If you can just build everything all at once you should do that instead.
Requires TypeScript 1.6 or greater due to how they auto-detect typings files included inside of node_modules.
Assuming that you are using TypeScript 1.6+, which supports node module resolution. As you are trying to keep all modules in sync, I guess your project is just starting.
I think symlinks would do the job, but if you have more concerns, please let me know.
More specifically, you may either create symlinks manually or take advantage of npm link.
cd /path-of-a-module
npm link # this will create a link as a global module
cd /path-of-your-app
npm link your-module-name
Then you may just maintain these modules happily.
As for the dependencies configuration in your package.json file, it could point at a git repository. But you could probably leave the compiled *.js and *.d.ts files there, and everything would just work.
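For illustration, such a git dependency might look like this in package.json (user and repository names are hypothetical):
"dependencies": {
  "hen-doodad": "git+https://github.com/your-user/hen-doodad.git"
}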