npm copy file to a shared drive

Is it possible to copy app.js (from the dist folder) to a shared drive (inside my organization) using a scripts entry inside package.json?
The copy-files-from-to package (npm install copy-files-from-to) is a good add-on, but it doesn't copy to a shared drive. Fetching a file from the shared drive works well, but copying in the other direction doesn't seem to work.
Here is a copy of the JSON config file:
{
  "copyFiles": [
    {
      "from": "dist/js/app.js",
      "to": "\\\\<Shared Path>\\app.js"
    }
  ]
}

Related

nunit.framework.dll from the package doesn't get copied to the bin folder

I'm currently running .NET Framework 4.6 and added the NUnit and NUnit3TestAdapter NuGet packages to the test project I am working on, and I see that the tests are not being discovered.
Upon some investigation, I see that nunit.framework.dll is not being copied to the bin folder. I did some more research and found the following in the assets.json file; NUnit's build/NUnit.props file, which is supposed to contain the MSBuild settings, is empty.
"NUnit/3.13.3": {
"type": "package",
"compile": {
"lib/net45/nunit.framework.dll": {}
},
"runtime": {
"lib/net45/nunit.framework.dll": {}
},
"build": {
"build/NUnit.props": {}
}
},
However, the build/NUnit3TestAdapter.props file, which is installed from the official NUnit3TestAdapter package, does have MSBuild settings. Is there a reason why build/NUnit.props does not have the build settings whereas build/NUnit3TestAdapter.props does?
Also, is there a workaround to get these copied locally, or should I add the references manually rather than getting them from the package?

serve html file from node_modules using react native static server and webview

Is it possible to serve an html file from within the node_modules folder? I know how to serve from the assets folder, something like:
<WebView
  ...
  source={{ uri: 'file:///android_asset/project_a/index.html' }}
/>
or the corresponding path for iOS, but how can I access the node_modules folder and serve a file from there? I've tried using require() and/or using the path to the module I want served but with no luck.
The idea is that I want to avoid copy-pasting build files between projects (i.e., from the build folder of project A to the assets/www folder of project B); instead, I want to publish project A as an npm package, npm install it, and serve it from project B. This also improves version management of the projects.
For Android, you can run a custom Gradle task that will copy your assets at build time.
Create a Gradle file in your module's root, like below:
(I presume your module name is react-native-my-awesome-module and your HTML files are under a www folder in the module's root.)
awesome.gradle:
/**
 * Register the HTML asset source folder
 */
android.sourceSets.main.assets.srcDirs += file("$buildDir/intermediates/ReactNativeMyAwesomeModule")

/**
 * Task to copy HTML files
 */
afterEvaluate {
    def targetDir = "../../node_modules/react-native-my-awesome-module/www"
    def fileNames = ["*.html"]

    def htmlCopyTask = tasks.create(
            name: "copyReactNativeMyAwesomeModuleAssets",
            type: Copy) {
        description = "copy react native my awesome module assets."
        into "$buildDir/intermediates/ReactNativeMyAwesomeModule/www"
        fileNames.each { fileName ->
            from(targetDir) {
                include(fileName)
            }
        }
    }

    android.applicationVariants.all { def variant ->
        def targetName = variant.name.capitalize()
        def generateAssetsTask = tasks.findByName("generate${targetName}Assets")
        generateAssetsTask.dependsOn(htmlCopyTask)
    }
}
After installing the module, put the below line in your project's android/app/build.gradle file:
apply from: file("../../node_modules/react-native-my-awesome-module/awesome.gradle");
When you build your app, the assets under the www folder will be copied from your module. You can then access the resources in your app like below:
<WebView
  source={{ uri: 'file:///android_asset/www/index.html' }}
/>
I ended up adding a script to package.json that copies the files of interest into the assets/www folder before starting the app. Something like:
"scripts": {
  "copy:build": "node -e \"fs.copyFileSync('./node_modules/dir-of-module','./assets/www/build')\"",
  "start-android": "npm run copy:build && react-native run-android",
  ...
}
If there is a better alternative, please let me know!
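One variation on that script (a sketch; it assumes Node 16.7+ for fs.cpSync, and copy-www.js plus the module path are illustrative names) copies a whole folder instead of a single file:
// copy-www.js - copy the module's built assets into the app's assets folder (paths are illustrative)
const fs = require('fs');

fs.cpSync('./node_modules/dir-of-module/www', './assets/www/build', { recursive: true });
console.log('Copied www assets into ./assets/www/build');
The "copy:build" script then simply becomes "node copy-www.js".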

dotnet pack - defining output folders

I'm using dotnet pack to create a NuGet package for a .NET Core project but have two issues:
When inspected, the .nupkg file always has a lib folder which contains the contents of the project's bin folder. Can you prevent the lib folder from being created in the output?
In the project.json packOptions, how do you map an included folder to an output directory? I've tried using the mappings property as detailed below to output the contents of the publish directory into the wwwroot directory, but no luck so far.
"packOptions": {
"files": {
"include": [
"publish"
],
"mappings": {
"wwwroot": "publish"
}
}
},
I was unable to remove the lib folder, but the mapping works with slight modifications to the example above, as shown below:
"packOptions": {
"files": {
"include": [
"publish"
],
"mappings": {
"wwwroot/": "publish/"
}
}
}

How can I get the Amazon Cognito Identity SDK working in Aurelia?

I am trying to get the Amazon Cognito Identity SDK working in Aurelia. I do not have a lot of Javascript experience and am very unfamiliar with the various dependency systems.
I installed the Cognito SDK using: npm install --save amazon-cognito-identity-js
I then edited my aurelia_project/aurelia.json file as suggested in the Aurelia documentation to include a new client library dependency in build.bundles vendor-bundle dependencies:
"sjcl",
"jsbn",
{
"name": "aws-sdk",
"path": "../node_modules/aws-sdk/",
"main": "dist/aws-sdk"
},
{
"name": "amazon-cognito-identity-js",
"path": "../node_modules/amazon-cognito-identity-js/dist",
"main": "amazon-cognito-identity.min"
}
However, when I try to run the code using au run I get the error: Error: ENOENT: no such file or directory, open '/Users/nathanskone/Projects/scc/aurelia-app/src/xmlbuilder.js'
I have tried to include xmlbuilder in my aurelia.json to no avail. When it is included I end up getting this error about lodash: Error: ENOENT: no such file or directory, open '/Users/nathanskone/Projects/scc/aurelia-app/src/lodash/object/assign.js'
I haven't found any way to get past the lodash error.
Is there anyone out there familiar with the Aurelia dependency system that could help?
Thanks,
Nathan
EDIT #2: While I got past the xmlbuilder/lodash errors, I have run into further errors trying to bundle the aws-sdk. Here is my current aurelia.json:
"dependencies": [
{
"name": "xmlbuilder",
"path": "../node_modules/xmlbuilder/lib",
"main": "index"
},
{
"name": "aws-sdk",
"path": "../node_modules/aws-sdk",
"main": "index",
"resources": ["lib/region_config.json"]
},
And the error I am currently getting:
Error: ENOENT: no such file or directory, open '/Users/nathanskone/Projects/scc/aurelia-app/src/crypto.js'
If I remove the resources (lib/region_config.json) then I get this error instead:
Error: ENOENT: no such file or directory, open '/Users/nathanskone/Projects/scc/aurelia-app/node_modules/aws-sdk/lib/region_config.json.js'
I think crypto is actually an object defined in aws-sdk/lib/util.js, which is required by aws-sdk/lib/region_config.js.
Try the compiled library instead; using the compiled lib, it bundled just fine.
Also, the library seems to define window.AWS, so it will work whether you inject it or not:
{
  "name": "aws-sdk",
  "path": "../node_modules/aws-sdk/dist",
  "main": "aws-sdk.min",
  "exports": "AWS"
}
UPDATE:
It seems the only way to include those libraries is by using the prepend section. The libraries write to the window variable, so they are still accessible to your app scripts, just not imported like ES6 modules.
"prepend": [
"node_modules/aws-sdk/dist/aws-sdk.min.js",
"node_modules/amazon-cognito-identity-js/dist/aws-cognito-sdk.min.js",
"node_modules/amazon-cognito-identity-js/dist/amazon-cognito-identity.min.js",
"node_modules/bluebird/js/browser/bluebird.core.js",
"scripts/require.js"
],

Pulling files from a directory into the root folder for NPM

I am publishing a library to NPM.
When I build the library, the resulting artifact is placed in the dist folder located in the root of my project as index.js.
When users install from NPM I would like index.js to be present in the root of the folder created in their node_modules folder. Presently, it remains in a directory named dist.
How can I do this?
My package.json:
{
  "name": "my-package",
  "version": "0.0.9",
  "files": ["dist/*"],
  "main": "index.min.js",
  "private": false,
  "dependencies": {},
  "devDependencies": {},
  "repository": "git@github.com:username/my-package.git"
}
I had exactly the same problem.
I solved it not by copying the files up, but by copying the files I needed down into the ./dist/ folder and then doing an npm publish from there; NPM then treats that folder as a complete package and everything works very nicely. The only files I needed to copy from the root folder were:
package.json
README.md
Because we're going to copy these files down into the ./dist/ folder before we do the publish, we do NOT want the package.json file to reference ./dist/. So remove the package.json's files entry completely, because we don't need to tell it which files we'll take - we're going to take everything in the ./dist/ folder. I'm using TypeScript so I also have a typings entry, and again no reference to ./dist/.
{
  "name": "my-package",
  "version": "0.0.9",
  "main": "index.min.js",
  "typings": "index.d.ts",
  "private": false,
  "dependencies": {},
  "devDependencies": {},
  "repository": "git@github.com:username/my-package.git"
}
Now for the publish step. I built a gulp task that will perform the publish for me, making it nice and automated (except for incrementing the package version number).
From gulp I'll use Node's spawn() to kick off the npm process. However, because I'm actually working on Windows, I used "cross-spawn" rather than the normal built-in Node.js spawn (which, I learned the hard way, didn't work when I had spaces in my path!).
Here's my gulp file, with the TypeScript bits removed:
var gulp = require('gulp');
var del = require('del');
var spawn = require('cross-spawn'); // WAS: require('child_process').spawn;

var config = {
    src: { tsFiles: './src/**/*.ts' },
    out: { path: './dist/' }
};

gulp.task('clean', () => {
    return del('dist/*');
});

gulp.task('build', ['clean'], () => {
    ....
});

gulp.task('publish', ['build'], (done) => {
    // Copy the files we'll need to publish
    // We just use built-in gulp commands to do the copy
    gulp.src(['package.json', 'README.md']).pipe(gulp.dest(config.out.path));

    // We'll start the npm process in the dist directory
    var outPath = config.out.path.replace(/(\.)|(\/)/gm, '');
    var distDir = __dirname + '\\' + outPath + '\\';
    console.log("dist directory = " + distDir);

    // Start the npm process
    spawn('npm', ['publish'], { stdio: 'inherit', cwd: distDir })
        .on('close', done)
        .on('error', function (error) {
            console.error(' Underlying spawn error: ' + error);
            throw error;
        });
});
Notice that when we call spawn() we pass in a third argument, which is the options. The main entry here is cwd: distDir, which tells spawn to run the npm process from the ./dist/ directory. Because using spawn can cause problems, I've hooked into the spawn error handling. As I was troubleshooting my use of spawn() I found a related Stack Overflow article very helpful.
This worked like a charm; my published package has all the files in the root directory and the ./dist/ folder is not published.
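For reference, the same copy-then-publish flow can also be sketched with plain npm scripts instead of gulp (the script names are illustrative, and it assumes a build script already exists and that Node is on the PATH):
"scripts": {
  "copy:meta": "node -e \"['package.json','README.md'].forEach(f => require('fs').copyFileSync(f, 'dist/' + f))\"",
  "publish:dist": "npm run build && npm run copy:meta && npm publish ./dist"
}
npm publish accepts a folder argument, so publishing ./dist directly keeps the root project files out of the tarball.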