I've been playing with Gulp.js and npm recently, and it's great. However, I don't really get the idea of npm as a package manager for packages that will end up being pushed to dist.
Let's go with an example.
I want to download the latest jQuery, Bootstrap and Font Awesome so I can include them in my project. I could simply download them from their websites and grab the files to include. Another option seems to be a package manager, i.e. npm.
However, my node_modules directory is huge because of other packages such as gulp, and it's not nested at all. What would be the easiest way to move selected packages to another directory, for example src/vendors/?
I tried to achieve that with a gulp task that simply copies the specified files from node_modules to a given directory, but in the long run it's almost the same as copying the files manually, since I have to specify not only the input directory but also the output directory for every single package.
My current solution:
gulp.task('vendors', function() {
  var jquery = gulp.src(vendors.src.jquery)
    .pipe(gulp.dest(vendors.dist.jquery));

  var bootstrap = gulp.src(vendors.src.bootstrap)
    .pipe(gulp.dest(vendors.dist.bootstrap));

  return merge(jquery, bootstrap);
});
vendors = {
  src: {
    jquery: 'node_modules/jquery/dist/**/*',
    bootstrap: 'node_modules/bootstrap/dist/**/*'
  },
  dist: {
    jquery: 'src/resources/vendors/jquery',
    bootstrap: 'src/resources/vendors/bootstrap'
  }
}
Is there an option to do it faster and/or better?
There's no need to explicitly specify the source and destination directory for each vendor library.
Remember, gulp is just JavaScript. That means you can use loops, arrays and whatever else JavaScript has to offer.
In your case you can simply maintain a list of vendor folder names, iterate over that list and construct a stream for each folder. Then use merge-stream to merge the streams:
var gulp = require('gulp');
var merge = require('merge-stream');

var vendors = ['jquery/dist', 'bootstrap/dist'];

gulp.task('vendors', function() {
  return merge(vendors.map(function(vendor) {
    return gulp.src('node_modules/' + vendor + '/**/*')
      .pipe(gulp.dest('src/resources/vendors/' + vendor.replace(/\/.*/, '')));
  }));
});
The only tricky part in the above is correctly figuring out the destination directory. We want everything in node_modules/jquery/dist to end up in src/resources/vendors/jquery and not in src/resources/vendors/jquery/dist, so we have to strip away everything after the first / using a regex.
Now when you install a new library, you can just add it to the vendors array and run the task again.
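For example, to pull in Font Awesome from the question as well, you might just append its folder to the array (the css subfolder here is an assumption; check the package's actual layout in node_modules first). The regex then maps each entry to its destination like so:

var vendors = ['jquery/dist', 'bootstrap/dist', 'font-awesome/css'];

// 'jquery/dist'       -> src/resources/vendors/jquery
// 'bootstrap/dist'    -> src/resources/vendors/bootstrap
// 'font-awesome/css'  -> src/resources/vendors/font-awesome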
I have some PDF files on my website; let's call them a, b, c and d (all ending with .pdf).
These files don't all sit under the same path (there are three different directories in total), for example:
https://example.com/first_dir/a.pdf
https://example.com/first_dir/b.pdf
https://example.com/second_dir/c.pdf
https://example.com/third_dir/d.pdf
I have the following alias dictionary:
name_conversions = {
  'first_name': 'first_dir/a.pdf',
  'other_name': 'first_dir/b.pdf',
  'another_name': 'second_dir/c.pdf',
  'yet_another_name': 'third_dir/d.pdf'
}
and I want to create redirections according to this dictionary, so that when we access https://example.com/first_name we get redirected to https://example.com/first_dir/a.pdf, and same goes for the other entries.
How can this be done in webpack? If it can't be done in webpack, what are the alternatives that hopefully don't require me to create a lot of new files (one per PDF) and don't require making copies of these PDFs?
I've seen people use the webpack proxy for this, but from what I understand that is only for development, not for production.
Update: I found a plugin for doing redirects, but I can't seem to figure out how to use it.
I found out I can use the package redirect-webpack-plugin.
All I need to do is the following:
const RedirectWebpackPlugin = require('redirect-webpack-plugin');

module.exports = {
  ...
  plugins: [
    new RedirectWebpackPlugin({
      redirects: name_conversions
    })
    ...
  ]
  ...
}
I'm not sure this is even possible, but it looks like some of the moving parts are there.
GOAL:
Create a library of single-file Vue 3 components that compile into separate chunks with Vite and are dynamically/async loaded at runtime. The app itself loads first, then loads a directory of individually chunked elements to put in a toolbox, so that each element can be updated later and new ones can be added by dropping new chunks into the same path.
So far, I can create the separate chunks within the vite.config as follows:
...
build: {
  rollupOptions: {
    output: {
      ...buildChunks()
    }
  }
}
...
The buildChunks function iterates over SFC files in the ./src/toolbox path and returns an object like...
{
  'toolbox/comp1': ['./src/toolbox/comp1.vue'],
  'toolbox/comp2': ['./src/toolbox/comp2.vue'],
  'toolbox/comp3': ['./src/toolbox/comp3.vue'],
  ...
}
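For reference, a minimal sketch of what such a buildChunks helper could look like, assuming an ESM vite.config and components sitting flat in ./src/toolbox (the names and layout are assumptions):

import fs from 'node:fs';
import path from 'node:path';

// Scan ./src/toolbox for .vue files and build a chunk-name -> entry-file map
// shaped like the object shown above
function buildChunks() {
  const toolboxDir = path.resolve(process.cwd(), 'src/toolbox');
  return fs.readdirSync(toolboxDir)
    .filter((file) => file.endsWith('.vue'))
    .reduce((chunks, file) => {
      const name = path.basename(file, '.vue');
      chunks['toolbox/' + name] = ['./src/toolbox/' + file];
      return chunks;
    }, {});
}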
This all works, but I'm not sure how to make the next leap: having the app dynamically load all of those generated chunk files at runtime without explicitly listing them in code. Also, since the Vite build adds a hash to each file name (e.g. comp.59677d29.js) on every build, the actual file name can't be referenced explicitly in an import.
So far what I've considered is using defineAsyncComponent(() => import(url)) for each of the files, but I'd need to generate a list of those files to import...which could be done by building a manifest file at build time, I guess.
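To make that concrete, here is a rough sketch of the manifest idea; the manifest file name, its shape, and the URLs are assumptions, not something the build produces by default:

import { defineAsyncComponent } from 'vue';

// Assumed: the build writes a toolbox-manifest.json mapping component names
// to their hashed chunk URLs, e.g. { "comp1": "/assets/toolbox/comp1.59677d29.js" }
async function loadToolbox() {
  const response = await fetch('/toolbox-manifest.json');
  const manifest = await response.json();
  const components = {};
  for (const [name, url] of Object.entries(manifest)) {
    // @vite-ignore keeps Vite from trying to statically analyze the dynamic URL
    components[name] = defineAsyncComponent(() => import(/* @vite-ignore */ url));
  }
  return components;
}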
Any suggestions? Is there a better approach?
I use rollup with the rollup-plugin-scss plugin in my project to bundle CSS. Is it possible to generate both .css and .min.css using this plugin or some other plugin?
plugins: [
  scss({
    output: path.resolve(__dirname, 'projects/project_name/main.css'),
  })
]
I tried adding outputStyle: "compressed", but that produces only the compressed version, not both.
It is not possible out of the box, but the output option also accepts a function, so you can hook into it and write both files manually (including a compression step). In the sample code below I used clean-css, but there are plenty of other packages available.
// at the top of your rollup config
const fs = require('fs');
const CleanCss = require('clean-css');

scss({
  output: function (styles, styleNodes) {
    // Write the regular bundle as-is
    fs.writeFileSync('bundle.css', styles);
    // Minify with clean-css and write the .min.css variant
    const compressed = new CleanCss().minify(styles).styles;
    fs.writeFileSync('bundle.min.css', compressed);
  }
})
Note that this setup does not give you any of the logging or file sizes you get from the regular plugin, but that is something that can be added fairly easily inside the function.
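For instance, a purely illustrative way to log the two output sizes inside that same output function:

output: function (styles, styleNodes) {
  const compressed = new CleanCss().minify(styles).styles;
  fs.writeFileSync('bundle.css', styles);
  fs.writeFileSync('bundle.min.css', compressed);
  // Report both sizes in kilobytes
  console.log('bundle.css:     ' + (Buffer.byteLength(styles) / 1024).toFixed(1) + ' kB');
  console.log('bundle.min.css: ' + (Buffer.byteLength(compressed) / 1024).toFixed(1) + ' kB');
}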
I am creating an app via npm using https://www.npmjs.com/package/grunt-twig-render. It's essentially twig.js. I'm keeping it slim, so right now there isn't any PHP or anything like that, just npm / twig.js and other npm packages, including Grunt.
Here's what I'm trying to do. Right now, I have a bunch of twig files within subfolders of a directory.
What I'd like to do is generate a list of data for the .twig files in that subdirectory. Something like this may work well:
files: [
  {
    "name": "file1.twig",
    "path": "/folder1"
  },
  {
    "name": "file2.twig",
    "path": "/folder1"
  },
  {
    "name": "file3.twig",
    "path": "/folder1"
  }
]
But I'm not super picky. Just seeing if anyone has found a way to create a list of files within a folder via npm, or twigjs, or something similar.
If it generates a .json file, that would be ideal. This would be part of a build process, so doing that via Grunt would work well too.
Thank you in advance
You can register a custom grunt task in your Gruntfile.js which utilizes the shelljs find method to retrieve the path of each .twig file.
shelljs is a package which provides portable Unix shell commands for Node.js. Its find method is analogous to the Bash find command.
The following steps describe how to achieve your requirement:
cd to your project directory and install shelljs by running:
npm i -D shelljs
Configure your Gruntfile.js as follows:
Gruntfile.js
module.exports = function(grunt) {

  // requirements
  var path = require('path'),
      find = require('shelljs').find;

  grunt.initConfig({
    // other tasks ...
  });

  /**
   * Custom grunt task generates a list of .twig files in a directory
   * and saves the results as .json file.
   */
  grunt.registerTask('twigList', 'Creates list of twig files', function() {
    var obj = {};

    obj.files = find('path/to/directory')
      .filter(function(filePath) {
        return filePath.match(/\.twig$/);
      })
      .map(function(filePath) {
        return {
          name: path.basename(filePath),
          path: path.dirname(filePath)
        };
      });

    grunt.file.write('twig-list.json', JSON.stringify(obj, null, 2));
  });

  grunt.registerTask('default', ['twigList']);
};
Explanation
Both shelljs and node's built-in path module are required into Gruntfile.js.
Next a custom task named twigList is registered.
In the body of the function we initialize an empty object and assign it to a variable named obj.
Next, the shelljs find method is invoked passing in a path to the subdirectory containing the .twig files. Currently the path is set to path/to/directory so you'll need to redefine this path as necessary.
Note: The find method can also accept an array of multiple directory paths as an argument. For example:
find(['path/to/directory', 'path/to/another/directory'])
The find method returns an Array of all paths found inside the given directory (many levels deep). We utilize the Array's filter() method to return only the file paths with a .twig file extension, by providing a regex (\.twig$) to the String's match method.
Each item of the resulting Array of .twig file paths is passed to the Array's map() method. It returns an Object with two properties (name and path). The value for the name property is obtained from the full file path using node's path.basename() method. Similarly, the value for the path property is obtained from the full file path using node's path.dirname() method.
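For example (the path here is purely illustrative):

path.basename('path/to/directory/folder1/file1.twig'); // => 'file1.twig'
path.dirname('path/to/directory/folder1/file1.twig');  // => 'path/to/directory/folder1'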
Finally the grunt.file.write() method is utilized to write a .json file to disk. Currently the first argument is set to twig-list.json. This will result in a file named twig-list.json being saved to the top level of your project directory, where Gruntfile.js resides. You'll need to redefine this output path as necessary. The second argument is produced by using JSON.stringify() to convert obj to JSON; the resulting JSON is indented by two spaces.
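Based on the example file names from the question, the generated twig-list.json might look roughly like this (the actual paths depend on the directory you pass to find):

{
  "files": [
    {
      "name": "file1.twig",
      "path": "path/to/directory/folder1"
    },
    {
      "name": "file2.twig",
      "path": "path/to/directory/folder1"
    }
  ]
}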
Additional info
As previously mentioned, the shelljs find method lists all files (many levels deep) in the given directory path. If the directory path provided includes a subfolder containing .twig files that you want to exclude, you can do something like the following in the filter() method:
.filter(function(filePath) {
  return filePath.match(/\.twig$/) && !filePath.match(/^foo\/bar/);
})
This will match all .twig files, and ignore those in the sub directory foo/bar of the given directory, e.g. path/to/directory.
Bit of an RN newb here. I'm trying to read some JSON data files:
function loadCategories() {
  const ids = ['tl1', 'tl2', 'tl3', 'tl4', 'tl5', 'tl6'];
  ids.forEach(function(id) {
    var contents = require('../Content/top-level/' + id + ".json.js");
    ...
  });
}
But here I always get an error:
Unhandled JS Exception: Requiring unknown module "../Content/top-level/tl1.json.js".If you are sure the module is there, try restarting the packager or running "npm install".
The files exist and my relative path logic should be OK given the project structure:
ProjectDir
  Components
    ThisComponent.js
  Content
    top-level
      tl1.json.js
      tl2.json.js
      ...
i.e. the above code is running from ThisComponent.js and trying to access tl1.json.js, etc so I would think the relative path of ../Content/top-level/tl1.json.js would work.
I've tried:
Restarting the packager
Referencing ./Content/top-level/tl1.json.js instead
Referencing /Content/top-level/tl1.json.js instead
I'm on RN 0.36.0. Gotta be something obvious…right?
This isn't possible in React Native because of how the packager works. You have to require files with a static string path. You can use a switch statement, something like this:
switch (id) {
  case 'tl1': return require('../Content/top-level/tl1.json');
  case 'tl2': return require('../Content/top-level/tl2.json');
  ...
}
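Alternatively (just a sketch, assuming the files are renamed to plain .json as in the switch above), a static lookup object keeps every require path static too:

const categories = {
  tl1: require('../Content/top-level/tl1.json'),
  tl2: require('../Content/top-level/tl2.json'),
  // ...and so on for the remaining ids
};

function loadCategory(id) {
  // Every require above uses a static string, so the packager can bundle the files
  return categories[id];
}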
Also, why do your JSON files have a .js extension?