Dropbox selective syncing - pattern matching?

I'm using Dropbox on a daily basis and keep my programming projects in it.
It works great, but now that I have many projects, my node_modules directories are
putting a strain on Dropbox: the syncing process becomes slow and eats up CPU time.
Is there any way to do a selective sync based on a directory name or a mask pattern?
It would be nice to have a .gitignore equivalent to configure.
Is there any 3rd-party software for that task?

There is a way to selectively sync but I don't believe it has any advanced rules like you're describing:
https://www.dropbox.com/help/175/en

There are two ways to resolve this problem:
You can put node_modules higher than the project directory in the file tree (see the sketch below). For example:
Project dir: c:/prj/myProjWrapper/myProj
In c:/prj/myProjWrapper put package.json and run npm install there; Node.js will find it by searching parent directories recursively.
Windows and Linux only, not for Mac: in the project dir, create a .ds_store folder (it is not synced by Dropbox). Put package.json into it and run npm install. You must set NODE_PATH=./.ds_store/node_modules;. when starting Node.js.
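A minimal sketch of the first option, using the example paths above (lodash is just a stand-in for any dependency):

c:/prj/myProjWrapper/package.json      <- dependencies are declared here
c:/prj/myProjWrapper/node_modules/     <- created by npm install; keep this outside the Dropbox-synced folder (or exclude it via selective sync)
c:/prj/myProjWrapper/myProj/index.js   <- the project code that Dropbox syncs

// in myProj/index.js: Node walks up the parent directories looking for node_modules,
// so this resolves to c:/prj/myProjWrapper/node_modules/lodash
const _ = require('lodash');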

Related

Should I delete webpack and other libraries after bundling?

NPM downloads a lot of files needed for webpack and the other libraries. From what I understand, webpack generates one single bundle file that contains all the code the script needs to work. So when I finish building my app, do I need to keep all those jQuery/React files and webpack itself? Or should I just delete them?
It's common practice to make a project portable/shareable by following these steps:
Create a package.json and make sure it captures all dependencies, devDependencies and/or peerDependencies.
Add/commit the package.json and package-lock.json files to your version control.
Create a .gitignore file and add node_modules to it (in essence, this cuts out that baggage).
For production purposes (e.g. a finished product to be shared with a client), build the project (which usually results in a small set of files, often inside /build or /dist). You can then push that build output to AWS, Heroku, or the client's servers.
What does the above help you achieve?
You can easily start the project on any machine, as long as you run npm install, which reads from your package.json.
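As a rough sketch of that setup (the package names, versions and build command are just placeholders):

package.json:
{
  "name": "my-app",
  "scripts": {
    "build": "webpack --mode production"
  },
  "dependencies": {
    "react": "^17.0.2"
  },
  "devDependencies": {
    "webpack": "^5.0.0",
    "webpack-cli": "^4.0.0"
  }
}

.gitignore:
node_modules
dist

With these two files in version control, anyone can clone the project, run npm install to restore node_modules, and run npm run build to produce the bundled output.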

Do we need all files and folders while re-designing a Nuxt JS site?

Is it necessary to have all the files when re-designing a website?
For example, I've designed a site with Nuxt JS and published it. If I want to make some changes in the future, do I need all the files I started with: all the node modules, the pages folder, the components folder, everything? I'm asking because there are tons of files in total.
A recent case that happened to me: I wanted to make some changes to my Nuxt JS site, but I'm missing the "pages" folder; I do, however, still have the "dist" folder. Is there any way I can recover the "pages" folder from my final production site?
Also, what would be best practice for managing Nuxt JS projects? Any tips and tricks will be appreciated.
To develop on a NuxtJS site, you need the directories and files listed in the Nuxt guide's Directory Structure section. The files you don't need for future development are the files in the default .gitignore that create-nuxt-app generates for you, including the dist directory and the node_modules directory.
The dist directory can be regenerated from your source code using npm run generate, and node_modules by running npm install, as long as you have a package.json or package-lock.json file. Anything that can be generated from some other file(s) you don't need to keep.
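For reference, the .gitignore generated by create-nuxt-app typically excludes the regenerable directories roughly like this (the exact contents vary between versions):

node_modules
.nuxt
dist

Everything else (pages, components, nuxt.config.js, package.json, and so on) is source and should be kept.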
Is there any way I can like recover "pages" folder from my final production site?
Unfortunately not.
What will be best practice to manage NuxtJS projects?
Not sure what you mean by "manage", but if you don't use git yet, then git.

NPM: Create an NPM package that adds files and folders to the project root directory

I've created a web app template that I use frequently for many different projects.
I would like to create an NPM package for it so that it's easier to install for new projects, separate the template from the project files, separate the template dependencies from the project dependencies, and allow easier updating of the template across all projects.
The issue I have is that I need some files/folders to be installed in the root directory (i.e. where package.json is saved). Most can go in the node_modules folder; however, I have some files that must be placed in the root directory.
For example, the template uses Next.js with a custom _app.js file. This must be in the root directory in a folder named pages. I also have various config files that must be in the root directory.
Can this be done with NPM, or does everything need to be installed in the node_modules folder? I'm having trouble finding anything on SO or Google that answers this, so if you happen to know of a guide online on how to do this, or can outline things I should search for, it would be much appreciated.
With pure npm, everything has to go to the node_modules folder, so you can't solve your issue this way.
Maybe going with a templating tool such as grunt init or yeoman could be a solution here, although – unfortunately – you'll then lose some of the benefits of being able to install a package via npm.
Another option might be to use GitHub template repositories, which have just been introduced recently.
Last but not least, one option might be to ship only the files' contents in the npm package, create pages/_app.js manually, and inside it simply require the file contents from the npm module. This at least keeps the content portable, but of course it still requires you to set up the file and folder structure on your own.
Sorry that I don't have a better answer, but I hope it helps anyway.
PS: One "solution" might also be to use the postinstall step in an npm module's package.json file to create folder structure, copy files to where they should be and so on, but at least to me this feels more like a clumsy workaround than like a real solution.

What does "Linking Dependencies" during npm / yarn install really do?

For large web apps, npm install or yarn install takes a lot of time, mostly in a step called Linking Dependencies. What is happening here? Is it fetching the dependencies of the dependencies? Or something completely different? Which files are created during this step?
When you call yarn install, the following things happen in order:
Resolution: Yarn starts resolving dependencies by making requests to the registry and recursively looking up each dependency.
Downloading/Fetching: Next, Yarn looks in a global cache directory to see if the package needed has already been downloaded. If it hasn't, Yarn fetches the tarball for the package and places it in the global cache so it can work offline and won't need to download dependencies more than once. Dependencies can also be placed in source control as tarballs for full offline installs (see the note after this list).
Linking: Finally, Yarn links everything together by copying all the files needed from the global cache into the local node_modules directory after identifying what's already there and what's not there.
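On the point above about placing dependency tarballs in source control: Yarn 1 can keep a project-local mirror of every fetched tarball via its offline-mirror setting in .yarnrc, roughly like this (the folder name is just an example):

yarn-offline-mirror "./npm-packages-offline-cache"
yarn-offline-mirror-pruning true

Yarn then saves each downloaded tarball into that folder, which can be committed so later installs work fully offline.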
yarn install does take a lot of time, mostly in a step called Linking Dependencies
You should notice that Step 3: Linking takes more time than Step 1: Resolution and Step 2: Fetching, where the actual download happens. By that step we already have everything we need downloaded and ready, so why does it take so long? Did we miss anything?
Yes: the COPY into the local project's node_modules folder! The reason this is slow is that the copy is not equivalent to copying one large 4.7 GB ISO file. Instead, it is a huge number of very small files (it can easily be 15k+ files), and that is what makes the copy take so long. (It is also important to note that when you download the packages, you download one large tar file per package, whose contents then have to be extracted into the cache, which also takes time.)
It is slower due to:
Anti-virus: your antivirus sits in the middle and does a quick inspection (in addition to yarn checking whether the file already exists) of every single file yarn tries to copy, cutting its speed considerably. If you are on Windows, try adding your project's parent folder as an exception to Windows Defender.
Storage medium's transfer rate: SSDs can improve this speed hugely (sorry, SSHDs and FireCudas will not help much here, since this is a one-time copy).
But is this efficient? Couldn't the packages simply be taken from a global node_modules (after creating one)?
No, on both counts. Because of the way Node works, each package finds its dependencies only relative to its own location. Also, each project may want to use different versions of the same package, to make sure it keeps working and isn't broken by package updates.
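As a tiny sketch of what "relative to its own location" means in practice (express is just a stand-in dependency and the paths are made up):

// run from inside /home/me/projectA: Node searches projectA/node_modules first,
// then walks up through the parent directories
console.log(require.resolve('express'));
// -> /home/me/projectA/node_modules/express/index.js (roughly)

// the same call made from /home/me/projectB resolves to projectB's own copy,
// which is why every project ends up with its own node_modules folder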
Ideally, the project folder should be lean. An efficient way of doing this would be to have a global node_modules folder: any requested package is downloaded if it is not already present AND used from that location. This is actually how Ruby does it; its global gems folder is the equivalent of node_modules, and it holds different versions of the same package side by side for use by different projects.
But keep in mind that this reduces project portability. It's a trade-off that any package manager (be it RubyGems or node modules) has to make. I can copy a Node project folder and expect it to work if I have just that folder (although the copy may in fact take hours, because the local node_modules folder comes along with it). Copying a Ruby project, by contrast, takes only seconds to a few minutes, since there is no local packages (or gems, as they call them) folder, but running the project on a different system then requires those packages to be present in that system's global gems folder.
The documentation for yarn install can be found here.
You can use the command
yarn install --verbose
which shows additional logs while installing dependencies.
The output will show what the yarn/npm install is doing.
It's good for debugging in case the process is failing or taking a long time.
The linking phase works in essentially 3 big steps:
Find every file that needs to be in node_modules
Check this list against what is already there to find what needs to be copied from the cache to node_modules
Do the copy
Maybe this issue on GitHub will help you out: https://github.com/yarnpkg/yarn/issues/1496

How can I separate generated artifacts from the main build with semantic UI?

I am trying to figure out how to integrate Semantic UI with my gulp-based frontend toolchain.
The npm artifact semantic-ui includes an interactive installer that will write a semantic.json file to the root of my project and install the less files, gulp tasks and some configuration into my project. All of these files will be put in subdirectories of a single base directory specified in semantic.json.
I do not want any dependency implementation files or any generated files in the git repository for my project, because this would pollute revision history and lead to unnecessary merge conflicts. I would very much prefer to provide only semantic.json and .gitignore the semantic base directory. On npm install, the Semantic installer should install everything to the base directory specified in semantic.json. When building, I want the artifacts generated into a separate dist directory that does not reside under the semantic base directory.
However, if I do this, the installer will fail with a message stating that it cannot find the directories to update and drop me into the interactive installer instead. This is not what I want, because it means my build is no longer non-interactive (which will cause the CI build to fail).
How can I integrate Semantic UI into my build without having to commit Semantic and its generated artifacts into my git repository?
This is what we did in our similar scenario. The following are true:
Everything Semantic UI generates is in .gitignore. Therefore, the only Semantic UI files we have in our repo are:
semantic.json
semantic/src folder (this is where our theme modifications actually are)
semantic/tasks folder (it probably doesn't need to be in git, but since it's needed for building, everything is simpler if we keep it in our repo)
We never need to (re)run the Semantic UI installer, everything is integrated in our own gulpfile.js.
Semantic UI outputs to our assets folder, which is not in the same folder as its sources.
Semantic UI is updated automatically using npm as per the rules in my package.json.
Here are the steps needed to achieve this:
Install Semantic UI. By this I assume that you either used npm or cloned it from git (using npm is highly recommended); either way, you have semantic.json in your project's main folder and a semantic folder containing gulpfile.js, src and tasks.
Make sure Semantic UI can be built. Navigate to semantic/ and run gulp build. This should create a dist folder in your semantic/ directory. Delete it and also delete Semantic UI's gulpfile.js since you'll no longer need it.
Edit semantic.json. You need to edit two lines:
Change "packaged": "dist/", to the path where you'd like to output semantic.css and semantic.js relative to Semantic UI's folder. In our case, it was "packaged": "../../assets/semantic/",
Change "themes": "dist/themes/" in the same way, since the themes/ folder contains fonts and images Semantic UI uses so it needs to be in the same folder as semantic.css. In our case, it was "themes": "../../assets/semantic/dist/themes/".
Edit your gulpfile.js so that it uses Semantic UI's build task. Add var semanticBuild = require('./semantic/tasks/build'); (if semantic/ is in the same folder as your gulpfile.js) and then simply register a task that depends on it, for example gulp.task('semantic', semanticBuild);.
Optionally, create a clean task. We used del for that.
var del = require('del'); // del has to be required at the top of the gulpfile
gulp.task('clean:semantic', function(cb) {
  del(['assets/semantic'], cb);
});
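Putting steps 4 and 5 together, the relevant pieces of the project's gulpfile.js might then look roughly like this (gulp 3 style, using the task names from this answer):

var gulp = require('gulp');
var semanticBuild = require('./semantic/tasks/build'); // Semantic UI's own build task

// rebuild Semantic UI into the assets folder configured in semantic.json,
// cleaning the previous output first (clean:semantic is defined above)
gulp.task('semantic', ['clean:semantic'], semanticBuild);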