How to block IntelliJ IDEA from building huge Nexus indexes in /tmp - intellij-idea

I just found out that IntelliJ IDEA builds an index of the Nexus repository and puts it in /tmp in the default configuration. In my case I found out that it takes 7G of RAM from me, which is far from acceptable. Is there a way to control this behavior or turn this feature off? I understand that it is probably needed by some functionality, but 7G is way too much.
3.0G    nexus-maven-repository-index.gz2870725737868948664
4.0G    nexus-maven-repository-index.gz289507210603911153.dir
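One knob that may help (a sketch only, not verified against every IDEA version; the paths below are placeholders): the Maven/Nexus indexer runs inside IDEA's JVM, so you can point the JVM temp directory and IDEA's system directory (where repository indices are normally kept) at a filesystem with more room instead of /tmp.
In idea.vmoptions (or idea64.vmoptions), add:
-Djava.io.tmpdir=/var/idea-tmp
In idea.properties, add:
idea.system.path=/var/idea-system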

Related

How to Prevent GitLab Runner From Ever Using /home

I built my runner and it works fine. However, when it initializes, it first clones the project to /home/user/builds/xxxx... I never, ever, ever want GitLab to use /home. Never. Not for anything. I was told that it is impossible to change it to a different location. I find that hard to believe.
See the image below: it gives me a warning about templates not found in some made-up directory, then clones the entire project under the user's home directory. I don't give it that command, so it must be a default. Is there a way to choose ANY other mount point? The project is several hundred gigabytes and the /home directory is 50k. I cannot control that. So to a different mount point it must go.
I can provide the yml etc., but this is about the core behavior of the runner itself, not anything I created. I'm hoping it is a simple variable I can send when initializing the runner.
Thank you in advance.
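One hedged sketch (the paths, name, and token are placeholders; with the docker executor the directory is resolved inside the container): GitLab Runner's per-runner config.toml supports builds_dir and cache_dir, which control where clones and caches land, so pointing them at another mount should keep /home out of the picture.
[[runners]]
  name = "big-disk-runner"
  url = "https://gitlab.example.com/"
  token = "REDACTED"
  executor = "shell"
  builds_dir = "/data/gitlab-runner/builds"
  cache_dir = "/data/gitlab-runner/cache"
After editing config.toml, restart the runner service so the new directories take effect.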

How can I stop IntelliJ IDEA from indexing?

Without "excluding" folders, how can I stop Indexing intellij idea on start? It's very very annoying that it starts indexing on startup without allowing you to, for example, debug a test case making you wait until indexing has finished.
You can stop it from synchronizing/indexing each time you switch to the IDEA window, which is quite useful when dealing with big projects and an outside build process that triggers indexing.
Just disable the checkbox System Settings -> Synchronize files on frame or editor tab activation.
Make sure you run indexing manually to update hints when needed, via the Synchronize action (Cmd-Alt-Y on Mac) or File -> Reload All from Disk for IDEA 2020 running under Linux.
Try:
File -> Invalidate Caches / Restart.
I think this issue happens (at least for me it seems to be the reason) if you start a commit and then close the IDE in the middle of the process.
Now, "excluding" folders is the only way to disable indexing when a change occurs in one of the excluded folders (except for generated sources, which mustn't be excluded).
Checked with IntelliJ 2016.2.5.
I suggest you tune your IntelliJ configuration; see this post: https://stackoverflow.com/a/22508853/779338
You can easily resolve it.
Just go to File -> Settings -> Directories.
Exclude the files and folders that are going to be included, and empty the Add Content Root list if needed.
It may be because a libraries folder is added under more than one subfolder in the project. For example, in my case I have a node_modules folder in two locations in the project: one under the root directory (app --> node_modules) and another under app --> test --> node_modules.
Simply right-click on each folder and select Mark Directory As --> Excluded.
It resolved my problem. Hope it is helpful.
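For reference, marking a directory as excluded just records an excludeFolder entry in the module's .iml file, so the same exclusion can be checked or edited by hand. A sketch with an example module type and paths (yours will differ):
<module type="WEB_MODULE" version="4">
  <component name="NewModuleRootManager">
    <content url="file://$MODULE_DIR$">
      <excludeFolder url="file://$MODULE_DIR$/node_modules" />
      <excludeFolder url="file://$MODULE_DIR$/test/node_modules" />
    </content>
  </component>
</module>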
After struggling with this issue for around a week and searching all the solutions, I came to the conclusion below:
Either upgrade your IntelliJ, which gives you the feature to pause indexing (I can't upgrade mine due to license issues), or disable all the plugins and then try re-enabling them one by one as needed.
If, after enabling a plugin, you see the same behavior, uninstall that plugin and download some alternative to it.
If long indexing times are an issue, you can enable shared indexes.
Add this plugin to your IntelliJ and follow the instructions from here.
In my case, files were constantly being indexed in an Angular application after I removed the node_modules folder.
I tried invalidating caches multiple times, which didn't help at all.
The only solution was to remove the project, with a clean Git state, and re-clone it; then everything started to work flawlessly.

How to configure IntelliJ products WITHOUT editing files in bin?

I'd like to set some specific options in idea.vmoptions and idea.properties for IntelliJ IDEA 14, but I don't have access to those files in C:\Program Files\... (yes, that's Windows, don't troll ;)
Is there a folder in %UserProfile% or an environment variable I could set to read those files (both vmoptions and properties!) from elsewhere?
Please don't suggest to copy the whole IDEA folder elsewhere, there's a reason why I can't access it. I would be interested in a Linux solution too, the same would most likely work on Windows.
My Research
For Mac there are specific instructions at Increasing Memory Heap, but for Linux and Windows it just gives the file names, which are trivial to find out anyway.
I also found IntelliJ IDEA files locations, but it says the settings can be modified in IDEA_HOME\bin\idea.properties, which doesn't help since I can't access that file yet want to change the properties in it.
Update: Simple Answer
Create IDEA_PROPERTIES and IDEA_VM_OPTIONS environment variables and point them to the files you want, restart the IDE, done.
Also see documentation for more (and maybe report that it lacks any mention of IDEA_PROPERTIES).
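For example (the directory is a placeholder), on Windows you could run:
setx IDEA_PROPERTIES "%UserProfile%\idea-config\idea.properties"
setx IDEA_VM_OPTIONS "%UserProfile%\idea-config\idea64.exe.vmoptions"
and on Linux put the equivalent in your shell profile:
export IDEA_PROPERTIES="$HOME/idea-config/idea.properties"
export IDEA_VM_OPTIONS="$HOME/idea-config/idea64.vmoptions"
Then restart the IDE so the files are picked up.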
You can use %USERPROFILE%\.IntelliJIdea14\idea%BITS%.exe.vmoptions on Windows as a custom options file. I tried it and it works.
Another way that I haven't tried, but that I think should work, is to copy idea.bat and edit it to use the file you need.

Operating-system-agnostic way to get a config file in Clojure

I have a few Clojure applications that load sensitive info from a .properties file in /etc/, and this has worked well so far.
Recently I have had to deal with a few Windows machines added to our server collection, and I need to run the Clojure applications there as well. Windows obviously doesn't have or understand the /etc/ path, and I got around that by looking at /etc/ first and, if that's missing, looking at d:\configs.
But I don't quite like this way of doing it, because if another Windows developer looks into it and doesn't have d:\ or prefers to keep configs elsewhere, it would get messy.
Is there any way I can load a file from Clojure no matter what the operating system is? My initial thought was to save a path in an environment variable and access it from Clojure.
I am just wondering if there is a better way of doing it.
Thanks.
Have a look at environ. It offers some flexibility when it comes to configuring your Clojure app, letting you choose between a number of options:
environment variables: This seems to be the way to go in Clojureland, so I'd say your initial thought wasn't the worst;
in ~/.lein/profiles.clj: You can store them in the :user profile as Clojure data - that sounds quite nice, I guess;
Java CLI properties: Finally, you can pass them to the java executable directly via the command line.
environ will collect data from all these places.
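A minimal sketch of the environment-variable route using environ (the :config-path key and the fallback file name are examples I chose; environ maps an env var named CONFIG_PATH, a config.path JVM property, or a :config-path entry in profiles.clj to the same key):
(ns myapp.config
  (:require [environ.core :refer [env]]
            [clojure.java.io :as io]))

;; Resolve the config file from the environment, falling back to a local file.
(defn config-file []
  (io/file (env :config-path "config.properties")))
This works the same on Linux (export CONFIG_PATH=/etc/myapp.properties) and Windows (set CONFIG_PATH=D:\configs\myapp.properties).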

/tmp filling up with surefire files

When Jenkins invokes a Maven build, /tmp fills up with hundreds of surefire839014140451157473tmp files. How can I explicitly redirect these to another directory during the build? For a Clover build it fills up with hundreds of grover53460334580.jar files. Any idea how to overcome this?
Also, does anybody know the exact steps to create a ramdisk so I could redirect the surefire files into it? Would it save write time to the hard drive?
Thanks
Many programs respect the TMPDIR (and sometimes TMP) environment variables. Maybe Jenkins uses APIs that respect them? Try:
TMPDIR=/path/to/bigger/filesystem jenkins
when launching Jenkins. (Or however you start it -- does it run as a daemon and have a shell-script to launch it?)
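If the surefire*tmp files are being created by the Maven JVM itself, a related (untested here) option is to point java.io.tmpdir at a bigger directory just for the build, e.g. via MAVEN_OPTS in the Jenkins job configuration; the path below is only an example:
mkdir -p /data/build-tmp
MAVEN_OPTS="-Djava.io.tmpdir=/data/build-tmp" mvn clean verify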
There might be some performance benefit to using a RAM-based filesystem -- ext3, ext4, and similar journaled filesystems will order writes to disk, and even a quick open(..., O_CREAT) followed by an immediate unlink() will probably require both on-disk journal updates and directory updates. (Homework: test this.) A RAM-based filesystem won't perform the journaling, and might or might not write anything to disk (depending upon which one you pick).
There are two main choices: ramfs is a very simple window into the kernel's caching mechanism. There is no disk-based backing for your files at all, and no memory limits. You can fill all your memory with one of these very quickly, and suffer very horrible consequences. (Almost no programs handle out-of-disk well, and the OOM-killer can't free up any of this memory.) See the Linux kernel file Documentation/filesystems/ramfs-rootfs-initramfs.txt.
tmpfs is a slight modification of ramfs -- you can specify an upper limit on the space it can allocate (-o size) and the page cache can swap the data to the swap partitions or swap files -- which is an excellent bonus, as your memory might be significantly better used elsewhere, such as keeping your compiler, linker, source files, and object files in core. See the Linux kernel file Documentation/filesystems/tmpfs.txt.
Adding this line to your /etc/fstab will change /tmp globally:
tmpfs /tmp tmpfs defaults 0 0
(The default is to allow up to half your RAM to be used on the filesystem. Change the defaults if you need to.)
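For example, to cap it at 2 GB instead of the half-of-RAM default, the options field would become:
tmpfs /tmp tmpfs defaults,size=2g 0 0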
If you want to mount a tmpfs somewhere else, you can; maybe combine that with the TMPDIR environment variable from above, or learn about the new shared-subtree features in Documentation/filesystems/sharedsubtree.txt (made easy via pam_namespace) to make it visible only to your Jenkins and its child processes.
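Putting the pieces together, a sketch with an arbitrary mount point and size:
sudo mkdir -p /mnt/jenkins-tmp
sudo mount -t tmpfs -o size=2g tmpfs /mnt/jenkins-tmp
TMPDIR=/mnt/jenkins-tmp jenkins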