How can I read segment/index files stored inside a JAR file using Apache Lucene? If the files are kept in a folder, everything works properly, but I have to read them from a JAR only.
There is no such functionality out of the box. To make this happen, you would need to implement o.a.l.store.Directory and feed it to the IndexReader.
Needless to say, performance for such a thing would suffer as effectively every change would have to be zipped/unzipped.
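If implementing a custom Directory is more effort than the use case justifies, a pragmatic workaround is to extract the index files from the JAR into a temporary folder at startup and open a regular FSDirectory over the copy. A minimal read-only sketch, assuming a recent Lucene (DirectoryReader/FSDirectory API) and that the index lives under /index/ in the JAR; the entry names here are hypothetical:

    import java.io.InputStream;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import org.apache.lucene.index.DirectoryReader;
    import org.apache.lucene.store.FSDirectory;

    public class JarIndexOpener {
        public static DirectoryReader open() throws Exception {
            // Copy each index file out of the JAR into a temp directory.
            Path tmp = Files.createTempDirectory("lucene-index");
            String[] entries = { "segments_1", "_0.cfs", "_0.cfe", "_0.si" }; // hypothetical names
            for (String name : entries) {
                try (InputStream in = JarIndexOpener.class.getResourceAsStream("/index/" + name)) {
                    Files.copy(in, tmp.resolve(name));
                }
            }
            // Open a normal file-system Directory over the extracted copy.
            return DirectoryReader.open(FSDirectory.open(tmp));
        }
    }

This sidesteps the zip/unzip overhead for reads; writing back into the JAR would still require the custom Directory described above.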
It looks like middleman s3_sync doesn't upload my robots.txt. Is there a way to enable it to always upload a specific file?
It depends on the version of Middleman S3_Sync that you are using.
Versions 3.0.x build the list of files based on the content of the build directory. In that case, copying the file into the build directory will include it in the sync.
Versions 3.3.x moved to the Middleman sitemap in preparation for MM 4. It currently only syncs the files that Middleman is aware of; copying a file into the build directory doesn't make S3_Sync aware of it.
In the second case, there are two options available.
The first one is to move robots.txt to the source directory. This will include it in the sitemap and it will be synced.
The second is to open an issue (or even better, a pull request) that will ask for the ability to include files that originate from outside of the source directory.
It would help to get the version of Middleman and s3_sync that you are using.
I use automatic deployment through FTP. Everything worked well until I started to use CoffeeScript and its file-watcher feature, which recompiles my .coffee files into .js files on every change.
The problem is that IDEA doesn't upload these compiled files like the others, so I have to manually press a hotkey to upload each compiled file after every change that I want to see on the server.
How can I make this more convenient?
There is an option to upload external changes automatically.
Right now IDEA will not perform synchronization after the file watcher is invoked, so you will need to do File | Synchronize; the IDE will detect the changes and upload them.
The next update will have an option for the file watcher to perform synchronization after execution, as a result of addressing this feature request.
This feature request concerns the possibility of synchronizing all files in the output directory on every change (required by some transpilers). But, AFAIK, this is not the case for the CoffeeScript compiler - synchronization should work there. Do the generated .js files appear in the Project View as soon as compilation completes, or do you have to synchronize the view manually to see the changes? In the latter case something must be wrong with the file watcher configuration (the output path set incorrectly, for example). If the files are synchronized correctly, setting the 'upload external changes' option should do the trick.
I have a data parsing utility in the form of a runnable JAR file. I also have an Apache server (Ubuntu 12.04) to which data files are uploaded. Is there any way that I could launch said JAR file as a background process when a file is uploaded? (FYI: file access by multiple processes isn't a concern here; I've got file locking in place.)
Related idea: if the above isn't possible, I could always launch the aforementioned JAR file from a bash script. However, I'm still not sure how to do that via Apache. I'm quite a novice at using it effectively.
Edit: Just noticed this potential php solution. Apache folks: is this a good idea, or is there a better solution?
Maybe you can use File Alteration Monitor to achieve this. It can be configured as a background daemon which performs operations when a new file is spotted. If you want to avoid starting while the file is still being uploaded, wait approximately 5 minutes after the file's change time before starting your utility.
I use a similar technique for monitoring uploaded files on a Samba share and it works flawlessly.
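If you'd rather keep everything in Java instead of a separate daemon, java.nio.file.WatchService (Java 7+) can do the same job. A minimal sketch; the upload directory and the JAR path are assumptions:

    import java.nio.file.*;

    public class UploadWatcher {
        public static void main(String[] args) throws Exception {
            Path uploads = Paths.get("/var/www/uploads"); // hypothetical upload directory
            WatchService watcher = FileSystems.getDefault().newWatchService();
            uploads.register(watcher, StandardWatchEventKinds.ENTRY_CREATE);
            while (true) {
                WatchKey key = watcher.take(); // blocks until an event arrives
                for (WatchEvent<?> event : key.pollEvents()) {
                    Path file = uploads.resolve((Path) event.context());
                    // In production, wait until the upload has finished (e.g. until the
                    // file size stops changing) before handing the file to the parser.
                    new ProcessBuilder("java", "-jar", "/opt/parser.jar", file.toString())
                            .inheritIO()
                            .start();
                }
                key.reset();
            }
        }
    }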
I checked quite a few similar questions, but so far I am unsatisfied with the solutions.
Ever use the Minecraft Server? At initial launch, it creates all the files and folders it needs, and it allows you to make changes to files like server.properties and ops.txt by keeping them external to the executable JAR file.
I'm working on a similar project, and I want to duplicate that behavior. Everything works great when I run it in Eclipse. When I export to a JAR file, though, things get funky. The external files and folders are created without a hitch, but afterward it would appear as though they cannot be read from or written to. Any ideas how Notch made his server?
--edit--
Scratch that, it doesn't even appear to reliably create the files and folders. Maybe it only creates them on the very first run after the JAR is created?
--edit again--
It creates them in the root directory. When I tested it in Eclipse, the root directory was limited to the folder containing the project, and therefore everything looked fine. The solution was to make the class aware of its own location and include it in all file operations.
Have the main class in your executable jar file look up where it is, then have it store that information in a global String or something. Prefix your filenames with that string in your file operations, and voila! It's writing to the correct directory.
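A sketch of that lookup, hedged because class-loader setups vary (it assumes the class is loaded from a plain JAR or class directory on disk, and the config file name is hypothetical):

    import java.io.File;
    import java.net.URISyntaxException;

    public class PathAware {
        // Directory containing the running JAR (or the classes dir when run from the IDE).
        static final File BASE_DIR = findBaseDir();

        private static File findBaseDir() {
            try {
                File location = new File(PathAware.class.getProtectionDomain()
                        .getCodeSource().getLocation().toURI());
                // A JAR is a file; in the IDE the code source is the classes directory.
                return location.isFile() ? location.getParentFile() : location;
            } catch (URISyntaxException e) {
                throw new RuntimeException(e);
            }
        }

        public static void main(String[] args) {
            // Prefix every file operation with BASE_DIR instead of relying on the CWD.
            File props = new File(BASE_DIR, "server.properties");
            System.out.println("Config lives at: " + props.getAbsolutePath());
        }
    }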
I have a project where I create a JAR which contains a bunch of classes with main() plus a set of scripts which set the environment to invoke them. Most of those are long running processes which log a lot (~10-20GB).
This means I have a pretty complex log4j.xml file which, being in src/main/resources/, goes into the JAR. When something breaks in the production system, I'd like to modify the logging on the fly for a single run.
So I came up with the idea of having a conf/ directory on the production system and putting it first on the classpath. Then I thought it would be great if M2 would put the config files in there (instead of in the JAR). But that would overwrite any manual changes on every automated deployment, which I strongly dislike. I'm also not fond of timestamps and things like that.
So my next idea was this: M2 should leave the config files in the JAR but create copies of them named *.tpl in the conf/ directory. The admin could then copy a template to its base name to override the files in the JAR. The .tpl files would be overwritten on deployment, but that wouldn't hurt. Admins would have full control over which version of the logging config was active, and they could run a diff to see whether any important changes were made.
Now the question: has anyone seen a plugin which automates this process? That is, one which creates a conf/ directory with all (or a selected subset) of everything in src/main/resources/ and renames the files?
Best practice in Maven for handling config files is to place them in a separate conf directory and pack them into a binary assembly using the assembly plugin. Placing configuration files like log4j.xml in src/main/resources doesn't make sense, since they are not true application resources but configuration files.
We cope with the overwriting by packing the configuration files with the postfix .def. For example, myapp.properties is packed into the assembly as myapp.properties.def. When the person who uses the assembly unpacks it, it will not overwrite their original files. After unpacking, they simply merge the two with an external tool (we use meld on Fedora Core).
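For reference, a hedged sketch of an assembly descriptor that packs a config file under conf/ with the .def suffix; the file names and archive format here are assumptions:

    <!-- assembly.xml -->
    <assembly>
      <id>bin</id>
      <formats>
        <format>tar.gz</format>
      </formats>
      <files>
        <file>
          <source>src/main/config/myapp.properties</source>
          <outputDirectory>conf</outputDirectory>
          <destName>myapp.properties.def</destName>
        </file>
      </files>
    </assembly>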
I may be missing something, and this doesn't directly answer the question, but did you consider producing a zip assembly of the exploded content of the required artifacts (to be unzipped on the target environment)?
Sounds like you're attacking the problem the wrong way. Why not just run the application with -Dlog4j.configuration=/some/where/my-log4j.properties? If you want, you can add a command line flag to main() which invokes the PropertyConfigurator directly.
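A hedged sketch of such a flag for log4j 1.x; the flag name is made up for illustration:

    import org.apache.log4j.PropertyConfigurator;

    public class Main {
        public static void main(String[] args) {
            // Hypothetical flag letting an admin point at an external log4j config.
            for (int i = 0; i < args.length - 1; i++) {
                if (args[i].equals("--log-config")) {
                    PropertyConfigurator.configure(args[i + 1]); // overrides the packaged config
                }
            }
            // ... rest of the application
        }
    }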