How to use a common file across many Terraform builds? - structure

I have a directory structure as follows to build the Terraform resources for my project:
s3
  main.tf
  variables.tf
  tag_variables.tf
ec2
  main.tf
  variables.tf
  tag_variables.tf
vpc
  main.tf
  variables.tf
  tag_variables.tf
When I want to build or change something in s3, I run Terraform in the s3 directory.
When I want to build the ec2 resources, I cd into that folder and do a Terraform build there.
I run them one at a time.
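So the workflow for each change is something like this (standard Terraform commands):
cd s3
terraform init
terraform plan
terraform apply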
At the moment I have a list of tags defined as variables, inside each directory.
This is the same file, copied many times to each directory.
Is there a way to avoid copying the same tags file into all of the folders? I'm looking for a solution where I have only one copy of the tags file.
Terraform does offer a solution of sorts using locals, but this still needs the file to be repeated in each directory.
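For example, something like this still has to live in every directory (the tag names and values here are made up):
locals {
  common_tags = {
    Project = "myproject"
    Owner   = "platform-team"
  }
}
# then, on each resource:
#   tags = local.common_tags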
What I tried:
I tried putting the variables in a module, but variables are internal to a module; modules are not designed to share code into the main file.
I tried making the variables an output from the module but it didn't like that either.
Does anyone have a way to achieve one central tags file that gets used everywhere? I'm thinking of something like an include of a source code chunk from elsewhere? Or any other solution would be great.

Thanks for that advice, ydaetskcoR. I used symlinks and this works perfectly.
I placed the .tf file containing the tags list in a common directory. Each Terraform project now has a symbolic link to that file. (Plus I've linked some other common files in this way, like provider.tf.)
For the benefit of others, in Linux a symbolic link is a small file that is a link to another file, but it can be used exactly like the original file.
This allows many different and separate Terraform projects to refer to the common file.
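For example, with the directory names from the question (the common directory name is my own choice):
mkdir common
mv s3/tag_variables.tf common/tag_variables.tf
rm ec2/tag_variables.tf vpc/tag_variables.tf
ln -s ../common/tag_variables.tf s3/tag_variables.tf
ln -s ../common/tag_variables.tf ec2/tag_variables.tf
ln -s ../common/tag_variables.tf vpc/tag_variables.tf
Note that a symlink target is resolved relative to the directory containing the link, which is why each target starts with ../.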
Note: If you are looking to modularise common Terraform functions, have a look at Terraform modules; these are designed to modularise your code. They can't be used for the simple use case above, however.

Related

What is a .env (or dotenv) file, exactly?

There are a lot of programs out there that can utilize .env files. Most of them support the basic bash syntax; others support more advanced things like templating within the .env files. The way you're supposed to use .env files varies a lot as well. Often in this context you'll hear about the twelve-factor app, which states that you should export (some) configuration as environment variables. That apparently leads to two use cases:
Some dotenv programs do just that: you prefix your command on the shell with dotenv, and the variables in the dotenv file will magically be available in your process.
Others, however, provide libraries that actually read .env files from within your code -- a very different approach, since your code is suddenly interpreting .env files directly and does not "see" the environment variables anymore.
Since there are so many different interpretations of usage, use cases, and syntax, is there a proper definition of .env files or some commonly accepted standard? If not, at least some historical references?
It appears that .env files lack a complete definition.
This reference:
https://devcenter.heroku.com/articles/heroku-local
Links here:
http://blog.daviddollar.org/2011/05/06/introducing-foreman.html
Which leads here:
https://ddollar.github.io/foreman/
Which contains this:
ENVIRONMENT
If a .env file exists in the current directory, the default
environment will be read from it. This file should contain key/value
pairs, separated by =, with one key/value pair per line.
FOO=bar
BAZ=qux
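As a rough illustration of that rule (a sketch only, not any particular library's behaviour; real dotenv libraries add comments, quoting, and export handling on top), a minimal reader could look like this:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.HashMap;
import java.util.Map;

public class DotenvSketch {
    // Parse one KEY=VALUE pair per line, as foreman's description states.
    public static Map<String, String> read(Path file) throws IOException {
        Map<String, String> env = new HashMap<>();
        for (String line : Files.readAllLines(file)) {
            int eq = line.indexOf('=');
            if (eq > 0) {
                env.put(line.substring(0, eq).trim(), line.substring(eq + 1).trim());
            }
        }
        return env;
    }
}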

How to mark package as a resource folder?

I have a dir structure for IntelliJ 12:
...
  test
    - java
      - com.mycompany.myproject
        - package1 (contains code, etc.)
        - resourcePackage (want to contain .json files but can't mark as a resource)
          - myOtherJunk.json
        - ...
    - resources
      - aResource.json
The thing is, if I right-click on my package name (com.mycompany.myproject), I can only add packages, not directories (like the existing resources folder).
However, I don't want to use that existing resources folder for the .json files that I'm going to read in from my test class.
So, I need something to support:
// This already works for aResource.json in the resources directory, but doesn't
// work for myOtherJunk.json in resourcePackage.
InputStream is = MyClassTest.class.getResourceAsStream("aResource.json");
This can be solved in several ways. An example of a good approach would be the following folder structure:
src
  main
    java
    resources
  test
    java
    resources
When this is done, you put all your Java classes under src/main/java in the com.mycompany package, and any resources under the src/main/resources/com/mycompany folder.
To link them together, go to the project properties and find the Path tab. Mark src/main/java and src/main/resources as source folders.
If you link them together, you'll be able to use the getResourceAsStream() method.
If you wonder why you should use this folder structure: it is the standard Maven way of keeping things clean and tidy.
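With that structure in place, a lookup along these lines should work (class and file names taken from the question; the package path is an assumption):

// Same-package lookup: file under src/test/resources/com/mycompany/myproject/
InputStream is = MyClassTest.class.getResourceAsStream("myOtherJunk.json");

// Absolute lookup from the classpath root works from any package:
InputStream is2 = MyClassTest.class.getResourceAsStream("/com/mycompany/myproject/myOtherJunk.json");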
Directories Creation
IntelliJ creates directories when you ask it to create a package. This is not an error.
If you create package "com", it will create the dir "com", and if you create a source file there, it will think that the file is in the package "com".
If you create package "com.next.pack", it will create three nested dirs "com", then "next", then "pack", and if you create a source file there, it will think that the file is in the package "com.next.pack".
Directories Structures
Only the path under the source root is taken as a package. Any directory can be set as a source root.
Resources roots
You can make practically any structure of directories. Somewhere in it there is the root directory of resources; right-click it and choose Mark Directory As... > Resources Root.
Note that the same method can be used for the directory structures of tests, test classes, and test resources.
You can also use the @ContextConfiguration annotation to load the resource files. Please see the example below.
@ContextConfiguration({ "/app-config.xml", "/test-data-access-config.xml", "/application-test.yml" })

jars, external properties, and external file io

I checked quite a few similar questions, but so far I am unsatisfied with the solutions.
Ever use the Minecraft server? At initial launch, it creates all the files and folders it needs, and allows you to make changes to files like server.properties and ops.txt by keeping them external to the executable jar file.
I'm working on a similar project, and I want to duplicate that behavior. Everything works great when I run it in Eclipse. When I export to a jar file, though, things get funky. The external files and folders are created without a hitch, but afterward, it appears as though they cannot be read from or written to. Any ideas how Notch made his server?
--edit--
Scratch that, it doesn't even appear to reliably create the files and folders. Maybe it only creates them the very first run after creation?
--edit again--
It creates them in the root directory. When I tested it in Eclipse, the root directory was limited to the folder containing the project, and therefore looked fine. The solution was to make the class aware of its location and include it in all file operations.
Have the main class in your executable jar file look up where it is, then have it store that information in a global String or something. Prefix your filenames with that string in your file operations, and voila! It's writing to the correct directory.
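A common way to do that lookup is via the ProtectionDomain API (the class and method names below are placeholders, but the API calls are standard):

import java.io.File;
import java.net.URISyntaxException;

public class AppPaths {
    // Returns the directory containing the running jar,
    // or the classes directory when run from an IDE.
    public static File baseDir() throws URISyntaxException {
        File location = new File(AppPaths.class.getProtectionDomain()
                .getCodeSource().getLocation().toURI());
        return location.isFile() ? location.getParentFile() : location;
    }
}

Then prefix file operations with it, e.g. new File(AppPaths.baseDir(), "server.properties").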

Teamcity 2 configurations merge and deploy

I have two TeamCity configurations: one builds my common helpers and reusable components, and the other a website which uses the common project.
I use a third configuration to publish to a test environment.
When the third configuration is run, I would like it to get the artifacts from the common project, merge them with the website output, and deploy. Am I asking for too much?
This ought to be pretty straightforward.
On ThirdConfig, add two artifact dependencies: one whose source is CommonProject, and another whose source is WebProject. When configuring an artifact dependency, you can specify which artifact files are actually pulled from CommonProject and WebProject into ThirdConfig via the 'Artifact paths'. The artifact files can then be placed into some new folder hierarchy specific to ThirdConfig by using the 'Destination path'. These two options ought to be enough to create the directory structure that is the merging of CommonProject and WebProject. That takes care of the merge part.
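For example, the 'Artifact paths' rules might look like this (folder names invented; TeamCity's rule syntax is source => destination):

From CommonProject:  dist/** => merged/bin
From WebProject:     site/** => merged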
The deploy is a bit more tricky. To my knowledge TeamCity does not support any sort of 'copy or upload to external location' function out of the box. For this bit you'll need to create an msbuild script (or batch file, or anything that can be run from the command line). Said script can expect the file/directory structure you've created via artifact dependencies where the root of the structure is the initial working directory of the script, and need only push these files out to your specific deploy location. That 'push' of course is going to be specific to your environment. Ftp, unc share, etc.
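As an illustrative sketch only (the share path is made up), a batch step run from ThirdConfig's working directory could be as simple as:

rem Copy the merged structure produced by the artifact dependencies to the test server.
xcopy /E /I /Y merged \\testserver\wwwroot\mysite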

How can I make deployed resources editable with Maven 2?

I have a project where I create a JAR which contains a bunch of classes with main() plus a set of scripts which set the environment to invoke them. Most of those are long running processes which log a lot (~10-20GB).
This means I have a pretty complex log4j.xml file which, being in src/main/resources/, goes into the JAR. When something breaks in the production system, I'd like to modify the logging on the fly for a single run.
So I came up with the idea of having a conf/ directory on the production system and putting it first in the classpath. Then I thought it would be great if M2 would put the config files in there (instead of the JAR). But that would overwrite any manual changes during an automated deployment, which I strongly dislike. I'm also not fond of timestamps and things like that.
So my next idea was this: M2 should leave the config files in the JAR but create copies of them, named *.tpl, in the conf/ directory. The admin could then copy a template to the basename to override the files in the JAR. The .tpl files would be overwritten, but that wouldn't hurt. Admins would have full control over which version of the config was active, and they could run a diff to see whether any important changes were made.
Now the question: has anyone seen a plugin which automates this process? That is, one which creates a conf/ directory with all (or a selected subset) of the files in src/main/resources/ and renames them?
Best practice in Maven for handling config files is to place them in a separate conf directory and pack them in a binary assembly using the assembly plugin. Placing configuration files like log4j.xml in src/main/resources doesn't make sense, since log4j.xml is not a true application resource but more of a configuration file.
We cope with the overwriting by packing the configuration files with the postfix .def. For example, myapp.properties is packed into the assembly as myapp.properties.def. When the person who uses the assembly unpacks it, it will not overwrite their original files. After unpacking, they simply merge the files with an external tool (we use meld on Fedora Core).
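For illustration, the renaming can be done in the assembly descriptor itself; a sketch, assuming the config lives in src/main/conf (paths invented):

<files>
  <file>
    <source>src/main/conf/myapp.properties</source>
    <outputDirectory>conf</outputDirectory>
    <destName>myapp.properties.def</destName>
  </file>
</files>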
I may be missing something, and this doesn't directly answer the question, but did you consider producing a zip assembly of the exploded content of the required artifacts (to be unzipped on the target environment)?
Sounds like you're attacking the problem the wrong way. Why not just run the application with -Dlog4j.configuration=/some/where/my-log4j.properties? If you want, you can add a command line flag to main() which invokes the PropertyConfigurator directly.
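For example (the config path and jar name here are invented):

java -Dlog4j.configuration=file:/etc/myapp/log4j-debug.properties -jar myapp.jar

Note that log4j 1.x treats the value as a URL and falls back to a classpath resource, so the file: prefix is the safe form for an absolute path.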