I have created a Puppet module and wish to use (Hiera 5) module-level Hiera data to set values for dependencies. I have created a hiera.yaml, a data dir and a common.yaml. However, module Hiera values do not appear to be passed to the dependencies, and I have to set the values at the environment level instead (which is then not self-contained within the module).
It appears that Hiera keys only resolve when they carry the local module's prefix.
The dependency I am targeting is fervid/secure_linux_cis, and I need to set some Hiera data for its classes with 'enforced => false'. Doing this in a manifest isn't 100% effective, as a few classes end up as duplicate resource declarations.
Essentially, what I want to be able to do in my module's common.yaml is:
---
my::module::key1: 'value'
secure_linux_cis::redhat7::cis_1_1_2:enforced: false
Using the above example, I can resolve the my::module value but the dependency secure_linux_cis value is ignored.
Is there a way to get module level hiera to pass values to a dependency?
Not directly, no. Automatic data binding performs Hiera lookups in the context of the class whose parameters are being looked up, not that of the one declaring it. In your case, then, it is the hiera data of module secure_linux_cis that will be consulted for default values for the parameters of class secure_linux_cis::redhat7::cis_1_1_2.
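In practice, that means the override has to live at a layer that is consulted for the dependency's keys. A minimal sketch, assuming the stock environment-layer hiera.yaml with a per-node data file (the exact path is an assumption):
# <environment>/data/nodes/<certname>.yaml  -- hypothetical per-node file in the environment layer
---
secure_linux_cis::redhat7::cis_1_1_2::enforced: false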
If you are willing to risk using a resource-like declaration of that class, then you should be able to do something along these lines:
class my::module (
  String $key1,
) {
  # Look up the dependency's setting explicitly, defaulting to false if the key is not found.
  $cis112_enforced = lookup('secure_linux_cis::redhat7::cis_1_1_2::enforced', Boolean, undef, false)

  class { 'secure_linux_cis::redhat7::cis_1_1_2':
    enforced => $cis112_enforced,
  }
}
Of course, that carries all the risks attending resource-like class declarations, so I don't actually recommend it. You could instead set the parameter at a per-node level of the environment-layer data. You might also consider the Roles & Profiles pattern, since resource-like class declarations make a little more sense in a profile class.
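For instance, a minimal profile-class sketch along those lines (the profile name and parameter are assumptions, not part of your module):
class profile::cis_baseline (
  Boolean $cis_1_1_2_enforced = true,
) {
  # The resource-like declaration lives in the profile, where it is easier to
  # guarantee the class is declared exactly once.
  class { 'secure_linux_cis::redhat7::cis_1_1_2':
    enforced => $cis_1_1_2_enforced,
  }
}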
secure_linux_cis::redhat7::cis_1_1_2::enforced
This should work. You missed one : in your example.
In Alloy you can make modules polymorphic on signatures by declaring them as parameters in the module definition, e.g.:
module mymodule[sigA,sigB]
In my case, I also want to have predicates about these signatures that might change in the different instantiations of my module. Something like, say,
module mymodule[sigA,predA]
sig B extends sigA {}
pred predB[b : B] { ... }
fact { all b : B | predA[b] => predB[b]}
If I just do it like this naively, Alloy complains that it won't typecheck, as parameters to a module are automatically assumed to be signatures. Is there any workaround or some other way to make modules be polymorphic in predicates like this?
Genericity in the module system is indeed limited to signature parameters. The only workaround (of sorts) I know of is exemplified in util/graph, of which I'm pasting an excerpt:
module util/graph[node]
/** graph is rooted at root */
pred rootedAt[r: node->node, root: node] {
  node in root.*r
}
As you can see, node is a parameter. If you want to talk about a relation on nodes, you pass it as a parameter of a predicate (as rootedAt does with r). So you can't require the existence of a relation on nodes in addition to the signature node itself, but you can still give the module's "client" a means to reason about fields.
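For example, a hypothetical client of util/graph might pass one of its own fields as that relation (the Person signature and boss field below are made up for illustration):
open util/graph[Person]

sig Person { boss: lone Person }

// reason about the client's own field by handing it to the library predicate
fact bossesFormRootedGraph {
  some p: Person | rootedAt[~boss, p]
}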
I have a job whose output format is SequenceFileOutputFormat.
I set the output key and value class like this:
conf.setOutputKeyClass(IntWritable.class);
conf.setOutputValueClass(SplitInfo.class);
The SplitInfo class implements both Serializable and Writable.
I set the io.serializations property as follows:
conf.set("io.serializations","org.apache.hadoop.io.serializer.JavaSerialization,"
+ "org.apache.hadoop.io.serializer.WritableSerialization");
However, on the reducer side I get this error, telling me that Hadoop couldn't find a serializer:
java.lang.NullPointerException
at org.apache.hadoop.io.serializer.SerializationFactory.getSerializer(SerializationFactory.java:73)
at org.apache.hadoop.io.SequenceFile$Writer.init(SequenceFile.java:961)
at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:892)
at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:393)
at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:354)
at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:476)
at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.getRecordWriter(SequenceFileOutputFormat.java:61)
at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.<init>(ReduceTask.java:569)
at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:638)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:417)
Can anyone help, please?
The problem was that I was making a stupid mistake: I was not updating the jar, so SplitInfo did not implement the Writable interface in the old jar that was actually in use.
As a general observation: the underlying cause of the error in the OP is that Hadoop can't find a serializer for a specific type you're trying to serialize (directly or indirectly, e.g. by using that type as an output key/value). Hadoop cannot find a serializer for one of two reasons:
your type is not serializable, i.e. it doesn't implement Writable or Serializable (see the sketch after this list)
there is no serializer available to Hadoop for the kind of serialization your type implements (e.g. your type implements Writable, but Hadoop for one reason or another cannot use the org.apache.hadoop.io.serializer.WritableSerialization class)
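For the first case, a minimal sketch of what a Writable value class needs; the fields here are made up, and only the no-arg constructor plus write()/readFields() matter to WritableSerialization:
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import org.apache.hadoop.io.Writable;

public class SplitInfo implements Writable {
    private long offset;   // example fields, not the real ones
    private long length;

    public SplitInfo() { } // no-arg constructor required by the framework

    @Override
    public void write(DataOutput out) throws IOException {
        out.writeLong(offset);
        out.writeLong(length);
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        offset = in.readLong();
        length = in.readLong();
    }
}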
I think you're trying to do something you don't need to. Your output value only needs to implement the Writable interface and you should just set the output format.
conf.setOutputFormatClass(SequenceFileOutputFormat.class);
You only use the "io.serializations" configuration if you want to use a different serialization framework, which it doesn't look like you need.
I've got a Registry class, and there are a few registry values that I want to access from within that Registry class. (There is a bit of calculation involved with these values, so I thought I'd just put all that code right in the Registry class itself.)
So we might have something within our RegistryRoutine.cls like:
Function GetMyValue() As Integer
    Dim R As New RegistryRoutine
    ' <calculations>
    GetMyValue = R.GetRegistryValue(HKEY, key, value, etc.)
End Function
No, in general you won't see any problems (like member variables being overwritten or anything weird like that).
Where you could run into issues is if you have explicitly shared variables that are being written to multiple times. But that's dangerous no matter what you do.
Do watch out for recursive cases - e.g., GetMyValue() should not call R.GetMyValue(), nor should GetRegistryValue() call GetMyValue().
It's rare that you actually want to do this, however.
Since you're not passing any arguments into GetMyValue(), we can assume that the current instance already has all the information it needs.
Since you're only returning an integer, not a RegistryRoutine instance, the client has no need for a new instance.
So, why not just call GetRegistryValue (without the R.)?
It's quite common for classes to work with instances of themselves. Consider, for example, how a tree structure works. Either a Node class has to keep track of its children, or has to keep track of its parent (or both). All the nodes are the same class.
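A minimal sketch of that idea in a VB6-style class module (Node.cls is hypothetical, not part of your project):
' Node.cls
Option Explicit

Private m_Children As Collection   ' each element is itself a Node

Private Sub Class_Initialize()
    Set m_Children = New Collection
End Sub

Public Sub AddChild(ByVal child As Node)
    m_Children.Add child
End Sub

Public Property Get ChildCount() As Long
    ChildCount = m_Children.Count
End Property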
Is the setX() method name appropriate only for setting class property X?
For instance, I have a class whose output is a string containing an HTML table. Before you can call getTable(), you have to call setTable(), which just looks at other properties and decides how to construct the table. It doesn't directly set any class property -- it only causes the property to be set. When it's called, the class constructs strHtmlTable, but you can't specify it yourself.
So, calling it setTable breaks the convention of get and set being interfaces for class properties.
Is there another naming convention for this kind of method?
Edit: in this particular class, there are at least two (and in total 8 optional) other methods that must be called before the class knows everything it needs to construct the table. I chose to set the data via separate methods rather than clutter up __construct() with 8 optional parameters whose order I'll never remember.
I would recommend something like generateTable() instead of setTable(), so that the name of the method clearly denotes what it does.
I would probably still keep a setTable() method that actually sets the property, though. Ideally, that leaves open the possibility of supplying a previously built table, for further flexibility.
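A rough sketch of that split, assuming PHP and made-up class/property names (not your actual class):
class ReportTable
{
    private $rows = array();
    private $strHtmlTable = '';

    public function addRow(array $cells)
    {
        $this->rows[] = $cells;
    }

    // builds the HTML from the other properties; nothing is passed in
    public function generateTable()
    {
        $html = '<table>';
        foreach ($this->rows as $cells) {
            $html .= '<tr><td>' . implode('</td><td>', $cells) . '</td></tr>';
        }
        $this->strHtmlTable = $html . '</table>';
    }

    // plain setter kept around for injecting a previously built table
    public function setTable($html)
    {
        $this->strHtmlTable = $html;
    }

    public function getTable()
    {
        return $this->strHtmlTable;
    }
}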
Yes, setX() is primarily used for setting a field X, though setX() may have some additional code that needs to run in addition to a direct assignment to a field. Using it for something else may be misleading to other developers.
I would definitely recommend against a public setTable(); depending on your requirements, setTable() could be omitted entirely or kept as a private method.
It sounds like generating the table is really a view over the object's other properties, so you might consider moving that work to a private method such as generateHtmlTable(). This could be done during construction (and upon updates to the object) so that any subsequent call to getTable() returns the appropriate HTML.
I am using classes and static methods to 'scope' functions in a namespace, similar to C#. However, every time I add a new method to a class, at first it is not found. I have to restart the MATLAB environment (2007a) for the new methods to be recognised.
Surely there is an 'update' or 'refresh' type command that I can use so that I do not have to restart the MATLAB environment each time I add a function?
Issuing this call to CLEAR should do it:
clear classes
One unfortunate side effect of this is that it also effectively issues a clear all, which clears all of the variables in the workspace as well (however, this would happen anyway when you close and restart MATLAB). This clearing of the workspace actually serves a purpose, since it will remove any variables of the same type as the old version of your class, which potentially wouldn't work correctly with the new version of your class.
The function REHASH may work, but I doubt it (I think it deals more with file paths than class definitions).
Clearing instances of your class should work.
Suppose that you have an instance of "MyClass" in your base workspace:
foo = MyClass;
Now, suppose you edit MyClass and add a new static method "bar":
foo.bar(); % Will cause error, as foo is instance of previous "MyClass"
However, "clear"-ing foo will remove the reference to the previous class:
clear('foo');
foo = MyClass;
foo.bar(); % this should now work.
This should be fine if you only have one or two instances of the class in your base workspace. If you have many instances of the class in your base workspace, then you may want to write a script to clear them:
varList = whos;
for iVar = 1:numel(varList)
    if isequal('MyClass', varList(iVar).class)
        clear(varList(iVar).name);   % clear each instance of the old class definition
    end
end
clear('varList');
clear('MyClass');
If you have instances of the class in more locations, you may wish to extend the script as appropriate.
The last call to clear the class name might only be necessary if you are making modifications to classes in an inheritance hierarchy.
try "clear classname"