How do I initialize attributes of objects when I create them in an array of objects?
Say I would normally do something like:
Train train1(unique_id_1,unique_serial_nr_1);
Train train2(unique_id_2,unique_serial_nr_2);
etc...
How can I do those initializations if I want to use an array instead?
I tried something like:
Train Trainarray [10] {train1(unique_id_1,unique_serial_nr_1), train2(unique_id_2,unique_serial_nr_2), etc.. }
but that doesn't seem to work.
Regards
Anders
In random forest type models, there is usually an attribute like "estimators" which returns all the tree splits as a list of lists.
I can't seem to find something similar in lightgbm. The closest I can come is lgb.plot_tree, which gives a nice visualization of a single tree. But I would like to use the data shown in the visualization as variables.
How can I get at this data?
There isn't an exact equivalent in LightGBM, but you can use the dump_model or trees_to_dataframe methods of the booster_ attribute of the scikit-learn estimator, i.e.
import lightgbm as lgb

clf = lgb.LGBMClassifier().fit(X, y)
clf.booster_.dump_model()          # nested dict describing every tree
clf.booster_.trees_to_dataframe()  # one row per tree node, as a pandas DataFrame
I am trying to do Multiple Instance Learning (MIL) for a binary classification problem, where each bag of instances has an associated 0/1 label. However, the bags contain different numbers of instances. One workaround is to truncate every bag to the smallest bag size. For example:
Bag1 - 20 instances, Bag2 - 5 instances, Bag3 - 10 instances, etc.
Here I would keep only the minimum, i.e. 5 instances, from every bag. However, this technique discards all the other instances, which might contribute to the training.
Is there any workaround/algorithm for MIL that can handle a variable number of instances per bag?
You can try using RaggedTensors for this. They're mostly used in NLP work, since sentences have variable numbers of words (and paragraphs have variable numbers of sentences, etc.), but nothing about ragged tensors limits them to that domain.
See the TensorFlow docs for more information. Not all layers or operations support ragged inputs, but you can use things like Dense layers to build a Sequential or Functional (or even custom) model, if that works for you.
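As a minimal sketch (the bag sizes and the 5-dimensional feature vectors here are invented for illustration), you can hold variable-size bags in a single RaggedTensor and pool over the ragged instance axis before a Dense classifier:

```python
import tensorflow as tf

# Three bags with 2, 4 and 3 instances; each instance is a 5-dim feature vector.
bags = tf.ragged.constant(
    [
        [[0.1] * 5, [0.2] * 5],
        [[0.3] * 5, [0.4] * 5, [0.5] * 5, [0.6] * 5],
        [[0.7] * 5, [0.8] * 5, [0.9] * 5],
    ],
    ragged_rank=1,
)

# Mean-pool each bag over its (variable-length) instance axis.
# Reducing over the ragged dimension yields an ordinary dense tensor,
# one fixed-size vector per bag.
pooled = tf.reduce_mean(bags, axis=1)    # shape (3, 5)

# A plain Dense layer then produces one bag-level prediction each.
clf = tf.keras.layers.Dense(1, activation="sigmoid")
bag_probs = clf(pooled)                  # shape (3, 1)
print(pooled.shape, bag_probs.shape)
```

Mean pooling is just one choice of permutation-invariant aggregation; max pooling or an attention-weighted sum over the ragged axis are common MIL alternatives.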
I have worked with lists in G1ANT, but lists are 1D arrays. Now I want to work with a 2D array, and I don't know how to do that in G1ANT. Is there any way of doing it?
In its current state, G1ANT has no 2D array structure. However, you can use C# snippets, in which it is possible to create 2D arrays and operate on them.
Right now I am using yolo3, but it does not have all 9000 classes.
Is it possible to use yolo3 with the yolo9000 classes and weights?
I would like to detect as many object types as possible to assign tags to some pictures, and I am not sure whether I should use yolo9000 or whether yolo3 can be used with the 9000 classes.
Update:
So, I tried loading the yolo9000 weights with yolo3, and the model does not find any objects. It seems I need to tweak something else.
Got it: I needed to lower the thresholds when calling detect; the default value is 0.5.
https://github.com/pjreddie/darknet/blob/61c9d02ec461e30d55762ec7669d6a1d3c356fb2/python/darknet.py#L125
# lower thresh and hier_thresh from the 0.5 default so low-confidence detections are kept
det = darknet.detect(self._net, self._meta, im, thresh=.1, hier_thresh=.1, **kwargs)
Looking at third-party layer implementations, like those in tensorflow_addons, I see that each layer is registered as a custom object.
For example, you can see the use of the wrapper register_custom_keras_object call here.
This wrapper uses the function tf.keras.utils.get_custom_objects() to do the registering.
My question is: why should this be done for custom layers? What is the benefit of registering a layer as a custom object?
Doing this allows you to refer to your custom object by a string name. You see this with Keras' default objects all the time. For example:
# You can either compile a model with the Adam optimizer like this
model.compile(optimizer='adam', ...)
# or like this
adam = keras.optimizers.Adam()
model.compile(optimizer=adam, ...)
Taken from the definition of custom_object_scope:
Code within a with statement will be able to access custom objects by name. Changes to global custom objects persist within the enclosing with statement. At end of the with statement, global custom objects are reverted to state at beginning of the with statement.
Example: Consider a custom object MyObject
with custom_object_scope({'MyObject': MyObject}):
    layer = Dense(..., kernel_regularizer='MyObject')
    # save, load, etc. will recognize custom object by name
Defined as
def custom_object_scope(*args)
Arguments:
*args: Variable length list of dictionaries of name, class pairs to add to custom objects.
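As a runnable sketch of why registration is useful (the clipped_relu function and its name are invented for illustration), registering a custom object globally lets you refer to it by string exactly like the built-ins:

```python
import tensorflow as tf

# A custom activation function we want to refer to by name.
def clipped_relu(x):
    return tf.clip_by_value(x, 0.0, 1.0)

# Register it globally under the string 'clipped_relu'.
tf.keras.utils.get_custom_objects()["clipped_relu"] = clipped_relu

# The string now works anywhere Keras resolves objects by name,
# including layer arguments and model deserialization.
layer = tf.keras.layers.Dense(4, activation="clipped_relu")
out = layer(tf.ones((2, 3)))
print(out.shape)  # (2, 4)
```

The same resolution happens when a saved model is loaded: the serialized config only stores the string, so the name must be registered (globally, or via custom_object_scope) for loading to succeed.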