How to increase the rule limit in jFuzzyLogic? - fuzzy-logic

I am searching for a way around a limitation of the Fuzzy Control Language. According to the documentation there is a limit of 10 rules per rule block and 15 rules overall. I would like to raise the overall limit to, for example, 30 rules.
When I use jFuzzyLogic with more than 15 rules, the 16th rule is ignored.
Does anyone have experience with this?

I am not sure about that limitation in jFuzzyLogic or the Fuzzy Control Language (FCL), but you could try jfuzzylite, which is also free and open source. It can import and export controllers from and to FCL without any limit on the number of rules, and it is available at http://www.fuzzylite.com. In addition, you can use QtFuzzyLite to create your controllers easily.
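One quick way to see whether the limit lives in the FCL specification or in a particular engine is to generate a rule block with more than 15 rules and feed it to the engine under test. A sketch of such a generator (the variable and term names are placeholders, not from the question):

```java
// Generates an FCL RULEBLOCK with an arbitrary number of rules, so the
// resulting text can be loaded into jFuzzyLogic or jfuzzylite to check
// whether rules past the documented limit are actually parsed.
public class FclRuleGenerator {
    public static String ruleBlock(int ruleCount) {
        StringBuilder sb = new StringBuilder();
        sb.append("RULEBLOCK No1\n");
        sb.append("  AND : MIN;\n");
        sb.append("  ACT : MIN;\n");
        sb.append("  ACCU : MAX;\n");
        for (int i = 1; i <= ruleCount; i++) {
            // "service", "tip" and their terms are placeholder names
            sb.append("  RULE ").append(i)
              .append(" : IF service IS level").append(i)
              .append(" THEN tip IS amount").append(i).append(";\n");
        }
        sb.append("END_RULEBLOCK\n");
        return sb.toString();
    }

    public static void main(String[] args) {
        // Emit a 30-rule block, twice the reported overall limit
        System.out.println(ruleBlock(30));
    }
}
```

If the engine silently drops rules past a threshold, comparing the generated count with the number of rules it reports after loading makes the limit obvious.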

Related

design pattern for unlimited number of conditions

I wonder what design pattern I can use to replace an unlimited number of conditions.
A more concrete question: how would you build a tool like linPEAS.sh in an OOP way that lets many users add their own logic? (I bring up this tool only as an example because it has very many if conditions, and their number grows over time.)
https://github.com/carlospolop/privilege-escalation-awesome-scripts-suite/tree/master/linPEAS
Looking at linPEAS.sh, it's basically just a sequence of if statements. If you wanted the same functionality, but made dynamic so that additional checks and operations could be added using a design pattern, Chain of Responsibility seems well suited.
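A minimal sketch of that idea in Java (a pipeline variant of Chain of Responsibility; the check names and logic are invented for illustration). Each check is one self-contained link, so new checks can be added without touching existing ones:

```java
import java.util.ArrayList;
import java.util.List;

// A chain of independent checks; each link inspects the context and
// either reports a finding or stays silent.
public class CheckChain {
    public interface Check {
        // Returns a finding, or null if nothing to report.
        String run(String context);
    }

    private final List<Check> checks = new ArrayList<>();

    public CheckChain add(Check check) {
        checks.add(check);
        return this; // fluent style so users can register their own logic
    }

    public List<String> runAll(String context) {
        List<String> findings = new ArrayList<>();
        for (Check check : checks) {
            String result = check.run(context);
            if (result != null) findings.add(result);
        }
        return findings;
    }

    public static void main(String[] args) {
        CheckChain chain = new CheckChain()
            .add(ctx -> ctx.contains("root") ? "running as root" : null)
            .add(ctx -> ctx.contains("suid") ? "suid binary present" : null);
        System.out.println(chain.runAll("uid=0(root) suid"));
    }
}
```

Third parties extend the tool by contributing new `Check` implementations; the core never needs to change.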
In my opinion, this case is best suited to a rule engine like Drools.
https://www.drools.org/

Alter speed/feed by tool number

I need to use some new drills using unmodified original .MIN CNC programs for Okuma Thinc controller, MU6300V. I'm looking to use the Okuma API to detect when tool group 4 is loaded into the spindle and then alter the speed/feed when it drills. I am familiar with the API and .NET. Looking for some general guidance on objects/methods and approach.
If this is too difficult then I would settle for just modifying the feed rate when a G81 drill cycle is called for a tool in group 4.
The first part of your request is pretty straightforward.
// Current tool number
int currentToolNumber = Okuma.CMDATAPI.DataAPI.CTools.GetCurrentToolNumber();
// Group number of the current tool
int groupNumber = Okuma.CMDATAPI.DataAPI.CTools.GetGroupNo(currentToolNumber);
Altering the drill feed/speed will be more troublesome, however. You cannot set feed/speed overrides using the API, at least not without some additional hardware and special options. Other people have done it, though. Have you ever seen Caron Engineering's Tool Monitoring Adaptive Control (TMAC)? I think that is essentially what you're asking for.
https://www.caroneng.com/products/tmac
The only other option you have is altering your part program to look for common variable values to set spindle speed and/or feed rate.
For Example
Use one variable to determine if fixed or variable value should be used, and another for the variable value
That way, on a machine that has your old drills and no THINC Application altering common variables, the fixed values are used. But, on a machine that has the application, it can look at the tool number or group and set a common variable that determines specific speed/feed values. Then those new values are used before starting the spindle and moving into the cut.
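A sketch of that two-variable scheme in the part program (purely illustrative: the common-variable numbers VC100-VC102, the sequence names, and the values are invented, and the exact OSP syntax should be checked against the machine's programming manual):

```
(VC100 = 0: use fixed values; VC100 = 1: use app-supplied values)
IF [VC100 EQ 1] NVAR
S2000 F250            (fixed speed/feed for the old drills)
GOTO NCUT
NVAR S=VC101 F=VC102  (speed/feed written by the THINC application)
NCUT G81 X0 Y0 Z-10. R2.
```

On a machine without the THINC application, VC100 stays 0 and the hard-coded values run; on a machine with it, the application writes VC100-VC102 before the cycle starts.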
The choices available for changing feed/speed after the machine has entered a cut or commanded the spindle to run are:
Human operator at the control panel
TMAC

Trigger action realtime based on keyword in Logs

I have a requirement to trigger an action (such as calling a RESTful service) whenever a keyword is found in the logs. The trigger would have to be fairly real-time. I have been evaluating open-source solutions like Graylog2, the ELK stack (which I believe cannot analyze in real time), fluentd, etc., but wanted to know your opinion on them. It would be great if the tool also allowed setting up rules against keywords to eliminate false positives, and were easy to set up.
I hope this makes sense and apologies if this has been discussed elsewhere!
You can try Massalyzer. It is a real-time analyzer too, very fast (up to 10 million lines per second), and you can analyze input of unlimited size with the free demo version.
So, I tried Logstash+Graylog2 combination for the scenario I described in the question and it works quite well. I had to tweak a few things to make Logstash work with Graylog2, especially around capturing the right log levels. I will try this out on a highly loaded clustered environment and update my findings here.
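Whichever tool handles collection and transport, the trigger core is just a regex scan over incoming lines. A minimal, library-free sketch (the keyword pattern and the action are placeholders; in practice the action would be an HTTP call to the RESTful service):

```java
import java.util.function.Consumer;
import java.util.regex.Pattern;

// Scans incoming log lines and fires an action whenever the pattern matches.
public class KeywordTrigger {
    private final Pattern pattern;
    private final Consumer<String> action;

    public KeywordTrigger(String regex, Consumer<String> action) {
        this.pattern = Pattern.compile(regex);
        this.action = action;
    }

    // Feed each new log line here (e.g. from a file tailer or syslog input).
    // Returns true when the line matched and the action fired.
    public boolean onLine(String line) {
        if (pattern.matcher(line).find()) {
            action.accept(line); // e.g. POST the line to a REST endpoint
            return true;
        }
        return false;
    }

    public static void main(String[] args) {
        KeywordTrigger trigger = new KeywordTrigger("ERROR|FATAL",
                line -> System.out.println("would call REST service for: " + line));
        trigger.onLine("2015-01-01 ERROR something broke");
    }
}
```

Anchoring the regex with word boundaries or additional context (logger name, level) is how the "rules against keywords" for false-positive elimination would be expressed.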

Elasticsearch _boost deprecated. Any alternative?

So basically _boost, the mapping option that gave a field a certain boost at index time, is now deprecated.
The page suggests using "function score instead of boost". But function score means:
The function_score allows you to modify the score of documents that are retrieved by a query
So it's not even an alternative: function score only modifies the score of documents at query time.
How do I alter the relevance of a field at mapping time? Is that option no longer valid, removed with no replacement?
The option is no longer valid and there is no direct replacement. The problem is that index-time boosting was removed from Lucene 4.0, upon which Elasticsearch runs. Elasticsearch then used its own implementation, which had its own issues. A good write-up on the issues can be found here: http://blog.brusic.com/2014/02/document-boosting-in-elasticsearch.html and the issue deprecating boost at index time here: https://github.com/elasticsearch/elasticsearch/issues/4664
To summarize, it basically was not working in a transparent and understandable way - you could boost one document by 100 and another by 50, hit the same keyword and yet get the same score. So the decision was made to remove it and rely on function score queries, which have the benefit of being much more transparent and predictable in their impact on scoring.
If you feel strongly that function score queries do not meet your needs and use case, I'd open an issue in github and explain your case.
Function score query can be used to boost the whole document. If you want to use field boost, you can do so with a multi match query or a term query.
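For example, a multi match query applies a per-field boost at query time with the caret syntax (the field names and boost factor below are made up for illustration):

```json
{
  "query": {
    "multi_match": {
      "query": "google",
      "fields": ["title^3", "description"]
    }
  }
}
```

Matches in `title` count three times as much toward the score as matches in `description`, which covers the common reason for wanting a field-level boost in the mapping.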
I don't know about your case, but I assume you have strong reasons to boost documents at index time. It is generally recommended to boost at query time, as index-time boosting requires reindexing the data whenever your boost criteria change. That said, in my application we have implemented both index- and query-time boosting. We are using:
Index-time boosting (document boosting), to boost some documents which we know will always be a top hit for our search. E.g. searching for the word "google" should always put a document containing "google.com" as the top hit. We achieve this using a custom boost field and a custom boosting script. Please see this link: http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/modules-scripting.html
Query-time boosting (per-field boosting): we use the ES Java APIs to execute our queries and apply field-level boosting at query time to each field, as it is highly flexible and allows us to change the field-level boosting without reindexing the whole data set.
You can have a look at this, it might be helpful for you: http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/query-dsl-function-score-query.html#_field_value_factor
I have described my complete use case here, hopefully you will find it helpful.
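A function_score query with field_value_factor along those lines might look like the following (the `popularity` field and the parameter values are illustrative, not from the answer above):

```json
{
  "query": {
    "function_score": {
      "query": { "match": { "title": "google" } },
      "field_value_factor": {
        "field": "popularity",
        "factor": 1.2,
        "modifier": "log1p",
        "missing": 1
      }
    }
  }
}
```

The numeric field scales the query score per document, which reproduces the effect of a document-level index-time boost while remaining adjustable without reindexing.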

Conceptual / Rule Implementation / String Manipulation

So I'm working on software in VB.Net where I need to scrape information and process it according to rules. For example, simple string-replace rules like turning "Det" into "Detached" for a specific field, or split/join rules: basic string operations. All my scraping rules are regexes, and I store them in a database as rows of rule sets for different situations.
What is the best way behind creating/storing rules to manipulate text? I do not want to hardcode the rules into the software, but rather be able to add more as there will be a need for them. I want to store them in a database, but then how do I interpret them? I'm assuming I would have to create a whole system to interpret them, like a rules engine? Maybe you can give me a different outlook on this problem.
I've written rules engines before. They are usually a Bad Idea (tm).
I would consider writing the rules in your application code. Leave the database and rules engine out of it. First, a rules engine often obscures intent; it's hard to see exactly what is going on when you come back in a couple of months for a maintenance patch. Second, VB (or C# or any other language you choose) contains a more appropriate vocabulary for defining rules than anything you will likely have time to implement. Trust me, XML is a poor representation of rules. Lastly, non-programmers won't be able to write regex anyway, so you aren't gaining anything for all your added complexity.
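For instance, the rules can live in code as an ordered list of pattern/replacement pairs. This sketch is in Java rather than VB.Net, and the second sample rule is invented; only the "Det" to "Detached" rule comes from the question:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Pattern;

// Replacement rules kept as ordinary code: ordered, reviewable in source
// control, and extended by adding one line rather than a database row.
public class FieldRules {
    private static final Map<Pattern, String> RULES = new LinkedHashMap<>();
    static {
        RULES.put(Pattern.compile("\\bDet\\b"), "Detached");
        RULES.put(Pattern.compile("\\bSemi\\b"), "Semi-Detached"); // invented example
    }

    public static String apply(String input) {
        String result = input;
        // LinkedHashMap preserves insertion order, so rules run in sequence
        for (Map.Entry<Pattern, String> rule : RULES.entrySet()) {
            result = rule.getKey().matcher(result).replaceAll(rule.getValue());
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(apply("Det house")); // prints "Detached house"
    }
}
```

Adding a rule is a one-line change reviewed like any other code, which is the trade-off being argued for: less flexibility at runtime, much more clarity at maintenance time.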
You can mitigate most of the deployment headaches by using ClickOnce deployment.
Hope that helps.