While I am comfortable with optimization problems in Python (and R, for that matter), I am curious to know whether this can be done in WEKA.
I have X,Y coordinates of several routes and I need to optimize the best overall solution.
Any help would be appreciated...
Weka is a machine learning package, not an optimization package. While you could try to predict the optimal solution using the machine learning algorithms in Weka, there's nothing to check that a prediction is the optimal solution, or even a solution at all.
Definitely sounds to me like you're trying to use the wrong tool for the job.
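If the goal is something like finding a short tour through the waypoints, plain Python (which you already know) is a much more natural fit than WEKA. The nearest-neighbour sketch below is purely illustrative, with made-up coordinates:

    import math

    # Hypothetical waypoints (x, y); replace with the real coordinates.
    points = [(0, 0), (3, 4), (6, 1), (2, 7), (5, 5)]

    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    def nearest_neighbour_tour(pts):
        """Greedy tour construction: always visit the closest unvisited point."""
        unvisited = list(range(1, len(pts)))
        tour = [0]
        while unvisited:
            last = pts[tour[-1]]
            nxt = min(unvisited, key=lambda i: dist(last, pts[i]))
            tour.append(nxt)
            unvisited.remove(nxt)
        return tour

    tour = nearest_neighbour_tour(points)
    length = sum(dist(points[tour[i]], points[tour[i + 1]])
                 for i in range(len(tour) - 1))
    print(tour, round(length, 2))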
I have to optimize the result of a process that depends on a large number of variables: a laser engraving system where the engraving depth depends on the laser speed, distance, power and so on.
The final objective is the minimization of the engraving time, or the maximization of the laser speed. All the other parameters can vary, but must stay within safe bounds.
I have never used any machine learning tools, but to my very limited knowledge this seems like a good use case for TensorFlow or any other machine learning library.
I would experimentally gather data points to train the algorithm, test it and then use a gradient descent optimizer to find the parameters (within bounds) that maximize the laser travel velocity.
Does this sound feasible? How would you approach such a problem? Can you link to any examples available online?
Thank you,
Riccardo
I'm not quite sure I understood the problem correctly; could you add some example data and the desired output?
As far as I understood, it could be feasible to use TensorFlow, but I believe there are better solutions to this problem. Let me expand on this.
TensorFlow is a framework focused on the development of deep learning models. These usually require lots of data (the amount really depends on the problem), and I don't believe that gathering this data manually yourself would be enough unless your team is quite big or you already have some data collected.
Also, since you have a minimization (or maximization) problem with variables that lie within known ranges, I think this is a case for Operations Research optimization rather than Machine Learning. Check this example of OR.
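To make the OR angle concrete, here is a minimal sketch with SciPy's SLSQP: maximize the laser speed subject to safe bounds and a required engraving depth. The depth model, bounds and target below are placeholders; in practice you would first fit the model to your measured data.

    from scipy.optimize import minimize

    # Placeholder depth model: in practice, fit this to measured data.
    def engraving_depth(x):
        speed, power, distance = x
        return 0.8 * power / (speed * distance)   # purely illustrative

    target_depth = 0.5            # assumed requirement (mm)
    objective = lambda x: -x[0]   # maximize speed == minimize -speed

    constraints = [{"type": "ineq",
                    "fun": lambda x: engraving_depth(x) - target_depth}]
    bounds = [(10.0, 500.0),   # speed (mm/s), assumed safe range
              (1.0, 40.0),     # power (W), assumed safe range
              (0.1, 5.0)]      # distance (mm), assumed safe range

    res = minimize(objective, x0=[100.0, 20.0, 1.0], bounds=bounds,
                   constraints=constraints, method="SLSQP")
    print(res.x, -res.fun)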
I have a task: to determine the sound source location.
I have some experience working with TensorFlow, creating predictions on some simple features and datasets. I assume that for this task it would be necessary to analyze the sound frequencies, and probably other related data, during the training and prediction steps. The sound comes from a headset, so the human ear is able to detect the direction.
1) Has somebody already done this? (Unfortunately I couldn't find any similar project.)
2) What kinds of caveats might I run into while trying to achieve this?
3) Am I able to do this with this technology approach? Are there any other sound processing frameworks / technologies / open source projects that could help me?
I am asking here since my research on Google, GitHub and Stack Overflow didn't turn up any relevant results on this specific topic, so any help is highly appreciated!
This is typically done with more traditional DSP using multiple sensors. You might want to look into time difference of arrival (TDOA) and direction of arrival (DOA). Algorithms such as GCC-PHAT and MUSIC will be helpful.
Issues that you might encounter: DOA accuracy is a function of the direct-to-reverberant ratio of the source, i.e. the more reverberant the environment, the harder it is to determine the source location.
Also, you might want to consider the number of location dimensions you want to resolve. A point in 3D space is much more difficult than a direction relative to the sensors.
Using ML as an approach to this is not entirely without merit, but you will have to consider what it is you would be learning, i.e. you probably don't want to learn the test room's reverberant properties but rather the sensors' spatial properties.
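To make the GCC-PHAT part concrete, here is a minimal numpy sketch that estimates the time delay between two already-synchronized microphone signals; the sample rate and signals are placeholders. Given the microphone spacing, that delay can then be converted into a direction.

    import numpy as np

    def gcc_phat(sig, ref, fs):
        """Estimate the delay (seconds) of `sig` relative to `ref` via GCC-PHAT."""
        n = len(sig) + len(ref)
        # Cross-power spectrum of the two microphone signals
        R = np.fft.rfft(sig, n=n) * np.conj(np.fft.rfft(ref, n=n))
        # PHAT weighting: discard magnitude, keep only phase information
        R /= np.abs(R) + 1e-15
        cc = np.fft.irfft(R, n=n)
        max_shift = n // 2
        # Re-centre the cross-correlation so negative lags come first
        cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
        shift = np.argmax(np.abs(cc)) - max_shift
        return shift / float(fs)

    # Usage with a synthetic 5-sample delay at 16 kHz (placeholder values)
    fs = 16000
    x = np.random.randn(fs)
    y = np.roll(x, 5)            # delayed copy of x
    print(gcc_phat(y, x, fs))    # ~ 5 / 16000 seconds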
I am trying to solve some problems that can be mapped to a convex optimisation problem.
In particular, this is for the analysis of quantum state tomography data.
In Matlab there are some tools to help you do this, like SeDuMi or CVX
http://sedumi.ie.lehigh.edu
http://cvxr.com/cvx/
But I could not find anything similar in Mathematica, on the web or in the forums.
Does anybody know if there is an easy way of implementing this kind of algorithm in Mathematica?
I would like to avoid being forced to switch to Matlab to solve this problem. Nothing against it, but I have most of the programming for this state tomography already developed in Mathematica.
Thank you very much.
I also had some trouble with Mathematica in optimization, specifically on convex problems. I suggest you export to CVX, which will require some work because it wants the problem in matrix notation. Otherwise, to stay with the algebraic formulation, you could try Maple, which has, as far as I can tell, better optimizers than Mathematica (check the docs to get an idea).
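For what it's worth, the matrix/operator formulation that CVX expects looks roughly like the sketch below, written here with cvxpy (CVX's Python sibling) purely to illustrate the structure of a tomography-style semidefinite program; the dimension, measurement operators and data are placeholders.

    import numpy as np
    import cvxpy as cp

    d = 2  # single-qubit example; the dimension is an assumption
    # Placeholder measurement operators (Pauli matrices) and made-up averages
    ops = [np.array([[0, 1], [1, 0]]),       # X
           np.array([[0, -1j], [1j, 0]]),    # Y
           np.array([[1, 0], [0, -1]])]      # Z
    data = [0.1, -0.2, 0.7]

    rho = cp.Variable((d, d), hermitian=True)
    residuals = [cp.real(cp.trace(op @ rho)) - y for op, y in zip(ops, data)]
    objective = cp.Minimize(cp.sum_squares(cp.hstack(residuals)))
    constraints = [rho >> 0, cp.trace(rho) == 1]   # physical density matrix
    cp.Problem(objective, constraints).solve()
    print(rho.value)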
I have a large MIP problem, and I use GLPSOL in GLPK to solve it. However, solving the LP relaxation takes many iterations, and in each iteration the obj and infeas values stay the same. I think it has found the optimal solution, but it won't stop and has kept running for many hours. Will this happen for every large-scale MIP/LP problem? How can I deal with such cases? Can anyone give me any suggestions? Thanks!
The problem of solving MIPs is NP-complete in general, which means there are instances that can't be solved efficiently. But our problems often have enough structure that heuristics can help solve these models. This has allowed huge gains in solving capabilities over the last decades (overview).
To understand the basic approach and pin down what exactly the problem is in your case (no progress in the upper bound, no progress in the lower bound, ...), read Practical Guidelines for Solving Difficult Mixed Integer Linear Programs.
Keep in mind that there are, in general, huge gaps between commercial solvers like Gurobi / CPLEX and non-commercial ones (especially in MIP solving). There is a huge collection of benchmarks here.
There are also a lot of parameters to tune. Gurobi, for example, has different parameter templates: one targets finding feasible solutions quickly; another targets proving the bounds.
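As a minimal illustration (assuming access to Gurobi via gurobipy; the model file name and parameter values are placeholders), switching the emphasis and capping the runtime looks like this:

    import gurobipy as gp

    m = gp.read("model.lp")       # placeholder model file
    m.Params.MIPFocus = 1         # 1 = emphasize finding feasible solutions quickly
    m.Params.TimeLimit = 3600     # stop after an hour and keep the best incumbent
    m.Params.MIPGap = 0.01        # accept a solution within 1% of the best bound
    m.optimize()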
My personal opinion: compared to cbc (open-source) & scip (open-source but non-free for commercial usage), glpk is quite bad.
I have a simple optimization problem and am looking for Java software for it.
The Apache math optimization software looks just like what I want, but I can't find documentation to suit my needs (where those needs are to be useful to a beginner / non-maths professional!).
Does anyone know of a worked, simple, example?
In case it helps, the problem is that I want to find the max r where
r1 = s1 * m1
r2 = s2 * m2
and there are some constraints and formulas defining the relationships between the variables. The Excel Solver works fine for this problem. I got LPSolve working great, but this problem requires a multiplication of s and m, so I understand LPSolve can't help, as this makes the problem non-linear.
I recently ported the derivative-free non-linear constrained optimization code COBYLA2 to Java. Since it does not explicitly rely on derivatives, the algorithm may require quite a few iterations for larger problems. Nonetheless, you are able to formulate your problem with both a non-linear objective function and (potentially) non-linear constraints.
You can read more about it and download the source code from here.
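Just to make the shape of the problem concrete (a derivative-free objective plus inequality constraints), here is a minimal sketch using SciPy's COBYLA; how r1 and r2 are combined and the constraints themselves are placeholders. The Java port above takes the same ingredients.

    from scipy.optimize import minimize

    # Objective: maximize r = s1*m1 + s2*m2  ==  minimize the negative.
    # Combining r1 and r2 as a sum is an assumption.
    def neg_r(x):
        s1, m1, s2, m2 = x
        return -(s1 * m1 + s2 * m2)

    # COBYLA accepts inequality constraints of the form g(x) >= 0,
    # so simple bounds are written the same way. All values are made up.
    constraints = [
        {"type": "ineq", "fun": lambda x: 10.0 - (x[0] + x[2])},  # s1 + s2 <= 10
        {"type": "ineq", "fun": lambda x: x[1]},                  # m1 >= 0
        {"type": "ineq", "fun": lambda x: 1.0 - x[1]},            # m1 <= 1
        {"type": "ineq", "fun": lambda x: x[3]},                  # m2 >= 0
        {"type": "ineq", "fun": lambda x: 1.0 - x[3]},            # m2 <= 1
    ]

    res = minimize(neg_r, x0=[1.0, 0.5, 1.0, 0.5], method="COBYLA",
                   constraints=constraints)
    print(res.x, -res.fun)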
I am not aware of a simple Java-based NLP solver. (I did find an example of quadratic programming (QP) in Apache Commons Math, but it doesn't qualify since you asked for a non-maths-professional example.)
I have two suggestions for you to solve your non-linear program:
1. Excel's Solver does have the ability to tackle non-linear problems. (Don't use LPSOLVE.) In fact, NLP is the default mode in Solver.
Here are two links to using Excel to solve NLPs: Example 1 - a step-by-step Solver walk-through that covers NLP, and Example 2 - a general neural network example in Excel.
Also for Excel, I like Paul Jensen's (utexas) ORMM add-ins.
He has a module called Teach NLP. Chapter 10 of his book deals with NLP and is available from his site.
2. If you are going to be doing even some amount of data analysis, then I recommend investing a few hours to download and learn the basics of R.
R has numerous packages and libraries for optimization. optim() and nlme are relevant for solving non-linear programs.
Just for completeness, I mention SAS, MATLAB and CPLEX as other options. If you have access to any of these, they all do a very good job with solving non-linear programs.
Hope these pointers help.