Finite automata, pushdown automata and Turing machine examples

I'm looking for a good source of finite automata, pushdown automata and Turing machine exercises (for solving manually, by hand).
I've been searching around but didn't find anything special, so I'm wondering if someone has some good examples. Thanks in advance.

Your best bet might be to get a book on the subject, such as Introduction to the Theory of Computation, Third Edition by Michael Sipser, and then work through the exercises.
For a collection of problem sets on automata, along with solutions, check out Stanford's introductory course in the theory of computation. Problem Sets 5, 6, and 7 directly talk about automata (finite, pushdown, and Turing machines), along with equivalent representations (regular expressions and context-free grammars).
Hope this helps!

Related

FSM framework required

Please recommend a framework for finite state machine creation and simulation. I am aware of the Stateflow package in Matlab, but are there any other good choices? It doesn't have to be Matlab; frameworks in Java, R or Python are also fine.
What I am basically trying to do is to evolve automata for a binary sequence prediction problem, as shown in this article.
Thanks.
Consider Ragel. It has a manual and a good amount of examples; I find the documentation superior to that of AT&T Research's FSM Tools (which consisted of a couple manpages and sparse examples).

What is the application of automata?

In other words, why should I learn about it? When am I going to say... oh, I need to know about pushdown automata or Turing machines for this.
I am not able to see the applications of the material.
Thanks
You should learn about automata theory because it will help you understand what is computationally possible in a given system. People who understand the difference between a pushdown automaton and a universal Turing machine understand why trying to parse HTML with regular expressions is a bad idea. People who don't understand the difference think it's just fine to try to parse HTML with REs.
There are problems that are a nice fit for this kind of solution, some of which are:
parsers
simulations of stateful systems
event-driven problems
There are probably many others. If you start writing code that has some ad-hoc state variable, on which various functions depend to decide what they may do, you can probably benefit from a proper FSA.
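As a rough illustration of that last point (the states, events and class name here are made up for the example, not taken from any particular library), replacing an ad-hoc state flag with an explicit transition table might look like this in Python:

    # Minimal illustrative FSM: a connection that moves between named states.
    # The transition table makes every allowed (state, event) pair explicit.
    TRANSITIONS = {
        ("disconnected", "connect"): "connecting",
        ("connecting", "success"): "connected",
        ("connecting", "failure"): "disconnected",
        ("connected", "drop"): "disconnected",
    }

    class Connection:
        def __init__(self):
            self.state = "disconnected"

        def handle(self, event):
            # Unknown (state, event) pairs are rejected explicitly instead of
            # silently mutating an ad-hoc flag somewhere in the code.
            try:
                self.state = TRANSITIONS[(self.state, event)]
            except KeyError:
                raise ValueError(f"event {event!r} not allowed in state {self.state!r}")

    conn = Connection()
    conn.handle("connect")
    conn.handle("success")
    print(conn.state)  # -> connected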
First off, it's my position that there are things worth learning not because they're immediately useful, but because they are inherently valuable. A great failing of modern education is that it does nothing to convince students of this when they're still impressionable.
That being said, automata theory is both inherently valuable and incredibly useful. Parsing text, compiling programs, and the capabilities of computing devices can only really be understood using the kinds of things automata theory gives us... and getting the most out of computational systems requires deep understanding. Automata theory allows us to answer some of the most fundamental questions we can ask about computation: what resources do we need to do computation? With given resources, what can we solve? Are there problems which can't be solved no matter how many resources we possess? Not to mention that complexity theory, which deals with the efficiency of computations, requires automata theory in order to be meaningfully defined.
Learning about automata (which are nothing but machines) gives an idea of the limits of computation. When an automaton does not accept a string, it means the machine cannot take that string as input. State diagrams give the possible outcomes for an input, which helps us build parsers/machines.
A good example is checking the format of an email address. Software will not accept an email address typed into a form if the format is not valid; the software accepts email addresses only in a specific format. We are able to build such software by first working the problem out theoretically with automata and state machines.
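A toy sketch of that idea in Python (the pattern is deliberately simplified, letters with a single '@' and a single '.', and is not a real email validator):

    # Toy DFA accepting strings shaped like "name@host.tld" (letters only, one dot).
    # States: 0 start, 1 local part, 2 after '@', 3 host, 4 after '.', 5 TLD (accepting), -1 reject.
    def step(state, ch):
        if ch.isalpha():
            return {0: 1, 1: 1, 2: 3, 3: 3, 4: 5, 5: 5}.get(state, -1)
        if ch == "@":
            return 2 if state == 1 else -1
        if ch == ".":
            return 4 if state == 3 else -1
        return -1

    def accepts(s):
        state = 0
        for ch in s:
            state = step(state, ch)
            if state == -1:
                return False
        return state == 5

    print(accepts("alice@example.com"))  # True
    print(accepts("alice@examplecom"))   # False (no '.')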

What's the best language for physics modeling?

I've been out of the modeling biz, so to speak, for a while now. When I was in college, most of the models I worked with were written in FORTRAN, which I never liked. I'm looking to get back into science, so I'm wondering if there are modern languages with feature sets suited for this kind of application. What would you consider to be an optimal language for simulating complex physics systems?
While Fortran was certainly the absolute ruler for this, Python is being used more and more for exactly this purpose. While it is very hard to say which is the BEST language for this, I've found Python pretty useful for physics simulations and physics education.
It depends on the task
C++ is good at complicated data structures, but it is bad at slicing and multiplying matrices. (This requires you to spend a lot of time writing for loops.)
FORTRAN has nice notation for slicing and multiplying matrices, but it is clumsy for creating complicated data structures such as graphs and linked lists.
Python/SciPy has nice notation for everything, but Python is an interpreted language, so it is slow at certain tasks (a short NumPy sketch of the slicing point follows this answer).
Some people are interested in languages like CUDA that allow you to use your GPU to speed up your simulations.
In the molecular dynamics community c++ seems to be popular, because you need somewhat complicated data structures to represent the shapes of the molecules.
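To make the slicing/matrix point above concrete, here is roughly what the NumPy notation looks like compared with writing the loops yourself (a small sketch, not a benchmark):

    import numpy as np

    a = np.arange(12.0).reshape(3, 4)   # 3x4 matrix
    b = np.ones((4, 2))                 # 4x2 matrix

    # Slicing and matrix multiplication in one line each:
    first_two_cols = a[:, :2]           # take the first two columns
    product = a @ b                     # 3x2 matrix product

    # The same product as explicit loops (what C or older Fortran code ends up doing):
    manual = [[sum(a[i, k] * b[k, j] for k in range(4)) for j in range(2)]
              for i in range(3)]
    print(np.allclose(product, manual))  # True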
I think it's arguable that FORTRAN is still dominant when it comes to solving large-scale problems in physics, as long as we're talking about serial calculations.
I know that parallelization is changing the game. I'm less certain about whether or not parallelized versions of LINPACK and other linear algebra packages are still written in FORTRAN.
A lot of engineers are using MATLAB and Mathematica these days, because they combine numerical and graphics capabilities.
I'd also point out that there's a difference between calculation and display engines. The former might still be written in FORTRAN, but the latter may be using more modern languages and OpenGL.
I'm also unsure about how much modeling has crept into biology. Physical chemistry might be a very different animal altogether.
If you write a terrific parallel linear algebra package in Scala or F# or Haskell that performs well, the world will beat a path to your door.
Python + Matplotlib + NumPy + α
The nuclear/particle/high energy physics community has moved heavily toward c++ (in part due to ROOT and Geant4), with some interest in Python (because it has ROOT bindings).
But you'll note that this is sub-discipline dependent..."physics" and "modeling" are big, broad topics, so there is no one answer.
Modelica is a specialized language for modeling (and simulating) physical systems. OpenModelica is an open source implementation of Modelica.
Python is very popular among science-oriented people, as is Matlab. The issue with these is that they are both VERY slow (to run). If you want to do large simulations that may take minutes/hours/days, you're going to have to pick another language.
As long as you are picking a language for speed, suck it up and use C/C++, maybe with CUDA depending on your needs.
A final thought, though: if it takes you two days longer to write and debug your model in C than in Python, and the resulting code takes 10 minutes to run instead of an hour, have you really saved any time?
There's also a lot of capability with MATLAB. Especially when interfacing your simulations with hardware, or if you need your results visualised.
I'll chime in with Python but you should also look to R for any statistical work you may need to do. You should really be asking more about what packages for which languages to use rather than the language itself.

Floorplan and packaging architecture resources for the interested software professional?

One of the more interesting things I've run into lately is the art and science of laying out chip floorplan and determining packaging for the silicon. I would like to read some materials on the subject for the "Interested Software Guy".
Does anyone have any recommendations (Website or book, so long as it is a good quality)?
This is the result of my search on the subject, as I was curious about your question and this is where I would start myself. Sorry, I am not a specialist on the subject, but I hope it can kick-start you!
It seems floorplan optimization is a matter of combinatorial optimization.
As a developer, you'll want to tackle the theory behind it and most likely some proven algorithms. You might then be interested by books such as:
Computational Geometry: Algorithms and Applications by Mark de Berg et al.
Integer and Combinatorial Optimization by Laurence A. Wolsey et al.
Algorithms for VLSI Design Automation by Sabih H. Gerez
Handbook of Algorithms for Physical Design Automation by Charles J. Alpert et al.
Evolutionary Algorithms for VLSI CAD by Rolf Drechsler
It's a bit more difficult to get links on this subject, but if you're a member of IEEE Xplore, you might want to look at this paper and other similar ones.
Finally, on the floorplan wikipedia entry, you'll notice this on sliceable floorplans that might give you your best starting point:
Sliceable floorplans have been used in a number of early EDA tools for a number of reasons. Sliceable floorplans may be conveniently represented by binary trees, which correspond to the order of slicing. What is more important, a number of NP-hard problems with floorplans have polynomial time algorithms when restricted to sliceable floorplans.
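To give a feel for that binary-tree representation (a toy sketch, not a real EDA algorithm), the leaves can be blocks with fixed dimensions and the internal nodes horizontal or vertical cuts; the bounding box of the whole plan then falls out of a simple recursion:

    # Toy slicing-floorplan tree: leaves are (width, height) blocks,
    # internal nodes combine two sub-floorplans with an 'H' or 'V' cut.
    class Leaf:
        def __init__(self, w, h):
            self.w, self.h = w, h
        def size(self):
            return (self.w, self.h)

    class Cut:
        def __init__(self, kind, left, right):  # kind is 'H' (stacked) or 'V' (side by side)
            self.kind, self.left, self.right = kind, left, right
        def size(self):
            lw, lh = self.left.size()
            rw, rh = self.right.size()
            if self.kind == 'H':                 # stacked vertically
                return (max(lw, rw), lh + rh)
            return (lw + rw, max(lh, rh))        # placed side by side

    plan = Cut('V', Leaf(2, 3), Cut('H', Leaf(2, 1), Leaf(1, 2)))
    print(plan.size())  # bounding box of the whole floorplan -> (4, 3)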
Good luck!
There seems to be a class on this at Carnegie Mellon
VLSI CAD
some of the lecture notes that looked more interesting than others:
All Notes
layout
"Floorplanning"
maze search
That site makes references to one or more books that this fellow published:
Majid Sarrafzadeh
Also I found a short tutorial here
AutoCAD is a classic... Though on my limited budget I prefer Rhino 2.0.

What are the typical use cases of Genetic Programming?

Today I read this blog entry by Roger Alsing about how to paint a replica of the Mona Lisa using only 50 semi transparent polygons.
I'm fascinated with the results for that particular case, so I was wondering (and this is my question): how does genetic programming work and what other problems could be solved by genetic programming?
There is some debate as to whether Roger's Mona Lisa program is Genetic Programming at all. It seems to be closer to a (1 + 1) Evolution Strategy. Both techniques are examples of the broader field of Evolutionary Computation, which also includes Genetic Algorithms.
Genetic Programming (GP) is the process of evolving computer programs (usually in the form of trees - often Lisp programs). If you are asking specifically about GP, John Koza is widely regarded as the leading expert. His website includes lots of links to more information. GP is typically very computationally intensive (for non-trivial problems it often involves a large grid of machines).
If you are asking more generally, evolutionary algorithms (EAs) are typically used to provide good approximate solutions to problems that cannot be solved easily using other techniques (such as NP-hard problems). Many optimisation problems fall into this category. It may be too computationally-intensive to find an exact solution but sometimes a near-optimal solution is sufficient. In these situations evolutionary techniques can be effective. Due to their random nature, evolutionary algorithms are never guaranteed to find an optimal solution for any problem, but they will often find a good solution if one exists.
Evolutionary algorithms can also be used to tackle problems that humans don't really know how to solve. An EA, free of any human preconceptions or biases, can generate surprising solutions that are comparable to, or better than, the best human-generated efforts. It is merely necessary that we can recognise a good solution if it were presented to us, even if we don't know how to create a good solution. In other words, we need to be able to formulate an effective fitness function.
Some Examples
Travelling Salesman
Sudoku
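For the travelling salesman example above, the fitness function can be as simple as the total tour length (shorter is better, so a GA would minimise it). A rough sketch with made-up coordinates:

    import math, random

    # Hypothetical city coordinates; a tour's fitness is just its total length.
    cities = [(0, 0), (3, 1), (6, 0), (7, 5), (2, 4)]

    def tour_length(order):
        return sum(math.dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
                   for i in range(len(order)))

    tour = list(range(len(cities)))
    random.shuffle(tour)
    print(tour, tour_length(tour))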
EDIT: The freely-available book, A Field Guide to Genetic Programming, contains examples of where GP has produced human-competitive results.
Interestingly enough, the company behind the dynamic character animation used in games like Grand Theft Auto IV and the latest Star Wars game (The Force Unleashed) used genetic programming to develop movement algorithms. The company's website is here and the videos are very impressive:
http://www.naturalmotion.com/euphoria.htm
I believe they simulated the nervous system of the character, then randomised the connections to some extent. They then combined the 'genes' of the models that walked furthest to create more and more able 'children' in successive generations. Really fascinating simulation work.
I've also seen genetic algorithms used in path finding automata, with food-seeking ants being the classic example.
Genetic algorithms can be used to solve almost any optimization problem. However, in a lot of cases there are better, more direct methods to solve them. They belong to the class of metaheuristic algorithms, which means they can adapt to pretty much anything you can throw at them, given that you can come up with a way of encoding a potential solution, combining/mutating solutions, and deciding which solutions are better than others. A GA has an advantage over a pure hill-climbing algorithm in that, like simulated annealing, it can escape local maxima.
I used genetic programming in my thesis to simulate evolution of species based on terrain, but that is of course the A-life application of genetic algorithms.
The problems GAs are good at are hill-climbing problems. The catch is that it's normally easier to solve most of these problems by hand, unless the factors that define the problem are unknown and you can't get at that knowledge any other way (say, things related to societies and communities), or in situations where you have a good algorithm but need to fine-tune its parameters; there GAs are very useful.
One fine-tuning job I did was to tune several Othello AI players based on the same algorithm, giving each a different play style and thus making each opponent unique with its own quirks; I then had them compete to cull the top 16 AIs, which I used in my game. The advantage was that they were all very good players of more or less equal skill, so it was interesting for the human opponent because they couldn't read the AI as easily.
http://en.wikipedia.org/wiki/Genetic_algorithm#Problem_domains
You should ask yourself: "Can I (a priori) define a function to determine how good a particular solution is relative to other solutions?"
In the Mona Lisa example, you can easily determine whether the new painting looks more like the source image than the previous painting did, so genetic programming can be "easily" applied.
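A minimal sketch of that kind of fitness comparison, assuming both images are available as same-sized NumPy arrays (this is just an illustration, not Roger Alsing's actual code): sum the per-pixel differences and keep whichever candidate scores lower.

    import numpy as np

    def image_error(candidate, target):
        # Sum of squared per-pixel differences; lower means "looks more like" the target.
        return float(np.sum((candidate.astype(np.int64) - target.astype(np.int64)) ** 2))

    def select(parent, child, target):
        # Keep the mutated painting only if it got closer to the source image.
        return child if image_error(child, target) < image_error(parent, target) else parent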
I have some projects using genetic algorithms. GAs are ideal for optimization problems where you cannot develop a fully sequential, exact algorithm to solve the problem. For example: what's the best combination of car characteristics to make it faster and at the same time more economical?
At the moment I'm developing a simple GA to put together playlists. My GA has to find the best combinations of albums/songs that are similar (this similarity will be "calculated" with the help of last.fm) and suggest playlists to me.
There's an emerging field in robotics called Evolutionary Robotics (w:Evolutionary Robotics), which uses genetic algorithms (GA) heavily.
See w:Genetic Algorithm:
Simple generational genetic algorithm pseudocode:
Choose the initial population
Evaluate the fitness of each individual in the population
Repeat until termination (time limit or sufficient fitness achieved):
    Select the best-ranking individuals to reproduce
    Breed a new generation through crossover and/or mutation (genetic operations) and give birth to offspring
    Evaluate the individual fitnesses of the offspring
    Replace the worst-ranked part of the population with the offspring
The key is the reproduction part, which can happen sexually or asexually, using the genetic operators crossover and mutation.
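A compact Python sketch of that pseudocode, using a made-up fitness function (maximising the number of 1-bits in a bitstring) just to keep it self-contained:

    import random

    GENOME_LEN, POP_SIZE, GENERATIONS = 20, 30, 40
    MUTATION_RATE = 0.05

    def fitness(genome):                       # toy fitness: count the 1-bits
        return sum(genome)

    def crossover(a, b):                       # single-point crossover
        point = random.randrange(1, GENOME_LEN)
        return a[:point] + b[point:]

    def mutate(genome):                        # flip each bit with small probability
        return [bit ^ 1 if random.random() < MUTATION_RATE else bit for bit in genome]

    # Choose the initial population
    population = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]

    for _ in range(GENERATIONS):               # termination here is simply a fixed generation count
        # Evaluate fitness and select the best-ranking individuals to reproduce
        population.sort(key=fitness, reverse=True)
        parents = population[:POP_SIZE // 2]
        # Breed a new generation through crossover and mutation
        offspring = [mutate(crossover(*random.sample(parents, 2))) for _ in range(POP_SIZE // 2)]
        # Replace the worst-ranked part of the population with the offspring
        population = parents + offspring

    print(max(fitness(g) for g in population))  # best fitness found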