Can anyone recommend a good resource for learning VHDL? [closed] - hardware

As it currently stands, this question is not a good fit for our Q&A format. We expect answers to be supported by facts, references, or expertise, but this question will likely solicit debate, arguments, polling, or extended discussion. If you feel that this question can be improved and possibly reopened, visit the help center for guidance.
Closed 10 years ago.
Can anyone recommend a good book for learning VHDL? Or failing that, any good resource?

The unfortunate problem with VHDL is that there are loads of outdated, poorly styled, and outright wrong resources out there, both electronic and in print.
Part of the art of mastering VHDL is knowing how to filter these out. The following is the filtering I did in my previous life as a hardware designer. I hope it is helpful to you.
These are the things you want to read, or own, or download:
Book: "The Designer's Guide to VHDL" by Peter J. Ashenden (ISBN 1-55860-270-4). It does not waste your time by telling you to use obsolete or vendor specific libraries; it does not explain VHDL assuming you are a software engineer who wants to know about HDLs; it does not explain VHDL by assuming you are a hardware engineer who wants to know about HDLs. It does not advocate a vendor and its solutions (working with a particular vendor toolchain is a separate issue, and I've found it helpful to keep learning VHDL and vendor-specific separate). What it does do is introduce VHDL from the right perspective: as a language used to describe discrete event systems, from which smart programs can extract something that can end up as being hardware. It also describes what the standard language constructs are, which standards of the language exist, and what are their specific properties. Modern tools are ever more adherent to the standards, so this info is way more useful than a bunch of analogies that some other books (to remain nameless) seem to purport. Buy it, it's worth every cent.
The newsgroup comp.lang.vhdl is inhabited by people who are very knowledgeable about modern VHDL and can give you sane advice if you ask questions well. To be able to do the latter, read the book mentioned above. Wading through the numerous VHDL forums is in general a waste of time, as the information there is mostly drowned in noise.
Know your tools. Get yourself the PDF documentation for the toolchain you will be using and know it very well. The more, the better. Especially know its limitations. Tools often have idiosyncrasies that you will need to work around or play along with to get things just right. For instance, you will probably want to write portable behavioral code, except for the parts that are either technology-specific or that your tool happens to synthesize incorrectly.
Know where to find sane VHDL resources. An example of a sane resource is the Hamburg VHDL archive (at http://tams-www.informatik.uni-hamburg.de/research/vlsi/vhdl/). By sampling it, I found that the signal-to-noise ratio on that particular website is pretty high. Use it.
A fairly obscure book about hardware synthesis (for the really curious, and written from an academic perspective) is Giovanni de Micheli's "Synthesis and Optimization of Digital Circuits" (http://si2.epfl.ch/~demichel/publications/mcgraw/index.html), which may shed some light on hardware synthesis methods -- though a substantial body of work has since been done to improve the results given there. You may want to borrow this one from a nearby library and leaf through it.

I like the book "Circuit Design with VHDL" by Volnei A. Pedroni. It focuses on synthesizable VHDL, which is what you will need to write code for real chips, not just for simulation.

A great textbook to start out with is: Fundamentals of Digital Logic with VHDL Design
I remember starting out with this to get a quick overview.

I found the Low Carb VHDL Tutorial to be excellent when I was learning VHDL. Even more so now that its author has turned it into an open-source book titled Free Range VHDL.

I recommend Chu, Pong P.: RTL Hardware Design Using VHDL. John Wiley & Sons, 2006.

When learning any sort of HDL (Verilog, VHDL...) it is important to keep one thing in mind: it is not software programming, and things work in parallel. That being said, I find that the best way to learn any HDL is to learn how to think in hardware and describe the hardware (that's why it is called a hardware description language).
So far, I have rarely seen books that show you how your HDL gets translated into hardware. I read through one when I was at Synopsys (pages filled with code and schematics), but it was an internal publication. However, even lacking such a book, you can still see how your code gets turned into hardware by running it through synthesis with free software tools.
The reason I wish to stress this is that there are many ways to solve a problem. You will only be able to write code that solves it efficiently, from a gate-count and timing standpoint, if you understand how it gets translated into the underlying hardware.
Good luck!

This is the book I used for Systems Architecture class. It is dirt simple.

Be careful, though: things are not always parallel. Sequential (clocked) assignments behave differently from combinational assignments.

I'm not sure what your background or needs are, but Digital Design and Computer Architecture, by David Harris and Sarah Harris, was a very useful introduction for me. It's not VHDL specific (Verilog and VHDL examples are presented side by side) or even HDL-heavy – as the title would suggest, it's more of an introduction to digital design in general. But for me it has been a great approach, presenting the code along with a grounding in its application and theoretical context.

Related

What is a good standard exercise to learn the OO features of a language? [closed]

When I'm learning a new language, I often program some mathematical functions to get used to the control flow syntax. After that, I like to implement some sorting algorithms to get used to the array/list constructs.
But I don't have a standard exercise for exploring a language's OO features. Does anyone have a stock exercise for this?
A good answer would naturally lend to inheritance, polymorphism, etc., for a programmer already comfortable with these concepts. An ideal answer would be one that could be communicated in a few words, without ambiguity, in the way that "implement mergesort" is completely unambiguous. (As an example, answering "design a game" is so vague as to be useless.)
Any ideas?
EDIT: I have to remark that the results here are somewhat ironic. 10 upvotes and (originally) 5 favorites suggest that this is a question others are interested in. Yet the most upvoted answer is one that says there is no good answer. Oh well. I think I'll look at the textbook below, I've found games useful in the past for OO.
I can't imagine there being a standard set of exercises that would naturally introduce the OO features of a programming language to everybody. A lot of introductory OO tutorials are full of Animals, Cats, and Dogs, which doesn't really cut it, for me at least. Find a problem domain in OO you've struggled with a lot, and try to use that as your set of stock exercises for each language you pick up.
The OO constructs that we are used to thinking in terms of may not make sense in a language. JavaScript comes to mind, which shakes the entire foundation of how we think about objects in general. That said, you shouldn't adapt to a language but rather adapt the language to your purposes. Over time, as your knowledge repository grows and improves with experience, you'll naturally want to implement what you think is best in each programming language that you use, regardless of what the language offers.
Good question...
In my opinion the best teacher is to find a simple example of OO features and try to write something on your own, creating new examples for yourself and trying to develop a simple application in which you can connect all the features of OO.
Implementing an algorithm like merge sort, which doesn't use OO features because it doesn't need them, is useless. Try real, useful programs.
I remember when learning OO I wrote an application with a general "Animal" interface with methods, and classes which inherited from it, like "Amphibian". It was a fun time ;)
Some fun: implement the Shape/Circle/Ellipse hierarchy without falling into the trap (it can be done very nicely in Java, Scala, etc.).
edit: implement it before looking at the proposed solutions in the Wikipedia article :)
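For a concrete starting point, here is a minimal Python sketch of one common way out of the trap: keep the shapes immutable, so a Circle can safely be an Ellipse because no setter can later break the "both axes equal" invariant. The class and method names are just illustrative.

from math import pi

class Ellipse:
    def __init__(self, a, b):
        self._a, self._b = a, b            # semi-axes, fixed at construction
    def area(self):
        return pi * self._a * self._b
    def stretched(self, factor):
        # "mutation" returns a new object, which may no longer be a Circle
        return Ellipse(self._a * factor, self._b)

class Circle(Ellipse):
    def __init__(self, r):
        super().__init__(r, r)             # the invariant holds by construction

print(Circle(2).area())                    # ~12.57
print(Circle(2).stretched(2).area())       # ~25.13, now a plain Ellipse

Returning new objects from stretched instead of mutating in place is exactly what sidesteps the classic mutation problem.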
I've used Hunt The Wumpus. The original implementation in BASIC was not at all OO, but if you start fresh it lends itself pretty nicely to this.
Here's what I use:
http://homepage.mac.com/s_lott/books/oodesign.html
I've done it enough times that it's "standard" in my opinion.
This might be too specific, but it's what I credit for really getting me to understand OOP personally. For my work I had to write code to extract data from a large variety of different sources. It seemed straightforward to me at the time that I should tackle the problem from the perspective of designing various "DataProvider" classes. What only gradually became clear was how much code I could reuse by breaking the different kinds of providers down into hierarchical categories, like this:
DataProvider
    TextDataProvider
        HtmlDataProvider
        CsvDataProvider
        XmlDataProvider
    BinaryDataProvider
...and so on. I would suggest that any problem like this--where you need to accomplish a certain kind of task (in my case, extracting data) in a bunch of different ways (e.g., from multiple sources)--will be a great opportunity to delve into OOP and hopefully learn to appreciate how useful it is.
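To make the idea concrete, here is a hedged Python sketch of the kind of hierarchy described above; the class and method names are illustrative, not the answerer's actual code. The point is that shared plumbing (reading) lives high in the hierarchy while each subclass overrides only the parsing it needs.

from abc import ABC, abstractmethod

class DataProvider(ABC):
    def extract(self, source):
        return self.parse(self.read(source))   # shared skeleton lives here
    @abstractmethod
    def read(self, source): ...
    @abstractmethod
    def parse(self, raw): ...

class TextDataProvider(DataProvider):
    def read(self, source):
        with open(source, encoding="utf-8") as f:
            return f.read()
    def parse(self, raw):
        return raw.splitlines()

class CsvDataProvider(TextDataProvider):
    def parse(self, raw):                      # reuses read() from TextDataProvider
        return [line.split(",") for line in raw.splitlines()]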
I personally find the best way to learn OO is to write your own testing framework.
I find a layout of a Test Runner owning one or more Test Suites, each of which has its own Test Cases, to be enough of a starting point, but you can easily grow it from there, and it might even be something you care to use in the future.
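Here is a minimal Python sketch of that Runner/Suite/Case layout (illustrative only, not any real framework's API):

class TestCase:
    def __init__(self, name, func):
        self.name, self.func = name, func
    def run(self):
        try:
            self.func()
            return f"{self.name}: PASS"
        except AssertionError as e:
            return f"{self.name}: FAIL ({e})"

class TestSuite:
    def __init__(self, name):
        self.name, self.cases = name, []
    def add(self, case):
        self.cases.append(case)

class TestRunner:
    def __init__(self):
        self.suites = []
    def run_all(self):
        for suite in self.suites:
            for case in suite.cases:
                print(suite.name, "/", case.run())

def test_addition():
    assert 1 + 1 == 2, "1 + 1 should equal 2"

suite = TestSuite("math")
suite.add(TestCase("addition", test_addition))
runner = TestRunner()
runner.suites.append(suite)
runner.run_all()

From here it is natural to grow setup/teardown hooks, reporting, and test discovery, which is where most of the interesting OO decisions show up.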
Alternatively, if you want something completely throwaway, there's always Enterprise FizzBuzz. :)

What features are important in a programming language for young beginners? [closed]

I was talking with some of the mentors in a local robotics competition for 7th and 8th grade kids. The robot was using PBASIC and the Parallax Basic Stamp. One of the major issues was that this was a short-term project that required building the robot, teaching the kids to program in PBASIC, and having them program the robot -- all in only 2 hours or so a week over a couple of months. PBASIC is kind of nice in that it has built-in features to do everything, but information overload is possible because of this.
My thought is that simplicity is key.
When you have kids struggling to grasp:
if X>10 then <DOSOMETHING>
There is not much point in throwing "proper" object oriented programming at them.
What are the essentials needed to foster an interest in programming?
Edit:
I like the notion of an interpreted language on the PC as a learning tool. Since the target platforms are more than likely somewhat resource-constrained, I would like to target languages that are appropriate for embedded work. (Python and even Lua require more resources than the target is likely to have, and I actually kind of like Lua.) I suppose that is one of the few virtues BASIC has: it has run on systems with less than 4K of memory for over 30 years. C may not be a bad option if there are some "friendly" tools available, such as Ch.
The most important thing is not having a lot of boilerplate to make the simplest program run.
If you start off with a bunch of
import Supercalifragilistic from <expialidocious>
public void inherited security model=<apartment>
public : main .....
And you tell them "not to worry, you aren't supposed to understand that" - you are going to put off both the brightest and the dumbest.
The nice thing about python is that printing "hello world" is print "hello world"
Fun, quick results. Capture the attention span of the kid.
An interactive shell, like most scripting languages offer (a command line that lets the student just type one- or two-liners), is a big deal.
python:
>>> 1+1
2
Boom, instant feedback, kid thinks "the computer is talking back". Kids love that. Remember Eliza, anyone?
If they get bogged down in installing an IDE, creating a project, bleh bleh bleh, sometimes the tangents will take you away from the main topic.
BASIC is good too.
Look for some things online like "SIMPLE" : http://www.simplecodeworks.com/website.html
A team of researchers, beginning at Rice, then spreading out to Brown, Chicago, Northeastern, Northwestern, and Utah, have been studying this question for about 15 years. I can't summarize all their discoveries here, but here are some of their most important findings:
Irregular syntax can be a barrier to entry.
The language should be divided into concentric subsets, and you should choose a subset appropriate to the student's level of knowledge. For example, their smallest subset is called the "Beginning Student" language.
The compiler's error messages should be matched to the students' level of knowledge. If you are using subsets, different subsets might give different messages for the same error.
Beginners find it difficult to learn the phase distinction: separate phases for type checking and run time, with different kinds of errors. For this reason, beginners do better with a language where types are checked at run time, i.e., a dynamically typed language.
Beginners find it difficult to reason about mutable variables and mutable objects. If you teach pure functional programming, by contrast, you can leverage students' experience with high-school and middle-school algebra (a short sketch of this idea appears below).
Beginning students are more engaged by an interactive programming environment than by the old edit-compile-link-go model.
Beginning students are engaged by splash and by interactivity. It's good if your language's standard library provides built-in support for creating and displaying images. It's better if those images are supported within the interactive programming environment, instead of requiring a separate player or viewer. And it's even better if your standard library can support moving images, or some other kind of animation.
Interestingly, they have got very good results with just 2D images. Even though we are all surrounded by examples of 3D computer graphics, students seem to get very engaged working with just two-dimensional images.
These results have been obtained primarily with college students, and they have been replicated at over 20 universities. However, the research team has also done some work with high-school and middle-school students. The first papers on that work are just coming out, so I'm less aware of the new findings and am not able to summarize them.
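To illustrate the algebra finding above: a pure function in Python reads almost exactly like the f(x) = 2x + 3 students already know, and evaluation works like substitution on paper. A tiny, purely illustrative sketch:

def f(x):
    return 2 * x + 3        # reads like the algebra students already know

print(f(5))                 # 13, the same answer as evaluating f(5) by hand
print(f(f(1)))              # 13 again: composition works like algebraic substitution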
When you have kids struggling to grasp:
if X>10 then <DOSOMETHING>
Maybe it's a sign they shouldn't be doing programming?
What are the essentials needed to foster an interest in programming?
To see success with little or no effort. To create something running in a matter of minutes. A lot of programming languages can offer that, including the scary C++.
In order to avoid complications with #includes, multiple source files, modularization, and compilation, why not look elsewhere? Try writing some Excel macros, or use any other software with a basic built-in scripting language to automate certain tasks.
Another idea could be to play with web pages. It is not exactly programming, but at least it is easy to achieve something and show it to others with pride.
This has been said on SO before, but... try Scratch. It's an incredible learning tool for kids. It teaches the basics of programming concepts in a hands-on and language-independent way. After a bit of playing around with it they can get down to learning a specific language's implementation of the concepts they already understand.
The common theme in languages that are easy for beginners - especially children - to pick up is that there's very little barrier to entry and immediate feedback. If "hello world" doesn't look a lot like print "Hello, world!", it's going to be harder for people to pick up. The following features along those lines come to mind:
Interpreted, or incrementally JIT compiled (which looks like an interpreter to the user)
No boilerplate
No attempt to enforce a specific programming style (e.g. Java requiring that everything be in a class definition, or Haskell enforcing purely functional design)
Dynamic typing
Implicit coercion (maybe)
A REPL
Break the problem (read: program) down into a small set of sections (modules) that do one thing and do it very well.
You have to get them to stop thinking like a user and start thinking like a programmer. They need to take it one step at a time. Ask them what they have to think about in order to figure out the problem themselves, and then write those thoughts down as steps. If you can, break each step down even further in the same manner. When done, you will have the program in English, making it simpler to program for real.
I did this with a friend who just could not get it, and now he can. He used to look at something I did and be bewildered, and I would point out that he had done more complex things than that.
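As a tiny illustration of that "steps in English first" approach (the task and file name here are hypothetical), the steps become comments and then code:

from collections import Counter

def word_counts(path):
    # 1. Open the file and read the text.
    with open(path, encoding="utf-8") as f:
        text = f.read()
    # 2. Split the text into words.
    words = text.lower().split()
    # 3. Count how often each word appears.
    return Counter(words)

# 4. Print the ten most common words.
for word, n in word_counts("example.txt").most_common(10):
    print(word, n)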
One of the more persistently present arguments I have had with other programmers is whether or not one's first language should require explicit type declarations. Many are of the opinion that learning a language which requires you to explicitly declare type information will teach you to program typefully. Conversely, it can be said that dynamic languages might present a less demanding learning curve. It goes either way, I suppose.
My advice: start with a simple model of how a computer works. I am particular to stack machines as good tools for teaching computation.
Remember that beginners are learning two disciplines at the same time: how computers work and the abstract logic involved (the basics of Computer Science), plus how to write programs that match their intended logic (learning a specific language's syntax and idioms). You have to address both concerns in an interwoven fashion in order for the students to quickly become effective. This is also the reason experienced programmers can often pick up new languages quickly.
It's worth noting that Python grew out of a project for a language named ABC, which was targeted at beginners. For example, the colon at the end of an if line isn't strictly necessary, but was found to improve readability:
if some_condition:
    do_this()
I've got three words: Karel the Robot.
It's a really, really simple 'language' designed to teach people the basics of programming.
Look for it on the web. You can look at this, though I never tried it:
http://karel.sourceforge.net/
While this isn't related to programming a robot, I think web programming is a great place to start with kids that age. It's how I started at that exact age. It easily translates to something kids understand, if they use the web at all. Start with HTML, throw in Javascript, and soon they want to be doing features requiring server-side scripting of some sort, and things progress from there.
With the kind of kids who are already interested in robotics, though, I'd actually go for a different language like the ones already described. If you want to work in a field like robotics, you don't need to be convinced to try something hard. You need to be challenged.
A few years ago I saw a presentation at Ignite! Seattle from one of the people working on the project now known as Kodu, which is envisioned as a programming language for children. He spent time talking about what common language features could simply be thrown out in a programming environment meant to teach fundamentals.
A lot of cherished imperative constructs, like C-style for loops, were simply left out in favor of a simple object-messaging approach. Object-oriented programming isn't hard to understand when you think about "objects" and "messages"; the hard part is when you deal with things that programmers, but not children, care about, like inheritance and contracts and sweeping abstractions. I've got this thing (noun), now act on it (verb), in this way (adverb like quickly), when thing (sees/bumps into) something (with some attribute) (that's your if). Events are really conditions, and have all of the power of conditions, but it's up to the runtime to identify when those events happen.
This kind of event and messaging driven approach probably translates even better to robots than procedural programming would, anyway, so it might be a good way to look at the problem. Try not to think about what you'd "need" to know to program in C or Pascal or something; think about what you'd want to be able to make something do.
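As a rough Python sketch of that noun/verb/event style (this is not Kodu's actual syntax, just an illustration), rules are registered as "when X happens, do Y" and the runtime fires them:

class Robot:
    def __init__(self):
        self.rules = {}                     # event -> action ("when X, do Y")
    def when(self, event, action):
        self.rules[event] = action
    def handle(self, event):                # the runtime reports events
        if event in self.rules:
            self.rules[event](self)
    def reverse(self):
        print("backing up")
    def turn(self, direction):
        print("turning", direction)

robot = Robot()
robot.when("bumps wall", lambda r: r.reverse())
robot.when("sees light", lambda r: r.turn("left"))
robot.handle("bumps wall")                  # prints "backing up"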

How to become good at Technical Design [closed]

I would like to know what a programmer should know to become good at design, particularly in Java/J2EE technologies.
Firstly, good design transcends whichever language you choose to use to implement it. Good software design is about managing complexity to create easy-to-understand code which is robust and maintainable. Key points are:
Work in the highest level of abstraction you can at any time
Encapsulate and hide areas of complexity
Understand what value there is in clear and consistent naming
In my mind, good design is achieved by a combination of understanding good practice and being creative. In my experience the hardest part of design is achieving the right functional decomposition of the problem into smaller sub-problems. It is important to understand that this decomposition is almost always an iterative process rather than a methodical top-down one. You have to be prepared to modify or throw away your previous decomposition until you have something which is maintainable.
It is hard to talk about good design and not mention two things in particular:
Object-Oriented Practices
Design Patterns
While some languages are object-oriented, some are purely object-based, and others, like C, were created before object-based design became widespread, the principles and practices can be applied in any language. Most of the code I write is in C, and I try to use object-like practices where possible.
Design Patterns present good solutions to common problems and give these solutions names. I have found the study of Design Patterns a key to understanding what good design can achieve.
To begin to understand design, you should probably first write some toy projects. Write them, take a step back once in a while and reflect, then go back and rewrite. Lather, rinse, and repeat.
Making mistakes in design is the best way to understand how to do better next time. There are of course some methodologies you should be aware of, the most important of which are patterns and information hiding. Beyond that there are various sources/books on software architecture, for example: Software Architecture in Practice (2nd Edition) (The SEI Series in Software Engineering) by Len Bass, Paul Clements, and Rick Kazman.
Try to look closely at where information belongs. Should the interest rate be a field on Account or on AccountType, for a (small) example?
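A small Python sketch of that example (class names illustrative): putting the rate on AccountType gives every account of that type one shared definition.

class AccountType:
    def __init__(self, name, interest_rate):
        self.name = name
        self.interest_rate = interest_rate   # one definition shared by all accounts

class Account:
    def __init__(self, owner, account_type, balance=0.0):
        self.owner = owner
        self.type = account_type
        self.balance = balance
    def yearly_interest(self):
        return self.balance * self.type.interest_rate

savings = AccountType("savings", 0.03)
print(Account("alice", savings, 1000.0).yearly_interest())   # 30.0

Change savings.interest_rate and every savings account picks it up; had the rate lived on Account, you would have to touch each instance.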
Last but not least, try to involve yourself in discussions about design. Debate with your peers, but also pick the brains of more experienced designers/architects.
And stay critical! Although software design is more of an exact field than building design, with (some) clear pros and cons, taste/preference and rhetoric are still part of the deal.
I would recommend a couple of things:
Read about some design patterns. The original Gang of Four book helps with OO design. If you are writing enterprise applications, I can't recommend Martin Fowler's Enterprise Application Architecture book enough.
Patterns give you the essential words to describe designs both to yourself and to others. Just reading about the different approaches makes you see new possibilities. If you are looking at J2EE, patterns like Inversion of Control are essential.
Obsess about loose coupling
The essence of good design is preventing tight coupling. Anything that can be used to move your code into loosely coupled layers is going to help your overall design.
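A minimal Python illustration of loose coupling through dependency injection (the names here are hypothetical): the service depends on an abstract store rather than a concrete one, so layers can be swapped without touching the service.

from abc import ABC, abstractmethod

class UserStore(ABC):
    @abstractmethod
    def find(self, user_id): ...

class InMemoryUserStore(UserStore):
    def __init__(self, users):
        self.users = users
    def find(self, user_id):
        return self.users.get(user_id)

class UserService:
    def __init__(self, store: UserStore):    # depends on the interface only
        self.store = store
    def greet(self, user_id):
        user = self.store.find(user_id)
        return f"Hello, {user}" if user else "Unknown user"

print(UserService(InMemoryUserStore({1: "Ada"})).greet(1))   # Hello, Ada

Swapping InMemoryUserStore for a database-backed implementation requires no change to UserService, which is exactly the loose coupling being advocated.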
Read other people's code. Study some high-profile open-source code in your technology area.
Just studying other people's code quickly gives you a feel for nice-looking designs compared to cluttered Big Ball of Mud approaches.

Formal Methods and Enterprises [closed]

So...
I teach formal methods in software engineering. I also teach "agile methodologies". Most people seem to think this is contradictory; I think it makes a lot of sense. I also work for a company, where we need to actually get things done :) While I can apply my earned skill points in "specification" on a day-to-day basis, my colleagues typically flee from the word "formal".
I used to think that this was due to the intrinsic way we learn how to program: we are usually driven to find a working solution, not to understand the problem. Then I thought it was because most people in the formal community are not engineers, but mathematicians or computer scientists. Nowadays, I wonder if it is just because the formal-methods community hides behind some kind of "obfuscation" law, using all the available Unicode symbols, actively developing rude, unaesthetic tools, and laughing in the face of standards.
Yes, I've been moving from a "blame them" to a "blame us" perspective ;-)
So, my question is: do you use any kind of formal methods in your company? Have you introduced them, or were they pre-requisites? What techniques do you use to clear the fog of mathematics from people's fears and incite them to use formal methods? What do you think current tools are lacking for a more general usage?
The key to getting people to buy into any methods or methodologies is to show them how it solves problems they are having. If they can see it will make their lives better you have a much improved chance of getting them to adopt the techniques.
And if you can't show them that, perhaps you wanted to adopt the methods based on philosophy rather than practicality. Unless the others share your philosophy then you're not going to get anywhere. And perhaps you shouldn't.
Over the decades there have been a great many methodologies. Newer ones always address the shortcomings of the old ones, yet projects still get in trouble and fail. Why? Because the rock stars that come up with new methodologies are rock stars, and have made a new methodology precisely because they understand the underlying issues and how to apply them. Those who come after tend to blindly follow the recipe, and it doesn't work so well.
So I think the best thing is to teach about the underlying problems and then show how various methods attempt to deal with those problems. The differences between companies, projects, and teams are so great that no one methodology can be applied successfully to all combinations. Learning to choose an appropriate tool and apply it well is crucial.
Thank you for all the contributions; they are very insightful. Allow me to flame a bit (don't take it personally, though :-)
Most people seem to think that formal methods are just about program verification, or critical systems. This may be true if we pursue the ultimate cliche: to prove we are doing the program right (vs. validation, which asks, as a contributor said, if we are doing the right program).
But consider model finding/checking tools such as Alloy. Learning to use a tool like this takes a negligible amount of time for anyone used to UML and OO. Still, it can give you immediate insight into your model. It usually takes no more than 10 minutes to find a counter-example over a small enough subset of the model one is trying to use (and that includes describing the model in Alloy in the first place).
Take requirements engineering as an example. One usually draws a lot of UML. Few people use OCL, though, and many business rules are informally annotated in natural language. Why? Time constraints?
Now consider the fact that the majority just use their gut feeling to claim that a model is satisfiable. Again, why? I can take the same amount of time (probably even less, since I don't need to care about drawing aesthetics) to write that model in Alloy and just check it for satisfiability. And what kind of mathematics do I need to know? "Predicates"? A fancy name for IFs and booleans ;-) Quantifiers? Fancy names for ForEachs()...
What about big information systems? They don't need to be critical... Just try to analyze in your head a conceptual (not implementation!) diagram with over 600 classes. I see many people banging their heads against the wall over easy-to-make model mistakes, because they missed some constraint or because the model allows stupid things to happen.
The fact is, one does not need to use formal approaches from head to tail. Granted, I could prove a whole application in Coq, and certify that it is 100% compliant with some specification. This may be the Computer Scientist/Mathematician approach.
Still, with a GTD philosophy, why can't I delegate some tasks to the computer and let it help improve my development? Is it really a matter of "time", or plain, simple lack of technical ability and the will to learn/innovate?
Working with line-of-business IT development in an enterprise means having to transfer knowledge about the business from actual business people into the heads of developers. While I myself find abstract maths to be one of the greatest pastimes there is, it's a terrible communication tool. And communication is what it's all about. While I might conceivably have some success convincing IT people to embrace more abstract notations, I basically have no chance with the business people.
While there are some areas where I can see a role for formal methods in an enterprise (math- and logic-heavy specialist software, a significant need for provable properties as in safety-critical software), they provide little help with getting correct requirements on, e.g., how to fulfil a customer order by issuing one or more supply orders to a set of possible external or internal providers.
I think the jury is still out on model based approaches and domain specific languages. I think they will succeed or fail depending on whether they provide quicker feedback from IT to the wishes and needs of the business side, and whether they presume business people will have to do any significant studying.
Technology is easy. Communication is hard. Formal methods may help us do things right, but those I've seen do nothing to help us do the right things. (Yes, these are cliches, but that's because they're inescapably and painfully true.)
I'm taking a course on 'Specification and Verification'. As part of the course structure, we are doing the following:
1. Learning tools like PVS (Prototype Verification System, http://pvs.csl.sri.com/) and SMV (Software Modeling and Verification, http://www.cs.cmu.edu/~modelcheck/smv.html).
2. Apart from that, we dissect accidents which happened because of software failures, e.g. the failure of Ariane 5.
I feel formal methods are more applicable to scenarios where the failure cost is higher than the design cost, and it seems apt to use them for software in critical systems. I guess they are used in avionics, chip design, etc., and the automobile industry is also drafting them into practice.
I have tried to get people to embrace formal specification methods a few times (Z and Alloy) and have had the same experience that you have: most people, while feeling that they serve a useful purpose, are very uncomfortable using them for actual work.
Funnily enough, the same people are more than happy to produce utterly useless UML diagrams in ginormous quantities.
I think there are two main reasons for this:
a.) Many developers are uncomfortable with the level of abstraction required by a formal approach. The fact that most entry-level mathematics education is all calculus and not discrete mathematics might have something to do with this.
b.) Formal methods require a very bottom-up design approach, where you design your core model from the ground up, make it airtight, and then connect it up to the actual user requirements by providing an interface on top of it. Since we tend to have requirements drive development efforts, a top-down approach feels more natural, although it often leads to inconsistent models. It's like retrofitting a basement underneath your house after it has already been built.
Formal methods make no sense in systems where the cost of failure is low.
In a production web application, you've got multiple front-end boxes, multiple back-end boxes, multiple database boxes - if a program on any one of them fails, it's a non-event. Hardware is so cheap that you can build these systems for far less than the cost of formally specifying all your software.

Do you follow the Personal Software Process? Does your organization/team follow the Team Software Process? [closed]

For more information - Personal Software Process on Wikipedia and Team Software Process on Wikipedia.
I have two questions:
What benefits have you seen from these processes?
What tools and/or methods do you use to follow these processes?
I went through the training and then my company paid for me to go to Carnegie Mellon and go through the PSP instructor training course to get certified as an instructor. I think the goal was to use this as part of our company's CMM/CMMI effort. I met Watts Humphrey and found him to be a kind, gentle soul with some deeply held ideas about process. I read several of his books as well.
Here's my take on it in a nutshell - it is too structured for most people to follow, assuming you follow things to the letter. The idea of estimation based on historic info is OK, particularly in the classroom setting, but in the real world where estimates are undone in a day due to the changing tide of requirements and direction, it is far less useful. I've also done Wide Band Delphi estimation and that was OK but honestly wasn't necessarily any better than the 'best guess' I'd make.
My team was less than enthusiastic about PSP and that is part of the problem - developer buy-in. My company was doing it for the wrong reason - simply to say "hey, look we use PSP and have some certified instructors!".
In the end, I've found using an 'agile' approach to be better. I have a backlog of work to do and can generally estimate it pretty well. I've been doing it long enough that I can make pretty good rough estimates on time, and frankly I don't think that the time tracking really improves things much. Perhaps in some environments it would work well, but at my place we'll keep pumping out quality software without all the process hoops that yield questionable benefits.
Just my two cents.
I got into this once, even tried using PSP Dashboard.
It's just too hard to keep up with. Who wants to use a stopwatch for all their activities? Follow Joel's advice on Painless Scheduling and Evidence Based Scheduling.
+1 this question, -1 to PSP.
I used the PSP and TSP processes by heart for 4 years (though it was at the beginning of my software career). As an idealist you will love what you are doing, and yes, there are amazing results as well.
Though PSP advocates recording your defects down to the smallest detail (such as missing semicolons or typos), I was in a conversation with Mr. Watts Humphrey where a lot of people asked him about the advancement of compilers and the supposedly missing object-orientedness (which I wondered about, since as an OO programmer I was using the process successfully). He gave a very good answer. It went along these lines: "PSP, or as a matter of fact any process methodology, is not a concept stuck on a single idea. The core idea is to introduce people to quality methods and analysis.
"It's always adaptive. You can tailor it to fit your needs. If you feel like you will go with Function Point methodology, you are alright to go ahead with it. The same goes for any estimation technique. But you should do it constantly and repetitively.
"The same goes for the advancement of compilers. If you feel the WBS in the structure of PSP won't fit your development, do modify it and use it, but again, do so continuously.
"As you do it continuously, you will have collected your own historical data and will be able to make statistically predictable and accurate estimates for all the parameters."
Maybe I am giving this answer late, but when I read all the replies I felt I wanted to share it.
As for tools, we have the Process Dashboard, the PSP Excel sheet, and so on.
For the PSP, I have seen the Software Process Dashboard, but it seems awfully difficult to use.
I learned it just this last semester in college and it worked great for me. I know that by following it to the letter I can be confident that I can hit Compile without any errors, and that by hitting Run I won't have to spend time fixing and re-compiling the program to run it again and again until the mess is fixed.
People complain about having to record the "missing semicolons" and such, but by the time you're on program 7 you're no longer making such trivial mistakes; instead your defects are found in the important bits of your program. I have not had the opportunity to apply it to a real scenario though, but I'm really looking forward to it!
I try to follow the PSP 2.1 process when possible. It really helps me keep a focus on not skipping important, but less exciting, portions of a project. Usually this is design and design review for small projects.
To keep track of time you can use the PSP Dashboard, which has a bunch of built in features and scripts that help you follow the process.
If you are just looking for a time-tracking tool, I also like http://slimtimer.com. It can also do some decent reports.
I've been using PSP for the last six months.
It is time consuming. For my estimations I had to spend 7% of my time filling in forms.
It is frustrating to have to record the mistake "missing semicolon" over and over again.
But on the other hand, as I got used to the process, it became important: I started to see which errors I was mainly making, and I started "naturally" avoiding them.
It also makes you "review" your code so you can see if there's any problem before hitting the compile button.
For tools I recommend using Timetracker: http://0xff.net/
I recommend at least trying PSP for a couple of months, because you will create some habits that help reduce the time you spend compiling and correcting minor bugs.
I have completed the PSP course; the next one is supposed to be TSP, which, as others say, is meant for team dynamics. I have mixed feelings about PSP (mostly negative, but the results were interesting). I arrived at the following conclusions:
First of all, my main source of frustration is that the design templates are way too tedious and impractical. Change them for UML and BPMN, tell your instructors from the start, IMPOSE IF NECESSARY. The book itself says that the design templates are for people who don't know or don't want to learn UML.
Secondly, estimations were the only valuable part for me. The book itself says that you can use other measures apart from lines of code, and it even tells you how to check how statistically relevant they are. My take on this (counting lines of code) is that a tool/plugin that connects to your VCS (git, mercurial) must exist to automate the building of your personal database; otherwise it is too tedious to track base/added/reused parts.
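As a rough sketch of the kind of helper wished for above (assuming git; "git diff --numstat" is a real git option that prints added/removed line counts per file), something like this could feed a personal size database:

import subprocess

def line_changes(rev_range="HEAD~1..HEAD"):
    out = subprocess.run(
        ["git", "diff", "--numstat", rev_range],
        capture_output=True, text=True, check=True,
    ).stdout
    added = removed = 0
    for line in out.splitlines():
        a, r, _path = line.split("\t", 2)
        if a != "-":                 # git prints "-" for binary files
            added += int(a)
            removed += int(r)
    return added, removed

print(line_changes())                # (lines added, lines removed) in the last commit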
The process itself is nice, but not applicable to big projects. Why? Because it just doesn't cope with iterations. In the real world, due to requirement changes, you will always have to reiterate on a project. You can still apply the discipline to small programmatic tasks, that is: plan, design, review your design (have design standards and a small checklist that you can memorize), code, review your code (have clear coding standards and a small mental checklist you can memorize), test, and ponder your mistakes. Any experienced programmer will know these are eventually intuitive steps to follow. My recommendation for real practice: follow the process, but don't document anything other than your design, and if you do implement unit tests, document them well.
This process might actually be worth following, and practical... for real-time system programming where there is absolutely no room for mistakes; otherwise it doesn't feel worth it.
If you are looking for a methodology to get organized and improve focus, try GTD (Getting Things Done) and Pomodoro first.
If you have obsessive-compulsive disorder you might actually enjoy PSP =).
My final recommendation: learn from it as a reference; it might lead to better and more practical things. This thing is just too academic.
P.S.: R.I.P. Watts Humphrey
I followed the PSP for a few weeks some years ago, because my group wanted to experiment with it. I found it very disappointing and even irritating to work with. It exhausted my patience. My main negative points are:
Ridiculous emphasis on things like typos or missing semicolons.
Impractical forms that you have to fill in by hand.
Focus on procedural programming instead of OO.
Estimating involves counting the number of loops, functions, etc.
I found it a massive waste of time. I'd rather choose to leave this profession than to be forced to follow the PSP.
Related material: My answer about a PSP book in the "What programming book would you NOT recommend to developers" question.
I used it during university, but at work we really don't have a process at all. Only recently have we started using version control.
My experience with it was that it seemed far too tedious to be useful. If it's not automated, then it can go away.