Beginner LabVIEW Tasks [closed]

I am on a FRC (FIRST Robotics Competition) team, and we plan on using LabVIEW to program our robot. I was wondering if anyone had any basic LabVIEW tasks that we could use to learn LabVIEW before we begin the actual programming of our robot?
EDIT: Most of the programmers have at least a basic understanding of programming, and are coming from another language.

I believe the best thing would be to go through the getting started tutorial of LabVIEW:
http://digital.ni.com/manuals.nsf/websearch/EC6EF8DE9CB98742862576F7006B0E1E
I suggest it because it has exercises between the lessons, and you can attempt those without looking at the solutions.
Also, the following site has the 3-hour and 6-hour course on LabVIEW which could be approached in the same way:
http://www.ni.com/academic/labview_training/
Also, if you need guidance for that particular project, I don't mind getting involved to mentor your team on it. You could provide me with the contact details of your teacher/professor and I can get in touch with them.
Take Care
Adnan

I was also on a FIRST team for a while, and I taught the programmers while I was there. I found that the best way to get the language down was to practice with some simple projects which solidify data-flow and other important concepts in the mind.
A few:
A stop light with user-manipulable controls for how fast each light should stay on. Once you've got that down, fix it so that the user can only change stopping distance and speed limit. That way you work in some of the math functions.
I always taught some of the basic concepts, like loops and shift registers, with imaginary killbots. A killbot has a pre-set kill limit (for practicing For Loops) and has to keep track of how many hits it gets with shift registers.
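For programmers coming from text languages, here is a rough Python analogue (my illustration, not LabVIEW code or an NI example) of what the killbot exercise practices: a For Loop that runs until the pre-set kill limit, and a shift register that carries the hit count from one iteration to the next.

```python
import random

def run_killbot(kill_limit=5):
    """Toy analogue of a LabVIEW For Loop with a shift register."""
    hits = 0                            # initial value wired into the shift register
    for _ in range(kill_limit):         # For Loop: fixed number of iterations
        hit = random.random() < 0.5     # stand-in for "did the killbot score a hit?"
        hits += int(hit)                # shift register: carry the running total forward
    return hits                         # value left in the shift register when the loop ends

if __name__ == "__main__":
    print("Hits scored:", run_killbot())
```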
I certainly wouldn't go with NI's training things. They only managed to confuse the new programmers, even the ones with experience in other languages. I also found it best not to teach the concept of global variables, which NI does, because it breaks the whole point of LabVIEW, data-flow.
Wow. That was long winded.

While I haven't gone through them, Ben Zimmer's company has been posting (apparently free) FRC training videos at http://www.frcmastery.com/. Possibly they're worth checking out.

If you have LabVIEW installed you could have a look at the following two sections of the on-line help files:
Getting started
Fundamentals
The Getting Started section is a technical guide to how to use LabVIEW; the Fundamentals section, on the other hand, provides deep insight into how to program with LabVIEW and covers a lot of ground. Both are also available on the web (I provided the URLs).

Personally, I'm not so into NI resources.
However, they provided this short and rather nice course: http://cnx.org/content/col10241/1.4
(I like the videos).
Also, I used this
http://techteach.no/labview/lv85/labview/index.htm

Related

How much time do I need to learn LabVIEW [closed]

I know that this question is too abstract. But: how much time do I need to learn LabVIEW to become an average LabVIEW developer? For example, if I buy a good book about LabVIEW and have 8 hours per day (at work) dedicated to learning LabVIEW, how many days will I spend learning it? Could you please provide examples from your own experience? More information about me that might be helpful: I'm a developer and know C/C++/Python and a little bit of Java.
Like Swinders said, it might depend a lot on your sensibilities. I have seen people who had a really hard time migrating to the data flow concept. It's a different paradigm from the classic text-based languages and some people can't easily think in these concepts.
If you get past that hurdle, you'll find that the IDE handles a lot of the annoying things you used to take care of for you (things such as syntax and memory allocation). This allows you to become productive very quickly.
It doesn't mean, however, that your level will be high. One potential pit you should try hard to avoid is casting your existing experience onto LV. The most common example is probably local variables. This may be shocking to people coming from a text-based world, but LV does not have variables per se. Unfortunately, it does have elements called variables, and people migrating from C who find them jump on them and use them as they would use variables in C, leading to LV code which looks like C code and is bad code (at least in LV).
If you do manage to work around this, I would guess you would become better than the global average in less than a month and better than most professional developers after creating three projects you would later look at and say "what the hell was I thinking?".
I never took any of the NI courses (although I understand some of the advanced architecture ones are pretty good), but I would suggest you also spend some time in some of the online communities (such as LAVA or the NI forums) and look at some of the examples and discussions there. There's a lot of material about best practices, design patterns, etc., which would allow you to become a more professional developer.
Above all, do not abandon your current professional conduct. If you have a structured process for designing and developing software, you already have a leg up on the majority of LV programmers. Just make sure you adapt and keep using such a process.
I started with no commercial programming experience (I have always programmed for fun) and followed an on-line tutorial to pick up the basics of LabVIEW. Within a week I was able to understand existing code and could develop a small application.
It is hard to give an estimate on how long it would take to become an 'average' LabVIEW developer as this depends on what you mean by 'average'. One thing to consider is how easy you are able to think in terms of data flow rather than procedural languages. If you can pick up new programming languages quickly then this will help.
Would you be the only person using LabVIEW or are there others at your place of work that could mentor you? You may also find that there are user groups operating near you which I would recommend (check the NI website or contact your local NI office).
There is then the experience that you will need to gain to produce good LabVIEW code. I was lucky to be able to attend the National Instruments training courses a few years ago, which I think helped, but only by using LabVIEW have I become an 'average' LabVIEW developer.
I'd say a few weeks at most if you devote the majority of your work time to it. I had a similar background to yours when I started to develop in LabVIEW. The hardest part was adapting to the lack of variables. There are local variables, but they're not what you're used to at all. Additionally, LabVIEW's functions, called Virtual Instruments (VIs), can have multiple inputs and outputs, similar to how Python can handle n-tuples.
I will warn you, the array handling features are terrible. A lot of general concepts you might be used to are difficult to implement. My mantra when working with the language is that it makes hard things easy and easy things hard. There are also a lot of "gotchas" in the language set, especially with the DAQmx functions. I'm not sure what you're planning on developing, and the Real-Time module has its own issues as well, different from those of the main language set.
I would definitely spend some time on NI's website and read as many whitepapers as you can, especially about good design practices, here and here. Learn their State Machine (here or here) and Producer/Consumer patterns well; they're the backbone of many applications you'll be writing.
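If it helps to see the shape of the Producer/Consumer pattern in a text language first, here is a minimal Python sketch (my own illustration, not NI's template): one loop produces data and pushes it onto a queue, a second loop pops and processes it, so acquisition never has to wait on processing.

```python
import queue
import threading
import time

data_q = queue.Queue()      # the queue that decouples the two loops
SENTINEL = None             # tells the consumer loop to stop

def producer(n_samples=10):
    """Producer loop: acquire data and enqueue it as fast as it arrives."""
    for i in range(n_samples):
        data_q.put(i)
        time.sleep(0.01)    # stand-in for a hardware read
    data_q.put(SENTINEL)

def consumer():
    """Consumer loop: dequeue and process data at its own pace."""
    while True:
        item = data_q.get()
        if item is SENTINEL:
            break
        print("processed", item)

if __name__ == "__main__":
    t = threading.Thread(target=producer)
    t.start()
    consumer()
    t.join()
```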
Good luck, it will make your head spin for a while.
There are some excellent resources to help you get started. If your employer can afford training, you can get started pretty quickly by taking a week of training run by National Instruments. The NI website also has an outstanding developer community that is highly responsive to questions even from novice developers. But I would say that the key to being comfortable with the idioms and style of the language is just plain old practice that you get by solving problems using LabVIEW on a regular basis.
You will find eventually that there is the question of hardware and instruments. LabVIEW is really all about data acquisition, either through NI's DAQ hardware, through traditional GPIB instruments, or through third-party APIs (ActiveX, .NET assemblies). If you're using LabVIEW, you're probably interfacing to hardware of some type. This can get really challenging with complex instruments and measurements. If you're getting started, I would recommend making sure that you have unlimited access to at least some of the hardware you'll be working with. In other words, make sure that your manager understands that you need a lot of access to the hardware in order to get good at developing with it.
We are using LabVIEW to create test software for our factory test systems. Over the past years I have trained several beginners in LabVIEW. I would say it depends on how good you are at learning new concepts. I have trained some to the point of producing standalone applications using the queued message handler concept, building dynamic GUIs and using hardware drivers, within about 3 weeks. Unfortunately there were others as well who were only able to learn half of that within half a year.
The most important thing in my opinion is the learning source. Having an experienced LabVIEW user who can guide you is the best option. If no one is available, I would recommend YouTube tutorials combined with the shipped LabVIEW examples.
The LabVIEW core tutorials are not very handy in my opinion. Those are quite boring and far from what you really need to get started.

Do you rely on your memory, or consult references and use a lot of Intellisense? [closed]

I have noticed I do not code as much as I used to. Today I dedicate more time to analysis and design, then I communicate that design to programmers, and they do the coding. This has affected my coding productivity, because I must consult references and rely on Intellisense. Things are becoming more complex every day.
Now, here is the irony. If I were to hire a programmer and sit him or her in front of a computer, I might ask for some coding so I could check their abilities. I would evaluate them based on their use of memory versus consulting references. Maybe I would prefer the programmer who did not consult too much but clearly knows what they are doing.
What is your opinion and experience?
I would say that a developer who knows how to find the answers is better than one who already has good overall knowledge. I find that Intellisense is a good tool for finding answers; besides, it is too much to remember all the method names, arguments, overloads, etc.
I use memory to get me into the right general area (e.g. knowing which classes to use or at least which namespace they'll be in) and then often Intellisense/MSDN for the exact method name or arguments to use.
Having said that, Stack Overflow is improving my ability to code without any references (or even compilation) - I'm sure code will just work out of the box for me more often now than it used to. (I tend to post and then check the code works, add links to MSDN etc - assuming I'm reasonably confident in the approach.)
Someone knowing what resources are available, and how to find the answers, and how to effectively debug - these are qualities I look for now in prospective employees.
I used to consult my memory only, but two things have happened:
Class libraries have gotten larger, so has the number of languages available
The ratio of programming-related memory to personal-life-related memory has shifted away from code
Programming today is also eight times harder than it was when I started. I used to work on 8-bit machines, now I'm working on 64-bit ones. :)
I was once at a job interview with the CTO of a company. He asked a question based on a real-life problem the company had run into a while back and solved. It was a multi-step problem.
I was standing in front of a whiteboard working through my solution and struggling with a particular part, a part I would have used Google for before even attempting it had I been tasked with solving this problem for real instead of for an interview. He asked me at that point, "Would you do anything differently if this wasn't an interview question?" I responded, "Yes. I would exhaust all possibilities of using a third-party component for this part of the task and look up the solution, because it is a well-defined problem that's been solved several times." There was a bit more discussion where I justified my answer, explained exactly what I would research, and solved some other parts of the question. In the end I was offered and accepted the job, partly because I knew how to find out what I didn't know.
Being able to use references is as important as being able to code from memory. Obviously, if you are a one-language shop and want people proficient in that language, the person should be able to write a complete hello-world app in Notepad. Interview problems should focus on small problems, and one should not worry about small syntax errors. This is why a whiteboard is the best IDE for interview questions.
Unless you demand all your coders use notepad and don't give them internet access, don't be as concerned by the syntax. If you do sit them down in front of a computer, worry about the finished product as well as the technique used to get there.
I'm a PHP programmer in my early 30's. I rely on PHP's excellent documentation, for several reasons:
Programming concepts don't change. If I know what my object models are and how I want to manipulate data, then there are dozens of ways to implement the details. The details are important, but a good grasp of the design and structure is more important.
PHP has notoriously inconsistent functions. One string function might use ($needle,$haystack) as parameters, and another might use ($haystack,$needle). Trying to keep them straight isn't worth the hassle when you can just type php.net/function_name and get the reference.
I don't rely on intellisense, simply because I haven't found a decent IDE for PHP that does it well. Eclipse is ok, but it's not fantastic. Netbeans gives me 'PHPDoc not found' for all the built-in PHP functions whenever I install it. There's nothing that I've found so far that beats out the documentation.
The bottom line is that the ability to memorize functions isn't indicative of coding ability. Obviously there's a key set of basic functions that a good programmer will know just from extensive usage over time, but I wouldn't base a hiring decision on whether someone knows substr_replace vs. str_replace from memory.
Because I've read either the documentation, or articles, or a book on a subject, the things I learn on a topic are organized. The result is that, if I can't bring something up from memory, I can probably find it quickly through IntelliSense or the Object Browser.
Worse comes to worst, I can pick up the book again; something these youngsters are not being taught to do.
John Saunders
Age 51
Pretty much Google + Old Projects + my memory (of course)
References will not solve your problems, though; they only cover the nuts and bolts. The higher level of problem solving is the actual "programming" part, IMHO.
I tend to use Intellisense and Resharper much more than I used to before, but this has helped my overall productivity. If I can get the idea of how I want to solve something and then use tools to get the more boring parts like class names and function signatures, why shouldn't I use the tools I have? I feel relieved that Jon Skeet has a similar approach it seems.
I rely on my bookmarks and books... and my ability to use them effectively. I have multiple books above my desk, including a copy of the ISO C90 standard. Moreover, I use Xmarks to have access to my bookmarks wherever I go. Sometimes, I make a pdf out of a particular page and upload it to my web-site if it is important enough.
Sometimes the information provided by the resources I use makes its way into my terrible memory... maybe.

When has a human-based service replaced an automated one? [closed]

I'm looking for examples where a human-based service replaced an automated one. For example, machine translation (although suboptimal in quality) largely replaced human translation in many areas -- can anyone think of where the opposite has occurred (especially with regard to today's industry)?
Edit: Before you downvote because this doesn't have the keyword C++, my reasoning is that programmers invariably create these technologies, and programmers are the ones who are either 1) displaced by the revival of human service, or 2) asked to somehow integrate the human element in a service. When there are questions like this one, it doesn't make sense to downvote this (unless you downvoted that, too).
reCAPTCHA is, I think, a direct counter-example to your machine translation example (since it's a form of visual translation, so to speak).
Perhaps the Google Image Labeler counts.
I recall something about Yahoo running a "humans do simple tasks online for you cheaply, in place of CPU cycles" scheme.
Crowdsourcing in general might be something similar to what you're thinking of.
Perhaps not the most exciting examples, but certainly among the most common, used every day by pretty much everyone. Non-trivial as well.
Electronic stability control (braking/steering control) in many (most?) automobiles
Auto-focus in some digital cameras.
It is an unlikely situation, because for a machine to have been given the responsibility in the first place you would expect it to have been sufficiently good/cheap/etc. at the task. So for it to then be replaced by a human, the human approach would need to have somehow become better or cheaper at a faster rate than the technology-driven one.
That's not to say that there isn't an example of this of course. I think it is an interesting question.
I've heard there are CAPTCHA-resolving centers in India. That may be a lie though.
Actually, I have an interesting example of this.
A few years ago I was working for a company writing a dive school system for a hotel in Sharm El Sheikh (in Egypt), along with a new "front of house" system. We were trying to come up with a fast check-in system for checking in an entire busload of divers arriving from the airport. The thing is, labor in Egypt is very, very cheap, and in the end it was more costly to do this with a computer system than to continue with the old manual system of simply having 15-20 staff doing the checking in.

Good challenges/tasks/exercises for learning or improving object oriented programming (OOP) skills [closed]

What is a good challenge to improve your skills in object oriented programming?
The idea behind this poll is to provide an idea of which exercises are useful for learning OOP.
The challenge should be as language-agnostic as possible, requiring either little or no use of specific libraries, or only the most common of libraries. Try to include only one challenge per answer, so that a vote will correspond to the merits of that challenge alone. It would also be nice if the level of skill required were indicated, along with the rationale behind why it is a useful exercise.
Solutions to the challenges could then be posted as answers to a "How to..." question and linked to from here.
For example:
Challenge - implement a last-in-first-out stack
Skill level - beginner
Rationale - gives experience of how to reference objects
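To make the format concrete, here is one possible shape of a beginner solution to that example challenge, sketched in Python (my illustration, not a prescribed answer):

```python
class Stack:
    """A minimal last-in-first-out stack."""

    def __init__(self):
        self._items = []

    def push(self, item):
        self._items.append(item)

    def pop(self):
        if not self._items:
            raise IndexError("pop from empty stack")
        return self._items.pop()

    def peek(self):
        if not self._items:
            raise IndexError("peek at empty stack")
        return self._items[-1]

    def is_empty(self):
        return not self._items


s = Stack()
s.push(1)
s.push(2)
assert s.pop() == 2   # last in, first out
```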
Building Skills in Object-Oriented Design is a free book that might be of use.
The description is as follows:
"The intent of this book is to help the beginning designer by giving them a sequence of interesting and moderately complex exercises in OO design. This book can also help managers develop a level of comfort with the process of OO software development. The applications we will build are a step above trivial, and will require some careful thought and design. Further, because the applications are largely recreational in nature, they are interesting and engaging. This book allows the reader to explore the processes and artifacts of OO design before project deadlines make good design seem impossible."
Write a challenging program from scratch. Try to get some people (around five, that should be doable) to use it. Respond to their change requests.
Adapt your program's design. Start small, then watch it grow. Manage this growth. This is hard. You will also have to fix bugs and maintain the thing over time, which for me was a very valuable lesson.
Challenge: Write a wrapper for a web site/service API of your choice, in your language of choice, that doesn't already exist (e.g. a ZenDesk API wrapper written in C#). Release the wrapper as open source for others to use.
Skill Level: Beginner to Intermediate
Rationale: To learn how to extrapolate a 3rd party web service API into a meaningful set of objects/classes, making the reuse of that API easier in your chosen language.
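A rough Python skeleton of what such a wrapper might look like (the service, endpoints and fields below are made up for illustration; a real wrapper would follow the target API's documentation):

```python
import requests

class TicketClient:
    """Hypothetical wrapper around a fictional ticketing REST API."""

    def __init__(self, base_url, api_token):
        self.base_url = base_url.rstrip("/")
        self.session = requests.Session()
        self.session.headers["Authorization"] = f"Bearer {api_token}"

    def get_ticket(self, ticket_id):
        # expose a raw HTTP call as a plain method returning a dict
        resp = self.session.get(f"{self.base_url}/tickets/{ticket_id}")
        resp.raise_for_status()
        return resp.json()

    def create_ticket(self, subject, body):
        resp = self.session.post(
            f"{self.base_url}/tickets",
            json={"subject": subject, "body": body},
        )
        resp.raise_for_status()
        return resp.json()
```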
After you have learned the basics, study the "Gang of four" design patterns book.
http://www.amazon.com/Design-Patterns-Object-Oriented-Addison-Wesley-Professional/dp/0201633612/ref=pd_bbs_sr_1?ie=UTF8&s=books&qid=1221488916&sr=8-1
This is a classic, and a must read for any coder who wants to understand how to use OO to design elegant solutions to common coding problems.
Take a piece of code written in a procedural style and try to transform it into an OOP-based solution. During the process, consult a book on refactoring and design patterns. A friend of mine was able to make a huge step forward in understanding object-oriented concepts exactly this way. As with anything, this might not work for everyone.
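As a toy illustration of the kind of transformation meant here (my own example, not from the book): procedural code that passes a record around can be folded into a class that owns its data and the operations on it.

```python
# Procedural style: free functions passing a dict around.
def make_account(owner):
    return {"owner": owner, "balance": 0}

def deposit(account, amount):
    account["balance"] += amount


# Object-oriented style: data and behaviour live together.
class Account:
    def __init__(self, owner):
        self.owner = owner
        self.balance = 0

    def deposit(self, amount):
        self.balance += amount


acct = Account("Ada")
acct.deposit(100)
```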
I have found CRC cards to be quite effective in learning, teaching and building good OO design.
Certainly a good challenge, although less accessible than a "start from scratch" assignment, is to refactor some existing code that either doesn't use inheritance or uses very little of it to make greater use of inheritance. The process of refactoring will expose a lot of the benefits and gotchas of OOP, as it certainly has for me on my most recent project. It also pushed me to understand the concepts better than past projects did where I created my own object-oriented designs.
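For instance (a toy example of my own, not from that project), two near-duplicate classes can often be refactored so the shared behaviour moves into a base class:

```python
class Report:
    """Base class holding the behaviour the two report types used to duplicate."""

    def __init__(self, rows):
        self.rows = rows

    def total(self):
        return sum(self.rows)

    def render(self):          # subclasses override only what actually differs
        raise NotImplementedError


class CsvReport(Report):
    def render(self):
        return ",".join(str(r) for r in self.rows) + f",{self.total()}"


class HtmlReport(Report):
    def render(self):
        cells = "".join(f"<td>{r}</td>" for r in self.rows)
        return f"<tr>{cells}<td>{self.total()}</td></tr>"
```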
A given task has very little to do with being "OOP"; it's more in how you grade it.
I would look at the Refactoring book, chapter 3, and make sure none of the bad code smells exist in the solution. Or, more importantly, go over ones that do apply.
Most importantly, watch for the existence of setters and getters (indicating that you are operating on values from a class rather than asking the class to operate on its own values), or using "extends" without applying the Liskov Substitution Principle, stuff like that.
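A small Python illustration of that point (my example, not from the Refactoring book): instead of pulling values out with getters and doing the work outside the class, tell the class to operate on its own data.

```python
class Order:
    def __init__(self, prices):
        self._prices = prices

    # "Ask" style: expose the data and let callers compute with it.
    def get_prices(self):
        return self._prices

    # "Tell" style: the object operates on its own values.
    def total(self, tax_rate=0.0):
        return sum(self._prices) * (1 + tax_rate)


order = Order([10.0, 20.0])

# smell: logic about Order's data lives outside Order
subtotal = sum(order.get_prices()) * 1.08

# better: ask the object for the result it owns
subtotal = order.total(tax_rate=0.08)
```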

Coming up to speed on the programming environment [closed]

I'm not a full-time software guy. In fact, in the last ten years, 90% of my work was either on the hardware or doing low-level (embedded) code.
But the other 10% involves writing shell scripts for development tools, making kernel changes to add special features, and writing GUI applications for end-users.
The problem is that I find myself facing significant holes in my knowledge, often because it's been years since I've done "X", and I've either forgotten, or the environment has changed.
Every so often, there are threads on TheDailyWTF.com along the lines of "WTF: the guy spent all day writing tons of code, when he could have called foobar() in library baz". I've been there myself, because I don't remember much beyond the #include <stdio.h> stuff (for example), and my quick search somehow missed the right library.
What methods have you found effective to crash-learn and/or crash-refresh yourself in programming environments that you rarely touch?
Ask developers you know that work in the environment that you are interested in.
Search the web a lot.
Ask specific questions in relevant IRC channels (Freenode is great).
Ask specific questions on StackOverflow and other sites.
There really isn't any substitute for being "in the daily flow" of the programming environment in question. Having a good feel for the current state of the art is something you only get from experience, as I'm sure you can verify in your own areas of expertise.
I try to keep up with general news about languages I'm interested in but aren't necessarily using at the moment. Being able to follow the general changes helps a lot when you have to pick a language up again.
Beyond that, I personally find it easiest to grab an up-to-date reference book and code a few basic things to get used to the environment again; e.g. as a web programmer I'd make a simple CRUD app or a quick web service/client.
For frameworks/APIs (such as a JavaScript framework or a widget library):
Quickly scan through the entire API documentation; get a glimpse of all that's out there instead of picking the first method that seems to fit your needs.
If available, glance at the source code of the framework to see how the API was intended to be used. Seeing what's behind the curtain helps. And also, some of the methods will have been used internally, showcasing their true intents.
Don't necessarily always trust existing code (Googled, from co-workers, from books) since not everyone does the due diligence to find out the most proper way to use an API. Sometimes even samples in API documentation can be out-of-date.
In newer full-featured environments like Java, .NET, and Python, there are library solutions to almost every common problem. Don't think "how can I program this in plain C", but "which library solves this problem for me?" It's an attitude shift. As far as resources, the library documentation for the three environments I mentioned are all good.
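A trivial Python example of that attitude shift (my illustration): rather than hand-rolling a CSV parser the way you might in plain C, reach for the library that already solves the problem.

```python
import csv
import io

raw = "name,score\nada,90\ngrace,95\n"

# "Which library solves this for me?" -- the csv module already handles
# quoting, delimiters and edge cases a hand-written split(",") would miss.
rows = list(csv.DictReader(io.StringIO(raw)))
print(rows[0]["name"], rows[0]["score"])   # ada 90
```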
The best solution I think is to get a book on the topic / environment you need to catch up on.
Ask questions from developers who you know who have the experience in that area.
You can also check out news groups (Google Groups makes this easy) and forums. You can ask questions, but even reading 10 minutes of the latest popular questions for a particular topic / environment will keep you a little bit "in the know".
The same thing can go for blogs too if you can find a focussed blog. These are pretty rare though and I personally don't look to blogs to keep me "in the know" on a particular environment. (I personally find blogs most popular and interesting in the "here's something neat" or "here's how I failed and you can avoid it" or "general practice" areas.)
In addition to the answers above, I think what you are asking for will take a significant amount of your time, and you must be willing to spend that time to achieve your goals. My method would be pretty much the same as Owen's answer: get a reference book or tutorial and work through the examples, hacking in changes as you go to experiment with how any given thing works. I'd say as a bare minimum, allocate an hour to do this every other day, at a time when you know you won't be interrupted. Any less, and you'll probably continue to struggle.
The best way to crash-learn is simple: just do it. Use Google to search for an "X tutorial", open your favorite browser, and start typing away. Once you've reached a certain level of comfort with X, look at other people's code; there is lots of open source out there, and there must be somebody who has used X before. Look at how they solved certain problems and learn from that. It is an easy way to verify that you are 'on the right track', or that you're doing things and thinking in patterns that other people would consider common sense.
Crash-refreshing something is much easier, since you already have a suspended learning curve. The way I do this is to keep some of the examples and projects I wrote while learning; then you can easily refresh using your own examples.
As for the library issue you mention: only improving your search skills will improve that one (although looking at how others solved it will help as well).
Don't try and pick up every environment.
Focus on the one that's useful and/or interesting, and then pick a few quality blogs to regularly read or podcasts to listen to. You'll pick up the current state of the environment fairly quickly.
Concrete example: I've been out of the Java world for a long time, but I've been put on a Java project in the last few months. Since then I've listened to the Java Posse podcast and read a few blogs, and although I'm far from a Java guru I've got back up to speed without too much trouble.
Just a thought. While working on our code we know that we need to work very hard to optimize the critical path, but on the non-critical path we usually don't spend too much effort optimizing.
From your description you are working 90% on embedded and 10% on the rest; let's assume that in 50% of the rest you are spending more time than needed. So according to my calculation you are optimizing about 5% of your workflow ...
Of course the usual Google/SO/forums search is useful before you do something new, but investing more than just that is a waste of time in my opinion, unless you want to spend some time just for fun or general education ... :), but this is another story.
By the way, I'm in the same position, and last time I needed a GUI I used MFC (because I used it some 10 years ago :) ). I perfectly understand that I would probably get better results with C# and friends, but the learning curve just doesn't justify it, especially knowing that I need to mix the C code with the GUI.