FAQ: The LabVIEW Programming Language
To paraphrase an old joke:
"Opinions about programming languages are like a***holes - everyone has one. And most of them aren't worth putting on public display."
Google "C++", "Visual Basic", "LabVIEW" or any other programming language and you will find people who hate them, people who love them, and occasionally a few unbiased people in between. Most of what is posted as fact by the hard-core fans of alternative languages on either side is just plain wrong - it simply reflects their lack of true knowledge and understanding of how the tool works.
This FAQ attempts to address some of the misinformation about LabVIEW (both good and bad) that we have encountered at various times over the years. We are unashamedly LabVIEW fans, and these answers are backed by over 20 years of successful operation as a commercial LabVIEW developer. We must know something about what we are talking about!
My programming expert says <insert language of choice> is better than LabVIEW. Are they right?
The question makes no sense without qualification - "...better than LabVIEW for what?". ALL programming languages have strengths and weaknesses, and most competent developers will know how to take advantage of a language's strengths, and how to work around its weaknesses - so ultimately any of the well known development languages can do any task reasonably well, depending upon the skills and experience of the developer.
If the assertion was "Visual Basic is better than LabVIEW for building a Retail POS interface", or "HTML5 is better than LabVIEW for building interactive web pages" we would probably agree. But when it comes to the best overall environment for building world class Measurement and Automation Systems, we believe LabVIEW has no peers! LabVIEW's strengths align with the requirements for building a modern Measurement and Automation System to a greater degree than any other environment that we know.
For another take on the same question see this link - LabVIEW or C? (Could You Rephrase the Question).
What are LabVIEW's strengths for Measurement and Automation programming?
Many and varied, and every LabVIEW programmer will probably have a different point of view, but here are our top three strengths:
- As a Dataflow language LabVIEW is inherently capable of handling multiple simultaneous processes executing in parallel (sounds just like the real world), and is inherently capable of taking advantage of modern parallel computing architectures. This is a BIG, BIG plus!
- LabVIEW has a consistent I/O model that applies to all different signal types, and is tightly integrated to hardware through high quality, consistent drivers. You spend less time wrestling your data out of your hardware, and more time doing stuff with it.
- LabVIEW encourages modular development, because every application or sub-routine has its own user interface and can be tested independent of its parent or calling framework. This type of modularity maps perfectly into the requirements of modern "agile" software engineering practices.
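For readers coming from text languages, here is a rough analogy (sketched in Python, since LabVIEW's graphical diagrams cannot be shown in text) for the first strength above. The parallelism that LabVIEW gives you simply by wiring two independent nodes must be requested explicitly in a procedural language. The instrument-read functions are hypothetical placeholders, not real driver calls.

```python
# A rough text-language analogy for what LabVIEW's dataflow model does
# automatically: two tasks with no data dependency between them are
# free to execute in parallel.
from concurrent.futures import ThreadPoolExecutor

def read_temperature():
    # Placeholder for a real instrument read (hypothetical)
    return 21.5

def read_pressure():
    # Placeholder for a real instrument read (hypothetical)
    return 101.3

# In LabVIEW, placing these two nodes on the diagram with no shared
# wires makes them parallel; in a text language you must ask for it.
with ThreadPoolExecutor() as pool:
    temp_future = pool.submit(read_temperature)
    pres_future = pool.submit(read_pressure)
    temperature = temp_future.result()
    pressure = pres_future.result()

print(temperature, pressure)
```

The point of the sketch is the extra machinery (executor, futures) that the text-language programmer must manage by hand; on a LabVIEW Block Diagram the equivalent parallelism falls out of the wiring for free.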
What are LabVIEW's weaknesses for Measurement and Automation programming?
So that no one can accuse us of being unreasonably biased, we've also come up with three weaknesses:
- Dataflow is arguably LabVIEW's greatest strength, but from another perspective it can also be its greatest weakness, IF the programmer does not properly understand the Dataflow paradigm. That failure of understanding is behind most of the poorly performing LabVIEW code that has ever been written. If you make the effort to understand and adapt to the Dataflow paradigm you will unlock tremendous power - if you don't it can bite you.
- There aren't many aspects of text programming we miss, but even the most ardent LabVIEW fan would probably agree that a deeply nested Case structure is less transparent to interpret than a well laid out "If...Then...Else" scripting statement. If that concerns you, do what we do and use a LabVIEW MathScript Node to implement text style math and decision logic where it makes sense.
- A parallel Dataflow language is inherently less memory efficient than a purely sequential Procedural language (like C or C++) because there will be occasions when parallel execution will require that multiple copies be made of the same piece of data. LabVIEW minimizes data copying by implementing intelligent compiler routines, but a well written and optimized C program will almost always be more memory efficient than its LabVIEW equivalent. BUT in terms of modern memory capacity and architectures this difference does not significantly impact performance, except at the most extreme edges of memory manipulation. For most applications, most of the time, it is a complete non-issue. And as the trade-off for ease of implementing parallel execution systems we would put up with slightly less efficient memory management every day of the week.
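To illustrate the second weakness above: banded decision logic that would need several nested Case structures on a Block Diagram is only a few transparent lines in text form - which is exactly the kind of logic you might delegate to a MathScript Node. The thresholds and band names below are hypothetical, chosen purely for illustration.

```python
# Banded decision logic as a text-style "If...Then...Else" chain.
# In a LabVIEW Block Diagram the same logic would require nested
# Case structures, one per threshold. (Hypothetical thresholds.)
def classify_reading(value):
    if value < 10.0:
        return "low"
    elif value < 50.0:
        return "normal"
    elif value < 90.0:
        return "high"
    else:
        return "alarm"

print(classify_reading(72.0))  # "high"
```

Each added band is one more line of text, but one more level of diagram nesting - which is why we reach for a MathScript Node when the decision tree gets deep.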
But LabVIEW isn't really a programming language [This one is usually presented as a statement - as if it somehow trumps all other opinions]
Is LabVIEW a programming language, or just a "high-level application development tool"? This ranks right up there on our don't really know, and don't really care scale.
We've heard arguments for, and arguments against. If you get a group of hard core IT experts in a room they will argue endlessly about the theoretical characteristics of a "true" programming language. We've even heard people argue that C and C++ aren't programming languages - that the only true programming language is assembler, and everything else is just a high level tool.
We take a more pragmatic approach. A programming language is a medium for building an arbitrarily complex set of logical instructions into a task that a computer can execute to achieve a desired outcome. Every project that we do involves making a CPU or logic controller perform a different and varied set of tasks, and LabVIEW is how we do it. LabVIEW is a programming language - no further correspondence will be entered into!
What does it mean that LabVIEW is a Dataflow language?
LabVIEW is a different kind of programming language from most other mainstream languages like C, C++, VB etc. The LabVIEW compiler is based on the Dataflow model of code execution, whilst other mainstream languages are typically compiled according to Procedural models of code execution.
The Dataflow model allows for the execution of multiple processes in parallel, and therefore generates code that is well structured for execution on modern parallel computing architectures. It also means that LabVIEW coding techniques map very well into real-world Measurement and Automation applications that include multiple parallel data streams and control tasks. Procedural code is compiled according to a strictly linear execution model, and does not map as neatly into modern parallel computing architectures or complex real-world applications.
The Dataflow paradigm is a real strength of LabVIEW for programming Measurement and Automation applications, BUT it does require that the programmer adjusts the way they think about structuring the task vs the strictly linear procedural approach.
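The shift in thinking mentioned above can be made concrete with a toy sketch of the Dataflow execution rule: a node fires as soon as all of its inputs have arrived, regardless of where it appears in the program text. The mini-interpreter and the example "wiring diagram" below are hypothetical, purely to illustrate the rule.

```python
# A toy sketch of the dataflow execution rule: a node executes once
# all of its inputs are available. (Hypothetical mini-interpreter,
# for illustration only - not how the LabVIEW compiler works.)
def run_dataflow(nodes):
    """nodes: {name: (func, [input_names])}; returns {name: value}."""
    results = {}
    remaining = dict(nodes)
    while remaining:
        for name, (func, inputs) in list(remaining.items()):
            if all(dep in results for dep in inputs):  # inputs ready?
                results[name] = func(*[results[d] for d in inputs])
                del remaining[name]
    return results

# "Wiring diagram": scale and offset both depend only on raw, so they
# may execute in either order (or in parallel); sum must wait for both.
graph = {
    "raw":    (lambda: 5.0, []),
    "scale":  (lambda x: x * 2.0, ["raw"]),
    "offset": (lambda x: x + 1.0, ["raw"]),
    "sum":    (lambda a, b: a + b, ["scale", "offset"]),
}
print(run_dataflow(graph)["sum"])  # 16.0
```

Notice that execution order is dictated by the data dependencies (the "wires"), not by the order in which the nodes are written down - this is the structural adjustment a Procedural programmer has to make.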
My engineers already know C (or C++ etc). Why do they need LabVIEW training?
LabVIEW is a Dataflow language and is based on a different coding paradigm to Procedural languages like C, C++, VB etc. As noted in other questions, the Dataflow paradigm is a real strength of LabVIEW for Measurement and Automation programming, but it does require a subtle shift in how the programmer thinks about structuring code vs the Procedural approach.
Experienced Procedural language programmers will need to adjust their thinking slightly to take full advantage of LabVIEW.
LabVIEW is so easy to use - you don't need to be a programmer to program it [and many similar variants]
There are many variants of this question or statement, all emphasizing LabVIEW's ease of use and then suggesting to varying degrees that you don't need to be a real programmer, or you don't need programming skills, to use it.
This is a very unfortunate side effect of the historic emphasis on LabVIEW's ease of use in NI's own marketing material. It has led some engineering managers to make the unwarranted assumption that they can take any engineer, add LabVIEW, and expect that they will create the company's next big software application in record time, regardless of that person's previous experience (or not) with I/O programming in general, and LabVIEW programming in particular. This rarely has a happy ending.
It is true that a beginner LabVIEW programmer can often create a simple application in a surprisingly short time (the same may also be true of some beginner C or C++ programmers). However, complex applications are complex by nature, and require programmers with experience in how to interleave all the pieces, while preserving the timing and synchronisation necessary to make it all work correctly, and anticipating all the unexpected external events that may impact the program execution. LabVIEW won't make a complex application simple.
Don't short change your commitment to LabVIEW by short changing your investment in the skills of your developers, whether they be in-house (you need to invest in training), or contractors (you need to invest in a good one). Like any large and powerful programming language, LabVIEW continues to evolve. Mastery takes time, and never really stops. We've been doing it for more than 20 years, and we still learn something new every month.
There is also another unfortunate consequence of the emphasis on ease of use. While we would not argue with the basic contention, that for most engineering people LabVIEW is "easy to use", we think there has been an overemphasis on ease of use, to the detriment of people's perception of LabVIEW as a tool for "power programmers". So the argument goes something like "...because LabVIEW is so easy to use, it can't be any good for real power programming stuff". WRONG!
OK then - give me an example of LabVIEW's power.
The LabVIEW compiler uses a parallel processing model, so LabVIEW code is structured in a way that maps directly into today's modern parallel processing computer hardware and software architectures. This is not to say that procedural code (like C) cannot be implemented on parallel computing architectures - it clearly can, but it is a difficult task that requires a very high level of skills and experience. The relative skill level required by a procedural programmer to write good parallel executing code is significantly higher than that required by a LabVIEW programmer, and in our experience significantly higher than the skill level actually attained by most casual engineering programmers.
The most simple LabVIEW program developed by a beginner LabVIEW programmer on day one is already multi-threaded, and will run more efficiently on a multi-core or multi-processor CPU. It is undeniable that LabVIEW programmers as a group are better placed to take advantage of modern parallel processing computer architectures than procedural programmers as a group.
I've been told LabVIEW is "Self-Documenting". Is this true?
This would have to be the holy grail for any programmer - a programming language that is truly self-documenting. No matter how much programming theory and practice have improved over the years, documentation remains the same old chore it has always been.
No - LabVIEW is not self-documenting. It is true that NI has made it relatively easy to add elements of documentation to the Block Diagram as you code, and has also made it easy to collect and print all those elements at the end of a project. It is also true that a well laid out graphical Block Diagram can document broad program flow and modular architecture in a way that pages of text code simply cannot. However not all Block Diagrams are well laid out, and a badly laid out Block Diagram can be just as confusing and unintelligible as badly constructed text code.
As always with documentation, what you get out depends on what you put in.
My IT Manager says LabVIEW isn't suited to big projects with serious software engineering requirements. Are they right?
The perception that LabVIEW does not fit well into mainstream software engineering environments is completely unfounded. This view comes in part from the fact that very few IT Managers have any meaningful experience with LabVIEW, and in part from the fact that most traditional software engineering tools are understandably biased towards text based languages. In fact, LabVIEW code suits the specific requirements of modern "Agile" coding methodologies better than most traditional procedural languages.
There is no technical barrier that limits how big or how complex a LabVIEW project can be. We have routinely worked on LabVIEW applications that are equivalent to tens of thousands of lines of C code, or even more.
The issues that are important in managing large projects are exactly the same for LabVIEW as for any other language, and most tools and techniques can be adapted to LabVIEW's graphical code base with minimal effort. Traditional source code control tools may have some bells and whistles that are optimised for text languages, but they can all do the basics of source code control for graphical LabVIEW code. We use Perforce as our source code control tool of choice. The integrated text search tools aren't much use, but the LabVIEW Professional System includes a very capable graphical search tool as an alternative.
LabVIEW is a proprietary language - is this a cause for concern?
Only if you are also concerned with the fact that Oracle owns Java, Microsoft owns .NET and Visual Studio, and Apple owns Objective-C. In our experience the whole proprietary vs non-proprietary vs open source thing is an overrated issue.
The reality is that commercial measurement and automation programming is dominated by proprietary vendor-supplied tools and languages, and proprietary frameworks wrapped around otherwise open-source languages. For good reason - most commercial clients want certainty of supply, certainty of support, protection for their own IP, and do not want exposure to elements of open source development that are not relevant to their immediate needs.
The fact that LabVIEW is proprietary is not relevant to making a decision about the best tool to build a measurement and automation system with a projected commercial life of (say) 5 - 20 years. The only things that are important are how well LabVIEW integrates with the rest of the measurement and automation environment as it exists now, and what commitment the vendor has demonstrated to evolving and supporting LabVIEW into the future.
LabVIEW is a very open language that is designed to integrate with other industry standard tools like C/C++, .NET, VHDL, and any well-supported industry-standard communications protocol. LabVIEW is an industry standard to the extent that many hardware manufacturers supply their products with LabVIEW drivers - although not always good ones. Given that LabVIEW can call external code libraries there is almost no hardware that LabVIEW cannot connect to.
In the 20+ years that we have worked with LabVIEW we can say that NI's commitment to evolving LabVIEW, embracing integration with new third-party hardware and software technologies as they come to market, and to providing legacy support and upgrade pathways for LabVIEW based systems is second to none in the industry.