I keep wondering when National Instruments LabVIEW will reach some sort of tipping point, especially in embedded programming. I can remember learning to program in C and how hard it was to keep track of the big picture while coding in the weeds, not to mention how arcane relay ladder logic is. I’ve been fascinated by LabVIEW since I first saw it in 1998. The problem is that it’s proprietary. But it’s powerful, it lets you work at an abstraction layer above the details so you can keep the big picture in mind, and its dataflow model makes it ideal for parallel programming that can take advantage of today’s powerful multicore processors and FPGAs.
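Even though G is graphical, the dataflow idea is easy to sketch in text. Here is a minimal, purely illustrative Python example (not how G works under the hood): two nodes with no dependency on each other can run in parallel, and the downstream node fires only when both of its inputs are ready.

    # Illustrative only: a tiny dataflow-style acquire/analyze pipeline.
    # The two "acquire" nodes have no data dependency on each other, so they
    # can execute concurrently; analyze() fires only once both inputs have
    # arrived, just as a dataflow node waits for all of its input wires.
    from concurrent.futures import ThreadPoolExecutor

    def acquire_temperature():
        return 21.5            # stand-in for a real measurement

    def acquire_pressure():
        return 101.3           # stand-in for a real measurement

    def analyze(temp_c, pressure_kpa):
        return {"temp_c": temp_c, "pressure_kpa": pressure_kpa, "ok": pressure_kpa > 100}

    with ThreadPoolExecutor() as pool:
        temp_future = pool.submit(acquire_temperature)
        pres_future = pool.submit(acquire_pressure)
        result = analyze(temp_future.result(), pres_future.result())

    print(result)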
In honor of the 25th anniversary of LabVIEW, here is a letter from Jeff Kodosky, NI Technical and Business Fellow, Co-Founder and Father of LabVIEW.
Dear Industry Colleague,
LabVIEW is 25 this year, so it’s a perfect time to pause for reflection and to speculate on the future.
From the beginning, we thought we were on to something special. The first Mac had recently been released, bringing with it the first mainstream GUI and mouse. We knew that personal computers held great potential for use by scientists and engineers, and we were confident that the GUI and mouse would be the primary way people would interact with their computers.
Our thoughts for LabVIEW were simply to combine the power of the PC with the expressiveness of the GUI and hide the unnecessary complications of programming. We wanted to provide a tool that would be as useful to scientists and engineers automating measurements as the spreadsheet was for people working with financial data. Nobody asked us to build LabVIEW. We made it with a “build it and they will come” mentality. We wanted to build a tool that offered a step-function improvement in productivity in designing test and measurement systems.
We didn’t set out to create the G programming language, but that’s where we ended up. We realized we needed that level of flexibility and control to enhance engineers’ productivity and support all of the types of I/O and processing our customers would require. After much deliberation, we settled on a graphical, structured dataflow representation that more closely resembled actual circuits and block diagrams compared to traditional forms of programming.
It’s rewarding to see the many ways our customers have used LabVIEW. From detecting cancer earlier to providing the blind with experiences once thought impossible. From conducting the most advanced experiments in the world that aim to uncover the mysteries of the universe to inspiring our children to be future technology leaders. In return, our customers inspire us to maintain our mission and provide them with continuous productivity improvements and access to the latest technologies.
Until now, most of what we’ve seen in the mainstream market are multiple homogenous cores on a single chip. Specialized processors like GPUs have been reserved for specific tasks and haven’t been readily accessible to programmers. Looking to future technologies, I expect to see more specialized processing cores working alongside general-purpose CPUs to meet the increasing demand for processing power. Additionally, as FPGAs evolve, they will become more prevalent and more capable. To harness the available power, engineers will need productive tools that assist them in partitioning and targeting logic to the appropriate processors. Once harnessed, this increased processing power will help engineers do in real time what they previously had to do offline.
Wireless and mobile technologies are prevalent in our everyday lives. These technologies make it possible to deploy intelligent, sensor-rich devices over large areas. Next-generation networking technologies like IPv6 and high-speed wireless networks promise to support what will become an enormous number of connected devices capable of acquiring and transferring unthinkable amounts of data. I expect this technology could ultimately help prevent hunger and disease, preserve natural resources, and improve our quality of life by bringing in additional real-world data that enables more timely and better decisions and more accurate models.
The way people interact with software is another area of rapid technological advancement. In the future, I imagine we’ll see much more efficient input mechanisms. Just as the mouse took over many input duties from the keyboard, I expect to see touch- and gesture-based input become a more prominent aspect of software interaction. Software will become more intuitive, reducing the learning curve as it continues to incorporate physical metaphors.
We made a decision 25 years ago to base the LabVIEW core programming language on structured data flow. The inherent parallelism of data flow was a natural fit for the acquisition-analysis-presentation problem our customers were solving. When the industry moved to multicore machines, that decision made us look downright prescient. Still today and in the future, LabVIEW is naturally positioned to integrate these next-generation technologies and many more that engineers and scientists will rely upon to meet the most difficult challenges they face. That being said, we still have a lot of work to do. As our mission is to equip our customers with tools that accelerate their productivity, innovation and discovery, we are excited to see what the future will bring.
The future is bright.
Jeff K
I like the concept. You did a great job!
Good work!
Best regards!
LabVIEW isn't even close to a tipping point in embedded, and never will be. For one, most embedded work is still 8-bit and 16-bit, and most 32-bit work is on single-chip MCUs. Likewise, embedded multicore isn't common, and won't be for a very long time.
Also, programming isn't inherently two dimensional, and usable programming languages have advanced a lot since C. "Graphical" isn't inherently magical.
Good comment. "Everyone" knows C. Heck, even I've written programs in C. It's a nice, compact language that can be squeezed into narrow places. However, there is a big world of higher-level embedded systems that makes LabVIEW intriguing. It may not have enough targets right now, and that may limit its growth. But the ability to combine system design with programming is a powerful idea.
I'm in wait-and-see mode. But NI is quietly aggressive. It'll be interesting.
When you get into "big" systems, there are other options. I think a great option is a high-level dynamic language (Python, Lua/LuaJIT, etc.) plus C/C++. In fact, I know of quite a few embedded systems already using this approach. LabVIEW isn't dynamic.
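To make that hybrid concrete, here is a minimal sketch of the dynamic-language side calling into C via ctypes. The library name (libsensor.so) and its read_sample() function are hypothetical stand-ins, not a real API; the point is only the division of labor: C does the timing-critical acquisition, the dynamic language does the orchestration.

    # Illustrative only: Python "glue" calling a hypothetical C shared library.
    import ctypes

    lib = ctypes.CDLL("./libsensor.so")          # hypothetical library name
    lib.read_sample.restype = ctypes.c_double    # double read_sample(int channel);
    lib.read_sample.argtypes = [ctypes.c_int]

    def read_channel(channel):
        # The dynamic-language layer handles logic and error handling,
        # while the C side does the fast, low-level work.
        value = lib.read_sample(channel)
        if value < 0:
            raise RuntimeError("channel %d returned error %f" % (channel, value))
        return value

    if __name__ == "__main__":
        print(read_channel(0))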
The best approach to multi-core is still up in the air; NI's dataflow approach (similar to the Actor model available in Erlang) isn't necessarily the best approach for many applications.
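For anyone unfamiliar with the comparison, here is a rough, purely illustrative Python sketch of actor-style message passing (Erlang processes are far lighter-weight; this is only an analogy): an actor owns its own state and reacts to messages from a mailbox, with no shared memory.

    # Illustrative only: an actor-style worker built from a thread and a queue.
    import queue
    import threading

    def averaging_actor(inbox):
        # The actor owns its state (total, count) and updates it only in
        # response to messages; nothing else touches that state.
        total, count = 0.0, 0
        while True:
            msg = inbox.get()
            if msg is None:      # shutdown message
                break
            total += msg
            count += 1
            print("running average: %.2f" % (total / count))

    inbox = queue.Queue()
    worker = threading.Thread(target=averaging_actor, args=(inbox,))
    worker.start()

    for sample in (1.0, 2.0, 6.0):
        inbox.put(sample)        # send messages to the actor's mailbox
    inbox.put(None)
    worker.join()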
As far as the systems approach goes, LabVIEW already has stiff competition from other sources such as MATLAB/Simulink and VisSim. If I were betting, I'd probably put my money on MATLAB/Simulink over LabVIEW in the embedded market.
I'm also a bit of a system design skeptic. It definitely has its place (although I think a good argument can be made that functional languages are a better approach for high reliability), but I think there are plenty of applications where it's not the best choice.
At least from the advertising, it looks like a systems engineering approach (in software development terms, a waterfall or CMM-style approach) where the end target is fixed. But in many cases, how to deal with change, including changing requirements, is a big deal.