In early December, I attended the 3GPP RAN #74 TSG plenary meeting in Vienna, Austria. This was the last plenary before the official 5G work item (WI) kicks off at the RAN #75 meeting in Dubrovnik, Croatia, next March. The 3GPP membership has poured a lot of investment into studying new technologies and methods to meet the 5G architecture requirements, and with the first big milestone just three short months away, here are a few high-level takeaways from the meeting.
First, many companies continue to push their concepts and technologies for inclusion into the first WI. However, time is running out. In March, the 5G Phase 1 WI will start and serve as the base for the initial 5G specification (3GPP Release 15). Although the 3GPP is planning for Phase 2 to start 18 months after Phase 1, use cases or technologies not included in Phase 1 must wait an additional 18 months, which could be commercially challenging for some.
On the opposite side, the 3GPP leadership proposed narrowing the scope of some of the work in Phase 1 to increase the probability of meeting the March deadline and ultimately the Sept 2018 finalization goal. Although no consensus was reached in Vienna, it is clear something has to give in order to move forward. With time the ultimate equalizer, innovation may need to be tempered with the reality that the study items must be completed and consensus reached before the definition phase can begin.
In parallel, the 3GPP continues to evolve LTE 4G particularly for the IoT (NB-IoT), MTC (LTE-MTC), and V2X use cases. In fact, some have proposed delaying 5G work related to these use cases to assess whether these new LTE evolved technologies can address the IMT-2020 requirements. The 3GPP has already signaled that a primary use case for Phase 1 will be eMBB (enhanced Mobile Broadband), and this may be the major achievement of the Phase 1 work.
Also of interest, eMBB is looking more like a mmWave or cmWave system utilizing multi-carrier OFDM and up to 8 component carriers with a minimum of 100 MHz of bandwidth. Consensus was reached on channel coding, with LDPC proposed for data and polar coding targeted for the control channels. Requiring two coding methods is curious: a mobile device must implement both in its physical layer and switch between them depending on the state of the link, increasing cost and adding complexity. Although each method has its merits, it will be interesting to see if the status quo goes unchecked in March.
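The split is easier to see with a concrete, if toy, example. The sketch below uses a tiny (7,4) Hamming code as a stand-in — the actual NR LDPC and polar codes are vastly larger and were still being specified — to show the one job any parity-check-based receiver performs: accepting exactly the words whose syndrome is zero. A dual-code device carries two such check structures and switches between them.

```python
from itertools import product

# Toy illustration only: a (7,4) Hamming parity-check matrix stands in for
# the far larger NR LDPC code. A word c is a valid codeword iff H*c = 0 (mod 2).
H = [
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def syndrome(word):
    """Compute H*word mod 2; an all-zero syndrome means a valid codeword."""
    return tuple(sum(h * c for h, c in zip(row, word)) % 2 for row in H)

# Enumerate every valid codeword of this toy code: 2^(7-3) = 16 of them.
codewords = [w for w in product([0, 1], repeat=7) if not any(syndrome(w))]
print(len(codewords))  # 16
```

The point of the sketch is the structural cost: a handset supporting both LDPC and polar coding must carry two independent decoder structures like this in silicon, not one.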
Finally, the 3GPP came to consensus on 5G terminology. The new 5G physical layer will officially be named “NR,” for new radio. The new 5G core network will be called “5G CN,” and a connection between NR and 5G CN will be named “NG.” (I think it is safe to say that the marketing experts were conspicuously absent from this discussion.) On to Dubrovnik!
This blog originally appeared in Microwave Journal as part of the 5G and Beyond series.
An interview with James Truchard (Dr. T) and Jeff Kodosky, co-founders of NI, on our 40th birthday as a company.
When did you know you wanted to be an engineer? And how did you meet?
Dr. T: “I was one of the Sputnik kids. Sputnik had just been launched in 1958. I graduated from high school in ‘60. And Wernher von Braun, the big hero of the US program, was a physicist, and so that meant I wanted to be physicist, too, not really knowing what physics was or what it was about. So I got a bachelors and masters in physics. Working at the University of Texas, I realized that, you know, you have to really, really be smart to be a great physicist because it’s very hard. Plus they had all these mysterious particles that they talked about and I couldn’t always see them. So I gravitated towards more pragmatic things in engineering and started designing circuits. The last year of my masters, I was working full-time. Then I continued to work full-time at the lab and in seven years, one course at a time, I got my Ph.D. in electrical engineering.”
Jeff: "I didn’t! I thought I wanted to be a physicist, a theoretical physicist.”
Dr. T: “Jeff came to Texas when Texas was really the star because we were building the Super Collider and a lot of funding was coming to Texas, and everyone was excited about what physics could do. I distracted Jeff from his career as a physicist.”
Jeff: “I grew up in New York, but I had this professor who said, ‘Apply to UT Austin, they’ve got a great graduate school in physics,’ and so that’s what brought me down here. I started as a TA and then discovered I didn’t like teaching, which should have been the first sign that I was in the wrong career. I’d heard about Applied Research Laboratories (ARL) because I knew they hired lots of students part-time there. Dr. T interviewed me and hired me, and so, basically, Dr. T is the only person I’ve ever worked for. When I started to work at ARL, that’s when I got exposed to the PDP-8 and started thinking about software. Prior to that, programming was just Fortran or assembly language…it wasn’t very interesting. But when I got to play with the computer myself, that’s when I guess the engineering side started coming out. I realized it’s a lot more fun building things than theorizing about them. I would say I was probably more a computer programmer than anything else, and Jim came up with lots of ideas, and has always come up with lots of ideas, and we would work on experiments together.”
Dr. T: “We started meeting early '76. We made a list of ten different projects we could do…and then we voted, which I view as a very random process for deciding. We said, ‘Okay, which is the best idea?’ and we picked the instrument interface. I always say that was lucky because it was right between computers and instruments…what better place to be if you want to revolutionize instrumentation with computers? It gave us access to a customer base of scientists and engineers. You know, in the rearview mirror, it all looks very simple, like success just came to us without any effort, but there were 23 companies making GPIB controllers in that timeframe when we started the…Hewlett-Packard had HP2000 computers with interfaces to them, and we essentially saw how we could do the same thing for the PDP-11, and that’s what we did.”
When did you get your big break?
Dr. T: "In November of 1979, I started full-time and I went on a sales call that first week with our rep up in New England. We visited Brown & Sharpe, had a very cordial visit, and it looked like they were going to use our product. When we got home, we had gotten a $90,000 order for this bus extender. And so we built it, we shipped it, and with the profit, we bought Jeff a PDP-11/44 computer and a $20,000 copy of the Unix operating system…”
Jeff: “..and we were off and running.”
Dr. T: “We were off and running. And then, two years later we get a call from the purchasing agent saying they had made a mistake. They were supposed to get a quote, not place an order….so, that’s where I came up with my saying, 'nothing beats dumb luck,’ and of course don’t exclude it. That really kick-started us, allowed Jeff to buy his computer, us to start full-time, and also for me to buy the reference books I wanted. I splurged at the end of that year to buy reference books. “
Jeff: "To clarify, when I started working for Jim it was '73. It was '76 when we began this big measurement system project at Applied Research Laboratories (ARL), and that’s also when Jim mentioned someone could start a business building an interface that would connect HP instruments to popular computers. So, we were working full time at ARL on this massive measurement project for the navy, and then moonlighting on NI to get it going—designing and building GPIB boards. We started designing one board in the spring and shipped it that fall. That was design, development, debugging, manufacturing…all while we were moonlighting. It boggles the imagination to see what we were able to accomplish in that short time!”
What makes this new version so special? For starters, we added NI Linux Real-Time capability for all software defined radio (SDR) products. This added capability empowers you to develop real-time algorithms for execution on the NI Linux Real-Time operating system, work with other tools to move up the protocol stack to MAC and network layers, and access the vast repositories of open source tools and technologies needed to build complete system prototypes.
We also introduced the MIMO Application Framework, a fully configurable, parameterizable physical layer written and delivered in LabVIEW application source code that helps build massive MIMO prototypes.
We’re excited to share this LabVIEW update that allows you to be more efficient and have the time to focus on what you do best – creating new technologies and solutions for future 5G systems.
The National Instruments Engineering Impact Awards recognize engineers and students who excelled in developing systems that solve problems across a range of categories.
The 2016 Engineering Impact Awards received nearly 100 submissions from almost 200 authors around the globe. NI’s technical panel of experts reviewed the papers and narrowed it down to just 15.
And the winners are…
2016 Customer Application of the Year
The University of Bristol and Lund University used the NI MIMO Prototyping System to test the feasibility of massive MIMO as a viable technology for bringing greater than 10X capacity gains to future 5G networks. In doing so, they implemented the world’s first live demonstration of a 128-antenna, real-time massive MIMO testbed and set two world records in spectrum efficiency.
Their massive win at the 2016 Engineering Impact Awards must be another new world record, as they took home five separate awards in recognition of their 5G wireless achievement!
Not only did Bristol and Lund win the 2016 Customer Application of the Year Award, they also won the Wireless Communications Award, the Hewlett Packard Enterprise Big Analog Data Award, the Xilinx FPGA Award, and a special Engineering Grand Challenges Award, sponsored by NI’s Dr. T himself, for their commitment to grand engineering challenges.
Congratulations to PhD students Paul Harris and Steffen Malkowsky for their success at the 2016 Engineering Impact Awards!
Taking home the Transportation and Heavy Equipment Award for its work on Crossrail, in partnership with Bombardier, is Frazer Nash UK.
Authored by Senior Consultant Colin Freeman, the paper explains how model-based design techniques were used to optimize everything from requirements capture through design to verification and validation testing, at both the sub-system and system level.
The test facility, Train Zero, uses NI VeriStand and PXI for both integration testing of train systems and validation of the models, allowing any changes made to the models to be easily revalidated in the same environment in which they will later run.
Congrats to Colin Freeman! Train Zero won the Transport category in the #NIWeek 2016 Engineering Impact Awards.
The proud owner of the Intel Internet of Things Award is V2i from Belgium. With the help of NI CompactRIO hardware and several sensors, the team created a real-time measurement and diagnostic tool that determines welding quality and avoids unplanned production stops due to weld tearing.
There were many more amazing projects this year that inspired us about the social impact of engineering! Every year we are blown away by the creativity, especially that of the young engineers, so NI shines a spotlight on these bright and talented people with the Student Design Competition Award.
This year’s winner was the University of Leeds with Project ALAN. This multidisciplinary team is helping stroke survivors regain lost muscle control with a commercially viable, robotic rehabilitation system.
Team ALAN with the people who helped make it happen!
Check out all the category winners and finalists and read their award-winning papers here: http://bit.ly/2bmP3nu
This summer we commissioned research agency IDC to study best practices for internet of things (IoT) implementations and how companies can prepare themselves for IoT-based operations.
Here are the best practices for taking on any IoT project as a business:
Have a clear understanding of business objectives and what business value an IoT project will deliver.
Start with an objective that has organizational pull and already has identified business value.
Have an executive champion who will make the project a priority.
Have a start-up mentality. Start small, such as with a pilot project, and establish clear milestones.
Build in security from the start. Security and privacy concerns are the number 1 hindrance to the deployment of an IoT solution.
Own the data that will result from the project, and know if you’ll manage it yourself or will need to have others manage it for you.
Connect the IT and operations teams in your org to the end customer that will be served by the IoT project. This connection along the value chain of your project provides a level of trust that will smooth its approval, implementation, and use.
Small and medium-sized businesses with limited IT departments should plan on having systems integrators and other partners work onsite as much as possible.
Use products based on standard platforms rather than custom platforms wherever possible. Standards help ensure the compatibility and scalability of the end solution.
Use products that are flexible and extensible. Ideally based on software, such products can adapt after their initial deployment to evolving requirements with no hardware changes.
Companies that have already worked to define and implement IoT projects and tap the IoT’s value are trailblazing an immature and evolving environment.
By applying these best practices for implementing IoT in your org, you can access the benefits of this huge emerging market while avoiding the significant pitfalls experienced by these trailblazers.
“I'd like to think that if actual engineers were involved in more projects, we wouldn't live in a world where it's a given that most websites, applications, apps, and embedded systems are poorly designed, overly buggy, and insecure. And though one would like to imagine that all safety-critical systems, at least, are created under the aegis of engineers and engineering principles, I have my doubts.”
This is one of the reasons we exist. We supply an integrated software-based platform that enables engineers to do exactly what Mr. Dunn describes. Our choice in this endeavor isn’t to explore a path where engineers need to become software developers. Instead, our mission is to create an engineering system design tool that empowers engineers to build world-class, complex, mission-critical, software-based systems.
LabVIEW is at the core of our engineering platform, and when coupled with our modular hardware platforms, becomes a gateway to innovation that engineers the world over are using to make products safer, get to market faster, and accomplish amazing things.
LabVIEW simplifies the development of complex engineering applications. Its native graphical language uses a concept called dataflow, in which a node executes as soon as all of its inputs are available, to define execution order, and it combines that native language with an open interface that integrates code from other software approaches. With this, we ensure that engineers can choose the approach they’re most familiar with for any individual component of the application.
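The dataflow idea can be sketched in a few lines of ordinary code. This is our own minimal illustration, not LabVIEW's internals: each node lists its inputs, and the scheduler fires any node the moment all of its inputs hold values — nodes with no mutual dependencies could run in any order, or in parallel.

```python
# Minimal dataflow sketch (illustrative names, not a LabVIEW API):
# each node is (list of input nodes, function of those inputs).
graph = {
    "read_a": ([], lambda: 3),
    "read_b": ([], lambda: 4),
    "add":    (["read_a", "read_b"], lambda a, b: a + b),
    "scale":  (["add"], lambda s: s * 10),
}

def run(graph):
    values = {}
    pending = dict(graph)
    while pending:
        # Fire every node whose inputs are all ready. The order among
        # simultaneously ready nodes is unspecified, mirroring how
        # independent nodes on a LabVIEW diagram may execute in parallel.
        ready = [n for n, (ins, _) in pending.items()
                 if all(i in values for i in ins)]
        for n in ready:
            ins, fn = pending.pop(n)
            values[n] = fn(*(values[i] for i in ins))
    return values

print(run(graph)["scale"])  # 70
```

Here "read_a" and "read_b" fire first (no inputs), then "add" once both values exist, then "scale" — execution order falls out of the data dependencies rather than from the order the nodes were written.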
We’ve invested 30 years into LabVIEW, making it one of the most productive tools on the planet. And we’re excited to share that the best is yet to come.
When paired with our software defined radio (SDR) hardware, our new MIMO system provides a well-documented, reconfigurable, parameterized physical layer written and delivered in LabVIEW source code - enabling researchers to build both traditional MIMO and Massive MIMO prototypes.
Our LabVIEW Communications MIMO Application Framework lets you develop algorithms and evaluate custom IP to solve a lot of the practical challenges associated with real-world, multi-user MIMO deployments. Scalable from 4 to 128 antennas, the MIMO Application Framework - when used with the NI USRP RIO and NI PXI hardware platforms - allows you to create small to large scale antenna systems with minimal system integration or design effort.
Researchers can use the system out of the box to conduct Massive MIMO experiments and seamlessly integrate their own custom signal processing algorithms in a fraction of the time compared to other approaches, speeding up the overall design process as the wireless industry races toward 5G.
In October, I traveled to Asia with stops in China and South Korea. While in China, I gave a talk on 5G at the Wireless Communications and Signal Processing conference (WCSP) in beautiful Yangzhou. I met with several researchers at the conference and was also able to travel to Nanjing to meet with professors at Southeast University (SEU). As you can imagine, wireless researchers are focused on mmWave and Massive MIMO.
While discussing mmWave, I conducted an informal survey with a central question: “What is your frequency range of interest?” Although my informal poll is far from definitive, there was a lot of interest in 28 GHz. This is a bit surprising considering China has yet to officially designate mmWave spectrum for 5G. Additionally, it appears that spectrum around 28 GHz could pose a challenge in this region. I expect we’ll be hearing more on this in the near future, so stay tuned.
On the Massive MIMO front, Prof Xiqi Gao at SEU is exploring a new Massive MIMO technique called Beam Division Multiple Access (BDMA). This technique uses orthogonal beams to provide access to mobile users at the base station spatially, and his team plans to build the world’s first working BDMA Massive MIMO prototype.
Next I traveled to Seoul, South Korea where I attended the International Conference on Information and Communication Technology Convergence (ICTC) in Jeju Island. My industry presentation at ICTC focused on wireless research and the role software defined radios play in the design validation and the standardization process. While the conference covered many topics, much of the discussion was centered around 5G, from network to applications.
While in Korea, I was also able to meet with several researchers around the country, including professors at UNTI, Seoul National University, Yonsei, and ETRI. All the researchers have a keen interest in 5G and a spirited interest in prototyping. It was encouraging to chat at length with several researchers about how LabVIEW Communications and the USRP RIO are impacting 5G wireless research in areas such as Full Duplex Radio, LTE and WiFi coexistence, and V2X. In particular, I want to highlight the work of Prof Chan-Byoung Chae at Yonsei University. Our lead user team has been working with Prof Chae on Full Duplex Radio for several years, and at NIDays in Seoul, Prof Chae and his students demonstrated a working full duplex radio system, built on the NI platform, capable of transmitting and receiving over a full 120 MHz-wide channel – the widest bandwidth ever demonstrated!
We’re seeing exciting developments on the 5G front every day. Next month I’ll be attending the 3GPP plenary meeting in Vienna. Look for updates from that meeting in my next post.
In early September, I attended the 3GPP RAN #73 plenary meeting in New Orleans. Even though we were in New Orleans, this was no party. Members submitted 528 documents for review, and over 250 participants from 135 different entities attended with a commitment to evolve our wireless standards to address the growing demands of the market.
Here are some key takeaways:
Release 14 (also known as LTE Advanced Pro) is on track with impactful updates for NB-IoT and eMTC. Even though not technically 5G, Release 14’s coverage of NB-IoT and eMTC has enhanced LTE’s ability to create a network of “things”.
LAA has given way to eLAA, and this type of coexistence with WiFi in unlicensed bands continues to be an interesting and viable extension for bandwidth improvement using existing infrastructure.
Release 15 – the much anticipated “5G” release – generated a lot of discussion. Not surprisingly, the 3GPP has agreed to focus on the enhanced Mobile Broadband (eMBB) use case for Release 15, and the ultra-low-latency, ultra-reliable, and massive MTC use cases will be deferred to March 2017. There will be some investigative work continuing on these use cases, but the extent and scope of this work is still to be determined.
“New Radio”, as the new physical layer is called in the 3GPP, will have a distinct focus on the eMBB use case as a priority. This aligns with the 3GPP System Architecture (SA) groups’ request from this past June. Of particular note, the RAN groups will focus on the “non-standalone” case, where New Radio takes advantage of the existing EPC core network rather than relying on a completely new architecture. New Radio investigation will also focus on spectrum below 40 GHz, with higher frequency work deferred until next March.
My time in New Orleans shows that 5G continues to progress and the narrowing of the scope coupled with the important additions to Release 14 for NB IoT and eLAA provide pragmatic evolutions to expedite next generation wireless capabilities. The 3GPP will collectively choose the official Release 15 work items in March of 2017, which will give us a good picture of the initial 5G spec.
Every summer, the dog days of August hit Texas, where each new day resembles the last - hot, dry and sunny. Don’t get me wrong, I love the sun and the outdoors, but the heat can start to wear on you during the long summer months. For us wireless folks, however, this summer was unlike any other – full of surprises, world records, and teeming with the promise of new mountains to climb.
On July 14th, the FCC announced new rules to open up 11 GHz of spectrum for flexible, mobile, and fixed-use wireless broadband – 3.85 GHz of licensed spectrum and 7 GHz of unlicensed spectrum. The new rules create a new Upper Microwave Flexible Use service in the 28 GHz (27.5-28.35 GHz), 37 GHz (37-38.6 GHz), and 39 GHz (38.6-40 GHz) bands, and a new unlicensed band at 64-71 GHz. This move looks beyond current generations of mobile communication to higher data rates and reliability, lower latency, device-to-device communication, more flexible spectrum usage, and multiple-antenna transmission. In doing so, the FCC has extended the capabilities of 5G wireless access and definitively put the United States at the forefront of the 5G race.
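As a quick sanity check, the band edges quoted above do add up to the headline figures — this is just arithmetic on the ranges in the FCC announcement:

```python
# Band edges in GHz from the FCC's July 2016 Spectrum Frontiers rules.
licensed = [(27.5, 28.35), (37.0, 38.6), (38.6, 40.0)]   # 28, 37, 39 GHz bands
unlicensed = [(64.0, 71.0)]                              # new unlicensed band

lic = sum(hi - lo for lo, hi in licensed)
unlic = sum(hi - lo for lo, hi in unlicensed)

# 3.85 GHz licensed + 7.0 GHz unlicensed = 10.85 GHz, the "11 GHz" headline.
print(round(lic, 2), round(unlic, 2), round(lic + unlic, 2))
```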
The next day, FCC chairman Tom Wheeler, the National Science Foundation, non-profit organization US Ignite, and representatives from several companies including NI announced the Platforms for Advanced Wireless Research initiative. With over $400M in funding, the initiative seeks to create four “at scale” wireless test beds located in cities throughout the US. The unprecedented scale and investment highlight the US government’s acknowledgement of the importance of 5G to the world and the desire for the US to drive it forward. I was honored to speak at the announcement, and as Prof Ted Rappaport pronounced during a panel later that day, it was indeed a “double rainbow day” for US wireless researchers.
And just as the long days of August kicked into high gear, NI hosted its annual conference, NIWeek 2016, in Austin, Texas. Wireless researchers from all over the world gathered at the 5G Summit to exchange ideas and demonstrate the latest research. Of particular note, Michael Ha of the FCC joined panels to discuss mmWave spectrum issues, the US role in defining 5G spectrum, and sharing spectrum between licensed and unlicensed carriers. Dr. Arun Ghosh from AT&T Wireless Research took to the NIWeek keynote stage to discuss the group’s novel mmWave channel sounder, which comprehensively captures channel data at 28 GHz to create the most accurate models in the world. And with the Olympics taking place in August, it only seemed right that Prof Ove Edfors from Lund University in Sweden and Prof Andrew Nix from the University of Bristol in the UK presented their Massive MIMO prototype, which set new world records in spectral efficiency, achieving 145 bits/s/Hz using a 128-antenna system!
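To put that record in perspective, spectral efficiency times channel bandwidth gives raw aggregate throughput. Assuming an LTE-like 20 MHz channel — our assumption for illustration; the figure above states only the per-Hz efficiency — the arithmetic works out as follows:

```python
# Back-of-envelope throughput implied by the spectral-efficiency record.
spectral_efficiency = 145   # bit/s/Hz, the record cited above
bandwidth_hz = 20e6         # 20 MHz channel is our assumption (LTE-like)

throughput_bps = spectral_efficiency * bandwidth_hz
print(throughput_bps / 1e9)  # 2.9 Gbit/s shared across users in one 20 MHz channel
```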
5G is indeed gearing up and unlike summers past, rest and relaxation took a backseat as the significance and importance of 5G gathered even more momentum and urgency. NI is particularly excited to be working so closely with academia, industry, and governments to drive the 5G agenda forward. The “everything after” seems most poignant as new stakes have been thrust into the ground and researchers work furiously to meet these once seemingly unattainable objectives. Much progress has been made this summer and I believe we will look back fondly at this time where a re-energized research community focused on a bright and very wireless future.
We’ve built a new pair of cRIO controllers and teamed up with Cisco to enable the creation of distributed systems that perform synchronized I/O, code execution, and deterministic communication, all using the latest additions to standard Ethernet. Engineers are already using these controllers to help vet the technology in ecosystem activities, including the Industrial Internet Consortium TSN Testbed for smart manufacturing.
The tech specs
The technology includes new CompactRIO controllers featuring Intel Atom processors and the Intel i210 TSN-enabled NIC for a high-performance control system. These controllers use LabVIEW system design software to maintain synchronized time to the network and expose that time to code running on the real-time processor, as well as to code running on the FPGA.
LabVIEW is already designed with time as a core concept, using structures such as timed loops and single-cycle timed loops. Now these structures are synchronized to network time, which makes it simple to tightly coordinate signal processing, control algorithms, and I/O timing with scheduled network transmission, and across multiple systems distributed over a network. Additionally, with TSN these systems can deterministically send data across standard Ethernet networks to create reliable, coordinated multi-controller systems.
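The coordination trick is worth spelling out. Once every controller shares the same clock, loops on different nodes can agree on identical wakeup instants without exchanging any messages, simply by rounding the shared time up to the next period boundary. The sketch below is our own illustration of that idea, not NI's API:

```python
import math

# Illustrative sketch (not an NI API): nodes with a synchronized clock pick
# identical loop deadlines with no coordination, by rounding the current
# shared time up to the next multiple of the loop period.
def next_deadline(now_s, period_s):
    """Next instant on the shared clock that is a multiple of period_s."""
    return math.ceil(now_s / period_s) * period_s

# Two controllers sample the shared 10 ms-period schedule at slightly
# different moments, yet both compute the same next wakeup.
print(next_deadline(12.3041, 0.010))
print(next_deadline(12.3079, 0.010))
```

Both calls land on the same boundary (12.31 s on the shared clock), which is why network-synchronized timed loops on separate chassis can fire in lockstep.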
How to get early access
To get early access to these new controllers, you can:
Join our Time Sensitive Networks group on our online community, where you’ll find example code and documentation, more detailed info on hardware and software capabilities, and details on the products and accessories you need to create deployable TSN systems.
Time-sensitive networking (TSN) enables the creation of distributed, synchronized, hard real-time systems over standard Ethernet. These systems use the same infrastructure to provide real-time control and to carry all standard IT data, powering the convergence of control, measurement, configuration, UI, and file-exchange infrastructure. TSN is expected to fundamentally change system design and maintenance by offering network convergence, secure control traffic, and high performance!
We commissioned research agency IDC to study best practices for internet of things (IoT) implementations and how companies can prepare themselves for IoT-based operations.
The study found the following key drivers necessary to prepare business operations for the adoption of IoT implementations:
Business uptime. Customers, competition, and government increasingly require organizations to function 24x7. Orgs are turning to IoT as a way to maintain operations continuously and provide the tools to respond quickly and efficiently, and they are looking to their solution providers — technology providers, contract manufacturers, OEMs, and so forth — to enable constant uptime and to work in concert on new solutions that enable additional services. Example: installing sensors on equipment to enable remote diagnostics and predictive maintenance, so that operations can anticipate and schedule system downtime rather than shut down unexpectedly. Companies are depending on their technology providers to integrate systems without disrupting operations.
Automation. Orgs have realized 24x7 operations are beyond ordinary human abilities to assimilate incoming requests and respond efficiently. They’re looking to IoT to embed machine intelligence into their operations and automate complex processes that enable greater worker productivity. Example: embedding intelligence into tools so that the proper specifications (torque, depth, etc) are communicated to the operator and the final reading is taken and documented, verifying the completed operation.
Real-time data. Automation requires moving from sporadic, limited data collection to continuous, expansive, real-time data collection and analysis. IoT enables both the low-level collection of data with rudimentary machine analysis and the high-level human analysis needed for decision making. Companies are still determining whether to push embedded intelligence to the edge or expand their use of cloud capabilities. The operational business model, that model’s data latency requirements, the connectivity environment, and costs are the key decision factors here.
Environmental context. The IoT’s continuous collection of data enables constant input on the environmental conditions in which an organization operates. Data humans would miss can be captured and provide the context for improved understanding of the environmental conditions contributing to a particular outcome. The expanded usage of a wide array of sensors is driving the growth of data, storage, and bandwidth.
Connectivity. Connectivity of equipment is a basic building block of efficient operations. When equipment communicates automatically without human intervention, it can fundamentally change an organization’s capabilities, both human and machine. For example, internal applications can connect to remote applications. Humans can concentrate on enabling higher capabilities, such as managing, organizing, and analyzing the incoming data rather than just gathering data manually from disconnected systems. While there are a wide array of existing communication protocols, there are a number of organizations working to unify the various emerging standards. Location, environmental conditions, and operational demands are important considerations in the selection of the communication method and protocols.
Smart City. While orgs may be thinking primarily about their internal operations, IoT opens the prospect of interconnectivity with other IoT-enabled organizations. Scaling of cooperation from a single partner to a constellation of ecosystem suppliers will eventually enable Smart City IoT. Infrastructure, replacement cycle expectations, ownership and leasing contracts, and the potentially wide variety of governmental agencies necessary for approval must all be taken into consideration.
There’s one big problem with robots: cost. A good robotics kit will set you back at least $1,000, not including the software, and that’s too expensive for your average hobbyist maker, school teacher, or typical undergrad student. This problem makes robotics inaccessible.
We’ve seen what happens when tech tools get cheaper. Arduino and Raspberry Pi have taken hardware out of the hands of engineers and put it into the hands of any creative willing to write a few lines of code. The myRIO has brought LabVIEW Real-Time programming to students, and you can see for yourself the incredible projects it has spawned. It’s about time the same happened with robotics.
With this in mind, I’ve set out to build a low-cost, open robotics platform. It’s called QuadBot, and in its first two weeks on Kickstarter it received 200% funding. The design and code are open source, and I kept the cost low by designing a single controller board that doubles as the mechanical frame. The chassis is 3D-printed, and the hardware is fully accessible with plug-in terminals.
It’s an easy sell to most makers, which explains the rapid funding, but I wanted to expand the platform into academia and give makers access to more powerful programming. Enter LabVIEW. As a former NI intern and the creator of the NI Hexapod, I found it a natural choice. LabVIEW is ideal for makers because it’s graphical and follows a visual style that appeals to people without a coding background. This is what makes it the ideal language to build into my robot platform.
My next step is to bring LabVIEW to makers by creating a LabVIEW API for QuadBot. Once complete, it will bridge the gap between making, academia, and robotics.
The robots will be shipping in April 2017, and the API and source code will be ready by then, so if you’re interested go and check out the Kickstarter and bag yourself a QuadBot.