

movidius: AI on USB, how to make LabVIEW drive it


It is 100 Gflops on a USB stick for $80.

http://www.anandtech.com/show/11649/intel-launches-movidius-neural-compute-stick

https://developer.movidius.com/

http://www.mouser.com/new/Intel/intel-movidius-stick/

 

How do I push LabVIEW code to it? Caffe? TensorFlow?

 

I make things go very, very fast in LabVIEW. It usually beats R or MATLAB by 1000x. Instead of running a compute job over a weekend, I re-code it in LabVIEW and run it in an hour or two. LabVIEW is amazing.

 

AI is big.  It is existential.  There is a crap-ton of value and opportunity there. 

 

I can't imagine not being able to have the two meet. It sounds like a marriage made in heaven.

 

How do I connect LabVIEW to either a TensorFlow or Caffe capable AI framework?

 

https://forums.ni.com/t5/LabVIEW-Idea-Exchange/Tensorflow-API-for-LabVIEW/idi-p/3248737

http://caffe.berkeleyvision.org/ <-- (BSD 2-clause license, fyi)

 

 

 

Message 1 of 4

Pushing LabVIEW code onto this is very likely not an option. I doubt that even NI would be able to do that, as it would require a fully functional LLVM backend for this hardware, and I have serious doubts that that is even viable, let alone interesting enough to develop.

 

After all, this is about neural networks solving specific algorithmic problems, not the standard program logic that normal compilers, including LabVIEW's, generate. While it might be possible to generate executable code for a neural network from a standard program description such as C or LabVIEW, I would not expect much from such an approach as far as performance is concerned.

 

So the LabVIEW integration would most likely boil down to an interface library that controls the neural network chip and pushes down to it a program that has been compiled with the tools in the Movidius development toolkit, from the same algorithm description source code that you would use in any other solution.
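
For a rough idea of what such an interface library does, here is a sketch using the Movidius NCSDK Python API (v1-style calls; the graph file name and input preprocessing are placeholders, and the exact call names should be verified against the toolkit version you install). LabVIEW could wrap something like this through its Python integration or a thin DLL around the equivalent C API.

# Sketch only: drive the Neural Compute Stick with the NCSDK Python API.
# Call names follow the v1 'mvnc' module; verify against your installed toolkit.
import numpy
from mvnc import mvncapi as mvnc

devices = mvnc.EnumerateDevices()               # list attached sticks
if not devices:
    raise RuntimeError('No Movidius stick found')

device = mvnc.Device(devices[0])
device.OpenDevice()

# 'graph' is a network blob compiled from a Caffe/TensorFlow model with the
# NCSDK compiler tools (placeholder file name).
with open('graph', 'rb') as f:
    graph = device.AllocateGraph(f.read())

input_tensor = numpy.zeros((224, 224, 3), dtype=numpy.float16)  # preprocessed image goes here
graph.LoadTensor(input_tensor, 'user object')                   # push input to the stick
output, _ = graph.GetResult()                                   # read back the inference result

graph.DeallocateGraph()
device.CloseDevice()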

Rolf Kalbermatter
My Blog
Message 2 of 4

We, okay LabVIEW, have high-quality interfaces to a truckload of hardware. This is arguably its single most important core proficiency. If that doesn't work well, then little things like compute speed and programmability become meaningless.

 

Would you ever have a decently high-quality interface to this? It would open LabVIEW up to participating in the AI revolution. Not only could you build robot control at a basic level, you could work on their brains.

 

Do you know how many hundreds of billions of dollars of investment are going into this area in the next 5 years?  Several.  LabVIEW is a perfect tool to be a leader in that area. 

 

Message 3 of 4
Solution
Accepted by topic author EngrStudent

Hi! LabVIEW can import a TensorFlow-trained model via the Vision Development Module as of this year, as part of the vision toolkit! It supports both LabVIEW and LabVIEW RT. I've used it, and I was able to import my model in under an hour using the .pb file.

 

The API is straightforward: Open TensorFlow Model, set inputs, read outputs, close. Further details here:

http://www.ni.com/documentation/en/vision-development-module/latest/ni-vision-algorithms-node-ref/cr...
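
Outside LabVIEW, the same open/run/close flow for a frozen .pb file looks roughly like this in TensorFlow 1.x Python; the tensor names, file name, and input shape are placeholders for whatever your exported model actually uses.

# Sketch only: run a frozen TensorFlow 1.x graph (.pb) directly in Python.
import numpy as np
import tensorflow as tf

with tf.gfile.GFile('frozen_model.pb', 'rb') as f:   # placeholder file name
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())

graph = tf.Graph()
with graph.as_default():
    tf.import_graph_def(graph_def, name='')

# Tensor names depend on how the model was exported (placeholders here).
inp = graph.get_tensor_by_name('input:0')
out = graph.get_tensor_by_name('output:0')

with tf.Session(graph=graph) as sess:
    image = np.zeros((1, 224, 224, 3), dtype=np.float32)  # dummy input batch
    result = sess.run(out, feed_dict={inp: image})
    print(result.shape)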

Feel free to let me know how it goes or what you think, and if you'd like to discuss your application specifically.

I realize that this runs the network on the processor and not the Movidius compute stick; however, it does allow a machine learning model trained in TensorFlow to be run inside LabVIEW, on a desktop, PXI, or embedded target.

Message 4 of 4