04-16-2013 06:37 AM
Hi
How do I implement a simple for loop in parallel on the GPU? I can implement it on the CPU, but I need to use the GPU for parallelism.
Regards
Ritesh
04-16-2013 09:06 AM
Ritesh,
do you have the GPU Analysis Toolkit? If so, do you have an NVIDIA CUDA graphics card or something else?
Norbert
04-16-2013 04:40 PM
04-17-2013 02:33 AM
Ritesh,
did you search for examples of this? There is also a very general tutorial on the topic. Have you read it? Did you check the links in its "next steps" section?
If you run an example that uses the GPU functions, does it work?
thanks,
Norbert
04-19-2013 11:09 AM
Hi
Yes, I did look through the examples, but found nothing to help me get started. My goal is to run IMAQ pattern matching 100 or more times in one iteration (on the GPU). But I also want to start small by doing a simple calculation on the GPU.
Regards
Ritesh
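As a starting point for the "simple calculation" case: the GPU Analysis Toolkit ultimately targets CUDA, and a plain CUDA kernel is the clearest way to see how a for loop maps onto GPU parallelism. Below is a minimal hand-written sketch (independent of the toolkit's LabVIEW VIs, and assuming an NVIDIA card with the CUDA toolkit installed) in which each GPU thread executes one iteration of an element-wise add:

```cuda
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// One GPU thread per loop iteration: out[i] = a[i] + b[i].
__global__ void addKernel(const float *a, const float *b, float *out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)                 // guard: the grid may have extra threads
        out[i] = a[i] + b[i];
}

int main(void)
{
    const int n = 1024;
    size_t bytes = n * sizeof(float);

    // Host buffers with some test data.
    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hout = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = (float)i; hb[i] = 2.0f * i; }

    // Device buffers; copy inputs to the GPU.
    float *da, *db, *dout;
    cudaMalloc(&da, bytes);
    cudaMalloc(&db, bytes);
    cudaMalloc(&dout, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // 256 threads per block; enough blocks to cover all n iterations.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    addKernel<<<blocks, threads>>>(da, db, dout, n);
    cudaDeviceSynchronize();

    // Copy the result back and spot-check one element.
    cudaMemcpy(hout, dout, bytes, cudaMemcpyDeviceToHost);
    printf("out[10] = %g\n", hout[10]);   // 10 + 2*10 = 30

    cudaFree(da); cudaFree(db); cudaFree(dout);
    free(ha); free(hb); free(hout);
    return 0;
}
```

The key idea is that the loop body becomes the kernel, and the loop index is replaced by the thread's global index; the same decomposition underlies whatever the toolkit does behind its VIs.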
04-22-2013 12:20 PM
Hi reigngt09,
Here are some discussion forum threads and other resources that might help you get started with the GPU Analysis Toolkit.
https://decibel.ni.com/content/message/50977#50977
https://decibel.ni.com/content/message/50844#50844
Let me know if they are helpful.