05-14-2009 05:08 PM
Hi,
The time difference between 2 subsequent loop iterations should be 1 microsecond. I have a 40 MHz clock and its derivatives. How can I achieve this?
Thank you,
Ranjith
05-14-2009 08:44 PM
05-14-2009 09:58 PM
Hi Jim,
Do I have to apply some math to get one microsecond per iteration?
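For reference, the arithmetic is simple: a 40 MHz clock ticks every 25 ns, so a 1 microsecond loop period corresponds to 40 ticks (for example, as the tick count given to a loop-timing function; the exact LabVIEW FPGA node to use is not shown in this thread). A minimal sketch of just the arithmetic, in plain Python rather than LabVIEW code:

```python
# Tick arithmetic for a 1 us loop period on a 40 MHz FPGA clock.
# (Illustrative only; this is not LabVIEW code.)

clock_hz = 40_000_000                 # 40 MHz onboard clock
tick_ns = 1e9 / clock_hz              # duration of one clock tick in nanoseconds
ticks_per_us = clock_hz // 1_000_000  # ticks that fit in one microsecond

print(tick_ns)       # 25.0 -> each tick is 25 ns
print(ticks_per_us)  # 40   -> 40 ticks give a 1 us loop period
```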
I have one more question:
I can see that the FPGA analog channel takes 16-bit data, but I receive float data from third-party software. How do I manage this?
Thanks,
Ranjith
05-14-2009 10:30 PM
Scale the float data (SGL or DBL) from the software to 16-bit data, i.e. if the voltage range of the AI is ±10V, then accommodate this data within the entire I16 datatype: 0- -1V and 2^16 - +10V
Cheers.
05-14-2009 10:38 PM
Hi JK,
I didn't get what you mean by the line below:
0- -1V and 2^16 - +10V
Thank you,
Ranjith
05-14-2009 10:43 PM
Sorry, there was a mistake.
Assuming that the range of the analog input is ±10V:
Scale -10V as 0 (16-bit binary)
and +10V as 65535 (2^16 - 1, the last value of the 16-bit range).
Take care of complementing the sign bit (since the target is a signed integer, I16) when moving through the range from -10V up to +10V; flipping the sign bit converts the unsigned offset value (0 to 65535) into the two's-complement I16 value (-32768 to 32767).
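The scaling described above can be sketched in code. This is an illustrative example only (not NI library code), and it assumes the ±10V analog-input range stated in the post; the function name is made up for the sketch:

```python
# Sketch: scale a float voltage into a signed 16-bit (I16) value,
# assuming the AI range is -10 V .. +10 V.

FULL_SCALE = 10.0  # assumed full-scale voltage

def volts_to_i16(v: float) -> int:
    """Map a float voltage in [-10, +10] to a signed 16-bit integer."""
    v = max(-FULL_SCALE, min(FULL_SCALE, v))  # clamp out-of-range input
    # Offset-binary step: -10 V -> 0, +10 V -> 65535
    offset = round((v + FULL_SCALE) / (2 * FULL_SCALE) * 65535)
    # Flipping the sign bit (offset binary -> two's complement) is the
    # same as subtracting 32768: result spans -32768 .. 32767.
    return offset - 32768

print(volts_to_i16(-10.0))  # -32768
print(volts_to_i16(0.0))    # 0
print(volts_to_i16(10.0))   # 32767
```

The clamp guards against out-of-range floats from the third-party software; without it, values beyond ±10V would overflow the I16 range.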