LabVIEW


Get microsecond precision using a 40 MHz clock

Solved!
Go to solution

Hi,

The time difference between two subsequent loop iterations should be 1 microsecond. I have a 40 MHz clock and its derivatives. How can I achieve this?

Thank you,

Ranjith

Message 1 of 6
(2,724 Views)
Single-Cycle Timed Loop
Jim
You're entirely bonkers. But I'll tell you a secret. All the best people are. ~ Alice
For he does not know what will happen; So who can tell him when it will occur? Eccl. 8:7
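For reference, the tick arithmetic behind a Single-Cycle Timed Loop on a 40 MHz FPGA clock can be sketched as plain Python (just the numbers, not LabVIEW code; an SCTL body executes in one clock tick, so a 1 µs loop period corresponds to a whole number of ticks):

```python
clock_hz = 40_000_000                  # 40 MHz FPGA base clock
tick_period_ns = 1e9 / clock_hz        # one clock tick = 25 ns
ticks_per_us = clock_hz // 1_000_000   # 40 ticks make 1 microsecond

print(tick_period_ns, ticks_per_us)
```

So a loop timed to 40 ticks of the 40 MHz clock iterates exactly once per microsecond.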

Message 2 of 6
(2,705 Views)

Hi Jim,

Do I have to apply some math to get a one-microsecond period for each iteration?

 

I have one more question:

I can see that the FPGA analog channel takes 16-bit data, but I receive float data from third-party software. How do I manage this?

 

Thanks,

Ranjith

Message 3 of 6
(2,698 Views)

Scale the float data (SGL or DBL) from the software to 16-bit data, i.e. if the voltage range of the AI is +-10V then accommodate this data within the entire I16 datatype: 0 - -1V and 2^16 - +10V

 

Cheers.

With regards,
JK
(Certified LabVIEW Developer)
Give Kudos for Good Answers, and Mark it a solution if your problem is solved.
Message 4 of 6
(2,689 Views)

Hi JK,

 

I didn't get what you mean by the line below:

 

0- -1V and 2^16 - +10V

 

Thank you,

Ranjith

Message 5 of 6
(2,683 Views)
Solution
Accepted by topic author TRanjith

Sorry, there was a mistake.

Assuming that the range of the analog input is +-10V:

Scale -10V as 0 (16-bit binary)

and +10V as 65535 (the top of the 16-bit range, which has 2^16 values).

Take care of the sign bit (since I16 is a signed integer): flipping the most significant bit converts this offset-binary code into the equivalent two's-complement I16 value as you move from -10V up to +10V.
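A minimal sketch of that scaling in Python (hypothetical helper name; assumes a +-10V input range mapped onto the full 16-bit span, then shifted to signed I16 by flipping the sign bit, i.e. subtracting 32768):

```python
def volts_to_i16(v, v_min=-10.0, v_max=10.0):
    """Scale a float voltage to a signed 16-bit code.

    First map [v_min, v_max] onto the unsigned range 0..65535
    (offset binary), then subtract 32768 (equivalent to flipping
    the sign bit) to get the two's-complement I16 value.
    """
    v = max(v_min, min(v_max, v))                        # clamp out-of-range inputs
    code = round((v - v_min) / (v_max - v_min) * 65535)  # 0..65535
    return code - 32768                                  # -32768..32767 (I16)

print(volts_to_i16(-10.0))  # -32768
print(volts_to_i16(10.0))   # 32767
```

On the LabVIEW side the same arithmetic can be done with scale/offset nodes before writing to the FPGA analog output.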

 

 

With regards,
JK
(Certified LabVIEW Developer)
Give Kudos for Good Answers, and Mark it a solution if your problem is solved.
Message 6 of 6
(2,681 Views)