SPI Input/Output LabVIEW

Hi,

First of all, I am 100% new to LabVIEW; I have never used it before in my life.

I am trying to figure out how to send data over SPI to an Analog Devices AD5791 DAC (http://www.analog.com/static/imported-files/data_sheets/AD5791.pdf) and receive data from an Analog Devices AD7356 ADC (http://www.analog.com/static/imported-files/data_sheets/AD7356.pdf).  I am using the NI USB-8451.  I am having a hard time creating the LabVIEW program for either sending or receiving.  It seems that the DAC has no CS or SS pin, and the datasheet states the following:

Standalone Operation
The serial interface works with both a continuous and noncontinuous serial clock. A continuous SCLK source can be used only if SYNC is held low for the correct number of clock cycles. In gated clock mode, a burst clock containing the exact number of clock cycles must be used, and SYNC must be taken high after the final clock to latch the data. The first falling edge of SYNC starts the write cycle. Exactly 24 falling clock edges must be applied to SCLK before SYNC is brought high again. If SYNC is brought high before the 24th falling SCLK edge, the data written is invalid. If more than 24 falling SCLK edges are applied before SYNC is brought high, the input data is also invalid. The input shift register is updated on the rising edge of SYNC. For another serial transfer to take place, SYNC must be brought low again. After the end of the serial data transfer, data is automatically transferred from the input shift register to the addressed register. Once the write cycle is complete, the output can be updated by taking LDAC low while SYNC is high.

My questions are: how do I ensure that exactly the required number of SCLK cycles occurs?  Can I attach the SYNC pin to a CS pin on the USB-8451?  And how would I take LDAC low while SYNC is high?

For receiving data, there are SCLK, MISO, and CS pins available.  I would assume this is a matter of connecting the clock, data, and CS pins and letting LabVIEW handle it.  However, the data I am receiving will be in the form 00A11A10..A00000B11B10..B000.  How can I separate these two pieces of data?

Thanks a ton for any help!

Message 1 of 6

The 8451 has 8 lines that can be used for chip select. You can use one of these to drive the SYNC line on the 5791, so the 8451 holds SYNC low for the duration of the transfer and raises it once the data has been sent. In your case the data consists of 3 U8 values, so your "Control Register" array should have 3 values, each corresponding to one byte of data. As far as the LDAC line is concerned, you need to decide whether you want the DAC to be updated synchronously or asynchronously. If you set LDAC low prior to the transfer, the DAC register will be updated once the SYNC line goes high (which is after all 3 bytes have been sent). If you set LDAC high prior to the transfer, the DAC register will NOT be updated at the end; you need to manually set LDAC low to update the DAC registers.
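
In case a concrete picture of those 3 bytes helps, here is a minimal sketch in Python (just to illustrate the bit packing; the three byte values are what you would put into that array). It assumes the 24-bit frame described in the AD5791 datasheet: bit 23 is the R/W bit (0 for a write), bits 22-20 are the register address (the DAC register is address 1), and bits 19-0 are the 20-bit code.

    DAC_REGISTER = 0x1  # AD5791 DAC register address (from the datasheet register map)

    def ad5791_write_bytes(code20):
        """Pack a 20-bit DAC code into the 3 U8 values for one SPI write."""
        word = (0 << 23) | (DAC_REGISTER << 20) | (code20 & 0xFFFFF)
        return [(word >> 16) & 0xFF, (word >> 8) & 0xFF, word & 0xFF]

    # Midscale example: code 0x80000 packs to the bytes 0x18, 0x00, 0x00
    print([hex(b) for b in ad5791_write_bytes(0x80000)])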

For the 7356, unless you want to do bit-banging, the SPI VIs will be sending out 16 clocks. This means you'll get zeros padded at the end. Basically, all you need to do is perform a right shift on the values you read using the Logical Shift function: shift the value by 2 bits to the right, toward the LSB (in LabVIEW's Logical Shift, that is a shift count of -2). The zeros at the MSB can be ignored.
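
As a sketch of that arithmetic (Python here, only to show the bit manipulation; in LabVIEW it is the Logical Shift function), assuming each 16-bit word comes back as 2 leading zeros, 12 data bits, and 2 trailing zeros as you described:

    def ad7356_code(word16):
        """Drop the 2 trailing zeros, keep the 12 data bits."""
        return (word16 >> 2) & 0x0FFF

    # Example: 0b00_101010101010_00 -> 0b101010101010
    raw = 0b0010101010101000
    assert ad7356_code(raw) == 0b101010101010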

Message 2 of 6

Thanks for the response.  I have been working on this and still have not had a successful analog output from the DAC.  I am wondering about a couple of things.  First, the logic levels for the DAC as stated in the datasheet are 0.3*IOVcc and 0.7*IOVcc, and IOVcc is 5V, so the levels are 1.5V and 3.5V.  The USB-8451 is outputting around 3.3V for a HIGH; is this going to be a problem for the chip to distinguish high from low?  If that's the case, then I don't know how talking to this chip is possible through the 8451.

Also, when I look at the clock signal coming out of the 8451, it is not a consistent clock; there is a noticeable "pause" after each byte, and I wasn't sure if the chip was trying to send each byte to the output because of this, instead of waiting for all three required bytes.  So I tried using the LDAC pin: setting it HIGH, performing the write, then setting it LOW.  No luck.  Looking at the LDAC output on the scope, there is barely any change in terms of voltage level.

I attached the code I've written and been using.

Anyone have thoughts on this?  Thanks.

Message 3 of 6

@christopherdean wrote:

Thanks for the response.  I have been working on this and still have not had a successful analog output from the DAC.  I am wondering about a couple of things.  First, the logic levels for the DAC as stated in the datasheet are 0.3*IOVcc and 0.7*IOVcc, and IOVcc is 5V, so the levels are 1.5V and 3.5V.  The USB-8451 is outputting around 3.3V for a HIGH; is this going to be a problem for the chip to distinguish high from low?


Of course it will be.  0.7*IOVcc is the minimum voltage the chip will recognize as a high level, and the 8451's 3.3V high is below that 3.5V threshold.

If that's the case, then I don't know how talking to this chip is possible through the 8451.

You would need to lower the voltage used to power the chip.

Message 4 of 6

Yes, but the problem I forgot to mention is that the chip is already on a custom-built board that takes +5V and powers the DC/DC converters and other chips on the board from that single +5V supply.

Message 5 of 6

Then you will need to either:

  • Use the 8451 in bit-banging mode, with the digital I/O lines in open-drain configuration (see the sketch below)
  • Provide level shifting for the SPI lines
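
For what it's worth, a rough sketch of the bit-banged write follows (Python pseudocode; dio_write(line, level) is a hypothetical placeholder for whatever digital-output call your driver provides, not a real NI-845x function). It follows the timing from the datasheet excerpt quoted earlier in the thread: SYNC low, exactly 24 falling SCLK edges, then SYNC high to latch.

    def ad5791_bitbang_write(dio_write, word24):
        # dio_write(line, level) is a stand-in for your DIO call.
        dio_write("SYNC", 0)                        # falling edge of SYNC starts the write cycle
        for bit in range(23, -1, -1):               # shift out MSB first
            dio_write("SCLK", 1)
            dio_write("SDIN", (word24 >> bit) & 1)  # data valid before the falling edge
            dio_write("SCLK", 0)                    # AD5791 samples SDIN on this edge
        dio_write("SYNC", 1)                        # rising edge latches the 24-bit frame
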
Message 6 of 6