
LabVIEW


How to store/write received IQ data

Hello everyone,

I am working on channel sounding techniques, using a USRP as the receiver. I want to store the received IQ data for post-processing in order to find the channel impulse response, but I am new to USRP and LabVIEW. Can anyone help me with how to store the received data to the hard disk and how to retrieve it for post-processing in LabVIEW? The receiver .vi file is attached. Thanks in advance.

Regards

Message 1 of 8

Hello Attique,

Writing the data is a little tricky because you want to stream a complex waveform. As far as I know there is no premade solution for this; however, it is fairly easy to implement using the Write to Binary File functions. The timing information is written to the file only on the first call; every subsequent call writes only the waveform data. For reading, you just have to know that the first few bytes in the file are a timestamp, the next value is a double (delta t), and the remaining data is complex double values. I modified your code to implement this functionality. I did not have the chance to test it because I do not have the right hardware, so please be careful and test the code before using it.
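For post-processing outside LabVIEW, the layout described above can be sketched as a small Python writer/reader pair. This is only a sketch under assumptions: a 16-byte timestamp stored as two big-endian signed 64-bit integers, and big-endian doubles throughout (LabVIEW's default byte order). Check it against what Write to Binary File actually produces before relying on it.

```python
import struct

def write_iq_file(path, t0_sec, t0_frac, dt, iq_pairs):
    """Write a 16-byte timestamp, one double (delta t), then interleaved I/Q doubles."""
    with open(path, "wb") as f:
        f.write(struct.pack(">qq", t0_sec, t0_frac))  # assumed 128-bit timestamp
        f.write(struct.pack(">d", dt))                # sample interval, delta t
        for i, q in iq_pairs:
            f.write(struct.pack(">dd", i, q))         # one complex sample = 16 bytes

def read_iq_file(path):
    """Read the header back and return ((sec, frac), dt, [complex samples])."""
    with open(path, "rb") as f:
        t0 = struct.unpack(">qq", f.read(16))
        (dt,) = struct.unpack(">d", f.read(8))
        raw = f.read()
    n = len(raw) // 16                                # 16 bytes per complex sample
    samples = [complex(*struct.unpack_from(">dd", raw, 16 * k)) for k in range(n)]
    return t0, dt, samples
```

A round trip through these two functions recovers the timestamp, delta t, and the complex samples in order.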

 

Regards,

  Georg

Message 2 of 8

Hello Georg,

Many thanks, it is a great help from your side. I ran the code with my hardware and found it okay. But as you mentioned in your reply, "the first few bytes in the file are a timestamp, the next position is a double (delta t), and the remaining data is complex double values", so whatever I get at the end from the "Y" terminal (in Read Binary.vi) is basically my received complex data, isn't it? Secondly, I found a way to store the data in a .tdms file as well. So which do you prefer for storing my received complex data: .tdms or .bin?

Kind Regards

Message 3 of 8

Hi!

It is possible to save your data as a TDMS file. If you are familiar with this format, I would even prefer it over the raw binary format. Both formats are optimized for speed and store the data in its binary representation (if you used ASCII, the files would be huge). The advantage of TDMS is that it is a well-defined file format that can be read by third-party programs (Excel, OpenOffice, ...) via plugins. Additionally, if you store the data as TDMS, anybody can open it later without having to read the specification of your custom binary file and write an import filter for their specific software.

Regards,

  Georg

Message 4 of 8

Hi again,

No doubt TDMS files have some edge over binary files, but the problem is this: for 10 minutes of measurement the .tdms file size reaches approximately 2 GB (the same as the .bin file), and when I try to read it via Read TDMS.vi (attached) it returns the error "Not enough memory to complete this operation". Secondly, I tried to use your code, after some modification, to read the received data and do some post-processing, but with a large file like the one above it returns nothing.

How can I read and utilize the received data (which is in GBs) for post-processing?

 

Message 5 of 8

I have two suggestions:

1) If you get the error when you try to read all of the data in one read (2 GB in your case), you are exceeding the maximum memory usage. You can take advantage of the "index" and "count" terminals of TDMS Read and read the data out in chunks instead of all at one time.

 

2) If the error is thrown from TDMS Open, that means the TDMS file contains a lot of metadata. In that case, you can try using the TDMS Advanced Nodes to log and then read the data. The TDMS Advanced API has an optimized metadata size, and you can find out how to use it by checking the examples in LabVIEW.
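To make suggestion 1 concrete, the same offset/count pattern can be sketched in Python against the plain binary file discussed earlier in the thread. This is only an illustration of the chunking idea, not TDMS itself: the 24-byte header (16-byte timestamp plus one double for delta t), the big-endian complex-double layout, and the helper names are assumptions.

```python
import struct

SAMPLE_BYTES = 16   # one complex-double sample: two big-endian float64
HEADER_BYTES = 24   # assumed header: 16-byte timestamp + 8-byte delta t

def read_chunk(path, offset, count):
    """Return up to `count` complex samples starting at sample `offset`."""
    with open(path, "rb") as f:
        f.seek(HEADER_BYTES + offset * SAMPLE_BYTES)
        raw = f.read(count * SAMPLE_BYTES)
    n = len(raw) // SAMPLE_BYTES
    return [complex(*struct.unpack_from(">dd", raw, SAMPLE_BYTES * k))
            for k in range(n)]

def mean_power(path, chunk_samples=4096):
    """Walk the whole file chunk by chunk, never holding it all in memory."""
    total, n, offset = 0.0, 0, 0
    while True:
        chunk = read_chunk(path, offset, chunk_samples)
        if not chunk:
            break
        total += sum(abs(s) ** 2 for s in chunk)
        n += len(chunk)
        offset += len(chunk)
    return total / n if n else 0.0
```

The key point is that each pass touches only `chunk_samples` values, so a multi-gigabyte capture can be reduced (here, to its mean power) with a small, constant memory footprint.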

Message 6 of 8

Hello deppSu,

Thanks for your suggestions, but since I am actually new to LabVIEW, I think I will adopt your first suggestion 🙂

I checked TDMS Read.vi and found the count terminal, which is set to -1 (read all), but I didn't find an index terminal there; there is an offset terminal, though (I think that is the one you mean).

The only way I see is to use Write TDMS.vi to store one big file (30 or 40 GB approx.) and then use Read TDMS along with count and offset to read my data back in chunks and use it for post-processing. I think it will work; waiting for your comments.
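As a rough sanity check on that plan, here is a back-of-the-envelope chunk sizing in Python; the 16 bytes per sample assumes complex doubles, and the 40 GB file size and 256 MB per-read budget are just example numbers, not measured values.

```python
# Rough chunk sizing for reading a large capture in pieces.
BYTES_PER_SAMPLE = 16                 # complex double: 2 x float64
file_bytes = 40 * 1024**3             # ~40 GB capture (example)
budget_bytes = 256 * 1024**2          # keep each read under ~256 MB (example)

total_samples = file_bytes // BYTES_PER_SAMPLE
chunk_samples = budget_bytes // BYTES_PER_SAMPLE   # pass this as "count"
num_reads = -(-total_samples // chunk_samples)     # ceiling division

print(total_samples, chunk_samples, num_reads)
```

With these example numbers, the 40 GB file would be covered in 160 reads of about 16.8 million samples each, which stays comfortably inside memory.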

Message 7 of 8

Since you are new to LabVIEW, you may want to read the section in the LabVIEW Help about dealing with large data sets in LabVIEW. Alternatively, you can read this tutorial, but it is a bit dated and does not mention newer methods like the in-place element structure or data value references.

 

Good luck!

Message 8 of 8