07-11-2017 05:51 AM
As hinted in the title, I am implementing my own RFID system for automatic inventory control. The reader I have does not return an RSSI value, so I cannot compute the distance to a tag from signal strength. I figured I could get that distance instead by measuring the time it takes for the radio wave to reach the tag and come back. I have attached the part of my code that writes to and reads from a serial port using the VISA VIs. Is there a way to get the time when LabVIEW actually starts writing to the port, and the time when it starts reading back the response?
Is this remotely possible?
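For reference, here is a minimal Python/pyvisa sketch of the kind of timestamping I have in mind (the resource name ASRL3::INSTR and the READ_TAG command are placeholders for my actual setup):

import time
import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource("ASRL3::INSTR")  # placeholder serial VISA resource
inst.baud_rate = 19200
inst.write_termination = "\r\n"
inst.read_termination = "\r\n"

t_write = time.perf_counter()
inst.write("READ_TAG")        # placeholder reader command
response = inst.read()
t_read = time.perf_counter()

# Note: these timestamps bracket the driver calls, not the moment the
# bits actually leave or arrive on the wire; buffering and OS scheduling
# sit in between.
print(f"round trip: {(t_read - t_write) * 1e3:.3f} ms")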
07-11-2017 06:19 AM
Can't say I think it'll work. You're trying to rely on the speed of the comms through the CPU and OS to measure something that should be measured by a power level. OS/CPU timing vs. radio waves is not going to be reliable at all. I'd give up straight away. If it's a must, change the H/W.
07-11-2017 06:23 AM
If I can get the time between sending the signal and reading the response, I think it might work. The reader operates at a fixed frequency and the serial port baud rate is fixed as well. But I do need very precise timing of the two actions.
07-11-2017 06:27 AM
A rough estimate for a distance of 10 m:
travel time = 2 × 10 m / 3e8 m/s = 6.7e-8 s ≈ 67 ns
Time it takes to transfer just one byte over a serial port at 19.2 kbaud:
transfer time = 10 bits / 19200 bits/s = 5.2e-4 s ≈ 0.52 ms
What exactly do you want to measure? 😄
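For anyone who wants to reproduce those numbers, here is the arithmetic as a small Python sketch (the 20-byte command and response lengths are just an assumption for illustration):

c = 3e8                    # speed of light, m/s
distance = 10.0            # tag distance, m
travel = 2 * distance / c  # round trip of the radio wave
print(f"radio round trip: {travel * 1e9:.0f} ns")         # ~67 ns

baud = 19200
bits_per_byte = 10         # start bit + 8 data bits + stop bit
byte_time = bits_per_byte / baud
print(f"one serial byte: {byte_time * 1e3:.2f} ms")       # ~0.52 ms

# hypothetical 20-byte command plus 20-byte response:
print(f"40 bytes on the wire: {40 * byte_time * 1e3:.1f} ms")  # ~20.8 ms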
07-11-2017 06:39 AM
I want to calculate the distance between the reader and the tag. In my case this is not possible through signal strength because the reader does not report it. So I am trying to achieve this by measuring the time between when a command is actually written to the serial port and when a response is returned. But that requires accurate timing.
07-11-2017 06:48 AM
So I am trying to achieve this by measuring the time between when a command is actually written to the serial port and when a response is returned.
I tried to explain that transferring data over a serial connection takes orders of magnitude more time than the radio wave needs to travel between the RFID tag and the receiver.
With my example above it would take ~20 ms to send a command and receive its answer, with a radio wave travel time of <100 ns buried somewhere in between. What kind of results do you expect from your experiment?
When you want to measure time intervals that accurately, you need to buy DAQ devices capable of such measurements!
07-11-2017 06:57 AM
Because the rate of writing and reading data is known, isn't it possible to get a rough estimate, and then obtain the distance through calibration?
07-11-2017 07:24 AM
When the jitter introduced by the OS (Windows, I guess) is orders of magnitude larger than the time interval you want to measure: no, I don't think you will get meaningful results from your setup.
As has been said before…
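If you want to see that jitter for yourself, here is a minimal Python sketch (the numbers depend entirely on your machine and OS; this only demonstrates the scale of the problem):

import time

# Timestamp back-to-back calls; on a desktop OS the scheduler can
# preempt the process at any moment, so the worst-case gap is far
# above the ~67 ns we would need to resolve.
samples = []
for _ in range(100_000):
    t0 = time.perf_counter()
    t1 = time.perf_counter()
    samples.append(t1 - t0)

samples.sort()
print(f"median gap: {samples[len(samples) // 2] * 1e9:.0f} ns")
print(f"worst gap:  {samples[-1] * 1e6:.1f} µs")  # often spikes into the µs/ms range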
07-11-2017 07:32 AM
Okay, thank you for your time.
07-12-2017 01:50 AM
Hi fadihajj,
As you know, there are two methods: one is measuring the time interval, the other is measuring the signal magnitude.
Measuring the time is not possible with low-speed DAQ devices whose clock rate is below the GHz range. The signal moves at the speed of light, which takes about 3.3 ns per meter of distance. So if you use an Arduino as an example, with its 16 MHz clock rate, one clock tick is 62.5 ns. In other words, you cannot measure small distances at that rate. Maybe if your tag is as far as a km from the receiver it could work 🙂
Only measuring the signal magnitude can work, but you need a receiver that measures the magnitude of the signal. You could use a spectrum analyzer with more than 60 dB of dynamic range.
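To put numbers on the clock-rate argument, here is a small Python sketch of the distance resolution one clock tick gives you (the clock rates are just examples):

c = 3e8  # speed of light, m/s

def distance_resolution(clock_hz: float) -> float:
    # smallest round-trip distance step resolvable with one clock tick
    return c / (2 * clock_hz)

for f_hz in (16e6, 100e6, 1e9):
    print(f"{f_hz / 1e6:>6.0f} MHz clock -> {distance_resolution(f_hz):.2f} m per tick")
# 16 MHz -> ~9.4 m, 100 MHz -> ~1.5 m, 1 GHz -> ~0.15 m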