

Increase SQLite Write Speed

Solved!
Go to solution

I have a VI that's writing data to disk from an FPGA. Unfortunately, my SQLite writer is having difficulty keeping up with the write speed, and I would like to know if there is anything I can do to keep up. Currently, I'm able to write 10,000 rows in 60 ms. I would love to get down to 30 ms or faster if possible.

 

To write the data, I have the following steps:

  • Read FPGA data as a 2D array in a producer loop, pass it as a variant to the consumer loop's process state (time start, t=0)
  • Convert the 2D array to a 1D array of clusters (to better represent the real data), run a fast check, and pass it as a variant to the logging state (5 ms, t=5 ms)
  • Log the 1D array of 10,000 clusters with drjdpowell's SQLite library (60 ms, t=65 ms)

 

Please take a look at this simplified VI to see if there's anything I can do.
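For readers outside LabVIEW, the usual way a SQLite wrapper makes a bulk log like this fast is to wrap all the inserts in a single transaction with a prepared statement. A minimal Python sketch (the table name and column layout are invented for illustration, not taken from the VI):

```python
import sqlite3
import time

# Hypothetical stand-in for the logging step: insert 10,000 rows
# (one per cluster) inside a single transaction.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE samples (t REAL, ch0 REAL, ch1 REAL)")

rows = [(i * 1e-3, 0.0, 1.0) for i in range(10_000)]

start = time.perf_counter()
with conn:  # one BEGIN/COMMIT wrapping all 10,000 inserts
    conn.executemany("INSERT INTO samples VALUES (?, ?, ?)", rows)
elapsed_ms = (time.perf_counter() - start) * 1e3

count = conn.execute("SELECT COUNT(*) FROM samples").fetchone()[0]
```

Without the enclosing transaction, each insert would commit (and fsync) individually, which is typically orders of magnitude slower.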

0 Kudos
Message 1 of 10
(4,252 Views)

You can try combining everything into one request, as one large multi-row statement:

INSERT INTO table1 (column1, column2, ...)
VALUES
(value1, value2, ...),
(value1, value2, ...),
...
(value1, value2, ...);
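The same multi-row INSERT can be sketched in Python using "?" placeholders instead of pasting values into the string (safer, and it avoids number formatting overhead); `table1`, `column1`, `column2` are just the placeholder names from the SQL above:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE table1 (column1 REAL, column2 REAL)")

# 400 rows keeps us under SQLite's default bound-parameter limit.
rows = [(float(i), float(i) * 2.0) for i in range(400)]
placeholders = ", ".join(["(?, ?)"] * len(rows))
sql = f"INSERT INTO table1 (column1, column2) VALUES {placeholders}"
flat = [v for row in rows for v in row]  # flatten to match the placeholders

with conn:  # single transaction
    conn.execute(sql, flat)

n = conn.execute("SELECT COUNT(*) FROM table1").fetchone()[0]
```

Note that very large multi-row statements can hit SQLite's bound-parameter limit, so in practice the batch is usually split into chunks.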

 

Or you need a faster HDD 🙂

 

How long do you plan to write this data? If not for long, then perhaps you can accumulate the data in memory and write it out after the acquisition finishes.

 

0 Kudos
Message 2 of 10
(4,216 Views)

Unfortunately, we're taking data over a few days, so we can't just buffer to memory.

 

I tried the bulk insert you mentioned, and my performance actually decreased, from 60 ms to 100 ms.

 

Updated VI below

0 Kudos
Message 3 of 10
(4,209 Views)

Also, in terms of getting a faster hard drive: we're already using an SSD, writing about 500 KB per 60 ms chunk, i.e. roughly 8.3 MB/s. As I understand it, SSDs can write in the hundreds of megabytes per second.

0 Kudos
Message 4 of 10
(4,205 Views)

Measure the write speed of a plain binary file.
As a workaround, you could write a binary file during the experiment and transfer it to the database afterwards.
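That workaround can be sketched in Python: stream raw records to a flat binary file during the experiment, then bulk-load them into SQLite afterwards. The record layout here (time plus two channels as float64) is an assumption for illustration:

```python
import os
import sqlite3
import struct
import tempfile

rec = struct.Struct("<ddd")  # assumed layout: t, ch0, ch1 as float64

# During the experiment: fast, append-only binary writes.
fd, path = tempfile.mkstemp(suffix=".bin")
with os.fdopen(fd, "wb") as f:
    for i in range(10_000):
        f.write(rec.pack(i * 1e-3, 0.0, 1.0))

# After the experiment: read the file back and insert in one transaction.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE samples (t REAL, ch0 REAL, ch1 REAL)")
with open(path, "rb") as f, conn:
    conn.executemany(
        "INSERT INTO samples VALUES (?, ?, ?)",
        (rec.unpack(chunk) for chunk in iter(lambda: f.read(rec.size), b"")),
    )
os.remove(path)

n = conn.execute("SELECT COUNT(*) FROM samples").fetchone()[0]
```

The conversion step can run offline, after the acquisition, when timing no longer matters.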

0 Kudos
Message 5 of 10
(4,185 Views)
Solution
Accepted by topic author JScherer

PRAGMA synchronous:

0 | OFF

1 | NORMAL

2 | FULL


PRAGMA temp_store:

0 | DEFAULT

1 | FILE

2 | MEMORY



Use:

PRAGMA synchronous = 0
PRAGMA temp_store = 2

 

But be careful, this is a quick but not safe way to speed up.
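In Python/sqlite3 terms, applying these pragmas looks like the sketch below (an in-memory database is used only for illustration; in the real logger this would be the file connection). With synchronous = 0 (OFF), SQLite skips the fsync after each commit, so a crash or power loss can corrupt the file, which is exactly the "quick but not safe" trade-off noted above:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA synchronous = 0")  # OFF: no fsync after commits
conn.execute("PRAGMA temp_store = 2")   # MEMORY: temp tables/indices in RAM

# Read the settings back to confirm they took effect.
sync = conn.execute("PRAGMA synchronous").fetchone()[0]
temp = conn.execute("PRAGMA temp_store").fetchone()[0]
```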

Message 6 of 10
(4,161 Views)

@IvanLis wrote:

PRAGMA synchronous:

0 | OFF

1 | NORMAL

2 | FULL


PRAGMA temp_store:

0 | DEFAULT

1 | FILE

2 | MEMORY



Use:

PRAGMA synchronous = 0
PRAGMA temp_store = 2

 

But be careful, this is a quick but not safe way to speed up.



Thanks!

 

With these changes I went down from an average of 60 ms to 35 ms per write.

 

FYI: trying the binary write got me 12 ms, so if this continues to be an issue, I can write the raw file and convert it with a separate utility.

 

 

0 Kudos
Message 7 of 10
(4,132 Views)

You seem to have gotten the speed that you want, but I'll say what I first thought of anyway.

 

Call the VI that writes a row of data asynchronously and enable re-entrant execution, so multiple rows can be written simultaneously.

Redhawk
Test Engineer at Moog Inc.

Saying "Thanks that fixed it" or "Thanks that answers my question" and not giving a Kudo or Marked Solution, is like telling your waiter they did a great job and not leaving a tip. Please, tip your waiters.

0 Kudos
Message 8 of 10
(4,122 Views)

@FireFist-Redhawk wrote:

You seem to have gotten the speed that you want, but I'll say what I first thought of anyway.

 

Call the VI that writes a row of data asynchronously and enable re-entrant execution, so multiple rows can be written simultaneously.


I'll have to test that out if I run into the issue again. I'm concerned about having multiple writers to the same file, though, but I suppose unique transaction IDs for each async call would mitigate that.
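One caveat worth noting: SQLite allows only one writer at a time, so parallel re-entrant writer VIs would still serialize on the database lock (a busy connection gets SQLITE_BUSY). WAL journal mode at least lets readers proceed while a writer commits. A plain Python/sqlite3 sketch (a temp file is used because WAL is not available for in-memory databases):

```python
import os
import sqlite3
import tempfile

fd, path = tempfile.mkstemp(suffix=".db")
os.close(fd)

conn = sqlite3.connect(path)
# Switch to write-ahead logging; the pragma reports the active mode.
mode = conn.execute("PRAGMA journal_mode = WAL").fetchone()[0]
# Have a second writer wait up to 5 s instead of failing immediately.
conn.execute("PRAGMA busy_timeout = 5000")
conn.close()
os.remove(path)
```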

0 Kudos
Message 9 of 10
(4,109 Views)

@JScherer wrote:

FYI: trying the binary write got me 12 ms, so if this continues to be an issue, I can write the raw file and convert it with a separate utility.


I would recommend using binary TDMS.
The access speed is about the same, but it is much easier to organize writing and reading the data.

0 Kudos
Message 10 of 10
(4,098 Views)