LabWindows/CVI


ScanFile is skipping zeros


Hi, I've been using this algorithm for some time in CVI 2012, but now in CVI 2013, I have found that ScanFile is not working the same for me.

 

I'm unclear whether this is a CVI 2013 issue, or just a dormant bug that I had never witnessed until now.

 

I am scanning through an ASCII tab-delimited log file.  Some "columns" are text, others are numeric.  I prefer to scan through the columns in a non-rigid way; i.e., I don't strictly follow the column order as it exists in the log file.  Here's what it looks like:

 

for (i=0; i<LOG_MAX_COLS; i++)
{
	switch (i)
	{
		case LOG_COL_ID:
			ScanFile(fileHandle,"%s>%i[x]",&unitData[slot].id);			
			break;
		case LOG_COL_SERIAL:
			ScanFile(fileHandle,"%s>%s[xt09]",unitData[slot].serial);		// discard tab (ASCII 0d09)
			break;
		case LOG_COL_REVISION:
			ScanFile(fileHandle,"%s>%s[xt09]",unitData[slot].revision);		// discard tab (ASCII 0d09)
			break;
		case LOG_COL_STAGE:
			ScanFile(fileHandle,"%s>%i[x]",&unitData[slot].logStage);
			break;
		case LOG_COL_RETRIES:
			ScanFile(fileHandle,"%s>%i[x]",&unitData[slot].retry);	
			break;
		case LOG_COL_RUNNING:
			ScanFile(fileHandle,"%s>%i[x]",&unitData[slot].running);	
			break;
		case LOG_COL_COMPLETE:
			ScanFile(fileHandle,"%s>%i[x]",&unitData[slot].complete);
			break;
		case LOG_COL_ELAPSED:
			ScanFile(fileHandle,"%s>%s[xt09]",elapsedString);	// discard tab (ASCII 0d09)
			break;
		/* Now fill a placeholder string with the remaining fields so that the file pointer stays in sync.*/
		case LOG_COL_DATE:
		case LOG_COL_TIME:
		case LOG_COL_VOLTAGE:
		case LOG_COL_CURRENT:
		case LOG_COL_STATION:
		case LOG_COL_SOFTWARE:
		case LOG_COL_SLOT:
		case LOG_COL_DESC:
		case LOG_COL_COMMENT:
			ScanFile(fileHandle,"%s>%s[xt09]",tempString);		// discard tab (ASCII 0d09)
			break;
	}
}

 

The behavior I'm seeing now is if I have any zeros in my log file in the numeric columns, these are getting skipped until a non-zero value is found.  Suggestions?

Message 1 of 6

Can you also send a sample line from the file and tell us what you expected from parsing that line and what you got instead?

 

I think you're complaining about the behaviour of the "%s>%i" type of conversion, am I right?

S. Eren BALCI
IMESTEK
Message 2 of 6

Ah, of course, that would have helped huh?  😉

 

Here is a sample log file.  The last four columns in particular are giving me problems.  You'll see that the last column is really just a line number that increments with each line.  The previous three in this case are zero.  It's the zeros that aren't being read.  And you are right: it's my %s>%i that's not working.

 

I should add that for me, it's not really clear how the Scan set of functions work.  It's not documented how they parse through a file/buffer.  I assume there's some sort of file pointer that remembers where it left off?  So with each successive call, you plow through a line/file/buffer?  At least that's how I've been using it.

Message 3 of 6
Solution accepted by topic author ElectroLund

Ok, I found the bug.  It was an insidious little bugger!

 

It turned out I had a blank value in one of the columns shortly after creation of the log file.  That produced a double delimiter (two consecutive tabs), and apparently ScanFile was skipping right past the empty field between them, which threw off all the reads that followed.

 

So it's safest to put non-empty strings or values into each column.  Unless there's a safer way to Scan the line.

Message 4 of 6

I wouldn't claim it's 'safer', but I prefer using commas as the delimiter: I can recognize them more easily than I can distinguish spaces from tabs...

Message 5 of 6

I suspect you'd have the same issue with CSV files.  In other words, if I had this:

 

0123,Monday,01-23-2014,,5.3625,39.99,14%

My ScanFile would still choke on that double delimiter.

 

The reason I went with a non-printable delimiter was that a lot of my text fields contained commas.  It was tricky, and my brain hurt too much to figure out the regex involved in scanning those fields.

Message 6 of 6