
Converting seconds to a date in another language


13 would equal Feb of the following year...

Message 11 of 22

"I can correct this by changing the date in the creation of my 'dateobj' to 1 Dec 1903"

 

I'm assuming you did this by setting Dec using 12.  If months start with 0, that actually makes it the 13th month.  Even granting that it lets you overflow, it's still strange that it didn't throw an error there.
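A minimal sketch of this behavior in JavaScript: `Date` months are zero-indexed, and out-of-range values silently roll over rather than throwing.

```javascript
// JavaScript Date months are zero-indexed: 0 = January, 11 = December.
// Out-of-range month values roll over into the next year instead of erroring.
const dec1903 = new Date(Date.UTC(1903, 11, 1));  // month 11 = December 1903
const rollover = new Date(Date.UTC(1903, 12, 1)); // month 12 rolls over to January 1904

console.log(dec1903.toISOString());  // 1903-12-01T00:00:00.000Z
console.log(rollover.toISOString()); // 1904-01-01T00:00:00.000Z
```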

Message 12 of 22

Nah, it's not particularly strange - try it in LV, you will find the same behavior.

Message 13 of 22

Why is there this persistent myth that LabVIEW is based on C?

Bill
CLD
(Mid-Level minion.)
My support system ensures that I don't look totally incompetent.
Proud to say that I've progressed beyond knowing just enough to be dangerous. I now know enough to know that I have no clue about anything at all.
Humble author of the CLAD Nugget.
Message 14 of 22

An approach that does not generate any ambiguity in transferring date/time information between (possibly) incompatible systems is to use Seconds to Date/Time and pass the elements of the cluster as text. You then parse the data and build a timestamp in the other language. It is a little more work but gives you complete control.

 

[Attached image: date and time.png]

 

Lynn

Message 15 of 22

@billko wrote:

Why is there this persistent myth that LabVIEW is based on C?


LabVIEW is written in C (or C++); that's not a myth. That has nothing to do with timestamps and date formats though - those are the responsibility of the operating system.

Message 16 of 22

@nathand wrote:

@billko wrote:

Why is there this persistent myth that LabVIEW is based on C?


LabVIEW is written in C (or C++); that's not a myth. That has nothing to do with timestamps and date formats though - those are the responsibility of the operating system.


Thanks Nathan.  You hit it right.  However, how time is interpreted is not the responsibility of the OS - this is very much the responsibility of the development tool.  Hence the reason you have epochs starting at different times, even from the same producer (as I pointed out above, I believe that TestStand references 1970, like JS, while LV references 1904).

Message 17 of 22

That is actually a great solution, Lynn, but my concern with it is that it will consume more bandwidth.  Ideally, I transfer an 8-byte array representing the seconds.  If I transmit the seconds as a string, it will likely take 12 bytes.  If I transfer the stamp as something like 2014117130623.25, I am now at 16 bytes.  This doesn't seem like a huge deal, but I am trying to minimize my line usage because I am pushing a lot of data over the network connection.

 

Cheers, Matt

Message 18 of 22

@nathand wrote:

@billko wrote:

Why is there this persistent myth that LabVIEW is based on C?


LabVIEW is written in C (or C++); that's not a myth. That has nothing to do with timestamps and date formats though - those are the responsibility of the operating system.


More of the "LabVIEW IDE for G" is written in LabVIEW every day.  Yes, some highly optimized routines have always come from lower-level languages.  C and C++ routines are used where the available optimizations warrant their use.  G, the native language of the LabVIEW IDE, is a 4th-generation language, and C is not as abstracted from the hardware.  Will "Pure G" ever outperform some other lower-level language's compiler in every case?  Probably not.

 

And yes, say "OK, Google Now! What is LabVIEW?"  

 

Spoiler
The last time I needed to upgrade my phone I was wearing a cobalt polo.  The salesperson asked "What is LabVIEW?" and because I had this "amazing" phone in my hand I passed off the question.  The LMB just shook her head and excused us when the answer came, and I bought the darned phone!

 

 


"Should be" isn't "Is" -Jay
Message 19 of 22

@JÞB wrote:

@nathand wrote:

@billko wrote:

Why is there this persistent myth that LabVIEW is based on C?


LabVIEW is written in C (or C++); that's not a myth. That has nothing to do with timestamps and date formats though - those are the responsibility of the operating system.


More of the "LabVIEW IDE for G" is written in LabVIEW every day.  Yes, some highly optimized routines have always come from lower-level languages.  C and C++ routines are used where the available optimizations warrant their use.  G, the native language of the LabVIEW IDE, is a 4th-generation language, and C is not as abstracted from the hardware.  Will "Pure G" ever outperform some other lower-level language's compiler in every case?  Probably not.

 

And yes, say "OK, Google Now! What is LabVIEW?"  

 

Spoiler
The last time I needed to upgrade my phone I was wearing a cobalt polo.  The salesperson asked "What is LabVIEW?" and because I had this "amazing" phone in my hand I passed off the question.  The LMB just shook her head and excused us when the answer came, and I bought the darned phone!

 

 


I understand this is off-topic, so this is the last I will mention of it in this thread - not because of any hurt feelings and such, but because I hate hijacking threads.  I was always led to believe that most of LabVIEW was written in G, but a lot of the highly optimized stuff was written in C.  To me, that doesn't qualify as something being "based" on something else.  To me, being written in C with calls to G would be more in line with that.  To anyone: feel free to PM me if you'd like to continue the discussion.  I'd rather be proved wrong than continue to BE wrong.

Bill
CLD
(Mid-Level minion.)
My support system ensures that I don't look totally incompetent.
Proud to say that I've progressed beyond knowing just enough to be dangerous. I now know enough to know that I have no clue about anything at all.
Humble author of the CLAD Nugget.
Message 20 of 22