Tagged: date time parse shr
- Athanasios May 9, 2019 at 4:01 am
I am trying to print the timestamp, which is an object of the structure SHRSweepHeader in shr_parse.h.
I used the signal_hound_sdk -> shr_parser and its contents for this project, so I am using printf to print the timestamp variable, which is a uint64_t.
printf("Mill since epoch: %PRIu64d\n ", state2.timestamp); and it gives me an enormous number which cannot even be defined in UNIX time.
Can you help me with this?
Athanasios Loukas
Andrew May 9, 2019 at 9:41 am
What times are you seeing? As you noticed, the time is in "milliseconds since epoch", which is ~1557420094000 right now. Is this the number you are seeing?
I’ve always used the %llu print format specifier to print 64-bit uints, is this the possible issue?
Regards,
nickdef May 15, 2019 at 3:20 pm
I’m also seeing a problem with the SHRSweepHeader timestamp when parsing a SHR capture. I’m using Python to read the timestamp field as a uint64 which produces huge values such as 14032183609892863189.
The timestamp you gave (1557420094000) also doesn’t appear to conform to any standard I am familiar with e.g. in UNIX time this corresponds to Sunday, March 7, in the year 4264073288 (!) at 12:00:00am.
Can you please confirm what format the timestamp is encoded in and double-check the correct value is being recorded in the SHRSweepHeader?
Andrew May 15, 2019 at 3:39 pm
If you play this file back in Spike does Spike show the correct time? You can see the time displayed for each sweep right under the playback scrubber bar.
Yes, I have verified the timestamp is being set properly in the latest version of Spike (3.3.0). I have lots of files from previous versions that all appear to have valid time stamps.
I suspect your Python parser is either reading this value incorrectly or reading it from a misaligned offset.
The time is in milliseconds since epoch where epoch is defined as 1-1-1970. Here is a website that shows the current millis since epoch. https://currentmillis.com/
Andrew
nickdef May 15, 2019 at 5:14 pm
My apologies – the converter I was using was assuming seconds from epoch as input, not milliseconds, so your value does look correct.
I’ll double check my Python code.
Nick
nickdef May 15, 2019 at 9:18 pm
Problem solved. I was incorrectly parsing the SHRFileHeader which had flow-on effects when parsing the SHRSweepHeader.
Thanks for your time!
Athanasios May 18, 2019 at 12:34 am
Hello Nick and Andrew. Thank you for your replies.
I used printf("%llu \n", state2.timestamp); like you suggested but still got an enormous undefined number.
(Although I get this weird warning for %llu: "unknown conversion type character 'l' in format [-Wformat=]".)
I get a different result, but still an enormous number, when trying "%" PRIu64 " \n".
I checked the Signal Hound code again, and there is a comment next to uint64_t timestamp saying // milliseconds since epoch. So the timestamp is encoded in epoch time format.
I can't figure out what is wrong. I am using C++ with a GCC compiler.
I imported the SHR file into Spike and saw that the timestamp I need to get is 20/3/19 9:37:46.
I saw another function in the signal hound code called “vrtGetTime”. Will that also give me the timestamp?
Athanasios
Andrew May 19, 2019 at 12:49 pm
vrtGetTime is unrelated.
I would start with the example SHR parser files found in the SDK. If you don't have the latest SDK, download it from here.
Simply compile, run, and examine the parser example, then modify it from there once you have it running.
If there is any concern that the file has been modified since it has been saved from Spike, I would regenerate a new sample recording file.
Andrew
Athanasios May 23, 2019 at 4:06 am
I figured out the mistake.
I was trying to print the timestamp from inside the for loop that performed the sweep.
It worked with: printf("%" PRIu64 " \n", state2.timestamp);
Thanks for all the help.
Athanasios Loukas
Athanasios May 23, 2019 at 8:06 am
Hello Andrew. Now I have a new problem with the timestamp: I can't write it to a CSV file; I get the same enormous number as before. Do I need a special conversion before passing it to the CSV because it is uint64_t?
I passed integers and doubles into the csv without problems.