- andrewclegg January 9, 2019 at 5:47 pm
Would it be possible to add a flag in the .shr files that indicates that an “Uncal/IF overload” condition existed for a given sweep?
I perform offline processing of data and could use the flag to ignore any sweeps that were acquired during overload. Also, without the flag, when playing back previously recorded data in Spike, there is no way to indicate that uncal/overload conditions existed when a sweep was originally acquired.
Can this be a new feature in a future version of Spike? Or is there some way to accomplish this already?
As always, thanks!
Andy
Andrew January 10, 2019 at 11:27 am
Thank you for the request. You are correct in noting that the overload condition is not stored in the sweep file. I can look into adding this in Spike.
It sounds like you are saving the files in Spike and parsing them in your own application? If so, would simply storing the overload condition be adequate for your needs? If there were a setting to not save overloaded sweeps, would that also be adequate?
I look forward to your response.
andrewclegg January 10, 2019 at 4:27 pm
Thanks for the quick response.
Generally, yes, I acquire and save in Spike, and process the data in my own Python applications. I sometimes also look at the recorded data in Spike. Probably 75%/25% split. In either case, it would be good to know which sweeps are potentially corrupted (or uncalibrated) due to overload.
Preferentially, all sweeps would be kept, and those that are overloaded would be flagged. This would allow continuous data acquisition, even if some of the sweeps should be considered suspect, and then I can deal with them accordingly and as desired in post-processing.
The next best solution would be to simply not record sweeps that are overloaded, but personally I would prefer the first solution, since to me it’s always better to collect as much data as possible and deal with imperfections in post-processing. For some applications, I could imagine it’s better to have a sweep even if it’s uncal than to be missing data completely (for example, when trying to capture the exact time and general nature of an infrequent short burst).
Of course, the truly ideal solution is to have both: the option to either record and flag overloaded sweeps, or not record them.
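To illustrate the “record everything but flag it” option, a minimal post-processing sketch might look like this. The `Sweep` type and its `overloaded` field are hypothetical stand-ins, not the actual .shr layout:

```python
# Hypothetical post-processing filter: keep every sweep, but separate
# the ones flagged as overloaded so they can be handled as desired.
# The Sweep class below is illustrative only.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Sweep:
    timestamp: float                  # acquisition time, seconds
    data: List[float] = field(default_factory=list)
    overloaded: bool = False          # hypothetical flag from the file

def partition_sweeps(sweeps: List[Sweep]) -> Tuple[List[Sweep], List[Sweep]]:
    """Split a recording into clean sweeps and suspect (overloaded) sweeps."""
    clean = [s for s in sweeps if not s.overloaded]
    suspect = [s for s in sweeps if s.overloaded]
    return clean, suspect
```

With this approach nothing is ever discarded at acquisition time; the decision about what to do with suspect sweeps is deferred entirely to post-processing.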
I believe there are 16 bytes of unused data in each sweep (uint64_t reserved). Could the overload condition be encoded in part of those unused bits?
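For example, if the flag were packed into one of those reserved words, decoding it in Python would be a one-liner. The bit position and field layout here are pure assumptions for illustration, not the actual .shr specification:

```python
import struct

# Hypothetical: suppose the overload condition occupied bit 0 of the
# first of two reserved uint64_t fields (16 bytes total) in each
# sweep header. This layout is an assumption, not the real format.
OVERLOAD_BIT = 0x1

def sweep_overloaded(reserved_bytes: bytes) -> bool:
    """Decode the hypothetical overload flag from 16 reserved bytes."""
    reserved1, _reserved2 = struct.unpack("<QQ", reserved_bytes)
    return bool(reserved1 & OVERLOAD_BIT)
```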
By the way, when decimation is used, how are overloaded sweeps that occur during the decimation period handled? Is there any special treatment or are they handled like any other sweep? If the latter, it would probably be good to flag the affected decimated sweeps too, if they are possibly corrupted.
I always try to avoid acquiring important data in overload conditions, but sometimes I’m on the edge due to dynamic range considerations, and it would usually be good to be able to identify uncal data if it happens.
Andy
Andrew January 14, 2019 at 8:59 am
Thank you for the information, Andy. I have made notes of your requests and will see about getting this functionality into a future release of Spike.
Regards
andrewclegg January 14, 2019 at 9:39 am
Perfect, thanks. BTW, I really appreciate all the work you put into Spike. It’s a great piece of software, and keeps getting even better with each release.
Andy
andrewclegg January 7, 2020 at 2:10 pm
Thanks very much for implementing this feature! May I be bold enough to ask for one tweak? That is related to the decimation issue raised above. When recording using the decimation feature, would it be possible to set the overload flag on a decimated sweep if one or more constituent sweeps acquired during the decimation period were overloaded?
Andy
Andrew January 8, 2020 at 6:20 am
What behavior are you seeing now? Looking at the code, it should already be doing that: the output sweep is marked with an ADC overload if any of the sweeps that contributed to it had an ADC overload condition.
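The flag-combining rule described above can be sketched as follows. The averaging step and function names are illustrative assumptions; the point is only that the output sweep's overload flag is the OR of its constituents' flags:

```python
import statistics
from typing import List, Tuple

# Illustrative sketch of decimation with flag propagation: N constituent
# sweeps (each a tuple of samples plus an overload flag) are combined
# into one output sweep. Averaging is assumed here for simplicity; the
# key behavior is that the output is marked overloaded if ANY
# constituent sweep was overloaded.
def decimate(sweeps: List[Tuple[List[float], bool]]) -> Tuple[List[float], bool]:
    samples = [s for s, _flag in sweeps]
    averaged = [statistics.fmean(column) for column in zip(*samples)]
    overloaded = any(flag for _s, flag in sweeps)
    return averaged, overloaded
```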
Look forward to your response,
Andrew
andrewclegg January 8, 2020 at 6:35 am
Hmmm, my script is supposed to tell me if any of the sweeps read in from my files (decimated or otherwise) have the overload flag set. I tested the functionality on some files that were not recorded with decimation on and it worked, but then I didn’t see any flags on files with decimation on, even though I know there were some overloads. It’s possible (or likely) that I screwed something up on my end. Let me check later today. Sorry for the apparent false alarm.
andrewclegg January 9, 2020 at 6:01 am
Andrew, it was an error in my script. I am now seeing the flag set in recordings with decimation on if a constituent sweep shows an overload during recording. Thanks very much.