08-17-2021 02:33 PM
OK, this is already happening when the images are loaded.
Note that the whole "stitching" happens through the "tunnel mode: concatenate" - this has no effect at all.
I have no idea why this is happening, but there must be some kind of image processing going on in the IMAQ subVIs.
I guess it's some processing step in either IMAQ Read or IMAQ ImageToArray - both subVIs wrap DLL calls ... so I can't dig any deeper.
08-17-2021 02:56 PM
So, IMAQ is wrong to rescale these values - when we read images, the pixel values shouldn't be changed automatically. I tried to determine whether there is some threshold IMAQ uses to decide when to scale, but I wasn't able to narrow anything down. Based on the histogram, it appears to use a factor of 130,000 to scale the values down, but that could be specific to this image. So, should I use Python instead of IMAQ when joining images in LabVIEW, given its unpredictable behavior?
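Something like this is what I have in mind for the Python side - a minimal OpenCV sketch that joins the tiles without touching the pixel values (the file names are placeholders, and the tiles are assumed to have the same height):

import cv2
import numpy as np

# Placeholder tile file names -- substitute the real ones.
files = ["tile_0.tif", "tile_1.tif", "tile_2.tif"]

# IMREAD_UNCHANGED keeps the original bit depth (e.g. uint16) and does not
# rescale pixel values on load, so what you read is what is in the file.
tiles = [cv2.imread(f, cv2.IMREAD_UNCHANGED) for f in files]

# Join the tiles side by side (assumes equal height); use axis=0 to stack
# them vertically instead.
stitched = np.concatenate(tiles, axis=1)

print(stitched.dtype, stitched.min(), stitched.max())  # verify the values are untouched
cv2.imwrite("stitched.tif", stitched)                  # OpenCV can write 16-bit TIFFs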
08-17-2021 03:03 PM
I can picture ( 😉 ) two use-cases:
the IMAQ images "look" better, but the Python/OpenCV images are more "accurate"
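Roughly something like this (hypothetical file name; the min/max stretch is only my guess at the kind of rescaling IMAQ applies to make the image "look" better):

import cv2
import numpy as np

# Hypothetical file name
raw = cv2.imread("tile_0.tif", cv2.IMREAD_UNCHANGED)

# "Accurate": the raw counts, exactly as stored in the file.
print(raw.dtype, raw.min(), raw.max())

# "Looks better": stretch to the full 8-bit range for display -- my guess at
# what an automatic rescaling step would do.
display = cv2.normalize(raw, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
print(display.dtype, display.min(), display.max())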
08-17-2021 03:23 PM
My thoughts as well. Cheers @alexderjuengere 😀
08-18-2021 09:15 AM
I had to look at this again ... there is obviously some histogram matching - with the first image as the reference - going on in IMAQ.
https://en.wikipedia.org/wiki/Histogram_matching
I made a quick attempt to reproduce the histogram matching, but I'll probably just use IMAQ if I ever need this.
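For the record, a sketch of what that attempt could look like in Python, using scikit-image's match_histograms with the first tile as the reference (file names are placeholders, and this is only my guess at what IMAQ does internally):

import cv2
import numpy as np
from skimage.exposure import match_histograms

# Placeholder tile file names; the first tile is taken as the reference,
# which is what IMAQ appears to do.
files = ["tile_0.tif", "tile_1.tif", "tile_2.tif"]
tiles = [cv2.imread(f, cv2.IMREAD_UNCHANGED) for f in files]

reference = tiles[0]
matched = [reference] + [match_histograms(t, reference) for t in tiles[1:]]

# match_histograms returns float arrays, so cast back to the original dtype
# before joining the tiles side by side.
stitched = np.concatenate([np.asarray(m).astype(reference.dtype) for m in matched], axis=1)
cv2.imwrite("stitched_matched.tif", stitched)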
This might be an actual use case for those TensorFlow neural nets ... never mind.
08-19-2021 08:06 AM
Maybe I'll need this again.