
2D spatial transformation calibration

Posted by Anna_mito 
2D spatial transformation calibration
March 07, 2022 01:12PM
Hi,
We have some issue with 2D spatial transformation for calibrating eye signal: we noticed that by using more than 5 fixation points (which we need for our experiment in order to test the visual field up to 25-30 degrees) the assigned nulling values shift from the acquired initial ones. For calibration we use the same points structure shown in the picture under the iii paragraph of ML Docs (https://monkeylogic.nimh.nih.gov/docs_CalibratingEyeJoy.html), the only difference is that the external square of points is more spaced apart from the central one. In particular, we have a kind of rotation of the eye end point during a saccade task concerning the more external points (we can't use autodrift function because we need to catch the actual subject drift in case it occurs). Since in the Docs it is reported that calibtrating with more than 5 points requires a projective transformation, maybe we are making some mistakes selecting/ordering the calibration points. Any suggestion to solve the problem?
Hope I explained the details properly, thank you so much.
Re: 2D spatial transformation calibration
March 07, 2022 03:38PM
I do not understand. What do you mean by the assigned nulling values?

Are you saying that the calibrated eye positions are rotated when the subject looks at outer fixation points? Are you seeing it during the calibration or does it occur over time?

2-D Spatial Transformation is the projective transformation. It always needs 4 or more fixation points to solve the matrix equation.
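To make the "4 or more points" requirement concrete, here is a minimal sketch of how a projective (homography) transform can be fit from point correspondences by the standard direct linear transform. This is Python/NumPy, not NIMH ML's actual code, and the raw/degree values are made up for illustration:

```python
import numpy as np

def fit_projective(src, dst):
    """Least-squares projective (homography) fit from >= 4 point pairs.

    src, dst: (n, 2) arrays of corresponding points.
    Returns a 3x3 matrix H mapping src -> dst in homogeneous coordinates.
    """
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    A = np.zeros((2 * len(src), 9))
    for i, ((x, y), (u, v)) in enumerate(zip(src, dst)):
        # Each point pair contributes two rows of the DLT system A h = 0
        A[2 * i]     = [x, y, 1, 0, 0, 0, -u * x, -u * y, -u]
        A[2 * i + 1] = [0, 0, 0, x, y, 1, -v * x, -v * y, -v]
    # The solution is the right singular vector of the smallest singular value
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1].reshape(3, 3)

def apply_projective(H, pts):
    """Map (n, 2) points through H, dividing out the homogeneous scale."""
    pts = np.asarray(pts, float)
    p = np.column_stack([pts, np.ones(len(pts))]) @ H.T
    return p[:, :2] / p[:, 2:3]

# Hypothetical example: 4 raw tracker readings vs. their true screen
# positions in degrees. With exactly 4 pairs the fit is exact.
raw = [(0.1, 0.1), (0.9, 0.1), (0.9, 0.9), (0.1, 0.9)]
deg = [(-20, -20), (20, -20), (20, 20), (-20, 20)]
H = fit_projective(raw, deg)
print(np.round(apply_projective(H, raw)))  # recovers the degree coordinates
```

With fewer than 4 pairs the 9-parameter matrix is underdetermined, which is why the 2-D Spatial Transformation always needs at least 4 fixation points.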

-----

What were the degree coordinates of the outermost fixation points in your calibration? Make the calibration field larger than the visual field you want to test, since extrapolation may not work well.
Re: 2D spatial transformation calibration
March 14, 2022 10:19AM
As you can see in the first picture, the calibrated eye position for point 15 (and also 16 and 17) is shifted from the actual one. This produces the kind of rotation of the saccade end point that I mentioned in my last message (second picture). Moreover, every time we select fixation points for calibration, MATLAB gives us the following warning: "Matrix is close to singular or badly scaled. Results may be inaccurate." That is why I said we might be making a mistake in selecting or ordering the calibration points.
About the extrapolation issue: if we are testing the visual field up to 20°, we place the outer square of points at 20°. Is it necessary to set them at 25°, for example?
Attachments:
WhatsApp Image 2022-03-09 at 14.26.32.png (792 KB)
WhatsApp Image 2022-03-09 at 14.31.05.png (433.5 KB)
Re: 2D spatial transformation calibration
March 14, 2022 03:06PM
Do you see the map on the right above the Save button? Do you see how closely the red circles cluster there and how far the white arrows stretch? That is because the gain of your eye tracker input is too small to distinguish one fixation point from another. As a result, the multiplication factor becomes too large, and even a small difference in input is amplified into a large error.
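The amplification effect can be sketched numerically. This is a hypothetical Python/NumPy toy (using a simpler affine fit as a stand-in for the projective one; the noise-amplification argument is the same), with made-up gain and noise figures: the same absolute tracker noise produces a far larger calibration error when the recorded signals span only a tiny voltage range.

```python
import numpy as np

def fit_affine(volts, deg):
    """Least-squares affine map volts -> degrees (simpler stand-in for
    the projective fit; the noise-amplification argument is the same)."""
    A = np.column_stack([volts, np.ones(len(volts))])
    coef, *_ = np.linalg.lstsq(A, deg, rcond=None)
    return coef

def worst_error(gain_volts_per_deg):
    """Worst calibration error (deg) for a given eye-signal gain, with a
    fixed amount of tracker noise (hypothetical numbers)."""
    deg = np.array([(-20.0, -20), (20, -20), (20, 20), (-20, 20), (0, 0)])
    rng = np.random.default_rng(1)       # same noise draw for both gains
    noisy = gain_volts_per_deg * deg + rng.normal(0, 0.002, deg.shape)
    coef = fit_affine(noisy, deg)        # calibrate on noisy recordings
    clean = gain_volts_per_deg * deg     # later, clean readings at the same targets
    pred = np.column_stack([clean, np.ones(len(deg))]) @ coef
    return np.abs(pred - deg).max()

low_gain = worst_error(0.001)   # signals span ~0.04 V: clustered red circles
high_gain = worst_error(0.1)    # boosted signals span ~4 V
print(low_gain, high_gain)      # the low-gain error is far larger
```

The nearly coincident low-gain readings also make the calibration matrix ill-conditioned, which is consistent with the "matrix is close to singular or badly scaled" warning MATLAB reports.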

You need to boost the eye tracker input. If you use voltage signals, there should be a menu for it in the control software. If you are using TCP/IP, increase the gain in the TCP/IP eye tracker menu of NIMH ML. You may need to move the camera closer to your subject. It appears that the gain should be 5-10 times larger. Ideally, the red circles should spread as widely as the cyan dots, so that the arrows connecting corresponding points become shorter.

Also, I would spread out the inner fixation points (1-4 & 6-9) or add more fixation points between the inner square and the outer square, to cover the field more evenly. The larger the distance between fixation points, the bigger the interpolation error.
Re: 2D spatial transformation calibration
April 06, 2022 05:58AM
We made adjustments as you suggested, modifying the gain, spreading out the inner fixation points, and adding more fixation points between the inner and outer squares. It seems to be working, so thank you so much for your help.
Re: 2D spatial transformation calibration
April 19, 2022 09:48AM
Hi,
As we told you previously, after your suggestions we improved the 2D calibration by moving the camera closer to our subject and boosting the eye tracker input; however, there are still two positions with a 2-3 degree shift. We noticed that when calibrating the eye signal, MATLAB still gives us the warning: "Matrix is close to singular or badly scaled. Results may be inaccurate." Here is a picture of the calibration scheme we are using. Is it possible that we are making a mistake in ordering the fixation points numerically, which might lead to an inaccurate matrix calculation? Is there a specific rule we have to follow when selecting fixation points, in terms of numeric order?
Thank you in advance.
Attachments:
Image.BMP.jpg (117.1 KB)
Re: 2D spatial transformation calibration
April 19, 2022 07:27PM
Increase the eye signal gain a lot more! I do not see any difference from your previous picture. I would pick different fixation points, like the yellow circles in the attached figure, but nothing will matter unless you boost the eye signals so that they spread like the yellow dots in the map. What kind of eye tracker are you using, and how is it connected to NIMH ML (DAQ vs. TCP/IP)?
Attachments:
Image.BMP.jpg (235.6 KB)
Re: 2D spatial transformation calibration
April 21, 2022 09:03AM
We tried the fixation point scheme you suggested, but the center point then shifted beyond the fixation window, and we still did not solve the issue with the previously mentioned points. Moreover, we boosted the eye signal as much as possible, but we cannot increase it further because it saturates. As an eye tracker, we are using a CHAMELEON3 CME-U3-13Y3M camera, connected to NIMH ML through a DAQ; the eye signal goes from the FlyCapture software to the eye-tracking software, Oculomatic (https://doi.org/10.1016/j.jneumeth.2016.06.016).
Re: 2D spatial transformation calibration
April 21, 2022 02:07PM
Forget about the fixation point scheme. It has very little to do with the calibration results. I do not understand what you mean by "the center point shifted beyond the fixation window." Why don't you just film your calibration process and show it to me?

The voltage range of your eye signals has not been changed in your pictures. What did you do to boost the signal? It would be helpful, too, if you could show me the screen of your Oculomatic. I would like to see how big the eye is on the window.
