Did you start sending packets on the DQW side? It does not send anything until you click some button.
by Jaewon - Questions and Answers
If you are scripting in the v1 style, you can monitor the eye position around the initial fixation point with eyejoytrack('holdfix',...) and, as soon as it returns 0 (i.e., when the fixation is broken), call eventmarker. However, I advise against this unless you have a compelling reason to detect saccades in real time. Whenever possible, it is better to analyze saccades offline.
by Jaewon - Questions and Answers
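The v1-style approach described above might be sketched as follows. The TaskObject number, fixation window size, durations, and the marker code 101 are placeholder assumptions for illustration, not values from the post.

```matlab
% v1 timing script sketch: stamp an eventmarker the moment fixation breaks
fix_point = 1;                                  % TaskObject# of the fixation point (assumed)
toggleobject(fix_point, 'status', 'on');
ontarget = eyejoytrack('acquirefix', fix_point, 3, 5000);  % 3-deg window, 5-s wait (assumed)
if ontarget
    ontarget = eyejoytrack('holdfix', fix_point, 3, 2000); % hold for 2 s (assumed)
    if 0==ontarget                              % returns 0 when fixation is broken
        eventmarker(101);                       % placeholder code marking the break
    end
end
toggleobject(fix_point, 'status', 'off');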
Depending on exactly what you want to do, there can be different suggestions, but usually this is what you need a mixer for.
by Jaewon - Questions and Answers
I wrote an example task for you. This example requires you to define the boundaries of the pathways with non-overlapping rectangles. See the code in the attached file.
by Jaewon - Questions and Answers
You can use the custom calibration function. See this manual page: https://monkeylogic.nimh.nih.gov/docs_CoordinateConversion.html There is also an example task of this in the "task\runtime v1\8 customized calibration\" directory.
by Jaewon - Questions and Answers
The eye tracker has to have a voltage output option. If it provides only TCP/IP, it will not work. See this post: https://monkeylogic.nimh.nih.gov/board/read.php?3,1993
by Jaewon - Questions and Answers
What are the width and height of your videos? A video card with large memory and a powerful CPU can help, but it is not always possible to prevent skipped frames for large videos.
by Jaewon - Questions and Answers
This is not something you can fix with a little tweak. I may be able to develop something without the actual hardware, but, at some point, the hardware will be required to test whether everything works. Ask the vendor if they can send me a test unit. Tobii once lent me their top-of-the-line product for a month so that I could support their hardware. Maybe this company will do the same.
by Jaewon - Questions and Answers
Due to a system upgrade made a couple of months ago, some functions of the board (e.g., posting new topics, creating new profiles, etc.) were disabled. Most of them are restored now, but please let me know if you find anything that does not work as expected.
by Jaewon - News
* Changes in 2.0.270 (Dec 19, 2024) - OnsetDetector did not report RT due to a change made in 2.2.43. It is now working correctly. (Thanks to Chaoyi Zhang)
by Jaewon - News
Establishing a TCP/IP connection alone does not guarantee any functionality. The critical factor is the structure of the transmitted data, which varies by vendor. Unless your eye tracker is listed among the supported hardware (https://monkeylogic.nimh.nih.gov/docs_TCPIPEyeTracker.html), it is not possible to retrieve eye positions from the connection. To support new hardware, I would need access
by Jaewon - Questions and Answers
Please do your part first before asking that kind of question. Take a look at the provided example tasks and manuals.
by Jaewon - Questions and Answers
There are many ways to detect such behavior. You need to use a little imagination. For example, you can repeat the scene until the subject touches the target.
by Jaewon - Questions and Answers
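One way to implement the "repeat the scene until touched" suggestion in v2 scripting might look like the sketch below. The adapter choice, TaskObject number, and timing values are assumptions for illustration.

```matlab
% v2 sketch: re-run the scene until the subject touches the target
tgt = SingleTarget(touch_);       % track touch relative to TaskObject#1 (assumed)
tgt.Target = 1;
tgt.Threshold = 3;                % 3-deg touch window (assumed)
wth = WaitThenHold(tgt);
wth.WaitTime = 5000;              % wait up to 5 s per repetition (assumed)
wth.HoldTime = 0;
scene = create_scene(wth, 1);
while true
    run_scene(scene);
    if wth.Success, break, end    % Success becomes true once the target is touched
end
```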
Your description is confusing. What is the difference between two separate inputs and the dual screen? Are you talking about duplicating the same screen on two monitors? Then you should set that up in the Windows Display settings.
by Jaewon - Questions and Answers
I am posting your email reply here so that we can continue our discussion here. ------- Thank you very much for your response, but the monitor setup manual page is just about two separate inputs, not the dual screen I am looking for, which is two monkeys can play on two different touch screens at the same time.
by Jaewon - Questions and Answers
Have you tried the trick introduced in the monitor setup manual page? https://monkeylogic.nimh.nih.gov/docs_MonitorSetup.html#MoreThanTwoMonitors
by Jaewon - Questions and Answers
No, there isn't. You can try modifying mlplayer, though.
by Jaewon - Questions and Answers
I do not understand the purpose of your state machine. TDT can store the numbers sent via digital lines and their timestamps. Why do you need to build a state machine? Eventmarkers are not sent at the same time. As I said, all assigned digital lines are used together to represent one number, so there is no way to send more than one number at a time.
by Jaewon - Questions and Answers
I think you misunderstood something. The screenshot appears to show the traces of single digital bits. An eventmarker is a multi-digit number, not a single bit. All assigned digital lines are used at the same time whenever an eventmarker is sent out.
by Jaewon - Questions and Answers
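To illustrate why all assigned lines fire together, a multi-digit marker value decomposes into simultaneous bit states. This is plain MATLAB bit arithmetic for illustration, not NIMH ML code; the value 37 and the 8-line count are arbitrary assumptions.

```matlab
code = 37;                 % example eventmarker value (binary 100101)
bits = bitget(code, 1:8);  % state of each of 8 assigned digital lines
% bits = [1 0 1 0 0 1 0 0]: lines 1, 3, and 6 go high at the same
% moment the marker is sent; the remaining lines stay low
```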
You did not provide any information about your monitors and movies. Try a refresh rate of 60 Hz instead of 120 Hz. The size of your movies (width & height) may be so large that your system cannot process them in time. As for preloading, see the userloop example ("2 movie preloading") in the task directory of your NIMH ML installation path. It has nothing to do with skipped frames, though.
by Jaewon - Questions and Answers
To use different thresholds per target, put them in columnar order, as an n-by-1 or n-by-2 matrix, where n is the number of targets.

mul = MultiTarget(touch_);
mul.Target = [-5 0; 0 0; 5 0];
mul.Threshold = [1; 2; 3];  % a column vector
mul.WaitTime = 5000;
mul.HoldTime = 500;
scene = create_scene(mul);
run_scene(scene);

by Jaewon - Questions and Answers
Call the setCursorPos method of the mouse tracker before your Likert scale scene starts. https://monkeylogic.nimh.nih.gov/docs_RuntimeFunctions.html#Trackers

...
scene = create_scene(con);
mouse_.setCursorPos([0 0]);
run_scene(scene);
...

by Jaewon - Questions and Answers
You should have told me that information first; then we could have reached this conclusion earlier. The conclusion is that I cannot help you with this issue. NIMH ML receives touch input from Windows via standard APIs. There is nothing that makes NIMH ML any different from other applications, so you should look for the cause of the malfunction in your Windows installation or hardware, especially when it w
by Jaewon - Questions and Answers
What is your touchscreen model? When you touch the monitor outside MonkeyLogic, do you see the mouse cursor under your fingertip?
by Jaewon - Questions and Answers
I cannot tell whether it was an appropriate test, since I have not seen the task code. Please try the menu as shown in the following link. When the subject screen is touched, a hand icon will appear at the corresponding position on the control screen. https://monkeylogic.nimh.nih.gov/docs_IOTest.html
by Jaewon - Questions and Answers
How did you figure out that the touches are not registered?
by Jaewon - Questions and Answers
Did you check the option in the menu? https://monkeylogic.nimh.nih.gov/docs_MainMenu.html#OtherDeviceSettings You also need to choose how many touches you want to track, if the monitor supports multi-touch.
by Jaewon - Questions and Answers
It is just a matter of designing/building the pump circuit and writing the code for it. Do not touch reward_function.m or anything. See this for the DAQ setup: https://monkeylogic.nimh.nih.gov/docs_MainMenu.html#DAQSettings Then, call goodmonkey() in your timing script. It will internally call reward_function.m and send out TTL pulses for your reward pumps. https://monkeylogic.nimh.nih.go
by Jaewon - Questions and Answers
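A minimal sketch of calling goodmonkey() from a timing script, as suggested above; the pulse duration and repetition parameters are placeholder values, not recommendations from the post.

```matlab
% goodmonkey() internally calls reward_function.m and drives the line
% assigned as Reward in the DAQ settings
goodmonkey(100);                                   % one 100-ms TTL pulse (assumed duration)
goodmonkey(100, 'NumReward', 3, 'PauseTime', 500); % three pulses, 500 ms apart (assumed)
```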
* Changes in 2.2.48 (Sep 27, 2024) - The AnalogInputMonitor adapter can display signals from the voice recording and the high frequency sampling device as well. - The Stimulator adapter can turn off the output at the end of the scene. - Minor fixes
by Jaewon - News
Assigning a TaskObject to a different variable does not create a copy of the TaskObject but another variable that points to the same TaskObject. You should create two TaskObjects of the same movie, one at (0,0) and the other at (5,0), or change the position of TaskObject(3) to (5,0) with reposition_object() before presenting it again.
by Jaewon - Questions and Answers
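The second option above (repositioning the same TaskObject) might look like this v1-style sketch; the 1-s presentation times are placeholder assumptions.

```matlab
% Show the movie (TaskObject#3) at (0,0), then again at (5,0)
toggleobject(3, 'status', 'on');
idle(1000);                       % play at the original position for 1 s (assumed)
toggleobject(3, 'status', 'off');
reposition_object(3, 5, 0);       % move TaskObject#3 to (5,0)
toggleobject(3, 'status', 'on');
idle(1000);
toggleobject(3, 'status', 'off');
```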