You can use the analog output of DAQ boards if you need zero-latency sounds, but then you cannot mix multiple tones on the fly and you may need an amplifier.
I probably deleted it when I edited the previous comment. Analog output channels on the same device cannot be controlled individually, so there is no point in assigning two STM objects, unless you want to send out two waveforms simultaneously. Delete the other Stimulation channels, except Stimulation 1, or run putsample like the following.

putsample(DAQ.Stimulation{1},zeros(1,length(DAQ.Stimu…
What I meant was that the time when a sound comes out of the speaker is ~40 ms later than the time recorded by the eventmarker of toggleobject. That is because sounds have to go through the Windows audio stack when they are played via the sound card. The time that eventmarkers take to reach an external machine is only ~0.2 ms, so it can be ignored.

% This loop plays the BASAL tone 4 times…
I have a little concern about the timing of your stimuli though. You used idle() to leave short intervals between the tones, but idle() is not that accurate timing-wise because it has to redraw the control screen from time to time during the wait. So, if the tones should be precisely at 150-ms intervals, I would use a userloop function, create a long wav sound there, in which the 5 tones are combined…
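For example, the combined waveform could be built roughly like this (a sketch only; the tone frequency, tone duration, and file name are placeholders to adapt to your task):

fs = 48000;                        % sample rate (Hz)
tone_dur = 0.1;                    % each tone lasts 100 ms (placeholder)
onset_step = 0.15;                 % 150-ms onset-to-onset spacing
t = (0:round(tone_dur*fs)-1)/fs;
tone = 0.5*sin(2*pi*1000*t);       % 1-kHz tone (placeholder)

wav = zeros(1,round((4*onset_step+tone_dur)*fs));
for k = 0:4
    idx = round(k*onset_step*fs) + (1:length(tone));
    wav(idx) = wav(idx) + tone;    % place each tone at its exact onset
end
audiowrite('five_tones.wav',wav(:),fs);   % then declare snd(five_tones.wav) in the userloop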
You can do this as a temporary solution. putsample has to be called after STM is turned off. If you are sending out a 5-V flat pulse, I would use a TTL instead. It is faster and easier.

toggleobject(1);  % TaskObject#1: crc(0.2,[1 1 1],1,0,0)
ontarget = eyejoytrack('acquirefix',1,3,10000);
if ~ontarget
    trialerror(4);  % no fixation
    return
end
toggleobject(2);  % TaskObject…
You don't need to (and should not) change anything if the latency test looks fine. It is just because your computer is too busy. Using a smaller resolution will be helpful, if you don't mind a little blurriness. I think 1 to 4 skipped frames are not that critical, although it depends on what you want to do.
Rewind the used sounds before turning them on again. https://monkeylogic.nimh.nih.gov/docs_RuntimeFunctions.html#rewind_object

Sounds used to be rewound automatically when turned off, but they are not anymore. This is because very long sounds can now be streamed from files, and rewinding them may take a long time while not everybody needs it.
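For example (assuming the sound is TaskObject#3 and you are in the timing script):

toggleobject(3);     % play the sound
idle(1000);
toggleobject(3);     % stop it partway through
rewind_object(3);    % rewind it before the next playback
toggleobject(3);     % now it plays from the beginning again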
One easy fix is to send out the waveform to the end (assuming that the waveform has a trailing 0) whether the fixation is broken or not. Do you need to stop it in the middle? In some applications, the last value should stay there, so I need to think about how to accommodate this.
Yes, that is the reason. I reduced the size by keeping only 16-bit color formats, but it is still huge. Compression methods work only when the video is saved as a file, and you cannot keep the compressed size once you load the video into MATLAB, so it is necessary to choose the frame size of the video wisely. I am thinking of exporting the videos as separate AVIs, but then we cannot keep them in…
Just do not include any adapter that tracks behavior, such as WaitThenHold. You can program the same thing in the scene framework like the following.

tc = TimeCounter(null_);
tc.Duration = sample_time;
scene = create_scene(tc,sample);
run_scene(scene,20);
idle(0);  % Clear the screen. Not necessary if this is not the last scene.
Please download the new version of NIMH ML and try the task attached below. Now SingleTarget gets the target position from the child adapter (CurveTracer in the attached code), if no Target is assigned.
* Changes in NIMH MonkeyLogic 2 (Nov 8, 2019)
  + A new adapter, AnalogInputMonitor, is added for online analog input monitoring. https://monkeylogic.nimh.nih.gov/docs_RuntimeFunctions.html#AnalogInputMonitor
  + During the I/O Test, the voltage range of each General Input in the display can be adjusted. To change the range, click one General Input panel and then click the curren…
You can use WaitThenHold but make its HoldTime 0. Then, when the WaitThenHold succeeds, run another scene that just shows the sample image without checking any behavior. Does it make sense?
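A rough sketch of the two scenes (TaskObject#1 is assumed to be the fixation point and TaskObject#2 the sample image; adjust WaitTime, Threshold, and the eventcodes for your task):

tgt = SingleTarget(eye_);
tgt.Target = 1;                 % TaskObject#1: fixation point
tgt.Threshold = 3;              % fixation window (deg)
wth = WaitThenHold(tgt);
wth.WaitTime = 5000;
wth.HoldTime = 0;               % succeeds as soon as fixation is acquired
scene1 = create_scene(wth,1);
run_scene(scene1,10);

if wth.Success
    tc = TimeCounter(null_);
    tc.Duration = 1000;         % show the sample without checking any behavior
    scene2 = create_scene(tc,[1 2]);   % fixation point + sample image
    run_scene(scene2,20);
end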
Use the userloop function. https://monkeylogic.nimh.nih.gov/docs_CreatingTask.html#Userloop There are a few example tasks under the task\userloop directory, too.
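If it helps, a bare-bones userloop looks roughly like this (the function name, timing file name, and stimulus strings are placeholders; see the linked page and the example tasks for the exact details):

function [C,timingfile,userdefined_trialholder] = example_userloop(MLConfig,TrialRecord)
C = [];
timingfile = 'example_timing.m';   % timing script to pair with the stimuli
userdefined_trialholder = '';

persistent first_call
if isempty(first_call)             % the very first call only needs the timing file name
    first_call = true;
    return
end

% Choose the stimuli for the upcoming trial, in the same syntax as a conditions file.
C = { 'fix(0,0)', 'pic(A.bmp,5,0)' };
end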
I built the scene framework (timing script v2) for this kind of purpose. https://monkeylogic.nimh.nih.gov/docs_CreatingTask.html#RuntimeVersion2

Take a look at the following examples in the task directory. They are not exactly the same as what you described but do something similar.

task\runtime v2\10 pursuit eye movement
task\runtime v2\11 curve trace

If you explain how you are going to…
The "confirmation" on the command window just tells you that the software did what it was supposed to do and did not detect any error in doing it. It does not necessarily mean that you should be able to see some changes on your "neural recording monitor", since it is not a test for your hardware. It seems that you did some tests on your hardware, but you need to describe thby Jaewon - Questions and Answers
I added a property (OnsetTime) to ImageChanger so that you can tell when the "correct" image was presented. The OnsetTime (an n-by-1 vector) is NaN for the images not presented during the scene. I think the easiest way to retrieve the eye trace is to use get_analog_data().

correct_image = 10;
onset = img.OnsetTime(correct_image);
if isnan(onset)  % The correct image was not presented…
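For example, something like this should pull out the eye trace from that onset up to the current moment (a sketch; it assumes OnsetTime is on the same clock as trialtime() and that get_analog_data('eye',n) returns the last n samples at the default 1-kHz rate, i.e., 1 sample per ms; double-check the docs):

elapsed = trialtime() - onset;                    % ms since the correct image appeared
eye_xy = get_analog_data('eye',round(elapsed));   % [X Y] eye trace from image onset to now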
I should mention that there is an option to send out the TTL as a second analog output, IF you have two analog output channels AND program with the scene framework. I thought we might need a new adapter, but the Stimulator adapter already does it. That way, two pulses can start and stop at exactly the same time.
I cannot update the download packages at the moment. Please overwrite your mlconfig and mlread with the attached files and see if you can read the task.
Please see the following link. https://monkeylogic.nimh.nih.gov/docs_GettingStarted.html#FormatsSupported

These commands may also help. https://monkeylogic.nimh.nih.gov/docs_RuntimeFunctions.html#behaviorsummary
I can examine why the conversion failed if you send me the config file. I think you can use Private Messages.
Thanks, Mitch. Please give me some time to take a look at the files.
Delete the configuration file of the task (*_cfg2.mat) and try again. Some fields in the MLConfig struct have changed to store information for two eye trackers. NIMH ML is supposed to convert the format seamlessly when loading old config files, but somehow it doesn't seem to have gone well here.
toggleobject does not know how long the stimulations will last. If you call two toggleobjects in a row, you are just turning them off as soon as they are on. Try inserting "idle(2000);" or "pause(2);" between them and see if that works for you. The latter is more precise in timing, but it won't update the screens for 2 s. If they are not precise enough, we can make an ada…
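In other words, something like this (assuming TaskObject#1 is the stimulation and 2 s is the desired duration):

toggleobject(1);   % turn the stimulation on
idle(2000);        % wait 2 s; idle() keeps updating the screens
toggleobject(1);   % turn it off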
There is not much that I can tell without seeing the hardware configuration and the code. Maybe the port failed to reopen because it was still in use when the next trial started. Try opening it again immediately after closing it at the end of the timing script. If it doesn't have to be opened and closed every trial, you may want to initialize it in the alert_function. https://monkeylogic.nimh.nih.…
It seems that this model is a standalone camera, not the kind that we call "webcams". Usually webcams have to be connected to computers to take pictures and cannot function independently. When connected, the computer should be able to control the webcam. If the connection is just for file transfer, it won't work. Can you record videos in the Windows Camera app or do video chatting wit…
Did you restart MATLAB after plugging in the webcams? If you did, what kind of webcams are they? Do you know their manufacturers and model names? Yes, the video is recorded once a webcam is selected in the Non-DAQ device settings. The ITI period is not recorded unless you check the ITI recording option on the main menu.
There are a few more things you can try. Windows keeps all previous monitor configurations and restores them when the same monitors are connected again. Sometimes this prevents you from starting a new, fresh configuration. So I would delete all previous configurations from the registry first, although it may not solve the problem. To do so, open regedit, go to \HKEY_LOCAL_MACHINE\SYSTEM\CurrentCo…
Can you explain in more detail? What is the purpose of getting the time during the trial? Eventmarker times usually become available after a trial via the TrialRecord structure.
You can set eventmarkers in each row of the image list. https://monkeylogic.nimh.nih.gov/docs_RuntimeFunctions.html#ImageChanger
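For example (the file names, positions, durations, and eventcodes below are placeholders, and the exact format and units of the List columns are described on the linked page):

img = ImageChanger(null_);
img.List = { 'A.bmp', [0 0], 500, 10; ...   % image, XY position, duration, eventcode
             'B.bmp', [0 0], 500, 20 };
scene = create_scene(img);
run_scene(scene);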