Show all posts by user
ML programming tips
I have no idea what you are trying to do. If you are comparing the timestamp intervals, you can take diff() of the 2nd column and diff() of the 3rd column and compare the results.
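As a hedged sketch (the variable name and column layout are my assumptions, since the original data format is not shown), the interval comparison could look like this:

```matlab
% Hypothetical sketch: compare inter-event intervals between two timestamp
% columns. 'events' and the column indices are assumed, not from the post.
d2 = diff(events(:,2));          % intervals between rows, 2nd column
d3 = diff(events(:,3));          % intervals between rows, 3rd column
max_mismatch = max(abs(d2 - d3)) % close to 0 if the two streams agree
```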
by Jaewon - Questions and Answers
If what you want is just the filename of the conditions file, you can retrieve it from MLConfig.
cond_file = MLConfig.MLPath.ConditionsFile;
by Jaewon - Questions and Answers
Use it as a target and display the graphic separately, as in Example 3 at the following link.
https://monkeylogic.nimh.nih.gov/docs_RuntimeFunctions.html#SingleTarget
by Jaewon - Questions and Answers
It works the same way as presenting any other stimulus. Define a TTL TaskObject in the conditions file and turn it on/off with toggleobject().
https://monkeylogic.nimh.nih.gov/docs_TaskObjects.html
https://monkeylogic.nimh.nih.gov/docs_RuntimeFunctions.html#toggleobject
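For illustration, a minimal sketch (the TaskObject number, TTL line, and pulse duration are assumed examples):

```matlab
% In the conditions file, define a TTL TaskObject, e.g. as TaskObject#2:
%   ttl(1)              % TTL on digital output line #1 (assumed line number)
%
% In the timing script, toggle it like any other stimulus:
toggleobject(2);        % TTL goes high
idle(100);              % hold for ~100 ms (example duration)
toggleobject(2);        % TTL goes low
```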
by Jaewon - Questions and Answers
The register function makes the digital line stand by so that the TTL can go off when the next frame is displayed. However, the putvalue function changes the line state immediately, so the duration of the signal may be a little shorter than you think.
Calling register() multiple times is okay.
Also try the new ML version. I made the start time and duration of TTLOutput adjustable.
https://
by Jaewon - Questions and Answers
No, it is not the right section. The Blackrock LED driver is for optogenetic stimulation.
I am not an electronics engineer, but you can send out a TTL to turn on/off an LED. If the LED does not need a large amount of current, maybe it can be driven directly by the TTL. Otherwise, add a relay switch to the LED circuit and control the relay with the TTL.
by Jaewon - Questions and Answers
I think using trial-unique identifiers or comparing the intervals between the timestamps would be easier, but it is up to you.
First of all, ML timestamps are not referenced to event 9 of that trial. See this manual page.
https://monkeylogic.nimh.nih.gov/docs_GettingStarted.html#AlignTimestampsAndAnalogData
All ML timestamps, including events 9 and 18, are referenced to the Absolut…
by Jaewon - Questions and Answers
See this manual page for the block selection/change function.
https://monkeylogic.nimh.nih.gov/docs_TaskflowControl.html
You can also use the userloop function.
https://monkeylogic.nimh.nih.gov/docs_CreatingTask.html#Userloop
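As a rough sketch of the userloop interface described on that page (the stimuli and filenames here are placeholders):

```matlab
function [C,timingfile,userdefined_trialholder] = my_userloop(MLConfig,TrialRecord)
% Hypothetical userloop sketch: choose the next trial's stimuli at runtime.
% See docs_CreatingTask.html#Userloop for the authoritative interface.
C = { 'fix(0,0)', 'pic(stim.bmp,5,0)' };   % TaskObjects for the next trial
timingfile = 'my_timing.m';                % timing script (assumed filename)
userdefined_trialholder = '';              % '' = use the default trialholder
end
```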
by Jaewon - Questions and Answers
I think there is a bit of misunderstanding about what the master clock signal does. From your description, I believe this master clock is an external sample clock, not a synchronization or reset signal. Do you have any test result or the manufacturer's manual about how the master clock is used in the logger?
We may still use it to sync events, if ML begins to send out the master clock sig…
by Jaewon - Questions and Answers
You did not answer my questions about how your system works. Is your device a wireless logger attached on the subject's head? Does it save data to a memory card? How do you deliver the master clock signal to the wireless logger?
by Jaewon - Questions and Answers
There are so many things wrong. You should read the eyejoytrack manual first.
https://monkeylogic.nimh.nih.gov/docs_RuntimeFunctions.html#eyejoytrack
eyejoytrack() needs at least 4 input arguments, but you wrote only 3 in every single eyejoytrack command.
You expect simultaneous input from two bars, so you need to check both of them in one eyejoytrack command, not in two separate commands.
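A hedged sketch of checking both bars in one call (the TaskObject numbers, window sizes, and tracking keyword are my assumptions; check the manual page above for the exact signature):

```matlab
% Wait up to 5000 ms for touches on BOTH bars (TaskObject#3 and #4) at once.
% Each tracked target gets its own function/target/threshold triplet,
% followed by the single shared time limit.
ontarget = eyejoytrack('acquiretouch',3,3, 'acquiretouch',4,3, 5000);
if all(ontarget)
    % both bars were acquired within the time limit
end
```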
by Jaewon - Questions and Answers
I do not understand how you do the synchronization with square waves. If they are periodic pulses, how can you tell when a particular pulse was generated?
Does your wireless system save data locally (e.g., to a memory card) or remotely (e.g., by transmitting the signals to the base station through the air)? Do you need to feed the square waves to the wireless device (i.e., the head implant)?
by Jaewon - Questions and Answers
If it is not data saving, it may be stimulus creation that is consuming the time. I guess you need to test the possibilities one by one. There is not much I can do, since I do not know what your code does.
by Jaewon - Questions and Answers
Of course, anything that you save with bhv_variable() slows things down. Are you saving something that grows larger each trial?
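For example, a constant-size record per trial is harmless, while re-saving a growing array is the pattern to avoid (a sketch with assumed variable names):

```matlab
% Fine: a small, constant-size value saved each trial
bhv_variable('rt', rt_this_trial);

% Avoid: re-saving an array that grows every trial
% all_rt(end+1) = rt_this_trial;
% bhv_variable('all_rt', all_rt);   % write size increases trial by trial
```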
by Jaewon - Questions and Answers
What filetype are you using? If it is not BHV2, try switching to BHV2. H5 and MAT have the problem of getting slower as the file size gets larger.
by Jaewon - Questions and Answers
Thanks for sharing the adapter. I just have a couple of suggestions.
You wrote your addition right on the code of an existing adapter. This can cause a problem, because the interfaces of an adapter (i.e., the names of properties and methods) may be preserved across updates, but the code lines are not. If the adapter that your work is based on is rewritten in a new ML version, your adapter may…
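A safer pattern is to put your addition in a new class that inherits from the existing adapter and calls it only through its public interface. A minimal sketch, assuming the standard adapter template (the class and method bodies here are illustrative):

```matlab
classdef MyAdapter < SingleTarget   % inherit instead of editing SingleTarget
    methods
        function obj = MyAdapter(varargin)
            obj = obj@SingleTarget(varargin{:});  % pass through to the parent
        end
        function continue_ = analyze(obj,p)
            continue_ = analyze@SingleTarget(obj,p);  % run parent logic first
            % ... your per-frame addition goes here ...
        end
    end
end
```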
by Jaewon - Questions and Answers
* Changes in 2.2.37 (Jun 15, 2023)
+ The number of parameters that can be received from TCP/IP eye trackers is
increased from eight to ten.
- Fixed the issue that the Success property of SingleTarget and MultiTarget
was not reset when those adapters were reused in the same trial. This error
was introduced in 2.2.28 (Jul 5, 2022).
- Fixed a problem in using a movie as a FIX…
by Jaewon - News
I think your approach, reducing the ITI to 0, is the best. Since NIMH ML does need time to prepare for the next trial, the ITI cannot be truly 0, but it will be minimal.
A new trial starts after the ITI, so there is no point in adding the video at the beginning. If you do not turn off the avatar movie at the end of the trial, however, the last shown frame will stay on the screen throughout the ITI.
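A minimal sketch of blanking the movie at the end of the trial (TaskObject#1 is an assumed index for the avatar movie):

```matlab
% Turn the avatar movie off before the trial ends, so its last frame
% does not persist on the screen through the ITI.
toggleobject(1,'status','off');
```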
by Jaewon - Questions and Answers
Why don't you move the data files to the designated directory afterward?
by Jaewon - Questions and Answers
I added 22050 Hz. To check all the supported recording formats, type daqhwinfo('wasapi') on the MATLAB command window. The daqtoolbox directory of NIMH ML should be in the MATLAB path.
>> info = daqhwinfo('wasapi')
info =
  struct with fields:
    AdaptorName: 'wasapi'
     BoardNames: {'Microphone (2- USB PnP Sound Device)…
by Jaewon - Questions and Answers
* Changes in 2.2.36 (May 24, 2023)
+ SineGrating and RandomDotMotion are now graphic adapters.
+ Graphic adapters can be targets of behavior tracking adapters, such as
SingleTarget, MultiTarget, CurveTracer and DragAndDrop.
+ Graphic objects (both TaskObjects and adapters) used as targets are now
turned on and off automatically when a scene starts and ends. There is no need…
by Jaewon - News
Download the new version. It will allow you to change the joystick cursor in the timing script. Refer to the attached example and also the following manual page.
https://monkeylogic.nimh.nih.gov/docs_RuntimeFunctions.html#JoyCursor
by Jaewon - Questions and Answers
Your sound card is recording at 48 kHz. Not every sound card supports the format you choose. If the chosen sample frequency is not supported, NIMH ML resamples the recorded data before saving them to the disk. Go to the Sound control panel and see which formats are supported by your device. Actually 22050 Hz is not that popular these days, so NIMH ML scans only for higher frequencies. I can include…
by Jaewon - Questions and Answers
That functionality has been removed for a long time. Let me think about how to put it back.
by Jaewon - Questions and Answers
A joystick is not the same as a mouse. A mouse movement displaces the cursor from its current position, but tilting the joystick handle displaces the cursor from the origin. Keep this in mind and try the attached example.
If what you want is to change the origin of the joystick cursor, see this manual page.
https://monkeylogic.nimh.nih.gov/docs_CoordinateConversion.html
by Jaewon - Questions and Answers
The Interleaf property determines how many frames later the dots presented in one frame reappear on the screen; it should be set to 3. My implementation of the random dot motion stimuli is based on the papers of Michael Shadlen's group (a.k.a. the Movshon/Newsome version). You can find detailed descriptions of it in any of his papers, including this one. For the different algorithm…
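The setting described above might look like this in a scene-framework timing script (the RandomDotMotion and TimeCounter adapter names follow the ML docs; the other property values are arbitrary examples):

```matlab
dot = RandomDotMotion(null_);   % null_: the scene framework's dummy tracker
dot.Interleaf = 3;              % dots from frame n reappear in frame n+3
dot.Direction = 0;              % deg (example value)
dot.Coherence = 50;             % percent (example value)
tc = TimeCounter(dot);          % show the dots for a fixed duration
tc.Duration = 1000;             % ms
scene = create_scene(tc);
run_scene(scene);
```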
by Jaewon - Questions and Answers
I added a way to create a GEN video from a BGRA vector. Please update your NIMH ML and try the attached example.
To indicate that the provided data is a BGRA vector, not a MATLAB bitmap matrix, the info struct that the GEN function returns must have the … of the video in the DoNotPermute field. See the attached example for more details.
by Jaewon - Questions and Answers
If you are writing the script for someone else, ask the person whether a ~2-sec ITI is okay. Typically the ITI is not a concern when people talk about timing and latency.
If video data can be fed directly in the 32-bit color format, the stimulus creation time will be reduced considerably. Let me think about how to add an interface for it.
By the way, the byte stream that your mex file…
by Jaewon - Questions and Answers