Hi Jaewon, Last month I modified OnOffMarker to achieve a similar function. I needed this modification because, in the context of , I was using eventcodes to indicate whether the eye fixation was within the window. However, when eye movements frequently went in and out of the fixation window, it produced too many eventcodes and even conflicts with other task-related … by MoonL - Questions and Answers
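For readers landing on this thread, the stock setup this modification builds on looks roughly like the sketch below (adapter and property names follow the NIMH ML adapter framework; the eventcode values 21/22 and the window parameters are placeholders):

```matlab
% Sketch: mark fixation in/out with eventcodes via OnOffMarker.
fix1 = SingleTarget(eye_);          % track gaze against a target
fix1.Target = [0 0];                % fixation point at screen center (deg)
fix1.Threshold = 3;                 % fixation window radius (deg)

marker = OnOffMarker(fix1);
marker.ChildProperty = 'Success';   % watch the child's Success flag
marker.OnMarker = 21;               % eventcode sent when gaze enters the window
marker.OffMarker = 22;              % eventcode sent when gaze leaves the window

scene = create_scene(marker);
run_scene(scene);
```

With frequent saccades across the window edge, every Success transition fires a code, which is exactly the flood of eventcodes the post describes.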
Thank you for updating this feature, Jaewon! I can't wait to try it out!
Hi, MonkeyLogic experts! I am using ImageChanger to present many (100+) images to subjects. I know that ImageChanger has an eventcode field for recording presentation times, but I'm interested in using a photodiode for even more precise timing of image presentation. Could you kindly provide guidance on how to effectively integrate ImageChanger with a photodiode for this purpose? Your assistance would be greatly appreciated.
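One common approach is to pair each test image with a bright patch in a screen corner and tape the photodiode over that corner, so the diode reports the true onset of every frame that carries a stimulus. A sketch of what that could look like with ImageChanger's List property (file names, positions, durations, and eventcodes here are all placeholders; check the ImageChanger documentation for the exact List column format and duration units):

```matlab
% Sketch: each row shows a stimulus plus a white corner patch for the
% photodiode, so stimulus onset and patch onset share the same frame.
img = ImageChanger(null_);
img.List = { ...
    {'stim001.png','white_patch.png'}, [0 0; 9 -7], 12, 101; ...  % image 1 + patch
    {'stim002.png','white_patch.png'}, [0 0; 9 -7], 12, 102 };    % image 2 + patch
scene = create_scene(img);
run_scene(scene);
```

Note also that NIMH ML's main menu offers a built-in Photodiode Trigger option that draws a small square in a chosen screen corner in sync with screen updates, which may cover this use case without any custom stimuli.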
Hi, You can first convert the Chinese sentences into images and then display them as pictures, which is something I used to do with Psychtoolbox. By the way, are you planning to show Chinese text to monkeys? Can they recognize text? MoonL
Hi Jaewon: Thank you for your valuable suggestions and modifications! I apologize for the redundancy caused by my unfamiliarity with class inheritance. I have already tried implementing your suggestions and incorporated them into my code. Thank you!
Hi, I need to change the fixation window size during a trial, which can last up to 200 seconds for an fMRI scan. Breaking the trial to make this change would require stopping the scan, which is expensive. Therefore, I modified SingleTarget so that the window threshold can be adjusted during a scene with a hotkey: hotkey('i', "fix1.WaitForChange = 1; fix1.ChangeVal = 1;");
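One way to get this behavior without editing SingleTarget itself is a small subclass, sketched below. The class name is hypothetical; WaitForChange and ChangeVal mirror the hotkey in the post, and the analyze signature follows the NIMH ML adapter framework:

```matlab
% Sketch: a SingleTarget subclass whose window threshold can be changed
% mid-scene. A hotkey sets WaitForChange/ChangeVal; the next analyze()
% call applies the new threshold.
classdef ResizableTarget < SingleTarget
    properties
        WaitForChange = 0   % set to 1 from a hotkey to request a resize
        ChangeVal = 0       % new window threshold, in degrees
    end
    methods
        function continue_ = analyze(obj, p)
            if obj.WaitForChange            % apply the pending change once
                obj.Threshold = obj.ChangeVal;
                obj.WaitForChange = 0;
            end
            continue_ = analyze@SingleTarget(obj, p);
        end
    end
end
```

Usage would then match the hotkey in the post, e.g. hotkey('i', "fix1.WaitForChange = 1; fix1.ChangeVal = 1;"), with fix1 constructed as a ResizableTarget instead of a SingleTarget.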
Thank you, Jaewon! I thought I could use the network to send the eventcodes in NIMH adapters like ImageChanger, so that the data server can figure out the image index series, just like what I do in . I will try to run webwrite and ImageChanger in parallel to meet this need. Thank you!
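For anyone trying the same thing: if the data-server side exposes an HTTP endpoint, the MATLAB side of this can be a single webwrite call. A minimal sketch, assuming a listener at a made-up address (the URL and JSON field name are hypothetical; webwrite and weboptions are standard MATLAB functions):

```matlab
% Sketch: push an event code to a network listener over HTTP instead of
% the NI digital port, lifting the 8-bit/256-code limit.
code = 300;                                   % a code beyond the 8-line digital range
opts = weboptions('MediaType','application/json', 'Timeout',1);
webwrite('http://192.168.1.50:8000/eventcode', struct('code',code), opts);
```

Keep in mind that HTTP delivery adds nondeterministic latency compared with a digital line, so it is better suited to labeling (e.g. image indices) than to precise timing.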
I don't know how to do that. I thought I could only send eventcodes from NI digital ports. Could you please teach me how to send them as the NI board does? I can only find digital ports in the "behavioral code" subsystem. Thank you!
I'm sorry I didn't make myself clear. The data server has a limit on the number of digital ports. In my current experiment, I need more than 256 event codes, but the data server can receive at most 8 ports. If I can send event codes via an HTTP server, I can read them in the data server.
Hi, In my current usage scenario, I cannot use the digital port of the NI board to send eventcodes. Can I send the eventcodes to another computer through a network cable and a switch?
I have come up with a rather clumsy method: use the zorder to cover the front of the dots with an image that has a square hole in it.
Hi Jaewon, I was able to resolve the issue by feeding singletarget.success into an OnOffMarker. This allows me to easily load the time series indicating whether the fixation is within the specified window in AlphaOmega. Thank you for your helpful advice!
Thank you, Jaewon! I will try to use alter_function!
Yes, it would be easy after data collection, but this function is necessary for me because I need to select the dataset based on the result of online analysis (the images shown will differ according to a specific neuron's response). Maybe I need to use the raw signal from the eye tracker and solve the drift problem.
I'm doing a passive viewing paradigm: I present images rapidly, regardless of whether the monkey is fixating. In AlphaOmega, I want to check whether 'Success' of SingleTarget is true during image onset, and exclude images during which fixation was broken for too long (I can't use the eventcodes sent by xxHOLD, because they can't be matched with image presentations). Any advice would be appreciated.
Hi Jaewon, When using SingleTarget, I want to send the 'Success' property to my AlphaOmega system in real time so that I can do online analysis (I want to discard the data when Success == false); the signal can be analog or digital. I tried the Stimulator adapter, but it stops once fixation breaks. I tried the ClosedLoopStimulator in example 13, but it seems to conflict with re…
Thank you for your generous help, Jaewon! I'm using the userloop function now, and everything seems to be running correctly.
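For context, the skeleton of a NIMH ML userloop looks roughly like the sketch below, which hands out a fresh block of image TaskObjects on each trial so that thousands of stimuli do not need thousands of conditions. The folder, timing-script name, and block size are placeholders; the function signature follows the NIMH ML userloop convention:

```matlab
% Sketch: a userloop that cycles through a large image set, a block per trial.
function [C, timingfile, userdefined_trialholder] = image_userloop(MLConfig, TrialRecord)
persistent images idx
if isempty(images)
    d = dir('stimuli/*.png');                 % hypothetical stimulus folder
    images = {d.name};
    idx = 0;
end
timingfile = 'passive_viewing.m';             % hypothetical timing script
userdefined_trialholder = '';

block = 50;                                   % images per trial (placeholder)
sel = mod(idx:idx+block-1, numel(images)) + 1;
idx = idx + block;

% Build TaskObject strings: img(filename, xpos, ypos)
C = cellfun(@(f) sprintf('img(%s,0,0)', fullfile('stimuli',f)), ...
            images(sel), 'UniformOutput', false);
end
```

The timing script can then present the trial's block with ImageChanger while a separate SingleTarget/OnOffMarker chain keeps monitoring the eye, which sidesteps the minimum-ITI gap between trials.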
Hello everyone, I come from a vision lab, and I'm trying to build a task that presents 5000+ images while the monkey passively views them, maintaining fixation to get the reward (a passive viewing task). The image presentation is very fast (sometimes 100-ms onset and 100-ms offset), and I need to monitor the eye even when the images are off, so I can't put one image into one trial (due to a minimum ITI…
Hi Jaewon, Thanks for your reply! It works after I set the subject screen as the main screen. Thank you!
Hello everyone, I'm using 2 screens with different refresh rates. The subject screen is 120 Hz while the experimenter screen is 60 Hz. When I try 'Test' in the video panel, nothing appears on the subject screen. If I run the test independently on either screen, it works fine. Running a task shows the same problem. Is the same refresh rate required in NIMH MonkeyLogic?