fNIRS UK 2023 | 14th September 2023, 13:00-17:30 | Colchester, United Kingdom
A handful of software tools have been developed for fNIRS analysis; however, most of them are MATLAB-based. Other languages such as Python offer libraries like MNE-NIRS, which are reliable and versatile for the analysis of fNIRS data. This workshop is designed to introduce participants to the fundamentals of fNIRS data analysis using MNE-NIRS. Participants will receive a brief theoretical introduction to MNE-NIRS, then acquire their own fNIRS data in a visual stimulation experiment implemented in PsychoPy. Finally, a guided analysis session will cover single-participant and group-level general linear model (GLM) analyses with MNE-NIRS. The workshop will offer a practical and interactive learning experience, providing an accessible entry point for those new to fNIRS or Python. By the end of the workshop, participants will have comprehensive practical knowledge of the whole process from data acquisition to analysis using MNE-NIRS. Previous programming experience is a plus but not required.
The main objective of this workshop is to explain how to apply the general linear model to the analysis of fNIRS data using the Python-based libraries MNE and MNE-NIRS. It will also demonstrate how to run an experiment in PsychoPy and provide hands-on time with Artinis fNIRS devices and software for data acquisition. The workshop is motivated firstly by the advantages of Python: it is an open-source, free, and easy-to-learn programming language with a flourishing community and a plethora of libraries that enable fNIRS analysis to be integrated with other components (e.g., machine learning, robotics, audio/video). Moreover, the MNE library has a large community and is very responsive to contributions; it is a robust library with extensive testing, detailed documentation, and an active forum. Finally, MNE also supports many neuroimaging techniques, such as fNIRS, EEG, MEG, and ECoG, enabling integrated analysis of multimodal datasets.