
Real-time brain-machine interface




PyCNBI provides a Python-based framework for online brain-signal decoding. A motor imagery classifier runs at approximately 15 Hz on a 3rd-generation i7 2.4 GHz laptop. Although this decoding rate is sufficient for my applications, there is plenty of room to optimize the speed in many parts of the code.

It was initially developed for my personal projects and has not been fully cleaned up yet. It has been applied in many different online scenarios (hand imagery, leg imagery, error-related potentials) and has been working well so far, but the code is not in its most efficient form. Any contribution is welcome. Please contact kyu.lee@epfl.ch with any comments or feedback. You can find the things to be done at the end of this document.

The data communication is based on the lab streaming layer (LSL). Since LSL communicates over TCP, the EEG data can be transmitted wirelessly. For more information about LSL, please visit:

The code has been tested with AntNeuro eego, g.tec gUSBamp, and BioSemi ActiveTwo. It has also been tested with BrainProducts actiCHamp, but only for signal checking; no decoding was tested. It has been tested on Windows 7-10 with Python 2.7. Most of the code should also work on Linux and with Python 3, but some modules are not yet compatible with Python 3, and I haven't had time to test other environments.

Important folders

This module visualizes signals in real time, with optional bandpass filtering and common average filtering. It relies on the StreamReceiver module.
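The viewer's source is not reproduced here, but the two filtering options can be sketched with scipy (a listed dependency). This is an illustrative example only: the band edges, sampling rate, and channel count are hypothetical, and the filter state `zi` is carried across chunks so that streaming data can be filtered causally without boundary artifacts.

```python
import numpy as np
from scipy.signal import butter, lfilter, lfilter_zi

fs = 512.0  # hypothetical sampling rate (Hz)
# 4th-order Butterworth bandpass, 8-30 Hz (illustrative band)
b, a = butter(4, [8.0 / (fs / 2), 30.0 / (fs / 2)], btype='band')

# One filter state per channel so successive chunks join seamlessly.
n_channels = 16
zi = np.tile(lfilter_zi(b, a), (n_channels, 1))

chunk = np.random.randn(n_channels, 256)           # channels x samples
chunk = chunk - chunk.mean(axis=0, keepdims=True)  # common average filtering
filtered, zi = lfilter(b, a, chunk, axis=1, zi=zi * chunk[:, [0]])
```

For the next incoming chunk, the same call is repeated with the updated `zi`, which is what makes the filtering real-time capable (unlike `filtfilt`, which needs the whole recording).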

This module records signals into a file. It relies on the StreamReceiver module.

This folder contains decoder and trainer modules. Currently, LDA, regularized LDA, Random Forests, and Gradient Boosting Machines are supported as the classifier type. Recurrent Neural Network-based decoders are currently under development.
decoder.py: performs online classification and outputs class probabilities.
trainer.py: performs cross-validation and/or trains a classifier.
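The trainer/decoder pipeline can be sketched with scikit-learn (a listed dependency). This is not trainer.py itself, only an illustration of the same two steps on synthetic data: cross-validate a classifier (here LDA, one of the supported types), then fit it and query class probabilities as the decoder does online.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Illustrative feature matrix: trials x features (e.g. band power per channel).
rng = np.random.RandomState(0)
X = np.vstack([rng.randn(50, 8) + 1.0,   # class 0 trials
               rng.randn(50, 8) - 1.0])  # class 1 trials
y = np.array([0] * 50 + [1] * 50)

clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, X, y, cv=5)  # cross-validation, as in trainer.py

clf.fit(X, y)                        # train on all data for online use
proba = clf.predict_proba(X[:1])     # class probabilities, as decoder.py outputs
```

Swapping `LinearDiscriminantAnalysis` for `RandomForestClassifier` or `GradientBoostingClassifier` covers the other supported classifier types.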

Contains various montage settings that can be loaded into OpenVibe acquisition servers.

Contains some basic protocols for training and testing. Google Glass visual feedback is supported through USB communication.

Triggers are used to mark event (stimulus) timings during the recording. This folder contains common trigger event definition files.
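A trigger definition is essentially a mapping from event names to the integer codes written to the trigger channel. The names and values below are hypothetical, purely to illustrate the idea; the real definitions live in the files in this folder.

```python
# Hypothetical trigger event definitions (names and codes are illustrative).
tdef = {
    'INIT': 1,        # start of a run
    'CUE_LEFT': 9,    # left-hand imagery cue
    'CUE_RIGHT': 11,  # right-hand imagery cue
    'BLANK': 8,       # blank screen
}

# At stimulus onset, the protocol writes the matching code to the
# trigger channel so events can be aligned with the signal offline.
code = tdef['CUE_LEFT']
```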

Contains various utilities.


For easy installation of the Python environment, you may want to try Enthought Canopy or Anaconda. PyCNBI has been tested on both Canopy and Anaconda with Python 2.7.

PyCNBI depends on the following modules:

  • scipy, numpy
  • mne
  • scikit-learn
  • xmltodict
  • pyqt5, pyqtgraph

You can install these packages using pip. For the usage of pip, have a look here:

Optional but recommended:

OpenVibe supports a wide range of acquisition devices, which you can make use of. Make sure to tick the checkbox "LSL_EnableLSLOutput" in Preferences when you run the acquisition server. This streams the data through the LSL network, from which PyCNBI receives it.

Clone the repository:

git clone https://c4science.ch/diffusion/1299/pycnbi.git

Setup tools are not implemented yet, so simply add the cloned pycnbi directory to your PYTHONPATH environment variable, then log out and log in again.

For the g.USBamp, the following customized acquisition server is recommended instead of the default LSL app,

git clone https://c4science.ch/diffusion/1300/gUSBamp_pycnbi.git

because the default gUSBamp LSL server does not stream the event channel as part of the signal stream. The customized version supports simultaneous signal + event channel streaming. For AntNeuro eego systems, use the OpenVibe acquisition server and make sure "LSL output" is checked in Preferences. If you don't see "eego" in the device selection, you probably didn't install the additional drivers when installing OpenVibe.
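When the event channel arrives as part of the signal stream, the receiver has to split it back out. A minimal numpy sketch, assuming (purely for illustration) that the trigger codes ride on channel index 0 of a channels-by-samples chunk:

```python
import numpy as np

# Hypothetical received chunk: 17 channels x 8 samples, where channel 0
# (an assumption for this sketch) carries integer trigger codes.
chunk = np.zeros((17, 8))
chunk[0, 3] = 9.0                 # a trigger fired at sample index 3

TRIGGER_CH = 0                    # assumed trigger-channel index
trigger = chunk[TRIGGER_CH, :].astype(int)
eeg = np.delete(chunk, TRIGGER_CH, axis=0)   # remaining 16 EEG channels

events = np.nonzero(trigger)[0]   # sample indices where a trigger fired
```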

Since we wanted to send triggers to a parallel port through a standard USB port, we developed an Arduino-based triggering system. The package can be downloaded with:

git clone https://c4science.ch/diffusion/1301/arduino-trigger.git

The customized firmware should be installed on an Arduino Micro, and the circuit design included in the documents folder should be printed onto a circuit board.

Things to improve

  • Make it compatible with Python 3.
  • Add setup scripts.
  • GUI-based interactive classifier training tools.
  • Load settings from .ini files instead of .py files.
  • More CPU-efficient decoder class.
  • Tutorial with examples.
