Roadmap Bessy 2023
The goal of this roadmap is to set down the key dates and items we see coming that we want to have ready, so that by the end of 2023 we have achieved those targets. The 2023 roadmap is broken down into 3 key parts:
- General (joint) overview
- Bessy Python Goals/Milestones/Risks
- Bessy Unity Goals/Milestones/Risks
Part 1: General High-level concepts
Below is the current component map of how Bessy interacts online

We need to decide if this is the primary structure we would like to keep, or if we would like to split it out. In particular, how we actually handle the different elements of computation is worth exploring: if we compartmentalize aspects of Bessy Python, we could look to wrap it in other languages for different compute targets.
Part 2: BCI-Essentials Python
Bessy Python is the core back-end we plan to continue to grow and use in all of our BCI projects, so we need to make it as stable as possible. Right now Bessy Python does the majority of the computational work and the “collating” of data (all the information from Unity, LSL, EEG, etc.).
Goals
By the end of 2023 we want BCI-Essentials Python to be refactored for increased stability and usability, so that it can scale with increased developer use.
Priorities
High
- Installation of Bessy Python into a new Python virtual environment is seamless, with limited need for additional dependencies to be installed manually
- Bessy Python can be used out-of-the-box on installation for existing applications developed in BCI4Kids
- There is clear documentation for:
- Setting up BCI-Essentials Python for development use (e.g. develop something with Bessy Python)
- Setting up BCI-Essentials Python for application use (e.g. in the background of an app)
- Setting up BCI-Essentials Python for contribution (e.g. extending the use cases and abilities of Bessy python)
- Running online & offline examples/samples with Bessy Python
- The components of Bessy Python and a map of class interactions are outlined for refactoring
- We have a series of high-level tests that output expected results, verifying that:
- LSL Streams from headsets can be brought into Bessy python and processed
- LSL Streams from Unity can be brought into Bessy python and processed
- Data is effectively structured to be saved for use (PEDS BCI)
- We can run offline tests for evaluating P300, SSVEP, and MI data
- We can simulate online streaming and analysis
- We have effective ways to handle errors
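As a sketch of the “simulate online streaming and analysis” item, one common approach is to replay an offline recording through the same windowed pipeline a live session would use. The function and parameter names below are hypothetical, not the Bessy Python API:

```python
# Minimal sketch: slide a fixed-length window over saved samples and
# classify each window, exactly as a live pipeline would consume chunks.

def sliding_windows(samples, window_size, step):
    """Yield consecutive windows of `window_size` samples, advancing by `step`."""
    for start in range(0, len(samples) - window_size + 1, step):
        yield samples[start:start + window_size]

def dummy_classify(window):
    # Stand-in for a real P300/SSVEP/MI decoder: just report the window mean.
    return sum(window) / len(window)

recording = list(range(10))  # pretend this is one channel of saved EEG
predictions = [
    dummy_classify(w)
    for w in sliding_windows(recording, window_size=4, step=2)
]
```

Swapping `dummy_classify` for a real decoder, and `recording` for a saved session, gives a repeatable online-simulation test without any hardware attached.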
Med
- We refactor the existing major classes
- We have the ability to graphically interact with Bessy Python
- We can deploy Bessy Python on a small form factor, e.g. Raspberry Pi 4.
Low
- We have the ability to deploy Bessy Python natively in Unity
- We have the ability to deploy Bessy Python through Docker
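For the Docker option, deployment could be as small as a single Dockerfile. This is a hypothetical sketch only: the package name, entry point, and requirements file are placeholders, not the actual Bessy Python layout.

```dockerfile
# Hypothetical sketch: package name, entry point, and requirements file
# are placeholders, not the actual Bessy Python layout.
FROM python:3.10-slim

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .
# Note: liblsl is needed for LSL networking; on slim base images it may
# need to be installed separately or mounted in.
CMD ["python", "-m", "bci_essentials"]
```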
Part 3: BCI-Essentials Unity
Bessy Unity is currently one of the most attractive parts of our platform. We have ongoing partners at Unity Technologies in Calgary, and interested parties from talks that Eli Kinney-Lang gives who want to contribute to this part of the platform. This includes the opportunity for game developers to “test in the wild”, with individuals reaching out to ask how to use this in their work. Since this is largely the “customer-facing” part of the work, we need to make sure it behaves well and integrates cleanly with projects that come up.
Goals
By the end of 2023, BCI-Essentials Unity should be capable of being integrated into Unity applications as simply and readily as other standard Unity Plugins.
The biggest gaps right now are documentation, example scenes, and clear guides/walkthroughs on how to use this. This is our “unique” value proposition over other BCI-driven software such as BCI2000 and BCIpy - both of which may be able to work with Unity, but aren’t built with it in mind.
Priorities
High Priority
- Bessy has clear documentation for:
- Installation within Unity and the package manager system
- What the sample scenes are, and where to load them in Unity
- Dependencies required and supported for development
- How to use Lab Streaming Layer (LSL) for networking
- Required LSL applications that are likely to be used (e.g. LabRecorder)
- What hot-key buttons are currently set, and how to change these
- Bessy has short-form videos showing how to:
- Install and load in sample scenes
- Add BCI Compatibility to a simple set of objects in the scene
- How to change BCI control paradigms in Unity
- How to build (or “request”) BCI control paradigms, and why you would select each one
- How to build a simple BCI 3-selection game (stretch)
Med Priority
- Bessy has the ability to tweak visual stimuli to encompass all available known stimulus patterns
- P300 oddball paradigm
- P300 rapid serial visual presentation
- Steady State VEP (SSVEP)
- Transient VEP (tVEP)
- Code Modulated VEPs (cmVEP)
- Motion Onset VEP (moVEP)
- Bessy has easy options for tuning aspects of the stimulus up/down for easy integration with vision goals (for lower-level developers)
- Bessy has easy pre-set options for visual presentation/stimuli for high-level users (e.g. pre-made VEP add-ins for game developers)
Low Priority
- Bessy has the ability to jointly present hybrid stimulus patterns
- e.g. P300 + SSVEP
- Bessy has the ability to present auditory stimuli to elicit auditory P300 responses
- In the future we will have additional functionality for other auditory stimuli