Introducing "Brain Wah"

Want to control the sound of an electric guitar just by thinking about it? Neurosity's got you...

In February 2020, Neurosity -- a brain-machine interface startup -- announced its Notion device, a "brain-reading computer" that combines CPUs, RAM, brain sensors and internet connectivity in a single device worn on the back of the head, sampling brain activity 250 times per second. The big idea: neural devices and neural apps that let users interact with technology using thought alone. [Ed: with no freaky brain stitching, Elon...]

Does it work? Co-founder Alex Castillo, a former senior software engineer at Netflix, says he can now add guitar effects to his real-world playing just by thinking about them. He explains in an exclusive blog for The Stack.

A programmer models a Neurosity headset in a stock image. Credit: Neurosity.

Traditionally, guitar players have used foot pedals to control their guitar sound, writes Alex Castillo. Pedals let you change volume, apply effects, modulate sound, and create all sorts of awesome sound combinations without using your hands, so you can focus on playing.

Today, we'll be using a Neurosity Brain-Computer Interface and our thoughts to control the sound of an electric guitar.

https://twitter.com/castillo__io/status/1331285114941104128

Let's do this.

Communication

For this experiment, we won't be using a traditional guitar amplifier, but a software-only guitar plugin instead. So we'll be plugging the electric guitar into a USB audio interface like this one.

The communication from the Node.js app will happen via MIDI, which stands for Musical Instrument Digital Interface. For that, we'll use the easymidi library.

npm install easymidi

This awesome package will allow us to send commands to the guitar plugin app from Node.js.

Now, let's create a virtual MIDI output.

import { Output as Midi } from "easymidi";

// Create a virtual MIDI output port named "Notion"
const midi = new Midi("Notion", true);

If we run this code, our new MIDI output is ready to be detected by the guitar plugin.

node index.js
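
One setup note: because the script uses ES module imports (and, later on, top-level await), Node needs to treat the file as an ES module. Adding this to package.json does the trick (alternatively, name the file index.mjs):

{
  "type": "module"
}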

Guitar Plugin

I've traditionally used hardware units like the AXE-FX for designing my guitar tone, but lately I've been playing around with the Neural DSP guitar plugins, and so far I'm loving them.


We'll be using their Archetype: Cory Wong plugin, which comes with a neat MIDI utility we can use to map MIDI commands to different sound settings like preset changes, amp types, gain, effects, etc.

You can download a free trial here.

Let's open the standalone plugin and go to Settings (gear icon) located at the bottom left corner of the plugin. Next, let's check the Notion input under "MIDI Input Devices".

💡 The input will show as long as the Node.js app is running.

That's the MIDI device we created in Node with 2 lines of code!
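
If you want to sanity-check the port outside the plugin, easymidi can also list the MIDI devices visible on the system. Here's a quick sketch to run from a second terminal while index.js is still up (one caveat: virtual ports work on macOS and Linux, but aren't supported on Windows):

import easymidi from "easymidi";

// A virtual output we create appears as an *input* to other MIDI consumers
console.log(easymidi.getInputs()); // expect something like [ 'Notion' ]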

Next, let's go to MIDI Mappings by clicking on the icon next to Settings.


I've added one MIDI setting and configured it to control the strength of the Wah-ng Wah effect based on the value we send from Node.js.

Mind Control

Now let's put everything together.

First, let's install the Notion API.

npm install @neurosity/notion

🤯 The Notion API enables full communication between the headset and your apps, in our case Node.js. We'll use it to get real-time feedback based on our cognitive state. For this app, we'll work specifically with the Focus and Kinesis metrics.

Check out the docs

import { Notion } from "@neurosity/notion";
import { Output as Midi } from "easymidi";
import { tween } from "./utils/tween";
import { email, password } from "./options";

// Virtual MIDI output the guitar plugin listens to
const midi = new Midi("Notion", true);
const notion = new Notion();
await notion.login({ email, password });

notion
  // Motor imagery predictions for the "right foot" thought
  .predictions("rightFoot")
  // Map prediction scores (0-1) onto the MIDI CC value range (0-127),
  // interpolating in between so the wah sweeps smoothly
  .pipe(tween({ from: [0, 1], to: [0, 127] }))
  .subscribe((value) => {
    // The controller number and channel here are illustrative; they must
    // match whatever you mapped in the plugin's MIDI settings
    midi.send("cc", { controller: 22, value, channel: 0 });
  });

Let's break this code down:

  • We create a new Notion instance and call login with our Neurosity account credentials
  • We subscribe to motor imagery predictions for the rightFoot thought
  • Then, we use the tween utility function to map prediction scores from 0-1 to the corresponding MIDI CC value range (0-127) AND interpolate all the values in between to smooth out the knob change (a sketch of one possible implementation follows this list)
  • Lastly, we send a CC MIDI command to the Neural DSP plugin
  • View full code
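
The tween utility itself isn't shown in the post (it lives in the full code linked above), but here's a minimal sketch of how such an RxJS operator could look, assuming each prediction event carries a probability field between 0 and 1, and using a fixed ramp of interpolation steps per update:

// utils/tween.js -- a hypothetical reconstruction, not the original utility
import { timer } from "rxjs";
import { concatMap, map, take } from "rxjs/operators";

export function tween({ from, to, steps = 8, stepMs = 10 }) {
  // Linearly rescale a score from the input range into the target range
  const scale = (p) =>
    Math.round(to[0] + ((p - from[0]) / (from[1] - from[0])) * (to[1] - to[0]));

  let last = to[0];

  return (source) =>
    source.pipe(
      // Assumes each prediction exposes a 0-1 probability
      map((prediction) => scale(prediction.probability)),
      concatMap((target) => {
        const start = last;
        last = target;
        // Emit a short ramp of intermediate values so the virtual knob
        // glides to the new position instead of jumping
        return timer(0, stepMs).pipe(
          take(steps),
          map((i) => Math.round(start + ((target - start) * (i + 1)) / steps))
        );
      })
    );
}

Swapping concatMap for switchMap would abandon an in-flight ramp whenever a newer prediction arrives; concatMap keeps the maths simpler for a sketch.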

At this point, the guitar plugin modulates the Wah-ng Wah effect when you think of moving your right foot.


But what if we wanted to change the guitar preset, similar to how we would with a pedalboard?

For that, we can use Notion's Kinesis API: train a leftFoot command, then trigger the plugin's Preset Next action via a MIDI command just by thinking about your left foot pushing down.

notion
  .kinesis("leftFoot")
  .subscribe(() => {
    // A program change message tells the plugin to switch presets;
    // the program number here is illustrative -- match it to your mapping
    midi.send("program", { number: 0, channel: 0 });
  });
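
If the trained thought holds for more than an instant, kinesis may fire several times in a row and skip through multiple presets. Since the Notion API hands back RxJS observables, one way to guard against that is a simple throttle (the 2-second window here is an arbitrary choice):

import { throttleTime } from "rxjs/operators";

notion
  .kinesis("leftFoot")
  // Ignore repeat triggers for 2 seconds: one thought, one preset change
  .pipe(throttleTime(2000))
  .subscribe(() => {
    midi.send("program", { number: 0, channel: 0 });
  });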

Conclusion

We humans spend most of our lives translating our thoughts into hand movements in order to interact with the world around us.

Would you believe me if I told you the average person presses, taps, and clicks around 3.5 million times a year?

To learn more about the brain, the electrical activity produced by our neurons, and how we can empower the mind, check out my TEDx Talk.

See also: Hyundai is seeking a CTO to help it bring flying cars to market by 2028