… and then they are off again! Last week, I gave myself five days to make a set of musical gloves to demonstrate at the weekend at UX Camp Brighton, and with the help of ChatGPT I was able to pull off a fairly reasonable demonstration, even if I say so myself.
This week, the gloves came off and I made a Mark 2 version with the sensors and the microcontroller attached to replacement wristbands of the kind used for fitness trackers. These seem like a better option for what I am hoping to achieve. In separate posts I will give all the information needed to build these yourself; in this post I am just going to talk about the rationale for doing this in the first place.
Anyone who knows anything about musical gloves will immediately think of Imogen Heap's Mi.Mu gloves, which have orientation and gesture sensors as well as a means of detecting finger motion. After 10 years in development, these can be purchased for around £2,500.
My gloves, in contrast, only have orientation sensors (BNO055), although it would be possible to detect gestures too. The intention for mine is that, as well as sending MIDI data for the X-Y movements of the hand, the same data will be used to control piano keyboards attached to robotic wrists that steadfastly keep the keyboards under the hands whatever their orientation. The rationale is to allow accordion-like bellows movements without actually having bellows, and to control sound qualities other than volume (i.e. beyond simulating the pressure of air going through reeds).
I chose the Teensy 4.0 as the microcontroller for the project as I had used it before when I made a joystick-to-MIDI controller. The Teensy is an Arduino-like board with the advantage that it configures very nicely for USB MIDI and can be programmed from the Arduino IDE. Despite having used the technology before, I got ChatGPT to write me a program for testing the MIDI interface.
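ChatGPT's exact test program isn't reproduced here, but a minimal sketch along the same lines, assuming the USB Type is set to "MIDI" under Tools in the Arduino IDE and using an arbitrary test note, would look something like this:

// Minimal USB MIDI test for the Teensy 4.0 (Tools > USB Type set to "MIDI").
// Sends middle C once a second so the interface can be confirmed in a DAW;
// the note number, velocity and timing are just placeholder test values.
void setup() {
  // Nothing to configure: the Teensy enumerates as a USB MIDI device by itself.
}

void loop() {
  usbMIDI.sendNoteOn(60, 100, 1);   // note 60 = middle C, velocity 100, channel 1
  delay(500);
  usbMIDI.sendNoteOff(60, 0, 1);
  delay(500);

  while (usbMIDI.read()) {          // discard incoming MIDI so the buffer never backs up
  }
}

If the Teensy shows up as a MIDI input in your DAW and you hear or see a repeating C, the interface is working.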
For the orientation sensors, I chose BNO055 boards from Adafruit. Using a chipset from Bosch, these boards give nine degrees of freedom (9DOF), with XYZ outputs for orientation, acceleration and magnetic field; I was only going to be using the orientation data. I had some experience with these sensors too, having played with them for getting motors to move (in advance of building the robotic wrists). But once again I found it useful to get ChatGPT to write me some test routines. (It literally is just a matter of asking ChatGPT to write the program, pasting it into the Arduino IDE, compiling and uploading it to the Teensy, and then checking the output in the Serial Monitor.)
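Again, this is not the generated code itself, but a sketch of that kind of test routine, assuming the Adafruit BNO055 library (and its Unified Sensor dependency) is installed and the board is wired to the Teensy's default I2C pins, might look like this:

#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>

// Default Adafruit sensor ID and I2C address for the BNO055 breakout.
Adafruit_BNO055 bno = Adafruit_BNO055(55, 0x28);

void setup() {
  Serial.begin(115200);
  if (!bno.begin()) {
    Serial.println("No BNO055 detected - check wiring and I2C address");
    while (1) {
    }
  }
  bno.setExtCrystalUse(true);
}

void loop() {
  sensors_event_t event;
  bno.getEvent(&event);   // fills event.orientation with Euler angles in degrees
  Serial.print("X: "); Serial.print(event.orientation.x);
  Serial.print("  Y: "); Serial.print(event.orientation.y);
  Serial.print("  Z: "); Serial.println(event.orientation.z);
  delay(100);
}

Wave the board around and the three angles in the Serial Monitor should follow.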

Having established that both the MIDI and the sensors were working, it was just a question of mounting the sensors onto gloves (a £5 pair of elasticated, fingerless orthopaedic gloves from Amazon) using Velcro. I didn't want a "beepy" electronic noise for my demo, so I chose a choir plugin from Spitfire Audio. The left hand would play one of five notes (D, E, F, G or A) from the tenor and bass samples, plus a note a fifth above, with the note chosen depending on the x-axis rotation of my hand; the right hand did the same with the alto and soprano samples. This combination would hopefully give a relatively pleasing sound whatever the hand positions. To get some tonal variation, the choir plugin lets you choose between short and long sounds, and I put these on different MIDI channels, selected by the y-axis position of my hands. I should add that this was all done in Ableton Live, and that ChatGPT pretty much wrote the whole program for this too, although my prompts here were to add routines to what I already had.
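As a rough illustration of the sort of mapping involved (the note numbers, zone boundaries and channel threshold here are my own placeholders, not the values from the generated program):

// D3, E3, F3, G3 and A3 as MIDI note numbers for the left-hand choir part.
const int leftNotes[5] = {50, 52, 53, 55, 57};

// heading = x-axis rotation in degrees, pitch = y-axis reading,
// both taken from the BNO055 as in the test routine above.
void playLeftHand(float heading, float pitch) {
  int index = constrain((int)(heading / 72.0), 0, 4);   // 360 degrees split into 5 zones
  int note = leftNotes[index];
  int channel = (pitch < 0.0) ? 1 : 2;                  // channel 1 = short choir, 2 = long choir

  usbMIDI.sendNoteOn(note, 100, channel);               // the chosen note...
  usbMIDI.sendNoteOn(note + 7, 100, channel);           // ...plus a fifth above
}

The right hand works the same way, just aimed at the channels carrying the alto and soprano samples.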
To add more variation, I was able to control the volume with the z-rotation of my right hand, and the rate at which the MIDI notes were sent out with my left hand's z-position. Finally, I added a little random percussion using Dillon Bastan's "Inspired by Nature" free sound pack, which I had just seen demoed in this excellent YouTube video by ELPHNT.
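In the same hedged spirit, a sketch of those two extra controls (the CC number, rotation ranges and timing limits are all my assumptions):

unsigned long lastNoteTime = 0;

// rightZ = right-hand z-rotation, leftZ = left-hand z reading, both in degrees.
void updateControls(float rightZ, float leftZ) {
  // MIDI CC 7 (channel volume), scaled from an assumed 0-180 degree range.
  int volume = constrain(map((long)rightZ, 0, 180, 0, 127), 0, 127);
  usbMIDI.sendControlChange(7, volume, 1);

  // Left-hand z sets the gap between note triggers, here 100 ms up to 1 s.
  long interval = constrain(map((long)leftZ, 0, 180, 100, 1000), 100, 1000);
  if (millis() - lastNoteTime >= (unsigned long)interval) {
    lastNoteTime = millis();
    // ...trigger the next choir notes here, e.g. playLeftHand(heading, pitch)
  }
}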
Surprisingly, it all worked rather well, as you will see in the video below and as it did at my session at UX Camp. To say that I was completely in control of what I was playing would be wrong – let's just say that the slight randomness in the mapping of hand gestures to sound rather added to its appeal.
This is all getting a bit too TL;DR, so I will save the bit about the Mark 2 gloveless version for the next post, but here is the gloves-on version.