The GPIO (General-Purpose Input/Output) pins of the Raspberry Pi are versatile ports for handling both incoming and outgoing digital signals. When used as output ports, GPIO pins can control lights, actuators, motors, relays and many other things.
One neat thing you can do with them is map them to MIDI “notes” and play them as output devices. Using McLaren Labs software, you can share these over the network and “play” them from a remote computer. This article shows how.
The Linux ALSA MIDI subsystem has a lot of capabilities and it takes a while to learn all of the bits and pieces that are available. Such was the case when one of our customers wanted to connect Reaper to an external RTP-MIDI capable device (a Behringer X-Touch). The solution was the snd-virmidi kernel module. This post will describe what snd-virmidi does and how it can be used to bridge different types of software.
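As a rough setup sketch, loading snd-virmidi and finding the resulting virtual ports might look like the following. These commands require root, and the exact card and port numbers will vary from system to system.

```shell
# Hypothetical setup sketch (requires root; port numbers vary by machine).

# Load the virtual MIDI module for this session:
sudo modprobe snd-virmidi

# ...or load it at every boot via a modules-load.d config fragment:
echo snd-virmidi | sudo tee /etc/modules-load.d/virmidi.conf

# Each virtual port now exists on BOTH interfaces described below:
aconnect -l        # lists ALSA sequencer clients; VirMIDI ports appear here
amidi -l           # lists raw MIDI device nodes; the same ports appear here
```

Because each virtual port is visible on both interfaces at once, a program writing to the raw side (such as an RTP-MIDI bridge) can feed a program reading the sequencer side (such as Reaper), which is what makes snd-virmidi useful as glue.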
Types of Linux MIDI Device Access
MIDI hardware devices connected to Linux show up simultaneously through two distinct mechanisms.
the RAW MIDI interface
the SEQ (sequencer) MIDI interface
The figure below illustrates this situation. On the left is the “RAW” MIDI software interface and on the right is the “SEQ” MIDI interface. When an external device is connected, these two distinct interfaces are created for it.
The RAW interface
The RAW interface presents MIDI data to software programs as an uninterpreted byte stream. A ‘noteOn’ message appears as a sequence of bytes (0x90, 0x??, 0x??) that the program must decode according to the MIDI standard. The bytes on this interface carry no timestamps, and the kernel applies no interpretation.
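To make the “uninterpreted byte stream” concrete, here is a minimal sketch of the decoding a program reading the RAW interface must do itself. The byte values are illustrative; `parse_note_on` is a hypothetical helper, not part of ALSA.

```python
def parse_note_on(stream: bytes) -> dict:
    """Decode a 3-byte MIDI Note On message (status bytes 0x90-0x9F)."""
    status, note, velocity = stream[0], stream[1], stream[2]
    if status & 0xF0 != 0x90:           # high nibble selects the message type
        raise ValueError("not a Note On message")
    channel = status & 0x0F             # low nibble is the MIDI channel
    return {"channel": channel, "note": note, "velocity": velocity}

# 0x90 = Note On on channel 0; 60 = middle C; 100 = velocity
msg = parse_note_on(bytes([0x90, 60, 100]))
print(msg)  # {'channel': 0, 'note': 60, 'velocity': 100}
```

On the SEQ side, by contrast, the kernel performs this decoding for you and delivers structured, schedulable events.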
Version 2.0 of McLaren Labs’ rtpmidi improves the timing capabilities of the software and optimizes memory usage.
Our most notable new feature is the ability to shift the playing time of received notes by adjusting latency. Users will notice a new column in the list of participants called “latency adjust.” The field is editable: type in a number of milliseconds to adjust the latency of received notes.
The figure above shows the latency field highlighted and we have entered a value of 100.
This article describes the background of the latency adjustment feature and how it can improve your musical experience.
Flight Time
The figure below shows four notes being sent from a Sender MIDI instrument to a Receiver MIDI instrument. The flight time is the time it takes for the note information to be sent from the Sender to the Receiver.
Depending on the network and its characteristics, flight time might be so small it is not noticeable, or large and variable enough that you can “hear” or “feel” it. On a dedicated Ethernet network using RTP-MIDI, the flight time is usually small and uniform. On a WiFi network using RTP-MIDI the flight time can be large and variable.
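The idea behind a receiver-side latency adjustment can be sketched with a little arithmetic. Instead of playing each note the moment it arrives, the receiver plays it at a fixed offset from when it was sent; variable flight times then disappear from the output, as long as the chosen latency exceeds the worst-case flight time. The numbers below are purely illustrative, and rtpmidi’s actual scheduling mechanism may differ in detail.

```python
LATENCY_MS = 100

# (send_time, flight_time) in ms for four notes crossing a jittery network
notes = [(0, 12), (250, 48), (500, 5), (750, 31)]

for send, flight in notes:
    arrival = send + flight
    play = send + LATENCY_MS       # uniform playout time, jitter removed
    buffer_wait = play - arrival   # receiver holds the note this long
    assert buffer_wait >= 0        # latency must cover worst-case flight time
    print(f"sent {send:3d}  arrived {arrival:3d}  played {play:3d}")
```

With a 100 ms adjustment, the notes play at 100, 350, 600, and 850 ms: evenly spaced 250 ms apart, even though their flight times ranged from 5 to 48 ms.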
McLaren Labs’ rtpmidi software has been around for a few years now. It reliably sends and receives MIDI events and properly implements error correction. The software is robust and has been used in a wide variety of environments by a large number of users.
But we knew there were some limitations around scheduled note timing, and we resolved to get to the bottom of them.
Update: 2024-09-15 – The easiest way to connect external MIDI devices to the Linux subsystem of your Chromebook is to use the free McLaren MIDI Server sending to McLaren Labs’ rtpmidi for Chromebook Linux! (Debian 12 Bookworm).
Our port of rtpmidi to Debian 12 (Bookworm) installs effortlessly in the Chromebook Linux subsystem. Why would you want to do this? The reason is to get MIDI events from external devices into the Linux subsystem of your Chromebook.
Currently, when you plug a MIDI device into a USB port of your Chromebook, it is not passed through to the Linux subsystem. By using rtpmidi you can send MIDI events from a device connected to another computer into the Linux subsystem of your Chromebook.
When used with a wired Ethernet connection (avoiding WiFi) the latency of such a set-up can rival a direct connection.
The McLaren Synth Kit presents audio synthesis as the construction of a graph of audio operators (voices), with the parameters of each type of operator determined by an associated model. Voices define the code that runs in the audio thread, and models hold the values that are read from the audio thread. This separation guarantees that manipulations of the models never block the audio thread. With models safely out of the audio loop, UI interactions with model properties are straightforward and can use Cocoa idioms for manipulating property values.
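The voice/model split can be sketched in a few lines. The real Synth Kit is Objective-C; the Python below is only an illustration of the pattern, and every name in it is hypothetical. The key point is the direction of data flow: the UI writes model properties at any time, while the voice merely reads a snapshot of them per audio block, so nothing the UI does can stall rendering.

```python
import math

class OscModel:
    """Holds parameter values. Written from the UI thread."""
    def __init__(self, freq=440.0, gain=0.5):
        self.freq = freq
        self.gain = gain

class OscVoice:
    """Runs in the audio thread. Reads the model; never locks, never writes it."""
    def __init__(self, model, sample_rate=44100):
        self.model = model
        self.sample_rate = sample_rate
        self.phase = 0.0

    def render(self, nframes):
        # Snapshot the parameters once per block, then compute samples.
        freq, gain = self.model.freq, self.model.gain
        out = []
        for _ in range(nframes):
            out.append(gain * math.sin(self.phase))
            self.phase += 2 * math.pi * freq / self.sample_rate
        return out

model = OscModel()
voice = OscVoice(model)
block = voice.render(64)    # "audio thread" renders a block
model.freq = 880.0          # "UI thread" changes a property at any time
block2 = voice.render(64)   # the next block picks up the new value
```

Because the voice only reads, a UI control bound to `model.freq` needs no synchronization with the render loop; the worst case is that a change takes effect one block late.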
At least that was the intent. The McLaren Synth Kit was released without associated GUI controls, but we believed it would be straightforward to define controls for the various models.
But we had never fully defined a Cocoa-based application that rested on the Synth Kit. We finally put it all together in our newly released application called “Synth80”.
Synth80 is a two-oscillator multi-timbral polyphonic synthesizer. This application pulls together the following features.
Cocoa bindings for linking Controllers to McLaren Synth Kit Models
Document-based application for saving and restoring presets
Undo/Redo facility
A Single Document Interface (SDI) design for the singleton synthesizer window
This article summarizes some of the things we learned about using Cocoa this way while developing Synth80.
Focus on Data Structures
We found that we were able to focus on modeling data more than building GUI elements. The models for our synthesizer elements (envelopes, oscillators, filters, etc) were designed first, with controls added later.
Container technology can be used for many things: isolation, security, portability, or resource management. One popular use of containers is to run a binary built for one operating system on another. We wondered whether Linux lxc/lxd container technology would allow us to run the Ubuntu 22.04 binary of rtpmidi on a Debian 12 system. We eventually succeeded.
This post outlines how we reached success. It is not necessarily a recommendation to follow this path. There are a lot of steps, and they can be confusing. But we are going to write down what we learned anyway.
Last week I read about a new release of a fork of the Plan9 Operating System from the 9Front crew. The release is called “Human Biologics,” and the new feature list was intriguing: there was now default support for git and ssh, and some improvements to audio were included.
audio(1) – mp3, ogg, flac, ulaw, wav
I had heard about Plan9 on and off for years. I wondered what sorts of audio and MIDI support might be in this OS, so I gave it a spin. What follows is a short account of some things I learned.