LXD/rtpmidi – running McLaren Labs rtpmidi in a container

Container technology can be used for many things: isolation, security, portability, or resource management. One popular use of containers is to run a binary built for one operating system on another. We wondered whether Linux lxc/lxd container technology would allow us to run the Ubuntu 22.04 binary of rtpmidi on a Debian 12 system. Eventually, we succeeded.

This post outlines how we reached success. It is not necessarily a recommendation to follow this path. There are a lot of steps, and they can be confusing. But we are going to write down what we learned anyway.
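To give a flavor of the path, here is a minimal sketch of the starting point, assuming LXD is already installed on the Debian 12 host. The container name rtpmidi-test is just an example.

$ lxc launch ubuntu:22.04 rtpmidi-test
$ lxc exec rtpmidi-test -- bash

From inside that shell, packages install as they would on a normal Ubuntu 22.04 system; the remaining wiring between container and host is what the full post walks through.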

Read More »LXD/rtpmidi – running McLaren Labs rtpmidi in a container

9Front and Audio on Raspberry Pi4

Last week I read about a new release of a fork of the Plan9 Operating System from the 9Front crew. The release is called “Human Biologics” and the new feature list was intriguing. There was now default support for git and ssh, and some improvements to audio were included.

  • audio(1) – mp3, ogg, flac, ulaw, wav

I had heard about Plan9 on and off for years. I wondered what sorts of audio and MIDI support might be in this OS, so I gave it a try. What follows is a short account of some things I learned.

Read More »9Front and Audio on Raspberry Pi4

GNUstep StepTalk is Alive and Well

A few years ago, I explored the idea of writing a small interpreter for embedding into McLaren Labs applications. The concept was to extend rtpmidi or a synth application with a scriptable ability to customize controls, behaviors or sound graphs. This would keep each application slim while still allowing extension through scripting. (This is not a very novel idea: it’s been done many times before. 🙂 )

The Objective-C execution environment (libobjc) provides many hooks that make it fairly easy to call into the ObjC runtime from an interpreter, or even to bridge ObjC method calls back to an interpreter. The idea of a scripting environment to accompany a GNUstep ObjC application seemed promising, and I wrote a little interpreter (based on PostScript syntax) to test out some of the ideas.

Along the way, I learned about a much more mature project: libs-steptalk. Among other things, it provides a Smalltalk-like language for extending applications or even gluing them together in the GNUstep workspace. It seemed very old, but there were some tremendous ideas in there: I wondered if I could learn enough about it to use it, and whether it still worked.

(Spoiler Alert! I did and it does.)

Read More »GNUstep StepTalk is Alive and Well

GNUstep Desktop – a refreshed look at NeXT/OpenStep

There’s a cool project called “GNUstep Desktop” that brings together many old GNUstep technologies, and a few new ones, to provide an entire integrated desktop. We gave it a whirl and are pretty impressed!

GNUstep Desktop on Debian 11

It’s pretty amazing that software that began life 30 years ago is still operational and evolving, but that is the case in the GNUstep project.

At McLaren Labs, we have been fans of modern ObjC on clang, the ObjC runtime and the Foundation libraries, and we have been slowly learning more about the GNUstep GUI and desktop. To be honest, it has been a slow learning process. While there is very good information about the various pieces of the GNUstep project, it sometimes seems that there are too many conflicting versions around; there isn’t a single source of truth that brings everything together. That said, the quality of many of the components really amazes us.

Read More »GNUstep Desktop – a refreshed look at NeXT/OpenStep

Announcing McLaren Synth Kit

The “McLaren Synth Kit” is an Objective-C framework for using MIDI and Audio on Linux computers with the GNUstep programming environment. It is distributed as a project including libraries, headers and example programs at https://github.com/mclarenlabs/libs-mclaren-alpha. You can use it to experiment with sound synthesis for your own personal projects. The project is designed to provide ready-to-compile examples after you clone the repo.
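Getting the source is just a clone of the repository named above; this is only a sketch, and the project itself describes how to build the included examples.

$ git clone https://github.com/mclarenlabs/libs-mclaren-alpha.git
$ cd libs-mclaren-alpha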

Working with sound is a delicate endeavor. The Synth Kit does a lot of the low-level work of opening devices, managing an audio thread and copying MIDI events to and from dispatch queues. This leaves the audio programmer free to think about designing sounds as a graph of processing units, called Voices.

Standard Voices in the Synth Kit provide envelopes and oscillators of various types, filters and a reverb algorithm. Using the features of modern Objective-C (blocks, ARC and dispatch queues) the Synth Kit makes programming sounds easy, or at least “easier.”

Read More »Announcing McLaren Synth Kit

Using avahi-browse to find Bonjour services on your Network

On Linux systems, Bonjour is implemented by the “Avahi” service. This is what McLaren Labs’ rtpmidi program uses to find Apple MIDI services on your network when you set it up. Usually rtpmidi can find the iPhones and iPads on your network, but sometimes things do not go smoothly. That’s when you need to do some network debugging. One of the tools that you can use to learn about Bonjour services on your network is avahi-browse.

Install avahi-browse

If you have installed McLaren Labs’ rtpmidi, then there is a good chance that avahi-browse is already installed. Try it out.

$ avahi-browse

If it isn’t there, you can install it like this on Ubuntu systems.

$ sudo apt-get install avahi-utils
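
With the utilities installed, one useful check is to browse for the Bonjour service type that Apple MIDI (RTP-MIDI) sessions advertise, which is _apple-midi._udp:

$ avahi-browse -r -t _apple-midi._udp

The -r flag resolves each discovered service to a host address and port, and -t tells avahi-browse to exit after the initial dump instead of monitoring continuously.
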
Read More »Using avahi-browse to find Bonjour services on your Network

VSCode, Ubuntu Snaps and ALSA Sound Development

Here at McLaren Labs we like to try all sorts of tools and development environments to see how they work together. We’ve been using vscode (https://code.visualstudio.com/) on and off for about a year, but only recently decided to try using it for a more complete edit/debug/run cycle for an Objective-C Synthesizer project. We ran into an unexpected interaction between the Snap environment of vscode and the ALSA (Advanced Linux Sound Architecture) PCM interface.

The Symptoms

What we discovered was that a program attempting to access an ALSA sound device malfunctioned. It worked correctly in a “normal” terminal, however.

We reproduced the behavior using the default aplay command that is available in alsa-utils. See the screenshot below for what should appear in a terminal when it is run, playing an Ubuntu standard sound called “Front-Left”. Unless otherwise specified, this command opens the “default” sound device.
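For reference, the invocation is along these lines; the sample path below is where Ubuntu installs the stock ALSA test sounds, so adjust it if your system keeps them elsewhere.

$ aplay /usr/share/sounds/alsa/Front_Left.wav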

If your Ubuntu sound system is set up correctly, you will hear a woman saying “Front Left” … and it will come out of your front-left speaker.

Read More »VSCode, Ubuntu Snaps and ALSA Sound Development