MIDI device representation

Basic MIDI editing support in Ardour is coming along nicely, but another aspect that hasn’t been addressed to any significant degree yet is how to manage the other end of the MIDI connection: the MIDI devices themselves. Any respectable sequencer has some means of representing semantic information about the MIDI data it sends, such as device names, program names and controller names.

Several database-style formats exist to provide this semantic information for a given MIDI device. Back in the day I used Cakewalk, which uses .ins instrument definition files. Rosegarden and MusE have their own XML-based formats that do the same, and there probably exist many more. Softsynths are another story, but I don’t know of any commonly used way for a softsynth to transmit this kind of data to its host application.
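For reference, a Cakewalk .ins file is a plain-text, INI-style database; from memory it looks roughly like this (the synth and patch names here are made up):

```
; rough sketch of the Cakewalk .ins layout, from memory
.Patch Names

[Example Synth Bank A]
0=Grand Piano
1=Bright Piano

.Instrument Definitions

[Example Synth]
Patch[*]=Example Synth Bank A
```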

One day Ardour will be faced with the problem of representing this kind of data within the program. Defining yet another internal data format would obviously be wrong, as so many exist already. Choosing one of the existing formats will probably lead to people complaining that this or that format isn’t supported.

The question I’m raising is whether this needs to be built into Ardour at all (or to what degree), now that we have support for the extensible LV2 plugin standard. ll-plugins already contains a MIDI map extension, which is something of a step in the direction I’m talking about. I know LV2 plugins are mainly designed to process sound data, but I wonder if, with an appropriate extension, they could serve as a database API for extracting/parsing semantic MIDI information from the various established MIDI semantic info file formats.

Consider a MIDI track in Ardour. That MIDI track is connected to a JACK port, which has a name. The instrument that ultimately plays the notes can often, but not always, be determined from the name of that port. Suppose you have a configuration file in your home directory that maps JACK port names to files containing the semantic information about the synth/device. You also have a collection of LV2 plugins that can parse the given files and return the information in a uniform manner. Given the port connection, whose name Ardour sends to the plugin, the plugin parses the mapped definition file and returns the requested info.
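To make that concrete, the mapping file could be as trivial as this (everything here is hypothetical, file name and port names included):

```
# ~/.midi-device-map (hypothetical)
# JACK port name          =  semantic info file
alsa_pcm:midi_playback_1  =  ~/midi-defs/korg_trinity.ins
fluidsynth:midi           =  ~/midi-defs/gm_soundfont.xml
```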

These are just thoughts, and I won’t elaborate much further until I get some feedback from the regular bunch of smart people that hang out around here. I’m sure they have better and more elegant solutions. :)

Ardour has traditionally paid very little attention to what it’s connected to, other than to distinguish between physical and non-physical ports. This would change that in a fundamental way.

Also, it’s not clear to me that a JACK port name will reliably (let alone usefully) indicate the instrument/timbre to be generated.

We also have the option of choosing a device explicitly, independent of the JACK port used, but that may result in displaying the wrong information when the JACK port connection changes.

Also, the JACK port approach cannot always work, because the signal can be routed through other hosts before reaching its destination, and there is no way for us to figure out what happens to the routing inside those hosts. Once the signal leaves the software domain, the MIDI channel is also used to infer which device the signal ends up in.

We may also have to consider the presence of LASH in all of this, but I don’t know if there are concrete plans for Ardour to support it. I’m not familiar with how LASH works, but I know that it manages JACK connections. Maybe some modifications to LASH could allow it to hand out this kind of semantic MIDI information to hosts on request.

Maybe we need a JACK client that acts as a proxy for hardware synths, whose responsibility is to route the MIDI signals from its (device-named) input ports to the appropriate output ports offered by the system and by running softsynths, translating MIDI channels as needed. Plugins inside Ardour may then communicate with this client in some way (maybe using OSC?): the plugin takes the JACK port as a parameter (which is the connection to the proxy client) and queries the client for the information asked for, be it a program list, controller names, drum mappings, or whatever.
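As a minimal sketch, assuming the JACK MIDI API, the routing core of such a proxy could look like this (the port names and the fixed channel translation are made up; a real client would drive the mapping from the synth definitions, and error handling is omitted):

```c
#include <string.h>
#include <unistd.h>
#include <jack/jack.h>
#include <jack/midiport.h>

static jack_port_t *in_port;   /* device-named port, e.g. "synth_proxy:Korg Trinity" */
static jack_port_t *out_port;  /* goes to the system MIDI out or a softsynth */
static int target_channel = 2; /* hypothetical: the channel the device listens on */

static int process(jack_nframes_t nframes, void *arg)
{
    void *in_buf  = jack_port_get_buffer(in_port, nframes);
    void *out_buf = jack_port_get_buffer(out_port, nframes);
    jack_midi_clear_buffer(out_buf);

    uint32_t n = jack_midi_get_event_count(in_buf);
    for (uint32_t i = 0; i < n; ++i) {
        jack_midi_event_t ev;
        jack_midi_event_get(&ev, in_buf, i);

        jack_midi_data_t *out =
            jack_midi_event_reserve(out_buf, ev.time, ev.size);
        if (!out)
            continue;
        memcpy(out, ev.buffer, ev.size);

        /* Rewrite the channel nibble of channel voice messages
         * (status bytes 0x80..0xEF); system messages pass through. */
        if (ev.size > 0 && out[0] >= 0x80 && out[0] < 0xF0)
            out[0] = (out[0] & 0xF0) | (target_channel & 0x0F);
    }
    return 0;
}

int main(void)
{
    jack_client_t *client = jack_client_open("synth_proxy", JackNullOption, NULL);
    in_port  = jack_port_register(client, "Korg Trinity",
                                  JACK_DEFAULT_MIDI_TYPE, JackPortIsInput, 0);
    out_port = jack_port_register(client, "to_hardware",
                                  JACK_DEFAULT_MIDI_TYPE, JackPortIsOutput, 0);
    jack_set_process_callback(client, process, NULL);
    jack_activate(client);
    for (;;)
        sleep(1); /* routing happens in the process callback */
}
```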

The use of this client is of course optional, and when the plugin cannot find a connection to the proxy client (or the plugin is not installed on the system), simply no semantic information is returned. Code must be written in Ardour to send queries to and parse responses from the plugin, but I suppose that code will be fairly small.

The advantage of this proxy client is that the MIDI setup may change without JACK MIDI clients needing to know about it; they just communicate with the proxy, which keeps a stable list of JACK input ports named after the synths.

Nedko proposed using jack/dbus for this. Whether that’s right or wrong I’ll let others decide; in the meantime I’ll think about the concept, which seems to be evolving into something best described as a JACK synthesizer manager.

As described above, the client will provide one MIDI input port for every synth defined in the system. The signal is routed to the correct port/channel inside the client. To get the semantic MIDI data, a host queries the client, maybe using jack/dbus. I’ll have to investigate how it all works, as I would really like to write and maintain this project if this is the desired solution.

Entries in the manager normally correspond to hardware or software synths, but there is nothing standing in the way of defining “compound” synths inside the program. Think of a drum synth that uses your favorite drum sounds from different synths, some hardware and some software. It’s just routing inside the application.

The app would be split in two: an engine part run by LASH, and an optional GUI part that allows adding, removing and editing synth definitions and modifying the MIDI routings.

How a synth/program is selected inside a sequencer is an important issue to me. In my opinion, Rosegarden, for example, got it wrong. When you want [program name] as the sound for a track in Rosegarden, you get (if I remember correctly) a tree-like menu sorted by some category. Which patches show up there is decided by which MIDI bank you selected in another widget. I have a Korg Trinity synth, and to get a desired patch you need to remember which bank it was in, which category (if categories are defined for the synth), and which number it has, as the menu is sorted by the numbers.

Ardour’s patch selector should in my opinion be designed as closely to the current plugin selector as possible. It has a flat view, sorted alphabetically, and you have a search field. Also, just as the plugin selector is architecture-agnostic (LADSPA/LV2), the patch selector should be device-agnostic. Ultimately, the device (and thus the JACK port) used for the MIDI track in question is determined by which patch name you chose, because the patch name is associated with a device. Which device is used should be shown in the patch selector in the same way as the plugin architecture is shown in the plugin selector.

When a patch is chosen, a query could be sent to the synth manager, which returns the JACK port name, program number, bank change and MIDI channel that need to be sent to the port in order to get the right sound. (One problem arises with multitimbral synths, which offer more than one MIDI channel with different sounds: how do you specify “use a new unique channel” or “use an existing channel” without exposing the MIDI channel in Ardour, given that this is what we want?)
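The reply to such a query would essentially be a small tuple; a sketch, with hypothetical field names:

```c
/* Hypothetical reply from the synth manager for a patch query. */
typedef struct {
    char          jack_port[64]; /* proxy input port to connect the track to */
    unsigned char channel;       /* MIDI channel to send on, 0-15 */
    unsigned char bank_msb;      /* bank select MSB (CC#0) */
    unsigned char bank_lsb;      /* bank select LSB (CC#32) */
    unsigned char program;       /* program change number, 0-127 */
} patch_location;
```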

As always I hope this makes sense, and the topic is of course open for anyone to discuss.

audun: I like the idea of a proxy. I agree it would be nice to make patch selection similar to plugin selection. Ever had a look at NI Kore?

For the hardware side, things could become complicated if you start to model MIDI setups, including chained devices.

Auto-connecting audio ports according to patch selection could be interesting.

Maybe such a proxy could help with making projects portable? Say two users swap a project back and forth and use different hardware. The proxy could deal with how the hardware maps to each other, or at least offer a single interface to the software side and allow switching of hardware profiles.

thorwil:
NI Kore, never heard of it.

I’ve often run into the problem of having a large collection of MIDI files refer to an absolute MIDI device (the data saved in the project refers to a port number, a patch number, a bank and a channel number). When I don’t have access to the synth, I have to change each and every file that I’m going to listen to, finding a patch on another synth that sounds about the same (heh, this sounds so ancient).

Think about defining patch groups, e.g. “Vibraphone”. In the group is every patch in the system that sounds like a vibraphone (this could be set up manually or by automatically searching for patch names including the word). Each patch in the group has a priority; the sequencer refers only to the group and gets the patch data with the highest priority. When you sell your synth, you don’t have to change anything, it just keeps playing.

I know we’re not living in the 90s anymore, but some people still swear by hardware synth modules, and in my opinion this is still an issue.

This would also help portability when hardware synths are used.

Thorwil, could you explain what you mean by auto-connecting audio ports according to patch selection?

audun: say you have a MIDI track as source and an audio track as final destination. Now you connect the MIDI track to a synth implicitly by selecting a patch. Then you want the synth’s output connected to your destination track. If you switch the patch, the audio connection should be changed too, if required.

Oh, and you should have a look at NI Kore, especially regarding patch categorisation (never used it myself).

Aha, now I understand. So optionally, for hardware synths, you can specify the system audio input port where the synth’s audio signal arrives. If this audio port is unique to that synth, one can use that port internally in the Instrument concept in Ardour (hardware and software instrument tracks behaving in roughly the same way).

The hardware synths in the synthesizer manager have to be set up manually, but it’s possible to have a discovery mechanism for softsynths. The program listens to the JACK graph/client change callbacks, and when a client appears that has at least one MIDI input port and one audio output port, we can infer that it is probably a synth.
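With libjack, that heuristic could look roughly like the sketch below (using the client registration callback; note that a new client’s ports usually appear after the registration notification, so a real implementation would probably hook the port registration callback instead):

```c
#include <stdio.h>
#include <jack/jack.h>

static jack_client_t *client; /* our own client handle */

/* Called by JACK whenever a client registers or unregisters. */
static void client_registered(const char *name, int is_registering, void *arg)
{
    if (!is_registering)
        return;

    /* Match all ports belonging to the new client. */
    char pattern[256];
    snprintf(pattern, sizeof(pattern), "%s:.*", name);

    /* Naive heuristic: at least one MIDI input and one audio output. */
    const char **midi_in   = jack_get_ports(client, pattern,
                                            JACK_DEFAULT_MIDI_TYPE, JackPortIsInput);
    const char **audio_out = jack_get_ports(client, pattern,
                                            JACK_DEFAULT_AUDIO_TYPE, JackPortIsOutput);
    if (midi_in && audio_out)
        printf("'%s' looks like a softsynth\n", name);

    if (midi_in)   jack_free((void *)midi_in);
    if (audio_out) jack_free((void *)audio_out);
}

/* Somewhere in setup:
 *   jack_set_client_registration_callback(client, client_registered, NULL);
 */
```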

If it talks with LASH, it could be possible to bring up new instances of softsynths when needed (in the same way that you specify the uniqueness of the channel with hardware synths, as pointed out in an earlier post, but with softsynths you’re not limited to 16 channels).

Note that I’m talking about synths as JACK clients. Plugin-based synths are better hosted inside the sequencer application. Or are there arguments for making the synthesizer manager a plugin host as well?

Thorwil, I saw the Kore shots you sent me on IRC. The categories window resembles the thing I’m talking about. It could come with a set of predefined categories, and when you load an instrument definition, it matches the patch names with the categories and files them accordingly. Another far-out category feature would be a machine learning mechanism: if you have patch A in a track and change the sound to patch B, the app may note that there is a stronger-than-random relationship between A and B. Could the information extracted be used for some automatic categorization?

Regarding categories
Maybe it’s better to do it this way: the sequencer refers to the patch by its name and/or an ID to make it unique. A query is sent to the synth manager, which returns the port/channel/bank/program change required if the patch is defined. If not, it searches upwards in the patch group hierarchy until it finds the closest patch that is defined.
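In code the fallback is just a walk up the group tree; a sketch with hypothetical types:

```c
/* Hypothetical patch-group node: each group knows its parent and may
 * carry a list of concrete patches sorted by priority, best first. */
struct patch;

typedef struct patch_group {
    struct patch_group *parent;
    struct patch      **patches;
    int                 n_patches;
} patch_group;

/* Return the highest-priority defined patch, searching upwards
 * through the hierarchy until something is found. */
struct patch *resolve_patch(const patch_group *g)
{
    for (; g != NULL; g = g->parent)
        if (g->n_patches > 0)
            return g->patches[0];
    return NULL; /* nothing defined anywhere in this branch */
}
```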

audun: softsynths as plugins will allow us to have instrument tracks or busses: MIDI in, audio out. They are part of Dave’s vision :slight_smile:

On patch selection, you would want to see everything that your hardware and all JACK and plugin softsynths offer, right?

Maybe it’s time to think about project planning: do you have the time and motivation to implement anything along these lines? Who might help? I should be able to do quite a bit on conception and interface design, but you can’t expect a single line of code from me. What are the requirements that can realistically be met in a first implementation?