Musical Instrument Digital Interface


Musical Instrument Digital Interface [ˈmjuːzɪkl̩ ˈɪnstɹəmənt ˈdɪdʒɪtl̩ ˈɪntɚfeɪs] ("digital interface for musical instruments"), in short MIDI [ˈmiːdiˑ], is an industry standard for the exchange of musical control information between electronic instruments, e.g. keyboards or synthesizers. The standard covers both the exact nature of the required hardware and the communication protocol for the data to be transmitted. MIDI 1.0 was introduced in August 1982 and has since been extended several times. MIDI was developed by Dave Smith in cooperation with Ikutaro Kakehashi of Roland Corporation; both were awarded the Technical Grammy for it in 2013.

Areas of application

The MIDI protocol was originally developed for communication between synthesizers from different manufacturers, primarily so that one synthesizer's keyboard could control other synthesizers. Before that, synthesizers could only be connected in analog fashion, with extensive cabling.

At the time, synthesizers had only a few voices; that is, they could usually produce only 4–8 tones at the same time. And despite offering a certain selection of sounds, no device could produce more than one sound at a time. So if you wanted to play two or more sounds with a single keystroke, you had to couple two devices to one keyboard. In this way different sounds could be layered, e.g. to get a "thicker" synthesizer string sound or to combine synthesizer strings with synthesizer brass.

This became possible with a single MIDI cable, by connecting the MIDI-Out of the main device to the MIDI-In of the controlled device using a 5-pin MIDI cable (of its five pins, only two carry the data signal). Since MIDI carries control data rather than the audio signals of the various synthesizers, the audio still has to be fed to a mixer via separate lines.

At the same time, MIDI separated a synthesizer's keyboard from its sound generation, which massively increased an instrument's range of applications: a keyboard could now be split and its zones distributed to different synthesizers. The keyboard player could, for example, play a string sound on a controlled synthesizer with the left-hand zone and a solo synthesizer sound on the local device with the right hand.

The MIDI interface was quickly adapted for almost every type of electronic musical instrument, e.g. for expander modules, samplers, drum machines, effects devices (reverb, echo, equalizer, etc.), hardware sequencers (recording and playback devices for MIDI data), computers, controllers (such as master keyboards, drum pads, master keyboard controllers, Standard MIDI File players, fader boxes, later also sound and audio cards, etc.) and, last but not least, repurposed to control stage lighting effects (MIDI Show Control).

The use of computers in recording-studio technology gave MIDI a further boost. With the help of a hardware sequencer, or a computer and a sequencer program, even the less experienced keyboard player could create complex, difficult or even manually unplayable pieces of music, because the MIDI data could be changed and corrected in the sequencer. Because only control data is saved, the sound can also be exchanged in the sequencer after a recording. This opened up completely new possibilities, also for experienced musicians, and shapes the way music is produced to this day:

Composition, arrangement and notation have been simplified considerably by combining a MIDI-capable keyboard with a computer. Variations of voices and song structures can be implemented very quickly and changed at any time. This time saving is an important factor in studio productions. Composers often make use of the computer and edit their concept directly in software, but many parts are still played in via a piano keyboard or a master keyboard.

With special converter devices, MIDI data can also be generated from the tones of acoustic instruments such as guitar or saxophone. The pitch being played has to be extracted from a complex sound spectrum, which, depending on the instrument and playing style, soon reaches its limits. For a guitar, for example, the converter must work out which note or notes are being played, even during a fret change or string bend. With the notes generated this way, completely different sounds can be produced, in combination with the acoustic instrument or entirely independently, via a synthesizer or sampler connected by MIDI.

In the 2000s, when cell phones had very little memory, the MIDI format was also used for ringtones.

Functionality

If you press a key on a keyboard, digital information about pitch and velocity is output at the keyboard's MIDI output; it can be transmitted to the MIDI input of a computer or used to control the tone generator in electronic instruments and sound cards. Such commands are, for example, note-on ("key for note x was pressed") and note-off ("key for note x was released").

For example, a full 3-byte record for a play command for a note might look like this:

  1st byte: note-on on MIDI channel 1
  2nd byte: note C3
  3rd byte: velocity 103

A controlled sound generator plays this tone until it receives the corresponding 3-byte command with a note-off status byte in place of the note-on status byte. Which sound the tone generator plays is set either beforehand on the device itself or with additional MIDI commands sent before the note command.
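How these three bytes are assembled can be sketched in a few lines of Python; the helper names and the note number are assumptions (octave numbering conventions differ between manufacturers, C3 = 48 is one common choice):

    # Building the 3-byte note-on/note-off messages described above.
    NOTE_ON, NOTE_OFF = 0x90, 0x80

    def note_on(channel: int, note: int, velocity: int) -> bytes:
        # Status byte = command nibble plus channel (0-15), then two data bytes.
        return bytes([NOTE_ON | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

    def note_off(channel: int, note: int, velocity: int = 0) -> bytes:
        return bytes([NOTE_OFF | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

    start = note_on(0, 48, 103)   # "note C3 on channel 1, velocity 103"
    stop  = note_off(0, 48)       # the matching release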

A computer can record and save this information and visualize, enter and manipulate it in various editors. The following editors are common here:

  • a list editor in which MIDI data can be edited directly;
  • the (arguably most widely used) piano roll editor, which displays the notes against a piano keyboard over time;
  • a score editor that shows the notation on screen; in practice this is often of limited use, however, because the score of manually played-in notes looks completely fragmented without further processing.

At the same time or later, the recorded data can be sent back to a MIDI instrument. The input of performance information is thus separated from the sound generation.

In addition to the musical commands, other data packets can be used to control the target device, such as program-change commands to select one of its often many hundreds of sounds. Many sound generators such as synthesizers and expanders also understand commands that directly influence their internal sound generation, allowing complex, individual sounds to be created from a set of simple basic waveforms.

In addition to electronic instruments, numerous other devices such as (digital) mixing consoles and multitrack recorders now use the MIDI protocol to exchange control information. Instead of transmitting note commands, the MIDI data packets can be used here, for example, to remotely control all mixer functions (faders, mute switches, panorama, etc.) or the transport functions of a recorder (play, stop, fast-forward/rewind).

Technology

MIDI interface on the PC: external part for connection to the 15-pin combined MIDI / game socket

MIDI uses a unidirectional protocol for serial data transmission without flow control. The transmission speed is 31250 bit/s (exactly 32 µs per bit). Each byte of 8 bits is framed by a start and a stop bit, so the complete transmission of a 3-byte message, 30 bits in all, takes 960 µs. Except for the missing parity bit, this corresponds to the protocol used by PC UARTs.
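The figures above follow from simple arithmetic, sketched here in Python:

    # 31250 bit/s; each byte occupies 10 bits on the wire (start + 8 data + stop).
    BIT_US = 1_000_000 / 31250        # 32 µs per bit
    BYTE_US = 10 * BIT_US             # 320 µs per byte
    MSG_US = 3 * BYTE_US              # 960 µs for a complete 3-byte message
    print(BIT_US, BYTE_US, MSG_US)    # 32.0 320.0 960.0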

In contrast to level-driven interfaces, MIDI uses a 5 mA current loop. The optocoupler in the receiving line galvanically isolates the MIDI devices from one another; the ground line (and cable shield) must not be connected at the MIDI-In side, to avoid ground loops.

Plug

Physically, the classic specification implements the connections as five-pin DIN sockets (DIN 5/180°, formerly known as diode sockets/plugs). Pins 4 and 5 carry the data, pin 2 is the ground line. According to the MIDI specification, pin 4 is connected to +5 V via a 220 Ω resistor. Pin 5 can be connected to 0 V via 220 Ω. If a MIDI-In port is connected at the other end (another 220 Ω in series with the optocoupler), a current flows over the line; this state is defined as logical "0". If pin 5 is not connected, no current flows, which represents a logical "1" and is also the idle state.

According to a subsequent expansion of the MIDI specification, MIDI outputs based on a 3.3 V supply voltage are now also possible. In this case, series resistors of 33 ohms and 10 ohms are used on pin 4 and pin 5, respectively.

In some systems (e.g. the “Roland MPU-401 AT” MIDI interface card as an ISA card) the connections are also designed as 6-pin mini-DIN sockets. In such cases, a connection adapter that is identical to a keyboard adapter "Mini-DIN plug to DIN socket" (PS / 2 to AT) helps.

On July 26, 2018, the specification was extended to 2.5 mm and 3.5 mm jack plugs . The three-pole form of such plugs is ideal, since only three of the five pins of the usual DIN socket are used for signal transmission. The smaller design of a jack socket allows the specification-compliant installation of MIDI interfaces even in particularly flat devices.

Connections

MIDI connections as 5-pin DIN sockets

There are three different types of MIDI connections, MIDI-In , MIDI-Out and MIDI-Thru .

  • MIDI-In is used by a device for reception.
  • MIDI-Out is used for sending.
  • MIDI-Thru forwards signals received at the MIDI-In unprocessed.
Master-slave principle

MIDI works on the master-slave principle. If you want to control a synthesizer from a keyboard, you connect the MIDI-Out socket of the keyboard (master) to the MIDI-In socket of the synthesizer (slave). If two sound modules are to be controlled as slaves A and B from one keyboard (master), connect the MIDI-Out socket of the master to the MIDI-In socket of slave A, and the MIDI-Thru socket of slave A to the MIDI-In socket of slave B.

A frequently encountered scenario is a computer running sequencer software, with a keyboard or electronic piano connected to play in the notes and several synthesizers to generate the sound. Usually the MIDI-Out socket of the keyboard is connected to the MIDI-In socket of the computer, and the MIDI-Out socket of the computer to the MIDI-In sockets of the synthesizers, possibly chained via their MIDI-Thru sockets. Note that the inevitable delays in the MIDI data stream add up and can lead to timing errors. Star-shaped MIDI cabling, in which the master keyboard sends its data to a central distributor (MIDI patchbay) to which all other MIDI devices are connected, avoids such problems.

Input devices

Master keyboard with 88 weighted hammer mechanism keys

A master keyboard generates note information in MIDI format and is used exclusively to control expanders and software synthesizers, or to record keystrokes into a sequencer. It contains no sound generation of its own. Sounds on the receiving devices can be selected using device-specific control functions such as bank switching.

Opposite these are the pure MIDI controllers: devices without a keyboard that have only buttons, faders and dials, with which incoming data can be modified or new data added to the stream on other channels. They are connected between a master keyboard and a receiver, or in parallel with them.

MIDI controller for use on a PC

Today, combined input devices are often used that can perform both note and extensive control functions. Some generate pure MIDI information, others can be connected to the PC either in addition or on their own.

Starr Labs Ztar Z6 series

For many acoustic musical instruments there are pickups for generating MIDI signals (e.g. guitar-to-MIDI converters, piano retrofits, etc.). The acoustic vibration is captured by a pickup or microphone and analyzed: the fundamental tone is determined and converted into a MIDI pitch, and controller values for modulating it are generated. This also allows tones of varying pitch (vibrato) to be reproduced. Some of these systems are also suitable for capturing human singing. Complex vocal lines such as blue notes and phrasings can, for example, be brought onto the music paper by whistling.

A number of instruments exist today as pure MIDI devices, whose sound generation is only possible with an expander. Examples include MIDI violins, MIDI guitars, wind controllers, and MIDI drums.

Computers as storage media

Signal converter

To communicate with a computer via MIDI, a signal converter, usually called a MIDI interface, is required. It translates the voltage levels and provides galvanic isolation. In principle, any serial data interface of a computer can be used for MIDI transmission with a suitable adapter, provided it can be set to the typical MIDI protocol parameters.

Commodore 64

The Commodore 64 played a pioneering role: on it, the German software authors Gerhard Lengeling and Karl Steinberg in particular programmed their first sequencers, laying the foundations for C-LAB, Emagic and Steinberg.

Atari ST

The commercial breakthrough for MIDI as a platform for professional music production is closely linked to the Atari ST , as it was delivered with a MIDI interface as standard . The development of important MIDI programs such as Cubase (Steinberg) or Notator (Lengeling) began on the Atari ST.

Commodore Amiga

With Bars & Pipes Professional, the software company Blue Ribbon Inc. established a new sequencer concept on the Commodore Amiga, whose functions could be extended almost at will thanks to a freely programmable plug-in interface. Most MIDI interfaces for the Commodore Amiga were offered as adapters for the serial port and are equipped with one MIDI-In, one MIDI-Thru and usually three MIDI-Outs. There are both synchronous and asynchronous MIDI interfaces; with an asynchronous interface, the various MIDI-Out ports can be driven independently of one another. With three MIDI-Out ports this yields 48 MIDI channels (3 × 16).

IBM PC

The MPU-401, originally an 8-bit ISA plug-in card from the manufacturer Roland, became the standard PC MIDI interface. Many computer games for MS-DOS PCs between 1988 and 1995 support it for controlling sound generators such as the internal Roland LAPC-I or the external MT-32. Other manufacturers such as Creative Labs supported the MPU-401 only in the so-called dumb mode (UART), while the intelligent mode, which guaranteed precise timing through hardware support, remained limited to Roland's own products.

The standard DIN sockets for MIDI are too large to be mounted directly on the slot bracket of a PC card. For a long time, the usual approach was to bring out the MIDI signals on a combined game/MIDI connector of the sound card and convert them to standard MIDI sockets via an adapter (see MIDI connections). Older PC sound cards modeled on the Sound Blaster established this connector, in which the game port and MIDI interface share a 15-pin D-Sub socket; it can still be found today in cheaper, non-professional PC MIDI interfaces. The sound card only has to provide two digital serial lines without flow control (MIDI does not use flow control). With this type of hardware implementation, part of the MIDI interface is moved into an external adapter, often sold separately and usually encapsulated in the thicker connector of a cable. Motherboards with on-board sound, MIDI and game controllers adopted this combined game/MIDI connector, matching chipsets that integrate these functions partially or entirely. The presence of a 15-pin D-Sub socket therefore says nothing about whether a MIDI interface is actually available or, if so, of what quality.

General

On the software side, the hardware was mostly MPU-401 compatible. Before that, MIDI interfaces for serial (COM) and parallel (printer) ports were also in use. Professional MIDI devices for PCs often use proprietary (manufacturer-specific) sockets between the plug-in card and the external unit. Today, however, there are many MIDI interfaces for USB, FireWire (mLAN) and LAN.

Expander (sound module)

Synthesizer

An expander is an external sound generator without its own keyboard; it receives notes exclusively via MIDI. Conversely, parameters that can be set on the expander are transmitted back and can be recorded. In some cases, the switching commands generated by the buttons on the device are also sent and recorded.

Synthesizer as an expander

An expander module extends the possibilities of a keyboard synthesizer or keyboard. Like their keyboard counterparts, expanders can provide many forms of sound generation. Because of their small footprint, they were at least as important as the keyboard versions for the development and spread of MIDI.

Expander technology makes it possible to generate a variety of sounds in relatively small devices. Without it, MIDI would certainly have spread less quickly and less widely, because other approaches are ruled out in many places by their space requirements and cost.

The sound generation processes of expander modules can be divided into two basic classes:

  1. sample-based sound generator modules;
  2. synthesizer modules.

Sample-based sound generator modules

ROM sound generator ( ROMpler ) and sample player

These sound generators provide a multitude of different basic sounds (multisamples) that are simply played back. They supply a basic stock of imitations of natural instruments. These devices were therefore equipped from the start with at least 16 voices and at least 6 timbres. Thanks to this multitimbrality, several different sounds can be called up on different MIDI channels at once; in the early days, only a few devices could do this.

This class of sound generators was the most widespread at the beginning of the expander era. Examples are the Roland U110 and Roland U220. A little later these devices gained simple editing functions, i.e. simple filters. This did not turn them into full-blown synthesizers, however, because these filters served more to refine the sound than to create new sounds. One example is the Korg M1r.

Sampler

Samplers provide no sounds of their own, but can record and play them back. Usually samples are loaded into the device from a sound library. With samplers, users can create their own multisamples and thus build elaborately compiled instrument imitations. Samplers also provide editing functions such as loops. They allow musicians to play in new ways, for example using drum loops and one-shot effect sounds within a keymap, i.e. a keyboard layout combining different samples.

The sampler technique has significantly shaped modern music. Music styles like hip-hop would be inconceivable without samplers. The technique also has a major impact on film music: to this day, good orchestral imitations are only possible with complex sampling technology.

Samplers were and are often used live, for example to extend the sound of a real drum kit with samples, or to provide choirs or backing vocals for bands with few members. A keyboard player triggers the choir samples with a key, or the drummer triggers them with an e-pad via MIDI.

Almost all popular samplers were only made as expander versions. A well-known example is the Akai S series.

Sample-based synthesizer modules

There are a variety of sample-based synthesis techniques:

  • LA synthesis is a synthesis process developed by Roland. It separates sounds into transients and sustained portions, which are then processed with a digital filter. Various combinations of these components can be used to create new sounds. Starting with the D50 and its expander version D550, many of the company's models are based on this form of sound generation to this day.
  • The Yamaha company, on the other hand, integrated samples into their FM process, in which ROM and RAM samples provide waveforms.
  • With the Wavestation, Korg had a device that could play samples rhythmically.
  • With devices from Waldorf and Ensoniq there was the wavetable method, in which individual cycles of samples are run through and new sounds are generated.
  • Kurzweil developed VAST synthesis, a process in which samples are so heavily modulated and distorted by complex DSP functions that the source sample can no longer be recognized.

Each form of sound generation has its own character. Keeping a selection of keyboard devices at hand, live or in the recording studio, is often impossible for lack of space, which is why expanders are usually used here.

Synthesizer modules with generic sound generation

This device class is as comprehensive as that of the sample-based synthesizers, but the sound generation is purely synthetic. It ranges from wall-sized modular analog systems, for example from Doepfer, to small digital modules of half-rack size.

Widely used synthesis methods include (a small selection):

  • analog synthesis with analog oscillators and filters (Roland MKS 80)
  • analog synthesis with digitally controlled oscillators and analog filters (Oberheim Matrix6)
  • FM synthesis (Yamaha DX7 / TX802)
  • additive synthesis (Kawai K5000R)
  • virtual synthesis methods of all kinds:

Virtual simulations are the most modern type of synthesis. They model the acoustic, mechanical, electrical and electronic construction of instruments, so their operation follows the structural characteristics of the original and is therefore completely different from classic synthesis methods. An example is virtually opening the lid of a grand piano, which models a mechanical process. A filter, on the other hand, has no real-world counterpart; it remains an electronic process that depends on the characteristics of the electronic circuit.

Examples of virtual synthesis are "virtual analog" synthesizers; "virtual strings" for guitar, bass guitar and string instrument simulations; "virtual brass" for wind instrument simulations; and various organ, piano and e-piano simulations of mechanical and electromechanical instruments.

Control options of the expander

Different synthesis techniques require different editing methods. Many expander modules are now equipped with controls such as knobs, faders and switches that can send MIDI signals. For example, the control curve of a filter can be sent from an expander module's knob to a sequencer for recording. The modules can receive their MIDI note commands from the sequencer or from a keyboard. Many of these devices are therefore, depending on the situation, sender, receiver or both at once.

Construction

Expander modules are normally built in the standardized 19-inch format, though there have always been exceptions. In the early days of MIDI, devices that did not follow this standard were even more common.

Sequencer

Hardware sequencer

A hardware sequencer is used to record MIDI data and arrange a piece of music. MIDI sequencers allow programming, recording and playback of recorded or programmed MIDI information such as note values, velocity or other control commands, e.g. modulation. The sequencers integrated into keyboards or grooveboxes are also very popular for live use. A combination of sound generator/synthesizer, master keyboard and hardware sequencer is called a workstation.

Software sequencer

Software sequencers are of great importance in the field of composition, as they offer further processing options in graphic form (subsequent editing, quantization, etc.) in addition to the standard functions (programming, recording, playback) and can now process not only MIDI but also audio material. This combination of audio and MIDI processing on a PC is called a DAW (Digital Audio Workstation).

The most widely used sequencer programs today are the aforementioned Cubase from Steinberg for Mac (OS X) and Windows PCs, and its counterpart Logic, which since being bought by Apple is only available for the Apple Macintosh. Steinberg (including Cubase) has since been bought by Yamaha. In addition, there are Rosegarden and MusE on Unix-like platforms and other solutions such as Sonar, Ableton Live, REAPER, Renoise and Reason.

Additional hardware

Numerous digital effects units and mixing consoles can also be controlled by MIDI controller commands. In this way, actions at the console can be recorded with a sequencer or a PC, e.g. to rework a live mix or to pre-program complicated moves. Standard mixes for recurring recording situations can also be saved. Larger FOH consoles and monitor desks can be remotely controlled this way by musicians.

Protocol

The protocol was designed in 1981 by Dave Smith, who proposed it in a paper for the Audio Engineering Society, and was first presented in 1983 at the NAMM Show in Anaheim, USA. The MIDI standard is maintained by the MMA (MIDI Manufacturers Association).

The following section requires some familiarity with the hexadecimal system. A byte is written as two hexadecimal digits, each with a value between 0 and 15; values from 10 upward are noted A to F. A two-digit hex number thus covers the range 0 to 255.

Most MIDI commands contain a channel number in addition to their command identifier and command data. The channel number is 4 bits wide, so 2⁴ = 16 channels can be addressed. Each channel controls one particular instrument, also called a "program".

Message types

Note names and MIDI note numbers

The status byte is the first byte of a command and also contains the relevant MIDI channel n, which ranges from 0 to 15. Many programs display the channel number increased by 1, i.e. as 1 to 16 instead of 0 to 15. The following bytes are data bytes. So that an interrupted data stream can be correctly resumed at any time, a status byte always begins with a 1 and a data byte with a 0. Status bytes therefore lie in the range 80₁₆ to FF₁₆ and data bytes between 00₁₆ and 7F₁₆. If a data byte is received where a status byte is expected, the last status byte is considered repeated and the current data byte is treated as part of its data (running status).

The first byte of each message below is the status byte; its upper nibble encodes the command and its lower nibble the MIDI channel n. kk, vv, cc, pp and ww denote data bytes.

  • 8n kk vv – Note Off: ends the playing of note kk, i.e. corresponds to releasing a key. If the note was not playing, the message is simply ignored. A "release velocity" vv is sent along. (In the .mid file format, the time span between two commands, e.g. note-on to note-off, is additionally stored as a delta time; a quarter note is divided into units, e.g. 960, represented with a 14-bit value.)
  • 9n kk vv – Note On: starts playing note kk. The velocity vv encodes how hard the key was struck, in 127 steps from 1 (very soft) to 127 (very hard). The value 0 is defined as a note-off command.
  • An kk vv – Polyphonic aftertouch: describes, per key, a change of key pressure vv while key kk is already held. This data is neutral; that is, it must be assigned to another parameter (e.g. to controller 11, Expression, with which the sound of a saxophone can be altered while the note sounds, via pressure on the held key).
  • Bn cc vv – Control change: changes the state of controller cc (see next section) to the value vv.
  • Cn pp – Program change: selects the instrument pp to be played on the given channel.
  • Dn vv – Monophonic (channel) aftertouch: describes a change of key pressure vv while keys are held, for all keys together. As with polyphonic aftertouch, this data is neutral.
  • En vv ww – Pitch bend: position of the pitch-bend wheel. The two data bytes vv and ww together form a 14-bit value between 0 and 16383; note that they are combined in swapped order, i.e. (byte 2 · 128) + byte 1 = 14-bit value. The center position of the wheel corresponds to the value 8192.
  • F0…F7 xx – System (exclusive) messages: control messages, often device-specific; their length is also device-specific (xx = data bytes).
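The running-status rule described above can be made concrete with a small parser sketch in Python; the function and table names are made up, and system messages (F0₁₆ and above) are deliberately skipped for simplicity:

    # Expected data-byte counts per command nibble (see the table above).
    DATA_BYTES = {0x80: 2, 0x90: 2, 0xA0: 2, 0xB0: 2, 0xC0: 1, 0xD0: 1, 0xE0: 2}

    def parse(stream: bytes):
        status, buf = None, []
        for b in stream:
            if b & 0x80:                     # status bytes have the top bit set
                status, buf = b, []
            elif status is None or (status & 0xF0) == 0xF0:
                continue                     # stray data byte or system message: skip
            else:
                buf.append(b)
                if len(buf) == DATA_BYTES[status & 0xF0]:
                    yield (status, *buf)
                    buf = []                 # status byte is retained: running status

    # Two note-ons on channel 1; the second omits its status byte:
    print(list(parse(bytes([0x90, 60, 100, 64, 100]))))
    # [(144, 60, 100), (144, 64, 100)]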

Controller

The purpose of a MIDI controller (Continuous Controller, CC; predefined, fixed controllers) is to give users the playing aids typical of their instrument, e.g. a sustain pedal for the pianist (CC064), or a volume control (CC007 or CC011) and a Leslie speed switch (e.g. CC004, CC006) for the organist.

The synthesizer player has a whole range of further means of shaping the performance, e.g. a modulation wheel (CC001), a pitch-bend lever, various pedals and footswitches, additional sliders, pressure sensors and switches, breath controllers (CC002) or light-beam controllers for the hands. There are even external controllers driven by brain activity (brain-to-MIDI).

Good feel and fine control over these playing aids enable expressive playing. They usually have mechanically movable elements that the user operates; the movement is translated into messages (controller data) and passed on to devices and sound generators, much like a game controller. Some playing aids are bound to specific controller commands by the MIDI protocol, but on the transmitter (e.g. keyboard) as well as on the receiver (expander module, plug-in) they can usually be freely remapped to other controllers (e.g. the modulation wheel as a volume control).

More recently, controllers have increasingly been used while ignoring the original semantics: the functions assigned to a command differ completely from those defined in the MIDI protocol. This is often the case when music is edited entirely on a computer. Remapped controllers are very common for DJ programs in particular. Here a MIDI signal may open a song-selection dialog, or control functions such as start and stop. Conversely, display elements in controllers can be driven this way. The MIDI command semantically assigned to pitch can, for example, encode the length of a repetition loop. The controller's haptics matter particularly here, since the required reaction time and sensitivity cannot be achieved with a mouse, for example when beat-matching two songs. The proprietary character of such applications can be countered by freely assigning the MIDI commands both in the program and in the controller, although this is far from the rule.

Classic applications of MIDI controllers, by contrast, adhere strictly to the semantics of the protocol. They matter more in music production, where they allow device-specific sound parameters of the current instrument to be controlled easily, and different devices remain interchangeable.

A controller sends a certain value on a certain channel with a certain controller number. Simple controllers take values from 0 to 127, which with pitch changes quickly leads to ugly, audible stepping. The controllers 0–31 can therefore be paired with a so-called LSB controller 32–63 to obtain a much higher resolution. In practice, however, this technique is seldom used, because a resolution of 128 steps is sufficient for, say, volume.

Switches such as the sustain pedal (number 64) can in theory send values between 0 and 127, but since a switch has only two states, values from 0 to 63 are usually interpreted as "off" and values from 64 to 127 as "on".

When using a programmable control unit, it helps greatly to know the controller numbers and what they conventionally control. The most important controllers are listed in the table below. The first byte of a controller command is always Bn₁₆, where n is the channel number; vv stands for the value the controlled sound parameter should assume.
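As a sketch, here is the construction of a control-change message and the usual switch interpretation in Python; the helper names are assumptions, the controller numbers are those mentioned above:

    def control_change(channel: int, controller: int, value: int) -> bytes:
        # Bn cc vv: status byte Bn, controller number, value.
        return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F])

    def switch_is_on(value: int) -> bool:
        return value >= 64                        # 0-63 = "off", 64-127 = "on"

    sustain_down = control_change(0, 64, 127)     # sustain pedal (CC 64) pressed
    volume = control_change(0, 7, 100)            # channel volume (CC 7)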

Manufacturer-specific extensions

The registered (RPN, Registered Parameter Number) and manufacturer-specific (NRPN, Non-Registered Parameter Number) controllers are used to control parameters that do not fit into the normal controller range. Parameters with an index between 0 and 16383 can be changed. The index is set using the controllers

  • RPN-MSB (65₁₆)
  • RPN-LSB (64₁₆)
  • NRPN-MSB (63₁₆)
  • NRPN-LSB (62₁₆)

Then either a normal data-entry controller message (06₁₆) with a value between 0 and 127, or a data increment/decrement (60₁₆ or 61₁₆) is sent, in which case the third byte is ignored. It is important that the MSB controller is always sent before the LSB controller (see table).
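As an illustration, a sketch of a complete RPN sequence in Python. It sets the registered parameter 0/0 (pitch-bend sensitivity) to 12 semitones and then deselects the parameter with the customary null index 127/127; the helper name cc is an assumption:

    def cc(channel: int, controller: int, value: int) -> bytes:
        return bytes([0xB0 | channel, controller, value])

    sequence = (
        cc(0, 101, 0)       # RPN-MSB (65h): index MSB is sent first
        + cc(0, 100, 0)     # RPN-LSB (64h)
        + cc(0, 6, 12)      # data entry (06h): 12 semitones
        + cc(0, 101, 127)   # RPN null index: deselect the parameter again
        + cc(0, 100, 127)
    )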

System exclusive messages

System exclusive messages (SysEx) are part of the MIDI transmission protocol .

While the other MIDI commands are largely fixed by the standard, system-exclusive messages let hardware and software manufacturers transmit information not otherwise provided for in the MIDI protocol. For example, the contents of a device's memory can be sent to a computer for backup, and model names can be exchanged between devices. The MIDI protocol guarantees that system-exclusive messages are not interrupted by other commands; they are therefore transmitted during pauses so that no unpleasant dropouts or delays occur.

A system-exclusive message begins with F0₁₆ and ends with F7₁₆. The second byte is the manufacturer ID, which is assigned by the MIDI Manufacturers Association and is unique worldwide. If the second byte is 00₁₆, the following two bytes identify the manufacturer instead. The manufacturer ID enables devices to filter messages, so that only the devices concerned interpret them. Between the manufacturer ID and the final byte, any number of data bytes may follow, but they may only contain values between 00₁₆ and 7F₁₆ and thus allow only a 7-bit character set. As a rule, a few bytes are sent first to determine which device the message is intended for; their meaning is defined by the manufacturer concerned.

Examples:

  • F0 41 … F7
  • F0 00 20 33 … F7
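A sketch of this framing in Python; the payload bytes are invented for illustration, only the framing and the 7-bit restriction follow the rules above:

    def sysex(manufacturer_id: bytes, payload: bytes) -> bytes:
        if any(b > 0x7F for b in manufacturer_id + payload):
            raise ValueError("SysEx data bytes must stay in the 7-bit range")
        return b"\xF0" + manufacturer_id + payload + b"\xF7"

    msg1 = sysex(b"\x41", b"\x10\x42\x12")        # one-byte manufacturer ID
    msg2 = sysex(b"\x00\x20\x33", b"\x01\x02")    # three-byte manufacturer ID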

Extension of the standard by MPE

One point of criticism of MIDI is that controllers usually act monophonically, i.e. they affect all notes playing in a program equally (exception: the rarely implemented polyphonic aftertouch).

To allow greater expressiveness, MPE was defined in 2017 as a backward-compatible extension of the MIDI standard. Under MPE, each note is assigned its own MIDI channel, so that the notes of a played chord can be modulated independently of one another.

MPE is used by devices such as the Roli Seaboard or the LinnStrument, which in addition to velocity and key pressure (aftertouch) also evaluate finger movements along the X and Y axes.

see main article MIDI Polyphonic Expression

Synchronization

Synchronization is always necessary when two or more MIDI or audio devices have to be linked together into one system. There are two MIDI protocols for this: the older and less precise MIDI Clock (MC), which references the musical tempo of a song and is intended for musical use, and the newer MIDI Time Code (MTC), which references an absolute time base and is relevant for recording systems.

Each system group has exactly one master , all other devices are slaves that must follow the master.

If all devices are to send data, a multi-channel connection must be established, see MIDI ring.

Synchronization is necessary in many areas. Two fundamentally different purposes can be identified, from which various mixed forms result:

Synchronization of systems

Here, various recording, playback and processing systems are connected into one system unit. They should start, stop and run simultaneously and in sync, and of course must not drift apart during the piece of music, as would happen if the individual systems were merely started manually at the same time. This is exactly what MTC provides.

Some synchronization examples from practice (MTC):

  • Two DAWs (Digital Audio Workstation) are interconnected to form a network.
  • A DAW must be interconnected with a hardware sequencer.
  • A DAW must be interconnected with a drum machine.
  • A tape machine must be interconnected with a DAW.
  • A tape machine, a sequencer and a hard disk recorder must be interconnected.
  • A tape machine or a DAW must be connected to a mixer with its own automation system for mixing.
  • Special application: several computers exchange data via the MIDI interface (see MIDI Maze).

With (analog) tape machines, the timecode must be recorded onto the tape beforehand. An SMPTE timecode generated by an SMPTE generator is recorded on the last track of the machine (track 8 on an 8-track, track 16 on a 16-track, etc.). Since SMPTE code disturbs adjacent channels through crosstalk, the outermost track is used, so that besides the timecode track itself only one further track is compromised for recordings.

An SMPTE reader "reads" the recorded SMPTE code and converts it into MTC, which the connected devices can follow.

Since only professional machines have complex synchronizers, the tape machine is usually the MTC master via its SMPTE code, and all other systems must follow it. Another reason is that even professional tape machines always need winding time, so it rarely makes sense to use, say, a DAW as the master.

Depending on the type of SMPTE code, it can make sense to record the old timecode from the tape as an audio signal when transferring tape material, in order to minimize timing differences. Sometimes, however, the timecode on old tapes is in such poor condition that synchronization is impossible and all tracks have to be transferred in one pass. The timecode can later be repaired with special software or devices and made usable again.

Synchronization of synthesizer functions and effects devices

Here, sequencers and synthesizers or effects devices are run in step with one another in order to trigger musically important events.

Some synchronization examples from practice (MC):

  • A delay should be set exactly to the rhythm of a piece of music in order to provide a musical echo for a solo instrument. The effects device must of course be able to process MC. Especially when the tempo of the piece changes, this is only achievable this way; it could not be done manually.
  • A pad sound of a synthesizer is to be rhythmicized ("chopped up"):
    • either a MIDI gate is controlled so that it opens and closes in the rhythm of the piece of music,
    • or the MC is used to synchronize a volume modulation in the synthesizer.
  • A sound of a synthesizer should wander in the panorama to the rhythm of a piece of music.
  • A sound of a synthesizer should open and close the filter in the rhythm of a piece of music.

The possibilities for using MC are only limited by the technical possibilities of the connected devices.

MIDI clock and song position pointer

  • MIDI clock (MC)

The MIDI clock is based on musical bars and note values. The unit is a tick, typically 1/96 of a beat (a quarter note in most songs) in sequencers; on the wire, the clock byte is sent 24 times per quarter note. The rate at which the clock information is sent thus follows from the song's BPM setting. The following messages are possible:

  1. F2₁₆ note position within a song, 0–16383 (two data bytes)
  2. F3₁₆ song selection, 0–127 (one data byte)
  3. F8₁₆ timing clock, sent 24 times per quarter note (during playback)
  4. FA₁₆ start
  5. FB₁₆ continue from the current position
  6. FC₁₆ stop

These signals make all sequencers start at the same time, follow the given tempo and stop together.
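The interval between two clock bytes follows directly from the tempo, as this small sketch shows:

    # F8h is sent 24 times per quarter note.
    def clock_interval_ms(bpm: float) -> float:
        quarter_ms = 60_000.0 / bpm
        return quarter_ms / 24

    print(round(clock_interval_ms(120), 2))   # 20.83 ms between clock bytes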

  • Song Position Pointer (SPP)

While MC basically only sends clock ticks, SPP is responsible for transmitting the position within the song. It is counted in sixteenth notes and is limited to a maximum of 1024 bars.

The relatively coarse subdivision of SPP commands is used for the coarse synchronization of devices, while the exact synchronization takes place via MC.

Connected devices can thus recognize where they are in a piece of music even while a sequencer is stopped. They can also trigger functions at certain song positions, such as drum machines playing preprogrammed patterns.

MIDI time code

The MIDI Time Code (MTC) is a mapping of the SMPTE timecode to the MIDI format. In contrast to the MIDI clock, MTC is pure time information; the mapping to song, position within the song and playback speed must be done by the software.

For larger jumps in position, the master sends the absolute position within a SysEx message:

F0 7F 7F 01 01 hh mm ss ff F7
  1. F0₁₆ SysEx message start
  2. 7F₁₆ this manufacturer code denotes a universal realtime message
  3. 7F₁₆ this channel number denotes a broadcast message
  4. 01₁₆ identifies the message as MIDI timecode
  5. 01₁₆ indicates a full, absolute timecode message
  6. hh consists of the SMPTE rate and the hours in the form 0rrhhhhh₂:
    1. hhhhh₂ hours from 0–23
    2. rr = 00₂: 24 frames/s (film)
    3. rr = 01₂: 25 frames/s (PAL video)
    4. rr = 10₂: 29.97 frames/s (drop-frame timecode, NTSC video)
    5. rr = 11₂: 30 frames/s (non-drop timecode, NTSC video)
  7. mm minutes from 00–59
  8. ss seconds from 00–59
  9. ff frame 00–nn, depending on the frame rate
  10. F7₁₆ SysEx message end
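Assembling such a full-frame message can be sketched as follows; the position 01:02:03:04 at 25 frames/s is an arbitrary example:

    def mtc_full_frame(hours, minutes, seconds, frames, rate_code):
        hh = ((rate_code & 0x03) << 5) | (hours & 0x1F)   # 0rrhhhhh
        return bytes([0xF0, 0x7F, 0x7F, 0x01, 0x01,
                      hh, minutes, seconds, frames, 0xF7])

    msg = mtc_full_frame(1, 2, 3, 4, 0b01)   # rr = 01: 25 frames/s (PAL)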

During playback, only short messages of the form

F1 xx

are sent. The byte xx is best understood in its bit representation:

  1. 0000 yyyy₂ frame counter, low nibble
  2. 0001 000y₂ frame counter, high nibble
  3. 0010 yyyy₂ seconds counter, low nibble
  4. 0011 00yy₂ seconds counter, high nibble
  5. 0100 yyyy₂ minutes counter, low nibble
  6. 0101 00yy₂ minutes counter, high nibble
  7. 0110 yyyy₂ hours counter, low nibble
  8. 0111 0rry₂ hours counter, high nibble + frame rate rr (see above)

When the tape rewinds, however, the messages come in reverse order.
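Conversely, a receiver collects the eight data bytes and reassembles the position; a sketch, assuming the pieces arrive in ascending order 0 to 7:

    def decode_quarter_frames(pieces):
        n = [b & 0x0F for b in pieces]        # the eight data nibbles
        frames  = n[0] | ((n[1] & 0x1) << 4)
        seconds = n[2] | ((n[3] & 0x3) << 4)
        minutes = n[4] | ((n[5] & 0x3) << 4)
        hours   = n[6] | ((n[7] & 0x1) << 4)
        rate    = (n[7] >> 1) & 0x03          # rr, see the rate table above
        return hours, minutes, seconds, frames, rate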

File formats

MIDI is in itself a transmission format with which commands are exchanged between digital musical instruments in approximate real time. It soon proved useful, however, to store such commands in files. Playing back such a MIDI file at the times encoded in the MIDI commands requires suitable playback software.

In addition to real-time MIDI commands, MIDI files can contain further information encoded as so-called meta-events. The first byte of a meta-event is always FF; the following bytes contain the specific command and its data. One of the most important meta-events is the lyrics event FF 05, with which song lyrics can be stored in a MIDI file. Meta-events are not sent to connected instruments during playback, but they can be interpreted by the playback software. A typical example is displaying lyrics for karaoke performances.
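Reading the lyrics events out of a file can be sketched briefly; this assumes the third-party Python library mido (any other SMF parser would do) and a hypothetical file name:

    import mido  # assumption: mido is installed (pip install mido)

    mid = mido.MidiFile("song.mid")
    for track in mid.tracks:
        for msg in track:
            if msg.is_meta and msg.type == "lyrics":
                print(msg.text)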

There are three file formats for saving MIDI commands in standard MIDI files (SMF for short):

  • SMF 0 - With format 0, all MIDI channels are combined in one track. This format is also used for cell-phone ringtones and can be converted to format 1 with common sequencer programs.
  • SMF 1 - In format 1, each channel has its own track and optionally its own name. Different voices and instruments can thus be better identified.
  • SMF 2 - In format 2, each track consists of independent units. In contrast to SMF 1, several tracks can have the same MIDI channel number.

The standardized file extension for MIDI files is .mid. The extension .kar is also used: these so-called karaoke files additionally contain the complete lyrics, while the file format is exactly the same as .mid. However, many programs do not recognize files with the .kar extension as MIDI files (in practice, such a file can simply be renamed). Windows distinguishes between the two extensions so that karaoke files are recognizable as such; the file can still be played with karaoke-capable software as well as with normal players. Microsoft uses the extension .rmi for RIFF RMID files, in which a regular MIDI file is packed into a RIFF container; RIFF RMID is not an official MMA or AMEI MIDI standard. The extension .syx is used for files containing MIDI SysEx data, mostly sound presets of synthesizers.

Alternative transmission routes

USB and FireWire

Since MIDI is essentially a data protocol for controlling electronic musical instruments, it is fundamentally irrelevant which hardware the data travels over. To connect external MIDI interfaces to a computer cheaply, across platforms and above all quickly, more and more manufacturers equip their devices with USB or FireWire (IEEE 1394) ports in addition to the classic MIDI sockets. The MIDI commands are tunneled through USB or FireWire. Several virtual MIDI connections can be realized this way, making the limit of 16 channels per connection practically irrelevant. Compared with the PC game port, this type of MIDI interface is a much more reliable way to connect MIDI devices to a computer, since the drivers of these relatively expensive devices are usually optimized for timing accuracy by their manufacturers. For professional use, interfaces with four to eight individually addressable out ports are used, which significantly reduces timing problems (see also the following paragraphs).

When several identical devices (e.g. keyboards) are connected via USB, their identical device names are distinguished by appended numbers. If the USB assignment changes, this numbering is redone, which can mean that the devices are no longer found and have to be reassigned in the software.

The USB-MIDI protocol extends the conventional MIDI protocol. MIDI signals are always transmitted in 32-bit packets. The lower nibble of the first byte contains the code index number (CIN), which for the MIDI commands 8..E corresponds to the command encoded in the MIDI status byte. The upper nibble holds the cable number (0..15), which corresponds to a so-called "jack"; there can be up to 16 per USB endpoint. Each jack carries 16 channels in accordance with the MIDI standard. The following three bytes correspond to the MIDI standard; longer MIDI commands such as SysEx are opened and continued with code index 4 (3 further bytes each) and finally closed, depending on the number of remaining bytes (1, 2 or 3), with code index 5..7.
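A sketch of this packing for the simple 3-byte channel commands (8..E); SysEx continuation packets with code indices 4 to 7 are omitted:

    def usb_midi_packet(cable: int, midi_msg: bytes) -> bytes:
        # For commands 8..E the CIN equals the command nibble of the status byte.
        cin = midi_msg[0] >> 4
        header = ((cable & 0x0F) << 4) | cin
        return bytes([header]) + midi_msg.ljust(3, b"\x00")

    pkt = usb_midi_packet(0, bytes([0x90, 60, 100]))   # note-on as a 32-bit packet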

Wireless MIDI

Following the trend toward wireless data transmission, devices are also offered that transmit MIDI data by radio. These devices usually use transmission frequencies in the ISM band and send an "All Notes Off" in the event of transmission errors, to avoid hanging notes. According to the manufacturers, they have ranges of 10 to 80 meters.

IP-based networks

For some time there have been a number of virtual device drivers that allow MIDI data to be transmitted over IP-based networks. While most of these products transmit MIDI data over the network via TCP or UDP on a proprietary basis, there is now also an RFC for standardized, RTP-based transmission of MIDI data over networks: RFC 4695. Several open-source implementations of this standard exist, and Apple's network MIDI in Mac OS X Tiger and iOS 4.2 is based on this specification. There is also a Windows RTP-MIDI driver that runs on Windows XP through Windows 7 (32-bit and 64-bit) and is compatible with the Apple implementation.

Other

For a time, efforts were made to establish mLAN, a standard developed by Yamaha combining MIDI and audio data over FireWire. It did not catch on, however, and has since been discontinued.

Another option is the transmission of MIDI data via S/PDIF interfaces or embedded in audio signals.

Criticism

Advantages and disadvantages compared to digital audio recording

Before MIDI came into being, digital audio recording was extremely expensive and therefore reserved for a few productions. The technically rather simple MIDI, which records pure control signals very efficiently, thus suddenly opened up new horizons for a broad mass of musicians in the early 1980s. Thanks to MIDI, even amateur musicians with the appropriate knowledge can create complex musical structures: strings and wind instruments, for example, can be imitated synthetically while drums, guitar and vocals are recorded on audio tracks.

MIDI signals contain only control data. Digital audio signals, by contrast, form a continuous binary data stream, created by very fast sampling (digitization) of analog vibrations from an audio source (microphone or electronic instrument); they have a constantly high data rate and, after digital-to-analog conversion, can be made audible through an amplifier and loudspeaker system. MIDI data, on the other hand, is only generated when keys are pressed or released, resulting in much smaller amounts of data. Recorded MIDI signals can easily be sent to a different sound generator afterwards, and the recorded MIDI data can be edited as required, for example to move wrong notes to the correct pitch or playback position or to adjust their dynamics. Compared with the post-processing of digital audio recordings, all these changes to the original recording cost very little computing effort and are possible in all sequencer programs available today.

Latency and jitter

The speed of the MIDI interface is outdated from today's technical point of view. Since up to 3 bytes must be transmitted to trigger a tone, taking 0.960 ms, delays arise whenever several events should actually sound simultaneously; on instruments with a sharp attack these can be audible. The arbitrary serialization order of the notes also plays a role, since it influences the effective sounding note lengths in the receiving device.

A clearly audible delay caused by MIDI occurs when too much MIDI data is to be transmitted at once, as in the following example scenarios:

  1. In connection with system-exclusive data: this message type occupies far more than the typical three bytes (1 status byte and 2 data bytes).
  2. Intensive use of pitch bend or similar controllers, e.g. expression controllers.
  3. When chaining several hardware sound generators via MIDI-Out or software-buffered MIDI-Thru connections: each hop adds a further delay of at least one byte.

Several countermeasures are taken when working in the studio:

  1. Avoiding the loads mentioned above
  2. Distributing the MIDI data across several parallel outputs of one or more MIDI interfaces
  3. Transmitting particularly time-critical synchronization data such as MTC or MIDI clock on separate MIDI connections
  4. "Advancing" the MIDI signal flow: parts of the MIDI data are played back slightly early
  5. Using sequencer functions to thin out the data, so that redundant MIDI data is filtered out and never enters the MIDI data stream

However, these measures are often impossible in live applications. A master keyboard usually has only one MIDI-Out, so everything it triggers in the way of notes and controllers must pass through this one interface.

Only when note values are transmitted in quick succession, and a note-on command with velocity 0 is used instead of the explicit note-off command, can the repeated status byte be omitted (running status). Even then, this reduces the amount of data for such a block by only about one third.
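The saving can be quantified with a quick sketch:

    # With running status, every event after the first needs only 2 data bytes.
    def wire_bytes(events: int, running_status: bool) -> int:
        return 1 + 2 * events if running_status else 3 * events

    print(wire_bytes(64, False))   # 192 bytes
    print(wire_bytes(64, True))    # 129 bytes, roughly a third less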

Temporal resolution

In addition to the uneven delays there is also a quantization of MIDI timing: the precision of a MIDI sequence is usually defined as 1/96 of a quarter note (see MIDI clock); many sequencers and programs achieve higher accuracy today, with divisions of 240 or 480. When using plug-ins, depending on the host program, the quantization can be removed entirely and re-applied later. What matters here is the PPQN (pulses per quarter note) the device can process. This also means that the smallest reproducible timing fluctuation depends on the song tempo: in a song at 120 bpm in 4/4 time, this value is, for example, 5.21 milliseconds at 96 PPQN. Soft tempo changes or certain playing techniques, such as laid-back or pushed timing, therefore cannot be reproduced precisely.

Real-time capability of MIDI

Depending on the protocol used and the amount of data to be transmitted, delay times vary: transmitting a typical 3-byte MIDI message takes less than 1 ms over standard MIDI, which is faster than the typical latency observed on the USB bus; that is 1–2 ms at best but can reach 50 ms, and the chip hardware and the operating system also play a role. Using a to-host or serial interface at 230 kbit/s, batches of 10 to 30 notes can be transferred even faster than over USB 1.0. With immediate execution, even a chord fingered with both hands has all its notes transmitted in such quick succession that they sound subjectively simultaneous. USB 2.0 and 3.0 bring no improvement here; they often have even higher latencies. Only from roughly 30–100 note events, as at the start of a heavily orchestrated piece, is USB faster overall thanks to its higher bandwidth, but depending on packet size it still shows non-reproducible latency and thus jitter. USB is therefore less suited to live playing and more to producing music from the DAW.

Controller resolution

All analog, continuous aspects, such as the velocity of a keystroke, the behavior of the sound while the key is held (aftertouch) and after release, and real-time modifications of amplitude, frequency, etc., are resolved into the 128 steps of the MIDI format. This inevitably coarsens the manual performance during recording and prevents truly continuous changes, which is problematic above all for volume and playing speed. Many modern keyboards, sound generators and synthesizers smooth incoming controller values and glide to new values, but this leaves the musician with only limited control over the sound.

A real instrument has no stepping in pitch or volume. Even when controlling artificial sound parameters in synthesizers, such as the cutoff frequency of a filter, this "stepping" is sometimes audible.

Even instruments with apparently fixed pitches, such as flutes or pianos, show a certain variance in pitch during play, which in wind instruments depends on the air flow and in string instruments on the touch, the instantaneous volume and all kinds of resonance effects. The fluctuations of amplitude and phase within a complex tone can therefore only be reproduced very roughly by MIDI hardware and software. This is particularly noticeable with velocity-sensitive instruments (Bösendorfer 290SE). Vibrato and blue-note techniques can likewise hardly be represented correctly.

This problem can only be worked around by chaining two MIDI controllers (MSB/LSB), which further tightens the timing budget. Another example of two linked controllers is the bank select command (MIDI CC 0 and 32 of the General MIDI standard), which allows up to 128 banks to be addressed, each of which can in turn contain up to 128 sub-banks. In theory, 16384 sound banks of 128 sounds each can be addressed this way. Specialized sound libraries therefore offer different articulations and playing styles of acoustic instruments to work around this limitation.
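The underlying MSB/LSB arithmetic, sketched in Python (the function name is an assumption):

    def combine_14bit(msb: int, lsb: int) -> int:
        # Two 7-bit controller values form one 14-bit value, 0..16383.
        return ((msb & 0x7F) << 7) | (lsb & 0x7F)

    bank = combine_14bit(1, 3)        # bank select: CC 0 = 1, CC 32 = 3 -> bank 131
    volume = combine_14bit(100, 64)   # CC 7 paired with its LSB counterpart, CC 39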

Tunings and scales

The musical limitations weigh heavily: MIDI was designed to control synthesizers that reproduce tones in equal temperament. For other tunings, or scales that do not have twelve steps, the note data must either be artificially modified by the controller or sequencer, or reinterpreted in the receiving devices. Many high-quality keyboards now offer a corresponding setting, but usually only one tuning can be used within a piece of music. For more complex tuning requirements there is special software (so-called microtuners), which requires a MIDI interface (e.g. a USB-MIDI adapter) and a computer; the added latency can make precise live interaction more difficult.

Extended formats

With XG-MIDI there is an extension from Yamaha which, like Roland's GS-MIDI, brings improvements in the compatibility of Standard MIDI Files but never advanced beyond a proprietary system. Only GM (General MIDI) has established itself as a quasi-standard. Both extensions use the normal MIDI system with no changes to the MIDI hardware or protocol.

MIDI 2.0

To remedy the weaknesses of the aging MIDI standard, the manufacturers organized in the MMA industry association have drawn up the specification for a new MIDI standard with the following key features:

  • Two-way communication
  • Devices inform each other about their capabilities via "Capability Inquiry" messages and adapt configurations to each other
  • Higher velocity with 16 bits
  • More CC controller types: 128 CC controllers, 16,384 predefined controllers (comparable to RPN), 16,384 user-definable controllers (comparable to NRPN) - each with 32-bit resolution
  • 32-bit resolution for all controllers: CC, channel and poly aftertouch, pitch bend
  • Improved timing and synchronization

MIDI 2.0 is backward compatible, i.e. conventional MIDI devices can still be connected without using the extended capabilities of MIDI 2.0. The hardware connection via DIN socket therefore remains. In addition to USB 2.0 connections, USB 3.x is also specified, and the USB-MIDI protocol is extended accordingly.

The new standard was unanimously adopted by the MIDI Manufacturers Association at the US music fair NAMM in January 2020.

The first devices were expected on the market in 2020; Roland presented a MIDI 2.0-capable master keyboard at NAMM 2020.

Literature

  • Christian Braut: The MIDI book . Sybex, Düsseldorf et al. 1993, ISBN 3-8155-7023-9 .
  • Bernd Enders , Wolfgang Klemme: The MIDI and SOUND book for the ATARI ST. MIDI basics, MIDI commands, programming the sound chip, controlling MIDI instruments. Everything about professional music software, MIDI program tools, conversion tables . Markt & Technik Verlag AG, Haar near Munich 1988, ISBN 3-89090-528-5 .
  • Erol Ergün: Cubase SX / SL in practice. The new generation of MIDI / audio music production. Getting started, professional tips and strategies for music production with Steinberg Cubase SX / SL . 3rd, updated and expanded edition. PPV, Presse-Project-Verlags-GmbH, Bergkirchen 2005, ISBN 3-937841-22-9 .
  • Dieter Stotz: Computer-aided audio and video technology. Multimedia technology in use . 3. Edition. Springer, Berlin et al. 2019, ISBN 978-3-662-58872-7 .
  • Rob Young: Working with MIDI files - the way to professional-sounding sequencer songs . Carstensen, ISBN 978-3-910098-34-3 .

Web links

Commons: MIDI – collection of images, videos and audio files
