Metronome using General MIDI sounds

When you’re trying to synchronize musical events to a timeline, it’s sometimes useful to have a metronome that tells you where the computer thinks the beat is. The metronome makes a percussive sound on each beat, and ideally makes a unique sound on the first beat of each measure. Most DAW and MIDI sequencing programs have such a feature. This example patch demonstrates a way to implement a transport-synchronized metronome using MIDI messages to trigger percussive clave sounds in a synthesizer.

A computer metronome could trigger any sort of sound: MIDI notes, percussion sounds, recorded samples, synthesized beeps, etc. In this example, I’ve chosen to use two specific MIDI notes, key numbers 75 and 76 on MIDI channel 10, which, according to the General MIDI standard, play the sounds of Claves and High Wood Block. Those sounds should be roughly the same on any GM-compliant synthesizer, including the synthesizers built into the Mac and Windows operating systems.

The “global transport” in Max is a timekeeper that’s separate from, but linked to, the main Max event scheduler. It governs the behavior of timing objects that use tempo-relative timing syntax (note values, bars.beats.units notation, or ticks). The transport object provides an interface within a Max patch for communicating with the global transport: it can control the global transport timekeeper by setting its current time position, its tempo, and its time signature, and by starting or stopping it. The transport object can also be queried for that information; a bang in the inlet of transport will send the current status info out its outlets.


metronome-MIDI.maxpat

This example uses both the ‘setting’ and ‘getting’ capabilities of the transport object. The metro object in this patch is initialized with a timing interval of 4n, meaning that when it’s turned on it will send out a bang every quarter note (which, at the default tempo of 120 bpm, will be every 500 ms). It also has its quantize attribute set to 4n, which means that its output will be snapped precisely to the quarter-note timepoints (the beats) of the global transport. Because the metro interval is stated as a tempo-relative timing value (4n), this metro will run only when the global transport is turned on.

The textbutton labeled Rewind triggers a 0 message to the right inlet of transport, which resets the global transport to the time position of 0 ticks. The messages 1 and 0 from the toggle labeled Start/Stop turn the transport on or off. In this example, I’ve provided separate toggle switches for turning the global transport on and off and for turning the metro on and off, because in a more fully developed program the metronome controls and the transport controls would probably be encapsulated as separate parts of the program. (The user should be able to start and stop the transport in one place, and should be able to turn the metronome sound on or off in another place.)

Each time the metro sends out a bang, at moments that will necessarily be synchronized with the beats of the global transport, it queries the transport object to find out which beat of the measure it is. If it’s the first beat of the measure, it plays the Claves sound with a dynamic of fortissimo (velocity 127); if it’s any other beat of the measure, it plays the High Wood Block sound with a more moderate forte dynamic (velocity 100). Note that the note durations are set at 8n in the makenote object, to ensure that each MIDI note will be turned off after half a beat’s time.
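The same logic can be sketched in ordinary code. Here is a rough Python version using the third-party mido library to send the MIDI messages; the port selection is an assumption (any GM-compliant synthesizer will do), and a simple sleep loop will drift slightly over time, unlike the transport-synchronized metro in the patch:

import time
import mido

port = mido.open_output()            # default MIDI output port; route it to a GM synth
BPM, BEATS_PER_BAR = 120, 4
beat_interval = 60.0 / BPM           # 0.5 seconds per beat at 120 bpm

for beat in range(8):                # two measures of 4/4
    if beat % BEATS_PER_BAR == 0:    # first beat of the measure: Claves, fortissimo
        note, velocity = 75, 127
    else:                            # other beats: High Wood Block, forte
        note, velocity = 76, 100
    # channel=9 is MIDI channel 10 (mido numbers channels from 0)
    port.send(mido.Message('note_on', channel=9, note=note, velocity=velocity))
    time.sleep(beat_interval / 2)    # hold for half a beat, like the 8n duration in makenote
    port.send(mido.Message('note_off', channel=9, note=note))
    time.sleep(beat_interval / 2)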

Tempo-relative timing in Max

As noted in the essay on musical timing, computers can measure absolute time with great precision, to the nearest millisecond or microsecond, but for musical purposes it’s generally more suitable to use “tempo-relative” time, in which we refer to units of time as measures, beats, and ticks (fractions of a beat) relative to a given tempo stated in beats per minute (bpm). (For the purpose of this discussion, we’ll consider “beat” and “quarter note” to be synonymous.)

The default—and most common—unit of time in Max is the millisecond. Almost all timing objects (cpuclock, metro, clocker, timer, delay, pipe, etc.) refer to time in terms of milliseconds by default. However, in Max (as in most DAW software) there exists a syntax for referring to time in a variety of formats: absolute time in milliseconds, absolute time in clock format (hours:minutes:seconds:milliseconds), audio samples (dependent on the audio sampling rate in effect), hours:minutes:seconds:frames (dependent on an established film/video frame rate), and tempo-relative time based on the tempo attribute stored in the transport.

Tempo-relative time, as controlled by the transport in Max, can be expressed in bars.beats.units (bbu format, equivalent to measures.beats.ticks in DAW software), note values, or simply in ticks. The relationship of those units to absolute time depends on the value stored in the transport’s tempo attribute, expressed in bpm. A complete explanation of the time value syntax in Max is in the Max documentation. A complete listing of the objects that support time value syntax is also available in the documentation. (Just about all time-related objects do support it.) To translate automatically from one format to another, the translate object is useful. (The translate object works even when the transport is not running.)
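To illustrate the arithmetic such a translation involves (this is not Max’s actual implementation, just a sketch), here is a small Python example that converts note values and ticks to milliseconds, assuming the common resolution of 480 ticks per quarter note:

TICKS_PER_QUARTER = 480   # a common resolution in sequencing software

# ticks per note value, independent of tempo
NOTE_VALUES = {'1n': 1920, '2n': 960, '4n': 480, '8n': 240, '16n': 120}

def ticks_to_ms(ticks, bpm):
    """Absolute duration, in milliseconds, of a tick count at a given tempo."""
    return ticks * 60000.0 / (bpm * TICKS_PER_QUARTER)

print(ticks_to_ms(NOTE_VALUES['4n'], 120))   # 500.0 -- a quarter note at 120 bpm
print(ticks_to_ms(NOTE_VALUES['8n'], 90))    # 333.3... -- an eighth note at 90 bpm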

When the transport is on, its sense of time moves forward from the specified starting point, and it governs all timing objects that refer to it. If a timing object is using absolute millisecond time units, it will be oblivious to the transport. However, if you specify its timing in tempo-relative units, it will depend on (and be governed by) the transport. The transport can be turned on and off in Max, its current time can be changed, its tempo and time signature can be changed, and it can be queried for information about the current tempo, the time signature, and the current moment in (its own sense of) time.

The phasor~ object can also be synchronized to a transport. Since the phasor~ can be used to drive many other MSP objects, many audio processes (such as oscillator rates, looping, etc.) can be successfully governed by the transport for tempo-relative musical timing.

In addition to the Max documentation cited above, you can read more about tempo-relative timing in Max in the article “Tempo-relative timing” and you can try out the example Max patch it contains. To understand the Max transport object and its implications for rhythmic timing, you can study this example of “Tempo-relative timing with the transport object”, read the accompanying explanatory text, and also study the other examples to which links are provided in that text.

Timing in MIDI files

In a standard MIDI file, there’s information in the file header about “ticks per quarter note”, a.k.a. “parts per quarter” (or “PPQ”). For the purpose of this discussion, we’ll consider “beat” and “quarter note” to be synonymous, so you can think of a “tick” as a fraction of a beat. The PPQ is stated in the last word of information (the last two bytes) of the header chunk that appears at the beginning of the file. The PPQ could be a low number such as 24 or 96, which is often sufficient resolution for simple music, or it could be a larger number such as 480 for higher resolution, or even something like 500 or 1000 if one prefers to refer to time in milliseconds.
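As an illustration, here is a minimal Python sketch that reads that header information from a file (the file name is hypothetical):

import struct

with open('example.mid', 'rb') as f:                 # hypothetical file name
    chunk_type, length, fmt, ntrks, division = struct.unpack('>4sIHHH', f.read(14))

assert chunk_type == b'MThd' and length == 6         # header chunk with 6 data bytes
if division & 0x8000:                                # top bit set: SMPTE time division instead
    print('SMPTE time division, not PPQ')
else:
    print('PPQ =', division)                         # e.g., 96, 480, ...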

What the PPQ means in terms of absolute time depends on the designated tempo. By default, the time signature is 4/4 and the tempo is 120 beats per minute. That can be changed, however, by a “meta event” that specifies a different tempo. (You can read about the Set Tempo meta event message in the file format description document.) The tempo is expressed as a 24-bit number that designates microseconds per quarter-note. That’s kind of upside-down from the way we normally express tempo, but it has some advantages. So, for example, a tempo of 100 bpm would be 600000 microseconds per quarter note, so the MIDI meta event for expressing that would be FF 51 03 09 27 C0 (the last three bytes are the Hex for 600000). The meta event would be preceded by a delta time, just like any other MIDI message in the file, so a change of tempo can occur anywhere in the music.
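As a worked example of that arithmetic, here is a short Python sketch (the function name is mine) that builds the Set Tempo meta event bytes for a given bpm:

def set_tempo_event(bpm):
    """Return the Set Tempo meta event (FF 51 03 tt tt tt) for a tempo in bpm."""
    usec_per_quarter = round(60_000_000 / bpm)       # microseconds per quarter note
    return bytes([0xFF, 0x51, 0x03]) + usec_per_quarter.to_bytes(3, 'big')

print(set_tempo_event(100).hex(' '))                 # -> 'ff 51 03 09 27 c0'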

Delta times are always expressed as a variable-length quantity, the format of which is explained in the document. For example, if the PPQ is 480 (standard in most MIDI sequencing software), a delta time of a dotted quarter note (720 ticks) would be expressed by the two bytes 85 50 (hexadecimal).
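The encoding is easy to implement in code: seven bits per byte, most significant group first, with the high bit set on every byte except the last. Here is a minimal Python sketch (the function name is mine):

def encode_vlq(value):
    """Encode a non-negative integer as a MIDI variable-length quantity."""
    result = [value & 0x7F]                          # final byte: high bit clear
    value >>= 7
    while value:
        result.append((value & 0x7F) | 0x80)         # continuation bytes: high bit set
        value >>= 7
    return bytes(reversed(result))

print(encode_vlq(720).hex(' '))                      # -> '85 50', the dotted quarter above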

So, bearing all that in mind, there is a correspondence between delta times expressed in terms of ticks and note values as we think of them in human terms. The relationship depends on the PPQ specified in the header chunk. For example, if the PPQ is 96 (hex 60), then a note on middle C on MIDI channel 10 with a velocity of 127, lasting a dotted quarter note (1.5 beats, or 144 ticks), would be expressed as
00 99 3C 7F // delta time 0 ticks, then note-on: 153 60 127
81 10 99 3C 00 // delta time 144 ticks (encoded as the variable-length quantity 81 10), then note-off: 153 60 0
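Continuing the Python sketch above (reusing encode_vlq), we can generate and check those bytes:

note_on  = encode_vlq(0)   + bytes([0x99, 0x3C, 0x7F])   # delta 0, note-on ch. 10, middle C, velocity 127
note_off = encode_vlq(144) + bytes([0x99, 0x3C, 0x00])   # delta 144 ticks, velocity 0 (i.e., note-off)
print(note_on.hex(' '))    # -> '00 99 3c 7f'
print(note_off.hex(' '))   # -> '81 10 99 3c 00'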

It’s about time

Sound and music take place over time. Sonic phenomena change constantly over time, and therefore almost any consideration of them has to take time into account.

The word “rhythm” is used to refer to (sonic) events that serve to articulate, and thus make us aware of, how time passes. We become aware of intervals of time by measuring and comparing—either intuitively or with a time-measuring device such as a clock—the interval between events. We can detect patterns among those intervals, and we can recognize those patterns when they recur, even if with variations.

In everyday consideration of time, we discuss durations or intervals of time in terms of “absolute”, measurable clock time units such as hours, minutes, and seconds. When considering sound, we often need to consider even smaller units such as milliseconds (to accurately represent the rhythmic effect of events) or even microseconds (in discussions of audio sampling rate and the subsample accuracy needed for many digital audio signal processing operations).

Almost all programming languages provide a means of getting some numerical representation of the current time with millisecond or microsecond accuracy, such as the System.nanoTime() method in Java, the cpuclock object in Max, etc. By comparing one instant to another, you can measure time intervals or durations with great accuracy.
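For instance, in Python (a trivial sketch, with time.perf_counter playing the role of cpuclock):

import time

start = time.perf_counter()                   # one high-resolution clock reading
time.sleep(0.25)                              # stand-in for the events being timed
elapsed_ms = (time.perf_counter() - start) * 1000.0
print(f'elapsed: {elapsed_ms:.3f} ms')        # approximately 250 ms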

When considering music, we most commonly don’t use direct reference to clock time units. Instead we refer to a “beat” as the basic unit of time, and we establish a metronomic “tempo” that refers indirectly to clock time in terms of beats per minute (bpm). Thus, a tempo of 120 bpm means that 120 beats transpire in one minute, so the beat interval (that is, the time interval between beats) is 60 seconds/minute divided by 120 beats/minute, which is 0.5 seconds/beat. Humans don’t consciously do that mathematical calculation; we just use the designated tempo to establish a beat rate, and then we think of the music in terms of divisions or multiples of the beat.

In the music programming language Csound, time is expressed in terms of beats, and the default tempo is 60 bpm, so time is by default also expressed in seconds. A command to change the tempo changes the absolute timing of all subsequent events, while keeping the same rhythmic relationships, relative to the designated tempo. Referring to units of time in terms of tempo, beats, measures (groups of beats), and divisions (fractions of beats) can be called “tempo-relative” time, to distinguish it from “absolute” time. This represents two different ways of talking about the same time phenomena; each has its usefulness. In most music, it makes more sense to use tempo-relative time, since we’re quite used to conceptualizing and recognizing musical timing in tempo-relative terms, yet not so good at measuring time in absolute terms (without the aid of a timekeeping device).

Most audio/MIDI sequencing programs, such as Live, Reason, Logic, Pro Tools, Cubase, Garage Band, etc., are based on the idea of placing events on a timeline, and they allow the user to refer to time either in terms of absolute time or tempo-relative time. For music that has a beat, tempo-relative time is usually preferable and more common. The norm is to have a way of setting the metronomic tempo in bpm, a way of setting the time signature, and then referring to points in time in terms of measures, beats, and ticks (fractions of a beat) relative to a starting point such as “1.1.0”, meaning measure 1, beat 1, 0 ticks.

In most sequencing programs “ticks” means fractions of a quarter note, also sometimes called “parts per quarter”, regardless of the time signature. (That is, “ticks” always refers to fractions of a quarter note, even if we think of the “beat” as a half note as in 2/2 time, or an eighth note as in 5/8 time, or a dotted quarter note as in 6/8 time.) 480 ticks per quarter note is standard in most programs, the default time signature is 4/4, and the default tempo is 120 bpm. At that tempo, 480 ticks per quarter note gives timing resolution at nearly the millisecond level. In Max, the measures.beats.ticks terminology is called bars.beats.units, but the idea is the same.
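As a worked example of that arithmetic, here is a small Python sketch (the function name is mine) that converts a measures.beats.ticks position to absolute milliseconds, assuming the beat is a quarter note and counting from a 1.1.0 origin:

def bbu_to_ms(measures, beats, ticks, bpm=120, beats_per_bar=4, ppq=480):
    """Convert a bars.beats.units position (1.1.0 = the start) to milliseconds."""
    quarters = (measures - 1) * beats_per_bar + (beats - 1) + ticks / ppq
    return quarters * 60000.0 / bpm

print(bbu_to_ms(2, 1, 0))      # 2000.0 -- measure 2 begins 4 beats (2 seconds) in, at 120 bpm
print(bbu_to_ms(1, 3, 240))    # 1250.0 -- measure 1, beat 3, plus an eighth note (240 ticks)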

MIDI file format, MusicXML, and tablature


The Standard MIDI File format is described by the MIDI Manufacturers Association and is available online at
http://www.cs.cmu.edu/~music/cmsip/readings/Standard-MIDI-file-format-updated.pdf

This page also describes it pretty well:
http://www.somascape.org/midi/tech/mfile.html

MusicXML is a standard started by the MakeMusic company (publishers of the Finale notation software); it appears to be well supported and maintained, and is in use in their products and others’.

MusicXML Tablature Format
http://www.musicxml.com/tutorial/tablature/

Guitar Pro is perhaps the best known tablature software.

Guitar Pro 4.06 file format description
http://dguitar.sourceforge.net/GP4format.html

And here’s another tablature program I came across.
TablEdit commercial tablature editor
http://www.tabledit.com/

Music 147 assignment for Tuesday May 20, 2014

In preparation for a discussion of probabilistic decision making as used in the piece Entropy, read the articles on “Randomness”, “A simple probabilistic decision”, and “Probability Distribution”, and try out the example Max patch that accompanies each article.

In preparation for a discussion of the ‘granulation’ technique used in the piece Insta-pene-playtion, read section 2.1 of the article “Programming New Realtime DSP Possibilities with MSP”.

In preparation for the upcoming discussion of the computerized evaluation of ‘gesture’ in music, read the “Gesture Follower” presentation made by the research team on Realtime Musical Interactions at IRCAM.


Music 147 assignment for Thursday May 15, 2014

Complete your initial project proposal and deposit it in the EEE DropBox called “ProjectProposal1” by the end of the day on Wednesday May 14.

In preparation for the upcoming class session on musical timing, read the article “Tempo-relative timing” and try the example Max patch it contains. To understand the Max transport object and its implications for rhythmic timing, study this example of “Tempo-relative timing with the transport object”, read the accompanying explanatory text, and also study the other examples to which links are provided in that text.

Music 147 assignment for Tuesday May 13, 2014

Similar to the exercise for last Tuesday, now do the same analysis for what you intend to do in your own project.

Based on what you’ve learned so far in this course, how would you implement your own Music 147 final project? You may express your response in terms of a design drawing (flow chart, block diagram, etc.), a written description (feature specifications, pseudocode, etc.), and/or actual code examples.

Take the following into account when doing the assignment:
• Imagining a task/project of appropriate scope will be a big part of this assignment.
• Think of this as the genuine first step in your final project, and the more detailed you can make it, the farther along you’ll be toward completing it.
• Give careful thought to what kind of a project you want to do and will be excited to do, as well as what you would like to learn from doing the project and what scope of project will be feasible for you.
• What will you need to learn in order to accomplish your project that you don’t currently know how to do?
• What methods or specific resources do you intend to use to research those things?
• If you will be collaborating with someone else in the class, include a description (in as much detail as you’re able) of how the labor and responsibilities will be shared/divided.
• If possible, provide a timeline of how your work will proceed in the coming four weeks.