sigmund~ for pitch tracking

The sigmund~ object is a third-party object for pitch tracking designed by Miller Puckette and Ted Apel, available on their page of downloadable Max objects. You’ll need to have it installed for this example to work. Why is it called sigmund~? ‘Cause it’s your analyst, I assume! You might also want to have downloaded ducker~.maxpat from the previously posted example, so you can try it out in conjunction with sigmund~.


sigmunddemo.maxpat

To have sigmund~ attempt to discern discrete notes rather than give a continuous pitch evaluation, use the ‘notes’ argument. Pitches are reported as MIDI note numbers, with float values showing fractions of a semitone. In most cases it works better to round those values than to truncate them, for a more accurate representation of the intended pitch.
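Outside of Max, the round-versus-truncate point can be sketched in Python (the function name here is just for illustration, not a Max object):

```python
def nearest_midi_note(pitch):
    """Round a floating-point MIDI pitch to the nearest semitone.

    In 'notes' mode, sigmund~ reports pitch as a float MIDI number,
    e.g. 59.95 for a note just a hair flat of C4 (MIDI 60).
    Truncating would misreport that as 59 (B3); rounding gives the
    intended note 60.
    """
    return int(round(pitch))

print(int(59.95))               # truncation: 59 (wrong note)
print(nearest_midi_note(59.95)) # rounding:   60 (intended note)
```

The same logic in the patch would simply be passing the float through a round operation before converting to int, rather than letting an int conversion truncate it.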

ducker~ to suppress soft sounds

[This example has been determined faulty and is thus deprecated by its author. It’s left here merely for reference, but will not produce optimal results in actual use. A better ducker~ is presented as an example for Music 215B, winter 2016.]

A “ducker” is a system that turns a signal down to 0 when it’s below a given threshold. It’s useful for suppressing unwanted low-level audio, such as in a cell phone transmission when the user is not talking. Or, more to the point for musical purposes, as in a microphone signal when the musician is not playing.


ducker~.maxpat

In this patch the user specifies an amplitude threshold, attack time, and release time, either by sending those values in the appropriate inlets or by typing them in as arguments to the object in the main patch.

When the amplitude of the signal coming in the left inlet goes above the threshold, the >=~ object sends out 1, so the rampsmooth~ object starts heading toward 1 in the number of samples specified (i.e., starts fading in the signal). So as not to lose too much of the wanted signal, that fade-in should be quick. A good attack time might be in the range of 5-40 ms, depending on the source.

When the amplitude of the signal coming in the left inlet goes below the threshold, the >=~ object sends out 0, so the rampsmooth~ object starts heading toward 0 in the number of samples specified (i.e., starts fading out the signal). For many instruments, the release time might be slower than the attack time, so you might want that fade-out time to be longer than the fade-in time. A good release time might be in the range of 200 ms or more, depending on the source.

In the main patch you can use the mstosamps~ object to calculate the correct number of samples for the fade times.
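The gate-plus-ramp logic can be translated into a textual sketch. This Python version (the function names are mine, not Max objects) mimics per sample what the >=~ comparison, the rampsmooth~ linear ramp, and the final multiplication do, assuming a 44.1 kHz sampling rate for the ms-to-samples conversion:

```python
def ms_to_samps(ms, sr=44100):
    # what mstosamps~ computes: milliseconds -> number of samples
    return int(sr * ms / 1000.0)

def duck(signal, amplitude, threshold, attack_samps, release_samps):
    """Per-sample sketch of the ducker logic.

    'amplitude' is an amplitude-follower signal tracking 'signal'.
    The comparator outputs a gate of 1 or 0 (like >=~); a linear
    ramp smooths that gate over the attack or release time (like
    rampsmooth~); the smoothed gate scales the input signal.
    """
    gain = 0.0
    out = []
    for x, amp in zip(signal, amplitude):
        target = 1.0 if amp >= threshold else 0.0
        if target > gain:    # gate opened: fade in over the attack time
            gain = min(target, gain + 1.0 / attack_samps)
        elif target < gain:  # gate closed: fade out over the release time
            gain = max(target, gain - 1.0 / release_samps)
        out.append(x * gain)
    return out
```

With an attack of 2 samples and a loud input, the gain climbs 0.5, 1.0 and then holds; with the amplitude below threshold the output stays at 0, which is the "ducking" behavior described above.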

MIDI-DMX conversion

DMX data is encoded with “channel” information similarly to MIDI, so that each receiving device can pay attention only to particular information. Each channel can carry a value from 0-255. It’s therefore easy to convert the standard MIDI range 0-127 to the DMX range 0-255 just by multiplying values by 2 (or by shifting the number one bit to the left).
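As a quick sketch in Python (in Max this would just be a multiplication or bit-shift in the patch): note that doubling actually maps 127 to 254 rather than 255, so a patch that needs the absolute top of the DMX range might special-case the maximum value.

```python
def midi_to_dmx(value):
    """Scale a 7-bit MIDI value (0-127) toward the 8-bit DMX range (0-255).

    A one-bit left shift is the same as multiplying by 2.
    Maximum MIDI value 127 becomes 254, not 255.
    """
    return value << 1  # equivalent to value * 2

print(midi_to_dmx(0))    # 0
print(midi_to_dmx(64))   # 128
print(midi_to_dmx(127))  # 254
```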

MIDI-DMX
MIDI-DMX.maxpat

Music 215 assignment for Wednesday April 9, 2014

Read (or re-read) the article “How Digital Audio Works” in the MSP Tutorials. Understanding the basic premises of how sound is digitized and how numerical audio data is managed is a first step toward understanding audio synthesis and effects processing.

Make sure you understand the meaning of the following terms:
simple harmonic motion, amplitude, frequency, fundamental mode of vibration, harmonics (overtones, partials), harmonic series, spectrum, amplitude envelope, loudness/amplitude, pitch/frequency, decibel, analog-to-digital converter, digital-to-analog converter, digital sampling, Nyquist theorem/rate/frequency, sampling rate, aliasing/foldover, quantization, quantization precision, signal-to-quantization noise ratio, clipping.

If you don’t fully understand the explanation of those terms in the article, you’ll need to do some research to learn more about the things you don’t understand (getcher Google and yer Wikipedia on). Come to class with specific questions regarding any topics, italicized terms, or concepts discussed in the article that are unclear to you, and/or you can post questions or comments about the article on the class MessageBoard. Check the MessageBoard periodically to see if there are any questions from others there that you can answer.

Write a brief summary of your research topic on the MessageBoard and start to compile a bibliography of references (which might also include online sources, tutorial videos, works of music or media art, etc.) that you’ll consult to teach yourself about your research topic. (If they’re references you think others might benefit from, post those on the MessageBoard, too!)

Similarly, write a brief summary of what aspects of Max you want to learn about, including specific objects you think you need to study/use that will help you implement a project relevant to your research goal. To find out what those objects even are, you’ll need to dig around a bit in the Max documentation. The search feature of the documentation doesn’t work all that great, but it’s worth a try. Following the “See Also” component of all the reference pages and help files is also a good way to take a guided tour of related objects. Check out the Max Object Thesaurus, a keyword-indexed list of objects, and maxobjects.com, a searchable compilation of Max objects, both standard and third-party.

And simply working through the MSP Tutorials and Jitter Tutorials is a good way to learn about those aspects of Max; the tutorials also contain some useful general information about audio and video.

Cross-domain mapping

Composer, author, and artificial intelligence experimenter David Cope explained to me that he thinks a defining aspect of creativity is the ability to make meaningful connections between disparate domains of knowledge.

That made me think of a book chapter on that topic that I think can be interesting and inspirational for composers. The book is Conceptualizing Music by Lawrence M. Zbikowski, and the chapter in question is titled “Cross-Domain Mapping.”

And whaddaya know, the whole chapter is available online.

Music 215 assignment for Wednesday April 2, 2014

For the first class session, please plan to present your most recent Max work, which in the case of the first-year students will probably be your work from last quarter (indeed might be a repeat of your final presentation from last quarter?), and in the case of the second-year students will likely be current thesis work or some prior Max work. Don’t worry if you think others already know all about your work; I don’t, and it doesn’t hurt for everyone else to hear what you’ve been up to. We’ll figure about 15 minutes per person. We all speak Max, so don’t hesitate to go into excruciatingly nerdy detail.

I’d also like to hear what you hope to learn and do with technology this quarter, so please be prepared to discuss that.

Introduction

This website will host most of the information and communications relevant to:

Seminar in Music Technology
Music 215 — Spring 2014
University of California, Irvine

Professor
Christopher Dobrian

This Blog portion of the site will contain notes in preparation for the upcoming class, notes and clarifications on topics covered in the most recent class session, assignments for the upcoming class, examples from the previous class session, and other dubiously relevant thoughts as they come to me.