Assignment for March 12, 2015

Prepare to make either a final presentation or a work-in-progress presentation of your final programming project.

Your presentation should be a combination of “product demo” and technical explanation. Describe what your program is, what it’s intended to do, and how it does it. Give a demonstration of the program in action (and/or allow others to try it). Describe the design process, the problems you encountered, and how you solved (or didn’t solve) them. Walk through a technical explanation of the most important aspect(s) of your program, explaining how it works and describing any techniques you discovered or invented along the way.

Class will meet in the Realtime Experimental Audio Laboratory (REALab), Room 216 of the Music and Media Building. Bring everything necessary to run your program, including the software itself and any associated media files or devices. You may come early (up to 20 minutes before the class session) to pre-install your software on the lab computer if you’d like.

Assignment for March 10, 2015

Many interesting audio effects are achieved by combining a sound with a delayed (and possibly altered) version of itself. To delay a sound, one needs to store it for a certain amount of time until (a delayed copy of) it is needed. When we’re dealing with realtime audio processing, that storage has to be constantly ongoing, yet we usually also want to dispose of the delayed audio data once it’s no longer needed. Realtime delay of audio is therefore most often achieved by storing the sound in what’s commonly called a ring buffer or a circular buffer.
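Here’s a minimal sketch of that idea in plain JavaScript (all the names are made up for illustration; in MSP, objects such as tapin~ and tapout~ do this for you at signal rate). The buffer holds only the most recent samples; each new sample silently disposes of the oldest one.

```javascript
// A minimal circular-buffer delay line. All names here are hypothetical.
function RingBuffer(size) {
    this.data = [];
    for (var i = 0; i < size; i++) this.data[i] = 0; // start with silence
    this.writeIndex = 0; // points at the oldest sample, the next one to overwrite
}

// Store the current input sample, overwriting the oldest one.
RingBuffer.prototype.write = function (sample) {
    this.data[this.writeIndex] = sample;
    this.writeIndex = (this.writeIndex + 1) % this.data.length;
};

// Retrieve the sample stored 'delay' samples ago (1 <= delay <= size).
RingBuffer.prototype.read = function (delay) {
    var i = (this.writeIndex - delay + this.data.length) % this.data.length;
    return this.data[i];
};

// One sample of a simple delay effect: the input plus a copy of itself
// from 'delaySamples' ago.
function processSample(buf, input, delaySamples) {
    var delayed = buf.read(delaySamples); // read the old sample before overwriting
    buf.write(input);
    return input + delayed;
}
```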

In preparation for the upcoming discussion of delay-based effects, study these examples from a previous class.
Simple delay of audio signal
Delay with tempo-relative timing
Simple flanging
Delay with feedback

Study the MSP Tutorials that deal with delayed sound. [These links are to the web version of the documentation, but you’ll probably prefer to use the Max Documentation within Max so that you can try out the tutorial patches while you read about them.]
Delay Lines
Delay Lines with Feedback
Flanging
Chorus
Comb Filter

If you just can’t get enough of examples of delay, check out these other examples from a past class.
Change of delay time may cause clicks
Continuous change of delay time causes a pitch shift
Ducking when changing delay time
Abstraction for crossfading between delay times
Demonstration of crossfading delay

You can also read about digital filtering. Filtering is a special case of delay: it uses extremely short delay times to create interference between a sound and a slightly delayed version of itself, which causes certain frequencies in the sound to be attenuated (lessened in strength) or resonated (increased in strength), changing the sound’s timbre. (A small code sketch of this idea appears after the two links below.) [These are links to the web version of two MSP tutorials; you may prefer to read them in the Max Documentation within the Max application.]
Simple filters
Variable type filters
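As a concrete illustration (a made-up sketch, not code from the tutorials), the simplest feedforward comb filter just adds an attenuated, slightly delayed copy of the signal to itself:

```javascript
// A feedforward comb filter: y[n] = x[n] + g * x[n - d].
// With a delay of d samples at sampling rate sr, frequencies that fit a
// whole number of periods into the delay time (multiples of sr/d) are
// resonated, and the frequencies halfway between them are attenuated.
function combFilter(x, d, g) {
    var y = [];
    for (var n = 0; n < x.length; n++) {
        var delayed = (n >= d) ? x[n - d] : 0; // silence before the delay line fills
        y[n] = x[n] + g * delayed;
    }
    return y;
}
```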

Here’s a very thorough tutorial on filters in MSP written by Peter Elsea.

Here are some filter examples from a past class.
Bandpass filter swept with an LFO
A variable-mode filter: biquad~
Smooth filter changes

Assignment for March 5, 2015

To begin familiarizing yourself with the way that Jitter objects use OpenGL for 3D animation, read as many as you can of the Jitter Tutorials numbered 30-34.

Study the following examples of Jitter objects that enable 3D animation with OpenGL.

Create a sphere in OpenGL
Display a video on a shape in OpenGL
Apply a texture to a shape in OpenGL
Display a video on a plane in OpenGL
Alphablend a videoplane in OpenGL

You can also take a look at the Max Documentation reference manual listing of the attributes and messages that are common to all the OpenGL objects in Jitter.
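If you’re curious how the same objects can be created programmatically, here is a hedged sketch using the JitterObject constructor available in Max’s js object (see the JavaScript in Jitter documentation for the actual details); the context name “myworld” is arbitrary, and the render-loop messages shown are a common pattern rather than the only way to do it:

```javascript
// A sketch of OpenGL drawing driven from the js object. The context name
// "myworld" just ties the window, the renderer, and the shape together.
var mywindow = new JitterObject("jit.window", "myworld");      // somewhere to draw
var myrenderer = new JitterObject("jit.gl.render", "myworld"); // the OpenGL renderer
var myshape = new JitterObject("jit.gl.gridshape", "myworld"); // a shape in that context

myshape.shape = "sphere";     // gridshape can also draw a cube, torus, plane, etc.
myshape.lighting_enable = 1;  // one of the attributes common to all jit.gl objects

// Call once per frame (e.g., from a qmetro in the patch) to redraw the scene.
function bang() {
    myrenderer.erase();        // clear the previous frame
    myrenderer.drawclients();  // draw every jit.gl object in the "myworld" context
    myrenderer.swap();         // display the newly rendered frame
}
```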

Assignment for March 3, 2015

In class I’ll introduce the basics of using the JavaScript programming language within Max.

In preparation, read this introduction to JavaScript in Max, and then study this set of three small JavaScript programs. Download and save the six files in that directory. The files are meant to be studied in progressive order: 1) bang2x.maxpat, 2) number1x.maxpat, 3) numbearray.maxpat. Within each patch, double-click on the js object to open the script it contains. Study each script until you understand how it works. (Ask questions about anything you don’t understand!)
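If you’ve never looked inside a js object before, here’s a trivial script in the same spirit as bang2x (a made-up example, not the actual file), showing the basic conventions: you declare inlets and outlets, and Max calls a function named after whatever message arrives.

```javascript
// Respond to a bang in the inlet by sending two bangs out the outlet.
inlets = 1;   // the js object will have one inlet...
outlets = 1;  // ...and one outlet

// Max calls the function whose name matches the incoming message.
function bang() {
    outlet(0, "bang"); // send a bang out outlet 0
    outlet(0, "bang"); // ...and another one
}
```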

There’s quite a bit of JavaScript documentation in the Max application’s Max Documentation, which you can also read in the online version. In that documentation, read the Introduction and Basic Techniques.

You can look up anything about features of the JavaScript language in the official JavaScript reference manual.

Assignment for February 26, 2015

In preparation for a discussion of intensity-based panning,
a) read the Wikipedia entry on panning,
b) take a look at these three consecutive examples:
Linear amplitude panning
Constant power panning using square root of intensity
Constant power panning using table lookup
and
c) read MSP Panning Tutorial 1: Simple panning.
(The link is to the web version of the documentation; you can also read it in the Max Documentation within the Max application, which will allow you to try out the tutorial patch as well. If the last part, about “speaker-to-speaker crossfade”, makes your brain explode, you can skip it.)

Take a look at this abstraction for constant-intensity panning, which you might find useful in your own work.
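To summarize the arithmetic behind those examples, here is a hedged sketch (the function names are made up) contrasting linear panning with constant-power panning:

```javascript
// Gains for a pan position 'pos' from 0 (hard left) to 1 (hard right).

// Linear panning: the two amplitudes sum to 1, but the total *intensity*
// (proportional to amplitude squared) dips in the middle, so a sound
// panned to the center seems quieter.
function linearPan(pos) {
    return { left: 1 - pos, right: pos };
}

// Constant-power panning, taking the square root of the linear gains
// (as in the "square root of intensity" example): now the squares of the
// amplitudes always sum to 1, so the perceived loudness stays constant.
function constantPowerPan(pos) {
    return { left: Math.sqrt(1 - pos), right: Math.sqrt(pos) };
}
// At center (pos = 0.5) each channel gets about 0.707 (-3 dB)
// rather than 0.5 (-6 dB), which compensates for the loudness dip.
```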

Assignment for February 24, 2015

In the Max 7 Documentation, read MSP Sampling Tutorial 1: Record and Play Samples and MSP Sampling Tutorial 3: Sample Playback with Loops. (Note that these links are all to the web documentation. You can read the same material in the Max Documentation that’s accessible from the Help menu of the Max application, with the added benefit of being able to open and try the example patches.)

The buffer~ object discussed in those two tutorials is key to many types of sample manipulation in MSP. A buffer~ creates a place in the computer’s random access memory (RAM) to store audio. Audio in RAM can be accessed more quickly than sound read from the hard disk, and can be accessed at any “random” location within the audio buffer. Whereas with sfplay~ you open and read a file from the hard disk (indeed, sfplay~ preloads a few milliseconds of the sound into RAM as a buffer in order to stay ahead of where it’s reading), with buffer~ you can preload any amount of sound you’d like into RAM (up to the limit of your computer’s available memory), and thus your program can access any place in the sound instantaneously.

Every buffer~ object must have a unique name typed in as an argument, and other MSP objects can then access the sound that’s in that place in memory simply by referring to the buffer name.
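For instance, here is a hedged sketch of how a script in the js object can access a named buffer~ by referring to that name, assuming the Buffer API documented for Max 7’s js object; the buffer name “mysound” and the message name “sampleat” are made up for this example.

```javascript
// Access the audio stored in [buffer~ mysound] by name.
var buf = new Buffer("mysound");

// Report some facts about the stored audio.
function bang() {
    post("channels:", buf.channelcount(), "\n");
    post("length in samples:", buf.framecount(), "\n");
}

// Read one sample from any "random" point in the sound: for example,
// the message "sampleat 44100" reports the sample one second in (at 44.1 kHz).
function sampleat(frame) {
    var val = buf.peek(1, frame, 1); // channel 1, one sample starting at 'frame'
    post("sample at frame", frame, "is", val, "\n");
}
```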

Study examples 15-21 from the 2012 class, and try out the patches. Post any questions you have about any of these tutorials and examples on the Q&A site.

You should now be well underway with your programming project. If you haven’t yet written out the detailed version of your project plan, as described in the Assignment for February 12, 2015, you should do so immediately and begin programming as soon as possible. Post thoughts, ideas, discoveries, etc. on your blog as you go (think of it as the ongoing notepad and journal for your project), and post questions to the Q&A site.

Assignment for February 19, 2015

Read about the relationship of perception to stimulus in the Wikipedia article on Fechner’s law.

Learn about the meaning of the decibel and how decibel terminology is useful in digital audio. It was discussed briefly in the article on “Digital Audio” that you read for the first assignment (search for the word “decibel”). It’s also discussed in a fairly non-technical way near the end of my blog post about “Fading by means of linear interpolation” (search for the word “decibel”). For a more technical understanding, you might want to start with this explanation. Then you could proceed to this similar but perhaps slightly more technical explanation. If you want the TMI version, including historical information, etc., you can read the Wikipedia article.
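The arithmetic itself is compact: decibels are 20 times the base-10 logarithm of an amplitude ratio. Max provides atodb and dbtoa objects for exactly this conversion; here, as a sketch, is what they compute:

```javascript
// Convert between linear amplitude (relative to full scale, 1.0) and decibels.
function atodb(amp) {
    return 20 * Math.log(amp) / Math.LN10; // 20 * log10(amp)
}
function dbtoa(db) {
    return Math.pow(10, db / 20);
}
// Examples: atodb(0.5) is about -6.02 dB (half amplitude),
// and dbtoa(-12) is about 0.25 (a quarter of full amplitude).
```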

If you’re uncertain about the mathematics of exponents and logarithms, you can research them on the web. Perhaps the most painless introduction is to watch the Khan Academy videos about exponential and logarithmic functions and about logarithm basics (“understanding logarithms as inverse exponentials”).

Think about how the logarithmic nature of our perceptions is relevant to the relationship of musical pitch to frequency, and the relationship of tempo to time interval. Here’s a Max example that lets you experience the difference between linear and exponential control of amplitude.
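The pitch-to-frequency relationship is a good example of that logarithmic nature: each equal-tempered semitone multiplies frequency by the twelfth root of 2, so equal pitch intervals are equal frequency ratios. Max’s mtof and ftom objects implement this conversion; here is the underlying arithmetic, using MIDI pitch numbers (69 = the A above middle C = 440 Hz):

```javascript
// Convert between MIDI pitch number and frequency in Hz.
function mtof(midiPitch) {
    return 440 * Math.pow(2, (midiPitch - 69) / 12);
}
function ftom(freq) {
    return 69 + 12 * (Math.log(freq / 440) / Math.LN2); // 12 * log2(freq/440)
}
// mtof(60) is about 261.63 Hz (middle C); adding 12 to any pitch
// always doubles the frequency, regardless of the starting pitch.
```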

Assignment for February 17, 2015

Read “Cross-Domain Mapping”, Chapter 2 of Conceptualizing Music by Lawrence M. Zbikowski. [You will need to be on the UCI LAN or accessing it via VPN in order to read this book online.]

Study the following two examples from a past class, regarding linear mapping and linear interpolation. The first one is pretty math-y, but is worth reading carefully so that you understand linear mapping conceptually. The second one gives a few practical (if not super-useful musically) examples of applying linear mapping to convert one range of numbers into another. (A code version of the mapping equation follows the list.)
1. Linear mapping equation
2. Linear mapping and linear interpolation
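As a summary of the first example, here is the linear mapping equation in code (the function name linmap is made up for illustration; Max’s zmap and scale objects do similar work):

```javascript
// Map x from the input range [inLow, inHigh] to the output range [outLow, outHigh].
function linmap(x, inLow, inHigh, outLow, outHigh) {
    var normalized = (x - inLow) / (inHigh - inLow); // 0-to-1 position within the input range
    return outLow + normalized * (outHigh - outLow);
}
// Example: linmap(64, 0, 127, 0, 1) maps a MIDI value to about 0.504.
// Linear interpolation is the same formula, with x as the fraction of the
// way (often, of the elapsed time) between two known points.
```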

In class Anna and Molly will bring instruments for us to experiment with the pitch-detection object sigmund~.

Read the Wikipedia article on pitch tracking. If you want to read more about how pitch detection is accomplished, there’s no shortage of articles online. The page on Pitch Detection Algorithms by Gareth Middleton gives a pretty clear explanation of two algorithms, and you can read Miller Puckette’s description of his fiddle~ object, which was the predecessor of sigmund~, in his article “Real-time audio analysis tools for Pd and MSP” (a .ps PostScript file).

Assignment for February 12, 2015

Based on the discussion in class, begin solidifying your project proposal, filling in details and defining clearly what you want your finished program to do. Once you know what you want it to do, you will need to determine what the program requires in terms of a) pre-made “footage” of audio, video, graphics, etc., and b) control input from the user (who might be you and/or might be an unknown user). That determination about functionality and input will help you design the user interface: what the user will see onscreen, and how s/he will provide the needed information to the program. Once you have established those goals and needs, your next step should be to think about the global structure of the program (sketching the flow of information on paper can be a great way to brainstorm structure), break the task down into steps or subtasks, make specific plans for accomplishing each subtask, and set intermediate goals and deadlines for completing them.

Catch up on any past assignments you may not have completed.

Assignment for February 10, 2015

1. Write a program that has both an audio/music component and a video component, and that exhibits some degree of synchronization between control of the audio and control of the video.

The music portion of the program may use either MIDI or MSP audio, and may be either performed in real time (by mouse, keyboard, MIDI controller, etc.) or automated. The video may be controlled by some characteristic of the audio, or by user actions, or by automation. The idea is to make evident a rhythmically coherent relationship between the two elements.

2. Make a dedicated post, page, or category in your blog site for notes about your final project. Write some initial plans about what topic you hope to address in some depth in your final project. Graduate students should propose a solo project; the goal may be either an aesthetic project (i.e., a piece) or an experimental project (something that will explore and elucidate a research topic, but might not necessarily have a finished artistic product as its result). Undergraduate students may propose either of the above (an artistic or an experimental project) or an application that uses or generates audio-visual media; undergrad projects will be carried out in small groups of people with similar interests.