Grant Muller

Processing HarmonicTable: Part 1

Earlier this year, while reading Harmonic Experience by W. A. Mathieu, I was introduced to the concept of using lattices to represent tones, chords, and keys. These lattices can express the basics of music composition in a visual way that makes more sense than standard scales on staffs. Here is an example:

ChordLattice

The lattice is effectively several staffs of music stacked on top of each other, so that notes can be related both horizontally and vertically. Starting from any note in the lattice, moving horizontally (diagonally right or left) jumps by a 5th (for example C to G, or G to D), while moving vertically jumps by a 3rd (C to E, or G to B). Since nearly all music is constructed from 5ths and 3rds, this system makes it easy to recognize the visual patterns for chords, cadences, scales and so on, and those patterns always look the same no matter where you start on the lattice. In the example above, you can see that major chords are always triangles pointing up, and minor chords are the opposite, pointing down. Until recently this system didn’t translate directly to an instrument. Any chord on a keyboard has similar but different fingerings and patterns depending on the starting note, so you still have to memorize a ton of different patterns for each scale, chord, or modulation.
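
To make that geometry concrete, here is a minimal sketch (the offsets are standard interval arithmetic, not code from Mathieu’s book): a step along the 5ths axis adds 7 semitones, a step along the 3rds axis adds 4, and the triangle of a major triad is just one step of each.

String[] names = {"C","Db","D","Eb","E","F","Gb","G","Ab","A","Bb","B"};

// A lattice step along the 5ths axis adds 7 semitones; a step along
// the 3rds axis adds 4 (a major third). A major triad is one of each.
void printMajorTriad(int root){
    int third = (root + 4) % 12;
    int fifth = (root + 7) % 12;
    println(names[root] + " major: " + names[root] + " "
            + names[third] + " " + names[fifth]);
}

// printMajorTriad(0) prints "C major: C E G"
// printMajorTriad(7) prints "G major: G B D"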

A few weeks into reading Harmonic Experience, providence saw fit to lead me to the C-Thru AXiS controller. The AXiS is a Harmonic Table based MIDI controller laid out much like the lattice in Harmonic Experience, separating notes not by semitones like a keyboard, but by 5ths and 3rds (and of course the many other inter-relationships that follow from these). Here is the layout of the AXiS:

natural_keyboard

If you look closely, the relationships between the keys become clear. They are reversed from Mathieu’s lattice: 3rds run horizontally (along the diagonals), and 5ths are always straight up. To play a major triad on the AXiS, just play that triangle pattern starting at any key. C Major will have the same pattern as Bb Major, and so on. There are of course drawbacks to this system (playing inversions, for instance), but for the most part it accomplishes the goal of simplifying the muscle-memory aspects of playing so the user can concentrate on composition and performance. You learn the pattern for a scale, chord, or mode only once; then you can modulate it anywhere without the need to retrain your fingers.
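
The same arithmetic explains why every major triad is the same triangle on the AXiS. Here’s a sketch under my own indexing assumption (not C-Thru’s spec): coordinates (u, v), where u counts steps straight up (a 5th each) and v counts steps up the right-hand diagonal (a major 3rd each).

// My assumed indexing of a harmonic-table grid, not C-Thru's spec:
// straight up = +7 semitones (5th); up the right diagonal = +4 (major 3rd)
int noteAt(int u, int v, int base){
    return base + 7*u + 4*v;
}

// Starting from C4 (MIDI 60):
// noteAt(0, 0, 60) -> 60 (C), noteAt(0, 1, 60) -> 64 (E), noteAt(1, 0, 60) -> 67 (G)
// Start from Bb3 (58) instead and the identical moves yield the Bb major triad.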

The problem with the AXiS is the price tag. It’s around $1700 from what I can tell, which is a pretty steep entrance fee for a device I may not be able to get used to. They are in the process of creating a cheaper, smaller version, but I want to try it now. The solution was to build one in Processing. I’ve only just started, but by combining the MIDI reference classes I’ve been working on with (eventually) a touch screen, I think I should be able to pull something off that works well enough to test this thing out. I started by looking for ways to draw regular hexagons, and came across this site. With some basic trigonometry and a lot of cut-and-try with the vertex functions, I came up with this function:
int length = 30;                      // length of one hexagon side
float a = length/2.0f;                // horizontal inset of the top and bottom edges
float b = sin(radians(60))*length;    // half the hexagon's height
float c = length;                     // alias for the side length

public void drawHex(){
    beginShape();
    vertex(0, b);         // left point
    vertex(a, 0);         // top-left corner
    vertex(a+c, 0);       // top-right corner
    vertex(2*c, b);       // right point
    vertex(a+c, 2*b);     // bottom-right corner
    vertex(a, 2*b);       // bottom-left corner
    vertex(0, b);         // back to the left point to close the outline
    endShape();
}
This will construct a regular hexagon based on the length of one side. After that I used a series of translation matrices to draw hexagons all over the screen:

public void draw(){
    rowNumber = 0;
    setNoteNumber(rowNumber);
    for (int j = 2; j < 20; j++){
        // even rows start flush left
        pushMatrix();
        translate(0 + space, (height - (j*(b + space/2) + 1)));
        drawHex(getNextNote());   // drawHex now takes the note used to label the key
        for (int i = 1; i < 12; i++){
            translate(space + (2*a) + (2*c), 0);
            drawHex(getNextNote());
        }
        popMatrix();
        j++;
        rowNumber++;
        setNoteNumber(rowNumber);
        // odd rows are shifted right by half a hexagon so the rows interlock
        pushMatrix();
        translate(a + c + 1.5f*space, (height - (j*(b + space/2))));
        drawHex(getNextNote());
        for (int i = 1; i < 12; i++){
            translate(space + (2*a) + (2*c), 0);
            drawHex(getNextNote());
        }
        popMatrix();
        rowNumber++;
        setNoteNumber(rowNumber);
    }
}

I predefined the number of horizontal hexagons at 12 per row, which means that straight horizontally across a row I get a perfect chromatic scale. The number of vertical keys I simply copied from the AXiS. After predefining the number of columns and rows, I could construct the size of the screen from the length of one side of the hexagon. I also included a variable that lets me declare a fixed amount of space between the keys (in case my fingers are too big for the keys’ surfaces). One variable, named length, can be changed to create a bigger surface with larger keys. After implementing some simple code to write the note name into each key as well, I was finished with the layout. Here is a snapshot of it directly out of Processing:

Virtual Harmonic Table

It’s of course much bigger. With the key surface complete, all I have to do now is map the key locations to the MIDI notes they’re associated with, and use some on-press functions to trigger them. After that I just need access to a touch screen. Anyone want to donate one? Here is the preliminary code to draw the hexagons, as well as the code required to map MIDI notes:

HarmonicTable
NoteReference

In Part 2 I’ll have MIDI functionality implemented and some testing done. In Part 3 I’ll have it implemented with a touchscreen, most likely a laptop; at that point I’ll make the rest of the code available.
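
As an aside for anyone sizing their own sketch before Part 2 arrives: the window dimensions fall straight out of the hexagon geometry above. The snippet below is just my reading of the translate() calls, with space assumed to be a small constant rather than whatever value the downloadable code uses:

int cols = 12;    // one chromatic scale per row
int rows = 20;    // row count copied from the AXiS
int space = 4;    // assumed gap between keys

void settings(){
    // each column advances by space + 2a + 2c, and the offset rows are
    // shifted right by a + c + 1.5*space; each row advances by b + space/2
    int w = ceil(cols*(space + 2*a + 2*c) + a + c + 1.5f*space);
    int h = ceil(rows*(b + space/2) + 2*b);
    size(w, h);
}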

Java Midi Reference Class

Lately I have been doing some work with Processing to create visuals from music. One of the concepts I’m working with is live visuals based on video feeds from cameras tracking the show, or the band, or whatever. That alone would be enormously boring, so to heighten the experience I thought of using the audio output of the show to control features of the video feeds: playback speed, position, color and hue, and so on. This turned out to be way too processor-intensive, and given the properties of sound, would be impossible to synchronize with a live show.

The solution was to use MIDI to trigger these events, rather than having to perform audio analysis first. Using rwmidi and Processing, I started an experiment to see what I could do to video using MIDI messages as triggers: NoteOns as ellipses with alpha masks, or pitch-bend controllers tracking hue logarithmically. So far these experiments have been successful, but I found myself referring to a MIDI reference constantly.
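
As a taste of the approach, a stripped-down version of one of those experiments might look like the sketch below. I’m recalling rwmidi’s noteOnReceived callback and the Note accessors getPitch()/getVelocity() from memory, so treat the signatures as assumptions and check the library’s docs:

import rwmidi.*;

MidiInput input;

void setup(){
    size(640, 480);
    background(0);
    noStroke();
    // open the first available MIDI input (assumes one is connected)
    input = RWMidi.getInputDevices()[0].createInput(this);
}

void draw(){
    // fade previous frames so each note leaves a decaying trail
    fill(0, 10);
    rect(0, 0, width, height);
}

// called by rwmidi when a NoteOn message arrives
void noteOnReceived(Note note){
    float x = map(note.getPitch(), 0, 127, 0, width);
    float d = map(note.getVelocity(), 0, 127, 10, 80);
    fill(255, 128);   // translucent white, the "alpha mask" feel
    ellipse(x, height/2, d, d);
}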

The problem is most musicians don’t think in MIDI numbers. Even though the messages I’m using carry no musical information, they’re derived from the world of music. So note names like C4 (the note C in the 4th octave) map to a MIDI number (60). When I’m creating a drum beat in Ableton, for instance, I don’t look for note number 60, I look for C4. This means that every time I wanted to map this information to an event in Processing, I had to look up the note number first. After a while you memorize them, but that’s lame, and if I leave the project for a while and come back, I’ll have to go back to looking them up again.
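
For the record, the conversion itself is simple arithmetic under the convention this post uses (C4 = 60, with the MIDI range starting at C-1 = 0; some DAWs label 60 as C3 instead):

// pitchClass: C=0, C#/Db=1, ... B=11; octaves run from -1 to 9
int midiNumber(int pitchClass, int octave){
    return (octave + 1) * 12 + pitchClass;
}

// midiNumber(0, 4) -> 60 (C4); midiNumber(0, -1) -> 0; midiNumber(7, 9) -> 127 (G9)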

I couldn’t find any solution for this in code, so I created a simple Java class to handle MIDI Reference Data:

MidiReference

Which you can download from the link above. Right now it only handles notes and their associated MIDI numbers. The crux of the code consists of two loops:

//Loop through all of the note numbers (0-127)
for (int noteNumber = 0; noteNumber < 128;){
    //loop through the octave ranges (-1 through 9)
    for (Integer range = -1; range < 10; range++){
        //loop through all of the note names
        for (String nextNote : noteNames){
            //if the note name contains a "flat" then it shares a note number
            //with the preceding sharp; decrement the running count and insert
            //into the map so numbers can be looked up from sharps or flats
            if (nextNote.contains("b")) noteNumber--;
            noteNameMap.put(nextNote + range.toString(), noteNumber);
            noteNumber++;
        }
    }
}

and

//Loop through all of the note numbers (0-127)
for (int noteNumber = 0; noteNumber < 128;){
    //loop through the octave ranges (-1 through 9)
    for (Integer range = -1; range < 10; range++){
        //loop through all of the note names
        for (String nextNote : noteNames){
            String noteName;
            //if the note name contains a "flat" then it shares a note number
            //with the preceding sharp; decrement the running count and store
            //a string made up of both spellings (e.g. "C#4/Db4")
            if (nextNote.contains("b")){
                noteNumber--;
                String previous = noteNumberArray[noteNumber];
                noteName = previous + "/" + nextNote + range.toString();
            } else {
                noteName = nextNote + range.toString();
            }
            noteNumberArray[noteNumber] = noteName;
            noteNumber++;
        }
    }
}

The first loop rolls through all of the potential note names and maps each one to a number, in order from 0 to 127 (all of the MIDI notes). The second does the opposite, building an array so that you can quickly look up a note name by number (the direction a Processing sketch will use most often).
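
In practice this reduces the lookup to a one-liner. The accessor names below are only illustrative, not necessarily what the class exposes; see the download for the real API:

// hypothetical accessor names, for illustration only
MidiReference ref = new MidiReference();
int note = ref.getNoteNumber("C4");   // 60, ready to match against incoming NoteOns
String name = ref.getNoteName(61);    // "C#4/Db4", as built by the second loop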

If you find this kind of thing useful, feel free to use it. I will expand it from time to time as I add new reference data for myself.

American Gods and running with the dead

So I finally got around to reading American Gods by Neil Gaiman this past winter, and as it turned out I couldn’t have picked better circumstances.

Actually I didn’t so much “read” American Gods as listen to it, narrated by George Guidall, who I am convinced is the finest narrator of audio books I’ve ever had the privilege to hear.

As it happens, the story takes place largely in winter, and since this book was to be my running companion for much of the season, it was a perfect match. Most of the time, anyway. Gaiman’s descriptions of the sub-arctic temperatures in Lakeside made my balmy “barely freezing” weather feel that much worse.

The story itself wound its way through much of the Midwest into the South, just as I was training for my pre-season marathon here in Atlanta, and the journey quality of the tale made the two- and three-hour runs memorable, even enjoyable. I really got a kick out of the final scenes, which take place in Rock City. Anyone who’s lived south of the Mason-Dixon has seen a “SEE ROCK CITY” birdhouse and can certainly relate.

Towards the end of the cold season I found myself running through a Confederate graveyard just across the street from my home, just as the protagonist of the story was being led through the ceremony of the dead. What timing. It was about then that I realized how old the city I lived in was, and how much history I was passing as I ran through it. Later I would realize how much history my city has managed to collect in such a short time, when I’m reminded that:

“In England 100 miles is a long way, in America 100 years is a long time”.

It was a nice experience, and I hope to be able to match my book selection and season again in the future. It’s a sunny spring day as I write this, and already I’ve swapped Tennessee whiskey for tequila and lemon, and Doc Martens for flip-flops.

As a last note I have yet to read a Neil Gaiman story I haven’t liked. The way he weaves primal myths into everything from sci-fi to road stories is entertaining at least, timeless at best. I think I’ve read Sandman three times now. When my only disappointment with a story is that it has ended, then it was a fine story.

Music to code by

Lately I’ve been having to step back into writing a lot of code. I used to have no trouble at all concentrating on the task at hand, but for some reason focusing these days is tough. It’s probably the vast number of distractions from co-workers needing help with this, that, or the other thing. At any rate, slapping on a Tool album or Captain Beefheart or something like that certainly isn’t going to help matters, so I’ve gone back to my music collection to find some tunes that are “barely there”.

Recommendations:

  • Godspeed You! Black Emperor – Yanqui U.X.O.
  • Pole – 1
  • Basic Channel – Basic Channel
  • Monolake – Momentum
  • Brian Eno – Apollo
  • Pan Sonic – Aaltopiiri
  • Autechre – Tri Repetae++
  • Aphex Twin – Selected Ambient Works Vol I and II
  • Arovane – Tides

All of these albums are pretty linear, with very few quick changes in time signature or tempo. They’d probably work well for relaxing after a rough day at the office. I’ll update this list as I find new tunes to code by.

Introducing…

The Professor Hubert J. Farnsworth. Yes, Cary and I have been given another puppy, and as one person has already said:

“Holy Zombie Jesus, He’s Cute!”

Indeed. Here is the eight-week-old professor now, as displayed by proud adoptive interspecies caretaker, Cary:

Awwwwwwww…

So far he appears to be highly intelligent, extremely fastidious (he’s already crate-trained…at 8 weeks), and capable of handling himself in complex social situations involving dogs 40 times his size:

1 vs. 100…pounds

And to think, when Juno was a puppy we almost bought chew-toys that looked like little pugs.

Our cats have taken to him about as well as they took to Juno. Fear and loathing best describes the relationship between the cats and the dogs now, and the fact that the cats could easily take the 2.5 lb pug in an old-fashioned donnybrook doesn’t seem to change those sentiments one bit. I’m beginning to think that all those calendar pictures of cats and dogs resting together in perfect harmony are just Photoshop tricks.

The only challenge we face with the Professor in the upcoming months is ensuring that Juno doesn’t mistake him for a meal. I’ll keep you posted.
