Grant Muller

Processing HarmonicTable: Part 2

Since the last post I’ve had to make far more changes than I expected. If you looked at the previous examples, I was using a loop to create the hex buttons, making translations relative to other translations on the screen. In the process I completely lost track of the absolute position of each button, which made it basically impossible to detect the location of the mouse on the screen and tell which button I was pressing. As a result I created a separate class to represent the hex button, storing the absolute starting point of each button and the length of one side. That is all the information needed to create and detect a hexagon anywhere on the screen (a rough sketch of the class follows the setup loop below). From there I just created an array of these buttons in the setup() block, and drew them all over the screen:

hexButtons = new ArrayList<HexButton>();

for (int j = 2; j < 20; j++){
    resetNoteNumber(rowNumber);
    for (int i = 0; i < 12; i++){
        hexButtons.add(new HexButton(this, space + (i * xOffset), parseInt(height - (j * b)), length, noteNumber));
        noteNumber++;
    }

    j++;
    rowNumber++;
    resetNoteNumber(rowNumber);

    for (int i = 0; i < 12; i++){
        hexButtons.add(new HexButton(this, space + parseInt(a + c) + (i * xOffset), parseInt(height - (j * b)), length, noteNumber));
        noteNumber++;
    }

    rowNumber++;
}
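
For reference, here is a rough sketch of what a HexButton class along these lines might look like. The constructor arguments, the drawHex(boolean) signature, and the field names startX, startY, thisNoteNumber, and note are taken from the calls in this post; everything else is guesswork, so grab the actual HexButton file linked at the end of the post for the real thing.

import processing.core.PApplet;

public class HexButton {

    PApplet parent;        // the sketch to draw into, passed in as 'this' from setup()
    int startX, startY;    // absolute top-left corner of the hexagon's bounding box
    int length;            // length of one side
    float a, b, c;         // the same proportions used throughout the sketch
    int thisNoteNumber;    // MIDI note number this button triggers
    String note;           // display name, e.g. "C4" (looked up from the note number in the real class)

    public HexButton(PApplet parent, int startX, int startY, int length, int noteNumber) {
        this.parent = parent;
        this.startX = startX;
        this.startY = startY;
        this.length = length;
        a = length / 2f;
        b = PApplet.sin(PApplet.radians(60)) * length;
        c = length;
        thisNoteNumber = noteNumber;
    }

    // Draw the hexagon at its stored absolute position; 'pressed' could toggle the fill color
    public void drawHex(boolean pressed) {
        parent.fill(pressed ? 200 : 255);
        parent.beginShape();
        parent.vertex(startX, startY + b);
        parent.vertex(startX + a, startY);
        parent.vertex(startX + a + c, startY);
        parent.vertex(startX + 2 * c, startY + b);
        parent.vertex(startX + a + c, startY + 2 * b);
        parent.vertex(startX + a, startY + 2 * b);
        parent.endShape(PApplet.CLOSE);
    }
}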

As you can see I’m still making relative translations to locations on the screen, but I’m storing them in the class to be accessed later. This way I can still change the proportions and space variable at the top and not have to change a bit of code anywhere else to resize and reposition items. This greatly simplifies my draw() block:

public void draw(){
    background(255);
    for (int i = 0; i < hexButtons.size(); i++){
        HexButton button = (HexButton) hexButtons.get(i);
        button.drawHex(false);
    }
}

Now detecting the position of the mouse is simple:

public void mousePressed(){
    for (int i = 0; i < hexButtons.size(); i++){
        HexButton button = (HexButton) hexButtons.get(i);
        if (mouseX >= button.startX + a && mouseX <= button.startX + a + c && mouseY >= button.startY && mouseY <= button.startY + (2 * b)){
            println(button.note);
            activeNotes.add(button.thisNoteNumber);
            midiOutput.sendNoteOn(0, button.thisNoteNumber, 100);
        }
    }
}

Notice I only bother performing this check when the mouse is pressed; this saves me some cycles, since I have no intention of sending a MIDI note on unless the mouse is pressed (or dragged, which uses the same function as above). Also notice the activeNotes array. I’m storing notes that have already been pressed so that I don’t retrigger a note unless the mouse is pressed again, which is necessary for mouse drags. After the mouse is released, I just send a note off to all notes in the activeNotes array.
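
For completeness, here is a minimal sketch of what that mouseReleased() might look like, assuming activeNotes is a plain ArrayList of Integer note numbers and that rwmidi's MidiOutput offers a sendNoteOff() counterpart to the sendNoteOn() used above; the real implementation is in the files linked below.

public void mouseReleased(){
    for (int i = 0; i < activeNotes.size(); i++){
        int note = (Integer) activeNotes.get(i);
        midiOutput.sendNoteOff(0, note, 0);    // silence every note started during this press/drag
    }
    activeNotes.clear();                       // reset so the next press can retrigger the same keys
}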

I also revised the way the rows of notes are created across the screen. Previously the starting notes were hard-coded, but with the following bit of code:

public void resetNoteNumber(int rowNumber){
    if (rowNumber == 0){
        noteNumber = noteNameMap.get(startingNote);
        previousNote = noteNumber;
    } else if (rowNumber % 2 == 0){
        noteNumber = previousNote + 3;
        previousNote = noteNumber;
    } else {
        noteNumber = previousNote + 4;
        previousNote = noteNumber;
    }
}
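
To see what this produces, here is a tiny standalone snippet (not part of the sketch itself) that reproduces the same arithmetic, assuming a starting note of A2, which is MIDI note 45:

int previousNote = 0;
for (int rowNumber = 0; rowNumber < 6; rowNumber++){
    int noteNumber;
    if (rowNumber == 0){
        noteNumber = 45;                     // A2, the starting note
    } else if (rowNumber % 2 == 0){
        noteNumber = previousNote + 3;       // up a minor third
    } else {
        noteNumber = previousNote + 4;       // up a major third
    }
    previousNote = noteNumber;
    println("row " + rowNumber + " starts at MIDI note " + noteNumber);
}
// prints 45, 49, 52, 56, 59, 63 -- alternating 3rds, so every two rows climb a perfect 5th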

I can just change the starting note field in the class to whatever I want, and the rows will always move up in 3rds and 5ths. So I can make my starting note A2 instead of C2, and everything lines up without a hitch and without any code changes:

[Screenshot: harmonic table starting at A2]

I tested it out with some seriously 80’s sounding FM pad, dragging across intervals, drawing random chord shapes in honeycomb patterns, etc. Here is a short output:

[Audio clip]

As usual here are the updated files:

HarmonicTable HexButton NoteReference

It’s a lot of fun, even though I can only play it with the mouse. The last step is to get a touch screen. Anyone want to donate one?

Processing HarmonicTable: Part 1

Earlier this year while reading Harmonic Experience by W. A. Mathieu, I was introduced to the concept of lattices to represent tones, chords and keys. These lattices can be used to represent the basics of music composition in a visual way that makes more sense than standard scales on staffs. Here is an example:

[Image: chord lattice]

The lattice is effectively several staffs of music stacked on top of each other, so that notes can be related both horizontally and vertically. Starting from any note in the lattice, moving horizontally (diagonally right or left) jumps by a 5th (for example C to G, or G to D). Moving vertically jumps by 3rds (C to E, or G to B). Since nearly all music is constructed from 5ths and 3rds, this system makes it easy to create visual patterns for chords, cadences, scales, etc. that will always look the same no matter where you start on the lattice. In the example above, you can see that major chords are always point-up triangles, and minor chords are the opposite, point-down. Until recently this system didn’t translate directly to an instrument. Any chord on a keyboard will have similar but different fingerings and patterns depending on the starting note, so one still has to memorize a ton of different patterns for one scale, chord, or modulation.

A few weeks into reading Harmonic Experience, providence saw fit to lead me to the C-Thru AXiS controller. The AXiS is a Harmonic Table based MIDI controller that creates a table much like the lattice in Harmonic Experience, separating notes not in semitones like a keyboard, but by 5ths and 3rds (and of course many other inter-relationships based on this). Here is the layout of the AXiS:

[Image: the AXiS natural keyboard layout]

If you look, the relationships between the keys become clear. They are reversed from Mathieu’s lattice: 3rds are horizontal (diagonal), and 5ths are always straight up. To play a major triad on the AXiS, just play that triangle pattern starting at any key. C Major will have the same pattern as Bb Major, and so on. There are of course drawbacks to this system (playing inversions, etc.), but for the most part it accomplishes the goal of simplifying the muscle-memory aspects of playing music so the user can concentrate on composition and performance. You learn the pattern for a scale, chord or mode only once, then you can modulate it anywhere without the need to retrain your fingers.

The problem with the AXiS is the price tag. It’s around $1700 for one from what I can tell, which is a pretty steep entrance fee for a device I may not be able to get used to. They are in the process of creating a cheaper, smaller version, but I want to try it now. The solution was to build one in Processing. I’ve only just started, but combining the MIDI reference classes I’ve been working on along with (eventually) a touch screen, I think I should be able to pull something off that works well enough for me to test this thing out. I started by looking for ways to draw regular hexagons, and came across this site. With some basic trigonometry and a lot of cut-and-try with the vertex functions, I came up with this function:
int length = 30;
float a = length / 2f;
float b = sin(radians(60)) * length;
float c = length;

public void drawHex(){
    beginShape();
    vertex(0, b);
    vertex(a, 0);
    vertex(a + c, 0);
    vertex(2 * c, b);
    vertex(a + c, 2 * b);
    vertex(a, 2 * b);
    vertex(0, b);
    endShape();
}
This will construct a regular hexagon based on the length of one side. After that I used a series of translation matrices to draw them all over the screen:

public void draw(){
    rowNumber = 0;
    setNoteNumber(rowNumber);
    for (int j = 2; j < 20; j++){
        pushMatrix();
        translate(0 + space, (height - (j * (b + space/2) + 1)));
        drawHex(getNextNote());
        for (int i = 1; i < 12; i++){
            translate(space + (2 * a) + (2 * c), 0);
            drawHex(getNextNote());
        }
        popMatrix();

        j++;
        rowNumber++;
        setNoteNumber(rowNumber);

        pushMatrix();
        translate(a + c + 1.5f * space, (height - (j * (b + space/2))));
        drawHex(getNextNote());
        for (int i = 1; i < 12; i++){
            translate(space + (2 * a) + (2 * c), 0);
            drawHex(getNextNote());
        }
        popMatrix();

        rowNumber++;
        setNoteNumber(rowNumber);
    }
}

I predefined the number of horizontal hexagons as 12 per row, which means that STRAIGHT horizontally across a row I have a perfect chromatic scale. The number of vertical keys I simply copied from the AXiS. After predefining the number of columns and rows, I could construct the size of the screen based on the length of one side of the hexagon. I also included a variable that allows me to declare an amount of fixed space in between the keys (in case my fingers are too big for the key surface). One variable, named length, can be changed to create a bigger surface with larger keys. After implementing some simple code to write the note name into each key as well, I was finished with the layout. Here is a snapshot of it directly out of Processing:

[Image: Virtual Harmonic Table]

It’s of course much bigger. With the key surface complete, all I have to do now is map the key locations to the MIDI notes they’re associated with, and use some on-press functions to trigger them. After that I just need to get access to a touch screen. Anyone want to donate one? Here is the preliminary code to draw the hexagons, as well as the code required to map MIDI notes:

HarmonicTable NoteReference

In Part 2 I’ll have MIDI functionality implemented and some testing done. In Part 3 I’ll have it implemented with a touchscreen, most likely a laptop; at that point I’ll make the rest of the code available.

Pointless Changes

Most of the things I do are completely pointless. For instance, rather than adding content to my Lolight Records site, I instead redesigned it as a WordPress site. Why? It gave me a chance to screw around with a new technology, and implementing PDO for PHP was an unhappy prospect (long story). So check out the site if you want to see it in action; everything that was on the previous site is there on the new site. It just looks better.

What’s the impetus? Besides the screwing-around-with-new-technology part, I wanted something that would require less time on my end to maintain. You see, Lolight Records was written 4 years ago entirely in PHP from the ground up, with a custom database and code. That sucks to maintain. Also, I like widgets. The point is I’m trying to get back to spending my time creating content, rather than infrastructure.

So I’m working on some new stuff, music, videos, software and the like.

Fedback Feedback

A week or so ago on the very awesome AudioCookbook blog, there was a post about Old Amplifier Abuse that reminded me of an experiment from a few years back conducted by Arthur, Micah, and me using an old Crate amp head with a spring reverb. As luck would have it, not two days later, while recording some test drum takes, Micah asked whatever happened with that stuff, so I dug it up.

We exposed the spring in the reverb tank, plugged the output into a short effects chain (a distortion stomp box and a Boss SP-202 for ring mod and filtering), then plugged it right back into the input. The speaker outs we plugged into some speakers, then mic’ed the setup. We recorded 30+ minutes of the most undisciplined noise, banging on the spring, attaching crap to it, and twisting knobs on the filters. This created a feedback loop of oscillations, with the occasional bang as someone threw a drumstick at the spring tank. It reminds me of some industrial experiment gone wrong, with the crash of toppling, derelict factory equipment in the background. Here are a few clips.

NOTE: These samples might be quite loud and obnoxious.

Section 1

[Audio clip]

Section 2

[Audio clip]

Section 3

[Audio clip]

Section 4

[Audio clip]

Section 5

[Audio clip]

Java Midi Reference Class

Lately I have been doing some work with Processing to create visuals from music. One of the concepts I’m working with is live visuals based on video feeds from cameras that are tracking the show, or band, or whatever. This alone would be enormously boring, so to heighten the experience, I thought of using the audio output of the show to control features of the video feeds, like playback speed, positions, color and hue, etc. This turned out to be way too processor intensive, and given the properties of sound, would be impossible to synchronize with a live show.

The solution was to use MIDI to trigger these events, rather than having to perform audio analysis first. Using rwmidi and Processing, I started an experiment to see what I could do to video using MIDI messages as triggers: NoteOns as ellipses with alpha masks, or Pitch Bend controllers tracking hue logarithmically. So far these experiments have been successful, but I found myself referring to a MIDI reference constantly.

The problem is most musicians don’t think in MIDI numbers. Even though the messages I’m using have no musical information, they’re derived from the world of music. So note names like C4 (the note C in the 4th octave) have a MIDI number (60). When I’m creating a drum beat in Ableton, for instance, I don’t look for note number 60, I look for C4. This means that every time I wanted to map this information to an event in Processing, I had to first look up the note number. After a while you memorize them, but that’s lame, and if I leave the project for a while and come back, I’ll have to go back to looking them up again.
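
For anyone who just wants the arithmetic: with the octave numbering used here (octaves -1 through 9, so that C4 = 60), a note number is simply the octave shifted up by one, times 12, plus the note's offset from C:

int octave = 4;                              // the "4" in C4
int semitonesFromC = 0;                      // C = 0, C#/Db = 1, D = 2, ... B = 11
int noteNumber = (octave + 1) * 12 + semitonesFromC;
println(noteNumber);                         // 60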

I couldn’t find any solution for this in code, so I created a simple Java class to handle MIDI Reference Data:

MidiReference

Which you can download from the link above. Right now it only handles notes and their associated MIDI numbers. The crux of the code consists of two loops:

//Loop through all of the note numbers
for (int noteNumber=0; noteNumber<127;){
    //loop through the ranges (-1 through 9)
    for (Integer range=-1; range<10; range++){
        //loop through all of the note names
        for (String nextNote : noteNames){
            //if the note name contains a "flat" then it has the identical note number
            //as the previous note; decrement the note number count before inserting into
            //the map so that note numbers can be translated from either sharps or flats
            if (nextNote.contains("b")) noteNumber--;
            noteNameMap.put(nextNote + range.toString(), noteNumber);
            noteNumber++;
        }
    }
}

and

//Loop through all of the note numbers
for (int noteNumber=0; noteNumber&lt;128;){
    //loop through the ranges (-1 through 9)
    for (Integer range=-1; range&lt;10; range++){
        //loop through all of the note names
        for (String nextNote : noteNames){
            String noteName;
            //if  the note name contains a "flat" then it has the identical note number
            //as the previous flat, decrement the total note number count and insert into
            //the array a string made up of both the previous notes name and the current one
            if (nextNote.contains("b")){
                noteNumber--;
                String previous = noteNumberArray[noteNumber];
                noteName = previous + "/" + nextNote + range.toString();
            } else {
                noteName = nextNote + range.toString();
            }
                noteNumberArray[noteNumber] = noteName;
                noteNumber++;
        }
    }
}

The first loop rolls through all of the potential note names and maps them to a number, in order from 0-127 (all of the MIDI notes). The second does the opposite, so that you can quickly refer to the note name by number (which is something that Processing will find easier to do).
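
As a rough usage sketch (from inside the class, or through whatever accessors you expose; the flat and sharp spellings share a slot, so array entries look something like "A#4/Bb4"):

int c4 = noteNameMap.get("C4");              // name to number: 60
String name = noteNumberArray[70];           // number to name: something like "A#4/Bb4"
println(c4 + " / " + name);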

If you find this kind of thing useful, feel free to use it. I will expand it from time to time as I add new reference data for myself.

2008: A Year Off

Well, as far as athletic competition goes. Anyone paying attention probably noticed that I added nothing to my race schedule after The ING Atlanta Marathon in March, whereas usually I have at least 3 or 4 races in the Summer, and some runs in the Fall.

What happened? Not sure. I forgot to sign up for the tris I usually do. I had big plans at the beginning of the year to do a Half-Ironman, but couldn’t seem to find one worth doing. Not to mention that when I started looking at the time involved to get myself up to Ironman level, I had to ask myself if this was a job or a hobby. Speaking of hobbies, I have more than a few, and of course dedicating more time to one leaves less for another. Cary got sick early in the year, then my grandmother died. Work of course got out of control with yet another project running against an impossible schedule. I could go on making excuses; really, I’m full of them.

I wasn’t the only one. My brother-in-law rode fewer sub 100 mile rides than he has in years. Maybe we’re all just burned out.

I did have one fun race. Cary and I had such a good time in January at the Disney Marathon that Cary decided to sign us up for a 5K in September. It turned out to be a nice weekend trip, and it got Cary involved, which was a lot of fun as well.

At any rate, I’ve forced myself to sign up for at least 2 marathons early in the 2009 season: Sedona, AZ and the ING Atlanta. We’ll see if that kick-starts or stalls the next year.

Jamie Dupree is the best political journalist you don’t know about

It’s Sunday afternoon and a “Bailout Bill” is imminent. Having listened to this nonsense all weekend without explanation, I was glad to see that a journalist with enough trust in his audience has posted an impartial, full-text bill proposal as of this afternoon:

http://wsbradio.com/blogs/jamie_dupree/2008/09/text-of-the-wall-street-bailou.html

He has also posted his analysis, which those who have listened to Dupree in the past can trust to be fairly devoid of bias one way or the other:

http://wsbradio.com/blogs/jamie_dupree/2008/09/analysis-of-wall-street-bailou.html

In general, Dupree is almost always a spectator in the game of politics, never making his leanings known. He consults for a number of radio stations, including WSB Atlanta. You can catch him during the morning hours and sometimes during Boortz’s show (which is usually hilarious, as Boortz tries to get him to say things he simply will not say). Most of the characters you see on TV and hear on the radio are just that, characters, designed to entertain whichever bias you are subject to. Dupree is not an entertainer. His blog is fun to read, but more often it is highly informative, and of course largely impartial. Those as entertained as I am by the games our politicians play should take a look.

I encourage you all to pay very close attention to what is going on with this bailout bill. This is an opportunity for both major parties to play the “power grab” game, and without voter participation in this cluster it will quickly get out of hand. Those not aware of the implications of this bill for the separation of powers in the United States government have some catching up to do.

UPDATE: The full text bill proposal was released by CNN and the other major news outlets hours after this…

Sound Memory

It’s pretty common to create mnemonic devices to help remember things. Indeed, I can think of at least a dozen acronyms, sing-alongs and rhymes that I’ve used in the past (thanks to Fields, I’ll never forget my Latin declensions). Apparently the sense of smell creates the strongest memory bonds in most folks, but for me, it’s gotta be sound.

At an early age I noticed that if I listened to a song while I was reading something, then when I heard the song again, or even thought of it, I could remember parts of whatever it was I was reading. I used to read perpetually in the car whenever I went places with my mom, with my mother of course in charge of the tunes. Now, tragically, whenever I hear ANY UB40 song I picture scenes from the Death of Superman. The wiggly, piercing tenor of Aaron Neville triggers memories of X-Men and Green Lantern.

Of course this applied to more than just books. There was a Sega Genesis game called Flashback that I played as a kid. The game had almost no music. For a 10-year-old this was unbearable. Naturally I had to listen to something while I played, so I put on Metallica’s …And Justice For All. Nothing like rocking hard to what might have been the slowest game on the Genesis platform. A few months back at Battle ‘n’ Brew, someone played Blackened on Rock Band, and all I could think of was how to find the holocube in the forest, and why I can’t freakin’ jump far enough to reach that tree branch. You’d have to have played that game to understand…it was addictively frustrating. I have a whole list of games I played with alternative tunes. Future Sound of London became the soundtrack for Illusion of Gaia; God Lives Underwater substituted for Earthbound. I can’t forget most of these games now because of it.

I used to be a pro at tuning out the sounds around me, or at least at tuning them into whatever I was focused on. I’ve grown out of that, and I can no longer tune out the television or conversations, or keep the lyrics of the music I’m hearing separate from the words I’m reading. I still listen when I read, but I try to select music with few sweeping key changes or quick cuts, and most definitely no words. Right now Godspeed You Black Emperor is the soundtrack for Joseph Campbell’s Masks of God.

Welcome, Organ

Drawbars and Keys

This weekend I picked up a 100+ year old organ from a friend’s house. Thanks, Ryan. It’s a pump organ, made by a company whose name looks like “Moreto”. Apparently it was also made in Chicago.

All I know is that it doesn’t require any power other than my own two feet to operate. This makes it easy to stop by on my way to bed and play a little tune, whatever I make up, so that I end my day with a song. Here are some close-ups of the detail work…

[Image: detail work]

And of course…the fabulous cast iron and solid wood stool it came with

I of course have work to do on it. The reeds are slightly out of tune. Can those be retuned? The fabric behind the wood carvings needs to be replaced (thanks in advance, wife). The drawbar labels need to be redone, as some are missing. All in all it ought to be a fun project.

First test of the Firepod (FP10)

I guess I should start calling it the FP10, since technically it’s been renamed/rebranded. Firepod just sounds so cool.

Band practice Saturday consisted of cutting a drum track for an ongoing demo that we’re working on (I’ll post a copy when Bill finishes the mixing). I didn’t feel at all prepared for it, but that’s beside the point, and it actually went really well.

The setup consisted of exactly 3 mics. That’s it. A single overhead condenser, and a dynamic each on the kick and snare. We ran those into the FP10 along with a mix of Bill’s “wall-of-keyboards”, and cut the 11-or-so-minute track in about an hour using GarageBand on Bill’s Mac.

It sounded fantastic.

Listening back later through a pair of home audio speakers to the unmixed, un-EQ’d tracks, we all nodded in approval at the quality of the recording. I can’t tell you if it was the preamps, Bill’s skill at mic placement, or the mics, but I can tell you that my drumming had very little to do with it. It was the best drum sound I’ve heard while trying to record outside of a studio. The kick resonated in your chest, the snare and cymbals were crisp without being irritating, and the toms pounded like a stomping rhino. Very nice.

There is a section of the track where I’ll be playing something like a drum solo, which is a rare treat for me, and in 7/8 time to boot. It won’t sound great on the demo, but I’m looking forward to preparing something for it live and for later recordings.

I’ve also retooled the home studio a bit. Look for pics of it as well as the AM practice space. To listen to the older demo from a few weeks ago of the first movement, go here:

http://www.myspsace.com/ascendedmastersga