Grant Muller

Need Some New Tunes?

I started drumming in a new band sometime last year called the Ascended Masters. When I joined we were a trio, though the band has since expanded to a quartet/quintet with the addition of a horn section. Progress with the band has been quick as of late, even if the ramp-up was a bit logarithmic. In the span of the last several months, we’ve released an album, played several shows, and booked a couple of recurring gigs. Where can you find this stuff? Brace yourself for links.

  • megaProject THEME – The first album, download, buy or listen to it here.
  • Of course, there is a MySpace page
  • Prefer live shows? Well you can check us out every 3rd Thursday of the month at the Caribou off Moreland in Little 5 (Atlanta, GA).
  • Need more? We’re at the Caribou in Midtown every 4th Thursday.
  • We occasionally play a couple tracks at the open jam on Tuesdays at the 5 Spot. Even if we don’t play though, you should check out the house bands (featuring Eric and Bill of Ascended Masters). Half the band ain’t bad.
  • What, you want video? Fine, check us out on YouTube.
  • Wait, there’s more! We’ll have a podcast up soon, featuring the highlights of some live gigs (posted semi-weekly), rehearsals, free jams, and the like. Stay tuned.

That should be enough to keep you busy. End plug.

Mandala Meets Drumset

Mandala

I’ve had several chances to play my new setup now, including the Mandala. I played a live gig (recording to come) on Friday of last week, and learned a few lessons. Now that I’ve got things under control, I started to create some basic kits in Battery for my Mandala.

The Mandala is capable of subdividing into 7 zones. For a 10-inch surface, that’s a lot. It’s really great for emulating snare sounds and extremely realistic percussion (you can effectively assign any number of samples to these zones, velocity-map them, and have a “real” drum). But I don’t need that. I have plenty of drums.
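For the curious, velocity mapping boils down to something like this little layer-picker. Battery handles this internally; the names and layer counts here are entirely made up for illustration:

```java
// Hypothetical sketch of velocity-layer selection, the same idea a sampler
// uses when several samples are velocity-mapped to one zone.
public class VelocityLayers {

    // Pick which of n samples to play for a MIDI velocity (0-127),
    // splitting the velocity range into equal layers.
    public static int sampleIndex(int velocity, int layers) {
        if (velocity < 0) velocity = 0;
        if (velocity > 127) velocity = 127;
        return (velocity * layers) / 128;
    }

    public static void main(String[] args) {
        // With 10 velocity layers per zone, a soft tap and a hard hit
        // trigger different samples from the same pad zone.
        System.out.println(sampleIndex(20, 10));  // a soft layer
        System.out.println(sampleIndex(127, 10)); // the loudest layer
    }
}
```

The more layers you stack per zone, the smoother the dynamic response, which is why even 3 zones end up eating 30 samples.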

My first little attempt at a custom kit for the Mandala is a riff off of the tabla kit released a while ago. I found that 3 zones per pad is my ideal number, so this little snippet features about 30 samples mapped over 3 zones. Each zone is actually only one “instrument” of course, but velocity mapping dictates that I have more than one sample per zone for a more realistic implementation. Here’s a little sample of my drums and the Mandala working together.

Mandala Tabla Test

New Music Setup

DrumStudioSetup-1

When we bought our house a few years back, I finally had enough space to consolidate all my music gear in one place. I had big plans for my basement studio. Mix desk over on one side. Wall-mounted monitors. Jute coffee sack sound baffles and just the right location for recording the drums. Later I found a place for my pump organ, and of course a myriad of other instruments strewn here and there. Over a year or so, I managed to make all that happen. The result has been rather…terrible.

After I spread everything out, it became clear that recording with just little-old-me was slow if not impossible. If I wanted to record drums, I had to press record, then walk over to my drums to play. Not a big deal if I’m just going to cut it all in post, but what about punch-ins? I’m also lazy. If it takes me more than 2-3 minutes to set up and tear down to make some sound happen, I just won’t do it. I’m not into setting up, tuning and perfecting the timbre for an hour, dammit; I’ve got a melody in my head now.

I thought about my past work, stuff I’d done for an EP and a full-length a few years back, and realized that everything I’d ever recorded had been done in a tight space, with just a few things, without a big setup. As a lone wolf in the studio, I don’t have a team to put things together for me, press record and punch in when I need it, and patch up my instruments. If I was going to continue doing it myself, I was going to have to make some changes. So I did.

I started by ditching my Digi001, an old recording platform released by Digidesign about a decade ago. I loved this thing, but it was really starting to age, not to mention it only had two mic preamps. I traded it for a Presonus FP-10, which I’m so far very impressed with: eight mic preamps, usable with any software I want (previously I was tied to Pro Tools), all in one rack unit. Done.

Next was the mixer and cabinet. With 8 Mic preamps in the FP-10, why do I need a mixer? I ditched it (eBay style), and the rack I had it mounted in (which I had built many years ago…it’s ok to destroy the things you create). This freed up a lot of space, and of course, freed up my hands.

After that it was bye-bye desktop. It required a monitor, keyboard, mouse, blah blah blah. Gone. As of now I’m doing everything with a laptop. Yeah, I know Kid606 and other people cooler than me have been doing this for years. I don’t care. I’m old school or something.

So, down to a laptop, FP-10, mics, drums, a controller keyboard, and a bunch of random instruments. Now what? Round ’em up:

I set up in the round so that everything would be right at arm’s length. I’m primarily a drummer, anything else I do is purely piddling around, so I made the drums the focus by setting up the mics (Recorderman style) specifically for recording them and my Mandala. A couple of overheads (Samson C02s), a kick mic, two on the snare (Shure SM57s), and some crappy tom mics. To my left I threw together a keyboard stand with the FP-10 and a place for the laptop. Right behind me is a controller keyboard, so if I want to throw together a cheesy bassline for the rad beat in 25/8 time I just wrote, I can do it immediately. I just spin around to make music. Mostly.

So the idea is that I’ll write here, and when I feel pretty comfortable everything is ready, I’ll take it to the mix desk, where those coffee sack baffles will really help out.

What’s next? I just got word that a friend is selling an old electronic kit. Since I’m basically a drum machine, I’ll pick that up and include it in my little round here…maybe right in with my current drum kit. We’ll see if this works…I played this silly little drum lick to test the mics:

Silly Drum Lick

Converting 24PPQ Midi Sync in Java/Processing

I would be the first person to say that, for the most part, MIDI is perfectly acceptable as an interface between musical devices, and has survived as long as it has because of how dead simple it is. MIDI is still plenty fast, and in terms of interoperability it has yet to be bested. However, MIDI does have its shortcomings, and while helping John Keston over at AudioCookbook with his Gestural Music Sequencer, I ran into a big one.

Way back in the day, I can only assume before Mu-ziq and BT, MIDI clock was implemented at a rate of 24 pulses per quarter note. This means that when you press play on your sequencer, the devices synced via this clock hear 24 pulses for every quarter note, and as a result can play back in time with these pulses. After 12 pulses pass, you move by an eighth note; after 6 pulses, a sixteenth note; after 3 pulses, a 32nd note. You should see that as soon as I try to drop to 64th note resolution, I’m in trouble. How do I wait a pulse and a half if all that I’m relying on is the pulse to tell me when to move forward? I’m sure that while scoring the soundtrack for Legend, Tangerine Dream were perfectly happy being able to sequence 32nd notes, but how am I supposed to start an Aphex Twin cover band without 128th note stuttering beats and retrigger effects?
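The arithmetic is simple enough to sanity-check in plain Java. This little helper (not part of any library, purely illustrative) shows exactly where 64th notes fall through the cracks at 24 PPQ:

```java
// Pulses per note division at a given clock resolution. At 24 PPQ a 64th
// note needs 1.5 pulses, which a whole-pulse clock simply can't express.
public class PpqMath {

    // division: 4 = quarter, 8 = eighth, 16 = sixteenth, 32 = 32nd, 64 = 64th
    public static double pulsesPer(int division, int ppq) {
        // One quarter note = ppq pulses, and a quarter is 4/division long.
        return (double) ppq * 4 / division;
    }

    public static void main(String[] args) {
        System.out.println(pulsesPer(4, 24));  // quarter: 24.0 pulses
        System.out.println(pulsesPer(32, 24)); // 32nd:    3.0 pulses
        System.out.println(pulsesPer(64, 24)); // 64th:    1.5 pulses -- trouble
    }
}
```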

There are ways around this, of course: you can use the native time code in your sequencer or application and just use SMPTE frame-based timing to keep it all in sync, but if you’re writing a simple step sequencer to interface with some master sequencer, this is a huge hassle. What’s a guy to do? Convert it!

Conceptually, the steps to do this are simple:

      1. Intercept pulses
      2. Measure the time between two pulses (in milliseconds)
      3. Divide this value by the quotient of your target resolution divided by 24 (to get n milliseconds)
      4. Start a thread and send out a pulse to your sequencer every n milliseconds

So, if I want to convert 24 PPQ to 96 PPQ, I divide 96 by 24 to get 4. This means that for every pulse I get from the master sequencer, I should generate 4 pulses for my local sequencer. Now, look at the time of the first pulse and subtract it from the time of the second pulse. Let’s pretend I get something like 100 milliseconds. So while the master sequencer is sending me pulses every 100 milliseconds, I should generate pulses every 25 milliseconds. Then just do this for every pulse I receive, in case the timing changes and I need to reduce or increase the amount of time between pulses (a ritardando or accelerando). Seems easy enough, but how to put it into practice?
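That math can be sketched as a standalone helper. The names here are hypothetical, not the actual RWMidi code:

```java
// Sketch of the conversion arithmetic: derive the local pulse multiplier
// from the target resolution, then space the generated pulses by the
// measured interval divided by that multiplier.
public class SyncMath {

    // How many pulses to emit locally per incoming 24 PPQ pulse (96/24 = 4).
    public static int multiplier(int targetPpq) {
        return targetPpq / 24;
    }

    // Spacing in ms between generated pulses, given the measured interval
    // between two incoming master-clock pulses.
    public static double subPulseIntervalMs(double pulseIntervalMs, int targetPpq) {
        return pulseIntervalMs / multiplier(targetPpq);
    }

    public static void main(String[] args) {
        // Master pulses arrive 100 ms apart; at 96 PPQ we emit one every 25 ms.
        System.out.println(multiplier(96));                 // 4
        System.out.println(subPulseIntervalMs(100.0, 96));  // 25.0
    }
}
```

Recomputing the interval on every incoming pulse is what lets the conversion track a ritardando or accelerando.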

I worked on a standalone project in Java first. Wired through RWMidi, I passed the pulses from the event handler in my main Java class to the sync converter, then, using reflection mapping, passed the newly generated pulses back into another event handler in my main class. I decided that this approach sucked for a few reasons. For one, there was the performance hit of doing the reflection mapping twice (once in RWMidi, then again in my sync converter). Also, this required me to have two event handlers to catch the pulses in my main class: one for the originating pulses, and another for the newly generated pulses. Thinking in terms of an end-user, who would likely not want to go through all of this trouble just to convert some pulses, I decided it would be better to jam the sync converter right in line with the MidiInput class of RWMidi. Since I’m no stranger to modifying that library (I enabled sync in it a few months ago), I figured Wesen wouldn’t mind some additional modifications to his work to create this sync converter.

At first, I assumed the best approach would be to create a separate thread that handles the entire timing routine, a thread that would just stay in sync with the incoming pulses from the external sequencer. This approach turned out to be silly after 5 minutes. When you do the math, at 120 BPM the MIDI timing pulses are roughly 21 milliseconds apart. Asking a thread to perform even a simple algorithm like the one proposed above, while making sure to send a sync pulse on arrival from the external reference along with the calibrated “in-between” pulses (every 5 milliseconds, in this case), was asking too much. I tried it anyway, even though the math wasn’t working out, and sure enough, a significant amount of drift occurred between the external and internal sequencers.

So no dice on a replacement time reference. Perhaps I could let the original pulse fall through to keep my internal sequencer in sync, and create a timer/thread to handle the “in-between” pulses? This approach was on the right course, but failed for a few reasons. I couldn’t get the separate thread to wake up quickly enough to send the pulse (or pulses), so it would fall behind the external sync every eight or nine pulses. The result was drift. The timer was closer, but the overhead of creating a new thread per timer every 21 milliseconds (at 120 BPM) just wasn’t cutting it; the result was still a bit of drift.

Frustrated, I did what I always do in these situations and went to sleep. It was clear that I was losing track of the number one priority of the clock: keeping that internal device in lock step with the external device. The in-between pulses are technically superfluous, and won’t even be accounted for unless you drop below a note resolution of 32nd note triplets. I started doing the math in my head. If a 1/4 note occurs every 500 milliseconds (120 BPM), then a 64th note arrives only every ~31 milliseconds. Would a listener be able to distinguish if a 64th note were off by a few milliseconds one way or the other at this speed? What is the threshold where individual notes just sound like pulses in an oscillation? I speculated that this threshold was the minimum amount of space I would need between pulses; after that, it would just be a matter of sending the in-between pulses as quickly as possible.

In order to find out exactly what that threshold was, I ran a sleep test directly on the MidiInput class. I determined my sleep time using the same formula from above, and slept the entire MidiInput thread for the determined amount. My assumption was that my sleep times would be shorter than the interval at which any quantized MIDI message could possibly arrive, so sleeping would be safe, especially if the MidiInput were only used for sync messages. As I increased the tempo, I found that the threshold was around 5 milliseconds; beyond that, you can’t sleep the thread for a short enough period of time to meet the 64th note requirement. Once this 5 millisecond threshold is passed, the thread doesn’t bother sleeping before sending the extra pulses; it just blasts them on to the receiver of the messages. I also discovered that this method of sleeping the thread worked just fine, and there was no reason to implement another method.
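Here’s a rough standalone sketch of that decision. The real logic lives inside my modified MidiInput class; the names and the threshold constant below are just illustrative:

```java
// Minimal sketch of the sleep strategy: space the in-between pulses evenly
// when there's room, and stop sleeping entirely once the spacing drops
// below the empirically-found wake-up threshold.
public class PulseScheduler {

    static final double MIN_SLEEP_MS = 5.0; // empirically found threshold

    // Given the measured interval between incoming pulses and the conversion
    // multiplier, return how long to sleep between generated pulses.
    // Returns 0 when the spacing is below the threshold: don't sleep,
    // just blast the pulses out as fast as possible.
    public static double sleepMs(double pulseIntervalMs, int multiplier) {
        double spacing = pulseIntervalMs / multiplier;
        return spacing < MIN_SLEEP_MS ? 0 : spacing;
    }

    public static void main(String[] args) {
        System.out.println(sleepMs(21.0, 4)); // ~5.25 ms: still worth sleeping
        System.out.println(sleepMs(15.0, 4)); // 3.75 ms < threshold: 0, just blast
    }
}
```

The key insight from the night’s sleep is encoded in that early-out: the in-between pulses only need to be fast, not perfectly spaced, as long as the pass-through pulse keeps the two devices locked.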

A few tests and mistakes on my part later, and Keston was up and running with synced 64th notes in the GMS. Keep an eye on his space to watch his progress. The next release of GOL Sequencer will also include 64th (and possibly 128th) note capability.

Processing HarmonicTable 01

UPDATE: This is no longer the latest version, I fixed some bugs for Mac users and reposted. To get the latest version click here.

Since I don’t have a touchscreen to test this with, I’m releasing the 01 version with mouse functionality, along with a few GUI functions to change the midi out port and the starting note number. At some point I’ll get a tablet and add the touchscreen functionality, or if someone has a tablet I can pass the source code on and they can test/implement that functionality.

harmonictable011

There are two tabs at the top, one for the main table screen, and one for setup. The basic setup options at this time are midi ports and starting note values.

This was built using the Processing libraries, along with the ControlP5 library and Ruin & Wesen’s rwmidi library for midi functionality.

This is an executable jar, meaning that if you have Java (version 6) installed on your machine, you should be able to double-click it and you’re off and running. I have only tested this on Windows and Linux, so if there are any Mac users out there, please let me know if there are any problems.

Download HarmonicTable 01

If you have any questions or comments, leave them in the section below.

Christmas Time is Tabla Time

I’ve had Christmas surprises in the past, but my wife pulled a fast one this year that is particularly noteworthy.

As many of you know, I’ve been taking tabla lessons for the last few months, and having a lot of fun at it. I had been borrowing a pair of my teacher’s to practice on, all the while looking at getting a pair for myself.

If you’re in the market for a tabla set in the U.S., your options are limited. There is a college in California called Ali Akbar that sells them, but they can get pretty pricey. Getting them from India is much cheaper, the downsides being that shipping can get dicey, and of course I would have to go through someone to get them.

A month ago I found out that Ganesh, a co-worker of mine, was coming in from India, and I asked if he’d kindly bring a set back for me. My teacher Amit had offered a while ago to broker something for me, and was able to get the name of a shop to Ganesh. Little did I know that I was throwing a wrench into Cary’s plans.

As far back as October, Cary had started working with Amit to get me a set for Christmas. If she told me that she had already gotten a set, that would ruin the surprise, so when she was informed that I was trying to acquire one, she had to get everyone to play along with a little ruse. Amit gave a fake shop name to Ganesh, who in turn claimed to visit the shop and let me know that the set would be ready in time. I was really looking forward to getting the set, but was a little disappointed when Ganesh arrived and convincingly told me that the set wasn’t quite finished in time, but that it should be done by the beginning of January, and he could send it out with someone else. I could wait a few more weeks, I guess, but I was really looking forward to them.

Christmas arrived and a great big box was put in front of me, which I didn’t expect. I usually have some idea of what I might get for Christmas, so when I unwrapped the box (a toilet paper box), and saw the same round hard case that Amit brings his tabla over in, I was completely shocked. Here is the shipping label, which I thought was very cool:

tabla-1

Notice the “new Zaibaba Temple”; I wish mail addressed to me was labeled “near something-or-other”.

I’ve played the new set a few times, but I’m going to hold off until I can show the set to Amit and make sure everything is kosher with them, not to mention that I still don’t really know how to tune them. Here is the baya, and a detail of the copper, which is pitted really nicely:

tabla-3

tabla-9

Here is a detail of the stabilizer ring that the drums sit on, and the wooden pegs for the tabla:

tabla-6

tabla-7

Processing HarmonicTable: Part 2

Since the last post I’ve had to make far more changes than I expected. If you looked at the previous examples, I was using a loop to create the hex buttons, making translations relative to other translations on the screen. In the process I completely lost track of the absolute position of each button, which basically made it impossible to detect the location of the mouse on the screen in order to tell which button I was pressing. As a result I had to create a separate class to represent the hex button, and store the absolute starting point of each hex button and the length of one side. This is all the information I needed to create and detect a hexagon anywhere on the screen. From there I just created an array of these buttons in the setup() block, and drew them all over the screen:

[sourcecode language='java']
hexButtons = new ArrayList();

for (int j=2; j<20; j++){
  resetNoteNumber(rowNumber);
  for (int i=0; i<12; i++){
    hexButtons.add(new HexButton(this, space+(i*xOffset),
        parseInt(height-(j*b)), length, noteNumber));
    noteNumber++;
  }
  j++;
  rowNumber++;
  resetNoteNumber(rowNumber);
  for (int i=0; i<12; i++){
    hexButtons.add(new HexButton(this, space+parseInt(a+c)+(i*xOffset),
        parseInt(height-(j*b)), length, noteNumber));
    noteNumber++;
  }
  rowNumber++;
}
[/sourcecode]

As you can see, I’m still making relative translations to locations on the screen, but now I’m storing them in the class to be accessed later. This way I can still change the proportion and space variables at the top and not have to change a bit of code anywhere else to resize and reposition items. This greatly simplifies my draw() block:

[sourcecode language='java']
public void draw(){
  background(255);
  for (int i = 0; i < hexButtons.size(); i++){
    HexButton button = (HexButton) hexButtons.get(i);
    button.drawHex(false);
  }
}
[/sourcecode]

Now detecting the position of the mouse is simple:

[sourcecode language='java']
public void mousePressed(){
  for (int i = 0; i < hexButtons.size(); i++){
    HexButton button = (HexButton) hexButtons.get(i);
    if (mouseX >= button.startX+a && mouseX <= button.startX+a+c
        && mouseY >= button.startY && mouseY <= button.startY+(2*b)){
      println(button.note);
      activeNotes.add(button.thisNoteNumber);
      midiOutput.sendNoteOn(0, button.thisNoteNumber, 100);
    }
  }
}
[/sourcecode]

Notice I only bother performing this check when the mouse is pressed, which saves me some cycles, since I have no intention of sending a MIDI note-on unless the mouse is pressed (or dragged, which uses the same function as above). Also notice the activeNotes array: I’m storing notes that have already been pressed so that I don’t retrigger a note unless the mouse is pressed again, which is necessary for mouse drags. After the mouse is released, I just send a note-off to every note in the activeNotes array.

I also revised the code that creates the rows of notes across the screen. Previously it was hard coded, but with the following bit of code:

[sourcecode language='java']
public void resetNoteNumber(int rowNumber){
  if (rowNumber == 0){
    noteNumber = noteNameMap.get(startingNote);
    previousNote = noteNumber;
  } else if (rowNumber % 2 == 0){
    noteNumber = previousNote + 3;
    previousNote = noteNumber;
  } else {
    noteNumber = previousNote + 4;
    previousNote = noteNumber;
  }
}
[/sourcecode]

I can just change the starting note field in the class to be whatever I want, and it will always move up the rows in 5ths and 3rds. So I can make my starting note A2 instead of C2, and everything lines up without a hitch and without any code changes:

harmonictablea2

I tested it out with a seriously 80’s-sounding FM pad, dragging across intervals, drawing random chord shapes in honeycomb patterns, etc. Here is a short output:

HarmonicTable Sample

As usual here are the updated files:

HarmonicTable
HexButton
NoteReference

It’s a lot of fun, even though I can only play it with the mouse. The last step is to get a touch screen. Anyone want to donate one?

Processing HarmonicTable: Part 1

Earlier this year while reading Harmonic Experience by W. A. Mathieu, I was introduced to the concept of lattices to represent tones, chords and keys. These lattices can be used to represent the basics of music composition in a visual way that makes more sense than standard scales on staffs. Here is an example:

ChordLattice

The lattice is effectively several staffs of music stacked on top of each other, so that notes can be displayed horizontally and vertically. Starting from any note in the lattice, moving horizontally (diagonally right or left) jumps by a 5th (for example C to G, or G to D). Moving vertically jumps by 3rds (C to E, or G to B). Since nearly all music is the construction of 5ths and 3rds, this system makes it easy to create visual patterns for chords, cadences, scales, etc., that will always look the same no matter where you start on the lattice. In the example above, you can see that major chords are always point-up triangles; minor chords are the opposite, point-down.
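If you want to play with the lattice numerically, the interval arithmetic is easy to sketch. This hypothetical helper (my names, not Mathieu’s) walks the lattice in semitones, with each horizontal step a fifth (7 semitones) and each vertical step a major third (4 semitones):

```java
// Hypothetical lattice arithmetic: horizontal steps are 5ths (7 semitones),
// vertical steps are major 3rds (4 semitones).
public class ToneLattice {

    // MIDI note reached from a starting note after x 5ths and y 3rds.
    public static int noteAt(int startNote, int fifths, int thirds) {
        return startNote + 7 * fifths + 4 * thirds;
    }

    public static void main(String[] args) {
        int c = 60; // C4
        // The "point-up triangle" of a major chord: root, a 3rd up, a 5th over.
        System.out.println(noteAt(c, 0, 0)); // C4 (60)
        System.out.println(noteAt(c, 0, 1)); // E4 (64)
        System.out.println(noteAt(c, 1, 0)); // G4 (67)
    }
}
```

The same three offsets form a major triad from any starting note, which is exactly the “same shape everywhere” property that makes the lattice appealing.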

Until recently this system didn’t translate directly to an instrument. Any chord on a keyboard will have similar but different fingerings and patterns depending on the starting note, so one still has to memorize a ton of different patterns for one scale, chord, or modulation.

A few weeks into reading Harmonic Experience, providence saw fit to lead me to the C-Thru AXiS controller. The AXiS is a harmonic-table-based MIDI controller that creates a table much like the lattice in Harmonic Experience, separating notes not by semitones like a keyboard, but by 5ths and 3rds (and of course many other interrelationships based on these). Here is the layout of the AXiS:

natural_keyboard

If you look, the relationships between the keys become clear. They are reversed from Mathieu’s lattice: 3rds are horizontal (diagonal), and 5ths are always straight up. To play a major triad on the AXiS, just play that triangle pattern starting at any key. C major will have the same pattern as Bb major, etc. There are of course drawbacks to this system (playing inversions, etc.), but for the most part it accomplishes the goal of simplifying the muscle-memory aspects of playing music so the user can concentrate on composition and performance. You learn the pattern for a scale, chord or mode only once, then you may modulate it anywhere without the need to retrain your fingers.

The problem with the AXiS is the price tag. It’s around $1700 from what I can tell, which is a pretty steep entrance fee for a device I may not be able to get used to. They are in the process of creating a cheaper, smaller version, but I want to try it now. The solution was to build one in Processing. I’ve only just started, but combining the MIDI reference classes I’ve been working on with (eventually) a touch screen, I think I should be able to pull off something that works well enough for me to test this thing out.

I started by looking for ways to draw regular hexagons, and came across this site. With some basic trigonometry and a lot of cut-and-try with the vertex functions, I came up with this function:

int length=30;
float a = length/2;
float b = sin(radians(60))*length;
float c = length;

public void drawHex(){
  beginShape();
  vertex(0,b);
  vertex(a,0);
  vertex(a+c,0);
  vertex(2*c,b);
  vertex(a+c,2*b);
  vertex(a,2*b);
  vertex(0,b);
  endShape();
}

This will construct a regular hexagon based on the length of one side.
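If you want to sanity-check the geometry, the bounding box follows directly from those vertex formulas. This throwaway helper (not part of the sketch itself) computes it:

```java
// Quick check of the hexagon geometry: with a = length/2,
// b = sin(60 deg)*length, c = length, the hex spans 2*length wide
// (from vertex (0,b) to vertex (2c,b)) and 2*b tall.
public class HexGeometry {

    public static double width(double length) {
        double a = length / 2, c = length;
        return a + c + a; // leftmost x = 0, rightmost x = 2c
    }

    public static double height(double length) {
        double b = Math.sin(Math.toRadians(60)) * length;
        return 2 * b; // top row of vertices at y = 0, bottom at y = 2b
    }

    public static void main(String[] args) {
        System.out.println(width(30));  // 60.0
        System.out.println(height(30)); // ~51.96
    }
}
```

Knowing the bounding box in terms of one side length is what later lets the whole screen be sized from the single `length` variable.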

After that I used a series of translation matrices to draw them all over the screen:

public void draw(){
rowNumber=0;
setNoteNumber(rowNumber);
for (int j=2; j<20; j++){ pushMatrix(); translate(0+space, (height-(j*(b+space/2)+1))); drawHex(getNextNote()); for (int i=1; i <12; i++){ translate(space+(2*a)+(2*c), 0); drawHex(getNextNote()); } popMatrix(); j++; rowNumber++; setNoteNumber(rowNumber); pushMatrix(); translate(a+c+1.5f*space, (height-(j*(b+space/2)))); drawHex(getNextNote()); for (int i=1; i <12; i++){ translate(space+(2*a)+(2*c), 0); drawHex(getNextNote()); } popMatrix(); rowNumber++; setNoteNumber(rowNumber); } } [/sourcecode] I predefined the number of horizontal hexagons to 12 for each row, this means that STRAIGHT horizontally across a row I will have a perfect chromatic scale. The number of vertical keys I simply copied from the AXiS.After predefining the number of columns and rows, it meant that I could construct the size of the screen based on the length of one side of the hexagon. I also included a variable that allows me to declare an amount of fixed space in between the keys (in case my fingers are too big for the key's surface). One variable, named length, can be changed to create a bigger surface with larger keys. After implementing some simple code to write the note name into the key as well, I was finished with the layout. Here is a snapshot of it directly out of processing: Virtual Harmonic Table

It’s of course much bigger. With the key surface complete, all I have to do now is map the key locations to the MIDI notes they’re associated with, and use some on-press functions to trigger them. After that I just need access to a touch screen. Anyone want to donate one?

Here is the preliminary code to draw the hexagons, as well as the code required to map midi notes:

HarmonicTable
NoteReference

In Part 2 I’ll have MIDI functionality implemented and some testing done.

In Part 3 I’ll have it implemented with a touchscreen, most likely a laptop; at that point I’ll make the rest of the code available.

Java Midi Reference Class

Lately I have been doing some work with Processing to create visuals from music. One of the concepts I’m working with is live visuals based on video feeds from cameras that are tracking the show, or band, or whatever. This alone would be enormously boring, so to heighten the experience, I thought of using the audio output of the show to control features of the video feeds, like playback speed, positions, color and hue, etc. This turned out to be way too processor intensive, and given the properties of sound, would be impossible to synchronize with a live show.

The solution was to use MIDI to trigger these events, rather than having to perform audio analysis first. Using rwmidi and Processing, I started an experiment to see what I can do to video, using MIDI messages as triggers, NoteOn’s as ellipses with alpha masks, or Pitch Bend controllers tracking hue logarithmically. So far these experiments have been successful, but I found myself referring to a MIDI reference constantly.

The problem is that most musicians don’t think in MIDI numbers. Even though the messages I’m using have no musical information, they’re derived from the world of music. So note names like C4 (the note C in the 4th octave) have a MIDI number (60). When I’m creating a drum beat in Ableton, for instance, I don’t look for note number 60, I look for C4. This means that every time I wanted to map this information to an event in Processing, I had to first look up the note number. After a while you memorize them, but that’s lame, and if I leave the project for a while and come back, I’ll have to go back to looking them up again.

I couldn’t find any solution for this in code, so I created a simple Java class to handle MIDI Reference Data:

MidiReference

Which you can download from the link above. Right now it only handles notes and their associated MIDI numbers. The crux of the code consists of two loops:

//Loop through all of the note numbers
for (int noteNumber=0; noteNumber<127;){ //loop through the ranges (-1 through 9) for (Integer range=-1; range<10; range++){ //loop through all of the note names for (String nextNote : noteNames){ //if the note name contains a "flat" then it has the identical note number //as the previous flat, decrement the total note number count and insert into //the map so that note numbers can be translated from either sharps or flats if (nextNote.contains("b")) noteNumber--; noteNameMap.put(nextNote + range.toString(), noteNumber); noteNumber++; } } } [/sourcecode] and [sourcecode language='java'] //Loop through all of the note numbers for (int noteNumber=0; noteNumber<128;){ //loop through the ranges (-1 through 9) for (Integer range=-1; range<10; range++){ //loop through all of the note names for (String nextNote : noteNames){ String noteName; //if the note name contains a "flat" then it has the identical note number //as the previous flat, decrement the total note number count and insert into //the array a string made up of both the previous notes name and the current one if (nextNote.contains("b")){ noteNumber--; String previous = noteNumberArray[noteNumber]; noteName = previous + "/" + nextNote + range.toString(); } else { noteName = nextNote + range.toString(); } noteNumberArray[noteNumber] = noteName; noteNumber++; } } } [/sourcecode] The first loop rolls through all of the potential note names and maps them to a number, in order from 0-127 (all of the MIDI notes). The second does the opposite, so that you can quickly refer to the note name by number (which is something that Processing will find easier to do). If you find this kind of thing useful, feel free to use it. I will expand it from time to time as I add new reference data for myself.

Music to code by

Lately I’ve been having to step back into writing a lot of code. I used to have no trouble at all concentrating on the task at hand, but for some reason focusing these days is tough. It’s probably the vast number of distractions from co-workers needing help with this, that, or the other thing. At any rate, slapping on a Tool album or Captain Beefheart or something like that certainly isn’t going to help matters, so I’ve gone back to my music collection to find some tunes that are “barely there”.

Recommendations:

  • Godspeed You! Black Emperor – Yanqui U.X.O.
  • Pole – 1
  • Basic Channel – Basic Channel
  • Monolake – Momentum
  • Brian Eno – Apollo
  • Pan Sonic – Aaltopiiri
  • Autechre – Tri Repetae++
  • Aphex Twin – Selected Ambient Works Vol I and II
  • Arovane – Tides

All of these albums are pretty linear, with very few quick changes in time signature or tempo. They’d probably work well for relaxing after a rough day at the office. I’ll update this list as I find new tunes to code by.