
Tuesday, August 23, 2011

53ET 5-Limit in Geo Explained


Just Intonation
Just intonations are tuning systems in which pitches follow physically relevant ratios. The mechanics of building instruments has, for a very long time, dictated the use of equal tempered instruments. Without equal temperament, retuning adjustments need to be made when playing in different keys. Equal temperament with a small number of pitches also makes mechanical instruments easier to build, because there are fewer parts.

But with electronic music, the justifications for sticking to only the 12 tone system no longer hold. Now it is just a matter of what people are in the habit of playing and listening to. With the rise of touchscreens as controllers, fretless controllers that can do dynamic intonation will become more common. Pythagoras started out as an attempt to do dynamic intonation. But when I got things working, 53ET emerged on its own as a de facto feature of it. I eventually gave up on actual harmonics-based intonation and just went with 53ET, because it was so close to the real spectrum locations: within the exact pixel in almost all cases.

I don't aim to expand this too much into the realm of academic microtonality. This is designed to be practical. So instead of teaching by generating big lists of floating point numbers and ratios for you to ponder, or bamboozling you with equations, I am building the geometry into an instrument that allows these ideas to be used and tested in practice. There is a little bit of multiplication going on, because that's easier than giving every little ratio a weird name for you to remember.

There is only so much you can do with timbre to make an instrument sound interesting. Correct intonation is the first thing that almost all MIDI based instruments give up on. The pitches need to go in correct; they cannot be fixed later by cleverness in the engine. Two waves in a perfect ratio form a standing wave of a new shape, rather than coming out as two separate notes. The octave is the only interval that we are used to hearing completely correctly, and the fifths and fourths are reasonably close. All the other intervals could be improved by exploring the spectrum outside the normal 12 notes.

Relating Actual Harmonic Geometry to 53ET


I have posted before about the 53 note per octave scale, 53ET, with some lower quality images of its pattern. It is a highly special tuning system that captures the behavior of the spectrum in a practical way. It is explicitly used in Turkish Makam, a classical music system. A subset of this scale shows up in some form just about everywhere that flexible tuning instruments such as violin, sitar, and voice are used. The 12 tone system that we are all familiar with is interesting in its own way, but there is very little about that system that readily connects with the reality of the spectrum in an obvious way. The more you dig into 53ET, the more you feel like you have stumbled upon some ancient Magick. It really is a special system, and is much deeper than the 12 tone system we are used to.

Start with a clean slate. Forget all about the twelve tone scale. Just think about a reference tone at some middle frequency, like a drone note. It doesn't matter what its actual frequency is, because microtonality is about well defined relationships in a relative sense. In the top figure, we have a reference note of 1. It sits in rows of notes that are stacked by pure fourths. Every note emits overtones that can lock with notes above it. So the lines going up out of a note are the overtones that it emits. The lines going down lead to notes that can have an overtone landing on our note. The unmarked line at an angle of about 2 o'clock is actually marked 2, the octave overtone.

The pitch ratios are simple numbers in accordance with the harmonic series, (N+1)/N. That is to say: 2/1, 3/2, 4/3, 5/4, 6/5, ... 16/15, ... 81/80, etc. In this system, moving up a string multiplies the pitch by 4/3, because the strings are tuned a Just fourth apart - and a Just fourth is the ratio 4/3. That happens to be 22 frets of 53ET along the same string, so we treat going straight up as the same as going 22 frets to the right. (It's the same concept as a bass guitar, where going up 5 frets is the same as jumping up to the next string.) The second diagram shows whole tones as the most important unit, roughly two chromatic notes - A to B, for example. I managed to forget to note this fact anywhere in the drawings, but that whole tone is the ratio 9/8 (which is (N+1)/N, so it is itself a special part of the harmonic series).

So the notes in the scale are the ones reachable by some short path along these lines: pick a note and draw harmonics out from it, as the first diagram does. If we start at pitch 1, take the route 4/3 by going straight up, then the route 3/2 by taking that diagonal, these numbers multiply to give us 2. That is the definition of the octave: twice the original frequency. 4/3 is called a "fourth", 3/2 a "fifth", 2 the "octave", and 1 the "root". Note that we can get a semitone of sorts by following 4/3 up and then travelling down by 4/5 to land on 16/15. This means that if you chord the notes 16/15 and 1 together, 16/15 has a major third overtone at 4/3, and 1 has a fourth overtone at 4/3, so they are consonant because they link up by a short path. Notice also that even though we reached 16/15 by navigating, it has the form (N+1)/N, which means that it shows up in the harmonic series if we go far enough out, as does 9/8.

And I do in fact mean to chord two notes that close to each other. An "A" against a Just semitone "Bflat" of 16/15 would sound like trash on a piano, but it's a reasonable chord in this system, because it's 1 versus 16/15 rather than 1 versus 2^(1/12), which isn't very close to any reasonable integer ratio.

So this whole system ignores the really high harmonic ratios and uses only the ones you can create from octave, fourth, and major third relationships. The octave (2) and undertone octave (1/2) are related, as are the fourth (4/3) and fifth (3/2), and the major and minor thirds. What's more, if you factor the ratios into primes, this system is everything you can make out of the numbers 2, 3, and 5: the first three primes. This gives a way to "measure" how closely related two pitches are. There are no note names here, just relative relationships by pitch.
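The "built only from the primes 2, 3, 5" idea can be checked mechanically. This sketch (the function name is mine) tests whether a ratio p/q belongs to the 5-limit system described above:

```c
/* Returns 1 if p/q can be built entirely from the primes 2, 3, and 5
   (the "5-limit"), 0 otherwise. */
static int isFiveLimit(long p, long q) {
    long primes[3] = {2, 3, 5};
    for (int i = 0; i < 3; i++) {
        while (p % primes[i] == 0) p /= primes[i];
        while (q % primes[i] == 0) q /= primes[i];
    }
    return (p == 1) && (q == 1);
}
```

So 16/15 and 81/80 are in the system, while 7/6 (which involves the prime 7) is not.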

The real enlightening moment for me came when I fixed a bug in the drawing that had caused the major and minor thirds to miss the exact 53ET frets: suddenly everything locked together very tightly. I could hear this long before I had the drawing right; I let my wrong math overrule what was obvious to my ears. So, here is an explanation of *why* major and minor triads are special.


When you play a minor third interval: the first whole tone (9/8) is 9 frets up, reached by going up a fifth and back down a fourth, 3/2 * 3/4 = 9/8. From the fifth (3/2), you can take a major third down by 4/5 to reach 1 * 3/2 * 4/5 = 6/5. So the ratio 6/5 is the minor third noted in the harmonic series above. 6/5 has a major third harmonic (5/4) coming off of it, and the root has a fifth (3/2) coming off of it, and they meet at the same point: 6/5 * 5/4 = 3/2.

Similarly, if you play the root and major third together, the major third (5/4) has a minor third harmonic coming off of it (5/4 * 6/5 = 3/2) that meets the same fifth the root's harmonic meets. These two chords are simply stacks of a major and a minor third, in different orders. The three notes together, along with the octave, relate 1/1, 2/1, 3/2, 4/3, 5/4, and 6/5 all simultaneously. They form a rich set of harmonic relationships in that sense. That's why triads are special.



One thing to keep in mind about 53ET, though: it is very close to a complete system capturing all octave, fifth, and major third relationships, but it is not exact in a mathematical sense. It is much closer than the 12 tone system - close enough that only quite large ratios accumulate a discernible error - and it is a very practical approximation to the spectrum.

Note: The scale I am showing above does not use the Pythagorean choice for the third and the sixth. In those cases a semitone is generally going to be 4 frets rather than 5. Pythagorean scales are generally a span of notes that are adjacent in the Just circle of fifths (or fourths - same result), which makes the third and sixth come out one fret flatter. So my scale is wrong if the goal is Pythagorean tuning, but may be acceptable if you favor being closer to Just major and minor thirds. Transposition is easier if the Pythagorean choice is taken. There is a short harmonic path between both choices, so it's really a matter of what you are trying to do. (I am getting authoritative information from somebody who actually plays Makam, and I am trying to make sure that I am doing things right. :-) )


Another interesting thing about 53ET is that it contains the interval that is the difference between a pure major third (5/4) and two whole tones (9/8 * 9/8 = 81/64): the ratio 81/80. This difference creates a lot of controversy in tuning systems. There is a tension between wanting something simple and tunable by ear using only fifths and octaves, and the almost-major third (81/64) that a simple Pythagorean tuning gives. So 81/80 shows up in a lot of places related to tuning, in Indian music, etc. This interval causes situations where the scale may not want to follow the simplest ratios created by stacking fourths and fifths. In a minor-like scale, there will be a tendency to make the minor third, minor sixth, and the seventh one fret sharper than you might guess if you did the geometry without listening to the result.
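For scale: the syntonic comma 81/80 is about 21.5 cents, which is almost exactly one fret of 53ET (1200/53, about 22.6 cents). That is why 53ET can represent both the Pythagorean and the Just choice, one fret apart. A quick check (helper names are mine):

```c
#include <math.h>

/* The syntonic comma: the gap between two whole tones (81/64)
   and a pure major third (5/4), i.e. the ratio 81/80, in cents. */
static double syntonicCommaCents(void) { return 1200.0 * log2(81.0 / 80.0); }

/* One 53ET fret, in cents. */
static double oneFret53Cents(void) { return 1200.0 / 53.0; }
```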


A Note On Accuracy

If you go through this and verify the intervals (as my Makam playing friend did), you will find that you have to be careful about how you interpret the geometric picture. If we assume that going up is multiplying by exactly 4/3, then the path that you compute to an interval will be exact. It's a simple matter of multiplying ratios.

But that doesn't mean that it lands exactly on a 53ET fret. You can tell from actually playing Geo Synth that it is close to a high level of accuracy, and see exactly how far off it is when it isn't exact to the pixel on the screen. When doing all navigation as pure 3-limit (i.e., 2, 1/2, 3/4, and 4/3 are the navigational options, and everything else is derived), you will always get exact numbers, as long as you don't treat the circle of Just fifths as closed: only navigate a span of 53 fifths, without trying to wrap around at the point where the circle almost closes. It still applies, though, that the 53ET frets themselves are an approximation, as 2^(1/53) is an irrational number.
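The near-closure can be quantified. A chain of 53 just fifths overshoots 31 octaves by only about 3.6 cents (known as Mercator's comma), which is why treating the circle as closed is almost, but not quite, right. A sketch, with my own function name:

```c
#include <math.h>

/* Cents by which 53 just fifths (3/2 each) overshoot 31 octaves.
   This is Mercator's comma, roughly 3.6 cents. */
static double mercatorCommaCents(void) {
    return 53.0 * 1200.0 * log2(3.0 / 2.0) - 31.0 * 1200.0;
}
```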

It gets even more complicated when we declare that the interval 00 15 is a minor third and 00 18 is a major third (see the tablature section below). You can have different paths that reach the same fret but don't come out to the exact same ratio. This is ok, because if you subtract one ratio from the other you get a very small number that can be considered zero for all practical purposes; especially because we pick the *shortest* path to determine how a ratio is created.

This problem was proposed to me:

I am at A. Where is F?

... navigate down two whole tones
8/9 * 8/9
= 2^6 * 3^-4

... navigate up four fourths then two octaves back down to get into range

(4/3)^4 * 2^-2
= 4^4 * 3^-4 * 2^-2
= 2^8 * 3^-4 * 2^-2
= 2^6 * 3^-4

These are the same because they use only 3-limit intervals. But when we use paths involving the 5-limit, they don't come out identical. I was given 405/512, so I try to factor it into a path:

405/512 =
810/1024 =
81 * 10 / 1024 =
81 * 5 / 512 =
2^-9 * 3^4 * 5 =
5/4 * 2^-7 * 3^4 =
(3/2)^4 * 5/4 * 2^-3 =
(9/4)^2 * 5/4 * 2^-3 =
(9/8)^2 * 5/4 * 2^-1

up two wholetones, up a maj third, down an octave. But subtracting these intervals:

64/81 - 405/512 = -0.000892168...


They are practically the same value, so both land at the same fret. So if you mix in 5-limit intervals, just don't use high powers of them, and the accuracy will not get out of control. The use of this is that you can do the geometry in your head, and if you do it carefully it will be exact; or you can use the 53ET approximation without the number crunching.
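The two paths to F can be checked directly. This sketch (the function names are mine) multiplies out both routes and confirms that they differ by less than a thousandth:

```c
#include <math.h>

/* Down two whole tones: (8/9)^2 = 64/81, the pure 3-limit route. */
static double pathPythagorean(void) { return (8.0 / 9.0) * (8.0 / 9.0); }

/* Up two whole tones, up a major third, down an octave:
   (9/8)^2 * 5/4 * 1/2 = 405/512, the 5-limit route. */
static double pathFiveLimit(void) {
    return (9.0 / 8.0) * (9.0 / 8.0) * (5.0 / 4.0) / 2.0;
}
```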






Tablature



When I was faced with writing tablature for 53ET, I thought that the idea of notating with large fret numbers was ridiculous. Instead I took a different approach, one that is compatible with 12ET tablature once you get used to it.

The dark blue lines every 9 frets in my grid are the markers for whole tones (9/8). In 53ET, whole tones cannot be split into exact semitones, because each whole tone spans 9 subdivisions. So the first digit denotes which dark blue line (whole tone boundary) you are at, while the second digit denotes how many of the lighter lines above that boundary.

One interesting property of this notation is that if you pretend that these are decimal numbers with a decimal point between the digits and multiply them by 2 (ie: "15" x 2 = 30, thought of as 3.0), the numbers round off into the same locations as their 12ET tablature counterparts.

Ex: 00 06 15 becomes... 0.0 1.2 3.0 ... or 0 1 3 in guitar tab, where it's obvious that the middle note is intended to be sharper. But the notation is completely unambiguous. It tells you exactly what frets to put your fingers on, without making up strange symbols to try to add to the standard music system. (Note: I am targeting guitarists. Almost no guitarists fluently read standard music notation, and it's not really necessary for this.)
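The digit trick can be written out as arithmetic. A two-digit entry with whole-tone digit w and offset digit s sits at 9w + s frets of 53ET, and rounding its 12ET equivalent gives the guitar-tab fret. A sketch (the function names are mine):

```c
/* Total 53ET frets for a two-digit tab entry: whole-tone digit w
   (9 frets each) plus offset digit s. */
static int frets53(int w, int s) { return 9 * w + s; }

/* Nearest 12ET fret for the same entry: 53 frets span 12 semitones. */
static int nearest12(int w, int s) {
    double semis = frets53(w, s) * 12.0 / 53.0;
    return (int)(semis + 0.5);
}
```

For the example "00 06 15": 00 maps to fret 0, 06 to fret 1, and 15 to fret 3, matching the guitar tab 0 1 3.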


Geo Synth (Pythagoras)

So the main idea that got Pythagoras (now called Geo Synth) started was geometry. I wanted to avoid any equal tempered system, or anything artificial that didn't come from the spectrum itself. But when I drew out the 3-limit - the closure of all fourths and fifths, up to some reasonable distance - it became 53ET. It was never my plan to support 53ET; it came to me. And it wasn't until recently that I realized that my ears were right and my drawings for the thirds were wrong, and fixed the drawings. Now the major and minor lines draw straight through 53ET frets. It is now clear that even the ratios 7/6 and 8/7, which are next in the harmonic series, are not too far from 53ET frets as long as you only navigate each of those ratios once.

Here is a quick video picking out some of the obvious intervals:

http://www.youtube.com/watch?v=eViP4cJYvvE

The idea behind the interface was, as explained above, to show all of the harmonics coming off of a note. You line them up until they lock visually; when they lock visually, the sound is obviously consonant.

So I have the normal twelve pitch system drawn over it, as well as quartertones (ie: 24ET, which includes the normal 12 tones).

24ET is roughly the official Arabic intonation system. It doesn't line up very well with the harmonic series. Its main point is to take a core pentatonic scale and avoid playing semitone stretches: whenever you encounter a minor third, the note in the dead center of it (a median second) is what you want to play. These are what I would call "red notes" on a piano: if the piano had red keys where the black keys are missing, between E/F and B/C, they would have to be quartertone "red notes".

I think touch screens will usher in some initial experimentation with these kinds of scales. But after a while, it will sink in that Phrygian and Harmonic Minor scales are bad imitations of the Arabic system (especially Phrygian sharp fourth), and people will start adopting actual Maqam scales for certain scenarios. Then people will realize that with 53ET you can do this, while also getting even better chords than we are accustomed to.




MIDI Instruments on iOS

MIDI Best Practices Manifestos

There is a MIDI Manifesto on Best Practices that is getting a lot of attention at the moment. It is focused on getting developers to write apps that cooperate, so that CoreMIDI works really well on iOS. You can read it here:

http://groups.google.com/group/open-music-app-collaboration/browse_thread/thread/939e4f2998a8bc

This focuses on interoperability issues. It is very much from the perspective of developers playing piano-like instruments to drive traditional synthesizers and beat machines. This is very welcome news, and it could not have better timing for this to happen. Up until a few days ago, I did not know it was even possible to have a sound engine run in the background while a controller runs in a foreground app, communicating over CoreMIDI.

I am not an electrical engineer or a sound engineer, so the mandatory internal engine that I had to write was always something that was an especially time-consuming part of my app that yielded less than ideal results for all the work that went into it. I have always wanted to focus on making a controller with world-class virtuoso playable potential. The achievement above, using virtual MIDI ports, promises to free me up from having the sound engine be such a huge focus in what I do.

This achievement won't matter if the sound engines get the pitch handling wrong. So here are some issues for sound engine creators to think about, which you will encounter as soon as you plug in something that's not a piano.

The Skill Mismatch

Frankly, I think there are a lot of synthesizers on the iPad with great sound, but playability on the device itself seems to be unconquered territory. I don't think it's possible to make a piano-like synth on an iPad that people can consistently play better than a real piano. The small rectangular dimensions defy the layout, and the continuous surface fights against the discrete-key design of the instrument being emulated. A piano is wide and doesn't need to be so tall; as a result, most synthesizers are something like 80% controls and 20% playing area, which makes for a really small octave range. This is behind a lot of the demand for synths to "support MIDI". The truth is, it's mostly so that you can at least plug in a 2 octave keyboard and play at a reasonable level of skill on the thing. It is not so that you can take your iPad and play a piano-like app to drive a Korg Karma. Nobody in the real world does that.

There is also the skill mismatch that bothers me. If you look at any rock band, you will notice something about their makeup. It is usually 4 or 5 guys. The keyboardist is optional, but he shows up more often than he used to. There is always a drummer. There's always a guitar player. There is almost always a bass player. In a lot of cases, there are two guitar players. The ratio of guitar players to piano players is very high. If you look at all the music stores in your area, you will notice a gigantic guitar section, and a smaller piano and keyboard section. I think that there are something like 5 guitar (including bass) players to every piano player.

The electronic instrument industry is out of balance in this regard. They don't get it.

TouchScreens versus MIDI

Here is code that actually implements these ideas (the AlephOne instrument). It works very well, but it pushes all of the complexity of MIDI into a client library, and forces the synth (the server) to be as dumb as possible so that the client can get whatever it wants:


It is no coincidence that iPads and iPhones are posing a challenge to the instrument industry right now. iOS devices are essentially rapid prototyping devices that let you make almost anything you want out of a combination of touchscreen, audio, MIDI, accelerometer, networking, etc. iOS developers are becoming the new instrument manufacturers, or are at least doing prototypes for them at a very high turnover rate.

Multitouch has a unique characteristic of being tightly coupled to the dimensions of human hands. It does away with discrete keys, knobs, and sliders. If you put your hand on a screen in a relaxed position with your fingers close together without touching, you can move them very quickly. Every spot that a user touches should be an oval a little larger than a fingertip. If you make the note spots any smaller, the instrument quickly becomes unplayable. So lining up all the notes beside each other to get more octaves does not work, and the awkward stretch to reach accidentals at this small size is equally unhelpful.

A densely packed grid of squares that are about the size of a fingertip is exactly what string instrument players are used to. So, a simple row of chromatics stacked by fourths is what we want. We can play this VERY fast, and have room for many octaves. It's guitar layout. Furthermore, the pitch handling can be much more expressive because it's a glass surface. We know exactly where the finger is at all times, and frets are now optional, or simply smarter than they were before. An interesting characteristic of a stack of strings tuned to fourths is that there is a LOT of symmetry in the layout. There are no odd shapes to remember that vary based on the key that is being played in.

Transposition is simply moving up or down some number of squares. This even applies to tuning: just play a quartertone lower than you normally would, and you are playing along with something that was tuned differently, without actually retuning the instrument. It is a perfect isomorphic instrument, with a layout that's already familiar to 80% of players. This is the real reason why this layout has a special affinity for the touch screen.

The link below shows this on a *phone*, and it's ridiculously easy to do. I could play at a similar level on Mugician 1.0 only a few weeks after I got it working. The layout matters. It matters more than a dozen knobs and sliders. You can fix a simple sound with external signal processing; but if you can't play fast, or the controller drops all the nuances, or it has latency, then no sound engine can fix that later.

http://www.youtube.com/watch?v=FUX71DAelno&feature=channel_video_title

This layout works, mostly because of the way that pitch handling is done.

Fretless MIDI Messages

Synths that take in MIDI messages only get right the things that a piano exercises; they seem to consistently get everything else wrong in various ways. So, without any more ranting, here is how MIDI must behave on touch screens:

A MIDI note number is defined such that 0 is a very low C note (about 8.18 Hz in standard concert tuning). That low C is the reference frequency, and we will just speak in terms relative to "MIDI note 0":

frequency = c0 * 2^(n/12)

MIDI note 33 is also an A - in standard numbering, A 440 Hz is actually MIDI note 69 - and it has this frequency:

c0 * 2^(33/12)

which works out to 55 Hz.

But what happens when the MIDI note is not a whole number? That's a MIDI note with a bend. If we bend MIDI note 33 up by 1/4 of the pitch wheel, with the default whole-tone (+/-2 semitone) pitch wheel range, we get MIDI note:

33.5 ( frequency = c0 * 2^(33.5/12) )

"A quartersharp" if you must name it. So, when we want this pitch, we send midi messages like:

bend ch1 +25%
on ch 1 33

So imagine for a moment a coordinate system that normalizes all touches to fit in a rectangle from <0,0> at the bottom left corner to <12,6> at the top right corner. This is an instrument with 6 strings stacked in fourths, with the lowest note in the bottom left corner. You don't have frets yet, but you know that at <2.5, 0> you are on the bottom string, half way between fret 2 and fret 3. The pitch is 2^(2.5/12) times the open-string pitch. Every string you go up adds 5 frets, and therefore multiplies by another 2^(5/12). There is a pixel-to-pitch mapping, and it generates the exact frequencies that we are trying to represent as (note, bend) pairs in MIDI.

We bend up to 50% and get an A sharp note....

bend ch1 +25%
on ch 1 33
bend ch1 +50%

Bend up another semitone:

bend ch1 +25%
on ch 1 33
bend ch1 +50%
bend ch1 +100%

So we are up at the top of the bend range. We can't do a vibrato now, because we would exceed the bend width. But if the synth were set up for the bend to mean "percentage of an octave up or down", then we could do this...

bend +0%
on ch1 33
bend +(3/12)

That bends 3/12 of the way up the octave, to the note C (33 + 3 = 36). So now we can do whatever we want, as long as we don't bend a full octave. Because users will simply drop their fingers on the glass and actually drag a finger up an octave, this MUST be supported. This drag CANNOT happen on a piano, which is why this scenario never seems to get tested.

The core problem with MIDI's model of bends is that it assumes one pitch wheel for the whole instrument. You have to bind together multiple monophonic instruments as one if you want every note to have its own pitch wheel. A finger dropped on the glass is a pitch wheel for that finger. These per-finger movements are what give string instruments their special character, and you cannot fix them later in the sound engine after the gestures have been rounded off and lost. Here is an example of two instances of the same note being played and slightly detuned from each other:

bend ch1 0%
bend ch2 0%
on ch1 33
bend ch1 -1%
on ch2 33
bend ch2 +1%
off ch1 33
off ch2 33

This isn't possible on a piano, but it happens all the time on a string instrument. We have two concurrent instances of the same note, bent in different directions, on the same instrument. Here is another challenge, where note overlaps will cause trouble for synths that don't correctly interpret multiple channels:

on ch1 33
on ch2 33
off ch1 33
on ch3 36

A synth that rewrites every event onto the same channel before interpreting gets not only the bends wrong, but this scenario too: we should hear ch2 33 and ch3 36 sounding together as a chord, not just ch3 36 by itself. As for the next scenario - having more notes to play than channels available - just make sure you don't get a stuck note on this:

on ch1 33
on ch1 33
on ch1 33
off ch1 33

This should sound three times with exactly one note, and no stuck note at the end. But this:

on ch1 33
on ch2 33
on ch3 33
off ch1 33
off ch2 33

This will leave ch3 33 still playing. While ch1 and ch2 had their notes on, it wasn't one note either: it was two instances of the sound playing, most likely out of phase with each other.

Channel cycling is a requirement to get independent pitch wheels. Because a note off message isn't actually the end of the note, we have to give notes release time. This is done in the controller, but synth engines should be aware of why it is done: we want to maximize the amount of time that a channel has been dead before we steal it to play a new note.

So when we make a fretless instrument, we spread it across a channel span, such as channels 4-7 for a 4 channel instrument. That allows 4 notes down at once, each with its own independent pitch wheel. It also means that if you play 4 notes per second, old channels get stolen (along with their pitch wheels) at 4 notes per second. So there is a speed limit to playing without anomalies in MIDI, because of the small number of channels.
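A minimal sketch of least-recently-used channel stealing over a span (the span boundaries and data structures here are mine, not the app's real code):

```c
#define SPAN_START 4   /* first MIDI channel in the span */
#define SPAN_SIZE  4   /* a 4 channel instrument: channels 4..7 */

static unsigned long useClock = 0;
static unsigned long lastUsed[SPAN_SIZE]; /* 0 = never used */

/* Steal the channel that has been idle the longest, so that its
   release tail is least likely to be audibly cut off. */
static int allocChannel(void) {
    int best = 0;
    for (int i = 1; i < SPAN_SIZE; i++) {
        if (lastUsed[i] < lastUsed[best]) best = i;
    }
    lastUsed[best] = ++useClock;
    return SPAN_START + best;
}
```

Calling it repeatedly cycles 4, 5, 6, 7, then back to 4, always reusing the channel that has been quiet the longest.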

Chorusing For Real Men

Note that if you are playing fretlessly on an instrument that allows note duplication, you don't need no stinking chorus effect. Just play the line simultaneously at two slightly different pitches:

bend ch1 0%
bend ch2 0%
on ch1 33
bend ch1 -1%
on ch2 33
bend ch2 +1%
off ch1 33
off ch2 33

You can switch between chorusing and chording as you play as well. With microtonality turned on, you can get exact pitch ratios such that the pitches don't sound like distinct notes, but end up being different wave shapes as well.

Note Tie Non Registered Parameter Number

It is in fact possible to bend MIDI notes to any width you want at the fullest possible resolution; the problem is that there is no de facto or de jure standard for how it is done. Imagine a piano player trying to simulate a bend on our channel cycling instrument...

bend ch1 0%
bend ch2 0%
...
bend ch16 0%
on ch1 33
...
off ch1 33
on ch2 34
...
off ch2 34
on ch3 35
...
off ch3 35
on ch4 36
...

So he plays chromatics to simulate the bend, because that's the best he can do. But if the sender inserts bend messages, it can at least bend from one chromatic to the next, like this:

bend ch1 0%
bend ch2 0%
...
bend ch16 0%
on ch1 33
bend ch1 20%
bend ch1 40%
bend ch1 60%
bend ch1 80%
bend ch1 100%
off ch1 33
on ch2 34
bend ch2 20%
bend ch2 40%
bend ch2 60%
bend ch2 80%
bend ch2 100%
off ch2 34
on ch3 35
bend ch3 20%
bend ch3 40%
bend ch3 60%
bend ch3 80%
bend ch3 100%
off ch3 35
on ch4 36
bend ch4 20%
bend ch4 40%
bend ch4 60%
bend ch4 80%
bend ch4 100%

So this would be a smooth bend, except that we hear the note retrigger every time we reach the next chromatic. So let's define a special message that says a note tie is coming, and that it completes when the next note on appears:

bend ch1 0%
bend ch2 0%
...
bend ch16 0%
on ch1 33
bend ch1 20%
bend ch1 40%
bend ch1 60%
bend ch1 80%
bend ch1 100%
tie ch1 33
off ch1 33
on ch2 34
bend ch2 20%
bend ch2 40%
bend ch2 60%
bend ch2 80%
bend ch2 100%
tie ch2 34
off ch2 34
on ch3 35
bend ch3 20%
bend ch3 40%
bend ch3 60%
bend ch3 80%
bend ch3 100%
tie ch3 35
off ch3 35
on ch4 36
bend ch4 20%
bend ch4 40%
bend ch4 60%
bend ch4 80%
bend ch4 100%

We can continue this from the lowest note on the keyboard to the highest for a super-wide bend, at full pitch resolution, because we aren't playing tricks with the MIDI bend width. And if we broadcast this both to a piano that can't bend and to a synth that understands the tie, we get a similar result: it degrades gracefully on the piano, and sounds perfect on the synth that understands. We can use this to track up to 16 fingers at arbitrary pitches (within MIDI range, of course!) bending in whatever wild directions they need.

The NRPN looks like this in our code:

#define TRANSITION 1223

/* Send a Non Registered Parameter Number: controller 0x63 carries the
   MSB of the parameter number, 0x62 the LSB, and controller 6 the value. */
static inline void sendNRPN(int ochannel, int msg, int val)
{
    int lsb = msg & 0x7f;
    int msb = (msg >> 7) & 0x7f;
    midiPlatform_sendMidiPacket3(0xB0 + ochannel, 0x63, msb);
    midiPlatform_sendMidiPacket3(0xB0 + ochannel, 0x62, lsb);
    midiPlatform_sendMidiPacket3(0xB0 + ochannel, 6, val);
}

/* Warn the synth that the next note on for this finger is a tie
   (a legato continuation), then restart the note at the new pitch. */
static inline void retriggerNewMidiNote(int finger, float midiFloat, int vol, int expr)
{
    int channel = midiFingerUsesChannel[finger];
    if (channel >= 0)
    {
        int ochannel = midiChannelOChannelSent[channel];
        sendNRPN(ochannel, TRANSITION, midiChannelNote[channel]);
    }
    stopMidiNote(finger);
    startNewMidiNote(finger, midiFloat, vol, expr);
}


Let us know if there is something unreasonable about that message. I haven't used NRPNs before, and since we write both ends of the exchange, both could be 'wrong' and still work just fine between our synths.


What Is It Useful For

There is a very practical use for this: violin! Oud! You could even do an accurate rendition of the human voice to MIDI without auto-tuning it. Most real-world instruments exhibit this, because the spectrum itself is microtonal: the exact 2^(n/12) adjustments can't actually be achieved in practice on an acoustic instrument. Acoustic instruments are real-world resonating bodies after all; they resonate at whole number ratios and have real harmonics. Acoustic pianos often use a stretched octave tuning to deal with this problem.

This opens up the door to rendering music outside the 12 tone tempered traditions as well. MIDI should be a rendering format that lets you do what you mean, without injecting its own prejudgment of what should be disallowed. MIDI itself, like auto-tune, seems to be one of the key factors keeping electronic music sounding more machine-made than acoustic instruments do. There has been a lot of progress in getting good timbre out of instruments, but that is meaningless if you round off all the nuances in pitch that really give an acoustic instrument its character.

I am working on a new project, and if you are already a tester or jailbroken, then you can download it here (it's very minimal; the purpose right now is more to produce reusable code than to ship something):

http://rfieldin.appspot.com

Here is a similar post that puts this in perspective with more general iOS instruments of this kind:

http://rrr00bb.blogspot.com/2011/09/multitouch-midi-recommendation.html

Friday, August 19, 2011

53ET Geometry fix in Geo Synth (Pythagoras)

This thing is getting very close to being shipped. Kevin Chartier (Jordan Rudess' programming partner at Wizdom) is doing a lot of user interface changes. We are working on at least making a stable, feature-frozen version that is shippable in principle by a well defined date, and will either ship that version or keep going with an update.

The entire time I have been using 53ET in this synth, there was a minor bug in the geometry, now fixed. The consequence was that even though 53ET emerged from the fourths and fifths without being programmed in (or even known the first time I saw it!), the major and minor thirds conspicuously didn't line up exactly with any of the 53ET frets - despite the fact that the major fret slightly flat of 12ET, and the minor fret slightly sharp of 12ET, sounded astonishingly like correct Just intervals.

Now that this minor numerical fix is in, you can easily navigate visually while playing to discover the Just intervals that 53ET approximates so closely. The same numerical problem also affected the 12ET tuning very slightly - something I noticed while working on the MIDI support, but had written off as floating point inaccuracy.

This is kind of like discovering an exact representation for Pi and having all of the goofy engineering inaccuracies disappear so that you can actually understand what you are looking at. Now that this is in, major and minor triads actually form well defined triangles visually. I should think about making those an explicit part of the visualization at some point. Playing around with the spectrum rather directly like this is a bit of a game on its own.

There will be more to come. I will ship this in the store soon, hopefully with Wizdom as planned; but some version of it will ship no matter what.