14 or 10 bits. Which one is it?
Hey guys! There has been a bit of confusion about the bit depth of the controller which I’d like to clear up. In some messages I’ve mentioned 14 bit depth, while in others I mentioned the controller has a precision of 10 bit. Which one is true? Well, they’re both true. Let me explain (I’ll try not to make this too technical, but it is a technical story): Back in the 1980s when MIDI was first invented, hardware was not as fast as it is nowadays. For MIDI to be experienced as real-time, its creators had to make some concessions. So for CC (control change) messages, they used only 7 bits for the value, to save space and have the messages arrive faster.
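To make that concrete, here's a minimal sketch of what a single MIDI 1.0 CC message looks like on the wire: three bytes, with the controller number and the value each limited to 7 bits. The function name `make_cc` is just something I made up for illustration.

```python
def make_cc(channel, controller, value):
    """Build a raw MIDI 1.0 Control Change message: one status byte
    (0xB0 plus the channel number) followed by two data bytes,
    each of which may only use 7 bits (0-127)."""
    assert 0 <= channel <= 15
    assert 0 <= controller <= 127  # only 7 bits available
    assert 0 <= value <= 127       # only 7 bits available
    return bytes([0xB0 | channel, controller, value])

msg = make_cc(0, 7, 100)  # CC 7 (channel volume) on channel 1, value 100
```

So with a single CC message, 128 different values is all you get.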
So, what if you want higher precision than 7 bits (which only gives you 128 different values)? The only way to do that in MIDI 1 is to send two separate CC messages, one carrying the upper 7 bits and one the lower 7, which together add up to 14 bits. There is no in between. In my case, the microcontroller I’m using (the heart of the MIDI controller, its processor) can read the faders with 10 bit precision (1024 different values). The only way to send over 10 bits of data using CC is to use 14 bits (2 × 7 bits).
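The pairing above follows the MIDI 1.0 convention that CC numbers 0–31 carry the coarse (MSB) part and CC numbers 32–63 carry the fine (LSB) part of the same parameter. A rough sketch of the whole trip from fader reading to message pair could look like this (the function name is hypothetical, not my actual firmware):

```python
def fader_to_cc_pair(raw10, controller, channel=0):
    """Stretch a 10-bit ADC reading (0-1023) across the 14-bit CC
    range (0-16383), then split it into two 7-bit halves sent as a
    pair of CC messages: CC n carries the MSB, CC n+32 the LSB."""
    assert 0 <= raw10 <= 1023
    value14 = round(raw10 * 16383 / 1023)  # map 0..1023 onto 0..16383
    msb = value14 >> 7                     # upper 7 bits
    lsb = value14 & 0x7F                   # lower 7 bits
    return (bytes([0xB0 | channel, controller, msb]),
            bytes([0xB0 | channel, controller + 32, lsb]))

coarse, fine = fader_to_cc_pair(1023, 1)  # fader fully up on CC 1
```

Note that only 1024 of the 16384 possible 14-bit values actually get used; the receiving end can't tell the difference, it just sees 14 bit CC data.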
That’s why I use both terms. Why not use a microcontroller that can do 14 bits, you may ask? While that may be a very good idea if you’re controlling, for instance, pitch bend (which does have 14 bit precision in standard MIDI), for mixing you won’t hear the difference between value 50 and 51 when using 10 bits. You can hear that difference (albeit barely) when using only 7 bits. Take that knowledge and combine it with the fact that a 10 bit ADC (Analog to Digital Converter, which is responsible for translating the fader positions into digital values on the microcontroller) is way cheaper than a 14 bit one, and I have made my choice.
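Some quick back-of-the-envelope numbers on why 10 bits is enough for a fader. If we assume the fader covers a 60 dB range (that range is my assumption here, purely for illustration), one step at each bit depth works out to:

```python
# Assumption for illustration: the fader spans a 60 dB gain range.
FADER_RANGE_DB = 60

step_7bit = FADER_RANGE_DB / 127    # ~0.47 dB per step: borderline audible
step_10bit = FADER_RANGE_DB / 1023  # ~0.06 dB per step: far below audibility
```

Roughly half a dB per step at 7 bits versus a few hundredths of a dB at 10 bits, which matches what you hear in practice.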
So long story short: I use 14 bit MIDI CC messages to transfer 10 bit values. In the future, I will probably send out an update to the controller with MIDI 2 support. More on that later.