Tuesday, 15 March 2016
mimi disrupts my hearing
I thought I'd try that Mimi music system, after reading about it on the FAQ blog a couple of days ago.
It starts with a hearing test, which I flunked completely the first time. I followed the instructions and plugged in a pair of Sennheiser headphones; the iPhone app even showed a little picture of Sennheiser headphones before I started the test.
Then it pumped out a couple of test tones at 500 Hz. Left ear or right? I couldn't hear anything. I waited and, after about a minute, pressed the 'heard it' button anyway. No, it said, you pressed too early.
I restarted the test. This time I waited a couple of minutes. Apparently no tone, but it still let me go on to the next signal. Same pattern. My hearing must have been really bad, as I hadn't heard any of the tones. Another long wait.
I abandoned the test until later.
Second attempt: I swapped out my fancy Sennheisers. I've always liked Sennheiser headphones since the time I first tried HD414s and couldn't believe how amazing the sound could be from such a plasticky-looking headset. My original HD414s died long ago, when one of the membranes cracked, but I've stuck with Sennheisers pretty much since that time.
However, for this test I'm using the Apple EarPods. The type that come with the iPhone.
I re-ran the test and this time it worked. Mimi were quite serious about the software only working with certain types of earpiece. This time I could hear the various tones in each ear, although in the gaps I could also hear a distant Radio 4 broadcast from a clock-radio.
And so I was presented with my findings. My left ear showed worse hearing than my right. Something I already knew. The frequency range wasn't too bad, although by my age one should expect some drop-off in higher frequencies.
At a practical level, one can make charts that show the types of impact hearing loss has across the spectrum, and it's something that the Mimi technology aims to rectify, currently for music playback.
It would be interesting for Mimi to produce a customised chart similar to the one I illustrate above, with left and right ear showing the approximate positioning and sensitivity of a person's hearing. I expect it will follow.
In the short term, Mimi provide a kind of hidden equalisation and compression algorithm to drive music listening from the iPhone. I tried it, using their selected track from my iPhone, which was "Don't Know Why" by Norah Jones. The Mimi effect was noticeably different: more immersive, with some of the detailing enhanced. I noticed that the vocals also had boosted treble, which created a harshness that wasn't in my original experience of the song. There was a control to reduce the level of the effect, and I found that rolling off some of that vocal boost produced a more natural sound.
However, when I played it a little longer I could also hear a kind of 'banding' effect across the whole track: something like a rolling lower-frequency pulse that wasn't part of the music. No, it wasn't my heartbeat or anything like that; it made me think the software was struggling slightly to keep up with the processing. Either that, or a 'Pay' button was eventually going to pop up to make me splash some cash.
The phone I used for the test was an iPhone 5s, and I wonder whether the processing is actually quite compute-intensive, perhaps with active processing across multiple bands of equalisation and compression. I haven't used it for long enough to cross-check this via battery use or heat from the phone, but I noticed that the FAQ review mentioned battery life as a challenge, and I think that was with an iPhone 6.
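If Mimi really is running several bands of equalisation with compression on each, the per-sample arithmetic adds up quickly, which would square with those battery-life comments. As a rough illustration only (the band count, gains, threshold and ratio below are invented, not anything published by Mimi), the per-band work might look something like this:

```python
def db_to_gain(db):
    # convert decibels to a linear amplitude factor
    return 10 ** (db / 20)

def compress(level, threshold=0.5, ratio=4.0):
    # simple static compressor: above the threshold,
    # the excess is reduced by the ratio
    if level <= threshold:
        return level
    return threshold + (level - threshold) / ratio

def personalise(band_levels, band_gains_db):
    # hypothetical per-band chain: boost each band by a gain
    # derived from the hearing test, then compress it so the
    # boosted bands don't run away
    return [compress(level * db_to_gain(gain))
            for level, gain in zip(band_levels, band_gains_db)]

# e.g. three bands, with the high band boosted by 6 dB to
# offset a measured high-frequency drop-off
print(personalise([0.2, 0.3, 0.25], [0.0, 0.0, 6.0]))
```

Doing that boost-and-compress step continuously, per band, across the audible spectrum is exactly the sort of workload that would keep a 5s-era processor busy.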
It is well-known that hearing sensitivity drops as one gets older.
Back in analogue days, a couple of tone controls would make the necessary adjustments: Bass and Treble. The circuit would reduce the whole signal by about 20 dB and then put back some of the frequencies.
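That lose-then-restore trick is easy to see in numbers. This is a toy sketch, not a model of any real circuit: the whole signal takes about 20 dB of insertion loss, and each control can hand its own band back up to unity.

```python
def db_to_gain(db):
    # convert decibels to a linear amplitude factor
    return 10 ** (db / 20)

def tone_stack(bands, bass_db=0.0, treble_db=0.0):
    # toy model of a passive tone circuit: everything loses
    # roughly 20 dB of insertion loss, then the Bass and
    # Treble controls (0 to 20 dB here) add gain back to
    # their own bands; an amplifier stage after the circuit
    # would make up the rest
    loss = db_to_gain(-20.0)
    restore = {"bass": db_to_gain(bass_db),
               "mid": 1.0,
               "treble": db_to_gain(treble_db)}
    return {name: amp * loss * restore[name]
            for name, amp in bands.items()}

# with both controls fully up, bass and treble come back to
# their original level while the mids sit 20 dB down
flat_input = {"bass": 1.0, "mid": 1.0, "treble": 1.0}
print(tone_stack(flat_input, bass_db=20.0, treble_db=20.0))
```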
The Mimi ideas certainly improve the space inside the listening experience, assuming the processing can properly keep up.
I'm pretty certain the real trajectory of this technology is towards hearing support. Like the old tone circuits, analogue hearing aids also boost and then tweak the frequencies in a fairly basic way. The more expensive ones add filter ranges, more akin to what Mimi appears to be doing.
I'd previously wondered whether an iPhone could be used as an aid to hearing, and this seems to be a step in that direction, presumably with the phone as a low-energy (Bluetooth Smart?) controller and a new type of dedicated processing in ear-mounted hardware. Piezoelectric sensor, programmable amplifier, analogue-to-digital converter, reconfigurable decimation filter, waveform creator? Something like that - similar block schematics to something Line 6 would make to simulate guitar sounds, but much, much smaller.
Interestingly, there's a whole ton of legislation related to actual hearing aids, which are classed as medical devices and which means that folk like the Mimi developers are having to tread somewhat carefully. Let's hope their footsteps are heard by the right people.