Earlier, I wrote that many hearing aid users end up with too much gain at 1-2 kHz, and that may explain why we all hate the sound of our hearing aids so much.
When you plot an audiogram of threshold elevations versus frequency on a logarithmic Hz axis, it tends to overemphasize the 1 kHz transition region. Plot that same audiogram on a Bark frequency axis and the exaggeration is not as great.
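If you want to try the comparison yourself, here is a minimal Python/matplotlib sketch (not the code behind my figures). It converts Hz to Bark with the Zwicker & Terhardt (1980) approximation, and the threshold numbers in it are invented purely for illustration:

```python
import numpy as np
import matplotlib.pyplot as plt

def hz_to_bark(f_hz):
    """Zwicker & Terhardt (1980) approximation of the Bark critical-band scale."""
    f = np.asarray(f_hz, dtype=float)
    return 13.0 * np.arctan(0.00076 * f) + 3.5 * np.arctan((f / 7500.0) ** 2)

# Standard audiometric test frequencies (Hz) and made-up threshold elevations
# (dB HL) with a steep drop around 1-1.5 kHz; these are NOT real measurements.
freqs_hz = np.array([250, 500, 1000, 1500, 2000, 3000, 4000, 6000, 8000])
thresholds_db = np.array([10, 15, 25, 55, 60, 65, 70, 75, 75])

fig, (ax_log, ax_bark) = plt.subplots(1, 2, figsize=(10, 4), sharey=True)

# Log-Hz axis: the 1-1.5 kHz region gets relatively little horizontal room,
# so a threshold jump there reads as a steep "cliff".
ax_log.semilogx(freqs_hz, thresholds_db, "o-")
ax_log.set_xlabel("Frequency (Hz, log axis)")
ax_log.set_ylabel("Threshold elevation (dB HL)")
ax_log.invert_yaxis()  # audiogram convention: worse hearing plotted downward

# Bark axis: critical-band spacing gives that same region comparatively more
# horizontal room, so the identical data reads as a gentler slope.
ax_bark.plot(hz_to_bark(freqs_hz), thresholds_db, "o-")
ax_bark.set_xlabel("Frequency (Bark)")

plt.tight_layout()
plt.show()
```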
Here’s an example:
Notice that cliff at 1-1.5 kHz? It looks really grim for my hearing.
But now, re-plot that same data in Bark frequency space:
The cliff is gone! What we actually see are some sloppy threshold-elevation estimates. And I do mean sloppy. A repeat trial with the same audiology group on the next visit had me completely flunking my hearing exam. They decided that I was now profoundly impaired, and they tuned my hearing aids up accordingly. It sounded completely awful, and I went back and demanded that they restore the aids to their previous settings.
Audiometry claims to be accurate to ±5 dB. Hmm… maybe on a good day… Think about it: you are taking impaired hearing and testing it to within 5 dB? At extreme levels of impairment, that is a pipe dream. I would accept that accuracy claim for measurements of unimpaired hearing.
But anyway, back to the original topic. I think you can see for yourself, given that 1 kHz cliff, why so many audiologists tend to overcorrect at 1-2 kHz.
- DM
(to be fair… I have tinnitus ringing off the hook! When I did the retest with the audiology group, I wasn’t sure what I should be listening for — steady tones? beeping tones? So I was pressing the button like crazy, since my tinnitus dominated everything in that soulless isolation booth. I guess the audiologist did the best she could, and rated me an F- on my hearing exam.
So why do we insist on testing impaired persons with threshold-level sounds?)