Think of bugs as noise energy on a continuous spectrum, where frequency corresponds to the type of bug.
When you write software, you generate an initial signal whose noise content depends on what you are doing and on your ability (awareness of different types of bugs, and so on).
Then, during further stages of testing, these bugs tend to be filtered out. Some types of bugs are attenuated very efficiently. For example, a bug that prevents your application from compiling has very little chance of being shipped...
Some bugs show up as signal only under certain circumstances. For example, users run the software through a detector with different characteristics from the one the developer used.
You don't generally want to invest in attenuating all frequencies. Instead, you ignore the frequencies that don't matter (those that pass through and cause no harm) and focus on the harmful ones.
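The filtering model above can be sketched as a toy simulation. Every stage name, bug type, and attenuation factor here is invented purely for illustration; the point is only to show how a chain of imperfect filters shapes the noise that finally ships.

```python
# Toy model: each bug "frequency" (bug type) carries some noise energy,
# and each testing stage attenuates certain frequencies more than others.
# All numbers, stage names, and bug types are hypothetical.

# Initial noise energy injected by the developer, per bug type.
initial_signal = {
    "syntax": 1.0,           # code that won't compile
    "logic": 1.0,
    "race_condition": 1.0,
    "ui_glitch": 1.0,        # a mostly harmless frequency
}

# Each stage is a filter: the fraction of energy that passes through.
stages = [
    ("compiler",    {"syntax": 0.0, "logic": 1.0, "race_condition": 1.0, "ui_glitch": 1.0}),
    ("unit_tests",  {"syntax": 1.0, "logic": 0.2, "race_condition": 0.9, "ui_glitch": 1.0}),
    ("integration", {"syntax": 1.0, "logic": 0.5, "race_condition": 0.6, "ui_glitch": 0.8}),
]

def shipped_noise(signal, stages):
    """Apply each filter stage in sequence and return the residual energy."""
    out = dict(signal)
    for _name, attenuation in stages:
        for bug_type in out:
            out[bug_type] *= attenuation[bug_type]
    return out

residual = shipped_noise(initial_signal, stages)
# The compiler stage attenuates syntax errors completely, so none ship.
print(residual["syntax"])          # 0.0
# Race conditions pass through these filters far better than logic bugs.
print(residual["race_condition"])  # 0.54
```

Note that in this sketch no stage filters `ui_glitch` aggressively: that is the frequency we have decided causes no harm, so spending filter budget on it would be wasted effort.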
In the end, the best way to reduce the output noise of the system is usually to reduce the input noise (i.e., the signal produced by the developer).
Since the beginning of computer software (and computer hardware), solutions that were not reliable enough for practical purposes have been discarded.