Code-wise, the app could be accurate (and from a coder's myopic point-of-view, his/her code is always "perfect," right?). But the issue is not with the code itself. It's with the equipment -- the cell phone itself:
1. That mass-produced cell phone's wafer microphone, with no emphasis or engineering on sound quality or capture at all.
The quality of the sound is pretty much irrelevant. All that matters is that the microphone is capable of recording a relative peak in the sound. Digital audio programs (on which the app is based) have extremely accurate time clocks. The quality of the analog-to-digital conversion *might* be a factor, but again, even a crappy converter has more than enough resolution to record the important information.
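To illustrate why mic quality barely matters, here is a rough sketch (in Python) of the kind of calculation such an app has to do. This is NOT the app's actual code; the function name, the peak-detection threshold, and the distance value are assumptions purely for illustration. All it really needs is the sample positions of two loud transients -- cue-tip strike and rack impact -- plus the cue-ball-to-rack distance the user enters:

    # Rough sketch, not the app's real implementation. Only the *timing* of
    # two amplitude peaks matters, not how pretty the recording sounds.
    import numpy as np

    def break_speed_mph(samples, sample_rate_hz, distance_ft):
        """Estimate break speed from the two loudest transients in a recording.

        distance_ft is the cue-ball-to-rack distance the user supplies.
        """
        envelope = np.abs(samples)
        # First big transient: cue tip hitting the cue ball.
        hit_idx = int(np.argmax(envelope > 0.5 * envelope.max()))
        # Second big transient: cue ball hitting the rack. Skip ~50 ms after
        # the first hit so we don't just re-detect it.
        dead = hit_idx + int(0.05 * sample_rate_hz)
        rack_idx = dead + int(np.argmax(envelope[dead:]))
        elapsed_s = (rack_idx - hit_idx) / sample_rate_hz
        fps = distance_ft / elapsed_s          # feet per second
        return fps * 3600 / 5280               # convert to mph

Even a tinny phone mic gives you those two spikes, and the sample clock takes care of the timing.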
2. Audio compression -- all cell phones use it. Audio compression is a LOSSY compression (meaning, it loses information *by design* -- just like picture/image compression [e.g. JPEG, GIF]). A cell phone *has* to use compression to accommodate different sound levels in different acoustic environments -- from inside small quiet enclosures like a restroom, all the way on up to a concert hall or noisy street.
I think you might be confusing compression as it relates to data recording with compression as it relates to sound. The microphone on most cell phones certainly uses BOTH when making calls. However, there is no reason to believe that plain audio recordings are subject to either. Regular audio compression is *probably* applied, since I doubt the dynamic range of the little cell condenser mic is very much, but even that is not certain.

Audio compression (in which the amplitude input/output follows a curve or line that is not a 1:1 ratio) would have no bearing on the app's accuracy unless the compression were so extreme that all amplitudes were reduced to near zero, which is obviously not the case. Data compression (in which the analog input from the microphone is converted to digital at a very low sampling rate) *might* have a bearing, but again, not unless it were extreme, which it isn't. When you refer to "lossy" compression, you are referring either to the analog-to-digital conversion process itself, or to a dithering/downsampling step that reduces the bit depth and/or sampling rate after the signal has already been converted to digital.
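A quick back-of-the-envelope check on why the sampling rate is not the limiting factor. The numbers here are assumptions for illustration only: a 4.2 ft cue-ball-to-rack distance and a 20 mph break. Even at telephone-quality 8 kHz, a peak landing one sample late barely moves the result:

    # Worst case: a peak is detected one sample off.
    distance_ft = 4.2
    true_fps = 20 * 5280 / 3600            # 20 mph in ft/s
    travel_s = distance_ft / true_fps      # ~0.143 s of ball travel

    for rate_hz in (8_000, 44_100):
        jitter_s = 1 / rate_hz             # one sample of timing error
        measured_fps = distance_ft / (travel_s + jitter_s)
        error_mph = (true_fps - measured_fps) * 3600 / 5280
        print(f"{rate_hz} Hz: timing error {jitter_s*1000:.3f} ms -> "
              f"speed error ~{error_mph:.3f} mph")

That works out to roughly 0.02 mph of error at 8 kHz and about 0.003 mph at 44.1 kHz, so sampling resolution is noise compared to the other factors on this list.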
3. Placement of the cell phone in relation to the sound.
This is not significant. Do the math: sound covers 5 feet in roughly 4.4 milliseconds, so it is essentially irrelevant. In any case, it is extremely easy to overcome this "problem" by simply placing the phone the same distance from the cue ball and the rack. Either way, remember how little time we are talking about for sound to travel 5 feet.
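For anyone who wants that math spelled out (same illustrative 4.2 ft / 20 mph assumptions as above): the worst case is the phone sitting right next to one sound source and about 5 feet from the other, so one peak arrives late by the sound's travel time over those 5 feet. Placing the phone equidistant from the cue ball and the rack cancels the offset completely.

    SPEED_OF_SOUND_FPS = 1125              # ft/s in room-temperature air (approx.)
    offset_s = 5 / SPEED_OF_SOUND_FPS      # ~0.0044 s worst-case arrival skew

    distance_ft = 4.2                      # illustrative cue-ball-to-rack distance
    travel_s = distance_ft / (20 * 5280 / 3600)   # ~0.143 s at a 20 mph break
    skewed_mph = distance_ft / (travel_s + offset_s) * 3600 / 5280
    print(f"worst-case offset: {offset_s*1000:.1f} ms, "
          f"reading {skewed_mph:.1f} mph instead of 20.0")

Worst case you read about half a mile per hour low; with the phone equidistant from both sounds the error is zero.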
4. What you mentioned about inherent error in "placement of the cue ball" in the app, in preparation for the sound of the break itself, is huge. Over the short distance on the table, a placement error of 1 inch can result in a significant discrepancy in the calculated speed.
This is the only item on the list so far that can potentially matter. And it can matter a LOT. There are some techniques for getting pretty accurate. I've played around with this, and if you can at least be within a ball's width, you are only talking about a few tenths of a mph.
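To put a number on the placement error in point 4 (again using the illustrative 4.2 ft / 20 mph setup, which is an assumption, not the app's actual default): the measured time is fixed by the audio, so the reported speed scales linearly with whatever distance the app thinks the cue ball traveled.

    true_distance_ft = 4.2
    travel_s = true_distance_ft / (20 * 5280 / 3600)   # time the audio will measure

    for error_in in (0.5, 1.0, 2.0):
        entered_ft = true_distance_ft + error_in / 12  # user's distance is off by this much
        mph = entered_ft / travel_s * 3600 / 5280
        print(f"{error_in:.1f}\" placement error -> app reports {mph:.1f} mph")

An inch of sloppiness is a few tenths of a mile per hour at break speeds, which is why this is the one input worth being careful about.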
5. Speed guns "flaky"? Sure, if they are not calibrated at regular intervals like they are supposed to be. (And in our financially constrained times, it's not uncommon for speed guns to miss their periodic recalibrations because the owner is pinching pennies.) We've all heard the stories about speeding tickets being thrown out of court because the defendant "had the speed gun tested, and it measured an apple resting on the desk as 'moving at 20 mph.'" The fact is, while that may have happened long ago (and even *that* is debatable), today those are definitely urban legends.
-Sean
Don't forget that police radar guns are expensive -- like thousands-of-dollars expensive. A really nice sports radar gun that is even capable of reading to 0.1 mph accurately is close to $1k. So my old $300 radar gun is not equal to, say, a $1000 Stalker gun that can read tenths...
Also, I'm not sure if you have personal experience using a radar gun, but the person who said they are "flaky" probably does. I say this because in my own personal experience, there is "flakiness" even in a perfectly calibrated gun. Lighting, acquisition speed, etc. can all make the reading hit or miss (see my other post). However, when you *do* get a reading, it is usually going to be within the accuracy tolerances of the gun, but NOT always.
Hope this helps,
KMRUNOUT