Drillroom App

AF pool guy

AzB Silver Member
Silver Member
They've got a lot of drills, and the augmented reality for camera setup and success tracking is pretty good. That said, I haven't ponied up for the $40/year subscription yet.


Sent from my iPad using Tapatalk
 

buckshotshoey

AzB Silver Member
Silver Member
Agreed on that last part. I think I would have to try it first, maybe through a limited trial offer, before I paid $40 a year for it. My problem is they don't have an app for Android, which doesn't make any sense... I believe there are far more Android phones out there than Apple ones.
 

iusedtoberich

AzB Silver Member
Silver Member
I was a beta tester a few months ago. I don’t think it would be for anyone above a banger level, based on the beta. I know they have improved the AI since then, but I have not tried it again.
 

PoolStats

Pool Stats LLC
Silver Member
Agreed on that last part. I think I would have to try it first, maybe through a limited trial offer, before I paid $40 a year for it. My problem is they don't have an app for Android, which doesn't make any sense... I believe there are far more Android phones out there than Apple ones.
Apple's Vision ML framework is much more advanced than Google's Firebase ML Kit. Not only is Apple Vision easier to implement, it can track 16 simultaneous objects, whereas Firebase ML Kit can only handle 5. In other words, a maximum of a 15-ball drill versus a maximum of a 4-ball drill. Or, looking ahead, enough to track an entire 8-ball game on iOS.

I'm unsure how many simultaneous objects can be tracked with the TensorFlow Lite SDK, but I'm almost positive it's not many more than Firebase ML Kit's. One could use the Android NDK to build the full TensorFlow suite into the app, which would handle more objects and offer better classification than Apple Vision; however, Android AI chipsets aren't as advanced as the iPhone's, so that route would be a bigger drain on the battery and lag in performance by comparison.

TL;DR: Android AI sucks in comparison to Apple Vision.
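For anyone curious what that 16-object tracking looks like in code, here's a rough Swift sketch: one VNTrackObjectRequest per ball, fed frame by frame through a VNSequenceRequestHandler. To be clear, the BallTracker class and its method names are just mine for illustration; this isn't anything from the Drillroom app.

```swift
import Vision
import CoreGraphics
import CoreVideo

// Rough sketch (not Drillroom's code): one VNTrackObjectRequest per ball,
// updated frame by frame through a single VNSequenceRequestHandler.
final class BallTracker {
    private let sequenceHandler = VNSequenceRequestHandler()
    private var requests: [VNTrackObjectRequest] = []

    /// Seed one tracker per detected ball, e.g. from an initial detection pass.
    func begin(tracking balls: [VNDetectedObjectObservation]) {
        requests = balls.map { observation in
            let request = VNTrackObjectRequest(detectedObjectObservation: observation)
            request.trackingLevel = .accurate // favor accuracy over frame rate
            return request
        }
    }

    /// Feed each new camera frame; returns the updated normalized bounding boxes.
    func update(with frame: CVPixelBuffer) throws -> [CGRect] {
        try sequenceHandler.perform(requests, on: frame)
        return requests.compactMap { request in
            guard let observation = request.results?.first as? VNDetectedObjectObservation
            else { return nil }
            request.inputObservation = observation // carry tracking state forward
            return observation.boundingBox
        }
    }
}
```

Each tracker carries its own state, which is why the per-handler cap matters: a 15-ball drill plus the cue ball uses all 16 slots.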
 

buckshotshoey

AzB Silver Member
Silver Member
Apple's Vision ML framework is much more advanced than Google's Firebase ML Kit. Not only is Apple Vision easier to implement, it can track 16 simultaneous objects, whereas Firebase ML Kit can only handle 5. In other words, a maximum of a 15-ball drill versus a maximum of a 4-ball drill. Or, looking ahead, enough to track an entire 8-ball game on iOS.

I'm unsure how many simultaneous objects can be tracked with the TensorFlow Lite SDK, but I'm almost positive it's not many more than Firebase ML Kit's. One could use the Android NDK to build the full TensorFlow suite into the app, which would handle more objects and offer better classification than Apple Vision; however, Android AI chipsets aren't as advanced as the iPhone's, so that route would be a bigger drain on the battery and lag in performance by comparison.

TL;DR: Android AI sucks in comparison to Apple Vision.
I welcome the education.
 

kling&allen

AzB Gold Member
Gold Member
I was a beta tester a few months ago. I don’t think it would be for anyone above a banger level, based on the beta. I know they have improved the AI since then, but I have not tried it again.

Like the drills were too basic? Or the ball tracking was so limited it could only provide simple functionality?
 

iusedtoberich

AzB Silver Member
Silver Member
Like the drills were too basic? Or the ball tracking was so limited it could only provide simple functionality?
Both. The drills were very basic, short straight-in type stuff. I made pretty much every shot. I see now, though, that they have the whole Dr. Dave test on it, so that should be improved.

Second, if I shot the ball hard, it sometimes wouldn't register that the ball went in and would count it as a miss.

It was fun to experiment with, and cool to see the AI tracking working. I could see this improving as time goes on. It might be much improved from when I tried it a few months ago as well. I just lost interest…
 

buckshotshoey

AzB Silver Member
Silver Member
Both. The drills were very basic, short straight-in type stuff. I made pretty much every shot. I see now, though, that they have the whole Dr. Dave test on it, so that should be improved.

Second, if I shot the ball hard, it sometimes wouldn't register that the ball went in and would count it as a miss.

It was fun to experiment with, and cool to see the AI tracking working. I could see this improving as time goes on. It might be much improved from when I tried it a few months ago as well. I just lost interest…
Is there any chance that it could be made customizable so you can input your own drills? And is there a way to make corrections when it gets something wrong, kind of like you can adjust the Predator Break Speed app when it gets it wrong?

I can see the advantage to this app. It could keep a record of your drill progress automatically, without you having to write anything down.
 

SlateMan

Registered
This is really interesting and might be a great alternative to the projector-type systems.

Phones can now share their screen to a nearby TV. So you could mount this, add a mouse to your phone via Bluetooth, share your phone screen to a nearby TV, and not be limited to looking at it on the small screen. You could use the mouse to interact with the phone. (Well, I know iPads don't like Bluetooth mice; I'm not sure about iPhones. My Android can have a full keyboard and mouse attached wirelessly.)

This would also work if you could send the phone screen to a projector pointed at the table, like the other projector-based systems, except this system could actually read the table via the phone camera and then interact with the AI. Of course, the software would have to distinguish between what the projector was projecting onto the table and what is actually on the table.

Now if I could only get Jennifer's voice saying "Split the Wicket" out of my head. I did not like that. If she had sounded either more excited, or said it in a slower, softer, sexier voice, well, that might have worked for me. :)
 