Randy G's Pool School Review

12squared said:
I don't know Scott, Randy, or their school; however, they are very well-respected instructors, so I thought I'd chime in on the "if you can't beat me, you can't teach me" discussion.

I happily continue to learn from players of all skill levels.

Dave

Thanks Dave! tap, tap, tap!...and we continue to learn from our students too!:D

Scott Lee
www.poolknowledge.com
 
Here is an idea that might help your school. There are a few ways to determine if your school is worth the effort.

1. Collect information from your students on the first day: what leagues they play in, what their handicaps are, how many balls they can run in various games, how often they practice now, etc.

2. Give them something like the Hopkins test.
3. Teach as usual.

4. At the conclusion of the school, re-test with the Hopkins test (one of several measures that could be used).

5. Wait three months and send a follow up evaluation form that asks how many hours the student used your methods, number of hours played, changes in handicaps, etc. Have them use the Hopkins test and report the results to you. Determine, from them, where they learned the most and the least.

6. Follow up again in six months and one year.

Some of the measures would be better than others (for various reasons). In general, it is possible to determine what benefits your students derive. While there would be some exaggeration, and some who did not comply, if there are 150 students (over say six months to a year) a reasonable determination could be made. This could be used for advertising if the results are as expected.

The number of full data sets controls the number and kinds of analyses that could be conducted. With a minimum of 30 students you could at least give a description of the results obtained on a preliminary basis. This might include the level of student ability required, and the amount of gain that could be expected.
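To make this concrete, here is a minimal sketch (in Python) of the kind of pre/post summary that 30 or more complete data sets would support. The student records, field names, and numbers below are hypothetical placeholders, not results from any school.

```python
from statistics import mean, stdev

# Hypothetical student records: Hopkins-test scores taken on day one,
# at the end of the school, and at the three-month follow-up.
students = [
    {"name": "A", "hopkins_pre": 120, "hopkins_post": 118, "hopkins_3mo": 141},
    {"name": "B", "hopkins_pre": 95,  "hopkins_post": 99,  "hopkins_3mo": 123},
    {"name": "C", "hopkins_pre": 160, "hopkins_post": 150, "hopkins_3mo": 172},
]

def summarize(records, before, after):
    """Report the average gain (after - before) and its spread."""
    gains = [r[after] - r[before] for r in records]
    return {
        "n": len(gains),
        "mean_gain": round(mean(gains), 1),
        "sd_gain": round(stdev(gains), 1) if len(gains) > 1 else 0.0,
    }

# Immediate post-school change (may well be negative while a new stroke beds in)
print(summarize(students, "hopkins_pre", "hopkins_post"))
# Change measured at the three-month follow-up
print(summarize(students, "hopkins_pre", "hopkins_3mo"))
```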

There is a specialty in the social sciences called "Program Evaluation." Before I retired, I taught graduate students working on a master's degree how to use these techniques. You can Google the term and learn much about this field.

If you and your colleagues would like some assistance determining the usefulness, strengths, and weaknesses of your teaching methods I would be glad to help (no fees involved). My only requirement is that the evaluation would become public if I analyzed the data. If you or your group analyzed the data then it would be your decision to make public or not.

PM and we can discuss it further if you are interested.
 
JoeW said:
Here is an idea that might help your school. There are a few ways to determine if your school is worth the effort.

1. Collect information from your students on the first day: what leagues they play in, what their handicaps are, how many balls they can run in various games, how often they practice now, etc.

2. Give them something like the Hopkins test.
3. Teach as usual.

4. At the conclusion of the school, re-test with the Hopkins test (one of several measures that could be used).

5. Wait three months and send a follow up evaluation form that asks how many hours the student used your methods, number of hours played, changes in handicaps, etc. Have them use the Hopkins test and report the results to you. Determine, from them, where they learned the most and the least.

6. Follow up again in six months and one year.

Some of the measures would be better than others (for various reasons). In general, it is possible to determine what benefits your students derive. While there would be some exaggeration, and some who did not comply, if there are 150 students (over say six months to a year) a reasonable determination could be made. This could be used for advertising if the results are as expected.

The number of full data sets controls the number and kinds of analyses that could be conducted. With a minimum of 30 students you could at least give a description of the results obtained on a preliminary basis. This might include the level of student ability required, and the amount of gain that could be expected.

There is a specialty in the social sciences called "Program Evaluation." Before I retired, I taught graduate students working on a master's degree how to use these techniques. You can Google the term and learn much about this field.

If you and your colleagues would like some assistance determining the usefulness, strengths, and weaknesses of your teaching methods I would be glad to help (no fees involved). My only requirement is that the evaluation would become public if I analyzed the data. If you or your group analyzed the data then it would be your decision to make public or not.

PM and we can discuss it further if you are interested.
I like this idea, with the exception of one item you have listed. I would eliminate the Hopkins test at the end of school. This wouldn't really prove anything; there's no way you could really improve over 3 days. More than likely, you would score less because of learning a new stroke. You actually take a couple of steps back before stepping forward, but I do like the idea of collecting data and measuring skill. I take the Hopkins test every day and document it. Instead of running 50 racks to get my point average, I run 10 racks and multiply it by 5.
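A quick worked example of that shortcut (hypothetical numbers): if those 10 racks add up to 38 points, the estimated 50-rack total is 38 × 5 = 190.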
 
I forgot to mention that many other professionals are also trained in program evaluation, especially in engineering.

Documenting an initial decrement in performance at the end of the teaching program is also useful as it would help students to better understand the process.

Some of the better treatment programs in psychology initially show a decrement in performance and this is an indicator that the treatment worked. It is not common but it is known. For those interested see Sarason's work on Video Modeling Therapy.
 
S.A.M. is also just as incredible. I, like many people, never really had a pinpoint aiming system. I used ghost ball to figure the contact point and then went off feel from there. Having an exact point of aim and, over time, dialing it in will definitely give me an opportunity to improve my game.

I have a bad feeling about asking this question, but what's "S.A.M."?

pj
chgo
 
Patrick Johnson said:
I have a bad feeling about asking this question, but what's "S.A.M."?

pj
chgo
It stands for Supplemental Aiming Method, but it is used as the primary aiming method for many. Do a search and you will find a few threads on the subject. There was one just started in the last week.
 
I searched "S.A.M." and "SAM" and found nothing. Can you be more specific, please?
 
JoeW said:
Here is an idea that might help your school. There are a few ways to determine if your school is worth the effort.

1. Collect information from your students on the first day: what leagues they play in, what their handicaps are, how many balls they can run in various games, how often they practice now, etc.

2. Give them something like the Hopkins test.
3. Teach as usual.

4. At the conclusion of the school, re-test with the Hopkins test (one of several measures that could be used).

5. Wait three months and send a follow up evaluation form that asks how many hours the student used your methods, number of hours played, changes in handicaps, etc. Have them use the Hopkins test and report the results to you. Determine, from them, where they learned the most and the least.

6. Follow up again in six months and one year.

Some of the measures would be better than others (for various reasons). In general, it is possible to determine what benefits your students derive. While there would be some exaggeration, and some who did not comply, if there are 150 students (over say six months to a year) a reasonable determination could be made. This could be used for advertising if the results are as expected.

The number of full data sets controls the number and kinds of analyses that could be conducted. With a minimum of 30 students you could at least give a description of the results obtained on a preliminary basis. This might include the level of student ability required, and the amount of gain that could be expected.

There is a specialty in the social sciences called "Program Evaluation." Before I retired, I taught graduate students working on a master's degree how to use these techniques. You can Google the term and learn much about this field.

If you and your colleagues would like some assistance determining the usefulness, strengths, and weaknesses of your teaching methods I would be glad to help (no fees involved). My only requirement is that the evaluation would become public if I analyzed the data. If you or your group analyzed the data then it would be your decision to make public or not.

PM and we can discuss it further if you are interested.

Joe...No discussion needed. We already do many of the things you mentioned. If a student needs to "see if the school is worth the effort", then they are not ready for the program. Nuff said...

Scott Lee
www.poolknowledge.com
 
Patrick Johnson said:
I have a bad feeling about asking this question, but what's "S.A.M."?

pj
chgo


Hi Patrick, long time no chat.

S.A.M. is the fixed aim point of the ghost ball along with additional math info.....SPF=randyg
 
skill assessment?

mattman said:
I like this idea, with the exception of one item you have listed. I would eliminate the Hopkins test at the end of school. This wouldn't really prove anything; there's no way you could really improve over 3 days. More than likely, you would score less because of learning a new stroke. You actually take a couple of steps back before stepping forward, but I do like the idea of collecting data and measuring skill. I take the Hopkins test every day and document it. Instead of running 50 racks to get my point average, I run 10 racks and multiply it by 5.



OK... I'll bite, what's the "Hopkins test"?
 
Lol

Barbara ... Jimmy Caras? You're showing your age!!! ... LOL. I was 14 when I saw Jimmy Caras. He is the one who lit my 'Pool fire' almost 46 years ago.

I am not a certified instructor, but I have taught in the past, mostly when I was younger. I taught kids in Houston many years ago, while working in a Pool room and going to the University of Houston. Being a good instructor is about knowledge: how to communicate that knowledge to a wide variety of students in a manner they can understand, and how to enable them afterwards to come up with solutions to future problems they may have.

I still think of my 2 prize students, a curly-headed 10-year-old and a shoulder-length, straight-haired 11-year-old. I held a tournament at the end of the lessons, and these 2 were in the finals. The 11-year-old ran a 3-pack on the full-sized tables and the 10-year-old ran a 2-pack. I was so proud of both of them. The 11-year-old won the tournament.

I have taught adults too, mostly hit and miss with guys, more concentrated with women players. Part of being a good instructor or teacher has to do with having the right demeanor, personality, and temperament, and many top players do not have what it takes to be a good instructor. Some might, like Allison.

Seems to me there is something about teaching someone to fish instead of just feeding them. Well, that is what a good instructor does: he doesn't just tell you how to win the one game, he teaches you how to win many games in the future.

We all remember that one teacher back in school that we learned a great deal of knowledge from, not only about the subject, but life itself, and that inspired us to be more than we were back then.

Taking lessons from a good instructor is an investment in yourself. I have learned a great deal about life by playing Pool, and in turn I have taken things from other areas of life and applied them to my Pool game, like my computer knowledge about systems. And I always enjoy meeting and making friends with all the people in Pool.

If you want to be really good in Pool, if that is your dream, then do it the right way with a good instructor(s) because when you get old, you won't regret what you did, you will only regret what you didn't do.
 
ugotactionTX said:
OK... I'll bite, what's the "Hopkins test"?


It's a single player practice game consisting of 10 innings, each starting with a full 8-ball style break of 15 balls. In each inning you run out the balls like in straight pool (in any order) until there are five left on the table. The last five are run in rotation order. This is also known as the Q-skill test.
 
ugotactionTX:
OK... I'll bite, what's the "Hopkins test"?

mikepage:
It's a single player practice game consisting of 10 innings, each starting with a full 8-ball style break of 15 balls. In each inning you run out the balls like in straight pool (in any order) until there are five left on the table. The last five are run in rotation order. This is also known as the Q-skill test.

A better version, IMO, is a game called FARGO, invented by Professor Mike himself (who's from Fargo, ND, along with Sheriff Frances McDormand).

FARGO is played the same way as Q-Skill, but at some point (shooter's choice) you switch from shooting any ball to shooting rotation - you get 1 point for each ball pocketed before switching and 2 points for each ball pocketed after switching (maximum rack score = 30).
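To make the scoring concrete, here is a rough sketch (not an official scorer; the ball counts are hypothetical):

```python
# Rough sketch of scoring one FARGO rack, using only the rules above:
# 1 point per ball pocketed before switching to rotation, 2 points per ball after.
def fargo_rack_score(balls_before_switch, balls_after_switch):
    # With all 15 balls made after the switch, the score hits the 30-point maximum.
    assert balls_before_switch + balls_after_switch <= 15, "only 15 balls in a rack"
    return balls_before_switch * 1 + balls_after_switch * 2

print(fargo_rack_score(6, 4))   # 6 + 8 = 14
print(fargo_rack_score(0, 15))  # maximum rack score: 30
```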

It's a very cool idea and a great internet competition game - The Professor used to referee them for competitors on RSB. Maybe he could be talked into refereeing one for AZB one of these days (sorry, Mike)...

pj
chgo
 
Patrick Johnson said:
A better version, IMO, is a game called FARGO, invented by Professor Mike himself (who's from Fargo, ND, along with Sheriff Frances McDormand).

FARGO is played the same way as Q-Skill, but at some point (shooter's choice) you switch from shooting any ball to shooting rotation - you get 1 point for each ball pocketed before switching and 2 points for each ball pocketed after switching (maximum rack score = 30).

It's a very cool idea and a great internet competition game - The Professor used to referee them for competitors on RSB. Maybe he could be talked into refereeing one for AZB one of these days (sorry, Mike)...

pj
chgo


Gee thanks Pat....

Actually "Fargo" is going to be the subject of my next youtube thingy -- in a week or two...

Check your pm's Pat.
 
Randy, maybe the reason you are losing to your students is that you spend so much time on the golf course.

I completely agree with the original post. I cannot say enough about the instruction I received from Randyg, Scott Lee, and Joe Tucker last summer.
 
kaznj said:
Randy, maybe the reason you are losing to your students is that you spend so much time on the golf course.

I completely agree with the original post. I cannot say enough about the instruction I received from Randyg, Scott Lee, and Joe Tucker last summer.


Have not touched a golf club in 6 months (shame); school has been every day. No time to play pool or golf. Woe is me..:-)....SPF=randyg
 
okay ....

Here's a question. Assuming a student had a chance to attend either RandyG's traveling school or Scott Lee's traveling school, and could only attend one, what should he look at to choose which one? Are there shortcomings that one school addresses better than the other? Assume money is not the question, nor the time or location of the school.
 