bamamarquettefan

Quote from: MerrittsMustache on May 12, 2011, 07:16:23 PM
Told you.


Disagree - I don't get at all upset when some people don't like equations.  I've been getting emails and calls from around the country asking for the spreadsheet all day (the Vandy guys in particular got pretty excited, of course), so I'm glad a lot of people like rankings like this, but I'm not bothered that other people have no interest at all.

It's not for everyone, and by now you probably know that if a post pops up from me, it may focus on statistics.  As I pointed out in the post leading up to it though, the history is that the teams that started using stats that really measured how many games players were winning for them (best example the Oakland A's) started beating better-funded teams that didn't.  This goes way back - the Dallas Cowboys' early Super Bowls came right after they became the first team to use spreadsheets to evaluate all incoming talent, teams hired Bill James to determine which players should be in the majors, etc.  But obviously there are other people who are very good at watching a player, seeing his potential, and knowing which ones can take it to another level - so I believe it's smart to draw from all perspectives when making decisions for a program.

On the rating 100 players vs. 2500 players, I appreciate anyone worried about me spending too much time on this stuff, but the fact is once you are in the spreadsheet it takes just as long to rank 5 players as 2500.  The hard part is making sure the formula works, but once it does, you copy the formula down the column and sort the players by the results.
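To make the "copy the formula down the column" point concrete, here is a minimal sketch in Python instead of a spreadsheet. The column names and the toy value formula are placeholders for illustration only, not the actual Value Add calculation:

```python
# A minimal sketch of the "copy the formula down, then sort" workflow.
# The column names and the toy value formula below are placeholders,
# not the actual Value Add calculation.

def value_score(player):
    """Toy rating: offensive rating weighted by share of possessions used."""
    return player["off_rating"] * player["poss_pct"] / 100.0

players = [
    {"name": "Player A", "off_rating": 118.0, "poss_pct": 24.0},
    {"name": "Player B", "off_rating": 102.0, "poss_pct": 28.0},
    {"name": "Player C", "off_rating": 111.0, "poss_pct": 19.0},
]

# Whether the list holds 5 players or 2,500, the steps are identical:
# apply the same formula to every row, then sort by the result.
ranked = sorted(players, key=value_score, reverse=True)
for rank, p in enumerate(ranked, start=1):
    print(rank, p["name"], round(value_score(p), 1))
```

The work is in getting the formula right; once it is, ranking 5 players or 2,500 is the same sort.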
The www.valueaddsports.com analyses of basketball, football and baseball players are intended to be neither too hot nor too cold - hundreds immerse themselves in studies of stats not of interest to broader fan bases (too hot), while others still insist on pure observation (too cold).

bamamarquettefan

Quote from: Ari Gold on May 12, 2011, 03:09:03 PM
Davante is pretty proud of himself
http://twitter.com/#!/DGardner_54/status/68690773956235264
just found out I was ranked 403 in the NCAA out of 2500 players....this season coming is going to be crazy haha got something up my sleeve

Thanks for passing that on! I'm not a tweeter, but I'm happy that Davante saw it and liked it.  The fact is his offensive numbers are pretty unbelievable for a guy who got so few minutes.  Hope we see a bit more of him on the court next year.
The www.valueaddsports.com analyses of basketball, football and baseball players are intended to be neither too hot nor too cold - hundreds immerse themselves in studies of stats not of interest to broader fan bases (too hot), while others still insist on pure observation (too cold).

brewcity77

Quote from: bamamarquettefan on May 13, 2011, 02:41:15 AM
Thanks for passing that on! I'm not a tweeter, but I'm happy that Davante saw it and liked it.  The fact is his offensive numbers are pretty unbelievable for a guy who got so few minutes.  Hope we see a bit more of him on the court next year.

Nice...that confirms 2 players are reading CS - Crowder tweeted about an article on there too :)

Dr. Blackheart

Quote from: bamamarquettefan on May 13, 2011, 02:31:41 AM

On the rating 100 players vs. 2500 players, I appreciate anyone worried about me spending too much time on this stuff, but the fact is once you are in the spreadsheet it takes just as long to rank 5 players as 2500.  The hard part is making sure the formula works, but once it does, you copy the formula down the column and sort the players by the results.


Great stuff--especially in the off-season. Since you have so much time, perhaps you could reveal what the data said coming into the 2010-11 season as validation; that would help convince the non-quant jocks?   :D

We have had this argument many times on this board about the advanced stats...with coaches like Buzz and Brad Stevens highly reliant on the stats, and others like Calhoun relying solely on their experience.  The technology to capture and quantify/scout teams' tendencies is leading-edge (an ESPN video of a game is digital, so it can be converted to numbers), and it has allowed the up-and-comers to close the gap/gain an edge over the old salty dogs--and has allowed mid-majors and 11th-place conference teams to advance deep.

Who's right?  Both

MerrittsMustache

Quote from: bamamarquettefan on May 13, 2011, 02:31:41 AM
It's not for everyone, and by now you probably know that if a post pops up from me, it may focus on statistics.  As I pointed out in the post leading up to it though, the history is that the teams that started using stats that really measured how many games players were winning for them (best example the Oakland A's) started beating better-funded teams that didn't.

Which better funded teams did the A's beat when it mattered? In their 5 "Moneyball" trips to the Playoffs, they won exactly one series and that was against Minnesota. They also lost a series to Minnesota as well as to the Yankees twice and the Red Sox and Tigers once. Granted, they won a lot of regular season games but when it came to playing the big boys in the postseason, they failed.

Personally, I'm not opposed to Sabermetrics and other basketball-specific statistical data, but I do feel that it is overvalued in today's sports world.

Marquette84

Quote from: Dr. Blackheart on May 13, 2011, 07:45:50 AM
Great stuff--especially in the off-season. Since you have so much time, perhaps you could reveal what the data said coming into the 2010-11 season as validation; that would help convince the non-quant jocks?   :D

If the data is predictive with any sort of accuracy, it would also be interesting to apply it to incoming recruits.

Then we'd be able to know if we have any legitimate chance at beating St. John's over the next few years.

Quote from: Dr. Blackheart on May 13, 2011, 07:45:50 AM
We have had this argument many times on this board about the advanced stats...with coaches like Buzz and Brad Stevens highly reliant on the stats, and others like Calhoun relying solely on their experience.  The technology to capture and quantify/scout teams' tendencies is leading-edge (an ESPN video of a game is digital, so it can be converted to numbers), and it has allowed the up-and-comers to close the gap/gain an edge over the old salty dogs--and has allowed mid-majors and 11th-place conference teams to advance deep.

If true, the implications of this would be interesting.  Good coaching has always been defined as some alchemy of recruiting skill, personality and experience.

If coaching can be distilled down to decisions based on numbers and formulae,  it would create some interesting issues.

1.  Are the formulae and stats that Buzz is developing and using as MU's coach his own intellectual property, or do they belong to MU?

2.  If the data belongs to MU (or Butler in the case of Stevens), it would seem to decrease the importance of who the coach is, as long as they can apply the formulas appropriately when recruiting and coaching.

3.  What happens once everyone buys into the theory?  Is the current success of teams like Butler and MU a result of being the college basketball equivalent of the Oakland A's under Beane--experiencing a transitory success until the old guard catches up?  Or a permanent advantage that will last as long as Stevens or Buzz are the program's respective coaches?


If nothing else, if these theories become more widely accepted I would think school athletic departments would have to become much more involved in the day-to-day coaching activities and data ownership.  The university-owned assets that drive success are no longer limited to databases of donors, ticket sales data, and top-notch facilities.  Player data would seem to be just as important.

Dr. Blackheart

Quote from: Marquette84 on May 13, 2011, 08:18:29 AM
If the data is predictive with any sort of accuracy, it would also be interesting to apply it to incoming recruits.

Then we'd be able to know if we have any legitimate chance at beating St. John's over the next few years.

If true, the implications of this would be interesting.  Good coaching has always been defined as some alchemy of recruiting skill, personality and experience.

If coaching can be distilled down to decisions based on numbers and formulae,  it would create some interesting issues.  I will leave this to BAMA

1.  Are the formulae and stats that Buzz is developing and using as MU's coach his own intellectual property, or do they belong to MU?  A service that MU can customize. The whole court is digitized. I will try and find a write-up to share.  When you hear Buzz say "# of paint touches" or "our opponents score 63% out of a time out or 78% off an in-bounds under our own basket," he is quoting these facts.

2.  If the data belongs to MU (or Butler in the case of Stevens), it would seem to decrease the importance of who the coach is, as long as they can apply the formulas appropriately when recruiting and coaching. True, it is an edge now but it will be commonplace tomorrow, so how do you stay ahead of the game or the technology--the same issues as in 1977, just different specifics.

3.  What happens once everyone buys into the theory?  Is the current success of teams like Butler and MU a result of being the college basketball equivalent of the Oakland A's under Beane--experiencing a transitory success until the old guard catches up?  Or a permanent advantage that will last as long as Stevens or Buzz are the program's respective coaches?  I think both.  Stats can help you exploit match-ups or lessen disadvantages. Stats definitely helped MU gameplan for Xavier.  They knew the exact spots Holloway liked to shoot from and receive the ball, how many times he shot off a curl, and how often he created his own shot off the dribble and where.  Yet they also had a match-up advantage in JFB to shut him down.  Mo Acker could have known all the tendencies as well, but he could not have stopped Tu.

On the A's, drug testing may have been a bigger equalizer in the end, but Beane and his disciples (Epstein, et al.) have continued to do well.  Old dogs like La Russa have adapted.  Sweet Lou?  Not at all, which is why he is out the door and retired.



If nothing else, if these theories become more widely accepted I would think school athletic departments would have to become much more involved in the day-to-day coaching activities and data ownership.  The university-owned assets that drive success are no longer limited to databases of donors, ticket sales data, and top-notch facilities.  Player data would seem to be just as important.


Dr. Blackheart

Here is the NYT article I have quoted before: 

http://query.nytimes.com/gst/fullpage.html?res=9A0DE2DF1331F937A15750C0A9679D8B63

Marquette uses Synergy Sports Technology (link below).  Pretty cheap, considering--$5000-$7500. 

Quote
Programs like Virginia Commonwealth, Ohio State, Marquette and Butler use KenPom's statistics as a complement to Synergy Sports Technology, a video scouting system that has become popular in the past five years. (The full Synergy package costs $5,000 to $7,500 each year.)

Synergy is so advanced that the Butler assistant coaches Matthew Graves and Micah Shrewsberry said they could take a player like Old Dominion guard Kent Bazemore, whom they faced in their first game of the N.C.A.A. tournament, and find out that he drives to the right 75 percent of the time (despite being left-handed).

The Synergy statistics back up what is shown in the clips. With a few clicks, Butler coaches can watch clips of every post move on the right block that Wisconsin forward Jon Leuer made this season and how many times he turned over his left and right shoulder.


http://www.synergysportstech.com/

bamamarquettefan

Quote from: MerrittsMustache on May 13, 2011, 08:16:40 AM
Which better funded teams did the A's beat when it mattered? In their 5 "Moneyball" trips to the Playoffs, they won exactly one series and that was against Minnesota. They also lost a series to Minnesota as well as to the Yankees twice and the Red Sox and Tigers once. Granted, they won a lot of regular season games but when it came to playing the big boys in the postseason, they failed.

Personally, I'm not opposed to Sabermetrics and other basketball-specific statistical data, but I do feel that it is overvalued in today's sports world.

First off, I think the key fact is that every major league team adopted most of Moneyball once the book was out, and I don't believe every one of them was wrong.

But the point is they were making the playoffs over lots of teams that had much more money and many more of the old pros to work with.  Once you're in the baseball playoffs, I think it's a crapshoot - you're talking about playing 4 to 7 games vs. playing 162.  I believe I saw at one point that the team with the better record going into a playoff series won something like 55% of the time, but I'd have to look it up - it may have been over a stretch of time.
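Just to put rough numbers on the crapshoot point, here is a quick sketch assuming a flat 55% per-game edge for the better team (a plug number for illustration, not a measured figure) and independent games:

```python
from math import comb

def series_win_prob(p, wins_needed):
    """Probability the favorite takes a best-of series, given a fixed
    per-game win probability p and independent games."""
    max_games = 2 * wins_needed - 1
    total = 0.0
    # Favorite wins in exactly g games: it wins game g, plus
    # (wins_needed - 1) of the first (g - 1) games.
    for g in range(wins_needed, max_games + 1):
        total += comb(g - 1, wins_needed - 1) * p**wins_needed * (1 - p)**(g - wins_needed)
    return total

p = 0.55  # assumed per-game edge, for illustration only
print(round(series_win_prob(p, 3), 3))  # best-of-5: ~0.593
print(round(series_win_prob(p, 4), 3))  # best-of-7: ~0.608
```

Even a 55%-per-game favorite takes a best-of-7 only about 61% of the time, which is why 162 games sorts out quality far better than one October series.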

Certainly your final sentence is valid, and I'll even argue the exceptions against myself.  I believe every program has to decide how much WEIGHT to put on Sabermetrics.  A good friend of mine, Riki Ellison, came out of USC and travelled to the Cowboys' camp with Roger Craig to go through all the drills behind the spreadsheets Dallas had been using for several years to get better players than the competition.  After they ran their numbers, they called them both in to tell them they would never play in the NFL.  Well, Riki showed me his 3 Super Bowl rings from his days as a starting linebacker for the 49ers, when they took out the Cowboys.  That's clearly an example of someone getting so wrapped up in stats that they overemphasize them and completely disregard the heart of a hard-working player - one who was just below the cut in strength, speed, and whatever else they measured then, but who was going to succeed.

But the folks in scouting departments who went to the other extreme of disregarding sabermetrics were unemployed before too long.  When new evidence is discovered that says, "you know, your favorite player with the nice .292 batting average who never draws a walk or gets an extra base is killing you by eating up your first base position," and you insist the stat guys don't know baseball, then you are going to start losing games pretty soon and be out of a job.  So I'm not disagreeing with your first post completely, but EVERY team in EVERY sport has had to start using sabermetrics as a certain percentage of their evaluation process, because the personnel who didn't started losing games.  Everyone adopted Moneyball after it came out - no one still weights batting average heavier than On-Base Percentage + Slugging Percentage, because it is demonstrable that a team with a much higher OBP+SLG will kill an opposing team with a much higher BA over the course of a season.

On the question of predicting future seasons, that is never as precise, of course.  With thousands of test cases every year, you can verify that Oliver's estimates add up and measure how much a player contributed.  Many predictive models have worked pretty well in most cases, but obviously there are guys with huge potential who could be much better than they were last year (Yancy Gates).  So you can't say this list is really a predictor other than to say, "If only the players returning next year had played last year, then here is how the seasons would have turned out."  I'm sure there are people on this list who played through bad injuries and are much better players than what they produced last year, but the figures I'm listing are a very accurate account of what they produced for their teams last year, so they at least provide a nice starting point.

My next study is to take all of the seniors from last year who played 4 straight years at the same school and see what the typical progression is between each year.  If you're looking for prediction, that is probably a better indicator.  On the much simpler Win Credits system, it does appear that 5-star players make huge jumps after their freshman year, and 4-stars after their sophomore year, but this will be a much bigger pool and therefore much more accurate as far as "predicting" next year.
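For anyone curious how that progression study could be set up, here is a rough sketch; the field names and numbers are made-up placeholders, not my actual data:

```python
from collections import defaultdict

# Hypothetical records: one rating per season for players who spent
# four straight years at the same school. All values are placeholders.
seasons = [
    {"player": "Player A", "year": 1, "rating": 1.2},
    {"player": "Player A", "year": 2, "rating": 2.9},
    {"player": "Player A", "year": 3, "rating": 3.4},
    {"player": "Player A", "year": 4, "rating": 4.0},
    {"player": "Player B", "year": 1, "rating": 0.8},
    {"player": "Player B", "year": 2, "rating": 1.1},
    {"player": "Player B", "year": 3, "rating": 2.6},
    {"player": "Player B", "year": 4, "rating": 3.1},
]

# Group each player's ratings by class year.
by_player = defaultdict(dict)
for row in seasons:
    by_player[row["player"]][row["year"]] = row["rating"]

# Average the jump at each transition (Fr->So, So->Jr, Jr->Sr).
jumps = defaultdict(list)
for ratings in by_player.values():
    for yr in (1, 2, 3):
        if yr in ratings and yr + 1 in ratings:
            jumps[yr].append(ratings[yr + 1] - ratings[yr])

for yr in sorted(jumps):
    avg = sum(jumps[yr]) / len(jumps[yr])
    print(f"Year {yr} -> {yr + 1}: average jump {avg:+.2f}")
```

The averages by transition would be the "typical progression" figures to apply when projecting returning players.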

In my mind, the balance I strike is:

1. partly this objective measurement, which (even if you disagree with it) is unbiased - I can't move someone up or down because I like him more or less, and it applies the same criteria to every player whether or not I can see him play, balanced with

2. what people much smarter than me think about the potential of players - where they were rated by the scouting services and where they are predicted to go on NBA draft boards.

Some of you may be much better at simply watching a player and seeing the potential - that's just not my talent so I stay focused on the piece of the puzzle I'm good at and leave the rest to folks who have actually coached or played at high levels.
The www.valueaddsports.com analyses of basketball, football and baseball players are intended to be neither too hot nor too cold - hundreds immerse themselves in studies of stats not of interest to broader fan bases (too hot), while others still insist on pure observation (too cold).
