Top-100 Prospect List Showdown: Conclusions and System Rankings

Why Sluggerrr? Why Not? (Photo Credit: Denny Medley-US PRESSWIRE)

On Sunday I finished up the final two divisions (articles) in my Top-100 Prospect Showdown miniseries. I was going to leave it at that, but around the time I was transitioning from the AL to the NL, I realized they needed to be tied together in some fashion. After all, I'm a huge fan of context, and it didn't seem right to limit the comparisons to the Top-100 lists of teams within each division.

To start things off, I wanted to stack the divisions up next to one another using two numbers. The first is the total number of players from each division to make a Top-100. The second is the total number of lists those players made. So a player who was a unanimous selection across all six sources contributed 6 points, while a player listed on just one Top-100 contributed just 1 point to the final count.

I thought this would be an interesting data element to include, as a consensus Top-100 player has more value, or is theoretically more likely to live up to expectations, than a player included on one list but not the other five. This does not factor in each player's position within the respective rankings; obviously a player who is a consensus top-10 pick is more valuable than one ranked around 75 on all the lists.
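The two-number tally is simple enough to sketch in code. Here's a minimal Python version of the counting, just to make the method concrete (the sample lists below are hypothetical placeholders, not the actual Top-100s):

```python
# Sketch of the two-number scoring: distinct players vs. total list appearances.
from collections import Counter

def score_group(top100_lists):
    """Given several Top-100 lists (each a list of player names),
    return (number of distinct players, total appearances across lists)."""
    appearances = Counter()
    for ranking in top100_lists:
        for player in ranking:
            appearances[player] += 1
    players = len(appearances)          # first number: distinct players listed
    points = sum(appearances.values())  # second number: total list spots occupied
    return players, points

# Hypothetical example with three of six sources shown:
sample = [
    ["Player A", "Player B"],
    ["Player A", "Player C"],
    ["Player A"],
]
print(score_group(sample))  # Player A appears 3 times, B and C once each: (3, 5)
```

A unanimous pick on all six lists would add 1 to the player count and 6 to the points total, which is exactly the weighting described above.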

That's okay, because the intent is to paint with broad strokes here compared to the finer brush we used in the divisional articles. Speaking of which, to check out each individual article in the series, just click on the name of the division in the table below and you'll be whisked away.

Division — Players — # of Rankings
AL East 38 115
NL Central 35 115
NL West 32 107
AL West 28 102
AL Central 25 78
NL East 23 80

Beyond the divisions, I thought it would be worthwhile to put all 30 teams together in one list and sort them based on the number of lists each team’s prospects made. As you will see, I think this method of ranking puts things in a bit of a different perspective.

Here is that list in its entirety followed by a few observations:

NOTE: I went ahead and linked each of our network’s team sites to the team names if you want to check out the fine work being done on the site(s) of your team(s) of choice.

Team — Players — # of Rankings
San Diego Padres 14 44
Kansas City Royals 8 32
Seattle Mariners 6 32
Toronto Blue Jays 11 32
Oakland Athletics 7 30
Atlanta Braves 6 27
Pittsburgh Pirates 6 27
St. Louis Cardinals 8 26
Texas Rangers 11 26
Boston Red Sox 11 23
Chicago Cubs 7 23
Colorado Rockies 6 22
New York Yankees 5 22
Arizona Diamondbacks 5 21
Tampa Bay Rays 7 21
Minnesota Twins 8 18
Baltimore Orioles 4 17
New York Mets 3 17
Washington Nationals 6 16
Cincinnati Reds 5 14
Detroit Tigers 4 14
Los Angeles Angels 4 14
Houston Astros 3 13
Milwaukee Brewers 6 12
Philadelphia Phillies 4 11
Los Angeles Dodgers 3 10
San Francisco Giants 4 10
Miami Marlins 4 9
Chicago White Sox 2 7
Cleveland Indians 3 7

The Padres come out on top, and had I listed the teams in order of players to make a list (regardless of whether it was 1, 3, or 6 times), they would still be king of the hill. Beyond San Diego, however, things shift considerably. Of the three teams to put 11 players on at least one of the six Top-100 lists, the Blue Jays came in 4th, which is fairly close to where you'd expect them to be, but the Rangers and Red Sox slipped to 9th and 10th respectively.

On the other side of the spectrum, the Marlins, White Sox and Indians failed to break double digits. This isn’t remotely surprising since they are clearly three of the bottom five systems in all of baseball.

Using this system of ranking the organizations, two in particular stand out:

  • The New York Mets had only three players make a Top-100, which would have tied them for 2nd to last; however, those three players occupied 17 of a possible 18 spots across the six lists. That score of 17 tied the Mets for 17th in baseball.
  • Then there are the Brewers, who had a solid middle-of-the-pack total of six players named to a Top-100, but those six earned a meager 12 of a possible 36 points, which ranked them 24th.

Looking at the results in the table above, I'm surprised at how viable and accurate it is (for the most part) at ranking the strength of the various systems. There are a few outliers, like the Rangers, but most teams land in the vicinity of where they belong. I'm certainly not going to claim this is a superior method of comparing and ranking the organizations 1 through 30, but it does put some tangible rationale and data behind where each team winds up. It also takes a lot of the bias out of the equation.

This series took a bit longer to put together than I thought it would, but it wound up being a labor of love. The fact that it turned out better than I expected when I decided to go down this path certainly made it easier to dig through the lists team by team and compile the results. So much so, in fact, that I have tentatively decided to revisit this concept in a month or so, when more sources have published their Top-100 lists. When I tackle round two, I plan on including the S2S rankings again as a "control" and comparing our rankings with those from Bullpen Banter, Top Prospect Alert, ESPN, Baseball America and Full Spectrum Baseball.

So if you’ve enjoyed this miniseries, you have another round to look forward to.


You can follow us on Twitter @Seedlings2Stars and yours truly @thebaseballfish. You can also keep up to date with all things S2S by liking our Facebook page.

