Top-100 Showdown Redux: Conclusions and Final Rankings

Now that I’ve completed my division-by-division run through a second set of Top-100 lists, it’s time to wrap things up with a tidy little bow. Of course, this time around there are two sets of conclusions to present, each with its own rankings and data. The first sums up the results of the seven lists I examined over the last couple of weeks. That information then needs to be reconciled with the data and rankings from the first six lists I looked at back in early February.

As with last time, I will compare the divisions and teams using both the number of players to make the respective lists and the total number of lists those players made. Thus a player who was a unanimous selection contributes seven “points,” while a player named to just one list adds only a single point to the total of his division and team. When I get to the section that combines the data for all of the lists covered in both installments of the showdown series, the value a consensus player contributes to the bottom line increases to 13.
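
For anyone who wants to see the bookkeeping spelled out, here is a minimal sketch of the tally described above; the prospect names and list labels are hypothetical and used only to illustrate the point system.

```python
from collections import defaultdict

# Hypothetical example: each prospect is mapped to his team and the Top-100
# lists (out of the seven covered in this installment) that ranked him.
# Names and list labels are made up purely for illustration.
prospects = {
    "Prospect A": ("San Diego Padres", ["List 1", "List 2", "List 3"]),
    "Prospect B": ("San Diego Padres", ["List 4"]),
    "Prospect C": ("Oakland Athletics",
                   ["List 1", "List 2", "List 3", "List 4",
                    "List 5", "List 6", "List 7"]),  # unanimous: 7 points
}

players_per_team = defaultdict(int)  # the "Players" column
points_per_team = defaultdict(int)   # the "# of Rankings" column

for name, (team, lists_made) in prospects.items():
    players_per_team[team] += 1
    points_per_team[team] += len(lists_made)  # one point per list the player made

for team in points_per_team:
    print(f"{team}: {players_per_team[team]} players, {points_per_team[team]} points")
```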

Division      Players   # of Rankings   Avg/Team
AL West       26        130             32.5
AL East       34        141             28.2
NL West       29        124             24.8
NL Central    28        137             22.8
NL East       24        84              16.8
AL Central    19        81              16.2

Last time the NL Central was tied with the AL West in the number of rankings; this time, as you can see above, the AL East and NL Central came out ahead in that raw count. However, it’s not entirely apples to apples since the number of teams in each division is not the same (I for one can’t wait for the Astros to move to the AL West), so I added the average number of lists (rankings) each team’s players made and sorted by that. The result paints a much more relevant picture of the prospect “strength” of each division. Our clear front-runner by that standard is the AL West, with the A’s, Rangers and Mariners all in the top seven below. The division would be even stronger if Yu Darvish and Yoenis Cespedes had signed prior to the majority of Top-100s being published.
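
To make the averaging concrete, here’s a quick sketch using the totals from the table above along with each division’s 2012 team count (the Astros had not yet moved, so the AL West sits at four clubs):

```python
# Ranking totals from the table above, paired with 2012 team counts.
division_totals = {
    "AL West":    (130, 4),
    "AL East":    (141, 5),
    "NL West":    (124, 5),
    "NL Central": (137, 6),
    "NL East":    (84, 5),
    "AL Central": (81, 5),
}

# Sort divisions by rankings per team, highest first.
for division, (rankings, teams) in sorted(division_totals.items(),
                                          key=lambda kv: kv[1][0] / kv[1][1],
                                          reverse=True):
    print(f"{division}: {rankings / teams:.1f} rankings per team")
```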

Team                     Players   # of Rankings
San Diego Padres         12        48
Oakland Athletics        9         44
Toronto Blue Jays        9         39
Kansas City Royals       7         37
Texas Rangers            8         35
St. Louis Cardinals      8         35
Seattle Mariners         5         33
Pittsburgh Pirates       6         31
Colorado Rockies         7         31
Boston Red Sox           10        30
New York Yankees         6         28
Atlanta Braves           7         26
Tampa Bay Rays           6         24
Arizona Diamondbacks     5         23
Chicago Cubs             4         22
Houston Astros           4         21
Washington Nationals     7         20
Baltimore Orioles        3         20
New York Mets            4         18
Minnesota Twins          6         18
Los Angeles Angels       4         18
Cincinnati Reds          3         16
Los Angeles Dodgers      3         14
Detroit Tigers           3         14
Milwaukee Brewers        3         12
Philadelphia Phillies    2         10
Miami Marlins            4         10
San Francisco Giants     2         8
Cleveland Indians        1         7
Chicago White Sox        2         5

Four of the top five teams are the same as last time, though they appear here in a slightly different order. Seattle was the only team to drop out of that grouping, falling from #3 to #7. On the other end of the spectrum, the bottom four also remained the same, though the Marlins and Giants flipped spots, as did the Indians and White Sox. In the end some teams moved a few slots, but by and large the order stayed the same and teams finished in the same vicinity.

Interestingly enough, this time around only two teams (Padres and Red Sox) had 10 or more players named to a Top-100 list, compared to four teams with 11 or more players (Padres, Red Sox, Rangers and Blue Jays) in the first installment of this series. Since I looked at seven Top-100s this time as opposed to six at the beginning of February, you’d think the opposite would be true.

While we’re making assumptions, you’d expect that combining the chunks of data into tables that incorporate all 13 lists would produce relatively similar results. In this case, expectation and reality line up nicely. In fact, the divisional rankings remain the same as above, though the separation in the averages is greater due to the increase in the amount of data used.

Division      Players   # of Rankings   Avg/Team
AL West       32        232             58.0
AL East       44        256             51.2
NL West       36        231             46.2
NL Central    39        252             42.0
NL East       28        164             32.8
AL Central    27        159             31.8

In putting together the team data from the 13 lists combined, I again sorted the systems by the number of rankings each team’s players received. While I was at it, I decided a number reflecting the strength of the players ranked was also necessary. After all, a system with 20 prospects who each made only a list or two would look deepest of anyone by raw player count, but it wouldn’t necessarily be as strong as a system with 10 players who each made more than half of the lists. The result is the third column below, which gives a quick average of the number of Top-100 lists each player was named to.

If a team had nothing but consensus players, it would have a score of 13.0 by this standard, and I think it’s important to keep that number in mind when evaluating systems. The Mariners (10.8) and Pirates (9.7) come out ahead in this measure, which helps explain how two teams with just six ranked players apiece find themselves in the top 10 of systems. Seattle, for example, may not have the overall depth of a team like the Padres, but they certainly have more upper-echelon talent and, as a result, may be better positioned to build a contender.
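
As a rough illustration of that depth-versus-strength tradeoff, here’s a small sketch that computes the per-player average for a few systems using figures from the combined table below and compares it to the 13.0 ceiling:

```python
TOTAL_LISTS = 13  # a system of nothing-but-consensus prospects would average 13.0

# (players ranked, total rankings) pulled from the combined table below
systems = {
    "Seattle Mariners":   (6, 65),
    "Pittsburgh Pirates": (6, 58),
    "San Diego Padres":   (15, 92),
}

for team, (players, rankings) in systems.items():
    lists_per_player = rankings / players
    print(f"{team}: {lists_per_player:.1f} lists per player "
          f"({lists_per_player / TOTAL_LISTS:.0%} of the 13-list ceiling)")
```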

Of course it all depends on what you value.

Team                     Players   # of Rankings   # Lists/Player
San Diego Padres         15        92              6.1
Oakland Athletics        10        74              7.4
Toronto Blue Jays        12        71              5.9
Kansas City Royals       9         69              7.7
Seattle Mariners         6         65              10.8
Texas Rangers            12        61              5.1
St. Louis Cardinals      10        61              6.1
Pittsburgh Pirates       6         58              9.7
Colorado Rockies         8         53              6.6
Boston Red Sox           13        53              4.1
Atlanta Braves           7         53              7.6
New York Yankees         7         50              7.1
Tampa Bay Rays           8         45              5.6
Chicago Cubs             7         45              6.4
Arizona Diamondbacks     6         44              7.3
Baltimore Orioles        4         37              9.3
Washington Nationals     8         36              4.5
Minnesota Twins          9         36              4.0
New York Mets            4         35              8.8
Houston Astros           4         34              8.5
Los Angeles Angels       4         32              8.0
Cincinnati Reds          6         30              5.0
Detroit Tigers           4         28              7.0
Los Angeles Dodgers      3         24              8.0
Milwaukee Brewers        6         24              4.0
Philadelphia Phillies    4         21              5.3
Miami Marlins            5         19              3.8
San Francisco Giants     4         18              4.5
Cleveland Indians        3         14              4.7
Chicago White Sox        2         12              6.0

For me this study has largely reinforced how I ranked the farm systems in my head, but it has also forced me to reassess my feelings on a few of them. That reassessment holds most true for the Cardinals, which I had woefully underrated even though they have two of my favorite prospects (Shelby Miller and Kolten Wong) in their ranks.

On the other end of the spectrum, I think this exercise has shown that the Indians and White Sox are very much deserving of their standing as the two worst systems in all of baseball at the present time. Though in fairness to fans of the Tribe, they have a lot more talent buried deep in their pipeline that could wind up on Top-100 lists at this time next year.

As valuable as all this data is, at the end of the day comparing the relative value of prospects and systems to one another remains more subjective than objective. Relying on how each organization’s top prospects were ranked to evaluate them is but a small piece of the bigger picture.

When we add the subjective piece back into things, the rankings shift around a bit for each of us. For example, I would rank the Toronto Blue Jays as the top system in baseball even though, based on the above, they came in 3rd in the number of rankings and tied for 3rd in the number of players ranked.

Bottom line: this was an enjoyable experiment that has helped shape and deepen my knowledge and understanding of all the organizations, and I hope that along the way you found some value in it as well.

~~~~~

You can follow us on Twitter @Seedlings2Stars and yours truly @thebaseballfish. You can also keep up to date with all things S2S by liking our Facebook page.