NBA Power Rankings

Posted on January 21, 2007

Who is the best team in the NBA? One could look at the standings, but this is too simple and frankly doesn’t offer much room for discussion.

Various writers – such as Steve Kerr, Tony Mejia, and Marc Stein – offer team rankings that argue certain teams are better or worse than their record might indicate. These rankings do inspire discussion, but they are not exactly objective.

Hollinger’s Power Rankings

Hence the need for the latest from John Hollinger. Hollinger has created a power ranking based entirely on objective data. Hollinger’s “secret” formula (does ESPN know what the word “secret” means?) indicates that his ranking relies upon the following factors:

  • Team’s average scoring margin (MARG)
  • Team’s average scoring margin over the last 10 games (MARGL10)
  • Strength of schedule (SOS)
  • Strength of schedule over the last 10 games (SOSL10)
  • Whether or not a team has played its games predominantly at home or on the road, over the course of the season and recently (HOME, HOME10, ROAD, ROAD10)

The specific “secret” formula is as follows:

RATING = (((SOS-0.5)/0.037)*0.67) + (((SOSL10-0.5)/0.037)*0.33) + 100 + (0.67*(MARG+(((ROAD-HOME)*3.5)/(GAMES))) + (0.33*(MARGL10+(((ROAD10-HOME10)*3.5)/(10)))))
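
To make the arithmetic concrete, here is a minimal sketch of that formula in Python. The function simply transcribes the expression above; the input values at the bottom are hypothetical and are only there to show how the pieces fit together.

```python
def hollinger_rating(sos, sos_l10, marg, marg_l10,
                     home, road, home10, road10, games):
    """Transcription of the published power-rating formula.

    sos, sos_l10   -- strength of schedule (season, last 10 games)
    marg, marg_l10 -- average scoring margin (season, last 10 games)
    home, road     -- home and road games played this season
    home10, road10 -- home and road games among the last 10
    games          -- total games played
    """
    schedule = ((sos - 0.5) / 0.037) * 0.67 + ((sos_l10 - 0.5) / 0.037) * 0.33
    season_margin = marg + ((road - home) * 3.5) / games
    recent_margin = marg_l10 + ((road10 - home10) * 3.5) / 10
    return schedule + 100 + (0.67 * season_margin + 0.33 * recent_margin)


# Hypothetical inputs, not an actual team's numbers.
print(hollinger_rating(sos=0.52, sos_l10=0.48, marg=7.5, marg_l10=5.0,
                       home=22, road=18, home10=6, road10=4, games=40))
```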

In sum, Hollinger considers a team’s offensive and defensive ability, the quality of the team’s opponents, where it has played, and how well it has performed recently.

Let me start by saying that I generally like this idea. First and foremost, it’s objective. But more importantly, Hollinger’s ranking is based primarily on a team’s offensive and defensive ability. Why is this last point important? Let’s turn to the words of Hollinger:

One of my goals was to create a system that told us more about a team’s quality than the standings do.

So instead of winning percentage, the rankings use points scored and points allowed, which are actually better indicators of a team’s quality than wins and losses.

This might not sound right at first, but studies have shown scoring margin to be a better predictor of future success than a team’s win-loss record. Thus, scoring margin is a more accurate sign of a team’s quality.

That explains why, for instance, Phoenix is No. 1 right now even though Dallas has a better record — the Suns have the best scoring margin in basketball.

Conversely, it explains why Miami is No. 24 even though the Heat are close to .500.

Okay, so I like the ranking. Of course, I still have a small quibble.

A couple of months ago I created a small uproar by noting that Hollinger’s Player Efficiency Rating (PER) has a few problems. One issue I raised was that it was not entirely clear what Hollinger’s PER was seeking to measure. A similar quibble could be offered with respect to Hollinger’s power rankings. He offers explanations for his weights, but often his explanations seem to boil down to a “this makes sense to me” defense, as opposed to a “these weights allow us to explain or predict something (like final standings or playoff outcomes)” defense.

I would emphasize that my question concerning weights is a very minor quibble. I have no doubt that in evaluating a team it is reasonable to consider the elements Hollinger includes. It’s just not clear to me why each factor is weighted as he suggests. Still, I very much prefer Hollinger’s rankings to a power ranking based on a writer’s subjective impressions of each team.

Ranking Offensive Efficiency, Defensive Efficiency, and Projected Wins

Okay, all that being said, I thought it might be useful to offer an update of a ranking I posted the day after Christmas. At that time I offered a ranking of the best offenses, best defenses, and best teams. I would not argue that these rankings are better or worse than what Hollinger is offering. I am merely trying to show what we see when we only focus on the quality of each team’s offense and defense.

The Best Offenses

Table One: The Offensive Efficiency Ranking

Offensive efficiency is defined as how many points a team scores per possession. For example, the Denver Nuggets score 105.4 points per game, the 4th best mark in the league. Denver, though, plays at the fastest tempo in the league. Hence, when we look at offensive efficiency we find that Denver ranks 13th, or closer to the middle of the pack. In contrast, the Detroit Pistons rank 18th in points scored per game, but are 7th in offensive efficiency. From this we would conclude that the Pistons are actually a better offensive team than the Nuggets.
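
The pace adjustment is nothing more than rescaling scoring to a per-possession basis. Below is a minimal sketch in Python; the 105.4 points per game is Denver’s figure from above, but both possession counts are assumptions chosen only to illustrate how a fast pace can inflate per-game scoring.

```python
def efficiency_per_100(points_per_game, possessions_per_game):
    """Points scored (or allowed) per 100 possessions."""
    return 100 * points_per_game / possessions_per_game

# The 105.4 points per game is from the post; the possession counts are
# illustrative assumptions, not the actual 2006-07 pace figures.
fast_paced = efficiency_per_100(points_per_game=105.4, possessions_per_game=100.0)
slower_paced = efficiency_per_100(points_per_game=96.0, possessions_per_game=88.0)
print(round(fast_paced, 1), round(slower_paced, 1))  # 105.4 vs. 109.1 -- the slower team is more efficient
```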

Although the Pistons are above average, Detroit is not the best. The two best teams, and this is true whether we look at points scored per game or offensive efficiency, are the Phoenix Suns and Washington Wizards.

The Best Defenses

Table Two: The Defensive Efficiency Ranking

When we look at points surrendered per game we see that the Nuggets rank 27th in the league. This ranking, though, is driven by the tempo at which the team plays. Defensive efficiency ranks the Nuggets 8th. Yes, Denver is relatively better on defense than its points-allowed ranking suggests (a showing driven by the team’s most productive player, Marcus Camby).

The top two defensive teams are San Antonio and Chicago. Phoenix, the top offensive team, is ranked 12th defensively. The Wizards, though, are ranked 29th on defense.

The Best Teams

“Best” is being defined here strictly in terms of offensive and defensive efficiency. That is not to say that strength of schedule is not important. As noted, though, I am only looking at how the teams rank if we only consider offensive and defensive ability. So if a team has played a relatively easy (or hard) schedule so far, then the ranking overstates (or understates) the team’s prospects.

Table Three: The “Best” Teams

When we consider both offensive and defensive efficiency, the team at the top of the rankings is San Antonio. The Spurs rank 1st in defense and 5th in offense. The next two teams – the Suns and Dallas Mavericks – are quite close to the Spurs. These three teams are all projected to win sixty or more games, so these franchises are the very best in the NBA.

After these three we see a couple of teams that are very good. Both the Chicago Bulls and Houston Rockets project to win about 54 games. This makes the Bulls the best team in the East, which this year isn’t saying much.

Once we get past the top five, then we see quite a drop-off. The Utah Jazz currently has a record of 27-14, which translates into 54 wins over the course of the season. The team’s Efficiency Differential – or the difference between offensive and defensive efficiency per 100 possessions – is only 2.64. This is about half of the differential we see for Chicago or Houston. Consequently I would conclude that Utah is not quite as good as Chicago or Houston (at least, thus far this season).
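
The 54-win pace for Utah is just the current record pro-rated over an 82-game schedule, as this quick sketch of the arithmetic shows.

```python
# Pro-rate Utah's current record (27-14) over an 82-game season.
wins, losses = 27, 14
games_played = wins + losses
projected_wins = 82 * wins / games_played
print(round(projected_wins))  # 54
```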

Connecting to the Players

The concepts of offensive and defensive efficiency come from the writings of Dean Oliver and John Hollinger. From The Wages of Wins we see how to go from a team’s offensive and defensive efficiency to an evaluation of the individual players on the team.

Basically a regression of wins on the two efficiency metrics allows us to ascertain the relative value – in terms of wins – of points, field goal attempts, free throw attempts, rebounds, steals, and turnovers. A few more regressions allow us to determine a value for personal fouls, blocked shots, and assists. Once we have these values, with a bit of work we can determine the Wins Produced for each player. And the calculation of Wins Produced allows us to identify which players are responsible (or not) for the projected wins we see for each team.
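
To show the shape of that first step, here is a small sketch of the regression in Python. The team-season rows are invented purely to illustrate the mechanics; the actual model in The Wages of Wins is estimated on real team data across many seasons, so the coefficients below are not the published values.

```python
import numpy as np

# Hypothetical team-seasons: offensive efficiency, defensive efficiency, wins.
# These rows are invented for illustration only.
off_eff = np.array([110.0, 108.0, 104.0, 103.0, 101.0])
def_eff = np.array([102.0, 105.0, 104.0, 108.0, 106.0])
wins    = np.array([  58,    50,    41,    33,    35 ])

# Regress wins on the two efficiency measures (with an intercept).
design = np.column_stack([np.ones(len(wins)), off_eff, def_eff])
coefs, *_ = np.linalg.lstsq(design, wins, rcond=None)
intercept, off_value, def_value = coefs

# Offense should carry a positive weight and defense a negative one; weights
# like these are what allow box-score statistics to be translated into wins.
print(round(off_value, 2), round(def_value, 2))
```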

The Seattle SuperSonics in 2006-07

To illustrate, let’s consider the Seattle SuperSonics this season.

Seattle has played exactly half of its season and currently has a record of 16-25. When we look at offensive and defensive efficiency, we see this record is well deserved. Although the Sonics rank 9th in offensive efficiency, they rank only 25th in defense. Seattle’s efficiency differential stands at -1.94, which translates into a projected winning percentage of 0.439.
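
For a back-of-the-envelope sense of what that differential implies, one can back out the linear relationship hidden in those two numbers; this shortcut is my own simplification, not the actual Wages of Wins calculation.

```python
# Seattle's efficiency differential and projected winning percentage from the post.
differential = -1.94
projected_pct = 0.439

# Implied linear slope: how much winning percentage moves per point of differential.
slope = (projected_pct - 0.500) / differential
print(round(slope, 3))            # about 0.031

# Over an 82-game season that projection works out to roughly 36 wins.
print(round(projected_pct * 82))  # 36
```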

Which players are responsible for this outcome? The following table reports the Wins Produced for Seattle’s players after 41 games.

Table Four: The Seattle SuperSonics in 2006-07

Last summer the Sonics signed Chris Wilcox to a three-year, $24 million contract. This was similar to the contract Cleveland offered Drew Gooden. As I noted last summer, though, Gooden has historically been a much more productive player than Wilcox.

This season Wilcox is posting a Wins Produced per 48 minutes of 0.111 (average WP48 is 0.100). This is an improvement over his career WP48 entering the season (0.068), but not quite what Seattle saw last season in 29 games (WP48 of 0.229). Given that Gooden is giving the Cavaliers a WP48 of 0.222 this season, we can conclude that so far either Seattle is paying Wilcox too much or Gooden is getting too little.

Although Wilcox might be under-performing his contract, he is thus far the only above average performer Seattle employs at power forward and center. In other words, as has been the case since the 1992-93 season, this team still has a problem finding consistent production in the middle.

If you are looking for production on this team you have to look at Rashard Lewis and Ray Allen. Fifty percent of the team’s wins come from these two players. Unfortunately, once you get past Allen, Lewis, and Wilcox, no other player who appears on a regular basis is above average.

Given the depth in the Western Conference it seems likely that the Sonics will be back in the lottery in 2007. And given the lack of productivity in the front court, it seems likely that this team will once again target a big man in the 2007 draft.

Teams to Analyze

There are now only seven teams I have yet to analyze: Atlanta, Charlotte, Denver, Miami, Milwaukee, New Orleans–Oklahoma City, and Philadelphia. At the time of the Iverson trade I offered a comment on both Philadelphia and Denver, so I think we should wait a few more weeks to check back with these teams. I am open to posting on any of the other five teams. So if anyone has a preference, let me know.

Also, like the Sonics, the rest of the league is approaching the midpoint of the season. So far 13 teams have played 41 games. As each team hits this mark I am downloading its data from NBA.com. When all teams have reached 41 games – and I find time to do the analysis – I will start posting on who the best (and worst) players, rookies, teams, etc… are at the midpoint. Given my schedule, this analysis should be posted by November (of 2009).

– DJ
