In yesterday’s vlog, MKBHD decided to investigate the machinations behind that holy grail of cellphone camera analysis: the DxOMark ratings. Here are some of his findings:
It’s Not That Simple
Despite what media coverage and tech executives’ keynotes suggest, DxOMark’s smartphone ratings are more nuanced than the single two-digit figure that gets all the headlines. Underneath that all-encompassing number is a comprehensive list of subcategories, each with its own rating. First, the overall score is split into two categories: photo and video. From there, each category is split into several more, such as exposure, noise, zoom, and bokeh. Finally, these individual scores are weighted according to DxOMark’s algorithm of importance, eventually producing the score that gets the raucous cheers at tech conferences.
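The weighting step can be sketched in a few lines. Note that this is a hypothetical illustration only: the subcategory names, subscores, and weights below are invented, since DxOMark does not publish its actual weighting formula.

```python
# Hypothetical sketch of a DxOMark-style weighted overall score.
# All subscores and weights here are made up for illustration;
# DxOMark's real categories and weights are not public.

def overall_score(subscores, weights):
    """Combine per-category subscores into one weighted overall score."""
    total_weight = sum(weights[cat] for cat in subscores)
    weighted_sum = sum(subscores[cat] * weights[cat] for cat in subscores)
    return weighted_sum / total_weight

# A camera that excels at zoom and bokeh but is weakly weighted there
# still ends up with a modest overall number.
photo_subscores = {"exposure": 90, "zoom": 60, "bokeh": 55, "noise": 85}
photo_weights = {"exposure": 0.35, "zoom": 0.10, "bokeh": 0.10, "noise": 0.45}

print(overall_score(photo_subscores, photo_weights))  # a single headline figure
```

The point the sketch makes is the same one MKBHD does: two cameras can swap ranks depending entirely on which categories the weights favor.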
In short, if you’re looking to buy a smartphone for its camera, look deeper than the overall score. Some cameras will be better at certain tasks, so ask yourself how you plan on using that camera, look at the appropriate score for that skill, and choose accordingly.
He notes, for example, that you should pick the Samsung Galaxy Note 8 (94 overall) over the Google Pixel 2 (98 overall) if you’re shooting portraits: despite the Note’s inferior overall rating, both its zoom and its bokeh subscores are superior. This doesn’t translate into a higher overall score because DxOMark’s algorithm puts less weight on those categories. So if portraits are what you care about, make sure to look closely at the subscores.
DxOMark Plays Both Sides
Apart from its ratings business, DxOMark also acts as a consultant to tech companies seeking higher ratings. In effect, it is paid to help companies build cameras that are better according to DxOMark’s own specifications.
This means we shouldn’t be surprised when, year after year, products earn higher DxOMark ratings. Not only are its judging practices objective; the company is literally guiding the tech industry in its efforts to build better cameras.
DxOMark employs a set of 50 indoor and outdoor “scenes,” which it captures with the camera being rated. It then grades the resulting images on a sliding scale, based on attributes such as noise and focus, according to how well the camera performed. This process, in which every variable except the camera is held identical, is the most rigorous and objective in the business, so it’s no wonder DxOMark has become the gold standard.
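The shape of that benchmark is easy to sketch: every camera shoots the same fixed scenes, each capture is graded on several attributes, and the per-attribute grades are aggregated. The scene names, attributes, and grades below are invented for illustration; this is not DxOMark’s actual protocol or data.

```python
# Hypothetical sketch of scene-based camera benchmarking.
# Scenes, attributes, and grades are all made up for illustration.

scene_grades = {
    "indoor_portrait":   {"noise": 80, "focus": 90},
    "outdoor_landscape": {"noise": 95, "focus": 85},
    "low_light_street":  {"noise": 60, "focus": 70},
}

def attribute_scores(grades):
    """Average each attribute's grade across all test scenes."""
    totals = {}
    for scene in grades.values():
        for attr, value in scene.items():
            totals.setdefault(attr, []).append(value)
    return {attr: sum(vals) / len(vals) for attr, vals in totals.items()}

print(attribute_scores(scene_grades))  # per-attribute averages across scenes
```

Because the scene set is fixed, the only thing that changes between runs is the camera itself, which is what makes the comparison controlled.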
100 Is Just a Coincidence
The highest cellphone score achieved to date, a 98 by the Pixel 2, sounds extra impressive because it is so close to 100, as if the camera were nearly perfect. At least, that’s how the media treats it. In fact, the ratings are not out of 100, and the fact that we are now nearing that figure is mere coincidence: the highest-rated camera on the site currently scores a 108. So, MKBHD warns, when the media inevitably makes a fuss over the first 100-rated phone, see through the hype and realize it’s just a number. And, as he says, we shouldn’t be surprised that cameras keep getting better; that’s just how technology works.
Moral of the Story
Basically, look deeper. DxOMark’s overall rating can be useful for deciding whether a camera is trash or not, but it isn’t particularly useful when comparing high-end cameras whose differences may be subtle. Don’t assume that a higher-ranked camera is better in every respect. If you care about a specific feature, look up that feature’s subscore and see what DxOMark says.
And the next time you hear a tech exec say, “this is the highest-ranked cameraphone ever,” reply with: “yeah, I’d hope so; otherwise you’re working backwards.”