Saturday 27 December 2014

The EA Sports Model of Artistic Talent

When I was a kid I loved to play EA Sports' NHL 2001 on PlayStation. In the game, you could play hockey, make trades, set line combinations, draft talent, and do other fun stuff that I can't remember because it's no longer 2001.

Every player in the game had each of their skills rated from 1 to 99. So a really good player might have had a 90 shot, 88 speed, 95 stickhandling, and 85 body checking or whatever. Just think of it like a report card. But each player also had an overall rating that basically summed up their entire report card as an average.
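The overall rating described above amounts to a simple average. A minimal sketch (the skill names and values below are just the hypothetical example from the paragraph, not actual NHL 2001 data):

```python
# Hypothetical skill ratings (1-99), loosely based on the example above.
player = {"shot": 90, "speed": 88, "stickhandling": 95, "body_checking": 85}

def overall(skills):
    """Overall rating: the rounded average of the individual skill ratings."""
    return round(sum(skills.values()) / len(skills))

print(overall(player))  # -> 90
```

The real game's formula may have weighted skills differently by position; the average is just the simplest version of "summing up the report card."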

Had Wayne Gretzky been in the game in his prime, he might have had a 97 or 98 overall rating. The worst players in the league had overall ratings in the low 60s. Anybody with an overall rating above 80 was a player you wanted on your team.

I think that we should think of artists in this way too.

I used to believe that some famous artists were basically many orders of magnitude more talented and brilliant than other famous artists. Comparing a Hollywood director like Steven Spielberg or Ron Howard to the greats like Andrei Tarkovsky, John Cassavetes, or Ingmar Bergman was just blasphemous. It's like comparing Miley Cyrus to Bach! Or JK Rowling to Dostoevsky!

I would have told you that not even a thousand Spielbergs could add up to one Tarkovsky because Tarkovsky is a true artist and Spielberg makes generic Hollywood crap.

I wasn't thinking like NHL 2001. Had I created a video game featuring famous artists, I might have given Spielberg a rating of 65. Then I would have given Tarkovsky a rating of 3,200,673 and Da Vinci a rating of 28,238,912, and Simple Plan a rating of 7.

I'm skeptical of any evaluative model of art that places a single artist as a thousand times more effective than his or her rivals - especially if the basis of that judgment is aesthetic or artistic value rather than social effects or some other objective measure.

I think the EA Sports designers got it right. Even Wayne Gretzky can't surpass 99 and even the worst pro players are above 60. This is how it seems to work in just about every other field. What's more likely, that human talent is especially variable for those fields where it's notoriously difficult to quantify success... or that human evaluations of talent are especially bullshit for those fields where it's notoriously difficult to quantify success? You don't need to worship history's most successful artists - or anyone else for that matter.


  1. I'm pretty sure you're right about identification of top artistic talent being bullshit. But I don't think that's the case in every field. Consider math, for instance: Gauss, Euler, von Neumann, Ramanujan, Hilbert, Grothendieck, etc. (especially the first two) were thousands of times more productive and important than almost all of their contemporaries. And unlike art and religion, they can be rigorously verified to have solved problems that were too hard for anyone else.

    1. Depends whether we're praising people for being "productive and important" or for being "brilliant and spiritually superior." I think if we're looking at consequences then it's possible for one person to have an impact 1000 times greater than a rival expert. I think it's likely that some artists in history (not necessarily the Legends) have had 1000x the impact compared to some other famous artist whose work had virtually no effect. Similarly, Euler could have had a much, much greater impact than other mathematicians. However, these Legends often, especially when it comes to art, aren't evaluated by their social effects but by their genius or talent - the skills in their toolkits that equate to stickhandling, shooting, and body checking. Using art for huge social good (see: Development Media International) doesn't seem to require the tremendous gifts that art culture glorifies. If what we value is social effects, then you don't have to be a genius to create art that surpasses Da Vinci, Mozart, and Bach. You can have a 75 Overall toolkit and create something for DMI that saves thousands of lives. I think the same may apply in some way to mathematicians. Euler doesn't require a 1,000 Overall rating to have 100x the impact on his field.

    2. I'd also go further and say that I don't think the difference in social effects between Legendary artists and modern-day celebrity artists is that big, apart from maybe the differences caused by age. This is because artistic glorification isn't selected for by the qualities that lead to real consequences. In math, this probably isn't the case since mathematicians look for the same qualities in math that provide social value.

      ...Basically, it all comes back to my recurring theme of art culture glorifying the wrong people and having generally fucked priorities by using common sense criteria for artistic (and moral) greatness that aren't well supported by cognitive science or good philosophy.

    3. Sorry for being unclear. I'm not praising Gauss and Euler for being productive and important; I'm praising them for being talented.

      You said:

      > In real life, I don't think it's plausible that some people are orders of magnitude more talented than other highly successful people in their field.

      I think Gauss and Euler are counterexamples to this claim because they were hundreds or thousands of times more powerful than most of their (mathematically famous) contemporaries. The reason I brought up their output is that it's strong evidence for their mathematical talent, because mathematical success and mathematical talent track each other rather strongly (especially compared to e.g. artistic success and artistic talent).

    4. I don't believe that Euler was literally a thousand times more *talented* than the other mathematicians of his day. I think this would become obvious if, like NHL 2001, you took a reductionistic view of talent and broke Euler's overall talent level down into its component skills. I also think there are areas where mathematical success is NOT well tracked by mathematical talent. For example, being the first one to make a discovery. There may have been six other mathematicians on the brink of the discovery but whoever makes that extra step first receives several times the praise while his rivals remain unknown.

    5. What evidence would convince you that this wasn't true? What predictions does your view of talent actually make?

      Given a set of objective measurements of people that vary by orders of magnitude--for instance, number of mathematical problems solved, number of goals per season for hockey forwards, number of people who buy your music--you can always posit that these measurements are generated by "hidden variables" that don't vary as much. (For instance, a hidden "math talent" variable might range from 60-99, and math paper output = e^[math talent - 60].) But if these extra factors are only visible through objective measurements that vary by orders of magnitude, then it's kind of pointless to argue about them, because there's nothing that stops me from positing a different model where the hidden factors vary even more than the observable ones.
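The hidden-variable model mentioned in passing can be made concrete. A minimal sketch (the 60-99 talent range and the exponential mapping are taken straight from the parenthetical above; everything else is illustrative):

```python
import math

# Toy model: a hidden "math talent" variable varies modestly (60-99),
# but observed paper output = e^(talent - 60) varies by orders of magnitude.
def paper_output(talent):
    return math.exp(talent - 60)

for talent in (60, 75, 90, 99):
    print(f"talent {talent}: output {paper_output(talent):.3g}")
```

A talent gap of 60 vs 99, barely a factor of 1.65 on the hidden scale, produces an output ratio of e^39, roughly 10^17. That is exactly the point being made: the same extreme measurements are compatible with low-variance or high-variance hidden factors, so the measurements alone can't settle the argument.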

    6. For the record, I think that there are domains, like music and to some extent sports, where these high-variance objective measures legitimately *are* caused by low-variance hidden factors. For instance, in both art and sports, being the best at what you do gives you a huge advantage on objective measurements: presumably more people passed to Wayne Gretzky because he had the best chance of scoring, and more people bought Elvis's albums because all their friends were listening to Elvis, etc.

      But the key is that these models actually make predictions that you can verify--for instance, people won't prefer Elvis to Elvis-imitators that strongly in blind tests, and Wayne Gretzky won't be that much better than average at scoring goals in one-on-one competitions, etc.

      On the other hand, there are some analogous objective measures that I don't think can be explained in this way, like Bach's ability to improvise a six-part fugue (so Bach composed a six-part fugue in ~5 minutes; I don't think most Baroque composers could have written its equivalent in one solid day of work). Or Ramanujan's famous habit of stating facts that came to him in dreams and took, in some cases, decades for anyone else to understand and verify.

    7. First, it predicts that nobody will score a thousand times better than their rivals on any objective measures that we have. So Wayne Gretzky won't score 1000 times as many goals as his contemporaries, Euler won't publish 1000 times as many papers, Elvis won't sell a thousand times as many albums, Bach's music won't trigger 1000 times the neural response in listeners, and so on. I don't think we really see these freakish effects in the world.

      I also think the "hidden factors" are themselves observable. We can actually assess Gretzky's skills individually. We can time the speed of his skating, measure the speed of his slap shot, see how much he can bench press, check his flexibility - and then compare that to the skills of other NHL players. You could also test artists and mathematicians on various problems. Is there a math exam you could give Euler on which he'd score orders of magnitude better than his rivals? Would he complete it in orders of magnitude less time? I don't know of any objective measures that suggest Euler was "literally hundreds or thousands of times better" than his rivals rather than just "better."

      I can't say how difficult it would be for Bach's contemporaries to improvise a six-part fugue. Maybe he's an exception. I'd still think that accomplishment's almost certainly reducible to low-variance measures such as the ability to juggle an extra 1 or 2 pieces in working memory.

    8. As I said (or intended to say), I suspect you're right about Gretzky and Elvis that objective tests would prove them not a lot better than their contemporaries. I also think a factor of 1000 might be a bit of a stretch, but 10 is certainly reasonable and probably 100 is too. (Definitely you can get far bigger than the factor of 1.5 discussed in your post.)

      As for Euler, his collected works fill 60-80 quarto volumes (Wikipedia), that is, about 40 times the size of the Oxford English Dictionary if my arithmetic is correct. While this isn't thousands of times larger than the output of the average mathematician, it is certainly hundreds. For another comparison:

      > The historian of science Clifford Truesdell has estimated that in a listing of all of the mathematics, physics, mechanics, astronomy, and navigation work produced during the 18th Century, a full 25% would have been written by Leonhard Euler.


      Similarly, historians of math estimate that publishing the unpublished notebooks of Gauss would have advanced mathematics by 50 years or so. While this is undoubtedly an exaggeration, it would have to be exaggerated by a factor of ~100 for Gauss to have a normal-human level of output.

      The other mathematicians were not quite as prolific, but nor were they on quite the same plane as Euler and Gauss. However, in addition to writing *more* papers than the typical mathematician, they also wrote *better* papers, so merely looking at paper output would understate the disparity.

      On the Bach side, his complete works span 150 CDs. They don't make complete works of very many composers, so it's hard for me to compare to his contemporaries, but I would be surprised if the average professional composer in his day (or now, for that matter) got anywhere near that.

      As I said, you can make up stuff like "juggling 1 extra piece in working memory" that makes it sound like everything is caused by low-variance stuff, but I don't think you'll get any useful insights about the world that way. For instance, if the neural cost of working memory is quadratic (let alone exponential) in its size, then you're back to a high-variance measure ("neurons devoted to working memory"), but all relevant facts about the world stay the same.

    9. Alright, I deleted that sentence. Thanks for your comments.