Big Data And Why Spielberg Don’t Know Dick
A problem with philosophically charged reviews of sci-fi films is that they often treat technology as either an extension of an ancient question or secondary to the quandary at hand. Upon a recent viewing of Steven Spielberg's dystopian thriller Minority Report, based on the Philip K. Dick short story, my online hunt for a review venturing beyond the tired arguments of free will versus determinism dealt grim results. Granted, in 2002, when the film was made and the reviews were written, there was no YouTube, no Facebook, nor any overt factor inviting layman speculation about a soon-to-be "hive brain." Cell phones were dumb. Big Data was only knee high. Online algorithms predicting what you'll buy, where you'd like to eat, and who you may want to sleep with were in the early stages of development (currently, over a quarter of all romantic relationships start online). Watching Minority Report more than a decade after its release, its worldview, one in which these technologies are commonplace, blends seamlessly with our own.
John Anderton, played by Tom Cruise, runs Pre-Crime, a police unit equipped with a vast analytical computer system capable of catching would-be murderers before they kill. The main processors of the system are three individuals with psychic abilities called "Pre-cogs," short for pre-cognition. Once caught, suspects are locked up as though they had carried out the murder. Anderton believes Pre-Crime is flawless, that is, until the system spits out a prediction that he's going to murder a man he's never met. Holding to his beliefs, Anderton assumes he has been set up.
In the film the Pre-cogs are considered miraculous because they foresee murder; in fact, murder is the only crime they foresee. The reason given is the "nature of murder": we're meant to take murder as carrying a heavier moral weight than other crimes, and thus a stronger signal for the Pre-cogs to pick up. In Dick's short story, the Pre-cogs are capable of predicting all crimes. Instead of being metaphysical anomalies, they are more akin to biochemical processors. Dick describes them as lumps of flesh so severely deformed and deprived that they're barely recognizable as human. Their utter dehumanization doesn't entirely obliterate their autonomy, yet they're no longer operating in a contemporary reality.
By imposing mysticism on the Pre-cogs, the film version of Minority Report misses a key point of the story: the commission of the crime, though arrived at via "analytical" machines, is itself purely metaphysical. Therein lies the tragedy of Pre-Crime. Having a Pre-cog run through a mall, seeing everything a minute before it actually happens, undercuts the free-will argument Spielberg is trying to make. Anderton's frame-up is also problematic, because it leaves open the possibility that the Pre-cogs are still capable of accurate predictions.
In the book Anderton isn't framed. He's also fifty years old and the creator of Pre-Crime. In the end he goes through with the murder to cover up the flaw in the system he spent his life developing. Because Dick was well versed in philosophy, his version of Pre-Crime is far more interesting. It's fundamentally a construction of the "synthetic a priori" Kant presents in his Critique of Pure Reason. To clarify, take the statement 7 + 5 = 12. For Kant, mathematics is universal and necessary, so the statement is true independently of experience. Yet the truth it delivers, the number 12, is not contained in the concept of "the sum of 7 and 5"; it must be synthesized. (This is in contrast to an analytic statement, in which the predicate is contained in the subject. A common example is "all bachelors are unmarried men.") Therefore, a true outcome can be synthesized prior to experience. Kant's attempt to derive metaphysical truths from the synthetic a priori was of course laid to rest by the wave of alternative geometries that challenged his Euclidean assumptions. In Minority Report the Pre-cogs' visions are initially treated as universal truths preceding experience, in which their synthesis (the murder) is also taken to be true. That is, when the Pre-cogs have a vision that a person is going to murder, it's taken as necessarily true that the person is a murderer.
Now if, as the film depicts, metaphysical visionaries exist instead, then Spielberg's case for free will is left wanting, and the significance of the technology as an extension of the human mind goes missing. Further, if Pre-Crime is flawless, and the world of Minority Report thus deterministic, why are the criminals held personally accountable for fate running its course?
Getting caught in the determinism/free will conundrum, which I would argue is a false dichotomy, doesn't make much sense. The late Christopher Hitchens dealt with the problem of free will exquisitely; when asked if he believed in its existence, he replied, "I have no choice." The dichotomy also detracts from more fascinating questions raised by Minority Report, such as: how capable are humans of using inductive methods to predict behavior? What are the ethical implications of Big Data? According to the Innocence Project, 5–10% of those convicted are innocent. If behavior prediction reaches an accuracy comparable to the rate of wrongful convictions, is preemptive law just? Is that even possible?
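The accuracy question can be made concrete with a back-of-the-envelope Bayes calculation. Every number below is a hypothetical assumption, not a figure from the film or the Innocence Project, but the shape of the result holds for any rare crime: even a very accurate predictor would flag mostly innocent people.

```python
# Toy base-rate calculation: how often would a "Pre-Crime" flag be right?
# All numbers are hypothetical assumptions, chosen only for illustration.

base_rate = 0.001    # assume 1 in 1,000 people would actually commit murder
sensitivity = 0.99   # assumed P(flagged | would murder)
specificity = 0.95   # assumed P(not flagged | would not murder)

# Bayes' theorem: P(would murder | flagged)
true_pos = sensitivity * base_rate
false_pos = (1 - specificity) * (1 - base_rate)
ppv = true_pos / (true_pos + false_pos)

print(f"Chance a flagged person is a genuine would-be murderer: {ppv:.1%}")
# Under these assumptions, roughly 98% of those locked up would be innocent.
```

The lesson is the base-rate fallacy: because would-be murderers are vastly outnumbered by everyone else, even a small false-positive rate swamps the true positives.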
Data aggregation is already yielding some fascinating results. Take the case of the father who raged at Target for sending his daughter, still in high school, ads for cribs, strollers, and other baby gear. He wanted to know why the company was "encouraging her to get pregnant." To his dismay, he discovered that Target knew she was pregnant before he did. Based on purchasing history alone, Target has been able to predict due dates to within a couple of weeks. Simply from whether you switch brands of coffee or toothpaste, Target can tell if you've gotten married or divorced. While Target's data aggregation is a tad unsettling, it's by no means the only company collecting and using Big Data. Netflix, for instance, greenlit "House of Cards" because it knew there was a large crossover audience that loved Kevin Spacey, the original BBC production, and movies directed by David Fincher. It was a no-brainer to toss these elements together into a profitable show.
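Target's actual model is proprietary, but the mechanism described can be sketched as a weighted score over purchase signals. The products, weights, and threshold below are all invented for illustration; this is a minimal caricature, not Target's method.

```python
# A toy "pregnancy prediction" score: a weighted sum over recent purchases.
# Products, weights, and the threshold are invented for illustration only.

SIGNAL_WEIGHTS = {
    "unscented lotion": 2.0,
    "mineral supplements": 1.5,
    "oversized handbag": 1.0,
    "scented candles": -0.5,   # some purchases count against the prediction
}

def pregnancy_score(purchases):
    """Sum the weights of every purchase that is a known signal."""
    return sum(SIGNAL_WEIGHTS.get(item, 0.0) for item in purchases)

basket = ["unscented lotion", "mineral supplements", "bread"]
score = pregnancy_score(basket)       # 3.5: "bread" carries no signal
flagged = score >= 3.0                # an assumed decision threshold
```

The unsettling part is how little it takes: a handful of weighted signals and a threshold is enough to turn a shopping basket into a prediction about someone's private life.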
Insurance companies are also very interested in Big Data. Say you regularly use GPS in your car, or upload photos to Instagram of your microbrew du jour. That data can be crunched to determine whether you regularly drive over the speed limit, and to gauge the state of your health. If your insurance company knows these details, what's to stop it from setting rates to reflect your lifestyle? In the near future, insurance companies may very well charge you more because they foresee particular follies in your future.
To capture possible future technologies as accurately as he could, Spielberg consulted futurist and virtual reality pioneer Jaron Lanier. In his book You Are Not a Gadget, Lanier claims not only that large companies are interlinking online with the intent to collect and sell your data, but also that, contrary to popular belief, the internet is being structured in a way that reduces individuality. Programs are developed as quickly as possible, and many almost instantly gain millions of users. This causes certain design flaws to get "locked in": once a program is in play, it's nearly impossible to go back and change it. Consider that when you're writing in Word, if you type a number followed by a period and then hit enter, Word assumes you're starting an outline. The behavior is supposed to be a shortcut, but if you're not writing an outline it actually creates more work for you. Internet culture gets locked in too. Programs that are quickly integrated into our daily lives often lock in limits to personal expression rather than creating space for it. One example Lanier gives: in the nineties, personal websites were often oddly designed. People with knowledge of HTML but no aesthetic sense would build sites with rainbow backgrounds and neon green fonts, or surround every link with gifs of sparkling hearts. Today people use Facebook and other templated sites that set the parameters and limit certain expressions. I would argue that in doing so, they make behavior easier to predict.
I'm reminded of Chuck Klosterman's essay about MTV's "The Real World," "What Happens When People Stop Being Polite." Klosterman's observation is that early reality TV shows were built around the Breakfast Club ideal of mixing several strong, diverse character types teens identified with (the nerd, the preppy girl, the bitchy gay, etc.). As fans of these shows grew up, they began to imitate the behaviors of the character type they associated with rather than becoming their own person. The classification gave them a subculture, and they could then play their role in accord with the other subcultures seen on the show. From a philosophical perspective this is much like what Guy Debord wrote about in Society of the Spectacle, and it's nothing new. Much of the hippie and punk movements were simply subcultures created, packaged, accessorized, and marketed by corporations to sell identities to the youth of the day. The new structure simply reconfigures the hippie, the punk, the nerd, etc. in relation to each other. Once you put people in categories and they accept their categorization, you can more easily predict their behavior and cash in on their identities.
As algorithms based on a person's past behavior become part of everyday life, exposure to things outside one's comfort zone gets limited. Consider the vast selection of books one can browse at a bookstore, then consider what Amazon's algorithm recommends for you. Yes, Amazon lists more titles, and you may actually want to read some of its recommendations. But there are countless books that could expand your horizons that you'll never know about, because Amazon bases its recommendations for future purchases solely on your past ones. Although we may never reach an extreme like Pre-Crime, we're already being shaped by an ever-growing array of behavior-predicting mechanisms. If there is anything we can be assured of in the future, it's machines telling us what we should buy.
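The narrowing effect falls out of even the simplest recommender. The sketch below is a toy item-similarity recommender with an invented catalog and made-up tags, not Amazon's actual algorithm: because every candidate is scored only against your own history, anything with no overlap with what you've already bought scores zero and never surfaces.

```python
# Toy item-based recommender: only books "near" your past purchases surface.
# Catalog, tags, and history are invented; this is not Amazon's algorithm.

CATALOG = {
    "Ubik":                     {"sci-fi", "dick", "paranoia"},
    "A Scanner Darkly":         {"sci-fi", "dick", "drugs"},
    "Neuromancer":              {"sci-fi", "cyberpunk"},
    "Critique of Pure Reason":  {"philosophy", "kant"},
    "Society of the Spectacle": {"philosophy", "media"},
}

def jaccard(a, b):
    """Similarity between two tag sets: size of overlap over size of union."""
    return len(a & b) / len(a | b)

def recommend(history, k=2):
    """Rank unread books by their best similarity to anything already bought."""
    scores = {
        title: max(jaccard(tags, CATALOG[owned]) for owned in history)
        for title, tags in CATALOG.items()
        if title not in history
    }
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend(["Ubik"]))
# More Dick and more sci-fi come back first; the philosophy shelf shares no
# tags with the purchase history, scores zero, and effectively never surfaces.
```

A bookstore shelf puts the zero-similarity titles in your line of sight anyway; the recommender, by construction, never does.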