I can’t remember where it came from—maybe a comedy bit, or maybe an old Tumblr text post—but a joke about technology has stuck with me. Its premise is that people and popular culture have always been obsessed with flying cars—really, with the nonexistence of the promised flying car—ignoring the fact that, if flying cars existed, they would function essentially like the planes we already have. Does the premise hold up entirely? Not exactly. But it’s a useful meditation on how we view the technology we have in contrast to the technology we wish we had, or imagined would exist by now. Bogost’s article is predicated on this same idea. When the “corporate fashion” of AI software is taken together with how “press and popular discourse sometimes inflate simple features into AI miracles,” the notion of artificial intelligence becomes something like the proverbial flying car: an ideal some subset of culture is convinced that it and the rest of society not only want, but are mere decades, years, or months away from achieving. Bogost frames this conflation as a kind of technocratic theology, a worship of “false idols” like AI and algorithms at the expense of serious consideration of what services these kinds of software can—and more importantly, can’t—be expected to perform.
Analyses like Crawford’s thus become eminently useful: they engage head-on with the notions and consequences of “bias” and “classification” as they emerge, nominally and implicitly, from precisely the sort of “nothing special” systems Bogost reminds us are inevitably “made by people.” Through a leapfrogging historical overview of human classification systems that today appear somewhere between “quaint and old-fashioned” at best and systematically harmful at worst, Crawford explains how the “arbitrary classification” of computation works both to the benefit of oppression and to the detriment of the oppressed. She quotes Stuart Hall: “Systems of classification become the objects of power.” In response, Crawford proposes crafting technologies in service of the “kind of world we want”—a process fundamentally reliant on cross-disciplinary collaboration. This reach “beyond [the] technical approach to the socio-technical” is one we have explored several times in the texts we’ve read for class, but it can sometimes appear worryingly reflective, after-the-fact in its approach to critique. I can’t help but think back to Bridle’s invocation of Epimetheus.