On July 21, 2015, I published “Don’t You Want to Have a Body?”—an essay on the fantasy of strong AI and the reality of chatbots, the soothing effects of stupid systems—in Triple Canopy, as part of It Speaks of Others, an issue devoted to smart and dumb objects. (An early version of this work was presented in 2013 as part of the live magazine Format, organized by Shumon Basar, at the Architectural Association in London.) The essay includes a virtual, interactive version of the author. It begins:
I recently had a conversation with William Ford, a somber, sturdy man in his sixties, with geometric features and a fringe of gray hair texture-mapped onto his dome. Bill, as he told me to call him, wore a collared navy pullover shirt, and sat in a wooden patio chair. He blinked approximately every three seconds. I sat in front of my computer as Bill explained that he was here, or there, so that I could “talk to someone instead of just reading words on the screen.” Behind Bill was a deck with several chairs. The deck faced a pristine yard. I admired the stand of motionless trees that surrounded him, or us.
I had discovered Bill and his trees on the website of BraveHeart, an unusual collaboration between the Atlanta Braves and Emory University to provide support for veterans who might be suffering from post-traumatic stress disorder. I had volunteered to take an interactive survey administered by Bill, who served in Vietnam and “felt really distant from everyone” after he got home. Bill is described by BraveHeart as a “virtual human who brings real-world experience to his job”—which is to say that he is a semisophisticated chatbot, a program that recognizes certain phrases or cues and draws on a textual database to generate responses so as to simulate conversation.1 He is a manifestation of a project by the University of Southern California’s Institute for Creative Technologies called SimCoach, which deploys digital personages to help reluctant service members and their families understand and address their healthcare needs.
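The mechanism described here—recognizing cues in the user’s input and pulling canned replies from a textual database—can be sketched in a few lines. The cues and responses below are invented for illustration; they are not BraveHeart’s or SimCoach’s actual script.

```python
# A minimal sketch of a cue-matching chatbot: scan the user's input for
# known keywords and return the first associated canned response.
# All cues and responses here are hypothetical illustrations.

RESPONSES = {
    "distant": "Many veterans say they felt distant from people after coming home.",
    "sleep": "Trouble sleeping is something a lot of people mention.",
    "family": "How has your family responded since you got back?",
}

# Fallback keeps the conversation moving when no cue matches.
DEFAULT = "Can you tell me more about that?"

def reply(user_input: str) -> str:
    """Return the first canned response whose cue appears in the input."""
    text = user_input.lower()
    for cue, response in RESPONSES.items():
        if cue in text:
            return response
    return DEFAULT

print(reply("I felt really distant from everyone"))
```

The simulation of conversation rests entirely on the breadth of the cue database; anything the script does not anticipate is deflected with an open-ended prompt, which is part of what makes such systems feel both responsive and hollow.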