Automated and/or authentic intimacy: What can we learn about contemporary intimacy from the case of Ashley Madison's bots?

by Katherine Harrison



Abstract
In July 2015 the well-known affairs Web site Ashley Madison was hacked; customer details and internal company correspondence were stolen and later dumped online for public access. Analysis of the data made clear that the site had used bots extensively to attract customers. This paper brings together these bots with a short story by Candas Jane Dorsey about the discomfiting potential of human-machine intimacies. I use the two to explore what I argue is an inbuilt and little-discussed expectation/requirement of “authentic” intimacy: humanness.

Contents

Introduction
Background
Intimacy and authenticity
Infidelity — A more authentic intimate relation?
Human-machine relations: “(Learning about) Machine Sex”
Conclusions

 


 

Introduction

Science fiction and fantasy have long explored the idea of human-like machines or machine-human relationships in scenarios that range from the darkly dystopian to the broadmindedly charming [1]. Technological advances over recent decades have materialised some of these visions into actual (ro)bots that now populate our contemporary world [2]. Although some of the earliest advances in robotics emerged in the field of defence, an increasing number of (ro)bots are designed for more intimate kinds of human-machine relations such as play, care or conversation. A sample of these might include the AI that beat a professional Go player [3], Paro the cuddly robot seal designed to calm distressed patients (Pym, 2015), or Mitsuku, the world’s best conversational AI (who has won the Loebner Prize Turing Test four times and counting) [4]. These robots represent cutting-edge (semi)autonomous technologies in early-stage testing with humans — with enormous potential to revolutionise how care, support and companionship are organised in contemporary society. Already more familiar to us all, however, are bots — many of which are now able to pass as human for at least short periods of online interaction.

In July 2015, for example, the well-known affairs Web site, Ashley Madison (AM), was hacked and both customer details and internal company correspondence were stolen, used (unsuccessfully) as a bargaining tool, and later dumped online for public access. In the wake of the hack, media coverage critiqued different aspects of the way in which AM did business. In addition to a significant amount of media space devoted to critiquing either the morality of the site’s activities or its heteronormativity, criticism was also directed at a lack of security when it came to handling user details, and the use of bots to entertain or educate customers (see Ben Light’s 2016 article for an analysis of the AM T&Cs that addresses this). These bots behaved in a sufficiently convincing way to persuade human customers to pay to talk more with them, resulting in customers feeling conned when the bots’ existence was finally revealed.

At the time of the hack, I was involved in a research project about changing practices of intimacy in our digitally mediated world. I therefore read the media coverage of the AM hack, and particularly of the role of bots, with interest. In the coverage concerning the bots, two themes recurred regularly. Firstly, there was material dealing with the idea that customers were conned; they paid to meet other humans with whom they could potentially have an affair, and were thus short-changed because they unknowingly met robotic site representatives (Sharp and Martell, 2016; Dewey, 2015). Secondly, there was material considering the question of why there were so many bots on the site. This material asked what aspects of the site had made it so unappealing to women that a massive disparity between male and female customers resulted, prompting AM to develop bots that performed as female customers to address the imbalance (Murphy, 2015; Abad-Santos, 2011). However, no one was asking the questions that were most interesting to me: What if these bots actually got people hot/wet/hard? What would that say about how we understand intimacy today? How might bots with no loyalty to human notions of intimacy productively upset or challenge our notions of what intimacy should be/do? This paper sets out to explore exactly this, and it starts from the promise of authenticity that is entangled with contemporary notions of intimacy.

The press coverage mentioned above starts from the assumption that the intimacy in which customers expected to engage was judged inauthentic (and therefore less valuable) when the other participant was revealed to be a bot designed by AM, rather than a “real” human. This paper takes that assumption as its jumping-off point for an exploration of intimacy, one that brings together the little we know about the AM bots with a short story also about the discomfiting potential of human-machine intimacies. I want to use the two to help me explore what I argue is an inbuilt and little-discussed expectation/requirement of “authentic” intimacy: humanness. This is an important exercise because it allows us to turn a new lens on intimacy and its norms, and — through a critical exploration of these — may offer the potential to open up intimacy to queerer ways of thinking [5]. In this exploration, I am inspired by critical theorists such as Lauren Berlant who frame intimacy as a kind of social script, one in which our most “personal” relations are revealed to be organised and policed by “public” institutions (Berlant, 2000). This approach calls into question the “authenticity” that we associate with intimacy, and enquires into the foundational premises on which its authority rests. It is increasingly urgent to explore how we understand and value intimate relations with machines as we struggle to understand the burgeoning relations that we are forging with the (ro)bots in our worlds. If Paro the seal can provide comfort or Mitsuku an absorbing conversation, then who are we to devalue intimate human-machine relations as somehow not “authentic”?

 

++++++++++

Background

More than half of Internet traffic is now generated by bots (Zeifman, 2017), meaning that bots are more prevalent than many realise. These bots come in many different forms, from the benign to the irritating to the downright destructive. They are in use throughout the Internet, occurring even in spaces where there might be a high expectation of humanness due to the nature of the site. The “socialbots” that populated AM are one of the more recent incarnations of bots, characterised by their ability to more or less blend in and pass as human. The initial analysis revealing the existence of the bots on AM was published by Annalee Newitz, a journalist for Gizmodo (https://gizmodo.com). Reporting on the discovery, Newitz painted a thought-provoking picture of the future of intimacy:

The AM con may have played on some of our most ancient desires, but it also gives us a window on what’s to come. What you see on social media isn’t always what it seems. Your friends may be bots, and you could be sharing your most intimate fantasies with hundreds of lines of PHP code. (Newitz, 2015)

But what exactly is a socialbot and how might it differ from other kinds of bots?

... the socialbot is designed not simply to perform undesirable labour (like spambots) and not only to try to emulate a human conversational intelligence (like chatbots). Rather, it is intended to present a Self, to pose as an alter-ego, as a subject with personal biography, stock of knowledge, emotions and body, as a social counterpart, as someone like me, the user, with whom I could build a social relationship. [6]

Scholars such as Steve Jones advocate taking the content and activities of bots more seriously, not only because of their ubiquity but also because they are the creations of humans (Jones, 2015). Thus, what these bots do and generate can be understood as relevant to the creation and maintenance of the social fabric within which humans live. If a socialbot can successfully create an intimate relation with an unsuspecting human, then the moment of revelation forces: i) a reconsideration of the human-machine border in terms of what machines can do and how well they can do it; and, ii) a reassessment of what we consider the defining characteristics or qualities of intimacy (“real”, “authentic”, “personal”, “private”, and “human”?).

How did this work in the particular case of AM? AM had teams of staff creating fake profiles on the site. These were then “brought to life” by automated programs, resulting in what were internally called “Angels”. A human user would receive messages from an Angel who appeared to be another human user and — if drawn in by the initial contact — would be prompted to buy credits on the site that allow users to respond to such messages. The Angels therefore worked to engage and keep customers, and to generate revenue for the company. The bots were primarily targeted at heterosexual men using the site. The motivation for using these bots was — according to AM — to enhance users’ experience of the site. However, as Ben Light (2016), Annalee Newitz (2015), and various disgruntled customers have noted, it seems clear that these bots existed primarily to generate income by encouraging men to purchase credits that would allow them to send messages to the personal profiles associated with these bots. In the case of AM, this ability to pass as human was perceived as a “con”; the bots were perceived as imposters designed to trick unsuspecting customers out of their money. The hacked data showed that a record was kept of every interaction that a bot had with a human user, allowing AM to quantify the value of these interactions in terms of revenue generated. Based on Newitz’s analysis, it appears that potentially thousands of users were fooled into believing they were talking with a real person when in fact they were responding to basic messages from bots, or “Angels,” as they were termed:

As documents from company e-mails now reveal, 80 percent of first purchases on AM were a result of a man trying to contact a bot, or reading a message from one. The overwhelming majority of men on AM were paying to chat with Angels like Sensuous Kitten, whose minds were made of software and whose promises were nothing more than hastily written outputs from algorithms. (Newitz, 2015)
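Newitz’s account suggests a fairly simple underlying architecture: automated programs animate human-authored fake profiles, send scripted openers, any reply from the user is gated behind a credit purchase, and every interaction is logged so that revenue can be attributed to the bots. The sketch below is a minimal, purely hypothetical illustration of that workflow in Python; the profile name is taken from Newitz’s reporting, but the messages, function names and data structures are my own assumptions, not AM’s actual code.

```python
import random
from datetime import datetime, timezone

# Hypothetical scripted openers; the Angels' real prompts are not public.
OPENERS = [
    "hey, your profile caught my eye ;)",
    "bored this afternoon... feel like chatting?",
    "you seem interesting, tell me more about yourself",
]

interaction_log = []  # the hacked data showed every bot-human interaction was recorded


def angel_message(angel_profile, target_user):
    """Send a scripted opener from a fake profile and log the interaction."""
    event = {
        "angel": angel_profile,
        "user": target_user,
        "message": random.choice(OPENERS),
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    interaction_log.append(event)
    return event


def can_reply(user_credits):
    """Replying to an Angel requires purchased credits."""
    return user_credits > 0


def share_of_bot_prompted_purchases(purchases):
    """Fraction of first purchases triggered by contact with a bot
    (Newitz reports roughly 80 percent for AM)."""
    if not purchases:
        return 0.0
    return sum(1 for p in purchases if p["prompted_by_bot"]) / len(purchases)


# Toy example: an Angel messages a user, who must then buy credits to respond.
angel_message("Sensuous Kitten", "user_12345")
print(can_reply(user_credits=0))  # False: no credits, no reply
print(share_of_bot_prompted_purchases(
    [{"prompted_by_bot": True}, {"prompted_by_bot": False}]))  # 0.5 in this toy data
```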

The ubiquity of bots online and the increasing sophistication of socialbots capable of passing as human companions provide a commercial opportunity for sites such as AM, and also a chance for scholars to reflect on our understandings of intimacy and authenticity. The time is ripe to stop dismissing human-machine intimacies as the stuff of science fiction and to start taking them seriously:

The question is no longer whether bots can pass, but how social interaction with them may be meaningful. [7]

 

++++++++++

Intimacy and authenticity

Part of the promise of intimacy is authenticity of relations: that when we engage in specific practices with others we reveal our own “authentic” selves and emotions, and expect the same from our interlocutors. But if we unpick this a little further, we can see that the conditions for achieving “authentic” intimacy are quite specific and that less “authentic” forms of intimacy also exist. For example, responses to life online have long and energetically debated whether Internet relationships are “real” because they involve less (or no) physical, face-to-face contact (Lynn, 2007; Kendall, 2002). This discussion continues today, particularly in the area of social media, as scholars develop more sophisticated ways to understand the relations we form in various on/off-line spaces (Hillis, et al., 2015; Baym, 2015; Andreassen, et al., 2017). Similarly, studies on paying for intimacy often explore the idea that “genuine” intimacy is not possible (or at least is less genuine) under conditions of financial exchange (Zelizer, 2005). These examples suggest that (authentic) intimacy has a number of assumptions built into it (such as requiring physical proximity or being freely given). Under certain conditions these assumptions are revealed, and certain kinds of relations are (de)valued. The coverage of the AM hack reveals another assumption upon which intimacy is built — the humanness of the participants. What difference then (if any) did it make that the non-consensual non-monogamy (NCNM) [8] was taking place with bots? Did it change the practice of intimacy? Could reflecting on this prerequisite for humanness that appears to be embedded in “authentic” intimacy help us to formulate an understanding of the differing values of different practices of intimacy, and the consequences of this valuation practice for certain lives, bodies and relations?

Lauren Berlant has described intimacy as a kind of organizing narrative for the way in which we are encouraged/expected to organize our lives through a particular series of attachments [9], in which the “inwardness of the intimate is met by a corresponding publicness” [10]. This life “narrative” follows a chronological trajectory spanning important “intimate” events such as prom, engagement, marriage, having children and establishing a family home. Berlant is not alone in framing intimacy as a kind of affective script that determines acceptable practices and ways of organizing a life. Nathan Rambukkana, for example, develops this idea, suggesting that intimacies constitute a kind of space that allows/restricts certain practices:

The ‘space of intimacy’ is not simply a private one, but a public/private realm that defines multiple forms of human relationship and that acts as a layer of mediation between our selves and our worlds. Intimacies create spaces: social, national, cultural, familial, sexual — spaces that define and constrain what forms of relationship, embodiment and subjectivity are seen as legible, viable, ethical, legal, even real. [11]

Practices of intimacy that fall outside the expected parameters of a “good” life may produce moral outrage or physical violence, effectively rendering the participants “abject sexual citizens.” [12] Berlant, Rambukkana and other critics have taken care to point out the exclusionary mechanisms of intimacy.

Scholars who have looked at particular practices of intimacy have shown how a hierarchy of intimacy exists in which some practices are considered more “appropriate” or “authentic” than others. Accounts of how intimacies are negotiated abound in the literature on sex work and escort services, for example, with authors highlighting how “the commodification of intimacy is related to the demand for authenticity in late capitalism.” [13] In this formulation, mass-produced experiences are contrasted with individual, distinctive intimate practices that resist mass production. Intimacy is valued because it is seen as more authentic than mass-produced emotions. This promise of authenticity through intimacy appears, too, in the advertising of other NCNM sites such as Victoria Milan, with their catchy tagline: “Relive the passion — find your affair”. What these accounts and others suggest is that intimacy — when practiced between consenting persons with no intention of financial gain — is equated with a kind of honesty or authenticity of relation that is especially prized in contemporary society, but also that the promise of authenticity contained within intimate relations may be exploited for financial gain.

In her opening to the special issue of Critical Inquiry devoted to “intimacy,” Berlant acknowledges the promise of the real contained within intimacy and connects it to the private sphere.

They (the essays in Intimacy) track the processes by which intimate lives absorb and repel the rhetorics, laws, ethics, and ideologies of the hegemonic public sphere, but also personalize the effects of the public sphere and reproduce a fantasy that private life is the real in contrast to collective life. [14]

Berlant’s work repeatedly returns to how the illusion of intimacy as a personal, individual experience is actually highly structured and policed by public institutions and expectations. The case of (ro)bots creating intimacy with humans threatens to destroy this illusion by revealing intimacy as potentially programmable, automatable and still believable. This takes the notion of intimacy as a kind of public affective script designed to create the illusion of a personal experience to its — some might say — inevitable conclusion. Under these conditions, is human-robot intimacy “authentic”? The success of AM’s socialbots suggests that it is authentic enough to convince thousands of customers to part with their hard-earned cash. To what extent, though, does the specific kind of intimacy — infidelity — play a part in this decision?

 

++++++++++

Infidelity — A more authentic intimate relation?

It means imagining — as adulterers so often do — that you can do it differently, that you can engineer, through sheer will, a different moral and affective universe. [15]

The question of the kind and quality of intimacy provided by the AM bots is given an extra twist by the specific practice in which the bots were trained to engage: NCNM. When norms of intimacy are challenged, it is not uncommon for there to be a public backlash in which appeals to a wide range of structural devices such as moral codes, nationality or religion are used to support denouncements of new forms. In the case of NCNM, AM and similar sites work “to change the cultural script of NCNM through a combination of content and material affordances” (Harrison, 2019). However, despite increased ease in finding a partner and persuasive site content, advertising campaigns by major NCNM companies have been met with moral outrage, and the sites themselves tend to reproduce heteronormative ideas of gender, body and sexual practice (Harrison, 2019; Beasley, et al., 2017). There is therefore a constant tension between wanting to find a way to express desires experienced as being “out of the ordinary” (and consequently more authentic) and the inevitable instinctive recoil towards familiar patterns, bodies, performances.

Public responses thus suggest that NCNM is still morally frowned upon, and they tend to reinforce its status in the shadows. Despite this, NCNM itself has a well-established narrative about being a way to express and acknowledge more authentic, personal feelings and desires, one that appears in the marketing materials for the sites and which Laura Kipnis elegantly deconstructs in her article “Adultery”:

You felt transformed; suddenly so charming, so attractive, awakened from emotional deadness, and dumbstruck with all the stabbing desire you thought you’d long outgrown. [16]

Kipnis’ analysis acknowledges that this rhetoric of discovering a more “authentic” desiring relation is part of a well-worn narrative of NCNM. Indeed, she organises her whole article to show that the narrative of NCNM is as clichéd and public as any other intimacy script. Kipnis plays on this ubiquity by opening each section of the text with a series of clichéd comments. The section titled “The marital panopticon”, for example, opens as follows:

Bad moment over last month’s phone bill. Did you know they now break local calls down by zone? (Although, thank God, not yet by number!) There seems to be an astronomical number of calls to one zone all of a sudden — this took some quick thinking. [17]

The banality of the comments, the familiarity of them, only serves to underline what we could term the “norms of NCNM”, meaning its recognisable patterns, discourses and activities. This highlights the paradoxical nature of NCNM. Taboo as a topic of public discussion, it is so well known as to have become stereotypical; promising a new, passionate, authentic relation, it actually follows a series of familiar “scenes”. Despite this, Kipnis still sees potential in NCNM as a way to disrupt intimacy norms and imagine queerer ways of relating:

What would it take to expect more forms of gratification and pleasure in the present, in other spheres than intimacy alone — even without the hand-me-down utopia of sex? If adultery weren’t a placeholder for more sustained kinds of transformation and honesty, or a repository for wishes split off from the pragmatics of everyday life?

At the very least, it would take an unembarrassed commitment to utopian thinking. It would mean forging connections, in theory and in practice, between the myriad forms in which we do tentatively invent these possibilities in our everyday lives and larger questions about the social organization of work, love, shame, and pleasure. [18]

Framed in this way, NCNM becomes an opportunity to disrupt or call into question some of the scripts of intimacy. I read this as Kipnis suggesting that somehow NCNM is related to queerer ways of relating, that NCNM is a sign of how unrealistic monogamy is as a way of organizing society. Understood in this way, NCNM is a moment of break, of rupture, which could be productively used to re-imagine intimacies more generally, through acknowledging that the current state of affairs is claustrophobic and not actually reflective of “authentic” human desire:

We have, after all, been born into social forms in which fighting for happiness looks like a base and selfish thing, and realization of desire is thwarted and fleeting at best, so often an affair of short duration. [19]

NCNM exists in the shadows, both publicly and personally scorned, but with a recognisable narrative because it is so widespread. Indeed, in the case of AM, the success of the bots in attracting new customers can be attributed to the particular combination of i) normative scripts of intimacy; and ii) the narrow bandwidth of the online space (Harrison, 2019). The well-known conversational gambits of intimacy channelled through a primarily text-based medium made it possible to program authentic-sounding prompts while any awkwardness in the communication was attributed to the specificities of chatting online. Thus, the affordances and limitations of this particular socio-technical, software-hardware, content-format combination resulted in potentially thousands of customers paying money for the company of a computer program.
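To make concrete how little conversational machinery such a script requires, consider the following sketch. It is a hypothetical illustration of the point above, not a reconstruction of AM’s software: a handful of canned, vaguely flirtatious replies, chosen at random and ignoring the user’s input entirely, can pass in a text-only medium where vagueness and delay read as ordinary online awkwardness.

```python
import random

# Hypothetical replies drawn from the familiar "script" of flirtatious
# online chat; none of these are actual AM Angel outputs.
SCRIPTED_REPLIES = [
    "lol you're funny :) so what are you looking for on here?",
    "mmm sounds nice... tell me more",
    "i really shouldn't even be on this site haha. you?",
    "sorry, connection is slow tonight! what were you saying?",
]


def scripted_reply(user_message: str) -> str:
    """Return a canned reply.

    The reply ignores the user's message entirely; in a narrow-bandwidth,
    text-only medium a vague, slightly off-key answer is easily read as
    ordinary chat awkwardness rather than as automation.
    """
    return random.choice(SCRIPTED_REPLIES)


print(scripted_reply("Do you live nearby?"))
```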

NCNM has long been an organising motif of fictional and factual narrative accounts, in which it is deployed strategically and legally to extricate partners from existing relations or deployed emotionally to justify a rush of desire. This makes it the ideal test case for the bots: a recognisable intimacy script combined with a general disinclination on the part of the participants to look too closely. The secrecy inherent in the practice does not encourage peering into the shadows, and in these shadows the bots proliferated. If we take a good look into those shadows, what do we see? What might human-machine intimacies look like in contemporary society if we took them seriously? Here perhaps the best clues to the potentials of human-machine intimacies lie in the realm of fiction.

 

++++++++++

Human-machine relations: “(Learning about) Machine Sex”

In Candas Jane Dorsey’s short story “(Learning about) Machine Sex” (1990), the protagonist is a programmer called Angel who designs an artificial intelligence. Disillusioned when her boss (and ex-lover) sells the small company for which she works to a larger corporation, Angel enacts her own personal revenge by designing a program called “Machine Sex”. This short story is interesting not only because it upends the masculine technohype of early cyberpunk but also because it depicts a society on the cusp of human-machine intimacy and takes that prospect seriously.

Machine Sex is based on the idea that orgasm can be programmed, and the text traces Angel’s development of the hardware she dubs the “MannBoard” and its accompanying software (“Machine Sex”); this hardware-software combination results in a piece of equipment with touch pads through which the user is effectively “programmed” to orgasm:

It was very simple, really. If orgasm was binary, it could be programmed. Feed back the sensation through one or more touchpads to program the body. The other thing she knew about human sex was that it was as much cortical as genital, or more so: touch is optional for the turn-on. [20]

In her depiction of the machine, Dorsey does not shy away from the notion of intimacy as programmable, in ways that resonate strongly with the case of the AM bots. What is different here, however, is that no judgement is passed on whether or not the intimacy is authentic. In this short story, the relationship of the main protagonist, Angel, with her human ex-lover is contrasted with her relationship with the MannBoard/Machine Sex device. In Angel’s relationship with her ex-lover, sex is portrayed as a transaction that is more about power than emotional intimacy, and which provides little pleasure for Angel. The MannBoard, by contrast, is portrayed as being able to sexually pleasure not only Angel, but also a cowboy she meets towards the end of the story. By making the MannBoard more capable of pleasuring Angel than her human ex-lover, Dorsey explores the notion that human-machine relationships may constitute intimacies that are as “real” as those between humans.

This story also shows the potential for human-machine relations to disturb a normative intimacy narrative. The characters’ responses to the MannBoard/Machine Sex point to the uncanny experience of having sex with a machine: “At first it did turn him on, then off, then it made his blood run cold. She was pleased by that: her work had chilled her too.” [21] Dorsey’s descriptions of their responses lean towards a reading of the technology as uncanny, which chimes with Haraway’s assertion that “Our machines are disturbingly lively, and we ourselves frighteningly inert” (Haraway, 1991). Dorsey’s framing of Angel’s sex with the MannBoard as more “real” than sex with her human lover thus raises questions about the prerequisite for humanness in intimacy. This short story is a useful counterpoint to the case of the AM bots because it takes seriously the possibility of human-machine intimacy, it narrows the gap between programmed intimacy and the notion that all intimacy is scripted anyway, and it is honest without being dramatic about the disruptive potential of acknowledging desire with a machine.

The outrage prompted by the revelation of the AM bots’ existence suggests that they too disrupt the human-machine boundary by troubling human expectations of machine behaviour. Human-machine relations have long been the subject of science fiction and film, often exploring themes such as humans being replaced by machines, cyborgs, or intimacy between humans and machines. All of these themes hinge on the idea that the boundary between humans and machines is changing, becoming more permeable; as machines become more “intelligent”, more human-like in their capacities and even their appearance, it may become difficult to distinguish easily between humans and machines. That the AM bots were able to convincingly pose as human long enough to persuade human users to spend money purchasing credits on the site taps into this long history of concern about intelligent machines. How will we know when we are talking to a machine and when a human? What difference does that make? Are our interactions less valuable or less “real” with a bot? If yes, why? The existence of the socialbots plays on a perennial fear associated with life online; online interactions prompt fears about authenticity and identity because our methods of accounting for an entity’s authenticity remain woefully old-fashioned and meat-based. If you can’t see, hear, smell, touch or taste the other entity, then how can you be sure that they are who they say they are? [22]

It is possible that, without the hack, the bots of AM would still be functioning unbeknownst to users. While bots are widespread on the Internet, there seems to have been an expectation that a site for practicing NCNM would be an “authentic” space: having breached the veil of secrecy around the practice, once inside there would be no need to hide anything else. A cynical reading would suggest that AM essentially cashed in on this expectation. This reading, however, closes down the possibility that the AM bots did turn people on, prompting at least two thoughts. First, what does it mean for my own humanity if I get turned on by a bot? Second, what are the contours of this new and potentially very different form of intimacy?

 

++++++++++

Conclusions

Existing scholarship on intimacy has done important work in unpicking some of the normative assumptions that shape our apparently most private moments. However, with the exception of science fiction and fantasy, there has been little exploration of the possibility that the intimacy created between humans and machines might be as “authentic” as human-human intimacies, or of what this might mean for how we understand intimacies today. This kind of exploration is long overdue, and more pressing than ever as (ro)bots edge ever closer to being significant actors (others?) in our practices of intimacy.

Our guides to this new intimate world order are a strange combination of robotics researchers and science fiction authors. Whilst we urgently need to develop ways to engage empirically with the kinds of intimacies that are developing between humans and (ro)bots (Pym, 2015), there is also a powerful political argument to be made about acceptance of a broader range of intimate practices more generally. For anyone who has ever felt themselves labelled as an “abject sexual citizen”, theoretical tools or perspectives that help us to deconstruct the norms of “intimacy” may provide the key to broadening who and what might be considered “appropriate” partners in intimacy, as Kipnis notes:

... this turn of events may actually raise fundamental questions about what sort of affective world you aspire to inhabit and what fulfillments you’re entitled to ... [23]

Fictional renditions of the human-machine relation, as exemplified pertinently in the short story “Machine Sex”, have long rehearsed various scenarios. Our reality, however, is now approaching these scenarios, from companion/care robots in hospitals and elderly care homes to AM’s socialbots flirting with customers online. In essence, I am in agreement with Ben Light when he wrote in his article about the AM bots that “non-humans matter and they do things with us” (Light, 2016). In the case of the AM Angels, these particular non-humans are sufficiently engaging that their interactions generate revenue for the company. David J. Gunkel concurs:

Even if the programming of these fembots were rather simple, somewhat shoddy and even stupid, a significant number of male users found them socially engaging – so much so that they shared intimate secrets with the bot, and, most importantly, took out the credit card in hopes of continuing the conversation. [24]

By labelling the bots as simple “fraud,” the media coverage obscures the ways in which they matter and fails to take seriously the intimate practices in which they engage. That coverage can, however, tell us two things.

First, the ways in which these technologies are developed and deployed reproduce norms relating to sexuality and gender. Bots that are designed to “entertain, engage, educate and entice” (Light, 2016) in programmable ways foreground the performativity of intimate practices, and call their authenticity into question in ways that may be usefully explored to unpick heteronormativity or gender roles in these spaces.

Second, the old real/virtual, online/off-line divide seen in the early days of the Internet — which was particularly pertinent to questions of identity and authenticity — has resurfaced in a way that so far dismisses these interactions with bots. However, remembering how those old binaries have played out over the last 20 years, and how relationships between humans online have gained validity, we should take seriously cases such as the AM Angels because they point to important future directions in intimacy.

The bots’ potential to disrupt existing intimacy scripts lies in two aspects. Firstly, whilst the trajectory of prom, engagement, marriage, etc. assumes a chronological development to human affective engagements, bots have no loyalty to or grounding in this trajectory, nor do they — as non-humans — even feel the need to follow the circadian rhythms of the human body. Indeed, in the court cases filed by unhappy AM customers, one of the pieces of evidence supporting the existence of bots was their presence online at “unusual” times:

He discovered that many of the women who had contacted him would log in at roughly the same time of the morning every day, and stay online until after 5 PM. Even on Christmas and New Year’s Day. (Newitz, 2015)

Our machine lovers are always there and “always on”, in a new twist on “ambient intimacy”. These bots have no life narrative, chronology or sense of moving and developing through life, and consequently no drive to complete certain steps in order to live a “happy” life. They are not bound by moral or biological bonds and therefore have no commitment to monogamy, heterosexuality or any other normative practice of intimacy. How liberating to have a lover who has no drive or interest to follow any of the “normal” patterns of intimacy, such as meeting each other’s friends, purchasing items together or making plans for public forms of commitment, who never gets jealous or bored, who thinks your body is fine just as it is. In this context, one would be free to imagine new patterns of intimacy or simply to enjoy moments of pleasure potentially separate from notions of emotional work.
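The “unusual” login times described in the quotation above also hint at how such bots could be spotted in the data: accounts whose daily login hours barely vary, holidays included, behave unlike human bodies with circadian rhythms. A minimal sketch of that kind of check, using invented login records rather than the actual hacked data, might look like this:

```python
import statistics

# Invented login-hour records (hour of day, one value per session); Newitz
# reports that suspect accounts logged in at roughly the same time every
# morning and stayed online until after 5 PM, even on holidays.
logins = {
    "user_a": [9.0, 13.5, 22.0, 7.25, 18.0, 11.0],   # human-like scatter
    "angel_b": [8.98, 9.01, 9.0, 9.02, 8.99, 9.0],   # implausibly regular
}


def looks_automated(hours, max_std=0.25, min_sessions=5):
    """Flag accounts whose daily login times barely vary.

    A near-zero standard deviation in login hour, sustained over many days,
    is more consistent with a scheduled program than with a human routine.
    """
    return len(hours) >= min_sessions and statistics.stdev(hours) < max_std


for account, hours in logins.items():
    print(account, "bot-like" if looks_automated(hours) else "human-like")
```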

Secondly, bots hold promise for expanding our ideas of what constitutes intimacy by revealing and challenging the assumption that intimacy is experienced between two humans. There is a long history of film and fiction exploring this (from Frankenstein’s monster wanting a companion, to the operating system in Her getting close enough to the protagonist to elicit genuine feelings in him), as well as data showing that AM bots interacted successfully enough with humans to generate revenue for the site. These days, having sex with a partner via Skype or another digital/social media channel is no big deal. Whether you use it to stay sexually intimate with a partner when living/working apart, or because there is something about having sex online (the screen, the visuality, the different kind of contact) that gets you off, it’s not uncommon to use digital media to channel human-human desire. In essence, we have a well-developed popular imaginary around human-technology intimacies, and established practices of getting intimate with other humans using technologies. Getting intimate with a socialbot surely isn’t such a leap of faith?

 

About the author

Katherine Harrison is a senior lecturer at the Unit of Gender Studies, Linköping University, Sweden. Her areas of expertise include feminist cultural studies of technoscience with particular reference to digital technologies, science and technology studies, and norm-critical perspectives on gender and the body. Katherine's current research focuses on intimacy and care in human-robot relations: https://liu.se/en/research/caring-robots.
E-mail: katherine [dot] harrison [at] liu [dot] se

 

Acknowledgements

Earlier versions of this paper were presented at the Lund University Gender Studies internal research seminar, Lund University Communication and Media research seminar, Affective Politics of Social Media conference, Linköping University “P6: Body, Knowledge, Subjectivity” seminar and the “Imposters and Gatecrashers” workshop. Each of these fora provided a welcome opportunity for discussion, and I would like to take the opportunity to thank the many participants who took the time to give me feedback.

 

Notes

1. Some of the best known examples include The war of the worlds, Her, Blade runner, and He, she and it.

2. For a sample of the range of robotics currently being developed, see IEEE’s https://robots.ieee.org/.

3. See the documentary AlphaGo (https://www.alphagomovie.com) for the full story.

4. https://www.pandorabots.com/mitsuku/.

5. The encounter with the inhuman expands the term queer past its conventional resonance as a container for human sexual nonnormativities, forcing us to ask, once again, what “sex” and “gender” might look like apart from the anthropocentric forms with which we have become perhaps too familiar (Luciano and Chen, 2015, p. 189).

6. Gehl and Bakardjieva, 2016, p. 2.

7. Jones, 2015, p. 1.

8. For the purposes of this article, I am going to use the term ‘non-consensual non-monogamies’ (or in abbreviated form, NCNM) rather than ‘infidelity’. ‘Infidelity’, despite being one of the most commonly used terms in discussions of these sites, is generally used in a pejorative way. My focus here is on the intersection of a particular technology with this practice of intimacy, not on joining the moral debate, and thus, I prefer to use a less loaded term.

9. Berlant, 2000, p. 5.

10. Berlant, 2000, p. 1.

11. Rambukkana, 2010, p. 239, my emphasis.

12. Barker and Langdridge, 2010, p. 756.

13. Carbonero and Garrido, 2018, p. 385.

14. Berlant, 2000, p. 2.

15. Kipnis 2000, p. 296.

16. Kipnis, 2000, pp. 291–292.

17. Kipnis, 2000, p. 300.

18. Kipnis, 2000, pp. 326–327.

19. Kipnis, 2000, p. 327.

20. Dorsey, 1990, p. 91.

21. Ibid.

22. The Voight-Kampff Empathy Test, described by Philip K. Dick in his Do androids dream of electric sheep? (Garden City, N.Y.: Doubleday, 1968) and made more widely known in movie interpretations such as Blade runner, shows the development of increasingly sensitive physiological tests to determine human-ness as the boundary line becomes blurred.

23. Kipnis, 2000, p. 293.

24. Gunkel, 2016, p. 238.

 

References

A. Abad-Santos, 2011. “Ashley Madison’s sexist fat joke isn’t funny,” Atlantic (9 November), at https://www.theatlantic.com/entertainment/archive/2011/11/ashley-madisons-sexist-fat-joke-isnt-funny/335719/, accessed 15 August 2019.

AlphaGo, 2017. Director, Greg Kohs; details at https://www.alphagomovie.com.

R. Andreassen, M. Nebeling Petersen, K. Harrison and T. Raun (editors), 2017. Mediated intimacies: Connectivities, relationalities and proximities. London: Routledge.

N.K. Baym, 2015. Personal connections in the digital age. Second edition. Cambridge: Polity.

M. Barker and D. Langdridge (editors), 2010. Understanding non-monogamies. London: Routledge.

C. Beasley, M. Holmes, K. Harrison and C. Wamala Larsson, 2017. “Innovations in intimacy: Internet dating in an international frame,” In: R. Andreassen, M. Nebeling Petersen, K. Harrison and T. Raun (editors). Mediated intimacies: Connectivities, relationalities and proximities. London: Routledge, pp. 89–102.

L. Berlant, 2000. “Intimacy: A special issue,” In: L. Berlant (editor). Intimacy. Chicago: University of Chicago Press, pp. 1–8.

Blade runner, 1982. Director, Ridley Scott; details at https://www.warnerbros.com/movies/blade-runner/.

M.A. Carbonero and M.G. Garrido, 2018. “Being like your girlfriend: Authenticity and the shifting borders of intimacy in sex work,” Sociology, volume 52, number 2, pp. 384–399.
doi: https://doi.org/10.1177/0038038516688609, accessed 26 September 2019.

C. Dewey, 2015. “AM faked female profiles to lure men in, hacked data suggest,” Washington Post (25 August), at https://www.washingtonpost.com/news/the-intersect/wp/2015/08/25/ashley-madison-faked-female-profiles-to-lure-men-in-hacked-data-suggest/, accessed 15 August 2019.

P.K. Dick, 1968. Do androids dream of electric sheep? Garden City, N.Y.: Doubleday.

C.J. Dorsey, 1990. “(Learning about) Machine Sex,” In: C.J. Dorsey. Machine Sex and other stories. London: Women’s Press, pp. 76–97.

R.W. Gehl and M. Bakardjieva, 2016. “Socialbots and their friends,” In: R.W. Gehl and M. Bakardjieva (editors). Socialbots and their friends: Digital media and the automation of society. London: Routledge, pp. 1–16.

D.J. Gunkel, 2016. “The other question: Socialbots and the question of ethics,” In: R.W. Gehl and M. Bakardjieva (editors). Socialbots and their friends: Digital media and the automation of society. London: Routledge, pp. 230–248.

D.J. Haraway, 1991. “A cyborg manifesto: Science, technology, and socialist-feminism in the late twentieth century,” In: D.J. Haraway. Simians, cyborgs, and women: The reinvention of nature. London: Routledge, pp. 149–181.

K. Harrison, 2019. “Relive the passion, find your affair: Revising the infidelity script,” Convergence, volume 25, numbers 5–6, pp. 1,077–1,095.
doi: https://doi.org/10.1177/1354856517725987, accessed 26 September 2019.

Her, 2013. Director, Spike Jonze; details at https://en.wikipedia.org/wiki/Her_(film), accessed 26 September 2019.

K. Hillis, S. Paasonen and M. Petit (editors), 2015. Networked affect. Cambridge, Mass.: MIT Press.

IEEE, n.d. “Robots: Your guide to the world of robotics,” at https://robots.ieee.org/, accessed 15 August 2019.

S. Jones, 2015. “How I learned to stop worrying and love the bots,” Social Media + Society (11 May).
doi: https://doi.org/10.1177/2056305115580344, accessed 15 August 2019.

L. Kendall, 2002. Hanging out in the virtual pub: Masculinities and relationships online. Berkeley: University of California Press.

L. Kipnis, 2000. “Adultery,” In: L. Berlant (editor). Intimacy. Chicago: University of Chicago Press, pp. 289–327.

B. Light, 2016. “The rise of speculative devices: Hooking up with the bots of Ashley Madison,” First Monday, volume 21, number 6, at https://firstmonday.org/ojs/index.php/fm/article/view/6426/5525, accessed 15 August 2019.
doi: http://dx.doi.org/10.5210/fm.v21i6.6426, accessed 26 September 2019.

R. Lynn, 2007. “Don’t dismiss online relationships as fantasy,” Wired (9 July), at https://www.wired.com/2007/09/sexdrive-0907/, accessed 15 August 2019.

D. Luciano and M.Y. Chen, 2015. “Has the queer ever been human?” GLQ, volume 21, numbers 2–3, pp. 182–207.
doi: https://doi.org/10.1215/10642684-2843215, accessed 26 September 2019.

M. Murphy, 2015. “Ashley Madison was not for women or queer couples so you can let that argument go” (27 August), at https://www.feministcurrent.com/2015/08/27/ashley-madison-was-not-for-women-or-queer-couples-so-you-can-let-that-argument-go/, accessed 15 August 2019.

Mitsuku, at https://www.pandorabots.com/mitsuku/, accessed 15 August 2019.

A. Newitz, 2015. “How Ashley Madison hid its fembot con from users and investigators,” Gizmodo (9 September), at https://gizmodo.com/how-ashley-madison-hid-its-fembot-con-from-users-and-in-1728410265, accessed 15 August 2019.

M. Piercy, 1991. He, she and it. London: Fawcett.

H. Pym, 2015. “Is this cuddly robot coming to a care home near you?” BBC News (17 September), at https://www.bbc.com/news/health-34271927, accessed 15 August 2019.

N. Rambukkana, 2010. “Sex, space and discourse: Non/monogamy and intimate privilege in the public sphere,” In: M. Barker and D. Langdridge (editors). Understanding non-monogamies. London: Routledge, pp. 237–242.

A. Sharp and A. Martell, 2016. “Infidelity website Ashley Madison facing FTC probe, CEO apologizes,” Reuters (5 July), at https://www.reuters.com/article/us-ashleymadison-cyber/infidelity-website-ashley-madison-facing-ftc-probe-ceo-apologizes-idUSKCN0ZL09J, accessed 15 August 2019.

H.G. Wells, 1898. The war of the worlds. London: William Heinemann.

I. Zeifman, 2017. “Bot traffic report 2016” (24 January), at https://www.imperva.com/blog/bot-traffic-report-2016/, accessed 15 August 2019.

V.A. Zelizer, 2005. The purchase of intimacy. Princeton, N.J.: Princeton University Press.

 


Editorial history

Received 20 August 2019; accepted 19 September 2019.


Copyright © 2019, Katherine Harrison. All Rights Reserved.

Automated and/or authentic intimacy: What can we learn about contemporary intimacy from the case of Ashley Madison’s bots?
by Katherine Harrison.
First Monday, Volume 24, Number 10 - 7 October 2019
https://ojphi.org/ojs/index.php/fm/article/view/10250/8148
doi: http://dx.doi.org/10.5210/fm.v24i10.10250




