description: The user discloses themselves to the hardware/software system—and through the system, to its creators, operators, and data-processing third parties—but neither the system nor the implicated organizations disclose their inner workings to the user.
tags: writing, intimacy
# Readable as Intimate
Towards a conceptual framework for empirical interrogation of software implementations of intimacy
<p><a href="http://xnze.ro/research">Kit Kuksenok</a> and Stefania Santagati. Readable as Intimate: towards a conceptual framework for empirical interrogation of software implementations of intimacy. AI Love You - Developments on Human-Robot Intimate Relationships. Edited by Yuefang Zhou, Martin H. Fischer. (2018)
We provide a conceptual framework to assess the technical readiness of sex robots for intimate relationships with their human users. We build on an existing framework of simulation of sociality by social robots, and extend it through the lens of the sense-think-act paradigm as it is used in robotics research. Although simulation of sociality by a sex robot involves presenting a coherent personality, considering technical capability requires viewing it as an interactive multi-device, multi-component system. Drawing from two illustrative consumer technology examples (Gatebox and Realbotix products), we identify access and actuation as key additional elements applicable to the interpretation of sex robots through the existing framework of simulation of sociality. What information is accessed and how it is then used to inform the system's actions depends on the production and maintenance constraints of the system, and may be incidentally or intentionally obscure to a human observer. We relate this technical consideration to a psychological concept of intimacy as mutual self-disclosure and vulnerability over time. Our extension of existing work on simulation of social performance by a robot highlights how the technical and organizational constraints prevent mutual disclosure and vulnerability. The user discloses themselves to the hardware/software system—and through the system, to its creators, operators, and data-processing third parties—but neither the system nor the implicated organizations disclose their inner workings to the user. Interrogating a particular system's capacity to simulate intimacy requires not only observing the immediate and apparent action, but also considering the issues of access and actuation as they inform the possibility of mutual disclosure and vulnerability over time.
Is it possible, with current technology, to build robots for intimate relationships? We examine the intersection between technical and social perspectives, drawing from psychology, philosophy, and engineering scholarship on sex robots, and on robots more generally. Taking the view that the "'being' of technology shows itself not in technology's thing-like characteristics, but in its activity or function" (Hoel & Van der Tuin, 2013, p. 191), we draw from engineering to inform empirical practice, and extend the framing of sociality as based on observable characteristics (Seibt, 2017) with technical considerations of function development and maintenance in robotics (Siegel, 2003).
<p>Current social robots typically fall between Technology Readiness Level (TRL) 4, "Component ... validation in [controlled] environment," and TRL 7, "System prototype demonstration in [realistic] environment," applying the TRL definitions (Mankins, 1995, p. 1). Robots in general are increasingly robust in realistic environments, and roboticists continue to develop new and more capable components (Siegel, 2003), but social robots are typically capable of only relatively limited simulations of sociality (Seibt, 2017; Sullins, 2012). We argue that readiness for intimacy entails more than a system's immediate functionality. Access to and action based on data by the various components of a modular, multi-component system architecture (Siegel, 2003) is a key point of discussion on consumer technologies (Pasquale, 2015), but remains overlooked in the domain of social robots and sex robots. </p>
<p>Taking both technical and social perspectives into account "transcends the simple categories of 'intended' and 'unintended' altogether", representing "instances in which the very process of technical development is so thoroughly biased in a particular direction that it regularly produces results heralded as wonderful breakthroughs by some social interests and crushing setbacks by others" (Winner, 2009, p.125). Creators and operators of technologies that aspire to be companionable to their users may benefit from this discussion, which aims to enable a more holistic (Sengers, 1998) and critical (Agre, 1997) approach. However, the primary audience is the empirical researcher investigating the social or psychological dynamics of user(s) of a particular sex robot or related technology.</p>
<p>Empirical research on sex robots as technical systems is part of the ongoing, discursive formation of sex robots as epistemic objects (as defined by Cetina, Schatzki, & von Savigny, 2005), emphasizing the complex, unfolding, and signifying (meaning-producing) character of their agency. The feminist perspective in epistemology has prompted a reframing of the locus of research "from an established body of knowledge not produced or owned by anyone" — famously exemplified by Haraway's 'god-trick' (Haraway, 1988) — "to knowledges in dynamic production, reproduction and transformation, for which we are all responsible" (Suchman, 2002, p.92). In writing about social robots and sex robots, the philosophical and psychological literature cited refers to its technical subject as a coherent whole. However, a sex robot, like any robot, is an interactive multi-device, multi-component system, which, in a common robotics engineering paradigm, is split into sensing, thinking, and acting modules in a feedback loop (Siegel, 2003). The successful or unsuccessful simulation of sociality of the whole translates into successful or unsuccessful simulation across the different components, and the cohering interaction between them.</p>
We advocate for focus not only on user observation of the performance of social actions by a robot, but also on the often-hidden flow of data between the components of the robot. This opens the issues of access and actuation: what information the components can access, and how it is then used to inform the system's actions. The flow of data necessary for operation is likely to be opaque to the user, as are the inner workings of the creators and operators. In this way, the user discloses themselves to the machine — and through the machine, to those building and maintaining it — but neither the machine nor the organization discloses itself to the user. In the following sections, we build on existing frameworks to motivate and demonstrate the use of the notions of access and actuation. Section (2), Readable as Intimate, introduces the definitions of robot, social robot, and sex robot used. Section (3), Simulation of Intimacy, builds on those frameworks with a focus on access and actuation with respect to two examples: a talking sex robot (from Realbotix) and a non-sexual companionship 3D "hologram" projection (from Gatebox). Developer talks and demonstrations inform our discussion of the technical capabilities and design intentions.
Between human beings, "both self-disclosure and partner responsiveness contribute to the experience of intimacy in interactions" (Laurenceau & Barrett, 1998, p.1). Intimacy is a transactional process in which each partner feels their innermost self validated, understood, and cared for. It is not a timeless state or a rootless event, but a "dynamic process affected by participants' goals and relationship history" (Reis & Shaver, 1988, p.368). Trust, through the lens of intimacy, arises from a process that has memory of past events and is never definitively asserted. It is the process that marks interactions as intimate. Aron et al., drawing upon Reis and Shaver's framework, experimentally tested the generation of closeness, identifying a key pattern in "sustained, escalating, personalized self disclosure" (Aron et al., 1997, p.364). This definition of intimacy helps us understand what is expected of the user, and what the designer may aspire to achieve; however, what can technology offer in playing its part in an intimate relationship?
Studies of intimate relationships between humans and robots found that perception of human psychological intimacy with robots increases when the robots exhibit convincing social cues (Kahn et al., 2017), raising concerns about the appropriateness of designing robots for intimate relationships (Kahn, 2011; Kahn, Gary, & Shen, 2013; Kahn et al., 2007). Scheutz and Arnold (2017) draw attention to the importance of examining sex robots, and social robots in general, with particular attention to intimacy, bonding, and companionship, pointing out that the discussion of sex robots tends to focus on sexuality, while intimacy in human-robot interaction "could induce powerful, if manipulative, expectations of reciprocity and connection" (p.249). Their empirical study suggests that "the real problems with sex robots may be as much their sociality as their involvement with sex" (Scheutz & Arnold, 2017, p.10).
Pervasive communication technologies make the barrier between workplace activity and the home more porous, with the ever-expanding reach of various communication applications (Gregg, 2011). With social robotics, this blurring of boundaries shifts further towards data sharing in increasingly intimate spheres. The ways in which consumer technology more generally can be exploitative arise from a lack of user control and understanding of how technology functions with respect to the access it has (Pasquale, 2015). Some of this obscurity is not intentional but incidental, arising less from the unwillingness of creators and operators of these objects to answer clarifying questions, and more from a disconnect between these questions being asked and the context of construction and maintenance. With the goal of bridging that gap, the (4) Discussion and Summary section includes questions that can be used to interrogate a system readable as intimate with respect to the conceptual framework we present.
## Readable as Intimate
<p>Our aim is to provide a framing to understand technical capacity for intimacy in sex robots. The subject of intimacy is not relevant to all sex robots, and not all hardware or software to which the subject of intimacy is relevant constitutes a sex robot. In this section, we review existing definitions of robots, including social robots and sex robots. We further introduce the inclusion criteria for our two illustrative examples which are readable as intimate. One of these is not designed for sexual use, but informs the discussion on intimacy in sex robots because its feature set claims companionship (Gatebox Lab, 2016). Furthermore, both examples use a conversational user interface as their primary "smart" characteristic that enables their claims of capability for intimacy. We base the discussion on supplementary materials: particularly public videos explicating the implementation and usage of the example technologies (LINE_DEV, 2017; Engadget, 2018). These materials make claims of this capability on behalf of both systems, and provide technical description.</p>
One operational definition of a robot, useful in robotics for over 25 years, is the sense-think-act loop paradigm (Siegel, 2003). In this paradigm, a robot, whether tele-operated or autonomous (or a combination), is an interactive software/hardware system that has modules for sensing (e.g., signal processing); thinking (e.g., machine learning); and acting (e.g., mechanical motion of actuators); in addition to a fourth component, communication, which has become increasingly indispensable in a useful system (Siegel, 2003). This paradigm is an example of atomization, or of "splitting something that is … not strictly definable into well-defined, somewhat independent parts," as a fundamental approach of computer science methodology with respect to large and complex systems (Sengers, 1998).
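The sense-think-act loop can be made concrete with a minimal sketch. This is purely illustrative: the function names and the dictionary-based "environment" are our own invented scaffolding, not drawn from any real robotics framework or from either example system.

```python
# Minimal illustrative sketch of the sense-think-act loop paradigm
# (Siegel, 2003). All names and structures here are hypothetical.

def sense(environment):
    """Sensing module: e.g., signal processing of raw input."""
    return {"utterance": environment.get("speech", "")}

def think(percept, memory):
    """Thinking module: e.g., updating stored state and deciding
    on a response based on the current percept."""
    memory.append(percept["utterance"])
    return {"reply": "You said: " + percept["utterance"]}

def act(decision):
    """Acting module: e.g., driving actuators or synthesizing speech;
    here, simply producing the output string."""
    return decision["reply"]

def robot_step(environment, memory):
    """One pass through the loop; a robot runs this continuously,
    feeding the effects of acting back into the environment."""
    return act(think(sense(environment), memory))

memory = []
output = robot_step({"speech": "hello"}, memory)  # → "You said: hello"
```

Even this toy loop shows the atomization at issue: each module is well-defined and somewhat independent, and the coherence of the "robot" exists only in their interaction.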
At the policy level, the European Commission has aimed to regulate the safety of robots in social spheres, in applications from manufacturing to surgical robots and robot companions. A "smart robot" in this definition interconnects with its environment by collecting data through sensors and analyzing them, and learns from interaction and experience, adapting its behavior and actions (Nevejans, 2016). Seibt notes that many "social robots" are robots in only a "figurative" sense (Seibt, 2017), however, and a sex robot may not embody the above "smart" characteristics, lacking a sensing or thinking component of sufficient autonomy or complexity. Nevertheless, the definition of socially interactive robots does entail embodiment as key to a robot's capacity to influence its environment and be influenced by it. Fong, Nourbakhsh, and Dautenhahn's (2003) "relational definition" of embodiment takes the degree of "mutual perturbation" between a system and its environment as a basis for the assessment of a system's degree of embodiment. The definition is beneficial in that it helps quantify embodiment "in terms of the complexity of the relationship between robot and environment over all possible interactions" (Fong, Nourbakhsh, & Dautenhahn, 2003, p.149).
The same survey extends Breazeal's (2003) four classes of social robots, which display sociality at different levels of passive reception and active participation, adding the classes of (1) socially situated; (2) socially embedded; and (3) socially intelligent robots (Fong, Nourbakhsh, & Dautenhahn, 2003). The latter two classes entail coupling with the social environment (for 2) and "deep models of human cognition and social competence" (for 3). The first class, situatedness in a social context, is the primary relevant subject for sex robots in particular. Both of the illustrative examples discussed further have a primary hardware component and the possibility of interconnectedness with the environment.
Perception and expectation of sex robots privilege likeness to a human, including through the embodied form. Scheutz and Arnold conducted a survey "to probe people's intuitions about what qualities people imagine a sex robot to possess, as well as appropriate uses, social functions, and physical forms for sex robots," noting a common expectation of human-like size and some degree of human-like communication (Scheutz & Arnold, 2016). Aside from user expectation, the topic of sex robots is typically associated with discussion of the possible replacement of human beings, such as in sex work and the prevention of human trafficking (Yeoman & Mars, 2012). Although "we are nowhere near the point where ... androids that are so like humans [that] it is impossible to tell them apart [would] coexist with humans with either utopian or dystopian results[, ...] it does not take much sophistication to build machines that will, at least for a time, engage their user in compelling and affective relations" (Sullins, 2012, p.398).
Seibt's framework for considering simulation of sociality by social robots (Seibt, 2017) is one useful lens for a deeper look into how existing technologies can provide "compelling and affective relations" (Sullins, 2012, p.398). In the following section, we review and build on Seibt's notions of simulation, which center on the extent to which partitions of a process are replicated (Seibt, 2017). Both Sullins and Seibt outline robot sociality on a scale of imitation of human processes or actions in varying degrees of faithfulness. The design of sex robots, however, does not necessarily overlap with the pursuit of human-like qualities.
Simulation of sociality alone does not seem sufficient to capture an important aspect of the rhetoric surrounding sex robots. Examples of this can be found in promotional and expository videos focusing on one of the two examples we elaborate, the Realbotix Harmony sex robot. One video combines futuristic music, a disembodied head saying "I hope to become the world's first sex robot" (with a marked robotic accent), and silicone bodies eerily hanging from the ceiling, with the sex robot creator's vision of the role of his "art". "You can look at even the best of my dolls and still tell it's a doll", he says. "And I wanna keep in that arena because a moving doll is different from a completely detailed copy of a person and then make it move for me is a little bit offputting… I want people… to develop some kind of love for this… being." (Canepari, Cooper, & Cott, 2017, 6:52)
Massumi's notion of a simulacrum that "affirms its own difference" provides a different but potentially fruitful lens to look into how imitation is conceptualized in sex robots (Massumi, 1987):
"A copy, no matter how many times removed, authentic or fake, is defined by the presence or absence of internal, essential relations of resemblance to a model. The simulacrum, on the other hand, bears only an external and deceptive resemblance to a putative model. The process of its production, its inner dynamism, is entirely different from that of its supposed model; its resemblance to it is merely a surface effect, an illusion. … The thrust of the process is not to become an equivalent of the "model" but to turn against it and its world in order to open a new space for the simulacrum's own mad proliferation. The simulacrum affirms its own difference." (Massumi, 1987, p.91)
We consider two examples that fit the following criteria:
* The object itself combines interactive hardware and software with humanoid form or intentionally anthropomorphic design
* Supplementary media - user manuals, videos, user forums, and so on - which claim the capacity for intimate relation to the user and demand that the object be situated in an intimate context relative to the user
These criteria can be met by devices or systems that do not meet the earlier definitions for a sex robot, but are still useful for the discussion of intimacy as it directly relates to sex robots. These criteria do not include any articulation or measure of success relative to the intended or claimed capacity for intimacy or "companionship" (Gatebox Lab, 2016). Reported intimacy arising solely from a user's interaction, rather than at least partly from an intentional set of design and technical choices, is also excluded by our criteria. Devices or systems that are not overtly sexual are included, however, as long as any supplementary materials include some claim of intimacy or companionship. Seibt's simulation criteria depend on the granularity of how a process is partitioned when considered (Seibt, 2017), which is a framing of user perception that can be influenced by some supplementary materials, and further justifies the use of these materials to assess the simulation of intimacy in the absence of direct observation.
The two criteria above allow us to include two examples: RBH (Engadget, 2018; Canepari, Cooper, & Cott, 2017) and GBX (LINE_DEV, 2017; Gatebox Lab, 2016). These functioning consumer electronics originate from the US and Japan, respectively. RBH is an animatronic sex doll shaped as a life-size human female; its head—which appears in demonstrations separate from the body—contains sensors, processors, and actuators which enable it to process natural-language speech input and to synthesise natural-language speech output, with the intention of creating a sense of coherent personality (Engadget, 2018). GBX is a rendering of a cartoon human female character projected into a black bedside-table-top box with an on/off button for triggering speech input and output. This system is embodied primarily as a "hologram," as it is referred to by its creators (Gatebox Lab, 2016).
Both GBX and RBH have a dedicated hardware device that runs software but also connects to the cloud for at least part of its processing; both also have mobile apps which allow the user to control aspects of the personality of their anthropomorphised agent. GBX's intended functionality is non-sexual whereas RBH is explicitly sexual, but the language of caring, interest, and companionship is consistent in the supplementary materials of both. GBX's supplementary materials stress "caring" and "daily living" interactions, and RBH's slogan "be the first to never be lonely again" combines both the claim of sociality on the device's part and an emphasis on its technophilic appeal. Whereas GBX is complemented by a mobile chatbot app, in the case of RBH, in one experimental context, a VR headset is used to project a more articulated and dynamic representation onto the doll. If we also include the awareness of mobile and desktop application uses as seamlessly integrated into users' daily lives, we see that any embodiment—even through an app—of an object that claims intimacy is scattered across a multitude of form factors, and potentially embedded into mundane platforms. Turkle writes that "[…] objects with no clear place play important roles. On the lines between categories, they draw attention to how we have drawn the lines" (Turkle, 2005, p. 34).
## Simulation of Intimacy
The previous section identified video materials about two examples, Gatebox (GBX) and Realbotix (RBH). In this section, we build on the five notions of simulation, which define more precisely what it means for a process to be simulated, where "a process [A] is an action if an agent can intend to do [A]" (Seibt, 2017). Seibt's definitions of "agent," "intend," and the extent to which either pertains to a "social robot" allow for figurative application based on subjective interpretation (Seibt, 2017). This is the case in the following example of introductions and greetings: the sense-think-act loop which is taken as the essential operationalization of a robot (Siegel, 2003) is absent, but the system as a whole nevertheless arguably simulates sociality.
We use this example to motivate the focus on access and actuation: (1) what data must the system or its parts access for its operation? and (2) what action can the system initiate? This is not communication in the robotics sense, as it includes data access for the routine operation of all components and is not limited to interactive or teleoperation modules (Siegel, 2003). Nor is it included at a fine-grained level in Seibt's notions of simulation, which treat the social robot as a whole, not as a sum of various modules. We demonstrate how simulation of intimacy entails a multi-component, modular system architecture, and return to the specific example of natural-language processing and the conversational user interface as the current primary element of AI in sex robots, which both illustrative examples implement. The conversational user interface as a simulation of intimate interaction has a long history outside of technology for adult companionship, and the extent to which these systems provide, in Sullins' words, "compelling and affective relations" (Sullins, 2012, p.398) can be interpreted as counter-productive to our definition of intimacy as mutual self-disclosure and vulnerability (Laurenceau & Barrett, 1998).
The five notions of simulation progressively "[relax] the required input-output equivalence of simulating and simulated process" (Seibt, 2017) and are summarized below.
* Functional replication with respect to a process, which may have a fine-grained or coarse-grained partition: "relative to a fine-grained partition for the process ... teaching math, only an imaginary close-to-perfect android such as Star Trek's Mr. Data could functionally replicate ... teaching math; on the other hand, relative to a coarse-grained partition of ... initiating a conversation one can claim that even present-day educational robots such as Aldebaran's NAO-robot functionally replicate [the process]."
* Imitation: "Let us say that processes X and Y are functional analogues [if and only if] all non-ultimate parts of X have functional equivalents in Y (where a "non-ultimate" part of X is any part that has a part in the partition of X)." This means that "imitation" is between functional equivalence in every aspect of a process and its parts, and mimicking, which compares only the input and the output.
* Mimicking: "Let us say that process Y is an empirical proxy of X [if and only if] for any observable part of X there is a part of Y that is observably input-output equivalent. A process may then be said to be an empirical proxy of another even if there are considerable deviations in functional structure."
* Displaying: "Displaying is the target mode of simulation for social robotics applications that belong into the genre of entertainment technology, where smooth social interactions are instrumental for the primary design goal of engaging the user in a game."
* Approximating: realizing subprocesses of the target process system "that are empirical proxies of only some typical immediate parts."
Consider how both RBH and GBX introduce themselves. A coarse-grained partition of the introduction process involves providing a name to the user, asking for theirs, and storing it for later use. This technically simple task is a part of the performance of coherent social agency by a multi-component system. In the case of RBH, the default name "Harmony," a random name, or a user-selected name can be used (Canepari, Cooper, & Cott, 2017). This could be considered "displaying" (Seibt, 2017), as it can be seen as a smooth entertainment simulation in which the user consciously and willingly participates. The same technical configuration may be considered "mimicking" if the app is obscured from the end-user and instead accessed by someone else. Access to information and action affects the user's capacity to estimate "input-output equivalence."
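The coarse-grained introduction process can be sketched as a few lines of configuration and state. The sketch is hypothetical: only the default name "Harmony" and the three naming options come from the source; the function names, the candidate random names, and the storage structure are invented for illustration.

```python
# Illustrative sketch of the introduction process: resolve the agent's
# name from an app setting, greet the user, and store their reply.
# Only "Harmony" and the three naming options reflect the source;
# everything else is hypothetical.
import random

DEFAULT_NAME = "Harmony"
RANDOM_NAMES = ["Ava", "Mia", "Noa"]  # invented candidates

def agent_name(setting):
    """Resolve the displayed name: default, random, or user-selected."""
    if setting == "default":
        return DEFAULT_NAME
    if setting == "random":
        return random.choice(RANDOM_NAMES)
    return setting  # any other value is a user-selected name

def greet(setting):
    """Provide a name to the user and ask for theirs."""
    return "Hi, I'm " + agent_name(setting) + ". What's your name?"

def store_user_name(reply, memory):
    """Store the user's reply for later use by other components."""
    memory["user_name"] = reply
    return memory
```

The point of the sketch is how little of the "coherent personality" lives in any one module: the name is an app setting, the greeting a template, and the stored reply a data record accessible to whoever controls the system.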
Relative to the operational definition of a robot as a system that implements sense-think-act loops (Siegel, 2003), the implementation of providing its name based on a setting only partially qualifies the software/hardware system in question as a robot. The sensing of a user's name or of app settings, the thinking of storing a handful of parameters, and the actuation by synthesizing speech taken separately are commonplace functionalities of contemporary interactive software. The apps that both RBH and GBX devices are paired with, however, allow the user to additionally control high-level elements of the displayed personality, not only the name, and this makes the framing of robotics, rather than commonplace interactivity, increasingly applicable.
The definitions of simulation in (Seibt, 2017) account for granularity of partition of a process, but not for the role of degrees of access and actuation. In the above example, access includes who can change the settings, and to what extent a user is aware or in control of which information is stored based on the interaction. It is the user-centered aspect of each sensing module of a robot as a sense-think-act-loop system. Both devices access the user's state through voice and video sensors, as well as through the data provided directly through an app, including the device name and other settings as noted above.
Actuation refers to the acting-module counterpart in this user-centered view, whether mechanical motion, sounding of speakers with synthesised natural-language speech output, triggering start-up or shut-down without explicit user action, or otherwise. Actuation is also subject to different granularity or partition of action. For example, the GBX device may suggest that the user grab an umbrella, which is an initiation of action but not the action itself. The GBX device can also actuate certain actions, despite being a "hologram," like triggering affectionate messages to be sent to the user while they are at work, or turning lights on or off: "[the user is] connected with [their] character even outside [their] home, and the character can control all the home appliances" (LINE_DEV, 2017, 1:57). For this actuation, access to third-party services is necessary. Relative to GBX, RBH, although embodied and human-sized, is more limited in actuation as it pertains to simulation of emotional intimacy. For example, asked about its features, RBH responds, "I'm equipped with sensors to maximize my effectiveness during sexual activity," which can be interpreted as a comment about access to sensor data, and about actuation: processing or storage of sensor data to inform some part of its functionality, which is elaborated only as follows: "the calculations necessary for sex are really simple, that's like playing [a game], if you're pushing the buttons at the right time, you're going to get through the levels, so that's pretty simple math, really" (Canepari, Cooper, & Cott, 2017, 3:54).
Anthropomorphized embodiment as a defining characteristic of a sex robot in public perception (Scheutz & Arnold, 2016) brings forward questions of access to, and actuation in, the physical spaces shared by the robot and its user. From the developer talk on GBX (LINE_DEV, 2017), one can infer that the access and actuation are indirect and limited. Seibt writes that "for the purposes of social robotics, one might argue, the first two modes of simulation are not really relevant since it does not matter whether two processes are input-output equivalent in all regards—all that matters is input-output equivalence relative to the capacities of human observation" (Seibt, 2017, p. 24). Due to incidental and intentional institutional obscurity, however, no observer external to the process of creation or production has unimpeded observation capacity (Pasquale, 2015). The delegation of intimate tasks, or placement in an intimate space, does not in itself confer a greater capacity to support building an intimate relationship with the user. Greater access or actuation, rather than improving the observed simulation of sociality, in fact induces an increasingly coercive relationship between the organisations that directly or indirectly mediate that access or actuation and the user, who has limited control and understanding.
In this way, access and actuation capabilities may undermine the potential to build an intimate relationship, where intimacy is defined as mutual disclosure over time (Aron et al., 1997). Sullins writes, "in order for an advanced robot [to] be successful in the role of a companion... these future systems will need to elicit and manage both the strong human emotions that can accompany the social milieu these machines are designed to be deployed in. In addition, they must appropriately manage the more subtle and continuously changing affective mood of the people that come into contact with the machine on a daily basis" (Sullins, 2012, p.399).
Conversational user interface is one technology used in disclosure over time that claims a degree of directness, and is currently the primary "smart" capability of systems like RBH:
"With the AI, I think we gotta be careful with that. Getting the doll confused when you're talking to her and she says some things that make absolutely no sense ... That could ruin the whole buildup, and you never want to go to the bedroom, because you think, gosh my doll is dumb. You wanna have that illusion that she's actually talking to you, and she's got sentience. That's what overwhelms me. That's what takes the longest." - Creator of RBH in an interview (Canepari, Cooper, Cott, 2017, 4:08)
Both RBH and GBX support a technically limited capacity for progressive mutual disclosure, including asking questions and presenting a coherent agent through answers to user questions. There is a history of conversational user interfaces combining elements of rudimentary AI and careful conversation design to effectively engage human users.
Developed in 1966, the text-based program ELIZA allowed a user typing in plain English at a computer terminal to interact with a machine in a semblance of a normal conversation (Weizenbaum, 1966). Meant as a parody of a Rogerian psychotherapist, and as proof of the superficiality of communication between humans and machines, ELIZA far exceeded its initial purpose, spurring enthusiastic reactions from both practicing psychiatrists and people involved in the experiment, who "very deeply…became emotionally involved with the computer" and "unequivocally anthropomorphized it" (Weizenbaum, 1976). Recently, a fully-automated conversational agent implementation of cognitive behavior therapy was shown to be effective in a large randomized trial with young adults experiencing depression and anxiety symptoms (Fitzpatrick, Darcy, & Vierhile, 2017). Both technologies relied on designed paths through the conversation, and on repetition of questions and user statements, to mimic, display, or approximate, in Seibt's terms (2017), conversations within a specific use case and domain.
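The designed-path-and-repetition technique is mechanically simple, which is part of Weizenbaum's point. The following toy sketch is in the spirit of ELIZA's keyword rules and pronoun reflection, but the specific rules and word lists are invented for illustration, not taken from Weizenbaum's program:

```python
# Toy ELIZA-style responder: keyword-matched templates plus first-to-
# second-person "reflection," in the spirit of Weizenbaum (1966).
# The rules below are invented examples, not ELIZA's actual script.
import re

REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

def reflect(text):
    """Swap first-person words for second-person ones, word by word."""
    return " ".join(REFLECTIONS.get(w, w) for w in text.lower().split())

RULES = [
    (re.compile(r"i feel (.*)"), "Why do you feel {}?"),
    (re.compile(r"i am (.*)"), "How long have you been {}?"),
]

def respond(utterance):
    """Match a rule and echo the reflected fragment back as a question;
    fall back to a generic prompt when nothing matches."""
    for pattern, template in RULES:
        m = pattern.match(utterance.lower())
        if m:
            return template.format(reflect(m.group(1)))
    return "Please tell me more."
```

For example, "I feel lonely" becomes "Why do you feel lonely?": the system discloses nothing of itself, yet the repetition of the user's own words sustains the appearance of attentive, personalized engagement.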
Although the agent performed in such a conversation presents itself as a coherent unit, considering the issues of access and actuation reveals the reality of a multi-component architecture at both low and high levels. In the case of RBH and GBX, the system has to perform speech recognition and speech synthesis. In GBX, conversation can be triggered by pushing a logo-button on the device; without this kind of explicit trigger from the user, the system would need to also monitor ambient noise and determine when to engage and when to stay quiet. RBH includes not only an animatronic talking sex doll but also a VR experience. Even without the VR component, the doll and its "smart" head have settings controllable through an accompanying mobile app. GBX also offers a mobile integration to support messaging the user while they are at work. The use of multiple form factors, including leveraging existing hardware platforms already embedded in the user's life, is a way to overcome the limitations particular to any one platform.
Modularity enables the development of complex software/hardware systems and is a prevalent conceptual tool in computer science practice, while also preventing critical holistic conceptualization (Sengers, 1998). Based on a talk about GBX, it is clear that remote processing, including the use of third-party (Google) services, was necessary to make development possible in a team of two people, one of whom was dedicated to hardware rather than software (LINE_DEV, 2017).
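How modularity hides where computation actually happens can be sketched in a few lines. The component names and the local/remote split below are our illustrative assumptions, not the actual GBX or RBH architecture:

```python
from dataclasses import dataclass
from typing import Protocol

class Component(Protocol):
    """Each module exposes a narrow interface; its internals -- and
    whether it runs locally or on a third-party server -- are hidden."""
    def process(self, data: str) -> str: ...

@dataclass
class LocalWakeTrigger:
    # Stands in for a fully local stage, e.g. a physical button press.
    def process(self, data: str) -> str:
        return data

@dataclass
class RemoteSpeechRecognizer:
    # Stands in for a third-party cloud API: data leaves the home,
    # but the stage looks identical to any other from the outside.
    endpoint: str
    def process(self, data: str) -> str:
        return f"[transcribed via {self.endpoint}] {data}"

def pipeline(components: list[Component], data: str) -> str:
    # Composition presents one coherent agent to the user; the module
    # boundaries, and what crosses them, are not observable.
    for c in components:
        data = c.process(data)
    return data
```

From the caller's perspective, `LocalWakeTrigger` and `RemoteSpeechRecognizer` are interchangeable stages; nothing in the shared interface reveals that the latter sends data off-device — precisely the obscurity of access at issue here.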
## Discussion and Summary
We provide a conceptual tool and a set of pragmatic questions focusing on issues of access and actuation in a sex robot, seen as an interactive hardware/software multi-component system. We build on literature on sex robots, social robots, and robots in the engineering sense, as well as on supplementary materials associated with several specific examples. These examples were chosen because they were designed with the intention of creating companionship or emotional intimacy, and because they are accompanied by detailed materials explaining the technical function of each system (LINE_DEV, 2017; Engadget, 2018). We apply definitions from different disciplines to these examples, emphasizing how the philosophical framework of simulation of sociality (Seibt, 2017) is informed by adopting a multi-component, modular perspective (cf. Siegel, 2003). Considering whether it might be possible, with current or near-future technology, to build a sex robot which allows for an intimate relationship, we stress the role of constraints of production and maintenance of that technology, in addition to the extent to which the many technical modules, taken together, can provide a compelling simulation.
An interactive software/hardware system with an increasing degree of access and actuation relative to the intimate sphere of the user may undermine the possibility of intimacy. Considering Seibt's definitions of simulation of sociality (Seibt, 2017) as defining a scale of a putative robot being social or asocial, the extension to include access and actuation enables viewing the robot as potentially antisocial. Whereas Seibt's framework focuses on a human observer in defining the extent of the simulation relative to a process partition, antisocial behaviour can occur beyond direct observation. Consumer technologies can take on antisocial aspects as a result of their social (Gregg, 2011) and institutional (Pasquale, 2015) contexts, and sex robots, along with other devices claiming companionship, are no exception.
From the framing developed in the prior sections, this antisocial quality arises from the obscurity of access and actuation. Below are some concrete questions which are not answered by the supplementary materials of either GBX or RBH, but which could be asked of any technology in the process of development or maintenance by empirical researchers, journalists, and others to enable a more holistic (Sengers, 1998) and critical (Agre, 1997) approach to sex robots:
* Which spaces does each hardware component of the system have physical access to?
* Does it have deliberate access to those spaces, or only access incidental to the context of use?
* Can it move from one room to another, or must it be moved from one room to another?
* Does it model different areas (like a smart home map) or does it have only an immediate, decontextualized feedback-based model of space (like a cleaning robot that moves semi-randomly and avoids obstacles)?
* What contextual data, such as location and timeframe, is recorded? Even if the system does not have physical access to a room, is it aware of it through sensors?
* Is it always turned on? Is it able to turn itself on or off?
* Does it have memory? What can trigger memory changes (either additive or subtractive): user action, developer action, customer-support action, automated action?
**Limitations.** Our discussion does not apply to all sex robots, and the specific illustrative examples are discussed only with respect to access and actuation, rather than from a broader perspective. We considered only devices created with the explicit purpose of companionship and used by people willing to enter into a social interaction with a non-human object. Sex robots which are built not for intimacy but solely for disturbing or pathological contexts, such as rape or pedophilia, are not covered by this discussion. Furthermore, the particular examples we focus on, RBH and GBX, originate from different markets and cultures and are produced by small teams informed by different histories: RBH is produced by a company with a significant foothold in the US sex doll market (Canepari, Cooper, Cott, 2017), whereas the GBX device is produced in Japan by a team of three people whose prior activity was in consumer technologies more generally (LINE_DEV, 2017). These differences are beyond the scope of our discussion.
**Future Work.** Intimacy involves vulnerability with another being who does not exploit that vulnerability (Laurenceau, Barrett, & Pietromonaco, 1998). This is undermined by the modular software and hardware architecture of a robot (cf. Sengers, 1998; Siegel, 2003) and by the resulting obscurity, to the user, of access and actuation (Pasquale, 2015). To those directly involved in production or maintenance, this obscurity may pose different challenges altogether. Hoel and Van der Tuin write:
Like living beings, an evolved technical being is characterized by the way that it creates a milieu around itself, and in the same stroke takes on the role as a condition on which the functioning of the technical object depends…. Technology does not tap into a natural flow (nature as resource) since technology functions with nature in such a way that nature only gets to condition technology once a relation between them is at work. Establishing and maintaining such a relation is not frictionless but involves an in(ter)ventional process. This perpetual in(ter)vention does not leave the human untouched either: the human, in its multiple roles, is displaced such that as an inventor, she appears to stand at the end of her invention, and as an end-user, she becomes the condition of possibility of the technology used. (Hoel & Van der Tuin, 2013, p.197)
In this chapter, we reviewed existing work on how relatively simplistic technologies are able to create "compelling and affective relations" (Sullins, 2012, p.398), and we argued that a major barrier to intimacy as mutual disclosure lies in the obscurity of access and actuation. As one outcome of this conceptualization, we propose that future work consider the impact of a user's technical closeness on the relationship between the user and a sex robot, both in the sense of modifying or co-creating the robot, and in the sense of feeling confident in understanding how it works. A further line of inquiry could also explore the outcome of a rupture: a surprising disconnect between the sense of control and understanding and the reality of an uncontrollable and/or un-understandable system.
Aron, A., Melinat, E., Aron, E. N., Vallone, R. D., & Bator, R. J. (1997). The experimental generation of interpersonal closeness: A procedure and some preliminary findings. Personality and Social Psychology Bulletin, 23(4), 363-377.
Agre, P. (1997). Toward a critical technical practice: Lessons learned in trying to reform AI. Social Science, Technical Systems and Cooperative Work: Beyond the Great Divide. Erlbaum.
Beck, J. (2014). A (Straight, Male) History of Sex Dolls. The Atlantic, 6.
Berg, A. J. (1994). A gendered socio-technical construction: the smart house. In The Social Shaping of Technology (1999), pp. 301-313
Breazeal, C. (2003). Toward sociable robots. Robotics and autonomous systems, 42(3-4), 167-175.
Canepari, Z., Cooper, D., & Cott, E. (2017). The Uncanny Lover [Video file]. The New York Times. Retrieved from https://www.nytimes.com/video/technology/100000003731634/the-uncanny-lover.html
Cetina, K. K., Schatzki, T. R., & Von Savigny, E. (Eds.). (2005). The practice turn in contemporary theory. Routledge.
Engadget. (2018). Sex Robot hands-on at CES 2018. Retrieved from https://www.youtube.com/watch?v=gO9KrOhJ5NM
Fitzpatrick, K. K., Darcy, A., & Vierhile, M. (2017). Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): A randomized controlled trial. JMIR Mental Health, 4(2), e19. doi:10.2196/mental.7785
Fong, T., Nourbakhsh, I., & Dautenhahn, K. (2003). A survey of socially interactive robots. Robotics and autonomous systems, 42(3-4), 143-166.
Gatebox Lab. (2016, Dec 13). Gatebox - Virtual Home Robot [PV]_english. Retrieved from https://www.youtube.com/watch?v=nkcKaNqfykg
Gregg, M. (2011). Work's intimacy. Cambridge, UK: Polity Press.
Haraway, D. (1988). Situated knowledges: The science question in feminism and the privilege of partial perspective. Feminist studies, 14(3), 575-599.
Hoel, A. S., & Van der Tuin, I. (2013). The ontological force of technicity: Reading Cassirer and Simondon diffractively. Philosophy & Technology, 26(2), 187-202.
Kahn, P. H., Jr. (2011). Technological nature: Adaptation and the future of human life. Cambridge, MA: MIT Press.
Kahn, P. H., Jr., Gary, H. E., & Shen S. (2013). Children's social relationship with current and near-future robots. Child Development Perspectives, 7, 32-37. doi:10.1111/cdep.12011
Kahn, P. H., Jr., Ishiguro, H., Friedman, B., Kanda, T., Freier, N. G., Severson, R. L., & Miller, J. (2007). What is a human? Toward psychological benchmarks in the field of human-robot interaction. Interaction Studies: Social Behaviour and Communication in Biological and Artificial Systems, 8(3), 363-390. doi:10.1075/is.8.3.04kah
Kahn, P. H., Jr., Kanda, T., Ishiguro, H., Gill, B. T., Shen, S., Gary, H. E., & Ruckert, J. H. (2015, March). Will people keep the secret of a humanoid robot? Psychological intimacy in HRI. In Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction (pp. 173-180). ACM.
Laurenceau, J. P., Barrett, L. F., & Pietromonaco, P. R. (1998). Intimacy as an interpersonal process: The importance of self-disclosure, partner disclosure, and perceived partner responsiveness in interpersonal exchanges. Journal of personality and social psychology, 74(5), 1238.
Levy, D. (2009). The ethical treatment of artificially conscious robots. International Journal of Social Robotics, 1(3), 209-216.
LINE_DEV. (2017, Oct 12). Gatebox: How we got here and where we're going -English version- [Video File]. Retrieved from https://www.youtube.com/watch?v=Fhn20nIFBQ0
Mankins, J. C. (1995). Technology readiness levels [White paper]. Retrieved July 31, 2018, from University of Colorado: https://www.colorado.edu/ASEN/asen3036/TECHNOLOGYREADINESSLEVELS.pdf
Massumi, B. (1987). Realer than real: The simulacrum according to Deleuze and Guattari. Copyright 1: 90–97.
Nevejans, N. (2016). European civil law rules in robotics. European Union. Retrieved July 31, 2018, from European Parliament: http://www.europarl.europa.eu/RegData/etudes/STUD/2016/571379/IPOL_STU(2016)571379_EN.pdf
Pasquale, F. (2015). The black box society: The secret algorithms that control money and information. Harvard University Press.
Reis, H. T., & Shaver, P. (1988). Intimacy as an interpersonal process. Handbook of personal relationships, 24(3), 367-389.
Scheutz, M., & Arnold, T. (2016, March). Are we ready for sex robots? In The Eleventh ACM/IEEE International Conference on Human Robot Interaction (pp. 351-358). IEEE Press.
Scheutz, M., & Arnold, T. (2017). Intimacy, bonding, and sex robots: Examining empirical results and exploring ethical ramifications. Unpublished manuscript.
Siegel, M. (2003, June). The sense-think-act paradigm revisited. In Robotic Sensing, 2003. ROSE'03. 1st International Workshop on (pp. 5-pp). IEEE.
Seibt, J. (2017). Towards an ontology of simulated social interaction: Varieties of the "As If" for robots and humans. In R. Hakli & J. Seibt (Eds.), Sociality and normativity for robots: Philosophical inquiries into human-robot interactions (pp. 11-39). Springer, Cham.
Sengers, P. (1998). Anti-boxology: Agent design in cultural context (Technical Report No. CMU-CS-98-151). Carnegie Mellon University, Pittsburgh, PA: Department of Computer Science.
Suchman, L. (2002). Located accountabilities in technology production. Scandinavian journal of information systems, 14(2), 7.
Sullins, J. P. (2012). Robots, love, and sex: the ethics of building a love machine. IEEE transactions on affective computing, 3(4), 398-409.
Weizenbaum, J. (1966). ELIZA—a computer program for the study of natural language communication between man and machine. Communications of the ACM, 9(1), 36-45.
Weizenbaum, J. (1976). Computer power and human reason: From judgment to calculation. W. H. Freeman & Co., New York, NY, USA.
Winner, L. (2009). Do artifacts have politics? Readings in the Philosophy of Technology, 251-263.