This is the first in a series of blog posts exploring depictions of androids, robots, and AI in popular culture, and the philosophical difficulties these depictions raise. In this post I will discuss the fictional android Data, his ability to experience emotions, and what these experiences imply for how we interpret the nature of his consciousness.
In an episode of ‘Star Trek: The Next Generation’ called ‘Descent’, part 1 (Season 6, Episode 26), the android Data experiences the emotion of anger. Data subsequently discusses his emotional experience with his friend Geordi La Forge. La Forge is initially sceptical of Data’s claim to have experienced an emotion and asks him how he could know what anger is. Data asks Geordi to describe his own experiences so he can check whether they are similar to his. Geordi’s attempt to describe his emotional experiences turns out to be embarrassingly inept. The dialogue between Data and Geordi is worth quoting in full, as it reveals a lot about the difficulties in describing the nature of emotions:
“Data: I believe I have experienced my first emotion
Geordi: No offence Data, but how would you know a flash of anger from some kind of power surge?
Data: You are correct that I have no frame of reference to use to confirm my hypothesis. In fact I am unable to provide a verbal description of the experience. Perhaps you can describe how it feels to be angry? I could then use that as a reference.
Geordi: Ok…When I feel angry first I feel…hostile!
Data: Could you describe feeling hostile?
Geordi: Well yeah…it’s like feeling belligerent…combative.
Data: Could you describe feeling angry without referring to other feelings?
Geordi: Hmm…no I guess I can’t. I just… feel angry…
Data: That was my experience as well…I simply…felt angry…” (Star Trek: The Next Generation, ‘Descent’ part 1, 33–35 mins.)
There is an element in the above dialogue that puts one in mind of the Socratic dialogues. In those dialogues Socrates comes across some dupe who claims to understand an abstract concept like ‘Justice’ or ‘Equality’. After a few minutes of questioning by Socrates we realise that the dupe doesn’t in fact understand these concepts, because he cannot answer even simple questions about them. But the discussion between Data and Geordi leads us to a different conclusion. The brief dialogue is meant to give the impression that although Geordi cannot define the nature of emotions (without appealing to other, equally undefined, emotional terms), he understands emotional terms on the basis of his immediate experience.
Since it is Data who has just had his first emotion in the above scene, it is important to understand a bit about him before we can interpret that experience. Throughout the series prior to his first emotion, Data is presented as an intelligent, thoughtful agent who is respected by his colleagues, who is capable of interpreting their behaviour largely accurately, and who uses language that is both coherent and (largely) appropriate to the situation.
Given Data’s linguistic proficiency in communicating with his peers, and his ability to interact with his environment, he is typically treated as a conscious member of his tribe. But the question of whether he really is a conscious agent is never dealt with in sufficient detail. There is an episode in which he has dreams, which is indicative of his being conscious; I will discuss this in more detail later on.
First I want to briefly discuss an early episode of Star Trek: The Next Generation, ‘The Measure of a Man’, in which Data is put on trial to defend his status as a person as opposed to mere property. In the trial it is argued that a creature must meet three criteria in order to be considered a sentient agent: it must be intelligent, self-aware, and conscious. The first two criteria are met: Data demonstrates intelligence through his performance in various tasks, and he is judged to be self-aware because he can verbally describe the scenario he finds himself in. On the question of consciousness, it isn’t proven that he is conscious; it is merely remarked that it is as hard to demonstrate that other humans are conscious as it is to demonstrate that Data is.
If we take Data at his word that he has experienced the emotion of anger, then we will be forced to admit that he is conscious. However, there were moments earlier in the series that would cause one to doubt that diagnosis. Firstly, Data doesn’t appear to feel any pain: he has at times had his head removed, and his arm removed, without indicating any pain or discomfort whatsoever. Along with not feeling pain, he doesn’t seem to experience pleasure; thus while he has had sex, he doesn’t associate it with any pleasant sensations. It should be noted, though, that while there is no evidence he experienced any sensations when having sex, he did afterwards describe the experience as a meaningful one for him.
The question is: can a creature who is incapable of experiencing pain or pleasure have conscious experiences? If no aspect of our environment gave us pleasure or pain, in what sense could we say that such a creature was conscious? In one sense one could argue that consciousness is simply awareness of something. For a person to be aware of x, it doesn’t intuitively seem necessary to experience pain or pleasure. It would appear possible to view a red object as a red object without the experience involving pleasure or pain.
This intuition relies on the representational theory of the mind, which was made famous by Descartes’ argument from error. Descartes’ argument relies on demonstrating that the world as it is revealed by science isn’t the world of experience. Thus, for example, a stick in water can appear to be bent, when in reality the stick is straight. With this disjunction between how the world appears to us and how it really is, the stage was set for a representational theory of the mind. Descartes theorised that the world as it appears to our mind is a representation which we use to make sense of the world, not a direct experience of the real world. Not many people followed Descartes in his dualistic way of conceiving of this issue, but the majority of scientists and philosophers follow him in accepting that the argument from illusion leads to a representational theory of the mind.
Again, consider Data and his experience of the world. A person walks by him wearing a red top; light reflects off the red top and hits Data’s eyes; the light is registered by the eye and information is transmitted from his eyes to his positronic brain, which (somehow) creates a representation of the red object. It is this representation that Data experiences. So if we accept this story, we can argue that Data has conscious experiences in the form of representations. There is some support for this interpretation in an episode in which Data dreams:
In the scene in question we are presented with Data’s dream from a first-person perspective. Now a dream is an example of Descartes’ argument from error. In a dream we purportedly experience a world around us; however, since we are really asleep in our beds, the world we see isn’t real but is rather a representation that is fooling us. Given that Data is portrayed as being capable of dreaming, it seems inescapable to conclude that he has conscious experiences. But it is a strange, disembodied kind of consciousness.
Data is portrayed as having conscious representations of the world which contain rich qualitative experiences of colour, sound, shape, and so on. But other aspects of his behaviour seem unconscious or reflexive. As we saw above, Data seems to have no pain receptors at all. His body can obviously register aspects of his environment and respond appropriately to them. But as portrayed in the show, he doesn’t seem to experience any conscious proprioception as he moves about his world. In this sense Data’s body is similar to the bodies of the robots currently being built by Boston Dynamics.
A Boston Dynamics robot, like Data, is adroit at moving around its environment, and like Data its skilled movement is unaccompanied by any subjective experience. Of course, to a degree we are all like Data or a Boston Dynamics robot as we move around our environments. People who have had a severe stroke and are trying to relearn how to walk quickly discover how much of our movement through the world relies on non-conscious mechanics that have to be relearned post-stroke. But there is a difference, and it is one nicely captured by Heidegger’s distinction between the ‘ready-to-hand’ and the ‘present-at-hand’. Heidegger notes that when we engage in our everyday activities our movements are pre-thematic; they are in the background and we don’t notice them. When working in this state we find things in our environment ‘ready-to-hand’. However, when things go wrong in our relation to our environment, we become aware of ourselves and the objects we are interacting with in a more explicit and theoretical way, called ‘present-at-hand’.
In many cases our movement from ‘ready-to-hand’ to ‘present-at-hand’ involves something like a feeling of pain or a feeling of frustration. So, for example, a guy working in a warehouse who moves things in a particular way on a daily basis will not be aware of his movements until he starts suffering from back pain; once he starts experiencing this pain, his interactions with the objects he is moving will switch from ‘ready-to-hand’ to ‘present-at-hand’. The objects will suddenly acquire a particular salience, and he will need to be more conscious of his movements when interacting with them.
Likewise, if a worker has a well-worn set of behaviours for engaging with some aspect of his environment and these behaviours cease to work, this will inspire some emotions. The worker may feel frustrated or curious, and will have to step back from his behaviour and form a new way of interacting with the objects in his environment.
Data’s situation, though, precludes such switches from ‘ready-to-hand’ to ‘present-at-hand’: Data is incapable of experiencing pain, and he is incapable of experiencing frustration. So if Data were a worker in the above situations, he would have little reason to switch to conscious deliberation when things go wrong. And yet Data is constantly portrayed as experiencing curiosity and wonder. The show never offers an explanation of why a creature who is supposedly devoid of emotions is capable of experiencing curiosity.
Another strange aspect of Data as a concept is his development. He is portrayed as an android who was built with an adult body and who was programmed to move around his environment, speak, respond, and so on. All of this behaviour was built in; he simply used these built-in competencies to learn the nature of the world he lived in through interaction with humans over a number of years. So Data never experienced life in a womb, or as a child entirely dependent on his caregivers; nor did he feel the innate human emotional bond with parents and peers. He never felt embarrassed, or angry, or exhilarated when interacting with childhood friends, parents, or neighbours. He never went through puberty and the emotional changes it brings about.
So with Data we have a weird combination. He is supposedly capable of conscious representations of things such as shape, colour, and size. But he doesn’t experience pain, and while he can touch things and interact with them, such interaction appears to be on the level of a Boston Dynamics robot (with the added ability to theoretically interpret his own movements). Whatever consciousness Data supposedly has, it seems to be entirely disembodied, a magical add-on to his mechanical behaviour.
This leads us back full circle to Data’s conversation with Geordi. Geordi cannot explain his emotional experiences in a satisfactory manner. But he has a lifetime of experience of learning emotional words and keying them to bodily movements and various experiences. He has a lifetime of shaping his emotion words to his embodied experience and using them in a way that his similarly constituted peers can understand. Data, on the other hand, has always been able to use emotion words, but because he had no emotions there was no fit between his words and his experiences. Given Data’s weirdly disembodied Cartesian nature, it is unlikely that even if he suddenly did have new experiences (which he would categorise as emotions), they would line up with human emotions. If Data were suddenly given emotions, he would be like a person who was blind his entire life suddenly being given sight. Even with the ability to see, the ability to interpret distance, size, and so on would take years of learning to perfect. And there is a strong possibility that it would never be entirely perfected.
And there are plenty of difficulties with this representational story. Firstly, while we have a good account of the neural correlates of experience, we have no idea how those correlates produce our experiences in the way they do. Secondly, any representational theory of mind runs the risk of an infinite regress: if experience is of a representation, we seem to need an inner viewer of that representation, whose experience is of a further representation, and so on.
Dennett’s 1976 paper ‘Are Dreams Experiences?’ argues that dreams are not something we directly experience but rather stories we rationally reconstruct after waking. But in the fictional world of Star Trek we are shown Data directly experiencing his dreams, so within this fictional world Dennett’s concerns are moot. Though a philosopher living in the fictional Trek world could argue à la Dennett when Data reported having dreams.