Consciousness And Moral Status Reading Group: Week 9
===
###### tags: `Animal` `Consciousness` `Ethics` `Phenomenal Consciousness`
:::info
- **Reading:** Carruthers, P. (2019). Human and animal minds: The consciousness questions laid to rest. Oxford University Press. Chapter 8 (pp. 165-188)
- **Date:** Sep 29, 2020 12:00 PM (LONDON)
- **Host:** MM
- **Reference:** [Last week's meeting](/@tanzor/value8)
:::
I didn't understand:
---
- [name=MM] What is THIS-E/R?
> [name=PC] "In what follows, partly to emphasize that the concepts in question are purely first-person ones, I propose to use the notation “this-R” for a concept applying to an experience of red (either a specific shade or in general), and “this-E” for a concept that applies to one’s phenomenal experience as a whole."
>> [name=UK] thank you! I wanted to just write "WHAT?"
- [name=MM] Carruthers argues that there is no fact of the matter about phenomenal consciousness in animals. What does he mean, and why?
> [name=PC] "Suppose we had complete knowledge of the functional and representational components of the mentality of a monkey, or a salmon, or a honey bee. [...] Would it add anything to our knowledge to have done the comparative work needed to know whether states in these animals are more like globally broadcast ones in humans (on all reasonable ways of precisifying the concept of global broadcasting, perhaps) than they are similar to any of the various kinds of unconscious state that humans undergo, or vice versa? I suggest not. [...] There is no extra property of the mind of the animal that accrues to it by virtue of its similarity or dissimilarity to the human global workspace. As a result, there is no substantive fact of the matter concerning consciousness in nonhuman animals."
> [name=PC] "We can conclude that although global-workspace theory could provide the basis for a categorical concept to employ across species, it seems likely that only a small subset of species (perhaps a set restricted to humans) would qualify as determinately phenomenally conscious, on this approach. In connection with the vast majority of species, the phenomenally conscious status of their perceptual states would have to be something that we stipulate rather than discover."
> [name=PC] "By the same token, then, it must be illegitimate to project my own phenomenal concepts into minds that are significantly different from my own. For doing so presupposes that the minds in question are other than they really are."
>> [name=UK] I couldn't decide which of the following he meant:
> > - That it's still unknown whether other animals have p-cons (in which case, why isn't this a matter for scientific investigation?)
> > - That it can never be known (in which case, how can it be known about other humans?)
> > - That there is nothing to know, because p-cons is just a name that we give from a first-person perspective to what happens in GWT
> > > [name=MM] I think it is definitely not the first, and I think it is not the second either (he contrasts his view with Dawkins' view in this respect), so it must be the third.
- [name=Nitzan] I didn't understand the distinction between the two definitions of "feeling". General feeling vs. domain-specific? Exteroception vs. somatic feelings?
> [name=MM] I think the difference is very similar to access/phenomenal consciousness?
- [name=MM] What is *The Cambridge Declaration on Consciousness*?
> http://fcmconference.org/img/CambridgeDeclarationOnConsciousness.pdf
- [name=UK] What is “transitive creature consciousness (which can arguably exist in the absence of phenomenal consciousness)”? Being conscious of something without having qualia of it? Sort of like A-cons?
What is the difference between the two types of sentience (phenomenal consciousness and transitive creature consciousness)?
I found it interesting:
---
- [name=MM] The two senses of the word 'feel'.
- [name=MM] Empathy vs. Sympathy as grounds for moral concern.
> [name=UK] Could sympathy exist without assuming something about the internal experience of its subject? i.e. without empathy?
> > We have pointed out two possible meanings for the distinction between empathy and sympathy. The first, 'weaker' version is that empathy is projecting one's own desires and value system onto the subject of empathy, while sympathy involves a theory of mind and the understanding that the subject of sympathy can have wants or desires different from one's own. The stronger meaning is that sympathy does not involve any implicit beliefs about the subject having experience at all: it is avoiding hindering others from obtaining their desires, not in virtue of how this might feel for them. I [MM] think Carruthers means the second here.
- Regan's argument that only creatures who have a sense of their own past and future qualify as having moral rights. Does that mean only humans are worthy?
>[name=MM] An interesting read about the history of this argument: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2666704/
- [name=Anna] The suggestion that we should not empathize with beings/creatures in the absence of phenomenal consciousness.
> We've discussed together how Carruthers' argument is that empathy should not play any role in our ethical judgments of other animals/beings.
- [name=ND] That empathy should not be used as a basis for moral decisions, period.
- How we don't feel empathy when lots of people are involved, e.g. the widespread-famine example.
> [name=MM] https://www.effectivealtruism.org/articles/introduction-to-effective-altruism/
- [name=UK] The idea of unconscious episodic memories
- The difference between the two types of 'feel', and thinking about it the other way around, e.g. when you feel chronic pain in the absence of any real stimulus.
> We've discussed together what counts as an external stimulus for pain/suffering. Does nerve stimulation count?
I wanted to discuss together:
---
- [name=MM] Similarities and differences from Levy's argument. Does Carruthers defend a 'desire-satisfaction' approach to well-being?
- [name=MM] The intuition that a good theory of consciousness and moral worth must explain why e.g. mice are more morally worthy than e.g. [Furbys](https://en.wikipedia.org/wiki/Furby) (that also can be said to have wants, desires, etc.).
> One comment was that maybe this is not a problem with the theory. Maybe we *should* treat Furbys as equal to mice.
- [name=Anna] The notion that inflicting unconscious pain on a being should not be of any moral concern to us.
> [name=MM] And the notion of 'unconscious pain' altogether!
- The experience of the bear: I disagree with the point made. Even if you close your eyes you are still experiencing the bear, even if you are not perceiving it with one of your five senses.
- I disagree with the idea that only humans can feel degraded, e.g. a lion that loses a fight will be shunned from the pride and no longer feels the desire to challenge the leader of the pride, perhaps due to deference or shame. Also, other animals have a concern for hierarchy and reputation, I think.
> [name=MM] Good points!
- [name=UK] Can concepts of the events/items themselves be held without consciousness?
> [name=PC] "In general, our value systems are outward-looking. It is worldly items and worldly events that get appraised as good or bad, and are thus seen as nonconceptually good or bad. Such appraisals don’t depend on the presence of a full global broadcasting architecture, and are surely possible in creatures that lack such an architecture".
- [name=PC] “although active desires in humans are phenomenally conscious whenever we are aware of them, it seems reasonable to judge that their importance derives, not from the phenomenally conscious nature of their content, but rather from their status as desires, as reflecting one’s underlying values and needs.”
> [name=UK] In what sense is it different, then, from the chemosensation of an amoeba? (That is, if we're talking about frustration of an animal's desires as cruelty.) He solves it by saying: “My view is that the states in question need to possess compositional structure (albeit not necessarily propositional in nature; the structures can be map-like or tree-like), and need to interact with one another in ways that are sensitive to those structures”. But I don't understand what that means.
> >[name=MM] This is ambiguous and fuzzy. I wonder if the point here is to simply pivot the discussion from experiential properties to behavioural/cognitive/computational/measurable properties?
Random thoughts here:
---
- The metric by which we determine consciousness is fundamentally anthropocentric and so will always be biased towards humans.
> [name=MM] Yes! And this is especially apparent in the Martian example. If they are qualitatively different from us, there will be no truth-of-the-matter as to whether they possess phenomenal consciousness or not.
## Notes
<!-- Other important details discussed during the meeting can be entered here. -->