Esprii Chapman – Notes on The Machine Question – 11/28/20 (Book 4 of 5)

“The idea of agency,” Himma explains, “is conceptually associated with the idea of being capable of doing something that counts as an act or action. As a conceptual matter, X is an agent if and only if X is capable of performing action. Actions are doings, but not every doing is an action; breathing is something we do, but it does not count as an action. Typing these words is an action, and it is in virtue of my ability to do this kind of thing that, as a conceptual matter, I am an agent” (Himma 2009, 19–20). Furthermore, agency, at least as it is typically characterized and understood, requires that there be some kind of animating “intention” behind the observed action. “The difference between breathing and typing words,” Himma continues, “is that the latter depends on my having a certain kind of mental state” (ibid., 20). In this way, agency can be explained by way of what Daniel Dennett calls an “intentional system,” which is characterized as any system—be it a man, machine, or alien creature (Dennett 1998, 9)—to which one can ascribe “beliefs and desires” (Dennett 1998, 3). Consequently, “only beings capable of intentional states (i.e., mental states that are about something else, like a desire for X), then, are agents.”

Gunkel, David J. The Machine Question (pp. 18–19). MIT Press. Kindle Edition.

One such alternative can be found in what F. Allan Hanson (2009, 91) calls “extended agency theory,” which is itself a kind of extension of actor-network approaches. According to Hanson, who takes what appears to be a practical and entirely pragmatic view of things, machine responsibility is still undecided and, for that reason, one should be careful not to go too far in speculating about the issue: “Possible future development of automated systems and new ways of thinking about responsibility will spawn plausible arguments for the moral responsibility of non-human agents. For the present, however, questions about the mental qualities of robots and computers make it unwise to go this far” (ibid., 94). Instead, Hanson, following the work of Peter-Paul Verbeek (2009), suggests that this problem may be resolved by considering various theories of “joint responsibility,” where “moral agency is distributed over both human and technological artifacts” (Hanson 2009, 94). This is an elaboration of the “many hands” concept that had been proposed by Helen Nissenbaum (1996) to describe the distributed nature of accountability in computerized society.

Gunkel, David J. The Machine Question (pp. 164–165). MIT Press. Kindle Edition.

To respond to these apparent shifts in the ‘center’ of moral consideration, Channell proposes “a decentered ethical framework that reflects a bionic world view” (ibid., 152), what he calls “a bionic ethic” (ibid., 151). This idea is derived from a reworking of Aldo Leopold’s (1966) “land ethic.” “While the land ethic of Leopold focuses on the organic, and in fact is usually interpreted as being in opposition to technology, it does provide a model for including both the organic and the mechanical into the expanding boundaries of a new ethic. In point of fact, Leopold often explained the interdependence of the biotic elements of nature in terms of engine parts or wheels and cogs” (Channell 1991, 153). Although often distinguished from technological concerns, Channell finds Leopold’s land ethic to provide articulation of a moral thinking that can respect and take responsibility for nonliving objects, not only soils, waters, and rocks but also computers and other technological artifacts. For Channell, connecting the dots between these different concerns is not only a matter of metaphorical comparison—that is, the fact that nature has often been described and characterized in explicit mechanical terms—but grounded in established moral and legal precedent, that is, in the fact that “inanimate objects such as trusts, corporations, banks, and ships have long been seen by the courts as possessing rights”; the fact that some “writers have suggested that landmark buildings should be treated in a way similar to endangered species”; and the fact that “objects of artistic creation . . . have an intrinsic right to exist and be treated with respect” (ibid.).

Gunkel, David J. The Machine Question (p. 166). MIT Press. Kindle Edition.

. . . that the moral person is not some predefined, stable, and well-established ontological position but is, as Dennett (1998, 285) describes it, a “normative ideal.” In other words, “the concept of a person is only a free-floating honorific that we are all happy to apply to ourselves, and to others as the spirit moves us, guided by our emotions, aesthetic sensibilities, considerations of policy, and the like” (ibid., 268).

Gunkel, David J. The Machine Question (p. 173). MIT Press. Kindle Edition.

According to the alternative approaches of Foerst, Dolby, and Stahl, someone or something becomes a moral subject with legitimate ethical standing not through the prior determination and demonstration of his/her/its agency, or the possession of some psychological properties that are considered to be “person-making,” but by being situated, treated, and responded to as another person by a particular community in concrete situations and encounters. This means that “person” is not, as Foerst concludes, some “empirical fact”; instead it is a dynamic and socially constructed honorarium or “gift” (Benford and Malartre 2007, 165) that can be bestowed (or not) on others by a particular community in a particular place for a particular time. For this reason, “person” (assuming it is decided to retain this word) is never something absolute and certain but is always and already a relative term, the assignment of which has its own moral implications and consequences.

Gunkel, David J. The Machine Question (p. 173). MIT Press. Kindle Edition.

Gunkel, David J. The Machine Question. MIT Press, 2012. 
