Responsibility is a tricky word. We all expect people to be responsible, and sometimes we reflect on our own responsibilities, but what exactly does that mean? Does it just mean your mom left you in charge while she went off to run errands, or does it have greater meaning than that?
Corporate Social Responsibility has been a popular idea in business ethics for thirty or so years. The basic idea is that corporations have the capability to massively affect society. For example, when Elon Musk set up his Tesla factory in Fremont, California, he created a bunch of new jobs in an area that had experienced a financial downturn. On the other hand, when corporate giants like Starbucks and Amazon threatened to leave Seattle over a proposed tax meant to help the homeless, they used their economic influence for the worse.
Within ethics, responsibility has many different theoretical approaches and proponents. My own training is in continental philosophy and religious ethics, so I can't say a whole lot about analytic models of responsibility, but there are good resources on them elsewhere. My thoughts here are derived from the work of Emmanuel Levinas, Hans Jonas and H. Richard Niebuhr (H. Richard was the younger brother of the popular American theologian Reinhold Niebuhr, who wrote the Serenity Prayer and happens to be President Obama's favorite theologian).
First, responsibility has a dual nature. Responsibility tends to be obligatory: when we say we have responsibilities, we typically mean duties. But responsibility is also relational: I am responsible to another person for some action. So responsibilities are both to someone else and for something. In the words of Levinas and Niebuhr, responsibility includes responding to another person, as well as ensuring certain things happen (or don't happen). In other words, the particular responsibilities any given person has depend on who they are: whom they are related to and what their role is.
In a big sense, according to Emmanuel Levinas, we are responsible to everyone, or perhaps it's better to say to anyone. Levinas says our encounter with others thrusts moral responsibility on us to ensure their continued existence. Now this sounds like a tall order, but it helps us recognize that our actions affect people in myriad ways. If I choose to outsource my labor to a company with labor violations, I bear some degree of responsibility for what happens to those workers. That's been at the heart of criticisms of Apple's contract with Foxconn, or the Nike sweatshop scandals of the '90s.
Hans Jonas applies the notion of responsibility to technology explicitly. Jonas saw that technologies were beginning to have bigger impacts on the world, with far-reaching consequences both geographically and temporally. So Jonas argued that engineers and designers have a moral responsibility to make sure that their technologies, first, do not result in the death of all human beings, and second, can help make the future a better place. Because a technology like nuclear fission, or even internal combustion, can have long-lasting effects, such as nuclear war or the greenhouse gas emissions driving global warming, technologists have a moral responsibility to create technologies that will not cause long-term harm to human beings.
Finally, H. Richard Niebuhr adds that since responsibility is about responding, it also has to be adaptive. The particular situation at hand, the needs of the other to whom we're responding, and the development of the relationship between the person acting and the person being affected all change the particular moral needs of any given moment. For Niebuhr, responsibility is about maintaining and developing a relationship toward a morally good goal. But, Niebuhr recognizes, different situations require different responses. So responsibility requires understanding how to respond as circumstances change, ideally so that the relationship keeps improving over time.
When we combine these theories, what we get is basically this: tech firms, entrepreneurs, engineers, scientists and designers all have a moral responsibility to the people they affect. Their responsibility is to create technologies that will not cause massive harm in the long run. Moral frameworks like Responsible Research and Innovation (RRI) and Value Sensitive Design (VSD) are designed to implement this responsibility in the R&D phase. But sometimes there are unforeseen consequences--it's hard to believe Henry Ford or James Watt would have foreseen global warming as a consequence of their engine designs. When we begin to see negative effects, tech firms have a responsibility to respond to the problem. This requires tech firms to carefully consider moral consequences before a technology is proposed, while it's in development, in its testing phase, and even after it's released. Is this a lot to expect? Yes, but that's the nature of responsibility as well--the person who is responsible assumes an important burden.
This poses a great challenge to tech people. Think about smartphones, for example. Is it responsible to consumers to make phones that will be obsolete within a couple of years? Is it responsible to third world countries and the environment to produce so many high-end electronics with such a low recycling rate? Is the manufacture of smartphones responsible to the workers involved? It would be hard to take all of these things into consideration, especially in a competitive market. While a company like Fairphone might make concerted efforts to be responsible to third world markets and consumers, it can't compete with larger companies like Apple and Samsung, especially not with respect to being cutting edge. However, tech people, including entrepreneurs, investors, engineers, designers, project leads and others, do need to take into consideration how their technological projects are impacting people, and should make notable efforts to operate in a way that is responsible to everyone involved. It may be difficult--it may even be impossible--but it is necessary for ethical technological development.