Our smart phones, smart watches, and smart bands promise a lot. They promise to make our lives better, to increase our productivity, to improve our efficiency, to enhance our safety, to make us fitter, faster, stronger and more intelligent. They do this through a combination of methods. One of the most important is outsourcing,* i.e. by taking away the cognitive and emotional burden associated with certain activities. Consider the way in which Google Maps allows us to outsource the cognitive labour of remembering directions. This removes a cognitive burden and potential source of anxiety, and enables us to get to our destinations more effectively. We can focus on more important things. It’s clearly a win-win.
Or is it? Evan Selinger (whom I recently interviewed for my podcast) has explored this question in his work. He has his concerns. He worries that an over-reliance on certain outsourcing technologies may be corrosive of virtue. Indeed, in one case he goes so far as to suggest that it may be turning people into sociopaths. In this post I want to unpack the arguments Selinger advances in support of this view.
1. The Varieties of Outsourcing
Before we get into the arguments themselves, it is worth injecting some precision into our understanding of technological outsourcing. The term is being used in a stipulative fashion in this post. It obviously has economic connotations. In economic discussions ‘outsourcing’ is used to describe the practice whereby economic actors rely on agents or institutions outside of themselves to perform one of their key productive tasks. This is usually done in the interests of cost/efficiency savings. These economic connotations are useful in the present context. Indeed, in my interview with him, Selinger suggested that he liked the economic connotations associated with the term. But he also noted that it applies to many domains. Across all those domains it has a fundamental structure that involves accepting the assistance of another agent or thing and relying on that thing to perform some important labour on your behalf (for more details on this fundamental structure check out the podcast).
For the remainder of this post, we’ll focus solely on ‘technological outsourcing’. We’ll define this as the practice whereby people get their computers, smart phones (etc) to perform certain day-to-day tasks that they would otherwise have to perform themselves.
It is worth drawing a distinction between two major variants of such technological outsourcing:
Cognitive Outsourcing: Using a device to perform cognitive tasks that you would otherwise have to perform yourself, e.g. getting a calculator to perform arithmetical operations instead of doing them in your head.
Affective Outsourcing: Using a device to perform an affective task that you would otherwise have to perform yourself, e.g. getting an app to send ‘I love you’ texts to your partner at predetermined or random intervals.
Presumably there are many more variants that could be discussed. In previous posts I have talked about the decision-making and motivational outsourcing that is sometimes performed by technological devices. But I’ll stick with just these two variants for now. I do so because, as Selinger points out in one of his articles, there is a sense in which cognitive outsourcing has become ethically normalised. Outsourcing cognitive tasks like memorising phone numbers, directions, mathematical operations and so forth is utterly unexceptionable nowadays. And although people like Nicholas Carr may worry about the degenerative effects of cognitive outsourcing, most of the ethical concern focuses on the efficacy of such technologies: if they are better at performing the relevant cognitive task, then we are pretty comfortable with handing things over to them. Affective outsourcing might be more ethically interesting insofar as it has a direct impact on our interpersonal relationships (which is where most affective labour is performed).
Are there many apps that facilitate affective outsourcing? Selinger discusses some in his work. I’ll just mention two here. Both have to do with the affective labour performed within romantic relationships. As anyone who is in a relationship will know, there are many simple but regular affective tasks that need to be performed in order to maintain a smoothly functioning relationship. One of them is to send occasional messages to your partner (when you are apart) to remind them that you still care. The problem is that sometimes we get distracted or have other more important things to do. How can we perform this simple affective task at such times? Apps like Romantimatic and BroApp provide the answer. They allow you to send automatic text messages to your partner at appropriate times. Romantimatic seems fairly well-intentioned, and is primarily designed to remind you to send messages (though it does include pre-set messages that you can select with minimal effort). BroApp is probably less well-intentioned. It is targeted at men and is supposed to allow them to spend more time with the ‘bros’ by facilitating automated messaging. Jimmy Fallon explains in the clip below.
To some extent, these are silly little apps. Selinger speculates that the BroApp may just be a performance art piece. Nevertheless, they are still available for download, and they are interesting because they highlight a potential technological trend which, according to Selinger anyway, might have important repercussions for the ethics of interpersonal relationships.
Why is this? I detect three main lines of argument in his work. Let’s go through each of them in some detail.
2. The Agency and Responsibility Problem
The first major objection has to do with agency and responsibility. Selinger puts it like this in his article ‘Don’t outsource your dating life’:
The more hard and tedious work outsourcing can remove from our lives, the more tempting it will be to take advantage of it.
And yet, that’s exactly why we need to be vigilant.
Defenders of outsourcing believe the Do It Yourself (DIY) ethic has too much cultural prominence and is wrongly perceived as evidence of thrift or even moral virtue. They attribute this mistake to people having difficulty placing a proper value on their time.
Setting a value on time, however, is more complicated than outsourcing boosters lead us to believe.
First, outsourcing can impact responsibility. A calendar isn’t just a tool for offloading memorising commitments. It can be very helpful in organizing busy lives and ensuring we meet our obligations. But delegation can be negative. If you only think of a lover because your phone prompts you to, maybe you’re just not that into your lover.
There are a few things going on in this quote and I want to disaggregate them. The last line, in particular, seems to blend into the issue of deceptiveness (discussed below) since it suggests that dependency on an app of this sort highlights a lack of sincerity in your affective performances. We’ll get to that. For now, let’s just focus on the responsibility bit.
I take it that the idea motivating this objection is that responsibility is an important social and personal virtue. In other words, it is good for us to take responsibility for certain aspects of our lives. It would be wrong (and impossible) to take responsibility for everything (you can’t ‘do it all yourself’) but there are definitely some things for which you should take responsibility. One of those things is your interactions with your romantic partner. Reliance on outsourcing apps and devices undermines that responsibility. It creates a dependency. This means we no longer have the right responsibility-connection with certain reactions we encourage in our partners. This prevents us from taking responsibility for those reactions.
But what is the responsibility-connection? One could write an entire treatise on that topic, but in broad outline you are responsible for an outcome if (a) you cause that outcome through your actions; (b) you know about the important factual and moral properties of that outcome (the epistemic condition); and (c) you voluntarily willed that outcome (the volitional condition). The worry then is that reliance on outsourcing apps prevents one or more of these conditions from being satisfied. For example, you could argue that because the app automatically selects and sends the messages, you don’t cause the eventual outcome (and you may lack awareness of it) and hence you cannot be responsible for it.
Putting it more formally:
- (1) Taking responsibility for certain outcomes is an important social and personal virtue.
- (2) In order to take responsibility for certain outcomes you must cause, know and voluntarily will those outcomes.
- (3) Reliance on outsourcing apps undermines one or more of these conditions.
- (4) Therefore, reliance on outsourcing apps undermines responsibility (from 2 and 3).
- (5) Therefore, reliance on outsourcing apps undermines an important social and personal virtue.
Is this argument any good? I have some fondness for it. I too think that responsibility is an important social and personal virtue, although I reject the desire to over-responsibilise personal failings (a desire that is commonplace in some economic/political ideologies). I think cultivating the power of agency and responsibility is important. For me, the good life doesn’t simply consist in being a passive recipient of the benefits of technology and progress; it requires taking an active role in that progress. And I sometimes worry that technology denies us this active role.
That said, I think the plausibility of this argument will always depend on the mechanics of the particular app. It is not obvious to me that an app like Romantimatic or even BroApp will always sever the responsibility-connection. Using a bit of technology to achieve an outcome that you both desire and intend does not undermine your responsibility. So if you genuinely want to make your partner happy by sending them text messages at certain intervals, and if you merely use the app to execute your intention, then I think you are still sufficiently connected to the outcome to take responsibility for it. Indeed, there is an argument to the effect that technological assistance of this sort actually increases your responsibility because it enhances your ability to perform tasks of this sort.
On the other hand, it does depend on how the app executes your desires and intentions. The more autonomous the app is, the greater the risk of undermining your responsibility. This is something that is widely discussed in the AI and robotics literature. If apps of this sort become more independent from their users — if they start generating and sending message-content that is well-outside the bounds of what their users actually desire and intend — then I think the responsibility argument could work. (For more on responsibility and automation, see my recent paper on robotics and the retribution gap).
3. The Deception/Inauthenticity Objection
A second argument against these apps is that they encourage us to be deceptive in our interpersonal relationships. This deceptive intent is clearly evinced by the makers of the BroApp. The rationale behind the app is that it helps you to maintain the pretense of communicating with your romantic partner when you are actually spending time with your friends (the ‘bros’). Selinger endorses this interpretation of these apps in his article ‘Today’s apps are turning us into sociopaths’**:
…the reason technologies like BroApp are problematic is that they’re deceptive. They take situations where people make commitments to be honest and sincere, but treat those underlying moral values as irrelevant — or, worse, as obstacles to be overcome.
To put this into argumentative form:
- (6) It is a bad thing to be deceptive in your interpersonal relationships.
- (7) Affective outsourcing apps (such as BroApp) encourage deceptive interpersonal communications.
- (8) Therefore, these apps are bad things.
The argument is too general in this form. Premise (6) needs to be finessed. There may be contexts in which a degree of deceptiveness is desirable in interpersonal relationships. The so-called ‘white lies’ that we tell to keep our partners happy may often be justifiable (I pass no judgment on that here). And premise (7) is problematic insofar as not all of these apps encourage ‘deceptiveness’, at least not within the strict definition of that term. Deceptiveness is usually taken to denote an active intent to mislead another as to the context or the truth of what you are saying. It’s not clear to me that automated messaging always involves that kind of active intent. I think someone could set up a pre-scheduled set of messages that sincerely and truthfully convey their feelings toward another person.
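To make that last possibility concrete, here is a minimal sketch of what sincerely pre-scheduled messaging might look like. To be clear, this is my own illustration, not actual Romantimatic or BroApp code; all the names (`ScheduledMessage`, `queue_messages`, `dispatch_due`) are hypothetical:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class ScheduledMessage:
    due: datetime   # when the message should go out
    text: str       # content the user composed in advance
    sent: bool = False

def queue_messages(start, texts, interval_hours=24):
    """Pre-schedule one user-written message per interval, starting at `start`.

    The user composes every message themselves; only the timing is delegated.
    """
    return [ScheduledMessage(start + timedelta(hours=interval_hours * i), t)
            for i, t in enumerate(texts)]

def dispatch_due(queue, now, send):
    """Send every message whose time has come, marking it as sent.

    `send` stands in for whatever delivery channel the app would use.
    """
    for msg in queue:
        if not msg.sent and msg.due <= now:
            send(msg.text)
            msg.sent = True

# Example: the sender composed these messages sincerely, in advance.
outbox = []
queue = queue_messages(datetime(2024, 1, 1, 9, 0),
                       ["Thinking of you this morning", "Miss you today"])
dispatch_due(queue, datetime(2024, 1, 1, 10, 0), outbox.append)
```

The sketch shows why the deception charge is not automatic: nothing in the mechanism requires misleading anyone. What it does make vivid is that the sender need not be present, or even thinking of the recipient, at the moment of delivery, which is exactly the feature the next objection targets.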
What might be really problematic is not so much that these apps encourage deceptiveness but that they undermine the real value of certain types of interpersonal communication. Selinger highlighted this in the interview I did with him. He suggested that there are some interpersonal communications in which the value of the communication lies in the fact that it is an immediate, deliberate and conscious representation of how you feel about another person. In other words, the real value of receiving an affectionate message from a loved one lies in the fact that the other person is really thinking about you at that moment — that they are being intentional and ‘present’ in the relevant communicative context. The problem is that the very logic of these apps — the automated outsourcing of communications — serves to corrode this ‘real’ value. It might make your partner feel good for a moment or two. But it does so at the expense of being consciously present in the communication:
- (9) The real value of certain types of interpersonal communication is that they are immediate, conscious and intentional representations of how we feel.
- (10) Affective outsourcing apps (by their very nature) create communications that are not immediate, conscious and intentional.
- (11) Therefore, affective outsourcing apps undermine the real value of certain types of interpersonal communication.
This is a more formidable objection because it gets to the heart of what these apps are. It is, if you like, as close to an in-principle objection as you are likely to get. But there are several limitations to bear in mind.
First, it clearly doesn’t apply to outsourcing tout court — sometimes there is no value to the immediate conscious performance of an act. The value of dividing up a bill at a restaurant does not lie in the conscious performance of the arithmetical operation; it lies in getting the right result. That said, I think it might apply to more contexts than we first realise. For instance, I happen to believe that the real value of certain cognitive acts lies in the fact that they are immediate, conscious and intentional. That’s how I feel about the value of participation in deliberative political processes, and it is part of the reason why I worry about algorithmic outsourcing in political contexts.
Second, even in those contexts in which it does apply there may be important tradeoffs to consider. To say that the ‘real’ value of certain communications lies in their immediate, conscious and intentional nature is not to say that ‘all’ the value lies in those properties.
Finally, it is worth noting that technology is not the only thing that undermines this value. Internal automaticity (i.e. the unconscious/subconscious performance of certain acts) also undercuts this value and it can be prompted by many non-technological factors (e.g. mindless repetition of a task, stress or busyness at work).
4. The Virtue-Corrosion Objection
The final objection is simply a more general version of the preceding one. The idea is that the good life (i.e. the life of meaning and flourishing) is one in which people develop (and are encouraged to develop) virtuous character traits — e.g. thoughtfulness, generosity, charity, mercy, courageousness and so on. These are performative virtues. That is to say, the goodness lies in the ability to perform actions in a manner that exemplifies and enriches those virtuous traits. It is not enough for you to simply facilitate an outcome that is generous or charitable — you must be responsible for those outcomes through your own actions.
The worry about outsourcing apps — possibly of all kinds but particularly those involving affective performances (since affect is often central to virtue) — is that they discourage the cultivation of these performative virtues. Think about the logic underlying something like Romantimatic or BroApp. It is an instrumentalising logic. It assumes that the goal of interpersonal communication is to produce a desirable outcome in the mind of your romantic partner. But maybe that’s not all it is about. Maybe it is also about cultivating virtuous performances in interpersonal communication. By outsourcing the activity you completely miss this.
I’m not going to spell this out as a logical argument. As I say, I think it may just be a more general repackaging of the preceding objections. Nevertheless, I think it is a valuable repackaging insofar as it highlights once again how the good life involves more than just being the passive recipient of the benefits that technology brings. The virtues are constitutive of the good; they are not mere instruments of the good. If we create outsourcing apps that both allow and encourage us to bypass the virtues we could be in for trouble.
It goes without saying that this objection still needs to be treated with caution. Not all forms of outsourcing will corrode virtue. Many could free us up to become more virtuous by allowing us to focus on the skills and practices that matter most. Also, we must bear in mind what I previously said about responsibility. Outsourcing doesn’t necessarily entail the severing of the responsibility-connection that is needed for cultivating the virtues.
As with most technology-related debates, the issue is not black-and-white.
* You could just call this ‘automation’, but I want to stick with ‘outsourcing’ since it is the term used in the work of Evan Selinger.
** I have no idea whether Selinger was responsible for the article headlines. It’s quite possible (probable) that he was not.