Internet Privacy: Market Drivers for Change

One of the comments I received on my last privacy post, from Reshika Dhir, rhetorically questioned whether users of cloud computing applications have any real choice to stop using these services in order to reduce their privacy risks. I agree with Reshika that many individuals in first-world countries have reached a point of technological dependence. To add to her comment, I would argue that this dependence stems from the fact that our social groups, peers, and immediate society use these services to enrich their lives and stay connected. Not adopting these technologies could mean being left behind and excluded. The logical conclusion (and the undertone of Reshika's comment) is that the solution is NOT for users to abandon these technological advances altogether. Rather than throw up our arms in despair, I believe that the current and future privacy issues of cloud computing applications can be resolved through legislative changes in conjunction with allowing the market to self-regulate. This post discusses the market's ability to self-regulate in the context of cloud computing applications and emphasizes the need for legal intervention where the market breaks down. I will focus on social networks, and Facebook in particular, for practical examples, but the reasoning applies equally to their competitors and other similar cloud computing services.

Exerting Commercial Influence on Social Networks
Customers best exert commercial influence on companies by taking their business to a competitor. Practically, however, a single user’s ability to exert this influence is frustrated by the fundamental nature of cloud computing applications and social networks.

The nature of cloud computing applications like Facebook's social network is that a large user base both attracts and retains users. Users are attracted to a large social network because there is a greater likelihood that their acquaintances are already part of that network. These large networks are then able to retain users because the cost for a user to switch to a competitor is significant: even users who are dissatisfied with a particular social network service may still not switch for fear of losing their network connections. Because users want large networks, the industry also trends towards an oligopoly or monopoly rather than a perfect market with many competitors. The lack of competition is further exacerbated within geographical regions, since most real-world connections (and hence social network connections) between users are regionally based.

The net effect of this type of market is that a gradual trickle of dissatisfied users from one social network to a competitor's is unlikely to occur. More likely is a mass exodus of organized users switching social networks because of their dissatisfaction. But for such an organized movement to occur, there would need to be an issue that burns brightly for many network users at one time.

With respect to privacy, the majority of the privacy risks that social network users are subject to do not spark the widespread dissatisfaction needed to drive change. There are likely two reasons for this. First, users may be unaware of the risks. Second, most of the privacy risks, though material, are probably too remote for users to appreciate.

That said, when a risk materializes and is experienced by a large number of users over a brief period of time, users' apathy to privacy disappears. One of the best-known examples of this occurred in September 2006 on the Facebook network. A newly implemented News Feed feature was released that published updates on a user's home page about other users' Facebook activities. Users, however, were not adequately notified of this change, nor were they given the option to control what news feeds of their activity were published. Upon discovering this change, Facebook users banded together and threatened to hold a day without Facebook. Responding almost immediately, Facebook added additional privacy controls, and Mark Zuckerberg quickly posted a public statement apologizing for not providing those controls initially. A similar incident occurred in November 2007, when 50,000 Facebook users signed a petition requesting that the new advertising feature Beacon be disabled. Beacon tracked a user's purchases made on affiliate sites such as eBay North America and Travelocity, and then notified the user's network of the purchase. Again, in response to the outcry, Facebook implemented additional privacy controls and Mark Zuckerberg apologized for Facebook's mistake in implementing the feature.

The previous analysis of social networks and these examples are useful because they indicate the type of privacy issues that market forces may be able to regulate. Specifically, privacy risks that materialize and immediately affect a large number of users over a short period of time are ideally left to the market. As an aside, it is interesting to note that what makes it difficult for one user of a social network to switch to a competitor may actually facilitate the organizing of a large group of dissatisfied users to drive change. In Zuckerberg's post, he openly thanked all of the dissatisfied users who created groups, and even went so far as to create a group of his own.

By contrast, privacy risks that are better dealt with through legislation include those that users are unaware of, or that are remote but could significantly damage a user if they materialize. Particular attention should also be paid to privacy risks that a corporation has an immediate financial interest in exploiting or leaving unresolved. Take, for example, the two Facebook incidents mentioned above. The first incident involved a feature with little revenue-generating capacity, and Facebook's response was almost immediate (three days). The second incident, however, involved a feature with significant revenue-generating capacity. It took Facebook close to a month to address, and Zuckerberg openly apologized for the delay.

While market self-regulation is an important driver in resolving the privacy issues created by cloud computing applications, it is important to keep in mind the nature of the market in light of the technology, and the market's shortcomings. Where these markets fail, the law has an important role to play in stepping in and protecting end users.