<?xml version="1.0"?>
<!DOCTYPE rfc SYSTEM "rfc2629.dtd" [
<!ENTITY RFC2119 SYSTEM "http://xml.resource.org/public/rfc/bibxml/reference.RFC.2119.xml">
]>
<?xml-stylesheet type='text/xsl' href='rfc2629.xslt' ?>
<?rfc strict="yes" ?>
<?rfc toc="yes"?>
<?rfc tocdepth="4"?>
<?rfc symrefs="yes"?>
<?rfc sortrefs="yes" ?>
<?rfc compact="yes" ?>
<?rfc subcompact="no" ?>
<rfc category="std" docName="draft-bray-privacy-choices-01" ipr="trust200902">
  <!-- ***** FRONT MATTER ***** -->

  <front>
    <title>Privacy Choices for Internet Data Services</title>

    <author fullname="Tim Bray" initials="T." role="editor"
            surname="Bray">
      <organization>Textuality Services</organization>

      <address>
        <email>tbray@textuality.com</email>
	<uri>https://www.tbray.org/</uri>
      </address>
    </author>

    <date year="2015" month="April" />

    <abstract>
      <t>This document argues in favor of Internet service providers deploying
      technologies which offer increased privacy to users of their services.
      The discussion is independent of any particular privacy technology. The
      approach is to consider common objections to the deployment of such
      technologies, and show that these objections are not well-founded.</t>
    </abstract>
  </front>

  <middle>
    <section title="Introduction">
      <t>Privacy issues are becoming increasingly important to users of
      Internet services, and the providers of those services must choose how
      much privacy it is appropriate to provide to their users.</t>
      <t>When discussing the deployment of privacy technology, certain
      objections are encountered repeatedly: that privacy protection is
      inappropriate for freely available public information and
      "brochure-ware", that it is too flawed to be worthwhile, that privacy
      choices are best left to end users, and that the cost of deploying
      privacy protection is too high.</t>
      <t>This document considers these arguments and shows that they are
      flawed; the conclusion is that in almost every case, the best choice for
      a service provider and its users is the one that maximizes privacy.</t>
    </section>
    <section title="Summary">
      <t>This document attempts to establish the following:
      <list style="numbers">
	<t>Whether or not information is considered "public" is not a good
	criterion for deciding whether to deploy privacy technologies on
	behalf of its users.</t>
	<t>Privacy choices are difficult and context-dependent, so it is
	inappropriate to ask end users to make them.</t>
	<t>Privacy technologies offer benefits to users of data services even
	when those technologies are imperfect.</t>
	<t>Cost should not be a significant factor while considering the
	deployment of privacy technologies.</t>
      </list>
      </t>
    </section>
    <section title="Terminology">
      <t>The term "data service" means any Internet-mediated offering that is
      accessible to the general public.  Examples would include Web sites,
      HTTP APIs, streaming media, and various flavors of chat.</t>
      <t>In this document, "privacy protection" means technology whose
      deployment increases the cost and difficulty, for anyone but the user and
      provider of a data service, of ascertaining who is accessing which
      services and what messages are being exchanged between the user and the
      service. Obvious examples are encryption and authentication
      technologies.</t>

    </section>
    <section title="Background">
      <t>This section establishes two background facts that will
      serve to support this document's central arguments.</t>
      <section title="Asymmetric failure cost">
	<t>There are two classes of privacy-related failure in the
	operation of data services.
	A positive failure occurs when
	privacy was provided but was not necessary; a negative failure occurs
	when privacy was not provided but was necessary for prudent
	use of the data service.</t>
	<t>The cost of these failure classes is not symmetric; negative
	failures can endanger businesses, property, and lives, while
	positive failures usually incur at most a little extra expense.</t>
      </section>
      <section title="Privacy technology cost">
	<t>A wide variety of privacy technologies are available to Internet
	data service providers.  They include public-key infrastructure,
	transport-level encryption, server-side encryption-at-rest, and
	token-based authentication/authorization technologies which reduce
	the use of passwords.</t>
	<t>In every case, the monetary and engineering cost of acquiring the
	necessary resources and deploying the required software has been
	falling steadily in recent years, both absolutely and as a proportion
	of the total cost of service development and deployment.</t>
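	<t>As an illustration of this falling cost, the following sketch
	(illustrative only, not part of any standard) shows that a modern
	language runtime enables strict transport-level encryption in a few
	lines; Python's standard library is used here, and its defaults
	already enforce certificate and hostname verification.</t>
	<figure><artwork><![CDATA[
```python
# A minimal sketch (illustrative only): Python's standard library
# provides a TLS context whose defaults already require certificate
# validation and hostname checking, so enabling transport-level
# encryption adds almost no engineering cost.
import ssl

ctx = ssl.create_default_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # certificate checking is on
print(ctx.check_hostname)                    # hostname checking is on
# ctx.wrap_socket(sock, server_hostname=host) would then upgrade an
# ordinary TCP socket to an encrypted one.
```
]]></artwork></figure>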
      </section>
    </section>
    <section title="Common objections to privacy-technology deployment">
      <section title="Free public data">
	<t>It is reasonable to question whether, for freely-available public
	data, such as the contents of an online reference work or a
	promotional Web site, it makes sense to deploy privacy
	protection.</t> 
	<t>Unfortunately, it is very difficult to predict when a person
	accessing online information might suffer negative consequences. For
	example, some 
	governments criminalize certain behaviors to the extent that
	accessing free public reference documents concerning that behavior
	could lead to arrest and prosecution.</t>
	<t>Bearing this in mind, and given the asymmetric cost of privacy
	failure modes, the conclusion is that the "public" or "free" status of
	information is not a good argument against the deployment of privacy
	technology.</t> 
	<t>There is another, subtler point: If one groups available data
	services into those which are non-controversial and thus require no
	privacy protection, and those which are controversial and do, some
	will conclude that anything with privacy protection must be
	controversial and thus subject to suspicion.  This stigmatizing effect
	is best avoided.</t>
      </section>
      <section title="Privacy at user option">
	<t>It is often argued that privacy choices are best left to the users
	of data services; and thus, that opt-in privacy is an appropriate
	strategy. </t> 
	<t>However, the technical and social factors forming the context for
	such choices are complex; even experts often disagree on privacy
	requirements. Thus, the end-users of a data service are likely not
	well-equipped to make good choices.</t>
	<t>Bearing this in mind, and given the asymmetric cost of privacy
	failure modes, it is usually best to remove the necessity for making
	these choices, by always providing the maximum practical
	amount of privacy protection.</t>
      </section>
      <section title="Failures of privacy technology">
	<t>Internet privacy technologies are known to be
	imperfect. Cryptographic algorithms have been compromised, and there is
	widespread dissatisfaction with the public-key infrastructure (PKI).</t>
	<t>Furthermore, it is widely agreed that an attacker who wishes to
	attack a target’s privacy has many means, ranging from social
	engineering to hardware hacking to zero-day exploits, to bypass
	privacy protection.</t> 
	<t>Therefore, it is reasonable to question the deployment of privacy
	protection, on the grounds that it may create an expectation of safety
	that cannot in fact be met.</t>
	<t>However, this line of argument fails on economic
	grounds. Deployments of privacy technology, however imperfect,
	generally have the effect of increasing the cost to an attacker of
	invading end-users' privacy.  Every time that cost goes up, certain 
	surveillance activities, whether by government bodies or criminals,
	become uneconomic and will be abandoned, with the effect of globally
	increasing the security and privacy of Internet data services.</t> 
      </section>
      <section title="Affordability of privacy technology">
	<t>Privacy technologies are not free; there are monetary costs for
	accessing public-key infrastructure, and bandwidth/computation costs
	related to encryption, authentication, and authorization.</t>
	<t>Service providers may find it difficult to justify such expenses,
	particularly those who have severe budget constraints.</t> 
	<t>However, the monotonic decline in privacy technology costs
	decreases the force of this argument with every passing year.  It is
	hard to imagine a situation where an organization can afford to
	acquire server resources, domain names, Internet connectivity, and
	software deployment expertise, but still cannot afford to offer
	privacy protection.</t>
	<t>There is a subtle related issue: Those who are operating
	on low budgets are often providing data services to disadvantaged
	groups, whose members may be in particular need of privacy
	protection.</t>
      </section>
    </section>
  </middle>

  <!--  *****BACK MATTER *****
  <back>

    <references title="Normative References">
      &RFC2119;
    </references>

  </back> -->
</rfc>
