Network Working Group                                          A. Cooper
Internet-Draft                                                       CDT
Intended status: Informational                             H. Tschofenig
Expires: April 25, 2012                           Nokia Siemens Networks
                                                                B. Aboba
                                                   Microsoft Corporation
                                                             J. Peterson
                                                           NeuStar, Inc.
                                                               J. Morris
                                                        October 23, 2011

             Privacy Considerations for Internet Protocols
                 draft-iab-privacy-considerations-00.txt

Abstract

   This document offers guidance for developing privacy considerations
   for IETF documents and aims to make protocol designers aware of
   privacy-related design choices.

   Discussion of this document is taking place on the IETF Privacy
   Discussion mailing list (see
   https://www.ietf.org/mailman/listinfo/ietf-privacy).

Status of this Memo

   This Internet-Draft is submitted in full conformance with the
   provisions of BCP 78 and BCP 79.

   Internet-Drafts are working documents of the Internet Engineering
   Task Force (IETF).  Note that other groups may also distribute
   working documents as Internet-Drafts.  The list of current
   Internet-Drafts is at http://datatracker.ietf.org/drafts/current/.

   Internet-Drafts are draft documents valid for a maximum of six
   months and may be updated, replaced, or obsoleted by other documents
   at any time.  It is inappropriate to use Internet-Drafts as
   reference material or to cite them other than as "work in progress."

   This Internet-Draft will expire on April 25, 2012.

Copyright Notice

   Copyright (c) 2011 IETF Trust and the persons identified as the
   document authors.  All rights reserved.

   This document is subject to BCP 78 and the IETF Trust's Legal
   Provisions Relating to IETF Documents
   (http://trustee.ietf.org/license-info) in effect on the date of
   publication of this document.  Please review these documents
   carefully, as they describe your rights and restrictions with
   respect to this document.  Code Components extracted from this
   document must include Simplified BSD License text as described in
   Section 4.e of the Trust Legal Provisions and are provided without
   warranty as described in the Simplified BSD License.

Table of Contents

   1.  Introduction
   2.  Scope
   3.  Threat Model
   4.  Guidelines
     4.1.  General
     4.2.  Data Minimization
     4.3.  User Participation
     4.4.  Security
     4.5.  Accountability
   5.  Example
   6.  Security Considerations
   7.  IANA Considerations
   8.  Acknowledgements
   9.  References
     9.1.  Normative References
     9.2.  Informative References
   Authors' Addresses

1.  Introduction

   All IETF specifications are required by [RFC2223] to contain a
   security considerations section.  [RFC3552] provides detailed
   guidance to protocol designers about both how to consider security
   as part of protocol design and how to inform readers of IETF
   documents about security issues.  This document intends to provide a
   similar set of guidance for considering privacy in protocol design.
   Whether any individual document will require a specific privacy
   considerations section will depend on the document's content.  The
   guidance provided here can and should be used to assess the privacy
   considerations of protocol and architectural specifications
   regardless of whether those considerations are documented in a
   stand-alone section.

   Privacy is a complicated concept with a rich history that spans many
   disciplines.
   Many sets of privacy principles and privacy design frameworks have
   been developed in different forums over the years.  These include
   the Fair Information Practices (FIPs) [OECD], a baseline set of
   privacy protections pertaining to the collection and use of data
   about individuals, and the Privacy by Design framework [PbD], which
   provides high-level privacy guidance for systems design.  The
   guidance provided in this document is inspired by this prior work,
   but it aims to be more concrete, pointing protocol designers to
   specific engineering choices that can impact the privacy of the
   individuals who make use of Internet protocols.

   Privacy as a legal concept is understood differently in different
   jurisdictions.  The guidance provided in this document is generic
   and can be used to inform the design of any protocol to be used
   anywhere in the world, without reference to specific legal
   frameworks.

   The document is organized as follows: Section 2 describes the extent
   to which the guidance offered in this document is applicable within
   the IETF, Section 3 discusses a generic threat model to motivate the
   need for privacy considerations, Section 4 provides the guidelines
   for analyzing and documenting privacy considerations within IETF
   specifications, and Section 5 examines the privacy characteristics
   of an IETF protocol to demonstrate the use of the guidance
   framework.

2.  Scope

   The core function of IETF activity is building protocols.  Internet
   protocols are often built flexibly, making them useful in a variety
   of architectures, contexts, and deployment scenarios without
   requiring significant interdependency between disparately designed
   components.  Although some protocols assume particular architectures
   at design time, it is not uncommon for architectural frameworks to
   develop later, after implementations exist and have been deployed in
   combination with other protocols or components to form complete
   systems.

   As a consequence, the extent to which protocol designers can foresee
   all of the privacy implications of a particular protocol at design
   time is significantly limited.  An individual protocol may be
   relatively benign on its own, but when deployed within a larger
   system or used in a way not envisioned at design time, its use may
   create new privacy risks.  The guidelines in Section 4 ask protocol
   designers to consider how their protocols are expected to interact
   with systems and information that exist outside the protocol bounds,
   but not to imagine every possible deployment scenario.

   Furthermore, in many cases the privacy properties of a system are
   dependent upon API specifics, internal application functionality,
   database structure, local policy, and other details that are
   specific to particular instantiations and generally outside the
   scope of the work conducted in the IETF.  The guidance provided here
   reaches only as far as protocol design can go.

   As an example, consider HTTP [RFC2616], which was designed to allow
   the exchange of arbitrary data.  A complete analysis of the privacy
   considerations for uses of HTTP might include what type of data is
   exchanged, how this data is stored, and how it is processed.  Hence
   the analysis for an individual's static personal web page would
   differ from that for the use of HTTP to exchange health records.
   A protocol designer working on HTTP extensions (such as WebDAV
   [RFC4918]) is not expected to describe the privacy risks derived
   from all possible usage scenarios, but rather the privacy properties
   specific to the extensions and any particular uses of the extensions
   that are expected and foreseen at design time.

3.  Threat Model

   To consider privacy in protocol design it is helpful to consider the
   overall communication architecture and the different actors' roles
   within it.  This analysis is similar to the threat analysis found in
   the security considerations sections of IETF documents.  Figure 1
   presents a communication model found in many of today's protocols,
   in which a sender wants to establish communication with some
   recipient and uses some form of intermediary to do so.  In some
   cases this intermediary stays in the communication path for the
   entire duration of the communication; in other cases it is used only
   for communication establishment, for either inbound or outbound
   communication.  In rare cases a series of intermediaries may be
   traversed.

                                              +-----------+
                                              |           |
                                            ->| Recipient |
                                           /  |           |
                                         ,'   +-----------+
      +--------+    )--------------(   ,'     +-----------+
      |        |   |                | -       |           |
      | Sender |<->|  Intermediary  |<------->| Recipient |
      |        |   |                |`.       |           |
      +--------+    )--------------(   \      +-----------+
           ^                            `.    +-----------+
           :                              \   |           |
           :                               `->| Recipient |
           :..................................>           |
                                              +-----------+

      Legend:

        <....>  End-to-End Communication
        <---->  Hop-by-Hop Communication

        Figure 1: Example Instantiation of Architectural Entities

   This model is vulnerable to three types of adversaries:

   Eavesdropper:  [RFC4949] describes the act of 'eavesdropping' as
      "Passive wiretapping done secretly, i.e., without the knowledge
      of the originator or the intended recipients of the
      communication."
      Eavesdropping is often considered by IETF protocols in the
      context of a security analysis.  Confidentiality protection is
      often employed to defend against attacks based on eavesdropping,
      and [RFC3552] demands that confidentiality be incorporated as a
      security consideration.  While IETF protocols offer guidance on
      how to secure communication against eavesdroppers, deployments
      sometimes choose not to enable such security.

   Intermediary:  Many protocols developed today exhibit a more complex
      communication pattern than simple client-server or peer-to-peer
      communication, as illustrated in Figure 1.  Store-and-forward
      protocols are examples where entities participate in message
      delivery even though they are not the final recipients.  Often,
      these intermediaries require only a small amount of information
      for message routing and/or security.  In theory, protocol
      mechanisms could ensure that end-to-end information is not made
      accessible to these entities, but in practice the difficulty of
      deploying end-to-end security procedures, additional messaging or
      computational overhead, and other business or legal requirements
      often slow or prevent the deployment of end-to-end security
      mechanisms, giving intermediaries greater exposure to
      communication patterns and payloads than is strictly necessary.

   Recipient:  It may not seem intuitive to treat the recipient as an
      adversary, since the entire purpose of the communication
      interaction is to provide information to the recipient.  However,
      the recipient can act as the sender's privacy adversary in two
      respects.  First, the sender may be communicating with the
      recipient unintentionally, whether because of a lack of access
      control or because the sender was not properly informed about
      what data it would be communicating to the recipient.  Second,
      the recipient may choose to use the sender's data in ways that
      contravene the sender's wishes, whether by putting it to some
      purpose that the sender opposes, sharing it with other entities,
      or storing it after the communication session has ended.  Whether
      the recipient becomes an adversary depends on whether it makes
      use of mechanisms that reduce these risks, including informing
      the sender about how his or her data will be used, offering
      choices, and obtaining authorization to receive and use the
      sender's data.

4.  Guidelines

   This section provides guidance for document authors in the form of a
   questionnaire about a protocol being designed.  The questionnaire
   may be useful at any point in the design process, particularly after
   document authors have developed a high-level protocol model as
   described in [RFC4101].

   Note that the guidance does not recommend specific practices.  The
   range of protocols developed in the IETF is too broad to make
   recommendations about particular uses of data or about how privacy
   might be balanced against other design goals.  However, by carefully
   considering the answers to each question, document authors should be
   able to produce a comprehensive analysis that can serve as the basis
   for discussion of whether the protocol adequately protects against
   privacy threats.

   The framework is divided into four sections that address different
   aspects of privacy -- data minimization, user participation,
   security, and accountability -- plus a general section.  Security is
   not elaborated here, since substantial guidance already exists in
   [RFC3552].  Privacy-specific terminology used in the framework is
   [will be] defined in [I-D.hansen-privacy-terminology].

4.1.  General

   a.  Trade-offs.  Does the protocol make trade-offs between privacy
       and usability, privacy and efficiency, privacy and
       implementability, or privacy and other design goals?
       Describe the trade-offs and the rationale for the design chosen.

4.2.  Data Minimization

   a.  Identifiers.  What identifiers does the protocol use to
       distinguish endpoints?  Does the protocol use identifiers that
       allow different protocol interactions to be correlated?

   b.  User information.  What information does the protocol expose
       about end users and/or their devices (other than the identifiers
       discussed in (a))?  How identifiable is this information?  How
       does the protocol combine user information with the identifiers
       discussed in (a)?

   c.  Fingerprinting.  In many cases the specific ordering and/or
       occurrences of information elements in a protocol allow devices
       using the protocol to be uniquely fingerprinted.  Is this
       protocol vulnerable to fingerprinting?  If so, how?

   d.  Persistence of identifiers.  What assumptions are made in the
       protocol design about the lifetime of the identifiers discussed
       in (a)?  Does the protocol allow implementers or users to delete
       or recycle identifiers?  How often does the specification
       recommend that identifiers be deleted or recycled by default?

   e.  Leakage.  Are there expected ways that information exposed by
       the protocol will be combined or correlated with information
       obtained outside the protocol?  How would such combination or
       correlation facilitate user or device fingerprinting?  Are there
       expected combinations or correlations with outside data that
       would make the information exposed by the protocol more
       identifiable?

   f.  Recipients.  In the protocol design, what information discussed
       in (a) and (b) is exposed to other endpoints (i.e., recipients)?
       Are there ways for protocol implementers to choose to limit the
       information shared with other endpoints?

   g.  Intermediaries.  In the protocol design, what information
       discussed in (a) and (b) is exposed to intermediaries?  Are
       there ways for protocol implementers to choose to limit the
       information shared with intermediaries?

   h.  Retention.  Does the protocol or any of its anticipated uses
       require that the information discussed in (a) or (b) be retained
       by recipients or intermediaries?  Is the retention expected to
       be persistent or temporary?

4.3.  User Participation

   a.  Control over initial sharing.  What user controls or consent
       mechanisms does the protocol define or require before user
       information or identifiers are shared or exposed via the
       protocol?  If no such mechanisms are specified, is it expected
       that control and consent will be handled outside of the
       protocol?

   b.  Control over sharing with recipients.  Does the protocol provide
       ways for users to limit which information is shared with
       recipients?  If not, are there mechanisms that exist outside of
       the protocol to provide users with such control?

   c.  Control over sharing with intermediaries.  Does the protocol
       provide ways for users to limit which information is shared with
       intermediaries?  If not, are there mechanisms that exist outside
       of the protocol to provide users with such control?  Is it
       expected that users will have relationships (contractual or
       otherwise) with intermediaries that govern the use of the
       information?

   d.  Preference expression.  Does the protocol provide ways for users
       to express their preferences to recipients or intermediaries
       with regard to the use or disclosure of their information?

4.4.  Security

   a.  Communication security.  Do the protocol's security
       considerations account for communication security, per
       [RFC3552]?

4.5.  Accountability

   a.  User verification.  If the protocol provides for user preference
       expression, does it also define or require mechanisms that allow
       users to verify that their preferences are being honored?  If
       not, are there mechanisms that exist outside of the protocol
       that allow for user verification?

5.  Example

   [To be provided in a future version.]

6.  Security Considerations

   This document describes privacy aspects that protocol designers
   should consider in addition to regular security analysis.

7.  IANA Considerations

   This document does not require actions by IANA.

8.  Acknowledgements

   We would like to thank the participants of the December 2010
   Internet Privacy Workshop, co-organized by MIT, ISOC, the W3C, and
   the IAB, for the feedback they provided.

9.  References

9.1.  Normative References

   [I-D.hansen-privacy-terminology]
              Hansen, M. and H. Tschofenig, "Terminology for Talking
              about Privacy by Data Minimization: Anonymity,
              Unlinkability, Undetectability, Unobservability,
              Pseudonymity, and Identity Management",
              draft-hansen-privacy-terminology-02 (work in progress),
              March 2011.

9.2.  Informative References

   [OECD]     Organization for Economic Co-operation and Development,
              "OECD Guidelines on the Protection of Privacy and
              Transborder Flows of Personal Data", 1980.  Available (as
              of September 2010) at http://www.oecd.org/EN/document/
              0,,EN-document-0-nodirectorate-no-24-10255-0,00.html.

   [PbD]      Office of the Information and Privacy Commissioner,
              Ontario, Canada, "Privacy by Design", 2011.

   [RFC2223]  Postel, J. and J. Reynolds, "Instructions to RFC
              Authors", RFC 2223, October 1997.

   [RFC2616]  Fielding, R., Gettys, J., Mogul, J., Frystyk, H.,
              Masinter, L., Leach, P., and T. Berners-Lee, "Hypertext
              Transfer Protocol -- HTTP/1.1", RFC 2616, June 1999.

   [RFC3552]  Rescorla, E. and B. Korver, "Guidelines for Writing RFC
              Text on Security Considerations", BCP 72, RFC 3552,
              July 2003.

   [RFC4101]  Rescorla, E. and IAB, "Writing Protocol Models",
              RFC 4101, June 2005.

   [RFC4918]  Dusseault, L., "HTTP Extensions for Web Distributed
              Authoring and Versioning (WebDAV)", RFC 4918, June 2007.

   [RFC4949]  Shirey, R., "Internet Security Glossary, Version 2",
              RFC 4949, August 2007.

   [browser-fingerprinting]
              Eckersley, P., "How Unique Is Your Browser?", Privacy
              Enhancing Technologies Symposium (PETS 2010), Springer
              Lecture Notes in Computer Science, 2010.

Authors' Addresses

   Alissa Cooper
   CDT
   1634 Eye St. NW, Suite 1100
   Washington, DC  20006
   US

   Phone: +1-202-637-9800
   Email: acooper@cdt.org
   URI:   http://www.cdt.org/

   Hannes Tschofenig
   Nokia Siemens Networks
   Linnoitustie 6
   Espoo  02600
   Finland

   Phone: +358 (50) 4871445
   Email: Hannes.Tschofenig@gmx.net
   URI:   http://www.tschofenig.priv.at

   Bernard Aboba
   Microsoft Corporation
   One Microsoft Way
   Redmond, WA  98052
   US

   Email: bernarda@microsoft.com

   Jon Peterson
   NeuStar, Inc.
   1800 Sutter St Suite 570
   Concord, CA  94520
   US

   Email: jon.peterson@neustar.biz

   John B. Morris, Jr.

   Email: ietf@jmorris.org