Network Working Group                                          A. Cooper
Internet-Draft                                                       CDT
Intended status: Informational                             H. Tschofenig
Expires: April 25, 2012                           Nokia Siemens Networks
                                                                B. Aboba
                                                   Microsoft Corporation
                                                             J. Peterson
                                                           NeuStar, Inc.
                                                               J. Morris
                                                        October 23, 2011


             Privacy Considerations for Internet Protocols
                 draft-iab-privacy-considerations-01.txt

Abstract

   This document offers guidance for developing privacy considerations
   for IETF documents and aims to make protocol designers aware of
   privacy-related design choices.

   Discussion of this document is taking place on the IETF Privacy
   Discussion mailing list (see
   https://www.ietf.org/mailman/listinfo/ietf-privacy).

Status of this Memo

   This Internet-Draft is submitted in full conformance with the
   provisions of BCP 78 and BCP 79.

   Internet-Drafts are working documents of the Internet Engineering
   Task Force (IETF).  Note that other groups may also distribute
   working documents as Internet-Drafts.  The list of current Internet-
   Drafts is at http://datatracker.ietf.org/drafts/current/.

   Internet-Drafts are draft documents valid for a maximum of six months
   and may be updated, replaced, or obsoleted by other documents at any
   time.  It is inappropriate to use Internet-Drafts as reference
   material or to cite them other than as "work in progress."

   This Internet-Draft will expire on April 25, 2012.

Copyright Notice

   Copyright (c) 2011 IETF Trust and the persons identified as the
   document authors.  All rights reserved.

   This document is subject to BCP 78 and the IETF Trust's Legal
   Provisions Relating to IETF Documents
   (http://trustee.ietf.org/license-info) in effect on the date of
   publication of this document.  Please review these documents
   carefully, as they describe your rights and restrictions with respect
   to this document.  Code Components extracted from this document must
   include Simplified BSD License text as described in Section 4.e of
   the Trust Legal Provisions and are provided without warranty as
   described in the Simplified BSD License.

Table of Contents

   1.  Introduction
   2.  Scope
   3.  Threat Model
   4.  Guidelines
     4.1.  General
     4.2.  Data Minimization
     4.3.  User Participation
     4.4.  Security
     4.5.  Accountability
   5.  Example
   6.  Security Considerations
   7.  IANA Considerations
   8.  Acknowledgements
   9.  References
     9.1.  Normative References
     9.2.  Informative References
   Authors' Addresses

1.  Introduction

   All IETF specifications are required by [RFC2223] to contain a
   security considerations section.  [RFC3552] provides detailed
   guidance to protocol designers about both how to consider security as
   part of protocol design and how to inform readers of IETF documents
   about security issues.  This document intends to provide a similar
   set of guidance for considering privacy in protocol design.  Whether
   any individual document will require a specific privacy
   considerations section will depend on the document's content.  The
   guidance provided here can and should be used to assess the privacy
   considerations of protocol and architectural specifications
   regardless of whether those considerations are documented in a
   stand-alone section.

   Privacy is a complicated concept with a rich history that spans many
   disciplines.
Many sets of privacy principles and privacy design
   frameworks have been developed in different forums over the years.
   These include the Fair Information Practices (FIPs) [OECD], a
   baseline set of privacy protections pertaining to the collection and
   use of data about individuals, and the Privacy by Design framework
   [PbD], which provides high-level privacy guidance for systems design.
   The guidance provided in this document is inspired by this prior
   work, but it aims to be more concrete, pointing protocol designers to
   specific engineering choices that can impact the privacy of the
   individuals who make use of Internet protocols.

   Privacy as a legal concept is understood differently in different
   jurisdictions.  The guidance provided in this document is generic and
   can be used to inform the design of any protocol to be used anywhere
   in the world, without reference to specific legal frameworks.

   The document is organized as follows: Section 2 describes the extent
   to which the guidance offered in this document is applicable within
   the IETF, Section 3 discusses a generic threat model to motivate the
   need for privacy considerations, Section 4 provides the guidelines
   for analyzing and documenting privacy considerations within IETF
   specifications, and Section 5 examines the privacy characteristics of
   an IETF protocol to demonstrate the use of the guidance framework.

2.  Scope

   The core function of IETF activity is building protocols.  Internet
   protocols are often built flexibly, making them useful in a variety
   of architectures, contexts, and deployment scenarios without
   requiring significant interdependency between disparately designed
   components.
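A toy sketch can make this deployment-dependence concrete.  In the
   Python fragment below, everything is hypothetical -- the identifier
   scheme, the two services, and the log formats are invented purely
   for illustration.  A stable protocol identifier is harmless within
   either deployment on its own, but becomes a correlation handle once
   two independently designed deployments are combined:

```python
import hashlib

def stable_id(device_secret: str) -> str:
    """A persistent identifier derived from a device secret
    (hypothetical scheme, for illustration only)."""
    return hashlib.sha256(device_secret.encode()).hexdigest()

# Two independently designed services both log the same protocol field.
service_a_log = [{"id": stable_id("device-1"), "seen_at": "shop.example"}]
service_b_log = [{"id": stable_id("device-1"), "seen_at": "clinic.example"}]

# Neither log is very revealing on its own, but because the identifier
# is stable across deployments, the logs can be joined after the fact,
# linking one device's activity across unrelated contexts.
joined = [
    (a["seen_at"], b["seen_at"])
    for a in service_a_log
    for b in service_b_log
    if a["id"] == b["id"]
]
assert joined == [("shop.example", "clinic.example")]
```

   Nothing in the fragment depends on any particular protocol; the risk
   arises purely from reusing a stable identifier across contexts,
   which is why the data-minimization questions in Section 4 focus on
   identifier lifetime and correlation.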
Although some protocols assume particular architectures
   at design time, it is not uncommon for architectural frameworks to
   develop later, after implementations exist and have been deployed in
   combination with other protocols or components to form complete
   systems.

   As a consequence, the extent to which protocol designers can foresee
   all of the privacy implications of a particular protocol at design
   time is significantly limited.  An individual protocol may be
   relatively benign on its own, but when deployed within a larger
   system or used in a way not envisioned at design time, its use may
   create new privacy risks.  The guidelines in Section 4 ask protocol
   designers to consider how their protocols are expected to interact
   with systems and information that exist outside the protocol bounds,
   but not to imagine every possible deployment scenario.

   Furthermore, in many cases the privacy properties of a system are
   dependent upon API specifics, internal application functionality,
   database structure, local policy, and other details that are specific
   to particular instantiations and generally outside the scope of the
   work conducted in the IETF.  The guidance provided here only reaches
   as far as protocol design can go.

   As an example, consider HTTP [RFC2616], which was designed to allow
   the exchange of arbitrary data.  A complete analysis of the privacy
   considerations for uses of HTTP might include what type of data is
   exchanged, how this data is stored, and how it is processed.  Hence
   the analysis for an individual's static personal web page would be
   different from the use of HTTP for exchanging health records.
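The HTTP example can be sketched concretely.  In the toy Python
   fragment below (the hosts, paths, and helper function are
   hypothetical, invented for illustration), two requests are
   indistinguishable at the level of protocol mechanics even though the
   privacy stakes of what they carry differ entirely:

```python
# Two HTTP requests that are mechanically identical at the protocol
# level; only what they ask for differs.  Hosts and paths are
# hypothetical.
benign = (
    "GET /home.html HTTP/1.1\r\n"
    "Host: www.example.org\r\n"
    "\r\n"
)
sensitive = (
    "GET /records/patient-42 HTTP/1.1\r\n"
    "Host: clinic.example\r\n"
    "\r\n"
)

def protocol_shape(request: str) -> tuple:
    """Reduce a request to the elements HTTP itself defines:
    method, protocol version, and the set of header field names."""
    request_line, *headers = request.rstrip("\r\n").split("\r\n")
    method, _target, version = request_line.split(" ")
    return (method, version, sorted(h.split(":")[0] for h in headers))

# HTTP sees the same method, version, and header set in both cases;
# the privacy analysis differs entirely because of the data in use.
assert protocol_shape(benign) == protocol_shape(sensitive)
```

   This is why the privacy analysis of a protocol designed to carry
   arbitrary data has to consider its expected uses, not just the
   mechanics of the exchange.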
A protocol designer working on HTTP extensions (such as WebDAV
   [RFC4918]) is not expected to describe the privacy risks derived from
   all possible usage scenarios, but rather the privacy properties
   specific to the extensions and any particular uses of the extensions
   that are expected and foreseen at design time.

3.  Threat Model

   To consider privacy in protocol design it is helpful to consider the
   overall communication architecture and different actors' roles within
   it.  This analysis is similar to a threat analysis found in the
   security considerations sections of IETF documents.  Figure 1
   presents a communication model found in many of today's protocols
   where a sender wants to establish communication with some recipient
   and thereby uses some form of intermediary.  In some cases this
   intermediary stays in the communication path for the entire duration
   of the communication and sometimes it is only used for communication
   establishment, for either inbound or outbound communication.  In rare
   cases there may be a series of intermediaries that are traversed.

                                          +-----------+
                                          |           |
                                         >| Recipient |
                                        / |           |
                                      ,'  +-----------+
   +--------+      )--------------( ,'    +-----------+
   |        |      |              |-      |           |
   | Sender |<---->| Intermediary |<----->| Recipient |
   |        |      |              |`.     |           |
   +--------+      )--------------(  \    +-----------+
       ^                              `.  +-----------+
       :                                \ |           |
       :                                 `>| Recipient |
       :.................................>|           |
                                          +-----------+

   Legend:

      <....>  End-to-End Communication
      <---->  Hop-by-Hop Communication

       Figure 1: Example Instantiation of Architectural Entities

   This model is vulnerable to three types of adversaries:

   Eavesdropper:  [RFC4949] describes the act of 'eavesdropping' as

         "Passive wiretapping done secretly, i.e., without the knowledge
         of the originator or the intended recipients of the
         communication."
      Eavesdropping is often considered by IETF protocols in the context
      of a security analysis.  Confidentiality protection is often
      employed to defend against attacks based on eavesdropping, and
      [RFC3552] demands that confidentiality be incorporated as a
      security consideration.  While IETF protocols offer guidance on
      how to secure communication against eavesdroppers, deployments
      sometimes choose not to enable such security.

   Intermediary:  Many protocols developed today show a more complex
      communication pattern than simple client-server or peer-to-peer
      communication, as illustrated in Figure 1.  Store-and-forward
      protocols are examples where entities participate in the message
      delivery even though they are not the final recipients.  Often,
      these intermediaries only require a small amount of information
      for message routing and/or security.  In theory, protocol
      mechanisms could ensure that end-to-end information is not made
      accessible to these entities, but in practice the difficulty of
      deploying end-to-end security procedures, additional messaging or
      computational overhead, and other business or legal requirements
      often slow or prevent the deployment of end-to-end security
      mechanisms, giving intermediaries greater exposure to
      communication patterns and payloads than is strictly necessary.

   Recipient:  It may not seem intuitive to treat the recipient as an
      adversary since the entire purpose of the communication
      interaction is to provide information to the recipient.  However,
      the recipient can act as the sender's privacy foe in two respects.
      First, the sender may be unintentionally communicating with the
      recipient, whether because of a lack of access control or because
      the sender was not properly informed about what data it would be
      communicating to the recipient.  Second, the recipient may choose
      to use the sender's data in ways that contravene the sender's
      wishes, whether by putting it to some purpose that the sender
      opposes, sharing it with other entities, or storing it after the
      communication session has ended.  Whether the recipient becomes an
      adversary depends on whether it makes use of mechanisms that
      reduce these risks, including informing the sender about how his
      or her data will be used, offering choices, and obtaining
      authorization to receive and use the sender's data.

4.  Guidelines

   This section provides guidance for document authors in the form of a
   questionnaire about a protocol being designed.  The questionnaire may
   be useful at any point in the design process, particularly after
   document authors have developed a high-level protocol model as
   described in [RFC4101].

   Note that the guidance does not recommend specific practices.  The
   range of protocols developed in the IETF is too broad to make
   recommendations about particular uses of data or how privacy might be
   balanced against other design goals.  However, by carefully
   considering the answers to each question, document authors should be
   able to produce a comprehensive analysis that can serve as the basis
   for discussion of whether the protocol adequately protects against
   privacy threats.

   The framework is divided into four sections that address different
   aspects of privacy -- data minimization, user participation,
   security, and accountability -- plus a general section.  Security is
   not elaborated since substantial guidance already exists in
   [RFC3552].  Privacy-specific terminology used in the framework is
   [will be] defined in [I-D.hansen-privacy-terminology].

4.1.  General

   a.  Trade-offs.  Does the protocol make trade-offs between privacy
       and usability, privacy and efficiency, privacy and
       implementability, or privacy and other design goals?  Describe
       the trade-offs and the rationale for the design chosen.

4.2.  Data Minimization

   a.  Identifiers.  What identifiers does the protocol use for
       distinguishing endpoints?  Does the protocol use identifiers that
       allow different protocol interactions to be correlated?

   b.  User information.  What information does the protocol expose
       about end users and/or their devices (other than the identifiers
       discussed in (a))?  How identifiable is this information?  How
       does the protocol combine user information with the identifiers
       discussed in (a)?

   c.  Fingerprinting.  In many cases the specific ordering and/or
       occurrences of information elements in a protocol allow devices
       using the protocol to be uniquely fingerprinted.  Is this
       protocol vulnerable to fingerprinting?  If so, how?

   d.  Persistence of identifiers.  What assumptions are made in the
       protocol design about the lifetime of the identifiers discussed
       in (a)?  Does the protocol allow implementers or users to delete
       or recycle identifiers?  How often does the specification
       recommend deleting or recycling identifiers by default?

   e.  Leakage.  Are there expected ways that information exposed by
       the protocol will be combined or correlated with information
       obtained outside the protocol?  How will such combination or
       correlation facilitate user or device fingerprinting?  Are there
       expected combinations or correlations with outside data that will
       make the information exposed by the protocol more identifiable?

   f.  Recipients.  In the protocol design, what information discussed
       in (a) and (b) is exposed to other endpoints (i.e., recipients)?
       Are there ways for protocol implementers to choose to limit the
       information shared with other endpoints?

   g.  Intermediaries.  In the protocol design, what information
       discussed in (a) and (b) is exposed to intermediaries?  Are there
       ways for protocol implementers to choose to limit the information
       shared with intermediaries?

   h.  Retention.  Do the protocol or its anticipated uses require that
       the information discussed in (a) or (b) be retained by recipients
       or intermediaries?  Is the retention expected to be persistent or
       temporary?

4.3.  User Participation

   a.  Control over initial sharing.  What user controls or consent
       mechanisms does the protocol define or require before user
       information or identifiers are shared or exposed via the
       protocol?  If no such mechanisms are specified, is it expected
       that control and consent will be handled outside of the protocol?

   b.  Control over sharing with recipients.  Does the protocol provide
       ways for users to limit which information is shared with
       recipients?  If not, are there mechanisms that exist outside of
       the protocol to provide users with such control?

   c.  Control over sharing with intermediaries.  Does the protocol
       provide ways for users to limit which information is shared with
       intermediaries?  If not, are there mechanisms that exist outside
       of the protocol to provide users with such control?  Is it
       expected that users will have relationships (contractual or
       otherwise) with intermediaries that govern the use of the
       information?

   d.  Preference expression.  Does the protocol provide ways for users
       to express their preferences to recipients or intermediaries with
       regard to the use or disclosure of their information?

4.4.  Security

   a.  Communication security.  Do the protocol's security
       considerations account for communication security, per
       [RFC3552]?

4.5.  Accountability

   a.  User verification.  If the protocol provides for user preference
       expression, does it also define or require mechanisms that allow
       users to verify that their preferences are being honored?
       If not, are there mechanisms that exist outside of the protocol
       that allow for user verification?

5.  Example

   [To be provided in a future version.]

6.  Security Considerations

   This document describes privacy aspects that protocol designers
   should consider in addition to regular security analysis.

7.  IANA Considerations

   This document does not require actions by IANA.

8.  Acknowledgements

   We would like to thank the participants for the feedback they
   provided during the December 2010 Internet Privacy workshop
   co-organized by MIT, ISOC, W3C, and the IAB.

9.  References

9.1.  Normative References

   [I-D.hansen-privacy-terminology]
              Hansen, M. and H. Tschofenig, "Terminology for Talking
              about Privacy by Data Minimization: Anonymity,
              Unlinkability, Undetectability, Unobservability,
              Pseudonymity, and Identity Management",
              draft-hansen-privacy-terminology-02 (work in progress),
              March 2011.

9.2.  Informative References

   [OECD]     Organization for Economic Co-operation and Development,
              "OECD Guidelines on the Protection of Privacy and
              Transborder Flows of Personal Data", 1980, available at
              http://www.oecd.org/EN/document/
              0,,EN-document-0-nodirectorate-no-24-10255-0,00.html
              (September 2010).

   [PbD]      Office of the Information and Privacy Commissioner,
              Ontario, Canada, "Privacy by Design", 2011.

   [RFC2223]  Postel, J. and J. Reynolds, "Instructions to RFC Authors",
              RFC 2223, October 1997.

   [RFC2616]  Fielding, R., Gettys, J., Mogul, J., Frystyk, H.,
              Masinter, L., Leach, P., and T. Berners-Lee, "Hypertext
              Transfer Protocol -- HTTP/1.1", RFC 2616, June 1999.

   [RFC3552]  Rescorla, E. and B. Korver, "Guidelines for Writing RFC
              Text on Security Considerations", BCP 72, RFC 3552,
              July 2003.

   [RFC4101]  Rescorla, E. and IAB, "Writing Protocol Models", RFC 4101,
              June 2005.
   [RFC4918]  Dusseault, L., "HTTP Extensions for Web Distributed
              Authoring and Versioning (WebDAV)", RFC 4918, June 2007.

   [RFC4949]  Shirey, R., "Internet Security Glossary, Version 2",
              RFC 4949, August 2007.

   [browser-fingerprinting]
              Eckersley, P., "How Unique Is Your Browser?", Privacy
              Enhancing Technologies Symposium (PETS 2010), Springer
              Lecture Notes in Computer Science, 2010.

Authors' Addresses

   Alissa Cooper
   CDT
   1634 Eye St. NW, Suite 1100
   Washington, DC  20006
   US

   Phone: +1-202-637-9800
   Email: acooper@cdt.org
   URI:   http://www.cdt.org/

   Hannes Tschofenig
   Nokia Siemens Networks
   Linnoitustie 6
   Espoo  02600
   Finland

   Phone: +358 (50) 4871445
   Email: Hannes.Tschofenig@gmx.net
   URI:   http://www.tschofenig.priv.at

   Bernard Aboba
   Microsoft Corporation
   One Microsoft Way
   Redmond, WA  98052
   US

   Email: bernarda@microsoft.com

   Jon Peterson
   NeuStar, Inc.
   1800 Sutter St Suite 570
   Concord, CA  94520
   US

   Email: jon.peterson@neustar.biz

   John B. Morris, Jr.

   Email: ietf@jmorris.org