Network Working Group                                     I. Symeonidis
Internet-Draft                                 University of Luxembourg
Intended status: Standards Track                           B. Hoeneisen
Expires: January 9, 2020                                        Ucom.ch
                                                          July 08, 2019

    Privacy and Security Threat Analysis and Requirements for Private
                                Messaging
                 draft-symeonidis-medup-requirements-00

Abstract

   RFC 8280 has identified and documented important principles, such as
   Data Minimization, End-to-End, and Interoperability, in order to
   enable access to fundamental Human Rights.  While (partial)
   implementations of these concepts are already available, many
   current applications lack privacy support that the average user can
   easily navigate.  This document covers an analysis of threats to
   privacy and security and derives requirements from this threat
   analysis.

Status of This Memo

   This Internet-Draft is submitted in full conformance with the
   provisions of BCP 78 and BCP 79.

   Internet-Drafts are working documents of the Internet Engineering
   Task Force (IETF).  Note that other groups may also distribute
   working documents as Internet-Drafts.  The list of current Internet-
   Drafts is at https://datatracker.ietf.org/drafts/current/.

   Internet-Drafts are draft documents valid for a maximum of six
   months and may be updated, replaced, or obsoleted by other documents
   at any time.  It is inappropriate to use Internet-Drafts as
   reference material or to cite them other than as "work in progress."

   This Internet-Draft will expire on January 9, 2020.

Copyright Notice

   Copyright (c) 2019 IETF Trust and the persons identified as the
   document authors.  All rights reserved.

   This document is subject to BCP 78 and the IETF Trust's Legal
   Provisions Relating to IETF Documents
   (https://trustee.ietf.org/license-info) in effect on the date of
   publication of this document.  Please review these documents
   carefully, as they describe your rights and restrictions with
   respect to this document.
   Code Components extracted from this document must include Simplified
   BSD License text as described in Section 4.e of the Trust Legal
   Provisions and are provided without warranty as described in the
   Simplified BSD License.

Table of Contents

   1.  Introduction
     1.1.  Requirements Language
     1.2.  Terms
   2.  Motivation and Background
     2.1.  Objectives
     2.2.  Known Implementations
       2.2.1.  Pretty Easy Privacy (pEp)
       2.2.2.  Autocrypt
     2.3.  Focus Areas (Design Challenges)
   3.  System Model
     3.1.  Entities
     3.2.  Basic Functional Requirements
   4.  Threat Analyses
     4.1.  Adversarial Model
     4.2.  Security Threats and Requirements
       4.2.1.  Spoofing and Entity Authentication
       4.2.2.  Information Disclosure and Confidentiality
       4.2.3.  Tampering With Data and Data Authentication
       4.2.4.  Repudiation and Accountability (Non-Repudiation)
     4.3.  Privacy Threats and Requirements
       4.3.1.  Identifiability - Anonymity
       4.3.2.  Linkability - Unlinkability
       4.3.3.  Detectability and Observability - Undetectability
     4.4.  Information Disclosure - Confidentiality
     4.5.  Non-repudiation and Deniability
   5.  Specific Security and Privacy Requirements
     5.1.  Message Exchange
       5.1.1.  Send Message
       5.1.2.  Receive Message
     5.2.  Trust Management
     5.3.  Key Management
     5.4.  Synchronization Management
     5.5.  Identity Management
     5.6.  User Interface
   6.  Subcases
     6.1.  Interaction States
     6.2.  Subcases for Sending Messages
     6.3.  Subcases for Receiving Messages
   7.  Security Considerations
   8.  Privacy Considerations
   9.  IANA Considerations
   10. Acknowledgments
   11. References
     11.1.  Normative References
     11.2.  Informative References
   Appendix A.  Document Changelog
   Appendix B.  Open Issues
   Authors' Addresses

1.  Introduction

   [RFC8280] has identified and documented important principles, such
   as Data Minimization, End-to-End, and Interoperability, in order to
   enable access to fundamental Human Rights.  While (partial)
   implementations of these concepts are already available, many
   current applications lack privacy support that the average user can
   easily navigate.
   In MEDUP, these issues are addressed based on Opportunistic Security
   [RFC7435] principles.

   This document covers an analysis of threats to privacy and security
   and derives requirements from this threat analysis.

1.1.  Requirements Language

   The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT",
   "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" in this
   document are to be interpreted as described in [RFC2119].

1.2.  Terms

   The following terms are defined for the scope of this document:

   o  Trustwords: A mapping of 16-bit numbers (0 to 65535) to natural-
      language words.  When doing a Handshake, peers are shown the
      combined Trustwords of both public keys involved to ease the
      comparison.  [I-D.birk-pep-trustwords]

   o  Trust On First Use (TOFU): cf. [RFC7435], which states: "In a
      protocol, TOFU calls for accepting and storing a public key or
      credential associated with an asserted identity, without
      authenticating that assertion.  Subsequent communication that is
      authenticated using the cached key or credential is secure
      against an MiTM attack, if such an attack did not succeed during
      the vulnerable initial communication."

   o  Man-in-the-middle (MITM) attack: cf. [RFC4949], which states: "A
      form of active wiretapping attack in which the attacker
      intercepts and selectively modifies communicated data to
      masquerade as one or more of the entities involved in a
      communication association."

2.  Motivation and Background

2.1.  Objectives

   o  An open standard for secure messaging requirements

   o  Unified evaluation framework: unified goals and threat models

   o  Common pitfalls

   o  Future directions on requirements and technologies

   o  Misleading products in the wild (EFF secure messaging scorecard)

2.2.  Known Implementations

2.2.1.  Pretty Easy Privacy (pEp)

   To achieve privacy of exchanged messages in an opportunistic way
   [RFC7435], the following (simplified) model is proposed by pEp
   (pretty Easy Privacy) [I-D.birk-pep]:

       -----                                          -----
       | A |                                          | B |
       -----                                          -----
         |                                              |
   +------------------------+          +------------------------+
   | auto-generate key pair |          | auto-generate key pair |
   | (if no key yet)        |          | (if no key yet)        |
   +------------------------+          +------------------------+
         |                                              |
   +-----------------------+            +-----------------------+
   | Privacy Status for B: |            | Privacy Status for A: |
   | *Unencrypted*         |            | *Unencrypted*         |
   +-----------------------+            +-----------------------+
         |                                              |
         | A sends message to B (Public Key             |
         | attached) / optionally signed, but           |
         | NOT ENCRYPTED                                |
         +--------------------------------------------->|
         |                                              |
         |                        +-----------------------+
         |                        | Privacy Status for A: |
         |                        | *Encrypted*           |
         |                        +-----------------------+
         |                                              |
         | B sends message to A (Public Key             |
         | attached) / signed and ENCRYPTED             |
         |<---------------------------------------------+
         |                                              |
   +-----------------------+                            |
   | Privacy Status for B: |                            |
   | *Encrypted*           |                            |
   +-----------------------+                            |
         |                                              |
         | A and B successfully compare their           |
         | Trustwords over an alternative channel       |
         | (e.g., phone line)                           |
         |<-- -- -- -- -- -- -- -- -- -- -- -- -- -- -->|
         |                                              |
   +-----------------------+            +-----------------------+
   | Privacy Status for B: |            | Privacy Status for A: |
   | *Trusted*             |            | *Trusted*             |
   +-----------------------+            +-----------------------+
         |                                              |

   pEp is intended to solve three problems:

   o  Key management

   o  Trust management

   o  Identity management

   pEp is intended to be used in pre-existing messaging solutions and
   provide Privacy by Default, at a minimum, for message content.  In
   addition, pEp provides technical data protection, including
   metadata protection.
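   The simplified flow above can be illustrated with a minimal sketch.
   Note that this is not the real pEp engine API: the names (Peer,
   send, handshake) and the random hex "keys" are illustrative stand-
   ins for OpenPGP key pairs and actual message encryption.

   ```python
   import secrets

   UNENCRYPTED, ENCRYPTED, TRUSTED = "Unencrypted", "Encrypted", "Trusted"

   class Peer:
       def __init__(self, name):
           self.name = name
           # auto-generate key pair at startup (if no key yet)
           self.key_pair = secrets.token_hex(16)
           self.peer_keys = {}   # peer name -> stored public key (TOFU)
           self.trusted = set()  # peers confirmed via Trustwords comparison

       def privacy_status(self, peer_name):
           # Privacy Status as shown in the ladder diagram above
           if peer_name in self.trusted:
               return TRUSTED
           if peer_name in self.peer_keys:
               return ENCRYPTED
           return UNENCRYPTED

       def send(self, other):
           # Own public key is always attached; encrypt only once the
           # peer's key is known (Opportunistic Security, RFC 7435)
           message = {"from": self.name,
                      "key": self.key_pair,
                      "encrypted": other.name in self.peer_keys}
           other.receive(message)
           return message

       def receive(self, message):
           # Trust On First Use: store the attached key without
           # authenticating the assertion
           self.peer_keys.setdefault(message["from"], message["key"])

       def handshake(self, other):
           # Models a successful Trustwords comparison over an
           # alternative channel (e.g., phone line)
           self.trusted.add(other.name)
           other.trusted.add(self.name)

   a, b = Peer("A"), Peer("B")
   first = a.send(b)   # first message: key attached, NOT ENCRYPTED
   reply = b.send(a)   # B now holds A's key -> signed and ENCRYPTED
   a.handshake(b)      # Trustwords compared successfully
   ```

   After this exchange, both peers report Privacy Status *Trusted* for
   each other, mirroring the Unencrypted -> Encrypted -> Trusted
   progression in the figure.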
   An additional set of use cases applies to enterprise environments
   only.  In some instances, the enterprise may require access to
   message content.  Reasons for this may include the need to conform
   to compliance requirements or virus/malware defense.

2.2.2.  Autocrypt

   Another known approach in this area is Autocrypt.  Compared to pEp
   (cf. Section 2.2.1), there are certain differences, for example,
   regarding the prioritization of support for legacy PGP [RFC4880]
   implementations.

   More information on Autocrypt can be found at:
   https://autocrypt.org/background.html

   [[ TODO: Input from autocrypt group ]]

2.3.  Focus Areas (Design Challenges)

   o  Trust establishment: some human interaction

   o  Conversation security: no human interaction

   o  Transport privacy: no human interaction

3.  System Model

3.1.  Entities

   o  Users, sender and receiver(s): The communicating parties who
      exchange messages, typically referred to as senders and
      receivers.

   o  Messaging operators and network nodes: The messaging service
      providers and network nodes that are responsible for message
      delivery and synchronization.

   o  Third parties: Any other entity that interacts with the
      messaging system.

3.2.  Basic Functional Requirements

   This section outlines the functional requirements.  We follow the
   requirements extracted from the literature on private email and
   instant messaging [Unger] [Ermoshina] [Clark].

   o  Message: send and receive message(s)

   o  Multi-device support: synchronization across multiple devices

   o  Group messaging: communication among more than two users

   [[ TODO: Add more text on Group Messaging requirements. ]]

4.  Threat Analyses

   This section describes a set of possible threats.  Note that not
   all threats can be addressed, due to conflicting requirements.

4.1.  Adversarial Model

   An adversary is any entity that mounts threats against the
   communication system with the goal of gaining improper access to
   message content and users' information.  An adversary can be anyone
   who is involved in communication, such as users of the system,
   message operators, network nodes, or even third parties.

   o  Internal - external: An internal adversary can seize control of
      entities within the system, for example to extract information
      from a specific entity or to prevent a message from being sent.
      An external adversary can only compromise the communication
      channels themselves, eavesdropping and tampering with messages,
      such as by performing Man-in-the-Middle (MitM) attacks.

   o  Local - global: A local adversary can control one entity that is
      part of a system, while a global adversary can seize control of
      several entities in a system.  A global adversary can also
      monitor and control several parts of the network, granting them
      the ability to correlate network traffic, which is crucial in
      performing timing attacks.

   o  Passive - active: A passive attacker can only eavesdrop and
      extract information, while an active attacker can tamper with
      the messages themselves, such as adding, removing, or even
      modifying them.

   Attackers can combine these adversarial properties in a number of
   ways, increasing the effectiveness - and probable success - of
   their attacks.  For instance, an external global passive attacker
   can monitor multiple channels of a system, while an internal local
   active adversary can tamper with the messages of a targeted
   messaging provider [Diaz].

4.2.  Security Threats and Requirements

4.2.1.  Spoofing and Entity Authentication

   Spoofing occurs when an adversary gains improper access to the
   system upon successfully impersonating the profile of a valid user.
   The adversary may also attempt to send or receive messages on
   behalf of that user.
   The threat posed by an adversary's spoofing capabilities is
   typically based on the local control of one entity or a set of
   entities, with each compromised account typically used to
   communicate with different end-users.  In order to mitigate
   spoofing threats, it is essential to have entity authentication
   mechanisms in place that verify that a user is the legitimate owner
   of a messaging service account.  Entity authentication mechanisms
   typically rely on information or physical traits that only the
   valid user should know or possess, such as passwords, valid public
   keys, or biometric data like fingerprints.

4.2.2.  Information Disclosure and Confidentiality

   An adversary aims to eavesdrop on messages and disclose information
   about their content, for example by performing a man-in-the-middle
   (MitM) attack: an adversary can attempt to position themselves
   between two communicating parties, such as by gaining access to the
   messaging server, and remain undetected while collecting
   information transmitted between the intended users.  The threat can
   stem from gaining local control of one point of a communication
   channel, such as an entity or a communication link within the
   network.  The adversarial threat can also be broader in scope, such
   as seizing global control of several entities and communication
   links within the channel.  That grants the adversary the ability to
   correlate and control traffic in order to execute timing attacks,
   even in end-to-end communication systems [Tor].  Therefore,
   confidentiality of messages exchanged within a system should be
   guaranteed with the use of encryption schemes.

4.2.3.  Tampering With Data and Data Authentication

   An adversary can also modify the information stored and exchanged
   between the communication entities in the system.
   For instance, an adversary may attempt to alter an email or an
   instant message by changing its content.  Such an adversary can be
   anyone but the communicating users themselves, such as the message
   operators, the network nodes, or third parties.  The threat posed
   by an adversary can lie in gaining local control of an entity that
   can alter messages, usually resulting in a MitM attack on an
   encrypted channel.  Therefore, no honest party should accept a
   message that was modified in transit.  Data authentication of the
   messages exchanged needs to be guaranteed, such as with the use of
   Message Authentication Codes (MACs) and digital signatures.

4.2.4.  Repudiation and Accountability (Non-Repudiation)

   Adversaries can repudiate, or deny, the status of a message to
   users of the system.  For instance, an adversary may attempt to
   provide inaccurate information about an action performed, such as
   about sending or receiving an email.  An adversary can be anyone
   who is involved in communicating, such as the users of the system,
   the message operators, and the network nodes.  To mitigate
   repudiation threats, accountability and non-repudiation of actions
   performed must be guaranteed.  Non-repudiation of actions can
   include proof of origin, submission, delivery, and receipt between
   the intended users.  Non-repudiation can be achieved with the use
   of cryptographic schemes, such as digital signatures, and audit
   trails, such as timestamps.

4.3.  Privacy Threats and Requirements

4.3.1.  Identifiability - Anonymity

   Identifiability is defined as the extent to which a specific user
   can be identified from a set of users, the identifiability set.
   Identification is the process of linking information to allow the
   inference of a particular user's identity [RFC6973].
   An adversary can identify a specific user associated with Items of
   Interest (IOIs), which include items such as the ID of a subject, a
   sent message, or an action performed.  For instance, an adversary
   may identify the sender of a message by examining the headers of a
   message exchanged within a system.  To mitigate identifiability
   threats, the anonymity of users must be guaranteed.  Anonymity is
   defined from the attacker's perspective as "the attacker cannot
   sufficiently identify the subject within a set of subjects, the
   anonymity set" [Pfitzmann].  Essentially, in order to make
   anonymity possible, there always needs to be a set of possible
   users such that, for an adversary, the communicating user is
   equally likely to be any user in the set [Diaz].  Thus, an
   adversary cannot identify the sender of a message.  Anonymity can
   be achieved with the use of pseudonyms and cryptographic schemes
   such as anonymous remailers (e.g., mixnets), anonymous
   communication channels (e.g., Tor), and secret sharing.

4.3.2.  Linkability - Unlinkability

   Linkability occurs when an adversary can sufficiently distinguish,
   within a given system, that two or more IOIs such as subjects
   (i.e., users), objects (i.e., messages), or actions are related to
   each other [Pfitzmann].  For instance, an adversary may be able to
   relate pseudonyms by analyzing exchanged messages and deduce that
   the pseudonyms belong to one user (though the user may not
   necessarily be identified in this process).  Therefore,
   unlinkability of IOIs should be guaranteed through the use of
   pseudonyms as well as cryptographic schemes such as anonymous
   credentials.

4.3.3.  Detectability and Observability - Undetectability

   Detectability occurs when an adversary is able to sufficiently
   distinguish an IOI, such as messages exchanged within the system,
   from random noise [Pfitzmann].
   Observability occurs when such detectability is combined with a
   loss of anonymity for the entities within the same system.  An
   adversary can exploit these states in order to infer linkability
   and possibly identification of users within a system.  Therefore,
   undetectability of IOIs should be guaranteed, which also ensures
   unobservability.  Undetectability of an IOI is defined as "the
   attacker cannot sufficiently distinguish whether it exists or not"
   [Pfitzmann].  Undetectability can be achieved through the use of
   cryptographic schemes such as mix-nets and obfuscation mechanisms
   such as the insertion of dummy traffic within a system.

4.4.  Information Disclosure - Confidentiality

   Information disclosure - or loss of confidentiality - about users,
   message content, metadata, or other information is not only a
   security threat but also a privacy threat that a communication
   system can face.  For example, a successful MitM attack can yield
   metadata that can be used to determine with whom a specific user
   communicates, and how frequently.  To guarantee the confidentiality
   of messages and prevent information disclosure, security measures
   need to be guaranteed with the use of cryptographic schemes such as
   symmetric, asymmetric, or homomorphic encryption and secret
   sharing.

4.5.  Non-repudiation and Deniability

   In contrast to its security benefits, non-repudiation can be a
   threat to a user's privacy in private messaging systems.  As
   discussed in Section 4.2.4, non-repudiation should be guaranteed
   for users.  However, non-repudiation carries a potential threat
   vector in itself when it is used against a user in certain
   instances.  For example, whistle-blowers may find non-repudiation
   used against them by adversaries, particularly in countries with
   strict censorship policies and in cases where human lives are at
   stake.
   Adversaries in these situations may seek to use evidence collected
   within a communication system to prove to others that a whistle-
   blowing user was the originator of a specific message.  Therefore,
   plausible deniability is essential for these users, to ensure that
   an adversary can neither confirm nor contradict that a specific
   user sent a particular message.  Deniability can be guaranteed
   through the use of cryptographic protocols such as off-the-record
   messaging.

   [[ TODO: Describe relation of the above introduced Problem Areas to
   scope of MEDUP ]]

5.  Specific Security and Privacy Requirements

   [[ This section is still in early draft state, to be substantially
   improved in future revisions.  Among other things, there needs to
   be a clearer distinction between MEDUP requirements and those of a
   specific implementation. ]]

5.1.  Message Exchange

5.1.1.  Send Message

   o  Send encrypted and signed message to another peer

   o  Send unencrypted and unsigned message to another peer

   Note: Subcases of sending messages are outlined in Section 6.2.

5.1.2.  Receive Message

   o  Receive encrypted and signed message from another peer

   o  Receive encrypted, but unsigned message from another peer

   o  Receive signed, but unencrypted message from another peer

   o  Receive unencrypted and unsigned message from another peer

   Note: Subcases of receiving messages are outlined in Section 6.3.

5.2.  Trust Management

   o  Trust rating of a peer is updated (locally) when:

      *  Public Key is received the first time

      *  Trustwords have been compared successfully and confirmed by
         the user (see above)

      *  Trust of a peer is revoked (cf. Section 5.3, Key Reset)

   o  Trust of a public key is synchronized among different devices of
      the same user

   Note: Synchronization management (such as the establishment or
   revocation of trust) among a user's own devices is described in
   Section 5.4.

5.3.  Key Management

   o  New key pair is automatically generated at startup if none is
      found

   o  Public Key is sent to peer via message attachment

   o  Once received, Public Key is stored locally

   o  Key pair is declared invalid and other peers are informed (Key
      Reset)

   o  Public Key is marked invalid after receiving a key reset message

   o  Public Keys of peers are synchronized among a user's devices

   o  Private Keys are synchronized among a user's devices

   Note: Synchronization management (such as establishing or revoking
   trust) among a user's own devices is described in Section 5.4.

5.4.  Synchronization Management

   A device group consists of devices belonging to one user, which
   share the same key pairs in order to synchronize data among them.
   In a device group, devices of the same user mutually grant
   authentication.

   o  Form a device group of two (yet ungrouped) devices of the same
      user

   o  Add another device of the same user to an existing device group

   o  Leave device group

   o  Remove other device from device group

5.5.  Identity Management

   o  All involved parties share the same identity system

5.6.  User Interface

   [[ TODO ]]

6.  Subcases

6.1.  Interaction States

   The basic model consists of different interaction states:

   1.  Both peers have no public key of each other, no trust possible

   2.  Only one peer has the public key of the other peer, but no
       trust

   3.  Only one peer has the public key of the other peer and trusts
       that public key

   4.  Both peers have the public key of each other, but no trust

   5.  Both peers have exchanged public keys, but only one peer trusts
       the other peer's public key

   6.  Both peers have exchanged public keys, and both peers trust the
       other's public key

   The following table shows the different interaction states
   possible:

   +-------+---------------+-------------------+---------+-----------+
   | state | Peer's Public | My Public Key     | Peer    | Peer      |
   |       | Key available | available to Peer | Trusted | trusts me |
   +-------+---------------+-------------------+---------+-----------+
   | 1.    | no            | no                | N/A     | N/A       |
   |       |               |                   |         |           |
   | 2a.   | no            | yes               | N/A     | no        |
   |       |               |                   |         |           |
   | 2b.   | yes           | no                | no      | N/A       |
   |       |               |                   |         |           |
   | 3a.   | no            | yes               | N/A     | yes       |
   |       |               |                   |         |           |
   | 3b.   | yes           | no                | yes     | N/A       |
   |       |               |                   |         |           |
   | 4.    | yes           | yes               | no      | no        |
   |       |               |                   |         |           |
   | 5a.   | yes           | yes               | no      | yes       |
   |       |               |                   |         |           |
   | 5b.   | yes           | yes               | yes     | no        |
   |       |               |                   |         |           |
   | 6.    | yes           | yes               | yes     | yes       |
   +-------+---------------+-------------------+---------+-----------+

   In the simplified model, only interaction states 1, 2, 4, and 6 are
   depicted.  States 3 and 5 may result from, e.g., key mistrust or
   abnormal user behavior.  Interaction states 1, 2, and 4 are part of
   TOFU.  For a better understanding, you may consult the figure in
   Section 2.2.1 above.

   Note: In situations where one peer has multiple key pairs, or group
   conversations are occurring, interaction states become increasingly
   complex.  For now, we will focus on a single bilateral interaction
   between two peers, each possessing a single key pair.

   [[ Note: Future versions of this document will address more complex
   cases ]]

6.2.  Subcases for Sending Messages

   o  If peer's Public Key is not available (Interaction States 1, 2a,
      and 3a)

      *  Send message unencrypted (and unsigned)

   o  If peer's Public Key is available (Interaction States 2b, 3b, 4,
      5a, 5b, 6)

      *  Send message encrypted and signed

6.3.  Subcases for Receiving Messages

   o  If peer's Public Key is not available (Interaction States 1, 2a,
      and 3a)

      *  If message is signed

         +  ignore signature

      *  If message is encrypted

         +  decrypt with caution

      *  If message is unencrypted

         +  no further processing regarding encryption

   o  If peer's Public Key is available or can be retrieved from the
      received message (Interaction States 2b, 3b, 4, 5a, 5b, 6)

      *  If message is signed

         +  verify signature

         +  If message is encrypted

            -  decrypt

         +  If message is unencrypted

            -  no further processing regarding encryption

      *  If message is unsigned

         +  If message is encrypted

            -  exception

         +  If message is unencrypted

            -  no further processing regarding encryption

7.  Security Considerations

   Relevant security considerations are outlined in Section 4.2.

8.  Privacy Considerations

   Relevant privacy considerations are outlined in Section 4.3.

9.  IANA Considerations

   This document requests no action from IANA.

   [[ RFC Editor: This section may be removed before publication. ]]

10.  Acknowledgments

   The authors would like to thank the following people who have
   provided feedback or significant contributions to the development
   of this document: Athena Schumacher, Claudio Luck, Hernani Marques,
   Kelly Bristol, Krista Bennett, and Nana Karlstetter.

11.  References

11.1.  Normative References

   [RFC2119]  Bradner, S., "Key words for use in RFCs to Indicate
              Requirement Levels", BCP 14, RFC 2119,
              DOI 10.17487/RFC2119, March 1997,
              <https://www.rfc-editor.org/info/rfc2119>.

   [RFC4949]  Shirey, R., "Internet Security Glossary, Version 2",
              FYI 36, RFC 4949, DOI 10.17487/RFC4949, August 2007,
              <https://www.rfc-editor.org/info/rfc4949>.

   [RFC7435]  Dukhovni, V., "Opportunistic Security: Some Protection
              Most of the Time", RFC 7435, DOI 10.17487/RFC7435,
              December 2014, <https://www.rfc-editor.org/info/rfc7435>.

11.2.  Informative References

   [Clark]    Clark, J., van Oorschot, P., Ruoti, S., Seamons, K., and
              D. Zappala, "Securing Email", CoRR abs/1804.07706, 2018.

   [Diaz]     Diaz, C., Seys, S., Claessens, J., and B. Preneel,
              "Towards Measuring Anonymity", Privacy Enhancing
              Technologies (PET), Second International Workshop, San
              Francisco, CA, USA, April 14-15, 2002, Revised Papers,
              pp. 54-68, 2002.

   [Ermoshina]
              Ermoshina, K., Musiani, F., and H. Halpin, "End-to-End
              Encrypted Messaging Protocols: An Overview", INSCI 2016,
              pp. 244-254, 2016.

   [I-D.birk-pep]
              Marques, H. and B. Hoeneisen, "pretty Easy privacy
              (pEp): Privacy by Default", draft-birk-pep-03 (work in
              progress), March 2019.

   [I-D.birk-pep-trustwords]
              Birk, V., Marques, H., and B. Hoeneisen, "IANA
              Registration of Trustword Lists: Guide, Template and
              IANA Considerations", draft-birk-pep-trustwords-03 (work
              in progress), March 2019.

   [Pfitzmann]
              Pfitzmann, A. and M. Hansen, "A terminology for talking
              about privacy by data minimization: Anonymity,
              unlinkability, undetectability, unobservability,
              pseudonymity, and identity management", 2010.

   [RFC4880]  Callas, J., Donnerhacke, L., Finney, H., Shaw, D., and
              R. Thayer, "OpenPGP Message Format", RFC 4880,
              DOI 10.17487/RFC4880, November 2007,
              <https://www.rfc-editor.org/info/rfc4880>.

   [RFC6973]  Cooper, A., Tschofenig, H., Aboba, B., Peterson, J.,
              Morris, J., Hansen, M., and R. Smith, "Privacy
              Considerations for Internet Protocols", RFC 6973,
              DOI 10.17487/RFC6973, July 2013,
              <https://www.rfc-editor.org/info/rfc6973>.

   [RFC8280]  ten Oever, N. and C. Cath, "Research into Human Rights
              Protocol Considerations", RFC 8280,
              DOI 10.17487/RFC8280, October 2017,
              <https://www.rfc-editor.org/info/rfc8280>.

   [Tor]      The Tor Project, "One cell is enough to break Tor's
              anonymity", June 2019.

   [Unger]    Unger, N., Dechand, S., Bonneau, J., Fahl, S., Perl, H.,
              Goldberg, I., and M. Smith, "SoK: Secure Messaging",
              Proceedings of the 2015 IEEE Symposium on Security and
              Privacy (SP 2015), pp. 232-249, July 2015.

Appendix A.  Document Changelog

   [[ RFC Editor: This section is to be removed before publication ]]

   o  draft-symeonidis-medup-requirements-00:

      *  Initial version

Appendix B.  Open Issues

   [[ RFC Editor: This section should be empty and is to be removed
   before publication ]]

   o  Add references to used materials (in particular the threat
      analyses part)

   o  Get content from Autocrypt (Section 2.2.2)

   o  Add more text on Group Messaging requirements

   o  Decide on whether or not "enterprise requirements" will go into
      this document

Authors' Addresses

   Iraklis Symeonidis
   University of Luxembourg
   29, avenue JF Kennedy
   L-1855 Luxembourg
   Luxembourg

   Email: iraklis.symeonidis@uni.lu
   URI:   https://wwwen.uni.lu/snt/people/iraklis_symeonidis

   Bernie Hoeneisen
   Ucom Standards Track Solutions GmbH
   CH-8046 Zuerich
   Switzerland

   Phone: +41 44 500 52 40
   Email: bernie@ietf.hoeneisen.ch (bernhard.hoeneisen AT ucom.ch)
   URI:   https://ucom.ch/