IPSEC Working Group                                        Scott Kelly,
INTERNET-DRAFT                                  RedCreek Communications
draft-ietf-ipsec-secconf-00.txt                         Mike St. Johns,
Expires in 6 months                                       @Home Network
                                                           October, 1998

        Secure Configuration of IPsec-Enabled Network Devices

Status of This Memo

This document is an Internet-Draft.  Internet-Drafts are working
documents of the Internet Engineering Task Force (IETF), its areas,
and its working groups.  Note that other groups may also distribute
working documents as Internet-Drafts.

Internet-Drafts are draft documents valid for a maximum of six months
and may be updated, replaced, or obsoleted by other documents at any
time.  It is inappropriate to use Internet-Drafts as reference
material or to cite them other than as ``work in progress.''

To view the entire list of current Internet-Drafts, please check the
"1id-abstracts.txt" listing contained in the Internet-Drafts Shadow
Directories on ftp.is.co.za (Africa), ftp.nordu.net (Northern
Europe), ftp.nis.garr.it (Southern Europe), munnari.oz.au (Pacific
Rim), ftp.ietf.org (US East Coast), or ftp.isi.edu (US West Coast).

Comments on this document should be sent to "ipsec@tis.com", the IETF
IPsec WG discussion list.

Abstract

Remote configuration of network devices which implement IPsec-related
services is desirable as a matter of convenience and of scale.  In
some cases, these devices are installed on a network with no prior
configuration.  In such cases, secure mechanisms for bootstrap
configuration are required.  In this document the associated issues
are examined, and a multi-tiered approach is proposed from which a
specific method may be selected based upon the security requirements
of the environment in which the security device exists.  While the
primary devices considered here are security gateways and
bump-in-the-wire encryptors, many of the resulting conclusions may
extend to other devices, including host IPsec implementations.

1. Problem Space Overview

In general, the level of inconvenience associated with configuring a
network device is directly proportional to the level of security
desired of the device once configured.  To a somewhat lesser degree,
it is also a function of the environment in which configuration takes
place, and of whether the device has been previously configured for
that environment.  When initially deploying an IP security device
such as a security gateway or a bump-in-the-wire encryptor, there are
at least two common requirements.  First, the device requires the
assignment of an IP address and associated infrastructure
information.  Second, it requires additional configuration relating
to the specific device and to local security policy.

In obtaining the required configuration, one of 3 general scenarios
may occur: in the first, the device is placed on the network without
any initial configuration, and DHCP is used to assign an IP address
and a next boot or configuration server, after which time the
additional configuration information is retrieved.  This is the
"bootstrap" method.  In the second scenario, the device is entirely
configured offline (on a private network, using a serial port, or in
some other manner) before being placed on the network.  In the third
scenario, the device is first assigned an IP address and some sort of
configuration server designation, and then it is placed on the
network where it obtains additional configuration information from
the specified server.

In all of the above cases, network configuration is followed by
additional security and device configuration, using SNMP, LDAP, or
some proprietary mechanism.  Depending upon a variety of
circumstances, the security requirements of a particular installation
will determine which of these methods represents an acceptable level
of security.  This document examines precisely what the
vulnerabilities of each method are, and proposes mechanisms which
eliminate or minimize these vulnerabilities.

1.1 Requirements Terminology

The keywords MUST, MUST NOT, REQUIRED, SHALL, SHALL NOT, SHOULD,
SHOULD NOT, RECOMMENDED, MAY, and OPTIONAL, when they appear in this
document, are to be interpreted as described in [RFC-2119].

1.2 Security Terminology

This document uses the following security terms:

"Authentication"
   Authentication, when used in this document, refers to a process
   which results in verification both of the identity of a subject
   and of the assertion that the object in question originated with
   that subject.  For a more comprehensive definition of this term,
   see [ARCH].

"Confidentiality"
   Confidentiality, when used in this document, refers to the degree
   to which exchanged information is unknown to all but the endpoints
   of the exchange.  For a more comprehensive definition of this
   term, see [ARCH].

"Data Integrity"
   Data integrity, when used in this document, refers to the
   reliability of received data, or to the degree of certainty that
   the data which is received has not been altered in transit.  For a
   more comprehensive definition of this term, see [ARCH].

1.3 Secure Configuration Terminology

This document uses the following terms in describing secure
configuration:

"Security Device"
   A security device may be a security gateway or bump-in-the-wire
   encryptor which provides IPsec-related services to networks and/or
   users.

"Configuration Server"
   The terms "server" and "configuration server" are used when
   referring to either a privately linked configuring system or to a
   DHCP (or other) configuration server reachable over the not
   necessarily local network.

"Client"
   The term "client" is used when referring to a security device
   which must obtain some portion of its configuration from a server.

"Public network"
   A public network is any network on which the security device and
   configuration server reside which is also accessible to devices
   other than the security device and the configuration server.

2. Configuration Scenarios

In evaluating the various configuration scenarios, it is useful to
think in terms of the 2 'phases' of device configuration previously
discussed.  In the first phase, the IP address of the device is
assigned, and in some cases an identifier representing an authorized
configuration server may also be assigned.  In the second phase,
additional configuration relating to device function (including
security policy parameters) is accomplished, and this may be done
either by the same server which participated in the first phase, or
by another server.

It is also useful to think in terms of tuples corresponding to the
phases described above, with the tuple elements denoting the method
employed in each phase.  These methods take on one of two values:
private or network.  Private configuration could take place over a
network interface of the device, provided that no other system has
access to the device via that network connection during
configuration, or it could instead be accomplished using a private
serial (or other) interface.  There are four possible tuples:

          Address Assignment      Configuration
              Mechanism             Mechanism
          =====================================
              Private               Private
              Private               Network
              Network               Private
              Network               Network

In general, the tuple (network, private) is so unlikely that no
further consideration is given to that mechanism here.  The others
have varying likelihood, depending upon the circumstances of
deployment.  The prominent characteristics of each tuple are defined
in the following sections.

2.1 Entirely Private Configuration: (private, private)

   +------------+   private connection    +---------------+
   |  Security  |-------------------------| configuration |
   |   device   |                         |    server     |
   +------------+                         +---------------+

Entirely out-of-band configuration represents a seemingly trivial
case, although this process could be compromised in various ways.
These are discussed in section 3.  The private connection in the
above illustration could be a serial line, an Ethernet link, or of
some other proprietary type.  For our purposes these are all
equivalent, the distinguishing characteristic being that no other
system may interfere with the exchange.

This mechanism, while reasonably secure, does not scale well.
Consider the provisioning of home users' devices, e.g. for CATV
network systems or ADSL installations.  In some cases, the devices
will be installed by users after picking up the equipment from a
central office.  In such cases, the level of user expertise may be
such that configuring the device is difficult or impractical.  Also,
while the configuration could arguably be done at the central office
(or even at the manufacturing site), this obviously presents scaling
problems.  It seems quite clear that to maximize the scalability of
the installation process, we must minimize the effort (and expertise)
required.

2.2 Private/Public Configuration: (private, network)

                      Network, possibly disjoint
                          |           |
   +------------+         |           |      +---------------+
   |  Security  |---------+--/     /--|------|   secondary   |
   |   device   |         |           |      | configuration |
   +------+-----+         |           |      |    server     |
          |                                  +---------------+
          | (initial) private link
          |
   +------+--------+
   |    initial    |
   | configuration |
   |    server     |
   +---------------+

Partially out-of-band configuration represents the case in which the
initial IP address and configuration server identifier(s) (and
perhaps credentials) are assigned privately, after which the device
is installed on the network.  When the device is powered on after
network installation, it attempts to obtain its device configuration
from the configured server ('secondary configuration server' in the
diagram).

2.3 Completely Public Configuration: (network, network)

                      Network, possibly disjoint
                          |           |
   +------------+         |           |      +---------------+
   |  Security  |---------+--/     /--|------|     DHCP      |
   |   device   |         |           |      |    server     |
   +------------+         |           |      +---------------+
                                      ~
                                      |      +---------------+
                                      |------| configuration |
                                      |      |    server     |
                                      |      +---------------+

Completely public configuration represents the case in which the
unconfigured device is connected to the public network, and in which
the device first attempts to procure an address and next-boot-server
indication from a DHCP server, and then attempts to obtain its
configuration from the server with the provided identity, which may
be the same as the DHCP server.  This is a common provisioning model
in the CATV/ADSL home network access industry.
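
To make the two-phase flow of this scenario concrete, the following
sketch walks through it in Python.  The helper functions and the
addresses used are hypothetical placeholders rather than anything
defined by this document; the sketch only illustrates the ordering
of the two phases.

   # Illustrative sketch of the (network, network) bootstrap
   # sequence.  dhcp_discover() and fetch_configuration() are
   # hypothetical stand-ins for a DHCP client and a configuration
   # download protocol; real implementations are out of scope here.

   def dhcp_discover():
       # Phase 1: obtain an IP address and the identity of the next
       # boot/configuration server (canned values for illustration).
       return {"address": "192.0.2.10", "config_server": "192.0.2.1"}

   def fetch_configuration(server):
       # Phase 2: contact the named configuration server and retrieve
       # device and security policy configuration (stubbed here).
       return {"policy": "placeholder policy from " + server}

   def bootstrap():
       lease = dhcp_discover()
       # As discussed in section 4, the addresses learned in phase 1
       # MUST NOT be treated as identifiers or credentials; phase 2
       # requires its own authentication.
       return fetch_configuration(lease["config_server"])

   if __name__ == "__main__":
       print(bootstrap())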

3. Vulnerabilities

In evaluating the vulnerabilities of each scenario, it is appropriate
to recognize that in the worst case the attacker will be highly
skilled, have plenty of resources, and have at least one of the
devices in question available for leisurely examination.
Furthermore, it should also be assumed that the attacker will have
access to a network upon which such devices are installed, so that if
the devices are publicly configured, there will be ample opportunity
to observe this process.

Under these circumstances it is very difficult to provide a foolproof
security system, although we may come very close if we combine
physical security with some of the methodologies proposed below.
Additionally, we must recognize that in some cases, even nearly
foolproof security is financially or logistically infeasible.  In
such cases, a lesser degree of security may be acceptable.

There are essentially 3 points within this process which may be
attacked, depending upon which configuration scenario we choose: the
(private) configuring system, the device which is to be configured,
and the remote configuration server.  These various attacks are
explored in the context of individual configuration scenarios below.
Attacks on the network medium itself may generally be classed as
Denial-of-Service (DoS) or passive "man in the middle" (MiM) attacks
on the various devices, so this is not discussed as a separate point
of attack.

3.1 Vulnerabilities of Entirely Private Configuration

   +------------+   private connection    +---------------+
   |  Security  |-------------------------| configuration |
   |   device   |                         |    server     |
   +------------+                         +---------------+

There are 2 points of attack in this situation: the configuration
server, and the security device itself.  Attacks upon the
configuration server might include installation of hostile software,
or simply misappropriation and unauthorized use.  Attacks upon the
device which is to be configured include device substitution or
firmware replacement.  DoS is not a viable attack in this case,
although it could theoretically be (temporarily) mounted by damaging
the configuring interface of either device.

If we can ensure the physical inaccessibility of the devices to
anyone other than the authorized system administrator, remaining
concerns are minimal.  However, ensuring inaccessibility of the
security device between the time of manufacture and the time of
delivery may be difficult.  If access to either system by
unauthorized parties is a possibility, then additional steps are
required to secure the device configuration process.  Similar
mechanisms may be applied in other configuration scenarios, and
discussion of these is deferred to section 4 below.

3.2 Vulnerabilities of Partially Private and Public Configuration

                      Network, possibly disjoint
                          |           |
   +------------+         |           |      +---------------+
   |  Security  |---------+--/     /--|------|     DHCP      |
   |   device   |         |           |      |    server     |
   +------------+         |           |      +---------------+
                                      ~
                                      |      +---------------+
                                      |------| configuration |
                                      |      |    server     |
                                      |      +---------------+

The initial phase of configuration in this scenario is susceptible to
the same attacks as for the previous scenario.  If the physical
security of the devices prior to configuration cannot be guaranteed,
both are subject to substitution or damage.
Assuming the first phase of configuration (IP address, config server
identification) is completed successfully, then the second phase is
the only one of concern.

            |             |
   +---+    |    +---+    |    +---+
   | C |----+-//-| M |-//-+----| S |
   +---+    |    +---+    |    +---+
            |             |

Referring to the diagram above, let C represent the 'client' security
device, let M represent a malicious user, and let S represent the
configuration server.  It should be obvious at this point that we are
concerned with man-in-the-middle attacks.  In this case, M may be
capable of impersonating S with respect to C, and of impersonating C
with respect to S, effectively tricking both sides into thinking C is
appropriately configured, when in fact it is not.  Or, M could simply
examine all intervening packets, thus (perhaps) learning something
useful about C's configuration - in this case, M would not
necessarily be between C and S, but would have access to one or the
other's network.

Alternatively, M could simply launch a DoS attack in several
different ways.  For a discussion of various DHCP-related attacks,
see [DHCSEC].  Finally, the configuration information on the server
itself could be somehow compromised prior to download to the security
device, in which case the attacker might subsequently gain control of
the security device.

3.3 Vulnerabilities of Entirely Public Configuration

Entirely public configuration represents the case where the device is
placed on the network with no prior configuration.  It first must use
DHCP to determine its IP address and configuration server, and it
then must obtain its configuration from the server.  This is by far
the most vulnerable of the 3 scenarios, being subject to active or
passive man-in-the-middle attacks, denial of service, or
configuration compromise.  Discussion of various mechanisms for
securing each of these scenarios follows.

4. Securing the Configuration Process

If we cannot ensure complete physical security for the devices
involved in the configuration process, the risks presented by
unauthorized access may be mitigated in a number of ways, if desired.
The means for providing additional security may be broadly grouped in
2 categories: authentication mechanisms, and confidentiality
mechanisms.  While the value of confidentiality during this process
may be debatable, authentication is a critical ingredient of any
configuration security scheme.  For simplicity, we may group
authentication mechanisms into 4 general categories:

o  None, where the security device simply accepts configuration from
   any system which has the appropriate interface, and which presents
   the appropriate commands or protocols.

o  Low, where the security device and the configuration server have a
   rudimentary authentication mechanism for each other, such as a
   shared secret which is manufactured into the device.

o  Medium, where the security device and perhaps the configuration
   server are preconfigured with digital certificates, and have some
   mechanism which employs them for authentication.

o  High, where the security device and the configuration server rely
   on hardware protection of the authentication keying material and
   perhaps some form of secondary authentication.  Hardware
   protection might include use of a symmetric hardware token pair
   (e.g. smartcards), although it could also consist of any
   tamper-proof hardware storage mechanism.

Obviously, there are gradations within the various levels.  This
classification is simplified in the interest of clarity, and to
facilitate discussion.  These levels may be modulated by a number of
factors, including the addition of confidentiality mechanisms to the
configuration process.  There are a number of mechanisms available
for provision of confidentiality, several of which are discussed
below.

It should be noted that there are two components to the
authentication process with respect to the configuring server: the
first relates to authenticating the server as a network entity, while
the second relates to authenticating the configuration application
which resides on the server.  This distinction becomes important in
cases where the configuration server resides on a multi-tasking
system.

In general, both authentication and confidentiality may be provided
to the configuration process using the infrastructure provided by IP
security (see [ARCH] et al), although there are other mechanisms
available.  However, as noted above, the authentication provided by
IPsec alone may not suffice under all circumstances.  It is one thing
to form an authenticated security association with the host upon
which a configuring application may run, while it is another thing
entirely to form an authenticated security association with the
configuring application itself.

In the case where the configuring device does not implement IPsec, or
in the case where the configuring application runs on a multi-tasking
system, either SNMPv3 or TLS is an option for securing the
configuration protocol.  If the device implements IPsec, the
transport security protocol may run within an IPsec SA.  Details of
such mechanisms are given below.

It is also appropriate to assume, at least for the present, that DHCP
is insecure.  Since IPsec is a layer 3 construct, it cannot be used
to protect DHCP transactions, or to authenticate DHCP servers.  While
some discussion is currently under way regarding DHCP authentication
and security [DHCSEC, DHCAUTH], no such mechanisms have yet been
widely adopted.  Hence, in cases where DHCP must be used for initial
address assignment, security devices MUST NOT rely on the involved IP
addresses as identifiers or credentials.  That is, the authentication
and confidentiality mechanisms used to secure the configuration
process MUST be independent of the IP addresses of the security
device and configuration server.

In the following sections, each of the general security levels
referenced above is discussed in detail with respect to the various
configuration tuples.

4.1 No Preconfigured Authentication Mechanism (NOAUTH)

While there might seem to be nothing one could do to prevent the
various attacks on this approach, this is not necessarily the case.
The associated protective measures for each of the relevant
configuration tuples are examined below.

4.1.1 NOAUTH - Entirely Private Configuration: (private, private)

In the case of entirely offline configuration, the primary defense is
physical security for the devices in question.  If this method is
chosen, the weak points in the system should be recognized and
eliminated inasmuch as this is possible.
If the security device uses tamper-evident packaging, then it may be
reasonable to assume that devices which exhibit no signs of tampering
are safe to configure.  If the device is not packaged in a
tamper-evident (or tamper-proof) manner, the possibility exists that
the device has been altered in some way.  This must be recognized as
a risk.

Perhaps of more concern is the integrity of the configuring system.
Configuring system integrity concerns may be mitigated in the
following ways:

o  physically isolate the configuring system.

o  password-protect the configuring system (with a nontrivial
   password), and report repeated password failures using a secure
   auditing procedure.

o  password-protect the configuring software (with a nontrivial
   password).  The software could be signed using the password (or a
   hash of it), and refuse to run if the signature does not match.
   This would also be an auditable event.  (A sketch of one such
   check appears at the end of this section.)

o  require a hardware token for system access.

Obviously, using a hardware token in conjunction with
password-protecting both the configuring system and the configuring
software provides the highest level of security for the case where no
authentication is employed.  Concerns more directly related to the
unauthenticated configuration process may be mitigated in the
following ways:

o  require a physical key of some sort to be inserted into the
   security device for the duration of the configuration process.

o  [TBD - must be other things we can do...]
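
As an illustration of the software-signing measure listed above, the
following sketch shows one way the configuring software might verify
a keyed hash over its own image and refuse to run on a mismatch.  The
file names and the choice of HMAC-SHA-256 keyed with a hash of the
administrator's password are assumptions made for the example, not
requirements of this document.

   # Illustrative sketch only: verify a password-keyed signature over
   # the configuring software before allowing it to run.  File names
   # and algorithms are example assumptions.
   import hashlib
   import hmac
   import sys

   def verify_software(password, image_path, signature_path):
       # Use a hash of the administrator's password as the HMAC key.
       key = hashlib.sha256(password.encode()).digest()
       with open(image_path, "rb") as f:
           computed = hmac.new(key, f.read(), hashlib.sha256).hexdigest()
       with open(signature_path) as f:
           stored = f.read().strip()
       # Constant-time comparison; a mismatch is an auditable event.
       return hmac.compare_digest(computed, stored)

   if __name__ == "__main__":
       password = sys.argv[1] if len(sys.argv) > 1 else "example-password"
       if not verify_software(password, "config_tool.bin",
                              "config_tool.sig"):
           print("signature mismatch: refusing to run (audit event)")
           sys.exit(1)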

4.1.2 NOAUTH - Partially Private Configuration: (private, public)

Assuming that the initial configuration phase has been accomplished,
the concern now is in reducing the exposure risk for the
configuration download.  Even with no authentication mechanism, the
various measures discussed above for securing the configuration
server and device are available.  In addition, there are 2 other ways
in which to protect the configuration process.  First, the device
will contact the server (or vice versa) in the clear.  Since the
protocol is known, the risks of this approach may be mitigated by
monitoring the network for such transactions.  Second, further
reduction of the risks may be attained by ensuring that no malicious
middle man may detach the network on which the server resides from
the network on which the device resides while inserting himself in
between the two.

4.1.3 NOAUTH - Entirely Public Configuration: (public, public)

In the case of public configuration with no authentication mechanism,
the physical security considerations for the other cases must still
apply, and may in fact become even more critical.  In this case, the
device will be placed on the network with no configuration, and then
it will broadcast a DHCP (or proprietary) request.  Any system on the
network could respond to this request.  In this case, the only
protection which may apply will result from a strict network
monitoring policy, and a response mechanism for the case in which a
rogue configuration server responds to the broadcast.  To further
reduce the exposure, one of the several techniques referenced in
section 4.1.1 should be used.

A more subtle risk exists in that the configuration transaction may
be passively monitored.  If no confidentiality is provided, the
attacker may gain insights which aid an attack upon the security
device after configuration is complete.  If, on the other hand, the
security device forms a Security Association (SA - see [ARCH] for
relevant definitions) with the configuration server, such passive
observation will be effectively impossible, although there will be no
authentication provided with the SA.

4.2 Authentication using a Preshared Secret (LOWAUTH)

4.2.1 LOWAUTH - Entirely Private Configuration: (private, private)

In the case of entirely private configuration, essentially the same
issues apply for preshared secrets which apply for no authentication
whatsoever, with the caveat that unauthorized configuration by a
rogue server may be slightly more difficult to accomplish.  The
security realized by the addition of the preshared secret to this
scheme is minimal, but certainly better than none at all.

4.2.2 LOWAUTH - Partially Private Configuration: (private, public)

For partially private configuration, the preshared secret has
somewhat more utility than for entirely private configuration.  In
this case, the preshared secret may be configured as part of the
manufacturing process.  The secret is then used to authenticate the
configuration server during the second phase.  The usual cautions
pertain to the physical security of the devices prior to
configuration, but this mechanism does provide some small amount of
protection from man-in-the-middle attacks, assuming that the
preshared secret is changed often, and is nontrivial.  Also assumed
is that various password-attack prevention mechanisms are in place.

In order to utilize this mechanism, the security device negotiates an
SA with the configuration server upon booting up on the network.
This SA is authenticated using the preshared key.  Subsequent
configuration is accomplished via this SA.  The SA MUST be protected
with encryption as well as authentication, and the shared secret
SHOULD be replaced during the configuration process, with the new
shared secret being used for subsequent configuration.

4.2.3 LOWAUTH - Entirely Public Configuration: (public, public)

Entirely public configuration is much the same as partially private
configuration.  In no event should the DHCP-derived addresses be
considered secure.  The shared secret must be configured as part of
the manufacturing phase.  Following assignment of an IP address and
configuration server identity, the secret may then be used to
authenticate the configuration server during the second phase.

In order to effectively use the preshared secret during the second
phase, the security device should negotiate an SA with the
configuration server.  The SA must be authenticated using the
preshared key, and subsequent configuration is accomplished via this
SA.  This SA MUST be protected with encryption as well as
authentication, and the shared secret SHOULD be replaced during the
configuration process, with the new shared secret being used for
subsequent configuration.

The usual cautions pertain to the physical security of the devices
prior to configuration, but this mechanism does provide some small
amount of protection from man-in-the-middle attacks, assuming that
the shared secret is changed from time to time, and is nontrivial.
However, the inclusion of the secret in the manufacturing process
complicates this approach.
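
The secret-replacement step recommended above might look like the
following sketch.  The send/receive helpers are hypothetical
stand-ins for whatever configuration protocol runs inside the
encrypted, preshared-key-authenticated SA; the message format and
secret length are assumptions made for the example.

   # Illustrative sketch only: replace the manufactured preshared
   # secret once an encrypted, authenticated SA to the configuration
   # server has been established.
   import binascii
   import os

   def send_over_sa(message):
       # Stub: a real device would transmit this inside the protected
       # SA rather than printing it.
       print("-> " + message)

   def recv_over_sa():
       # Stub: a real device would read the server's acknowledgement
       # from the protected SA.
       return "ACK"

   def rotate_preshared_secret(store):
       new_secret = binascii.hexlify(os.urandom(32)).decode()
       send_over_sa("NEW-SECRET " + new_secret)
       if recv_over_sa() == "ACK":
           # The new value is used for all subsequent configuration.
           store["preshared_secret"] = new_secret
           return True
       return False

   if __name__ == "__main__":
       device_store = {"preshared_secret": "factory-secret"}
       print("rotated:", rotate_preshared_secret(device_store))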

4.3 Certificates for Authentication (MEDAUTH)

Certificates provide a much more reliable authentication mechanism
than preshared secrets, although they are similar in many ways.  The
primary differences lie in their verifiability via a PKI, and in the
fact that compromise of the public values within the security device
does not give the information necessary to impersonate a
configuration server.  Given the current scarcity of interoperable
PKI implementations, there are impediments to widespread deployment
of this mechanism.  Nonetheless, this approach is much less
susceptible to compromise than the shared secret approach.

For this form of configuration, the considerations for all 3 tuples
are essentially the same; hence, all are described together.  In
order to secure the private, partially private, and public
configuration exchanges, the public key (or list of public keys) for
the allowed configuration server(s) must be assigned to the security
device during the manufacturing process.  The security device MUST
NOT rely upon the addresses configured in the first phase for
authentication.

The configuration server's public key is used to authenticate an
IPsec SA which is established after the initial DHCP (or other
address assignment) operation.  In all tuple cases other than
(private, private), this SA MUST be protected with encryption as well
as authentication.  In all cases, the configuration certificate
SHOULD be replaced during the configuration process, with the new
certificate being used for subsequent configuration.

[TBD - discuss snmpv3 here, along with mechanism for key exchange to
setup snmp shared secret]

The usual cautions pertain to the physical security of the devices
prior to configuration, but this mechanism does provide a much larger
measure of protection from impersonation attacks than a shared secret
might.  However, the inclusion of the public key in the manufacturing
process complicates this mechanism.
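
The check against the manufactured key list described above could be
as simple as the following sketch, which compares a fingerprint of
the key presented during SA establishment against the fingerprints
installed at manufacturing time.  The key encoding and the SHA-256
fingerprint scheme are assumptions made for the example; actual
certificate processing and signature verification are left to the
key management protocol.

   # Illustrative sketch only: accept a configuration server only if
   # the public key it presents matches the list installed during
   # manufacturing.  The IP addresses assigned in the first phase
   # play no role in this decision.
   import hashlib

   MANUFACTURED_FINGERPRINTS = set()   # filled at manufacturing time

   def fingerprint(encoded_public_key):
       return hashlib.sha256(encoded_public_key).hexdigest()

   def server_key_allowed(encoded_public_key):
       return fingerprint(encoded_public_key) in MANUFACTURED_FINGERPRINTS

   if __name__ == "__main__":
       trusted_key = b"example-encoded-server-key"   # stand-in bytes
       MANUFACTURED_FINGERPRINTS.add(fingerprint(trusted_key))
       print(server_key_allowed(trusted_key))          # True
       print(server_key_allowed(b"some-other-key"))    # False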

4.4 Hardware Protection for Authentication (HIAUTH)

4.4.1 HIAUTH - Entirely Private Configuration: (private, private)

This is potentially the most secure mechanism by which to configure
security devices, and there are several levels of gradation depending
upon the mechanisms applied.  These include the following:

o  symmetric hardware protection; the authentication keying material
   is stored in symmetric hardware devices (e.g. a PCMCIA card pair)
   which must be simultaneously inserted in both devices for
   configuration to proceed.

o  asymmetric hardware protection A; the authentication keying
   material is stored in tamper-proof flash memory on the security
   device, and a hardware device is inserted in the configuring
   server.

o  asymmetric hardware protection B; the authentication keying
   material is communicated to the security device via a hardware
   token (e.g. PCMCIA card) which is inserted prior to configuration,
   and is somehow securely stored on the configuration server.

This technique may be somewhat strengthened in a number of ways.
First, the server-based implementation may require a (one-time?)
password in order to utilize the token.  Second, this token/password
pair could be required in order to decrypt the actual configuration
software.  Third, the device could be manufactured to grant only a
limited range of hardware tokens the required access.  The list goes
on and on, primarily limited by the amount of trouble the
administrator is willing to go through in order to ensure the
integrity of the configuration process.

In the case of hardware protection, concerns for the physical
security of the devices are much lessened, in that there are
mechanisms available to effectively prevent compromise even if
physical access to the equipment is gained.

4.4.2 HIAUTH - Partially Private Configuration: (private, public)

After entirely private configuration using hardware protection, this
is the next most secure mechanism by which to configure security
devices.  In this case, the hardware-protected keying material for
the security device should include a self-authenticating key (to
generate a signature to be passed to the config server), and a public
key (list) for the available configuration servers.

The initial server configures the security device with its IP address
and the IP address of the secondary configuration server; no
authentication of any sort is necessary.  Subsequently, the security
device contacts the secondary server and creates an authenticated SA
using the hardware-protected keying material for authentication, and
using this SA, downloads its configuration from the server.  This SA
MUST provide for an encrypted data stream.

4.4.3 HIAUTH - Entirely Public Configuration: (public, public)

This mechanism is very similar to the (private, public) mechanism,
except that the initial configuration phase may be subject to DoS
attacks.  The hardware-protected keying material for the security
device should include a self-authenticating key (to generate a
signature to be passed to the config server), and a public key (list)
for the available configuration server(s).  The hardware device is
inserted prior to the placement of the security device on the
network.

Upon booting up, the security device broadcasts a DHCP request, and
the response contains the IP address of the security device and of
the next boot server.  Following this, the security device contacts
the secondary server and creates an authenticated/encrypted SA using
the protected keying material, and using this SA, downloads its
configuration from the server.

5. Additional Security Mechanisms

There are a number of additional steps which will further secure the
configuration process.  These various mechanisms may be applied
regardless of the other security mechanisms used, so they are
discussed separately.

5.1 SNMPv3

SNMPv3 [SNMP3] may be used as a secondary configuration
authentication mechanism as well as for data confidentiality.  In
fact, in some cases, it may be the only mechanism employed.  It
should be noted that the encryption or authentication key strengths
to be used for the SNMP exchanges are a function of the level of
device security desired.  In general, the algorithms used for
configuration should be at least as strong as the strongest
algorithms the security device will apply to data flows that it
secures.  If security policy dictates the use of 3DES for the secured
flows, then 3DES should be the minimum security employed for the
configuration exchange.

SNMPv3 uses shared secrets in order to provide confidentiality.
While [SNMP3] states that the protocol must not be tied to any
specific Key Management Protocol (KMP), this would not seem to
preclude the employment of a KMP if desirable.  For our purposes, we
should consider the use of KMPs to further secure SNMPv3.  There are
at least 3 ways in which the shared secrets may be configured:
statically, during the initial private phase of a (private, public)
scenario, or dynamically.  The first 2 cases are relatively
straightforward to implement, so they are not discussed further here.
The dynamic case is discussed below.

[TBD - use ISAKMP with new DOI for this?]
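
For the statically provisioned cases mentioned above, one well-known
way to derive per-device SNMPv3 secrets from a single
administrator-chosen password is the USM password-to-key and
key-localization algorithm.  The sketch below (the SHA-1 variant,
with a made-up engine ID) is offered only as an illustration of how
such secrets might be produced; it is not a requirement of this
document.

   # Illustrative sketch of the SNMPv3 USM password-to-key and key
   # localization computation (SHA-1 variant).  The engine ID used in
   # the demonstration is an example value.
   import hashlib

   def password_to_key(password):
       # Expand the password to 1,048,576 bytes, then hash the result.
       pw = password.encode()
       expanded = (pw * (1048576 // len(pw) + 1))[:1048576]
       return hashlib.sha1(expanded).digest()

   def localize_key(ku, engine_id):
       # Bind the key to a single engine: Kul = H(Ku || engineID || Ku).
       return hashlib.sha1(ku + engine_id + ku).digest()

   if __name__ == "__main__":
       ku = password_to_key("nontrivial-example-password")
       kul = localize_key(ku, bytes.fromhex("800000020109840301"))
       print(kul.hex())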

5.2 Transport Layer Security (TLS)

TLS [TLS] may be used as a secondary authentication mechanism as well
as for data confidentiality.  As discussed earlier, if the
configuration process runs on a multi-tasking system, then IPsec can
only authenticate the host - not the configuring application.
Further, even if the configuration information is encrypted on the
wire due to the IPsec SA, there is still the possibility that it
could be observed and/or modified before being encrypted.

[TBD - fill in]

6. Security Considerations

IPsec device configuration security is the subject of this document.
Thus, all relevant security considerations are discussed above.

7. Editors' Addresses

Scott Kelly
RedCreek Communications
3900 Newpark Mall Road
Newark, CA 94560
USA
email: skelly@redcreek.com
Telephone: +1 (510) 745-3969

Mike St. Johns
@Home Network
425 Broadway
Redwood City, CA 94063
USA
email: stjohns@corp.home.net
Telephone: +1 (650) 569-5368

References

[RFC-2119] S. Bradner, "Key words for use in RFCs to Indicate
           Requirement Levels", BCP 14, RFC 2119, March 1997.

[ARCH]     S. Kent and R. Atkinson, "Security Architecture for IP",
           Internet Draft.

[IKE]      D. Harkins and D. Carrel, "The Internet Key Exchange",
           Internet Draft.

[DHCSEC]   O. Gudmundsson and R. Droms, "Security Requirements for
           the DHCP Protocol", Internet Draft, March 1998.

[DHCAUTH]  R. Droms and W. Arbaugh, "Authentication for DHCP
           Messages", Internet Draft.

[TLS]      T. Dierks and C. Allen, "The TLS Protocol", Internet
           Draft, May 1998.

Full Copyright Statement

Copyright (C) The Internet Society (1998).  All Rights Reserved.

This document and translations of it may be copied and furnished to
others, and derivative works that comment on or otherwise explain it
or assist in its implementation may be prepared, copied, published
and distributed, in whole or in part, without restriction of any
kind, provided that the above copyright notice and this paragraph are
included on all such copies and derivative works.  However, this
document itself may not be modified in any way, such as by removing
the copyright notice or references to the Internet Society or other
Internet organizations, except as needed for the purpose of
developing Internet standards in which case the procedures for
copyrights defined in the Internet Standards process must be
followed, or as required to translate it into languages other than
English.

The limited permissions granted above are perpetual and will not be
revoked by the Internet Society or its successors or assigns.

This document and the information contained herein is provided on an
"AS IS" basis and THE INTERNET SOCIETY AND THE INTERNET ENGINEERING
TASK FORCE DISCLAIMS ALL WARRANTIES, EXPRESS OR IMPLIED, INCLUDING
BUT NOT LIMITED TO ANY WARRANTY THAT THE USE OF THE INFORMATION
HEREIN WILL NOT INFRINGE ANY RIGHTS OR ANY IMPLIED WARRANTIES OF
MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE.