ACE Working Group                                          L. Seitz, Ed.
Internet-Draft                                       SICS Swedish ICT AB
Intended status: Informational                           S. Gerdes, Ed.
Expires: April 9, 2016                           Universitaet Bremen TZI
                                                             G. Selander
                                                                Ericsson
                                                                 M. Mani
                                                                   Itron
                                                                S. Kumar
                                                        Philips Research
                                                        October 07, 2015

                             ACE use cases
                      draft-ietf-ace-usecases-08

Abstract

Constrained devices are nodes with limited processing power, storage space and transmission capacities. These devices in many cases do not provide user interfaces and are often intended to interact without human intervention.

This document includes a collection of representative use cases for authentication and authorization in constrained environments. These use cases aim at identifying authorization problems that arise during the lifecycle of a constrained device and are intended to provide a guideline for developing a comprehensive authentication and authorization solution for this class of scenarios.

Where specific details are relevant, it is assumed that the devices use the Constrained Application Protocol (CoAP) as communication protocol, however most conclusions apply generally.

Status of This Memo

This Internet-Draft is submitted in full conformance with the provisions of BCP 78 and BCP 79.

Internet-Drafts are working documents of the Internet Engineering Task Force (IETF). Note that other groups may also distribute working documents as Internet-Drafts. The list of current Internet-Drafts is at http://datatracker.ietf.org/drafts/current/.

Internet-Drafts are draft documents valid for a maximum of six months and may be updated, replaced, or obsoleted by other documents at any time. It is inappropriate to use Internet-Drafts as reference material or to cite them other than as "work in progress."

This Internet-Draft will expire on April 9, 2016.

Copyright Notice

Copyright (c) 2015 IETF Trust and the persons identified as the document authors. All rights reserved.

This document is subject to BCP 78 and the IETF Trust's Legal Provisions Relating to IETF Documents (http://trustee.ietf.org/license-info) in effect on the date of publication of this document.
Please review these documents 60 carefully, as they describe your rights and restrictions with respect 61 to this document. Code Components extracted from this document must 62 include Simplified BSD License text as described in Section 4.e of 63 the Trust Legal Provisions and are provided without warranty as 64 described in the Simplified BSD License. 66 Table of Contents 68 1. Introduction . . . . . . . . . . . . . . . . . . . . . . . . 3 69 1.1. Terminology . . . . . . . . . . . . . . . . . . . . . . . 4 70 2. Use Cases . . . . . . . . . . . . . . . . . . . . . . . . . . 4 71 2.1. Container monitoring . . . . . . . . . . . . . . . . . . 4 72 2.1.1. Bananas for Munich . . . . . . . . . . . . . . . . . 5 73 2.1.2. Authorization Problems Summary . . . . . . . . . . . 6 74 2.2. Home Automation . . . . . . . . . . . . . . . . . . . . . 7 75 2.2.1. Controlling the Smart Home Infrastructure . . . . . . 7 76 2.2.2. Seamless Authorization . . . . . . . . . . . . . . . 7 77 2.2.3. Remotely letting in a visitor . . . . . . . . . . . . 8 78 2.2.4. Selling the house . . . . . . . . . . . . . . . . . . 8 79 2.2.5. Authorization Problems Summary . . . . . . . . . . . 8 80 2.3. Personal Health Monitoring . . . . . . . . . . . . . . . 9 81 2.3.1. John and the heart rate monitor . . . . . . . . . . . 10 82 2.3.2. Authorization Problems Summary . . . . . . . . . . . 11 83 2.4. Building Automation . . . . . . . . . . . . . . . . . . . 12 84 2.4.1. Device Lifecycle . . . . . . . . . . . . . . . . . . 12 85 2.4.2. Public Safety . . . . . . . . . . . . . . . . . . . . 16 86 2.4.3. Authorization Problems Summary . . . . . . . . . . . 17 87 2.5. Smart Metering . . . . . . . . . . . . . . . . . . . . . 18 88 2.5.1. Drive-by metering . . . . . . . . . . . . . . . . . . 18 89 2.5.2. Meshed Topology . . . . . . . . . . . . . . . . . . . 19 90 2.5.3. Advanced Metering Infrastructure . . . . . . . . . . 19 91 2.5.4. Authorization Problems Summary . . . . . . . . . . . 20 92 2.6. Sports and Entertainment . . . . . . . . . . . . . . . . 20 93 2.6.1. Dynamically Connecting Smart Sports Equipment . . . . 21 94 2.6.2. Authorization Problems Summary . . . . . . . . . . . 21 95 2.7. Industrial Control Systems . . . . . . . . . . . . . . . 22 96 2.7.1. Oil Platform Control . . . . . . . . . . . . . . . . 22 97 2.7.2. Authorization Problems Summary . . . . . . . . . . . 23 98 3. Security Considerations . . . . . . . . . . . . . . . . . . . 23 99 3.1. Attacks . . . . . . . . . . . . . . . . . . . . . . . . . 24 100 3.2. Configuration of Access Permissions . . . . . . . . . . . 25 101 3.3. Authorization Considerations . . . . . . . . . . . . . . 25 102 3.4. Proxies . . . . . . . . . . . . . . . . . . . . . . . . . 26 103 4. Privacy Considerations . . . . . . . . . . . . . . . . . . . 26 104 5. Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . 27 105 6. IANA Considerations . . . . . . . . . . . . . . . . . . . . . 27 106 7. Informative References . . . . . . . . . . . . . . . . . . . 27 107 Authors' Addresses . . . . . . . . . . . . . . . . . . . . . . . 28 109 1. Introduction 111 Constrained devices [RFC7228] are nodes with limited processing 112 power, storage space and transmission capacities. These devices are 113 often battery-powered and in many cases do not provide user 114 interfaces. 116 Constrained devices benefit from being interconnected using Internet 117 protocols. However, deploying common security protocols can 118 sometimes be difficult because of device or network limitations. 
Regardless, adequate security mechanisms are required to protect these constrained devices, which are expected to be integrated in all aspects of everyday life, from attackers wishing to gain control over the device's data or functions.

This document comprises a collection of representative use cases for the application of authentication and authorization in constrained environments. These use cases aim at identifying authorization problems that arise during the lifecycle of a constrained device. Note that this document does not aim at collecting all possible use cases.

We assume that the communication between the devices is based on the Representational State Transfer (REST) architectural style, i.e., a device acts as a server that offers resources such as sensor data and actuators. The resources can be accessed by clients, sometimes without human intervention (M2M). In some situations the communication will happen through intermediaries (e.g., gateways, proxies).

Where specific detail is necessary, it is assumed that the devices communicate using CoAP [RFC7252], although most conclusions are generic.

1.1. Terminology

Readers are required to be familiar with the terms defined in [RFC7228].

2. Use Cases

This section includes the use cases; each use case first presents a general description of the application environment, then one or more specific use cases, and finally a summary of the authorization-related problems to be solved.

There are various reasons for assigning a function (client or server) to a device, e.g., which device initiates the conversation, how devices find each other, etc. The definition of the function of a device in a certain use case is not in the scope of this document. Readers should be aware that there might be reasons for each setting and that endpoints might even have different functions at different times.

2.1. Container monitoring

The ability of sensors to communicate environmental data wirelessly opens up new application areas. Sensor systems make it possible to continuously track and transmit characteristics such as temperature, humidity and gas content while goods are transported and stored.

Sensors in this scenario have to be associated with the appropriate pallet of the respective container. Sensors as well as the goods belong to specific customers.

While in transit, goods often pass stops where they are transloaded to other means of transportation, e.g., from ship transport to road transport.

Perishable goods need to be stored at a constant temperature and with proper ventilation. Real-time information on the state of the goods is needed by both the transporter and the vendor. Transporters want to prioritize goods that will expire soon. Vendors want to react when goods are spoiled, in order to continue to fulfill their delivery obligations.

The Intelligent Container (http://www.intelligentcontainer.com) is an example project that explores solutions for continuously monitoring perishable goods.

2.1.1. Bananas for Munich

A fruit vendor grows bananas in Costa Rica for the German market. It instructs a transport company to deliver the goods via ship to Rotterdam, where they are picked up by trucks and transported to a ripening facility.
A Munich supermarket chain buys ripened bananas from the fruit vendor and transports them from the ripening facility to the individual markets with its own company trucks.

The fruit vendor's quality management wants to assure the quality of its products and thus equips the banana boxes with sensors. The state of the goods is monitored consistently during shipment and ripening, and abnormal sensor values are recorded (U1.2). Additionally, the sensor values are used to control the climate within the cargo containers (U1.1, U1.5, U1.7). The sensors therefore need to communicate with the climate control system. Since a wrong sensor value leads to a wrong temperature and thus to spoiled goods, the integrity of the sensor data must be assured (U1.2, U1.3). The banana boxes within a container will in most cases belong to the same owner. Adjacent containers might contain goods and sensors of different owners (U1.1).

The personnel that transloads the goods must be able to locate the goods meant for a specific customer (U1.1, U1.6, U1.7). However, the fruit vendor does not want to disclose sensor information pertaining to the condition of the goods to other companies and therefore wants to assure the confidentiality of this data (U1.4). Thus, the transloading personnel are only allowed to access logistic information (U1.1). Moreover, the transloading personnel are only allowed to access the data for the time of the transloading (U1.8).

Due to the high water content of the fruits, the propagation of radio waves is hindered, thus often inhibiting direct communication between nodes [Jedermann14]. Instead, messages are forwarded over multiple hops (U1.9). The sensors in the banana boxes cannot always reach the Internet during the journey (U1.10). Sensors may need to use relay stations owned by the transport company to connect to endpoints in the Internet.

In the ripening facility, bananas are stored until they are ready to be sold. The banana box sensors are used to control the ventilation system and to monitor the degree of ripeness of the bananas. Ripe bananas need to be identified and sold before they spoil (U1.2, U1.8).

The supermarket chain gains ownership of the banana boxes when the bananas have ripened and are ready to leave the ripening facility.

2.1.2. Authorization Problems Summary

o U1.1 Fruit vendors and container owners want to grant different authorizations for their resources and/or endpoints to different parties.

o U1.2 The fruit vendor requires the integrity and authenticity of the sensor data that pertains to the state of the goods, for climate control and to ensure the quality of the monitored recordings.

o U1.3 The container owner requires the integrity and authenticity of the sensor data that is used for climate control.

o U1.4 The fruit vendor requires the confidentiality of the sensor data that pertains to the state of the goods and the confidentiality of location data, e.g., to protect them from targeted attacks by competitors.

o U1.5 The fruit vendor may need different protection for several different types of data on the same endpoint, e.g., sensor data and the data used for logistics.
258 o U1.6 The fruit vendor and the transloading personnel require the 259 authenticity and integrity of the data that is used to locate the 260 goods, in order to ensure that the goods are correctly treated and 261 delivered. 263 o U1.7 The container owner and the fruit vendor may not be present 264 at the time of access and cannot manually intervene in the 265 authorization process. 267 o U1.8 The fruit vendor, container owner and transloading company 268 want to grant temporary access permissions to a party, in order to 269 avoid giving permanent access to parties that are no longer 270 involved in processing the bananas. 272 o U1.9 The fruit vendor, container owner and transloading company 273 want their security objectives to be achieved, even if the 274 messages between the endpoints need to be forwarded over multiple 275 hops. 277 o U1.10 The constrained devices might not always be able to reach 278 the Internet but still need to enact the authorization policies of 279 their principals. 281 o U1.11 Fruit vendors and container owners want to be able to revoke 282 authorization on a malfunctioning sensor. 284 2.2. Home Automation 286 One application of the Internet of Things is home automation systems. 287 Such a system can connect household devices that control, for example 288 heating, ventilation, lighting, home entertainment, and home security 289 to the Internet making them remotely accessible and manageable. 291 Such a system needs to accommodate a number of regular users 292 (inhabitants, close friends, cleaning personnel) as well as a 293 heterogeneous group of dynamically varying users (visitors, 294 repairmen, delivery men). 296 As the users are not typically trained in security (or even computer 297 use), the configuration must use secure default settings, and the 298 interface must be well adapted to novice users. 300 2.2.1. Controlling the Smart Home Infrastructure 302 Alice and Bob own a flat which is equipped with home automation 303 devices such as HVAC and shutter control, and they have a motion 304 sensor in the corridor which controls the light bulbs there (U2.5). 306 Alice and Bob can control the shutters and the temperature in each 307 room using either wall-mounted touch panels or an internet connected 308 device (e.g. a smartphone). Since Alice and Bob both have a full- 309 time job, they want to be able to change settings remotely, e.g. turn 310 up the heating on a cold day if they will be home earlier than 311 expected (U2.5). 313 The couple does not want people in radio range of their devices, e.g. 314 their neighbors, to be able to control them without authorization. 315 Moreover, they don't want burglars to be able to deduce behavioral 316 patterns from eavesdropping on the network (U2.8). 318 2.2.2. Seamless Authorization 320 Alice buys a new light bulb for the corridor and integrates it into 321 the home network, i.e. makes resources known to other devices in the 322 network. Alice makes sure that the new light bulb and her other 323 devices in the network get to know the authorization policies for the 324 new device. Bob is not at home, but Alice wants him to be able to 325 control the new device with his devices (e.g. his smartphone) without 326 the need for additional administration effort (U2.7). She provides 327 the necessary configurations for that (U2.9, U2.10). 329 2.2.3. Remotely letting in a visitor 331 Alice and Bob have equipped their home with automated connected door- 332 locks and an alarm system at the door and the windows. 
The couple can control this system remotely.

Alice and Bob have invited Alice's parents over for dinner, but are stuck in traffic and cannot arrive in time, while Alice's parents, who use the subway, will arrive punctually. Alice calls her parents and offers to let them in remotely, so they can make themselves comfortable while waiting (U2.1, U2.6). Then Alice sets temporary permissions that allow them to open the door, and shut down the alarm (U2.2). She wants these permissions to be valid only for the evening, since she does not like it if her parents are able to enter the house as they see fit (U2.3, U2.4).

When Alice's parents arrive at Alice's and Bob's home, they use their smartphone to communicate with the door-lock and alarm system (U2.5, U2.9). The permissions Alice issued to her parents only allow limited access to the house (e.g., opening the door, turning on the lights). Certain other functions, such as checking the footage from the surveillance cameras, are not accessible to them (U2.3).

Alice and Bob also issue similarly restricted permissions to, e.g., cleaners, repairmen or their nanny (U2.3).

2.2.4. Selling the house

Alice and Bob have to move because Alice is starting a new job. They therefore decide to sell the house, and transfer control of all automated services to the new owners (U2.11). Before doing that, they want to erase privacy-relevant data from the logs of the automated systems, while the new owners are interested in keeping some historic data, e.g., pertaining to the behavior of the heating system (U2.12). At the time of transfer of the house, the new owners also want to make sure that permissions issued by the previous owners to access the house or connected devices (in the case where device management may have separate permissions from house access) are no longer valid (U2.13).

2.2.5. Authorization Problems Summary

o U2.1 A home owner (Alice and Bob in the example above) wants to spontaneously provision authorization means to visitors.

o U2.2 A home owner wants to spontaneously change the home's access control policies.

o U2.3 A home owner wants to apply different access rights for different users (including other inhabitants).

o U2.4 The home owners want to grant access permissions to someone during a specified time frame.

o U2.5 The smart home devices need to be able to securely communicate with different control devices (e.g., wall-mounted touch panels, smartphones, electronic key fobs, device gateways).

o U2.6 The home owner wants to be able to configure authorization policies remotely.

o U2.7 Authorized users want to be able to obtain access with little effort.

o U2.8 The owners of the automated home want to prevent unauthorized entities from being able to deduce behavioral profiles from devices in the home network.

o U2.9 Usability is particularly important in this scenario since the necessary authorization-related tasks in the lifecycle of the device (commissioning, operation, maintenance and decommissioning) likely need to be performed by the home owners, who in most cases have little knowledge of security.

o U2.10 Home owners want their devices to seamlessly (and in some cases even unnoticeably) fulfill their purpose. Therefore, the authorization administration effort needs to be kept at a minimum.
o U2.11 Home owners want to be able to transfer ownership of their automated systems when they sell the house.

o U2.12 Home owners want to be able to sanitize the logs of the automated systems, when transferring ownership, without deleting important operational data.

o U2.13 When a transfer of ownership occurs, the new owner wants to make sure that access rights created by the previous owner are no longer valid.

2.3. Personal Health Monitoring

Personal health monitoring devices, i.e., eHealth devices, are typically battery-driven and located physically on or in the user to monitor some bodily function, such as temperature, blood pressure, or pulse rate. These devices typically connect to the Internet through an intermediary base station using wireless technologies, and through this connection they report the monitored data to some entity, which may be either the user or a medical caregiver.

Medical data has always been considered very sensitive and therefore requires good protection against unauthorized disclosure. A frequent, conflicting requirement is the capability for medical personnel to gain emergency access, even if no specific access rights exist. As a result, the importance of secure audit logs increases in such scenarios.

Since the users are not typically trained in security (or even computer use), the configuration must use secure default settings, and the interface must be well adapted to novice users. Parts of the system must operate with minimal maintenance. In particular, frequent changes of battery are unacceptable.

There is a plethora of wearable health monitoring technology, and the need for open industry standards to ensure interoperability between products has led to initiatives such as the Continua Alliance (continuaalliance.org) and the Personal Connected Health Alliance (pchalliance.org).

2.3.1. John and the heart rate monitor

John has a heart condition that can result in sudden cardiac arrest. He therefore uses a device called HeartGuard that monitors his heart rate and his location (U3.7). In case of a cardiac arrest, it automatically sends an alarm to an emergency service, transmitting John's current location (U3.1). Either the device has long-range connectivity itself (e.g., via GSM) or it uses some intermediary, nearby device (e.g., John's smartphone) to transmit such an alarm. To ensure John's safety, the device is expected to be in constant operation (U3.3, U3.6).

The device includes an authentication mechanism, in order to prevent other persons who get physical access to it from acting as the owner and altering the access control and security settings (U3.8).

John can configure additional persons that get notified in an emergency, for example his daughter Jill. Furthermore, the device stores data on John's heart rate, which can later be accessed by a physician to assess the condition of John's heart (U3.2).

However, John is a privacy-conscious person, and is worried that Jill might use HeartGuard to monitor his location while there is no emergency. Furthermore, he does not want his health insurance company to get access to the HeartGuard data, or even to the fact that he is wearing a HeartGuard, since they might refuse to renew his insurance if they decided he was too big a risk for them (U3.8).
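The kind of selective, per-requester access described above can be illustrated with a small sketch. The following Python fragment is purely illustrative: the requester names, resource names, the emergency flag and the rule format are assumptions made only for this example and are not part of any existing product or proposed solution.

   from dataclasses import dataclass

   # Illustrative access rules for the HeartGuard scenario: access is
   # denied unless an explicit rule matches (default deny), and some
   # rules are only valid while an emergency is active.
   @dataclass(frozen=True)
   class Rule:
       requester: str               # who is asking, e.g. "physician"
       resource: str                # what is being asked for
       actions: frozenset           # allowed actions, e.g. {"read"}
       emergency_only: bool = False

   POLICY = [
       Rule("emergency-service", "location", frozenset({"read"}), emergency_only=True),
       Rule("daughter-jill", "alarm-notification", frozenset({"receive"})),
       Rule("physician", "heart-rate-history", frozenset({"read"})),
       # No rule for the insurance company: default deny applies.
   ]

   def is_allowed(requester, resource, action, emergency=False):
       for rule in POLICY:
           if (rule.requester == requester and rule.resource == resource
                   and action in rule.actions
                   and (not rule.emergency_only or emergency)):
               return True
       return False

   # Jill cannot read John's location while there is no emergency,
   # but the emergency service can once an emergency is detected.
   assert not is_allowed("daughter-jill", "location", "read")
   assert is_allowed("emergency-service", "location", "read", emergency=True)

The sketch only shows that the wearer can express different permissions for different requesters and contexts; how such rules are provisioned, protected and enforced on a constrained device is exactly the problem space this document describes.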
475 Finally John, while being comfortable with modern technology and able 476 to operate it reasonably well, is not trained in computer security. 477 He therefore needs an interface for the configuration of the 478 HeartGuard security that is easy to understand and use (U3.5). If 479 John does not understand the meaning of a setting, he tends to leave 480 it alone, assuming that the manufacturer has initialized the device 481 to secure settings (U3.4). 483 NOTE: Monitoring of some state parameter (e.g. an alarm button) and 484 the position of a person also fits well into an elderly care service. 485 This is particularly useful for people suffering from dementia, where 486 the relatives or caregivers need to be notified of the whereabouts of 487 the person under certain conditions. In this case it is not the 488 patient that decides about access. 490 2.3.2. Authorization Problems Summary 492 o U3.1 The wearer of an eHealth device (John in the example above) 493 wants to pre-configure special access rights in the context of an 494 emergency. 496 o U3.2 The wearer of an eHealth device wants to selectively allow 497 different persons or groups access to medical data. 499 o U3.3 Battery changes are very inconvenient and sometimes 500 impractical, so battery life impacts of the authorization 501 mechanisms need to be minimized. 503 o U3.4 Devices are often used with default access control settings 504 which might threaten the security objectives of the device's 505 users. 507 o U3.5 Wearers of eHealth devices are often not trained in computer 508 use, and especially computer security. 510 o U3.6 Security mechanisms themselves could provide opportunities 511 for denial of service attacks, especially on the constrained 512 devices. 514 o U3.7 The device provides a service that can be fatal for the 515 wearer if it fails. Accordingly, the wearer wants the device to 516 have a high degree of resistance against attacks that may cause 517 the device to fail to operate partially or completely. 519 o U3.8 The wearer of an eHealth device requires the integrity and 520 confidentiality of the data measured by the device. 522 2.4. Building Automation 524 Buildings for commercial use such as shopping malls or office 525 buildings nowadays are equipped increasingly with semi-automatic 526 components to enhance the overall living quality and to save energy 527 where possible. This includes for example heating, ventilation and 528 air condition (HVAC) as well as illumination and security systems 529 such as fire alarms. These components are being increasingly managed 530 centrally in a Building and Lighting Management System (BLMS) by a 531 facility manager. 533 Different areas of these buildings are often exclusively leased to 534 different companies. However they also share some of the common 535 areas of the building. Accordingly, a company must be able to 536 control the lighting and HVAC system of its own part of the building 537 and must not have access to control rooms that belong to other 538 companies. 540 Some parts of the building automation system such as entrance 541 illumination and fire alarm systems are controlled either by all 542 parties together or by a facility management company. 544 2.4.1. Device Lifecycle 546 2.4.1.1. Installation and Commissioning 548 Installation of the building automation components often start even 549 before the construction work is completed. Lighting is one of the 550 first components to be installed in new buildings. 
A lighting plan created by a lighting designer provides the necessary information related to the kind of lighting devices (luminaires, sensors and switches) to be installed, along with their expected behavior. The physical installation of the correct lighting devices at the right locations is done by electricians based on the lighting plan. They ensure that the electrical wiring is performed according to local regulations and that lighting devices, which may be from multiple manufacturers, are properly connected to the electrical power supply. After the installation, lighting can be used in a default out-of-the-box mode, e.g., at full brightness when powered on. After this step (or in parallel in a different section of the building), a lighting commissioner adds the devices to the building domain (U4.1) and performs the proper configuration of the lights as prescribed in the lighting plan. This involves, for example, grouping to ensure that light points react together, more or less synchronously (U4.8), and defining lighting scenes for particular areas of the building. The commissioning is often done in phases, by one or more commissioners, on different floors. The building lighting network at this stage may be in different network islands with no connectivity between them, due to the lack of IT infrastructure.

After this, other building components like HVAC and security systems are similarly installed by electricians and later commissioned by their respective domain professionals. Similar configurations related to grouping (U4.8) are required to ensure, e.g., that HVAC equipment is controlled by the closest temperature sensor.

For the building IT systems, the Ethernet wiring is initially laid out in the building according to the IT plan. The IT network is often commissioned after the construction is completed, to avoid any damage to sensitive networking and computing equipment. The commissioning is performed by an IT engineer with additional switches (wired and/or wireless), IP routers and computing devices. Direct Internet connectivity for all installed/commissioned devices in the building is only available at this point. The BLMS that monitors and controls the various building automation components is only connected to the field devices at this stage. The different network islands (for lighting and HVAC) are also joined together without any further involvement of domain specialists such as lighting or HVAC commissioners.

2.4.1.2. Operational

The building automation system is now finally ready, and operational access is transferred to the facility management company of the building (U4.2). The facility manager is responsible for monitoring and ensuring that the building automation system meets the needs of the building occupants. If changes are needed, the facility management company hires an external installation and commissioning company to perform the changes.

Different parts of the building are rented out to different companies for office space. The tenants are provided access to use the automated HVAC, lighting and physical access control systems deployed. The safety of the occupants is also managed using automated systems, such as a fire alarm system, which is triggered by several smoke detectors that are spread out across the building.
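As an illustration only, the following Python sketch shows one possible way in which such authorization rules in the BLMS could be represented; the controller names, group names, rule format and context labels are assumptions made for this sketch and do not describe any existing BLMS product or any solution proposed here.

   from datetime import time

   # Illustrative BLMS rules: which controllers may perform which
   # actions on which group of luminaires, and in which context.
   LIGHT_GROUPS = {
       "office-a-101": ["luminaire-1", "luminaire-2", "luminaire-3"],
       "corridor-a":   ["luminaire-10", "luminaire-11"],
   }

   RULES = [
       # (controller, group, allowed actions, context)
       ("presence-sensor-101",  "office-a-101", {"on", "off", "dim"},          "any"),
       ("wall-switch-101",      "office-a-101", {"on", "off", "dim", "color"}, "any"),
       ("presence-sensor-cor",  "corridor-a",   {"on", "off"},                 "any"),
       ("commissioning-tool-c", "office-a-101", {"configure"},      "non-office-hours"),
       # No rule grants manual control of the corridor group:
       # corridor lighting is not manually adjustable (default deny).
   ]

   def context_matches(context, now):
       if context == "any":
           return True
       if context == "non-office-hours":
           return now < time(8, 0) or now > time(18, 0)
       return False

   def may_control(controller, group, action, now):
       # Default deny: only explicitly configured rules grant access.
       return any(ctrl == controller and grp == group and action in actions
                  and context_matches(context, now)
                  for ctrl, grp, actions, context in RULES)

   assert may_control("wall-switch-101", "office-a-101", "dim", time(14, 0))
   assert not may_control("wall-switch-101", "corridor-a", "dim", time(14, 0))
   assert not may_control("commissioning-tool-c", "office-a-101", "configure", time(14, 0))
   assert may_control("commissioning-tool-c", "office-a-101", "configure", time(22, 0))

In this sketch, a rule that names a group applies to all luminaires in that group (see U4.8), and the time-of-day context corresponds to the kind of context-based restrictions mentioned in U4.5.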
Company A's staff move into the newly furnished office space. Most lighting is controlled by presence sensors, which control specific groups of lights based on the authorization rules in the BLMS. Additionally, employees are allowed to manually override the lighting brightness and color in their office rooms by using the switches or handheld controllers. Such changes are allowed only if the corresponding authorization rules exist in the BLMS. For example, lighting in the corridors may not be manually adjustable.

At the end of the day, lighting is dimmed down or switched off if no occupancy is detected, even if manually overridden during the day.

At a later date, Company B also moves into the same building, and shares some of the common spaces and associated building automation components with Company A (U4.2, U4.9).

2.4.1.3. Maintenance

Company A's staff are annoyed that the lighting switches off too often in their rooms if they work silently in front of their computers. Company A notifies the facility manager of the building to increase the delay before lights switch off. The facility manager can either configure the new values directly in the BLMS or, if additional changes are needed on the field devices, hire a commissioning company, Company C, to perform the needed changes (U4.4).

Company C gets the necessary authorization from the facility management company to interact with the BLMS. The commissioner's tool gets the necessary authorization from the BLMS to send a configuration change to all lighting devices in Company A's offices to increase their delay before they switch off.

At some point, the facility management company wants to update the firmware of the lighting devices in order to eliminate software bugs. Before accepting the new firmware, each device checks the authorization of the facility management company to perform this update.

A network diagnostic tool of the BLMS detects that a luminaire in one of Company A's office rooms is no longer connected to the network. The BLMS alerts the facility manager to replace the luminaire. The facility manager replaces the old broken luminaire and informs the BLMS of the identity (e.g., the MAC address) of the newly added device. The BLMS then authorizes the new device onto the system and seamlessly transfers all the permissions of the previous broken device to the replacement device (U4.12).

2.4.1.4. Recommissioning

A vacant area of the building has recently been leased to Company A. Before moving into its new office, Company A wishes to replace the lighting with more energy-efficient luminaires that provide better light quality. They hire an installation and commissioning company, Company C, to redo the illumination. Company C is instructed to integrate the new lighting devices, which may be from multiple manufacturers, into the existing lighting infrastructure of the building, which includes presence sensors, switches, controllers, etc. (U4.1).

Company C gets the necessary authorization from the facility management company to interact with the existing BLMS (U4.4). To prevent disturbance to other occupants of the building, Company C is provided authorization to perform the commissioning only during non-office hours and only to modify configuration on devices belonging to the domain of Company A's space (U4.5).
Before removing existing devices, all security and configuration material that belongs to the domain is deleted and the devices are set back to the factory state (U4.3). This ensures that these devices may be reused at other installations or in other parts of the same building without affecting future operations. After installation (wiring) of the new lighting devices, the commissioner adds the devices to Company A's lighting domain.

Once the devices are in the correct domain, the commissioner authorizes the interaction rules between the new lighting devices and existing devices like presence sensors (U4.7). For this, the commissioner creates the authorization rules on the BLMS, which define which lights form a group and which sensors/switches/controllers are allowed to control which groups (U4.8). These authorization rules may be context-based, e.g., depending on the time of day (office or non-office hours) or on the location of the handheld lighting controller (U4.5).

2.4.1.5. Decommissioning

Company A has noticed that the handheld controllers are often misplaced and hard to find when needed. So most of the time, staff use the existing wall switches for manual control. Company A decides it would be better to completely remove the handheld controllers and asks Company C to decommission them from the lighting system (U4.4).

Company C again gets the necessary authorization from the facility management company to interact with the BLMS. The commissioner now deletes any rules that authorized handheld controllers to control the lighting (U4.3, U4.6). Additionally, the commissioner instructs the BLMS to push these new rules in order to prevent cached rules at the end devices from being used. Any cryptographic key material belonging to the site is also removed from the handheld controllers, and they are set back to the factory state (U4.3).

2.4.2. Public Safety

The fire department requires, as part of the building safety code, that the building have sensors that sense the level of smoke, heat, etc., when a fire breaks out. These sensors report metrics which are then used by a back-end server to map safe and unsafe areas within a building and also, possibly, the structural integrity of the building before fire-fighters may enter it. Sensors may also be used to track where human/animal activity is within the building. This will allow people stuck within the building to be guided to safer areas, and possible actions that they may take (e.g., using a client application on their phones, or loudspeaker directions) can be suggested in order to bring them to safety. In certain cases, other organizations such as the police, ambulance services, and federal organizations are also involved; therefore, the coordination of tasks between the various entities has to be carried out using efficient messaging and authorization mechanisms.

2.4.2.1. A fire breaks out

On a really hot day, James, who works for Company A, turns on the air conditioning in his office. Lucy, who works for Company B, wants to make tea using an electric kettle. After she turns it on, she goes outside to talk to a colleague until the water is boiling. Unfortunately, her kettle has a malfunction which causes overheating and results in a smoldering fire of the kettle's plastic case.

Due to the smoke coming from the kettle, the fire alarm is triggered.
736 Alarm sirens throughout the building are switched on simultaneously 737 (using a group communication scheme) to alert the staff of both 738 companies (U4.8). Additionally, the ventilation system of the whole 739 building is closed off to prevent the smoke from spreading and to 740 withdraw oxygen from the fire. The smoke cannot get into James' 741 office although he turned on his air condition because the fire alarm 742 overrides the manual setting by sending commands (using group 743 communication) to switch off all the air conditioning (U4.10). 745 The fire department is notified of the fire automatically and arrives 746 within a short time. They automatically get access to all parts of 747 the building according to an emergency authorization policy (U4.4, 748 U4.5). After inspecting the damage and extinguishing the smoldering 749 fire a fire fighter resets the fire alarm because only the fire 750 department is authorized to do that (U4.4, U4.11). 752 2.4.3. Authorization Problems Summary 754 o U4.1 During commissioning, the building owner or the companies add 755 new devices to their administrative domain. Access control should 756 then apply to these devices seamlessly. 758 o U4.2 During a handover, the building owner or the companies 759 integrate devices that formerly belonged to a different 760 administrative domain to their own administrative domain. Access 761 control of the old domain should then cease to apply, with access 762 control of the new domain taking over. 764 o U4.3 During decommissioning, the building owner or the companies 765 remove devices from their administrative domain. Access control 766 should cease to apply to these devices and relevant credentials 767 need to be erased from the devices. 769 o U4.4 The building owner and the companies want to be able to 770 delegate specific access rights for their devices to others. 772 o U4.5 The building owner and the companies want to be able to 773 define context-based authorization rules. 775 o U4.6 The building owner and the companies want to be able to 776 revoke granted permissions and delegations. 778 o U4.7 The building owner and the companies want to allow authorized 779 entities to send data to their endpoints (default deny). 781 o U4.8 The building owner and the companies want to be able to 782 authorize a device to control several devices at the same time 783 using a group communication scheme. 785 o U4.9 The companies want to be able to interconnect their own 786 subsystems with those from a different operational domain while 787 keeping the control over the authorizations (e.g. granting and 788 revoking permissions) for their endpoints and devices. 790 o U4.10 The authorization mechanisms must be able to cope with 791 extremely time-sensitive operations which have to be carried out 792 in a quick manner. 794 o U4.11 The building owner and the public safety authorities want to 795 be able to perform data origin authentication on messages sent and 796 received by some of the systems in the building. 798 o U4.12 The building owner should be allowed to replace an existing 799 device with a new device providing the same functionality within 800 their administrative domain. Access control from the replaced 801 device should then apply to these new devices seamlessly. 803 2.5. Smart Metering 805 Automated measuring of customer consumption is an established 806 technology for electricity, water, and gas providers. 
Increasingly, these systems also feature networking capability to allow for remote management. Such systems are in use for commercial, industrial and residential customers and require a certain level of security, in order to avoid economic loss to the providers, vulnerability of the distribution system, as well as disruption of services for the customers.

The smart metering equipment for gas and water solutions is battery-driven, and communication should be used sparingly due to battery consumption. Therefore, these types of meters sleep most of the time, and only wake up every minute/hour to check for incoming instructions. Furthermore, they wake up a few times a day (based on their configuration) to upload their measured metering data.

Different networking topologies exist for smart metering solutions. Based on environment, regulatory rules and expected cost, one or a mixture of these topologies may be deployed to collect the metering information. Drive-by metering is one of the most common solutions currently deployed for the collection of data from gas and water meters.

Various stakeholders have a claim on the metering data: utility companies need the data for accounting; the metering equipment may be operated by a third-party service operator, who needs to maintain it; and the equipment is installed on the premises of the consumers, measuring their consumption, which entails privacy questions.

2.5.1. Drive-by metering

A service operator offers smart metering infrastructures and related services to various utility companies. Among these is a water provider, who in turn supplies several residential complexes in a city. The smart meters are installed in the end customers' homes to measure water consumption and thus generate billing data for the utility company; they can also be used to shut off the water if the bills are not paid (U5.1, U5.3). The meters do so by sending and receiving data to and from a base station (U5.2). Several base stations are installed around the city to collect the metering data. However, in the denser urban areas, the base stations would have to be installed very close to the meters. This would require a high number of base stations and expose this more expensive equipment to manipulation or sabotage. The service operator has therefore chosen another approach, which is to drive around with a mobile base station and let the meters connect to it at regular intervals in order to gather metering data (U5.4, U5.6, U5.8).

2.5.2. Meshed Topology

In another deployment, the water meters are installed in a building that already has power meters installed; the latter are mains-powered and are therefore not subject to the same power-saving restrictions. The water meters can therefore use the power meters as proxies, in order to achieve better connectivity. This requires the security measures on the water meters to work through intermediaries (U5.9).

2.5.3. Advanced Metering Infrastructure

A utility company is updating its old utility distribution network with advanced meters and new communication systems, known as an Advanced Metering Infrastructure (AMI).
AMI refers to a system that 867 measures, collects and analyzes usage, and interacts with metering 868 devices such as electricity meters, gas meters, heat meters, and 869 water meters, through various communication media either on request 870 (on-demand) or on pre-defined schedules. Based on this technology, 871 new services make it possible for consumers to control their utility 872 consumption (U5.2, U5.7) and reduce costs by supporting new tariff 873 models from utility companies, and more accurate and timely billing. 874 However the end-consumers do not want unauthorized persons to gain 875 access to this data. Furthermore, the fine-grained measurement of 876 consumption data may induce privacy concerns, since it may allow 877 others to create behavioral profiles (U5.5, U5.10). 879 The technical solution is based on levels of data aggregation between 880 smart meters located at the consumer premises and the Meter Data 881 Management (MDM) system located at the utility company (U5.9). For 882 reasons of efficiency and cost, end-to-end connectivity is not always 883 feasible, so metering data is stored and aggregated in various 884 intermediate devices before being forwarded to the utility company, 885 and in turn accessed by the MDM. The intermediate devices may be 886 operated by a third party service operator on behalf of the utility 887 company (U5.7). One responsibility of the service operator is to 888 make sure that meter readings are performed and delivered in a 889 regular, timely manner. An example of a Service Level Agreement 890 between the service operator and the utility company is e.g. "at 891 least 95 % of the meters have readings recorded during the last 72 892 hours". 894 2.5.4. Authorization Problems Summary 896 o U5.1 Devices are installed in hostile environments where they are 897 physically accessible by attackers (including dishonest 898 customers). The service operator and the utility company want to 899 make sure that an attacker cannot use data from a captured device 900 to attack other parts of their infrastructure. 902 o U5.2 The utility company wants to control which entities are 903 allowed to send data to, and read data from their endpoints. 905 o U5.3 The utility company wants to ensure the integrity of the data 906 stored on their endpoints. 908 o U5.4 The utility company wants to protect such data transfers to 909 and from their endpoints. 911 o U5.5 Consumers want to access their own usage information and also 912 prevent unauthorized access by others. 914 o U5.6 The devices may have intermittent Internet connectivity but 915 still need to enact the authorization policies of their 916 principals. 918 o U5.7 Neither the service operator nor the utility company are 919 always present at the time of access and cannot manually intervene 920 in the authorization process. 922 o U5.8 When authorization policies are updated it is impossible, or 923 at least very inefficient to contact all affected endpoints 924 directly. 926 o U5.9 Authorization and authentication must work even if messages 927 between endpoints are stored and forwarded over multiple nodes. 929 o U5.10 Consumers may not want the Service Operator, the Utility 930 company or others to have access to a fine-grained level of 931 consumption data that allows the creation of behavioral profiles. 933 2.6. Sports and Entertainment 935 In the area of leisure time activities, applications can benefit from 936 the small size and weight of constrained devices. 
Sensors and actuators with various functions can be integrated into fitness equipment, games and even clothes. Users can carry their devices around with them at all times.

Usability is especially important in this area, since users will often want to spontaneously interconnect their devices with others. Therefore, the configuration of access permissions must be simple and fast and not require much effort at the time of access.

Continuous monitoring allows authorized users to create behavioral or movement profiles, which corresponds to the devices' intended use; however, unauthorized access to the collected data would allow an attacker to create the same profiles. Moreover, the aggregation of data can seriously increase the impact on the privacy of the users.

2.6.1. Dynamically Connecting Smart Sports Equipment

Jody is an enthusiastic runner. To keep track of her training progress, she has smart running shoes that measure the pressure at various points beneath her feet to count her steps, detect irregularities in her stride and help her to improve her posture and running style. On a sunny afternoon, she goes to the Finnbahn track near her home to work out. She meets her friend Lynn, who shows her the smart fitness watch she bought a few days ago. The watch can measure the wearer's pulse, show speed and distance, and keep track of the configured training program. The girls discover that the watch can be connected with Jody's shoes and can then additionally display the information the shoes provide.

Jody asks Lynn to let her try the watch and lend it to her for the afternoon. Lynn agrees, but does not want Jody to access her training plan (U6.4). She configures the access policies for the watch so that Jody's shoes are allowed to access the display and measuring features but cannot read or add training data (U6.1, U6.2). Jody's shoes connect to Lynn's watch after only a press of a button, because Jody already configured access rights for devices that belong to Lynn a while ago (U6.3). Jody wants the device to report the data back to her fitness account while she borrows it, so she allows it to access her account temporarily.

After an hour, Jody gives the watch back and both girls terminate the connection between their devices.

2.6.2. Authorization Problems Summary

o U6.1 Sports equipment owners want to be able to grant access rights dynamically when needed.

o U6.2 Sports equipment owners want the configuration of access rights to work with very little effort.

o U6.3 Sports equipment owners want to be able to pre-configure access policies that grant certain access permissions to endpoints with certain attributes (e.g., endpoints of a certain user) without additional configuration effort at the time of access.

o U6.4 Sports equipment owners want to protect the confidentiality of their data for privacy reasons.

2.7. Industrial Control Systems

Industrial control systems (ICS), and especially supervisory control and data acquisition systems (SCADA), use a multitude of sensors and actuators in order to monitor and control industrial processes in the physical world. Example processes include manufacturing, power generation, and refining of raw materials.
Since the advent of the Stuxnet worm, it has become obvious to the general public how vulnerable these kinds of systems are, especially when connected to the Internet. The severity of these vulnerabilities is exacerbated by the fact that many ICS are used to control critical public infrastructure, such as nuclear power, water treatment or traffic control. Nevertheless, the economic advantages of connecting such systems to the Internet can be significant if appropriate security measures are put in place (U7.5).

2.7.1. Oil Platform Control

An oil platform uses an industrial control system to monitor data and control equipment. The purpose of this system is to gather and process data from a large number of sensors, and to control actuators such as valves and switches to steer the oil extraction process on the platform. Raw data, alarms, reports and other information are also available to the operators, who can intervene with manual commands. Many of the sensors are connected to the controlling units by direct wire, but the operator is slowly replacing these units with wireless ones, since this makes maintenance easier (U7.4).

Some of the controlling units are connected to the Internet, to allow for remote administration, since it is expensive and inconvenient to fly in a technician to the platform (U7.3).

The main interest of the operator is to ensure the integrity of control messages and sensor readings (U7.1). In some cases, access needs to be restricted, e.g., the operator wants wireless actuators to only accept commands from authorized control units (U7.2).

The owner of the platform also wants to collect auditing information for liability reasons (U7.1).

Different levels of access apply, e.g., for regular operators vs. maintenance technicians vs. auditors of the platform (U7.6).

2.7.2. Authorization Problems Summary

o U7.1 The operator of the platform wants to ensure the integrity and confidentiality of sensor and actuator data.

o U7.2 The operator wants to ensure that data coming from sensors and commands sent to actuators are authentic.

o U7.3 Some devices do not have a direct Internet connection, but still need to implement current authorization policies.

o U7.4 Devices need to authenticate the controlling units, especially those using a wireless connection.

o U7.5 The execution of unauthorized commands or the failure to execute an authorized command in an ICS can lead to significant financial damage, and threaten the availability of critical infrastructure services. Accordingly, the operator wants authentication and authorization mechanisms that provide a very high level of security.

o U7.6 Different users should have different levels of access to the control system (e.g., operator vs. auditor).

3. Security Considerations

As the use cases listed in this document demonstrate, constrained devices are used in various environments. These devices are small and inexpensive, and this makes it easy to integrate them into many aspects of everyday life. With access to vast amounts of valuable data and possibly control of important functions, these devices need to be protected from unauthorized access.
Protecting seemingly innocuous data and functions will lessen the possible effects of aggregation; attackers collecting data or functions from several sources can gain insights or a level of control not immediately obvious from each of these sources on its own.

Not only is the data on the constrained devices themselves threatened; the devices might also be abused as an intrusion point to infiltrate a network. Once an attacker gains control over the device, it can be used to attack other devices as well. Due to their limited capabilities, constrained devices appear as the weakest link in the network and hence pose an attractive target for attackers.

This section summarizes the security problems highlighted by the use cases above and provides guidelines for the design of protocols for authentication and authorization in constrained RESTful environments.

3.1. Attacks

This document lists security problems that users of constrained devices want to solve. Further analysis of attack scenarios is not in the scope of this document. However, there are attacks that must be considered by solution developers.

Because of the expected large number of devices and their ubiquity, constrained devices increase the danger from Pervasive Monitoring [RFC7258] attacks.

Attacks that aim at altering data in transit (e.g., to perpetrate fraud) are a problem that is addressed by many web security protocols such as TLS or IPsec. Developers need to consider this type of attack and make sure that the protection measures they implement are adapted to the constrained environment.

As some of the use cases indicate, constrained devices may be installed in hostile environments where they are physically accessible (see Section 2.5). Protection from physical attacks is not in the scope of this document, but should be kept in mind by developers of authorization solutions.

Denial of service (DoS) attacks threaten the availability of the services a device provides, and constrained devices are especially vulnerable to these types of attacks because of their limitations. Attackers can elicit a temporary or, if the battery is drained, permanent failure in a service simply by repeatedly flooding the device with connection attempts; for some services (see Section 2.3), availability is especially important. Solution designers must be particularly careful to consider the following limitations in every part of the authorization solution:

o Battery usage

o Number of required message exchanges

o Size of data that is transmitted (e.g., authentication and access control data)

o Size of code required to run the protocols

o Size of RAM and stack required to run the protocols

o Resources blocked by partially completed exchanges (e.g., while one party is waiting for a transaction to time out)

Solution developers also need to consider whether the session should be protected from information disclosure and tampering.

3.2. Configuration of Access Permissions

o The access control policies need to be enforced (all use cases): The information that is needed to implement the access control policies needs to be provided to the device that enforces the authorization and applied to every incoming request.

3.2.  Configuration of Access Permissions

o  The access control policies need to be enforced (all use cases):
   the information that is needed to implement the access control
   policies needs to be provided to the device that enforces the
   authorization and applied to every incoming request.

o  A single resource might have different access rights for
   different requesting entities (all use cases).

   Rationale: In some cases different types of users need different
   access rights, as opposed to a binary approach where the same
   access permissions are granted to all authenticated users.

o  A device might host several resources where each resource has
   its own access control policy (all use cases).

o  The device that makes the policy decisions should be able to
   evaluate context-based permissions such as location or time of
   access (see Section 2.2, Section 2.3, Section 2.4).  Access may
   also depend on local conditions, e.g. access to health data in an
   emergency, and such conditions should be taken into account as
   well.

3.3.  Authorization Considerations

o  Devices need to be enabled to enforce authorization policies
   without human intervention at the time of the access request (see
   Section 2.1, Section 2.2, Section 2.4, Section 2.5).

o  Authorization solutions need to consider that constrained devices
   might not have Internet access at the time of the access request
   (see Section 2.1, Section 2.3, Section 2.5, Section 2.6).

o  It should be possible to update access control policies without
   manually re-provisioning individual devices (see Section 2.2,
   Section 2.3, Section 2.5, Section 2.6).

   Rationale: Peers can change rapidly, which makes manual re-
   provisioning unreasonably expensive.

o  Authorization policies may be defined to apply to a large number
   of devices that might only have intermittent connectivity.
   Distributing policy updates to every device for every update
   might not be a feasible solution (see Section 2.5).

o  It must be possible to dynamically revoke authorizations (see
   e.g. Section 2.4).

o  The authentication and access control protocol can put undue
   burden on the constrained system resources of a device
   participating in the protocol.  An authorization solution must
   take the limitations of the constrained devices into account (all
   use cases, see also Section 3.1).

o  Secure default settings are needed for the initial state of the
   authentication and authorization protocols (all use cases).

   Rationale: Many attacks exploit insecure default settings, and
   experience shows that default settings are frequently left
   unchanged by the end users.

o  Access to resources on other devices should only be permitted if
   a rule exists that explicitly allows this access (default deny)
   (see e.g. Section 2.4); a sketch combining this with context-
   based permissions follows this list.

o  Usability is important for all use cases.  Configuring
   authorization policies, as well as gaining access to devices,
   must be simple for the users of the devices.  Special care needs
   to be taken for scenarios where access control policies have to
   be configured by users who are typically not trained in security
   (see Section 2.2, Section 2.3, Section 2.6).
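
Several of the points above (per-resource policies, context-based
permissions and default deny) can be combined in a single policy
evaluation step, as the following minimal sketch illustrates.  The
rule table, the emergency flag and all identifiers are hypothetical
and only serve to illustrate the considerations above; they do not
prescribe a policy format.

   /* Illustrative sketch only: default-deny, context-aware
    * policy evaluation on a device hosting several resources.
    * Rules, identifiers and the emergency flag are
    * hypothetical. */

   #include <stdio.h>
   #include <string.h>

   struct rule {
       const char *subject;   /* authenticated requester          */
       const char *resource;  /* local resource, e.g. a CoAP path */
       const char *method;    /* permitted method                 */
       int         emergency; /* 1: applies only in an emergency  */
   };

   static const struct rule rules[] = {
       { "doctor-1",   "/heart-rate", "GET", 0 },
       { "paramedic",  "/heart-rate", "GET", 1 },
       { "technician", "/firmware",   "PUT", 0 },
   };

   /* Permit only if an explicit rule matches (default deny). */
   static int permitted(const char *subject, const char *resource,
                        const char *method, int in_emergency)
   {
       size_t i;
       for (i = 0; i < sizeof(rules) / sizeof(rules[0]); i++) {
           if (strcmp(rules[i].subject,  subject)  == 0 &&
               strcmp(rules[i].resource, resource) == 0 &&
               strcmp(rules[i].method,   method)   == 0 &&
               (!rules[i].emergency || in_emergency))
               return 1;
       }
       return 0;
   }

   int main(void)
   {
       printf("paramedic, normal:    %d\n",
              permitted("paramedic", "/heart-rate", "GET", 0));
       printf("paramedic, emergency: %d\n",
              permitted("paramedic", "/heart-rate", "GET", 1));
       printf("stranger, emergency:  %d\n",
              permitted("stranger", "/heart-rate", "GET", 1));
       return 0;
   }

In an actual authorization solution, such rules would more likely be
provisioned and updated remotely rather than configured by hand on
each device, in line with the considerations on re-provisioning
above.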

3.4.  Proxies

In some cases, the traffic between endpoints might go through
intermediary nodes (e.g. proxies, gateways).  This might affect the
function or the security model of authentication and access control
protocols; e.g., end-to-end security between endpoints with DTLS
might not be possible (see Section 2.5).

4.  Privacy Considerations

Many of the devices that are the focus of this document register
data from the physical world (sensors) or affect processes in the
physical world (actuators), which may involve data or processes
belonging to individuals.  To make matters worse, the sensor data
may be recorded continuously, thus allowing significant information
about an individual subject to be gathered from the sensor readings.
Therefore, privacy protection is especially important, and
authentication and access control are important tools for this,
since they make it possible to control who gets access to private
data.

Privacy protection can also be taken into account when evaluating
the need for end-to-end confidentiality, since otherwise
intermediary nodes will learn the content of potentially sensitive
messages sent between endpoints and thereby threaten the privacy of
the individual who may be the subject of this data.

In some cases, even the possession of a certain type of device can
be confidential, e.g. individuals might not want others to know that
they are wearing a certain medical device (see Section 2.3).

The personal health monitoring use case (see Section 2.3) indicates
the need for secure audit logs, which impose specific requirements
on a solution.  Auditing is not in the scope of ACE.  However, if an
authorization solution provides means for audit logs, it must
consider the impact of the logged data on the privacy of all parties
involved.  Suitable measures for protecting and purging the logs
must be taken during operation, maintenance and decommissioning of
the device.

5.  Acknowledgments

The authors would like to thank Olaf Bergmann, Sumit Singhal, John
Mattson, Mohit Sethi, Carsten Bormann, Martin Murillo, Corinna
Schmitt, Hannes Tschofenig, Erik Wahlstroem, Andreas Baeckman,
Samuel Erdtman, Steve Moore, Thomas Hardjono, Kepeng Li, Jim Schaad,
Prashant Jhingran, Kathleen Moriarty, and Sean Turner for reviewing
and/or contributing to the document.  Also, thanks to Markus Becker,
Thomas Poetsch and Koojana Kuladinithi for their input on the
container monitoring use case.  Furthermore, the authors thank Akbar
Rahman, Chonggang Wang, Vinod Choyi, and Abhinav Somaraju, who
contributed to the building automation use case.

Ludwig Seitz and Goeran Selander worked on this document as part of
the EIT-ICT Labs activity PST-14056.

6.  IANA Considerations

This document has no IANA actions.

7.  Informative References

[Jedermann14]
           Jedermann, R., Poetsch, T., and C. Lloyd, "Communication
           techniques and challenges for wireless food quality
           monitoring", Philosophical Transactions of the Royal
           Society A: Mathematical, Physical and Engineering
           Sciences, May 2014.

[RFC7228]  Bormann, C., Ersue, M., and A. Keranen, "Terminology for
           Constrained-Node Networks", RFC 7228,
           DOI 10.17487/RFC7228, May 2014,
           <http://www.rfc-editor.org/info/rfc7228>.

[RFC7252]  Shelby, Z., Hartke, K., and C. Bormann, "The Constrained
           Application Protocol (CoAP)", RFC 7252,
           DOI 10.17487/RFC7252, June 2014,
           <http://www.rfc-editor.org/info/rfc7252>.

[RFC7258]  Farrell, S. and H. Tschofenig, "Pervasive Monitoring Is
           an Attack", BCP 188, RFC 7258, DOI 10.17487/RFC7258, May
           2014, <http://www.rfc-editor.org/info/rfc7258>.

Authors' Addresses

Ludwig Seitz (editor)
SICS Swedish ICT AB
Scheelevaegen 17
Lund  223 70
Sweden

Email: ludwig@sics.se


Stefanie Gerdes (editor)
Universitaet Bremen TZI
Postfach 330440
Bremen  28359
Germany

Phone: +49-421-218-63906
Email: gerdes@tzi.org


Goeran Selander
Ericsson
Faroegatan 6
Kista  164 80
Sweden

Email: goran.selander@ericsson.com


Mehdi Mani
Itron
52, rue Camille Desmoulins
Issy-les-Moulineaux  92130
France

Email: Mehdi.Mani@itron.com


Sandeep S. Kumar
Philips Research
High Tech Campus
Eindhoven  5656 AA
The Netherlands

Email: sandeep.kumar@philips.com