idnits 2.17.1 draft-ietf-ace-usecases-07.txt: Checking boilerplate required by RFC 5378 and the IETF Trust (see https://trustee.ietf.org/license-info): ---------------------------------------------------------------------------- No issues found here. Checking nits according to https://www.ietf.org/id-info/1id-guidelines.txt: ---------------------------------------------------------------------------- No issues found here. Checking nits according to https://www.ietf.org/id-info/checklist : ---------------------------------------------------------------------------- No issues found here. Miscellaneous warnings: ---------------------------------------------------------------------------- == The copyright year in the IETF Trust and authors Copyright Line does not match the current year -- The document date (October 02, 2015) is 3128 days in the past. Is this intentional? Checking references for intended status: Informational ---------------------------------------------------------------------------- == Unused Reference: 'RFC6347' is defined on line 1190, but no explicit reference was found in the text -- Obsolete informational reference (is this intentional?): RFC 6347 (Obsoleted by RFC 9147) Summary: 0 errors (**), 0 flaws (~~), 2 warnings (==), 2 comments (--). Run idnits with the --verbose option for more detailed information about the items above. -------------------------------------------------------------------------------- 2 ACE Working Group L. Seitz, Ed. 3 Internet-Draft SICS Swedish ICT AB 4 Intended status: Informational S. Gerdes, Ed. 5 Expires: April 4, 2016 Universitaet Bremen TZI 6 G. Selander 7 Ericsson 8 M. Mani 9 Itron 10 S. Kumar 11 Philips Research 12 October 02, 2015 14 ACE use cases 15 draft-ietf-ace-usecases-07 17 Abstract 19 Constrained devices are nodes with limited processing power, storage 20 space and transmission capacities. These devices in many cases do 21 not provide user interfaces and are often intended to interact 22 without human intervention. 24 This document includes a collection of representative use cases for 25 authentication and authorization in constrained environments. These 26 use cases aim at identifying authorization problems that arise during 27 the lifecycle of a constrained device and are intended to provide a 28 guideline for developing a comprehensive authentication and 29 authorization solution for this class of scenarios. 31 Where specific details are relevant, it is assumed that the devices 32 use the Constrained Application Protocol (CoAP) as communication 33 protocol, however most conclusions apply generally. 35 Status of This Memo 37 This Internet-Draft is submitted in full conformance with the 38 provisions of BCP 78 and BCP 79. 40 Internet-Drafts are working documents of the Internet Engineering 41 Task Force (IETF). Note that other groups may also distribute 42 working documents as Internet-Drafts. The list of current Internet- 43 Drafts is at http://datatracker.ietf.org/drafts/current/. 45 Internet-Drafts are draft documents valid for a maximum of six months 46 and may be updated, replaced, or obsoleted by other documents at any 47 time. It is inappropriate to use Internet-Drafts as reference 48 material or to cite them other than as "work in progress." 49 This Internet-Draft will expire on April 4, 2016. 51 Copyright Notice 53 Copyright (c) 2015 IETF Trust and the persons identified as the 54 document authors. All rights reserved. 
56 This document is subject to BCP 78 and the IETF Trust's Legal 57 Provisions Relating to IETF Documents 58 (http://trustee.ietf.org/license-info) in effect on the date of 59 publication of this document. Please review these documents 60 carefully, as they describe your rights and restrictions with respect 61 to this document. Code Components extracted from this document must 62 include Simplified BSD License text as described in Section 4.e of 63 the Trust Legal Provisions and are provided without warranty as 64 described in the Simplified BSD License. 66 Table of Contents 68 1. Introduction . . . . . . . . . . . . . . . . . . . . . . . . 3 69 1.1. Terminology . . . . . . . . . . . . . . . . . . . . . . . 3 70 2. Use Cases . . . . . . . . . . . . . . . . . . . . . . . . . . 4 71 2.1. Container monitoring . . . . . . . . . . . . . . . . . . 4 72 2.1.1. Bananas for Munich . . . . . . . . . . . . . . . . . 5 73 2.1.2. Authorization Problems Summary . . . . . . . . . . . 6 74 2.2. Home Automation . . . . . . . . . . . . . . . . . . . . . 7 75 2.2.1. Controlling the Smart Home Infrastructure . . . . . . 7 76 2.2.2. Seamless Authorization . . . . . . . . . . . . . . . 7 77 2.2.3. Remotely letting in a visitor . . . . . . . . . . . . 7 78 2.2.4. Selling the house . . . . . . . . . . . . . . . . . . 8 79 2.2.5. Authorization Problems Summary . . . . . . . . . . . 8 80 2.3. Personal Health Monitoring . . . . . . . . . . . . . . . 9 81 2.3.1. John and the heart rate monitor . . . . . . . . . . . 10 82 2.3.2. Authorization Problems Summary . . . . . . . . . . . 11 83 2.4. Building Automation . . . . . . . . . . . . . . . . . . . 12 84 2.4.1. Device Lifecycle . . . . . . . . . . . . . . . . . . 12 85 2.4.2. Public Safety . . . . . . . . . . . . . . . . . . . . 14 86 2.4.3. Authorization Problems Summary . . . . . . . . . . . 15 87 2.5. Smart Metering . . . . . . . . . . . . . . . . . . . . . 16 88 2.5.1. Drive-by metering . . . . . . . . . . . . . . . . . . 16 89 2.5.2. Meshed Topology . . . . . . . . . . . . . . . . . . . 17 90 2.5.3. Advanced Metering Infrastructure . . . . . . . . . . 17 91 2.5.4. Authorization Problems Summary . . . . . . . . . . . 18 92 2.6. Sports and Entertainment . . . . . . . . . . . . . . . . 18 93 2.6.1. Dynamically Connecting Smart Sports Equipment . . . . 19 94 2.6.2. Authorization Problems Summary . . . . . . . . . . . 19 95 2.7. Industrial Control Systems . . . . . . . . . . . . . . . 20 96 2.7.1. Oil Platform Control . . . . . . . . . . . . . . . . 20 97 2.7.2. Authorization Problems Summary . . . . . . . . . . . 21 98 3. Security Considerations . . . . . . . . . . . . . . . . . . . 21 99 3.1. Attacks . . . . . . . . . . . . . . . . . . . . . . . . . 22 100 3.2. Configuration of Access Permissions . . . . . . . . . . . 23 101 3.3. Authorization Considerations . . . . . . . . . . . . . . 23 102 3.4. Proxies . . . . . . . . . . . . . . . . . . . . . . . . . 24 103 4. Privacy Considerations . . . . . . . . . . . . . . . . . . . 24 104 5. Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . 25 105 6. IANA Considerations . . . . . . . . . . . . . . . . . . . . . 25 106 7. Informative References . . . . . . . . . . . . . . . . . . . 26 107 Authors' Addresses . . . . . . . . . . . . . . . . . . . . . . . 26 109 1. Introduction 111 Constrained devices [RFC7228] are nodes with limited processing 112 power, storage space and transmission capacities. These devices are 113 often battery-powered and in many cases do not provide user 114 interfaces. 
116 Constrained devices benefit from being interconnected using Internet 117 protocols. However, deploying common security protocols can 118 sometimes be difficult because of device or network limitations. 119 Regardless, adequate security mechanisms are required to protect 120 these constrained devices, which are expected to be integrated in all 121 aspects of everyday life, from attackers wishing to gain control over 122 the device's data or functions. 124 This document comprises a collection of representative use cases for 125 the application of authentication and authorization in constrained 126 environments. These use cases aim at identifying authorization 127 problems that arise during the lifecycle of a constrained device. 128 Note that this document does not aim at collecting all possible use 129 cases. 131 We assume a scenario where one device acts as a server that offers 132 resources such as sensor data and actuator settings. The resources 133 can be accessed by clients, sometimes without human intervention, i.e. 134 machine-to-machine (M2M). 135 In some situations the communication will happen through 136 intermediaries (e.g. gateways, proxies). 138 Where specific detail is necessary, it is assumed that the devices 139 communicate using CoAP [RFC7252], although most conclusions are 140 generic. 142 1.1. Terminology 143 Readers are required to be familiar with the terms defined in 144 [RFC7228]. 146 2. Use Cases 148 This section includes the use cases; each use case first presents a 149 general description of the application environment, then one or more 150 specific use cases, and finally a summary of the 151 authorization-related problems to be solved. 153 There are various reasons for assigning a function (client or server) 154 to a device, e.g. which device initiates the conversation, how 155 devices find each other, etc. The definition of the function of a 156 device in a certain use case is not in scope of this document. 157 Readers should be aware that there might be reasons for each setting 158 and that endpoints might even have different functions at different 159 times. 161 2.1. Container monitoring 163 The ability of sensors to communicate environmental data wirelessly 164 opens up new application areas. Sensor systems make it possible to 165 continuously track and transmit characteristics such as temperature, 166 humidity and gas content while goods are transported and stored. 168 Sensors in this scenario have to be associated with the appropriate 169 pallet of the respective container. Sensors as well as the goods 170 belong to specific customers. 172 While in transit, goods often pass stops where they are transloaded to 173 other means of transportation, e.g. from ship transport to road 174 transport. 176 Perishable goods need to be stored at a constant temperature and with 177 proper ventilation. Real-time information on the state of the goods 178 is needed by both the transporter and the vendor. Transporters want 179 to prioritize goods that will expire soon. Vendors want to react when 180 goods are spoiled to continue to fulfill delivery obligations. 182 The Intelligent Container (http://www.intelligentcontainer.com) is an 183 example project that explores solutions to continuously monitor 184 perishable goods. 186 2.1.1. Bananas for Munich 188 A fruit vendor grows bananas in Costa Rica for the German market.
It 189 instructs a transport company to deliver the goods via ship to 190 Rotterdam where they are picked up by trucks and transported to a 191 ripening facility. A Munich supermarket chain buys ripened bananas 192 from the fruit vendor and transports them from the ripening facility 193 to the individual markets with their own company trucks. 195 The fruit vendor's quality management wants to assure the quality of 196 their products and thus equips the banana boxes with sensors. The 197 state of the goods is monitored consistently during shipment and 198 ripening, and abnormal sensor values are recorded (U1.2). 199 Additionally, the sensor values are used to control the climate 200 within the cargo containers (U1.1, U1.5, U1.7). The sensors 201 therefore need to communicate with the climate control system. Since 202 a wrong sensor value leads to a wrong temperature and thus to spoiled 203 goods, the integrity of the sensor data must be assured (U1.2, U1.3). 204 The banana boxes within a container will in most cases belong to the 205 same owner. Adjacent containers might contain goods and sensors of 206 different owners (U1.1). 208 The personnel that transloads the goods must be able to locate the 209 goods meant for a specific customer (U1.1, U1.6, U1.7). However, the 210 fruit vendor does not want to disclose sensor information pertaining 211 to the condition of the goods to other companies and therefore wants 212 to assure the confidentiality of this data (U1.4). Thus, the 213 transloading personnel are only allowed to access logistic information 214 (U1.1). Moreover, the transloading personnel are only allowed to 215 access the data for the time of the transloading (U1.8). 217 Due to the high water content of the fruits, the propagation of radio 218 waves is hindered, thus often inhibiting direct communication between 219 nodes [Jedermann14]. Instead, messages are forwarded over multiple 220 hops (U1.9). The sensors in the banana boxes cannot always reach the 221 Internet during the journey (U1.10). Sensors may need to use relay 222 stations owned by the transport company to connect to endpoints in 223 the Internet. 225 In the ripening facility, bananas are stored until they are ready to 226 be sold. The banana box sensors are used to control the ventilation 227 system and to monitor the degree of ripeness of the bananas. Ripe 228 bananas need to be identified and sold before they spoil (U1.2, 229 U1.8). 231 The supermarket chain gains ownership of the banana boxes when the 232 bananas have ripened and are ready to leave the ripening facility. 234 2.1.2. Authorization Problems Summary 236 o U1.1 Fruit vendors and container owners want to grant different 237 authorizations for their resources and/or endpoints to different 238 parties. 240 o U1.2 The fruit vendor requires the integrity and authenticity of 241 the sensor data that pertains to the state of the goods for climate 242 control and to ensure the quality of the monitored recordings. 244 o U1.3 The container owner requires the integrity and authenticity 245 of the sensor data that is used for climate control. 247 o U1.4 The fruit vendor requires the confidentiality of the sensor 248 data that pertains to the state of the goods and the confidentiality 249 of location data, e.g., to protect them from targeted attacks from 250 competitors. 252 o U1.5 The fruit vendor may need different protection for several 253 different types of data on the same endpoint, e.g., sensor data 254 and the data used for logistics.
256 o U1.6 The fruit vendor and the transloading personnel require the 257 authenticity and integrity of the data that is used to locate the 258 goods, in order to ensure that the goods are correctly treated and 259 delivered. 261 o U1.7 The container owner and the fruit vendor may not be present 262 at the time of access and cannot manually intervene in the 263 authorization process. 265 o U1.8 The fruit vendor, container owner and transloading company 266 want to grant temporary access permissions to a party, in order to 267 avoid giving permanent access to parties that are no longer 268 involved in processing the bananas. 270 o U1.9 The fruit vendor, container owner and transloading company 271 want their security objectives to be achieved, even if the 272 messages between the endpoints need to be forwarded over multiple 273 hops. 275 o U1.10 The constrained devices might not always be able to reach 276 the Internet but still need to enact the authorization policies of 277 their principals. 279 o U1.11 Fruit vendors and container owners want to be able to revoke 280 authorization on a malfunctioning sensor. 282 2.2. Home Automation 284 One application of the Internet of Things is home automation systems. 285 Such a system can connect household devices that control, for example 286 heating, ventilation, lighting, home entertainment, and home security 287 to the Internet making them remotely accessible and manageable. 289 Such a system needs to accommodate a number of regular users 290 (inhabitants, close friends, cleaning personnel) as well as a 291 heterogeneous group of dynamically varying users (visitors, 292 repairmen, delivery men). 294 As the users are not typically trained in security (or even computer 295 use), the configuration must use secure default settings, and the 296 interface must be well adapted to novice users. 298 2.2.1. Controlling the Smart Home Infrastructure 300 Alice and Bob own a flat which is equipped with home automation 301 devices such as HVAC and shutter control, and they have a motion 302 sensor in the corridor which controls the light bulbs there (U2.5). 304 Alice and Bob can control the shutters and the temperature in each 305 room using either wall-mounted touch panels or an internet connected 306 device (e.g. a smartphone). Since Alice and Bob both have a full- 307 time job, they want to be able to change settings remotely, e.g. turn 308 up the heating on a cold day if they will be home earlier than 309 expected (U2.5). 311 The couple does not want people in radio range of their devices, e.g. 312 their neighbors, to be able to control them without authorization. 313 Moreover, they don't want burglars to be able to deduce behavioral 314 patterns from eavesdropping on the network (U2.8). 316 2.2.2. Seamless Authorization 318 Alice buys a new light bulb for the corridor and integrates it into 319 the home network, i.e. makes resources known to other devices in the 320 network. Alice makes sure that the new light bulb and her other 321 devices in the network get to know the authorization policies for the 322 new device. Bob is not at home, but Alice wants him to be able to 323 control the new device with his devices (e.g. his smartphone) without 324 the need for additional administration effort (U2.7). She provides 325 the necessary configurations for that (U2.9, U2.10). 327 2.2.3. Remotely letting in a visitor 328 Alice and Bob have equipped their home with automated connected door- 329 locks and an alarm system at the door and the windows. 
The couple 330 can control this system remotely. 332 Alice and Bob have invited Alice's parents over for dinner, but are 333 stuck in traffic and cannot arrive in time, while Alice's parents, who 334 use the subway, will arrive punctually. Alice calls her parents and 335 offers to let them in remotely, so they can make themselves 336 comfortable while waiting (U2.1, U2.6). Then Alice sets temporary 337 permissions that allow them to open the door and shut down the alarm 338 (U2.2). She wants these permissions to be valid only for the evening 339 since she does not like it if her parents are able to enter the house 340 as they see fit (U2.3, U2.4). 342 When Alice's parents arrive at Alice's and Bob's home, they use their 343 smartphone to communicate with the door-lock and alarm system (U2.5, 344 U2.9). The permissions Alice issued to her parents only allow 345 limited access to the house (e.g. opening the door, turning on the 346 lights). Certain other functions, such as checking the footage from 347 the surveillance cameras, are not accessible to them (U2.3). 349 Alice and Bob also issue similarly restricted permissions to e.g. 350 cleaners, repairmen or their nanny (U2.3). 352 2.2.4. Selling the house 354 Alice and Bob have to move because Alice is starting a new job. They 355 therefore decide to sell the house, and transfer control of all 356 automated services to the new owners (U2.11). Before doing that they 357 want to erase privacy-relevant data from the logs of the automated 358 systems, while the new owner is interested in keeping some historic data, 359 e.g. pertaining to the behavior of the heating system (U2.12). At 360 the time of transfer of the house, the new owners also want to make 361 sure that permissions issued by the previous owners to access the 362 house or connected devices (in the case where device management may 363 have separate permissions from house access) are no longer valid 364 (U2.13). 366 2.2.5. Authorization Problems Summary 368 o U2.1 A home owner (Alice and Bob in the example above) wants to 369 spontaneously provision authorization means to visitors. 371 o U2.2 A home owner wants to spontaneously change the home's access 372 control policies. 374 o U2.3 A home owner wants to apply different access rights for 375 different users (including other inhabitants). 377 o U2.4 The home owners want to grant access permissions to someone 378 during a specified time frame. 380 o U2.5 The smart home devices need to be able to securely 381 communicate with different control devices (e.g. wall-mounted 382 touch panels, smartphones, electronic key fobs, device gateways). 384 o U2.6 The home owner wants to be able to configure authorization 385 policies remotely. 387 o U2.7 Authorized users want to be able to obtain access with little 388 effort. 390 o U2.8 The owners of the automated home want to prevent unauthorized 391 entities from being able to deduce behavioral profiles from 392 devices in the home network. 394 o U2.9 Usability is particularly important in this scenario since 395 the necessary authorization-related tasks in the lifecycle of the 396 device (commissioning, operation, maintenance and decommissioning) 397 likely need to be performed by the home owners, who in most cases 398 have little knowledge of security. 400 o U2.10 Home owners want their devices to seamlessly (and in some 401 cases even unnoticeably) fulfill their purpose. Therefore the 402 authorization administration effort needs to be kept at a minimum.
404 o U2.11 Home owners want to be able to transfer ownership of their 405 automated systems when they sell the house. 407 o U2.12 Home owners want to be able to sanitize the logs of the 408 automated systems, when transferring ownership, without deleting 409 important operational data. 411 o U2.13 When a transfer of ownership occurs, the new owner wants to 412 make sure that access rights created by the previous owner are no 413 longer valid. 415 2.3. Personal Health Monitoring 417 Personal health monitoring devices, i.e. eHealth devices, are 418 typically battery-driven and located physically on or in the user to 419 monitor some bodily function, such as temperature, blood pressure, or 420 pulse rate. These devices typically connect to the Internet through 421 an intermediary base-station, using wireless technologies, and through 422 this connection they report the monitored data to some entity, which 423 may be either the user or a medical caregiver. 425 Medical data has always been considered very sensitive and 426 therefore requires good protection against unauthorized disclosure. 427 A frequent, conflicting requirement is the capability for medical 428 personnel to gain emergency access, even if no specific access rights 429 exist. As a result, the importance of secure audit logs increases in 430 such scenarios. 432 Since the users are not typically trained in security (or even 433 computer use), the configuration must use secure default settings, 434 and the interface must be well adapted to novice users. Parts of the 435 system must operate with minimal maintenance. In particular, frequent 436 battery changes are unacceptable. 438 There is a plethora of wearable health monitoring technology, and the 439 need for open industry standards to ensure interoperability between 440 products has led to initiatives such as Continua Alliance 441 (continuaalliance.org) and Personal Connected Health Alliance 442 (pchalliance.org). 444 2.3.1. John and the heart rate monitor 446 John has a heart condition that can result in sudden cardiac 447 arrests. He therefore uses a device called HeartGuard that monitors 448 his heart rate and his location (U3.7). In case of a cardiac arrest, 449 it automatically sends an alarm to an emergency service, transmitting 450 John's current location (U3.1). Either the device has long-range 451 connectivity itself (e.g. via GSM) or it uses some intermediary, 452 nearby device (e.g. John's smartphone) to transmit such an alarm. To 453 ensure John's safety, the device is expected to be in constant 454 operation (U3.3, U3.6). 456 The device includes an authentication mechanism, in order to prevent 457 other persons who get physical access to it from acting as the owner 458 and altering the access control and security settings (U3.8). 460 John can configure additional persons that get notified in an 461 emergency, for example his daughter Jill. Furthermore, the device 462 stores data on John's heart rate, which can later be accessed by a 463 physician to assess the condition of John's heart (U3.2). 465 However, John is a privacy-conscious person and is worried that Jill 466 might use HeartGuard to monitor his location while there is no 467 emergency. Furthermore, he doesn't want his health insurance to get 468 access to the HeartGuard data, or even to the fact that he is wearing 469 a HeartGuard, since they might refuse to renew his insurance if they 470 decided he was too big a risk for them (U3.8).
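The kind of selective, emergency-dependent permissions that John needs can be pictured as a small policy table on the device. The following sketch (in Python) is purely illustrative: the principal names, the resource names and the emergency flag are assumptions made for this example and do not describe an actual HeartGuard mechanism.

   # Illustrative sketch only: one possible representation of John's
   # access rules.  The principals, resources and the "emergency"
   # flag are hypothetical.

   RULES = [
       # (principal,         resource,     allowed, emergency_only)
       ("emergency-service", "location",   True,    True),
       ("jill",              "location",   True,    True),
       ("physician",         "heart-rate", True,    False),
       ("insurance",         "heart-rate", False,   False),
   ]

   def is_allowed(principal, resource, emergency):
       """Default deny: grant access only if a matching rule exists
       and, for emergency-only rules, an emergency is ongoing."""
       for rule_principal, rule_resource, allowed, emergency_only in RULES:
           if rule_principal == principal and rule_resource == resource:
               if emergency_only and not emergency:
                   return False
               return allowed
       return False

   # Jill may see John's location during a cardiac arrest, not otherwise.
   assert is_allowed("jill", "location", emergency=True)
   assert not is_allowed("jill", "location", emergency=False)
   # The health insurance never gets access (U3.8).
   assert not is_allowed("insurance", "heart-rate", emergency=False)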
472 Finally John, while being comfortable with modern technology and able 473 to operate it reasonably well, is not trained in computer security. 474 He therefore needs an interface for the configuration of the 475 HeartGuard security that is easy to understand and use (U3.5). If 476 John does not understand the meaning of a setting, he tends to leave 477 it alone, assuming that the manufacturer has initialized the device 478 to secure settings (U3.4). 480 NOTE: Monitoring of some state parameter (e.g. an alarm button) and 481 the position of a person also fits well into an elderly care service. 482 This is particularly useful for people suffering from dementia, where 483 the relatives or caregivers need to be notified of the whereabouts of 484 the person under certain conditions. In this case it is not the 485 patient that decides about access. 487 2.3.2. Authorization Problems Summary 489 o U3.1 The wearer of an eHealth device (John in the example above) 490 wants to pre-configure special access rights in the context of an 491 emergency. 493 o U3.2 The wearer of an eHealth device wants to selectively allow 494 different persons or groups access to medical data. 496 o U3.3 Battery changes are very inconvenient and sometimes 497 impractical, so battery life impacts of the authorization 498 mechanisms need to be minimized. 500 o U3.4 Devices are often used with default access control settings, 501 which might threaten the security objectives of the device's 502 users. 504 o U3.5 Wearers of eHealth devices are often not trained in computer 505 use, and especially not in computer security. 507 o U3.6 Security mechanisms themselves could provide opportunities 508 for denial of service attacks, especially on the constrained 509 devices. 511 o U3.7 The device provides a service that can be fatal for the 512 wearer if it fails. Accordingly, the wearer wants the device to 513 have a high degree of resistance against attacks that may cause 514 the device to fail to operate partially or completely. 516 o U3.8 The wearer of an eHealth device requires the integrity and 517 confidentiality of the data measured by the device. 519 2.4. Building Automation 521 Buildings for commercial use, such as shopping malls or office 522 buildings, are nowadays increasingly equipped with semi-automatic 523 components to enhance the overall quality of life and to save energy 524 where possible. This includes, for example, heating, ventilation and 525 air conditioning (HVAC) as well as illumination and security systems 526 such as fire alarms. 528 Different areas of these buildings are often exclusively leased to 529 different companies. However, they also share some of the common 530 areas of the building. Accordingly, a company must be able to 531 control the light and HVAC system of its own part of the building and 532 must not have access to control rooms that belong to other companies. 534 Some parts of the building automation system, such as entrance 535 illumination and fire alarm systems, are controlled either by all 536 parties together or by a service company. 538 2.4.1. Device Lifecycle 540 2.4.1.1. Installation and Commissioning 542 A building is hired out to different companies for office space. 543 This building features various automated systems, such as a fire 544 alarm system, which is triggered by several smoke detectors that are 545 spread out across the building. It also has automated HVAC, lighting 546 and physical access control systems. 548 A vacant area of the building has been recently leased to Company A.
549 Before moving into its new office, Company A wishes to replace the 550 lighting with more energy-efficient luminaires that provide better 551 light quality. It hires an installation and commissioning company, Company C, to 552 redo the illumination. Company C is instructed to integrate the new 553 lighting devices, which may be from multiple manufacturers, into the 554 existing lighting infrastructure of the building, which includes 555 presence sensors, switches, controllers, etc. (U4.1). 557 Company C gets the necessary authorization from the service company 558 to interact with the existing Building and Lighting Management System 559 (BLMS) (U4.4). To prevent disturbance to other occupants of the 560 building, Company C is provided authorization to perform the 561 commissioning only during non-office hours and only to modify 562 configuration on devices belonging to the domain of Company A's space 563 (U4.5). After installation (wiring) of the new lighting devices, the 564 commissioner adds the devices into Company A's lighting domain. 566 Once the devices are in the correct domain, the commissioner 567 authorizes the interaction rules between the new lighting devices and 568 existing devices like presence sensors (U4.7). For this, the 569 commissioner creates the authorization rules on the BLMS, which define 570 which lights form a group and which sensors/switches/controllers are 571 allowed to control which groups (U4.8). These authorization rules 572 may be context-based, e.g. depending on the time of day (office or non-office 573 hours) or the location of the handheld lighting controller (U4.5). 575 2.4.1.2. Operational 577 Company A's staff move into the newly furnished office space. Most 578 lighting is controlled by presence sensors, which control the lighting 579 of a specific group of lights based on the authorization rules in the 580 BLMS. Additionally, employees are allowed to manually override the 581 lighting brightness and color in their office by using the switches 582 or handheld controllers. Such changes are allowed only if the 583 corresponding authorization rules exist in the BLMS. For example, lighting in the 584 corridors may not be manually adjustable. 586 At the end of the day, lighting is dimmed down or switched off if no 587 occupancy is detected, even if manually overridden during the day. 589 At a later date, Company B also moves into the same building and 590 shares some of the common spaces with Company A (U4.2, U4.9). 592 2.4.1.3. Maintenance 594 Company A's staff are annoyed that the lights switch off too often in 595 their rooms if they work silently in front of their computers. 596 Company A notifies the commissioning Company C about the issue and 597 asks them to increase the delay before the lights switch off (U4.4). 599 Company C again gets the necessary authorization from the service 600 company to interact with the BLMS. The commissioner's tool gets the 601 necessary authorization from the BLMS to send a configuration change to 602 all lighting devices in Company A's offices to increase their delay 603 before they switch off. 605 At some point the service company wants to update the firmware of the 606 lighting devices in order to eliminate software bugs. Before 607 accepting the new firmware, each device checks the authorization of 608 the service company to perform this update. 610 2.4.1.4. Decommissioning 612 Company A has noticed that the handheld controllers are often 613 misplaced and hard to find when needed. So most of the time staff 614 use the existing wall switches for manual control.
Company A decides 615 it would be better to completely remove the handheld controllers and asks 616 Company C to decommission them from the lighting system (U4.4). 618 Company C again gets the necessary authorization from the service 619 company to interact with the BLMS. The commissioner now deletes any 620 rules that gave handheld controllers authorization to control the 621 lighting (U4.3, U4.6). Additionally, the commissioner instructs the 622 BLMS to push these new rules to prevent cached rules at the end 623 devices from being used. 625 2.4.2. Public Safety 627 The fire department requires, as part of the building safety 628 code, that the building have sensors that sense the level of smoke, 629 heat, etc., when a fire breaks out. These sensors report metrics 630 which are then used by a back-end server to map safe areas and 631 unsafe areas within a building and also possibly the structural 632 integrity of the building before firefighters may enter it. 633 Sensors may also be used to track where human/animal activity is 634 within the building. This will allow people stuck within the 635 building to be guided to safer areas and suggest possible actions 636 that they may take (e.g. using a client application on their phones, 637 or loudspeaker directions) in order to bring them to safety. In 638 certain cases, other organizations such as the Police, Ambulance, and 639 federal organizations are also involved, and therefore the 640 coordination of tasks between the various entities has to be carried 641 out using efficient messaging and authorization mechanisms. 643 2.4.2.1. A fire breaks out 645 On a really hot day, James, who works for Company A, turns on the air 646 conditioning in his office. Lucy, who works for Company B, wants to make 647 tea using an electric kettle. After turning it on, she goes 648 outside to talk to a colleague until the water is boiling. 649 Unfortunately, her kettle has a malfunction that causes overheating 650 and results in a smoldering fire of the kettle's plastic case. 652 Due to the smoke coming from the kettle, the fire alarm is triggered. 653 Alarm sirens throughout the building are switched on simultaneously 654 (using a group communication scheme) to alert the staff of both 655 companies (U4.8). Additionally, the ventilation system of the whole 656 building is closed off to prevent the smoke from spreading and to 657 withdraw oxygen from the fire. The smoke cannot get into James' 658 office, although he turned on his air conditioning, because the fire alarm 659 overrides the manual setting by sending commands (using group 660 communication) to switch off all the air conditioning (U4.10). 662 The fire department is notified of the fire automatically and arrives 663 within a short time. They automatically get access to all parts of 664 the building according to an emergency authorization policy (U4.4, 665 U4.5). After inspecting the damage and extinguishing the smoldering 666 fire, a firefighter resets the fire alarm because only the fire 667 department is authorized to do that (U4.4, U4.11). 669 2.4.3. Authorization Problems Summary 671 o U4.1 During commissioning, the building owner or the companies add 672 new devices to their administrative domain. Access control should 673 then apply to these devices seamlessly. 675 o U4.2 During a handover, the building owner or the companies 676 integrate devices that formerly belonged to a different 677 administrative domain into their own administrative domain.
Access 678 control of the old domain should then cease to apply, with access 679 control of the new domain taking over. 681 o U4.3 During decommissioning, the building owner or the companies 682 remove devices from their administrative domain. Access control 683 should cease to apply to these devices, and relevant credentials 684 need to be erased from the devices. 686 o U4.4 The building owner and the companies want to be able to 687 delegate specific access rights for their devices to others. 689 o U4.5 The building owner and the companies want to be able to 690 define context-based authorization rules. 692 o U4.6 The building owner and the companies want to be able to 693 revoke granted permissions and delegations. 695 o U4.7 The building owner and the companies want to allow only authorized 696 entities to send data to their endpoints (default deny). 698 o U4.8 The building owner and the companies want to be able to 699 authorize a device to control several devices at the same time 700 using a group communication scheme. 702 o U4.9 The companies want to be able to interconnect their own 703 subsystems with those from a different operational domain while 704 keeping control over the authorizations (e.g. granting and 705 revoking permissions) for their endpoints and devices. 707 o U4.10 The authorization mechanisms must be able to cope with 708 extremely time-sensitive operations which have to be carried out 709 quickly. 711 o U4.11 The building owner and the public safety authorities want to 712 be able to perform data origin authentication on messages sent and 713 received by some of the systems in the building. 715 2.5. Smart Metering 717 Automated measuring of customer consumption is an established 718 technology for electricity, water, and gas providers. Increasingly, 719 these systems also feature networking capability to allow for remote 720 management. Such systems are in use for commercial, industrial and 721 residential customers and require a certain level of security, in 722 order to avoid economic loss to the providers, vulnerability of the 723 distribution system, as well as disruption of services for the 724 customers. 726 The smart metering equipment for gas and water solutions is battery- 727 driven, and communication should be used sparingly due to battery 728 consumption. Therefore, these types of meters sleep most of the time 729 and only wake up every minute/hour to check for incoming 730 instructions. Furthermore, they wake up a few times a day (based on 731 their configuration) to upload their measured metering data. 733 Different networking topologies exist for smart metering solutions. 734 Based on environment, regulatory rules and expected cost, one or a 735 mixture of these topologies may be deployed to collect the metering 736 information. Drive-by metering is currently one of the most common 737 solutions deployed for collecting data from gas and water meters. 739 Various stakeholders have a claim on the metering data. Utility 740 companies need the data for accounting, the metering equipment may be 741 operated by a third-party service operator who needs to maintain it, 742 and the equipment is installed on the premises of the consumers, 743 measuring their consumption, which entails privacy questions. 745 2.5.1. Drive-by metering 747 A service operator offers smart metering infrastructures and related 748 services to various utility companies. Among these is a water 749 provider, who in turn supplies several residential complexes in a 750 city.
The smart meters are installed in the end customers' homes to 751 measure water consumption and thus generate billing data for the 752 utility company; they can also be used to shut off the water if the 753 bills are not paid (U5.1, U5.3). The meters do so by sending and 754 receiving data to and from a base station (U5.2). Several base 755 stations are installed around the city to collect the metering data. 756 However, in the denser urban areas, the base stations would have to be 757 installed very close to the meters. This would require a high number 758 of base stations and expose this more expensive equipment to 759 manipulation or sabotage. The service operator has therefore chosen 760 another approach, which is to drive around with a mobile base-station 761 and let the meters connect to it at regular intervals in order to 762 gather metering data (U5.4, U5.6, U5.8). 764 2.5.2. Meshed Topology 766 In another deployment, the water meters are installed in a building 767 that already has power meters installed; the latter are mains- 768 powered and are therefore not subject to the same power saving 769 restrictions. The water meters can therefore use the power meters as 770 proxies, in order to achieve better connectivity. This requires the 771 security measures on the water meters to work through intermediaries 772 (U5.9). 774 2.5.3. Advanced Metering Infrastructure 776 A utility company is updating its old utility distribution network 777 with advanced meters and new communication systems, known as an 778 Advanced Metering Infrastructure (AMI). AMI refers to a system that 779 measures, collects and analyzes usage, and interacts with metering 780 devices such as electricity meters, gas meters, heat meters, and 781 water meters, through various communication media either on request 782 (on-demand) or on pre-defined schedules. Based on this technology, 783 new services make it possible for consumers to control their utility 784 consumption (U5.2, U5.7) and reduce costs by supporting new tariff 785 models from utility companies, and more accurate and timely billing. 786 However, the end consumers do not want unauthorized persons to gain 787 access to this data. Furthermore, the fine-grained measurement of 788 consumption data may induce privacy concerns, since it may allow 789 others to create behavioral profiles (U5.5, U5.10). 791 The technical solution is based on levels of data aggregation between 792 smart meters located at the consumer premises and the Meter Data 793 Management (MDM) system located at the utility company (U5.9). For 794 reasons of efficiency and cost, end-to-end connectivity is not always 795 feasible, so metering data is stored and aggregated in various 796 intermediate devices before being forwarded to the utility company, 797 and in turn accessed by the MDM. The intermediate devices may be 798 operated by a third-party service operator on behalf of the utility 799 company (U5.7). One responsibility of the service operator is to 800 make sure that meter readings are performed and delivered in a 801 regular, timely manner. An example of a Service Level Agreement 802 between the service operator and the utility company is "at 803 least 95 % of the meters have readings recorded during the last 72 804 hours". 806 2.5.4. Authorization Problems Summary 808 o U5.1 Devices are installed in hostile environments where they are 809 physically accessible by attackers (including dishonest 810 customers).
The service operator and the utility company want to 811 make sure that an attacker cannot use data from a captured device 812 to attack other parts of their infrastructure. 814 o U5.2 The utility company wants to control which entities are 815 allowed to send data to, and read data from, their endpoints. 817 o U5.3 The utility company wants to ensure the integrity of the data 818 stored on their endpoints. 820 o U5.4 The utility company wants to protect such data transfers to 821 and from their endpoints. 823 o U5.5 Consumers want to access their own usage information and also 824 prevent unauthorized access by others. 826 o U5.6 The devices may have intermittent Internet connectivity but 827 still need to enact the authorization policies of their 828 principals. 830 o U5.7 Neither the service operator nor the utility company is 831 always present at the time of access, and they cannot manually intervene 832 in the authorization process. 834 o U5.8 When authorization policies are updated, it is impossible, or 835 at least very inefficient, to contact all affected endpoints 836 directly. 838 o U5.9 Authorization and authentication must work even if messages 839 between endpoints are stored and forwarded over multiple nodes. 841 o U5.10 Consumers may not want the service operator, the utility 842 company or others to have access to a fine-grained level of 843 consumption data that allows the creation of behavioral profiles. 845 2.6. Sports and Entertainment 847 In the area of leisure-time activities, applications can benefit from 848 the small size and weight of constrained devices. Sensors and 849 actuators with various functions can be integrated into fitness 850 equipment, games and even clothes. Users can carry their devices 851 around with them at all times. 853 Usability is especially important in this area since users will often 854 want to spontaneously interconnect their devices with others. 855 Therefore, the configuration of access permissions must be simple and 856 fast and must not require much effort at the time of access. 858 Continuous monitoring allows authorized users to create behavioral 859 or movement profiles, which corresponds to the devices' intended use; 860 however, unauthorized access to the collected data would allow an attacker 861 to create the same profiles. 862 Moreover, the aggregation of data can seriously increase the impact 863 on the privacy of the users. 865 2.6.1. Dynamically Connecting Smart Sports Equipment 867 Jody is an enthusiastic runner. To keep track of her training 868 progress, she has smart running shoes that measure the pressure at 869 various points beneath her feet to count her steps, detect 870 irregularities in her stride and help her to improve her posture and 871 running style. On a sunny afternoon, she goes to the Finnbahn track 872 near her home to work out. She meets her friend Lynn, who shows her 873 the smart fitness watch she bought a few days ago. The watch can 874 measure the wearer's pulse, show speed and distance, and keep track 875 of the configured training program. The girls discover that the watch 876 can be connected with Jody's shoes and can then additionally display 877 the information the shoes provide. 879 Jody asks Lynn to let her try the watch and lend it to her for the 880 afternoon. Lynn agrees but doesn't want Jody to access her training 881 plan (U6.4).
She configures the access policies for the watch so 882 that Jody's shoes are allowed to access the display and measuring 883 features but cannot read or add training data (U6.1, U6.2). Jody's 884 shoes connect to Lynn's watch after only a press of a button because 885 Jody had already configured access rights for devices that belong to Lynn 886 a while ago (U6.3). Jody wants the device to report the data back to 887 her fitness account while she borrows it, so she allows it to access 888 her account temporarily. 890 After an hour, Jody gives the watch back and both girls terminate the 891 connection between their devices. 893 2.6.2. Authorization Problems Summary 895 o U6.1 Sports equipment owners want to be able to grant access 896 rights dynamically when needed. 898 o U6.2 Sports equipment owners want the configuration of access 899 rights to work with very little effort. 901 o U6.3 Sports equipment owners want to be able to pre-configure 902 access policies that grant certain access permissions to endpoints 903 with certain attributes (e.g. endpoints of a certain user) without 904 additional configuration effort at the time of access. 906 o U6.4 Sports equipment owners want to protect the confidentiality 907 of their data for privacy reasons. 909 2.7. Industrial Control Systems 911 Industrial control systems (ICS) and especially supervisory control 912 and data acquisition systems (SCADA) use a multitude of sensors and 913 actuators in order to monitor and control industrial processes in the 914 physical world. Example processes include manufacturing, power 915 generation, and refining of raw materials. 917 Since the advent of the Stuxnet worm, it has become obvious to the 918 general public how vulnerable these kinds of systems are, especially 919 when connected to the Internet. The severity of these 920 vulnerabilities is exacerbated by the fact that many ICS are used to 921 control critical public infrastructure, such as nuclear power, water 922 treatment or traffic control. Nevertheless, the economic advantages 923 of connecting such systems to the Internet can be significant if 924 appropriate security measures are put in place (U7.5). 926 2.7.1. Oil Platform Control 928 An oil platform uses an industrial control system to monitor data and 929 control equipment. The purpose of this system is to gather and 930 process data from a large number of sensors, and control actuators 931 such as valves and switches to steer the oil extraction process on 932 the platform. Raw data, alarms, reports and other information are 933 also available to the operators, who can intervene with manual 934 commands. Many of the sensors are connected to the controlling units 935 by direct wire, but the operator is slowly replacing these units with 936 wireless ones, since this makes maintenance easier (U7.4). 938 Some of the controlling units are connected to the Internet, to allow 939 for remote administration, since it is expensive and inconvenient to 940 fly in a technician to the platform (U7.3). 942 The main interest of the operator is to ensure the integrity of 943 control messages and sensor readings (U7.1). Access in some cases 944 needs to be restricted, e.g. the operator wants wireless actuators 945 to accept commands only from authorized control units (U7.2). 947 The owner of the platform also wants to collect auditing information 948 for liability reasons (U7.1). 950 Different levels of access apply, e.g. for regular operators vs. 951 maintenance technicians vs.
auditors of the platform (U7.6). 953 2.7.2. Authorization Problems Summary 955 o U7.1 The operator of the platform wants to ensure the integrity 956 and confidentiality of sensor and actuator data. 958 o U7.2 The operator wants to ensure that data coming from sensors 959 and commands sent to actuators are authentic. 961 o U7.3 Some devices do not have a direct Internet connection but 962 still need to implement current authorization policies. 964 o U7.4 Devices need to authenticate the controlling units, 965 especially those using a wireless connection. 967 o U7.5 The execution of unauthorized commands or the failure to 968 execute an authorized command in an ICS can lead to significant 969 financial damage and threaten the availability of critical 970 infrastructure services. Accordingly, the operator wants 971 authentication and authorization mechanisms that provide a very 972 high level of security. 974 o U7.6 Different users should have different levels of access to the 975 control system (e.g. operator vs. auditor). 977 3. Security Considerations 979 As the use cases listed in this document demonstrate, constrained 980 devices are used in various environments. These devices are small 981 and inexpensive, which makes it easy to integrate them into many 982 aspects of everyday life. With access to vast amounts of valuable 983 data and possibly control of important functions, these devices need 984 to be protected from unauthorized access. Protecting seemingly 985 innocuous data and functions will lessen the possible effects of 986 aggregation; attackers collecting data or functions from several 987 sources can gain insights or a level of control not immediately 988 obvious from each of these sources on its own. 990 Not only is the data on the constrained devices themselves 991 threatened; the devices might also be abused as an intrusion point to 992 infiltrate a network. Once an attacker gains control over the 993 device, it can be used to attack other devices as well. Due to their 994 limited capabilities, constrained devices appear as the weakest link 995 in the network and hence pose an attractive target for attackers. 997 This section summarizes the security problems highlighted by the use 998 cases above and provides guidelines for the design of protocols for 999 authentication and authorization in constrained RESTful environments. 1001 3.1. Attacks 1003 This document lists security problems that users of constrained 1004 devices want to solve. Further analysis of attack scenarios is not 1005 in the scope of this document. However, there are attacks that must be 1006 considered by solution developers. 1008 Because of the expected large number of devices and their ubiquity, 1009 constrained devices increase the danger from Pervasive Monitoring 1010 [RFC7258] attacks. 1012 Attacks aiming at altering data in transit (e.g. to perpetrate fraud) 1013 are a problem that is addressed by many security protocols such 1014 as TLS or IPsec. 1015 Developers need to consider this type of attack and make sure that 1016 the protection measures they implement are adapted to the constrained 1017 environment. 1019 As some of the use cases indicate, constrained devices may be 1020 installed in hostile environments where they are physically 1021 accessible (see Section 2.5). Protection from physical attacks is 1022 not in the scope of this document, but should be kept in mind by 1023 developers of authorization solutions.
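One measure that limits the impact of a physically captured device is to provision each device with individual credentials instead of a secret shared by many devices, so that keys extracted from one captured meter (see Section 2.5) cannot be used against the rest of the infrastructure. The following minimal sketch (in Python) only illustrates the idea of key diversification; the master key, the device identifiers and the use of a plain HMAC as derivation function are assumptions made for this example, and a real deployment would rather use a standardized key derivation function and protected key storage.

   # Illustrative sketch only: per-device key diversification, so that
   # keys extracted from one captured meter do not compromise other
   # devices.  A real deployment would use a standardized KDF and
   # hardware-protected storage for the master key.

   import hashlib
   import hmac

   MASTER_KEY = bytes.fromhex(           # held by the back-end only
       "000102030405060708090a0b0c0d0e0f"
       "101112131415161718191a1b1c1d1e1f")

   def device_key(device_id: str) -> bytes:
       """Derive an individual symmetric key for one device."""
       return hmac.new(MASTER_KEY, device_id.encode(),
                       hashlib.sha256).digest()

   # Each meter is provisioned with its own derived key only.
   assert device_key("meter-0001") != device_key("meter-0002")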
1025 Denial of service (DoS) attacks threaten the availability of the services 1026 a device provides, and constrained devices are especially vulnerable 1027 to these types of attacks because of their limitations. Attackers 1028 can cause a temporary or, if the battery is drained, permanent 1029 failure in a service simply by repeatedly flooding the device with 1030 connection attempts; for some services (see Section 2.3), 1031 availability is especially important. 1032 Solution designers must be particularly careful to consider the 1033 following limitations in every part of the authorization solution: 1035 o Battery usage 1037 o Number of required message exchanges 1039 o Size of data that is transmitted (e.g. authentication and access 1040 control data) 1042 o Size of code required to run the protocols 1044 o Amount of RAM and stack memory required to run the protocols 1045 o Timers for transaction processing 1047 Solution developers also need to consider whether the session should 1048 be protected from information disclosure and tampering. 1050 3.2. Configuration of Access Permissions 1052 o The access control policies need to be enforced (all use cases): 1053 The information that is needed to implement the access control 1054 policies needs to be provided to the device that enforces the 1055 authorization and applied to every incoming request. 1057 o A single resource might have different access rights for different 1058 requesting entities (all use cases). 1060 Rationale: In some cases different types of users need different 1061 access rights, as opposed to a binary approach where the same 1062 access permissions are granted to all authenticated users. 1064 o A device might host several resources where each resource has its 1065 own access control policy (all use cases). 1067 o The device that makes the policy decisions should be able to 1068 evaluate context-based permissions such as location or time of 1069 access (see Section 2.2, Section 2.3, Section 2.4). Access may 1070 depend on local conditions, e.g. access to health data in an 1071 emergency. The device that makes the policy decisions should be 1072 able to take such conditions into account (a minimal illustrative sketch of these points follows this list).
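The following minimal sketch (in Python) makes the points above concrete: a default-deny policy table in which a single resource can carry different permissions for different requesting entities, and in which a rule can be restricted by context such as the time of access. The resource names, client identities and time window are hypothetical assumptions made for this example, not a proposed mechanism.

   # Illustrative sketch only: default-deny access control with
   # per-resource, per-client rules and a time-of-day context.

   from datetime import datetime

   POLICY = {
       ("/s/temperature", "control-panel"):
           {"methods": {"GET"}, "hours": None},
       ("/a/valve", "control-panel"):
           {"methods": {"GET", "PUT"}, "hours": None},
       # commissioning tool: write access only during non-office hours
       ("/a/valve", "commissioner"):
           {"methods": {"GET", "PUT"}, "hours": (18, 7)},
   }

   def allowed(resource, client, method, now=None):
       """Default deny: grant only if a rule matches and its context holds."""
       rule = POLICY.get((resource, client))
       if rule is None or method not in rule["methods"]:
           return False
       if rule["hours"] is not None:
           start, end = rule["hours"]
           hour = (now or datetime.now()).hour
           if start <= end:
               in_window = start <= hour < end
           else:                          # window wraps past midnight
               in_window = hour >= start or hour < end
           if not in_window:
               return False
       return True

   assert allowed("/a/valve", "control-panel", "PUT")
   assert not allowed("/a/valve", "visitor", "GET")     # no rule: deny
   assert allowed("/a/valve", "commissioner", "PUT",
                  now=datetime(2015, 10, 2, 22, 0))
   assert not allowed("/a/valve", "commissioner", "PUT",
                      now=datetime(2015, 10, 2, 10, 0))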
1074 3.3. Authorization Considerations 1076 o Devices need to be enabled to enforce authorization policies 1077 without human intervention at the time of the access request (see 1078 Section 2.1, Section 2.2, Section 2.4, Section 2.5). 1080 o Authorization solutions need to consider that constrained devices 1081 might not have Internet access at the time of the access request 1082 (see Section 2.1, Section 2.3, Section 2.5, Section 2.6). 1084 o It should be possible to update access control policies without 1085 manually re-provisioning individual devices (see Section 2.2, 1086 Section 2.3, Section 2.5, Section 2.6). 1088 Rationale: Peers can change rapidly, which makes manual re- 1089 provisioning unreasonably expensive. 1091 o Authorization policies may be defined to apply to a large number 1092 of devices that might only have intermittent connectivity. 1094 Distributing policy updates to every device for every update might 1095 not be a feasible solution (see Section 2.5). 1097 o It must be possible to dynamically revoke authorizations (see e.g. 1098 Section 2.4; an illustrative sketch of one possible approach follows this list). 1100 o The authentication and access control protocol can put undue 1101 burden on the constrained system resources of a device 1102 participating in the protocol. An authorization solution must 1103 take the limitations of the constrained devices into account (all 1104 use cases, see also Section 3.1). 1106 o Secure default settings are needed for the initial state of the 1107 authentication and authorization protocols (all use cases). 1109 Rationale: Many attacks exploit insecure default settings, and 1110 experience shows that default settings are frequently left 1111 unchanged by the end users. 1113 o Access to resources on other devices should only be permitted if a 1114 rule exists that explicitly allows this access (default deny) (see 1115 e.g. Section 2.4). 1117 o Usability is important for all use cases. The configuration of 1118 authorization policies as well as gaining access to devices 1119 must be simple for the users of the devices. Special care needs 1120 to be taken for scenarios where access control policies have to be 1121 configured by users who are typically not trained in security 1122 (see Section 2.2, Section 2.3, Section 2.6).
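Several of the points above (enforcement without human intervention, updating permissions without re-provisioning each device, and limiting the lifetime of granted rights) can be approached by letting a less constrained authorization server issue short-lived, integrity-protected access tokens that a constrained device verifies locally. The following minimal sketch (in Python) only illustrates this idea; the token format, the shared key and the field names are assumptions made for this example and are not the ACE solution.

   # Illustrative sketch only: a short-lived, HMAC-protected access
   # token that a constrained device can verify locally, without
   # contacting the authorization server at the time of the request.

   import hashlib
   import hmac
   import json
   import time

   SHARED_KEY = b"key-provisioned-at-commissioning"   # assumption

   def issue_token(client, resource, methods, lifetime_s):
       """Run by the (less constrained) authorization server."""
       claims = {"client": client, "resource": resource,
                 "methods": methods,
                 "exp": int(time.time()) + lifetime_s}
       payload = json.dumps(claims, sort_keys=True).encode()
       tag = hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()
       return payload, tag

   def verify_token(payload, tag, client, resource, method, now=None):
       """Run by the constrained device: default deny on any failure."""
       expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()
       if not hmac.compare_digest(tag, expected):
           return False
       claims = json.loads(payload)
       if (now or time.time()) > claims["exp"]:
           return False                   # expired: access was temporary
       return (claims["client"] == client and
               claims["resource"] == resource and
               method in claims["methods"])

   payload, tag = issue_token("parents-phone", "/lock/frontdoor",
                              ["PUT"], lifetime_s=4 * 3600)
   assert verify_token(payload, tag, "parents-phone",
                       "/lock/frontdoor", "PUT")
   assert not verify_token(payload, tag, "parents-phone",
                           "/lock/frontdoor", "PUT",
                           now=time.time() + 5 * 3600)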
1124 3.4. Proxies 1126 In some cases, the traffic between endpoints might go through 1127 intermediary nodes (e.g. proxies, gateways). This might affect the 1128 function or the security model of authentication and access control 1129 protocols, e.g. end-to-end security between endpoints with DTLS 1130 [RFC6347] might not be possible (see Section 2.5). 1132 4. Privacy Considerations 1133 Many of the devices that are in the focus of this document register data 1134 from the physical world (sensors) or affect processes in the physical 1135 world (actuators), which may involve data or processes belonging to 1136 individuals. To make matters worse, the sensor data may be recorded 1137 continuously, thus making it possible to gather significant information about an 1138 individual subject through the sensor readings. Therefore, privacy 1139 protection is especially important, and authentication and access 1140 control are important tools for this, since they make it possible to 1141 control who gets access to private data. 1143 Privacy protection can also be weighed in when evaluating the need 1144 for end-to-end confidentiality, since otherwise intermediary nodes 1145 will learn the content of potentially sensitive messages sent between 1146 endpoints and thereby threaten the privacy of the individual that may 1147 be the subject of this data. 1149 In some cases, even the possession of a certain type of device can be 1150 confidential, e.g. individuals might not want others to know that 1151 they are wearing a certain medical device (see Section 2.3). 1153 The personal health monitoring use case (see Section 2.3) indicates 1154 the need for secure audit logs, which impose specific requirements on 1155 a solution. 1156 Auditing is not in the scope of ACE. However, if an authorization 1157 solution provides means for audit logs, it must consider the impact 1158 of the logged data on the privacy of all parties involved. Suitable 1159 measures for protecting and purging the logs must be taken during 1160 operation, maintenance and decommissioning of the device. 1162 5. Acknowledgments 1164 The authors would like to thank Olaf Bergmann, Sumit Singhal, John 1165 Mattson, Mohit Sethi, Carsten Bormann, Martin Murillo, Corinna 1166 Schmitt, Hannes Tschofenig, Erik Wahlstroem, Andreas Baeckman, Samuel 1167 Erdtman, Steve Moore, Thomas Hardjono, Kepeng Li and Jim Schaad for 1168 reviewing and/or contributing to the document. Also, thanks to 1169 Markus Becker, Thomas Poetsch and Koojana Kuladinithi for their input 1170 on the container monitoring use case. Furthermore, the authors thank 1171 Akbar Rahman, Chonggang Wang, and Vinod Choyi, who contributed the 1172 public safety scenario in the building automation use case. 1174 Ludwig Seitz and Goeran Selander worked on this document as part of 1175 EIT-ICT Labs activity PST-14056. 1177 6. IANA Considerations 1179 This document has no IANA actions. 1181 7. Informative References 1183 [Jedermann14] 1184 Jedermann, R., Poetsch, T., and C. LLoyd, "Communication 1185 techniques and challenges for wireless food quality 1186 monitoring", Philosophical Transactions of the Royal 1187 Society A Mathematical, Physical and Engineering Sciences, 1188 May 2014. 1190 [RFC6347] Rescorla, E. and N. Modadugu, "Datagram Transport Layer 1191 Security Version 1.2", RFC 6347, DOI 10.17487/RFC6347, 1192 January 2012, <http://www.rfc-editor.org/info/rfc6347>. 1194 [RFC7228] Bormann, C., Ersue, M., and A. Keranen, "Terminology for 1195 Constrained-Node Networks", RFC 7228, DOI 10.17487/ 1196 RFC7228, May 2014, 1197 <http://www.rfc-editor.org/info/rfc7228>. 1199 [RFC7252] Shelby, Z., Hartke, K., and C. Bormann, "The Constrained 1200 Application Protocol (CoAP)", RFC 7252, DOI 10.17487/ 1201 RFC7252, June 2014, 1202 <http://www.rfc-editor.org/info/rfc7252>. 1204 [RFC7258] Farrell, S. and H. Tschofenig, "Pervasive Monitoring Is an 1205 Attack", BCP 188, RFC 7258, DOI 10.17487/RFC7258, May 1206 2014, <http://www.rfc-editor.org/info/rfc7258>. 1208 Authors' Addresses 1210 Ludwig Seitz (editor) 1211 SICS Swedish ICT AB 1212 Scheelevaegen 17 1213 Lund 223 70 1214 Sweden 1216 Email: ludwig@sics.se 1218 Stefanie Gerdes (editor) 1219 Universitaet Bremen TZI 1220 Postfach 330440 1221 Bremen 28359 1222 Germany 1224 Phone: +49-421-218-63906 1225 Email: gerdes@tzi.org 1226 Goeran Selander 1227 Ericsson 1228 Faroegatan 6 1229 Kista 164 80 1230 Sweden 1232 Email: goran.selander@ericsson.com 1234 Mehdi Mani 1235 Itron 1236 52, rue Camille Desmoulins 1237 Issy-les-Moulineaux 92130 1238 France 1240 Email: Mehdi.Mani@itron.com 1242 Sandeep S. Kumar 1243 Philips Research 1244 High Tech Campus 1245 Eindhoven 5656 AA 1246 The Netherlands 1248 Email: sandeep.kumar@philips.com