2 ACE Working Group L. Seitz, Ed. 3 Internet-Draft SICS Swedish ICT AB 4 Intended status: Informational S. Gerdes, Ed. 5 Expires: March 4, 2016 Universitaet Bremen TZI 6 G. Selander 7 Ericsson 8 M. Mani 9 Itron 10 S. Kumar 11 Philips Research 12 September 01, 2015 14 ACE use cases 15 draft-ietf-ace-usecases-05 17 Abstract 19 Constrained devices are nodes with limited processing power, storage 20 space and transmission capacities. These devices in many cases do 21 not provide user interfaces and are often intended to interact 22 without human intervention. 24 This document comprises a collection of representative use cases for 25 the application of authentication and authorization in constrained 26 environments. These use cases aim at identifying authorization 27 problems that arise during the lifecycle of a constrained device and 28 are intended to provide a guideline for developing a comprehensive 29 authentication and authorization solution for this class of 30 scenarios. 32 Where specific details are relevant, it is assumed that the devices 33 use the Constrained Application Protocol (CoAP) as the communication 34 protocol; however, most conclusions apply generally. 36 Status of This Memo 38 This Internet-Draft is submitted in full conformance with the 39 provisions of BCP 78 and BCP 79. 41 Internet-Drafts are working documents of the Internet Engineering 42 Task Force (IETF). Note that other groups may also distribute 43 working documents as Internet-Drafts. The list of current Internet- 44 Drafts is at http://datatracker.ietf.org/drafts/current/. 46 Internet-Drafts are draft documents valid for a maximum of six months 47 and may be updated, replaced, or obsoleted by other documents at any 48 time. It is inappropriate to use Internet-Drafts as reference 49 material or to cite them other than as "work in progress." 51 This Internet-Draft will expire on March 4, 2016. 53 Copyright Notice 55 Copyright (c) 2015 IETF Trust and the persons identified as the 56 document authors. All rights reserved.
58 This document is subject to BCP 78 and the IETF Trust's Legal 59 Provisions Relating to IETF Documents 60 (http://trustee.ietf.org/license-info) in effect on the date of 61 publication of this document. Please review these documents 62 carefully, as they describe your rights and restrictions with respect 63 to this document. Code Components extracted from this document must 64 include Simplified BSD License text as described in Section 4.e of 65 the Trust Legal Provisions and are provided without warranty as 66 described in the Simplified BSD License. 68 Table of Contents 70 1. Introduction . . . . . . . . . . . . . . . . . . . . . . . . 3 71 1.1. Terminology . . . . . . . . . . . . . . . . . . . . . . . 4 72 2. Use Cases . . . . . . . . . . . . . . . . . . . . . . . . . . 4 73 2.1. Container monitoring . . . . . . . . . . . . . . . . . . 4 74 2.1.1. Bananas for Munich . . . . . . . . . . . . . . . . . 5 75 2.1.2. Authorization Problems Summary . . . . . . . . . . . 6 76 2.2. Home Automation . . . . . . . . . . . . . . . . . . . . . 7 77 2.2.1. Controlling the Smart Home Infrastructure . . . . . . 7 78 2.2.2. Seamless Authorization . . . . . . . . . . . . . . . 8 79 2.2.3. Remotely letting in a visitor . . . . . . . . . . . . 8 80 2.2.4. Selling the house . . . . . . . . . . . . . . . . . . 8 81 2.2.5. Authorization Problems Summary . . . . . . . . . . . 8 82 2.3. Personal Health Monitoring . . . . . . . . . . . . . . . 10 83 2.3.1. John and the heart rate monitor . . . . . . . . . . . 10 84 2.3.2. Authorization Problems Summary . . . . . . . . . . . 11 85 2.4. Building Automation . . . . . . . . . . . . . . . . . . . 12 86 2.4.1. Device Lifecycle . . . . . . . . . . . . . . . . . . 12 87 2.4.2. Public Safety . . . . . . . . . . . . . . . . . . . . 14 88 2.4.3. Authorization Problems Summary . . . . . . . . . . . 15 89 2.5. Smart Metering . . . . . . . . . . . . . . . . . . . . . 16 90 2.5.1. Drive-by metering . . . . . . . . . . . . . . . . . . 16 91 2.5.2. Meshed Topology . . . . . . . . . . . . . . . . . . . 17 92 2.5.3. Advanced Metering Infrastructure . . . . . . . . . . 17 93 2.5.4. Authorization Problems Summary . . . . . . . . . . . 18 95 2.6. Sports and Entertainment . . . . . . . . . . . . . . . . 19 96 2.6.1. Dynamically Connecting Smart Sports Equipment . . . . 19 97 2.6.2. Authorization Problems Summary . . . . . . . . . . . 20 98 2.7. Industrial Control Systems . . . . . . . . . . . . . . . 20 99 2.7.1. Oil Platform Control . . . . . . . . . . . . . . . . 21 100 2.7.2. Authorization Problems Summary . . . . . . . . . . . 21 101 3. Security Considerations . . . . . . . . . . . . . . . . . . . 21 102 3.1. Attacks . . . . . . . . . . . . . . . . . . . . . . . . . 22 103 3.2. Configuration of Access Permissions . . . . . . . . . . . 23 104 3.3. Authorization Considerations . . . . . . . . . . . . . . 23 105 3.4. Proxies . . . . . . . . . . . . . . . . . . . . . . . . . 24 106 4. Privacy Considerations . . . . . . . . . . . . . . . . . . . 25 107 5. Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . 25 108 6. IANA Considerations . . . . . . . . . . . . . . . . . . . . . 25 109 7. Informative References . . . . . . . . . . . . . . . . . . . 26 110 Authors' Addresses . . . . . . . . . . . . . . . . . . . . . . . 26 112 1. Introduction 114 Constrained devices [RFC7228] are nodes with limited processing 115 power, storage space and transmission capacities. These devices are 116 often battery-powered and in many cases do not provide user 117 interfaces. 119 Constrained devices benefit from being interconnected using Internet 120 protocols. However, due to the devices' limitations, commonly used 121 security protocols are not always easily applicable. As the devices 122 are expected to be integrated in all aspects of everyday life, the 123 application of adequate security mechanisms is required to prevent 124 attackers from gaining control over data or functions important to 125 our lives. 127 This document comprises a collection of representative use cases for 128 the application of authentication and authorization in constrained 129 environments. These use cases aim at identifying authorization 130 problems that arise during the lifecycle of a constrained device. 131 Note that this document does not aim at collecting all possible use 132 cases. 134 We assume that the communication between the devices is based on the 135 Representational State Transfer (REST) architectural style, i.e. a 136 device acts as a server that offers resources such as sensor data and 137 actuators. The resources can be accessed by clients, sometimes 138 without human intervention (M2M). In some situations the 139 communication will happen through intermediaries (e.g. gateways, 140 proxies). 142 Where specific detail is necessary it is assumed that the devices 143 communicate using CoAP [RFC7252], although most conclusions are 144 generic.
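The following sketch illustrates the interaction model assumed above: a constrained node acts as a CoAP server that exposes a sensor value as a resource, and a client retrieves it with a GET request. The sketch uses Python with the aiocoap library; the host name and resource path are chosen for illustration only, and the example deliberately omits any authentication or authorization mechanisms, which are the subject of the remainder of this document.

   # Minimal CoAP client sketch (Python, aiocoap); host and resource
   # path are illustrative placeholders.
   import asyncio
   from aiocoap import Context, Message, GET

   async def main():
       # The client context handles the CoAP message exchange.
       protocol = await Context.create_client_context()
       # The constrained device acts as a server offering its sensor
       # reading as a resource, here assumed to be /temperature.
       request = Message(code=GET,
                         uri="coap://sensor.example.com/temperature")
       response = await protocol.request(request).response
       print("Response code:", response.code)
       print("Payload:", response.payload.decode())

   asyncio.run(main())

In the use cases below, the authorization question is precisely who may issue such requests to which resources, and under which conditions.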
146 1.1. Terminology 148 Readers are required to be familiar with the terms defined in 149 [RFC7228]. In addition, this document uses the following 150 terminology: 152 2. Use Cases 154 This section lists use cases involving constrained devices with 155 certain authorization problems to be solved. Each use case first 156 presents a general description of the application area, then one or 157 more specific use cases, and finally a summary of the authorization- 158 related problems that users need to have solved. 160 There are various reasons for assigning a function (client or server) 161 to a device, e.g. which device initiates the conversation, how 162 devices find each other, etc. The definition of the function of a 163 device in a certain use case is not in scope of this document. 164 Readers should be aware that there might be reasons for each setting 165 and that endpoints might even have different functions at different 166 times. 168 2.1. Container monitoring 170 The ability of sensors to communicate environmental data wirelessly 171 opens up new application areas. The use of such sensor systems makes 172 it possible to continuously track and transmit specific 173 characteristics such as temperature, humidity and gas content during 174 the transportation and storage of goods. 176 The proper handling of the sensors in this scenario is not easy to 177 accomplish. They have to be associated with the appropriate pallet of 178 the respective container. Moreover, the goods and the corresponding 179 sensors belong to specific customers. 181 During the shipment to their destination the goods often pass stops 182 where they are transloaded to other means of transportation, e.g. 183 from ship transport to road transport. 185 The transportation and storage of perishable goods are especially 186 challenging since they have to be stored at a constant temperature 187 and with proper ventilation. Additionally, it is very important for 188 the vendors to be informed about irregularities in the temperature 189 and ventilation of fruits to avoid the delivery of decomposed fruits 190 to their customers.
Real-time information on the state of the goods 191 is needed for the transporter in order to prioritize goods that will 192 expire soon. 193 Furthermore the vendor also wants this type of information in real- 194 time, in order to be able to react when goods are spoiled and to be 195 able to still fulfill delivery obligations. 197 The need for a constant monitoring of perishable goods has led to 198 projects such as The Intelligent Container (http:// 199 www.intelligentcontainer.com). 201 2.1.1. Bananas for Munich 203 A fruit vendor grows bananas in Costa Rica for the German market. It 204 instructs a transport company to deliver the goods via ship to 205 Rotterdam where they are picked up by trucks and transported to a 206 ripening facility. A Munich supermarket chain buys ripened bananas 207 from the fruit vendor and transports them from the ripening facility 208 to the individual markets with their own company trucks. 210 The fruit vendor's quality management wants to assure the quality of 211 their products and thus equips the banana boxes with sensors. The 212 state of the goods is monitored consistently during shipment and 213 ripening and abnormal sensor values are recorded (U1.2). 214 Additionally, the sensor values are used to control the climate 215 within the cargo containers (U1.1, U1.5, U1.7). The sensors 216 therefore need to communicate with the climate control system. Since 217 a wrong sensor value leads to a wrong temperature and thus to spoiled 218 goods, the integrity of the sensor data must be assured (U1.2, U1.3). 219 The banana boxes within a container will in most cases belong to the 220 same owner. Adjacent containers might contain goods and sensors of 221 different owners (U1.1). 223 The personnel that transloads the goods must be able to locate the 224 goods meant for a specific customer (U1.1, U1.6, U1.7). However the 225 fruit vendor does not want to disclose sensor information pertaining 226 to the condition of the goods to other companies and therefore wants 227 to assure the confidentiality of this data (U1.4). Thus, the 228 transloading personnel is only allowed to access logistic information 229 (U1.1). Moreover, the transloading personnel is only allowed to 230 access the data for the time of the transloading (U1.8). 232 Due to the high water content of the fruits, the propagation of radio 233 waves is hindered, thus often inhibiting direct communication between 234 nodes [Jedermann14]. Instead, messages are forwarded over multiple 235 hops (U1.9). The sensors in the banana boxes cannot always reach the 236 Internet during the journey (U1.10). Sensors may need to use relay 237 stations owned by the transport company to connect to endpoints in 238 the Internet. 240 In the ripening facility bananas are stored until they are ready for 241 selling. The banana box sensors are used to control the ventilation 242 system and to monitor the degree of ripeness of the bananas. Ripe 243 bananas need to be identified and sold before they spoil (U1.2, 244 U1.8). 246 The supermarket chain gains ownership of the banana boxes when the 247 bananas have ripened and are ready to leave the ripening facility. 249 2.1.2. Authorization Problems Summary 251 o U1.1 Fruit vendors, transloading personnel and container owners 252 want to grant different authorizations for their resources and/or 253 endpoints to different parties. 
255 o U1.2 The fruit vendor requires the integrity and authenticity of 256 the sensor data that pertains to the state of the goods for climate 257 control and to ensure the quality of the monitored recordings. 259 o U1.3 The container owner requires the integrity and authenticity 260 of the sensor data that is used for climate control. 262 o U1.4 The fruit vendor requires the confidentiality of the sensor 263 data that pertains to the state of the goods and the confidentiality 264 of location data, e.g., to protect them from targeted attacks from 265 competitors. 267 o U1.5 The fruit vendor may have several types of data that may be 268 controlled by the same endpoint, e.g., sensor data and the data 269 used for logistics. 271 o U1.6 The fruit vendor and the transloading personnel require the 272 authenticity and integrity of the data that is used to locate the 273 goods, in order to ensure that the goods are correctly treated and 274 delivered. 276 o U1.7 The container owner and the fruit vendor may not be present 277 at the time of access and cannot manually intervene in the 278 authorization process. 280 o U1.8 The fruit vendor, container owner and transloading company 281 want to grant temporary access permissions to a party, in order to 282 avoid giving permanent access to parties that are no longer 283 involved in processing the bananas. 285 o U1.9 The fruit vendor, container owner and transloading company 286 want their security objectives to be achieved, even if the 287 messages between the endpoints need to be forwarded over multiple 288 hops. 290 o U1.10 The constrained devices might not always be able to reach 291 the Internet but still need to enact the authorization policies of 292 their principals. 294 o U1.11 Fruit vendors and container owners want to be able to revoke 295 authorization on a malfunctioning sensor. 297 2.2. Home Automation 299 Automation of the home has the potential to become a big future 300 market for the Internet of Things. One function of a home automation 301 system can be to connect devices in a house to the Internet and thus 302 make them accessible and manageable remotely. Such devices might 303 control for example heating, ventilation, lighting, home 304 entertainment or home security. 306 Such a system needs to accommodate a number of regular users 307 (inhabitants, close friends, cleaning personnel) as well as a 308 heterogeneous group of dynamically varying users (visitors, 309 repairmen, delivery men). 311 As the users are not typically trained in security (or even computer 312 use), the configuration must use secure default settings, and the 313 interface must be well adapted to novice users. 315 2.2.1. Controlling the Smart Home Infrastructure 317 Alice and her husband Bob own a flat which is equipped with home 318 automation devices such as HVAC and shutter control, and they have a 319 motion sensor in the corridor which controls the light bulbs there 320 (U2.5). 322 Alice and Bob can control the shutters and the temperature in each 323 room using either wall-mounted touch panels or an internet connected 324 device (e.g. a smartphone). Since Alice and Bob both have full- 325 time jobs, they want to be able to change settings remotely, e.g. turn 326 up the heating on a cold day if they will be home earlier than 327 expected (U2.5). 329 The couple does not want people in radio range of their devices, e.g. 330 their neighbors, to be able to control them without authorization.
331 Moreover, they don't want burglars to be able to deduce behavioral 332 patterns from eavesdropping on the network (U2.8). 334 2.2.2. Seamless Authorization 336 Alice buys a new light bulb for the corridor and integrates it into 337 the home network, i.e. makes its resources known to other devices in the 338 network. Alice makes sure that the new light bulb and her other 339 devices in the network get to know the authorization policies for the 340 new device. Bob is not at home, but Alice wants him to be able to 341 control the new device with his devices (e.g. his smartphone) without 342 the need for additional administration effort (U2.7). She provides 343 the necessary configurations for that (U2.9, U2.10). 345 2.2.3. Remotely letting in a visitor 347 Alice and Bob have equipped their home with automated connected door- 348 locks and an alarm system at the door and the windows. The couple 349 can control this system remotely. 351 Alice and Bob have invited Alice's parents over for dinner, but are 352 stuck in traffic and cannot arrive in time, while Alice's parents who 353 use the subway will arrive punctually. Alice calls her parents and 354 offers to let them in remotely, so they can make themselves 355 comfortable while waiting (U2.1, U2.6). Then Alice sets temporary 356 permissions that allow them to open the door, and shut down the alarm 357 (U2.2). She wants these permissions to be valid only for the evening 358 since she does not like it if her parents are able to enter the house 359 as they see fit (U2.3, U2.4). 361 When Alice's parents arrive at Alice's and Bob's home, they use their 362 smartphone to communicate with the door-lock and alarm system (U2.5, 363 U2.9). 365 2.2.4. Selling the house 367 Alice and Bob have to move because Alice is starting a new job. They 368 therefore decide to sell the house, and transfer control of all 369 automated services to the new owners (U2.11). Before doing that they 370 want to erase privacy-relevant data from the logs of the automated 371 systems, while the new owner is interested in keeping some historic 372 data, e.g. pertaining to the behavior of the heating system (U2.12). 374 2.2.5. Authorization Problems Summary 375 o U2.1 A home owner (Alice and Bob in the example above) wants to 376 spontaneously provision authorization means to visitors. 378 o U2.2 A home owner wants to spontaneously change the home's access 379 control policies. 381 o U2.3 A home owner wants to apply different access rights for 382 different users. 384 o U2.4 The home owners want to grant access permissions to a party 385 for a specified time frame. 387 o U2.5 The smart home devices need to be able to communicate with 388 different control devices (e.g. wall-mounted touch panels, 389 smartphones, electronic key fobs). 391 o U2.6 The home owner wants to be able to configure authorization 392 policies remotely. 394 o U2.7 Authorized users want to be able to obtain access with little 395 effort. 397 o U2.8 The owners of the automated home want to prevent unauthorized 398 entities from being able to deduce behavioral profiles from 399 devices in the home network. 401 o U2.9 Usability is particularly important in this scenario since 402 the necessary authorization-related tasks in the lifecycle of the 403 device (commissioning, operation, maintenance and decommissioning) 404 likely need to be performed by the home owners who in most cases 405 have little knowledge of security.
407 o U2.10 Home owners want their devices to seamlessly (and in some 408 cases even unnoticeably) fulfill their purpose. The 409 administration effort needs to be kept at a minimum. 411 o U2.11 Home owners want to be able to transfer ownership of their 412 automated systems when they sell the house. 414 o U2.12 Home owners want to be able to sanitize the logs of the 415 automated systems, when transferring ownership, without deleting 416 important operational data. 418 2.3. Personal Health Monitoring 420 The use of wearable health monitoring technology is expected to grow 421 strongly, as a multitude of novel devices are developed and marketed. 422 The need for open industry standards to ensure interoperability 423 between products has led to initiatives such as Continua Alliance 424 (continuaalliance.org) and Personal Connected Health Alliance 425 (pchalliance.org). Personal health devices are typically battery 426 driven, and located physically on, or in, the user. They monitor 427 some bodily function, such as temperature, blood pressure, or 428 pulse. They are connected to the Internet through an intermediary 429 base-station, using wireless technologies. Through this connection 430 they report the monitored data to some entity, which may either be 431 the user herself, or some medical personnel in charge of the user. 433 Medical data has always been considered very sensitive, and 434 therefore requires good protection against unauthorized disclosure. 435 A frequent, conflicting requirement is the capability for medical 436 personnel to gain emergency access, even if no specific access rights 437 exist. As a result, the importance of secure audit logs increases in 438 such scenarios. 440 Since the users are not typically trained in security (or even 441 computer use), the configuration must use secure default settings, 442 and the interface must be well adapted to novice users. Parts of the 443 system must operate with minimal maintenance. In particular, frequent 444 changes of battery are unacceptable. 446 2.3.1. John and the heart rate monitor 448 John has a heart condition that can result in sudden cardiac 449 arrest. He therefore uses a device called HeartGuard that monitors 450 his heart rate and his location (U3.7). In case of a cardiac arrest 451 it automatically sends an alarm to an emergency service, transmitting 452 John's current location (U3.1). This requires the device to be close 453 to a wireless access point (e.g. John's smartphone), in order to be 454 able to get an Internet connection. To ensure John's safety, the 455 device is expected to be in constant operation (U3.3, U3.6). 457 The device includes some authentication mechanism, in order to 458 prevent other persons who get physical access to it from acting as 459 the owner and messing up the access control and security settings 460 (U3.8). 462 John can configure additional persons that get notified in an 463 emergency, for example his daughter Jill. Furthermore the device 464 stores data on John's heart rate, which can later be accessed by a 465 physician to assess the condition of John's heart (U3.2). 467 However John is a privacy conscious person, and is worried that Jill 468 might use HeartGuard to monitor his location while there is no 469 emergency. Furthermore he doesn't want his health insurance to get 470 access to the HeartGuard data, or even to the fact that he is wearing 471 a HeartGuard, since they might refuse to renew his insurance if they 472 decided he was too big a risk for them (U3.8).
474 Finally John, while being comfortable with modern technology and able 475 to operate it reasonably well, is not trained in computer security. 476 He therefore needs an interface for the configuration of the 477 HeartGuard security that is easy to understand and use (U3.5). If 478 John does not understand the meaning of a setting, he tends to leave 479 it alone, assuming that the manufacturer has initialized the device 480 to secure settings (U3.4). 482 NOTE: Monitoring of some state parameter (e.g. an alarm button) and 483 the position of a person also fits well into an elderly care service. 484 This is particularly useful for people suffering from dementia, where 485 the relatives or caregivers need to be notified of the whereabouts of 486 the person under certain conditions. In this case it is not the 487 patient that decides about access. 489 2.3.2. Authorization Problems Summary 491 o U3.1 The wearer of an eHealth device (John in the example above) 492 wants to pre-configure special access rights in the context of an 493 emergency. 495 o U3.2 The wearer of an eHealth device wants to selectively allow 496 different persons or groups access to medical data. 498 o U3.3 Security measures could affect the battery lifetime of the 499 device, and changing the battery is very inconvenient. 501 o U3.4 Devices are often used with default access control settings 502 which might threaten the security objectives of the device's 503 users. 505 o U3.5 Wearers of eHealth devices are often not trained in computer 506 use, and especially computer security. 508 o U3.6 Security mechanisms themselves could provide opportunities 509 for denial of service attacks, especially on the constrained 510 devices. 512 o U3.7 The device provides a service whose failure can be fatal for 513 the wearer. Accordingly, the wearer wants the device to 514 have a high degree of resistance against attacks that may cause 515 the device to fail to operate partially or completely. 517 o U3.8 The wearer of an eHealth device requires the integrity and 518 confidentiality of the data measured by the device. 520 2.4. Building Automation 522 Buildings for commercial use such as shopping malls or office 523 buildings are nowadays increasingly equipped with semi-automatic 524 components to enhance the overall living quality and to save energy 525 where possible. This includes for example heating, ventilation and 526 air conditioning (HVAC) as well as illumination and security systems 527 such as fire alarms. 529 Different areas of these buildings are often exclusively leased to 530 different companies. However they also share some of the common 531 areas of the building. Accordingly, a company must be able to 532 control the light and HVAC system of its own part of the building and 533 must not have access to control rooms that belong to other companies. 535 Some parts of the building automation system such as entrance 536 illumination and fire alarm systems are controlled either by all 537 parties together or by a service company. 539 2.4.1. Device Lifecycle 541 2.4.1.1. Installation and Commissioning 543 A building is hired out to different companies for office space. 544 This building features various automated systems, such as a fire 545 alarm system, which is triggered by several smoke detectors which are 546 spread out across the building. It also has automated HVAC, lighting 547 and physical access control systems. 549 A vacant area of the building has been recently leased to company A.
550 Before moving into its new office, Company A wishes to replace 551 the lighting with more energy-efficient luminaires that provide better 552 light quality. They hire an installation and commissioning company C to 553 redo the illumination. Company C is instructed to integrate the new 554 lighting devices, which may be from multiple manufacturers, into the 555 existing lighting infrastructure of the building which includes 556 presence sensors, switches, controllers etc. (U4.1). 558 Company C gets the necessary authorization from the service company 559 to interact with the existing Building and Lighting Management System 560 (BLMS) (U4.4). To prevent disturbance to other occupants of the 561 building, Company C is provided authorization to perform the 562 commissioning only during non-office hours and only to modify 563 configuration on devices belonging to the domain of Company A's space 564 (U4.5). After installation (wiring) of the new lighting devices, the 565 commissioner adds the devices into Company A's lighting domain. 567 Once the devices are in the correct domain, the commissioner 568 authorizes the interaction rules between the new lighting devices and 569 existing devices like presence sensors (U4.7). For this, the 570 commissioner creates the authorization rules on the BLMS which define 571 which lights form a group and which sensors/switches/controllers are 572 allowed to control which groups (U4.8). These authorization rules 573 may be context-based, e.g. depending on the time of day (office or 574 non-office hours) or the location of the handheld lighting controller 575 (U4.5). 576 2.4.1.2. Operational 578 Company A's staff move into the newly furnished office space. Most 579 lighting is controlled by presence sensors which control the lighting 580 of specific groups of lights based on the authorization rules in the 581 BLMS. Additionally employees are allowed to manually override the 582 lighting brightness and color in their office by using the switches 583 or handheld controllers. Such changes are allowed only if the 584 authorization rules exist in the BLMS. For example lighting in the 585 corridors may not be manually adjustable. 587 At the end of the day, lighting is dimmed down or switched off if no 588 occupancy is detected, even if manually overridden during the day. 590 On a later date company B also moves into the same building, and 591 shares some of the common spaces with company A (U4.2, U4.9). 593 2.4.1.3. Maintenance 595 Company A's staff are annoyed that the lights switch off too often in 596 their rooms if they work silently in front of their computers. 597 Company A notifies the commissioning Company C about the issue and 598 asks them to increase the delay before lights switch off (U4.4). 600 Company C again gets the necessary authorization from the service 601 company to interact with the BLMS. The commissioner's tool gets the 602 necessary authorization from the BLMS to send a configuration change to 603 all lighting devices in Company A's offices to increase their delay 604 before they switch off. 606 At some point the service company wants to update the firmware of 607 lighting devices in order to eliminate software bugs. Before 608 accepting the new firmware, each device checks the authorization of 609 the service company to perform this update. 611 2.4.1.4. Decommissioning 613 Company A has noticed that the handheld controllers are often 614 misplaced and hard to find when needed. So most of the time staff 615 use the existing wall switches for manual control.
Company A decides 616 it would be better to completely remove handheld controllers and asks 617 Company C to decommission them from the lighting system (U4.4). 619 Company C again gets the necessary authorization from the service 620 company to interact with the BLMS. The commissioner now deletes any 621 rules that authorized handheld controllers to control the 622 lighting (U4.3, U4.6). Additionally the commissioner instructs the 623 BLMS to push these new rules to prevent cached rules at the end 624 devices from being used. 626 2.4.2. Public Safety 628 The fire department requires, as part of the building safety 629 code, that the building have sensors that sense the level of smoke, 630 heat, etc., when a fire breaks out. These sensors report metrics 631 which are then used by a back-end server to map safe areas and un- 632 safe areas within a building and also possibly the structural 633 integrity of the building before fire-fighters may enter it. 634 Sensors may also be used to track where human/animal activity is 635 within the building. This will allow people stuck within the 636 building to be guided to safer areas and suggest possible actions 637 that they may take (e.g. using a client application on their phones, 638 or loudspeaker directions) in order to bring them to safety. In 639 certain cases, other organizations such as the police, ambulance 640 services, and federal organizations are also involved, and therefore 641 the coordination of tasks between the various entities has to be 642 carried out using efficient messaging and authorization mechanisms. 644 2.4.2.1. A fire breaks out 646 On a really hot day James, who works for company A, turns on the air 647 conditioning in his office. Lucy, who works for company B, wants to 648 make tea using an electric kettle. After turning it on she goes 649 outside to talk to a colleague until the water is boiling. 650 Unfortunately, her kettle has a malfunction which causes overheating 651 and results in a smoldering fire of the kettle's plastic case. 653 Due to the smoke coming from the kettle the fire alarm is triggered. 654 Alarm sirens throughout the building are switched on simultaneously 655 (using a group communication scheme) to alert the staff of both 656 companies (U4.8). Additionally, the ventilation system of the whole 657 building is closed off to prevent the smoke from spreading and to 658 withdraw oxygen from the fire. The smoke cannot get into James' 659 office although he turned on his air conditioning, because the fire 660 alarm overrides the manual setting by sending commands (using group 661 communication) to switch off all the air conditioning (U4.10). 663 The fire department is notified of the fire automatically and arrives 664 within a short time. After inspecting the damage and extinguishing 665 the smoldering fire, a fire fighter resets the fire alarm because only 666 the fire department is authorized to do that (U4.4, U4.5, U4.11). 668 2.4.3. Authorization Problems Summary 670 o U4.1 The building owner and the companies want to be able to add 671 new devices to their administrative domain (commissioning). 673 o U4.2 The building owner and the companies want to be able to 674 integrate a device that formerly belonged to a different 675 administrative domain into their own administrative domain 676 (handover). 678 o U4.3 The building owner and the companies want to be able to 679 remove a device from their administrative domain 680 (decommissioning).
682 o U4.4 The building owner and the companies want to be able to 683 delegate selected administration tasks for their devices to 684 others. 686 o U4.5 The building owner and the companies want to be able to 687 define context-based authorization rules. 689 o U4.6 The building owner and the companies want to be able to 690 revoke granted permissions and delegations. 692 o U4.7 The building owner and the companies want to allow only authorized 693 entities to send data to their endpoints (default deny). 695 o U4.8 The building owner and the companies want to be able to 696 authorize a device to control several devices at the same time 697 using a group communication scheme. 699 o U4.9 The companies want to be able to interconnect their own 700 subsystems with those from a different operational domain while 701 keeping control over the authorizations (e.g. granting and 702 revoking permissions) for their endpoints and devices. 704 o U4.10 The authorization mechanisms must be able to cope with 705 extremely time-sensitive operations which have to be carried out 706 quickly. 708 o U4.11 The building owner and the public authorities want to be 709 able to perform data origin authentication on messages 710 sent and received by some of the systems in the building. 712 2.5. Smart Metering 714 Automated measuring of customer consumption is an established 715 technology for electricity, water, and gas providers. Increasingly 716 these systems also feature networking capability to allow for remote 717 management. Such systems are in use for commercial, industrial and 718 residential customers and require a certain level of security, in 719 order to avoid economic loss to the providers, vulnerability of the 720 distribution system, as well as disruption of services for the 721 customers. 723 The smart metering equipment for gas and water solutions is battery 724 driven and communication should be used sparingly due to battery 725 consumption. Therefore these types of meters sleep most of the time, 726 and only wake up every minute/hour to check for incoming 727 instructions. Furthermore they wake up a few times a day (based on 728 their configuration) to upload their measured metering data. 730 Different networking topologies exist for smart metering solutions. 731 Based on environment, regulatory rules and expected cost, one or a 732 mixture of these topologies may be deployed to collect the metering 733 information. Drive-by metering is one of the most common solutions 734 currently deployed for the collection of data from gas and water meters. 736 Various stakeholders have a claim on the metering data. Utility 737 companies need the data for accounting, the metering equipment may be 738 operated by a third party Service Operator who needs to maintain it, 739 and the equipment is installed in the premises of the consumers, 740 measuring their consumption, which entails privacy questions. 742 2.5.1. Drive-by metering 744 A service operator offers smart metering infrastructures and related 745 services to various utility companies. Among these is a water 746 provider, who in turn supplies several residential complexes in a 747 city. The smart meters are installed in the end customers' homes to 748 measure water consumption and thus generate billing data for the 749 utility company; they can also be used to shut off the water if the 750 bills are not paid (U5.1, U5.3). The meters do so by sending and 751 receiving data to and from a base station (U5.2).
Several base 752 stations are installed around the city to collect the metering data. 753 However in the denser urban areas, the base stations would have to be 754 installed very close to the meters. This would require a high number 755 of base stations and expose this more expensive equipment to 756 manipulation or sabotage. The service operator has therefore chosen 757 another approach, which is to drive around with a mobile base-station 758 and let the meters connect to that in regular intervals in order to 759 gather metering data (U5.4, U5.5, U5.7). 761 2.5.2. Meshed Topology 763 In another deployment, the water meters are installed in a building 764 that already has power meters installed; the latter are mains 765 powered, and are therefore not subject to the same power saving 766 restrictions. The water meters can therefore use the power meters as 767 proxies, in order to achieve better connectivity. This requires the 768 security measures on the water meters to work through intermediaries 769 (U5.8). 771 2.5.3. Advanced Metering Infrastructure 773 A utility company is updating its old utility distribution network 774 with advanced meters and new communication systems, known as an 775 Advanced Metering Infrastructure (AMI). AMI refers to a system that 776 measures, collects and analyzes usage, and interacts with metering 777 devices such as electricity meters, gas meters, heat meters, and 778 water meters, through various communication media either on request 779 (on-demand) or on pre-defined schedules. Based on this technology, 780 new services make it possible for consumers to control their utility 781 consumption (U5.2, U5.6) and reduce costs by supporting new tariff 782 models from utility companies, and more accurate billing. 783 However the fine-grained measurement of consumption data may induce 784 privacy concerns for the end-customers, since it may allow others to 785 create behavioral profiles (U5.9). 787 The technical solution is based on levels of data aggregation between 788 smart meters located at the consumer premises and the Meter Data 789 Management (MDM) system located at the utility company (U5.8). For 790 reasons of efficiency and cost, end-to-end connectivity is not always 791 feasible, so metering data is stored and aggregated in various 792 intermediate devices before being forwarded to the utility company, 793 and in turn accessed by the MDM. The intermediate devices may be 794 operated by a third party service operator on behalf of the utility 795 company (U5.6). One responsibility of the service operator is to 796 make sure that meter readings are performed and delivered in a 797 regular, timely manner. An example of a Service Level Agreement 798 between the service operator and the utility company is: "at 799 least 95 % of the meters have readings recorded during the last 72 800 hours". 802 2.5.4. Authorization Problems Summary 804 o U5.1 Devices are installed in hostile environments where they are 805 physically accessible by attackers (including dishonest 806 customers). The service operator and the utility company want to 807 make sure that an attacker cannot use data from a captured device 808 to attack other parts of their infrastructure. 810 o U5.2 The utility company wants to control which entities are 811 allowed to send data to, and read data from, their endpoints. 813 o U5.3 The utility company wants to ensure the integrity of the data 814 stored on their endpoints.
816 o U5.4 The utility company wants to protect such data transfers to 817 and from their endpoints. 819 o U5.5 The devices may have intermittent Internet connectivity. 821 o U5.6 Neither the service operator nor the utility company are 822 always present at the time of access and cannot manually intervene 823 in the authorization process. 825 o U5.7 When authorization policies are updated it is impossible, or 826 at least very inefficient, to contact all affected endpoints 827 directly. 829 o U5.8 Messages between endpoints may need to be stored and 830 forwarded over multiple nodes. 832 o U5.9 Consumers may not want the Service Operator, the Utility 833 company or others to be able to have access to a fine-grained 834 level of consumption data that allows the creation of behavioral 835 profiles. 837 2.6. Sports and Entertainment 839 In the area of leisure time activities, applications can benefit from 840 the small size and weight of constrained devices. Sensors and 841 actuators with various functions can be integrated into fitness 842 equipment, games and even clothes. Users can carry their devices 843 around with them at all times. 845 Usability is especially important in this area since users will often 846 want to spontaneously interconnect their devices with others. 847 Therefore the configuration of access permissions must be simple and 848 fast and not require much effort at the time of access (preferably 849 none at all). 851 The required level of security will in most cases be low since 852 security breaches will likely have less severe consequences. The 853 continuous monitoring of data might however enable an attacker to 854 create behavioral or movement profiles. Moreover, the aggregation of 855 data can seriously increase the impact on the privacy of the users. 857 2.6.1. Dynamically Connecting Smart Sports Equipment 859 Jody is an enthusiastic runner. To keep track of her training 860 progress, she has smart running shoes that measure the pressure at 861 various points beneath her feet to count her steps, detect 862 irregularities in her stride and help her to improve her posture and 863 running style. On a sunny afternoon, she goes to the Finnbahn track 864 near her home to work out. She meets her friend Lynn who shows her 865 the smart fitness watch she bought a few days ago. The watch can 866 measure the wearer's pulse, show speed and distance, and keep track 867 of the configured training program. The girls discover that the watch 868 can be connected with Jody's shoes and then can additionally display 869 the information the shoes provide. 871 Jody asks Lynn to let her try the watch and lend it to her for the 872 afternoon. Lynn agrees but doesn't want Jody to access her training 873 plan (U6.4). She configures the access policies for the watch so 874 that Jody's shoes are allowed to access the display and measuring 875 features but cannot read or add training data (U6.1, U6.2). Jody's 876 shoes connect to Lynn's watch after only a press of a button because 877 Jody already configured access rights for devices that belong to Lynn 878 a while ago (U6.3). Jody wants the device to report the data back to 879 her fitness account while she borrows it, so she allows it to access 880 her account temporarily. 882 After an hour, Jody gives the watch back and both girls terminate the 883 connection between their devices. 885 2.6.2. Authorization Problems Summary 887 o U6.1 Sports equipment owners want to be able to grant access 888 rights dynamically when needed.
890 o U6.2 Sports equipment owners want the configuration of access 891 rights to work with very little effort. 893 o U6.3 Sports equipment owners want to be able to pre-configure 894 access policies that grant certain access permissions to endpoints 895 with certain attributes (e.g. endpoints of a certain user) without 896 additional configuration effort at the time of access. 898 o U6.4 Sports equipment owners want to protect the confidentiality 899 of their data for privacy reasons. 901 2.7. Industrial Control Systems 903 Industrial control systems (ICS) and especially supervisory control 904 and data acquisition systems (SCADA) use a multitude of sensors and 905 actuators in order to monitor and control industrial processes in the 906 physical world. Example processes include manufacturing, power 907 generation, and refining of raw materials. 909 Since the advent of the Stuxnet worm it has become obvious to the 910 general public how vulnerable these kinds of systems are, especially 911 when connected to the Internet. The severity of these 912 vulnerabilities is exacerbated by the fact that many ICS are used to 913 control critical public infrastructure, such as power, water 914 treatment or traffic control. Nevertheless the economic advantages 915 of connecting such systems to the Internet can be significant if 916 appropriate security measures are put in place (U7.5). 918 2.7.1. Oil Platform Control 920 An oil platform uses an industrial control system to monitor data and 921 control equipment. The purpose of this system is to gather and 922 process data from a large number of sensors, and control actuators 923 such as valves and switches to steer the oil extraction process on 924 the platform. Raw data, alarms, reports and other information are 925 also available to the operators, who can intervene with manual 926 commands. Many of the sensors are connected to the controlling units 927 by direct wire, but the operator is slowly replacing these units with 928 wireless ones, since this makes maintenance easier (U7.4). 930 Some of the controlling units are connected to the Internet, to allow 931 for remote administration, since it is expensive and inconvenient to 932 fly in a technician to the platform (U7.3). 934 The main interest of the operator is to ensure the integrity of 935 control messages and sensor readings (U7.1). Access in some cases 936 needs to be restricted, e.g. the operator wants wireless actuators 937 to accept commands only from authorized control units (U7.2). 939 The owner of the platform also wants to collect auditing information 940 for liability reasons (U7.1). 942 2.7.2. Authorization Problems Summary 944 o U7.1 The operator of the platform wants to ensure the integrity 945 and confidentiality of sensor and actuator data. 947 o U7.2 The operator wants to ensure that data coming from sensors 948 and commands sent to actuators are authentic. 950 o U7.3 Some devices do not have a direct Internet connection. 952 o U7.4 Some devices have a wired connection while others use wireless. 954 o U7.5 The execution of unauthorized commands in an ICS can lead to 955 significant financial damage, and threaten the availability of 956 critical infrastructure services. Accordingly, the operator wants 957 a security solution that provides a very high level of security. 959 3. Security Considerations 961 As the use cases listed in this document demonstrate, constrained 962 devices are used in various application areas.
The appeal of these 963 devices is that they are small and inexpensive. That makes it easy 964 to integrate them into many aspects of everyday life. Therefore such 965 devices will see vast amounts of valuable data passing through and 966 might even be in control of important functions. These assets need 967 to be protected from unauthorized access. Even seemingly innocuous 968 data and functions should be protected due to possible effects of 969 aggregation: By collecting data or functions from several sources, 970 attackers might be able to gain insights or a level of control not 971 immediately obvious from each of these sources on its own. 973 Not only is the data on the constrained devices themselves 974 threatened; the devices might also be abused as an intrusion point to 975 infiltrate a network. Once an attacker has gained control over the 976 device, it can be used to attack other devices as well. Due to their 977 limited capabilities, constrained devices appear as the weakest link 978 in the network and hence pose an attractive target for attackers. 980 This section summarizes the security problems highlighted by the use 981 cases above and provides guidelines for the design of protocols for 982 authentication and authorization in constrained RESTful environments. 984 3.1. Attacks 986 This document lists security problems that users of constrained 987 devices want to solve. Further analysis of attack scenarios is not 988 in scope of the document. However, there are attacks that must be 989 considered by solution developers. 991 Because of the expected large number of devices and their ubiquity, 992 constrained devices increase the danger from Pervasive Monitoring 993 [RFC7258] attacks. 995 As some of the use cases indicate, constrained devices may be 996 installed in hostile environments where they are physically 997 accessible (see Section 2.5). Protection from physical attacks is 998 not in the scope of ACE, but should be kept in mind by developers of 999 authorization solutions. 1001 Denial of service (DoS) attacks threaten the availability of services 1002 a device provides. E.g., an attacker can induce a device to perform 1003 steps of a heavyweight security protocol (e.g. Datagram Transport 1004 Layer Security (DTLS) [RFC6347]) before authentication and 1005 authorization can be verified, thus exhausting the device's system 1006 resources. This leads to a temporary or - e.g. if the batteries are 1007 drained - permanent failure of the service. For some services of 1008 constrained devices, availability is especially important (see 1009 Section 2.3). Because of their limitations, constrained devices are 1010 especially vulnerable to denial of service attacks. Solution 1011 designers must be particularly careful to consider these limitations 1012 in every part of the protocol. This includes: 1014 o Battery usage 1016 o Number of message exchanges required by security measures 1018 o Size of data that is transmitted (e.g. authentication and access 1019 control data) 1021 o Size of code required to run the protocol 1023 o Size of RAM and stack required to run the protocol 1025 Another category of attacks that needs to be considered by solution 1026 developers is session interception and hijacking.
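As an illustration of the denial-of-service concern discussed above, one possible (and purely illustrative) countermeasure is to bound how often an unauthenticated peer may trigger an expensive operation, such as a DTLS handshake, before any costly processing is started. The sketch below, in Python, is not a mechanism defined by this document or by ACE; the limits and the peer identification are arbitrary assumptions.

   import time
   from collections import defaultdict

   MAX_ATTEMPTS = 3      # handshake attempts allowed per peer ...
   WINDOW_SECONDS = 60   # ... within this time window (arbitrary values)

   _attempts = defaultdict(list)  # peer address -> recent attempt times

   def may_start_handshake(peer):
       """Return True if the peer is still within its handshake budget."""
       now = time.monotonic()
       recent = [t for t in _attempts[peer] if now - t < WINDOW_SECONDS]
       if len(recent) >= MAX_ATTEMPTS:
           _attempts[peer] = recent
           return False   # drop cheaply, before any cryptographic work
       recent.append(now)
       _attempts[peer] = recent
       return True

Even such a simple measure has to respect the constraints listed above; for example, the per-peer state kept in memory must remain small on a constrained device.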
1028 3.2. Configuration of Access Permissions 1030 o The access control policies need to be enforced (all use cases): 1031 The information that is needed to implement the access control 1032 policies needs to be provided to the device that enforces the 1033 authorization and applied to every incoming request. 1035 o A single resource might have different access rights for different 1036 requesting entities (all use cases). 1038 Rationale: In some cases different types of users need different 1039 access rights, as opposed to a binary approach where the same 1040 access permissions are granted to all authenticated users. 1042 o A device might host several resources where each resource has its 1043 own access control policy (all use cases). 1045 o The device that makes the policy decisions should be able to 1046 evaluate context-based permissions such as location or time of 1047 access (see e.g. Section 2.2, Section 2.3, Section 2.4). Access 1048 may also depend on local conditions, e.g. access to health data in 1049 an emergency, and the device that makes the policy decisions should 1050 be able to take such conditions into account. 1052 3.3. Authorization Considerations 1054 o Devices need to be enabled to enforce authorization policies 1055 without human intervention at the time of the access request (see 1056 e.g. Section 2.1, Section 2.2, Section 2.4, Section 2.5). 1058 o Authorization solutions need to consider that constrained devices 1059 might not have Internet access at the time of the access request 1060 (see e.g. Section 2.1, Section 2.3, Section 2.5, Section 2.6). 1062 o It should be possible to update access control policies without 1063 manually re-provisioning individual devices (see e.g. Section 2.2, 1064 Section 2.3, Section 2.5, Section 2.6). 1066 Rationale: Peers can change rapidly, which makes manual re- 1067 provisioning unreasonably expensive. 1069 o Authorization policies may be defined to apply to a large number 1070 of devices that might only have intermittent connectivity. 1071 Distributing policy updates to every device for every update might 1072 not be a feasible solution (see e.g. Section 2.5). 1074 o It must be possible to dynamically revoke authorizations (see e.g. 1075 Section 2.4). 1077 o The authentication and access control protocol can put undue 1078 burden on the constrained system resources of a device 1079 participating in the protocol. An authorization solution must 1080 take the limitations of the constrained devices into account (all 1081 use cases, see also Section 3.1). 1083 o Secure default settings are needed for the initial state of the 1084 authentication and authorization protocols (all use cases). 1086 Rationale: Many attacks exploit insecure default settings, and 1087 experience shows that default settings are frequently left 1088 unchanged by the end users. 1090 o Access to resources on other devices should only be permitted if a 1091 rule exists that explicitly allows this access (default deny) (see 1092 e.g. Section 2.4). 1094 o Usability is important for all use cases. The configuration of 1095 authorization policies as well as gaining access to devices 1096 must be simple for the users of the devices. Special care needs 1097 to be taken for home scenarios where access control policies have 1098 to be configured by users that are typically not trained in 1099 security (see Section 2.2, Section 2.3, Section 2.6).
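The following sketch shows how several of the points above (per-resource policies, different rights for different requesting entities, context conditions, and default deny) might fit together on the device that makes the policy decisions. It is written in Python for illustration only; the rule format, client names, resource paths and the office-hours condition are invented for this example and are not part of any ACE specification.

   from datetime import datetime, time

   # Illustrative rule set: each rule names a requesting entity, a
   # resource, the allowed methods and an optional context condition.
   RULES = [
       {"client": "nurse-01", "resource": "/heartrate",
        "methods": {"GET"}, "office_hours_only": True},
       {"client": "thermostat", "resource": "/temperature",
        "methods": {"GET", "PUT"}, "office_hours_only": False},
   ]

   def in_office_hours(now):
       return time(8, 0) <= now.time() <= time(18, 0)

   def is_authorized(client, resource, method, now=None):
       """Grant access only if an explicit rule allows the request."""
       now = now or datetime.now()
       for rule in RULES:
           if rule["client"] != client or rule["resource"] != resource:
               continue
           if method not in rule["methods"]:
               continue
           if rule["office_hours_only"] and not in_office_hours(now):
               continue   # context condition not met
           return True
       return False       # default deny: no matching rule

How such rules are provisioned to the device, kept up to date without manual re-provisioning, and evaluated when the device has no Internet connectivity are exactly the considerations listed in this section.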
This might affect the 1105 function or the security model of authentication and access control 1106 protocols e.g. end-to-end security between endpoints with DTLS might 1107 not be possible (see Section 2.5). 1109 4. Privacy Considerations 1111 Many of the devices that are in focus of this document register data 1112 from the physical world (sensors) or affect processes in the physical 1113 world (actuators), which may involve data or processes belonging to 1114 individuals. To make matters worse the sensor data may be recorded 1115 continuously thus allowing to gather significant information about an 1116 individual subject through the sensor readings. Therefore privacy 1117 protection is especially important, and Authentication and Access 1118 control are important tools for this, since they make it possible to 1119 control who gets access to private data. 1121 Privacy protection can also be weighted in when evaluating the need 1122 for end-to-end confidentiality, since otherwise intermediary nodes 1123 will learn the content of potentially sensitive messages sent between 1124 endpoints and thereby threaten the privacy of the individual that may 1125 be subject of this data. 1127 In some cases, even the possession of a certain type of device can be 1128 confidential, e.g. individuals might not want to others to know that 1129 they are wearing a certain medical device (see Section 2.3). 1131 The personal health monitoring use case (see Section 2.3) indicates 1132 the need for secure audit logs which impose specific requirements on 1133 a solution. 1134 Auditing is not in the scope of ACE. However, if an authorization 1135 solution provides means for audit logs, it must consider the impact 1136 of logged data for the privacy of all parties involved. Suitable 1137 measures for protecting and purging the logs must be taken during 1138 operation, maintenance and decommissioning of the device. 1140 5. Acknowledgments 1142 The authors would like to thank Olaf Bergmann, Sumit Singhal, John 1143 Mattson, Mohit Sethi, Carsten Bormann, Martin Murillo, Corinna 1144 Schmitt, Hannes Tschofenig, Erik Wahlstroem, Andreas Baeckman, Samuel 1145 Erdtman, Steve Moore, Thomas Hardjono, Kepeng Li and Jim Schaad for 1146 reviewing and/or contributing to the document. Also, thanks to 1147 Markus Becker, Thomas Poetsch and Koojana Kuladinithi for their input 1148 on the container monitoring use case. Furthermore the authors thank 1149 Akbar Rahman, Chonggang Wang, and Vinod Choyi who contributed the 1150 public safety scenario in the building automation use case. 1152 Ludwig Seitz and Goeran Selander worked on this document as part of 1153 EIT-ICT Labs activity PST-14056. 1155 6. IANA Considerations 1156 This document has no IANA actions. 1158 7. Informative References 1160 [Jedermann14] 1161 Jedermann, R., Poetsch, T., and C. LLoyd, "Communication 1162 techniques and challenges for wireless food quality 1163 monitoring", Philosophical Transactions of the Royal 1164 Society A Mathematical, Physical and Engineering Sciences, 1165 May 2014. 1167 [RFC6347] Rescorla, E. and N. Modadugu, "Datagram Transport Layer 1168 Security Version 1.2", RFC 6347, DOI 10.17487/RFC6347, 1169 January 2012, . 1171 [RFC7228] Bormann, C., Ersue, M., and A. Keranen, "Terminology for 1172 Constrained-Node Networks", RFC 7228, DOI 10.17487/ 1173 RFC7228, May 2014, 1174 . 1176 [RFC7252] Shelby, Z., Hartke, K., and C. Bormann, "The Constrained 1177 Application Protocol (CoAP)", RFC 7252, DOI 10.17487/ 1178 RFC7252, June 2014, 1179 . 
1181 [RFC7258] Farrell, S. and H. Tschofenig, "Pervasive Monitoring Is an 1182 Attack", BCP 188, RFC 7258, DOI 10.17487/RFC7258, May 1183 2014, <http://www.rfc-editor.org/info/rfc7258>. 1185 Authors' Addresses 1187 Ludwig Seitz (editor) 1188 SICS Swedish ICT AB 1189 Scheelevaegen 17 1190 Lund 223 70 1191 Sweden 1193 Email: ludwig@sics.se 1195 Stefanie Gerdes (editor) 1196 Universitaet Bremen TZI 1197 Postfach 330440 1198 Bremen 28359 1199 Germany 1201 Phone: +49-421-218-63906 1202 Email: gerdes@tzi.org 1203 Goeran Selander 1204 Ericsson 1205 Faroegatan 6 1206 Kista 164 80 1207 Sweden 1209 Email: goran.selander@ericsson.com 1211 Mehdi Mani 1212 Itron 1213 52, rue Camille Desmoulins 1214 Issy-les-Moulineaux 92130 1215 France 1217 Email: Mehdi.Mani@itron.com 1219 Sandeep S. Kumar 1220 Philips Research 1221 High Tech Campus 1222 Eindhoven 5656 AA 1223 The Netherlands 1225 Email: sandeep.kumar@philips.com