Security Automation and Continuous Monitoring                M. Hansbury
Internet-Draft                                                 D. Haynes
Intended status: Informational                     The MITRE Corporation
Expires: March 11, 2017                                      J. Gonzalez
                                        Department of Homeland Security
                                                       September 7, 2016


                  OVAL and the SACM Information Model
             draft-hansbury-sacm-oval-info-model-mapping-03

Abstract

   The OVAL community has spent more than ten years developing and
   employing the OVAL Language.  During that time, the community has
   made a number of design decisions and learned a number of lessons
   that should be leveraged as the next-generation endpoint posture
   assessment standards are formulated.  Several portions of the OVAL
   Language align with the SACM Information Model and could serve as a
   starting point for related work.  The lessons applicable to the SACM
   work include: maintain a clear separation between data collection
   and evaluation; ensure that both primary source vendors and
   third-party security experts feel invited to the discussion and are
   empowered to leverage their unique domain knowledge; and strive for
   simplicity and flexibility wherever possible.  In addition, the OVAL
   community has a set of clear recommendations about which parts of
   OVAL SACM should use in order to make the best use of the efforts of
   those who have worked on and supported OVAL over the past ten years.
   Those recommendations are:

   o  Use the OVAL System Characteristics Model to inform the
      development of a data model for representing endpoint posture
      attributes.

   o  Use the OVAL Definitions Model to inform the development of data
      models for representing evaluation and collection guidance.

   o  Do not use the OVAL Results Model to inform the development of a
      data model for representing evaluation results.

   Lastly, this document discusses the OVAL submission, how it is
   expected to be used, and how it aligns with the SACM Vulnerability
   Assessment Scenario.
Status of This Memo

   This Internet-Draft is submitted in full conformance with the
   provisions of BCP 78 and BCP 79.

   Internet-Drafts are working documents of the Internet Engineering
   Task Force (IETF).  Note that other groups may also distribute
   working documents as Internet-Drafts.  The list of current
   Internet-Drafts is at http://datatracker.ietf.org/drafts/current/.

   Internet-Drafts are draft documents valid for a maximum of six
   months and may be updated, replaced, or obsoleted by other documents
   at any time.  It is inappropriate to use Internet-Drafts as
   reference material or to cite them other than as "work in
   progress."

   This Internet-Draft will expire on March 11, 2017.

Copyright Notice

   Copyright (c) 2016 IETF Trust and the persons identified as the
   document authors.  All rights reserved.

   This document is subject to BCP 78 and the IETF Trust's Legal
   Provisions Relating to IETF Documents
   (http://trustee.ietf.org/license-info) in effect on the date of
   publication of this document.  Please review these documents
   carefully, as they describe your rights and restrictions with
   respect to this document.  Code Components extracted from this
   document must include Simplified BSD License text as described in
   Section 4.e of the Trust Legal Provisions and are provided without
   warranty as described in the Simplified BSD License.

Table of Contents

   1.  Introduction
     1.1.  Requirements Language
   2.  SACM Information Model
   3.  OVAL Language
     3.1.  Core OVAL Models
       3.1.1.  Core OVAL Data Models
         3.1.1.1.  OVAL Definitions Model
         3.1.1.2.  OVAL System Characteristics Model
         3.1.1.3.  OVAL Results Model
       3.1.2.  Additional Core OVAL Data Models
         3.1.2.1.  OVAL Common Model
         3.1.2.2.  OVAL Variables Model
         3.1.2.3.  OVAL Directives Model
       3.1.3.  Processing Model
   4.  Relating the OVAL Models to the SACM Information Model
     4.1.  Attribute Collector
     4.2.  Evaluator
     4.3.  Endpoint Attribute Assertion
     4.4.  Evaluation Result
     4.5.  Collection Guidance
     4.6.  Evaluation Guidance
     4.7.  Provenance
   5.  SACM Constructs with No OVAL Mapping
     5.1.  Tasking
     5.2.  Event-driven Actions
     5.3.  User and Authorization
     5.4.  Location
   6.  Lessons Learned and Gaps
     6.1.  Simplicity is Key
       6.1.1.  Lesson
       6.1.2.  SACM Implications
     6.2.  Collection and Evaluation Must Be De-coupled
       6.2.1.  Lesson
       6.2.2.  SACM Implications
     6.3.  Keep Separate Core and Extensions
       6.3.1.  Lesson
       6.3.2.  SACM Implications
     6.4.  Empower Subject Matter Experts
       6.4.1.  Lesson
       6.4.2.  SACM Implications
     6.5.  Carrots Work Better than Sticks
       6.5.1.  Lesson
       6.5.2.  SACM Implications
     6.6.  Use Caution Defining Data Collection
       6.6.1.  Lesson
       6.6.2.  SACM Implications
     6.7.  Perspective Matters
       6.7.1.  Lesson
       6.7.2.  SACM Implications
     6.8.  Flexible Results Fidelity is Important
       6.8.1.  Lesson
       6.8.2.  SACM Implications
     6.9.  Evaluation Guidance is Platform-Specific
       6.9.1.  Lesson
       6.9.2.  SACM Implications
   7.  Recommendations
     7.1.  Use the OVAL System Characteristics Model for Encoding
           Collection Data
     7.2.  Use the OVAL Definitions Model for Collection and
           Evaluation Guidance
     7.3.  Do NOT Use the OVAL Results Model for Results Sharing
   8.  OVAL Submission
   9.  Alignment with the SACM Vulnerability Assessment Scenario
     9.1.  Endpoint Identification and Initial Data Collection
     9.2.  Endpoint Applicability and Secondary Assessment
     9.3.  Assessment Results
   10.  Acknowledgements
   11.  IANA Considerations
   12.  Security Considerations
   13.  Change Log
     13.1.  -02 to -03
     13.2.  -01 to -02
     13.3.  -00 to -01
   14.  References
     14.1.  Normative References
     14.2.  Informative References
   Authors' Addresses

1.  Introduction

   The Security Automation and Continuous Monitoring (SACM) IETF
   Working Group [SACM] has been chartered with standardizing the
   mechanisms by which endpoint security assessment is performed.  This
   includes software inventory, compliance and vulnerability
   management, and other related activities.  The Working Group has
   created a series of artifacts [SACM-DOCUMENTS] to capture the
   important concepts required to accomplish this goal.  In addition to
   the Use Cases, Requirements, and Architecture documents, the Working
   Group has created an initial draft of an Information Model that
   describes the high-level components and concepts that fulfill the
   already-defined requirements.

   This white paper discusses how the Open Vulnerability and Assessment
   Language (OVAL) [OVAL-DOCUMENTATION] can be used to inform the
   development of data models that implement the Information Model
   defined by the SACM group.  This paper does not suggest that the
   entire OVAL Data Model could (or even should) be supported by SACM;
   rather, it breaks apart the various components of the OVAL Language
   and discusses how each could be used to satisfy parts of the
   Information Model.
   This document assumes that the reader is already familiar with OVAL
   and its structures.  Readers who require more in-depth information
   about OVAL should review the OVAL Tutorial documentation
   [OVAL-DEFINITION-TUTORIAL] and other related documentation.  This
   document describes how these structures can be thought of as data
   models whose scopes and activities overlap with the SACM Information
   Model.

   Additionally, in later sections, the paper presents lessons learned
   from the ten-plus years of OVAL development and curation, the
   related gaps, how the OVAL submission is expected to be used, and
   how it aligns with the SACM Vulnerability Assessment Scenario
   [I-D.coffin-sacm-vuln-scenario].

1.1.  Requirements Language

   The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT",
   "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" in
   this document are to be interpreted as described in RFC 2119
   [RFC2119].

2.  SACM Information Model

   The information model defined by the SACM Working Group captures the
   types of objects and data required to fulfill the defined SACM
   Requirements [I-D.ietf-sacm-requirements].  In conjunction with the
   SACM Architecture document [I-D.ietf-sacm-architecture], it
   additionally provides details on the flow of data to and from the
   different objects in the system.  The document describes all of
   these things in a protocol- and data-format-neutral manner.

   The document describes the various components that are required to
   perform endpoint assessments, along with some usage scenarios.
   Wherever OVAL may be relevant, this paper discusses the potential
   mapping from OVAL to these defined components.

3.  OVAL Language

   The OVAL Language is made up of several parts, each responsible for
   encapsulating a part of the assessment model.  Each part is
   discussed briefly below [STRUCTURE-OF-OVAL].

   Note: a word about Core vs. Platform Extensions.  OVAL can be
   broadly split into the Core structures, which are foundational and
   give the overall structure to the OVAL Language, and the Platform
   Extensions, which are platform-specific structures that extend the
   Core in order to provide ways to encode the underlying low-level,
   platform-specific tests used by OVAL Content.  This paper is chiefly
   focused on mapping the Core into the SACM Information Model.

   In a similar fashion, when thinking about how to implement the SACM
   Information Model, two distinct levels must be considered:

   1.  Platform-agnostic, high-level concepts

   2.  Platform-specific concepts

3.1.  Core OVAL Models

   The OVAL Language is made up of three primary core data models,
   which define the three steps of the assessment process (desired
   state, actual state, and the results of comparing the actual state
   against the desired state); three supplemental core data models; and
   a processing model, which describes how all of the core data models
   work together.

3.1.1.  Core OVAL Data Models

   There are a number of data models defined as part of OVAL.  This
   section discusses the three most important of them.

3.1.1.1.  OVAL Definitions Model

   The Definitions Model is the central component of the OVAL Language.
   The structures in this model allow an author to encode what posture
   data to collect, the expected values for that data, and the rules by
   which to evaluate the data.  However, the current design requires
   authors to include both what data must be collected and how the
   collected data is to be evaluated in a single Definition, which
   couples these two separate, but related, concepts together.  For
   more information, see Section 6.2 below.

   The OVAL Definitions Model provides a Definition object that is the
   root element for any OVAL check.
   It contains a set of criteria, either simple or complex, that define
   how the evaluation should operate.  In addition, the OVAL
   Definitions Model defines the base structures that are used by the
   Platform Extensions to extend OVAL, as well as Functions and other
   high-level concepts.

3.1.1.2.  OVAL System Characteristics Model

   The OVAL System Characteristics Model defines structures to encode
   the actual posture data that is collected.  It provides basic
   structures for representing this data, including the Item construct,
   which is the base structure for recording collected data in OVAL.
   It also provides structures for capturing information about the
   endpoint from which the data was collected, including OS
   information, endpoint identification information (such as IP and MAC
   addresses), and other relevant endpoint metadata.

3.1.1.3.  OVAL Results Model

   Finally, OVAL provides a third model, the OVAL Results Model, to
   encode the results of the evaluation.  This model provides
   structures to capture essential information about the evaluation
   results, such as the overall result of each definition evaluation
   and when the assessment occurred.  Additionally, the Results Model
   provides a way to include both the guidance (Definitions) and the
   collected data (System Characteristics) used for the evaluation.  By
   capturing this additional data, the Results Model provides a
   comprehensive way to capture the information used to determine each
   result in addition to the results themselves.

3.1.2.  Additional Core OVAL Data Models

   Additional data models are defined that support specific
   capabilities that are sometimes useful in conjunction with the OVAL
   models previously discussed.  The models discussed in this section
   are not intended to stand alone and require the use of one or more
   of the core OVAL models.

3.1.2.1.  OVAL Common Model

   The Common Model is a very simple collection of global building
   blocks, such as the enumerations used throughout the other models,
   along with some other foundational pieces.  Common values are
   defined once in this model and then applied within the other OVAL
   models, thus reducing redundancy between the OVAL data models.
   Examples of the elements provided by the OVAL Common Model are
   enumerations that provide useful value sets for use within OVAL,
   such as family types (e.g., "windows", "unix"), data types (e.g.,
   "string", "boolean", "int"), and class types (e.g.,
   "vulnerability", "compliance").

3.1.2.2.  OVAL Variables Model

   The OVAL Variables Model provides a simple framework for externally
   specifying, at runtime, the variable values used for the evaluation
   of an OVAL Definitions document.

3.1.2.3.  OVAL Directives Model

   The OVAL Directives Model provides a very simple model with
   structures to indicate the level of detail that should be present in
   an OVAL Results document.  This can be used by an evaluator to
   produce a desired level of result detail.

3.1.3.  Processing Model

   The OVAL Processing Model describes in detail how the core OVAL data
   models are used to produce OVAL Definitions, OVAL System
   Characteristics, and OVAL Results.

4.  Relating the OVAL Models to the SACM Information Model

   The following section discusses each piece of the SACM Information
   Model with which one or more OVAL models align, wholly or in part.

4.1.  Attribute Collector

   The SACM Information Model defines both Internal and External
   Attribute Collectors.  Both are components that perform the
   collection of posture information from an endpoint.  The Information
   Model lists a number of examples of Collectors, such as Network
   Intrusion Detection Systems (NIDS), NEA posture collectors, and
   vulnerability scanners.
   While OVAL is not directly applicable for some types of Attribute
   Collectors, such as NIDS, it is certainly applicable for NEA posture
   collectors and vulnerability scanners that require the collection
   and evaluation of configuration and other endpoint state
   information.

   An Attribute Collector needs to be instructed as to what specific
   posture attributes must be collected, when or how often those
   attributes must be collected, and how to share the collected
   attributes.  In some cases, an Attribute Collector may simply
   collect data and directly respond to the caller with the required
   results.  In others, it may monitor the endpoint for changes and
   report each change when it occurs, or it may execute the data
   collection at a future time or at some interval.  In these last two
   cases, the collector will need to know how to share the collected
   data.  The OVAL Language does not provide any mechanism for
   instructing tools where to send collected data.  The OVAL
   Definitions Model can (among other things) encode what data must be
   collected; however, as currently constructed, it does not provide
   any notion of what constitutes valid data collection (i.e., how
   recent data must be to be considered acceptable, and how and where
   it was collected).

   Additionally, the OVAL Definitions Model could be modified to
   support the monitoring of events.  As it is today, OVAL does not
   have any explicit way to include these instructions, but it would be
   simple to modify the model to include this notion.

   The OVAL System Characteristics Model allows the encoding of
   collected information and can be used to implement a data format for
   sharing collected data.  While OVAL does not require that tools
   store data using a standardized format (though they are free to do
   so), a standardized format is required to allow tools to exchange
   data.
   The OVAL System Characteristics Model provides a standardized way to
   encode this information for exchange.

4.2.  Evaluator

   An Evaluator is the component that analyzes inputs, such as Posture
   Attributes and Evaluation Guidance, to determine the result of a
   particular assessment.  It is the piece that answers a question
   about the security posture of one or more endpoints.  The Evaluator
   must be able to ingest inputs of various types, understand the
   question or questions asked of it, and analyze the inputs to make a
   determination.

   OVAL could be used to provide several of the required inputs to an
   Evaluator.  The format defined in the OVAL Definitions Model could
   be used to express Evaluation Guidance.  Note that when mapping the
   OVAL Definitions Model to the SACM Information Model, it is
   important to distinguish between Collection and Evaluation within
   the OVAL Definitions Model.  The OVAL Definitions Model structures
   currently combine both Collection ("what to collect") and
   Evaluation ("what the data should look like").  One of the key
   concepts within the SACM Information Model is that Collection and
   Evaluation should be separate concepts.  Nonetheless, OVAL contains
   building blocks that could inform solutions that satisfy this need.

   Similarly, the structures defined in the OVAL System Characteristics
   Model and the OVAL Results Model could be used to inform the
   solutions that define, respectively, the Attributes input to the
   Evaluator and the results of an assessment.

4.3.  Endpoint Attribute Assertion

   According to the SACM Information Model, an Endpoint Attribute
   Assertion is a way to indicate that a specified set of posture
   attributes or events were present on an endpoint during a specific
   interval of time.
   For example, an Assertion could be made that a particular Windows
   server had the following attributes from 1/1/2015 to 1/8/2015:

   o  os = Windows 7

   o  mac-address = 01:24:42:58:34:2b

   OVAL does not have a direct corollary to this construct; however,
   the structures defined by the OVAL System Characteristics Model
   could provide a base from which such a construct could be built.
   The System Characteristics Model is designed to capture posture
   attributes and, as such, could be extended or modified to include
   the concept of a time interval.

   Additionally, it is important to note that the SACM Information
   Model also states that Events can be included within an Endpoint
   Attribute Assertion.  While "event" and "attribute" are often used
   interchangeably, in the SACM Information Model these two concepts
   are considered distinct.  The distinction is that an "event" is
   something that has a value that does not change until something
   causes a change, whereas an "attribute" is something that is
   observed at a moment in time.  The Endpoint Attribute Assertion
   deals with both posture attributes and events during a time
   interval.  No special treatment is given to Events within OVAL as it
   is currently constructed, although, as stated previously, adding a
   time interval to support Events would be simple to do.

4.4.  Evaluation Result

   An Evaluation Result is the representation of the analysis of a
   given set of Posture Attributes against Evaluation Guidance.  The
   OVAL Results Model structures can be used to encode one or more
   Evaluation Results.

4.5.  Collection Guidance

   Within the SACM Information Model, Collection Guidance is defined as
   information that describes which Posture Attributes must be
   collected from one or more endpoints.
   It is the means by which an Attribute Collector determines what
   information it must collect, as well as when that information must
   be collected (including intervals for repeated collection
   activities).

   The OVAL Definitions Model provides structures capable of expressing
   what data must be collected for an assessment.  It is important to
   note that the method by which the OVAL Definitions Model
   accomplishes this will not necessarily directly apply to the SACM
   Information Model in its current state.  In many cases, the
   specification of which posture attributes should be collected is not
   distinct from the associated evaluation guidance.  For the OVAL
   Definitions Model to be used to implement the SACM Information
   Model, work would need to be undertaken to de-couple these concepts.

   While the model provides the ability to encode details such as what
   data must be collected from the endpoint, it does not currently
   provide the ability to include information such as a collection
   interval.  The model could, however, be extended with the concept of
   an "interval" to add this capability.

   Important Note: one of the key drawbacks to OVAL is that Platform
   Extensions (using the OVAL Definitions Model as a base) must be
   created for each platform and data source in order to capture any
   Posture Attributes that must be collected for that platform and data
   source.  As a result, it is not easy or scalable to create or update
   extensions for rapidly changing platforms and products in a timely
   manner.
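   The de-coupling and "interval" concepts described above can be
   sketched in code.  The following Python fragment is a minimal,
   hypothetical illustration only: the CollectionGuidance and
   EvaluationGuidance classes, their fields, and the interval attribute
   are invented for this sketch and are not defined by OVAL or SACM.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative sketch only: collection guidance de-coupled from
# evaluation guidance, with the "interval" concept this section
# suggests adding.  All names here are invented; neither OVAL nor
# SACM defines these structures.

@dataclass
class CollectionGuidance:
    """What to collect, with no expected values attached."""
    object_type: str                # e.g., an OVAL-style object kind
    fields: dict                    # which posture attributes to gather
    interval_seconds: Optional[int] = None  # None means collect once

@dataclass
class EvaluationGuidance:
    """What the collected data should look like, kept separate."""
    object_type: str
    expected: dict                  # attribute name -> expected value

collect = CollectionGuidance(
    object_type="registry_object",
    fields={"hive": "HKEY_LOCAL_MACHINE",
            "key": r"SOFTWARE\Example",
            "name": "Version"},
    interval_seconds=86400,         # re-collect daily
)
evaluate = EvaluationGuidance(
    object_type="registry_object",
    expected={"Version": "2.0"},
)
```

   Because the two structures are independent, the same collection
   guidance could be reused for dynamic querying of previously
   collected data without re-stating the expected values.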
   With this in mind, any use of the OVAL Definitions Model to satisfy
   Collection Guidance for SACM should include consideration of updates
   that change it from a solution where the low-level platform details
   are part of the language itself to one where the format provides a
   way for domain experts (ideally primary source vendors) to instruct
   tools what Posture Attributes to collect.

   This also applies to the next section (Evaluation Guidance).

4.6.  Evaluation Guidance

   The Evaluation Guidance component contains the information that
   directs an Evaluator in how to perform one or more assessments based
   on collected data.  Evaluation Guidance must direct the Evaluator on
   what the expected state of the collected data should be.
   Additionally, it must be able to specify the desired characteristics
   of the data.  That is, it must be able not only to cite the specific
   posture attributes under evaluation, but also to specify
   characteristics such as the type of tool that was used to collect
   the data, how old the data is, and so on.

   The Evaluator must then ingest this guidance, locate the required
   data (whether locally or remotely available), and then execute the
   required analysis.

   The OVAL Definitions Model provides the structures for encoding the
   expected state or values for evaluating collected data.  The OVAL
   Language does not currently provide a way to specify the expected
   characteristics of the data, but the OVAL Definitions Model could be
   augmented to include this type of information.  Alternatively, the
   concept could be added elsewhere and re-used as appropriate.
   Allowing for the description of characteristics information will be
   important in allowing an evaluation to, for example, only use data
   that has been collected within the past x days, or only query data
   that was collected by a credentialed collector.
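   A freshness constraint of the kind described above ("only use data
   collected within the past x days") could be honored by an Evaluator
   as sketched below.  This Python fragment is a hypothetical
   illustration: the item dictionaries and the fresh_items helper are
   invented for this example and are not part of OVAL or SACM.

```python
from datetime import datetime, timedelta

# Illustrative sketch only: an Evaluator applying a data-freshness
# characteristic before evaluating expected state.  The item structure
# is invented for this example; OVAL does not currently define such a
# constraint.

def fresh_items(items, max_age_days, now):
    """Keep only items whose collection time is recent enough."""
    cutoff = now - timedelta(days=max_age_days)
    return [i for i in items if i["collected_at"] >= cutoff]

now = datetime(2016, 9, 7)
items = [
    {"name": "os_version", "value": "6.1",
     "collected_at": datetime(2016, 9, 5)},
    {"name": "os_version", "value": "6.0",
     "collected_at": datetime(2016, 7, 1)},
]

# With a 30-day window, only the item collected on 2016-09-05 survives.
usable = fresh_items(items, max_age_days=30, now=now)
```

   A "credentialed collector" characteristic could be handled the same
   way, by filtering on collector metadata carried alongside each item.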
522 Again, as Collection and Evaluation are intertwined currently in the 523 OVAL Language, some work will be required to de-couple them for use 524 with the Evaluation Guidance component. 526 4.7. Provenance 528 While the SACM Information Model does not attempt to define 529 provenance, it does describe metadata that should be included when 530 exchanging and evaluating posture attribute information (e.g., source 531 of origin, time of collection, observation, etc.). This metadata 532 aims to provide SACM users with enough information to make a 533 determination about the provenance of data as it applies to their 534 enterprise. 536 Within the OVAL Common Model, a Generator structure is defined to 537 express both what created the content, and when it was created. 538 While the purpose of this structure does not meet all the metadata 539 needs for SACM, it could be used as a building block and be extended 540 to achieve this goal. 542 5. SACM Constructs with No OVAL Mapping 544 Finally, while there are many similarities between what is defined by 545 the SACM Information Model and the OVAL data models, there are some 546 things discussed in the SACM Information Model document that are 547 either different from or not supported within OVAL. 549 5.1. Tasking 551 The SACM Information Model discusses Tasks in a few places, including 552 the Collector, Evaluator, and Reporting sections. Tasks represent of 553 notion of "do something at this time", "do something until told 554 otherwise", or "do X when Y occurs". OVAL does not support any 555 notion of a tasking model as currently defined. 557 While the OVAL Definitions Model (or some derivative) could be 558 referenced by a model that captures tasking, it may be difficult to 559 support all of the needs of tasking in this way. Tasking may already 560 be well defined by another, existing model, and if so, it might be 561 best to leverage that existing work. 563 5.2. 
Event-driven Actions 565 Within the SACM Information Model, in addition to posture attributes, 566 events are also often part of the data collection activities. Events 567 are discussed as both part of an Endpoint Attribute Assertion, and an 568 Endpoint Attribute Collector. In each case, it is clear that, in 569 addition to the collection of posture attribute data, event data must 570 also be taken into account. 572 The OVAL Language does not have any notion of capturing events 573 directly. It is constructed to allow the representation of Posture 574 Attribute data within the OVAL System Characteristics Model, but 575 event data is absent from that model. OVAL can be modified to 576 support Events in large part by simply extending it to include a time 577 interval. 579 5.3. User and Authorization 581 The Information Model talks about Users (i.e., one or more end users 582 or roles) and Authorizations (i.e., their authority to undertake 583 actions). While OVAL includes some entities that may relate to these 584 types of concepts, they appear in very specific low-level tests like 585 Windows and UNIX user-related tests. OVAL lacks any general concept 586 of Users or Authorizations that could be applied across its core data 587 structures. The recommendation is to identify and integrate an 588 external solution into relevant OVAL models to achieve required 589 capabilities in this area. 591 5.4. Location 593 Similar to Users and Authorization, Locations are defined in the 594 Information Model. Locations include physical location (e.g., 595 department, room, Global Positioning System (GPS), wall-jack, etc.) 596 and logical location (e.g., authentication points, which network 597 infrastructure endpoints it is connected to, etc.). 599 Again, as for Users and Authorization, the recommendation is for the 600 relevant OVAL models to be integrated with other solutions to meet 601 these requirements. 603 6. 
Lessons Learned and Gaps 605 Over the course of ten-plus years of moderating the OVAL project, 606 those involved in the project have released over 15 distinct versions 607 of the Language, 25 versions of the OVAL Interpreter, and have 608 processed over 25,000 OVAL Definitions in the OVAL Repository. In 609 addition, the team has spent a lot of time interacting with security 610 tool vendors, researchers, primary source vendors, and commercial and 611 government end users, discussing their needs and struggles. As such, 612 the following lessons learned are presented to help ensure that the 613 collective experience of the group is shared with the larger 614 community. 616 In addition to a description of the lesson, each also has a suggested 617 application for the SACM work. 619 6.1. Simplicity is Key 621 6.1.1. Lesson 623 Endpoint assessment covers a broad set of activities. From 624 organization to organization, assessment has different meanings, and 625 what is "good enough" for one group barely scratches the surface for 626 another. Experience suggests that caution must be used to avoid 627 unnecessary complexity as a means to address this diversity. 629 The team has seen that when information sharing is required across 630 diverse parties, the simpler the exchange mechanism design, the more 631 successful the sharing effort will be. 633 6.1.2. SACM Implications 635 Review both the diversity of the different organizations that are 636 sharing information within the SACM framework, and the types and 637 volume of information that must be shared. Include only the 638 information that is required to successfully implement the desired 639 use cases. The modular organization of OVAL supports this approach: 640 it allows 641 for use of only the parts that are needed to support a use case and 642 nothing more. 644 6.2. Collection and Evaluation Must Be De-coupled 646 6.2.1.
Lesson 648 As OVAL - and the security automation space in general - has evolved, 649 it has become clear that the close coupling found in OVAL between the 650 OVAL Object and OVAL State (i.e., what to collect and what the 651 collected data is expected to look like) is an undesirable feature. 652 By forcing these two concepts into a single model, the Language does 653 not allow for easy extension, dynamic querying of previously 654 collected data, or efficiencies in data collection and data 655 exchanges. 657 6.2.2. SACM Implications 659 Keep the mechanisms by which data is collected and evaluated separate. 661 6.3. Keep Core and Extensions Separate 662 6.3.1. Lesson 664 OVAL, by design, must be frequently updated to keep up with new and 665 expanding sets of assessment platforms. However, tool vendors 666 incurred great cost in updating to new versions of the Language, 667 including implementing new tests in the updated version, as well as 668 general quality testing, updating release and deployment processes, etc. 670 As the project matured, so too did the Core Models that define the 671 building blocks for endpoint assessment. Over the past few years, 672 the Core Models rarely changed, in some cases going years without any 673 required update. The Platform Extension Models, however, will always 674 require a frequent revision cycle, and often were out of date very 675 quickly. Despite the fact that these two sets of models had distinct 676 release cycle requirements, with cycles continually getting longer for the Core 677 Models and agility required for the Platform Extensions, a full 678 release of both was required to include changes to any part of the 679 OVAL Language. 681 6.3.2. SACM Implications 683 SACM should focus on providing the foundational building blocks that 684 allow those who know best to express what data must be collected 685 to assess an endpoint. The SNMP standard [RFC1157] could be used as 686 a model for this type of separation.
SNMP defines the building 687 blocks for sharing information about network devices, but defers the 688 low-level details of this information sharing to those that best 689 understand the products via Management Information Bases (MIBs). 690 While this is not a perfectly analogous model for the SACM work, this 691 clean separation of core building blocks and protocols from the low- 692 level details of products should be emulated, if possible. 694 6.4. Empower Subject Matter Experts 696 6.4.1. Lesson 698 As the security automation field has matured, more primary source 699 vendors and other subject matter experts have taken increased 700 responsibility in ownership of how their products are assessed. This 701 step in maturity is critical and, within OVAL, as these vendors have 702 become more involved, the quality of tests available to tools and end 703 users has greatly increased. 705 6.4.2. SACM Implications 707 Ensure that those who best understand the 708 component being assessed are empowered to specify what data must be 709 collected for the assessment, along with the meaning of this data. 711 Keep the mechanism by which this information is 712 conveyed as simple as possible so that it is easy 713 for subject matter experts to participate. 715 6.5. Carrots Work Better than Sticks 717 6.5.1. Lesson 719 As much as possible, ensure that usage of and compliance with the 720 defined standards are encouraged by offering primary source vendors 721 and subject matter experts incentives to do so. Forced compliance 722 typically encourages organizations to do the least possible, and does 723 not entice them to continually stay engaged. 725 6.5.2. SACM Implications 727 Find ways to encourage participation that drives long-term, 728 willing engagement.
Engage with vendors to understand their 729 problems and, where possible, construct SACM use cases and 730 requirements that not only address the needs of the SACM end users, 731 but also those of the vendors. Build a compelling story for use of 732 SACM that not only shows value to end users, but shows a clear return 733 on investment for vendors. 735 6.6. Use Caution Defining Data Collection 737 6.6.1. Lesson 739 When providing information about what data must be collected as part 740 of an assessment, it can be quite easy to provide this information in 741 a way that dictates how to collect the required data. Doing so can 742 limit innovation and architectural choices for organizations 743 implementing security automation tools. 745 On the other hand, it is not always feasible to express what data 746 must be collected without implying or instructing specific data 747 collection mechanisms. Over the years, there have been a few cases 748 where the OVAL community could not agree on significant issues 749 related to data collection. Discussions on whether to allow open 750 scripting in the Language and how best to support both third party 751 and primary source contributions were very challenging. With good 752 arguments on both sides of these issues, it was difficult to achieve 753 consensus. 755 6.6.2. SACM Implications 757 This will be one of the bigger challenges for SACM to navigate. SACM 758 must allow those that best understand platforms and products to 759 instruct what data must be collected for assessment. At the same 760 time, third party support will be critical in some cases as well, and 761 allowances must be made for this. 763 Additionally, deciding how many, if any, collection methods are 764 allowed as part of the collection instructions will be challenging. 765 Again, a balance should be struck to best allow clarity in data 766 collection instructions, without limiting innovation and product- 767 specific decisions. 769 6.7. 
Perspective Matters 771 6.7.1. Lesson 773 When evaluating collected posture attributes, it is important in some cases to be 774 able to include additional context in this evaluation. 775 For example, the method by which data was collected could be an 776 important piece of information when performing evaluation. If the 777 scanner was a remote, unauthenticated scanner of an endpoint, it is 778 entirely possible that the scanner could not properly scan for a 779 number of posture attributes. If, however, the scanner ran locally 780 on the endpoint as an administrative user, it is much more likely 781 that it accurately collected posture attributes from the endpoint. 783 Other examples of this type of perspective and context include how 784 old the collected data is, and whether the scanner was active or 785 passive. 787 6.7.2. SACM Implications 789 Ensure information that provides necessary context can be provided as 790 part of data collection, thereby allowing context-based decisions to 791 be made. 793 6.8. Flexible Results Fidelity is Important 795 6.8.1. Lesson 797 After data collection and evaluation are complete, evaluation results 798 must be shared, often with multiple parties, and in multiple ways. 799 It is important to provide a reasonable amount of flexibility with 800 respect to what levels of fidelity are allowed with evaluation 801 results. While OVAL did try to achieve a reasonable amount of 802 flexibility with evaluation results fidelity, challenges still exist. 804 6.8.2. SACM Implications 806 As much as possible, allow the end users of evaluation results to 807 determine exactly what level of fidelity they need to achieve their 808 goals. 810 6.9. Evaluation Guidance is Platform-Specific 812 6.9.1. Lesson 814 In the early days of OVAL, initial adoption of the effort was 815 spearheaded by third party security vendors, as opposed to the 816 primary source vendors for software.
As the effort matured, more 817 primary source vendors became involved and adopted OVAL in some way. 818 It quickly became evident that, while third party vendors made great 819 strides in determining how to evaluate the security posture of many 820 platforms and products, understanding the best way to evaluate is 821 hard, and very platform-specific. Additionally, OVAL content is 822 costly to create, even for seasoned content authors, due to the need 823 to understand these very low-level product and platform complexities. 825 6.9.2. SACM Implications 827 As cited above, the primary source vendors are best suited to provide 828 evaluation guidance. It is very challenging for third party 829 organizations to truly understand platform-specific evaluation. 830 Empower primary source vendors and other subject matter experts by 831 providing simple and effective ways to provide this information. 832 Also, as discussions on complex topics arise, engage these primary 833 source vendors to understand their valuable views. 835 7. Recommendations 837 In order to successfully standardize the mechanisms by which endpoint 838 posture assessment is performed, the following recommendations are 839 offered to SACM for consideration. 841 7.1. Use the OVAL System Characteristics Model for Encoding Collection 842 Data 844 The OVAL System Characteristics Model is used within the OVAL 845 Language in order to encode the underlying data collected as part of 846 endpoint posture assessment. Each of the posture attributes 847 collected by an OVAL-enabled tool can be represented using the OVAL 848 System Characteristics Model. As such, this model should be used to 849 inform the development of a data model to encode collected posture 850 attributes within SACM. 
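As a rough illustration of the kind of structure the OVAL System Characteristics Model provides, the following Python sketch builds a minimal system-characteristics-style document. The element names here are simplified for illustration and are not taken from the normative OVAL schema; the point is the three-part organization the model uses: document metadata (generator), endpoint identification, and the collected posture attributes themselves.

```python
import xml.etree.ElementTree as ET

def build_system_characteristics(os_name, host_name, items):
    # Simplified stand-in for an OVAL system-characteristics document;
    # element names are illustrative, not the normative schema.
    root = ET.Element("oval_system_characteristics")

    # Generator: who/what produced the document and when.
    generator = ET.SubElement(root, "generator")
    ET.SubElement(generator, "product_name").text = "example-collector"
    ET.SubElement(generator, "timestamp").text = "2016-09-07T00:00:00"

    # Endpoint identification information (asset-related data).
    system_info = ET.SubElement(root, "system_info")
    ET.SubElement(system_info, "os_name").text = os_name
    ET.SubElement(system_info, "primary_host_name").text = host_name

    # Collected posture attributes, one item per attribute set.
    system_data = ET.SubElement(root, "system_data")
    for item_id, attrs in items:
        item = ET.SubElement(system_data, "item", id=str(item_id))
        for name, value in attrs.items():
            ET.SubElement(item, name).text = value
    return ET.tostring(root, encoding="unicode")

doc = build_system_characteristics(
    "Windows 10", "host.example.com",
    [(1, {"path": r"C:\Windows", "version": "10.0.14393"})])
```

A SACM data model derived from this structure would, as noted above, need additional metadata (e.g., collection method, provenance) beyond what this sketch carries.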
852 The OVAL System Characteristics Model provides 853 metadata about the document (who/what created the document, creation 854 timestamp, etc.), endpoint identification information (OS name, host 855 name, and other asset-related information), and the foundational 856 constructs to allow the encoding of posture attributes. 857 It is understood that modifications to the model will be required in 858 order for it to fully implement all of the requirements for SACM. 859 However, the use of this well-supported, standardized mechanism for 860 encoding collected data is recommended as SACM begins moving from an 861 Information Model to Data Models and actual implementations. 863 The expectation is that SACM will need to make use of multiple types 864 of standardized formats to encompass a complete solution for endpoint 865 posture assessment. As such, the OVAL System Characteristics Model 866 could be used to inform the development of a data model for encoding 867 collected data from an endpoint. 869 7.2. Use the OVAL Definitions Model for Collection and Evaluation 870 Guidance 872 Similar to the OVAL System Characteristics Model, the OVAL 873 Definitions Model also has aspects that could be very useful in 874 guiding the development of a data model to capture Collection 875 Guidance. Collection Guidance is the mechanism by which a content 876 author can dictate what rules should be used for collecting data from 877 an endpoint. While the OVAL Definitions Model, as it is today, is 878 used to guide both Collection and Evaluation, it is well 879 suited to inform the development of a data model for Collection 880 Guidance. 882 This model provides several key features that should be used as 883 building blocks for this capability.
For instance, within the OVAL 884 Definitions Model, there is a series of structures that can serve as 885 the base for instructing tools as to what data must be collected, 886 including abstract structures for identifying required posture 887 attributes, Variables, and Functions (which allow several types of 888 data manipulation during collection). The model also supports a 889 number of different data types, such as strings, Booleans, integers, 890 records, and others. 892 While the recommendation is to make use of many of the structures 893 found within the OVAL Definitions Model, it is equally important to 894 note that the current approach for extending OVAL to various 895 platforms is flawed and should be fixed. Specifically, for every 896 new check that is to be added to the Language, a new concrete test 897 must be created. OVAL provides an abstract Test structure that must 898 be extended to create checks (e.g., "registry_test," "file_test," 899 "ldap_test," etc.). For SACM, it is imperative that a more scalable 900 and flexible approach be implemented. 902 One aspect of SACM that has been discussed, but only partially worked 903 into the Information Model at the time of this writing, is the concept of high-level, 904 platform-agnostic configuration items and low-level platform-specific 905 configuration items. In the discussed concept, the high-level items 906 will capture the concepts of configuration that must be defined by 907 those who write the guidance, while the low-level items will be 908 provided by the appropriate vendors and/or subject matter experts to 909 allow those that best know the platforms and products to instruct 910 data collection. With this approach in place, some of the concepts 911 defined within the OVAL Definitions Model (e.g., Objects, which 912 instruct tools as to what data to collect) will need to be modified 913 or removed to accommodate the shift in how posture attributes are 914 defined for Collection.
As such, the recommendation is to use many 915 of the underlying structures in the OVAL Definitions Model, including 916 the data types, Variables, Functions, etc., as a base from which to 917 build a complete solution for fulfilling the SACM Information Model. 919 In addition to its utility in supporting Collection Guidance, the same 920 OVAL Definitions Model should also be used to inform the development 921 of a data model for Evaluation Guidance. Again, with the current 922 OVAL Language, Collection and Evaluation are wrapped together in a 923 single model. The OVAL Definitions Model provides a series of 924 structures that can be used to support Boolean logic statements, 925 which could be useful for defining evaluation criteria and could be 926 used as the basis for a further enhanced data model for Evaluation 927 Guidance. 929 7.3. Do NOT Use the OVAL Results Model for Results Sharing 931 Despite the fact that the Results Model could be used to share the 932 results of the evaluation part of an endpoint posture assessment, the 933 recommendation is to not use this model to represent this information 934 within SACM. The OVAL Results Model has, over the years, been a 935 source of contention at times within the OVAL Community. Some feel 936 that it provides too little information, while others believe that it 937 offers too much. While there is some flexibility, in the form of 938 OVAL Directives, in how much or how little information is included in 939 the results, it is not flexible enough to handle the broad set 940 of requirements for SACM without extensive re-working. 942 Furthermore, SACM is working hard to separate data collection and 943 evaluation, which makes the OVAL Results Model a poor fit, as it was 944 constructed around a combined Collection and Evaluation framework. 945 It is expected that to properly model all of the results requirements 946 within SACM, an alternative solution will be required.
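To make concrete the kind of flexibility OVAL Directives provide, and why it is coarse-grained, the following sketch models per-result-class reporting control: for each result class, guidance states whether results of that class are reported at all and at what level of detail ("full" vs. "thin"). The class names mirror OVAL result values, but the data structures and filtering logic are illustrative only, not the normative OVAL Results processing.

```python
# Illustrative per-class reporting directives; each class states whether
# results of that class appear at all, and whether detail is kept.
DIRECTIVES = {
    "true":          {"reported": True,  "content": "full"},
    "false":         {"reported": True,  "content": "thin"},
    "error":         {"reported": True,  "content": "thin"},
    "unknown":       {"reported": False, "content": "thin"},
    "not evaluated": {"reported": False, "content": "thin"},
}

def apply_directives(results, directives=DIRECTIVES):
    """Drop unreported classes; strip detail from 'thin' results."""
    filtered = []
    for result in results:
        directive = directives.get(
            result["result"], {"reported": True, "content": "full"})
        if not directive["reported"]:
            continue  # this result class is suppressed entirely
        if directive["content"] == "thin":
            # Keep only the identifier and the overall result.
            result = {k: result[k] for k in ("definition_id", "result")}
        filtered.append(result)
    return filtered

results = [
    {"definition_id": "oval:example:def:1", "result": "true",
     "criteria": {"operator": "AND", "tests": ["t1", "t2"]}},
    {"definition_id": "oval:example:def:2", "result": "unknown",
     "criteria": {"operator": "OR", "tests": ["t3"]}},
]
filtered = apply_directives(results)
```

Note that the control is per result class for the whole document, not per consumer or per definition, which is one reason this mechanism falls short of the broad fidelity requirements discussed above.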
948 While considering an alternative way to encode the results of an 949 assessment, the following requirements have been stated by the OVAL 950 Community as critical factors and should be considered in the 951 development of a new data model for representing evaluation results: 953 o Allow evaluation results with appropriate granularity 955 o Ensure support for enterprise scale uses 957 o Provide results that include only the actionable information 959 o Ensure that data is clear and identifiable within the results 961 o Ensure interoperability 963 8. OVAL Submission 965 The OVAL submission to the IETF consists of seven Internet-Drafts 966 (I-Ds) that define the six core data models and the processing model 967 as described in Section 3.1. Each of the core data model I-Ds 968 includes the text from the OVAL Specification that defines the 969 specific data model as well as the corresponding XML Schema that 970 implements it. The I-D for the processing model includes the text 971 from the OVAL Specification that defines how each of the core data 972 models work together. Given that the processing model only describes how 973 the core data models work together, there is no XML Schema associated 974 with it. 976 The decision to split the OVAL Specification into separate 977 Internet-Drafts was made to encourage SACM to leverage the parts of 978 OVAL that make the most sense and to emphasize that OVAL is not a 979 monolithic data model but rather several distinct data models. 981 Moving forward, SACM should review each of the OVAL models, consider 982 the recommendations in this document, and determine what concepts 983 from OVAL make sense to build upon. From there, SACM should 984 prioritize its data model efforts with respect to Collection 985 Guidance, Evaluation Guidance, Posture Attributes, and Evaluation 986 Results as well as determine how the data models should be 987 implemented (e.g., JSON, XML).
Lastly, SACM should begin 988 development on its highest priority data model, leveraging OVAL 989 concepts where appropriate and making improvements and design 990 decisions based on lessons learned. 992 9. Alignment with the SACM Vulnerability Assessment Scenario 994 The SACM Vulnerability Assessment Scenario 995 [I-D.coffin-sacm-vuln-scenario] describes a concrete, operational 996 vulnerability management scenario in an effort to break the SACM 997 problem space into one of several more manageable pieces. 998 Specifically, the scenario focuses on the following steps to 999 determine which endpoints on an enterprise's network are in a 1000 vulnerable state: 1002 1. Endpoint identification and initial (pre-assessment) data 1003 collection 1005 2. Vulnerability description data 1007 3. Endpoint applicability and secondary assessment 1009 4. Assessment results 1011 The OVAL submission provides concepts and lessons learned that will 1012 be valuable in developing the data models for Collection Guidance, 1013 Evaluation Guidance, Posture Attributes, and Evaluation Results, which 1014 are necessary to support Steps 1, 2, and 4 of the scenario. However, 1015 OVAL does not provide any protocols or interfaces for communicating 1016 the configuration information that would be expressed using these 1017 data models. As a result, the Endpoint Compliance Profile [ECP], 1018 which provides an extensible framework for collecting, communicating, 1019 and evaluating endpoint information, could be extended to support 1020 these data models as it was for software inventory information 1021 expressed using Software Identification tags [ISO.19770-2]. 1023 The following sections describe how the OVAL submission fits into 1024 each of these steps. 1026 9.1.
Endpoint Identification and Initial Data Collection 1028 The first step of the SACM Vulnerability Assessment Scenario relies 1029 on the ongoing collection of basic information about an endpoint 1030 (e.g., type, criticality, hardware inventory, software inventory, 1031 configuration settings, etc.) to identify and characterize an 1032 endpoint. In order to do this, an Attribute Collector must first 1033 know what information to collect from an endpoint. This can either 1034 be hard-coded in an Attribute Collector or driven by 1035 Collection Guidance, with the latter being the more scalable approach. 1036 The OVAL submission, more specifically the Objects section of the 1037 OVAL Definitions Model, provides a data model for expressing what 1038 configuration information should be collected from an endpoint. This 1039 can be leveraged as a starting point for Collection Guidance that can 1040 be modified to accommodate the lessons learned around empowering 1041 subject matter experts (i.e., primary source vendors) to identify 1042 what configuration information should be collected on their platforms 1043 and not dictating how tools must collect this information from an 1044 endpoint. 1046 Once the configuration information has been collected from an 1047 endpoint, it needs to be expressed in a format that is consumable by 1048 other tools (i.e., Posture Attributes) for identification, 1049 correlation, and evaluation purposes. The OVAL submission includes 1050 the OVAL System Characteristics Model which can serve as a starting 1051 point for expressing this collected configuration information as 1052 Posture Attributes in a way that is scalable and enables subject 1053 matter experts to define it in a manner that makes sense to them. 1055 9.2.
Endpoint Applicability and Secondary Assessment 1057 In this step, the Posture Attribute information collected in Step 1 1058 is evaluated to determine the applicability of vulnerability 1059 description data to an endpoint and to determine if an endpoint is in 1060 a vulnerable state. If additional information is required to make 1061 these determinations, it can be collected during this step of the 1062 scenario. The OVAL submission aligns with this step in that it 1063 provides the concepts and lessons learned to drive the development of 1064 the Collection Guidance and Posture Attributes data models as 1065 described above. Furthermore, the OVAL Definitions Model and OVAL 1066 Processing Model, in the OVAL submission, provide a starting point 1067 for expressing the expected state of Posture Attribute information as 1068 well as defining the logical framework and algorithms necessary to 1069 compare the actual Posture Attribute information to the expected 1070 state defined in the Evaluation Guidance. 1072 9.3. Assessment Results 1074 Lastly, the OVAL submission aligns with the Assessment Results step 1075 of the scenario in that it identifies several problem areas that have 1076 impacted the usefulness of the OVAL Results Model, which in turn led 1077 to several community-defined requirements for the next-generation 1078 assessment results data model. These shortcomings and requirements 1079 supplement the information needs defined in the scenario and should 1080 be significant in shaping the next-generation data model for 1081 assessment results. 1083 10. Acknowledgements 1085 The authors would like to thank Brant Cheikes (MITRE), Juan Gonzalez 1086 (DHS), Adam Montville (CIS), Charles Schmidt (MITRE), David 1087 Waltermire (NIST), and Kim Watson (JHU APL) for reviewing this 1088 document and providing helpful feedback. 1090 11. IANA Considerations 1092 This memo includes no request to IANA. 1094 12.
Security Considerations 1096 This memo documents, for informational purposes, the mapping between 1097 the OVAL Data Models and the SACM Information Model as well as the 1098 lessons learned from the past 10+ years of developing OVAL. As a 1099 result, there are no specific security considerations. 1101 13. Change Log 1103 13.1. -02 to -03 1105 There are no textual changes associated with this revision. This 1106 revision simply reflects a resubmission of the document so that it 1107 remains in active status. 1109 13.2. -01 to -02 1111 Updated to reflect the latest changes to the SACM Information Model. 1113 Added text that describes how the OVAL submission is expected to be 1114 used by the SACM WG. 1116 Discusses how OVAL aligns with the SACM Vulnerability Assessment 1117 Scenario. 1119 Updated references to documents on the MITRE OVAL website to the OVAL 1120 community documentation site on GitHub.io. 1122 Added the OVAL Processing Model to the list of core models supported 1123 in OVAL. 1125 13.3. -00 to -01 1127 There are no textual changes associated with this revision. This 1128 revision simply reflects a resubmission of the document so that it 1129 goes back into active status. The document expired on November 6, 1130 2015. 1132 14. References 1134 14.1. Normative References 1136 [RFC2119] Bradner, S., "Key words for use in RFCs to Indicate 1137 Requirement Levels", BCP 14, RFC 2119, 1138 DOI 10.17487/RFC2119, March 1997, 1139 . 1141 14.2. Informative References 1143 [I-D.coffin-sacm-vuln-scenario] 1144 Coffin, C., Cheikes, B., Schmidt, C., Haynes, D., 1145 Fitzgerald-McKay, J., and D. Waltermire, "SACM 1146 Vulnerability Assessment Scenario", draft-coffin-sacm- 1147 vuln-scenario-01 (work in progress), January 2016. 1149 [I-D.ietf-sacm-architecture] 1150 Cam-Winget, N., Ford, B., Lorenzin, L., McDonald, I., and 1151 l. loxx@cisco.com, "Secure Automation and Continuous 1152 Monitoring (SACM) Architecture", 2015, 1153 . 
1156 [I-D.ietf-sacm-requirements] 1157 Cam-Winget, N. and L. Lorenzin, "Secure Automation and 1158 Continuous Monitoring (SACM) Requirements", 2015, 1159 . 1162 [ISO.19770-2] 1163 "Information technology -- Software asset management -- 1164 Part 2: Software identification tag", ISO/IEC 19770-2, 1165 2009. 1167 [OVAL-DEFINITION-TUTORIAL] 1168 United States Government, "The OVAL Definition Tutorial", 1169 2015, 1170 . 1172 [OVAL-DOCUMENTATION] 1173 United States Government, "The OVAL Definition Tutorial", 1174 2015, . 1176 [RFC1157] Case, J., Fedor, M., Schoffstall, M., and J. Davin, "A 1177 Simple Network Management Protocol (SNMP)", 1990, 1178 . 1180 [SACM] The IETF SACM WG, "IETF Security Automation and Continuous 1181 Monitoring (sacm) Working Group Charter", 2015, 1182 . 1184 [SACM-DOCUMENTS] 1185 The IETF SACM WG, "IETF Security Automation and Continuous 1186 Monitoring (sacm) Working Group Documents", 2015, 1187 . 1189 [STRUCTURE-OF-OVAL] 1190 The MITRE Corporation, "Structure of the OVAL Language", 1191 2015, . 1194 Authors' Addresses 1196 Matthew Hansbury 1197 The MITRE Corporation 1198 202 Burlington Road 1199 Bedford, MA 01730 1200 USA 1202 Email: mhansbury@mitre.org 1204 Daniel Haynes 1205 The MITRE Corporation 1206 202 Burlington Road 1207 Bedford, MA 01730 1208 USA 1210 Email: dhaynes@mitre.org 1212 Juan Gonzalez 1213 Department of Homeland Security 1214 245 Murray Lane 1215 Washington, DC 20548 1216 USA 1218 Email: juan.gonzalez@dhs.gov