Network Working Group                                         M. Bagnulo
Internet-Draft                                                      UC3M
Intended status: Best Current Practice                         B. Claise
Expires: August 16, 2014                             Cisco Systems, Inc.
                                                              P. Eardley
                                                                      BT
                                                               A. Morton
                                                               AT&T Labs
                                                       February 12, 2014

                   Registry for Performance Metrics
                draft-manyfolks-ippm-metric-registry-00

Abstract

   This document specifies the common aspects of the IANA registry for
   performance metrics, both active and passive categories.  This
   document also gives a set of guidelines for Registered Performance
   Metric requesters and reviewers.

Status of This Memo

   This Internet-Draft is submitted in full conformance with the
   provisions of BCP 78 and BCP 79.

   Internet-Drafts are working documents of the Internet Engineering
   Task Force (IETF).  Note that other groups may also distribute
   working documents as Internet-Drafts.  The list of current
   Internet-Drafts is at http://datatracker.ietf.org/drafts/current/.

   Internet-Drafts are draft documents valid for a maximum of six
   months and may be updated, replaced, or obsoleted by other documents
   at any time.  It is inappropriate to use Internet-Drafts as
   reference material or to cite them other than as "work in progress."

   This Internet-Draft will expire on August 16, 2014.

Copyright Notice

   Copyright (c) 2014 IETF Trust and the persons identified as the
   document authors.  All rights reserved.

   This document is subject to BCP 78 and the IETF Trust's Legal
   Provisions Relating to IETF Documents
   (http://trustee.ietf.org/license-info) in effect on the date of
   publication of this document.  Please review these documents
   carefully, as they describe your rights and restrictions with
   respect to this document.
   Code Components extracted from this document must include Simplified
   BSD License text as described in Section 4.e of the Trust Legal
   Provisions and are provided without warranty as described in the
   Simplified BSD License.

Table of Contents

   1.  Open Issues and Resolutions
   2.  Introduction
   3.  Terminology
   4.  Scope
   5.  Design Considerations for the Registry and Registered Metrics
     5.1.  Interoperability
     5.2.  Criteria for Registered Performance Metrics
     5.3.  Single Point of Reference for Performance Metrics
     5.4.  Side Benefits
   6.  Performance Metric Registry: Prior Attempt
     6.1.  Why Will This Attempt Succeed?
   7.  Common Columns of the Performance Metric Registry
     7.1.  Performance Metrics Identifier
     7.2.  Performance Metrics Name
     7.3.  Performance Metrics Status
     7.4.  Performance Metrics Requester
     7.5.  Performance Metrics Revision
     7.6.  Performance Metrics Revision Date
     7.7.  Performance Metrics Description
     7.8.  Reference Specification(s)
   8.  The Life-Cycle of Registered Metrics
     8.1.  The Process for Review by the Performance Metric Experts
     8.2.  Revising Registered Performance Metrics
     8.3.  Deprecating Registered Performance Metrics
   9.  Performance Metric Registry and other Registries
   10. Security Considerations
   11. IANA Considerations
   12. Acknowledgments
   13. References
     13.1.  Normative References
     13.2.  Informative References
   Authors' Addresses

1.  Open Issues and Resolutions

   1.   I believe that the Performance Metrics Experts and the
        Performance Metrics Directorate will be different groups of
        people.  Reason: every single time a new expert is added, the
        IESG needs to approve her/him.  To be discussed with the Area
        Directors.  *** (v7) Has this discussion taken place?  If these
        are different groups, we don't need to define the Performance
        Metrics Directorate.

   2.   We should expand on the different roles and responsibilities of
        the Performance Metrics Experts versus the Performance Metrics
        Directorate.  At least, the Performance Metrics Directorate one
        should be expanded.  --- (v7) If these are different entities,
        our only concern is the role of the "PM Experts".

   3.   Not sure whether it is worthwhile for this document to go into
        the details of the LMAP control protocol versus the report
        protocol (see the 'Interoperability' section).  (The text
        currently does this in several sections, S5 comes to mind -
        Closed)

   4.   Marcelo, not sure what you mean by 'Single point of reference'.
        (Closed - see S5.3)

   5.   Define 'Measurement Parameter'.  Even if this is an active-
        monitoring-specific term, we need it in this draft.  Done in v3
        Terminology section as "Input Parameter". - Closed in v7 as
        "Parameter".

   6.   Performance Metric Description: part of this document or of the
        active/passive monitoring documents.  -- Closed, will be part
        of the Active & Passive docs.

   7.   Many aspects of the naming convention are TBD, and need
        discussion.  For example, we have distinguished RTCP-XR metrics
        as End-Point (neither active nor passive in the traditional
        sense, so not Act_ or Pas_).  Also, the Act_ or Pas_ component
        is not consistent with "camel_case", as Marcelo points out.
        Even though we may not cast all naming conventions in stone at
        the start, it will be helpful to look at several examples of
        passive metric names now.

   8.   RTCP-XR metrics are currently referred to as "end-point", and
        have aspects that are similar to active measurement (the
        measured stream characteristics are known a priori and
        measurement commonly takes place at the end-points of the
        path) and to passive measurement (there is no additional
        traffic dedicated to measurement, with the exception of the
        RTCP report packets themselves).  We have one example
        expressing an end-point metric in the active sub-registry memo.

   9.   Revised Registry Entries: keep for history (deprecated) or
        delete?

   10.  In Section 7, defining the Registry Common Columns, almost all
        column names begin with "Performance Metric".  Al recommends
        deleting this prefix in each sub-section as redundant.

2.  Introduction

   The IETF specifies and uses Performance Metrics of protocols and
   applications transported over its protocols.  Performance Metrics
   are such an important part of the operations of IETF protocols that
   [RFC6390] specifies guidelines for their development.

   The definition and use of Performance Metrics in the IETF happens in
   various working groups (WGs), most notably:

      The "IP Performance Metrics" (IPPM) WG is the WG primarily
      focusing on Performance Metrics definition at the IETF.

      The "Metric Blocks for use with RTCP's Extended Report Framework"
      (XRBLOCK) WG recently specified many Performance Metrics related
      to "RTP Control Protocol Extended Reports (RTCP XR)" [RFC3611],
      which establishes a framework to allow new information to be
      conveyed in RTCP, supplementing the original report blocks
      defined in "RTP: A Transport Protocol for Real-Time
      Applications" [RFC3550].

      The "Benchmarking Methodology" WG (BMWG) defined many Performance
      Metrics for use in laboratory benchmarking of inter-networking
      technologies.

      In the "IP Flow Information eXport" (IPFIX) WG, Information
      Elements related to Performance Metrics are currently proposed.

      The concluded "Performance Metrics for Other Layers" (PMOL) WG
      defined some Performance Metrics related to Session Initiation
      Protocol (SIP) voice quality [RFC6035].

   It is expected that more Performance Metrics will be defined in the
   future, not only IP-based metrics, but also metrics which are
   protocol-specific and application-specific.

   However, despite the importance of Performance Metrics, there are
   two related problems for the industry.
   First, how to ensure that when one party requests that another party
   measure (or report or in some way act on) a particular Performance
   Metric, both parties have exactly the same understanding of which
   Performance Metric is being referred to.  Second, how to discover
   which Performance Metrics have already been specified, so as to
   avoid developing a new Performance Metric that is very similar.
   These problems can be addressed by creating a registry of
   Performance Metrics.  The usual way in which the IETF organizes
   namespaces is with Internet Assigned Numbers Authority (IANA)
   registries, and there is currently no Performance Metrics Registry
   maintained by IANA.

   This document therefore proposes the creation of a Performance
   Metrics Registry.  It also provides best practices on how to define
   new or updated entries in the Performance Metrics Registry.

3.  Terminology

   The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT",
   "SHOULD", "SHOULD NOT", "RECOMMENDED", "NOT RECOMMENDED", "MAY", and
   "OPTIONAL" in this document are to be interpreted as described in
   [RFC2119].

   The terms Performance Metric and Performance Metrics Directorate are
   defined in [RFC6390], and are copied over in this document for the
   reader's convenience.

   Registered Performance Metric:  A Registered Performance Metric (or
      Registered Metric) is a quantitative measure of performance (see
      Section 6.1 of [RFC2330]) expressed as an entry in the
      Performance Metric Registry.  It comprises a specifically named
      metric which has met all the registry review criteria, is under
      the curation of the IETF Performance Metrics Experts, and whose
      changes are controlled by IANA.

   Registry or Performance Metrics Registry:  The IANA registry
      containing Registered Performance Metrics.

   Non-IANA Registry:  A set of metrics that are registered locally
      (and not by IANA).

   Performance Metrics Experts:  The Performance Metrics Experts are a
      group of experts selected by the IESG to validate Performance
      Metrics before the Performance Metrics Registry is updated.  The
      Performance Metrics Experts work closely with IANA.

   Performance Metrics Directorate:  The Performance Metrics
      Directorate is a directorate that provides guidance for
      Performance Metrics development in the IETF.  The Performance
      Metrics Directorate should be composed of experts in the
      performance community, potentially selected from the IP
      Performance Metrics (IPPM), Benchmarking Methodology (BMWG), and
      Performance Metrics for Other Layers (PMOL) WGs.

   Parameter:  An input factor defined as a variable in the definition
      of a metric; a numerical or other specified factor forming one
      of a set that defines a metric or sets the conditions of its
      operation.  Most Input Parameters do not change the fundamental
      nature of the metric's definition, but others have substantial
      influence.  All Input Parameters must be known in order to
      measure using a metric and to interpret the results.

   Active Measurement Method:  Methods of Measurement conducted on
      traffic which serves only the purpose of measurement and is
      generated for that reason alone, and whose traffic
      characteristics are known a priori.  An Internet user's host can
      generate active measurement traffic (virtually all typical
      user-generated traffic is not dedicated to active measurement,
      but a host can produce such traffic with the necessary
      application operating).
   Passive Measurement Method:  Methods of Measurement conducted on
      Internet user traffic such that sensitive information is present
      and may be stored in the measurement system, or observations of
      traffic from other sources for monitoring and measurement
      purposes.

   Hybrid Measurement Method:  Methods of Measurement which use a
      combination of Active Measurement and Passive Measurement
      methods.

4.  Scope

   The intended audience of this document includes those who prepare
   and submit a request for a Registered Performance Metric, and the
   Performance Metric Experts who review a request.

   This document specifies a Performance Metrics Registry in IANA.
   This Performance Metric Registry is applicable to Performance
   Metrics issued from Active Measurement, Passive Measurement, or
   from end-point calculation.  This registry is designed to encompass
   Performance Metrics developed throughout the IETF, especially in
   the following working groups: IPPM, XRBLOCK, IPFIX, BMWG, and
   possibly others.  This document analyzes a prior attempt to set up
   a Performance Metric Registry, and the reasons why that design was
   inadequate [RFC6248].  Finally, this document gives a set of
   guidelines for requesters and expert reviewers of candidate
   Registered Performance Metrics.

   This document serves as the foundation for further work.  It
   specifies the set of columns describing common aspects necessary
   for all entries in the Performance Metrics Registry.

   Two documents describing sub-registries will be developed
   separately: one for active Registered Metrics and another one for
   passive Registered Metrics.  Indeed, active and passive Performance
   Metrics appear to have different characteristics which must be
   documented in their respective sub-registries.  For example, active
   measurement methods must specify the packet stream characteristics
   they generate and measure, so it is essential to include the stream
   specifications in the registry entry.  In the case of passive
   Performance Metrics, there is a need to specify the sampling
   distribution in the registry.  While it would be possible to force
   the definition of a single registry field to cover both types of
   distributions in the same registry column, we believe it is cleaner
   and clearer to have separate sub-registries with different, narrowly
   defined columns.

   It is possible that future metrics may be a hybrid of active and
   passive measurement methods.  It may be possible to register hybrid
   metrics using one of the two planned sub-registries (active or
   passive), or it may be more efficient to define a third sub-registry
   with unique columns.  The current design with sub-registries allows
   for growth, and this is a recognized option for extension.

   This document makes no attempt to populate the registry with
   initial entries.

   Based on [RFC5226] Section 4.3, this document is processed as a
   Best Current Practice (BCP) [RFC2026].

5.  Design Considerations for the Registry and Registered Metrics

   In this section, we detail several design considerations that are
   relevant for understanding the motivations and expected use of the
   metric registry.

5.1.  Interoperability

   As with any IETF registry, the primary use of a registry is to
   manage a namespace for its use within one or more protocols.  In
   the particular case of the metric registry, there are two types of
   protocols that will use the values defined in the registry for
   their operation:

   o  Control protocol: this type of protocol is used to allow one
      entity to request that another entity perform a measurement
      using a specific metric defined by the registry.  One particular
      example is the LMAP framework [I-D.ietf-lmap-framework].  Using
      the LMAP terminology, the registry is used in the LMAP Control
      protocol to allow a Controller to request a measurement task
      from one or more Measurement Agents.  In order to enable this
      use case, the entries of the metric registry must be defined
      well enough to allow a Measurement Agent implementation to
      trigger a specific measurement task upon the reception of a
      control protocol message.  This requirement heavily constrains
      the type of entries that are acceptable for the Metric registry.

   o  Report protocol: this type of protocol is used to allow an
      entity to report measurement results to another entity.  By
      referencing a specific entry in the metric registry, it is
      possible to properly characterize the measurement result data
      being transferred.  Using the LMAP terminology, the registry is
      used in the Report protocol to allow a Measurement Agent to
      report measurement results to a Collector.
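   To illustrate the control-protocol use case, the following sketch
   (in Python) shows a Measurement Agent dispatching a measurement
   request on a registry identifier alone, and echoing that identifier
   in its report.  The message layout, the example identifier (7), and
   the function names are hypothetical: neither the LMAP protocols nor
   any registry entries are defined yet.

      # Hypothetical sketch: a Measurement Agent (MA) runs a task
      # identified only by a Registered Performance Metric identifier.
      def measure_udp_latency(src, dst):
          """Stub standing in for a real metric implementation."""
          return {"99mean_ms": 12.3}  # placeholder result

      REGISTRY = {
          # identifier -> callable implementing that Registered Metric
          7: lambda params: measure_udp_latency(**params),
      }

      def handle_control_message(msg):
          """MA side: dispatch a request on its registry identifier."""
          metric_id = msg["metric_id"]
          params = msg["input_parameters"]  # e.g., source/destination
          if metric_id not in REGISTRY:
              raise ValueError("unsupported Registered Metric")
          # The report carries the same identifier, so a Collector
          # can characterize the result data unambiguously.
          return {"metric_id": metric_id,
                  "result": REGISTRY[metric_id](params)}

      # Controller side: the request names the metric only by its
      # registry identifier plus the open Input Parameters.
      report = handle_control_message(
          {"metric_id": 7,
           "input_parameters": {"src": "192.0.2.1",
                                "dst": "198.51.100.2"}})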
5.2.  Criteria for Registered Performance Metrics

   It is neither possible nor desirable to populate the registry with
   all combinations of input parameters of all Performance Metrics.
   Registered Performance Metrics should be:

   1.  interpretable by the user.

   2.  implementable by the software designer.

   3.  deployable by network operators, without major impact on the
       networks.

   4.  accurate, for interoperability and deployment across vendors.

   In essence, there needs to be evidence that a candidate registry
   entry has significant industry interest, or has seen deployment,
   and there is agreement that the candidate Registered Metric serves
   its intended purpose.

5.3.  Single Point of Reference for Performance Metrics

   A registry for Performance Metrics serves as a single point of
   reference for Performance Metrics defined in different working
   groups in the IETF.  As mentioned earlier, several WGs define
   Performance Metrics in the IETF, and it is hard to keep track of
   them all.  This results in multiple definitions of similar metrics
   that attempt to measure the same phenomena, but in slightly
   different (and incompatible) ways.  Having a registry would allow
   both the IETF community and external people to have a single list
   of relevant Performance Metrics defined by the IETF (and others,
   where appropriate).  The single list is also an essential aspect of
   communication about metrics, where different entities that request
   measurements, execute measurements, and report the results can
   benefit from a common understanding of the referenced metric.

5.4.  Side Benefits

   There are a couple of side benefits of having such a registry.
   First, the registry could serve as an inventory of useful and used
   metrics, which are normally supported by different implementations
   of measurement agents.  Second, the results of the metrics would be
   comparable even if they are produced by different implementations
   and in different networks, as the metric is properly defined.
   BCP 176 [RFC6576] examines whether the results produced by
   independent implementations are equivalent in the context of
   evaluating the completeness and clarity of metric specifications.
   This BCP defines the standards-track advancement testing for
   (active) IPPM metrics, and the same process will likely suffice to
   determine whether registry entries are sufficiently well specified
   to result in comparable (or equivalent) results.  Registry entries
   which have undergone such testing SHOULD be noted, with a reference
   to the test results.

6.  Performance Metric Registry: Prior Attempt

   There was a previous attempt to define a metric registry, in RFC
   4148 [RFC4148].  However, it was obsoleted by RFC 6248 [RFC6248]
   because it was "found to be insufficiently detailed to uniquely
   identify IPPM metrics... [there was too much] variability possible
   when characterizing a metric exactly", which led to the RFC 4148
   registry having "very few users, if any".

   A couple of interesting additional quotes from RFC 6248 might help
   understand the issues related to that registry.

   1.  "It is not believed to be feasible or even useful to register
       every possible combination of Type P, metric parameters, and
       Stream parameters using the current structure of the IPPM
       Metrics Registry."

   2.  "The registry structure has been found to be insufficiently
       detailed to uniquely identify IPPM metrics."

   3.  "Despite apparent efforts to find current or even future users,
       no one responded to the call for interest in the RFC 4148
       registry during the second half of 2010."

   The current approach learns from this by tightly defining each
   entry in the registry, with only a few parameters open, if any.
   The idea is that entries in the registry represent different
   measurement methods which require input parameters to set factors
   like source and destination addresses (which do not change the
   fundamental nature of the measurement).  The downside of this
   approach is that it could result in a large number of entries in
   the registry.  We believe that less is more in this context: it is
   better to have a reduced set of useful metrics than a large set of
   metrics of questionable usefulness.  Therefore, this document
   specifies that the registry only include metrics that are well
   defined and that have proven to be operationally useful.  In order
   to guarantee these two characteristics, we require that a set of
   experts review each allocation request to verify that the metric is
   well defined and operationally useful.

6.1.  Why Will This Attempt Succeed?

   The registry defined in this document addresses the main issues
   identified in the previous attempt.  As mentioned in the previous
   section, one of the main issues with the previous registry was that
   the metrics contained in it were too generic to be useful.  For
   this registry, registration requests are evaluated by an expert
   group that will make sure that the metric is properly defined.
   This document provides guidelines to assess whether a metric is
   properly defined.

   Another key difference between this attempt and the previous one is
   that in this case there is at least one clear user for the
   registry: the LMAP framework and protocol.  Because the LMAP
   protocol will use the registry values in its operation, this
   actually helps to determine whether a metric is properly defined.
   In particular, since we expect that the LMAP control protocol will
   enable a Controller to request that a Measurement Agent perform a
   measurement using a given metric by embedding the metric registry
   value in the protocol, a metric is properly specified if it is
   defined well enough that it is possible (and practical) to
   implement the metric in the Measurement Agent.  This was clearly
   not the case for the previous attempt: defining a metric with an
   undefined Type P makes its implementation impractical.

7.  Common Columns of the Performance Metric Registry

   The metric registry is composed of two sub-registries: the registry
   for active Performance Metrics and the registry for passive
   Performance Metrics.  The rationale for having two sub-registries
   (as opposed to a single registry for all metrics) is that the set
   of registry columns must support unambiguous registry entries, and
   there are fundamental differences in the methods used to collect
   active and passive metrics and in the required input parameters.
   Forcing them into a single, generalized registry would result in a
   less meaningful structure for some entries in the registry.
   Nevertheless, it is desirable that the two sub-registries share the
   same structure as much as possible.  In particular, both registries
   will share the following columns: the identifier, the name, the
   requester, the revision, the revision date, and the description.
   All these fields are described below.  The design of these two
   sub-registries is work in progress.

7.1.  Performance Metrics Identifier

   A numeric identifier for the Registered Performance Metric.  This
   identifier must be unique within the Performance Metric Registry
   and its sub-registries.

   The Registered Performance Metric unique identifier is a 16-bit
   integer (range 0 to 65535).  When adding newly Registered
   Performance Metrics to the Performance Metric Registry, IANA should
   assign the lowest available identifier to the next active
   monitoring Registered Performance Metric, and the highest available
   identifier to the next passive monitoring Registered Performance
   Metric.

7.2.  Performance Metrics Name

   As the name of a Registered Performance Metric is the first thing a
   potential implementor will use when determining whether it is
   suitable for a given application, it is important to be as precise
   and descriptive as possible.  Names of Registered Performance
   Metrics:

   1.  "must be chosen carefully to describe the Registered
       Performance Metric and the context in which it will be used."

   2.  "should be unique within the Performance Metric Registry
       (including sub-registries)."

   3.  "must use capital letters for the first letter of each
       component.  All other letters are lowercase, even for acronyms.
       Exceptions are made for acronyms containing a mixture of
       lowercase and capital letters, such as 'IPv4' and 'IPv6'."

   4.  "must use '_' between each component composing the Registered
       Performance Metric name."

   5.  "must start with the prefix Act_ for an active measurement
       Registered Performance Metric."

   6.  "must start with the prefix Pass_ for a passive monitoring
       Registered Performance Metric."  AL COMMENTS: how about just 3
       letters for consistency: "Pas_"

   7.  MARCELO: I am uncertain whether we should give more guidance
       here for the naming convention.  In particular, the second
       component could be the highest protocol used in the metric
       (e.g., UDP, TCP, DNS, SIP, ICMP, IPv4, etc.), the third
       component could be a descriptive name (like latency, packet
       loss or similar), the fourth component could be the stream
       distribution, and the fifth component could be the output type
       (99mean, 95interval).  This is of course very active-metric
       oriented; it would be good if we could figure out the minimum
       common structure for both passive and active metrics.  TBD.  AL
       COMMENTS: Let's see some examples for passive monitoring.  It
       may not make sense to have common name components, except for
       Act_ and Pas_.

   8.  BENOIT proposes (approximately this, Al's wording): The
       remaining rules for naming are left to the Performance Experts
       to determine as they gather experience, so this is an area of
       planned update by a future RFC.

   An example is "Act_UDP_Latency_Poisson_99mean" for an active
   monitoring UDP latency metric using a Poisson stream of packets and
   producing the 99th percentile mean as output.

   >>>> NEED passive naming examples.
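   As a rough illustration of rules 3-6 above, the sketch below
   assembles a candidate name from components and checks its form.
   The fixed five-component structure follows the tentative proposal
   in item 7 and is an assumption, not a settled convention; the
   "Pas" prefix variant from Al's comment is accepted alongside
   "Pass".

      import re

      def make_metric_name(prefix, protocol, quantity, stream, output):
          """Join name components with '_' (rule 4)."""
          return "_".join([prefix, protocol, quantity, stream, output])

      # A component: leading capital or digit, then letters/digits.
      # Digits are tolerated up front for outputs such as '99mean',
      # although rule 3 strictly asks for a leading capital letter.
      COMPONENT = re.compile(r"^[A-Z0-9][A-Za-z0-9]*$")

      def check_metric_name(name):
          """Check rules 3-6: prefix, '_' separators, capitalization."""
          parts = name.split("_")
          if parts[0] not in ("Act", "Pass", "Pas"):  # rules 5 and 6
              return False
          return all(COMPONENT.match(p) for p in parts[1:])

      assert check_metric_name(
          make_metric_name("Act", "UDP", "Latency", "Poisson",
                           "99mean"))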
7.3.  Performance Metrics Status

   The status of the specification of this Registered Performance
   Metric.  Allowed values are 'current' and 'deprecated'.  All newly
   defined Registered Performance Metrics have 'current' status.

7.4.  Performance Metrics Requester

   The requester for the Registered Performance Metric.  The requester
   may be a document, such as an RFC, or a person.

7.5.  Performance Metrics Revision

   The revision number of a Registered Performance Metric, starting at
   0 for Registered Performance Metrics at the time of definition and
   incremented by one for each revision.

7.6.  Performance Metrics Revision Date

   The date of acceptance or of the most recent revision for the
   Registered Performance Metric.

7.7.  Performance Metrics Description

   A Registered Performance Metric Description is a written
   representation of a particular registry entry.  It supplements the
   metric name to help registry users select relevant Registered
   Performance Metrics.

7.8.  Reference Specification(s)

   Registry entries that follow the common columns must provide the
   reference specification(s) on which the Registered Performance
   Metric is based.
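   To make the common columns concrete, here is a sketch of a registry
   entry as a record, together with the identifier-assignment rule of
   Section 7.1 (lowest free value for active entries, highest free
   value for passive ones).  The field and function names are purely
   illustrative; they simply mirror the columns defined above.

      import datetime
      from dataclasses import dataclass

      @dataclass
      class RegistryEntry:
          """One registry row holding the common columns."""
          identifier: int               # unique 16-bit value, 0..65535
          name: str                     # e.g., "Act_UDP_Latency_..."
          status: str                   # 'current' or 'deprecated'
          requester: str                # requesting document or person
          revision: int                 # starts at 0, +1 per revision
          revision_date: datetime.date  # date of most recent revision
          description: str              # supplements the name
          reference: str                # reference specification(s)

      def assign_identifier(used_ids, kind):
          """Active IDs grow from 0; passive IDs from 65535 down."""
          ids = (range(65536) if kind == "active"
                 else range(65535, -1, -1))
          for i in ids:
              if i not in used_ids:
                  return i
          raise RuntimeError("identifier space exhausted")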
8.  The Life-Cycle of Registered Metrics

   Once a Performance Metric or set of Performance Metrics has been
   identified for a given application, candidate registry entry
   specifications in accordance with Section X are submitted to IANA
   to follow the process for review by the Performance Metric Experts,
   as defined below.  This process is also used for other changes to
   the Performance Metric Registry, such as deprecation or revision,
   as described later in this section.

   It is also desirable that the author(s) of a candidate registry
   entry seek review in the relevant IETF working group, or offer the
   opportunity for review on the WG mailing list.

8.1.  The Process for Review by the Performance Metric Experts

   Requests to change Registered Metrics in the Performance Metric
   Registry or a linked sub-registry are submitted to IANA, which
   forwards each request to a designated group of experts (the
   Performance Metric Experts) appointed by the IESG; these are the
   reviewers called for by the Expert Review [RFC5226] policy defined
   for the Performance Metric Registry.  The Performance Metric
   Experts review the request for such things as compliance with this
   document, compliance with other applicable Performance
   Metric-related RFCs, and consistency with the currently defined set
   of Registered Performance Metrics.

   Authors are expected to review compliance with the specifications
   in this document to check their submissions before sending them to
   IANA.

   The Performance Metric Experts should endeavor to complete referred
   reviews in a timely manner.  If the request is acceptable, the
   Performance Metric Experts signify their approval to IANA, which
   changes the Performance Metric Registry.  If the request is not
   acceptable, the Performance Metric Experts can coordinate with the
   requester to change the request to be compliant.  The Performance
   Metric Experts may also choose, in exceptional circumstances, to
   reject clearly frivolous or inappropriate change requests outright.

   This process should not in any way be construed as allowing the
   Performance Metric Experts to overrule IETF consensus.
   Specifically, any Registered Metrics that were added with IETF
   consensus require IETF consensus for revision or deprecation.

   Decisions by the Performance Metric Experts may be appealed as in
   Section 7 of [RFC5226].

8.2.  Revising Registered Performance Metrics

   Requests to revise the Performance Metric Registry or a linked
   sub-registry are submitted to IANA, which forwards the request to
   the designated group of experts (the Performance Metric Experts)
   appointed by the IESG; these are the reviewers called for by the
   Expert Review [RFC5226] policy defined for the Performance Metric
   Registry.  The Performance Metric Experts review the request for
   such things as compliance with this document, compliance with other
   applicable Performance Metric-related RFCs, and consistency with
   the currently defined set of Registered Performance Metrics.

   A request for revision is ONLY permissible when the changes
   maintain backward compatibility with implementations of the prior
   registry entry describing a Registered Metric (entries with lower
   revision numbers, but the same Identifier and Name).

   The purpose of the Status field in the Performance Metric Registry
   is to indicate whether the entry for a Registered Metric is
   'current' or 'deprecated'.

   To date, no policy has been defined for revising IANA Performance
   Metric entries or addressing errors therein.  To be clear, changes
   and deprecations within the Performance Metric Registry are not
   encouraged, and should be avoided to the extent possible.  However,
   in recognition that change is inevitable, the provisions of this
   section address the need for revisions.

   Revisions are initiated by sending a candidate Registered
   Performance Metric definition to IANA, as in Section X, identifying
   the existing registry entry.

   The primary requirement in the definition of a policy for managing
   changes to existing Registered Performance Metrics is avoidance of
   interoperability problems; Performance Metric Experts must work to
   maintain interoperability above all else.  Changes to Registered
   Performance Metrics already in use may only be made in an
   interoperable way; necessary changes that cannot be made in a way
   that allows interoperability with unchanged implementations must
   result in the deprecation of the earlier metric.
   A change to a Registered Performance Metric is held to be
   backward-compatible only when:

   1.  "it involves the correction of an error that is obviously only
       editorial; or"

   2.  "it corrects an ambiguity in the Registered Performance
       Metric's definition, which itself leads to issues severe enough
       to prevent the Registered Performance Metric's usage as
       originally defined; or"

   3.  "it corrects missing information in the metric definition
       without changing its meaning (e.g., the explicit definition of
       'quantity' semantics for numeric fields without a Data Type
       Semantics value); or"

   4.  "it harmonizes with an external reference that was itself
       corrected."

   5.  "BENOIT: NOTE THAT THERE ARE MORE RULES IN RFC 7013 SECTION 5,
       BUT THEY WOULD ONLY APPLY TO THE ACTIVE/PASSIVE DRAFTS.  TO BE
       DISCUSSED."

   If a change is deemed permissible by the Performance Metric
   Experts, IANA makes the change in the Performance Metric Registry.
   The requester of the change is appended to the requester column in
   the registry.

   Each Registered Performance Metric in the Registry has a revision
   number, starting at zero.  Each change to a Registered Performance
   Metric following this process increments the revision number by
   one.

   COMMENT: Al (and Phil) think we should keep old/revised entries
   as-is, marked as deprecated >>>> Since any revision must be
   interoperable according to the criteria above, there is no need for
   the Performance Metric Registry to store information about old
   revisions.

   When a revised Registered Performance Metric is accepted into the
   Performance Metric Registry, the date of acceptance of the most
   recent revision is placed into the Revision Date column of the
   registry for that Registered Performance Metric.

   Where applicable, additions to registry entries in the form of text
   Comments or Remarks should include the date, but such additions may
   not constitute a revision according to this process.

8.3.  Deprecating Registered Performance Metrics

   Changes that are not permissible under the above criteria for the
   revision of Registered Metrics may only be handled by deprecation.
   A Registered Performance Metric MAY be deprecated and replaced
   when:

   1.  "the Registered Performance Metric definition has an error or
       shortcoming that cannot be permissibly changed as in Section
       'Revising Registered Performance Metrics'; or"

   2.  "the deprecation harmonizes with an external reference that was
       itself deprecated through that reference's accepted deprecation
       method."

   A request for deprecation is sent to IANA, which passes it to the
   Performance Metric Experts for review, as in Section 'The Process
   for Review by the Performance Metric Experts'.  When deprecating a
   Performance Metric, the Performance Metric description in the
   Performance Metric Registry must be updated to explain the
   deprecation, as well as to refer to any new Performance Metrics
   created to replace the deprecated Performance Metric.

   The revision number of a Registered Performance Metric is
   incremented upon deprecation, and the Revision Date updated, as
   with any revision.

   The use of deprecated Registered Metrics should result in a log
   entry or human-readable warning by the respective application.

   The Names and Metric IDs of deprecated Registered Metrics must not
   be reused.
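   The bookkeeping of Sections 8.2 and 8.3 can be summarized in a
   short sketch, reusing the illustrative RegistryEntry record from
   Section 7: an approved revision increments the revision number,
   updates the Revision Date, and appends the new requester, while a
   deprecation is handled like a revision whose description explains
   the deprecation.  Names and identifiers are never reused.

      def revise(entry, new_description, requester, today):
          """Apply an approved, backward-compatible revision (8.2)."""
          entry.revision += 1                  # revision number +1
          entry.revision_date = today          # Revision Date column
          entry.requester += ", " + requester  # requester is appended
          entry.description = new_description
          return entry

      def deprecate(entry, explanation, today):
          """Deprecate an entry (8.3); treated like a revision."""
          entry.status = "deprecated"
          # The description must explain the deprecation and point to
          # any replacement Registered Performance Metric.
          entry.description += "  DEPRECATED: " + explanation
          entry.revision += 1
          entry.revision_date = today
          # Name and Identifier must not be reused; the deprecated
          # entry therefore remains in the registry.
          return entry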
9.  Performance Metric Registry and other Registries

   BENOIT: TBD.

   THE BASIC IDEA IS THAT PEOPLE COULD DIRECTLY DEFINE PERF. METRICS
   IN OTHER EXISTING REGISTRIES, FOR A SPECIFIC PROTOCOL/ENCODING.
   EXAMPLE: IPFIX.  IDEALLY, ALL PERF. METRICS SHOULD BE DEFINED IN
   THIS REGISTRY AND REFERRED TO FROM OTHER REGISTRIES.

10.  Security Considerations

   This draft doesn't introduce any new security considerations for
   the Internet.  However, the definition of Performance Metrics may
   introduce some security concerns, and should be reviewed with
   security in mind.

11.  IANA Considerations

   This document specifies the procedure for setting up the
   Performance Metrics Registry.  IANA is requested to create a new
   registry for Performance Metrics called "Registered Performance
   Metrics".

   This Performance Metrics Registry contains two sub-registries, one
   for active and another one for passive Performance Metrics.  These
   sub-registries are not defined in this document.  However, these
   two sub-registries MUST contain the following columns: the
   identifier, the name, the requester, the revision, the revision
   date, and the description, as specified in this document.

   New assignments for the Performance Metric Registry will be
   administered by IANA through Expert Review [RFC5226], i.e., review
   by one of a group of experts, the Performance Metric Experts,
   appointed by the IESG upon recommendation of the Transport Area
   Directors.  The experts will initially be drawn from the Working
   Group Chairs and document editors of the Performance Metrics
   Directorate.

12.  Acknowledgments

   Thanks to Brian Trammell and Bill Cerveny, IPPM chairs, for leading
   some brainstorming sessions on this topic.

13.  References

13.1.  Normative References

   [RFC2026]  Bradner, S., "The Internet Standards Process -- Revision
              3", BCP 9, RFC 2026, October 1996.

   [RFC2119]  Bradner, S., "Key words for use in RFCs to Indicate
              Requirement Levels", BCP 14, RFC 2119, March 1997.

   [RFC2330]  Paxson, V., Almes, G., Mahdavi, J., and M. Mathis,
              "Framework for IP Performance Metrics", RFC 2330, May
              1998.

   [RFC4148]  Stephan, E., "IP Performance Metrics (IPPM) Metrics
              Registry", BCP 108, RFC 4148, August 2005.

   [RFC5226]  Narten, T. and H. Alvestrand, "Guidelines for Writing an
              IANA Considerations Section in RFCs", BCP 26, RFC 5226,
              May 2008.

   [RFC6248]  Morton, A., "RFC 4148 and the IP Performance Metrics
              (IPPM) Registry of Metrics Are Obsolete", RFC 6248,
              April 2011.

   [RFC6390]  Clark, A. and B. Claise, "Guidelines for Considering New
              Performance Metric Development", BCP 170, RFC 6390,
              October 2011.

   [RFC6576]  Geib, R., Morton, A., Fardid, R., and A. Steinmitz, "IP
              Performance Metrics (IPPM) Standard Advancement
              Testing", BCP 176, RFC 6576, March 2012.

13.2.  Informative References

   [RFC3550]  Schulzrinne, H., Casner, S., Frederick, R., and V.
              Jacobson, "RTP: A Transport Protocol for Real-Time
              Applications", STD 64, RFC 3550, July 2003.

   [RFC3611]  Friedman, T., Caceres, R., and A. Clark, "RTP Control
              Protocol Extended Reports (RTCP XR)", RFC 3611, November
              2003.

   [RFC6035]  Pendleton, A., Clark, A., Johnston, A., and H.
              Sinnreich, "Session Initiation Protocol Event Package
              for Voice Quality Reporting", RFC 6035, November 2010.

   [I-D.ietf-lmap-framework]
              Eardley, P., Morton, A., Bagnulo, M., Burbridge, T.,
              Aitken, P., and A.
              Akhter, "A framework for large-scale measurement
              platforms (LMAP)", draft-ietf-lmap-framework-03 (work in
              progress), January 2014.

Authors' Addresses

   Marcelo Bagnulo
   Universidad Carlos III de Madrid
   Av. Universidad 30
   Leganes, Madrid  28911
   SPAIN

   Phone: 34 91 6249500
   Email: marcelo@it.uc3m.es
   URI:   http://www.it.uc3m.es

   Benoit Claise
   Cisco Systems, Inc.
   De Kleetlaan 6a b1
   1831 Diegem
   Belgium

   Email: bclaise@cisco.com

   Philip Eardley
   British Telecom
   Adastral Park, Martlesham Heath
   Ipswich
   ENGLAND

   Email: philip.eardley@bt.com

   Al Morton
   AT&T Labs
   200 Laurel Avenue South
   Middletown, NJ
   USA

   Email: acmorton@att.com