2 Network Working Group M. Bagnulo 3 Internet-Draft UC3M 4 Intended status: Standards Track B. Claise 5 Expires: June 13, 2020 Cisco Systems, Inc. 6 P. Eardley 7 BT 8 A. Morton 9 AT&T Labs 10 A. Akhter 11 Consultant 12 December 11, 2019 14 Registry for Performance Metrics 15 draft-ietf-ippm-metric-registry-23 17 Abstract 19 This document defines the format for the IANA Performance Metrics 20 Registry. This document also gives a set of guidelines for 21 Registered Performance Metric requesters and reviewers. 23 Status of This Memo 25 This Internet-Draft is submitted in full conformance with the 26 provisions of BCP 78 and BCP 79. 28 Internet-Drafts are working documents of the Internet Engineering 29 Task Force (IETF). Note that other groups may also distribute 30 working documents as Internet-Drafts. The list of current Internet- 31 Drafts is at https://datatracker.ietf.org/drafts/current/. 33 Internet-Drafts are draft documents valid for a maximum of six months 34 and may be updated, replaced, or obsoleted by other documents at any 35 time. It is inappropriate to use Internet-Drafts as reference 36 material or to cite them other than as "work in progress." 38 This Internet-Draft will expire on June 13, 2020. 40 Copyright Notice 42 Copyright (c) 2019 IETF Trust and the persons identified as the 43 document authors. All rights reserved. 45 This document is subject to BCP 78 and the IETF Trust's Legal 46 Provisions Relating to IETF Documents 47 (https://trustee.ietf.org/license-info) in effect on the date of 48 publication of this document. 
Please review these documents 49 carefully, as they describe your rights and restrictions with respect 50 to this document. Code Components extracted from this document must 51 include Simplified BSD License text as described in Section 4.e of 52 the Trust Legal Provisions and are provided without warranty as 53 described in the Simplified BSD License. 55 Table of Contents 57 1. Introduction . . . . . . . . . . . . . . . . . . . . . . . . 4 58 2. Terminology . . . . . . . . . . . . . . . . . . . . . . . . . 5 59 3. Scope . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7 60 4. Motivation for a Performance Metrics Registry . . . . . . . . 8 61 4.1. Interoperability . . . . . . . . . . . . . . . . . . . . 8 62 4.2. Single point of reference for Performance Metrics . . . . 9 63 4.3. Side benefits . . . . . . . . . . . . . . . . . . . . . . 9 64 5. Criteria for Performance Metrics Registration . . . . . . . . 9 65 6. Performance Metric Registry: Prior attempt . . . . . . . . . 10 66 6.1. Why this Attempt Should Succeed . . . . . . . . . . . . . 11 67 7. Definition of the Performance Metric Registry . . . . . . . . 11 68 7.1. Summary Category . . . . . . . . . . . . . . . . . . . . 13 69 7.1.1. Identifier . . . . . . . . . . . . . . . . . . . . . 13 70 7.1.2. Name . . . . . . . . . . . . . . . . . . . . . . . . 13 71 7.1.3. URIs . . . . . . . . . . . . . . . . . . . . . . . . 17 72 7.1.4. Description . . . . . . . . . . . . . . . . . . . . . 17 73 7.1.5. Reference . . . . . . . . . . . . . . . . . . . . . . 17 74 7.1.6. Change Controller . . . . . . . . . . . . . . . . . . 17 75 7.1.7. Version (of Registry Format) . . . . . . . . . . . . 18 76 7.2. Metric Definition Category . . . . . . . . . . . . . . . 18 77 7.2.1. Reference Definition . . . . . . . . . . . . . . . . 18 78 7.2.2. Fixed Parameters . . . . . . . . . . . . . . . . . . 18 79 7.3. Method of Measurement Category . . . . . . . . . . . . . 19 80 7.3.1. Reference Method . . . . . . . . . . . . . . . . . . 19 81 7.3.2. Packet Stream Generation . . . . . . . . . . . . . . 19 82 7.3.3. Traffic Filter . . . . . . . . . . . . . . . . . . . 20 83 7.3.4. Sampling Distribution . . . . . . . . . . . . . . . . 20 84 7.3.5. Run-time Parameters . . . . . . . . . . . . . . . . . 21 85 7.3.6. Role . . . . . . . . . . . . . . . . . . . . . . . . 22 86 7.4. Output Category . . . . . . . . . . . . . . . . . . . . . 22 87 7.4.1. Type . . . . . . . . . . . . . . . . . . . . . . . . 22 88 7.4.2. Reference Definition . . . . . . . . . . . . . . . . 23 89 7.4.3. Metric Units . . . . . . . . . . . . . . . . . . . . 23 90 7.4.4. Calibration . . . . . . . . . . . . . . . . . . . . . 23 91 7.5. Administrative information . . . . . . . . . . . . . . . 24 92 7.5.1. Status . . . . . . . . . . . . . . . . . . . . . . . 24 93 7.5.2. Requester . . . . . . . . . . . . . . . . . . . . . . 24 94 7.5.3. Revision . . . . . . . . . . . . . . . . . . . . . . 24 95 7.5.4. Revision Date . . . . . . . . . . . . . . . . . . . . 24 96 7.6. Comments and Remarks . . . . . . . . . . . . . . . . . . 24 98 8. Processes for Managing the Performance Metric Registry Group 24 99 8.1. Adding new Performance Metrics to the Performance Metrics 100 Registry . . . . . . . . . . . . . . . . . . . . . . . . 25 101 8.2. Revising Registered Performance Metrics . . . . . . . . . 26 102 8.3. Deprecating Registered Performance Metrics . . . . . . . 27 103 9. Security considerations . . . . . . . . . . . . . . . . . . . 28 104 10. IANA Considerations . . . . . . . . . . . . . . . . . . . . . 
29 105 10.1. Registry Group . . . . . . . . . . . . . . . . . . . . . 29 106 10.2. Performance Metric Name Elements . . . . . . . . . . . . 29 107 10.3. New Performance Metrics Registry . . . . . . . . . . . . 30 108 11. Blank Registry Template . . . . . . . . . . . . . . . . . . . 31 109 11.1. Summary . . . . . . . . . . . . . . . . . . . . . . . . 32 110 11.1.1. ID (Identifier) . . . . . . . . . . . . . . . . . . 32 111 11.1.2. Name . . . . . . . . . . . . . . . . . . . . . . . . 32 112 11.1.3. URIs . . . . . . . . . . . . . . . . . . . . . . . . 32 113 11.1.4. Description . . . . . . . . . . . . . . . . . . . . 32 114 11.1.5. Change Controller . . . . . . . . . . . . . . . . . 32 115 11.1.6. Version (of Registry Format) . . . . . . . . . . . . 32 116 11.2. Metric Definition . . . . . . . . . . . . . . . . . . . 32 117 11.2.1. Reference Definition . . . . . . . . . . . . . . . . 32 118 11.2.2. Fixed Parameters . . . . . . . . . . . . . . . . . . 32 119 11.3. Method of Measurement . . . . . . . . . . . . . . . . . 33 120 11.3.1. Reference Method . . . . . . . . . . . . . . . . . . 33 121 11.3.2. Packet Stream Generation . . . . . . . . . . . . . . 33 122 11.3.3. Traffic Filtering (observation) Details . . . . . . 33 123 11.3.4. Sampling Distribution . . . . . . . . . . . . . . . 33 124 11.3.5. Run-time Parameters and Data Format . . . . . . . . 33 125 11.3.6. Roles . . . . . . . . . . . . . . . . . . . . . . . 33 126 11.4. Output . . . . . . . . . . . . . . . . . . . . . . . . . 33 127 11.4.1. Type . . . . . . . . . . . . . . . . . . . . . . . . 34 128 11.4.2. Reference Definition . . . . . . . . . . . . . . . . 34 129 11.4.3. Metric Units . . . . . . . . . . . . . . . . . . . . 34 130 11.4.4. Calibration . . . . . . . . . . . . . . . . . . . . 34 131 11.5. Administrative items . . . . . . . . . . . . . . . . . . 34 132 11.5.1. Status . . . . . . . . . . . . . . . . . . . . . . . 34 133 11.5.2. Requester . . . . . . . . . . . . . . . . . . . . . 34 134 11.5.3. Revision . . . . . . . . . . . . . . . . . . . . . . 34 135 11.5.4. Revision Date . . . . . . . . . . . . . . . . . . . 34 136 11.6. Comments and Remarks . . . . . . . . . . . . . . . . . . 34 137 12. Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . 34 138 13. References . . . . . . . . . . . . . . . . . . . . . . . . . 35 139 13.1. Normative References . . . . . . . . . . . . . . . . . . 35 140 13.2. Informative References . . . . . . . . . . . . . . . . . 36 141 Authors' Addresses . . . . . . . . . . . . . . . . . . . . . . . 37 143 1. Introduction 145 The IETF specifies and uses Performance Metrics of protocols and 146 applications transported over its protocols. Performance metrics are 147 important part of network operations using IETF protocols, and 148 [RFC6390] specifies guidelines for their development. 150 The definition and use of Performance Metrics in the IETF has been 151 fostered in various working groups (WG), most notably: 153 The "IP Performance Metrics" (IPPM) WG is the WG primarily 154 focusing on Performance Metrics definition at the IETF. 156 The "Benchmarking Methodology" WG (BMWG) defines many Performance 157 Metrics for use in laboratory benchmarking of inter-networking 158 technologies. 
160 The "Metric Blocks for use with RTCP's Extended Report Framework" 161 (XRBLOCK) WG (concluded) specified many Performance Metrics 162 related to "RTP Control Protocol Extended Reports (RTCP XR)" 163 [RFC3611], which establishes a framework to allow new information 164 to be conveyed in RTCP, supplementing the original report blocks 165 defined in "RTP: A Transport Protocol for Real-Time Applications", 166 [RFC3550]. 168 The "IP Flow Information eXport" (IPFIX) concluded WG specified an 169 IANA process for new Information Elements. Some Performance 170 Metrics related Information Elements are proposed on regular 171 basis. 173 The "Performance Metrics for Other Layers" (PMOL) a concluded WG 174 defined some Performance Metrics related to Session Initiation 175 Protocol (SIP) voice quality [RFC6035]. 177 It is expected that more Performance Metrics will be defined in the 178 future, not only IP-based metrics, but also metrics which are 179 protocol-specific and application-specific. 181 Despite the importance of Performance Metrics, there are two related 182 problems for the industry. First, ensuring that when one party 183 requests another party to measure (or report or in some way act on) a 184 particular Performance Metric, then both parties have exactly the 185 same understanding of what Performance Metric is being referred to. 186 Second, discovering which Performance Metrics have been specified, to 187 avoid developing a new Performance Metric that is very similar, but 188 not quite inter-operable. These problems can be addressed by 189 creating a registry of performance metrics. The usual way in which 190 the IETF organizes registries is with Internet Assigned Numbers 191 Authority (IANA), and there is currently no Performance Metrics 192 Registry maintained by the IANA. 194 This document requests that IANA create and maintain a Performance 195 Metrics Registry, according to the maintenance procedures and the 196 Performance Metrics Registry format defined in this memo. The 197 resulting Performance Metrics Registry is for use by the IETF and 198 others. Although the Registry formatting specifications herein are 199 primarily for registry creation by IANA, any other organization that 200 wishes to create a performance metrics registry may use the same 201 formatting specifications for their purposes. The authors make no 202 guarantee of the registry format's applicability to any possible set 203 of Performance Metrics envisaged by other organizations, but 204 encourage others to apply it. In the remainder of this document, 205 unless we explicitly say otherwise, we will refer to the IANA- 206 maintained Performance Metrics Registry as simply the Performance 207 Metrics Registry. 209 2. Terminology 211 The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", 212 "SHOULD", "SHOULD NOT", "RECOMMENDED", "NOT RECOMMENDED", "MAY", and 213 "OPTIONAL" in this document are to be interpreted as described in BCP 214 14 [RFC2119] [RFC8174] when, and only when, they appear in all 215 capitals, as shown here. 217 Performance Metric: A Performance Metric is a quantitative measure 218 of performance, targeted to an IETF-specified protocol or targeted 219 to an application transported over an IETF-specified protocol. 220 Examples of Performance Metrics are the FTP response time for a 221 complete file download, the DNS response time to resolve the IP 222 address(es), a database logging time, etc. 
This definition is 223 consistent with the definition of metric in [RFC2330] and broader 224 than the definition of performance metric in [RFC6390]. 226 Registered Performance Metric: A Registered Performance Metric is a 227 Performance Metric expressed as an entry in the Performance 228 Metrics Registry, administered by IANA. Such a performance metric 229 has met all the registry review criteria defined in this document 230 in order to be included in the registry. 232 Performance Metrics Registry: The IANA registry containing 233 Registered Performance Metrics. 235 Proprietary Registry: A set of metrics that are registered in a 236 proprietary registry, as opposed to the Performance Metrics Registry. 238 Performance Metrics Experts: The Performance Metrics Experts are a 239 group of designated experts [RFC8126] selected by the IESG to 240 validate the Performance Metrics before updating the Performance 241 Metrics Registry. The Performance Metrics Experts work closely 242 with IANA. 244 Parameter: A Parameter is an input factor defined as a variable in 245 the definition of a Performance Metric. A Parameter is a 246 numerical or other specified factor forming one of a set that 247 defines a metric or sets the conditions of its operation. All 248 Parameters must be known in order to make a measurement using a 249 metric and interpret the results. There are two types of 250 Parameters: Fixed and Run-time Parameters. For the Fixed 251 Parameters, the value of the variable is specified in the 252 Performance Metrics Registry entry, and different Fixed Parameter 253 values result in different Registered Performance Metrics. For 254 the Run-time Parameters, the value of the variable is defined when 255 the metric measurement method is executed, and a given Registered 256 Performance Metric supports multiple values for the parameter. 257 Although Run-time Parameters do not change the fundamental nature 258 of the Performance Metric's definition, some have substantial 259 influence on the network property being assessed and the 260 interpretation of the results. 262 Note: Consider the case of packet loss in the following two 263 Active Measurement Method cases. The first case is packet loss 264 as background loss, where the Run-time Parameter set includes a 265 very sparse Poisson stream, and only characterizes the times 266 when packets were lost. Actual user streams likely see much 267 higher loss at these times, due to tail drop or radio errors. 268 The second case is packet loss as the inverse of throughput, where 269 the Run-time Parameter set includes a very dense, bursty 270 stream, and characterizes the loss experienced by a stream that 271 approximates a user stream. These are both "loss metrics", but 272 the difference in interpretation of the results is highly 273 dependent on the Run-time Parameters (at least), to the extreme 274 where we are actually using loss to infer its complement: 275 delivered throughput. 277 Active Measurement Method: Methods of Measurement conducted on 278 traffic which serves only the purpose of measurement and is 279 generated for that reason alone, and whose traffic characteristics 280 are known a priori. The complete definition of Active Methods is 281 specified in section 3.4 of [RFC7799]. Examples of Active 282 Measurement Methods are the measurement methods for the One-way 283 Delay metric defined in [RFC7679] and the one for round-trip delay 284 defined in [RFC2681]. 
286 Passive Measurement Method: Methods of Measurement conducted on 287 network traffic, generated either from the end users or from 288 network elements, that would exist regardless of whether the 289 measurement was being conducted or not. The complete definition 290 of Passive Methods is specified in section 3.6 of [RFC7799]. One 291 characteristic of Passive Measurement Methods is that sensitive 292 information may be observed, and as a consequence, stored in the 293 measurement system. 295 Hybrid Measurement Method: Hybrid Methods are Methods of Measurement 296 that use a combination of Active Methods and Passive Methods, to 297 assess Active Metrics, Passive Metrics, or new metrics derived 298 from the a priori knowledge and observations of the stream of 299 interest. The complete definition of Hybrid Methods is specified 300 in section 3.8 of [RFC7799]. 302 3. Scope 304 This document is intended for two different audiences: 306 1. For those defining new Registered Performance Metrics, it 307 provides specifications and best practices to be used in deciding 308 which Registered Performance Metrics are useful for a measurement 309 study, instructions for writing the text for each column of the 310 Registered Performance Metrics, and information on the supporting 311 documentation required for the new Performance Metrics Registry 312 entry (up to and including the publication of one or more 313 immutable documents such as an RFC). 315 2. For the appointed Performance Metrics Experts and for IANA 316 personnel administering the new IANA Performance Metrics 317 Registry, it defines a set of acceptance criteria against which 318 these proposed Registered Performance Metrics should be 319 evaluated. 321 In addition, this document may be useful for other organizations that 322 are defining a Performance Metric registry of their own, and may re- 323 use the features of the Performance Metrics Registry defined in this 324 document. 326 This Performance Metrics Registry is applicable to Performance 327 Metrics resulting from Active Measurement, Passive Measurement, and any 328 other form of Performance Measurement. This registry is designed to 329 encompass Performance Metrics developed throughout the IETF and 330 especially for the technologies specified in the following working 331 groups: IPPM, XRBLOCK, IPFIX, and BMWG. This document analyzes a 332 prior attempt to set up a Performance Metrics Registry, and the 333 reasons why this design was inadequate [RFC6248]. Finally, this 334 document gives a set of guidelines for requesters and expert 335 reviewers of candidate Registered Performance Metrics. 337 This document makes no attempt to populate the Performance Metrics 338 Registry with initial entries; the related memo 339 [I-D.ietf-ippm-initial-registry] proposes the initial set of registry 340 entries. 342 4. Motivation for a Performance Metrics Registry 344 In this section, we detail several motivations for the Performance 345 Metrics Registry. 347 4.1. Interoperability 349 As with any IETF registry, the primary intention is to manage 350 registration of identifiers for use within one or more protocols. 
In 351 the particular case of the Performance Metrics Registry, there are 352 two types of protocols that will use the Performance Metrics in the 353 Performance Metrics Registry during their operation (by referring to 354 the Index values): 356 o Control protocol: This type of protocol is used to allow one entity 357 to request another entity to perform a measurement using a 358 specific metric defined by the Performance Metrics Registry. One 359 particular example is the LMAP framework [RFC7594]. Using the 360 LMAP terminology, the Performance Metrics Registry is used in the 361 LMAP Control protocol to allow a Controller to request a 362 measurement task from one or more Measurement Agents. In order to 363 enable this use case, the entries of the Performance Metrics 364 Registry must be sufficiently defined to allow a Measurement Agent 365 implementation to trigger a specific measurement task upon the 366 reception of a control protocol message. This requirement heavily 367 constrains the type of entries that are acceptable for the 368 Performance Metrics Registry. 370 o Report protocol: This type of protocol is used to allow an entity 371 to report measurement results to another entity. By referencing 372 a specific Performance Metrics Registry entry, it is possible to 373 properly characterize the measurement result data being reported. 374 Using the LMAP terminology, the Performance Metrics Registry is 375 used in the Report protocol to allow a Measurement Agent to report 376 measurement results to a Collector. 378 It should be noted that the LMAP framework explicitly allows for 379 using not only the IANA-maintained Performance Metrics Registry but 380 also other registries containing Performance Metrics, either defined 381 by other organizations or private ones. However, others who are 382 creating Registries to be used in the context of an LMAP framework 383 are encouraged to use the Registry format defined in this document, 384 because this makes it easier for developers of LMAP Measurement 385 Agents (MAs) to programmatically use information found in those other 386 Registries' entries. 388 4.2. Single point of reference for Performance Metrics 390 A Performance Metrics Registry serves as a single point of reference 391 for Performance Metrics defined in different working groups in the 392 IETF. As we mentioned earlier, there are several WGs that define 393 Performance Metrics in the IETF, and it is hard to keep track of all 394 of them. This results in multiple definitions of similar Performance 395 Metrics that attempt to measure the same phenomena but in slightly 396 different (and incompatible) ways. Having a registry would allow the 397 IETF community and others to have a single list of relevant 398 Performance Metrics defined by the IETF (and others, where 399 appropriate). The single list is also an essential aspect of 400 communication about Performance Metrics, where different entities 401 that request measurements, execute measurements, and report the 402 results can benefit from a common understanding of the referenced 403 Performance Metric. 405 4.3. Side benefits 407 There are a couple of side benefits of having such a registry. 408 First, the Performance Metrics Registry could serve as an inventory 409 of useful and used Performance Metrics that are normally supported 410 by different implementations of measurement agents. 
Second, the 411 results of measurements using the Performance Metrics should be 412 comparable even if they are performed by different implementations 413 and in different networks, as the Performance Metric is properly 414 defined. BCP 176 [RFC6576] examines whether the results produced by 415 independent implementations are equivalent in the context of 416 evaluating the completeness and clarity of metric specifications. 417 This BCP defines the standards track advancement testing for (active) 418 IPPM metrics, and the same process will likely suffice to determine 419 whether Registered Performance Metrics are sufficiently well 420 specified to result in comparable (or equivalent) results. 421 Registered Performance Metrics which have undergone such testing 422 SHOULD be noted, with a reference to the test results. 424 5. Criteria for Performance Metrics Registration 426 It is neither possible nor desirable to populate the Performance 427 Metrics Registry with all combinations of Parameters of all 428 Performance Metrics. The Registered Performance Metrics SHOULD be: 430 1. interpretable by the user, 432 2. implementable by the software or hardware designer, 434 3. deployable by network operators, 436 4. accurate in terms of producing equivalent results, for 437 interoperability and deployment across vendors, 439 5. operationally useful, so that it has significant industry 440 interest and/or has seen deployment, and 442 6. sufficiently tightly defined, so that different values for the 443 Run-time Parameters do not change the fundamental nature of the 444 measurement, nor change the practicality of its implementation. 446 In essence, there needs to be evidence that a candidate Registered 447 Performance Metric has significant industry interest, or has seen 448 deployment, and there is agreement that the candidate Registered 449 Performance Metric serves its intended purpose. 451 6. Performance Metric Registry: Prior attempt 453 There was a previous attempt to define a metric registry, RFC 4148 454 [RFC4148]. However, it was obsoleted by RFC 6248 [RFC6248] because 455 it was "found to be insufficiently detailed to uniquely identify IPPM 456 metrics... [there was too much] variability possible when 457 characterizing a metric exactly", which led to the RFC 4148 registry 458 having "very few users, if any". 460 A couple of interesting additional quotes from RFC 6248 [RFC6248] 461 might help to understand the issues related to that registry. 463 1. "It is not believed to be feasible or even useful to register 464 every possible combination of Type P, metric parameters, and 465 Stream parameters using the current structure of the IPPM Metrics 466 Registry." 468 2. "The registry structure has been found to be insufficiently 469 detailed to uniquely identify IPPM metrics." 471 3. "Despite apparent efforts to find current or even future users, 472 no one responded to the call for interest in the RFC 4148 473 registry during the second half of 2010." 475 The current approach learns from this by tightly defining each 476 Registered Performance Metric with only a few variable (Run-time) 477 Parameters to be specified by the measurement designer, if any. The 478 idea is that entries in the Performance Metrics Registry stem from 479 different measurement methods which require input (Run-time) 480 parameters to set factors like source and destination addresses 481 (which do not change the fundamental nature of the measurement). 
The 482 downside of this approach is that it could result in a large number 483 of entries in the Performance Metrics Registry. There is agreement 484 that less is more in this context: it is better to have a reduced 485 set of useful metrics rather than a large set of metrics, some with 486 questionable usefulness. 488 6.1. Why this Attempt Should Succeed 490 As mentioned in the previous section, one of the main issues with the 491 previous registry was that the metrics contained in the registry were 492 too generic to be useful. This document specifies stricter criteria 493 for performance metric registration (see section 5), and establishes a 494 group of Performance Metrics Experts who will provide guidelines to 495 assess whether a Performance Metric is properly specified. 497 Another key difference between this attempt and the previous one is 498 that in this case there is at least one clear user for the 499 Performance Metrics Registry: the LMAP framework and protocol. 500 Because the LMAP protocol will use the Performance Metrics Registry 501 values in its operation, this actually helps to determine whether a metric 502 is properly defined. 504 In particular, we expect that the LMAP control protocol will 505 enable a controller to request that a measurement agent perform a 506 measurement using a given metric by embedding the Performance Metrics 507 Registry identifier in the protocol. Such a metric and method are 508 properly specified if they are defined well enough that it is 509 possible (and practical) to implement them in the measurement agent. 510 This was the failure of the previous attempt: a registry entry with 511 an undefined Type-P (section 13 of RFC 2330 [RFC2330]) allows 512 implementations to be ambiguous. 514 7. Definition of the Performance Metric Registry 516 This Performance Metrics Registry is applicable to Performance 517 Metrics used for Active Measurement, Passive Measurement, and any 518 other form of Performance Measurement. Each category of measurement 519 has unique properties, so some of the columns defined below are not 520 applicable for a given metric category. In this case, the column(s) 521 SHOULD be populated with the "NA" value (Not Applicable). However, 522 the "NA" value MUST NOT be used by any metric in the following 523 columns: Identifier, Name, URI, Status, Requester, Revision, Revision 524 Date, Description. In the future, a new category of metrics could 525 require additional columns, and adding new columns is a recognized 526 form of registry extension. The specification defining the new 527 column(s) MUST give guidelines to populate the new column(s) for 528 existing entries (in general). 530 The columns of the Performance Metrics Registry are defined below. 531 The columns are grouped into "Categories" to facilitate the use of 532 the registry. Categories are described at the 7.x heading level, and 533 columns are at the 7.x.y heading level. The Figure below illustrates 534 this organization. An entry (row) therefore gives a complete 535 description of a Registered Performance Metric. 537 Each column serves as a check-list item and helps to avoid omissions 538 during registration and expert review. 540 ======================================================================= 541 Legend: 542 Registry Categories and Columns are shown below as: 543 Category 544 ------------------... 545 Column | Column |... 
546 ======================================================================= 547 Summary 548 ------------------------------------------------------------------------ 549 Identifier | Name | URIs | Desc. | Reference | Change Controller | Ver | 551 Metric Definition 552 ----------------------------------------- 553 Reference Definition | Fixed Parameters | 555 Method of Measurement 556 --------------------------------------------------------------------- 557 Reference | Packet | Traffic | Sampling | Run-time | Role | 558 Method | Stream | Filter | Distribution | Parameters | | 559 | Generation | 560 Output 561 ----------------------------------------- 562 Type | Reference | Units | Calibration | 563 | Definition | | | 565 Administrative Information 566 ------------------------------------ 567 Status |Requester | Rev | Rev.Date | 569 Comments and Remarks 570 -------------------- 571 7.1. Summary Category 573 7.1.1. Identifier 575 A numeric identifier for the Registered Performance Metric. This 576 identifier MUST be unique within the Performance Metrics Registry. 578 The Registered Performance Metric unique identifier is an unbounded 579 integer (range 0 to infinity). 581 The Identifier 0 should be Reserved. The Identifier values from 582 64512 to 65536 are reserved for private or experimental use, and the 583 user may encounter overlapping uses. 585 When adding newly Registered Performance Metrics to the Performance 586 Metrics Registry, IANA SHOULD assign the lowest available identifier 587 to the new Registered Performance Metric. 589 If a Performance Metrics Expert providing review determines that 590 there is a reason to assign a specific numeric identifier, possibly 591 leaving a temporary gap in the numbering, then the Performance Expert 592 SHALL inform IANA of this decision. 594 7.1.2. Name 596 As the name of a Registered Performance Metric is the first thing a 597 potential human implementor will use when determining whether it is 598 suitable for their measurement study, it is important to be as 599 precise and descriptive as possible. In future, users will review 600 the names to determine if the metric they want to measure has already 601 been registered, or if a similar entry is available as a basis for 602 creating a new entry. 604 Names are composed of the following elements, separated by an 605 underscore character "_": 607 MetricType_Method_SubTypeMethod_... 
Spec_Units_Output 609 o MetricType: a combination of the directional properties and the 610 metric measured, such as (but not limited to): 612 RTDelay (Round Trip Delay) 614 RTDNS (Response Time Domain Name Service) 616 RLDNS (Response Loss Domain Name Service) 618 OWDelay (One Way Delay) 619 RTLoss (Round Trip Loss) 621 OWLoss (One Way Loss) 623 OWPDV (One Way Packet Delay Variation) 625 OWIPDV (One Way Inter-Packet Delay Variation) 627 OWReorder (One Way Packet Reordering) 629 OWDuplic (One Way Packet Duplication) 631 OWBTC (One Way Bulk Transport Capacity) 633 OWMBM (One Way Model Based Metric) 635 SPMonitor (Single Point Monitor) 637 MPMonitor (Multi-Point Monitor) 639 o Method: One of the methods defined in [RFC7799], such as (but not 640 limited to): 642 Active (depends on a dedicated measurement packet stream and 643 observations of the stream) 645 Passive (depends *solely* on observation of one or more 646 existing packet streams) 648 HybridType1 (observations on one stream that combine both 649 active and passive methods) 651 HybridType2 (observations on two or more streams that combine 652 both active and passive methods) 654 Spatial (Spatial Metric of RFC5644) 656 o SubTypeMethod: One or more sub-types to further describe the 657 features of the entry, such as (but not limited to): 659 ICMP (Internet Control Message Protocol) 661 IP (Internet Protocol) 663 DSCPxx (where xx is replaced by a Diffserv code point) 665 UDP (User Datagram Protocol) 666 TCP (Transmission Control Protocol) 668 QUIC (QUIC transport protocol) 670 HS (Hand-Shake, such as TCP's 3-way HS) 672 Poisson (Packet generation using Poisson distribution) 674 Periodic (Periodic packet generation) 676 SendOnRcv (Sender keeps one packet in-transit by sending when 677 previous packet arrives) 679 PayloadxxxxB (where xxxx is replaced by an integer, the number 680 of octets in the Payload) 682 SustainedBurst (Capacity test, worst case) 684 StandingQueue (test of bottleneck queue behavior) 686 SubTypeMethod values are separated by a hyphen "-" character, 687 which indicates that they belong to this element, and that their 688 order is unimportant when considering name uniqueness. 690 o Spec: an immutable document; for RFCs, this is the RFC number and major 691 section number that specify this Registry entry, in the form 692 RFCXXXXsecY, such as RFC7799sec3. Note: the RFC number is not the 693 Primary Reference specification for the metric definition, such as 694 [RFC7679] for One-way Delay; it will contain the placeholder 695 "RFCXXXXsecY" until the RFC number is assigned to the specifying 696 document, and would remain blank in private registry entries 697 without a corresponding RFC. Anticipating the "RFC10K" problem, 698 the number of the RFC continues to replace RFCXXXX regardless of 699 the number of digits in the RFC number. Anticipating Registry 700 Entries from other standards bodies, the form of this Name Element 701 MUST be proposed and reviewed for consistency and uniqueness by 702 the Expert Reviewer. 704 o Units: The units of measurement for the output, such as (but not 705 limited to): 707 Seconds 709 Ratio (unitless) 711 Percent (value multiplied by 100) 712 Logical (1 or 0) 714 Packets 716 BPS (Bits per Second) 718 PPS (Packets per Second) 720 EventTotal (for unit-less counts) 722 Multiple (more than one type of unit) 724 Enumerated (a list of outcomes) 726 Unitless 728 o Output: The type of output resulting from measurement, such as (but 729 not limited to): 731 Singleton 733 Raw (multiple Singletons) 735 Count 737 Minimum 739 Maximum 741 Median 743 Mean 745 95Percentile (95th Percentile) 747 99Percentile (99th Percentile) 749 StdDev (Standard Deviation) 751 Variance 753 PFI (Pass, Fail, Inconclusive) 755 FlowRecords (descriptions of flows observed) 757 LossRatio (lost packets to total packets, <=1) 759 An example is: 761 RTDelay_Active_IP-UDP-Periodic_RFCXXXXsecY_Seconds_95Percentile 763 as described in section 4 of [I-D.ietf-ippm-initial-registry]. 
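As a non-normative illustration of the naming convention above (the helper functions and element values below are illustrative only, not part of the registry format), a name can be assembled from its elements with "_" separators, with SubTypeMethod values joined by "-"; because the order of SubTypeMethod values is not significant for uniqueness, a comparison key can simply sort them:

   # Sketch only: assembles a registry Name from its elements and
   # derives a uniqueness key in which SubTypeMethod order is ignored.
   def make_name(metric_type, method, subtypes, spec, units, output):
       return "_".join([metric_type, method, "-".join(subtypes),
                        spec, units, output])

   def uniqueness_key(name):
       parts = name.split("_")
       parts[2] = "-".join(sorted(parts[2].split("-")))  # order-insensitive
       return tuple(parts)

   name = make_name("RTDelay", "Active", ["IP", "UDP", "Periodic"],
                    "RFCXXXXsecY", "Seconds", "95Percentile")
   # name == "RTDelay_Active_IP-UDP-Periodic_RFCXXXXsecY_Seconds_95Percentile"
   assert uniqueness_key(name) == uniqueness_key(
       "RTDelay_Active_UDP-Periodic-IP_RFCXXXXsecY_Seconds_95Percentile")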
765 Note that private registries following the format described here 766 SHOULD use the prefix "Priv_" on any name to avoid unintended 767 conflicts (further considerations are described in section 10). 768 Private registry entries usually have no specifying RFC, thus the 769 Spec: element has no clear interpretation. 771 7.1.3. URIs 773 The URIs column MUST contain a URL [RFC3986] that uniquely identifies 774 and locates the metric entry so it is accessible through the 775 Internet. The URL points to a file containing all the human-readable 776 information for one registry entry. The URL SHALL reference a target 777 file that is HTML-formatted and contains URLs to referenced sections 778 of HTML-ized RFCs, or other reference specifications. This allows the target 779 files for different entries to be more easily edited and re-used 780 when preparing new entries. The exact form of the URL for each 781 target file will be determined by IANA and reside on "iana.org". The 782 major sections of [I-D.ietf-ippm-initial-registry] provide an example 783 of a target file in HTML form (sections 4 and higher). 785 7.1.4. Description 787 A Registered Performance Metric description is a written 788 representation of a particular Performance Metrics Registry entry. 789 It supplements the Registered Performance Metric name to help 790 Performance Metrics Registry users select relevant Registered 791 Performance Metrics. 793 7.1.5. Reference 795 This entry gives the specification containing the candidate registry 796 entry that was reviewed and agreed upon, if such an RFC or other 797 specification exists. 799 7.1.6. Change Controller 801 This entry names the entity responsible for approving revisions to 802 the registry entry, and SHALL provide contact information (for an 803 individual, where appropriate). 805 7.1.7. Version (of Registry Format) 807 This entry gives the version number for the registry format used. 808 Formats complying with this memo MUST use 1.0. The version number 809 SHALL NOT change unless a new RFC is published that changes the 810 registry format. The version number of registry entries SHALL NOT 811 change unless the registry entry is updated (following procedures in 812 section 8). 814 7.2. Metric Definition Category 816 This category includes columns to prompt all necessary details 817 related to the metric definition, including the immutable document 818 reference and values of input factors, called fixed parameters, which 819 are left open in the immutable document, but have a particular value 820 defined by the performance metric. 
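For illustration only (the parameter names and values below are hypothetical and do not reproduce any actual registry entry), the split between values fixed by a registry entry and values supplied when the measurement runs might look as follows for a metric like the example name above:

   # Hypothetical example of the Fixed/Run-time Parameter split.
   FIXED_PARAMETERS = {            # recorded in the registry entry
       "Type-P": {"transport": "UDP", "payload_length": 160, "DSCP": 0},
       "timeout_seconds": 3.0,
   }

   RUN_TIME_PARAMETERS = {         # supplied on execution, reported with results
       "Src": "192.0.2.1",         # IPv4 or IPv6 documentation addresses
       "Dst": "2001:db8::2",
       "measurement_start_time": "2019-12-11T00:00:00Z",
   }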
822 7.2.1. Reference Definition 824 This entry provides a reference (or references) to the relevant 825 section(s) of the document(s) that define the metric, as well as any 826 supplemental information needed to ensure an unambiguous definition 827 for implementations. The reference needs to be an immutable 828 document, such as an RFC; for other standards bodies, it is likely to 829 be necessary to reference a specific, dated version of a 830 specification. 832 7.2.2. Fixed Parameters 834 Fixed Parameters are Parameters whose value must be specified in the 835 Performance Metrics Registry. The measurement system uses these 836 values. 838 Where referenced metrics supply a list of Parameters as part of their 839 descriptive template, a sub-set of the Parameters will be designated 840 as Fixed Parameters. As an example for active metrics, Fixed 841 Parameters determine most or all of the IPPM Framework convention 842 "packets of Type-P" as described in [RFC2330], such as transport 843 protocol, payload length, TTL, etc. An example for passive metrics 844 is RTP packet loss calculation, which relies on validating a 845 packet as RTP; this is a multi-packet validation controlled by 846 MIN_SEQUENTIAL, as defined by [RFC3550]. Varying MIN_SEQUENTIAL 847 values can alter the loss report, and this value could be set as a 848 Fixed Parameter. 850 Parameters MUST have well-defined names. For human readers, the 851 hanging indent style is preferred, and any Parameter names and 852 definitions that do not appear in the Reference Method Specification 853 MUST appear in this column (or the Run-time Parameters column). 855 Parameters MUST have a well-specified data format. 857 A Parameter that is a Fixed Parameter for one Performance Metrics 858 Registry entry may be designated as a Run-time Parameter for another 859 Performance Metrics Registry entry. 861 7.3. Method of Measurement Category 863 This category includes columns for references to relevant sections of 864 the immutable document(s) and any supplemental information needed to 865 ensure an unambiguous method for implementations. 867 7.3.1. Reference Method 869 This entry provides references to relevant sections of immutable 870 documents, such as RFC(s) (for other standards bodies, it is likely 871 to be necessary to reference a specific, dated version of a 872 specification) describing the method of measurement, as well as any 873 supplemental information needed to ensure unambiguous interpretation 874 for implementations referring to the immutable document text. 876 Specifically, this section should include pointers to pseudocode or 877 actual code that could be used for an unambiguous implementation. 879 7.3.2. Packet Stream Generation 881 This column applies to Performance Metrics that generate traffic as 882 part of their Measurement Method, including but not necessarily 883 limited to Active metrics. The generated traffic is referred to as a 884 stream, and this column describes its characteristics. 886 Each entry for this column contains the following information: 888 o Value: The name of the packet stream scheduling discipline 890 o Reference: the specification where the parameters of the stream 891 are defined 893 The packet generation stream may require parameters such as the 894 average packet rate and distribution truncation value for streams 895 with Poisson-distributed inter-packet sending times. If such 896 parameters are needed, they should be included either in the Fixed 897 Parameters column or in the Run-time Parameters column, depending on 898 whether they will be fixed or will be an input for the metric. 
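As a sketch of how such stream parameters drive packet generation (the function and parameter names below, such as lambda_pps and truncation_s, are illustrative and not defined by the registry), inter-packet gaps for a Poisson stream can be drawn from an exponential distribution and capped at a truncation value, while a Periodic stream uses a constant gap:

   import random

   def poisson_schedule(lambda_pps, duration_s, truncation_s):
       # Send times (in seconds) for a Poisson stream with average rate
       # lambda_pps; inter-packet gaps are truncated at truncation_s.
       t, times = 0.0, []
       while t < duration_s:
           t += min(random.expovariate(lambda_pps), truncation_s)
           times.append(t)
       return times

   def periodic_schedule(period_s, duration_s):
       # Send times for a Periodic stream (constant inter-packet gap).
       return [i * period_s for i in range(1, int(duration_s / period_s) + 1)]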
900 The simplest example of stream specification is Singleton scheduling 901 (see [RFC2330]), where a single atomic measurement is conducted. 902 Each atomic measurement could consist of sending a single packet 903 (such as a DNS request) or sending several packets (for example, to 904 request a webpage). Other streams support a series of atomic 905 measurements in a "sample", with a schedule defining the timing 906 between each transmitted packet and subsequent measurement. 907 Principally, two different streams are used in IPPM metrics: Poisson- 908 distributed, as described in [RFC2330], and Periodic, as described in 909 [RFC3432]. Both Poisson and Periodic have their own unique 910 parameters, and the relevant set of parameter names and values 911 should be included either in the Fixed Parameters column or in the 912 Run-time Parameters column. 914 7.3.3. Traffic Filter 916 This column applies to Performance Metrics that observe packets 917 flowing through (the device with) the measurement agent, i.e., traffic that is 918 not necessarily addressed to the measurement agent. This includes 919 but is not limited to Passive Metrics. The filter specifies the 920 traffic that is measured. This includes protocol field values/ 921 ranges, such as address ranges, and flow or session identifiers. 923 The traffic filter depends on the needs of the metric and on a 924 balance between an operator's measurement needs and a user's need for 925 privacy. Mechanics for conveying the filter criteria might be the 926 BPF (Berkeley Packet Filter) or PSAMP [RFC5475] Property Match 927 Filtering, which reuses IPFIX [RFC7012]. An example BPF string for 928 matching TCP/80 traffic to remote destination net 192.0.2.0/24 would 929 be "dst net 192.0.2.0/24 and tcp dst port 80". An implementation might 930 support more complex filter engines, for example allowing 931 matching using Deep Packet Inspection (DPI) technology. 933 The traffic filter includes the following information: 935 Type: the type of traffic filter used, e.g., BPF, PSAMP, or an OpenFlow 936 rule, as defined by a normative reference 938 Value: the actual set of rules expressed 940 7.3.4. Sampling Distribution 942 The sampling distribution defines which of the packets that match 943 the traffic filter are actually used for the 944 measurement. One possibility is "all", which implies that all packets 945 matching the Traffic Filter are considered, but there may be other 946 sampling strategies. It includes the following information: 948 Value: the name of the sampling distribution 950 Reference definition: pointer to the specification where the 951 sampling distribution is properly defined. 953 The sampling distribution may require parameters. If such 954 parameters are needed, they should be included either in the Fixed 955 Parameters column or in the Run-time Parameters column, depending on 956 whether they will be fixed or will be an input for the metric. 958 Sampling and filtering techniques for IP packet selection are 959 documented in PSAMP (Packet Sampling) [RFC5475], while the 960 Framework for Packet Selection and Reporting [RFC5474] provides more 961 background information. The sampling distribution parameters might 962 be expressed in terms of the Information Model for Packet Sampling 963 Exports [RFC5477] and the Flow Selection Techniques [RFC7014]. 
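A minimal sketch of how a Traffic Filter and a Sampling Distribution might be combined by an implementation (the packet-tuple layout and the 1-in-N count-based sampling choice are assumptions for illustration, not registry requirements):

   def matches_filter(pkt):
       # Roughly equivalent to the BPF example above:
       # "dst net 192.0.2.0/24 and tcp dst port 80"
       return (pkt["proto"] == "tcp" and pkt["dst_port"] == 80
               and pkt["dst_ip"].startswith("192.0.2."))

   def sample_1_in_n(packets, n):
       # Count-based sampling: keep every n-th packet that matches the
       # traffic filter; n == 1 corresponds to the "all" distribution.
       selected, count = [], 0
       for pkt in packets:
           if matches_filter(pkt):
               count += 1
               if count % n == 0:
                   selected.append(pkt)
       return selected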
965 7.3.5. Run-time Parameters 967 Run-Time Parameters are Parameters that must be determined, 968 configured into the measurement system, and reported with the results 969 for the context to be complete. However, the values of these 970 parameters are not specified in the Performance Metrics Registry (as 971 they are for the Fixed Parameters); rather, these parameters are listed as an aid 972 to the measurement system implementer or user (they must be left as 973 variables, and supplied on execution). 975 Where metrics supply a list of Parameters as part of their 976 descriptive template, a sub-set of the Parameters will be designated 977 as Run-Time Parameters. 979 Parameters MUST have well-defined names. For human readers, the 980 hanging indent style is preferred, and the names and definitions that 981 do not appear in the Reference Method Specification MUST appear in 982 this column. 984 A Data Format for each Run-time Parameter MUST be specified in this 985 column, to simplify the control and implementation of measurement 986 devices. For example, parameters that include an IPv4 address can be 987 encoded as a 32-bit integer (i.e., a binary, base64-encoded value) or as the ip- 988 address type defined in [RFC6991]. The actual encoding(s) used must be 989 explicitly defined for each Run-time Parameter. IPv6 addresses and 990 options MUST be accommodated, allowing Registered Metrics to be used 991 in either address family. 993 Examples of Run-time Parameters include IP addresses, measurement 994 point designations, start times and end times for measurement, and 995 other information essential to the method of measurement. 997 7.3.6. Role 999 In some methods of measurement, there may be several roles defined, 1000 e.g., for a one-way packet delay active measurement there is one 1001 measurement agent that generates the packets and another agent that 1002 receives the packets. This column contains the name of the Role(s) 1003 for this particular entry. In the one-way delay example above, there 1004 should be two entries in the Role registry column, one for each Role 1005 (Source and Destination). When a measurement agent is instructed to 1006 perform the "Source" Role for the one-way delay metric, the agent knows 1007 that it is required to generate packets. The values for this field 1008 are defined in the reference method of measurement (and this 1009 frequently results in abbreviated role names such as "Src"). 1011 When the Role column of a registry entry defines more than one Role, 1012 then the Role SHALL be treated as a Run-time Parameter and supplied 1013 for execution. It should be noted that the LMAP framework [RFC7594] 1014 distinguishes the Role from other Run-time Parameters, and defines a 1015 special parameter "Roles" inside the registry-grouping function list 1016 in the LMAP YANG model [RFC8194]. 1018 7.4. Output Category 1020 For entries that involve a stream and many singleton measurements, a 1021 statistic may be specified in this column to summarize the results to 1022 a single value. If the complete set of measured singletons is 1023 output, this will be specified here. 1025 Some metrics embed one specific statistic in the reference metric 1026 definition, while others allow several output types or statistics. 
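For example, an entry whose Output type is 95Percentile reduces the raw singletons of a sample to a single value. A minimal sketch follows (the nearest-rank convention is one possible choice; the exact statistic definition comes from the entry's Output Reference Definition):

   def percentile_95(singletons):
       # 95th percentile of a sample of singleton results (e.g., one-way
       # delays in seconds), using the nearest-rank method.
       if not singletons:
           raise ValueError("empty sample")
       ordered = sorted(singletons)
       rank = -(-95 * len(ordered) // 100)    # ceil(0.95 * N)
       return ordered[rank - 1]

   # Example: raw one-way delay singletons in seconds
   print(percentile_95([0.021, 0.019, 0.020, 0.250, 0.022]))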
1028 7.4.1. Type 1030 This column contains the name of the output type. The output type 1031 defines a single type of result that the metric produces. It can be 1032 the raw results (packet send times and singleton metrics), or it can 1033 be a summary statistic. The specification of the output type MUST 1034 define the format of the output. In some systems, format 1035 specifications will simplify both measurement implementation and 1036 collection/storage tasks. Note that if two different statistics are 1037 required from a single measurement (for example, both "Xth percentile 1038 mean" and "Raw"), then a new output type must be defined ("Xth 1039 percentile mean AND Raw"). See the Naming section above for a list 1040 of Output Types. 1042 7.4.2. Reference Definition 1044 This column contains a pointer to the specification(s) where the 1045 output type and format are defined. 1047 7.4.3. Metric Units 1049 The measured results must be expressed using some standard dimension 1050 or units of measure. This column provides the units. 1052 When a sample of singletons (see Section 11 of [RFC2330] for 1053 definitions of these terms) is collected, this entry will specify the 1054 units for each measured value. 1056 7.4.4. Calibration 1058 Some specifications for Methods of Measurement include the 1059 possibility of performing an error calibration. Section 3.7.3 of 1060 [RFC7679] is one example. In the registry entry, this field will 1061 identify a method of calibration for the metric, and when available, 1062 the measurement system SHOULD perform the calibration when requested 1063 and produce the output with an indication that it is the result of a 1064 calibration method. In-situ calibration could be enabled with an 1065 internal loopback that includes as much of the measurement system as 1066 possible, performs address manipulation as needed, and provides some 1067 form of isolation (e.g., deterministic delay) to avoid send-receive 1068 interface contention. Some portion of the random and systematic 1069 error can be characterized this way. 1071 For one-way delay measurements, the error calibration must include an 1072 assessment of the internal clock synchronization with its external 1073 reference (this internal clock is supplying timestamps for 1074 measurement). In practice, the time offsets of clocks at both the 1075 source and destination are needed to estimate the systematic error 1076 due to imperfect clock synchronization (the time offsets are 1077 smoothed, thus the random variation is not usually represented in the 1078 results). 1080 Both internal loopback calibration and clock synchronization can be 1081 used to estimate the *available accuracy* of the Output Metric Units. 1082 For example, repeated loopback delay measurements will reveal the 1083 portion of the Output result resolution that is the result of system 1084 noise, and thus inaccurate. 1086 7.5. Administrative information 1088 7.5.1. Status 1090 The status of the specification of this Registered Performance 1091 Metric. Allowed values are 'current' and 'deprecated'. All newly 1092 defined Registered Performance Metrics have 'current' status. 1094 7.5.2. Requester 1096 The requester for the Registered Performance Metric. The requester 1097 MAY be a document, such as an RFC, or a person. 1099 7.5.3. Revision 1101 The revision number of a Registered Performance Metric, starting at 0 1102 for Registered Performance Metrics at the time of definition and 1103 incremented by one for each revision. 1105 7.5.4. 
Revision Date 1107 The date of acceptance or the most recent revision for the Registered 1108 Performance Metric. The date SHALL be determined by IANA and the 1109 reviewing Performance Metrics Expert. 1111 7.6. Comments and Remarks 1113 Besides providing additional details which do not appear in other 1114 categories, this open Category (single column) allows for unforeseen 1115 issues to be addressed by simply updating this informational entry. 1117 8. Processes for Managing the Performance Metric Registry Group 1119 Once a Performance Metric or set of Performance Metrics has been 1120 identified for a given application, candidate Performance Metrics 1121 Registry entry specifications prepared in accordance with Section 7 1122 should be submitted to IANA to follow the process for review by the 1123 Performance Metric Experts, as defined below. This process is also 1124 used for other changes to the Performance Metrics Registry, such as 1125 deprecation or revision, as described later in this section. 1127 It is desirable that the author(s) of a candidate Performance Metrics 1128 Registry entry seek review in the relevant IETF working group, or 1129 offer the opportunity for review on the working group mailing list. 1131 8.1. Adding new Performance Metrics to the Performance Metrics Registry 1133 Requests to add Registered Performance Metrics in the Performance 1134 Metrics Registry SHALL be submitted to IANA, which forwards the 1135 request to a designated group of experts (Performance Metric Experts) 1136 appointed by the IESG; these are the reviewers called for by the 1137 Specification Required [RFC8126] policy defined for the Performance 1138 Metrics Registry. The Performance Metric Experts review the request 1139 for such things as compliance with this document, compliance with 1140 other applicable Performance Metric-related RFCs, and consistency 1141 with the currently defined set of Registered Performance Metrics. 1143 Submission to IANA may be during IESG review (leading to IETF 1144 Standards Action), where an Internet Draft proposes one or more 1145 Registered Performance Metrics to be added to the Performance Metrics 1146 Registry, including the text of the proposed Registered Performance 1147 Metric(s). 1149 If an RFC-to-be includes a Performance Metric and a proposed 1150 Performance Metrics Registry entry, but the IANA and Performance 1151 Metric Expert review determines that one or more of the Section 5 1152 criteria have not been met, then the proposed Performance Metrics 1153 Registry entry MUST be removed from the text. When the RFC-to-be 1154 authors are ready to show evidence of meeting the criteria in section 1155 5, they SHOULD re-submit the proposed Performance Metrics Registry 1156 entry to IANA to be evaluated in consultation with the Performance 1157 Metric Experts for registration at that time. 1159 Authors of proposed Registered Performance Metrics SHOULD review 1160 compliance with the specifications in this document to check their 1161 submissions before sending them to IANA. 1163 At least one Performance Metric Expert should endeavor to complete 1164 referred reviews in a timely manner. If the request is acceptable, 1165 the Performance Metric Experts signify their approval to IANA, and 1166 IANA updates the Performance Metrics Registry. 
If the request is not 1167 acceptable, the Performance Metric Experts MAY coordinate with the 1168 requester to change the request to be compliant, otherwise IANA SHALL 1169 coordinate resolution of issues on behalf of the expert. The 1170 Performance Metric Experts MAY choose to reject clearly frivolous or 1171 inappropriate change requests outright, but such exceptional 1172 circumstances should be rare. 1174 This process should not in any way be construed as allowing the 1175 Performance Metric Experts to overrule IETF consensus. Specifically, 1176 any Registered Performance Metrics that were added to the Performance 1177 Metrics Registry with IETF consensus require IETF consensus for 1178 revision or deprecation. 1180 Decisions by the Performance Metric Experts may be appealed as in 1181 Section 7 of [RFC8126]. 1183 8.2. Revising Registered Performance Metrics 1185 A request for Revision is only permitted when the requested changes 1186 maintain backward-compatibility with implementations of the prior 1187 Performance Metrics Registry entry describing a Registered 1188 Performance Metric (entries with lower revision numbers, but the same 1189 Identifier and Name). 1191 The purpose of the Status field in the Performance Metrics Registry 1192 is to indicate whether the entry for a Registered Performance Metric 1193 is 'current' or 'deprecated'. 1195 In addition, no policy is defined for revising the Performance Metric 1196 entries in the IANA Registry or addressing errors therein. To be 1197 clear, changes and deprecations within the Performance Metrics 1198 Registry are not encouraged, and should be avoided to the extent 1199 possible. However, in recognition that change is inevitable, the 1200 provisions of this section address the need for revisions. 1202 Revisions are initiated by sending a candidate Registered Performance 1203 Metric definition to IANA, as in Section 8.1, identifying the 1204 existing Performance Metrics Registry entry, and explaining how and 1205 why the existing entry should be revised. 1207 The primary requirement in the definition of procedures for managing 1208 changes to existing Registered Performance Metrics is avoidance of 1209 measurement interoperability problems; the Performance Metric Experts 1210 must work to maintain interoperability above all else. Changes to 1211 Registered Performance Metrics may only be done in an interoperable 1212 way; necessary changes that cannot be done in a way to allow 1213 interoperability with unchanged implementations MUST result in the 1214 creation of a new Registered Performance Metric (with a new Name, 1215 replacing the RFCXXXXsecY portion of the name) and possibly the 1216 deprecation of the earlier metric. 1218 A change to a Registered Performance Metric SHALL be determined to be 1219 backward-compatible when: 1221 1. it involves the correction of an error that is obviously only 1222 editorial; or 1224 2. it corrects an ambiguity in the Registered Performance Metric's 1225 definition, which itself leads to issues severe enough to prevent 1226 the Registered Performance Metric's usage as originally defined; 1227 or 1229 3. it corrects missing information in the metric definition without 1230 changing its meaning (e.g., the explicit definition of 'quantity' 1231 semantics for numeric fields without a Data Type Semantics 1232 value); or 1234 4. it harmonizes with an external reference that was itself 1235 corrected. 
1237 If a Performance Metric revision is deemed permissible and backward- 1238 compatible by the Performance Metric Experts, according to the rules 1239 in this document, IANA SHOULD execute the change(s) in the 1240 Performance Metrics Registry. The requester of the change is 1241 appended to the original requester in the Performance Metrics 1242 Registry. The Name of the revised Registered Performance Metric, 1243 including the RFCXXXXsecY portion of the name, SHALL remain unchanged 1244 (even when the change is the result of IETF Standards Action; the 1245 revised registry entry SHOULD reference the new immutable document, 1246 such as an RFC; for other standards bodies, it is likely to be 1247 necessary to reference a specific, dated version of a specification, 1248 in an appropriate category and column). 1250 Each Registered Performance Metric in the Performance Metrics 1251 Registry has a revision number, starting at zero. Each change to a 1252 Registered Performance Metric following this process increments the 1253 revision number by one. 1255 When a revised Registered Performance Metric is accepted into the 1256 Performance Metrics Registry, the date of acceptance of the most 1257 recent revision is placed into the Revision Date column of the 1258 registry for that Registered Performance Metric. 1260 Where applicable, additions to Registered Performance Metrics in the 1261 form of text Comments or Remarks should include the date, but such 1262 additions might not constitute a revision according to this process. 1264 Older version(s) of the updated metric entries are kept in the 1265 registry for archival purposes. The older entries are kept with all 1266 fields unmodified (including version and revision date), except for the status 1267 field, which SHALL be changed to "Deprecated". 1269 8.3. Deprecating Registered Performance Metrics 1271 Changes that are not permissible under the above criteria for revising a 1272 Registered Performance Metric may only be handled by deprecation. A 1273 Registered Performance Metric MAY be deprecated and replaced when: 1275 1. the Registered Performance Metric definition has an error or 1276 shortcoming that cannot be permissibly changed as in Section 8.2 1277 (Revising Registered Performance Metrics); or 1279 2. the deprecation harmonizes with an external reference that was 1280 itself deprecated through that reference's accepted deprecation 1281 method. 1283 A request for deprecation is sent to IANA, which passes it to the 1284 Performance Metric Experts for review. When deprecating a 1285 Performance Metric, the Performance Metric description in the 1286 Performance Metrics Registry must be updated to explain the 1287 deprecation, as well as to refer to any new Performance Metrics 1288 created to replace the deprecated Performance Metric. 1290 The revision number of a Registered Performance Metric is incremented 1291 upon deprecation, and the Revision Date updated, as with any 1292 revision. 1294 The intentional use of deprecated Registered Performance Metrics 1295 should result in a log entry or human-readable warning by the 1296 respective application. 1298 Names and Metric IDs of deprecated Registered Performance Metrics 1299 must not be reused. 1301 The deprecated entries are kept with all fields unmodified, except 1302 the version, revision date, and the status field (changed to 1303 "Deprecated"). 1305 9. Security considerations 1307 This draft defines a registry structure and does not itself 1308 introduce any new security considerations for the Internet.
The 1309 definition of Performance Metrics for this registry may introduce 1310 some security concerns, but the mandatory references should have 1311 their own security considerations, and such definitions should be 1312 reviewed with security in mind if their security considerations are not 1313 covered by one or more referenced standards. 1315 The aggregated results of the performance metrics described in this 1316 registry might reveal network topology information that may be 1317 considered sensitive. If such cases are found, then access control 1318 mechanisms should be applied. 1320 10. IANA Considerations 1322 With the background and processes described in earlier sections, this 1323 document requests the following IANA Actions. 1325 Editor's Note: Mock-ups of the implementation of this set of requests 1326 have been prepared with IANA's help during development of this memo, 1327 and have been captured in the Proceedings of IPPM working group 1328 sessions. IANA is currently preparing a mock-up. A recent version 1329 is available here: http://encrypted.net/IETFMetricsRegistry-106.html 1331 10.1. Registry Group 1333 The new registry group SHALL be named "PERFORMANCE METRICS Group". 1335 Registration Procedure: Specification Required 1337 Reference: 1339 Experts: Performance Metrics Experts 1341 Note: TBD 1343 10.2. Performance Metric Name Elements 1345 This document specifies the procedure for Performance Metrics Name 1346 Element Registry setup. IANA is requested to create a new set of 1347 registries for Performance Metric Name Elements called "Registered 1348 Performance Metric Name Elements". The registries, whose names are 1349 listed below: 1351 MetricType: 1353 Method: 1355 SubTypeMethod: 1357 Spec: 1359 Units: 1361 Output: 1363 will each contain the current set of possibilities for Performance Metrics 1364 Registry Entry Names. 1366 To populate the Registered Performance Metric Name Elements at 1367 creation, IANA is asked to use the lists of values for each name 1368 element listed in Section 7.1.2. The Name Elements in each registry 1369 are case-sensitive. 1371 When preparing a Metric entry for Registration, the developer SHOULD 1372 choose Name elements from among the registered elements. However, if 1373 the proposed metric is unique in a significant way, it may be 1374 necessary to propose a new Name element to properly describe the 1375 metric, as described below. 1377 A candidate Metric Entry RFC or immutable document for IANA and 1378 Expert Review would propose one or more new element values required 1379 to describe the unique entry, and the new name element(s) would be 1380 reviewed along with the metric entry. New assignments for Registered 1381 Performance Metric Name Elements will be administered by IANA through 1382 Specification Required policy (which includes Expert Review) 1383 [RFC8126], i.e., review by one of a group of experts, the Performance 1384 Metric Experts, who are appointed by the IESG upon recommendation of 1385 the Transport Area Directors. 1387 10.3. New Performance Metrics Registry 1389 This document specifies the procedure for Performance Metrics 1390 Registry setup. IANA is requested to create a new registry for 1391 Performance Metrics called "Performance Metrics Registry".
This 1392 Registry will contain the following Summary columns: 1394 Identifier: 1396 Name: 1398 URIs: 1400 Description: 1402 Reference: 1404 Change Controller: 1406 Version: 1408 Descriptions of these columns and additional information found in the 1409 template for registry entries (categories and columns) are further 1410 defined in Section 7. 1412 The Identifier 0 should be Reserved. The Registered Performance 1413 Metric unique identifier is an unbounded integer (range 0 to 1414 infinity). The Identifier values from 64512 to 65536 are reserved 1415 for private or experimental use, and the user may encounter 1416 overlapping uses. When adding newly Registered Performance Metrics 1417 to the Performance Metrics Registry, IANA SHOULD assign the lowest 1418 available identifier to the new Registered Performance Metric. If a 1419 Performance Metrics Expert providing review determines that there is 1420 a reason to assign a specific numeric identifier, possibly leaving a 1421 temporary gap in the numbering, then the Performance Metrics Expert SHALL 1422 inform IANA of this decision. 1424 Names starting with the prefix Priv_ are reserved for private use, 1425 and are not considered for registration. The "Name" column entries 1426 are further defined in Section 7. 1428 The "URIs" column will have a URL to the full template of each 1429 registry entry. The Registry Entry text SHALL be HTML-ized to aid 1430 the reader, with links to reference RFCs (similar to the way that 1431 Internet-Drafts are HTML-ized; the same tool can perform the 1432 function) or other immutable documents. 1434 The "Reference" column will include an RFC number, an approved 1435 specification designator from another standards body, or other 1436 immutable document. 1438 New assignments for the Performance Metrics Registry will be administered 1439 by IANA through Specification Required policy (which includes Expert 1440 Review) [RFC8126], i.e., review by one of a group of experts, the 1441 Performance Metric Experts, who are appointed by the IESG upon 1442 recommendation of the Transport Area Directors, or by Standards 1443 Action. The experts can be initially drawn from the Working Group 1444 Chairs, document editors, and members of the Performance Metrics 1445 Directorate, among other sources of experts. 1447 Extensions of the Performance Metrics Registry require IETF Standards 1448 Action. Only one form of registry extension is envisaged: 1450 1. Adding columns, or both categories and columns, to accommodate 1451 unanticipated aspects of new measurements and metric categories. 1453 If the Performance Metrics Registry is extended in this way, the 1454 Version number of future entries complying with the extension SHALL 1455 be incremented (either in the units or tenths digit, depending on the 1456 degree of extension). 1458 11. Blank Registry Template 1460 This section provides a blank template to help IANA and registry 1461 entry writers. 1463 11.1. Summary 1465 This category includes multiple indexes to the registry entry: the 1466 element ID and metric name. 1468 11.1.1. ID (Identifier) 1470 1472 11.1.2. Name 1474 1476 11.1.3. URIs 1478 URL: https://www.iana.org/ ... 1480 11.1.4. Description 1482 1484 11.1.5. Change Controller 1486 11.1.6. Version (of Registry Format) 1488 11.2. Metric Definition 1490 This category includes columns to prompt the entry of all necessary 1491 details related to the metric definition, including the immutable 1492 document reference and values of input factors, called fixed 1493 parameters.
1495 11.2.1. Reference Definition 1497 1499 1501 11.2.2. Fixed Parameters 1503 1507 11.3. Method of Measurement 1509 This category includes columns for references to relevant sections of 1510 the immutable document(s) and any supplemental information needed to 1511 ensure unambiguous methods for implementations. 1513 11.3.1. Reference Method 1515 1518 11.3.2. Packet Stream Generation 1520 1522 11.3.3. Traffic Filtering (observation) Details 1524 The measured results are based on a filtered version of the packets 1525 observed, and this section provides the filter details (when 1526 present). 1528
1530 11.3.4. Sampling Distribution 1532 1535 11.3.5. Run-time Parameters and Data Format 1537 Run-time Parameters are input factors that must be determined, 1538 configured into the measurement system, and reported with the results 1539 for the context to be complete. 1541 1543 11.3.6. Roles 1545 1547 11.4. Output 1549 This category specifies all details of the Output of measurements 1550 using the metric. 1552 11.4.1. Type 1554 1556 11.4.2. Reference Definition 1558 1560 11.4.3. Metric Units 1562 1565 11.4.4. Calibration 1567 1569 11.5. Administrative items 1571 11.5.1. Status 1573 1575 11.5.2. Requester 1577 1579 11.5.3. Revision 1581 <1.0> 1583 11.5.4. Revision Date 1585 1587 11.6. Comments and Remarks 1589 1591 12. Acknowledgments 1593 Thanks to Brian Trammell and Bill Cerveny, IPPM chairs, for leading 1594 some brainstorming sessions on this topic. Thanks to Barbara Stark 1595 and Juergen Schoenwaelder for the detailed feedback and suggestions. 1596 Thanks to Andrew McGregor for suggestions on metric naming. Thanks 1597 to Michelle Cotton for her early IANA review, and to Amanda Barber 1598 for answering questions related to the presentation of the registry 1599 and accessibility of the complete template via URL. Thanks to Roni 1600 Even for his review and suggestions to generalize the procedures. 1601 Thanks to all the Area Directors for their reviews. 1603 13. References 1605 13.1. Normative References 1607 [RFC2026] Bradner, S., "The Internet Standards Process -- Revision 1608 3", BCP 9, RFC 2026, DOI 10.17487/RFC2026, October 1996, 1609 <https://www.rfc-editor.org/info/rfc2026>. 1611 [RFC2119] Bradner, S., "Key words for use in RFCs to Indicate 1612 Requirement Levels", BCP 14, RFC 2119, 1613 DOI 10.17487/RFC2119, March 1997, 1614 <https://www.rfc-editor.org/info/rfc2119>. 1616 [RFC2330] Paxson, V., Almes, G., Mahdavi, J., and M. Mathis, 1617 "Framework for IP Performance Metrics", RFC 2330, 1618 DOI 10.17487/RFC2330, May 1998, 1619 <https://www.rfc-editor.org/info/rfc2330>. 1621 [RFC3986] Berners-Lee, T., Fielding, R., and L. Masinter, "Uniform 1622 Resource Identifier (URI): Generic Syntax", STD 66, 1623 RFC 3986, DOI 10.17487/RFC3986, January 2005, 1624 <https://www.rfc-editor.org/info/rfc3986>. 1626 [RFC6390] Clark, A. and B. Claise, "Guidelines for Considering New 1627 Performance Metric Development", BCP 170, RFC 6390, 1628 DOI 10.17487/RFC6390, October 2011, 1629 <https://www.rfc-editor.org/info/rfc6390>. 1631 [RFC6576] Geib, R., Ed., Morton, A., Fardid, R., and A. Steinmitz, 1632 "IP Performance Metrics (IPPM) Standard Advancement 1633 Testing", BCP 176, RFC 6576, DOI 10.17487/RFC6576, March 1634 2012, <https://www.rfc-editor.org/info/rfc6576>. 1636 [RFC7799] Morton, A., "Active and Passive Metrics and Methods (with 1637 Hybrid Types In-Between)", RFC 7799, DOI 10.17487/RFC7799, 1638 May 2016, <https://www.rfc-editor.org/info/rfc7799>. 1640 [RFC8126] Cotton, M., Leiba, B., and T. Narten, "Guidelines for 1641 Writing an IANA Considerations Section in RFCs", BCP 26, 1642 RFC 8126, DOI 10.17487/RFC8126, June 2017, 1643 <https://www.rfc-editor.org/info/rfc8126>. 1645 [RFC8174] Leiba, B., "Ambiguity of Uppercase vs Lowercase in RFC 1646 2119 Key Words", BCP 14, RFC 8174, DOI 10.17487/RFC8174, 1647 May 2017, <https://www.rfc-editor.org/info/rfc8174>. 1649 13.2. Informative References 1651 [I-D.ietf-ippm-initial-registry] 1652 Morton, A., Bagnulo, M., Eardley, P., and K. D'Souza, 1653 "Initial Performance Metrics Registry Entries", draft- 1654 ietf-ippm-initial-registry-14 (work in progress), November 1655 2019. 1657 [RFC2681] Almes, G., Kalidindi, S., and M. Zekauskas, "A Round-trip 1658 Delay Metric for IPPM", RFC 2681, DOI 10.17487/RFC2681, 1659 September 1999, <https://www.rfc-editor.org/info/rfc2681>. 1661 [RFC3432] Raisanen, V., Grotefeld, G., and A.
Morton, "Network 1661 performance measurement with periodic streams", RFC 3432, 1662 DOI 10.17487/RFC3432, November 2002, 1663 <https://www.rfc-editor.org/info/rfc3432>. 1665 [RFC3550] Schulzrinne, H., Casner, S., Frederick, R., and V. 1666 Jacobson, "RTP: A Transport Protocol for Real-Time 1667 Applications", STD 64, RFC 3550, DOI 10.17487/RFC3550, 1668 July 2003, <https://www.rfc-editor.org/info/rfc3550>. 1670 [RFC3611] Friedman, T., Ed., Caceres, R., Ed., and A. Clark, Ed., 1671 "RTP Control Protocol Extended Reports (RTCP XR)", 1672 RFC 3611, DOI 10.17487/RFC3611, November 2003, 1673 <https://www.rfc-editor.org/info/rfc3611>. 1675 [RFC4148] Stephan, E., "IP Performance Metrics (IPPM) Metrics 1676 Registry", BCP 108, RFC 4148, DOI 10.17487/RFC4148, August 1677 2005, <https://www.rfc-editor.org/info/rfc4148>. 1679 [RFC5474] Duffield, N., Ed., Chiou, D., Claise, B., Greenberg, A., 1680 Grossglauser, M., and J. Rexford, "A Framework for Packet 1681 Selection and Reporting", RFC 5474, DOI 10.17487/RFC5474, 1682 March 2009, <https://www.rfc-editor.org/info/rfc5474>. 1684 [RFC5475] Zseby, T., Molina, M., Duffield, N., Niccolini, S., and F. 1685 Raspall, "Sampling and Filtering Techniques for IP Packet 1686 Selection", RFC 5475, DOI 10.17487/RFC5475, March 2009, 1687 <https://www.rfc-editor.org/info/rfc5475>. 1689 [RFC5477] Dietz, T., Claise, B., Aitken, P., Dressler, F., and G. 1690 Carle, "Information Model for Packet Sampling Exports", 1691 RFC 5477, DOI 10.17487/RFC5477, March 2009, 1692 <https://www.rfc-editor.org/info/rfc5477>. 1694 [RFC6035] Pendleton, A., Clark, A., Johnston, A., and H. Sinnreich, 1695 "Session Initiation Protocol Event Package for Voice 1696 Quality Reporting", RFC 6035, DOI 10.17487/RFC6035, 1697 November 2010, <https://www.rfc-editor.org/info/rfc6035>. 1699 [RFC6248] Morton, A., "RFC 4148 and the IP Performance Metrics 1700 (IPPM) Registry of Metrics Are Obsolete", RFC 6248, 1701 DOI 10.17487/RFC6248, April 2011, 1702 <https://www.rfc-editor.org/info/rfc6248>. 1704 [RFC6991] Schoenwaelder, J., Ed., "Common YANG Data Types", 1705 RFC 6991, DOI 10.17487/RFC6991, July 2013, 1706 <https://www.rfc-editor.org/info/rfc6991>. 1708 [RFC7012] Claise, B., Ed. and B. Trammell, Ed., "Information Model 1709 for IP Flow Information Export (IPFIX)", RFC 7012, 1710 DOI 10.17487/RFC7012, September 2013, 1711 <https://www.rfc-editor.org/info/rfc7012>. 1713 [RFC7014] D'Antonio, S., Zseby, T., Henke, C., and L. Peluso, "Flow 1714 Selection Techniques", RFC 7014, DOI 10.17487/RFC7014, 1715 September 2013, <https://www.rfc-editor.org/info/rfc7014>. 1717 [RFC7594] Eardley, P., Morton, A., Bagnulo, M., Burbridge, T., 1718 Aitken, P., and A. Akhter, "A Framework for Large-Scale 1719 Measurement of Broadband Performance (LMAP)", RFC 7594, 1720 DOI 10.17487/RFC7594, September 2015, 1721 <https://www.rfc-editor.org/info/rfc7594>. 1723 [RFC7679] Almes, G., Kalidindi, S., Zekauskas, M., and A. Morton, 1724 Ed., "A One-Way Delay Metric for IP Performance Metrics 1725 (IPPM)", STD 81, RFC 7679, DOI 10.17487/RFC7679, January 1726 2016, <https://www.rfc-editor.org/info/rfc7679>. 1728 [RFC8194] Schoenwaelder, J. and V. Bajpai, "A YANG Data Model for 1729 LMAP Measurement Agents", RFC 8194, DOI 10.17487/RFC8194, 1730 August 2017, <https://www.rfc-editor.org/info/rfc8194>. 1732 Authors' Addresses 1733 Marcelo Bagnulo 1734 Universidad Carlos III de 1735 Madrid 1736 Av. Universidad 30 1737 Leganes, Madrid 28911 1738 SPAIN 1740 Phone: 34 91 6249500 1741 Email: marcelo@it.uc3m.es 1742 URI: http://www.it.uc3m.es 1744 Benoit Claise 1745 Cisco Systems, 1746 Inc. 1747 De Kleetlaan 6a b1 1748 1831 Diegem 1749 Belgium 1751 Email: bclaise@cisco.com 1753 Philip Eardley 1754 BT 1755 Adastral Park, Martlesham Heath 1756 Ipswich 1757 ENGLAND 1759 Email: philip.eardley@bt.com 1761 Al Morton 1762 AT&T Labs 1763 200 Laurel Avenue South 1764 Middletown, NJ 1765 USA 1767 Email: acmorton@att.com 1769 Aamer Akhter 1770 Consultant 1771 118 Timber Hitch 1772 Cary, NC 1773 USA 1775 Email: aakhter@gmail.com