Network Working Group                                       L. Dusseault
Internet-Draft                                      Messaging Architects
Updates: 2026 (if approved)                                    R. Sparks
Intended status: BCP                                              Tekelec
Expires: January 3, 2010                                     July 2, 2009


       Guidance on Interoperation and Implementation Reports for
                      Advancement to Draft Standard
                    draft-dusseault-impl-reports-04

Status of this Memo

   This Internet-Draft is submitted to IETF in full conformance with
   the provisions of BCP 78 and BCP 79.

   Internet-Drafts are working documents of the Internet Engineering
   Task Force (IETF), its areas, and its working groups. Note that
   other groups may also distribute working documents as Internet-
   Drafts.

   Internet-Drafts are draft documents valid for a maximum of six
   months and may be updated, replaced, or obsoleted by other documents
   at any time. It is inappropriate to use Internet-Drafts as reference
   material or to cite them other than as "work in progress."

   The list of current Internet-Drafts can be accessed at
   http://www.ietf.org/ietf/1id-abstracts.txt.
   The list of Internet-Draft Shadow Directories can be accessed at
   http://www.ietf.org/shadow.html.

   This Internet-Draft will expire on January 3, 2010.

Copyright Notice

   Copyright (c) 2009 IETF Trust and the persons identified as the
   document authors. All rights reserved.

   This document is subject to BCP 78 and the IETF Trust's Legal
   Provisions Relating to IETF Documents in effect on the date of
   publication of this document (http://trustee.ietf.org/license-info).
   Please review these documents carefully, as they describe your
   rights and restrictions with respect to this document.

Abstract

   Advancing a protocol to Draft Standard requires documentation of the
   interoperation and implementation of the protocol. Historical
   reports have varied widely in form and level of content, and there
   is little guidance available to new report preparers. This document
   updates the process described in RFC 2026 and provides more detail
   on what is appropriate in an interoperability and implementation
   report.

Table of Contents

   1.  Introduction
   2.  Content Requirements
   3.  Format
   4.  Feature Coverage
   5.  Special Cases
       5.1.  Deployed Protocols
       5.2.  Undeployed Protocols
       5.3.  Schemas, languages and formats
       5.4.  Multiple Contributors, Multiple Implementation Reports
       5.5.  Test Suites
       5.6.  Optional Features, extensibility features
   6.  Examples
       6.1.  Minimal Implementation Report
       6.2.  Covering Exceptions
   7.  Security Considerations
   8.  IANA Considerations
   9.  References
       9.1.  Normative References
       9.2.  Informative References
   Authors' Addresses

1. Introduction

   The Draft Standard level, and the requirements for standards to meet
   it, are described in [RFC2026]. For Draft Standard, not only must
   two implementations interoperate, but documentation (the report)
   must also be provided to the IETF. The entire paragraph covering
   this documentation reads:

      The Working Group chair is responsible for documenting the
      specific implementations which qualify the specification for
      Draft or Internet Standard status along with documentation about
      testing of the interoperation of these implementations. The
      documentation must include information about the support of each
      of the individual options and features. This documentation
      should be submitted to the Area Director with the protocol
      action request. (see Section 6)

   Moving documents along the standards track can be an important
   signal to the user and implementor communities, and the process of
   submitting a standard for advancement can help improve that standard
   or the quality of implementations that participate.
   However, the
   barriers to advancement to Draft Standard seem to be high, or at the
   very least confusing. This memo may help guide people through one
   part of advancing specifications to Draft Standard. It also changes
   some of the requirements made in RFC2026 in ways that are intended
   to maintain or improve the quality of reports while reducing the
   burden of creating them.

   Having and demonstrating sufficient interoperability is a gating
   requirement for advancing a protocol to Draft Standard. Thus, the
   primary goal of an implementation report is to convince the IETF and
   the IESG that the protocol is ready for Draft Standard. This goal
   can be met by summarizing the interoperability characteristics and
   by providing just enough detail to support that conclusion. Side
   benefits may accrue to the community creating the report in the form
   of bugs found or fixed in tested implementations, documentation that
   can help future implementors, or ideas for other documents or future
   revisions of the protocol being tested.

   Documentation appropriate for widely deployed standards differs from
   that appropriate for standards that are not yet deployed. Different
   test approaches are also appropriate for standards that are not
   typical protocols: languages, formats, schemas, etc. Section 5
   discusses how reports for these standards may vary.

   Implementations should naturally focus on the final version of the
   RFC. If there is any evidence that implementations are
   interoperating based on Internet-Drafts or earlier versions of the
   specification, or if interoperability was greatly aided by mailing
   list clarifications, this should be noted in the report.

   The level of detail in reports accepted in the past has varied
   widely. An example of a submitted report that is not sufficient for
   demonstrating interoperability is (in its entirety): "A partial list
   of implementations include: Cray SGI Netstar IBM HP Network Systems
   Convex." This report does not state how it is known that these
   implementations interoperate (was it through public lab testing?
   internal lab testing? deployment?). Nor does it capture whether
   implementors are aware of, or were asked about, any features that
   proved to be problematic. At the other extreme, reports have been
   submitted that contain a great amount of detail about the test
   methodology but relatively little information about what worked and
   what failed to work.

   This memo is intended to clarify what an implementation report
   should contain and to suggest a reasonable form for most
   implementation reports. It is not intended to rule out good ideas.
   For example, this memo can't take into account all process
   variations, such as documents going to Draft Standard twice, or
   consider all types of standards. Whenever the situation varies
   significantly from what's described here, the IESG uses judgement in
   determining whether an implementation report meets the goals above.

   The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT",
   "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" in this
   document are to be interpreted as described in BCP 14, [RFC2119].

2. Content Requirements

   The implementation report MUST identify the author of the report,
   who is responsible for characterizing the interoperability quality
   of the protocol.
   The report MAY identify other contributors
   (testers, those who answered surveys, or those who contributed
   information) to share credit or blame. The report MAY provide a list
   of report reviewers who corroborate the characterization of
   interoperability quality, or name an active WG that reviewed the
   report.

   Some of the requirements of RFC2026 are relaxed with this update:

   o  The report MAY name exactly which implementations were tested. A
      requirement to name implementations was implied by the
      description of the responsibility for "documenting the specific
      implementations" in RFC2026. However, note that identifying
      implementations will usually help meet the goals of
      implementation reports. If a subset of implementations was tested
      or surveyed, it would also help to explain how that subset was
      chosen or self-selected. See also the note on implementation
      independence below.

   o  The report author MAY choose an appropriate level of detail to
      document feature interoperability, rather than document each
      individual feature. See the note on granularity of features
      below.

   o  A contributor other than a WG chair MAY submit an implementation
      report to an AD.

   o  Optional features that are not implemented, but are important and
      do not harm interoperability, MAY, exceptionally and with the
      approval of the IESG, be left in a protocol at Draft Standard.
      See Section 5.6 for documentation requirements and an example of
      where this is needed.

   Note: Independence of implementations is mentioned in the RFC2026
   requirements for Draft Standard status. Independent implementations
   should be written by different people at different organizations
   using different code and protocol libraries. If it's necessary to
   relax this definition, it can be relaxed as long as there is
   evidence to show that success is due more to the quality of the
   protocol than to out-of-band understandings or common code. If there
   are only two implementations of an undeployed protocol, the report
   SHOULD identify the implementations and their "genealogy" (which
   libraries were used or where the codebase came from). If there are
   many more implementations, or the protocol is in broad deployment,
   it is not necessary to call out which two of the implementations
   demonstrated interoperability of each given feature -- a reader may
   conclude that at least some of the implementations of that feature
   are independent.

   Note: The granularity of features described in a specification is
   necessarily very detailed. In contrast, the granularity of an
   implementation report need not be as detailed. A report need not
   list every "MAY", "SHOULD", and "MUST" in a complete matrix across
   implementations. A more effective approach might be to characterize
   the interoperability quality and testing approach, then call out any
   known problems in either testing or interoperability.

3. Format

   The format of implementation and interoperability reports MUST be
   ASCII text with line breaks for readability. As with Internet-
   Drafts, no 8-bit characters are currently allowed. It is acceptable,
   but not necessary, for a report to be formatted as an Internet-
   Draft.
   Here is a simple outline that an implementation report MAY follow in
   part or in full:

   Title:  Titles of implementation reports are strongly RECOMMENDED to
      contain one or more RFC numbers for consistent lookup in a simple
      archive. In addition, the name or a common mnemonic of the
      standard should be in the title. An example might look like
      "Implementation Report for the Example Name of Some Protocol
      (ENSP) RFC-XXXX".

   Author:  Identify the author of the report.

   Summary:  Attest that the standard meets the requirements for Draft
      Standard and name who is attesting it. Describe how many
      implementations were tested or surveyed. Quickly characterize the
      deployment level and where the standard can be found in
      deployment. Call out and, if possible, briefly describe any
      notably difficult or poorly interoperable features, and explain
      why these still meet the requirement. Assert any derivative
      conclusions: if a high-level system is tested and shown to work,
      then we may conclude that the normative requirements of that
      system (all sub-system or lower-layer protocols, to the extent
      that a range of features is used) have also been shown to work.

   Methodology:  Describe how the information in the report was
      obtained. This should be no longer than the summary.

   Exceptions:  This section might read "Every feature was implemented,
      tested and widely interoperable without exception and without
      question." If that statement is not true, then this section
      should cover whether any features were thought to be problematic.
      Problematic features need not disqualify a protocol from Draft
      Standard, but this section should explain why they do not (e.g.,
      optional, untestable, trace or extension features). See the
      example in Section 6.2.

   Detail sections:  Any other justifying or background information can
      be included here. In particular, any information that would have
      made the summary or methodology sections more than a few
      paragraphs long may be moved into a detail section and referred
      to.

      In this section, it would be good to discuss how the various
      considerations sections played out. Were the security
      considerations accurate and dealt with appropriately in
      implementations? Was real internationalization experience found
      among the tested implementations? Did the implementations have
      any common monitoring or management functionality (although note
      that documenting the interoperability of a management standard
      might be separate from documenting the interoperability of the
      protocol itself)? Did the IANA registries or registrations, if
      any, work as intended?

   Appendix sections:  It's not necessary to archive test material such
      as test suites, test documents, questionnaire text, or
      questionnaire responses. However, if it's easy to preserve this
      information, appendix sections allow readers to skip over it if
      they are not interested. Preserving detailed test information can
      help people doing similar or follow-on implementation reports,
      and can also help new implementors.
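   For illustration only, a very short report following this outline
   might be sketched as follows; it reuses the hypothetical ENSP
   protocol named above, and every other detail (author, counts,
   feature names) is likewise invented:

      Implementation Report for the Example Name of Some Protocol
      (ENSP) RFC-XXXX

      Author: A. Editor <editor@example.com>

      Summary: Six independently developed implementations were
         surveyed, and ENSP is deployed in production at dozens of
         sites. All features interoperate, with the exception noted
         below, and the author believes the specification meets the
         requirements for Draft Standard.

      Methodology: Implementors answered a short questionnaire posted
         to the working group mailing list.

      Exceptions: The optional FOO mechanism is implemented by only
         two of the six implementations; those two interoperate.

   A real report will usually need more supporting detail than this,
   but the outline scales from a summary this brief up to a report with
   extensive detail and appendix sections.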
4. Feature Coverage

   What constitutes a "feature" for the purposes of an interoperability
   report has been frequently debated. Good judgement is required in
   finding a level of detail that adequately demonstrates coverage of
   the requirements. Statements made at too high a level result in a
   document that can't be verified and that gives little confidence
   that the testing did not accidentally miss an important failure to
   interoperate. On the other hand, statements at too fine a level
   result in an exponentially exploding matrix of requirement
   interactions that overburdens the testers and report writers. The
   important information in the resulting report would likely be hard
   to find in the sea of detail, making it difficult to evaluate
   whether the important points of interoperability have been
   addressed.

   The best interoperability reports will organize statements of
   interoperability at a level of detail just sufficient to convince
   the reader that testing has covered the full set of requirements
   and, in particular, that the testing was sufficient to uncover any
   places where interoperability does not exist. Reports similar to
   that for RTP/RTCP (an excerpt appears below) are more useful than an
   exhaustive checklist of every normative statement in the
   specification.

      10.  Interoperable exchange of receiver report packets.

           o PASS: Many implementations, tested UCL rat with vat,
             Cisco IP/TV with vat/vic.

      11.  Interoperable exchange of receiver report packets when
           not receiving data (ie: the empty receiver report
           which has to be sent first in each compound RTCP packet
           when no participants are transmitting data).

           o PASS: Many implementations, tested UCL rat with vat,
             Cisco IP/TV with vat/vic.

      ...

      8.   Interoperable transport of RTP via TCP using the
           encapsulation defined in the audio/video profile

           o FAIL: no known implementations. This has been
             removed from the audio/video profile.

   Excerpts from
   http://www.ietf.org/IESG/Implementations/RTP-RTCP-Implementation.txt

   Consensus can be a good tool to help determine the appropriate level
   for such feature descriptions. A working group can make a strong
   statement by documenting its consensus that a report sufficiently
   covers a specification and that interoperability has been
   demonstrated.

5. Special Cases

5.1. Deployed Protocols

   When a protocol is deployed, results obtained from laboratory
   testing are not as useful to the IETF as learning what is actually
   working in deployment. To this end, it may be more informative to
   survey implementors or operators. A questionnaire or interview can
   elicit information from a wider range of sources. As long as it is
   known that independent implementations can work in deployment, it is
   more useful to discover what problems exist than to gather long and
   detailed checklists of features and options.

5.2. Undeployed Protocols

   It is appropriate to provide finer-grained detail in reports for
   protocols that do not yet have a wealth of experience gained through
   deployment. In particular, some complicated, flexible, or powerful
   features might show interoperability problems when testers start to
   probe outside the core use cases. RFC2026 requires "sufficient
   successful operational experience" before progressing a standard to
   Draft, and notes that

      Draft Standard may still require additional or more widespread
      field experience, since it is possible for implementations based
      on Draft Standard specifications to demonstrate unforeseen
      behavior when subjected to large-scale use in production
      environments.
   When possible, reports for protocols without much deployment
   experience should anticipate common operational considerations. For
   example, it would be appropriate to put additional emphasis on any
   overload or congestion management features the protocol may have.

5.3. Schemas, languages and formats

   Standards that are not on-the-wire protocols may be special cases
   for implementation reports. The IESG SHOULD use judgement in
   deciding what kind of implementation information is acceptable for
   these kinds of standards. ABNF (RFC 4234) is an example of a
   language for which an implementation report was filed: it is
   interoperable in that protocols are specified using ABNF, and these
   protocols can be successfully implemented and their syntax verified.
   Implementations of ABNF include the RFCs that use it as well as ABNF
   checking software. MIBs are sometimes documented in implementation
   reports, and examples of that can be found in the archive of
   implementation reports.

   The interoperability reporting requirements for some classes of
   documents may be discussed in separate documents. See
   [I-D.bradner-metricstest] for example.

5.4. Multiple Contributors, Multiple Implementation Reports

   If it's easiest to divide up the work of implementation reports by
   implementation, the result -- multiple implementation reports -- MAY
   be submitted to the sponsoring Area Director one by one. Each report
   might cover one implementation, including:

      identification of the implementation,

      an affirmation that the implementation works in testing (or
      better, in deployment),

      whether any features are known to interoperate poorly with other
      implementations,

      which optional or required features are not implemented (note
      that there are no protocol police to punish this disclosure; we
      should instead thank implementors who point out unimplemented or
      unimplementable features, especially if they can explain why),

      who is submitting this report for this implementation.

   These SHOULD be collated into one document for archiving under one
   title, but they can be concatenated trivially even if the result has
   several summary sections or introductions.
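   For illustration, one of these per-implementation submissions might
   be as short as the following sketch; the product, feature, and
   contact names are entirely hypothetical:

      Implementation: ExampleCo ENSP Server 2.1
      Status: Deployed at three customer sites since 2008;
         interoperates in production with two independently developed
         clients.
      Known problems: None reported; the optional FOO mechanism has
         only been tested against one other implementation.
      Not implemented: The BAR option (no demand for it; its absence
         has not caused interoperability problems).
      Submitted by: A. Implementor <implementor@example.com>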
5.5. Test Suites

   Some automated tests, such as automated test clients, do not test
   interoperability directly. When specialized test implementations are
   necessary, tests can at least be constructed from real-world
   protocol or document examples. For example:

   -  ABNF [RFC4234] itself was tested by combining real-world examples
      -- uses of ABNF found in well-known RFCs -- and feeding those
      real-world examples into ABNF checkers. As the well-known RFCs
      were themselves interoperable and in broad deployment, this
      served as both a deployment proof and an interoperability proof.

   -  Atom [RFC4287] clients might be tested by confirming that they
      consistently display the information in a test Atom feed,
      constructed from real-world examples that cover all the required
      and optional features.

   As a counter-example, the automated WebDAV test client Litmus
   (http://www.webdav.org/neon/litmus/) is of limited use in
   demonstrating interoperability for WebDAV because it tests the
   completeness of server implementations and simple test cases. It
   does not test real-world use or whether any real WebDAV clients
   implement a feature properly or at all.

5.6. Optional Features, extensibility features

   Optional features need not be shown to be implemented everywhere.
   However, they do need to be implemented somewhere, and more than one
   independent implementation is required. If an optional feature does
   not meet this requirement, the implementation report must say so and
   explain why the feature should be kept anyway rather than being
   treated as evidence of a poor-quality standard.

   Extensibility points and versioning features are particularly likely
   to need this kind of treatment. When version 1 of a protocol is
   released, the protocol version field itself is likely to be unused.
   Before any other versions exist, it can't really be demonstrated
   that this particular field or option is implemented.

6. Examples

   Some good, extremely brief examples of implementation reports can be
   found in the archives:

      http://www.ietf.org/IESG/Implementations/PPP-LCP-EXT-implementation

      http://www.ietf.org/IESG/Implementations/OTP-Draft-implementation

   In some cases, perfectly good implementation reports are longer than
   necessary, but may preserve helpful information:

      http://www.ietf.org/IESG/Implementations/rfc2329.txt

      http://www.ietf.org/IESG/Implementations/RFC4234_implem.txt

6.1. Minimal Implementation Report

   "A large number of SMTP implementations support SMTP pipelining,
   including: (1) Innosoft's PMDF and Sun's SIMS. (2)
   ISODE/MessagingDirect's PP. (3) ISOCOR's nPlex. (4) software.com's
   post.office. (5) Zmailer. (6) Smail. (7) The SMTP server in Windows
   2000. SMTP pipelining has been widely deployed in these and other
   implementations for some time, and there have been no reported
   interoperability problems."

   This implementation report can also be found at
   http://www.ietf.org/IESG/Implementations/SMTP-PIPELINING-Standard-implementation
   but the entire report is already reproduced above. Since SMTP
   pipelining had no interoperability problems, the implementation
   report was able to provide all the key information in a very terse
   format. The reader can infer from the different vendors and
   platforms that the codebases must by and large be independent. This
   implementation report would only be slightly improved by a positive
   affirmation that there have been probes or investigations asking
   about interoperability problems, rather than merely a lack of
   problem reports, and by stating who provided this summary report.

6.2. Covering Exceptions

   The RFC2821bis (SMTP) implementation survey asked implementors what
   features were not implemented. The VRFY and EXPN commands showed up
   frequently in the responses as not implemented or disabled. Had this
   document already existed, that implementation report might have
   followed its advice by justifying the interoperability of those
   features up front, or in an "Exceptions" section if the outline
   defined in this memo were used:

   "VRFY and EXPN commands are often not implemented or are disabled.
   This does not pose an interoperability problem for SMTP because EXPN
   is an optional feature and its support is never relied on. VRFY is
   required, but in practice it is not relied on because servers can
   legitimately reply with a non-committal response. These commands
   should remain in the standard because they are sometimes used by
   administrators within a domain under controlled circumstances (e.g.,
   authenticated query from within the domain).
   Thus, the occasional
   utility argues for keeping these features, while the lack of
   problems for end-users means that the interoperability of SMTP in
   real use is not in the least degraded."

7. Security Considerations

   This memo introduces no new security considerations.

8. IANA Considerations

   This document has no actions for IANA.

9. References

9.1. Normative References

   [RFC2119]  Bradner, S., "Key words for use in RFCs to Indicate
              Requirement Levels", BCP 14, RFC 2119, March 1997.

9.2. Informative References

   [I-D.bradner-metricstest]
              Bradner, S. and V. Paxson, "Advancement of metrics
              specifications on the IETF Standards Track",
              draft-bradner-metricstest-03 (work in progress),
              August 2007.

   [RFC2026]  Bradner, S., "The Internet Standards Process -- Revision
              3", BCP 9, RFC 2026, October 1996.

   [RFC4234]  Crocker, D., Ed. and P. Overell, "Augmented BNF for
              Syntax Specifications: ABNF", RFC 4234, October 2005.

   [RFC4287]  Nottingham, M., Ed. and R. Sayre, Ed., "The Atom
              Syndication Format", RFC 4287, December 2005.

Authors' Addresses

   Lisa Dusseault
   Messaging Architects

   Email: lisa.dusseault@gmail.com


   Robert Sparks
   Tekelec
   17210 Campbell Road
   Suite 250
   Dallas, Texas 75254-4203
   USA

   Email: RjS@nostrum.com