Network Working Group                        Donald E. Eastlake, 3rd
OBSOLETES RFC 1750                              Jeffrey I. Schiller
                                                      Steve Crocker
Expires October 2004                                     April 2004

               Randomness Requirements for Security
               ---------- ------------ --- --------
               <draft-eastlake-randomness2-06.txt>
Status of This Document

This document is intended to become a Best Current Practice.
Comments should be sent to the authors.  Distribution is unlimited.

This document is an Internet-Draft and is in full conformance with
all provisions of Section 10 of RFC 2026.  Internet-Drafts are
working documents of the Internet Engineering Task Force (IETF), its
areas, and its working groups.  Note that other groups may also
distribute working documents as Internet-Drafts.

Internet-Drafts are draft documents valid for a maximum of six months
and may be updated, replaced, or obsoleted by other documents at any
time.  It is inappropriate to use Internet-Drafts as reference
material or to cite them other than as "work in progress."  The list
of current Internet-Drafts can be accessed at
http://www.ietf.org/ietf/1id-abstracts.txt.  The list of
Internet-Draft Shadow Directories can be accessed at
http://www.ietf.org/shadow.html.
Abstract

Security systems are built on strong cryptographic algorithms that
foil pattern analysis attempts.  However, the security of these
systems depends on generating secret quantities for passwords,
cryptographic keys, and similar purposes.  The use of pseudo-random
processes to generate secret quantities can result in
pseudo-security.  A sophisticated attacker may find it easier to
reproduce the environment that produced the secret quantities and
search the resulting small set of possibilities than to locate the
quantities in the whole of the potential number space.

Choosing random quantities to foil a resourceful and motivated
adversary is surprisingly difficult.  This document points out many
pitfalls in using traditional pseudo-random number generation
techniques for choosing such quantities.  It recommends the use of
truly random hardware techniques and shows that the existing hardware
on many systems can be used for this purpose.  It provides
suggestions to ameliorate the problem when a hardware solution is not
available.  And it gives examples of how large such quantities need
to be for some applications.
Acknowledgements

Special thanks to Peter Gutmann, who has permitted the incorporation
of material from his paper "Software Generation of Practically Strong
Random Numbers".

The following other persons (in alphabetic order) contributed
substantially to this document:

Tony Hansen, Sandy Harris, Paul Hoffman, Russ Housley

The following persons (in alphabetic order) contributed to RFC 1750,
the predecessor of this document:

David M. Balenson, Don T. Davis, Carl Ellison, Marc Horowitz,
Christian Huitema, Charlie Kaufman, Steve Kent, Hal Murray, Neil
Haller, Richard Pitkin, Tim Redmond, and Doug Tygar.
Table of Contents

Status of This Document
Abstract
Acknowledgements
Table of Contents
1. Introduction
2. General Requirements
3. Traditional Pseudo-Random Sequences
4. Unpredictability
4.1 Problems with Clocks and Serial Numbers
4.2 Timing and Content of External Events
4.3 The Fallacy of Complex Manipulation
4.4 The Fallacy of Selection from a Large Database
5. Hardware for Randomness
5.1 Volume Required
5.2 Sensitivity to Skew
5.2.1 Using Stream Parity to De-Skew
5.2.2 Using Transition Mappings to De-Skew
5.2.3 Using FFT to De-Skew
5.2.4 Using Compression to De-Skew
5.3 Existing Hardware Can Be Used For Randomness
5.3.1 Using Existing Sound/Video Input
5.3.2 Using Existing Disk Drives
5.4 Ring Oscillator Sources
6. Recommended Software Strategy
6.1 Mixing Functions
6.1.1 A Trivial Mixing Function
6.1.2 Stronger Mixing Functions
6.1.3 Diffie-Hellman as a Mixing Function
6.1.4 Using a Mixing Function to Stretch Random Bits
6.1.5 Other Factors in Choosing a Mixing Function
6.2 Non-Hardware Sources of Randomness
6.3 Cryptographically Strong Sequences
6.3.1 Traditional Strong Sequences
6.3.2 The Blum Blum Shub Sequence Generator
6.3.3 Entropy Pool Techniques
7. Key Generation Standards and Examples
7.1 US DoD Recommendations for Password Generation
7.2 X9.17 Key Generation
7.3 DSS Pseudo-Random Number Generation
7.4 X9.82 Pseudo-Random Number Generation
7.5 The /dev/random Device
8. Examples of Randomness Required
8.1 Password Generation
8.2 A Very High Security Cryptographic Key
8.2.1 Effort per Key Trial
8.2.2 Meet in the Middle Attacks
8.2.3 Other Considerations
9. Conclusion
10. Security Considerations
11. Intellectual Property Considerations
12. Appendix A: Changes from RFC 1750
13. Informative References
Authors Addresses
File Name and Expiration
1. Introduction

Software cryptography is coming into wider use and is continuing to
spread, although there is a long way to go until it becomes
pervasive.

Systems like SSH, IPSEC, TLS, S/MIME, PGP, DNSSEC, Kerberos, etc. are
maturing and becoming a part of the network landscape [SSH, IPSEC,
MAIL*, TLS, DNSSEC].  By comparison, when the previous version of
this document [RFC 1750] was issued in 1994, about the only Internet
cryptographic security specification in the IETF was the Privacy
Enhanced Mail protocol [MAIL PEM].
These systems provide substantial protection against snooping and
spoofing.  However, there is a potential flaw.  At the heart of all
cryptographic systems is the generation of secret, unguessable (i.e.,
random) numbers.

The lack of generally available facilities for generating such
unpredictable numbers is an open wound in the design of cryptographic
software.  For the software developer who wants to build a key or
password generation procedure that runs on a wide range of hardware,
the only safe strategy so far has been to force the local
installation to supply a suitable routine to generate random numbers.
This is an awkward, error-prone, and unpalatable solution.
It is important to keep in mind that the requirement is for data that
an adversary has a very low probability of guessing or determining.
This requirement can easily fail if pseudo-random data is used that
meets only traditional statistical tests for randomness, or that is
based on limited-range sources such as clocks.  Frequently such
random quantities are determinable by an adversary searching through
an embarrassingly small space of possibilities.

This Best Current Practice describes techniques for producing random
quantities that will be resistant to such attack.  It recommends that
future systems include hardware random number generation or provide
access to existing hardware that can be used for this purpose.  It
suggests methods for use if such hardware is not available.  And it
gives some estimates of the number of random bits required for sample
applications.
2. General Requirements

A commonly encountered randomness requirement today is the user
password.  This is usually a simple character string.  Obviously, if
a password can be guessed, it does not provide security.  (For
re-usable passwords, it is desirable that users be able to remember
the password.  This may make it advisable to use pronounceable
character strings or phrases composed of ordinary words.  But this
only affects the format of the password information, not the
requirement that the password be very hard to guess.)
Many other requirements come from the cryptographic arena.
Cryptographic techniques can be used to provide a variety of
services, including confidentiality and authentication.  Such
services are based on quantities, traditionally called "keys", that
are unknown to and unguessable by an adversary.

In some cases, such as the use of symmetric encryption with one-time
pads or the US Data Encryption Standard [DES] or Advanced Encryption
Standard [AES], the parties who wish to communicate confidentially
and/or with authentication must all know the same secret key.  In
other cases, using what are called asymmetric or "public key"
cryptographic techniques, keys come in pairs.  One key of the pair is
private and must be kept secret by one party; the other is public and
can be published to the world.  It is computationally infeasible to
determine the private key from the public key, and knowledge of the
public key is of no help to an adversary [ASYMMETRIC, SCHNEIER,
FERGUSON, KAUFMAN].
The frequency and volume of the requirement for random quantities
differs greatly for different cryptographic systems.  Using pure RSA,
random quantities are required only when a new key pair is generated;
thereafter, any number of messages can be signed without a further
need for randomness.  The public key Digital Signature Algorithm
devised by the US National Institute of Standards and Technology
(NIST) requires good random numbers for each signature [DSS].  And
encrypting with a one-time pad, in principle the strongest possible
encryption technique, requires a volume of randomness equal to all
the messages to be processed [SCHNEIER, FERGUSON, KAUFMAN].
In most of these cases, an adversary can try to determine the
"secret" key by trial and error.  (This is possible as long as the
key is enough smaller than the message that the correct key can be
uniquely identified.)  The probability of an adversary succeeding at
this must be made acceptably low, depending on the particular
application.  The size of the space the adversary must search is
related to the amount of key "information" present in the
information-theoretic sense [SHANNON].  This depends on the number of
different secret values possible and the probability of each value,
as follows:

                           -----
                           \
   Bits-of-information =   \   - p  * log  ( p )
                           /      i      2    i
                           /
                           -----

where i counts from 1 to the number of possible secret values and p
sub i is the probability of the value numbered i.  (Since p sub i is
less than one, the log will be negative, so each term in the sum will
be non-negative.)
If there are 2^n different values of equal probability, then n bits
of information are present and an adversary would, on average, have
to try half of the values, or 2^(n-1), before guessing the secret
quantity.  If the probability of different values is unequal, then
less information is present and fewer guesses will, on average, be
required by an adversary.  In particular, any values that the
adversary can know are impossible, or are of low probability, can be
initially ignored by the adversary, who will search through the more
probable values first.
For example, consider a cryptographic system that uses 128 bit keys.
If these 128 bit keys are derived using a fixed pseudo-random number
generator that is seeded with an 8 bit seed, then an adversary needs
to search through only 256 keys (by running the pseudo-random number
generator with every possible seed), not the 2^128 keys that may at
first appear to be the case.  Only 8 bits of "information" are in
these 128 bit keys.
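The two points above, the bits-of-information formula and the
seed-search attack, can be illustrated numerically.  The sketch below
is not from the original text; SHA-1 stands in for an arbitrary fixed
pseudo-random derivation, and the helper names are invented for
illustration:

```python
import hashlib
import math

def bits_of_information(probabilities):
    """Shannon information: sum of -p_i * log2(p_i) over possible values."""
    return sum(-p * math.log2(p) for p in probabilities if p > 0)

# 2^8 equally likely seed values carry exactly 8 bits of information.
assert bits_of_information([1 / 256] * 256) == 8.0

# Derive a 128 bit key deterministically from an 8 bit seed.
# (SHA-1 is only a stand-in for any fixed PRNG keyed by the seed.)
def key_from_seed(seed):
    return hashlib.sha1(bytes([seed])).digest()[:16]

# An adversary simply enumerates every possible seed: at most 256
# distinct 128 bit keys can ever occur, not 2^128.
candidate_keys = {key_from_seed(s) for s in range(256)}
print(len(candidate_keys))
```

The key length is irrelevant here; the information content is bounded
by the seed, so the adversary's search space is 2^8, not 2^128.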
3. Traditional Pseudo-Random Sequences

Most traditional sources of random numbers use deterministic sources
of "pseudo-random" numbers.  These typically start with a "seed"
quantity and use numeric or logical operations to produce a sequence
of values.

[KNUTH] has a classic exposition on pseudo-random numbers.
Applications he mentions are simulation of natural phenomena,
sampling, numerical analysis, testing computer programs, decision
making, and games.  None of these have the same characteristics as
the sort of security uses we are talking about.  Only in the last two
could there be an adversary trying to find the random quantity.
However, in these cases, the adversary normally has only a single
chance to use a guessed value.  In guessing passwords or attempting
to break an encryption scheme, the adversary normally has many,
perhaps unlimited, chances at guessing the correct value.  They can
store the message they are trying to break and repeatedly attack it.
They must also be assumed to be aided by a computer.

For testing the "randomness" of numbers, Knuth suggests a variety of
measures, including statistical and spectral tests.  These tests
check things like autocorrelation between different parts of a
"random" sequence or the distribution of its values.  But they could
be met by a constant stored random sequence, such as the "random"
sequence printed in the CRC Standard Mathematical Tables [CRC].
Despite meeting all the tests suggested by Knuth, that sequence is
unsuitable for cryptographic use, as adversaries must be assumed to
have copies of all commonly published "random" sequences and will be
able to spot the source and predict future values.
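A minimal sketch (not from the original text) makes the gap between
statistical quality and unpredictability concrete.  The crude
frequency ("monobit") test below is satisfied perfectly by a
completely predictable sequence, just as it would be by a sequence
copied out of a printed table:

```python
def monobit_test(bits, tolerance=0.01):
    """Crude frequency test: the fraction of ones should be near 1/2.
    Passing says nothing at all about unpredictability."""
    ones = sum(bits)
    return abs(ones / len(bits) - 0.5) <= tolerance

# A perfectly balanced but utterly predictable sequence passes:
predictable = [0, 1] * 32
print(monobit_test(predictable))   # True
```

More elaborate statistical batteries have the same blind spot: they
measure distribution, not the adversary's ability to reproduce the
source.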
| A typical pseudo-random number generation technique, known as a | A typical pseudo-random number generation technique, known as a | |||
| linear congruence pseudo-random number generator, is modular | linear congruence pseudo-random number generator, is modular | |||
| arithmetic where the value numbered N+1 is calculated from the value | arithmetic where the value numbered N+1 is calculated from the value | |||
| numbered N by | numbered N by | |||
     V    = ( V  * a + b )(Mod c)
      N+1      N
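As an illustrative sketch only (the constants below are classic
C-library-style values, and the seed is arbitrary; none come from this
document), such a generator and its predictability can be shown in a
few lines of Python:

```python
# Sketch of a linear congruence generator.  The constants A, B, C and
# the seed are illustrative choices, not values from this document.
A, B, C = 1103515245, 12345, 2**31

def lcg(v):
    while True:
        v = (v * A + B) % C           # V[N+1] = ( V[N] * a + b )(Mod c)
        yield v

gen = lcg(42)
values = [next(gen) for _ in range(5)]

# Knowing the constants and any single value reveals every later one.
predicted_next = (values[2] * A + B) % C
assert predicted_next == values[3]
```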
The above technique has a strong relationship to linear shift
register pseudo-random number generators, which are well understood
cryptographically [SHIFT*]. In such generators bits are introduced at
one end of a shift register as the Exclusive Or (binary sum without
carry) of bits from selected fixed taps into the register. For
example:
   +----+     +----+     +----+                      +----+
   | B  | <-- | B  | <-- | B  | <--  . . . . . . <-- | B  | <-+
   |  0 |     |  1 |     |  2 |                      |  n |   |
   +----+     +----+     +----+                      +----+   |
     |                     |            |                     |
     |                     |            V        +-----+      |
     |                     V            +------> |     |      |
     V                     +-------------------> | XOR |      |
     +-----------------------------------------> |     |------+
                                                 +-----+
     V    = ( ( V  * 2 ) + B  .xor. B  ... )(Mod 2^n)
      N+1        N          0        2
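A sketch of such a shift register in Python, with an illustrative
8-bit width and taps at B0 (the high-order bit, which is shifted out)
and B2, matching the figure; the width and taps are assumptions made
for the example:

```python
# Sketch of an n-bit linear feedback shift register.  The width (8)
# and the B0/B2 taps are illustrative choices.
def lfsr_steps(v, n=8, steps=20):
    out = []
    for _ in range(steps):
        b0 = (v >> (n - 1)) & 1            # B0: the bit shifted out
        b2 = (v >> (n - 3)) & 1            # B2: a fixed tap
        v = ((v * 2) + (b0 ^ b2)) % 2**n   # V[N+1] = (V[N]*2 + B0 xor B2)(Mod 2^n)
        out.append(v)
    return out

seq = lfsr_steps(0b00000001)

# As with the linear congruence generator, anyone who knows the taps
# and one state can compute every later state.
assert lfsr_steps(seq[0], steps=1)[0] == seq[1]
```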
The goodness of traditional pseudo-random number generator algorithms
is measured by statistical tests on such sequences. Carefully chosen
values of a, b, and c and the initial V, or the placement of shift
register taps, in the above simple processes can produce excellent
statistics.
These sequences may be adequate in simulations (Monte Carlo
experiments) as long as the sequence is orthogonal to the structure
of the space being explored. Even there, subtle patterns may cause
problems. However, such sequences are clearly bad for use in security
applications. They are fully predictable if the initial state is
known. Depending on the form of the pseudo-random number generator,
the sequence may be determinable from observation of a short portion
of the sequence [SCHNEIER, STERN]. For example, with the generators
above, one can determine V(n+1) given knowledge of V(n). In fact, it
has been shown that with these techniques, even if only one bit of
the pseudo-random values is released, the seed can be determined from
short sequences.
Not only have linear congruent generators been broken, but techniques
are now known for breaking all polynomial congruent generators
[KRAWCZYK].
4. Unpredictability

Statistically tested randomness in the traditional sense described in
Section 3 is NOT the same as the unpredictability required for
security use.
For example, use of a widely available constant sequence, such as
that from the CRC tables [CRC], is very weak against an adversary.
Once they learn of or guess it, they can easily break all security,
future and past, based on the sequence. Yet the statistical
properties of these tables are good.

The following sections describe the limitations of some randomness
generation techniques and sources.
4.1 Problems with Clocks and Serial Numbers

Computer clocks, or similar operating system or hardware values,
provide significantly fewer real bits of unpredictability than might
appear from their specifications.

Tests have been done on clocks on numerous systems, and it was found
that their behavior can vary widely and in unexpected ways. One
version of an operating system running on one set of hardware may
actually provide, say, microsecond resolution in a clock, while a
different configuration of the "same" system may always provide the
same lower bits and only count in the upper bits at much lower
resolution. This means that successive reads of the clock may produce
identical values even if enough time has passed that the value
"should" change based on the nominal clock resolution. There are also
cases where frequently reading a clock can produce artificial
sequential values because of extra code that checks for the clock
being unchanged between two reads and increases it by one! Designing
portable application code to generate unpredictable numbers based on
such system clocks is particularly challenging because the system
designer does not always know the properties of the system clocks
that the code will execute on.
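The resolution pitfall above is easy to observe directly. This sketch
reads a system clock in a tight loop and counts distinct values; how
few come back varies widely across operating systems and hardware,
which is exactly the portability problem described:

```python
import time

# Read a system clock many times in a tight loop and count how many
# distinct values come back.  On many platforms the count is far
# smaller than the number of reads, because the clock's effective
# resolution is coarser than its nominal one.
reads = [time.time() for _ in range(10000)]
distinct = len(set(reads))
fraction = distinct / len(reads)
# 'fraction' close to 1.0 suggests fine resolution; close to 0.0 means
# successive reads frequently return identical values.
```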
Use of hardware serial numbers such as Ethernet addresses may also
provide fewer bits of uniqueness than one would guess. Such
quantities are usually heavily structured, and subfields may have
only a limited range of possible values or values easily guessable
based on approximate date of manufacture or other data. For example,
it is likely that a company that manufactures both computers and
Ethernet adapters will, at least internally, use its own adapters,
which significantly limits the range of built-in addresses.
Problems such as those described above related to clocks and serial
numbers make code to produce unpredictable quantities difficult if
the code is to be ported across a variety of computer platforms and
systems.
4.2 Timing and Content of External Events

It is possible to measure the timing and content of mouse movement,
key strokes, and similar user events. This is a reasonable source of
unguessable data, with some qualifications. On some machines, inputs
such as key strokes are buffered. Even though the user's inter-
keystroke timing may have sufficient variation and unpredictability,
there might not be an easy way to access that variation. Another
problem is that no standard method exists to sample timing details.
This makes it hard to build standard software intended for
distribution to a large range of machines based on this technique.

The amount of mouse movement or the keys actually hit are usually
easier to access than timings but may yield less unpredictability, as
the user may provide highly repetitive input.
Other external events, such as network packet arrival times, can also
be used, with care. In particular, the possibility of manipulation of
such times by an adversary and the lack of history at system start up
must be considered.
4.3 The Fallacy of Complex Manipulation

One strategy which may give a misleading appearance of
unpredictability is to take a very complex algorithm (or an excellent
traditional pseudo-random number generator with good statistical
properties) and calculate a cryptographic key by starting with
limited data such as the computer system clock value as the seed. An
adversary who knew roughly when the generator was started would have
a relatively small number of seed values to test, as they would know
likely values of the system clock. Large numbers of pseudo-random
bits could be generated, but the search space an adversary would need
to check could be quite small.
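A concrete sketch of this attack, using Python's statistically
excellent Mersenne Twister as an illustrative stand-in for the
"complex algorithm" and a hypothetical clock reading as the seed:

```python
import random

# A 128-bit key is derived from a good-statistics generator seeded
# with a one-second-resolution clock value.  The timestamp below is
# hypothetical, chosen for illustration.
def make_key(seed_time):
    return random.Random(seed_time).getrandbits(128)

secret_start = 1_700_000_000          # hypothetical clock reading (seconds)
key = make_key(secret_start)

# An adversary who knows the start time to within ten minutes needs
# at most 600 trials to recover the seed, and with it the whole key.
recovered = next(t for t in range(secret_start - 300, secret_start + 300)
                 if make_key(t) == key)
assert recovered == secret_start
```

The complexity of the generator is irrelevant here; only the size of
the seed space matters.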
Thus very strong and/or complex manipulation of data will not help if
the adversary can learn what the manipulation is and there is not
enough unpredictability in the starting seed value. They can usually
use the limited number of results stemming from a limited number of
seed values to defeat security.
Another serious strategy error is to assume that a very complex
pseudo-random number generation algorithm will produce strong random
numbers when there has been no theory behind or analysis of the
algorithm. There is an excellent example of this fallacy near the
beginning of Chapter 3 in [KNUTH], where the author describes a
complex algorithm. It was intended that the machine language program
corresponding to the algorithm would be so complicated that a person
trying to read the code without comments wouldn't know what the
program was doing. Unfortunately, actual use of this algorithm showed
that it almost immediately converged to a single repeated value in
one case and a small cycle of values in another case.
Not only does complex manipulation not help you if you have a limited
range of seeds, but blindly chosen complex manipulation can destroy
the randomness in a good seed!
4.4 The Fallacy of Selection from a Large Database

Another strategy that can give a misleading appearance of
unpredictability is to select a quantity randomly from a database and
to assume that its strength is related to the total number of bits in
the database. For example, typical USENET servers process many
megabytes of information per day [USENET]. Assume a random quantity
was selected by fetching 32 bytes of data from a random starting
point in this data. This does not yield 32*8 = 256 bits worth of
unguessability. Even after allowing that much of the data is human
language and probably has no more than 2 or 3 bits of information per
byte, it doesn't yield 32*2 = 64 bits of unguessability. For an
adversary with access to the same USENET database, the unguessability
rests only on the starting point of the selection. That is perhaps a
little over a couple of dozen bits of unguessability.
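The arithmetic behind "a couple of dozen bits" can be checked
directly; the 10-megabyte daily volume below is an assumed figure for
illustration:

```python
import math

# Selecting 32 bytes from a random starting point in a public corpus
# gives only about log2(number of possible starting points) bits of
# unguessability, regardless of the corpus size in bits.
volume = 10 * 2**20                   # assumed daily volume: 10 megabytes
starting_points = volume - 32 + 1
unguessable_bits = math.log2(starting_points)
# About 23 bits, i.e. "a little over a couple of dozen", versus the
# naive estimate of 32*8 = 256 bits.
assert 23 < unguessable_bits < 24
```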
The same argument applies to selecting sequences from the data on a
publicly available CD/DVD recording or any other large public
database. If the adversary has access to the same database, this
"selection from a large volume of data" step buys little. However, if
a selection can be made from data to which the adversary has no
access, such as system buffers on an active multi-user system, it may
be of help.
5. Hardware for Randomness

Is there any hope for true strong portable randomness in the future?
There might be. All that's needed is a physical source of
unpredictable numbers.
A thermal noise (sometimes called Johnson noise in integrated
circuits) or radioactive decay source and a fast, free-running
oscillator would do the trick directly [GIFFORD]. This is a trivial
amount of hardware, and could easily be included as a standard part
of a computer system's architecture. Furthermore, any system with a
spinning disk or ring oscillator and a stable (crystal) time source
or the like has an adequate source of randomness ([DAVIS] and Section
5.4). All that's needed is the common perception among computer
vendors that this small additional hardware and the software to
access it is necessary and useful.
5.1 Volume Required

How much unpredictability is needed? Is it possible to quantify the
requirement in, say, number of random bits per second?

The answer is that not very much is needed. For AES, the key can be
128 bits and, as we show in an example in Section 8, even the highest
security system is unlikely to require strong keying material of much
over 200 bits. If a series of keys are needed, they can be generated
from a strong random seed (starting value) using a cryptographically
strong sequence as explained in Section 6.3. A few hundred random
bits generated at start up or once a day would be enough using such
techniques. Even if the random bits are generated as slowly as one
per second and it is not possible to overlap the generation process,
it should be tolerable in most high security applications to wait 200
seconds occasionally.
These numbers are trivial to achieve. It could be done by a person
repeatedly tossing a coin. Almost any hardware-based process is
likely to be much faster.
5.2 Sensitivity to Skew

Is there any specific requirement on the shape of the distribution of
the random numbers? The good news is the distribution need not be
uniform. All that is needed is a conservative estimate of how non-
uniform it is to bound performance. Simple techniques to de-skew the
bit stream are given below, and stronger cryptographic techniques are
described in Section 6.1.2 below.
5.2.1 Using Stream Parity to De-Skew

Consider taking a sufficiently long string of bits and mapping the
string to "zero" or "one". The mapping will not yield a perfectly
uniform distribution, but it can be as close as desired. One mapping
that serves the purpose is to take the parity of the string. This has
the advantages that it is robust across all degrees of skew up to the
estimated maximum skew and is absolutely trivial to implement in
hardware.
The following analysis gives the number of bits that must be sampled:

Suppose the ratio of ones to zeros is ( 0.5 + e ) to ( 0.5 - e ),
where e is between 0 and 0.5 and is a measure of the "eccentricity"
of the distribution. Consider the distribution of the parity function
of N bit samples. The probabilities that the parity will be one or
zero will be the sum of the odd or even terms in the binomial
expansion of (p + q)^N, where p = 0.5 + e, the probability of a one,
and q = 0.5 - e, the probability of a zero.
These sums can be computed easily as

                      N            N
     1/2 * ( ( p + q )  + ( p - q )  )

and

                      N            N
     1/2 * ( ( p + q )  - ( p - q )  ).
(Which one corresponds to the probability the parity will be 1
depends on whether N is odd or even.)
Since p + q = 1 and p - q = 2e, these expressions reduce to

                     N
     1/2 * [ 1 + (2e)  ]

and

                     N
     1/2 * [ 1 - (2e)  ].
Neither of these will ever be exactly 0.5 unless e is zero, but we
can bring them arbitrarily close to 0.5. If we want the probabilities
to be within some delta d of 0.5, then

                         N
     ( 0.5 + ( 0.5 * (2e)  ) )  <  0.5 + d.

Solving for N yields N > log(2d)/log(2e). (Note that 2e is less than
1, so its log is negative. Division by a negative number reverses the
sense of an inequality.)
The following table gives the length of the string which must be
sampled for various degrees of skew in order to come within 0.001 of
a 50/50 distribution.

     +---------+--------+-------+
     | Prob(1) |   e    |   N   |
     +---------+--------+-------+
     |   0.5   |  0.00  |    1  |
     |   0.6   |  0.10  |    4  |
     |   0.7   |  0.20  |    7  |
     |   0.8   |  0.30  |   13  |
     |   0.9   |  0.40  |   28  |
     |   0.99  |  0.49  |  308  |
     +---------+--------+-------+
The last entry shows that even if the distribution is skewed 99% in
favor of ones, the parity of a string of 308 samples will be within
0.001 of a 50/50 distribution.
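The table entries follow mechanically from the bound N >
log(2d)/log(2e) with d = 0.001, which this short check recomputes:

```python
import math

# Recompute the sample sizes from N > log(2d)/log(2e), with d = 0.001
# (i.e., within 0.001 of a 50/50 distribution).
def parity_samples(e, d=0.001):
    if e == 0:
        return 1                      # no skew: a single bit suffices
    return math.ceil(math.log(2 * d) / math.log(2 * e))

expected = {0.00: 1, 0.10: 4, 0.20: 7, 0.30: 13, 0.40: 28, 0.49: 308}
for e, n in expected.items():
    assert parity_samples(e) == n
```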
5.2.2 Using Transition Mappings to De-Skew

Another technique, originally due to von Neumann [VON NEUMANN], is to
examine a bit stream as a sequence of non-overlapping pairs. You
could then discard any 00 or 11 pairs found, interpret 01 as a 0, and
10 as a 1. Assume the probability of a 1 is 0.5 + e and the
probability of a 0 is 0.5 - e, where e is the eccentricity of the
source as described in the previous section. Then the probability of
each pair is as follows:
     +------+-----------------------------------------+
     | pair |               probability               |
     +------+-----------------------------------------+
     |  00  | (0.5 - e)^2          =  0.25 - e + e^2  |
     |  01  | (0.5 - e)*(0.5 + e)  =  0.25 - e^2      |
     |  10  | (0.5 + e)*(0.5 - e)  =  0.25 - e^2      |
     |  11  | (0.5 + e)^2          =  0.25 + e + e^2  |
     +------+-----------------------------------------+
This technique will completely eliminate any bias, but at the expense
of taking an indeterminate number of input bits for any particular
desired number of output bits. The probability of any particular pair
being discarded is 0.5 + 2e^2, so the expected number of input bits
to produce X output bits is X/(0.25 - e^2).
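The mapping and its yield can be checked empirically; the biased
source below (p(1) = 0.8, so e = 0.3) is an illustrative choice:

```python
import random

# Von Neumann de-skewing over non-overlapping pairs:
# discard 00 and 11, map 01 -> 0 and 10 -> 1.
def von_neumann(bits):
    out = []
    for i in range(0, len(bits) - 1, 2):      # non-overlapping pairs
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)                     # 01 -> 0, 10 -> 1
    return out

rng = random.Random(12345)
biased = [1 if rng.random() < 0.8 else 0 for _ in range(100_000)]
deskewed = von_neumann(biased)

ones_fraction = sum(deskewed) / len(deskewed)
# Output is unbiased; expected yield is 0.25 - e^2 = 0.16 output bits
# per input bit, so roughly 16,000 bits from 100,000 inputs here.
assert 0.48 < ones_fraction < 0.52
assert 14_000 < len(deskewed) < 18_000
```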
This technique assumes that the bits are from a stream where each bit
has the same probability of being a 0 or 1 as any other bit in the
stream and that bits are not correlated, i.e., that the bits are
independently and identically distributed. If alternate bits were
from two correlated sources, for example, the above analysis breaks
down.
The above technique also provides another illustration of how a
simple statistical analysis can mislead if one is not always on the
lookout for patterns that could be exploited by an adversary. If the
algorithm were mis-read slightly so that overlapping successive bit
pairs were used instead of non-overlapping pairs, the statistical
analysis given is the same; however, instead of providing an unbiased
uncorrelated series of random 1's and 0's, it instead produces a
totally predictable sequence of exactly alternating 1's and 0's.
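The non-overlapping pair technique above can be sketched in a few
lines of Python (an illustration of the algorithm, not production
code):

```python
# A minimal sketch of the non-overlapping pair technique:
# 01 -> 0, 10 -> 1, and 00/11 pairs are discarded.

def deskew(bits):
    out = []
    # Consume the stream strictly two bits at a time; as noted in
    # the text, overlapping pairs would make the output predictable.
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)  # the first bit of 01 or 10
    return out

biased = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0, 1, 1]  # skewed towards 1
print(deskew(biased))  # [0, 1, 1]
```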
5.2.3 Using FFT to De-Skew

When real world data consists of strongly biased or correlated bits,
it may still contain useful amounts of randomness. This randomness
can be extracted through use of the discrete Fourier transform or its
optimized variant, the FFT.

Using the Fourier transform of the data, strong correlations can be
discarded. If adequate data is processed and remaining correlations
decay, spectral lines approaching statistical independence and
normally distributed randomness can be produced [BRILLINGER].
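As a toy illustration of the premise (not a full extractor, and using
only the Python standard library), the transform of a maximally
correlated bit run shows its energy concentrated in a few
low-frequency spectral lines, which is what makes the correlation
identifiable and removable:

```python
import cmath

def dft(xs):
    # Naive O(n^2) discrete Fourier transform, enough for a demo.
    n = len(xs)
    return [sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
                for i, x in enumerate(xs))
            for k in range(n)]

# Maximally correlated data: one long run of 0's then one of 1's,
# centered around zero.
centered = [-0.5] * 32 + [0.5] * 32
mags = [abs(c) for c in dft(centered)]

low = sum(mags[1:9])     # energy near DC (the correlation)
high = sum(mags[24:33])  # energy near the Nyquist frequency
print(low > high)        # True: correlation concentrates at low bins
```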
5.2.4 Using S-Boxes to De-Skew

Many modern block encryption functions, including DES and AES,
incorporate modules known as S-Boxes (substitution boxes). These
produce a smaller number of outputs from a larger number of inputs
through a complex non-linear mixing function which would have the
effect of concentrating limited entropy in the inputs into the
output.

S-Boxes sometimes incorporate bent boolean functions, which are
functions of an even number of bits producing one output bit with
maximum non-linearity. Looking at the output for all input pairs
differing in any particular bit position, exactly half the outputs
are different.

An S-Box in which each output bit is produced by a bent function such
that any linear combination of these functions is also a bent
function is called a "perfect S-Box". Repeated application or
cascades of such boxes can be used to de-skew. [SBOX*]
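The half-the-outputs-differ property can be checked exhaustively for
the smallest bent function, f(x1,x2) = x1 AND x2 (a textbook example,
not an S-Box taken from any real cipher):

```python
# Exhaustively verify the bent function property for
# f(x1, x2) = x1 AND x2: for inputs differing in any single bit
# position, exactly half of the 4 inputs give different outputs.

def f(x):                      # x is a 2-bit integer
    return (x >> 1) & x & 1    # AND of the two bits

for bitpos in (0, 1):
    changed = sum(f(x) != f(x ^ (1 << bitpos)) for x in range(4))
    print(bitpos, changed)     # exactly half (2 of 4) differ
```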
5.2.5 Using Compression to De-Skew
Reversible compression techniques also provide a crude method of de-
skewing a skewed bit stream. This follows directly from the
definition of reversible compression and the formula in Section 2
above for the amount of information in a sequence. Since the
compression is reversible, the same amount of information must be
present in the shorter output than was present in the longer input.
By the Shannon information equation, this is only possible if, on
average, the probabilities of the different shorter sequences are
more uniformly distributed than were the probabilities of the longer
sequences. Thus the shorter sequences must be de-skewed relative to
the input.
However, many compression techniques add a somewhat predictable
preface to their output stream and may insert such a sequence again
periodically in their output or otherwise introduce subtle patterns
of their own. They should be considered only a rough technique
compared with those described above or in Section 6.1.2. At a
minimum, the beginning of the compressed sequence should be skipped
and only later bits used for applications requiring random bits.
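A rough sketch of this approach (using Python's zlib purely for
illustration; the predictable-header caveat above is why the start of
the output is skipped):

```python
import zlib

# A heavily skewed source: almost all zero bits.
skewed = bytes(0x01 if i % 17 == 0 else 0x00 for i in range(4096))

compressed = zlib.compress(skewed, 9)
# Skip the beginning of the output, which holds the zlib header
# and other predictable structure; only later bytes are candidates.
candidate = compressed[16:]
print(len(compressed) < len(skewed))  # True: input was highly redundant
```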
5.3 Existing Hardware Can Be Used For Randomness

As described below, many computers come with hardware that can, with
care, be used to generate truly random quantities.
5.3.1 Using Existing Sound/Video Input

Many computers are built with inputs that digitize some real world
analog source, such as sound from a microphone or video input from a
camera. Under appropriate circumstances, such input can provide
reasonably high quality random bits. The "input" from a sound
digitizer with no source plugged in or a camera with the lens cap on,
if the system has enough gain to detect anything, is essentially
thermal noise.

For example, on some UNIX based systems, one can read from the
/dev/audio device with nothing plugged into the microphone jack or
the microphone receiving only low level background noise. Such data
is essentially random noise although it should not be trusted without
some checking in case of hardware failure. It will, in any case, need
to be de-skewed as described elsewhere.
Combining this with compression to de-skew one can, in UNIXese,
generate a huge amount of medium quality random data by doing

         cat /dev/audio | compress - >random-bits-file
5.3.2 Using Existing Disk Drives

Disk drives have small random fluctuations in their rotational speed
due to chaotic air turbulence [DAVIS]. By adding low level disk seek
time instrumentation to a system, a series of measurements can be
obtained that include this randomness. Such data is usually highly
correlated so that significant processing is needed, such as FFT (see
Section 5.2.3). Nevertheless experimentation has shown that, with
such processing, most disk drives easily produce 100 bits a minute or
more of excellent random data.

Partly offsetting this need for processing is the fact that disk
drive failure will normally be rapidly noticed. Thus, problems with
this method of random number generation due to hardware failure are
unlikely.
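In the same spirit, raw I/O timing samples can be collected from
ordinary system calls. The sketch below is illustrative only: as the
text stresses, such raw data is highly skewed and correlated and
would still need significant processing (e.g., FFT de-correlation)
before use:

```python
import os
import tempfile
import time

# Write a scratch file, then time repeated reads of it.  The
# low-order bits of the elapsed times carry some jitter, but the
# raw samples are skewed and correlated -- NOT usable directly.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"\x00" * (1 << 16))
    path = f.name

samples = []
with open(path, "rb") as f:
    for _ in range(64):
        f.seek(0)
        t0 = time.perf_counter_ns()
        f.read(1 << 16)
        samples.append(time.perf_counter_ns() - t0)
os.unlink(path)

raw_bits = [t & 1 for t in samples]  # raw material for later de-skewing
```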
5.4 Ring Oscillator Sources

If an integrated circuit is being designed or field programmed, an
odd number of gates can be connected in series to produce a free-
running ring oscillator. By sampling a point in the ring at a fixed
frequency, say one determined by a stable crystal oscillator, some
amount of entropy can be extracted due to slight variations in the
free-running oscillator timing. It is possible to increase the rate
of entropy by xor'ing sampled values from a few ring oscillators with
relatively prime lengths. Another possibility is to sample the output
of a noisy diode.

Bits from such sources will have to be heavily de-skewed, as disk
rotation timings must be (Section 5.3.2). An engineering study would
be needed to determine the amount of entropy being produced depending
on the particular design. In any case, these can be good sources
whose cost is a trivial amount of hardware by modern standards.

As an example, IEEE 802.11i suggests the circuit below be considered,
with due attention in the design to isolation of the rings from each
other and from clocked circuits to avoid undesired synchronization,
etc., and extensive post processing. [IEEE 802.11i]
             |\     |\                |\
          +->| >0-->| >0-- 19 total --| >0--+-------+
          |  |/     |/                |/    |       |
          |                                 |       |
          +---------------------------------+       V
                                                  +-----+
             |\     |\                |\          |     |   output
          +->| >0-->| >0-- 23 total --| >0--+---->| XOR |--------->
          |  |/     |/                |/    |     |     |
          |                                 |     +-----+
          +---------------------------------+       ^ ^
                                                    | |
             |\     |\                |\            | |
          +->| >0-->| >0-- 29 total --| >0--+-------+ |
          |  |/     |/                |/    |         |
          |                                 |         |
          +---------------------------------+         |
                                                      |
          other randomness if available---------------+
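A toy software model of the figure can make the sampling idea
concrete (not hardware-accurate; the periods 19, 23, and 29 come from
the figure, while the jitter magnitude is an illustrative
assumption):

```python
import random

random.seed(7)

def ring(period):
    # A free-running square wave of the given period, with a small
    # amount of timing jitter added at each sampling step.
    phase = random.random() * period
    while True:
        phase += 1.0 + random.gauss(0, 0.01)
        yield int(phase / (period / 2)) & 1

r1, r2, r3 = ring(19), ring(23), ring(29)
samples = [next(r1) ^ next(r2) ^ next(r3) for _ in range(256)]
# As the text says, such output still needs heavy de-skewing.
```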
6. Recommended Software Strategy

What is the best overall strategy for meeting the requirement for
unguessable random numbers in the absence of a reliable hardware
source? It is to obtain random input from a number of uncorrelated
sources and to mix them with a strong mixing function. Such a
function will preserve the randomness present in any of the sources
even if other quantities being combined happen to be fixed or easily
guessable. This may be advisable even with a good hardware source, as
hardware can also fail, though this should be weighed against any
increase in the chance of overall failure due to added software
complexity.
6.1 Mixing Functions

A strong mixing function is one which combines two or more inputs and
produces an output where each output bit is a different complex non-
linear function of all the input bits. On average, changing any input
bit will change about half the output bits. But because the
relationship is complex and non-linear, no particular output bit is
guaranteed to change when any particular input bit is changed.
Consider the problem of converting a stream of bits that is skewed
towards 0 or 1, or which has a somewhat predictable pattern, to a
shorter stream which is more random, as discussed in Section 5.2
above. This is simply another case where a strong mixing function is
desired, mixing the input bits to produce a smaller number of output
bits. The technique given in Section 5.2.1 of using the parity of a
number of bits is simply the result of successively Exclusive Or'ing
them, which is examined as a trivial mixing function immediately
below. Use of stronger mixing functions to extract more of the
randomness in a stream of skewed bits is examined in Section 6.1.2.
6.1.1 A Trivial Mixing Function

A trivial example for single bit inputs is the Exclusive Or function,
which is equivalent to addition without carry, as shown in the table
below. This is a degenerate case in which the one output bit always
changes for a change in either input bit. But, despite its
simplicity, it provides a useful illustration.
     +-----------+-----------+----------+
     |  input 1  |  input 2  |  output  |
     +-----------+-----------+----------+
     |     0     |     0     |     0    |
     |     0     |     1     |     1    |
     |     1     |     0     |     1    |
     |     1     |     1     |     0    |
     +-----------+-----------+----------+
If inputs 1 and 2 are uncorrelated and combined in this fashion then
the output will be an even better (less skewed) random bit than the
inputs. If we assume an "eccentricity" e as defined in Section 5.2
above, then the output eccentricity relates to the input
eccentricities as follows:

     e        = 2 * e       * e
      output         input 1   input 2
Since e is never greater than 1/2, the eccentricity is always
improved except in the case where at least one input is a totally
skewed constant. This is illustrated in the following table where the
top and left side values are the two input eccentricities and the
entries are the output eccentricity:
     +--------+--------+--------+--------+--------+--------+--------+
     |    e   |  0.00  |  0.10  |  0.20  |  0.30  |  0.40  |  0.50  |
     +--------+--------+--------+--------+--------+--------+--------+
     |  0.00  |  0.00  |  0.00  |  0.00  |  0.00  |  0.00  |  0.00  |
     |  0.10  |  0.00  |  0.02  |  0.04  |  0.06  |  0.08  |  0.10  |
     |  0.20  |  0.00  |  0.04  |  0.08  |  0.12  |  0.16  |  0.20  |
     |  0.30  |  0.00  |  0.06  |  0.12  |  0.18  |  0.24  |  0.30  |
     |  0.40  |  0.00  |  0.08  |  0.16  |  0.24  |  0.32  |  0.40  |
     |  0.50  |  0.00  |  0.10  |  0.20  |  0.30  |  0.40  |  0.50  |
     +--------+--------+--------+--------+--------+--------+--------+
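The relation and the table entries can be checked directly from the
probabilities (a quick numeric check, not part of the draft itself):

```python
# Verify e_output = 2 * e1 * e2 for XOR of two biased bits by
# computing the output eccentricity exactly from the probabilities.

def xor_eccentricity(e1, e2):
    # P(bit = 1) for a source with eccentricity e is 0.5 + e.
    p1, p2 = 0.5 + e1, 0.5 + e2
    p_out = p1 * (1 - p2) + p2 * (1 - p1)   # P(XOR = 1)
    return abs(p_out - 0.5)

print(round(xor_eccentricity(0.30, 0.40), 2))  # 0.24, matching the table
```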
However, keep in mind that the above calculations assume that the
inputs are not correlated. If the inputs were, say, the parity of the
number of minutes from midnight on two clocks accurate to a few
seconds, then each might appear random if sampled at random intervals
much longer than a minute. Yet if they were both sampled and combined
with xor, the result would be zero most of the time.
6.1.2 Stronger Mixing Functions

The US Government Advanced Encryption Standard [AES] is an example of
a strong mixing function for multiple bit quantities. It takes up to
384 bits of input (128 bits of "data" and 256 bits of "key") and
produces 128 bits of output, each of which is dependent on a complex
non-linear function of all input bits. Other encryption functions
with this characteristic, such as [DES], can also be used by
considering them to mix all of their key and data input bits.
Another good family of mixing functions are the "message digest" or
hashing functions such as the US Government Secure Hash Standards
[SHA*] and the MD4, MD5 [MD4, MD5] series. These functions all take a
practically unlimited amount of input and produce a relatively short
fixed length output mixing all the input bits. The MD* series produce
128 bits of output, SHA-1 produces 160 bits, and other SHA functions
produce up to 512 bits.
Although the message digest functions are designed for variable
amounts of input, AES and other encryption functions can also be used
to combine any number of inputs. If 128 bits of output is adequate,
the inputs can be packed into a 128-bit data quantity and successive
AES keys, padding with zeros if needed, which are then used to
successively encrypt using AES in Electronic Codebook Mode. Or the
input could be packed into one 128-bit key and multiple data blocks
and a CBC-MAC calculated [MODES].

If more than 128 bits of output are needed, use more complex mixing.
But keep in mind that it is absolutely impossible to get more bits of
"randomness" out than are put in. For example, if inputs are packed
into three quantities, A, B, and C, use AES to encrypt A with B as a
key and then with C as a key to produce the 1st part of the output,
then encrypt B with C and then A for more output and, if necessary,
encrypt C with A and then B for yet more output. Still more output
can be produced by reversing the order of the keys given above to
stretch things. The same can be done with the hash functions by
hashing various subsets of the input data or different copies of the
input data with different prefixes and/or suffixes to produce
multiple outputs.
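As a minimal sketch of hash-based mixing (using SHA-256 from Python's
hashlib; the input values and the prefix scheme are illustrative
assumptions, not from the draft):

```python
import hashlib

# Several weak, partly guessable inputs; any one unpredictable
# input makes the mixed output unpredictable.
inputs = [b"ticks=1699023", b"pid=4242", b"jitter=\x93\x07"]

def mix(prefix, parts):
    h = hashlib.sha256()
    h.update(prefix)
    for p in parts:
        # Length-prefix each part so concatenations stay unambiguous.
        h.update(len(p).to_bytes(4, "big"))
        h.update(p)
    return h.digest()

out1 = mix(b"\x01", inputs)
out2 = mix(b"\x02", inputs)   # a distinct prefix gives extra output
```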
An example of using a strong mixing function would be to reconsider
the case of a string of 308 bits, each of which is biased 99% towards
zero. The parity technique given in Section 5.2.1 above reduced this
to one bit with only a 1/1000 deviance from being equally likely a
zero or one. But, applying the equation for information given in
Section 2, this 308 bit skewed sequence has over 5 bits of
information in it. Thus hashing it with SHA-1 and taking the bottom 5
bits of the result would yield 5 unbiased random bits as opposed to
the single bit given by calculating the parity of the string.
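The SHA-1 step of this example can be sketched as follows (the
particular skewed bit positions are an arbitrary illustrative
sample):

```python
import hashlib

# A hypothetical skewed sample: 308 bits, almost all zero, packed
# into bytes most-significant-bit first.
skewed_bits = [1 if i in (17, 150, 279) else 0 for i in range(308)]
packed = bytes(
    sum(b << (7 - j) for j, b in enumerate(skewed_bits[i:i + 8]))
    for i in range(0, 308, 8))

digest = hashlib.sha1(packed).digest()
five_bits = digest[-1] & 0b11111   # bottom 5 bits of the hash
```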
6.1.3 Diffie-Hellman as a Mixing Function

Diffie-Hellman exponential key exchange is a technique that yields a
shared secret between two parties that can be made computationally
infeasible for a third party to determine even if they can observe
all the messages between the two communicating parties. This shared
secret is a mixture of initial quantities generated by each of them
[D-H]. If these initial quantities are random, then the shared secret
contains the combined randomness of them both, assuming they are
uncorrelated.
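A toy numeric sketch of the exchange (the tiny prime, generator, and
fixed "secrets" are illustrative only; real use requires a large,
well-chosen group):

```python
# Toy Diffie-Hellman: the shared secret g^(xa*xb) mod p mixes both
# parties' secret values xa and xb.
p, g = 2147483647, 5        # small prime and toy generator
xa, xb = 123456, 654321     # each party's secret (fixed here)

ya = pow(g, xa, p)          # A sends ya to B
yb = pow(g, xb, p)          # B sends yb to A

secret_a = pow(yb, xa, p)   # A's view of the shared secret
secret_b = pow(ya, xb, p)   # B's view of the shared secret
print(secret_a == secret_b) # True: both derive the same mixed value
```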
6.1.4 Using a Mixing Function to Stretch Random Bits

While it is not necessary for a mixing function to produce the same
or fewer bits than its inputs, mixing bits cannot "stretch" the
amount of random unpredictability present in the inputs. Thus four
inputs of 32 bits each where there is 12 bits worth of
unpredictability (such as 4,096 equally probable values) in each
input cannot produce more than 48 bits worth of unpredictable output.
The output can be expanded to hundreds or thousands of bits by, for
example, mixing with successive integers, but the clever adversary's
search space is still 2^48 possibilities. Furthermore, mixing to
fewer bits than are input will tend to strengthen the randomness of
the output the way using Exclusive Or to produce one bit from two did
above.
The last table in Section 6.1.1 shows that mixing a random bit with a
constant bit with Exclusive Or will produce a random bit. While this
is true, it does not provide a way to "stretch" one random bit into
more than one. If, for example, a random bit is mixed with a 0 and
then with a 1, this produces a two bit sequence but it will always be
either 01 or 10. Since there are only two possible values, there is
still only the one bit of original randomness.
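The two-bit case above can be enumerated directly:

```python
# XOR one random bit r with the constants 0 and 1: only two of the
# four possible two-bit patterns can ever occur, so there is still
# only one bit of randomness.
outputs = {(r ^ 0, r ^ 1) for r in (0, 1)}
print(sorted(outputs))  # [(0, 1), (1, 0)]
```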
6.1.5 Other Factors in Choosing a Mixing Function

For local use, AES has the advantages that it has been widely tested
for flaws, is reasonably efficient in software, and is widely
documented and implemented with hardware and software implementations
available all over the world including open source code. The SHA*
family have had a little less study and tend to require more CPU
cycles than AES but there is no reason to believe they are flawed.
Both SHA* and MD5 were derived from the earlier MD4 algorithm. They
all have source code available [SHA*, MD*]. Some signs of weakness
have been found in MD4 and MD5. In particular, MD4 has only three
rounds and there are several independent breaks of the first two or
last two rounds. And some collisions have been found in MD5 output.
AES was selected by a robust, public, and international process. It
and SHA* have been vouched for by the US National Security Agency
(NSA) on the basis of criteria that mostly remain secret, as was DES.
While this has been the cause of much speculation and doubt,
investigation of DES over the years has indicated that NSA
involvement in modifications to its design, which originated with
IBM, was primarily to strengthen it. No concealed or special weakness
has been found in DES. It is likely that the NSA modifications to MD4
to produce the SHA algorithms similarly strengthened these
algorithms, possibly against threats not yet known in the public
cryptographic community.
Where input lengths are unpredictable, hash algorithms are a little
more convenient to use than block encryption algorithms, since they
are generally designed to accept variable length inputs. Block
encryption algorithms generally require an additional padding
algorithm to accommodate inputs that are not an even multiple of the
block size.
As of the time of this document, the authors know of no patent claims
to the basic AES, DES, SHA*, MD4, and MD5 algorithms other than
patents for which an irrevocable royalty free license has been
granted to the world. There may, of course, be basic patents of which
the authors are unaware, or patents on implementations or uses, or
other relevant patents issued or to be issued.
6.2 Non-Hardware Sources of Randomness
The best source of input for mixing would be a hardware randomness
source such as ring oscillators, disk drive timing, thermal noise, or
radioactive decay. However, if that is not available, there are other
possibilities. These include system clocks, system or input/output
buffers, user/system/hardware/network serial numbers and/or addresses
and timing, and user input. Unfortunately, each of these sources can
produce very limited or predictable values under some circumstances.
Some of the sources listed above would be quite strong on multi-user
systems where, in essence, each user of the system is a source of
randomness. However, on a small single user or embedded system,
especially at start up, it might be possible for an adversary to
assemble a similar configuration. This could give the adversary
inputs to the mixing process that were sufficiently correlated to
those used originally as to make exhaustive search practical.
The use of multiple random inputs with a strong mixing function is
recommended and can overcome weakness in any particular input. The
timing and content of requested "random" user keystrokes can yield
hundreds of random bits, but conservative assumptions need to be
made. For example, one might assume at most a few bits of randomness
if the inter-keystroke interval is unique in the sequence up to that
point, make a similar assumption if the key hit is unique, but assume
that no bits of randomness are present in the initial key value or if
the timing or key value duplicates previous values. The results of
mixing these timings and characters typed could be further combined
with clock values and other inputs.
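The conservative accounting described above can be sketched as follows. The specific credit values, a few bits for a first-seen interval or key and zero for any duplicate, are illustrative assumptions and not fixed by this document:

```python
# Conservatively estimate randomness gathered from "random" user
# keystrokes: credit a few bits for an inter-keystroke interval or a
# key value that is unique so far, and zero bits for any duplicate.
def estimate_bits(events, interval_bits=2, key_bits=2):
    """events: list of (inter_keystroke_interval_ms, key) pairs.
    The per-item bit credits are assumptions for illustration."""
    seen_intervals, seen_keys = set(), set()
    total = 0
    for interval, key in events:
        if interval not in seen_intervals:
            total += interval_bits      # unique timing: small credit
            seen_intervals.add(interval)
        if key not in seen_keys:
            total += key_bits           # unique key: small credit
            seen_keys.add(key)
        # duplicated timings or key values earn no credit at all
    return total

events = [(143, 'g'), (217, 'q'), (143, 'g'), (88, 'z')]
print(estimate_bits(events))  # the repeated (143, 'g') earns nothing
```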
This strategy may make practical portable code to produce good random
numbers for security, even if some of the inputs are very weak on
some of the target systems. However, it may still fail against a high
grade attack on small, single user, or embedded systems, especially
if the adversary has ever been able to observe the generation process
in the past. A hardware based random source is still preferable.
6.3 Cryptographically Strong Sequences
In cases where a series of random quantities must be generated, an
adversary may learn some values in the sequence. In general, they
should not be able to predict other values from the ones that they
know.
The correct technique is to start with a strong random seed, take
cryptographically strong steps from that seed [FERGUSON, SCHNEIER],
and do not reveal the complete state of the generator in the sequence
elements. If each value in the sequence can be calculated in a fixed
way from the previous value, then when any value is compromised, all
future values can be determined. This would be the case, for example,
if each value were a constant function of the previously used values,
even if the function were a very strong, non-invertible message
digest function.
(It should be noted that if your technique for generating a sequence
of key values is fast enough, it can trivially be used as the basis
for a confidentiality system. If two parties use the same sequence
generating technique and start with the same seed material, they will
generate identical sequences. These could, for example, be xor'ed at
one end with data being sent, encrypting it, and xor'ed with this
data as received, decrypting it, due to the reversible properties of
the xor operation. This is commonly referred to as a simple stream
cipher.)
6.3.1 Traditional Strong Sequences
A traditional way to achieve a strong sequence has been to have the
values be produced by hashing the quantities produced by
concatenating the seed with successive integers or the like and then
mask the values obtained so as to limit the amount of generator state
available to the adversary.
It may also be possible to use an "encryption" algorithm with a
random key and seed value to encrypt and feed back some or all of the
output encrypted value into the value to be encrypted for the next
iteration. Appropriate feedback techniques will usually be
recommended with the encryption algorithm. An example is shown below
where shifting and masking are used to combine the cipher output
feedback. This type of feedback is defined by the US Government in
connection with AES and DES [MODES] as Output Feedback Mode (OFB) but
should be avoided for reasons described below.
          +---------------+
          |       V       |
          |        n      |--+
          +--+------------+  |
             |               |   +---------+
        shift|               +-->|         |     +-----+
          +--+                   | Encrypt |<--- | Key |
          |         +------------|         |     +-----+
          |         |            +---------+
          V         V
          +---------+-----+
          |       V       |
          |        n+1    |
          +---------------+
Note that if a shift of one is used, this is the same as the shift
register technique described in Section 3 above, but with the all-
important difference that the feedback is determined by a complex
non-linear function of all bits rather than a simple linear or
polynomial combination of output from a few bit position taps.
It has been shown by Donald W. Davies that this sort of shifted
partial output feedback significantly weakens an algorithm, compared
with feeding all of the output bits back as input. In particular, for
DES, repeatedly encrypting a full 64 bit quantity will give an
expected repeat in about 2^63 iterations. Feeding back anything less
than 64 (and more than 0) bits will give an expected repeat in
between 2^31 and 2^32 iterations!
To predict values of a sequence from others when the sequence was
generated by these techniques is equivalent to breaking the
cryptosystem or inverting the "non-invertible" hashing involved with
only partial information available. The less information revealed
each iteration, the harder it will be for an adversary to predict the
sequence. Thus it is best to use only one bit from each value. It has
been shown that in some cases this makes it impossible to break a
system even when the cryptographic system is invertible and could be
broken if all of each generated value were revealed.
6.3.2 The Blum Blum Shub Sequence Generator
Currently the generator which has the strongest public proof of
strength is called the Blum Blum Shub generator, after its inventors
[BBS]. It is also very simple and is based on quadratic residues. Its
only disadvantage is that it is computationally intensive compared
with the traditional techniques given in 6.3.1 above. This is not a
major drawback if it is used for moderately infrequent purposes, such
as generating session keys.
Simply choose two large prime numbers, say p and q, which both have
the property that you get a remainder of 3 if you divide them by 4.
Let n = p * q. Then you choose a random number x relatively prime to
n. The initial seed for the generator and the method for calculating
subsequent values are then
              2
     s  = ( x  )(Mod n)
      0

                2
     s    = ( s  )(Mod n)
      i+1      i
You must be careful to use only a few bits from the bottom of each s.
It is always safe to use only the lowest order bit. If you use no
more than the
     log ( log ( s ) )
        2     2   i
low order bits, then predicting any additional bits from a sequence
generated in this manner is provably as hard as factoring n. As long
as the initial x is secret, you can even make n public if you want.
An interesting characteristic of this generator is that you can
directly calculate any of the s values. In particular
| i | i | |||
| ( ( 2 )(Mod (( p - 1 ) * ( q - 1 )) ) ) | ( ( 2 )(Mod (( p - 1 ) * ( q - 1 )) ) ) | |||
| s = ( s )(Mod n) | s = ( s )(Mod n) | |||
| i 0 | i 0 | |||
This means that in applications where many keys are generated in this
fashion, it is not necessary to save them all. Each key can be
effectively indexed and recovered from that small index and the
initial s and n.
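A toy version of the generator, with deliberately small primes so the arithmetic is visible (real use requires large secret primes), might look like this:

```python
# Blum Blum Shub with toy parameters.  p and q are primes congruent
# to 3 mod 4 (these values are far too small for real use), n = p*q,
# and x is relatively prime to n.
p, q = 11003, 10007
n = p * q
x = 1234567

def bbs_bits(x, n, count):
    """Yield 'count' output bits, one low order bit per squaring."""
    s = pow(x, 2, n)                 # s_0 = x^2 mod n
    bits = []
    for _ in range(count):
        bits.append(s & 1)           # use only the lowest order bit
        s = pow(s, 2, n)             # s_{i+1} = s_i^2 mod n
    return bits

def bbs_direct(x, n, p, q, i):
    """Jump directly to s_i via s_i = s_0^(2^i mod (p-1)(q-1)) mod n,
    the direct-calculation property noted in the text."""
    s0 = pow(x, 2, n)
    return pow(s0, pow(2, i, (p - 1) * (q - 1)), n)

print(bbs_bits(x, n, 16))
```

The test below confirms that the direct calculation agrees with iterated squaring, which is what makes indexed key recovery possible.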
6.3.3 Entropy Pool Techniques
Many modern pseudo-random number sources utilize the technique of
maintaining a "pool" of bits and providing operations for strongly
mixing input with some randomness into the pool and extracting
pseudo-random bits from the pool. This is illustrated in the figure
below.
         +--------+    +------+    +---------+
     --->| Mix In |--->| POOL |--->| Extract |--->
         |  Bits  |    |      |    |  Bits   |
         +--------+    +------+    +---------+
              ^                         V
              |                         |
              +-------------------------+
Bits to be fed into the pool can be any of the various hardware,
environmental, or user input sources discussed above. It is also
common to save the state of the pool on system shut down and restore
it on re-starting, if stable storage is available.
In fact, all of the [MD*] and [SHA*] message digest functions are
implemented by internally maintaining a pool substantially larger
than their ultimate output, into which the bytes of the message are
mixed and from which the ultimate message digest is extracted. Thus
the figure above can be implemented by using parts of the message
digest code to strongly mix in any new bits supplied and to compute
output bits based on the pool. However, additional code is needed so
that any number of bits can be extracted and appropriate feedback
from the output process is mixed into the pool so as to produce a
strong pseudo-random output stream.
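A minimal pool built around a message digest, as the paragraph above suggests, might look like the following. The exact mixing and output-feedback arrangement here is one illustrative possibility, not that of any particular system:

```python
import hashlib

class DigestPool:
    """Toy entropy pool: SHA-1 is used both to mix new input into the
    pool state and to extract output, with each output block fed back
    into the pool so the output stream stays strong."""

    def __init__(self):
        self.pool = b'\x00' * 20          # one SHA-1 state's worth

    def mix_in(self, data):
        # strongly mix new (possibly weak) input into the pool state
        self.pool = hashlib.sha1(self.pool + data).digest()

    def extract(self, nbytes):
        out = b''
        while len(out) < nbytes:
            block = hashlib.sha1(b'out' + self.pool).digest()
            self.mix_in(block)            # feedback from output process
            out += block
        return out[:nbytes]

pool = DigestPool()
pool.mix_in(b'interrupt timing: 10453 us')
pool.mix_in(b'keystroke: q at 217 ms')
print(pool.extract(32).hex())
```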
Care must be taken that enough entropy has been added to the pool to
support the particular output uses desired. See Section 7.5 for more
details on an example implementation and [RSA BULL1] for similar
suggestions.
7. Key Generation Standards and Examples
Several public standards and widely deployed examples are now in
place for the generation of keys without special hardware. Three
standards are described below. The two older standards use DES, with
its 64-bit block and key size limit, but any equally strong or
stronger mixing function could be substituted. The third is a more
modern and stronger standard based on SHA-1. Finally, the widely
deployed modern UNIX random number generators are described.
7.1 US DoD Recommendations for Password Generation
The United States Department of Defense has specific recommendations
for password generation [DoD]. They suggest using the US Data
Encryption Standard [DES] in Output Feedback Mode [MODES] as follows:
     use an initialization vector determined from
        the system clock,
        system ID,
        user ID, and
        date and time;
     use a key determined from
        system interrupt registers,
        system status registers, and
        system counters; and,
     as plain text, use an external randomly generated 64 bit
        quantity such as 8 characters typed in by a system
        administrator.
The password can then be calculated from the 64 bit "cipher text"
generated by DES in 64-bit Output Feedback Mode. As many bits as are
needed can be taken from these 64 bits and expanded into a
pronounceable word, phrase, or other format if a human being needs to
remember the password.
7.2 X9.17 Key Generation
The American National Standards Institute has specified a method for
generating a sequence of keys as follows [X9.17]:
     s  is the initial 64 bit seed
      0

     g  is the sequence of generated 64 bit key quantities
      n

     k is a random key reserved for generating this key sequence

     t is the time at which a key is generated, to as fine a
        resolution as is available (up to 64 bits)

     DES ( K, Q ) is the DES encryption of quantity Q with key K
     g  = DES ( k, DES ( k, t ) .xor. s  )
      n                                n

     s    = DES ( k, DES ( k, t ) .xor. g  )
      n+1                                n
If g sub n is to be used as a DES key, then every eighth bit should
be adjusted for parity for that use, but the entire 64 bit unmodified
g should be used in calculating the next s.
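The recurrence above can be sketched directly. Since the text notes that any equally strong or stronger mixing function could be substituted for DES, a SHA-1 based stand-in plays the role of the 64-bit block encryption here; that substitution is illustrative and not part of the standard:

```python
import hashlib
import time

def E(k, q):
    """Stand-in for DES(K, Q): an 8-byte keyed mixing function built
    from SHA-1.  Illustrative substitute only, per the text's note
    that any equally strong mixing function may replace DES."""
    return hashlib.sha1(k + q).digest()[:8]

def xor64(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def x917_keys(k, seed, count, now=time.time):
    """g_n = E(k, E(k,t) xor s_n);  s_{n+1} = E(k, E(k,t) xor g_n)."""
    s = seed
    keys = []
    for _ in range(count):
        t = int(now() * 1e6).to_bytes(8, 'big')  # fine-resolution time
        et = E(k, t)
        g = E(k, xor64(et, s))   # the generated 64 bit key quantity
        s = E(k, xor64(et, g))   # next seed uses the unmodified g
        keys.append(g)
    return keys

keys = x917_keys(b'master key', b'\x01' * 8, 3)
print([g.hex() for g in keys])
```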
7.3 DSS Pseudo-Random Number Generation
Appendix 3 of the NIST Digital Signature Standard [DSS] provides an
approved method of producing a sequence of pseudo-random 160 bit
quantities for use as private keys or the like. A subset of that
algorithm is as follows:

     t = 0x 67452301 EFCDAB89 98BADCFE 10325476 C3D2E1F0

     q = a 160-bit prime number

     XKEY  = initial seed
         0

     For j = 0 to ...

          XVAL = ( XKEY  + optional user input ) (Mod 2^512)
                       j

          X  = G( t, XVAL ) (Mod q)
           j

          XKEY    = ( 1 + XKEY  + X  ) (Mod 2^512)
              j+1             j    j
The quantities X thus produced are the pseudo-random sequence of
values in the range 0 to q. Two functions can be used for "G" above.
Each produces a 160-bit value and takes two arguments, the first a
160-bit value and the second a 512 bit value.
The first is based on SHA-1 and works by setting the 5 linking
variables, denoted H with subscripts in the SHA-1 specification, to
the first argument divided into fifths. Then steps (a) through (e) of
section 7 of the SHA-1 specification are run over the second argument
as if it were a 512-bit data block. The values of the linking
variables after those steps are then concatenated to produce the
output of G [SHA-1].
As an alternative, NIST also defined an alternate G function based on
multiple applications of the DES encryption function [DSS].
7.4 X9.82 Pseudo-Random Number Generation
The National Institute of Standards and Technology (NIST) and the
American National Standards Institute (ANSI) X9F1 committee are in
the final stages of creating a standard for random number generation.
This standard includes a number of random number generators for use
with AES and other block ciphers. It also includes random number
generators based on hash functions and the arithmetic of elliptic
curves [X9.82].
7.5 The /dev/random Device
Several versions of the UNIX operating system provide a kernel-
resident random number generator. In some cases, these generators
make use of events captured by the kernel during normal system
operation.

For example, on some versions of Linux, the generator consists of a
random pool of 512 bytes, represented as 128 words of 4 bytes each.
When an event occurs, such as a disk drive interrupt, the time of the
event is XORed into the pool and the pool is stirred via a primitive
polynomial of degree 128. The pool itself is treated as a ring
buffer, with new data being XORed (after stirring with the
polynomial) across the entire pool.
Each call that adds entropy to the pool estimates the amount of
likely true entropy the input contains. The pool itself contains an
accumulator that estimates the total entropy of the pool.

Input events come from several sources:

1. Keyboard interrupts. The time of the interrupt as well as the scan
code are added to the pool. This in effect adds entropy from the
human operator by measuring inter-keystroke arrival times.
3. Mouse motion. The timing as well as mouse position is added in.
When random bytes are required, the pool is hashed with SHA-1 [SHA1]
to yield the returned bytes of randomness. If more bytes are required
than the output of SHA-1 (20 bytes), then the hashed output is
stirred back into the pool and a new hash is performed to obtain the
next 20 bytes. As bytes are removed from the pool, the estimate of
entropy is similarly decremented.
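
The mixing and extraction steps above can be sketched as follows.
This is a simplified model for illustration only: the class, names,
and the trivial stirring are inventions of this sketch, not the
actual kernel code, and the degree-128 polynomial stirring is elided.

```python
import hashlib
import time

POOL_SIZE = 512  # bytes, matching the Linux pool described above

class ToyEntropyPool:
    # Grossly simplified model of the kernel generator: mix timed
    # events into a ring buffer, track an entropy estimate, and
    # extract bytes via SHA-1 with feedback into the pool.

    def __init__(self):
        self.pool = bytearray(POOL_SIZE)
        self.pos = 0              # ring buffer position
        self.entropy_bits = 0     # running entropy estimate

    def add_event(self, data, entropy_estimate):
        # XOR the event time and event data into the pool ring buffer.
        stamp = time.time_ns().to_bytes(8, 'little')
        for b in stamp + data:
            self.pool[self.pos] ^= b
            self.pos = (self.pos + 1) % POOL_SIZE
        self.entropy_bits += entropy_estimate

    def extract(self, nbytes):
        # Hash the pool with SHA-1 for each 20-byte chunk, stirring
        # each hash back into the pool, and decrement the entropy
        # estimate as bytes are removed.
        out = b''
        while len(out) < nbytes:
            digest = hashlib.sha1(bytes(self.pool)).digest()
            self.add_event(digest, 0)
            out += digest
        self.entropy_bits = max(0, self.entropy_bits - 8 * nbytes)
        return out[:nbytes]

pool = ToyEntropyPool()
pool.add_event(b'\x1c', 4)   # e.g. a keyboard event worth ~4 bits
key = pool.extract(32)
```

Note how the feedback of each hash into the pool changes the pool
state, so successive extractions yield different output.
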
To ensure a reasonable random pool upon system startup, the standard
startup scripts (and shutdown scripts) save the pool to a disk file
at shutdown and read this file at system startup.
There are two user-exported interfaces. /dev/random returns bytes
from the pool but blocks when the estimated entropy drops to zero.
As entropy is added to the pool from events, more data becomes
available via /dev/random. Random data obtained from such a
/dev/random device is suitable for key generation for long term keys.
/dev/urandom works like /dev/random; however, it provides data even
when the entropy estimate for the random pool drops to zero. This may
be adequate for session keys. The risk of continuing to take data
even when the pool's entropy estimate is small is that past output
may be computable from current output, provided an attacker can
reverse SHA-1. Given that SHA-1 is designed to be non-invertible,
this is a reasonable risk.
To obtain random numbers under Linux, Solaris, or other UNIX systems
equipped with code as described above, all an application needs to do
is open either /dev/random or /dev/urandom and read the desired
number of bytes.
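
For example, an application might read a 128-bit session key as
follows (a sketch assuming a system that provides these devices):

```python
# Read 16 bytes (128 bits) of randomness from the kernel pool.
# For long term keys, /dev/random would be used instead, at the
# cost of possibly blocking until enough entropy accumulates.
with open('/dev/urandom', 'rb') as dev:
    session_key = dev.read(16)
```
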
(The Linux Random device was written by Theodore Ts'o. It was based
loosely on the random number generator in PGP 2.X and PGP 3.0 (aka
PGP 5.0).)
8. Examples of Randomness Required
Below are two examples showing rough calculations of needed
randomness for security. The first is for moderate security
passwords, while the second assumes a need for a very high security
cryptographic key.

In addition, [ORMAN] and [RSA BULL13] provide information on the
public key lengths that should be used for exchanging symmetric keys.
8.1 Password Generation
Assume that user passwords change once a year and it is desired that
the probability that an adversary could guess the password for a
particular account be less than one in a thousand. Further assume
that sending a password to the system is the only way to try a
password. Then the crucial question is how often an adversary can try
possibilities. Assume that delays have been introduced into a system
so that, at most, an adversary can make one password try every six
seconds. That's 600 per hour or about 15,000 per day or about
5,000,000 tries in a year. Assuming any sort of monitoring, it is
unlikely someone could actually try continuously for a year. In fact,
even if log files are only checked monthly, 500,000 tries is more
plausible before the attack is noticed and steps taken to change
passwords and make it harder to try more passwords.
To have a one in a thousand chance of guessing the password in
500,000 tries implies a universe of at least 500,000,000 passwords,
or about 2^29. Thus 29 bits of randomness are needed. This can
probably be achieved using the US DoD recommended inputs for password
generation, as it has 8 inputs that probably average over 5 bits of
randomness each (see Section 7.1). Using a list of 1000 words, the
password could be expressed as a three word phrase (1,000,000,000
possibilities) or, using case insensitive letters and digits, six
would suffice ((26+10)^6 = 2,176,782,336 possibilities).
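
The arithmetic above can be checked directly. This sketch simply
restates the text's assumptions (a six-second delay per try, monthly
log review, one-in-a-thousand guessing probability):

```python
import math

tries_per_hour = 3600 // 6               # one try per six seconds -> 600
tries_per_month = tries_per_hour * 24 * 30   # 432,000, "about 500,000"
tries = 500_000                          # rounded figure from the text

# Universe needed so 500,000 tries give at most a 1-in-1000 chance.
universe = tries * 1000                  # 500,000,000 passwords
bits_needed = math.ceil(math.log2(universe))   # 29 bits

# Ways to express at least that many possibilities:
three_word_phrase = 1000 ** 3            # 1,000,000,000
six_alphanumerics = (26 + 10) ** 6       # 2,176,782,336
```

Each additional factor of 1000 in safety adds about 10 bits, which
matches the scaling to 39 and 49 bits discussed below.
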
For a higher security password, the number of bits required goes up.
To decrease the probability by 1,000 requires increasing the universe
of passwords by the same factor, which adds about 10 bits. Thus, to
have only a one in a million chance of a password being guessed under
the above scenario would require 39 bits of randomness and a password
that was a four word phrase from a 1000 word list or eight
letters/digits. To go to a one in 10^9 chance, 49 bits of randomness
are needed, implying a five word phrase or ten letter/digit password.
In a real system, of course, there are also other factors. For
example, the larger and harder to remember passwords are, the more
likely users are to write them down, resulting in an additional risk
of compromise.
8.2 A Very High Security Cryptographic Key
Assume that a very high security key is needed for symmetric
encryption / decryption between two parties. Assume an adversary can
observe communications and knows the algorithm being used. Within the
field of random possibilities, the adversary can try key values in
hopes of finding the one in use. Assume further that brute force
trial of keys is the best the adversary can do.
8.2.1 Effort per Key Trial
How much effort will it take to try each key? For very high security
applications it is best to assume a low value of effort. Even if it
would clearly take tens of thousands of computer cycles or more to
try a single key, there may be some pattern that enables huge blocks
of key values to be tested with much less effort per key. Thus it is
probably best to assume no more than a couple hundred cycles per key.
(There is no clear lower bound on this as computers operate in
parallel on a number of bits and a poor encryption algorithm could
allow many keys or even groups of keys to be tested in parallel.
However, we need to assume some value and can hope that a reasonably
strong algorithm has been chosen for our hypothetical high security
task.)
If the adversary can command a highly parallel processor or a large
network of work stations, 10^11 cycles per second is probably a
minimum assumption for availability today. Looking forward a few
years, there should be at least an order of magnitude improvement.
Thus, assuming 10^9 keys could be checked per second, or 3.6*10^12
per hour, or 6*10^14 per week, or 2.4*10^15 per month is reasonable.
This implies a need for a minimum of 63 bits of randomness in keys to
be sure they cannot be found in a month. Even then it is possible
that, a few years from now, a highly determined and resourceful
adversary could break the key in 2 weeks (on average they need try
only half the keys).
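
Under the assumptions just stated (10^11 cycles per second and a
couple hundred cycles per key), the search rates work out by
straightforward arithmetic. The figures below are the text's
assumptions, not measurements:

```python
cycles_per_second = 10**11     # assumed highly parallel adversary
cycles_per_key = 100           # "a couple hundred cycles per key"
keys_per_second = cycles_per_second // cycles_per_key   # 10^9

keys_per_hour = keys_per_second * 3600       # 3.6 * 10^12
keys_per_week = keys_per_hour * 24 * 7       # about 6 * 10^14
keys_per_month = keys_per_hour * 24 * 30     # about 2.6 * 10^15
                                             # (2.4 * 10^15 for a
                                             #  four-week month)

# A 63-bit keyspace leaves a wide safety margin over one month of
# exhaustive searching at this rate.
months_to_exhaust = 2**63 // keys_per_month
```
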
These questions are considered in detail in "Minimal Key Lengths for
Symmetric Ciphers to Provide Adequate Commercial Security: A Report
by an Ad Hoc Group of Cryptographers and Computer Scientists"
[KeyStudy], which was sponsored by the Business Software Alliance.
It concluded that a reasonable key length in 1995 for very high
security was in the range of 75 to 90 bits and, since the cost of
cryptography does not vary much with the key size, recommended 90
bits. To update these recommendations, just add 2/3 of a bit per
year for Moore's law [MOORE]. Thus, in the year 2004, this
translates to a determination that a reasonable key length is in the
81 to 96 bit range. In fact, today, it is increasingly common to use
keys longer than 96 bits, such as 128-bit (or longer) keys with AES
and keys with effective lengths of 112 bits using triple-DES.
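
The Moore's law update rule can be written out as a one-line
calculation (the function name is ours; the 2/3-bit-per-year rule is
from the text):

```python
def updated_key_length(bits_1995, year):
    # Update the 1995 [KeyStudy] recommendation by adding 2/3 of a
    # bit per year for Moore's law.
    return bits_1995 + (year - 1995) * 2 / 3

low = updated_key_length(75, 2004)    # bottom of the range: 81 bits
high = updated_key_length(90, 2004)   # recommended value:   96 bits
```
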
8.2.2 Meet in the Middle Attacks
If chosen or known plain text and the resulting encrypted text are
available, a "meet in the middle" attack is possible if the structure
of the encryption algorithm allows it. (In a known plain text attack,
the adversary knows all or part of the messages being encrypted,
possibly some standard header or trailer fields. In a chosen plain
text attack, the adversary can force some chosen plain text to be
encrypted, possibly by "leaking" an exciting text that would then be
sent by the adversary over an encrypted channel.)
An oversimplified explanation of the meet in the middle attack is as
follows: the adversary can half-encrypt the known or chosen plain
text with all possible first half-keys, sort the output, then half-
decrypt the encoded text with all the second half-keys. If a match is
found, the full key can be assembled from the halves and used to
decrypt other parts of the message or other messages. At its best,
this type of attack can halve the exponent of the work required by
the adversary while adding a very large but roughly constant factor
of effort. Thus, if this attack can be mounted, a doubling of the
amount of randomness in the very strong key to a minimum of 192 bits
(96*2) is required for the year 2004, based on the [KeyStudy]
analysis.
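
The half-key attack can be demonstrated on a toy cipher. The cipher,
key sizes, and values below are invented solely for illustration (it
is not a real algorithm); a real attack would also sort or hash a far
larger intermediate table and use a second plain/cipher text pair to
discard false matches:

```python
from collections import defaultdict

MASK = 0xFFFF

def enc(k, x):
    # Toy invertible 16-bit "half cipher", invented purely for this
    # illustration; NOT a real or secure algorithm.
    return ((x ^ k) + k) & MASK

def dec(k, y):
    return ((y - k) & MASK) ^ k

# Double encryption with two independent 8-bit half-keys.
k1, k2 = 0x3A, 0xC5
plain = 0x1234
cipher = enc(k2, enc(k1, plain))

# Meet in the middle: about 2 * 2^8 half operations instead of 2^16
# full trials. Half-encrypt under every first key, index the results,
# then half-decrypt under every second key and look for matches.
middle = defaultdict(list)
for a in range(256):
    middle[enc(a, plain)].append(a)

candidates = [(a, b)
              for b in range(256)
              for a in middle[dec(b, cipher)]]

assert (k1, k2) in candidates   # the true key pair is among the matches
```

This is why the attack halves the exponent: the work is proportional
to the number of half-keys, not the number of full keys.
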
This amount of randomness is well beyond the limit of that in the
inputs recommended by the US DoD for password generation and could
require user typing timing, hardware random number generation, or
other sources.
The meet in the middle attack assumes that the cryptographic
algorithm can be decomposed in this way, but we cannot rule that out
without a deep knowledge of the algorithm. Even if a basic algorithm
is not subject to a meet in the middle attack, an attempt to produce
a stronger algorithm by applying the basic algorithm twice (or two
different algorithms sequentially) with different keys may gain less
added security than would be expected. Such a composite algorithm
would be subject to a meet in the middle attack.
Enormous resources may be required to mount a meet in the middle
attack, but they are probably within the range of the national
security services of a major nation. Essentially all nations spy on
other nations' traffic.
8.2.3 Other Considerations
[KeyStudy] also considers the possibilities of special purpose code
breaking hardware and of having an adequate safety margin.
If the two parties agree on a key by Diffie-Hellman exchange [D-H],
then in principle only half of this randomness would have to be
supplied by each party. However, there is probably some correlation
between their random inputs, so for very high security it is probably
best to assume that you end up with more like one and a half times
the bits of randomness that each party provides, if Diffie-Hellman is
used.
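
A toy Diffie-Hellman exchange illustrates how each party contributes
its own random secret. The parameters below are far too small for
real use and are chosen only for illustration; real deployments need
a large, carefully chosen prime and secrets carrying the full amount
of randomness discussed above:

```python
import secrets

# Toy group parameters -- much too small for real security.
p = 0xFFFFFFFB   # the prime 2^32 - 5
g = 5            # public base

# Each party independently generates a random secret exponent.
a = secrets.randbelow(p - 2) + 1
b = secrets.randbelow(p - 2) + 1

A = pow(g, a, p)   # first party's public value, sent in the clear
B = pow(g, b, p)   # second party's public value, sent in the clear

# Both sides compute the same shared secret from the other's
# public value and their own secret exponent.
shared_1 = pow(B, a, p)
shared_2 = pow(A, b, p)
```
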
It should be noted that key length calculations such as those above
are controversial and depend on various assumptions about the
cryptographic algorithms in use. In some cases, a professional with a
deep knowledge of code breaking techniques and of the strength of the
algorithm in use could be satisfied with less than half of the 192
bit key size derived above.
For further examples of conservative design principles, see
[FERGUSON].
9. Conclusion
Generation of unguessable "random" secret quantities for security use
is an essential but difficult task.

Hardware techniques to produce such randomness would be relatively
simple. In particular, the volume and quality would not need to be
high and existing computer hardware, such as disk drives, can be
used.
Computational techniques are available to process low quality random
quantities from multiple sources, or a larger quantity of such low
quality input from one source, and produce a smaller quantity of
higher quality keying material. In the absence of hardware sources of
randomness, a variety of user and software sources can frequently,
with care, be used instead; however, most modern systems already have
hardware, such as disk drives or audio input, that could be used to
produce high quality randomness.
Once a sufficient quantity of high quality seed key material (a
couple of hundred bits) is available, computational techniques are
available to produce cryptographically strong sequences of
unpredictable quantities from this seed material.
10. Security Considerations
The entirety of this document concerns techniques and recommendations
for generating unguessable "random" quantities for use as passwords,
cryptographic keys, initialization vectors, sequence numbers, and
similar security uses.
11. Intellectual Property Considerations
The IETF takes no position regarding the validity or scope of any
Intellectual Property Rights or other rights that might be claimed to
pertain to the implementation or use of the technology described in
this document or the extent to which any license under such rights
might or might not be available; nor does it represent that it has
made any independent effort to identify any such rights. Information
on the procedures with respect to rights in RFC documents can be
found in BCP 78 and BCP 79.

Copies of IPR disclosures made to the IETF Secretariat and any
assurances of licenses to be made available, or the result of an
attempt made to obtain a general license or permission for the use of
such proprietary rights by implementers or users of this
specification can be obtained from the IETF on-line IPR repository at
http://www.ietf.org/ipr.
The IETF invites any interested party to bring to its attention any
copyrights, patents or patent applications, or other proprietary
rights that may cover technology that may be required to implement
this standard. Please address the information to the IETF at
ietf-ipr@ietf.org.
Appendix: Minimal Secure Key Lengths Study

              Minimal Key Lengths for Symmetric Ciphers
                to Provide Adequate Commercial Security

                    A Report by an Ad Hoc Group of
                 Cryptographers and Computer Scientists

Matt Blaze, AT&T Research, mab@research.att.com
Whitfield Diffie, Sun Microsystems, diffie@eng.sun.com
Ronald L. Rivest, MIT LCS, rivest@lcs.mit.edu
Bruce Schneier, Counterpane Systems, schneier@counterpane.com
Tsutomu Shimomura, San Diego Supercomputer Center, tsutomu@sdsc.edu
Eric Thompson, Access Data, Inc., eric@accessdata.com
Michael Wiener, Bell Northern Research, wiener@bnr.ca

January 1996
A.0 Abstract
Encryption plays an essential role in protecting the privacy of
electronic information against threats from a variety of potential
attackers. In so doing, modern cryptography employs a combination of
_conventional_ or _symmetric_ cryptographic systems for encrypting
data and _public key_ or _asymmetric_ systems for managing the _keys_
used by the symmetric systems. Assessing the strength required of
the symmetric cryptographic systems is therefore an essential step in
employing cryptography for computer and communication security.

Technology readily available today (late 1995) makes _brute-force_
attacks against cryptographic systems considered adequate for the
past several years both fast and cheap. General purpose computers
can be used, but a much more efficient approach is to employ
commercially available _Field Programmable Gate Array (FPGA)_
technology. For attackers prepared to make a higher initial
investment, custom-made, special-purpose chips make such calculations
much faster and significantly lower the amortized cost per solution.
As a result, cryptosystems with 40-bit keys offer virtually no
protection at this point against brute-force attacks. Even the U.S.
Data Encryption Standard with 56-bit keys is increasingly inadequate.
As cryptosystems often succumb to "smarter" attacks than brute-force
key search, it is also important to remember that the keylengths
discussed here are the minimum needed for security against the
computational threats considered.
Fortunately, the cost of very strong encryption is not significantly
greater than that of weak encryption. Therefore, to provide adequate
protection against the most serious threats --- well-funded
commercial enterprises or government intelligence agencies --- keys
used to protect data today should be at least 75 bits long. To
protect information adequately for the next 20 years in the face of
expected advances in computing power, keys in newly-deployed systems
should be at least 90 bits long.
A.1. Encryption Plays an Essential Role in Protecting the Privacy of
Electronic Information
A.1.1 There is a need for information security
As we write this paper in late 1995, the development of electronic
commerce and the Global Information Infrastructure is at a critical
juncture. The dirt paths of the middle ages only became highways of
business and culture after the security of travelers and the
merchandise they carried could be assured. So too the information
superhighway will be an ill-traveled road unless information, the
goods of the Information Age, can be moved, stored, bought, and sold
securely. Neither corporations nor individuals will entrust their
private business or personal data to computer networks unless they
can assure their information's security.
Today, most forms of information can be stored and processed
electronically. This means a wide variety of information, with
varying economic values and privacy aspects and with a wide variation
in the time over which the information needs to be protected, will be
found on computer networks. Consider the spectrum:

o Electronic Funds Transfers of millions or even billions of
  dollars, whose short term security is essential but whose
  exposure is brief;

o A company's strategic corporate plans, whose confidentiality
  must be preserved for a small number of years;

o A proprietary product (Coke formula, new drug design) that
  needs to be protected over its useful life, often decades; and

o Information private to an individual (medical condition,
  employment evaluation) that may need protection for the
  lifetime of the individual.
A.1.2 Encryption to protect confidentiality

Encryption Can Provide Strong Confidentiality Protection
Encryption is accomplished by scrambling data using mathematical
procedures that make it extremely difficult and time consuming for
anyone other than authorized recipients --- those with the correct
decryption _keys_ --- to recover the _plain text_. Proper encryption
guarantees that the information will be safe even if it falls into
hostile hands.
Encryption --- and decryption --- can be performed by either computer
software or hardware. Common approaches include writing the
algorithm on a disk for execution by a computer central processor;
placing it in ROM or PROM for execution by a microprocessor; and
isolating storage and execution in a computer accessory device (smart
card or PCMCIA card).
The degree of protection obtained depends on several factors. These
include: the quality of the cryptosystem; the way it is implemented
in software or hardware (especially its reliability and the manner in
which the keys are chosen); and the total number of possible keys
that can be used to encrypt the information. A cryptographic
algorithm is considered strong if:

1. There is no shortcut that allows the opponent to recover the
   plain text without using brute force to test keys until the
   correct one is found; and

2. The number of possible keys is sufficiently large to make
   such an attack infeasible.
The principle here is similar to that of a combination lock on a
safe. If the lock is well designed so that a burglar cannot hear or
feel its inner workings, a person who does not know the combination
can open it only by dialing one set of numbers after another until it
yields.
The sizes of encryption keys are measured in bits and the difficulty
of trying all possible keys grows exponentially with the number of
bits used. Adding one bit to the key doubles the number of possible
keys; adding ten increases it by a factor of more than a thousand.
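The growth factors claimed here are plain integer arithmetic, easily
checked (a minimal sketch; the function name is ours):

```python
def keyspace(bits: int) -> int:
    """Number of distinct keys of the given length in bits."""
    return 2 ** bits

# Adding one bit doubles the number of possible keys.
assert keyspace(41) == 2 * keyspace(40)
# Adding ten bits multiplies it by 2**10 = 1024, more than a thousand.
assert keyspace(50) == 1024 * keyspace(40)
```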
There is no definitive way to look at a cipher and determine whether
a shortcut exists. Nonetheless, several encryption algorithms ---
most notably the U.S. Data Encryption Standard (DES) --- have been
extensively studied in the public literature and are widely believed
to be of very high quality. An essential element in cryptographic
algorithm design is thus the length of the key, whose size places an
upper bound on the system's strength.
Throughout this paper, we will assume that there are no shortcuts and
treat the length of the key as representative of the cryptosystem's
_workfactor_ --- the minimum amount of effort required to break the
system. It is important to bear in mind, however, that cryptographers
regard this as a rash assumption and many would recommend keys two or
more times as long as needed to resist brute-force attacks. Prudent
cryptographic designs not only employ longer keys than might appear
to be needed, but devote more computation to encrypting and
decrypting. A good example of this is the popular approach of using
_triple-DES_: encrypting the output of DES twice more, using a total
of three distinct keys.
Encryption systems fall into two broad classes. Conventional or
symmetric cryptosystems --- those in which an entity with the ability
to encrypt also has the ability to decrypt and vice versa --- are the
systems under consideration in this paper. The more recent public
key or asymmetric cryptosystems have the property that the ability to
encrypt does not imply the ability to decrypt. In contemporary
cryptography, public-key systems are indispensable for managing the
keys of conventional cryptosystems. All known public key
cryptosystems, however, are subject to shortcut attacks and must
therefore use keys ten or more times the lengths of those discussed
here to achieve an equivalent level of security.

Although computers permit electronic information to be encrypted
using very large keys, advances in computing power keep pushing up
the size of keys that can be considered large and thus keep making it
easier for individuals and organizations to attack encrypted
information without the expenditure of unreasonable resources.
A.1.3 There are a variety of attackers

There Are Threats from a Variety of Potential Attackers.

Threats to confidentiality of information come from a number of
directions and their forms depend on the resources of the attackers.
`Hackers,' who might be anything from high school students to
commercial programmers, may have access to mainframe computers or
networks of workstations. The same people can readily buy
inexpensive, off-the-shelf boards containing _Field Programmable Gate
Array (FPGA)_ chips that function as `programmable hardware' and
vastly increase the effectiveness of a cryptanalytic effort. A
startup company or even a well-heeled individual could afford large
numbers of these chips. A major corporation or organized crime
operation with `serious money' to spend could acquire custom computer
chips specially designed for decryption. An intelligence agency,
engaged in espionage for national economic advantage, could build a
machine employing millions of such chips.
A.1.4 Strong encryption is not expensive

Current Technology Permits Very Strong Encryption for Effectively the
Same Cost As Weaker Encryption.

It is a property of computer encryption that modest increases in
computational cost can produce vast increases in security.
Encrypting information very securely (e.g., with 128-bit keys)
typically requires little more computing than encrypting it weakly
(e.g., with 40-bit keys). In many applications, the cryptography
itself accounts for only a small fraction of the computing costs,
compared to such processes as voice or image compression required to
prepare material for encryption.

One consequence of this uniformity of costs is that there is rarely
any need to tailor the strength of cryptography to the sensitivity of
the information being protected. Even if most of the information in
a system has neither privacy implications nor monetary value, there
is no practical or economic reason to design computer hardware or
software to provide differing levels of encryption for different
messages. It is simplest, most prudent, and thus fundamentally most
economical, to employ a uniformly high level of encryption: the
strongest encryption required for any information that might be
stored or transmitted by a secure system.
A.2. Brute-Force is becoming easier

Readily Available Technology Makes Brute-Force Decryption Attacks
Faster and Cheaper.

The kind of hardware used to mount a brute-force attack against an
encryption algorithm depends on the scale of the cryptanalytic
operation and the total funds available to the attacking enterprise.
In the analysis that follows, we consider three general classes of
technology that are likely to be employed by attackers with differing
resources available to them. Not surprisingly, the cryptanalytic
technologies that require larger up-front investments yield the
lowest cost per recovered key, amortized over the life of the
hardware.

It is the nature of brute-force attacks that they can be parallelized
indefinitely. It is possible to use as many machines as are
available, assigning each to work on a separate part of the problem.
Thus regardless of the technology employed, the search time can be
reduced by adding more equipment; twice as much hardware can be
expected to find the right key in half the time. The total
investment will have doubled, but if the hardware is kept constantly
busy finding keys, the cost per key recovered is unchanged.
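The key-space partitioning that makes this parallelization possible
can be sketched as follows (the function name and the contiguous
chunking strategy are illustrative assumptions, not from the study):

```python
def partition_keyspace(bits: int, machines: int):
    """Split the key range [0, 2**bits) into contiguous,
    non-overlapping chunks, one per machine, so that each machine
    can search its share of the keys independently."""
    total = 2 ** bits
    base, extra = divmod(total, machines)
    ranges, start = [], 0
    for i in range(machines):
        size = base + (1 if i < extra else 0)  # spread the remainder
        ranges.append((start, start + size))
        start += size
    return ranges

chunks = partition_keyspace(40, 1000)
# The chunks cover the whole 40-bit keyspace with no gaps or overlaps.
assert chunks[0][0] == 0 and chunks[-1][1] == 2 ** 40
assert all(a[1] == b[0] for a, b in zip(chunks, chunks[1:]))
```

Doubling `machines` halves each chunk, which is exactly the
"twice as much hardware, half the time" observation above.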
At the low end of the technology spectrum is the use of conventional
personal computers or workstations programmed to test keys. Many
people, by virtue of already owning or having access to the machines,
are in a position to use such resources at little or no cost.
However, general purpose computers --- laden with such ancillary
equipment as video controllers, keyboards, interfaces, memory, and
disk storage --- make expensive search engines. They are therefore
likely to be employed only by casual attackers who are unable or
unwilling to invest in more specialized equipment.
A more efficient technological approach is to take advantage of
commercially available Field Programmable Gate Arrays. FPGAs
function as programmable hardware and allow faster implementations
of such tasks as encryption and decryption than conventional
processors. FPGAs are a commonly used tool for simple computations
that need to be done very quickly, particularly simulating integrated
circuits during development.

FPGA technology is fast and cheap. The cost of an AT&T ORCA chip
that can test 30 million DES keys per second is $200. This is 1,000
times faster than a PC at about one-tenth the cost! FPGAs are widely
available and, mounted on cards, can be installed in standard PCs
just like sound cards, modems, or extra memory.
FPGA technology may be optimal when the same tool must be used for
attacking a variety of different cryptosystems. Often, as with DES,
a cryptosystem is sufficiently widely used to justify the
construction of more specialized facilities. In these circumstances,
the most cost-effective technology, but the one requiring the largest
initial investment, is the use of _Application-Specific Integrated
Circuits (ASICs)_. A $10 chip can test 200 million keys per second.
This is seven times faster than an FPGA chip at one-twentieth the
cost.

Because ASICs require a far greater engineering investment than FPGAs
and must be fabricated in quantity before they are economical, this
approach is only available to serious, well-funded operations such as
dedicated commercial (or criminal) enterprises and government
intelligence agencies.
A.3. 40-Bit Key Lengths Offer Virtually No Protection

Current U.S. Government policy generally limits exportable mass
market software that incorporates encryption for confidentiality to
using the RC2 or RC4 algorithms with 40-bit keys. A 40-bit key
length means that there are 2^40 possible keys. On average, half of
these (2^39) must be tried to find the correct one. Export of other
algorithms and key lengths must be approved on a case by case basis.
For example, DES with a 56-bit key has been approved for certain
applications such as financial transactions.
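The figures quoted for the 40-bit keyspace are easy to verify (a
minimal arithmetic sketch; variable names are ours):

```python
total_keys = 2 ** 40
assert total_keys == 1_099_511_627_776   # just over a trillion keys

# On average an exhaustive search tries half the keyspace
# before hitting the correct key.
expected_trials = total_keys // 2
assert expected_trials == 2 ** 39
```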
The recent successful brute-force attack by two French graduate
students on Netscape's 40-bit RC4 algorithm demonstrates the dangers
of such short keys. These students at the Ecole Polytechnique in
Paris used `idle time' on the school's computers, incurring no cost
to themselves or their school. Even with these limited resources,
they were able to recover the 40-bit key in a few days.

There is no need to have the resources of an institution of higher
education at hand, however. Anyone with a modicum of computer
expertise and a few hundred dollars would be able to attack 40-bit
encryption much faster. An FPGA chip --- costing approximately $400
mounted on a card --- would on average recover a 40-bit key in five
hours. Assuming the FPGA lasts three years and is used continuously
to find keys, the average cost per key is eight cents.
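These estimates follow directly from the chip speed quoted earlier
(30 million DES keys per second). The sketch below reproduces the
five-hour and eight-cent figures under the stated $400 cost,
three-year-life, continuous-use assumptions:

```python
RATE = 30e6                   # keys per second for one ORCA FPGA
expected_trials = 2 ** 39     # on average, half of the 2**40 keyspace

seconds_per_key = expected_trials / RATE
hours_per_key = seconds_per_key / 3600
assert 5.0 < hours_per_key < 5.2      # the "five hours" in the text

lifetime_s = 3 * 365 * 24 * 3600      # three years of continuous use
keys_recovered = lifetime_s / seconds_per_key
cost_per_key = 400 / keys_recovered   # $400 card, amortized
assert 0.07 < cost_per_key < 0.09     # roughly eight cents per key
```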
A more determined commercial predator, prepared to spend $10,000 for
a set-up with 25 ORCA chips, can find 40-bit keys in an average of 12
minutes, at the same average eight cent cost. Spending more money to
buy more chips reduces the time accordingly: $300,000 results in a
solution in an average of 24 seconds; $10,000,000 results in an
average solution in 0.7 seconds.

As already noted, a corporation with substantial resources can design
and commission custom chips that are much faster. By doing this, a
company spending $300,000 could find the right 40-bit key in an
average of 0.18 seconds at 1/10th of a cent per solution; a larger
company or government agency willing to spend $10,000,000 could find
the right key on average in 0.005 seconds (again at 1/10th of a cent
per solution). (Note that the cost per solution remains constant
because we have conservatively assumed constant costs for chip
acquisition --- in fact increasing the quantities purchased of a
custom chip reduces the average chip cost as the initial design and
set-up costs are spread over a greater number of chips.)

These results are summarized in Table I (below).
A.4. Even DES with 56-Bit Keys Is Increasingly Inadequate

A.4.1 DES is no panacea today

The Data Encryption Standard (DES) was developed in the 1970s by IBM
and NSA and adopted by the U.S. Government as a Federal Information
Processing Standard for data encryption. It was intended to provide
strong encryption for the government's sensitive but unclassified
information. It was recognized by many, even at the time DES was
adopted, that technological developments would make DES's 56-bit key
exceedingly vulnerable to attack before the end of the century.

Today, DES may be the most widely employed encryption algorithm and
continues to be a commonly cited benchmark. Yet DES-like encryption
strength is no panacea. Calculations show that DES is inadequate
against a corporate or government attacker committing serious
resources. The bottom line is that DES is cheaper and easier to
break than many believe.
As explained above, 40-bit encryption provides inadequate protection
against even the most casual of intruders, content to scavenge time
on idle machines or to spend a few hundred dollars. Against such
opponents, using DES with a 56-bit key will provide a substantial
measure of security. At present, it would take a year and a half for
someone using $10,000 worth of FPGA technology to search out a DES
key. In ten years time an investment of this size would allow one to
find a DES key in less than a week.

The real threat to commercial transactions and to privacy on the
Internet is from individuals and organizations willing to invest
substantial time and money. As more and more business and personal
information becomes electronic, the potential rewards to a dedicated
commercial predator also increase significantly and may justify the
commitment of adequate resources.
A serious effort --- on the order of $300,000 --- by a legitimate or
illegitimate business could find a DES key in an average of 19 days
using off-the-shelf technology and in only 3 hours using a custom
developed chip. In the latter case, it would cost $38 to find each
key (again assuming a 3 year life to the chip and continual use). A
business or government willing to spend $10,000,000 on custom chips
could recover DES keys in an average of 6 minutes, for the same $38
per key.

At the very high end, an organization --- presumably a government
intelligence agency --- willing to spend $300,000,000 could recover
DES keys in 12 seconds each! The investment required is large but
not unheard of in the intelligence community. It is less than the
cost of the Glomar Explorer, built to salvage a single Russian
submarine, and far less than the cost of many spy satellites. Such
an expense might be hard to justify in attacking a single target, but
seems entirely appropriate against a cryptographic algorithm, like
DES, enjoying extensive popularity around the world.
There is ample evidence of the danger presented by government
intelligence agencies seeking to obtain information not only for
military purposes but for commercial advantage. Congressional
hearings in 1993 highlighted instances in which the French and
Japanese governments spied on behalf of their countries' own
businesses. Thus, having to protect commercial information against
such threats is not a hypothetical proposition.

A.4.2 There are smarter avenues of attack than brute force

It is easier to walk around a tree than climb up and down it. There
is no need to break the window of a house to get in if the front door
is unlocked.
Calculations regarding the strength of encryption against brute-force
attack are _worst case_ scenarios. They assume that the ciphers are
in a sense perfect and that attempts to find shortcuts have failed.
One important point is that the crudest approach --- searching
through the keys --- is entirely feasible against many widely used
systems. Another is that the keylengths we discuss are always
minimal. As discussed earlier, prudent designs might use keys twice
or three times as long to provide a margin of safety.
A.4.3 Other algorithms are similar

The Analysis for Other Algorithms Is Roughly Comparable.

The above analysis has focused on the time and money required to find
a key to decrypt information using the RC4 algorithm with a 40-bit
key or the DES algorithm with its 56-bit key, but the results are not
peculiar to these ciphers. Although each algorithm has its own
particular characteristics, the effort required to find the keys of
other ciphers is comparable. There may be some differences as the
result of implementation procedures, but these do not materially
affect the brute-force breakability of algorithms with roughly
comparable key lengths.

Specifically, it has been suggested at times that differences in
set-up procedures, such as the long key-setup process in RC4, result
in some algorithms having effectively longer keys than others. For
the purpose of our analysis, such factors appear to vary the
effective key length by no more than about eight bits.
A.5. Appropriate Key Lengths for the Future --- A Proposal

Table I summarizes the costs of carrying out brute-force attacks
against symmetric cryptosystems with 40-bit and 56-bit keys using
networks of general purpose computers, Field Programmable Gate
Arrays, and special-purpose chips.

It shows that 56 bits provides a level of protection --- about a year
and a half --- that would be adequate for many commercial purposes
against an opponent prepared to invest $10,000. Against an opponent
prepared to invest $300,000, the period of protection has dropped to
the barest minimum of 19 days. Above this, the protection quickly
declines to negligible. A very large, but easily imaginable,
investment by an intelligence agency would clearly allow it to
recover keys in real time.

What workfactor would be required for security today? For an
opponent whose budget lay in the $10 to 300 million range, the time
required to search out keys in a 75-bit keyspace would be between 6
years and 70 days. Although the latter figure may seem comparable to
the `barest minimum' 19 days mentioned earlier, it represents ---
under our amortization assumptions --- a cost of $19 million and a
recovery rate of only five keys a year. The victims of such an
attack would have to be fat targets indeed.
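The 6-year and 70-day endpoints can be reproduced from the ASIC
figures given earlier ($10 per chip, 200 million keys per second),
assuming the budget is spent entirely on chips and the full 75-bit
keyspace is swept (the function name is ours):

```python
CHIP_COST = 10          # dollars per ASIC, as quoted earlier
CHIP_RATE = 200e6       # keys tested per second per ASIC
KEYSPACE = 2 ** 75

def search_days(budget: float) -> float:
    """Days to sweep the full 75-bit keyspace with `budget`
    dollars' worth of ASICs running in parallel."""
    rate = (budget / CHIP_COST) * CHIP_RATE   # total keys per second
    return KEYSPACE / rate / 86400

assert 2100 < search_days(10e6) < 2300    # about six years at $10M
assert 65 < search_days(300e6) < 80       # on the order of 70 days at $300M
```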
Because many kinds of information must be kept confidential for long
periods of time, assessment cannot be limited to the protection
required today. Equally important, cryptosystems --- especially if
they are standards --- often remain in use for years or even decades.
DES, for example, has been in use for more than 20 years and will
probably continue to be employed for several more. In particular,
the lifetime of a cryptosystem is likely to exceed the lifetime of
any individual product embodying it.

A rough estimate of the minimum strength required as a function of
time can be obtained by applying an empirical rule, popularly called
`Moore's Law,' which holds that the computing power available for a
given cost doubles every 18 months. Taking into account both the
lifetime of cryptographic equipment and the lifetime of the secrets
it protects, we believe it is prudent to require that encrypted data
should still be secure in 20 years. Moore's Law thus predicts that
the keys should be approximately 14 bits longer than required to
protect against an attack today.
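The 14-bit figure is just Moore's Law applied over the 20-year
horizon, since each doubling of attacker computing power is offset by
one additional key bit:

```python
import math

DOUBLING_MONTHS = 18   # Moore's Law: power per dollar doubles every 18 months
HORIZON_YEARS = 20     # how long encrypted data should remain secure

doublings = HORIZON_YEARS * 12 / DOUBLING_MONTHS   # 13.33 doublings
extra_bits = math.ceil(doublings)                  # one key bit per doubling
assert extra_bits == 14
```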
*Bearing in mind that the additional computational costs of stronger
encryption are modest, we strongly recommend a minimum key-length of
90 bits for symmetric cryptosystems.*

It is instructive to compare this recommendation with both Federal
Information Processing Standard 46, The Data Encryption Standard
(DES), and Federal Information Processing Standard 185, The Escrowed
Encryption Standard (EES). DES was proposed 21 years ago and used a
56-bit key. Applying Moore's Law and adding 14 bits, we see that the
strength of DES when it was proposed in 1975 was comparable to that
of a 70-bit system today. Furthermore, it was estimated at the time
that DES was not strong enough and that keys could be recovered at a
rate of one per day for an investment of about twenty-million
dollars. Our 75-bit estimate today corresponds to 61 bits in 1975,
enough to have moved the cost of key recovery just out of reach. The
Escrowed Encryption Standard, while unacceptable to many potential
users for other reasons, embodies a notion of appropriate key length
that is similar to our own. It uses 80-bit keys, a number that lies
between our figures of 75 and 90 bits.
                               Table I

                                Time and cost        Length Needed
Type of        Budget   Tool    per key recovered    for protection
Attacker                        40 bits    56 bits   in Late 1995

Pedestrian Hacker
   tiny      scavenged
             computer
             time       1 week      infeasible            45

   $400      FPGA       5 hours     38 years              50
                        ($0.08)     ($5,000)

Small Business
   $10,000   FPGA       12 minutes  556 days              55
                        ($0.08)     ($5,000)

Corporate Department
   $300K     FPGA       24 seconds  19 days               60
             or         ($0.08)     ($5,000)
             ASIC       .18 seconds 3 hours
                        ($0.001)    ($38)

Big Company
   $10M      FPGA       .7 seconds  13 hours              70
             or         ($0.08)     ($5,000)
             ASIC       .005 seconds 6 minutes
                        ($0.001)    ($38)

Intelligence Agency
   $300M     ASIC       .0002 seconds 12 seconds          75
                        ($0.001)    ($38)
A.6 About the Authors

*Matt Blaze* is a senior research scientist at AT&T Research in the
area of computer security and cryptography. Recently Blaze
demonstrated weaknesses in the U.S. government's `Clipper Chip' key
escrow encryption system. His current interests include large-scale
trust management and the applications of smartcards.

*Whitfield Diffie* is a Distinguished Engineer at Sun Microsystems
specializing in security. In 1976 Diffie and Martin Hellman created
public key cryptography, which solved the problem of sending coded
information between individuals with no prior relationship and is the
basis for widespread encryption in the digital information age.

*Ronald L. Rivest* is a professor of computer science at the
Massachusetts Institute of Technology, and is Associate Director of
MIT's Laboratory for Computer Science. Rivest, together with Leonard
Adleman and Adi Shamir, invented the RSA public-key cryptosystem that
is used widely throughout industry. Ron Rivest is one of the
founders of RSA Data Security Inc. and is the creator of variable key
length symmetric key ciphers (e.g., RC4).

*Bruce Schneier* is president of Counterpane Systems, a consulting
firm specializing in cryptography and computer security. Schneier
writes and speaks frequently on computer security and privacy, is the
author of a leading cryptography textbook, Applied Cryptography, and
is the creator of the symmetric key cipher Blowfish.

*Tsutomu Shimomura* is a computational physicist employed by the San
Diego Supercomputer Center who is an expert in designing software
security tools. Last year, Shimomura was responsible for tracking
down the computer outlaw Kevin Mitnick, who electronically stole and
altered valuable electronic information around the country.

*Eric Thompson* heads AccessData Corporation's cryptanalytic team and
is a frequent lecturer on applied cryptography. AccessData
specializes in data recovery and decrypting information utilizing
brute force as well as `smarter' attacks. Regular clients include
the FBI and other law enforcement agencies as well as corporations.

*Michael Wiener* is a cryptographic advisor at Bell-Northern Research
where he focuses on cryptanalysis, security architectures, and
public-key infrastructures. His influential 1993 paper, Efficient
DES Key Search, describes in detail how to construct a machine to
brute force crack DES coded information (and provides cost estimates
as well).

A.7 Acknowledgement

The [Appendix] authors would like to thank the Business Software
Alliance, which provided support for a one-day meeting, held in
Chicago on 20 November 1995.
| Informative References | 13. Informative References | |||
| [AES] - "Specification of the Advanced Encryption Standard (AES)", | [AES] - "Specification of the Advanced Encryption Standard (AES)", | |||
| United States of America, Department of Commerce, National Institute | United States of America, US National Institute of Standards and | |||
| of Standards and Technology, Federal Information Processing Standard | Technology, FIPS 197, November 2001. | |||
| 197, November 2001. | ||||
| [ASYMMETRIC] - "Secure Communications and Asymmetric Cryptosystems", | [ASYMMETRIC] - "Secure Communications and Asymmetric Cryptosystems", | |||
| edited by Gustavus J. Simmons, AAAS Selected Symposium 69, Westview | edited by Gustavus J. Simmons, AAAS Selected Symposium 69, Westview | |||
| Press, Inc. | Press, Inc. | |||
| [BBS] - "A Simple Unpredictable Pseudo-Random Number Generator", SIAM | [BBS] - "A Simple Unpredictable Pseudo-Random Number Generator", SIAM | |||
| Journal on Computing, v. 15, n. 2, 1986, L. Blum, M. Blum, & M. Shub. | Journal on Computing, v. 15, n. 2, 1986, L. Blum, M. Blum, & M. Shub. | |||
| [BRILLINGER] - "Time Series: Data Analysis and Theory", Holden-Day, | [BRILLINGER] - "Time Series: Data Analysis and Theory", Holden-Day, | |||
| 1981, David Brillinger. | 1981, David Brillinger. | |||
| [CRC] - "C.R.C. Standard Mathematical Tables", Chemical Rubber | [CRC] - "C.R.C. Standard Mathematical Tables", Chemical Rubber | |||
| Publishing Company. | Publishing Company. | |||
| [CRYPTO1] - "Cryptography: A Primer", A Wiley-Interscience | ||||
| Publication, John Wiley & Sons, 1981, Alan G. Konheim. | ||||
| [CRYPTO2] - "Cryptography: A New Dimension in Computer Data | ||||
| Security", A Wiley-Interscience Publication, John Wiley & Sons, 1982, | ||||
| Carl H. Meyer & Stephen M. Matyas. | ||||
| [CRYPTO3] - "Applied Cryptography: Protocols, Algorithsm, and Source | ||||
| Code in C", Second Edition, John Wiley & Sons, 1996, Bruce Schneier. | ||||
| [DAVIS] - "Cryptographic Randomness from Air Turbulence in Disk | [DAVIS] - "Cryptographic Randomness from Air Turbulence in Disk | |||
| Drives", Advances in Cryptology - Crypto '94, Springer-Verlag Lecture | Drives", Advances in Cryptology - Crypto '94, Springer-Verlag Lecture | |||
| Notes in Computer Science #839, 1984, Don Davis, Ross Ihaka, and | Notes in Computer Science #839, 1984, Don Davis, Ross Ihaka, and | |||
| Philip Fenstermacher. | Philip Fenstermacher. | |||
| [DES] - "Data Encryption Standard", United States of America, | [DES] - "Data Encryption Standard", US National Institute of | |||
| Department of Commerce, National Institute of Standards and | Standards and Technology, FIPS 46-3, October 1999. | |||
| Technology, Federal Information Processing Standard (FIPS) 46-3, | - "Data Encryption Algorithm", American National Standards | |||
| October 1999. | Institute, ANSI X3.92-1981. | |||
| - "Data Encryption Algorithm", American National Standards Institute, | (See also FIPS 112, Password Usage, which includes FORTRAN | |||
| ANSI X3.92-1981. | code for performing DES.) | |||
| (See also FIPS 112, Password Usage, which includes FORTRAN code for | ||||
| performing DES.) | ||||
| [DES MODES] - "DES Modes of Operation", United States of America, | ||||
| Department of Commerce, National Institute of Standards and | ||||
| Technology, Federal Information Processing Standard (FIPS) 81, | ||||
| December 1980. | ||||
| - Data Encryption Algorithm - Modes of Operation, American National | ||||
| Standards Institute, ANSI X3.106-1983. | ||||
| [D-H] - "New Directions in Cryptography", IEEE Transactions on | [D-H] - RFC 2631, "Diffie-Hellman Key Agreement Method", Eric | |||
| Information Technology, November, 1976, Whitfield Diffie and Martin | Rescrola, June 1999. | |||
| E. Hellman. | ||||
| [DNSSEC] - RFC 2535, "Domain Name System Security Extensions", D. | [DNSSEC] - RFC 2535, "Domain Name System Security Extensions", D. | |||
| Eastlake, March 1999. | Eastlake, March 1999. | |||
| [DoD] - "Password Management Guideline", United States of America, | [DoD] - "Password Management Guideline", United States of America, | |||
| Department of Defense, Computer Security Center, CSC-STD-002-85. | Department of Defense, Computer Security Center, CSC-STD-002-85. | |||
| (See also FIPS 112, Password Usage, which incorporates CSC-STD-002-85 | (See also FIPS 112, Password Usage, which incorporates CSC-STD-002-85 | |||
| as one of its appendices.) | as one of its appendices.) | |||
| [DSS] - "Digital Signature Standard (DSS)", United States of America, | [DSS] - "Digital Signature Standard (DSS)", US National Institute of | |||
| Department of Commerce, National Institute of Standards and | Standards and Technology, FIPS 186-2, January 2000. | |||
| Technoloy, Federal Information Processing Standard (FIPS) 186-2, | ||||
| January 2000. | ||||
| [GIFFORD] - "Natural Random Number", MIT/LCS/TM-371, September 1988, | [FERGUSON] - "Practical Cryptography", Niels Ferguson and Bruce | |||
| David K. Gifford | Schneier, Wiley Publishing Inc., ISBN 047122894X, April 2003. | |||
| [GIFFORD] - "Natural Random Number", MIT/LCS/TM-371, David K. | ||||
| Gifford, September 1988. | ||||
| [IEEE 802.11i] - "Draft Amendment to Standard for Telecommunications | ||||
| and Information Exchange Between Systems - LAN/MAN Specific | ||||
| Requirements - Part 11: Wireless Medium Access Control (MAC) and | ||||
| physical layer (PHY) specifications: Medium Access Control (MAC) | ||||
| Security Enhancements", The Institute for Electrical and Electronics | ||||
| Engineers, January 2004. | ||||
| [IPSEC] - RFC 2401, "Security Architecture for the Internet | [IPSEC] - RFC 2401, "Security Architecture for the Internet | |||
| Protocol", S. Kent, R. Atkinson, November 1998 | Protocol", S. Kent, R. Atkinson, November 1998. | |||
| [KAUFMAN] - "Network Security: Private Communication in a Public | ||||
| World", Charlie Kaufman, Radia Perlman, and Mike Speciner, Prentice | ||||
| Hall PTR, ISBN 0-13-046019-2, 2nd Edition 2002. | ||||
| [KeyStudy] - "Minimal Key Lengths for Symmetric Ciphers to Provide | ||||
| Adequate Commercial Security: A Report by an Ad Hoc Group of | ||||
| Cryptographers and Computer Scientists", M. Blaze, W. Diffie, R. | ||||
| Rivest, B. Schneier, T. Shimomura, E. Thompson, and M. Wiener, | ||||
| January 1996, <www.counterpane.com/keylength.html>. | ||||
| [KNUTH] - "The Art of Computer Programming", Volume 2: Seminumerical | [KNUTH] - "The Art of Computer Programming", Volume 2: Seminumerical | |||
| Algorithms, Chapter 3: Random Numbers. Addison Wesley Publishing | Algorithms, Chapter 3: Random Numbers. Addison Wesley Publishing | |||
| Company, Second Edition 1982, Donald E. Knuth. | Company, 3rd Edition November 1997, Donald E. Knuth. | |||
| [KRAWCZYK] - "How to Predict Congruential Generators", Journal of | [KRAWCZYK] - "How to Predict Congruential Generators", Journal of | |||
| Algorithms, V. 13, N. 4, December 1992, H. Krawczyk. | Algorithms, V. 13, N. 4, December 1992, H. Krawczyk. | |||
| [MAIL PEM] - RFCs 1421 through 1424: | [MAIL PEM] - RFCs 1421 through 1424: | |||
| - RFC 1424, Privacy Enhancement for Internet Electronic Mail: Part | - RFC 1421, Privacy Enhancement for Internet Electronic Mail: | |||
| IV: Key Certification and Related Services, 02/10/1993, B. Kaliski | Part I: Message Encryption and Authentication Procedures, 02/10/1993, | |||
| - RFC 1423, Privacy Enhancement for Internet Electronic Mail: Part | J. Linn | |||
| III: Algorithms, Modes, and Identifiers, 02/10/1993, D. Balenson | - RFC 1422, Privacy Enhancement for Internet Electronic Mail: | |||
| - RFC 1422, Privacy Enhancement for Internet Electronic Mail: Part | Part II: Certificate-Based Key Management, 02/10/1993, S. Kent | |||
| II: Certificate-Based Key Management, 02/10/1993, S. Kent | - RFC 1423, Privacy Enhancement for Internet Electronic Mail: | |||
| - RFC 1421, Privacy Enhancement for Internet Electronic Mail: Part I: | Part III: Algorithms, Modes, and Identifiers, 02/10/1993, D. Balenson | |||
| Message Encryption and Authentication Procedures, 02/10/1993, J. Linn | - RFC 1424, Privacy Enhancement for Internet Electronic Mail: | |||
| Part IV: Key Certification and Related Services, 02/10/1993, B. | ||||
| Kaliski | ||||
| [MAIL PGP] - RFC 2440, "OpenPGP Message Format", J. Callas, L. | [MAIL PGP] | |||
| Donnerhacke, H. Finney, R. Thayer, November 1998 | - RFC 2440, "OpenPGP Message Format", J. Callas, L. | |||
| Donnerhacke, H. Finney, R. Thayer, November 1998. | ||||
| - RFC 3156, "MIME Security with OpenPGP" M. Elkins, D. Del | ||||
| Torto, R. Levien, T. Roessler, August 2001. | ||||
| [MAIL S/MIME] - RFC 2633, "S/MIME Version 3 Message Specification", | [MAIL S/MIME] - RFCs 2632 through 2634: | |||
| B. Ramsdell, Ed., June 1999. | - RFC 2632, "S/MIME Version 3 Certificate Handling", B. | |||
| Ramsdell, Ed., June 1999. | ||||
| - RFC 2633, "S/MIME Version 3 Message Specification", B. | ||||
| Ramsdell, Ed., June 1999. | ||||
| - RFC 2634, "Enhanced Security Services for S/MIME", P. | ||||
| Hoffman, Ed., June 1999. | ||||
| [MD4] - "The MD4 Message-Digest Algorithm", RFC1320, April 1992, R. | [MD4] - "The MD4 Message-Digest Algorithm", RFC1320, April 1992, R. | |||
| Rivest. | Rivest. | |||
| [MD5] - "The MD5 Message-Digest Algorithm", RFC1321, April 1992, R. | [MD5] - "The MD5 Message-Digest Algorithm", RFC1321, April 1992, R. | |||
| Rivest. | Rivest. | |||
| [MOORE] - Moore's Law: the exponential increase the logic density of | [MODES] - "DES Modes of Operation", US National Institute of | |||
| silicon circuts. Originally formulated by Gordon Moore in 1964 as a | Standards and Technology, FIPS 81, December 1980. | |||
| doubling every year starting in 1962, in the late 1970s the rate fell | - "Data Encryption Algorithm - Modes of Operation", American | |||
| to a doubling every 18 months and has remained there through the date | National Standards Institute, ANSI X3.106-1983. | |||
| of this document. See "The New Hacker's Dictionary", Third Edition, | ||||
| MIT Press, ISBN 0-262-18178-9, Eric S. Raymondm 1996. | [MOORE] - Moore's Law: the exponential increase in the logic density | |||
| of silicon circuits. Originally formulated by Gordon Moore in 1964 as | ||||
| a doubling every year starting in 1962, in the late 1970s the rate | ||||
| fell to a doubling every 18 months and has remained there through the | ||||
| date of this document. See "The New Hacker's Dictionary", Third | ||||
| Edition, MIT Press, ISBN 0-262-18178-9, Eric S. Raymond, 1996. | ||||
| [ORMAN] - "Determining Strengths For Public Keys Used For Exchanging | [ORMAN] - "Determining Strengths For Public Keys Used For Exchanging | |||
| Symmetric Keys", draft-orman-public-key-lengths-*.txt, Hilarie Orman, | Symmetric Keys", draft-orman-public-key-lengths-*.txt, Hilarie Orman, | |||
| Paul Hoffman, work in progress. | Paul Hoffman, work in progress. | |||
| [RFC 1750] - "Randomness Requirements for Security", D. Eastlake, S. | [RFC 1750] - "Randomness Requirements for Security", D. Eastlake, S. | |||
| Crocker, J. Schiller, December 1994. | Crocker, J. Schiller, December 1994. | |||
| [RSA BULL1] - "Suggestions for Random Number Generation in Software", | [RSA BULL1] - "Suggestions for Random Number Generation in Software", | |||
| RSA Laboratories Bulletin #1, January 1996. | RSA Laboratories Bulletin #1, January 1996. | |||
| skipping to change at page 51, line 28 ¶ | skipping to change at page 41, line 42 ¶ | |||
| [RSA BULL13] - "A Cost-Based Security Analysis of Symmetric and | [RSA BULL13] - "A Cost-Based Security Analysis of Symmetric and | |||
| Asymmetric Key Lengths", RSA Laboratories Bulletin #13, Robert | Asymmetric Key Lengths", RSA Laboratories Bulletin #13, Robert | |||
| Silverman, April 2000 (revised November 2001). | Silverman, April 2000 (revised November 2001). | |||
| [SBOX1] - "Practical s-box design", S. Mister, C. Adams, Selected | [SBOX1] - "Practical s-box design", S. Mister, C. Adams, Selected | |||
| Areas in Cryptography, 1996. | Areas in Cryptography, 1996. | |||
| [SBOX2] - "Perfect Non-linear S-boxes", K. Nyberg, Advances in | [SBOX2] - "Perfect Non-linear S-boxes", K. Nyberg, Advances in | |||
| Cryptology - Eurocrypt '91 Proceedings, Springer-Verlag, 1991. | Cryptology - Eurocrypt '91 Proceedings, Springer-Verlag, 1991. | |||
| [SCHNEIER] - "Applied Cryptography: Protocols, Algorithms, and Source | ||||
| Code in C", 2nd Edition, John Wiley & Sons, 1996, Bruce Schneier. | ||||
| [SHANNON] - "The Mathematical Theory of Communication", University of | [SHANNON] - "The Mathematical Theory of Communication", University of | |||
| Illinois Press, 1963, Claude E. Shannon. (originally from: Bell | Illinois Press, 1963, Claude E. Shannon. (originally from: Bell | |||
| System Technical Journal, July and October 1948) | System Technical Journal, July and October 1948) | |||
| [SHIFT1] - "Shift Register Sequences", Aegean Park Press, Revised | [SHIFT1] - "Shift Register Sequences", Aegean Park Press, Revised | |||
| Edition 1982, Solomon W. Golomb. | Edition 1982, Solomon W. Golomb. | |||
| [SHIFT2] - "Cryptanalysis of Shift-Register Generated Stream Cypher | [SHIFT2] - "Cryptanalysis of Shift-Register Generated Stream Cypher | |||
| Systems", Aegean Park Press, 1984, Wayne G. Barker. | Systems", Aegean Park Press, 1984, Wayne G. Barker. | |||
| [SHA-1] - "Secure Hash Standard (SHA-1)", United States of American, | [SHA-1] - "Secure Hash Standard (SHA-1)", US National Institute of | |||
| National Institute of Standards and Technology, Federal Information | Standards and Technology, FIPS 180-1, April 1993. | |||
| Processing Standard (FIPS) 180-1, April 1993. | ||||
| - RFC 3174, "US Secure Hash Algorithm 1 (SHA1)", D. Eastlake, | - RFC 3174, "US Secure Hash Algorithm 1 (SHA1)", D. Eastlake, | |||
| P. Jones, September 2001. | P. Jones, September 2001. | |||
| [SHA-2] - "Secure Hash Standard", Draft (SHA-256/384/512), Federal | [SHA-2] - "Secure Hash Standard", Draft (SHA-256/384/512), US | |||
| Information Processing Standard 180-2, not yet issued. | National Institute of Standards and Technology, FIPS 180-2, not yet | |||
| issued. | ||||
| [SSH] - draft-ietf-secsh-*, work in progress. | [SSH] - draft-ietf-secsh-*, work in progress. | |||
| [STERN] - "Secret Linear Congruential Generators are not | [STERN] - "Secret Linear Congruential Generators are not | |||
| Cryptograhically Secure", Proceedings of IEEE STOC, 1987, J. Stern. | Cryptographically Secure", Proceedings of IEEE STOC, 1987, J. Stern. | |||
| [TLS] - RFC 2246, "The TLS Protocol Version 1.0", T. Dierks, C. | [TLS] - RFC 2246, "The TLS Protocol Version 1.0", T. Dierks, C. | |||
| Allen, January 1999. | Allen, January 1999. | |||
| [USENET] - RFC 977, "Network News Transfer Protocol", B. Kantor, P. | ||||
| Lapsley, February 1986. | ||||
| - RFC 2980, "Common NNTP Extensions", S. Barber, October | ||||
| 2000. | ||||
| [VON NEUMANN] - "Various techniques used in connection with random | [VON NEUMANN] - "Various techniques used in connection with random | |||
| digits", von Neumann's Collected Works, Vol. 5, Pergamon Press, 1963, | digits", von Neumann's Collected Works, Vol. 5, Pergamon Press, 1963, | |||
| J. von Neumann. | J. von Neumann. | |||
| [X9.17] - "American National Standard for Financial Institution Key | ||||
| Management (Wholesale)", American Bankers Association, 1985. | ||||
| [X9.82] - "Random Number Generation", ANSI X9F1, work in progress. | ||||
| Authors Addresses | Authors Addresses | |||
| Donald E. Eastlake 3rd | Donald E. Eastlake 3rd | |||
| Motorola Laboratories | Motorola Laboratories | |||
| 155 Beaver Street | 155 Beaver Street | |||
| Milford, MA 01757 USA | Milford, MA 01757 USA | |||
| Telephone: +1 508-786-7554 (w) | Telephone: +1 508-786-7554 (w) | |||
| +1 508-634-2066 (h) | +1 508-634-2066 (h) | |||
| EMail: Donald.Eastlake@motorola.com | EMail: Donald.Eastlake@motorola.com | |||
| skipping to change at page 53, line 30 ¶ | skipping to change at page 43, line 30 ¶ | |||
| Telephone: +1 617-253-0161 | Telephone: +1 617-253-0161 | |||
| E-mail: jis@mit.edu | E-mail: jis@mit.edu | |||
| Steve Crocker | Steve Crocker | |||
| EMail: steve@stevecrocker.com | EMail: steve@stevecrocker.com | |||
| File Name and Expiration | File Name and Expiration | |||
| This is file draft-eastlake-randomness2-05.txt. | This is file draft-eastlake-randomness2-06.txt. | |||
| It expires June 2004. | It expires October 2004. | |||
| End of changes. 236 change blocks. | ||||
| 1215 lines changed or deleted | 795 lines changed or added | |||
This html diff was produced by rfcdiff 1.48. The latest version is available from http://tools.ietf.org/tools/rfcdiff/ | ||||