Entropy

🌌

Why Entropy Matters

🌍 In the Universe

Entropy is the fundamental measure of disorder and unpredictability in nature. From the arrow of time to the heat death of the universe, entropy governs the evolution of all physical systems. The Second Law of Thermodynamics states that the entropy of an isolated system never decreases, making it one of the most fundamental principles in physics.

🔐 In Cryptography

Cryptographic security depends entirely on unpredictability. Without high-quality entropy, even mathematically perfect encryption algorithms fail. Every cryptographic key, nonce, and initialization vector must be generated from a source of true randomness. Weak entropy has broken the PlayStation 3's signing scheme, Debian's OpenSSL package, and countless other systems, which is why high-quality randomness is the foundation upon which all digital security is built.

Hardware Entropy & Quantum Random Generation

Generate cryptographically secure random data from multiple sources including hardware RNG (RDRAND/RDSEED), system entropy (/dev/urandom), and quantum random numbers from ANU.

🔢

Hardware RNG

CPU instructions: RDRAND, RDSEED, /dev/hwrng

⚛️

Quantum Random

True quantum random numbers from qrandom.io (ANU)

📊

NIST Tests

Frequency, runs, block frequency, longest run tests

🧮

Quality Analysis

Shannon entropy, chi-square, Monte Carlo Pi estimation


Understanding Cryptographic Entropy

Entropy is the cornerstone of all cryptographic systems. Without high-quality randomness, even the most sophisticated encryption algorithms become vulnerable. This page provides real hardware entropy generation with comprehensive statistical validation following NIST Special Publication 800-90B guidelines.

What is Entropy?

In information theory, entropy H(X) measures the uncertainty or unpredictability of a random variable X. For cryptographic purposes, we require maximum entropy - each bit must be equally likely to be 0 or 1, with no predictable patterns.

Shannon Entropy Formula

H(X) = -Σ p(xᵢ) × log₂(p(xᵢ))

Where p(xᵢ) is the probability of symbol xᵢ occurring. For perfectly random bytes, H(X) = 8.0 bits per byte.

Min-Entropy Formula

H∞(X) = -log₂(max p(xᵢ))

Min-entropy provides the worst-case entropy estimate by measuring the probability of the most likely outcome. This is the conservative measure used in cryptographic standards.
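
Both estimators follow directly from a sample's byte histogram. The Python sketch below (illustrative only, not this portal's implementation) applies the two formulas above to data drawn from the operating system RNG:

```python
# Shannon entropy and min-entropy of a byte sample, per the formulas above.
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """H(X) = -sum p(x) * log2 p(x), in bits per byte."""
    n = len(data)
    counts = Counter(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def min_entropy(data: bytes) -> float:
    """H_inf(X) = -log2(max p(x)): worst-case (most likely symbol) estimate."""
    p_max = max(Counter(data).values()) / len(data)
    return -math.log2(p_max)

sample = os.urandom(4096)
print(f"Shannon entropy: {shannon_entropy(sample):.4f} bits/byte")
print(f"Min-entropy:     {min_entropy(sample):.4f} bits/byte")
```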

Hardware Random Number Generation

RDRAND Instruction

Intel's RDRAND uses a DRNG (Digital Random Number Generator) based on thermal noise. The process:

  1. Entropy Source (ES): Thermal noise from a dedicated hardware circuit produces unpredictable voltage fluctuations
  2. Conditioning: Raw entropy passes through AES-CBC-MAC to remove bias and correlations
  3. DRBG (SP 800-90A): Conditioned entropy seeds a CTR_DRBG using AES-128
  4. Reseeding: Automatic reseeding every 1024 generated values limits the impact of any internal-state compromise (prediction and backtracking resistance)

Security Level: Designed to provide at least 128 bits of security strength. The thermal noise source is continuously monitored for failures.

RDSEED Instruction

RDSEED provides direct access to the conditioned entropy source, bypassing the DRBG. This is the highest quality entropy available from hardware:

  • True Randomness: Direct access to physical entropy source after conditioning
  • Seed Quality: Suitable for seeding other cryptographic RNGs
  • Slower but Higher Quality: May retry multiple times to gather sufficient entropy
  • NIST SP 800-90B Compliance: Meets requirements for full-entropy seed material

/dev/hwrng and Linux Kernel Integration

Modern Linux kernels aggregate entropy from multiple hardware sources:

  • Hardware RNG drivers: TPM chips, Intel RDSEED, AMD RdRand, VIA PadLock
  • Kernel CRNG: ChaCha20-based generator, substantially reworked around kernels 5.17/5.18 with BLAKE2s-based pool mixing
  • /dev/urandom: Non-blocking cryptographic PRNG seeded from hardware entropy
  • Entropy accounting: /proc/sys/kernel/random/entropy_avail shows available bits
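
As a rough illustration of these interfaces, the Linux-only Python sketch below checks which RNG instructions the CPU advertises and queries the kernel's entropy accounting. The file paths are the standard Linux ones; they may be absent on other platforms or restricted systems:

```python
# Probe Linux entropy facilities: CPU RNG flags, kernel entropy estimate,
# and the non-blocking /dev/urandom interface (via os.urandom).
import os

def cpu_rng_flags():
    """Return which of the rdrand/rdseed flags /proc/cpuinfo reports."""
    try:
        with open("/proc/cpuinfo") as f:
            flags = set(f.read().split())
    except OSError:
        return set()
    return {"rdrand", "rdseed"} & flags

def entropy_avail():
    """Entropy estimate (in bits) the kernel reports for its input pool."""
    try:
        with open("/proc/sys/kernel/random/entropy_avail") as f:
            return int(f.read())
    except OSError:
        return None

print("CPU RNG instructions:", cpu_rng_flags() or "none reported")
print("Kernel entropy_avail:", entropy_avail())
print("32 bytes from the kernel CSPRNG:", os.urandom(32).hex())
```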

Quantum Random Number Generation

Quantum Vacuum Fluctuations

The Australian National University (ANU) Quantum Random Number Generator measures the quantum vacuum state in an optical system:

  1. Vacuum State: Even in perfect vacuum, quantum mechanics predicts electromagnetic field fluctuations due to the uncertainty principle: ΔE·Δt ≥ ℏ/2
  2. Homodyne Detection: A laser beam is split and recombined to measure the quadrature amplitudes of the vacuum state
  3. Shot Noise Measurement: The balanced detectors record shot-noise-limited fluctuations in the photocurrent, which are fundamentally random as a consequence of quantum mechanics
  4. Digitization: Analog measurements are converted to random bits using least significant bits

Theoretical Foundation: Bell-test experiments rule out local hidden-variable explanations of quantum correlations, supporting the view that quantum measurement outcomes are fundamentally random rather than merely computationally unpredictable.

Heisenberg Uncertainty Principle

Δx · Δp ≥ ℏ/2

This fundamental limit ensures that quantum measurements produce inherently random results. The uncertainty is not due to measurement imprecision but is a property of quantum reality.

NIST Statistical Test Suite (SP 800-22)

1. Frequency (Monobit) Test

Tests whether the number of ones and zeros in the sequence are approximately equal, as expected for a random sequence.

S = Σ(2×xᵢ - 1), where xᵢ ∈ {0,1}

s_obs = |S| / √n

P-value = erfc(s_obs / √2)

Interpretation: P-value > 0.01 indicates the sequence passes. This tests for the most basic bias.
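
A minimal Python rendering of the monobit test using the formulas above (not the exact code behind this page):

```python
# SP 800-22 Frequency (Monobit) test over a byte string.
import math
import os

def monobit_test(data: bytes) -> float:
    """Return the P-value; P-value >= 0.01 is the usual pass threshold."""
    n = len(data) * 8                              # number of bits
    ones = sum(bin(byte).count("1") for byte in data)
    s = 2 * ones - n                               # S = sum(2*x_i - 1)
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))

print(f"Monobit P-value: {monobit_test(os.urandom(1024)):.4f}")
```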

2. Block Frequency Test

Divides the sequence into blocks and checks if the proportion of ones in each block is approximately 1/2.

πᵢ = Σ(xⱼ) / M, for block i of length M

χ² = 4M × Σ(πᵢ - 1/2)²

P-value = igamc(N/2, χ²/2)

Purpose: Detects local biases that might not be visible in the overall frequency test.
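
A sketch of the block frequency test; igamc is taken here from SciPy as scipy.special.gammaincc, and the block size M = 128 is simply a common choice rather than a value mandated by this page:

```python
# SP 800-22 Block Frequency test over a byte string.
import os
from scipy.special import gammaincc   # regularized upper incomplete gamma (igamc)

def block_frequency_test(data: bytes, M: int = 128) -> float:
    bits = [(byte >> i) & 1 for byte in data for i in range(7, -1, -1)]
    N = len(bits) // M                             # number of complete blocks
    chi_sq = 0.0
    for i in range(N):
        pi_i = sum(bits[i * M:(i + 1) * M]) / M    # proportion of ones in block i
        chi_sq += (pi_i - 0.5) ** 2
    chi_sq *= 4 * M
    return gammaincc(N / 2, chi_sq / 2)            # P-value = igamc(N/2, chi^2/2)

print(f"Block frequency P-value: {block_frequency_test(os.urandom(1024)):.4f}")
```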

3. Runs Test

A "run" is an uninterrupted sequence of identical bits. Tests if the number of runs of different lengths is as expected for a random sequence.

V(n) = Σ r(k) + 1, where r(k) = 0 if xₖ = xₖ₊₁, else 1

P-value = erfc(|V(n) - 2nπ(1-π)| / (2√(2n)π(1-π)))

Detects: Oscillation patterns and clustering of bits.
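
A sketch of the runs test, including the monobit-style prerequisite check that SP 800-22 requires before the runs statistic is meaningful:

```python
# SP 800-22 Runs test over a byte string.
import math
import os

def runs_test(data: bytes) -> float:
    bits = [(byte >> i) & 1 for byte in data for i in range(7, -1, -1)]
    n = len(bits)
    pi = sum(bits) / n
    if abs(pi - 0.5) >= 2 / math.sqrt(n):          # prerequisite frequency check
        return 0.0
    # V(n) = number of runs = 1 + number of bit transitions
    v_obs = 1 + sum(1 for k in range(n - 1) if bits[k] != bits[k + 1])
    num = abs(v_obs - 2 * n * pi * (1 - pi))
    den = 2 * math.sqrt(2 * n) * pi * (1 - pi)
    return math.erfc(num / den)

print(f"Runs test P-value: {runs_test(os.urandom(1024)):.4f}")
```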

4. Longest Run of Ones Test

Determines whether the length of the longest run of ones is consistent with the expected length in a random sequence.

χ² = Σ (vᵢ - N×πᵢ)² / (N×πᵢ)

Where vᵢ is the observed frequency of runs in category i, and πᵢ is the expected probability.

Critical for: Detecting non-randomness in the distribution of long sequences.
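
The sketch below implements the 8-bit-block variant of this test; the category probabilities πᵢ are the SP 800-22 constants for M = 8 (longest run ≤ 1, = 2, = 3, ≥ 4):

```python
# SP 800-22 Longest Run of Ones test, 8-bit block variant (M = 8, K = 3).
import os
from scipy.special import gammaincc

def longest_run_test(data: bytes) -> float:
    M, K = 8, 3
    pi = [0.2148, 0.3672, 0.2305, 0.1875]          # expected category probabilities
    bits = [(byte >> i) & 1 for byte in data for i in range(7, -1, -1)]
    N = len(bits) // M
    v = [0, 0, 0, 0]                               # observed category counts
    for i in range(N):
        longest = run = 0
        for b in bits[i * M:(i + 1) * M]:
            run = run + 1 if b else 0
            longest = max(longest, run)
        v[min(max(longest, 1), 4) - 1] += 1        # bucket into <=1, 2, 3, >=4
    chi_sq = sum((v[i] - N * pi[i]) ** 2 / (N * pi[i]) for i in range(K + 1))
    return gammaincc(K / 2, chi_sq / 2)

print(f"Longest-run P-value: {longest_run_test(os.urandom(1024)):.4f}")
```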

Advanced Entropy Metrics

Chi-Square Test (χ²)

Measures how well the distribution of bytes matches the expected uniform distribution.

χ² = Σ (Oᵢ - Eᵢ)² / Eᵢ

Where Oᵢ is observed frequency of byte value i, Eᵢ is expected frequency (N/256 for uniform distribution).

For 256 byte values with degrees of freedom df = 255:

Expected range: χ² ≈ 255 ± 2√(2×255) ≈ [210, 300]

Critical Values: χ² significantly outside this range indicates non-uniform distribution and poor entropy quality.
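
A direct Python translation of the byte-level chi-square statistic (illustrative, not this page's implementation):

```python
# Chi-square uniformity test over the 256 possible byte values.
import os
from collections import Counter

def chi_square_bytes(data: bytes) -> float:
    n = len(data)
    expected = n / 256                             # E_i for a uniform distribution
    counts = Counter(data)
    return sum((counts.get(v, 0) - expected) ** 2 / expected for v in range(256))

chi_sq = chi_square_bytes(os.urandom(65536))
# With df = 255, values roughly within 255 +/- 2*sqrt(2*255) are unremarkable.
print(f"Chi-square statistic: {chi_sq:.1f}")
```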

Monte Carlo π Estimation

Uses random coordinate pairs to estimate π. Poor randomness yields inaccurate π estimates.

Generate pairs (x, y) where x, y ∈ [0, 1]

If x² + y² ≤ 1, point is inside quarter circle

π ≈ 4 × (points inside circle) / (total points)

Expected: π ≈ 3.14159 ± 0.01 for good entropy. Significant deviation indicates correlation between random values.
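
One possible rendering in Python, consuming the sample as pairs of 16-bit coordinates (the coordinate width is a choice made by this sketch, not prescribed here):

```python
# Monte Carlo pi estimate from a byte sample: interpret the data as (x, y)
# pairs of 16-bit values in [0, 1] and count hits inside the quarter circle.
import os

def monte_carlo_pi(data: bytes) -> float:
    inside = total = 0
    for i in range(0, len(data) - 3, 4):
        x = int.from_bytes(data[i:i + 2], "big") / 65535.0
        y = int.from_bytes(data[i + 2:i + 4], "big") / 65535.0
        inside += (x * x + y * y) <= 1.0
        total += 1
    return 4 * inside / total

print(f"pi estimate: {monte_carlo_pi(os.urandom(200_000)):.5f}")
```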

Serial Correlation Coefficient

Measures the correlation between successive bytes. True randomness should show zero correlation.

r = Σ((xᵢ - x̄)(xᵢ₊₁ - x̄)) / (n×σ²)

Where x̄ is mean value, σ² is variance, n is number of byte pairs.

Expected: r ≈ 0 (typically |r| < 0.01 for good entropy)

Non-zero correlation: Indicates predictability between successive values, a critical weakness.
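
A sketch of the lag-1 serial correlation coefficient over byte values, following the formula above:

```python
# Lag-1 serial correlation between successive bytes; ~0 for good entropy.
import os
import statistics

def serial_correlation(data: bytes) -> float:
    n = len(data) - 1                              # number of (x_i, x_{i+1}) pairs
    mean = statistics.fmean(data)
    var = statistics.pvariance(data)
    if var == 0:
        return 1.0                                 # constant data: fully correlated
    cov = sum((data[i] - mean) * (data[i + 1] - mean) for i in range(n)) / n
    return cov / var

print(f"Serial correlation: {serial_correlation(os.urandom(65536)):+.5f}")
```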

Cryptographic Applications

Key Generation Requirements

High-quality entropy is critical for cryptographic key generation:

  • Symmetric Keys (AES-256): Require full 256 bits of entropy. Any bias enables attacks.
  • RSA Private Keys: Prime generation requires testing hundreds of candidates. Poor entropy leads to weak primes vulnerable to factorization.
  • ECDSA Private Keys: Must be uniformly distributed in [1, n-1] where n is the curve order. Bias enables private key recovery attacks.
  • Post-Quantum Keys: Lattice-based schemes like ML-KEM require large amounts of high-quality entropy for secure parameter generation.
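
To make these requirements concrete, the sketch below draws key material from Python's secrets module, which is backed by the operating system CSPRNG. The secp256k1 group order appears purely as an example curve order:

```python
# OS-entropy-backed key material via the standard-library secrets module.
import secrets

aes_key = secrets.token_bytes(32)                  # 256 bits for AES-256
print("AES-256 key:", aes_key.hex())

# ECDSA-style private scalar: uniform in [1, n-1] for curve order n
# (secp256k1's order shown here as an example value).
curve_order = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
d = 1 + secrets.randbelow(curve_order - 1)         # uniform in [1, n-1]
print("Private scalar:", hex(d))
```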

Nonce and IV Generation

Initialization Vectors (IVs) and nonces must be unique and unpredictable:

  • AES-GCM: 96-bit nonces must NEVER repeat with the same key. Nonce reuse completely breaks authentication.
  • ChaCha20-Poly1305: 96-bit nonces, same uniqueness requirement as GCM.
  • CBC Mode: IV must be unpredictable. Predictable IVs enable chosen-plaintext attacks.

⚠️ Critical: Even with strong encryption algorithms, weak entropy in IVs/nonces can completely compromise security.
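
A minimal sketch of correct nonce handling with the cryptography package's AESGCM interface: a fresh 96-bit nonce is drawn from OS entropy for every message and must never be reused under the same key:

```python
# AES-GCM with a random 96-bit nonce per message (cryptography package).
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)
aead = AESGCM(key)

nonce = os.urandom(12)                             # 96-bit nonce from OS entropy
ciphertext = aead.encrypt(nonce, b"attack at dawn", b"header")
assert aead.decrypt(nonce, ciphertext, b"header") == b"attack at dawn"
```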

Defense in Depth: Entropy Mixing

The "Mixed" mode combines multiple entropy sources using XOR:

Output = Hardware ⊕ System ⊕ Quantum

Security Property: The XOR of independent random sources has entropy at least as high as the entropy of the strongest individual source. This means:

  • If quantum source provides full entropy, output has full entropy
  • If quantum source fails, hardware RNG maintains security
  • Compromise of any single source does not compromise overall security
  • This is a simple form of randomness extraction; related information-theoretic results such as the leftover hash lemma formalize combining imperfect sources

Recommendation: Always use mixed sources for production cryptographic applications.
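
A simple sketch of such an XOR mixer; here all three buffers come from os.urandom as stand-ins, whereas a real deployment would draw them from hardware, system, and quantum sources respectively:

```python
# XOR mixing of equal-length buffers from (ideally independent) sources.
import os

def xor_mix(*sources: bytes) -> bytes:
    length = min(len(s) for s in sources)          # mix only the common length
    out = bytearray(length)
    for src in sources:
        for i in range(length):
            out[i] ^= src[i]
    return bytes(out)

hardware = os.urandom(64)   # stand-in for RDSEED/RDRAND output
system = os.urandom(64)     # stand-in for /dev/urandom output
quantum = os.urandom(64)    # stand-in for quantum RNG output
print(xor_mix(hardware, system, quantum).hex())
```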

Security Considerations

Best Practices for Cryptographic Entropy

  1. Never rely on a single entropy source - Use multiple sources with different failure modes
  2. Validate entropy quality - Run NIST tests on production entropy before use
  3. Continuous monitoring - Implement runtime health checks for entropy sources
  4. Fail securely - If entropy quality drops, halt cryptographic operations rather than continue with weak randomness
  5. Avoid user-space PRNGs - Use OS-provided cryptographic RNGs (like /dev/urandom) that are seeded from hardware
  6. Reseed frequently - Periodically add new entropy even for PRNGs with good initial seeds
  7. Forward secrecy - Ensure compromise of current state doesn't reveal previous outputs

Known Attacks on Weak Entropy

  • PlayStation 3 ECDSA Failure (2010): Sony's implementation used a fixed k value instead of random, enabling private key recovery from two signatures
  • Debian OpenSSL Bug (2008): Removed entropy source left only process ID for seeding, reducing keyspace to ~32,000 values
  • Dual_EC_DRBG Backdoor (2013): NSA-designed RNG with suspected backdoor. Demonstrates importance of verifiable entropy sources
  • Netscape Browser (1990s): Used predictable time-of-day seeding, enabling SSL session key prediction
  • ROCA Vulnerability (2017): Weak RSA prime generation in hardware tokens produced factorable keys

Lesson: Cryptographic entropy is the foundation of security. Weak entropy undermines even mathematically perfect algorithms.

Quality Scoring Methodology

Our quality score (0-10) combines multiple factors:

Score = 0.3×S + 0.2×M + 0.2×C + 0.15×P + 0.15×N

Where:

  • S = Shannon entropy score (H/8.0 × 10; up to 3 points after weighting)
  • M = Min-entropy score (H∞/8.0 × 10; up to 2 points after weighting)
  • C = Chi-square score (inverse deviation from the expected value, scaled to 0-10; up to 2 points)
  • P = Monte Carlo π accuracy (inverse of |π - 3.14159|, scaled to 0-10; up to 1.5 points)
  • N = NIST test pass rate (fraction of tests passing × 10; up to 1.5 points)
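
A sketch of the weighted combination, assuming each component has already been normalized to a 0-10 scale as described above (the exact normalization used by this page may differ):

```python
# Combine per-metric scores (each on a 0-10 scale) into the overall 0-10 score.
def quality_score(shannon: float, min_ent: float, chi_sq: float,
                  pi_acc: float, nist_rate: float) -> float:
    return (0.3 * shannon + 0.2 * min_ent + 0.2 * chi_sq
            + 0.15 * pi_acc + 0.15 * nist_rate)

# Example: strong Shannon/min-entropy, unremarkable chi-square,
# accurate pi estimate, and all NIST tests passing.
print(f"Score: {quality_score(9.9, 9.6, 9.5, 9.8, 10.0):.2f} / 10")
```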

Quality Score Interpretation (Normalized for Sample Size):

  • Excellent (Score ≥ 9.7): Suitable for all cryptographic purposes including key generation, nonces, and IVs.
  • Good (Score ≥ 9.3): Acceptable for most cryptographic applications. Quality is sufficient for production use.
  • Fair (Score ≥ 8.8): Borderline quality. Investigate source and consider mixing with additional entropy before production use.
  • Poor (Score < 8.8): DO NOT USE for cryptographic purposes. Insufficient randomness.

Note: Scores are normalized for sample size. Small samples (e.g. 256 bytes) cannot reach a measured Shannon entropy of 8.0 even with perfectly random data, because the empirical byte histogram of a small sample inevitably contains collisions and gaps (a birthday-problem effect). The score measures how close your entropy is to the expected maximum for the given sample size.

Quick Reference: Entropy Sources

Hardware RNG (RDRAND/RDSEED)

Modern CPUs include hardware random number generators that use thermal noise and other physical phenomena to generate true random numbers. RDSEED provides higher quality entropy than RDRAND.

  • RDRAND: Fast random number generation
  • RDSEED: Seed-quality entropy for RNGs
  • /dev/hwrng: Kernel hardware RNG device

Quantum Random (qrandom.io)

Australian National University provides free quantum random numbers generated by measuring the quantum fluctuations of the vacuum. This is true quantum randomness, not pseudo-random.

  • Based on quantum vacuum fluctuations
  • True quantum randomness
  • Published research and verification

System Entropy (/dev/urandom)

The Linux kernel's cryptographically secure pseudo-random number generator. It combines multiple entropy sources including hardware interrupts, disk I/O timing, and network activity.

  • Cryptographically secure PRNG
  • Multiple entropy sources
  • Never blocks (unlike /dev/random)

Mixed Sources

The highest quality option combines all available entropy sources using XOR mixing. This provides defense-in-depth - even if one source is compromised, the overall entropy remains secure.

  • XOR combination of all sources
  • Defense against source compromise
  • Maximum entropy quality