
Foggy Maps

Understanding Error in MLWE


The Simple Story

Imagine Bob draws a perfect map showing his secret location.

Problem: If Eve gets the map, she goes right to the secret!

Bob's solution: Fog the map!

Add "dots scattered randomly" all over the map.

Eve sees: Thousands of dots everywhere... which one is the secret?

But Bob knows: His secret location! He can follow his own path to find the correct dot, ignoring the fog.

That's why adding error (noise) makes MLWE hard for Eve but still usable for Bob!


Mental Model

Hold this picture in your head:

Clear map (easy for Eve):

• • • • •
• • ★ • •   ← Secret location!
• • • • •
• • • • •

Foggy map (hard for Eve):

• • ? • •
• • • • •
• • ? • •
• • • • •
? ← Which one?

But Bob knows: "Start at my secret location (I drew it!), follow the path to nearest fogged result... there it is!"

Think of it like:

Adding pepper (add noise = harder to see but still there)

Fog makes it hard to see

Bob has the key = finds the real secret!


See It Happen

Let's see how Bob and Eve differ:
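Here's a minimal Python sketch of the difference. This is not real ML-KEM: the modulus q = 97, the dimension n = 4, and the plain matrix/vector arithmetic are toy choices made up purely for illustration.

```python
import random

# Toy MLWE-style example -- NOT real ML-KEM (tiny sizes, toy modulus).
q = 97   # made-up small modulus for illustration
n = 4    # made-up tiny dimension (ML-KEM works with degree-256 polynomials)

# Public randomness and Bob's small secrets
A = [[random.randrange(q) for _ in range(n)] for _ in range(n)]   # public matrix
s = [random.choice([-2, -1, 0, 1, 2]) for _ in range(n)]          # small secret vector
e = [random.choice([-2, -1, 0, 1, 2]) for _ in range(n)]          # small error ("fog")

# Public result: t = A*s + e (mod q)
t = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(n)]

# Eve's view: A and t only -- she must untangle s and e from one equation.
print("Eve sees A =", A)
print("Eve sees t =", t)

# Bob's view: he knows s, so he strips the fog: e = t - A*s (mod q)
lifted = [(t[i] - sum(A[i][j] * s[j] for j in range(n))) % q for i in range(n)]
recovered_e = [v - q if v > q // 2 else v for v in lifted]   # map back to -2..2
print("Bob recovers e =", recovered_e, "| matches:", recovered_e == e)
```

Eve holds A and t but faces two unknowns at once; Bob holds s, so one subtraction strips the fog.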


Try It Yourself

Question 1: Bob's secret is at position (5, 3). He sends the foggy result (5, 7). If Eve only knows that the error is at most 4 in either direction, what's her dilemma?

Show Answer

Eve sees result: (5, 7)

She thinks: "Secret + Error ≈ (5, 7)"

If the error can be anything from -4 to +4, the secret could be:

  • (5, 3) with error = +4
  • (5, 4) with error = +3
  • (5, 5) with error = +2
  • (5, 6) with error = +1
  • (5, 7) with error = 0
  • (5, 8) with error = -1
  • (5, 9) with error = -2
  • ... and so on, out to (5, 11) with error = -4

But she doesn't know: Which one is the actual secret?

Many possibilities!


Question 2: If Eve knows the error is always one of the values -2, -1, 0, 1, or 2, which values are candidates for the secret when the result is 7?

Show Answer

Equation: Secret + Error = 7

Error must be between -2 and +2

So:

  • If error = -2, secret = 9
  • If error = -1, secret = 8
  • If error = 0, secret = 7
  • If error = +1, secret = 6
  • If error = +2, secret = 5

Possible secrets: 5, 6, 7, 8, 9 (5 possibilities)

Still hard: which one is the real secret?
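Question 2's candidate search is only a few lines of code; this small Python sketch (variable names are just for illustration) enumerates it:

```python
# Question 2 in code: Secret + Error = 7, with Error drawn from {-2, -1, 0, 1, 2}
result = 7
error_values = [-2, -1, 0, 1, 2]

candidates = sorted(result - err for err in error_values)
print("Possible secrets:", candidates)   # [5, 6, 7, 8, 9]
```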


Question 3: Why can't Eve just try all possibilities? What if she finds the right one by luck?

Show Answer

Eve could try:

  • 5 → check if reasonable
  • 6 → check
  • 7 → check
  • 8 → check
  • 9 → check

That's only 5 possibilities... easy, right?

Wait! This is simplified!

In real ML-KEM:

  • The error isn't a single number: it's a polynomial with 256 coefficients
  • The secret has 256 coefficients too
  • Each error coefficient can take one of 5 values (-2 to +2)

Total possibilities: 5^256 ≈ 10^178

That's far more than the number of atoms in the observable universe (roughly 10^80)!

So yes, finding by luck is basically impossible!
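You can check the 5^256 figure yourself with exact integer arithmetic; a quick Python sketch:

```python
# Five possible error values per coefficient, 256 coefficients
total = 5 ** 256
print(len(str(total)) - 1)   # prints 178 -> total is on the order of 10^178
```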


The Math

Error Distribution: Centered Binomial

In ML-KEM, error follows a specific distribution:

$\mathbf{e} \leftarrow \text{CenteredBinomial}(\eta)$

For $\eta = 2$ (ML-KEM's parameter):

Possible error values: -2, -1, 0, 1, 2

Probabilities:

  • P(e = -2) = 1/16
  • P(e = -1) = 4/16
  • P(e = 0) = 6/16 (most common!)
  • P(e = +1) = 4/16
  • P(e = +2) = 1/16

Properties:

  • Centered = the mean is 0
  • Small = errors stay manageable
  • Non-zero = prevents Eve from solving with plain linear algebra!
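The standard way to sample a centered binomial value is to take $\eta$ random bits minus $\eta$ more random bits. This short Python sketch is a quick empirical check of the table above (it is not ML-KEM's bit-packed sampler):

```python
import random
from collections import Counter

def cbd(eta=2):
    """Centered binomial sample: (sum of eta random bits) - (sum of eta random bits)."""
    return sum(random.getrandbits(1) for _ in range(eta)) - \
           sum(random.getrandbits(1) for _ in range(eta))

N = 1_000_000
counts = Counter(cbd() for _ in range(N))
for value in (-2, -1, 0, 1, 2):
    print(f"P(e = {value:+d}) ≈ {counts[value] / N:.4f}")
# Expected: 0.0625, 0.2500, 0.3750, 0.2500, 0.0625  (1/16, 4/16, 6/16, 4/16, 1/16)
```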

The Noise Equation

With error, the MLWE equation:

$\mathbf{A}\mathbf{s} + \mathbf{e} = \mathbf{t}$

Eve sees: $\mathbf{A}$ and $\mathbf{t}$

Eve wants: $\mathbf{s}$ AND $\mathbf{e}$

Problem: Two unknowns ($\mathbf{s}$ and $\mathbf{e}$), one equation!

Decryption Success: Why It Works

Bob can decrypt because:

$\mathbf{t} = \mathbf{A}\mathbf{s} + \mathbf{e}$

Bob knows $\mathbf{s}$ (his secret), so he can compute:

$\mathbf{e} = \mathbf{t} - \mathbf{A}\mathbf{s}$

That recovers $\mathbf{e}$, and from it the original message!

The key insight: Errors are SMALL, so Bob can "subtract out" the noise!
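In actual ML-KEM, a message bit is encoded as 0 or roughly q/2 before the noise is added, and decryption rounds the leftover noise away. Here's a minimal Python sketch of that rounding step; q = 3329 is ML-KEM's real modulus, while the specific error values tried are arbitrary examples.

```python
q = 3329          # ML-KEM's actual modulus
half_q = q // 2   # a message bit is encoded as 0 or (roughly) q/2

def decode(value):
    """Round a noisy value back to the nearest of {0, q/2} and return the bit."""
    value %= q
    return 1 if q // 4 <= value < 3 * q // 4 else 0

# Bob's view after subtracting A*s: the encoded bit plus a small leftover error.
for bit in (0, 1):
    for err in (-5, 0, 7):                    # a few small example noise values
        noisy = (bit * half_q + err) % q
        assert decode(noisy) == bit           # small noise gets rounded away
print("Small errors never flip the bit; only noise near q/4 could.")
```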


Why We Care

Error Makes MLWE Hard

Without error: $\mathbf{A}\mathbf{s} = \mathbf{t}$

Eve can solve: $\mathbf{s} = \mathbf{A}^{-1}\mathbf{t}$ (linear algebra)

With error: $\mathbf{A}\mathbf{s} + \mathbf{e} = \mathbf{t}$

Eve must solve: find $\mathbf{s}$ and $\mathbf{e}$ (two unknowns!)

This is equivalent to: finding the closest lattice point, a hard lattice problem closely related to the Shortest Vector Problem (SVP)

These lattice problems are hard!
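To see the "no error vs. error" contrast concretely, here's a toy 2×2 Python sketch over a made-up prime modulus (q = 97 and the specific numbers are illustrative only): without noise, inverting $\mathbf{A}$ recovers the secret exactly; with a tiny bit of noise, the same linear algebra lands somewhere unrelated.

```python
# Toy 2x2 demo mod a small prime q: linear algebra works without error, fails with it.
q = 97
A = [[3, 5],
     [7, 2]]
s = [4, 9]      # Bob's secret
e = [1, -2]     # small error

def matvec(M, v):
    return [(M[0][0] * v[0] + M[0][1] * v[1]) % q,
            (M[1][0] * v[0] + M[1][1] * v[1]) % q]

def solve2x2(M, t):
    """Invert a 2x2 matrix mod q (adjugate / determinant) and apply it to t."""
    det = (M[0][0] * M[1][1] - M[0][1] * M[1][0]) % q
    det_inv = pow(det, -1, q)                     # modular inverse (q is prime)
    inv = [[ M[1][1] * det_inv % q, -M[0][1] * det_inv % q],
           [-M[1][0] * det_inv % q,  M[0][0] * det_inv % q]]
    return matvec(inv, t)

t_clean = matvec(A, s)                                    # no error
t_noisy = [(t_clean[i] + e[i]) % q for i in range(2)]     # with error

print("No error:   Eve recovers", solve2x2(A, t_clean))   # [4, 9] -> the real secret
print("With error: Eve recovers", solve2x2(A, t_noisy))   # a wrong, unrelated-looking vector
```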

Error Keeps MLWE Usable

If the error were too large: $\mathbf{e}$ = huge random values

Then Bob couldn't decrypt, because the noise would overwhelm the signal!

If the error were too small: $\mathbf{e} \approx \mathbf{0}$

Then Eve could just use linear algebra (guessing $\mathbf{e} \approx \mathbf{0}$)!

Just right ($\eta = 2$):

  • Large enough to confuse Eve (no linear algebra)
  • Small enough for Bob to decode (errors cancel out)

Decryption Failure Probability

When Bob decrypts, he computes:

$\mathbf{m} = \text{Decode}(\mathbf{t} - \mathbf{A}\mathbf{s}) = \text{Decode}(\mathbf{e})$

Sometimes $\mathbf{e}$ is too large and Bob gets it wrong.

Probability of failure: approximately $2^{-139}$, or about 1 in $10^{42}$

That's basically never! (like winning the lottery about five times in a row)
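A quick way to sanity-check that conversion from a power of 2 to a power of 10:

```python
from math import log10

# 2^-139 as a power of ten: about 10^-41.8, i.e. roughly 1 in 10^42
print(-139 * log10(2))   # -41.84...
```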


Quick Check

Can you explain error in MLWE to a 5-year-old?

Try saying this out loud:

"Bob wants to hide his secret location. He draws a map, foggs it up with random dots, and sends the foggy map. Even though Eve sees the foggy map, there are too many dots - she can't tell which one is the secret! But Bob knows where the secret was, so he can find it."

Can you calculate error probabilities?

Try these examples:

Error distribution: [-2, -1, 0, 1, 2] with probabilities [1/16, 4/16, 6/16, 4/16, 1/16]

Question: What's the probability of exactly 0?

Answer: P(e = 0) = 6/16 = 3/8 = 37.5%

Question: What's the probability of error being non-zero?

Answer: 1 - P(e = 0) = 1 - 6/16 = 10/16 = 5/8 = 62.5%


Key Takeaways

  • Error = Fog = random noise added to confuse Eve
  • Distribution = centered binomial with $\eta = 2$ for ML-KEM
  • Possible values = -2, -1, 0, 1, 2 (probabilities 1/16, 4/16, 6/16, 4/16, 1/16)
  • Makes it hard = too many $(\mathbf{s}, \mathbf{e})$ pairs are possible
  • Keeps it usable = Bob can still decrypt (small errors!)
  • Failure probability ≈ $2^{-139}$ (basically never)
  • Trade-off = $\eta = 2$ is "just right!" (small enough for Bob, large enough for security)