🌫️ Foggy Maps
Understanding Error in MLWE
🎯 The Simple Story
Imagine Bob draws a perfect map showing his secret location.
Problem: If Eve gets the map, she goes right to the secret!
Bob's solution: Fog the map!
Add "dots scattered randomly" all over the map.
Eve sees: Thousands of dots everywhere... which one is the secret?
But Bob knows: His secret location! He can follow his own path to find the correct dot, ignoring the fog.
That's why adding error (noise) makes MLWE hard for Eve but still usable for Bob!
🧠 Mental Model
Hold this picture in your head:
Clear map (easy for Eve):
⭐ ← Secret location!
• • • • •
• • • • •
• • • • •
• • • • •
Foggy map (hard for Eve):
• •⭐? • •
• • • • •
• •⭐? • •
• • • • •
⭐? ← Which one?
But Bob knows: "Start at my secret location (I drew it!), follow the path to the nearest fogged result... there it is!"
Think of it like:
🌶️ Adding pepper (add noise = harder to see but still there)
🎯 Disguise (fog = Eve is confused, Bob isn't!)
🔑 Correct path (Bob has the key = finds the real secret!)
👀 See It Happen
Let's see how Bob and Eve differ:
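Here's a minimal Python sketch of the idea, using tiny toy numbers I picked for illustration (not real ML-KEM parameters). Bob strips the fog off because he knows the secret, while from Eve's side several different secrets are all consistent with some small error:

```python
# Toy, one-dimensional "foggy map" (illustration only -- not real ML-KEM).
import random

q = 17                                   # tiny modulus for the toy example
a = 7                                    # public value (the "map")
s = 5                                    # Bob's secret
e = random.choice([-2, -1, 0, 1, 2])     # small random fog
b = (a * s + e) % q                      # the foggy value everyone sees

def center(x):                           # map 0..q-1 to the range -q/2..q/2
    return x - q if x > q // 2 else x

# Bob knows s, so subtracting a*s leaves only the small fog:
print("Bob recovers the fog:", center((b - a * s) % q), "= e =", e)

# Eve only sees (a, b). Every guess for the secret is "explained" by SOME
# error -- she also has to figure out which errors are plausibly small.
for guess in range(q):
    implied = center((b - a * guess) % q)
    tag = "  <-- consistent with small fog" if abs(implied) <= 2 else ""
    print(f"secret guess {guess:2d} needs error {implied:+3d}{tag}")
```

Run it a few times: Bob always recovers the fog exactly, while Eve always sees several guesses that fit a small error.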
🎮 Try It Yourself
Question 1: Bob's secret is at position (5, 3). He sends foggy result at (5, 7). If Eve knows there's an error of about 4, what's her dilemma?
Answer:
Eve sees result: (5, 7)
She thinks: "Secret + Error ≈ (5, 7)"
If error ≈ 4, secret could be:
- (5, 3) with error = 4
- (5, 4) with error = 3
- (5, 5) with error = 2
- (5, 6) with error = 1
- (5, 7) with error = 0
- (5, 8) with error = 1
- (5, 9) with error = 2
- etc.
But she doesn't know: Which one is the actual secret?
Many possibilities! 😰
Question 2: If Eve knows error is always one of the values -2, -1, 0, 1, or 2, which positions are candidates for secret "5" (with result "7")?
Answer:
Equation: Secret + Error = 7
Error must be between -2 and +2
So:
- If error = -2, secret = 9
- If error = -1, secret = 8
- If error = 0, secret = 7
- If error = +1, secret = 6
- If error = +2, secret = 5
Possible secrets: 5, 6, 7, 8, 9 (5 possibilities)
Still hard: which one is the real secret?
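The same enumeration in a couple of lines of Python (this just restates the arithmetic above):

```python
# Every secret that could explain the observed result 7
# when the error is known to lie in [-2, 2].
result = 7
candidates = [result - error for error in range(-2, 3)]
print(sorted(candidates))   # [5, 6, 7, 8, 9] -- five possibilities, one secret
```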
Question 3: Why can't Eve just try all possibilities? What if she finds the right one by luck?
Answer:
Eve could try:
- 5 → check if reasonable
- 6 → check
- 7 → check
- 8 → check
- 9 → check
That's only 5 possibilities... easy, right?
Wait! This is simplified!
In real ML-KEM:
- The error isn't a single small number: it's a polynomial with 256 coefficients
- The secret also has 256 coefficients
- Each error coefficient can take one of 5 values (-2 to +2)
Total possibilities: 5^256 ≈ 10^179
That's more than the number of atoms in the observable universe!
So yes, finding by luck is basically impossible!
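If you want to see the size of that number yourself, here is a quick Python check (it just redoes the counting from the answer above, using the simplified 5-values-per-coefficient picture):

```python
# Count the error combinations across 256 coefficients, 5 values each.
from math import log10

total = 5 ** 256                         # Python handles big integers natively
print(f"5^256 has {len(str(total))} digits")        # 179 digits
print(f"5^256 is about 10^{256 * log10(5):.1f}")    # ~10^178.9
```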
🔢 The Math
Error Distribution: Centered Binomial
In ML-KEM, error follows a specific distribution:
For η = 2 (ML-KEM's error parameter):
Possible error values: -2, -1, 0, 1, 2
Probabilities:
- P(e = -2) = 1/16
- P(e = -1) = 4/16
- P(e = 0) = 6/16 (most common!)
- P(e = +1) = 4/16
- P(e = +2) = 1/16
Properties:
- Centered = Mean = 0
- Small = Keeps errors manageable
- Non-zero = Prevents solving by plain linear algebra!
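Here's a small Python sketch of how such a distribution is typically sampled: two coin flips minus two coin flips, then an empirical check that the frequencies match the table above. (Real ML-KEM derives these bits deterministically from a seed rather than from Python's `random` module.)

```python
# Centered binomial distribution for eta = 2:
#   e = (bit1 + bit2) - (bit3 + bit4)   with independent fair bits.
import random
from collections import Counter

def cbd_eta2() -> int:
    return sum(random.getrandbits(1) for _ in range(2)) - \
           sum(random.getrandbits(1) for _ in range(2))

N = 160_000
counts = Counter(cbd_eta2() for _ in range(N))
expected = {-2: 1/16, -1: 4/16, 0: 6/16, 1: 4/16, 2: 1/16}
for value in sorted(expected):
    print(f"e = {value:+d}: observed {counts[value] / N:.3f}, "
          f"expected {expected[value]:.3f}")
```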
The Noise Equation
With error, the MLWE equation is:
b = A·s + e
Eve sees: A and b
Eve wants: s AND e
Problem: Two unknowns (s and e), one equation!
Decryption Success: Why It Works
Bob can decrypt because:
Bob knows s (his secret), so he can compute:
(what he received) − (the part that depends on s) = message + small error
And then round away the small error to recover the original message!
The key insight: Errors are SMALL, so Bob can "subtract out" the noise!
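A toy sketch of that last step in Python: one message bit, encoded about q/2 away from zero as the text describes, then decoded by rounding. The modulus 3329 is ML-KEM's real q; everything else here is simplified for illustration.

```python
# Small errors can't flip a message bit that is encoded ~q/2 away from zero.
q = 3329                            # ML-KEM's modulus
half = q // 2

def encode(bit: int) -> int:        # 0 -> 0,  1 -> ~q/2
    return (bit * half) % q

def decode(value: int) -> int:      # round to the nearer of {0, q/2}
    return 1 if q // 4 < value < 3 * q // 4 else 0

noise = 37                          # a small accumulated error
for bit in (0, 1):
    received = (encode(bit) + noise) % q
    print(f"bit {bit}: sent {encode(bit)}, received {received}, "
          f"decoded {decode(received)}")
```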
💡 Why We Care
Error Makes MLWE Hard
Without error: b = A·s
Eve can solve: s = A^-1·b (plain linear algebra)
With error: b = A·s + e
Eve must solve: find s AND e (two unknowns!)
This is equivalent to: finding the closest lattice point (a hard lattice problem closely related to SVP)
Lattice problems like this are believed to be very hard!
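To see the difference concretely, here is a tiny 2×2 system mod q in Python (a toy sketch with numbers I chose for illustration; real MLWE uses module and polynomial arithmetic, not a 2×2 integer matrix). Without error, inverting A hands Eve the secret; with even a small error, the same formula lands on a completely different vector.

```python
# Toy 2x2 system mod q: linear algebra works without error, fails with it.
q = 97
A = [[3, 7], [5, 11]]
s = [42, 13]                                  # the secret Eve wants

def matvec(M, v):
    return [(M[0][0]*v[0] + M[0][1]*v[1]) % q,
            (M[1][0]*v[0] + M[1][1]*v[1]) % q]

def solve(M, b):                              # invert a 2x2 matrix mod q
    det_inv = pow(M[0][0]*M[1][1] - M[0][1]*M[1][0], -1, q)  # Python 3.8+
    return [det_inv * ( M[1][1]*b[0] - M[0][1]*b[1]) % q,
            det_inv * (-M[1][0]*b[0] + M[0][0]*b[1]) % q]

b_clean = matvec(A, s)                                       # b = A*s
b_noisy = [(x + e) % q for x, e in zip(b_clean, [1, -2])]    # b = A*s + e

print("without error:", solve(A, b_clean))    # [42, 13] -- Eve wins
print("with error:   ", solve(A, b_noisy))    # a far-off vector -- Eve is lost
```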
Error Keeps MLWE Usable
If the error were too large:
Then Bob couldn't decrypt, because the noise would overwhelm the signal!
If the error were too small (or zero):
Then Eve could just use linear algebra (guessing the few possible values of e)!
Just right (η = 2):
- Large enough to confuse Eve (no linear-algebra shortcut)
- Small enough for Bob to decode (the errors can be rounded away)
Decryption Failure Probability
When Bob decrypts, he computes:
message + (total accumulated error)
Sometimes that accumulated error is too large, and Bob decodes the wrong message.
Probability of failure: approximately 2^-139, about 1 in 10^42
That's basically never! (like winning the lottery five times in a row)
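To put that number in perspective, a quick back-of-the-envelope check in Python (the decryption rate and time span below are illustrative assumptions, not measurements):

```python
# How rare is a 2^-139 decryption failure?
from math import log10

p_fail = 2.0 ** -139
print(f"2^-139 is about 10^-{139 * log10(2):.1f}")   # ~10^-41.8

# Suppose you decrypted a billion ciphertexts per second for a billion years:
attempts = 1e9 * 60 * 60 * 24 * 365.25 * 1e9
print(f"attempts: {attempts:.2e}, expected failures: {attempts * p_fail:.2e}")
```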
✅ Quick Check
Can you explain error in MLWE to a 5-year-old?
Try saying this out loud:
"Bob wants to hide his secret location. He draws a map, foggs it up with random dots, and sends the foggy map. Even though Eve sees the foggy map, there are too many dots - she can't tell which one is the secret! But Bob knows where the secret was, so he can find it."
Can you calculate error probabilities?
Try these examples:
Error distribution: [-2, -1, 0, 1, 2] with probabilities [1/16, 4/16, 6/16, 4/16, 1/16]
Question: What's the probability of exactly 0?
Answer: P(e = 0) = 6/16 = 3/8 = 37.5%
Question: What's the probability of error being non-zero?
Answer: 1 - P(e = 0) = 1 - 6/16 = 10/16 = 5/8 = 62.5%
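The same two answers, checked in a couple of lines of Python:

```python
# Check both probability questions directly from the distribution table.
probs = {-2: 1/16, -1: 4/16, 0: 6/16, 1: 4/16, 2: 1/16}
print("P(e = 0)  =", probs[0])                                      # 0.375
print("P(e != 0) =", sum(p for e, p in probs.items() if e != 0))    # 0.625
```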
📝 Key Takeaways
✅ Error = Fog = Random noise added to confuse Eve
✅ Distribution = Centered binomial with η = 2 for ML-KEM
✅ Possible values = -2, -1, 0, 1, 2 (probabilities [1/16, 4/16, 6/16, 4/16, 1/16])
✅ Makes it hard = Too many possible (s, e) pairs
✅ Keeps it usable = Bob can still decrypt (small errors!)
✅ Failure probability ≈ 2^-139 (basically never)
✅ Trade-off = η = 2 is "just right" (small enough for Bob, large enough for security)
📚 What You'll Learn Next
Now you understand error in MLWE! Next, we'll see the official puzzle notation:
🔢 The MLWE Equation: The Official Problem
The mathematical statement of the hard problem behind ML-KEM!