Merged
2 changes: 2 additions & 0 deletions README.md
@@ -155,6 +155,8 @@ binding = H(SLSS_pk || TDD_pk || EGRW_pk)

All three shares are required to recover the secret, and the binding ensures the three problem instances are cryptographically linked.
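For illustration only, a minimal sketch of how such a binding could be computed, assuming H is SHA3-256 (consistent with the 32-byte binding field) and that the three component public keys are already serialized; the helper name is hypothetical, not the library's API:

```typescript
import { createHash } from 'node:crypto'

// Hypothetical sketch: bind the three serialized component public keys into one
// 32-byte digest. The concrete hash (assumed SHA3-256 here) and the serialization
// format are the library's concern; this only shows the shape of the computation.
function computeBinding(slssPk: Uint8Array, tddPk: Uint8Array, egrwPk: Uint8Array): Uint8Array {
  return createHash('sha3-256')
    .update(slssPk)
    .update(tddPk)
    .update(egrwPk)
    .digest()
}
```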

> ⚠️ Implementation note: the library prefers native SHAKE256 (XOF) support. If the runtime lacks native SHAKE256, kMOSAIC falls back to a counter-mode SHA3-256 construction, which may not provide the same security margins as a native XOF. For production deployments, ensure your runtime provides native SHAKE256.
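As a rough illustration of what a counter-mode construction of this shape looks like (this is not the library's actual fallback; the block layout and counter encoding are assumptions):

```typescript
import { createHash } from 'node:crypto'

// Illustrative only: emulate variable-length output by concatenating
// SHA3-256(seed || counter) blocks and truncating to the requested length.
function sha3CounterXof(seed: Uint8Array, outLen: number): Uint8Array {
  const out = new Uint8Array(outLen)
  let offset = 0
  let counter = 0
  while (offset < outLen) {
    const ctr = new Uint8Array(4)
    new DataView(ctr.buffer).setUint32(0, counter++, false) // big-endian block counter
    const block = createHash('sha3-256').update(seed).update(ctr).digest()
    const take = Math.min(block.length, outLen - offset)
    out.set(block.subarray(0, take), offset)
    offset += take
  }
  return out
}
```

Unlike a true XOF, the soundness of this kind of construction rests on heuristic arguments about SHA3-256 in counter mode, which is the gap the note above refers to.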

### Hard Problems

#### SLSS (Sparse Lattice Subset Sum)
49 changes: 41 additions & 8 deletions SECURITY_REPORT.md
@@ -181,6 +181,27 @@ while (idx < n) {

This eliminates statistical bias by rejecting values that would cause modular reduction bias.
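A minimal sketch of the rejection-sampling idea (the function name, modulus handling, and randomness source are illustrative, not the library's TDD sampling code):

```typescript
// Draw n values uniformly in [0, q) without modulo bias: reject any 32-bit draw
// at or above the largest multiple of q below 2^32, then reduce the accepted draw.
// Assumes q is a positive integer below 2^31 so results fit in an Int32Array.
function sampleUniformModQ(q: number, n: number, randomU32: () => number): Int32Array {
  const out = new Int32Array(n)
  const limit = Math.floor(2 ** 32 / q) * q // accept region is [0, limit)
  let idx = 0
  while (idx < n) {
    const r = randomU32()
    if (r < limit) {
      out[idx++] = r % q // exactly uniform over [0, q)
    }
    // otherwise reject and draw again
  }
  return out
}
```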

### VULN-014: Decapsulation throws on malformed ciphertext (implicit oracle)

**File:** `src/kem/index.ts`
**Lines:** 360-420 (approx)
**Status:** ✅ **FIXED**

#### Description

Certain malformed or corrupted ciphertexts (for example, a truncated NIZK proof or malformed fragment lengths) could cause `decapsulate()` to throw exceptions or exhibit distinguishable behavior. This could be used as a decryption oracle by an attacker to learn about ciphertext validity.

#### Fix Applied

- Compute the **implicit rejection value** early from the raw ciphertext bytes and use it as the default return value on any validation failure.
- Wrap critical parsing and verification steps in try/catch blocks: serialization, component decryption (SLSS/TDD/EGRW), NIZK deserialization and verification, and re-encapsulation. Any failure marks decapsulation as invalid but does not throw.
- Normalize share lengths (expect 32-byte shares) and use zeroed fallbacks to avoid reconstruction exceptions.
- Replace direct ciphertext byte comparison with fixed-length SHA3-256 hash comparisons to avoid leaks from variable-length ciphertexts.
- Add a public key consistency check: `sha3_256(serializePublicKey(publicKey)) === secretKey.publicKeyHash`; treat mismatches as invalid decapsulation.
- Add unit tests exercising tampering and malformed inputs: `test/kem-malformed.test.ts`.

These changes ensure `decapsulate()` always returns a 32-byte pseudorandom secret (implicit reject) on invalid input, preventing oracle-style leakage.
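The hash-compare step in isolation looks roughly like the following (a standalone sketch using Node primitives rather than the library's `sha3_256`/`constantTimeEqual` helpers):

```typescript
import { createHash, timingSafeEqual } from 'node:crypto'

// Compare two ciphertexts via fixed-length SHA3-256 digests, so neither the
// comparison cost nor the compared lengths depend on the ciphertext length.
function ciphertextsMatch(a: Uint8Array, b: Uint8Array): boolean {
  const ha = createHash('sha3-256').update(a).digest()
  const hb = createHash('sha3-256').update(b).digest()
  return timingSafeEqual(ha, hb) // both digests are always 32 bytes
}
```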

---

### VULN-005: Potential Integer Precision Issues
@@ -257,17 +278,21 @@ JavaScript's garbage collector may copy buffer contents during compaction. The `

**File:** `src/utils/shake.ts`
**Lines:** 82-100
**Status:** 🟡 ACKNOWLEDGED
**Status:** ✅ MITIGATED

#### Description

The counter-mode SHA3-256 fallback is not a proven XOF construction. While it is unlikely to be used on Node.js/Bun, its security properties are unverified.

#### Mitigation
#### Mitigation / Fix Applied

- Native SHAKE256 is available in all target environments (Node.js 18+, Bun)
- Fallback only triggers in edge cases
- Consider adding a warning log when the fallback is used
- Added `isNativeShake256Available()` helper to allow application code to detect and enforce native SHAKE256 availability.
- Added an explicit README note advising production deployments to use native SHAKE256 or a runtime that supports it.
- Fallback continues to exist for compatibility, but the above mitigations reduce the risk and make it visible to operators.

#### Recommendation

For highest assurance, consider adding a configuration flag that causes startup to fail when native SHAKE256 is unavailable.
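One possible shape for such a guard, assuming the `isNativeShake256Available()` helper described above (the import path and the `requireNativeShake256` option name are hypothetical):

```typescript
import { isNativeShake256Available } from './utils/shake' // path assumed

interface StartupOptions {
  requireNativeShake256?: boolean // hypothetical flag; permissive by default
}

// Fail closed at startup instead of silently running on the counter-mode fallback.
function assertShakeSupport(opts: StartupOptions = {}): void {
  if (opts.requireNativeShake256 && !isNativeShake256Available()) {
    throw new Error('Native SHAKE256 is required but not available in this runtime')
  }
}
```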

---

@@ -370,6 +395,7 @@ Generator cache creates timing differences between cache hits and misses, potent
| VULN-001 | TDD plaintext storage | ✅ FIXED | XOR encryption with masked-matrix keystream |
| VULN-002 | EGRW randomness leak | ✅ FIXED | Ephemeral walk vertex derivation |
| VULN-004 | Modular bias | ✅ FIXED | Rejection sampling in TDD |
| VULN-014 | Decapsulation oracle | ✅ FIXED | Safe parsing, implicit-reject, hash-compare |

### Acknowledged Limitations

@@ -397,9 +423,16 @@ Generator cache creates timing differences between cache hits and misses, potent

The kMOSAIC implementation has been assessed and critical security issues have been remediated:

1. **VULN-001 (TDD Plaintext):** Now uses XOR encryption with keystream derived from the masked tensor matrix
2. **VULN-002 (EGRW Randomness):** Randomness no longer exposed; ephemeral walk vertex used instead
3. **VULN-004 (Modular Bias):** Rejection sampling now ensures uniform distribution
1. **VULN-001 (TDD Plaintext):** Now uses XOR encryption with keystream derived from the masked tensor matrix
2. **VULN-002 (EGRW randomness exposure):** Now derives ciphertext endpoints from ephemeral walks and does not expose randomness
3. **VULN-004 (Modular bias):** Rejection sampling implemented in TDD sampling
4. **VULN-014 (Decapsulation oracle):** Decapsulation hardened to return implicit-reject values on malformed or tampered ciphertexts; added unit tests to verify behavior

Additional improvements:

- Added `isNativeShake256Available()` and README guidance to make SHAKE256 availability explicit for production deployments.
- Added robust unit tests for malformed/corrupted ciphertext handling: `test/kem-malformed.test.ts` (proof tampering, malformed fragments, truncated ciphertexts, publicKey mismatch).

Overall, the most critical issues have been remediated and the codebase now includes tests that guard against malformed ciphertext behavior and oracle leakage. Continuous monitoring and peer review are recommended for the remaining acknowledged limitations (timing, zeroization limits, and JS runtime concerns).

The remaining acknowledged items are primarily JavaScript runtime limitations that are well-documented in the code and do not constitute exploitable vulnerabilities in typical deployment scenarios.

174 changes: 147 additions & 27 deletions src/kem/index.ts
@@ -365,41 +365,108 @@ export async function decapsulate(

// Compute implicit rejection value first (constant-time protection)
// This is returned if any validation fails
const ciphertextBytes = serializeCiphertext(ciphertext)
let ciphertextBytes: Uint8Array
try {
ciphertextBytes = serializeCiphertext(ciphertext)
} catch {
// Malformed ciphertext serialization — use empty buffer and mark invalid
ciphertextBytes = new Uint8Array(0)
}

const implicitRejectSecret = shake256(
hashWithDomain(DOMAIN_IMPLICIT_REJECT, hashConcat(seed, ciphertextBytes)),
32,
)

let validDecapsulation = 1 // 1 = valid, 0 = invalid

// Decrypt each fragment
const share1 = slssDecrypt(c1, slssSK, params.slss)
const share2 = tddDecrypt(c2, tddSK, params.tdd)
const share3 = egrwDecrypt(c3, egrwSK, egrwPK, params.egrw)
// Quick sanity: ensure public key matches secret key's recorded hash
try {
const pkHash = sha3_256(serializePublicKey(publicKey))
if (!constantTimeEqual(pkHash, publicKeyHash)) {
validDecapsulation = 0
}
} catch {
// If serialization of public key fails, mark invalid but continue
validDecapsulation = 0
}

// Reconstruct ephemeral secret
const recoveredSecret = secretReconstruct([share1, share2, share3])
// Decrypt each fragment with safe failure handling
let share1: Uint8Array = new Uint8Array(32)
let share2: Uint8Array = new Uint8Array(32)
let share3: Uint8Array = new Uint8Array(32)

try {
const s1 = slssDecrypt(c1, slssSK, params.slss)
if (s1.length === 32) share1 = s1
else {
validDecapsulation = 0
}
} catch {
validDecapsulation = 0
}

// Fujisaki-Okamoto re-encryption check
// Re-encapsulate with recovered secret and verify ciphertext matches
const reEncapsulated = encapsulateDeterministic(publicKey, recoveredSecret)
const reEncapsulatedBytes = serializeCiphertext(reEncapsulated.ciphertext)
try {
const s2 = tddDecrypt(c2, tddSK, params.tdd)
if (s2.length === 32) share2 = s2
else {
validDecapsulation = 0
}
} catch {
validDecapsulation = 0
}

// Constant-time comparison of ciphertexts
if (!constantTimeEqual(ciphertextBytes, reEncapsulatedBytes)) {
try {
const s3 = egrwDecrypt(c3, egrwSK, egrwPK, params.egrw)
if (s3.length === 32) share3 = s3
else {
validDecapsulation = 0
}
} catch {
validDecapsulation = 0
}

// Verify NIZK proof (additional check)
const proof = deserializeNIZKProof(proofBytes)
const ciphertextHashes = [
sha3_256(serializeSLSSCiphertext(c1)),
sha3_256(serializeTDDCiphertext(c2)),
sha3_256(serializeEGRWCiphertext(c3)),
]
// Reconstruct ephemeral secret (shares are normalized to 32 bytes)
let recoveredSecret: Uint8Array
try {
recoveredSecret = secretReconstruct([share1, share2, share3])
} catch {
// Reconstruction failure — use zeroed secret and mark invalid
recoveredSecret = new Uint8Array(32)
validDecapsulation = 0
}

if (!verifyNIZKProof(proof, ciphertextHashes, recoveredSecret)) {
// Fujisaki-Okamoto re-encryption check (compare hashes to avoid length leaks)
let reEncapsulatedBytes: Uint8Array
try {
const reEncapsulated = encapsulateDeterministic(publicKey, recoveredSecret)
reEncapsulatedBytes = serializeCiphertext(reEncapsulated.ciphertext)
} catch {
reEncapsulatedBytes = new Uint8Array(0)
validDecapsulation = 0
}

// Compare fixed-length hashes (constant-time)
const originalCtHash = sha3_256(ciphertextBytes)
const reCtHash = sha3_256(reEncapsulatedBytes)
if (!constantTimeEqual(originalCtHash, reCtHash)) {
validDecapsulation = 0
}

// Verify NIZK proof (additional check)
try {
const proof = deserializeNIZKProof(proofBytes)
const ciphertextHashes = [
sha3_256(serializeSLSSCiphertext(c1)),
sha3_256(serializeTDDCiphertext(c2)),
sha3_256(serializeEGRWCiphertext(c3)),
]

if (!verifyNIZKProof(proof, ciphertextHashes, recoveredSecret)) {
validDecapsulation = 0
}
} catch {
// Any failure in proof parsing or verification marks invalid
validDecapsulation = 0
}

@@ -730,17 +797,38 @@ export function serializeCiphertext(ct: MOSAICCiphertext): Uint8Array {
* @returns Ciphertext object
*/
export function deserializeCiphertext(data: Uint8Array): MOSAICCiphertext {
// Basic bounds checks
if (data.length < 4) throw new Error('Invalid ciphertext: too short')

const view = new DataView(data.buffer, data.byteOffset)
let offset = 0

// c1
if (offset + 4 > data.length)
throw new Error('Invalid ciphertext: truncated c1 length')
const c1Len = view.getUint32(offset, true)
offset += 4
const MAX_PART = 8 * 1024 * 1024 // 8 MB per component to prevent resource exhaustion (supports MOS-256 public keys)
if (c1Len <= 0 || c1Len > MAX_PART || offset + c1Len > data.length)
throw new Error('Invalid ciphertext: c1 extends beyond data or too large')
const c1Start = offset
const c1View = new DataView(data.buffer, data.byteOffset + c1Start)

// Validate SLSS component structure
if (c1Len < 8) throw new Error('Invalid SLSS ciphertext: too short')
const uLen = c1View.getUint32(0, true)
const u = new Int32Array(data.buffer, data.byteOffset + c1Start + 4, uLen / 4)
if (uLen % 4 !== 0)
throw new Error('Invalid SLSS ciphertext: u length not multiple of 4')
if (4 + uLen + 4 > c1Len)
throw new Error('Invalid SLSS ciphertext: malformed lengths')

const vLen = c1View.getUint32(4 + uLen, true)
if (4 + uLen + 4 + vLen !== c1Len)
throw new Error('Invalid SLSS ciphertext: length mismatch')
if (vLen % 4 !== 0)
throw new Error('Invalid SLSS ciphertext: v length not multiple of 4')

const u = new Int32Array(data.buffer, data.byteOffset + c1Start + 4, uLen / 4)
const v = new Int32Array(
data.buffer,
data.byteOffset + c1Start + 8 + uLen,
@@ -749,13 +837,19 @@ export function deserializeCiphertext(data: Uint8Array): MOSAICCiphertext {
offset += c1Len

// c2
if (offset + 4 > data.length)
throw new Error('Invalid ciphertext: truncated c2 length')
const c2Len = view.getUint32(offset, true)
offset += 4
if (c2Len <= 0 || c2Len > MAX_PART || offset + c2Len > data.length)
throw new Error('Invalid ciphertext: c2 extends beyond data or too large')
const c2Start = offset
const c2DataLen = new DataView(
data.buffer,
data.byteOffset + c2Start,
).getUint32(0, true)
const c2View = new DataView(data.buffer, data.byteOffset + c2Start)
const c2DataLen = c2View.getUint32(0, true)
if (4 + c2DataLen !== c2Len)
throw new Error('Invalid TDD ciphertext: length mismatch')
if (c2DataLen % 4 !== 0)
throw new Error('Invalid TDD ciphertext: data length not multiple of 4')
const tddData = new Int32Array(
data.buffer,
data.byteOffset + c2Start + 4,
Expand All @@ -764,8 +858,12 @@ export function deserializeCiphertext(data: Uint8Array): MOSAICCiphertext {
offset += c2Len

// c3
if (offset + 4 > data.length)
throw new Error('Invalid ciphertext: truncated c3 length')
const c3Len = view.getUint32(offset, true)
offset += 4
if (c3Len <= 16 || c3Len > MAX_PART || offset + c3Len > data.length)
throw new Error('Invalid EGRW ciphertext: malformed c3 or too large')
const c3Start = offset
const vertexView = new DataView(data.buffer, data.byteOffset + c3Start)
const vertex = {
@@ -851,38 +949,60 @@ export function serializePublicKey(pk: MOSAICPublicKey): Uint8Array {
* Format: [level_len:4][level_string][slss_len:4][slss_data][tdd_len:4][tdd_data][egrw_len:4][egrw_data][binding:32]
*/
export function deserializePublicKey(data: Uint8Array): MOSAICPublicKey {
// Basic bounds check
if (data.length < 4) throw new Error('Invalid public key: too short')

const view = new DataView(data.buffer, data.byteOffset)
let offset = 0

// Read security level string
if (offset + 4 > data.length)
throw new Error('Invalid public key: truncated level length')
const levelLen = view.getUint32(offset, true)
offset += 4
if (levelLen <= 0 || offset + levelLen > data.length || levelLen > 255)
throw new Error('Invalid public key: level length invalid')

const levelBytes = data.slice(offset, offset + levelLen)
const level = new TextDecoder().decode(levelBytes) as SecurityLevel
offset += levelLen

// Get params from level
// Get params from level (may throw if level unknown)
const params = getParams(level)

// Read SLSS public key
if (offset + 4 > data.length)
throw new Error('Invalid public key: truncated SLSS length')
const slssLen = view.getUint32(offset, true)
offset += 4
if (slssLen <= 0 || offset + slssLen > data.length)
throw new Error('Invalid public key: SLSS component out of bounds')
const slss = slssDeserializePublicKey(data.slice(offset, offset + slssLen))
offset += slssLen

// Read TDD public key
if (offset + 4 > data.length)
throw new Error('Invalid public key: truncated TDD length')
const tddLen = view.getUint32(offset, true)
offset += 4
if (tddLen <= 0 || offset + tddLen > data.length)
throw new Error('Invalid public key: TDD component out of bounds')
const tdd = tddDeserializePublicKey(data.slice(offset, offset + tddLen))
offset += tddLen

// Read EGRW public key
if (offset + 4 > data.length)
throw new Error('Invalid public key: truncated EGRW length')
const egrwLen = view.getUint32(offset, true)
offset += 4
if (egrwLen <= 0 || offset + egrwLen > data.length)
throw new Error('Invalid public key: EGRW component out of bounds')
const egrw = egrwDeserializePublicKey(data.slice(offset, offset + egrwLen))
offset += egrwLen

// Read binding (fixed 32 bytes)
if (offset + 32 > data.length)
throw new Error('Invalid public key: missing binding')
const binding = data.slice(offset, offset + 32)

return { slss, tdd, egrw, binding, params }
2 changes: 2 additions & 0 deletions src/problems/egrw/index.ts
Expand Up @@ -486,6 +486,8 @@ export function egrwSerializePublicKey(pk: EGRWPublicKey): Uint8Array {
* @returns Public key
*/
export function egrwDeserializePublicKey(data: Uint8Array): EGRWPublicKey {
if (data.length < 32)
throw new Error('Invalid EGRW public key: expected 32 bytes')
const vStart = bytesToSl2(data.slice(0, 16))
const vEnd = bytesToSl2(data.slice(16, 32))
return { vStart, vEnd }