
Refactor Cache locking mechanism to use NSRecursiveLock for improved thread safety and prevent deadlocks. Added comprehensive thread safety tests to validate the fix and ensure no new locking issues are introduced. #22

Merged

0xLeif merged 4 commits into main from leif/bug/rare-deadlock on Sep 20, 2025

Conversation

@0xLeif (Owner) commented Sep 20, 2025

No description provided.

…thread safety and prevent deadlocks. Added comprehensive thread safety tests to validate the fix and ensure no new locking issues are introduced.
@gemini-code-assist

Summary of Changes

Hello @0xLeif, I'm Gemini Code Assist¹! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request addresses critical thread safety and deadlock issues within the Cache class by upgrading its internal locking mechanism. The change from NSLock to NSRecursiveLock is specifically aimed at resolving re-entrant lock acquisition problems that arise in complex multithreaded environments, particularly when interacting with @Published property wrappers and SwiftUI's reactive framework. To thoroughly validate this fix and prevent future regressions, a comprehensive set of new thread safety tests has been introduced, meticulously designed to reproduce and verify the resolution of various deadlock scenarios.

Highlights

  • Locking Mechanism Refactor: The Cache class's internal locking mechanism has been refactored from NSLock to NSRecursiveLock. This change specifically addresses and prevents deadlocks that can occur due to re-entrant calls, particularly those arising from @Published property wrappers and SwiftUI/Combine interactions.
  • Comprehensive Thread Safety Tests: A significant suite of new thread safety tests has been added to ThreadSafetyTests.swift. These tests are meticulously designed to reproduce and validate the fix for various deadlock scenarios, including those related to the resolve() method, @Published property wrappers, ObservableObject interactions, and complex SwiftUI/Combine update cycles, ensuring the robustness of the cache implementation.
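The re-entrancy problem the refactor addresses can be sketched as follows. This is a minimal illustration, not the library's actual Cache implementation: with a plain NSLock, the nested call below would deadlock because NSLock is not re-entrant, while NSRecursiveLock lets the same thread re-acquire a lock it already holds.

```swift
import Foundation

// Minimal sketch (hypothetical type, not the real Cache) of why the
// switch matters: a method that holds the lock may call another locked
// method on the same thread, e.g. via a @Published willSet observer.
final class TinyCache {
    private let lock = NSRecursiveLock() // a plain NSLock would deadlock below
    private var storage: [String: Any] = [:]

    func get(_ key: String) -> Any? {
        lock.lock()
        defer { lock.unlock() }
        return storage[key]
    }

    func getOrDefault(_ key: String, default value: Any) -> Any {
        lock.lock()
        defer { lock.unlock() }
        // Re-entrant call: get(_:) locks again on the same thread.
        return get(key) ?? value
    }

    func set(_ value: Any, forKey key: String) {
        lock.lock()
        defer { lock.unlock() }
        storage[key] = value
    }
}
```

With NSLock, the second `lock.lock()` inside `get(_:)` would block forever waiting on the lock its own thread already holds; NSRecursiveLock tracks the owning thread and a recursion count instead.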
Using Gemini Code Assist

The full guide for Gemini Code Assist can be found on our documentation page; here are some quick tips.

Invoking Gemini

You can request assistance from Gemini at any point by creating a comment using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands on the current page.


Customization

To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by adding files to a .gemini/ folder in the base of the repository. Detailed instructions can be found here.

Limitations & Feedback

Gemini Code Assist may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments. If you're interested in giving feedback about your experience with Gemini Code Assist for GitHub and other Google products, sign up here.

You can also get AI-powered code generation, chat, and code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.

Footnotes

  1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double check it and use code with caution.


@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request correctly addresses a potential deadlock issue by switching from NSLock to NSRecursiveLock. This change properly handles re-entrant locking scenarios that can occur with @Published properties. The addition of comprehensive thread safety tests is a great step towards ensuring the stability of the cache. However, I've identified a few areas for improvement in the new tests: one test has a logical flaw that prevents it from testing what its name implies, there is significant code duplication across the new tests that affects maintainability, and a comment in one test misrepresents the cause of the deadlock. My detailed comments provide suggestions for addressing these points.

…ity with Windows. Added type annotations for ComposableCache and AnyCacheable instances, and wrapped tests in conditional compilation to exclude them on Windows.
…stantiation. This change enhances test clarity and performance by reusing the cache instance across multiple operations.
…oving test clarity. Updated comments to reflect changes in locking mechanisms and added new test scenarios to validate cache behavior under high contention and mixed operations.
@0xLeif (Owner, Author) commented Sep 20, 2025

/gemini review


@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request correctly addresses a potential deadlock issue by switching from NSLock to NSRecursiveLock, which is the right solution for re-entrant locking scenarios. The addition of an extensive suite of thread safety tests is a great improvement to ensure the cache's robustness. My review includes a high-severity comment on a bug in one of the new tests where it's not validating the intended scenario, and a medium-severity comment on significant code duplication in the new tests, with a suggestion for refactoring to improve maintainability.

cache.set(value: ["array"], forKey: "array_key")

DispatchQueue.concurrentPerform(iterations: iterations) { i in
    let key = "key_\(i % 3)"


Severity: high

This test has a logic issue. The keys being generated and used inside DispatchQueue.concurrentPerform (key_0, key_1, key_2) do not match the keys that were initially set in the cache (string_key, int_key, array_key).

As a result, cache.resolve will always fail with a MissingRequiredKeysError because contains(key) returns false, and the code path that could potentially deadlock is never reached. The test passes, but it doesn't validate the fix for the deadlock scenario as intended.

To fix this, you should use the keys that are actually in the cache.

Suggested change
- let key = "key_\(i % 3)"
+ let key = ["string_key", "int_key", "array_key"][i % 3]
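The mismatch can be demonstrated with a small standalone snippet (hypothetical variable names, independent of the test code):

```swift
// Keys seeded into the cache by the test setup.
let seeded = ["string_key", "int_key", "array_key"]

// Keys the loop actually generates: "key_0", "key_1", "key_2", repeating.
let generated = (0..<6).map { "key_\($0 % 3)" }

// No generated key matches a seeded key, so every lookup misses and the
// deadlock-prone code path is never exercised.
let hits = generated.filter { seeded.contains($0) }
// hits == []

// The suggested fix cycles through the seeded keys instead, so every
// iteration hits a key that exists in the cache.
let fixed = (0..<6).map { seeded[$0 % 3] }
```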

Comment on lines +42 to +68
func testResolveDeadlockIssue() {
    let cache = Cache<String, Any>()
    let iterations = 100
    let expectation = XCTestExpectation(description: "Resolve deadlock test")
    expectation.expectedFulfillmentCount = iterations

    cache.set(value: "string_value", forKey: "string_key")
    cache.set(value: 42, forKey: "int_key")
    cache.set(value: ["array"], forKey: "array_key")

    DispatchQueue.concurrentPerform(iterations: iterations) { i in
        let key = "key_\(i % 3)"

        do {
            let _: Int = try cache.resolve(key, as: Int.self)
        } catch {
        }

        if cache.contains(key) {
            let _ = cache.get(key)
        }

        expectation.fulfill()
    }

    wait(for: [expectation], timeout: 5.0)
}


Severity: medium

There is a large amount of duplicated code across the new thread safety tests. Each test repeats the setup for XCTestExpectation, DispatchQueue.concurrentPerform, and wait(for:). This makes the test suite harder to read and maintain.

Consider refactoring this boilerplate into a shared helper function. This would make the individual tests more concise and focused on their specific logic.

Here's an example of what a helper function could look like:

private func runConcurrentTest(
    description: String,
    iterations: Int,
    timeout: TimeInterval,
    testBody: @escaping (Int) -> Void
) {
    let expectation = XCTestExpectation(description: description)
    expectation.expectedFulfillmentCount = iterations
    
    DispatchQueue.concurrentPerform(iterations: iterations) { i in
        testBody(i)
        expectation.fulfill()
    }
    
    wait(for: [expectation], timeout: timeout)
}

Using this helper, this test could be simplified significantly.
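For illustration, here is a sketch of the simplified test, assuming the helper above and the existing Cache API, and applying the key fix from the earlier high-severity comment:

```swift
func testResolveDeadlockIssue() {
    let cache = Cache<String, Any>()
    cache.set(value: "string_value", forKey: "string_key")
    cache.set(value: 42, forKey: "int_key")
    cache.set(value: ["array"], forKey: "array_key")

    runConcurrentTest(description: "Resolve deadlock test", iterations: 100, timeout: 5.0) { i in
        // Keys that actually exist in the cache, so the locking path is exercised.
        let key = ["string_key", "int_key", "array_key"][i % 3]
        _ = try? cache.resolve(key, as: Int.self)
        if cache.contains(key) {
            _ = cache.get(key)
        }
    }
}
```

The expectation bookkeeping and the `wait(for:timeout:)` call disappear into the helper, leaving only the scenario-specific logic in the test body.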

@0xLeif 0xLeif merged commit 9471103 into main Sep 20, 2025
3 of 4 checks passed
@0xLeif 0xLeif deleted the leif/bug/rare-deadlock branch September 20, 2025 20:14