Revert "chore: try to fix the mobile build" #7844

Merged
LucasXu0 merged 1 commit into main from revert_mobile_changes
Apr 28, 2025
Conversation

@LucasXu0 (Collaborator) commented Apr 28, 2025

This reverts commit 3f06f64.

Feature Preview


PR Checklist

  • My code adheres to AppFlowy's Conventions
  • I've listed at least one issue that this PR fixes in the description above.
  • I've added a test(s) to validate changes in this PR, or this PR only contains semantic changes.
  • All existing tests are passing.

Summary by Sourcery

Chores:

  • Remove OS-specific conditional compilation flags for Ollama integration.


sourcery-ai bot commented Apr 28, 2025

Reviewer's Guide by Sourcery

This pull request reverts a previous commit (3f06f64) that introduced platform-specific conditional compilation for local AI features, particularly those related to Ollama and desktop operating systems. The revert removes these conditional checks, restoring the code to its state before the problematic commit.

No diagrams were generated, as the changes are simple and do not need a visual representation.

File-Level Changes

Change: Removed platform-specific conditional compilation for Ollama and related local AI functionality.

Details:
  • Removed #[cfg] attributes from the Ollama imports.
  • Removed the #[cfg] attribute from the ollama field declaration.
  • Removed the #[cfg] block around the ollama field's initialization logic.
  • Removed the #[cfg] block around the background task that subscribes to local AI state changes.
  • Removed #[cfg] attributes from the model retrieval functions (get_all_chat_local_models, get_all_embedded_local_models, get_filtered_local_models).
  • Removed #[cfg] blocks within the get_local_ai_model_type function.

Files: frontend/rust-lib/flowy-ai/src/local_ai/controller.rs


@LucasXu0 LucasXu0 merged commit 168b29a into main Apr 28, 2025
3 checks passed
@LucasXu0 LucasXu0 deleted the revert_mobile_changes branch April 28, 2025 05:25
@sourcery-ai sourcery-ai bot left a comment

Hey @LucasXu0 - I've reviewed your changes - here's some feedback:

Overall Comments:

  • Removing the desktop-specific conditional compilation may reintroduce the mobile build issues that this revert relates to.

Here's what I looked at during the review:
  • 🟢 General issues: all looks good
  • 🟢 Security: all looks good
  • 🟢 Testing: all looks good
  • 🟡 Complexity: 1 issue found
  • 🟢 Documentation: all looks good



issue (complexity): Consider extracting the inner loop of the spawned async task into a separate helper function to improve code structure and readability.

Consider extracting the inner loop of the spawned async task into its own helper function. This reduces nesting and makes the control flow easier to follow. For example:

```rust
async fn process_plugin_state(
    mut running_state_rx: impl StreamExt<Item = RunningState> + Unpin,
    cloned_llm_res: Arc<LocalAIResourceController>,
    cloned_store_preferences: Weak<KVStorePreferences>,
    cloned_local_ai: Arc<OllamaAIPlugin>,
    cloned_user_service: Arc<dyn AIUserService>,
) {
    while let Some(state) = running_state_rx.next().await {
        // Early exit if workspace_id is not accessible
        let workspace_id = match cloned_user_service.workspace_id() {
            Ok(id) => id,
            Err(_) => continue,
        };

        let key = crate::local_ai::controller::local_ai_enabled_key(&workspace_id.to_string());
        info!("[AI Plugin] state: {:?}", state);

        if let Some(store_preferences) = cloned_store_preferences.upgrade() {
            let enabled = store_preferences.get_bool(&key).unwrap_or(true);

            let (plugin_downloaded, lack_of_resource) = if !matches!(state, RunningState::UnexpectedStop { .. }) && enabled {
                let downloaded = is_plugin_ready();
                let resource_lack = cloned_llm_res.get_lack_of_resource().await;
                (downloaded, resource_lack)
            } else {
                (false, None)
            };

            let plugin_version = if matches!(state, RunningState::Running { .. }) {
                match cloned_local_ai.plugin_info().await {
                    Ok(info) => Some(info.version),
                    Err(_) => None,
                }
            } else {
                None
            };

            let new_state = RunningStatePB::from(state);
            chat_notification_builder(
                APPFLOWY_AI_NOTIFICATION_KEY,
                ChatNotification::UpdateLocalAIState,
            )
            .payload(LocalAIPB {
                enabled,
                plugin_downloaded,
                lack_of_resource,
                state: new_state,
                plugin_version,
            })
            .send();
        } else {
            warn!("[AI Plugin] store preferences is dropped");
        }
    }
}
```

Then call it from the spawn block like:

```rust
tokio::spawn(async move {
    process_plugin_state(
        running_state_rx,
        cloned_llm_res,
        cloned_store_preferences,
        cloned_local_ai,
        cloned_user_service,
    )
    .await;
});
```
This extraction flattens the async task, reduces the nesting in your main flow, and keeps the functionality intact.
