Conversation

@josh-arnold-1 (Contributor) commented Sep 14, 2025

What

This is a follow-up to #2238.

Allow projects to define a custom preparationBatchSize in their configuration. This can be incredibly useful for my large Bazel-based project.

Test plan

Tested locally in my project.

@ahoppen (Member) commented Sep 16, 2025

Thanks for picking this up @josh-arnold-1 🙏. One high-level comment: I would really like us to design the configuration option so that it allows us to customize the batching strategy, as I described in #2238 (comment). Do you think you could look into that?

@josh-arnold-1 (Contributor, Author) replied:

> Thanks for picking this up @josh-arnold-1 🙏. One high-level comment: I would really like us to design the configuration option so that it allows us to customize the batching strategy, as I described in #2238 (comment). Do you think you could look into that?

Thanks for the review!

What if we update the schema to be an enum of configuration options like you specified, but only supply a single option for now, defaulting to a target batch size of 1 to maintain SourceKit-LSP's current behavior?

That way, we can easily add more strategies in future PRs while keeping the current configuration API stable.

What are your thoughts? Thanks!

      {
        "type": "object",
        "description": "Prepare a fixed number of targets in a single batch",
        "properties": {
          "strategy": {
            "const": "target"
          },
          "batchSize": {
            "type": "integer",
            "description": "Defines how many targets should be prepared in a single batch"
          }
        },
        "required": [
          "strategy",
          "batchSize"
        ]
      },
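
For illustration, here is a hypothetical sketch of what a project-level SourceKit-LSP configuration using this schema might look like. The nesting under index and the key name preparationBatchingStrategy are assumptions made for this sketch only (they are not specified in this thread), and the leading comment is explanatory (comments are not valid JSON):

      // Hypothetical .sourcekit-lsp/config.json entry; key names are assumed, not final.
      {
        "index": {
          "preparationBatchingStrategy": {
            "strategy": "target",
            "batchSize": 4
          }
        }
      }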

@ahoppen (Member) commented Sep 17, 2025

Your proposal for the JSON schema sounds great to me!

@brentleyjones commented:

Is there a way to set batchSize to inf/"batch everything at once"?

@bnbarham (Contributor) commented:

> Is there a way to set batchSize to inf/"batch everything at once"?

Setting it to a high value seems reasonable to me, rather than handling that case specifically.
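
For instance, using the same hypothetical key names as the sketch above, a project that wants to approximate "batch everything at once" could set a batch size larger than its number of targets:

      // Hypothetical key names, as in the earlier sketch; the large value stands in for "unbounded".
      {
        "index": {
          "preparationBatchingStrategy": {
            "strategy": "target",
            "batchSize": 1000000
          }
        }
      }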
