Conversation

@josh-arnold-1 (Contributor) commented Sep 14, 2025

What

Follow-up to #2238.

Allow projects to define a custom preparationBatchSize in their configuration. This can be incredibly useful for my large Bazel-based project.

Currently, I've only implemented a target strategy, which lets you define a constant batch size. We can easily extend this with additional strategies in the future.

Usage

"preparationBatchingStrategy": {    
  "strategy": "target",
  "batchSize": 200    
}
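
Conceptually, the target strategy chunks the list of targets to prepare into fixed-size groups and issues one preparation request per group. Here is a minimal sketch of the idea (Target and prepare(_:) are placeholders, not the actual SourceKit-LSP APIs):

struct Target {
  let name: String
}

// Chunk the targets into groups of `batchSize` and prepare each group with a
// single request, instead of issuing one request per target.
func prepareInBatches(_ targets: [Target], batchSize: Int) async throws {
  precondition(batchSize > 0, "batchSize must be positive")
  var start = 0
  while start < targets.count {
    let end = min(start + batchSize, targets.count)
    try await prepare(Array(targets[start..<end]))
    start = end
  }
}

// Placeholder for the build system's multi-target preparation request.
func prepare(_ batch: [Target]) async throws {}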

Test plan

Tested locally in my project.

@ahoppen (Member) commented Sep 16, 2025

Thanks for picking this up @josh-arnold-1 🙏. One high-level comment: I would really like us to design the configuration option that allows us to customize the batching strategy as I described in #2238 (comment). Do you think you could look into that?

@josh-arnold-1 (Contributor, Author)

> Thanks for picking this up @josh-arnold-1 🙏. One high-level comment: I would really like us to design the configuration option that allows us to customize the batching strategy as I described in #2238 (comment). Do you think you could look into that?

Thanks for the review!

What if we update the schema to be an enum of configuration options like you specified, but supply just a single option for now, defaulting to a target batch size of 1 to maintain SourceKit-LSP's current behavior?

That way, we can easily add strategies in future PRs while keeping the current configuration API stable.

What are your thoughts? Thanks! Here's the schema entry I have in mind:

      {
        "type": "object",
        "description": "Prepare a fixed number of targets in a single batch",
        "properties": {
          "strategy": {
            "const": "target"
          },
          "batchSize": {
            "type": "integer",
            "description": "Defines how many targets should be prepared in a single batch"
          }
        },
        "required": [
          "strategy",
          "batchSize"
        ]
      },
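
On the Swift side, I'm imagining the strategy as a Codable enum with associated values. A rough sketch (the shape and names are illustrative, not final):

// Sketch only: how the schema entry above could map onto a Swift enum with
// associated values. Decoding switches on the "strategy" discriminator.
enum PreparationBatchingStrategy: Codable, Equatable {
  /// Prepare a fixed number of targets in a single batch.
  case target(batchSize: Int)

  private enum CodingKeys: String, CodingKey {
    case strategy
    case batchSize
  }

  init(from decoder: any Decoder) throws {
    let container = try decoder.container(keyedBy: CodingKeys.self)
    switch try container.decode(String.self, forKey: .strategy) {
    case "target":
      self = .target(batchSize: try container.decode(Int.self, forKey: .batchSize))
    case let strategy:
      throw DecodingError.dataCorruptedError(
        forKey: .strategy,
        in: container,
        debugDescription: "Unknown batching strategy '\(strategy)'"
      )
    }
  }

  func encode(to encoder: any Encoder) throws {
    var container = encoder.container(keyedBy: CodingKeys.self)
    switch self {
    case .target(let batchSize):
      try container.encode("target", forKey: .strategy)
      try container.encode(batchSize, forKey: .batchSize)
    }
  }
}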

@ahoppen (Member) commented Sep 17, 2025

Your proposal for the JSON schema sounds great to me!

@brentleyjones

Is there a way to set batchSize to inf/"batch everything at once"?

@bnbarham (Contributor)

> Is there a way to set batchSize to inf/"batch everything at once"?

Setting it to a high value seems reasonable to me rather than handling that case specifically.

@josh-arnold-1 (Contributor, Author)

Sorry for the delayed response — I was OOO recently. I’m looking into this now and realized that config.schema.json generation might not support enums with associated values (unless I’m missing something?).

If that’s the case, what would be the best way to represent the different batching strategies?

@ahoppen, any guidance here would be super helpful. Thanks!

@ahoppen (Member) commented Oct 14, 2025

Yeah, the JSON schema generation would need to be expanded to support this. That’s what I meant in #2238 (comment).

As a side note, allowing us to generate the oneOf in the schema above like this will likely need quite a bit of new functionality in ConfigSchemaGen. If we only stick to the target-based strategy, we should only need support for the const key in the JSON schema, which should be a lot easier to accomplish.

@josh-arnold-1 force-pushed the preparation-batch-size branch from b8a8cad to 5cd67b8 on October 16, 2025 at 19:37
@josh-arnold-1 (Contributor, Author)

Thanks @ahoppen, I just updated the code with what we discussed!

@ahoppen (Member) left a review comment

Thank you. I really appreciate that you put in the effort to generate the enum options in the JSON schema and Markdown document 🙏🏽. Just a few nitpicky comments.

Two other high-level comments:

  • Could you include the test from #2238 in this PR as well?
  • I would also like to see the BSP server advertise whether it can handle multi-target preparation, as I mentioned in #2238 (comment), so that users don't get regressed performance for SwiftPM projects by increasing the target batch size. #2238 already implements most of this, so you should be able to just copy it. I would be happy for this to be a follow-up PR, though, so we can get this one in without further discussion of the BSP protocol extension.

@josh-arnold-1 (Contributor, Author)

Sorry for the month-long hiatus; I was quite strapped for time.

I tried to respond to all the comments; please let me know if I missed anything! Thanks @ahoppen for your time reviewing this 🙏

@ahoppen (Member) left a review comment

No worries about the delay, and thanks again for your work on this. I have two nitpicky comments, but I'd be happy to merge this as-is and address them in follow-up PRs.

@ahoppen (Member) commented Nov 23, 2025

@swift-ci Please test

@ahoppen (Member) commented Nov 23, 2025

@swift-ci Please test Windows

@josh-arnold-1 (Contributor, Author)

Thank you for the review! I responded to the comments. Happy Thanksgiving 🦃

@ahoppen (Member) left a review comment

Thanks. LGTM. Very excited to see this get in 🤩

@ahoppen (Member) commented Dec 1, 2025

@swift-ci Please test

@josh-arnold-1 force-pushed the preparation-batch-size branch from 7863f68 to ab49c03 on December 2, 2025 at 14:35
@josh-arnold-1 (Contributor, Author)

@ahoppen just updated! Thank you!

@ahoppen (Member) commented Dec 2, 2025

@swift-ci Please test

@josh-arnold-1 (Contributor, Author)

I see the testing failed. Where is the right place to view the test failure logs? I can look into this and try to fix it. Thanks!

@ahoppen (Member) commented Dec 3, 2025

Searching the build log for the last occurrences of error: yields the following. I think you need to rebase your PR on top of main to pick up #2366 and then resolve the build failure.

[2025-12-03T03:55:36.906Z] /Users/ec2-user/jenkins/workspace/swift-sourcekit-lsp-PR-macOS/branch-main/sourcekit-lsp/Sources/SKOptions/PreparationBatchingStrategy.swift:46:34: error: use of protocol 'Encoder' as a type must be written 'any Encoder'; this will be an error in a future Swift language mode [#ExistentialAny]
[2025-12-03T03:55:36.906Z] 44 |   }
[2025-12-03T03:55:36.906Z] 45 | 
[2025-12-03T03:55:36.906Z] 46 |   public func encode(to encoder: Encoder) throws {
[2025-12-03T03:55:36.906Z]    |                                  `- error: use of protocol 'Encoder' as a type must be written 'any Encoder'; this will be an error in a future Swift language mode [#ExistentialAny]
[2025-12-03T03:55:36.906Z] 47 |     var container = encoder.container(keyedBy: CodingKeys.self)
[2025-12-03T03:55:36.906Z] 48 |     switch self {
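
The fix should be mechanical: with ExistentialAny enabled, a protocol used as a parameter type must be spelled with any. A minimal illustration (Example is a stand-in, not the actual PreparationBatchingStrategy code):

struct Example: Encodable {
  let value: Int

  enum CodingKeys: String, CodingKey {
    case value
  }

  // Writing `encoder: Encoder` is rejected under ExistentialAny;
  // the existential must be written `any Encoder`.
  func encode(to encoder: any Encoder) throws {
    var container = encoder.container(keyedBy: CodingKeys.self)
    try container.encode(value, forKey: .value)
  }
}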

@josh-arnold-1 force-pushed the preparation-batch-size branch from ab49c03 to 668aab3 on December 3, 2025 at 19:38
@josh-arnold-1 (Contributor, Author)

Updated! Thanks!

@ahoppen (Member) commented Dec 3, 2025

@swift-ci Please test

@ahoppen (Member) commented Dec 3, 2025

@swift-ci Please test Windows
