Conversation

ajaychandran (Contributor) commented Jul 2, 2025

Motivation

Each build importer uses custom models to extract module configurations and translate the data to an IR. These custom types differ from the module structure and hierarchies within Mill. This makes the implementation complex and difficult to extend with new features.

Solution

This PR introduces new build representations that are optimized for code generation. The importer implementations were simplified and extended with several new features during development.

libs.init.buildgen.api

  • The ModuleConfig ADT replaces IR types in libs.init.buildgen such as IrBuild and IrScopedDeps. The ADT members have a one-to-one mapping with module types within Mill. This greatly improves extensibility: new features can be introduced by adding fields to existing members or by adding new members.
  • The ModuleRepr datatype forms the target for conversion. It replaces the BuildObject type in libs.init.buildgen.
  • The module is cross compiled against Scala 2.12 so that it can be used within SBT.
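A minimal sketch of the shape this implies. Only the names ModuleConfig, ModuleRepr, segments, configs, and crossConfigs come from this PR; the members and fields shown are illustrative assumptions:

```scala
// Illustrative only: the real ADT members mirror Mill's module types and carry many more fields.
sealed trait ModuleConfig
case class JavaModuleConfig(mvnDeps: Seq[String] = Nil) extends ModuleConfig
case class ScalaModuleConfig(scalaVersion: String, mvnDeps: Seq[String] = Nil) extends ModuleConfig

// The target of conversion: one value per module, addressed by its path segments.
case class ModuleRepr(
  segments: Seq[String],
  configs: Seq[ModuleConfig] = Nil,                        // shared across cross versions
  crossConfigs: Map[String, Seq[ModuleConfig]] = Map.empty // keyed by cross Scala version
)
```

Under this scheme, a new feature means a new field on an existing case class or a new ModuleConfig member, rather than a rework of importer-specific IR.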

libs.init.buildgen

  • Added BuildWriter to render the new representations.
  • Added MetaBuildRepr for base traits and "Deps" object.
  • Removed unused types BuildGenBase, BuildGenUtil, OptionNodeTree and ir.scala.
  • Simplified Tree implementation.

libs.init.gradle

  • Added ExportGradleBuildPlugin to generate instances of ModuleRepr. These instances are serialized and the JSON string is wrapped in ExportGradleBuildModel. These types are isolated in sub-modules to exclude the coursier-paths dependency from the classpath passed to the Gradle daemon.
  • GradleBuildGenMain was updated to use the new plugin. Also, the Gradle daemon is now shut down after use.
  • Custom types such as ProjectModel and ProjectTreePlugin were removed.

libs.init.maven

  • MavenBuildGenMain was updated to target the new build representations.

libs.init.sbt

  • Added ExportSbtBuildScript in exportscript to target the new build representations. This is simpler to use via sideloading and replaces the plugin in exportplugin.
  • SbtBuildGenMain was updated to use the new script. Also, added support for reading JVM options in .sbtopts.

libs.init

  • Updated BuildGenModule.buildGenScalafmtConfig to write the file under the task destination directory.
  • Renamed Util.scalafmtConfigFile to Util.scalafmtConfig; it now returns the configuration content instead of a temp file. The configuration was also updated to not add blank lines before top-level statements, which prevents separation between cross trait and module object declarations.
  • Updated Util.buildFiles to filter files based on name instead of extension. This is required to avoid formatting any ".sc" files in projects.
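For instance, a name-based filter along these lines (a sketch; the exact set of file names checked in the PR is an assumption):

```scala
// Assumption: Mill build files are matched by exact name, so stray *.sc scripts
// elsewhere in a project are never picked up for formatting.
def isBuildFile(fileName: String): Boolean =
  Set("build.mill", "package.mill", "build.sc", "package.sc").contains(fileName)
```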

New features

  • Extended SBT conversion to cross platform and cross Scala version projects.
  • Autoconfigure ErrorProneModule for Gradle and Maven projects.
  • Autoconfigure jvmId for Gradle and Maven projects.
  • Added support for generating meta-build.

SBT conversion for cross projects

The custom script adds a task to each project that generates the module configuration and writes it to a JSON file.

On the importer side, this task is run with the + prefix to generate the module configurations per project per cross Scala version. The configurations are grouped by ModuleRepr.segments, and groups that contain more than one member are processed as a cross Scala module.

  • The configuration shared between cross versions is set in ModuleRepr.configs.
  • The version specific configurations are set in ModuleRepr.crossConfigs.
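The grouping step can be sketched as follows (illustrative only: the field beyond segments is an assumption, and the real ModuleRepr carries full module configurations rather than just a version):

```scala
// Hypothetical simplified representation of one exported configuration.
case class ModuleRepr(segments: Seq[String], scalaVersion: String)

// Group the per-cross-version configurations by module path; any group with
// more than one member corresponds to a cross Scala module.
def crossScalaModules(reprs: Seq[ModuleRepr]): Map[Seq[String], Seq[ModuleRepr]] =
  reprs.groupBy(_.segments).filter { case (_, group) => group.size > 1 }
```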

To identify cross platform members, additional information is exported that consists of a flag and the effective moduleDir.

  • For projects that use the sbt-crossproject plugin, the flag is set to true and the effective moduleDir is set to the parent directory.
  • For all other projects, the flag is set to false.

On the importer side, the configurations are first grouped by this metadata. The members in a group are grouped again by ModuleRepr.segments to identify the cross-platform members and their respective configurations. These configurations are multi-valued if the cross-platform member is also a cross Scala module.
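The two-level grouping above can be sketched like this (PlatformInfo and its fields are hypothetical names for the exported flag and effective moduleDir):

```scala
// Hypothetical names: the PR only states that a flag and the effective moduleDir are exported.
case class PlatformInfo(isCrossPlatform: Boolean, effectiveModuleDir: String)
case class Exported(info: PlatformInfo, segments: Seq[String])

// First group by the platform metadata, then by segments within each group
// to find the cross-platform members and their configurations.
def groupForPlatforms(all: Seq[Exported]): Map[PlatformInfo, Map[Seq[String], Seq[Exported]]] =
  all.groupBy(_.info).map { case (info, group) => info -> group.groupBy(_.segments) }
```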

Dropped features

  • JavaModule.resources is no longer configured, since most projects tested do not require it. The conversion has always been limited to the standard layout.
  • Redundant command line options were removed.

Impact

In general, the accuracy and quality of the generated build files have been improved. This can be observed by comparing the expected outputs for unit tests.
With support extended to cross projects, the scope of SBT conversion has increased greatly.

A noticeable difference in the log output is that messages for skipped dependencies are missing. This is a consequence of the functional nature of the new implementation. It is possible to restore such messages, but the added complexity is expected to outweigh any gains.

Pending items

Add examples/integration tests for new features.

Future work

  1. Extend Gradle conversion to Kotlin modules/projects.
  2. Extend SBT cross platform conversion to projects that use the sbt-projectmatrix plugin.

ajaychandran (Contributor Author):

@lihaoyi As per the Maven POM schema, the description, organization and url fields are not mandatory. Is there a reason why these are required in Mill PomSettings?

lihaoyi (Member) commented Jul 3, 2025

No reason I'm aware of, we can make them optional

lefou (Member) commented Jul 3, 2025

I think the Maven Central validator rejects projects without these fields. Although it's specific to Maven Central, having these fields mandatory avoids some late pitfalls, when publishing artifacts.

We only use PomSettings for publishing/sharing, so having some minimal standards on meta-data is probably not bad.

ajaychandran (Contributor Author) commented Jul 3, 2025

We could allow null values and handle them in the render method. This would satisfy the empty use-case without requiring changes in user code.

ajaychandran (Contributor Author) commented Jul 29, 2025

A new SBT importer implementation has been added that can import projects that use the sbt-crossproject plugin for cross-platform development. The implementation also generates cross modules for cross Scala versions.

This was tested against 12 projects and all but 2 were successfully imported.

  • scala/scala3: Multiple SBT projects use the same base directory.
  • scalapb/ScalaPB: Uses sbt-projectmatrix for cross development.

One of the tests can be run using the following command:

./mill standalone.migrate[sbt].testOnly mill.standalone.MillInitAirstreamTests

ajaychandran (Contributor Author) commented Jul 29, 2025

@lihaoyi

I am going to be working on adding support for some more settings, such as jvmId, and on the base trait generation feature.

What are your thoughts on supporting more plugins? We already have tests that require the following:

  • sbt-projectmatrix
  • sbt-play
  • ScalablyTypedConverterGenSourcePlugin

lihaoyi (Member) commented Jul 30, 2025

@ajaychandran update the PR title and description to summarize the motivation, major changes, how it integrates with the existing codebase, and testing strategy. It's a large PR and that would make it easier to review

Comment on lines 88 to 91
ps.println()
ps.print("def repositories = super.repositories() ++ ")
ps.print("Seq(")
ps.print(literalize(repositories.head))
lihaoyi (Member) commented Jul 30, 2025:

Please use string interpolation for this and all similar code snippets so it's not so verbose
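For illustration, the snippet above could be collapsed with string interpolation roughly like this (a self-contained sketch: `literalize` is stubbed here, while the real one is a Mill utility, and the surrounding `ps.print` rendering is elided):

```scala
// Stub for illustration; Mill's literalize also escapes special characters.
def literalize(s: String): String = "\"" + s + "\""
val repositories = Seq("https://example.com/repo")

// One interpolated string instead of several ps.print calls.
val rendered =
  s"def repositories = super.repositories() ++ Seq(${repositories.map(literalize).mkString(", ")})"
```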

import mill.constants.OutFiles.{millDaemon, millNoDaemon, out}
import os.{Path, ProcessInput, ProcessOutput, Shellable}

class StandaloneTester(
lihaoyi (Member) commented Jul 30, 2025:

Why does this need to be different from normal IntegrationTester? We should make these normal integration tests using the normal IntegrationTester utilities

ajaychandran (Contributor Author):

It doesn't have the same requirements as IntegrationTester. The workspace creation and initialization happens outside the tester.
Given that this PR proposal will take some time, I chose to isolate this to avoid any conflicts when re-basing.

lihaoyi (Member):

What workspace creation and initialization do we expect with IntegrationTester? If run on an empty folder, the workspace starts empty

ajaychandran (Contributor Author):

IntegrationTester clears the workspace and copies resources. I suppose this can be circumvented with an empty resource folder.

I am not opposed to the idea of sharing existing code. I just kept all new code isolated initially. I will merge the testkit classes and move the tests to integration/manual.

@@ -0,0 +1,33 @@
package build.standalone
lihaoyi (Member):

Let's put this in the integration/manual/, indicating that these are tests meant to be run manually but won't be run in CI, but are otherwise normal integration tests

@@ -0,0 +1,108 @@
package mill.init.migrate
lihaoyi (Member) commented Jul 30, 2025:

How does this file relate to the code we already have in libs/init/buildgen/src/mill/main/buildgen/ir.scala? Does it replace it entirely, or does it re-use parts of the existing model?

ajaychandran (Contributor Author):

It is meant to replace it.
In essence, it is a refactoring of the existing model that organizes settings into groups corresponding to actual module types.
Also, the IR types holding rendered code are not required and do not exist in the new model. I originally introduced those primarily for file merging; the new implementation achieves the same in a saner way.

"org.testng" -> "TestModule.TestNg",
"org.scalatest" -> "TestModule.ScalaTest",
"org.specs2" -> "TestModule.Specs2",
"com.lihaoyi" -> "TestModule.Utest",
lihaoyi (Member):

This seems a bit imprecise. For example a lot of people use com.lihaoyi::os-lib but don't use utest. Could we match on a whitelist of artifact names (or prefixes, ignoring the _{scala-version}) as well?

lihaoyi (Member):

Also, this seems to duplicate logic we already have in BuildgenUtil. We should consolidate it and avoid copy-pasting code

ajaychandran (Contributor Author):

Yes, it is the same as the list in BuildGenUtil.
Given that this is a rewrite of the core design, I have kept the new implementation separate. If the proposal is approved, this can be consolidated.


import upickle.default.{ReadWriter, macroRW}

case class Tree[+A](root: A, children: Seq[Tree[A]] = Nil) {
lihaoyi (Member):

How is this different from libs/init/buildgen/src/mill/main/buildgen/Tree.scala?

ajaychandran (Contributor Author):

It's in Scala 2 and the unused features are removed.
But basically it is the same and only exists because the new implementation is isolated.

}

def renderSbtPlatformModule =
s"""trait SbtPlatformModule extends SbtModule { outer =>
lihaoyi (Member) commented Jul 30, 2025:

How is this different from SbtModule with PlatformScalaModule? If it's necessary, we should put it (and CrossSbtPlatformModule) in libs/scalalib/src/mill/scalalib/ rather than codegen-ing it every import

ajaychandran (Contributor Author):

SbtModule resets sources

  override def sources = Task.Sources("src/main/scala", "src/main/java")

and PlatformModuleBase operates on sourceFolders.

  override def sourcesFolders: Seq[os.SubPath] = super.sourcesFolders.flatMap {
    source => Seq(source, source / os.up / s"${source.last}-${platformCrossSuffix}")
  }

Are these compatible in any scenario?

This is being used to replicate the layout used by sbt-crossproject plugin. But now we also have sbt-projectmatrix for cross development which seems to use a completely different structure. Is it okay to add a trait to scalalib per plugin?

lihaoyi (Member):

We should make SbtModule use sourceFolders as well, just for consistency with the rest.

I think having traits in scalalib for each of them is fine as long as they're meaningful and well documented.

@@ -0,0 +1,225 @@
package mill.init.migrate.sbt
lihaoyi (Member):

How does this relate to the existing libs/init/sbt/exportplugin/src/mill/main/sbt/ExportBuildPlugin.scala? Can we consolidate them?

ajaychandran (Contributor Author):

Functionally they are the same.
But the script offers certain advantages such as not requiring any extra build files to add a plugin. The idea was borrowed from IDEA :)

import scala.collection.mutable
import scala.util.Using

opaque type PackageTree = Tree[PackageRepr]
lihaoyi (Member) commented Jul 30, 2025:

Let's just use a normal case class here with normal methods, rather than this opaque type with extensions

ajaychandran (Contributor Author):

A PackageTree is supposed to have the following invariants:

  • The root package is located at the workspace root.
  • The tree structure represents the filesystem structure.

An opaque type seemed the "right" choice. The extension methods can be defined as normal methods.

Could you provide the reason behind the comment? This is just for my learning.

lihaoyi (Member):

In general, we aim to use simpler language features over more powerful ones, unless there is a strong reason otherwise. In this case, there is no strong reason to use an opaque type, so we use case class which people are more familiar with

lihaoyi (Member) commented Jul 30, 2025

@ajaychandran Overall this looks decent as a first pass, but what is missing is how this interacts with the existing libs/init/sbt implementation

  • Does the new implementation pass all the same tests? I saw some tests failing on your fork's CI
  • Does the new implementation replace the old one? If so can we delete the old one entirely?
  • Can we share code between the old and new implementations? I see a bunch of duplicate logic that can be consolidated

lihaoyi (Member) commented Jul 30, 2025

@lihaoyi

I am going to be working on adding support for some more settings such as jvmId and generating base trait feature.

What are your thoughts on supporting more plugins? We already have tests that require the following:

sbt-projectmatrix
sbt-play
ScalablyTypedConverterGenSourcePlugin

@ajaychandran Let's focus on getting this PR merged first before we start looking at follow ups, I think there's enough work here to keep you occupied for at least a few more days, and I don't want us to get distracted and end up leaving work half finished

import java.io.PrintStream
import scala.collection.immutable.SortedSet

trait PackageWriter {
lihaoyi (Member) commented Jul 30, 2025:

We should re-use as much of BuildGenUtil as possible here, which already has logic for rendering def mvnDeps, def moduleDeps, etc. We should not need to re-implement it from scratch and end up maintaining two different implementations. We should only need to implement the parts that BuildGenUtil does not already implement.

ajaychandran (Contributor Author):

I was hoping to port the other importers to the new design. This might eliminate BuildGenUtil entirely.
Can we discuss this?

val platform = crossVersion match {
case v: librarymanagement.Full if v.prefix.nonEmpty => "::"
case v: librarymanagement.Binary if v.prefix.nonEmpty => "::"
case v: librarymanagement.For2_13Use3 if v.prefix.nonEmpty => "_3" + "::"
lihaoyi (Member):

Why not just "_3::"? Same for the "_2.13" + "::" below

@ajaychandran ajaychandran changed the title from "WIP: Improve support for importing projects to Mill from SBT, Gradle, Maven" to "WIP: Proposal for new SBT importer implementation supporting cross projects using a revised core design" on Jul 30, 2025
ajaychandran (Contributor Author):

The PR title and description have been updated.

lihaoyi (Member) commented Jul 30, 2025

@ajaychandran Thanks for updating the PR description. Given this is a cleanroom implementation that aims to completely replace the old one, the PR description will need to be a lot more detailed to explain what the differences between the old implementation and new one are, to justify why it needs to be replaced and why the new implementation is better.

As mentioned over email, this will need to be integrated into the existing utils and the Maven/Gradle implementations before merging. While it's fine for you to experiment with a new design in a separate module, it needs to be fully integrated into the existing code before merging, so we can see the results of your new "better" architecture in realistic scenarios with the same requirements. The old SBT implementation should be fully removed, and whatever code can be shared between the new SBT implementation and the Maven/Gradle implementations should be consolidated.

- Removed custom Gradle models
- Added standalone tests for Gradle
- Separated meta-build to generate files under mill-build
@ajaychandran ajaychandran changed the title from "WIP: Proposal for simplifying build importer implementations" to "Simplified and improved build importer implementations" on Aug 31, 2025
@ajaychandran ajaychandran marked this pull request as ready for review August 31, 2025 19:42
@ajaychandran ajaychandran requested a review from lihaoyi August 31, 2025 19:45
lihaoyi (Member) commented Aug 31, 2025

@ajaychandran can you explain in more detail in the PR descriptions

  1. What the technical changes are and why they were necessary
  2. What the user-facing impact of the new approach is: what builds import now that didn't before?
  3. Any limitations of the new approach compared to the old one. Are there any regressions or open issues?
  4. How was this tested. Manually? Through some script or command? In CI?

@lefou could you help review this? I'll look at it too, but it's a big PR so would be good to get another pair of eyes on it

lihaoyi (Member) commented Aug 31, 2025

Added support for generating meta-build.

What scenarios would we need to generate a meta-build?

Comment on lines +9 to +11
> echo 21 > .mill-jvm-version # Repo needs Java >=21 to build and test

> ./mill init
lihaoyi (Member):

Do we no longer support --jvm-id?

ajaychandran (Contributor Author) commented Sep 1, 2025:

The conversion now auto-configures JavaHomeModule.jvmId for Gradle and Maven projects.
Based on the projects tested, JDK version enforcement is quite rare. It is much more common to ensure compatibility with javacOptions like -target.
The Netty project is an example where the version, 1.8, is enforced. But we cannot support this value due to #5782.

The option was also being used to configure the JVM version in the build header. We could rename it to --mill-jvm-version and use it for this purpose.
I am not convinced that the conversion should configure this property since, in the typical case, the preferred JDK is the one already installed.
Consider the Mockito example in 5-gradle-incomplete.

> git checkout v5.19.0 # multi-module Java project that requires Java 11

> echo "17" > .mill-jvm-version # Gradle version in use requires Java 17

Since init runs the Gradle daemon, we need at least version 17. Post conversion, JDK 11 would suffice (this won't be configured at the module level since it is not enforced in the build).
But why not use the latest version? The Mill project itself uses zulu:21.
The reason the version needs to be specified here is that we do not know the JDK version in the CI environment. We could move the value to .mill-jvm-version file to "hide" this from the user.

lihaoyi (Member):

Could we look at versions below 11 like 1.8 and generate mill-jvm-version: 11? That won't work for all projects, but it should hopefully work for many of them which aren't too picky about the version they are using. Then the user won't have to create the .mill-jvm-version file manually.

I think trying to specify a version as close as possible to that configured within their build via javacOptions sounds best. Not all projects work with newer versions, e.g. Scala, Kotlin, Spring, etc. all can potentially have problems if we pick a too-new version of the JVM to run them on
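The clamping idea above could look roughly like this (a hypothetical helper; the name, the minimum of 11, and the input format are all assumptions for illustration):

```scala
// Hypothetical helper: map a javac -target/-release value to a Mill JVM version,
// raising legacy values like "1.8" up to an assumed minimum of 11.
def millJvmVersion(javacTarget: String): String = {
  val n = javacTarget.stripPrefix("1.").toInt // "1.8" -> 8, "17" -> 17
  math.max(n, 11).toString
}
```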

ajaychandran (Contributor Author):

Then the user won't have to create the .mill-jvm-version file manually.

I think trying to specify a version as close as possible to that configured within their build via javacOptions sounds best.

But why should the conversion explicitly specify the .mill-jvm-version at all? Isn't the default Mill behavior of using the installed JDK sufficient for real world usage?

The conversion should be concerned with configuring the JVM only at the module level and only if it is strictly enforced in the Gradle / Maven build.
I don't think javacOptions should be used to determine a version since a low value is targeted in many cases. Downgrading the JDK will likely adversely affect build times and require a fix in the build file (assuming the cause for the slowdown is identified).

lihaoyi (Member):

In the Mill main branch, we need to opt into using the installed JDK via mill-jvm-version: system. But in general that is discouraged (hence it is no longer the default), so we shouldn't generate it on-import if possible.

Pulling a low Java version is fine as long as it works. People can tweak and adjust it later, but our goal for the import is "a version that is most likely to work while avoiding mill-jvm-version: system". So if there is a -target or -release flag we should use as close a version as possible, and if there is no flag we can leave it unspecified and Mill will default to Java 21.

ajaychandran (Contributor Author):

Reinstated mill-jvm-version in build header and added lower bound for Mill/module JVM version.

ajaychandran (Contributor Author):

Added support for generating meta-build.

What scenarios would we need to generate a meta-build?

Meta-build refers to the base traits and Deps object defined under the mill-build folder. These are generated by default now.
This can be disabled with the --no-meta-build option, resulting in build files that do not share anything.

Comment on lines +6 to +8
* - src/main/scala-2
* - src/main/scala-2.12
* - src/main/scala-2.13
lihaoyi (Member):

How do the -2, -2.12, -2.13 suffixes interact with the -jvm, -js, and -native suffixes? Can they be used together? I assume so, but we should mention how it works in the doc-comment.

ajaychandran (Contributor Author):

Updated Scaladocs.

Comment on lines 16 to 24
ProblemFilter.exclude[NewMixinForwarderProblem]("mill.javalib.MavenModule.sources"),
ProblemFilter.exclude[NewMixinForwarderProblem]("mill.javalib.MavenModule#MavenTests.sources"),
ProblemFilter.exclude[NewMixinForwarderProblem]("mill.scalalib.SbtModule.sources"),
ProblemFilter.exclude[ReversedMissingMethodProblem](
"mill.scalalib.SbtModule.mill$scalalib$SbtModule$$super$sourcesFolders"
),
ProblemFilter.exclude[NewMixinForwarderProblem]("mill.scalalib.SbtModule#SbtTests.sources"),
ProblemFilter.exclude[ReversedMissingMethodProblem](
"mill.scalalib.SbtModule#SbtTests.mill$scalalib$SbtModule$SbtTests$$super$sourcesFolders"
lihaoyi (Member):

We are not allowed to make these changes due to binary compatibility. You can normally work around this by adding your new code in helper methods that both the super-class and sub-class call

ajaychandran (Contributor Author):

We are overriding the sourcesFolders member here as per a suggestion made earlier.
I don't think we can work around this. Should the changes be rolled back?

lihaoyi (Member):

We are allowed to override sourceFolders, the only problem is we are not allowed to add a super call within the override. The solution is to add a private[mill] def sourceFolders0 task in the parent class and then call sourceFolders0 in both classes without using super; that should have the same semantics as you want but avoid breaking binary compatibility
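The pattern being suggested can be sketched in simplified form (illustrative names only; the real code would use `private[mill]` inside Mill's packages and Task-based members):

```scala
// Simplified sketch of the binary-compatibility workaround: the new logic lives in a
// helper method that both the parent trait and the subtrait call directly, so no
// override needs a `super` call that the old binary lacks.
trait MavenLikeModule {
  protected def sourcesFolders0: Seq[String] = Seq("src/main/java")
  def sourcesFolders: Seq[String] = sourcesFolders0
}

trait MavenLikeTests extends MavenLikeModule {
  // calls the helper instead of `super.sourcesFolders`
  override def sourcesFolders: Seq[String] = sourcesFolders0.map("test-" + _)
}
```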

ajaychandran (Contributor Author):

The suggested changes were implemented and the MiMa filters were eliminated. But this caused ./mill example.thirdparty[netty].native.daemon to fail with

/home/runner/work/mill/mill/out/example/thirdparty/netty/native/daemon/testForked.dest/sandbox/run-2/build.mill:80:10
[7498]   [error] 80 │  object test extends NettyBaseTestSuiteModule, MavenTests {
[7498]   [error]    │         ^
[7498]   [error]    │ error overriding method sourcesFolders0 in trait MavenModule of type => Seq[os.SubPath];
[7498]   [error]    │   method sourcesFolders0 in trait MavenTests of type => Seq[os.SubPath] module class test$ inherits conflicting members:
[7498]   [error]    │   method sourcesFolders0 in trait MavenModule of type => Seq[os.SubPath]  and
[7498]   [error]    │   method sourcesFolders0 in trait MavenTests of type => Seq[os.SubPath]
[7498]   [error]    │ (Note: this can be resolved by declaring an override in module class test$.)
[7498]   [warn] -- /home/runner/work/mill/mill/out/example/thirdparty/netty/native/daemon/testForked.dest/sandbox/run-2/build.mill:460:22
[7498]   [warn] 460 │    } finally server.destroyForcibly()
[7498]   [warn]     │                     ^^^^^^^^^^^^^^^
[7498]   [warn]     │ method destroyForcibly in class SubProcess is deprecated: Use destroy(shutdownGracePeriod = 0)
[7498]   [warn] one warning found
[7498]   [error] one error found
[7498]   1 tasks failed
[7498]   compile Compilation failed

I suppose this can be fixed by updating the test build definition or using a different name for one of the duplicates.

@lihaoyi Is there a recommended solution for such cases?

lihaoyi (Member):

Could we make MavenTests inherit from MavenModule? I think that should make it work by prioritizing the MavenTests version of the method. You may need to tweak some overrides in MavenTests to make that work and preserve the current behavior

lihaoyi (Member):

We can probably do the same with SbtTests and SbtModule

ajaychandran (Contributor Author):

Fixed. Thanks for the suggestions.

* - [[SbtPlatformModule.CrossTypePure]]
* - [[SbtPlatformModule.CrossTypeDummy]]
*/
trait SbtPlatformModule extends PlatformScalaModule with SbtModule { outer =>
lihaoyi (Member):

Could you explain in the doc-comment what this provides that the underlying PlatformScalaModule and SbtModule do not?

ajaychandran (Contributor Author):

Updated Scaladocs.

ajaychandran (Contributor Author):

@ajaychandran can you explain in more detail in the PR descriptions

1. What the technical changes are and why they were necessary

2. What the user-facing impact of the new approach is: what builds import now that didn't before?

3. Any limitations of the new approach compared to the old one. Are there any regressions or open issues?

4. How was this tested. Manually? Through some script or command? In CI?

The PR description has been updated.

lihaoyi (Member) commented Sep 2, 2025

@ajaychandran can you include in your description a summary of:

  • How the new data model handles cross-platform and cross-version Scala modules: how does it determine the Scala platforms and versions?
  • How does it model single-platform single-version Scala modules?
  • How does it handle Java modules, which cannot have platform or version?
  • What example projects did you use to test these various cases?

ajaychandran (Contributor Author):

@ajaychandran can you include in your description a summary of:

* How the new data model handles cross-platform and cross-version Scala modules: how does it determine the Scala platforms and versions?

* How does it model single-platform single-version Scala modules?

* How does it handle Java modules, which cannot have platform or version?

* What example projects did you use to test these various cases?

I have updated the description with the mechanism used.

The testing strategy is being reworked. Marked this as pending in the description.

@ajaychandran ajaychandran marked this pull request as draft September 3, 2025 13:08
ajaychandran (Contributor Author):

The PR has been marked as draft. The status will be updated once the pending tasks are completed.
