Fix #3661: Load Linker Reflectively #3805
Conversation
(force-pushed from fc5161d to 52b2b0b)
Note that fullOptJS currently does not work because there is a dependency conflict on Guava between the Closure Compiler and Jimfs. As a result of the class loading, the older version gets loaded and there are binary incompatibilities.
(force-pushed from 14efa2a to 10ca987)
Hah, now of course we have the problem that we load the wrong Scala version :) Alternatives:
@sjrd WDYT?
I'm confused by the fact that the sbt plugin still depends on the full linker, then loads the linker reflectively but from the same class loader. I was expecting something more along the lines of: the sbt plugin does not depend on the full linker, but rather downloads it separately from the classpath of the sbt plugin, then creates a class-loader to load the full linker inside it.
That would also solve things like #3193, and any other mismatch in dependencies on stuff that is also depended on by other sbt plugins.
To resolve the Guava thing and the Scala version thing, I think we need something similar to what we had done for the JettyClassLoader of PhantomJS in 0.6.x: a custom ClassLoader that loads java.*, scala.* and org.scalajs.linker.interface.* from its parent class loader, but forces everything else not to go through the parent class loader.
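A minimal sketch of the kind of filtering class loader described above, assuming the linker jars have been resolved into `urls`; the class name and the exact prefix list are illustrative, not the implementation that landed:

```scala
import java.net.{URL, URLClassLoader}

/** Loads the linker from `urls`, but resolves scala.* and
 *  org.scalajs.linker.interface.* through `parent` (the sbt class loader),
 *  so that the interface types are the same on both sides of the boundary.
 *  java.* is served by the bootstrap loader anyway; everything else comes
 *  from the linker jars only and never reaches the parent.
 */
class FilteringLinkerLoader(urls: Array[URL], parent: ClassLoader)
    extends URLClassLoader(urls, /* parent = */ null) {

  private val sharedPrefixes =
    List("java.", "scala.", "org.scalajs.linker.interface.")

  override protected def findClass(name: String): Class[_] = {
    if (sharedPrefixes.exists(p => name.startsWith(p)))
      parent.loadClass(name) // delegate the shared APIs to the sbt class loader
    else
      super.findClass(name)  // everything else must come from the linker jars
  }
}
```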
@@ -183,16 +202,21 @@ object ScalaJSPlugin extends AutoPlugin {

      scalaJSLinkerConfig := StandardConfig(),

      scalaJSLinkerImpl := LinkerImpl.default(getClass().getClassLoader()),
What's the point of loading reflectively if you are using the current class loader?
That is of course an even nicer solution. I was actually considering it, but didn't because it needs a custom configuration :) I'll do this.
This will solve the Guava thing, but not the Scala version thing: the linker interface depends on the Scala library, so we have to load at least those classes and their transitive dependencies from the parent class loader. Otherwise we'll have a class mismatch.
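A generic illustration (not the actual Scala.js code) of why those classes must come from the parent loader: the cast at the end only succeeds because both loaders resolve the interface to the same `Class` object.

```scala
import java.net.{URL, URLClassLoader}

// Stands in for org.scalajs.linker.interface.*: it must be visible to the
// parent loader, and the child loader must *not* load its own copy of it.
trait SharedInterface {
  def run(): Unit
}

object ReflectiveLoadSketch {
  /** `implJars` would be the separately resolved linker classpath and
   *  `implClassName` a concrete implementation living only in those jars;
   *  both are placeholders in this sketch.
   */
  def load(implJars: Array[URL], implClassName: String): SharedInterface = {
    val parent = getClass.getClassLoader
    val child = new URLClassLoader(implJars, parent)
    val clazz = child.loadClass(implClassName)
    val instance = clazz.getDeclaredConstructor().newInstance()
    // Works only because SharedInterface (and scala.*) come from `parent`
    // and are therefore the same classes on both sides. If the child loader
    // re-loaded them from its own jars, this cast would fail with a
    // ClassCastException or a LinkageError.
    instance.asInstanceOf[SharedInterface]
  }
}
```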
You don't need a custom configuration anymore with sbt 1.x. You can directly invoke the
Hum, yes, but as long as the linker interface you load uses a 2.12.x Scala version, it will correctly link with the Scala 2.12.y used by sbt.
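The sbt 1.x mechanism alluded to above is presumably the in-build dependency resolution API. A hedged sketch of resolving the linker jars from a task in a build.sbt, with placeholder coordinates (the real module name and version would be derived from the Scala.js version, and the exact `retrieve` overload is assumed here):

```scala
// Hypothetical task in a build.sbt, assuming sbt 1.x.
val fetchLinkerClasspath = taskKey[Seq[File]](
  "Resolves the Scala.js linker jars separately from the sbt plugin classpath")

fetchLinkerClasspath := {
  val log = streams.value.log
  val retrieveDir = streams.value.cacheDirectory / "scalajs-linker"
  // Placeholder coordinates; the plugin would pick the artifact matching
  // scalaJSVersion and the Scala 2.12.x line used for the linker.
  val linkerModule = "org.scala-js" % "scalajs-linker_2.12" % "1.0.0-M8"
  dependencyResolution.value
    .retrieve(linkerModule, None, retrieveDir, log) // None: no ScalaModuleInfo
    .fold(warning => throw warning.resolveException, files => files.distinct)
}
```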
(force-pushed from 8e66c55 to b7772c9)
@SethTisue, before I investigate the failures here further: could you have a look at the changes to how we choose the Scala version in this build and check whether it is community-build compatible? If not, how should I adapt it?
(force-pushed from a72b00a to e66371c)
offhand it looks okay to me, I'm not 100% sure, but we can always deal with it later
So the CI failures look like sbt is loading the wrong SJS compiler when compiling the
(force-pushed from 0129372 to d338ecc)
Self review.
project/Build.scala (outdated)
@@ -495,7 +506,7 @@ object Build {
        if (v < 8)
          throw new MessageOnlyException("This build requires JDK 8 or later. Aborting.")
        v
-     }
+     },
Spurious change.
project/Build.scala (outdated)
      organization := "org.scala-js",
      scalaVersion := "2.12.8",
Spurious change
@@ -57,13 +57,17 @@ object ScalaJSPlugin extends AutoPlugin {

    // All our public-facing keys
Should we make the internal keys private?
No, I'd rather not. In the end all keys are public in the sbt shell. Making them private in the code is a fallacy.
        KeyRanks.Invisible)

    val scalaJSLinkerImpl = TaskKey[LinkerImpl]("scalaJSLinkerImpl",
        "Factory for a Scala.js linker", KeyRanks.Invisible)
TODO: Describe this.
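For context, a hedged sketch of how this key could be wired once the linker classpath is resolved separately, combining `LinkerImpl.default` from the diff above with the hypothetical `FilteringLinkerLoader` and `fetchLinkerClasspath` sketched earlier in this thread; whether the final API keeps exactly this shape is an assumption:

```scala
scalaJSLinkerImpl := {
  // `fetchLinkerClasspath` and `FilteringLinkerLoader` are the illustrative
  // pieces sketched in earlier comments, not shipped API.
  val linkerJars = fetchLinkerClasspath.value
  val urls = linkerJars.map(_.toURI.toURL).toArray
  // Share only java.*, scala.* and the linker interface with the sbt loader;
  // Guava, Closure Compiler, etc. stay confined to the child loader.
  val loader = new FilteringLinkerLoader(urls, getClass.getClassLoader)
  LinkerImpl.default(loader)
}
```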
"Factory for a Scala.js linker", KeyRanks.Invisible) | ||
|
||
val scalaJSLinkerBox = SettingKey[CacheBox[ClearableLinker]]("scalaJSLinkerBox", | ||
"CacheBox for a Scala.js linker", KeyRanks.Invisible) |
Mark this as internal.
      globalIRCacheBox.foreach { globalIRCache =>
        logger.debug("Global IR cache stats: " + globalIRCache.stats.logLine)
      }
    }
I really don't like this. Can we find a better way to do this? Ideally after every task that uses the linker :)
      { () =>
        prev()
        ScalaJSPluginInternal.closeAllTestAdapters()
-       ScalaJSPluginInternal.globalIRCache.clearStats()
+       globalIRCacheBox.foreach(_.clearStats())
Would it be acceptable to log IR cache stats just here? That would get rid of the strange logger and also make sure nobody ever forgets it, at the cost of only logging once if multiple linking (or other IR-consuming) operations are executed in a single swoop.
I think that would be acceptable, yes.
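What the combined hook might then look like, assuming the snippet quoted above composes sbt's `onComplete` (a `() => Unit` setting) and reusing `globalIRCacheBox` and `stats.logLine` from the diffs above; the logger wiring is a guess, not the code that landed:

```scala
Global / onComplete := {
  val prev = (Global / onComplete).value
  val log = sLog.value // settings-level logger, so no ad-hoc logger is needed
  { () =>
    prev()
    ScalaJSPluginInternal.closeAllTestAdapters()
    globalIRCacheBox.foreach { cache =>
      log.debug("Global IR cache stats: " + cache.stats.logLine)
    }
    globalIRCacheBox.foreach(_.clearStats())
  }
}
```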
(force-pushed from 8da8c9f to 1918e78)
(force-pushed from 1918e78 to 538a317)
This is so that the main sbt-plugin-test does not have a direct dependency on the Linker anymore (see scala-js#3805 for details).
(force-pushed from 9ce8ade to 586d011)
Updated.
LGTM. Should we merge this now or wait until we've figured out the community build?
I guess that depends on @SethTisue's confidence level that we can make it work in this form? Personally I'm OK with waiting.
Right. In fact I realize that our solution to this does not transfer to #3817: we will need two levels of pinning there. The linker needs to be conditionally pinned; the linker private library needs to be always pinned. Another option is to create a project per major Scala version for all projects. It seems that would solve the version problem for us in all cases (including using scripted with different Scala versions). Minor versions could be selected with … WDYT?
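A purely illustrative sketch of the "one project per major Scala version" idea; the project names, versions, and shared-source layout are made up and not what the build ended up doing:

```scala
// Same linker sources, one sbt project per Scala major line, so the
// community build (or scripted) can pick whichever project matches its
// Scala version. Distinct target directories keep the outputs apart.
lazy val linker2_12 = project.in(file("linker"))
  .settings(
    scalaVersion := "2.12.8",
    target := baseDirectory.value / "target-2.12"
  )

lazy val linker2_13 = project.in(file("linker"))
  .settings(
    scalaVersion := "2.13.0",
    target := baseDirectory.value / "target-2.13"
  )
```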
(I am committed to helping find a way forward here, but I'm on semi-vacation for a while as I move from CA to NV, so my time is a bit scarce right now)
Using one project per major Scala version for all projects seems reasonable, indeed.
(force-pushed from 586d011 to 7b6852f)
(force-pushed from 823c616 to bea0a85)
This is much simpler and good enough. It also will not force us to expose a strange setting in the next commit.
(force-pushed from bea0a85 to df3916a)
This gives us the following advantages:
- Allows us to give resources to the linker that the build itself compiles first (unblocks scala-js#3537).
- Hides linker dependencies away from other sbt plugins, notably the Google Closure Compiler and Guava (fixes scala-js#3193).
(force-pushed from df3916a to 341bc00)
this is green in the 2.12 community build, since the Scala major version happens to align between the build and the meta-build, but in the 2.11 and 2.13 builds it fails, since the changes here pull in … I have some ideas about how to work around it. I'll mess with it
it's best if I don't have to fork your build, since then my fork would always be falling out of date. could you offer me a setting I could flip that would cause
The artifacts you'll find on Maven Central will not correspond to the version being tested, so that's not a viable path forward, I'm afraid. I see two possible paths:
I've opened a separate ticket, scala/community-build#1009, about whether it makes sense to continue using dbuild for Scala.js at all. Let's also discuss there.
I was thinking along these lines as well, but it doesn't seem easy to me. I made a few stabs at it already: my stab like your first suggestion didn't work, and in my stab like your second suggestion I got bogged down and it seemed like a time sink without success being assured. So I'd like to discuss on #1009 a bit first, then, depending on how that goes, perhaps return to this ticket to go into further details about the stick-with-dbuild option.