Cross-compiling the macrotask-executor to both JS/JVM would allow using it in shared code without having to make platform-specific imports. Possibly by using a standard/supplied ExecutionContext shim on the JVM side.
Activity
marcgrue commented on Dec 15, 2021
Workaround by using platform-specific imports:
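(The original snippet wasn't captured in this thread. Below is a minimal sketch of that workaround, assuming an sbt crossProject layout and a hypothetical `myapp.platform.Executor` object defined once per platform; the `org.scalajs.macrotaskexecutor` package name is taken from this library's published artifacts, so double-check it against the version you use.)

```scala
// shared/src/main/scala/myapp/SharedLogic.scala
package myapp

import scala.concurrent.Future
import myapp.platform.Executor.ec // resolves to a platform-specific implicit ExecutionContext

object SharedLogic {
  def compute(): Future[Int] = Future(21 * 2)
}

// jvm/src/main/scala/myapp/platform/Executor.scala
package myapp.platform

import scala.concurrent.ExecutionContext

object Executor {
  // On the JVM, delegate to the standard global context
  implicit val ec: ExecutionContext = ExecutionContext.global
}

// js/src/main/scala/myapp/platform/Executor.scala
package myapp.platform

import scala.concurrent.ExecutionContext
import org.scalajs.macrotaskexecutor.MacrotaskExecutor

object Executor {
  // On Scala.js, use the MacrotaskExecutor provided by this library
  implicit val ec: ExecutionContext = MacrotaskExecutor
}
```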
armanbilge commented on Dec 15, 2021
Thanks for reporting! Having cross-compiled lots of libraries for JVM/JS I completely understand the nuisance this creates.
However, shimming the `MacrotaskExecutor` for JVM ... would be really weird and even confusing. The concept of "macrotasks" and "microtasks" is unique to JS.

A slight rephrasing of this issue could be:

> Use the `MacrotaskExecutor` as the default implementation for `scala.concurrent.ExecutionContext.global` in Scala.js.

`scala.concurrent.ExecutionContext.global` was a JVM/JS cross-platform way to get an `ExecutionContext`. Except, that import is now deprecated as of Scala.js 1.8.0.

The explanation for why the `MacrotaskExecutor` cannot be used as the implementation for `scala.concurrent.ExecutionContext.global` is given here: http://www.scala-js.org/news/2021/12/10/announcing-scalajs-1.8.0/
marcgrue commented on Dec 15, 2021
Thanks for the explanation! When writing shared code, though, one needs both implementations anyway, and it would be nice to get them without workarounds.

Maybe the subject of micro/macrotasks is not that important from the user perspective and more of an implementation consideration? Would the weird/confusing thing about a shared API be that `setImmediate` and `setTimeout` are not relevant on the JVM?

armanbilge commented on Dec 15, 2021
> When writing shared code, though, one needs both implementations anyway, and it would be nice to get them without workarounds.

💯 agree. I think in an ideal world this would live in `scala.concurrent.ExecutionContext.global`, since that is a standard API that exists across all Scala platforms. In a less-than-ideal world, someone could create a cross-platform library, e.g. "scala-execution-contexts", that provides good defaults.

However, it is almost definitely out-of-scope for this project to do that. For example, what if someone opens an issue asking us to also shim this for Scala Native? Should we do that too, and why/why not?
sjrd commented on Dec 15, 2021
Scala Native is a good point. Perhaps this is a job for https://github.com/portable-scala, actually.
armanbilge commented on Dec 15, 2021
Exactly my thinking! :)
marcgrue commented on Dec 15, 2021
How many shims are we potentially talking about?
AlexITC commented on Dec 15, 2021
My side on this is that importing a specific `ExecutionContext` in library code is usually not ideal, because you remove the flexibility of providing your own. In my case, I write the shared code to depend on the `ExecutionContext`, just like the `Future` API does; this way, you can hook up the correct `ExecutionContext` in the application's entry point (macrotask-executor in JS, something else in JVM/Native).

Example:
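(The original example wasn't captured in this thread. Below is a minimal sketch of the pattern described above, using hypothetical `mylib.Service` and `Main` names; the shared code only demands an implicit `ExecutionContext`, and each platform's entry point decides which one to supply. The `org.scalajs.macrotaskexecutor` package name is again an assumption based on the library's published artifacts.)

```scala
// shared/src/main/scala/mylib/Service.scala
package mylib

import scala.concurrent.{ExecutionContext, Future}

object Service {
  // Library code takes the ExecutionContext as a parameter, just like the Future API
  def fetchGreeting(name: String)(implicit ec: ExecutionContext): Future[String] =
    Future(s"Hello, $name!")
}

// js/src/main/scala/mylib/Main.scala
package mylib

import scala.concurrent.ExecutionContext
import org.scalajs.macrotaskexecutor.MacrotaskExecutor

object Main extends App {
  // JS entry point: hook up the macrotask-executor
  implicit val ec: ExecutionContext = MacrotaskExecutor
  Service.fetchGreeting("Scala.js").foreach(println)
}

// jvm/src/main/scala/mylib/Main.scala
package mylib

import scala.concurrent.ExecutionContext

object Main extends App {
  // JVM entry point: hook up the standard global context (or a custom pool)
  implicit val ec: ExecutionContext = ExecutionContext.global
  Service.fetchGreeting("JVM").foreach(println)
}
```

Compared to the platform-specific-import workaround shown earlier, this keeps the shared library code completely agnostic about which executor runs it.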