
The generated jar isn't always self-contained #113

Open
thufschmitt opened this issue Nov 18, 2017 · 5 comments

@thufschmitt (Contributor)

I packaged an executable depending on libEGL with sparkle. Although the resulting jar runs locally, it fails on an Amazon EMR cluster with the error libEGL.so.1: cannot open shared object file: No such file or directory.
Installing the package providing this library (mesa-libEGL) on the server fixed that error, but there are other dependencies (such as libgbm) that I couldn't install, which makes running the program impossible.

All the required libraries are included in the haskell-app.zip archive, but for some reason beyond my knowledge of how the linker works, only some of them are actually loaded at runtime.
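
To see which of the bundled libraries the dynamic linker actually finds (and where it looks for them), glibc's LD_DEBUG tracing can help; a sketch, with app.jar standing in for the packaged jar:

$ LD_DEBUG=libs spark-submit --master 'local[1]' ./app.jar 2>&1 | grep 'find library'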

I'll provide a minimal repro and some logs ASAP.

@thufschmitt (Contributor, Author)

I managed to make a simple failing example (except that it fails to load libpixman instead of libEGL):

I created a branch of sparkle here where I added a dependency on cairo to the hello example, and:

  • From this branch, building the hello example and running it locally (with spark-submit) works (the build and run commands are sketched below).
  • If I run the same jar on another machine without the dependencies installed, it fails.
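
For reference, the repro commands look roughly like this; a sketch, where the exact stack/sparkle invocations are assumptions based on sparkle's README, and --nix applies when building with Nix:

$ stack --nix build sparkle-example-hello
$ stack --nix exec -- sparkle package sparkle-example-hello
$ spark-submit --master 'local[1]' ./sparkle-example-hello.jar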

I tried running it on the gettyimages/spark Docker image (which has Spark installed but not libpixman), and got:

$ docker run -i -t -v $(pwd)/sparkle-example-hello.jar:/sparkle-example-hello.jar gettyimages/spark spark-submit --master 'local[1]' /sparkle-example-hello.jar
Exception in thread "main" java.lang.UnsatisfiedLinkError: /tmp/sparkle-app-1156298747917222018/hsapp: libpixman-1.so.0: cannot open shared object file: No such file or directory
        at java.lang.ClassLoader$NativeLibrary.load(Native Method)
        at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1941)
        at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1824)
        at java.lang.Runtime.load0(Runtime.java:809)
        at java.lang.System.load(System.java:1086)
        at io.tweag.sparkle.SparkleBase.loadApplication(SparkleBase.java:61)
        at io.tweag.sparkle.SparkleBase.<clinit>(SparkleBase.java:28)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:348)
        at org.apache.spark.util.Utils$.classForName(Utils.scala:230)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:712)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

I checked the haskell-app.zip archive, and libpixman-1.so.0 is included alongside the others.
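
For the record, this can be verified with standard unzip tooling; a sketch, using the archive names from this thread:

$ unzip -o sparkle-example-hello.jar haskell-app.zip
$ unzip -l haskell-app.zip | grep pixman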

@mboes (Member) commented Nov 18, 2017

Wild guess: are you building on NixOS? That won't work, because the linker path is hard-coded to a non-standard location. Note that the .jar is not entirely hermetic. There are three things that come from the host:

  1. the linker
  2. libc.so
  3. libpthread.so
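
The hard-coded linker path is visible in the ELF headers of the packaged binary; a sketch, pointing readelf at the hsapp file that sparkle extracts to /tmp (the path is taken from the stack trace above):

$ readelf -l /tmp/sparkle-app-*/hsapp | grep interpreter
# A Nix-built binary typically requests an interpreter like
# /nix/store/<hash>-glibc-<version>/lib/ld-linux-x86-64.so.2
# rather than the standard /lib64/ld-linux-x86-64.so.2, which
# does not exist on non-Nix hosts.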

@thufschmitt (Contributor, Author)

I'll have to double-check, but I think I also saw failures with builds made outside NixOS (but still using Nix).

As a temporary workaround, do you think that installing Nix and populating the store with the relevant paths could be enough to make this work?

@thufschmitt (Contributor, Author)

Answering my own question: bind-mounting my Nix store into the Docker container allows the jar to work, so pulling in the dependencies via Nix seems to do the trick.
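
Concretely, the workaround amounts to adding a read-only bind mount of the store to the earlier docker invocation (a sketch, assuming a standard /nix/store location on the host):

$ docker run -i -t \
    -v /nix/store:/nix/store:ro \
    -v $(pwd)/sparkle-example-hello.jar:/sparkle-example-hello.jar \
    gettyimages/spark spark-submit --master 'local[1]' /sparkle-example-hello.jar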

@thufschmitt (Contributor, Author)

I confirm that building on a non-NixOS Docker image doesn't fix the problem.
(In any case, this is not top priority for me: the part of the program that needs these dependencies doesn't have to run through Spark; running everything at once is just a convenience.)
