Pass the blob URL for preloads in WasmEMCCBenchmark. #74
Conversation
Force-pushed from a9e68ec to 21aa2dc
Ugh, unfortunately now the benchmark jetsams on iOS... I might need to get creative with a solution here...
@@ -34,14 +34,24 @@ function dumpFrame(vec) {

class Benchmark {
    isInstantiated = false;
    romBinary;

    async init() {
IIUC, this is essentially shared code with the getBinary function in WasmEMCCBenchmark below, just that there it's only used for initializing Module["wasmBinary"] = getBinary(...), and here it's for an arbitrary (non-Wasm) file that happens to be the emulator ROM, right? Can we unify this, ideally also with the mechanism for preloading blobs of JavaScript line items (ARES-6/Babylon below)?
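For illustration, a unified helper could look roughly like the following. This is only a sketch of the idea, not the actual patch: `loadResource` is a made-up name, and `isInBrowser`/`readFile` are assumed to exist as they do elsewhere in JetStreamDriver.js.

```js
// Sketch only: one shared loader for both Wasm binaries and arbitrary preloaded
// files (e.g. the emulator ROM), usable from browsers and shells alike.
async function loadResource(url, { binary = false } = {}) {
    if (!isInBrowser) {
        // Shell path: read straight from disk (exact readFile flavor depends on the shell).
        return readFile(url, binary ? "binary" : undefined);
    }
    const response = await fetch(url);
    if (!response.ok)
        throw new Error(`Failed to load ${url}`);
    return binary ? await response.arrayBuffer() : await response.text();
}
```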
Dart/benchmark.js (Outdated)
} catch {
    this.dart2wasmJsModule = await import("./Dart/build/flute.dart2wasm.mjs");
}
if (!isInBrowser)
So what happens in !isInBrowser environments, where the previous dynamic import threw an exception? In that case, wouldn't we simply not set this.dart2wasmJsModule and fail later? Or do we assume that we only reach here if isInBrowser == false? In that case, can we instead change this to an explicit assertion, e.g. console.assert(!isInBrowser, "relative imports should always succeed in browsers, this code is only for shells"); or something.
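Concretely, the suggested assertion would slot into the try/catch from the diff above roughly like this (a sketch; the browser-side import specifier is represented by a hypothetical placeholder):

```js
try {
    // Browser path: relative import of the dart2wasm module.
    // `dart2wasmBrowserSpecifier` is a placeholder for the specifier used in the diff.
    this.dart2wasmJsModule = await import(dart2wasmBrowserSpecifier);
} catch {
    // Suggested assertion: this fallback should only ever run in shells.
    console.assert(!isInBrowser,
        "relative imports should always succeed in browsers, this code is only for shells");
    this.dart2wasmJsModule = await import("./Dart/build/flute.dart2wasm.mjs");
}
```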
const promise = Promise.all(filePromises).then((texts) => {
    if (isInBrowser) {
        this._resourcesPromise = Promise.resolve();
I haven't fully understood the (somewhat intertwined, complex) preload / resource loading code. At a high level, why do we just early return here without having added anything to this.scripts?
Yeah, I don't fully understand it either. My guess is that various parts were edited by different people (or the same person who forgot how it worked) over time, so it has lots of now-dead code and/or duplicated logic. It seems like this could be greatly simplified, but I'd rather do that in a follow-up since this PR blocks "correct" results from the benchmark as a whole.
Sure, happy to have more cleanups later and land this first.
Filed: #86
    return this._resourcesPromise;
}

const filePromises = this.plan.files.map((file) => fileLoader.load(file));
I don't get this in conjunction with fileLoader._loadInternal: here, we never reach this code if isInBrowser, because of the early return above. But in fileLoader._loadInternal (from line 199; lines 199 to 225 in 6947a46):
async _loadInternal(url) {
    if (!isInBrowser)
        return Promise.resolve(readFile(url));

    let response;
    const tries = 3;
    while (tries--) {
        let hasError = false;
        try {
            response = await fetch(url);
        } catch (e) {
            hasError = true;
        }
        if (!hasError && response.ok)
            break;
        if (tries)
            continue;
        globalThis.allIsGood = false;
        throw new Error("Fetch failed");
    }
    if (url.indexOf(".js") !== -1)
        return response.text();
    else if (url.indexOf(".wasm") !== -1)
        return response.arrayBuffer();
    throw new Error("should not be reached!");
}
the very first thing it does is if (!isInBrowser) return Promise.resolve(readFile(url)). In other words, isn't this whole remaining code of fileLoader superfluous?
It does look like it.
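A follow-up along those lines could collapse _loadInternal to something like the sketch below (illustrative only, not part of this PR):

```js
// Sketch: with the browser early-return in the caller, fileLoader only ever
// runs in shells, so _loadInternal reduces to the readFile path.
async _loadInternal(url) {
    console.assert(!isInBrowser, "fileLoader should only be used from shells");
    return readFile(url);
}
```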
Alright: let's try to remember to remove _loadInternal in the follow-up cleanup.
JetStreamDriver.js (Outdated)
@@ -994,9 +993,16 @@ class Benchmark {
    if (this._resourcesPromise)
        return this._resourcesPromise;

    const filePromises = !isInBrowser ? this.plan.files.map((file) => fileLoader.load(file)) : [];
    this.preloads = [];
    this.blobs = [];
AFAICT, this.blobs is never read, can't this be removed?
}

const filePromises = this.plan.files.map((file) => fileLoader.load(file));
this._resourcesPromise = Promise.all(filePromises).then((texts) => {
    if (isInBrowser)
The if (isInBrowser) return; in lines 1006/1007 is dead code now, because of line 999, right?
JetStreamDriver.js (Outdated)
}

// FIXME: Why is this part of the runnerCode and not prerunCode?
// This is in runnerCode rather than prerunCode because prerunCode isn't currently structured to be async by default.
What is the difference between prerunCode() and Benchmark.init()? The latter is async / called with await for AsyncBenchmark, see line 1138 in 6947a46: await __benchmark.init(); Could one unify the two, or move this code into async init()?
I think the Benchmark.init is specific to a given benchmark.js file, but prerunCode is part of the driver's Benchmark variant, thus shared between all the line items using that variant.
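Roughly, the split looks like this (illustrative class names only, not the driver's actual layout):

```js
// Driver side (JetStreamDriver.js): prerunCode belongs to a Benchmark variant,
// so it is shared by every line item that uses that variant.
class DriverAsyncBenchmark {
    get prerunCode() {
        return "/* setup code prepended to every async line item */";
    }
}

// Workload side (e.g. 8bitbench/benchmark.js): each workload ships its own
// class with an async init() that the driver awaits (await __benchmark.init()).
class WorkloadBenchmark {
    async init() {
        // workload-specific setup, e.g. fetching its ROM or Wasm binary
    }
}
```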
Right, thanks.
JetStreamDriver.js (Outdated)
};
xhr.send(null);
async function getBinary(key, blobURL) {
    const response = await fetch(blobURL);
As I said above, this getBinary function is only used for the Wasm binary of Emscripten benchmarks, but could we use some more general mechanism from the JavaScript line items?
Done, although since it uses fetch, it only works for async benchmarks. There doesn't appear to be an easy way to switch the response type synchronously in Chrome/Firefox.
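For reference, a fetch-based helper along the lines of the diff above is inherently promise-returning, which is why it only fits AsyncBenchmark. A minimal sketch follows; the real helper's exact signature and return type may differ.

```js
async function getBinary(blobURLOrPath) {
    // fetch() accepts both blob: URLs and ordinary paths, but it has no
    // synchronous counterpart that yields an ArrayBuffer in current Chrome/Firefox.
    const response = await fetch(blobURLOrPath);
    return new Uint8Array(await response.arrayBuffer());
}
```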
Alright, so reminder for another follow-up cleanup (unless I'm missing some reason why it's not possible): let's make everything an AsyncBenchmark and remove Benchmark altogether.
Filed: #87
JetStreamDriver.js (Outdated)
for (let i = 0; i < keys.length; ++i) {
    str += `loadBlob("${keys[i]}", "${this.plan.preload[keys[i]]}", async () => {\n`;
for (let [ preloadKey, blobURLOrPath ] of this.preloads) {
    if (preloadKey == "wasmBinary") {
Can we get rid of this special casing of just the wasmBinary preloadKey? There are not that many Emscripten workloads, so I'd be happy to just add a line Module["wasmBinary"] = genericPreloadFunc(wasmBinary) to every one of those (for a little less "magic").
Done.
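For the record, what dropping the special case amounts to in each Emscripten workload's benchmark.js is roughly the following sketch; the helper and preload names mirror the discussion above rather than the exact patch.

```js
class Benchmark {
    async init() {
        // Explicitly wire the preloaded binary into Emscripten's Module instead of
        // relying on driver-side special casing of the "wasmBinary" preload key.
        // `wasmBinary` here stands for the preloaded blob URL/path handed to the workload.
        Module.wasmBinary = await getBinary(wasmBinary);
    }
}
```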
Thanks a lot for starting this, the preload/blob loading code is certainly ripe for a simplification/cleanup! (Unfortunately, I haven't fully understood all parts / why it's so complex, so some clarification questions in the review.)
Could you explain what the issue is? (Also TIL a new word "jetsams".)
Right now we pass the original path when loading preloads for WasmEMCCBenchmark. This means we might skip the cache and fetch the content from the network again. We don't want to do this because it means the OS might spin down the CPU, which can punish running faster. I also moved this logic to the `prerunCode` rather than adding it to the `runnerCode` since logically that makes more sense. As a drive-by change, this patch has the preloads tuples contain the path for CLI runs. This means we no longer have to duplicate the preload paths into the benchmark.js file. Additionally, remove a console.log of the test that just finished, which broke the dumpJSONResults option for the CLIs.
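The core idea, sketched below (not the driver's exact code, and `preloadAsBlobURL` is a made-up name): preload each file once, wrap the bytes in a Blob, and hand the workload a blob: URL so later loads are served from memory instead of going back to the network.

```js
// Fetch a preload once and return a blob: URL for it; subsequent fetches of
// that URL resolve from the in-memory Blob rather than the network.
async function preloadAsBlobURL(path) {
    const response = await fetch(path);
    const blob = await response.blob();
    return URL.createObjectURL(blob); // e.g. "blob:https://example.org/<uuid>"
}
```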
…ion to get the blob as an array buffer in modern browsers.
… that process I also had to change how 8bitbench loads the rom it's going to use.
There are now three helpers, `getBinary`/`getString`/`dynamicImport`, that help fetch resources from blobs/paths in the various `benchmark.js` `init` functions. All of the preloads in JetStream have been switched to using one of those, and thus to AsyncBenchmark, with the exception of WasmLegacyBenchmark (aka tfjs-wasm). Lastly, there's also no more special casing for the wasmBinary preload; instead it's set in each relevant `benchmark.js`'s `init` function.
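To make the division of labor concrete, a hypothetical usage of the three helpers (variable names invented, signatures simplified):

```js
// getBinary: binary resources, e.g. Wasm modules or the 8bitbench ROM.
const romBinary = await getBinary(romBlobURLOrPath);
// getString: text resources, e.g. preloaded JS sources for code-load items.
const source = await getString(sourceBlobURLOrPath);
// dynamicImport: ES modules, e.g. Dart's flute.dart2wasm.mjs.
const dart2wasmJsModule = await dynamicImport("./build/flute.dart2wasm.mjs");
```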
Force-pushed from 07c8db1 to 510e017
Nvm, I take this back, it must have been my setup. It seems to be working now.
iOS doesn't have paging to disk, so when a process uses too much memory (or the system is under memory pressure) the OS will just "jetsam" your process (aka send it a SIGKILL).
Thanks for the changes, LGTM.
…terationCode This makes it easier for Benchmark subclasses to also call. This fixes a bug in WebKit#74 which caused inspector code load benchmarks to not set up properly, ending the benchmark in 0ms. Also, use optional chaining for the call (and a few other similar optional helpers).