base: feature/forking
Anvil: Assethub Forking #399
Conversation
This includes a bunch of unrelated commits, making it very difficult to review. Please include only your own changes.
Force-pushed from a6cfea9 to c5fe40b (updated)
skunert
left a comment
Left some comments. I think it's going in the right direction, but the slot duration and para ID need to be read from the runtime.
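As a rough illustration of the first point (not code from this PR), here is a sketch of reading the slot duration through the Aura runtime API; the generic bounds and the authority type are assumptions about the client setup. The para ID is typically not exposed through a runtime API, but it can be read from the forked state via the `ParachainInfo` pallet's storage.

```rust
// Hedged sketch: fetch the slot duration from the runtime instead of hardcoding it.
// `client` is assumed to be a substrate client whose runtime implements AuraApi.
use sp_api::{ApiError, ProvideRuntimeApi};
use sp_consensus_aura::AuraApi;
use sp_runtime::traits::Block as BlockT;

fn slot_duration_ms<B, C>(client: &C, at: B::Hash) -> Result<u64, ApiError>
where
    B: BlockT,
    C: ProvideRuntimeApi<B>,
    C::Api: AuraApi<B, sp_consensus_aura::sr25519::AuthorityId>,
{
    // `slot_duration` returns a `SlotDuration`; `as_millis` converts it to u64 milliseconds.
    Ok(client.runtime_api().slot_duration(at)?.as_millis())
}
```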
Force-pushed from c5fe40b to 26e183c
Thanks, it looks like there is some progress in terms of mocking inherents.
Force-pushed from 26e183c to 8872d04
One other important thing to note for all the comments left so far: with the storage fetching in this PR, plus the adjustments I suggested in the comments, I've been able to consistently produce blocks 6 seconds apart from an asset-hub-westend network started with zombienet (config copied from ChainSafe#8 and modified slightly). I would say that shortly after addressing the comments we should look into testing based on the mining strategies discussed.
There is another concern (@alindima reminded me) related to coinbase. If we fork from a state with multiple Aura authorities, the coinbase address used to mine blocks will probably vary based on the slot. This should be tested and confirmed. I am not sure whether we should restrict blocks to being built by a single authority, though (which implies reducing the authority set to a single authority right after forking). Also, we must test that setting a certain coinbase address means only that coinbase address is used for every newly built block from that point on. LE: Standard
Update: Step 7 should use ws instead of HTTP, so:
iulianbarbu
left a comment
Very nice 🚀! Block production seems to work with <6s and >=6s block times right away on this branch. We can continue testing in the meantime, until the lazy-loading backend is available in the feature branch and you can rebase on top of it (and use it instead of the `BackendWithOverlay`), and then continue testing on a ready-to-be-merged version of this work.
In the meantime, I discovered a scenario that causes issues:
1. start the zombienet network as described in a previous comment
2. start anvil-polkadot with no mining like so: `cargo run -p anvil-polkadot -- --fork-url <url>`
3. run:
3.1 `cargo run -p cast -- rpc evm_mine`
3.2 `cargo run -p cast -- rpc evm_setTime 1764161024` (or a unix time in the past relative to now)
3.3 `cargo run -p cast -- rpc evm_mine` (should fail without the fix in the diff below)
3.4 `cargo run -p cast -- rpc evm_mine` (run it once more to ensure we can build on top of a block with a timestamp in the past, after the fix)
--- a/crates/anvil-polkadot/src/api_server/server.rs
+++ b/crates/anvil-polkadot/src/api_server/server.rs
@@ -66,6 +66,8 @@
sc_service::{InPoolTransaction, SpawnTaskHandle, TransactionPool},
sp_api::{Metadata as _, ProvideRuntimeApi},
sp_blockchain::Info,
+ sp_consensus_aura::AuraApi,
+ sp_consensus_babe::Slot,
sp_core::{self, Hasher, keccak_256},
sp_runtime::{FixedU128, traits::BlakeTwo256},
};
@@ -519,13 +521,24 @@ fn set_time(&self, timestamp: U256) -> Result<u64> {
}
let time = timestamp.to::<u64>();
let time_ms = time.saturating_mul(1000);
+
// Get the time for the last block.
let latest_block = self.latest_block();
let last_block_timestamp = self.backend.read_timestamp(latest_block)?;
+
// Inject the new time if the timestamp precedes last block time
if time_ms < last_block_timestamp {
self.backend.inject_timestamp(latest_block, time_ms);
+ let current_aura_slot = self.backend.read_aura_current_slot(latest_block)?;
+ let updated_aura_slot = time_ms
+ .saturating_div(self.client.runtime_api().slot_duration(latest_block)?.as_millis());
+ if current_aura_slot > updated_aura_slot {
+ self.backend.inject_aura_current_slot(latest_block, Slot::from(updated_aura_slot));
+ self.backend
+ .inject_relay_slot_info(latest_block, (Slot::from(updated_aura_slot), 0));
+ }
}
+
Ok(self.mining_engine.set_time(Duration::from_secs(time)))
}
@jimjbrettj, another issue you'll face when sending transactions to the node to test the auto mining is a panic which stops the node (it happens in our …). We agreed some time ago that modifying the chain id will not work when forking, but sending transactions should still work by detecting that we started the node from a fork and using the chain id from the fork. Unfortunately there isn't a runtime API to get it, and the chain id is hardcoded in asset-hub-westend to 420_420_421. I am not sure whether the same chain id will be used for Polkadot/Kusama or other Asset Hub chains, but for now we shouldn't block the testing on this and should just hardcode the chain id as well.
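A minimal sketch of the stopgap described above, assuming a hypothetical helper inside anvil-polkadot (the function and constant names are illustrative, not from the PR):

```rust
// Hypothetical sketch; names are illustrative. Chain id hardcoded in the
// asset-hub-westend runtime, per the comment above.
const ASSET_HUB_WESTEND_CHAIN_ID: u64 = 420_420_421;

/// Pick the chain id to report and sign with. When the node was started from a
/// fork, ignore any locally configured chain id and use the forked chain's id
/// so that submitted transactions validate against the forked runtime.
fn effective_chain_id(configured: Option<u64>, forking: bool) -> u64 {
    if forking {
        ASSET_HUB_WESTEND_CHAIN_ID
    } else {
        configured.unwrap_or(ASSET_HUB_WESTEND_CHAIN_ID)
    }
}
```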
Hm, it's part of the chain metadata, so it can be queried through subxt if not found in storage.
You mean the runtime metadata? It should be doable then to handle this inside the …
Yes, I think it should be very easy |
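For illustration, here is a sketch of reading a chain-id constant out of the runtime metadata with subxt, as suggested above; the pallet/constant names (`Revive`/`ChainId`) and the RPC URL are assumptions, not something confirmed in this PR.

```rust
// Hedged sketch: query a chain-id constant from the runtime metadata via subxt.
// The pallet and constant names below are assumptions for illustration only.
use subxt::{OnlineClient, PolkadotConfig};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Connect to the forked node's RPC endpoint (assumed URL).
    let api = OnlineClient::<PolkadotConfig>::from_url("ws://127.0.0.1:9944").await?;

    // Look the constant up dynamically, so no generated runtime types are needed.
    let address = subxt::dynamic::constant("Revive", "ChainId");
    let chain_id = api.constants().at(&address)?.to_value()?;

    println!("chain id from metadata: {chain_id}");
    Ok(())
}
```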
Small update: from the edge cases I ran, I am quite confident that this logic is pretty much there against our mining strategies, modulo the comments left so far. Here is a log excerpt showing a few things I tried. I tested all mining modes individually and also mixed them: auto mining, interval-based mining, and evm_mine RPC calls, while tweaking the time to the past and the future. Excerpt
Accidentally committed a large …
iulianbarbu
left a comment
This looks nice! Just the unwraps left to handle. Thanks!

Motivation
Draft PR to show what I currently have for mocking inherent data and digests to support Asset Hub forking.
Solution
PR Checklist