
perf(tree): add cross-block caching #13769

Draft · Rjected wants to merge 4 commits into main from dan/add-cross-block-caching
Conversation

@Rjected (Member) commented on Jan 10, 2025

This adds cross-block caching, just keeping the most recent cache in memory for reuse. The cached state provider has a save_caches method which applies state updates to the account and storage caches.
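For illustration only, here is a rough sketch of the shape this takes. The type and field names below are made-up stand-ins, not reth's actual types, and the real caches hold full account and storage state rather than toy values:

```rust
use std::collections::HashMap;

// Illustrative stand-in types; the real provider works with reth's account,
// storage, and execution-output types.
type Address = [u8; 20];
type StorageKey = [u8; 32];
type StorageValue = [u8; 32];
type BlockHash = [u8; 32];

/// State changes produced by executing one block.
#[derive(Default)]
struct StateUpdates {
    accounts: HashMap<Address, u128>,
    storage: HashMap<(Address, StorageKey), StorageValue>,
}

/// Account and storage caches carried over from the most recently executed block.
#[derive(Default)]
struct SavedCache {
    block: BlockHash,
    accounts: HashMap<Address, u128>,
    storage: HashMap<(Address, StorageKey), StorageValue>,
}

/// Provider that reads through the caches during execution and, once the
/// block is done, folds its state updates back into them.
struct CachedStateProvider {
    caches: SavedCache,
}

impl CachedStateProvider {
    fn with_caches(previous: SavedCache) -> Self {
        Self { caches: previous }
    }

    /// Apply the executed block's updates to the caches and return them,
    /// tagged with the block hash, so the next block can reuse them.
    fn save_caches(mut self, block: BlockHash, updates: &StateUpdates) -> SavedCache {
        for (address, account) in &updates.accounts {
            self.caches.accounts.insert(*address, *account);
        }
        for (slot, value) in &updates.storage {
            self.caches.storage.insert(*slot, *value);
        }
        self.caches.block = block;
        self.caches
    }
}
```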

When we implement prewarming, we will need to stop and clean up the prewarm threads before saving the caches. It's probably a good idea to do this anyway before the state root task starts, since we should be done prewarming by the time execution finishes.

ref #13713

@Rjected added the C-enhancement (New feature or request), C-perf (A change motivated by improving speed, memory usage or disk footprint), and A-execution (Related to the Execution and EVM) labels on Jan 10, 2025
@Rjected force-pushed the dan/add-cross-block-caching branch from 3507aed to 65428f5 on January 13, 2025 at 21:52
@Rjected force-pushed the dan/add-cross-block-caching branch from 65428f5 to a39ba9b on January 13, 2025 at 22:12
@mattsse (Collaborator) left a comment


Cool, this actually looks pretty simple.
Because this is a concurrent one, does this mean we can also populate the cache from anywhere?

Let's hope the metrics look good.

I also wonder if we should think about making these features dependent on certain core counts; I assume there's a point where this could become counterproductive if the node is run on a low-end machine (like a Raspberry Pi).

Comment on lines +2490 to +2493
let Ok(saved_cache) = state_provider.save_cache(sealed_block.hash(), &output.state) else {
    todo!("error bubbling for save_cache errors")
};
self.most_recent_cache = Some(saved_cache);
@mattsse (Collaborator)

Should we even do error handling here?
We could just reset to None?

@Rjected (Member, Author)

true, yeah let's just reset in this case
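That would collapse the quoted snippet above into something like this (a sketch of the change being discussed, not the final diff):

```rust
// If saving the caches fails, drop the cached state instead of bubbling the
// error, so the next block simply starts without a warm cache.
self.most_recent_cache = state_provider
    .save_cache(sealed_block.hash(), &output.state)
    .ok();
```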

@Rjected (Member, Author) commented on Jan 14, 2025

Because this is a concurrent one, does this mean we can also populate the cache from anywhere?

Yes! Although this does mean we need to be careful when applying the state updates: we need to make sure nothing is relying on the caches before we update them.
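As a toy illustration of that ordering concern (standard library only, not reth code): prewarm-style writers have to be stopped and joined before the block's state updates are applied to the shared cache.

```rust
use std::collections::HashMap;
use std::sync::{Arc, RwLock};
use std::thread;

fn main() {
    // Shared cache that concurrent tasks can populate from anywhere.
    let cache: Arc<RwLock<HashMap<u64, u64>>> = Arc::new(RwLock::new(HashMap::new()));

    // Prewarm-style threads populating the cache while a block executes.
    let prewarmers: Vec<_> = (0..4)
        .map(|i| {
            let cache = Arc::clone(&cache);
            thread::spawn(move || {
                cache.write().unwrap().insert(i, i * 10);
            })
        })
        .collect();

    // Stop and join the prewarm threads *before* applying the block's state
    // updates, so nothing is still relying on the caches while they change.
    for handle in prewarmers {
        handle.join().unwrap();
    }

    // Now it is safe to fold the executed block's updates into the cache.
    cache.write().unwrap().insert(0, 999);
}
```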

Labels
A-execution (Related to the Execution and EVM), C-enhancement (New feature or request), C-perf (A change motivated by improving speed, memory usage or disk footprint)

Projects
None yet

2 participants