[Bug] lfu cache error in 0.31.0
Bug report
Getting an lfu cache error in 0.31.0. The issue doesn't appear in 0.30.0. I believe one of these PRs broke it:
- https://github.com/graphprotocol/graph-node/pull/4624
- https://github.com/graphprotocol/graph-node/pull/4485
Relevant log output
Jul 04 10:21:19.414 INFO Done processing trigger, gas_used: 178730037, data_source: Keeper, handler: handleRewardsUpdated, total_ms: 228, transaction: 0x5aa6…e4e6, address: 0xdbb2…5752, signature: RewardsUpdated(indexed address,indexed bytes32,uint256,uint64,uint64,string), sgd: 9, subgraph_id: QmYas9XFKX2TdUPAYg4TUWP6YzxgVnpnrCT1oBKAWZuMA6, component: SubgraphInstanceManager
thread 'QmYas9XFKX2TdUPAYg4TUWP6YzxgVnpnrCT1oBKAWZuMA6[9]' panicked at 'empty cache but total_weight > max_weight', graph/src/util/lfu_cache.rs:282:18
stack backtrace:
0: rust_begin_unwind
1: core::panicking::panic_fmt
2: core::panicking::panic_display
3: core::panicking::panic_str
4: core::option::expect_failed
5: graph::util::lfu_cache::LfuCache<K,V>::evict_and_stats
6: graph::components::store::entity_cache::EntityCache::as_modifications
7: graph_core::subgraph::runner::SubgraphRunner<C,T>::process_block::{{closure}}
8: <graph_core::subgraph::runner::SubgraphRunner<C,T> as graph_core::subgraph::runner::StreamEventHandler<C>>::handle_process_block::{{closure}}
9: <tokio::task::unconstrained::Unconstrained<F> as core::future::future::Future>::poll
10: tokio::runtime::park::CachedParkThread::block_on
11: tokio::runtime::handle::Handle::block_on
12: graph::task_spawn::block_on
note: Some details are omitted, run with `RUST_BACKTRACE=full` for a verbose backtrace.
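For context, here is a minimal, hypothetical sketch (not the actual graph-node `LfuCache` code) of the kind of weight-tracked eviction loop that can hit this exact panic: if the recorded `total_weight` drifts above the real weight of the stored entries, the eviction loop keeps running after the cache is empty and the `expect` fails.

```rust
use std::collections::HashMap;

// Hypothetical, simplified weight-tracked cache; names and layout do not
// match graph-node's LfuCache, this only illustrates the failing invariant.
struct WeightedCache {
    entries: HashMap<String, (String, usize)>, // key -> (value, recorded weight)
    total_weight: usize,
}

impl WeightedCache {
    fn new() -> Self {
        Self { entries: HashMap::new(), total_weight: 0 }
    }

    fn insert(&mut self, key: String, value: String) {
        let weight = value.len();
        // Bug to illustrate: if the key already exists, the old weight is never
        // subtracted, so `total_weight` drifts above the sum of entry weights.
        self.entries.insert(key, (value, weight));
        self.total_weight += weight;
    }

    fn evict_until(&mut self, max_weight: usize) {
        while self.total_weight > max_weight {
            // Mirrors the failing expectation: if the accounting is correct, a
            // cache with excess weight must still hold at least one entry.
            let key = self
                .entries
                .keys()
                .next()
                .cloned()
                .expect("empty cache but total_weight > max_weight");
            let (_, weight) = self.entries.remove(&key).unwrap();
            self.total_weight -= weight;
        }
    }
}

fn main() {
    let mut cache = WeightedCache::new();
    cache.insert("a".to_string(), "first".to_string());
    // Overwrites the entry without correcting total_weight.
    cache.insert("a".to_string(), "second value".to_string());
    // Once the single entry is evicted, total_weight is still above the
    // limit and the loop panics, just like the log above.
    cache.evict_until(3);
}
```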
IPFS hash
No response
Subgraph name or link to explorer
No response
Some information to help us out
- [X] Tick this box if this bug is caused by a regression found in the latest release.
- [ ] Tick this box if this bug is specific to the hosted service.
- [X] I have searched the issue tracker to make sure this issue is not a duplicate.
OS information
Linux
Looks like this issue has been open for 6 months with no activity. Is it still relevant? If not, please remember to close it.