[BUG] eth_call discrepancy with public endpoint
Seid version
name: sei
server_name: <appd>
version: v6.0.2
commit: f17ec67f1a05c0fa1c3a978bad456edb362aa7c9
build_tags: netgo,
go: go version go1.21.13 linux/amd64
build_deps:
- cosmossdk.io/[email protected]
- filippo.io/[email protected]
- github.com/99designs/[email protected]
Chain ID
Mainnet (pacific-1)
Describe the bug
The same eth_call request returns an execution reverted error (-32000) on my node but succeeds against the public endpoint.
To Reproduce
Contract: 0x9851d050a3db9388628cd3da22944a10248a4f3b
{
"jsonrpc": "2.0",
"method": "eth_call",
"params": [
{
"to": "0x9851d050a3db9388628cd3da22944a10248a4f3b",
"data": "0xc87b56dd0000000000000000000000000000000000000000000000000000000000000296"
},
"latest"
],
"id": 1
}
This is the token whose tokenURI lookup returns execution reverted on my node but works with the public endpoint https://evm-rpc.sei-apis.com/: https://opensea.io/assets/sei/0x9851d050a3db9388628cd3da22944a10248a4f3b/662
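The discrepancy can be reproduced by sending the same payload to both endpoints. Below is a minimal sketch (Python 3, standard library only); the local URL http://localhost:8545 is an assumption and should be replaced with the node's actual EVM RPC address.

# Minimal repro sketch. The local RPC URL is an assumption; replace it with
# the EVM RPC address of your own node.
import json
import urllib.request

call = {
    "to": "0x9851d050a3db9388628cd3da22944a10248a4f3b",
    # tokenURI(uint256) selector 0xc87b56dd + token ID 662 (0x296), left-padded to 32 bytes
    "data": "0xc87b56dd" + hex(662)[2:].zfill(64),
}
payload = {"jsonrpc": "2.0", "method": "eth_call", "params": [call, "latest"], "id": 1}

endpoints = {
    "local": "http://localhost:8545",          # assumed local node address
    "public": "https://evm-rpc.sei-apis.com/",
}

for name, url in endpoints.items():
    req = urllib.request.Request(url, data=json.dumps(payload).encode(),
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # The failing node answers with {"error": {"code": -32000, ...}};
    # the public endpoint answers with a long hex "result".
    print(name, json.dumps(body)[:200])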
Expected behavior
Return the token URI.
Additional context
https://sei-evm-rpc.publicnode.com/ and https://1329.rpc.thirdweb.com/ return the same -32000 response.
Is there some query gas cap configured differently on the various RPCs? I believe the sei-apis.com RPC is configured with a 10M gas cap for queries.
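One way to test the gas-cap hypothesis is to ask the working public endpoint for a gas estimate of the same call and compare it with whatever cap the local node applies. A rough sketch (Python 3, standard library only; the 10M figure above is a guess, not a confirmed value):

# Hedged diagnostic: if the estimate from the working endpoint exceeds the
# local node's query gas cap, the -32000 would come from the call running
# out of gas rather than from a genuine revert in the contract.
import json
import urllib.request

def rpc(url, method, params):
    req = urllib.request.Request(
        url,
        data=json.dumps({"jsonrpc": "2.0", "method": method, "params": params, "id": 1}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

call = {
    "to": "0x9851d050a3db9388628cd3da22944a10248a4f3b",
    "data": "0xc87b56dd" + hex(662)[2:].zfill(64),  # tokenURI(662)
}
est = rpc("https://evm-rpc.sei-apis.com/", "eth_estimateGas", [call])
print("estimated gas:", int(est["result"], 16))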
This is a call to an ERC721 pointer contract whose base contract is a CW721. What's notable is that the CW721 metadata is fully on-chain: the nft_info query on the CW721 returns more than 30 KB of base64-encoded data.
When the ERC721 tokenURI call is executed, it makes a staticcall that appears to be significantly gas intensive. Furthermore, since the staticcall for nft_info returns more data than tokenURI needs, I imagine extraneous data gets loaded into memory, contributing to the high gas usage.
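To gauge how large that payload actually is, the string returned by a successful tokenURI call (the "result" hex from the public endpoint) can be decoded by hand. A minimal sketch; the variable body is assumed to hold a successful eth_call response as in the repro script above:

# Sketch: decode the single ABI-encoded string that a successful tokenURI
# call returns, to see how large the on-chain metadata payload is.
def decode_string_return(result_hex: str) -> str:
    raw = bytes.fromhex(result_hex.removeprefix("0x"))   # Python 3.9+
    offset = int.from_bytes(raw[0:32], "big")             # offset of the string head
    length = int.from_bytes(raw[offset:offset + 32], "big")
    return raw[offset + 32:offset + 32 + length].decode()

# Example usage with the "result" field from a successful eth_call response:
# uri = decode_string_return(body["result"])
# print(len(uri), uri[:80])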
Thank you for opening this issue and taking the time to share your thoughts. We'd love to keep moving this forward, but we need a bit more information from you. Please add any additional details within the next 2 days so we can continue collaborating on a solution together. If we don't hear back, the issue will close automatically, but you're always welcome to reopen it when you're ready.
We’re closing this issue for now as we haven’t received additional input. Please know that your effort is appreciated — and when you’re ready to revisit this, we’ll be here to pick it up again. Thank you for contributing and helping us improve Sei.