node-crawler
Attempts to crawl the Ethereum network for valid execution nodes and visualizes them in a nice web dashboard.
When browsing the list of nodes, the frontend currently only supports searching by client name, user string, or IP. It would be great to have more kinds of filters, such as version, dial...
Hello, is there any public API available to fetch nodes, rather than downloading the PG dump? cc @MariusVanDerWijden. Thanks.
The intent of this feature is to allow anyone to submit their node to our crawler so we can crawl it. This makes p2p discovery easier and faster. This will require: -...
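For illustration, a minimal sketch of what such a submission endpoint could look like, built on go-ethereum's `enode` package; the `/v1/submit-node` route, the queue, and the handler names are assumptions, not the crawler's actual design:

```go
package main

import (
	"fmt"
	"net/http"

	"github.com/ethereum/go-ethereum/p2p/enode"
)

// submitQueue is a hypothetical channel the crawler would drain to
// schedule freshly submitted nodes for crawling.
var submitQueue = make(chan *enode.Node, 1024)

// handleSubmitNode accepts an enode URL or ENR as a form value,
// validates it, and enqueues it for crawling.
func handleSubmitNode(rw http.ResponseWriter, r *http.Request) {
	raw := r.FormValue("enode")
	n, err := enode.Parse(enode.ValidSchemes, raw)
	if err != nil {
		http.Error(rw, fmt.Sprintf("invalid node record: %v", err), http.StatusBadRequest)
		return
	}
	select {
	case submitQueue <- n:
		rw.WriteHeader(http.StatusAccepted)
	default:
		http.Error(rw, "submission queue full, try again later", http.StatusServiceUnavailable)
	}
}

func main() {
	http.HandleFunc("/v1/submit-node", handleSubmitNode)
	http.ListenAndServe(":10000", nil)
}
```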
`lastStatusUpdate` was not being assigned before, so the remote nodeURL was queried every time instead of respecting the 15s interval. Date: 2023-11-11 11:14:31-07:00
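For context, a rough sketch of the interval check this fix implies, assuming a per-node `lastStatusUpdate` timestamp; the `nodeStatus` type and `maybeRefresh` helper are illustrative names, not the crawler's actual code:

```go
package main

import (
	"fmt"
	"time"
)

// statusInterval is the minimum time between remote nodeURL queries.
const statusInterval = 15 * time.Second

type nodeStatus struct {
	lastStatusUpdate time.Time
	url              string
}

// maybeRefresh queries the remote nodeURL only if the last update is older
// than statusInterval, and records the new update time. Without the final
// assignment, the check always falls through and the remote URL is queried
// on every call, which is the bug this change fixes.
func (s *nodeStatus) maybeRefresh(query func(string) error) error {
	if time.Since(s.lastStatusUpdate) < statusInterval {
		return nil // still fresh, skip the remote query
	}
	if err := query(s.url); err != nil {
		return err
	}
	s.lastStatusUpdate = time.Now() // the previously missing assignment
	return nil
}

func main() {
	s := &nodeStatus{url: "http://localhost:8545"}
	_ = s.maybeRefresh(func(u string) error {
		fmt.Println("querying", u)
		return nil
	})
}
```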
- Add the option to crawl the Holesky testnet (depends on #53)
- Update the geth library version
The API has some issues reading certain data from the crawler database. Not sure if this is an issue with the parser on the API side or with the formatting by the crawler. Example of error in...
`/docs/api.md` is outdated.

# Actual endpoints

```
defer wg.Done()

router := mux.NewRouter().StrictSlash(true)
router.HandleFunc("/", func(rw http.ResponseWriter, r *http.Request) {
	rw.Write([]byte("Hello"))
})
// Registered twice: the first registration only matches requests carrying a
// ?filter={filter} query parameter, the second catches requests without it.
router.HandleFunc("/v1/dashboard", a.handleDashboard).Queries("filter", "{filter}")
router.HandleFunc("/v1/dashboard", a.handleDashboard)
fmt.Println("Start serving on port 10000")
...
```
How can I compile node-crawler/crawler into a Linux binary? My system is macOS.
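Go cross-compiles out of the box, so on macOS you can typically produce a Linux binary with `GOOS=linux GOARCH=amd64 go build -o crawler-linux ./crawler` (the `./crawler` package path is an assumption; point it at the crawler's actual main package). If the build pulls in cgo dependencies such as `go-sqlite3`, plain cross-compilation will fail without a Linux C cross-toolchain, in which case building inside a Linux Docker container is the simpler route.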
Figure out what the deployment story is and how we can safely deploy a new version of the experience to production. Create documentation that explains this as well.
Right now we just add nodes to the UI; we never truncate/prune nodes that can't be discovered again. This needs proper design to figure out what it means to prune....
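For illustration only, one possible interpretation is age-based pruning: delete nodes not seen within some window. A minimal sketch, assuming a SQL node store with a hypothetical `nodes(last_seen)` column (the sqlite driver is used here just to make the example runnable); the real schema and the right definition of "prunable" are exactly what this issue asks to design:

```go
package main

import (
	"database/sql"
	"log"
	"time"

	_ "github.com/mattn/go-sqlite3"
)

// pruneStaleNodes deletes nodes whose last_seen timestamp is older than
// maxAge. Table and column names are hypothetical.
func pruneStaleNodes(db *sql.DB, maxAge time.Duration) (int64, error) {
	cutoff := time.Now().Add(-maxAge)
	res, err := db.Exec(`DELETE FROM nodes WHERE last_seen < ?`, cutoff)
	if err != nil {
		return 0, err
	}
	return res.RowsAffected()
}

func main() {
	db, err := sql.Open("sqlite3", "crawler.db")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// Prune anything not seen in the last 30 days (an arbitrary example window).
	n, err := pruneStaleNodes(db, 30*24*time.Hour)
	if err != nil {
		log.Fatal(err)
	}
	log.Printf("pruned %d stale nodes", n)
}
```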