Peronlabs/Ethereum-Node-Crawler
Full Source Code on Peron GitHub:
Crawls the network and visualizes the collected data. This repository includes the backend, API, and frontend for the Ethereum network crawler.
The backend is based on the devp2p tool. It tries to connect to discovered nodes, fetches information about them, and builds a database. The API software reads the raw node database, filters and caches it, and serves it as an API. The frontend is a web application that reads data from the API and visualizes it as a dashboard.
Features:
Advanced filtering, which lets you add filters for a customized dashboard
Drill-down support, which lets you drill down into the data to find interesting trends
Network upgrade readiness overview
Responsive mobile design
The project is still at an early stage; contributions and testing are welcome. You can run each part of the software manually for development purposes, or deploy the whole production-ready stack with Docker.
Development
For local development with debugging, remoting, etc:
Copy .env to .env.local and replace the variables.
Then run npm install followed by npm start.
Run the tests to make sure the data processing is working correctly: npm test
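Put together, the local development steps above look like this (run from the directory containing the frontend's package.json):

```shell
# Set up local environment variables (edit the copy to match your setup)
cp .env .env.local

# Install dependencies and start the development server
npm install
npm start

# In another terminal: run the test suite
npm test
```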
Production
To deploy this web app:
Build the production bits with npm install followed by npm run build; the output will be located in the build folder.
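As a sketch, the production build and a hand-off to the web server root could look like this (the /var/www/node-crawler target path is an assumption, not from the source):

```shell
npm install
npm run build

# Copy the static build output to the web server's document root (assumed path)
cp -r build/. /var/www/node-crawler/
```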
Use your favorite web server; in this example we will be using nginx.
The nginx config for that website, which proxies the API to the /v1 endpoint, could look like this:
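A minimal sketch of such a config. The hostname, document root, and backend port below are assumptions for illustration, not values from the source:

```nginx
server {
    listen 80;
    server_name crawler.example.org;  # assumed hostname

    # Serve the frontend build output (assumed path)
    root /var/www/node-crawler;
    index index.html;

    location / {
        # Single-page app: fall back to index.html for client-side routes
        try_files $uri /index.html;
    }

    # Proxy API requests to the backend (assumed local port)
    location /v1 {
        proxy_pass http://127.0.0.1:10000;
        proxy_set_header Host $host;
    }
}
```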
The API uses two databases: one contains the raw data from the crawler, and the other is the API database. Data is moved regularly from the crawler DB to the API DB by this binary. Make sure to start the crawler before the API if you intend to run them together during development.
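For illustration only, an invocation might look like the following; the binary name and flag names here are assumptions, not taken from the source, so check the project's --help output for the real options:

```shell
# Hypothetical invocation: read the raw crawler DB, serve the filtered API DB
node-crawler-backend api \
    --crawler-db /etc/node-crawler-backend/nodetable \
    --api-db /etc/node-crawler-backend/api.db
```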
Dependencies
golang
sqlite3
Development
Production
Build the binary and copy it into /usr/bin
Make sure the database is in /etc/node-crawler-backend/nodetable
Create a systemd service in /etc/systemd/system/node-crawler-backend.service:
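A sketch of such a unit file; the ExecStart command line below is an assumption and should be adjusted to the binary's actual options:

```ini
[Unit]
Description=Node Crawler Backend API
After=network.target

[Service]
Type=simple
# Flags below are illustrative, not the binary's documented options
ExecStart=/usr/bin/node-crawler-backend api --crawler-db /etc/node-crawler-backend/nodetable
Restart=on-failure

[Install]
WantedBy=multi-user.target
```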
Then enable it and start it.
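With the unit file in place, enabling and starting it looks like:

```shell
# Reload systemd so it picks up the new unit file
systemctl daemon-reload
systemctl enable node-crawler-backend
systemctl start node-crawler-backend
```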
Dependencies
golang
sqlite3
Country location
Download the GeoLite2-Country.mmdb file from https://dev.maxmind.com/geoip/geolite2-free-geolocation-data?lang=en (you will have to create an account to get access to this file).
Development
Run the crawler using the crawl command.
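For example (the binary name and the --crawler-db flag here are assumptions; only the crawl subcommand itself comes from the source):

```shell
# Hypothetical development invocation of the crawler
./node-crawler crawl --crawler-db /var/lib/node-crawler/crawler.db
```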
Production
Build the crawler and copy the binary to /usr/bin.
Create a systemd service similarly to the API example above. In the executed command, override the default settings by pointing the crawler database to your chosen path and setting the period at which crawled nodes are written. If you want to record the country a node is in, you also have to specify the location of the GeoIP database.
No GeoIP
With GeoIP
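Sketches of the two ExecStart command lines; every flag name below is an assumption for illustration, not taken from the source:

```shell
# No GeoIP: crawler DB path and write period only (hypothetical flags)
/usr/bin/node-crawler crawl \
    --crawler-db /var/lib/node-crawler/crawler.db --timeout 5m

# With GeoIP: additionally point at the GeoLite2 database (hypothetical flag)
/usr/bin/node-crawler crawl \
    --crawler-db /var/lib/node-crawler/crawler.db --timeout 5m \
    --geoipdb /etc/node-crawler/GeoLite2-Country.mmdb
```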
A production build of the preconfigured software stack can easily be deployed with Docker. To do this, clone this repository and enter the docker directory.
Make sure you have Docker and docker-compose tools installed.
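From the docker directory, the stack can then be brought up in the background with:

```shell
cd docker
docker-compose up -d
```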