
EPF - Update 1

This one is two weeks merged into one.

I did some refactoring of the Node Crawler.

I upgraded the dependencies and refactored the file structure to be more in line with a typical Go project, while trying not to change too much so the result would be easier to review. That last goal might not have materialized as much as I wanted.

Upgrading the dependencies was a bit more challenging than I expected. Old Go projects seem to have a lot of things that break over time. At least there were no language breakages; this was mainly packages changing things. The biggest change was updating the go-ethereum package. Still, this wasn't too difficult at the end of the day, it just took some time to understand what was going on.

The biggest change over this time was new versions of the data-exchange protocol. The project was created with capabilities set to eth/64, eth/65, eth/66, which isn't compatible with many current clients, which support eth/66 as a minimum version. In the meantime, we have had eth/67 and eth/68. Fortunately, this was pretty simple once I understood it: basically copying the file cmd/devp2p/internal/ethtest/types.go, which contained the updated types, and updating the capabilities supported by the crawler. One change I had to make was to decode the Disconnect case as a raw integer instead of RLP-decoding the message as a list. I'm not sure why some clients send this. I assume it's an old protocol.
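To illustrate the Disconnect quirk, here is a minimal sketch of the fallback logic. This is not the crawler's actual code (which uses go-ethereum's rlp package); it hand-decodes just enough RLP to show the two encodings seen in the wild: a proper list wrapping the reason, and a bare integer.

```go
package main

import "fmt"

// decodeDisconnectReason handles both Disconnect payload encodings:
// an RLP list containing the reason code, and a raw integer.
// Hand-rolled sketch; real code would use go-ethereum's rlp package.
func decodeDisconnectReason(payload []byte) (uint8, error) {
	if len(payload) == 0 {
		return 0, fmt.Errorf("empty payload")
	}
	b := payload[0]
	switch {
	case b <= 0x7f:
		// Single byte below 0x80 is the integer itself: the raw-int case.
		return b, nil
	case b == 0x80:
		// RLP encoding of the integer zero (empty string).
		return 0, nil
	case b >= 0xc1 && b <= 0xf7 && len(payload) >= 2:
		// Short RLP list header: recurse into the list payload.
		return decodeDisconnectReason(payload[1:])
	default:
		return 0, fmt.Errorf("unexpected leading byte 0x%x", b)
	}
}

func main() {
	r1, _ := decodeDisconnectReason([]byte{0xc1, 0x04}) // list-encoded: [0x04]
	r2, _ := decodeDisconnectReason([]byte{0x04})       // bare integer
	fmt.Println(r1, r2)
}
```

Reason code 0x04 is "too many peers" in the devp2p wire protocol, which is a common reply to a crawler.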

The CLI library, urfave/cli, needed to be upgraded from v1 to v2. This was pretty simple: just update the imports and change some structs to pointers.

I also updated the Dockerfiles and docker-compose so everything works out of the box with the code from the repo. The original Dockerfiles were cloning the repo in the build phase, because they lived in a subdirectory of the repo, meaning local changes wouldn't be picked up as expected. Building from the local source seems much easier to work with.

I opened a PR with these changes: https://github.com/ethereum/node-crawler/pull/40

In the meantime, I also worked on creating a way to deploy the project using NixOS. This was pretty interesting as it's not something I've done before, and I learned unexpected things, like more about systemd, mainly DynamicUser.

I've also created a PR with these changes: https://github.com/ethereum/node-crawler/pull/43, but it needs the previous PR to be merged first.

I've deployed the current state of my changes to the project to https://node-crawler.angaz.io/ and I will keep this updated as I make changes which affect the crawler, API, or frontend.

As next steps, I plan to add a client-info subcommand to the crawler so we can test the GetClientInfo function. One open issue is to find out why the crawler can't connect to certain clients; hopefully this subcommand will help us debug that. It will also help me build the new capability I want to add: letting you register your own node in the database so it can be crawled, and you can see whether it's properly accessible from the public internet.