Fastest Mirror Finder
Finds the fastest mirror among Debian's mirrors and presents it to you through a REST API.
I got the idea from the book Hands-On RESTful Web Services with Go, but I changed the implementation entirely: I added gorilla/mux as the router, wrote tests, added Redis as a cache, added Swagger documentation, Dockerized the whole project, set up continuous integration with GitHub Actions, and more.
Production Environment (Using Docker)
If you have Docker and docker-compose installed, you can simply run (make sure you are in the root directory of this project, i.e. the folder that contains main.go):
docker-compose up
And that's it :) now you can serve the Swagger doc to see the available APIs.
Tests (Inside Docker container)
To run the tests inside the Docker container, run:
docker-compose run app go test -v ./...
API Documentation (Using Swagger)
- To generate the spec (optional; only needed after changing handlers or the router), make sure you have go-swagger installed, then run:
swagger generate spec -o ./docs/swagger.yaml --scan-models .
- To serve the spec (in other words, to browse the available APIs):
swagger serve ./docs/swagger.yaml
Development Environment (Without Docker)
- If you want to run the project without Docker, first of all you need to install Redis:
sudo apt install redis-server
- Install project's dependencies:
go mod download
- Change REDIS_ADDR's value to localhost in ./internal/api/database/redis.go
- Build the project:
go build
- Run the project:
./fastestMirrorFinder
Tests
To run the tests without Docker, run:
go test -v ./...
Note: If you don't want to run tests inside the Docker container, you can simply comment out RUN apk add go gcc
in the Dockerfile to make the Docker build even faster.
Project's Structure
Because I wrote this project in the first week of my journey learning Golang, I tried to follow the structure of a famous, large project as closely as I could, so I chose CockroachDB and dug deep into it. That was really helpful and formative, and I highly recommend it to anyone reading this :)
How does it work?
Really simple :)
- For the first request, it scrapes Debian's mirror list, extracts the relevant URLs using a regex, and saves them to Redis as a cache. (mirrors package)
(Instead of using a regex to parse the scraped page, we could use golang.org/x/net/html.)
- It sends a GET request to each URL and computes the latency, keeping the mirror with the lowest latency. (fastest_mirror package)
- It presents the result via REST APIs. (api package)
Useful Links