# dasimmet/zig-git-crawler

Crawling and indexing git hosting sites into an sqlite database. [Website](https://dasimmet.gitlab.io/zig-git-crawler/)
Generates an sqlite database of git repositories with zig dependencies. Also includes a small web application to view the database contents.
Repositories are indexed from the supported hosting sites if they have the
topic `zig` or `zig-package` assigned.
Dependencies between those sites are resolved from the URLs in
`build.zig.zon` manifests, even if those topics aren't set.
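For illustration, dependency resolution keys off the `url` fields of a `build.zig.zon` manifest. A minimal sketch of such a manifest (the names and hash are placeholders, and the exact field set varies by Zig version):

```zig
.{
    .name = "my-app",
    .version = "0.1.0",
    .dependencies = .{
        // hypothetical dependency; the crawler resolves this URL
        // back to a repository on the matching hosting site
        .some_lib = .{
            .url = "https://github.com/example/some-lib/archive/refs/heads/master.tar.gz",
            .hash = "1220...", // placeholder hash
        },
    },
}
```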
Hosted on GitLab Pages: <https://dasimmet.gitlab.io/zig-git-crawler/>
```shell
# build the exe
zig build

# setup an API token for github:
export GITHUB_API_AUTH_TOKEN=********************

# crawl through all major git hosting sites, create the db
zig build run

# alternatively, download the db from the ci crawl process:
zig build download-db

# query the db
sqlite3 zig-out/www/zig-git.db "SELECT * FROM view_manifest_tree"
```
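The database can also be read programmatically. A minimal Python sketch; note that the column names used here (`repo`, `dependency`) are assumptions for illustration — inspect the real view with `.schema view_manifest_tree` first:

```python
import sqlite3


def manifest_edges(db_path: str) -> list[tuple[str, str]]:
    """Return (repo, dependency) pairs from view_manifest_tree.

    The column names are assumptions; adjust them to the actual schema.
    """
    con = sqlite3.connect(db_path)
    try:
        rows = con.execute(
            "SELECT repo, dependency FROM view_manifest_tree"
        ).fetchall()
    finally:
        con.close()
    return rows
```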
```shell
# build the webapp:
zig build web

# run the dev webserver:
zig build dev

# open the browser to view it: http://localhost:8080/www
```
Known limitations:

- Only repo and ref are shown in the details view.
- Dependency resolution does not take path dependencies into account, since
  the SQL query was easier to implement that way.
- Repositories are not removed from the index when they lose the `zig` and
  `zig-package` tags. New commits and manifests are fetched though.