# dasimmet/zig-git-crawler
Crawling and indexing git hosting sites into an SQLite database. [Website](https://dasimmet.gitlab.io/zig-git-crawler/)
Generates an SQLite database of git repositories with Zig dependencies. Also includes a small web application to view the database contents.
Repositories are indexed from the supported git hosting sites if they have the `zig` or `zig-package` topic assigned. Dependencies between those sites are resolved via `build.zig.zon` urls, even if the topics aren't set.
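
For context, here is a minimal sketch of the kind of `build.zig.zon` manifest the crawler looks at. The package name, dependency name, url and hash are made-up placeholders, and the exact fields vary between zig versions:

```zig
.{
    // hypothetical package; only the url-style dependency matters for the crawler
    .name = "example_package",
    .version = "0.1.0",
    .dependencies = .{
        .some_dependency = .{
            // a url pointing at another git hosting site; the crawler follows
            // urls like this to resolve dependencies between repositories
            .url = "https://github.com/example/some-dependency/archive/refs/tags/v1.0.0.tar.gz",
            // placeholder hash; normally filled in by `zig fetch --save`
            .hash = "1220aaaaaaaa...",
        },
    },
    .paths = .{""},
}
```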
Hosted on GitLab Pages: https://dasimmet.gitlab.io/zig-git-crawler/
```sh
# build the exe
zig build
# setup an API token for github:
export GITHUB_API_AUTH_TOKEN=********************
# crawl through all major git hosting sites, create the db
zig build run
# query the db
sqlite3 zig-out/www/zig-git.db "SELECT * FROM view_manifest_tree"
# build the webapp:
zig build web
# run the dev webserver:
zig build dev
# open the browser to view it: http://localhost:8080/www
```
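
To explore the database beyond the example query above, the `sqlite3` CLI can list the generated tables and views. Only the db path and `view_manifest_tree` come from this README; the rest of the schema is whatever the crawler created:

```sh
# list all tables and views in the generated database
sqlite3 zig-out/www/zig-git.db ".tables"
# show how the dependency tree view is defined
sqlite3 zig-out/www/zig-git.db ".schema view_manifest_tree"
# count the rows in the dependency tree view
sqlite3 zig-out/www/zig-git.db "SELECT COUNT(*) FROM view_manifest_tree"
```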
- `repo` and `ref` are shown in the details view.
- The dependency tree does not take `path` dependencies into account, since the sql query was easier to implement that way.
- Repositories already in the db are not re-checked for the `zig` and `zig-package` tags; new commits and manifests are fetched though.