Open computational mathematics. Custom CUDA kernels. GPU clusters. AI-audited, not peer-reviewed.
Human-readable. Agent-consumable. Every result has structured metadata, JSON-LD, citation tags, and machine endpoints.
| Component | Location |
|---|---|
| Website | This repo — Astro + KaTeX, Cloudflare Pages |
| Experiments | cahlen/idontknow — CUDA kernels, results, reviews, research agent |
| MCP Server | workers/mcp/ — 23 tools, no auth, mcp.bigcompute.science |
| Datasets | Hugging Face — Zaremba, Kronecker, spectra, Ramanujan |
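The MCP server speaks JSON-RPC 2.0, so an agent can discover the tools without any auth. A minimal sketch of building a standard `tools/list` request — the endpoint URL is from the table above, but the exact transport (plain HTTP POST vs. streamable HTTP) is an assumption:

```python
import json

# Endpoint from the table above; whether it accepts a plain POST
# (rather than streamable HTTP / SSE) is an assumption.
MCP_ENDPOINT = "https://mcp.bigcompute.science"

def tools_list_request(request_id: int = 1) -> str:
    """Serialize a standard MCP 'tools/list' JSON-RPC 2.0 request."""
    payload = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/list",
        "params": {},
    }
    return json.dumps(payload)

# Sending it requires network, e.g. with urllib:
#   req = urllib.request.Request(
#       MCP_ENDPOINT, data=tools_list_request().encode(),
#       headers={"Content-Type": "application/json"})
#   print(urllib.request.urlopen(req).read())
```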
| Page | Purpose |
|---|---|
| /findings/ | 18 findings, grouped by cert level |
| /verification/ | Audit dashboard — 53 reviews, 7 models, 3 providers |
| /cite/ | BibTeX/APA citations for every finding |
| /interactive/ | Colab notebooks — free GPU compute |
| /about/ | Project history and how it was built |
| /llms.txt | Agent-consumable structured index |
| /meta.json | Machine-readable site metadata |
Every finding page emits:
- ScholarlyArticle JSON-LD (Google Scholar)
- Dataset JSON-LD (Google Dataset Search) — when finding has a dataset
- Highwire Press meta tags (`citation_title`, `citation_author`, `citation_date`)
- Dublin Core metadata (`DC.title`, `DC.creator`, `DC.subject`)
- IndexNow pinged on every deploy (Bing, Yandex, etc.)
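As a sketch, the emitted `<head>` of a finding page might look like the fragment below; titles, names, and dates are placeholder values, not taken from any real finding:

```html
<!-- Highwire Press tags (Google Scholar) -->
<meta name="citation_title" content="Example Finding Title" />
<meta name="citation_author" content="Example Author" />
<meta name="citation_date" content="2025/01/01" />

<!-- Dublin Core -->
<meta name="DC.title" content="Example Finding Title" />
<meta name="DC.creator" content="Example Author" />
<meta name="DC.subject" content="computational mathematics" />

<!-- ScholarlyArticle JSON-LD -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ScholarlyArticle",
  "headline": "Example Finding Title",
  "author": { "@type": "Person", "name": "Example Author" },
  "datePublished": "2025-01-01"
}
</script>
```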
All pages are data-driven from `src/data/certifications.json` (generated by `sync_website.py`):
- Verification page: dynamic cards with review ledger + issue tracker
- Finding badges: per-model verdicts, remediation status, commit links
- Changelog: auto-generated from git log
```sh
npm install
npm run build   # static output in dist/
npm run dev     # local dev at localhost:4321
```

Deployed via Cloudflare Pages; push to main auto-deploys.
- Astro + KaTeX + Cloudflare Pages
- MCP server: Cloudflare Worker (TypeScript)
- Data pipeline: `manifest.json` → `certifications.json` → `meta.json` → all pages
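The pipeline step can be sketched as a pure transformation over the manifest; the field names below are hypothetical, since the real schemas live in `sync_website.py`:

```python
import json

def build_certifications(manifest: dict) -> dict:
    """Hypothetical sketch of the manifest.json -> certifications.json step.
    Field names ('findings', 'cert_level') are illustrative, not the
    actual schema used by sync_website.py."""
    findings = manifest.get("findings", [])
    return {
        "count": len(findings),
        "findings": [
            {"id": f["id"], "cert_level": f.get("cert_level", "unreviewed")}
            for f in findings
        ],
    }

# Toy manifest standing in for the repo's real one.
manifest = {"findings": [{"id": "zaremba-01", "cert_level": "audited"}]}
certs = build_certifications(manifest)
print(json.dumps(certs))
```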
CC BY 4.0. Use it, build on it, cite bigcompute.science.