An unofficial mirror of Coryn Club, built because Coryn Club can't handle much bandwidth. It is only for searching items; the information on each item is as close to Coryn's as we can get, but there is no visual display yet. We may add that once we have more time.
The data was assembled manually; I would have scraped it outright if that weren't against the rules.
The compile flag in the Dockerfile is only used for deployment on render.com and will probably break your build if you are not careful, so remove it when building elsewhere.
Scraped from Coryn with a delay of 9 seconds per page load to avoid hammering the site (effectively DDoSing it).
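The 9-second spacing described above could be enforced with a small rate limiter like the one below. This is a hedged sketch, not the project's actual scraper code; the class name and API are made up for illustration.

```python
import time

DELAY_SECONDS = 9  # matches the per-page delay described in this README


class RateLimiter:
    """Guarantees at least `delay` seconds between consecutive calls."""

    def __init__(self, delay=DELAY_SECONDS):
        self.delay = delay
        self._last = 0.0  # monotonic timestamp of the previous call

    def wait(self):
        """Block until enough time has passed since the last call."""
        elapsed = time.monotonic() - self._last
        if elapsed < self.delay:
            time.sleep(self.delay - elapsed)
        self._last = time.monotonic()
```

A scraping loop would then call `limiter.wait()` before each page fetch, keeping the traffic polite no matter how fast the rest of the loop runs.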
Run command
python3 ./server/server.py
Node version
node ./server/server.js
The CSS classes class="card-container" and class="card-container-1" were how we scraped the data from the website; the old Python scripts targeted those class names directly. Now we crawl properly through each document.
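For reference, the old class-name approach could be sketched with the standard library alone. This is an illustrative guess at the technique, not the repository's actual script; only the two class names come from this README.

```python
from html.parser import HTMLParser


class CardScraper(HTMLParser):
    """Collects the text inside elements whose class is one of the
    card-container variants this README mentions."""

    TARGETS = {"card-container", "card-container-1"}

    def __init__(self):
        super().__init__()
        self.depth = 0   # nesting depth while inside a matched card
        self.cards = []  # accumulated text, one entry per card

    def handle_starttag(self, tag, attrs):
        classes = (dict(attrs).get("class") or "").split()
        if self.depth or self.TARGETS.intersection(classes):
            if self.depth == 0:
                self.cards.append("")  # start a new card
            self.depth += 1

    def handle_endtag(self, tag):
        if self.depth:
            self.depth -= 1

    def handle_data(self, data):
        if self.depth and data.strip():
            self.cards[-1] += data.strip() + " "

    def results(self):
        return [card.strip() for card in self.cards]
```

Feeding a page's HTML to `CardScraper.feed()` and reading `results()` yields one text blob per item card, which is roughly what a class-name scrape gives you before any field parsing.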
Run all the scraping Python scripts, then run 0_find_import; use pip list (and ChatGPT) to build a requirements.txt if you want to deploy your own Python version.
For the JS side, do the same thing: ask ChatGPT which imports are used and reinstall those libraries.
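The import-discovery step above could also be done without ChatGPT. The sketch below lists the top-level modules a Python source file imports; it is a guess at what a script like 0_find_import might do, not its actual contents.

```python
import ast


def find_imports(source):
    """Return the sorted top-level module names imported by `source`
    (a minimal take on what an import-finding script could do)."""
    modules = set()
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            modules.update(alias.name.split(".")[0] for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module:
            modules.add(node.module.split(".")[0])
    return sorted(modules)
```

Running this over every scraping script and cross-checking the result against `pip list` gives you the package names to pin in requirements.txt.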
On deployment, use
pip install -r requirements.txt
Node version
npm install
The metadata files inside the database folder tell the server what to find inside the database; they are generated by 4_auto_metadata.py.