Compare commits

82 commits: doot ... 085ade75bf
| SHA1 |
|---|
| 085ade75bf |
| dae0882e0f |
| d0f50141c7 |
| b59842899b |
| 9553e77b2f |
| 2ae714db48 |
| efaf2fb169 |
| dd6eeb6e40 |
| 00f8d707d8 |
| cf0f66452d |
| 5e2acb3ae9 |
| e66b675979 |
| 9847d6422e |
| c8b1f124f2 |
| 9b6e29a15c |
| b55a96c51c |
| 800f42c9bb |
| a7635c62d3 |
| 99bf0f6c5f |
| b1e75bd91f |
| 46fc66971d |
| d54aa6009a |
| 68e9facdc7 |
| 0b311678c6 |
| 652ac886a4 |
| be055ae98c |
| b1f94b990c |
| 2f90b03e6b |
| e9abe472ff |
| 18c291d37a |
| a6aa73bb21 |
| fba1eab648 |
| c142b4da3d |
| fdc5b12a61 |
| 34e5512a8d |
| 9bc0eb1ce5 |
| 1721a2d885 |
| 850e27a8e2 |
| f437dd4271 |
| ea918ff778 |
| 986a30735a |
| 1b17f0bf60 |
| ed0d715b18 |
| f18b57bc0d |
| a2c05e5c97 |
| a6fb8ab43a |
| 2bf7c6837b |
| c19496f3da |
| 80bfaca041 |
| 0da1c405d1 |
| 45e1967b74 |
| 13c1eac9b7 |
| c023481303 |
| 53956a0be1 |
| 6b274dea52 |
| 3e0b269f57 |
| 1113658d9b |
| 11c96416e8 |
| 1e69486ae5 |
| 7b0e58d8bd |
| 55d8f22816 |
| 912ed8fae2 |
| 4f833202d1 |
| 8b8163399c |
| 3f47565426 |
| abd414f692 |
| 11c884be63 |
| 719d99e2c0 |
| 06e072b050 |
| 689c523e0f |
| 7486949ff9 |
| 1bfa26a7d6 |
| 233208c910 |
| cb27948649 |
| 2981d4ccc2 |
| d27f874483 |
| 2cb1368440 |
| 7744786431 |
| f679f970f7 |
| 391d5a1768 |
| c92cbf8abc |
| 44f9b4eb74 |
`.dockerignore` (17 changes; Normal file → Executable file)

```
@@ -1,6 +1,15 @@
react_OLD
.git
.gitignore
.env
.venv
.vscode
.git
.git*
__pycache__
.claude
CLAUDE.md
README.md
STATUS_MONITOR_README.md
Dockerfile
docker-compose.yml
notes.txt
react_OLD
__pycache__
*.pyc
```
`.gitattributes` (0 changes; vendored; Normal file → Executable file)
`.gitconfig` (2 changes; Normal file)

```
@@ -0,0 +1,2 @@
[core]
filemode = false
```
`.gitignore` (8 changes; vendored; Normal file → Executable file)

```
@@ -2,4 +2,10 @@
__pycache__
notes.txt
react_OLD
envs.py
envs.py
.env
status_history.json

.claude
CLAUDE.md
.aider*
```
`.vscode/launch.json` (0 changes; vendored; Normal file → Executable file)

`.vscode/settings.json` (0 changes; vendored; Normal file → Executable file)
`Dockerfile` (34 changes; Normal file → Executable file)

```
@@ -1,34 +1,10 @@
FROM ubuntu:lunar
FROM python:3.10-bullseye
LABEL maintainer="Andrew Simonson <asimonson1125@gmail.com>"

ENV DEBIAN_FRONTEND noninteractive
WORKDIR /app

RUN apt-get update
RUN apt-get install -y python3-pip nginx gunicorn supervisor
COPY src/ .

# Setup flask application
RUN mkdir -p /deploy/app
COPY src /deploy/app
RUN pip install -r /deploy/app/requirements.txt --break-system-packages
RUN pip install --no-cache-dir -r requirements.txt

# Setup nginx
RUN rm /etc/nginx/sites-enabled/default
COPY flask.conf /etc/nginx/sites-available/
RUN ln -s /etc/nginx/sites-available/flask.conf /etc/nginx/sites-enabled/flask.conf && \
    echo "daemon off;" >> /etc/nginx/nginx.conf

# Setup supervisord
RUN mkdir -p /var/log/supervisor
COPY supervisord.conf /etc/supervisor/conf.d/supervisord.conf
COPY gunicorn.conf /etc/supervisor/conf.d/gunicorn.conf

# Permissions
# RUN adduser --disabled-password --gecos '' supervisor && \
RUN chmod -R 777 /var/* && \
    chown -R root /var/*

# Entrypoint
USER root

# Start processes
CMD ["/usr/bin/supervisord"]
CMD [ "gunicorn", "--bind", "0.0.0.0:8080", "app:app"]
```
`README.md` (2 changes; Normal file → Executable file)

```
@@ -7,3 +7,5 @@ So people can see how excellent my coding standards are.
* Viruses: not included

You gotta uhh `pip3 install -r requirements.txt` and `python3 app.py` that thing

Docker compose configured to expose at `localhost:8080`
```
`STATUS_MONITOR_README.md` (138 changes; Normal file)

@@ -0,0 +1,138 @@

# Service Status Monitor

## Overview
Server-side monitoring system that checks the availability of asimonson.com services every 2 hours and provides uptime statistics.

## Architecture

### Backend Components

#### 1. `monitor.py` - Service Monitoring Module
- **Purpose**: Performs automated health checks on all services
- **Check Interval**: Every 2 hours (7200 seconds)
- **Services Monitored**:
  - asimonson.com
  - files.asimonson.com
  - git.asimonson.com
  - pass.asimonson.com
  - ssh.asimonson.com

**Features**:
- Tracks response times and HTTP status codes
- Calculates uptime percentages for multiple time periods (24h, 7d, 30d, all-time)
- Persists data to PostgreSQL (`service_checks` table) via the `DATABASE_URL` env var
- Gracefully degrades when no database is configured (local dev)
- Runs in a background thread

#### 2. `app.py` - Flask Integration
- **New API Endpoint**: `/api/status`
  - Returns current status for all services
  - Includes uptime statistics
  - Provides last check and next check times
- **Auto-start**: Monitoring begins when the Flask app starts

### Frontend Components

#### 1. `templates/status.html` - Status Page Template
- Displays real-time service status
- Shows uptime percentages (24h, 7d, 30d, all-time)
- Displays response times and status codes
- Shows total number of checks performed
- Manual refresh button
- Auto-refreshes every 5 minutes

#### 2. `static/js/status.js` - Frontend Logic
- Fetches status data from the `/api/status` API
- Updates the UI with service status and uptime
- Handles error states gracefully
- Auto-refreshes every 5 minutes

#### 3. `static/css/App.css` - Styling
- Color-coded status indicators:
  - Green: Operational
  - Yellow: Degraded/Timeout
  - Red: Offline
- Responsive grid layout
- Dark theme matching the existing site design

## Data Storage

Check history is stored in a PostgreSQL `service_checks` table. The connection is configured via the `DATABASE_URL` environment variable (e.g. `postgresql://user:pass@host:5432/dbname`).

```sql
CREATE TABLE service_checks (
    id SERIAL PRIMARY KEY,
    service_id VARCHAR(50) NOT NULL,
    timestamp TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    status VARCHAR(20) NOT NULL,
    response_time INTEGER,
    status_code INTEGER,
    error TEXT
);
```

The table and index are created automatically on startup. If `DATABASE_URL` is not set, the monitor runs without persistence (useful for local development).

## Status Types

- **online**: HTTP status 2xx-4xx, service responding
- **degraded**: HTTP status 5xx or slow response
- **timeout**: Request exceeded the timeout limit (10 seconds)
- **offline**: Unable to reach the service
- **unknown**: No checks performed yet
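The mapping from HTTP result to status, following the logic of `check_service()` in `src/monitor.py` later in this diff, can be sketched as:

```python
def classify(status_code=None, timed_out=False):
    """Status mapping as in src/monitor.py's check_service():
    2xx-4xx count as online (a 4xx still proves the service is up),
    5xx is degraded, a timeout is its own state, and an unreachable
    service is offline."""
    if timed_out:
        return 'timeout'
    if status_code is None:
        return 'offline'
    if 200 <= status_code < 500:
        return 'online'
    return 'degraded'
```

For example, `classify(404)` still yields `online`, since a client error proves the server itself answered.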
## Uptime Calculation

Uptime percentage = (number of online checks / total checks) × 100

Calculated for:
- Last 24 hours
- Last 7 days
- Last 30 days
- All-time (since monitoring began)
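As a worked form of the formula above (with the two-decimal rounding and the no-data `None` used by `calculate_uptime` in `src/monitor.py`):

```python
def uptime_percentage(online_checks, total_checks):
    """Uptime = (online checks / total checks) x 100, rounded to two
    decimals as in src/monitor.py; None when no checks exist yet."""
    if total_checks == 0:
        return None
    return round((online_checks / total_checks) * 100, 2)
```

So 11 online results out of 12 checks reports as 91.67% uptime.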
## Usage

### Starting the Server
```bash
cd src
python3 app.py
```

The monitoring will start automatically and perform an initial check immediately, then every 2 hours thereafter.

### Accessing the Status Page
Navigate to: `https://asimonson.com/status`

### API Access
Direct API access: `https://asimonson.com/api/status`

Returns JSON with current status and uptime statistics for all services.
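A minimal consumer sketch of that JSON, using the field names produced by `get_status_summary()` in `src/monitor.py` (a `services` list plus `last_check`/`next_check`); the `sample` payload here is illustrative, not real data:

```python
def summarize(payload):
    """Reduce an /api/status payload to one line per service.
    Field names match get_status_summary() in src/monitor.py."""
    lines = []
    for svc in payload['services']:
        uptime = svc['uptime']['24h']
        uptime_str = f"{uptime}%" if uptime is not None else "n/a"
        lines.append(f"{svc['name']}: {svc['status']} (24h uptime {uptime_str})")
    return lines

# In practice the payload would come from
# requests.get('https://asimonson.com/api/status').json()
sample = {
    'last_check': '2024-01-01T00:00:00',
    'next_check': None,
    'services': [
        {'name': 'asimonson.com', 'status': 'online',
         'uptime': {'24h': 99.5}},
    ],
}
print(summarize(sample))
```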
## Configuration

To modify monitoring behavior, edit `src/monitor.py`:

```python
# Change check interval (in seconds)
CHECK_INTERVAL = 7200  # 2 hours

# Modify service list
SERVICES = [
    {
        'id': 'main',
        'name': 'asimonson.com',
        'url': 'https://asimonson.com',
        'timeout': 10  # seconds
    },
    # Add more services here
]
```

## Notes

- First deployment will show limited uptime data until enough checks accumulate
- Historical data is preserved across server restarts (stored in PostgreSQL)
- The page auto-refreshes every 5 minutes to show the latest server data
- A manual refresh button is available for immediate updates
- All checks are performed server-side (no client-side CORS issues)
`docker-compose.yml` (32 changes; Executable file)

```
@@ -0,0 +1,32 @@
services:
  portfolio:
    image: 'asimonson1125/portfolio'
    build:
      context: ./
      dockerfile: Dockerfile
    restart: 'no'
    ports:
      - 8080:8080
    environment:
      DATABASE_URL: postgresql://portfolio:portfolio@db:5432/portfolio
    depends_on:
      db:
        condition: service_healthy

  db:
    image: postgres:16-alpine
    restart: 'no'
    environment:
      POSTGRES_USER: portfolio
      POSTGRES_PASSWORD: portfolio
      POSTGRES_DB: portfolio
    volumes:
      - pgdata:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U portfolio"]
      interval: 5s
      timeout: 3s
      retries: 5

volumes:
  pgdata:
```
`flask.conf` (24 changes)

```
@@ -1,24 +0,0 @@
server {
    listen 8080;
    server_name www.asimonson.com;
    return 301 https://asimonson.com$request_uri;
}
server {
    listen 8080;
    server_name asimonson.com;

    gzip on;
    gzip_types text/plain text/javascript text/css;
    gunzip on;

    add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;
    add_header X-Content-Type-Options 'nosniff';
    add_header X-Frame-Options 'SAMEORIGIN';

    location / {
        proxy_pass http://localhost:5000/;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }

}
```
```
@@ -1,3 +0,0 @@
[program:gunicorn]
command=/usr/bin/gunicorn -k geventwebsocket.gunicorn.workers.GeventWebSocketWorker app:app -b localhost:5000
directory=/deploy/app
```
`src/app.py` (170 changes; Normal file → Executable file)

```
@@ -1,34 +1,89 @@
import flask
from flask_minify import Minify
import json
import os
import hashlib
import werkzeug.exceptions as HTTPerror
import requests
from config import *
from monitor import monitor, SERVICES

proj = json.load(open("./static/json/projects.json", "r"))
books = json.load(open("./static/json/books.json", "r"))
skillList = json.load(open("./static/json/skills.json", "r"))
timeline = json.load(open("./static/json/timeline.json", "r"))
pages = json.load(open("./static/json/pages.json", "r"))
pages['about']['skillList'] = skillList
pages['about']['timeline'] = timeline
app = flask.Flask(__name__)

# Compute content hashes for static file fingerprinting
static_file_hashes = {}
for dirpath, _, filenames in os.walk(app.static_folder):
    for filename in filenames:
        filepath = os.path.join(dirpath, filename)
        relative = os.path.relpath(filepath, app.static_folder)
        with open(filepath, 'rb') as f:
            static_file_hashes[relative] = hashlib.md5(f.read()).hexdigest()[:8]

@app.context_processor
def override_url_for():
    def versioned_url_for(endpoint, **values):
        if endpoint == 'static':
            filename = values.get('filename')
            if filename and filename in static_file_hashes:
                values['v'] = static_file_hashes[filename]
        return flask.url_for(endpoint, **values)
    return dict(url_for=versioned_url_for)

# Add security and caching headers
@app.after_request
def add_security_headers(response):
    """Add security and performance headers to all responses"""
    # Security headers
    response.headers['X-Content-Type-Options'] = 'nosniff'
    response.headers['X-Frame-Options'] = 'SAMEORIGIN'
    response.headers['X-XSS-Protection'] = '1; mode=block'
    response.headers['Referrer-Policy'] = 'strict-origin-when-cross-origin'

    # Cache control for static assets
    if flask.request.path.startswith('/static/'):
        response.headers['Cache-Control'] = 'public, max-age=31536000, immutable'
    elif flask.request.path in ['/sitemap.xml', '/robots.txt']:
        response.headers['Cache-Control'] = 'public, max-age=86400'
    else:
        response.headers['Cache-Control'] = 'no-cache, must-revalidate'

    return response


def load_json(path):
    with open(path, "r") as f:
        return json.load(f)

proj = load_json("./static/json/projects.json")
books = load_json("./static/json/books.json")
skillList = load_json("./static/json/skills.json")
timeline = load_json("./static/json/timeline.json")
pages = load_json("./static/json/pages.json")

pages['projects']['skillList'] = skillList
# pages['about']['timeline'] = timeline
pages['projects']['projects'] = proj
pages['home']['books'] = books
pages['books']['books'] = books
pages['status']['services'] = SERVICES

app = flask.Flask(__name__)
@app.route('/api/status')
def api_status():
    """API endpoint for service status"""
    return flask.jsonify(monitor.get_status_summary())

@app.route('/api/goto/')
@app.route('/api/goto/<location>')
def goto(location='home'):
    if location not in pages:
        flask.abort(404)
    pagevars = pages[location]
    page = None
    try:
        page = flask.render_template(pagevars["template"], var=pagevars)
    except Exception as e:
        # raise e
        e = HTTPerror.InternalServerError(None, e)
        page = page404(e)
    except Exception:
        e = HTTPerror.InternalServerError()
        page = handle_http_error(e)
    return [pagevars, page]

def funcGen(pagename, pages):
@@ -37,63 +92,59 @@ def funcGen(pagename, pages):
        return flask.render_template('header.html', var=pages[pagename])
    except Exception:
        e = HTTPerror.InternalServerError()
        return page404(e)
        print(e)
        return handle_http_error(e)
    return dynamicRule

for i in pages:
    func = funcGen(i, pages)
    app.add_url_rule(pages[i]['canonical'], i, func)


# for i in pages:
#     exec(f"@app.route(pages['{i}']['canonical'])\ndef {i}(): return flask.render_template('header.html', var=pages['{i}'])")


@app.route("/resume")
@app.route("/Resume.pdf")
@app.route("/Resume_Simonson_Andrew.pdf")
def resume():
    return flask.send_file("./static/Resume.pdf")
    return flask.send_file("./static/Resume_Simonson_Andrew.pdf")

@app.route("/hotspots")
def hotspotsRIT():
    url = HotspotsURL
    if flask.request.args.get("legend") == "false":
        url += "?legend=false"
    pagevars = {
        "template": "iframe.html",
        "title": f"Hotspots @ RIT",
        "description": "Hotspots @ RIT by Andrew Simonson",
        "canonical": "/hotspots",
    }
    return flask.render_template("iframe.html", url=url, var=pagevars)

@app.route("/hotspots/<path>")
def hotspotsProxy(path):
    return requests.get(f"{HotspotsURL}/{path}").content

@app.errorhandler(Exception)
def page404(e):
@app.errorhandler(HTTPerror.HTTPException)
def handle_http_error(e):
    eCode = e.code
    message = e.description
    try:
        message = e.length
    finally:
        pagevars = {
            "template": "error.html",
            "title": f"{eCode} - Simonson",
            "description": "Error on Andrew Simonson's Digital Portfolio",
            "canonical": "404",
        }
        return (
            flask.render_template(
                "header.html",
                var=pagevars,
                error=eCode,
                message=message,
                title=f"{eCode} - Simonson Portfolio",
            ),
            eCode,
        )
    pagevars = {
        "template": "error.html",
        "title": f"{eCode} - Simonson",
        "description": "Error on Andrew Simonson's Digital Portfolio",
        "canonical": f"/{eCode}",
    }
    return (
        flask.render_template(
            "header.html",
            var=pagevars,
            error=eCode,
            message=message,
            title=f"{eCode} - Simonson Portfolio",
        ),
        eCode,
    )

@app.errorhandler(Exception)
def handle_generic_error(e):
    pagevars = {
        "template": "error.html",
        "title": "500 - Simonson",
        "description": "Error on Andrew Simonson's Digital Portfolio",
        "canonical": "/500",
    }
    return (
        flask.render_template(
            "header.html",
            var=pagevars,
            error=500,
            message="Internal Server Error",
            title="500 - Simonson Portfolio",
        ),
        500,
    )


@app.route("/sitemap.xml")
@@ -106,6 +157,7 @@ if __name__ == "__main__":
    # import sass

    # sass.compile(dirname=("static/scss", "static/css"), output_style="compressed")
    app.run()
    app.run(debug=False)
else:
    Minify(app=app, html=True, js=True, cssless=True)
    monitor.start_monitoring()
```
`src/config.py` (4 changes; Normal file → Executable file)

```
@@ -1,8 +1,6 @@
from os import environ as env
# automatically updates some dev envs. need to remove for production.
try:
    __import__('envs.py')
    __import__('envs')
except ImportError:
    pass

HotspotsURL = env.get('HotspotsURL', 'https://asimonson.com/hotspots')
```
`src/monitor.py` (387 changes; Normal file)

```
@@ -0,0 +1,387 @@
"""
Service monitoring module
Checks service availability and tracks uptime statistics
"""
import os
import requests
import time
from concurrent.futures import ThreadPoolExecutor
from datetime import datetime, timedelta
from threading import Thread, Lock

import psycopg2

# Service configuration
SERVICES = [
    {
        'id': 'main',
        'name': 'asimonson.com',
        'url': 'https://asimonson.com',
        'timeout': 10
    },
    {
        'id': 'files',
        'name': 'files.asimonson.com',
        'url': 'https://files.asimonson.com',
        'timeout': 10
    },
    {
        'id': 'git',
        'name': 'git.asimonson.com',
        'url': 'https://git.asimonson.com',
        'timeout': 10
    }
]

# Check interval: 1 min
CHECK_INTERVAL = 60

# Retention: 90 days (quarter year)
RETENTION_DAYS = 90
CLEANUP_INTERVAL = 86400  # 24 hours

DATABASE_URL = os.environ.get('DATABASE_URL')

# Expected columns (besides id) — name: SQL type
_EXPECTED_COLUMNS = {
    'service_id': 'VARCHAR(50) NOT NULL',
    'timestamp': 'TIMESTAMPTZ NOT NULL DEFAULT NOW()',
    'status': 'VARCHAR(20) NOT NULL',
    'response_time': 'INTEGER',
    'status_code': 'INTEGER',
    'error': 'TEXT',
}


class ServiceMonitor:
    def __init__(self):
        self.lock = Lock()
        # Lightweight in-memory cache of latest status per service
        self._current = {}
        for service in SERVICES:
            self._current[service['id']] = {
                'name': service['name'],
                'url': service['url'],
                'status': 'unknown',
                'response_time': None,
                'status_code': None,
                'last_online': None,
            }
        self._last_check = None
        self._ensure_schema()

    # ── database helpers ──────────────────────────────────────────

    @staticmethod
    def _get_conn():
        """Return a new psycopg2 connection, or None if DATABASE_URL is unset."""
        if not DATABASE_URL:
            return None
        return psycopg2.connect(DATABASE_URL)

    def _ensure_schema(self):
        """Create the service_checks table (and index) if needed, then
        reconcile columns with _EXPECTED_COLUMNS."""
        if not DATABASE_URL:
            print("DATABASE_URL not set — running without persistence")
            return

        # Retry connection in case DB is still starting (e.g. Docker)
        conn = None
        for attempt in range(5):
            try:
                conn = psycopg2.connect(DATABASE_URL)
                break
            except psycopg2.OperationalError:
                if attempt < 4:
                    print(f"Database not ready, retrying in 2s (attempt {attempt + 1}/5)...")
                    time.sleep(2)
                else:
                    print("Could not connect to database — running without persistence")
                    return
        try:
            with conn, conn.cursor() as cur:
                cur.execute("""
                    CREATE TABLE IF NOT EXISTS service_checks (
                        id SERIAL PRIMARY KEY,
                        service_id VARCHAR(50) NOT NULL,
                        timestamp TIMESTAMPTZ NOT NULL DEFAULT NOW(),
                        status VARCHAR(20) NOT NULL,
                        response_time INTEGER,
                        status_code INTEGER,
                        error TEXT
                    );
                """)
                cur.execute("""
                    CREATE INDEX IF NOT EXISTS idx_service_checks_service_timestamp
                    ON service_checks (service_id, timestamp DESC);
                """)

                # Introspect existing columns
                cur.execute("""
                    SELECT column_name
                    FROM information_schema.columns
                    WHERE table_name = 'service_checks'
                """)
                existing = {row[0] for row in cur.fetchall()}

                # Add missing columns
                for col, col_type in _EXPECTED_COLUMNS.items():
                    if col not in existing:
                        # Strip NOT NULL / DEFAULT for ALTER ADD (can't enforce
                        # NOT NULL on existing rows without a default)
                        bare_type = col_type.split('NOT NULL')[0].split('DEFAULT')[0].strip()
                        cur.execute(
                            f'ALTER TABLE service_checks ADD COLUMN {col} {bare_type}'
                        )
                        print(f"Added column {col} to service_checks")

                # Drop unexpected columns (besides 'id')
                expected_names = set(_EXPECTED_COLUMNS) | {'id'}
                for col in existing - expected_names:
                    cur.execute(
                        f'ALTER TABLE service_checks DROP COLUMN {col}'
                    )
                    print(f"Dropped column {col} from service_checks")

            print("Database schema OK")
        finally:
            conn.close()

    def _insert_check(self, service_id, result):
        """Insert a single check result into the database."""
        conn = self._get_conn()
        if conn is None:
            return
        try:
            with conn, conn.cursor() as cur:
                cur.execute(
                    """INSERT INTO service_checks
                       (service_id, timestamp, status, response_time, status_code, error)
                       VALUES (%s, %s, %s, %s, %s, %s)""",
                    (
                        service_id,
                        result['timestamp'],
                        result['status'],
                        result.get('response_time'),
                        result.get('status_code'),
                        result.get('error'),
                    ),
                )
        finally:
            conn.close()

    # ── service checks ────────────────────────────────────────────

    def check_service(self, service):
        """Check a single service and return status"""
        start_time = time.time()
        result = {
            'timestamp': datetime.now().isoformat(),
            'status': 'offline',
            'response_time': None,
            'status_code': None
        }

        try:
            response = requests.head(
                service['url'],
                timeout=service['timeout'],
                allow_redirects=True
            )

            elapsed = int((time.time() - start_time) * 1000)  # ms

            result['response_time'] = elapsed
            result['status_code'] = response.status_code

            # Consider 2xx and 3xx as online
            if 200 <= response.status_code < 400:
                result['status'] = 'online'
            elif 400 <= response.status_code < 500:
                # Client errors might still mean service is up
                result['status'] = 'online'
            else:
                result['status'] = 'degraded'

        except requests.exceptions.Timeout:
            result['status'] = 'timeout'
            result['response_time'] = service['timeout'] * 1000
        except Exception as e:
            result['status'] = 'offline'
            result['error'] = str(e)

        return result

    def check_all_services(self):
        """Check all services and update status data"""
        print(f"[{datetime.now().strftime('%Y-%m-%d %H:%M:%S')}] Checking all services...")

        # Perform all network checks concurrently and OUTSIDE the lock
        results = {}
        with ThreadPoolExecutor(max_workers=len(SERVICES)) as executor:
            futures = {executor.submit(self.check_service, s): s for s in SERVICES}
            for future in futures:
                service = futures[future]
                result = future.result()
                results[service['id']] = result
                print(f"  {service['name']}: {result['status']} ({result['response_time']}ms)")

        # Persist to database (outside lock — DB has its own concurrency)
        for service_id, result in results.items():
            self._insert_check(service_id, result)

        # Update lightweight in-memory cache under lock
        with self.lock:
            for service in SERVICES:
                result = results[service['id']]
                cached = self._current[service['id']]
                cached['status'] = result['status']
                cached['response_time'] = result['response_time']
                cached['status_code'] = result['status_code']
                if result['status'] == 'online':
                    cached['last_online'] = result['timestamp']
            self._last_check = datetime.now().isoformat()

    # ── uptime calculations ───────────────────────────────────────

    def _calculate_uptime_unlocked(self, service_id, hours=None):
        """Calculate uptime percentage for a service by querying the DB."""
        conn = self._get_conn()
        if conn is None:
            return None
        try:
            with conn.cursor() as cur:
                if hours:
                    cutoff = datetime.now() - timedelta(hours=hours)
                    cur.execute(
                        """SELECT
                               COUNT(*) FILTER (WHERE status = 'online'),
                               COUNT(*)
                           FROM service_checks
                           WHERE service_id = %s AND timestamp > %s""",
                        (service_id, cutoff),
                    )
                else:
                    cur.execute(
                        """SELECT
                               COUNT(*) FILTER (WHERE status = 'online'),
                               COUNT(*)
                           FROM service_checks
                           WHERE service_id = %s""",
                        (service_id,),
                    )

                online_count, total_count = cur.fetchone()

                if total_count == 0:
                    return None

                # Only show uptime for a window if we have data older than it
                if hours:
                    cur.execute(
                        'SELECT EXISTS(SELECT 1 FROM service_checks WHERE service_id = %s AND timestamp <= %s)',
                        (service_id, cutoff),
                    )
                    if not cur.fetchone()[0]:
                        return None

                return round((online_count / total_count) * 100, 2)
        finally:
            conn.close()

    def calculate_uptime(self, service_id, hours=None):
        """Calculate uptime percentage for a service"""
        return self._calculate_uptime_unlocked(service_id, hours)

    def get_status_summary(self):
        """Get current status summary with uptime statistics"""
        with self.lock:
            summary = {
                'last_check': self._last_check,
                'next_check': None,
                'services': []
            }

            if self._last_check:
                last_check = datetime.fromisoformat(self._last_check)
                next_check = last_check + timedelta(seconds=CHECK_INTERVAL)
                summary['next_check'] = next_check.isoformat()

            for service_id, cached in self._current.items():
                service_summary = {
                    'id': service_id,
                    'name': cached['name'],
                    'url': cached['url'],
                    'status': cached['status'],
                    'response_time': cached['response_time'],
                    'status_code': cached['status_code'],
                    'last_online': cached['last_online'],
                    'uptime': {
                        '24h': self._calculate_uptime_unlocked(service_id, 24),
                        '7d': self._calculate_uptime_unlocked(service_id, 24 * 7),
                        '30d': self._calculate_uptime_unlocked(service_id, 24 * 30),
                        'all_time': self._calculate_uptime_unlocked(service_id)
                    },
                    'total_checks': self._get_total_checks(service_id),
                }
                summary['services'].append(service_summary)

        return summary

    def _get_total_checks(self, service_id):
        """Return the total number of checks for a service."""
        conn = self._get_conn()
        if conn is None:
            return 0
        try:
            with conn.cursor() as cur:
                cur.execute(
                    'SELECT COUNT(*) FROM service_checks WHERE service_id = %s',
                    (service_id,),
                )
                return cur.fetchone()[0]
        finally:
            conn.close()

    def _purge_old_records(self):
        """Delete check records older than RETENTION_DAYS."""
        conn = self._get_conn()
        if conn is None:
            return
        try:
            cutoff = datetime.now() - timedelta(days=RETENTION_DAYS)
            with conn, conn.cursor() as cur:
                cur.execute(
                    'DELETE FROM service_checks WHERE timestamp < %s',
                    (cutoff,),
                )
                deleted = cur.rowcount
                if deleted:
                    print(f"Purged {deleted} records older than {RETENTION_DAYS} days")
        finally:
            conn.close()

    def start_monitoring(self):
        """Start background monitoring thread"""
        def monitor_loop():
            self.check_all_services()
            self._purge_old_records()

            checks_since_cleanup = 0
            checks_per_cleanup = CLEANUP_INTERVAL // CHECK_INTERVAL

            while True:
                time.sleep(CHECK_INTERVAL)
                self.check_all_services()
                checks_since_cleanup += 1
                if checks_since_cleanup >= checks_per_cleanup:
                    self._purge_old_records()
                    checks_since_cleanup = 0

        thread = Thread(target=monitor_loop, daemon=True)
        thread.start()
        print(f"Service monitoring started (checks every {CHECK_INTERVAL} seconds)")


# Global monitor instance
monitor = ServiceMonitor()
```
`src/requirements.txt` (58 changes; Normal file → Executable file)

```
@@ -1,43 +1,23 @@
bidict==0.22.1
black==22.12.0
blinker==1.5
certifi==2023.7.22
cffi==1.15.1
charset-normalizer==3.3.1
click==8.1.3
colorama==0.4.6
Flask==2.2.2
Flask-Minify==0.41
Flask-SocketIO==5.3.2
gevent==22.10.2
gevent-websocket==0.10.1
greenlet==2.0.2
gunicorn==20.1.0
htmlmin==0.1.12
idna==3.4
importlib-metadata==6.0.0
itsdangerous==2.1.2
Jinja2==3.1.2
blinker==1.8.2
certifi==2024.7.4
charset-normalizer==3.3.2
click==8.1.7
Flask==3.0.3
Flask-Minify==0.48
gunicorn==22.0.0
htmlminf==0.1.13
idna==3.7
itsdangerous==2.2.0
Jinja2==3.1.4
jsmin==3.0.1
lesscpy==0.15.1
libsass==0.22.0
MarkupSafe==2.1.2
mypy-extensions==0.4.3
pathspec==0.11.0
platformdirs==2.6.2
MarkupSafe==2.1.5
packaging==24.1
ply==3.11
pycparser==2.21
pyScss==1.4.0
python-engineio==4.3.4
python-socketio==5.7.2
rcssmin==1.1.1
requests==2.31.0
rcssmin==1.1.2
requests==2.32.3
six==1.16.0
tomli==2.0.1
typing_extensions==4.4.0
urllib3==2.0.7
Werkzeug==2.2.2
xxhash==3.2.0
zipp==3.11.0
zope.event==4.6
zope.interface==5.5.2
urllib3==2.2.2
Werkzeug==3.0.3
xxhash==3.4.1
psycopg2-binary==2.9.9
```
BIN
src/static/Resume.pdf → src/static/Resume_Simonson_Andrew.pdf
Normal file → Executable file
0
src/static/chesscom-embed/default.svg
Normal file → Executable file
Size: 1.1 KiB → 1.1 KiB
0
src/static/chesscom-embed/diamonds.png
Normal file → Executable file
Size: 1.7 KiB → 1.7 KiB
728
src/static/css/App.css
Normal file → Executable file
0
src/static/css/checkbox.css
Normal file → Executable file
0
src/static/css/head.css
Normal file → Executable file
0
src/static/favicon.ico
Normal file → Executable file
Size: 97 KiB → 97 KiB
0
src/static/fonts/NeonFuture.ttf
Normal file → Executable file
0
src/static/fonts/RobotoCondensed-Regular.ttf
Normal file → Executable file
0
src/static/fonts/SHUTTLE-X.ttf
Normal file → Executable file
0
src/static/fonts/SunsetClub.otf
Normal file → Executable file
BIN
src/static/fonts/chessglyph-new.0cc8115c.woff2
Normal file
0
src/static/icons/email.svg
Normal file → Executable file
Size: 1.0 KiB → 1.0 KiB
0
src/static/icons/github.svg
Normal file → Executable file
Size: 759 B → 759 B
0
src/static/icons/globe.svg
Normal file → Executable file
Size: 2.0 KiB → 2.0 KiB
0
src/static/icons/instagram.svg
Normal file → Executable file
Size: 683 B → 683 B
0
src/static/icons/linkedin.svg
Normal file → Executable file
Size: 1.3 KiB → 1.3 KiB
0
src/static/icons/log.svg
Normal file → Executable file
Size: 177 KiB → 177 KiB
0
src/static/icons/menu.svg
Normal file → Executable file
Size: 764 B → 764 B
BIN
src/static/icons/min.rasterLogo.avif
Normal file
Size (new): 9.5 KiB
0
src/static/icons/neonfinal3.svg
Normal file → Executable file
Size: 16 KiB → 16 KiB
0
src/static/icons/rasterLogo.png
Normal file → Executable file
Size: 382 KiB → 382 KiB
0
src/static/icons/rasterLogoCircle.png
Normal file → Executable file
Size: 213 KiB → 213 KiB
0
src/static/icons/withBackground.svg
Normal file → Executable file
Size: 16 KiB → 16 KiB
14
src/static/js/checkbox.js
Normal file → Executable file
@@ -27,15 +27,13 @@ function toggleCheckbox(dir) {
}

function activeSkill(obj) {
  if (obj.parentElement.classList.contains("activeSkill")) {
    obj.parentElement.classList.remove("activeSkill");
  let skill = obj.closest(".skill");
  if (skill.classList.contains("activeSkill")) {
    skill.classList.remove("activeSkill");
    return;
  }
  // document.querySelectorAll(".skill").forEach((x) => {
  //   x.classList.remove("activeSkill");
  // });
  while (obj.parentElement.classList.contains("skill")) {
    obj = obj.parentElement;
    obj.classList.add("activeSkill");
  while (skill) {
    skill.classList.add("activeSkill");
    skill = skill.parentElement.closest(".skill");
  }
}
2
src/static/js/chessbed.js
Normal file → Executable file
@@ -10,7 +10,7 @@ async function addChessEmbed(username) {
  if (user.status === 200) {
    user = await user.json();
    stats = await stats.json();
    ratings = {
    const ratings = {
      rapid: stats.chess_rapid.last.rating,
      blitz: stats.chess_blitz.last.rating,
      bullet: stats.chess_bullet.last.rating,
45
src/static/js/idler.js
Normal file → Executable file
@@ -1,5 +1,5 @@
const balls = [];
const density = 0.00003;
const density = 0.00005;
let screenWidth = window.innerWidth + 10;
let screenHeight = window.innerHeight + 10;

@@ -69,24 +69,45 @@ function windowResized() {
function draw() {
  background(24);

  // Update all balls
  for (let i = 0; i < balls.length; i++) {
    balls[i].update();
  }

  // Draw lines with additive blending so overlaps increase brightness
  blendMode(ADD);
  strokeWeight(2);

  const maxDist = 150;
  const maxDistSquared = maxDist * maxDist;

  for (let i = 0; i < balls.length - 1; i++) {
    const ball1 = balls[i];
    for (let j = i + 1; j < balls.length; j++) {
      let distance = dist(balls[i].x, balls[i].y, balls[j].x, balls[j].y);
      if (distance < 100){
        stroke(150);
        line(balls[i].x, balls[i].y, balls[j].x, balls[j].y);
      }
      else if (distance < 150) {
        stroke(100);
        let chance = 0.3 ** (((random(0.2) + 0.8) * distance) / 150);
        if (chance < 0.5) {
          stroke(50);
      const ball2 = balls[j];

      const dx = ball2.x - ball1.x;
      const dy = ball2.y - ball1.y;
      const distSquared = dx * dx + dy * dy;

      if (distSquared < maxDistSquared) {
        const distance = Math.sqrt(distSquared);

        if (distance < 75) {
          stroke(255, 85);
          line(ball1.x, ball1.y, ball2.x, ball2.y);
        } else {
          const chance = 0.3 ** (((random(0.2) + 0.8) * distance) / 150);
          if (chance < 0.5) {
            stroke(255, 40);
          } else {
            stroke(255, 75);
          }
          line(ball1.x, ball1.y, ball2.x, ball2.y);
        }
        line(balls[i].x, balls[i].y, balls[j].x, balls[j].y);
      }
    }
  }

  blendMode(BLEND);
}
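The rewritten draw loop in idler.js above prunes ball pairs by squared distance so Math.sqrt only runs for pairs actually inside maxDist. The same pruning in a minimal sketch, with the 150-pixel threshold borrowed from the JS:

```python
import math

def close_pairs(points, max_dist=150):
    """Return (i, j, distance) for point pairs closer than max_dist.

    The cheap squared-distance comparison filters pairs first; sqrt is
    only computed for survivors, mirroring the idler.js optimization.
    """
    max_dist_sq = max_dist * max_dist
    pairs = []
    for i in range(len(points) - 1):
        x1, y1 = points[i]
        for j in range(i + 1, len(points)):
            dx = points[j][0] - x1
            dy = points[j][1] - y1
            d_sq = dx * dx + dy * dy
            if d_sq < max_dist_sq:          # no sqrt needed to reject
                pairs.append((i, j, math.sqrt(d_sq)))
    return pairs

print(close_pairs([(0, 0), (30, 40), (500, 500)]))
# (0, 1) at distance 50.0 is the only pair within 150
```

Hoisting `balls[i]` out of the inner loop, as the JS does with `ball1`, saves a repeated index lookup per comparison as well.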
82
src/static/js/responsive.js
Normal file → Executable file
@@ -1,35 +1,9 @@
window.onload = function () {
  onLoaded();
};
function onLoaded() {
  window.onresize = function () {
    resizer();
  };
  resizer();
}

function resizer() {
  const e = document.querySelector(".navControl");
  if (window.innerWidth > 1400) {
    // desktop view
    e.style.maxHeight = `${e.scrollHeight + 10}px`;
  } else {
    // mobile view
    document.querySelector(".header").style.borderBottomWidth = "3px";
    e.style.maxHeight = "0px";
    document.querySelectorAll(".navElement *").forEach((x) => {
      x.style.paddingTop = ".3rem";
      x.style.paddingBottom = ".3rem";
      x.style.fontSize = "1rem";
    });
  }
}

function toggleMenu() {
function toggleMenu(collapse=false) {
  if (window.innerWidth < 1400) {
    const e = document.querySelector(".navControl");
    const bar = document.querySelector(".header");
    if (e.style.maxHeight === "0px") {
    const isCollapsed = !e.style.maxHeight || e.style.maxHeight === "0px";
    if (isCollapsed && !collapse) {
      e.style.maxHeight = `${e.scrollHeight + 10}px`;
      bar.style.borderBottomWidth = "0px";
    } else {
@@ -39,26 +13,48 @@ function toggleMenu() {
    }
  }

async function goto(location, { push = true, toggle = true } = {}) {
  let a = await fetch("/api/goto/" + location, {
    credentials: "include",
    method: "GET",
    mode: "cors",
  });
  const response = await a.json();
  if (!location.includes("#")) {
    window.scrollTo({top: 0, left: 0, behavior:"instant"});
async function goto(location, { push = true } = {}) {
  let a;
  try {
    a = await fetch("/api/goto/" + location, {
      credentials: "include",
      method: "GET",
      mode: "cors",
    });
    if (!a.ok) {
      console.error(`Navigation failed: HTTP ${a.status}`);
      return;
    }
  } catch (err) {
    console.error("Navigation fetch failed:", err);
    return;
  }

  document.dispatchEvent(new Event('beforenavigate'));

  const response = await a.json();
  const metadata = response[0];
  const content = response[1];
  let root = document.getElementById("root");
  const root = document.getElementById("root");
  root.innerHTML = content;
  root.querySelectorAll("script").forEach((x) => {
    eval(x.innerHTML);
  root.querySelectorAll("script").forEach((oldScript) => {
    const newScript = document.createElement("script");
    Array.from(oldScript.attributes).forEach(attr => {
      newScript.setAttribute(attr.name, attr.value);
    });
    newScript.textContent = oldScript.textContent;
    oldScript.parentNode.replaceChild(newScript, oldScript);
  });
  if (toggle) {
    toggleMenu();

  if (!window.location.href.includes("#")) {
    window.scrollTo({top: 0, left: 0, behavior:"instant"});
  } else {
    const eid = decodeURIComponent(window.location.hash.substring(1));
    const el = document.getElementById(eid);
    if (el) el.scrollIntoView();
  }

  toggleMenu(collapse=true);
  document.querySelector("title").textContent = metadata["title"];
  if (push) {
    history.pushState(null, null, metadata["canonical"]);
264
src/static/js/status.js
Normal file
@@ -0,0 +1,264 @@
// Fetch and display service status from API

/**
 * Fetch status data from server
 */
async function fetchStatus() {
  try {
    const response = await fetch('/api/status');
    if (!response.ok) {
      throw new Error(`HTTP error! status: ${response.status}`);
    }
    const data = await response.json();
    updateStatusDisplay(data);
  } catch (error) {
    console.error('Error fetching status:', error);
    showError('Failed to fetch service status. Please try again later.');
  }
}

/**
 * Update the status display with fetched data
 */
function updateStatusDisplay(data) {
  // Update last check time
  if (data.last_check) {
    const lastCheck = new Date(data.last_check);
    const timeString = lastCheck.toLocaleString();
    document.getElementById('lastUpdate').textContent = `Last checked: ${timeString}`;
  }

  // Update next check time
  if (data.next_check) {
    const nextCheck = new Date(data.next_check);
    const timeString = nextCheck.toLocaleString();
    const nextCheckEl = document.getElementById('nextUpdate');
    if (nextCheckEl) {
      nextCheckEl.textContent = `Next check: ${timeString}`;
    }
  }

  // Update each service
  data.services.forEach(service => {
    updateServiceCard(service);
  });

  // Update overall status
  updateOverallStatus(data.services);

  // Re-enable refresh button
  const refreshBtn = document.getElementById('refreshBtn');
  if (refreshBtn) {
    refreshBtn.disabled = false;
    refreshBtn.textContent = 'Refresh Now';
  }
}

/**
 * Update a single service card
 */
function updateServiceCard(service) {
  const card = document.getElementById(`status-${service.id}`);
  if (!card) return;

  const stateDot = card.querySelector('.state-dot');
  const stateText = card.querySelector('.state-text');
  const timeDisplay = document.getElementById(`time-${service.id}`);
  const codeDisplay = document.getElementById(`code-${service.id}`);
  const uptimeDisplay = document.getElementById(`uptime-${service.id}`);
  const checksDisplay = document.getElementById(`checks-${service.id}`);

  // Update response time
  if (service.response_time !== null) {
    timeDisplay.textContent = `${service.response_time}ms`;
  } else {
    timeDisplay.textContent = '--';
  }

  // Update status code
  if (service.status_code !== null) {
    codeDisplay.textContent = service.status_code;
  } else {
    codeDisplay.textContent = service.status === 'unknown' ? 'Unknown' : 'Error';
  }

  // Update status indicator
  card.classList.remove('online', 'degraded', 'offline', 'unknown');

  switch (service.status) {
    case 'online':
      stateDot.className = 'state-dot online';
      stateText.textContent = 'Operational';
      card.classList.add('online');
      break;
    case 'degraded':
    case 'timeout':
      stateDot.className = 'state-dot degraded';
      stateText.textContent = service.status === 'timeout' ? 'Timeout' : 'Degraded';
      card.classList.add('degraded');
      break;
    case 'offline':
      stateDot.className = 'state-dot offline';
      stateText.textContent = 'Offline';
      card.classList.add('offline');
      break;
    default:
      stateDot.className = 'state-dot loading';
      stateText.textContent = 'Unknown';
      card.classList.add('unknown');
  }

  // Update uptime statistics
  if (uptimeDisplay && service.uptime) {
    const uptimeHTML = [];

    // Helper function to get color class based on uptime percentage
    const getUptimeClass = (value) => {
      if (value === null) return 'text-muted';
      if (value >= 99) return 'text-excellent';
      if (value >= 95) return 'text-good';
      if (value >= 90) return 'text-fair';
      return 'text-poor';
    };

    // Helper function to format uptime value
    const formatUptime = (value, label) => {
      const display = value !== null ? `${value}%` : '--';
      const colorClass = getUptimeClass(value);
      return `${label}: <strong class="${colorClass}">${display}</strong>`;
    };

    // Add all uptime metrics
    uptimeHTML.push(formatUptime(service.uptime['24h'], '24h'));
    uptimeHTML.push(formatUptime(service.uptime['7d'], '7d'));
    uptimeHTML.push(formatUptime(service.uptime['30d'], '30d'));
    uptimeHTML.push(formatUptime(service.uptime.all_time, 'All'));

    uptimeDisplay.innerHTML = uptimeHTML.join(' | ');
  }

  // Update total checks
  if (checksDisplay && service.total_checks !== undefined) {
    checksDisplay.textContent = service.total_checks;
  }
}

/**
 * Update overall status bar
 */
function updateOverallStatus(services) {
  const overallBar = document.getElementById('overallStatus');
  const icon = overallBar.querySelector('.summary-icon');
  const title = overallBar.querySelector('.summary-title');
  const subtitle = document.getElementById('summary-subtitle');
  const onlineCount = document.getElementById('onlineCount');
  const totalCount = document.getElementById('totalCount');

  // Count service statuses
  const total = services.length;
  const online = services.filter(s => s.status === 'online').length;
  const degraded = services.filter(s => s.status === 'degraded' || s.status === 'timeout').length;
  const offline = services.filter(s => s.status === 'offline').length;

  // Update counts
  onlineCount.textContent = online;
  totalCount.textContent = total;

  // Remove all status classes
  overallBar.classList.remove('online', 'degraded', 'offline');
  icon.classList.remove('operational', 'partial', 'major', 'loading');

  // Determine overall status
  if (online === total) {
    // All systems operational
    overallBar.classList.add('online');
    icon.classList.add('operational');
    icon.textContent = '\u2713';
    title.textContent = 'All Systems Operational';
    subtitle.textContent = `All ${total} services are running normally`;
  } else if (offline >= Math.ceil(total / 2)) {
    // Major outage (50% or more offline)
    overallBar.classList.add('offline');
    icon.classList.add('major');
    icon.textContent = '\u2715';
    title.textContent = 'Major Outage';
    subtitle.textContent = `${offline} service${offline !== 1 ? 's' : ''} offline, ${degraded} degraded`;
  } else if (offline > 0 || degraded > 0) {
    // Partial outage
    overallBar.classList.add('degraded');
    icon.classList.add('partial');
    icon.textContent = '\u26A0';
    title.textContent = 'Partial Outage';
    if (offline > 0 && degraded > 0) {
      subtitle.textContent = `${offline} offline, ${degraded} degraded`;
    } else if (offline > 0) {
      subtitle.textContent = `${offline} service${offline !== 1 ? 's' : ''} offline`;
    } else {
      subtitle.textContent = `${degraded} service${degraded !== 1 ? 's' : ''} degraded`;
    }
  } else {
    // Unknown state
    icon.classList.add('loading');
    icon.textContent = '\u25D0';
    title.textContent = 'Status Unknown';
    subtitle.textContent = 'Waiting for service data';
  }
}

/**
 * Show error message
 */
function showError(message) {
  const errorDiv = document.createElement('div');
  errorDiv.className = 'status-error';
  errorDiv.textContent = message;
  errorDiv.style.cssText = 'background: rgba(244, 67, 54, 0.2); color: #f44336; padding: 1em; margin: 1em 0; border-radius: 0.5em; text-align: center;';

  const container = document.querySelector('.foregroundContent');
  if (container) {
    container.insertBefore(errorDiv, container.firstChild);
    setTimeout(() => errorDiv.remove(), 5000);
  }
}

/**
 * Manual refresh
 */
function refreshStatus() {
  const refreshBtn = document.getElementById('refreshBtn');
  if (refreshBtn) {
    refreshBtn.disabled = true;
    refreshBtn.textContent = 'Checking...';
  }
  fetchStatus();
}

/**
 * Initialize on page load
 */
var statusIntervalId = null;

function initStatusPage() {
  // Clear any existing interval from a previous SPA navigation
  if (statusIntervalId !== null) {
    clearInterval(statusIntervalId);
  }
  fetchStatus();
  // Auto-refresh every 1 minute to get latest data
  statusIntervalId = setInterval(fetchStatus, 60000);
}

// Clean up interval when navigating away via SPA
document.addEventListener('beforenavigate', () => {
  if (statusIntervalId !== null) {
    clearInterval(statusIntervalId);
    statusIntervalId = null;
  }
});

// Start when page loads
if (document.readyState === 'loading') {
  document.addEventListener('DOMContentLoaded', initStatusPage);
} else {
  initStatusPage();
}
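updateOverallStatus above classifies the page into operational / major / partial buckets from the per-service states. The same branching as a standalone sketch, assuming the status strings used in the JS:

```python
import math

def overall_status(statuses):
    """Mirror of updateOverallStatus's branching in status.js."""
    total = len(statuses)
    online = sum(1 for s in statuses if s == 'online')
    degraded = sum(1 for s in statuses if s in ('degraded', 'timeout'))
    offline = sum(1 for s in statuses if s == 'offline')
    if online == total:
        return 'All Systems Operational'
    if offline >= math.ceil(total / 2):      # half or more offline
        return 'Major Outage'
    if offline > 0 or degraded > 0:
        return 'Partial Outage'
    return 'Status Unknown'

print(overall_status(['online', 'online', 'online']))    # All Systems Operational
print(overall_status(['online', 'offline', 'offline']))  # Major Outage
print(overall_status(['online', 'timeout', 'online']))   # Partial Outage
```

Note that timeouts count toward "degraded" rather than "offline", so a flaky service never tips the page into the major-outage state on its own.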
42
src/static/json/books.json
Normal file → Executable file
@@ -1,11 +1,19 @@
{
  "selection": [
    "The Rational Optimist",
    "The Accidental Superpower",
    "The Rational Optimist",
    "The End of the World is Just the Beginning",
    "When to Rob a Bank",
    "Freakonomics",
    "The Accidental Superpower",
    "Verbal Judo",
    "Zero To One"
  ],
  "books": {
    "Fooled By Randomness": {
      "filename": "fooledbyrandomness.jpg",
      "link": "https://www.amazon.com/Fooled-Randomness-Hidden-Chance-Markets-dp-B006Q7VYC4/dp/B006Q7VYC4/ref=dp_ob_title_bk",
      "review": "A lengthy compendium on probabilistic reasoning that helped kick off a curiosity of indefinite computation. There's more ancient philosophy than a book like this really needs, but the occasional brazen punchline from the contemporary anecdotes makes it bearable."
    },
    "The Rational Optimist": {
      "filename": "ratOpt.jpg",
      "link": "https://www.amazon.com/Rational-Optimist-Prosperity-Evolves-P-s/dp/0061452068",
@@ -31,6 +39,11 @@
      "link": "https://freakonomics.com/books/",
      "review": "More like the other Freakonomics books than I expected (cracked storytelling), which is still excellent, but I wished there were greater insights into seeing past conventional wisdom, which is what thinking like a freak means. Still a great book."
    },
    "The Tyranny of Metrics": {
      "filename": "TyrannyOfMetrics.jpg",
      "link": "https://www.amazon.com/Tyranny-Metrics-Jerry-Z-Muller/dp/0691174954",
      "review": "Library find. Very appreciated read given my field of study. Adds a new lens on the cost of information and how it impacts us from the cube office to the oval office."
    },
    "The Accidental Superpower": {
      "filename": "theAccidentalSuperpower.jpeg",
      "link": "https://zeihan.com/",
@@ -66,6 +79,11 @@
      "link": "https://www.amazon.com/Give-Me-Break-Exposed-Hucksters-ebook/dp/B000FC2NF8/",
      "review": "I expected a boring autobiography-type book, but it is instead a glimpse inside Stossel's work that transformed itself as it transformed his view. Was very happy to see a figure of similar personal ideology. Probably made it a little too easy to swallow that pill."
    },
    "Reign of Terror": {
      "filename": "reignofterror.jpg",
      "link": "https://www.amazon.com/Reign-Terror-Destabilized-America-Produced/dp/1984879774",
      "review": "Packed with real news events and first-person accounts, Reign of Terror chronicles the story of politics and intelligence agencies during the War on Terror. In typical journalist fashion, the populist (read: racist) cause for the events is mostly conjecture to fit a progressive narrative. Nonetheless, a comprehensive history of malpractice in public office."
    },
    "Zero To One": {
      "filename": "zeroToOne.jpeg",
      "link": "https://www.amazon.com/Zero-One-Notes-Startups-Future/dp/0804139296",
@@ -81,21 +99,26 @@
      "link": "https://www.amazon.com/Discipline-Destiny-Power-Self-Control-Virtues/dp/0593191692",
      "review": "Much like the first in its series - small chapters (very helpful), each with inspiring insider stories of figures of history. Anyone capable of learning from these figures would benefit greatly from implementing the virtues in this series."
    },
    "Right Thing, Right Now": {
      "filename": "rightthingrightnow.png",
      "link": "https://www.amazon.com/Right-Thing-Now-Values-Character/dp/0593191714",
      "review": "As the third in its series, the virtue of justice derives a large portion of its meaning from the previous two. While still a good read with a valuable influence for personal growth, it lacks a distinction between justice as a virtue and fighting for the right cause. Some sections preach for ideological purity while others insist on pragmatism, which is a pretty important detail regarding justice."
    },
    "On Grand Strategy": {
      "filename": "onGrandStrategy.jpeg",
      "link": "https://www.amazon.com/Grand-Strategy-John-Lewis-Gaddis/dp/1594203512",
      "review": "Book for the academically-inclined. Not fun to read. Big words scary. It's insightful to be sure, but I wouldn't read it again. The message on conceptual contradictions has stuck with me. Quite the brain food."
    },
    "The Parasitic Mind": {
      "filename": "theParasiticMind.jpeg",
      "link": "https://www.amazon.com/Parasitic-Mind-Infectious-Killing-Common/dp/1684512298/",
      "review": "The humor is the most memorable part but the concepts are no slouches. The contemporary culture war basis makes it tricky to talk about, but it absolutely should be discussed."
    },
    "David and Goliath": {
      "filename": "davidAndGoliath.png",
      "link": "https://www.amazon.com/David-Goliath-Underdogs-Misfits-Battling/dp/0316239852/",
      "review": "Book contains takes that may not be hot, but *are* incredibly based. In a sentence: Goliath is only the giant from the wrong perspectives. The only reason it's not one of my favorites is that it's tamer than the aggressively standoffish and hilarious."
    },
    "The Scout Mindset": {
      "filename": "scoutMindset.png",
      "link": "https://www.amazon.com/Scout-Mindset-People-Things-Clearly-ebook/dp/B07L2HQ26K/",
      "review": "Felt like a list of things that I already do that I should be more mindful of. Maybe that's just me. There was some interesting mental probabilism sprinkled in the first half, but the second half did not have much new to say. Good but not eye-opening."
    },
    "Verbal Judo": {
      "filename": "verbalJudo.png",
      "link": "https://www.amazon.com/Verbal-Judo-Second-Gentle-Persuasion-ebook/dp/B00FJ3CMI6/",
@@ -106,6 +129,11 @@
      "link": "https://www.amazon.com/YOU-READ-ANYONE-David-Lieberman-ebook/dp/B001J6OV0Y",
      "review": "Not as page-turning as many of the others and clearly not as memorable. The techniques pique curiosity but are difficult to use without practice."
    },
    "The Parasitic Mind": {
      "filename": "theParasiticMind.jpeg",
      "link": "https://www.amazon.com/Parasitic-Mind-Infectious-Killing-Common/dp/1684512298/",
      "review": "The humor is the most memorable part but the concepts are no slouches. The contemporary culture war basis makes it tricky to talk about, but it absolutely should be discussed."
    },
    "Profiles in Courage": {
      "filename": "profilesInCourage.jpeg",
      "link": "https://www.amazon.com/Profiles-Courage-John-F-Kennedy/dp/0060854936",
18
src/static/json/pages.json
Normal file → Executable file
@@ -5,18 +5,18 @@
    "description": "Andrew Simonson's Digital Portfolio home",
    "canonical": "/"
  },
  "status": {
    "template": "status.html",
    "title": "Andrew Simonson - Status Page",
    "description": "Status page for my services",
    "canonical": "/status"
  },
  "projects": {
    "template": "projects.html",
    "title": "Andrew Simonson - Projects",
    "description": "Recent projects by Andrew Simonson on his lovely portfolio website :)",
    "canonical": "/projects"
  },
  "about": {
    "template": "about.html",
    "title": "Andrew Simonson - About Me",
    "description": "About Andrew Simonson",
    "canonical": "/about"
  },
  "books": {
    "template": "books.html",
    "title": "Andrew Simonson - Bookshelf",
@@ -28,5 +28,11 @@
    "title": "You've been ducked!",
    "description": "Face it, you've been ducked",
    "canonical": "/duck"
  },
  "certificates": {
    "template": "certs.html",
    "title": "Certificates and Awards",
    "description": "Certificates and Awards Listing",
    "canonical": "/certs"
  }
}
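Each pages.json entry pairs a template with the metadata (title, description, canonical) that responsive.js applies after swapping #root. A sketch of the route lookup an /api/goto-style handler might perform — the resolve helper and trimmed-down page table here are hypothetical illustrations, not the site's actual backend code:

```python
import json

# Stand-in for pages.json with the same shape but only one entry.
PAGES = json.loads("""
{
  "status": {
    "template": "status.html",
    "title": "Andrew Simonson - Status Page",
    "description": "Status page for my services",
    "canonical": "/status"
  }
}
""")

def resolve(route):
    """Return (metadata, template name) for a route, in the [metadata, content] spirit of /api/goto."""
    page = PAGES[route]
    metadata = {k: page[k] for k in ("title", "description", "canonical")}
    return metadata, page["template"]

meta, template = resolve("status")
print(template)           # status.html
print(meta["canonical"])  # /status
```

Keeping metadata beside the template name lets the SPA update the document title and history URL without a second request.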
143
src/static/json/projects.json
Normal file → Executable file
@@ -1,81 +1,65 @@
{
  "Antietam-Conococheague Watershed Monitoring": {
    "status": "complete",
    "classes": "geospacial",
    "bgi": "watershedTemps.png",
    "content": "Geospatial analysis of Maryland's Antietam and Conococheague sub-watersheds, monitoring water quality and temperatures through the summer months for reporting to governmental review boards for environmental protection"
  },
  "Automotive Brand Valuation Analysis": {
    "status": "complete",
    "classes": "programming",
    "bgi": "automotiveBrandAnalysis.png",
    "content": "Brand valuation analysis of the used car market, measuring value decay by mileage to extrapolate qualities such as perceived reliability and persistent value of luxury features."
  },
  "RIT Hotspots": {
    "status": "WIP",
    "status": "incomplete",
    "classes": "pinned geospacial programming",
    "bgi": "hotspotsrit.png",
    "content": "Live crowd migration map using RIT occupancy data",
    "content": "Live crowd migration map using RIT occupancy data. It seems RIT didn't like me exposing their surveillance state, but since they didn't want to talk to me about it they instead changed the service response schema a few times. When that didn't stop me they just shut down the whole service. Nerds.",
    "links": [
      [
        "github", "https://github.com/asimonson1125/hotspotsrit", "git repo"
      ],
      [
        "globe", "https://asimonson.com/hotspots", "demo"
      ]
      ["github", "https://github.com/asimonson1125/hotspotsrit", "git repo"]
    ]
  },
  "LogicFlow": {
    "status": "incomplete",
    "classes": "programming",
    "bgi": "logicflow.jpg",
    "content": "Translate paragraphs to logical flowcharts, powered by ChatGPT. Winner of CSHacks' Best Use of AI by Paychex",
    "links": [
      [
        "github", "https://github.com/asimonson1125/LogicFlow", "git repo"
      ],
      [
        "globe", "https://devpost.com/software/logicflow", "Hackathon listing"
      ]
    ]
  "Calorimetry Analysis Engineering": {
    "status": "complete",
    "classes": "pinned programming",
    "bgi": "calorimeterAnalysis.png",
    "content": "An analytical toolkit designed for reactive chemistry analysis, especially calorimetry. Works include automatic analysis, alerting unusual and dangerous results derived from a wide range of testing environments and equipment",
    "links": []
  },
  "Alternative Energy Map": {
  "Geography of Alternative Energy": {
    "status": "complete",
    "classes": "pinned geospacial",
    "bgi": "geovisF.png",
    "content": "ArcGIS Map of the most effective alternative energy sources in the continental United States",
    "bgi": "energyGeography.png",
    "content": "An ArcGIS geospatial analysis comparing the difference in effectiveness of wind, solar, and geothermal energy across the continental 48 United States.",
    "links": [
      [
        "globe",
        "https://ritarcgis.maps.arcgis.com/apps/dashboards/17d5bda01edc4a2eb6205a4922d889c9",
        "ArcGIS"
        "Dashboard"
      ]
    ]
  },
  "OccupyRIT": {
    "status": "WIP",
    "classes": "programming",
    "status": "complete",
    "classes": "pinned programming",
    "bgi": "occupyRIT.png",
    "content": "Collects RIT Gym Occupancy data, determining busiest workout times",
    "links": [
      ["github", "https://github.com/asimonson1125/Occupy-RIT", "git repo"]
    ]
  },
  "Chesscom Embeds": {
    "status": "complete",
    "classes": "programming",
    "bgi": "chessbed.png",
    "content": "A template for creating Chess.com user profile embeds",
    "links": [
      ["github", "https://github.com/asimonson1125/chesscom-embed", "git repo"]
    ]
  },
  "Resume": {
  "Portfolio Website": {
    "status": "WIP",
    "classes": "programming",
    "bgi": "resume.png",
    "content": "My Resume, made in LaTeX with a custom design derived from the AltaCV template on Overleaf",
    "content": "This website is my personal sandbox where I've integrated some of my data projects via docker cluster. It is self-hosted and zero-trust secure while remaining dynamic and free of the tech debt that comes with pre-designed sites and excessive framework application. Yeah, I can do E2E.",
    "links": [
      ["github", "https://github.com/asimonson1125/Resume", "git repo"],
      ["globe", "https://asimonson.com/Resume.pdf", "Resume"]
    ]
  },
  "Digital Portfolio": {
    "status": "WIP",
    "classes": "programming",
    "bgi": "website.png",
    "content": "My personal portfolio website (you're on it now!)",
    "links": [
      ["github", "https://github.com/asimonson1125/asimonson1125.github.io", "git repo"],
      ["globe", "https://asimonson.com", "site link"]
      ["globe", "https://asimonson.com", "Homepage"],
      [
        "github",
        "https://github.com/asimonson1125/asimonson1125.github.io",
        "git repo"
      ]
    ]
  },
  "Slate": {
@@ -84,62 +68,21 @@
    "bgi": "slate.png",
    "content": "Slate is a web app designed to help event coordinators schedule events by congregating participant calendar data. Includes Computer Science House account integration",
    "links": [
      ["github", "https://github.com/asimonson1125/Slate", "git repo"],
      ["globe", "https://slate.csh.rit.edu", "site link"]
      ["globe", "https://slate.csh.rit.edu", "site link"],
      ["github", "https://github.com/asimonson1125/Slate", "git repo"]
    ]
  },
  "HvZ Bot": {
    "status": "complete",
    "classes": "programming",
    "bgi": "",
    "content": "A Discord bot to handle role management and statistics for RIT's Humans vs. Zombies games",
    "links": [
      ["github", "https://github.com/asimonson1125/HvZ-bot", "git repo"]
    ]
  },
  "FinTech": {
    "status": "WIP",
    "classes": "pinned programming",
    "bgi": "",
|
||||
"content": "A team derived from the RIT Financial Management Association dedicated to learning about financial management of equities using automated solutions developed by students",
|
||||
"links": [
|
||||
["github", "https://github.com/LukeHorigan/Financial-Management-Assocation-", "git repo"]
|
||||
]
|
||||
},
|
||||
"Browser Trivia Bot": {
|
||||
"status": "complete",
|
||||
"classes": "programming",
|
||||
"bgi": "",
|
||||
"content": "A tampermonkey tool used to automatically answer and submit online trivia forms, which can be tailored to different site layouts. Source currently private.",
|
||||
"links": [
|
||||
]
|
||||
},
|
||||
"NationsGame Rolls Sim": {
|
||||
"Monte Carlo Engine for NationsGame": {
|
||||
"status": "complete",
|
||||
"classes": "programming",
|
||||
"bgi": "ceoOfYugo.png",
|
||||
"content": "A simulator for the browser game, NationsGame, to analyze unit composition and predict in-game victors and unit statistics. Unfortunately, NationsGame is now defunct. Limited screenshots of functionality.",
|
||||
"links": [
|
||||
["github", "https://github.com/asimonson1125/NG-Rolls-Simulator", "git repo"]
|
||||
]
|
||||
},
|
||||
"VEXcode Button Engine": {
|
||||
"status": "complete",
|
||||
"classes": "programming",
|
||||
"bgi": "vexcodeButtons.jpeg",
|
||||
"content": "VEXcode button library + examples and template for the VEX V5 brain",
|
||||
"links": [
|
||||
["github", "https://github.com/asimonson1125/VEXcode-Button-Generator", "git repo"],
|
||||
["globe", "https://www.vexforum.com/t/vexcode-button-generator/72450", "Forum post"]
|
||||
]
|
||||
},
|
||||
"WinKeylogger": {
|
||||
"status": "complete",
|
||||
"classes": "programming",
|
||||
"bgi": "",
|
||||
"content": "A C++ keylogger for windows based off a Udemy course with my custom modifications and powershell script",
|
||||
"links": [
|
||||
["github", "https://github.com/asimonson1125/WinKeylogger", "git repo"]
|
||||
[
|
||||
"github",
|
||||
"https://github.com/asimonson1125/NG-Rolls-Simulator",
|
||||
"git repo"
|
||||
]
|
||||
]
|
||||
}
|
||||
}
|
||||
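A note on the data shape: the projects file shown in this diff is a plain nested JSON mapping of category → project → fields. As a rough sketch of how such a file could be sanity-checked (the exact field set is inferred from the diff, not confirmed by the repo), the snippet below also demonstrates a subtlety relevant to the duplicated field lines visible above: Python's `json.loads` keeps only the last value when an object key repeats, so duplicate keys collapse silently rather than erroring.

```python
import json

# Field set inferred from the diff above (assumption, not confirmed by the repo).
REQUIRED_KEYS = {"status", "classes", "bgi", "content", "links"}

def validate_projects(raw: str) -> list[str]:
    """Return "category/project" entries missing any expected field."""
    data = json.loads(raw)
    problems = []
    for category, projects in data.items():
        for name, fields in projects.items():
            missing = REQUIRED_KEYS - fields.keys()
            if missing:
                problems.append(f"{category}/{name}: missing {sorted(missing)}")
    return problems

# Hypothetical excerpt with a duplicated "bgi" key, like the old/new pairs in the diff.
sample = """
{
  "Resume": {
    "Chesscom Embeds": {
      "status": "complete",
      "classes": "programming",
      "bgi": "chessbed.png",
      "bgi": "chessbed2.png",
      "content": "A template for creating Chess.com user profile embeds",
      "links": [["github", "https://github.com/asimonson1125/chesscom-embed", "git repo"]]
    }
  }
}
"""
print(validate_projects(sample))                               # → []
print(json.loads(sample)["Resume"]["Chesscom Embeds"]["bgi"])  # → chessbed2.png (last duplicate wins)
```

The last-duplicate-wins behavior means the repeated field lines above only coexist because git stores old and new diff lines separately; committing both lines verbatim would not be a JSON error, just a silent overwrite.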
43 src/static/json/skills.json Normal file → Executable file
@@ -1,26 +1,35 @@
{
"Data and AI": {
"Python": {
"PyTorch/TensorFlow": {},
"Numpy/Pandas": {},
"Selenium/BS4": {}
"ML": {
"PySpark ML": {},
"Numpy/Pandas/Polars": {},
"TensorFlow": {},
"Scikit": {}
},
"R": {},
"SQL": {}
"PySpark": {},
"Selenium/BS4 Web Hacking": {},
"SQL": {},
"Declarative Pipelines": {},
"ArcGIS": {}
},
"DevOps": {
"Docker": {},
"Microsoft Azure": {},
"Databricks": {},
"Kubernetes/Openshift": {},
"Cloudflare": {},
"Bash": {}
},
"Frontend": {
"Flask (Python)": {},
"React (Javascript)": {},
"Angular (Typescript)": {}
"REST APIs": {},
"Web Scraping": {}
},
"Backend & DevOps": {
"DevOps": {
"Docker": {},
"Microsoft Azure": {},
"Kubernetes/Openshift": {},
"Bash": {}
},
"C#": {},
"C++": {}
"Offline Skills": {
"Circuitry": {},
"Skiing": {},
"Chess": {},
"Plinking": {},
"Building something with trash that solves my problems": {}
}
}
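The skills file uses one convention all the way down: an empty object marks a leaf skill, and a non-empty object marks a sub-group, so a renderer only needs a short recursion. A minimal sketch (illustrative only; how the live site actually renders this is not shown in the diff):

```python
import json

def render_skills(tree: dict, depth: int = 0) -> list[str]:
    """Flatten the nested skills mapping into indented lines.

    An empty dict ({}) is a leaf skill; a non-empty dict is a sub-group.
    """
    lines = []
    for name, children in tree.items():
        lines.append("  " * depth + name)
        lines.extend(render_skills(children, depth + 1))
    return lines

# Excerpt of the new skills.json structure from the diff above.
skills = json.loads("""
{
  "Data and AI": {
    "ML": {
      "PySpark ML": {},
      "Numpy/Pandas/Polars": {},
      "TensorFlow": {},
      "Scikit": {}
    },
    "SQL": {},
    "ArcGIS": {}
  }
}
""")
print("\n".join(render_skills(skills)))
```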
0 src/static/json/timeline.json Normal file → Executable file
BIN src/static/photos/blinkies/010.gif Normal file (987 B)
BIN src/static/photos/blinkies/bobthebuilder.gif Normal file (6.7 KiB)
BIN src/static/photos/blinkies/brainglow.gif Normal file (1.4 KiB)
BIN src/static/photos/blinkies/fearnobeer.gif Normal file (4.6 KiB)
BIN src/static/photos/blinkies/pepsiaddict.gif Normal file (2.0 KiB)
BIN src/static/photos/blinkies/tooclose.gif Normal file (1.7 KiB)
BIN src/static/photos/blinkies/usa.gif Normal file (620 B)
0 src/static/photos/books/12RulesForLife.jpg Normal file → Executable file (16 KiB)
0 src/static/photos/books/BeyondOrder.jpg Normal file → Executable file (22 KiB)
0 src/static/photos/books/HitchhikersGuideToTheGalaxy.jpeg Normal file → Executable file (320 KiB)
BIN src/static/photos/books/TyrannyOfMetrics.jpg Executable file (110 KiB)
0 src/static/photos/books/autobioOfJefferson.jpeg Normal file → Executable file (26 KiB)
0 src/static/photos/books/courageIsCalling.jpeg Normal file → Executable file (40 KiB)
0 src/static/photos/books/davidAndGoliath.png Normal file → Executable file (64 KiB)
0 src/static/photos/books/disciplineIsDestiny.jpg Normal file → Executable file (44 KiB)
0 src/static/photos/books/disunitedNations.jpeg Normal file → Executable file (40 KiB)
BIN src/static/photos/books/fooledbyrandomness.jpg Executable file (17 KiB)
0 src/static/photos/books/freakonomics.jpeg Normal file → Executable file (90 KiB)
0 src/static/photos/books/giveMeABreak.jpeg Normal file → Executable file (37 KiB)
0 src/static/photos/books/makeYourBed.jpg Normal file → Executable file (32 KiB)
0 src/static/photos/books/no-they-cant.jpeg Normal file → Executable file (308 KiB)
0 src/static/photos/books/onGrandStrategy.jpeg Normal file → Executable file (111 KiB)
0 src/static/photos/books/profilesInCourage.jpeg Normal file → Executable file (144 KiB)
0 src/static/photos/books/ratOpt.jpg Normal file → Executable file (164 KiB)
BIN src/static/photos/books/reignofterror.jpg Executable file (126 KiB)
BIN src/static/photos/books/rightthingrightnow.png Executable file (204 KiB)
BIN src/static/photos/books/scoutMindset.png Executable file (253 KiB)
0 src/static/photos/books/superfreakonomics.jpeg Normal file → Executable file (235 KiB)
0 src/static/photos/books/theAbsentSuperpower.jpeg Normal file → Executable file (54 KiB)
0 src/static/photos/books/theAccidentalSuperpower.jpeg Normal file → Executable file (79 KiB)
0 src/static/photos/books/theEndOfTheWorldIsJustTheBeginning.jpeg Normal file → Executable file (140 KiB)
0 src/static/photos/books/theParasiticMind.jpeg Normal file → Executable file (35 KiB)
0 src/static/photos/books/theStormBeforeTheCalm.jpeg Normal file → Executable file (119 KiB)
0 src/static/photos/books/thinkLikeAFreak.jpg Normal file → Executable file (32 KiB)
0 src/static/photos/books/verbalJudo.png Normal file → Executable file (201 KiB)
0 src/static/photos/books/whenToRobABank.jpeg Normal file → Executable file (129 KiB)
0 src/static/photos/books/where-good-ideas-come-from.png Normal file → Executable file (184 KiB)
0 src/static/photos/books/youCanReadAnyone.jpg Normal file → Executable file (14 KiB)
0 src/static/photos/books/zeroToOne.jpeg Normal file → Executable file (71 KiB)
BIN src/static/photos/electricityStabby.png Executable file (203 KiB)
BIN src/static/photos/extradimensionalSq.avif Normal file (42 KiB)
0 src/static/photos/gifs/duck-spinning.gif Normal file → Executable file (936 KiB)
BIN src/static/photos/gifs/tflame.gif Normal file (14 KiB)