Mirror of https://github.com/asimonson1125/asimonson1125.github.io.git
Synced 2026-02-25 05:09:49 -06:00

Compare commits: efaf2fb169...master (12 commits)
Commits in this compare:

- d21a8ec278
- 7232d1f8de
- 3bd27f59d7
- 44948a6e9f
- 3f0f9907ed
- 7b969ea6c2
- 085ade75bf
- dae0882e0f
- d0f50141c7
- b59842899b
- 9553e77b2f
- 2ae714db48
Dockerfile (30 changes)

```diff
@@ -1,10 +1,34 @@
-FROM python:3.10-bullseye
+# Use a slimmer base image to reduce image size and pull times
+FROM python:3.10-slim-bullseye
 
 LABEL maintainer="Andrew Simonson <asimonson1125@gmail.com>"
 
+# Set environment variables for better Python performance in Docker
+ENV PYTHONDONTWRITEBYTECODE=1 \
+    PYTHONUNBUFFERED=1 \
+    PIP_NO_CACHE_DIR=1
+
 WORKDIR /app
 
+# Create a non-root user for security
+RUN groupadd -r appuser && useradd -r -g appuser appuser
+
+# Copy only the requirements file first to leverage Docker layer caching
 COPY src/requirements.txt .
-RUN pip install -r requirements.txt
+
+# Install dependencies as root, but then switch to the non-root user
+RUN pip install --no-cache-dir -r requirements.txt
+
+# Copy the rest of the source code
 COPY src/ .
-CMD [ "gunicorn", "--bind", "0.0.0.0:8080", "app:app"]
+
+# Ensure the appuser owns the app directory
+RUN chown -R appuser:appuser /app
+
+# Switch to the non-root user for better security
+USER appuser
+
+# Expose the port (Gunicorn's default or specified in CMD)
+EXPOSE 8080
+
+# Start Gunicorn
+CMD ["gunicorn", "--bind", "0.0.0.0:8080", "app:app"]
```
README.md (97 changes)

````diff
@@ -1,11 +1,92 @@
-# I made a uhh website
-So people can see how excellent my coding standards are.
-
-* Style: 5/10
-* Originality: 3/10
-* Security: Yes*
-* Viruses: not included
-
-You gotta uhh `pip3 install -r requirements.txt` and `python3 app.py` that thing
-
-Docker compose configured to expose at `localhost:8080`
+# Personal Portfolio & Service Monitor
+
+A Flask-based website for my personal portfolio and a service monitoring dashboard. This project handles dynamic project showcases, automated service health tracking, and production-ready optimizations.
+
+## Features
+
+- **Content Management**: Pages like projects, books, and skills are managed via JSON files in the `static` directory.
+- **Service Monitoring**: Background health checks for external services with uptime statistics stored in PostgreSQL.
+- **Optimizations**:
+  - HTML, CSS, and JS minification via `Flask-Minify`.
+  - MD5-based cache busting for static assets.
+  - Configurable cache-control headers.
+- **Security**: Pre-configured headers for XSS protection and frame security.
+- **Deployment**: Ready for containerized deployment with Docker and Gunicorn.
+
+## Tech Stack
+
+- **Backend**: Python 3.12, Flask
+- **Frontend**: Vanilla CSS/JS, Jinja2
+- **Database**: PostgreSQL (optional, for monitoring history)
+- **Infrastructure**: Docker, docker-compose
+
+## Project Structure
+
+```text
+.
+├── src/
+│   ├── app.py               # Application entry point
+│   ├── monitor.py           # Service monitoring logic
+│   ├── config.py            # Environment configuration
+│   ├── templates/           # HTML templates
+│   ├── static/              # CSS, JS, and JSON data
+│   └── requirements.txt     # Python dependencies
+├── Dockerfile               # Container definition
+├── docker-compose.yml       # Local stack orchestration
+└── STATUS_MONITOR_README.md # Monitoring system documentation
+```
+
+## Getting Started
+
+### Using Docker
+
+To run the full stack (App + PostgreSQL):
+
+1. **Clone the repository**:
+   ```bash
+   git clone https://github.com/asimonson1125/asimonson1125.github.io.git
+   cd asimonson1125.github.io
+   ```
+
+2. **Start services**:
+   ```bash
+   docker-compose up --build
+   ```
+
+3. **Access the site**:
+   Visit [http://localhost:8080](http://localhost:8080).
+
+### Local Development
+
+To run the Flask app without Docker:
+
+1. **Set up a virtual environment**:
+   ```bash
+   python3 -m venv .venv
+   source .venv/bin/activate
+   ```
+
+2. **Install dependencies**:
+   ```bash
+   pip install -r src/requirements.txt
+   ```
+
+3. **Run the application**:
+   ```bash
+   cd src
+   python3 app.py
+   ```
+
+   *Note: the status monitor is disabled by default outside of its container cluster.*
+
+## Service Monitoring
+
+The monitoring system in `src/monitor.py` tracks service availability. It:
+
+- Runs concurrent health checks every hour.
+- Calculates uptime for various windows (24h, 7d, 30d).
+- Provides a status UI at `/status` and a JSON API at `/api/status`.
+
+See [STATUS_MONITOR_README.md](./STATUS_MONITOR_README.md) for more details.
+
+## License
+
+This project is personal property. All rights reserved.
````
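The MD5-based cache busting mentioned under the README's Optimizations can be sketched in a few lines; the helper names below are illustrative, not the app's actual API, but the fingerprint itself matches the `hashlib.md5(...).hexdigest()[:8]` scheme visible in `src/app.py`:

```python
import hashlib

def fingerprint(content: bytes) -> str:
    # First 8 hex characters of the MD5 digest of the file contents
    return hashlib.md5(content).hexdigest()[:8]

def versioned_url(path: str, content: bytes) -> str:
    # Illustrative helper: append the fingerprint as a cache-busting query string,
    # so the URL changes whenever the file content changes
    return f"{path}?v={fingerprint(content)}"

url = versioned_url("/static/css/main.css", b"body { color: black; }")
```

Because the fingerprint is derived from content rather than mtime, unchanged files keep stable URLs across deploys and stay cached under the long-lived `Cache-Control` headers.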
````diff
@@ -19,9 +19,9 @@ Server-side monitoring system that checks the availability of asimonson.com services.
 
 **Features**:
 - Tracks response times and HTTP status codes
-- Stores check history (up to 720 checks = 60 days of data)
 - Calculates uptime percentages for multiple time periods (24h, 7d, 30d, all-time)
-- Persists data to `static/json/status_history.json`
+- Persists data to PostgreSQL (`service_checks` table) via `DATABASE_URL` env var
+- Gracefully degrades when no database is configured (local dev)
 - Runs in a background thread
 
 #### 2. `app.py` - Flask Integration
@@ -57,32 +57,22 @@
 ## Data Storage
 
-Status history is stored in `src/static/json/status_history.json`:
+Check history is stored in a PostgreSQL `service_checks` table. The connection is configured via the `DATABASE_URL` environment variable (e.g. `postgresql://user:pass@host:5432/dbname`).
 
-```json
-{
-  "last_check": "2026-02-11T14:30:00",
-  "services": {
-    "main": {
-      "name": "asimonson.com",
-      "url": "https://asimonson.com",
-      "status": "online",
-      "response_time": 156,
-      "status_code": 200,
-      "last_online": "2026-02-11T14:30:00",
-      "checks": [
-        {
-          "timestamp": "2026-02-11T14:30:00",
-          "status": "online",
-          "response_time": 156,
-          "status_code": 200
-        }
-      ]
-    }
-  }
-}
-```
+```sql
+CREATE TABLE service_checks (
+    id SERIAL PRIMARY KEY,
+    service_id VARCHAR(50) NOT NULL,
+    timestamp TIMESTAMPTZ NOT NULL DEFAULT NOW(),
+    status VARCHAR(20) NOT NULL,
+    response_time INTEGER,
+    status_code INTEGER,
+    error TEXT
+);
+```
+
+The table and index are created automatically on startup. If `DATABASE_URL` is not set, the monitor runs without persistence (useful for local development).
 
 ## Status Types
 
 - **online**: HTTP status 2xx-4xx, service responding
@@ -142,8 +132,7 @@ SERVICES = [
 ## Notes
 
 - First deployment will show limited uptime data until enough checks accumulate
-- Historical data is preserved across server restarts
-- Maximum 720 checks stored per service (60 days at 2-hour intervals)
+- Historical data is preserved across server restarts (stored in PostgreSQL)
 - Page auto-refreshes every 5 minutes to show latest server data
 - Manual refresh button available for immediate updates
 - All checks performed server-side (no client-side CORS issues)
````
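The windowed uptime percentage described above (online checks divided by total checks inside a time window) can be sketched in plain Python. This is a simplified stand-in for the SQL version and omits the minimum-coverage rules; the function name and check format are illustrative:

```python
from datetime import datetime, timedelta

def uptime_percent(checks, hours=None, now=None):
    # checks: list of {'timestamp': ISO-8601 str, 'status': 'online' | 'offline'}
    now = now or datetime.now()
    if hours is not None:
        # Keep only checks inside the requested window (e.g. last 24h)
        cutoff = now - timedelta(hours=hours)
        checks = [c for c in checks if datetime.fromisoformat(c['timestamp']) > cutoff]
    if not checks:
        return None  # insufficient data for this window
    online = sum(1 for c in checks if c['status'] == 'online')
    return round(online / len(checks) * 100, 2)
```

Calling it with `hours=24`, `hours=24 * 7`, and `hours=None` yields the 24h, 7d, and all-time figures in the same shape the status API reports.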
```diff
@@ -7,3 +7,26 @@ services:
     restart: 'no'
     ports:
       - 8080:8080
+    environment:
+      DATABASE_URL: postgresql://portfolio:portfolio@db:5432/portfolio
+    depends_on:
+      db:
+        condition: service_healthy
+
+  db:
+    image: postgres:16-alpine
+    restart: 'no'
+    environment:
+      POSTGRES_USER: portfolio
+      POSTGRES_PASSWORD: portfolio
+      POSTGRES_DB: portfolio
+    volumes:
+      - pgdata:/var/lib/postgresql/data
+    healthcheck:
+      test: ["CMD-SHELL", "pg_isready -U portfolio"]
+      interval: 5s
+      timeout: 3s
+      retries: 5
+
+volumes:
+  pgdata:
```
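The `DATABASE_URL` value above follows the standard `postgresql://user:pass@host:port/dbname` DSN shape. psycopg2 parses this itself, so the helper below is purely illustrative, but it shows how the compose value breaks into components using only the standard library:

```python
from urllib.parse import urlsplit

def split_dsn(dsn: str) -> dict:
    # Break a postgresql:// DSN into its components for inspection
    u = urlsplit(dsn)
    return {
        'user': u.username,
        'password': u.password,
        'host': u.hostname,
        'port': u.port,
        'dbname': u.path.lstrip('/'),
    }

parts = split_dsn("postgresql://portfolio:portfolio@db:5432/portfolio")
```

Note that `host` resolves to `db`, the compose service name, which only exists on the compose network; this is why the monitor degrades gracefully when run outside the container cluster.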
src/app.py (143 changes)

```diff
@@ -1,15 +1,18 @@
-import flask
-from flask_minify import Minify
-import hashlib
 import json
 import os
+import hashlib
+
+import flask
+from flask_minify import Minify
 import werkzeug.exceptions as HTTPerror
-from config import *
+
+import config  # noqa: F401 — side-effect: loads dev env vars
 from monitor import monitor, SERVICES
 
 app = flask.Flask(__name__)
 
-# Compute content hashes for static file fingerprinting
+# ── Static file fingerprinting ────────────────────────────────────────
+
 static_file_hashes = {}
 for dirpath, _, filenames in os.walk(app.static_folder):
     for filename in filenames:
@@ -18,6 +21,7 @@ for dirpath, _, filenames in os.walk(app.static_folder):
         with open(filepath, 'rb') as f:
             static_file_hashes[relative] = hashlib.md5(f.read()).hexdigest()[:8]
 
+
 @app.context_processor
 def override_url_for():
     def versioned_url_for(endpoint, **values):
@@ -28,17 +32,16 @@ def override_url_for():
         return flask.url_for(endpoint, **values)
     return dict(url_for=versioned_url_for)
 
-# Add security and caching headers
+
+# ── Security and caching headers ──────────────────────────────────────
+
 @app.after_request
-def add_security_headers(response):
-    """Add security and performance headers to all responses"""
-    # Security headers
+def add_headers(response):
     response.headers['X-Content-Type-Options'] = 'nosniff'
     response.headers['X-Frame-Options'] = 'SAMEORIGIN'
     response.headers['X-XSS-Protection'] = '1; mode=block'
     response.headers['Referrer-Policy'] = 'strict-origin-when-cross-origin'
 
     # Cache control for static assets
     if flask.request.path.startswith('/static/'):
         response.headers['Cache-Control'] = 'public, max-age=31536000, immutable'
     elif flask.request.path in ['/sitemap.xml', '/robots.txt']:
@@ -49,56 +52,93 @@ def add_security_headers(response):
     return response
 
 
+# ── Load page data ────────────────────────────────────────────────────
+
 def load_json(path):
     with open(path, "r") as f:
         return json.load(f)
 
-proj = load_json("./static/json/projects.json")
+
+projects = load_json("./static/json/projects.json")
 books = load_json("./static/json/books.json")
-skillList = load_json("./static/json/skills.json")
+skills = load_json("./static/json/skills.json")
 timeline = load_json("./static/json/timeline.json")
 pages = load_json("./static/json/pages.json")
 
-pages['projects']['skillList'] = skillList
-# pages['about']['timeline'] = timeline
-pages['projects']['projects'] = proj
+pages['projects']['skillList'] = skills
+pages['projects']['projects'] = projects
 pages['home']['books'] = books
 pages['books']['books'] = books
+pages['status']['services'] = SERVICES
+
+
+# ── Error rendering ──────────────────────────────────────────────────
+
+def render_error(code, message):
+    pagevars = {
+        "template": "error.html",
+        "title": f"{code} - Simonson",
+        "description": "Error on Andrew Simonson's Digital Portfolio",
+        "canonical": f"/{code}",
+    }
+    return (
+        flask.render_template(
+            "header.html",
+            var=pagevars,
+            error=code,
+            message=message,
+            title=f"{code} - Simonson Portfolio",
+        ),
+        code,
+    )
+
+
+@app.errorhandler(HTTPerror.HTTPException)
+def handle_http_error(e):
+    return render_error(e.code, e.description)
+
+
+@app.errorhandler(Exception)
+def handle_generic_error(e):
+    return render_error(500, "Internal Server Error")
+
+
+# ── API routes ────────────────────────────────────────────────────────
 
 @app.route('/api/status')
 def api_status():
     """API endpoint for service status"""
     return flask.jsonify(monitor.get_status_summary())
 
 
 @app.route('/api/goto/')
 @app.route('/api/goto/<location>')
-def goto(location='home'):
+def api_goto(location='home'):
     if location not in pages:
         flask.abort(404)
     pagevars = pages[location]
     page = None
     try:
         page = flask.render_template(pagevars["template"], var=pagevars)
     except Exception:
-        e = HTTPerror.InternalServerError()
-        page = handle_http_error(e)
+        page = render_error(500, "Internal Server Error")
     return [pagevars, page]
 
-def funcGen(pagename, pages):
-    def dynamicRule():
+
+# ── Dynamic page routes ──────────────────────────────────────────────
+
+def make_page_handler(pagename):
+    def handler():
         try:
             return flask.render_template('header.html', var=pages[pagename])
         except Exception:
-            e = HTTPerror.InternalServerError()
-            print(e)
-            return handle_http_error(e)
-    return dynamicRule
+            return render_error(500, "Internal Server Error")
+    return handler
 
-for i in pages:
-    func = funcGen(i, pages)
-    app.add_url_rule(pages[i]['canonical'], i, func)
+
+for name in pages:
+    app.add_url_rule(pages[name]['canonical'], name, make_page_handler(name))
+
+
+# ── Static file routes ───────────────────────────────────────────────
 
 @app.route("/resume")
 @app.route("/Resume.pdf")
@@ -106,46 +146,6 @@ for i in pages:
 def resume():
     return flask.send_file("./static/Resume_Simonson_Andrew.pdf")
 
-@app.errorhandler(HTTPerror.HTTPException)
-def handle_http_error(e):
-    eCode = e.code
-    message = e.description
-    pagevars = {
-        "template": "error.html",
-        "title": f"{eCode} - Simonson",
-        "description": "Error on Andrew Simonson's Digital Portfolio",
-        "canonical": f"/{eCode}",
-    }
-    return (
-        flask.render_template(
-            "header.html",
-            var=pagevars,
-            error=eCode,
-            message=message,
-            title=f"{eCode} - Simonson Portfolio",
-        ),
-        eCode,
-    )
-
-@app.errorhandler(Exception)
-def handle_generic_error(e):
-    pagevars = {
-        "template": "error.html",
-        "title": "500 - Simonson",
-        "description": "Error on Andrew Simonson's Digital Portfolio",
-        "canonical": "/500",
-    }
-    return (
-        flask.render_template(
-            "header.html",
-            var=pagevars,
-            error=500,
-            message="Internal Server Error",
-            title="500 - Simonson Portfolio",
-        ),
-        500,
-    )
-
 
 @app.route("/sitemap.xml")
 @app.route("/robots.txt")
@@ -153,10 +153,9 @@ def static_from_root():
     return flask.send_from_directory(app.static_folder, flask.request.path[1:])
 
 
+# ── Startup ───────────────────────────────────────────────────────────
+
 if __name__ == "__main__":
-    # import sass
-    # sass.compile(dirname=("static/scss", "static/css"), output_style="compressed")
     app.run(debug=False)
 else:
     Minify(app=app, html=True, js=True, cssless=True)
```
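The page-handler factory in `src/app.py` exists to sidestep Python's late-binding closure pitfall: a handler defined directly inside the registration loop would see the loop variable's final value for every route. A minimal standalone sketch (toy `pages` dict, no Flask) shows why the factory is needed:

```python
pages = {'home': {'canonical': '/'}, 'books': {'canonical': '/books'}}

def make_page_handler(pagename):
    # pagename is bound when the factory is called, so each
    # handler keeps a reference to its own page name
    def handler():
        return pagename
    return handler

# One handler per page, each with its own captured name
handlers = {name: make_page_handler(name) for name in pages}
```

Without the factory (i.e. `def handler(): return name` directly in a `for name in pages:` loop), every handler would return the last page iterated, so all routes would render the same page.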
398
src/monitor.py
398
src/monitor.py
@@ -1,114 +1,170 @@
|
||||
"""
|
||||
Service monitoring module
|
||||
Checks service availability and tracks uptime statistics
|
||||
Service monitoring module.
|
||||
Checks service availability and tracks uptime statistics in PostgreSQL.
|
||||
"""
|
||||
import requests
|
||||
import os
|
||||
import time
|
||||
import json
|
||||
from concurrent.futures import ThreadPoolExecutor
|
||||
from datetime import datetime, timedelta
|
||||
from threading import Thread, Lock
|
||||
from pathlib import Path
|
||||
|
||||
# Service configuration
|
||||
import psycopg2
|
||||
import requests
|
||||
|
||||
SERVICES = [
|
||||
{
|
||||
'id': 'main',
|
||||
'name': 'asimonson.com',
|
||||
'url': 'https://asimonson.com',
|
||||
'timeout': 10
|
||||
},
|
||||
{
|
||||
'id': 'files',
|
||||
'name': 'files.asimonson.com',
|
||||
'url': 'https://files.asimonson.com',
|
||||
'timeout': 10
|
||||
},
|
||||
{
|
||||
'id': 'git',
|
||||
'name': 'git.asimonson.com',
|
||||
'url': 'https://git.asimonson.com',
|
||||
'timeout': 10
|
||||
}
|
||||
{'id': 'main', 'name': 'asimonson.com', 'url': 'https://asimonson.com', 'timeout': 10},
|
||||
{'id': 'files', 'name': 'files.asimonson.com', 'url': 'https://files.asimonson.com', 'timeout': 10},
|
||||
{'id': 'git', 'name': 'git.asimonson.com', 'url': 'https://git.asimonson.com', 'timeout': 10},
|
||||
]
|
||||
|
||||
# Check interval: 30 mins
|
||||
CHECK_INTERVAL = 1800
|
||||
CHECK_INTERVAL = 60 # seconds between checks
|
||||
RETENTION_DAYS = 90 # how long to keep records
|
||||
CLEANUP_INTERVAL = 86400 # seconds between purge runs
|
||||
|
||||
DATABASE_URL = os.environ.get('DATABASE_URL')
|
||||
|
||||
# Expected columns (besides id) -- name: SQL type
|
||||
_EXPECTED_COLUMNS = {
|
||||
'service_id': 'VARCHAR(50) NOT NULL',
|
||||
'timestamp': 'TIMESTAMPTZ NOT NULL DEFAULT NOW()',
|
||||
'status': 'VARCHAR(20) NOT NULL',
|
||||
'response_time': 'INTEGER',
|
||||
'status_code': 'INTEGER',
|
||||
'error': 'TEXT',
|
||||
}
|
||||
|
||||
# File to store status history
|
||||
STATUS_FILE = Path(__file__).parent / 'static' / 'json' / 'status_history.json'
|
||||
|
||||
class ServiceMonitor:
|
||||
def __init__(self):
|
||||
self.status_data = {}
|
||||
self.lock = Lock()
|
||||
self.load_history()
|
||||
|
||||
def load_history(self):
|
||||
"""Load status history from file"""
|
||||
if STATUS_FILE.exists():
|
||||
try:
|
||||
with open(STATUS_FILE, 'r') as f:
|
||||
self.status_data = json.load(f)
|
||||
except Exception as e:
|
||||
print(f"Error loading status history: {e}")
|
||||
self.initialize_status_data()
|
||||
else:
|
||||
self.initialize_status_data()
|
||||
|
||||
def initialize_status_data(self):
|
||||
"""Initialize empty status data structure"""
|
||||
self.status_data = {
|
||||
'last_check': None,
|
||||
'services': {}
|
||||
}
|
||||
for service in SERVICES:
|
||||
self.status_data['services'][service['id']] = {
|
||||
'name': service['name'],
|
||||
'url': service['url'],
|
||||
self._current = {
|
||||
svc['id']: {
|
||||
'name': svc['name'],
|
||||
'url': svc['url'],
|
||||
'status': 'unknown',
|
||||
'response_time': None,
|
||||
'status_code': None,
|
||||
'last_online': None,
|
||||
'checks': [] # List of check results
|
||||
}
|
||||
for svc in SERVICES
|
||||
}
|
||||
self._last_check = None
|
||||
self._ensure_schema()
|
||||
|
||||
def save_history(self):
|
||||
"""Save status history to file"""
|
||||
# ── Database helpers ──────────────────────────────────────────
|
||||
|
||||
@staticmethod
|
||||
def _get_conn():
|
||||
"""Return a new psycopg2 connection, or None if DATABASE_URL is unset."""
|
||||
if not DATABASE_URL:
|
||||
return None
|
||||
return psycopg2.connect(DATABASE_URL)
|
||||
|
||||
def _ensure_schema(self):
|
||||
"""Create or migrate the service_checks table to match _EXPECTED_COLUMNS."""
|
||||
if not DATABASE_URL:
|
||||
print("DATABASE_URL not set -- running without persistence")
|
||||
return
|
||||
|
||||
conn = None
|
||||
for attempt in range(5):
|
||||
try:
|
||||
STATUS_FILE.parent.mkdir(parents=True, exist_ok=True)
|
||||
with open(STATUS_FILE, 'w') as f:
|
||||
json.dump(self.status_data, f, indent=2)
|
||||
except Exception as e:
|
||||
print(f"Error saving status history: {e}")
|
||||
conn = psycopg2.connect(DATABASE_URL)
|
||||
break
|
||||
except psycopg2.OperationalError:
|
||||
if attempt < 4:
|
||||
print(f"Database not ready, retrying in 2s (attempt {attempt + 1}/5)...")
|
||||
time.sleep(2)
|
||||
else:
|
||||
print("Could not connect to database -- running without persistence")
|
||||
return
|
||||
|
||||
try:
|
||||
with conn, conn.cursor() as cur:
|
||||
cur.execute("""
|
||||
CREATE TABLE IF NOT EXISTS service_checks (
|
||||
id SERIAL PRIMARY KEY,
|
||||
service_id VARCHAR(50) NOT NULL,
|
||||
timestamp TIMESTAMPTZ NOT NULL DEFAULT NOW(),
|
||||
status VARCHAR(20) NOT NULL,
|
||||
response_time INTEGER,
|
||||
status_code INTEGER,
|
||||
error TEXT
|
||||
);
|
||||
""")
|
||||
cur.execute("""
|
||||
CREATE INDEX IF NOT EXISTS idx_service_checks_service_timestamp
|
||||
ON service_checks (service_id, timestamp DESC);
|
||||
""")
|
||||
|
||||
# Introspect existing columns
|
||||
cur.execute("""
|
||||
SELECT column_name
|
||||
FROM information_schema.columns
|
||||
WHERE table_name = 'service_checks'
|
||||
""")
|
||||
existing = {row[0] for row in cur.fetchall()}
|
||||
|
||||
for col, col_type in _EXPECTED_COLUMNS.items():
|
||||
if col not in existing:
|
||||
bare_type = col_type.split('NOT NULL')[0].split('DEFAULT')[0].strip()
|
||||
cur.execute(f'ALTER TABLE service_checks ADD COLUMN {col} {bare_type}')
|
||||
print(f"Added column {col} to service_checks")
|
||||
|
||||
expected_names = set(_EXPECTED_COLUMNS) | {'id'}
|
||||
for col in existing - expected_names:
|
||||
cur.execute(f'ALTER TABLE service_checks DROP COLUMN {col}')
|
||||
print(f"Dropped column {col} from service_checks")
|
||||
|
||||
print("Database schema OK")
|
||||
finally:
|
||||
conn.close()
|
||||
|
||||
def _insert_check(self, service_id, result):
|
||||
"""Persist a single check result to the database."""
|
||||
conn = self._get_conn()
|
||||
if conn is None:
|
||||
return
|
||||
try:
|
||||
with conn, conn.cursor() as cur:
|
||||
cur.execute(
|
||||
"""INSERT INTO service_checks
|
||||
(service_id, timestamp, status, response_time, status_code, error)
|
||||
VALUES (%s, %s, %s, %s, %s, %s)""",
|
||||
(
|
||||
service_id,
|
||||
result['timestamp'],
|
||||
result['status'],
|
||||
result.get('response_time'),
|
||||
result.get('status_code'),
|
||||
result.get('error'),
|
||||
),
|
||||
)
|
||||
finally:
|
||||
conn.close()
|
||||
|
||||
# ── Service checks ────────────────────────────────────────────
|
||||
|
||||
def check_service(self, service):
|
||||
"""Check a single service and return status"""
|
||||
"""Perform an HTTP HEAD against a service and return a status dict."""
|
||||
start_time = time.time()
|
||||
result = {
|
||||
'timestamp': datetime.now().isoformat(),
|
||||
'status': 'offline',
|
||||
'response_time': None,
|
||||
'status_code': None
|
||||
'status_code': None,
|
||||
}
|
||||
|
||||
try:
|
||||
response = requests.head(
|
||||
service['url'],
|
||||
timeout=service['timeout'],
|
||||
allow_redirects=True
|
||||
allow_redirects=True,
|
||||
)
|
||||
|
||||
elapsed = int((time.time() - start_time) * 1000) # ms
|
||||
|
||||
result['response_time'] = elapsed
|
||||
result['response_time'] = int((time.time() - start_time) * 1000)
|
||||
result['status_code'] = response.status_code
|
||||
|
||||
# Consider 2xx and 3xx as online
|
||||
if 200 <= response.status_code < 400:
|
||||
result['status'] = 'online'
|
||||
elif 400 <= response.status_code < 500:
|
||||
# Client errors might still mean service is up
|
||||
if response.status_code < 500:
|
||||
result['status'] = 'online'
|
||||
else:
|
||||
result['status'] = 'degraded'
|
||||
@@ -123,10 +179,9 @@ class ServiceMonitor:
|
||||
return result
|
||||
|
||||
def check_all_services(self):
|
||||
"""Check all services and update status data"""
|
||||
"""Check every service concurrently, persist results, and update cache."""
|
||||
print(f"[{datetime.now().strftime('%Y-%m-%d %H:%M:%S')}] Checking all services...")
|
||||
|
||||
# Perform all network checks concurrently and OUTSIDE the lock
|
||||
results = {}
|
||||
with ThreadPoolExecutor(max_workers=len(SERVICES)) as executor:
|
||||
futures = {executor.submit(self.check_service, s): s for s in SERVICES}
|
||||
@@ -136,121 +191,154 @@ class ServiceMonitor:
|
||||
results[service['id']] = result
|
||||
print(f" {service['name']}: {result['status']} ({result['response_time']}ms)")
|
||||
|
||||
# Only acquire lock when updating the shared data structure
|
||||
for service_id, result in results.items():
|
||||
self._insert_check(service_id, result)
|
||||
|
||||
with self.lock:
|
||||
for service in SERVICES:
|
||||
result = results[service['id']]
|
||||
service_data = self.status_data['services'][service['id']]
|
||||
|
||||
# Update current status
|
||||
service_data['status'] = result['status']
|
||||
service_data['response_time'] = result['response_time']
|
||||
service_data['status_code'] = result['status_code']
|
||||
|
||||
cached = self._current[service['id']]
|
||||
cached['status'] = result['status']
|
||||
cached['response_time'] = result['response_time']
|
||||
cached['status_code'] = result['status_code']
|
||||
if result['status'] == 'online':
|
||||
service_data['last_online'] = result['timestamp']
|
||||
cached['last_online'] = result['timestamp']
|
||||
self._last_check = datetime.now().isoformat()
|
||||
|
||||
# Add to check history (keep last 2880 checks = 60 days at 2hr intervals)
|
||||
service_data['checks'].append(result)
|
||||
if len(service_data['checks']) > 2880:
|
||||
service_data['checks'] = service_data['checks'][-2880:]
|
||||
# ── Uptime calculations ───────────────────────────────────────
|
||||
|
||||
self.status_data['last_check'] = datetime.now().isoformat()
|
||||
self.save_history()
|
||||
|
||||
def _calculate_uptime_unlocked(self, service_id, hours=None):
|
||||
"""Calculate uptime percentage for a service (assumes lock is held)"""
|
||||
service_data = self.status_data['services'].get(service_id)
|
||||
if not service_data or not service_data['checks']:
|
||||
def _calculate_uptime(self, service_id, hours=None):
|
||||
"""Return uptime percentage for a service, or None if insufficient data."""
|
||||
conn = self._get_conn()
|
||||
if conn is None:
|
||||
return None
|
||||
|
||||
checks = service_data['checks']
|
||||
|
||||
# Filter by time period if specified
|
||||
try:
|
||||
with conn.cursor() as cur:
|
||||
if hours:
|
||||
cutoff = datetime.now() - timedelta(hours=hours)
|
||||
checks = [
|
||||
c for c in checks
|
||||
if datetime.fromisoformat(c['timestamp']) > cutoff
|
||||
]
|
||||
|
||||
if not checks:
|
||||
return None
|
||||
|
||||
# Require minimum data coverage for the time period
|
||||
# Calculate expected number of checks for this period
|
||||
expected_checks = (hours * 3600) / CHECK_INTERVAL
|
||||
|
||||
# Require at least 50% of expected checks to show this metric
|
||||
minimum_checks = max(3, expected_checks * 0.5)
|
||||
|
||||
if len(checks) < minimum_checks:
|
||||
return None
|
||||
cur.execute(
|
||||
"""SELECT
|
||||
COUNT(*) FILTER (WHERE status = 'online'),
|
||||
COUNT(*)
|
||||
FROM service_checks
|
||||
WHERE service_id = %s AND timestamp > %s""",
|
||||
(service_id, cutoff),
|
||||
)
|
||||
else:
|
||||
# For all-time, require at least 3 checks
|
||||
if len(checks) < 3:
|
||||
cur.execute(
|
||||
"""SELECT
|
||||
COUNT(*) FILTER (WHERE status = 'online'),
|
||||
COUNT(*)
|
||||
FROM service_checks
|
||||
WHERE service_id = %s""",
|
||||
(service_id,),
|
||||
)
|
||||
|
||||
online_count, total_count = cur.fetchone()
|
||||
if total_count == 0:
|
||||
return None
|
||||
|
||||
online_count = sum(1 for c in checks if c['status'] == 'online')
|
||||
uptime = (online_count / len(checks)) * 100
|
||||
# Only report a time-windowed uptime if data exists beyond the window
|
||||
if hours:
|
||||
cur.execute(
|
||||
'SELECT EXISTS(SELECT 1 FROM service_checks WHERE service_id = %s AND timestamp <= %s)',
|
||||
(service_id, cutoff),
|
||||
)
|
||||
if not cur.fetchone()[0]:
|
||||
return None
|
||||
|
||||
return round(uptime, 2)
|
||||
return round((online_count / total_count) * 100, 2)
|
||||
finally:
|
||||
conn.close()
|
||||
|
||||
def calculate_uptime(self, service_id, hours=None):
|
||||
"""Calculate uptime percentage for a service"""
|
||||
with self.lock:
|
||||
return self._calculate_uptime_unlocked(service_id, hours)
|
||||
def _get_total_checks(self, service_id):
|
||||
"""Return the total number of recorded checks for a service."""
|
||||
conn = self._get_conn()
|
||||
if conn is None:
|
||||
return 0
|
||||
try:
|
||||
with conn.cursor() as cur:
|
||||
cur.execute(
|
||||
'SELECT COUNT(*) FROM service_checks WHERE service_id = %s',
|
||||
(service_id,),
|
||||
)
|
||||
return cur.fetchone()[0]
|
||||
finally:
|
||||
conn.close()
|
||||
|
||||
# ── Status summary ────────────────────────────────────────────
|
||||
|
||||
def get_status_summary(self):
|
||||
"""Get current status summary with uptime statistics"""
|
||||
"""Build a JSON-serializable status summary with uptime statistics."""
|
||||
with self.lock:
|
||||
summary = {
|
||||
'last_check': self.status_data['last_check'],
|
||||
'last_check': self._last_check,
|
||||
'next_check': None,
|
||||
'services': []
|
||||
'services': [],
|
||||
}
|
||||
|
||||
# Calculate next check time
|
||||
if self.status_data['last_check']:
|
last_check = datetime.fromisoformat(self.status_data['last_check'])
next_check = last_check + timedelta(seconds=CHECK_INTERVAL)
summary['next_check'] = next_check.isoformat()
if self._last_check:
last_check = datetime.fromisoformat(self._last_check)
summary['next_check'] = (last_check + timedelta(seconds=CHECK_INTERVAL)).isoformat()

for service_id, service_data in self.status_data['services'].items():
service_summary = {
for service_id, cached in self._current.items():
summary['services'].append({
'id': service_id,
'name': service_data['name'],
'url': service_data['url'],
'status': service_data['status'],
'response_time': service_data['response_time'],
'status_code': service_data['status_code'],
'last_online': service_data['last_online'],
'name': cached['name'],
'url': cached['url'],
'status': cached['status'],
'response_time': cached['response_time'],
'status_code': cached['status_code'],
'last_online': cached['last_online'],
'uptime': {
'24h': self._calculate_uptime_unlocked(service_id, 24),
'7d': self._calculate_uptime_unlocked(service_id, 24 * 7),
'30d': self._calculate_uptime_unlocked(service_id, 24 * 30),
'all_time': self._calculate_uptime_unlocked(service_id)
'24h': self._calculate_uptime(service_id, 24),
'7d': self._calculate_uptime(service_id, 24 * 7),
'30d': self._calculate_uptime(service_id, 24 * 30),
'all_time': self._calculate_uptime(service_id),
},
'total_checks': len(service_data['checks'])
}
summary['services'].append(service_summary)
'total_checks': self._get_total_checks(service_id),
})

return summary

def start_monitoring(self):
"""Start background monitoring thread"""
def monitor_loop():
# Initial check
self.check_all_services()
# ── Background loop ───────────────────────────────────────────

def _purge_old_records(self):
"""Delete check records older than RETENTION_DAYS."""
conn = self._get_conn()
if conn is None:
return
try:
cutoff = datetime.now() - timedelta(days=RETENTION_DAYS)
with conn, conn.cursor() as cur:
cur.execute('DELETE FROM service_checks WHERE timestamp < %s', (cutoff,))
deleted = cur.rowcount
if deleted:
print(f"Purged {deleted} records older than {RETENTION_DAYS} days")
finally:
conn.close()

def start_monitoring(self):
"""Start the background daemon thread for periodic checks and cleanup."""
def monitor_loop():
self.check_all_services()
self._purge_old_records()

checks_since_cleanup = 0
checks_per_cleanup = CLEANUP_INTERVAL // CHECK_INTERVAL

# Periodic checks
while True:
time.sleep(CHECK_INTERVAL)
self.check_all_services()
checks_since_cleanup += 1
if checks_since_cleanup >= checks_per_cleanup:
self._purge_old_records()
checks_since_cleanup = 0

thread = Thread(target=monitor_loop, daemon=True)
thread.start()
print(f"Service monitoring started (checks every {CHECK_INTERVAL/3600} hours)")
print(f"Service monitoring started (checks every {CHECK_INTERVAL}s)")


# Global monitor instance
monitor = ServiceMonitor()

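For reference, the cleanup cadence introduced in `monitor_loop` purges once every `CLEANUP_INTERVAL // CHECK_INTERVAL` check cycles. A minimal sketch of that arithmetic (the interval values below are illustrative assumptions, not the repo's actual settings, which are defined outside this diff):

```python
# Illustrative values only; the real CHECK_INTERVAL / CLEANUP_INTERVAL
# are defined elsewhere in the app and are not shown in this diff.
CHECK_INTERVAL = 300       # assumed: 5 minutes between service checks
CLEANUP_INTERVAL = 86400   # assumed: one retention purge per day

def cycles_between_purges(check_interval=CHECK_INTERVAL,
                          cleanup_interval=CLEANUP_INTERVAL):
    """Number of check cycles between purges, as monitor_loop computes it."""
    return cleanup_interval // check_interval
```

With these assumed values the loop would run the purge every 288th check; the counter-and-reset pattern in the hunk above avoids a second timer thread.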
@@ -1,22 +1,23 @@
blinker==1.8.2
certifi==2024.7.4
charset-normalizer==3.3.2
click==8.1.7
Flask==3.0.3
Flask-Minify==0.48
gunicorn==22.0.0
blinker==1.9.0
certifi==2026.1.4
charset-normalizer==3.4.4
click==8.3.1
Flask==3.1.3
Flask-Minify==0.50
gunicorn==25.1.0
htmlminf==0.1.13
idna==3.7
idna==3.11
itsdangerous==2.2.0
Jinja2==3.1.4
Jinja2==3.1.6
jsmin==3.0.1
lesscpy==0.15.1
MarkupSafe==2.1.5
packaging==24.1
MarkupSafe==3.0.3
packaging==26.0
ply==3.11
rcssmin==1.1.2
requests==2.32.3
six==1.16.0
urllib3==2.2.2
Werkzeug==3.0.3
xxhash==3.4.1
psycopg2-binary==2.9.11
rcssmin==1.2.2
requests==2.32.5
six==1.17.0
urllib3==2.6.3
Werkzeug==3.1.6
xxhash==3.6.0

File diff suppressed because it is too large
@@ -1,70 +0,0 @@
.hidden {
display: none;
}

.hiddenup {
max-height: 0px !important;
}

.checkbox-wrapper > div {
display: inline-block;
margin-right: 1em;
margin-bottom: 1em;
}

.checkbox-wrapper > div:last-child {
margin-bottom: 0;;
}

.checkbox-wrapper .switch {
display: flex;
position: relative;
cursor: pointer;
}

.checkbox-wrapper .switch > * {
align-self: center;
}

.checkbox-wrapper .switch input {
display: none;
}


.checkbox-wrapper .slider {
background-color: #ccc;
transition: 0.4s;
height: 34px;
width: 60px;
}

.checkbox-wrapper .slider:before {
background-color: #fff;
bottom: 4px;
content: "";
height: 26px;
left: 4px;
position: absolute;
transition: 0.4s;
width: 26px;
}

.checkbox-wrapper input:checked+.slider {
background-color: #66bb6a;
}

.checkbox-wrapper input:checked+.slider:before {
transform: translateX(26px);
}

.checkbox-wrapper .slider.round {
border-radius: 34px;
}

.checkbox-wrapper .slider.round:before {
border-radius: 50%;
}

.checkbox-wrapper strong {
margin-left: .5em;
}
@@ -1,41 +0,0 @@
function toggleCheckbox(dir) {
let toggles = document.querySelectorAll(
".checkbox-wrapper input[type=checkbox]"
);
let allow = [];
toggles.forEach(function (x) {
if (x.checked) {
allow.push(x.id);
}
});
let list = document.querySelectorAll(".checkbox-client > div");
if (allow.length === 0) {
for (let i = 0; i < list.length; i++) {
list[i].classList.remove("hidden" + dir);
}
} else {
for (let i = 0; i < list.length; i++) {
list[i].classList.remove("hidden" + dir);
for (let x = 0; x < allow.length; x++) {
if (!list[i].classList.contains(allow[x])) {
list[i].classList.add("hidden" + dir);
break;
}
}
}
}
}

function activeSkill(obj) {
if (obj.parentElement.classList.contains("activeSkill")) {
obj.parentElement.classList.remove("activeSkill");
return;
}
// document.querySelectorAll(".skill").forEach((x) => {
// x.classList.remove("activeSkill");
// });
while (obj.parentElement.classList.contains("skill")) {
obj = obj.parentElement;
obj.classList.add("activeSkill");
}
}
@@ -7,17 +7,21 @@ async function addChessEmbed(username) {
setChess({ cName: "Chess.com request failed" });
return;
}

if (user.status === 200) {
user = await user.json();
stats = await stats.json();
const ratings = {
setChess({
cName: user["username"],
pic: user.avatar,
ratings: {
rapid: stats.chess_rapid.last.rating,
blitz: stats.chess_blitz.last.rating,
bullet: stats.chess_bullet.last.rating,
tactics: stats.tactics.highest.rating,
};
setChess({ cName: user["username"], pic: user.avatar, ratings: ratings });
} else if (user === null || user.status === 403 || user.status === null) {
},
});
} else if (user.status === 403) {
setChess({ cName: "Chess.com request failed" });
} else {
setChess({ cName: "User Not Found" });
@@ -33,16 +37,12 @@ function setChess({ cName = null, pic = null, ratings = null }) {
document.querySelector(".chessImage").src = pic;
}
if (ratings) {
document.querySelector(".chessRapid .chessStat").textContent =
ratings.rapid;
document.querySelector(".chessBlitz .chessStat").textContent =
ratings.blitz;
document.querySelector(".chessBullet .chessStat").textContent =
ratings.bullet;
document.querySelector(".chessPuzzles .chessStat").textContent =
ratings.tactics;
document.querySelector(".chessRapid .chessStat").textContent = ratings.rapid;
document.querySelector(".chessBlitz .chessStat").textContent = ratings.blitz;
document.querySelector(".chessBullet .chessStat").textContent = ratings.bullet;
document.querySelector(".chessPuzzles .chessStat").textContent = ratings.tactics;
}
} catch {
console.log("fucker clicking so fast the internet can't even keep up");
console.warn("Chess DOM elements not available (navigated away during fetch)");
}
}

@@ -1,8 +1,9 @@
const balls = [];
const density = 0.00003;
const density = 0.00005;
let screenWidth = window.innerWidth + 10;
let screenHeight = window.innerHeight + 10;

const MAX_DIST = 150;
const MAX_DIST_SQUARED = MAX_DIST * MAX_DIST;

class Ball {
constructor(x, y, size, speed, angle) {
this.x = x;
@@ -14,8 +17,9 @@ class Ball {
}

calcChange() {
this.xSpeed = this.speed * Math.sin((this.angle * Math.PI) / 180);
this.ySpeed = this.speed * Math.cos((this.angle * Math.PI) / 180);
const radians = (this.angle * Math.PI) / 180
this.xSpeed = this.speed * Math.sin(radians);
this.ySpeed = this.speed * Math.cos(radians);
}

update() {
@@ -44,19 +48,17 @@ class Ball {

function setup() {
frameRate(15);
const pix = screenHeight * screenWidth;
const pixels = screenHeight * screenWidth;
createCanvas(screenWidth, screenHeight);
for (let i = 0; i < pix * density; i++) {
let thisBall = new Ball(
for (let i = 0; i < pixels * density; i++) {
balls.push(new Ball(
random(screenWidth),
random(screenHeight),
random(6) + 3,
Math.exp(random(4) + 3) / 1000 + 1,
random(360)
);
balls.push(thisBall);
));
}

stroke(255);
}

@@ -69,40 +71,34 @@ function windowResized() {
function draw() {
background(24);

// Update all balls
for (let i = 0; i < balls.length; i++) {
balls[i].update();
}

// Optimize line drawing with early distance checks
const maxDist = 150;
const maxDistSquared = maxDist * maxDist; // Avoid sqrt in distance calculation
// Draw connection lines with additive blending so overlaps brighten
blendMode(ADD);
strokeWeight(2);

for (let i = 0; i < balls.length - 1; i++) {
const ball1 = balls[i];
const a = balls[i];
for (let j = i + 1; j < balls.length; j++) {
const ball2 = balls[j];

// Quick rejection test using squared distance (faster than sqrt)
const dx = ball2.x - ball1.x;
const dy = ball2.y - ball1.y;
const b = balls[j];
const dx = b.x - a.x;
const dy = b.y - a.y;
const distSquared = dx * dx + dy * dy;

if (distSquared < maxDistSquared) {
const distance = Math.sqrt(distSquared); // Only calculate sqrt if needed

if (distance < 100) {
stroke(150);
line(ball1.x, ball1.y, ball2.x, ball2.y);
if (distSquared < MAX_DIST_SQUARED) {
const distance = Math.sqrt(distSquared);
if (distance < 75) {
stroke(255, 85);
} else {
stroke(100);
const chance = 0.3 ** (((random(0.2) + 0.8) * distance) / 150);
if (chance < 0.5) {
stroke(50);
}
line(ball1.x, ball1.y, ball2.x, ball2.y);
const chance = 0.3 ** (((random(0.2) + 0.8) * distance) / MAX_DIST);
stroke(255, chance < 0.5 ? 40 : 75);
}
line(a.x, a.y, b.x, b.y);
}
}
}

blendMode(BLEND);
}

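The hot path in `draw()` above relies on a standard trick: compare squared distances against a squared threshold and only take the square root for pairs that pass. A quick sketch of the same rejection test (function name and return convention are mine, for illustration):

```python
import math

MAX_DIST = 150
MAX_DIST_SQUARED = MAX_DIST * MAX_DIST

def connected(ax, ay, bx, by, max_dist_sq=MAX_DIST_SQUARED):
    """Early-reject pair test: skip math.sqrt unless the pair is close enough.

    Returns the actual distance for connected pairs, or None for rejects.
    """
    dx, dy = bx - ax, by - ay
    dist_sq = dx * dx + dy * dy
    if dist_sq >= max_dist_sq:
        return None               # rejected on squared distance, no sqrt taken
    return math.sqrt(dist_sq)     # sqrt only for pairs that will be drawn
```

For an O(n²) pair loop this matters: most pairs fail the threshold, so most iterations never pay for a square root.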
@@ -1,67 +1,107 @@
function toggleMenu(collapse=false) {
function toggleMenu(collapse) {
if (window.innerWidth < 1400) {
const e = document.querySelector(".navControl");
const menu = document.querySelector(".navControl");
const bar = document.querySelector(".header");
const isCollapsed = !e.style.maxHeight || e.style.maxHeight === "0px";
const isCollapsed = !menu.style.maxHeight || menu.style.maxHeight === "0px";
if (isCollapsed && !collapse) {
e.style.maxHeight = `${e.scrollHeight + 10}px`;
menu.style.maxHeight = `${menu.scrollHeight + 10}px`;
bar.style.borderBottomWidth = "0px";
} else {
e.style.maxHeight = "0px";
menu.style.maxHeight = "0px";
bar.style.borderBottomWidth = "3px";
}
}
}

async function goto(location, { push = true } = {}) {
let a;
const loadingBar = document.getElementById('loading-bar');

if (loadingBar) {
loadingBar.style.width = ''; // Clear inline style from previous run
}

let loadingTimeout = setTimeout(() => {
if (loadingBar) {
loadingBar.classList.remove('finish');
loadingBar.classList.add('active');
loadingBar.classList.add('visible');
}
}, 150);

try {
a = await fetch("/api/goto/" + location, {
const response = await fetch("/api/goto/" + location, {
credentials: "include",
method: "GET",
mode: "cors",
});
if (!a.ok) {
console.error(`Navigation failed: HTTP ${a.status}`);
return;
}
} catch (err) {
console.error("Navigation fetch failed:", err);
return;

if (!response.ok) {
throw new Error(`HTTP ${response.status}`);
}

// Wait for the full body to download - this is usually the slow part
const [metadata, content] = await response.json();

document.dispatchEvent(new Event('beforenavigate'));

const response = await a.json();
const metadata = response[0];
const content = response[1];
const root = document.getElementById("root");
root.innerHTML = content;
root.querySelectorAll("script").forEach((oldScript) => {

// Re-execute scripts
root.querySelectorAll("script").forEach(function(oldScript) {
const newScript = document.createElement("script");
Array.from(oldScript.attributes).forEach(attr => {
Array.from(oldScript.attributes).forEach(function(attr) {
newScript.setAttribute(attr.name, attr.value);
});
newScript.textContent = oldScript.textContent;
oldScript.parentNode.replaceChild(newScript, oldScript);
});

if (!window.location.href.includes("#")) {
window.scrollTo({top: 0, left: 0, behavior:"instant"});
} else {
const eid = decodeURIComponent(window.location.hash.substring(1));
const el = document.getElementById(eid);
if (window.location.href.includes("#")) {
const id = decodeURIComponent(window.location.hash.substring(1));
const el = document.getElementById(id);
if (el) el.scrollIntoView();
} else {
window.scrollTo({ top: 0, left: 0, behavior: "instant" });
}

toggleMenu(collapse=true);
toggleMenu(true);
document.querySelector("title").textContent = metadata["title"];
if (push) {
history.pushState(null, null, metadata["canonical"]);
}

} catch (err) {
console.error("Navigation failed:", err);
} finally {
clearTimeout(loadingTimeout);
if (loadingBar && loadingBar.classList.contains('active')) {
loadingBar.classList.add('finish');
loadingBar.classList.remove('active');
setTimeout(() => {
if (!loadingBar.classList.contains('active')) {
loadingBar.style.width = '0%';
loadingBar.classList.remove('finish');
loadingBar.classList.remove('visible');
}
}, 500);
}
}
}

function backButton() {
const location = window.location.pathname;
goto(location.substring(1), { push: false }); // remove slash, goto already does that
const path = window.location.pathname;
goto(path.substring(1), { push: false });
}

function activeSkill(obj) {
let skill = obj.closest(".skill");
if (skill.classList.contains("activeSkill")) {
skill.classList.remove("activeSkill");
return;
}
while (skill) {
skill.classList.add("activeSkill");
skill = skill.parentElement.closest(".skill");
}
}

@@ -1,8 +1,9 @@
// Fetch and display service status from API
// Use a global to track the interval and ensure we don't stack listeners
if (window.statusIntervalId) {
clearInterval(window.statusIntervalId);
window.statusIntervalId = null;
}

/**
* Fetch status data from server
*/
async function fetchStatus() {
try {
const response = await fetch('/api/status');
@@ -17,36 +18,28 @@ async function fetchStatus() {
}
}

/**
* Update the status display with fetched data
*/
function updateStatusDisplay(data) {
// Update last check time
if (data.last_check) {
const lastCheck = new Date(data.last_check);
const timeString = lastCheck.toLocaleString();
document.getElementById('lastUpdate').textContent = `Last checked: ${timeString}`;
const lastUpdateEl = document.getElementById('lastUpdate');
if (lastUpdateEl) lastUpdateEl.textContent = `Last checked: ${lastCheck.toLocaleString()}`;
}

// Update next check time
if (data.next_check) {
const nextCheck = new Date(data.next_check);
const timeString = nextCheck.toLocaleString();
const nextCheckEl = document.getElementById('nextUpdate');
if (nextCheckEl) {
nextCheckEl.textContent = `Next check: ${timeString}`;
const nextCheck = new Date(data.next_check);
nextCheckEl.textContent = `Next check: ${nextCheck.toLocaleString()}`;
}
}

// Update each service
data.services.forEach(service => {
if (data.services) {
data.services.forEach(function(service) {
updateServiceCard(service);
});

// Update overall status
updateOverallStatus(data.services);
}

// Re-enable refresh button
const refreshBtn = document.getElementById('refreshBtn');
if (refreshBtn) {
refreshBtn.disabled = false;
@@ -54,9 +47,19 @@ function updateStatusDisplay(data) {
}
}

/**
* Update a single service card
*/
function getUptimeClass(value) {
if (value === null) return 'text-muted';
if (value >= 99) return 'text-excellent';
if (value >= 95) return 'text-good';
if (value >= 90) return 'text-fair';
return 'text-poor';
}

function formatUptime(value, label) {
const display = value !== null ? `${value}%` : '--';
return `${label}: <strong class="${getUptimeClass(value)}">${display}</strong>`;
}

function updateServiceCard(service) {
const card = document.getElementById(`status-${service.id}`);
if (!card) return;
@@ -68,127 +71,101 @@ function updateServiceCard(service) {
const uptimeDisplay = document.getElementById(`uptime-${service.id}`);
const checksDisplay = document.getElementById(`checks-${service.id}`);

// Update response time
if (service.response_time !== null) {
timeDisplay.textContent = `${service.response_time}ms`;
} else {
timeDisplay.textContent = '--';
}
if (timeDisplay) timeDisplay.textContent = service.response_time !== null ? `${service.response_time}ms` : '--';

// Update status code
if (codeDisplay) {
if (service.status_code !== null) {
codeDisplay.textContent = service.status_code;
} else {
codeDisplay.textContent = service.status === 'unknown' ? 'Unknown' : 'Error';
}
}

// Update status indicator
card.classList.remove('online', 'degraded', 'offline', 'unknown');

switch (service.status) {
case 'online':
stateDot.className = 'state-dot online';
stateText.textContent = 'Operational';
if (stateDot) stateDot.className = 'state-dot online';
if (stateText) stateText.textContent = 'Operational';
card.classList.add('online');
break;
case 'degraded':
case 'timeout':
stateDot.className = 'state-dot degraded';
stateText.textContent = service.status === 'timeout' ? 'Timeout' : 'Degraded';
if (stateDot) stateDot.className = 'state-dot degraded';
if (stateText) stateText.textContent = service.status === 'timeout' ? 'Timeout' : 'Degraded';
card.classList.add('degraded');
break;
case 'offline':
stateDot.className = 'state-dot offline';
stateText.textContent = 'Offline';
if (stateDot) stateDot.className = 'state-dot offline';
if (stateText) stateText.textContent = 'Offline';
card.classList.add('offline');
break;
default:
stateDot.className = 'state-dot loading';
stateText.textContent = 'Unknown';
if (stateDot) stateDot.className = 'state-dot loading';
if (stateText) stateText.textContent = 'Unknown';
card.classList.add('unknown');
}

// Update uptime statistics
if (uptimeDisplay && service.uptime) {
const uptimeHTML = [];

// Helper function to get color class based on uptime percentage
const getUptimeClass = (value) => {
if (value === null) return 'text-muted';
if (value >= 99) return 'text-excellent';
if (value >= 95) return 'text-good';
if (value >= 90) return 'text-fair';
return 'text-poor';
};

// Helper function to format uptime value
const formatUptime = (value, label) => {
const display = value !== null ? `${value}%` : '--';
const colorClass = getUptimeClass(value);
return `${label}: <strong class="${colorClass}">${display}</strong>`;
};

// Add all uptime metrics
uptimeHTML.push(formatUptime(service.uptime['24h'], '24h'));
uptimeHTML.push(formatUptime(service.uptime['7d'], '7d'));
uptimeHTML.push(formatUptime(service.uptime['30d'], '30d'));
uptimeHTML.push(formatUptime(service.uptime.all_time, 'All'));

uptimeDisplay.innerHTML = uptimeHTML.join(' | ');
uptimeDisplay.innerHTML = [
formatUptime(service.uptime['24h'], '24h'),
formatUptime(service.uptime['7d'], '7d'),
formatUptime(service.uptime['30d'], '30d'),
formatUptime(service.uptime.all_time, 'All'),
].join(' | ');
}

// Update total checks
if (checksDisplay && service.total_checks !== undefined) {
checksDisplay.textContent = service.total_checks;
}
}

/**
* Update overall status bar
*/
function updateOverallStatus(services) {
const overallBar = document.getElementById('overallStatus');
if (!overallBar) return;

const icon = overallBar.querySelector('.summary-icon');
const title = overallBar.querySelector('.summary-title');
const subtitle = document.getElementById('summary-subtitle');
const onlineCount = document.getElementById('onlineCount');
const totalCount = document.getElementById('totalCount');

// Count service statuses
const total = services.length;
const online = services.filter(s => s.status === 'online').length;
const degraded = services.filter(s => s.status === 'degraded' || s.status === 'timeout').length;
const offline = services.filter(s => s.status === 'offline').length;
const online = services.filter(function(s) { return s.status === 'online'; }).length;
const degraded = services.filter(function(s) { return s.status === 'degraded' || s.status === 'timeout'; }).length;
const offline = services.filter(function(s) { return s.status === 'offline'; }).length;

// Update counts
onlineCount.textContent = online;
totalCount.textContent = total;
if (onlineCount) onlineCount.textContent = online;
if (totalCount) totalCount.textContent = total;

// Remove all status classes
overallBar.classList.remove('online', 'degraded', 'offline');
icon.classList.remove('operational', 'partial', 'major', 'loading');
if (icon) icon.classList.remove('operational', 'partial', 'major', 'loading');

// Determine overall status
if (online === total) {
// All systems operational
overallBar.classList.add('online');
if (icon) {
icon.classList.add('operational');
icon.textContent = '\u2713';
title.textContent = 'All Systems Operational';
subtitle.textContent = `All ${total} services are running normally`;
}
if (title) title.textContent = 'All Systems Operational';
if (subtitle) subtitle.textContent = `All ${total} services are running normally`;
} else if (offline >= Math.ceil(total / 2)) {
// Major outage (50% or more offline)
overallBar.classList.add('offline');
if (icon) {
icon.classList.add('major');
icon.textContent = '\u2715';
title.textContent = 'Major Outage';
subtitle.textContent = `${offline} service${offline !== 1 ? 's' : ''} offline, ${degraded} degraded`;
}
if (title) title.textContent = 'Major Outage';
if (subtitle) subtitle.textContent = `${offline} service${offline !== 1 ? 's' : ''} offline, ${degraded} degraded`;
} else if (offline > 0 || degraded > 0) {
// Partial outage
overallBar.classList.add('degraded');
if (icon) {
icon.classList.add('partial');
icon.textContent = '\u26A0';
title.textContent = 'Partial Outage';
}
if (title) title.textContent = 'Partial Outage';
if (subtitle) {
if (offline > 0 && degraded > 0) {
subtitle.textContent = `${offline} offline, ${degraded} degraded`;
} else if (offline > 0) {
@@ -196,18 +173,17 @@ function updateOverallStatus(services) {
} else {
subtitle.textContent = `${degraded} service${degraded !== 1 ? 's' : ''} degraded`;
}
}
} else {
// Unknown state
if (icon) {
icon.classList.add('loading');
icon.textContent = '\u25D0';
title.textContent = 'Status Unknown';
subtitle.textContent = 'Waiting for service data';
}
if (title) title.textContent = 'Status Unknown';
if (subtitle) subtitle.textContent = 'Waiting for service data';
}
}

/**
* Show error message
*/
function showError(message) {
const errorDiv = document.createElement('div');
errorDiv.className = 'status-error';
@@ -217,13 +193,10 @@ function showError(message) {
const container = document.querySelector('.foregroundContent');
if (container) {
container.insertBefore(errorDiv, container.firstChild);
setTimeout(() => errorDiv.remove(), 5000);
setTimeout(function() { errorDiv.remove(); }, 5000);
}
}

/**
* Manual refresh
*/
function refreshStatus() {
const refreshBtn = document.getElementById('refreshBtn');
if (refreshBtn) {
@@ -233,32 +206,24 @@ function refreshStatus() {
fetchStatus();
}

/**
* Initialize on page load
*/
var statusIntervalId = null;

function initStatusPage() {
// Clear any existing interval from a previous SPA navigation
if (statusIntervalId !== null) {
clearInterval(statusIntervalId);
if (window.statusIntervalId) {
clearInterval(window.statusIntervalId);
}
fetchStatus();
// Auto-refresh every 5 minutes to get latest data
statusIntervalId = setInterval(fetchStatus, 300000);
window.statusIntervalId = setInterval(fetchStatus, 60000);
}

// Clean up interval when navigating away via SPA
document.addEventListener('beforenavigate', () => {
if (statusIntervalId !== null) {
clearInterval(statusIntervalId);
statusIntervalId = null;
function cleanupStatusPage() {
if (window.statusIntervalId) {
clearInterval(window.statusIntervalId);
window.statusIntervalId = null;
}
});
document.removeEventListener('beforenavigate', cleanupStatusPage);
}

// Start when page loads
if (document.readyState === 'loading') {
document.addEventListener('DOMContentLoaded', initStatusPage);
} else {
document.addEventListener('beforenavigate', cleanupStatusPage);

if (document.getElementById('overallStatus')) {
initStatusPage();
}

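The `getUptimeClass` thresholds above band an uptime percentage into a display class. The same banding, sketched in Python for clarity (class names are taken from the diff; the function name is mine):

```python
def uptime_class(value):
    """Mirror of getUptimeClass: band an uptime percentage into a CSS class."""
    if value is None:
        return 'text-muted'      # no data yet
    if value >= 99:
        return 'text-excellent'
    if value >= 95:
        return 'text-good'
    if value >= 90:
        return 'text-fair'
    return 'text-poor'
```

Ordering the comparisons from highest band down means each value falls through to the first threshold it meets, so no explicit range checks are needed.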
@@ -3,7 +3,7 @@
"status": "complete",
"classes": "geospacial",
"bgi": "watershedTemps.png",
"content": "Geospacial analysis of Maryland's Antietam and Conococheague sub-watersheds, monitoring water quality and temperatures through the summer months for reporting to governmental review boards for environmental protection"
"content": "Live geospacial analysis of Maryland's Antietam and Conococheague sub-watersheds, monitoring water quality and temperatures through the summer months for governmental environment health review boards."
},
"Automotive Brand Valuation Analysis": {
"status": "complete",

@@ -1,41 +1,35 @@
{
"Tools": {
"Microsoft Azure": {
"Databricks": {},
"Data Factory": {},
"Stream Analytics": {}
},
"Databricks": {},
"Apache Spark": {},
"Visual Basic for Applications (Excel)": {}
},
"Data and AI": {
"Python": {
"PyTorch/TensorFlow": {},
"Numpy/Pandas": {},
"Scikit/Sklearn": {},
"Selenium/BS4": {},
"Pyspark": {}
"ML": {
"PySpark ML": {},
"Numpy/Pandas/Polars": {},
"TensorFlow": {},
"Scikit": {}
},
"R": {},
"SQL": {}
},
"Frontend": {
"Flask (Python)": {},
"React (Javascript)": {},
"SASS/SCSS": {}
},
"Backend & DevOps": {
"Backend": {
"Rust": {},
"C#": {}
"PySpark": {},
"Selenium/BS4 Web Hacking": {},
"SQL": {},
"Declarative Pipelines": {},
"ArcGIS": {}
},
"DevOps": {
"Docker": {},
"Microsoft Azure": {},
"Databricks": {},
"Kubernetes/Openshift": {},
"Cloudflare": {},
"Bash": {}
}
},
"Frontend": {
"Flask (Python)": {},
"REST APIs": {},
"Web Scraping": {}
},
"Offline Skills": {
"Circuitry": {},
"Skiing": {},
"Chess": {},
"Plinking": {},
"Building something with trash that solves my problems": {}
}
}

@@ -5,7 +5,7 @@
<loc>https://asimonson.com/projects</loc>
<loc>https://asimonson.com/Resume</loc>
<loc>https://asimonson.com/duck</loc>
<loc>https://asimonson.com/books</loc>
<lastmod>2024-07-24</lastmod>
<loc>https://asimonson.com/status</loc>
<lastmod>2026-02-12</lastmod>
</url>
</urlset>
@@ -20,7 +20,7 @@
|
||||
property="og:image"
|
||||
content="{{ url_for('static', filename='icons/rasterLogoCircle.png') }}"
|
||||
/>
|
||||
<meta property="og:url" content="{{ var['canonical'] }}" />
|
||||
<meta property="og:url" content="{{ request.url_root | trim('/') }}{{ var['canonical'] }}" />
|
||||
<meta property="twitter:title" content="Andrew Simonson" />
|
||||
<meta name="twitter:description" content="{{ var['description'] }}" />
|
||||
<meta name="twitter:card" content="summary_large_image" />
|
||||
@@ -50,16 +50,16 @@
      rel="stylesheet"
      href="{{ url_for('static', filename='css/App.css') }}"
    />
    <link rel="canonical" href="{{ var['canonical'] }}" />
    <script defer src="{{ url_for('static', filename='js/checkbox.js') }}"></script>
    <link rel="canonical" href="{{ request.url_root | trim('/') }}{{ var['canonical'] }}" />
    <script defer src="{{ url_for('static', filename='js/responsive.js') }}"></script>
    <script src="{{ url_for('static', filename='js/chessbed.js') }}"></script>
    {# <script src="{{ url_for('static', filename='js/chessbed.js') }}"></script> #}
    <script defer src="{{ url_for('static', filename='js/idler.js') }}"></script>
    <script defer src="https://cdn.jsdelivr.net/npm/p5@1.4.1/lib/p5.min.js"></script>
    <title>{{ var['title'] }}</title>
  </head>
  {% block header %}
  <body onpopstate="backButton()">
    <div id="loading-bar"></div>
    <noscript>You need to enable JavaScript to run this app.</noscript>
    <main id='map'></main>
    <div id="contentStuffer">
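The canonical and og:url changes above concatenate `request.url_root` (which Flask always terminates with `/`) onto a canonical path that already starts with `/`; Jinja's `trim('/')` filter strips the trailing slash so the join doesn't double it. A minimal sketch of that join in plain Python (the function name and sample values are illustrative; `str.strip('/')` stands in for the template filter):

```python
def absolute_canonical(url_root: str, canonical: str) -> str:
    """Join the site root and an absolute path without doubling the slash.

    url_root mimics Flask's request.url_root, which always ends in "/";
    canonical mimics the template's var['canonical'], which starts with "/".
    """
    return url_root.strip("/") + canonical


# Without the trim, the result would be "https://asimonson.com//projects".
assert absolute_canonical("https://asimonson.com/", "/projects") == "https://asimonson.com/projects"
```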
@@ -20,28 +20,27 @@
      <!--<INSERT SMALL BANNER HERE FOR PROJECT IMAGECARD CAROUSEL>-->
      <div id="desktopSpacer"></div>
      <div class="homeSubContent">
        <img class='blinkies' alt='My Brain is Glowing' src="{{ url_for('static', filename='photos/blinkies/brainglow.gif') }}" loading="lazy" />
        <img class='blinkies' alt='Pepsi Addict' src="{{ url_for('static', filename='photos/blinkies/pepsiaddict.gif') }}" loading="lazy" />
        <img class='blinkies' alt='I Fear No Beer' src="{{ url_for('static', filename='photos/blinkies/fearnobeer.gif') }}" loading="lazy" />
        <img class='blinkies' alt='Secret Message' src="{{ url_for('static', filename='photos/blinkies/tooclose.gif') }}" loading="lazy" />
        <img class='blinkies' alt="They took my blood but it wasn't DNA, it was USA" src="{{ url_for('static', filename='photos/blinkies/usa.gif') }}" loading="lazy" />
        <img class='blinkies' alt='Bob the Builder gif' src="{{ url_for('static', filename='photos/blinkies/bobthebuilder.gif') }}" loading="lazy" />
        <div>
          <br />
          <strong> You've reached the website for Andrew Simonson's personal online shenanigans.</strong>
          <strong> You've reached the website for Andrew Simonson's digital shenanigans.</strong>
          <h3>Now What?</h3>
          <p>
            Go back and find the link that I originally shared. Or poke around. Be your own person.</br>
            I guess I'll grant myself some titles while I'm at it:
            I'll grant myself some titles while I'm at it:
          </p>
          <ul>
            <li>Load-Bearing Coconut</li>
            <li>Wicked Wizard of the West</li>
            <li>Enemy of Node.js, Hater of Bloat</li>
            <li>Load-Bearing Coconut</li>
            <li>Creator and Harnesser of Energy</li>
          </ul>
        </div>
        <br />
        {#
        <div id="aboutCards" class="flex">
          <div class="chess">
            {% from 'partials/chess.html' import chess %} {{

@@ -53,6 +52,7 @@
          </div>
          <br />
        </div>
        #}
      </div>
      {% endblock %}
    </div>
@@ -1,35 +1,32 @@
{% macro project(title, classes, status, bgi, content, links) %}
<div class="project {{ classes }}" data-aos="fade-up">
  <div class="projTitle">
    <h3>{{ title }}</h3>
    <p><span class="state-dot {{ status }}"></span> {{ status }}</p>
  </div>
  <div class="projBody mobileV">
    <div class="projImage">
      {% if bgi|length > 0 %} {% set path = url_for('static',
      filename='photos/projects/' + bgi) %}
      <img class="" src="{{ path }}" alt="Ref image for {{ title }} project" />
  <div class="projImageWrap">
    {% if bgi|length > 0 %}
      {% set path = url_for('static', filename='photos/projects/' + bgi) %}
      <img src="{{ path }}" alt="Ref image for {{ title }} project" />
    {% else %}
      <div class="projImagePlaceholder"></div>
    {% endif %}
    <div class="proj-status-badge {{ status }}">
      <span class="status-indicator"></span>{{ status }}
    </div>

    <div class="grow backedBody">
      <div class="projDesc vFlex spaceBetween">
        <p>{{ content }}</p>
      </div>
  <div class="projContent">
    <h3>{{ title }}</h3>
    <p class="projDesc">{{ content }}</p>
    {% if links %}
      <div class="projLinks">
        {% for i in links %} {% set src = 'icons/' + i[0] + '.svg' %}
        {% for i in links %}
          {% set src = 'icons/' + i[0] + '.svg' %}
          {% if i[1].startswith('https://') or i[1].startswith('http://') %}
          <a href="{{i[1]}}" rel="noopener noreferrer">
            <img
              class="projectLink"
              src="{{ url_for('static', filename=src) }}"
              alt="{{i[0]}}"
            />
          <a href="{{ i[1] }}" rel="noopener noreferrer" class="proj-link">
            <img class="projectLink" src="{{ url_for('static', filename=src) }}" alt="{{ i[0] }}" />
            <span>{{ i[2] }}</span>
          </a>
          {% endif %}
        {% endfor %}
      </div>
    </div>
  </div>
  {% endif %}
  </div>
</div>
{% endmacro %}
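The `project` macro above only renders a link whose URL begins with an explicit `http://` or `https://` scheme, and each link is a three-element tuple of icon name, URL, and label. The same guard, sketched in plain Python (the tuple shape is taken from the macro's `i[0]`/`i[1]`/`i[2]` accesses; the sample data is invented for illustration):

```python
def is_external(url: str) -> bool:
    """Mirror of the macro's scheme check: only explicit http(s) links render."""
    return url.startswith("https://") or url.startswith("http://")


# Link tuples follow the macro's (icon, url, label) shape.
links = [
    ("github", "https://github.com/asimonson1125", "Source"),
    ("docs", "javascript:alert(1)", "Docs"),  # no http(s) scheme: filtered out
]
rendered = [label for _icon, url, label in links if is_external(url)]
```

Filtering on the scheme this way also keeps `javascript:` and other non-HTTP URLs out of the rendered anchors, which is a small defensive win when link data comes from a config file.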
@@ -1,9 +1,15 @@
{% macro expandSkill(dict, name, classes="") %}
<div class='skill {{ classes }}' data-length='{{ dict[name]|length }}'>
  <div onclick='activeSkill(this)' class='skillname'>{{ name }}</div>
  {% if dict[name]|length > 0 %}
    <div class='skill-children'>
      <div class='skill-children-inner'>
        {% for child in dict[name] %}
          {{ expandSkill(dict[name], child) }}
        {% endfor %}
      </div>
    </div>
  {% endif %}
</div>
{% endmacro %}
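The `expandSkill` macro above walks a nested dict: each key is a skill, each value is a dict of child skills, and the macro recurses by passing `dict[name]` down until it hits an empty dict. A plain-Python mirror of that recursion (function name and indentation scheme are illustrative, not from the repo):

```python
def expand_skill(tree: dict, name: str, depth: int = 0) -> list[str]:
    """Render one skill and, recursively, its children (mirrors expandSkill)."""
    lines = ["  " * depth + name]
    for child in tree[name]:          # empty dict => no recursion, leaf skill
        lines.extend(expand_skill(tree[name], child, depth + 1))
    return lines


skills = {"Offline Skills": {"Circuitry": {}, "Chess": {}}}
assert expand_skill(skills, "Offline Skills") == ["Offline Skills", "  Circuitry", "  Chess"]
```

Because children are just nested dicts, the same data structure drives both the macro's `data-length` attribute (child count) and the recursion depth.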
@@ -1,6 +1,5 @@
<div class='socials'>
  <a href='https://github.com/asimonson1125'><img alt='Github' src="{{ url_for('static', filename='icons/github.svg') }}" /></a>
  <a href='https://www.instagram.com/an_a.simonson/'><img alt='Instagram' src="{{ url_for('static', filename='icons/instagram.svg') }}" /></a>
  <a href='https://www.linkedin.com/in/simonsonandrew/'><img alt='LinkedIn' src="{{ url_for('static', filename='icons/linkedin.svg') }}" /></a>
  <a href='mailto:asimonson1125@gmail.com'><img alt='E-mail' src="{{ url_for('static', filename='icons/email.svg') }}" /></a>
  <div id='vertLine'></div>
@@ -1,13 +0,0 @@
{% macro timeitem(title, classes, date, deets) %}
<div class="timeitem {{classes}}">
  <p class="datetext">{{date}}</p>
  <div class="timeline-item">
    <h2>{{title}}</h2>
    <div class="timeline-deets">
      <p>
        {{deets}}
      </p>
    </div>
  </div>
</div>
{% endmacro %}
@@ -3,7 +3,8 @@
<div class="foregroundContent">
  <div class="flex equalitems vertOnMobile">
    <div>
      <h2 class="concentratedHead">About Me</h2>
    <div>
      <h2 class="concentratedHead">About Me<p><sup>Data Scientist, Amateur SysAdmin, Polymath</sup></p></h2>
      <p>
        I'm Andrew Simonson<!--, CEO of the anti-thermodynamics syndicate.-->,
        a <strong>Data Scientist at Ecolab</strong> and a graduate Data
@@ -12,30 +13,32 @@
        recently completed the <b>Computer Science BS</b> program
        (international relations minor) with a focus on probability
        theory.
        <br />
        <!-- <br />
        <br />
        I started in ~2017, reverse engineering probablistic logic
        models in games and developing interfaces to recreate my
        findings for friends. Now I develop tracable AI built on
        deductive reasoning, maintaning scientific methodology in an
        industry obsessed with implicit rules and exclusive empiricism.
        <!-- As the analysis grew more sophisticated, so too did the tech
        As the analysis grew more sophisticated, so too did the tech
        stack - to the point that I now manage most services, like this
        website, end to end, container image to insight visual. -->
        <br />
        <br />
        I get bored and throw random stuff on this website. It's a form
        of unprofessional development and I swear by this form of
        learning.
        I get bored and throw random stuff on this website.<br/>
        This is what unprofessional development looks like.
      </p>
      <h3 class='concentratedHead'>
    </div>
    <br/>
    <br/>
    <h4 class='concentratedHead'>
      I also have a
      <a href="Resume_Simonson_Andrew.pdf" target="_blank">resume</a>
      for some reason.
    </h3>
      for unexplained reasons.
    </h4>
  </div>
  <div id="skills">
    <h2 id="skillstag">Skills</h2>
    <h2 id="skillstag">Technologies</h2>
    {% from 'partials/skills.html' import skills %} {{
    skills(var['skillList']) }}
  </div>
@@ -43,32 +46,7 @@

  <br />
  <h2 class="concentratedHead">Projects</h2>
  <!-- >
  <div class="checkbox-wrapper">
    <div class="flex start">
      <label class="switch" htmlFor="pinned">
        <input type="checkbox" id="pinned" onClick="toggleCheckbox('')" checked/>
        <div class="slider round"></div>
        <strong>Pinned</strong>
      </label>
    </div>
    <div class="flex start">
      <label class="switch" htmlFor="programming">
        <input type="checkbox" id="programming" onClick="toggleCheckbox('')" />
        <div class="slider round"></div>
        <strong>Programming</strong>
      </label>
    </div>
    <div class="flex start">
      <label class="switch" htmlFor="geospacial" onClick="toggleCheckbox('')">
        <input type="checkbox" id="geospacial" />
        <div class="slider round"></div>
        <strong>Geospacial</strong>
      </label>
    </div>
  </div>
  </!-->
  <div class="projectList vContainer">
  <div class="projectList">
    {% from 'partials/project.html' import project %} {% for i in
    var["projects"] %} {{ project(i, var["projects"][i]["classes"],
    var["projects"][i]["status"], var["projects"][i]["bgi"],

@@ -77,5 +55,4 @@
    </div>
  </div>

  <!--><script>toggleCheckbox('')</script></!-->
{% endblock %}
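The project-list loop above calls the `project` macro once per key of `var["projects"]`, pulling each field out of the nested dict positionally. A minimal stand-in for that data shape (field names come from the template's subscript accesses; the project name, image, and text are invented for illustration):

```python
# Hypothetical var["projects"] entry; only the field names are taken from
# the template (classes, status, bgi, content, links).
projects = {
    "Chessbed": {
        "classes": "pinned",
        "status": "active",
        "bgi": "chessbed.png",
        "content": "Browser chess analysis toy.",
        "links": [("github", "https://github.com/asimonson1125", "Source")],
    },
}

# Mirrors the template loop: one macro call per project, fields positional.
calls = [
    (name, p["classes"], p["status"], p["bgi"], p["content"], p["links"])
    for name, p in projects.items()
]
```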
@@ -91,8 +91,8 @@
<div class="info-box">
  <h4>About This Monitor</h4>
  <ul>
    <li><strong>Check Frequency:</strong> Services are checked automatically every 30 minutes from the server</li>
    <li><strong>Page Refresh:</strong> This page auto-refreshes every 5 minutes to show latest data</li>
    <li><strong>Check Frequency:</strong> Services are checked automatically every minute from the server</li>
    <li><strong>Page Refresh:</strong> This page auto-refreshes every minute to show latest data</li>
  </ul>
</ul>
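The monitor change above tightens the server-side check from every 30 minutes to every minute. A sketch of what one such check might look like (the repo's actual checker isn't shown in this diff, so the function names, interval constant, and status buckets here are assumptions):

```python
import urllib.error
import urllib.request

CHECK_INTERVAL_SECONDS = 60  # "checked automatically every minute"


def classify(status_code: int) -> str:
    """Map an HTTP status code to a display state for the status page."""
    if 200 <= status_code < 400:
        return "up"
    if status_code < 500:
        return "degraded"
    return "down"


def check_service(url: str, timeout: float = 5.0) -> str:
    """One probe of a monitored service; meant to run every CHECK_INTERVAL_SECONDS."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return classify(resp.status)
    except urllib.error.HTTPError as err:
        return classify(err.code)  # a 4xx/5xx response still tells us something
    except OSError:
        return "down"  # DNS failure, connection refused, timeout
```

The timeout matters at a one-minute cadence: a probe that hangs longer than the interval would pile up checks, so it should stay well under `CHECK_INTERVAL_SECONDS`.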