Compare commits

12 Commits

25 changed files with 1288 additions and 975 deletions

View File

@@ -1,10 +1,34 @@
-FROM python:3.10-bullseye
+# Use a slimmer base image to reduce image size and pull times
+FROM python:3.10-slim-bullseye

LABEL maintainer="Andrew Simonson <asimonson1125@gmail.com>"

+# Set environment variables for better Python performance in Docker
+ENV PYTHONDONTWRITEBYTECODE=1 \
+    PYTHONUNBUFFERED=1 \
+    PIP_NO_CACHE_DIR=1

WORKDIR /app

+# Create a non-root user for security
+RUN groupadd -r appuser && useradd -r -g appuser appuser
+
+# Copy only the requirements file first to leverage Docker layer caching
+COPY src/requirements.txt .
+
+# Install dependencies as root, but then switch to the non-root user
+RUN pip install -r requirements.txt
+
+# Copy the rest of the source code
COPY src/ .
-RUN pip install --no-cache-dir -r requirements.txt
-CMD [ "gunicorn", "--bind", "0.0.0.0:8080", "app:app"]
+
+# Ensure the appuser owns the app directory
+RUN chown -R appuser:appuser /app
+
+# Switch to the non-root user for better security
+USER appuser
+
+# Expose the port (Gunicorn's default or specified in CMD)
+EXPOSE 8080
+
+# Start Gunicorn
+CMD ["gunicorn", "--bind", "0.0.0.0:8080", "app:app"]

View File

@@ -1,11 +1,92 @@
-# I made a uhh website
-So people can see how excellent my coding standards are.
-* Style: 5/10
-* Originality: 3/10
-* Security: Yes*
-* Viruses: not included
-You gotta uhh `pip3 install -r requirements.txt` and `python3 app.py` that thing
-Docker compose configured to expose at `localhost:8080`
+# Personal Portfolio & Service Monitor
+
+A Flask-based website for my personal portfolio and a service monitoring dashboard. This project handles dynamic project showcases, automated service health tracking, and production-ready optimizations.
+
+## Features
- **Content Management**: Pages like projects, books, and skills are managed via JSON files in the `static` directory.
- **Service Monitoring**: Background health checks for external services with uptime statistics stored in PostgreSQL.
- **Optimizations**:
- HTML, CSS, and JS minification via `Flask-Minify`.
- MD5-based cache busting for static assets.
- Configurable cache-control headers.
- **Security**: Pre-configured headers for XSS protection and frame security.
- **Deployment**: Ready for containerized deployment with Docker and Gunicorn.
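The cache-busting scheme listed above can be sketched in a few lines of Python: hash the file bytes, keep the first 8 hex characters, and append them as a query parameter. The helper names here are illustrative, not the app's actual API:

```python
import hashlib

def fingerprint(content: bytes) -> str:
    # First 8 hex chars of the MD5 digest serve as a short version tag
    return hashlib.md5(content).hexdigest()[:8]

def versioned_url(path: str, content: bytes) -> str:
    # e.g. /static/css/main.css?v=1a2b3c4d
    return f"{path}?v={fingerprint(content)}"
```

Because the tag changes whenever the file's bytes change, static assets can be served with a one-year `immutable` cache lifetime without ever going stale.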
## Tech Stack
- **Backend**: Python 3.12, Flask
- **Frontend**: Vanilla CSS/JS, Jinja2
- **Database**: PostgreSQL (optional, for monitoring history)
- **Infrastructure**: Docker, docker-compose
## Project Structure
```text
.
├── src/
│ ├── app.py # Application entry point
│ ├── monitor.py # Service monitoring logic
│ ├── config.py # Environment configuration
│ ├── templates/ # HTML templates
│ ├── static/ # CSS, JS, and JSON data
│ └── requirements.txt # Python dependencies
├── Dockerfile # Container definition
├── docker-compose.yml # Local stack orchestration
└── STATUS_MONITOR_README.md # Monitoring system documentation
```
## Getting Started
### Using Docker
To run the full stack (App + PostgreSQL):
1. **Clone the repository**:
```bash
git clone https://github.com/asimonson1125/asimonson1125.github.io.git
cd asimonson1125.github.io
```
2. **Start services**:
```bash
docker-compose up --build
```
3. **Access the site**:
Visit [http://localhost:8080](http://localhost:8080).
### Local Development
To run the Flask app without Docker:
1. **Set up a virtual environment**:
```bash
python3 -m venv .venv
source .venv/bin/activate
```
2. **Install dependencies**:
```bash
pip install -r src/requirements.txt
```
3. **Run the application**:
```bash
cd src
python3 app.py
```
*Note: the status monitor is disabled by default when running outside its container cluster.*
## Service Monitoring
The monitoring system in `src/monitor.py` tracks service availability. It:
- Runs concurrent health checks every 60 seconds (`CHECK_INTERVAL`).
- Calculates uptime for various windows (24h, 7d, 30d).
- Provides a status UI at `/status` and a JSON API at `/api/status`.
See [STATUS_MONITOR_README.md](./STATUS_MONITOR_README.md) for more details.
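A client of the `/api/status` endpoint works directly with the summary JSON. A minimal sketch of filtering it; the sample payload below is illustrative, following the summary shape built in `monitor.py`:

```python
import json

# Illustrative sample in the documented summary shape
payload = json.loads("""
{
  "last_check": "2026-02-11T14:30:00",
  "services": [
    {"id": "main", "status": "online",
     "uptime": {"24h": 99.97, "7d": null, "30d": null, "all_time": 99.9}}
  ]
}
""")

def not_online(summary):
    # Return ids of services that are degraded or offline
    return [s["id"] for s in summary["services"] if s["status"] != "online"]
```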
## License
This project is personal property. All rights reserved.

View File

@@ -19,9 +19,9 @@ Server-side monitoring system that checks the availability of asimonson.com serv
**Features**:
- Tracks response times and HTTP status codes
-- Stores check history (up to 720 checks = 60 days of data)
- Calculates uptime percentages for multiple time periods (24h, 7d, 30d, all-time)
-- Persists data to `static/json/status_history.json`
+- Persists data to PostgreSQL (`service_checks` table) via `DATABASE_URL` env var
+- Gracefully degrades when no database is configured (local dev)
- Runs in a background thread
#### 2. `app.py` - Flask Integration #### 2. `app.py` - Flask Integration
@@ -57,32 +57,22 @@ Server-side monitoring system that checks the availability of asimonson.com serv
## Data Storage

-Status history is stored in `src/static/json/status_history.json`:
+Check history is stored in a PostgreSQL `service_checks` table. The connection is configured via the `DATABASE_URL` environment variable (e.g. `postgresql://user:pass@host:5432/dbname`).

-```json
-{
-  "last_check": "2026-02-11T14:30:00",
-  "services": {
-    "main": {
-      "name": "asimonson.com",
-      "url": "https://asimonson.com",
-      "status": "online",
-      "response_time": 156,
-      "status_code": 200,
-      "last_online": "2026-02-11T14:30:00",
-      "checks": [
-        {
-          "timestamp": "2026-02-11T14:30:00",
-          "status": "online",
-          "response_time": 156,
-          "status_code": 200
-        }
-      ]
-    }
-  }
-}
-```
+```sql
+CREATE TABLE service_checks (
+    id SERIAL PRIMARY KEY,
+    service_id VARCHAR(50) NOT NULL,
+    timestamp TIMESTAMPTZ NOT NULL DEFAULT NOW(),
+    status VARCHAR(20) NOT NULL,
+    response_time INTEGER,
+    status_code INTEGER,
+    error TEXT
+);
+```
+
+The table and index are created automatically on startup. If `DATABASE_URL` is not set, the monitor runs without persistence (useful for local development).
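The uptime figures reported by the monitor reduce to a simple ratio over the rows in this table. As a worked sketch of the formula, mirroring the rounding used in `monitor.py`:

```python
def uptime_percent(online_count, total_count):
    # Percent of recorded checks that were 'online', rounded to 2 decimals;
    # None signals "not enough data", just as the monitor does
    if total_count == 0:
        return None
    return round((online_count / total_count) * 100, 2)
```

For example, 1437 online checks out of 1440 in a 24-hour window yields 99.79% uptime.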
## Status Types
- **online**: HTTP status 2xx-4xx, service responding

@@ -142,8 +132,7 @@ SERVICES = [
## Notes
- First deployment will show limited uptime data until enough checks accumulate
-- Historical data is preserved across server restarts
+- Historical data is preserved across server restarts (stored in PostgreSQL)
-- Maximum 720 checks stored per service (60 days at 2-hour intervals)
- Page auto-refreshes every 5 minutes to show latest server data
- Manual refresh button available for immediate updates
- All checks performed server-side (no client-side CORS issues)

View File

@@ -7,3 +7,26 @@ services:
    restart: 'no'
    ports:
      - 8080:8080
+    environment:
+      DATABASE_URL: postgresql://portfolio:portfolio@db:5432/portfolio
+    depends_on:
+      db:
+        condition: service_healthy
+  db:
+    image: postgres:16-alpine
+    restart: 'no'
+    environment:
+      POSTGRES_USER: portfolio
+      POSTGRES_PASSWORD: portfolio
+      POSTGRES_DB: portfolio
+    volumes:
+      - pgdata:/var/lib/postgresql/data
+    healthcheck:
+      test: ["CMD-SHELL", "pg_isready -U portfolio"]
+      interval: 5s
+      timeout: 3s
+      retries: 5
+
+volumes:
+  pgdata:
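The `DATABASE_URL` injected above is a standard PostgreSQL connection URL. A sketch of how an app might pick it apart using only the standard library; `psycopg2.connect` accepts the URL directly, so this is purely illustrative:

```python
from urllib.parse import urlparse

def parse_db_url(url):
    # Split a postgresql:// URL into its connection parts
    p = urlparse(url)
    return {
        "user": p.username,
        "password": p.password,
        "host": p.hostname,
        "port": p.port,
        "dbname": p.path.lstrip("/"),
    }
```

Note that `db` resolves as a hostname only on the compose network, which is why the monitor degrades gracefully when run outside it.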

View File

@@ -1,15 +1,18 @@
-import flask
-from flask_minify import Minify
+import hashlib
import json
import os
-import hashlib
+
+import flask
+from flask_minify import Minify
import werkzeug.exceptions as HTTPerror
-from config import *
+
+import config  # noqa: F401 — side-effect: loads dev env vars
from monitor import monitor, SERVICES

app = flask.Flask(__name__)

-# Compute content hashes for static file fingerprinting
+# ── Static file fingerprinting ────────────────────────────────────────
static_file_hashes = {}
for dirpath, _, filenames in os.walk(app.static_folder):
    for filename in filenames:
@@ -18,6 +21,7 @@ for dirpath, _, filenames in os.walk(app.static_folder):
        with open(filepath, 'rb') as f:
            static_file_hashes[relative] = hashlib.md5(f.read()).hexdigest()[:8]

@app.context_processor
def override_url_for():
    def versioned_url_for(endpoint, **values):
@@ -28,17 +32,16 @@ def override_url_for():
            return flask.url_for(endpoint, **values)
    return dict(url_for=versioned_url_for)

-# Add security and caching headers
+# ── Security and caching headers ──────────────────────────────────────
@app.after_request
-def add_security_headers(response):
-    """Add security and performance headers to all responses"""
-    # Security headers
+def add_headers(response):
    response.headers['X-Content-Type-Options'] = 'nosniff'
    response.headers['X-Frame-Options'] = 'SAMEORIGIN'
    response.headers['X-XSS-Protection'] = '1; mode=block'
    response.headers['Referrer-Policy'] = 'strict-origin-when-cross-origin'
-    # Cache control for static assets
+
    if flask.request.path.startswith('/static/'):
        response.headers['Cache-Control'] = 'public, max-age=31536000, immutable'
    elif flask.request.path in ['/sitemap.xml', '/robots.txt']:
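The caching branch here can be read as a small policy function. This is a sketch; the exact values for `/sitemap.xml`, `/robots.txt`, and dynamic pages sit in the elided part of the hunk, so those two returns are assumptions:

```python
def cache_control_for(path):
    # Fingerprinted static assets are safe to cache "forever"
    if path.startswith("/static/"):
        return "public, max-age=31536000, immutable"
    # Assumption: crawl files get a moderate lifetime
    if path in ("/sitemap.xml", "/robots.txt"):
        return "public, max-age=86400"
    # Assumption: dynamic pages are revalidated every time
    return "no-cache"
```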
@@ -49,56 +52,93 @@ def add_security_headers(response):
    return response

+# ── Load page data ────────────────────────────────────────────────────
def load_json(path):
    with open(path, "r") as f:
        return json.load(f)

-proj = load_json("./static/json/projects.json")
+projects = load_json("./static/json/projects.json")
books = load_json("./static/json/books.json")
-skillList = load_json("./static/json/skills.json")
+skills = load_json("./static/json/skills.json")
timeline = load_json("./static/json/timeline.json")
pages = load_json("./static/json/pages.json")

-pages['projects']['skillList'] = skillList
-# pages['about']['timeline'] = timeline
-pages['projects']['projects'] = proj
+pages['projects']['skillList'] = skills
+pages['projects']['projects'] = projects
pages['home']['books'] = books
pages['books']['books'] = books
pages['status']['services'] = SERVICES

+# ── Error rendering ──────────────────────────────────────────────────
+def render_error(code, message):
+    pagevars = {
+        "template": "error.html",
+        "title": f"{code} - Simonson",
+        "description": "Error on Andrew Simonson's Digital Portfolio",
+        "canonical": f"/{code}",
+    }
+    return (
+        flask.render_template(
+            "header.html",
+            var=pagevars,
+            error=code,
+            message=message,
+            title=f"{code} - Simonson Portfolio",
+        ),
+        code,
+    )
+
+@app.errorhandler(HTTPerror.HTTPException)
+def handle_http_error(e):
+    return render_error(e.code, e.description)
+
+@app.errorhandler(Exception)
+def handle_generic_error(e):
+    return render_error(500, "Internal Server Error")
+
+# ── API routes ────────────────────────────────────────────────────────
@app.route('/api/status')
def api_status():
-    """API endpoint for service status"""
    return flask.jsonify(monitor.get_status_summary())

@app.route('/api/goto/')
@app.route('/api/goto/<location>')
-def goto(location='home'):
+def api_goto(location='home'):
    if location not in pages:
        flask.abort(404)
    pagevars = pages[location]
-    page = None
    try:
        page = flask.render_template(pagevars["template"], var=pagevars)
    except Exception:
-        e = HTTPerror.InternalServerError()
-        page = handle_http_error(e)
+        page = render_error(500, "Internal Server Error")
    return [pagevars, page]

-def funcGen(pagename, pages):
-    def dynamicRule():
+# ── Dynamic page routes ──────────────────────────────────────────────
+def make_page_handler(pagename):
+    def handler():
        try:
            return flask.render_template('header.html', var=pages[pagename])
        except Exception:
-            e = HTTPerror.InternalServerError()
-            print(e)
-            return handle_http_error(e)
-    return dynamicRule
+            return render_error(500, "Internal Server Error")
+    return handler

-for i in pages:
-    func = funcGen(i, pages)
-    app.add_url_rule(pages[i]['canonical'], i, func)
+for name in pages:
+    app.add_url_rule(pages[name]['canonical'], name, make_page_handler(name))

+# ── Static file routes ───────────────────────────────────────────────
@app.route("/resume")
@app.route("/Resume.pdf")
@@ -106,46 +146,6 @@ for i in pages:
def resume():
    return flask.send_file("./static/Resume_Simonson_Andrew.pdf")

-@app.errorhandler(HTTPerror.HTTPException)
-def handle_http_error(e):
-    eCode = e.code
-    message = e.description
-    pagevars = {
-        "template": "error.html",
-        "title": f"{eCode} - Simonson",
-        "description": "Error on Andrew Simonson's Digital Portfolio",
-        "canonical": f"/{eCode}",
-    }
-    return (
-        flask.render_template(
-            "header.html",
-            var=pagevars,
-            error=eCode,
-            message=message,
-            title=f"{eCode} - Simonson Portfolio",
-        ),
-        eCode,
-    )
-
-@app.errorhandler(Exception)
-def handle_generic_error(e):
-    pagevars = {
-        "template": "error.html",
-        "title": "500 - Simonson",
-        "description": "Error on Andrew Simonson's Digital Portfolio",
-        "canonical": "/500",
-    }
-    return (
-        flask.render_template(
-            "header.html",
-            var=pagevars,
-            error=500,
-            message="Internal Server Error",
-            title="500 - Simonson Portfolio",
-        ),
-        500,
-    )
-
@app.route("/sitemap.xml")
@app.route("/robots.txt")
@@ -153,10 +153,9 @@ def static_from_root():
    return flask.send_from_directory(app.static_folder, flask.request.path[1:])

+# ── Startup ───────────────────────────────────────────────────────────
if __name__ == "__main__":
-    # import sass
-    # sass.compile(dirname=("static/scss", "static/css"), output_style="compressed")
    app.run(debug=False)
else:
    Minify(app=app, html=True, js=True, cssless=True)

View File

@@ -1,114 +1,170 @@
""" """
Service monitoring module Service monitoring module.
Checks service availability and tracks uptime statistics Checks service availability and tracks uptime statistics in PostgreSQL.
""" """
import requests import os
import time import time
import json
from concurrent.futures import ThreadPoolExecutor from concurrent.futures import ThreadPoolExecutor
from datetime import datetime, timedelta from datetime import datetime, timedelta
from threading import Thread, Lock from threading import Thread, Lock
from pathlib import Path
# Service configuration import psycopg2
import requests
SERVICES = [ SERVICES = [
{ {'id': 'main', 'name': 'asimonson.com', 'url': 'https://asimonson.com', 'timeout': 10},
'id': 'main', {'id': 'files', 'name': 'files.asimonson.com', 'url': 'https://files.asimonson.com', 'timeout': 10},
'name': 'asimonson.com', {'id': 'git', 'name': 'git.asimonson.com', 'url': 'https://git.asimonson.com', 'timeout': 10},
'url': 'https://asimonson.com',
'timeout': 10
},
{
'id': 'files',
'name': 'files.asimonson.com',
'url': 'https://files.asimonson.com',
'timeout': 10
},
{
'id': 'git',
'name': 'git.asimonson.com',
'url': 'https://git.asimonson.com',
'timeout': 10
}
] ]
# Check interval: 30 mins CHECK_INTERVAL = 60 # seconds between checks
CHECK_INTERVAL = 1800 RETENTION_DAYS = 90 # how long to keep records
CLEANUP_INTERVAL = 86400 # seconds between purge runs
DATABASE_URL = os.environ.get('DATABASE_URL')
# Expected columns (besides id) -- name: SQL type
_EXPECTED_COLUMNS = {
'service_id': 'VARCHAR(50) NOT NULL',
'timestamp': 'TIMESTAMPTZ NOT NULL DEFAULT NOW()',
'status': 'VARCHAR(20) NOT NULL',
'response_time': 'INTEGER',
'status_code': 'INTEGER',
'error': 'TEXT',
}
# File to store status history
STATUS_FILE = Path(__file__).parent / 'static' / 'json' / 'status_history.json'
class ServiceMonitor: class ServiceMonitor:
def __init__(self): def __init__(self):
self.status_data = {}
self.lock = Lock() self.lock = Lock()
self.load_history() self._current = {
svc['id']: {
def load_history(self): 'name': svc['name'],
"""Load status history from file""" 'url': svc['url'],
if STATUS_FILE.exists():
try:
with open(STATUS_FILE, 'r') as f:
self.status_data = json.load(f)
except Exception as e:
print(f"Error loading status history: {e}")
self.initialize_status_data()
else:
self.initialize_status_data()
def initialize_status_data(self):
"""Initialize empty status data structure"""
self.status_data = {
'last_check': None,
'services': {}
}
for service in SERVICES:
self.status_data['services'][service['id']] = {
'name': service['name'],
'url': service['url'],
'status': 'unknown', 'status': 'unknown',
'response_time': None, 'response_time': None,
'status_code': None, 'status_code': None,
'last_online': None, 'last_online': None,
'checks': [] # List of check results
} }
for svc in SERVICES
}
self._last_check = None
self._ensure_schema()
def save_history(self): # ── Database helpers ──────────────────────────────────────────
"""Save status history to file"""
@staticmethod
def _get_conn():
"""Return a new psycopg2 connection, or None if DATABASE_URL is unset."""
if not DATABASE_URL:
return None
return psycopg2.connect(DATABASE_URL)
def _ensure_schema(self):
"""Create or migrate the service_checks table to match _EXPECTED_COLUMNS."""
if not DATABASE_URL:
print("DATABASE_URL not set -- running without persistence")
return
conn = None
for attempt in range(5):
try: try:
STATUS_FILE.parent.mkdir(parents=True, exist_ok=True) conn = psycopg2.connect(DATABASE_URL)
with open(STATUS_FILE, 'w') as f: break
json.dump(self.status_data, f, indent=2) except psycopg2.OperationalError:
except Exception as e: if attempt < 4:
print(f"Error saving status history: {e}") print(f"Database not ready, retrying in 2s (attempt {attempt + 1}/5)...")
time.sleep(2)
else:
print("Could not connect to database -- running without persistence")
return
try:
with conn, conn.cursor() as cur:
cur.execute("""
CREATE TABLE IF NOT EXISTS service_checks (
id SERIAL PRIMARY KEY,
service_id VARCHAR(50) NOT NULL,
timestamp TIMESTAMPTZ NOT NULL DEFAULT NOW(),
status VARCHAR(20) NOT NULL,
response_time INTEGER,
status_code INTEGER,
error TEXT
);
""")
cur.execute("""
CREATE INDEX IF NOT EXISTS idx_service_checks_service_timestamp
ON service_checks (service_id, timestamp DESC);
""")
# Introspect existing columns
cur.execute("""
SELECT column_name
FROM information_schema.columns
WHERE table_name = 'service_checks'
""")
existing = {row[0] for row in cur.fetchall()}
for col, col_type in _EXPECTED_COLUMNS.items():
if col not in existing:
bare_type = col_type.split('NOT NULL')[0].split('DEFAULT')[0].strip()
cur.execute(f'ALTER TABLE service_checks ADD COLUMN {col} {bare_type}')
print(f"Added column {col} to service_checks")
expected_names = set(_EXPECTED_COLUMNS) | {'id'}
for col in existing - expected_names:
cur.execute(f'ALTER TABLE service_checks DROP COLUMN {col}')
print(f"Dropped column {col} from service_checks")
print("Database schema OK")
finally:
conn.close()
def _insert_check(self, service_id, result):
"""Persist a single check result to the database."""
conn = self._get_conn()
if conn is None:
return
try:
with conn, conn.cursor() as cur:
cur.execute(
"""INSERT INTO service_checks
(service_id, timestamp, status, response_time, status_code, error)
VALUES (%s, %s, %s, %s, %s, %s)""",
(
service_id,
result['timestamp'],
result['status'],
result.get('response_time'),
result.get('status_code'),
result.get('error'),
),
)
finally:
conn.close()
# ── Service checks ────────────────────────────────────────────
def check_service(self, service): def check_service(self, service):
"""Check a single service and return status""" """Perform an HTTP HEAD against a service and return a status dict."""
start_time = time.time() start_time = time.time()
result = { result = {
'timestamp': datetime.now().isoformat(), 'timestamp': datetime.now().isoformat(),
'status': 'offline', 'status': 'offline',
'response_time': None, 'response_time': None,
'status_code': None 'status_code': None,
} }
try: try:
response = requests.head( response = requests.head(
service['url'], service['url'],
timeout=service['timeout'], timeout=service['timeout'],
allow_redirects=True allow_redirects=True,
) )
result['response_time'] = int((time.time() - start_time) * 1000)
elapsed = int((time.time() - start_time) * 1000) # ms
result['response_time'] = elapsed
result['status_code'] = response.status_code result['status_code'] = response.status_code
# Consider 2xx and 3xx as online if response.status_code < 500:
if 200 <= response.status_code < 400:
result['status'] = 'online'
elif 400 <= response.status_code < 500:
# Client errors might still mean service is up
result['status'] = 'online' result['status'] = 'online'
else: else:
result['status'] = 'degraded' result['status'] = 'degraded'
@@ -123,10 +179,9 @@ class ServiceMonitor:
        return result

    def check_all_services(self):
-        """Check all services and update status data"""
+        """Check every service concurrently, persist results, and update cache."""
        print(f"[{datetime.now().strftime('%Y-%m-%d %H:%M:%S')}] Checking all services...")
-        # Perform all network checks concurrently and OUTSIDE the lock
        results = {}
        with ThreadPoolExecutor(max_workers=len(SERVICES)) as executor:
            futures = {executor.submit(self.check_service, s): s for s in SERVICES}
@@ -136,121 +191,154 @@ class ServiceMonitor:
                results[service['id']] = result
                print(f"  {service['name']}: {result['status']} ({result['response_time']}ms)")

-        # Only acquire lock when updating the shared data structure
+        for service_id, result in results.items():
+            self._insert_check(service_id, result)
+
        with self.lock:
            for service in SERVICES:
                result = results[service['id']]
-                service_data = self.status_data['services'][service['id']]
-
-                # Update current status
-                service_data['status'] = result['status']
-                service_data['response_time'] = result['response_time']
-                service_data['status_code'] = result['status_code']
+                cached = self._current[service['id']]
+                cached['status'] = result['status']
+                cached['response_time'] = result['response_time']
+                cached['status_code'] = result['status_code']
                if result['status'] == 'online':
-                    service_data['last_online'] = result['timestamp']
-
-                # Add to check history (keep last 2880 checks = 60 days at 2hr intervals)
-                service_data['checks'].append(result)
-                if len(service_data['checks']) > 2880:
-                    service_data['checks'] = service_data['checks'][-2880:]
-
-            self.status_data['last_check'] = datetime.now().isoformat()
-            self.save_history()
-
-    def _calculate_uptime_unlocked(self, service_id, hours=None):
-        """Calculate uptime percentage for a service (assumes lock is held)"""
-        service_data = self.status_data['services'].get(service_id)
-        if not service_data or not service_data['checks']:
-            return None
-
-        checks = service_data['checks']
-
-        # Filter by time period if specified
-        if hours:
-            cutoff = datetime.now() - timedelta(hours=hours)
-            checks = [
-                c for c in checks
-                if datetime.fromisoformat(c['timestamp']) > cutoff
-            ]
-
-            if not checks:
-                return None
-
-            # Require minimum data coverage for the time period
-            # Calculate expected number of checks for this period
-            expected_checks = (hours * 3600) / CHECK_INTERVAL
-            # Require at least 50% of expected checks to show this metric
-            minimum_checks = max(3, expected_checks * 0.5)
-            if len(checks) < minimum_checks:
-                return None
-        else:
-            # For all-time, require at least 3 checks
-            if len(checks) < 3:
-                return None
-
-        online_count = sum(1 for c in checks if c['status'] == 'online')
-        uptime = (online_count / len(checks)) * 100
-        return round(uptime, 2)
-
-    def calculate_uptime(self, service_id, hours=None):
-        """Calculate uptime percentage for a service"""
-        with self.lock:
-            return self._calculate_uptime_unlocked(service_id, hours)
+                    cached['last_online'] = result['timestamp']
+            self._last_check = datetime.now().isoformat()
+
+    # ── Uptime calculations ───────────────────────────────────────
+
+    def _calculate_uptime(self, service_id, hours=None):
+        """Return uptime percentage for a service, or None if insufficient data."""
+        conn = self._get_conn()
+        if conn is None:
+            return None
+        try:
+            with conn.cursor() as cur:
+                if hours:
+                    cutoff = datetime.now() - timedelta(hours=hours)
+                    cur.execute(
+                        """SELECT
+                               COUNT(*) FILTER (WHERE status = 'online'),
+                               COUNT(*)
+                           FROM service_checks
+                           WHERE service_id = %s AND timestamp > %s""",
+                        (service_id, cutoff),
+                    )
+                else:
+                    cur.execute(
+                        """SELECT
+                               COUNT(*) FILTER (WHERE status = 'online'),
+                               COUNT(*)
+                           FROM service_checks
+                           WHERE service_id = %s""",
+                        (service_id,),
+                    )
+                online_count, total_count = cur.fetchone()
+                if total_count == 0:
+                    return None
+                # Only report a time-windowed uptime if data exists beyond the window
+                if hours:
+                    cur.execute(
+                        'SELECT EXISTS(SELECT 1 FROM service_checks WHERE service_id = %s AND timestamp <= %s)',
+                        (service_id, cutoff),
+                    )
+                    if not cur.fetchone()[0]:
+                        return None
+                return round((online_count / total_count) * 100, 2)
+        finally:
+            conn.close()
+
+    def _get_total_checks(self, service_id):
+        """Return the total number of recorded checks for a service."""
+        conn = self._get_conn()
+        if conn is None:
+            return 0
+        try:
+            with conn.cursor() as cur:
+                cur.execute(
+                    'SELECT COUNT(*) FROM service_checks WHERE service_id = %s',
+                    (service_id,),
+                )
+                return cur.fetchone()[0]
+        finally:
+            conn.close()
+
+    # ── Status summary ────────────────────────────────────────────

    def get_status_summary(self):
-        """Get current status summary with uptime statistics"""
+        """Build a JSON-serializable status summary with uptime statistics."""
        with self.lock:
            summary = {
-                'last_check': self.status_data['last_check'],
+                'last_check': self._last_check,
                'next_check': None,
-                'services': []
+                'services': [],
            }
-
-            # Calculate next check time
-            if self.status_data['last_check']:
-                last_check = datetime.fromisoformat(self.status_data['last_check'])
-                next_check = last_check + timedelta(seconds=CHECK_INTERVAL)
-                summary['next_check'] = next_check.isoformat()
-
-            for service_id, service_data in self.status_data['services'].items():
-                service_summary = {
+            if self._last_check:
+                last_check = datetime.fromisoformat(self._last_check)
+                summary['next_check'] = (last_check + timedelta(seconds=CHECK_INTERVAL)).isoformat()
+
+            for service_id, cached in self._current.items():
+                summary['services'].append({
                    'id': service_id,
-                    'name': service_data['name'],
-                    'url': service_data['url'],
-                    'status': service_data['status'],
-                    'response_time': service_data['response_time'],
-                    'status_code': service_data['status_code'],
-                    'last_online': service_data['last_online'],
+                    'name': cached['name'],
+                    'url': cached['url'],
+                    'status': cached['status'],
+                    'response_time': cached['response_time'],
+                    'status_code': cached['status_code'],
+                    'last_online': cached['last_online'],
                    'uptime': {
-                        '24h': self._calculate_uptime_unlocked(service_id, 24),
-                        '7d': self._calculate_uptime_unlocked(service_id, 24 * 7),
-                        '30d': self._calculate_uptime_unlocked(service_id, 24 * 30),
-                        'all_time': self._calculate_uptime_unlocked(service_id)
+                        '24h': self._calculate_uptime(service_id, 24),
+                        '7d': self._calculate_uptime(service_id, 24 * 7),
+                        '30d': self._calculate_uptime(service_id, 24 * 30),
+                        'all_time': self._calculate_uptime(service_id),
                    },
-                    'total_checks': len(service_data['checks'])
-                }
-                summary['services'].append(service_summary)
-
+                    'total_checks': self._get_total_checks(service_id),
+                })
            return summary

+    # ── Background loop ───────────────────────────────────────────
+
+    def _purge_old_records(self):
+        """Delete check records older than RETENTION_DAYS."""
+        conn = self._get_conn()
+        if conn is None:
+            return
+        try:
+            cutoff = datetime.now() - timedelta(days=RETENTION_DAYS)
+            with conn, conn.cursor() as cur:
+                cur.execute('DELETE FROM service_checks WHERE timestamp < %s', (cutoff,))
+                deleted = cur.rowcount
+            if deleted:
+                print(f"Purged {deleted} records older than {RETENTION_DAYS} days")
+        finally:
+            conn.close()
+
    def start_monitoring(self):
-        """Start background monitoring thread"""
+        """Start the background daemon thread for periodic checks and cleanup."""
        def monitor_loop():
-            # Initial check
            self.check_all_services()
-
-            # Periodic checks
+            self._purge_old_records()
+            checks_since_cleanup = 0
+            checks_per_cleanup = CLEANUP_INTERVAL // CHECK_INTERVAL
            while True:
                time.sleep(CHECK_INTERVAL)
                self.check_all_services()
+                checks_since_cleanup += 1
+                if checks_since_cleanup >= checks_per_cleanup:
+                    self._purge_old_records()
+                    checks_since_cleanup = 0

        thread = Thread(target=monitor_loop, daemon=True)
        thread.start()
-        print(f"Service monitoring started (checks every {CHECK_INTERVAL/3600} hours)")
+        print(f"Service monitoring started (checks every {CHECK_INTERVAL}s)")

-# Global monitor instance
monitor = ServiceMonitor()
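The scheduling constants interact simply: with 60-second checks, a purge runs once per day, and a 24-hour uptime window rests on up to 1440 samples. A quick arithmetic check, mirroring the constants in `monitor.py`:

```python
CHECK_INTERVAL = 60       # seconds between checks (as in monitor.py)
CLEANUP_INTERVAL = 86400  # one day, in seconds

# One purge per CLEANUP_INTERVAL worth of checks
checks_per_cleanup = CLEANUP_INTERVAL // CHECK_INTERVAL

# Maximum samples feeding the 24h uptime figure
samples_per_day = 24 * 3600 // CHECK_INTERVAL
```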

View File

@@ -1,22 +1,23 @@
-blinker==1.8.2
+blinker==1.9.0
-certifi==2024.7.4
+certifi==2026.1.4
-charset-normalizer==3.3.2
+charset-normalizer==3.4.4
-click==8.1.7
+click==8.3.1
-Flask==3.0.3
+Flask==3.1.3
-Flask-Minify==0.48
+Flask-Minify==0.50
-gunicorn==22.0.0
+gunicorn==25.1.0
htmlminf==0.1.13
-idna==3.7
+idna==3.11
itsdangerous==2.2.0
-Jinja2==3.1.4
+Jinja2==3.1.6
jsmin==3.0.1
lesscpy==0.15.1
-MarkupSafe==2.1.5
+MarkupSafe==3.0.3
-packaging==24.1
+packaging==26.0
ply==3.11
+psycopg2-binary==2.9.11
-rcssmin==1.1.2
+rcssmin==1.2.2
-requests==2.32.3
+requests==2.32.5
-six==1.16.0
+six==1.17.0
-urllib3==2.2.2
+urllib3==2.6.3
-Werkzeug==3.0.3
+Werkzeug==3.1.6
-xxhash==3.4.1
+xxhash==3.6.0

File diff suppressed because it is too large

View File

@@ -1,70 +0,0 @@
.hidden {
display: none;
}
.hiddenup {
max-height: 0px !important;
}
.checkbox-wrapper > div {
display: inline-block;
margin-right: 1em;
margin-bottom: 1em;
}
.checkbox-wrapper > div:last-child {
margin-bottom: 0;;
}
.checkbox-wrapper .switch {
display: flex;
position: relative;
cursor: pointer;
}
.checkbox-wrapper .switch > * {
align-self: center;
}
.checkbox-wrapper .switch input {
display: none;
}
.checkbox-wrapper .slider {
background-color: #ccc;
transition: 0.4s;
height: 34px;
width: 60px;
}
.checkbox-wrapper .slider:before {
background-color: #fff;
bottom: 4px;
content: "";
height: 26px;
left: 4px;
position: absolute;
transition: 0.4s;
width: 26px;
}
.checkbox-wrapper input:checked+.slider {
background-color: #66bb6a;
}
.checkbox-wrapper input:checked+.slider:before {
transform: translateX(26px);
}
.checkbox-wrapper .slider.round {
border-radius: 34px;
}
.checkbox-wrapper .slider.round:before {
border-radius: 50%;
}
.checkbox-wrapper strong {
margin-left: .5em;
}
@@ -1,41 +0,0 @@
function toggleCheckbox(dir) {
let toggles = document.querySelectorAll(
".checkbox-wrapper input[type=checkbox]"
);
let allow = [];
toggles.forEach(function (x) {
if (x.checked) {
allow.push(x.id);
}
});
let list = document.querySelectorAll(".checkbox-client > div");
if (allow.length === 0) {
for (let i = 0; i < list.length; i++) {
list[i].classList.remove("hidden" + dir);
}
} else {
for (let i = 0; i < list.length; i++) {
list[i].classList.remove("hidden" + dir);
for (let x = 0; x < allow.length; x++) {
if (!list[i].classList.contains(allow[x])) {
list[i].classList.add("hidden" + dir);
break;
}
}
}
}
}
function activeSkill(obj) {
if (obj.parentElement.classList.contains("activeSkill")) {
obj.parentElement.classList.remove("activeSkill");
return;
}
// document.querySelectorAll(".skill").forEach((x) => {
// x.classList.remove("activeSkill");
// });
while (obj.parentElement.classList.contains("skill")) {
obj = obj.parentElement;
obj.classList.add("activeSkill");
}
}
@@ -7,17 +7,21 @@ async function addChessEmbed(username) {
       setChess({ cName: "Chess.com request failed" });
       return;
     }
     if (user.status === 200) {
       user = await user.json();
       stats = await stats.json();
-      const ratings = {
+      setChess({
+        cName: user["username"],
+        pic: user.avatar,
+        ratings: {
           rapid: stats.chess_rapid.last.rating,
           blitz: stats.chess_blitz.last.rating,
           bullet: stats.chess_bullet.last.rating,
           tactics: stats.tactics.highest.rating,
-      };
-      setChess({ cName: user["username"], pic: user.avatar, ratings: ratings });
+        },
+      });
-    } else if (user === null || user.status === 403 || user.status === null) {
+    } else if (user.status === 403) {
       setChess({ cName: "Chess.com request failed" });
     } else {
       setChess({ cName: "User Not Found" });
@@ -33,16 +37,12 @@ function setChess({ cName = null, pic = null, ratings = null }) {
     document.querySelector(".chessImage").src = pic;
   }
   if (ratings) {
-    document.querySelector(".chessRapid .chessStat").textContent =
-      ratings.rapid;
-    document.querySelector(".chessBlitz .chessStat").textContent =
-      ratings.blitz;
-    document.querySelector(".chessBullet .chessStat").textContent =
-      ratings.bullet;
-    document.querySelector(".chessPuzzles .chessStat").textContent =
-      ratings.tactics;
+    document.querySelector(".chessRapid .chessStat").textContent = ratings.rapid;
+    document.querySelector(".chessBlitz .chessStat").textContent = ratings.blitz;
+    document.querySelector(".chessBullet .chessStat").textContent = ratings.bullet;
+    document.querySelector(".chessPuzzles .chessStat").textContent = ratings.tactics;
   }
   } catch {
-    console.log("fucker clicking so fast the internet can't even keep up");
+    console.warn("Chess DOM elements not available (navigated away during fetch)");
   }
 }
@@ -1,8 +1,11 @@
 const balls = [];
-const density = 0.00003;
+const density = 0.00005;
 let screenWidth = window.innerWidth + 10;
 let screenHeight = window.innerHeight + 10;
+const MAX_DIST = 150;
+const MAX_DIST_SQUARED = MAX_DIST * MAX_DIST;
 class Ball {
   constructor(x, y, size, speed, angle) {
     this.x = x;
@@ -14,8 +17,9 @@ class Ball {
   }
   calcChange() {
-    this.xSpeed = this.speed * Math.sin((this.angle * Math.PI) / 180);
-    this.ySpeed = this.speed * Math.cos((this.angle * Math.PI) / 180);
+    const radians = (this.angle * Math.PI) / 180
+    this.xSpeed = this.speed * Math.sin(radians);
+    this.ySpeed = this.speed * Math.cos(radians);
   }
   update() {
@@ -44,19 +48,17 @@ class Ball {
 function setup() {
   frameRate(15);
-  const pix = screenHeight * screenWidth;
+  const pixels = screenHeight * screenWidth;
   createCanvas(screenWidth, screenHeight);
-  for (let i = 0; i < pix * density; i++) {
-    let thisBall = new Ball(
+  for (let i = 0; i < pixels * density; i++) {
+    balls.push(new Ball(
       random(screenWidth),
       random(screenHeight),
       random(6) + 3,
       Math.exp(random(4) + 3) / 1000 + 1,
       random(360)
-    );
-    balls.push(thisBall);
+    ));
   }
   stroke(255);
 }
@@ -69,40 +71,34 @@ function windowResized() {
 function draw() {
   background(24);
-  // Update all balls
   for (let i = 0; i < balls.length; i++) {
     balls[i].update();
   }
-  // Optimize line drawing with early distance checks
-  const maxDist = 150;
-  const maxDistSquared = maxDist * maxDist; // Avoid sqrt in distance calculation
+  // Draw connection lines with additive blending so overlaps brighten
+  blendMode(ADD);
+  strokeWeight(2);
   for (let i = 0; i < balls.length - 1; i++) {
-    const ball1 = balls[i];
+    const a = balls[i];
     for (let j = i + 1; j < balls.length; j++) {
-      const ball2 = balls[j];
-      // Quick rejection test using squared distance (faster than sqrt)
-      const dx = ball2.x - ball1.x;
-      const dy = ball2.y - ball1.y;
+      const b = balls[j];
+      const dx = b.x - a.x;
+      const dy = b.y - a.y;
       const distSquared = dx * dx + dy * dy;
-      if (distSquared < maxDistSquared) {
-        const distance = Math.sqrt(distSquared); // Only calculate sqrt if needed
-        if (distance < 100) {
-          stroke(150);
-          line(ball1.x, ball1.y, ball2.x, ball2.y);
+      if (distSquared < MAX_DIST_SQUARED) {
+        const distance = Math.sqrt(distSquared);
+        if (distance < 75) {
+          stroke(255, 85);
         } else {
-          stroke(100);
-          const chance = 0.3 ** (((random(0.2) + 0.8) * distance) / 150);
-          if (chance < 0.5) {
-            stroke(50);
-          }
-          line(ball1.x, ball1.y, ball2.x, ball2.y);
+          const chance = 0.3 ** (((random(0.2) + 0.8) * distance) / MAX_DIST);
+          stroke(255, chance < 0.5 ? 40 : 75);
         }
+        line(a.x, a.y, b.x, b.y);
       }
     }
   }
+  blendMode(BLEND);
 }
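Both versions of the draw loop above lean on the same trick: compare squared distances first so `Math.sqrt` runs only for pairs that can actually be connected. A language-neutral sketch of that rejection test (the 150px threshold mirrors the script's `MAX_DIST`):

```python
import math

MAX_DIST = 150
MAX_DIST_SQUARED = MAX_DIST * MAX_DIST

def connection_distance(ax, ay, bx, by):
    """Return the distance between two points if they are close enough to
    connect, else None. The sqrt is only computed after the cheap
    squared-distance rejection passes."""
    dx = bx - ax
    dy = by - ay
    dist_squared = dx * dx + dy * dy
    if dist_squared >= MAX_DIST_SQUARED:
        return None  # too far apart; no sqrt ever computed
    return math.sqrt(dist_squared)  # actual distance, used to style the line
```

For an n-ball frame this skips the square root on every rejected pair, which is most of the n*(n-1)/2 comparisons.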
@@ -1,67 +1,107 @@
-function toggleMenu(collapse=false) {
+function toggleMenu(collapse) {
   if (window.innerWidth < 1400) {
-    const e = document.querySelector(".navControl");
+    const menu = document.querySelector(".navControl");
     const bar = document.querySelector(".header");
-    const isCollapsed = !e.style.maxHeight || e.style.maxHeight === "0px";
+    const isCollapsed = !menu.style.maxHeight || menu.style.maxHeight === "0px";
     if (isCollapsed && !collapse) {
-      e.style.maxHeight = `${e.scrollHeight + 10}px`;
+      menu.style.maxHeight = `${menu.scrollHeight + 10}px`;
       bar.style.borderBottomWidth = "0px";
     } else {
-      e.style.maxHeight = "0px";
+      menu.style.maxHeight = "0px";
       bar.style.borderBottomWidth = "3px";
     }
   }
 }
 async function goto(location, { push = true } = {}) {
-  let a;
+  const loadingBar = document.getElementById('loading-bar');
+  if (loadingBar) {
+    loadingBar.style.width = ''; // Clear inline style from previous run
+  }
+  let loadingTimeout = setTimeout(() => {
+    if (loadingBar) {
+      loadingBar.classList.remove('finish');
+      loadingBar.classList.add('active');
+      loadingBar.classList.add('visible');
+    }
+  }, 150);
   try {
-    a = await fetch("/api/goto/" + location, {
+    const response = await fetch("/api/goto/" + location, {
       credentials: "include",
       method: "GET",
       mode: "cors",
     });
-    if (!a.ok) {
-      console.error(`Navigation failed: HTTP ${a.status}`);
-      return;
-    }
-  } catch (err) {
-    console.error("Navigation fetch failed:", err);
-    return;
-  }
+    if (!response.ok) {
+      throw new Error(`HTTP ${response.status}`);
+    }
+    // Wait for the full body to download - this is usually the slow part
+    const [metadata, content] = await response.json();
     document.dispatchEvent(new Event('beforenavigate'));
-  const response = await a.json();
-  const metadata = response[0];
-  const content = response[1];
     const root = document.getElementById("root");
     root.innerHTML = content;
-  root.querySelectorAll("script").forEach((oldScript) => {
+    // Re-execute scripts
+    root.querySelectorAll("script").forEach(function(oldScript) {
       const newScript = document.createElement("script");
-    Array.from(oldScript.attributes).forEach(attr => {
+      Array.from(oldScript.attributes).forEach(function(attr) {
         newScript.setAttribute(attr.name, attr.value);
       });
       newScript.textContent = oldScript.textContent;
       oldScript.parentNode.replaceChild(newScript, oldScript);
     });
-  if (!window.location.href.includes("#")) {
-    window.scrollTo({top: 0, left: 0, behavior:"instant"});
-  } else {
-    const eid = decodeURIComponent(window.location.hash.substring(1));
-    const el = document.getElementById(eid);
+    if (window.location.href.includes("#")) {
+      const id = decodeURIComponent(window.location.hash.substring(1));
+      const el = document.getElementById(id);
       if (el) el.scrollIntoView();
+    } else {
+      window.scrollTo({ top: 0, left: 0, behavior: "instant" });
     }
-  toggleMenu(collapse=true);
+    toggleMenu(true);
     document.querySelector("title").textContent = metadata["title"];
     if (push) {
       history.pushState(null, null, metadata["canonical"]);
     }
+  } catch (err) {
+    console.error("Navigation failed:", err);
+  } finally {
+    clearTimeout(loadingTimeout);
+    if (loadingBar && loadingBar.classList.contains('active')) {
+      loadingBar.classList.add('finish');
+      loadingBar.classList.remove('active');
+      setTimeout(() => {
+        if (!loadingBar.classList.contains('active')) {
+          loadingBar.style.width = '0%';
+          loadingBar.classList.remove('finish');
+          loadingBar.classList.remove('visible');
+        }
+      }, 500);
+    }
+  }
 }
 function backButton() {
-  const location = window.location.pathname;
-  goto(location.substring(1), { push: false }); // remove slash, goto already does that
+  const path = window.location.pathname;
+  goto(path.substring(1), { push: false });
+}
+function activeSkill(obj) {
+  let skill = obj.closest(".skill");
+  if (skill.classList.contains("activeSkill")) {
+    skill.classList.remove("activeSkill");
+    return;
+  }
+  while (skill) {
+    skill.classList.add("activeSkill");
+    skill = skill.parentElement.closest(".skill");
+  }
 }
@@ -1,8 +1,9 @@
-// Fetch and display service status from API
+// Use a global to track the interval and ensure we don't stack listeners
+if (window.statusIntervalId) {
+  clearInterval(window.statusIntervalId);
+  window.statusIntervalId = null;
+}
-/**
- * Fetch status data from server
- */
 async function fetchStatus() {
   try {
     const response = await fetch('/api/status');
@@ -17,36 +18,28 @@ async function fetchStatus() {
   }
 }
-/**
- * Update the status display with fetched data
- */
 function updateStatusDisplay(data) {
-  // Update last check time
   if (data.last_check) {
     const lastCheck = new Date(data.last_check);
-    const timeString = lastCheck.toLocaleString();
-    document.getElementById('lastUpdate').textContent = `Last checked: ${timeString}`;
+    const lastUpdateEl = document.getElementById('lastUpdate');
+    if (lastUpdateEl) lastUpdateEl.textContent = `Last checked: ${lastCheck.toLocaleString()}`;
   }
-  // Update next check time
   if (data.next_check) {
-    const nextCheck = new Date(data.next_check);
-    const timeString = nextCheck.toLocaleString();
     const nextCheckEl = document.getElementById('nextUpdate');
     if (nextCheckEl) {
-      nextCheckEl.textContent = `Next check: ${timeString}`;
+      const nextCheck = new Date(data.next_check);
+      nextCheckEl.textContent = `Next check: ${nextCheck.toLocaleString()}`;
     }
   }
-  // Update each service
-  data.services.forEach(service => {
-    updateServiceCard(service);
-  });
-  // Update overall status
-  updateOverallStatus(data.services);
+  if (data.services) {
+    data.services.forEach(function(service) {
+      updateServiceCard(service);
+    });
+    updateOverallStatus(data.services);
+  }
-  // Re-enable refresh button
   const refreshBtn = document.getElementById('refreshBtn');
   if (refreshBtn) {
     refreshBtn.disabled = false;
@@ -54,9 +47,19 @@ function updateStatusDisplay(data) {
   }
 }
-/**
- * Update a single service card
- */
+function getUptimeClass(value) {
+  if (value === null) return 'text-muted';
+  if (value >= 99) return 'text-excellent';
+  if (value >= 95) return 'text-good';
+  if (value >= 90) return 'text-fair';
+  return 'text-poor';
+}
+function formatUptime(value, label) {
+  const display = value !== null ? `${value}%` : '--';
+  return `${label}: <strong class="${getUptimeClass(value)}">${display}</strong>`;
+}
 function updateServiceCard(service) {
   const card = document.getElementById(`status-${service.id}`);
   if (!card) return;
@@ -68,127 +71,101 @@
   const uptimeDisplay = document.getElementById(`uptime-${service.id}`);
   const checksDisplay = document.getElementById(`checks-${service.id}`);
-  // Update response time
-  if (service.response_time !== null) {
-    timeDisplay.textContent = `${service.response_time}ms`;
-  } else {
-    timeDisplay.textContent = '--';
-  }
+  if (timeDisplay) timeDisplay.textContent = service.response_time !== null ? `${service.response_time}ms` : '--';
-  // Update status code
-  if (service.status_code !== null) {
-    codeDisplay.textContent = service.status_code;
-  } else {
-    codeDisplay.textContent = service.status === 'unknown' ? 'Unknown' : 'Error';
-  }
+  if (codeDisplay) {
+    if (service.status_code !== null) {
+      codeDisplay.textContent = service.status_code;
+    } else {
+      codeDisplay.textContent = service.status === 'unknown' ? 'Unknown' : 'Error';
+    }
+  }
-  // Update status indicator
   card.classList.remove('online', 'degraded', 'offline', 'unknown');
   switch (service.status) {
     case 'online':
-      stateDot.className = 'state-dot online';
-      stateText.textContent = 'Operational';
+      if (stateDot) stateDot.className = 'state-dot online';
+      if (stateText) stateText.textContent = 'Operational';
       card.classList.add('online');
       break;
     case 'degraded':
     case 'timeout':
-      stateDot.className = 'state-dot degraded';
-      stateText.textContent = service.status === 'timeout' ? 'Timeout' : 'Degraded';
+      if (stateDot) stateDot.className = 'state-dot degraded';
+      if (stateText) stateText.textContent = service.status === 'timeout' ? 'Timeout' : 'Degraded';
       card.classList.add('degraded');
       break;
     case 'offline':
-      stateDot.className = 'state-dot offline';
-      stateText.textContent = 'Offline';
+      if (stateDot) stateDot.className = 'state-dot offline';
+      if (stateText) stateText.textContent = 'Offline';
      card.classList.add('offline');
      break;
    default:
-      stateDot.className = 'state-dot loading';
-      stateText.textContent = 'Unknown';
+      if (stateDot) stateDot.className = 'state-dot loading';
+      if (stateText) stateText.textContent = 'Unknown';
      card.classList.add('unknown');
  }
-  // Update uptime statistics
  if (uptimeDisplay && service.uptime) {
-    const uptimeHTML = [];
-    // Helper function to get color class based on uptime percentage
-    const getUptimeClass = (value) => {
-      if (value === null) return 'text-muted';
-      if (value >= 99) return 'text-excellent';
-      if (value >= 95) return 'text-good';
-      if (value >= 90) return 'text-fair';
-      return 'text-poor';
-    };
-    // Helper function to format uptime value
-    const formatUptime = (value, label) => {
-      const display = value !== null ? `${value}%` : '--';
-      const colorClass = getUptimeClass(value);
-      return `${label}: <strong class="${colorClass}">${display}</strong>`;
-    };
-    // Add all uptime metrics
-    uptimeHTML.push(formatUptime(service.uptime['24h'], '24h'));
-    uptimeHTML.push(formatUptime(service.uptime['7d'], '7d'));
-    uptimeHTML.push(formatUptime(service.uptime['30d'], '30d'));
-    uptimeHTML.push(formatUptime(service.uptime.all_time, 'All'));
-    uptimeDisplay.innerHTML = uptimeHTML.join(' | ');
+    uptimeDisplay.innerHTML = [
+      formatUptime(service.uptime['24h'], '24h'),
+      formatUptime(service.uptime['7d'], '7d'),
+      formatUptime(service.uptime['30d'], '30d'),
+      formatUptime(service.uptime.all_time, 'All'),
+    ].join(' | ');
  }
-  // Update total checks
  if (checksDisplay && service.total_checks !== undefined) {
    checksDisplay.textContent = service.total_checks;
  }
 }
-/**
- * Update overall status bar
- */
 function updateOverallStatus(services) {
  const overallBar = document.getElementById('overallStatus');
+  if (!overallBar) return;
  const icon = overallBar.querySelector('.summary-icon');
  const title = overallBar.querySelector('.summary-title');
  const subtitle = document.getElementById('summary-subtitle');
  const onlineCount = document.getElementById('onlineCount');
  const totalCount = document.getElementById('totalCount');
-  // Count service statuses
  const total = services.length;
-  const online = services.filter(s => s.status === 'online').length;
-  const degraded = services.filter(s => s.status === 'degraded' || s.status === 'timeout').length;
-  const offline = services.filter(s => s.status === 'offline').length;
+  const online = services.filter(function(s) { return s.status === 'online'; }).length;
+  const degraded = services.filter(function(s) { return s.status === 'degraded' || s.status === 'timeout'; }).length;
+  const offline = services.filter(function(s) { return s.status === 'offline'; }).length;
-  // Update counts
-  onlineCount.textContent = online;
-  totalCount.textContent = total;
+  if (onlineCount) onlineCount.textContent = online;
+  if (totalCount) totalCount.textContent = total;
-  // Remove all status classes
  overallBar.classList.remove('online', 'degraded', 'offline');
-  icon.classList.remove('operational', 'partial', 'major', 'loading');
+  if (icon) icon.classList.remove('operational', 'partial', 'major', 'loading');
  // Determine overall status
  if (online === total) {
-    // All systems operational
    overallBar.classList.add('online');
+    if (icon) {
      icon.classList.add('operational');
      icon.textContent = '\u2713';
-    title.textContent = 'All Systems Operational';
-    subtitle.textContent = `All ${total} services are running normally`;
+    }
+    if (title) title.textContent = 'All Systems Operational';
+    if (subtitle) subtitle.textContent = `All ${total} services are running normally`;
  } else if (offline >= Math.ceil(total / 2)) {
-    // Major outage (50% or more offline)
    overallBar.classList.add('offline');
+    if (icon) {
      icon.classList.add('major');
      icon.textContent = '\u2715';
-    title.textContent = 'Major Outage';
-    subtitle.textContent = `${offline} service${offline !== 1 ? 's' : ''} offline, ${degraded} degraded`;
+    }
+    if (title) title.textContent = 'Major Outage';
+    if (subtitle) subtitle.textContent = `${offline} service${offline !== 1 ? 's' : ''} offline, ${degraded} degraded`;
  } else if (offline > 0 || degraded > 0) {
-    // Partial outage
    overallBar.classList.add('degraded');
+    if (icon) {
      icon.classList.add('partial');
      icon.textContent = '\u26A0';
-    title.textContent = 'Partial Outage';
+    }
+    if (title) title.textContent = 'Partial Outage';
+    if (subtitle) {
      if (offline > 0 && degraded > 0) {
        subtitle.textContent = `${offline} offline, ${degraded} degraded`;
      } else if (offline > 0) {
@@ -196,18 +173,17 @@ function updateOverallStatus(services) {
      } else {
        subtitle.textContent = `${degraded} service${degraded !== 1 ? 's' : ''} degraded`;
      }
+    }
  } else {
-    // Unknown state
+    if (icon) {
      icon.classList.add('loading');
      icon.textContent = '\u25D0';
-    title.textContent = 'Status Unknown';
-    subtitle.textContent = 'Waiting for service data';
+    }
+    if (title) title.textContent = 'Status Unknown';
+    if (subtitle) subtitle.textContent = 'Waiting for service data';
  }
 }
-/**
- * Show error message
- */
 function showError(message) {
  const errorDiv = document.createElement('div');
  errorDiv.className = 'status-error';
@@ -217,13 +193,10 @@
  const container = document.querySelector('.foregroundContent');
  if (container) {
    container.insertBefore(errorDiv, container.firstChild);
-    setTimeout(() => errorDiv.remove(), 5000);
+    setTimeout(function() { errorDiv.remove(); }, 5000);
  }
 }
-/**
- * Manual refresh
- */
 function refreshStatus() {
  const refreshBtn = document.getElementById('refreshBtn');
  if (refreshBtn) {
@@ -233,32 +206,24 @@
  fetchStatus();
 }
-/**
- * Initialize on page load
- */
-var statusIntervalId = null;
 function initStatusPage() {
-  // Clear any existing interval from a previous SPA navigation
-  if (statusIntervalId !== null) {
-    clearInterval(statusIntervalId);
+  if (window.statusIntervalId) {
+    clearInterval(window.statusIntervalId);
  }
  fetchStatus();
-  // Auto-refresh every 5 minutes to get latest data
-  statusIntervalId = setInterval(fetchStatus, 300000);
+  window.statusIntervalId = setInterval(fetchStatus, 60000);
 }
-// Clean up interval when navigating away via SPA
-document.addEventListener('beforenavigate', () => {
-  if (statusIntervalId !== null) {
-    clearInterval(statusIntervalId);
-    statusIntervalId = null;
-  }
-});
+function cleanupStatusPage() {
+  if (window.statusIntervalId) {
+    clearInterval(window.statusIntervalId);
+    window.statusIntervalId = null;
+  }
+  document.removeEventListener('beforenavigate', cleanupStatusPage);
+}
+document.addEventListener('beforenavigate', cleanupStatusPage);
-// Start when page loads
-if (document.readyState === 'loading') {
-  document.addEventListener('DOMContentLoaded', initStatusPage);
-} else {
+if (document.getElementById('overallStatus')) {
  initStatusPage();
 }
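The `getUptimeClass` helper in this file maps an uptime percentage to a CSS class with fixed thresholds (99 / 95 / 90). The same ladder as a Python sketch, using the class names from the script:

```python
def uptime_class(value):
    """Map an uptime percentage (or None for no data) to the status
    page's CSS class, mirroring getUptimeClass in status.js."""
    if value is None:
        return "text-muted"      # no data recorded yet
    if value >= 99:
        return "text-excellent"
    if value >= 95:
        return "text-good"
    if value >= 90:
        return "text-fair"
    return "text-poor"
```

Because the checks run top-down, each threshold only needs a lower bound; order matters.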
@@ -3,7 +3,7 @@
     "status": "complete",
     "classes": "geospacial",
     "bgi": "watershedTemps.png",
-    "content": "Geospacial analysis of Maryland's Antietam and Conococheague sub-watersheds, monitoring water quality and temperatures through the summer months for reporting to governmental review boards for environmental protection"
+    "content": "Live geospacial analysis of Maryland's Antietam and Conococheague sub-watersheds, monitoring water quality and temperatures through the summer months for governmental environment health review boards."
   },
   "Automotive Brand Valuation Analysis": {
     "status": "complete",
@@ -1,41 +1,35 @@
 {
-  "Tools": {
-    "Microsoft Azure": {
-      "Databricks": {},
-      "Data Factory": {},
-      "Stream Analytics": {}
-    },
-    "Databricks": {},
-    "Apache Spark": {},
-    "Visual Basic for Applications (Excel)": {}
-  },
   "Data and AI": {
-    "Python": {
-      "PyTorch/TensorFlow": {},
-      "Numpy/Pandas": {},
-      "Scikit/Sklearn": {},
-      "Selenium/BS4": {},
-      "Pyspark": {}
+    "ML": {
+      "PySpark ML": {},
+      "Numpy/Pandas/Polars": {},
+      "TensorFlow": {},
+      "Scikit": {}
     },
-    "R": {},
-    "SQL": {}
-  },
-  "Frontend": {
-    "Flask (Python)": {},
-    "React (Javascript)": {},
-    "SASS/SCSS": {}
-  },
-  "Backend & DevOps": {
-    "Backend": {
-      "Rust": {},
-      "C#": {}
-    },
-    "DevOps": {
+    "PySpark": {},
+    "Selenium/BS4 Web Hacking": {},
+    "SQL": {},
+    "Declarative Pipelines": {},
+    "ArcGIS": {}
+  },
+  "Backend": {
+    "Rust": {},
+    "C#": {}
+  },
+  "DevOps": {
     "Docker": {},
     "Microsoft Azure": {},
+    "Databricks": {},
     "Kubernetes/Openshift": {},
     "Cloudflare": {},
     "Bash": {}
-    }
-  }
+  },
+  "Frontend": {
+    "Flask (Python)": {},
+    "REST APIs": {},
+    "Web Scraping": {}
+  },
+  "Offline Skills": {
+    "Circuitry": {},
+    "Skiing": {},
+    "Chess": {},
+    "Plinking": {},
+    "Building something with trash that solves my problems": {}
+  }
 }
@@ -5,7 +5,7 @@
     <loc>https://asimonson.com/projects</loc>
     <loc>https://asimonson.com/Resume</loc>
     <loc>https://asimonson.com/duck</loc>
-    <loc>https://asimonson.com/books</loc>
+    <loc>https://asimonson.com/status</loc>
-    <lastmod>2024-07-24</lastmod>
+    <lastmod>2026-02-12</lastmod>
   </url>
 </urlset>
@@ -20,7 +20,7 @@
       property="og:image"
       content="{{ url_for('static', filename='icons/rasterLogoCircle.png') }}"
     />
-    <meta property="og:url" content="{{ var['canonical'] }}" />
+    <meta property="og:url" content="{{ request.url_root | trim('/') }}{{ var['canonical'] }}" />
     <meta property="twitter:title" content="Andrew Simonson" />
     <meta name="twitter:description" content="{{ var['description'] }}" />
     <meta name="twitter:card" content="summary_large_image" />
@@ -50,16 +50,16 @@
       rel="stylesheet"
       href="{{ url_for('static', filename='css/App.css') }}"
     />
-    <link rel="canonical" href="{{ var['canonical'] }}" />
+    <link rel="canonical" href="{{ request.url_root | trim('/') }}{{ var['canonical'] }}" />
-    <script defer src="{{ url_for('static', filename='js/checkbox.js') }}"></script>
     <script defer src="{{ url_for('static', filename='js/responsive.js') }}"></script>
-    <script src="{{ url_for('static', filename='js/chessbed.js') }}"></script>
+    {# <script src="{{ url_for('static', filename='js/chessbed.js') }}"></script> #}
     <script defer src="{{ url_for('static', filename='js/idler.js') }}"></script>
     <script defer src="https://cdn.jsdelivr.net/npm/p5@1.4.1/lib/p5.min.js"></script>
     <title>{{ var['title'] }}</title>
   </head>
   {% block header %}
   <body onpopstate="backButton()">
+    <div id="loading-bar"></div>
     <noscript>You need to enable JavaScript to run this app.</noscript>
     <main id='map'></main>
     <div id="contentStuffer">
@@ -20,28 +20,27 @@
<!--<INSERT SMALL BANNER HERE FOR PROJECT IMAGECARD CAROUSEL>--> <!--<INSERT SMALL BANNER HERE FOR PROJECT IMAGECARD CAROUSEL>-->
<div id="desktopSpacer"></div> <div id="desktopSpacer"></div>
<div class="homeSubContent"> <div class="homeSubContent">
<img class='blinkies' alt='My Brain is Glowing' src="{{ url_for('static', filename='photos/blinkies/brainglow.gif') }}" loading="lazy" />
<img class='blinkies' alt='Pepsi Addict' src="{{ url_for('static', filename='photos/blinkies/pepsiaddict.gif') }}" loading="lazy" /> <img class='blinkies' alt='Pepsi Addict' src="{{ url_for('static', filename='photos/blinkies/pepsiaddict.gif') }}" loading="lazy" />
<img class='blinkies' alt='I Fear No Beer' src="{{ url_for('static', filename='photos/blinkies/fearnobeer.gif') }}" loading="lazy" />
<img class='blinkies' alt='Secret Message' src="{{ url_for('static', filename='photos/blinkies/tooclose.gif') }}" loading="lazy" /> <img class='blinkies' alt='Secret Message' src="{{ url_for('static', filename='photos/blinkies/tooclose.gif') }}" loading="lazy" />
<img class='blinkies' alt="They took my blood but it wasn't DNA, it was USA" src="{{ url_for('static', filename='photos/blinkies/usa.gif') }}" loading="lazy" /> <img class='blinkies' alt="They took my blood but it wasn't DNA, it was USA" src="{{ url_for('static', filename='photos/blinkies/usa.gif') }}" loading="lazy" />
<img class='blinkies' alt='Bob the Builder gif' src="{{ url_for('static', filename='photos/blinkies/bobthebuilder.gif') }}" loading="lazy" /> <img class='blinkies' alt='Bob the Builder gif' src="{{ url_for('static', filename='photos/blinkies/bobthebuilder.gif') }}" loading="lazy" />
<div> <div>
<br /> <br />
<strong> You've reached the website for Andrew Simonson's personal online shenanigans.</strong> <strong> You've reached the website for Andrew Simonson's digital shenanigans.</strong>
<h3>Now What?</h3> <h3>Now What?</h3>
<p> <p>
Go back and find the link that I originally shared. Or poke around. Be your own person.</br> Go back and find the link that I originally shared. Or poke around. Be your own person.</br>
I guess I'll grant myself some titles while I'm at it: I'll grant myself some titles while I'm at it:
</p> </p>
<ul> <ul>
<li>Load-Bearing Coconut</li>
<li>Wicked Wizard of the West</li>
<li>Enemy of Node.js, Hater of Bloat</li>
<li>Creator and Harnesser of Energy</li>
</ul>
</div>
<br />
{#
<div id="aboutCards" class="flex">
<div class="chess">
{% from 'partials/chess.html' import chess %} {{
@@ -53,6 +52,7 @@
</div>
<br />
</div>
#}
</div>
{% endblock %}
</div> </div>
View File
@@ -1,35 +1,32 @@
{% macro project(title, classes, status, bgi, content, links) %}
<div class="project {{ classes }}" data-aos="fade-up">
  <div class="projImageWrap">
    {% if bgi|length > 0 %}
    {% set path = url_for('static', filename='photos/projects/' + bgi) %}
    <img src="{{ path }}" alt="Ref image for {{ title }} project" />
    {% else %}
    <div class="projImagePlaceholder"></div>
    {% endif %}
    <div class="proj-status-badge {{ status }}">
      <span class="status-indicator"></span>{{ status }}
    </div>
  </div>
  <div class="projContent">
    <h3>{{ title }}</h3>
    <p class="projDesc">{{ content }}</p>
    {% if links %}
    <div class="projLinks">
      {% for i in links %}
      {% set src = 'icons/' + i[0] + '.svg' %}
      {% if i[1].startswith('https://') or i[1].startswith('http://') %}
      <a href="{{ i[1] }}" rel="noopener noreferrer" class="proj-link">
        <img class="projectLink" src="{{ url_for('static', filename=src) }}" alt="{{ i[0] }}" />
        <span>{{ i[2] }}</span>
      </a>
      {% endif %}
      {% endfor %}
    </div>
    {% endif %}
  </div>
</div>
{% endmacro %}
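The refactored macro renders `i[2]` as a visible link label, so each entry in `links` is evidently an `(icon, url, label)` triple, and only absolute http(s) URLs become anchors. A minimal Python sketch of that guard — the sample data below is made up for illustration:

```python
def visible_links(links):
    """Mimic the macro's guard: only absolute http(s) URLs become anchors."""
    return [
        (icon, url, label)
        for icon, url, label in links
        if url.startswith(("https://", "http://"))
    ]

# Made-up sample data in the (icon, url, label) shape the macro indexes as i[0..2].
links = [
    ("github", "https://github.com/asimonson1125", "Source"),
    ("docs", "partials/notes.html", "Notes"),  # relative URL: skipped by the guard
]
print(visible_links(links))  # → [('github', 'https://github.com/asimonson1125', 'Source')]
```

Filtering in the template keeps relative paths out of external-looking project links without the caller having to pre-clean the JSON.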
View File
@@ -1,9 +1,15 @@
{% macro expandSkill(dict, name, classes="") %}
<div class='skill {{ classes }}' data-length='{{ dict[name]|length }}'>
  <div onclick='activeSkill(this)' class='skillname'>{{ name }}</div>
  {% if dict[name]|length > 0 %}
  <div class='skill-children'>
    <div class='skill-children-inner'>
      {% for child in dict[name] %}
      {{ expandSkill(dict[name], child) }}
      {% endfor %}
    </div>
  </div>
  {% endif %}
</div>
{% endmacro %}
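The macro now recurses only when a skill has children. A small standalone sketch of how that recursion plays out, using a trimmed copy of the macro (onclick handler omitted) and a hypothetical nested-dict skill tree:

```python
from jinja2 import Environment

# Trimmed copy of the recursive expandSkill macro from this diff.
MACRO = """
{% macro expandSkill(dict, name, classes="") %}
<div class='skill {{ classes }}' data-length='{{ dict[name]|length }}'>
<div class='skillname'>{{ name }}</div>
{% if dict[name]|length > 0 %}
<div class='skill-children'>
<div class='skill-children-inner'>
{% for child in dict[name] %}{{ expandSkill(dict[name], child) }}{% endfor %}
</div>
</div>
{% endif %}
</div>
{% endmacro %}
{{ expandSkill(skills, 'Python') }}
"""

# Hypothetical skill tree: each key maps to a dict of its child skills.
skills = {"Python": {"Flask": {}, "Pandas": {}}}

html = Environment().from_string(MACRO).render(skills=skills)
print(html)  # 'Python' (data-length='2') wraps 'Flask' and 'Pandas' (data-length='0')
```

Passing `dict[name]` back into the macro walks one level down per call, so the `data-length` attribute lets the front-end distinguish expandable skills from leaves.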
View File
@@ -1,6 +1,5 @@
<div class='socials'>
<a href='https://github.com/asimonson1125'><img alt='Github' src="{{ url_for('static', filename='icons/github.svg') }}" /></a>
<a href='https://www.linkedin.com/in/simonsonandrew/'><img alt='LinkedIn' src="{{ url_for('static', filename='icons/linkedin.svg') }}" /></a>
<a href='mailto:asimonson1125@gmail.com'><img alt='E-mail' src="{{ url_for('static', filename='icons/email.svg') }}" /></a>
<div id='vertLine'></div>
View File
@@ -1,13 +0,0 @@
{% macro timeitem(title, classes, date, deets) %}
<div class="timeitem {{classes}}">
<p class="datetext">{{date}}</p>
<div class="timeline-item">
<h2>{{title}}</h2>
<div class="timeline-deets">
<p>
{{deets}}
</p>
</div>
</div>
</div>
{% endmacro %}
View File
@@ -3,7 +3,8 @@
<div class="foregroundContent">
<div class="flex equalitems vertOnMobile">
<div>
<div>
<h2 class="concentratedHead">About Me<p><sup>Data Scientist, Amateur SysAdmin, Polymath</sup></p></h2>
<p>
I'm Andrew Simonson<!--, CEO of the anti-thermodynamics syndicate.-->,
a <strong>Data Scientist at Ecolab</strong> and a graduate Data
@@ -12,30 +13,32 @@
recently completed the <b>Computer Science BS</b> program
(international relations minor) with a focus on probability
theory.
<!-- <br />
<br />
I started in ~2017, reverse engineering probabilistic logic
models in games and developing interfaces to recreate my
findings for friends. Now I develop traceable AI built on
deductive reasoning, maintaining scientific methodology in an
industry obsessed with implicit rules and exclusive empiricism.
As the analysis grew more sophisticated, so too did the tech
stack - to the point that I now manage most services, like this
website, end to end, container image to insight visual. -->
<br />
<br />
I get bored and throw random stuff on this website.<br/>
This is what unprofessional development looks like.
</p>
</div>
<br/>
<br/>
<h4 class='concentratedHead'>
I also have a
<a href="Resume_Simonson_Andrew.pdf" target="_blank">resume</a>
for unexplained reasons.
</h4>
</div>
<div id="skills">
<h2 id="skillstag">Technologies</h2>
{% from 'partials/skills.html' import skills %} {{
skills(var['skillList']) }}
</div>
@@ -43,32 +46,7 @@
<br />
<h2 class="concentratedHead">Projects</h2>
<div class="projectList">
{% from 'partials/project.html' import project %} {% for i in
var["projects"] %} {{ project(i, var["projects"][i]["classes"],
var["projects"][i]["status"], var["projects"][i]["bgi"],
@@ -77,5 +55,4 @@
</div>
</div>
{% endblock %}
View File
@@ -91,8 +91,8 @@
<div class="info-box">
<h4>About This Monitor</h4>
<ul>
<li><strong>Check Frequency:</strong> Services are checked automatically every minute from the server</li>
<li><strong>Page Refresh:</strong> This page auto-refreshes every minute to show latest data</li>
</ul>
</div>
</div>
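The once-a-minute checks described above could be driven by a loop like the following. This is a generic sketch, not the site's actual scheduler; `run_monitor` and its parameters are invented for illustration, and the sleep function is injectable so the example runs instantly:

```python
import time

def run_monitor(check, interval=60, iterations=None, sleep=time.sleep):
    """Call check() every `interval` seconds; iterations=None runs forever."""
    count = 0
    while iterations is None or count < iterations:
        check()
        count += 1
        if iterations is None or count < iterations:
            sleep(interval)  # wait between checks, not after the last one

calls = []
# Inject a no-op sleep so the sketch completes immediately.
run_monitor(lambda: calls.append(time.time()),
            interval=60, iterations=3, sleep=lambda s: None)
print(len(calls))  # → 3
```

In production such a loop would typically live in a background thread or a scheduler (e.g. APScheduler), with each check's result written to the uptime store.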