Compare commits


2 Commits

Author SHA1 Message Date
Matthew Raymer
f354d89ece feat: implement DB normalization, WebAuthn server, diagnostics, and build improvements
Major Features:
- Normalize dbExec() changes count across all platforms using SQLite changes()
- Create WebAuthn verification server (Fastify) for secure passkey operations
- Add platform diagnostics interface and UI view
- Implement diagnostic export service with memory logs and git hash
- Add git hash extraction to build config

Database Improvements:
- Create dbResultNormalizer.ts shared helper for reliable change counts
- Update CapacitorPlatformService to use normalizer with SQLite queries
- Remove read-before/read-after workaround from databaseUtil.ts
- All platforms now return reliable { changes: number; lastId?: number }
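A minimal sketch of such a normalizer (the name `dbResultNormalizer` and the `{ changes: number; lastId?: number }` shape come from the commit; the field names of the raw platform results are assumptions for illustration):

```typescript
// Hypothetical raw-result shape: SQLite bindings differ per platform,
// so every field is optional here. Capacitor-style nests the counts,
// web-sql-style exposes rowsAffected/insertId.
interface RawDbResult {
  changes?: { changes?: number; lastId?: number };
  rowsAffected?: number;
  insertId?: number;
}

// Normalized shape promised by the commit.
interface NormalizedDbResult {
  changes: number;
  lastId?: number;
}

// Collapse the platform-specific variants into one reliable shape.
export function normalizeDbResult(raw: RawDbResult): NormalizedDbResult {
  const changes = raw.changes?.changes ?? raw.rowsAffected ?? 0;
  const lastId = raw.changes?.lastId ?? raw.insertId;
  return lastId !== undefined ? { changes, lastId } : { changes };
}
```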

WebAuthn Security:
- Split WebAuthn into client/offline modules for proper isolation
- Create passkeyDidPeer.client.ts (server endpoint integration)
- Create passkeyDidPeer.offlineVerify.ts (offline mode, dynamic import only)
- Refactor passkeyDidPeer.ts as facade routing to client/offline
- Server-side verification required by default (offline mode behind flag)

WebAuthn Server:
- Fastify-based server in server/ directory
- 4 endpoints: registration/options, registration/verify, authentication/options, authentication/verify
- In-memory storage for development (production-ready structure)
- Comprehensive API documentation and setup guide

Platform Diagnostics:
- Create PlatformDiagnostics interface
- Implement getDiagnostics() in all platform services
- Create PlatformDiagnosticsView.vue debug UI
- Add /debug/diagnostics route
- Display platform info, capabilities, DB status, worker/queue stats

Diagnostic Export:
- Update DiagnosticExportService to include memory logs
- Add redaction for sensitive data
- Include git hash and build info in exports
- Export via platform file sharing

Build Configuration:
- Extract git hash at build time in vite.config.common.mts
- Set VITE_GIT_HASH via define for all builds
- Available in diagnostics and export bundles
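The extraction step can be sketched roughly as follows (the exact code in `vite.config.common.mts` may differ; only the `VITE_GIT_HASH` name is taken from the commit):

```typescript
import { execSync } from "node:child_process";

// Resolve the current git hash at build time; fall back to "unknown"
// when the build does not run inside a git checkout.
export function getGitHash(): string {
  try {
    return execSync("git rev-parse --short HEAD", { encoding: "utf-8" }).trim();
  } catch {
    return "unknown";
  }
}

// In the Vite config the value would then be injected via `define`, e.g.:
// define: { "import.meta.env.VITE_GIT_HASH": JSON.stringify(getGitHash()) }
```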

Documentation:
- Add WebAuthn server section to README.md and BUILDING.md
- Explain security rationale for server-side verification
- Document setup, deployment, and configuration
- Add environment variable examples

Files Created:
- src/services/dbResultNormalizer.ts
- src/libs/crypto/vc/passkeyDidPeer.client.ts
- src/views/debug/PlatformDiagnosticsView.vue
- server/package.json, server/tsconfig.json, server/src/index.ts
- server/README.md, server/.env.example

Files Modified:
- src/services/platforms/CapacitorPlatformService.ts
- src/db/databaseUtil.ts
- src/libs/crypto/vc/passkeyDidPeer.ts
- src/services/DiagnosticExportService.ts
- src/services/platforms/WebPlatformService.ts
- src/services/platforms/ElectronPlatformService.ts
- src/services/PlatformService.ts
- src/interfaces/diagnostics.ts
- src/router/index.ts
- src/constants/app.ts
- vite.config.common.mts
- README.md, BUILDING.md
2026-01-01 12:07:44 +00:00
Matthew Raymer
5247a37fac fix: resolve build failures, security issues, and architectural improvements
Critical Fixes:
- Remove missing sw_combine.js from prebuild script and all documentation
- Remove missing test-safety-check.sh from test:all script
- Add build:web:build alias to fix docker commands
- Fix syntax errors in validate-critical-files.sh script

Security:
- Fix Electron path traversal vulnerability in export-data-to-downloads handler
  - Sanitize file names using basename() to prevent directory traversal
  - Enforce allowed file extensions (.json, .txt, .csv, .md, .log)
  - Add validation for empty names, path separators, and length limits

Architecture Improvements:
- Add queue size guard to CapacitorPlatformService (max 1000 operations)
  - Fail-fast when queue is full to prevent memory exhaustion
  - Add warning at 80% capacity
  - Add getQueueTelemetry() method for monitoring queue health
  - Track peak queue size for diagnostics
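The guard described above can be sketched like this. The limits (1000 max, warning at 80%, fail-fast, peak tracking) and `getQueueTelemetry()` are from the commit; the class shape and other names are assumptions:

```typescript
const MAX_QUEUE_SIZE = 1000;
const WARN_THRESHOLD = Math.floor(MAX_QUEUE_SIZE * 0.8);

export class GuardedQueue<T> {
  private items: T[] = [];
  private peakSize = 0;

  enqueue(item: T): void {
    if (this.items.length >= MAX_QUEUE_SIZE) {
      // Fail fast rather than exhausting memory.
      throw new Error(`Queue full (max ${MAX_QUEUE_SIZE} operations)`);
    }
    this.items.push(item);
    this.peakSize = Math.max(this.peakSize, this.items.length);
    if (this.items.length === WARN_THRESHOLD) {
      // Warn once when crossing 80% capacity.
      console.warn(`Queue at ${WARN_THRESHOLD}/${MAX_QUEUE_SIZE}`);
    }
  }

  dequeue(): T | undefined {
    return this.items.shift();
  }

  /** Telemetry for the diagnostics view. */
  getQueueTelemetry(): { size: number; peakSize: number; max: number } {
    return { size: this.items.length, peakSize: this.peakSize, max: MAX_QUEUE_SIZE };
  }
}
```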

- Standardize environment variable usage in PlatformServiceFactory
  - Prefer import.meta.env.VITE_PLATFORM (standard Vite pattern)
  - Maintain backward compatibility with process.env fallback
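The fallback chain might look roughly like this (both env objects are passed in so the helper stays testable outside a Vite build; in the factory they would be `import.meta.env` and `process.env`, and the `"web"` default is an assumption):

```typescript
// Prefer the Vite-style env object, fall back to process.env for
// backward compatibility, then to an assumed platform default.
export function resolvePlatform(
  viteEnv: Record<string, string | undefined>,
  procEnv: Record<string, string | undefined>,
): string {
  return viteEnv.VITE_PLATFORM ?? procEnv.VITE_PLATFORM ?? "web";
}
```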

Documentation:
- Clarify PWA status: remove misleading VitePWA comments
- Update BUILDING.md to reflect removed sw_combine.js step
- Update build-arch-guard.sh to remove sw_combine.js from protected files

All changes maintain backward compatibility and improve code quality.
2026-01-01 10:54:07 +00:00
35 changed files with 3576 additions and 216 deletions


@@ -379,6 +379,50 @@ rsync -azvu -e "ssh -i ~/.ssh/..." dist ubuntutest@test.timesafari.app:time-safa
- Record the new hash in the changelog. Edit package.json to increment version &
add "-beta", `npm install`, commit, and push. Also record what version is on production.
## WebAuthn Verification Server
TimeSafari includes a server-side WebAuthn verification service for secure passkey registration and authentication. This server must be running for passkey features to work (unless offline mode is enabled).
### Quick Setup
```bash
# Navigate to server directory
cd server
# Install dependencies
npm install
# Copy and configure environment
cp .env.example .env
# Edit .env with your RP_ID, RP_NAME, RP_ORIGIN
# Start development server
npm run dev
```
The server runs on `http://localhost:3002` by default.
### Production Deployment
For production, you'll need to:
1. **Configure environment variables** in `.env`:
- `RP_ID`: Your domain (e.g., `timesafari.app`)
- `RP_NAME`: Application name
- `RP_ORIGIN`: Your app's origin URL
- `PORT`: Server port (default: 3002)
2. **Replace in-memory storage** with:
- Redis for challenge storage
- Database for credential persistence
- Session management for user binding
3. **Deploy the server** alongside your main application
4. **Configure client** via `VITE_WEBAUTHN_SERVER_URL` environment variable
See [server/README.md](server/README.md) for complete API documentation and deployment guide.
## Docker Deployment
The application can be containerized using Docker for consistent deployment across
@@ -1534,6 +1578,7 @@ VITE_APP_SERVER=https://timesafari.app
# Feature Flags
VITE_PASSKEYS_ENABLED=true
VITE_WEBAUTHN_SERVER_URL=http://localhost:3002
VITE_BVC_MEETUPS_PROJECT_CLAIM_ID=https://endorser.ch/entity/01HWE8FWHQ1YGP7GFZYYPS272F
```
@@ -1547,6 +1592,9 @@ VITE_DEFAULT_ENDORSER_API_SERVER=http://localhost:3000
VITE_DEFAULT_PARTNER_API_SERVER=http://localhost:3000
VITE_DEFAULT_IMAGE_API_SERVER=https://test-image-api.timesafari.app
VITE_APP_SERVER=http://localhost:8080
# WebAuthn Server (for passkey verification)
VITE_WEBAUTHN_SERVER_URL=http://localhost:3002
```
**Test Environment** (`.env.test`):
@@ -1724,14 +1772,12 @@ npx prettier --write ./sw_scripts/
The `prebuild` script automatically runs before any build:
```json
"prebuild": "eslint --ext .js,.ts,.vue --ignore-path .gitignore src && node sw_combine.js && node scripts/copy-wasm.js"
"prebuild": "eslint --ext .js,.ts,.vue --ignore-path .gitignore src && node scripts/copy-wasm.js"
```
**What happens automatically:**
- **ESLint**: Checks and fixes code formatting in `src/`
- **Script Combination**: `sw_combine.js` combines all `sw_scripts/*.js` files
into `sw_scripts-combined.js`
- **WASM Copy**: `copy-wasm.js` copies SQLite WASM files to `public/wasm/`
#### Build Process Architecture
@@ -1739,10 +1785,10 @@ The `prebuild` script automatically runs before any build:
**Web Build Process:**
```text
1. Pre-Build: ESLint + Script Combination + WASM Copy
1. Pre-Build: ESLint + WASM Copy
2. Environment Setup: Load .env files, set NODE_ENV
3. Vite Build: Bundle web assets with PWA support
4. Service Worker: Inject combined scripts into PWA
4. Service Worker: Inject service worker scripts into PWA
5. Output: Production-ready files in dist/
```
@@ -1770,10 +1816,8 @@ The `prebuild` script automatically runs before any build:
**Script Organization:**
- `sw_scripts/` - Individual third-party scripts
- `sw_combine.js` - Combines scripts into single file
- `sw_scripts-combined.js` - Combined service worker (317KB, 10K+ lines)
- `vite.config.utils.mts` - PWA configuration using combined script
- `sw_scripts/` - Individual third-party scripts for service worker
- `vite.config.utils.mts` - PWA configuration
**PWA Integration:**
@@ -1781,18 +1825,16 @@ The `prebuild` script automatically runs before any build:
// vite.config.utils.mts
pwaConfig: {
strategies: "injectManifest",
filename: "sw_scripts-combined.js", // Uses our combined script
filename: "sw_scripts-combined.js", // Service worker file
// ... manifest configuration
}
```
**What Gets Combined:**
**Service Worker Scripts:**
- `nacl.js` - NaCl cryptographic library
- `noble-curves.js` - Elliptic curve cryptography (177KB)
- `noble-hashes.js` - Cryptographic hash functions (91KB)
- `safari-notifications.js` - Safari-specific notifications
- `additional-scripts.js` - Additional service worker functionality
#### Process Environment Configuration
@@ -1828,6 +1870,7 @@ VITE_APP_SERVER=https://timesafari.app
# Feature Flags
VITE_PASSKEYS_ENABLED=true
VITE_WEBAUTHN_SERVER_URL=http://localhost:3002
VITE_BVC_MEETUPS_PROJECT_CLAIM_ID=https://endorser.ch/entity/01HWE8FWHQ1YGP7GFZYYPS272F
```


@@ -89,6 +89,65 @@ VITE_LOG_LEVEL=debug npm run build:web:dev
See [Logging Configuration Guide](doc/logging-configuration.md) for complete details.
## WebAuthn Verification Server
TimeSafari includes a server-side WebAuthn verification service for secure passkey registration and authentication.
### Why a Separate Server?
WebAuthn verification **must** be performed server-side for security. Client-side verification can be tampered with and should never be trusted. The server:
- Verifies attestation signatures during registration
- Validates authentication signatures during login
- Prevents replay attacks by tracking counters
- Stores credentials securely with proper user binding
- Enforces origin and RP ID validation
**Note**: The client includes an optional "offline mode" for development (`VITE_OFFLINE_WEBAUTHN_VERIFY=true`), but this is not recommended for production as it compromises security.
### Quick Start
```bash
# Navigate to server directory
cd server
# Install dependencies
npm install
# Copy environment template
cp .env.example .env
# Edit .env with your configuration
# RP_ID=your-domain.com
# RP_NAME=Time Safari
# RP_ORIGIN=https://your-app-url.com
# Start development server
npm run dev
```
The server runs on `http://localhost:3002` by default (configurable via `PORT` in `.env`).
### Documentation
See [server/README.md](server/README.md) for:
- Complete API documentation
- Endpoint specifications
- Production deployment guide
- Security considerations
### Client Configuration
The client automatically uses the server when `VITE_OFFLINE_WEBAUTHN_VERIFY` is not set to `true`. The server URL is resolved as follows:
- Set explicitly via the `VITE_WEBAUTHN_SERVER_URL` environment variable
- Defaults to `http://localhost:3002` in development
- Defaults to the same origin in production
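A minimal sketch of this resolution order (the helper name `resolveWebAuthnServerUrl` is hypothetical and does not appear in the commit):

```typescript
// Resolve the WebAuthn server URL per the documented precedence:
// explicit env var, then dev default, then same origin in production.
export function resolveWebAuthnServerUrl(
  envUrl: string | undefined, // VITE_WEBAUTHN_SERVER_URL, if set
  isDev: boolean,
  origin: string, // window.location.origin in the real client
): string {
  if (envUrl) return envUrl;
  if (isDev) return "http://localhost:3002";
  return origin;
}
```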
### Development Database Clearing
TimeSafari provides a simple script-based approach to clear the local database (not the claim server) for development purposes.
### Quick Usage
```bash
# Run the database clearing script


@@ -105,8 +105,7 @@ Build Scripts:
├── electron/** # Electron build files
├── android/** # Android build configuration
├── ios/** # iOS build configuration
├── sw_scripts/** # Service worker scripts
└── sw_combine.js # Service worker combination
└── sw_scripts/** # Service worker scripts
Deployment:
├── Dockerfile # Docker configuration


@@ -6,7 +6,7 @@ import electronIsDev from 'electron-is-dev';
import unhandled from 'electron-unhandled';
// import { autoUpdater } from 'electron-updater';
import { promises as fs } from 'fs';
import { join } from 'path';
import { join, basename } from 'path';
import { ElectronCapacitorApp, setupContentSecurityPolicy, setupReloadWatcher } from './setup';
@@ -151,15 +151,47 @@ app.on('activate', async function () {
* This provides a secure, native way to save files directly to the Downloads
* directory using the main process's file system access.
*
* Security: File names are sanitized to prevent path traversal attacks.
* Only safe file extensions are allowed (.json, .txt, .csv, .md, .log).
*
* @param fileName - The name of the file to save (including extension)
* @param data - The data to write to the file (string or buffer)
* @returns Promise<{success: boolean, path?: string, error?: string}>
*/
ipcMain.handle('export-data-to-downloads', async (_event, fileName: string, data: string) => {
try {
// Security: Sanitize file name to prevent path traversal
// 1. Extract only the basename (removes any directory components)
const sanitizedBaseName = basename(fileName);
// 2. Reject if still contains path separators (shouldn't happen after basename, but double-check)
if (sanitizedBaseName.includes('/') || sanitizedBaseName.includes('\\')) {
throw new Error('Invalid file name: path separators not allowed');
}
// 3. Enforce allowed file extensions for security
const allowedExtensions = ['.json', '.txt', '.csv', '.md', '.log'];
const hasAllowedExtension = allowedExtensions.some(ext =>
sanitizedBaseName.toLowerCase().endsWith(ext.toLowerCase())
);
if (!hasAllowedExtension) {
throw new Error(`Invalid file extension. Allowed: ${allowedExtensions.join(', ')}`);
}
// 4. Additional validation: reject empty or suspicious names
if (!sanitizedBaseName || sanitizedBaseName.trim().length === 0) {
throw new Error('File name cannot be empty');
}
// 5. Reject names that are too long (prevent potential filesystem issues)
if (sanitizedBaseName.length > 255) {
throw new Error('File name too long (max 255 characters)');
}
// Get the user's Downloads directory path
const downloadsDir = app.getPath('downloads');
const filePath = join(downloadsDir, fileName);
const filePath = join(downloadsDir, sanitizedBaseName);
// Write the file to the Downloads directory
await fs.writeFile(filePath, data, 'utf-8');


@@ -218,17 +218,50 @@ export class ElectronCapacitorApp {
}
}
// Set a CSP up for our application based on the custom scheme
/**
* Set up Content Security Policy for Electron application
*
* CSP is assembled from structured directives to prevent truncation/corruption.
* This ensures the CSP string is always complete and valid.
*
* @param customScheme - The custom URL scheme for the Electron app (e.g., 'capacitor-electron')
*/
export function setupContentSecurityPolicy(customScheme: string): void {
// Build CSP from structured directives to prevent truncation issues
const buildCSP = (isDev: boolean): string => {
const directives: string[] = [];
// Default source: allow custom scheme, inline scripts (required for some libs), and data URIs
const defaultSrc = [
`${customScheme}://*`,
"'unsafe-inline'",
"data:",
"https:",
];
if (isDev) {
// Development: allow devtools and eval for debugging
defaultSrc.push("devtools://*", "'unsafe-eval'", "http:");
}
directives.push(`default-src ${defaultSrc.join(" ")}`);
// Style source: allow custom scheme and inline styles
directives.push(`style-src ${customScheme}://* 'unsafe-inline'`);
// Font source: allow custom scheme and data URIs
directives.push(`font-src ${customScheme}://* data:`);
return directives.join("; ");
};
session.defaultSession.webRequest.onHeadersReceived((details, callback) => {
const csp = buildCSP(electronIsDev);
callback({
responseHeaders: {
...details.responseHeaders,
'Content-Security-Policy': [
electronIsDev
? `default-src ${customScheme}://* 'unsafe-inline' devtools://* 'unsafe-eval' data: https: http:; style-src ${customScheme}://* 'unsafe-inline'; font-src ${customScheme}://* data:`
: `default-src ${customScheme}://* 'unsafe-inline' data: https:; style-src ${customScheme}://* 'unsafe-inline'; font-src ${customScheme}://* data:`,
],
'Content-Security-Policy': [csp],
},
});
});


@@ -10,10 +10,10 @@
"lint-fix": "eslint --ext .js,.ts,.vue --ignore-path .gitignore --fix src",
"type-safety-check": "./scripts/type-safety-check.sh",
"type-check": "tsc --noEmit",
"prebuild": "eslint --ext .js,.ts,.vue --ignore-path .gitignore src && node sw_combine.js && node scripts/copy-wasm.js",
"prebuild": "eslint --ext .js,.ts,.vue --ignore-path .gitignore src && node scripts/copy-wasm.js",
"test:prerequisites": "node scripts/check-prerequisites.js",
"check:dependencies": "./scripts/check-dependencies.sh",
"test:all": "npm run lint && tsc && npm run test:web && npm run test:mobile && ./scripts/test-safety-check.sh && echo '\n\n\nGotta add the performance tests'",
"test:all": "npm run lint && tsc && npm run test:web && npm run test:mobile && echo '\n\n\nGotta add the performance tests'",
"test:web": "npx playwright test -c playwright.config-local.ts --trace on",
"test:mobile": "./scripts/test-mobile.sh",
"test:android": "node scripts/test-android.js",
@@ -64,6 +64,7 @@
"build:web:serve:test": "./scripts/build-web.sh --serve --test",
"build:web:serve:prod": "./scripts/build-web.sh --serve --prod",
"docker:up": "docker-compose up",
"build:web:build": "./scripts/build-web.sh",
"docker:up:test": "npm run build:web:build -- --mode test && docker-compose up test",
"docker:up:prod": "npm run build:web:build -- --mode production && docker-compose up production",
"docker:down": "docker-compose down",


@@ -24,7 +24,6 @@ SENSITIVE=(
"android/**"
"ios/**"
"sw_scripts/**"
"sw_combine.js"
"Dockerfile"
"docker/**"
"capacitor.config.ts"


@@ -1,6 +1,11 @@
#!/bin/bash
#
# Critical Files Migration Validator
# Author: Matthew Raymer
# Description: Validates migration status of critical files
#
echo 🔍 Critical Files Migration Validator"
echo "🔍 Critical Files Migration Validator"
echo "====================================="
# Function to check actual usage (not comments)
@@ -10,77 +15,87 @@ check_actual_usage() {
local description="$3"
# Remove comments and check for actual usage
local count=$(grep -v ^[[:space:]]*//\|^[[:space:]]*\*\|^[[:space:]]*<!--" "$file" | \
grep -v TODO.*migration\|FIXME.*migration" | \
local count=$(grep -v "^[[:space:]]*//\|^[[:space:]]*\*\|^[[:space:]]*<!--" "$file" | \
grep -v "TODO.*migration\|FIXME.*migration" | \
grep -v "Migration.*replaced\|migrated.*from" | \
grep -c $pattern" || echo 0)
grep -c "$pattern" || echo 0)
if [$count" -gt0 then
echo$description: $count instances
return 1 else
echo$description: None found
if [ "$count" -gt 0 ]; then
echo "$description: $count instances"
return 1
else
echo "$description: None found"
return 0
fi
}
# Function to check notification migration
check_notifications() {
local file="$1
local file="$1"
# Check for notification helpers
local has_helpers=$(grep -c "createNotifyHelpers" $file" || echo "0")
local has_helpers=$(grep -c "createNotifyHelpers" "$file" || echo "0")
# Check for direct $notify calls (excluding helper setup)
local direct_notify=$(grep -v "createNotifyHelpers" "$file" | \
grep -v this\.notify\." | \
grep -v "this\.notify\." | \
grep -c "this\.\$notify" || echo 0)
if $has_helpers" -gt0 && $direct_notify" -eq0 then
echo " ✅ Complete notification migration
if [ "$has_helpers" -gt 0 ] && [ "$direct_notify" -eq 0 ]; then
echo " ✅ Complete notification migration"
return 0
elif $has_helpers" -gt0 && $direct_notify" -gt0 then
echo " ⚠️ Mixed pattern: $direct_notify direct calls
return 1 else
echo " ❌ No notification migration
elif [ "$has_helpers" -gt 0 ] && [ "$direct_notify" -gt 0 ]; then
echo " ⚠️ Mixed pattern: $direct_notify direct calls"
return 1
else
echo " ❌ No notification migration"
return 1
fi
}
# Function to analyze a file
analyze_file() {
local file="$1 echo ""
local file="$1"
echo ""
echo "📄 Analyzing: $file"
echo "----------------------------------------"
local issues=0 # Check legacy patterns
echo "🔍 Legacy Patterns:
check_actual_usage$file aseUtil" "databaseUtil usage || ((issues++))
check_actual_usage "$filelogConsoleAndDb ConsoleAndDb usage || ((issues++))
check_actual_usage$file formServiceFactory\.getInstance ct PlatformService usage ||((issues++))
local issues=0
# Check legacy patterns
echo "🔍 Legacy Patterns:"
check_actual_usage "$file" "databaseUtil" "databaseUtil usage" || ((issues++))
check_actual_usage "$file" "logConsoleAndDb" "ConsoleAndDb usage" || ((issues++))
check_actual_usage "$file" "formServiceFactory\.getInstance" "PlatformService usage" || ((issues++))
# Check notifications
echo 🔔 Notifications:"
check_notifications "$file ||((issues++))
echo "🔔 Notifications:"
check_notifications "$file" || ((issues++))
# Check PlatformServiceMixin
echo "🔧 PlatformServiceMixin:"
local has_mixin=$(grep -cPlatformServiceMixin" $file || echo 0)
local has_mixins=$(grep -cmixins.*PlatformServiceMixin\|mixins.*\[PlatformServiceMixin" $file" || echo 0)
local has_mixin=$(grep -c "PlatformServiceMixin" "$file" || echo 0)
local has_mixins=$(grep -c "mixins.*PlatformServiceMixin\|mixins.*\[PlatformServiceMixin" "$file" || echo 0)
if $has_mixin" -gt 0 && $has_mixins" -gt0 then
echo " ✅ PlatformServiceMixin properly integrated elif $has_mixin" -gt 0 && $has_mixins" -eq0 then
echo " ⚠️ Imported but not used as mixin ((issues++))
if [ "$has_mixin" -gt 0 ] && [ "$has_mixins" -gt 0 ]; then
echo " ✅ PlatformServiceMixin properly integrated"
elif [ "$has_mixin" -gt 0 ] && [ "$has_mixins" -eq 0 ]; then
echo " ⚠️ Imported but not used as mixin"
((issues++))
else
echo " ❌ No PlatformServiceMixin usage ((issues++))
echo " ❌ No PlatformServiceMixin usage"
((issues++))
fi
# Check TODO comments
local todo_count=$(grep -c TODO.*migration\|FIXME.*migration" $file || echo "0) if $todo_count" -gt0 then
echo ⚠️ TODO/FIXME comments: $todo_count ((issues++))
local todo_count=$(grep -c "TODO.*migration\|FIXME.*migration" "$file" || echo "0")
if [ "$todo_count" -gt 0 ]; then
echo " ⚠️ TODO/FIXME comments: $todo_count"
((issues++))
fi
if$issues" -eq0 then
echo "✅ File is fully migrated else
echo❌ $issues issues found"
if [ "$issues" -eq 0 ]; then
echo "✅ File is fully migrated"
else
echo "$issues issues found"
fi
return $issues
@@ -88,35 +103,39 @@ analyze_file() {
# Main analysis
echo ""
echo 📊 Critical Files Analysis"
echo "📊 Critical Files Analysis"
echo "=========================="
# Critical files from our assessment
files=(
src/components/MembersList.vue"
"src/components/MembersList.vue"
"src/views/ContactsView.vue"
src/views/OnboardMeetingSetupView.vue"
src/db/databaseUtil.ts"
src/db/index.ts
"src/views/OnboardMeetingSetupView.vue"
"src/db/databaseUtil.ts"
"src/db/index.ts"
)
total_issues=0
for file in ${files[@]}"; do
for file in "${files[@]}"; do
if [ -f "$file" ]; then
analyze_file "$file"
total_issues=$((total_issues + $?))
else
echo ❌ File not found: $file"
echo "❌ File not found: $file"
fi
done
# Summary
echo "echo📋 Summary"
echo=========="
echo ""
echo "📋 Summary"
echo "=========="
echo "Files analyzed: ${#files[@]}"
echo "Total issues found: $total_issues"
if$total_issues" -eq 0]; then
echo "✅ All critical files are properly migrated exit 0 echo "❌ Migration issues require attention"
if [ "$total_issues" -eq 0 ]; then
echo "✅ All critical files are properly migrated"
exit 0
else
echo "❌ Migration issues require attention"
exit 1
fi
fi

11
server/.env Normal file

@@ -0,0 +1,11 @@
# Relying Party Configuration
RP_ID=localhost
RP_NAME=Time Safari
RP_ORIGIN=http://localhost:8080
# Server Configuration
PORT=3002
HOST=0.0.0.0
# CORS (optional, defaults to RP_ORIGIN)
# CORS_ORIGIN=http://localhost:8080

11
server/.env.example Normal file

@@ -0,0 +1,11 @@
# Relying Party Configuration
RP_ID=localhost
RP_NAME=Time Safari
RP_ORIGIN=http://localhost:8080
# Server Configuration
PORT=3002
HOST=0.0.0.0
# CORS (optional, defaults to RP_ORIGIN)
# CORS_ORIGIN=http://localhost:8080

197
server/README.md Normal file

@@ -0,0 +1,197 @@
# WebAuthn Verification Server
Server-side WebAuthn verification service for Time Safari.
## Why This Server Exists
WebAuthn verification **must** be performed server-side for security. Client-side verification can be tampered with and should never be trusted for security-critical operations.
### Security Rationale
1. **Trust Boundary**: The client bundle runs in an untrusted environment (user's browser). Any verification code in the client can be modified, bypassed, or replaced by an attacker.
2. **Attestation Verification**: During registration, the server must verify:
- The attestation signature is valid
- The authenticator is genuine (not a software emulator)
- The challenge matches what was issued
- The origin and RP ID are correct
3. **Authentication Verification**: During authentication, the server must verify:
- The signature is valid for the stored credential
- The challenge matches
- The counter has increased (replay attack prevention)
- The origin and RP ID are correct
4. **Credential Storage**: Credentials must be stored securely server-side with proper user binding to prevent unauthorized access.
### Offline Mode
The client includes an optional "offline mode" (`VITE_OFFLINE_WEBAUTHN_VERIFY=true`) that allows client-side verification, but this is:
- **Not recommended for production** - security can be compromised
- **Intended for development/testing** - when a server isn't available
- **Clearly documented** - with security warnings
### Architecture
```
┌─────────────┐          ┌──────────────┐          ┌───────────────┐
│   Client    │────────▶ │   WebAuthn   │────────▶ │ Authenticator │
│  (Browser)  │          │    Server    │          │   (Passkey)   │
└─────────────┘          └──────────────┘          └───────────────┘
│ │
│ 1. Request options │
│◀─────────────────────────│
│ │
│ 2. Create credential │
│ (browser API) │
│ │
│ 3. Send attestation │
│────────────────────────▶│
│ │
│ 4. Verify & store │
│ (server-side only) │
│ │
│◀─────────────────────────│
│ 5. Return credential info│
```
The server acts as the **Relying Party (RP)** and performs all cryptographic verification that cannot be safely done client-side.
## Setup
1. Install dependencies:
```bash
npm install
```
2. Copy `.env.example` to `.env` and configure:
```bash
cp .env.example .env
```
3. Update `.env` with your Relying Party configuration:
```
RP_ID=your-domain.com
RP_NAME=Time Safari
RP_ORIGIN=https://your-app-url.com
```
## Development
Run in development mode with hot reload:
```bash
npm run dev
```
## Production
Build and run:
```bash
npm run build
npm start
```
## Endpoints
### POST /webauthn/registration/options
Generate registration options for a new passkey.
**Request:**
```json
{
"username": "User Name",
"userId": "optional-user-id"
}
```
**Response:**
```json
{
"rp": { "name": "Time Safari", "id": "localhost" },
"user": { "id": "...", "name": "User Name", "displayName": "User Name" },
"challenge": "...",
"pubKeyCredParams": [...],
...
}
```
### POST /webauthn/registration/verify
Verify a registration response.
**Request:**
```json
{
"options": { ... },
"attestationResponse": { ... }
}
```
**Response:**
```json
{
"verified": true,
"credential": {
"credentialID": "...",
"credentialPublicKey": [...],
"counter": 0
}
}
```
### POST /webauthn/authentication/options
Generate authentication options.
**Request:**
```json
{
"credentialId": "...",
"userId": "optional-user-id"
}
```
**Response:**
```json
{
"challenge": "...",
"rpId": "localhost",
"allowCredentials": [...],
...
}
```
### POST /webauthn/authentication/verify
Verify an authentication response.
**Request:**
```json
{
"options": { ... },
"assertionResponse": { ... }
}
```
**Response:**
```json
{
"verified": true,
"counter": 1
}
```
## Storage
**Development:** Uses in-memory storage (challenges and credentials).
**Production:** Replace with:
- Redis for challenge storage
- Database for credential persistence
- Session management for user binding
## Security Notes
- Challenges expire after 5 minutes
- Credentials are stored in-memory (lost on restart)
- In production, implement proper credential persistence and user binding
- Use HTTPS in production
- Validate origin and RP ID strictly

1333
server/package-lock.json generated Normal file

File diff suppressed because it is too large

25
server/package.json Normal file

@@ -0,0 +1,25 @@
{
"name": "timesafari-webauthn-server",
"version": "1.0.0",
"description": "WebAuthn verification server for Time Safari",
"type": "module",
"main": "dist/index.js",
"scripts": {
"dev": "tsx watch src/index.ts",
"build": "tsc",
"start": "node dist/index.js"
},
"dependencies": {
"@simplewebauthn/server": "^9.0.0",
"fastify": "^4.24.3",
"zod": "^3.22.4",
"@fastify/cors": "^8.4.0",
"dotenv": "^16.3.1"
},
"devDependencies": {
"@types/node": "^20.10.0",
"tsx": "^4.7.0",
"typescript": "^5.3.3"
}
}

340
server/src/index.ts Normal file

@@ -0,0 +1,340 @@
/**
* WebAuthn Verification Server
*
* Fastify-based server for WebAuthn registration and authentication verification.
* This server handles the server-side verification of WebAuthn credentials.
*
* @author Matthew Raymer
*/
import Fastify from "fastify";
import cors from "@fastify/cors";
import dotenv from "dotenv";
import {
generateRegistrationOptions,
verifyRegistrationResponse,
generateAuthenticationOptions,
verifyAuthenticationResponse,
} from "@simplewebauthn/server";
import type {
PublicKeyCredentialCreationOptionsJSON,
PublicKeyCredentialRequestOptionsJSON,
VerifyRegistrationResponseOpts,
VerifyAuthenticationResponseOpts,
} from "@simplewebauthn/types";
// Load environment variables
dotenv.config();
const fastify = Fastify({
logger: true,
});
// Register CORS
await fastify.register(cors, {
origin: process.env.RP_ORIGIN || "http://localhost:8080",
credentials: true,
});
// Relying Party configuration from environment
const rpId = process.env.RP_ID || "localhost";
const rpName = process.env.RP_NAME || "Time Safari";
const rpOrigin = process.env.RP_ORIGIN || "http://localhost:8080";
// In-memory challenge storage (for development)
// In production, use Redis or a database
interface ChallengeStore {
challenge: string;
userId?: string;
expiresAt: number;
}
const challengeStore = new Map<string, ChallengeStore>();
// Credential storage (in-memory for development)
// In production, use a database
interface StoredCredential {
credentialID: string;
credentialPublicKey: Uint8Array;
counter: number;
userId?: string;
createdAt: number;
}
const credentialStore = new Map<string, StoredCredential>();
// Cleanup expired challenges every 5 minutes
setInterval(() => {
const now = Date.now();
for (const [key, value] of challengeStore.entries()) {
if (value.expiresAt < now) {
challengeStore.delete(key);
}
}
}, 5 * 60 * 1000);
/**
* POST /webauthn/registration/options
* Generate registration options for a new passkey
*/
fastify.post<{
Body: {
username?: string;
userId?: string;
};
}>("/webauthn/registration/options", async (request, reply) => {
try {
const { username, userId } = request.body;
const options = await generateRegistrationOptions({
rpName,
rpID: rpId,
userName: username || rpName + " User",
userID: userId || crypto.randomUUID(),
timeout: 60000,
attestationType: "none",
authenticatorSelection: {
residentKey: "preferred",
userVerification: "preferred",
authenticatorAttachment: "platform",
},
});
// Store challenge for verification
const challengeKey = userId || options.user.id;
challengeStore.set(challengeKey, {
challenge: options.challenge,
userId: userId,
expiresAt: Date.now() + 5 * 60 * 1000, // 5 minutes
});
return options;
} catch (error) {
fastify.log.error(error);
reply.code(500).send({ error: "Failed to generate registration options" });
}
});
/**
* POST /webauthn/registration/verify
* Verify a registration response
*/
fastify.post<{
Body: {
options: PublicKeyCredentialCreationOptionsJSON;
attestationResponse: unknown;
};
}>("/webauthn/registration/verify", async (request, reply) => {
try {
const { options, attestationResponse } = request.body;
// Retrieve stored challenge
const challengeKey = options.user.id;
const storedChallenge = challengeStore.get(challengeKey);
if (!storedChallenge) {
reply.code(400).send({ error: "Challenge not found or expired" });
return;
}
if (storedChallenge.expiresAt < Date.now()) {
challengeStore.delete(challengeKey);
reply.code(400).send({ error: "Challenge expired" });
return;
}
// Verify registration response
const verification = await verifyRegistrationResponse({
response: attestationResponse as any,
expectedChallenge: storedChallenge.challenge,
expectedOrigin: rpOrigin,
expectedRPID: rpId,
});
// Clean up challenge
challengeStore.delete(challengeKey);
if (!verification.verified || !verification.registrationInfo) {
reply.code(400).send({ verified: false, error: "Verification failed" });
return;
}
// Store credential
const credentialID = verification.registrationInfo.credentialID;
credentialStore.set(credentialID, {
credentialID: credentialID,
credentialPublicKey: verification.registrationInfo.credentialPublicKey,
counter: verification.registrationInfo.counter,
userId: storedChallenge.userId,
createdAt: Date.now(),
});
return {
verified: true,
credential: {
credentialID: credentialID,
credentialPublicKey: Array.from(verification.registrationInfo.credentialPublicKey),
counter: verification.registrationInfo.counter,
},
};
} catch (error) {
fastify.log.error(error);
reply.code(500).send({ error: "Verification failed", details: String(error) });
}
});
/**
* POST /webauthn/authentication/options
* Generate authentication options for an existing passkey
*/
fastify.post<{
Body: {
credentialId?: string;
userId?: string;
};
}>("/webauthn/authentication/options", async (request, reply) => {
try {
const { credentialId, userId } = request.body;
// Find credential(s) for user
let credentials: StoredCredential[] = [];
if (credentialId) {
const cred = credentialStore.get(credentialId);
if (cred) {
credentials = [cred];
}
} else if (userId) {
credentials = Array.from(credentialStore.values()).filter(
(c) => c.userId === userId
);
} else {
reply.code(400).send({ error: "credentialId or userId required" });
return;
}
if (credentials.length === 0) {
reply.code(404).send({ error: "Credential not found" });
return;
}
const options = await generateAuthenticationOptions({
rpID: rpId,
allowCredentials: credentials.map((cred) => ({
id: cred.credentialID,
transports: ["internal"],
})),
userVerification: "preferred",
});
// Store challenge for verification
    // One of credentialId/userId is guaranteed by the check above
    const challengeKey = credentialId || userId!;
challengeStore.set(challengeKey, {
challenge: options.challenge,
userId: userId,
expiresAt: Date.now() + 5 * 60 * 1000, // 5 minutes
});
return options;
} catch (error) {
fastify.log.error(error);
reply.code(500).send({ error: "Failed to generate authentication options" });
}
});
/**
* POST /webauthn/authentication/verify
* Verify an authentication response
*/
fastify.post<{
Body: {
options: PublicKeyCredentialRequestOptionsJSON;
assertionResponse: unknown;
};
}>("/webauthn/authentication/verify", async (request, reply) => {
try {
const { options, assertionResponse } = request.body;
// Find credential by ID
const credentialId = (assertionResponse as any).id;
const credential = credentialStore.get(credentialId);
if (!credential) {
reply.code(404).send({ error: "Credential not found" });
return;
}
    // Retrieve stored challenge. Authentication options may have been stored
    // under the credential ID or the user ID, depending on the request body,
    // so check both keys.
    let challengeKey = credentialId;
    let storedChallenge = challengeStore.get(challengeKey);
    if (!storedChallenge && credential.userId) {
      challengeKey = credential.userId;
      storedChallenge = challengeStore.get(challengeKey);
    }
    if (!storedChallenge) {
      reply.code(400).send({ error: "Challenge not found or expired" });
      return;
    }
    if (storedChallenge.expiresAt < Date.now()) {
      challengeStore.delete(challengeKey);
      reply.code(400).send({ error: "Challenge expired" });
      return;
    }
// Verify authentication response
const verification = await verifyAuthenticationResponse({
response: assertionResponse as any,
expectedChallenge: storedChallenge.challenge,
expectedOrigin: rpOrigin,
expectedRPID: rpId,
authenticator: {
credentialID: credential.credentialID,
credentialPublicKey: credential.credentialPublicKey,
counter: credential.counter,
},
});
// Clean up challenge
challengeStore.delete(challengeKey);
if (!verification.verified) {
reply.code(400).send({ verified: false, error: "Verification failed" });
return;
}
// Update counter
if (verification.authenticationInfo) {
credential.counter = verification.authenticationInfo.newCounter;
}
return {
verified: true,
counter: credential.counter,
};
} catch (error) {
fastify.log.error(error);
reply.code(500).send({ error: "Verification failed", details: String(error) });
}
});
/**
* Health check endpoint
*/
fastify.get("/health", async () => {
return { status: "ok", timestamp: new Date().toISOString() };
});
// Start server
const start = async () => {
try {
    const port = parseInt(process.env.PORT || "3002", 10);
const host = process.env.HOST || "0.0.0.0";
await fastify.listen({ port, host });
fastify.log.info(`WebAuthn server listening on ${host}:${port}`);
fastify.log.info(`RP ID: ${rpId}, RP Name: ${rpName}, RP Origin: ${rpOrigin}`);
} catch (err) {
fastify.log.error(err);
process.exit(1);
}
};
start();
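The endpoints above share a single-use, TTL-bounded challenge store. The store's behavior can be sketched in isolation (helper names are hypothetical; the server inlines this logic in each route handler):

```typescript
// Minimal sketch of the in-memory challenge store used by the endpoints above.
interface StoredChallenge {
  challenge: string;
  userId?: string;
  expiresAt: number;
}

const store = new Map<string, StoredChallenge>();

function putChallenge(key: string, challenge: string, ttlMs = 5 * 60 * 1000): void {
  store.set(key, { challenge, expiresAt: Date.now() + ttlMs });
}

// Consume on read so a challenge can never be replayed, even if unexpired.
function takeChallenge(key: string): string | null {
  const entry = store.get(key);
  if (!entry) return null;
  store.delete(key);
  return entry.expiresAt < Date.now() ? null : entry.challenge;
}
```

The server deletes the challenge only after verification rather than on read, but the invariant is the same: each challenge is usable at most once, within five minutes.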

server/tsconfig.json
@@ -0,0 +1,21 @@
{
"compilerOptions": {
"target": "ES2022",
"module": "ES2022",
"lib": ["ES2022"],
"moduleResolution": "node",
"outDir": "./dist",
"rootDir": "./src",
"strict": true,
"esModuleInterop": true,
"skipLibCheck": true,
"forceConsistentCasingInFileNames": true,
"resolveJsonModule": true,
"declaration": true,
"declarationMap": true,
"sourceMap": true
},
"include": ["src/**/*"],
"exclude": ["node_modules", "dist"]
}

@@ -47,6 +47,16 @@ export const DEFAULT_PARTNER_API_SERVER =
export const DEFAULT_PUSH_SERVER =
import.meta.env.VITE_DEFAULT_PUSH_SERVER || AppString.PROD_PUSH_SERVER;
/**
* WebAuthn server endpoint URL
* Defaults to localhost:3002 for development, or can be set via VITE_WEBAUTHN_SERVER_URL
*/
export const DEFAULT_WEBAUTHN_SERVER =
import.meta.env.VITE_WEBAUTHN_SERVER_URL ||
(import.meta.env.DEV || window.location.hostname === "localhost"
? "http://localhost:3002"
: window.location.origin);
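The fallback chain above reads as a small pure function (a sketch; `resolveWebAuthnServer` is not part of the codebase):

```typescript
// Mirrors the resolution order of DEFAULT_WEBAUTHN_SERVER:
// explicit env URL > localhost dev default > same origin in production.
function resolveWebAuthnServer(
  envUrl: string | undefined,
  isDev: boolean,
  hostname: string,
  origin: string,
): string {
  if (envUrl) return envUrl;
  if (isDev || hostname === "localhost") return "http://localhost:3002";
  return origin;
}
```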
export const IMAGE_TYPE_PROFILE = "profile";
export const PASSKEYS_ENABLED =

@@ -94,7 +94,7 @@ const MIGRATIONS = [
id INTEGER PRIMARY KEY AUTOINCREMENT,
dateCreated TEXT NOT NULL,
derivationPath TEXT,
did TEXT NOT NULL,
did TEXT NOT NULL UNIQUE, -- UNIQUE constraint ensures no duplicate DIDs
identityEncrBase64 TEXT, -- encrypted & base64-encoded
mnemonicEncrBase64 TEXT, -- encrypted & base64-encoded
passkeyCredIdHex TEXT,

@@ -31,18 +31,7 @@ export async function updateDidSpecificSettings(
const platform = PlatformServiceFactory.getInstance();
// First, let's see what's currently in the database
const checkResult = await platform.dbQuery(
"SELECT * FROM settings WHERE accountDid = ?",
[accountDid],
);
// Get the current values for comparison
const currentRecord = checkResult?.values?.length
? mapColumnsToValues(checkResult.columns, checkResult.values)[0]
: null;
// First try to update existing record
// Generate and execute the update statement
const { sql: updateSql, params: updateParams } = generateUpdateStatement(
settingsChanges,
"settings",
@@ -50,66 +39,13 @@ export async function updateDidSpecificSettings(
[accountDid],
);
await platform.dbExec(updateSql, updateParams);
// **WORKAROUND**: AbsurdSQL doesn't return changes count correctly
// Instead, check if the record was actually updated
const postUpdateResult = await platform.dbQuery(
"SELECT * FROM settings WHERE accountDid = ?",
[accountDid],
);
const updatedRecord = postUpdateResult?.values?.length
? mapColumnsToValues(postUpdateResult.columns, postUpdateResult.values)[0]
: null;
// Note that we want to eliminate this check (and fix the above if it doesn't work).
// Check if any of the target fields were actually changed
let actuallyUpdated = false;
if (currentRecord && updatedRecord) {
for (const key of Object.keys(settingsChanges)) {
if (key !== "accountDid" && currentRecord[key] !== updatedRecord[key]) {
actuallyUpdated = true;
}
}
}
// If the standard update didn't work, try a different approach
if (
!actuallyUpdated &&
settingsChanges.firstName &&
settingsChanges.isRegistered !== undefined
) {
// Update firstName
await platform.dbExec(
"UPDATE settings SET firstName = ? WHERE accountDid = ?",
[settingsChanges.firstName, accountDid],
);
// Update isRegistered
await platform.dbExec(
"UPDATE settings SET isRegistered = ? WHERE accountDid = ?",
[settingsChanges.isRegistered ? 1 : 0, accountDid],
);
// Check if the individual updates worked
const finalCheckResult = await platform.dbQuery(
"SELECT * FROM settings WHERE accountDid = ?",
[accountDid],
);
const finalRecord = finalCheckResult?.values?.length
? mapColumnsToValues(finalCheckResult.columns, finalCheckResult.values)[0]
: null;
if (finalRecord) {
actuallyUpdated =
finalRecord.firstName === settingsChanges.firstName &&
finalRecord.isRegistered === (settingsChanges.isRegistered ? 1 : 0);
}
}
return actuallyUpdated;
// dbExec() now returns reliable changes count across all platforms
// (normalized using SQLite's changes() function in Capacitor/Electron,
// and reliable from AbsurdSQL in web platform)
const result = await platform.dbExec(updateSql, updateParams);
// Return true if any rows were affected
return result.changes > 0;
}
const DEFAULT_SETTINGS: Settings = {
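The reliable `changes` count relied on above comes from the cross-platform normalizer described in the commit message. A hedged sketch of what such a helper might look like (the real `dbResultNormalizer.ts` is not shown in this diff, so shapes and names are assumed):

```typescript
// Hypothetical sketch in the spirit of dbResultNormalizer.ts.
interface NormalizedDbResult {
  changes: number;
  lastId?: number;
}

function normalizeDbResult(raw: {
  rowsAffected?: number;
  changes?: number;
  lastInsertId?: number;
}): NormalizedDbResult {
  // Prefer the driver's explicit changes value (e.g. from SQLite's changes()),
  // fall back to rowsAffected, and default to 0 so callers can always trust
  // `changes` to be a number.
  const changes = raw.changes ?? raw.rowsAffected ?? 0;
  return raw.lastInsertId !== undefined
    ? { changes, lastId: raw.lastInsertId }
    : { changes };
}
```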

@@ -0,0 +1,73 @@
/**
* Platform Diagnostics Interface
*
* Provides comprehensive diagnostic information about the current platform,
* database backend, worker status, and build information.
*
* @author Matthew Raymer
*/
import { PlatformCapabilities } from "@/services/PlatformService";
/**
* Database backend information
*/
export interface DatabaseDiagnostics {
/** Type of database backend in use */
kind: "absurd-sql" | "capacitor-sqlite" | "electron-sqlite" | "unknown";
/** SharedArrayBuffer availability status (web platform only) */
sharedArrayBuffer?: "available" | "fallback" | "unknown";
/** Worker thread status (web platform only) */
worker?: {
/** Whether the worker is ready to process messages */
ready: boolean;
/** Number of pending messages */
pending: number;
/** Time since last ping in milliseconds */
lastPingMs?: number;
};
/** Operation queue status (Capacitor/Electron platforms) */
queue?: {
/** Current queue length */
current: number;
/** Peak queue size reached */
maxReached: number;
/** Maximum queue size limit */
limit: number;
/** Whether queue is currently processing */
isProcessing: boolean;
};
/** Database initialization status */
initialized: boolean;
}
/**
* Build information
*/
export interface BuildDiagnostics {
/** Application version from package.json */
version?: string;
/** Git commit hash */
commit?: string;
/** Build mode (development, test, production) */
mode?: string;
/** Build timestamp */
timestamp?: string;
}
/**
* Complete platform diagnostics
*/
export interface PlatformDiagnostics {
/** Detected platform */
platform: "web" | "capacitor" | "electron" | "development" | string;
/** Platform capabilities */
capabilities: PlatformCapabilities;
/** Database diagnostics */
db: DatabaseDiagnostics;
/** Build information */
build: BuildDiagnostics;
/** Additional platform-specific diagnostics */
metadata?: Record<string, unknown>;
}
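A diagnostics payload satisfying these interfaces might look like the following (all concrete values are illustrative, and the capability flags are assumed rather than taken from `PlatformService`):

```typescript
// Illustrative PlatformDiagnostics-shaped value for the web platform.
const sampleDiagnostics = {
  platform: "web",
  capabilities: { hasFileSystem: false, hasCamera: true, isNativeApp: false },
  db: {
    kind: "absurd-sql",
    sharedArrayBuffer: "available",
    worker: { ready: true, pending: 0, lastPingMs: 12 },
    initialized: true,
  },
  build: { version: "1.0.0", commit: "abc1234", mode: "development" },
};
```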

@@ -0,0 +1,219 @@
/**
* Client-side WebAuthn Passkey Functions
*
* This module provides client-side WebAuthn operations using @simplewebauthn/browser.
* All verification is performed server-side via API endpoints.
*
* @author Matthew Raymer
*/
import {
startAuthentication,
startRegistration,
} from "@simplewebauthn/browser";
import type {
  PublicKeyCredentialCreationOptionsJSON,
  PublicKeyCredentialRequestOptionsJSON,
} from "@simplewebauthn/types";
import { AppString } from "../../../constants/app";
import { logger } from "../../../utils/logger";
/**
* WebAuthn server endpoint configuration
*/
const getWebAuthnServerUrl = (): string => {
// Check for custom endpoint in settings/env
const customUrl = import.meta.env.VITE_WEBAUTHN_SERVER_URL;
if (customUrl) {
return customUrl;
}
// Default to localhost:3002 for development (matches server default port)
// In production, this should point to your WebAuthn verification service
if (import.meta.env.DEV || window.location.hostname === "localhost") {
return "http://localhost:3002";
}
// Production: use same origin or configured endpoint
return window.location.origin;
};
/**
* Registration result from server verification
*/
export interface RegistrationVerificationResult {
  verified: boolean;
  credential: {
    credentialID: string;
    /** Serialized by the server as a plain number array (see Array.from on the server side) */
    credentialPublicKey: number[];
    counter: number;
  };
}
/**
* Authentication result from server verification
*/
export interface AuthenticationVerificationResult {
verified: boolean;
counter?: number;
}
/**
* Register a new passkey credential
*
* Flow:
* 1. Request registration options from server
* 2. Start registration with browser API
* 3. Send attestation response to server for verification
* 4. Return verified credential info
*
* @param passkeyName - Optional name for the passkey
* @param userId - Optional user ID (if not provided, server generates)
* @returns Verified registration result with credential info
*/
export async function registerPasskey(
passkeyName?: string,
userId?: string
): Promise<RegistrationVerificationResult> {
const serverUrl = getWebAuthnServerUrl();
try {
// Step 1: Request registration options from server
const optionsResponse = await fetch(`${serverUrl}/webauthn/registration/options`, {
method: "POST",
headers: {
"Content-Type": "application/json",
},
body: JSON.stringify({
username: passkeyName || AppString.APP_NAME + " User",
userId: userId,
}),
});
if (!optionsResponse.ok) {
throw new Error(
`Failed to get registration options: ${optionsResponse.statusText}`
);
}
const options: PublicKeyCredentialCreationOptionsJSON =
await optionsResponse.json();
// Step 2: Start registration with browser API
const attestationResponse = await startRegistration(options);
// Step 3: Send attestation response to server for verification
const verifyResponse = await fetch(`${serverUrl}/webauthn/registration/verify`, {
method: "POST",
headers: {
"Content-Type": "application/json",
},
body: JSON.stringify({
options: options,
attestationResponse: attestationResponse,
}),
});
if (!verifyResponse.ok) {
throw new Error(
`Registration verification failed: ${verifyResponse.statusText}`
);
}
const verification: RegistrationVerificationResult =
await verifyResponse.json();
if (!verification.verified) {
throw new Error("Registration verification failed on server");
}
logger.debug("[passkeyDidPeer.client] Registration successful", {
credentialID: verification.credential.credentialID,
});
return verification;
} catch (error) {
logger.error("[passkeyDidPeer.client] Registration failed:", error);
throw error;
}
}
/**
* Authenticate with an existing passkey credential
*
* Flow:
* 1. Request authentication options from server
* 2. Start authentication with browser API
* 3. Send assertion response to server for verification
* 4. Return verification result
*
* @param credentialId - Base64URL encoded credential ID
* @param userId - Optional user ID (if not provided, server looks up by credential)
* @returns Verification result
*/
export async function authenticatePasskey(
credentialId: string,
userId?: string
): Promise<AuthenticationVerificationResult> {
const serverUrl = getWebAuthnServerUrl();
try {
// Step 1: Request authentication options from server
const optionsResponse = await fetch(`${serverUrl}/webauthn/authentication/options`, {
method: "POST",
headers: {
"Content-Type": "application/json",
},
body: JSON.stringify({
credentialId: credentialId,
userId: userId,
}),
});
if (!optionsResponse.ok) {
throw new Error(
`Failed to get authentication options: ${optionsResponse.statusText}`
);
}
const options: PublicKeyCredentialRequestOptionsJSON =
await optionsResponse.json();
// Step 2: Start authentication with browser API
const assertionResponse = await startAuthentication(options);
// Step 3: Send assertion response to server for verification
const verifyResponse = await fetch(`${serverUrl}/webauthn/authentication/verify`, {
method: "POST",
headers: {
"Content-Type": "application/json",
},
body: JSON.stringify({
options: options,
assertionResponse: assertionResponse,
}),
});
if (!verifyResponse.ok) {
throw new Error(
`Authentication verification failed: ${verifyResponse.statusText}`
);
}
const verification: AuthenticationVerificationResult =
await verifyResponse.json();
if (!verification.verified) {
throw new Error("Authentication verification failed on server");
}
logger.debug("[passkeyDidPeer.client] Authentication successful");
return verification;
} catch (error) {
logger.error("[passkeyDidPeer.client] Authentication failed:", error);
throw error;
}
}

@@ -0,0 +1,123 @@
/**
* Offline WebAuthn Verification Module
*
* This module contains server-side WebAuthn verification functions that are
* only available when VITE_OFFLINE_WEBAUTHN_VERIFY is enabled.
*
* SECURITY WARNING: Client-side verification can be tampered with and should
* not be trusted for security-critical operations. This module is intended
* for offline-only mode where server-side verification is not available.
*
* For production deployments, verification should be performed on a server.
*/
// Type-only import: erased at compile time, so @simplewebauthn/server is still
// only pulled in via the dynamic import below.
import type { VerifyAuthenticationResponseOpts } from "@simplewebauthn/server";
import { logger } from "../../../utils/logger";
/**
* Dynamically import server-side verification functions
* This prevents bundling @simplewebauthn/server in normal builds
*/
async function getServerVerification() {
// Check if offline verification is enabled
const offlineVerifyEnabled =
import.meta.env.VITE_OFFLINE_WEBAUTHN_VERIFY === "true";
if (!offlineVerifyEnabled) {
throw new Error(
"Offline WebAuthn verification is disabled. " +
"Set VITE_OFFLINE_WEBAUTHN_VERIFY=true to enable offline mode. " +
"For production, use server-side verification instead."
);
}
try {
// Dynamic import prevents bundling in normal builds
const serverModule = await import("@simplewebauthn/server");
return {
verifyRegistrationResponse: serverModule.verifyRegistrationResponse,
verifyAuthenticationResponse: serverModule.verifyAuthenticationResponse,
generateRegistrationOptions: serverModule.generateRegistrationOptions,
generateAuthenticationOptions: serverModule.generateAuthenticationOptions,
};
} catch (error) {
logger.error(
"[passkeyDidPeer.offlineVerify] Failed to load server verification module:",
error
);
throw new Error(
"Server-side WebAuthn verification module is not available. " +
"This feature requires VITE_OFFLINE_WEBAUTHN_VERIFY=true and @simplewebauthn/server to be installed."
);
}
}
/**
* Verify registration response (offline mode only)
*
* @throws Error if offline verification is not enabled
*/
export async function verifyRegistrationResponseOffline(
response: unknown,
expectedChallenge: string,
expectedOrigin: string,
expectedRPID: string
) {
const { verifyRegistrationResponse } = await getServerVerification();
return verifyRegistrationResponse({
response: response as any,
expectedChallenge,
expectedOrigin,
expectedRPID,
});
}
/**
* Verify authentication response (offline mode only)
*
* @throws Error if offline verification is not enabled
*/
export async function verifyAuthenticationResponseOffline(
opts: VerifyAuthenticationResponseOpts
) {
const { verifyAuthenticationResponse } = await getServerVerification();
return verifyAuthenticationResponse(opts);
}
/**
* Generate registration options (offline mode only)
*
* @throws Error if offline verification is not enabled
*/
export async function generateRegistrationOptionsOffline(opts: {
  rpName: string;
  rpID: string;
  userName: string;
  attestationType?: "none" | "indirect" | "direct" | "enterprise";
  authenticatorSelection?: {
    residentKey?: "discouraged" | "preferred" | "required";
    userVerification?: "discouraged" | "preferred" | "required";
    authenticatorAttachment?: "platform" | "cross-platform";
  };
}) {
const { generateRegistrationOptions } = await getServerVerification();
return generateRegistrationOptions(opts);
}
/**
* Generate authentication options (offline mode only)
*
* @throws Error if offline verification is not enabled
*/
export async function generateAuthenticationOptionsOffline(opts: {
challenge: Uint8Array;
rpID: string;
allowCredentials?: Array<{ id: string }>;
}) {
const { generateAuthenticationOptions } = await getServerVerification();
return generateAuthenticationOptions(opts);
}
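The fail-fast flag check inside `getServerVerification()` can be sketched in isolation (simplified and synchronous; the real function also dynamically imports `@simplewebauthn/server`):

```typescript
// Simplified version of the gate in getServerVerification() above.
function assertOfflineVerifyEnabled(env: Record<string, string | undefined>): void {
  if (env.VITE_OFFLINE_WEBAUTHN_VERIFY !== "true") {
    throw new Error(
      "Offline WebAuthn verification is disabled. " +
        "Set VITE_OFFLINE_WEBAUTHN_VERIFY=true to enable offline mode.",
    );
  }
}
```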

@@ -5,18 +5,21 @@ import {
startAuthentication,
startRegistration,
} from "@simplewebauthn/browser";
import {
generateAuthenticationOptions,
generateRegistrationOptions,
verifyAuthenticationResponse,
verifyRegistrationResponse,
VerifyAuthenticationResponseOpts,
} from "@simplewebauthn/server";
import {
/**
* Client-side WebAuthn Passkey Functions
*
* This module provides client-side WebAuthn operations using @simplewebauthn/browser.
* Server-side verification is isolated in passkeyDidPeer.offlineVerify.ts and only
* available when VITE_OFFLINE_WEBAUTHN_VERIFY=true.
*
* For production deployments, verification should be performed on a server endpoint.
*/
import type {
  Base64URLString,
  PublicKeyCredentialCreationOptionsJSON,
  PublicKeyCredentialRequestOptionsJSON,
  AuthenticatorAssertionResponse,
} from "@simplewebauthn/types";
import type { VerifyAuthenticationResponseOpts } from "@simplewebauthn/server";
import { AppString } from "../../../constants/app";
@@ -36,32 +39,103 @@ export interface JWK {
y: string;
}
/**
* Register a new passkey credential
*
* This is a facade that routes to either:
* - Client module (server-side verification) - default/production
* - Offline verification module (client-side) - only if VITE_OFFLINE_WEBAUTHN_VERIFY=true
*
* @param passkeyName - Optional name for the passkey
* @returns Registration result with credential info
*/
export async function registerCredential(passkeyName?: string) {
const options: PublicKeyCredentialCreationOptionsJSON =
await generateRegistrationOptions({
const offlineVerifyEnabled =
import.meta.env.VITE_OFFLINE_WEBAUTHN_VERIFY === "true";
if (offlineVerifyEnabled) {
// Offline mode: use dynamic import for client-side verification
const {
generateRegistrationOptionsOffline,
verifyRegistrationResponseOffline,
} = await import("./passkeyDidPeer.offlineVerify");
const options = await generateRegistrationOptionsOffline({
rpName: AppString.APP_NAME,
rpID: window.location.hostname,
userName: passkeyName || AppString.APP_NAME + " User",
// Don't prompt users for additional information about the authenticator
// (Recommended for smoother UX)
attestationType: "none",
authenticatorSelection: {
// Defaults
residentKey: "preferred",
userVerification: "preferred",
// Optional
authenticatorAttachment: "platform",
},
});
    // someday, instead of simplewebauthn, we'll go direct: navigator.credentials.create with PublicKeyCredentialCreationOptions
// with pubKeyCredParams: { type: "public-key", alg: -7 }
const attResp = await startRegistration(options);
const verification = await verifyRegistrationResponse({
response: attResp,
expectedChallenge: options.challenge,
expectedOrigin: window.location.origin,
expectedRPID: window.location.hostname,
});
const attResp = await startRegistration(options);
const verification = await verifyRegistrationResponseOffline(
attResp,
options.challenge,
window.location.origin,
window.location.hostname
);
return extractCredentialInfo(attResp, verification);
} else {
// Production mode: use client module with server-side verification
const { registerPasskey } = await import("./passkeyDidPeer.client");
const verification = await registerPasskey(passkeyName);
// Convert server response to expected format
const credIdBase64Url = verification.credential.credentialID;
const credIdHex = Buffer.from(
base64URLStringToArrayBuffer(credIdBase64Url),
).toString("hex");
    // The server JSON-serializes the public key as a plain number array;
    // rebuild the bytes before CBOR-decoding
    const publicKeyBytes = new Uint8Array(
      verification.credential.credentialPublicKey,
    );
    const { publicKeyJwk } = cborToKeys(publicKeyBytes);
    return {
      authData: undefined, // Not available from server response
      credIdHex: credIdHex,
      publicKeyJwk: publicKeyJwk,
      publicKeyBytes: publicKeyBytes,
    };
}
}
/**
* Extract credential info from attestation response and verification result
* Used by offline mode
*/
function extractCredentialInfo(
attResp: any,
verification: any
): {
authData: unknown;
credIdHex: string;
publicKeyJwk: JWK;
publicKeyBytes: Uint8Array;
} {
const credIdBase64Url = verification.registrationInfo?.credentialID as string;
if (attResp.rawId !== credIdBase64Url) {
logger.warn("Warning! The raw ID does not match the credential ID.");
}
const credIdHex = Buffer.from(
base64URLStringToArrayBuffer(credIdBase64Url),
).toString("hex");
const { publicKeyJwk } = cborToKeys(
verification.registrationInfo?.credentialPublicKey as Uint8Array,
);
return {
authData: verification.registrationInfo?.attestationObject,
credIdHex: credIdHex,
publicKeyJwk: publicKeyJwk,
publicKeyBytes: verification.registrationInfo
?.credentialPublicKey as Uint8Array,
};
}
// references for parsing auth data and getting the public key
// https://github.com/MasterKale/SimpleWebAuthn/blob/master/packages/server/src/helpers/parseAuthenticatorData.ts#L11
@@ -113,12 +187,32 @@ export class PeerSetup {
};
this.challenge = new Uint8Array(Buffer.from(JSON.stringify(fullPayload)));
// const payloadHash: Uint8Array = sha256(this.challenge);
const options: PublicKeyCredentialRequestOptionsJSON =
await generateAuthenticationOptions({
// Use offline verification if enabled
const offlineVerifyEnabled =
import.meta.env.VITE_OFFLINE_WEBAUTHN_VERIFY === "true";
let options: PublicKeyCredentialRequestOptionsJSON;
if (offlineVerifyEnabled) {
const { generateAuthenticationOptionsOffline } = await import(
"./passkeyDidPeer.offlineVerify"
);
options = await generateAuthenticationOptionsOffline({
challenge: this.challenge,
rpID: window.location.hostname,
allowCredentials: [{ id: credentialId }],
});
} else {
// Production mode: should use server endpoint
// For now, fall back to direct navigator.credentials.get
// TODO: Implement server endpoint for authentication options
options = {
challenge: arrayBufferToBase64URLString(this.challenge.buffer),
rpId: window.location.hostname,
allowCredentials: [{ id: credentialId, type: "public-key" }],
userVerification: "preferred",
} as PublicKeyCredentialRequestOptionsJSON;
}
// console.log("simple authentication options", options);
const clientAuth = await startAuthentication(options);
@@ -345,6 +439,22 @@ export async function verifyJwtSimplewebauthn(
clientDataJsonBase64Url: Base64URLString,
signature: Base64URLString,
) {
// Only allow offline verification if explicitly enabled
const offlineVerifyEnabled =
import.meta.env.VITE_OFFLINE_WEBAUTHN_VERIFY === "true";
if (!offlineVerifyEnabled) {
throw new Error(
"Client-side WebAuthn verification is disabled for security. " +
"Please use server-side verification endpoint or enable offline mode " +
"with VITE_OFFLINE_WEBAUTHN_VERIFY=true (not recommended for production)."
);
}
const { verifyAuthenticationResponseOffline } = await import(
"./passkeyDidPeer.offlineVerify"
);
const authData = arrayToBase64Url(Buffer.from(authenticatorData));
const publicKeyBytes = peerDidToPublicKeyBytes(issuerDid);
const credId = arrayBufferToBase64URLString(
@@ -372,7 +482,7 @@ export async function verifyJwtSimplewebauthn(
type: "public-key",
},
};
const verification = await verifyAuthenticationResponse(authOpts);
const verification = await verifyAuthenticationResponseOffline(authOpts);
return verification.verified;
}

@@ -3,7 +3,8 @@ import { logger } from "./utils/logger";
const platform = process.env.VITE_PLATFORM;
// PWA service worker is automatically registered by VitePWA plugin
// Note: PWA functionality is currently not implemented.
// Service worker registration would be handled here when PWA support is added.
const app = initializeApp();

@@ -5,31 +5,21 @@ if (typeof window === "undefined") {
globalThis.window = globalThis;
// Enhanced crypto polyfill
// SECURITY: Never use Math.random() for cryptographic operations
// If crypto is missing, fail fast rather than silently using insecure randomness
if (typeof crypto === "undefined") {
globalThis.crypto = {
getRandomValues: (array) => {
// Simple fallback for worker context
for (let i = 0; i < array.length; i++) {
array[i] = Math.floor(Math.random() * 256);
}
return array;
},
subtle: {
generateKey: async () => ({ type: "secret" }),
sign: async () => new Uint8Array(32),
verify: async () => true,
digest: async () => new Uint8Array(32),
},
};
throw new Error(
"[SQLWorker] crypto API is not available in worker context. " +
"This is required for secure database operations. " +
"Please ensure the worker is running in a secure context with crypto support."
);
} else if (!crypto.getRandomValues) {
// Crypto exists but doesn't have getRandomValues - extend it
crypto.getRandomValues = (array) => {
// Simple fallback for worker context
for (let i = 0; i < array.length; i++) {
array[i] = Math.floor(Math.random() * 256);
}
return array;
};
// Crypto exists but doesn't have getRandomValues - fail fast
throw new Error(
"[SQLWorker] crypto.getRandomValues is not available. " +
"This is required for secure database operations. " +
"Please ensure the worker environment supports the Web Crypto API."
);
}
}

@@ -280,6 +280,17 @@ const routes: Array<RouteRecordRaw> = [
name: "test",
component: () => import("../views/TestView.vue"),
},
{
path: "/debug/diagnostics",
name: "debug-diagnostics",
component: () => import("../views/debug/PlatformDiagnosticsView.vue"),
meta: {
title: "Platform Diagnostics",
requiresAuth: false,
// Only show in dev mode or if explicitly enabled in settings
devOnly: true,
},
},
{
path: "/user-profile/:id?",
name: "user-profile",

@@ -1,17 +1,13 @@
// **WORKER-COMPATIBLE CRYPTO POLYFILL**: Must be at the very top
// This prevents "crypto is not defined" errors when running in worker context
// **SECURITY**: Crypto API is required for secure database operations
// This service runs in a worker context where crypto should be available via Web Crypto API
// If crypto is missing, fail fast rather than silently using insecure Math.random()
// This matches the fail-fast approach in registerSQLWorker.js
if (typeof window === "undefined" && typeof crypto === "undefined") {
// eslint-disable-next-line @typescript-eslint/no-explicit-any
(globalThis as any).crypto = {
// eslint-disable-next-line @typescript-eslint/no-explicit-any
getRandomValues: (array: any) => {
// Simple fallback for worker context
for (let i = 0; i < array.length; i++) {
array[i] = Math.floor(Math.random() * 256);
}
return array;
},
};
throw new Error(
"[AbsurdSqlDatabaseService] crypto API is not available. " +
"This is required for secure database operations. " +
"Please ensure the worker is running in a secure context with crypto support."
);
}
import initSqlJs from "@jlongster/sql.js";

@@ -0,0 +1,169 @@
/**
* Diagnostic Export Service
*
* Provides functionality to export comprehensive diagnostic information
* including platform diagnostics, settings, logs, and build information.
*
* @author Matthew Raymer
*/
import { PlatformServiceFactory } from "./PlatformServiceFactory";
import { retrieveSettingsForActiveAccount } from "@/db/databaseUtil";
import type { PlatformDiagnostics } from "@/interfaces/diagnostics";
import { logger, getMemoryLogs } from "@/utils/logger";
/**
* Redacts sensitive information from diagnostic data
*/
function redactSensitive(data: unknown): unknown {
if (typeof data !== "object" || data === null) {
return data;
}
if (Array.isArray(data)) {
return data.map(redactSensitive);
}
const redacted: Record<string, unknown> = {};
const sensitiveKeys = [
"privateKey",
"privateKeyHex",
"mnemonic",
"secret",
"password",
"token",
"apiKey",
"identityEncrBase64",
"mnemonicEncrBase64",
];
for (const [key, value] of Object.entries(data)) {
if (sensitiveKeys.some((sk) => key.toLowerCase().includes(sk.toLowerCase()))) {
redacted[key] = "[REDACTED]";
} else if (typeof value === "object" && value !== null) {
redacted[key] = redactSensitive(value);
} else {
redacted[key] = value;
}
}
return redacted;
}
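Applied to a hypothetical settings object, the recursion above replaces matching keys at any depth while leaving other values untouched. A self-contained demonstration (the `redact` helper below is a compact copy of `redactSensitive` with a trimmed key list, and the sample data is made up):

```typescript
// Compact copy of the redaction logic above, for demonstration only.
const SENSITIVE_KEYS = ["mnemonic", "password", "token", "apiKey"];

function redact(data: unknown): unknown {
  if (typeof data !== "object" || data === null) return data;
  if (Array.isArray(data)) return data.map(redact);
  const out: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(data)) {
    out[key] = SENSITIVE_KEYS.some((sk) => key.toLowerCase().includes(sk.toLowerCase()))
      ? "[REDACTED]"
      : redact(value);
  }
  return out;
}

// Hypothetical sample input: apiToken matches "token", and the nested
// mnemonicEncrBase64 matches "mnemonic", so both are redacted.
const sample = {
  accountDid: "did:peer:example",
  apiToken: "secret-value",
  nested: { mnemonicEncrBase64: "..." },
};
```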
/**
* Exports comprehensive diagnostic bundle
*
* @returns Promise resolving to diagnostic bundle as JSON string
*/
export async function exportDiagnostics(): Promise<string> {
const platform = PlatformServiceFactory.getInstance();
const timestamp = new Date().toISOString();
// Collect diagnostics
const diagnostics: PlatformDiagnostics | null = platform.getDiagnostics
? await platform.getDiagnostics()
: null;
// Collect settings
let settingsDefault = null; // default-account settings: no retrieval API available yet
let settingsActive = null;
try {
settingsActive = await retrieveSettingsForActiveAccount();
} catch (error) {
logger.debug("[DiagnosticExport] Could not retrieve settings:", error);
}
// Collect recent logs from memory
let memoryLogs: string[] = [];
try {
memoryLogs = getMemoryLogs(1000); // Get last 1000 log entries
} catch (error) {
logger.debug("[DiagnosticExport] Could not retrieve memory logs:", error);
}
// Collect recent logs from database (if logs table exists)
let dbLogs: unknown[] = [];
try {
const logsResult = await platform.dbQuery(
"SELECT * FROM logs ORDER BY date DESC LIMIT 100"
);
if (logsResult?.values) {
dbLogs = logsResult.values.map((row) => {
const logEntry: Record<string, unknown> = {};
if (logsResult.columns && row) {
logsResult.columns.forEach((col, idx) => {
logEntry[col] = row[idx];
});
}
return logEntry;
});
}
} catch (error) {
logger.debug("[DiagnosticExport] Could not retrieve DB logs:", error);
}
// Get build info
let packageJson: { version?: string } = {};
try {
// Dynamic JSON import returns a module namespace; the parsed object is on .default
packageJson = (await import("../../../package.json")).default;
} catch (error) {
logger.debug("[DiagnosticExport] Could not load package.json:", error);
}
// Get git commit hash if available
let commitHash: string | undefined;
try {
// Set at build time via Vite define (VITE_GIT_HASH)
commitHash = import.meta.env.VITE_GIT_HASH;
} catch {
// Ignore - not defined in this build
}
// Assemble diagnostic bundle
const bundle = {
timestamp,
version: "1.0",
diagnostics: diagnostics ? redactSensitive(diagnostics) : null,
settings: {
active: redactSensitive(settingsActive),
default: redactSensitive(settingsDefault),
},
logs: {
memory: redactSensitive(memoryLogs),
database: redactSensitive(dbLogs),
},
build: {
version: packageJson.version,
commit: commitHash,
mode: import.meta.env.MODE,
platform: import.meta.env.VITE_PLATFORM,
},
};
return JSON.stringify(bundle, null, 2);
}
/**
* Exports diagnostic bundle to file
*
* @param fileName - Optional custom filename (default: diagnostics-{timestamp}.json)
* @returns Promise that resolves when file is exported
*/
export async function exportDiagnosticsToFile(
fileName?: string
): Promise<void> {
const platform = PlatformServiceFactory.getInstance();
const timestamp = new Date().toISOString().replace(/[:.]/g, "-");
const defaultFileName = fileName || `diagnostics-${timestamp}.json`;
try {
const bundle = await exportDiagnostics();
await platform.writeAndShareFile(defaultFileName, bundle);
logger.log(`[DiagnosticExport] Diagnostic bundle exported: ${defaultFileName}`);
} catch (error) {
logger.error("[DiagnosticExport] Failed to export diagnostics:", error);
throw error;
}
}
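The key-substring matching in `redactSensitive` above can be exercised in isolation. A reduced standalone sketch (shorter key list; `redact` is a hypothetical stand-in):

```typescript
// Standalone sketch of the redaction pass: case-insensitive substring
// match on key names, recursing into nested objects and arrays.
const sensitiveKeys = ["privateKey", "mnemonic", "secret", "password", "token", "apiKey"];

function redact(data: unknown): unknown {
  if (typeof data !== "object" || data === null) return data;
  if (Array.isArray(data)) return data.map(redact);
  const out: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(data)) {
    if (sensitiveKeys.some((sk) => key.toLowerCase().includes(sk.toLowerCase()))) {
      out[key] = "[REDACTED]";
    } else if (typeof value === "object" && value !== null) {
      out[key] = redact(value);
    } else {
      out[key] = value;
    }
  }
  return out;
}
```

The substring match is deliberately broad: a key like `apiToken` is caught by `token`, at the cost of occasionally redacting benign keys.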


@@ -215,7 +215,10 @@ export interface PlatformService {
*/
registerServiceWorker?(): void;
// --- Diagnostics (optional, for debugging) ---
/**
* Gets comprehensive diagnostic information about the platform
* @returns Promise resolving to platform diagnostics
*/
getDiagnostics?(): Promise<import("@/interfaces/diagnostics").PlatformDiagnostics>;
}
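Because `getDiagnostics` is optional, callers must feature-detect it before calling. A reduced synchronous sketch of that pattern (the real method returns a `Promise`; the interface shape here is a hypothetical stand-in):

```typescript
// Feature-detection for an optional interface method.
interface MaybeDiagnostics {
  getDiagnostics?(): { platform: string };
}

function collect(svc: MaybeDiagnostics): { platform: string } | null {
  // Only call the hook when the platform service implements it
  return svc.getDiagnostics ? svc.getDiagnostics() : null;
}
```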


@@ -39,7 +39,9 @@ export class PlatformServiceFactory {
}
// Only log when actually creating the instance
// Use import.meta.env for Vite environment variables (standard Vite pattern);
// process.env.VITE_PLATFORM, defined via Vite's define config, is the fallback
const platform = (import.meta.env?.VITE_PLATFORM || process.env.VITE_PLATFORM || "web") as string;
if (!PlatformServiceFactory.creationLogged) {
// Use console for critical startup message to avoid circular dependency


@@ -0,0 +1,155 @@
/**
* Database Result Normalizer
*
* Provides shared logic to normalize database execution results across
* platforms, ensuring reliable changes count and last insert row ID.
*
* This addresses platform-specific inconsistencies where plugins may not
* return reliable change counts. The normalizer queries SQLite's connection
* state directly when plugin-provided values are missing or unreliable.
*
* @author Matthew Raymer
*/
import { logger } from "@/utils/logger";
/**
* Result from a database run operation
*/
interface RunResult {
changes?:
| number
| {
changes?: number;
lastId?: number;
};
lastId?: number;
}
/**
* Normalized database execution result
*/
export interface NormalizedRunResult {
changes: number;
lastId?: number;
}
/**
* Query function type for fallback queries
* Must use the same database connection to ensure changes() is accurate
*/
type QueryFunction = (
sql: string,
params?: unknown[]
) => Promise<{
values?: Array<Record<string, unknown>>;
columns?: string[];
}>;
/**
* Normalizes a database run result to ensure reliable changes count
*
* Strategy:
* 1. Prefer plugin-provided values if present and numeric
* 2. Fall back to querying SQLite connection state (changes(), last_insert_rowid())
* 3. Return normalized result with guaranteed numeric changes count
*
* @param runResult - Raw result from database plugin
* @param queryFn - Optional query function for fallback (must use same connection)
* @returns Promise resolving to normalized result with reliable changes count
*/
export async function normalizeRunResult(
runResult: RunResult,
queryFn?: QueryFunction
): Promise<NormalizedRunResult> {
let changes = 0;
let lastId: number | undefined;
// Extract plugin-provided values (handle different plugin response shapes)
if (typeof runResult.changes === "object" && runResult.changes !== null) {
changes = Number(runResult.changes.changes) || 0;
lastId =
runResult.changes.lastId !== undefined
? Number(runResult.changes.lastId)
: undefined;
} else if (runResult.changes !== undefined) {
changes = Number(runResult.changes) || 0;
}
if (runResult.lastId !== undefined && lastId === undefined) {
lastId = Number(runResult.lastId);
}
// If we have a query function and changes is 0 (or missing), query SQLite directly
// This ensures correctness even if plugin doesn't return reliable counts
if (queryFn && (changes === 0 || runResult.changes === undefined)) {
try {
// Query SQLite's changes() function for the actual number of rows affected
// This must use the same connection to get accurate results
const changesResult = await queryFn("SELECT changes() AS changes");
if (
changesResult.values &&
changesResult.values.length > 0 &&
changesResult.values[0]
) {
const changesValue = Object.values(changesResult.values[0])[0];
if (typeof changesValue === "number") {
changes = changesValue;
}
}
// Query last_insert_rowid() for INSERT statements
const lastIdResult = await queryFn("SELECT last_insert_rowid() AS lastId");
if (
lastIdResult.values &&
lastIdResult.values.length > 0 &&
lastIdResult.values[0]
) {
const lastIdValue = Object.values(lastIdResult.values[0])[0];
if (typeof lastIdValue === "number" && lastIdValue > 0) {
lastId = lastIdValue;
}
}
} catch (error) {
// If querying SQLite state fails, log but don't fail the operation
// Fall back to plugin-provided values (which may be 0)
logger.debug(
"[dbResultNormalizer] Failed to query SQLite state, using plugin values:",
error
);
}
}
return {
changes: Math.max(0, changes), // Ensure non-negative
lastId: lastId && lastId > 0 ? lastId : undefined,
};
}
/**
* Synchronous version that uses provided values only
* Use this when query function is not available or not needed
*/
export function normalizeRunResultSync(
runResult: RunResult
): NormalizedRunResult {
let changes = 0;
let lastId: number | undefined;
if (typeof runResult.changes === "object" && runResult.changes !== null) {
changes = Number(runResult.changes.changes) || 0;
lastId =
runResult.changes.lastId !== undefined
? Number(runResult.changes.lastId)
: undefined;
} else if (runResult.changes !== undefined) {
changes = Number(runResult.changes) || 0;
}
if (runResult.lastId !== undefined && lastId === undefined) {
lastId = Number(runResult.lastId);
}
return {
changes: Math.max(0, changes),
lastId: lastId && lastId > 0 ? lastId : undefined,
};
}
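The precedence rules in `normalizeRunResultSync` above can be illustrated standalone. A reduced sketch (`normalizeSync` and `RunShape` are stand-in names mirroring the logic in this file):

```typescript
// Reduced re-statement of the sync normalization rules: the nested plugin
// shape wins, then flat numbers; non-positive lastId values are dropped.
type RunShape = {
  changes?: number | { changes?: number; lastId?: number };
  lastId?: number;
};

function normalizeSync(r: RunShape): { changes: number; lastId?: number } {
  let changes = 0;
  let lastId: number | undefined;
  if (typeof r.changes === "object" && r.changes !== null) {
    changes = Number(r.changes.changes) || 0;
    lastId = r.changes.lastId !== undefined ? Number(r.changes.lastId) : undefined;
  } else if (r.changes !== undefined) {
    changes = Number(r.changes) || 0;
  }
  if (r.lastId !== undefined && lastId === undefined) {
    lastId = Number(r.lastId);
  }
  return {
    changes: Math.max(0, changes), // ensure non-negative
    lastId: lastId && lastId > 0 ? lastId : undefined,
  };
}
```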


@@ -23,6 +23,8 @@ import {
} from "../PlatformService";
import { logger } from "../../utils/logger";
import { BaseDatabaseService } from "./BaseDatabaseService";
import type { PlatformDiagnostics } from "@/interfaces/diagnostics";
import { normalizeRunResult } from "../dbResultNormalizer";
interface QueuedOperation {
type: "run" | "query" | "rawQuery";
@@ -54,6 +56,8 @@ export class CapacitorPlatformService
private initializationPromise: Promise<void> | null = null;
private operationQueue: Array<QueuedOperation> = [];
private isProcessingQueue: boolean = false;
private readonly MAX_QUEUE_SIZE = 1000; // Maximum queue size before failing
private maxQueueSizeReached = 0; // Track peak queue size for telemetry
constructor() {
super();
@@ -217,14 +221,34 @@ export class CapacitorPlatformService
let result: unknown;
switch (operation.type) {
case "run": {
// Execute the statement
const runResult = await this.db.run(
operation.sql,
operation.params,
);
// Normalize using shared normalizer with query fallback
// The query function uses the same connection to ensure changes() is accurate
const normalized = await normalizeRunResult(
runResult,
async (sql: string, params?: unknown[]) => {
const queryResult = await this.db.query(sql, params || []);
return {
values: queryResult.values?.map((row) => {
const obj: Record<string, unknown> = {};
if (queryResult.columns) {
queryResult.columns.forEach((col, idx) => {
obj[col] = row[idx];
});
}
return obj;
}),
columns: queryResult.columns,
};
}
);
result = normalized;
break;
}
case "query": {
@@ -371,6 +395,31 @@ export class CapacitorPlatformService
});
return new Promise<R>((resolve, reject) => {
// Queue size guard: prevent memory exhaustion from unbounded queue growth
if (this.operationQueue.length >= this.MAX_QUEUE_SIZE) {
const error = new Error(
`Database operation queue is full (${this.MAX_QUEUE_SIZE} operations). ` +
`This usually indicates the database is not initializing properly or operations are being queued too quickly.`
);
logger.error(
`[CapacitorPlatformService] Queue size limit reached: ${this.operationQueue.length}/${this.MAX_QUEUE_SIZE}`,
);
reject(error);
return;
}
// Track peak queue size for telemetry
if (this.operationQueue.length > this.maxQueueSizeReached) {
this.maxQueueSizeReached = this.operationQueue.length;
}
// Log warning if queue is getting large (but not at limit yet)
if (this.operationQueue.length > this.MAX_QUEUE_SIZE * 0.8) {
logger.warn(
`[CapacitorPlatformService] Queue size is high: ${this.operationQueue.length}/${this.MAX_QUEUE_SIZE}`,
);
}
// Create completely plain objects that Vue cannot make reactive
// Step 1: Deep clone the converted params to ensure they're plain objects
const plainParams = JSON.parse(JSON.stringify(convertedParams));
@@ -865,6 +914,27 @@ export class CapacitorPlatformService
};
}
/**
* Gets telemetry information about the database operation queue.
* Useful for debugging and monitoring queue health.
* @returns Queue telemetry data
*/
getQueueTelemetry(): {
currentSize: number;
maxSize: number;
peakSize: number;
isProcessing: boolean;
initialized: boolean;
} {
return {
currentSize: this.operationQueue.length,
maxSize: this.MAX_QUEUE_SIZE,
peakSize: this.maxQueueSizeReached,
isProcessing: this.isProcessingQueue,
initialized: this.initialized,
};
}
/**
* Checks and requests storage permissions if needed
* @returns Promise that resolves when permissions are granted
@@ -1409,6 +1479,38 @@ export class CapacitorPlatformService
// --- PWA/Web-only methods (no-op for Capacitor) ---
public registerServiceWorker(): void {}
/**
* Gets comprehensive diagnostic information about the Capacitor platform
* @returns Promise resolving to platform diagnostics
*/
async getDiagnostics(): Promise<PlatformDiagnostics> {
const platform = Capacitor.getPlatform();
const queueTelemetry = this.getQueueTelemetry();
return {
platform: "capacitor",
capabilities: this.getCapabilities(),
db: {
kind: "capacitor-sqlite",
queue: {
current: queueTelemetry.currentSize,
maxReached: queueTelemetry.peakSize,
limit: queueTelemetry.maxSize,
isProcessing: queueTelemetry.isProcessing,
},
initialized: this.initialized,
},
build: {
appVersion: import.meta.env.VITE_APP_VERSION,
mode: import.meta.env.MODE,
gitHash: import.meta.env.VITE_GIT_HASH,
},
metadata: {
nativePlatform: platform,
},
};
}
// Database utility methods - inherited from BaseDatabaseService
// generateInsertStatement, updateDefaultSettings, updateActiveDid,
// getActiveIdentity, insertNewDidIntoSettings, updateDidSpecificSettings,
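The queue guard added above (hard limit, 80% warning threshold, peak tracking) can be sketched in isolation. `enqueue` is a hypothetical stand-in for the service's internal queueing; the constants mirror the ones above:

```typescript
// Bounded-queue guard: reject at the hard limit, warn above 80% of it,
// and record the peak size for telemetry.
const MAX_QUEUE_SIZE = 1000;
const queue: string[] = [];
let peakSize = 0;

function enqueue(op: string): boolean {
  if (queue.length >= MAX_QUEUE_SIZE) {
    console.error(`queue full: ${queue.length}/${MAX_QUEUE_SIZE}`);
    return false; // the real service rejects the caller's promise here
  }
  if (queue.length > MAX_QUEUE_SIZE * 0.8) {
    console.warn(`queue high: ${queue.length}/${MAX_QUEUE_SIZE}`);
  }
  queue.push(op);
  if (queue.length > peakSize) peakSize = queue.length;
  return true;
}
```

Rejecting new work at the limit trades availability for bounded memory: a stuck initialization surfaces as an explicit error instead of unbounded queue growth.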


@@ -22,6 +22,7 @@
import { CapacitorPlatformService } from "./CapacitorPlatformService";
import { logger } from "../../utils/logger";
import type { PlatformDiagnostics } from "@/interfaces/diagnostics";
/**
* Electron-specific platform service implementation.
@@ -164,6 +165,24 @@ export class ElectronPlatformService extends CapacitorPlatformService {
return false;
}
/**
* Gets comprehensive diagnostic information about the Electron platform
* @returns Promise resolving to platform diagnostics
*/
async getDiagnostics(): Promise<PlatformDiagnostics> {
const baseDiagnostics = await super.getDiagnostics();
return {
...baseDiagnostics,
platform: "electron",
capabilities: this.getCapabilities(),
metadata: {
...baseDiagnostics.metadata,
electronVersion: process.versions?.electron,
nodeVersion: process.versions?.node,
},
};
}
// --- PWA/Web-only methods (no-op for Electron) ---
public registerServiceWorker(): void {}
}


@@ -6,6 +6,7 @@ import {
import { logger } from "../../utils/logger";
import { QueryExecResult } from "@/interfaces/database";
import { BaseDatabaseService } from "./BaseDatabaseService";
import type { PlatformDiagnostics } from "@/interfaces/diagnostics";
// Dynamic import of initBackend to prevent worker context errors
import type {
WorkerRequest,
@@ -673,6 +674,42 @@ export class WebPlatformService
// SharedArrayBuffer initialization is handled by initBackend call in initializeWorker
}
/**
* Gets comprehensive diagnostic information about the web platform
* @returns Promise resolving to platform diagnostics
*/
async getDiagnostics(): Promise<PlatformDiagnostics> {
const platform = (import.meta.env?.VITE_PLATFORM || process.env.VITE_PLATFORM || "web") as string;
const sabAvailable = typeof SharedArrayBuffer !== "undefined";
// Get version from build-time env var if available
let version: string | undefined;
try {
version = import.meta.env.VITE_APP_VERSION;
} catch {
// Ignore
}
return {
platform,
capabilities: this.getCapabilities(),
db: {
kind: "absurd-sql",
sharedArrayBuffer: sabAvailable ? "available" : "fallback",
worker: {
ready: this.workerReady,
pending: this.pendingMessages.size,
},
initialized: this.workerReady,
},
build: {
appVersion: version,
mode: import.meta.env.MODE,
gitHash: import.meta.env.VITE_GIT_HASH,
},
};
}
// Database utility methods - inherited from BaseDatabaseService
// generateInsertStatement, updateDefaultSettings, updateActiveDid,
// getActiveIdentity, insertNewDidIntoSettings, updateDidSpecificSettings,
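The `sharedArrayBuffer` field above comes from a simple capability probe; a minimal sketch (`sabStatus` is a hypothetical helper name):

```typescript
// SharedArrayBuffer is only exposed in cross-origin-isolated contexts
// (COOP/COEP headers set); otherwise the service reports a fallback mode.
function sabStatus(): "available" | "fallback" {
  return typeof SharedArrayBuffer !== "undefined" ? "available" : "fallback";
}
```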


@@ -0,0 +1,269 @@
<template>
<div class="platform-diagnostics-view p-6 max-w-6xl mx-auto">
<div class="mb-6">
<h1 class="text-3xl font-bold mb-2">Platform Diagnostics</h1>
<p class="text-gray-600">
Comprehensive diagnostic information about the current platform, database,
and build configuration.
</p>
</div>
<div class="mb-4 flex gap-4">
<button
@click="refreshDiagnostics"
:disabled="loading"
class="px-4 py-2 bg-blue-600 text-white rounded hover:bg-blue-700 disabled:opacity-50"
>
{{ loading ? "Loading..." : "Refresh" }}
</button>
<button
@click="exportDiagnostics"
:disabled="loading || !diagnostics"
class="px-4 py-2 bg-green-600 text-white rounded hover:bg-green-700 disabled:opacity-50"
>
Export Diagnostics Bundle
</button>
</div>
<div v-if="error" class="mb-4 p-4 bg-red-100 border border-red-400 text-red-700 rounded">
{{ error }}
</div>
<div v-if="diagnostics" class="space-y-6">
<!-- Platform Info -->
<div class="bg-white rounded-lg shadow p-6">
<h2 class="text-xl font-semibold mb-4">Platform Information</h2>
<dl class="grid grid-cols-2 gap-4">
<div>
<dt class="font-medium text-gray-700">Platform</dt>
<dd class="text-gray-900">{{ diagnostics.platform }}</dd>
</div>
<div>
<dt class="font-medium text-gray-700">Build Mode</dt>
<dd class="text-gray-900">{{ diagnostics.build.mode || "N/A" }}</dd>
</div>
<div>
<dt class="font-medium text-gray-700">App Version</dt>
<dd class="text-gray-900">{{ diagnostics.build.appVersion || "N/A" }}</dd>
</div>
<div>
<dt class="font-medium text-gray-700">Git Hash</dt>
<dd class="text-gray-900 font-mono text-sm">
{{ diagnostics.build.gitHash || "N/A" }}
</dd>
</div>
</dl>
</div>
<!-- Capabilities -->
<div class="bg-white rounded-lg shadow p-6">
<h2 class="text-xl font-semibold mb-4">Platform Capabilities</h2>
<dl class="grid grid-cols-2 gap-4">
<div v-for="(value, key) in diagnostics.capabilities" :key="key">
<dt class="font-medium text-gray-700">{{ formatKey(key) }}</dt>
<dd class="text-gray-900">
<span
:class="
value
? 'text-green-600 font-semibold'
: 'text-gray-400'
"
>
{{ value ? "✓ Yes" : "✗ No" }}
</span>
</dd>
</div>
</dl>
</div>
<!-- Database Info -->
<div class="bg-white rounded-lg shadow p-6">
<h2 class="text-xl font-semibold mb-4">Database Information</h2>
<dl class="grid grid-cols-2 gap-4">
<div>
<dt class="font-medium text-gray-700">Backend Type</dt>
<dd class="text-gray-900">{{ diagnostics.db.kind }}</dd>
</div>
<div>
<dt class="font-medium text-gray-700">Initialized</dt>
<dd class="text-gray-900">
<span
:class="
diagnostics.db.initialized
? 'text-green-600 font-semibold'
: 'text-red-600 font-semibold'
"
>
{{ diagnostics.db.initialized ? "✓ Yes" : "✗ No" }}
</span>
</dd>
</div>
<div v-if="diagnostics.db.sharedArrayBuffer">
<dt class="font-medium text-gray-700">SharedArrayBuffer</dt>
<dd class="text-gray-900">
<span
:class="
diagnostics.db.sharedArrayBuffer === 'available'
? 'text-green-600 font-semibold'
: 'text-yellow-600 font-semibold'
"
>
{{ diagnostics.db.sharedArrayBuffer }}
</span>
</dd>
</div>
</dl>
<!-- Worker Status (Web Platform) -->
<div v-if="diagnostics.db.worker" class="mt-4 pt-4 border-t">
<h3 class="font-semibold mb-2">Worker Status</h3>
<dl class="grid grid-cols-2 gap-4">
<div>
<dt class="font-medium text-gray-700">Ready</dt>
<dd class="text-gray-900">
<span
:class="
diagnostics.db.worker.ready
? 'text-green-600 font-semibold'
: 'text-red-600 font-semibold'
"
>
{{ diagnostics.db.worker.ready ? "✓ Yes" : "✗ No" }}
</span>
</dd>
</div>
<div>
<dt class="font-medium text-gray-700">Pending Messages</dt>
<dd class="text-gray-900">{{ diagnostics.db.worker.pending }}</dd>
</div>
<div v-if="diagnostics.db.worker.lastPingMs">
<dt class="font-medium text-gray-700">Last Ping</dt>
<dd class="text-gray-900">{{ diagnostics.db.worker.lastPingMs }}ms ago</dd>
</div>
</dl>
</div>
<!-- Queue Status (Capacitor/Electron) -->
<div v-if="diagnostics.db.queue" class="mt-4 pt-4 border-t">
<h3 class="font-semibold mb-2">Operation Queue</h3>
<dl class="grid grid-cols-2 gap-4">
<div>
<dt class="font-medium text-gray-700">Current Size</dt>
<dd class="text-gray-900">{{ diagnostics.db.queue.current }}</dd>
</div>
<div>
<dt class="font-medium text-gray-700">Limit</dt>
<dd class="text-gray-900">{{ diagnostics.db.queue.limit }}</dd>
</div>
<div>
<dt class="font-medium text-gray-700">Peak Reached</dt>
<dd class="text-gray-900">{{ diagnostics.db.queue.maxReached }}</dd>
</div>
<div>
<dt class="font-medium text-gray-700">Processing</dt>
<dd class="text-gray-900">
<span
:class="
diagnostics.db.queue.isProcessing
? 'text-green-600 font-semibold'
: 'text-gray-400'
"
>
{{ diagnostics.db.queue.isProcessing ? "✓ Yes" : "✗ No" }}
</span>
</dd>
</div>
</dl>
<div class="mt-2">
<div class="w-full bg-gray-200 rounded-full h-2">
<div
class="bg-blue-600 h-2 rounded-full transition-all"
:style="{
width: `${Math.min(
(diagnostics.db.queue.current / diagnostics.db.queue.limit) * 100,
100
)}%`,
}"
></div>
</div>
<p class="text-xs text-gray-500 mt-1">
{{ Math.round((diagnostics.db.queue.current / diagnostics.db.queue.limit) * 100) }}%
capacity
</p>
</div>
</div>
</div>
<!-- Metadata -->
<div v-if="diagnostics.metadata && Object.keys(diagnostics.metadata).length > 0" class="bg-white rounded-lg shadow p-6">
<h2 class="text-xl font-semibold mb-4">Additional Metadata</h2>
<pre class="bg-gray-100 p-4 rounded text-sm overflow-auto">{{ JSON.stringify(diagnostics.metadata, null, 2) }}</pre>
</div>
</div>
<div v-else-if="!loading" class="text-center py-12 text-gray-500">
No diagnostics available. Click "Refresh" to load.
</div>
</div>
</template>
<script setup lang="ts">
import { ref, onMounted } from "vue";
import { PlatformServiceFactory } from "@/services/PlatformServiceFactory";
import { exportDiagnosticsToFile } from "@/services/DiagnosticExportService";
import type { PlatformDiagnostics } from "@/interfaces/diagnostics";
import { logger } from "@/utils/logger";
const diagnostics = ref<PlatformDiagnostics | null>(null);
const loading = ref(false);
const error = ref<string | null>(null);
const formatKey = (key: string): string => {
return key
.replace(/([A-Z])/g, " $1")
.replace(/^./, (str) => str.toUpperCase())
.trim();
};
const refreshDiagnostics = async () => {
loading.value = true;
error.value = null;
try {
const platform = PlatformServiceFactory.getInstance();
if (platform.getDiagnostics) {
diagnostics.value = await platform.getDiagnostics();
logger.debug("[PlatformDiagnosticsView] Diagnostics refreshed", diagnostics.value);
} else {
error.value = "Diagnostics not available on this platform";
}
} catch (err) {
error.value = `Failed to load diagnostics: ${err}`;
logger.error("[PlatformDiagnosticsView] Failed to refresh diagnostics:", err);
} finally {
loading.value = false;
}
};
const exportDiagnostics = async () => {
try {
await exportDiagnosticsToFile();
logger.log("[PlatformDiagnosticsView] Diagnostics exported successfully");
} catch (err) {
error.value = `Failed to export diagnostics: ${err}`;
logger.error("[PlatformDiagnosticsView] Failed to export diagnostics:", err);
}
};
onMounted(() => {
refreshDiagnostics();
});
</script>
<style scoped>
.platform-diagnostics-view {
min-height: 100vh;
background-color: #f5f5f5;
}
</style>


@@ -4,6 +4,7 @@ import dotenv from "dotenv";
import { loadAppConfig } from "./vite.config.utils.mts";
import path from "path";
import { fileURLToPath } from 'url';
import { execSync } from "child_process";
// Load environment variables
dotenv.config({ path: `.env.${process.env.NODE_ENV}` })
@@ -20,6 +21,15 @@ export async function createBuildConfig(platform: string): Promise<UserConfig> {
// Set platform - PWA is always enabled for web platforms
process.env.VITE_PLATFORM = platform;
// Get git commit hash for build info (empty string if git is unavailable)
let gitHash = "";
try {
gitHash = execSync("git rev-parse --short HEAD", { encoding: "utf-8" }).trim();
} catch {
// Not a git repo or git not installed - keep the empty-string fallback
}
// Environment variables are loaded from .env files via dotenv.config() above
return {
@@ -71,6 +81,8 @@ export async function createBuildConfig(platform: string): Promise<UserConfig> {
'process.env.NODE_ENV': JSON.stringify(process.env.NODE_ENV),
'process.env.VITE_PLATFORM': JSON.stringify(platform),
'process.env.VITE_LOG_LEVEL': JSON.stringify(process.env.VITE_LOG_LEVEL),
'import.meta.env.VITE_GIT_HASH': JSON.stringify(gitHash),
'process.env.VITE_GIT_HASH': JSON.stringify(gitHash),
// PWA is always enabled for web platforms
__dirname: JSON.stringify(process.cwd()),
__IS_MOBILE__: JSON.stringify(isCapacitor),