Compare commits

...

178 Commits

Author SHA1 Message Date
5aa693de63 bump to version 1.0.0 2025-06-20 11:20:57 -06:00
6f2272eea7 fix problem where prod users don't see other DB options 2025-06-20 11:11:33 -06:00
9b69c0b22c bump to version 0.5.9 2025-06-20 10:41:48 -06:00
ab2270d8b2 IndexedDB migration: fix where the existing settings (eg. master) were not updated 2025-06-20 06:51:33 -06:00
0cf5cf266d IndexedDB migration: don't run activeDid migration twice, include warnings in output, don't automatically compare afterward 2025-06-20 06:26:56 -06:00
Matthew Raymer
4d01f64fe7 feat: Implement activeDid migration from Dexie to SQLite
- Add migrateActiveDid() function for dedicated activeDid migration
- Enhance migrateSettings() to handle activeDid extraction and validation
- Update migrateAll() to include activeDid migration step
- Add comprehensive error handling and validation
- Update migration documentation with activeDid migration details
- Ensure user identity continuity during migration process

Files changed:
- src/services/indexedDBMigrationService.ts (153 lines added)
- doc/migration-to-wa-sqlite.md (documentation updated)

Migration order: Accounts -> Settings -> ActiveDid -> Contacts
2025-06-20 04:48:51 +00:00
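The commit above describes a dedicated activeDid step that runs after accounts and settings have been migrated. Below is a minimal sketch of how such a step could be structured; the dependency interface, `MASTER_SETTINGS_KEY` value, and table layout are assumptions for illustration, not the actual `indexedDBMigrationService.ts` code.

```typescript
// Hypothetical sketch only -- the deps object stands in for whatever the
// real migration service uses to reach Dexie and SQLite.
type SqlParams = (string | number | null)[];

interface MigrationDeps {
  getDexieActiveDid(): Promise<string | undefined>;
  sqliteQuery(sql: string, params: SqlParams): Promise<unknown[]>;
  sqliteRun(sql: string, params: SqlParams): Promise<void>;
}

const MASTER_SETTINGS_KEY = 1; // assumed id of the master settings row

async function migrateActiveDid(deps: MigrationDeps): Promise<string[]> {
  const warnings: string[] = [];
  const activeDid = await deps.getDexieActiveDid();
  if (!activeDid) {
    warnings.push("No activeDid found in Dexie; nothing to migrate.");
    return warnings;
  }
  // The DID must already exist among migrated accounts (accounts run first).
  const rows = await deps.sqliteQuery(
    "SELECT did FROM accounts WHERE did = ?",
    [activeDid],
  );
  if (rows.length === 0) {
    warnings.push(`activeDid ${activeDid} has no matching migrated account.`);
    return warnings;
  }
  await deps.sqliteRun("UPDATE settings SET activeDid = ? WHERE id = ?", [
    activeDid,
    MASTER_SETTINGS_KEY,
  ]);
  return warnings;
}
```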
Matthew Raymer
d1f61e3530 docs: Update migration documentation with fence definition and security checklist
- Add comprehensive migration fence definition with clear boundaries
- Update migration guide to reflect current Phase 2 status
- Create security audit checklist for migration process
- Update README with migration status and architecture details
- Document migration fence enforcement and guidelines
- Add security considerations and compliance requirements
2025-06-20 03:30:43 +00:00
4162208b7f fix linting 2025-06-19 19:06:37 -06:00
731605e244 IndexedDB migration: add a blurb about backing up, and move around messaging 2025-06-19 18:55:07 -06:00
5dd2bf4c6e IndexedDB migration: enhance comparison for contacts 2025-06-19 18:47:26 -06:00
75a4a1d901 IndexedDB migration: fix settings comparison 2025-06-19 18:24:49 -06:00
8e605d04d7 IndexedDB migration: fix settings update 2025-06-19 18:12:56 -06:00
da0b244bae IndexedDB migration: implement the migrations differently 2025-06-19 17:26:30 -06:00
6136cafd11 IndexedDB migration: fix loading of data, fix object comparisons, add unmodified, etc 2025-06-19 16:19:00 -06:00
a248e9a5a3 IndexedDB migration: ensure output is printed during comparison (no logic changes) 2025-06-19 12:55:01 -06:00
452ae555bb IndexedDB migration: add ability to see mnemonic and download settings & contacts 2025-06-19 12:37:41 -06:00
3df5e19d9d IndexedDB migration: extract IndexedDB code away from the ongoing SQLite migrations 2025-06-19 11:11:59 -06:00
8eff407a9c IndexedDB migration: reorder the sections to accounts then settings then contacts 2025-06-19 09:54:09 -06:00
e759e4785b IndexedDB migration: set USE_DEXIE_DB to false, remove unused functions, add raw display of data 2025-06-19 09:47:18 -06:00
Matthew Raymer
9d054074e4 fix(migration): update UI to handle transformed JSON format
The DatabaseMigration view has been updated to properly handle both live
comparison data and exported JSON format, fixing count mismatches and
field name differences.

Changes:
- Added helper methods in DatabaseMigration.vue to handle both data formats:
  - getSettingDisplayName() for settings with type/did or activeDid/accountDid
  - getAccountHasIdentity() and getAccountHasMnemonic() for boolean fields
- Updated template to use new helper methods for consistent display
- Added exportComparison() method to handle JSON export format
- Fixed settings count display to match actual data state

Technical Details:
- Settings now handle both 'type'/'did' (JSON) and 'activeDid'/'accountDid' (live)
- Account display properly shows boolean values from either format
- Export functionality preserves data structure while maintaining readability

Resolves count mismatch between UI (showing 1 SQLite setting) and JSON data
(showing 0 SQLite settings).

Testing:
- Verified UI displays correct counts from both live and exported data
- Confirmed settings display works with both data formats
- Validated account boolean fields display correctly
2025-06-19 14:45:58 +00:00
Matthew Raymer
30de30e709 fix: maintain separate master/account settings in SQLite migration
- Update settings migration to maintain separate master and account records
- Use activeDid/accountDid pattern to differentiate between settings types:
  * Master settings: activeDid set, accountDid empty
  * Account settings: accountDid set, activeDid empty
- Add detailed logging for settings migration process
- Focus on migrating key fields: firstName, isRegistered, profileImageUrl,
  showShortcutBvc, and searchBoxes
- Fix issue where settings were being incorrectly merged into a single record

This change ensures the SQLite database maintains the same settings structure
as Dexie, which is required by the existing codebase.
2025-06-19 14:11:11 +00:00
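The master/account split described in this commit can be illustrated with a small sketch; the field names follow the commit message, while the overall `Settings` shape is an assumption.

```typescript
// Hypothetical illustration of the master vs. account settings pattern.
interface SettingsRow {
  activeDid: string | null;  // set only on the master record
  accountDid: string | null; // set only on per-account records
  firstName?: string;
  isRegistered?: boolean;
  profileImageUrl?: string;
  showShortcutBvc?: boolean;
  searchBoxes?: string; // stored as a JSON string
}

function isMasterSettings(row: SettingsRow): boolean {
  return !!row.activeDid && !row.accountDid;
}

function isAccountSettings(row: SettingsRow): boolean {
  return !!row.accountDid && !row.activeDid;
}
```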
Matthew Raymer
6cbd32af94 feat: improve database migration comparison and UI display
- Enhanced migration service to filter empty/default records from comparisons
- Updated comparison output to exclude records without meaningful data
- Improved UI display with better visual hierarchy and count indicators
- Added security improvements to prevent sensitive data exposure
- Fixed settings migration to properly handle MASTER_SETTINGS_KEY vs account DIDs
- Updated verification display to show filtered counts and detailed differences
- Improved data formatting for contacts, settings, and accounts sections
- Added proper filtering for missing records to avoid counting empty entries

Changes:
- Filter SQLite records to only include those with actual DIDs/data
- Update comparison counts to reflect meaningful differences only
- Enhance UI with count indicators and better visual organization
- Replace sensitive data display with boolean flags (hasIdentity, hasMnemonic)
- Fix settings migration logic for proper DID field handling
- Improve verification message to show detailed breakdown by type
- Add proper filtering for missing records in all data types

Security: Prevents exposure of mnemonics, private keys, and identity data
UI/UX: Cleaner display with better information hierarchy and counts
Migration: More accurate comparison results and better debugging visibility
2025-06-19 12:44:32 +00:00
Matthew Raymer
30c8b73041 feat: implement single-step migration with proper foreign key order
- Add migrateAll() function that handles complete migration in correct order:
  Accounts → Settings → Contacts to avoid foreign key constraint issues
- Add prominent "Migrate All (Recommended)" button to migration UI
- Add informational section explaining migration order and rationale
- Add info icon to icon set for UI clarity
- Improve migration logic to handle overwriteExisting parameter properly:
  - New records are always migrated regardless of checkbox setting
  - Existing records are only updated when overwriteExisting=true
  - Clear warning messages when records are skipped
- Maintain backward compatibility with individual migration buttons
- All code linted and formatted according to project standards

Co-authored-by: Matthew Raymer
2025-06-19 08:52:55 +00:00
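A minimal sketch of the single-step flow described above is shown here; the per-type migration functions and their result shape are assumptions, not the project's actual signatures.

```typescript
interface StepResult {
  migrated: number;
  skipped: number;
  warnings: string[];
}

// The three per-type migrations are assumed to exist with this signature;
// each inserts new records always and updates existing ones only when
// `overwrite` is true.
declare function migrateAccounts(overwrite: boolean): Promise<StepResult>;
declare function migrateSettings(overwrite: boolean): Promise<StepResult>;
declare function migrateContacts(overwrite: boolean): Promise<StepResult>;

async function migrateAll(overwriteExisting: boolean): Promise<StepResult[]> {
  // Order matters: accounts first so settings and contacts can reference
  // account DIDs without hitting foreign key constraint failures.
  const results: StepResult[] = [];
  results.push(await migrateAccounts(overwriteExisting));
  results.push(await migrateSettings(overwriteExisting));
  results.push(await migrateContacts(overwriteExisting));
  return results;
}
```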
Matthew Raymer
2f9ab14c88 fix: resolve migration service import and function signature conflicts
- Fix import in src/db-sql/migration.ts to use named imports and alias runMigrations to avoid naming conflict
- Add missing migration management functions (registerMigration, runMigrations) to migrationService with full typing and logging
- Update function signatures to accept SQL parameters for compatibility with AbsurdSqlDatabaseService
- Clean up Prettier formatting issues in migrationService and migration.ts
- Confirmed dev server and linter run cleanly

Co-authored-by: Matthew Raymer
2025-06-19 08:32:14 +00:00
Matthew Raymer
8a7f142cb7 feat: integrate importFromMnemonic utility into migration service and UI
- Add account migration support to migrationService with importFromMnemonic integration
- Extend DataComparison and MigrationResult interfaces to include accounts
- Add getDexieAccounts() and getSqliteAccounts() functions for account retrieval
- Implement compareAccounts() and migrateAccounts() functions with proper error handling
- Update DatabaseMigration.vue UI to support account migration:
  - Add "Migrate Accounts" button with lock icon
  - Extend summary cards grid to show Dexie/SQLite account counts
  - Add Account Differences section with added/modified/missing indicators
  - Update success message to include account migration counts
  - Enhance grid layouts to accommodate 6 summary cards and 3 difference sections

The migration service now provides complete data migration capabilities
for contacts, settings, and accounts, with enhanced reliability through
the importFromMnemonic utility for mnemonic-based account handling.
2025-06-19 06:13:25 +00:00
Matthew Raymer
f375a4e11a feat: move database migration link from account view to start view
- Remove database migration link from AccountViewView.vue
- Add new "Database Tools" section to StartView.vue
- Improve user flow by making database tools accessible from start page
- Maintain consistent styling and functionality
- Clean up account view to focus on account-specific settings

The database migration feature is now logically grouped with other
identity-related operations and more discoverable for users.
2025-06-19 05:51:59 +00:00
Matthew Raymer
40a2491d68 feat: Add comprehensive Database Migration UI component
- Create DatabaseMigration.vue with vue-facing-decorator and Tailwind CSS
- Add complete UI for comparing and migrating data between Dexie and SQLite
- Implement real-time loading states, error handling, and success feedback
- Add navigation link to Account page for easy access
- Include export functionality for comparison data
- Create comprehensive documentation in doc/database-migration-guide.md
- Fix all linting issues and ensure code quality standards
- Support both contact and settings migration with overwrite options
- Add visual difference analysis with summary cards and detailed breakdowns

The component provides a professional interface for the migrationService.ts,
enabling users to safely transfer data between database systems during
the transition from Dexie to SQLite storage.
2025-06-18 11:19:34 +00:00
Matthew Raymer
25c1d6ef4e feat: Add comprehensive database migration service for Dexie to SQLite
- Add migrationService.ts with functions to compare and transfer data between Dexie and SQLite
- Implement data comparison with detailed difference analysis (added/modified/missing)
- Add contact migration with overwrite options and error handling
- Add settings migration focusing on key user fields (firstName, isRegistered, profileImageUrl, showShortcutBvc, searchBoxes)
- Include YAML export functionality for data inspection
- Add comprehensive JSDoc documentation with examples and usage instructions
- Support both INSERT and UPDATE operations with parameterized SQL generation
- Include detailed logging and error reporting for migration operations

This service enables safe migration of user data from the legacy Dexie (IndexedDB)
database to the new SQLite implementation, with full comparison capabilities
and rollback safety through detailed reporting.
2025-06-18 10:54:32 +00:00
a5c5c2b9dd bump to build 33 and version 0.5.7 2025-06-18 02:34:18 -06:00
cf33a39fbc fix the invite-one setup, fix a deep link, and tweak other verbiage & error info 2025-06-18 02:32:47 -06:00
8629cefa13 bump to build 32 & version 0.5.6 2025-06-17 05:25:45 -06:00
5e851e442f shrink the contents of the QR code so people can scan it 2025-06-16 15:38:11 -06:00
4a43bc9c6c bump build to 31 and version to 0.5.5 2025-06-16 07:38:16 -06:00
60de8cee62 reword some of the help-page introduction (no code changes) 2025-06-16 07:24:37 -06:00
Jose Olarte III
bb2a4ab76e URL scheme config for iOS
- Registers the timesafari:// URL scheme
- Sets the bundle URL name to app.timesafari
2025-06-16 16:09:08 +08:00
Matthew Raymer
048dded278 fix: resolve deep link route mismatch for project links
- Fix schema validation mismatch between "project-details" and "project"
- Update VALID_DEEP_LINK_ROUTES to include "project" instead of "project-details"
- Update deepLinkSchemas to use "project" route name
- Update documentation to reflect correct route name
- Resolves "Invalid route path: project" errors in deep link handling

The deep link timesafari://project/01JWH0YAB3MAGBD751VAJAXQ17 now works correctly
and routes to the ProjectViewView component as expected.

Fixes: Deep link validation errors for project routes
2025-06-16 05:48:13 +00:00
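The route-name validation this commit refers to could look roughly like the sketch below; apart from "project" (taken from the commit message), the entries in the constant and the error shape are placeholders, not the real `VALID_DEEP_LINK_ROUTES` contents.

```typescript
const VALID_DEEP_LINK_ROUTES = [
  "claim",          // placeholder entry
  "contact-import", // placeholder entry
  "project",        // was "project-details", which caused the mismatch
] as const;

type DeepLinkRoute = (typeof VALID_DEEP_LINK_ROUTES)[number];

function validateDeepLinkRoute(path: string): DeepLinkRoute {
  if (!(VALID_DEEP_LINK_ROUTES as readonly string[]).includes(path)) {
    throw new Error(`Invalid route path: ${path}`);
  }
  return path as DeepLinkRoute;
}

// e.g. timesafari://project/01JWH0YAB3MAGBD751VAJAXQ17
// validateDeepLinkRoute("project") now passes instead of throwing.
```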
e240c2940a remove unused deep links and add another 2025-06-15 13:54:12 -06:00
54dca9e745 fix project deep-link (and reorder alphabetically) 2025-06-15 11:02:16 -06:00
9f0fed0a60 update ios check to work, and add links to app stores 2025-06-14 22:10:49 -06:00
0d152adbf2 remove the deep-link autoVerify because it caused a build failure 2025-06-14 22:06:12 -06:00
cead308800 incorporate one of the BUILDING steps directly into the file 2025-06-13 22:37:03 -06:00
676a301331 bump to build 30 version 0.5.4 2025-06-13 22:36:28 -06:00
d6db81cc36 fix some result types and refactor types themselves 2025-06-13 21:58:57 -06:00
Matthew Raymer
f2ddcd2541 feat: add conditional rendering for claim certificate link and update gitignore
- Add v-if directive to show claim certificate link only when veriClaim.id exists
- Update .gitignore to exclude android app resource directory
- Prevents broken links when claim data is not fully loaded
- Improves build process by ignoring generated Android resources

This change ensures the certificate link is only displayed when there's
valid claim data available, preventing navigation errors and improving
user experience. The gitignore update helps keep the repository clean
by excluding Android-specific generated files.
2025-06-14 03:31:12 +00:00
fb81f7b96e fix problems with :href links causing the app to reload for DB errors on mobile 2025-06-13 20:39:12 -06:00
a23416ead1 fix optional message at top to not overflow 2025-06-12 20:10:31 -06:00
530c7c1a13 fix problem with user-profile page, and bump to build 29 & version 0.5.3 2025-06-12 19:16:02 -06:00
f255ea389b bump to build 26 and version 0.5.1 2025-06-11 00:46:46 -06:00
0d343b9877 Merge pull request 'fix creation of did-specific settings (with a rename)' (#138) from fix-did-specifics into master
Reviewed-on: #138
2025-06-11 02:14:41 -04:00
df06100c32 remove more debugging 2025-06-10 23:49:14 -06:00
Matthew Raymer
ac5ddfc6f2 style: fix line length in ContactsView ternary operator
- Break long CSS class strings into multiple concatenated lines
- Ensure all lines are under 100 characters for better readability
- Maintain same functionality and styling behavior
- Improve code maintainability and readability

Fixes: Long lines in conditional CSS class assignment
2025-06-11 05:45:58 +00:00
Matthew Raymer
89b3f30466 fix: debug and clean up GiftedPrompts contact retrieval logic
- Add comprehensive debug logging to identify contact list population issues
- Fix array indexing bug in contact mapping (someContactDbIndex -> 0)
- Clean up all console.log statements for production readiness
- Improve contact retrieval debugging for SQLite and Dexie databases
- Maintain core functionality while adding diagnostic capabilities

Debugging: Contact list population issues in GiftedPrompts component
Cleanup: Remove debug console.log statements
2025-06-11 05:40:05 +00:00
Matthew Raymer
3cb5cc096b refactor: use databaseUtil.updateDefaultSettings for feed filter settings
- Replace direct platform service calls with databaseUtil.updateDefaultSettings
- Remove manual SQL query construction in favor of centralized utility
- Improve code consistency and maintainability
- Add proper error handling through databaseUtil's built-in mechanisms
- Remove unused PlatformServiceFactory import
- Fix SQL syntax errors in clearAll and setAll methods (AND -> comma)
- Ensure both SQLite and Dexie databases are updated consistently

Improves: FeedFilters component architecture and error handling
Fixes: isNearby and filterFeedByVisible settings not being saved properly
2025-06-11 05:19:15 +00:00
Matthew Raymer
5df560154f fix: resolve cross-platform contactMethods JSON parsing inconsistencies
- Add platform-agnostic parseJsonField utility for contactMethods handling
- Update contact export functions (contactsToExportJson, contactToCsvLine)
- Fix contact storage in QR scan views (ContactQRScanShowView, ContactQRScanFullView)
- Ensure consistent JSON string storage across web SQLite and Capacitor SQLite
- Prevents "[object Object] is not valid JSON" errors when switching platforms
- Maintains compatibility between auto-parsing web SQLite and raw string Capacitor SQLite

Fixes: contactMethods parsing errors in export and QR scan functionality
Related: searchBoxes field had similar issue (already fixed)
2025-06-11 04:17:38 +00:00
Matthew Raymer
c1aa522e6c fix: resolve cross-platform SQLite JSON parsing inconsistencies
- Add platform-agnostic parseJsonField utility to handle different SQLite implementations
- Web SQLite (wa-sqlite/absurd-sql) auto-parses JSON strings to objects
- Capacitor SQLite returns raw strings requiring manual parsing
- Update searchBoxes parsing to use new utility for consistent behavior
- Fixes "[object Object] is not valid JSON" error when switching platforms
- Ensures compatibility between web and mobile SQLite implementations

Fixes: searchBoxes parsing errors in databaseUtil.ts
Related: contactMethods field has similar issue (needs same treatment)
2025-06-11 03:44:28 +00:00
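A sketch of a platform-agnostic JSON field parser in the spirit of these two commits is shown below; the exact name and signature in the codebase are assumptions.

```typescript
// Web SQLite (wa-sqlite/absurd-sql) may hand back an already-parsed object,
// while Capacitor SQLite returns the raw JSON string.
function parseJsonField<T>(value: unknown, fallback: T): T {
  if (value == null) return fallback;
  if (typeof value === "object") return value as T; // already parsed (web)
  if (typeof value === "string") {
    try {
      return JSON.parse(value) as T; // raw string (Capacitor)
    } catch {
      return fallback;
    }
  }
  return fallback;
}

// Usage with the fields mentioned in these commits:
// const searchBoxes = parseJsonField(row.searchBoxes, []);
// const contactMethods = parseJsonField(row.contactMethods, []);
```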
a082469a01 fix creation of did-specific settings (with a rename) 2025-06-10 20:51:22 -06:00
Jose Olarte III
3544d7278d Optimized item actions
- Edited button labels for brevity
- Repositioned Totals toggle
- Restyled note about recent hours
- Various text size and spacing changes
2025-06-10 19:54:05 +08:00
Jose Olarte III
d3110506ea Optimized per-item layout
- Stacked contact name and DID
- Text truncates to leave room for action buttons when visible
- Separated "from / to" heading from buttons to minimize width
- Various spacing and alignment adjustments
2025-06-10 18:42:49 +08:00
8609f8458d bump to build 25 & version 0.5.0 2025-06-09 09:26:21 -06:00
8f5c34bc5f fix linting 2025-06-09 09:09:54 -06:00
b0d61b95ea Merge branch 'ui-fixes-2025-06-w2' 2025-06-09 08:44:42 -06:00
af7bd236a3 fix check for successful gift submission 2025-06-09 08:41:47 -06:00
d719338bcc fix problem setting 'loading' flag 2025-06-09 08:37:42 -06:00
6ddf2d1012 fix problem switching IDs (creating too many settings) 2025-06-09 08:33:33 -06:00
Jose Olarte III
1b2d4b623a Turned off automatic safe area in iOS
- Safe area implementations will solely depend on CSS styles for full control
- Eliminates doubled top and bottom padding in iOS
2025-06-09 20:20:17 +08:00
Jose Olarte III
16d5c917d2 Updated QR scanner call
- Searched for all other (outdated) calls to QR scanner dialog and updated them
- Fixed vite HTML spec warning
2025-06-09 19:36:06 +08:00
5976a4995e fix problem clicking on offer-delivery, plus some other hardening and phrasing 2025-06-08 20:13:32 -06:00
dcd0cc4c20 fix import for derived accounts and hopefully make other account-access code more robust 2025-06-08 19:14:41 -06:00
b3ca6c9d91 remove relative URL references in different target because mobile chokes 2025-06-08 16:55:03 -06:00
e9d800f601 fix a web test (all passing now) 2025-06-07 21:41:43 -06:00
b939a5e592 bump build to 23 and version to 0.4.8 2025-06-07 18:54:56 -06:00
aa62037fae bump to build 22 version 0.4.7 (though I think the android capacitor.config.json appId is wrong) 2025-06-07 18:45:24 -06:00
722020ea86 fix linting 2025-06-07 18:09:04 -06:00
96aa3f4a54 add Python dependency for electron on Mac 2025-06-07 17:54:31 -06:00
c0c5f9842b fix some errors and correct recent type duplications & bloat (cherry-picked from d8f2587d1c) 2025-06-07 17:53:36 -06:00
be27ca1855 fix more logic for tests (cherry-picked from 83acb028c7) 2025-06-07 17:42:06 -06:00
92e4570672 fix some incorrect logic & things AI hallucinated 2025-06-07 17:39:10 -06:00
820ae727ed fix linting 2025-06-07 17:19:01 -06:00
dbeb1c6b4b Merge branch 'sql-absurd-sql-back' 2025-06-07 17:18:10 -06:00
573e4b206a Merge branch 'search-map-fix' 2025-06-07 16:43:49 -06:00
abc05d426e update messaging for unknown icons on home feed 2025-06-07 16:23:07 -06:00
2ea7479d75 fix verbiage and fix the contact actions to be on the right-hand side 2025-06-07 16:03:47 -06:00
9ac9713172 fix linting 2025-06-07 15:40:52 -06:00
41dad3254d fix a non-existent description, move the description right below the image 2025-06-07 15:33:29 -06:00
485eac59a0 remove unnecessary data element from export 2025-06-07 14:02:22 -06:00
Matthew Raymer
73fc32b75d fix(import): ensure contact import works for both Dexie and absurd-sql backends
- Refactor importContacts to handle both Dexie and absurd-sql (SQLite) storage
- Add ContactDbRecord interface with all string fields strictly typed (never null)
- Add helper functions to coerce null/undefined to empty string for all string fields
- Guarantee contactMethods is always stored as a JSON string (never null)
- Add runtime validation for required fields (e.g., did)
- Ensure imported/updated contacts are type-safe and compatible with both backends
- Improve code documentation and maintainability

Security:
- No sensitive data exposed
- All fields validated and sanitized before database write
- Consistent data structure across storage backends

Testing:
- Import tested with both Dexie and absurd-sql backends
- Null/undefined fields correctly handled and coerced
- No linter/type errors remain
2025-06-07 06:01:17 +00:00
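The null-coercion approach described above might look something like this sketch; the full `ContactDbRecord` field list and the helper names are assumptions.

```typescript
interface ContactDbRecord {
  did: string;
  name: string;
  notes: string;
  profileImageUrl: string;
  contactMethods: string; // always a JSON string, never null
}

function safeString(value: string | null | undefined): string {
  return value ?? "";
}

function toContactDbRecord(raw: Partial<ContactDbRecord>): ContactDbRecord {
  if (!raw.did) {
    throw new Error("Contact import requires a did field");
  }
  return {
    did: raw.did,
    name: safeString(raw.name),
    notes: safeString(raw.notes),
    profileImageUrl: safeString(raw.profileImageUrl),
    // Guarantee a JSON string so Dexie and absurd-sql store the same shape.
    contactMethods: safeString(raw.contactMethods) || "[]",
  };
}
```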
Matthew Raymer
3d8e40e92b feat(export): Replace CSV export with standardized JSON format
- Add contactsToExportJson utility function for standardized data export
- Replace CSV export with JSON format in DataExportSection
- Update file extension and MIME type to application/json
- Remove Dexie-specific export logic in favor of unified SQLite/Dexie approach
- Update success notifications to reflect JSON format
- Add TypeScript interfaces for export data structure

This change improves data portability and standardization by:
- Using a consistent JSON format for data export/import
- Supporting both SQLite and Dexie databases
- Including all contact fields in export
- Properly handling contactMethods as stringified JSON
- Maintaining backward compatibility with existing import tools

Security: No sensitive data exposure, maintains existing access controls
2025-06-07 05:02:33 +00:00
38e67f3533 update a DB save to match others, ie. first SQL then maybe Dexie 2025-06-06 19:50:16 -06:00
7f63ee7c80 add another way to fix the privacy policy manifest for third parties like GoogleToolboxForMac 2025-06-06 19:45:41 -06:00
6a47f0d3e7 format total numbers better 2025-06-06 19:40:53 -06:00
fc50a9d4c6 fix problem finding offer identifiers 2025-06-06 19:06:29 -06:00
Matthew Raymer
c1f2c3951a feat(db): improve settings retrieval resilience and logging
Enhance retrieveSettingsForActiveAccount with better error handling and logging
while maintaining core functionality. Changes focus on making the system more
debuggable and resilient without overcomplicating the logic.

Key improvements:
- Add structured error handling with specific try-catch blocks
- Implement detailed logging with [databaseUtil] prefix for easy filtering
- Add graceful fallbacks for searchBoxes parsing and missing settings
- Improve error recovery paths with safe defaults
- Maintain existing security model and data integrity

Security:
- No sensitive data in logs
- Safe JSON parsing with fallbacks
- Proper error boundaries
- Consistent state management
- Clear fallback paths

Testing:
- Verify settings retrieval works with/without active DID
- Check error handling for invalid searchBoxes
- Confirm logging provides clear debugging context
- Validate fallback to default settings works
2025-06-06 09:22:35 +00:00
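The error-handling pattern this commit describes (scoped try-catch blocks, `[databaseUtil]`-prefixed logging, safe fallbacks) could be structured as in the sketch below; the `Settings` shape, defaults, and data-access helper are assumptions.

```typescript
interface Settings {
  activeDid?: string;
  searchBoxes: { name: string; bbox: number[] }[];
}

const DEFAULT_SETTINGS: Settings = { searchBoxes: [] };

// Assumed helper that fetches the raw settings row for the active account.
declare function loadSettingsRow(): Promise<Record<string, unknown> | null>;

async function retrieveSettingsForActiveAccount(): Promise<Settings> {
  let row: Record<string, unknown> | null = null;
  try {
    row = await loadSettingsRow();
  } catch (error) {
    console.error("[databaseUtil] failed to load settings row", error);
    return DEFAULT_SETTINGS; // safe fallback keeps the app usable
  }
  if (!row) {
    console.warn("[databaseUtil] no settings row found, using defaults");
    return DEFAULT_SETTINGS;
  }
  let searchBoxes: Settings["searchBoxes"] = [];
  try {
    searchBoxes =
      typeof row.searchBoxes === "string" ? JSON.parse(row.searchBoxes) : [];
  } catch (error) {
    console.error("[databaseUtil] invalid searchBoxes JSON, ignoring", error);
  }
  return { activeDid: row.activeDid as string | undefined, searchBoxes };
}
```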
9d4f726c31 bump to build # 19 version 0.4.7 for mobile packages 2025-06-05 20:30:27 -06:00
1d7f626645 fix SQL references to bad "key" -> "id" 2025-06-05 20:11:32 -06:00
c5228ba7ec fix retrieval of column names -- so now most ops are working (but not all, eg. set name) 2025-06-05 20:06:32 -06:00
6e1fcd8dee remove unused DB methods (for now) 2025-06-05 20:00:51 -06:00
5bb563d694 fix extraction of values from SQLite queries 2025-06-05 19:57:59 -06:00
a3951c9d66 refactor for clarity (no logic changes) 2025-06-05 18:30:25 -06:00
aa177a9b8c fix extraction of migration names for SQLite via Capacitor 2025-06-05 18:13:33 -06:00
03cb4720b8 fix Capacitor to use the same migrations (migrations run but accounts aren't created) 2025-06-04 22:01:14 -06:00
297c5a2dbb disable SQLite in Java & Swift (since they don't compile) & add SQL queueing on startup
At this point, the app compiles and runs in Android & iOS but DB operations fail.
2025-06-03 19:59:28 -06:00
Jose Olarte III
92b9c9334c Clickable person & project icons
- Known entities get routed to their corresponding detail views
- Unknown entities pop up a notification
2025-06-02 21:35:15 +08:00
Jose Olarte III
706182ca0c Icon for hidden DID entity
- Display EntityIcon for known entities, eye-slash icon for hidden entities, and the person-question icon for unknown entities
- Design tweaks (spacings, mostly)
2025-06-02 17:44:37 +08:00
ef3bfcdbd2 fix linting 2025-05-28 20:30:00 -06:00
ec1f27bab1 fix more logging cleanup errors 2025-05-28 20:20:09 -06:00
01c33069c4 fix more of the logging & log display 2025-05-28 20:08:09 -06:00
c637d39dc9 fix log cleanup check to actually pay attention to limit 2025-05-28 19:44:16 -06:00
3e90bafbd1 correct & simplify the DB logging 2025-05-28 19:37:01 -06:00
Matthew Raymer
d2c3e5db05 fix: one lint that got past me 2025-05-28 14:03:21 +00:00
Matthew Raymer
e824fcce2e fix: linting issues 2025-05-28 14:02:02 +00:00
Matthew Raymer
f2c49872a6 fix: resolve TypeScript errors in database service implementation
- Remove unnecessary generic type parameter from AbsurdSqlDatabaseService
- Fix type handling in operation queue and result processing
- Correct WebPlatformService dbGetOneRow implementation to use imported databaseService
- Add proper type annotations for database operation results

This commit improves type safety and fixes several TypeScript errors that were
preventing proper type checking in the database service layer.
2025-05-28 13:20:01 +00:00
Matthew Raymer
229d9184b2 WIP: BROKEN FOR ELECTRON: Fixes in progress 2025-05-28 13:09:51 +00:00
Matthew Raymer
29908b77e3 feat(types): add comprehensive type definitions for @jlongster/sql.js
- Add FileSystem and FileStream interfaces for filesystem operations

- Update Database interface with proper Promise-based return types

- Add QueryExecResult interface for structured query results

- Include FS and register_for_idb in initialization result

- Fix Database constructor to support path and options parameters

- Add proper JSDoc documentation with author and description

This change resolves TypeScript compilation errors in AbsurdSqlDatabaseService by providing complete type coverage for the SQL.js WASM module with filesystem support.
2025-05-28 11:16:56 +00:00
Matthew Raymer
16cad04e5c WIP: fix(AbsurdSqlDatabaseService) fixes to typing and other curious beasts 2025-05-28 10:56:27 +00:00
Matthew Raymer
e4f859a116 fix: update offerGiverDid to use credentialSubject.offeredBy
The offerGiverDid function was looking for offeredBy at the root level of the
OfferVerifiableCredential, but it was moved to credentialSubject in our interface
changes. This fix updates the function to look in the correct location while
maintaining the same fallback behavior of using the issuer if offeredBy is not
available.

- Update path from claim.offeredBy to claim.credentialSubject.offeredBy
- Remove unnecessary string type cast
- Keep issuer fallback behavior unchanged
2025-05-28 10:41:39 +00:00
Matthew Raymer
7f17a3d9c7 refactor: remove unused imports and parameters in passkeyDidPeer.ts
- Remove unused imports:
  - DIDResolutionResult from did-resolver
  - sha256 from ethereum-cryptography/sha256.js
- Remove unused parameters from verifyJwtP256 and verifyJwtWebCrypto:
  - credIdHex
  - clientDataJsonBase64Url
  - credId

This change improves code cleanliness by removing unused code while
maintaining the core passkey authentication functionality.
2025-05-28 10:37:01 +00:00
Matthew Raymer
2d4d9691ca fix: use challenge parameter in verifyJwtWebCrypto preimage
- Remove unused client data hashing in verifyJwtWebCrypto
- Use challenge parameter directly in preimage construction
- Fix TS6133 error for unused challenge parameter
- Make verification logic consistent with verifyJwtP256

This change maintains the same verification logic while properly
utilizing the challenge parameter in the signature verification.
2025-05-28 10:28:57 +00:00
Matthew Raymer
63575b36ed fix: use challenge parameter in verifyJwtP256 preimage
- Remove unused client data hashing in verifyJwtP256
- Use challenge parameter directly in preimage construction
- Fix TS6133 error for unused challenge parameter

This change maintains the same verification logic while properly
utilizing the challenge parameter in the signature verification.
2025-05-28 10:27:19 +00:00
Matthew Raymer
2eb46367bc fix: resolve TypeScript errors in passkeyDidPeer.ts
- Fix import path for VerifyAuthenticationResponseOpts to use main package
- Add proper type assertions for credential response using AuthenticatorAssertionResponse
- Add back p256 import from @noble/curves/p256
- Remove unused functions marked with @typescript-eslint/no-unused-vars:
  - peerDidToDidDocument
  - COSEtoPEM
  - base64urlDecodeArrayBuffer
  - base64urlEncodeArrayBuffer
  - pemToCryptoKey

This change improves type safety and removes dead code while maintaining
the core passkey authentication functionality.
2025-05-28 10:24:17 +00:00
Matthew Raymer
cea0456148 fix: resolve type conflicts in AccountKeyInfo and KeyMeta imports
- Update AccountKeyInfo interface to handle derivationPath type conflict
- Fix circular dependency by importing KeyMeta directly from interfaces/common
- Use Omit utility type to properly merge Account and KeyMeta types
- Make derivationPath optional in AccountKeyInfo to match Account type

This change resolves type compatibility issues while maintaining
the intended functionality of account metadata handling.
2025-05-28 10:17:20 +00:00
Matthew Raymer
6f5db13a49 fix: consolidate KeyMeta interface and improve type safety
- Remove duplicate KeyMeta interface from crypto/vc/index.ts
- Import KeyMeta from common.ts as single source of truth
- Add missing fields to KeyMeta interface (identity, passkeyCredIdHex)
- Remove unused ErrorResponse interface from endorserServer.ts
- Fix import path for KeyMeta in crypto/vc/index.ts

This change resolves type compatibility issues and ensures consistent
KeyMeta type usage across the codebase.
2025-05-28 10:13:01 +00:00
Matthew Raymer
068662625d fix: resolve type compatibility in offerGiverDid
- Update offerGiverDid to use GenericVerifiableCredential as base type
- Add type assertion for OfferVerifiableCredential inside function
- Remove unnecessary type assertion in canFulfillOffer
2025-05-28 10:09:13 +00:00
Matthew Raymer
23627835f9 refactor: improve type safety in endorser server and common interfaces
- Add proper type definitions for AxiosErrorResponse with detailed error structure
- Make KeyMeta fields required where needed (publicKeyHex, mnemonic, derivationPath)
- Add QuantitativeValue type for consistent handling of numeric values
- Fix type assertions and compatibility between GenericVerifiableCredential and its extensions
- Improve error handling with proper type guards and assertions
- Update VerifiableCredentialClaim interface with required fields
- Add proper type assertions for claim objects in claimSummary and claimSpecialDescription
- Fix BLANK_GENERIC_SERVER_RECORD to include required @context field

Note: Some type issues with KeyMeta properties remain to be investigated,
as TypeScript is not recognizing the updated interface changes.
2025-05-28 09:29:19 +00:00
Matthew Raymer
f1ba6f9231 fix(endorserServer): improve type safety in claim handling
- Fix type conversion in claimSpecialDescription to properly handle GenericVerifiableCredential
- Update claim type checking to use 'claim' in claim for proper type narrowing
- Add type assertions for claim.object to ensure type safety
- Remove incorrect GenericCredWrapper type cast

This change resolves the type conversion error by properly handling
different claim types and ensuring type safety when passing objects
to claimSummary. The code now correctly distinguishes between
GenericVerifiableCredential and GenericCredWrapper types.
2025-05-28 09:04:15 +00:00
Matthew Raymer
137fce3e30 fix(endorserServer): improve type safety and error handling
- Update claimSummary to handle both GenericVerifiableCredential and GenericCredWrapper types
- Fix logger error handling in register function to properly stringify response data
- Add type narrowing with 'claim' in claim check for safer type handling
- Improve error message formatting for registration errors

This change improves type safety by properly handling different claim types
and ensures consistent error logging. The registration error handling now
properly stringifies response data for better debugging.
2025-05-28 09:00:46 +00:00
Matthew Raymer
7166dadbc0 fix(types): correct type imports and improve null safety
- Move type imports to their correct source locations:
  - GenericCredWrapper, GenericVerifiableCredential from interfaces/common
  - GiveSummaryRecord from interfaces/records
  - OfferVerifiableCredential from interfaces/claims
- Add null safety check for dbResult in retrieveAccountCount
- Initialize result variable to prevent undefined access

This change fixes TypeScript errors by ensuring types are imported from their
proper source files and improves code safety by adding proper null checks.
The type system can now correctly validate the usage of these types across
the codebase.
2025-05-28 08:56:54 +00:00
Matthew Raymer
bc274bdf7f fix(types): improve account type safety and metadata handling
- Change retrieveAllAccountsMetadata to return Account[] instead of AccountEncrypted[]
  to better reflect its purpose of returning non-sensitive metadata
- Update ImportDerivedAccountView to use Account type and group by derivation path
- Update retrieveAllFullyDecryptedAccounts to use AccountEncrypted type for encrypted data
- Fix import path for Account type in ImportDerivedAccountView

This change improves type safety by making it explicit which functions handle
encrypted data vs metadata, and ensures consistent handling of account data
across the application. The metadata functions now correctly strip sensitive
fields while functions that need encrypted data maintain access to those fields.
2025-05-28 08:52:09 +00:00
Matthew Raymer
082f8c0126 feat(QRScanner): implement QRScannerOptions in WebInlineQRScanner
- Add proper handling of QRScannerOptions in startScan method
- Implement camera selection (front/back) via options.camera
- Add video preview toggle via options.showPreview
- Store options as class property for persistence
- Improve logging with options context
- Fix TypeScript error for unused options parameter

This change makes the QR scanner more configurable and properly
implements the QRScannerService interface contract.
2025-05-28 08:43:43 +00:00
Matthew Raymer
fd09c7e426 fix(deepLinks): improve route validation and type safety
- Add early validation for route paths to prevent undefined access
- Introduce INVALID_ROUTE error type with detailed error information
- Simplify parameter mapping using nullish coalescing operator
- Improve type safety in parseDeepLink method
- Add better error details for invalid route paths

This change prevents potential runtime errors from undefined route access
and provides clearer error messages for invalid deep links.
2025-05-28 08:40:49 +00:00
Matthew Raymer
be40643379 fix: unused element 2025-05-28 08:38:10 +00:00
Matthew Raymer
835a270e65 feat(qr-scanner): Add camera state management to CapacitorQRScanner
- Add camera state tracking and listener management
- Implement addCameraStateListener and removeCameraStateListener methods
- Add state transitions during scanning operations
- Improve error handling with state updates
- Add proper type imports for CameraState and CameraStateListener

This change ensures CapacitorQRScanner fully implements the QRScannerService
interface and provides proper camera state feedback to consumers. Camera state
is now tracked through the entire lifecycle of scanning operations, with
appropriate state transitions for initialization, active scanning, errors,
and cleanup.
2025-05-28 08:37:02 +00:00
Matthew Raymer
13682a1930 fix(db): add type declarations for SQL.js and absurd-sql modules
- Create type declarations in interfaces/absurd-sql.d.ts
- Import and use proper QueryExecResult and SqlValue types
- Add declarations for all required modules:
  - @jlongster/sql.js
  - absurd-sql
  - absurd-sql/dist/indexeddb-backend
  - absurd-sql/dist/indexeddb-main-thread
- Ensure type safety for database operations

This change resolves TypeScript errors about missing type declarations
while maintaining proper type safety for database operations. The
declarations are placed in the interfaces folder to match the project's
type organization.
2025-05-28 06:21:20 +00:00
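A declaration file along the lines this commit describes (e.g. `interfaces/absurd-sql.d.ts`) might contain entries like the sketch below; the exported shapes are pared-down assumptions based on how the modules are used elsewhere in this changeset.

```typescript
declare module "@jlongster/sql.js" {
  const initSqlJs: (options?: {
    locateFile?: (file: string) => string;
  }) => Promise<unknown>;
  export default initSqlJs;
}

declare module "absurd-sql" {
  export class SQLiteFS {
    constructor(fs: unknown, backend: unknown);
  }
}

declare module "absurd-sql/dist/indexeddb-backend" {
  export default class IndexedDBBackend {
    constructor();
  }
}

declare module "absurd-sql/dist/indexeddb-main-thread" {
  export function initBackend(worker: Worker): void;
}
```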
Matthew Raymer
669a66c24c fix(db): improve type safety in AbsurdSqlDatabaseService
- Move external module type declarations to dedicated .d.ts file
- Make SQL.js types compatible with our database interface
- Fix type compatibility between operation queue and database results
- Add proper typing for database operations and results

This change improves type safety by:
1. Properly declaring types for external modules
2. Ensuring database operation results match our interface
3. Making the operation queue type-safe with generics
4. Removing duplicate type definitions

The remaining module resolution warnings can be safely ignored as they
don't affect runtime behavior and our type declarations are working.
2025-05-28 06:07:50 +00:00
Matthew Raymer
13505b539e fix(db): resolve type compatibility in AbsurdSqlDatabaseService
- Make QueuedOperation interface generic to properly handle operation types
- Fix type compatibility between queue operations and their resolvers
- Use explicit typing for operation queue to handle generic types
- Maintain type safety while allowing different operation return types

This fixes a TypeScript error where the operation queue's resolve function
was not properly typed to handle generic return types from database operations.
2025-05-28 06:02:55 +00:00
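A generic operation queue in the spirit of this fix could look like the sketch below; the actual `QueuedOperation` interface in the service is an assumption.

```typescript
type QueuedOperation<T = unknown> = {
  sql: string;
  params: unknown[];
  resolve: (result: T) => void;
  reject: (error: Error) => void;
};

class OperationQueue {
  private queue: QueuedOperation[] = [];

  enqueue<T>(sql: string, params: unknown[]): Promise<T> {
    return new Promise<T>((resolve, reject) => {
      // Stored with the erased element type; each entry keeps its own resolver,
      // so different operations can return different result types.
      this.queue.push({
        sql,
        params,
        resolve: resolve as (result: unknown) => void,
        reject,
      });
    });
  }
}
```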
Matthew Raymer
07ac340733 feat(capacitor): implement storage permission checks for file operations
- Add permission checks before writeFile and writeAndShareFile operations
- Reuse existing checkStoragePermissions method for Android devices
- Maintain iOS-specific handling (early return for iOS permission model)
- Improve error handling and logging for permission-related issues

This change ensures proper storage permission handling on Android devices
while maintaining the existing iOS behavior. The permission checks run
before any file write operations, providing better error handling and
user experience.
2025-05-28 05:56:58 +00:00
Matthew Raymer
ba2b2fc543 fix: placeholder for PyWebViewPlatformService writeAndShareFile 2025-05-28 05:06:31 +00:00
21184e7625 fix spelling of SQLite module for iOS 2025-05-27 21:11:07 -06:00
8d1511e38f convert all remaining DB writes & reads to SQL (with successful registration & claim) 2025-05-27 21:07:24 -06:00
Matthew Raymer
b18112b869 WIP: disabling absurd-sql when using Capacitor SQLite 2025-05-27 13:15:41 +00:00
Matthew Raymer
a228a9b1c0 fix: add requirements for capacitor/sqlite 2025-05-27 12:42:27 +00:00
Matthew Raymer
1560ff0829 feature: fleshed out capacitor and electron database operators 2025-05-27 11:23:52 +00:00
7de4125eb7 add SQL DB access to everywhere we are using the DB, up to the "C" files 2025-05-27 01:27:04 -06:00
Matthew Raymer
81d4f0c762 fix: resolve PWA build issues with SQL.js worker files
- Update worker format to ESM in Vite config to fix IIFE format error

- Increase PWA precache file size limit to 10MB to accommodate SQL.js files

- Fix type declarations for worker configuration

- Add proper type annotations for Vite config

- Add type declarations for absurd-sql module
2025-05-27 06:54:29 +00:00
4c1b4fe651 fix linting 2025-05-26 22:56:00 -06:00
Matthew Raymer
e63541ef53 fix: update ESLint and VS Code settings
- Configure ESLint to ignore node_modules
- Add VS Code settings for Java diagnostics

This fixes the Android build issues and improves the development
environment by properly ignoring node_modules in linting and
diagnostics.
2025-05-27 04:43:52 +00:00
0bfc18c385 add encryption & decryption for the sensitive identity & mnemonic in SQL DB 2025-05-26 22:42:20 -06:00
Matthew Raymer
35f5df6b6b fix: move lexical declarations outside case blocks
fix: missing logger import
Move variable declarations outside switch statement in AbsurdSqlDatabaseService
to fix 'no-case-declarations' linter errors.
2025-05-27 03:21:55 +00:00
Matthew Raymer
0f1ac2b230 fix: move lexical declarations outside case blocks in AbsurdSqlDatabaseService
- Move queryResult and allResult declarations outside switch statement
- Change const declarations to let since they're now in outer scope
- Remove const declarations from inside case blocks

This fixes the 'no-case-declarations' linter errors by ensuring variables
are declared in a scope that encompasses all case blocks, preventing
potential scoping issues.

Note: Type definition errors for external modules remain and should be
addressed separately.
2025-05-27 03:14:02 +00:00
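A simplified before/after illustration of the `no-case-declarations` fix described above (not the actual service code):

```typescript
// Before: `const` inside a case block triggers no-case-declarations.
// switch (op.type) {
//   case "query":
//     const queryResult = db.exec(op.sql); // lint error
//     ...
// }

// After: declare once in the enclosing scope with `let`.
function runOperation(
  op: { type: "query" | "run"; sql: string },
  db: { exec(sql: string): unknown[] },
) {
  let queryResult: unknown[] = [];
  let allResult: unknown[] = [];
  switch (op.type) {
    case "query":
      queryResult = db.exec(op.sql);
      return queryResult;
    case "run":
      allResult = db.exec(op.sql);
      return allResult;
    default:
      return [];
  }
}
```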
3c0bdeaed3 remove debugging console statements 2025-05-26 20:43:06 -06:00
11f2527b04 start adding the SQL approach to files, also using the Dexie approach if desired 2025-05-26 20:26:28 -06:00
5d8175aeeb add encryption for the two SQL columns, replace basic DB utils, add USE_DEXIE_DB flag, and start adding SQL everywhere 2025-05-26 19:03:20 -06:00
b6b95cb0d0 remove unused redirect to start page (now that we're creating an ID up front) 2025-05-26 16:29:10 -06:00
655c5188a4 add queueing to the DB service so that requests get processed in order 2025-05-26 16:28:33 -06:00
8b7451330f remove possibility of failing a migration script and then succeeding on later ones 2025-05-26 15:50:37 -06:00
b8fbc3f7a6 fix console error about "window" unavailable due to service worker 2025-05-26 15:49:44 -06:00
92dadba1cb rename the absurd-sql-specific items for clarity 2025-05-26 14:52:39 -06:00
3a6f585de0 adjust so DB calls go to the factory 2025-05-26 13:59:34 -06:00
2647c5a77d fix migrations logging error 2025-05-25 21:52:27 -06:00
Matt Raymer
682fceb1c6 Merge remote-tracking branch 'refs/remotes/origin/sql-absurd-sql' into sql-absurd-sql 2025-05-25 23:37:43 -04:00
Matt Raymer
e0013008b4 refactor: improve type safety and browser compatibility - Replace any types with SqlValue[] in migration system - Add browser-compatible implementations of Node.js modules (crypto, fs, path) - Update Vite config to handle Node.js module polyfills - Remove outdated migration documentation files 2025-05-25 23:37:08 -04:00
0674d98670 fix BUILDING instructions 2025-05-25 21:29:57 -06:00
Matt Raymer
ee441d1aea refactor(db): improve type safety in migration system
- Replace any[] with SqlValue[] type for SQL parameters in runMigrations
- Update import to use QueryExecResult from interfaces/database
- Add proper typing for SQL parameter values (string | number | null | Uint8Array)

This change improves type safety and helps catch potential SQL parameter
type mismatches at compile time, reducing the risk of runtime errors
or data corruption.
2025-05-25 23:09:53 -04:00
Matt Raymer
75f6e99200 chore: update migration documents and move to new home 2025-05-25 22:50:32 -04:00
Matt Raymer
52c9e57ef4 Merge remote-tracking branch 'refs/remotes/origin/sql-absurd-sql' into sql-absurd-sql 2025-05-25 22:47:36 -04:00
603823d808 add to build instructions for electron on mac 2025-05-25 20:48:51 -06:00
5f24f4975d fix linting 2025-05-25 20:48:33 -06:00
5057d7d07f don't always apply the camera-implementation cursor rules 2025-05-25 20:37:16 -06:00
946e88d903 add an input area for arbitrary SQL on the test page 2025-05-25 20:27:06 -06:00
Matt Raymer
cbfb1ebf57 Merge branch 'new-storage' into sql-absurd-sql 2025-05-25 22:25:56 -04:00
a38934e38d fix problems with race conditions and multiple DatabaseService instances 2025-05-25 19:46:15 -06:00
a3bdcfd168 fix problem with initialization & refactor types 2025-05-25 18:32:41 -06:00
83771caee1 add more to the initial migration, and refactor the locations of types 2025-05-25 17:55:04 -06:00
da35b225cd remove unused setting 2025-05-25 15:49:36 -06:00
8c3920e108 add DB setup with migrations 2025-05-25 11:06:30 -06:00
54f269054f fix error loading WASM file 2025-05-25 07:45:07 -06:00
Matt Raymer
574520d9b3 feat(db): Implement SQLite database layer with migration support
Add SQLite database implementation with comprehensive features:

- Core database functionality:
  - Connection management and pooling
  - Schema creation and validation
  - Transaction support with rollback
  - Backup and restore capabilities
  - Health checks and integrity verification

- Data migration:
  - Migration utilities from Dexie to SQLite
  - Data transformation and validation
  - Migration verification and rollback
  - Backup before migration

- CRUD operations for all entities:
  - Accounts, contacts, and contact methods
  - Settings and secrets
  - Logging and audit trails

- Type safety and error handling:
  - Full TypeScript type definitions
  - Runtime data validation
  - Comprehensive error handling
  - Transaction safety

Note: Requires @wa-sqlite/sql.js package to be installed
2025-05-25 04:52:16 -04:00
6556eb55a3 add the other pieces for the previous commit 2025-05-25 01:18:58 -06:00
634e2bb2fb try absurd-sql, which fails in browser with: SyntaxError: Cannot use import statement outside a module (at registerSQLWorker.js... 2025-05-25 01:06:31 -06:00
170 changed files with 16571 additions and 5230 deletions

View File

@@ -0,0 +1,153 @@
---
description:
globs:
alwaysApply: true
---
# Absurd SQL - Cursor Development Guide
## Project Overview
Absurd SQL is a backend implementation for sql.js that enables persistent SQLite databases in the browser by using IndexedDB as a block storage system. This guide provides rules and best practices for developing with this project in Cursor.
## Project Structure
```
absurd-sql/
├── src/ # Source code
├── dist/ # Built files
├── package.json # Dependencies and scripts
├── rollup.config.js # Build configuration
└── jest.config.js # Test configuration
```
## Development Rules
### 1. Worker Thread Requirements
- All SQL operations MUST be performed in a worker thread
- Main thread should only handle worker initialization and communication
- Never block the main thread with database operations
### 2. Code Organization
- Keep worker code in separate files (e.g., `*.worker.js`)
- Use ES modules for imports/exports
- Follow the project's existing module structure
### 3. Required Headers
When developing locally or deploying, ensure these headers are set:
```
Cross-Origin-Opener-Policy: same-origin
Cross-Origin-Embedder-Policy: require-corp
```
### 4. Browser Compatibility
- Primary target: Modern browsers with SharedArrayBuffer support
- Fallback mode: Safari (with limitations)
- Always test in both modes
### 5. Database Configuration
Recommended database settings:
```sql
PRAGMA journal_mode=MEMORY;
PRAGMA page_size=8192; -- Optional, but recommended
```
### 6. Development Workflow
1. Install dependencies:
```bash
yarn add @jlongster/sql.js absurd-sql
```
2. Development commands:
- `yarn build` - Build the project
- `yarn jest` - Run tests
- `yarn serve` - Start development server
### 7. Testing Guidelines
- Write tests for both SharedArrayBuffer and fallback modes
- Use Jest for testing
- Include performance benchmarks for critical operations
### 8. Performance Considerations
- Use bulk operations when possible
- Monitor read/write performance
- Consider using transactions for multiple operations
- Avoid unnecessary database connections
### 9. Error Handling
- Implement proper error handling for:
- Worker initialization failures
- Database connection issues
- Concurrent access conflicts (in fallback mode)
- Storage quota exceeded scenarios
### 10. Security Best Practices
- Never expose database operations directly to the client
- Validate all SQL queries
- Implement proper access controls
- Handle sensitive data appropriately
### 11. Code Style
- Follow ESLint configuration
- Use async/await for asynchronous operations
- Document complex database operations
- Include comments for non-obvious optimizations
### 12. Debugging
- Use `jest-debug` for debugging tests
- Monitor IndexedDB usage in browser dev tools
- Check worker communication in console
- Use performance monitoring tools
## Common Patterns
### Worker Initialization
```javascript
// Main thread
import { initBackend } from 'absurd-sql/dist/indexeddb-main-thread';
function init() {
let worker = new Worker(new URL('./index.worker.js', import.meta.url));
initBackend(worker);
}
```
### Database Setup
```javascript
// Worker thread
import initSqlJs from '@jlongster/sql.js';
import { SQLiteFS } from 'absurd-sql';
import IndexedDBBackend from 'absurd-sql/dist/indexeddb-backend';
async function setupDatabase() {
let SQL = await initSqlJs({ locateFile: file => file });
let sqlFS = new SQLiteFS(SQL.FS, new IndexedDBBackend());
SQL.register_for_idb(sqlFS);
SQL.FS.mkdir('/sql');
SQL.FS.mount(sqlFS, {}, '/sql');
return new SQL.Database('/sql/db.sqlite', { filename: true });
}
```
## Troubleshooting
### Common Issues
1. SharedArrayBuffer not available
- Check COOP/COEP headers
- Verify browser support
- Test fallback mode
2. Worker initialization failures
- Check file paths
- Verify module imports
- Check browser console for errors
3. Performance issues
- Monitor IndexedDB usage
- Check for unnecessary operations
- Verify transaction usage
## Resources
- [Project Demo](https://priceless-keller-d097e5.netlify.app/)
- [Example Project](https://github.com/jlongster/absurd-example-project)
- [Blog Post](https://jlongster.com/future-sql-web)
- [SQL.js Documentation](https://github.com/sql-js/sql.js/)

View File

@@ -1,7 +1,7 @@
---
description:
globs:
alwaysApply: true
alwaysApply: false
---
# Camera Implementation Documentation

View File

@@ -2,11 +2,12 @@
# iOS doesn't like spaces in the app title.
TIME_SAFARI_APP_TITLE="TimeSafari_Dev"
VITE_APP_SERVER=http://localhost:3000
VITE_APP_SERVER=http://localhost:8080
# This is the claim ID for actions in the BVC project, with the JWT ID on this environment (not production).
VITE_BVC_MEETUPS_PROJECT_CLAIM_ID=https://endorser.ch/entity/01HWE8FWHQ1YGP7GFZYYPS272F
VITE_DEFAULT_ENDORSER_API_SERVER=http://localhost:3000
# Using shared server by default to ease setup, which works for shared test users.
VITE_DEFAULT_IMAGE_API_SERVER=https://test-image-api.timesafari.app
VITE_DEFAULT_PARTNER_API_SERVER=http://localhost:3000
#VITE_DEFAULT_PUSH_SERVER... can't be set up with localhost domain
VITE_PASSKEYS_ENABLED=true

View File

@@ -1,6 +0,0 @@
# Admin DID credentials
ADMIN_DID=did:ethr:0x0000694B58C2cC69658993A90D3840C560f2F51F
ADMIN_PRIVATE_KEY=2b6472c026ec2aa2c4235c994a63868fc9212d18b58f6cbfe861b52e71330f5b
# API Configuration
ENDORSER_API_URL=https://test-api.endorser.ch/api/v2/claim

View File

@@ -9,3 +9,4 @@ VITE_DEFAULT_ENDORSER_API_SERVER=https://api.endorser.ch
VITE_DEFAULT_IMAGE_API_SERVER=https://image-api.timesafari.app
VITE_DEFAULT_PARTNER_API_SERVER=https://partner-api.endorser.ch
VITE_DEFAULT_PUSH_SERVER=https://timesafari.app

View File

@@ -9,4 +9,5 @@ VITE_DEFAULT_ENDORSER_API_SERVER=https://test-api.endorser.ch
VITE_DEFAULT_IMAGE_API_SERVER=https://test-image-api.timesafari.app
VITE_DEFAULT_PARTNER_API_SERVER=https://test-partner-api.endorser.ch
VITE_DEFAULT_PUSH_SERVER=https://test.timesafari.app
VITE_PASSKEYS_ENABLED=true

View File

@@ -4,6 +4,12 @@ module.exports = {
node: true,
es2022: true,
},
ignorePatterns: [
'node_modules/',
'dist/',
'dist-electron/',
'*.d.ts'
],
extends: [
"plugin:vue/vue3-recommended",
"eslint:recommended",

.gitignore (vendored)
View File

@@ -55,3 +55,4 @@ build_logs/
icons
android/app/src/main/res/

View File

@@ -9,19 +9,6 @@ For a quick dev environment setup, use [pkgx](https://pkgx.dev).
- Node.js (LTS version recommended)
- npm (comes with Node.js)
- Git
- For Android builds: Android Studio with SDK installed
- For iOS builds: macOS with Xcode and ruby gems & bundle
- `pkgx +rubygems.org sh`
- ... and you may have to fix these, especially with pkgx
```bash
gem_path=$(which gem)
shortened_path="${gem_path:h:h}"
export GEM_HOME=$shortened_path
export GEM_PATH=$shortened_path
```
- For desktop builds: Additional build tools based on your OS
## Forks
@@ -77,14 +64,14 @@ Install dependencies:
* Commit everything (since the commit hash is used the app).
* Put the commit hash in the changelog (which will help you remember to bump the version later).
* Put the commit hash in the changelog (which will help you remember to bump the version in the step later).
* Tag with the new version, [online](https://gitea.anomalistdesign.com/trent_larson/crowd-funder-for-time-pwa/releases) or `git tag 0.3.55 && git push origin 0.3.55`.
* Tag with the new version, [online](https://gitea.anomalistdesign.com/trent_larson/crowd-funder-for-time-pwa/releases) or `git tag 1.0.0 && git push origin 1.0.0`.
* For test, build the app (because test server is not yet set up to build):
```bash
TIME_SAFARI_APP_TITLE="TimeSafari_Test" VITE_APP_SERVER=https://test.timesafari.app VITE_BVC_MEETUPS_PROJECT_CLAIM_ID=https://endorser.ch/entity/01HWE8FWHQ1YGP7GFZYYPS272F VITE_DEFAULT_ENDORSER_API_SERVER=https://test-api.endorser.ch VITE_DEFAULT_IMAGE_API_SERVER=https://test-image-api.timesafari.app VITE_DEFAULT_PARTNER_API_SERVER=https://test-partner-api.endorser.ch VITE_PASSKEYS_ENABLED=true npm run build
TIME_SAFARI_APP_TITLE="TimeSafari_Test" VITE_APP_SERVER=https://test.timesafari.app VITE_BVC_MEETUPS_PROJECT_CLAIM_ID=https://endorser.ch/entity/01HWE8FWHQ1YGP7GFZYYPS272F VITE_DEFAULT_ENDORSER_API_SERVER=https://test-api.endorser.ch VITE_DEFAULT_IMAGE_API_SERVER=https://test-image-api.timesafari.app VITE_DEFAULT_PARTNER_API_SERVER=https://test-partner-api.endorser.ch VITE_DEFAULT_PUSH_SERVER=https://test.timesafari.app VITE_PASSKEYS_ENABLED=true npm run build:web
```
... and transfer to the test server:
@@ -103,9 +90,9 @@ TIME_SAFARI_APP_TITLE="TimeSafari_Test" VITE_APP_SERVER=https://test.timesafari.
* `pkgx +npm sh`
* `cd crowd-funder-for-time-pwa && git checkout master && git pull && git checkout 0.3.55 && npm install && npm run build && cd -`
* `cd crowd-funder-for-time-pwa && git checkout master && git pull && git checkout 0.5.9 && npm install && npm run build:web && cd -`
(The plain `npm run build` uses the .env.production file.)
(The plain `npm run build:web` uses the .env.production file.)
* Back up the time-safari/dist folder & deploy: `mv time-safari/dist time-safari-dist-prev.0 && mv crowd-funder-for-time-pwa/dist time-safari/`
@@ -241,7 +228,9 @@ docker run -d \
1. Build the electron app in production mode:
```bash
npm run build:electron-prod
npm run build:web
npm run build:electron
npm run electron:build-mac
```
2. Package the Electron app for macOS:
@@ -324,17 +313,33 @@ npm run build:electron-prod && npm run electron:start
Prerequisites: macOS with Xcode installed
1. Build the web assets:
#### First-time iOS Configuration
- Generate certificates inside XCode.
- Right-click on App and under Signing & Capabilities set the Team.
#### Each Release
0. First time (or if dependencies change):
- `pkgx +rubygems.org sh`
- ... and you may have to fix these, especially with pkgx:
```bash
gem_path=$(which gem)
shortened_path="${gem_path:h:h}"
export GEM_HOME=$shortened_path
export GEM_PATH=$shortened_path
```
1. Build the web assets & update ios:
```bash
rm -rf dist
npm run build:web
npm run build:capacitor
```
2. Update iOS project with latest build:
```bash
npx cap sync ios
```
@@ -351,15 +356,14 @@ Prerequisites: macOS with Xcode installed
npx capacitor-assets generate --ios
```
4. Bump the version to match Android:
4. Bump the version to match Android & package.json:
```
cd ios/App
xcrun agvtool new-version 15
xcrun agvtool new-version 33
# Unfortunately this edits Info.plist directly.
#xcrun agvtool new-marketing-version 0.4.5
cat App.xcodeproj/project.pbxproj | sed "s/MARKETING_VERSION = .*;/MARKETING_VERSION = 0.4.5;/g" > temp
mv temp App.xcodeproj/project.pbxproj
cat App.xcodeproj/project.pbxproj | sed "s/MARKETING_VERSION = .*;/MARKETING_VERSION = 0.5.7;/g" > temp && mv temp App.xcodeproj/project.pbxproj
cd -
```
@@ -371,28 +375,25 @@ Prerequisites: macOS with Xcode installed
6. Use Xcode to build and run on simulator or device.
* Select Product -> Destination with some Simulator version. Then click the run arrow.
7. Release
* Under "General" renamed a bunch of things to "Time Safari"
* Choose Product -> Destination -> Build Any iOS
* Someday: Under "General" we want to rename a bunch of things to "Time Safari"
* Choose Product -> Destination -> Any iOS Device
* Choose Product -> Archive
* This will trigger a build and take time, needing user's "login" keychain password which is just their login password, repeatedly.
* This will trigger a build and take time, needing user's "login" keychain password (user's login password), repeatedly.
* If it fails with `building for 'iOS', but linking in dylib (.../.pkgx/zlib.net/v1.3.0/lib/libz.1.3.dylib) built for 'macOS'` then run XCode outside that terminal (ie. not with `npx cap open ios`).
* Click Distribute -> App Store Connect
* In AppStoreConnect, add the build to the distribution: remove the current build with the "-" when you hover over it, then "Add Build" with the new build.
* May have to go to App Review, click Submission, then hover over the build and click "-".
* It can take 15 minutes for the build to show up in the list of builds.
* You'll probably have to "Manage" something about encryption, disallowed in France.
* Then "Save" and "Add to Review" and "Resubmit to App Review".
#### First-time iOS Configuration
- Generate certificates inside XCode.
- Right-click on App and under Signing & Capabilities set the Team.
### Android Build
Prerequisites: Android Studio with SDK installed
Prerequisites: Android Studio with Java SDK installed
1. Build the web assets:
@@ -414,7 +415,7 @@ Prerequisites: Android Studio with SDK installed
npx capacitor-assets generate --android
```
4. Bump version to match iOS: android/app/build.gradle
4. Bump version to match iOS & package.json: android/app/build.gradle
5. Open the project in Android Studio:
@@ -431,7 +432,6 @@ Prerequisites: Android Studio with SDK installed
./gradlew clean
./gradlew build -Dlint.baselines.continue=true
cd -
npx cap run android
```
... or, to create the `aab` file, `bundle` instead of `build`:
@@ -447,7 +447,9 @@ Prerequisites: Android Studio with SDK installed
* Then `bundleRelease`:
```bash
cd android
./gradlew bundleRelease -Dlint.baselines.continue=true
cd -
```
... and find your `aab` file at app/build/outputs/bundle/release
@@ -460,8 +462,10 @@ At play.google.com/console:
- Hit "Next".
- Save, go to "Publishing Overview" as prompted, and click "Send changes for review".
- Note that if you add testers, you have to go to "Publishing Overview" and send those changes or your (closed) testers won't see it.
## First-time Android Configuration for deep links
## Android Configuration for deep links
You must add the following intent filter to the `android/app/src/main/AndroidManifest.xml` file:
@@ -472,4 +476,6 @@ You must add the following intent filter to the `android/app/src/main/AndroidMan
<category android:name="android.intent.category.BROWSABLE" />
<data android:scheme="timesafari" />
</intent-filter>
```
```
... though when we tried that most recently it failed to 'build' the APK with: http(s) scheme and host attribute are missing, but are required for Android App Links [AppLinkUrlError]

View File

@@ -7,6 +7,18 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
## [1.0.0] - 2025.06.20 - 9b69c0b22c7e3ac0584219f5ac434a02bda2e01b
### Added
- Web-oriented migration from IndexedDB to SQLite
## [0.4.7]
### Fixed
- Cameras everywhere
### Changed
- IndexedDB -> SQLite
## [0.4.5] - 2025.02.23
### Added
- Total amounts of gives on project page

View File

@@ -3,6 +3,32 @@
[Time Safari](https://timesafari.org/) allows people to ease into collaboration: start with expressions of gratitude
and expand to crowd-fund with time & money, then record and see the impact of contributions.
## Database Migration Status
**Current Status**: The application is undergoing a migration from Dexie (IndexedDB) to SQLite using absurd-sql. This migration is in **Phase 2** with a well-defined migration fence in place.
### Migration Progress
- ✅ **SQLite Database Service**: Fully implemented with absurd-sql
- ✅ **Platform Service Layer**: Unified database interface across platforms
- ✅ **Settings Migration**: Core user settings transferred
- ✅ **Account Migration**: Identity and key management
- 🔄 **Contact Migration**: User contact data (via import interface)
- 📋 **Code Cleanup**: Remove unused Dexie imports
### Migration Fence
The migration is controlled by a **migration fence** that separates legacy Dexie code from the new SQLite implementation. See [Migration Fence Definition](doc/migration-fence-definition.md) for complete details.
**Key Points**:
- Legacy Dexie database is disabled by default (`USE_DEXIE_DB = false`)
- All database operations go through `PlatformService`
- Migration tools provide controlled access to both databases
- Clear separation between legacy and new code
### Migration Documentation
- [Migration Guide](doc/migration-to-wa-sqlite.md) - Complete migration process
- [Migration Fence Definition](doc/migration-fence-definition.md) - Fence boundaries and rules
- [Database Migration Guide](doc/database-migration-guide.md) - User-facing migration tools
## Roadmap
See [project.task.yaml](project.task.yaml) for current priorities.
@@ -21,16 +47,10 @@ npm run dev
See [BUILDING.md](BUILDING.md) for more details.
## Tests
See [TESTING.md](test-playwright/TESTING.md) for detailed test instructions.
## Icons
Application icons are in the `assets` directory, processed by the `capacitor-assets` command.
@@ -66,6 +86,21 @@ Key principles:
- Common interfaces are shared through `common.ts`
- Type definitions are generated from Zod schemas where possible
### Database Architecture
The application uses a platform-agnostic database layer:
* `src/services/PlatformService.ts` - Database interface definition
* `src/services/PlatformServiceFactory.ts` - Platform-specific service factory
* `src/services/AbsurdSqlDatabaseService.ts` - SQLite implementation
* `src/db/` - Legacy Dexie database (migration in progress)
**Development Guidelines**:
- Always use `PlatformService` for database operations
- Never import Dexie directly in application code
- Test with `USE_DEXIE_DB = false` for new features
- Use migration tools for data transfer between systems
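For illustration, a minimal sketch of that pattern (assuming the `dbQuery`/`dbExec` methods described above; the table and column names here are examples only, not actual app code):
```typescript
// Sketch only: demonstrates the PlatformService access pattern.
import { PlatformServiceFactory } from "../services/PlatformServiceFactory";

export async function renameContact(did: string, name: string): Promise<void> {
  const platformService = PlatformServiceFactory.getInstance();

  // Reads go through dbQuery ...
  const rows = await platformService.dbQuery(
    "SELECT did FROM contacts WHERE did = ?",
    [did],
  );

  // ... and writes go through dbExec; no direct Dexie/IndexedDB access.
  if (rows.length > 0) {
    await platformService.dbExec(
      "UPDATE contacts SET name = ? WHERE did = ?",
      [name, did],
    );
  }
}
```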
### Kudos
Gifts make the world go 'round!

View File

@@ -31,8 +31,8 @@ android {
applicationId "app.timesafari.app"
minSdkVersion rootProject.ext.minSdkVersion
targetSdkVersion rootProject.ext.targetSdkVersion
versionCode 18
versionName "0.4.7"
versionCode 33
versionName "0.5.7"
testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
aaptOptions {
// Files and dirs to omit from the packaged assets dir, modified to accommodate modern web apps.
@@ -91,6 +91,8 @@ dependencies {
implementation "androidx.coordinatorlayout:coordinatorlayout:$androidxCoordinatorLayoutVersion"
implementation "androidx.core:core-splashscreen:$coreSplashScreenVersion"
implementation project(':capacitor-android')
implementation project(':capacitor-community-sqlite')
implementation "androidx.biometric:biometric:1.2.0-alpha05"
testImplementation "junit:junit:$junitVersion"
androidTestImplementation "androidx.test.ext:junit:$androidxJunitVersion"
androidTestImplementation "androidx.test.espresso:espresso-core:$androidxEspressoCoreVersion"

View File

@@ -9,6 +9,7 @@ android {
apply from: "../capacitor-cordova-android-plugins/cordova.variables.gradle"
dependencies {
implementation project(':capacitor-community-sqlite')
implementation project(':capacitor-mlkit-barcode-scanning')
implementation project(':capacitor-app')
implementation project(':capacitor-camera')

View File

@@ -1,5 +1,5 @@
{
"appId": "app.timesafari.app",
"appId": "app.timesafari",
"appName": "TimeSafari",
"webDir": "dist",
"bundledWebRuntime": false,
@@ -16,6 +16,41 @@
}
]
}
},
"SQLite": {
"iosDatabaseLocation": "Library/CapacitorDatabase",
"iosIsEncryption": true,
"iosBiometric": {
"biometricAuth": true,
"biometricTitle": "Biometric login for TimeSafari"
},
"androidIsEncryption": true,
"androidBiometric": {
"biometricAuth": true,
"biometricTitle": "Biometric login for TimeSafari"
}
}
},
"ios": {
"contentInset": "never",
"allowsLinkPreview": true,
"scrollEnabled": true,
"limitsNavigationsToAppBoundDomains": true,
"backgroundColor": "#ffffff",
"allowNavigation": [
"*.timesafari.app",
"*.jsdelivr.net",
"api.endorser.ch"
]
},
"android": {
"allowMixedContent": false,
"captureInput": true,
"webContentsDebuggingEnabled": false,
"allowNavigation": [
"*.timesafari.app",
"*.jsdelivr.net",
"api.endorser.ch"
]
}
}

View File

@@ -1,4 +1,8 @@
[
{
"pkg": "@capacitor-community/sqlite",
"classpath": "com.getcapacitor.community.database.sqlite.CapacitorSQLitePlugin"
},
{
"pkg": "@capacitor-mlkit/barcode-scanning",
"classpath": "io.capawesome.capacitorjs.plugins.mlkit.barcodescanning.BarcodeScannerPlugin"

View File

@@ -1,7 +1,15 @@
package app.timesafari;
import android.os.Bundle;
import com.getcapacitor.BridgeActivity;
//import com.getcapacitor.community.sqlite.SQLite;
public class MainActivity extends BridgeActivity {
// ... existing code ...
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
// Initialize SQLite
//registerPlugin(SQLite.class);
}
}

View File

@@ -1,5 +0,0 @@
package timesafari.app;
import com.getcapacitor.BridgeActivity;
public class MainActivity extends BridgeActivity {}

View File

@@ -2,6 +2,9 @@
include ':capacitor-android'
project(':capacitor-android').projectDir = new File('../node_modules/@capacitor/android/capacitor')
include ':capacitor-community-sqlite'
project(':capacitor-community-sqlite').projectDir = new File('../node_modules/@capacitor-community/sqlite/android')
include ':capacitor-mlkit-barcode-scanning'
project(':capacitor-mlkit-barcode-scanning').projectDir = new File('../node_modules/@capacitor-mlkit/barcode-scanning/android')

View File

@@ -16,6 +16,41 @@
}
]
}
},
"SQLite": {
"iosDatabaseLocation": "Library/CapacitorDatabase",
"iosIsEncryption": true,
"iosBiometric": {
"biometricAuth": true,
"biometricTitle": "Biometric login for TimeSafari"
},
"androidIsEncryption": true,
"androidBiometric": {
"biometricAuth": true,
"biometricTitle": "Biometric login for TimeSafari"
}
}
},
"ios": {
"contentInset": "never",
"allowsLinkPreview": true,
"scrollEnabled": true,
"limitsNavigationsToAppBoundDomains": true,
"backgroundColor": "#ffffff",
"allowNavigation": [
"*.timesafari.app",
"*.jsdelivr.net",
"api.endorser.ch"
]
},
"android": {
"allowMixedContent": false,
"captureInput": true,
"webContentsDebuggingEnabled": false,
"allowNavigation": [
"*.timesafari.app",
"*.jsdelivr.net",
"api.endorser.ch"
]
}
}

View File

@@ -0,0 +1,295 @@
# Database Migration Guide
## Overview
The Database Migration feature allows you to compare and migrate data between Dexie (IndexedDB) and SQLite databases in the TimeSafari application. This is particularly useful during the transition from the old Dexie-based storage system to the new SQLite-based system.
## Features
### 1. Database Comparison
- Compare data between Dexie and SQLite databases
- View detailed differences in contacts and settings
- Identify added, modified, and missing records
- Export comparison results for analysis
### 2. Data Migration
- Migrate contacts from Dexie to SQLite
- Migrate settings from Dexie to SQLite
- Option to overwrite existing records or skip them
- Comprehensive error handling and reporting
### 3. User Interface
- Modern, responsive UI built with Tailwind CSS
- Real-time loading states and progress indicators
- Clear success and error messaging
- Export functionality for comparison data
## Prerequisites
### Enable Dexie Database
Before using the migration features, you must enable the Dexie database by setting:
```typescript
// In constants/app.ts
export const USE_DEXIE_DB = true;
```
**Note**: This should only be enabled temporarily during migration. Remember to set it back to `false` after migration is complete.
## Accessing the Migration Interface
1. Navigate to the **Account** page in the TimeSafari app
2. Scroll down to find the **Database Migration** link
3. Click the link to open the migration interface
## Using the Migration Interface
### Step 1: Compare Databases
1. Click the **"Compare Databases"** button
2. The system will retrieve data from both Dexie and SQLite databases
3. Review the comparison results showing:
- Summary counts for each database
- Detailed differences (added, modified, missing records)
- Specific records that need attention
### Step 2: Review Differences
The comparison results are displayed in several sections:
#### Summary Cards
- **Dexie Contacts**: Number of contacts in Dexie database
- **SQLite Contacts**: Number of contacts in SQLite database
- **Dexie Settings**: Number of settings in Dexie database
- **SQLite Settings**: Number of settings in SQLite database
#### Contact Differences
- **Added**: Contacts in Dexie but not in SQLite
- **Modified**: Contacts that differ between databases
- **Missing**: Contacts in SQLite but not in Dexie
#### Settings Differences
- **Added**: Settings in Dexie but not in SQLite
- **Modified**: Settings that differ between databases
- **Missing**: Settings in SQLite but not in Dexie
### Step 3: Configure Migration Options
Before migrating data, configure the migration options:
- **Overwrite existing records**: When enabled, existing records in SQLite will be updated with data from Dexie. When disabled, existing records will be skipped.
### Step 4: Migrate Data
#### Migrate Contacts
1. Click the **"Migrate Contacts"** button
2. The system will transfer contacts from Dexie to SQLite
3. Review the migration results showing:
- Number of contacts successfully migrated
- Any warnings or errors encountered
#### Migrate Settings
1. Click the **"Migrate Settings"** button
2. The system will transfer settings from Dexie to SQLite
3. Review the migration results showing:
- Number of settings successfully migrated
- Any warnings or errors encountered
### Step 5: Export Comparison (Optional)
1. Click the **"Export Comparison"** button
2. A JSON file will be downloaded containing the complete comparison data
3. This file can be used for analysis or backup purposes
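Under the hood, the export is a plain JSON download in the browser. A minimal sketch of that idea (the app's actual `exportComparison()` method may differ in naming and structure):
```typescript
// Illustrative browser-side JSON export; not the exact app implementation.
function downloadComparisonJson(
  comparison: unknown,
  filename = "database-comparison.json",
): void {
  const blob = new Blob([JSON.stringify(comparison, null, 2)], {
    type: "application/json",
  });
  const url = URL.createObjectURL(blob);

  const link = document.createElement("a");
  link.href = url;
  link.download = filename;
  link.click();

  URL.revokeObjectURL(url); // release the temporary object URL
}
```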
## Migration Process Details
### Contact Migration
The contact migration process:
1. **Retrieves** all contacts from Dexie database
2. **Checks** for existing contacts in SQLite by DID
3. **Inserts** new contacts or **updates** existing ones (if overwrite is enabled)
4. **Handles** complex fields like `contactMethods` (JSON arrays)
5. **Reports** success/failure for each contact
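A simplified sketch of steps 2-4 (the SQLite column names and the `ContactLike` shape below are assumptions for illustration; the real service works against the full `Contact` schema shown under Technical Details):
```typescript
import { PlatformServiceFactory } from "../services/PlatformServiceFactory";

type ContactLike = {
  did: string;
  name: string;
  contactMethods?: unknown[];
};

async function upsertContact(contact: ContactLike, overwrite: boolean): Promise<void> {
  const platform = PlatformServiceFactory.getInstance();

  // Existence check by DID (the primary key for contacts).
  const result = await platform.dbQuery(
    "SELECT did FROM contacts WHERE did = ?",
    [contact.did],
  );
  const exists = (result[0]?.values?.length ?? 0) > 0;

  // Complex fields such as contactMethods are stored as JSON text.
  const contactMethodsJson = JSON.stringify(contact.contactMethods ?? []);

  if (!exists) {
    await platform.dbExec(
      "INSERT INTO contacts (did, name, contactMethods) VALUES (?, ?, ?)",
      [contact.did, contact.name, contactMethodsJson],
    );
  } else if (overwrite) {
    await platform.dbExec(
      "UPDATE contacts SET name = ?, contactMethods = ? WHERE did = ?",
      [contact.name, contactMethodsJson, contact.did],
    );
  }
  // When overwrite is disabled, existing records are skipped.
}
```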
### Settings Migration
The settings migration process:
1. **Retrieves** all settings from Dexie database
2. **Focuses** on key user-facing settings:
- `firstName`
- `isRegistered`
- `profileImageUrl`
- `showShortcutBvc`
- `searchBoxes`
3. **Preserves** other settings in SQLite
4. **Reports** success/failure for each setting
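A hedged sketch of that selective update (the column names are assumed to mirror the Settings fields listed above; the real service may map types and keys differently):
```typescript
import { PlatformServiceFactory } from "../services/PlatformServiceFactory";

type MigratableSettings = {
  accountDid: string;
  firstName?: string;
  isRegistered?: boolean;
  profileImageUrl?: string;
  showShortcutBvc?: boolean;
  searchBoxes?: unknown[];
};

async function migrateSettingFields(s: MigratableSettings): Promise<void> {
  const platform = PlatformServiceFactory.getInstance();
  await platform.dbExec(
    `UPDATE settings
       SET firstName = ?, isRegistered = ?, profileImageUrl = ?,
           showShortcutBvc = ?, searchBoxes = ?
     WHERE accountDid = ?`,
    [
      s.firstName ?? null,
      s.isRegistered ? 1 : 0, // SQLite stores booleans as 0/1
      s.profileImageUrl ?? null,
      s.showShortcutBvc ? 1 : 0,
      JSON.stringify(s.searchBoxes ?? []), // JSON field persisted as text
      s.accountDid,
    ],
  );
}
```
Settings already present in SQLite but not listed above are left untouched, matching step 3.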
## Error Handling
### Common Issues
#### Dexie Database Not Enabled
**Error**: "Dexie database is not enabled"
**Solution**: Set `USE_DEXIE_DB = true` in `constants/app.ts`
#### Database Connection Issues
**Error**: "Failed to retrieve Dexie contacts"
**Solution**: Check that the Dexie database is properly initialized and accessible
#### SQLite Query Errors
**Error**: "Failed to retrieve SQLite contacts"
**Solution**: Verify that the SQLite database is properly set up and the platform service is working
#### Migration Failures
**Error**: "Migration failed: [specific error]"
**Solution**: Review the error details and check data integrity in both databases
### Error Recovery
1. **Review** the error messages carefully
2. **Check** the browser console for additional details
3. **Verify** database connectivity and permissions
4. **Retry** the operation if appropriate
5. **Export** comparison data for manual review if needed
## Best Practices
### Before Migration
1. **Backup** your data if possible
2. **Test** the migration on a small dataset first
3. **Verify** that both databases are accessible
4. **Review** the comparison results before migrating
### During Migration
1. **Don't** interrupt the migration process
2. **Monitor** the progress and error messages
3. **Note** any warnings or skipped records
4. **Export** comparison data for reference
### After Migration
1. **Verify** that data was migrated correctly
2. **Test** the application functionality
3. **Disable** Dexie database (`USE_DEXIE_DB = false`)
4. **Clean up** any temporary files or exports
## Technical Details
### Database Schema
The migration handles the following data structures:
#### Contacts Table
```typescript
interface Contact {
did: string; // Decentralized Identifier
name: string; // Contact name
contactMethods: ContactMethod[]; // Array of contact methods
nextPubKeyHashB64: string; // Next public key hash
notes: string; // Contact notes
profileImageUrl: string; // Profile image URL
publicKeyBase64: string; // Public key in base64
seesMe: boolean; // Visibility flag
registered: boolean; // Registration status
}
```
#### Settings Table
```typescript
interface Settings {
id: number; // Settings ID
accountDid: string; // Account DID
activeDid: string; // Active DID
firstName: string; // User's first name
isRegistered: boolean; // Registration status
profileImageUrl: string; // Profile image URL
showShortcutBvc: boolean; // UI preference
searchBoxes: any[]; // Search configuration
// ... other fields
}
```
### Migration Logic
The migration service uses sophisticated comparison logic:
1. **Primary Key Matching**: Uses DID for contacts, ID for settings
2. **Deep Comparison**: Compares all fields including complex objects
3. **JSON Handling**: Properly handles JSON fields like `contactMethods` and `searchBoxes`
4. **Conflict Resolution**: Provides options for handling existing records
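The JSON handling point deserves a concrete example: a stored JSON string and its parsed form should compare as equal. A simplified sketch follows (the real service compares field by field and handles edge cases more carefully):
```typescript
// Normalize JSON-ish fields before a deep comparison.
function normalizeRecord(record: Record<string, unknown>): Record<string, unknown> {
  const normalized: Record<string, unknown> = { ...record };
  for (const key of ["contactMethods", "searchBoxes"]) {
    const value = normalized[key];
    if (typeof value === "string") {
      try {
        normalized[key] = JSON.parse(value); // stored as TEXT in SQLite
      } catch {
        // Leave malformed JSON as-is; it will simply compare as different.
      }
    }
  }
  return normalized;
}

// Naive deep equality via stringification; sufficient for a sketch,
// but sensitive to key order.
function recordsMatch(a: Record<string, unknown>, b: Record<string, unknown>): boolean {
  return JSON.stringify(normalizeRecord(a)) === JSON.stringify(normalizeRecord(b));
}
```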
### Performance Considerations
- **Batch Processing**: Processes records one by one for reliability
- **Error Isolation**: Individual record failures don't stop the entire migration
- **Memory Management**: Handles large datasets efficiently
- **Progress Reporting**: Provides real-time feedback during migration
## Troubleshooting
### Migration Stuck
If the migration appears to be stuck:
1. **Check** the browser console for errors
2. **Refresh** the page and try again
3. **Verify** database connectivity
4. **Check** for large datasets that might take time
### Incomplete Migration
If migration doesn't complete:
1. **Review** error messages
2. **Check** data integrity in both databases
3. **Export** comparison data for manual review
4. **Consider** migrating in smaller batches
### Data Inconsistencies
If you notice data inconsistencies:
1. **Export** comparison data
2. **Review** the differences carefully
3. **Manually** verify critical records
4. **Consider** selective migration of specific records
## Support
For issues with the Database Migration feature:
1. **Check** this documentation first
2. **Review** the browser console for error details
3. **Export** comparison data for analysis
4. **Contact** the development team with specific error details
## Security Considerations
- **Data Privacy**: Migration data is processed locally and not sent to external servers
- **Access Control**: Only users with access to the account can perform migration
- **Data Integrity**: Migration preserves data integrity and handles conflicts gracefully
- **Audit Trail**: Export functionality provides an audit trail of migration operations
---
**Note**: This migration tool is designed for the transition period between database systems. Once migration is complete and verified, the Dexie database should be disabled to avoid confusion and potential data conflicts.

View File

@@ -0,0 +1,418 @@
# Dexie to absurd-sql Mapping Guide
## Schema Mapping
### Current Dexie Schema
```typescript
// Current Dexie schema
const db = new Dexie('TimeSafariDB');
db.version(1).stores({
accounts: 'did, publicKeyHex, createdAt, updatedAt',
settings: 'key, value, updatedAt',
contacts: 'id, did, name, createdAt, updatedAt'
});
```
### New SQLite Schema
```sql
-- New SQLite schema
CREATE TABLE accounts (
did TEXT PRIMARY KEY,
public_key_hex TEXT NOT NULL,
created_at INTEGER NOT NULL,
updated_at INTEGER NOT NULL
);
CREATE TABLE settings (
key TEXT PRIMARY KEY,
value TEXT NOT NULL,
updated_at INTEGER NOT NULL
);
CREATE TABLE contacts (
id TEXT PRIMARY KEY,
did TEXT NOT NULL,
name TEXT,
created_at INTEGER NOT NULL,
updated_at INTEGER NOT NULL,
FOREIGN KEY (did) REFERENCES accounts(did)
);
-- Indexes for performance
CREATE INDEX idx_accounts_created_at ON accounts(created_at);
CREATE INDEX idx_contacts_did ON contacts(did);
CREATE INDEX idx_settings_updated_at ON settings(updated_at);
```
## Query Mapping
### 1. Account Operations
#### Get Account by DID
```typescript
// Dexie
const account = await db.accounts.get(did);
// absurd-sql
const result = await db.exec(`
SELECT * FROM accounts WHERE did = ?
`, [did]);
const account = result[0]?.values[0];
```
#### Get All Accounts
```typescript
// Dexie
const accounts = await db.accounts.toArray();
// absurd-sql
const result = await db.exec(`
SELECT * FROM accounts ORDER BY created_at DESC
`);
const accounts = result[0]?.values || [];
```
#### Add Account
```typescript
// Dexie
await db.accounts.add({
did,
publicKeyHex,
createdAt: Date.now(),
updatedAt: Date.now()
});
// absurd-sql
await db.run(`
INSERT INTO accounts (did, public_key_hex, created_at, updated_at)
VALUES (?, ?, ?, ?)
`, [did, publicKeyHex, Date.now(), Date.now()]);
```
#### Update Account
```typescript
// Dexie
await db.accounts.update(did, {
publicKeyHex,
updatedAt: Date.now()
});
// absurd-sql
await db.run(`
UPDATE accounts
SET public_key_hex = ?, updated_at = ?
WHERE did = ?
`, [publicKeyHex, Date.now(), did]);
```
### 2. Settings Operations
#### Get Setting
```typescript
// Dexie
const setting = await db.settings.get(key);
// absurd-sql
const result = await db.exec(`
SELECT * FROM settings WHERE key = ?
`, [key]);
const setting = result[0]?.values[0];
```
#### Set Setting
```typescript
// Dexie
await db.settings.put({
key,
value,
updatedAt: Date.now()
});
// absurd-sql
await db.run(`
INSERT INTO settings (key, value, updated_at)
VALUES (?, ?, ?)
ON CONFLICT(key) DO UPDATE SET
value = excluded.value,
updated_at = excluded.updated_at
`, [key, value, Date.now()]);
```
### 3. Contact Operations
#### Get Contacts by Account
```typescript
// Dexie
const contacts = await db.contacts
.where('did')
.equals(accountDid)
.toArray();
// absurd-sql
const result = await db.exec(`
SELECT * FROM contacts
WHERE did = ?
ORDER BY created_at DESC
`, [accountDid]);
const contacts = result[0]?.values || [];
```
#### Add Contact
```typescript
// Dexie
await db.contacts.add({
id: generateId(),
did: accountDid,
name,
createdAt: Date.now(),
updatedAt: Date.now()
});
// absurd-sql
await db.run(`
INSERT INTO contacts (id, did, name, created_at, updated_at)
VALUES (?, ?, ?, ?, ?)
`, [generateId(), accountDid, name, Date.now(), Date.now()]);
```
## Transaction Mapping
### Batch Operations
```typescript
// Dexie
await db.transaction('rw', [db.accounts, db.contacts], async () => {
await db.accounts.add(account);
await db.contacts.bulkAdd(contacts);
});
// absurd-sql
await db.exec('BEGIN TRANSACTION;');
try {
await db.run(`
INSERT INTO accounts (did, public_key_hex, created_at, updated_at)
VALUES (?, ?, ?, ?)
`, [account.did, account.publicKeyHex, account.createdAt, account.updatedAt]);
for (const contact of contacts) {
await db.run(`
INSERT INTO contacts (id, did, name, created_at, updated_at)
VALUES (?, ?, ?, ?, ?)
`, [contact.id, contact.did, contact.name, contact.createdAt, contact.updatedAt]);
}
await db.exec('COMMIT;');
} catch (error) {
await db.exec('ROLLBACK;');
throw error;
}
```
## Migration Helper Functions
### 1. Data Export (Dexie to JSON)
```typescript
async function exportDexieData(): Promise<MigrationData> {
  // Re-declare the schema so the table accessors exist on this connection.
  const db = new Dexie('TimeSafariDB');
  db.version(1).stores({
    accounts: 'did, publicKeyHex, createdAt, updatedAt',
    settings: 'key, value, updatedAt',
    contacts: 'id, did, name, createdAt, updatedAt'
  });

  return {
    accounts: await db.table('accounts').toArray(),
    settings: await db.table('settings').toArray(),
    contacts: await db.table('contacts').toArray(),
    metadata: {
      version: '1.0.0',
      timestamp: Date.now(),
      dexieVersion: Dexie.version
    }
  };
}
```
### 2. Data Import (JSON to absurd-sql)
```typescript
async function importToAbsurdSql(data: MigrationData): Promise<void> {
await db.exec('BEGIN TRANSACTION;');
try {
// Import accounts
for (const account of data.accounts) {
await db.run(`
INSERT INTO accounts (did, public_key_hex, created_at, updated_at)
VALUES (?, ?, ?, ?)
`, [account.did, account.publicKeyHex, account.createdAt, account.updatedAt]);
}
// Import settings
for (const setting of data.settings) {
await db.run(`
INSERT INTO settings (key, value, updated_at)
VALUES (?, ?, ?)
`, [setting.key, setting.value, setting.updatedAt]);
}
// Import contacts
for (const contact of data.contacts) {
await db.run(`
INSERT INTO contacts (id, did, name, created_at, updated_at)
VALUES (?, ?, ?, ?, ?)
`, [contact.id, contact.did, contact.name, contact.createdAt, contact.updatedAt]);
}
await db.exec('COMMIT;');
} catch (error) {
await db.exec('ROLLBACK;');
throw error;
}
}
```
### 3. Verification
```typescript
async function verifyMigration(dexieData: MigrationData): Promise<boolean> {
// Verify account count
const accountResult = await db.exec('SELECT COUNT(*) as count FROM accounts');
const accountCount = accountResult[0].values[0][0];
if (accountCount !== dexieData.accounts.length) {
return false;
}
// Verify settings count
const settingsResult = await db.exec('SELECT COUNT(*) as count FROM settings');
const settingsCount = settingsResult[0].values[0][0];
if (settingsCount !== dexieData.settings.length) {
return false;
}
// Verify contacts count
const contactsResult = await db.exec('SELECT COUNT(*) as count FROM contacts');
const contactsCount = contactsResult[0].values[0][0];
if (contactsCount !== dexieData.contacts.length) {
return false;
}
// Verify data integrity
for (const account of dexieData.accounts) {
const result = await db.exec(
'SELECT * FROM accounts WHERE did = ?',
[account.did]
);
const migratedAccount = result[0]?.values[0];
if (!migratedAccount ||
migratedAccount[1] !== account.publicKeyHex) { // public_key_hex is second column
return false;
}
}
return true;
}
```
## Performance Considerations
### 1. Indexing
- Dexie automatically creates indexes based on the schema
- absurd-sql requires explicit index creation
- Added indexes for frequently queried fields
- Use `PRAGMA journal_mode=MEMORY;` for better performance
### 2. Batch Operations
- Dexie has built-in bulk operations
- absurd-sql uses transactions for batch operations
- Consider chunking large datasets
- Use prepared statements for repeated queries
### 3. Query Optimization
- Dexie uses IndexedDB's native indexing
- absurd-sql requires explicit query optimization
- Use prepared statements for repeated queries
- Consider using `PRAGMA synchronous=NORMAL;` for better performance
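Putting those notes together, a sketch of pragma tuning plus chunked, transactional inserts (using the same async `db` wrapper as the rest of this guide; the chunk size is an arbitrary choice):
```typescript
type ContactRow = {
  id: string;
  did: string;
  name: string;
  createdAt: number;
  updatedAt: number;
};

const CHUNK_SIZE = 500; // arbitrary; tune for your dataset

async function tunePragmas(): Promise<void> {
  await db.exec("PRAGMA journal_mode=MEMORY;");
  await db.exec("PRAGMA synchronous=NORMAL;");
}

async function insertContactsInChunks(contacts: ContactRow[]): Promise<void> {
  for (let i = 0; i < contacts.length; i += CHUNK_SIZE) {
    const chunk = contacts.slice(i, i + CHUNK_SIZE);
    await db.exec("BEGIN TRANSACTION;");
    try {
      for (const c of chunk) {
        await db.run(
          `INSERT INTO contacts (id, did, name, created_at, updated_at)
           VALUES (?, ?, ?, ?, ?)`,
          [c.id, c.did, c.name, c.createdAt, c.updatedAt],
        );
      }
      await db.exec("COMMIT;");
    } catch (error) {
      await db.exec("ROLLBACK;");
      throw error;
    }
  }
}
```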
## Error Handling
### 1. Common Errors
```typescript
// Dexie errors
try {
await db.accounts.add(account);
} catch (error) {
if (error instanceof Dexie.ConstraintError) {
// Handle duplicate key
}
}
// absurd-sql errors
try {
await db.run(`
INSERT INTO accounts (did, public_key_hex, created_at, updated_at)
VALUES (?, ?, ?, ?)
`, [account.did, account.publicKeyHex, account.createdAt, account.updatedAt]);
} catch (error) {
if (error.message.includes('UNIQUE constraint failed')) {
// Handle duplicate key
}
}
```
### 2. Transaction Recovery
```typescript
// Dexie transaction
try {
await db.transaction('rw', db.accounts, async () => {
// Operations
});
} catch (error) {
// Dexie automatically rolls back
}
// absurd-sql transaction
try {
await db.exec('BEGIN TRANSACTION;');
// Operations
await db.exec('COMMIT;');
} catch (error) {
await db.exec('ROLLBACK;');
throw error;
}
```
## Migration Strategy
1. **Preparation**
- Export all Dexie data
- Verify data integrity
- Create SQLite schema
- Setup indexes
2. **Migration**
- Import data in transactions
- Verify each batch
- Handle errors gracefully
- Maintain backup
3. **Verification**
- Compare record counts
- Verify data integrity
- Test common queries
- Validate relationships
4. **Cleanup**
- Remove Dexie database
- Clear IndexedDB storage
- Update application code
- Remove old dependencies

View File

@@ -0,0 +1,272 @@
# Migration Fence Definition: Dexie to SQLite
## Overview
This document defines the **migration fence** - the boundary between the legacy Dexie (IndexedDB) storage system and the new SQLite-based storage system in TimeSafari. The fence ensures controlled migration while maintaining data integrity and application stability.
## Current Migration Status
### ✅ Completed Components
- **SQLite Database Service**: Fully implemented with absurd-sql
- **Platform Service Layer**: Unified database interface across platforms
- **Migration Tools**: Data comparison and transfer utilities
- **Schema Migration**: Complete table structure migration
- **Data Export/Import**: Backup and restore functionality
### 🔄 Active Migration Components
- **Settings Migration**: Core user settings transferred
- **Account Migration**: Identity and key management
- **Contact Migration**: User contact data (via import interface)
### ❌ Legacy Components (Fence Boundary)
- **Dexie Database**: Legacy IndexedDB storage (disabled by default)
- **Dexie-Specific Code**: Direct database access patterns
- **Legacy Migration Paths**: Old data transfer methods
## Migration Fence Definition
### 1. Configuration Boundary
```typescript
// src/constants/app.ts
export const USE_DEXIE_DB = false; // FENCE: Controls legacy database access
```
**Fence Rule**: When `USE_DEXIE_DB = false`:
- All new data operations use SQLite
- Legacy Dexie database is not initialized
- Migration tools are the only path to legacy data
**Fence Rule**: When `USE_DEXIE_DB = true`:
- Legacy database is available for migration
- Dual-write operations may be enabled
- Migration tools can access both databases
### 2. Service Layer Boundary
```typescript
// src/services/PlatformServiceFactory.ts
export class PlatformServiceFactory {
public static getInstance(): PlatformService {
// FENCE: All database operations go through platform service
// No direct Dexie access outside migration tools
}
}
```
**Fence Rule**: All database operations must use:
- `PlatformService.dbQuery()` for read operations
- `PlatformService.dbExec()` for write operations
- No direct `db.` or `accountsDBPromise` access in application code
### 3. Data Access Patterns
#### ✅ Allowed (Inside Fence)
```typescript
// Use platform service for all database operations
const platformService = PlatformServiceFactory.getInstance();
const contacts = await platformService.dbQuery(
"SELECT * FROM contacts WHERE did = ?",
[accountDid]
);
```
#### ❌ Forbidden (Outside Fence)
```typescript
// Direct Dexie access (legacy pattern)
const contacts = await db.contacts.where('did').equals(accountDid).toArray();
// Direct database reference
const result = await accountsDBPromise;
```
### 4. Migration Tool Boundary
```typescript
// src/services/indexedDBMigrationService.ts
// FENCE: Only migration tools can access both databases
export async function compareDatabases(): Promise<DataComparison> {
// This is the ONLY place where both databases are accessed
}
```
**Fence Rule**: Migration tools are the exclusive interface between:
- Legacy Dexie database
- New SQLite database
- Data comparison and transfer operations
## Migration Fence Guidelines
### 1. Code Development Rules
#### New Feature Development
- **Always** use `PlatformService` for database operations
- **Never** import or reference Dexie directly
- **Always** test with `USE_DEXIE_DB = false`
#### Legacy Code Maintenance
- **Only** modify Dexie code for migration purposes
- **Always** add migration tests for schema changes
- **Never** add new Dexie-specific features
### 2. Data Integrity Rules
#### Migration Safety
- **Always** create backups before migration
- **Always** verify data integrity after migration
- **Never** delete legacy data until verified
#### Rollback Strategy
- **Always** maintain ability to rollback to Dexie
- **Always** preserve migration logs
- **Never** assume migration is irreversible
### 3. Testing Requirements
#### Migration Testing
```typescript
// Required test pattern for migration
describe('Database Migration', () => {
it('should migrate data without loss', async () => {
// 1. Enable Dexie
// 2. Create test data
// 3. Run migration
// 4. Verify data integrity
// 5. Disable Dexie
});
});
```
#### Application Testing
```typescript
// Required test pattern for application features
describe('Feature with Database', () => {
it('should work with SQLite only', async () => {
// Test with USE_DEXIE_DB = false
// Verify all operations use PlatformService
});
});
```
## Migration Fence Enforcement
### 1. Static Analysis
#### ESLint Rules
```json
{
"rules": {
"no-restricted-imports": [
"error",
{
"patterns": [
{
"group": ["../db/index"],
"message": "Use PlatformService instead of direct Dexie access"
}
]
}
]
}
}
```
#### TypeScript Rules
```json
{
"compilerOptions": {
"strict": true,
"noImplicitAny": true
}
}
```
### 2. Runtime Checks
#### Development Mode Validation
```typescript
// Development-only fence validation
if (import.meta.env.DEV && USE_DEXIE_DB) {
console.warn('⚠️ Dexie is enabled - migration mode active');
}
```
#### Production Safety
```typescript
// Production fence enforcement
if (import.meta.env.PROD && USE_DEXIE_DB) {
throw new Error('Dexie cannot be enabled in production');
}
```
## Migration Fence Timeline
### Phase 1: Fence Establishment ✅
- [x] Define migration fence boundaries
- [x] Implement PlatformService layer
- [x] Create migration tools
- [x] Set `USE_DEXIE_DB = false` by default
### Phase 2: Data Migration 🔄
- [x] Migrate core settings
- [x] Migrate account data
- [ ] Complete contact migration
- [ ] Verify all data integrity
### Phase 3: Code Cleanup 📋
- [ ] Remove unused Dexie imports
- [ ] Clean up legacy database code
- [ ] Update all documentation
- [ ] Remove migration tools
### Phase 4: Fence Removal 🎯
- [ ] Remove `USE_DEXIE_DB` constant
- [ ] Remove Dexie dependencies
- [ ] Remove migration service
- [ ] Finalize SQLite-only architecture
## Security Considerations
### 1. Data Protection
- **Encryption**: Maintain encryption standards across migration
- **Access Control**: Preserve user privacy during migration
- **Audit Trail**: Log all migration operations
### 2. Error Handling
- **Graceful Degradation**: Handle migration failures gracefully
- **User Communication**: Clear messaging about migration status
- **Recovery Options**: Provide rollback mechanisms
## Performance Considerations
### 1. Migration Performance
- **Batch Operations**: Use transactions for bulk data transfer
- **Progress Indicators**: Show migration progress to users
- **Background Processing**: Non-blocking migration operations
### 2. Application Performance
- **Query Optimization**: Optimize SQLite queries for performance
- **Indexing Strategy**: Maintain proper database indexes
- **Memory Management**: Efficient memory usage during migration
## Documentation Requirements
### 1. Code Documentation
- **Migration Fence Comments**: Document fence boundaries in code
- **API Documentation**: Update all database API documentation
- **Migration Guides**: Comprehensive migration documentation
### 2. User Documentation
- **Migration Instructions**: Clear user migration steps
- **Troubleshooting**: Common migration issues and solutions
- **Rollback Instructions**: How to revert if needed
## Conclusion
The migration fence provides a controlled boundary between legacy and new database systems, ensuring:
- **Data Integrity**: No data loss during migration
- **Application Stability**: Consistent behavior across platforms
- **Development Clarity**: Clear guidelines for code development
- **Migration Safety**: Controlled and reversible migration process
This fence will remain in place until all data is successfully migrated and verified, at which point the legacy system can be safely removed.

View File

@@ -0,0 +1,355 @@
# Database Migration Security Audit Checklist
## Overview
This document provides a comprehensive security audit checklist for the Dexie to SQLite migration in TimeSafari. The checklist ensures that data protection, privacy, and security are maintained throughout the migration process.
## Pre-Migration Security Assessment
### 1. Data Classification and Sensitivity
- [ ] **Data Inventory**
- [ ] Identify all sensitive data types (DIDs, private keys, personal information)
- [ ] Document data retention requirements
- [ ] Map data relationships and dependencies
- [ ] Assess data sensitivity levels (public, internal, confidential, restricted)
- [ ] **Encryption Assessment**
- [ ] Verify current encryption methods for sensitive data
- [ ] Document encryption keys and their management
- [ ] Assess encryption strength and compliance
- [ ] Plan encryption migration strategy
### 2. Access Control Review
- [ ] **User Access Rights**
- [ ] Audit current user permissions and roles
- [ ] Document access control mechanisms
- [ ] Verify principle of least privilege
- [ ] Plan access control migration
- [ ] **System Access**
- [ ] Review database access patterns
- [ ] Document authentication mechanisms
- [ ] Assess session management
- [ ] Plan authentication migration
### 3. Compliance Requirements
- [ ] **Regulatory Compliance**
- [ ] Identify applicable regulations (GDPR, CCPA, etc.)
- [ ] Document data processing requirements
- [ ] Assess privacy impact
- [ ] Plan compliance verification
- [ ] **Industry Standards**
- [ ] Review security standards compliance
- [ ] Document security controls
- [ ] Assess audit requirements
- [ ] Plan standards compliance
## Migration Security Controls
### 1. Data Protection During Migration
- [ ] **Encryption in Transit**
- [ ] Verify all data transfers are encrypted
- [ ] Use secure communication protocols (TLS 1.3+)
- [ ] Implement secure API endpoints
- [ ] Monitor encryption status
- [ ] **Encryption at Rest**
- [ ] Maintain encryption for stored data
- [ ] Verify encryption key management
- [ ] Test encryption/decryption processes
- [ ] Document encryption procedures
### 2. Access Control During Migration
- [ ] **Authentication**
- [ ] Maintain user authentication during migration
- [ ] Verify session management
- [ ] Implement secure token handling
- [ ] Monitor authentication events
- [ ] **Authorization**
- [ ] Preserve user permissions during migration
- [ ] Verify role-based access control
- [ ] Implement audit logging
- [ ] Monitor access patterns
### 3. Data Integrity
- [ ] **Data Validation**
- [ ] Implement input validation for all data
- [ ] Verify data format consistency
- [ ] Test data transformation processes
- [ ] Document validation rules
- [ ] **Data Verification**
- [ ] Implement checksums for data integrity
- [ ] Verify data completeness after migration
- [ ] Test data consistency checks
- [ ] Document verification procedures
## Migration Process Security
### 1. Backup Security
- [ ] **Backup Creation**
- [ ] Create encrypted backups before migration
- [ ] Verify backup integrity
- [ ] Store backups securely
- [ ] Test backup restoration
- [ ] **Backup Access**
- [ ] Limit backup access to authorized personnel
- [ ] Implement backup access logging
- [ ] Verify backup encryption
- [ ] Document backup procedures
### 2. Migration Tool Security
- [ ] **Tool Authentication**
- [ ] Implement secure authentication for migration tools
- [ ] Verify tool access controls
- [ ] Monitor tool usage
- [ ] Document tool security
- [ ] **Tool Validation**
- [ ] Verify migration tool integrity
- [ ] Test tool security features
- [ ] Validate tool outputs
- [ ] Document tool validation
### 3. Error Handling
- [ ] **Error Security**
- [ ] Implement secure error handling
- [ ] Avoid information disclosure in errors
- [ ] Log security-relevant errors
- [ ] Document error procedures
- [ ] **Recovery Security**
- [ ] Implement secure recovery procedures
- [ ] Verify recovery data protection
- [ ] Test recovery processes
- [ ] Document recovery security
## Post-Migration Security
### 1. Data Verification
- [ ] **Data Completeness**
- [ ] Verify all data was migrated successfully
- [ ] Check for data corruption
- [ ] Validate data relationships
- [ ] Document verification results
- [ ] **Data Accuracy**
- [ ] Verify data accuracy after migration
- [ ] Test data consistency
- [ ] Validate data integrity
- [ ] Document accuracy checks
### 2. Access Control Verification
- [ ] **User Access**
- [ ] Verify user access rights after migration
- [ ] Test authentication mechanisms
- [ ] Validate authorization rules
- [ ] Document access verification
- [ ] **System Access**
- [ ] Verify system access controls
- [ ] Test API security
- [ ] Validate session management
- [ ] Document system security
### 3. Security Testing
- [ ] **Penetration Testing**
- [ ] Conduct security penetration testing
- [ ] Test for common vulnerabilities
- [ ] Verify security controls
- [ ] Document test results
- [ ] **Vulnerability Assessment**
- [ ] Scan for security vulnerabilities
- [ ] Assess security posture
- [ ] Identify security gaps
- [ ] Document assessment results
## Monitoring and Logging
### 1. Security Monitoring
- [ ] **Access Monitoring**
- [ ] Monitor database access patterns
- [ ] Track user authentication events
- [ ] Monitor system access
- [ ] Document monitoring procedures
- [ ] **Data Monitoring**
- [ ] Monitor data access patterns
- [ ] Track data modification events
- [ ] Monitor data integrity
- [ ] Document data monitoring
### 2. Security Logging
- [ ] **Audit Logging**
- [ ] Implement comprehensive audit logging
- [ ] Log all security-relevant events
- [ ] Secure log storage and access
- [ ] Document logging procedures
- [ ] **Log Analysis**
- [ ] Implement log analysis tools
- [ ] Monitor for security incidents
- [ ] Analyze security trends
- [ ] Document analysis procedures
## Incident Response
### 1. Security Incident Planning
- [ ] **Incident Response Plan**
- [ ] Develop security incident response plan
- [ ] Define incident response procedures
- [ ] Train incident response team
- [ ] Document response procedures
- [ ] **Incident Detection**
- [ ] Implement incident detection mechanisms
- [ ] Monitor for security incidents
- [ ] Establish incident reporting procedures
- [ ] Document detection procedures
### 2. Recovery Procedures
- [ ] **Data Recovery**
- [ ] Develop data recovery procedures
- [ ] Test recovery processes
- [ ] Verify recovery data integrity
- [ ] Document recovery procedures
- [ ] **System Recovery**
- [ ] Develop system recovery procedures
- [ ] Test system recovery
- [ ] Verify system security after recovery
- [ ] Document recovery procedures
## Compliance Verification
### 1. Regulatory Compliance
- [ ] **Privacy Compliance**
- [ ] Verify GDPR compliance
- [ ] Check CCPA compliance
- [ ] Assess other privacy regulations
- [ ] Document compliance status
- [ ] **Security Compliance**
- [ ] Verify security standard compliance
- [ ] Check industry requirements
- [ ] Assess security certifications
- [ ] Document compliance status
### 2. Audit Requirements
- [ ] **Audit Trail**
- [ ] Maintain comprehensive audit trail
- [ ] Verify audit log integrity
- [ ] Test audit log accessibility
- [ ] Document audit procedures
- [ ] **Audit Reporting**
- [ ] Generate audit reports
- [ ] Verify report accuracy
- [ ] Distribute reports securely
- [ ] Document reporting procedures
## Documentation and Training
### 1. Security Documentation
- [ ] **Security Procedures**
- [ ] Document security procedures
- [ ] Update security policies
- [ ] Create security guidelines
- [ ] Maintain documentation
- [ ] **Security Training**
- [ ] Develop security training materials
- [ ] Train staff on security procedures
- [ ] Verify training effectiveness
- [ ] Document training procedures
### 2. Ongoing Security
- [ ] **Security Maintenance**
- [ ] Establish security maintenance procedures
- [ ] Schedule security updates
- [ ] Monitor security trends
- [ ] Document maintenance procedures
- [ ] **Security Review**
- [ ] Conduct regular security reviews
- [ ] Update security controls
- [ ] Assess security effectiveness
- [ ] Document review procedures
## Risk Assessment
### 1. Risk Identification
- [ ] **Security Risks**
- [ ] Identify potential security risks
- [ ] Assess risk likelihood and impact
- [ ] Prioritize security risks
- [ ] Document risk assessment
- [ ] **Mitigation Strategies**
- [ ] Develop risk mitigation strategies
- [ ] Implement risk controls
- [ ] Monitor risk status
- [ ] Document mitigation procedures
### 2. Risk Monitoring
- [ ] **Risk Tracking**
- [ ] Track identified risks
- [ ] Monitor risk status
- [ ] Update risk assessments
- [ ] Document risk tracking
- [ ] **Risk Reporting**
- [ ] Generate risk reports
- [ ] Distribute risk information
- [ ] Update risk documentation
- [ ] Document reporting procedures
## Conclusion
This security audit checklist ensures that the database migration maintains the highest standards of data protection, privacy, and security. Regular review and updates of this checklist are essential to maintain security throughout the migration process and beyond.
### Security Checklist Summary
- [ ] **Pre-Migration Assessment**: Complete
- [ ] **Migration Controls**: Complete
- [ ] **Process Security**: Complete
- [ ] **Post-Migration Verification**: Complete
- [ ] **Monitoring and Logging**: Complete
- [ ] **Incident Response**: Complete
- [ ] **Compliance Verification**: Complete
- [ ] **Documentation and Training**: Complete
- [ ] **Risk Assessment**: Complete
**Overall Security Status**: [ ] Secure [ ] Needs Attention [ ] Critical Issues
**Next Review Date**: _______________
**Reviewed By**: _______________
**Approved By**: _______________

View File

@@ -0,0 +1,226 @@
# Migration Guide: Dexie to absurd-sql
## Overview
This document outlines the migration process from Dexie.js to absurd-sql for the TimeSafari app's storage implementation. The migration aims to provide a consistent SQLite-based storage solution across all platforms while maintaining data integrity and ensuring a smooth transition for users.
**Current Status**: The migration is in **Phase 2** with a well-defined migration fence in place. Core settings and account data have been migrated, with contact migration in progress. **ActiveDid migration has been implemented** to ensure user identity continuity.
## Migration Goals
1. **Data Integrity**
- Preserve all existing data
- Maintain data relationships
- Ensure data consistency
- **Preserve user's active identity**
2. **Performance**
- Improve query performance
- Reduce storage overhead
- Optimize for platform-specific capabilities
3. **User Experience**
- Seamless transition with no data loss
- Maintain user's active identity and preferences
- Preserve application state
## Migration Architecture
### Migration Fence
The migration fence is defined by the `USE_DEXIE_DB` constant in `src/constants/app.ts`:
- `USE_DEXIE_DB = false` (default): Uses SQLite database
- `USE_DEXIE_DB = true`: Uses Dexie database (for migration purposes)
### Migration Order
The migration follows a specific order to maintain data integrity:
1. **Accounts** (foundational - contains DIDs)
2. **Settings** (references accountDid, activeDid)
3. **ActiveDid** (depends on accounts and settings) ⭐ **NEW**
4. **Contacts** (independent, but migrated after accounts for consistency)
## ActiveDid Migration ⭐ **NEW FEATURE**
### Problem Solved
Previously, the `activeDid` setting was not migrated from Dexie to SQLite, causing users to lose their active identity after migration.
### Solution Implemented
The migration now includes a dedicated step for migrating the `activeDid`:
1. **Detection**: Identifies the `activeDid` from Dexie master settings
2. **Validation**: Verifies the `activeDid` exists in SQLite accounts
3. **Migration**: Updates SQLite master settings with the `activeDid`
4. **Error Handling**: Graceful handling of missing accounts
### Implementation Details
#### New Function: `migrateActiveDid()`
```typescript
export async function migrateActiveDid(): Promise<MigrationResult> {
// 1. Get Dexie settings to find the activeDid
const dexieSettings = await getDexieSettings();
const masterSettings = dexieSettings.find(setting => !setting.accountDid);
const dexieActiveDid = masterSettings?.activeDid;
// 2. Verify the activeDid exists in SQLite accounts
const accountExists = await platformService.dbQuery(
"SELECT did FROM accounts WHERE did = ?",
[dexieActiveDid],
);
// 3. Update SQLite master settings
await updateDefaultSettings({ activeDid: dexieActiveDid });
}
```
#### Enhanced `migrateSettings()` Function
The settings migration now includes activeDid handling:
- Extracts `activeDid` from Dexie master settings
- Validates account existence in SQLite
- Updates SQLite master settings with the `activeDid`
#### Updated `migrateAll()` Function
The complete migration now includes a dedicated step for activeDid:
```typescript
// Step 3: Migrate ActiveDid (depends on accounts and settings)
logger.info("[MigrationService] Step 3: Migrating activeDid...");
const activeDidResult = await migrateActiveDid();
```
### Benefits
- ✅ **User Identity Preservation**: Users maintain their active identity
- ✅ **Seamless Experience**: No need to manually select identity after migration
- ✅ **Data Consistency**: Ensures all identity-related settings are preserved
- ✅ **Error Resilience**: Graceful handling of edge cases
## Migration Process
### Phase 1: Preparation ✅
- [x] Enable Dexie database access
- [x] Implement data comparison tools
- [x] Create migration service structure
### Phase 2: Core Migration ✅
- [x] Account migration with `importFromMnemonic`
- [x] Settings migration (excluding activeDid)
- [x] **ActiveDid migration** ✅ **COMPLETED**
- [x] Contact migration framework
### Phase 3: Validation and Cleanup 🔄
- [ ] Comprehensive data validation
- [ ] Performance testing
- [ ] User acceptance testing
- [ ] Dexie removal
## Usage
### Manual Migration
```typescript
import { migrateAll, migrateActiveDid } from '../services/indexedDBMigrationService';
// Complete migration
const result = await migrateAll();
// Or migrate just the activeDid
const activeDidResult = await migrateActiveDid();
```
### Migration Verification
```typescript
import { compareDatabases } from '../services/indexedDBMigrationService';
const comparison = await compareDatabases();
console.log('Migration differences:', comparison.differences);
```
## Error Handling
### ActiveDid Migration Errors
- **Missing Account**: If the `activeDid` from Dexie doesn't exist in SQLite accounts
- **Database Errors**: Connection or query failures
- **Settings Update Failures**: Issues updating SQLite master settings
### Recovery Strategies
1. **Automatic Recovery**: Migration continues even if activeDid migration fails
2. **Manual Recovery**: Users can manually select their identity after migration
3. **Fallback**: System creates new identity if none exists
## Security Considerations
### Data Protection
- All sensitive data (mnemonics, private keys) are encrypted
- Migration preserves encryption standards
- No plaintext data exposure during migration
### Identity Verification
- ActiveDid migration validates account existence
- Prevents setting non-existent identities as active
- Maintains cryptographic integrity
## Testing
### Migration Testing
```bash
# Enable Dexie for testing
# Set USE_DEXIE_DB = true in constants/app.ts
# Run migration
npm run migrate
# Verify results
npm run test:migration
```
### ActiveDid Testing
```typescript
// Test activeDid migration specifically
const result = await migrateActiveDid();
expect(result.success).toBe(true);
expect(result.warnings).toContain('Successfully migrated activeDid');
```
## Troubleshooting
### Common Issues
1. **ActiveDid Not Found**
- Ensure accounts were migrated before activeDid migration
- Check that the Dexie activeDid exists in SQLite accounts
2. **Migration Failures**
- Verify Dexie database is accessible
- Check SQLite database permissions
- Review migration logs for specific errors
3. **Data Inconsistencies**
- Use `compareDatabases()` to identify differences
- Re-run migration if necessary
- Check for duplicate or conflicting records
### Debugging
```typescript
// Enable detailed logging
logger.setLevel('debug');
// Check migration status
const comparison = await compareDatabases();
console.log('Settings differences:', comparison.differences.settings);
```
## Future Enhancements
### Planned Improvements
1. **Batch Processing**: Optimize for large datasets
2. **Incremental Migration**: Support partial migrations
3. **Rollback Capability**: Ability to revert migration
4. **Progress Tracking**: Real-time migration progress
### Performance Optimizations
1. **Parallel Processing**: Migrate independent data concurrently
2. **Memory Management**: Optimize for large datasets
3. **Transaction Batching**: Reduce database round trips
## Conclusion
The Dexie to SQLite migration provides a robust, secure, and user-friendly transition path. The addition of activeDid migration ensures that users maintain their identity continuity throughout the migration process, significantly improving the user experience.
The migration fence architecture allows for controlled, reversible migration while maintaining application stability and data integrity.

View File

@@ -0,0 +1,339 @@
# Secure Storage Implementation Guide for TimeSafari App
## Overview
This document outlines the implementation of secure storage for the TimeSafari app. The implementation focuses on:
1. **Platform-Specific Storage Solutions**:
- Web: SQLite with IndexedDB backend (absurd-sql)
- Electron: SQLite with Node.js backend
- Native: (Planned) SQLCipher with platform-specific secure storage
2. **Key Features**:
- SQLite-based storage using absurd-sql for web
- Platform-specific service factory pattern
- Consistent API across platforms
- Migration support from Dexie.js
## Quick Start
### 1. Installation
```bash
# Core dependencies
npm install @jlongster/sql.js
npm install absurd-sql
# Platform-specific dependencies (for future native support)
npm install @capacitor/preferences
npm install @capacitor-community/biometric-auth
```
### 2. Basic Usage
```typescript
// Using the platform service
import { PlatformServiceFactory } from '../services/PlatformServiceFactory';
// Get platform-specific service instance
const platformService = PlatformServiceFactory.getInstance();
// Example database operations
async function example() {
try {
// Query example
const result = await platformService.dbQuery(
"SELECT * FROM accounts WHERE did = ?",
[did]
);
// Execute example
await platformService.dbExec(
"INSERT INTO accounts (did, public_key_hex) VALUES (?, ?)",
[did, publicKeyHex]
);
} catch (error) {
console.error('Database operation failed:', error);
}
}
```
### 3. Platform Detection
```typescript
// src/services/PlatformServiceFactory.ts
export class PlatformServiceFactory {
static getInstance(): PlatformService {
if (process.env.ELECTRON) {
// Electron platform
return new ElectronPlatformService();
} else {
// Web platform (default)
return new AbsurdSqlDatabaseService();
}
}
}
```
### 4. Current Implementation Details
#### Web Platform (AbsurdSqlDatabaseService)
The web platform uses absurd-sql with IndexedDB backend:
```typescript
// src/services/AbsurdSqlDatabaseService.ts
export class AbsurdSqlDatabaseService implements PlatformService {
private static instance: AbsurdSqlDatabaseService | null = null;
private db: AbsurdSqlDatabase | null = null;
private initialized: boolean = false;
// Singleton pattern
static getInstance(): AbsurdSqlDatabaseService {
if (!AbsurdSqlDatabaseService.instance) {
AbsurdSqlDatabaseService.instance = new AbsurdSqlDatabaseService();
}
return AbsurdSqlDatabaseService.instance;
}
// Database operations
async dbQuery(sql: string, params: unknown[] = []): Promise<QueryExecResult[]> {
await this.waitForInitialization();
return this.queueOperation<QueryExecResult[]>("query", sql, params);
}
async dbExec(sql: string, params: unknown[] = []): Promise<void> {
await this.waitForInitialization();
await this.queueOperation<void>("run", sql, params);
}
}
```
Key features:
- Uses absurd-sql for SQLite in the browser
- Implements operation queuing for thread safety (sketched below)
- Handles initialization and connection management
- Provides consistent API across platforms
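The operation queuing mentioned above can be pictured roughly as follows. This is an illustrative sketch, not the actual `AbsurdSqlDatabaseService` internals; the names `QueuedOperation`, `enqueue`, and `drain` are assumptions.
```typescript
interface QueuedOperation {
  type: "query" | "run";
  sql: string;
  params: unknown[];
  resolve: (value: any) => void;
  reject: (reason: unknown) => void;
}

class OperationQueue {
  private queue: QueuedOperation[] = [];
  private draining = false;

  // Each caller gets a promise that settles when its operation completes
  enqueue<T>(type: "query" | "run", sql: string, params: unknown[]): Promise<T> {
    return new Promise<T>((resolve, reject) => {
      this.queue.push({ type, sql, params, resolve, reject });
      void this.drain();
    });
  }

  // Run queued operations one at a time so statements never interleave
  private async drain(): Promise<void> {
    if (this.draining) return;
    this.draining = true;
    while (this.queue.length > 0) {
      const op = this.queue.shift()!;
      try {
        op.resolve(await this.run(op));
      } catch (err) {
        op.reject(err);
      }
    }
    this.draining = false;
  }

  private async run(op: QueuedOperation): Promise<unknown> {
    // Placeholder: the real service delegates to the underlying absurd-sql database
    throw new Error("not implemented in this sketch");
  }
}
```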
### 5. Migration from Dexie.js
The current implementation supports gradual migration from Dexie.js:
```typescript
// Example of dual-storage pattern
async function getAccount(did: string): Promise<Account | undefined> {
// Try SQLite first
const platform = PlatformServiceFactory.getInstance();
let account = await platform.dbQuery(
"SELECT * FROM accounts WHERE did = ?",
[did]
);
// Map raw query results into plain objects, as recommended below
account = databaseUtil.mapQueryResultToValues(account);
// Fallback to Dexie if needed
if (USE_DEXIE_DB) {
account = await db.accounts.get(did);
}
return account;
}
```
#### A. Modifying Code
When converting from Dexie.js to the SQL-based implementation, follow these patterns:
1. **Database Access Pattern**
```typescript
// Before (Dexie)
const result = await db.table.where("field").equals(value).first();
// After (SQL)
const platform = PlatformServiceFactory.getInstance();
let result = await platform.dbQuery(
"SELECT * FROM table WHERE field = ?",
[value]
);
result = databaseUtil.mapQueryResultToValues(result);
// Fallback to Dexie if needed
if (USE_DEXIE_DB) {
result = await db.table.where("field").equals(value).first();
}
```
2. **Update Operations**
```typescript
// Before (Dexie)
await db.table.where("id").equals(id).modify(changes);
// After (SQL)
// For settings updates, use the utility methods:
await databaseUtil.updateDefaultSettings(changes);
// OR
await databaseUtil.updateAccountSettings(did, changes);
// For other tables, use direct SQL:
const platform = PlatformServiceFactory.getInstance();
await platform.dbExec(
"UPDATE table SET field1 = ?, field2 = ? WHERE id = ?",
[changes.field1, changes.field2, id]
);
// Fallback to Dexie if needed
if (USE_DEXIE_DB) {
await db.table.where("id").equals(id).modify(changes);
}
```
3. **Insert Operations**
```typescript
// Before (Dexie)
await db.table.add(item);
// After (SQL)
const platform = PlatformServiceFactory.getInstance();
const columns = Object.keys(item);
const values = Object.values(item);
const placeholders = values.map(() => '?').join(', ');
const sql = `INSERT INTO table (${columns.join(', ')}) VALUES (${placeholders})`;
await platform.dbExec(sql, values);
// Fallback to Dexie if needed
if (USE_DEXIE_DB) {
await db.table.add(item);
}
```
4. **Delete Operations**
```typescript
// Before (Dexie)
await db.table.where("id").equals(id).delete();
// After (SQL)
const platform = PlatformServiceFactory.getInstance();
await platform.dbExec("DELETE FROM table WHERE id = ?", [id]);
// Fallback to Dexie if needed
if (USE_DEXIE_DB) {
await db.table.where("id").equals(id).delete();
}
```
5. **Result Processing**
```typescript
// Before (Dexie)
const items = await db.table.toArray();
// After (SQL)
const platform = PlatformServiceFactory.getInstance();
let items = await platform.dbQuery("SELECT * FROM table");
items = databaseUtil.mapQueryResultToValues(items);
// Fallback to Dexie if needed
if (USE_DEXIE_DB) {
items = await db.table.toArray();
}
```
6. **Using Utility Methods**
When working with settings or other common operations, use the utility methods in `db/databaseUtil.ts`:
```typescript
// Settings operations
await databaseUtil.updateDefaultSettings(settings);
await databaseUtil.updateAccountSettings(did, settings);
const settings = await databaseUtil.retrieveSettingsForDefaultAccount();
const settings = await databaseUtil.retrieveSettingsForActiveAccount();
// Logging operations
await databaseUtil.logToDb(message);
await databaseUtil.logConsoleAndDb(message, showInConsole);
```
Key Considerations:
- Always use `databaseUtil.mapQueryResultToValues()` to process SQL query results (a sketch of its assumed behavior follows this list)
- Use utility methods from `db/databaseUtil.ts` when available instead of direct SQL
- Keep Dexie fallbacks wrapped in `if (USE_DEXIE_DB)` checks
- For queries that return results, use `let` variables to allow Dexie fallback to override
- For updates/inserts/deletes, execute both SQL and Dexie operations when `USE_DEXIE_DB` is true
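`databaseUtil.mapQueryResultToValues()` is assumed here to convert a `QueryExecResult` (parallel `columns`/`values` arrays) into an array of plain objects. A sketch of that assumed behavior, for reference only:
```typescript
// Assumed behavior only; the real helper in db/databaseUtil.ts may differ.
function mapQueryResultToValues(
  results: QueryExecResult[],
): Array<Record<string, unknown>> {
  if (results.length === 0) return [];
  const { columns, values } = results[0];
  return values.map((row) =>
    Object.fromEntries(columns.map((col, i) => [col, row[i]])),
  );
}
```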
Example Migration:
```typescript
// Before (Dexie)
export async function updateSettings(settings: Settings): Promise<void> {
await db.settings.put(settings);
}
// After (SQL)
export async function updateSettings(settings: Settings): Promise<void> {
const platform = PlatformServiceFactory.getInstance();
const { sql, params } = generateUpdateStatement(
settings,
"settings",
"id = ?",
[settings.id]
);
await platform.dbExec(sql, params);
}
```
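`generateUpdateStatement` is used above but not shown in this guide. A minimal sketch of what such a helper might look like, assuming it builds a parameterized `UPDATE` from a partial record (the real implementation may differ, e.g. by mapping camelCase fields to snake_case columns):
```typescript
// Hypothetical helper: builds "UPDATE <table> SET a = ?, b = ? WHERE <clause>"
// plus the matching parameter array.
function generateUpdateStatement(
  record: Record<string, unknown>,
  table: string,
  whereClause: string,
  whereParams: unknown[],
): { sql: string; params: unknown[] } {
  const columns = Object.keys(record);
  const setClause = columns.map((column) => `${column} = ?`).join(", ");
  return {
    sql: `UPDATE ${table} SET ${setClause} WHERE ${whereClause}`,
    params: [...Object.values(record), ...whereParams],
  };
}
```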
Remember to:
- Write new database access code against the platform service, and place it ahead of the existing Dexie version
- Keep the Dexie-specific code rather than removing it
- For creates, updates, and deletes, duplicating the operation in both stores is fine
- For queries whose results are used, assign the SQL result to a `let` variable, wrap the Dexie code in a check of `USE_DEXIE_DB` (from `constants/app.ts`), and use the Dexie result instead when it is true
- Consider data migration needs, and warn about any potential migration problems
## Success Criteria
1. **Functionality**
- [x] Basic CRUD operations work correctly
- [x] Platform service factory pattern implemented
- [x] Error handling in place
- [ ] Native platform support (planned)
2. **Performance**
- [x] Database operations complete within acceptable time
- [x] Operation queuing for thread safety
- [x] Proper initialization handling
- [ ] Performance monitoring (planned)
3. **Security**
- [x] Basic data integrity
- [ ] Encryption (planned for native platforms)
- [ ] Secure key storage (planned)
- [ ] Platform-specific security features (planned)
4. **Testing**
- [x] Basic unit tests
- [ ] Comprehensive integration tests (planned)
- [ ] Platform-specific tests (planned)
- [ ] Migration tests (planned)
## Next Steps
1. **Native Platform Support**
- Implement SQLCipher for iOS/Android
- Add platform-specific secure storage
- Implement biometric authentication
2. **Enhanced Security**
- Add encryption for sensitive data
- Implement secure key storage
- Add platform-specific security features
3. **Testing and Monitoring**
- Add comprehensive test coverage
- Implement performance monitoring
- Add error tracking and analytics
4. **Documentation**
- Add API documentation
- Create migration guides
- Document security measures

View File

@@ -0,0 +1,329 @@
# Storage Implementation Checklist
## Core Services
### 1. Storage Service Layer
- [x] Create base `PlatformService` interface
- [x] Define common methods for all platforms
- [x] Add platform-specific method signatures
- [x] Include error handling types
- [x] Add migration support methods
- [x] Implement platform-specific services
- [x] `AbsurdSqlDatabaseService` (web)
- [x] Database initialization
- [x] VFS setup with IndexedDB backend
- [x] Connection management
- [x] Operation queuing
- [ ] `NativeSQLiteService` (iOS/Android) (planned)
- [ ] SQLCipher integration
- [ ] Native bridge setup
- [ ] File system access
- [ ] `ElectronSQLiteService` (planned)
- [ ] Node SQLite integration
- [ ] IPC communication
- [ ] File system access
### 2. Migration Services
- [x] Implement basic migration support
- [x] Dual-storage pattern (SQLite + Dexie)
- [x] Basic data verification
- [ ] Rollback procedures (planned)
- [ ] Progress tracking (planned)
- [ ] Create `MigrationUI` components (planned)
- [ ] Progress indicators
- [ ] Error handling
- [ ] User notifications
- [ ] Manual triggers
### 3. Security Layer
- [x] Basic data integrity
- [ ] Implement `EncryptionService` (planned)
- [ ] Key management
- [ ] Encryption/decryption
- [ ] Secure storage
- [ ] Add `BiometricService` (planned)
- [ ] Platform detection
- [ ] Authentication flow
- [ ] Fallback mechanisms
## Platform-Specific Implementation
### Web Platform
- [x] Setup absurd-sql
- [x] Install dependencies
```json
{
"@jlongster/sql.js": "^1.8.0",
"absurd-sql": "^1.8.0"
}
```
- [x] Configure VFS with IndexedDB backend (see the worker sketch after this section)
- [x] Setup worker threads
- [x] Implement operation queuing
- [x] Configure database pragmas
```sql
PRAGMA journal_mode=MEMORY;
PRAGMA synchronous=NORMAL;
PRAGMA foreign_keys=ON;
PRAGMA busy_timeout=5000;
```
- [x] Update build configuration
- [x] Modify `vite.config.ts`
- [x] Add worker configuration
- [x] Update chunk splitting
- [x] Configure asset handling
- [x] Implement IndexedDB backend
- [x] Create database service
- [x] Add operation queuing
- [x] Handle initialization
- [x] Implement atomic operations
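For reference, the VFS and worker setup checked off above follows the published absurd-sql pattern. The sketch below is illustrative: the WASM path assumes the `scripts/copy-wasm.js` output, and the database filename and service wiring are assumptions rather than the exact `AbsurdSqlDatabaseService` code.
```typescript
// Worker-side initialization, adapted from the absurd-sql example.
import initSqlJs from "@jlongster/sql.js";
import { SQLiteFS } from "absurd-sql";
import IndexedDBBackend from "absurd-sql/dist/indexeddb-backend";

async function initDatabase() {
  const SQL = await initSqlJs({ locateFile: (file: string) => `/wasm/${file}` });
  // Mount an IndexedDB-backed virtual filesystem for SQLite
  const sqlFS = new SQLiteFS(SQL.FS, new IndexedDBBackend());
  SQL.register_for_idb(sqlFS);
  SQL.FS.mkdir("/sql");
  SQL.FS.mount(sqlFS, {}, "/sql");

  const db = new SQL.Database("/sql/timesafari.sqlite", { filename: true });
  // Pragmas from the checklist above
  db.exec(`
    PRAGMA journal_mode=MEMORY;
    PRAGMA synchronous=NORMAL;
    PRAGMA foreign_keys=ON;
    PRAGMA busy_timeout=5000;
  `);
  return db;
}
```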
### iOS Platform (Planned)
- [ ] Setup SQLCipher
- [ ] Install pod dependencies
- [ ] Configure encryption
- [ ] Setup keychain access
- [ ] Implement secure storage
- [ ] Update Capacitor config
- [ ] Modify `capacitor.config.ts`
- [ ] Add iOS permissions
- [ ] Configure backup
- [ ] Setup app groups
### Android Platform (Planned)
- [ ] Setup SQLCipher
- [ ] Add Gradle dependencies
- [ ] Configure encryption
- [ ] Setup keystore
- [ ] Implement secure storage
- [ ] Update Capacitor config
- [ ] Modify `capacitor.config.ts`
- [ ] Add Android permissions
- [ ] Configure backup
- [ ] Setup file provider
### Electron Platform (Planned)
- [ ] Setup Node SQLite
- [ ] Install dependencies
- [ ] Configure IPC
- [ ] Setup file system access
- [ ] Implement secure storage
- [ ] Update Electron config
- [ ] Modify `electron.config.ts`
- [ ] Add security policies
- [ ] Configure file access
- [ ] Setup auto-updates
## Data Models and Types
### 1. Database Schema
- [x] Define tables
```sql
-- Accounts table
CREATE TABLE accounts (
did TEXT PRIMARY KEY,
public_key_hex TEXT NOT NULL,
created_at INTEGER NOT NULL,
updated_at INTEGER NOT NULL
);
-- Settings table
CREATE TABLE settings (
key TEXT PRIMARY KEY,
value TEXT NOT NULL,
updated_at INTEGER NOT NULL
);
-- Contacts table
CREATE TABLE contacts (
id TEXT PRIMARY KEY,
did TEXT NOT NULL,
name TEXT,
created_at INTEGER NOT NULL,
updated_at INTEGER NOT NULL,
FOREIGN KEY (did) REFERENCES accounts(did)
);
-- Indexes for performance
CREATE INDEX idx_accounts_created_at ON accounts(created_at);
CREATE INDEX idx_contacts_did ON contacts(did);
CREATE INDEX idx_settings_updated_at ON settings(updated_at);
```
- [x] Create indexes
- [x] Define constraints
- [ ] Add triggers (planned)
- [ ] Setup migrations (planned)
### 2. Type Definitions
- [x] Create interfaces
```typescript
interface Account {
did: string;
publicKeyHex: string;
createdAt: number;
updatedAt: number;
}
interface Setting {
key: string;
value: string;
updatedAt: number;
}
interface Contact {
id: string;
did: string;
name?: string;
createdAt: number;
updatedAt: number;
}
```
- [x] Add validation
- [x] Create DTOs
- [x] Define enums
- [x] Add type guards
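As an example of the "type guards" item above, a guard for the `Contact` interface might look like this (illustrative only):
```typescript
// Hypothetical type guard for the Contact interface defined above.
function isContact(value: unknown): value is Contact {
  if (typeof value !== "object" || value === null) return false;
  const candidate = value as Record<string, unknown>;
  return (
    typeof candidate.id === "string" &&
    typeof candidate.did === "string" &&
    (candidate.name === undefined || typeof candidate.name === "string") &&
    typeof candidate.createdAt === "number" &&
    typeof candidate.updatedAt === "number"
  );
}
```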
## UI Components
### 1. Migration UI (Planned)
- [ ] Create components
- [ ] `MigrationProgress.vue`
- [ ] `MigrationError.vue`
- [ ] `MigrationSettings.vue`
- [ ] `MigrationStatus.vue`
### 2. Settings UI (Planned)
- [ ] Update components
- [ ] Add storage settings
- [ ] Add migration controls
- [ ] Add backup options
- [ ] Add security settings
### 3. Error Handling UI (Planned)
- [ ] Create components
- [ ] `StorageError.vue`
- [ ] `QuotaExceeded.vue`
- [ ] `MigrationFailed.vue`
- [ ] `RecoveryOptions.vue`
## Testing
### 1. Unit Tests
- [x] Basic service tests
- [x] Platform service tests
- [x] Database operation tests
- [ ] Security service tests (planned)
- [ ] Platform detection tests (planned)
### 2. Integration Tests (Planned)
- [ ] Test migrations
- [ ] Web platform tests
- [ ] iOS platform tests
- [ ] Android platform tests
- [ ] Electron platform tests
### 3. E2E Tests (Planned)
- [ ] Test workflows
- [ ] Account management
- [ ] Settings management
- [ ] Contact management
- [ ] Migration process
## Documentation
### 1. Technical Documentation
- [x] Update architecture docs
- [x] Add API documentation
- [ ] Create migration guides (planned)
- [ ] Document security measures (planned)
### 2. User Documentation (Planned)
- [ ] Update user guides
- [ ] Add troubleshooting guides
- [ ] Create FAQ
- [ ] Document new features
## Deployment
### 1. Build Process
- [x] Update build scripts
- [x] Add platform-specific builds
- [ ] Configure CI/CD (planned)
- [ ] Setup automated testing (planned)
### 2. Release Process (Planned)
- [ ] Create release checklist
- [ ] Add version management
- [ ] Setup rollback procedures
- [ ] Configure monitoring
## Monitoring and Analytics (Planned)
### 1. Error Tracking
- [ ] Setup error logging
- [ ] Add performance monitoring
- [ ] Configure alerts
- [ ] Create dashboards
### 2. Usage Analytics
- [ ] Add storage metrics
- [ ] Track migration success
- [ ] Monitor performance
- [ ] Collect user feedback
## Security Audit (Planned)
### 1. Code Review
- [ ] Review encryption
- [ ] Check access controls
- [ ] Verify data handling
- [ ] Audit dependencies
### 2. Penetration Testing
- [ ] Test data access
- [ ] Verify encryption
- [ ] Check authentication
- [ ] Review permissions
## Success Criteria
### 1. Performance
- [x] Query response time < 100ms
- [x] Operation queuing for thread safety
- [x] Proper initialization handling
- [ ] Migration time < 5s per 1000 records (planned)
- [ ] Storage overhead < 10% (planned)
- [ ] Memory usage < 50MB (planned)
### 2. Reliability
- [x] Basic data integrity
- [x] Operation queuing
- [ ] Automatic recovery (planned)
- [ ] Backup verification (planned)
- [ ] Transaction atomicity (planned)
- [ ] Data consistency (planned)
### 3. Security
- [x] Basic data integrity
- [ ] AES-256 encryption (planned)
- [ ] Secure key storage (planned)
- [ ] Access control (planned)
- [ ] Audit logging (planned)
### 4. User Experience
- [x] Basic database operations
- [ ] Smooth migration (planned)
- [ ] Clear error messages (planned)
- [ ] Progress indicators (planned)
- [ ] Recovery options (planned)

View File

@@ -1,554 +0,0 @@
# Migration Guide: Dexie to wa-sqlite
## Overview
This document outlines the migration process from Dexie.js to wa-sqlite for the TimeSafari app's storage implementation. The migration aims to provide a consistent SQLite-based storage solution across all platforms while maintaining data integrity and ensuring a smooth transition for users.
## Migration Goals
1. **Data Integrity**
- Preserve all existing data
- Maintain data relationships
- Ensure data consistency
2. **Performance**
- Improve query performance
- Reduce storage overhead
- Optimize for platform-specific features
3. **Security**
- Maintain or improve encryption
- Preserve access controls
- Enhance data protection
4. **User Experience**
- Zero data loss
- Minimal downtime
- Automatic migration where possible
## Prerequisites
1. **Backup Requirements**
```typescript
interface MigrationBackup {
timestamp: number;
accounts: Account[];
settings: Setting[];
contacts: Contact[];
metadata: {
version: string;
platform: string;
dexieVersion: string;
};
}
```
2. **Storage Requirements**
- Sufficient IndexedDB quota
- Available disk space for SQLite
- Backup storage space
3. **Platform Support**
- Web: Modern browser with IndexedDB support
- iOS: iOS 13+ with SQLite support
- Android: Android 5+ with SQLite support
- Electron: Latest version with SQLite support
## Migration Process
### 1. Preparation
```typescript
// src/services/storage/migration/MigrationService.ts
export class MigrationService {
private static instance: MigrationService;
private backup: MigrationBackup | null = null;
async prepare(): Promise<void> {
try {
// 1. Check prerequisites
await this.checkPrerequisites();
// 2. Create backup
this.backup = await this.createBackup();
// 3. Verify backup integrity
await this.verifyBackup();
// 4. Initialize wa-sqlite
await this.initializeWaSqlite();
} catch (error) {
throw new StorageError(
'Migration preparation failed',
StorageErrorCodes.MIGRATION_FAILED,
error
);
}
}
private async checkPrerequisites(): Promise<void> {
// Check IndexedDB availability
if (!window.indexedDB) {
throw new StorageError(
'IndexedDB not available',
StorageErrorCodes.INITIALIZATION_FAILED
);
}
// Check storage quota
const quota = await navigator.storage.estimate();
if (quota.quota && quota.usage && quota.usage > quota.quota * 0.9) {
throw new StorageError(
'Insufficient storage space',
StorageErrorCodes.STORAGE_FULL
);
}
// Check platform support
const capabilities = await PlatformDetection.getCapabilities();
if (!capabilities.hasFileSystem) {
throw new StorageError(
'Platform does not support required features',
StorageErrorCodes.INITIALIZATION_FAILED
);
}
}
private async createBackup(): Promise<MigrationBackup> {
const dexieDB = new Dexie('TimeSafariDB');
return {
timestamp: Date.now(),
accounts: await dexieDB.accounts.toArray(),
settings: await dexieDB.settings.toArray(),
contacts: await dexieDB.contacts.toArray(),
metadata: {
version: '1.0.0',
platform: await PlatformDetection.getPlatform(),
dexieVersion: Dexie.version
}
};
}
}
```
### 2. Data Migration
```typescript
// src/services/storage/migration/DataMigration.ts
export class DataMigration {
async migrate(backup: MigrationBackup): Promise<void> {
try {
// 1. Create new database schema
await this.createSchema();
// 2. Migrate accounts
await this.migrateAccounts(backup.accounts);
// 3. Migrate settings
await this.migrateSettings(backup.settings);
// 4. Migrate contacts
await this.migrateContacts(backup.contacts);
// 5. Verify migration
await this.verifyMigration(backup);
} catch (error) {
// 6. Handle failure
await this.handleMigrationFailure(error, backup);
}
}
private async migrateAccounts(accounts: Account[]): Promise<void> {
const db = await this.getWaSqliteConnection();
// Use transaction for atomicity
await db.transaction(async (tx) => {
for (const account of accounts) {
await tx.execute(`
INSERT INTO accounts (did, public_key_hex, created_at, updated_at)
VALUES (?, ?, ?, ?)
`, [
account.did,
account.publicKeyHex,
account.createdAt,
account.updatedAt
]);
}
});
}
private async verifyMigration(backup: MigrationBackup): Promise<void> {
const db = await this.getWaSqliteConnection();
// Verify account count
const accountCount = await db.selectValue(
'SELECT COUNT(*) FROM accounts'
);
if (accountCount !== backup.accounts.length) {
throw new StorageError(
'Account count mismatch',
StorageErrorCodes.VERIFICATION_FAILED
);
}
// Verify data integrity
await this.verifyDataIntegrity(backup);
}
}
```
### 3. Rollback Strategy
```typescript
// src/services/storage/migration/RollbackService.ts
export class RollbackService {
async rollback(backup: MigrationBackup): Promise<void> {
try {
// 1. Stop all database operations
await this.stopDatabaseOperations();
// 2. Restore from backup
await this.restoreFromBackup(backup);
// 3. Verify restoration
await this.verifyRestoration(backup);
// 4. Clean up wa-sqlite
await this.cleanupWaSqlite();
} catch (error) {
throw new StorageError(
'Rollback failed',
StorageErrorCodes.ROLLBACK_FAILED,
error
);
}
}
private async restoreFromBackup(backup: MigrationBackup): Promise<void> {
const dexieDB = new Dexie('TimeSafariDB');
// Restore accounts
await dexieDB.accounts.bulkPut(backup.accounts);
// Restore settings
await dexieDB.settings.bulkPut(backup.settings);
// Restore contacts
await dexieDB.contacts.bulkPut(backup.contacts);
}
}
```
## Migration UI
```vue
<!-- src/components/MigrationProgress.vue -->
<template>
<div class="migration-progress">
<h2>Database Migration</h2>
<div class="progress-container">
<div class="progress-bar" :style="{ width: `${progress}%` }" />
<div class="progress-text">{{ progress }}%</div>
</div>
<div class="status-message">{{ statusMessage }}</div>
<div v-if="error" class="error-message">
{{ error }}
<button @click="retryMigration">Retry</button>
</div>
</div>
</template>
<script setup lang="ts">
import { ref, onMounted } from 'vue';
import { MigrationService } from '@/services/storage/migration/MigrationService';
const progress = ref(0);
const statusMessage = ref('Preparing migration...');
const error = ref<string | null>(null);
const migrationService = MigrationService.getInstance();
async function startMigration() {
try {
// 1. Preparation
statusMessage.value = 'Creating backup...';
await migrationService.prepare();
progress.value = 20;
// 2. Data migration
statusMessage.value = 'Migrating data...';
await migrationService.migrate();
progress.value = 80;
// 3. Verification
statusMessage.value = 'Verifying migration...';
await migrationService.verify();
progress.value = 100;
statusMessage.value = 'Migration completed successfully!';
} catch (err) {
error.value = err instanceof Error ? err.message : 'Migration failed';
statusMessage.value = 'Migration failed';
}
}
async function retryMigration() {
error.value = null;
progress.value = 0;
await startMigration();
}
onMounted(() => {
startMigration();
});
</script>
<style scoped>
.migration-progress {
padding: 2rem;
max-width: 600px;
margin: 0 auto;
}
.progress-container {
position: relative;
height: 20px;
background: #eee;
border-radius: 10px;
overflow: hidden;
margin: 1rem 0;
}
.progress-bar {
position: absolute;
height: 100%;
background: #4CAF50;
transition: width 0.3s ease;
}
.progress-text {
position: absolute;
width: 100%;
text-align: center;
line-height: 20px;
color: #000;
}
.status-message {
text-align: center;
margin: 1rem 0;
}
.error-message {
color: #f44336;
text-align: center;
margin: 1rem 0;
}
button {
margin-top: 1rem;
padding: 0.5rem 1rem;
background: #2196F3;
color: white;
border: none;
border-radius: 4px;
cursor: pointer;
}
button:hover {
background: #1976D2;
}
</style>
```
## Testing Strategy
1. **Unit Tests**
```typescript
// src/services/storage/migration/__tests__/MigrationService.spec.ts
describe('MigrationService', () => {
it('should create valid backup', async () => {
const service = MigrationService.getInstance();
const backup = await service.createBackup();
expect(backup).toBeDefined();
expect(backup.accounts).toBeInstanceOf(Array);
expect(backup.settings).toBeInstanceOf(Array);
expect(backup.contacts).toBeInstanceOf(Array);
});
it('should migrate data correctly', async () => {
const service = MigrationService.getInstance();
const backup = await service.createBackup();
await service.migrate(backup);
// Verify migration
const accounts = await service.getMigratedAccounts();
expect(accounts).toHaveLength(backup.accounts.length);
});
it('should handle rollback correctly', async () => {
const service = MigrationService.getInstance();
const backup = await service.createBackup();
// Simulate failed migration
await service.migrate(backup);
await service.simulateFailure();
// Perform rollback
await service.rollback(backup);
// Verify rollback
const accounts = await service.getOriginalAccounts();
expect(accounts).toHaveLength(backup.accounts.length);
});
});
```
2. **Integration Tests**
```typescript
// src/services/storage/migration/__tests__/integration/Migration.spec.ts
describe('Migration Integration', () => {
it('should handle concurrent access during migration', async () => {
const service = MigrationService.getInstance();
// Start migration
const migrationPromise = service.migrate();
// Simulate concurrent access
const accessPromises = Array(5).fill(null).map(() =>
service.getAccount('did:test:123')
);
// Wait for all operations
const [migrationResult, ...accessResults] = await Promise.allSettled([
migrationPromise,
...accessPromises
]);
// Verify results
expect(migrationResult.status).toBe('fulfilled');
expect(accessResults.some(r => r.status === 'rejected')).toBe(true);
});
it('should maintain data integrity during platform transition', async () => {
const service = MigrationService.getInstance();
// Simulate platform change
await service.simulatePlatformChange();
// Verify data
const accounts = await service.getAllAccounts();
const settings = await service.getAllSettings();
const contacts = await service.getAllContacts();
expect(accounts).toBeDefined();
expect(settings).toBeDefined();
expect(contacts).toBeDefined();
});
});
```
## Success Criteria
1. **Data Integrity**
- [ ] All accounts migrated successfully
- [ ] All settings preserved
- [ ] All contacts transferred
- [ ] No data corruption
2. **Performance**
- [ ] Migration completes within acceptable time
- [ ] No significant performance degradation
- [ ] Efficient storage usage
- [ ] Smooth user experience
3. **Security**
- [ ] Encrypted data remains secure
- [ ] Access controls maintained
- [ ] No sensitive data exposure
- [ ] Secure backup process
4. **User Experience**
- [ ] Clear migration progress
- [ ] Informative error messages
- [ ] Automatic recovery from failures
- [ ] No data loss
## Rollback Plan
1. **Automatic Rollback**
- Triggered by migration failure
- Restores from verified backup
- Maintains data consistency
- Logs rollback reason
2. **Manual Rollback**
- Available through settings
- Requires user confirmation
- Preserves backup data
- Provides rollback status
3. **Emergency Recovery**
- Manual backup restoration
- Database repair tools
- Data recovery procedures
- Support contact information
## Post-Migration
1. **Verification**
- Data integrity checks
- Performance monitoring
- Error rate tracking
- User feedback collection
2. **Cleanup**
- Remove old database
- Clear migration artifacts
- Update application state
- Archive backup data
3. **Monitoring**
- Track migration success rate
- Monitor performance metrics
- Collect error reports
- Gather user feedback
## Support
For assistance with migration:
1. Check the troubleshooting guide
2. Review error logs
3. Contact support team
4. Submit issue report
## Timeline
1. **Preparation Phase** (1 week)
- Backup system implementation
- Migration service development
- Testing framework setup
2. **Testing Phase** (2 weeks)
- Unit testing
- Integration testing
- Performance testing
- Security testing
3. **Deployment Phase** (1 week)
- Staged rollout
- Monitoring
- Support preparation
- Documentation updates
4. **Post-Deployment** (2 weeks)
- Monitoring
- Bug fixes
- Performance optimization
- User feedback collection

File diff suppressed because it is too large

View File

@@ -14,7 +14,7 @@
504EC30F1FED79650016851F /* Assets.xcassets in Resources */ = {isa = PBXBuildFile; fileRef = 504EC30E1FED79650016851F /* Assets.xcassets */; };
504EC3121FED79650016851F /* LaunchScreen.storyboard in Resources */ = {isa = PBXBuildFile; fileRef = 504EC3101FED79650016851F /* LaunchScreen.storyboard */; };
50B271D11FEDC1A000F3C39B /* public in Resources */ = {isa = PBXBuildFile; fileRef = 50B271D01FEDC1A000F3C39B /* public */; };
A084ECDBA7D38E1E42DFC39D /* Pods_App.framework in Frameworks */ = {isa = PBXBuildFile; fileRef = AF277DCFFFF123FFC6DF26C7 /* Pods_App.framework */; };
97EF2DC6FD76C3643D680B8D /* Pods_App.framework in Frameworks */ = {isa = PBXBuildFile; fileRef = 90DCAFB4D8948F7A50C13800 /* Pods_App.framework */; };
/* End PBXBuildFile section */
/* Begin PBXFileReference section */
@@ -27,9 +27,9 @@
504EC3111FED79650016851F /* Base */ = {isa = PBXFileReference; lastKnownFileType = file.storyboard; name = Base; path = Base.lproj/LaunchScreen.storyboard; sourceTree = "<group>"; };
504EC3131FED79650016851F /* Info.plist */ = {isa = PBXFileReference; lastKnownFileType = text.plist.xml; path = Info.plist; sourceTree = "<group>"; };
50B271D01FEDC1A000F3C39B /* public */ = {isa = PBXFileReference; lastKnownFileType = folder; path = public; sourceTree = "<group>"; };
AF277DCFFFF123FFC6DF26C7 /* Pods_App.framework */ = {isa = PBXFileReference; explicitFileType = wrapper.framework; includeInIndex = 0; path = Pods_App.framework; sourceTree = BUILT_PRODUCTS_DIR; };
AF51FD2D460BCFE21FA515B2 /* Pods-App.release.xcconfig */ = {isa = PBXFileReference; includeInIndex = 1; lastKnownFileType = text.xcconfig; name = "Pods-App.release.xcconfig"; path = "Pods/Target Support Files/Pods-App/Pods-App.release.xcconfig"; sourceTree = "<group>"; };
FC68EB0AF532CFC21C3344DD /* Pods-App.debug.xcconfig */ = {isa = PBXFileReference; includeInIndex = 1; lastKnownFileType = text.xcconfig; name = "Pods-App.debug.xcconfig"; path = "Pods/Target Support Files/Pods-App/Pods-App.debug.xcconfig"; sourceTree = "<group>"; };
90DCAFB4D8948F7A50C13800 /* Pods_App.framework */ = {isa = PBXFileReference; explicitFileType = wrapper.framework; includeInIndex = 0; path = Pods_App.framework; sourceTree = BUILT_PRODUCTS_DIR; };
E2E9297D5D02C549106C77F9 /* Pods-App.release.xcconfig */ = {isa = PBXFileReference; includeInIndex = 1; lastKnownFileType = text.xcconfig; name = "Pods-App.release.xcconfig"; path = "Target Support Files/Pods-App/Pods-App.release.xcconfig"; sourceTree = "<group>"; };
EAEC6436E595F7CD3A1C9E96 /* Pods-App.debug.xcconfig */ = {isa = PBXFileReference; includeInIndex = 1; lastKnownFileType = text.xcconfig; name = "Pods-App.debug.xcconfig"; path = "Target Support Files/Pods-App/Pods-App.debug.xcconfig"; sourceTree = "<group>"; };
/* End PBXFileReference section */
/* Begin PBXFrameworksBuildPhase section */
@@ -37,17 +37,17 @@
isa = PBXFrameworksBuildPhase;
buildActionMask = 2147483647;
files = (
A084ECDBA7D38E1E42DFC39D /* Pods_App.framework in Frameworks */,
97EF2DC6FD76C3643D680B8D /* Pods_App.framework in Frameworks */,
);
runOnlyForDeploymentPostprocessing = 0;
};
/* End PBXFrameworksBuildPhase section */
/* Begin PBXGroup section */
27E2DDA53C4D2A4D1A88CE4A /* Frameworks */ = {
4B546315E668C7A13939F417 /* Frameworks */ = {
isa = PBXGroup;
children = (
AF277DCFFFF123FFC6DF26C7 /* Pods_App.framework */,
90DCAFB4D8948F7A50C13800 /* Pods_App.framework */,
);
name = Frameworks;
sourceTree = "<group>";
@@ -57,8 +57,8 @@
children = (
504EC3061FED79650016851F /* App */,
504EC3051FED79650016851F /* Products */,
7F8756D8B27F46E3366F6CEA /* Pods */,
27E2DDA53C4D2A4D1A88CE4A /* Frameworks */,
BA325FFCDCE8D334E5C7AEBE /* Pods */,
4B546315E668C7A13939F417 /* Frameworks */,
);
sourceTree = "<group>";
};
@@ -85,13 +85,13 @@
path = App;
sourceTree = "<group>";
};
7F8756D8B27F46E3366F6CEA /* Pods */ = {
BA325FFCDCE8D334E5C7AEBE /* Pods */ = {
isa = PBXGroup;
children = (
FC68EB0AF532CFC21C3344DD /* Pods-App.debug.xcconfig */,
AF51FD2D460BCFE21FA515B2 /* Pods-App.release.xcconfig */,
EAEC6436E595F7CD3A1C9E96 /* Pods-App.debug.xcconfig */,
E2E9297D5D02C549106C77F9 /* Pods-App.release.xcconfig */,
);
name = Pods;
path = Pods;
sourceTree = "<group>";
};
/* End PBXGroup section */
@@ -101,12 +101,13 @@
isa = PBXNativeTarget;
buildConfigurationList = 504EC3161FED79650016851F /* Build configuration list for PBXNativeTarget "App" */;
buildPhases = (
6634F4EFEBD30273BCE97C65 /* [CP] Check Pods Manifest.lock */,
92977BEA1068CC097A57FC77 /* [CP] Check Pods Manifest.lock */,
504EC3001FED79650016851F /* Sources */,
504EC3011FED79650016851F /* Frameworks */,
504EC3021FED79650016851F /* Resources */,
9592DBEFFC6D2A0C8D5DEB22 /* [CP] Embed Pods Frameworks */,
012076E8FFE4BF260A79B034 /* Fix Privacy Manifest */,
3525031ED1C96EF4CF6E9959 /* [CP] Embed Pods Frameworks */,
96A7EF592DF3366D00084D51 /* Fix Privacy Manifest */,
);
buildRules = (
);
@@ -186,28 +187,10 @@
);
runOnlyForDeploymentPostprocessing = 0;
shellPath = /bin/sh;
shellScript = "\"${PROJECT_DIR}/app_privacy_manifest_fixer/fixer.sh\" ";
shellScript = "\"${PROJECT_DIR}/app_privacy_manifest_fixer/fixer.sh\" \n";
showEnvVarsInLog = 0;
};
6634F4EFEBD30273BCE97C65 /* [CP] Check Pods Manifest.lock */ = {
isa = PBXShellScriptBuildPhase;
buildActionMask = 2147483647;
files = (
);
inputPaths = (
"${PODS_PODFILE_DIR_PATH}/Podfile.lock",
"${PODS_ROOT}/Manifest.lock",
);
name = "[CP] Check Pods Manifest.lock";
outputPaths = (
"$(DERIVED_FILE_DIR)/Pods-App-checkManifestLockResult.txt",
);
runOnlyForDeploymentPostprocessing = 0;
shellPath = /bin/sh;
shellScript = "diff \"${PODS_PODFILE_DIR_PATH}/Podfile.lock\" \"${PODS_ROOT}/Manifest.lock\" > /dev/null\nif [ $? != 0 ] ; then\n # print error to STDERR\n echo \"error: The sandbox is not in sync with the Podfile.lock. Run 'pod install' or update your CocoaPods installation.\" >&2\n exit 1\nfi\n# This output is used by Xcode 'outputs' to avoid re-running this script phase.\necho \"SUCCESS\" > \"${SCRIPT_OUTPUT_FILE_0}\"\n";
showEnvVarsInLog = 0;
};
9592DBEFFC6D2A0C8D5DEB22 /* [CP] Embed Pods Frameworks */ = {
3525031ED1C96EF4CF6E9959 /* [CP] Embed Pods Frameworks */ = {
isa = PBXShellScriptBuildPhase;
buildActionMask = 2147483647;
files = (
@@ -222,6 +205,47 @@
shellScript = "\"${PODS_ROOT}/Target Support Files/Pods-App/Pods-App-frameworks.sh\"\n";
showEnvVarsInLog = 0;
};
92977BEA1068CC097A57FC77 /* [CP] Check Pods Manifest.lock */ = {
isa = PBXShellScriptBuildPhase;
buildActionMask = 2147483647;
files = (
);
inputFileListPaths = (
);
inputPaths = (
"${PODS_PODFILE_DIR_PATH}/Podfile.lock",
"${PODS_ROOT}/Manifest.lock",
);
name = "[CP] Check Pods Manifest.lock";
outputFileListPaths = (
);
outputPaths = (
"$(DERIVED_FILE_DIR)/Pods-App-checkManifestLockResult.txt",
);
runOnlyForDeploymentPostprocessing = 0;
shellPath = /bin/sh;
shellScript = "diff \"${PODS_PODFILE_DIR_PATH}/Podfile.lock\" \"${PODS_ROOT}/Manifest.lock\" > /dev/null\nif [ $? != 0 ] ; then\n # print error to STDERR\n echo \"error: The sandbox is not in sync with the Podfile.lock. Run 'pod install' or update your CocoaPods installation.\" >&2\n exit 1\nfi\n# This output is used by Xcode 'outputs' to avoid re-running this script phase.\necho \"SUCCESS\" > \"${SCRIPT_OUTPUT_FILE_0}\"\n";
showEnvVarsInLog = 0;
};
96A7EF592DF3366D00084D51 /* Fix Privacy Manifest */ = {
isa = PBXShellScriptBuildPhase;
alwaysOutOfDate = 1;
buildActionMask = 2147483647;
files = (
);
inputFileListPaths = (
);
inputPaths = (
);
name = "Fix Privacy Manifest";
outputFileListPaths = (
);
outputPaths = (
);
runOnlyForDeploymentPostprocessing = 0;
shellPath = /bin/sh;
shellScript = "$PROJECT_DIR/app_privacy_manifest_fixer/fixer.sh\n";
};
/* End PBXShellScriptBuildPhase section */
/* Begin PBXSourcesBuildPhase section */
@@ -375,11 +399,11 @@
};
504EC3171FED79650016851F /* Debug */ = {
isa = XCBuildConfiguration;
baseConfigurationReference = FC68EB0AF532CFC21C3344DD /* Pods-App.debug.xcconfig */;
baseConfigurationReference = EAEC6436E595F7CD3A1C9E96 /* Pods-App.debug.xcconfig */;
buildSettings = {
ASSETCATALOG_COMPILER_APPICON_NAME = AppIcon;
CODE_SIGN_STYLE = Automatic;
CURRENT_PROJECT_VERSION = 18;
CURRENT_PROJECT_VERSION = 33;
DEVELOPMENT_TEAM = GM3FS5JQPH;
ENABLE_APP_SANDBOX = NO;
ENABLE_USER_SCRIPT_SANDBOXING = NO;
@@ -389,7 +413,7 @@
"$(inherited)",
"@executable_path/Frameworks",
);
MARKETING_VERSION = 0.4.7;
MARKETING_VERSION = 0.5.7;
OTHER_SWIFT_FLAGS = "$(inherited) \"-D\" \"COCOAPODS\" \"-DDEBUG\"";
PRODUCT_BUNDLE_IDENTIFIER = app.timesafari;
PRODUCT_NAME = "$(TARGET_NAME)";
@@ -402,11 +426,11 @@
};
504EC3181FED79650016851F /* Release */ = {
isa = XCBuildConfiguration;
baseConfigurationReference = AF51FD2D460BCFE21FA515B2 /* Pods-App.release.xcconfig */;
baseConfigurationReference = E2E9297D5D02C549106C77F9 /* Pods-App.release.xcconfig */;
buildSettings = {
ASSETCATALOG_COMPILER_APPICON_NAME = AppIcon;
CODE_SIGN_STYLE = Automatic;
CURRENT_PROJECT_VERSION = 18;
CURRENT_PROJECT_VERSION = 33;
DEVELOPMENT_TEAM = GM3FS5JQPH;
ENABLE_APP_SANDBOX = NO;
ENABLE_USER_SCRIPT_SANDBOXING = NO;
@@ -416,7 +440,7 @@
"$(inherited)",
"@executable_path/Frameworks",
);
MARKETING_VERSION = 0.4.7;
MARKETING_VERSION = 0.5.7;
PRODUCT_BUNDLE_IDENTIFIER = app.timesafari;
PRODUCT_NAME = "$(TARGET_NAME)";
SWIFT_ACTIVE_COMPILATION_CONDITIONS = "";

View File

@@ -1,5 +1,6 @@
import UIKit
import Capacitor
import CapacitorCommunitySqlite
@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {
@@ -7,6 +8,10 @@ class AppDelegate: UIResponder, UIApplicationDelegate {
var window: UIWindow?
func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
// Initialize SQLite
//let sqlite = SQLite()
//sqlite.initialize()
// Override point for customization after application launch.
return true
}

View File

@@ -49,5 +49,16 @@
</array>
<key>UIViewControllerBasedStatusBarAppearance</key>
<true/>
<key>CFBundleURLTypes</key>
<array>
<dict>
<key>CFBundleURLName</key>
<string>app.timesafari</string>
<key>CFBundleURLSchemes</key>
<array>
<string>timesafari</string>
</array>
</dict>
</array>
</dict>
</plist>

View File

@@ -11,6 +11,7 @@ install! 'cocoapods', :disable_input_output_paths => true
def capacitor_pods
pod 'Capacitor', :path => '../../node_modules/@capacitor/ios'
pod 'CapacitorCordova', :path => '../../node_modules/@capacitor/ios'
pod 'CapacitorCommunitySqlite', :path => '../../node_modules/@capacitor-community/sqlite'
pod 'CapacitorMlkitBarcodeScanning', :path => '../../node_modules/@capacitor-mlkit/barcode-scanning'
pod 'CapacitorApp', :path => '../../node_modules/@capacitor/app'
pod 'CapacitorCamera', :path => '../../node_modules/@capacitor/camera'
@@ -26,4 +27,9 @@ end
post_install do |installer|
assertDeploymentTarget(installer)
installer.pods_project.targets.each do |target|
target.build_configurations.each do |config|
config.build_settings['EXCLUDED_ARCHS[sdk=iphonesimulator*]'] = 'arm64'
end
end
end

View File

@@ -5,6 +5,10 @@ PODS:
- Capacitor
- CapacitorCamera (6.1.2):
- Capacitor
- CapacitorCommunitySqlite (6.0.2):
- Capacitor
- SQLCipher
- ZIPFoundation
- CapacitorCordova (6.2.1)
- CapacitorFilesystem (6.0.3):
- Capacitor
@@ -73,11 +77,18 @@ PODS:
- nanopb/decode (2.30910.0)
- nanopb/encode (2.30910.0)
- PromisesObjC (2.4.0)
- SQLCipher (4.9.0):
- SQLCipher/standard (= 4.9.0)
- SQLCipher/common (4.9.0)
- SQLCipher/standard (4.9.0):
- SQLCipher/common
- ZIPFoundation (0.9.19)
DEPENDENCIES:
- "Capacitor (from `../../node_modules/@capacitor/ios`)"
- "CapacitorApp (from `../../node_modules/@capacitor/app`)"
- "CapacitorCamera (from `../../node_modules/@capacitor/camera`)"
- "CapacitorCommunitySqlite (from `../../node_modules/@capacitor-community/sqlite`)"
- "CapacitorCordova (from `../../node_modules/@capacitor/ios`)"
- "CapacitorFilesystem (from `../../node_modules/@capacitor/filesystem`)"
- "CapacitorMlkitBarcodeScanning (from `../../node_modules/@capacitor-mlkit/barcode-scanning`)"
@@ -98,6 +109,8 @@ SPEC REPOS:
- MLKitVision
- nanopb
- PromisesObjC
- SQLCipher
- ZIPFoundation
EXTERNAL SOURCES:
Capacitor:
@@ -106,6 +119,8 @@ EXTERNAL SOURCES:
:path: "../../node_modules/@capacitor/app"
CapacitorCamera:
:path: "../../node_modules/@capacitor/camera"
CapacitorCommunitySqlite:
:path: "../../node_modules/@capacitor-community/sqlite"
CapacitorCordova:
:path: "../../node_modules/@capacitor/ios"
CapacitorFilesystem:
@@ -121,6 +136,7 @@ SPEC CHECKSUMS:
Capacitor: c95400d761e376be9da6be5a05f226c0e865cebf
CapacitorApp: e1e6b7d05e444d593ca16fd6d76f2b7c48b5aea7
CapacitorCamera: 9bc7b005d0e6f1d5f525b8137045b60cffffce79
CapacitorCommunitySqlite: 0299d20f4b00c2e6aa485a1d8932656753937b9b
CapacitorCordova: 8d93e14982f440181be7304aa9559ca631d77fff
CapacitorFilesystem: 59270a63c60836248812671aa3b15df673fbaf74
CapacitorMlkitBarcodeScanning: 7652be9c7922f39203a361de735d340ae37e134e
@@ -138,7 +154,9 @@ SPEC CHECKSUMS:
MLKitVision: 90922bca854014a856f8b649d1f1f04f63fd9c79
nanopb: 438bc412db1928dac798aa6fd75726007be04262
PromisesObjC: f5707f49cb48b9636751c5b2e7d227e43fba9f47
SQLCipher: 31878d8ebd27e5c96db0b7cb695c96e9f8ad77da
ZIPFoundation: b8c29ea7ae353b309bc810586181fd073cb3312c
PODFILE CHECKSUM: 7e7e09e6937de7f015393aecf2cf7823645689b3
PODFILE CHECKSUM: f987510f7383b04a1b09ea8472bdadcd88b6c924
COCOAPODS: 1.16.2

29
main.js
View File

@@ -1,29 +0,0 @@
const { app, BrowserWindow } = require('electron');
const path = require('path');
function createWindow() {
const win = new BrowserWindow({
width: 1200,
height: 800,
webPreferences: {
nodeIntegration: true,
contextIsolation: false
}
});
win.loadFile(path.join(__dirname, 'dist-electron/www/index.html'));
}
app.whenReady().then(createWindow);
app.on('window-all-closed', () => {
if (process.platform !== 'darwin') {
app.quit();
}
});
app.on('activate', () => {
if (BrowserWindow.getAllWindows().length === 0) {
createWindow();
}
});

5625
package-lock.json generated

File diff suppressed because it is too large

View File

@@ -1,6 +1,6 @@
{
"name": "timesafari",
"version": "0.4.6",
"version": "1.0.0",
"description": "Time Safari Application",
"author": {
"name": "Time Safari Team"
@@ -11,7 +11,7 @@
"build": "VITE_GIT_HASH=`git log -1 --pretty=format:%h` vite build --config vite.config.mts",
"lint": "eslint --ext .js,.ts,.vue --ignore-path .gitignore src",
"lint-fix": "eslint --ext .js,.ts,.vue --ignore-path .gitignore --fix src",
"prebuild": "eslint --ext .js,.ts,.vue --ignore-path .gitignore src && node sw_combine.js",
"prebuild": "eslint --ext .js,.ts,.vue --ignore-path .gitignore src && node sw_combine.js && node scripts/copy-wasm.js",
"test:all": "npm run test:prerequisites && npm run build && npm run test:web && npm run test:mobile",
"test:prerequisites": "node scripts/check-prerequisites.js",
"test:web": "npx playwright test -c playwright.config-local.ts --trace on",
@@ -46,6 +46,7 @@
"electron:build-mac-universal": "npm run build:electron-prod && electron-builder --mac --universal"
},
"dependencies": {
"@capacitor-community/sqlite": "6.0.2",
"@capacitor-mlkit/barcode-scanning": "^6.0.0",
"@capacitor/android": "^6.2.0",
"@capacitor/app": "^6.0.0",
@@ -63,6 +64,7 @@
"@fortawesome/fontawesome-svg-core": "^6.5.1",
"@fortawesome/free-solid-svg-icons": "^6.5.1",
"@fortawesome/vue-fontawesome": "^3.0.6",
"@jlongster/sql.js": "^1.6.7",
"@peculiar/asn1-ecc": "^2.3.8",
"@peculiar/asn1-schema": "^2.3.8",
"@pvermeer/dexie-encrypted-addon": "^3.0.0",
@@ -81,6 +83,7 @@
"@vue-leaflet/vue-leaflet": "^0.10.1",
"@vueuse/core": "^12.3.0",
"@zxing/text-encoding": "^0.9.0",
"absurd-sql": "^0.0.54",
"asn1-ber": "^1.2.2",
"axios": "^1.6.8",
"cbor-x": "^1.5.9",
@@ -144,7 +147,9 @@
"@vitejs/plugin-vue": "^5.2.1",
"@vue/eslint-config-typescript": "^11.0.3",
"autoprefixer": "^10.4.19",
"browserify-fs": "^1.0.0",
"concurrently": "^8.2.2",
"crypto-browserify": "^3.12.1",
"electron": "^33.2.1",
"electron-builder": "^25.1.8",
"eslint": "^8.57.0",
@@ -155,13 +160,14 @@
"markdownlint": "^0.37.4",
"markdownlint-cli": "^0.44.0",
"npm-check-updates": "^17.1.13",
"path-browserify": "^1.0.1",
"postcss": "^8.4.38",
"prettier": "^3.2.5",
"rimraf": "^6.0.1",
"tailwindcss": "^3.4.1",
"typescript": "~5.2.2",
"vite": "^5.2.0",
"vite-plugin-pwa": "^0.19.8"
"vite-plugin-pwa": "^1.0.0"
},
"main": "./dist-electron/main.js",
"build": {
@@ -176,7 +182,7 @@
],
"extraResources": [
{
"from": "dist",
"from": "dist-electron/www",
"to": "www"
}
],

View File

@@ -2,5 +2,6 @@ dependencies:
- gradle
- java
- pod
- rubygems.org
# other dependencies are discovered via package.json & requirements.txt & Gemfile (I'm guessing).

View File

@@ -1,5 +1,6 @@
eth_keys
pywebview
pyinstaller>=6.12.0
setuptools>=69.0.0 # Required for distutils for electron-builder on macOS
# For development
watchdog>=3.0.0 # For file watching support

View File

@@ -1,10 +1,9 @@
const fs = require('fs');
const path = require('path');
console.log('Starting electron build process...');
console.log('Starting electron build process...');
// Copy web files
const webDistPath = path.join(__dirname, '..', 'dist');
// Define paths
const electronDistPath = path.join(__dirname, '..', 'dist-electron');
const wwwPath = path.join(electronDistPath, 'www');
@@ -13,231 +12,154 @@ if (!fs.existsSync(wwwPath)) {
fs.mkdirSync(wwwPath, { recursive: true });
}
// Copy web files to www directory
fs.cpSync(webDistPath, wwwPath, { recursive: true });
// Create a platform-specific index.html for Electron
const initialIndexContent = `<!DOCTYPE html>
<html lang="">
<head>
<meta charset="utf-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width,initial-scale=1.0,viewport-fit=cover">
<link rel="icon" href="/favicon.ico">
<title>TimeSafari</title>
</head>
<body>
<noscript>
<strong>We're sorry but TimeSafari doesn't work properly without JavaScript enabled. Please enable it to continue.</strong>
</noscript>
<div id="app"></div>
<script type="module">
// Force electron platform
window.process = { env: { VITE_PLATFORM: 'electron' } };
import('./src/main.electron.ts');
</script>
</body>
</html>`;
// Fix asset paths in index.html
// Write the Electron-specific index.html
fs.writeFileSync(path.join(wwwPath, 'index.html'), initialIndexContent);
// Copy only necessary assets from web build
const webDistPath = path.join(__dirname, '..', 'dist');
if (fs.existsSync(webDistPath)) {
// Copy assets directory
const assetsSrc = path.join(webDistPath, 'assets');
const assetsDest = path.join(wwwPath, 'assets');
if (fs.existsSync(assetsSrc)) {
fs.cpSync(assetsSrc, assetsDest, { recursive: true });
}
// Copy favicon
const faviconSrc = path.join(webDistPath, 'favicon.ico');
if (fs.existsSync(faviconSrc)) {
fs.copyFileSync(faviconSrc, path.join(wwwPath, 'favicon.ico'));
}
}
// Remove service worker files
const swFilesToRemove = [
'sw.js',
'sw.js.map',
'workbox-*.js',
'workbox-*.js.map',
'registerSW.js',
'manifest.webmanifest',
'**/workbox-*.js',
'**/workbox-*.js.map',
'**/sw.js',
'**/sw.js.map',
'**/registerSW.js',
'**/manifest.webmanifest'
];
console.log('Removing service worker files...');
swFilesToRemove.forEach(pattern => {
const files = fs.readdirSync(wwwPath).filter(file =>
file.match(new RegExp(pattern.replace(/\*/g, '.*')))
);
files.forEach(file => {
const filePath = path.join(wwwPath, file);
console.log(`Removing ${filePath}`);
try {
fs.unlinkSync(filePath);
} catch (err) {
console.warn(`Could not remove ${filePath}:`, err.message);
}
});
});
// Also check and remove from assets directory
const assetsPath = path.join(wwwPath, 'assets');
if (fs.existsSync(assetsPath)) {
swFilesToRemove.forEach(pattern => {
const files = fs.readdirSync(assetsPath).filter(file =>
file.match(new RegExp(pattern.replace(/\*/g, '.*')))
);
files.forEach(file => {
const filePath = path.join(assetsPath, file);
console.log(`Removing ${filePath}`);
try {
fs.unlinkSync(filePath);
} catch (err) {
console.warn(`Could not remove ${filePath}:`, err.message);
}
});
});
}
// Modify index.html to remove service worker registration
const indexPath = path.join(wwwPath, 'index.html');
let indexContent = fs.readFileSync(indexPath, 'utf8');
if (fs.existsSync(indexPath)) {
console.log('Modifying index.html to remove service worker registration...');
let indexContent = fs.readFileSync(indexPath, 'utf8');
// Remove service worker registration script
indexContent = indexContent
.replace(/<script[^>]*id="vite-plugin-pwa:register-sw"[^>]*><\/script>/g, '')
.replace(/<script[^>]*registerServiceWorker[^>]*><\/script>/g, '')
.replace(/<link[^>]*rel="manifest"[^>]*>/g, '')
.replace(/<link[^>]*rel="serviceworker"[^>]*>/g, '')
.replace(/navigator\.serviceWorker\.register\([^)]*\)/g, '')
.replace(/if\s*\(\s*['"]serviceWorker['"]\s*in\s*navigator\s*\)\s*{[^}]*}/g, '');
fs.writeFileSync(indexPath, indexContent);
console.log('Successfully modified index.html');
}
// Fix asset paths
indexContent = indexContent
console.log('Fixing asset paths in index.html...');
let modifiedIndexContent = fs.readFileSync(indexPath, 'utf8');
modifiedIndexContent = modifiedIndexContent
.replace(/\/assets\//g, './assets/')
.replace(/href="\//g, 'href="./')
.replace(/src="\//g, 'src="./');
fs.writeFileSync(indexPath, indexContent);
fs.writeFileSync(indexPath, modifiedIndexContent);
// Verify no service worker references remain
const finalContent = fs.readFileSync(indexPath, 'utf8');
if (finalContent.includes('serviceWorker') || finalContent.includes('workbox')) {
console.warn('Warning: Service worker references may still exist in index.html');
}
// Check for remaining /assets/ paths
console.log('After path fixing, checking for remaining /assets/ paths:', indexContent.includes('/assets/'));
console.log('Sample of fixed content:', indexContent.substring(0, 500));
console.log('After path fixing, checking for remaining /assets/ paths:', finalContent.includes('/assets/'));
console.log('Sample of fixed content:', finalContent.substring(0, 500));
console.log('Copied and fixed web files in:', wwwPath);
// Copy main process files
console.log('Copying main process files...');
// Copy main process files
console.log('Copying main process files...');
// Create the main process file with inlined logger
const mainContent = `const { app, BrowserWindow } = require("electron");
const path = require("path");
const fs = require("fs");
// Copy the main process file instead of creating a template
const mainSrcPath = path.join(__dirname, '..', 'dist-electron', 'main.js');
const mainDestPath = path.join(electronDistPath, 'main.js');
// Inline logger implementation
const logger = {
log: (...args) => console.log(...args),
error: (...args) => console.error(...args),
info: (...args) => console.info(...args),
warn: (...args) => console.warn(...args),
debug: (...args) => console.debug(...args),
};
// Check if running in dev mode
const isDev = process.argv.includes("--inspect");
function createWindow() {
// Add before createWindow function
const preloadPath = path.join(__dirname, "preload.js");
logger.log("Checking preload path:", preloadPath);
logger.log("Preload exists:", fs.existsSync(preloadPath));
// Create the browser window.
const mainWindow = new BrowserWindow({
width: 1200,
height: 800,
webPreferences: {
nodeIntegration: false,
contextIsolation: true,
webSecurity: true,
allowRunningInsecureContent: false,
preload: path.join(__dirname, "preload.js"),
},
});
// Always open DevTools for now
mainWindow.webContents.openDevTools();
// Intercept requests to fix asset paths
mainWindow.webContents.session.webRequest.onBeforeRequest(
{
urls: [
"file://*/*/assets/*",
"file://*/assets/*",
"file:///assets/*", // Catch absolute paths
"<all_urls>", // Catch all URLs as a fallback
],
},
(details, callback) => {
let url = details.url;
// Handle paths that don't start with file://
if (!url.startsWith("file://") && url.includes("/assets/")) {
url = \`file://\${path.join(__dirname, "www", url)}\`;
}
// Handle absolute paths starting with /assets/
if (url.includes("/assets/") && !url.includes("/www/assets/")) {
const baseDir = url.includes("dist-electron")
? url.substring(
0,
url.indexOf("/dist-electron") + "/dist-electron".length,
)
: \`file://\${__dirname}\`;
const assetPath = url.split("/assets/")[1];
const newUrl = \`\${baseDir}/www/assets/\${assetPath}\`;
callback({ redirectURL: newUrl });
return;
}
callback({}); // No redirect for other URLs
},
);
if (isDev) {
// Debug info
logger.log("Debug Info:");
logger.log("Running in dev mode:", isDev);
logger.log("App is packaged:", app.isPackaged);
logger.log("Process resource path:", process.resourcesPath);
logger.log("App path:", app.getAppPath());
logger.log("__dirname:", __dirname);
logger.log("process.cwd():", process.cwd());
}
const indexPath = path.join(__dirname, "www", "index.html");
if (isDev) {
logger.log("Loading index from:", indexPath);
logger.log("www path:", path.join(__dirname, "www"));
logger.log("www assets path:", path.join(__dirname, "www", "assets"));
}
if (!fs.existsSync(indexPath)) {
logger.error(\`Index file not found at: \${indexPath}\`);
throw new Error("Index file not found");
}
// Add CSP headers to allow API connections, Google Fonts, and zxing-wasm
mainWindow.webContents.session.webRequest.onHeadersReceived(
(details, callback) => {
callback({
responseHeaders: {
...details.responseHeaders,
"Content-Security-Policy": [
"default-src 'self';" +
"connect-src 'self' https://api.endorser.ch https://*.timesafari.app https://*.jsdelivr.net;" +
"img-src 'self' data: https: blob:;" +
"script-src 'self' 'unsafe-inline' 'unsafe-eval' https://*.jsdelivr.net;" +
"style-src 'self' 'unsafe-inline' https://fonts.googleapis.com;" +
"font-src 'self' data: https://fonts.gstatic.com;" +
"style-src-elem 'self' 'unsafe-inline' https://fonts.googleapis.com;" +
"worker-src 'self' blob:;",
],
},
});
},
);
// Load the index.html
mainWindow
.loadFile(indexPath)
.then(() => {
logger.log("Successfully loaded index.html");
if (isDev) {
mainWindow.webContents.openDevTools();
logger.log("DevTools opened - running in dev mode");
}
})
.catch((err) => {
logger.error("Failed to load index.html:", err);
logger.error("Attempted path:", indexPath);
});
// Listen for console messages from the renderer
mainWindow.webContents.on("console-message", (_event, _level, message) => {
logger.log("Renderer Console:", message);
});
// Add right after creating the BrowserWindow
mainWindow.webContents.on(
"did-fail-load",
(_event, errorCode, errorDescription) => {
logger.error("Page failed to load:", errorCode, errorDescription);
},
);
mainWindow.webContents.on("preload-error", (_event, preloadPath, error) => {
logger.error("Preload script error:", preloadPath, error);
});
mainWindow.webContents.on(
"console-message",
(_event, _level, message, line, sourceId) => {
logger.log("Renderer Console:", line, sourceId, message);
},
);
// Enable remote debugging when in dev mode
if (isDev) {
mainWindow.webContents.openDevTools();
}
if (fs.existsSync(mainSrcPath)) {
fs.copyFileSync(mainSrcPath, mainDestPath);
console.log('Copied main process file successfully');
} else {
console.error('Main process file not found at:', mainSrcPath);
process.exit(1);
}
// Handle app ready
app.whenReady().then(createWindow);
// Handle all windows closed
app.on("window-all-closed", () => {
if (process.platform !== "darwin") {
app.quit();
}
});
app.on("activate", () => {
if (BrowserWindow.getAllWindows().length === 0) {
createWindow();
}
});
// Handle any errors
process.on("uncaughtException", (error) => {
logger.error("Uncaught Exception:", error);
});
`;
// Write the main process file
const mainDest = path.join(electronDistPath, 'main.js');
fs.writeFileSync(mainDest, mainContent);
// Copy preload script if it exists
const preloadSrc = path.join(__dirname, '..', 'src', 'electron', 'preload.js');
const preloadDest = path.join(electronDistPath, 'preload.js');
if (fs.existsSync(preloadSrc)) {
console.log(`Copying ${preloadSrc} to ${preloadDest}`);
fs.copyFileSync(preloadSrc, preloadDest);
}
// Verify build structure
console.log('\nVerifying build structure:');
console.log('Files in dist-electron:', fs.readdirSync(electronDistPath));
console.log('Build completed successfully!');
console.log('Electron build process completed successfully');

15
scripts/copy-wasm.js Normal file
View File

@@ -0,0 +1,15 @@
const fs = require('fs');
const path = require('path');
// Create public/wasm directory if it doesn't exist
const wasmDir = path.join(__dirname, '../public/wasm');
if (!fs.existsSync(wasmDir)) {
fs.mkdirSync(wasmDir, { recursive: true });
}
// Copy the WASM file from node_modules to public/wasm
const sourceFile = path.join(__dirname, '../node_modules/@jlongster/sql.js/dist/sql-wasm.wasm');
const targetFile = path.join(wasmDir, 'sql-wasm.wasm');
fs.copyFileSync(sourceFile, targetFile);
console.log('WASM file copied successfully!');

View File

@@ -330,8 +330,11 @@
<script lang="ts">
import { Vue, Component } from "vue-facing-decorator";
import { logConsoleAndDb, retrieveSettingsForActiveAccount } from "./db/index";
import { NotificationIface } from "./constants/app";
import { NotificationIface, USE_DEXIE_DB } from "./constants/app";
import * as databaseUtil from "./db/databaseUtil";
import { retrieveSettingsForActiveAccount } from "./db/index";
import { logConsoleAndDb } from "./db/databaseUtil";
import { logger } from "./utils/logger";
interface Settings {
@@ -396,7 +399,11 @@ export default class App extends Vue {
try {
logger.log("Retrieving settings for the active account...");
const settings: Settings = await retrieveSettingsForActiveAccount();
let settings: Settings =
await databaseUtil.retrieveSettingsForActiveAccount();
if (USE_DEXIE_DB) {
settings = await retrieveSettingsForActiveAccount();
}
logger.log("Retrieved settings:", settings);
const notifyingNewActivity = !!settings?.notifyingNewActivityTime;

75
src/assets/icons.json Normal file
View File

@@ -0,0 +1,75 @@
{
"warning": {
"fillRule": "evenodd",
"d": "M8.257 3.099c.765-1.36 2.722-1.36 3.486 0l5.58 9.92c.75 1.334-.213 2.98-1.742 2.98H4.42c-1.53 0-2.493-1.646-1.743-2.98l5.58-9.92zM11 13a1 1 0 11-2 0 1 1 0 012 0zm-1-8a1 1 0 00-1 1v3a1 1 0 002 0V6a1 1 0 00-1-1z",
"clipRule": "evenodd"
},
"spinner": {
"d": "M4 12a8 8 0 018-8V0C5.373 0 0 5.373 0 12h4zm2 5.291A7.962 7.962 0 014 12H0c0 3.042 1.135 5.824 3 7.938l3-2.647z"
},
"chart": {
"strokeLinecap": "round",
"strokeLinejoin": "round",
"strokeWidth": "2",
"d": "M9 19v-6a2 2 0 00-2-2H5a2 2 0 00-2 2v6a2 2 0 002 2h2a2 2 0 002-2zm0 0V9a2 2 0 012-2h2a2 2 0 012 2v10m-6 0a2 2 0 002 2h2a2 2 0 002-2m0 0V5a2 2 0 012-2h2a2 2 0 012 2v14a2 2 0 01-2 2h-2a2 2 0 01-2-2z"
},
"plus": {
"strokeLinecap": "round",
"strokeLinejoin": "round",
"strokeWidth": "2",
"d": "M12 4v16m8-8H4"
},
"settings": {
"strokeLinecap": "round",
"strokeLinejoin": "round",
"strokeWidth": "2",
"d": "M10.325 4.317c.426-1.756 2.924-1.756 3.35 0a1.724 1.724 0 002.573 1.066c1.543-.94 3.31.826 2.37 2.37a1.724 1.724 0 001.065 2.572c1.756.426 1.756 2.924 0 3.35a1.724 1.724 0 00-1.066 2.573c.94 1.543-.826 3.31-2.37 2.37a1.724 1.724 0 00-2.572 1.065c-.426 1.756-2.924 1.756-3.35 0a1.724 1.724 0 00-2.573-1.066c-1.543.94-3.31-.826-2.37-2.37a1.724 1.724 0 00-1.065-2.572c-1.756-.426-1.756-2.924 0-3.35a1.724 1.724 0 001.066-2.573c-.94-1.543.826-3.31 2.37-2.37.996.608 2.296.07 2.572-1.065z"
},
"settingsDot": {
"strokeLinecap": "round",
"strokeLinejoin": "round",
"strokeWidth": "2",
"d": "M15 12a3 3 0 11-6 0 3 3 0 016 0z"
},
"lock": {
"strokeLinecap": "round",
"strokeLinejoin": "round",
"strokeWidth": "2",
"d": "M12 15v2m-6 4h12a2 2 0 002-2v-6a2 2 0 00-2-2H6a2 2 0 00-2 2v6a2 2 0 002 2zm10-10V7a4 4 0 00-8 0v4h8z"
},
"download": {
"strokeLinecap": "round",
"strokeLinejoin": "round",
"strokeWidth": "2",
"d": "M12 10v6m0 0l-3-3m3 3l3-3m2 8H7a2 2 0 01-2-2V5a2 2 0 012-2h5.586a1 1 0 01.707.293l5.414 5.414a1 1 0 01.293.707V19a2 2 0 01-2 2z"
},
"check": {
"strokeLinecap": "round",
"strokeLinejoin": "round",
"strokeWidth": "2",
"d": "M9 12l2 2 4-4m6 2a9 9 0 11-18 0 9 9 0 0118 0z"
},
"edit": {
"strokeLinecap": "round",
"strokeLinejoin": "round",
"strokeWidth": "2",
"d": "M11 5H6a2 2 0 00-2 2v11a2 2 0 002 2h11a2 2 0 002-2v-5m-1.414-9.414a2 2 0 112.828 2.828L11.828 15H9v-2.828l8.586-8.586z"
},
"trash": {
"strokeLinecap": "round",
"strokeLinejoin": "round",
"strokeWidth": "2",
"d": "M19 7l-.867 12.142A2 2 0 0116.138 21H7.862a2 2 0 01-1.995-1.858L5 7m5 4v6m4-6v6m1-10V4a1 1 0 00-1-1h-4a1 1 0 00-1 1v3M4 7h16"
},
"plusCircle": {
"strokeLinecap": "round",
"strokeLinejoin": "round",
"strokeWidth": "2",
"d": "M12 6v6m0 0v6m0-6h6m-6 0H6"
},
"info": {
"fillRule": "evenodd",
"d": "M18 10a8 8 0 11-16 0 8 8 0 0116 0zm-7-4a1 1 0 11-2 0 1 1 0 012 0zM9 9a1 1 0 000 2v3a1 1 0 001 1h1a1 1 0 100-2v-3a1 1 0 00-1-1H9z",
"clipRule": "evenodd"
}
}

View File

@@ -14,22 +14,34 @@
class="flex items-center justify-between gap-2 text-lg bg-slate-200 border border-slate-300 border-b-0 rounded-t-md px-3 sm:px-4 py-1 sm:py-2"
>
<div class="flex items-center gap-2">
<div v-if="record.issuerDid">
<router-link
v-if="record.issuerDid && !isHiddenDid(record.issuerDid)"
:to="{
path: '/did/' + encodeURIComponent(record.issuerDid),
}"
title="More details about this person"
>
<EntityIcon
:entity-id="record.issuerDid"
class="rounded-full bg-white overflow-hidden !size-[2rem] object-cover"
/>
</div>
<div v-else>
<font-awesome
icon="person-circle-question"
class="text-slate-300 text-[2rem]"
/>
</div>
</router-link>
<font-awesome
v-else-if="isHiddenDid(record.issuerDid)"
icon="eye-slash"
class="text-slate-400 !size-[2rem] cursor-pointer"
@click="notifyHiddenPerson"
/>
<font-awesome
v-else
icon="person-circle-question"
class="text-slate-400 !size-[2rem] cursor-pointer"
@click="notifyUnknownPerson"
/>
<div>
<h3 class="font-semibold">
{{ record.issuer.known ? record.issuer.displayName : "" }}
<h3 v-if="record.issuer.known" class="font-semibold leading-tight">
{{ record.issuer.displayName }}
</h3>
<p class="ms-auto text-xs text-slate-500 italic">
{{ friendlyDate }}
@@ -37,7 +49,11 @@
</div>
</div>
<a class="cursor-pointer" @click="$emit('loadClaim', record.jwtId)">
<a
class="cursor-pointer"
data-testid="circle-info-link"
@click="$emit('loadClaim', record.jwtId)"
>
<font-awesome icon="circle-info" class="fa-fw text-slate-500" />
</a>
</div>
@@ -46,7 +62,7 @@
<!-- Record Image -->
<div
v-if="record.image"
class="bg-cover mb-6 -mt-3 sm:-mt-4 -mx-3 sm:-mx-4"
class="bg-cover mb-2 -mt-3 sm:-mt-4 -mx-3 sm:-mx-4"
:style="`background-image: url(${record.image});`"
>
<a
@@ -62,29 +78,59 @@
</a>
</div>
<!-- Description -->
<p class="font-medium">
<a class="cursor-pointer" @click="$emit('loadClaim', record.jwtId)">
{{ description }}
</a>
</p>
<div
class="relative flex justify-between gap-4 max-w-[40rem] mx-auto mb-5"
class="relative flex justify-between gap-4 max-w-[40rem] mx-auto mt-4"
>
<!-- Source -->
<div
class="w-[8rem] sm:w-[12rem] text-center bg-white border border-slate-200 rounded p-2 sm:p-3"
class="w-[7rem] sm:w-[12rem] text-center bg-white border border-slate-200 rounded p-2 sm:p-3"
>
<div class="relative w-fit mx-auto">
<div>
<!-- Project Icon -->
<div v-if="record.providerPlanName">
<ProjectIcon
:entity-id="record.providerPlanName"
:icon-size="48"
class="rounded size-[3rem] sm:size-[4rem] *:w-full *:h-full"
/>
<router-link
:to="{
path:
'/project/' +
encodeURIComponent(record.providerPlanHandleId || ''),
}"
title="View project details"
>
<ProjectIcon
:entity-id="record.providerPlanHandleId || ''"
:icon-size="48"
class="rounded size-[3rem] sm:size-[4rem] *:w-full *:h-full"
/>
</router-link>
</div>
<!-- Identicon for DIDs -->
<div v-else-if="record.agentDid">
<EntityIcon
:entity-id="record.agentDid"
:profile-image-url="record.issuer.profileImageUrl"
class="rounded-full bg-slate-100 overflow-hidden !size-[3rem] sm:!size-[4rem]"
<router-link
v-if="!isHiddenDid(record.agentDid)"
:to="{
path: '/did/' + encodeURIComponent(record.agentDid),
}"
title="More details about this person"
>
<EntityIcon
:entity-id="record.agentDid"
:profile-image-url="record.issuer.profileImageUrl"
class="rounded-full bg-slate-100 overflow-hidden !size-[3rem] sm:!size-[4rem]"
/>
</router-link>
<font-awesome
v-else
icon="eye-slash"
class="text-slate-300 !size-[3rem] sm:!size-[4rem]"
@click="notifyHiddenPerson"
/>
</div>
<!-- Unknown Person -->
@@ -92,6 +138,7 @@
<font-awesome
icon="person-circle-question"
class="text-slate-300 text-[3rem] sm:text-[4rem]"
@click="notifyUnknownPerson"
/>
</div>
</div>
@@ -110,9 +157,11 @@
<!-- Arrow -->
<div
class="absolute inset-x-[8rem] sm:inset-x-[12rem] mx-2 top-1/2 -translate-y-1/2"
class="absolute inset-x-[7rem] sm:inset-x-[12rem] mx-2 top-1/2 -translate-y-1/2"
>
<div class="text-sm text-center leading-none font-semibold pe-[15px]">
<div
class="text-sm text-center leading-none font-semibold pe-2 sm:pe-4"
>
{{ fetchAmount }}
</div>
@@ -129,24 +178,47 @@
<!-- Destination -->
<div
class="w-[8rem] sm:w-[12rem] text-center bg-white border border-slate-200 rounded p-2 sm:p-3"
class="w-[7rem] sm:w-[12rem] text-center bg-white border border-slate-200 rounded p-2 sm:p-3"
>
<div class="relative w-fit mx-auto">
<div>
<!-- Project Icon -->
<div v-if="record.recipientProjectName">
<ProjectIcon
:entity-id="record.recipientProjectName"
:icon-size="48"
class="rounded size-[3rem] sm:size-[4rem] *:w-full *:h-full"
/>
<router-link
:to="{
path:
'/project/' +
encodeURIComponent(record.fulfillsPlanHandleId || ''),
}"
title="View project details"
>
<ProjectIcon
:entity-id="record.fulfillsPlanHandleId || ''"
:icon-size="48"
class="rounded size-[3rem] sm:size-[4rem] *:w-full *:h-full"
/>
</router-link>
</div>
<!-- Identicon for DIDs -->
<div v-else-if="record.recipientDid">
<EntityIcon
:entity-id="record.recipientDid"
:profile-image-url="record.receiver.profileImageUrl"
class="rounded-full bg-slate-100 overflow-hidden !size-[3rem] sm:!size-[4rem]"
<router-link
v-if="!isHiddenDid(record.recipientDid)"
:to="{
path: '/did/' + encodeURIComponent(record.recipientDid),
}"
title="More details about this person"
>
<EntityIcon
:entity-id="record.recipientDid"
:profile-image-url="record.receiver.profileImageUrl"
class="rounded-full bg-slate-100 overflow-hidden !size-[3rem] sm:!size-[4rem]"
/>
</router-link>
<font-awesome
v-else
icon="eye-slash"
class="text-slate-300 !size-[3rem] sm:!size-[4rem]"
@click="notifyHiddenPerson"
/>
</div>
<!-- Unknown Person -->
@@ -154,6 +226,7 @@
<font-awesome
icon="person-circle-question"
class="text-slate-300 text-[3rem] sm:text-[4rem]"
@click="notifyUnknownPerson"
/>
</div>
</div>
@@ -170,13 +243,6 @@
</div>
</div>
</div>
<!-- Description -->
<p class="font-medium">
<a class="cursor-pointer" @click="$emit('loadClaim', record.jwtId)">
{{ description }}
</a>
</p>
</div>
</li>
</template>
@@ -186,8 +252,9 @@ import { Component, Prop, Vue, Emit } from "vue-facing-decorator";
import { GiveRecordWithContactInfo } from "../types";
import EntityIcon from "./EntityIcon.vue";
import { isGiveClaimType, notifyWhyCannotConfirm } from "../libs/util";
import { containsHiddenDid } from "../libs/endorserServer";
import { containsHiddenDid, isHiddenDid } from "../libs/endorserServer";
import ProjectIcon from "./ProjectIcon.vue";
import { NotificationIface } from "../constants/app";
@Component({
components: {
@@ -202,6 +269,33 @@ export default class ActivityListItem extends Vue {
@Prop() activeDid!: string;
@Prop() confirmerIdList?: string[];
isHiddenDid = isHiddenDid;
$notify!: (notification: NotificationIface, timeout?: number) => void;
notifyHiddenPerson() {
this.$notify(
{
group: "alert",
type: "warning",
title: "Person Outside Your Network",
text: "This person is not visible to you.",
},
3000,
);
}
notifyUnknownPerson() {
this.$notify(
{
group: "alert",
type: "warning",
title: "Unidentified Person",
text: "Nobody specific was recognized.",
},
3000,
);
}
@Emit()
cacheImage(image: string) {
return image;
@@ -222,7 +316,7 @@ export default class ActivityListItem extends Vue {
const claim =
(this.record.fullClaim as unknown).claim || this.record.fullClaim;
return `${claim.description}`;
return `${claim?.description || ""}`;
}
private displayAmount(code: string, amt: number) {

View File

@@ -24,9 +24,7 @@ backup and database export, with platform-specific download instructions. * *
class="block w-full text-center text-md bg-gradient-to-b from-blue-400 to-blue-700 shadow-[inset_0_-1px_0_0_rgba(0,0,0,0.5)] text-white px-1.5 py-2 rounded-md"
@click="exportDatabase()"
>
Download Settings & Contacts
<br />
(excluding Identifier Data)
Download Contacts
</button>
<a
ref="downloadLink"
@@ -62,14 +60,18 @@ backup and database export, with platform-specific download instructions. * *
<script lang="ts">
import { Component, Prop, Vue } from "vue-facing-decorator";
import { NotificationIface } from "../constants/app";
import { db } from "../db/index";
import { AppString, NotificationIface } from "../constants/app";
import { Contact } from "../db/tables/contacts";
import * as databaseUtil from "../db/databaseUtil";
import { logger } from "../utils/logger";
import { PlatformServiceFactory } from "../services/PlatformServiceFactory";
import {
PlatformService,
PlatformCapabilities,
} from "../services/PlatformService";
import { contactsToExportJson } from "../libs/util";
/**
* @vue-component
@@ -131,21 +133,25 @@ export default class DataExportSection extends Vue {
*/
public async exportDatabase() {
try {
const blob = await db.export({
prettyJson: true,
transform: (table, value, key) => {
if (table === "contacts") {
// Dexie inserts a number 0 when some are undefined, so we need to totally remove them.
Object.keys(value).forEach((prop) => {
if (value[prop] === undefined) {
delete value[prop];
}
});
}
return { value, key };
},
});
const fileName = `${db.name}-backup.json`;
let allContacts: Contact[] = [];
const platformService = PlatformServiceFactory.getInstance();
const result = await platformService.dbQuery(`SELECT * FROM contacts`);
if (result) {
allContacts = databaseUtil.mapQueryResultToValues(
result,
) as unknown as Contact[];
}
// if (USE_DEXIE_DB) {
// await db.open();
// allContacts = await db.contacts.toArray();
// }
// Convert contacts to export format
const exportData = contactsToExportJson(allContacts);
const jsonStr = JSON.stringify(exportData, null, 2);
const blob = new Blob([jsonStr], { type: "application/json" });
const fileName = `${AppString.APP_NAME_NO_SPACES}-backup-contacts.json`;
if (this.platformCapabilities.hasFileDownload) {
// Web platform: Use download link
@@ -157,8 +163,9 @@ export default class DataExportSection extends Vue {
setTimeout(() => URL.revokeObjectURL(this.downloadUrl), 1000);
} else if (this.platformCapabilities.hasFileSystem) {
// Native platform: Write to app directory
const content = await blob.text();
await this.platformService.writeAndShareFile(fileName, content);
await this.platformService.writeAndShareFile(fileName, jsonStr);
} else {
throw new Error("This platform does not support file downloads.");
}
this.$notify(
@@ -167,10 +174,10 @@ export default class DataExportSection extends Vue {
type: "success",
title: "Export Successful",
text: this.platformCapabilities.hasFileDownload
? "See your downloads directory for the backup. It is in the Dexie format."
: "You should have been prompted to save your backup file.",
? "See your downloads directory for the backup."
: "The backup file has been saved.",
},
-1,
3000,
);
} catch (error) {
logger.error("Export Error:", error);

View File

@@ -99,6 +99,9 @@ import {
LTileLayer,
} from "@vue-leaflet/vue-leaflet";
import { Router } from "vue-router";
import { USE_DEXIE_DB } from "@/constants/app";
import * as databaseUtil from "../db/databaseUtil";
import { MASTER_SETTINGS_KEY } from "../db/tables/settings";
import { db, retrieveSettingsForActiveAccount } from "../db/index";
@@ -122,7 +125,10 @@ export default class FeedFilters extends Vue {
async open(onCloseIfChanged: () => void) {
this.onCloseIfChanged = onCloseIfChanged;
const settings = await retrieveSettingsForActiveAccount();
let settings = await databaseUtil.retrieveSettingsForActiveAccount();
if (USE_DEXIE_DB) {
settings = await retrieveSettingsForActiveAccount();
}
this.hasVisibleDid = !!settings.filterFeedByVisible;
this.isNearby = !!settings.filterFeedByNearby;
if (settings.searchBoxes && settings.searchBoxes.length > 0) {
@@ -136,17 +142,29 @@ export default class FeedFilters extends Vue {
async toggleHasVisibleDid() {
this.settingChanged = true;
this.hasVisibleDid = !this.hasVisibleDid;
await db.settings.update(MASTER_SETTINGS_KEY, {
await databaseUtil.updateDefaultSettings({
filterFeedByVisible: this.hasVisibleDid,
});
if (USE_DEXIE_DB) {
await db.settings.update(MASTER_SETTINGS_KEY, {
filterFeedByVisible: this.hasVisibleDid,
});
}
}
async toggleNearby() {
this.settingChanged = true;
this.isNearby = !this.isNearby;
await db.settings.update(MASTER_SETTINGS_KEY, {
await databaseUtil.updateDefaultSettings({
filterFeedByNearby: this.isNearby,
});
if (USE_DEXIE_DB) {
await db.settings.update(MASTER_SETTINGS_KEY, {
filterFeedByNearby: this.isNearby,
});
}
}
async clearAll() {
@@ -154,11 +172,18 @@ export default class FeedFilters extends Vue {
this.settingChanged = true;
}
await db.settings.update(MASTER_SETTINGS_KEY, {
await databaseUtil.updateDefaultSettings({
filterFeedByNearby: false,
filterFeedByVisible: false,
});
if (USE_DEXIE_DB) {
await db.settings.update(MASTER_SETTINGS_KEY, {
filterFeedByNearby: false,
filterFeedByVisible: false,
});
}
this.hasVisibleDid = false;
this.isNearby = false;
}
@@ -168,11 +193,18 @@ export default class FeedFilters extends Vue {
this.settingChanged = true;
}
await db.settings.update(MASTER_SETTINGS_KEY, {
await databaseUtil.updateDefaultSettings({
filterFeedByNearby: true,
filterFeedByVisible: true,
});
if (USE_DEXIE_DB) {
await db.settings.update(MASTER_SETTINGS_KEY, {
filterFeedByNearby: true,
filterFeedByVisible: true,
});
}
this.hasVisibleDid = true;
this.isNearby = true;
}

View File

@@ -89,7 +89,7 @@
<script lang="ts">
import { Vue, Component, Prop } from "vue-facing-decorator";
import { NotificationIface } from "../constants/app";
import { NotificationIface, USE_DEXIE_DB } from "../constants/app";
import {
createAndSubmitGive,
didInfo,
@@ -98,8 +98,10 @@ import {
import * as libsUtil from "../libs/util";
import { db, retrieveSettingsForActiveAccount } from "../db/index";
import { Contact } from "../db/tables/contacts";
import * as databaseUtil from "../db/databaseUtil";
import { retrieveAccountDids } from "../libs/util";
import { logger } from "../utils/logger";
import { PlatformServiceFactory } from "@/services/PlatformServiceFactory";
@Component
export default class GiftedDialog extends Vue {
@@ -144,11 +146,23 @@ export default class GiftedDialog extends Vue {
this.offerId = offerId || "";
try {
const settings = await retrieveSettingsForActiveAccount();
let settings = await databaseUtil.retrieveSettingsForActiveAccount();
if (USE_DEXIE_DB) {
settings = await retrieveSettingsForActiveAccount();
}
this.apiServer = settings.apiServer || "";
this.activeDid = settings.activeDid || "";
this.allContacts = await db.contacts.toArray();
const platformService = PlatformServiceFactory.getInstance();
const result = await platformService.dbQuery(`SELECT * FROM contacts`);
if (result) {
this.allContacts = databaseUtil.mapQueryResultToValues(
result,
) as unknown as Contact[];
}
if (USE_DEXIE_DB) {
this.allContacts = await db.contacts.toArray();
}
this.allMyDids = await retrieveAccountDids();
@@ -306,11 +320,8 @@ export default class GiftedDialog extends Vue {
this.fromProjectId,
);
if (
result.type === "error" ||
this.isGiveCreationError(result.response)
) {
const errorMessage = this.getGiveCreationErrorMessage(result);
if (!result.success) {
const errorMessage = result.error;
logger.error("Error with give creation result:", result);
this.$notify(
{
@@ -356,28 +367,6 @@ export default class GiftedDialog extends Vue {
// Helper functions for readability
/**
* @param result response "data" from the server
* @returns true if the result indicates an error
*/
// eslint-disable-next-line @typescript-eslint/no-explicit-any
isGiveCreationError(result: any) {
return result.status !== 201 || result.data?.error;
}
/**
* @param result direct response eg. ErrorResult or SuccessResult (potentially with embedded "data")
* @returns best guess at an error message
*/
// eslint-disable-next-line @typescript-eslint/no-explicit-any
getGiveCreationErrorMessage(result: any) {
return (
result.error?.userMessage ||
result.error?.error ||
result.response?.data?.error?.message
);
}
explainData() {
this.$notify(
{

View File

@@ -74,10 +74,12 @@
import { Vue, Component } from "vue-facing-decorator";
import { Router } from "vue-router";
import { AppString, NotificationIface } from "../constants/app";
import { AppString, NotificationIface, USE_DEXIE_DB } from "../constants/app";
import { db } from "../db/index";
import { Contact } from "../db/tables/contacts";
import * as databaseUtil from "../db/databaseUtil";
import { GiverReceiverInputInfo } from "../libs/util";
import { PlatformServiceFactory } from "@/services/PlatformServiceFactory";
@Component
export default class GivenPrompts extends Vue {
@@ -127,8 +129,16 @@ export default class GivenPrompts extends Vue {
this.visible = true;
this.callbackOnFullGiftInfo = callbackOnFullGiftInfo;
await db.open();
this.numContacts = await db.contacts.count();
const platformService = PlatformServiceFactory.getInstance();
const result = await platformService.dbQuery(
"SELECT COUNT(*) FROM contacts",
);
if (result) {
this.numContacts = result.values[0][0] as number;
}
if (USE_DEXIE_DB) {
this.numContacts = await db.contacts.count();
}
this.shownContactDbIndices = new Array<boolean>(this.numContacts); // all undefined to start
}
@@ -217,6 +227,7 @@ export default class GivenPrompts extends Vue {
let someContactDbIndex = Math.floor(Math.random() * this.numContacts);
let count = 0;
// as long as the index has an entry, loop
while (
this.shownContactDbIndices[someContactDbIndex] != null &&
@@ -229,10 +240,21 @@ export default class GivenPrompts extends Vue {
this.nextIdeaPastContacts();
} else {
// get the contact at that offset
await db.open();
this.currentContact = await db.contacts
.offset(someContactDbIndex)
.first();
const platformService = PlatformServiceFactory.getInstance();
const result = await platformService.dbQuery(
"SELECT * FROM contacts LIMIT 1 OFFSET ?",
[someContactDbIndex],
);
if (result) {
const mappedContacts = databaseUtil.mapQueryResultToValues(result);
this.currentContact = mappedContacts[0] as unknown as Contact;
}
if (USE_DEXIE_DB) {
await db.open();
this.currentContact = await db.contacts
.offset(someContactDbIndex)
.first();
}
this.shownContactDbIndices[someContactDbIndex] = true;
}
}

View File

@@ -48,16 +48,15 @@
<span>
{{ didInfo(visDid) }}
<span v-if="!serverUtil.isEmptyOrHiddenDid(visDid)">
<a
:href="`/did/${visDid}`"
target="_blank"
<router-link
:to="{ path: '/did/' + encodeURIComponent(visDid) }"
class="text-blue-500"
>
<font-awesome
icon="arrow-up-right-from-square"
class="fa-fw"
/>
</a>
</router-link>
</span>
</span>
</div>

View File

@@ -0,0 +1,90 @@
<template>
<svg
v-if="iconData"
:class="svgClass"
:fill="fill"
:stroke="stroke"
:viewBox="viewBox"
xmlns="http://www.w3.org/2000/svg"
>
<path v-for="(path, index) in iconData.paths" :key="index" v-bind="path" />
</svg>
</template>
<script lang="ts">
import { Component, Prop, Vue } from "vue-facing-decorator";
import icons from "../assets/icons.json";
import { logger } from "../utils/logger";
/**
* Icon path interface
*/
interface IconPath {
d: string;
fillRule?: string;
clipRule?: string;
strokeLinecap?: string;
strokeLinejoin?: string;
strokeWidth?: string | number;
fill?: string;
stroke?: string;
}
/**
* Icon data interface
*/
interface IconData {
paths: IconPath[];
}
/**
* Icons JSON structure
*/
interface IconsJson {
[key: string]: IconPath | IconData;
}
/**
* Icon Renderer Component
*
* This component loads SVG icon definitions from a JSON file and renders them
* as SVG elements. It provides a clean way to use icons without cluttering
* templates with long SVG path definitions.
*
* @author Matthew Raymer
* @version 1.0.0
* @since 2024
*/
@Component({
name: "IconRenderer",
})
export default class IconRenderer extends Vue {
@Prop({ required: true }) readonly iconName!: string;
@Prop({ default: "h-5 w-5" }) readonly svgClass!: string;
@Prop({ default: "none" }) readonly fill!: string;
@Prop({ default: "currentColor" }) readonly stroke!: string;
@Prop({ default: "0 0 24 24" }) readonly viewBox!: string;
/**
* Get the icon data for the specified icon name
*
* @returns {IconData | null} The icon data object or null if not found
*/
get iconData(): IconData | null {
const icon = (icons as IconsJson)[this.iconName];
if (!icon) {
logger.warn(`Icon "${this.iconName}" not found in icons.json`);
return null;
}
// Convert single path to array format for consistency
if ("d" in icon) {
return {
paths: [icon as IconPath],
};
}
return icon as IconData;
}
}
</script>
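A minimal usage sketch (the parent component and utility classes are illustrative, not taken from the app): after registering IconRenderer in a parent's components option, reference any key from src/assets/icons.json. Stroke-based icons such as "trash" work with the defaults, while filled icons such as "warning" presumably want fill="currentColor" and stroke="none":
<IconRenderer icon-name="trash" svg-class="h-5 w-5 text-red-600" />
<IconRenderer
  icon-name="warning"
  svg-class="h-5 w-5 text-amber-500"
  fill="currentColor"
  stroke="none"
/>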

View File

@@ -4,7 +4,9 @@
<div class="text-lg text-center font-bold relative">
<h1 id="ViewHeading" class="text-center font-bold">
<span v-if="uploading">Uploading Image&hellip;</span>
<span v-else-if="blob">{{ crop ? 'Crop Image' : 'Preview Image' }}</span>
<span v-else-if="blob">{{
crop ? "Crop Image" : "Preview Image"
}}</span>
<span v-else-if="showCameraPreview">Upload Image</span>
<span v-else>Add Photo</span>
</h1>
@@ -119,7 +121,9 @@
playsinline
muted
></video>
<div class="absolute bottom-4 inset-x-0 flex items-center justify-center gap-4">
<div
class="absolute bottom-4 inset-x-0 flex items-center justify-center gap-4"
>
<button
class="bg-white text-slate-800 p-3 rounded-full text-2xl leading-none"
@click="capturePhoto"
@@ -238,12 +242,12 @@
<p class="mb-2">
Before you can upload a photo, a friend needs to register you.
</p>
<router-link
:to="{ name: 'contact-qr' }"
<button
class="inline-block text-md uppercase bg-gradient-to-b from-blue-400 to-blue-700 shadow-[inset_0_-1px_0_0_rgba(0,0,0,0.5)] text-white px-4 py-2 rounded-md"
@click="handleQRCodeClick"
>
Share Your Info
</router-link>
</button>
</div>
</template>
</div>
@@ -256,11 +260,17 @@ import axios from "axios";
import { ref } from "vue";
import { Component, Vue } from "vue-facing-decorator";
import VuePictureCropper, { cropper } from "vue-picture-cropper";
import { DEFAULT_IMAGE_API_SERVER, NotificationIface } from "../constants/app";
import { Capacitor } from "@capacitor/core";
import {
DEFAULT_IMAGE_API_SERVER,
NotificationIface,
USE_DEXIE_DB,
} from "../constants/app";
import { retrieveSettingsForActiveAccount } from "../db/index";
import { accessToken } from "../libs/crypto";
import { logger } from "../utils/logger";
import { PlatformServiceFactory } from "../services/PlatformServiceFactory";
import * as databaseUtil from "../db/databaseUtil";
const inputImageFileNameRef = ref<Blob>();
@@ -273,9 +283,9 @@ const inputImageFileNameRef = ref<Blob>();
},
defaultCameraMode: {
type: String,
default: 'environment',
validator: (value: string) => ['environment', 'user'].includes(value)
}
default: "environment",
validator: (value: string) => ["environment", "user"].includes(value),
},
},
})
export default class ImageMethodDialog extends Vue {
@@ -318,7 +328,7 @@ export default class ImageMethodDialog extends Vue {
private cameraStream: MediaStream | null = null;
/** Current camera facing mode */
private currentFacingMode: 'environment' | 'user' = 'environment';
private currentFacingMode: "environment" | "user" = "environment";
private platformService = PlatformServiceFactory.getInstance();
URL = window.URL || window.webkitURL;
@@ -350,9 +360,11 @@ export default class ImageMethodDialog extends Vue {
* @throws {Error} When settings retrieval fails
*/
async mounted() {
logger.log("ImageMethodDialog mounted");
try {
const settings = await retrieveSettingsForActiveAccount();
let settings = await databaseUtil.retrieveSettingsForActiveAccount();
if (USE_DEXIE_DB) {
settings = await retrieveSettingsForActiveAccount();
}
this.activeDid = settings.activeDid || "";
} catch (error: unknown) {
logger.error("Error retrieving settings from database:", error);
@@ -384,7 +396,7 @@ export default class ImageMethodDialog extends Vue {
this.crop = !!crop;
this.imageCallback = setImageFn;
this.visible = true;
this.currentFacingMode = this.defaultCameraMode as 'environment' | 'user';
this.currentFacingMode = this.defaultCameraMode as "environment" | "user";
// Start camera preview immediately
logger.debug("Starting camera preview from open()");
@@ -458,7 +470,10 @@ export default class ImageMethodDialog extends Vue {
logger.debug("Current showCameraPreview state:", this.showCameraPreview);
logger.debug("Platform capabilities:", this.platformCapabilities);
logger.debug("MediaDevices available:", !!navigator.mediaDevices);
logger.debug("getUserMedia available:", !!(navigator.mediaDevices && navigator.mediaDevices.getUserMedia));
logger.debug(
"getUserMedia available:",
!!(navigator.mediaDevices && navigator.mediaDevices.getUserMedia),
);
try {
this.cameraState = "initializing";
@@ -485,13 +500,16 @@ export default class ImageMethodDialog extends Vue {
videoElement.srcObject = stream;
await new Promise((resolve) => {
videoElement.onloadedmetadata = () => {
videoElement.play().then(() => {
logger.debug("Video element started playing");
resolve(true);
}).catch(error => {
logger.error("Error playing video:", error);
throw error;
});
videoElement
.play()
.then(() => {
logger.debug("Video element started playing");
resolve(true);
})
.catch((error) => {
logger.error("Error playing video:", error);
throw error;
});
};
});
} else {
@@ -503,17 +521,16 @@ export default class ImageMethodDialog extends Vue {
let errorMessage =
error instanceof Error ? error.message : "Failed to access camera";
if (
error instanceof Error && (
error.name === "NotReadableError" ||
error.name === "TrackStartError"
)) {
error instanceof Error &&
(error.name === "NotReadableError" || error.name === "TrackStartError")
) {
errorMessage =
"Camera is in use by another application. Please close any other apps or browser tabs using the camera and try again.";
} else if (
error instanceof Error && (
error.name === "NotAllowedError" ||
error.name === "PermissionDeniedError"
)) {
error instanceof Error &&
(error.name === "NotAllowedError" ||
error.name === "PermissionDeniedError")
) {
errorMessage =
"Camera access was denied. Please allow camera access in your browser settings.";
}
@@ -583,11 +600,12 @@ export default class ImageMethodDialog extends Vue {
async rotateCamera() {
// Toggle between front and back cameras
this.currentFacingMode = this.currentFacingMode === 'environment' ? 'user' : 'environment';
this.currentFacingMode =
this.currentFacingMode === "environment" ? "user" : "environment";
// Stop current stream
if (this.cameraStream) {
this.cameraStream.getTracks().forEach(track => track.stop());
this.cameraStream.getTracks().forEach((track) => track.stop());
this.cameraStream = null;
}
@@ -692,6 +710,14 @@ export default class ImageMethodDialog extends Vue {
toggleDiagnostics() {
this.showDiagnostics = !this.showDiagnostics;
}
private handleQRCodeClick() {
if (Capacitor.isNativePlatform()) {
this.$router.push({ name: "contact-qr-scan-full" });
} else {
this.$router.push({ name: "contact-qr" });
}
}
}
</script>

View File

@@ -172,8 +172,10 @@ import {
} from "../libs/endorserServer";
import { decryptMessage } from "../libs/crypto";
import { Contact } from "../db/tables/contacts";
import * as databaseUtil from "../db/databaseUtil";
import * as libsUtil from "../libs/util";
import { NotificationIface } from "../constants/app";
import { NotificationIface, USE_DEXIE_DB } from "../constants/app";
import { PlatformServiceFactory } from "@/services/PlatformServiceFactory";
interface Member {
admitted: boolean;
@@ -209,7 +211,10 @@ export default class MembersList extends Vue {
contacts: Array<Contact> = [];
async created() {
const settings = await retrieveSettingsForActiveAccount();
let settings = await databaseUtil.retrieveSettingsForActiveAccount();
if (USE_DEXIE_DB) {
settings = await retrieveSettingsForActiveAccount();
}
this.activeDid = settings.activeDid || "";
this.apiServer = settings.apiServer || "";
this.firstName = settings.firstName || "";
@@ -355,7 +360,16 @@ export default class MembersList extends Vue {
}
async loadContacts() {
this.contacts = await db.contacts.toArray();
const platformService = PlatformServiceFactory.getInstance();
const result = await platformService.dbQuery("SELECT * FROM contacts");
if (result) {
this.contacts = databaseUtil.mapQueryResultToValues(
result,
) as unknown as Contact[];
}
if (USE_DEXIE_DB) {
this.contacts = await db.contacts.toArray();
}
}
getContactFor(did: string): Contact | undefined {
@@ -439,7 +453,14 @@ export default class MembersList extends Vue {
if (result.success) {
decrMember.isRegistered = true;
if (oldContact) {
await db.contacts.update(decrMember.did, { registered: true });
const platformService = PlatformServiceFactory.getInstance();
await platformService.dbExec(
"UPDATE contacts SET registered = ? WHERE did = ?",
[true, decrMember.did],
);
if (USE_DEXIE_DB) {
await db.contacts.update(decrMember.did, { registered: true });
}
oldContact.registered = true;
}
this.$notify(
@@ -492,7 +513,14 @@ export default class MembersList extends Vue {
name: member.name,
};
await db.contacts.add(newContact);
const platformService = PlatformServiceFactory.getInstance();
await platformService.dbExec(
"INSERT INTO contacts (did, name) VALUES (?, ?)",
[member.did, member.name],
);
if (USE_DEXIE_DB) {
await db.contacts.add(newContact);
}
this.contacts.push(newContact);
this.$notify(

View File

@@ -82,12 +82,10 @@
<script lang="ts">
import { Vue, Component, Prop } from "vue-facing-decorator";
import { NotificationIface } from "../constants/app";
import {
createAndSubmitOffer,
serverMessageForUser,
} from "../libs/endorserServer";
import { NotificationIface, USE_DEXIE_DB } from "../constants/app";
import { createAndSubmitOffer } from "../libs/endorserServer";
import * as libsUtil from "../libs/util";
import * as databaseUtil from "../db/databaseUtil";
import { retrieveSettingsForActiveAccount } from "../db/index";
import { logger } from "../utils/logger";
@@ -116,7 +114,10 @@ export default class OfferDialog extends Vue {
this.recipientDid = recipientDid;
this.recipientName = recipientName;
const settings = await retrieveSettingsForActiveAccount();
let settings = await databaseUtil.retrieveSettingsForActiveAccount();
if (USE_DEXIE_DB) {
settings = await retrieveSettingsForActiveAccount();
}
this.apiServer = settings.apiServer || "";
this.activeDid = settings.activeDid || "";
@@ -245,11 +246,8 @@ export default class OfferDialog extends Vue {
this.projectId,
);
if (
result.type === "error" ||
this.isOfferCreationError(result.response)
) {
const errorMessage = this.getOfferCreationErrorMessage(result);
if (!result.success) {
const errorMessage = result.error;
logger.error("Error with offer creation result:", result);
this.$notify(
{
@@ -289,30 +287,6 @@ export default class OfferDialog extends Vue {
);
}
}
// Helper functions for readability
/**
* @param result response "data" from the server
* @returns true if the result indicates an error
*/
// eslint-disable-next-line @typescript-eslint/no-explicit-any
isOfferCreationError(result: any) {
return result.status !== 201 || result.data?.error;
}
/**
* @param result direct response eg. ErrorResult or SuccessResult (potentially with embedded "data")
* @returns best guess at an error message
*/
// eslint-disable-next-line @typescript-eslint/no-explicit-any
getOfferCreationErrorMessage(result: any) {
return (
serverMessageForUser(result) ||
result.error?.userMessage ||
result.error?.error
);
}
}
</script>

View File

@@ -5,7 +5,7 @@
<h1 class="text-xl font-bold text-center mb-4 relative">
Welcome to Time Safari
<br />
- Showcasing Gratitude & Magnifying Time
- Showcase Impact & Magnify Time
<div
class="text-lg text-center leading-none absolute right-0 -top-1"
@click="onClickClose(true)"
@@ -14,6 +14,9 @@
</div>
</h1>
The feed underneath this pop-up shows the latest contributions, some from
people and some from projects.
<p v-if="isRegistered" class="mt-4">
You can now log things that you've seen:
<span v-if="numContacts > 0">
@@ -23,14 +26,10 @@
<span class="bg-green-600 text-white rounded-full">
<font-awesome icon="plus" class="fa-fw" />
</span>
button to express your appreciation for... whatever -- maybe thanks for
showing you all these fascinating stories of
<em>gratitude</em>.
button to express your appreciation for... whatever.
</p>
<p v-else class="mt-4">
The feed underneath this pop-up shows the latest gifts that others have
recognized. Once someone registers you, you can log your appreciation,
too.
<p class="mt-4">
Once someone registers you, you can log your appreciation, too.
</p>
<p class="mt-4">
@@ -201,13 +200,16 @@
import { Component, Vue } from "vue-facing-decorator";
import { Router } from "vue-router";
import { NotificationIface } from "../constants/app";
import { NotificationIface, USE_DEXIE_DB } from "../constants/app";
import {
db,
retrieveSettingsForActiveAccount,
updateAccountSettings,
} from "../db/index";
import * as databaseUtil from "../db/databaseUtil";
import { OnboardPage } from "../libs/util";
import { PlatformServiceFactory } from "@/services/PlatformServiceFactory";
import { Contact } from "@/db/tables/contacts";
@Component({
computed: {
@@ -222,7 +224,7 @@ export default class OnboardingDialog extends Vue {
$router!: Router;
activeDid = "";
firstContactName = null;
firstContactName = "";
givenName = "";
isRegistered = false;
numContacts = 0;
@@ -231,29 +233,54 @@ export default class OnboardingDialog extends Vue {
async open(page: OnboardPage) {
this.page = page;
const settings = await retrieveSettingsForActiveAccount();
let settings = await databaseUtil.retrieveSettingsForActiveAccount();
if (USE_DEXIE_DB) {
settings = await retrieveSettingsForActiveAccount();
}
this.activeDid = settings.activeDid || "";
this.isRegistered = !!settings.isRegistered;
const contacts = await db.contacts.toArray();
this.numContacts = contacts.length;
if (this.numContacts > 0) {
this.firstContactName = contacts[0].name;
const platformService = PlatformServiceFactory.getInstance();
const dbContacts = await platformService.dbQuery("SELECT * FROM contacts");
if (dbContacts) {
this.numContacts = dbContacts.values.length;
const firstContact = dbContacts.values[0];
const fullContact = databaseUtil.mapColumnsToValues(dbContacts.columns, [
firstContact,
]) as unknown as Contact;
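// (mapColumnsToValues returns an array of rows, so the first element is the contact)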
this.firstContactName = fullContact.name || "";
}
if (USE_DEXIE_DB) {
const contacts = await db.contacts.toArray();
this.numContacts = contacts.length;
if (this.numContacts > 0) {
this.firstContactName = contacts[0].name || "";
}
}
this.visible = true;
if (this.page === OnboardPage.Create) {
// we'll assume that they've been through all the other pages
await updateAccountSettings(this.activeDid, {
await databaseUtil.updateDidSpecificSettings(this.activeDid, {
finishedOnboarding: true,
});
if (USE_DEXIE_DB) {
await updateAccountSettings(this.activeDid, {
finishedOnboarding: true,
});
}
}
}
async onClickClose(done?: boolean, goHome?: boolean) {
this.visible = false;
if (done) {
await updateAccountSettings(this.activeDid, {
await databaseUtil.updateDidSpecificSettings(this.activeDid, {
finishedOnboarding: true,
});
if (USE_DEXIE_DB) {
await updateAccountSettings(this.activeDid, {
finishedOnboarding: true,
});
}
if (goHome) {
this.$router.push({ name: "home" });
}

View File

@@ -119,7 +119,12 @@ PhotoDialog.vue */
import axios from "axios";
import { Component, Vue } from "vue-facing-decorator";
import VuePictureCropper, { cropper } from "vue-picture-cropper";
import { DEFAULT_IMAGE_API_SERVER, NotificationIface } from "../constants/app";
import {
DEFAULT_IMAGE_API_SERVER,
NotificationIface,
USE_DEXIE_DB,
} from "../constants/app";
import * as databaseUtil from "../db/databaseUtil";
import { retrieveSettingsForActiveAccount } from "../db/index";
import { accessToken } from "../libs/crypto";
import { logger } from "../utils/logger";
@@ -173,9 +178,12 @@ export default class PhotoDialog extends Vue {
* @throws {Error} When settings retrieval fails
*/
async mounted() {
logger.log("PhotoDialog mounted");
// logger.log("PhotoDialog mounted");
try {
const settings = await retrieveSettingsForActiveAccount();
let settings = await databaseUtil.retrieveSettingsForActiveAccount();
if (USE_DEXIE_DB) {
settings = await retrieveSettingsForActiveAccount();
}
this.activeDid = settings.activeDid || "";
this.isRegistered = !!settings.isRegistered;
logger.log("isRegistered:", this.isRegistered);

View File

@@ -1,18 +1,14 @@
<!-- eslint-disable vue/no-v-html -->
<template>
<a
v-if="linkToFull && imageUrl"
v-if="linkToFullImage && imageUrl"
:href="imageUrl"
target="_blank"
class="h-full w-full object-contain"
>
<div class="h-full w-full object-contain" v-html="generateIdenticon()" />
<div class="h-full w-full object-contain" v-html="generateIcon()" />
</a>
<div
v-else
class="h-full w-full object-contain"
v-html="generateIdenticon()"
/>
<div v-else class="h-full w-full object-contain" v-html="generateIcon()" />
</template>
<script lang="ts">
import { toSvg } from "jdenticon";
@@ -35,9 +31,9 @@ export default class ProjectIcon extends Vue {
@Prop entityId = "";
@Prop iconSize = 0;
@Prop imageUrl = "";
@Prop linkToFull = false;
@Prop linkToFullImage = false;
generateIdenticon() {
generateIcon() {
if (this.imageUrl) {
return `<img src="${this.imageUrl}" class="w-full h-full object-contain" />`;
} else {

View File

@@ -102,7 +102,12 @@
<script lang="ts">
import { Component, Vue } from "vue-facing-decorator";
import { DEFAULT_PUSH_SERVER, NotificationIface } from "../constants/app";
import {
DEFAULT_PUSH_SERVER,
NotificationIface,
USE_DEXIE_DB,
} from "../constants/app";
import * as databaseUtil from "../db/databaseUtil";
import {
logConsoleAndDb,
retrieveSettingsForActiveAccount,
@@ -169,7 +174,10 @@ export default class PushNotificationPermission extends Vue {
this.isVisible = true;
this.pushType = pushType;
try {
const settings = await retrieveSettingsForActiveAccount();
let settings = await databaseUtil.retrieveSettingsForActiveAccount();
if (USE_DEXIE_DB) {
settings = await retrieveSettingsForActiveAccount();
}
let pushUrl = DEFAULT_PUSH_SERVER;
if (settings?.webPushServer) {
pushUrl = settings.webPushServer;

View File

@@ -15,7 +15,8 @@
<script lang="ts">
import { Component, Vue, Prop } from "vue-facing-decorator";
import { AppString, NotificationIface } from "../constants/app";
import { AppString, NotificationIface, USE_DEXIE_DB } from "../constants/app";
import * as databaseUtil from "../db/databaseUtil";
import { retrieveSettingsForActiveAccount } from "../db/index";
@Component
@@ -28,20 +29,22 @@ export default class TopMessage extends Vue {
async mounted() {
try {
const settings = await retrieveSettingsForActiveAccount();
let settings = await databaseUtil.retrieveSettingsForActiveAccount();
if (USE_DEXIE_DB) {
settings = await retrieveSettingsForActiveAccount();
}
if (
settings.warnIfTestServer &&
settings.apiServer !== AppString.PROD_ENDORSER_API_SERVER
) {
const didPrefix = settings.activeDid?.slice(11, 15);
this.message = "You're linked to a non-prod server, user " + didPrefix;
this.message = "You're not using prod, user " + didPrefix;
} else if (
settings.warnIfProdServer &&
settings.apiServer === AppString.PROD_ENDORSER_API_SERVER
) {
const didPrefix = settings.activeDid?.slice(11, 15);
this.message =
"You're linked to the production server, user " + didPrefix;
this.message = "You are using prod, user " + didPrefix;
}
} catch (err: unknown) {
this.$notify(

View File

@@ -37,9 +37,11 @@
<script lang="ts">
import { Vue, Component, Prop } from "vue-facing-decorator";
import { NotificationIface } from "../constants/app";
import { NotificationIface, USE_DEXIE_DB } from "../constants/app";
import { db, retrieveSettingsForActiveAccount } from "../db/index";
import * as databaseUtil from "../db/databaseUtil";
import { MASTER_SETTINGS_KEY } from "../db/tables/settings";
import { PlatformServiceFactory } from "@/services/PlatformServiceFactory";
@Component
export default class UserNameDialog extends Vue {
@@ -61,15 +63,25 @@ export default class UserNameDialog extends Vue {
*/
async open(aCallback?: (name?: string) => void) {
this.callback = aCallback || this.callback;
const settings = await retrieveSettingsForActiveAccount();
let settings = await databaseUtil.retrieveSettingsForActiveAccount();
if (USE_DEXIE_DB) {
settings = await retrieveSettingsForActiveAccount();
}
this.givenName = settings.firstName || "";
this.visible = true;
}
async onClickSaveChanges() {
await db.settings.update(MASTER_SETTINGS_KEY, {
firstName: this.givenName,
});
const platformService = PlatformServiceFactory.getInstance();
await platformService.dbExec(
"UPDATE settings SET firstName = ? WHERE id = ?",
[this.givenName, MASTER_SETTINGS_KEY],
);
if (USE_DEXIE_DB) {
await db.settings.update(MASTER_SETTINGS_KEY, {
firstName: this.givenName,
});
}
this.visible = false;
this.callback(this.givenName);
}

View File

@@ -3,6 +3,8 @@ import * as THREE from "three";
import { GLTFLoader } from "three/addons/loaders/GLTFLoader";
import * as SkeletonUtils from "three/addons/utils/SkeletonUtils";
import * as TWEEN from "@tweenjs/tween.js";
import { USE_DEXIE_DB } from "../../../../constants/app";
import * as databaseUtil from "../../../../db/databaseUtil";
import { retrieveSettingsForActiveAccount } from "../../../../db";
import { getHeaders } from "../../../../libs/endorserServer";
import { logger } from "../../../../utils/logger";
@@ -14,7 +16,10 @@ export async function loadLandmarks(vue, world, scene, loop) {
vue.setWorldProperty("animationDurationSeconds", ANIMATION_DURATION_SECS);
try {
const settings = await retrieveSettingsForActiveAccount();
let settings = await databaseUtil.retrieveSettingsForActiveAccount();
if (USE_DEXIE_DB) {
settings = await retrieveSettingsForActiveAccount();
}
const activeDid = settings.activeDid || "";
const apiServer = settings.apiServer;
const headers = await getHeaders(activeDid);

View File

@@ -7,6 +7,7 @@ export enum AppString {
// This is used in titles and verbiage inside the app.
// There is also an app name without spaces, for packaging in the package.json file used in the manifest.
APP_NAME = "Time Safari",
APP_NAME_NO_SPACES = "TimeSafari",
PROD_ENDORSER_API_SERVER = "https://api.endorser.ch",
TEST_ENDORSER_API_SERVER = "https://test-api.endorser.ch",
@@ -32,24 +33,26 @@ export const APP_SERVER =
export const DEFAULT_ENDORSER_API_SERVER =
import.meta.env.VITE_DEFAULT_ENDORSER_API_SERVER ||
AppString.TEST_ENDORSER_API_SERVER;
AppString.PROD_ENDORSER_API_SERVER;
export const DEFAULT_IMAGE_API_SERVER =
import.meta.env.VITE_DEFAULT_IMAGE_API_SERVER ||
AppString.TEST_IMAGE_API_SERVER;
AppString.PROD_IMAGE_API_SERVER;
export const DEFAULT_PARTNER_API_SERVER =
import.meta.env.VITE_DEFAULT_PARTNER_API_SERVER ||
AppString.TEST_PARTNER_API_SERVER;
AppString.PROD_PARTNER_API_SERVER;
export const DEFAULT_PUSH_SERVER =
window.location.protocol + "//" + window.location.host;
import.meta.env.VITE_DEFAULT_PUSH_SERVER || AppString.PROD_PUSH_SERVER;
export const IMAGE_TYPE_PROFILE = "profile";
export const PASSKEYS_ENABLED =
!!import.meta.env.VITE_PASSKEYS_ENABLED || false;
export const USE_DEXIE_DB = false;
/**
* The possible values for "group" and "type" are in App.vue.
* Some of this comes from the notiwind package, some is custom.

137
src/db-sql/migration.ts Normal file
View File

@@ -0,0 +1,137 @@
import {
registerMigration,
runMigrations as runMigrationsService,
} from "../services/migrationService";
import { DEFAULT_ENDORSER_API_SERVER } from "@/constants/app";
import { arrayBufferToBase64 } from "@/libs/crypto";
// Generate a random secret for the secret table
// It's not really secure to maintain the secret next to the user's data.
// However, until we have better hooks into a real wallet or reliable secure
// storage, we'll do this for user convenience. As they sign more records
// and integrate with more people, they'll value it more and want to be more
// secure, so we'll prompt them to take steps to back it up, properly encrypt,
// etc. At the beginning, we'll prompt for a password, then we'll prompt for a
// PWA so it's not in a browser... and then we hope to be integrated with a
// real wallet or something else more secure.
// One might ask: why encrypt at all? We figure a basic encryption is better
// than none. Plus, we expect to support their own password or keystore or
// external wallet as better signing options in the future, so it's gonna be
// important to have the structure where each account access might require
// user action.
// (Once upon a time we stored the secret in localStorage, but it frequently
// got erased, even though the IndexedDB still had the identity data. This
// ended up throwing lots of errors to the user... and they'd end up in a state
// where they couldn't take action because they couldn't unlock that identity.)
const randomBytes = crypto.getRandomValues(new Uint8Array(32));
const secretBase64 = arrayBufferToBase64(randomBytes);
// Each migration can include multiple SQL statements (with semicolons)
const MIGRATIONS = [
{
name: "001_initial",
// see ../db/tables files for explanations of the fields
sql: `
CREATE TABLE IF NOT EXISTS accounts (
id INTEGER PRIMARY KEY AUTOINCREMENT,
dateCreated TEXT NOT NULL,
derivationPath TEXT,
did TEXT NOT NULL,
identityEncrBase64 TEXT, -- encrypted & base64-encoded
mnemonicEncrBase64 TEXT, -- encrypted & base64-encoded
passkeyCredIdHex TEXT,
publicKeyHex TEXT NOT NULL
);
CREATE INDEX IF NOT EXISTS idx_accounts_did ON accounts(did);
CREATE TABLE IF NOT EXISTS secret (
id INTEGER PRIMARY KEY AUTOINCREMENT,
secretBase64 TEXT NOT NULL
);
INSERT INTO secret (id, secretBase64) VALUES (1, '${secretBase64}');
CREATE TABLE IF NOT EXISTS settings (
id INTEGER PRIMARY KEY AUTOINCREMENT,
accountDid TEXT,
activeDid TEXT,
apiServer TEXT,
filterFeedByNearby BOOLEAN,
filterFeedByVisible BOOLEAN,
finishedOnboarding BOOLEAN,
firstName TEXT,
hideRegisterPromptOnNewContact BOOLEAN,
isRegistered BOOLEAN,
lastName TEXT,
lastAckedOfferToUserJwtId TEXT,
lastAckedOfferToUserProjectsJwtId TEXT,
lastNotifiedClaimId TEXT,
lastViewedClaimId TEXT,
notifyingNewActivityTime TEXT,
notifyingReminderMessage TEXT,
notifyingReminderTime TEXT,
partnerApiServer TEXT,
passkeyExpirationMinutes INTEGER,
profileImageUrl TEXT,
searchBoxes TEXT, -- Stored as JSON string
showContactGivesInline BOOLEAN,
showGeneralAdvanced BOOLEAN,
showShortcutBvc BOOLEAN,
vapid TEXT,
warnIfProdServer BOOLEAN,
warnIfTestServer BOOLEAN,
webPushServer TEXT
);
CREATE INDEX IF NOT EXISTS idx_settings_accountDid ON settings(accountDid);
INSERT INTO settings (id, apiServer) VALUES (1, '${DEFAULT_ENDORSER_API_SERVER}');
CREATE TABLE IF NOT EXISTS contacts (
id INTEGER PRIMARY KEY AUTOINCREMENT,
did TEXT NOT NULL,
name TEXT,
contactMethods TEXT, -- Stored as JSON string
nextPubKeyHashB64 TEXT,
notes TEXT,
profileImageUrl TEXT,
publicKeyBase64 TEXT,
seesMe BOOLEAN,
registered BOOLEAN
);
CREATE INDEX IF NOT EXISTS idx_contacts_did ON contacts(did);
CREATE INDEX IF NOT EXISTS idx_contacts_name ON contacts(name);
CREATE TABLE IF NOT EXISTS logs (
date TEXT NOT NULL,
message TEXT NOT NULL
);
CREATE TABLE IF NOT EXISTS temp (
id TEXT PRIMARY KEY,
blobB64 TEXT
);
`,
},
];
/**
* @param sqlExec - A function that executes a SQL statement and returns the result
* @param sqlQuery - A function that runs a SQL query and returns the result
* @param extractMigrationNames - A function that extracts the migration names (a Set of strings) from the result of "select name from migrations"
*/
export async function runMigrations<T>(
sqlExec: (sql: string, params?: unknown[]) => Promise<unknown>,
sqlQuery: (sql: string, params?: unknown[]) => Promise<T>,
extractMigrationNames: (result: T) => Set<string>,
): Promise<void> {
for (const migration of MIGRATIONS) {
registerMigration(migration);
}
await runMigrationsService(sqlExec, sqlQuery, extractMigrationNames);
}
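For orientation, a sketch of how a caller might wire its own database handles into this generic entry point; the db object and its exec/query methods are hypothetical stand-ins, not the app's actual PlatformService API:
import { runMigrations } from "@/db-sql/migration";
async function migrate(db: {
  exec: (sql: string, params?: unknown[]) => Promise<unknown>;
  query: (sql: string, params?: unknown[]) => Promise<{ values: unknown[][] }>;
}) {
  await runMigrations(
    (sql, params) => db.exec(sql, params), // runs DDL/DML statements
    (sql, params) => db.query(sql, params), // runs SELECTs, e.g. against the migrations table
    (result) => new Set(result.values.map((row) => String(row[0]))), // pulls migration names from the query result
  );
}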

462
src/db/databaseUtil.ts Normal file
View File

@@ -0,0 +1,462 @@
/**
* This file is the SQL replacement of the index.ts file in the db directory.
* That file will eventually be deleted.
*/
import { PlatformServiceFactory } from "@/services/PlatformServiceFactory";
import { MASTER_SETTINGS_KEY, Settings } from "./tables/settings";
import { logger } from "@/utils/logger";
import { DEFAULT_ENDORSER_API_SERVER } from "@/constants/app";
import { QueryExecResult } from "@/interfaces/database";
export async function updateDefaultSettings(
settingsChanges: Settings,
): Promise<boolean> {
delete settingsChanges.accountDid; // just in case
// ensure there is no "id" that would override the key
delete settingsChanges.id;
try {
const platformService = PlatformServiceFactory.getInstance();
const { sql, params } = generateUpdateStatement(
settingsChanges,
"settings",
"id = ?",
[MASTER_SETTINGS_KEY],
);
const result = await platformService.dbExec(sql, params);
return result.changes === 1;
} catch (error) {
logger.error("Error updating default settings:", error);
if (error instanceof Error) {
throw error; // Re-throw if it's already an Error with a message
} else {
throw new Error(
`Failed to update settings. We recommend you try again or restart the app.`,
);
}
}
}
export async function insertDidSpecificSettings(
did: string,
settings: Partial<Settings> = {},
): Promise<boolean> {
const platform = PlatformServiceFactory.getInstance();
const { sql, params } = generateInsertStatement(
{ ...settings, accountDid: did }, // make sure accountDid is set to the given value
"settings",
);
const result = await platform.dbExec(sql, params);
return result.changes === 1;
}
export async function updateDidSpecificSettings(
accountDid: string,
settingsChanges: Settings,
): Promise<boolean> {
settingsChanges.accountDid = accountDid;
delete settingsChanges.id; // key off account, not ID
const platform = PlatformServiceFactory.getInstance();
// First try to update existing record
const { sql: updateSql, params: updateParams } = generateUpdateStatement(
settingsChanges,
"settings",
"accountDid = ?",
[accountDid],
);
const updateResult = await platform.dbExec(updateSql, updateParams);
return updateResult.changes === 1;
}
const DEFAULT_SETTINGS: Settings = {
id: MASTER_SETTINGS_KEY,
activeDid: undefined,
apiServer: DEFAULT_ENDORSER_API_SERVER,
};
// retrieves default settings
export async function retrieveSettingsForDefaultAccount(): Promise<Settings> {
const platform = PlatformServiceFactory.getInstance();
const sql = "SELECT * FROM settings WHERE id = ?";
const result = await platform.dbQuery(sql, [MASTER_SETTINGS_KEY]);
if (!result) {
return DEFAULT_SETTINGS;
} else {
const settings = mapColumnsToValues(
result.columns,
result.values,
)[0] as Settings;
if (settings.searchBoxes) {
// @ts-expect-error - the searchBoxes field is a string in the DB
settings.searchBoxes = JSON.parse(settings.searchBoxes);
}
return settings;
}
}
/**
* Retrieves settings for the active account, merging with default settings
*
* @returns Promise<Settings> Combined settings with account-specific overrides
* @throws Will log specific errors for debugging but returns default settings on failure
*/
export async function retrieveSettingsForActiveAccount(): Promise<Settings> {
try {
// Get default settings first
const defaultSettings = await retrieveSettingsForDefaultAccount();
// If no active DID, return defaults
if (!defaultSettings.activeDid) {
return defaultSettings;
}
// Get account-specific settings
try {
const platform = PlatformServiceFactory.getInstance();
const result = await platform.dbQuery(
"SELECT * FROM settings WHERE accountDid = ?",
[defaultSettings.activeDid],
);
if (!result?.values?.length) {
// we created DID-specific settings when generated or imported, so this shouldn't happen
return defaultSettings;
}
// Map and filter settings
const overrideSettings = mapColumnsToValues(
result.columns,
result.values,
)[0] as Settings;
const overrideSettingsFiltered = Object.fromEntries(
Object.entries(overrideSettings).filter(([_, v]) => v !== null),
);
// Merge settings
const settings = { ...defaultSettings, ...overrideSettingsFiltered };
// Handle searchBoxes parsing
if (settings.searchBoxes) {
settings.searchBoxes = parseJsonField(settings.searchBoxes, []);
}
return settings;
} catch (error) {
logConsoleAndDb(
`[databaseUtil] Failed to retrieve account settings for ${defaultSettings.activeDid}: ${error}`,
true,
);
// Return defaults on error
return defaultSettings;
}
} catch (error) {
logConsoleAndDb(
`[databaseUtil] Failed to retrieve default settings: ${error}`,
true,
);
// Return minimal default settings on complete failure
return {
id: MASTER_SETTINGS_KEY,
activeDid: undefined,
apiServer: DEFAULT_ENDORSER_API_SERVER,
};
}
}
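A worked example of the merge, with made-up rows (a sketch, not actual data):
// defaults (id = MASTER_SETTINGS_KEY):      { activeDid: "did:ethr:0xabc", apiServer: "https://api.endorser.ch" }
// override (accountDid = "did:ethr:0xabc"): { firstName: "Ada", apiServer: null }
// null overrides are dropped, so the merged result is:
// { activeDid: "did:ethr:0xabc", apiServer: "https://api.endorser.ch", firstName: "Ada" }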
let lastCleanupDate: string | null = null;
export let memoryLogs: string[] = [];
/**
* Logs a message to the database with proper handling of concurrent writes
* @param message - The message to log
* @author Matthew Raymer
*/
export async function logToDb(message: string): Promise<void> {
const platform = PlatformServiceFactory.getInstance();
const todayKey = new Date().toDateString();
const nowKey = new Date().toISOString();
try {
memoryLogs.push(`${new Date().toISOString()} ${message}`);
// Insert a new log row, keyed by the current timestamp
await platform.dbExec("INSERT INTO logs (date, message) VALUES (?, ?)", [
nowKey,
message,
]);
// Clean up old logs (keep only last 7 days) - do this less frequently
// Only clean up if the date is different from the last cleanup
if (!lastCleanupDate || lastCleanupDate !== todayKey) {
const sevenDaysAgo = new Date(
new Date().getTime() - 7 * 24 * 60 * 60 * 1000,
);
memoryLogs = memoryLogs.filter(
(log) => log.split(" ")[0] > sevenDaysAgo.toDateString(),
);
await platform.dbExec("DELETE FROM logs WHERE date < ?", [
sevenDaysAgo.toDateString(),
]);
lastCleanupDate = todayKey;
}
} catch (error) {
// Log to console as fallback
// eslint-disable-next-line no-console
console.error(
"Error logging to database:",
error,
" ... for original message:",
message,
);
}
}
// similar method is in the sw_scripts/additional-scripts.js file
export async function logConsoleAndDb(
message: string,
isError = false,
): Promise<void> {
if (isError) {
logger.error(`${new Date().toISOString()} ${message}`);
} else {
logger.log(`${new Date().toISOString()} ${message}`);
}
await logToDb(message);
}
/**
* Generates SQL INSERT statement and parameters from a model object
*
* This helper function creates a parameterized SQL INSERT statement
* from a JavaScript object. It filters out undefined values and
* creates the appropriate SQL syntax with placeholders.
*
* The function is used internally by the migration functions to
* safely insert data into the SQLite database.
*
* @function generateInsertStatement
* @param {Record<string, unknown>} model - The model object containing fields to insert
* @param {string} tableName - The name of the table to insert into
* @returns {Object} Object containing the SQL statement and parameters array
* @returns {string} returns.sql - The SQL INSERT statement
* @returns {unknown[]} returns.params - Array of parameter values
* @example
* ```typescript
* const contact = { did: 'did:example:123', name: 'John Doe' };
* const { sql, params } = generateInsertStatement(contact, 'contacts');
* // sql: "INSERT INTO contacts (did, name) VALUES (?, ?)"
* // params: ['did:example:123', 'John Doe']
* ```
*/
export function generateInsertStatement(
model: Record<string, unknown>,
tableName: string,
): { sql: string; params: unknown[] } {
const columns = Object.keys(model).filter((key) => model[key] !== undefined);
const values = Object.values(model).filter((value) => value !== undefined);
const placeholders = values.map(() => "?").join(", ");
const insertSql = `INSERT INTO ${tableName} (${columns.join(", ")}) VALUES (${placeholders})`;
return {
sql: insertSql,
params: values,
};
}
/**
* Generates SQL UPDATE statement and parameters from a model object
*
* This helper function creates a parameterized SQL UPDATE statement
* from a JavaScript object. It maps undefined values to NULL and
* creates the appropriate SQL syntax with placeholders.
*
* The function is used internally by the migration functions to
* safely update data in the SQLite database.
*
* @function generateUpdateStatement
* @param {Record<string, unknown>} model - The model object containing fields to update
* @param {string} tableName - The name of the table to update
* @param {string} whereClause - The WHERE clause for the update (e.g. "id = ?")
* @param {unknown[]} [whereParams=[]] - Parameters for the WHERE clause
* @returns {Object} Object containing the SQL statement and parameters array
* @returns {string} returns.sql - The SQL UPDATE statement
* @returns {unknown[]} returns.params - Array of parameter values
* @example
* ```typescript
* const contact = { name: 'Jane Doe' };
* const { sql, params } = generateUpdateStatement(contact, 'contacts', 'did = ?', ['did:example:123']);
* // sql: "UPDATE contacts SET name = ? WHERE did = ?"
* // params: ['Jane Doe', 'did:example:123']
* ```
*/
export function generateUpdateStatement(
model: Record<string, unknown>,
tableName: string,
whereClause: string,
whereParams: unknown[] = [],
): { sql: string; params: unknown[] } {
// Build the SET clause; undefined values are coerced to null so the column is cleared
const setClauses: string[] = [];
const params: unknown[] = [];
Object.entries(model).forEach(([key, value]) => {
setClauses.push(`${key} = ?`);
params.push(value ?? null);
});
if (setClauses.length === 0) {
throw new Error("No valid fields to update");
}
const sql = `UPDATE ${tableName} SET ${setClauses.join(", ")} WHERE ${whereClause}`;
return {
sql,
params: [...params, ...whereParams],
};
}
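The same pattern works for updates; a minimal sketch, again assuming the `dbExec` API shown above:
```typescript
// Minimal sketch (inside an async function).
const changes = { name: "Jane Doe" };
const { sql, params } = generateUpdateStatement(changes, "contacts", "did = ?", [
  "did:example:123",
]);
await PlatformServiceFactory.getInstance().dbExec(sql, params);
// Executes: UPDATE contacts SET name = ? WHERE did = ?  with ['Jane Doe', 'did:example:123']
```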
export function mapQueryResultToValues(
record: QueryExecResult | undefined,
): Array<Record<string, unknown>> {
if (!record) {
return [];
}
return mapColumnsToValues(record.columns, record.values) as Array<
Record<string, unknown>
>;
}
/**
* Maps an array of column names to an array of value arrays, creating objects where each column name
* is mapped to its corresponding value.
* @param columns Array of column names to use as object keys
* @param values Array of value arrays, where each inner array corresponds to one row of data
* @returns Array of objects where each object maps column names to their corresponding values
*/
export function mapColumnsToValues(
columns: string[],
values: unknown[][],
): Array<Record<string, unknown>> {
return values.map((row) => {
const obj: Record<string, unknown> = {};
columns.forEach((column, index) => {
obj[column] = row[index];
});
return obj;
});
}
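Together with `dbQuery`, these helpers turn the column/value arrays returned by SQLite into plain objects; a minimal sketch:
```typescript
// Minimal sketch (inside an async function); dbQuery returns a QueryExecResult as used below.
const platform = PlatformServiceFactory.getInstance();
const result = await platform.dbQuery("SELECT did, name FROM contacts");
const rows = mapQueryResultToValues(result);
// e.g. [{ did: "did:example:123", name: "John Doe" }, ...]
```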
/**
* Debug function to inspect raw settings data in the database
* This helps diagnose issues with data corruption or malformed JSON
* @param did Optional DID to inspect specific account settings
* @author Matthew Raymer
*/
export async function debugSettingsData(did?: string): Promise<void> {
try {
const platform = PlatformServiceFactory.getInstance();
// Get all settings records
const allSettings = await platform.dbQuery("SELECT * FROM settings");
logConsoleAndDb(
`[DEBUG] Total settings records: ${allSettings?.values?.length || 0}`,
false,
);
if (allSettings?.values?.length) {
allSettings.values.forEach((row, index) => {
const settings = mapColumnsToValues(allSettings.columns, [row])[0];
logConsoleAndDb(`[DEBUG] Settings record ${index + 1}:`, false);
logConsoleAndDb(`[DEBUG] - ID: ${settings.id}`, false);
logConsoleAndDb(`[DEBUG] - accountDid: ${settings.accountDid}`, false);
logConsoleAndDb(`[DEBUG] - activeDid: ${settings.activeDid}`, false);
if (settings.searchBoxes) {
logConsoleAndDb(
`[DEBUG] - searchBoxes type: ${typeof settings.searchBoxes}`,
false,
);
logConsoleAndDb(
`[DEBUG] - searchBoxes value: ${String(settings.searchBoxes)}`,
false,
);
// Try to parse it
try {
const parsed = JSON.parse(String(settings.searchBoxes));
logConsoleAndDb(
`[DEBUG] - searchBoxes parsed successfully: ${JSON.stringify(parsed)}`,
false,
);
} catch (parseError) {
logConsoleAndDb(
`[DEBUG] - searchBoxes parse error: ${parseError}`,
true,
);
}
}
logConsoleAndDb(
`[DEBUG] - Full record: ${JSON.stringify(settings, null, 2)}`,
false,
);
});
}
// If specific DID provided, also check accounts table
if (did) {
const account = await platform.dbQuery(
"SELECT * FROM accounts WHERE did = ?",
[did],
);
logConsoleAndDb(
`[DEBUG] Account for ${did}: ${JSON.stringify(account, null, 2)}`,
false,
);
}
} catch (error) {
logConsoleAndDb(`[DEBUG] Error inspecting settings data: ${error}`, true);
}
}
/**
* Platform-agnostic JSON parsing utility
* Handles different SQLite implementations:
* - Web SQLite (wa-sqlite/absurd-sql): Auto-parses JSON strings to objects
* - Capacitor SQLite: Returns raw strings that need manual parsing
*
* @param value The value to parse (could be string or already parsed object)
* @param defaultValue Default value if parsing fails
* @returns Parsed object or default value
* @author Matthew Raymer
*/
export function parseJsonField<T>(value: unknown, defaultValue: T): T {
try {
// If already an object (web SQLite auto-parsed), return as-is
if (typeof value === "object" && value !== null) {
return value as T;
}
// If it's a string (Capacitor SQLite or fallback), parse it
if (typeof value === "string") {
return JSON.parse(value) as T;
}
// If it's null/undefined, return default
if (value === null || value === undefined) {
return defaultValue;
}
return defaultValue;
} catch (error) {
logConsoleAndDb(
`[databaseUtil] Failed to parse JSON field: ${error}`,
true,
);
return defaultValue;
}
}
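A typical consumer is the settings handling above, where `searchBoxes` may arrive as a string or an already-parsed object depending on the platform; a minimal sketch:
```typescript
// Minimal sketch (inside an async function); MASTER_SETTINGS_KEY and the dbQuery
// API are the ones used earlier in this file.
const platform = PlatformServiceFactory.getInstance();
const result = await platform.dbQuery("SELECT * FROM settings WHERE id = ?", [
  MASTER_SETTINGS_KEY,
]);
const row = mapQueryResultToValues(result)[0];
const searchBoxes = parseJsonField(row?.searchBoxes, []);
```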

View File

@@ -1,3 +1,9 @@
/**
* This is the original IndexedDB version of the database.
* It will eventually be replaced fully by the SQL version in databaseUtil.ts.
* Turn this on or off with the USE_DEXIE_DB constant in constants/app.ts.
*/
import BaseDexie, { Table } from "dexie";
import { encrypted, Encryption } from "@pvermeer/dexie-encrypted-addon";
import * as R from "ramda";
@@ -26,8 +32,8 @@ type NonsensitiveTables = {
};
// Using 'unknown' instead of 'any' for stricter typing and to avoid TypeScript warnings
export type SecretDexie<T extends unknown = SecretTable> = BaseDexie & T;
export type SensitiveDexie<T extends unknown = SensitiveTables> = BaseDexie & T;
type SecretDexie<T extends unknown = SecretTable> = BaseDexie & T;
type SensitiveDexie<T extends unknown = SensitiveTables> = BaseDexie & T;
export type NonsensitiveDexie<T extends unknown = NonsensitiveTables> =
BaseDexie & T;
@@ -90,21 +96,18 @@ db.on("populate", async () => {
try {
await db.settings.add(DEFAULT_SETTINGS);
} catch (error) {
console.error(
"Error populating the database with default settings:",
error,
);
logger.error("Error populating the database with default settings:", error);
}
});
// Helper function to safely open the database with retries
async function safeOpenDatabase(retries = 1, delay = 500): Promise<void> {
// console.log("Starting safeOpenDatabase with retries:", retries);
// logger.log("Starting safeOpenDatabase with retries:", retries);
for (let i = 0; i < retries; i++) {
try {
// console.log(`Attempt ${i + 1}: Checking if database is open...`);
// logger.log(`Attempt ${i + 1}: Checking if database is open...`);
if (!db.isOpen()) {
// console.log(`Attempt ${i + 1}: Database is closed, attempting to open...`);
// logger.log(`Attempt ${i + 1}: Database is closed, attempting to open...`);
// Create a promise that rejects after 5 seconds
const timeoutPromise = new Promise((_, reject) => {
@@ -113,19 +116,19 @@ async function safeOpenDatabase(retries = 1, delay = 500): Promise<void> {
// Race between the open operation and the timeout
const openPromise = db.open();
// console.log(`Attempt ${i + 1}: Waiting for db.open() promise...`);
// logger.log(`Attempt ${i + 1}: Waiting for db.open() promise...`);
await Promise.race([openPromise, timeoutPromise]);
// If we get here, the open succeeded
// console.log(`Attempt ${i + 1}: Database opened successfully`);
// logger.log(`Attempt ${i + 1}: Database opened successfully`);
return;
}
// console.log(`Attempt ${i + 1}: Database was already open`);
// logger.log(`Attempt ${i + 1}: Database was already open`);
return;
} catch (error) {
console.error(`Attempt ${i + 1}: Database open failed:`, error);
logger.error(`Attempt ${i + 1}: Database open failed:`, error);
if (i < retries - 1) {
console.log(`Attempt ${i + 1}: Waiting ${delay}ms before retry...`);
logger.log(`Attempt ${i + 1}: Waiting ${delay}ms before retry...`);
await new Promise((resolve) => setTimeout(resolve, delay));
} else {
throw error;
@@ -142,16 +145,14 @@ export async function updateDefaultSettings(
delete settingsChanges.id;
try {
try {
// console.log("Database state before open:", db.isOpen() ? "open" : "closed");
// console.log("Database name:", db.name);
// console.log("Database version:", db.verno);
// logger.log("Database state before open:", db.isOpen() ? "open" : "closed");
// logger.log("Database name:", db.name);
// logger.log("Database version:", db.verno);
await safeOpenDatabase();
} catch (openError: unknown) {
console.error("Failed to open database:", openError);
const errorMessage =
openError instanceof Error ? openError.message : String(openError);
logger.error("Failed to open database:", openError, String(openError));
throw new Error(
`Database connection failed: ${errorMessage}. Please try again or restart the app.`,
`The database connection failed. We recommend you try again or restart the app.`,
);
}
const result = await db.settings.update(
@@ -160,11 +161,13 @@ export async function updateDefaultSettings(
);
return result;
} catch (error) {
console.error("Error updating default settings:", error);
logger.error("Error updating default settings:", error);
if (error instanceof Error) {
throw error; // Re-throw if it's already an Error with a message
} else {
throw new Error(`Failed to update settings: ${error}`);
throw new Error(
`Failed to update settings. We recommend you try again or restart the app.`,
);
}
}
}
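A minimal usage sketch, assuming `updateDefaultSettings` accepts a partial `Settings` object as its `settingsChanges` parameter suggests (the apiServer value is illustrative):
```typescript
// Minimal sketch (inside an async function).
const updatedCount = await updateDefaultSettings({
  apiServer: "https://api.endorser.ch",
});
logger.log("Master settings rows updated:", updatedCount);
```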

View File

@@ -1 +1 @@
Check the contact & settings export to see whether you want your new table to be included in it.
# Check the contact & settings export to see whether you want your new table to be included in it

View File

@@ -45,6 +45,12 @@ export type Account = {
publicKeyHex: string;
};
// When finished with USE_DEXIE_DB, move these fields to Account and move identity and mnemonic here.
export type AccountEncrypted = Account & {
identityEncrBase64: string;
mnemonicEncrBase64: string;
};
/**
* Schema for the accounts table in the database.
* Fields starting with a $ character are encrypted.

View File

@@ -19,6 +19,10 @@ export interface Contact {
registered?: boolean; // cached value of the server setting
}
export type ContactWithJsonStrings = Contact & {
contactMethods?: string;
};
export const ContactSchema = {
contacts: "&did, name", // no need to key by other things
};

View File

@@ -64,6 +64,11 @@ export type Settings = {
webPushServer?: string; // Web Push server URL
};
// type of settings where the searchBoxes are JSON strings instead of objects
export type SettingsWithJsonStrings = Settings & {
searchBoxes: string;
};
export function checkIsAnyFeedFilterOn(settings: Settings): boolean {
return !!(settings?.filterFeedByNearby || settings?.filterFeedByVisible);
}
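When a settings row comes back from SQLite, `searchBoxes` arrives as a JSON string; a minimal sketch of converting it back before using helpers such as `checkIsAnyFeedFilterOn` (assumes `parseJsonField` from `databaseUtil` is imported; the helper name is illustrative):
```typescript
// Minimal sketch; hydrateSettingsRow is an illustrative helper name.
function hydrateSettingsRow(raw: SettingsWithJsonStrings): Settings {
  return { ...raw, searchBoxes: parseJsonField(raw.searchBoxes, []) };
}
// const anyFilterOn = checkIsAnyFeedFilterOn(hydrateSettingsRow(rawRow));
```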

View File

@@ -25,6 +25,25 @@ function createWindow(): void {
logger.log("Checking preload path:", preloadPath);
logger.log("Preload exists:", fs.existsSync(preloadPath));
// Log environment and paths
logger.log("process.cwd():", process.cwd());
logger.log("__dirname:", __dirname);
logger.log("app.getAppPath():", app.getAppPath());
logger.log("app.isPackaged:", app.isPackaged);
// List files in __dirname and __dirname/www
try {
logger.log("Files in __dirname:", fs.readdirSync(__dirname));
const wwwDir = path.join(__dirname, "www");
if (fs.existsSync(wwwDir)) {
logger.log("Files in www:", fs.readdirSync(wwwDir));
} else {
logger.log("www directory does not exist in __dirname");
}
} catch (e) {
logger.error("Error reading directories:", e);
}
// Create the browser window.
const mainWindow = new BrowserWindow({
width: 1200,
@@ -88,7 +107,16 @@ function createWindow(): void {
logger.log("process.cwd():", process.cwd());
}
const indexPath = path.join(__dirname, "www", "index.html");
let indexPath = path.resolve(__dirname, "dist-electron", "www", "index.html");
if (!fs.existsSync(indexPath)) {
// Fallback for dev mode
indexPath = path.resolve(
process.cwd(),
"dist-electron",
"www",
"index.html",
);
}
if (isDev) {
logger.log("Loading index from:", indexPath);

View File

@@ -2,24 +2,33 @@ const { contextBridge, ipcRenderer } = require("electron");
const logger = {
log: (message, ...args) => {
// Log in development only, prefixed with [Preload] context
if (process.env.NODE_ENV !== "production") {
/* eslint-disable no-console */
console.log(message, ...args);
console.log(`[Preload] ${message}`, ...args);
/* eslint-enable no-console */
}
},
warn: (message, ...args) => {
if (process.env.NODE_ENV !== "production") {
/* eslint-disable no-console */
console.warn(message, ...args);
/* eslint-enable no-console */
}
// Always log warnings
/* eslint-disable no-console */
console.warn(`[Preload] ${message}`, ...args);
/* eslint-enable no-console */
},
error: (message, ...args) => {
// Always log errors
/* eslint-disable no-console */
console.error(message, ...args); // Errors should always be logged
console.error(`[Preload] ${message}`, ...args);
/* eslint-enable no-console */
},
info: (message, ...args) => {
// Log info in development only, prefixed with [Preload] context
if (process.env.NODE_ENV !== "production") {
/* eslint-disable no-console */
console.info(`[Preload] ${message}`, ...args);
/* eslint-enable no-console */
}
},
};
// Use a more direct path resolution approach
@@ -41,7 +50,10 @@ const getPath = (pathType) => {
}
};
logger.log("Preload script starting...");
logger.info("Preload script starting...");
// Force electron platform in the renderer process
window.process = { env: { VITE_PLATFORM: "electron" } };
try {
contextBridge.exposeInMainWorld("electronAPI", {
@@ -65,6 +77,7 @@ try {
env: {
isElectron: true,
isDev: process.env.NODE_ENV === "development",
platform: "electron", // Explicitly set platform
},
// Path utilities
getBasePath: () => {
@@ -72,7 +85,7 @@ try {
},
});
logger.log("Preload script completed successfully");
logger.info("Preload script completed successfully");
} catch (error) {
logger.error("Error in preload script:", error);
}

src/interfaces/absurd-sql.d.ts (vendored, new file, 59 lines)
View File

@@ -0,0 +1,59 @@
import type { QueryExecResult, SqlValue } from "./database";
declare module "@jlongster/sql.js" {
interface SQL {
Database: new (path: string, options?: { filename: boolean }) => AbsurdSqlDatabase;
FS: {
mkdir: (path: string) => void;
mount: (fs: any, options: any, path: string) => void;
open: (path: string, flags: string) => any;
close: (stream: any) => void;
};
register_for_idb: (fs: any) => void;
}
interface AbsurdSqlDatabase {
exec: (sql: string, params?: unknown[]) => Promise<QueryExecResult[]>;
run: (
sql: string,
params?: unknown[],
) => Promise<{ changes: number; lastId?: number }>;
}
const initSqlJs: (options?: {
locateFile?: (file: string) => string;
}) => Promise<SQL>;
export default initSqlJs;
}
declare module "absurd-sql" {
import type { SQL } from "@jlongster/sql.js";
export class SQLiteFS {
constructor(fs: any, backend: any);
}
}
declare module "absurd-sql/dist/indexeddb-backend" {
export default class IndexedDBBackend {
constructor();
}
}
declare module "absurd-sql/dist/indexeddb-main-thread" {
export interface SQLiteOptions {
filename?: string;
autoLoad?: boolean;
debug?: boolean;
}
export interface SQLiteDatabase {
exec: (sql: string, params?: unknown[]) => Promise<QueryExecResult[]>;
close: () => Promise<void>;
}
export function initSqlJs(options?: any): Promise<any>;
export function createDatabase(options?: SQLiteOptions): Promise<SQLiteDatabase>;
export function openDatabase(options?: SQLiteOptions): Promise<SQLiteDatabase>;
}
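These declarations mirror absurd-sql's documented setup; a minimal sketch of how the pieces fit together in a worker (the database path is illustrative):
```typescript
// Minimal sketch following absurd-sql's documented initialization.
import initSqlJs from "@jlongster/sql.js";
import { SQLiteFS } from "absurd-sql";
import IndexedDBBackend from "absurd-sql/dist/indexeddb-backend";

async function openDatabase() {
  const SQL = await initSqlJs({ locateFile: (file) => file });
  const sqlFS = new SQLiteFS(SQL.FS, new IndexedDBBackend());
  SQL.register_for_idb(sqlFS);
  SQL.FS.mkdir("/sql");
  SQL.FS.mount(sqlFS, {}, "/sql");
  const db = new SQL.Database("/sql/app.sqlite", { filename: true });
  // absurd-sql recommends journal_mode=MEMORY for performance
  await db.exec("PRAGMA journal_mode=MEMORY;");
  return db;
}
```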

View File

@@ -1,6 +1,4 @@
import { AxiosResponse } from "axios";
import { GiverReceiverInputInfo } from "../libs/util";
import { ErrorResult, ResultWithType } from "./common";
export interface GiverOutputInfo {
action: string;
@@ -47,12 +45,3 @@ export interface ProviderInfo {
*/
linkConfirmed: boolean;
}
// Type for createAndSubmitClaim result
export type CreateAndSubmitClaimResult = SuccessResult | ErrorResult;
// Update SuccessResult to use ClaimResult
export interface SuccessResult extends ResultWithType {
type: "success";
response: AxiosResponse<ClaimResult>;
}

View File

@@ -1,15 +1,24 @@
import { GenericVerifiableCredential } from "./common";
/**
* Types of Claims
*
* Note that these are for the claims that get signed.
* Records that are the latest edited entities are in the records.ts file.
*
*/
export interface AgreeVerifiableCredential {
"@context": string;
import { ClaimObject } from "./common";
export interface AgreeActionClaim extends ClaimObject {
"@context": "https://schema.org";
"@type": string;
object: Record<string, unknown>;
}
// Note that previous VCs may have additional fields.
// https://endorser.ch/doc/html/transactions.html#id4
export interface GiveVerifiableCredential extends GenericVerifiableCredential {
"@context"?: string;
export interface GiveActionClaim extends ClaimObject {
// context is optional because it might be embedded in another claim, eg. an AgreeAction
"@context"?: "https://schema.org";
"@type": "GiveAction";
agent?: { identifier: string };
description?: string;
@@ -17,16 +26,25 @@ export interface GiveVerifiableCredential extends GenericVerifiableCredential {
identifier?: string;
image?: string;
object?: { amountOfThisGood: number; unitCode: string };
provider?: GenericVerifiableCredential;
provider?: ClaimObject;
recipient?: { identifier: string };
}
export interface JoinActionClaim extends ClaimObject {
agent?: { identifier: string };
event?: { organizer?: { name: string }; name?: string; startTime?: string };
}
// Note that previous VCs may have additional fields.
// https://endorser.ch/doc/html/transactions.html#id8
export interface OfferVerifiableCredential extends GenericVerifiableCredential {
"@context"?: string;
export interface OfferClaim extends ClaimObject {
"@context": "https://schema.org";
"@type": "Offer";
agent?: { identifier: string };
description?: string;
fulfills?: { "@type": string; identifier?: string; lastClaimId?: string }[];
identifier?: string;
image?: string;
includesObject?: { amountOfThisGood: number; unitCode: string };
itemOffered?: {
description?: string;
@@ -37,14 +55,18 @@ export interface OfferVerifiableCredential extends GenericVerifiableCredential {
name?: string;
};
};
offeredBy?: { identifier: string };
offeredBy?: {
type?: "Person";
identifier: string;
};
provider?: ClaimObject;
recipient?: { identifier: string };
validThrough?: string;
}
// Note that previous VCs may have additional fields.
// https://endorser.ch/doc/html/transactions.html#id7
export interface PlanVerifiableCredential extends GenericVerifiableCredential {
export interface PlanActionClaim extends ClaimObject {
"@context": "https://schema.org";
"@type": "PlanAction";
name: string;
@@ -58,11 +80,18 @@ export interface PlanVerifiableCredential extends GenericVerifiableCredential {
}
// AKA Registration & RegisterAction
export interface RegisterVerifiableCredential {
"@context": string;
export interface RegisterActionClaim extends ClaimObject {
"@context": "https://schema.org";
"@type": "RegisterAction";
agent: { identifier: string };
identifier?: string;
object: string;
object?: string;
participant?: { identifier: string };
}
export interface TenureClaim extends ClaimObject {
"@context": "https://endorser.ch";
"@type": "Tenure";
party?: { identifier: string };
spatialUnit?: { geo?: { polygon?: string } };
}

View File

@@ -15,10 +15,6 @@ export interface GenericCredWrapper<T extends GenericVerifiableCredential> {
publicUrls?: Record<string, string>;
}
export interface ResultWithType {
type: string;
}
export interface ErrorResponse {
error?: {
message?: string;
@@ -30,7 +26,76 @@ export interface InternalError {
userMessage?: string;
}
export interface ErrorResult extends ResultWithType {
type: "error";
error: InternalError;
export interface KeyMeta {
did: string;
publicKeyHex: string;
derivationPath?: string;
passkeyCredIdHex?: string; // The Webauthn credential ID in hex, if this is from a passkey
}
export interface KeyMetaMaybeWithPrivate extends KeyMeta {
mnemonic?: string; // 12 or 24 words encoding the seed
identity?: string; // Stringified IIdentifier object from Veramo
}
export interface KeyMetaWithPrivate extends KeyMeta {
mnemonic: string; // 12 or 24 words encoding the seed
identity: string; // Stringified IIdentifier object from Veramo
}
export interface QuantitativeValue extends GenericVerifiableCredential {
"@type": "QuantitativeValue";
"@context"?: string;
amountOfThisGood: number;
unitCode: string;
}
export interface AxiosErrorResponse {
message?: string;
response?: {
data?: {
error?: {
message?: string;
};
[key: string]: unknown;
};
status?: number;
config?: unknown;
};
config?: unknown;
[key: string]: unknown;
}
export interface UserInfo {
did: string;
name: string;
publicEncKey: string;
registered: boolean;
profileImageUrl?: string;
nextPublicEncKeyHash?: string;
}
export interface CreateAndSubmitClaimResult {
success: boolean;
error?: string;
handleId?: string;
}
export interface Agent {
identifier?: string;
did?: string;
}
export interface ClaimObject {
"@type": string;
"@context"?: string;
[key: string]: unknown;
}
export interface VerifiableCredentialClaim {
"@context"?: string;
"@type": string;
type: string[];
credentialSubject: ClaimObject;
[key: string]: unknown;
}

View File

@@ -0,0 +1,15 @@
export type SqlValue = string | number | null | Uint8Array;
export interface QueryExecResult {
columns: Array<string>;
values: Array<Array<SqlValue>>;
}
export interface DatabaseService {
initialize(): Promise<void>;
query(sql: string, params?: unknown[]): Promise<QueryExecResult[]>;
run(
sql: string,
params?: unknown[],
): Promise<{ changes: number; lastId?: number }>;
}
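A minimal sketch of what an implementation of this interface could look like on top of the absurd-sql database shape declared earlier (initialization details omitted; the class name is illustrative):
```typescript
// Minimal sketch; AbsurdSqlDatabaseService is an illustrative name.
class AbsurdSqlDatabaseService implements DatabaseService {
  private db?: {
    exec(sql: string, params?: unknown[]): Promise<QueryExecResult[]>;
    run(sql: string, params?: unknown[]): Promise<{ changes: number; lastId?: number }>;
  };

  async initialize(): Promise<void> {
    // open the underlying SQLite database and create tables here
  }

  async query(sql: string, params?: unknown[]): Promise<QueryExecResult[]> {
    if (!this.db) throw new Error("Database not initialized");
    return this.db.exec(sql, params);
  }

  async run(
    sql: string,
    params?: unknown[],
  ): Promise<{ changes: number; lastId?: number }> {
    if (!this.db) throw new Error("Database not initialized");
    return this.db.run(sql, params);
  }
}
```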

View File

@@ -30,7 +30,7 @@ import { z } from "zod";
// Add a union type of all valid route paths
export const VALID_DEEP_LINK_ROUTES = [
"user-profile",
"project-details",
"project",
"onboard-meeting-setup",
"invite-one-accept",
"contact-import",
@@ -61,7 +61,7 @@ export const deepLinkSchemas = {
"user-profile": z.object({
id: z.string(),
}),
"project-details": z.object({
project: z.object({
id: z.string(),
}),
"onboard-meeting-setup": z.object({

View File

@@ -1,13 +1,17 @@
/**
* Interfaces for the give records with limited contact information, good to show on a feed.
**/
import { GiveSummaryRecord } from "./records";
// Common interface for contact information
// Common interface for views with summary contact information
export interface ContactInfo {
known: boolean;
displayName: string;
profileImageUrl?: string;
}
// Define the contact information fields
// Define a subset of contact information fields
interface GiveContactInfo {
giver: ContactInfo;
issuer: ContactInfo;
@@ -17,5 +21,5 @@ interface GiveContactInfo {
image?: string;
}
// Combine GiveSummaryRecord with contact information using intersection type
// Combine GiveSummaryRecord with contact information
export type GiveRecordWithContactInfo = GiveSummaryRecord & GiveContactInfo;

View File

@@ -1,7 +1,33 @@
export * from "./claims";
export * from "./claims-result";
export * from "./common";
export type {
// From common.ts
CreateAndSubmitClaimResult,
GenericCredWrapper,
GenericVerifiableCredential,
KeyMeta,
// Exclude types that are also exported from other files
// GiveVerifiableCredential,
// OfferVerifiableCredential,
// RegisterVerifiableCredential,
// PlanSummaryRecord,
// UserInfo,
} from "./common";
export type {
// From claims.ts
GiveActionClaim,
OfferClaim,
RegisterActionClaim,
} from "./claims";
export type {
// From records.ts
PlanSummaryRecord,
} from "./records";
export type {
// From user.ts
UserInfo,
} from "./user";
export * from "./limits";
export * from "./records";
export * from "./user";
export * from "./deepLinks";

View File

@@ -1,14 +1,14 @@
import { GiveVerifiableCredential, OfferVerifiableCredential } from "./claims";
import { GiveActionClaim, OfferClaim } from "./claims";
// a summary record; the VC is found in the fullClaim field
export interface GiveSummaryRecord {
[x: string]: PropertyKey | undefined | GiveVerifiableCredential;
[x: string]: PropertyKey | undefined | GiveActionClaim;
type?: string;
agentDid: string;
amount: number;
amountConfirmed: number;
description: string;
fullClaim: GiveVerifiableCredential;
fullClaim: GiveActionClaim;
fulfillsHandleId: string;
fulfillsPlanHandleId?: string;
fulfillsType?: string;
@@ -26,7 +26,7 @@ export interface OfferSummaryRecord {
amount: number;
amountGiven: number;
amountGivenConfirmed: number;
fullClaim: OfferVerifiableCredential;
fullClaim: OfferClaim;
fulfillsPlanHandleId: string;
handleId: string;
issuerDid: string;

View File

@@ -32,7 +32,7 @@ export const newIdentifier = (
publicHex: string,
privateHex: string,
derivationPath: string,
): Omit<IIdentifier, keyof "provider"> => {
): IIdentifier => {
return {
did: DEFAULT_DID_PROVIDER_NAME + ":" + address,
keys: [
@@ -159,7 +159,7 @@ export const nextDerivationPath = (origDerivPath: string) => {
};
// Base64 encoding/decoding utilities for browser
function base64ToArrayBuffer(base64: string): Uint8Array {
export function base64ToArrayBuffer(base64: string): Uint8Array {
const binaryString = atob(base64);
const bytes = new Uint8Array(binaryString.length);
for (let i = 0; i < binaryString.length; i++) {
@@ -168,7 +168,7 @@ function base64ToArrayBuffer(base64: string): Uint8Array {
return bytes;
}
function arrayBufferToBase64(buffer: ArrayBuffer): string {
export function arrayBufferToBase64(buffer: ArrayBuffer): string {
const binary = String.fromCharCode(...new Uint8Array(buffer));
return btoa(binary);
}
@@ -178,7 +178,7 @@ const IV_LENGTH = 12;
const KEY_LENGTH = 256;
const ITERATIONS = 100000;
// Encryption helper function
// Message encryption helper function, used for onboarding meeting messages
export async function encryptMessage(message: string, password: string) {
const encoder = new TextEncoder();
const salt = crypto.getRandomValues(new Uint8Array(SALT_LENGTH));
@@ -226,7 +226,7 @@ export async function encryptMessage(message: string, password: string) {
return btoa(JSON.stringify(result));
}
// Decryption helper function
// Message decryption helper function, used for onboarding meeting messages
export async function decryptMessage(encryptedJson: string, password: string) {
const decoder = new TextDecoder();
const { salt, iv, encrypted } = JSON.parse(atob(encryptedJson));
@@ -273,7 +273,7 @@ export async function decryptMessage(encryptedJson: string, password: string) {
}
// Test function to verify encryption/decryption
export async function testEncryptionDecryption() {
export async function testMessageEncryptionDecryption() {
try {
const testMessage = "Hello, this is a test message! 🚀";
const testPassword = "myTestPassword123";
@@ -299,9 +299,111 @@ export async function testEncryptionDecryption() {
logger.log("\nTesting with wrong password...");
try {
await decryptMessage(encrypted, "wrongPassword");
logger.log("Should not reach here");
logger.log("Incorrectly decrypted with wrong password ❌");
} catch (error) {
logger.log("Correctly failed with wrong password ✅");
logger.log("Correctly failed to decrypt with wrong password ✅");
}
return success;
} catch (error) {
logger.error("Test failed with error:", error);
return false;
}
}
// Simple encryption using the Web Crypto API (crypto.subtle), used for the initial encryption of the identity and mnemonic
export async function simpleEncrypt(
text: string,
secret: ArrayBuffer,
): Promise<ArrayBuffer> {
const iv = crypto.getRandomValues(new Uint8Array(16));
// Derive a 256-bit key from the secret using SHA-256
const keyData = await crypto.subtle.digest("SHA-256", secret);
const key = await crypto.subtle.importKey(
"raw",
keyData,
{ name: "AES-GCM" },
false,
["encrypt"],
);
const encrypted = await crypto.subtle.encrypt(
{ name: "AES-GCM", iv },
key,
new TextEncoder().encode(text),
);
// Combine IV and encrypted data
const result = new Uint8Array(iv.length + encrypted.byteLength);
result.set(iv);
result.set(new Uint8Array(encrypted), iv.length);
return result.buffer;
}
// Simple decryption using the Web Crypto API (crypto.subtle), used for the default decryption of the identity and mnemonic
export async function simpleDecrypt(
encryptedText: ArrayBuffer,
secret: ArrayBuffer,
): Promise<string> {
const data = new Uint8Array(encryptedText);
// Extract IV and encrypted data
const iv = data.slice(0, 16);
const encrypted = data.slice(16);
// Derive the same 256-bit key from the secret using SHA-256
const keyData = await crypto.subtle.digest("SHA-256", secret);
const key = await crypto.subtle.importKey(
"raw",
keyData,
{ name: "AES-GCM" },
false,
["decrypt"],
);
const decrypted = await crypto.subtle.decrypt(
{ name: "AES-GCM", iv },
key,
encrypted,
);
return new TextDecoder().decode(decrypted);
}
// Test function for simple encryption/decryption
export async function testSimpleEncryptionDecryption() {
try {
const testMessage = "Hello, this is a test message! 🚀";
const testSecret = crypto.getRandomValues(new Uint8Array(32));
logger.log("Original message:", testMessage);
// Test encryption
logger.log("Encrypting...");
const encrypted = await simpleEncrypt(testMessage, testSecret);
const encryptedBase64 = arrayBufferToBase64(encrypted);
logger.log("Encrypted result:", encryptedBase64);
// Test decryption
logger.log("Decrypting...");
const encryptedArrayBuffer = base64ToArrayBuffer(encryptedBase64);
const decrypted = await simpleDecrypt(encryptedArrayBuffer, testSecret);
logger.log("Decrypted result:", decrypted);
// Verify
const success = testMessage === decrypted;
logger.log("Test " + (success ? "PASSED ✅" : "FAILED ❌"));
logger.log("Messages match:", success);
// Test with wrong secret
logger.log("\nTesting with wrong secret...");
try {
await simpleDecrypt(encryptedArrayBuffer, new Uint8Array(32));
logger.log("Incorrectly decrypted with wrong secret ❌");
} catch (error) {
logger.log("Correctly failed to decrypt with wrong secret ✅");
}
return success;

View File

@@ -17,29 +17,12 @@ import { didEthLocalResolver } from "./did-eth-local-resolver";
import { PEER_DID_PREFIX, verifyPeerSignature } from "./didPeer";
import { base64urlDecodeString, createDidPeerJwt } from "./passkeyDidPeer";
import { urlBase64ToUint8Array } from "./util";
import { KeyMeta, KeyMetaWithPrivate } from "../../../interfaces/common";
export const ETHR_DID_PREFIX = "did:ethr:";
export const JWT_VERIFY_FAILED_CODE = "JWT_VERIFY_FAILED";
export const UNSUPPORTED_DID_METHOD_CODE = "UNSUPPORTED_DID_METHOD";
/**
* Meta info about a key
*/
export interface KeyMeta {
/**
* Decentralized ID for the key
*/
did: string;
/**
* Stringified IIDentifier object from Veramo
*/
identity?: string;
/**
* The Webauthn credential ID in hex, if this is from a passkey
*/
passkeyCredIdHex?: string;
}
const ethLocalResolver = new Resolver({ ethr: didEthLocalResolver });
/**
@@ -51,7 +34,7 @@ export function isFromPasskey(keyMeta?: KeyMeta): boolean {
}
export async function createEndorserJwtForKey(
account: KeyMeta,
account: KeyMetaWithPrivate,
payload: object,
expiresIn?: number,
) {

View File

@@ -1,7 +1,6 @@
import { Buffer } from "buffer/";
import { JWTPayload } from "did-jwt";
import { DIDResolutionResult } from "did-resolver";
import { sha256 } from "ethereum-cryptography/sha256.js";
import { p256 } from "@noble/curves/p256";
import {
startAuthentication,
startRegistration,
@@ -11,12 +10,13 @@ import {
generateRegistrationOptions,
verifyAuthenticationResponse,
verifyRegistrationResponse,
VerifyAuthenticationResponseOpts,
} from "@simplewebauthn/server";
import { VerifyAuthenticationResponseOpts } from "@simplewebauthn/server/esm/authentication/verifyAuthenticationResponse";
import {
Base64URLString,
PublicKeyCredentialCreationOptionsJSON,
PublicKeyCredentialRequestOptionsJSON,
AuthenticatorAssertionResponse,
} from "@simplewebauthn/types";
import { AppString } from "../../../constants/app";
@@ -194,16 +194,19 @@ export class PeerSetup {
},
};
const credential = await navigator.credentials.get(options);
const credential = (await navigator.credentials.get(
options,
)) as PublicKeyCredential;
// console.log("nav credential get", credential);
this.authenticatorData = credential?.response.authenticatorData;
const response = credential?.response as AuthenticatorAssertionResponse;
this.authenticatorData = response?.authenticatorData;
const authenticatorDataBase64Url = arrayBufferToBase64URLString(
this.authenticatorData as ArrayBuffer,
);
this.clientDataJsonBase64Url = arrayBufferToBase64URLString(
credential?.response.clientDataJSON,
response?.clientDataJSON,
);
// Our custom type of JWANT means the signature is based on a concatenation of the two Webauthn properties
@@ -228,9 +231,7 @@ export class PeerSetup {
.replace(/\//g, "_")
.replace(/=+$/, "");
const origSignature = Buffer.from(credential?.response.signature).toString(
"base64",
);
const origSignature = Buffer.from(response?.signature).toString("base64");
this.signature = origSignature
.replace(/\+/g, "-")
.replace(/\//g, "_")
@@ -315,24 +316,18 @@ export async function createDidPeerJwt(
// ... and this import:
// import { p256 } from "@noble/curves/p256";
export async function verifyJwtP256(
credIdHex: string,
issuerDid: string,
authenticatorData: ArrayBuffer,
challenge: Uint8Array,
clientDataJsonBase64Url: Base64URLString,
signature: Base64URLString,
) {
const authDataFromBase = Buffer.from(authenticatorData);
const clientDataFromBase = Buffer.from(clientDataJsonBase64Url, "base64");
const sigBuffer = Buffer.from(signature, "base64");
const finalSigBuffer = unwrapEC2Signature(sigBuffer);
const publicKeyBytes = peerDidToPublicKeyBytes(issuerDid);
// Hash the client data
const hash = sha256(clientDataFromBase);
// Construct the preimage
const preimage = Buffer.concat([authDataFromBase, hash]);
// Use challenge in preimage construction
const preimage = Buffer.concat([authDataFromBase, Buffer.from(challenge)]);
const isValid = p256.verify(
finalSigBuffer,
@@ -383,122 +378,37 @@ export async function verifyJwtSimplewebauthn(
// similar code is in endorser-ch util-crypto.ts verifyPeerSignature
export async function verifyJwtWebCrypto(
credId: Base64URLString,
issuerDid: string,
authenticatorData: ArrayBuffer,
challenge: Uint8Array,
clientDataJsonBase64Url: Base64URLString,
signature: Base64URLString,
) {
const authDataFromBase = Buffer.from(authenticatorData);
const clientDataFromBase = Buffer.from(clientDataJsonBase64Url, "base64");
const sigBuffer = Buffer.from(signature, "base64");
const finalSigBuffer = unwrapEC2Signature(sigBuffer);
// Hash the client data
const hash = sha256(clientDataFromBase);
// Use challenge in preimage construction
const preimage = Buffer.concat([authDataFromBase, Buffer.from(challenge)]);
// Construct the preimage
const preimage = Buffer.concat([authDataFromBase, hash]);
return verifyPeerSignature(preimage, issuerDid, finalSigBuffer);
}
// eslint-disable-next-line @typescript-eslint/no-unused-vars
async function peerDidToDidDocument(did: string): Promise<DIDResolutionResult> {
if (!did.startsWith("did:peer:0z")) {
throw new Error(
"This only verifies a peer DID, method 0, encoded base58btc.",
);
}
// this is basically hard-coded from https://www.w3.org/TR/did-core/#example-various-verification-method-types
// (another reference is the @aviarytech/did-peer resolver)
// Remove unused functions:
// - peerDidToDidDocument
// - COSEtoPEM
// - base64urlDecodeArrayBuffer
// - base64urlEncodeArrayBuffer
// - pemToCryptoKey
/**
* Looks like JsonWebKey2020 isn't too difficult:
* - change context security/suites link to jws-2020/v1
* - change publicKeyMultibase to publicKeyJwk generated with cborToKeys
* - change type to JsonWebKey2020
*/
const id = did.split(":")[2];
const multibase = id.slice(1);
const encnumbasis = multibase.slice(1);
const didDocument = {
"@context": [
"https://www.w3.org/ns/did/v1",
"https://w3id.org/security/suites/secp256k1-2019/v1",
],
assertionMethod: [did + "#" + encnumbasis],
authentication: [did + "#" + encnumbasis],
capabilityDelegation: [did + "#" + encnumbasis],
capabilityInvocation: [did + "#" + encnumbasis],
id: did,
keyAgreement: undefined,
service: undefined,
verificationMethod: [
{
controller: did,
id: did + "#" + encnumbasis,
publicKeyMultibase: multibase,
type: "EcdsaSecp256k1VerificationKey2019",
},
],
};
return {
didDocument,
didDocumentMetadata: {},
didResolutionMetadata: { contentType: "application/did+ld+json" },
};
}
// convert COSE public key to PEM format
// eslint-disable-next-line @typescript-eslint/no-unused-vars
function COSEtoPEM(cose: Buffer) {
// const alg = cose.get(3); // Algorithm
const x = cose[-2]; // x-coordinate
const y = cose[-3]; // y-coordinate
// Ensure the coordinates are in the correct format
// eslint-disable-next-line @typescript-eslint/ban-ts-comment
// @ts-expect-error because it complains about the type of x and y
const pubKeyBuffer = Buffer.concat([Buffer.from([0x04]), x, y]);
// Convert to PEM format
const pem = `-----BEGIN PUBLIC KEY-----
${pubKeyBuffer.toString("base64")}
-----END PUBLIC KEY-----`;
return pem;
}
// tried the base64url library but got an error using their Buffer
// Keep only the used functions:
export function base64urlDecodeString(input: string) {
return atob(input.replace(/-/g, "+").replace(/_/g, "/"));
}
// tried the base64url library but got an error using their Buffer
export function base64urlEncodeString(input: string) {
return btoa(input).replace(/\+/g, "-").replace(/\//g, "_").replace(/=+$/, "");
}
// eslint-disable-next-line @typescript-eslint/no-unused-vars
function base64urlDecodeArrayBuffer(input: string) {
input = input.replace(/-/g, "+").replace(/_/g, "/");
const pad = input.length % 4 === 0 ? "" : "====".slice(input.length % 4);
const str = atob(input + pad);
const bytes = new Uint8Array(str.length);
for (let i = 0; i < str.length; i++) {
bytes[i] = str.charCodeAt(i);
}
return bytes.buffer;
}
// eslint-disable-next-line @typescript-eslint/no-unused-vars
function base64urlEncodeArrayBuffer(buffer: ArrayBuffer) {
const str = String.fromCharCode(...new Uint8Array(buffer));
return base64urlEncodeString(str);
}
// from @simplewebauthn/browser
function arrayBufferToBase64URLString(buffer: ArrayBuffer) {
const bytes = new Uint8Array(buffer);
@@ -523,28 +433,3 @@ function base64URLStringToArrayBuffer(base64URLString: string) {
}
return buffer;
}
// eslint-disable-next-line @typescript-eslint/no-unused-vars
async function pemToCryptoKey(pem: string) {
const binaryDerString = atob(
pem
.split("\n")
.filter((x) => !x.includes("-----"))
.join(""),
);
const binaryDer = new Uint8Array(binaryDerString.length);
for (let i = 0; i < binaryDerString.length; i++) {
binaryDer[i] = binaryDerString.charCodeAt(i);
}
// console.log("binaryDer", binaryDer.buffer);
return await window.crypto.subtle.importKey(
"spki",
binaryDer.buffer,
{
name: "RSASSA-PKCS1-v1_5",
hash: "SHA-256",
},
true,
["verify"],
);
}

View File

@@ -26,29 +26,42 @@ import {
DEFAULT_IMAGE_API_SERVER,
NotificationIface,
APP_SERVER,
USE_DEXIE_DB,
} from "../constants/app";
import { Contact } from "../db/tables/contacts";
import { accessToken, deriveAddress, nextDerivationPath } from "../libs/crypto";
import { logConsoleAndDb, NonsensitiveDexie } from "../db/index";
import { NonsensitiveDexie } from "../db/index";
import { logConsoleAndDb } from "../db/databaseUtil";
import {
retrieveAccountMetadata,
retrieveFullyDecryptedAccount,
getPasskeyExpirationSeconds,
} from "../libs/util";
import { createEndorserJwtForKey, KeyMeta } from "../libs/crypto/vc";
import { createEndorserJwtForKey } from "../libs/crypto/vc";
import {
GiveActionClaim,
JoinActionClaim,
OfferClaim,
PlanActionClaim,
RegisterActionClaim,
TenureClaim,
} from "../interfaces/claims";
import {
GiveVerifiableCredential,
OfferVerifiableCredential,
RegisterVerifiableCredential,
GenericVerifiableCredential,
GenericCredWrapper,
PlanSummaryRecord,
GenericVerifiableCredential,
AxiosErrorResponse,
UserInfo,
CreateAndSubmitClaimResult,
} from "../interfaces";
ClaimObject,
VerifiableCredentialClaim,
QuantitativeValue,
KeyMetaWithPrivate,
KeyMetaMaybeWithPrivate,
} from "../interfaces/common";
import { PlanSummaryRecord } from "../interfaces/records";
import { logger } from "../utils/logger";
import { PlatformServiceFactory } from "@/services/PlatformServiceFactory";
/**
* Standard context for schema.org data
@@ -100,7 +113,10 @@ export const ENDORSER_CH_HANDLE_PREFIX = "https://endorser.ch/entity/";
export const BLANK_GENERIC_SERVER_RECORD: GenericCredWrapper<GenericVerifiableCredential> =
{
claim: { "@type": "" },
claim: {
"@context": SCHEMA_ORG_CONTEXT,
"@type": "",
},
handleId: "",
id: "",
issuedAt: "",
@@ -125,7 +141,7 @@ export function isDid(did: string): boolean {
* @param {string} did - The DID to check
* @returns {boolean} True if DID is hidden
*/
export function isHiddenDid(did: string): boolean {
export function isHiddenDid(did: string | undefined): boolean {
return did === HIDDEN_DID;
}
@@ -180,37 +196,21 @@ export function isEmptyOrHiddenDid(did?: string): boolean {
* };
* testRecursivelyOnStrings(obj, isHiddenDid); // Returns: true
*/
function testRecursivelyOnStrings(
func: (arg0: unknown) => boolean,
const testRecursivelyOnStrings = (
input: unknown,
): boolean {
// Test direct string values
if (Object.prototype.toString.call(input) === "[object String]") {
return func(input);
test: (s: string) => boolean,
): boolean => {
if (typeof input === "string") {
return test(input);
} else if (Array.isArray(input)) {
return input.some((item) => testRecursivelyOnStrings(item, test));
} else if (input && typeof input === "object") {
return Object.values(input as Record<string, unknown>).some((value) =>
testRecursivelyOnStrings(value, test),
);
}
// Recursively test objects and arrays
else if (input instanceof Object) {
if (!Array.isArray(input)) {
// Handle plain objects
for (const key in input) {
if (testRecursivelyOnStrings(func, input[key])) {
return true;
}
}
} else {
// Handle arrays
for (const value of input) {
if (testRecursivelyOnStrings(func, value)) {
return true;
}
}
}
return false;
} else {
// Non-string, non-object values can't contain strings
return false;
}
}
return false;
};
// eslint-disable-next-line @typescript-eslint/no-explicit-any
export function containsHiddenDid(obj: any) {
@@ -551,7 +551,11 @@ export async function setPlanInCache(
* @returns {string|undefined} User-friendly message or undefined if none found
*/
export function serverMessageForUser(error: unknown): string | undefined {
return error?.response?.data?.error?.message;
if (error && typeof error === "object" && "response" in error) {
const err = error as AxiosErrorResponse;
return err.response?.data?.error?.message;
}
return undefined;
}
/**
@@ -573,18 +577,27 @@ export function errorStringForLog(error: unknown) {
// --- property '_value' closes the circle
}
let fullError = "" + error + " - JSON: " + stringifiedError;
const errorResponseText = JSON.stringify(error.response);
// for some reason, error.response is not included in stringify result (eg. for 400 errors on invite redemptions)
if (!R.empty(errorResponseText) && !fullError.includes(errorResponseText)) {
// add error.response stuff
if (R.equals(error?.config, error?.response?.config)) {
// but exclude "config" because it's already in there
const newErrorResponseText = JSON.stringify(
R.omit(["config"] as never[], error.response),
);
fullError += " - .response w/o same config JSON: " + newErrorResponseText;
} else {
fullError += " - .response JSON: " + errorResponseText;
if (error && typeof error === "object" && "response" in error) {
const err = error as AxiosErrorResponse;
const errorResponseText = JSON.stringify(err.response);
// for some reason, error.response is not included in stringify result (eg. for 400 errors on invite redemptions)
if (!R.empty(errorResponseText) && !fullError.includes(errorResponseText)) {
// add error.response stuff
if (
err.response?.config &&
err.config &&
R.equals(err.config, err.response.config)
) {
// but exclude "config" because it's already in there
const newErrorResponseText = JSON.stringify(
R.omit(["config"] as never[], err.response),
);
fullError +=
" - .response w/o same config JSON: " + newErrorResponseText;
} else {
fullError += " - .response JSON: " + errorResponseText;
}
}
}
return fullError;
@@ -642,7 +655,7 @@ export async function getNewOffersToUserProjects(
* @param lastClaimId supplied when editing a previous claim
*/
export function hydrateGive(
vcClaimOrig?: GiveVerifiableCredential,
vcClaimOrig?: GiveActionClaim,
fromDid?: string,
toDid?: string,
description?: string,
@@ -650,14 +663,12 @@ export function hydrateGive(
unitCode?: string,
fulfillsProjectHandleId?: string,
fulfillsOfferHandleId?: string,
isTrade: boolean = false, // remove, because this app is all for gifting
isTrade: boolean = false,
imageUrl?: string,
providerPlanHandleId?: string,
lastClaimId?: string,
): GiveVerifiableCredential {
// Remember: replace values or erase if it's null
const vcClaim: GiveVerifiableCredential = vcClaimOrig
): GiveActionClaim {
const vcClaim: GiveActionClaim = vcClaimOrig
? R.clone(vcClaimOrig)
: {
"@context": SCHEMA_ORG_CONTEXT,
@@ -665,55 +676,71 @@ export function hydrateGive(
};
if (lastClaimId) {
// this is an edit
vcClaim.lastClaimId = lastClaimId;
delete vcClaim.identifier;
}
vcClaim.agent = fromDid ? { identifier: fromDid } : undefined;
vcClaim.recipient = toDid ? { identifier: toDid } : undefined;
if (fromDid) {
vcClaim.agent = { identifier: fromDid };
}
if (toDid) {
vcClaim.recipient = { identifier: toDid };
}
vcClaim.description = description || undefined;
vcClaim.object =
amount && !isNaN(amount)
? { amountOfThisGood: amount, unitCode: unitCode || "HUR" }
: undefined;
// ensure fulfills is an array
if (amount && !isNaN(amount)) {
const quantitativeValue: QuantitativeValue = {
amountOfThisGood: amount,
unitCode: unitCode || "HUR",
};
vcClaim.object = quantitativeValue;
}
// Initialize fulfills array if not present
if (!Array.isArray(vcClaim.fulfills)) {
vcClaim.fulfills = vcClaim.fulfills ? [vcClaim.fulfills] : [];
}
// ... and replace or add each element, ending with Trade or Donate
// I realize the following doesn't change any elements that are not PlanAction or Offer or Trade/Action.
// Filter and add fulfills elements
vcClaim.fulfills = vcClaim.fulfills.filter(
(elem) => elem["@type"] !== "PlanAction",
(elem: { "@type": string }) => elem["@type"] !== "PlanAction",
);
if (fulfillsProjectHandleId) {
vcClaim.fulfills.push({
"@type": "PlanAction",
identifier: fulfillsProjectHandleId,
});
}
vcClaim.fulfills = vcClaim.fulfills.filter(
(elem) => elem["@type"] !== "Offer",
(elem: { "@type": string }) => elem["@type"] !== "Offer",
);
if (fulfillsOfferHandleId) {
vcClaim.fulfills.push({
"@type": "Offer",
identifier: fulfillsOfferHandleId,
});
}
// do Trade/Donate last because current endorser.ch only looks at the first for plans & offers
vcClaim.fulfills = vcClaim.fulfills.filter(
(elem) =>
(elem: { "@type": string }) =>
elem["@type"] !== "DonateAction" && elem["@type"] !== "TradeAction",
);
vcClaim.fulfills.push({ "@type": isTrade ? "TradeAction" : "DonateAction" });
vcClaim.fulfills.push({
"@type": isTrade ? "TradeAction" : "DonateAction",
});
vcClaim.image = imageUrl || undefined;
vcClaim.provider = providerPlanHandleId
? { "@type": "PlanAction", identifier: providerPlanHandleId }
: undefined;
if (providerPlanHandleId) {
vcClaim.provider = {
"@type": "PlanAction",
identifier: providerPlanHandleId,
};
}
return vcClaim;
}
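A minimal sketch of calling the hydrator for a brand-new give (no `lastClaimId`); the DIDs and description are placeholders:
```typescript
// Minimal sketch; values are illustrative.
const giveClaim = hydrateGive(
  undefined,               // vcClaimOrig: not editing an existing claim
  "did:example:giver",     // fromDid
  "did:example:recipient", // toDid
  "Two hours of weeding",  // description
  2,                       // amount
  "HUR",                   // unitCode
  undefined,               // fulfillsProjectHandleId
  undefined,               // fulfillsOfferHandleId
  false,                   // isTrade: treated as a DonateAction
);
// giveClaim.object is { amountOfThisGood: 2, unitCode: "HUR" }
// and giveClaim.fulfills ends with { "@type": "DonateAction" }
```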
@@ -774,7 +801,7 @@ export async function createAndSubmitGive(
export async function editAndSubmitGive(
axios: Axios,
apiServer: string,
fullClaim: GenericCredWrapper<GiveVerifiableCredential>,
fullClaim: GenericCredWrapper<GiveActionClaim>,
issuerDid: string,
fromDid?: string,
toDid?: string,
@@ -815,7 +842,7 @@ export async function editAndSubmitGive(
* @param lastClaimId supplied when editing a previous claim
*/
export function hydrateOffer(
vcClaimOrig?: OfferVerifiableCredential,
vcClaimOrig?: OfferClaim,
fromDid?: string,
toDid?: string,
itemDescription?: string,
@@ -825,10 +852,8 @@ export function hydrateOffer(
fulfillsProjectHandleId?: string,
validThrough?: string,
lastClaimId?: string,
): OfferVerifiableCredential {
// Remember: replace values or erase if it's null
const vcClaim: OfferVerifiableCredential = vcClaimOrig
): OfferClaim {
const vcClaim: OfferClaim = vcClaimOrig
? R.clone(vcClaimOrig)
: {
"@context": SCHEMA_ORG_CONTEXT,
@@ -841,14 +866,20 @@ export function hydrateOffer(
delete vcClaim.identifier;
}
vcClaim.offeredBy = fromDid ? { identifier: fromDid } : undefined;
vcClaim.recipient = toDid ? { identifier: toDid } : undefined;
if (fromDid) {
vcClaim.offeredBy = { identifier: fromDid };
}
if (toDid) {
vcClaim.recipient = { identifier: toDid };
}
vcClaim.description = conditionDescription || undefined;
vcClaim.includesObject =
amount && !isNaN(amount)
? { amountOfThisGood: amount, unitCode: unitCode || "HUR" }
: undefined;
if (amount && !isNaN(amount)) {
vcClaim.includesObject = {
amountOfThisGood: amount,
unitCode: unitCode || "HUR",
};
}
if (itemDescription || fulfillsProjectHandleId) {
vcClaim.itemOffered = vcClaim.itemOffered || {};
@@ -860,6 +891,7 @@ export function hydrateOffer(
};
}
}
vcClaim.validThrough = validThrough || undefined;
return vcClaim;
@@ -899,7 +931,7 @@ export async function createAndSubmitOffer(
undefined,
);
return createAndSubmitClaim(
vcClaim as OfferVerifiableCredential,
vcClaim as OfferClaim,
issuerDid,
apiServer,
axios,
@@ -909,7 +941,7 @@ export async function createAndSubmitOffer(
export async function editAndSubmitOffer(
axios: Axios,
apiServer: string,
fullClaim: GenericCredWrapper<OfferVerifiableCredential>,
fullClaim: GenericCredWrapper<OfferClaim>,
issuerDid: string,
itemDescription: string,
amount?: number,
@@ -932,7 +964,7 @@ export async function editAndSubmitOffer(
fullClaim.id,
);
return createAndSubmitClaim(
vcClaim as OfferVerifiableCredential,
vcClaim as OfferClaim,
issuerDid,
apiServer,
axios,
@@ -947,7 +979,7 @@ export const createAndSubmitConfirmation = async (
handleId: string | undefined,
apiServer: string,
axios: Axios,
) => {
): Promise<CreateAndSubmitClaimResult> => {
const goodClaim = removeSchemaContext(
removeVisibleToDids(
addLastClaimOrHandleAsIdIfMissing(claim, lastClaimId, handleId),
@@ -988,26 +1020,25 @@ export async function createAndSubmitClaim(
},
});
return { type: "success", response };
// eslint-disable-next-line @typescript-eslint/no-explicit-any
} catch (error: any) {
return { success: true, handleId: response.data?.handleId };
} catch (error: unknown) {
logger.error("Error submitting claim:", error);
const errorMessage: string =
serverMessageForUser(error) ||
error.message ||
(error && typeof error === "object" && "message" in error
? String(error.message)
: undefined) ||
"Got some error submitting the claim. Check your permissions, network, and error logs.";
return {
type: "error",
error: {
error: errorMessage,
},
success: false,
error: errorMessage,
};
}
}
export async function generateEndorserJwtUrlForAccount(
account: KeyMeta,
account: KeyMetaMaybeWithPrivate,
isRegistered: boolean,
givenName: string,
profileImageUrl: string,
@@ -1031,12 +1062,9 @@ export async function generateEndorserJwtUrlForAccount(
}
// Add the next key -- not recommended for the QR code for such a high resolution
if (isContact && account?.mnemonic && account?.derivationPath) {
const newDerivPath = nextDerivationPath(account.derivationPath as string);
const nextPublicHex = deriveAddress(
account.mnemonic as string,
newDerivPath,
)[2];
if (isContact && account.derivationPath && account.mnemonic) {
const newDerivPath = nextDerivationPath(account.derivationPath);
const nextPublicHex = deriveAddress(account.mnemonic, newDerivPath)[2];
const nextPublicEncKey = Buffer.from(nextPublicHex, "hex");
const nextPublicEncKeyHash = sha256(nextPublicEncKey);
const nextPublicEncKeyHashBase64 =
@@ -1056,7 +1084,11 @@ export async function createEndorserJwtForDid(
expiresIn?: number,
) {
const account = await retrieveFullyDecryptedAccount(issuerDid);
return createEndorserJwtForKey(account as KeyMeta, payload, expiresIn);
return createEndorserJwtForKey(
account as KeyMetaWithPrivate,
payload,
expiresIn,
);
}
/**
@@ -1104,21 +1136,21 @@ export const capitalizeAndInsertSpacesBeforeCaps = (text: string) => {
similar code is also contained in endorser-mobile
**/
// eslint-disable-next-line @typescript-eslint/no-explicit-any
const claimSummary = (
claim: GenericCredWrapper<GenericVerifiableCredential>,
claim:
| GenericVerifiableCredential
| GenericCredWrapper<GenericVerifiableCredential>,
) => {
if (!claim) {
// to differentiate from "something" above
return "something";
}
let specificClaim:
| GenericVerifiableCredential
| GenericCredWrapper<GenericVerifiableCredential> = claim;
if (claim.claim) {
// probably a Verified Credential
// eslint-disable-next-line @typescript-eslint/no-explicit-any
specificClaim = claim.claim;
let specificClaim: GenericVerifiableCredential;
if ("claim" in claim) {
// It's a GenericCredWrapper
specificClaim = claim.claim as GenericVerifiableCredential;
} else {
// It's already a GenericVerifiableCredential
specificClaim = claim;
}
if (Array.isArray(specificClaim)) {
if (specificClaim.length === 1) {
@@ -1153,88 +1185,112 @@ export const claimSpecialDescription = (
identifiers: Array<string>,
contacts: Array<Contact>,
) => {
let claim = record.claim;
if (claim.claim) {
// it's probably a Verified Credential
claim = claim.claim;
let claim:
| GenericVerifiableCredential
| GenericCredWrapper<GenericVerifiableCredential> = record.claim;
if ("claim" in claim) {
// it's a nested GenericCredWrapper
claim = claim.claim as GenericVerifiableCredential;
}
const issuer = didInfo(record.issuer, activeDid, identifiers, contacts);
const type = claim["@type"] || "UnknownType";
if (type === "AgreeAction") {
return issuer + " agreed with " + claimSummary(claim.object);
return (
issuer +
" agreed with " +
claimSummary(claim.object as GenericVerifiableCredential)
);
} else if (isAccept(claim)) {
return issuer + " accepted " + claimSummary(claim.object);
return (
issuer +
" accepted " +
claimSummary(claim.object as GenericVerifiableCredential)
);
} else if (type === "GiveAction") {
// agent.did is for legacy data, before March 2023
const giver = claim.agent?.identifier || claim.agent?.did;
const giveClaim = claim as GiveActionClaim;
// @ts-expect-error because .did may be found in legacy data, before March 2023
const legacyGiverDid = giveClaim.agent?.did;
const giver = giveClaim.agent?.identifier || legacyGiverDid;
const giverInfo = didInfo(giver, activeDid, identifiers, contacts);
let gaveAmount = claim.object?.amountOfThisGood
? displayAmount(claim.object.unitCode, claim.object.amountOfThisGood)
let gaveAmount = giveClaim.object?.amountOfThisGood
? displayAmount(
giveClaim.object.unitCode as string,
giveClaim.object.amountOfThisGood as number,
)
: "";
if (claim.description) {
if (giveClaim.description) {
if (gaveAmount) {
gaveAmount = gaveAmount + ", and also: ";
}
gaveAmount = gaveAmount + claim.description;
gaveAmount = gaveAmount + giveClaim.description;
}
if (!gaveAmount) {
gaveAmount = "something not described";
}
// recipient.did is for legacy data, before March 2023
const gaveRecipientId = claim.recipient?.identifier || claim.recipient?.did;
// @ts-expect-error because .did may be found in legacy data, before March 2023
const legacyRecipDid = giveClaim.recipient?.did;
const gaveRecipientId = giveClaim.recipient?.identifier || legacyRecipDid;
const gaveRecipientInfo = gaveRecipientId
? " to " + didInfo(gaveRecipientId, activeDid, identifiers, contacts)
: "";
return giverInfo + " gave" + gaveRecipientInfo + ": " + gaveAmount;
} else if (type === "JoinAction") {
// agent.did is for legacy data, before March 2023
const agent = claim.agent?.identifier || claim.agent?.did;
const joinClaim = claim as JoinActionClaim;
// @ts-expect-error because .did may be found in legacy data, before March 2023
const legacyDid = joinClaim.agent?.did;
const agent = joinClaim.agent?.identifier || legacyDid;
const contactInfo = didInfo(agent, activeDid, identifiers, contacts);
let eventOrganizer =
claim.event && claim.event.organizer && claim.event.organizer.name;
joinClaim.event &&
joinClaim.event.organizer &&
joinClaim.event.organizer.name;
eventOrganizer = eventOrganizer || "";
let eventName = claim.event && claim.event.name;
let eventName = joinClaim.event && joinClaim.event.name;
eventName = eventName ? " " + eventName : "";
let fullEvent = eventOrganizer + eventName;
fullEvent = fullEvent ? " attended the " + fullEvent : "";
let eventDate = claim.event && claim.event.startTime;
let eventDate = joinClaim.event && joinClaim.event.startTime;
eventDate = eventDate ? " at " + eventDate : "";
return contactInfo + fullEvent + eventDate;
} else if (isOffer(claim)) {
const offerer = claim.offeredBy?.identifier;
const offerClaim = claim as OfferClaim;
const offerer = offerClaim.offeredBy?.identifier;
const contactInfo = didInfo(offerer, activeDid, identifiers, contacts);
let offering = "";
if (claim.includesObject) {
if (offerClaim.includesObject) {
offering +=
" " +
displayAmount(
claim.includesObject.unitCode,
claim.includesObject.amountOfThisGood,
offerClaim.includesObject.unitCode,
offerClaim.includesObject.amountOfThisGood,
);
}
if (claim.itemOffered?.description) {
offering += ", saying: " + claim.itemOffered?.description;
if (offerClaim.itemOffered?.description) {
offering += ", saying: " + offerClaim.itemOffered?.description;
}
// recipient.did is for legacy data, before March 2023
const offerRecipientId =
claim.recipient?.identifier || claim.recipient?.did;
// @ts-expect-error because .did may be found in legacy data, before March 2023
const legacyDid = offerClaim.recipient?.did;
const offerRecipientId = offerClaim.recipient?.identifier || legacyDid;
const offerRecipientInfo = offerRecipientId
? " to " + didInfo(offerRecipientId, activeDid, identifiers, contacts)
: "";
return contactInfo + " offered" + offering + offerRecipientInfo;
} else if (type === "PlanAction") {
const claimer = claim.agent?.identifier || record.issuer;
const planClaim = claim as PlanActionClaim;
const claimer = planClaim.agent?.identifier || record.issuer;
const claimerInfo = didInfo(claimer, activeDid, identifiers, contacts);
return claimerInfo + " announced a project: " + claim.name;
return claimerInfo + " announced a project: " + planClaim.name;
} else if (type === "Tenure") {
// party.did is for legacy data, before March 2023
const claimer = claim.party?.identifier || claim.party?.did;
const tenureClaim = claim as TenureClaim;
// @ts-expect-error because .did may be found in legacy data, before March 2023
const legacyDid = tenureClaim.party?.did;
const claimer = tenureClaim.party?.identifier || legacyDid;
const contactInfo = didInfo(claimer, activeDid, identifiers, contacts);
const polygon = claim.spatialUnit?.geo?.polygon || "";
const polygon = tenureClaim.spatialUnit?.geo?.polygon || "";
return (
contactInfo +
" possesses [" +
@@ -1242,11 +1298,7 @@ export const claimSpecialDescription = (
"...]"
);
} else {
return (
issuer +
" declared " +
claimSummary(claim as GenericCredWrapper<GenericVerifiableCredential>)
);
return issuer + " declared " + claimSummary(claim);
}
};
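A minimal usage sketch, assuming the function and types are exported from the modules referenced in this changeset (exact paths and the wrapper cast are assumptions for illustration):

// Paths assumed from the imports shown elsewhere in this diff.
import { claimSpecialDescription } from "../libs/endorserServer";
import {
  GenericCredWrapper,
  GenericVerifiableCredential,
} from "../interfaces/common";
import { Contact } from "../db/tables/contacts";

const activeDid = "did:ethr:0xMe";
const identifiers = [activeDid];
const contacts: Contact[] = [];

const record = {
  issuer: "did:ethr:0xGiver",
  claim: {
    "@type": "GiveAction",
    agent: { identifier: "did:ethr:0xGiver" },
    recipient: { identifier: activeDid },
    object: { amountOfThisGood: 2, unitCode: "HUR" },
    description: "Helped move boxes",
  },
} as unknown as GenericCredWrapper<GenericVerifiableCredential>;

// Yields a sentence along the lines of:
// "<giver> gave to <recipient>: 2 hours, and also: Helped move boxes"
const summary = claimSpecialDescription(record, activeDid, identifiers, contacts);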
@@ -1286,32 +1338,42 @@ export async function createEndorserJwtVcFromClaim(
return createEndorserJwtForDid(issuerDid, vcPayload);
}
/**
* Create a JWT for a RegisterAction claim.
*
* @param activeDid - The DID of the user creating the invite
 * @param contact - The contact to register, with a 'did' field; optional, and omitted when creating an invite for an unknown recipient
* @param identifier - The identifier for the invite, usually random
* @param expiresIn - The number of seconds until the invite expires
* @returns The JWT for the RegisterAction claim
*/
export async function createInviteJwt(
activeDid: string,
contact?: Contact,
inviteId?: string,
expiresIn?: number,
identifier?: string,
expiresIn?: number, // in seconds
): Promise<string> {
const vcClaim: RegisterVerifiableCredential = {
const vcClaim: RegisterActionClaim = {
"@context": SCHEMA_ORG_CONTEXT,
"@type": "RegisterAction",
agent: { identifier: activeDid },
object: SERVICE_ID,
identifier: identifier,
};
if (contact) {
if (contact?.did) {
vcClaim.participant = { identifier: contact.did };
}
if (inviteId) {
vcClaim.identifier = inviteId;
}
// Make a payload for the claim
const vcPayload = {
const vcPayload: { vc: VerifiableCredentialClaim } = {
vc: {
"@context": ["https://www.w3.org/2018/credentials/v1"],
"@type": "VerifiableCredential",
type: ["VerifiableCredential"],
credentialSubject: vcClaim,
credentialSubject: vcClaim as unknown as ClaimObject, // Type assertion needed due to object being string
},
};
// Create a signature using private key of identity
const vcJwt = await createEndorserJwtForDid(activeDid, vcPayload, expiresIn);
return vcJwt;
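A hedged sketch of the new call signature (the random identifier and one-week expiration are illustrative values, and activeDid is assumed to come from the active account's settings):

const inviteJwt = await createInviteJwt(
  activeDid,
  undefined, // no contact yet; the participant is added when someone accepts
  crypto.randomUUID(), // identifier: "usually random" per the doc comment above
  7 * 24 * 60 * 60, // expiresIn, in seconds
);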
@@ -1323,21 +1385,44 @@ export async function register(
axios: Axios,
contact: Contact,
): Promise<{ success?: boolean; error?: string }> {
const vcJwt = await createInviteJwt(activeDid, contact);
try {
const vcJwt = await createInviteJwt(activeDid, contact);
const url = apiServer + "/api/v2/claim";
const resp = await axios.post<{
success?: {
handleId?: string;
embeddedRecordError?: string;
};
error?: string;
message?: string;
}>(url, { jwtEncoded: vcJwt });
const url = apiServer + "/api/v2/claim";
const resp = await axios.post(url, { jwtEncoded: vcJwt });
if (resp.data?.success?.handleId) {
return { success: true };
} else if (resp.data?.success?.embeddedRecordError) {
let message =
"There was some problem with the registration and so it may not be complete.";
if (typeof resp.data.success.embeddedRecordError == "string") {
message += " " + resp.data.success.embeddedRecordError;
if (resp.data?.success?.handleId) {
return { success: true };
} else if (resp.data?.success?.embeddedRecordError) {
let message =
"There was some problem with the registration and so it may not be complete.";
if (typeof resp.data.success.embeddedRecordError === "string") {
message += " " + resp.data.success.embeddedRecordError;
}
return { error: message };
} else {
logger.error("Registration error:", JSON.stringify(resp.data));
return { error: "Got a server error when registering." };
}
} catch (error: unknown) {
if (error && typeof error === "object") {
const err = error as AxiosErrorResponse;
const errorMessage =
err.message ||
(err.response?.data &&
typeof err.response.data === "object" &&
"message" in err.response.data
? (err.response.data as { message: string }).message
: undefined);
logger.error("Registration error:", errorMessage || JSON.stringify(err));
return { error: errorMessage || "Got a server error when registering." };
}
return { error: message };
} else {
logger.error(resp);
return { error: "Got a server error when registering." };
}
}
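A sketch of handling the { success?, error? } result that register now returns after the try/catch rework (apiServer, axios, and contact are assumed to come from the caller's context):

const result = await register(activeDid, apiServer, axios, contact);
if (result.success) {
  // e.g. flip the contact's registered flag via platformService.dbExec(...)
} else {
  logger.warn("Registration incomplete:", result.error);
}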
@@ -1363,7 +1448,14 @@ export async function setVisibilityUtil(
if (resp.status === 200) {
const success = resp.data.success;
if (success) {
db.contacts.update(contact.did, { seesMe: visibility });
const platformService = PlatformServiceFactory.getInstance();
await platformService.dbExec(
"UPDATE contacts SET seesMe = ? WHERE did = ?",
[visibility, contact.did],
);
if (USE_DEXIE_DB) {
db.contacts.update(contact.did, { seesMe: visibility });
}
}
return { success };
} else {


@@ -5,29 +5,47 @@ import { Buffer } from "buffer";
import * as R from "ramda";
import { useClipboard } from "@vueuse/core";
import { DEFAULT_PUSH_SERVER, NotificationIface } from "../constants/app";
import {
DEFAULT_PUSH_SERVER,
NotificationIface,
USE_DEXIE_DB,
} from "../constants/app";
import {
accountsDBPromise,
retrieveSettingsForActiveAccount,
updateAccountSettings,
updateDefaultSettings,
} from "../db/index";
import { Account } from "../db/tables/accounts";
import { Account, AccountEncrypted } from "../db/tables/accounts";
import { Contact } from "../db/tables/contacts";
import * as databaseUtil from "../db/databaseUtil";
import { DEFAULT_PASSKEY_EXPIRATION_MINUTES } from "../db/tables/settings";
import { deriveAddress, generateSeed, newIdentifier } from "../libs/crypto";
import * as serverUtil from "../libs/endorserServer";
import {
containsHiddenDid,
arrayBufferToBase64,
base64ToArrayBuffer,
deriveAddress,
generateSeed,
newIdentifier,
simpleDecrypt,
simpleEncrypt,
} from "../libs/crypto";
import * as serverUtil from "../libs/endorserServer";
import { containsHiddenDid } from "../libs/endorserServer";
import {
GenericCredWrapper,
GenericVerifiableCredential,
GiveSummaryRecord,
OfferVerifiableCredential,
} from "../libs/endorserServer";
import { KeyMeta } from "../libs/crypto/vc";
KeyMetaWithPrivate,
} from "../interfaces/common";
import { GiveSummaryRecord } from "../interfaces/records";
import { OfferClaim } from "../interfaces/claims";
import { createPeerDid } from "../libs/crypto/vc/didPeer";
import { registerCredential } from "../libs/crypto/vc/passkeyDidPeer";
import { logger } from "../utils/logger";
import { PlatformServiceFactory } from "@/services/PlatformServiceFactory";
import { sha256 } from "ethereum-cryptography/sha256";
import { IIdentifier } from "@veramo/core";
import { insertDidSpecificSettings, parseJsonField } from "../db/databaseUtil";
import { DEFAULT_ROOT_DERIVATION_PATH } from "./crypto";
export interface GiverReceiverInputInfo {
did?: string;
@@ -66,18 +84,24 @@ export const UNIT_LONG: Record<string, string> = {
};
/* eslint-enable prettier/prettier */
const UNIT_CODES: Record<string, Record<string, string>> = {
const UNIT_CODES: Record<
string,
{ name: string; faIcon: string; decimals: number }
> = {
BTC: {
name: "Bitcoin",
faIcon: "bitcoin-sign",
decimals: 4,
},
HUR: {
name: "hours",
faIcon: "clock",
decimals: 0,
},
USD: {
name: "US Dollars",
faIcon: "dollar",
decimals: 2,
},
};
@@ -85,6 +109,13 @@ export function iconForUnitCode(unitCode: string) {
return UNIT_CODES[unitCode]?.faIcon || "question";
}
export function formattedAmount(amount: number, unitCode: string) {
const unit = UNIT_CODES[unitCode];
const amountStr = amount.toFixed(unit?.decimals ?? 4);
const unitName = unit?.name || "?";
return amountStr + " " + unitName;
}
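Given the richer UNIT_CODES table above, the helper behaves roughly like this (return values written as comments):

formattedAmount(0.1234, "BTC"); // "0.1234 Bitcoin"   (4 decimals)
formattedAmount(2, "HUR");      // "2 hours"          (0 decimals)
formattedAmount(3.5, "USD");    // "3.50 US Dollars"  (2 decimals)
formattedAmount(7, "XYZ");      // "7.0000 ?"         (unknown code: falls back to 4 decimals and "?")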
// from https://stackoverflow.com/a/175787/845494
// ... though it appears even this isn't precisely right so keep doing "|| 0" or something in sensitive places
//
@@ -364,16 +395,19 @@ export function base64ToBlob(base64DataUrl: string, sliceSize = 512) {
* @param veriClaim is expected to have fields: claim and issuer
*/
export function offerGiverDid(
veriClaim: GenericCredWrapper<OfferVerifiableCredential>,
veriClaim: GenericCredWrapper<OfferClaim>,
): string | undefined {
let giver;
if (
veriClaim.claim.offeredBy?.identifier &&
!serverUtil.isHiddenDid(veriClaim.claim.offeredBy.identifier as string)
) {
giver = veriClaim.claim.offeredBy.identifier;
} else if (veriClaim.issuer && !serverUtil.isHiddenDid(veriClaim.issuer)) {
giver = veriClaim.issuer;
const innerClaim = veriClaim.claim as OfferClaim;
let giver: string | undefined = undefined;
giver = innerClaim.offeredBy?.identifier;
if (giver && !serverUtil.isHiddenDid(giver)) {
return giver;
}
giver = veriClaim.issuer;
if (giver && !serverUtil.isHiddenDid(giver)) {
return giver;
}
return giver;
}
@@ -384,10 +418,12 @@ export function offerGiverDid(
*/
export const canFulfillOffer = (
veriClaim: GenericCredWrapper<GenericVerifiableCredential>,
isRegistered: boolean,
) => {
return (
isRegistered &&
veriClaim.claimType === "Offer" &&
!!offerGiverDid(veriClaim as GenericCredWrapper<OfferVerifiableCredential>)
!!offerGiverDid(veriClaim as GenericCredWrapper<OfferClaim>)
);
};
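A short sketch of combining the two helpers when deciding whether to show a fulfill action (veriClaim is assumed to be a GenericCredWrapper<OfferClaim> loaded elsewhere, and isRegistered to come from settings):

if (canFulfillOffer(veriClaim, isRegistered)) {
  // Prefers a visible offeredBy identifier, falling back to the issuer.
  const giverDid = offerGiverDid(veriClaim as GenericCredWrapper<OfferClaim>);
  // ... show the fulfill button and pass giverDid along
}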
@@ -457,74 +493,236 @@ export function findAllVisibleToDids(
*
**/
export interface AccountKeyInfo extends Account, KeyMeta {}
export type AccountKeyInfo = Account & KeyMetaWithPrivate;
export const retrieveAccountCount = async (): Promise<number> => {
// one of the few times we use accountsDBPromise directly; try to avoid more usage
const accountsDB = await accountsDBPromise;
return await accountsDB.accounts.count();
let result = 0;
const platformService = PlatformServiceFactory.getInstance();
const dbResult = await platformService.dbQuery(
`SELECT COUNT(*) FROM accounts`,
);
if (dbResult?.values?.[0]?.[0]) {
result = dbResult.values[0][0] as number;
}
if (USE_DEXIE_DB) {
// one of the few times we use accountsDBPromise directly; try to avoid more usage
const accountsDB = await accountsDBPromise;
result = await accountsDB.accounts.count();
}
return result;
};
export const retrieveAccountDids = async (): Promise<string[]> => {
// one of the few times we use accountsDBPromise directly; try to avoid more usage
const accountsDB = await accountsDBPromise;
const allAccounts = await accountsDB.accounts.toArray();
const allDids = allAccounts.map((acc) => acc.did);
const platformService = PlatformServiceFactory.getInstance();
const dbAccounts = await platformService.dbQuery(`SELECT did FROM accounts`);
let allDids =
databaseUtil
.mapQueryResultToValues(dbAccounts)
?.map((row) => row[0] as string) || [];
if (USE_DEXIE_DB) {
// this is the old way
// one of the few times we use accountsDBPromise directly; try to avoid more usage
const accountsDB = await accountsDBPromise;
const allAccounts = await accountsDB.accounts.toArray();
allDids = allAccounts.map((acc) => acc.did);
}
return allDids;
};
// This is provided and recommended when the full key is not necessary so that
// future work could separate this info from the sensitive key material.
/**
* This is provided and recommended when the full key is not necessary so that
* future work could separate this info from the sensitive key material.
*
* If you need the private key data, use retrieveFullyDecryptedAccount instead.
*/
export const retrieveAccountMetadata = async (
activeDid: string,
): Promise<AccountKeyInfo | undefined> => {
// one of the few times we use accountsDBPromise directly; try to avoid more usage
const accountsDB = await accountsDBPromise;
const account = (await accountsDB.accounts
.where("did")
.equals(activeDid)
.first()) as Account;
): Promise<Account | undefined> => {
let result: Account | undefined = undefined;
const platformService = PlatformServiceFactory.getInstance();
const dbAccount = await platformService.dbQuery(
`SELECT * FROM accounts WHERE did = ?`,
[activeDid],
);
const account = databaseUtil.mapQueryResultToValues(dbAccount)[0] as Account;
if (account) {
// eslint-disable-next-line @typescript-eslint/no-unused-vars
const { identity, mnemonic, ...metadata } = account;
return metadata;
result = metadata;
} else {
return undefined;
result = undefined;
}
if (USE_DEXIE_DB) {
// one of the few times we use accountsDBPromise directly; try to avoid more usage
const accountsDB = await accountsDBPromise;
const account = (await accountsDB.accounts
.where("did")
.equals(activeDid)
.first()) as Account;
if (account) {
// eslint-disable-next-line @typescript-eslint/no-unused-vars
const { identity, mnemonic, ...metadata } = account;
result = metadata;
} else {
result = undefined;
}
}
return result;
};
export const retrieveAllAccountsMetadata = async (): Promise<Account[]> => {
// one of the few times we use accountsDBPromise directly; try to avoid more usage
const accountsDB = await accountsDBPromise;
const array = await accountsDB.accounts.toArray();
return array.map((account) => {
// eslint-disable-next-line @typescript-eslint/no-unused-vars
const { identity, mnemonic, ...metadata } = account;
return metadata;
});
};
/**
* This contains sensitive data. If possible, use retrieveAccountMetadata instead.
*
* @param activeDid
* @returns account info with private key data decrypted
*/
export const retrieveFullyDecryptedAccount = async (
activeDid: string,
): Promise<AccountKeyInfo | undefined> => {
// one of the few times we use accountsDBPromise directly; try to avoid more usage
const accountsDB = await accountsDBPromise;
const account = (await accountsDB.accounts
.where("did")
.equals(activeDid)
.first()) as Account;
return account;
): Promise<Account | undefined> => {
let result: Account | undefined = undefined;
const platformService = PlatformServiceFactory.getInstance();
const dbSecrets = await platformService.dbQuery(
`SELECT secretBase64 from secret`,
);
if (
!dbSecrets ||
dbSecrets.values.length === 0 ||
dbSecrets.values[0].length === 0
) {
throw new Error(
"No secret found. We recommend you clear your data and start over.",
);
}
const secretBase64 = dbSecrets.values[0][0] as string;
const secret = base64ToArrayBuffer(secretBase64);
const dbAccount = await platformService.dbQuery(
`SELECT * FROM accounts WHERE did = ?`,
[activeDid],
);
if (
!dbAccount ||
dbAccount.values.length === 0 ||
dbAccount.values[0].length === 0
) {
throw new Error("Account not found.");
}
const fullAccountData = databaseUtil.mapQueryResultToValues(
dbAccount,
)[0] as AccountEncrypted;
const identityEncr = base64ToArrayBuffer(fullAccountData.identityEncrBase64);
const mnemonicEncr = base64ToArrayBuffer(fullAccountData.mnemonicEncrBase64);
fullAccountData.identity = await simpleDecrypt(identityEncr, secret);
fullAccountData.mnemonic = await simpleDecrypt(mnemonicEncr, secret);
result = fullAccountData;
if (USE_DEXIE_DB) {
// one of the few times we use accountsDBPromise directly; try to avoid more usage
const accountsDB = await accountsDBPromise;
const account = (await accountsDB.accounts
.where("did")
.equals(activeDid)
.first()) as Account;
result = account;
}
return result;
};
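A sketch of the intended split between the two readers (the backup flow is an assumed example, not taken from this diff):

// Most code paths only need non-sensitive metadata:
const metadata = await retrieveAccountMetadata(activeDid);
logger.log("Public key for active DID:", metadata?.publicKeyHex);

// Only a flow that genuinely needs key material (e.g. a seed backup screen)
// should use the decrypted variant, which reads the secret table and decrypts
// identityEncrBase64 / mnemonicEncrBase64:
const fullAccount = await retrieveFullyDecryptedAccount(activeDid);
const seedPhrase = fullAccount?.mnemonic;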
// let's try and eliminate this
export const retrieveAllFullyDecryptedAccounts = async (): Promise<
Array<AccountKeyInfo>
export const retrieveAllAccountsMetadata = async (): Promise<
AccountEncrypted[]
> => {
const accountsDB = await accountsDBPromise;
const allAccounts = await accountsDB.accounts.toArray();
return allAccounts;
const platformService = PlatformServiceFactory.getInstance();
const dbAccounts = await platformService.dbQuery(`SELECT * FROM accounts`);
const accounts = databaseUtil.mapQueryResultToValues(dbAccounts) as Account[];
let result = accounts.map((account) => {
return account as AccountEncrypted;
});
if (USE_DEXIE_DB) {
// one of the few times we use accountsDBPromise directly; try to avoid more usage
const accountsDB = await accountsDBPromise;
const array = await accountsDB.accounts.toArray();
result = array.map((account) => {
// eslint-disable-next-line @typescript-eslint/no-unused-vars
const { identity, mnemonic, ...metadata } = account;
// This is not accurate because they can't be decrypted, but we're removing Dexie anyway.
const identityStr = JSON.stringify(identity);
const encryptedAccount = {
identityEncrBase64: sha256(
new TextEncoder().encode(identityStr),
).toString(),
mnemonicEncrBase64: sha256(
new TextEncoder().encode(account.mnemonic),
).toString(),
...metadata,
};
return encryptedAccount as AccountEncrypted;
});
}
return result;
};
/**
* Saves a new identity to both SQL and Dexie databases
*/
export async function saveNewIdentity(
identity: IIdentifier,
mnemonic: string,
derivationPath: string,
): Promise<void> {
try {
// add to the new sql db
const platformService = PlatformServiceFactory.getInstance();
const secrets = await platformService.dbQuery(
`SELECT secretBase64 FROM secret`,
);
if (!secrets?.values?.length || !secrets.values[0]?.length) {
throw new Error(
"No initial encryption supported. We recommend you clear your data and start over.",
);
}
const secretBase64 = secrets.values[0][0] as string;
const secret = base64ToArrayBuffer(secretBase64);
const identityStr = JSON.stringify(identity);
const encryptedIdentity = await simpleEncrypt(identityStr, secret);
const encryptedMnemonic = await simpleEncrypt(mnemonic, secret);
const encryptedIdentityBase64 = arrayBufferToBase64(encryptedIdentity);
const encryptedMnemonicBase64 = arrayBufferToBase64(encryptedMnemonic);
const sql = `INSERT INTO accounts (dateCreated, derivationPath, did, identityEncrBase64, mnemonicEncrBase64, publicKeyHex)
VALUES (?, ?, ?, ?, ?, ?)`;
const params = [
new Date().toISOString(),
derivationPath,
identity.did,
encryptedIdentityBase64,
encryptedMnemonicBase64,
identity.keys[0].publicKeyHex,
];
await platformService.dbExec(sql, params);
await databaseUtil.updateDefaultSettings({ activeDid: identity.did });
await databaseUtil.insertDidSpecificSettings(identity.did);
if (USE_DEXIE_DB) {
// one of the few times we use accountsDBPromise directly; try to avoid more usage
const accountsDB = await accountsDBPromise;
await accountsDB.accounts.add({
dateCreated: new Date().toISOString(),
derivationPath: derivationPath,
did: identity.did,
identity: identityStr,
mnemonic: mnemonic,
publicKeyHex: identity.keys[0].publicKeyHex,
});
await updateDefaultSettings({ activeDid: identity.did });
await insertDidSpecificSettings(identity.did);
}
} catch (error) {
logger.error("Failed to update default settings:", error);
throw new Error(
"Failed to set default settings. Please try again or restart the app.",
);
}
}
/**
* Generates a new identity, saves it to the database, and sets it as the active identity.
* @return {Promise<string>} with the DID of the new identity
@@ -536,28 +734,14 @@ export const generateSaveAndActivateIdentity = async (): Promise<string> => {
deriveAddress(mnemonic);
const newId = newIdentifier(address, publicHex, privateHex, derivationPath);
const identity = JSON.stringify(newId);
// one of the few times we use accountsDBPromise directly; try to avoid more usage
try {
const accountsDB = await accountsDBPromise;
await accountsDB.accounts.add({
dateCreated: new Date().toISOString(),
derivationPath: derivationPath,
did: newId.did,
identity: identity,
mnemonic: mnemonic,
publicKeyHex: newId.keys[0].publicKeyHex,
});
await updateDefaultSettings({ activeDid: newId.did });
} catch (error) {
console.error("Failed to update default settings:", error);
throw new Error(
"Failed to set default settings. Please try again or restart the app.",
);
await saveNewIdentity(newId, mnemonic, derivationPath);
await databaseUtil.updateDidSpecificSettings(newId.did, {
isRegistered: false,
});
if (USE_DEXIE_DB) {
await updateAccountSettings(newId.did, { isRegistered: false });
}
await updateAccountSettings(newId.did, { isRegistered: false });
return newId.did;
};
@@ -575,9 +759,19 @@ export const registerAndSavePasskey = async (
passkeyCredIdHex,
publicKeyHex: Buffer.from(publicKeyBytes).toString("hex"),
};
// one of the few times we use accountsDBPromise directly; try to avoid more usage
const accountsDB = await accountsDBPromise;
await accountsDB.accounts.add(account);
const insertStatement = databaseUtil.generateInsertStatement(
account,
"accounts",
);
await PlatformServiceFactory.getInstance().dbExec(
insertStatement.sql,
insertStatement.params,
);
if (USE_DEXIE_DB) {
// one of the few times we use accountsDBPromise directly; try to avoid more usage
const accountsDB = await accountsDBPromise;
await accountsDB.accounts.add(account);
}
return account;
};
@@ -585,13 +779,22 @@ export const registerSaveAndActivatePasskey = async (
keyName: string,
): Promise<Account> => {
const account = await registerAndSavePasskey(keyName);
await updateDefaultSettings({ activeDid: account.did });
await updateAccountSettings(account.did, { isRegistered: false });
await databaseUtil.updateDefaultSettings({ activeDid: account.did });
await databaseUtil.updateDidSpecificSettings(account.did, {
isRegistered: false,
});
if (USE_DEXIE_DB) {
await updateDefaultSettings({ activeDid: account.did });
await updateAccountSettings(account.did, { isRegistered: false });
}
return account;
};
export const getPasskeyExpirationSeconds = async (): Promise<number> => {
const settings = await retrieveSettingsForActiveAccount();
let settings = await databaseUtil.retrieveSettingsForActiveAccount();
if (USE_DEXIE_DB) {
settings = await retrieveSettingsForActiveAccount();
}
return (
(settings?.passkeyExpirationMinutes ?? DEFAULT_PASSKEY_EXPIRATION_MINUTES) *
60
@@ -607,7 +810,10 @@ export const sendTestThroughPushServer = async (
subscriptionJSON: PushSubscriptionJSON,
skipFilter: boolean,
): Promise<AxiosResponse> => {
const settings = await retrieveSettingsForActiveAccount();
let settings = await databaseUtil.retrieveSettingsForActiveAccount();
if (USE_DEXIE_DB) {
settings = await retrieveSettingsForActiveAccount();
}
let pushUrl: string = DEFAULT_PUSH_SERVER as string;
if (settings?.webPushServer) {
pushUrl = settings.webPushServer;
@@ -635,3 +841,196 @@ export const sendTestThroughPushServer = async (
logger.log("Got response from web push server:", response);
return response;
};
/**
* Converts a Contact object to a CSV line string following the established format.
* The format matches CONTACT_CSV_HEADER: "name,did,pubKeyBase64,seesMe,registered,contactMethods"
* where contactMethods is stored as a stringified JSON array.
*
* @param contact - The Contact object to convert
* @returns A CSV-formatted string representing the contact
* @throws {Error} If the contact object is missing required fields
*/
export const contactToCsvLine = (contact: Contact): string => {
if (!contact.did) {
throw new Error("Contact must have a did field");
}
// Escape fields that might contain commas or quotes
const escapeField = (field: string | boolean | undefined): string => {
if (field === undefined) return "";
const str = String(field);
if (str.includes(",") || str.includes('"')) {
return `"${str.replace(/"/g, '""')}"`;
}
return str;
};
// Handle contactMethods array by stringifying it
const contactMethodsStr = contact.contactMethods
? escapeField(JSON.stringify(parseJsonField(contact.contactMethods, [])))
: "";
const fields = [
escapeField(contact.name),
escapeField(contact.did),
escapeField(contact.publicKeyBase64),
escapeField(contact.seesMe),
escapeField(contact.registered),
contactMethodsStr,
];
return fields.join(",");
};
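Illustrative output, assuming parseJsonField returns the parsed array for a stringified contactMethods value (the contact data is made up):

const line = contactToCsvLine({
  did: "did:ethr:0xAbc",
  name: "Lovelace, Ada", // contains a comma, so it gets quoted
  publicKeyBase64: "Base64Key=",
  seesMe: true,
  registered: false,
  contactMethods: '[{"type":"EMAIL","value":"ada@example.com"}]',
} as Contact);
// => "Lovelace, Ada",did:ethr:0xAbc,Base64Key=,true,false,"[{""type"":""EMAIL"",""value"":""ada@example.com""}]"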
/**
* Parses a CSV line into a Contact object. See contactToCsvLine for the format.
* @param lineRaw - The CSV line to parse
* @returns A Contact object
*/
export const csvLineToContact = (lineRaw: string): Contact => {
// Note that Endorser Mobile puts name first, then did, etc.
let line = lineRaw.trim();
let did, publicKeyInput, seesMe, registered;
let name;
let commaPos1 = -1;
if (line.startsWith('"')) {
let doubleDoubleQuotePos = line.lastIndexOf('""') + 2;
if (doubleDoubleQuotePos === -1) {
doubleDoubleQuotePos = 1;
}
const quote2Pos = line.indexOf('"', doubleDoubleQuotePos);
if (quote2Pos > -1) {
commaPos1 = line.indexOf(",", quote2Pos);
name = line.substring(1, quote2Pos).trim();
name = name.replace(/""/g, '"');
} else {
// something is weird with one " to start, so ignore it and start after "
line = line.substring(1);
commaPos1 = line.indexOf(",");
name = line.substring(0, commaPos1).trim();
}
} else {
commaPos1 = line.indexOf(",");
name = line.substring(0, commaPos1).trim();
}
if (commaPos1 > -1) {
did = line.substring(commaPos1 + 1).trim();
const commaPos2 = line.indexOf(",", commaPos1 + 1);
if (commaPos2 > -1) {
did = line.substring(commaPos1 + 1, commaPos2).trim();
publicKeyInput = line.substring(commaPos2 + 1).trim();
const commaPos3 = line.indexOf(",", commaPos2 + 1);
if (commaPos3 > -1) {
publicKeyInput = line.substring(commaPos2 + 1, commaPos3).trim();
seesMe = line.substring(commaPos3 + 1).trim() == "true";
const commaPos4 = line.indexOf(",", commaPos3 + 1);
if (commaPos4 > -1) {
seesMe = line.substring(commaPos3 + 1, commaPos4).trim() == "true";
registered = line.substring(commaPos4 + 1).trim() == "true";
}
}
}
}
// help with potential mistakes while this sharing requires copy-and-paste
let publicKeyBase64 = publicKeyInput;
if (publicKeyBase64 && /^[0-9A-Fa-f]{66}$/i.test(publicKeyBase64)) {
// it must be all hex (compressed public key), so convert
publicKeyBase64 = Buffer.from(publicKeyBase64, "hex").toString("base64");
}
const newContact: Contact = {
did: did || "",
name,
publicKeyBase64,
seesMe,
registered,
};
return newContact;
};
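A round-trip sketch of the parser (note that only the first five fields are recovered here; contactMethods are not parsed by this function):

const parsed = csvLineToContact("Ada,did:ethr:0xAbc,Base64Key=,true,false");
// parsed.name === "Ada"
// parsed.did === "did:ethr:0xAbc"
// parsed.publicKeyBase64 === "Base64Key="
// parsed.seesMe === true, parsed.registered === false
// A 66-character hex public key would instead be converted to base64 first.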
/**
* Interface for the JSON export format of database tables
*/
export interface TableExportData {
tableName: string;
rows: Array<Record<string, unknown>>;
}
/**
* Interface for the complete database export format
*/
export interface DatabaseExport {
data: {
data: Array<TableExportData>;
};
}
/**
* Converts an array of contacts to the standardized database export JSON format.
* This format is used for data migration and backup purposes.
*
* @param contacts - Array of Contact objects to convert
* @returns DatabaseExport object in the standardized format
*/
export const contactsToExportJson = (contacts: Contact[]): DatabaseExport => {
// Convert each contact to a plain object and ensure all fields are included
const rows = contacts.map((contact) => ({
did: contact.did,
name: contact.name || null,
contactMethods: contact.contactMethods
? JSON.stringify(parseJsonField(contact.contactMethods, []))
: null,
nextPubKeyHashB64: contact.nextPubKeyHashB64 || null,
notes: contact.notes || null,
profileImageUrl: contact.profileImageUrl || null,
publicKeyBase64: contact.publicKeyBase64 || null,
seesMe: contact.seesMe || false,
registered: contact.registered || false,
}));
return {
data: {
data: [
{
tableName: "contacts",
rows,
},
],
},
};
};
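The resulting shape for a single contact, abbreviated (missing optional fields come back as null or false):

const exportJson = contactsToExportJson([
  { did: "did:ethr:0xAbc", name: "Ada" } as Contact,
]);
// {
//   data: {
//     data: [
//       { tableName: "contacts", rows: [ { did: "did:ethr:0xAbc", name: "Ada", seesMe: false, registered: false, ... } ] }
//     ]
//   }
// }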
/**
* Imports an account from a mnemonic phrase
* @param mnemonic - The seed phrase to import from
* @param derivationPath - The derivation path to use (defaults to DEFAULT_ROOT_DERIVATION_PATH)
* @param shouldErase - Whether to erase existing accounts before importing
* @returns Promise that resolves when import is complete
* @throws Error if mnemonic is invalid or import fails
*/
export async function importFromMnemonic(
mnemonic: string,
derivationPath: string = DEFAULT_ROOT_DERIVATION_PATH,
shouldErase: boolean = false,
): Promise<void> {
const mne: string = mnemonic.trim().toLowerCase();
// Derive address and keys from mnemonic
const [address, privateHex, publicHex] = deriveAddress(mne, derivationPath);
// Create new identifier
const newId = newIdentifier(address, publicHex, privateHex, derivationPath);
// Handle erasures
if (shouldErase) {
const platformService = PlatformServiceFactory.getInstance();
await platformService.dbExec("DELETE FROM accounts");
if (USE_DEXIE_DB) {
const accountsDB = await accountsDBPromise;
await accountsDB.accounts.clear();
}
}
// Save the new identity
await saveNewIdentity(newId, mne, derivationPath);
}
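A minimal restore sketch (the phrase shown is a placeholder, not a valid mnemonic):

await importFromMnemonic(
  "apple banana cherry ... (twelve placeholder words)",
  DEFAULT_ROOT_DERIVATION_PATH,
  true, // erase existing accounts before importing
);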


@@ -34,7 +34,7 @@ import router from "./router";
import { handleApiError } from "./services/api";
import { AxiosError } from "axios";
import { DeepLinkHandler } from "./services/deepLinks";
import { logConsoleAndDb } from "./db";
import { logConsoleAndDb } from "./db/databaseUtil";
import { logger } from "./utils/logger";
logger.log("[Capacitor] Starting initialization");


@@ -10,6 +10,12 @@ import { FontAwesomeIcon } from "./libs/fontawesome";
import Camera from "simple-vue-camera";
import { logger } from "./utils/logger";
const platform = process.env.VITE_PLATFORM;
const pwa_enabled = process.env.VITE_PWA_ENABLED === "true";
logger.log("Platform", JSON.stringify({ platform }));
logger.log("PWA enabled", JSON.stringify({ pwa_enabled }));
// Global Error Handler
function setupGlobalErrorHandler(app: VueApp) {
logger.log("[App Init] Setting up global error handler");


@@ -1,4 +1,16 @@
import { initializeApp } from "./main.common";
import { logger } from "./utils/logger";
const platform = process.env.VITE_PLATFORM;
const pwa_enabled = process.env.VITE_PWA_ENABLED === "true";
logger.info("[Electron] Initializing app");
logger.info("[Electron] Platform:", { platform });
logger.info("[Electron] PWA enabled:", { pwa_enabled });
if (pwa_enabled) {
logger.warn("[Electron] PWA is enabled, but not supported in electron");
}
const app = initializeApp();
app.mount("#app");


@@ -1,215 +0,0 @@
import { createPinia } from "pinia";
import { App as VueApp, ComponentPublicInstance, createApp } from "vue";
import App from "./App.vue";
import "./registerServiceWorker";
import router from "./router";
import axios from "axios";
import VueAxios from "vue-axios";
import Notifications from "notiwind";
import "./assets/styles/tailwind.css";
import { library } from "@fortawesome/fontawesome-svg-core";
import {
faArrowDown,
faArrowLeft,
faArrowRight,
faArrowRotateBackward,
faArrowUpRightFromSquare,
faArrowUp,
faBan,
faBitcoinSign,
faBurst,
faCalendar,
faCamera,
faCameraRotate,
faCaretDown,
faChair,
faCheck,
faChevronDown,
faChevronLeft,
faChevronRight,
faChevronUp,
faCircle,
faCircleCheck,
faCircleInfo,
faCircleQuestion,
faCircleUser,
faClock,
faCoins,
faComment,
faCopy,
faDollar,
faEllipsis,
faEllipsisVertical,
faEnvelopeOpenText,
faEraser,
faEye,
faEyeSlash,
faFileContract,
faFileLines,
faFilter,
faFloppyDisk,
faFolderOpen,
faForward,
faGift,
faGlobe,
faHammer,
faHand,
faHandHoldingDollar,
faHandHoldingHeart,
faHouseChimney,
faImage,
faImagePortrait,
faLeftRight,
faLightbulb,
faLink,
faLocationDot,
faLongArrowAltLeft,
faLongArrowAltRight,
faMagnifyingGlass,
faMessage,
faMinus,
faPen,
faPersonCircleCheck,
faPersonCircleQuestion,
faPlus,
faQuestion,
faQrcode,
faRightFromBracket,
faRotate,
faShareNodes,
faSpinner,
faSquare,
faSquareCaretDown,
faSquareCaretUp,
faSquarePlus,
faTrashCan,
faTriangleExclamation,
faUser,
faUsers,
faXmark,
} from "@fortawesome/free-solid-svg-icons";
library.add(
faArrowDown,
faArrowLeft,
faArrowRight,
faArrowRotateBackward,
faArrowUpRightFromSquare,
faArrowUp,
faBan,
faBitcoinSign,
faBurst,
faCalendar,
faCamera,
faCameraRotate,
faCaretDown,
faChair,
faCheck,
faChevronDown,
faChevronLeft,
faChevronRight,
faChevronUp,
faCircle,
faCircleCheck,
faCircleInfo,
faCircleQuestion,
faCircleUser,
faClock,
faCoins,
faComment,
faCopy,
faDollar,
faEllipsis,
faEllipsisVertical,
faEnvelopeOpenText,
faEraser,
faEye,
faEyeSlash,
faFileContract,
faFileLines,
faFilter,
faFloppyDisk,
faFolderOpen,
faForward,
faGift,
faGlobe,
faHammer,
faHand,
faHandHoldingDollar,
faHandHoldingHeart,
faHouseChimney,
faImage,
faImagePortrait,
faLeftRight,
faLightbulb,
faLink,
faLocationDot,
faLongArrowAltLeft,
faLongArrowAltRight,
faMagnifyingGlass,
faMessage,
faMinus,
faPen,
faPersonCircleCheck,
faPersonCircleQuestion,
faPlus,
faQrcode,
faQuestion,
faRotate,
faRightFromBracket,
faShareNodes,
faSpinner,
faSquare,
faSquareCaretDown,
faSquareCaretUp,
faSquarePlus,
faTrashCan,
faTriangleExclamation,
faUser,
faUsers,
faXmark,
);
import { FontAwesomeIcon } from "@fortawesome/vue-fontawesome";
import Camera from "simple-vue-camera";
import { logger } from "./utils/logger";
// Can trigger this with a 'throw' inside some top-level function, eg. on the HomeView
function setupGlobalErrorHandler(app: VueApp) {
// @ts-expect-error 'cause we cannot see why config is not defined
app.config.errorHandler = (
err: Error,
instance: ComponentPublicInstance | null,
info: string,
) => {
logger.error(
"Ouch! Global Error Handler.",
"Error:",
err,
"- Error toString:",
err.toString(),
"- Info:",
info,
"- Instance:",
instance,
);
// Want to show a nice notiwind notification but can't figure out how.
alert(
(err.message || "Something bad happened") +
" - Try reloading or restarting the app.",
);
};
}
const app = createApp(App)
.component("fa", FontAwesomeIcon)
.component("camera", Camera)
.use(createPinia())
.use(VueAxios, axios)
.use(router)
.use(Notifications);
setupGlobalErrorHandler(app);
app.mount("#app");


@@ -1,5 +1,37 @@
import { initBackend } from "absurd-sql/dist/indexeddb-main-thread";
import { initializeApp } from "./main.common";
import "./registerServiceWorker"; // Web PWA support
import { logger } from "./utils/logger";
const platform = process.env.VITE_PLATFORM;
const pwa_enabled = process.env.VITE_PWA_ENABLED === "true";
logger.info("[Web] PWA enabled", { pwa_enabled });
logger.info("[Web] Platform", { platform });
// Only import service worker for web builds
if (platform !== "electron" && pwa_enabled) {
import("./registerServiceWorker"); // Web PWA support
}
const app = initializeApp();
function sqlInit() {
// see https://github.com/jlongster/absurd-sql
const worker = new Worker(
new URL("./registerSQLWorker.js", import.meta.url),
{
type: "module",
},
);
// This is only required because Safari doesn't support nested
// workers. This installs a handler that will proxy creating web
// workers through the main thread
initBackend(worker);
}
if (platform === "web" || platform === "development") {
sqlInit();
} else {
logger.info("[Web] SQL not initialized for platform", { platform });
}
app.mount("#app");

src/registerSQLWorker.js

@@ -0,0 +1,6 @@
import databaseService from "./services/AbsurdSqlDatabaseService";
async function run() {
await databaseService.initialize();
}
run();


@@ -2,8 +2,18 @@
import { register } from "register-service-worker";
// Only register service worker if explicitly enabled and in production
// Check if we're in an Electron environment
const isElectron =
process.env.VITE_PLATFORM === "electron" ||
process.env.VITE_DISABLE_PWA === "true" ||
window.navigator.userAgent.toLowerCase().includes("electron");
// Only register service worker if:
// 1. Not in Electron
// 2. PWA is explicitly enabled
// 3. In production mode
if (
!isElectron &&
process.env.VITE_PWA_ENABLED === "true" &&
process.env.NODE_ENV === "production"
) {
@@ -34,6 +44,12 @@ if (
});
} else {
console.log(
"Service worker registration skipped - not enabled or not in production",
`Service worker registration skipped - ${
isElectron
? "running in Electron"
: process.env.VITE_PWA_ENABLED !== "true"
? "PWA not enabled"
: "not in production mode"
}`,
);
}


@@ -2,35 +2,11 @@ import {
createRouter,
createWebHistory,
createMemoryHistory,
NavigationGuardNext,
RouteLocationNormalized,
RouteRecordRaw,
} from "vue-router";
import { accountsDBPromise } from "../db/index";
import { logger } from "../utils/logger";
/**
*
* @param to :RouteLocationNormalized
* @param from :RouteLocationNormalized
* @param next :NavigationGuardNext
*/
const enterOrStart = async (
to: RouteLocationNormalized,
from: RouteLocationNormalized,
next: NavigationGuardNext,
) => {
// one of the few times we use accountsDBPromise directly; try to avoid more usage
const accountsDB = await accountsDBPromise;
const num_accounts = await accountsDB.accounts.count();
if (num_accounts > 0) {
next();
} else {
next({ name: "start" });
}
};
const routes: Array<RouteRecordRaw> = [
{
path: "/account",
@@ -167,6 +143,11 @@ const routes: Array<RouteRecordRaw> = [
name: "logs",
component: () => import("../views/LogView.vue"),
},
{
path: "/database-migration",
name: "database-migration",
component: () => import("../views/DatabaseMigration.vue"),
},
{
path: "/new-activity",
name: "new-activity",
@@ -216,7 +197,6 @@ const routes: Array<RouteRecordRaw> = [
path: "/projects",
name: "projects",
component: () => import("../views/ProjectsView.vue"),
beforeEnter: enterOrStart,
},
{
path: "/quick-action-bvc",


@@ -0,0 +1,29 @@
import { DatabaseService } from "../interfaces/database";
declare module "@jlongster/sql.js" {
interface SQL {
Database: unknown;
FS: unknown;
register_for_idb: (fs: unknown) => void;
}
function initSqlJs(config: {
locateFile: (file: string) => string;
}): Promise<SQL>;
export default initSqlJs;
}
declare module "absurd-sql" {
export class SQLiteFS {
constructor(fs: unknown, backend: unknown);
}
}
declare module "absurd-sql/dist/indexeddb-backend" {
export default class IndexedDBBackend {
constructor();
}
}
declare const databaseService: DatabaseService;
export default databaseService;


@@ -0,0 +1,231 @@
import initSqlJs from "@jlongster/sql.js";
import { SQLiteFS } from "absurd-sql";
import IndexedDBBackend from "absurd-sql/dist/indexeddb-backend";
import { runMigrations } from "../db-sql/migration";
import type { DatabaseService, QueryExecResult } from "../interfaces/database";
import { logger } from "@/utils/logger";
interface QueuedOperation {
type: "run" | "query";
sql: string;
params: unknown[];
resolve: (value: unknown) => void;
reject: (reason: unknown) => void;
}
interface AbsurdSqlDatabase {
exec: (sql: string, params?: unknown[]) => Promise<QueryExecResult[]>;
run: (
sql: string,
params?: unknown[],
) => Promise<{ changes: number; lastId?: number }>;
}
class AbsurdSqlDatabaseService implements DatabaseService {
private static instance: AbsurdSqlDatabaseService | null = null;
private db: AbsurdSqlDatabase | null;
private initialized: boolean;
private initializationPromise: Promise<void> | null = null;
private operationQueue: Array<QueuedOperation> = [];
private isProcessingQueue: boolean = false;
private constructor() {
this.db = null;
this.initialized = false;
}
static getInstance(): AbsurdSqlDatabaseService {
if (!AbsurdSqlDatabaseService.instance) {
AbsurdSqlDatabaseService.instance = new AbsurdSqlDatabaseService();
}
return AbsurdSqlDatabaseService.instance;
}
async initialize(): Promise<void> {
// If already initialized, return immediately
if (this.initialized) {
return;
}
// If initialization is in progress, wait for it
if (this.initializationPromise) {
return this.initializationPromise;
}
// Start initialization
this.initializationPromise = this._initialize();
try {
await this.initializationPromise;
} catch (error) {
logger.error(`AbsurdSqlDatabaseService initialize method failed:`, error);
this.initializationPromise = null; // Reset on failure
throw error;
}
}
private async _initialize(): Promise<void> {
if (this.initialized) {
return;
}
const SQL = await initSqlJs({
locateFile: (file: string) => {
return new URL(
`/node_modules/@jlongster/sql.js/dist/${file}`,
import.meta.url,
).href;
},
});
const sqlFS = new SQLiteFS(SQL.FS, new IndexedDBBackend());
SQL.register_for_idb(sqlFS);
SQL.FS.mkdir("/sql");
SQL.FS.mount(sqlFS, {}, "/sql");
const path = "/sql/timesafari.absurd-sql";
if (typeof SharedArrayBuffer === "undefined") {
const stream = SQL.FS.open(path, "a+");
await stream.node.contents.readIfFallback();
SQL.FS.close(stream);
}
this.db = new SQL.Database(path, { filename: true });
if (!this.db) {
throw new Error(
"The database initialization failed. We recommend you restart or reinstall.",
);
}
// An error is thrown without this pragma: "File has invalid page size. (the first block of a new file must be written first)"
await this.db.exec(`PRAGMA journal_mode=MEMORY;`);
const sqlExec = this.db.run.bind(this.db);
const sqlQuery = this.db.exec.bind(this.db);
// Extract the migration names for the absurd-sql format
const extractMigrationNames: (result: QueryExecResult[]) => Set<string> = (
result,
) => {
// Even with the "select name" query, the QueryExecResult may be [] (which doesn't make sense to me).
const names = result?.[0]?.values.map((row) => row[0] as string) || [];
return new Set(names);
};
// Run migrations
await runMigrations(sqlExec, sqlQuery, extractMigrationNames);
this.initialized = true;
// Start processing the queue after initialization
this.processQueue();
}
private async processQueue(): Promise<void> {
if (this.isProcessingQueue || !this.initialized || !this.db) {
return;
}
this.isProcessingQueue = true;
while (this.operationQueue.length > 0) {
const operation = this.operationQueue.shift();
if (!operation) continue;
try {
let result: unknown;
switch (operation.type) {
case "run":
result = await this.db.run(operation.sql, operation.params);
break;
case "query":
result = await this.db.exec(operation.sql, operation.params);
break;
}
operation.resolve(result);
} catch (error) {
logger.error(
"Error while processing SQL queue:",
error,
" ... for sql:",
operation.sql,
" ... with params:",
operation.params,
);
operation.reject(error);
}
}
this.isProcessingQueue = false;
}
private async queueOperation<R>(
type: QueuedOperation["type"],
sql: string,
params: unknown[] = [],
): Promise<R> {
return new Promise<R>((resolve, reject) => {
const operation: QueuedOperation = {
type,
sql,
params,
resolve: (value: unknown) => resolve(value as R),
reject,
};
this.operationQueue.push(operation);
// If we're already initialized, start processing the queue
if (this.initialized && this.db) {
this.processQueue();
}
});
}
private async waitForInitialization(): Promise<void> {
// If we have an initialization promise, wait for it
if (this.initializationPromise) {
await this.initializationPromise;
return;
}
// If not initialized and no promise, start initialization
if (!this.initialized) {
await this.initialize();
return;
}
// If initialized but no db, something went wrong
if (!this.db) {
logger.error(
`Database not properly initialized after await waitForInitialization() - initialized flag is true but db is null`,
);
throw new Error(
`The database could not be initialized. We recommend you restart or reinstall.`,
);
}
}
// Used for inserts, updates, and deletes
async run(
sql: string,
params: unknown[] = [],
): Promise<{ changes: number; lastId?: number }> {
await this.waitForInitialization();
return this.queueOperation<{ changes: number; lastId?: number }>(
"run",
sql,
params,
);
}
// Note that the resulting array may be empty if there are no results from the query
async query(sql: string, params: unknown[] = []): Promise<QueryExecResult[]> {
await this.waitForInitialization();
return this.queueOperation<QueryExecResult[]>("query", sql, params);
}
}
// Create a singleton instance
const databaseService = AbsurdSqlDatabaseService.getInstance();
export default databaseService;
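A hedged usage sketch of the queued API (the SQL statements and table names are illustrative; callers simply await run/query and the service serializes work through its operation queue):

import databaseService from "./services/AbsurdSqlDatabaseService";

const { changes } = await databaseService.run(
  "INSERT INTO logs (date, message) VALUES (?, ?)",
  [new Date().toISOString(), "hello from the worker"],
);

// May be [] when the query matches nothing, per the note above.
const results = await databaseService.query("SELECT name FROM migrations");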


@@ -1,3 +1,5 @@
import { QueryExecResult } from "@/interfaces/database";
/**
* Represents the result of an image capture or selection operation.
* Contains both the image data as a Blob and the associated filename.
@@ -106,4 +108,26 @@ export interface PlatformService {
* @returns Promise that resolves when the deep link has been handled
*/
handleDeepLink(url: string): Promise<void>;
/**
* Executes a SQL query on the database.
* @param sql - The SQL query to execute
* @param params - The parameters to pass to the query
* @returns Promise resolving to the query result
*/
dbQuery(
sql: string,
params?: unknown[],
): Promise<QueryExecResult | undefined>;
/**
* Executes a create/update/delete on the database.
* @param sql - The SQL statement to execute
* @param params - The parameters to pass to the statement
* @returns Promise resolving to the result of the statement
*/
dbExec(
sql: string,
params?: unknown[],
): Promise<{ changes: number; lastId?: number }>;
}
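The call pattern used throughout this changeset; the contacts statements below mirror setVisibilityUtil and retrieveAccountCount above:

const platformService = PlatformServiceFactory.getInstance();
await platformService.dbExec(
  "UPDATE contacts SET seesMe = ? WHERE did = ?",
  [true, contact.did],
);
const countResult = await platformService.dbQuery("SELECT COUNT(*) FROM contacts");
const contactCount = (countResult?.values?.[0]?.[0] as number) ?? 0;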


@@ -4,7 +4,13 @@ import {
StartScanOptions,
LensFacing,
} from "@capacitor-mlkit/barcode-scanning";
import { QRScannerService, ScanListener, QRScannerOptions } from "./types";
import {
QRScannerService,
ScanListener,
QRScannerOptions,
CameraStateListener,
CameraState,
} from "./types";
import { logger } from "@/utils/logger";
export class CapacitorQRScanner implements QRScannerService {
@@ -12,6 +18,9 @@ export class CapacitorQRScanner implements QRScannerService {
private isScanning = false;
private listenerHandles: Array<() => Promise<void>> = [];
private cleanupPromise: Promise<void> | null = null;
private cameraStateListeners: Set<CameraStateListener> = new Set();
private currentState: CameraState = "off";
private currentStateMessage?: string;
async checkPermissions(): Promise<boolean> {
try {
@@ -79,8 +88,11 @@ export class CapacitorQRScanner implements QRScannerService {
}
try {
this.updateCameraState("initializing", "Starting camera...");
// Ensure we have permissions before starting
if (!(await this.checkPermissions())) {
this.updateCameraState("permission_denied", "Camera permission denied");
logger.debug("Requesting camera permissions");
const granted = await this.requestPermissions();
if (!granted) {
@@ -90,11 +102,16 @@ export class CapacitorQRScanner implements QRScannerService {
// Check if scanning is supported
if (!(await this.isSupported())) {
this.updateCameraState(
"error",
"QR scanning not supported on this device",
);
throw new Error("QR scanning not supported on this device");
}
logger.info("Starting MLKit scanner");
this.isScanning = true;
this.updateCameraState("active", "Camera is active");
const scanOptions: StartScanOptions = {
formats: [BarcodeFormat.QrCode],
@@ -126,6 +143,7 @@ export class CapacitorQRScanner implements QRScannerService {
stack: wrappedError.stack,
});
this.isScanning = false;
this.updateCameraState("error", wrappedError.message);
await this.cleanup();
this.scanListener?.onError?.(wrappedError);
throw wrappedError;
@@ -140,6 +158,7 @@ export class CapacitorQRScanner implements QRScannerService {
try {
logger.debug("Stopping QR scanner");
this.updateCameraState("off", "Camera stopped");
await BarcodeScanner.stopScan();
logger.info("QR scanner stopped successfully");
} catch (error) {
@@ -149,6 +168,7 @@ export class CapacitorQRScanner implements QRScannerService {
error: wrappedError.message,
stack: wrappedError.stack,
});
this.updateCameraState("error", wrappedError.message);
this.scanListener?.onError?.(wrappedError);
throw wrappedError;
} finally {
@@ -207,4 +227,23 @@ export class CapacitorQRScanner implements QRScannerService {
// No-op for native scanner
callback(null);
}
addCameraStateListener(listener: CameraStateListener): void {
this.cameraStateListeners.add(listener);
// Immediately notify the new listener of current state
listener.onStateChange(this.currentState, this.currentStateMessage);
}
removeCameraStateListener(listener: CameraStateListener): void {
this.cameraStateListeners.delete(listener);
}
private updateCameraState(state: CameraState, message?: string): void {
this.currentState = state;
this.currentStateMessage = message;
// Notify all listeners of state change
for (const listener of this.cameraStateListeners) {
listener.onStateChange(state, message);
}
}
}


@@ -30,14 +30,16 @@ export class WebInlineQRScanner implements QRScannerService {
private cameraStateListeners: Set<CameraStateListener> = new Set();
private currentState: CameraState = "off";
private currentStateMessage?: string;
private options: QRScannerOptions;
constructor(private options?: QRScannerOptions) {
constructor(options?: QRScannerOptions) {
// Generate a short random ID for this scanner instance
this.id = Math.random().toString(36).substring(2, 8).toUpperCase();
this.options = options ?? {};
logger.error(
`[WebInlineQRScanner:${this.id}] Initializing scanner with options:`,
{
...options,
...this.options,
buildId: BUILD_ID,
targetFps: this.TARGET_FPS,
},
@@ -494,26 +496,34 @@ export class WebInlineQRScanner implements QRScannerService {
}
}
async startScan(): Promise<void> {
async startScan(options?: QRScannerOptions): Promise<void> {
if (this.isScanning) {
logger.error(`[WebInlineQRScanner:${this.id}] Scanner already running`);
return;
}
// Update options if provided
if (options) {
this.options = { ...this.options, ...options };
}
try {
this.isScanning = true;
this.scanAttempts = 0;
this.lastScanTime = Date.now();
this.updateCameraState("initializing", "Starting camera...");
logger.error(`[WebInlineQRScanner:${this.id}] Starting scan`);
logger.error(
`[WebInlineQRScanner:${this.id}] Starting scan with options:`,
this.options,
);
// Get camera stream
// Get camera stream with options
logger.error(
`[WebInlineQRScanner:${this.id}] Requesting camera stream...`,
);
this.stream = await navigator.mediaDevices.getUserMedia({
video: {
facingMode: "environment",
facingMode: this.options.camera === "front" ? "user" : "environment",
width: { ideal: 1280 },
height: { ideal: 720 },
},
@@ -527,11 +537,18 @@ export class WebInlineQRScanner implements QRScannerService {
label: t.label,
readyState: t.readyState,
})),
options: this.options,
});
// Set up video element
if (this.video) {
this.video.srcObject = this.stream;
// Only show preview if showPreview is true
if (this.options.showPreview) {
this.video.style.display = "block";
} else {
this.video.style.display = "none";
}
await this.video.play();
logger.error(
`[WebInlineQRScanner:${this.id}] Video element started playing`,


@@ -28,7 +28,7 @@
*
* Supported Routes:
* - user-profile: View user profile
* - project-details: View project details
* - project: View project details
* - onboard-meeting-setup: Setup onboarding meeting
* - invite-one-accept: Accept invitation
* - contact-import: Import contacts
@@ -52,7 +52,7 @@ import {
routeSchema,
DeepLinkRoute,
} from "../interfaces/deepLinks";
import { logConsoleAndDb } from "../db";
import { logConsoleAndDb } from "../db/databaseUtil";
import type { DeepLinkError } from "../interfaces/deepLinks";
/**
@@ -81,18 +81,16 @@ export class DeepLinkHandler {
string,
{ name: string; paramKey?: string }
> = {
"user-profile": { name: "user-profile" },
"project-details": { name: "project-details" },
"onboard-meeting-setup": { name: "onboard-meeting-setup" },
"invite-one-accept": { name: "invite-one-accept" },
"contact-import": { name: "contact-import" },
"confirm-gift": { name: "confirm-gift" },
claim: { name: "claim" },
"claim-cert": { name: "claim-cert" },
"claim-add-raw": { name: "claim-add-raw" },
"contact-edit": { name: "contact-edit", paramKey: "did" },
contacts: { name: "contacts" },
"claim-cert": { name: "claim-cert" },
"confirm-gift": { name: "confirm-gift" },
did: { name: "did", paramKey: "did" },
"invite-one-accept": { name: "invite-one-accept", paramKey: "jwt" },
"onboard-meeting-members": { name: "onboard-meeting-members" },
"onboard-meeting-setup": { name: "onboard-meeting-setup" },
project: { name: "project" },
"user-profile": { name: "user-profile" },
};
/**
@@ -119,6 +117,15 @@ export class DeepLinkHandler {
const [path, queryString] = parts[1].split("?");
const [routePath, param] = path.split("/");
// Validate route exists before proceeding
if (!this.ROUTE_MAP[routePath]) {
throw {
code: "INVALID_ROUTE",
message: `Invalid route path: ${routePath}`,
details: { routePath },
};
}
const query: Record<string, string> = {};
if (queryString) {
new URLSearchParams(queryString).forEach((value, key) => {
@@ -128,11 +135,9 @@ export class DeepLinkHandler {
const params: Record<string, string> = {};
if (param) {
if (this.ROUTE_MAP[routePath].paramKey) {
params[this.ROUTE_MAP[routePath].paramKey] = param;
} else {
params["id"] = param;
}
// Now we know routePath exists in ROUTE_MAP
const routeConfig = this.ROUTE_MAP[routePath];
params[routeConfig.paramKey ?? "id"] = param;
}
return { path: routePath, params, query };
}
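A worked trace of the mapping above (the timesafari:// scheme and example identifiers are assumptions for illustration):

// "timesafari://contact-edit/did:ethr:0xAbc"
//   -> { path: "contact-edit", params: { did: "did:ethr:0xAbc" }, query: {} }
// "timesafari://claim/someClaimId?view=raw"  (no paramKey, so the generic "id" key is used)
//   -> { path: "claim", params: { id: "someClaimId" }, query: { view: "raw" } }
// An unrecognized routePath now throws an error with code "INVALID_ROUTE"
// before any navigation is attempted.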

File diff suppressed because it is too large


@@ -0,0 +1,150 @@
/**
* Manage database migrations as people upgrade their app over time
*/
import { logger } from "../utils/logger";
/**
* Migration interface for database schema migrations
*/
interface Migration {
name: string;
sql: string;
}
/**
* Migration registry to store and manage database migrations
*/
class MigrationRegistry {
private migrations: Migration[] = [];
/**
* Register a migration with the registry
*
* @param migration - The migration to register
*/
registerMigration(migration: Migration): void {
this.migrations.push(migration);
logger.info(`[MigrationService] Registered migration: ${migration.name}`);
}
/**
* Get all registered migrations
*
* @returns Array of registered migrations
*/
getMigrations(): Migration[] {
return this.migrations;
}
/**
* Clear all registered migrations
*/
clearMigrations(): void {
this.migrations = [];
logger.info("[MigrationService] Cleared all registered migrations");
}
}
// Create a singleton instance of the migration registry
const migrationRegistry = new MigrationRegistry();
/**
* Register a migration with the migration service
*
* This function is used by the migration system to register database
* schema migrations that need to be applied to the database.
*
* @param migration - The migration to register
*/
export function registerMigration(migration: Migration): void {
migrationRegistry.registerMigration(migration);
}
/**
* Run all registered migrations against the database
*
* This function executes all registered migrations in order, checking
* which ones have already been applied to avoid duplicate execution.
* It creates a migrations table if it doesn't exist to track applied
* migrations.
*
* @param sqlExec - Function to execute SQL statements
* @param sqlQuery - Function to query SQL data
* @param extractMigrationNames - Function to extract migration names from query results
* @returns Promise that resolves when all migrations are complete
*/
export async function runMigrations<T>(
sqlExec: (sql: string, params?: unknown[]) => Promise<unknown>,
sqlQuery: (sql: string, params?: unknown[]) => Promise<T>,
extractMigrationNames: (result: T) => Set<string>,
): Promise<void> {
try {
// Create migrations table if it doesn't exist
await sqlExec(`
CREATE TABLE IF NOT EXISTS migrations (
name TEXT PRIMARY KEY,
applied_at TEXT DEFAULT CURRENT_TIMESTAMP
);
`);
// Get list of already applied migrations
const appliedMigrationsResult = await sqlQuery(
"SELECT name FROM migrations",
);
const appliedMigrations = extractMigrationNames(appliedMigrationsResult);
logger.info(
`[MigrationService] Found ${appliedMigrations.size} applied migrations`,
);
// Get all registered migrations
const migrations = migrationRegistry.getMigrations();
if (migrations.length === 0) {
logger.warn("[MigrationService] No migrations registered");
return;
}
logger.info(
`[MigrationService] Running ${migrations.length} registered migrations`,
);
// Run each migration that hasn't been applied yet
for (const migration of migrations) {
if (appliedMigrations.has(migration.name)) {
logger.info(
`[MigrationService] Skipping already applied migration: ${migration.name}`,
);
continue;
}
logger.info(`[MigrationService] Applying migration: ${migration.name}`);
try {
// Execute the migration SQL
await sqlExec(migration.sql);
// Record that the migration was applied
await sqlExec("INSERT INTO migrations (name) VALUES (?)", [
migration.name,
]);
logger.info(
`[MigrationService] Successfully applied migration: ${migration.name}`,
);
} catch (error) {
logger.error(
`[MigrationService] Failed to apply migration ${migration.name}:`,
error,
);
throw new Error(`Migration ${migration.name} failed: ${error}`);
}
}
logger.info("[MigrationService] All migrations completed successfully");
} catch (error) {
logger.error("[MigrationService] Migration process failed:", error);
throw error;
}
}
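For orientation, a minimal usage sketch of these exports, with a hypothetical migration and throwaway in-memory adapters standing in for a real SQL backend (the actual Capacitor wiring appears in the platform service below):

import { registerMigration, runMigrations } from "@/db-sql/migration";

// Hypothetical migration for illustration only; real schema migrations are
// registered elsewhere in the codebase.
registerMigration({
  name: "001_example_table",
  sql: "CREATE TABLE IF NOT EXISTS example (id TEXT PRIMARY KEY);",
});

// Throwaway in-memory stand-ins that only demonstrate the expected shapes.
const applied: string[] = [];
const sqlExec = async (sql: string, params?: unknown[]): Promise<unknown> => {
  if (sql.startsWith("INSERT INTO migrations") && params?.length) {
    applied.push(String(params[0])); // record the applied migration name
  }
  return { changes: 1 };
};
const sqlQuery = async (_sql: string): Promise<Array<{ name: string }>> =>
  applied.map((name) => ({ name }));

// Inside an async context:
await runMigrations(sqlExec, sqlQuery, (rows) => new Set(rows.map((r) => r.name)));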

View File

@@ -1,23 +1,243 @@
import { Filesystem, Directory, Encoding } from "@capacitor/filesystem";
import {
Camera,
CameraResultType,
CameraSource,
CameraDirection,
} from "@capacitor/camera";
import { Capacitor } from "@capacitor/core";
import { Share } from "@capacitor/share";
import {
SQLiteConnection,
SQLiteDBConnection,
CapacitorSQLite,
capSQLiteChanges,
DBSQLiteValues,
} from "@capacitor-community/sqlite";
import { runMigrations } from "@/db-sql/migration";
import { QueryExecResult } from "@/interfaces/database";
import {
ImageResult,
PlatformService,
PlatformCapabilities,
} from "../PlatformService";
import { logger } from "../../utils/logger";
interface QueuedOperation {
type: "run" | "query";
sql: string;
params: unknown[];
resolve: (value: unknown) => void;
reject: (reason: unknown) => void;
}
/**
* Platform service implementation for Capacitor (mobile) platform.
* Provides native mobile functionality through Capacitor plugins for:
* - File system operations
* - Camera and image picker
* - Platform-specific features
* - SQLite database operations
*/
export class CapacitorPlatformService implements PlatformService {
/** Current camera direction */
  private currentDirection: CameraDirection = "BACK";
private sqlite: SQLiteConnection;
private db: SQLiteDBConnection | null = null;
private dbName = "timesafari.sqlite";
private initialized = false;
private initializationPromise: Promise<void> | null = null;
private operationQueue: Array<QueuedOperation> = [];
private isProcessingQueue: boolean = false;
constructor() {
this.sqlite = new SQLiteConnection(CapacitorSQLite);
}
private async initializeDatabase(): Promise<void> {
// If already initialized, return immediately
if (this.initialized) {
return;
}
// If initialization is in progress, wait for it
if (this.initializationPromise) {
return this.initializationPromise;
}
// Start initialization
this.initializationPromise = this._initialize();
try {
await this.initializationPromise;
} catch (error) {
logger.error(
"[CapacitorPlatformService] Initialize method failed:",
error,
);
this.initializationPromise = null; // Reset on failure
throw error;
}
}
private async _initialize(): Promise<void> {
if (this.initialized) {
return;
}
try {
// Create/Open database
this.db = await this.sqlite.createConnection(
this.dbName,
false,
"no-encryption",
1,
false,
);
await this.db.open();
// Set journal mode to WAL for better performance
// await this.db.execute("PRAGMA journal_mode=WAL;");
// Run migrations
await this.runCapacitorMigrations();
this.initialized = true;
logger.log(
"[CapacitorPlatformService] SQLite database initialized successfully",
);
// Start processing the queue after initialization
this.processQueue();
} catch (error) {
logger.error(
"[CapacitorPlatformService] Error initializing SQLite database:",
error,
);
throw new Error(
"[CapacitorPlatformService] Failed to initialize database",
);
}
}
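  /**
   * Drains queued SQL operations one at a time in FIFO order.
   * The isProcessingQueue flag prevents overlapping drains, so statements
   * execute sequentially against the single database connection.
   */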
private async processQueue(): Promise<void> {
if (this.isProcessingQueue || !this.initialized || !this.db) {
return;
}
this.isProcessingQueue = true;
while (this.operationQueue.length > 0) {
const operation = this.operationQueue.shift();
if (!operation) continue;
try {
let result: unknown;
switch (operation.type) {
case "run": {
const runResult = await this.db.run(
operation.sql,
operation.params,
);
result = {
changes: runResult.changes?.changes || 0,
lastId: runResult.changes?.lastId,
};
break;
}
case "query": {
const queryResult = await this.db.query(
operation.sql,
operation.params,
);
result = {
columns: Object.keys(queryResult.values?.[0] || {}),
values: (queryResult.values || []).map((row) =>
Object.values(row),
),
};
break;
}
}
operation.resolve(result);
} catch (error) {
logger.error(
"[CapacitorPlatformService] Error while processing SQL queue:",
error,
);
operation.reject(error);
}
}
this.isProcessingQueue = false;
}
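  /**
   * Wraps a SQL statement in a promise, enqueues it, and resolves or rejects
   * it when processQueue later executes the operation against the open
   * database connection.
   */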
private async queueOperation<R>(
type: QueuedOperation["type"],
sql: string,
params: unknown[] = [],
): Promise<R> {
return new Promise<R>((resolve, reject) => {
const operation: QueuedOperation = {
type,
sql,
params,
resolve: (value: unknown) => resolve(value as R),
reject,
};
this.operationQueue.push(operation);
// If we're already initialized, start processing the queue
if (this.initialized && this.db) {
this.processQueue();
}
});
}
private async waitForInitialization(): Promise<void> {
// If we have an initialization promise, wait for it
if (this.initializationPromise) {
await this.initializationPromise;
return;
}
// If not initialized and no promise, start initialization
if (!this.initialized) {
await this.initializeDatabase();
return;
}
// If initialized but no db, something went wrong
if (!this.db) {
logger.error(
"[CapacitorPlatformService] Database not properly initialized after await waitForInitialization() - initialized flag is true but db is null",
);
throw new Error(
"[CapacitorPlatformService] The database could not be initialized. We recommend you restart or reinstall.",
);
}
}
private async runCapacitorMigrations(): Promise<void> {
if (!this.db) {
throw new Error("Database not initialized");
}
    const db = this.db;
    // Route parameterized statements through run() so placeholders (eg. the
    // migration name recorded in the migrations table) are bound; plain
    // multi-statement SQL still goes through execute().
    const sqlExec = (
      sql: string,
      params?: unknown[],
    ): Promise<capSQLiteChanges> =>
      params && params.length > 0 ? db.run(sql, params) : db.execute(sql);
    const sqlQuery = (
      sql: string,
      params?: unknown[],
    ): Promise<DBSQLiteValues> => db.query(sql, params);
    const extractMigrationNames: (result: DBSQLiteValues) => Set<string> = (
      result,
    ) => {
      const names =
        result.values?.map((row: { name: string }) => row.name) || [];
      return new Set(names);
    };
    await runMigrations(sqlExec, sqlQuery, extractMigrationNames);
}
/**
* Gets the capabilities of the Capacitor platform
@@ -28,7 +248,7 @@ export class CapacitorPlatformService implements PlatformService {
hasFileSystem: true,
hasCamera: true,
isMobile: true,
      isIOS: Capacitor.getPlatform() === "ios",
hasFileDownload: false,
needsFileHandlingInstructions: true,
isNativeApp: true,
@@ -189,6 +409,9 @@ export class CapacitorPlatformService implements PlatformService {
*/
async writeFile(fileName: string, content: string): Promise<void> {
try {
// Check storage permissions before proceeding
await this.checkStoragePermissions();
const logData = {
targetFileName: fileName,
contentLength: content.length,
@@ -330,6 +553,9 @@ export class CapacitorPlatformService implements PlatformService {
logger.log("[CapacitorPlatformService]", JSON.stringify(logData, null, 2));
try {
// Check storage permissions before proceeding
await this.checkStoragePermissions();
const { uri } = await Filesystem.writeFile({
path: fileName,
data: content,
@@ -476,7 +702,7 @@ export class CapacitorPlatformService implements PlatformService {
* @returns Promise that resolves when the camera is rotated
*/
async rotateCamera(): Promise<void> {
    this.currentDirection = this.currentDirection === "BACK" ? "FRONT" : "BACK";
logger.debug(`Camera rotated to ${this.currentDirection} camera`);
}
@@ -490,4 +716,27 @@ export class CapacitorPlatformService implements PlatformService {
// This is just a placeholder for the interface
return Promise.resolve();
}
/**
* @see PlatformService.dbQuery
*/
async dbQuery(sql: string, params?: unknown[]): Promise<QueryExecResult> {
await this.waitForInitialization();
return this.queueOperation<QueryExecResult>("query", sql, params || []);
}
/**
* @see PlatformService.dbExec
*/
async dbExec(
sql: string,
params?: unknown[],
): Promise<{ changes: number; lastId?: number }> {
await this.waitForInitialization();
return this.queueOperation<{ changes: number; lastId?: number }>(
"run",
sql,
params || [],
);
}
}
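A usage sketch of the new dbExec/dbQuery methods (illustrative table and values; in the app the service instance comes from the platform service factory rather than being constructed directly):

// Hypothetical consumer code.
const platform = new CapacitorPlatformService();

// Parameterized write; the call queues until initialization completes.
const { changes, lastId } = await platform.dbExec(
  "INSERT INTO contacts (did, name) VALUES (?, ?)",
  ["did:example:123", "Alice"],
);

// Reads come back as parallel columns/values arrays (QueryExecResult).
const result = await platform.dbQuery("SELECT did, name FROM contacts");
const rows = result.values.map((row) =>
  Object.fromEntries(result.columns.map((col, i) => [col, row[i]])),
);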

Some files were not shown because too many files have changed in this diff