# Tasks: Geutebruck Surveillance API

**Input**: Design documents from `/specs/001-surveillance-api/`
**Prerequisites**: plan.md ✅, spec.md ✅, research.md ✅, data-model.md ✅, contracts/openapi.yaml ✅
**Tests**: TDD approach enforced - all tests MUST be written first and FAIL before implementation begins.
**Organization**: Tasks are grouped by user story to enable independent implementation and testing of each story.

---

## Format: `[ID] [P?] [Story] Description`

- **[P]**: Can run in parallel (different files, no dependencies)
- **[Story]**: Which user story this task belongs to (e.g., US1, US2, US3)
- Include exact file paths in descriptions

---

## Path Conventions

This project uses **web application structure**:

- **Python API**: `src/api/` (FastAPI application)
- **C# SDK Bridge**: `src/sdk-bridge/` (gRPC service)
- **Tests**: `tests/api/` (Python), `tests/sdk-bridge/` (C#)

---

## Phase 1: Setup (Shared Infrastructure)

**Purpose**: Project initialization and basic structure

- [ ] T001 Create Python project structure: src/api/ with subdirs (models/, schemas/, routers/, services/, clients/, middleware/, websocket/, utils/, migrations/)
- [ ] T002 Create C# SDK Bridge structure: src/sdk-bridge/ with GeViScopeBridge.sln, Services/, SDK/, Protos/
- [ ] T003 Create test structure: tests/api/ (unit/, integration/, contract/) and tests/sdk-bridge/ (Unit/, Integration/)
- [ ] T004 [P] Initialize Python dependencies in requirements.txt (FastAPI, Uvicorn, SQLAlchemy, Redis, grpcio, PyJWT, pytest)
- [ ] T005 [P] Initialize C# project with .NET 8.0 gRPC and .NET Framework 4.8 SDK dependencies
- [ ] T006 [P] Configure Python linting/formatting (ruff, black, mypy) in pyproject.toml
- [ ] T007 [P] Create .env.example with all required environment variables
- [ ] T008 [P] Create scripts/setup_dev_environment.ps1 for automated development environment setup
- [ ] T009 [P] Create scripts/start_services.ps1 to start Redis, SDK Bridge, and API
- [ ] T010 [P] Setup Alembic for database migrations in src/api/migrations/
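The variables captured in .env.example (T007) typically feed a loader like the one T019 will implement. A minimal sketch — the variable names and `Settings` fields here are illustrative, not the project's final list:

```python
import os
from dataclasses import dataclass

# Hypothetical variable names -- the authoritative list belongs in
# .env.example (T007) and src/api/config.py (T019).
@dataclass(frozen=True)
class Settings:
    database_url: str
    redis_url: str
    sdk_bridge_addr: str
    jwt_secret: str

def load_settings(env=None) -> Settings:
    """Build Settings from environment variables, with dev-friendly defaults."""
    env = os.environ if env is None else env
    return Settings(
        database_url=env.get("DATABASE_URL", "postgresql://localhost/surveillance"),
        redis_url=env.get("REDIS_URL", "redis://localhost:6379/0"),
        sdk_bridge_addr=env.get("SDK_BRIDGE_ADDR", "localhost:50051"),
        jwt_secret=env.get("JWT_SECRET", "dev-only-secret"),
    )
```

A frozen dataclass keeps configuration immutable after startup; production deployments would override every default.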

---

## Phase 2: Foundational (Blocking Prerequisites)

**Purpose**: Core infrastructure that MUST be complete before ANY user story can be implemented

**⚠️ CRITICAL**: No user story work can begin until this phase is complete

### C# SDK Bridge Foundation

- [ ] T011 Define gRPC protocol buffer for common types in src/sdk-bridge/Protos/common.proto (Status, Error, Timestamp)
- [ ] T012 Create GeViDatabaseWrapper.cs in src/sdk-bridge/SDK/ (wraps GeViDatabase connection lifecycle)
- [ ] T013 Implement connection management: Create → RegisterCallback → Connect pattern with retry logic
- [ ] T014 [P] Create StateQueryHandler.cs for GetFirst/GetNext enumeration pattern
- [ ] T015 [P] Create DatabaseQueryHandler.cs for historical query sessions
- [ ] T016 Implement error translation from Windows error codes to gRPC status codes in src/sdk-bridge/Utils/ErrorTranslator.cs
- [ ] T017 Setup gRPC server in src/sdk-bridge/Program.cs with service registration

### Python API Foundation

- [ ] T018 Create FastAPI app initialization in src/api/main.py with CORS, middleware registration
- [ ] T019 [P] Create configuration management in src/api/config.py loading from environment variables
- [ ] T020 [P] Setup PostgreSQL connection with SQLAlchemy in src/api/models/__init__.py
- [ ] T021 [P] Setup Redis client with connection pooling in src/api/clients/redis_client.py
- [ ] T022 Create gRPC SDK Bridge client in src/api/clients/sdk_bridge_client.py with connection pooling
- [ ] T023 [P] Implement JWT utilities in src/api/utils/jwt_utils.py (encode, decode, verify)
- [ ] T024 [P] Create error translation utilities in src/api/utils/error_translation.py (SDK errors → HTTP status)
- [ ] T025 Implement global error handler middleware in src/api/middleware/error_handler.py
- [ ] T026 [P] Create base Pydantic schemas in src/api/schemas/__init__.py (ErrorResponse, SuccessResponse)

### Database & Testing Infrastructure

- [ ] T027 Create initial Alembic migration
for database schema (users, audit_logs tables)
- [ ] T028 [P] Setup pytest configuration in tests/api/conftest.py with fixtures (test_db, test_redis, test_client)
- [ ] T029 [P] Setup xUnit test infrastructure in tests/sdk-bridge/ with test SDK connection

**Checkpoint**: Foundation ready - user story implementation can now begin in parallel

---

## Phase 3: User Story 1 - Secure API Access (Priority: P1) 🎯 MVP

**Goal**: Implement JWT-based authentication with role-based access control (viewer, operator, administrator)

**Independent Test**: Can authenticate with valid credentials to receive JWT token, access protected endpoints with token, and receive 401 for invalid/expired tokens

### Tests for User Story 1 (TDD - Write FIRST, Ensure FAIL)

- [ ] T030 [P] [US1] Write contract test for POST /api/v1/auth/login in tests/api/contract/test_auth_contract.py (should FAIL)
- [ ] T031 [P] [US1] Write contract test for POST /api/v1/auth/refresh in tests/api/contract/test_auth_contract.py (should FAIL)
- [ ] T032 [P] [US1] Write contract test for POST /api/v1/auth/logout in tests/api/contract/test_auth_contract.py (should FAIL)
- [ ] T033 [P] [US1] Write integration test for authentication flow in tests/api/integration/test_auth_flow.py (should FAIL)
- [ ] T034 [P] [US1] Write unit test for AuthService in tests/api/unit/test_auth_service.py (should FAIL)

### Implementation for User Story 1

- [ ] T035 [P] [US1] Create User model in src/api/models/user.py (id, username, password_hash, role, permissions, created_at, updated_at)
- [ ] T036 [P] [US1] Create AuditLog model in src/api/models/audit_log.py (id, user_id, action, target, outcome, timestamp, details)
- [ ] T037 [US1] Create Alembic migration for User and AuditLog tables
- [ ] T038 [P] [US1] Create auth request/response schemas in src/api/schemas/auth.py (LoginRequest, TokenResponse, RefreshRequest)
- [ ] T039 [US1] Implement AuthService in src/api/services/auth_service.py (login, refresh, logout, validate_token)
- [ ] T040 [US1] Implement password hashing with bcrypt in AuthService
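The hash/verify pair behind T040 has the same shape regardless of algorithm: hash on registration, constant-time verify on login. T040 specifies bcrypt; the sketch below substitutes stdlib PBKDF2 so it stays dependency-free — treat it as an illustration of the interface, not the mandated algorithm:

```python
import hashlib
import hmac
import os

# Dependency-free stand-in for the bcrypt implementation T040 calls for.
def hash_password(password: str, *, iterations: int = 600_000) -> str:
    """Return a self-describing salted hash: iterations$salt$digest."""
    salt = os.urandom(16)
    dk = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return f"{iterations}${salt.hex()}${dk.hex()}"

def verify_password(password: str, stored: str) -> bool:
    """Recompute with the stored salt/iterations; compare in constant time."""
    iterations, salt_hex, dk_hex = stored.split("$")
    dk = hashlib.pbkdf2_hmac(
        "sha256", password.encode(), bytes.fromhex(salt_hex), int(iterations)
    )
    return hmac.compare_digest(dk.hex(), dk_hex)
```

Storing the iteration count alongside the digest lets the work factor be raised later without invalidating existing hashes — bcrypt encodes its cost factor the same way.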
- [ ] T041 [US1] Implement JWT token generation (access: 1hr, refresh: 7 days) with Redis session storage
- [ ] T042 [US1] Implement authentication middleware in src/api/middleware/auth_middleware.py (verify JWT, extract user)
- [ ] T043 [US1] Implement rate limiting middleware for auth endpoints in src/api/middleware/rate_limiter.py (5 attempts/min)
- [ ] T044 [US1] Create auth router in src/api/routers/auth.py with login, refresh, logout endpoints
- [ ] T045 [US1] Implement audit logging for authentication attempts (successes and failures)
- [ ] T046 [US1] Add role-based permission checking utilities in src/api/utils/permissions.py

**Verify**: Run tests T030-T034 - they should now PASS

**Checkpoint**: Authentication system complete - can login, get tokens, access protected endpoints

---

## Phase 4: User Story 2 - Live Video Stream Access (Priority: P1)

**Goal**: Enable users to view live video streams from surveillance cameras with <2s initialization time

**Independent Test**: Authenticate, request stream URL for camera, receive RTSP URL with token, play stream in video player

### gRPC Protocol Definitions

- [ ] T047 [US2] Define camera.proto in src/sdk-bridge/Protos/ (ListCamerasRequest/Response, GetCameraRequest/Response, CameraInfo)
- [ ] T048 [US2] Define stream.proto in src/sdk-bridge/Protos/ (StartStreamRequest/Response, StopStreamRequest/Response, StreamInfo)

### Tests for User Story 2 (TDD - Write FIRST, Ensure FAIL)

- [ ] T049 [P] [US2] Write contract test for GET /api/v1/cameras in tests/api/contract/test_cameras_contract.py (should FAIL)
- [ ] T050 [P] [US2] Write contract test for GET /api/v1/cameras/{id} in tests/api/contract/test_cameras_contract.py (should FAIL)
- [ ] T051 [P] [US2] Write contract test for POST /api/v1/cameras/{id}/stream in tests/api/contract/test_cameras_contract.py (should FAIL)
- [ ] T052 [P] [US2] Write contract test for DELETE /api/v1/cameras/{id}/stream/{stream_id} in
tests/api/contract/test_cameras_contract.py (should FAIL)
- [ ] T053 [P] [US2] Write integration test for stream lifecycle in tests/api/integration/test_stream_lifecycle.py (should FAIL)
- [ ] T054 [P] [US2] Write unit test for CameraService in tests/api/unit/test_camera_service.py (should FAIL)
- [ ] T055 [P] [US2] Write C# unit test for CameraService gRPC in tests/sdk-bridge/Unit/CameraServiceTests.cs (should FAIL)

### Implementation - SDK Bridge (C#)

- [ ] T056 [US2] Implement CameraService.cs in src/sdk-bridge/Services/ with ListCameras (GetFirstVideoInput/GetNextVideoInput pattern)
- [ ] T057 [US2] Implement GetCameraDetails in CameraService.cs (query video input info: channel, name, capabilities)
- [ ] T058 [US2] Implement GetCameraStatus in CameraService.cs (online/offline detection)
- [ ] T059 [US2] Implement StreamService.cs in src/sdk-bridge/Services/ with StartStream method
- [ ] T060 [US2] Generate RTSP URL with token in StreamService.cs (format: rtsp://host:port/stream/{id}?token={jwt})
- [ ] T061 [US2] Implement StopStream method in StreamService.cs
- [ ] T062 [US2] Track active streams with channel mapping in StreamService.cs

### Implementation - Python API

- [ ] T063 [P] [US2] Create Camera model in src/api/models/camera.py (id, channel, name, description, status, capabilities)
- [ ] T064 [P] [US2] Create Stream model in src/api/models/stream.py (id, camera_id, user_id, url, started_at, expires_at)
- [ ] T065 [US2] Create Alembic migration for Camera and Stream tables
- [ ] T066 [P] [US2] Create camera schemas in src/api/schemas/camera.py (CameraInfo, CameraList, CameraCapabilities)
- [ ] T067 [P] [US2] Create stream schemas in src/api/schemas/stream.py (StartStreamRequest, StreamResponse)
- [ ] T068 [US2] Implement CameraService in src/api/services/camera_service.py (list, get_details, sync from SDK bridge)
- [ ] T069 [US2] Implement StreamService in src/api/services/stream_service.py (start, stop, track active streams)
- [ ] T070 [US2] Implement token generation for stream URLs (15min expiration)
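T070's short-lived stream tokens boil down to a signed, expiring claim that the media endpoint can verify without a database hit. The project would build this on PyJWT (T004/T023); the sketch below uses stdlib HMAC to stay dependency-free, and the payload fields are illustrative:

```python
import base64
import hashlib
import hmac
import json
import time

# Dependency-free stand-in for the PyJWT-based token in T023/T070.
def make_stream_token(camera_id: str, secret: bytes, ttl_s: int = 15 * 60) -> str:
    """Sign {camera_id, exp} so the stream endpoint can verify it statelessly."""
    payload = json.dumps({"camera_id": camera_id, "exp": int(time.time()) + ttl_s})
    body = base64.urlsafe_b64encode(payload.encode()).decode().rstrip("=")
    sig = hmac.new(secret, body.encode(), hashlib.sha256).hexdigest()
    return f"{body}.{sig}"

def check_stream_token(token: str, secret: bytes):
    """Return the payload if signature is valid and unexpired, else None."""
    body, _, sig = token.rpartition(".")
    expected = hmac.new(secret, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None
    padded = body + "=" * (-len(body) % 4)
    payload = json.loads(base64.urlsafe_b64decode(padded))
    return payload if payload["exp"] > time.time() else None
```

Embedding the token in the RTSP URL (T060) keeps the media path stateless; expiry alone bounds the damage of a leaked URL.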
- [ ] T071 [US2] Create cameras router in src/api/routers/cameras.py with GET /cameras, GET /cameras/{id}
- [ ] T072 [US2] Implement stream endpoints: POST /cameras/{id}/stream, DELETE /cameras/{id}/stream/{stream_id}
- [ ] T073 [US2] Add permission checks: users can only access cameras they have permission for (403 if unauthorized)
- [ ] T074 [US2] Implement camera offline error handling (clear error message when camera unavailable)

**Verify**: Run tests T049-T055 - they should now PASS

**Checkpoint**: Live streaming functional - can list cameras, start/stop streams, play video

---

## Phase 5: User Story 3 - Camera PTZ Control (Priority: P1)

**Goal**: Enable remote pan-tilt-zoom control for PTZ-capable cameras with <500ms response time

**Independent Test**: Send PTZ command (pan left/right, tilt up/down, zoom in/out) to PTZ camera, verify movement occurs

### gRPC Protocol Definitions

- [ ] T075 [US3] Define ptz.proto in src/sdk-bridge/Protos/ (PTZMoveRequest, PTZPresetRequest, PTZResponse)

### Tests for User Story 3 (TDD - Write FIRST, Ensure FAIL)

- [ ] T076 [P] [US3] Write contract test for POST /api/v1/cameras/{id}/ptz in tests/api/contract/test_ptz_contract.py (should FAIL)
- [ ] T077 [P] [US3] Write integration test for PTZ control in tests/api/integration/test_ptz_control.py (should FAIL)
- [ ] T078 [P] [US3] Write unit test for PTZService in tests/api/unit/test_ptz_service.py (should FAIL)
- [ ] T079 [P] [US3] Write C# unit test for PTZService gRPC in tests/sdk-bridge/Unit/PTZServiceTests.cs (should FAIL)

### Implementation - SDK Bridge (C#)

- [ ] T080 [US3] Implement PTZService.cs in src/sdk-bridge/Services/ with MoveCamera method (pan, tilt, zoom, speed)
- [ ] T081 [US3] Implement SetPreset and GotoPreset methods in PTZService.cs
- [ ] T082 [US3] Implement StopMovement method in PTZService.cs
- [ ] T083 [US3] Add PTZ command queuing for concurrent control conflict resolution

### Implementation - Python API
- [ ] T084 [P] [US3] Create PTZ schemas in src/api/schemas/ptz.py (PTZMoveCommand, PTZPresetCommand, PTZResponse)
- [ ] T085 [US3] Implement PTZService in src/api/services/ptz_service.py (move, set_preset, goto_preset, stop)
- [ ] T086 [US3] Add PTZ endpoints to cameras router: POST /cameras/{id}/ptz
- [ ] T087 [US3] Implement PTZ capability validation (return error if camera doesn't support PTZ)
- [ ] T088 [US3] Implement operator role requirement for PTZ control (viewers can't control PTZ)
- [ ] T089 [US3] Add audit logging for all PTZ commands

**Verify**: Run tests T076-T079 - they should now PASS

**Checkpoint**: PTZ control functional - can move cameras, use presets, operators have control

---

## Phase 6: User Story 4 - Real-time Event Notifications (Priority: P1)

**Goal**: Deliver real-time surveillance event notifications via WebSocket with <100ms latency to 1000+ concurrent clients

**Independent Test**: Connect to WebSocket, subscribe to event types, trigger test alarm, receive notification within 100ms

### gRPC Protocol Definitions

- [ ] T090 [US4] Define event.proto in src/sdk-bridge/Protos/ (SubscribeEventsRequest, EventNotification with server streaming)

### Tests for User Story 4 (TDD - Write FIRST, Ensure FAIL)

- [ ] T091 [P] [US4] Write contract test for WebSocket /api/v1/events/stream in tests/api/contract/test_events_contract.py (should FAIL)
- [ ] T092 [P] [US4] Write contract test for GET /api/v1/events in tests/api/contract/test_events_contract.py (should FAIL)
- [ ] T093 [P] [US4] Write integration test for event notification flow in tests/api/integration/test_event_notifications.py (should FAIL)
- [ ] T094 [P] [US4] Write unit test for EventService in tests/api/unit/test_event_service.py (should FAIL)
- [ ] T095 [P] [US4] Write C# unit test for EventService gRPC in tests/sdk-bridge/Unit/EventServiceTests.cs (should FAIL)

### Implementation - SDK Bridge (C#)

- [ ] T096 [US4] Implement EventService.cs in src/sdk-bridge/Services/ with
SubscribeEvents (server streaming)
- [ ] T097 [US4] Register SDK event callbacks for motion, alarms, analytics, system events
- [ ] T098 [US4] Map SDK events to gRPC EventNotification messages
- [ ] T099 [US4] Implement event filtering by type and camera channel

### Implementation - Python API

- [ ] T100 [P] [US4] Create Event model in src/api/models/event.py (id, type, camera_id, timestamp, severity, data)
- [ ] T101 [US4] Create Alembic migration for Event table
- [ ] T102 [P] [US4] Create event schemas in src/api/schemas/event.py (EventNotification, EventQuery, EventFilter)
- [ ] T103 [US4] Implement WebSocket connection manager in src/api/websocket/connection_manager.py (add, remove, broadcast)
- [ ] T104 [US4] Implement Redis pub/sub event broadcaster in src/api/websocket/event_broadcaster.py (subscribe to SDK bridge events)
- [ ] T105 [US4] Create background task to consume SDK bridge event stream and publish to Redis
- [ ] T106 [US4] Implement WebSocket endpoint in src/api/routers/events.py: WS /events/stream
- [ ] T107 [US4] Implement event subscription management (subscribe, unsubscribe to event types)
- [ ] T108 [US4] Implement client reconnection handling with missed event recovery
- [ ] T109 [US4] Implement EventService in src/api/services/event_service.py (query historical events)
- [ ] T110 [US4] Create REST endpoint: GET /events (query with filters: camera, type, time range)
- [ ] T111 [US4] Implement permission filtering (users only receive events for authorized cameras)

**Verify**: Run tests T091-T095 - they should now PASS

**Checkpoint**: Event notifications working - WebSocket delivers real-time alerts, query historical events

---

## Phase 7: User Story 5 - Recording Management (Priority: P2)

**Goal**: Manage video recording settings and query recorded footage for investigations

**Independent Test**: Start recording on camera, query recordings by time range, receive list with download URLs

### gRPC Protocol Definitions

- [ ] T112 [US5] Define recording.proto in src/sdk-bridge/Protos/ (QueryRecordingsRequest, StartRecordingRequest, RecordingInfo)
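T112's QueryRecordingsRequest ultimately drives a time-range query (T118 on the bridge side, T127 on the API side). The core of that query is an interval-overlap filter; a dependency-free sketch, with an illustrative `Segment` shape standing in for whatever the SDK bridge actually returns:

```python
from dataclasses import dataclass
from datetime import datetime

# Illustrative shape only; the real segment list comes from the SDK
# bridge (T118) and the filters from REST query parameters (T127).
@dataclass
class Segment:
    camera_id: str
    start: datetime
    end: datetime

def query_recordings(segments, camera_id, frm: datetime, to: datetime):
    """Keep segments overlapping [frm, to), optionally for one camera.

    Two half-open intervals overlap iff each starts before the other ends.
    """
    return [
        s for s in segments
        if (camera_id is None or s.camera_id == camera_id)
        and s.start < to and s.end > frm
    ]
```

The overlap predicate (`start < to and end > frm`) catches segments that merely straddle the window's edges, which a naive "starts within range" filter would drop.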

### Tests for User Story 5 (TDD - Write FIRST, Ensure FAIL)

- [ ] T113 [P] [US5] Write contract test for GET /api/v1/recordings in tests/api/contract/test_recordings_contract.py (should FAIL)
- [ ] T114 [P] [US5] Write contract test for POST /api/v1/recordings/{id}/export in tests/api/contract/test_recordings_contract.py (should FAIL)
- [ ] T115 [P] [US5] Write integration test for recording management in tests/api/integration/test_recording_management.py (should FAIL)
- [ ] T116 [P] [US5] Write unit test for RecordingService in tests/api/unit/test_recording_service.py (should FAIL)
- [ ] T117 [P] [US5] Write C# unit test for RecordingService gRPC in tests/sdk-bridge/Unit/RecordingServiceTests.cs (should FAIL)

### Implementation - SDK Bridge (C#)

- [ ] T118 [US5] Implement RecordingService.cs in src/sdk-bridge/Services/ with QueryRecordings (database query with time range)
- [ ] T119 [US5] Implement StartRecording and StopRecording methods
- [ ] T120 [US5] Implement GetRecordingCapacity method (ring buffer metrics)
- [ ] T121 [US5] Query recording segments using CDBQCreateActionQuery pattern

### Implementation - Python API

- [ ] T122 [P] [US5] Create Recording model in src/api/models/recording.py (id, camera_id, start_time, end_time, size_bytes, trigger_type)
- [ ] T123 [US5] Create Alembic migration for Recording table
- [ ] T124 [P] [US5] Create recording schemas in src/api/schemas/recording.py (RecordingQuery, RecordingInfo, ExportRequest)
- [ ] T125 [US5] Implement RecordingService in src/api/services/recording_service.py (query, start, stop, export)
- [ ] T126 [US5] Create recordings router in src/api/routers/recordings.py: GET /recordings, POST /recordings/{id}/export
- [ ] T127 [US5] Implement recording query with filters (camera, time range, event type)
- [ ] T128 [US5] Implement export job creation (async job with progress tracking)
- [ ] T129 [US5]
Implement ring buffer capacity monitoring and warnings (alert at 90%)
- [ ] T130 [US5] Add administrator role requirement for starting/stopping recording

**Verify**: Run tests T113-T117 - they should now PASS

**Checkpoint**: Recording management functional - query, export, capacity monitoring

---

## Phase 8: User Story 6 - Video Analytics Configuration (Priority: P2)

**Goal**: Configure video content analysis features (VMD, object tracking, perimeter protection)

**Independent Test**: Configure motion detection zone on camera, trigger motion, verify analytics event generated

### gRPC Protocol Definitions

- [ ] T131 [US6] Define analytics.proto in src/sdk-bridge/Protos/ (ConfigureAnalyticsRequest, AnalyticsConfig with union types for VMD/NPR/OBTRACK/G-Tect)

### Tests for User Story 6 (TDD - Write FIRST, Ensure FAIL)

- [ ] T132 [P] [US6] Write contract test for GET /api/v1/analytics/{camera_id} in tests/api/contract/test_analytics_contract.py (should FAIL)
- [ ] T133 [P] [US6] Write contract test for POST /api/v1/analytics/{camera_id} in tests/api/contract/test_analytics_contract.py (should FAIL)
- [ ] T134 [P] [US6] Write integration test for analytics configuration in tests/api/integration/test_analytics_config.py (should FAIL)
- [ ] T135 [P] [US6] Write unit test for AnalyticsService in tests/api/unit/test_analytics_service.py (should FAIL)
- [ ] T136 [P] [US6] Write C# unit test for AnalyticsService gRPC in tests/sdk-bridge/Unit/AnalyticsServiceTests.cs (should FAIL)

### Implementation - SDK Bridge (C#)

- [ ] T137 [US6] Implement AnalyticsService.cs in src/sdk-bridge/Services/ with ConfigureAnalytics method
- [ ] T138 [US6] Implement GetAnalyticsConfig method (query current analytics settings)
- [ ] T139 [US6] Map analytics types to SDK sensor types (VMD, NPR, OBTRACK, G-Tect, CPA)
- [ ] T140 [US6] Implement region/zone configuration for analytics

### Implementation - Python API

- [ ] T141 [P] [US6] Create AnalyticsConfig model in
src/api/models/analytics_config.py (id, camera_id, type, enabled, configuration JSON)
- [ ] T142 [US6] Create Alembic migration for AnalyticsConfig table
- [ ] T143 [P] [US6] Create analytics schemas in src/api/schemas/analytics.py (AnalyticsConfigRequest, VMDConfig, NPRConfig, OBTRACKConfig)
- [ ] T144 [US6] Implement AnalyticsService in src/api/services/analytics_service.py (configure, get_config, validate)
- [ ] T145 [US6] Create analytics router in src/api/routers/analytics.py: GET/POST /analytics/{camera_id}
- [ ] T146 [US6] Implement analytics capability validation (return error if camera doesn't support requested analytics)
- [ ] T147 [US6] Add administrator role requirement for analytics configuration
- [ ] T148 [US6] Implement schedule support for analytics (enable/disable by time/day)

**Verify**: Run tests T132-T136 - they should now PASS

**Checkpoint**: Analytics configuration functional - configure VMD, NPR, OBTRACK, receive analytics events

---

## Phase 9: User Story 7 - Multi-Camera Management (Priority: P2)

**Goal**: View and manage multiple cameras simultaneously with location grouping

**Independent Test**: Request camera list, verify all authorized cameras returned with metadata, group by location

### Tests for User Story 7 (TDD - Write FIRST, Ensure FAIL)

- [ ] T149 [P] [US7] Write contract test for camera list with filtering/pagination in tests/api/contract/test_camera_list_contract.py (should FAIL)
- [ ] T150 [P] [US7] Write integration test for multi-camera operations in tests/api/integration/test_multi_camera.py (should FAIL)

### Implementation

- [ ] T151 [P] [US7] Add location field to Camera model (update migration)
- [ ] T152 [US7] Implement camera list filtering by location, status, capabilities in CameraService
- [ ] T153 [US7] Implement pagination for camera list (page, page_size parameters)
- [ ] T154 [US7] Update GET /cameras endpoint with query parameters (location, status, capabilities, page, page_size)
- [ ] T155 [US7] Implement
camera grouping by location in response
- [ ] T156 [US7] Implement concurrent stream limit tracking (warn if approaching limit)
- [ ] T157 [US7] Add camera status change notifications via WebSocket (camera goes offline → event)

**Verify**: Run tests T149-T150 - they should now PASS

**Checkpoint**: Multi-camera management functional - filtering, grouping, concurrent access

---

## Phase 10: User Story 8 - License Plate Recognition Integration (Priority: P3)

**Goal**: Receive automatic license plate recognition events with watchlist matching

**Independent Test**: Configure NPR zone, drive test vehicle through zone, receive NPR event with plate number

### Tests for User Story 8 (TDD - Write FIRST, Ensure FAIL)

- [ ] T158 [P] [US8] Write integration test for NPR events in tests/api/integration/test_npr_events.py (should FAIL)
- [ ] T159 [P] [US8] Write unit test for NPR watchlist matching in tests/api/unit/test_npr_service.py (should FAIL)

### Implementation

- [ ] T160 [P] [US8] Create NPREvent model extending Event in src/api/models/event.py (plate_number, country_code, confidence, image_url)
- [ ] T161 [US8] Create Alembic migration for NPREvent table
- [ ] T162 [P] [US8] Create Watchlist model in src/api/models/watchlist.py (id, plate_number, alert_level, notes)
- [ ] T163 [US8] Create Alembic migration for Watchlist table
- [ ] T164 [P] [US8] Create NPR schemas in src/api/schemas/npr.py (NPREventData, WatchlistEntry)
- [ ] T165 [US8] Implement NPR event handling in EventService (parse NPR data from SDK)
- [ ] T166 [US8] Implement watchlist matching service (check incoming plates against watchlist)
- [ ] T167 [US8] Implement high-priority alerts for watchlist matches
- [ ] T168 [US8] Add NPR-specific filtering to GET /events endpoint
- [ ] T169 [US8] Create watchlist management endpoints: GET/POST/DELETE /api/v1/watchlist

**Verify**: Run tests T158-T159 - they should now PASS

**Checkpoint**: NPR integration functional - receive plate events, watchlist matching, alerts

---

## Phase 11: User Story 9 - Video Export and Backup (Priority: P3)

**Goal**: Export specific video segments for evidence with progress tracking

**Independent Test**: Request export of 10-minute segment, poll job status, download exported file

### Tests for User Story 9 (TDD - Write FIRST, Ensure FAIL)

- [ ] T170 [P] [US9] Write contract test for export job in tests/api/contract/test_export_contract.py (should FAIL)
- [ ] T171 [P] [US9] Write integration test for export workflow in tests/api/integration/test_export_workflow.py (should FAIL)
- [ ] T172 [P] [US9] Write unit test for ExportService in tests/api/unit/test_export_service.py (should FAIL)

### Implementation

- [ ] T173 [P] [US9] Create ExportJob model in src/api/models/export_job.py (id, camera_id, start_time, end_time, status, progress, file_path)
- [ ] T174 [US9] Create Alembic migration for ExportJob table
- [ ] T175 [P] [US9] Create export schemas in src/api/schemas/export.py (ExportRequest, ExportJobStatus)
- [ ] T176 [US9] Implement ExportService in src/api/services/export_service.py (create_job, get_status, download)
- [ ] T177 [US9] Implement background worker for export processing (query recordings, concatenate, encode to MP4)
- [ ] T178 [US9] Implement progress tracking and updates (percentage complete, ETA)
- [ ] T179 [US9] Update POST /recordings/{id}/export to create export job and return job ID
- [ ] T180 [US9] Create GET /api/v1/exports/{job_id} endpoint for job status polling
- [ ] T181 [US9] Create GET /api/v1/exports/{job_id}/download endpoint for file download
- [ ] T182 [US9] Implement cleanup of old export files (auto-delete after 24 hours)
- [ ] T183 [US9] Add timestamp watermarking to exported video

**Verify**: Run tests T170-T172 - they should now PASS

**Checkpoint**: Video export functional - create jobs, track progress, download files

---

## Phase 12: User Story 10 - System Health Monitoring (Priority: P3)

**Goal**: Monitor API and surveillance system health with proactive alerts
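The aggregation T188 will need has one essential property: a failing probe must be reported, never allowed to mask the other components. A dependency-free sketch — probe names and the response shape are illustrative, not the final HealthResponse schema (T187):

```python
# Shape sketch for HealthService (T188): run each component probe,
# isolate failures, and derive an overall status.
def check_components(probes: dict) -> dict:
    """`probes` maps component name -> zero-arg callable that raises on failure
    (e.g. Redis PING, a SELECT 1, a gRPC health call)."""
    components = {}
    for name, probe in probes.items():
        try:
            probe()
            components[name] = {"status": "ok"}
        except Exception as exc:
            components[name] = {"status": "down", "detail": str(exc)}
    overall = "ok" if all(c["status"] == "ok" for c in components.values()) else "degraded"
    return {"status": overall, "components": components}
```

The public GET /health endpoint (T194) would expose only `status`, while the authenticated GET /status (T195) can include per-component detail.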

**Independent Test**: Query health endpoint, verify SDK connectivity status, simulate component failure

### Tests for User Story 10 (TDD - Write FIRST, Ensure FAIL)

- [ ] T184 [P] [US10] Write contract test for GET /api/v1/health in tests/api/contract/test_health_contract.py (should FAIL)
- [ ] T185 [P] [US10] Write contract test for GET /api/v1/status in tests/api/contract/test_health_contract.py (should FAIL)
- [ ] T186 [P] [US10] Write integration test for health monitoring in tests/api/integration/test_health_monitoring.py (should FAIL)

### Implementation

- [ ] T187 [P] [US10] Create health schemas in src/api/schemas/health.py (HealthResponse, SystemStatus, ComponentHealth)
- [ ] T188 [US10] Implement HealthService in src/api/services/health_service.py (check all components)
- [ ] T189 [US10] Implement SDK Bridge health check (gRPC connectivity test)
- [ ] T190 [US10] Implement Redis health check (ping test)
- [ ] T191 [US10] Implement PostgreSQL health check (simple query)
- [ ] T192 [US10] Implement disk space check for recordings (warn if <10%)
- [ ] T193 [US10] Create system router in src/api/routers/system.py: GET /health, GET /status
- [ ] T194 [US10] Implement GET /health endpoint (public, returns basic status)
- [ ] T195 [US10] Implement GET /status endpoint (authenticated, returns detailed metrics)
- [ ] T196 [US10] Add Prometheus metrics endpoint at /metrics (request count, latency, errors, active streams, WebSocket connections)
- [ ] T197 [US10] Implement background health monitoring task (check every 30s, alert on failures)

**Verify**: Run tests T184-T186 - they should now PASS

**Checkpoint**: Health monitoring functional - status endpoints, metrics, component checks

---

## Phase 13: User Story 12 - GeViSoft Configuration Management (Priority: P1) ✅ IMPLEMENTED (2025-12-16)

**Goal**: Manage GeViSoft configuration (G-Core servers, action mappings) via REST API

**Implementation Status**: CRUD operations working with critical bug fixes
applied

### Implementation Summary (Completed)

**REST API Endpoints**:

- ✅ `GET /api/v1/configuration/servers` - List all G-Core servers
- ✅ `GET /api/v1/configuration/servers/{server_id}` - Get single server
- ✅ `POST /api/v1/configuration/servers` - Create new server
- ⚠️ `PUT /api/v1/configuration/servers/{server_id}` - Update server (known bug)
- ✅ `DELETE /api/v1/configuration/servers/{server_id}` - Delete server
- ✅ `GET /api/v1/configuration/action-mappings` - List all action mappings
- ✅ `GET /api/v1/configuration/action-mappings/{mapping_id}` - Get single mapping
- ✅ `POST /api/v1/configuration/action-mappings` - Create mapping
- ✅ `PUT /api/v1/configuration/action-mappings/{mapping_id}` - Update mapping
- ✅ `DELETE /api/v1/configuration/action-mappings/{mapping_id}` - Delete mapping

**gRPC SDK Bridge**:

- ✅ ConfigurationService implementation
- ✅ SetupClient integration for .set file operations
- ✅ FolderTreeParser for binary configuration parsing
- ✅ FolderTreeWriter for configuration updates
- ✅ CreateServer, UpdateServer, DeleteServer methods
- ✅ CreateActionMapping, UpdateActionMapping, DeleteActionMapping methods
- ✅ ReadConfigurationTree for querying configuration

**Critical Fixes**:

- ✅ **Cascade Deletion Bug**: Fixed deletion order issue (delete in reverse order)
- ✅ **Bool Type Handling**: Proper bool type usage for GeViSet compatibility
- ✅ **Auto-increment Server IDs**: Find highest numeric ID and increment

**Test Scripts**:

- ✅ `comprehensive_crud_test.py` - Full CRUD verification
- ✅ `safe_delete_test.py` - Cascade deletion fix verification
- ✅ `server_manager.py` - Production server management
- ✅ `cleanup_to_base.py` - Configuration reset utility
- ✅ `verify_config_via_grpc.py` - Configuration verification

**Documentation**:

- ✅ `SERVER_CRUD_IMPLEMENTATION.md` - Complete implementation guide
- ✅ `CRITICAL_BUG_FIX_DELETE.md` - Bug analysis and fix documentation
- ✅ Updated spec.md with User Story 12 and functional requirements

**Known
Issues**:

- ⚠️ Server UPDATE method has "Server ID is required" bug (workaround: delete and recreate)

**Checkpoint**: Configuration management complete - can manage G-Core servers and action mappings via API

---

## Phase 14: Polish & Cross-Cutting Concerns

**Purpose**: Improvements that affect multiple user stories

- [ ] T198 [P] Add comprehensive API documentation to all endpoints (docstrings, parameter descriptions)
- [ ] T199 [P] Create architecture diagram in docs/architecture.md with component interaction flows
- [ ] T200 [P] Create SDK integration guide in docs/sdk-integration.md with connection patterns
- [ ] T201 [P] Create deployment guide in docs/deployment.md (Windows Server, Docker, environment setup)
- [ ] T202 [P] Add OpenAPI specification auto-generation from code annotations
- [ ] T203 [P] Implement request/response logging with correlation IDs for debugging
- [ ] T204 [P] Add performance profiling endpoints (debug mode only)
- [ ] T205 [P] Create load testing scripts for concurrent streams and WebSocket connections
- [ ] T206 [P] Implement graceful shutdown handling (close connections, flush logs)
- [ ] T207 [P] Add TLS/HTTPS configuration guide and certificate management
- [ ] T208 [P] Security hardening: Remove stack traces from production errors, sanitize logs
- [ ] T209 [P] Add database connection pooling optimization
- [ ] T210 [P] Implement API response caching for camera lists (Redis cache, 60s TTL)
- [ ] T211 [P] Create GitHub Actions CI/CD pipeline (run tests, build Docker images)
- [ ] T212 [P] Add code coverage reporting (target 80% minimum)
- [ ] T213 Validate quickstart.md by following guide end-to-end
- [ ] T214 Create README.md with project overview, links to documentation
- [ ] T215 Final security audit: Check for OWASP top 10 vulnerabilities

---

## Dependencies & Execution Order

### Phase Dependencies

- **Setup (Phase 1)**: No dependencies - can start immediately
- **Foundational (Phase 2)**: Depends on Setup completion - BLOCKS
all user stories
- **User Stories (Phases 3-13)**: All depend on Foundational phase completion
  - User Story 1 (P1): Authentication - NO dependencies on other stories
  - User Story 2 (P1): Live Streaming - Requires User Story 1 (auth for protected endpoints)
  - User Story 3 (P1): PTZ Control - Requires User Story 1 (auth) and User Story 2 (camera service exists)
  - User Story 4 (P1): Event Notifications - Requires User Story 1 (auth), User Story 2 (camera service)
  - User Story 5 (P2): Recording Management - Requires User Story 1 (auth), User Story 2 (camera service)
  - User Story 6 (P2): Analytics Config - Requires User Story 1 (auth), User Story 2 (camera service), User Story 4 (events)
  - User Story 7 (P2): Multi-Camera - Extends User Story 2 (camera service)
  - User Story 8 (P3): NPR Integration - Requires User Story 4 (events), User Story 6 (analytics)
  - User Story 9 (P3): Video Export - Requires User Story 5 (recording management)
  - User Story 10 (P3): Health Monitoring - Can start after Foundational, but best after all services exist
- **Polish (Phase 14)**: Depends on all desired user stories being complete

### Critical Path (Sequential)

```
Phase 1: Setup
  ↓
Phase 2: Foundational (BLOCKS all user stories)
  ↓
Phase 3: User Story 1 - Authentication (BLOCKS all protected endpoints)
  ↓
Phase 4: User Story 2 - Live Streaming (BLOCKS camera-dependent features)
  ↓
Phase 5: User Story 3 - PTZ Control
  ↓
Phase 6: User Story 4 - Event Notifications (BLOCKS analytics)
  ↓
[Phases 7-12 can proceed in parallel after their dependencies are met]
  ↓
Phase 14: Polish
```

### User Story Dependencies

- **US1 (Authentication)**: No dependencies - can start after Foundational
- **US2 (Live Streaming)**: Depends on US1 completion
- **US3 (PTZ Control)**: Depends on US1, US2 completion
- **US4 (Event Notifications)**: Depends on US1, US2 completion
- **US5 (Recording Management)**: Depends on US1, US2 completion
- **US6 (Analytics Config)**: Depends on US1, US2, US4 completion
- **US7
(Multi-Camera)**: Depends on US2 completion
- **US8 (NPR Integration)**: Depends on US4, US6 completion
- **US9 (Video Export)**: Depends on US5 completion
- **US10 (Health Monitoring)**: Can start after Foundational
- **US12 (Configuration Management)**: ✅ COMPLETED - Depends on Foundational only

### Parallel Opportunities

**Within Phases**:
- Phase 1 (Setup): T004-T010 can run in parallel (all marked [P])
- Phase 2 (Foundational): T014-T015, T019-T021, T023-T024, T028-T029 can run in parallel

**Within User Stories**:
- US1 Tests: T030-T034 can run in parallel
- US1 Models: T035-T036 can run in parallel
- US1 Schemas: T038 independent
- US2 Tests: T049-T055 can run in parallel
- US2 Models: T063-T064 can run in parallel
- US2 Schemas: T066-T067 can run in parallel
- [Similar pattern for all user stories]

**Across User Stories** (if team capacity allows):
- After Foundational completes: US1, US10, US12 can start in parallel
- After US1 completes: US2 can start
- After US2 completes: US3, US4, US5, US7 can start in parallel
- After US4 completes: US6 can start
- After US5 completes: US9 can start
- After US6 completes: US8 can start
- US12 ✅ COMPLETED (Configuration Management)

**Polish Phase**: T198-T212 and T214-T215 are all marked [P] and can run in parallel

---

## Parallel Example: User Story 2 (Live Streaming)

```bash
# Step 1: Write all tests in parallel (TDD - ensure they FAIL)
Task T049: Contract test for GET /cameras
Task T050: Contract test for GET /cameras/{id}
Task T051: Contract test for POST /cameras/{id}/stream
Task T052: Contract test for DELETE /cameras/{id}/stream/{stream_id}
Task T053: Integration test for stream lifecycle
Task T054: Unit test for CameraService (Python)
Task T055: Unit test for CameraService (C#)

# Step 2: Create models in parallel
Task T063: Camera model
Task T064: Stream model

# Step 3: Create schemas in parallel
Task T066: Camera schemas
Task T067: Stream schemas

# Step 4: Implement services sequentially (dependency on models)
Task T068: CameraService (depends on T063, T064)
Task T069: StreamService (depends on T068)

# Step 5: Implement SDK Bridge sequentially
Task T056: CameraService.cs (depends on gRPC proto T047)
Task T059: StreamService.cs (depends on gRPC proto T048)

# Step 6: Implement routers sequentially (depends on services)
Task T071: Cameras router
Task T072: Stream endpoints

# Verify: Run tests T049-T055 - they should now PASS
```

---

## Implementation Strategy

### MVP First (User Stories 1-4 Only)

**Rationale**: US1-US4 are all P1 and deliver core surveillance functionality

1. ✅ Complete Phase 1: Setup
2. ✅ Complete Phase 2: Foundational (CRITICAL - blocks all stories)
3. ✅ Complete Phase 3: User Story 1 (Authentication) - STOP and TEST
4. ✅ Complete Phase 4: User Story 2 (Live Streaming) - STOP and TEST
5. ✅ Complete Phase 5: User Story 3 (PTZ Control) - STOP and TEST
6. ✅ Complete Phase 6: User Story 4 (Event Notifications) - STOP and TEST
7. **STOP and VALIDATE**: Test all P1 stories together as integrated MVP
8. Deploy/demo MVP

**MVP Delivers**:
- ✅ Secure authentication with RBAC
- ✅ Live video streaming from cameras
- ✅ PTZ camera control
- ✅ Real-time event notifications

**Not in MVP** (can add incrementally):
- Recording management (US5)
- Analytics configuration (US6)
- Multi-camera enhancements (US7)
- NPR integration (US8)
- Video export (US9)
- Health monitoring (US10)

### Incremental Delivery (After MVP)

1. **MVP** (US1-4) → Deploy → Validate
2. **+Recording** (US5) → Deploy → Validate
3. **+Analytics** (US6) → Deploy → Validate
4. **+Multi-Camera** (US7) → Deploy → Validate
5. **+NPR** (US8) → Deploy → Validate
6. **+Export** (US9) → Deploy → Validate
7. **+Health** (US10) → Deploy → Validate
8. **+Polish** (Phase 14) → Final Release

Each increment adds value without breaking previous functionality.
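The ordering rules above follow mechanically from the dependency graph. As an illustrative sketch (not part of the task list), the graph transcribed from the "User Story Dependencies" section can be validated with Python's standard-library `graphlib`:

```python
from graphlib import TopologicalSorter

# Story -> prerequisites, transcribed from "User Story Dependencies".
deps = {
    "US1": set(),                   # Authentication - no dependencies
    "US2": {"US1"},                 # Live Streaming
    "US3": {"US1", "US2"},          # PTZ Control
    "US4": {"US1", "US2"},          # Event Notifications
    "US5": {"US1", "US2"},          # Recording Management
    "US6": {"US1", "US2", "US4"},   # Analytics Config
    "US7": {"US2"},                 # Multi-Camera
    "US8": {"US4", "US6"},          # NPR Integration
    "US9": {"US5"},                 # Video Export
    "US10": set(),                  # Health Monitoring
    "US12": set(),                  # Configuration Management (completed)
}

# static_order() raises CycleError on a cyclic graph, so this
# doubles as a sanity check that the plan is actually executable.
order = list(TopologicalSorter(deps).static_order())
position = {story: i for i, story in enumerate(order)}

# Every story must come after all of its prerequisites.
for story, prereqs in deps.items():
    for prereq in prereqs:
        assert position[prereq] < position[story]

print(order)
```

Any order satisfying these constraints is valid; stories that share the same prerequisites (e.g. US3, US4, and US5 once US2 is done) are exactly the parallel opportunities listed above.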
### Parallel Team Strategy

With 3 developers after the Foundational phase completes:

**Week 1-2**: All work on US1 together (foundational for everything)

**Week 3-4**:
- Developer A: US2 (Live Streaming)
- Developer B: Start US4 (Events - can partially proceed)
- Developer C: Setup/tooling improvements

**Week 5-6**:
- Developer A: US3 (PTZ - depends on US2)
- Developer B: Complete US4 (Events)
- Developer C: US5 (Recording)

**Week 7+**:
- Developer A: US6 (Analytics)
- Developer B: US7 (Multi-Camera)
- Developer C: US9 (Export)

---

## Task Summary

**Total Tasks**: 215

**By Phase**:
- Phase 1 (Setup): 10 tasks
- Phase 2 (Foundational): 19 tasks
- Phase 3 (US1 - Authentication): 17 tasks
- Phase 4 (US2 - Live Streaming): 29 tasks
- Phase 5 (US3 - PTZ Control): 15 tasks
- Phase 6 (US4 - Event Notifications): 22 tasks
- Phase 7 (US5 - Recording Management): 19 tasks
- Phase 8 (US6 - Analytics Config): 18 tasks
- Phase 9 (US7 - Multi-Camera): 9 tasks
- Phase 10 (US8 - NPR Integration): 12 tasks
- Phase 11 (US9 - Video Export): 14 tasks
- Phase 12 (US10 - Health Monitoring): 14 tasks
- Phase 13 (US12 - Configuration Management): ✅ COMPLETED (2025-12-16)
- Phase 14 (Polish): 18 tasks

**MVP Tasks** (Phases 1-6): 112 tasks
**Configuration Management**: ✅ Implemented separately (not part of original task breakdown)
**Tests**: 80+ test tasks (all marked TDD - write first, ensure FAIL)
**Parallel Tasks**: 100+ tasks marked [P]

**Estimated Timeline**:
- MVP (US1-4): 8-10 weeks (1 developer) or 4-6 weeks (3 developers)
- Full Feature Set (US1-10 + Polish): 16-20 weeks (1 developer) or 8-12 weeks (3 developers)

---

## Notes

- **[P] tasks**: Different files, no dependencies - safe to parallelize
- **[Story] labels**: Map each task to a specific user story for traceability
- **TDD enforced**: All test tasks MUST be written first and FAIL before implementation
- **Independent stories**: Each user story should be independently completable and testable
- **Commit frequently**: After each task
or logical group
- **Stop at checkpoints**: Validate each story independently before proceeding
- **MVP focus**: Complete US1-4 first for a deployable surveillance system
- **Avoid**: Vague tasks, same-file conflicts, cross-story dependencies that break independence

---

**Generated**: 2025-12-08
**Updated**: 2025-12-16 (Configuration Management implemented)
**Based on**: spec.md (12 user stories), plan.md (tech stack), data-model.md (8 entities), contracts/openapi.yaml (27+ endpoints)