# STATEMENT OF WORK (SOW)
## Project: TaxNGOFund – Backend, Services & Infrastructure Implementation
Client: SKFO / TaxNGOFund Platform
Contract Type: Milestone-Based (Fixed Price)
---
## 1. Project Overview
TaxNGOFund is a multi-tenant SaaS platform for:
- Healthcare claims analytics
- NGO / fund / tax automation
- Publisher marketplace and revenue-sharing
- Developer API platform
- Fraud detection and cryptographic audit ledger
The client (SKFO) has already completed:
- Frontend (React / Tailwind / shadcn UI)
- API contracts and endpoint specifications
- SQL schema and Alembic migration specifications
- Publisher Economics engine design (revenue split and payout logic)
- Fraud detection MVP design (rules + DAG outline)
- Cryptographic ledger design (hash-chain + Merkle roots)
- Terraform layout for AWS environments
- CI/CD workflow design
This SOW covers the **implementation** of the backend services, business logic, Airflow DAGs, AWS infrastructure, and deployment pipeline, based on the existing specifications.
---
## 2. Work Already Completed (Provided by Client)
The following artifacts will be provided by SKFO:
- React frontend for:
- Marketplace
- Publisher portal
- Developer portal
- Ops dashboard
- Licenses pages
- API endpoint contracts (request/response shapes)
- Logical and physical SQL schema design
- Alembic migration plan/specification
- Publisher Economics engine pseudo-SQL and flow diagrams
- Fraud detection MVP ruleset and DAG outline
- Cryptographic ledger model:
- Transaction tables
- Hash chaining
- Merkle root design
- Terraform module structure for:
- VPC / networking
- ECS
- RDS
- S3 / SQS
- MWAA (Airflow)
- CI/CD workflow design (GitHub Actions)
The developer’s responsibility is **implementation**, not redesign.
---
## 3. Milestone Structure (Required for All Bids)
All bids must follow this **three-milestone structure**.
- Placeholder or lump-sum bids will NOT be considered.
- Each milestone must have a specific price and delivery time (in days).
---
## Milestone 1 – Backend Services and Core Business Logic
### 3.1 Scope
Implement all backend APIs and core business logic as per the existing contracts and specifications. No frontend changes are allowed.
### 3.2 API Endpoints
Implement the following endpoints (FastAPI or equivalent stack):
- `/api/v1/auth/me`
- OIDC/session-based user identity resolution
- Marketplace:
- `GET /api/v1/marketplace/modules`
- `GET /api/v1/marketplace/modules/{slugOrId}`
- Publisher Economics:
- `GET /api/v1/publishers/{id}/earnings`
- `GET /api/v1/publishers/{id}/settlements`
- Developer Platform:
- `GET /api/v1/dev/api-keys`
- `POST /api/v1/dev/api-keys`
- `POST /api/v1/dev/api-keys/{id}/revoke`
- `GET /api/v1/dev/usage-summary`
- Ops:
- `GET /api/v1/ops/kpis`
- Licenses:
- `GET /api/v1/tenants/{tenantId}/licenses`
All responses must match the existing frontend/API contracts exactly.
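
As a rough illustration of the contract-first approach expected here, the sketch below stubs out the two marketplace endpoints with FastAPI. The schema fields, router layout, and the `get_session` dependency are placeholders assumed for illustration only; the provided API contracts remain the source of truth for request/response shapes.

```python
# Illustrative sketch only: field names, the MarketplaceModuleOut schema, and the
# get_session dependency are placeholders, not part of the provided contracts.
from typing import List, Optional

from fastapi import APIRouter, Depends, HTTPException
from pydantic import BaseModel

router = APIRouter(prefix="/api/v1/marketplace")


class MarketplaceModuleOut(BaseModel):
    id: str
    slug: str
    name: str
    price_cents: int


async def get_session():
    # Placeholder dependency; the real service would yield a SQLAlchemy session.
    yield None


@router.get("/modules", response_model=List[MarketplaceModuleOut])
async def list_modules(session=Depends(get_session)) -> List[MarketplaceModuleOut]:
    # Query the modules table and return rows shaped exactly as the frontend expects.
    return []


@router.get("/modules/{slug_or_id}", response_model=MarketplaceModuleOut)
async def get_module(slug_or_id: str, session=Depends(get_session)) -> MarketplaceModuleOut:
    # Resolve by slug first, then by id, per the {slugOrId} contract.
    module: Optional[MarketplaceModuleOut] = None  # database lookup goes here
    if module is None:
        raise HTTPException(status_code=404, detail="Module not found")
    return module
```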
### 3.3 Business Logic Components
Implement the following:
1. **Publisher Economics Engine**
- Revenue split calculation based on:
- Module sales
- Discounts
- Overrides
- Publisher share percentages
- Settlement line creation into settlement tables
- Payout batching logic ready for use by Airflow DAGs
- Single-currency MVP implementation (multi-currency support is deferred to Phase 2, but the design must not preclude it)
2. **Developer Platform**
- API key lifecycle:
- Secure key generation
- Hashing keys at rest
- Revocation
- Usage aggregation:
- Daily usage summary
- Error rate statistics
3. **Fraud Detection MVP**
- Rule-based fraud scoring engine (no ML required for MVP)
- Writing `fraud_scores` and `fraud_flags` linked to claims/entities
4. **Cryptographic Ledger Core**
- Hash-chaining of ledger entries using `prev_hash`
- Merkle root generation for batches of transactions
- Basic proof verification functions
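
To make the ledger expectations concrete, here is a minimal sketch of hash chaining and Merkle-root computation. SHA-256, the JSON canonicalisation, and the duplicate-last-leaf rule are assumptions for illustration; the provided cryptographic ledger design governs the actual field layout and hashing rules.

```python
# A minimal sketch of hash chaining and Merkle-root computation for the ledger.
# SHA-256, the JSON canonicalisation, and the duplicate-last-leaf rule are
# assumptions for illustration; the provided ledger design is authoritative.
import hashlib
import json
from typing import List, Optional


def entry_hash(payload: dict, prev_hash: Optional[str]) -> str:
    """Hash a ledger entry together with the previous entry's hash (prev_hash)."""
    body = json.dumps({"prev_hash": prev_hash, "payload": payload}, sort_keys=True)
    return hashlib.sha256(body.encode("utf-8")).hexdigest()


def merkle_root(leaf_hashes: List[str]) -> str:
    """Fold a batch of entry hashes into a single Merkle root."""
    if not leaf_hashes:
        return hashlib.sha256(b"").hexdigest()
    level = list(leaf_hashes)
    while len(level) > 1:
        if len(level) % 2 == 1:
            level.append(level[-1])  # duplicate the last leaf on odd-sized levels
        level = [
            hashlib.sha256((level[i] + level[i + 1]).encode("utf-8")).hexdigest()
            for i in range(0, len(level), 2)
        ]
    return level[0]


def verify_chain(entries: List[dict]) -> bool:
    """Re-derive each stored hash from its payload and prev_hash; any mismatch breaks the chain."""
    prev: Optional[str] = None
    for entry in entries:
        if entry_hash(entry["payload"], prev) != entry["hash"]:
            return False
        prev = entry["hash"]
    return True
```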
### 3.4 Data Layer
- Implement SQLAlchemy models consistent with provided schema.
- Implement Alembic migrations consistent with the provided migration plan.
- Ensure all models and migrations are tested and align with the existing design.
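
A minimal sketch of the expected SQLAlchemy 2.x declarative style is shown below. The table and column names are placeholders; the provided schema and Alembic migration plan are authoritative.

```python
# Placeholder models: names and columns must be replaced with the schema provided
# by SKFO. Shown only to indicate the expected SQLAlchemy 2.x declarative style.
from datetime import datetime

from sqlalchemy import DateTime, ForeignKey, Numeric, String
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column


class Base(DeclarativeBase):
    pass


class Publisher(Base):
    __tablename__ = "publishers"

    id: Mapped[int] = mapped_column(primary_key=True)
    name: Mapped[str] = mapped_column(String(255))


class SettlementLine(Base):
    __tablename__ = "settlement_lines"

    id: Mapped[int] = mapped_column(primary_key=True)
    publisher_id: Mapped[int] = mapped_column(ForeignKey("publishers.id"), index=True)
    amount: Mapped[float] = mapped_column(Numeric(12, 2))
    currency: Mapped[str] = mapped_column(String(3), default="USD")
    status: Mapped[str] = mapped_column(String(20), default="unpaid")
    created_at: Mapped[datetime] = mapped_column(DateTime, default=datetime.utcnow)
```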
### 3.5 Acceptance Criteria (Milestone 1)
- All endpoints return valid responses and the existing frontend works without modification.
- Revenue split and payout logic produce correct results for test data.
- Fraud scores and flags are written correctly as per rules.
- Ledger insertions are hash-chained and Merkle roots can be computed.
- All models and migrations run successfully on a fresh database.
---
## Milestone 2 – Airflow DAGs and Batch Processing
### 4.1 Scope
Implement all batch workflows using Airflow (MWAA-compatible), integrating them with the backend/data layer implemented in Milestone 1.
### 4.2 DAGs to Implement
1. **Daily Revenue Split DAG**
- Reads module sales for the period
- Executes revenue split logic
- Writes settlement lines
- Must be idempotent (safe to re-run for a given date); a skeleton illustrating this pattern is sketched after this list
2. **Monthly Payout Batch DAG**
- Groups unpaid settlement lines into payout batches
- Creates or updates payout batch records
- Manages status transitions (e.g., pending → processing → completed)
3. **Daily Fraud Scoring DAG**
- Executes the rule-based fraud scoring engine
- Writes fraud scores and flags for eligible claims
- Supports backfilling for historical ranges
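
As an indication of the expected idempotency pattern, a skeleton of the Daily Revenue Split DAG is sketched below using the Airflow 2 TaskFlow API. The DAG id, schedule, and the commented engine call are assumptions; the provided DAG outline remains authoritative, and the payout and fraud-scoring DAGs would follow the same shape.

```python
# Skeleton only: dag_id, schedule, and the commented engine call are illustrative
# assumptions. The key point is delete-then-recompute idempotency per logical date.
import pendulum
from airflow.decorators import dag, task


@dag(
    dag_id="daily_revenue_split",
    schedule="@daily",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
)
def daily_revenue_split():
    @task
    def split_revenue(ds=None) -> int:
        # Idempotency: remove (or upsert over) any settlement lines already written
        # for this logical date before recomputing, so re-runs and backfills are safe.
        # e.g. run_revenue_split(period=ds)  # Milestone 1 engine call goes here
        return 0

    split_revenue()


daily_revenue_split()
```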
### 4.3 Monitoring and Logging
- Ensure DAG runs are logged with sufficient detail for debugging.
- Configure basic monitoring and alerting for failed tasks (CloudWatch alarms or Airflow failure callbacks).
### 4.4 Integration
- DAGs must connect to the same database and services as Milestone 1.
- DAGs must respect environment configuration (dev and staging).
### 4.5 Acceptance Criteria (Milestone 2)
- All DAGs deploy and run successfully in MWAA for dev and staging.
- DAGs process sample test data correctly and idempotently.
- Backfill operations work as expected.
- Logs provide sufficient detail to identify failures.
---
## Milestone 3 – AWS Infrastructure, Terraform, ECS Deployment and CI/CD
### 5.1 Scope
Implement AWS infrastructure using Terraform, deploy all services to ECS, wire MWAA, and configure CI/CD via GitHub Actions.
### 5.2 Terraform Infrastructure
Implement Terraform modules and apply them to create:
- VPC and networking (subnets, route tables, security groups)
- ECS cluster and ECS services for:
- Core backend API
- Publisher Economics service (if separated)
- Fraud service (if separated)
- Ledger service (if separated)
- Developer platform service (if separated)
- RDS Postgres instance
- S3 buckets for:
- Application assets / data as required
- Terraform state (if specified)
- SQS queues as per design
- MWAA (Managed Airflow) environment
- Remote state configuration (S3 + DynamoDB locking) if not already done
### 5.3 Environments
- Deploy infrastructure to at least:
- Development environment (dev)
- Staging environment (staging)
The same Terraform code must be able to provision a production environment, even if production is not deployed during this milestone.
### 5.4 CI/CD (GitHub Actions)
- Configure GitHub Actions workflows to:
- Run tests
- Build Docker images
- Push images to ECR
- Run database migrations (Alembic)
- Update ECS task definitions and services for:
- Dev environment
- Staging environment
### 5.5 Verification
- Verify that:
- ECS services are up and reachable.
- RDS is connected and serving the backend.
- MWAA can run the DAGs developed in Milestone 2.
- Logs and metrics are visible in CloudWatch.
- Provide basic runbooks for deployment and rollback steps.
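
The kind of post-deploy check meant by "ECS services are up and reachable" could look like the boto3 snippet below; the cluster and service names are placeholders, and the exact checks will follow the agreed runbooks.

```python
# Hedged illustration of a post-deploy health check; cluster/service names are placeholders.
import boto3


def ecs_services_healthy(cluster: str, services: list) -> bool:
    """Return True when every listed service has as many running tasks as desired."""
    ecs = boto3.client("ecs")
    resp = ecs.describe_services(cluster=cluster, services=services)
    return all(s["runningCount"] == s["desiredCount"] for s in resp["services"])


if __name__ == "__main__":
    print(ecs_services_healthy("taxngofund-dev", ["backend-api"]))
```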
### 5.6 Acceptance Criteria (Milestone 3)
- `terraform apply` completes successfully for dev and staging.
- GitHub Actions can fully deploy code changes to dev and staging.
- All services run successfully in ECS using the Terraform-managed infrastructure.
- DAGs run and interact with ECS/RDS as expected.
---
## 6. Deliverables Summary
- Fully implemented backend services (Milestone 1)
- Functional Airflow DAGs and batch jobs (Milestone 2)
- AWS infrastructure, ECS deployment, and CI/CD pipeline (Milestone 3)
- Documentation and runbooks for:
- Local development
- Deployment to dev and staging
- Running and monitoring DAGs
---
## 7. Payment Terms
- Payments are tied to successful completion and acceptance of each milestone.
- Each milestone is considered complete when all acceptance criteria listed above are met.
---
## 8. Proposal Requirements (For Guru Bidders)
To be considered, each bid must:
1. Follow the **three-milestone structure** given above.
2. Provide a **fixed price for each milestone** (no placeholder or lump-sum bids).
3. Provide an **estimated delivery time (in days) for each milestone**.
4. Confirm experience with:
- Python / FastAPI (or similar)
- SQLAlchemy, Alembic, PostgreSQL
- AWS (ECS, RDS, S3, SQS, MWAA)
- Terraform (modular design)
- Airflow DAGs
- GitHub Actions (or similar CI/CD)
5. Confirm that the developer will follow existing specs and API contracts without modifying the frontend.
Bids that do not meet these requirements may be rejected without further review.
---
## 9. Communication and Reporting
- Regular status updates (at least weekly).
- All work to be delivered via GitHub pull requests.
- All infrastructure changes via Terraform (version-controlled).
- Issues and blockers must be communicated early.
---
## 10. Intellectual Property
All work produced under this SOW is considered Work-For-Hire.
All code, infrastructure definitions, and related documentation become the exclusive property of **SKFO / TaxNGOFund Platform** upon payment.