– Rapidly spin up the cloud environment.
– Deliver working historical backfill pipelines from Tardis.dev into a queryable database.
– Deliver a real-time Tardis WebSocket pipeline by Day 60, ensuring data is normalized, cached for live consumption, accurate, replayable, and queryable.
– Ensure all pipelines are idempotent, retryable, and provide exactly-once semantics.
– Implement full CI/CD, Terraform, automated testing, and secrets management.
– Implement proper observability (structured logs, metrics, dashboards, alerting) from day one.
– Provide immediate self-service access to the MVP database for the Trading and BI teams via tools such as Tableau/Metabase, and through simple internal REST APIs.
– Develop specialized time-series data, including a USDe backing-asset series and a full opportunity-surface time series for delta-neutral/lending/borrow opportunities.
– Ingest data from additional sources (Kaiko, CoinAPI, on-chain via TheGraph/Dune).
– Plan for 10x+ data growth via schema evolution, partitioning, and performance tuning.
– Establish enterprise-grade governance, including a data-quality framework, RBAC, audit logs, and a semantic layer.
– Create full architecture documentation, runbooks, and a data dictionary.
– Serve as the firm's go-to data expert, and onboard and mentor future junior data engineers and analysts.
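
The idempotency/exactly-once objective above can be sketched minimally: if every record carries a deterministic natural key, a key-constrained upsert makes replays safe. The sketch below uses SQLite and an illustrative `trades` schema (table name, columns, and sample values are assumptions, not the actual pipeline design); the same pattern applies to any backfill or WebSocket consumer that may be retried.

```python
import sqlite3

def init_db(conn: sqlite3.Connection) -> None:
    # Deterministic natural key: the same trade always maps to the same row.
    conn.execute(
        """CREATE TABLE IF NOT EXISTS trades (
               exchange TEXT NOT NULL,
               symbol   TEXT NOT NULL,
               trade_id TEXT NOT NULL,
               ts       INTEGER NOT NULL,  -- epoch microseconds
               price    REAL NOT NULL,
               size     REAL NOT NULL,
               PRIMARY KEY (exchange, symbol, trade_id)
           )"""
    )

def ingest(conn: sqlite3.Connection, batch: list[dict]) -> None:
    # INSERT OR IGNORE skips rows whose key already exists, so re-running
    # the same batch (retry, replay, backfill overlap) never duplicates data.
    conn.executemany(
        """INSERT OR IGNORE INTO trades
               (exchange, symbol, trade_id, ts, price, size)
           VALUES (:exchange, :symbol, :trade_id, :ts, :price, :size)""",
        batch,
    )
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    init_db(conn)
    batch = [
        {"exchange": "deribit", "symbol": "BTC-PERPETUAL", "trade_id": "t1",
         "ts": 1, "price": 60000.0, "size": 0.5},
        {"exchange": "deribit", "symbol": "BTC-PERPETUAL", "trade_id": "t2",
         "ts": 2, "price": 60001.0, "size": 1.0},
    ]
    ingest(conn, batch)
    ingest(conn, batch)  # replay the same batch: no duplicates
    print(conn.execute("SELECT COUNT(*) FROM trades").fetchone()[0])  # 2
```

This gives exactly-once semantics at the table level even though delivery upstream is at-least-once; a production variant would likewise key live WebSocket messages on exchange-assigned IDs or sequence numbers.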