Career Narrative
From audit assertions to data certification.
The same intellectual framework — verify what institutions claim about their data — applied with increasing scope and technical depth across four roles.
Data Orchestration & Governance Lead
Designed and built the end-to-end data collection orchestration and quality certification platform for private market data. The system handles counterparty data requests, ingestion across 20+ sources (SaaS platforms, Excel, PDFs, database connections), submission tracking, and multi-gate quality certification down to the individual data point.
Started from 22 isolated SQL boolean checks with no remediation workflows. Built a vendor-neutral framework flexible enough to run on Great Expectations, Pandera, Precisely, Informatica, or Collibra. Now serves 6 departments and 100+ direct and downstream users, including data stewards and owners across the organisation.
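The core idea of vendor neutrality can be sketched in a few lines: checks are defined once as declarative rules, then compiled to whichever engine executes them. This is an illustrative sketch only, not the production framework; all names (Check, to_sql, to_great_expectations) are hypothetical, though the two expectation names are real Great Expectations expectations.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Check:
    """One declarative quality check, independent of any execution engine."""
    column: str
    rule: str        # e.g. "not_null", "unique"
    dimension: str   # data quality dimension, e.g. "completeness"

def to_sql(check: Check, table: str) -> str:
    """Compile the check to a plain SQL boolean predicate."""
    predicates = {
        "not_null": f"{check.column} IS NOT NULL",
        "unique": f"COUNT(*) = COUNT(DISTINCT {check.column})",
    }
    return f"SELECT {predicates[check.rule]} AS passed FROM {table}"

def to_great_expectations(check: Check) -> str:
    """Map the same rule to a Great Expectations expectation name."""
    names = {
        "not_null": "expect_column_values_to_not_be_null",
        "unique": "expect_column_values_to_be_unique",
    }
    return names[check.rule]

check = Check(column="nav", rule="not_null", dimension="completeness")
print(to_sql(check, "fund_valuations"))
# SELECT nav IS NOT NULL AS passed FROM fund_valuations
print(to_great_expectations(check))
# expect_column_values_to_not_be_null
```

The design choice is that the rule, not the engine, is the unit of governance: the same Check object carries its quality dimension through ingestion, certification, and remediation regardless of which backend runs it.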
Lead a mixed team of engineers, business analysts, QA, and technical support. Currently implementing Gherkin-based test specifications, systematic pytest workflows, and Datadog monitoring. Scaling the framework from private markets to public markets.
Automation Engineering & Digital Twin PM
Built automation infrastructure and operational dashboards using Power Automate and Power BI for finance and sales functions. Project-managed the implementation of a Finance Command Center / Digital Twin, a system mirroring operational reality for monitoring and executive decision-making.
This role developed the capability to translate business requirements into working technical infrastructure at enterprise scale — a skill that directly enabled the architectural work that followed.
Financial Data Calculations — Fundamental Analysis
Managed the calculation logic powering Bloomberg Fundamental Analysis, one of Bloomberg's most-used products. Led the Asia region implementation of calculations arising from the convergence of US GAAP and IFRS accounting standards.
First deep exposure to data cataloguing and data quality at scale. At Bloomberg, data quality is existential — when calculations are wrong, portfolio managers make bad investment decisions. This set the standard for what institutional commitment to data integrity looks like.
Audit · Chartered Accountant
Trained and qualified as a Chartered Accountant. Developed the foundational controls mindset through risk-based auditing, COSO framework application, sampling methodology, and audit assertions — completeness, existence, accuracy, valuation, rights and obligations.
These audit assertions map almost directly onto data quality dimensions: completeness, validity, accuracy, consistency, timeliness. The same intellectual framework, translated into a different medium. This is where the instinct for verification — the understanding that institutional claims about data require independent testing — originated.
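The translation can be made concrete as a lookup table. The pairings below are a rough analogy for illustration, not a formal standard; the function name dq_dimension is hypothetical.

```python
# Illustrative mapping from audit assertions to the data quality
# dimension each most closely corresponds to (rough analogy only).
ASSERTION_TO_DQ_DIMENSION = {
    "completeness": "completeness",          # nothing omitted from the population
    "existence": "validity",                 # recorded items are real / permissible
    "accuracy": "accuracy",                  # values are correct
    "valuation": "accuracy",                 # amounts stated at the right value
    "rights and obligations": "consistency", # data agrees with authoritative records
}

def dq_dimension(assertion: str) -> str:
    """Return the closest data quality dimension for an audit assertion."""
    return ASSERTION_TO_DQ_DIMENSION[assertion]

print(dq_dimension("existence"))
# validity
```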
Qualifications & Tools
Chartered Accountant (CA) · Python · SQL · Great Expectations · Pandera · Power Platform · Docker · Snowflake · Collibra · Informatica · Precisely · Git · Linux · Datadog · Gherkin/BDD
Vendor-neutral by design. The framework I build can run on open-source tooling or enterprise platforms — the methodology is independent of any single technology stack.