Roles that connect engineering execution with measurable business decisions.
The page is intentionally structured around what was built, which workflows it improved, and why the work mattered. The goal is to make the experience legible as evidence, not just chronology.
Current framing
Balanced range, with the strongest proof in reliable systems.
The roles span data engineering, analytics engineering, operational reporting, and product measurement. The common thread is reliability under ambiguity: making complex workflows measurable and usable for the people who depend on them.
How I work
Strongest when the process is messy and the decision matters.
I tend to gravitate toward work that sits between raw systems, business definitions, and operational edge cases. That is usually where better data handling, better metrics, and better coordination create the biggest lift.
GTM Data Engineer
Builds GTM data systems that turn fragmented operational signals into reliable revenue, lifecycle, and reporting workflows.
- Unified CRM, GA4, product, billing, and support data into a governed Customer 360 model that raised attributable revenue coverage from 48% to 86%.
- Designed attribution logic and lifecycle reporting that reduced reporting lag from 6 days to 1 day for GTM stakeholders.
- Introduced dbt tests, Airflow monitoring, lineage, and CI patterns that pushed pipeline reliability to 99.6% and reduced bad-data incidents downstream.
Data Analyst Intern
Replaced manual reporting with warehouse-backed operational dashboards that made logistics KPIs easier to trust, monitor, and act on.
- Consolidated 20+ conflicting business metrics into a governed KPI layer for product, operations, and finance stakeholders.
- Built dashboards and QA workflows that sharply reduced manual reporting time while improving executive adoption and trust.
- Improved refresh performance and reporting reliability through star-schema modeling, Azure SQL, and targeted DAX optimization.
Brokerage Data Administrator
Worked closely with messy operational documents, OCR-assisted extraction, and compliance workflows, building the judgment layer between raw inputs and reliable downstream records.
- Supported OCR- and NLP-assisted extraction workflows for customs brokerage inputs with validation, routing, and human review for low-confidence outputs.
- Improved operational transparency by standardizing records used for compliance tracking, backlog visibility, and downstream reporting.
- Helped turn fragile intake processes into more reliable datasets and review workflows for high-risk operational work.
Business Analyst Intern, Product Analytics
Built product analytics and retention workflows that connected event data, experimentation, and predictive scoring to better product decisions.
- Built reusable funnel and experimentation readouts that helped identify onboarding friction and improve activation and completion metrics.
- Delivered predictive retention workflows that prioritized high-risk cohorts and linked model outputs to intervention decisions.
- Improved analytics reliability with structured Snowflake refresh patterns and decision-ready reporting for product stakeholders.
Venture Capital Intern
Applied structured analytics, prospect scoring, and lightweight experimentation to improve sourcing quality and decision readiness.
- Doubled weekly qualified leads by automating prospect list building and testing outreach variants with simple but disciplined measurement loops.
- Translated ambiguous commercial questions into structured reporting, research, and prioritization workflows for leadership.
- Used lightweight analytics to improve both decision speed and the quality of follow-up actions.
Why the page is organized this way
Role titles matter less than the system patterns they reveal.
The earlier site leaned too much on isolated project references. This version uses the work history to show a stronger progression: from operational inputs and document-heavy workflows, to KPI systems, to GTM warehouse modeling and more mature reliability patterns.