Work

Selected projects from five years of building data systems across fleet analytics, construction compliance, financial reconciliation, and full-stack business applications.

Most data problems aren't dashboard problems. They're pipeline problems. The data is messy, the structure doesn't exist, the reports don't connect to decisions, and nobody owns the full picture.

I do.

Every project below represents a stage of the data lifecycle I've built for real clients — from raw cleanup to production-grade systems. Together, they're proof that I don't just build dashboards. I solve the whole problem.

Data Cleanup + Reporting

Boost Payment Solutions

Client
Internal — customer database migration
Company
Boost Payment Solutions, New York, NY
Period
June 2019 – August 2019
Role
Database Management Intern — CRM migration and data cleanup

Problem

A payment solutions company was migrating its customer database from a legacy CRM to Salesforce. The existing data was full of duplicate records, outdated entries, and inconsistent formatting — migrating it as-is would have poisoned the new system from day one.

Approach

I cleaned and prepared the full customer dataset for migration. I built Excel VBA scripts to construct parallel spreadsheets that cross-verified migration accuracy record by record. I systematically flagged outdated and duplicate records for removal before transfer, making sure only verified data entered the new system. I also established documentation protocols for ongoing data quality management.
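The original verification scripts were written in Excel VBA, but the core idea translates to a short sketch: normalize a key field, flag duplicates and stale records before transfer, then cross-check that every clean source record landed in the target system. The field names (`email`, `last_updated`) and the staleness cutoff here are hypothetical, chosen only to illustrate the record-by-record approach.

```python
# Illustrative sketch of pre-migration cleanup and record-by-record
# cross-verification. The original work used Excel VBA; field names
# and the cutoff date are hypothetical.
from datetime import date

def flag_records(records, cutoff=date(2018, 1, 1)):
    """Split records into clean vs. flagged (duplicate or outdated)."""
    seen = set()
    clean, flagged = [], []
    for rec in records:
        key = rec["email"].strip().lower()   # normalize the match key
        if key in seen:
            flagged.append((rec, "duplicate"))
        elif rec["last_updated"] < cutoff:
            flagged.append((rec, "outdated"))
        else:
            seen.add(key)
            clean.append(rec)
    return clean, flagged

def verify_migration(source, target):
    """Return clean source records missing from the target system."""
    target_keys = {r["email"].strip().lower() for r in target}
    return [r for r in source if r["email"].strip().lower() not in target_keys]
```

The key point is that verification runs against the cleaned set, not the raw export, so only records that were supposed to migrate are checked.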

Result

Successful data migration to Salesforce with verified accuracy. Prevented legacy data quality issues from carrying into the new CRM. Clean data from day one.

Tech Stack
Excel VBA · Salesforce (target CRM)

[Visuals, 1–2 images: Before/After data quality infographic showing cleanup metrics (sample data)]
Representative example built with sample data. Original client work is under NDA.
SQL Query Development + Financial Reconciliation & Reporting

Orion Fleet Intelligence

Client
Fleet management and insurance analytics clients
Company
Orion Fleet Intelligence, Conshohocken, PA
Period
March 2021 – March 2023
Role
Data Analyst II — SQL analytical layer development

Problem

Fleet management clients were overpaying on insurance because insurers had no visibility into granular driver behavior and telematics data. Internally, the analytics team spent 10+ hours per week manually generating client-specific reports — mileage breakdowns, diagnostics summaries, risk profiles — because existing tooling couldn't handle the query complexity at scale.

Approach

I developed the SQL analytical layer powering fleet telematics reporting — driver behavior scoring, mileage analysis, and diagnostics summaries. These scripts fed through SSIS pipelines into the Orion FI Dashboard, the client-facing reporting interface. I introduced self-service reporting parameters so clients could generate their own filtered reports without waiting on internal analysts. I also maintained data integrity across multiple insurance and fleet data systems through automated ETL scripts.
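The self-service idea can be sketched as a thin parameter layer: clients choose from a whitelist of metrics and filters, and the analytical layer assembles a parameterized query for the reporting pipeline. This is a minimal illustration, not the production schema; the table and column names (`telematics_events`, `miles`, `idle_minutes`) are assumptions.

```python
# Hedged sketch of self-service report parameters: user input selects a
# whitelisted metric, values are bound as query parameters, never
# interpolated into SQL. Schema names are illustrative.

ALLOWED_METRICS = {
    "mileage": "SUM(miles)",
    "idle_time": "SUM(idle_minutes)",
}

def build_fleet_report_query(metric, fleet_id, start, end):
    """Return (sql, params) for a per-driver fleet report."""
    if metric not in ALLOWED_METRICS:
        raise ValueError(f"unsupported metric: {metric}")
    sql = (
        f"SELECT driver_id, {ALLOWED_METRICS[metric]} AS value "
        "FROM telematics_events "
        "WHERE fleet_id = ? AND event_date BETWEEN ? AND ? "
        "GROUP BY driver_id"
    )
    return sql, (fleet_id, start, end)
```

Whitelisting the metric expression while binding everything else as parameters is what makes the layer safe to expose directly to client-facing tooling.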

Result

$100K+ in annual client savings through data-driven risk profiling. 10+ hours per week of internal workload eliminated through self-service reporting. Clients went from waiting on analysts to running their own reports.

Tech Stack
SQL Server · SSIS · Orion FI Dashboard · Google Sheets/Excel

[Visuals, 3 images: Fleet Overview, Driver Behavior Analytics, Insurance Impact (Power BI mockup with sample data)]
Representative example built with sample data. Original client work is under NDA.
Dashboard Build + SQL Query Development

EOS Group — Construction BI Dashboards

Client
Construction-sector project teams and executives
Company
EOS Group Inc., Remote, US
Period
March 2023 – March 2024
Role
Business and Data Analyst II — SQL procedures + dashboard development

Problem

A construction-sector company needed real-time visibility into project viability metrics — lead times, key quantities, cost estimates — but data was siloed across systems with no automated way to surface decision-ready insights. Compliance tracking and performance monitoring lagged behind actual project status, creating risk in project approvals.

Approach

I created SQL stored procedures that automated viability calculations and compliance checks, feeding directly into Navigator, a custom C# application used by project teams for daily decision-making. I built a compliance architecture from scratch for First Energy: multi-factor analysis covering equipment availability, crew scheduling, and third-party supply chain data. I also built an interactive admin dashboard that let administrators trigger specific stored procedures based on user input, streamlining communication between junior staff and senior decision-makers with real-time project status updates. I implemented automated Python scripts to scrape supply chain data, letting clients preempt material shortages. Finally, I authored comprehensive documentation so both technical and non-technical teams could adopt and maintain the systems.
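The multi-factor compliance logic can be illustrated with a small sketch: each factor must pass before a project is marked viable, and failures are reported with reasons so executives see why a project is blocked. The production checks lived in SQL stored procedures; the factor names and threshold below are hypothetical.

```python
# Illustrative multi-factor viability check, loosely modeled on the
# compliance architecture described above. Factors and the lead-time
# threshold are hypothetical stand-ins for the real inputs.

def project_viability(equipment_ok, crew_scheduled, supply_lead_days,
                      max_lead_days=30):
    """Return (viable, reasons). All factors must pass for approval."""
    reasons = []
    if not equipment_ok:
        reasons.append("equipment unavailable")
    if not crew_scheduled:
        reasons.append("crew not scheduled")
    if supply_lead_days > max_lead_days:
        reasons.append(
            f"supply lead time {supply_lead_days}d exceeds {max_lead_days}d"
        )
    return (not reasons), reasons
```

Collecting every failing factor, rather than stopping at the first, is what makes the output useful for status dashboards: the gap between project reality and executive visibility closes in one query.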

Result

Real-time project viability data replaced manual reporting workflows. Automated compliance monitoring reduced the gap between what was happening on projects and what executives could see. The compliance architecture was adopted by additional clients including Eversource.

Tech Stack
SQL Server · C# (Navigator) · Python · Live Server Integration

[Visuals, 2 images: Project Viability Overview, Compliance & Performance (Power BI mockup with sample data)]
Representative example built with sample data. Original client work is under NDA.
Full System Build (Python + SQL)

Timber Center IMS (~95% complete)

Client
Zimbabwean import/export timber business
Company
InterConsult Zimbabwe (Pvt) Ltd
Period
2024 – Present
Role
Sole developer — scoped, designed, and built end-to-end

Problem

A Zimbabwean timber import/retail business was running entirely on handwritten records. Inventory tracking, procurement, pricing, and financial reconciliation were all manual. The business had no visibility into stock levels, no way to trace transactions, and reconciliation took days of paper-shuffling. The system needed to work reliably in a cash-based economy with intermittent connectivity.

Approach

I designed and built a comprehensive retail management system from scratch. The architecture covers inventory tracking, procurement workflows, point-of-sale, and financial reconciliation — all running on an 88-table SQL Server database with a PyQt5 desktop interface. The system supports multi-currency transactions (USD/ZiG with live exchange rates), barcode scanning with 80ms keystroke detection, vendor management, stock transfers, low-stock alerts, and a full accounting module with double-entry GL journal entries across six templates. I migrated historical paper records into the structured database and designed the whole thing to function offline-first, syncing when connectivity is available.
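The invariant behind the double-entry GL module is simple to sketch: every journal entry must net to zero per currency before it posts, which is what makes USD/ZiG reconciliation traceable. This is a minimal illustration of that rule, not the production module; the account names and line format are assumptions.

```python
# Minimal sketch of the double-entry posting rule: debits are positive,
# credits negative, and each currency in an entry must balance to zero.
# Account names and amounts are illustrative.
from decimal import Decimal

def post_journal_entry(lines):
    """lines: list of (account, currency, amount) tuples.
    Raises ValueError if any currency's lines do not net to zero."""
    totals = {}
    for account, currency, amount in lines:
        totals[currency] = totals.get(currency, Decimal("0")) + Decimal(str(amount))
    unbalanced = {c: t for c, t in totals.items() if t != 0}
    if unbalanced:
        raise ValueError(f"unbalanced entry: {unbalanced}")
    return lines
```

Using `Decimal` rather than floats keeps the zero-balance check exact, which matters when the same entry mixes currencies at live exchange rates.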

Result

Metrics pending; to be captured after the project ships: time saved on reconciliation, error reduction, decision-making improvements, and user adoption rates.

Tech Stack
Python · PyQt5 · SQL Server (88 tables) · SQLAlchemy · Excel Export · Offline-capable

[Visuals, 3–5 images: main dashboard, inventory view, transaction flow, reconciliation view, database schema]
Representative example built with sample data. Original client work is under NDA.

Every business has data. Not every business has someone who can take that data from a mess on a spreadsheet to a system that runs itself. That's what I do — and every project above is proof of a different part of that process.

Have a data problem? Let's figure out where in the pipeline you're stuck.