I help businesses design, build, and optimize scalable, high-performance data solutions. With 4 years of experience in the telecom and financial domains, I specialize in:
Core Services
Data Pipeline Development: Building robust ETL/ELT workflows using Azure Data Factory, Databricks, and PySpark
Data Warehousing: Designing and optimizing Snowflake schemas, queries, and performance tuning
Big Data Processing: Efficiently handling large datasets with Apache Spark (batch & streaming)
Cloud Integration: Delivering secure, end-to-end Azure-based solutions
Data Transformation & Modeling: Creating fact/dimension tables and summary layers for reporting (Power BI/Tableau)
API Integration: Fetching and processing external data via REST APIs in Databricks/ADF
CI/CD for Data Workflows: Implementing Git-based deployment pipelines for ADF and Databricks
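As a small illustration of the transformation-and-modeling work listed above, here is a minimal plain-Python sketch of rolling raw events into a daily summary (fact) layer. The schema and field names are hypothetical; a production version would run as PySpark in Databricks rather than pure Python:

```python
from collections import defaultdict

# Hypothetical raw usage records; in production these would arrive via an
# ADF/Databricks ingestion step, not an inline list.
raw_events = [
    {"customer_id": "C1", "date": "2024-01-01", "data_mb": 120.0},
    {"customer_id": "C1", "date": "2024-01-01", "data_mb": 80.0},
    {"customer_id": "C2", "date": "2024-01-01", "data_mb": 300.0},
]

def build_daily_summary(events):
    """Aggregate raw events into a daily summary keyed by
    (customer_id, date) — the shape a BI tool such as Power BI reads."""
    totals = defaultdict(float)
    for e in events:
        totals[(e["customer_id"], e["date"])] += e["data_mb"]
    return [
        {"customer_id": cid, "date": d, "total_data_mb": mb}
        for (cid, d), mb in sorted(totals.items())
    ]

summary = build_daily_summary(raw_events)
```

The same grouping pattern maps one-to-one onto a PySpark `groupBy().agg()` over a fact table, with the summary layer persisted for reporting.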
Why Choose Me?
✅ Proven Impact – Reduced ETL failures by 50%, improved ingestion efficiency, and delivered 3x faster reporting for stakeholders.
✅ Domain Expertise – Hands-on experience in telecom analytics (DIRECTV) and financial analytics.
✅ Scalable Solutions – Designed pipelines to process 10GB+ daily batch data with optimized Spark configurations.
✅ Clear Communication – Skilled in translating complex technical requirements into simple business terms.
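For batches in the 10GB+ range mentioned above, tuning typically comes down to a handful of Spark settings. The flags below are an illustrative, not project-specific, sketch (`daily_batch_job.py` is a hypothetical entry point; exact values depend on cluster size and data skew):

```shell
# Illustrative spark-submit flags for a ~10GB daily batch job.
spark-submit \
  --conf spark.sql.shuffle.partitions=200 \
  --conf spark.sql.adaptive.enabled=true \
  --conf spark.executor.memory=8g \
  --conf spark.executor.cores=4 \
  daily_batch_job.py
```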
📩 Let’s connect to discuss your project and see how I can help transform your data into actionable insights.