Software Developer
6+ years building data engineering pipelines, automation systems, and backend solutions. Specialized in large-scale financial market data — processed 1.5TB+ of options & futures data using DuckDB, Parquet, and Python.
I'm a Software Developer based in Noida, India, with 6+ years of experience across data engineering, automation, and backend systems. I started my career building RPA bots and automation scripts, and have since moved into large-scale financial data pipelines.
Currently at Open Futures and Commodities, I work with NSE and TAIFEX market data: converting raw datasets into optimized Parquet files with DuckDB, building quantitative trading strategies, and setting up real-time visualization pipelines with QuestDB and Grafana.
I'm passionate about turning raw, messy data into fast, queryable systems and removing performance bottlenecks through practical engineering. I hold an AWS Certified Cloud Practitioner certification and have hands-on experience with S3 integration and cloud migrations.
1.5TB+ raw market data pipelines using DuckDB, Parquet & Zstd compression
End-to-end RPA bots, PDF parsing, OCR, Selenium automation for enterprise clients
Real-time market data dashboards via QuestDB + Grafana with rsync automation
AWS S3 integration, GCP → AWS migration, Django CRM, REST API development
Foundational level — Amazon Web Services · Issued Aug 2023
Open to full-time roles, freelance projects, and interesting data engineering challenges. Based in Noida — open to remote.