Skills
• Programming Languages: Python, SQL, Java, Scala, TypeScript, Bash
• Frameworks & Technologies: Apache Airflow, FastAPI, Kafka, GraphQL, RESTful APIs
• Data Processing & Big Data: pandas, NumPy, PySpark, Apache Spark, Hive, HDFS, MapReduce, Snowflake, Redshift, BigQuery, Delta Lake
• Data Warehousing & ETL: dbt, Airbyte, AWS Glue, Azure Data Factory, Dataflow (GCP)
• Data Visualization Tools: Power BI, Tableau, Looker
• Databases: MySQL, PostgreSQL, Microsoft SQL Server (T-SQL), MongoDB
• Cloud & DevOps: AWS (S3, Redshift, Lambda, EMR, Athena), Azure (Synapse Analytics, Data Lake, Databricks, SQL Database), Google Cloud (BigQuery, Cloud Storage, Composer)
• Tools & Platforms: GitHub, GitLab, Terraform, Kubernetes, Docker, Jenkins, Apache Kafka, Databricks
• Core Skills: Data Architecture, ETL & ELT Pipelines, Real-Time Data Processing, Data Governance, Data Modelling, Agile & DevOps, Problem-Solving, Strategic Thinking, Team Collaboration
About
Results-oriented Data Engineer with 5 years of experience building and optimizing scalable data pipelines, ETL processes, and cloud solutions. Skilled in SQL, Python, Scala, and Big Data technologies such as Hadoop, Spark, and Kafka for efficient data processing. Experienced with cloud platforms (AWS, Azure, GCP) and services including Redshift, BigQuery, and S3. Expert in workflow orchestration with Airflow and in implementing CI/CD pipelines for reliable data deployments.