Job Description:
- At least 3 years of experience building and maintaining ETL/ELT pipelines in enterprise environments using Azure-native tools.
- Hands-on expertise with Azure Data Factory (including Mapping Data Flows), Synapse Pipelines, or similar orchestration tools.
- Proficiency in SQL, Python, or PySpark for transformation logic and data cleansing workflows.
- Experience with Delta Lake and Azure Data Lake Storage Gen2, and with JSON and Parquet file formats.
- Ability to build modular, reusable pipeline components using metadata-driven approaches and robust error handling.
- Familiarity with public data sources, government transparency datasets, and publishing workflows.
- Knowledge of data masking, PII handling, and encryption techniques to manage sensitive data responsibly.
- Experience with data quality frameworks, including automated validation, logging, and data reconciliation methods.
- Strong grasp of DevOps/DataOps practices, including versioning, testing, and CI/CD for data pipelines.
- Experience supporting data publishing for oversight, regulatory, or open data initiatives is highly desirable.
- Certifications such as DP-203 (Azure Data Engineer Associate) or Azure Solutions Architect are a plus.
Job Type: Contract
Pay: $45.00 - $55.00 per hour
License/Certification:
- Microsoft Certified: Azure Data Engineer Associate (Preferred)
- Microsoft Certified: Azure Solutions Architect Expert (Preferred)
Work Location: Remote