Job Title: Data Engineering Contractor
Department: Technology
About the Role:
Join our dynamic team of talented engineers with a proven track record in constructing data
warehouses, lakes, and pipelines. We are on a mission to fuel Unite Us' ongoing expansion and
enhance its positive influence on both the healthcare industry and individuals nationwide. As part
of our team, your role will be instrumental in collecting, processing, and delivering data to both
internal and external stakeholders. This pivotal work empowers us to intelligently invest in
emerging opportunities, quantifying the value and ROI that Unite Us networks bring to our
customers. By leveraging your expertise, you will help establish Unite Us as a thought leader in
the dynamic social care landscape. Become an integral part of our team and help us shape
the future of healthcare and make a meaningful impact on the lives of people across the
country.
What You'll Do:
● Implement data architecture and infrastructure that align with business objectives,
collaborating closely with Application Engineers and Product Managers to ensure that the
technical infrastructure robustly supports client requirements
● Create ETL and data pipeline solutions for efficiently loading data into the warehouse,
and test them to ensure reliability and optimal performance
● Collect, validate, and provide high-quality data, ensuring data integrity
● Champion data democratization efforts, making data accessible to relevant stakeholders
● Guide the team on technical best practices and contribute substantially to the
architecture of our systems
● Support operational work, such as onboarding new customers to our data products, and
participate in the team's on-call rotation
● Engage with cross-functional teams, including Solutions Delivery, Business Intelligence,
Predictive Analytics, and Enterprise Services, to address and support any data-related
technical issues or requirements
You’re a great fit for this role if you have:
● At least 6-8 years of experience working with data warehouses, data lakes, and ETL
pipelines
● Proven experience building optimized data pipelines using Snowflake and dbt
● Expertise in orchestrating data pipelines using Apache Airflow, including authoring,
scheduling, and monitoring workflows
● Exposure to AWS and proficiency with cloud services such as EKS (Kubernetes), ECS, S3,
RDS, and IAM
● Experience designing and implementing CI/CD workflows using tools such as GitHub
Actions, Codeship, or Jenkins
● Experience with tools like Terraform, Docker, and Kafka
● Strong experience with Spark using Scala and Python
● Advanced SQL knowledge, with experience authoring complex queries and strong
familiarity with Snowflake and relational databases such as Redshift and Postgres
● Experience with data modeling and system design, architecting scalable data platforms
and applications for large enterprise clients
● A dedicated focus on building high-performance systems
● Exposure to building data quality frameworks
● Strong problem-solving and troubleshooting skills, with the ability to identify and resolve
data engineering issues and system failures
● Excellent communication skills, with the ability to communicate technical information to
non-technical stakeholders and collaborate effectively with cross-functional teams
● The ability to envision and construct scalable solutions that meet the diverse needs of
enterprise clients with dedicated data teams
Nice to have:
● Previous experience with healthcare and/or social determinants of health data products
● Experience leveraging AI-assisted coding tools (e.g., Cursor, Codex AI, Amazon Q,
GitHub Copilot)
● Experience working with R
● Experience processing healthcare eligibility and claims data
● Exposure to Matillion ETL
● Experience using and building solutions that support various reporting and end-user data tools