Role: Senior Data Engineer
Location: Plano, TX
Employment Type: Full Time
Overview
We’re looking for a Senior Data Engineer to help design, build, and optimize a modern cloud-based data platform. The role focuses on building scalable pipelines, improving data flow, and giving cross-functional teams access to clean, reliable data for decision-making. The right candidate is hands-on with today’s data technologies, enjoys building end-to-end solutions, and thrives in collaborative environments.
Responsibilities
Pipeline & Platform Development
- Develop and maintain robust data pipelines for ingestion, transformation, and delivery.
- Build and optimize workflows for both batch and streaming data, using tools such as dbt and Airflow.
- Work with large, complex data sets to ensure scalability, performance, and resilience.
- Establish monitoring and alerting systems to track pipeline health and quickly resolve issues.
Collaboration & Stakeholder Support
- Partner with business and technical teams to gather requirements and deliver reliable data solutions.
- Work closely with analysts, data scientists, and BI developers to transform business questions into technical implementations.
- Improve data accessibility and consistency across teams and systems.
Data Quality, Security & Governance
- Put frameworks in place to validate, cleanse, and monitor data quality.
- Apply best practices around data security, encryption, and handling of sensitive information.
- Maintain metadata and lineage tracking to ensure clarity and trust in the data.
Continuous Improvement
- Recommend and implement improvements in pipeline performance, scalability, and automation.
- Contribute to long-term architecture planning to meet evolving data and product needs.
- Research new tools and troubleshoot complex integration and performance issues.
Required Experience
- 8-10 years in software development or data engineering, with exposure to modern data ecosystems.
- 5+ years of hands-on SQL experience, including schema design and dimensional modeling.
- 5+ years designing and maintaining ETL/ELT pipelines, ideally using dbt and Airflow.
- Proficiency in Python for data manipulation and scripting.
- Experience working in cloud environments such as AWS or Azure, with production-grade deployments.
- Demonstrated success in building scalable data systems and optimizing workflows.
- Strong written and verbal communication skills, with the ability to make technical concepts understandable for non-technical audiences.
Preferred Skills
- Familiarity with BI/reporting tools (Tableau, Power BI, Domo, etc.).
- Knowledge of data encryption and governance practices.
- Background in retail or consumer-facing industries.
- Degree in a quantitative or technical field (Math, Engineering, Economics, etc.); advanced degree a plus.
What Makes You a Fit
- You enjoy mentoring and sharing knowledge with peers.
- You can quickly adapt to shifting priorities and handle multiple projects at once.
- You’re comfortable working with both structured and unstructured data.
- You thrive in a team setting but can also operate independently with little oversight.
- You value clear communication and can translate data problems into actionable business outcomes.