Role Summary
As a Data Analytics Engineer, you will design data models, build scalable pipelines, and maintain documentation so that the team can focus on exploration, modeling, and high‑level insights. This role reports directly to the Chief Information Officer (CIO) and works as part of the data team. You will bridge engineering and analytics, ensuring clean, well‑structured data is available for reporting and modeling across the organization, and you will collaborate closely with the team to validate data and support advanced analytics projects.
Key Responsibilities
- Design and improve data models: Work with subject‑matter experts to optimize data structures and build models that support effective analysis and decision‑making.
- Collaborate across the data team: Partner with the Data Scientist, Data Analysts and CIO to deploy logical and physical data models; use Power BI and database tools to build insightful reports.
- Build and maintain data pipelines: Develop automated, scalable ETL/ELT pipelines to prepare raw data for analysis; implement modern transformation tools.
- Ensure data quality and documentation: Develop testing and monitoring solutions to maintain data quality; document data architecture, transformation processes and create/maintain data‑asset catalogues.
- Implement data governance: Apply data governance and security best practices to protect sensitive information and comply with internal standards.
- Support stakeholders: Provide reliable data access and clear insights to analysts, the data scientist and business users; respond to data inquiries and assist with ad‑hoc analyses.
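As a hedged illustration of the data-quality testing mentioned above, the sketch below runs two common checks (not-null and uniqueness) against a toy in-memory SQLite table. The table and column names are hypothetical, and in practice such checks would run against the warehouse rather than SQLite:

```python
import sqlite3

# Toy table standing in for a warehouse fact table (names are illustrative).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 19.99), (2, 5.00)])

def check_not_null(conn, table, column):
    """Return True if no row has a NULL in the given column."""
    n = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()[0]
    return n == 0

def check_unique(conn, table, column):
    """Return True if the column contains no duplicate values."""
    n = conn.execute(
        f"SELECT COUNT(*) - COUNT(DISTINCT {column}) FROM {table}"
    ).fetchone()[0]
    return n == 0

assert check_not_null(conn, "orders", "amount")
assert check_unique(conn, "orders", "order_id")
```

Checks like these are typically scheduled after each pipeline run so that failures block a release rather than surface in a stakeholder's report.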
Integration with the Team (Daily Workflow)
Intake → Build → Validate → Release → Adopt
- Intake & Prioritization: Requests are scored (RICE) and placed on the sprint board; the Analytics Engineer (AE) scopes semantic/model impact.
- Upstream Modeling: A GCP pull request creates/updates fact/dimension tables with tests; the AE signs off on grain, keys and naming.
- Semantic Model: Add/reshape tables, relationships and role‑playing dimensions; add/curate measures; update calculation groups; enforce row‑level security (RLS).
- Report Layer: Thin reports consume datasets; the AE reviews measure usage and performance patterns.
- Quality Gate: GCP tests pass, data‑diff passes and performance checks clear; stakeholders perform UAT in Test.
- Release: PBIP pipeline promotion → Prod; freeze as needed; the AE owns the rollback plan.
- Adoption & Support: Publish release notes; monitor usage telemetry; deprecate redundant artifacts.
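The intake step above scores requests with RICE. A minimal sketch of the usual formula, score = reach × impact × confidence ÷ effort, follows; the backlog items and their values are made-up examples, not company data:

```python
# RICE prioritization sketch (standard formula; weights here are invented).

def rice_score(reach: float, impact: float, confidence: float, effort: float) -> float:
    """Return the RICE score; effort must be positive."""
    if effort <= 0:
        raise ValueError("effort must be positive")
    return reach * impact * confidence / effort

# Hypothetical backlog items: (name, reach, impact, confidence, effort)
requests = [
    ("new revenue dashboard", 200, 2.0, 0.8, 4),
    ("fix stale inventory feed", 50, 3.0, 1.0, 2),
]

# Rank highest score first for the sprint board.
ranked = sorted(requests, key=lambda r: rice_score(*r[1:]), reverse=True)
for name, *params in ranked:
    print(f"{name}: {rice_score(*params):.1f}")
```

Because the score divides by effort, two requests with identical value can land far apart on the board, which is exactly the trade-off the intake step is meant to surface.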
Qualifications
Education: Bachelor’s degree in Computer Science, Data Analytics/Engineering, or a related field (or equivalent professional experience).
Technical Skills:
- Proficiency in SQL and experience with ETL/ELT processes and data modeling
- Familiarity with cloud data platforms and data‑warehouse concepts
- Ability to program in Python or R and leverage BI tools like Power BI
Soft Skills:
- Strong problem‑solving
- Critical‑thinking and communication skills
- Ability to work independently and in cross‑functional teams
Preferred Experience: Exposure to Google Cloud Platform and to modern data‑transformation frameworks; experience with version control (Git) and CI/CD practices; knowledge of the logistics or supply‑chain sector.
Work Location: Springfield, MO (HQ) preferred. Remote allowed with required travel to HQ up to once monthly.
To be considered for employment by Good Company, please note that we require a pre-employment drug screening and a criminal background check.
Good Company is an Equal Opportunity Employer.
Job Type: Full-time
Pay: $60,000.00 - $80,000.00 per year
Benefits:
- 401(k) matching
- Health insurance
- Paid time off
- Parental leave
- Vision insurance
Work Location: Hybrid remote in Springfield, MO 65803