For decades, DTN has been the silent force behind some of the world’s most critical industries—helping businesses navigate complexity, uncertainty, and risk with smarter, faster decisions. From agriculture to energy to weather intelligence, our proprietary Operational Decisioning Platform transforms raw data into decision-grade insights—enabling companies to optimize supply chains, ensure market stability, and safeguard infrastructure against disruption. We don’t follow trends—we set the standard for precision, trust, and operational impact.
Job Description:
We are seeking a detail-oriented Associate Data Engineer to join our data team. In this role, you will be responsible for ensuring the quality, integrity, and reliability of our data ingestion processes and outputs. You will work closely with data engineers, analysts, and other stakeholders to develop and implement data imports and processes for our data pipelines and systems.
What You'll be Responsible for:
Design, develop, and execute test plans for ETL processes, data pipelines, and data warehousing solutions
Perform thorough testing of data transformations, integrations, and migrations
Perform data lineage audits and ensure that applied updates reflect incoming data sources
Develop and maintain automated test suites for continuous integration and deployment
Conduct performance testing and optimization of data workflows
Collaborate with other data engineers to troubleshoot and resolve data quality issues
Implement data validation checks and monitoring systems
Participate in code reviews and provide constructive feedback
Document testing procedures, results, and best practices
Stay current with industry trends and emerging technologies in data engineering and quality assurance
What You'll Bring to the Position:
1+ years of experience as a Data Engineer or in a related role
Experience with SQL and Python programming
Experience with big data technologies such as Hadoop and Spark
Familiarity with cloud platforms (AWS, Azure, or GCP)
Experience with Docker and Kubernetes
Knowledge of data modeling and ETL processes
Experience with version control systems (e.g., Git)
Proficiency in Linux/Unix command-line operations
Preferred Skills and Experience:
Familiarity with workflow management tools like Apache Airflow, Argo Workflows, and AWS Step Functions
Experience with data transformation tools like PySpark, Jupyter, and SQL
Experience with AWS tooling such as Athena, Lambda functions, Glue, RDS Postgres, and Redis
Familiarity with CI/CD pipelines and practices
Experience with test automation frameworks (e.g., pytest)
Knowledge of data privacy and security best practices
What You Can Expect from DTN:
The targeted hiring base pay range for this position is between $56,250 and $84,500. DTN is a pay-for-performance organization, which means there is the opportunity to advance your compensation with performance over time. The actual base pay offered for this position will depend upon many factors, including but not limited to: prior work experience, training/education, transferable skills, business needs, internal equity, and applicable laws. The targeted hiring base pay range is subject to change and may be modified in the future. This role may also be eligible for market-competitive variable pay and benefits.
#LI-hybrid
Why DTN?
OUR VISION: To be the independent, trusted source of insights to our customers who feed, protect, and fuel the world.
OUR MISSION: Empower our customers with intelligent and actionable insights that exceed their expectations and enable their success on a daily basis.
OUR VALUES: Customer-Focused, Forward-Thinking, People-Centric, Solution-Oriented
We have great benefits at DTN – apply today to find out more!
At DTN, we are an equal opportunity employer. Come join us as we help feed, fuel, and protect the world!