Senior Data Engineer – Chicago – Leading Proprietary Trading Firm
My client, a global market-making and trading firm, is seeking a Senior Data Engineer to join their Chicago team and advance their data infrastructure. The role focuses on transforming experimental research workflows into scalable, production-grade systems using technologies like Databricks, Apache Spark, Delta Lake, and streaming data platforms.
The engineer will design and optimize ETL/ELT pipelines, collaborate with researchers and traders, define best practices, and mentor junior engineers, all while contributing to the broader Lakehouse architecture vision.
Requirements:
- 5+ years of data engineering experience building robust, production-grade pipelines
- Advanced Python, Spark, Databricks, and Delta Lake skills
- Experience with streaming systems (Kafka) and cloud-native architectures (AWS preferred)
- Strong collaboration, mentoring, and communication skills
- Bonus: exposure to systems-level languages (C++, Rust) or MLOps tooling (e.g., MLflow)
Perks: Competitive benefits, international collaboration, and the chance to make an immediate impact on real-time research and trading systems.