This position is posted by Jobgether on behalf of a partner company. We are currently looking for a Data Engineer in New York (USA).
This role offers the opportunity to contribute to a cutting-edge AI platform that transforms investment analysis for private equity. You will design, build, and maintain scalable data pipelines and architectures, handling complex and diverse datasets to support data-driven decision-making. The position emphasizes collaboration with data science, investment, and operations teams, ensuring seamless data flow, reproducibility, and integration across analytics platforms. You will work in a dynamic, global environment, leveraging advanced technologies and best practices in data engineering to drive innovation and efficiency. This role provides exposure to large-scale financial data, AI-driven insights, and opportunities for professional growth in a fast-paced setting.
Accountabilities
- Design, develop, and maintain robust data pipelines and architectures to support complex investment workflows
- Ensure reliable ingestion, transformation, and integration of structured and unstructured datasets from multiple internal and external sources
- Monitor, optimize, and maintain performance, scalability, and data quality of all pipelines and systems
- Collaborate with data scientists, investment professionals, and operations teams to operationalize analytics and ensure reproducible results
- Implement automation, CI/CD pipelines, and continuous improvement initiatives in data engineering practices
- Evaluate and integrate new tools and architectures to enhance efficiency, scalability, and cost-effectiveness
Requirements
- Master's degree in Data Engineering, Computer Science, or a related technical field
- 3-5 years of hands-on experience designing and maintaining large-scale data pipelines
- Experience working with complex financial or alternative datasets
- Proficiency in Python and SQL for data processing and transformation
- Hands-on experience with cloud platforms (AWS, Azure, GCP, or Databricks) and workflow orchestration tools (Airflow, Prefect, or Dagster)
- Knowledge of data modeling, schema design, API integration, and big data frameworks (Spark, Kafka, Delta Lake)
- Familiarity with version control (Git) and DevOps practices (CI/CD pipelines)
- Strong communication skills and ability to work cross-functionally with non-technical stakeholders
- Fluency in English; additional languages are a plus
Benefits
- Flexible remote work and collaboration with global teams
- Multicultural work environment with international exposure
- Opportunities for career growth and professional development in AI and ML applications
- Structured mentorship, training sessions, and access to educational platforms
- Regular company updates and transparent communication channels
- Focus on work-life balance, learning culture, and team collaboration
Jobgether is a Talent Matching Platform that partners with companies worldwide to efficiently connect top talent with the right opportunities through AI-driven job matching.
When you apply, your profile goes through our AI-powered screening process designed to identify top talent efficiently and fairly.
🔍 Our AI evaluates your CV and LinkedIn profile thoroughly, analyzing your skills, experience, and achievements.
📊 It compares your profile to the job's core requirements and past success factors to determine your match score.
🎯 Based on this analysis, we automatically shortlist the 3 candidates with the highest match to the role.
🧠 When necessary, our human team may perform an additional manual review to ensure no strong profile is missed.
The process is transparent, skills-based, and free of bias, focusing solely on your fit for the role.
Once the shortlist is complete, we share it directly with the company that owns the job opening. The final decision and next steps (such as interviews or additional assessments) are then handled by their internal hiring team.
Thank you for your interest!