At Serve Robotics, we’re reimagining how things move in cities. Our personable sidewalk robot is our vision for the future. It’s designed to take deliveries off congested streets, make delivery accessible to more people, and benefit local businesses.
The Serve fleet has been completing commercial deliveries in Los Angeles, Miami, Dallas, Atlanta, and Chicago, delighting merchants, customers, and pedestrians along the way. We’re looking for talented individuals who will grow robotic deliveries from surprising novelty to efficient ubiquity.
We are tech industry veterans in software, hardware, and design who are pooling our skills to build the future we want to live in. We are solving real-world problems leveraging robotics, machine learning and computer vision, among other disciplines, with a mindful eye towards the end-to-end user experience. Our team is agile, diverse, and driven. We believe that the best way to solve complicated dynamic problems is collaboratively and respectfully.
This role accelerates decision-making across Serve Robotics by transforming raw data into actionable insights. You will own the architecture, automation, and reliability of our analytics data pipelines while collaborating closely with cross-functional partners to make data a strategic asset for our autonomy, operations, and product teams.
Responsibilities
Define the processes and ETL/ELT infrastructure to transform and make data readily available across the company
Build and manage core datasets that serve as single sources of truth for product, operations, and business functions
Partner with data analysts and other internal stakeholders to design, build, and monitor pipelines that meet today's requirements and can scale gracefully with our growing data volumes
Implement automated workflows that lower manual/operational cost for stakeholders
Contribute to Serve’s goal of data democratization and self-service analytics
Develop and enforce data quality monitoring systems focused on freshness, accuracy, and lineage tracking
Develop and maintain batch, real-time and near-real-time data pipelines supporting fleet performance and robot telemetry
Qualifications
4+ years of experience as an Analytics or Data Engineer, ideally in product-focused or high-growth environments
1+ years of experience and expertise in SQL: understanding of aggregation and window functions, UDFs, self-joins, and partitioning and clustering approaches to write correct, highly performant queries
Proficient in Python (bonus for experience with pandas, PySpark, Airflow, or similar frameworks)
Hands-on experience delivering scalable data solutions using modern data platforms (BigQuery, Snowflake, or Redshift)
Strong understanding of data modeling (star/snowflake schemas, data marts, dimensional modeling)
Experience building and monitoring CI/CD pipelines for data workflows
Excellent communication skills with the ability to build meaningful stakeholder relationships
Strong sense of ownership and ability to thrive in a fast-paced robotics and autonomy-driven environment
Passion for analytics use cases, data models, and solving complex data problems
Excitement to learn about new fields, with the ability to be scrappy as needed
What Makes You Stand Out
Experience with BI tools (Looker, Preset, ThoughtSpot) and modern data observability frameworks (Monte Carlo, Datadog, or similar)
Experience implementing and enforcing data access and cost controls
Experience working with dbt for transformation management
Familiarity with data governance, access control, and cost optimization strategies
Understanding of telemetry data from IoT or robotics systems
Compensation Range: $155K - $190K