Job Description
What You’ll Do
Design, build, and maintain large-scale data pipelines (batch and streaming) for robotics foundation model training and evaluation at petabyte scale
Own core data infrastructure: data model, storage systems, ingestion pipelines, transformation frameworks, and orchestration layers
Standardize data models and unify processing pipelines across real-world teleoperation and synthetic simulation datasets
Collaborate with a team of driven individuals committed to building general-purpose Physical AI
What You’ll Bring
Excellent software engineering skills (Python, Go, or similar)
Extensive experience designing, building, and maintaining large-scale data pipelines (8+ years)
Deep understanding of distributed systems (Spark, Kafka, or similar)
Extensive experience with data storage technologies (data lakes, warehouses, object stores like S3)
Experience running and maintaining production-grade infrastructure (Kubernetes, Terraform)
Bonus:
Experience supporting AI systems, in particular embodied AI such as autonomous driving
Job Details
Posted Date: December 19, 2025
Job Type: Technology
Location: San Carlos, California, 94071, United States
Company: Genesis AI
Ready to Apply?
Don't miss this opportunity! Apply now and join our team.