* Maintain the on-vehicle code responsible for data collection, monitoring, and real-time communication with our backend systems (for example, low-latency streaming of camera and LiDAR sensor data over LTE), and build the backend systems that power it
* Evaluate, architect, and deploy new database systems and data pipelines to enable our engineering team to gain richer insights into our data
* Build scalable data pipelines which operate over petabytes of autonomous vehicle data to extract useful features, enable advanced queries, and feed machine learning models and simulation environments
* Build, deploy, and maintain the infrastructure that ingests terabytes of data uploaded from our vehicles each day, including the software, hardware, and operational processes that power this system
* Experience architecting and maintaining systems that process and store large amounts of data
* Significant experience with Python, C++, Go, or similar
* Experience managing distributed data processing frameworks such as Spark or Hadoop, including significant hands-on experience writing jobs with them
* Experience working with AWS and/or GCP
* Experience with both relational and NoSQL databases
* Experience working with distributed message brokers like Kafka, Kinesis, or RabbitMQ
* Experience managing large Kubernetes clusters powering microservice-oriented architectures