Static Environment Team - Senior Algorithm Engineer
Detroit, Michigan, USA
About Zenuity - Making it Real
We are breaking new ground with world-class driver-assistance and autonomous-driving technologies. To meet the automotive industry’s requirements, we are developing software in a fashion and context that hasn’t been done before. With robustness, speed to market and a flexible approach, we are changing the way the world looks at mobility, infrastructure and everyday life for people worldwide.
We work on interesting technical challenges using cutting edge technologies in a high performance, agile organization with an innovative environment & high level of autonomy, putting you in the driver’s seat. We have a team-based structure with professional management & continuous development, where you will be exposed to many areas within Autonomous Driving, maximizing development opportunities.
WHAT YOU WILL DO
Our Static Environment Team is looking for a Senior Algorithm Engineer! The Static Environment Team at Zenuity develops algorithms that fuse sensor data (camera, RADAR, LIDAR, ultrasonic, etc.) to detect, classify, and map the static environment around the autonomous vehicle, enabling safe operation alongside other vehicles and objects. As a Senior Algorithm Engineer, you will be responsible for the development, simulation, implementation, integration, and testing of sensor/data/object fusion and occupancy grid algorithms that build a model of the world around the vehicle for safety-critical autonomous driving systems in real production products. You will use agile processes and C++ to develop real-time software and collateral. You will work in modern simulation environments and have direct access to test vehicles to rapidly experience your results first-hand. You will work on self-guided cross-functional teams and interface globally, helping to make key decisions. You will also mentor the team on technical issues, tools, and processes as necessary.
A Bachelor’s degree in Engineering or Science and 10+ years of industry experience are required; a Master’s or PhD is preferred. You must have a strong background and prior experience developing fusion software for fully autonomous vehicles, specifically for camera, RADAR, LIDAR, and ultrasonic sensors. Experience with Bayesian inference, state estimation, and time synchronization of multiple sensor data sets is necessary, as is experience with LIDAR and interfacing with other automotive AD sensors (RADAR, camera, ultrasonic). You must have excellent C/C++ programming experience and be comfortable working in a Linux environment. For this role, you must also have prior AD/ADAS automotive, aerospace, or related experience.
Beneficial skills and experience include automotive embedded software experience, NVIDIA experience, Functional Safety (ISO 26262) and AUTOSAR knowledge, experience in requirements development, unit testing, and software integration, and experience with Git, JIRA, and Confluence. Additionally, experience with robotics and/or UAVs is beneficial.
401(k) and Gain Sharing with company matching funds
Vacation, Personal, Generous Holidays, College Savings
Flexible working hours & work-life balance
Dynamic work environment & attractive and flexible work spaces
With the skillset and mindset described, you have the chance to make self-driving cars a reality as an engineer at our site in Detroit.
We are really excited to welcome candidates with these skills, so don't hesitate to take the chance of a lifetime!
Interviews are held on a continuous basis, so we highly recommend that you submit your application at your earliest convenience.