How to Hire a Perception Engineer
Published April 2026 · Mycelium
Perception engineering is one of the most specialised and competitive disciplines in robotics hiring. The talent pool is small, fragmented across industries, and largely passive.
This guide covers the sub-disciplines, where candidates come from, how to assess depth, and what an effective hiring process looks like.
Perception in robotics vs other industries
Perception engineers work across robotics, automotive ADAS, AR/VR, and industrial vision. The skills transfer — but the deployment context differs significantly.
Robotics perception must handle physical interaction, real-time constraints, and safety-critical decisions. ADAS engineers often have the right technical depth but need context about robotic systems. AR/VR engineers may lack the sensor diversity experience needed for multi-modal fusion.
Understanding these differences is essential for assessing transferability — and for approaching candidates with the right pitch.
Required skills: what to look for
Core skills vary by sub-discipline. A sensor fusion engineer needs strong probabilistic modelling and Kalman filter depth. A computer vision engineer needs object detection and tracking expertise. A 3D perception engineer needs point cloud processing and geometric understanding.
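As an illustration of the Kalman filter depth mentioned above (hypothetical interview material, not part of any production stack), a strong sensor fusion candidate should be able to derive the predict and update steps of even a minimal one-dimensional filter, and explain when its Gaussian-noise assumptions break down:

```python
# Minimal 1D Kalman filter: fuse noisy scalar measurements into a
# state estimate. The state is assumed static, so prediction only
# inflates the uncertainty; each update blends prediction and
# measurement according to the Kalman gain.

def kalman_1d(measurements, process_var, measurement_var,
              initial_estimate=0.0, initial_var=1.0):
    """Return the sequence of (estimate, variance) after each update."""
    x, p = initial_estimate, initial_var
    history = []
    for z in measurements:
        # Predict: uncertainty grows by the process noise.
        p = p + process_var
        # Update: gain weights the measurement against the prediction.
        k = p / (p + measurement_var)
        x = x + k * (z - x)
        p = (1.0 - k) * p
        history.append((x, p))
    return history

# Noisy readings of a true value near 5.0:
readings = [5.2, 4.9, 5.1, 5.0, 4.8]
estimates = kalman_1d(readings, process_var=1e-4, measurement_var=0.1)
```

A good candidate will point out what this sketch hides: the filter assumes linear dynamics and Gaussian noise, and real fusion stacks must handle non-linear motion, outlier measurements, and asynchronous sensors.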
All production perception engineers need performance awareness in real-time C++ (or, where the stack allows it, performance-critical Python), and most need enough ML depth to evaluate and deploy neural perception models.
Be specific in your brief about which sensors the role covers — camera, LiDAR, radar, depth — as specialisation is common and claiming all of them often signals shallow experience in each.
Where perception engineers are found
The best perception engineers are almost never actively looking. They are embedded in production teams at AV companies, robotics scale-ups, and research labs. They do not respond to generic outreach.
Effective sourcing means mapping the teams — not the job boards. Conference publications (CVPR, ICCV, ICRA, ECCV) and GitHub contributions are useful signals for finding candidates, but the outreach still needs to be personalised and specific.
How to assess depth vs breadth
Many candidates can discuss perception concepts fluently without having shipped production systems. Ask specifically about failure modes — what happens when the sensor is occluded, when the scene is ambiguous, when the compute budget is exceeded.
Good candidates have strong opinions about the limits of their approach and can discuss trade-offs between methods. Candidates who claim everything works well in all conditions should be probed further.
Interview structure for perception roles
Start with a technical screen focused on the specific domain — sensor modalities, fusion approaches, algorithmic trade-offs. This should be led by someone with perception engineering experience, not a general engineer.
Follow with a systems design discussion covering how they would architect a perception stack for your specific application. This tests both breadth and systems thinking.
Include a practical element — a code review, a dataset analysis, or a specific problem in your domain. Keep it scoped and relevant; avoid generic LeetCode-style exercises.
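One hypothetical example of a scoped, domain-relevant exercise (an illustration, not a prescribed task): ask a computer vision candidate to implement intersection-over-union for axis-aligned bounding boxes, then probe the edge cases — disjoint boxes, zero-area boxes, degenerate coordinates:

```python
# Small scoped exercise: intersection-over-union (IoU) for
# axis-aligned bounding boxes, given as (x_min, y_min, x_max, y_max).

def iou(box_a, box_b):
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Intersection rectangle, clamped to zero when boxes are disjoint.
    ix = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    iy = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = ix * iy
    union = ((ax2 - ax1) * (ay2 - ay1)
             + (bx2 - bx1) * (by2 - by1)
             - inter)
    # Guard against zero-area boxes, a useful edge case to probe.
    return inter / union if union > 0 else 0.0
```

An exercise like this takes minutes, maps directly to the detection and tracking work in L-shaped perception roles, and the edge-case discussion reveals far more than a generic algorithm puzzle would.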
Speak to a specialist robotics recruiter
If you are hiring a perception engineer and want specialist advice on search strategy, get in touch.