
Visual SLAM Systems Engineer / Architect

Talent Toppers
Full Time · Mid-level
Navi Mumbai, Maharashtra, IN · Posted April 6, 2026

Job Description

Education: B.Tech / M.Tech / M.S. in Electronics, Computer Science, Robotics, or a closely related field

Experience: 5+ years of hands-on experience in SLAM, state estimation, or autonomous navigation systems

Location: Navi Mumbai

Role Summary: We are seeking a Visual SLAM Architect / Systems Engineer to design, implement, and optimise SLAM-based localisation and mapping systems for UAV platforms. The role is focused and deep: you will own the full SLAM stack, from sensor interface and algorithm selection through real-time C/C++ implementation, ROS integration, embedded deployment, and flight validation. You will work closely with the Perception Lead and avionics teams to ensure that localisation outputs meet the accuracy, latency, and robustness requirements of autonomous flight.

Key Responsibilities

Algorithm Design & Implementation:

- Design, implement, and optimise Visual and Visual-Inertial SLAM pipelines for monocular, stereo, and RGB-D configurations
- Evaluate and adapt open-source frameworks (ORB-SLAM3, VINS-Mono, OpenVINS, LIO-SAM) to hardware and operational scenarios
- Develop and tune front-end components: feature detection/matching, optical flow, keyframe selection, and outlier rejection
- Implement and maintain back-end optimisation using graph-based solvers (g2o, Ceres, GTSAM) with loop closure and map merging
- Integrate barometer, IMU, and optional GPS measurements for scale recovery and drift bounding in long-duration flights
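As an illustration of the front-end outlier-rejection responsibility above, here is a minimal sketch of Lowe's ratio test for descriptor matching. All names are hypothetical, and plain C++ containers stand in for OpenCV types; a production pipeline would use `cv::DescriptorMatcher` or similar.

```cpp
#include <cmath>
#include <limits>
#include <vector>

// Hypothetical descriptor type; real code would use cv::Mat rows or Eigen vectors.
struct Desc { std::vector<float> v; };

// Euclidean distance between two descriptors of equal length.
float l2(const Desc& a, const Desc& b) {
    float s = 0.f;
    for (std::size_t i = 0; i < a.v.size(); ++i) {
        float d = a.v[i] - b.v[i];
        s += d * d;
    }
    return std::sqrt(s);
}

// Lowe's ratio test: accept the best match only if it is clearly better than
// the second-best (distance ratio below `ratio`). Returns the index of the
// matched train descriptor, or -1 if the match is rejected as ambiguous.
int ratioTestMatch(const Desc& query, const std::vector<Desc>& train, float ratio = 0.8f) {
    int best = -1;
    float d1 = std::numeric_limits<float>::max(), d2 = d1;
    for (int i = 0; i < static_cast<int>(train.size()); ++i) {
        float d = l2(query, train[i]);
        if (d < d1) { d2 = d1; d1 = d; best = i; }
        else if (d < d2) { d2 = d; }
    }
    return (best >= 0 && d1 < ratio * d2) ? best : -1;
}
```

The ratio test is a standard pre-filter before geometric verification (e.g. RANSAC on the essential matrix), discarding matches whose nearest neighbour is not distinctive.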

Systems Architecture & Integration:

- Define the overall SLAM system architecture: data flow, processing topology, memory budget, and latency constraints
- Develop and maintain ROS 2 perception nodes for SLAM, odometry publication, and map management, interfacing with the PX4 / MAVLink flight stack
- Architect sensor calibration pipelines: camera intrinsics, extrinsics, and IMU-camera temporal and spatial calibration
- Design and implement localisation health monitoring: covariance tracking, keyframe density alerts, and graceful degradation to complementary sensors
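To make the health-monitoring responsibility above concrete, here is a minimal sketch of one plausible check: flag the localisation estimate as degraded when the trace of the 3x3 position covariance exceeds a threshold, triggering fallback to complementary sensors. The function and threshold are hypothetical, not part of the posting's stack.

```cpp
#include <array>

// Hypothetical health states for the localisation output.
enum class LocHealth { Nominal, Degraded };

// posCov: row-major 3x3 position covariance (m^2).
// traceThresh: total position variance above which the estimate is
// considered unreliable. The trace is a cheap scalar summary; a fuller
// monitor would also watch keyframe density and feature track counts.
LocHealth checkHealth(const std::array<double, 9>& posCov, double traceThresh) {
    double tr = posCov[0] + posCov[4] + posCov[8]; // sum of diagonal entries
    return (tr <= traceThresh) ? LocHealth::Nominal : LocHealth::Degraded;
}
```

In a ROS 2 node this kind of check would typically run on each published odometry message, with the Degraded state gating whether SLAM output is forwarded to the flight controller.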

Embedded Deployment & Optimisation:

- Port and optimise SLAM algorithms for ARM-based embedded targets (Qualcomm RB5 / RPi CM5 / Nvidia Jetson)
- Profile and reduce pipeline latency to meet real-time flight control update requirements (10 fps visual position updates)
- Leverage hardware accelerators (GPU, DSP, NPU) where beneficial; manage memory and power constraints

Validation, Testing & Documentation:

- Design simulation and hardware-in-the-loop test rigs using Gazebo / AirSim for pre-flight SLAM validation
- Establish quantitative benchmarks: ATE, RPE, loop closure recall, and re-localisation success rate across scenarios
- Conduct field trials: analyse flight logs, diagnose failure modes, and iterate on algorithm parameters
- Produce clear technical documentation: architecture design records, calibration procedures, and algorithm trade-off notes
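Of the benchmarks listed above, Absolute Trajectory Error (ATE) is the most common. A minimal sketch, computed as the RMSE of per-pose translation error between time-associated estimated and ground-truth positions (the usual SE(3)/Sim(3) alignment step is omitted here for brevity; tools such as evo perform it):

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical minimal position type; real code would use Eigen::Vector3d.
struct Vec3 { double x, y, z; };

// ATE as RMSE over paired trajectories. Assumes est and gt are already
// time-associated and aligned, with equal, non-zero length.
double ateRmse(const std::vector<Vec3>& est, const std::vector<Vec3>& gt) {
    double sum = 0.0;
    for (std::size_t i = 0; i < est.size(); ++i) {
        double dx = est[i].x - gt[i].x;
        double dy = est[i].y - gt[i].y;
        double dz = est[i].z - gt[i].z;
        sum += dx * dx + dy * dy + dz * dz; // squared translation error
    }
    return std::sqrt(sum / est.size());
}
```

RPE is computed analogously but over relative motions between pose pairs at a fixed time or distance offset, which isolates local drift from accumulated global error.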

Experience:

- 5+ years of hands-on experience in SLAM, state estimation, or autonomous navigation systems
- Demonstrable end-to-end ownership of at least one production or research SLAM system (UAV, robot, or automotive context)
- Experience deploying SLAM on real hardware with field validation, not purely simulation

Core Technical Skills:

- SLAM Algorithms: Strong working knowledge of feature-based (ORB, AKAZE, SIFT, SuperPoint), semi-direct, and direct methods; loop closure (DBoW, NetVLAD); multi-map management
- C/C++ (C++14/17): Production-quality, real-time code; proficient with STL, Eigen, OpenCV
- ROS / ROS 2: Node development, topic/service/action patterns, tf2 transforms, rosbag tooling, lifecycle nodes
- Optimisation Backends: Practical experience with g2o, Ceres Solver, or GTSAM for bundle adjustment and pose graph optimisation
- Camera Models & Calibration: Pinhole, fisheye (Kannala-Brandt / MEI), stereo, multi-camera; tooling with Kalibr or equivalent
- IMU Pre-integration: Forster pre-integration, bias estimation, extrinsic calibration with camera
- Linux & Embedded Systems: CMake build systems, cross-compilation, performance profiling (perf, Valgrind, Tracy)
- Mathematics: Linear algebra, Lie groups (SO(3)/SE(3)), probabilistic inference, non-linear least squares
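The Lie-group mathematics named in the skills above appears constantly in SLAM back-ends. As a small worked example, here is the SO(3) exponential map via Rodrigues' formula, R = I + (sin θ/θ)[ω]x + ((1 − cos θ)/θ²)[ω]x², written with plain arrays rather than Eigen or Sophus so it stands alone. A small-angle guard substitutes the Taylor limits near θ = 0.

```cpp
#include <array>
#include <cmath>

using Mat3 = std::array<std::array<double, 3>, 3>;

// Maps a rotation vector (wx, wy, wz) in so(3) to a rotation matrix in SO(3).
Mat3 so3Exp(double wx, double wy, double wz) {
    double th = std::sqrt(wx * wx + wy * wy + wz * wz); // rotation angle
    double a, b;
    if (th < 1e-8) { a = 1.0; b = 0.5; }               // Taylor limits at th -> 0
    else { a = std::sin(th) / th; b = (1.0 - std::cos(th)) / (th * th); }
    // Skew-symmetric matrix [w]x of the rotation vector.
    Mat3 K = {{{0, -wz, wy}, {wz, 0, -wx}, {-wy, wx, 0}}};
    Mat3 R{};
    for (int i = 0; i < 3; ++i) {
        for (int j = 0; j < 3; ++j) {
            double k2 = 0.0;                           // ([w]x^2)_ij
            for (int k = 0; k < 3; ++k) k2 += K[i][k] * K[k][j];
            R[i][j] = (i == j ? 1.0 : 0.0) + a * K[i][j] + b * k2;
        }
    }
    return R;
}
```

For example, so3Exp(0, 0, pi/2) yields a 90-degree rotation about z, mapping the x-axis onto the y-axis. In practice libraries such as Sophus or GTSAM's Rot3 provide this, but interviews and pose-graph debugging routinely require it from first principles.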

Desirable / Good to Have:

- Experience with thermal or NIR cameras and associated calibration and feature detection challenges
- Familiarity with integrating SLAM outputs as external vision position estimates into an autopilot or control stack
- Knowledge of dense mapping: occupancy grids, TSDF, voxel hashing (Open3D, VoxelMap)
- Exposure to learning-based SLAM components: depth estimation (MiDaS, Depth Anything), feature matchers (SuperGlue, LightGlue)
- Contributions to open-source SLAM or robotics projects
- Experience with Gazebo / AirSim simulation for SLAM benchmarking

Candidates with relevant experience can share their CV at Krupa@talenttoppers.com
