Navigation robots must single out the partners who require navigation while moving through cluttered environments where people walk around. Developing such robots requires two kinds of people detection: detecting partners and detecting all moving people around the robot. For partner detection, we design divided spaces based on spatial relationships and sensing ranges. By mapping the friendliness of each divided space from multi-sensor stimuli so as to detect people actively calling the robot, the robot detects its partner in the space with the highest friendliness. For moving-people detection, we treat objects' floor boundary points in an omnidirectional image as obstacles. We classify obstacles as moving people by comparing the movement of each point against the robot's own movement obtained from odometry, with dynamically changing detection thresholds. Our robot detected 95.0% of partners while standing by and interacting with people, and detected 85.0% of moving people while moving, a rate four times higher than previous methods achieved.
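The moving-people classification described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the function name, the coordinate handling, and the specific dynamic-threshold rule (a threshold that grows with the robot's translational and rotational speed from odometry) are all assumptions. A floor boundary point is labeled a moving person when its observed displacement differs from the displacement predicted for a static obstacle under the robot's odometry by more than the threshold.

```python
import math

def classify_moving_points(points_prev, points_curr,
                           odom_dx, odom_dy, odom_dtheta,
                           base_thresh=0.05):
    """Label floor-boundary points as moving people (True) or static (False).

    Illustrative sketch: names and the dynamic-threshold rule are
    assumptions, not the paper's exact formulation. Points are (x, y)
    positions in the robot frame; odometry gives the robot's motion
    between the two observations.
    """
    cos_t = math.cos(-odom_dtheta)
    sin_t = math.sin(-odom_dtheta)
    speed = math.hypot(odom_dx, odom_dy)
    # Dynamic threshold: tolerate larger residuals while the robot
    # moves or turns faster, since odometry error grows with motion.
    thresh = base_thresh + 0.5 * speed + 0.1 * abs(odom_dtheta)
    moving = []
    for (x0, y0), (x1, y1) in zip(points_prev, points_curr):
        # Where a *static* point should appear after the robot moves
        # by (odom_dx, odom_dy) and rotates by odom_dtheta.
        ex = cos_t * (x0 - odom_dx) - sin_t * (y0 - odom_dy)
        ey = sin_t * (x0 - odom_dx) + cos_t * (y0 - odom_dy)
        residual = math.hypot(x1 - ex, y1 - ey)
        moving.append(residual > thresh)
    return moving
```

For example, with zero odometry a point that stays put is classified as static, while a point that shifts between frames exceeds the threshold and is classified as a moving person.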