MMintLab/scope

A novel state-estimation algorithm that jointly estimates contact location and object pose using exclusively proprioceptive tactile feedback
Simultaneous Contact Location and Object Pose Estimation Using Proprioceptive Tactile Feedback

Project Website

Project Video

Imagine that a robot picks up two objects to complete a task such as assembly or insertion. It will almost always pick them up with some pose uncertainty. In this paper, we address the challenge of grasped-object localization using only the sense of touch. With our method, the robot can bring two objects into contact and, from feel, estimate their in-hand poses. To accomplish this, we propose a novel state-estimation algorithm that jointly estimates contact location and object pose in 3D using exclusively proprioceptive tactile feedback. Our approach leverages two complementary particle filters: one to estimate contact location (CPFGrasp) and another to estimate object poses (SCOPE). We implement and evaluate our approach on real-world single-arm and dual-arm robotic systems. We demonstrate that, by bringing two objects into contact, the robots can infer contact location and object poses simultaneously.

Project Image
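
The abstract describes both CPFGrasp and SCOPE as particle filters driven by proprioceptive (force-torque) feedback. The repository snapshot shown here does not include their code, so the following is only a minimal, generic sketch of one predict-weight-resample update, assuming a NumPy-style state array; the names `likelihood_fn`, `wrench`, and `motion_noise` are illustrative placeholders, not this project's actual API.

```python
import numpy as np

def particle_filter_step(particles, weights, wrench, motion_noise, likelihood_fn, rng):
    """One generic predict-weight-resample update over state hypotheses.

    particles : (N, D) array of hypotheses (e.g., contact-location or pose parameters)
    weights   : (N,) normalized importance weights
    wrench    : a proprioceptive measurement, e.g., a 6-D force-torque reading
    """
    n = len(particles)

    # Predict: jitter each hypothesis with process noise (diffusion step).
    particles = particles + rng.normal(scale=motion_noise, size=particles.shape)

    # Weight: score each hypothesis by how well it explains the measurement.
    weights = weights * np.array([likelihood_fn(p, wrench) for p in particles])
    weights = weights / (weights.sum() + 1e-12)

    # Resample (systematic) when the effective sample size collapses.
    if 1.0 / np.sum(weights ** 2) < n / 2.0:
        positions = (rng.random() + np.arange(n)) / n
        idx = np.minimum(np.searchsorted(np.cumsum(weights), positions), n - 1)
        particles, weights = particles[idx], np.full(n, 1.0 / n)

    return particles, weights

# Example setup (illustrative values only):
# rng = np.random.default_rng(0)
# particles = rng.uniform(-0.05, 0.05, size=(500, 6))   # e.g., 6-DoF pose perturbations
# weights = np.full(500, 1.0 / 500)
```

In the setting described above, `likelihood_fn` would presumably score how well a hypothesized contact location or object pose explains the measured wrench, which is the role the abstract assigns to the tactile feedback.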
