
# Autoencoding a Soft Touch to Learn Grasping from On-land to Underwater

Ning Guo, Xudong Han, Xiaobo Liu, Shuqiao Zhong, Zhiyuan Zhou, Jian Lin, Jiansheng Dai, Fang Wan*, Chaoyang Song*

## Graphical Abstract

*(graphical abstract figure)*

## Datasets

1. Download the Soft Finger Dataset.
2. Extract the `dataset` folder to the root directory of this git repository: `/path/to/SoftFingerSvae`
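The layout from step 2 can be sanity-checked with a few lines of Python before training. This helper is a hedged sketch and not part of the repository; it only verifies that a `dataset` folder sits at the given repository root:

```python
from pathlib import Path

def dataset_ready(repo_root):
    """Return True when the extracted 'dataset' folder sits at the repo root."""
    return (Path(repo_root) / "dataset").is_dir()

print(dataset_ready("/path/to/SoftFingerSvae"))  # True once step 2 is done
```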

## Training and Evaluation

1. CNN regression model: `SoftFinger_cnn.py`
2. VAE model: `SoftFinger_Vae.py`
3. SVAE model: `SoftFinger_Svae.py`
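The three scripts can then be launched directly from the repository root. The invocations below are a hedged sketch assuming each script configures itself (any command-line flags would come from the scripts' own argument parsing, which the README does not document):

```shell
# Assumed entry points; run from the repo root with the dataset/ folder in place
python SoftFinger_cnn.py    # CNN regression baseline
python SoftFinger_Vae.py    # variational autoencoder
python SoftFinger_Svae.py   # supervised VAE
```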

## Requirements

This code was developed with Python 3.8 on Ubuntu 18.04. Additional Python packages:

- pytorch
- pytorch_lightning
- torchmetrics
- numpy
- torchvision
- pandas
- scikit-image
- Pillow
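One way to install the dependencies is via pip. The PyPI names below are assumptions mapped from the list above (`torch` for pytorch, `pytorch-lightning` for pytorch_lightning); pinning versions compatible with Python 3.8 is left to the reader:

```shell
# Hypothetical install command; exact versions are not specified by the README
pip install torch torchvision pytorch-lightning torchmetrics numpy pandas scikit-image Pillow
```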

## Supplementary Videos

Movie S1: Amphibian Grasping with Visual-Tactile Soft Finger.

S1.mp4

Movie S2: Real-time Force/Torque Prediction.

S2.mp4

Movie S3: Object Grasping Success Rate Experiments with/without Contact Feedback.

S3.mp4

Movie S4: Contact Force Following Experiments.

S4.mp4

Movie S5: Object Shape Adaptation Experiments.

S5.mp4

Movie S6: Robot End-effector Reaction to Soft Finger Twist.

S6.mp4

## Links