This page contains the dataset and links to the open-source code for this project. If you find our work useful, please cite the following articles.
X. Wang, F. Geiger, V. Niculescu, M. Magno and L. Benini, "Leveraging Tactile Sensors for Low Latency Embedded Smart Hands for Prosthetic and Robotic Applications," in IEEE Transactions on Instrumentation and Measurement, vol. 71, pp. 1-14, 2022, Art no. 4004314, doi: 10.1109/TIM.2022.3165828.
The above article is a journal extension of the conference paper below:
X. Wang, F. Geiger, V. Niculescu, M. Magno and L. Benini, "SmartHand: Towards Embedded Smart Hands for Prosthetic and Robotic Applications," 2021 IEEE Sensors Applications Symposium (SAS), 2021, pp. 1-6, doi: 10.1109/SAS51076.2021.9530050.
Dataset
The dataset contains 17 classes in total (16 objects plus one empty-hand class), as shown in the figure below:
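As an illustration of how the 17-class labelling might be handled on the PC side, the sketch below encodes integer class labels as one-hot vectors. The 32x32 frame size and the in-memory layout are assumptions for illustration, not the released dataset format.

```python
import numpy as np

NUM_CLASSES = 17          # 16 objects + 1 empty-hand class
FRAME_SHAPE = (32, 32)    # assumed tactile-array resolution, for illustration only

def one_hot(label: int, num_classes: int = NUM_CLASSES) -> np.ndarray:
    """Encode an integer class label (e.g. 0 = empty hand, 1..16 = objects)."""
    vec = np.zeros(num_classes, dtype=np.float32)
    vec[label] = 1.0
    return vec

# Example: a small batch of random placeholder frames paired with labels
frames = np.random.rand(4, *FRAME_SHAPE).astype(np.float32)
labels = [0, 3, 16, 7]
targets = np.stack([one_hot(l) for l in labels])
```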
License
The dataset is distributed under the Creative Commons Attribution-NonCommercial 4.0 (CC BY-NC 4.0) license.
Download the data collected for the additional experiments on the square sensor here.
Open-source code
All the code used in this project is hosted in several repositories on GitHub.
STAG_slim: Classification on the MIT-STAG dataset using a model adapted from the one by the authors of the paper: Subramanian Sundaram, Petr Kellnhofer, Yunzhu Li, Jun-Yan Zhu, Antonio Torralba and Wojciech Matusik, "Learning the signatures of the human grasp using a scalable tactile glove," Nature, 2019. The goal is to reproduce their baseline and become familiar with the project and with tactile-data classification.
STAG_DataLogging: Flashing this code on the STM32F769I-DISC1 board enables the collection and logging of tactile data.
Tactile_NN: Test evaluations with a temporal convolutional network (TCN) on the MIT-STAG dataset.
STAG_GUI: Flashing this code on the STM32F769I-DISC1 board enables it to collect tactile data and send the data to a connected PC. The Python GUI `STAG_GUI.py` in the `utils` folder visualizes the tactile frames.
STAG_Demo: Flashing this code on the STM32F769I-DISC1 board enables it to collect tactile data and send the data to a connected PC via a DMA controller. At the same time, the Arm Cortex-M7 runs inference on each tactile frame and also sends the result to the PC. The Python GUI `STAG_Demo.py` in the `utils` folder visualizes the tactile frame and the inference result in the same interface.
STAG_Complete: Flashing this code on the STM32F769I-DISC1 board makes the MCU continuously collect and process sensor data. It then sends its inference results to a connected PC via UART.
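On the PC side, the inference results arriving over UART could be decoded as in the sketch below. The line format `CLS:<id>;CONF:<prob>` is purely hypothetical (the actual protocol used by STAG_Complete may differ), and in a real setup the bytes would come from a serial port opened with a library such as pyserial rather than from a literal.

```python
def parse_inference_line(line: bytes) -> tuple[int, float]:
    """Parse one hypothetical UART result line of the form
    b'CLS:<id>;CONF:<prob>\\n' into (class_id, confidence).
    This format is an assumption for illustration; check the
    STAG_Complete firmware for the real message layout."""
    text = line.decode("ascii").strip()
    cls_part, conf_part = text.split(";")
    class_id = int(cls_part.split(":")[1])
    confidence = float(conf_part.split(":")[1])
    return class_id, confidence

# Example with a literal line standing in for a serial read:
result = parse_inference_line(b"CLS:5;CONF:0.93\n")
```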
STAG_Complete_lowPower: Flashing this code on the STM32F769I-DISC1 board simulates an application scenario in which the MCU actively collects and processes sensor data for some time before entering a low-power mode.
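The benefit of such duty-cycled operation can be estimated with a simple power budget: the average power is the active power weighted by the fraction of time spent awake, plus the sleep power for the remainder. The numbers below are hypothetical placeholders, not measured values from the paper.

```python
def average_power_mw(active_mw: float, sleep_mw: float, duty_cycle: float) -> float:
    """Average power of a duty-cycled MCU.

    duty_cycle is the fraction of time spent in the active state (0..1).
    """
    return active_mw * duty_cycle + sleep_mw * (1.0 - duty_cycle)

# Hypothetical numbers: 100 mW active, 1 mW asleep, awake 10% of the time.
avg = average_power_mw(active_mw=100.0, sleep_mw=1.0, duty_cycle=0.1)
```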
STAG_slim_minimal_realData: A minimal implementation of the adapted neural network that was found to offer the best size-accuracy trade-off; minimal in the sense that the number of tunable parameters is reduced as far as possible. The network classifies 17 different objects from the data collected with the self-made STAG.
STAG_sensorfusion: The same minimal implementation of the adapted neural network with the best size-accuracy trade-off, but here it classifies the 17 objects by fusing the tactile data with synchronized IMU data.
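One common way to combine the two modalities is late fusion by feature concatenation, sketched below. The feature dimensions (64 tactile features, 6 IMU channels for a 3-axis accelerometer and 3-axis gyroscope) are assumptions; the actual fusion architecture in STAG_sensorfusion may differ.

```python
import numpy as np

def fuse_features(tactile_feat: np.ndarray, imu_feat: np.ndarray) -> np.ndarray:
    """Late fusion by concatenating per-sample feature vectors.

    The fused vector would then feed a shared classifier head.
    This mirrors a common sensor-fusion pattern, not necessarily
    the exact approach in the STAG_sensorfusion repo."""
    return np.concatenate([tactile_feat, imu_feat])

tactile = np.random.rand(64)         # assumed tactile-branch features
imu = np.random.rand(6)              # assumed 3-axis accel + 3-axis gyro
fused = fuse_features(tactile, imu)  # 70-dim input to the classifier head
```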
utils: This repo contains utility functions for visualizing, manipulating, and analyzing tactile or IMU data, among other tasks, and for syncing local directories with Sassauna directories.
Jupyter_Notebooks: This repo contains Jupyter notebooks, mostly for plotting.
License
Please refer to the License in each individual repository.