Projects and Jobs for Students

Developing tactile, dexterous end effectors for the surgical robotic system requires expertise in designing and integrating sensing interfaces within a highly compact space. It also calls for overseeing the development of the general robotic testbed, including the control and sensing architecture, so that subsystems integrate seamlessly from the bottom up. The operator needs the crucial sense of touch, including low-magnitude normal and shear forces as well as the sensing of multiple contact points. Additionally, the end effectors must be modular so that they can handle different surgical tools, and the sensory information must be conveyed to the operator side and integrated into the overall control and sensing architecture of the robotic system.

At our institute we have already developed an actuated EndoWrist. It comprises four motors connected to pulleys, which in turn drive the joints of the EndoWrist through a cable mechanism.

This master's thesis primarily focuses on the kinematic modeling of the available EndoWrist in order to obtain the coupled transformation between motor space and joint space. After successful implementation, the thesis will progress to dynamic identification (with emphasis on friction modeling in the EndoWrist) as well as external torque estimation from motor current measurements. Finally, the proposed model will be validated on a testbed.
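
As an illustration of the kind of model the thesis aims for, the motor-to-joint map of a cable-driven wrist is often approximated by a constant coupling matrix, and external torques can be estimated from motor currents once friction is compensated. The Python sketch below is purely hypothetical: the coupling entries, torque constant, and friction parameters are placeholders, not identified parameters of our EndoWrist setup.

    import numpy as np

    # Hypothetical coupling matrix C mapping motor angles to joint angles,
    # q = C @ theta_m. The entries below are placeholders; the thesis would
    # identify the real coupling on the testbed.
    C = np.array([
        [1.0, 0.0, 0.0,  0.0],   # roll driven by motor 1
        [0.0, 1.0, 0.0,  0.0],   # pitch driven by motor 2
        [0.0, 0.0, 0.5, -0.5],   # yaw: differential mode of motors 3 and 4
        [0.0, 0.0, 0.5,  0.5],   # grip: common mode of motors 3 and 4
    ])

    def motor_to_joint(theta_m):
        """Map motor angles (rad) to EndoWrist joint angles (rad)."""
        return C @ theta_m

    def external_torque(current, velocity, k_t=0.03, tau_c=0.01, b=0.001):
        """Estimate external motor torque from current by subtracting a
        Coulomb-plus-viscous friction model (all parameters are placeholders)."""
        tau_motor = k_t * current                 # torque constant * current
        tau_friction = tau_c * np.sign(velocity) + b * velocity
        return tau_motor - tau_friction

    # Joint-side torques then follow from virtual work: tau_q = inv(C).T @ tau_m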
 

Prerequisites

  • Good MATLAB skills

  • Good understanding of coordinate frames and 3D transformations

  • Excellent mathematical knowledge, specifically linear algebra

  • Creative and independent thinker
     

Helpful but not required 

  • Experience with ROS 

  • Experience with Maxon motor controller  

  • Python, C++
     

Related Literature

  • Lee, C., Park, Y. H., Yoon, C., Noh, S., Lee, C., Kim, Y., ... & Kim, S. (2015). A grip force model for the da Vinci end-effector to predict a compensation force. Medical & Biological Engineering & Computing, 53, 253-261.

  • S. Kim and D. Y. Lee, "Friction-model-based estimation of interaction force of a surgical robot," 2015 15th International Conference on Control, Automation and Systems (ICCAS), Busan, Korea (South), 2015, pp. 1503-1507, doi: 10.1109/ICCAS.2015.7364591. 

  • Longmore, S. K., Naik, G., & Gargiulo, G. D. (2020). Laparoscopic robotic surgery: Current perspective and future directions. Robotics, 9(2), 42. 

  • Guadagni, S., Di Franco, G., Gianardi, D., Palmeri, M., Ceccarelli, C., Bianchini, M., ... & Morelli, L. (2018). Control comparison of the new EndoWrist and traditional laparoscopic staplers for anterior rectal resection with the Da Vinci Xi: a case study. Journal of Laparoendoscopic & Advanced Surgical Techniques, 28(12), 1422-1427.  

  • Abeywardena, S., Yuan, Q., Tzemanaki, A., Psomopoulou, E., Droukas, L., Melhuish, C., & Dogramadzi, S. (2019). Estimation of tool-tissue forces in robot-assisted minimally invasive surgery using neural networks. Frontiers in Robotics and AI, 6, 56.
     

For more information, please contact: 

Mario Tröbinger (mario.troebinger@tum.de)
Dr Hamid Sadeghian (hamid.sadeghian@tum.de)
 

Location:

Forschungszentrum Geriatronik, MIRMI, TUM, Bahnhofstraße 37, 82467 Garmisch-Partenkirchen.
MIRMI, TUM, Georg-Brauchle-Ring 60-62, 80992 München.

The goal of this thesis is to study how vision modulates the planning and execution of movements.

 

Healthy participants will wear an EEG cap in order to study what happens at the brain level during different grasping tasks (task execution in light and in darkness, as well as task imagination).
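
To give a flavor of the signal processing involved, a common first analysis is the band power of sensorimotor rhythms (e.g., the mu band) during task execution and imagination. The sketch below is illustrative only; the channel count and sampling rate are assumptions, not the study's actual setup.

    import numpy as np
    from scipy.signal import welch

    FS = 250  # assumed EEG sampling rate in Hz

    def band_power(eeg, f_lo, f_hi, fs=FS):
        """Average spectral power per channel in [f_lo, f_hi] Hz;
        eeg has shape (n_channels, n_samples)."""
        freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs, axis=-1)
        band = (freqs >= f_lo) & (freqs <= f_hi)
        return psd[:, band].mean(axis=-1)

    # Example: mu-band (8-13 Hz) power on 5 s of synthetic 32-channel data
    trial = np.random.randn(32, 5 * FS)
    mu_power = band_power(trial, 8.0, 13.0)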

Good to have:

  • Good programming and signal processing skills
     

Contacts:

Location/s: Faculty of Biology, LMU Biocenter, Großhaderner Str. 2, Munich, and

MIRMI (TUM), Garching bei München, Carl Zeiss Strasse

 

References:

https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=8008384

https://iopscience.iop.org/article/10.1088/1741-2552/aa8911/pdf

Europe is leading the market for torque-controlled robots. These robots can withstand physical interaction with the environment, including impacts, while providing accurate sensing and actuation capabilities. This opens the door to exploiting intentional dynamic contact transitions for manipulation. This exciting field of research, which we refer to as impact-aware robotics, requires the development of a new holistic framework comprising modeling, learning, planning, sensing, and control aspects, supported by collision-tolerant hardware.

This thesis will focus on developing an impact monitoring pipeline that incorporates pre-existing knowledge of impact and release motions to discriminate between expected and unexpected impacts.
The student will implement a classification pipeline that communicates with the AGX Dynamics physics engine and validate the controller on a Franka Emika dual-arm setup.
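
As a minimal sketch of such a pipeline (the window length, features, and labels below are assumptions, not the project's actual design), one could extract simple statistics from windows of external joint torque and train an off-the-shelf classifier to separate expected from unexpected impacts:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    def window_features(tau_ext):
        """Per-joint peak, mean, and standard deviation of each torque window;
        tau_ext has shape (n_windows, n_samples, n_joints)."""
        peak = np.abs(tau_ext).max(axis=1)
        mean = tau_ext.mean(axis=1)
        std = tau_ext.std(axis=1)
        return np.hstack([peak, mean, std])

    # Synthetic stand-in for logged data: 0 = expected impact, 1 = unexpected
    rng = np.random.default_rng(0)
    tau = rng.normal(size=(200, 100, 7))   # 200 windows, 100 samples, 7 joints
    labels = rng.integers(0, 2, size=200)

    X_train, X_test, y_train, y_test = train_test_split(
        window_features(tau), labels, test_size=0.25, random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_train, y_train)
    print("held-out accuracy:", clf.score(X_test, y_test))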

What you will gain

  • Hands-on experience with Franka Emika robot manipulators
  • Experience with machine-learning classification for a robotic application
  • Experience with software integration of C++ and Python/MATLAB software packages
  • Access to the I.AM. consortium software and network (our academic partners are TU/e, EPFL, and CNRS, while our industrial partners are Franka Emika, Smart Robotics, and Algoryx)


Prerequisites

  • Good C++ programming skills
  • Good MATLAB or Python programming skills
  • Good understanding of dynamical system modeling principles (e.g., the mass-spring-damper model)
  • Familiarity with classification principles (data preprocessing, classifier tuning, etc.)
  • Willingness to become familiar with new software frameworks and integrate them


Helpful but not required

  • Experience with Ubuntu
  • Experience with robot manipulators

Related Literature:

  • B. Proper, A. Kurdas, S. Abdolshah, S. Haddadin and A. Saccon, "Aim-Aware Collision Monitoring: Discriminating Between Expected and Unexpected Post-Impact Behaviors," in IEEE Robotics and Automation Letters, vol. 8, no. 8, pp. 4609-4616, Aug. 2023, doi: 10.1109/LRA.2023.3284371.
  • I. Aouaj, V. Padois and A. Saccon, "Predicting the Post-Impact Velocity of a Robotic Arm via Rigid Multibody Models: an Experimental Study," 2021 IEEE International Conference on Robotics and Automation (ICRA), Xi'an, China, 2021, pp. 2264-2271, doi: 10.1109/ICRA48506.2021.9561768.
  • J. J. van Steen, N. van de Wouw and A. Saccon, "Robot Control for Simultaneous Impact tasks via Quadratic Programming-based Reference Spreading," 2022 American Control Conference (ACC), Atlanta, GA, USA, 2022, pp. 3865-3872, doi: 10.23919/ACC53348.2022.9867812.

Websites

i-am-project.eu


www.algoryx.se/agx-dynamics/


jrl-umi3218.github.io/mc_rtc/

For more information, please contact:

Alessandro Melone at alessandro.melone@tum.de 

We are looking for a HiWi (student assistant) to support the organization of the 2023 International Workshop on Human-Friendly Robotics (HFR), taking place in Munich on September 20-21. The position is available immediately!

The main tasks of the HiWi will be:

  • Creating the website for the workshop on human-friendly robotics;
  • Maintaining and updating the website;
  • Potentially (if desired) communicating with world-renowned speakers;
  • Supporting media advertising;
  • Supporting the digital organization of folders/documents.

Prerequisites:
Experience with web design

Helpful but not required:
German language


Position: 4 to 8 weeks
Hours per week / salary: to be defined depending on availability and experience

For more information, please contact:

Dr Luis Figueredo

Human-Robot Interaction

For the successful integration of robots into our society, task execution is an important skill with which we are trying to equip robots. A task model for the trajectory of human-object interactions has been developed previously; in this project we want to improve the grasping and manipulation aspect of that task model. Some grasps cannot lead to a successful execution and should not be considered during the robot's task execution. For example, when pouring the contents of one cup into another, grasping the cup from the top (by its opening) will not allow the contents to be poured correctly. Also important is the ability to grasp a previously unknown object that is similar in shape to previously seen objects. In addition, a metric indicating the quality of a grasp is a desirable feature for grasp execution.

The system will use a simulation (Vrep/CoppeliaSim) to show a human its exploration of grasps. Upon seeing the simulated exploration, the human provides the system with feedback on whether the shown simulation/alteration of the task still fits the implicit task features (e.g., the feedback could state whether the robot will be able to successfully complete a pouring task with the selected grasp).
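
A minimal sketch of this human-in-the-loop exploration is shown below; sample_grasp_candidates() and show_grasp_in_sim() are hypothetical stubs standing in for the existing grasp planner and the CoppeliaSim visualization interface.

    import random

    def sample_grasp_candidates(obj, n):
        """Stub for the grasp planner: returns n placeholder grasp poses."""
        return [{"position": [random.random() for _ in range(3)]} for _ in range(n)]

    def show_grasp_in_sim(task, obj, grasp):
        """Stub for the CoppeliaSim visualization of one candidate grasp."""
        print(f"[{task}] showing grasp at {grasp['position']}")

    def explore_grasps(task, obj, n_candidates=20):
        """Show candidate grasps one by one and keep those the human judges
        compatible with the implicit task features (e.g., pouring from a cup)."""
        accepted = []
        for grasp in sample_grasp_candidates(obj, n_candidates):
            show_grasp_in_sim(task, obj, grasp)
            answer = input("Would this grasp allow the task to succeed? [y/n] ")
            if answer.strip().lower() == "y":
                accepted.append(grasp)
        return accepted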

Available modules:

  • Visualization of different robot gripper types in Vrep/CoppeliaSim
  • Control interface to Vrep/CoppeliaSim from C++ code
  • Rudimentary grasp and gripper (code-)models
  • Database (not dataset) for storing object information

Tasks may include a few of the following (to be discussed depending on your interest and background):

  • Explore available online datasets of object meshes and of 3D object point clouds (e.g., the YCB dataset) -> select the most comprehensive one
  • Explore & implement available grasp planners (working with object meshes and point clouds) -> select the one performing best on the selected dataset + check whether the planner outputs a grasp metric
  • Take into account different gripper types (e.g., antipodal gripper & human-hand gripper)
  • Create a visualization of the generated grasp points / EE poses in Vrep
  • Explore mesh segmentation approaches in relation to the generated grasps to condense & compress the set of grasps generated by the planner
  • Evaluate & implement methods for determining shape similarity and correspondence between a detected object shape and a list of mesh models from a database (look at CLIP, BERT, and similar resources; a simple geometric baseline is sketched after this list)
  • Automate the correct positioning of visually detected objects in the simulation
  • Generate a visualization framework for the user feedback.
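
For the shape-similarity task above, a simple geometric baseline (independent of learned embeddings such as CLIP or BERT, and purely a sketch) is the symmetric Chamfer distance between point clouds:

    import numpy as np

    def chamfer_distance(a, b):
        """Symmetric Chamfer distance between point clouds a (N, 3) and b (M, 3):
        average nearest-neighbour distance in both directions (lower = more similar)."""
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)  # (N, M) pairs
        return d.min(axis=1).mean() + d.min(axis=0).mean()

    def most_similar(query, database):
        """Name of the database point cloud closest in shape to the query cloud."""
        return min(database, key=lambda name: chamfer_distance(query, database[name]))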

Prerequisites

  • Good C++ programming skills
  • Good understanding of coordinate frames and 3D transformations
  • Willingness to become thoroughly familiar with the CoppeliaSim simulator (and other simulators) and its features (especially concerning user interaction)

Helpful but not required
  • Experience with ROS
  • Experience with robot manipulators
  • Python

Related Literature:

  • Grasp Taxonomy: Paper 1 (https://www.csc.kth.se/grasp/taxonomyGRASP.pdf), Paper 2 (https://ieeexplore.ieee.org/document/7243327)
  • Grasp Planning via Hand-Object Geometric Fitting (https://link.springer.com/article/10.1007/s00371-016-1333-x)
  • Contact-GraspNet: Efficient 6-DoF Grasp Generation in Cluttered Scenes (https://arxiv.org/pdf/2103.14127.pdf)
  • Automatic Grasp Planning Using Shape Primitives (https://www.cs.columbia.edu/~allen/PAPERS/grasp.plan.ra03.pdf)
  • Generating Task-specific Robotic Grasps (https://arxiv.org/pdf/2203.10498v1.pdf)
  • Using Geometry to Detect Grasps in 3D Point Clouds (https://arxiv.org/pdf/1501.03100.pdf)
  • GraspIt! (https://graspit-simulator.github.io/build/html/grasp_planning.html)

For more information, please contact:

Dr Luis Figueredo

Andrei Costinescu

AI-Enabled Laboratory Automation

Other Categories

You can find further research opportunities at the chairs of our Principal Investigators!