
We are seeking a motivated student to join the Environmental Robotics Lab at MIRMI and contribute to cutting-edge projects in underwater manipulation. Our research focuses on developing robotic systems capable of performing complex tasks underwater: controlling autonomous underwater vehicles (AUVs) equipped with manipulators to handle objects, conduct repairs, and collect samples. Key challenges include the non-linearity and uncertainty of underwater dynamics while ensuring precise, stable control of the manipulators. This thesis investigates learning-based control strategies for underwater manipulation on small-scale platforms such as the BlueROV2. The goal is to identify which learning-based methods perform best in terms of robustness, accuracy, and efficiency, based on real-world experiments with these platforms.
Tasks:
- Selection of suitable control models: Identify promising learning-based approaches (e.g., reinforcement learning, imitation learning) and relevant model-based baselines.
- Simulation environment adaptation: Modify existing underwater simulation environments to accurately represent manipulation tasks.
- Training and benchmarking: Train selected models/agents and compare their performance against traditional controllers under various conditions.
- ROS integration: Implement the chosen approach in ROS for deployment on our BlueROV2 platform.
- Experimental evaluation: Conduct tests in the lab’s underwater robotics test basin to validate simulation results and assess real-world performance.
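The training-and-benchmarking step above can be sketched as a loop of the following shape. This is a minimal illustration in plain Python, not our actual setup: a toy 1-D depth-tracking model stands in for the AUV dynamics, a hand-tuned PD controller plays the traditional baseline, and random search over gains stands in for an RL training loop (in practice this would be, e.g., Stable-Baselines3 agents in the adapted simulator). All constants are made up for illustration.

```python
import math
import random

def simulate(controller, target=1.0, steps=200, dt=0.05):
    """Toy 1-D depth-tracking task with quadratic drag, a crude
    stand-in for underwater vehicle dynamics. Returns RMS error."""
    z, v, err2 = 0.0, 0.0, 0.0
    for _ in range(steps):
        u = controller(target - z, v)
        a = u - 0.8 * v * abs(v)                 # thrust minus quadratic drag
        v = max(-10.0, min(10.0, v + a * dt))    # clamp to keep the toy model tame
        z = max(-10.0, min(10.0, z + v * dt))
        err2 += (target - z) ** 2
    return math.sqrt(err2 / steps)

def pd_baseline(err, vel, kp=4.0, kd=2.0):
    """Hand-tuned PD controller as the 'traditional' baseline."""
    return kp * err - kd * vel

# "Training": random search over PD gains, a minimal stand-in for an RL loop
random.seed(0)
best_gains, best_cost = (4.0, 2.0), simulate(pd_baseline)
for _ in range(300):
    kp, kd = random.uniform(0.0, 10.0), random.uniform(0.0, 10.0)
    cost = simulate(lambda e, v: kp * e - kd * v)
    if cost < best_cost:
        best_gains, best_cost = (kp, kd), cost

print(f"baseline RMS error: {simulate(pd_baseline):.3f}")
print(f"tuned RMS error:    {best_cost:.3f}  gains: {best_gains}")
```

The same harness structure (fixed scenario set, one scalar metric per controller) carries over directly to comparing trained agents against classical controllers under varying currents and payloads.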
Requirements:
- Strong interest in robotics, machine learning, and control theory
- Experience with Python and ROS
- Familiarity with reinforcement learning frameworks (e.g., PyTorch, TensorFlow, Stable-Baselines3)
- Hands-on experience with robots is a plus
What We Offer:
- Opportunity to work on innovative projects with real-world applications
- Access to state-of-the-art facilities and resources
- Collaborative and supportive research environment
- Flexible working hours to accommodate your academic schedule
How to Apply: Please send your CV, a brief cover letter including your prior experience and what you expect to learn, as well as any supporting documents to Moritz Graf (moritz.graf@tum.de). This project is only open to students enrolled at the Technical University of Munich (TUM). Applications will be reviewed on a rolling basis until the position is filled. The Technical University of Munich is an equal opportunity employer committed to excellence through diversity. We explicitly encourage women to apply, and preference will be given to disabled applicants with equivalent qualifications.
Proposed date: 06/10/2025

Background: Assembly is a crucial skill for robots in both modern manufacturing and service robotics. Robot assembly is generally treated as a vision-tactile, contact-rich manipulation task. Prior studies have improved task performance by naively combining force/torque data with vision information throughout the manipulation process, ignoring the fact that the importance of vision and tactile signals differs across phases [1]. For instance, an assembly task involves three phases: "move to approach", "move to contact", and "insertion". Among these, visual perception plays a key role only in "move to approach", while "move to contact" and "insertion" rely more on force feedback. In our previous works [2][3][4], we developed a full-fledged framework for robotic systems that effectively solves the "move to contact" and "insertion" phases starting from an approximate, random approach pose. To complete the entire manipulation process, this work aims at solving the "move to approach" phase by guiding the robot to an approximately aligned pose. Through active perception (proactive selection of camera views), the system gathers additional data on the relative pose between the male and female parts and uses this information to align the male part with the female part [5][6].
Your Tasks:
1. Propose an autonomy allocation method for our application.
2. Estimate the relative pose and generate the corresponding robot motion commands.
3. Conduct experiments to demonstrate the feasibility and advantages of the method.
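As an illustration of the pose-to-command step, a relative pose estimate can be turned into a motion command with a clipped proportional law. The sketch below is a kinematic toy in plain Python; the planar (dx, dy, dθ) pose parameterization, gains, and velocity limits are illustrative assumptions, not our actual pipeline.

```python
import math

def alignment_command(rel_pose, k_lin=0.5, k_ang=0.8, max_v=0.05, max_w=0.2):
    """Map an estimated relative pose of the male part w.r.t. the female
    part (dx, dy, dtheta) to a clipped proportional velocity command.
    Gains and limits are illustrative, not tuned for any real robot."""
    dx, dy, dtheta = rel_pose
    vx = max(-max_v, min(max_v, k_lin * dx))
    vy = max(-max_v, min(max_v, k_lin * dy))
    wz = max(-max_w, min(max_w, k_ang * dtheta))
    return vx, vy, wz

def step(rel_pose, cmd, dt=0.1):
    """Integrate the command on a kinematic toy model of the misalignment."""
    dx, dy, dtheta = rel_pose
    vx, vy, wz = cmd
    return dx - vx * dt, dy - vy * dt, dtheta - wz * dt

# Drive a simulated initial misalignment toward zero
pose = (0.04, -0.03, math.radians(10))
for _ in range(200):
    pose = step(pose, alignment_command(pose))
print(tuple(round(p, 4) for p in pose))
```

In the actual system the relative pose would come from the active-perception module rather than being given, and the command would be sent to the robot controller instead of a toy integrator.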
Requirements:
- Highly self-motivated
- Python programming experience
- Basic computer vision or machine learning background
Supervisors: Yansong Wu, Dr. Hao Xing yansong.wu@tum.de hao.xing@tum.de
Proposed date: 06/10/2025

Fine-Tuning a Robot Foundation Model for Contact-rich Manipulation
Keywords: VLA; Foundation Model; Contact-rich Manipulation
Background: Recent progress in Vision-Language-Action (VLA) foundation models has opened up new opportunities in robotics, enabling generalization across tasks and environments []. While these models demonstrate impressive performance in vision-guided control, the role of tactile sensing, a crucial modality for contact-rich manipulation, remains underexplored. Only a limited number of studies have investigated how tactile information can be effectively integrated into such models [2-4], leaving a gap in leveraging this modality for improved robot robustness and dexterity. This thesis will investigate whether there is a simpler yet effective way to enhance the performance of VLA foundation models on contact-rich manipulation tasks by fine-tuning a pre-trained model.
Your Tasks:
1. Reproduce a state-of-the-art open-source model.
2. Fine-tune the model on contact-rich manipulation tasks.
3. Contribute towards a publication-oriented study. (The concrete details of the research direction will be discussed in person for confidentiality reasons.)
Requirements:
- High self-motivation and interest in publication-oriented work
- Background in deep learning
- Interest in robotics and robotic manipulation
Supervisors: Yansong Wu, Junnan Li yansong.wu@tum.de junnan.li(at)tum.de
Proposed date: 10/12/2025
Background and Motivation:
Deformable Linear Objects (DLOs), such as cables, are ubiquitous in industrial automation, household robotics, and service applications. Unlike rigid objects, DLOs introduce complex challenges due to their high flexibility, infinite degrees of freedom, and sensitivity to external forces. Accurate manipulation of DLOs is critical for tasks like cable routing in automotive manufacturing, wire harness assembly, or even household chores such as untangling earphones. Traditional robotic manipulation approaches often fall short when dealing with DLOs because they rely heavily on visual sensing alone. Visual feedback is prone to occlusions and limited in capturing fine details like contact forces or subtle deformations. To overcome these limitations, tactile sensing has emerged as a promising complementary modality. By integrating tactile feedback, robots can gain local contact information, improving their ability to model, estimate, and control the dynamic behavior of DLOs in real time.
Research Objectives:
This thesis aims to investigate and develop novel methods for the tactile-based manipulation of linear deformable objects. The main objectives include:
- Designing a framework for integrating tactile sensing with robotic manipulators for DLO handling.
- Developing real-time state estimation techniques that fuse tactile and visual information.
- Exploring collaborative manipulation strategies, either using multiple robotic arms or leveraging environmental constraints, to achieve desired DLO shapes reliably.
- Validating the proposed methods through practical experiments on tasks such as cable routing, knot tying, or shape formation.
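For the state-estimation objective above, the simplest way to fuse a tactile and a visual measurement of the same local quantity is inverse-variance weighting (a scalar Kalman update). The sketch below is an illustrative assumption about how the two modalities could be combined, not the method to be developed; the measurement values and variances are made up.

```python
def fuse(visual, tactile, var_visual, var_tactile):
    """Inverse-variance (scalar Kalman) fusion of a visual and a tactile
    measurement of the same quantity, e.g. a local cable position.
    Returns the fused estimate and its variance."""
    w_v = 1.0 / var_visual
    w_t = 1.0 / var_tactile
    est = (w_v * visual + w_t * tactile) / (w_v + w_t)
    var = 1.0 / (w_v + w_t)
    return est, var

# Vision is noisier (e.g. partial occlusion); touch is locally precise,
# so the fused estimate leans toward the tactile reading.
est, var = fuse(visual=0.120, tactile=0.100, var_visual=4e-4, var_tactile=1e-4)
print(est, var)
```

Note that the fused variance is always smaller than either input variance, which is the formal sense in which tactile feedback "improves" a vision-only estimate; the thesis would extend this idea to the full DLO state rather than a scalar.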
Prerequisites
- Interests in robotics and control
- Excellent mathematical knowledge with good C++ programming and simulation skills
Application
Interested applicants should send their CV, academic transcripts, and a description of previous experience (if any) via email to hamid.sadeghian(at)tum.de and yu.li(at)tum.de
References
[1] K. Chen, Z. Bing, F. Wu, Y. Meng, A. Kraft, S. Haddadin, and A. Knoll, "Contact-aware Shaping and Maintenance of Deformable Linear Objects With Fixtures," 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2023.
[2] J. Zhu, B. Navarro, R. Passama, P. Fraisse, A. Crosnier, and A. Cherubini, "Robotic manipulation planning for shaping deformable linear objects with environmental contacts," IEEE Robotics and Automation Letters, vol. 5, no. 1, pp. 16–23, 2019.
You can find more research opportunities at the Chairs of our Principal Investigators!
Data Protection Information
When you apply for a position with the Technical University of Munich (TUM), you are submitting personal information. With regard to personal information, please take note of the Datenschutzhinweise gemäß Art. 13 Datenschutz-Grundverordnung (DSGVO) zur Erhebung und Verarbeitung von personenbezogenen Daten im Rahmen Ihrer Bewerbung. (data protection information on collecting and processing personal data contained in your application in accordance with Art. 13 of the General Data Protection Regulation (GDPR)). By submitting your application, you confirm that you have acknowledged the above data protection information of TUM.