
Projects

If you're employed at SDU and want to apply for an R&D project, please contact I40LAB@mmmi.sdu.dk or find more information here (access for SDU employees only).

R&D projects

I4.0 CAPeX will establish a unique self-driving laboratory for P2X materials discovery, which accelerates the discovery process by fully integrating multiple scientific disciplines and techniques to transcend existing sequential and trial-and-error-based discovery approaches. The CAPeX approach will be capable of bridging the extensive separation in spatial and temporal scales between the fundamental processes controlling the electrochemical performance and the degradation processes that govern the durability, reliability, and economic viability of P2X devices. In doing so, we can narrow the divide between fundamental breakthrough science and materials discovery, bringing curiosity-driven strategic research to the proof-of-concept level.

This project will help build competences in the manipulation of flexible objects and the assembly of fragile objects. To this end, we will develop software for modelling, simulation, estimation, and control of flexible objects. In addition, our existing software for force control and peg-in-hole assembly will be refined to ensure small contact forces.

I4.0 CAPeX is a subproject of “Pioneer Center for Accelerating P2X Materials Discovery”, funded by the Danish Ministry of Higher Education and Science, the Danish National Research Foundation, the Carlsberg Foundation, the Lundbeck Foundation, the Novo Nordisk Foundation, and the Villum Foundation.

Contact: Christoffer Sloth

Period: 20.12.2023 - 01.01.2027

Read more: https://www.sdu.dk/en/forskning/sdurobotics/researchprojects/capex

I4.0 Bin-Picking (Developing Bin-Picking Solutions for Robotics) aims to develop and test bin-picking solutions employing computer vision. By utilizing computer vision for bin-picking, two main benefits can be obtained: shorter set-up time and faster run-time. The set-up time can be reduced as modifications only need to be made in the software, without hardware changes. The faster run-time can be achieved as the grasping operation can be performed directly in the bin, without any intermediate steps.  

In recent years, many new algorithms for object pose estimation have been developed at SDU, which have achieved state-of-the-art results on benchmarking datasets. The performance of vision sensors has also developed rapidly in recent years. By implementing the developed pose estimation algorithms with new sensors, the ability to solve bin-picking tasks can be evaluated. 

Currently, MMMI has not tested any modern high-end 3D sensors. This limits the application of current computer vision methods. To the best of our knowledge, the Zivid-2 camera is currently the best possible 3D sensor at an affordable price. Two sensors are tested to obtain a deeper understanding of current possibilities. 

Nevertheless, bin-picking is often a difficult task, and for some scenarios pose estimation solutions cannot be created. Conventionally, pose estimation is only adopted if it can solve every task, and it is therefore often dismissed as an option.

A new methodology is therefore proposed as an alternative to the conventional. As solving a task with computer vision is the most straightforward approach, this should be applied where possible. If a computer vision solution is not possible, alternative solutions should be used. Thus, even if computer vision cannot solve all scenarios, robotic set-ups can still gain from individual solutions. The success of such an approach is then entirely dependent on developing a strategy for determining which method to use.  
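As a sketch of how such a dispatch strategy might look, the logic below prefers the computer vision pipeline when it reports sufficient confidence and falls back to an alternative picking method otherwise. The function names, confidence score, and 0.8 threshold are illustrative assumptions, not project code:

```python
# Hypothetical dispatch strategy: use vision-based pose estimation where it
# can be trusted, otherwise fall back to an alternative picking solution.
from dataclasses import dataclass
from typing import Callable, Optional, Tuple

@dataclass
class PickResult:
    success: bool
    method: str  # which strategy was used: "vision" or "fallback"

def pick_from_bin(
    estimate_pose: Callable[[], Optional[Tuple[tuple, float]]],  # (pose, confidence) or None
    vision_pick: Callable[[tuple], bool],
    fallback_pick: Callable[[], bool],
    min_confidence: float = 0.8,  # illustrative threshold
) -> PickResult:
    """Prefer the vision-based pick; fall back when it cannot be trusted."""
    result = estimate_pose()
    if result is not None:
        pose, confidence = result
        if confidence >= min_confidence:
            return PickResult(vision_pick(pose), "vision")
    # Vision could not solve this scenario: use the alternative solution.
    return PickResult(fallback_pick(), "fallback")
```

The success of the overall approach then hinges on how reliably the confidence estimate separates solvable from unsolvable scenarios.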

Contact: Frederik Hagelskjær

Period: This project was set to end 31.12.2024 and has been extended into 2026

Continuum Robots (CRs), a type of snake-like robot known for their flexibility and adaptability, have become increasingly valuable in both industrial and medical applications. Their unique ability to navigate complex and constrained environments provides significant advantages over traditional rigid-link robots.

Thanks to their flexibility and miniature size, CRs possess great potential for inspection, repair, and production within constrained environments. Applications include PCB repair and the machining of workpiece inner cavities.

The I4.0 CONTINUUM project aims to explore a more complicated CR structure, where one CR or tool operates through the working channel of another CR, creating a nested structure. This design offers several advantages, such as enabling access to tight and constrained spaces while maintaining the robot's rigidity, or allowing the outer tube to provide enhanced vision functionalities.

The primary goals of this project are to develop a nested CR hardware platform and to achieve cooperative control for the nested CR system, enabling more efficient and automated operations in constrained environments.

Contact: Di Wu and Zhuoqi Cheng

Period: This project ends August 2026

Foundation Models for Machine Tending Applications - I4.0 FTend

Despite the increasing availability of collaborative robots, many small and medium-sized enterprises (SMEs) continue to face significant barriers to successful adoption. While recent reports indicate that industrial SMEs in some regions have invested in cobots, many still struggle to implement these systems effectively. One of the most common use cases for these robots is machine tending. However, key challenges persist, including the complexity of robot programming, the need for frequent task reconfiguration due to low-volume production, and a lack of in-house technical expertise. In many cases, robots remain underutilized or unused because of the steep learning curve associated with robot programming interfaces. Although newer interfaces have helped reduce this complexity, they still do not enable non-experts to program robots independently.

This project proposes a modular architecture (see Fig. 1) for intuitive robot programming in machine tending applications, building on the findings of the FERA project. While FERA demonstrated the value of reusing structured data to reduce programming time, this initiative extends that vision by integrating foundation models to enable intelligent, user-friendly interaction with robotic systems. Specifically, the system combines Large Language Models (LLMs) to support natural and multimodal task specification with a two-layer control architecture based on Vision-Language Action (VLA) models to generate robot actions. The goal is to allow SME operators to instruct robots using natural language, without requiring any prior knowledge of robotics or coding.

To make the system accessible to non-experts, an LLM-based interface is introduced as a semantic bridge between the user and the control architecture. The LLM interprets natural language instructions and translates them into structured queries or task descriptions that the VLA can process. This allows operators to specify tasks such as “Pick up the metal part and place it in the CNC machine” without needing to understand robot programming syntax or control logic.
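As an illustration of this semantic bridge, the structured task description an LLM-based interface might emit for the VLA layer, together with a simple validation step, could look like the following. The schema and action vocabulary are hypothetical, not the project's actual format:

```python
# Hypothetical structured output for the instruction:
# "Pick up the metal part and place it in the CNC machine"
task = {
    "steps": [
        {"action": "pick", "object": "metal part", "location": "table"},
        {"action": "place", "object": "metal part", "location": "CNC machine"},
    ]
}

def validate_task(task: dict) -> bool:
    """Check that every step names a known action and an object
    before the task is handed to the VLA control layer."""
    allowed = {"pick", "place", "open_door", "close_door", "start_machine"}
    return all(
        step.get("action") in allowed and "object" in step
        for step in task.get("steps", [])
    )
```

Validating the intermediate representation in this way gives the system a natural point to reject ambiguous instructions and ask the operator for clarification instead of executing a misunderstood task.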

Contact: Juan Esteban Heredia Mena

Period: This project ends 01.03.2027

Read more: About the FERA project

I4.0 Feeding - Escalation on Part Feeding Activities

A big advantage of simulation and synthetic data is the ability to design and experiment virtually before physical construction. This is especially beneficial for part feeding automation equipment, as the setup time and effort required to build and test are common inhibitors to deploying this type of technology efficiently in industry. This holds whether it involves configuring a vision system to handle a new part variant, designing a set of gripper fingers to properly grasp a part, or designing the right set of orienting mechanisms for a vibratory part feeder to always deliver parts in a specific pose to the subsequent manipulation process. Linking these technologies together on a common platform will allow us to investigate their synergies and demonstrate a holistic system that utilizes all of them to solve part feeding tasks more efficiently. The aim is to make part feeding technology more available to industry, especially in situations where the production rate is not high enough to make dedicated solutions economically feasible.

The project will deliver research into the development and realization of a new type of flexible feeder. This feeder will be based on a vibratory feeder and be designed to be modular, and thus fast and easy to reconfigure. Its flexibility will be enabled by novel technologies developed in the project, including simulation-based module design for the mechanical orientation of parts as well as an easy-to-set-up computer vision system for part pose classification to allow active rejection and recirculation. Furthermore, these technologies will be combined to enable near-autonomous design and commissioning of new part feeding solutions.

The project is directly linked with the research activities in MADE FAST 3.04. 

Contact: Simon Faarvang Mathiesen

Period: 01.08.2022 - 31.12.2024, extended into 2026

Read more: 
This project is a subproject of MADE FAST, you can find more information about MADE FAST here.
Article (22.08.2023): ”Disruption” af en basisteknologi alle bruger: LEGO Group og Danfoss er begejstrede (only in Danish)

I4.0 Human-Grasping enables robots to grasp and manipulate objects like humans do. If robots obtain human-like grasping and manipulation capabilities, then feeding systems, custom finger design, and vision-based bin-picking would be obsolete for low-volume production. Thus, it is our hypothesis that the introduction of a universal gripper – a human-like hand – will enable hyperflexible robotic assembly. We aim to demonstrate an assembly task consisting of the following steps:

  • Grasping: Grasp two objects from a table
  • Classification: Classify the grasped object(s) based on tactile feedback
  • In-Hand Manipulation: Move the object of interest to the desired pose in-hand
  • Compliance Shaping: Shape the compliance of the part in-hand to enable assembly

Contact: Christoffer Sloth

Period: The project is running until 31.12.2026

Learning to Grasp Waste - I4.0 LTGW

While significant amounts of plastic waste are collected separately from other waste streams in Denmark, only a minor portion of that plastic waste is reused cleanly; the rest is incinerated or downcycled. To gain the most value, it is important to sort plastic waste primarily by type of plastic.

In previous investigations we encountered multiple challenges when sorting plastic waste. We will therefore build a system with:

  1. A subsystem to predict good grasps for unknown objects (e.g., waste)
  2. Gripper technology suitable for grasping diverse plastic waste parts
  3. Integrated collision detection with a sensed 3D model (using a better 3D sensor) for safer grasping

Once a basic system has been established, further iterations to improve handling success will be performed.

Contact: Dirk Kraft

Period: This project ends 31.12.2028

I4.0 Teleoperation - Low-cost Robot for Teleoperation

The idea with the I4.0 Teleoperation project is to take an existing open-hardware/open-source robot teleoperation system, such as Gello, and implement both hardware and software improvements with two goals in mind:

First, to improve the general usability of the system, for instance by adding an initialization to the system whereby it automatically returns to a pose equivalent to that of the robot manipulator it is controlling (i.e., bilateral teleoperation). This can be implemented either mechanically or in the control loop.

Second, to bring the system more in line with the needs of use cases in the I4.0 Lab and SDU Robotics, for example by adding input scaling or haptic feedback by means of an admittance/impedance controller, allowing a user to “feel” the contact force measured by the robot or the force calculated by a simulation environment. The resulting system will enable us to provide teleoperated demonstrations – with haptic feedback – of assembly/disassembly skills in a simulation environment, allowing us to make use of a wider range of datasets for learning robot assembly/disassembly policies.
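As a minimal sketch of the haptic-feedback idea, a one-dimensional admittance controller turns a measured contact force into a motion offset for the master device, so the user "feels" the force. The virtual mass, damping, and stiffness values below are illustrative assumptions, not tuned parameters from the project:

```python
# Minimal 1-D admittance controller: integrates the virtual dynamics
#   M*a + D*v + K*x = f_ext
# so a measured contact force f_ext produces a position offset x
# that can be rendered on the haptic master device.
class Admittance1D:
    def __init__(self, mass=2.0, damping=20.0, stiffness=100.0, dt=0.001):
        # Illustrative virtual parameters and control-loop period (1 kHz).
        self.m, self.d, self.k, self.dt = mass, damping, stiffness, dt
        self.x = 0.0  # position offset [m]
        self.v = 0.0  # velocity [m/s]

    def step(self, f_ext: float) -> float:
        """Advance the virtual dynamics one time step (semi-implicit Euler)."""
        a = (f_ext - self.d * self.v - self.k * self.x) / self.m
        self.v += a * self.dt
        self.x += self.v * self.dt
        return self.x
```

With a nonzero stiffness the offset converges to f_ext/K for a constant force; setting K to zero instead yields pure mass-damper behavior, a common choice when the operator should drive sustained motion rather than feel a spring.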

This project is related to two other external projects, VP Disassembly and SimBotics.

Contact: Iñigo Iturrate

Period: 15.01.2025 - 30.06.2026

I4.0 Mac-T-Demonstrator - FERA: Machine Tending Demonstrator

Many manufacturing jobs are tied to machine tending, an example of a high-mix/low-volume automation task. Universal Robots (UR) is a Danish, world-leading producer of collaborative robots and, together with partners, provides complete solutions for automating machine tending. However, current approaches involve several design steps and case-by-case programming. This machine tending demonstrator will help us study, together with UR, how to reuse data from comparable but dissimilar setups to make the design and programming of machine tending solutions easier.

The project is funded by Innovation Fund Denmark (IFD) and the methods used will follow the plan of the IFD FERA project. The IFD FERA project plans to use the SDU I4.0Lab for hosting the demonstrators and the developed software. The demonstrator will consist of a robotic cell with a mock-up of a CNC machine. We will use the data sets collected for different machine tending tasks to prepopulate the data infrastructure. The demonstrator will then show how the FERA software tools facilitate faster and more efficient programming of machine tending.

Contact: Mikkel Baun Kjærgaard

Period: The project ends 31.12.2027

Monitoring and Testing of Uncertainty and Safety for Cobots in Industry 4.0 Remanufacturing - I4.0 Motus

AI is being increasingly used in robot control pipelines, as it can simplify the setup of, e.g., computer vision tasks and make it possible to train complex sensorimotor manipulation policies using imitation (IL) or reinforcement learning (RL). For this reason, robot manufacturers such as Universal Robots are starting to release products, like the AI Accelerator kit, that democratize AI solutions by allowing developers to deploy them on their robots, and end-users to easily integrate them into their applications.

Yet AI deployed on robots cannot be treated in the same way as general AI for text or image generation. Robotics is a safety- and performance-critical sector where the criteria and tolerances for acceptance are significantly stricter. The fundamental issue precluding the adoption of AI in robotics in industrial/commercial settings is that most AI systems do not provide bounds on the confidence of their predictions and simply provide an output even in cases they have not been trained on – where this prediction is likely to be wrong. In such cases, the AI system should instead tell the user “I do not know what to do” or “I cannot meet your performance criteria”.

The I4.0-MOTUS project takes as its point of departure a (re)manufacturing use case: the (dis)assembly of a Danfoss Drives frequency converter. It will develop AI-based robotic solutions for its automation and investigate how uncertainty quantification methods can be applied to the AI system to provide bounds on its performance and its ability to meet required tolerances. The main output of the project is expected to be a demonstrator showing how to:

  1. Ensure safety by stopping the system when the AI is operating outside of its training distribution.
  2. Interact with the end-user, informing them on when they should provide the AI with more training data in order to reach their required level of performance.
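A hedged sketch of the first point, the out-of-distribution safety stop: one common uncertainty proxy (assumed here for illustration, not necessarily the project's choice) is the disagreement within an ensemble of models, thresholded to trigger a stop rather than executing an unreliable prediction:

```python
# Illustrative ensemble-disagreement safety gate: when the models disagree
# too much, the observation is treated as out of distribution and the robot
# is stopped instead of acting on an unreliable prediction. The ensemble,
# the threshold, and the scalar action are all simplifying assumptions.
import statistics

def safe_predict(models, observation, max_std=0.05):
    """Return the mean predicted action, or None to signal a safety stop."""
    predictions = [m(observation) for m in models]
    std = statistics.pstdev(predictions)  # disagreement as uncertainty proxy
    if std > max_std:
        return None  # "I do not know what to do": stop and ask the user
    return statistics.fmean(predictions)
```

A `None` return is exactly the hook for the second point: the system can log the observation and prompt the end-user for additional training data before resuming.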

The project is meant to support the activities in MOTUS: Monitoring and Testing of Uncertainty and Safety for Smart Co-bots, funded by DIREC – Digital Research Center Denmark (https://direc.dk/motus-monitoring-and-testing-of-uncertainty-and-safety-for-smart-co-bots/).

Contact: Iñigo Iturrate

Period: 01.03.2026 - 31.12.2027

Advanced robot technologies for the future of pharmaceutical device manufacturing

The aim of this project is to develop advanced technologies for the development and manufacturing of pharmaceutical devices in small-scale production.

The main development fields include robot assembly; simulation, modelling, and control; quality control; simulation-based development of mechanical tools; vision-based bin-picking; and software structures.

The development in these fields will strengthen the collaboration between academia and the pharmaceutical industry, as well as pave the way for digital continuity in product development.

Project partners: Supported by Novo Nordisk and SDU I4.0Lab

Contact: Aljaz Kramberger

Period: This project ends 31.12.2026

In this project, a safe and ergonomic teleoperation system will be developed to handle challenging wind turbine blade maintenance tasks, reducing maintenance costs and improving the safety of the working environment. The safety and ergonomics of the teleoperation system will be improved by using a combination of a kinesthetic and a cutaneous haptic device as the master device for commanding a remote robot. The status of the human operator during teleoperation will be monitored, and the robot's behavior will be adapted to optimize the safety and ergonomics of the system. A physical emulator developed with haptics and VR technologies in the previous I4.0 ASCERTAIN project will be used for effective operator training.

Contact: Cheng Fang

Period: 01.0.2023 - 28.02.2027


SDU I4.0Lab University of Southern Denmark

  • Campusvej 55
  • Odense M - DK-5230

Last Updated 22.05.2025