
Projects

If you're employed at SDU and want to apply for an R&D project, please contact I40LAB@mmmi.sdu.dk or find more information here (access for SDU employees only).

R&D projects

I4.0 CAPeX will establish a unique self-driving laboratory for P2X materials discovery, which accelerates the discovery process by fully integrating multiple scientific disciplines and techniques to transcend existing sequential and trial-and-error-based discovery approaches. The CAPeX approach will be capable of bridging the extensive separation in spatial and temporal scales between the fundamental processes controlling the electrochemical performance and the degradation processes that govern the durability, reliability, and economic viability of the P2X devices. In doing so, we can narrow the divide between fundamental breakthrough science and materials discovery, bringing curiosity-driven strategic research to the proof-of-concept level.

This project will help build competences within the manipulation of flexible objects and the assembly of fragile objects. To this end, we will develop software for modelling, simulation, estimation, and control of flexible objects. Our existing software for force control and peg-in-hole insertion will also be refined to ensure small contact forces.

I4.0 CAPeX is a subproject of “Pioneer Center for Accelerating P2X Materials Discovery”, funded by the Danish Ministry of Higher Education and Science, the Danish National Research Foundation, the Carlsberg Foundation, the Lundbeck Foundation, the Novo Nordisk Foundation, and the Villum Foundation.

Contact: Christoffer Sloth

Period: 20.12.2023 - 01.01.2027

Read more: https://www.sdu.dk/en/forskning/sdurobotics/researchprojects/capex

In this project, we aim to develop an end-to-end framework for setting up force-sensitive tasks on soft and deformable objects. This project will focus on two research tasks:

1) a series elastic actuator (SEA) for constraining and controlling the output force, and

2) a learning-from-demonstration framework for adapting the robot’s impedance profile to the waypoints along the trajectory (see the sketch below).
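To make the second research task concrete, the following is a minimal sketch of a waypoint-indexed impedance profile, assuming per-waypoint Cartesian stiffness and damping with simple linear interpolation along the path; all names and gains are hypothetical, not the project's actual framework.

import numpy as np

class ImpedanceProfile:
    """Per-waypoint stiffness/damping, interpolated along the trajectory."""

    def __init__(self, waypoints, stiffness, damping):
        self.waypoints = np.asarray(waypoints)  # (N, 3) Cartesian waypoints
        self.stiffness = np.asarray(stiffness)  # (N,) stiffness per waypoint [N/m]
        self.damping = np.asarray(damping)      # (N,) damping per waypoint [Ns/m]

    def gains_at(self, s):
        """Interpolate the gains at normalized path progress s in [0, 1]."""
        grid = np.linspace(0.0, 1.0, len(self.waypoints))
        return np.interp(s, grid, self.stiffness), np.interp(s, grid, self.damping)

    def control_force(self, s, x, x_dot, x_des, x_des_dot):
        """Cartesian impedance law: F = K (x_des - x) + D (x_des_dot - x_dot)."""
        k, d = self.gains_at(s)
        return k * (x_des - x) + d * (x_des_dot - x_dot)

# Example: stiff during free motion, compliant near the contact waypoint.
profile = ImpedanceProfile(
    waypoints=[[0.4, 0.0, 0.30], [0.4, 0.0, 0.10], [0.4, 0.0, 0.05]],
    stiffness=[800.0, 300.0, 50.0],
    damping=[60.0, 30.0, 10.0],
)

A learning-from-demonstration framework would then fit the per-waypoint gains from demonstration data instead of hand-tuning them as above.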

Contact: Iñigo Iturrate and Zhouqi Cheng

Period: 01.03.2023 - 30.09.2024

I4.0 Bin-Picking (Developing Bin-Picking Solutions for Robotics) aims to develop and test bin-picking solutions employing computer vision. By utilizing computer vision for bin-picking, two main benefits can be obtained: shorter set-up time and faster run-time. The set-up time can be reduced as modifications only need to be made in the software, without hardware changes. The faster run-time can be achieved as the grasping operation can be performed directly in the bin, without any intermediate steps.  

In recent years, many new algorithms for object pose estimation have been developed at SDU, which have achieved state-of-the-art results on benchmarking datasets. The performance of vision sensors has also developed rapidly in recent years. By implementing the developed pose estimation algorithms with new sensors, the ability to solve bin-picking tasks can be evaluated. 

Currently, MMMI has not tested any modern high-end 3D sensors. This limits the application of current computer vision methods. To the best of our knowledge, the Zivid-2 camera is currently the best 3D sensor available at an affordable price. Two of these sensors are being tested to obtain a deeper understanding of current possibilities.

Nevertheless, bin-picking is often a difficult task, and for some scenarios, pose estimation solutions cannot be created. The conventional approach only employs pose estimation if it can solve all tasks, and pose estimation is therefore often not considered an option.

A new methodology is therefore proposed as an alternative to the conventional one. As solving a task with computer vision is the most straightforward approach, it should be applied where possible. If a computer vision solution is not possible, alternative solutions should be used. Thus, even if computer vision cannot solve all scenarios, robotic set-ups can still gain from individual solutions. The success of such an approach then depends entirely on developing a strategy for determining which method to use, as sketched below.
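As an illustration of such a selection strategy, the sketch below tries a vision-based pose estimate first and falls back to alternative solutions; the function names, the confidence threshold, and the fallback list are assumptions for illustration only.

def pick_from_bin(scene, pose_estimator, fallback_strategies, min_confidence=0.8):
    """Use computer vision where possible; otherwise try alternative solutions."""
    estimate = pose_estimator(scene)
    if estimate is not None and estimate.confidence >= min_confidence:
        return "vision", estimate.grasp_pose
    # Pose estimation cannot solve this scenario; try the alternatives in order,
    # e.g. shaking parts onto a tray, blind picking, or manual presentation.
    for strategy in fallback_strategies:
        result = strategy(scene)
        if result is not None:
            return strategy.__name__, result
    raise RuntimeError("No applicable picking method for this scenario")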

Contact: Frederik Hagelskjær

Period: This project is ending 31.07.2024

This I4.0 Digital Twin project aims to enable model-based robot controller design for in-contact applications. This will be accomplished by developing a high-fidelity simulation of a robot manipulator. The project will start from a UR5e manipulator but will develop a general methodology that applies to all collaborative robots. In other research application areas, such as wind turbine control, high-fidelity models have been used for many years for benchmarking and are considered to have a very small reality gap.

Contact: Christoffer Sloth

Period: Running until 30.09.2024

I4.0 DigiTech

Digital technologies are an important driver for Industry 4.0 solutions. New generations of hardware and software technology enable the collection and processing of data at a much larger scale. New AI technology enables robotic solutions to understand their surroundings and tasks from data in much more detail. New computing platforms enable the software and AI technologies to be executed distributed over edge and cloud nodes, but also create new challenges, e.g., cyber-security.

We aim at developing data, computing, and storage structures to enable the re-use and exploitation of assembly data (before and during assembly) as well as mobile robot data. For that, the use cases need to be analyzed in detail in terms of the relevant data and concrete examples of data re-use (e.g., for predictive maintenance, fleet management, gripper design, or improvement of the system through machine learning), as well as the required software and hardware structure. For the software and hardware structures, there is an explicit focus on the opportunities for distributed edge computing.

Project partners: MiR, Universal Robots, NordBo, Rockwool, Novo Nordisk, Welltec, MADE and Odense Robotics. 

Contact: Mikkel Baun Kjærgaard

Period: Milestone 2 ends 31.12.2023

Read more: I4.0 DigiTech is a subproject of DIREC; you can find more information about DIREC here

I4.0 Feeding - MADE FAST Feeding 

A big advantage of simulation and synthetic data is the ability to virtually design and experiment before physical construction. This is especially beneficial for part feeding automation equipment, as the setup time and effort to build and test are the common inhibitors to deploying this type of technology efficiently in industry. This holds whether it involves configuring a vision system to handle a new part variant, designing a set of gripper fingers to properly grasp a part, or designing the right set of orienting mechanisms for a vibratory part feeder to always deliver parts in a specific pose to the subsequent manipulation process. Linking these technologies together on a common platform will allow us to investigate their synergies and demonstrate a holistic system utilizing all of them to solve part feeding tasks more efficiently. The aim is to make part feeding technology more available to industry, especially in situations where the production rate is not high enough to make dedicated solutions economically feasible.

The project will deliver research into the development and realization of a new type of flexible feeder. This feeder will be based on a vibratory feeder and designed to be modular, and thus fast and easy to reconfigure. Its flexibility will be enabled by novel technologies developed in the project, including simulation-based module design for mechanical orientation of parts as well as an easy-to-set-up computer vision system for part pose classification that allows active rejection and recirculation. Furthermore, these technologies will be combined to enable near-autonomous design and commissioning of new part feeding solutions.

The project is directly linked with the research activities in MADE FAST 3.04. 

Contact: Simon Faarvang Mathiesen

Period: 01.08.2022 - 31.12.2024

Read more: This project is a subproject of MADE FAST; you can find more information about MADE FAST here.

Article (22.08.2023): "Disruption" of a basic technology everyone uses: LEGO Group and Danfoss are enthusiastic (article in Danish only)

The project will make plant simulation available to the SDU I4.0Lab and to researchers at the Faculty of Engineering. The models developed will center on the Factory of the Future, including reconfigurable manufacturing, matrix production, swarm production, etc. Planned cases include the digital twin solutions for the LEGO demonstrator, where we will integrate Verosim with Plant Simulation to model and perform experiments at the plant level. Another case may be to demonstrate and test digital twin solutions for the Danfoss demonstrator, where matrix production will be modeled at the Viking line.

Contact: Arne Bilberg

Period: The project is running until 31.12.2024 with milestone 1 evaluated at 31.12.2023

The I4.0 Fluently project will develop methods for assessing the human operator’s state, e.g., trust, during task operation. The methods will be implemented as behavior trees and tested in a number of experiments carried out in the SDU I4.0Lab. For the experimental work, a digital twin in Isaac Sim will be established, adding another dimension to the HRI. In addition, we will develop a pipeline for learning assembly task parameters, which can be used for online quality assessment of the assembly task. The method consists of two phases: in the first, the learning phase, the system explores and learns the assembly task in simulation, collecting task data that is used to train AI models; in the second phase, these models predict the quality of the assembly execution (a minimal sketch of this two-phase idea follows below). The two methods will be integrated and tested in assembly and pick-and-place experiments.
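The following sketch illustrates the two-phase idea under stated assumptions: a simulated assembly routine that returns a success label, hypothetical task parameters (e.g., offsets and insertion force), and a generic off-the-shelf classifier; none of these are the project's actual choices.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

def learning_phase(simulate_assembly, n_trials=500):
    """Phase 1: explore the task in simulation and collect task data."""
    X, y = [], []
    for _ in range(n_trials):
        # Hypothetical task parameters, e.g. lateral offsets and insertion force.
        params = np.random.uniform([-1.0, -1.0, 0.0], [1.0, 1.0, 5.0])
        X.append(params)
        y.append(simulate_assembly(params))  # True/False success label
    return RandomForestClassifier(n_estimators=100).fit(np.array(X), np.array(y))

def predict_quality(model, observed_params):
    """Phase 2: online quality estimate for the running assembly execution."""
    # Probability of the "success" class, assuming both outcomes were observed.
    return model.predict_proba(np.atleast_2d(observed_params))[0, 1]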

Contact: Leon Bodenhagen

Period: The project ends 31.12.2025

Read more: The project is a subproject of the Fluently project under Horizon; find more information here.

The I4.0 GremRob project aims, in cooperation with its sister project GremeOH under InnoMission2, to develop methodologies and systems for efficient and reliable assembly of electrolyzer stacks from Green Hydrogen Systems. The stack components comprise both rigid and flexible parts, and hence the handling is challenging. We will use advanced gripper development, force-based control, and computer vision to ensure efficiency and robustness.

Project partners: SDU Robotics and SDU Mechanical Engineering

Contact: Henrik Gordon Petersen

Period: 01.08.2023 - 31.12.2024

Read more: The project is an activity supplementing the MissionGreenFuels project under InnoMission2. You can find more information about the InnoMissions here

I4.0 Human-Grasping enables robots to grasp and manipulate objects like humans do. If robots obtain human-like grasping and manipulation capabilities, then feeding systems, custom finger design, and vision-based bin-picking would be obsolete for low-volume production. Thus, it is our hypothesis that the introduction of a universal gripper, a human-like hand, will enable hyperflexible robotic assembly. We aim at demonstrating an assembly task consisting of the following steps:

  • Grasping: Grasp two objects from a table
  • Classification: Classify the grasped object(s) based on tactile feedback
  • In‐Hand Manipulation: Move object of interest to desired pose in‐hand
  • Compliance Shaping: Shape the compliance of the part in‐hand to enable assembly

Contact: Christoffer Sloth

Period: The project is running until 31.12.2025 with a milestone at 30.09.2023

Computers are now ubiquitous, and the use of technology keeps spreading, especially in industrial environments. This growth has a downside, however: a shortage of qualified people to develop, maintain, and utilize the technology. Over the past two decades, researchers have explored various methods and techniques to address this problem and enable end-users to create or customize software systems. The present project investigates these methods and techniques, focusing on a specific use case provided by a company facing this very problem.

Enabled Robotics specializes in robotics. They develop a mobile manipulator, the “ER-FLEX”, consisting of a collaborative robot arm mounted on top of a MiR mobile platform. The company provides their robots to customers in various fields such as manufacturing, healthcare, research, and education.

Their use case for this project concerns the deployment of the robot in an industrial warehouse, where it is mainly used to perform logistic tasks such as transporting, loading, and unloading objects, placing objects on shelves, etc. The robot works autonomously and is sometimes remotely controlled by an operator who sends commands from an interface.

In the warehouse environment, the robots use laser sensors to create a 2D map of the surroundings. They are deployed to move boxes between two areas: a Production area, including the production line, and a Warehouse area comprising storage shelves.

To enable the robots to pick up or place objects on shelves, a preliminary operation is required. An operator must systematically move the robot’s arm along each column of the shelves to record the exact locations of the frames where the boxes are to be taken or placed. Given that industrial storage units consist of numerous frames, this preliminary operation consumes significant time and energy. The company is working to eliminate this process and find a more efficient and autonomous solution for object picking in storage furniture. When a box needs to be picked up but is not found on a shelf, the robot moves on and takes care of another task. Therefore, additional features should be added to the solution to verify and validate the success of each process.

A simulation environment will be employed to test and simulate different operation scenarios. One of the challenges faced by this project is therefore how to integrate real-world environment data into the simulation, considering the scalability of the process. Warehouses are typically vast spaces with diverse object placements and varying configurations, so accurately reproducing the three-dimensional space introduces an important challenge.

Another challenge faced by the project relates to the programming interface. The interface for controlling the robot currently uses block-based programming, where modular portions are dragged and assembled to form a task. Executable actions are created from the assembled blocks, where each block is an instruction. For now, coding through the interface often results in dense and extended block programs. Moreover, the programming environment still requires the end user to master fundamental programming concepts such as conditionals and loops. The programming logic must therefore be simplified and adapted to provide a tool that is simpler to control and accessible to every user profile.

Project partners: The project is a collaboration between the company Enabled Robotics, Roskilde University, SDU Software Engineering, and SDU Robotics, supported by DIREC (Digital Research Centre Denmark) and the SDU I4.0Lab.

Contact: Aljaz Kramberger

Period: Running until 31.10.2026

 

I4.0 MFE – I4.0 MADE FAST Electronics

The I4.0 MADE FAST Electronics project aims, in cooperation with its sister project MADE FAST Electronics, to develop methodologies and systems for the efficient assembly of through-hole components on printed circuit boards, enabling high-mix, low-volume production of PCBs in Denmark. The three main focus areas are feeding (how to singulate the components to be inserted) from different packaging types, insertion (strategies for inserting different components reliably into the PCB), and monitoring (ongoing process monitoring to detect faults that will lead to stopping conditions).

On the feeding side, the project works on improving existing bulk feeding mechanisms and integrating feeding technologies for more structured packaging styles (e.g., tube or tray) as well. We aim to develop two new vibration tray feeding mechanisms. Further investigations of force-based insertion, especially for more challenging components (e.g., snap lock, large/heavy components) are performed. For process monitoring, we are investigating different methods to evaluate the progress of the assembly process based on prior performance.
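As one plausible reading of the monitoring idea, the sketch below builds a statistical envelope from prior successful insertion force profiles and flags a stopping condition when a new profile leaves it; the envelope width, data layout, and function names are assumptions, not the project's actual method.

import numpy as np

def build_envelope(successful_profiles, n_sigma=3.0):
    """Mean +/- n_sigma envelope over prior successful force profiles."""
    profiles = np.asarray(successful_profiles)  # shape: (n_runs, n_samples)
    mean, std = profiles.mean(axis=0), profiles.std(axis=0)
    return mean - n_sigma * std, mean + n_sigma * std

def check_insertion(profile, envelope):
    """Return (ok, indices of samples where the force left the envelope)."""
    low, high = envelope
    violations = (profile < low) | (profile > high)
    return not violations.any(), np.flatnonzero(violations)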

Project partners: The sister project MADE FAST Electronics is performed in cooperation with the following industrial partners: Paul E. Danchell A/S, Danfoss, LEGO, Terma A/S, Robot Nordic Aps, KUKA Nordic AB and Danish Technological Institute.

Contact: Dirk Kraft

Period: Running until 31.10.2024

Read more: MADE FAST Electronics is a subproject of MADE FAST; you can find more information about MADE FAST here.

Advanced robot technologies for the future of pharmaceutical device manufacturing

The aim of this project is to develop advanced technologies for the development and manufacturing of pharmaceutical devices in small-scale production.

The main development fields include robot assembly; simulation, modeling, and control; quality control; simulation-based development of mechanical tools; vision-based bin picking; and software structures.

The development in these fields will strengthen the collaboration between academia and the pharmaceutical industry, as well as pave the way for digital continuity in product development.

Project partners: Supported by Novo Nordisk and SDU I4.0Lab

Contact: Aljaz Kramberger

Period: This project ends 31.12.2024

Although the processes of milling and pick/place have been studied extensively in academia and industry, there are few studies on how to deploy them for disassembly. The project Robotic Disassembly of Plastic Components (I4.0 RDPC) focuses on this problem. We study the different disassembly processes and how they can be facilitated by design. These studies will provide input to the execution of advanced disassembly processes, and it will be tested which of these processes can be programmed by PbD. Finally, we will deploy results from other projects to test bin, tray, or table picking, and whether vision/AI methods can be used to adjust for misalignments between, e.g., screwdriver and screw.

Project partners: SDU Robotics and SDU Innovation and Design Engineering

Contact: Henrik Gordon Petersen

Period: 01.01.2023 - 31.12.2024

Read more: The project is an activity supplementing the Design for Disassembly project under InnoMission4. You can find more information about the InnoMissions here

I4.0 ROBOTIC TIMBER ASSEMBLY (RTA)

The collaborative set-up for I4.0 Robotic Timber Assembly combines the use of parametric and computational design algorithms, Mixed Reality (MR), and collaborative robots to create real-time adaptive timber construction processes. The developed automated assembly/disassembly activity will be applied to a real-scale architectural demonstration project. The goals of this project:

  • Use of collaborative robots in the assembly of wood construction
  • Development of digital twin and feedback loops for real-time assembly and disassembly of reversible and reusable timber structures
  • Integration of Mixed Reality (MR) and collaborative robots (UR) in an automated and digitally-enhanced human-robot collaborative construction process

 

Brief description of advancements: 

The RTA project in the I4.0 Lab employs human-robot collaborative assembly of reconfigurable and reusable timber elements into structural building components, namely the ReconWood construction system. Humans, equipped with Mixed Reality (MR) devices, and robots, equipped with a multi-phase end-effector (a gripper for pick and place, a wrist camera for hole detection, and a screwdriver for screwing operations), interact in real-time in the shared cyber-physical space to assemble the ReconWood blocks.

 

Design Digital Twin / DDT

The design, engineering, segmentation of the global structure for assembly, and assembly path-planning of the ReconWood structures are developed in a parametric design environment. The geometric and scalar data extracted from the design workflow, such as the geometry of the piece, the number and locations of the holes, the positioning of the bolts and nuts, the assembly sequence, and the position of a specific ReconWood block in the overall structure, are sent from the design environment to an online database, namely the Design Digital Twin, from where the data can be accessed and further used.

Link to ReconWood Slab Design Digital Twin
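To illustrate the kind of per-block record described above, here is a hypothetical dataset as it might be pushed to the DDT database; the field names and schema are illustrative assumptions, not the project's actual data model.

from dataclasses import dataclass, field

@dataclass
class DDTBlockRecord:
    part_name: str                                       # design part name ("P name")
    geometry: str                                        # reference to the block geometry
    hole_positions: list = field(default_factory=list)   # (x, y, z) per hole
    bolt_positions: list = field(default_factory=list)   # bolt/nut placements
    assembly_step: int = 0                               # index in the assembly sequence
    target_pose: tuple = (0, 0, 0, 0, 0, 0)              # block pose in the overall structure

record = DDTBlockRecord(
    part_name="P-017",
    geometry="reconwood/block_240.obj",
    hole_positions=[(0.03, 0.02, 0.0), (0.21, 0.02, 0.0)],
    assembly_step=17,
    target_pose=(1.2, 0.4, 0.6, 0.0, 0.0, 90.0),
)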

 

Material Digital Twin / MDT

ReconWood blocks are discrete and modular in their lengths and types. Each ReconWood block is assigned a unique identification tag and a corresponding QR code imprint. The block identification tag (B name) contains the data regarding the block’s length and type. Once a specific block is used in a design process, it is also given its design part name (P name). To prepare the ReconWood blocks for assembly, a set of insert steel nuts are embedded in specific holes along the piece, which are retrieved for each block from the DDT. Here, Mixed Reality (MR) is used to visualize the locations of the insert nuts for each piece and to automatically match the specific block’s B name and P name. This data is stored in an online material database, namely the Material Digital Twin. With a view to material re-use and reconfigurability, each material piece will have a stored design/use history.

 

Link to ReconWood Material Digital Twin

 

Assembly Digital Twin / ADT
Thanks to the QR code imprints and the link to their digital datasets, ReconWood blocks can be tracked and traced during the assembly operations. The assembly is carried out both in a human-robot collaborative setup and as a manual assembly with the use of Mixed Reality. The assembly data for each block is sent in real-time to the Assembly Digital Twin, where specific aspects such as the time breakdown of the different operations (pick block, place block, pick screw, insert screw), hole detection deviations, and workload distribution between different assembly actors (e.g., two robots) are tracked and plotted. This allows for comparing the efficiency of the robotic and the manual MR assembly, as well as keeping a record of all assembly operations related to individual ReconWood blocks.

 Link to ReconWood Assembly Digital Twin
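As an illustration of the tracking described above, the sketch below aggregates hypothetical ADT events into a time breakdown per operation and a workload distribution per actor; the event schema and all values are invented for the example.

from collections import defaultdict

# Each event: (block_id, operation, actor, duration in seconds); schema is assumed.
events = [
    ("B-240-a", "pick block", "robot_1", 6.2),
    ("B-240-a", "place block", "robot_1", 4.8),
    ("B-240-a", "insert screw", "robot_2", 9.1),
    ("B-118-c", "pick block", "human_MR", 3.9),
]

time_per_operation = defaultdict(float)
workload_per_actor = defaultdict(float)
for _, operation, actor, duration in events:
    time_per_operation[operation] += duration
    workload_per_actor[actor] += duration

total = sum(workload_per_actor.values())
for actor, t in sorted(workload_per_actor.items()):
    print(f"{actor}: {t:.1f} s ({100 * t / total:.0f}% of tracked work)")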

Period: Running until 01.04.2024

In this project, a safe and ergonomic teleoperation system will be developed for challenging wind turbine blade maintenance tasks, reducing the maintenance cost and improving the safety of the working environment. The safety and ergonomics of the teleoperation system will be improved by using a combination of a kinesthetic and a cutaneous haptic device as the master device for commanding a remote robot. The human operator’s status during teleoperation will be monitored, and the robot behavior will be adapted to optimize the safety and ergonomics of the system. A physical emulator developed with haptics and VR technologies in the previous I4.0 ASCERTAIN project will be used for effective operator training.

Contact: Cheng Fang

Period: 01.0.2023 - 28.02.2027

 

 

I4.0 Spatais – Grasping of unknown objects for trash sorting

The Spatais project deals with grasping (potentially for sorting) of unknown objects. It contains a vision component for detecting/localizing the objects on a conveyor belt, a component for developing grippers for this application, and lastly a component that matches detected objects to grippers and suggests specific grasps. The developed technologies should be applicable to other challenges dealing with unknown objects as well.

Detecting and grasping unknown objects essentially corresponds to one-of-a-kind tasks, and hence is well-aligned with the vision of handling high-mix-low-volume tasks. The developed technologies can be used for industrial manufacturing tasks of detecting, grasping, and placing objects where no CAD models are available. This encompasses form-unstable objects, which is well-aligned with other activities in the I4.0Lab. Hence, the knowledge obtained in this project can be transferred to other I4.0 project activities, such as bin picking of form-unstable objects.

The work is also funded through the SPATAIS project (https://www.sdu.dk/en/forskning/sdurobotics/researchprojects/spatais), which is funded through TRACE (https://trace.dk/), which in turn is funded by Innovation Fund Denmark (https://innovationsfonden.dk/en).

Contact: Dirk Kraft

Period: Running until 31.12.2024

 

See concluded R&D projects
