If you're employed at SDU and want to apply for an R&D project, please contact I40LAB@mmmi.sdu.dk or read more here (access for SDU employees only).
R&D projects
I4.0 Rockwool - Advanced reconfigurable tool for quality inspection of flexible material
The purpose of this activity is to develop a highly adaptable robot tool, based on in-house designed mechanical arms, for handling flexible Rockwool products to be incorporated into an automatic quality control system.
Project partners: The companies involved are Rockwool, Danish Technological Institute and C&H System A/S. The development will be done in collaboration between SDU Robotics and SDU Mechanical Engineering.
Contact: Aljaz Kramberger
Period: 01.08.2022 - 31.10.2023
Read more: I4.0 Rockwool is an activity supplementing the MADE Fast project 3.18 Rockwool. You can find more information about MADE FAST here.
In this project, we aim to develop an end-to-end framework for setting up force-sensitive tasks on soft and deformable objects. This project will focus on two research tasks:
1) a series elastic actuator (SEA) for constraining and controlling the output force, and
2) a learning-from-demonstration framework for adapting the robot’s impedance profile to the waypoints along the trajectory.
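As an illustration of the second research task, the sketch below interpolates a per-waypoint stiffness along a normalized trajectory and applies a standard spring-damper impedance law. All function names, values, and the linear interpolation scheme are illustrative assumptions, not the project's actual framework.

```python
import numpy as np

def interpolate_stiffness(waypoint_s, waypoint_k, s):
    """Stiffness at normalized path position s, linearly interpolated
    between the stiffness values assigned to the waypoints."""
    return np.interp(s, waypoint_s, waypoint_k)

def impedance_force(k, x_desired, x_actual, d, v_actual):
    """Standard spring-damper impedance law: F = k*(x_d - x) - d*v."""
    return k * (x_desired - x_actual) - d * v_actual

# Example: stiff in free motion, compliant near the contact waypoint at s = 0.5.
s_wp = np.array([0.0, 0.5, 1.0])        # waypoint positions along the path
k_wp = np.array([800.0, 150.0, 800.0])  # stiffness (N/m) taught per waypoint
k_mid = interpolate_stiffness(s_wp, k_wp, 0.5)  # 150.0 N/m at the contact point
```

In a learning-from-demonstration setting, the per-waypoint stiffness values would be estimated from the demonstrations (e.g., from observed force or position variance) rather than set by hand as above.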
Contact: Iñigo Iturrate and Zhouqi Cheng
Period: 01.03.2023 - 30.09.2024
In this PhD project, I4.0 Danfoss Demonstrator, a virtual description of the robot systems is created in the form of a Digital Twin in a software tool (VEROSIM) that allows simulations of assembly tasks to be run and the same logic to be executed using physical components. Robot tasks are represented as Services and visualized as ServiceBlocks in a Visual Programming interface. Programs are automatically generated starting from a digital description of the assembly task and can be easily modified thanks to the simplified programming interface, reducing the complexity of handling product variants. Among the partner companies, KUKA is collaborating to define the software solutions to control industrial robots, while LEGO and Danfoss are providing industrial use cases to apply the technology in real manufacturing environments (the LEGO Demonstrator and the proposed Danfoss Demonstrator).
Project partners: KUKA, LEGO and Danfoss Drives
Contact: Alberto Sartori
Period: 01.09.2022 - 30.09.2023
Read more: I4.0 Danfoss Demonstrator is an activity supplementing the MADE FAST project 3.02 - Agile programming - in collaboration with KUKA, LEGO and Danfoss Drives. You can find more information about MADE FAST here, and an article about the MADE LEGO demonstrator here (only in Danish).
The main goal of an earlier project was investigating deep learning methods for semantic understanding (e.g., action segmentation and classification) of demonstrations of industrial assembly tasks using limited sensory input. This I4.0 Deep Learning from Demonstration project will further develop this methodology, as well as research state-of-the-art methods, such as implicit behavioral cloning, for robust uncalibrated policy learning from demonstration.
Contact: Iñigo Iturrate and Thiusius R. Savarimuthu
Period: The project runs until 31.12.2023
I4.0 Bin-Picking (Developing Bin-Picking Solutions for Robotics) aims to develop and test bin-picking solutions employing computer vision. By utilizing computer vision for bin-picking, two main benefits can be obtained: shorter set-up time and faster run-time. The set-up time can be reduced as modifications only need to be made in the software, without hardware changes. The faster run-time can be achieved as the grasping operation can be performed directly in the bin, without any intermediate steps.
In recent years, many new algorithms for object pose estimation have been developed at SDU, which have achieved state-of-the-art results on benchmarking datasets. The performance of vision sensors has also developed rapidly in recent years. By implementing the developed pose estimation algorithms with new sensors, the ability to solve bin-picking tasks can be evaluated.
Currently, MMMI has not tested any modern high-end 3D sensors, which limits the application of current computer vision methods. To the best of our knowledge, the Zivid-2 camera is currently the best 3D sensor available at an affordable price. Two sensors will be tested to obtain a deeper understanding of current possibilities.
Nevertheless, bin-picking is often a difficult task, and for some scenarios, pose estimation solutions cannot be created. The conventional approach adopts pose estimation only if it can solve all tasks, and it is therefore often dismissed as an option.
A new methodology is therefore proposed as an alternative to the conventional one. As solving a task with computer vision is the most straightforward approach, it should be applied where possible. If a computer vision solution is not possible, alternative solutions should be used. Thus, even if computer vision cannot solve all scenarios, robotic set-ups can still gain from individual solutions. The success of such an approach is then entirely dependent on developing a strategy for determining which method to use.
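Such a per-scenario strategy selection could be sketched as below: attempt a computer-vision pose-estimation pick first, and fall back to an alternative handling strategy when no sufficiently confident pose is found. The function name, confidence score, and threshold are hypothetical illustrations, not the project's implementation.

```python
def select_strategy(pose_confidence, threshold=0.8):
    """Pick directly in the bin if pose estimation is confident enough,
    otherwise fall back to an alternative handling strategy."""
    if pose_confidence is not None and pose_confidence >= threshold:
        return "direct_bin_pick"
    return "fallback_table_pick"

# A confident detection picks directly in the bin; a weak or missing
# detection triggers the fallback (e.g., emptying parts onto a table).
strategy = select_strategy(0.92)  # "direct_bin_pick"
```

The interesting research question is how to compute a reliable confidence measure; the dispatch itself is deliberately trivial.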
Contact: Frederik Hagelskjær
Period: This project is ending 31.07.2024
This I4.0 Digital Twin project aims to enable model-based robot controller design for in-contact applications. This will be accomplished by developing a high-fidelity simulation of a robot manipulator. The project takes the UR5e manipulator as its starting point but will develop a general methodology that applies to all collaborative robots. In other research areas, such as wind turbine control, high-fidelity models have been used for many years for benchmarking and are considered to have a very small reality gap.
Contact: Christoffer Sloth
Period: Running until 30.09.2023
I4.0 DigiTech
Digital technologies are an important driver for Industry 4.0 solutions. New generations of hardware and software technology enable the collection and processing of data at a much larger scale. New AI technology enables robotic solutions to understand their surroundings and tasks from data in much more detail. New computing platforms enable the software and AI technologies to be executed distributed over edge and cloud nodes; however, they also create new challenges, e.g., cyber-security.
We aim to develop data, computing, and storage structures to enable the re-use and exploitation of assembly data (before and during assembly) as well as mobile robot data. For that, the use cases need to be analyzed in detail in terms of the relevant data and concrete examples of data re-use (e.g., for predictive maintenance, fleet management, gripper design, or improvement of the system through machine learning), as well as the required software and hardware structure. For software and hardware structures, there is an explicit focus on the opportunities of distributed edge computing.
Project partners: MiR, Universal Robots, NordBo, Rockwool, Novo Nordisk, Welltec, MADE and Odense Robotics.
Contact: Mikkel Baun Kjærgaard
Period: Milestone 2 ends 31.12.2023
Read more: I4.0 DigiTech is a subproject of DIREC; you can find more information about DIREC here.
I4.0 Feeding - MADE FAST Feeding
A big advantage of simulation and synthetic data is the ability to design and experiment virtually before physical construction. This is especially beneficial for part feeding automation equipment, as the setup time and the effort to build and test are the common inhibitors to deploying this type of technology efficiently in industry. This holds whether it involves configuring a vision system to handle a new part variant, designing a set of gripper fingers to properly grasp a part, or designing the right set of orienting mechanisms for a vibratory part feeder to always deliver parts in a specific pose to the subsequent manipulation process. Linking these technologies together on a common platform will allow us to investigate their synergies and demonstrate a holistic system that utilizes all of them to solve part feeding tasks more efficiently. The aim is to make part feeding technology more available to industry, especially in situations where the production rate is not high enough to make dedicated solutions economically feasible.
The project will deliver research into the development and realization of a new type of flexible feeder. This feeder will be based on a vibratory feeder and be designed to be modular and thus fast and easy to reconfigure. Its flexibility will be enabled by novel technologies developed in the project, including simulation-based module design for mechanical orientation of parts as well as an easy-to-set-up computer vision system for part pose classification to allow active rejection and recirculation. Furthermore, these technologies will be combined to enable near-autonomous design and commissioning of new part feeding solutions.
The project is directly linked with the research activities in MADE FAST 3.04.
Contact: Simon Faarvang Mathiesen
Period: 01.08.2022 - 31.12.2023
Read more: This project is a subproject of MADE FAST; you can find more information about MADE FAST here.
Article (22.08.2023): ”Disruption” af en basisteknologi alle bruger: LEGO Group og Danfoss er begejstrede (only in Danish)
The project will make plant simulation available to the SDU I4.0Lab and to researchers at the Faculty of Engineering. The models developed will center on the Factory of the Future, including reconfigurable manufacturing, matrix production, swarm production, etc. Cases in planning include digital twin solutions for the LEGO demonstrator, where we will integrate VEROSIM with Plant Simulation to model and perform experiments at plant level. Another case may be to demonstrate and test digital twin solutions for the Danfoss demonstrator, where matrix production will be modeled on the Viking line.
Period: Running until 31.12.2024
The purpose of I4.0 FCobot is to scale up the effort put into the various methods for grasp planning, object localization, and world modeling currently being researched and developed at SDU Robotics and SDU Software Engineering. These methods have already become a research and development topic under the new FacilityCobot project, but with increased funds for equipment, it can be ensured that the relevant technology is continuously developed and matured with SDU at the center.
Project partners: The sister project FacilityCobot is performed in cooperation with the following industrial partners: Enabled Robotics, ISS A/S and UbiquiSense ApS.
Contact: Aljaz Kramberger
Period: Running until 01.10.2023
I4.0 Human-Grasping enables robots to grasp and manipulate objects like humans do. If robots obtain human-like grasping and manipulation capabilities, then feeding systems, custom finger design, and vision-based bin-picking would be obsolete for low-volume production. Thus, it is our hypothesis that the introduction of a universal gripper – a human-like hand – will enable hyperflexible robotic assembly. We aim at demonstrating an assembly task consisting of the following steps:
- Grasping: Grasping two objects from a table
- Classification: Classify the grasped object(s) based on tactile feedback
- In‐Hand Manipulation: Move object of interest to desired pose in‐hand
- Compliance Shaping: Shape the compliance of the part in‐hand to enable assembly
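The four steps above can be sketched as a simple staged pipeline. The stage names mirror the list, while the runner and handlers are purely illustrative placeholders, not project code.

```python
# Ordered stages of the demonstrated assembly task.
PIPELINE = ["grasping", "classification", "in_hand_manipulation", "compliance_shaping"]

def run_pipeline(handlers, state):
    """Run each stage in order, passing the evolving task state along."""
    for stage in PIPELINE:
        state = handlers[stage](state)
    return state

# Trivial placeholder handlers that just record which stages executed.
handlers = {stage: (lambda s, name=stage: s + [name]) for stage in PIPELINE}
result = run_pipeline(handlers, [])  # == PIPELINE
```

In a real system, each handler would wrap hand control, tactile classification, or compliance-shaping controllers; the point here is only the fixed ordering of the stages.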
Contact: Christoffer Sloth
Period: Milestone 1 is running until 30.09.2023
I4.0 MFE – I4.0 MADE FAST Electronics
The I4.0 MADE FAST Electronics project aims at, in cooperation with its sister project MADE FAST Electronics, developing methodologies and systems for efficient assembly of through-hole components on printed circuit boards that allow for high-mix, low-volume production of PCBs in Denmark. The three main focus areas are feeding (how to singulate the components to be inserted from different packaging types), insertion (strategies to insert different components reliably into the PCB) and monitoring (ongoing process monitoring to detect faults that would lead to stopping conditions).
On the feeding side, the project works on improving existing bulk feeding mechanisms and integrating feeding technologies for more structured packaging styles (e.g., tube or tray) as well. We aim to develop two new vibration tray feeding mechanisms. Further investigations of force-based insertion, especially for more challenging components (e.g., snap lock, large/heavy components) are performed. For process monitoring, we are investigating different methods to evaluate the progress of the assembly process based on prior performance.
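As a minimal illustration of monitoring against prior performance, the sketch below flags an insertion as anomalous when its force profile deviates from the mean of prior successful runs by more than k standard deviations at any sample. This is an assumed toy method for illustration, not the project's actual monitoring approach.

```python
import numpy as np

def is_anomalous(profile, prior_profiles, k=3.0):
    """Flag a force profile whose deviation from the mean of prior
    successful runs exceeds k standard deviations at any sample."""
    prior = np.asarray(prior_profiles, dtype=float)
    mean = prior.mean(axis=0)
    std = prior.std(axis=0) + 1e-9  # guard against zero variance
    z = np.abs(np.asarray(profile, dtype=float) - mean) / std
    return bool(np.any(z > k))

# Example: three prior successful insertion force profiles (N per sample).
priors = [[1.0, 2.0, 3.0], [1.1, 2.1, 3.1], [0.9, 1.9, 2.9]]
```

A profile close to the prior mean passes, while a profile with a large force spike would trigger a stopping condition.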
Project partners: The sister project MADE FAST Electronics is performed in cooperation with the following industrial partners: Paul E. Danchell A/S, Danfoss, LEGO, Terma A/S, Robot Nordic Aps, KUKA Nordic AB and Danish Technological Institute.
Contact: Dirk Kraft
Period: Running until 31.10.2024
Read more: MADE FAST Electronics is a subproject of MADE FAST; you can find more information about MADE FAST here.
The aim of this project is to develop advanced technologies for the development and manufacturing of pharmaceutical devices in small-scale production.
The main development fields include robot assembly; simulation, modeling and control; quality control; simulation-based development of mechanical tools; vision-based bin-picking; and software structures.
The development in these fields will strengthen the collaboration between academia and the pharmaceutical industry, as well as pave the road for digital continuity in product development.
Project partners: Supported by Novo Nordisk and SDU I4.0Lab
Contact: Aljaz Kramberger
Period: This project ends 31.12.2023
SDU I4.0Lab supports the development of no-code robot programming for low-volume manufacturing. We aim to generate highly reliable robot programs for industrial assembly tasks from kinesthetic teaching. The project focuses on modelling, robot control, and parameter estimation.
Contact: Christoffer Sloth
Period: Running until 01.04.2024
Although the processes of milling and pick-and-place have been studied extensively in academia and industry, there have been few studies of how to deploy them for disassembly. This project, Robotic Disassembly of Plastic Components (I4.0 RDPC), focuses on this problem. The different disassembly processes are studied, along with how they can be facilitated by design. These studies will provide input to the execution of advanced disassembly processes, and it will be tested which of these processes can be programmed by PbD. Finally, we will deploy results from other projects to test bin, tray, or table picking, and whether we can use vision/AI methods to adjust for misalignments between, e.g., a screwdriver and a screw.
Project partners: SDU Robotics and SDU Innovation and Design Engineering
Contact: Henrik Gordon Petersen
Period: 01.01.2023 - 31.12.2024
Read more: The project is an activity supplementing the Design for Disassembly project under InnoMission4. You can find more information about the InnoMissions here.
I4.0 ROBOTIC TIMBER ASSEMBLY (RTA)
The collaborative set-up for I4.0 Robotic Timber Assembly combines the use of parametric and computational design algorithms, Mixed Reality (MR) and collaborative robots to create real-time adaptive timber construction processes. The developed activity of automated assembly/disassembly will be applied to a real-scale architectural demonstration project. The goals of this project are:
- Use of collaborative robots in the assembly of wood construction
- Development of digital twin and feedback loops for real-time assembly and disassembly of reversible and reusable timber structures
- Integration of Mixed Reality (MR) and collaborative robots (UR) in an automated and digitally-enhanced human-robot collaborative construction process
In this project, a safe and ergonomic teleoperation system will be developed for challenging wind turbine blade maintenance tasks, to reduce the maintenance cost and improve the safety of the working environment. The safety and ergonomics of the teleoperation system will be improved by using a combination of a kinesthetic and a cutaneous haptic device as the master device for commanding a remote robot. The human operator's status during teleoperation will be monitored, and the robot's behavior will be adapted to optimize the safety and ergonomics of the system. A physical emulator, developed based on haptics and VR technologies in the previous I4.0 ASCERTAIN project, will be used for effective operator training.
Contact: Cheng Fang
Period: 01.0.2023 - 28.02.2027