Concluded projects

If you're employed at SDU and want to apply for an R&D project, please contact I40LAB@mmmi.sdu.dk or read more here (access for SDU employees only).

Concluded R&D projects

I4.0 Rockwool - Advanced reconfigurable tool for quality inspection of flexible material 

The purpose of this activity is to develop a highly adaptable robot tool, based on in-house designed mechanical arms, for handling flexible Rockwool products, to be incorporated into an automatic quality control system.

Project partners: The companies involved are Rockwool, Danish Technological Institute and C&H System A/S. The development will be done in collaboration between SDU Robotics and SDU Mechanical Engineering.

Contact: Aljaz Kramberger

Period: 01.08.2022 - 31.10.2023

Read more: I4.0 Rockwool is an activity supplementing the MADE Fast project 3.18 Rockwool. You can find more information about MADE FAST here.

I4.0 ASCERTAIN

I4.0 ASCERTAIN aims to develop a physical emulator of physical human-robot collaboration tasks, so that their safety and ergonomics can be evaluated in a risk-controlled way. Safety and ergonomics are crucial for the implementation and deployment of such tasks, e.g. co-manipulating a car engine.

Project partners: I4.0 ASCERTAIN cooperates with Odense Robotics, Aarhus University, Rope Robotics, Xplor XR and Sensae.

Contact: Cheng Fang

Period: Running until 31.12.2022

Read more: I4.0 ASCERTAIN is a subproject. You can find more information about the main project here.

In this project, we aim to develop an end-to-end framework for setting up force-sensitive tasks on soft and deformable objects. This project will focus on two research tasks:

1) a series elastic actuator (SEA) for constraining and controlling the output force, and

2) a learning-from-demonstration framework for adapting the robot’s impedance profile to the waypoints along the trajectory.
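
To make the second task concrete, here is a minimal sketch of scheduling a robot's Cartesian stiffness along demonstrated waypoints; the waypoint positions, stiffness values and inverse-distance interpolation are illustrative assumptions, not the project's actual framework.

```python
import numpy as np

# Hypothetical sketch: schedule Cartesian stiffness along demonstrated waypoints.
# Positions and stiffness values below are illustrative placeholders.
waypoints = np.array([[0.4, 0.0, 0.30],
                      [0.4, 0.1, 0.25],
                      [0.4, 0.1, 0.20]])     # demonstrated positions [m]
stiffness = np.array([800.0, 400.0, 150.0])  # [N/m], softer near contact

def scheduled_stiffness(x):
    """Blend the stiffness of the two waypoints nearest to position x."""
    d = np.linalg.norm(waypoints - x, axis=1)
    i, j = np.argsort(d)[:2]
    w = d[j] / (d[i] + d[j] + 1e-9)          # weight favors the nearer waypoint
    return w * stiffness[i] + (1.0 - w) * stiffness[j]

print(scheduled_stiffness(np.array([0.4, 0.05, 0.27])))  # ~600 N/m here
```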

Contact: Iñigo Iturrate and Zhouqi Cheng

Period: 01.03.2023 - 30.09.2024

Debugging robotic programs is a well-known intricate problem. This is particularly true given the conventional method for identifying issues (e.g., executing behaviors, examining logs, and analyzing source code). We propose a tool that offers several visualizations based on the runtime data of robotic applications, where developers can see the runtime state across the execution time of the program, including which exact line of code influences the changes in the runtime state of the robot.

The project proposed the development of a debugging tool for robotic arm applications. For this, we need to collect data from a realistic application to design the visualizations presented in the tool. To get this data, we propose:

  1. Create a realistic pick-and-place application using an external sensor to detect the objects the robot should pick up.
  2. Record and save the runtime state of the robot while performing several versions of the pick-and-place application with multiple runs.  
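
As a rough sketch of step 2, the snippet below records a per-line snapshot of a toy runtime state using Python's built-in tracing hook (sys.settrace); the state fields and the mock application are placeholders, not the actual tool.

```python
import json
import sys
import time

# Hypothetical sketch: log which executed line changes a robot's runtime state.
# The `state` dict stands in for real runtime data (poses, I/O, gripper, ...).
state = {"gripper": "open", "target": None}
trace_log = []

def tracer(frame, event, arg):
    if event == "line":                      # called before each line runs
        trace_log.append({"t": time.time(),
                          "line": frame.f_lineno,
                          "state": dict(state)})
    return tracer

def pick_and_place():
    state["target"] = [0.4, 0.1, 0.2]        # object found by external sensor
    state["gripper"] = "closed"              # grasp
    state["gripper"] = "open"                # release

sys.settrace(tracer)
pick_and_place()
sys.settrace(None)
print(json.dumps(trace_log, indent=2))       # raw data behind the visualizations
```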

Contact: Miguel Campusano
Period: 01.06.2023 - 20.08.2023

I4.0 Danfoss Demonstrator

In this PhD project, I4.0 Danfoss Demonstrator, a virtual description of the robot systems is created in the form of a Digital Twin in a software tool (VEROSIM) that allows running simulations of assembly tasks and executing the same logic using physical components. Robot tasks are represented as Services and visualized as ServiceBlocks in a visual programming interface. Programs are automatically generated from a digital description of the assembly task and can be easily modified thanks to the simplified programming interface, reducing the complexity of handling product variants. Among the partner companies, KUKA is collaborating to define the software solutions to control industrial robots, and LEGO and Danfoss are providing industrial use cases to apply the technology in real manufacturing environments (the LEGO Demonstrator and the proposed Danfoss Demonstrator).
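
As a loose illustration of the ServiceBlock idea, the sketch below generates an executable program from a digital task description by mapping each step to a parameterized service; the names and structure are assumptions for illustration, not VEROSIM's actual interface.

```python
from dataclasses import dataclass

# Hypothetical ServiceBlock abstraction (VEROSIM's real interface differs).
@dataclass
class ServiceBlock:
    name: str
    params: dict

    def run(self):
        print(f"executing {self.name} with {self.params}")

# A digital description of an assembly task, e.g. exported per product variant.
task_description = [
    ("pick", {"part": "housing-A"}),
    ("place", {"pose": "fixture-1"}),
    ("screw", {"torque_nm": 1.2}),
]

# Program generation: map each task step to a ServiceBlock and execute in order.
program = [ServiceBlock(name, params) for name, params in task_description]
for block in program:
    block.run()
```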

Project partners: KUKA, LEGO and Danfoss Drives

Contact: Alberto Sartori

Period: 01.09.2022 - 30.09.2023

Read more: I4.0 Danfoss Demonstrator is an activity supplementing the MADE Fast project 3.02 - Agile programming - in collaboration with KUKA, LEGO and Danfoss Drives. You can find more information about MADE FAST here, and an article about the MADE LEGO demonstrator here (only in Danish).

I4.0 Deep Learning from Demonstration

The main goal of an earlier project was to investigate deep learning methods for semantic understanding (e.g., action segmentation and classification) of demonstrations of industrial assembly tasks using limited sensory input. This I4.0 Deep Learning from Demonstration project will further develop this methodology, as well as research state-of-the-art methods, such as implicit behavioral cloning, for robust uncalibrated policy learning from demonstration.

Contact: Iñigo Iturrate and Thiusius R. Savarimuthu

Period: The project runs until 31.12.2023

I4.0 DigiTech

Digital technologies are an important driver for Industry 4.0 solutions. New generations of hardware and software technology enable the collection and processing of data at a much larger scale. New AI technology gives robotic solutions new options for understanding their surroundings and tasks from data in much more detail. New computing platforms enable software and AI technologies to be executed distributed across edge and cloud nodes; however, they also create new challenges, e.g. cyber-security.

We aim to develop data, computing and storage structures that enable the re-use and exploitation of assembly data (before and during assembly) as well as mobile robot data. For that, the use cases need to be analyzed in detail in terms of the relevant data and concrete examples of data re-use (e.g. for predictive maintenance, fleet management, gripper design or improvement of the system through machine learning), as well as the required software and hardware structure. For software and hardware structures, there is an explicit focus on the opportunities for distributed edge computing.

Project partners: MiR, Universal Robots, NordBo, Rockwool, Novo Nordisk, Welltec, MADE and Odense Robotics. 

Contact: Mikkel Baun Kjærgaard

Period: The project ends 31.03.2025

Read more: I4.0 DigiTech is a subproject of DIREC. You can find more information about DIREC here.

The project will make plant simulation available to the SDU I4.0Lab and to researchers at the Faculty of Engineering. The models developed will center on the Factory of the Future, including reconfigurable manufacturing, matrix production, swarm production, etc. Cases in planning are the digital twin solutions for the LEGO demonstrator, where we will integrate VEROSIM with plant simulation to model and perform experiments at plant level. Another case may be to demonstrate and test digital twin solutions for the Danfoss demonstrator, where matrix production will be modeled at the Viking line.

Contact: Arne Bilberg

Period: The project is running until 31.12.2024 with milestone 1 evaluated at 31.12.2023

I4.0 FCobot

The purpose of I4.0 FCobot is to escalate the effort put into the various methods for grasp planning, object localization and world modeling currently being researched and developed at SDU Robotics and SDU Software Engineering. These methods have already become a research and development topic under the new FacilityCobot project, but with increased funds for equipment, it can be ensured that the relevant technology is continuously developed and matured with SDU at the center.

Project partners: The sister project FacilityCobot is performed in cooperation with the following industrial partners: Enabled Robotics, ISS A/S and UbiquiSense ApS.

Contact: Aljaz Kramberger

Period: Running until 01.07.2024

I4.0 IExpRoBI - Industry 4.0 lab facilities for Experimenting with Spatial and Electricity Consumption Data

The purpose of this activity is to escalate efforts to integrate the robots in the SDU I4.0Lab with the physical space they are deployed in and with the humans around them. The activity addresses the following two questions: 1) how can we, in the SDU I4.0Lab, study how humans and robots use physical space? and 2) how can we study and minimize the energy consumed by cobots in the SDU I4.0Lab to enable an overall energy-efficient operation?

The project will, for instance, enable the following capabilities in the future:
  • Imagine a production line with flexible production involving human steps. The UWB infrastructure enables the collection of spatial data about assets with UWB tags, e.g. products as they transition through different stages of production. The Xovis infrastructure enables the collection of the spatial positions of humans without a tag, e.g. to study the impact of safety zones on human behavior. Both types of data can be live-streamed to power dashboards or stored for post-hoc analysis (see the sketch after this list).
  • Imagine a robotic task to be carried out by a battery-powered mobile manipulator. The models and database of electricity-consumption data for collaborative robots enable the energy optimization of tasks, e.g. to extend the operating time of the mobile manipulator. The AR app developed can be retrofitted to show energy consumption and other information on the mobile manipulator in AR.
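
A minimal sketch of the dual live/post-hoc use of UWB position data, assuming a hypothetical read_uwb_fix() stand-in for the real UWB infrastructure's API:

```python
import json
import random
import time
from collections import defaultdict

# Hypothetical stand-in for the UWB infrastructure: one position fix per call.
def read_uwb_fix():
    return {"tag": random.choice(["product-7", "product-9"]),
            "t": time.time(),
            "xy": [round(random.uniform(0, 20), 2),
                   round(random.uniform(0, 10), 2)]}  # meters on the lab floor

history = defaultdict(list)         # per-tag trajectories for post-hoc analysis

for _ in range(5):
    fix = read_uwb_fix()
    history[fix["tag"]].append((fix["t"], fix["xy"]))
    print(json.dumps(fix))          # live stream, e.g. to a dashboard topic

# Post-hoc analysis: how often was each tagged product observed?
for tag, track in history.items():
    print(tag, "observed", len(track), "times")
```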

Project partners: Enabled Robotics and Universal Robots

Contact: Mikkel Baun Kjærgaard

Period: Running until 31.08.2023

Read more here (only in Danish).

This activity contains theoretical and empirical (experimental) work with the MVP and the software infrastructure in the SDU I4.0Lab. The aim is to bring the Information Backbone to a point where it is well documented and can be used in other research projects.

Contact: Torben Worm
Period: Running until 31.12.2022

I40-PoseEstimation

I40-PoseEstimation aims to facilitate easier setup of robotic solutions for new object-centered perception and manipulation tasks. Specifically, we have developed a novel pose estimation algorithm, which learns important relations between an image of an object and the potential for using that object for, e.g., grasping and handling with a robot.

The developed method has been tested for object localization (pose estimation) using only RGB images as input and no potentially expensive depth sensor, where it has shown state-of-the-art results. The method is based on neural networks and works by identifying the visible parts of an object's surface in an image. These parts are described by a so-called embedding using a neural network. This embedding has potential for much more than pure localization, since it gives a flexible and rich representation of an object's surface.

As an additional strength of the method, the learned object description intrinsically handles both discrete and continuous symmetries, which are often seen in industrial objects, and sometimes in household objects.

Furthermore, in contrast with many of our other developed methods for perception, this method does not require depth or range sensors to work, but still maintains an impressive precision during localization tasks. This allows for extremely cheap and simple deployment in new scenarios.
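
To make this concrete, below is a heavily simplified sketch of the general RGB-only recipe such methods follow: predict 2D-3D correspondences between image pixels and object-surface coordinates, then solve for the pose with PnP. The correspondence network is faked with fixed values here; this illustrates the generic pattern, not the specific SDU algorithm.

```python
import cv2                  # OpenCV: solvePnP recovers pose from 2D-3D pairs
import numpy as np

# Hypothetical stand-in for a trained network that maps visible object pixels
# to 3D points on the object surface (the decoded "embedding"). Values are
# fixed here so the example runs without a model.
def predict_surface_correspondences(rgb):
    pts2d = np.array([[320, 240], [380, 240], [320, 300], [380, 300]],
                     np.float64)                 # pixel coordinates
    pts3d = np.array([[0, 0, 0], [0.05, 0, 0], [0, 0.05, 0], [0.05, 0.05, 0]],
                     np.float64)                 # object-frame coordinates [m]
    return pts2d, pts3d

K = np.array([[600, 0, 320], [0, 600, 240], [0, 0, 1]], np.float64)  # intrinsics

rgb = np.zeros((480, 640, 3), np.uint8)          # placeholder camera image
pts2d, pts3d = predict_surface_correspondences(rgb)
ok, rvec, tvec = cv2.solvePnP(pts3d, pts2d, K, None)
print("rotation (Rodrigues):", rvec.ravel(), "translation [m]:", tvec.ravel())
```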

Contact: Anders Glent Buch

Period: Running until 31.08.2023

SDU I4.0Lab supports the development of no-code robot programming for low-volume manufacturing. We aim to generate highly reliable robot programs for industrial assembly tasks from kinesthetic teaching. The project focuses on modelling, robot control, and parameter estimation.
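
As a rough illustration of kinesthetic teaching, the sketch below records end-effector poses while a person guides the arm in free-drive and then replays them; the Robot class and its methods are placeholders, not a specific vendor API or the project's controller.

```python
import time

# Placeholder robot handle; a real API would talk to the robot controller.
class Robot:
    def __init__(self):
        self._pose = [0.4, 0.0, 0.3, 0.0, 3.14, 0.0]  # x, y, z, rx, ry, rz

    def set_freedrive(self, enabled):
        pass                        # gravity compensation on/off

    def get_tcp_pose(self):
        return list(self._pose)

    def move_to(self, pose):
        self._pose = list(pose)

robot, demo = Robot(), []
robot.set_freedrive(True)           # the person now guides the arm by hand
for _ in range(10):                 # sample TCP poses at ~10 Hz
    demo.append(robot.get_tcp_pose())
    time.sleep(0.1)
robot.set_freedrive(False)

for pose in demo:                   # replay the demonstrated trajectory
    robot.move_to(pose)
print(f"recorded and replayed {len(demo)} waypoints")
```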

Contact: Christoffer Sloth

Period: Running until 01.04.2024

See YouTube video: Programming by Demonstration

I4.0 Robotic Timber Assembly (RTA)

The collaborative set-up for I4.0 Robotic Timber Assembly combines parametric and computational design algorithms, Mixed Reality (MR) and collaborative robots to create real-time adaptive timber construction processes. The developed activity of automated assembly/disassembly will be applied to a real-scale architectural demonstration project. The goals of this project:

  • Use of collaborative robots in the assembly of wood construction
  • Development of digital twin and feedback loops for real-time assembly and disassembly of reversible and reusable timber structures
  • Integration of Mixed Reality (MR) and collaborative robots (UR) in an automated and digitally enhanced human-robot collaborative construction process


Brief description of advancements: 

The RTA project in the I4.0 Lab employs human-robot collaborative assembly of reconfigurable and reusable timber elements into structural building components, namely the ReconWood construction system. Humans, equipped with Mixed Reality (MR) devices, and robots, equipped with a multi-phase end-effector (a gripper for pick-and-place, a wrist camera for hole detection and a screwdriver for screwing operations), interact in real time in the shared cyber-physical space to assemble the ReconWood blocks.


Design Digital Twin / DDT

The design, engineering, segmentation of the global structure for the assembly, and assembly path-planning of the ReconWood structures are developed in a parametric design environment. The geometric and scalar data extracted from the design workflow, such as the geometry of the piece, the number and locations of the holes, the positioning of the bolts and nuts, the assembly sequence and the position of a specific ReconWood block in the overall structure are sent from the design environment to an online database, namely Design Digital Twin, from where the data can be accessed and further used.

Link to ReconWood Slab Design Digital Twin
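
A minimal sketch of the kind of per-block record the Design Digital Twin could hold, assuming a simple JSON document per block (the field names are illustrative, not the actual schema):

```python
import json

# Illustrative DDT record for one ReconWood block (schema is assumed).
ddt_record = {
    "p_name": "P-017",                              # design part name
    "geometry": {"length_mm": 900, "type": "beam"},
    "holes": [{"id": 1, "pos_mm": 120}, {"id": 2, "pos_mm": 780}],
    "fasteners": [{"hole": 1, "kind": "bolt+nut"}],
    "assembly_step": 17,                            # place in the sequence
}
print(json.dumps(ddt_record, indent=2))             # sent to the online database
```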


Material Digital Twin / MDT

ReconWood blocks are discrete and modular in their lengths and types. Each ReconWood block is assigned a unique identification tag and a corresponding QR code imprint. The identification block tag (B name) contains the data regarding the block's length and type. Once a specific block is used in a design process, it is also given its design part name (P name). To prepare the ReconWood blocks for assembly, a set of steel insert nuts is embedded in specific holes along the piece, which are retrieved for each block from the DDT. Here, Mixed Reality (MR) is used to visualize the locations of the insert nuts for each piece and to automatically match the specific block's B name and P name. This data is stored in an online material database, namely the Material Digital Twin. From the perspective of material re-use and reconfigurability, each material piece will have a stored design/use history.


Link to ReconWood Material Digital Twin
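
A minimal sketch of the B name/P name matching described above, with an in-memory dict standing in for the online material database (all names and fields are assumptions):

```python
# In-memory stand-in for the Material Digital Twin database (fields assumed).
material_db = {
    "B-0042": {"length_mm": 900, "type": "beam", "p_name": None, "history": []},
}

def register_use(b_name, p_name, project):
    """Link a scanned physical block (B name) to its design part (P name)."""
    block = material_db[b_name]
    block["p_name"] = p_name          # match block to the design part
    block["history"].append(project)  # design/use history for re-use tracking

register_use("B-0042", "P-017", "ReconWood Slab")   # e.g. after a QR scan
print(material_db["B-0042"])
```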


Assembly Digital Twin / ADT

Thanks to the QR code imprints and the link to their digital datasets, ReconWood blocks can be tracked and traced during the assembly operations. The assembly is carried out both in a human-robot collaborative setup and as a manual assembly with the use of Mixed Reality. The assembly data for each block is sent in real time to the Assembly Digital Twin, where specific aspects such as the time breakdown of the different operations (pick block, place block, pick screw, insert screw), hole-detection deviations, and workload distribution between different assembly actors (e.g. two robots) are tracked and plotted. This allows for comparing the efficiency of the robotic and the manual MR assembly, as well as keeping a record of all the assembly operations related to individual ReconWood blocks.

Link to ReconWood Assembly Digital Twin
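
A minimal sketch of the per-block event log behind such tracking, with assumed operation names and durations:

```python
import time
from collections import defaultdict

# Illustrative ADT event log: per-block operation timings by assembly actor.
events = []

def log(block, actor, operation, duration_s):
    events.append({"t": time.time(), "block": block, "actor": actor,
                   "op": operation, "dur": duration_s})

log("B-0042", "robot-1", "pick block", 4.2)
log("B-0042", "robot-1", "place block", 6.8)
log("B-0042", "robot-2", "insert screw", 9.5)

workload = defaultdict(float)         # time breakdown per assembly actor
for e in events:
    workload[e["actor"]] += e["dur"]
print(dict(workload))                 # e.g. {'robot-1': 11.0, 'robot-2': 9.5}
```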

Period: Running until 01.04.2024
