
Concluded projects

If you're employed at SDU and want to apply for an R&D project, please contact I40LAB@mmmi.sdu.dk or find more information here (access for SDU employees only).

Concluded R&D projects

I4.0 Rockwool - Advanced reconfigurable tool for quality inspection of flexible material 

The purpose of this activity is to develop a highly adaptable robot tool, based on in-house designed mechanical arms, for handling flexible Rockwool products as part of an automatic quality control system.

Project partners: The companies involved are Rockwool, Danish Technological Institute and C&H System A/S. The development will be done in collaboration between SDU Robotics and SDU Mechanical Engineering.

Contact: Aljaz Kramberger

Period: 01.08.2022 - 31.10.2023

Read more: I4.0 Rockwool is an activity supplementing the MADE Fast project 3.18 Rockwool. You can find more information about MADE FAST here.

I4.0 ASCERTAIN aims to develop a physical emulator of human-robot collaboration tasks, with which the safety and ergonomics of such tasks can be evaluated in a risk-controlled way. Safety and ergonomics are crucial for the implementation and deployment of human-robot collaboration tasks, e.g. co-manipulating a car engine.

Project partners: I4.0 ASCERTAIN cooperates with Odense Robotics, Aarhus University, Rope Robotics, Xplor XR and Sensae.

Contact: Cheng Fang

Period: Running until 31.12.2022

Read more: I4.0 ASCERTAIN is a subproject. You can find more information about the main project here.

Debugging robotic programs is a well-known intricate problem. This is particularly true given the conventional method for identifying issues (e.g., executing behaviors, examining logs, and analyzing source code). We propose a tool that offers several visualizations based on the runtime data of robotic applications, where developers can see the runtime state across the execution time of the program, including which exact line of code influences the changes in the runtime state of the robot.

The project proposed the development of a debugging tool for robotic arm applications. For this, we need to collect data from a realistic application to design the visualizations presented in the tool. To get this data, we propose:

  1. Create a realistic pick-and-place application using an external sensor to detect the objects the robot should pick up.
  2. Record and save the runtime state of the robot while performing several versions of the pick-and-place application with multiple runs.  
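The recording step above could be sketched as follows. This is only an illustrative sketch, not the project's actual tool: `read_joint_state` and the stubbed program stand in for whatever interface the real robot exposes, and Python's `sys.settrace` is used here to associate each recorded state with the source line about to execute.

```python
import sys
import time

class RuntimeRecorder:
    """Record timestamped robot state together with the source line
    about to execute (hypothetical sketch; `read_joint_state` stands
    in for the real robot interface)."""

    def __init__(self, read_joint_state):
        self.read_joint_state = read_joint_state
        self.trace = []

    def _tracer(self, frame, event, arg):
        if event == "line":
            # State is captured just BEFORE the line runs, so a change
            # shows up when comparing consecutive trace entries.
            self.trace.append({
                "t": time.time(),
                "file": frame.f_code.co_filename,
                "line": frame.f_lineno,
                "state": self.read_joint_state(),
            })
        return self._tracer

    def record(self, program):
        sys.settrace(self._tracer)
        try:
            program()
        finally:
            sys.settrace(None)
        return self.trace

# Usage with a stubbed 6-joint robot:
state = {"q": [0.0] * 6}

def fake_program():
    state["q"][0] = 0.5   # move joint 1
    state["q"][1] = -0.2  # move joint 2

rec = RuntimeRecorder(lambda: list(state["q"]))
trace = rec.record(fake_program)
print(len(trace), trace[-1]["state"])
```

Diffing the `state` field between consecutive entries then points at the exact line that caused each change, which is the kind of mapping the proposed visualizations rely on.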

Contact: Miguel Campusano
Period: 01.06.2023 - 20.08.2023

In this PhD project - I4.0 Danfoss Demonstrator - a virtual description of the robot systems is created in the form of a Digital Twin in a software tool (VEROSIM) that allows running simulations of assembly tasks and executing the same logic on physical components. Robot tasks are represented as Services and visualized as ServiceBlocks in a Visual Programming interface. Programs are automatically generated from a digital description of the assembly task and can be easily modified thanks to the simplified programming interface, reducing the complexity of handling product variants. Among the partner companies, KUKA is collaborating to define the software solutions to control industrial robots, while LEGO and Danfoss are providing industrial use cases to apply the technology in real manufacturing environments (the LEGO Demonstrator and the proposed Danfoss Demonstrator).
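The idea of generating a Service-based program from a digital assembly description can be illustrated with a small sketch. The real VEROSIM/ServiceBlock interfaces are not public, so all names and the step vocabulary here are invented for illustration:

```python
from dataclasses import dataclass

# Hypothetical sketch: `Service`, the step vocabulary, and the field
# names are illustrative, not the project's actual data model.

@dataclass
class Service:
    name: str
    params: dict

def generate_program(assembly_description):
    """Map each step of a digital assembly description to a Service."""
    step_to_service = {
        "pick": lambda s: Service("Pick", {"part": s["part"]}),
        "place": lambda s: Service("Place", {"pose": s["pose"]}),
        "screw": lambda s: Service("Screw", {"torque": s["torque"]}),
    }
    return [step_to_service[s["op"]](s) for s in assembly_description]

program = generate_program([
    {"op": "pick", "part": "housing"},
    {"op": "place", "pose": "fixture_1"},
])
print([svc.name for svc in program])
```

Handling a product variant then amounts to editing the declarative description rather than the robot program itself, which is the complexity reduction the paragraph describes.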

Project partners: KUKA, LEGO and Danfoss Drives

Contact: Alberto Sartori

Period: 01.09.2022 - 30.09.2023

Read more: I4.0 Danfoss Demonstrator is an activity supplementing the MADE Fast project 3.02 - Agile programming - in collaboration with KUKA, LEGO and Danfoss Drives. You can find more information about MADE FAST here, and an article about the MADE LEGO demonstrator here (only in Danish).

The main goal of an earlier project was to investigate deep learning methods for semantic understanding (e.g., action segmentation and classification) of demonstrations of industrial assembly tasks using limited sensory input. This I4.0 Deep Learning from Demonstration project will further develop this methodology and research state-of-the-art methods, such as implicit behavioral cloning, for robust uncalibrated policy learning from demonstration.

Contact: Iñigo Iturrate and Thiusius R. Savarimuthu

Period: The project runs until 31.12.2023

The purpose of I4.0 FCobot is to escalate the effort put into the various methods for grasp planning, object localization and world modeling currently being researched and developed at SDU Robotics and SDU Software Engineering. These methods have already become a research and development topic under the new FacilityCobot project, but with increased funds for equipment, it can be ensured that the relevant technology is continuously developed and matured fully at SDU.

Project partners: The sister project FacilityCobot is performed in cooperation with the following industrial partners: Enabled Robotics, ISS A/S and UbiquiSense ApS.

Contact: Aljaz Kramberger

Period: Running until 01.07.2024

I4.0 IExpRoBI - Industry 4.0 lab facilities for Experimenting with Spatial and Electricity Consumption Data

The purpose of this activity is to escalate efforts in integrating the robots in the SDU I4.0Lab with the physical space they are deployed in and the humans around them. The activity addresses the following two questions: 1) how can we in the SDU I4.0Lab study how humans and robots use physical space? and 2) how can we study and minimize the energy consumed by cobots in the SDU I4.0Lab to enable an overall energy-efficient operation?

The project will, for instance, enable the following capabilities in the future:
  • Imagine a production line with flexible production involving human steps. The UWB infrastructure enables the collection of spatial data about assets with UWB tags, e.g., products as they transition through different stages in the production. The Xovis infrastructure enables the collection of the spatial positions of humans without a tag, e.g., to study the impact of safety zones on human behavior. Both types of data can be live streamed to power dashboards or stored for post-hoc analysis.
  • Imagine a robotic task to be carried out by a battery-powered mobile manipulator. The models and database of electricity-consumption data for collaborative robots enable energy optimization of tasks, e.g., to extend the operation time of the mobile manipulator. The AR app developed in the activity can show the energy consumption and other information overlaid on the mobile manipulator.
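As a toy illustration of the energy-optimization idea in the second capability: given a power model fitted from such a consumption database, one could sweep a task parameter (here, speed) and pick the most energy-efficient setting. The model form and all constants below are invented for illustration:

```python
# Illustrative sketch, assuming a simple fitted power model
# P(v) = p_idle + k * v**2; the constants are made up, not
# values from the project's database.

def task_energy(distance_m, speed_mps, p_idle_w=80.0, k=120.0):
    """Energy (J) for moving `distance_m` at constant `speed_mps`:
    duration * (idle power + motion power)."""
    duration_s = distance_m / speed_mps
    return duration_s * (p_idle_w + k * speed_mps**2)

# Sweep candidate speeds and pick the most energy-efficient one.
candidates = [0.2 + 0.1 * i for i in range(15)]  # 0.2 .. 1.6 m/s
best = min(candidates, key=lambda v: task_energy(1.0, v))
print(best, round(task_energy(1.0, best), 1))
```

Under this model, driving slower is not automatically cheaper: idle power accumulates over the longer duration, so an intermediate speed minimizes total energy.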

Project partners: Enabled Robotics and Universal Robots

Contact: Mikkel Baun Kjærgaard

Period: Running until 31.08.2023

Read more here (only in Danish).

This activity comprises theoretical and experimental work with the MVP and software infrastructure in the SDU I4.0Lab. The aim is to bring the Information Backbone to a point where it is well documented and can be used in other research projects.

Contact: Torben Worm
Period: Running until 31.12.2022

I40-PoseEstimation aims to facilitate easier setup of robotic solutions for new object-centered perception and manipulation tasks. Specifically, we have developed a novel pose estimation algorithm, which learns important relations between an image of an object and the potential for using that object for, e.g., grasping and handling with a robot.

The developed method has been tested for object localization (pose estimation) using only RGB images as inputs and no potentially expensive depth sensor, where it has shown state-of-the-art results. The method is based on neural networks and works by identifying the visible parts of the surface of an object in an image. These parts are described by a so-called embedding using a neural network. This embedding has potential for much more than pure localization, since it gives a flexible and rich representation of an object surface.

As an additional strength of the method, the learned object description intrinsically handles both discrete and continuous symmetries, which are often seen in industrial objects, and sometimes in household objects.

Furthermore, in contrast with many of our other developed methods for perception, this method does not require depth or range sensors to work, but still maintains an impressive precision during localization tasks. This allows for extremely cheap and simple deployment in new scenarios.
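The core matching idea behind such embedding-based localization can be sketched with toy data. This is not the project's algorithm: the network is omitted entirely, and random vectors stand in for learned embeddings. Matching image-pixel embeddings to precomputed model-point embeddings yields 2D-3D correspondences that a standard PnP solver could turn into a pose:

```python
import numpy as np

# Illustrative sketch of the correspondence step only: a network (not
# shown) would map each visible object pixel to an embedding; matching
# those to precomputed embeddings of the object model recovers which
# surface point each pixel sees. All numbers are toy data.

rng = np.random.default_rng(0)

model_points = rng.normal(size=(100, 3))   # 3D points on the object surface
model_emb = rng.normal(size=(100, 16))     # their "learned" embeddings
model_emb /= np.linalg.norm(model_emb, axis=1, keepdims=True)

# Pretend the network produced embeddings for 5 image pixels that truly
# correspond to model points 3, 10, 42, 7, 99 (plus a little noise).
true_idx = np.array([3, 10, 42, 7, 99])
pixel_emb = model_emb[true_idx] + 0.01 * rng.normal(size=(5, 16))

# Nearest neighbour in embedding space (cosine similarity) recovers the match.
sim = pixel_emb @ model_emb.T              # (5, 100) similarity matrix
matched_idx = sim.argmax(axis=1)
print(matched_idx)
```

The recovered indices select rows of `model_points`, giving the 2D-3D pairs needed for pose computation; no depth sensor enters the pipeline at any point.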

Contact: Anders Glent Buch

Period: Running until 31.08.2023

SDU I4.0Lab supports the development of no-code robot programming for low-volume manufacturing. We aim to generate highly reliable robot programs for industrial assembly tasks from kinesthetic teaching. The project focuses on modelling, robot control, and parameter estimation.
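A minimal sketch of the parameter-estimation idea, with invented numbers and deliberately simplified to a point-position estimate: given several kinesthetic demonstrations of the same placement, estimate the underlying goal as the least-squares (mean) position and use the demonstrations' spread to set a tolerance for the generated program.

```python
# Toy sketch: three demonstrated end-effector positions (x, y, z in
# metres) of the same placement; values are invented for illustration.
demos = [
    [0.412, 0.198, 0.051],
    [0.408, 0.201, 0.049],
    [0.415, 0.199, 0.050],
]

n = len(demos)
# Least-squares estimate of the goal position = per-axis mean.
goal = [sum(d[i] for d in demos) / n for i in range(3)]
# Worst-case deviation per axis, usable as a program tolerance.
spread = [max(abs(d[i] - goal[i]) for d in demos) for i in range(3)]
print(goal, spread)
```

Real systems estimate far richer parameters (full poses, contact forces, controller gains), but the principle is the same: repeated demonstrations over-determine the task parameters, and estimation turns them into a single reliable program.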

Contact: Christoffer Sloth

Period: Running until 01.04.2024

See YouTube video: Programming by Demonstration


SDU I4.0Lab University of Southern Denmark

  • Campusvej 55
  • Odense M - DK-5230

Last Updated 17.11.2023