Key aspects of machine-based environment interpretation include the automatic detection and recognition of objects, obstacle avoidance in navigation, and, in certain applications, object tracking. Integrating visual sensors, such as video cameras, with sensors providing direct 3D spatial measurements, such as Lidar, may offer various benefits (high spatial and temporal resolution, direct distance measurement, invariance to color and illumination). However, fusing the different data modalities often poses sensor-specific challenges.
The mission of CloudiFacturing is to optimize production processes and producibility using Cloud/HPC-based modelling and simulation, leveraging online factory data and advanced data analytics, thus contributing to the competitiveness and resource efficiency of manufacturing SMEs, ultimately fostering the vision of Factories 4.0 and the circular economy. CloudiFacturing will empower over 60 European organizations (many of them being manufacturing SMEs) and will support about 20 cross-national application experiments that will primarily be selected via two Open Calls.
Digital holographic microscopy can be used efficiently for monitoring sparse samples. From a recorded hologram, the whole illuminated volume can be reconstructed by numerically simulating wave propagation, so several objects at different depths within the volume can be recovered from a single exposure. This avoids the small depth-of-field constraint of conventional microscopes: a volume up to 200 times larger can be observed from a single recorded hologram.
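The numerical refocusing described above is commonly implemented with the angular spectrum method of scalar diffraction; the sketch below is a minimal, illustrative version (the function name and parameter choices are our own, not taken from any particular project codebase). It propagates a complex field recorded in the hologram plane to an arbitrary depth z with two FFTs, which is how objects at different depths can be brought into focus from one exposure.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a complex optical field by distance z (meters)
    using the angular spectrum method of scalar diffraction.
    field: 2D complex array sampled on a grid with pitch dx."""
    n, m = field.shape
    fx = np.fft.fftfreq(m, d=dx)  # spatial frequencies along x
    fy = np.fft.fftfreq(n, d=dx)  # spatial frequencies along y
    FX, FY = np.meshgrid(fx, fy)
    # Free-space transfer function; evanescent components
    # (negative argument under the square root) are suppressed.
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    H = np.exp(2j * np.pi * z * np.sqrt(np.maximum(arg, 0.0)))
    return np.fft.ifft2(np.fft.fft2(field) * H)
```

Refocusing to several depths then amounts to calling the function with different z values on the same recorded hologram; propagating forward by z and back by -z recovers the original field, which is a convenient sanity check.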
Plastic bottle extruding and labeling development for building new innovative, environmentally friendly packaging materials and technology: The goal of the project is to develop a new environmentally friendly packaging and labeling material and technology. Nowadays, labels are attached with various adhesives, which necessitates extra chemical washing steps in the recycling phase of these bottles. The new technology, developed in the framework of this project, aims to bond the labels thermally, without any added glue.
The main, overall objective of the project is to establish the Centre of Excellence in Production Informatics and Control (EPIC CoE) as a leading, internationally acknowledged focus point in the field of cyber-physical production systems. The main goals of the EPIC CoE are, on the one hand, to upgrade this scientific centre of excellence, and on the other hand, to strengthen the ability of SZTAKI and the two faculties of BME to transfer research results to industry with the support of the participating FhG institutions, in other words, to enhance the applied research.
The project defines a generic, pluggable framework, called MiCADO (Microservices-based Cloud Application-level Dynamic Orchestrator), that supports the optimal and secure deployment and run-time orchestration of cloud applications. The project will provide a reference implementation of this framework by customising and extending existing, typically open-source solutions. Moreover, it will demonstrate the applicability and impact of the solution via large-scale, close-to-operational SME and public-sector demonstrators.
In this project we address a new and very important issue: the observation of small backcountry wetland areas surrounded by other land-cover types, hosting important species and delivering essential ecosystem services and biodiversity. Although these patches are individually small, together they contribute to the total wetland cover area at a very high rate; their protection and mapping are therefore needed.
Recent Simultaneous Localization and Mapping (SLAM) algorithms are typically developed for environments that are stable in time; dynamic scenes introduce strong bias into the localization models. For this reason, we will improve the conventional SLAM machinery by statistically modelling the changing parts and their neighborhood connections; this leads to a semantic-connectedness investigation of the models, which requires good classification methods for the scalable cluster structure.
State-of-the-art 3D sensors have revolutionized the acquisition of environmental information. The 3D vision systems of self-driving vehicles can be used, apart from safe navigation, for real-time mapping of the environment, detecting and analyzing both static (traffic signs, power lines, vegetation, street furniture) and dynamic (traffic flow, crowd gathering, unusual events) scene elements.