Mobile robotics applications

Mobile robotics applications, including aerial, ground, and underwater variants, are among the fastest-growing on the market. This trend is likely to continue in the near future, as companies such as Google, Bosch, and Toyota are heavily invested in research toward the autonomous car.

The main components of a mobile robot are perception, localization and mapping, planning, and low-level control. Perception is essential for receiving and interpreting information from the environment; different sensors are used for this, such as an IMU, a stereo camera, a 3D laser scanner, or the popular Kinect camera. Localization and mapping refer to the robot's ability to answer the question "where am I?" with respect to some map information. Map creation is strongly coupled with localization, and in robotics the two are usually solved together as simultaneous localization and mapping (SLAM). Planning answers the questions "where do I go?" and "how do I get there?", and incorporates advanced reasoning and AI methods. Finally, low-level control translates the higher-level planning commands into actuator commands, taking into account the kinematic and dynamic model of the device.

We focus on the application side of these techniques using the different platforms available in our laboratory, mainly developing code in C++ within the ROS framework. Specialized research topics arising from these applications are treated in the other research directions of our group.

Our open and ongoing projects in this area are listed below, together with a selection of completed projects where relevant.

Young Teams grant: Handling non-smooth effects in control of real robotic systems

Robotics has a growing impact on our everyday life. Traditional applications are complemented by the integration of robots into the human environment. With the availability of low-cost sensors, aerial robotics has also become an active area of research. However, many of the practical challenges associated with the real-time control of robotic systems remain unresolved.

AUF-RO grant: AI methods for the networked control of assistive UAVs (NETASSIST)

This project develops methods for the networked control and sensing of a team of unmanned, assistive aerial vehicles that follows a group of vulnerable persons. On the control side, we consider multiagent and consensus techniques, while on the vision side the focus is on egomotion estimation of the UAVs and cooperative tracking of persons with filtering techniques. NETASSIST is an international cooperation project involving the Technical University of Cluj-Napoca in Romania, the University of Szeged in Hungary, and the University of Lorraine in Nancy, France.

Robot, bring me... my glasses: Mixed air and ground assistive teams

Robots that assist elderly or disabled persons, or even anyone in their day-to-day tasks, can lead to a huge improvement in quality of life. At ROCON we are pursuing domestic mobile manipulators, as well as UAVs for monitoring the persons. Our next goal is to integrate these two platforms into an overall framework that will both monitor the persons and assist them on the ground.

Young Teams grant: Reinforcement learning and planning for large-scale systems

Many controlled systems, such as robots in open environments, traffic networks, and energy networks, are large-scale: they have many continuous variables. Such systems may also be nonlinear, stochastic, and impossible to model accurately. Optimistic planning (OP) is a recent paradigm for general nonlinear and stochastic control that works when a model is available; reinforcement learning (RL) additionally works model-free, by learning from data. However, existing OP and RL methods cannot handle the number of continuous variables required in large-scale systems.

Nonlinear control for commercial drones in autonomous railway maintenance

Drones are becoming widespread, and low-cost platforms already offer a good flight and video-recording experience. This project intends to use such drones in the context of railway maintenance by developing applications for autonomous navigation in railway environments.

Autonomous guidance of a quadcopter based on vanishing point detection

Unmanned aerial vehicles are increasingly being used and are showing their advantages in many domains. However, their application to railway systems has received very little study. In this project, we focus on controlling an AR.Drone UAV so that it follows the railway track.

Vision based quadrotor navigation around objects

This project focuses on the railway inspection scenario, in which the drone must find and inspect certain target objects. Working with an AR.Drone quadrotor, we intend to implement basic GPS waypoint navigation to approach the target object, target object detection, and a basic control architecture for navigating around the detected target.

Localization and Mapping with NI Mobile Robot

This project has as its main goal the mapping and localization of a mobile robot from National Instruments in a flat 2D environment. Standard techniques are envisaged to create a 2D map of the environment and then use it in later steps for localization.

The ideal candidate for this project is an open-minded, hardworking personality who is ready to face a challenge. Additional knowledge of C++ and LabVIEW VIs is a plus. The project is planned to be carried out partly at the NI office in Cluj-Napoca.

Shape Based Active Perception

Perception for artificial systems is gaining increasing importance, especially for systems with multiple sensors. For example, autonomous vehicles equipped with cameras, lidar, inertial measurement units, positioning systems, etc. are becoming popular. More importantly, they use the information from these multiple sources at the same time, in the same coordinate system, in order to gain information about the environment. Thus, fusing information from several sources into a common space for reasoning purposes is crucial.

Active Perception for Flexible Object Handling in Smart Manufacturing

Intelligent object handling is becoming a must in smart manufacturing systems, especially with the recent appearance of motion-compliant, dual-handed industrial robotic systems. In addition, the enhanced 3D sensing capabilities from the robotics domain enable us to reconsider our view of smart manufacturing by allowing on-the-fly spatial perception of the robot's workspace.
