Task Representation in Robots for
Robust Coupling of Perception to Action
in Dynamic Scenes
Darius Burschka
Machine Vision and Perception Group
Department of Computer Science
Technische Universität München
http://www6.in.tum.de/burschka/ · ISRR 2017, Dec 12, 2017
We have seen examples of the proposed idea before...
In static environments, two types of robot navigation evolved:
• Map-based navigation
• Visual servoing (visual maps)
Coupling Alternatives for Perception Modules
[Block diagram: sensors (camera, IMU, laser) feed a Structure-from-X module; the actuators are driven either through 3D map-based action planning or through reactive behaviors (instincts), e.g., obstacle avoidance.]
Task-Specific Indexing to Information
Example – Collision Avoidance
Cartesian maps originate from a time when the robot was the only moving agent in an otherwise static environment; there, the collision risk is determined by the distance to an object.
In dynamic environments, collision avoidance is a function of both the distance and the velocity of the object, i.e., of the time to collision.
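A minimal numeric sketch of this distinction (hypothetical values, not from the talk): the same distance corresponds to very different collision times depending on the closing speed, so distance alone is not a sufficient index for the avoidance task.

```python
# Hypothetical values, chosen only for illustration: the same distance
# maps to very different collision times depending on the closing speed.

def collision_time(distance_m, closing_speed_mps):
    """Time to collision = distance / closing speed (infinite if not closing)."""
    if closing_speed_mps <= 0.0:
        return float("inf")            # static or receding object
    return distance_m / closing_speed_mps

if __name__ == "__main__":
    d = 20.0                           # distance to the object in metres
    for v in (0.0, 1.0, 10.0):         # closing speed in m/s
        print(f"d = {d:.0f} m, v = {v:.0f} m/s -> TTC = {collision_time(d, v):.1f} s")
```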
Capturing Motion Properties of Large Dynamic Scenes
The dynamic state of moving objects at large distances cannot be derived from consecutive metric reconstructions because of detection and calibration uncertainties.
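One way to see why, using a standard stereo error model (my addition, not a derivation from the slides): the depth error grows roughly quadratically with depth, ΔZ ≈ Z²·Δd/(f·b), so a velocity obtained by differencing two consecutive reconstructions of a distant object is dominated by noise.

```python
# Standard stereo error model (an assumption, not taken from the slides):
# dZ ≈ Z^2 * d_disp / (f * b).  Differencing two noisy reconstructions to
# obtain a velocity amplifies this error further.

def stereo_depth_error(depth_m, focal_px=1000.0, baseline_m=0.3, disparity_err_px=0.25):
    return depth_m ** 2 * disparity_err_px / (focal_px * baseline_m)

if __name__ == "__main__":
    dt = 0.1                                     # time between frames (s)
    for z in (5.0, 20.0, 80.0):
        dz = stereo_depth_error(z)
        v_err = 2.0 * dz / dt                    # worst case when differencing frames
        print(f"Z = {z:5.1f} m: depth error ~{dz:5.2f} m, "
              f"velocity error up to ~{v_err:5.1f} m/s")
```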
Abstracting Information to 3D Prevents Reliable Calculation of the Dynamic State
Direct Mapping of Point Motion on Image Observation
The direction of motion and the collision times can be observed directly from the properties of the projections of the imaged points.
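A sketch of such a direct mapping under simplifying assumptions (pure relative translation, points tracked on a single object; the function names are mine): the focus of expansion of the sparse flow gives the direction of motion, and the relative expansion of the tracked point set gives the time to collision as TTC ≈ Δt/(s − 1), without any metric reconstruction.

```python
import numpy as np

# Sketch under simplifying assumptions (pure relative translation, points
# tracked on one object over two frames; names are mine, not the authors'):
# the focus of expansion (FOE) gives the direction of motion in the image,
# the relative scale change of the point set gives the time to collision.

def focus_of_expansion(pts_prev, pts_curr):
    """Least-squares intersection of the flow lines (radial flow assumed)."""
    flow = pts_curr - pts_prev
    A = np.stack([flow[:, 1], -flow[:, 0]], axis=1)          # rows [vy, -vx]
    b = flow[:, 1] * pts_prev[:, 0] - flow[:, 0] * pts_prev[:, 1]
    foe, *_ = np.linalg.lstsq(A, b, rcond=None)
    return foe

def time_to_collision(pts_prev, pts_curr, dt):
    """TTC from the relative expansion of the point set between frames."""
    c0, c1 = pts_prev.mean(axis=0), pts_curr.mean(axis=0)
    s = np.linalg.norm(pts_curr - c1, axis=1).mean() / \
        np.linalg.norm(pts_prev - c0, axis=1).mean()
    return np.inf if s <= 1.0 else dt / (s - 1.0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts = rng.uniform(-40, 40, size=(30, 2)) + np.array([320.0, 240.0])
    foe_true = np.array([300.0, 250.0])
    pts_next = foe_true + 1.05 * (pts - foe_true)     # 5 % expansion per frame
    print("FOE ->", focus_of_expansion(pts, pts_next))
    print("TTC ->", time_to_collision(pts, pts_next, dt=0.05), "s")
```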
Fast Uncalibrated Monocular Segmentation of Independent Motion Components
The sparse optical flow in the images can be analysed directly to group moving objects and to determine their motion properties.
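A minimal sketch of this kind of grouping (my own simplification, not the segmentation method of the talk): the dominant, camera-induced part of the sparse flow is modelled by a 2D affine field fitted with RANSAC, and points whose flow does not follow this model are labelled as independently moving components.

```python
import numpy as np

# Simplified sketch (not the authors' algorithm): the dominant,
# ego-motion-induced flow is modelled as a 2D affine field estimated with
# RANSAC; points that do not fit are treated as independent motion.

def fit_affine_flow(pts, flow):
    """Least-squares affine model  flow ≈ [x, y, 1] @ A."""
    X = np.hstack([pts, np.ones((len(pts), 1))])
    A, *_ = np.linalg.lstsq(X, flow, rcond=None)              # shape (3, 2)
    return A

def ransac_dominant_flow(pts, flow, iters=200, thresh=1.5, seed=0):
    rng = np.random.default_rng(seed)
    X = np.hstack([pts, np.ones((len(pts), 1))])
    best_inliers = np.zeros(len(pts), dtype=bool)
    for _ in range(iters):
        idx = rng.choice(len(pts), size=3, replace=False)
        A = fit_affine_flow(pts[idx], flow[idx])
        resid = np.linalg.norm(X @ A - flow, axis=1)
        inliers = resid < thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers              # True = consistent with the dominant (ego) flow

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    pts = rng.uniform(0, 640, size=(200, 2))
    flow = np.tile([2.0, 0.5], (200, 1))                      # camera-induced flow
    moving = pts[:, 0] > 500                                  # a region moving on its own
    flow[moving] += [0.0, 6.0]
    bg = ransac_dominant_flow(pts, flow + rng.normal(0, 0.2, flow.shape))
    print("independently moving points:", np.count_nonzero(~bg), "of", len(pts))
```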
Optimisation in Collision Space
(Schaub, Burschka, ITSC 2015)
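A toy sketch of optimising in such a collision space (my simplification, not the published method): each obstacle is described by its lateral offset, lateral rate, and time to collision rather than by Cartesian coordinates, and the avoidance command is chosen by minimising a clearance-based cost over candidate lateral velocities.

```python
import numpy as np

# Toy sketch (my simplification, not the method of Schaub and Burschka):
# obstacles live in a "collision space" of (lateral offset, lateral rate,
# time to collision); the avoidance command minimises a cost over it.

SAFETY_WIDTH = 1.5               # required lateral clearance at passing time (m)

def collision_cost(obstacles, ego_lat_vel):
    """Penalise obstacles whose predicted clearance at their TTC is too small."""
    cost = 0.0
    for lat_offset, lat_rate, ttc in obstacles:
        clearance = abs(lat_offset + (lat_rate - ego_lat_vel) * ttc)
        if clearance < SAFETY_WIDTH:
            cost += (SAFETY_WIDTH - clearance) / max(ttc, 1e-3)   # urgent obstacles weigh more
    return cost

def choose_lateral_velocity(obstacles, candidates=np.linspace(-2.0, 2.0, 41)):
    costs = [collision_cost(obstacles, v) + 0.05 * abs(v) for v in candidates]
    return candidates[int(np.argmin(costs))]

if __name__ == "__main__":
    # (lateral offset m, lateral rate m/s, time to collision s) per obstacle
    obstacles = [(0.2, 0.0, 2.5), (-3.0, 0.5, 4.0)]
    print("chosen lateral velocity:", choose_lateral_velocity(obstacles), "m/s")
```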
Avoidance based on Collision Space from Uncalibrated Cameras
Conclusions
• A Cartesian representation of the world is not always appropriate for the task description.
• Exchanging information close to the sensor representation gives more robust results and avoids reliance on external calibration parameters.
• The parsing of actions can be defined as changes of contact relations in the world, which can also be monitored directly in image space.
• New research is needed on:
  • Optimization in the new representations
  • Modification of the control approaches
  • Analysis of the task-relevant physical properties (for sensor mapping)