
European Union's Seventh Framework Programme for Research and Technological Development Information and Communication Technologies (ICT)

Theme

FP7 - ICT - CHALLENGE 2, Objective ICT-2009.2.1

Cognitive Systems and Robotics


Projects resulting from the sixth FP7-ICT Call for Proposals FP7-ICT-2009-6

ACTIVE: Active Constraints Technologies for Ill-defined or Volatile Environments


The ACTIVE project exploits ICT and other engineering methods and technologies for the design and development of an integrated redundant robotic platform for neurosurgery. A light and agile redundant robotic cell with 20 degrees of freedom (DoFs) and an advanced processing unit for pre- and intra-operative control will operate both autonomously and cooperatively with surgical staff on the brain, a loosely structured environment. As the patient will not be considered rigidly fixed to the operating table and/or to the robot, the system will push the boundaries of the state of the art in robotics and control for the accuracy and bandwidth required by this challenging and complex surgical scenario. Two cooperating robots will interact with the brain, which deforms due to tool contact, blood pressure, breathing and deliquoration. Human factors are addressed by allowing easy interaction with the users through a novel haptic interface for telemanipulation and by a collaborative control mode ("hands-on"). Force and video feedback signals will be provided to surgeons. Active constraints will limit and direct tool-tip position, force and speed, preventing damage to eloquent areas defined on realistic tissue models that are updated in the field through sensor information. The active constraints will be updated (displaced) in real time in response to the feedback from tool-tissue interactions and any additional constraints arising from a complex shared workspace. The overarching control architecture of ACTIVE will negotiate the requirements and references of the two slave robots. The operating room represents the epitome of a dynamic, unstructured and volatile environment, crowded with people and instruments. The workspace will thus be monitored by environmental cameras, and machine learning techniques will be used for safe workspace sharing. Decisions about collision avoidance and downgrading to a safe state will be taken autonomously; the movement of the patient's head will be filtered by a bespoke active head frame, while fast and unpredictable patient motion will be compensated by a real-time cooperative control system. Cognitive skills will help to identify the target location in the brain and constrain robotic motions by means of on-field observations.

URL: http://www.active-fp7.eu/ COORDINATOR POLITECNICO DI MILANO - ITALY Ferrigno, Giancarlo - Tel. +39 0223993371 Email: giancarlo.ferrigno@polimi.it

OTHER CONSORTIUM MEMBERS
KARLSRUHER INSTITUT FUER TECHNOLOGIE (GERMANY)
DEUTSCHES FORSCHUNGSZENTRUM FUER KUENSTLICHE INTELLIGENZ GMBH (GERMANY)
MEDIMATON LIMITED (UNITED KINGDOM)
FORCE DIMENSION S.A.R.L. (SWITZERLAND)
FONDAZIONE IRCCS ISTITUTO NEUROLOGICO CARLO BESTA (ITALY)
CONSIGLIO NAZIONALE DELLE RICERCHE (ITALY)
CF CONSULTING FINANZIAMENTI UNIONE EUROPEA SRL (ITALY)
THE FOUNDATION FOR MEDICAL RESEARCH INFRASTRUCTURAL DEVELOPMENT AND HEALTH SERVICES NEXT TO THE MEDICAL CENTER TEL AVIV (ISRAEL)
RENISHAW (IRELAND) LTD (IRELAND)
KUKA ROBOTER GMBH (GERMANY)
IMPERIAL COLLEGE OF SCIENCE, TECHNOLOGY AND MEDICINE (UNITED KINGDOM)
FONDAZIONE ISTITUTO ITALIANO DI TECNOLOGIA (ITALY)
TECHNISCHE UNIVERSITAET MUENCHEN (GERMANY)
TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY (ISRAEL)
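The active constraints described in the ACTIVE summary above act as virtual fixtures that bound tool-tip motion near protected tissue. The following is a minimal sketch of that idea, assuming a spherical protected region and illustrative gains, distances and speed limits; it is not the ACTIVE controller, only an illustration of how commanded motion can be damped and stopped as the tip approaches a forbidden zone.

import numpy as np

def constrain_velocity(tip_pos, commanded_vel, region_center, region_radius,
                       soft_margin=0.01, max_speed=0.05):
    """Scale a commanded tool-tip velocity as the tip nears a protected region.
    Units are metres and metres per second; all values are illustrative."""
    to_center = region_center - tip_pos
    dist_to_boundary = np.linalg.norm(to_center) - region_radius

    # Hard constraint: never command motion further into the protected region.
    if dist_to_boundary <= 0:
        inward = to_center / (np.linalg.norm(to_center) + 1e-9)
        return commanded_vel - max(np.dot(commanded_vel, inward), 0.0) * inward

    # Soft constraint: linearly reduce the allowed speed inside the safety margin.
    scale = min(1.0, dist_to_boundary / soft_margin)
    speed = np.linalg.norm(commanded_vel)
    allowed = scale * max_speed
    if speed > allowed:
        commanded_vel = commanded_vel / speed * allowed
    return commanded_vel

# Example: tip 5 mm from a 20 mm protected sphere, surgeon commands 40 mm/s towards it;
# the constraint reduces the commanded speed to 25 mm/s.
v = constrain_velocity(np.array([0.0, 0.0, 0.025]), np.array([0.0, 0.0, -0.04]),
                       np.array([0.0, 0.0, 0.0]), 0.02)
print(v)

In the real system the constraint geometry would come from the intra-operatively updated tissue model rather than a fixed sphere.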


CoCoRo: Collective Cognitive Robots


This ambitious project aims at creating a swarm of interacting, cognitive, autonomous robots. We will develop a swarm of autonomous underwater vehicles (AUVs) that are able to interact with each other and that can balance tasks (interactions between/within swarms). These tasks are: ecological monitoring, searching, maintaining, exploring and harvesting resources in underwater habitats. The swarm will maintain its integrity under dynamically changing environmental conditions and will therefore require robustness and flexibility. This will be achieved by letting the AUVs interact with each other and exchange information, resulting in a cognitive system that is aware of its environment, of local individual goals and threats, and of global swarm-level goals and threats. Our consortium consists of both biological and technical institutions and is therefore optimally qualified to achieve this goal. Through a combination of locally and globally acting self-organizing mechanisms, information from the global level flows into the local level and influences the behaviour of individual AUVs. Such a cognition-based scheme allows a very fast reaction of the whole collective system when optimizing global performance. As natural fish schools show, such mechanisms are also flexible and scalable. The use of cognition-generating algorithms can even allow robotic swarms to mimic each other's behaviour and to learn from each other adequate reactions to environmental changes. In addition, we plan to investigate the emergence of artificial collective pre-consciousness, which leads to self-identification and further improvement of collective performance. In this way we explore several general principles of swarm-level cognition and can assess their importance in real-world applications. This can be exploited for improving the robustness, flexibility and efficiency of other technical applications in the field of ICT.
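As a rough illustration of the kind of local self-organizing rule such a swarm relies on, the sketch below lets each simulated AUV steer toward the centroid and mean heading of the neighbours it can sense. The two-dimensional setting, sensing radius, weights and damping are illustrative assumptions, not CoCoRo's actual coordination mechanisms.

import numpy as np

def swarm_step(positions, velocities, sense_radius=5.0,
               w_cohesion=0.05, w_alignment=0.2, damping=0.9, dt=1.0):
    """One update of a purely local rule (illustrative parameters): each AUV
    steers toward the centroid of sensed neighbours (cohesion) and toward
    their mean velocity (alignment)."""
    new_vel = velocities.copy()
    for i, (p, v) in enumerate(zip(positions, velocities)):
        dists = np.linalg.norm(positions - p, axis=1)
        neighbours = (dists < sense_radius) & (dists > 0)
        if neighbours.any():
            cohesion = positions[neighbours].mean(axis=0) - p
            alignment = velocities[neighbours].mean(axis=0) - v
            new_vel[i] = damping * (v + w_cohesion * cohesion + w_alignment * alignment)
    return positions + new_vel * dt, new_vel

rng = np.random.default_rng(0)
pos = rng.uniform(0, 10, size=(20, 2))     # 20 AUVs scattered over a 10 m x 10 m patch
vel = rng.normal(0, 0.1, size=(20, 2))
for _ in range(50):
    pos, vel = swarm_step(pos, vel)
print(pos.std(axis=0))                      # the spread shrinks as the swarm coheres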

URL: http://cocoro.uni-graz.at/ COORDINATOR UNIVERSITAET GRAZ - AUSTRIA SCHMICKL Thomas Tel: +43-3163808759 Email: thomas.schmickl@uni-graz.at

OTHER CONSORTIUM MEMBERS


Organisations:
UNIVERSITY OF YORK (UNITED KINGDOM)
SCUOLA SUPERIORE DI STUDI UNIVERSITARI E DI PERFEZIONAMENTO SANT'ANNA (ITALY)
UNIVERSITAET STUTTGART (GERMANY)
UNIVERSITE LIBRE DE BRUXELLES (BELGIUM)


COMPLACS: Composing Learning for Artificial Cognitive Systems


One of the aspirations of machine learning is to develop intelligent systems that can address a wide variety of control problems of many different types. However, although the community has developed successful technologies for many individual problems, these technologies have not previously been integrated into a unified framework. As a result, the technology used to specify, solve and analyse one control problem typically cannot be reused on a different problem. The community has fragmented into a diverse set of specialists with particular solutions to particular problems. The purpose of this project is to develop a unified toolkit for intelligent control in many different problem areas. This toolkit will incorporate many of the most successful approaches to a variety of important control problems within a single framework, including bandit problems, Markov Decision Processes (MDPs), Partially Observable MDPs (POMDPs), continuous stochastic control, and multi-agent systems. In addition, the toolkit will provide methods for the automatic construction of representations and capabilities, which can then be applied to any of these problem types. Finally, the toolkit will provide a generic interface for specifying problems and analysing performance, by mapping intuitive, human-understandable goals into machine-understandable objectives, and by mapping algorithm performance and regret back into human-understandable terms.
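To make the idea of a unified framework concrete, here is a minimal sketch in which different control problems expose one common interface and a single agent implementation can be run against any of them. The class and method names are assumptions made for illustration; they are not the COMPLACS toolkit API.

import random

class Problem:
    """Common interface: every control problem exposes its actions and a step()."""
    def actions(self):
        raise NotImplementedError
    def step(self, action):          # returns (observation, reward)
        raise NotImplementedError

class Bandit(Problem):
    """A multi-armed bandit as one instance of the common interface."""
    def __init__(self, means):
        self.means = means
    def actions(self):
        return list(range(len(self.means)))
    def step(self, action):
        return None, random.gauss(self.means[action], 1.0)

class EpsilonGreedyAgent:
    """Works against any Problem, not just bandits."""
    def __init__(self, problem, eps=0.1):
        self.eps = eps
        self.value = {a: 0.0 for a in problem.actions()}
        self.count = {a: 0 for a in problem.actions()}
    def act(self):
        if random.random() < self.eps:
            return random.choice(list(self.value))
        return max(self.value, key=self.value.get)
    def learn(self, action, reward):
        self.count[action] += 1
        self.value[action] += (reward - self.value[action]) / self.count[action]

bandit = Bandit([0.1, 0.5, 0.9])
agent = EpsilonGreedyAgent(bandit)
for _ in range(1000):
    a = agent.act()
    _, r = bandit.step(a)
    agent.learn(a, r)
print(agent.value)   # a regret analysis would compare these choices against the best arm (0.9)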

URL: http://www.complacs.org/ COORDINATOR UNIVERSITY COLLEGE LONDON - UNITED KINGDOM John Shawe-Taylor -Tel. +44(0)207 679 0481 Email: admin@complacs.org OTHER CONSORTIUM MEMBERS
Organisations:
MONTANUNIVERSITAET LEOBEN (AUSTRIA)
ROYAL HOLLOWAY AND BEDFORD NEW COLLEGE (UNITED KINGDOM)
STICHTING KATHOLIEKE UNIVERSITEIT (NETHERLANDS)
MAX PLANCK GESELLSCHAFT ZUR FOERDERUNG DER WISSENSCHAFTEN E.V. (GERMANY)
INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET EN AUTOMATIQUE (FRANCE)
TECHNISCHE UNIVERSITAT BERLIN (GERMANY)
UNIVERSITY OF BRISTOL (UNITED KINGDOM)


CORBYS: Cognitive Control Framework for Robotic Systems


CORBYS focuses on robotic systems that have a symbiotic relationship with humans. Such robotic systems have to cope with highly dynamic environments, as humans are demanding, curious and often act unpredictably. CORBYS will design and implement a cognitive robot control architecture that allows the integration of 1) high-level cognitive control modules, 2) a semantically-driven self-awareness module, and 3) a cognitive framework for anticipation of, and synergy with, human behaviour based on biologically-inspired information-theoretic principles. These modules, supported by an advanced multi-sensor system to facilitate dynamic environment perception, will endow robotic systems with high-level cognitive capabilities such as situation awareness and attention control. This will enable robot behaviour to adapt to the user's variable requirements, directed by cognitively adapted control parameters. CORBYS will provide a flexible and extensible architecture to benefit a wide range of applications, ranging from robotised vehicles and autonomous systems such as robots performing object manipulation tasks in an unstructured environment to systems where robots work in synergy with humans. The latter class of systems will be a special focus of CORBYS innovation, as there exist important classes of critical applications where support for humans and robots sharing their cognitive capabilities is a particularly crucial requirement. The CORBYS control architecture will be validated within two challenging demonstrators: i) a novel mobile robot-assisted gait rehabilitation system (CORBYS); ii) an existing autonomous robotic system. The CORBYS demonstrator, to be developed during the project, will be a self-aware system capable of learning and reasoning, enabling it to optimally match the requirements of the user at different stages of rehabilitation across a wide range of gait disorders.

URL: http://corbys.eu/ COORDINATOR UNIVERSITAET BREMEN - GERMANY GRÄSER Axel - Tel: +49-42121862444 Email: ag@iat.uni-bremen.de

OTHER CONSORTIUM MEMBERS
Organisations:
OTTO BOCK MOBILITY SOLUTIONS GMBH (GERMANY)
VRIJE UNIVERSITEIT BRUSSEL (BELGIUM)
BIT&BRAIN TECHNOLOGIES SL (SPAIN)
UNIVERZITETNI REHABILITACIJSKI INSTITUT REPUBLIKE SLOVENIJE-SOCA (SLOVENIA)
THE UNIVERSITY OF HERTFORDSHIRE HIGHER EDUCATION CORPORATION (UNITED KINGDOM)
NEUROLOGISCHES REHABILITATIONSZENTRUM FRIEDEHORST GEM GMBH (GERMANY)
SCHUNK GMBH & CO KG SPANN- UND GREIFTECHNIK (GERMANY)
OTTO BOCK HEALTHCARE GMBH (GERMANY)
THE UNIVERSITY OF READING (UNITED KINGDOM)
STIFTELSEN SINTEF (NORWAY)


DARWIN: Dextrous Assembler Robot Working with Embodied Intelligence


(Figure: the iCub humanoid platform, by IIT.)
Targeting both the assembly and service industries, the DARWIN project aims to develop an acting, learning and reasoning assembler robot that will ultimately be capable of assembling complex objects from their constituent parts. First steps will also be taken to explore the general repair problem (seen as a sequence of assembly-disassembly actions). Functionally, the robot will operate in three modes: a) slave mode, where the necessary sequence of operations will be provided by a CAD-CAM system; b) semi-autonomous mode, where the sequence will be provided either through demonstration by a teacher performing the same task or by describing it in a higher-level language suitable for a human operator. In this mode the system's perception will provide closure of the perception-action loop, eliminating the need for detailed spatial information as in (a); c) fully autonomous mode, where an executive process will generate the necessary sequence by reasoning and mental simulation of the consequences of actions on objects. Effort will be put into keeping the resulting cognitive architecture domain-agnostic. This directly implies that the robot should be able to effectively generalize and transfer previously gained knowledge to new tasks (or to performing the same task in a slightly modified world). The objects in its world will not be known a priori. Rather, knowledge of them will be built by interacting with them. Thus categorization, object affordances, accurate manipulation and the discovery of naive physics will be acquired gradually by the robot. In addition, it will also learn the solution steps for an assembly problem by observing snapshots of the assembled object during various construction phases. The reasoning system will exploit all these experiences in order to allow the robot to go beyond experience when confronted with novel situations. A series of demonstrators of increasing complexity will be developed in step with the maturation of the cognitive architecture.

URL: http://www.darwin-project.eu/ COORDINATOR PROFACTOR GMBH STEYR - AUSTRIA Dr. Christian Eitzinger Email: christian.eitzinger@profactor.at

OTHER CONSORTIUM MEMBERS


Organisations:
FOUNDATION FOR RESEARCH AND TECHNOLOGY HELLAS (GREECE)
KING'S COLLEGE LONDON (UNITED KINGDOM)
ITALIAN INSTITUTE OF TECHNOLOGY (ITALY)
CZECH TECHNICAL UNIVERSITY IN PRAGUE (CZECH REPUBLIC)
NOVOCAPTIS (GREECE)


EFAA: Experimental Functional Android Assistant


As the introduction of robots into our daily life becomes a reality, the social compatibility of such robots gains importance. In order to meaningfully interact with humans, robots must develop an advanced real-world social intelligence that includes novel perceptual, behavioural, emotional, motivational and cognitive capabilities. The Experimental Functional Android Assistant (EFAA) project will contribute to the development of socially intelligent humanoids by advancing the state of the art both in individual human-like social capabilities and in their integration in a consistent architecture. The EFAA project proposes a biomimetic, brain-inspired approach. The central assumption of EFAA is that social robots must develop a sense of self so as to overcome the fundamental problem of social inference: it is only in possessing the core aspects of a human-like self that inferences about others can be made through analogy. The EFAA Biomimetic Architecture for Situated Social Intelligence Systems, called BASSIS, is based on our growing understanding of the neuronal mechanisms and psychological processes underlying social perception, cognition and action, and will exploit the availability, amongst the members of the consortium, of a number of complementary prototype robot-based perceptual, cognitive and motor architectures. By integrating across these existing architectures, by directing focused effort on specific core problems, and by exploiting the availability of unique advanced real-time neuronal simulation and hardware, the impact of the EFAA project is assured. The EFAA project will apply and benchmark BASSIS on a mobile humanoid assistant based on the iCub platform. The resultant system, EFAA, will actively engage in social interactions and interactive cognitive tasks. The realization of these game-like interactions and their performance analysis will be facilitated through a mixed-reality interaction paradigm using a table-top tangible interface system.

URL: http://efaa.upf.edu/ COORDINATOR UNIVERSITAT POMPEU FABRA, SPAIN Paul Verschure - Tel: +34-935422140 Email: efaa.info@upf.edu

OTHER CONSORTIUM MEMBERS
Organisations:
INSTITUT NATIONAL DE LA SANTE ET DE LA RECHERCHE MEDICALE (INSERM) (FRANCE)
IMPERIAL COLLEGE OF SCIENCE, TECHNOLOGY AND MEDICINE (UNITED KINGDOM)
THE UNIVERSITY OF SHEFFIELD (UNITED KINGDOM)
FONDAZIONE ISTITUTO ITALIANO DI TECNOLOGIA (ITALY)


EMICAB: Embodied Motion Intelligence for Cognitive, Autonomous Robots


The EMICAB consortium takes a holistic approach to the engineering of artificial cognitive systems. Our goal is to integrate smart body mechanics into intelligent planning and control of motor behaviour. To achieve this goal the consortium accounts equally for problems in neuroscience (e.g., multi-sensory integration, internal body models, intelligent action planning) and technology (smart body mechanics, distributed embodied sensors and brain-like controllers). Our approach starts with a strongly sensorised bionic body with redundant whole-body kinematics and then designs the technological infrastructure such that cognitive mechanisms emerge from distributed sensorimotor intelligence. The concept is based on neuroscience research on insects, whose motor dexterity, adaptiveness and pre-rational abilities in learning and memory rival those of lower mammals: stick insects orchestrate a wide range of dexterous motor behaviours and flies can maintain object locations in short-term memory during navigation tasks, to mention just the paradigms studied by UNIBI and JGUM. The partners UNICT and SDU will devise bio-inspired models and, in turn, guide ongoing experimental research in order to achieve the overall technological goal: a dexterous hexapod robot that exploits its bodily resources for cognitive functions. Two levels of analysis and modelling will be accounted for: the smart brain, which captures various aspects of motion intelligence (motor learning, context-dependent actions, multi-sensory integration), and the smart body, equipped with distributed proprioceptors and muscle-like compliance, allowing for novel, highly adaptive, neurobionic control strategies. The EMICAB robot will draw on its complex body features and learn by using an internal body model. Progress will be monitored by an ambitious set of benchmarking scenarios. We expect mutual benefit for applied research on autonomous mobile robots and for basic research in neuroscience.

URL: http://www.emicab.eu/ COORDINATOR UNIVERSITAET BIELEFELD, GERMANY Volker Dürr - Tel: +49-521-106 5528 Email: volker.duerr@uni-bielefeld.de

OTHER CONSORTIUM MEMBERS
Organisations:
SYDDANSK UNIVERSITET (DENMARK)
UNIVERSITA DEGLI STUDI DI CATANIA (ITALY)
JOHANNES GUTENBERG UNIVERSITAET MAINZ (GERMANY)


eSMCs: Extending Sensorimotor Contingencies to Cognition


The majority of current robot architectures are based on a perception-then-action control strategy. In this project, we will adopt a theoretical perspective that turns this classical view upside-down and emphasizes the constitutive role of action for perception. The key concept our project is based on is that of sensorimotor contingencies, that is, law-like relations between actions and the associated changes in sensory input. We will advance this concept further and suggest that actions not only play a key role for perception, but also in developing more complex cognitive capabilities. We suggest that extended sensorimotor contingencies (eSMCs) may be exploited for the definition of object concepts and action plans and that their mastery can lead to goal-oriented behaviour. The project pursues the following objectives: we will employ this approach to establish computational models that are suitable as controllers for autonomous robots; we will implement these eSMC-based models on robotic platforms with different sensor-actuator equipment; we will investigate learning and adaptivity of eSMCs in artificial systems, focusing on sensorimotor interactions, object recognition and action planning; we will investigate and validate the concept of eSMCs in natural cognitive systems by carrying out behavioural and neurophysiological studies on healthy human subjects; finally, we will test predictions derived from this concept in patients with movement dysfunctions, where the ensuing changes in perceptual and cognitive processing will be tested. A set of benchmarks and task scenarios will be developed, serving as demonstrators for the enhanced performance of artificial systems based on the eSMCs approach. Moreover, the usefulness of the approach for the development of applications in augmenting human behaviour will be demonstrated.
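A minimal sketch of the underlying idea, under simplifying assumptions: an agent records, for each action, the average sensory change that follows it, and then picks the action whose learned contingency best moves a sensor reading toward a goal. The toy one-dimensional world, action names and update rule are illustrative, not the project's models.

import random
from collections import defaultdict

class SMCLearner:
    """Keep a running estimate of the sensory change each action produces."""
    def __init__(self, actions):
        self.actions = actions
        self.delta = defaultdict(float)   # learned contingency: action -> mean sensory change
        self.count = defaultdict(int)
    def update(self, action, sensed_before, sensed_after):
        self.count[action] += 1
        change = sensed_after - sensed_before
        self.delta[action] += (change - self.delta[action]) / self.count[action]
    def choose(self, sensed, goal):
        # Pick the action whose contingency best moves the sensed value toward the goal.
        return min(self.actions, key=lambda a: abs(sensed + self.delta[a] - goal))

# Toy world: a 1-D light sensor; moving left/right changes the reading by about -1/+1.
def world(action, reading):
    return reading + {"left": -1.0, "right": +1.0, "stay": 0.0}[action] + random.gauss(0, 0.1)

learner = SMCLearner(["left", "right", "stay"])
reading = 0.0
for _ in range(200):                      # motor babbling phase: learn the contingencies
    a = random.choice(learner.actions)
    new_reading = world(a, reading)
    learner.update(a, reading, new_reading)
    reading = new_reading

print(learner.choose(sensed=reading, goal=reading + 1.0))   # prints "right"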

URL: http://www.esmcs.eu/ COORDINATOR UNIVERSITAETSKLINIKUM HAMBURG-EPPENDORF - GERMANY ENGEL Andreas K. - Tel: +49-40741056170 Email: ak.engel@mac.com

OTHER CONSORTIUM MEMBERS


Organisations:
UNIVERSITAET ZUERICH (SWITZERLAND)
UNIVERSITAT POMPEU FABRA (SPAIN)
UNIVERSIDAD DEL PAIS VASCO (SPAIN)
KUNGLIGA TEKNISKA HOEGSKOLAN (SWEDEN)
UNIVERSITAET OSNABRUECK (GERMANY)


EUCogIII: 3rd European Network for the Advancement of Artificial Cognitive Systems, Interaction and Robotics
The 3rd European Network for the Advancement of Artificial Cognitive Systems, Interaction and Robotics (EUCogIII) will organise a network of many hundreds of researchers and continue building the cognitive systems community in Europe, which straddles the divides of traditional academic disciplines in its work towards artificial intelligent systems that are autonomous, robust, flexible and self-improving in pursuing their goals. Such a network is the grease for the gears of research, fostering a self-standing community with clear aims that can produce efficient, focused research with deep impact. Beyond building the community itself, EUCogIII will focus on reaching out to show what artificial cognitive systems research has to offer and on increasing the impact of the research already done; this outreach will centre on a specific selection of themes and communities that lend themselves well to this purpose. Structurally, we will build bridges from the network to related communities and to existing organisations and networks, especially with a view towards the application of artificial cognitive systems research in robotics, but also in other areas of industry that are moving towards systems that are more intelligent and inspired by natural systems. This orientation towards application also provides added focus to artificial cognitive systems research itself. To further the community, the EUCogIII network will provide and support the education of new researchers (at PhD student level) on the latest developments at the cutting edge of science. Finally, our network will provide a...

URL: http://www.eucognition.org/ COORDINATOR AMERIKANIKO KOLLEGIO ANATOLIA - GREECE MÜLLER Vincent C. - Tel: +30-2310398211 Email: vmueller@ac.anatolia.edu.gr

OTHER CONSORTIUM MEMBERS


Organisations:
UNIVERSITAET ZUERICH (SWITZERLAND)
TECHNISCHE UNIVERSITAET WIEN (AUSTRIA)
UNIVERSITAT DE LES ILLES BALEARS (SPAIN)
RUHR-UNIVERSITAET BOCHUM (GERMANY)
UNIVERSITAETSKLINIKUM HAMBURG-EPPENDORF (GERMANY)
RIJKSUNIVERSITEIT GRONINGEN (NETHERLANDS)
HOGSKOLAN I SKOVDE (SWEDEN)
UNIVERSITY OF SUSSEX (UNITED KINGDOM)


Goal-Leaders: Goal-directed, Adaptive Builder Robots


The Goal-Leaders project aims at developing biologically-constrained architectures for the next generation of adaptive service robots, with unprecedented levels of goal-directedness and proactiveness. Goal-Leaders will realize builder robots able to carry out externally assigned tasks (e.g., fetching objects, composing building parts) and, at the same time, keep their homeostatic drives within a safe range (e.g., never run out of energy or get hurt), while operating autonomously for prolonged periods of time in open-ended environments. To this aim, Goal-Leaders will pursue a combined neuro-scientific, computational and robotic study of three key sets of competences: the biological and computational mechanisms behind an agent's goal system, which integrates somatic, affective, and cognitive elements and realizes the setting and selection of the agent's goals; the biological and computational mechanisms that support the assignment of situation-dependent value to an agent's state and action representations, and therefore realize the link between the agent's goal system and perceptual-motor processes; and the biological and computational mechanisms behind an agent's anticipatory and mental simulation abilities. The Goal-Leaders achievements beyond the state of the art will be assessed against behavioural and neuro-scientific data, and by realizing three demonstrators in which robots will perform autonomous navigation and construction tasks and will readapt without reprogramming to novel task allocations or changes in their environment. The project consortium includes a highly complementary, interdisciplinary team of top European researchers focusing on neuroscience, cognitive science, and robotics. Our robotic design methodology will have a significant impact both on the understanding of goal-directed action in living organisms and on the realization of goal-directed service robots that combine navigation and manipulation within an unconstrained environment.
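The interplay between externally assigned tasks and homeostatic drives can be made concrete with a very small sketch: a goal-selection rule in which any drive that falls below a critical level pre-empts the task queue. The drive names, threshold and priority rule are illustrative assumptions, not the Goal-Leaders goal system.

def select_goal(drives, task_queue, critical=0.2):
    """Pick the next goal: homeostatic drives below a critical level pre-empt
    externally assigned tasks (values and rule are illustrative)."""
    # drives: dict mapping drive name -> satisfaction level in [0, 1]
    deficits = {name: level for name, level in drives.items() if level < critical}
    if deficits:
        worst = min(deficits, key=deficits.get)
        return ("satisfy_drive", worst)
    if task_queue:
        return ("external_task", task_queue[0])
    return ("idle", None)

drives = {"energy": 0.15, "integrity": 0.9}
tasks = ["fetch_part_A", "assemble_B"]
print(select_goal(drives, tasks))   # ('satisfy_drive', 'energy'): recharge before fetching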

URL: http://www.goal-leaders.eu/ COORDINATOR CONSIGLIO NAZIONALE DELLE RICERCHE, ROME ITALY Giovanni Pezzulo - Tel: +39 06 4459 5206 Email: giovanni.pezzulo@istc.cnr.it

OTHER CONSORTIUM MEMBERS


Organisations:
UNIVERSITEIT VAN AMSTERDAM (NETHERLANDS)
LUNDS UNIVERSITET (SWEDEN)
UNIVERSITAT POMPEU FABRA (SPAIN)


IntellAct: Intelligent observation and execution of Actions and manipulations


IntellAct addresses the problem of understanding and exploiting the meaning (semantics) of manipulations, in terms of objects, actions and their consequences, for reproducing human actions with machines. This is in particular required for interaction between humans and robots, in which the robot has to understand the human action and then transfer it to its own embodiment. IntellAct will provide the means to allow for this transfer not by copying the movements of the human but by transferring the human action on a semantic level. IntellAct will demonstrate the ability to understand scene and action semantics and to execute actions with a robot in two domains: first, in a laboratory environment (exemplified by a lab in the International Space Station (ISS)) and second, in an assembly process in an industrial context. IntellAct consists of three building blocks: (1) Learning: abstract, semantic descriptions of manipulations are extracted from video sequences showing a human demonstrating the manipulations; (2) Monitoring: observed manipulations are evaluated against the learned semantic models; (3) Execution: based on the learned semantic models, equivalent manipulations are executed by a robot. The analysis of low-level observation data for semantic content (Learning) and the synthesis of concrete behaviour (Execution) constitute the major scientific challenge of IntellAct. Based on the semantic interpretation and description, enhanced with low-level trajectory data for grounding, two major application areas are addressed by IntellAct: first, the monitoring of human manipulations for correctness (e.g., for training or in high-risk scenarios) and second, the efficient teaching of cognitive robots to perform manipulations in a wide variety of applications. To achieve these goals, IntellAct brings together recent methods for (1) parsing scenes into spatiotemporal graphs and so-called Semantic Event Chains, (2) probabilistic models of objects and their manipulation, (3) probabilistic rule learning, and (4) dynamic motion primitives for trainable and flexible descriptions of robotic motor behaviour. Its implementation employs a concurrent-engineering approach that includes virtual-reality-enhanced simulation as well as physical robots. Its goal culminates in the demonstration of a robot understanding, monitoring and reproducing human action.
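The Monitoring block can be illustrated with a much-simplified stand-in for Semantic Event Chains: each time step is reduced to the set of touching relations between objects, and an observed sequence is checked against a learned reference sequence. The relation encoding and the example task are assumptions made for illustration only.

def monitor(observed, reference):
    """Compare an observed sequence of object-relation snapshots with a learned one.
    Each snapshot is a frozenset of (object_a, object_b, relation) triples."""
    for step, (obs, ref) in enumerate(zip(observed, reference)):
        if obs != ref:
            return False, step, ref - obs          # first deviation and the missing relations
    if len(observed) < len(reference):
        return False, len(observed), reference[len(observed)]
    return True, None, None

reference = [
    frozenset({("hand", "cup", "touching")}),
    frozenset({("hand", "cup", "touching"), ("cup", "saucer", "touching")}),
]
observed = [
    frozenset({("hand", "cup", "touching")}),
    frozenset({("hand", "cup", "touching")}),      # the cup was never placed on the saucer
]
ok, step, missing = monitor(observed, reference)
print(ok, step, missing)   # reports the first deviating step and the missing relation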

URL: http://www.goal-leaders.eu/ COORDINATOR SYDDANSK UNIVERSITET - DENMARK KRÜGER Norbert - Tel: +45-65507483 Email: norbert@mmmi.sdu.dk

OTHER CONSORTIUM MEMBERS


Organisations:
RHEINISCH-WESTFAELISCHE TECHNISCHE HOCHSCHULE AACHEN (GERMANY)
GEORG-AUGUST-UNIVERSITAET GOETTINGEN STIFTUNG OEFFENTLICHEN RECHTS (GERMANY)
JOZEF STEFAN INSTITUTE (SLOVENIA)
UNIVERSITAET INNSBRUCK (AUSTRIA)
AGENCIA ESTATAL CONSEJO SUPERIOR DE INVESTIGACIONES CIENTIFICAS (SPAIN)


iSense: Making Sense of Nonsense


The emergence of networked embedded systems and sensor/actuator networks has made possible the collection of large amounts of real-time data about a monitored environment. Depending on the application, such data may have different characteristics: multidimensional, multi-scale, spatially distributed, time series, etc. Moreover, the data values may be influenced by controlled variables as well as by external environmental factors. However, in many cases the collected data may be incomplete, or it may not make sense for various reasons, thus compromising the sensor-environment interaction and possibly affecting the ability to manage and control key variables of the environment. The main objective of this project is to develop intelligent data processing methods for analyzing and interpreting the data such that faults are detected (and, where possible, anticipated), isolated and identified as soon as possible, and accommodated in future decisions or actuator actions. The problem becomes more challenging when these sensing/actuation systems are used in a wide range of environments that are not known a priori and, as a result, it is unrealistic to assume the existence of an accurate model for the behavior of the various components in the monitored environment. Therefore, this project will focus on cognitive system approaches that can learn the characteristics or system dynamics of the monitored environment, adapt their behavior, and predict missing or inconsistent data to achieve fault-tolerant monitoring and control.
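A minimal sketch of the residual-based idea behind such fault detection, under simplifying assumptions: a model of the monitored signal is adapted online, and a measurement is flagged when it deviates from the model's prediction by more than a threshold tied to the estimated variance. The model, threshold rule and data are illustrative only.

class ResidualDetector:
    """Exponentially-weighted prediction of a sensor stream; flag large residuals."""
    def __init__(self, alpha=0.1, k=4.0):
        self.alpha, self.k = alpha, k
        self.mean, self.var = None, 1.0
    def update(self, x):
        if self.mean is None:
            self.mean = x
            return False
        residual = x - self.mean
        faulty = abs(residual) > self.k * self.var ** 0.5
        if not faulty:                 # only adapt the model with data believed healthy
            self.mean += self.alpha * residual
            self.var += self.alpha * (residual ** 2 - self.var)
        return faulty

det = ResidualDetector()
stream = [20.0, 20.1, 19.9, 20.2, 20.0, 35.0, 20.1]   # one spurious reading
print([det.update(x) for x in stream])
# [False, False, False, False, False, True, False]: the outlier is flagged, not learned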

URL: http://www.i-sense.org/ COORDINATOR KIOS Research Center for Intelligent Systems and Networks, UNIVERSITY OF CYPRUS Prof. Marios Polycarpou - Tel: +357 22892252 Email: contacts@i-sense.org

OTHER CONSORTIUM MEMBERS


Organisations:
POLITECNICO DI MILANO - POLIMI (ITALY)
THE UNIVERSITY OF BIRMINGHAM (UNITED KINGDOM)
STMICROELECTRONICS SRL (ITALY)
UNIVERSITAT POLITECNICA DE CATALUNYA (SPAIN)


I-SUR: Intelligent Surgical Robotics


This project will develop advanced technologies for automation in minimally invasive and open surgery. The introduction of more and more complex surgical devices, such as surgical robots and single-port minimally invasive instruments, highlights the need for new control technologies in the operating room. On the one hand, the complexity of these devices requires new coordination methods to ensure their smooth operation; on the other hand, it also requires new interfaces that could simplify their use for surgeons. Automation may thus provide a solution to improve performance and efficiency in the operating room without increasing operating costs. Currently, automation is not used in the operating room for a number of technical and legal reasons. The anatomical environment is particularly difficult to handle with classical automation. Furthermore, the execution of a surgical intervention is controlled not only by a set of physical and geometrical set points describing the anatomical area and its properties, but also, and especially, by the medical and surgical knowledge that the surgeon uses in deciding what to do and how to do it during the intervention. These control and cognitive challenges are coupled with a legal barrier that currently prevents the use of an automatic intervention device in the operating room. In fact, liability issues with automatic products are known to have stopped many successful research projects in several countries. Thus, the I-SUR project aims at breaking new ground in the above areas related to automation in surgical intervention, in particular the design of robotic surgical instruments, task modeling and control in highly uncertain and variable environments, medical situation awareness and its interaction with task control, surgeon-robot communication, and legal barrier identification. To narrow the scope of the work, the project will focus on simple surgical actions, such as puncturing, cutting and suturing. Success metrics will be defined for these actions, and methods will be developed that abide by safety requirements formulated in terms of those metrics. The project will demonstrate that an autonomous robotic surgical action, carried out with the developed technologies, can be as safe as currently achievable by traditional surgery. Furthermore, preoperative task planning will be included in the project, to make sure that each surgeon is able to develop automatic procedures with his/her own surgical style. The I-SUR consortium is composed of the following partners: two University Hospitals, at the University of Verona and at the San Raffaele Institute (Milano); the e-Services department of the San Raffaele Hospital; ETH Zurich; the BioRobotics Laboratory of the Tallinn University of Technology; the Interventional Centre of the Oslo University Hospital; Yeditepe University in Istanbul; and the Universities of Verona, Ferrara and Modena-Reggio Emilia. The University of Verona coordinates the project.

URL: http://www.isur.eu/ COORDINATOR UNIVERSITA DEGLI STUDI DI VERONA - ITALY Fiorini, Paolo - Tel: +39 045 802 7963 Email: paolo.fiorini@univr.it

OTHER CONSORTIUM MEMBERS


Organisations:
OSLO UNIVERSITETSSYKEHUS HF (NORWAY)
FONDAZIONE CENTRO SAN RAFFAELE DEL MONTE TABOR (ITALY)
EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH (SWITZERLAND)
YEDITEPE UNIVERSITY (TURKEY)
TALLINNA TEHNIKAULIKOOL (ESTONIA)


JAMES: Joint Action for Multimodal Embodied Social Systems


The JAMES project ("Joint Action for Multimodal Embodied Social Systems") aims to develop a socially intelligent humanoid robot combining efficient task-based behaviour with the ability to understand and respond in a socially appropriate manner to a wide range of multimodal communicative signals in the context of realistic, open-ended, multi-party interactions. To direct our research in JAMES, we will focus on five core objectives: (1) analysing natural human communicative signals, (2) building a model of social interaction, (3) extending the model to manage learning and uncertainty, (4) implementing the model on a physical robot platform, and (5) evaluating the implemented system. The work in JAMES will build on state-of-the-art results and techniques in seven areas: social robotics, social signal processing, machine learning, multimodal data collection, planning and reasoning, visual processing, and natural language interaction. JAMES will combine the analysis of human social communicative behaviour, the development and integration of state-of-the-art technical components, and the evaluation of integrated systems. Work on these threads will be interleaved: the results of the human data analysis will be used in the development of the technical components, while the robot will be used for further data collection and evaluation studies. JAMES will extend the state of the art in social robotics by moving beyond one-on-one, long-term relationships to deal with more open-ended, multi-party, short-term situations. The research in JAMES will also increase our understanding of how humans use multimodal social cues to communicate and coordinate their interactions in task-driven, joint-action contexts. The individual technical contributions to the system components will also provide state-of-the-art results in their respective research areas.

URL: http://www.james-project.eu/ COORDINATOR THE UNIVERSITY OF EDINBURGH - UNITED KINGDOM Ron Petrick - Tel: +44 (0) 131 650 4426 Email: R.Petrick@ed.ac.uk

OTHER CONSORTIUM MEMBERS


Organisations:
FORTISS GMBH (GERMANY)
FOUNDATION FOR RESEARCH AND TECHNOLOGY HELLAS (GREECE)
UNIVERSITAET BIELEFELD (GERMANY)
HERIOT-WATT UNIVERSITY (UNITED KINGDOM)


NeuralDynamics: A neuro-dynamic framework for cognitive robotics: scene representations, behavioural sequences, and learning.
Endowing robots with cognition is a long-standing and difficult objective. Substantial progress in cognitive science and neuroscience has led to the insight that cognition is tightly linked to the sensory and motor surfaces, and that cognition emerges during development from relatively low-level mechanisms when situated in a structured environment. Building on basic functions such as detection and selection, we will develop a set of elements of cognition and techniques for combining such elements, allowing us to scale towards cognitive capabilities such as scene representation and sequence generation. We will implement and evaluate these elements of cognition in scenarios inspired by the development of cognition in early childhood.

URL: http://www.neuraldynamics.eu/ COORDINATOR RUHR-UNIVERSITAET BOCHUM, GERMANY SCHÖNER Gregor - Tel: +49-2343227965 Email: gregor.schoener@ini.ruhr-uni-bochum.de

OTHER CONSORTIUM MEMBERS


Organisations:
HOGSKOLAN I SKOVDE (SWEDEN)
CINTAL - CENTRO INVESTIGACAO TECNOLOGICA DO ALGARVE (PORTUGAL)
SCUOLA UNIVERSITARIA PROFESSIONALE DELLA SVIZZERA ITALIANA (SUPSI) (SWITZERLAND)


NOPTILUS: autoNomous, self-Learning, OPTImal and compLete Underwater Systems


Current multi-AUV systems are far from being capable of fully autonomously taking over real-life, complex situation-awareness operations. As such operations require advanced reasoning and decision-making abilities, current designs have to rely heavily on human operators. The involvement of humans, however, is by no means a guarantee of performance; humans can easily be overwhelmed by information overload, fatigue can act detrimentally on their performance, properly coordinating the vehicles' actions is hard, and continuous operation is all but impossible. Within NOPTILUS we take the view that an effective, fully-autonomous multi-AUV concept/system is capable of overcoming these shortcomings by replacing human-operated operations with fully autonomous ones. To successfully attain such an objective, significant advances are required, involving cooperative and cognitive-based communications and sonars (low level), Gaussian-Process-based estimation as well as perceptual sensory-motor and learning motion control (medium level), and learning/cognitive-based situation understanding and motion strategies (high level). Of paramount importance is the integration of all these advances and the demonstration of the NOPTILUS system in a realistic environment at the Port of Leixões, utilizing a team of 6 AUVs that will be operating continuously on a 24-hours/7-days-a-week basis. As part of this demonstration, another important aspect of the NOPTILUS system, that of (near-)optimality, will be shown. Evaluation of the performance of the overall NOPTILUS system will be performed with emphasis on its robustness, dependability, adaptability and flexibility, especially when it deals with completely unknown underwater environments and situations never taught before, as well as its ability to provide arbitrarily-close-to-optimal performance.
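Gaussian-Process-based estimation, mentioned for the medium level, can be sketched in a few lines: given a handful of depth measurements taken by AUVs, a GP with an RBF kernel predicts the depth and its uncertainty at an unvisited location. The kernel, hyperparameters and data below are illustrative assumptions, not NOPTILUS components.

import numpy as np

def rbf(a, b, length=2.0, var=1.0):
    d = a[:, None, :] - b[None, :, :]
    return var * np.exp(-0.5 * (d ** 2).sum(-1) / length ** 2)

def gp_predict(X, y, Xs, noise=0.01):
    """Posterior mean and variance of a GP with an RBF kernel (illustrative hyperparameters)."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    Kss = rbf(Xs, Xs)
    alpha = np.linalg.solve(K, y)
    mean = Ks.T @ alpha
    cov = Kss - Ks.T @ np.linalg.solve(K, Ks)
    return mean, np.diag(cov)

# Depth samples (metres) taken by AUVs at four (x, y) waypoints.
X = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0], [3.0, 3.0]])
y = np.array([10.0, 11.0, 9.5, 12.0])
Xs = np.array([[1.0, 1.0]])                    # an unvisited location
mean, var = gp_predict(X, y - y.mean(), Xs)
print(mean + y.mean(), var)                    # predicted depth and its uncertainty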

URL: http://www.noptilus-fp7.eu/ COORDINATOR CENTRE FOR RESEARCH AND TECHNOLOGY HELLAS - GREECE Associate Prof. Elias Kosmatopoulos - Tel: +30-25410-79533 Email: kosmatop <at> dssl.tuc.gr; kosmatop <at> ee.duth.gr

OTHER CONSORTIUM MEMBERS


Organisations:
OCEANSCAN - MARINE SYSTEMS & TECHNOLOGY LDA (PORTUGAL)
TELECOMMUNICATION SYSTEMS INSTITUTE (GREECE)
APDL - ADMINISTRACAO DOS PORTOS DO DOURO E LEIXOES SA (PORTUGAL)
UNIVERSIDADE DO PORTO (PORTUGAL)
IMPERIAL COLLEGE OF SCIENCE, TECHNOLOGY AND MEDICINE (UNITED KINGDOM)
EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH (SWITZERLAND)
TECHNISCHE UNIVERSITEIT DELFT (NETHERLANDS)


ROBLOG: Cognitive Robot for Automation of Logistic Processes



Globalization is causing an ever-increasing transport of goods. Nowadays, most goods are shipped in containers and are transferred onto trucks for further transport. The containers are unloaded manually, since they are nearly always packed chaotically, the variety of transported goods is high, and time requirements are strict. Unloading containers is a strenuous task, as goods can weigh up to 70 kg, and it poses health risks, which include the effects of pesticides and poisonous gases as well as injuries from unexpectedly falling objects. Human labour is hence a high cost factor combined with unhealthy working conditions, making automated solutions highly desirable. Existing systems for automated unloading are restricted to specific scenarios and still have drawbacks in their flexibility, adaptability and robustness. A robotic system suited for any container-unloading task requires a high degree of cognitive capability. RobLog aims at developing appropriate methods and technologies to meet the requirements for automating logistics processes. The RobLog system has to be capable of 3D perception in a challenging scenario (high variability of objects, dynamic scene, deformable objects). The perceived environment has to be integrated into a 3D model in real time. Grasping hypotheses, decisions and path plans have to be generated and executed in an adaptive manner, including obstacle avoidance and re-planning if necessary. The actions must be grounded in a physical set-up capable of handling a large variety of potentially deformable items. Finally, there is the need to provide an interface suited for a human operator to give high-level instructions to a multitude of systems operating at several unloading docks in parallel. All these advances are demonstrated within the project in close cooperation with an industrial end-user in a realistic application scenario, thus opening the potential to reach a completely new level of automation in the logistics chain.

URL: http://www.roblog.eu/ COORDINATOR FACHHOCHSCHULE REUTLINGEN, GERMANY Prof. Dr.-Ing. Wolfgang Echelmeyer - Tel: +49 (0)7121 271-0 Email: roblog{at}reutlingen-university.de

OTHER CONSORTIUM MEMBERS


Organisations:
BERTHOLD VOLLERS GMBH (GERMANY)
QUBIQA A/S (DENMARK)
UNIVERSITA DI PISA (ITALY)
JACOBS UNIVERSITY BREMEN GGMBH (GERMANY)
BIBA - BREMER INSTITUT FUER PRODUKTION UND LOGISTIK GMBH (GERMANY)
OREBRO UNIVERSITY (SWEDEN)


RUBICON: Robotics UBIquitous COgnitive Network


This project will create a self-learning robotic ecology, called RUBICON (Robotic UBIquitous COgnitive Network), consisting of a network of sensors, effectors and mobile robot devices. Enabling robots to seamlessly operate as part of these ecologies is an important challenge for robotics R&D, in order to support applications such as ambient assisted living, security, etc. Current approaches rely heavily on models of the environment and on human configuration and supervision, and lack the ability to smoothly adapt to evolving situations. These limitations make such systems hard and costly to deploy and maintain in real-world applications, as they must be tailored to the specific environment and constantly updated to suit changes both in the environments and in the applications where they are deployed. A RUBICON ecology will be able to teach itself about its environment and learn to improve the way it carries out different tasks. The ecology will act as a persistent memory and source of intelligence for all its participants, and it will exploit the mobility and the better sensing capabilities of the robots to verify and provide feedback on its own performance. As the nodes of a RUBICON ecology will mutually support one another's learning, the ecology will identify, commission and fulfil tasks more effectively and efficiently. The project builds on many years of experience across a world-leading consortium. It combines robotics, multi-agent systems, novelty detection, dynamic planning, statistical and computational neuroscience methods, efficient component and data abstraction, robot/WSN middleware and three robotic test-beds. Validation will take place using two application scenarios. Impact: the project will reduce the amount of preparation and pre-programming that robotic and/or wireless sensor network (WSN) solutions require when they are deployed. In addition, RUBICON ecologies will reduce the need to maintain and re-configure already-deployed systems, so that changes in the requirements of such systems can be easily implemented and new components can be easily accommodated. The relative intelligence and mobility of a robot, when compared to those of a typical wireless sensor node, mean that WSN nodes embedded in a RUBICON ecology can learn about their environment and their domain application through the training that is provided by the robot. This means that the quality of service offered by WSNs can be significantly improved, without the need for extensive human involvement.
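The last point, sensor nodes learning from robot-provided training, can be sketched minimally: the robot, which can verify the ground truth with its richer sensors, supplies labels from which a node's tiny detector is trained online. The feature, labels and perceptron-style update are illustrative assumptions, not the RUBICON learning machinery.

class NodeClassifier:
    """A sensor node's tiny occupancy detector, trained online from robot-verified labels."""
    def __init__(self):
        self.w, self.b = 0.0, 0.0
    def predict(self, reading):
        return 1 if self.w * reading + self.b > 0 else 0
    def train(self, reading, robot_label, lr=0.1):
        err = robot_label - self.predict(reading)   # perceptron-style update
        self.w += lr * err * reading
        self.b += lr * err

node = NodeClassifier()
# (infrared reading, label verified by the robot's richer sensors): 1 = area occupied
robot_feedback = [(0.9, 1), (0.1, 0), (0.8, 1), (0.2, 0), (0.7, 1), (0.15, 0)]
for reading, label in robot_feedback * 20:
    node.train(reading, label)
print(node.predict(0.85), node.predict(0.05))   # 1 0 once the node has been taught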

URL: http://fp7rubicon.eu/ COORDINATOR UNIVERSITY COLLEGE DUBLIN, NATIONAL UNIVERSITY OF IRELAND, DUBLIN, IRELAND Prof Gregory O'Hare and Dr Mauro Dragone - Tel: +353 1 716 2472 Email: coordinator@fp7rubicon.eu

OTHER CONSORTIUM MEMBERS


Organisations:
PINTAIL LTD (IRELAND)
OREBRO UNIVERSITY (SWEDEN)
ROBOTNIK AUTOMATION SLL (SPAIN)
FONDAZIONE STELLA MARIS (ITALY)
UNIVERSITY OF ULSTER (UNITED KINGDOM)
FUNDACION TECNALIA RESEARCH & INNOVATION (SPAIN)
CONSIGLIO NAZIONALE DELLE RICERCHE (ITALY)
UNIVERSITA' DI PISA (ITALY)


SPACEBOOK: Spatial & Personal Adaptive Communication Environment: Behaviors & Objects & Operations & Knowledge


The SpaceBook project will prototype a speech-driven, hands-free, eyes-free device for pedestrian navigation and exploration. SpaceBook will be developed as open source, and progress will be benchmarked through controlled task-based experiments with real users in central Edinburgh. The SpaceBook project will generate concrete technical and scientific advances for eyes-free, hands-free navigation and exploration systems, which will support applications in tourism, elderly care and tools for urban workers. In addition to advances in navigation and exploration systems, SpaceBook provides a task environment in which more fundamental scientific and technical knowledge will be generated. Specifically, we seek to advance the state of the art in model-based approaches to plan generation and recognition, statistical learning techniques for interaction management, and machine learning of semantic analysis components. The interdisciplinary SpaceBook team brings a wealth of complementary expertise necessary to realize the SpaceBook vision. Umeå University (Sweden), the project's overall coordinator, brings expertise in natural language interfaces to database systems and spatial databases. The University of Edinburgh (UK) brings expertise in location-based services, dynamic 3D modelling, geographical information systems, system evaluation methodology, discourse processing, and information access and delivery (including question answering). Heriot-Watt University (UK) brings experience in machine learning, spoken dialogue systems, data collection, and evaluation of interactive systems. KTH (Sweden) brings expertise in dialogue management and the design and development of spoken dialogue systems. Our industrial partner, Liquid Media (Sweden), brings software engineering and commercial exploitation expertise and experience. Cambridge University (UK) brings expertise in the combination of linguistic theory and machine learning and the development of real-world language processing applications. Finally, the AI group from the Universitat Pompeu Fabra (Spain) brings expertise in the areas of autonomous behavior and, in particular, model-based methods for plan generation and plan recognition.

URL: http://www.spacebook-project.eu/ COORDINATOR UMEA UNIVERSITET, SWEDEN Michael Minock - Tel: +46-907866137

OTHER CONSORTIUM MEMBERS


Organisations:
KUNGLIGA TEKNISKA HOEGSKOLAN (SWEDEN)
THE UNIVERSITY OF EDINBURGH (UNITED KINGDOM)
LIQUID MEDIA AB (SWEDEN)
THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE (UNITED KINGDOM)
UNIVERSITAT POMPEU FABRA (SPAIN)
HERIOT-WATT UNIVERSITY (UNITED KINGDOM)


TOMSY: TOpology based Motion SYnthesis for dexterous manipulation


The goal of the proposed research is to enable a generational leap in the techniques and scalability of motion synthesis systems. Motion synthesis is a key component of future robotic and cognitive systems, enabling their physical interaction with humans and physical manipulation of their environment. Existing motion synthesis algorithms are severely limited in their ability to cope with real-world objects such as flexible objects or objects with many degrees of freedom. The high dimensionality of the state and action space of such objects defies existing methods for perception, control and planning and leads to poor generalisability of solutions in such domains. These limitations are a core obstacle of current robotic research. We propose to solve these problems by learning and exploiting appropriate topological representations and testing them on challenging domains of flexible, multi-object manipulation and close-contact robot control and computer animation. Topological representations describe motion in terms of more abstract, more appropriate, and better generalizing features: for instance, an embracing motion can better be described, controlled and planned in coordinates quantifying the 'wrappedness' of arms or fingers around the object (as opposed to joint angle coordinates). Such topological representations exist on different levels of abstraction and reduce the dimensionality of the state and action spaces. This proposal investigates existing topological metrics (similar to the mentioned 'wrappedness') and uses data-driven methods to discover new mappings that capture key invariances. Given topological representations, we will develop methods for sensing, control and planning based on these representations. This proposal, for the first time, aims to achieve this at all three levels of sensing, representation and action generation by developing novel object-action representations for sensing based on manipulation manifolds and refining metamorphic manipulator design in a complete cycle. The methods and hardware developed will be tested on challenging real-world robotic manipulation problems, ranging from domains with many rigid objects to articulated carton folding or origami and all the way to full-body humanoid interactions with flexible objects. The results of this project will provide the necessary key technologies for future robots and computer vision systems to enable fluent interaction with their environment, as well as provide answers to the basic scientific question of the 'right' representation in sensorimotor control.
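One simple instance of a 'wrappedness'-like topological coordinate is the winding angle of a finger or arm path around an object axis. The sketch below computes it for a planar path around an assumed vertical object axis at the origin; the axis choice, path and discretisation are illustrative, not TOMSY's actual metrics.

import math

def winding_angle(path_xy):
    """Total signed angle (radians) swept by a path around the origin;
    about 2*pi means one full wrap around the assumed object axis."""
    total = 0.0
    for (x0, y0), (x1, y1) in zip(path_xy, path_xy[1:]):
        a0, a1 = math.atan2(y0, x0), math.atan2(y1, x1)
        d = a1 - a0
        if d > math.pi:        # unwrap the jump at +/- pi so each step is the shortest rotation
            d -= 2 * math.pi
        elif d < -math.pi:
            d += 2 * math.pi
        total += d
    return total

# A finger path that circles the object axis once at a radius of 3 cm.
path = [(0.03 * math.cos(t), 0.03 * math.sin(t))
        for t in [i * 2 * math.pi / 20 for i in range(21)]]
print(winding_angle(path) / (2 * math.pi))   # about 1.0: fully wrapped

Such a scalar varies smoothly with the motion and is far lower-dimensional than the underlying joint-angle trajectory, which is what makes representations of this kind attractive for planning.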

URL: http://www.tomsy.eu COORDINATOR KUNGLIGA TEKNISKA HOEGSKOLAN, SWEDEN KRAGIC, Danica - Tel: +46 8 790 6729 Email: danik@nada.kth.se

OTHER CONSORTIUM MEMBERS


Organisations:
UNIVERSIDAD DE GRANADA (SPAIN)
FREIE UNIVERSITAET BERLIN (GERMANY)
KING'S COLLEGE LONDON (UNITED KINGDOM)
THE UNIVERSITY OF EDINBURGH (UNITED KINGDOM)


V-CHARGE: Autonomous Valet Parking and Charging for e-Mobility


The V-Charge project is based on the vision that, due to the required drastic decrease in CO2 production and energy consumption, mobility will undergo important changes in the years to come. This includes new concepts for an optimal combination of public and individual transportation, as well as the introduction of electric cars that need coordinated recharging. A typical scenario of such a concept might be the automatic drop-off and recovery of a car in front of a train station, without the driver taking care of parking or re-charging. Such new mobility concepts require, among other technologies, autonomous driving in designated areas. The objective of this project is to develop a smart car system that allows for autonomous driving in designated areas (e.g. valet parking, park and ride) and can offer advanced driver support in urban environments. The final goal in four years is the demonstration and implementation of a fully operational future car system, including autonomous local transportation, valet parking and battery charging, on the campuses of ETH Zurich and TU Braunschweig. The envisioned key contribution is the development of safe and fully autonomous driving in city-like environments using only low-cost GPS, camera images, ultrasonic sensors and radar. Within the proposed project, the focus will therefore be set on the following main topics: development of machine vision systems based upon close-to-market sensor systems (such as stereo vision, radar, ultrasonic etc.) as well as the integration and fusion of each sensor's data into a detailed world model describing static and dynamic world contents by means of online mapping and obstacle detection and tracking; computer-based situation assessment within the world model, describing dependencies and interactions between separate model components (e.g. separate dynamic objects), for which the integration of market-ready map material (i.e. originating from navigation systems) as well as the use of vehicle-to-infrastructure communication shall be explored; precise low-cost localization in urban environments through the integration of standard satellite-based technologies with visual map-matching approaches combining both the onboard perception system and available map material; and highly adaptive global and local planning considering dynamic obstacles (cars, pedestrians) and their potential trajectories.
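The low-cost localisation theme can be sketched with a one-dimensional fusion example: dead-reckoned displacements (accurate over short distances) are blended with noisy absolute position fixes in a small Kalman filter. The noise values, motion model and data are illustrative assumptions, not the V-Charge localisation pipeline, which fuses visual map-matching with GPS.

def fuse_position(odometry, fixes, q=0.05, r=4.0):
    """Fuse dead-reckoned displacements with noisy absolute fixes.
    q: process noise added per step, r: fix variance; values are illustrative."""
    x, p = fixes[0], r                         # initialise from the first fix
    track = [x]
    for dx, z in zip(odometry, fixes[1:]):
        x, p = x + dx, p + q                   # predict from odometry
        k = p / (p + r)                        # Kalman gain
        x, p = x + k * (z - x), (1 - k) * p    # correct with the absolute fix
        track.append(x)
    return track

odometry = [1.0] * 10                          # the vehicle advances 1 m per step
gps = [0.0, 1.4, 1.8, 3.5, 3.9, 5.2, 6.1, 6.8, 8.3, 8.9, 10.4]   # noisy fixes
print(fuse_position(odometry, gps))            # smoother than the raw fixes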

URL: http://www.v-charge.eu/ COORDINATOR EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH - SWITZERLAND HALDER Pascal - Tel: +41-446345355 Email: pascal.halder@sl.ethz.ch

OTHER CONSORTIUM MEMBERS


Organisations:
ROBERT BOSCH GMBH (GERMANY)
THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD (UNITED KINGDOM)
VOLKSWAGEN AG (GERMANY)
UNIVERSITA DEGLI STUDI DI PARMA (ITALY)
TECHNISCHE UNIVERSITAET BRAUNSCHWEIG (GERMANY)


XPERIENCE: Robots Bootstrapped through Learning from Experience


Current research in embodied cognition builds on the idea that physical interaction with and exploration of the world allows an agent to acquire intrinsically grounded, cognitive representations which are better adapted to guiding behaviour than human-crafted rules. Exploration and discriminative learning, however, are relatively slow processes. Humans are able to rapidly create new concepts and react to unanticipated situations using their experience. They use generative mechanisms, like imagining and internal simulation, based on prior knowledge to predict the immediate future. Such generative mechanisms increase both the bandwidth and speed of cognitive development; however, current artificial cognitive systems do not yet use generative mechanisms in this way. The Xperience project addresses this problem by structural bootstrapping, an idea taken from language acquisition research: knowledge about grammar allows a child to infer the meaning of an unknown word from its grammatical role together with the understood remainder of the sentence. Structural bootstrapping generalizes this idea for general cognitive learning: if you know the structure of a process, the role of unknown actions and entities can be inferred from their location and use in the process. This approach will enable rapid generalization and allow agents to communicate effectively. Xperience will implement a complete robot system for automating introspective, predictive, and interactive understanding of actions and dynamic situations based on structural bootstrapping. Xperience will evaluate and benchmark this on state-of-the-art humanoid robots demonstrating rich interactions with humans. By equipping embodied artificial agents with the means to exploit prior experience via generative inner models, XPERIENCE will have a major impact on a wide range of autonomous robotics applications that benefit from efficient learning through exploration, predictive reasoning and external guidance.

URL: http://www.xperience.org COORDINATOR KARLSRUHER INSTITUT FUER TECHNOLOGIE, GERMANY DILLMANN, Ruediger - Tel. +49 721 6083846 Email: dillmann@ira.uka.de

OTHER CONSORTIUM MEMBERS
Organisations:
GEORG-AUGUST-UNIVERSITAET GOETTINGEN STIFTUNG OEFFENTLICHEN RECHTS (GERMANY)
THE UNIVERSITY OF EDINBURGH (UNITED KINGDOM)
SYDDANSK UNIVERSITET (DENMARK)
JOZEF STEFAN INSTITUTE (SLOVENIA)
UNIVERSITAET INNSBRUCK (AUSTRIA)
FONDAZIONE ISTITUTO ITALIANO DI TECNOLOGIA (ITALY)
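The structural-bootstrapping idea described in the XPERIENCE summary above can be sketched very simply: a known action schema (the 'grammar' of a process) lets the system hypothesise the role of an unfamiliar action label from the slot it occupies in an observed sequence. The schema contents, vocabulary and matching rule are illustrative assumptions, not Xperience's mechanisms.

# A known process structure: each step names a role and the object class it acts on.
pour_drink_schema = [
    ("reach", "container"),
    ("grasp", "container"),
    ("transport", "container"),
    ("tilt", "container"),                     # the pouring step
]

known_actions = {"reach", "grasp", "transport"}

def bootstrap_unknown(observed_sequence, schema, known):
    """Infer what role an unfamiliar action label plays from its slot in a known schema."""
    inferred = {}
    for (label, obj), (role, expected_obj) in zip(observed_sequence, schema):
        if label not in known and obj == expected_obj:
            inferred[label] = role             # the unknown word most likely fills this slot
    return inferred

observed = [("reach", "container"), ("grasp", "container"),
            ("transport", "container"), ("schwenken", "container")]   # unfamiliar label
print(bootstrap_unknown(observed, pour_drink_schema, known_actions))
# {'schwenken': 'tilt'}: the unknown action is hypothesised to be the tilting/pouring step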

