Field and Service Robotics: Recent Advances, S. Yuta et al. (Eds.), Part 2
Mobile Robots Facing the Real World
R. Siegwart
in our Lab, which leads to locomotion concepts that perform extremely well in rough terrain while still being efficient and not very complex (figs. 5 and 6).
2.2 Environment Representation
Environment perception and representation can be model- or behavior-based and might involve different levels of abstraction (fig. 2).
Fig. 3. 3D representation of an indoor environment by planes. The raw data (left) are registered with an upward-looking SICK laser scanner, whereas a horizontally arranged laser scanner is used for probabilistic localization of the robot during the measurement. Through the extraction of planes (right) from the 13157 raw data points, the representation can be drastically simplified to 27 planar regions, thus reducing memory requirements and complexity [15] and filtering out unimportant information.
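As a generic illustration of how a planar region can be fitted to raw range points, the following is a minimal least-squares sketch; it is not the actual extraction algorithm of [15], and the test data are invented:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through an Nx3 point cloud.

    Returns (normal, d) with the plane defined by normal . p = d.
    The normal is the right singular vector of the centered points
    with the smallest singular value (direction of least variance).
    """
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    return normal, normal @ centroid

# Example: noisy samples of the plane z = 0 (hypothetical data).
pts = np.random.rand(200, 3) * [2.0, 2.0, 0.01]
n, d = fit_plane(pts)
print(n, d)  # normal is close to (0, 0, 1) up to sign
```

A full plane extractor would additionally segment the cloud into regions, e.g. by region growing, before fitting each region this way.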
Fig. 2. This figure depicts the hierarchy of abstraction levels. The higher we go in the hierarchy, the more we reduce the geometric information and the more we increase the distinctiveness. For global localization and mapping, high distinctiveness is important, whereas for local action, precise geometric relations with the environment come to the fore.
Whereas behavior-based approaches are often combined with bio-inspired algorithms for adaptation and learning, they hardly scale to more complex tasks and have only shown their feasibility in simple experiments. Model-based approaches make use of a priori knowledge by means of environment models, thus allowing a very compact and task-dependent environment interpretation. They scale much better with the complexity of the system and task, but have some limitations with highly nonlinear systems and unmodeled environment features. However, today only model-based control approaches enable prediction of the system's behavior, thus guaranteeing safety.

Models can have different levels of abstraction, from raw-data-based grid representations up to highly symbolic descriptions of the environment (fig. 2). For real-world navigation, complexity is a major issue. Complexity, especially for localization and mapping, is strongly linked with the environment representation. We therefore strongly believe in a model-based environment representation that drastically reduces memory requirements and complexity. Furthermore, models of expected environment features make it possible to filter out most measurements that are not relevant to the given navigation task, e.g. people around the robot, which are not appropriate features for localization. Representing a typical office room in 2D by raw-data-based occupancy grids easily requires hundreds of kilobytes if a fine grid of some centimeters is used. However, for localization purposes the same room can typically be represented by around 10 lines, thus requiring only around 20 to 40 bytes, depending on the requested resolution. The same applies even more drastically for the 3D representations presented in figure 3 [15]. If lower precision is required, one might even use a topological map with more abstract and distinct features such as the fingerprints presented in [16,17].
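The memory figures above can be checked with a rough, hypothetical back-of-the-envelope sketch; the cell size, bytes per cell and line parameterization below are assumptions for illustration, not values from the paper:

```python
# Rough memory comparison: fine 2D occupancy grid vs. line-feature model.

def grid_bytes(width_m, height_m, cell_m, bytes_per_cell=1):
    """Raw-data occupancy grid covering a rectangular room."""
    return int(width_m / cell_m) * int(height_m / cell_m) * bytes_per_cell

def line_model_bytes(n_lines, bytes_per_param=2):
    """Each 2D line in normal form (alpha, r) needs two parameters."""
    return n_lines * 2 * bytes_per_param

# A 5 m x 5 m office at 1 cm resolution vs. the ~10 lines quoted above.
print(grid_bytes(5.0, 5.0, 0.01))  # 250000 cells -> hundreds of kilobytes
print(line_model_bytes(10))        # 40 bytes, within the 20-40 byte range
```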
A major challenge for a useful environment perception and representation is the inherent noise of sensor systems and the feature ambiguities that are present in most environments. Therefore, probabilistic algorithms are adopted in order to extract useful information and to fuse the different signals and modalities into the best estimate of the environment and the robot's situation within it. The currently most successful approaches employ Kalman filters [3] or Hidden Markov Models [1] for fusion and estimation. The Kalman filter is well adapted if the environment is represented in a continuous form by geometric features, whereas Hidden Markov Models are typically used with metric grid maps or topological maps. Both approaches have their advantages and disadvantages. We therefore propose hybrid approaches, using a topological representation for the global map and a metric representation based on geometric features for the local maps [18]. These approaches make it possible to combine global consistency, robustness, precision and applicability for large environments.
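To illustrate the Kalman-filter branch of this discussion, here is a minimal sketch of one EKF measurement update against a single matched map line in (alpha, r) normal form. The pose parameterization, noise values and the data-association step are assumptions made for the sketch, not the method of [3]:

```python
import numpy as np

def ekf_line_update(x, P, z, line_w, R):
    """One EKF measurement update against a known map line.

    x      : robot pose estimate [x, y, theta]
    P      : 3x3 pose covariance
    z      : observed line [alpha, r] in the robot frame
    line_w : matched map line [alpha_w, r_w] in the world frame
    R      : 2x2 measurement noise covariance
    """
    alpha_w, r_w = line_w
    # Expected observation: the map line transformed into the robot frame.
    z_hat = np.array([alpha_w - x[2],
                      r_w - (x[0] * np.cos(alpha_w) + x[1] * np.sin(alpha_w))])
    # Jacobian of the expected observation w.r.t. the pose.
    H = np.array([[0.0, 0.0, -1.0],
                  [-np.cos(alpha_w), -np.sin(alpha_w), 0.0]])
    v = z - z_hat                                 # innovation
    v[0] = (v[0] + np.pi) % (2 * np.pi) - np.pi   # wrap angle difference
    S = H @ P @ H.T + R                           # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                # Kalman gain
    return x + K @ v, (np.eye(3) - K @ H) @ P
```

In a full localizer this update would run once per matched line, after a gated association of observed lines to map lines.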
2.3 Navigation and Control Architecture
Navigation in real-world environments is a very complex task. It requires an appropriate control architecture implementing various parallel tasks in real time. A typical autonomous mobile robot system requires at least five control levels running at different cycle times. The main tasks, ordered by importance, are motor control, emergency supervision, obstacle avoidance, localization and planning of the task. Apart from the motor controller, all other control tasks require information about the local or even global environment of the robot. As discussed in the
previous section, the perception and representation of the environment can become very complex. Thus the processing power needed to run these algorithms can be extremely high, and real-time implementation is therefore a real challenge. Various research projects address this problem with the goal of finding new concepts and algorithms for robust and practical navigation in real-world environments [1,3]. Today, feasible solutions for typical indoor or flat outdoor environments are available (e.g. RoboX, presented below). However, navigation in unstructured and rough terrain, where the environment has to be modeled in real 3D, is still a very open research area [4,5,6].
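A minimal sketch of how such a layered, multi-rate architecture can be organized in software is shown below; the cycle times are invented for illustration and are not the actual RoboX task periods:

```python
import threading
import time

def periodic(period_s, task):
    """Run task() every period_s seconds in a daemon thread."""
    def loop():
        next_t = time.monotonic()
        while True:
            task()
            next_t += period_s
            time.sleep(max(0.0, next_t - time.monotonic()))
    threading.Thread(target=loop, daemon=True).start()

# Hypothetical cycle times, ordered by importance as in the text:
periodic(0.001, lambda: None)   # motor control, 1 kHz
periodic(0.01,  lambda: None)   # emergency supervision, 100 Hz
periodic(0.1,   lambda: None)   # obstacle avoidance, 10 Hz
periodic(0.5,   lambda: None)   # localization, 2 Hz
periodic(2.0,   lambda: None)   # task planning, 0.5 Hz

time.sleep(1.0)  # let the loops spin briefly in this demo
```

On a real robot the safety-critical levels would run as hard real-time tasks (RoboX uses the XO/2 real-time operating system, see section 3.2) rather than as best-effort threads.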
3 Examples
At the Autonomous Systems Lab at EPFL we conduct focused research in mobile robotics for autonomous operation in real-world environments. Major axes are in the fields of mobile robot design for rough terrain, navigation and interaction, and mobile micro-robotics. Among our most recent findings are enhanced feature-based localization concepts [3,9,8], obstacle avoidance for highly dynamic and human-cluttered environments, wheeled robots for high performance in rough terrain [12,4,10] and a mobile micro-robot the size of a sugar cube [19]. Recently we ran one of the world's largest mobile robot installations, with eleven fully autonomous and interactive mobile robots, at the Swiss exhibition expo.02. It represents a milestone in our mobile robotics research and allowed us a long-term evaluation of our recent findings. Furthermore, it was used to investigate social and psychological issues of mobile robotics.
In the following, two research results are briefly presented and discussed.
3.1 Wheel-Based Mobile Robot Locomotion for Rough Terrain
Wheels enable efficient motion on flat ground and, if equipped with an appropriate suspension, can reach excellent climbing abilities. In our Lab we therefore investigate new passive and active wheel-based locomotion concepts. Passive means that the articulations of the suspension have no actuators, whereas active concepts use motors for at least some of the articulations of the suspension system.
Shrimp, presented in figure 4, is a passive system with 6 wheels [12]. It can effortlessly overcome obstacles up to two times its wheel diameter and climbs regular steps of stairs that are about its height. Within a running research project for the European Space Agency, the system is being extended for full energetic autonomy using solar cells and equipped with a navigation system for autonomous long-range operation [4].
Octopus, shown in figure 5, features an active locomotion concept on 8 wheels. Its 6 active and one passive articulations enable the robot to keep all wheels in optimal ground contact at any time [10]. A specially developed tactile wheel measures the contact point and force of each wheel.
Fig. 5. Octopus features an active locomotion concept with 8 motorized and tactile wheels, 6 active and 1 passive DOF for ground adaptation, and on-board integration of all control elements, joint sensors and inclinometers. (Photo © Bramaz)
Fig. 4. The robot Shrimp is an all-terrain rover based on a passive locomotion concept. It is characterized by 6 wheels suspended by parallel mechanisms: one fixed wheel in the rear, two bogies on each side and one front wheel with spring suspension. The robot measures around 60 cm in length and 20 cm in height, is highly stable in rough terrain and overcomes obstacles up to 2 times its wheel diameter with a minimal friction coefficient.
3.2 RoboX, the Tour-Guide Robot with Long-Term Experience [11]
The Swiss National Exhibition takes place once in 40 years. The 2002 edition, expo.02, ran from May 15 to October 21, 2002. It hosted the exhibition Robotics, which was intended to show the increasing closeness between man and robot technology (fig. 1). The central visitor experience of Robotics was the interaction with eleven autonomous, freely navigating mobile robots on a surface of about 315 m². Their main task was giving guided tours, but the installation also included a robot taking pictures of visitors. The exhibition was scheduled for five hundred persons per hour. For this task, the main specifications can be summarized as follows:
• Navigation in an unmodified, highly populated environment with visitors and other freely navigating robots
• Bi-directional multi-modal interaction using easy-to-use, intuitive yet robot-typical interaction modalities
• Speech output in four languages: French, German, Italian and English
• Safety for visitors and robots at all times
• Reliable operation during around eleven hours per day, seven days per week, for five months
• Minimal manual intervention and supervision
• Adaptive multi-robot coordination scenarios as a function of the number of visitors and their interests
• Control of visitor flow through the robots
• Development of ten robots within tight budgets and schedules
The RoboX robot was designed and developed at our Lab by a multi-disciplinary team of around 15 young engineers and artists (figs. 6 and 7). It was then realized by our spin-off company BlueBotics. It features fully autonomous navigation, including feature-based localization [3], highly adaptive and dynamic obstacle avoidance [8] and multi-robot coordination on the path-planning level [2]. The main interaction functions are face and people tracking, speech output, and facial expression through the two pan-tilt eyes and the eye-integrated LED matrix [11]. Four touch buttons were used as input devices, and two robots were equipped with a directional microphone and speech analysis for simple answers.
The navigation and interaction software, with around 20 main tasks, ran on two embedded computers. The safety-critical navigation software ran on the XO/2 operating system, based on Oberon [13], and the interaction software on an industrial PC running Windows 2000. An additional security controller running on a PIC micro-controller guaranteed visitor safety at all times.
The specially developed interaction software SOUL [20] aimed at composing the scenarios like a theater play or a music composition. Through a convenient interface, it enables different basic behaviors to be combined with synthesized speech, motion, sensory inputs and much more.
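As a loose analogy only (SOUL's actual interface is described in [20]; every name below is hypothetical), such a scenario composition can be pictured as a script that sequences basic behaviors:

```python
# Hypothetical sketch of scenario composition: a tour as a sequence of
# basic behaviors mixing speech, motion and sensing. These names are
# illustrative and are not the actual SOUL API.

def say(text, lang):
    print(f"[speech/{lang}] {text}")

def goto(waypoint):
    print(f"[motion] navigating to {waypoint}")

def track(target, timeout_s):
    print(f"[sensing] tracking {target} for up to {timeout_s}s")

BEHAVIORS = {"say": say, "goto": goto, "track": track}

SCENARIO = [
    ("say",   {"text": "Welcome to Robotics!", "lang": "en"}),
    ("goto",  {"waypoint": "exhibit_1"}),
    ("track", {"target": "faces", "timeout_s": 10}),
    ("say",   {"text": "Please follow me to the next exhibit.", "lang": "en"}),
]

def run(scenario):
    """Execute each (behavior, parameters) step of the composition in order."""
    for name, params in scenario:
        BEHAVIORS[name](**params)

run(SCENARIO)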
Eleven robots guided visitors through the exhibition and interacted with them in an environment cluttered by hundreds of visitors. During the five-month exhibition, the RoboX family was in contact with 686,000 visitors and traveled a total distance of 3,315 km. The installation also served as a research platform and technology demonstration.
Fig. 6. a) The autonomous exhibition robot RoboX. b) RoboX number 6 with visitors in the pavilion at expo.02.
Fig. 7. Basic elements and functionalities of the tour-guide robot RoboX.
Throughout the five-month operation period, the navigation system was close to 100% reliable. This was especially due to the localization system, which was based on line features [2] and thus filtered out all the dynamics in the environment coming from visitors in the vicinity of the robot. More details on the hardware design, the navigation system and the reliability can be found in [2,8,11,14].
4 Conclusions and Outlook
New concepts for wheeled locomotion, feature-based environment representation and navigation have been presented and discussed in this paper. Their potential was shown by two examples of mobile robot systems from our Lab facing the complexity of the real world. They represent our recent findings, but are still only a very first step towards intelligent and socially interactive robots. In order to realize truly intelligent mobile robots, able to cope with highly complex real-world environments, enormous research efforts in various fields like environment representation, cognition and learning are still required.
Acknowledgments
The author would like to thank all the current and past collaborators of the Autonomous Systems Lab for their contributions and inspiring work, and for their curiosity and dedication to mobile robotics research. The presented projects were mainly funded by EPFL, the European Space Agency (ESA) and expo.02.
References
1. S. Thrun, D. Fox, W. Burgard, and F. Dellaert, "Robust Monte Carlo Localization for Mobile Robots," Artificial Intelligence (AI), 2001.
2. Arras, K.O., Philippsen, R., Tomatis, N., de Battista, M., Schilt, M. and Siegwart, R., "A Navigation Framework for Multiple Mobile Robots and its Application at the Expo.02 Exhibition," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA'03), Taipei, Taiwan, 2003.
3. Arras, K.O., Castellanos, J.A. and Siegwart, R., "Feature-Based Multi-Hypothesis Localization and Tracking for Mobile Robots Using Geometric Constraints," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA'02), Washington DC, USA, May 11-15, 2002.
4. Lamon, P. and Siegwart, R., "3D-Odometry for Rough Terrain - Towards Real 3D Navigation," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA'03), Taipei, Taiwan, 2003.
5. Singh, S., Simmons, R., Smith, T., Stentz, A., Verma, V., Yahja, A., Schwehr, K., "Recent Progress in Local and Global Traversability for Planetary Rovers," Proceedings of the IEEE International Conference on Robotics and Automation (ICRA'00), pp. 1194-1200, San Francisco, April 2000.
6. A. Mallet, S. Lacroix, and L. Gallo, "Position Estimation in Outdoor Environments Using Pixel Tracking and Stereovision," Proceedings of the IEEE International Conference on Robotics and Automation (ICRA'00), pp. 3519-3524, San Francisco, CA (USA), April 2000.
7. Fong, T., Nourbakhsh, I., Dautenhahn, K., "A Survey of Socially Interactive Robots," Journal of Robotics and Autonomous Systems, 42, 143-166, 2003.
8. Philippsen, R. and Siegwart, R., "Smooth and Efficient Obstacle Avoidance for a Tour Guide Robot," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA'03), Taipei, Taiwan, 2003.
9. Tomatis, N., Nourbakhsh, I. and Siegwart, R., "Hybrid Simultaneous Localization and Map Building: Closing the Loop with Multi-Hypotheses Tracking," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA'02), Washington DC, USA, May 11-15, 2002.
10. Lauria, M., Piguet, Y. and Siegwart, R., "Octopus - An Autonomous Wheeled Climbing Robot," in Proceedings of the Fifth International Conference on Climbing and Walking Robots, Professional Engineering Publishing Limited, Bury St Edmunds and London, UK, 2002.
11. Siegwart, R., et al., "Robox at Expo.02: A Large Scale Installation of Personal Robots," Special Issue on Socially Interactive Robots, Robotics and Autonomous Systems 42 (3-4), 31 March 2003.
12. Siegwart, R., Lamon, P., Estier, T., Lauria, M., Piguet, R., "Innovative Design for Wheeled Locomotion in Rough Terrain," Journal of Robotics and Autonomous Systems, Elsevier, Vol. 40/2-3, pp. 151-162, Sep. 2002.
13. Brega, R., Tomatis, N., Arras, K. and Siegwart, R., "The Need for Autonomy and Real-Time in Mobile Robotics: A Case Study of XO/2 and Pygmalion," IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS'00), Takamatsu, Japan, 2000.
14. Siegwart, R., Arras, K.O., Jensen, B., Philippsen, R. and Tomatis, N., "Design, Implementation and Exploitation of a New Fully Autonomous Tour Guide Robot," in Proceedings of the 1st International Workshop on Advances in Service Robotics (ASER'2003), Bardolino, Italy, March 13-15, 2003.
15. Weingarten, J., Gruener, G. and Siegwart, R., "A Fast and Robust 3D Feature Extraction Algorithm for Structured Environment Reconstruction," Proceedings of the 11th International Conference on Advanced Robotics, Portugal, July 2003.
16. Lamon, P., Nourbakhsh, I., et al., "Deriving and Matching Image Fingerprint Sequences for Mobile Robot Localization," Proc. of the IEEE International Conference on Robotics and Automation (ICRA), Seoul, Korea, 2001.
17. Lamon, P., Tapus, A., et al., "Environmental Modeling with Fingerprint Sequences for Topological Global Localization," submitted to IROS'03, Las Vegas, USA, 2003.
18. Tomatis, N., Nourbakhsh, I. and Siegwart, R., "Hybrid Simultaneous Localization and Map Building: Closing the Loop with Multi-Hypotheses Tracking," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA'02), Washington DC, USA, May 11-15, 2002.
19. Caprari, G., Estier, T., Siegwart, R., "Fascination of Down Scaling - Alice the Sugar Cube Robot," Journal of Micro-Mechatronics, VSP, Utrecht, Vol. 1, No. 3, pp. 177-189, 2002.
20. Jensen, B., Froidevaux, G., Greppin, X., Lorotte, A., Mayor, L., Meisser, M., Ramel, G. and Siegwart, R., "Multi-Robot Human-Interaction and Visitor Flow Management," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA'03), Taipei, Taiwan, 2003.
Breakthroughs in Human Technology Interaction
Bernd Reuse
Federal Ministry of Education and Research, Germany
Abstract. In 1999 the German Federal Government launched six major strategic collaborative research projects on Human Technology Interaction, which involved 102 research partners and a funding volume of €82 million. The results of these projects were expected to allow people to control technical systems multimodally by using natural forms of interaction such as speech, gestures, facial expressions, touch and visualization methods, and to apply such systems for the most varied purposes in their private and working environments. The ambitious research goals were achieved with prototypes for real-world applications. Research activities have resulted in 116 patent applications, 56 spin-off products and 13 spin-off companies as well as 860 scientific publications.
1 Introduction: Trends in Human Technology Interaction
Together with the Federal Ministry of Economics and Labour (BMWA), the Federal Ministry of Education and Research (BMBF) organized an international status conference in Berlin in June 2003, where the results of four years of government-funded research on Human Technology Interaction (HTI) were presented (www.dlr.de/pt-dlr/sw).
Distinguished personalities from science, research and industry participated in this conference. The roughly 350 conference participants from Germany and abroad agreed on the following trends in Human Technology Interaction:
• Human Computer Interaction is turning into Human Computer Cooperation and will support many trends in I&C. The number of transactions is increasing dramatically and requires new modalities in HTI. Access to any information with any device at any place and at any time will be supported by HTI interfaces. Agents will take over routine work.
• A simple and easy-to-handle Human Computer Interface is an important precondition for marketing the products of the IT industry.
• Human Computer Interaction can help solve the problems of the future, namely the problems of the aging society. An intelligent human life has to be supported by science and technology.

• Human Computer Interaction will have a strong influence on society as a whole as a result of the convergence of the use of computers in private and business environments, of working life and leisure time, and of paid and unpaid work.
2 The Verbmobil Project
After 20 years of largely unsuccessful research in the field of speech recognition, which had only produced simple dialogue systems, the BMBF decided in 1993 to provide a total of approximately €60 million for an eight-year research project entitled Verbmobil, which was to deal with the automatic recognition and translation of spontaneous speech in dialogue situations. This was not in line with the then prevailing trend in research, and even some of the international experts who were involved took the view that the goal could not be achieved at that time.
But in the Verbmobil project we pursued new paths in research: first of all, mastering the complexity of spontaneous speech with all its phenomena such as vagueness, ambiguities, self-corrections, hesitations and disfluencies took priority over the envisaged vocabulary. Another novel feature was that researchers used the information contained in the prosody of speech for speech recognition purposes. In addition, the transfer included knowledge processing, which is indispensable in translation. The 135 different work packages and 35 research groups distributed throughout Germany were linked by a network management led by Professor Wahlster of the German Research Centre for Artificial Intelligence (DFKI) in Saarbrücken.
As a result of Verbmobil it was possible to demonstrate in July 2000 the translation of spontaneous speech for the domain of the remote maintenance of PCs using 30,000 words, as well as a telephone translation system with 10,000 words for translation from German into English and with 3,000 words for translation from German into Japanese. In addition, Verbmobil generated 20 spin-off products, 8 spin-off companies and about 800 scientific publications. In 2001, Verbmobil received Federal President Rau's German Future Award, the highest German research award.
3 Lead Projects on Human Technology Interaction

In 1999 the Federal Government started an initiative on Human Technology Interaction, the central goal being to extend the findings and models of the Verbmobil project concerning speech-based human interaction with computers to cover the full range of human forms of interaction.
It was expected that the consideration and integration of several forms of interaction would allow a much better interpretation of the user's intention than one
modality alone. This initial view has now been confirmed on a global scale at relevant international conferences.
Fig. 1. Aspects of multimodal interaction
The Federal Government therefore staged an ideas competition on Human Technology Interaction. Altogether 89 outline proposals involving 800 cooperation partners from science and industry were submitted. In a two-tier process, international experts selected six major strategic and interdisciplinary collaborative research projects (lead projects) involving 102 partners from science and industry. These projects were supported with €82.4 million in government funds (to which industry added €69.7 million of its own funds) between July 1999 and September 2003.
The results of the projects were expected to allow people to control technical systems multimodally by using natural forms of interaction such as speech, gestures, facial expressions, touch and visualization methods, and to apply such systems for the most varied purposes in their private and working environments. The aim was to adapt technology to people and not vice versa, as had been the case in the past. Ergonomics and user acceptance of the various forms of interaction were major criteria for the development of prototypes, which should not only be highly attractive from the scientific viewpoint but should also have a great market potential.

The following is an overview of the projects that were carried out. It should be noted that a basic-science project entitled SMARTKOM succeeded in generalizing the advanced discourse models for spoken dialogue to cover the full spectrum of multimodal discourse phenomena. The other projects covered the entire range from basic to applied research but were clearly more application-oriented.
SmartKom: Dialogue-based Human Computer Interaction through Coordinated Analysis and Generation of Multiple Modalities (www.smartkom.org). Computer without keyboard and mouse: integrated processing of speech and gestures for natural interaction with the system. Even vague, ambiguous and incomplete input is understood. The computer is controlled by means of gestures and facial expressions and recognizes frowning as a sign of non-understanding on the part of the user (10 project partners; project leader: German Research Centre for Artificial Intelligence, DFKI). Highlights:
• Situation-based understanding of vague, ambiguous or incomplete multimodal input at the semantic and pragmatic level
• Development of a Meaning Representation Language, M3L.
ARVIKA: Augmented Reality (AR) for Development, Production and Services (www.arvika.de). The computer in your spectacles: mobile action in mixed real and virtual future-oriented working environments. Situation-related information is displayed to service engineers on the spot (22 project partners; project leader: Siemens AG, Nürnberg). Highlights:
• First prototypes of mobile Augmented Reality systems for industrial applications in development, production and service
• Remote Augmented Reality support for service processes.
EMBASSI: Multimodal Assistance for Infotainment and Service Infrastructures (www.embassi.de). Multimodal remote control for all electronic appliances in everyday life ensures clear and intelligent user interfaces and operating instructions. Individually adaptable access to public terminal systems is possible (18 project partners; project leader: Grundig AG, Nürnberg). Highlights:
• Living room of the future: multimodal assistance in the selection of entertainment programmes and the control of living room equipment
• Adaptive driver assistance for significantly increasing traffic safety.
INVITE: Intuitive Human Computer Interaction for the Interlaced Information World of the Future (www.invite.de). Multimedial, multimodal and multi-locational teamwork tools for an innovative exchange of information and knowledge. Implicit recording of information for customer advice and support (20 project partners; project leader: ISA GmbH, Stuttgart). Highlights:
• Exploration and extraction of knowledge structures by integration of speech recognition, text mining and ontology building
• Interaction and collaborative data representation in immersive and distributed 3D environments.
MAP (BMWA): Multimedia Workplace of the Future (www.map21.de). New technical solutions for mobile activities through integration of multimodal interaction functions, new assistance systems, agent technologies and multimedia methods (15 project partners; project leader: Alcatel SEL AG, Stuttgart). Highlights:
• Secure mobile agent systems for personal assistance in stationary and mobile working situations
• Delegating tasks and appointment negotiations to personal agents.
MORPHA: Intelligent service robots to interact and collaborate with human users (www.morpha.de). Mobile service robots are advancing, mainly into private households and long-term care. Robots working in human environments or interacting with people must be able to recognize people and adapt their movements to suit them. It must be possible to teach robots quickly and intuitively through gestures and demonstration (17 project partners; project leaders: Delmia GmbH, Fellbach; FAW, Ulm). Highlights:
• New interactive programming paradigms for robots through seamless integration of tactile, gesture and speech channels, programming by touch
• Reactive, collision-free motion generation for housekeeping and manufacturing assistance.
3.1 Selected Demonstrators
The projects produced a total of 150 demonstrators; four of them are presented in the following:
SMARTKOM developed systems which allow the speech-based sending of e-mails and the selection of music on demand using MP3 over the Internet from a running car.
Fig. 2. SMARTKOM: Sending email from a driving car
ARVIKA developed an innovative cabling method for the Airbus. Instead of connecting the hundreds or thousands of cables in aircraft or other technical systems by using cable lists, engineers will in future be able to find the right connection via Augmented Reality by speech control of cable numbers.
Fig. 3. ARVIKA: Connecting cables in airplanes with AR support
INVITE developed a system which is based on speech processing and can extract the content of a conversation between several participants and present it in a graph. This graph is generated while the device is listening; the information it contains can be used for further purposes, e.g. for identifying the points of the conversation which have or have not been settled.
Fig. 4. INVITE: Exploration and Extraction of Knowledge Structures: Goals and Concepts
MORPHA developed a service robot for the domestic environment which can fulfil many functions that are needed by elderly or disabled persons in their private sphere; e.g. it can offer drinks, assist them in walking and carrying, and provide communication support.
Fig. 5. MORPHA: Interaction and communication with robot assistants
4 Results of the Lead Projects on Human Technology Interaction
The lead projects on Human Technology Interaction have produced a considerable number of results which are of great scientific and commercial importance. A scientific advisory board with international membership confirmed the overwhelming success of the HTI programme and pointed to the considerable progress made by research in the field of Human Technology Interaction (-it.de/mti-2).
Fig. 6. Commercial and scientific results
The BMBF has so far organized four events together with German industry in order to promote the quick transfer of the results yielded by the research projects. The BMBF and BMWA will present these results to an international public at CeBIT 2004 in Hannover (March 18 to 24), where they have reserved an area of about 1000 square meters in exhibition hall 11 (research).
Owing to this big success, the topics of the German lead projects on Human Technology Interaction have been included in the EU's Sixth Research Framework Programme.
Expression of thanks: My special thanks go to the DLR project management group in Berlin Adlershof, above all to Dr. Grote and Dr. Krahl.


Indoor Navigation for Mobile Robot by Using Environment-Embedded Local Information Management Device and Optical Pointer

Abstract.

1 Introduction

2 Map Information Management for Navigation

2.1 Global Map Expression for Path Planning

2.2 Local Information for Actual Navigation

Fig. 1.

3 Local Information Management for Navigation

3.1 Information Assistant and Optical Pointer

Fig. 2.

Fig. 3.

3.2 Navigation Algorithm of the System

4 Navigation System

4.1 Omni-Directional Mobile Robot

Fig. 4.

Fig. 5.

4.2 Information Assistant and Optical Pointer

Fig. 6.

5 Experiments

Fig. 7.

Fig. 8.

6 Conclusion