This talk was given as keynote at the International Conference on Innovation in Medicine and Healthcare 2013 (http://inmed13.innovationkt.org/) within the Ambient TeleCare invited session (http://inmed13.innovationkt.org/cmsISdisplay.php).
This presentation gives an overview of technologies assisting visually impaired persons and describes the progress made so far within the ALICE project (http://www.alice-project.eu/).
The event had a multi-disciplinary audience of researchers, engineers, managers, students and practitioners from the medical arena, gathered to discuss how innovation, knowledge exchange and enterprise can be applied to medicine, healthcare and the issues of an ageing population.
Assistive technologies: experiences from AAL for the blind and visually impaired within the ALICE project
1. Assistive technologies: experiences from
AAL for the blind and visually impaired
within the ALICE project
Andrei BURSUC, Prof. Titus ZAHARIA
Institut Mines-Télécom; Télécom SudParis
firstname.lastname@telecom-sudparis.eu
Invited talk by the Dem@Care FP7 project
2. Context and objectives
The ALICE project and AAL
State-of-the-art
User requirements
System prototype
Obstacle detection
Navigation assistant
Human-Machine interface
Conclusion and perspectives
Outline
Experiences from the ALICE project
3. VI persons face many problems every day:
overall contextual understanding of space semantics
interaction with surrounding objects
planning, orientation, communication, navigation
285M registered visually impaired people: 39M blind, 246M
with low vision (WHO report)
The prevalence of visual impairment increases with an ageing population
Context and objectives
Nowadays
4. Provide a navigational assistive device with cognitive capabilities for elderly blind users:
Positioning
Obstacle detection/alerting
Landmark/object recognition
Offer VI users a cognitive description based on a fusion of perceptions gathered from multiple sensors
Personal benefits:
Enable the independence of blind and partially sighted people
Save end-users stress and time
Improve individual self-esteem
Context and objectives
Objectives
5. 7 partners (academics, SMEs, VI persons associations)
4 European countries (ES, FR, SI, UK)
Duration: June 2012 to November 2014
Final product: a device consisting of a smartphone with additional sensors, wirelessly connected to a local processing unit
The project
ZVEZA SLEPIH
6. Ambient Assisted Living - a funding activity that aims:
to create better conditions of life for older adults
to strengthen industrial opportunities in Europe through the use of ICT
Funds cross-national projects involving SMEs, research bodies and user organizations
Time-to-market perspective of max 2-3 years after the end of the project
Project total budget: €1-7 M (funding of €3 M at most)
AAL Joint Programme
8. How do VI persons orient themselves?
With the help of the guide (other person)
Using a white cane, guide dog
Using electronic devices, GPS
By listening to familiar sounds
By looking for something familiar (edge of pavements,
curves, crossroads, very large inscriptions)
Underfoot textures, different surfaces
Sun, wind directions, smell
Road signs
State of the Art
9. How do VI persons orient themselves?
Current techniques are still not very advanced
State of the Art
10. How do VI persons orient themselves?
Canes and dogs are still kings!
State of the Art
11. How VI persons (could) orient themselves?
Navigation systems:
GPS + computer vision (clear path, landmark recognition)
Object recognition systems:
Grocery shopping assistant
RFID tags on objects
OCR (Optical Character Recognition)
Detectors: crosswalks, walk lights, staircases, street signs, pedestrians
Obstacle avoidance systems:
Integrating depth information
Step and curb detection
State of the Art
12. Conclusions:
Few systems work in real time
Many approaches require the use of heavy equipment
Some systems need tags
The research field should get a new boost with the advent of Google Glass
How VI persons (could) orient themselves?
State of the Art
[Lee, 2012]
[Manduchi, 2012] [Pradeep, 2010]
13. Limited computational resources: lightweight, low-power wearable devices
Real-time responsiveness
Reliability and no false positives
Adequate and non-overwhelming communication with the
user (alerts, indications)
State of the Art
Challenges
14. 24 July 2013
Setting up the path
User feedback and requirements
15. Participants profile:
Age: 55-75
Countries: Slovenia, UK
Degree of visual impairment: blind and partially sighted
Total: 40 participants (20 from each country)
Questionnaire for end-users
User requirements
16. Questionnaire conclusions
50% of participants use only familiar routes
Most participants need someone to guide them to certain places.
Some of them need a guide every time; often they depend on the time and willingness of others.
It is important for them to know where they are, how far the destination is, and what lies in the vicinity of the route
User requirements
17. Questionnaire conclusions - Device
Not much confidence is placed in electronic navigation systems (only after several successful tests)
Training and information about electronic devices are necessary.
Half of the users use speech synthesis
Willingness to use headphones, but hearing shouldn't be obstructed.
Turn-by-turn functionality should not give too much info
User requirements
18. Questionnaire conclusions - Indoor
85% of respondents have difficulties with orientation in indoor public institutions.
Difficulties the users are facing in indoor environments:
the size of the room
glittering surfaces
room darkness
no orientation points for navigating with the white cane
difficulty recognizing landmarks
background music.
User requirements
19. Questionnaire conclusions - Obstacles
Obstacles that users want to be warned about:
pillars
curves
overhanging branches
edge of pavements
street furniture
steps
down slopes
ramps
holes
bumps
User requirements
20. User expectations
The device should be accurate:
Exact info about the obstacles
Find safe corridors for walking
Warn the user when it is safe to cross the road, when the green light is on, and if traffic is coming (especially bikes, electric cars)
The device should be small, portable, phone sized.
User requirements
21. User expectations
Other features:
Give the distance to a building
Find the right bus stop or post box.
Text-to-speech for: letters, journey instructions, street inscriptions, shop names
Tell the weather, temperature, local taxi availability.
Recognize faces and the person's name.
User requirements
22. First tests and experiments
System prototype
23. Sensor evaluation
Evaluation of multiple sensors: camera (ToF, stereo, web), compass, gyroscope, ultrasonic ranger, GPS, pedometer
Samsung Galaxy S3 used as baseline
System prototype
Image
Communication
Sound commands
Tactile communication
Orientation
Positioning
Light sensor
Inclination
24. Sensor evaluation
Sensors have different sampling speeds
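To fuse such heterogeneous streams, readings must first be aligned on a common timeline. The sketch below is a minimal illustration of one way to do this (not the project's actual code), assuming linear interpolation is acceptable and using invented rates for a hypothetical 1 Hz GPS heading stream and a 100 Hz gyroscope:

```python
import numpy as np

def resample_to(timestamps, values, target_times):
    """Linearly interpolate a sensor stream onto a common timeline."""
    return np.interp(target_times, timestamps, values)

# Hypothetical streams: GPS heading at 1 Hz, gyroscope at 100 Hz (illustrative rates).
gps_t = np.arange(0.0, 10.0, 1.0)            # seconds
gps_v = np.linspace(0.0, 90.0, gps_t.size)   # e.g. heading in degrees
gyro_t = np.arange(0.0, 10.0, 0.01)

# Bring the slow stream onto the fast stream's timeline.
gps_on_gyro = resample_to(gps_t, gps_v, gyro_t)
```

With both streams on the gyroscope's timeline, per-sample fusion (e.g. complementary filtering) becomes straightforward.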
System prototype
25. Sensor evaluation - Conclusions
All sensors in the Samsung S3 are superior to the external ones tested (except GPS).
The external GPS has better reception thanks to its antenna, but in areas with strong multipath effects the advantage is reduced
GPS accuracy: 10 - 40 meters in urban areas
An ultrasonic ranger would be useful for detecting obstacles in front of the user
System prototype
28. Possible camera positions
Setting used for video recording
System prototype
29. Headphones
Bone conduction headphones:
Effective even in a very loud environment (city traffic)
Do not obscure sounds from the environment
Very high frequencies are not rendered as well as in normal headphones
System prototype
33. Input video stream
Interest points extraction
Grid of points regularly spread in a frame
Method overview
Obstacle detection
34. Input video stream
Interest points extraction
Grid of points regularly spread in a frame
Interest point matching and tracking
Multiscale Lucas-Kanade algorithm
Method overview
Obstacle detection
35. Input video stream
Interest points extraction
Interest point matching and tracking
Multiscale Lucas-Kanade algorithm
Background / Camera motion
estimation
Global geometric transform (RANSAC algorithm)
Method overview
Obstacle detection
36. Input video stream
Interest points extraction
Interest point matching and tracking
Background / Camera motion
estimation
Global geometric transform (RANSAC algorithm)
Static / Dynamic obstacle
motion estimation
Agglomerative clustering based on
proximity computation
Method overview
Obstacle detection
37. Input video stream
Interest points extraction
Interest point matching and tracking
Background / Camera motion
estimation
Static / Dynamic obstacle
motion estimation
Agglomerative clustering based on
proximity computation
Interest points refinement
K-NN algorithm and small clusters removal
Method overview
Obstacle detection
38. Input video stream
Interest points extraction
Interest point matching and tracking
Background / Camera motion
estimation
Static / Dynamic obstacle
motion estimation
Interest points refinement
Obstacle classification
K-NN algorithm and small clusters removal
Method overview
Obstacle detection
39. Input video stream
Interest points extraction
Interest point matching and tracking
Background / Camera motion
estimation
Static / Dynamic obstacle
motion estimation
Interest points refinement
Obstacle classification
Obstacle classification based on position and direction relative to the video camera
Experimental results
Method overview
Obstacle detection
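The core idea of the pipeline above can be illustrated with a toy sketch: synthetic point tracks stand in for the Lucas-Kanade output, a 2-D translation (deliberately simpler than the homographic model used in the project) is fitted by RANSAC as the global camera motion, and points that disagree with it become obstacle candidates. All data, thresholds and function names here are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic point tracks: positions of grid points in frame t and frame t+1.
pts = rng.uniform(0.0, 100.0, size=(60, 2))
flow = np.tile([2.0, 0.0], (60, 1))   # global camera motion: 2 px to the right
flow[:8] = [-3.0, 1.0]                # 8 points lie on an independently moving obstacle
pts_next = pts + flow

def ransac_translation(p0, p1, iters=100, thresh=0.5):
    """Fit a global 2-D translation with RANSAC; return (model, inlier mask)."""
    best_inliers = np.zeros(len(p0), dtype=bool)
    best_t = np.zeros(2)
    for _ in range(iters):
        i = rng.integers(len(p0))     # minimal sample: a single correspondence
        t = p1[i] - p0[i]
        resid = np.linalg.norm(p0 + t - p1, axis=1)
        inliers = resid < thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_t = inliers, t
    return best_t, best_inliers

cam_t, background = ransac_translation(pts, pts_next)
obstacle_pts = pts[~background]       # residual motion = obstacle candidates
```

In the real system the outlier points would then be grouped by agglomerative clustering, refined with K-NN, and classified as approaching or departing from their motion relative to the camera.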
41. The algorithms were run on an Intel Xeon machine (3.6 GHz, 16 GB RAM) with an NVIDIA Quadro 4000 video board (256 CUDA cores, 256-bit external memory interface, 9945 MB graphics memory), under Windows 7 (desktop).
Preprocessing step                                        Without GPU (ms)   With GPU (ms)
Interest point detection (image grid)                     0.05 - 0.5         0.05 - 0.5
Interest point matching and tracking
  (unidirectional Lucas-Kanade optical flow)              22 - 23            10 - 11
Background / camera motion estimation
  (unidirectional homographic motion model, RANSAC)       6.5 - 8.0          6.5 - 8.0
Object / obstacle motion estimation
  (agglomerative clustering)                              0.05 - 0.15        0.05 - 0.15
Interest point refinement (K-NN algorithm)                0.05 - 0.1         0.05 - 0.1
Obstacle classification
  (approaching / departing and urgent / normal)           0.05 - 0.1         0.05 - 0.1
Saving results (video)                                    1.5 - 2.05         1.5 - 2.05
TOTAL TIME / FRAME (average)                              31                 20

Only the optical-flow step is GPU-accelerated; the remaining steps run on the CPU in both configurations.
Computational time
Obstacle detection
43. Accessible Maps
Crowd-sourced application for map annotation
Routes are entered, edited and shared with Google Maps
OpenStreetMap is used as a repository and for online access to information about points of interest.
Navigation assistant
44. Accessible Maps
Waypoints annotations:
WHAT: presence of crosswalks, traffic lights in an intersection, type of intersection, walk buttons, stop signs, median strips.
WHERE: information in absolute geographic form (latitude, longitude)
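One way to represent such annotated waypoints is a record pairing the WHAT (annotation) with the WHERE (latitude, longitude), plus a great-circle distance for finding the waypoint closest to the user's GPS fix. The records, coordinates and helper names below are invented examples, not actual ALICE data:

```python
import math

# Hypothetical waypoint records: WHAT (annotation) + WHERE (lat, lon).
waypoints = [
    {"what": "crosswalk with walk button", "lat": 46.0569, "lon": 14.5058},
    {"what": "traffic lights, 4-way intersection", "lat": 46.0573, "lon": 14.5051},
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (lat, lon) points."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_waypoint(lat, lon, wps):
    """Return the annotated waypoint closest to the given position."""
    return min(wps, key=lambda w: haversine_m(lat, lon, w["lat"], w["lon"]))

nearest = nearest_waypoint(46.0570, 14.5057, waypoints)
```

Given GPS accuracy of 10-40 m in urban areas, the nearest-waypoint lookup would in practice need a confidence radius rather than a single point.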
Navigation assistant
45. Assistance
Crossing ahead:
Turn left and then cross:
Navigation assistant
48. Objectives
Human-Machine interface
Create a communication/presentation system:
Highly adapted to user needs
Enable the VI to perceive and interact with the surrounding
environment
Instructions for navigation must acknowledge that the user's perception is similar to moving blindfolded in a maze:
Verbalization: for description of surrounding objects
Enactive methods: for presenting orientation, distance, motion
and position of moving objects
49. Methods
Human-Machine interface
2 separate groups of users according to:
Level of visual impairment
Other criteria (age, education, etc.)
Interface modalities:
Audio semantics using sound, music and synthesized voice
Text-to-speech synthesis using headphones
Input modalities: screen, tapping, gestures, voice
Output modalities: audio, haptic, tactile
50. Enactive methods
Human-Machine interface
Communication with the user: what, when, how
Not just how to transfer information between the system and the user, but what information and when.
The timely delivery of the right information avoids information overload.
Translate sensory impressions about the surroundings into tactile or sound information (faster and easier to comprehend than verbalization).
51. User warning
Directional warnings: earcons
Positional warning:
alerting must give the user enough time to prepare (2-3 s for a voice message)
acoustic signal (sequence of beeps) with varying frequencies
vibrations in the bone conduction headphones
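The warning scheme above can be sketched as a simple mapping: the closer the obstacle, the shorter the interval between beeps, and only obstacles far enough away leave the 2-3 s needed for a voice message. The thresholds and function names below are illustrative assumptions, not the project's calibrated values:

```python
def beep_interval_s(distance_m, min_interval=0.1, max_interval=1.0,
                    near_m=1.0, far_m=10.0):
    """Map obstacle distance to a beep repetition interval: closer obstacles
    produce faster beeps. Thresholds here are illustrative only."""
    if distance_m <= near_m:
        return min_interval
    if distance_m >= far_m:
        return max_interval
    frac = (distance_m - near_m) / (far_m - near_m)
    return min_interval + frac * (max_interval - min_interval)

def warning_mode(distance_m, urgent_m=3.0):
    """Urgent obstacles get a beep sequence (no time for speech);
    distant ones can be announced by a 2-3 s voice message."""
    return "beeps" if distance_m < urgent_m else "voice"
```

Direction could additionally be encoded by panning the beeps between the left and right bone-conduction transducers, in line with the earcon idea above.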
Human-Machine interface
53. Georgie prototype
Sample user-interface
Human-Machine interface
54. Next steps
Conclusion and Perspectives
55. Conclusion
Encouraging first achievements within the ALICE project
Human-Machine interfacing is a difficult challenge
User feedback is essential
Still plenty of things left to improve
Conclusion and perspectives
56. Perspectives
Learning and recognizing user-defined landmarks and
objects of interest
Obstacle classification according to degree of risk to the
user and generation of adequate alerts
Improve navigation and recognition at key points of the trip (start and finish)
Navigation and obstacle recognition modules integrated
into a single application
Conclusion and perspectives
57. ALICE benefits in day-to-day life?
Jean:
is partially sighted
works at UBPS
travels the same route to his office every day
Conclusion and perspectives
58. ALICE benefits in day-to-day life?
Jean:
knows the route
with his white cane he manages to travel safely from the bus stop
to the building.
Conclusion and perspectives
59. ALICE benefits in day-to-day life?
Paul:
is blind
goes to the UBPS once a week
uses a different route (he doesn't feel safe enough)
Conclusion and perspectives
60. ALICE benefits in day-to-day life?
Paul:
Paul's route
Conclusion and perspectives
61. ALICE benefits in day-to-day life?
Paul and some other blind people usually need to take longer routes (more than 400 m)
Conclusion and perspectives
Jean's route / Paul's route
62. How can ALICE bring benefits?
Conclusion and perspectives
Find out more at www.alice-project.eu
Thank you!