
Eye Wear Computing

Augmenting the Human Mind

If you suspend your transcription on amara.org, please add a timestamp below to indicate how far you have progressed! This will help others to resume your work!

Please do not press “publish” on amara.org to save your progress; use “save draft” instead. Only press “publish” when you are done with quality control.

Video duration
00:32:22
Language
English
Abstract
The talk gives an overview of the emerging field of smart glasses and how they can be used to augment our mind (i.e. how to improve our brain with technology). The talk will focus mostly on how to quantify cognitive tasks in real-world environments. I also present first application scenarios for using smart eyewear (e.g. Google Glass or J!NS MEME) for short-term memory augmentation and cognitive activity recognition.

Over the last centuries, major scientific breakthroughs aimed at overcoming our physical limitations (faster transportation, taller buildings, longer and more comfortable lives). Yet I believe the coming big scientific breakthroughs will focus on overcoming our cognitive limitations.

Smart glasses can play a vital role in

1. understanding our cognitive actions and limitations by quantifying them, and

2. helping us design interventions to improve our mind.

The talk will focus mostly on the first point: what kinds of cognitive tasks we can already track with the smart glasses available on the market, and what will become possible in the near future. I will discuss application examples for Google Glass and J!NS MEME.

J!NS MEME is the first consumer-level device that measures eye movements using electrodes, a technique called electrooculography (EOG). The MEME glasses are not a general computing platform; they can only stream sensor data to a computer (e.g. smartphone, laptop, desktop) over Bluetooth LE. The sensor data includes vertical and horizontal EOG channels plus accelerometer and gyroscope readings. The runtime of the device is 8 hours, enabling long-term recording and, more importantly, long-term real-time streaming of eye and head movement. The glasses are unobtrusive and look mostly like normal glasses.
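As an illustration, the following is a minimal sketch of how such a Bluetooth LE notification stream could be read from Python using the bleak library. The device address, characteristic UUID, and packet layout are placeholder assumptions for this sketch, not the actual J!NS MEME protocol.

    # Hypothetical sketch: subscribe to a BLE sensor-notification stream
    # with the "bleak" library. The address, UUID and packet layout below
    # are placeholders, NOT the real J!NS MEME protocol.
    import asyncio
    import struct

    from bleak import BleakClient

    DEVICE_ADDRESS = "00:11:22:33:44:55"  # placeholder MAC address
    SENSOR_CHAR_UUID = "0000feed-0000-1000-8000-00805f9b34fb"  # placeholder UUID

    def handle_packet(_, data: bytearray):
        # Assumed layout: int16 EOG vertical, int16 EOG horizontal,
        # 3x int16 accelerometer, 3x int16 gyroscope (little-endian).
        eog_v, eog_h, ax, ay, az, gx, gy, gz = struct.unpack("<8h", data[:16])
        print(f"EOG v/h: {eog_v}/{eog_h}  acc: ({ax},{ay},{az})  gyro: ({gx},{gy},{gz})")

    async def main():
        async with BleakClient(DEVICE_ADDRESS) as client:
            await client.start_notify(SENSOR_CHAR_UUID, handle_packet)
            await asyncio.sleep(30)  # stream for 30 seconds
            await client.stop_notify(SENSOR_CHAR_UUID)

    asyncio.run(main())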
For Google Glass I present an open sensor-logging platform (including the infrared sensor to count eye blinks) and a fast interface for lifelogging.
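Counting blinks from a logged infrared proximity signal can be done with simple thresholding. The sketch below is illustrative only; the threshold and debounce values are assumptions, not values from the talk or from the Glass sensor API.

    # Hypothetical sketch: count eye blinks in a logged infrared-proximity
    # signal by detecting upward threshold crossings with a small debounce.
    def count_blinks(ir_signal, threshold=5.0, min_gap=3):
        """Count upward crossings of `threshold`, ignoring crossings that
        occur within `min_gap` samples of the previous one."""
        blinks, last = 0, -min_gap
        for i in range(1, len(ir_signal)):
            crossed = ir_signal[i - 1] < threshold <= ir_signal[i]
            if crossed and i - last >= min_gap:
                blinks += 1
                last = i
        return blinks

    # Toy signal with two blink-like peaks.
    print(count_blinks([1, 1, 2, 7, 8, 3, 1, 1, 1, 6, 9, 4, 1]))  # -> 2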

We will discuss which eye movements correlate with brain functions and how this can be used to estimate the cognitive task a user is performing, from fatigue detection and reading segmentation to cognitive workload, as well as advances towards tracking attention and concentration. Challenges discussed in the talk include how to obtain ground truth and how to evaluate performance in general.
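To make this concrete, the sketch below shows simple per-window features one could extract from a horizontal EOG channel (saccade-like jumps, mean, spread) as input to such a cognitive-activity classifier. The window size, sampling rate, and threshold are illustrative assumptions, not values from the talk.

    # Hypothetical sketch: per-window features from one EOG channel that
    # could feed a cognitive-activity classifier. fs and saccade_thresh
    # are illustrative assumptions.
    import statistics

    def eog_window_features(samples, fs=100, saccade_thresh=50.0):
        """samples: horizontal-EOG values for one analysis window."""
        diffs = [abs(b - a) for a, b in zip(samples, samples[1:])]
        return {
            "duration_s": len(samples) / fs,
            "saccade_count": sum(d > saccade_thresh for d in diffs),  # fast jumps
            "mean": statistics.fmean(samples),
            "stdev": statistics.pstdev(samples),
        }

    # Toy window with three large, saccade-like jumps.
    print(eog_window_features([0, 2, 1, 60, 62, 61, 3, 2, 80, 81], fs=5))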

Talk ID
6460
Event
31c3
Day
3
Room
Saal G
Start
9:15 p.m.
Duration
00:30:00
Track
Science
Type
lecture
Speaker
Kai Kunze
Talk Slug & media link
31c3_-_6460_-_en_-_saal_g_-_201412292115_-_eye_wear_computing_-_kai_kunze
English
Checking done: 0.0%
Syncing done: 0.0%
Transcribing done: 0.0%
Nothing done yet: 100.0%

Work on this video on Amara!

English: Transcribed until

Last revision: 2 years, 2 months ago