
EV

Transforming the journey into the destination

ME310 2013 Final Documentation


Audi Team

Audi Corporation
Mechanical Engineering 310
Fall Design Documentation
Version: 10.12.2012.
Dept. of Mechanical Engineering
Stanford University
Stanford, CA 94305-4021
In Partnership With:
Aalto University, Helsinki, Finland

Final Documentation

1. Prelude

1.1 Executive Summary


With the advent of new robotic and
autonomous technologies, it was only a
matter of time until they crossed over into
vehicle design. Development in the area of
autonomous cars has already reached the
point where companies like Google can
manufacture and test self-driving cars, which
have already clocked over 200,000 miles in
the US [10]. It is not just the technology that
is developing; the entire support system is
changing, as autonomous cars have been
legalized in three US states. Personal
mobility is currently often perceived as
driving a vehicle. It is this perception that is
about to change; mobility will no longer be
just about driving. Driving will not be the
primary task performed in autonomous
cars. These cars will offer a wide range of
options for activities the user can pursue
inside the car while it drives them around.
Team Audi Evolve consists of a combined
team of mechanical and electrical engineers,
product designers, industrial designers and
business school students from Stanford
University and Aalto University in Finland.
The team envisions a future where cabin
space designs of autonomous cars will have
transformed the journey into the destination.
The aim is to offer a solution for 2035 where
drivers can easily work and enjoy their free
time, while maintaining the pleasure of
driving. This makes it important to focus on
the issue of transitioning between various
activities and driving.
Your autonomous car is your personal driver
in the future!

The important areas that have been identified
in the design space are psychological,
physical and experiential. The team went
through several stages of prototyping to
explore the design space. The initial focus
was on a safe transition from autonomous to
manual driving mode. The main idea tested
was that of the steps involved in a safe
transfer of control to the user. The turning
point for this project came with a prototype
exploring a reconfigurable cabin space
design in cars. It was after this prototype that
the focus shifted to the actual experience
of the transition. The users who will adopt
this technology in the future are people who
will still want to drive, while also performing
other activities in the cabin space during
autonomous mode. They will multi-task
heavily and stay constantly connected, and
will therefore change positions frequently.
Since people will often be performing many
different activities while in the car, creating
an excellent riding experience is just as
important as an excellent driving experience.
The team is designing a system that will lead
to smooth and comfortable transitions not
only to and from driving mode, but also for
transitions between other activities. This
will also help people regain time lost in
commuting by creating a cabin space that
is adaptable to many activities and removes
tedious manual adjustments.
The designed system will give the user
complete flexibility to perform different


activities in the cabin space. The car seat
and the steering mechanism are important
components in the cabin space, which will
affect the freedom of motion and the ability
to perform various activities comfortably.
The team envisions that the system will
consist of a smart chair and a retractable
steering wheel. The main motivating factors
for the retractable steering design were
increasing mobility within the cabin space
and relieving users of the responsibility of
monitoring the autonomous driving while in
autonomous mode. While the steering wheel
is retracted and locked in autonomous
mode, the tablet is enabled so that the user
can interact with the windshield. When the
steering wheel is not retracted, it is free to
rotate so that the driver can take manual
control of the car. There are three different
interactions with the system. Mode Initiators
(MI) are intentional actions within reach,
such as pulling the steering wheel to enter
manual driving mode and pushing on the
steering wheel to make it retract. The user
can also use Intentional Body Commands
(IBC), like leaning back to tilt the seat. The
chair can also give Adaptive Chair Reactions
(ACR) to the user's movements, like rotating
slightly when they reach for something in
the back of the car.
This new cabin space experience allows the
driver to rotate 180 degrees to socialize with
the other passengers. Having only one seat
in the front also increases mobility within the
cabin, while making the driver's seat very
desirable. This configuration allows the driver
to better share the driving experience with

Figure 1.1.1: Final Prototype
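The three interaction types above lend themselves to a simple state-machine description. The sketch below is purely illustrative; the struct, the event names, and the 15-degree ACR rotation are inventions of this example, not the team's firmware. It captures the stated rules: Mode Initiators switch driving modes, Intentional Body Commands act only in autonomous mode, and the chair reacts adaptively to reaching movements.

```cpp
// Illustrative sketch only: a minimal state machine for the three cabin
// interactions described above (MI, IBC, ACR).
enum class DriveMode { Autonomous, Manual };

struct Cabin {
    DriveMode mode = DriveMode::Autonomous;
    bool wheelRetracted = true;   // wheel stowed while the car drives itself
    bool seatTilted = false;
    int chairRotationDeg = 0;     // small ACR rotations, 0 = facing forward

    // Mode Initiator (MI): pulling the wheel hands control to the driver.
    void pullSteeringWheel() {
        wheelRetracted = false;
        mode = DriveMode::Manual;
    }

    // Mode Initiator (MI): pushing the wheel retracts it and returns the
    // car to autonomous mode.
    void pushSteeringWheel() {
        wheelRetracted = true;
        mode = DriveMode::Autonomous;
    }

    // Intentional Body Command (IBC): leaning back tilts the seat, but
    // only while the car is driving itself.
    void leanBack() {
        if (mode == DriveMode::Autonomous) seatTilted = true;
    }

    // Adaptive Chair Reaction (ACR): the chair rotates slightly when the
    // user reaches toward the back of the car (the angle is arbitrary here).
    void reachTowardBack() {
        if (mode == DriveMode::Autonomous) chairRotationDeg = 15;
    }
};
```

The key design point mirrored from the text is that body commands and chair reactions are gated on autonomous mode, so they can never interfere with manual driving.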


the passengers, since it increases their view
of the road as well. Our final prototype will
give the experience of facilitated activities
through increased mobility and effortless
transitions between activities for the driver.
Going forward, the team will incorporate
many of the findings uncovered in all of the
prototypes, research, and needfinding into
a functioning system. The development
described herein leads to a vision combining
many of the concepts explored during the
last few months. Through a continued
process of rapid prototyping and additional
user testing, the team plans to present a
highly refined and fully functioning system
in June 2013.

1.2 Project Background


Over the last 100 years, personal vehicles
have become one of the most heavily
used means of transportation in developed
countries. In many ways, cars define who
we are, what we do and how we travel.
Along with increased mobility, cars have
brought about tremendous changes to the
way we live and interact. Over time, in
addition to being just a medium of
transportation, cars have gone on to become
a medium for experiencing the sheer pleasure
of racing around in a machine that is so
responsive to the user's actions. Indeed, a
major component of the pleasure of driving
lies in the ability to drive on the edge of
control and experience the thrill of being out
there and taking the risk.
However, this entire landscape is changing
as we head towards the age of robots
and artificial intelligence. With so much
technological development in these major
fields, there was bound to be a point when
these ideas would cross over and start
influencing automotive design. As the
technology develops further, it will lead to a
complete paradigm shift in the way cars, and
automotive technology in general, are
perceived by the general population.
Concepts which were just science fiction a
few years back have started becoming real
and achievable in the near future.
Development in the area of autonomous
cars has already reached the point where
companies like Google can manufacture and
test self-driving cars, which have already
clocked over 200,000 miles in the US. It is
not just the technology that is developing;
the entire support system is changing, with
autonomous cars being legalized in three
US states.
Personal mobility is currently often perceived
as driving a vehicle. It is this perception
that is about to change; mobility will no
longer be just about driving. Driving will not
be the primary task performed in
autonomous cars. They will offer a wide
range of options, with users able to perform
various other activities inside the car while it
drives them around.
The major hurdles that lie in the wide
development and adoption of this technology
are the costs associated with the required
infrastructure development, legal issues
such as liability in case of accidents, and
user acceptability in general. The question that


comes up is: what will happen when this
technology actually launches in the market?
Will people be willing to take the leap of
faith and trust their lives to a computer-
controlled car? What kind of assurance
and trust development will be required
for the widespread acceptance of
autonomous vehicles? These and many
other questions and issues are the prime
concern of the car manufacturers willing to
step into this field of autonomous car
development.
Like any other new technology, it will take
time to adopt and to build trust in this
system, but it will happen eventually, and
once it does, the possibilities it opens up are
enormous. It is not presumptuous to assume
that by the year 2035, autonomous cars will
have become a common sight. Although
they offer relaxation and safety in driving,
it is still problematic for users to rely
completely on their autonomous functions.
Car designers and manufacturers then need
to focus on taking advantage of this whole
new opportunity to redefine urban mobility
and the experience of cars.
The car will no longer be merely a medium
for the journey; it will be a productive location
where users can perform various activities
while being driven around autonomously.
Since people will often be performing
different activities while in the car, creating
an excellent riding experience is just as
important as an excellent driving experience.
Switching between manual mode,
autonomous mode and different activities
within autonomous mode always includes a
transition. It is these transitions that the team
hopes to address, as well as ensuring that
the cabin allows enough mobility to perform
various activities. The vision of Team Audi
Evolve is to facilitate activities for the driver
through effortless transitions and increased
mobility.

1.3 Project Vision


Team Audi Evolve envisions a future where
people will still want to drive, even though
driving will be only a secondary activity.
Autonomous driving gives the driver spare
time and ensures increased productivity.


1.4 Table of Contents


1. Prelude ............................................................................................. 3
1.1 Executive Summary ......................................................... 4
1.2 Project Background .......................................................... 6
1.3 Project Vision ................................................................... 7
1.4 Table of Contents ............................................................. 8
1.5 List of Figures .................................................................. 14
1.6 List of Tables .................................................................... 17
1.7 Glossary ........................................................................... 17
2. Context ........................................................................................... 25
2.1 Need Statement ................................................................ 26
2.2 Problem Statement .......................................................... 26
2.3 The Design Team ............................................................. 26
2.3.1 The Stanford Team ........................................... 28
2.3.2 The Aalto Team ................................................ 30
2.4 Corporate Sponsor ........................................................... 32
2.4.1 Corporate Liaison ............................................. 33
2.5 Teaching Team ................................................................. 33
2.6 Special Thanks ................................................................ 33
3. Design Requirements ................................................................... 35
3.1 Given Requirements ....................................................... 36
3.2 Functional Requirements ................................................ 37
3.2.1 Functional Opportunities ................................. 42
3.3 Physical Requirements ................................................... 43
3.3.1 Physical Opportunities .................................... 45
3.4 Business Opportunity ..................................................... 47
4. Design Development .................................................................... 49
4.1 Future Assumptions ........................................................ 50
4.1.1 Future User ...................................................... 50
4.1.2 The User Story ................................................. 52
4.1.3 Future Infrastructure ........................................ 52
4.2 Prototype Timeline / Key Learnings ................................. 53
4.3 Finland Convergence ....................................................... 53
4.3.1 Vision ............................................................... 56
4.3.2 Prototype Features ........................................... 56
4.4 User Testing ..................................................................... 56
4.5 Steering Wheel Concepts for Final Design ....................... 58
4.5.1 Magneto ........................................................... 58
4.5.2 Retracted Steering Wheel ................................. 59
4.6 Final Prototype Development ........................................... 60
4.6.1 Anticipatory Chair ............................................. 60
4.6.2 Interactive Steering Wheel ............................... 61
4.6.3 Open and Clear Cabin Space .......................... 61
5. Design Specifications .................................................................. 63
5.1 Anticipatory Chair ............................................................ 67
5.1.1 Chair Electronics .............................................. 68
5.1.1.1 Microcontroller Board ....................... 68
5.1.1.2 Chair Motor Driver Board .................. 69
5.1.1.3 Rotation Motor Driver Board ............. 70
5.1.1.4 Chair Motor Sensing Board ............... 72
5.1.1.5 Rotation Optical Encoder .................. 73
5.1.1.6 Force Sensing Resistors ................... 74
5.1.1.7 Override Switches ............................ 74
5.1.1.8 Master Control Box .......................... 76
5.1.2 Firmware Development .................................... 77
5.1.2.1 Tilt Function ..................................... 78
5.1.2.2 Slide Forward/Back Function ............ 79
5.1.2.3 Rotate Function ............................... 79
5.1.2.4 Move Motor Function ....................... 80
5.1.3 Rotation Mechanism Design ............................ 80
5.1.3.1 Assumptions ..................................... 81
5.1.3.2 Design Calculations .......................... 81
5.1.3.3 Initial Rotation Prototype ................... 81
5.1.3.4 Final Rotation Mechanism ................. 82
5.1.4 Cabin Base Design ............................................. 83
5.1.5 Footpad Design ................................................. 84
5.2 Interactive Steering Wheel ................................................ 85
5.2.1 Interactive Steering Wheel Structure ................ 87
5.2.2 The Making of the Interactive Steering Wheel ... 88
5.2.3 The Interaction Specifications of the Steering Wheel 90
5.2.4 Communication Between Devices .................... 95
5.3 Open and Clear Cabin Space .......................................... 95
5.3.1 The Making of the Open and Clear Cabin Space 97
6. Project Planning and Management .............................................. 97
6.1 Timetable and Milestones ................................................ 103
6.2 Budget and Spendings ..................................................... 104
6.3 Distributed Team Management ......................................... 105
6.4 Stanford EXPE .................................................................. 111
6.5 Future Work ...................................................................... 112
6.6 Personal Reflections ......................................................... 113
7. Appendix .......................................................................................... 114
7.1 Initial Brief ......................................................................... 127
7.2 Fall Brochure .................................................................... 128
7.3 Needfinding ...................................................................... 129
7.4 Benchmarking .................................................................. 131
7.5 Technical Literature Benchmarking .................................. 132
7.6 Critical Function Prototype Golf Cart ................................ 137
7.6.1 Golf Cart CFP Results ...................................... 141
7.6.2 Golf Cart C# Code for Gradual Control ............. 141
7.7 Online Research .............................................................. 143
7.8 Fall Presentations ............................................................. 154
7.9 Winter Presentation .......................................................... 157
7.10 Winter Brochure .............................................................. 165
7.11 Dark Horse Prototype ..................................................... 168
7.11.1 Reconfiguro - Drag and Drop Interface Java Code 170
7.12 Funky Prototype ............................................................. 170
7.12.1 Arduino Code for FSR .................................... 191
7.12.2 Matlab Data Collection Code (main.m) ............ 191
7.12.3 Matlab Data Collection Code (initialize_port.m) 192
7.12.4 Matlab Data Collection Code (readSerial.m) .... 193
7.12.5 Matlab Data Collection Code (writeToFile.m) ... 194
7.12.6 Matlab Post Process Code (post_proc.m) ........ 194
7.13 Functional Prototype ....................................................... 194
7.13.1 Arduino Code - Anticipatory Chair ................... 196
7.13.2 Arduino Code - Buckle it Out ........................... 196
7.14 Overview ......................................................................... 220
7.15 Design Reasoning .......................................................... 226
7.16 Development Strategy .................................................... 227
7.17 Future Assumptions ........................................................ 229
7.17.1 Future User ..................................................... 230
7.17.2 Future Infrastructure ....................................... 231
7.17.3 Future Technology .......................................... 232
7.17.4 Ideal Future Persona ....................................... 233
7.18 Needfinding ..................................................................... 236
7.18.1 Context Map ................................................... 238
7.18.2 Self-Observation ............................................. 239
7.18.3 Interviews ....................................................... 239
7.18.4 EMT (Emergency Medical Technician) ............ 241
7.18.5 Survey ............................................................ 248
7.19 Benchmarking ................................................................. 249
7.19.1 Steering Benchmarking .................................. 253
7.19.2 Motion Sickness Benchmark .......................... 254
7.19.3 Human-Machine Transition Benchmarking ...... 259
7.19.4 Confirmation Cue Benchmarking .................... 260
7.19.5 Trust Benchmarking ........................................ 262
7.20 Critical Prototypes .......................................................... 265
7.20.1 CFP - Transition Golf Cart ............................... 270
7.20.2 Steering Mechanisms ..................................... 270
7.20.3 CEP - Reconfigurable Workspace ................... 275
7.20.4 CEP - Mobile Workspace ................................ 282
7.21 Design Specifications ..................................................... 284
7.21.1 Transition Sequence Prototype ....................... 286
7.21.2 Different Steering Controls .............................. 286
7.22 Needfinding ..................................................................... 291
7.22.1 Trip to Germany .............................................. 296
7.22.2 Geneva Trip .................................................... 296
7.22.3 Dashboard Questionnaire ............................... 298
7.23 Benchmarking ................................................................. 299
7.23.1 Akka Car Concept ........................................... 301
7.24 Dark Horse Prototypes .................................................... 301
7.24.1 Dark Horse Prototype - Reconfiguro ............... 302
7.24.2 Dark Horse Prototype - SbW Imitation ............ 302
7.24.3 Dark Horse Prototype - Sleeping Positions ..... 310
7.24.4 Dark Horse Prototype - AR Imitation ............... 312
7.24.5 DHP - Disappearing Steering Wheel ............... 314
7.24.6 Dark Horse Prototype - Magneto ..................... 315
7.25 Funky Prototypes ............................................................ 316
7.25.1 Funky Prototype Anticipatory Chair ................. 318
7.25.2 Funky Prototype What Should I Do ................. 318
7.26 Functional System Prototypes ........................................ 325
7.26.1 FSP Anticipatory System ................................. 330
7.26.2 Functional System Prototype - Buckle It Out ... 330
7.27 Design Specifications ..................................................... 333
7.27.1 Design Specifications For Anticipatory Chair ... 336
7.27.2 Design Specifications For Anticipatory System 336
7.27.3 Design Specifications For Buckle It Out .......... 341


1.5 List of Figures


Figure 1.1.1: Final Prototype ............................................................... 5
Figure 2.3.1 - Audi Design Team ........................................................ 27
Figure 2.3.1.1 - Stanford University Logo .......................................... 28
Figure 2.3.1.2 - Sangram Patil ........................................................... 28
Figure 2.3.1.3 - Stephanie Tomasetta ................................................ 29
Figure 2.3.1.4 - David Wang .............................................................. 29
Figure 2.3.2.1 - Aalto University Logo ............................................... 30
Figure 2.3.2.2 - Goran Bjelajac .......................................................... 30
Figure 2.3.2.3 - Sifo Luo .................................................................... 31
Figure 2.3.2.4 - Heikki Sjöman .......................................................... 31
Figure 2.3.2.5 - Tommi Tuulenmaki ................................................... 32
Figure 2.4.1 - Audi Logo .................................................................... 32
Figure 2.4.2 - ERL Logo .................................................................... 32
Figure 2.5.1 - Aalto Teaching Team ................................................... 34
Figure 4.1.2.1 - User story ................................................................. 52
Figure 4.2.1 - Steering Transition CFP .............................................. 53
Figure 4.2.2 - Steering Mechanisms ................................................. 54
Figure 4.2.3 - Darkhorse prototype ................................................... 54
Figure 4.2.4 - Magneto prototype ...................................................... 54
Figure 4.2.5 - Darkhorse prototype ................................................... 55
Figure 4.2.5 - Funky Prototype ......................................................... 55
Figure 4.2.6 - Sensing chair .............................................................. 55
Figure 4.2.7 - Retractable steering wheel Aalto ................................ 56
Figure 4.4.1 - User Testing ................................................................ 58
Figure 4.4.2 - User Testing 2 ............................................................. 58
Figure 4.5.1.1 - User Testing ............................................................. 59
Figure 4.6.2.1 - Early Steering Wheel Mockup .................................. 62
Table 4.6.2.2 - Retraction Mechanism Explosion .............................. 63
Table 4.6.3.1 - Cabin Space mock up ............................................... 65
Figure 5.1.1.1 - System level view .................................................... 68
Figure 5.1.1.2 - Physical layout of the hardware ............................... 69
Figure 5.1.1.1.1 - Physical circuit board for the microcontroller ........ 70
Figure 5.1.1.2.1 - Schematics ........................................................... 71
Figure 5.1.1.2.2 - Physical circuit board ............................................ 71
Figure 5.1.1.3.1 - Dual fan assembly for rotation motor heatsink ...... 72
Figure 5.1.1.4.1 - Motor position sensing circuit schematic ............... 73
Figure 5.1.1.4.2 - Motor position sensing board ................................ 73
Figure 5.1.1.5.1 - Rotation encoder - optical sensor and circle of stripes 74
Figure 5.1.1.6.1 - Initial configuration of FSR tested ......................... 74
Figure 5.1.1.6.2 - FSR protoboard ..................................................... 75
Figure 5.1.1.6.3 - FSR circuit schematic ........................................... 76
Figure 5.1.1.7.1 - Switch override circuit schematic and protoboard .. 76
Figure 5.1.1.7.2 - Switch override protoboard ................................... 77
Figure 5.1.1.8.1 - Master control box ................................................ 77
Figure 5.1.2.1 - Firmware process flow ............................................. 78
Figure 5.1.3.2.1 - Chair Dimensions ................................................. 80
Figure 5.1.3.2.2 - Design Calculations .............................................. 81
Figure 5.1.3.3.1 - Rotation first prototype 1 ...................................... 82
Figure 5.1.3.3.2 - Rotation first prototype 2 ...................................... 82
Figure 5.1.3.4.1 - Rotation mechanism initial assembly .................... 83
Figure 5.1.3.4.2 - Rotation connection to platform ............................ 83
Figure 5.1.3.5.3 - Rotation connection to motor ................................ 83
Figure 5.1.4.1 - Cabin base ............................................................... 84
Figure 5.1.4.2 - Drawing of the central cabin base design ................ 84
Figure 5.1.5.1 - Top footpad layer ..................................................... 85
Figure 5.1.5.2 - Bottom footpad layer ............................................... 85
Figure 5.1.5.3 - Middle footpad layer ................................................ 85
Figure 5.1.5.4 - Footpad top view showing wire routing .................... 85
Figure 5.1.5.5 - Footpad with embedded ball casters ....................... 86
Figure 5.2.1 - 3D Printing of Steering Wheel .................................... 87
Figure 5.2.1.1 - Steering Wheel ........................................................ 88
Figure 5.2.1.2 - Steering Wheel Parts ............................................... 88
Figure 5.2.1.3 - Steering Wheel Parts Back ...................................... 89
Figure 5.2.1.4 - Retraction Mechanism Parts .................................... 89
Figure 5.2.2.1 - Base ........................................................................ 91
Figure 5.2.2.2 - Base Dimensions .................................................... 91
Figure 5.2.2.3 - Tightening Bands .................................................... 92
Figure 5.2.2.4 - Tightening Bands Dimensions ................................. 92
Figure 5.2.2.5 - Shaft ........................................................................ 93
Figure 5.2.2.6 - Inner shaft technical drawing ................................... 94
Figure 5.2.2.7 - Microswitch support ................................................ 94
Figure 5.2.2.8 - Assembled Retraction Mechanism .......................... 95
Figure 5.2.4.1 - Communication schematics .................................... 96
Figure 5.2.4.2 - Physical Implementation ......................................... 96
Figure 5.3.1.1 - Gluing the pieces ..................................................... 97
Figure 5.3.1.2 - Sanding the dashboard ........................................... 98
Figure 5.3.1.3 - Painted dashboard .................................................. 99
Figure 5.3.1.4 - Banding the ribbon .................................................. 100
Figure 5.3.1.5 - Barbecuing the ribbon ............................................. 100
Figure 5.3.1.6 - Mounting the ribbon ................................................ 101
Figure 6.4.1 - EXPE Booth ............................................................... 112


1.6 List of Tables


Table 3.1.1 : Given Design Requirements .......................................... 36
Table 3.2.1 : Functional Requirements ............................................... 37
Table 3.2.1.1 : Functional Opportunities ............................................. 42
Table 3.3.1 : Physical Requirements ................................................... 43
Table 3.3.1.1: Physical Opportunities ................................................... 45
Table 3.4.1 : Business Opportunity ..................................................... 47
Table 4.3.2.1 Priorities for EXPE ........................................................ 57
Table 4.3.2.2 Priorities for EXPE ........................................................ 57
Table 4.5.1.1 Pro and Con for Magneto User Testing ........................ 59
Table 4.5.2.1 Pro and Con for Retracting Steering Wheel ................. 60

1.7 Glossary

Adaptive Cruise Control (ACC): Much like regular cruise control, this system maintains a set speed. "Adaptive" means that the car adapts to its current surroundings: it can keep a desired distance to the car in front.

Adaptive Chair Reaction (ACR): The chair's reaction to changes in the body position of the user.

Analog-to-Digital Converter (ADC): A device that converts a continuous physical quantity (e.g. voltage) to digital data.
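As a concrete, deliberately simplified illustration of the Adaptive Cruise Control entry above: hold the driver's set speed when the gap to the car in front is large enough, otherwise track the lead car's speed while backing off in proportion to the gap error. The proportional control form and the 0.5 gain are assumptions made for this example only, not any production ACC design.

```cpp
// Illustrative ACC sketch: plain cruise control when the road is clear,
// gap-keeping when a lead car is inside the desired distance.
struct AccState {
    double setSpeed;    // driver-selected cruise speed, m/s
    double desiredGap;  // wanted distance to the car in front, m
};

// Returns the commanded speed (m/s) for the next control step.
double accCommand(const AccState& s, double gap, double leadSpeed) {
    if (gap >= s.desiredGap) {
        return s.setSpeed;                 // road clear: hold the set speed
    }
    // Too close: follow the lead car, shedding extra speed the further
    // we are inside the desired gap (0.5 gain is arbitrary).
    double cmd = leadSpeed - 0.5 * (s.desiredGap - gap);
    return cmd > 0.0 ? cmd : 0.0;          // never command a negative speed
}
```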

Application Programming Interface (API): A protocol that software components use as an interface for communicating with each other.

Arduino: A single-board microcontroller.

Augmented Reality (AR): Technology that combines the actual world and digitally generated images into one.

Autonomous Mode: The mode in which the car is driving itself.

Benchmarking: An exercise of exploring as many areas as possible that relate to the problem statement, such as technology research, predictions, etc.

Cabin Space: The interior space within a vehicle which is occupied by passengers and drivers.

Confirmation Cue: Information that the user of a car gets about events outside and inside the car. This information confirms that the car is behaving the way its user wants it to behave.

Computer Numerically Controlled (CNC): A control method, e.g. for machining tools, that uses parameters given by a computer.

Critical Experience Prototype (CEP): A physical prototype of an experience of the design, required to ensure usability.

Critical Function Prototype (CFP): A physical prototype of a fundamental element of the design, required to ensure its functionality.

Final Documentation

19

Glossary
Dashboard
Double Pole, Double
Throw (DPDT)
Electroencephalography
(EEG)

Control panel in front of the driver in a car.


A switch type that includes two SPDT switches

The recording of brains electrical activity along the scalp.

Electronically Erasable
Programmable Read-

Type of memory

Only Memory (EEPROM)


Emergency Medical

Term used in some countries to render to health care

Technician (EMT)

provider of an ambulance.

EXPE

Field Of View (FOV)

The presentation of project outcomes in the end of ME310


course at Stanford.
The vision field of Kinetic camera device that tracks body
movement

Force Sensing Resistor

This is a material whose resistance varies if a force is

(FSR)

applied.
Flowdoc is a collaboration web application for technical

Flowdock

team. It allows for quick updates, team communication,


and compiles all email threads in one place.
This is a term used for ME310 prototype that is still a bit

Functional System
Prototype

crude, and obviously assembled from off-the-shelf parts;


however, this time decisions on
technical implementation are done with increased
sophistication.
This is a term used in ME310 for an approximation

Funky Prototype

prototype of the full system without making a costly


commitment to any one configuration, technology, or
geometry.

20

Final Documentation

Glossary
H-Bridge

Haptics

Ideal Persona

Information Technology
(IT)

This is an electronic circuit that applies voltage across a


load in either direction.
Any form of nonverbal communication involving touch,
eg buttons and touch screens.
It is an assumption of potential extreme user in the
future.
The application of computers and telecommunications
equipment to store, retrieve, transmit and manipulate
data
In-car environment system, which combines audio

Infotainment System

on screen hardware for providing information and


entertainment.

Intravenous Therapy (IV

Therapy is the infusion of liquid substances directly into

Therapy)

a vein.

Infrared Camera (IR

Camera that detects not visible light with longer

Camera

wavelengths.

Input/Output Port (I/O

A port used in microcontrollers for two way

Port)

communication between a computer and microcontroller.

Integrated Drive
Electronics (IDE)
Kinetic

Kinetic

Lane Assistant

A standard for connecting a storage device-

Field Of View (FOV)


This is a motion sensing input device provided by
Microsoft.
This is a system which detects side lanes and tries to
keep the car in between them.

Final Documentation

21

Glossary
Mac OSX

Operating system of Apple computer devices.

Manual Mode

The mode when the user is controlling the car.

Matrix Laboratory

This is a numerical computing environment and fourth-

(MATLAB)

generation programming language.


This is a small computer on a single containing a

Microcontroller

processor core, memory, and programmable input/


output peripherals.

Mobile Workspace

Mode Initiators (MI)

Critical experience prototype that tested workspace


environment in real life traffic.
Commands that are used to transition between preset
modes.
It is sickness caused by difference between visually

Motion Sickness

perceived movement and the vestibular systems sense


of movement.
This is a device that selects one of several analog or

Multiplexer

digital input signals and forwards the selected input into


a single line.
An exercise of understanding and building empathy

Needfinding

for the target user group by conducting interviews and


ethnographic studies.

Night Vision
Original Equipment
Manufacturer (OEM)

This is a system that makes seeing possible in low light


conditions.
Standard component provided with the original product.

22

Final Documentation

Glossary
OpenNI/NITE

Physical Steering

Pull-Up Resistor

This is an open source API library.


Our critical function prototype that experimented
intuitiveness of different types of steering.
These are used in electrical circuits to make sure the
system settle at expected level.

Pulse-Width Modulation

This is a commonly used technique for controlling power

(PWM)

to inertial electrical devices.

Reconfigurable

Critical experience prototype of ours, which mimicked

Workspace

the transition between driving mode and leisure mode.


The angle of the chair at which the users start feeling

Sandwich Threshold

uncomfortable and are concerned that the chair is


sandwiching them.

Segway

Servo Motor

Two-wheeled self-balancing battery-powered electric


vehicle.
This is a rotary actuator that makes precise control of
angular position possible.
A library for Processing is a simple wrapper library that

SimpleOpen NI

maps simple call functions to the more complex API


library functions.

Situational Awareness

Skype

This means awareness that driver has about the


surroundings of the car.
Online communication tool which allow video calls and
chatting.

Single Pole, Double

A simple switch that changes connection of two

Throw (SPDT)

terminals to one common terminal.

Final Documentation

23

Glossary
Control system for an automobile that is done with
Steer-by-Wire (SbW)

electronic control systems using electromechanical


actuators and human-machine interfaces replacing the
mechanical control system.

Steering Wheel

Teaching Assistant (TA)

Teensy

Transition Golf Cart


Vehicle to Vehicle
communication (V2V)

Current round control device for a vehicle.


ME310 assistant that helps the students with their
project.
This is a single-board microcontroller
A critical function prototype which tested transition
between autonomous mode and manual mode
Communication between autonomous vehicles

Vehicle to Infrastructure

Communication between autonomous vehicles and

communication (V2I)

infrastructure.

VCC

Three letter combination used for power supply pin.

VDC

Volts in direct current.

Windows

Operating system for a personal computer provided by


Microsoft.

2. Context and Background

2.1 Need Statement

Audi has created a brand that people respect and seek out because of its utilization of advanced technologies. Audi is at the forefront of innovation in automotive technology and is dedicated to providing customers with elegant, sophisticated solutions. Through the pursuit of new technologies, and with the future always in mind, Audi has developed numerous driving assistant systems and technologies. These systems are bringing Audi one step closer to the implementation and introduction of autonomous vehicles.

Autonomous vehicles will be relatively common by the year 2035, and Audi envisions that car design will focus not only on the driving experience but also on the riding experience. As these technologies are adopted, people will want to regain the time lost to commuting and use it productively. Therefore, the interior cabin space design must evolve from the current configuration, meant for driving, to one that can adapt to the various activities a driver and passenger might want to do when in autonomous driving mode.

2.2 Problem Statement

The goal of Team Audi Evolve is to develop a cabin space that will allow Audi users, in the year 2035, to perform numerous activities in an autonomous vehicle. Both the riding experience and the driving experience of an autonomous vehicle must be considered in the solution. The cabin must be open and adaptable in order to provide entertainment or a workspace for the driver during autonomous mode, but it must also provide a safe transition to manual driving when the user chooses to take over the controls. The design solution must ensure that the driver and passengers trust the vehicle's autonomous functions. Furthermore, the solution must display all the information necessary for transitioning between autonomous and manual modes intuitively and without being disruptive to the overall riding and driving experience. The solution must also have the sleek, sophisticated craftsmanship that the Audi brand is known for and should retain the look and feel of the brand.

2.3 The Design Team

The Audi 2012 design team is composed of a diverse, multi-disciplinary group of students from Stanford University and Aalto University who are excited to be working together and learning from each other (Figure 2.3.1).

Figure 2.3.1: Audi Design Team (Thank you Djordje)
2.3.1 The Stanford Team


Figure 2.3.1.1: Stanford University Logo

Stanford University
Location: Stanford, California (USA)
Founded: 1891

Figure 2.3.1.2: Sangram Patil

Sangram Patil
Status: 2nd Year Mechanical Engineering Graduate Student
Contact: sangram@stanford.edu
Phone: 650.704.1145
After completing his undergraduate degree in mechanical engineering in India, Sangram joined Stanford last year and has been enjoying an awesome roller-coaster ride since then. He thoroughly enjoys working on team projects and being part of a student community that is so radiant and full of enthusiasm.

Figure 2.3.1.3: Stephanie Tomasetta

Stephanie Tomasetta
Status: 1st Year Mechanical Engineering Graduate Student
Contact: sltomase@stanford.edu
Phone: 732.492.8373
Stephanie received her BSE in Product Design from Stanford and decided she loved it so much that she wanted to stay for a master's degree in Mechanical Engineering. She believes design can bring delight to people and hopes to design things that people will become emotionally attached to, as well as find functional.

Figure 2.3.1.4: David Wang

David Wang
Status: 2nd Year Electrical Engineering Graduate Student
Contact: dcwang3@stanford.edu
Phone: 407.376.4635
Born and raised in Orlando, FL, David received a BSEE from the University of Florida (GO GATORS!). He interned at Lockheed Martin Missiles and Fire Control in Orlando for four summers, supporting both in-production and IRAD programs. David is looking to graduate in the spring. His specializations and interests include computer architecture, hardware design, and integration.

2.3.2 The Aalto Team


Figure 2.3.2.1: Aalto University Logo

Aalto University
Location: Espoo, Finland
Founded: 2010

Figure 2.3.2.2: Goran Bjelajac

Goran Bjelajac
Status: Industrial and Strategic Design Graduate Student
Contact: goran.bjelajac@aalto.fi
Phone: +385417009819
Goran "Goci" Bjelajac was born in 1986 in Belgrade, Serbia. From an early age he showed exceptional talent and interest in art, so a profession in design seemed natural and the only option for him. He enrolled in a specialized high school for industrial design and continued his education at the Faculty of Applied Arts in Belgrade, and now at the Aalto University School of Art, Design and Architecture, in the Department of Industrial and Strategic Design.

Figure 2.3.2.3: Sifo Luo

Sifo Luo
Status: Information and Service Management Graduate Student
Contact: sifo.luo@aalto.fi
Phone: +385417009819
Sifo received her BSc in Business Technology from the Aalto School of Economics. She is now continuing with a master's in Information and Service Management. Educated in three countries -- China, Finland, and the US -- she is very good at adapting to various cultures. Sifo highly values a sense of urgency and teamwork, and her new enthusiasm is searching for inspiration in the daily routine.

Figure 2.3.2.4: Heikki Sjöman

Heikki Sjöman
Status: Mechanical Engineering Graduate Student
Contact: heikki.sjoman@aalto.fi
Phone: +385417009819
Heikki is in the final year of his master's degree in Mechatronics at Aalto University, Finland. During his studies, he has been working in the field of design and spent a year in the United Kingdom studying business. Heikki enjoys challenges and making things happen. Every day is a new adventure!

Figure 2.3.2.5: Tommi Tuulenmäki

Tommi Tuulenmäki
Status: Mechanical Engineering Graduate Student
Contact: tommi.tuulenmaki@aalto.fi
Phone: +385417009819
Tommi comes from the land of ice and snow, Finland, where he did his undergraduate degree in mechanical engineering. The journey continues, and Tommi has now stepped onto the path of a graduate student at the same school, Aalto University. Tommi thinks challenges are necessary for individual development, which is why he enjoys them so much. He rejoices even more when overcoming the challenges is done with solid teamwork.

2.4 Corporate Sponsor

Figure 2.4.1: Audi Logo

Figure 2.4.2: ERL Logo

Audi is a subsidiary of the Volkswagen Group, the largest automotive vehicle manufacturer in 2011. The design team is working closely with the Electronics Research Laboratory (ERL). The ERL strives to provide innovation and creativity to the Volkswagen Group and all of its brands. It is part of the global research and development network that differentiates its brands from the rest of the automotive industry through cutting-edge technology and research. The ERL performs research and development in areas such as human-machine interface systems, driver assist systems, and infotainment applications and platforms.

VOLKSWAGEN Group of America, Inc.
Electronics Research Laboratory
500 Clipper Drive
Belmont, CA 94002
Phone: +1 650.496.7000

2.4.1 Corporate Liaison

Trevor Shannon
Volkswagen Electronics Research Lab
Engineer, Multimedia Applications Team
Contact: trevor.shannon@vw.com
Phone (office): +1 650.496.7063
Phone (mobile): 704.340.4160

Lorenz Bohrer
User Interaction Designer
Contact: lorenz.bohrer@audi.de
Phone (office): +49 849 89 576866

2.5 Teaching Team

Stanford University
Mark Cutkosky, Professor
Contact: cutkosky@stanford.edu
Larry Leifer, Professor
Contact: leifer@cdr.stanford.edu
George Toye, Professor
Contact: toye@cdr.stanford.edu
Tyler Bushnell, Teaching Assistant
Contact: busht@stanford.edu
Annika Matta, Teaching Assistant
Contact: amatta@stanford.edu
Scott Steber, Teaching Assistant
Contact: steber@stanford.edu
Jeremy Dabrowiak, Coach
Contact: jdabrowiak@gmail.com
Phone: 650.274.7871

Aalto University
Lauri Repokari, Professor
Contact: repokari@stanford.edu
Sara De Moitie, Teaching Assistant
Contact: sara.demoitie@gmail.com
Maria Kulse, Teaching Assistant
Contact: maria.kulse@gmail.com
Tuomas Sahramaa, Teaching Assistant
Contact: tuomas.sahramaa@gmail.com
Mikelis Studers, Teaching Assistant
Contact: mikelis.studers@aalto.fi
Harri Toivonen, Teaching Assistant
Contact: harri.k.toivonen@aalto.fi
Markku Koskela, Teaching Assistant
Contact: markku.koskela@aalto.fi
Lassi Laitinen, Coach
Contact: lassi.laitinen@gmail.com

2.6 Special Thanks to

George Atanasov, Marko Takala, Jordan Davidson, Benjamin Tee

Figure 2.5.1: Aalto Teaching Team

3. Design Requirements

3.1 Given Requirements

Based on the benchmarking, needfinding, and prototyping carried out over the last six months, the team identified basic functional and physical requirements that the final solution should address. The team also identified many interesting directions and opportunities that can be explored while working towards a design solution. The functional requirements listed in this section dictate what the system must do and what functionality should be provided by the solution. The physical requirements dictate what the system should be like physically. The given requirements are listed based on the abstract that was provided; they have been expanded and/or refined further based on the testing done this quarter.

Design Requirement: Driver should trust the car's autonomous functions.
Rationale: In order for the driver to fully enjoy the autonomous mode of the car, he/she should have complete confidence in its control.

Design Requirement: Smooth and comfortable transitions between modes and activities.
Rationale: The driver will perform many activities in autonomous mode and will need to transition between them quickly and easily.

Design Requirement: Cabin should be clear and open.
Rationale: In order to appreciate the freedom inside the car, the cabin has to be spacious and clear enough to provide a comfortable working environment.

Design Requirement: Audi fit and finish.
Rationale: The solution should have a similar level of quality and usability as all Audi cars.

Table 3.1.1: Given Design Requirements

3.2 Functional Requirements

Functional Requirement: The interface for reconfiguration of the cabin space should be intuitive to the user.
Metric: After new users go through the reconfiguration interface 3 times, they should be comfortable using it at any point later.
Rationale: The team wants to target a comfortable transition between the various activities. For this goal to be achieved, the interface for this transition, which reconfigures the cabin space, needs to blend in with the transition itself.
Implementation: This requirement is achieved through a tablet that gives an audible click when the mode has changed, and an easy-to-learn interaction with the chair. The driver receives clear confirmation of which mode he/she is in.

Functional Requirement: The interface used should indicate to the user that the transition is complete.
Metric: There should be 100% awareness amongst users of the current mode of the configuration at all times during the transition. If asked about the current mode at any point during the transition, the user should be able to identify it correctly.
Rationale: This is required for user acceptability and comfort during the transition. Being kept aware of the current mode of the car also leads to reassurance of trust in the system.
Implementation: The tablet has two different interfaces for the two different modes. In autonomous mode, the interface is unlocked so it can be used as an input device for working; in manual mode, the interface is locked, showing an image of the Audi logo.

Functional Requirement: The cabin space should have the flexibility of moving around.
Metric: The user should be able to turn and be comfortable at any angle or orientation. The design should allow the user to move a distance that is at least 2 times the width of the seating or chair.
Rationale: Not all activities that the user would want to perform in autonomous mode can be performed while in the driving position.
Implementation: The rotation mechanism within the chair allows the driver to have much more mobility and to move and sit in any orientation.

Table 3.2.1: Functional Requirements, Part 1

Functional Requirement: The cabin has to be well suited for performing the multiple activities in autonomous mode.
Metric: There should be at least 5 different activities that can easily be performed in the cabin space through minimal cabin changes.
Rationale: This is for developing an adaptable cabin space that allows for a number of different activities, since in autonomous mode users will have free time to do things other than drive.
Implementation: The cabin allows the driver to perform various activities through the tablet interface; the chair brings more flexibility and improved socialization through F2F interaction with backseat passengers.

Functional Requirement: There should be maximum volumetric utilization of cabin space around the driver.
Rationale: The cabin space should no longer feel like a tight transportation vehicle; instead it needs to be more spacious, like a room in a home, for passengers to use the space.
Implementation: By removing the front passenger seat, the cabin space is now more free and spacious.

Functional Requirement: Smooth and comfortable transitions between modes and activities.
Metric: The controller must be able to perform at least 5 activities that users would want to perform in the vehicle, including driving.
Rationale: It makes sense to provide maximum functionality with minimum clutter in terms of the controls.
Implementation: The mode trigger on the steering wheel, done by pushing and pulling, is proven easy and intuitive for the user to perform in any mode. The chair is an adaptive interaction which removes the need for tedious manual adjustments and allows the user to flow naturally into the next activity's position.

Table 3.2.1: Functional Requirements, Part 2

Functional Requirement: Quick transitions between modes and activities.
Metric: The time it takes to begin the next activity is less than 15 seconds.
Rationale: The smart cabin space aims to reduce the tedious tasks associated with manually reconfiguring the space every time the user wants to do a different activity.
Implementation: The chair mechanism is designed to rotate at a comfortable speed to quickly move the driver into their next position. The steering wheel retraction mechanism is fast to activate.

Functional Requirement: The control input design should maintain the precise control and sporty feeling of driving an Audi.
Metric: Users score the control input response at 7 or higher on a 10-point scale that measures the sporty feeling of an Audi control input.
Rationale: This is an essential requirement for cars of the future, where manual driving has been redefined due to the added crash-proof systems, which will always be active for safety. An important factor contributing to the pleasure of driving is the feeling and thrill of controlling a complex machine.
Implementation: The steering wheel increases the sporty feeling, while the centered position of the chair mimics the setup of a formula race car to further enhance a sporty driving experience.

Functional Requirement: The design relieves the driver of the responsibility of monitoring autonomous driving.
Metric: The user checks or verifies the car information no more than once every 3 minutes.
Rationale: A responsible driver should still check on the car occasionally; reducing the distraction caused by the autonomous controller's actions will help users focus better on, and enjoy, other activities.
Implementation: The steering wheel cannot move in autonomous mode and can be used for controlling work contents on the windshield.

Table 3.2.1: Functional Requirements, Part 3

Functional Requirement: Any mode initiator command should be within reach of the user.
Metric: The user does not need to move their arms more than a foot to reconfigure the cabin space.
Rationale: The mode initiator command should be easy to use and should not interrupt the flow of activities.
Implementation: The steering wheel retraction will not put the steering wheel out of range as long as the driver is facing forward.

Functional Requirement: The chair should not exceed the sandwich threshold position.
Metric: The angle between the back and the bottom of the chair should not go below 90 degrees.
Rationale: If the angle is too low, users get sandwiched in the chair and become worried that they might get stuck in it.
Implementation: The chair has been coded so that it can never pass this angle, so the user does not become distrustful of the system.

Functional Requirement: The intentional body commands should not be too sensitive.
Metric: Shifting or tilting the upper body position by 20 degrees multiple times should not trigger the chair response.
Rationale: Users must be able to make minor adjustments to their positions without triggering the chair response.
Implementation: A time threshold (0.7-1.5 s) in the software prevents the command from triggering multiple times in a row.

Functional Requirement: The emergence time of the steering wheel needs to be comfortable for the user.
Metric: The steering wheel needs to respond immediately after pushing or pulling.
Rationale: If it takes longer for the steering wheel to respond, users feel like something is broken, and it disrupts the smooth flow between activities.
Implementation: The steering wheel responds immediately after being initiated.

Table 3.2.1: Functional Requirements, Part 4

Functional Requirement: The driver should be able to communicate face-to-face with backseat passengers.
Metric: The chair should be able not only to rotate fully back but also to move backward and forward.
Rationale: The driver should be able to rotate back and move closer to the backseat to achieve better F2F communication with backseat passengers. When driving, the driver should fit into the best driving position.
Implementation: The chair can rotate 180 degrees to the back, and the distance between the foot pad and the backseat passengers is 40 inches, so that a comfortable communication environment is maintained.

Functional Requirement: Smart features must be turned off when in manual driving mode.
Metric: In manual mode, the chair should not be able to be moved or activated by smart features, no matter what body movement the driver performs.
Rationale: The driver should not be rotated during manual driving mode, as that would be unsafe.
Implementation: The chair is coded so that when manual mode is started, the chair's smart features are disabled.

Table 3.2.1: Functional Requirements, Part 5
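The hold-time threshold (0.7-1.5 s) and the manual-mode lockout described above can be sketched as a small state machine. The class and parameter names below are illustrative assumptions, not the team's actual firmware, which ran on a microcontroller:

```python
import time

class ChairCommandFilter:
    """Accepts a posture command only if it is held continuously past a
    threshold (0.7-1.5 s per the requirement) and only in autonomous mode."""

    def __init__(self, hold_threshold_s=1.0, clock=time.monotonic):
        self.hold_threshold_s = hold_threshold_s
        self.clock = clock                  # injectable for testing
        self.autonomous_mode = False
        self._held_since = None

    def update(self, command_active):
        """Call periodically with the raw posture reading; returns True once
        the command should fire."""
        if not self.autonomous_mode:        # smart features off in manual mode
            self._held_since = None
            return False
        if not command_active:              # command released: reset the timer
            self._held_since = None
            return False
        now = self.clock()
        if self._held_since is None:        # command just started: arm timer
            self._held_since = now
            return False
        if now - self._held_since >= self.hold_threshold_s:
            self._held_since = None         # fire once, then require a new hold
            return True
        return False
```

Resetting the timer after firing is what keeps one long lean from triggering the chair repeatedly, matching the requirement that the command not "happen multiple times in a row."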

3.2.1 Functional Opportunities

Opportunity: Using the windshield as an interactive surface (for example, an AR windshield).
Rationale: Since the users are not driving anymore, they can interact with multimedia while in autonomous mode.
Implementation: In our concept, the speedometer and driving information are shown on the windshield as AR content in manual driving mode, and in autonomous mode the windshield is used as a working interface.

Opportunity: The car should be able to recognize the driver.
Rationale: This is especially applicable in car-sharing scenarios where different drivers would be using the same car. The cabin space needs to be smart enough to readjust to each driver's preferences without any input from the user.
Implementation: It could be possible for the car to begin to recognize different default sitting positions and adjust all settings to that specific user.

Opportunity: The cabin should be able to integrate new technologies quickly.
Rationale: This is in line with the fast-paced technology development expected in the future and Audi's goal of providing customers with the latest technological solutions in their cars.
Implementation: It can even be an upgrade-based business model, like current smartphones.

Table 3.2.1.1: Functional Opportunities

3.3 Physical Requirements

Physical Requirement: The cabin space should be open and clear.
Metric: When all features are stowed away, four passengers must be able to stretch their legs out without hitting anything.
Rationale: The cabin space must be open and comfortable in order for people to have the room to work, relax, or socialize.
Implementation: The cabin is much more open due to the removal of the front passenger seat. There are seats for three passengers in the back as well as the driver.

Physical Requirement: The designed control inputs should maintain the pleasure of driving and be comfortable to use.
Metric: The driver must experience integrated haptic feedback from the control input in driving mode that they deem to be real (8 out of 10 on a post-testing survey), in order to maintain the feeling of controlling the car on the road.
Rationale: The driving control inputs may no longer need to be precise. Instead, they should be easy to use and fun, since driving is an activity the driver will want to do, not have to do.

Physical Requirement: The fit and finish of the design must be in line with the Audi brand.
Metric: Car enthusiasts must be able to identify Audi as the prototype's manufacturer with 85% accuracy.
Rationale: Users are brand loyal, and part of Audi's brand is its refinement and sophistication achieved through fine styling. Therefore, the fit and finish of the design must align with Audi's in order to be consistent and appealing to the Audi user.
Implementation: The aesthetics fit what the team believes Audi will look like in the year 2035. The color scheme remains true to the Audi tradition.

Table 3.3.1: Physical Requirements, Part 1

Physical Requirement: The force sensors must ensure contact with the driver and should be concealed and unnoticeable to the user sitting in the chair.
Metric: Users should not be able to tell the difference between the smart chair and a normal car seat in terms of comfort.
Rationale: Adding smart features should not sacrifice basic user comfort in a car.
Implementation: The force sensors have been attached to the foam seat below the leather covering.

Physical Requirement: Users should need only minimal effort to press the foot pad.
Metric: The driver should not have to put more than 5 pounds of force on the foot pad.
Rationale: The transition should be as natural and effortless as possible for the driver. A minimal input force should be enough to activate the chair's linear movement.

Physical Requirement: The foot pad should be a suitable size and not slippery.
Metric: Users should be able to engage the footpad with flat-bottomed shoes, and the footpad should not hit backseat passengers' feet.
Rationale: The footpad must stay clear of backseat passengers' feet while remaining easy to engage with ordinary flat-bottomed shoes.
Implementation: The footpad has integrated raised points, embedded FSRs, and ball bearings to allow the plate to move easily.

Physical Requirement: The chair should clear the steering wheel and the dashboard while rotating, while being in a comfortable recline position for the user.
Metric: There should be at least 1 inch of clearance between any point on the chair and the furthermost edge of the steering wheel during the entire rotation cycle.
Rationale: To allow for a complete, uninterrupted rotation without hitting anything.
Implementation: The axis of rotation was placed such that the chair clears the steering wheel at 30 degrees of incline, which is comfortable for users. The software brings the chair to this clearance position before it might hit the steering wheel during rotation.

Table 3.3.1: Physical Requirements, Part 2
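As a rough illustration of the 5-pound footpad activation threshold, an FSR reading from the ADC could be converted and compared as follows. The calibration constants and function names are hypothetical; a real FSR is nonlinear, so the linear mapping stands in for a lookup table calibrated against known weights:

```python
def adc_to_force_lbs(adc_value, adc_max=1023, full_scale_lbs=25.0):
    """Map a raw 10-bit ADC reading from the FSR voltage divider to an
    approximate force in pounds (linear placeholder for a calibration
    curve; both constants are illustrative assumptions)."""
    return (adc_value / adc_max) * full_scale_lbs

def footpad_engaged(adc_value, threshold_lbs=5.0):
    """The chair's linear movement activates once the driver presses the
    footpad with at least the threshold force (5 lbs per the requirement)."""
    return adc_to_force_lbs(adc_value) >= threshold_lbs
```

With these assumed constants, the 5-pound threshold corresponds to an ADC reading of about 205, so light incidental foot contact below that never moves the chair.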

3.3.1 Physical Opportunities

Opportunity: Having the car space as a part of the user's home.
Rationale: If there is personal ownership of the car in the future and the cabin space has been designed well, it might as well integrate into the home space instead of being parked away in a garage.

Opportunity: Exploring the options of using Virtual Reality in the cabin space.
Rationale: VR technology will probably be so well developed in the future that it might not be necessary to have the complete functionality in the cabin space. One direction of the solution might simply be a virtual reality helmet that transports users wherever they want to be. This goes beyond the constraints of cabin spaces.

Opportunity: Transformer car.
Rationale: The car transforms into a personal robot to better integrate into the lives of people.

Opportunity: Redesigning the car structure itself.
Rationale: This opportunity allows for more flexibility in the cabin space without being constrained to the internal structure of current cars.
Implementation: The prototype is built on a platform in order to incorporate the new rotation feature and not be restricted by current car constraints.

Opportunity: Utilizing walls and windows inside the cabin space as interactive inputs.
Implementation: This is currently only being implemented on the windshield of the car.

Opportunity: The cabin space should be reconfigurable.
Rationale: This is one direction that can be explored for maximum utilization of the cabin space. The cabin must have integrated tables and storage that are stowed away but still within the user's reach (2.5 ft) at any chair location.
Implementation: The cabin will be used for various activities, so users may want to rearrange, introduce, or stow away items in the space to make it better suited for that activity.

Opportunity: Combining both the steering and pedals into a more intuitive control input.
Rationale: Designing a new control input like this has the potential to make driving more fun while being easier to transition to.

Table 3.3.1.1: Physical Opportunities


3.4 Business Opportunity

Opportunity: New business model for AUDI.
Rationale: After the launch of autonomous cars, the business model can shift to car sharing or a subscription-based model, which will redefine car ownership and urban mobility in the future in general.

Table 3.4.1 : Business Opportunity

4. Design Development

Audi is a subsidiary of the Volkswagen Group, the largest automotive vehicle manufacturer in 2011. The design team is working closely with the Electronics Research Laboratory (ERL). The ERL strives to provide innovation and creativity to the Volkswagen Group and all of its brands. It is part of the global research and development network that differentiates its brands from the rest of the automotive industry through its cutting-edge technology and research. The ERL performs research and development in areas such as human-machine interface systems, driver assist systems, and infotainment applications and platforms.

4.1 Future Assumptions

Assumptions help the team build a picture of what the world will be like in the future. To establish a feasible frame for the design work, especially when designing something so far in the future, the team made reasonable assumptions based on trends from the past to the present, research, needfinding, and benchmarking. Sources for these iterations can be found in the appendices.

4.1.1 Future User

The current generation has grown up in the technology revolution. People are used to the fast-paced development of technology, and the effect of these new technologies on our lives is continually growing. Smartphones and smart handheld devices have redefined the way we interact with digital media. If the smart and self-aware cars of the future are well integrated into the lives of users, the team can envision that people would show the same enthusiasm towards these autonomous machines and new upcoming technology as they are currently showing towards smartphones.

The Internet and social networks have surpassed the boundaries of countries in connecting people across distant areas of the world. Accessing a wide range of information has never been this easy, and it is going to get better in the future. There is a concept of perpetual connectivity being predicted for the future. It is only a matter of time before all the existing technologies are developed to such an extent that they become a part of our lives and blend in so well that we cannot do without them. Specifically in terms of automotive experiences, it is already being seen that the current generation views driving as a distraction from texting rather than the other way around. Since being connected is so easy, people want to stay connected. This does not mean that future users would not love driving.

Experiencing moments of thrill and adventure would still be desirable in the future, but the essential difference would lie in doing things because users want to do them rather than spending time being forced to do them. It is an extrapolation of the current scenario, where people are forced to drive along freeways in heavy traffic only because they have to travel to and from work every day. Based on this insight,

it is highly probable that users will find it


desirable to have an option to ride in a cabin
space that is customized to their needs and
the activities that they would want to do, to
better utilize this time lost in commuting.
The internet age has also led to more liberal
thinking. Non-conventional work options
are being explored. It has been envisioned
that new technology will lead to a great
shift in working spaces, work cultures and
procedures. The future users are most likely
going to work in an environment where
physical presence is no longer required on
a daily basis. In such a situation and with the
increasing influence of autonomous cars, it
is highly probable that the future user is not a
very good driver without the basic assistance
systems. The perceived completely manual
mode of driving is very different from the
existing perception of manual driving. There
will be many assistance systems in place
in future cars. This prediction can be justified by the experiences of airline pilots in the 20th century, when aircraft were fitted with autopilots and new assistance systems. There was a time when pilots were skeptical about adopting, getting used to, and trusting this technology. But today, pilots rely so much on it that most of them cannot do without it.


4.1.2 The User Story

In thinking about the future Audi user, the team envisioned a scenario in which the typical user would experience their car on a daily basis. The team sees this user as a businessman who spent his 20s and 30s very focused on forwarding his career. He gets married later in life and decides to have a child in his early forties. He enjoys taking his wife out on nice dinner dates, but also enjoys entertaining clients and going out with his buddies.

This Audi user is a family man who typically drops his young son off at school on his way to work. He is an active father who helps his son prepare for a spelling test, but also enjoys playing with his son and making him laugh. This means that he will often go into driving mode to have fun and spend some quality time, not only studying, but also sharing a bond over the pleasure of driving an awesome car. Once his son is safely at school, the Audi user is off to work and decides to send some last-minute emails to confirm several business appointments, and then relaxes until he reaches his office. This user story helped guide the team's decisions about the prototype and which features were most important. The figures below show visualizations of this user story.

Figure 4.1.2.1 : User story

4.1.3 The Future Infrastructure

- The number of cars will radically increase globally.
- The number of cars in developed countries will decrease, while in developing countries it will rise.
- Developed countries will shift towards electric vehicles, but in developing countries gasoline will still be the primary fuel.
- China will be the biggest market for cars. It will also be the largest car manufacturer in the world.
- The Middle East will lose its position in oil-based wealth, because of the lack of interest in oil and because the oil reserves are going to expire.
- Because the number of people on the planet will rise, there will be huge demand for food; thus countries that are rich in farmland will be major powers in the future.
- Because of global warming, ice in Siberia and the north of Canada and Europe will melt, which will make those countries the new world leaders in farmland and, as such, the new global leaders.
- By 2030, the transition between the real and virtual world will be complete. Users will feel and see with their senses everything that a person in the virtual world sees and feels. The technology will already be available by 2020, but it will not be safe and legal until 2030.
- More and more people will work from home and live with their parents, which will bring a major decline in marriages.

4.2 Prototype timeline / key learnings

CFP - Steering Transition
QUESTION - Do people feel safe and comfortable making the transition using this prototype, and does this setup increase the situational awareness of the driver?
Insight/Conclusion - The interactive interface was too distracting, and users were better at a direct transition than a gradual transition. (Figure 4.2.1: Steering Transition CFP)

Figure 4.2.1 : Steering Transition CFP


CFP - Steering Mechanisms
QUESTION - Do users feel more comfortable using steering mechanisms other than the traditional steering wheel for driving?
Insight/Conclusion - Most users felt uncomfortable using other steering mechanisms as control inputs for driving, since they were trained with and always use the traditional steering wheel. (Figure 4.2.2)

Figure 4.2.2 : Steering Mechanisms

Darkhorse Prototype - Reconfigurable Workspace
QUESTIONS - Do users value a reconfigurable cabin space in a car? What level of control do users want over the cabin space layout?
Insight/Conclusion - Users value the benefits of being able to optimize the cabin space for different activities, but they would like to do so with the least amount of effort and intrusion in their daily lives. Users also perform a lot of activities quickly and cannot plan in advance what they want to do. (Figure 4.2.3)

Figure 4.2.3 : Darkhorse prototype

Darkhorse Prototype - Magneto
QUESTIONS - What would users do with the steering wheel if it were possible to attach it anywhere on the dashboard?
Insight/Conclusion - Users felt that the wheel could be used as an interactive or control device while in autonomous mode. However, the detachable wheel was cumbersome and users didn't know where to place it. (Figure 4.2.4)

Figure 4.2.4 : Magneto prototype


Funky Prototype - Chair Sense
QUESTIONS - Is having a chair that anticipates the user's desire to adjust it valued? Can force sensors be used to predict intentions based on body language and position in the chair?
Insight/Conclusion - Users had a positive experience with this anticipatory sensing chair prototype when the chair matched their intentions. Force sensors were able to identify simple user intentions like pushing back on the chair. Users felt the chair made transitioning between activities easier. (Figure 4.2.5)

Figure 4.2.5 : Darkhorse prototype
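As a sketch of the kind of logic involved, force-sensor intention detection can be reduced to thresholding sustained pressure patterns. The Python below is purely illustrative; the sensor names, thresholds, and debounce window are assumptions, not values from the team's prototype.

```python
# Illustrative sketch of force-sensor intention detection for a sensing chair.
# Sensor names, the 600-count threshold, and the hold window are assumed values.

HOLD_SAMPLES = 10  # consecutive samples a force must persist (debounce)

def classify_intent(samples):
    """Map a window of {sensor: reading} dicts to a chair intent.

    A sustained push on the backrest sensor reads as 'recline';
    sustained pressure on the seat-front sensor reads as 'slide_forward'.
    """
    if len(samples) < HOLD_SAMPLES:
        return "none"
    window = samples[-HOLD_SAMPLES:]
    if all(s["backrest"] > 600 for s in window):
        return "recline"
    if all(s["seat_front"] > 600 for s in window):
        return "slide_forward"
    return "none"

# Example: a user leaning back steadily for 10 samples
readings = [{"backrest": 650, "seat_front": 120} for _ in range(10)]
print(classify_intent(readings))  # -> recline
```

Requiring the pressure to persist over a window is one simple way to avoid the random misfires the team later observed with gesture triggers.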

Funky Prototype - Methods of Transitioning Modes
QUESTIONS - Are there any intuitive methods of transitioning between driving and autonomous mode?
Insight/Conclusion - Voice recognition and touchable buttons seemed to be the most intuitive, with hand gestures as an alternative way to initiate modes. The team observed users consistently pulling or pushing the steering wheel to make it come out or retract. (Figure 4.2.6)

Figure 4.2.5 : Funky Prototype

Functional Prototype - Chair Sense 2.0
QUESTIONS - How well can people adapt to this new chair interaction? Are gestures an appropriate method to initiate the transition between driving and autonomous mode?
Insight/Conclusion - People enjoyed the experience and concept of the prototype. Gestures used as mode initiators were unreliable due to random misfires. The system must accommodate a variety of user body types.

Figure 4.2.6 : Sensing chair


Functional Prototype - Retractable Steering Wheel
QUESTIONS - Do users feel that initiating modes with the seat belt is intuitive?
Insight/Conclusion - The seat belt as a trigger brought confusion. The trigger has to be something simple and clear, and it must notify the user of which mode they are in. (Figure 4.2.7)

Figure 4.2.7 : Retractable steering wheel (Aalto)

4.3 Finland convergence

During the Stanford team's visit to Finland, the whole team unified their vision, decided which features were most important, divided tasks, and prioritized them. This section outlines the final outcome of that week's discussions and work.

4.3.1 Vision

The team was split between focusing on the activities themselves or on the transitions between activities. A resolution was reached and encapsulated in the vision. The focus of the project was to ensure that users would be able to perform many activities, while making the transitions between them much more effortless. The team agreed to deliver a complete experience so that users could get a taste of what it would be like to work, relax, and socialize in the cabin space redesign. This experience also included a very intuitive and effortless way of moving through these three activities without any tedious adjustments or extra steps, as well as making sure the space was flexible enough to allow for large reconfigurations, such as the rotation that would be needed to improve socializing. The resulting vision statement is as follows:

To facilitate activities for the driver through effortless transitions and increased mobility.

4.3.2 Prototype features

The table on the facing page lists the key elements of the final prototype that the team wanted to integrate. It also outlines which team took on the responsibility for each task, as well as the priority level of each task.


Table 4.3.2.1 Priorities for EXPE

Prototype Component | Team Division | Priority Level
Anticipatory Chair with sensors | Stanford | Very High
Rotation Mechanism | Stanford | Very High
Foot Pad to be integrated with chair | Stanford | Mid
Mock Cabin Design & Build | Aalto & Stanford | High
Retraction Mechanism | Aalto | Very High
Steering Wheel Design | Aalto | Mid
Interactive Steering Input Device | Aalto | Low
Windshield Driving Video / Simulator | Stanford | Low
Manual Mode Indicator | Aalto | High

The team also identified other features as nice to have, prioritized these tasks, and noted who would spearhead the efforts if time permitted:

Prototype Component | Team Division | Priority Level
Kinect - secondary cues for chair | Stanford | Very High
Butterfly Doors | Aalto | Low
Temperature Control | Stanford | Low
Sounds and lights | Aalto | Mid
Lighter chairs | Stanford | Mid
Interactive Display | Aalto | Very High

Table 4.3.2.2 Priorities for EXPE


4.4 User testing

Throughout the build process, the team tested with various users. The hope was to test the chair with as many varying body types as possible, in order to make sure that the chair could be used comfortably by a wide variety of people. The following are some key points and discoveries from the various testing rounds (Figure 4.4.1):

Chair Findings:
- Short users were unable to reach the floor to use the foot pad if sitting all the way back in the chair.
- It took users a couple of minutes to get comfortable with using the chair.
- Many users tried pushing harder, even though the FSRs were able to pick up much more subtle movements.
- One user tried to avoid putting his feet on the foot pad.
- Many taller users with broader shoulders had to lift their shoulders off the chair to activate the recline adjustment.
- People had a hard time activating rotation when asked to twist their bodies, but they found a much easier way by just sliding their upper body in the direction they wanted to rotate.
- Users were delighted when they activated the rotation, and began smiling or giggling.

Figure 4.4.1 - User Testing

Steering Wheel Findings (Figure 4.4.2):
- Users thought the clicking sound and feel was gratifying when pulling the steering wheel.
- Pulling the steering wheel took a little too much effort.
- Having the same background for both modes made people unsure of which mode the car was in, even though the mode was stated on the screen.

Figure 4.4.2 - User Testing 2

4.5 Steering Wheel Concepts for Final Design

The team decided to test different concepts of a retracting steering wheel mechanism to find out the most intuitive way for the driver to switch to and from the different driving modes. The trigger should be easy to reach, fast to initiate, and able to integrate with the function of the chair.

Figure 4.5.1.1 - User Testing
4.5.1 Magneto

The team took the Magneto darkhorse idea and refined the concept. The new Magneto steering wheel contained a magnet on its central back area. The Magneto steering wheel was shaped into a rectangular object, since it was formed around a tablet device, and it was integrated with a ribbon-shaped dashboard made of metal. To change from manual to autonomous mode, the driver could simply take the steering wheel off the dashboard and use it as an input device for working, or attach it to the extended part of the dashboard or to the lower area of the chair, which were also made of metal. To change from autonomous to manual mode, the driver had to put the steering wheel back in the correct spot on the dashboard. (Figure 4.5.1.1)

User testing result:

Pros:
- Fast and easy to change modes
- Clear confirmation of each mode
- Users loved the new concept
- Users were comfortable and relaxed while using the steering wheel as an input device
- Users passed the Magneto to the back-seat passenger to share the experience
- Users loved the open view brought by the ribbon-like dashboard

Cons:
- Users put the Magneto back on the driving spot even though they didn't want to drive, because that was the best reading position for the tablet device
- Users put the Magneto back on the driving spot but did not realize they were in driving mode
- Users didn't know where to store the Magneto/tablet after taking it off, and were confused at first about what to do with it
- Users felt unsafe taking the steering wheel off

Table 4.5.1.1 Pro and Con for Magneto User Testing


4.5.2 Retracted Steering Wheel

The retracted steering wheel mechanism was triggered by a linear actuator. The steering wheel maintained the same shape as conventional wheels. To change from manual to autonomous mode, the driver has to apply a pushing force to the steering wheel for 3 seconds. The wheel then starts to retract and blend into the dashboard as a flat surface. To change out of autonomous mode, the driver had to apply another 3-second push on the steering wheel to make it come back out.

User testing result:

Pros:
- Users felt safe, as there was little change from the legacy design
- The push movement is really intuitive

Cons:
- The driving position is not ergonomic
- The mode change process is too slow
- Users used a lot of force and waited for too long
- Users had to push the wheel back with both hands, which is an unsafe position if you are really driving
- Pushing added force on the chair that moved the chair back

Table 4.5.2.1 Pro and Con for Retracting Steering Wheel

4.6 Final Prototype Development

AudiEvolve is comprised of three main components: an anticipatory chair that senses a user's intentions from intuitive body movements and adapts the chair's position accordingly; an interactive steering wheel with a touch screen display that is used not only for transitioning between autonomous and driving mode but also for interacting with any digital content the user desires; and an open, clear cabin space that maximizes the utilization of space for the user.

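The push-and-hold mode trigger described in section 4.5.2 can be modeled as a simple hold timer that only fires after a sustained push. The Python below is an illustration, not the team's firmware; the force threshold and sample period are assumed values.

```python
# Sketch of the 3-second push-to-retract trigger from section 4.5.2.
# The 40 N threshold and 0.5 s sample period are illustrative assumptions.

HOLD_SECONDS = 3.0
FORCE_THRESHOLD = 40.0  # newtons, roughly the spring preload

class RetractTrigger:
    def __init__(self):
        self.held = 0.0  # accumulated time the wheel has been pushed

    def update(self, force, dt):
        """Accumulate hold time; return True once the push has lasted 3 s."""
        if force >= FORCE_THRESHOLD:
            self.held += dt
        else:
            self.held = 0.0  # any release restarts the timer
        return self.held >= HOLD_SECONDS

trigger = RetractTrigger()
fired = [trigger.update(45.0, 0.5) for _ in range(7)]
print(fired)  # fires on the sample where accumulated time reaches 3.0 s
```

A hold timer like this is one plausible reason users reported the mode change as too slow: the 3-second dwell trades responsiveness for protection against accidental triggers.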

4.6.1 Anticipatory Chair

The final design of the anticipatory chair was built upon the concept from the team's functional prototype. The team focused on the driver and the transitions between activities that the driver would most likely perform within the cabin space of an autonomous vehicle. It was evident that the chair is an important aspect of this design space. The chair has a variety of pressure sensors embedded into it that determine how the user wants to adjust the chair's position. The major differences from the functional prototype to the final design were the modification of the rotation mechanism to allow the user to smoothly rotate up to 180 degrees, and the addition of a footpad to allow the user to slide the chair back and forth. Rotating the user up to 180 degrees drastically changes socialization and interaction in the car with fellow passengers compared to conventional cabin space layouts. The footpad also serves as a resting place for the user's feet while the chair is rotating, so the user's feet do not drag across the floor.

From the functional prototype, the team realized that the user experience and operation of the system must be flawless, or the credibility of the system is compromised. Therefore, the team implemented a manual override, as well as making sure that the chair would never be capable of sandwiching a user in the chair. Ease of use and flexibility are important for the user to feel that the transition between activities and modes is effortless. The team designed the chair system to have three main functions: a tilt, a slide, and a rotate function. Users wanted the ability to move freely without being disrupted from what they were doing, and using body movements that correspond to how the user would move around in a rolling/swivel chair was reasonable.

4.6.2 Interactive Steering Wheel

To give users a futuristic feeling, the team wanted to redesign the input device for steering. The CFP Physical Steering showed the team that a wheel-like steering device would be the best choice. The team wanted to maintain some legacy from old cars and spice it up with a modern design. The team also added more interface value for the user within the steering wheel. The darkhorse prototype Magneto showed that there are many additional opportunities for the steering wheel since the car is now autonomous. From the needfinding, benchmarking, user testing, and online research, the team discovered that the location of the steering wheel is also the most optimal location for digital interaction in autonomous mode. Therefore the team designed a steering wheel which incorporates an interactive surface. Figure 4.6.2.1 shows an early mock-up of the steering wheel design.

There are five main key insights that affected the design solution:

Figure 4.6.2.1 - Early Steering Wheel Mockup

1. Interaction with the surface of the steering wheel has increased since it was introduced as a steering input method. At first it was only for steering, but nowadays it has all kinds of buttons on it that the user can interact with. Since future cars are autonomous, this interaction can be increased even more.
2. The surface of the steering wheel is the most accessible surface for the driver inside the cabin space.
3. Users did not like to see the steering wheel move completely to a different area.
4. Physically triggered modes make the mode change more apparent for the user.
5. Push and pull triggers were the most natural ways of triggering modes.

The team designed an interactive surface on the wheel for increased interaction possibilities, and a retraction mechanism that allowed effortless mode triggering with push and pull movements. The retraction mechanism also locks the steering wheel while in autonomous mode and aligns it with the dashboard.

To make triggering effortless, the team designed a spring retraction mechanism. The spring had to be strong enough to keep the steering wheel pushed out, but also weak enough that the user can easily push the steering wheel in. Additionally, the steering wheel had to be able to stay locked in when pushed. Quick calculations showed that the spring force had to be about 40 newtons. This means that a weight of around 4 kg can push the steering wheel in.
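The quick spring calculation can be checked directly: a 40 N preload corresponds to roughly a 4 kg mass under standard gravity.

```python
# Check of the spring sizing above: the mass whose weight equals a 40 N preload.
g = 9.81             # m/s^2, standard gravity
spring_force = 40.0  # N, from the team's quick calculation
mass = spring_force / g
print(round(mass, 2))  # -> 4.08 (kg), i.e. about 4 kg as stated
```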


Table 4.6.2.2 Retraction Mechanism Explosion

4.6.3 Open and Clear Cabin Space

It is important that the cabin space layout is open and clear. In this design, there is only one car seat in the front, arranged in the middle of the car. The old-fashioned dashboard was also removed, since the team discovered that electric cars will dominate the car market and there would therefore be no need for large engine compartments. Steer-by-wire will be the main technology for future steering wheel connections, allowing the windshield to be more prominent so passengers have a better view of the driving scene. With these changes, the team dramatically increased the openness and clearness within the cabin space.

In order to better understand what the cabin space should look like, the team conducted research about the future user and their needs, and explored the possibilities that future cars would bring. If the car is autonomous, the driver does not have to watch the road all the time. This change radically transforms socialization inside the vehicle. As the team's focus is on the transition between multiple activities, the driver requires increased mobility inside the cabin. The front passenger seat was eliminated and the driver seat was positioned in the middle to drastically increase mobility and flexibility within the cabin space. This allowed the front seat to rotate 360 degrees, making socialization between passengers more natural and comfortable.

Though the team focused on the transition between activities, it was still important to understand what positions would be needed for the most common activities that the team identified in a car, which included driving, working, socializing, and relaxing.

Driving
The team spent the beginning of the year researching different driving methods. After user testing multiple prototypes and many interviews, it was clear that the steering wheel is still the best way to steer the car.

Working / Entertainment
There are multiple prototypes and emerging technologies which have the potential to radically change the way people interact with machines and work. Considering current trends, work in the future will be digital and people will become even more digitally connected. At the same time, the future of working is strongly related to the future of entertainment, because both are done on the same platform. The team began by giving the most general solution for working: providing the user a dashboard which will also be an interactive working area. The main display of information will be a windshield with augmented reality.

Socializing
The introduction of autonomous features in cars brings socialization in vehicles to a whole new level. Socialization inside a vehicle will be done through technology for remote business meetings, but having a face-to-face, in-person conversation is something the team wanted to achieve inside a car as well. In order to do so, it was necessary to have a front seat that rotates at least 180 degrees.

Relaxing
Relaxing will be done with the seat and position adjustments. When the car is in autonomous mode, the driver has to feel comfortable and be able to move around freely.

The final cabin space design utilizes all of the advantages that an autonomous car offers. The team wanted to create something that would give the user the feeling of being in the car and communicate the team's vision in the best possible way.

The designed cabin is open and clear. There is no big dashboard in front of the user. There is only a free-flowing form (in the text we will address this free form as a ribbon) that flows inside and outside of the car, creating a loop below the steering wheel to make a stand for the wheel. The ribbon flows around the back seat as well, providing users the necessary feeling of a compact, cabin-like space for EXPE. The whole front part of the car is a large windshield that spreads all the way from the floor of the car, providing a completely new driving experience that was not possible before autonomous and electric cars. For the purpose of the EXPE presentation, it was decided to imitate the windshield display with a screen onto which the display and driving scenario were projected.

Table 4.6.3.1 Cabin Space mock up

5. Design Specifications

In this section we explain our final prototype in more detail. The section also includes the process of making the prototype functional. The subsystems of our final prototype are:

1. Anticipatory Chair
2. Interactive Steering Wheel

5.1 Anticipatory Chair

The chair is comprised of several components: the chair electronics, the firmware development, and the rotation mechanism.

5.1.1 Chair Electronics

The chair that was used in the final design was from a 2007 Audi A8L. All of the seat functionalities and cable wiring pin-outs were almost exactly the same as discovered on the functional prototype. The team took the basic knowledge gained from the functional prototype and applied it in the same way for the final design, expecting that the chairs would be very similar. The team discovered that all the pin-outs for the motor and encoder lines were the same, so the wires were isolated from the rest of the cable harness and connectors were placed in line to allow easy connection/disconnection from the main harness.

Figure 5.1.1.1 displays the system-level view of the electronics portion of the design and Figure 5.1.1.2 displays the physical layout of the hardware.

Figure 5.1.1.1 System level view (car battery, 12 V; power supply, 5 V and 12 V; microcontroller; chair motor drivers, chair motors, and chair motor sensing; rotation motor driver, rotation motor, and rotation optical encoder; force sensor board and force sensors on the chair)

Figure 5.1.1.2 Physical layout of the hardware.
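The rotation motor and optical encoder in the system view imply a counts-to-angle conversion plus an end-of-travel check for the 180-degree rotation. The sketch below is a minimal illustration of that arithmetic; the counts-per-revolution and gear-ratio values are assumptions, not the team's actual hardware parameters.

```python
# Sketch: converting rotation-encoder counts to a chair angle and limiting
# motion to the 180-degree range. CPR and gear ratio are assumed values.

COUNTS_PER_REV = 1024  # encoder counts per motor revolution (assumed)
GEAR_RATIO = 50.0      # motor revolutions per chair revolution (assumed)
MAX_ANGLE = 180.0      # degrees of allowed chair rotation

def chair_angle(counts):
    """Encoder counts accumulated since the home position -> degrees."""
    return counts * 360.0 / (COUNTS_PER_REV * GEAR_RATIO)

def motion_allowed(counts, direction):
    """Block the rotation motor at the ends of travel (+1 CW, -1 CCW)."""
    angle = chair_angle(counts)
    if direction > 0:
        return angle < MAX_ANGLE
    return angle > 0.0

print(chair_angle(25600))  # -> 180.0, the far end of travel for these values
```

A software travel limit like this complements the mechanical safeguards: the firmware can refuse to drive the motor past either end of the rotation range regardless of what the sensors request.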

5.1.1.1 Microcontroller Board

The heart and soul of the electronic hardware design was the microcontroller board. The functional prototype used an Arduino ATmega2560 board to control all the motors and sensor inputs. For the final design, the team decided to move away from the Arduino IDE and use the actual Atmel Studio IDE to have more control and flexibility over the microcontroller. The Arduino wrapper libraries are very handy, but for this final design the team wanted more control over the software and wanted to eliminate all of the behind-the-scenes actions that the wrapper library performs. The team used the reference designs of the Arduino ATmega2560 as a basis to develop their own microcontroller board suitable for their application. One of the major differences from the reference design to the actual design was that the team eliminated the second microcontroller used as a serial programmer for the ATmega2560. Programming was performed via ISP (in-system programming), which is standard for all Atmel microcontrollers. The main benefit of designing a custom circuit board is that connectors and female socket headers to all pins of the microcontroller are implemented for faster software debugging and modularity, since all features and functionalities of the system were unknown at the time the boards were designed. Figure 5.1.1.1.1 displays the physical circuit board for the microcontroller. The schematics can be found in the Appendix.

Figure 5.1.1.1.1 - Physical circuit board for the microcontroller

The microcontroller has three ways of receiving 5V power: a DC barrel jack, a USB cable, or a power supply. There is a circuit designed to select between the power sources, with the power supply having top priority if all of the sources are connected. The board also houses an FTDI FT232RL USB-to-UART component used for debugging purposes and for serial communication to the laptop that controls the interactive display modes (manual or autonomous) on the steering wheel. As mentioned above, the board has female header sockets on every pin of the microcontroller to allow for quick software development, but uses Molex MicroClasp connectors for modular connections for all power/signal lines to the other boards and to the chair.

5.1.1.2 Chair Motor Driver Board

The motor driver board for the chair's motors is exactly the same as designed for the functional prototype. H-bridge circuits were created out of DPDT switches to allow for the motors' bi-directional operation. Schottky diodes were placed reverse-biased across the coils of the motors to help reduce the inductive kickback when the motor is shut down. One major difference from the functional prototype to this final design is that the SPDT switches used to connect +12VDC or GND to supply power to the motor were replaced


with a power transistor. The team noticed from user testing that the stop-and-start motion of the chair movements was very noticeable and not smooth. The team decided that a smooth ramping function through pulse-width modulation (PWM) was necessary to give the user a better experience in the chair. Figure 5.1.1.2.1 displays the revised schematics with the power transistor as the power-enable component, and Figure 5.1.1.2.2 shows the physical circuit board that was created.


Figure 5.1.1.2.1 - Schematics

Figure 5.1.1.2.2 - Physical circuit board
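The ramping behavior this section describes can be sketched as a simple linear duty-cycle ramp updated on a fixed timer tick. This is an illustrative sketch, not the team's firmware: the function name, the 8-bit duty range, and the step size are all assumptions.

```python
def ramp_duty(current, target, step=5):
    """Move an 8-bit PWM duty cycle one step toward its target.

    Called on a fixed timer tick, this produces a linear ramp
    instead of an abrupt start/stop. `step` (illustrative) sets
    the ramp rate; smaller steps give a gentler chair motion.
    """
    if current < target:
        return min(current + step, target)
    if current > target:
        return max(current - step, target)
    return current

# Ramp from rest to full speed: the duty climbs 5 counts per tick.
duty = 0
profile = []
while duty != 255:
    duty = ramp_duty(duty, 255)
    profile.append(duty)
```

With a 40 ms tick this ramp takes about two seconds to reach full duty, which is the kind of gradual start the user testing called for.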


Since transistors were used to supply current to the motors, heatsinks were also incorporated into the revised design to dissipate the heat generated by current flow through the component. Thermal calculations were conducted to determine the suitable power dissipation rating. The maximum current load was measured to be around 9 amps, which was the stall current of the motors. The heatsink used was an Ohmite RA-T2X-25E. Although this heatsink did not meet the thermal specifications based on power dissipation at the expected temperature rise and thermal resistance under natural convection, the team made software modifications to detect stalls within the motors and immediately disconnect power to them. The heatsink is definitely insufficient for continuous stall loads of 9 amps, but for short peak durations it is suitable. For continuous loads of around 4 amps, the normal operating current load, it is fine.

5.1.1.3 Rotation Motor Driver Board

The rotation motor driver board is essentially the same as the chair motors' driver board, except that a larger heatsink and a dual-fan assembly structured orthogonally were incorporated. The motor used for the rotation mechanism has a larger operating current of over 10-15 amps. A specific current load was not measured since the team did not have the resources to measure large current loads. The Ohmite RA-T2X-51E heatsink was used in conjunction with a dual-fan assembly, as shown in Figure 5.1.1.3.1, because the heatsink alone was not able to dissipate the heat fast enough if the motor was running continuously. The dual-fan assembly has one fan blowing from the top of the heatsink and one from the side. The fans were operated from the 12VDC supply line. With the fan assembly, the motor could run continuously for a much longer duration before getting noticeably warm than it could without the additional cooling.

Figure 5.1.1.3.1: Dual fan assembly for rotation motor heatsink
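The thermal sizing argument above can be reproduced with the usual conduction-loss estimate. All component values here (the voltage drop across the transistor and the junction-to-ambient thermal resistances) are illustrative placeholders, not the team's measured figures.

```python
def transistor_temp_rise(i_load, v_drop, theta_ja):
    """Estimate steady-state temperature rise of a power transistor.

    P = V_drop * I_load (conduction loss only; switching loss is
    ignored), and dT = P * theta_ja, where theta_ja is the
    junction-to-ambient thermal resistance in degrees C per watt.
    """
    power = v_drop * i_load
    return power * theta_ja

# Illustrative numbers: 9 A stall current, an assumed 0.5 V drop
# across the transistor, and assumed thermal resistances of
# 5 C/W with a heatsink versus 40 C/W for a bare package.
stall_rise_sinked = transistor_temp_rise(9.0, 0.5, 5.0)
stall_rise_bare = transistor_temp_rise(9.0, 0.5, 40.0)
```

Under these assumed values the heatsinked case rises a tolerable 22.5 degrees while the bare package would rise 180 degrees, which mirrors the report's conclusion that sustained stall current demands both a heatsink and a software stall cutoff.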


5.1.1.4 Chair Motor Sensing Board

The motor sensing board for the chair's motors uses the same schematic design as the one constructed for the functional prototype. A comparator was used to level-shift the encoder output square wave to a 0-5V range compatible with most microcontrollers. Figure 5.1.1.4.1 displays the schematic circuit design and Figure 5.1.1.4.2 displays the physical circuit board that was created.

encoder output square-wave to a 0-5V


Figure 5.1.1.4.1: Motor position sensing circuit schematic

Figure 5.1.1.4.2: Motor position sensing board


5.1.1.5 Rotation Optical Encoder

An optical encoder was used to determine the position of the rotation mechanism. A one-inch-wide, 360-degree circle of black and white stripes was used to obtain a 0-5V square wave pulse from every color transition. The circle of stripes alternated colors every degree and was as wide as the encoder tape sensor. This type of optical encoder was more effective than attaching one to the shaft of the motor, because the position had to be known relative to the turntable that the chair was on, not relative to the motor. If the encoder were connected to the shaft, there were factors that made the reading unreliable, since the motor shaft was connected to a gearbox and then a vertical shaft. The encoded circle was placed underneath the turntable.

5.1.1.6 Force Sensing Resistors

For the functional prototype, square FSR sensors were used, placed in areas that would be beneficial for obtaining weight distribution data. Data collected from the prototype seemed reasonable, but only for certain body types and only for common cases. The team wanted to design a system that would be able to detect and accommodate the various body types users had, and change the thresholds based on that body type. For example, tall/wide users would have a high threshold level to meet, since it is reasonable to assume that more of their body weight would be pressing on the sensors. One major finding from the functional prototype was that the small square FSRs were insufficient in gathering data from different body types, so a different kind of FSR was used. Figure 5.1.1.6.1 displays the initial configuration of FSRs that was tested for the final design.

Figure 5.1.1.5.1: Rotation encoder - optical sensor and circle of stripes

Figure 5.1.1.6.1: Initial configuration of FSRs tested
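Since the stripe ring alternates color every degree, each black/white transition the sensor sees corresponds to one degree of travel. A minimal transition counter might look like the following sketch; the threshold value and the idea of taking direction from the commanded motor direction (a single reflective sensor cannot sense direction by itself) are assumptions, not details from the report.

```python
def update_angle(angle, prev_sample, sample, direction, threshold=512):
    """Track turntable angle from a single reflective stripe sensor.

    `prev_sample`/`sample` are raw 10-bit ADC readings; a crossing
    of `threshold` marks a stripe edge, i.e. one degree of travel.
    `direction` is +1 or -1, taken from the commanded motor
    direction (assumption).
    """
    prev_black = prev_sample < threshold
    now_black = sample < threshold
    if prev_black != now_black:      # stripe transition seen
        angle += direction
    return angle % 360

# Simulate five edges while rotating in the positive direction.
samples = [100, 900, 100, 900, 100, 900]
angle = 0
for prev, cur in zip(samples, samples[1:]):
    angle = update_angle(angle, prev, cur, +1)
```

After the five simulated transitions the tracked angle is 5 degrees; the modulo keeps the count wrapped to the 0-359 range the turntable uses.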


Long, thin FSRs by Interlink (the 408 series) were used in conjunction with the square FSRs to help accommodate various body types. The 408 series only measures the total amount of force on the sensor, not the location where the pressure is applied. The longer sensors for the sides and the shoulders are beneficial for height detection, since they have a much larger area to sense both shorter and taller people in a fixed position.

Initially, all the sensors shown in Figure 5.1.1.6.1 were used to differentiate the tilting and rotating functions, but after threshold testing with various users it was evident that more sensors were not necessarily the best solution for distinguishing different intentions. Having fewer sensors was better, since it helped reduce the number of misfires from uncontrollable factors. The sensors on the bottom cushion were not used in the final design due to many factors that could not be solved by simple software algorithms. One observation during the threshold testing was that users would often have objects in their pockets, which caused the readings from the sensors to not accurately reflect the user's body. Another key observation was that users do not sit symmetrically on the chair, nor do they sit in a consistent pattern: most users tend to sit more toward the right side, with more pressure applied on the right side. This could have been compensated for in software, but another uncontrollable factor that caused problems with the sensor readings was that the 408 series FSRs were placed directly on the chair back's side cushions. Since these cushions compress and expand naturally, it was hard to get an accurate reading, as slight user adjustments would cause unpredictable deviations from a calibrated zero reading. For the final design, only the chair seat back's side sensors and the two middle square sensors were used, for rotation and tilting back respectively.

The circuit design was exactly the same as for the functional prototype. The same board was actually used, but with slight modifications: two extra sensor slots were added to accommodate the sensors for the sliding function via the footpad. Figure 5.1.1.6.2 displays the FSR protoboard and Figure 5.1.1.6.3 displays the circuit schematics.

Figure 5.1.1.6.2 FSR protoboard


[Schematic net labels for Figure 5.1.1.6.3: TI CD4067BE 16-channel analog multiplexer (inputs IN0-IN15, select lines A-D driven by CONTROL_A-CONTROL_D, INHIBIT, OUT, VDD/VSS); FSR lines T1-T6 and B1-B6; +5VDC and GND.]

Figure 5.1.1.6.3: FSR circuit schematic
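Reading many FSRs through the CD4067BE's single output means driving its four select lines with the binary channel index before each ADC sample. A sketch of that decode, using the CONTROL_A-CONTROL_D net names from the schematic; the `read_adc` callback is a hypothetical stand-in for the actual ADC read.

```python
def mux_select(channel):
    """Return logic levels for the CD4067 select lines A, B, C, D.

    A is the least-significant bit of the 4-bit channel index
    (0-15), matching the CONTROL_A..CONTROL_D nets in the
    schematic.
    """
    if not 0 <= channel <= 15:
        raise ValueError("CD4067 has 16 channels")
    return {
        "A": (channel >> 0) & 1,
        "B": (channel >> 1) & 1,
        "C": (channel >> 2) & 1,
        "D": (channel >> 3) & 1,
    }

def scan_all(read_adc):
    """Sample every channel by stepping the select lines in turn.

    `read_adc` is a hypothetical callback that drives the select
    pins and returns one ADC reading.
    """
    return [read_adc(mux_select(ch)) for ch in range(16)]
```

Stepping the select lines channel by channel is what lets a single microcontroller ADC pin serve all of the chair's FSRs.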

5.1.1.7 Override Switches


From the functional prototype, the team observed that users felt more comfortable having some kind of override switch, so that if the chair does something the user didn't want it to do, the user could use the switches to stop it and correct it. The original adjustment switches located on the side of the chair were the most reasonable to use as an override point. Just like in the functional prototype, the resistance of each throw of the momentary SPDT switch was measured as 391

Figure 5.1.1.7.1: Switch override circuit schematic and protoboard


ohms and 822 ohms. A 470-ohm resistor in series between the 5VDC power line and the switch set the voltage level for the 391-ohm throw to 2.36V (10-bit ADC value of 466) and the voltage level for the 822-ohm throw to 3.30V (10-bit ADC value of 652). The protoboard that was made connects directly to the original switch adjustment connector. Figure 5.1.1.7.1 displays the circuit schematic for one of the switches and Figure 5.1.1.7.2 displays the protoboard that was made.

Figure 5.1.1.7.2: Switch override protoboard

5.1.1.8 Master Control Box

A master control box was designed mainly for the convenience of having users try out the system at EXPE. The control box has several main functions: it allows the system to be reset in software without having to touch the reset button on the microcontroller board, to calibrate the FSR thresholds depending on the user's body type, to reset the chair to the default position, and to enable/disable the code functions. The software reset switch is directly connected to the reset pin on the microcontroller. The chair reset and the FSR calibration switches are connected directly to input I/O pins. The enable/disable system switch is connected to an external interrupt pin that determines whether the switch is in a high or low state. The control box also features an LCD screen so that one can determine exactly what state the system is in, for both debugging and normal operation modes. Figure 5.1.1.8.1 displays the master control box.
Figure 5.1.1.8.1: Master control box
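The switch voltage levels quoted in 5.1.1.7 follow from the standard resistor-divider equation. A nominal check of that math (the small offsets between these ideal values and the 2.36V/3.30V measured in the text are consistent with component tolerance):

```python
def throw_voltage(r_throw, r_series=470.0, vcc=5.0):
    """Nominal ADC-pin voltage for one switch throw.

    The throw resistance forms a divider with the 470-ohm series
    resistor on the 5 V line: V = Vcc * R / (R + 470).
    """
    return vcc * r_throw / (r_throw + r_series)

def adc_count(volts, vref=5.0, bits=10):
    """Ideal 10-bit ADC code for a given input voltage."""
    return round(volts / vref * (2 ** bits - 1))

v_391 = throw_voltage(391.0)   # ~2.27 V nominal
v_822 = throw_voltage(822.0)   # ~3.18 V nominal
```

The corresponding ideal ADC codes land within a couple of counts of the 466 and 652 thresholds the firmware compares against, so a modest comparison window is enough to classify each throw reliably.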


5.1.2 Firmware Development

The basic firmware process flow can be seen in Figure 5.1.2.1. The code can be seen in Appendix XX.

[Flowchart, Figure 5.1.2.1: if the FSR calibrate switch is on, determine thresholds based on body type; if the chair reset switch is on, reset the chair to the default position. If the switch override is enabled, check whether an override switch was touched and move the chair motors accordingly. If the system is enabled: in drive mode, move into the drive position; in autonomous mode, read the FSRs and slide, tilt, or rotate.]

Figure 5.1.2.1: Firmware process flow


The code first determines whether the FSR calibration switch or the reset chair switch from the master control box has been turned on. If the FSR calibration switch has been turned on, the FSR sensor values are collected through the ADC and the user's body type is determined based on threshold levels. The body type is categorized as short, medium, or tall for height, and thin, average, or wide for width. The height category is used to determine the tilt-back thresholds, while the width category is used to determine the rotation thresholds.

If the reset chair position switch is turned on, then the chair moves each of the motors to its zero position regardless of the current encoder position. Once the motors stall, the chair resets to a default position setting.

The code is separated into several categories that include switch override, system enable, auto mode, and drive mode. The switch override system is necessary to enable the user to stop the chair's movement if it does something that the user doesn't want it to do. If the switch override is enabled, the software checks whether any of the switch adjustments on the chair side have been activated/touched. If no switches were touched, the code cycles back through the main loop; if they were touched, it determines which switch was activated and moves the corresponding motor as long as the switch is active. One switch is checked every 40 ms via a timer that cycles through each switch.

If the switch override is disabled and system functions are active, then the retractable steering wheel position dictates whether the user is in autonomous or drive mode. If the steering wheel is in drive mode, the chair will proceed into a drive position setting, with all other functions deactivated. If the steering wheel is in autonomous mode, the chair reads the FSR sensors and checks against three different functions: a tilt back, a slide, and a rotate function.

5.1.2.1 Tilt Function

The tilt function is activated once the user holds the two centered back FSRs above a certain pressure value for a certain amount of time. This helps eliminate a lot of misfires due to small adjustments made by the user. If the chair back moves backwards, the user must lean forward slightly to make the FSR values go below a certain threshold level to stop the chair. The same actions apply when the chair moves forward and the user wants to stop the chair. While the tilt function is enabled, no other function can be activated.

5.1.2.2 Slide Forward/Back Function

The slide forward/back function is activated when the user pushes or pulls the footpads, just like the tilt function. To move backward, the user must push forward twice while holding the second push. This slight impulse nudge is necessary to help


distinguish between sliding and tilting. Users tend to push with their feet when tilting backward, which would cause a misfire on the sliding function. Having this slight impulse before holding the second push helps distinguish the user's intentions without much effort. The code determines whether the slight nudge was conducted by checking for X samples above a certain threshold. If so, it goes into a waiting period. If the second push is conducted within the waiting period, the user slides back; if not, the user has to conduct the nudge again. While the slide function is enabled, no other function can be activated.

5.1.2.3 Rotate Function

The rotate function is activated when the user shifts their body to either the left or the right side. When the user is facing toward the front, the user is restricted to turning to the right, up to 180 degrees. When in rotation mode, the tilting and sliding functions are disabled because of the restrictions of the steering wheel clearance. If the user initiates rotation mode but the chair is outside of the steering wheel clearance boundary, the chair moves to a set clearance position prior to rotating. Once rotated, the user relieves some of the pressure from the activating side to stop, or rotates all the way up to 180 degrees. To exit rotation mode, the user must rotate toward the front all the way to 0 degrees for the tilt and slide-back functions to be re-enabled.

5.1.2.4 Move Motor Function

The code was structured to take advantage of a PID motor controller function. Although PWM gave the team speed control over the chair's motors, having a way to determine at what rate the speed should be controlled was an issue. A PID controller takes into account how fast the microcontroller should increase or decrease motor speed until it reaches its targeted set position. The move motor function first determines the delta between the set position and the current position. Once this error is calculated, the result of the PID equation can be computed and the PWM value changes according to how far away the current position is from the set position. Once the motor reaches its targeted value, the motor is disabled.
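The move motor behavior described in 5.1.2.4 can be sketched as a textbook position PID whose output is clamped to the PWM range. This is an illustrative sketch, not the team's firmware: the gains are placeholder values, and only the 40 ms tick comes from the text.

```python
class PidMotor:
    """PID controller producing a signed PWM command from position error."""

    def __init__(self, kp=2.0, ki=0.05, kd=0.5, dt=0.04):
        # dt matches the 40 ms timer tick mentioned in the text;
        # the gains themselves are illustrative placeholders.
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, position):
        """One control tick: return a PWM command clamped to -255..255."""
        error = setpoint - position          # delta from set position
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        out = (self.kp * error + self.ki * self.integral
               + self.kd * derivative)
        return max(-255, min(255, round(out)))

# Far from the target the command saturates; near the target it
# shrinks, and the motor is disabled once inside a deadband.
pid = PidMotor(ki=0.0, kd=0.0)       # pure proportional for illustration
cmd_far = pid.step(1000, 0)          # saturates at the PWM limit
cmd_near = pid.step(110, 100)        # small corrective command
```

Because each call is event-driven rather than a blocking loop, other tasks can be interleaved between control ticks, which is the scheduling benefit the report describes.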

Figure 5.1.3.2.1: Chair Dimensions

The move motor call for that specific motor is then complete. This is an effective function since it does not require the code to be constantly looping through it. Other functions and tasks can be interwoven between move motor calls to allow better utilization of hardware resources and time.

5.1.3 Rotation mechanism design

5.1.3.1 Assumptions:
Chair weight = 30 kg
User weight = 100 kg
Rotation mechanism weight = 20 kg

5.1.3.2 Design calculations

Figure 5.1.3.2.2: Design Calculations

[The hand calculations in Figure 5.1.3.2.2 did not survive text extraction. Recoverable details: the worst case for the moment of inertia assumes the chair fully reclined with the user lying flat, modeled as a slab rotating about a vertical axis with a parallel-axis shift; the chair must clear the dashboard and steering wheel while rotating 180 degrees; the design assumes a maximum angular speed of 0.6 rad/s; and the required torque, with an additional factor of safety, leads to a gearmotor operating point of roughly 1 N-m.]


5.1.3.3 Initial rotation prototype

The first prototype of the rotation mechanism was designed using a 40:1 reduction worm and worm gear from McMaster-Carr. The connecting components were machined.

A motor was selected from AmpFlow. This motor operates on 24V and has a stall torque of 5 N-m and a no-load speed of 5600 rpm at 24V. These specifications far exceed the calculated requirements, including a factor of safety. The team planned to run
Figure 5.1.3.3.1 - Rotation first prototype 1

Figure 5.1.3.3.2 - Rotation first prototype 2


this motor at 12V. This made it easier to use transistors for PWM switching of the motor.

There were considerable noise and alignment issues with this mechanism. These also caused a lot of vibration, which could be felt by the users as it was transmitted through the platform and onto the chair that was connected to it.
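Passing a motor operating point through the 40:1 worm reduction transforms speed and torque as sketched below. The 40% efficiency figure is an assumed ballpark for a worm stage (worm drives often run well below 50%), not a measured value from the project.

```python
def through_gearbox(motor_rpm, motor_torque, ratio=40.0, efficiency=0.4):
    """Output speed and torque of a worm gearbox.

    Speed divides by the gear ratio; torque multiplies by the
    ratio times an efficiency factor that accounts for the
    sliding losses of the worm mesh.
    """
    return motor_rpm / ratio, motor_torque * ratio * efficiency

# Illustrative operating point: 240 rpm and 1 N-m at the motor.
out_rpm, out_torque = through_gearbox(240.0, 1.0)
```

At these assumed inputs the turntable would see 6 rpm and 16 N-m, showing why even a lossy worm stage comfortably covers the calculated torque requirement.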

Figure 5.1.3.4.1 - Rotation mechanism initial assembly

5.1.3.4 Final Rotation Mechanism

It was decided to go with an industrial worm gearbox in order to remove the noise and vibrations and to resolve all alignment issues related to the gearbox. Flexible couplings were used at the input and output shafts of the gearbox to connect the motor to the gearbox and to connect the gearbox to the platform. Figure 5.1.3.4.1 shows the first assembly of the gearbox.

Figure 5.1.3.5.3 - Rotation connection to motor

Figure 5.1.3.4.2 - Rotation connection to platform


5.1.4 Cabin Base Design

The cabin base was constructed in three parts for ease of transportation. The front part was used to mount the dashboard, the middle part housed the rotation mechanism and the chair, and the third part was used to mount the back seat. The first and third parts were composed of a normal rectangular support design. The middle part was designed differently, to transmit the loads from the rotation more uniformly: the turntable was supported at eight points on an octagonal support right at the center of the base. Appropriate spacers were used to align the turntable and make it flat with respect to the ground.

Figure 5.1.4.1 - Cabin base

Figure 5.1.4.2 - Drawing of the central cabin base design


5.1.5 Footpad Design

The footpad was constructed in three layers (each 0.25 inch thick): the base layer, the middle layer that serves as a support, and an upper section consisting of the actual actuation element and the housing around it. The footpad is 22 inches long and 19 inches wide. The three layers are shown in Figures 5.1.5.1, 5.1.5.2, and 5.1.5.3.

The middle layer has a slot in it to route


the wires of the two force sensors that
have been placed at the front and back of
the footpad.

Figure 5.1.5.2 - bottom footpad layer

Figure 5.1.5.4 - Footpad top view showing wire routing

Figure 5.1.5.1 - top footpad layer

Figure 5.1.5.3 - middle footpad layer


Direct actuation of the force sensors using the edge of the actuating section does not work well, since it is essentially acrylic-on-acrylic contact with the force sensor in between. The force readings were much more reliable when a rubber stopper was placed between the actuating section and the force sensor. As seen in the diagram for the top layer, the front and back of the actuating section were also changed to rounded protrusions for a point contact on the force sensor. This made the force response much better. In the next iteration, to further improve the force response of the footpad and to reduce sliding friction on the sides of the actuating section, the sides were altered to also consist of rounded protrusions, making it easier to slide while maintaining alignment. To further reduce the sliding friction, nine ball bearings were added right below the actuating section, embedded in the middle layer. To make it less slippery and much easier to actuate the footpad, the top was covered with vinyl rubber and grip tape to increase the friction between the user's feet and the actuating section.

In terms of moving across the platform, the first iteration of the footpad had ball transfer wheels embedded in the bottom layer so it could slide across the platform on four points of contact.

Figure 5.1.5.5 - Footpad with embedded ball casters


When the carpet was added, this sliding led to a lot of friction, thereby increasing the torque required by the rotation mechanism to turn the chair. The final iteration of the footpad does not have the caster wheels and is instead supported by a structure connected to the turning platform that is rigid enough to not let the footpad touch the ground when the user's feet are resting on it. This makes the footpad essentially float above the platform while rotating, which leads to much better torque efficiency.

5.2 The Interactive Steering Wheel

This subsystem is a combination of the retraction mechanism and the steering wheel. The retraction mechanism is a construction that allows the user to perform physical push and pull triggers for mode switching. The steering wheel is a new design with a sleeker aesthetic and is mounted to this retraction mechanism.

Steering Wheel

Parts for the Interactive Steering Wheel were created with a 3D printer (pictured below), except for the bottom part, which was manufactured with a CNC mill. The interactive screen was made using a used Android tablet.

Retraction Mechanism

Parts in this mechanism are handmade using various manufacturing processes including lathe work, welding, and sheet metal cutting and bending.

Figure 5.2.1 - 3d Printing of Steering Wheel


5.2.1 Structure of The Interactive Steering Wheel

Figure 5.2.1.1 - Steering Wheel (labeled parts: Interactive Steering Wheel, Retraction Mechanism)

Figure 5.2.1.2 - Steering Wheel Parts (labeled parts: Handles, Frame, Android Tablet)

Figure 5.2.1.3 - Steering Wheel Parts Back (labeled part: Back Lid)

Figure 5.2.1.4 - Retraction Mechanism Parts (labeled parts: Ejection Spring, Tightening Band, Switch, Outer Shaft, Inner Shaft, Base, Microswitch Support, Lid, Flange)


5.2.2 The Making of The Interactive Steering Wheel

The CAD model was created in the Rhinoceros modeling software and divided into two categories:
Milled parts
3D-printed parts

Milled Parts
The Back Lid was manufactured on a CNC milling machine out of blue prototype plastic.

3D-Printed Parts
The maximum build envelope of the Aalto Digital Design Laboratory's uPrint printer was 150x210 mm, so the model had to be split into several pieces, which together formed the Frame and Grip parts. The model was divided into 14 parts that were connected with studs fitting precisely into corresponding holes. The prints were set to solid ABS; altogether 561.39 cm^3 of build material and 197.43 cm^3 of support material were used over 44.47 hours of printing. After printing, the parts were washed in an ultrasound bath to dissolve the support material away from the actual ABS.

Tablet
The tablet was selected by size, operating system and price. The Galaxy Tab 2 was one of the most obvious selections since it has a large 10.1" screen, Android as its operating system, and is one of the cheapest in its price category. Android is well known for its capabilities, with a modification-friendly operating system based on Linux. The tablet was rooted, which means installing a custom ROM over the manufacturer's firmware in order to free up functionalities that are not allowed with the manufacturer's settings; rooting also unlocks the superuser (SU) functionality needed to receive commands over WiFi. In this case, the team used a ROM called CyanogenMod 9.1.0 since it was easily accessible and instructions were available.

Finishing
After printing the parts, they were glued together and finished. The parts were filled and sanded before they were painted. After filling, sanding and painting, the tablet was inserted into the back lid and screwed onto the frame.

Retraction Mechanism

Base
This part of the structure was CNC milled from blue prototype plastic and aluminum. The Base is made out of plastic and the Tightening Bands are made out of aluminum. The 3D models were made with Creo CAD software and analyzed with Mastercam, which generated the numerical milling paths for the CNC milling machine. After milling, two M10 pressing screws were put on both tightening bands. The tightening bands are attached to the base with M5 bolts. Figure n shows the manufactured parts, and the dimensional drawings can be seen below.


Figure 5.2.2.1: Base

Figure 5.2.2.2: Base Dimensions


Figure 5.2.2.3: Tightening Bands

Figure 5.2.2.4: Tightening Bands Dimensions


Shaft
The shaft was manufactured on a lathe. Here are the steps for making the shaft part of the Retraction Mechanism.

Figure 5.2.2.5: Shaft

Step 1: Steel blanks were cut to the correct lengths.

Step 2: A hole for the guide nut was drilled in the inner shaft, threads for the locking bolt were made, and the guide nut was glued into its spot.

Step 3: Grooves were made on the inner shaft with a lathe.

Step 4: The flange was welded onto the inner shaft and four M5 holes were drilled into the flange.

Step 5: The guiding profile was manufactured and M10 threads were welded onto the Outer Shaft.

Step 6: The locking screw was assembled from an M6 bolt, two M6 nuts, two steel washers and one plastic washer. The spring is also attached to it.


Figure 5.2.2.6 - Inner shaft technical drawing

Microswitch Support
This support was manufactured with a sheet metal cutter. The support is designed to house the microswitch XGG2-88-S20Z1, which can be seen in Figure n.

Cover for the retraction shaft
The cover was made of an ABS plastic tube, which had a hole cut in it so it could slide over the shaft. At the end of this tube there was a glued laser-cut flange to cover

Figure 5.2.2.7 - Microswitch support


where the retraction mechanism met the dashboard. On top of this ABS plastic covering there was a leather covering attached with velcro.

Assembled Retraction Mechanism
After all the parts were manufactured, the retraction mechanism was assembled as seen in Figure 5.2.2.8.

Figure 5.2.2.8 - Assembled Retr. Mechanism

5.2.3 The interaction specifications of the steering wheel

General overview of the setup and functionalities
The interaction part of the steering wheel was meant to fulfill a simple purpose: to demonstrate that the windshield can be utilized as a display while the car is driven in autonomous mode. In order to change these modes, the steering wheel retraction gave the signal for the tablet to change the app and indicate the mode change in words on the tablet screen. The steering wheel also provided some haptic feedback by vibrating when a mode change occurred. For the windshield, this meant that there was an additional interactive window over the driving scenario to represent our vision of the future of work. There were pictures of different recognizable applications like Facebook, Google Calendar or a movie. These applications could be viewed by simply swiping through the options on the tablet.

5.2.4 Communications between devices

Protocols used:
Android Debug Bridge (ADB), WiFi port 5555
TUIO protocol, WiFi port 3333
HTTP, WiFi port 80
Universal Asynchronous Receiver Transmitter (UART), physical USB cable

The setup can be seen in the schematics picture below. The trigger for the mode

change was a physical switch behind the steering wheel, at the end of the short steering shaft. The switch sent information to the microcontroller through wires when the end of the shaft was moved to the other position. The microcontroller used UART communication to send either the letter M for manual mode or the letter A for autonomous mode to the serial port. On the Mac side, a Python script listened to the serial port and determined what actions to perform. The Python script operated web browsers on the projector through AppleScript code and sent commands via WiFi to the tablet, which was listening for the ADB commands. The tablet sent back TUIO protocol data to the computer in order to control the app on the windshield display by swiping. A program called Tongseng was used to listen for and pass the TUIO data to the Caress


webserver. JavaScript was used to make the webpages functional and to receive the touch events sent by Tongseng.

[Diagram labels for Figure 5.2.4.1: Projector canvas, App, Microcontroller, Mechanical switch behind the steering wheel, VGA cable, Projector, Mac computer, Caress server (node.js), python script, Tongseng, USB cable, WiFi router, Galaxy Tab 2 tablet; arrows: Apps and hardware commands, Data to tablet, Data to computer, Touch events (TUIO protocol), Commands over ADB.]
Figure 5.2.4.1: Communication schematics
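The heart of the Python listener is a small dispatch on the one-byte mode message. This is a hedged sketch of that logic, not the team's script: the action names are hypothetical placeholders for the AppleScript/ADB calls it actually issues.

```python
def dispatch(mode_byte):
    """Map the microcontroller's one-byte mode message to actions.

    'M' (manual) and 'A' (autonomous) are the two letters the
    firmware writes to the serial port; anything else is ignored.
    The returned action names are hypothetical placeholders.
    """
    if mode_byte == b"M":
        return ["show_driving_view", "notify_tablet_manual"]
    if mode_byte == b"A":
        return ["show_work_apps", "notify_tablet_autonomous"]
    return []

# In the real script this dispatch would wrap a pyserial port, e.g.:
#   import serial
#   with serial.Serial("/dev/tty.usbserial", 9600) as port:
#       while True:
#           for action in dispatch(port.read(1)):
#               perform(action)   # AppleScript / ADB call
```

Keeping the dispatch pure (no I/O) makes the mode logic testable without the steering wheel hardware attached.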

The physical implementation of the setup can be seen in the following picture.

Figure 5.2.4.2: Physical Implementation



Figure 5.3.1.1 - Gluing the pieces

5.3 Open and Clear


Cabin Space
5.3.1 The Making of Open and Clear
Cabin Space
The dashboard:

Laser cutting the Duron pieces:
The team opted to make the dashboard
economically. The shape of the dashboard
was first designed in Rhino. It was sliced
into 250 pieces so that laser-cut Duron
pieces could then be stacked to create the
shape.

Material needed:
8 sheets of 3 ft x 4 ft, 1/4 in Duron

Gluing the pieces:
Each piece was numbered for easier assembly. On each piece, there was also a
rastered line to guide where the next piece
should be glued.

Material needed:
Wood glue
Clamps
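The slice count follows directly from the sheet thickness: stacking 1/4 in layers, 250 slices correspond to a stack roughly 62.5 in (about 1.6 m) across. A sketch of that arithmetic (the overall width is inferred here, not stated in the build log):

```python
import math

def slice_count(total_width_in, sheet_thickness_in=0.25):
    """Number of stacked laser-cut layers needed to span the given width."""
    return math.ceil(total_width_in / sheet_thickness_in)

# 250 quarter-inch slices span 62.5 in of dashboard.
```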


Sanding and applying filler
The team next spent 2 weeks sanding the
edges and applying Bondo to make
the surface smooth and curved. In order to prevent the wood
from cracking when force was applied, the
dashboard was then covered in fiberglass.

Material needed:
Sanding machine
Grinder
40 pieces of 60-grit sandpaper
20 pieces of 150-grit sandpaper
3 cans of wood filler
5 cans of Bondo
Fiberglass kit

Coloring
The dashboard was finished by applying
primer and flat black spray paint.

Material needed:
6 x primer
6 x flat black spray paint

Figure 5.3.1.2 - Sanding the dashboard


Figure 5.3.1.3 - Painted dashboard

The Ribbon
The ribbon was divided into 4 pieces - two
in the front and two in the back. The initial
ribbon thickness was supposed to be
approximately 3 cm. However, making the
ribbon this thick was quite a large challenge, so alternative materials and
dimensions were explored.

Cutting the acrylics
The .dxf files were sent to a water jet cutting company to have two 1 cm thick black
acrylic sheets cut.

Material needed:
2 x 4 ft x 8 ft black acrylic sheets

Making the bends
The team experimented with how to best
make a uniform curve that could be replicated easily. One way was to
heat up foam and bend it over a mold.
However, when the foam was heated,
instead of becoming soft it shrank and
became hard.
Another solution was to divide the ribbon into smaller sections and heat each
acrylic piece in an oven or with a heat gun
to bend it. It was impossible to bend 3 cm
thick acrylic, so the team decided to use
8 mm thick acrylic pieces, which could be
bent with a heat gun.


Figure 5.3.1.4 - Bending the ribbon

Because of the restrictions in time and the
amount of work, the team chose to use
thinner acrylic that could be heated and
bent. Since the pieces were rather large, a
grill was used to heat an entire section of
the acrylic, which was then molded over a trashcan to the proper radius.

Painting
The ribbon was spray painted flat black to
match the dashboard.

Material needed:
1 piece of 600-grit sandpaper
2 x primer
2 x flat black spray paint

Mounting the ribbon
The ribbon was made to stand on pillars to
keep it at the right height. Wooden pillars
and plastic pipes were used as the supports. The wood pillars were inserted into
the plastic pipes, which were mounted and
screwed to the platform.
A piece of 2x4 wood was attached to the
bottom of the ribbon. It made gluing the
wood to the plastic easier, as well as attaching the pillars to the plastic. These pillars were then screwed into each wooden
board.

Figure 5.3.1.5 - Barbecuing the ribbon


To attach the ribbon to the floor, the team
drilled holes in the floor and then placed
small plastic pipes through those holes.
Below the floor, we glued those pipes to
a small wooden board that we screwed to
the floor from below.

Figure 5.3.1.6 - Mounting the ribbon (ribbon, 2x4 wooden board, wooden pillars, plastic pipe)


6. Project Planning and Management


6.1 Deliverables and Milestones

Date                 Deliverable
10/22/12             Corporate Project Team Formation
11/8/12              Benchmarking review
11/29/12             Critical Function Prototype
12/4/12              Fall project brochure
12/6/12              Fall presentation
12/11/12             Fall documentation due
1/23/13              First meeting with Audi contact and engineers in Ingolstadt
1/24/13              Second meeting with Audi HMI lab representative in Ingolstadt
1/18/13 - 1/29/13    Dark horse prototyping
1/31/13 - 2/14/13    Funky prototyping
2/15/13 - 2/28/13    Functional prototyping
3/3/13               First vision locked
3/10/13              Turning point brochure
3/12/13              Turning point presentation
3/13/13 - 3/16/13    Aalto in Geneva Motor Show
3/21/13              Winter documentation due
3/22/13 - 3/31/13    Stanford in Helsinki
3/27/13              First whole team meeting with Audi locking concept
4/18/13              X is finished
5/12/13              Aalto in Stanford
5/16/13              Penultimate due
5/28/13              EXPE brochure due
6/6/13               EXPE
6/11/13              Final documentation due

Table 6.1.1 - Deliverables and milestones


6.2 Budget and Spending

This section covers the money spent on
the project so far, for both Stanford and
Aalto. The funds remaining from the fall term
for both teams will roll over into the winter
and spring quarter budgets.

Date         Vendor                        Item                         Purpose                     Amount
11/1/2012    Amazon                        Mindflex Game                Benchmarking                $55.99
11/4/2012    Stanford Bookstore            Batteries and markers        Materials for Benchmarking  $30.44
11/21/2012   Sparkfun                      Accelerometer                CFP                         $38.23

Table 6.2.1: Stanford Spending - Fall

Date         Vendor                        Item                         Purpose                     Amount (EUR)
14/11/2012   Mechanical Engineering Guild  Van renting                  CFP                         38.95
27/11/2012   Verkkokauppa                  Gran Turismo 5 PS3           Steering Testing            49.90
27/11/2012   Verkkokauppa                  Thrustmaster                 Steering Testing            24.90
27/11/12     Verkkokauppa                  Experience Racing Wheel      Steering Testing            57.10
28/11/12     ABC Gas Station               Diesel gasoline for Audi A6  Audi A6 Test Drive          26.93

Table 6.2.2: Aalto Spending - Fall


Date         Vendor           Item                          Purpose                  Amount (EUR)
13/2/2013    ABC gas          Gasoline                      Funky Prototype Testing  31.07
27/2/2013    Hobbyfactory Oy  Servo                         Functional Prototype     71.90
27/2/2013    Protoshop        Aluminium pipe                Functional Prototype     18.23
27/2/2013    Motonet Oy       Seatbelt                      Functional Prototype     39.90
27/2/2013    Motonet Oy       Cover for the steering wheel  Functional Prototype     7.90

Table 6.2.3: Aalto Spending - Winter

Date         Vendor           Item                          Purpose                  Amount ($)
10/1/2013    JAMECO           Electronic Components         Paper Bots               26.86
1/23/2013    Frys             Electronic Components         Paper Bots               97.75
1/25/2013    UHAUL            Renting the van               Dark Horse               180
1/25/2013    Frys             Electronic Components         Dark Horse               48.87
1/26/2013    Cash             Participant incentive         Dark Horse Testing
1/26/2013    Cash             Participant incentive         Dark Horse Testing       15
1/25/2013    Zipcar           Travel                        Dark Horse               27.70

Table 6.2.4: Stanford Spending - Winter


Date         Vendor                         Item                           Purpose       Amount ($)
1/25/2013    UHAUL                          Parking                        Dark Horse    5.10
1/28/2013    UHAUL                          Gas                            Dark Horse    33.14
2/20/2013    DigiKey                        Electronic Components          FUNKY         62.89
2/20/2013    Jameco                         Electronic Components          FUNKY         47.52
2/10/2013    Frys                           Electronic Components          FUNKY         4.96
2/10/2013    Radioshack                     Electronic Components          FUNKY         35.21
2/12/2013    McMaster Carr                  Hardware                       FUNKY         50.31
2/8/2013     AUDI Gas                       Gas                            AUDI testing  79.72
2/10/2013    JOANN Fabric and Craft Stores  Fabric                         FUNKY         11.30
2/9/2013     FRYS                           Electronic Components          Functional    15.14
2/13/2013    SparkFun                       Electronic Components          Functional    50.74
2/20/2013    Andy Mark                      DC motors and connectors       Functional    130.18
3/11/2013    Radioshack                     Resistors                      Functional    3.24
3/3/2013     Radioshack                     Switches                       Functional    13.86
2/25/2013    McMaster Carr                  Fasteners, shaft and couplings Functional    33.04
2/27/2013    Ace Hardware                   Fasteners and lead screw       Functional    13.02
2/25/2013    JAMECO                         Electronic Components          Functional    44.57
2/26/2013    Ace Hardware                   Hinges                         Functional    29.25
2/25/2013    Radioshack                     Electronic Components          Functional    16.24

Table 6.2.4: Stanford Spending - Winter (continued)


Date         Vendor           Item                   Purpose              Amount ($)
2/22/2013    Frys             Electronic Components  Functional           40.03
2/22/2013    TAP Plastics     Plastic sheets         Functional           108.85
2/22/2013    ACE Hardware     Fasteners              Functional           33.48
2/22/2013    Radioshack       Electronic Components  Functional           15.15
2/13/2013    Staples          Easy button            Functional           7.58
2/20/2013    Microsoft Store  Kinect                 Gesture Recognition  150
2/10/2013    Olive Garden     Dinner                 Food for SUDS        390

Table 6.2.4 - Stanford Spending - Winter (continued)


Table 6.2.5 - Aalto Spending - Spring

Date         Type/Name                            Amount (EUR)  Balance before purchase (EUR)
8.4.2013     Bearings                             47.00         5833.33
10.4.2013    Teflon pipe                          26.04         5786.33
5.4.2013     Force sensors                                      5760.29
17.4.2013    Stainless steel pipes                40.30         5760.29
17.4.2013    Rubber mat                           4.46          5719.99
27.4.2013    Aluminum blank                       43.59         5715.53
3.5.2013     Galaxy Tab 2                         449.90        5671.94
25.4.2013    Steering wheel for PC app            3.99          5222.04
26.4.2013    Stainless steel pipes                4.09          5218.05
3.5.2013     Spring presser M10, small ball head  45.75         5213.96
12.5.2013    Spring presser M10, big ball head    38.56         5168.21
13.5.2013    3D printing SW                       413.13        5129.65
16.5.2013    ACE hardware                         63.19         4716.52
14.5.2013    ACE hardware                         41.94         4653.33
17.5.2013    1/4'' Duron                          6.17          4611.39
18.5.2013    Gearbox                              374.00        4605.22
20.5.2013    Home Depot                           23.25         4231.22
21.5.2013    T-shirts                             283.47        4207.97
22.5.2013    Carpet                               476.72        3924.50
22.5.2013    ACE hardware                         44.00         3447.78
24.5.2013    Jameco electronics                   144.68        3403.78
25.5.2013    Home Depot                           8.00          3259.10
29.5.2013    ACE hardware                         22.29         3251.10
30.5.2013    Home Depot                           178.40        3228.81
30.5.2013    Projector                            115.26        3050.41
31.5.2013    Ribbon pieces                        1010.16       2935.15
30.5.2013    Miscellaneous product                9.19          1924.99
30.5.2013    Home Depot                           8.55          1915.80
26.5.2013    Home Depot                           8.02          1907.25
29.5.2013    Home Depot                           15.37         1899.23
29.5.2013    ACE hardware                         20.90         1883.86
30.5.2013    ACE hardware                         17.40         1862.96
2.6.2013     ACE hardware                         11.69         1845.56
2.6.2013     Home Depot                           95.41         1833.87
1.6.2013     ACE hardware                         14.50         1738.46
28.5.2013    Coals                                31.06         1723.96
6.5.2013     Home Depot                           26.27         1692.90
1.6.2013     Home Depot                           28.08         1666.63
27.5.2013    Photography rental                   51.26         1638.55
4.6.2013     Repeater cable                       25.55         1587.29
5.6.2013     Poster                               100.41        1561.74
5.6.2013     ACE hardware                         19.23         1461.33
19.5.2013    Fry's                                25.11         1442.10
             ACE hardware                         30.11         1416.99
             Jo-Ann Fabric and Craft              19.98         1386.88

Total                                             4466.43
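The running-balance column in the spending ledgers can be checked mechanically: each row's balance is the previous balance minus the previous purchase amount. A small sketch of that check, seeded with the first rows of the Aalto spring ledger:

```python
def running_balances(starting_budget, amounts):
    """Return the balance available before each purchase in sequence."""
    balances = []
    balance = starting_budget
    for amount in amounts:
        balances.append(round(balance, 2))
        balance -= amount
    return balances

# First rows of Table 6.2.5: a 5833.33 EUR starting balance,
# then purchases of 47.00 (bearings) and 26.04 (Teflon pipe).
```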


Table 6.2.6 - Stanford Spending - Spring

Date          Type/Name            Amount (USD)  Balance before purchase (USD)
04/05/2013    Turntable            275.80        5560.56
04/04/2013    FSR Strip            21.46         5284.76
04/02/2013    Gas                  85.11         5263.30
04/09/2013    Metal                111.78        5178.19
04/10/2013    Electronics          506.40        5066.41
04/10/2013    Motor                108.31        4560.01
04/09/2013    McMaster             152.80        4451.70
04/09/2013    McMaster             28.03         4298.90
04/12/2013    McMaster             26.62         4270.87
4/15/2013     Adafruit             12.82         4244.25
4/16/2013     Fry's                29.21         4231.43
4/16/2013     Harold's Upholstery  458.28        4202.22
4/23/2013     Alan Steel           31.33         3743.94
4/23/2013     Ace Hardware         26.29         3712.61
4/23/2013     Alan Steel           23.98         3686.32
4/23/2013     Gas                  41.35         3662.34
04/05/2013    Coconuts             143.93        3620.99
4/16/2013     Adafruit             12.82         3477.06
4/24/2013     Jameco               19.95         3464.24
4/30/2013     Home Depot           27.58         3444.29
4/19/2013     Ace Hardware         20.11         3416.71
4/27/2013     Gas                  40.55         3396.60
4/25/2013     Ace Hardware         22.14         3356.05
4/19/2013     Ace Hardware         6.18          3333.91
4/26/2013     Home Depot           108.63        3327.73
4/26/2013     Home Depot           70.82         3219.10
05/01/2013    Rajjot               254.00        3148.28
05/02/2013    TAP Plastic          23.28         2894.28
05/02/2013    JAMECO               39.11         2871.00
05/02/2013    Digikey              35.80         2831.89
05/06/2013    JAMECO               101.39        2796.09
05/07/2013    Digikey              63.22         2694.70
5/14/2013     JAMECO               36.72         2631.48
5/14/2013     Digikey              29.36         2594.76
05/08/2013    Fry's                127.59        2565.40
05/11/2013    TAP Plastic          22.53         2437.81
5/13/2013     Ace Hardware         221.96        2415.28
05/09/2013    Home Depot           30.00         2193.32
5/17/2013     RM 36                275.00        2163.32
5/17/2013     McMaster             76.20         1888.32
              Automation Direct    19.56         1812.12
              Ace Hardware         34.73         1792.56
              Home Depot           37.38         1757.83
              Home Depot           65.38         1720.45
              IKEA                 281.71        1655.07
              Home Depot           69.28         1373.36

Total                              4256.48


6.3 Distributed Team Management

Dealing with a 10-hour time difference and
different timetables, the team decided to work
on different CFPs. The Aalto team tried to
create a way for easy manual steering, and
Stanford focused on a smooth steering
transition between manual and autonomous
driving modes. The two teams met once
or twice a week for 30 - 90 minutes on
Google+ Hangout to keep each other posted
on current progress, to share ideas, and to
show prototypes. (Table 6.3.1)

Tools                   Purpose
Emails                  Contact liaison, schedule updates, important file sharing
Flowdock                Picture updates, file sharing, showing prototypes
Google Docs             Documentation, data recording
Google+ Hangout         Discussion, brainstorming, job division, idea sharing, weekly updates
Facebook VW-Audi Group  Finnish team internal group for scheduling, quick questions, idea sharing
Photos                  Recording testing process, capturing inspiration
Videos                  Documenting tests, conversations, and prototyping
WhatsApp Group          Finnish internal group for instant chat on schedule confirmation and daily accident report
Dropbox                 Stanford internal group for file sharing

Table 6.3.1 - Communication Tools


6.4 Stanford EXPE

Stanford EXPE is the closing show of
the ME310 course. At this event, each
team has to give a final presentation at
Stanford University and demonstrate the
final product to the crowd in a small
booth. Audi Evolve showcased their final
product for the first time through a 12-minute
presentation and a 3-hour interaction
with the EXPE visitors. We made
two videos for EXPE. One was used in
the formal presentation, in which a future
Audi driver is using our product, switching
between two modes and interacting with a
backseat passenger. This video shows all
critical functions of the interactive steering
wheel and the anticipatory chair. The other
video is a detailed product demo, in which each team
member gives a short explanation of our
product vision and value. Our EXPE booth
featured a user's manual to explain
how to use the chair and steering wheel, as
well as a TV display for our product video.


6.5 Future Work

The team has identified future opportunities that this final system has the potential
to include if future iterations were to be
made.

Force Sensor Calibrated Driver Presets
The chair calibrates using the force sensors to determine what body type you
most likely are from your default seated
force readings. Based on these, the chair
could determine who exactly is sitting in
it and adjust the driver presets
accordingly. That way, the need to input
whether you are driver number 1, 2 or 3, as in
today's car settings, is eliminated; the
chair simply knows based on you sitting in
it.
Similarly, if the driver is calibrated as a
shorter person, the chair can lower
so that the user will be able to have their
feet reach the floor and pedals without
having to do any manual adjustments.
This will make it much easier for shorter
people to utilize the foot pad for
moving their chair forward and backward.
This can be applied to taller people as
well, who may need the chair raised up
and moved backward in order to reach the
wheel and foot pad comfortably.
These presets can be extended past
just chair adjustments to mirror adjustment, temperature control, lighting control,
and even media/radio preferences. This
will ultimately make the car know who is
sitting in it without the user ever having to indicate who they are. The car will
become smart and be able to anticipate
the driver's every need.
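As a rough illustration of the calibration idea, driver recognition could be a nearest-profile match over the seat's force-sensor readings. The profile data, sensor count, and driver names below are invented for the sketch; a production system would need calibration and confidence thresholds:

```python
import math

# Hypothetical stored profiles: default seated force readings (N)
# from four seat sensors, captured when each known driver sits down.
PROFILES = {
    "driver_1": [220.0, 210.0, 180.0, 175.0],
    "driver_2": [310.0, 300.0, 260.0, 255.0],
}

def identify_driver(reading):
    """Return the stored profile whose readings are closest to the current ones."""
    return min(PROFILES, key=lambda name: math.dist(PROFILES[name], reading))
```

Once the driver is identified, the same lookup could index into seat height, mirror, and media presets.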
Reconfigurable Chair Locations
The team sees a potential for getting that
fourth chair back into the cabin space.
It is possible to put all of the chairs on a
railing system so that the chairs can move
around. If there are only two people in
the car, then the driver's seat can move to
the left and the back right seat can move
forward. That way, the two passengers
are seated next to each other. Similarly, if
there are four passengers, then the right
back seat can move forward, and another
seat can be unfolded from the cabin space
to replace the seat that was just moved
forward. We think that this will allow for
more mobility and flexibility for users in the
cases where this three-seat configuration
may not be optimal.


6.6 Personal Reflections

Winter Reflection

Sangram

went through some great prototypes which

Fall Reflection
I think registering for the ME310 sequence
before I graduate from Stanford was the
best decision throughout my stay here. It is
now that I know what I would have missed
if I had not taken 310 this year. Its been a
great learning experience for me and I am
enjoying working with my team at Stanford
and the global team as well. It is still amazing
how we managed to get so much done in
just over a month since serious work on the
project started. There were times when we
were completely lost and there were times
when we knew exactly what we wanted
to do. The former was required but was
completely outside my comfort zone. That is
one important thing I have learnt this quarter
- how to deal with total ambiguity.

This was an interesting quarter for me. We


gave us good insights in the design space. It
took long hours of work to get our prototypes
ready in time and functioning properly, but
our team effort was very good and we got
things done! The best part is that I am
enjoying working on this project so much that
I did not even realize how this quarter went
by. It was when I started considering other
things as being interruptions to 310 work
that I realized how important this project has
become to me. We have come up with a
very good direction for our project. I still keep
wondering if there is any other good idea
out there, which is why I wish we had done
a couple of more dark horse prototypes. I
also think that our communication with the
global team could have been better and we
as a team would have discussed not only

Everyone can manage to think about

the idea for the next prototype but how each

something to do for testing and benchmarking,

of the teams is planning to implement and

but I think the most important part and the

test that idea. In any case, I am excited for

one at which my team is very good at is

the Spring quarter and slightly scared after

sitting down and analyzing the observations

looking at how quickly the next deadlines

and getting tangible conclusions from the

are coming up. But I am very sure that we

tests that were done. I have found these

can get there. I am looking forward to a good

intense what-is-going-on discussions to

final brainstorm with our team in Finland

be the most useful for me as well as for

during Spring break and hope that we come

the team. Observing and thinking about

with a solution directly based off all the good

everyday situations and drawing parallels

insights we have over the last two quarters.

into the technology and situations that


will exist in the future is something which

Spring Relfection

will prove to be very useful while heading

I have a lot of mixed emotions when I

towards a final solution in spring. I think this

write this reflection. ME 310 has been

year team Audi has a great opportunity to

everything that I had hoped for and more. It

create and design for a unique experience

is remarkable what we have achieved as a

for futuristic cars and I am looking forward

team. The final product we have delivered

to getting back to work in January.

Final Documentation

115

speaks for itself but the journey is something

teammates who have become such good

which does not come across and is not so

friends that it is not easy to say goodbye to

apparent. It has been a tireless effort for

them as they leave one by one. It is time to

the whole year. The team went through ups

say goodbye to Stanford. It has been one

and downs and we survived it to probably

awesome ride and the AUDI project has

become one of the best functioning teams

been the best experience so far, learnings

in 310 this year. I think our Finland trip was

and moments that I will cherish for the rest

the game changer. We realized that more

of my life. It is not the end of 310, this is

than half of our issues were communication

just a new beginning for 7 people who I am

issues and we went from scattered thoughts

sure will carry forward this experience and

here and there to a strong unified vision and

achieve great things in life.

a solid plan as we got into the spring quarter.


I enjoyed the design and exploration phase

Stephanie

as it was something new to me and I really

Fall Reflection

learnt a lot of things in the first two quarters.

As fall quarter comes to a close, it is clear

Not to say that I did not learn anything in the


last quarter, I learnt a lot but I was totally
in my comfort zone in Spring. We have
ultimately implemented everything at a very
professional level and have finished what
we set out for. I wish we had the planned
our budget more properly so that we could
have directly invested at the beginning in

that being in a state ambiguity is a fact of


life in ME 310. Our team began extremely
enthusiastic about the prompt and utterly
overwhelmed by the amount of things we
knew nothing about. It has been difficult
thinking about designing more than 20 years
in the future. Deciding on what to assume
and how to even come to those conclusions
about the future has been a long process

I am enjoying
working on this project
so much that I did not
even realize how this
quarter went by.

involving long discussions and debates

the changes that we made towards the end.

communicating with the global team has

But it was all a learning experience and I am


glad I went through it.
That being said, it is now time to
say goodbye..to the loft, to our space, to my

within our Stanford team and with our global


team. I hope we will get the chance to do
more wild brainstorming next quarter, since
this quarter the assignments often times
seemed to hamper our ability to really get
creative.
Furthermore,

collaborating

and

presented many challenges. Scheduling


meeting times and working around 7
peoples schedules has been difficult,
though I think we have gotten better at it.
Also the collaboration problem, I believe has

116

Final Documentation

in large part been because of the difference

tasks so that things just get forgotten on our

in assignment due dates. For a large portion

to do lists. Generally, this quarter has been

of this quarter, the global teams assignments

a lot of fun and I cant wait to see how our

were due about a week after ours. This

project evolves over the next quarter.

made it difficult to brainstorm ideas together


and really talk to each other and learn

Winter Reflection

from each others assignments, in order to

There are many emotions that come to mind

better inform each teams understanding

when reflecting on this quarter: excitement,

of the design space. When both teams

frustration, confusion, clarity, and, well,

were together, the brainstorming sessions

exhaustion. We began unsure of what we

were fun, innovative, and inspiring. I really

were doing and what direction we were

appreciated everyones creative ideas and

headed, but after diving into Dark Horse

hope we will come together more often in

everything seemed to begin to make sense.

the coming months.

I was excited about what we were learning

Prototyping and making an experience


come to life is, for me, the most enjoyable
part of the process. I like being able to
touch and see an idea that was born from
our brainstorming, and I enjoy seeing how
other people react and respond to that

...this has been the


best course I have
taken so far

and the experiences we were creating and


testing. The time has gone by so quickly
and we never let up to catch our breath, but
I think that is what has made this quarter
so interesting. We were always working
our way through a maze of problems or
decisions that needed to be addressed.
The difficulties were in figuring out how to
communicate everything we were learning
with our global team. Despite more weekly
meetings and being in constant contact, we
seem to have moved from a unified vision
last quarter to a very disjointed and slightly
at odds vision for where to get next. I also

physical manifestation of the idea. Even


quick prototypes and experiments made our
work feel more real and tangible, and it took
us out of our own heads. Instead of getting
lost in a debate, we could not deny what we
were actually experiencing and seeing while
using the prototype or in the experiment.
I think both teams can improve how we
communicate and divide tasks, making
people at times individually responsible for

found it hard to get out of the technical


mindset of implementing our prototypes. It
was easy to be overcome and overwhelmed
by all of the technical components that
needed to get done and put off making sure
that the experience and needs for the user
are always at the forefront.

I am excited

to see where we are headed next and to


visit Finland to spend more time with our
global team.

Final Documentation

117

Spring Reflection

problem. Being able to design something

This quarter was certainly crazy, but it was

and physical show a final product at the end

nice to finally just get to focus on building out

of it all is a rewarding experience.

our ideas. As expected, we ran into many

The collaboration aspect with students from

challenges that we didnt anticipate, but I


think we handled it well. Everything seemed
to go wrong, but we kept going. I was super
impressive by everyones persistence. It
was also much easier to get things done
and make significant progress on our
project once the whole team was together.
We were able to divide and conquer. We
supported each other and made sure that if
someone needed help they got it. Overall,
I am extremely happy with how our project
turned out, despite the consistently long
nights. While not every part was designed or
made perfectly, the really challenging parts
we figured out and the entire experience we
hoped to deliver, we delivered. This class
has been a great exercise in communication,
balancing a tight budget and time schedule,
and designing something so far in the future
that the problem doesnt yet exist. I have
loved this class and I am happy to have met
and learned from my teammates.
David

another country and that have different


backgrounds has been an eye opening
experience. I have not collaborated with a
group from a different country and being able
to effectively and efficiently communicate
with each has been a struggle. A major
advantage with this collaboration is being
able to see different perspectives of the
same idea from mechanical engineers
and product design engineers have
opened up solutions that I probably would
not have noticed myself as an electrical
engineer. I have learned so much from my
team members about different areas of
engineering.
I believe that at the end of the course, I
will become a better engineer that has
experience and expertise in areas that I
probably would not have been involved in
outside of this design class. I am definitely
looking forward to the next two quarters and
being able to see all the hardwork and long
hours paid off from the final design built.

Fall Reflection
This design project has been challenging
and exciting. Since the project proposal is
set for the year 2035, being able to predict
and assume what the future will be like, in
terms of the automotive industry but also the
lifestyle of humankind, is extremely difficult.
I took this course to challenge me and be
able to apply my electrical engineering
and creative thinking to solve a wide-open

Winter Reflection
The winter quarter has been extremely
exciting. Since the fall critical function
prototype and working on the darkhorse
prototype, the direction and logic of going
from prototype to prototype has been well
defined. It has been great working on the

118

Final Documentation

prototypes although it has been very time

bonding, and set our sights to deliver a final

consuming and stressful at times. Being

design that has never been seen before,

able to see our hard work pay off and our

our team became stronger and supportive

concept starting to look more like a real

than ever. This class has taught me that

product gets me looking forward towards the

communication on all basis is one of the

spring quarter. There were many difficulties

keys to success regardless of how close or

with communication between our global

far away you are. Although I am ready to

partners but we have been more persistent

graduate from Stanford and excited to learn

in keeping each other in contact on a bi-

more from real world experiences at my

weekly basis or so. Our trip to Finland to

full time job, I will never forget what I have

converge and decide all the details on the

learned and experienced from ME310 and

final design will be challenging due to all

my teammates. My team is amazing and I

the opinions of the team members but I

am very glad that I was able to share this

am positive that we will work together and

experience with them. In the end, AudiEvolve

ultimately pull off a spectacular product to

was a success because of the hardwork,

show off at EXPE. I am more of a technical

dedicated,

person so I am definitely ready for the spring

attitudes of each of my teammates. We are

quarter to start to make our vision a reality.

AudiEvolve, transforming the journey into

supportive,

and

optimistic

the destination.
Spring Reflection
ME310 was an amazing experience. This

Goran

spring quarter was extremely intense but we

Fall Reflection

managed to complete all the goals we set. I

Before I took this course, I didnt know what

learned so much from my local teammates


as well as my global teammates. ME310
is a class that I have never experienced
before nor would I have experienced if I
did not come to Stanford. It is much better
than any typical electrical engineering class
that I have taken. I collaborated with some
of the smartest people in the world and I
was fortunate enough to learn valuable
skills from them, but also give them some
of my knowledge from past experiences.
Prior to the spring quarter, the Stanfords
team dynamics and communication with
the Aalto team was little to none. But once
we visited Finland, had some quality team

to expect. It was the first time for me to take


a course from another department and I
was really worried. This course is one year
long and if it doesnt meet my expectations,
I would be stuck there for the entire year.
Fortunately for me, so far the course was
amazing. I feel like I am learning a lot for
the first time since I came to Aalto and I
hope it will continue to be like that until the
end of the course. I have to mention that
I was positively surprised by the teaching
team, especially Harri. It seemed like he
never stops working. Whatever was the
problem, he solved it as soon as possible
and even showed personal initiative which

Final Documentation

119

is unfortunately very rare among professors. I think it is really interesting that the people who recently finished the course are the ones that are teaching you. Their knowledge is really fresh and it is very easy to talk to them.

The project we got is inspiring. Thinking that far ahead in the future makes you come up with different and often shocking concepts. It requires a lot of research in order to understand the future and how the world might evolve, but I have always liked watching scientific shows and science fiction movies, and I really enjoyed this part of the project. This course also teaches you to try and test. We did a lot of tests, but I still feel that we should have done more. By observing and testing we were able to identify some problems we had overlooked. It was a shame that to this date we hadn't tested any of our out-of-the-box ideas. We had a plan to test those ideas as soon as we came back from our winter break.

Working in an interdisciplinary team is a new experience as well. Sometimes it was hard to make a decision, but most of the time it was fun. Although my teammates are not from creative fields, I was surprised how fast they relaxed in brainstorming, and the number of interesting ideas we got in a short period of time surpassed all my previous brainstorming experiences.

As a conclusion, so far this course has been more than a positive experience. We still have a lot of work before the final presentations. I just hope that the reaction from our client at the end of the course will be positive as well.

Winter Reflection

Being a part of this course so far has been an interesting experience. I've definitely learned a lot, and I have to say that this has been the best course I have taken so far. Maybe the assignments that we are doing are not so new for me, but their number and intensity definitely are. Every week I think that after these few days I can take a small break, but it just never comes. The winter quarter has gone by very quickly. It has been a very stressful period. Through almost the entire period we didn't know what to do. It is still very confusing. The end is approaching and decisions need to be made, but we are still struggling with our focus and the problems we are solving. I wouldn't want to deliver something at the end that doesn't satisfy me. The course takes eight months and it would be a waste if I can't put the project in my portfolio later. That means the concept needs to be strong, and the visuals as well. I realized through the course that if we do not have enough time at the end, which might happen, the design is going to suffer. The concept needs to work; that is what is required from us. It also needs to look appealing at the end, but the number of engineers and designers in the group is so unbalanced that it will be a struggle to convince people of the importance of aesthetics when we approach the final deadline. I am really scared that I might not be able to show this project to future employers at the end.

Teamwork improved during this period. We were equally lost when it came to the concepts and ideas. That is maybe the reason why we stopped discussing and arguing so much. It looks like we are all scared about the final outcome of the course. We now have much better communication with our global team as well, but it still feels that we are competing in a way. I know that will improve during the spring. I am looking forward to the end of the course. Not because I will have more free time, but because I can't wait to see our final result. I hope it will make me proud and that my final reflection will be filled with satisfaction.

Spring Reflection

We knew that we needed to meet soon in order to clear up all the differences between our two subteams, and the Stanford students' visit to Finland helped us become one team and clarify our main goal and vision. It was a very intense discussion and we had a lot of disagreements, but in the end we found a compromise and we all felt excited about the final two months.

The final two months were very hard. We worked a lot to fulfill our vision, but everything seemed not to go according to our plan. Simply put, whatever could have failed, failed. At some point it looked almost impossible to pull this off. In the end we had to make some cuts, for example removing the windshield from our plan, but we managed to make all the other components. My task in the final prototype was making the cabin space, which included the steering wheel, the dashboard and the ribbon. The dashboard itself took more than two weeks to build. I believe we took the worst possible way to make it, but in the end it saved us a lot of money, which we used to outsource the cutting of the acrylic pieces.

I believe this course is the most intensive, but at the same time the best course I have ever taken. I strongly believe I have learned a lot during these eight months, especially in the communication sense. I also started doing more fast prototypes, and for a designer that is a very valuable habit. Honestly, in the design part we could have done more. Some of the parts were really not on the level expected of design professionals, but in the given time frame and at the scale of our project, maybe that was the maximum we could have done.

To end, I would like to thank everyone in the teaching team for making this course so amazing. I know they made it the way it was, and I really wish that in the future the course will keep its culture and cult status. Also many thanks to the DF staff for helping us numerous times. I am really proud to have been a part of this great group of people. By that I mean everybody involved in the course and DF. I know that wherever I proceed in my life, I will always come back to this year as one of the best and most fun years of my life. Now it is time for something new. Over and out!

Sifo

Fall Reflection

The Fall period is all about dealing with ambiguity. Our project is solving a problem 20 years from now. I feel we were walking in a dark tunnel, expecting to see a dim light from far away. It's not as bad as it sounds. In

fact, once we started to observe real life, to dig into current problems, and to embrace new concepts, we felt we would become successful future fortune tellers.

As the project goes on, I have gotten my two biggest learnings. The first is that we should prototype more rather than spend time discussing whether an idea is good or not. I enjoy working with this team. My favorite part is brainstorming, when crazy, funky, weird ideas are generated by people coming from 4 different majors and 5 different countries. Appreciating each other's ideas and building ideas upon each other's is definitely a good learning process, and it's difficult. We started out fighting over which idea we should test out, which hindered our creativity and efficiency. The second learning is constantly asking myself what the reason is behind the supposedly obvious, to dig out the user's real need. I didn't realize fun driving also means letting your car reach a challenging state that you can never achieve in real life. I didn't know that separating steering control and turning control into different devices is actually easier than combining them in one. Thanks to the project, I paid more attention to daily routines and details. The result is beneficial.

I wish that in the next period communication between the two sub-teams can be more frequent and collaboration can be more productive. I wish we can test out more crazy ideas and leave no regrets before the project ends.

Winter Reflection

The winter period is all about learning. Starting from learning Arduino for the robot, I found coding extremely interesting. The movement of the servo gave me immediate feedback on whether my code was correct or wrong. It was my first time coding for electronics and plugging sensors, servos and LEDs into a Teensy board. Connecting all of them in 10 days was challenging for an amateur like me. Thanks to the help from the Design Factory staff, my learning curve was shortened. The lesson from this: sometimes we need to think big, with a can-do spirit.

In winter I found that travel brought the team together. When we were in Germany, the team helped each other in trivial things. We had to spend all our time together, so we got time to discuss things that had never been discussed before. Running to catch the last-minute morning train, putting on earphones whenever one was in the toilet, de-briefing in the noisy metro, being amazed by the robots and becoming speechless in the Audi factory, sitting anxiously to express what we had been doing to the Audi faculty: these are valuable moments that made the team more gathered and more open to a bold vision.

In winter I noticed communication is the key. Especially between two global teams, we need to spend more time on it. I noticed our past communication strategy (a hangout once a week and a one-on-one session once a week) is not enough. There are still things not agreed on by both teams. There are conflicts our two teams haven't gone through. There are important decisions we haven't made together. Brainstorming through Google Hangout was not engaging. When it comes to disagreement, no communication tool other than face to face can fully handle the friction between two teams. I look forward to having the Stanford team in Helsinki. I wish we'll talk openly and come up with something awesome.

Spring Reflection

ME310 gives me the best learning experience ever. Looking at our final product, I'm extremely proud of our team. It looks just like the rendering we had one month ago. To be honest, although we aimed high, I couldn't believe we could realize it in a generally perfect way.

The past month was so far the busiest time of my life. We worked around the clock to get things crossed off the to-do list. The teamwork with the Stanford students was pleasant. We supported whoever needed help. Although we have two separate products, the integration work was done very well. The cabin space creation process was full of memories. Although it was not a critical priority, it definitely was a critical nice-to-have and affected the whole EXPE experience. We experienced failure with the chair rotation, frustration when the dashboard cracked in the painting stage, and desperation when the ribbon completely broke and we tried almost every possible way to bend it. It seems that we had all possible failures in the last month, but the important thing is, we overcame all of them. I'm glad that we never gave up being persistent and creative.

310 is definitely my favorite course and I will remember the project, the people, the awesomeness forever. I'm grateful for the so much help received from our teaching team and the Defa people. As a business student I'm probably the person who learned the most on this team. Trying to find my own role in this extremely technical project was quite a journey. The largest learning goes to cross-functional communication and multicultural communication. I enjoyed realizing ideas with fast prototypes, and I am proud of getting my hands dirty and being a workaholic. If I have to name wishes, I wish we could have had more user testing and more realistic planning of time from a design perspective. Thank you ME310! I'm ready to continue the awesomeness.

Tommi

Fall Reflection

This project has been quite different from the paper bike challenge. This is mainly because of the vague brief and its million opportunities to go for. Organizing ourselves took a lot of time in the beginning and there is still something that could be done better, for example setting up internal deadlines and planning them more precisely beforehand. This is quite different from work life, since there is no individual who organizes and is responsible for all the actions. Getting work done efficiently on a global scale is still a puzzle to solve.

So far the project has taught me mainly social skills and teamology. Taking into account the cultural differences and personal preferences has become a bigger part of the picture in team working. This is quite

different from the previous studies that involve only substance learning like math and physics. These kinds of social skills are essential in real work life situations, it is good to practice them in different situations, and this kind of project-based course with a diverse team is an excellent opportunity for that.

This project has also made me put a lot of thought into future technology and the future world, and I have a feeling it is really going to be quite different from today. But before really digging into it, I had never thought about it. When it comes to autonomous cars, I had never even thought about that possibility, even though I am a hard-core car enthusiast (maybe that is also the reason for that).

When it comes to this documentation, I really think that the content could be somewhat different from the final documentation, especially the design specification/description part. Since we do not really have a concept to go for, why do we have to artificially make something for that section? Critical prototypes are only testing some of the functions or some experiences, and therefore they are not even close to the initial prototype.

Winter Reflection

This winter period has been even more chaotic than the fall period, at least I feel so. We have been using an increased amount of time debating philosophical concepts without actually putting them into practice. All the ideas seem to be easy to put into practice and not different and innovative enough. Therefore we end up with quick-and-dirty prototypes basically every time. The latest prototype was a good example that proved that even the simplest idea takes quite a lot of time to actually put into practice, and there will most likely be some unexpected obstacles along the way.

I personally like the vision we have at the moment, even though it still needs some polishing. The anticipatory sensing chair is a good concept, but my opinion is that it needs a proper story around it. Freedom within the cabin space does not add any extra value for the user if there is no reason to change position.

Team dynamics seem to have some issues locally and globally. Somehow local working seems to be highly inefficient and I do not know the reason why. Globally we have some communication issues and difficulties agreeing on a common decision. We have been trying to solve our local issues, but somehow it just does not seem to ease up. As for the global problems, I think our time together in Finland is going to be the cure for that.

Spring Reflection

This period was absolutely the best! Our concept started to make sense and we were able to form a solid common opinion about our project's framework and what we are about to make. The turning point of our project was when the Stanford students were able to come to Finland. Then we finally realized what we would do; even though it meant skipping our spring break, it was totally worth it. The next big event was our trip to Stanford! Then all the actual execution happened! Days were sick-long, but hey, we made it! The final outcome was extremely awesome and it was so nice to hear all the good feedback.

All in all, ME310 was the turning point of my life. This was easily the most demanding school challenge, and maybe the most demanding challenge in my life! It taught me a lot about human interaction, attitudes and ways of thinking that normal school courses cannot teach. All I wonder now is how my last school year will go, since it will be dramatically different from this one.

In the end I want to thank my awesome teammates, our TAs, our coaches, our liaisons, and the DF staff; without you, the magic would not have happened! You made my year the unique ME310 experience that everybody was blabbering about in the beginning. Thank you once again!

Heikki

Fall Reflection

I have always been good at time management and figuring out how to do things. Now, it has all changed! I keep slipping off my deadlines and coming to meetings unprepared. Work is not an excuse, but it makes things harder. I am really looking forward to the next year, when I will have fewer things to do and more time to focus on ME310.

The greatest learning has anyway been in communications. There have been several misunderstandings during the course, and I have found myself frustrated but also surprised about the way our team has communicated globally. Skype and Google+ with a 10-hour time difference is certainly a challenging setting. It is easy to forget that if you do not have the physical presence of your colleagues, the electrical representation is everything the counterparts sense of you. That is why everybody should feed what they are doing to the other members of the team all the time. Trust comes with understanding, and understanding starts from the knowledge you share. I have not managed to do this by myself either. On the other hand, I tried out how it works if I feed information about our tests to our local ME310 Facebook group, and found a significant difference. Generally in our Loft there is a certain feeling that the Audi team is doing a lot of work. One piece of this reputation might have been earned by active communication. I wonder if we should take this into account and start doing it within our global team too, both ways? Once we had a small reflective meeting together in our local group and it felt really good to actually hear what people think in a peaceful environment. I always wonder what other people think in order to adapt myself for the best of the group. I really wish that after the project we can say that we have been one united team that succeeded because we were bold in communicating.

To get information on how people behave in certain situations, observation is the most usable tool. We tried not only to focus on superficial behavior, but to find behavior that repeats itself and that differs from traditional behavior.

Winter Reflection

In the winter quarter we reached our first global communication issues. It felt that earlier every problem was something lighter and easy to solve by conversing. At the end of this quarter we found two separate groups striving for separate goals. I was acting as the person responsible for global communications, and there I was able to understand more of where the other part of the team stands. After switching the responsibility I was more and more lost, and it felt that less communication effort was made. We also started using the WhatsApp program on our mobile phones, which had a positive effect on our group work. We were able to see with less latency what was happening on the other side of the world, with pictures and videos.

In our local communication there was less organization. There was no sufficient driver to make sure we got anything done, especially when Tommi was away. No to-do lists were made nor followed.

In the end I was happy to finally contribute to a technical prototype and to build and code one fully functioning system. I think it did our team good to see how much effort and detail goes into building one single prototype until the end.

I am still not happy with our last concept idea. It feels that it is not ambitious enough and does not cause a wow effect every time. I am really looking forward to getting the other half of the team to Finland in order to discuss and be on the same page about the project. I also wish we could build our prototype in a moving car. Maybe there is too little time. I hope I am wrong.

I have not heard from our corporate liaison for several weeks now. There is definitely room for improvement. We will also see our Europe liaison during the spring break and then will be able to discuss our ideas for the final prototype further. Hopefully we will get a new start after the Stanford visit and will be able to nail the final prototype concept during the visit.
Spring Reflection
As anticipated, hard work before and especially during the Stanford visit paid its dividends. We came together, discussed through our vision and felt united again. Only the levels of expectations were a bit different among people. All in all, it was clear that there was much to do before the whole prototype was to be ready for the Design EXPE at Stanford. After the Stanford team left, we knew what we wanted to achieve; we started exploring different ways and ended up trying the Magneto concept and the retraction mechanism in parallel. In the end we found out that a compromise between these two would suit our overall concept and story best.

I was responsible for making the interaction happen in the final prototype. That meant I had to suggest and figure out what a suitable level of execution of the working, entertaining and future interaction would be for the project, but also then make it happen. I like challenges, and not having that much experience in coding and inter-machine communications, I started to take a look into things I would have liked to learn but never had a legitimate reason to dig into. I don't mind hard work; I have done quite a bit of physical and boring stuff, which is not a challenge anymore, and now I very much enjoyed working under pressure without any clue whether I was going to succeed with the tasks at all. All the thousands of lines of googling and the aha-moments will stay in my mind.

I was very much looking forward to coming to California and enjoying being responsible for only one thing at a time. Before flying over, at the same time as ME310, I was spending nights at my work while carrying part of the responsibility for our team's performance. It was not fun at all. I have learned that two major things I care about seem to be the maximum I can have on top of each other.

7. Appendix


7.1 Initial Brief

The Volkswagen Group of America ME310 project for the 2012-2013 academic year is on the topic of autonomous driving and the interaction between autonomous car and human driver. This project will be completed in collaboration with Aalto University in Helsinki, Finland.

"It is the year 2035 and autonomous passenger vehicles can provide drivers with increased free time and a safer driving experience while still providing the option of manual control if the driver desires. Create a cabin that is open and clear in order to provide entertainment or working space for the driver during autonomous control and still ensures the driver trusts the car's autonomous functions, yet can also provide a means for controlling the car if the driver chooses to take manual control."

The outcome of this project should manifest itself as a physical prototype.

Corporate Liaison
Trevor Shannon
Volkswagen Electronics Research Lab
Engineer, Multimedia Applications Team
c: 408.380.8160
o: 650.8?6.4063
e: trevor.shannon@vw.com
irl: 500 Clipper Drive, Belmont, CA


7.2 Fall Brochure

Project Background
Audi is at the forefront of innovation in automotive technology and is dedicated to providing customers with elegant, sophisticated solutions. Autonomous vehicles (AVs) will be relatively common by the year 2035, and Audi envisions that car design will focus not only on the driving experience, but also on the riding experience. Important areas in designing driving spaces for AVs include how people will interact with an AV and perform activities other than driving. Even though future users will be comfortable with riding in an AV, there will be times when the user will want to take control just to enjoy the pleasure of driving.

Design Team

Stanford University
Sangram Patil
Stephanie Tomasetta
David Wang

Aalto University
Goran Bjelajac
Sifo Luo
Heikki Sjöman
Tommi Tuulenmäki

Vision
The design team envisions a future where people will want to
regain time lost from commuting to locations where they would be
productive. The team goal is to create adaptable cabin spaces
suitable for many activities, in order to transform the journey into
the destination.
Team VW-Audi envisions driving being a secondary activity that is
performed as frequently as all the other desired activities, such as
working, relaxing, socializing, and interacting with multi-media. An
important aspect of this design would be a seamless transition
between any of these activities. The VW-Audi team will aim to
maintain the pleasure of driving while providing comfort, safety
and an easy transition that increases situational awareness.

Main Assumptions about 2035

- Users will already trust and be comfortable with the autonomous vehicle technology in a normal riding scenario.
- Driving will not be turned over to the human driver in emergency situations. The car will safely come to a complete stop in those situations.
- There will be car-to-car and car-to-infrastructure communication.

Corporate Liaison
Trevor Shannon

Coach
Jeremy Dabrowiak

Left: User testing with different steering controllers. Middle: Transition Experience Prototype that utilizes a
guided matching task for gradual transfer of control. Right: Cabin Space Prototype of moving chairs.

Critical Function & Experience Prototypes


The team focused their efforts on three main areas: steering, transitioning to driving, and cabin space. The Aalto team created multiple prototypes to test how effective and intuitive various steering controllers are for the driver. The Stanford team focused on creating an experience for gradual or direct transitioning to manual driving using a display that prompts the driver to mimic the control actions of the AV. The intent was to increase the driver's situational awareness of the driving scenario they are entering after being occupied by another activity. The team also tested a new concept for cabin space interiors that will be better suited to a wide variety of activities by having reconfigurable chair locations.

Key Findings
- Visual representation of interactive information is too intrusive and distracting during driving, even if the visuals are projected into the driving field of view.
- People are better at doing multiple tasks at a time if the sources of information needed to perform the tasks are cohesive or the same.
- When prompted to mimic the control actions of an AV, users didn't feel like they were driving.

Design Requirements

Functional
- Seamless transition between any activities
- Non-intrusive interface to the current activity
- Keep the user aware of the surrounding environment and the AV's actions
- The control input from the user should maintain the pleasure of driving and be utilized as the interaction input

Physical
- The cabin has to be open and have an organized interior
- Personalization of the cabin space to each user
- Adaptable cabin space for doing user-desired activities

Opportunities
- Redefine the business model to one that is perhaps subscription based
- Leverage the social impact of self-aware cars

Design Strategy
- Explore user interaction in the context of cabin space transitioning
- Better define and categorize activities that would likely be performed in an AV
- Identify emerging technologies to integrate into the cabin space based on defined activities

7.3 Needfinding

To get information on how people behave in certain situations, observation is the most usable tool. We tried not only to focus on superficial behavior, but to find behavior that repeats itself and that differs from traditional behavior.

Cafeteria observations
As a part of our observation exercise at Stanford University, we went to the local cafeteria and observed people working in a public space while eating and having drinks at the same time. The reason behind it was to learn about people's working habits in a relaxed atmosphere. (Figure 69)

Observations
- Once seated, people rarely change their initial position, even though their seating position is very uncomfortable. (Figure 14: Person in uncomfortable position)
- People don't think about their comfort in advance. They put their bags on the table before they sit, and then once seated, they adjust to the situation. (They worked around the bag on the table instead of moving it to the chair next to them.)
- People on more comfortable chairs wanted to make themselves even more comfortable. They placed their legs on the small table and placed laptops on their laps.

Conclusions / Lessons Learned
- The interior can dictate user behavior: if the seat looks comfortable, users try to make themselves more comfortable. In uncomfortable chairs, users don't care about their comfort.
- We can manipulate users and predict their behavior.
- Sometimes, users don't behave in a logical manner.
- If we want people to sit comfortably, then our design must express comfort.

7.4 Benchmarking
Gestural steering
An Xbox Kinect Dance Revolution game
was played to learn what it would feel like to
drive using gestural commands. The setup
involved a TV, an Xbox, and the Kinect
sensor module. The sensor module had to
be placed and leveled in the correct position
to be able to capture all of the movements
of the dancer. The distance of the sensor
was also important to be able to capture
movements from head to toe. Several
dance songs were played and points were
evaluated to see how well the dancer mimicked the animated dancer's motions.
Mind Control
The prototype was set up to play two different games. One game had two users compete against each other to move the ball towards the opponent's side. The other game had one user raise or lower the ball through different ball obstacles along the track. Buttons on the game console manually controlled the forward movement of the ball.
Voice Command Steering
Several different paths were taken within the
empty parking lot to exhibit a more random
driving pattern to the driver. It was essential
that the driver did not know exactly where
they were even though they were able to see
the course prior to driving. Different degrees
of voice commands were given, including "hard left turn," "straight," "slightly right," etc.


Motion Sickness
The experiment setup involved a car in which the back seat windows were covered with black garbage bags so that the passenger in the back could not see outside. Garbage bags were also used to
partition the front section of the car from
the back section of the car. The passenger
was reading during the tests. The driver
randomly drove to different parts of the
campus with unexpected turns and car
movements/actions.
CNC Machine Operation
The experiment took place in the Stanford Machine Workshop. The team observed a student who was using the CNC machines for a project in his class/lab section. The team first observed the student's actions with the machine without disturbing him, and then approached him and asked directly what exactly he was doing and why.
Racing Game Player-to-Player Transition
The experiment involved an Xbox game station with the Forza Motorsport racing game. One user handled the controls at first, and then the second user stepped in after the first user relinquished control. Several transitions were tested, including during turning sections, straightaway sections, and within busy, clustered car sections. The case where the second user took over control without seeing what was happening beforehand was also tested.


Haptic Command Steering

The setup was the same as for the voice command steering.

Confirmation Cue Light Indicator

The experiment used the same setup as the motion sickness benchmarking, with the addition of an indicator console, shown in the figure below, which allowed the riders in the front seat to shine a light at the indications of the car's actions. The passenger in the backseat only had a view of the indicator and was reading during the tests.
The first experiment tested advanced notification of the car turning LEFT, STRAIGHT, and RIGHT. The second experiment tested advanced notification of the car approaching something out of the ordinary: SPEED BUMP, SHARP TURN, or SUDDEN BRAKING.
(Figure 70: Light Indicator Console)

Confirmation Cue Voice Indicator

The experiment used the same setup as the motion sickness benchmark, similar to the light indicator benchmark but without the indicator console. The driver called out voice commands prior to the car's actions while driving randomly to different parts of the Stanford campus.

Workspace Reconfigurable Set Up

The experiment setup involved a car
dashboard, a TV that served as an interactive
windshield, a movable chair on a roller
board, and a cabin space table. A looping
driving scene on the TV simulated that the
user was driving; the user could then initiate
the transition into the workspace with the
click of a switch whenever desired, and could
perform any activities within the space provided.

iOnRoad Application Android

Once a smartphone running the app is mounted on a car's windshield or dashboard, iOnRoad combines the visual information collected by the smartphone's camera with GPS and accelerometer data to provide information about the road ahead on the smartphone's display. The vehicle in the lane ahead of the driver is displayed and marked with a time gap in seconds, indicating how far behind the lead vehicle the driver is and how much time there would be to react in an emergency. Additionally, the road between the iOnRoad-equipped vehicle and the car ahead is marked with a colored overlay that goes from green for a safe following distance to yellow to red when you are following too closely. In the event that the driver's attention lapses and the vehicle ahead stops or slows suddenly, iOnRoad will flash a full-screen alert and audio warning to grab the driver's attention and attempt to avert an accident. iOnRoad can also be set to run in the background, leaving the screen free for navigation while it continues to keep an eye on the road (get it?) for potential accidents.
http://reviews.cnet.com/8301-13746_7-20067732-48.html
(Figure 71: iOnRoad iPhone application)

Observations
- Once you are in a more dangerous zone, a big yellow warning takes over the display. A display change that dramatic is distracting.
- It does not show your progress in correcting that problem.
- Again, the overlay on the image of the environment is very intuitive to understand.
- It is nice that it can run behind other apps so that you only get warnings.
- Must be mounted to the windshield.

Conclusions
- Knowing not only the warning, but also when you have remedied the problem, is important.

BMW NOTES ABOUT AUTONOMOUS DRIVING

Track Trainer: 2007 328i BMW (estimated worth $1 mil)
- Races racetracks autonomously
- Trains racers the lines and path of proper racetrack driving
- Technology in it:
  - Differential GPS: augmented for ionosphere errors (caused by clouds, etc.) which cause it to read lengths further away than they are; good at long term data sets
  - Optical lane detection: distance from the edge of the track; camera based
  - Digital map: used for lane detection and gives another reference for localization
  - IMU & vehicle sensors: gyroscopes; the IMU is bad for the racetrack because it is not precise enough, but good for short term data sets

The Car Show: Man vs. Computer Video (watch the video here: http://www.hulu.com/watch/273766)
- Track Trainer driven autonomously and by a human
- "I'm not driving... this is sweet."
- A little disconcerting
- Save on gas because you can have lighter cars that are electronically accident proof

A9 Autobahn: 5 series with sensors
- Technology in it: controller from the Track Trainer with no preprogrammed trajectory; localization; planning; perception
- Highway scenario ideal because:
  - You don't want to drive the 4 hours in the middle of the drive from SF to LA
  - Highways are pretty controlled
  - Long strips of driving
  - No random variables like kids playing in the street

Traffic Jam Assistant BMW i3 concept
- Adaptive cruise control & steering assistance
- Adaptive cruise control uses radar; input a max speed and following distance (longitudinal control)
- Steering assistance: lane following if there are lines; keeps the car in the lane (lateral control)
(Figure 72: Driver's assistance BMW presentation)

Emergency Stop Assistant
- Incapacitated driver detected (drowsy, heart attack)
- Vehicle takes over
- Safely moves over and stops in a secure location
- Notifies emergency services

Other Assistive Technologies:
- Adaptive Brake Assistant
- Night Vision with Pedestrian Detection
- Traffic Jam Assistant
- Remote Control Parking
- Narrow Passage Assistant
- Lateral Collision Avoidance
- Active Hazard Braking
- Camera Based Pedestrian Protection

Bigger Picture -> THE FUTURE OF HIGHLY AUTOMATED DRIVING
- Efficiency
  - Time: coordinated systems needed to remove traffic (i.e. short following distances)
  - Fuel: predict when to brake and avoid those situations
  - Space: platooning saves drag
- Convenience and Experience
  - Reduce fatigue
  - Drive only when you want
  - Do something else
- Safety (they feel they must justify these features in terms of efficiency and convenience because consumers won't buy safety features despite wanting a safe car)
  - Reduce the frequency of accidents
  - Reduce the severity of accidents

Challenges:
- Legal issues
  - Relevant research done by a Stanford legal fellow about autonomous driving laws
  - Policy spearheaded by Google
  - Nevada has the strongest policy enacted
  - Florida is providing liability protection to car companies
  (Figure 73: Legal Development of Autonomous Vehicles)
- Driver interaction
  - Complex interaction systems
  - How to take over at high speeds: it takes time to become aware enough to take over; is it possible to train people to do this?
  - Two types of takeovers: planned and unplanned failure mode
  - There must be enough time and cues for the driver to reengage
  (Figure 74: Challenges of Autonomous Vehicles)
- Multi-vehicle and infrastructure interaction
  - Mixed mode of a few autonomous cars and many human drivers
  - Pedestrian crossing the street example: the pedestrian makes eye contact with the driver to make sure they are going to stop. What happens when there is no driver and the car is autonomous?
  - Research has been done on how to give cues back to humans about autonomous mode; headlights that turn with the steering have been suggested

Research Sensors
- Made by VELODYNE
- Laser scanners
- 64 emitters/receivers
- Spin to get a sweeping 360 degree detection
- Data volume: 1.3 million points/second. How do you handle that much data?

Production Sensors (slides on the following page)

Elevator example
- Elevators used to have an elevator man who received calls, pulled the crank, picked you up, and took you to your floor. Now elevators are automated and you barely notice.
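The data-volume question raised for the Velodyne sensor (1.3 million points per second) is commonly answered by downsampling the point cloud before further processing. The sketch below shows one standard technique, voxel-grid downsampling, which keeps one averaged point per cubic cell; the 0.2 m cell size and the toy cloud are assumptions for illustration, not BMW's or Velodyne's parameters.

```python
# Voxel-grid downsampling: bucket points into cubic cells and keep the
# centroid of each cell. Cell size (0.2 m) is an assumed value.
from collections import defaultdict

def voxel_downsample(points, cell=0.2):
    """Average all (x, y, z) points that fall into the same voxel."""
    buckets = defaultdict(list)
    for x, y, z in points:
        buckets[(int(x // cell), int(y // cell), int(z // cell))].append((x, y, z))
    # centroid of each occupied voxel
    return [tuple(sum(c) / len(c) for c in zip(*pts)) for pts in buckets.values()]

cloud = [(0.01, 0.02, 0.0), (0.03, 0.01, 0.0), (1.0, 1.0, 1.0)]
print(len(voxel_downsample(cloud)))  # the two nearby points share a voxel
```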

7.5 Technical Literature Benchmarking

The team's research and benchmarking
efforts for existing technologies were
focused on the future of mobility, existing
autonomous and concept cars, situational
awareness, steering and control, trust, and
driver psychology. A great deal of research has
been carried out in the areas of situational
awareness and trust related to driving cars
as well as aircraft. The team also explored
research articles relevant to these areas,
and the key insights are outlined in the
discussion below.

(Production Sensor Slides)

Future of mobility

(1) Personal mobility
The team looked at existing and future technological solutions for personal mobility like the project PUMA (Personal Urban Mobility and Accessibility), which is a collaboration between GM and Segway. (Reference: http://www.segway.com/puma/)
- It values less over more: taking up less space, using less energy, produced more efficiently with fewer parts, creating fewer emissions during production and operation
- The elegance and maneuverability of dynamic stabilization combined with proven battery, sensing, and controls technologies come together to solve real transportation challenges

(2) Zipcar model versus Uber model for car sharing
- These business models lead to the concept of collaborative consumption. (Reference: http://www.collaborativeconsumption.com/the-movement/)
- Collaborative consumption describes the rapid explosion in traditional sharing, bartering, lending, trading, renting, gifting, and swapping, reinvented through network technologies on a scale and in ways never possible before.
- It would be a new era marked by TRUST between strangers and ACCESS instead of ownership.

Past Attempts at Autonomous Driving
- PATH project, 1990s
- Autonomous platooning
- Didn't go anywhere because it relied on infrastructure changes
- It worked by having road magnets embedded in the road every few feet to keep cars in the lanes and provide feedback
- IMPORTANT ASSUMPTION: you can assume government support, and you must work with infrastructure for human drivers

(Stanley (a mod. VW), winner of DARPA's Grand Challenge)
(Boss (a mod. GM), winner of DARPA's Urban Challenge)

Existing autonomous cars and driver assistance technologies

1. DARPA challenge
Stanley is actuated via a drive-by-wire
system developed by Volkswagen of
America's Electronic Research Lab. The
vehicle incorporates measurements from
GPS, a 6DOF inertial measurement unit,
and wheel speed for pose estimation.
While the vehicle is in motion, the
environment is perceived through four
laser range finders, a radar system, a
stereo camera pair, and a monocular
vision system.

2. Nissan Steer by Wire technology brings car design closer to autonomous:
http://auto.dohax.com/nissans-steerby-wire-system-brings-us-closer-toautonomouscars/
As the car encounters obstacles (another
vehicle, a pedestrian), the car steers
itself to safety and stops. This technology
has already been developed, so it is
safe to assume that cars of the future will
have a built-in emergency stop feature,
and we only need to consider transitions
during normal driving situations or when it
is safe.
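The emergency-stop behavior described here can be sketched as a small decision rule; a minimal sketch under invented assumptions (distance to the obstacle, whether the adjacent gaps are clear, a 15 m threshold, and the action names), not Nissan's actual control logic:

```python
# Toy decision rule: if an obstacle is close, steer toward a clear gap
# and stop; with no clear gap, brake in a straight line. All thresholds
# and action names are invented for illustration.
def emergency_action(obstacle_dist_m, left_clear, right_clear, stop_dist_m=15.0):
    if obstacle_dist_m > stop_dist_m:
        return "continue"             # nothing close enough to react to
    if right_clear:                   # prefer pulling over to the roadside
        return "steer_right_and_stop"
    if left_clear:
        return "steer_left_and_stop"
    return "brake_hard"               # no clear gap: straight-line braking

print(emergency_action(40.0, False, False))  # continue
print(emergency_action(10.0, True, True))    # steer_right_and_stop
```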


(BMW is also working on a new technology that detects health conditions and incapacitated drivers and stops the car safely: it pulls over to the side and calls for help.)

(3) Chris Gerdes - TED Video on autonomous cars:
http://www.youtube.com/watch?v=q1sk47FLAmg&feature=player_embedded
Gerdes believes that the optimal autonomous car tech will not necessarily replace humans, but should instead act as our coach. He believes that the optimal car will combine technology with human intuition and reflexes.

Concept Cars

1. PAT Autonomous Vehicle [1]
We can see that the vehicle itself has been restructured to be like a clear cabin, with the walls themselves being interactive interfaces.

2. VW hover car [2]
The car shape is more like a pod, and the steering is a car-shaped joystick. The controller is thus more intuitive, and the car responds exactly how the user moves the controller.
(VW Hover Car)

Trust

Reference paper [3]: The effects of adverse condition warning system characteristics on driver performance: an investigation of alarm signal type and threshold level
Description: The study addresses issues concerning the design of adverse condition warning systems (ACWS), which warn drivers about adverse road and weather conditions, or even system conditions that can lead to skids or accidents.
Relevant ideas and conclusions:
Results of the study indicate that drivers respond better to a low-sensitivity, graded alarm signal condition compared to other alert configurations. Applied to this project, it means that if warnings or information are too sensitive and displayed too often, users will not trust the system as much and may even get irritated by it. A graded alarm system goes from a low-level alarm well before the event to a high-level alarm when the event is near in time.
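The graded alarm scheme from reference [3] can be sketched as a simple mapping from time-to-event to alarm level. The 10 s and 4 s boundaries below are assumed values for illustration, not taken from the study.

```python
# Graded alarm sketch: the alarm level rises as the event draws nearer
# in time. Boundary values (10 s, 4 s) are assumptions.
def alarm_level(time_to_event_s):
    if time_to_event_s > 10.0:
        return "none"   # too early to warn
    if time_to_event_s > 4.0:
        return "low"    # early, unobtrusive advisory
    return "high"       # event is near in time: urgent warning

print([alarm_level(t) for t in (15.0, 8.0, 2.0)])  # none, low, high
```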


Situational Awareness:

1. FORD SYNC (similar to Kia UVO (short for "your voice")), powered by Microsoft: [4]
- SYNC lets you use your phone, browse and choose music, and find your way to just about anywhere, all while keeping your eyes on the road and hands on the wheel.
- Features: Calling, Entertainment, Navigation, Assist, and Vehicle Health Report with just voice commands.
- Minimal interface and minimal buttons to distract you.

2. GM's OnStar 4G LTE dashboard: [6]
- Future of automobile safety, security and infotainment
- Using your smartphone to monitor your vehicle in real time through a series of cameras mounted on the car
- Immediate notification of any impact/damage transmitted directly to your smartphone
- Video tutorials combined with text based info (everything on the touchscreen in the car) show how to use the different controls (like the past Audi project had info buttons)
- Voice command navigation
- Navigation system that connects with traffic cameras, allowing you to see the state of the traffic on the route in real time

3. Reference paper [7]: An empirical study of auditory warnings in aircraft
Relevant ideas and conclusions:
- Pilots prefer visual over auditory warnings when there is enough time to react (Stokes et al. 1990). However, if the visual channel is overloaded there are obvious advantages in allocating some tasks to other sensory channels.
- Advantages of auditory information:
  a) Can be received regardless of the head position and orientation of the user
  b) Response time to an acoustic signal can be shorter than for a visual one (Kramer, 1994)
  c) Loud enough auditory warnings attract attention regardless of the task in which the user is engaged at the time
  d) The user might miss visual cues in high workload environments (Edworthy, 1994)
- Disadvantages of auditory information:
  a) If the signal is too loud and harsh, it can lead to distraction
  b) If two alarms or signals go off at the same time, that would be problematic to convey
- Gaver (1986) has proposed the use of auditory icons. They are caricatures of natural sounds that describe the essential feature of the event that they portray. For events that do not have natural sounds, earcons have been proposed, which are essentially certain sounds that users learn to associate with the corresponding events.
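The modality guidance above (visual warnings when there is time and the visual channel is free, auditory otherwise) can be condensed into a toy selection rule. The 5-second and 0.7 load thresholds are invented for illustration, not taken from the cited studies.

```python
# Toy condensation of the warning-modality guidance (Stokes et al.
# 1990; Edworthy 1994). Both threshold values are assumptions.
def warning_modality(time_to_react_s, visual_load):
    """visual_load in [0, 1]: how occupied the driver's eyes are."""
    if time_to_react_s >= 5.0 and visual_load < 0.7:
        return "visual"
    # little time or an overloaded visual channel: sound is received
    # regardless of head position and yields faster responses
    return "auditory"

print(warning_modality(8.0, 0.2))  # visual
print(warning_modality(2.0, 0.2))  # auditory
```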

4. Reference Study [8]: Collision Warning Design to Mitigate Driver Distraction
This study also found that the drivers trusted the graded warning system more. The interesting conclusion was that haptic warnings were preferred to auditory warnings on several dimensions, including trust, overall benefit to driving, and annoyance. These results suggest that non-standard warning modes, like haptic cues from a vibrating seat or a vibrating control input, need to be considered in information or warning system design.

7.6 Critical Function Prototype Golf Cart

7.6.1 Golf Cart CFP Results
The graphs are shown on the next page. The black line represents the data from the users and the blue line represents the ideal data that they were expected to follow. Steering and pedal data graphs have been separated. In the graphs for the incremental transition, the three regions being shown are completely autonomous mode, pedal matching mode, and steering matching mode. In the direct transition there are only two modes, autonomous and manual.
- In the incremental pedal data, once we give them steering control along with the pedals, they neglect the pedals completely. The car's speed drops a lot, sometimes even close to a stop.
- In general, comparing the incremental data versus the direct transfer data shows that the matching error for the entire transition sequence is much lower in the direct transition.
- It should be noted that the user is actually given complete control of the car in the direct transition; however, the graph shows the time in manual mode when the interface was visible to the user.
(Golf Cart Transition Graph)
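The matching gate used by the golf cart prototype (and implemented in the C# listing in 7.6.2) can be summarized as: control transfers only after the driver's input stays within a threshold of the autonomous input for a run of consecutive samples. A compact sketch, with the listing's default thresholds as parameters:

```python
# Control-matching gate sketch (cf. the C# listing in 7.6.2): count
# consecutive samples where the driver's input is within a threshold
# of the autonomous input; transfer control once the run is long enough.
def matched_after(user, auto, error_threshold=150, count_threshold=200):
    """Return the sample index at which matching succeeds, or None."""
    run = 0
    for i, (u, a) in enumerate(zip(user, auto)):
        run = run + 1 if abs(u - a) < error_threshold else 0
        if run == count_threshold:
            return i
    return None

# small thresholds for illustration:
print(matched_after([5, 6, 7, 8], [5, 5, 5, 5], error_threshold=2, count_threshold=2))
```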

7.6.2 Golf Cart C# Code for Gradual Control


using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Linq;
using System.Text;
using System.Windows.Forms;
using System.IO.Ports;
using System.IO;
// using the ADXL345 accelerometer - Sparkfun tutorial 240
// incremental
namespace WindowsFormsApplication1
{
public partial class Form1 : Form
{
public Form1()
{
InitializeComponent();
}
private void button1_Click(object sender, EventArgs e)
{
SerialPort serialPort1 = new SerialPort();
serialPort1.PortName = "COM10";
serialPort1.BaudRate = 9600;
serialPort1.Open();
System.Drawing.Pen myPen;
myPen = new System.Drawing.Pen(System.Drawing.Color.Red);
myPen.Width = 2;
System.Drawing.Pen myPen_auto;
myPen_auto = new System.Drawing.Pen(System.Drawing.Color.Green);
myPen_auto.Width = 2;


System.Drawing.SolidBrush myBrushGray = new System.Drawing.SolidBrush(System.Drawing.Color.LightGray);
System.Drawing.SolidBrush myBrushYellow = new System.Drawing.SolidBrush(System.Drawing.Color.LightBlue);
System.Drawing.Graphics formGraphics = this.CreateGraphics();

if (serialPort1.IsOpen)
textBox1.Text = "Done";
else
textBox1.Text = "Not";
textBox3.Font = new Font("Georgia", 16);
textBox3.BackColor = Color.LightGreen;
textBox3.BorderStyle = BorderStyle.FixedSingle;
textBox3.TextAlign = HorizontalAlignment.Center;
textBox3.Text = "Throttle matched";
textBox3.Visible = false;
int trial_int = 0;
string trial_msg = "0";
int flg = 0;
while(flg == 0)
{
trial_msg = serialPort1.ReadLine();
try
{
trial_int = Int32.Parse(trial_msg);
}
catch
{
}
if (trial_int == 1234)
break;
}


while (flg == 0)
{
trial_msg = serialPort1.ReadLine();
try
{
trial_int = Int32.Parse(trial_msg);
}
catch
{
}
if (trial_int == 5678)
{
flg = 1;
break;
}
}
int count = 0;
int error_count = 0;
int error = 0;
int last_error = 0;
int error_threshold = 150;
int count_threshold = 200; //should be 200
bool pedalAchieved = false;
int time_count = 0;
int label_count = 0;
progressBar1.Maximum = count_threshold;
int steering_error_count = 0;
int steering_error = 0;
int steering_last_error = 0;
int steering_error_threshold = 50;
int steering_count_threshold = 150;
bool steeringAchieved = false;
//int label_count = 0;
progressBar2.Maximum = steering_count_threshold;


// dummy defines just to avoid definition errors
FileStream fs = new FileStream("Z:/Acads/Year2/ME310A/AUDI/CFP/Software/Csharp Trials/Scenario2/defaultfile.txt", FileMode.Open, FileAccess.Read);
StreamReader sr = new StreamReader(fs);
// dummy defines just to avoid definition errors
FileStream fsw = new FileStream("Z:/Acads/Year2/ME310A/AUDI/CFP/Software/Csharp Trials/Scenario2/" + "defaultfile2.txt", FileMode.Append, FileAccess.Write);
StreamWriter sw = new StreamWriter(fsw);
int pot = 0;
int pot2 = 0;
int pot_auto = 0;
int pot2_auto = 0;
int button_flg = 0;
int last_button_flg = -1;
double angle_offset = Math.PI / 2;
double angle_buffer_range = Math.PI / 5;
if (checkBox2.Checked)
{
fs = new FileStream("Z:/Acads/Year2/ME310A/AUDI/CFP/Software/Csharp Trials/" + textBox2.Text, FileMode.Open, FileAccess.Read);
sr = new StreamReader(fs);
fsw = new FileStream("Z:/Acads/Year2/ME310A/AUDI/CFP/Software/Csharp Trials/Scenario2/" + "data" + textBox2.Text, FileMode.Append, FileAccess.Write);
sw = new StreamWriter(fsw);
}
while (true)
{
//increment time counter if pedal not achieved
if (button_flg == 1)
{
time_count++;
}


if (textBox3.Visible)
{
label_count++;
if (label_count == 150)
{
textBox3.Visible = false;
label_count = 0;
}
}
String msg0 = serialPort1.ReadLine();
String message = serialPort1.ReadLine();
String message2 = serialPort1.ReadLine();
button_flg = Int32.Parse(msg0);
pot = Int32.Parse(message);
pot2 = Int32.Parse(message2);

if (checkBox1.Checked)
{
fsw = new FileStream("Z:/Acads/Year2/ME310A/AUDI/CFP/Software/Csharp Trials/Scenario2/" + textBox2.Text, FileMode.Append, FileAccess.Write);
sw = new StreamWriter(fsw);
sw.WriteLine(message + " " + message2);
sw.Flush();
sw.Close();
fsw.Close();
}
if (checkBox2.Checked)
{
//string[] lines = System.IO.File.ReadAllLines(@"Z:/Acads/Year2/ME310A/AUDI/CFP/Software/Csharp Trials/Scenario2trial1.txt");
//string lines_check = lines[2];
String str1 = sr.ReadLine();
String str2 = sr.ReadLine();
String str3 = sr.ReadLine();


pot_auto = Convert.ToInt32(str1);
pot2_auto = Convert.ToInt32(str2);
sw.WriteLine(message + " " + message2);
//sr.Close();
//fs.Close();
}
count++;
if (count == 4)
{
progressBar1.Value = error_count;
progressBar2.Value = steering_error_count;
//textBox1.Text = error_count.ToString();
formGraphics.Clear(Color.FromName("Control"));

//int formHeight = 370;


int formHeight = 250;
//Current Graphics
if (pedalAchieved)
{
pot2 = 100;
pot2_auto = 100;
}
int rectX1 = 0; int rectY1 = formHeight - pot2 / 3; int rectWidth = 400; int rectHeight = 200;
int rectX1_auto = 0; int rectY1_auto = formHeight - pot2_auto / 3; int rectWidth_auto = 400; int rectHeight_auto = 200;
// Correct the graphics to have stationary auto throttle and display error
rectY1 = rectY1 - rectY1_auto + 100;
rectY1_auto = 100;
error = Math.Abs(pot2 - pot2_auto);
steering_error = Math.Abs(pot - pot_auto);
// note down the switch


if (checkBox2.Checked && last_button_flg == 0 && button_flg == 1)
{
sw.WriteLine("**********Switch************************");
sw.WriteLine(time_count);
sw.WriteLine("**********************************");
}
last_button_flg = button_flg;
//////////////////////////////////////////
// Pedal matching code
/////////////////////////////////////////
if (!pedalAchieved && error < error_threshold && last_error < error_threshold)
{
error_count++;
}
else if (!pedalAchieved && last_error < error_threshold && error >= error_threshold)
{
error_count = 0;
}
last_error = error;
if (!pedalAchieved && error_count == count_threshold)
{
// successfully matched throttle pedal
int debug = 1;
pedalAchieved = true;
textBox3.Visible = true;
label_count = 0;
if (checkBox2.Checked)
{
sw.WriteLine("**********Pedal Matched************************");
sw.WriteLine(time_count);
sw.WriteLine("**********************************");
}


}
//////////////////////////////////////////
// Steering matching code
/////////////////////////////////////////
if (pedalAchieved && !steeringAchieved && steering_error < steering_error_threshold && steering_last_error < steering_error_threshold)
{
steering_error_count++;
}
else if (pedalAchieved && !steeringAchieved && steering_last_error < steering_error_threshold && steering_error >= steering_error_threshold)
{
steering_error_count = 0;
}
steering_last_error = steering_error;
if (pedalAchieved && !steeringAchieved && steering_error_count == steering_count_threshold)
{
// successfully matched steering
int debug = 1;
steeringAchieved = true;
textBox3.Text = "Steering matched";
textBox3.Visible = true;
label_count = 0;
if (checkBox2.Checked)
{
sw.WriteLine("********Steering matched**********************");
sw.WriteLine(time_count);
sw.WriteLine("******************************");
}
}
float centerX = (rectX1 + rectWidth / 2);


float centerY = (rectY1 + rectHeight / 2);
pot = pot * 2 / 3 + 30 * 22 / (7 * 180);
double endPx = centerX + rectWidth * Math.Cos((pot) * Math.PI / (180) + angle_offset) / 2;
double endPy = centerY - rectHeight * Math.Sin((pot) * Math.PI / (180) + angle_offset) / 2;
float centerX_auto = (rectX1_auto + rectWidth_auto / 2);
float centerY_auto = (rectY1_auto + rectHeight_auto / 2);
pot_auto = pot_auto * 2 / 3 + 30 * 22 / (7 * 180);
double endPx_auto = centerX + rectWidth_auto * Math.Cos((pot_auto) * Math.PI / (180) + angle_offset) / 2;
double endPy_auto = centerY - rectHeight_auto * Math.Sin((pot_auto) * Math.PI / (180) + angle_offset) / 2;
// do not start matching in autonomous mode
if (button_flg == 0)
error_count = 0;
//if (button_flg == 1)
//{
//Autonomous Graphics
if (checkBox2.Checked)
{
if (pedalAchieved && !steeringAchieved)
{
// Steering related auto graphics
float start_angle = (float)(((-pot_auto) * Math.PI / (180) - angle_offset - angle_buffer_range / 2) * 180 / Math.PI);
formGraphics.FillPie(myBrushYellow, (float)rectX1, (float)rectY1, (float)rectWidth, (float)rectHeight, start_angle, (float)(angle_buffer_range * 180 / Math.PI));
}
else if(!pedalAchieved)
{
// Throttle related auto graphics
formGraphics.FillRectangle(myBrushGray, new Rectangle(rectX1_auto, rectY1_auto, rectWidth_auto, rectHeight_auto / 3));
formGraphics.DrawArc(myPen_auto, rectX1_auto, rectY1_auto, rectWidth_auto, rectHeight_auto, 210, 120);
//formGraphics.DrawLine(myPen_auto, centerX, centerY, (float)endPx_auto, (float)endPy_auto); //(float)endPx, (float)endPy
}
// }
//Current Graphics
if (!pedalAchieved)
{
// Throttle related current graphics
formGraphics.DrawArc(myPen, rectX1, rectY1, rectWidth, rectHeight, 210, 120);
}
else if (pedalAchieved && !steeringAchieved)
{
// Steering related current graphics
formGraphics.DrawLine(myPen, centerX, centerY, (float)endPx, (float)endPy); //(float)endPx, (float)endPy
}
// reference for the passenger driver
int shift = 100;
int centerXnew = 400; int centerYnew = 200;
double endPx_auto_new = centerXnew + rectWidth_auto * Math.Cos((pot_auto) * Math.PI / (180) + angle_offset) / 2;
double endPy_auto_new = centerYnew - rectHeight_auto * Math.Sin((pot_auto) * Math.PI / (180) + angle_offset) / 2;
double endPx_new = centerXnew + rectWidth * Math.Cos((pot) * Math.PI / (180) + angle_offset) / 2;
double endPy_new = centerYnew - rectHeight * Math.Sin((pot) * Math.PI / (180) + angle_offset) / 2;
formGraphics.DrawLine(myPen, centerXnew + shift, centerYnew + shift, (float)endPx_new + (float)shift, (float)endPy_new + (float)shift);
formGraphics.DrawLine(myPen_auto, centerXnew + shift, centerYnew + shift, (float)endPx_auto_new + (float)shift, (float)endPy_auto_new + (float)shift); //(float)endPx, (float)endPy


}
count = 0;
}
}
myPen.Dispose();
formGraphics.Dispose();
}
private void textBox1_TextChanged(object sender, EventArgs e)
{
}
private void progressBar1_Click(object sender, EventArgs e)
{
}
private void button2_Click(object sender, EventArgs e)
{
this.Close();
}
private void numericUpDown1_ValueChanged(object sender, EventArgs e)
{
}
}
}


7.7 Online Research

Link/Title | Description | Comments | Who added

http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=5945275 | Autonomous vehicle controlled with iPad | CMU also has/had a good autonomous car program a couple of years back | Sangram

Links:
- https://www.youtube.com/watch?v=reIyTHHCd-c
- http://www.youtube.com/watch?v=uqeo_ZvGxE&feature=related
- http://www.dailymotion.com/video/xcuin0_next-world-futuredanger-5th-april_shortfilms?search_algo=2
- https://www.youtube.com/watch?v=219YybX66MY&feature=related
- https://www.youtube.com/watch?v=K54LN9q1jSs&feature=related
- http://www.youtube.com/watch?v=7ic0JAH8rco&feature=relmfu
- http://www.youtube.com/watch?v=P9KPJlA5yds
- http://www.youtube.com/watch?v=U7UroLFYlzE&feature=related
- http://www.futuretimeline.net
- http://en.wikipedia.org/wiki/Augmented_reality
- http://auto.howstuffworks.com/under-the-hood/trendsinnovations/5-future-car-technologies3.htm
- http://www.autoevolution.com/news/who-needs-parkingsensors-with-an-invisible-back-seat-50139.html
- http://asasi.org/papers/2004/Shappell%20et%20al_HFACS_ISASI04.pdf
- http://auto.howstuffworks.com/car-driving-safety/accidentshazardous-conditions/traffic.htm
- http://www.nytimes.com/2011/05/29/business/economy/29view.html
- (no link)

Titles and descriptions:
- Lecture about the future of technology in 2030 (lecture by Michio Kaku)
- Speech from a university professor about the future (another speech from the same guy)
- Intelligence revolution
- Traffic statistics
- Beyond 2000 show
- Discovery Channel video, part 1 of the series
- Future vehicles today
- World in 2057
- Future predictions
- Future Intelligence
- AR (Augmented Reality)
- Wasted time
- Transparent materials for seats, e.g.
- Human error, the most common cause of accidents with cars
- Statistics on what kinds of harm is caused by traffic

Comments:
- Just as an example of an episode of Beyond 2000. It was shown 20 years ago and we can see how much the world developed in that time.
- From Tommi's link. Don't read it. It just shows how much time we spend in traffic. It will be useful in documentation one day.
- A documentary about virtual reality, robots, AI... Interesting: it raised the question that we can bond with a machine if it shows emotions and personality. Maybe our car can be emotional?! Maybe then we will trust it more?!
- Basically most of the stuff introduced before, gathered in one interesting video.
- This website is not 100% reliable, but it provides some vision of the future and it has a poll on whether you will trust an autonomous vehicle :).
- Article claiming this would be in future cars (20xx?).
- Interesting technology where the computer adds digital reality to existing reality. Introduced in 2007. Could maybe be used on a future windshield? (20xx?)
- Interesting video of BMW using AR technology to assist repairing (2007).
- Americans spend an average of 100 hours sitting in traffic every year.
- About 60-80% of crashes are caused by human error.
- E.g. wasted time and fuel worth about $78 billion.
- We might want to look into future materials like this.

Who added: Goran, Tommi, Sangram

Link/Title | Description | Comments | Who added

http://en.wikipedia.org/wiki/Motion_sickness | Motion Sickness | A brief explanation of what motion sickness is | Tommi
http://www.medicinenet.com/motion_sickness/article.htm | Motion Sickness | What causes motion sickness, etc. | Tommi
http://www.viban.com/motionsickness.htm | Motion sickness product | A solution for motion sickness | Steph
http://en.wikipedia.org/wiki/Tilting_train | Tilting Train | Active compensation | Sangram
http://www.google.com/patents/US20040035347 | Patent for motion sickness: 2006 patent for an autonomous, self-leveling, self-correcting anti-motion-sickness chair/bed | Active compensation, gyroscope based | Sangram
http://www.google.com/patents/US7717841 | Patent for motion sickness | Look at the references; there are many more different solutions for motion sickness that people have come up with. Maybe we can just list the interesting ones here. | Sangram

Link/Title | Description | Comments | Who added

http://www.audi.co.uk/new-cars/a8/a8/safety/presense.html | Audi pre sense | Currently available system. Includes the following systems: Audi adaptive cruise control, Audi lane assist, Audi Side Assist, Night vision, and Audi Adaptive Lights. | Tommi
http://www.audi.com/com/brand/en/tools/advice/glossary.html | Audi adaptive cruise control | Currently available system | Tommi
http://www.audi.com/com/brand/en/tools/advice/glossary.html | Audi Adaptive Air Suspension-sport | Currently available system | Tommi
http://www.audi.com/com/brand/en/tools/advice/glossary.html | Audi Adaptive Lights | Currently available system | Tommi
http://www.audi.com/com/brand/en/tools/advice/glossary.html | Auto-dimming Mirrors | Currently available system | Tommi
http://www.audi.com/com/brand/en/tools/advice/glossary.html | Audi Side Assist | Currently available system | Tommi
http://www.audi.com/com/brand/en/tools/advice/glossary.html | Audi Parking System | Currently available system | Tommi
http://www.audi.com/com/brand/en/tools/advice/glossary.html | Audi lane assist | Currently available system | Tommi
http://www.audi.com/com/brand/en/tools/advice/glossary.html | Night vision | Currently available system | Tommi
http://www.autoevolution.com/news/volvo-citysafety-lowers-insurance-rating-for-all-uk-boundv40s-50033.html | AEB (Autonomous Emergency Braking) | Currently available system introduced by Volvo. | Tommi
http://en.wikipedia.org/wiki/Electronic_stability_control | ESC (Electronic Stability Control), ESP (Electronic Stability Program), DSC (Dynamic Stability Control) | Introduced by BMW and Mercedes-Benz in 1987. | Tommi
http://www.autoevolution.com/news/cadillacvirtual-bumper-helps-avoid-parking-crashes49613.html | Parking sensors mixed with autonomous parking if the driver is about to hit something while parking | Currently available system introduced by Cadillac. | Tommi
http://wot.motortrend.com/continental-wantsemergency-steer-assist-to-drive-cars-away-fromaccidents-8013.html | ESA (Emergency Steering Assist) | In addition to braking, the car also knows how to steer in an emergency situation. Introduced by Continental and available in the near future (2015?). | Tommi
http://green.autoblog.com/2008/08/20/continental-introduces-force-feedback-accelerator-pedal/ | AFFP (Accelerator Force Feedback Pedal) | Already available. Introduced by Continental in 2008. | Tommi
http://auto.howstuffworks.com/under-thehood/trends-innovations/5-future-cartechnologies4.htm | Air Bag | Technology studied by Mercedes-Benz. The idea is to stop the car itself with air bags :D. | Tommi


White paper on legality of autonomous cars in the US - https://cyberlaw.stanford.edu/files/publication/files/2012-Smith-AutomatedVehiclesAreProbablyLegalinTheUS_0.pdf - 99-page document with an extensive discussion about the legality of autonomous cars. (This, however, is not that relevant now that robocars have been legalized in California.) (Added by Sangram)

Article about autonomous cars reshaping the future - http://www.theatlanticcities.com/commute/2012/09/driverless-cars-would-reshape-automobiles-and-transit-system/3432/ - Important points from the article and user comments: transportation-as-a-service model; dealership package of owning a personal car plus having free miles on the robo-service-car; reference to transit-oriented environmentalists/activists (we need to look at their views and why they are against this); service robocars solve the parking space problems; robocars can really blur the line between "public" and "private" transportation; concept of dynamic carpooling using robocars and smartphones; problems for robotaxis in handling large crowds of people, say coming out of the stadium after the Big Game; would a city planning model of "free-floating semi-autonomous people transporters" only push us toward building more disconnected places?; car owners renting out their robocars (models more advanced than Zipcar and car-sharing). (Added by Sangram)

Awesome blog about robocars - http://ideas.4brad.com/ - Leads to more links; still being updated regularly. (Added by Sangram)

Autonomous parking introduced by Nissan - http://www.autoevolution.com/news/nissan-hops-aboard-self-driving-train-with-nsc-2015-video-50028.html - Parking which can be controlled with a smartphone. Available in the near future (2015). (Added by Tommi)

Study about population growth - http://conferences.ifpri.org/2020conference/PDF/summary_bongaarts.pdf - The population of Earth is likely to increase by 3 billion between 2001 and 2050. Relevance to our cabin might not be that good. (Added by Tommi)

Modern human beings' values - http://view.fdu.edu/files/humanvaluessustainability.pdf - Good insight into current human values. A lot of useless stuff but some relevant stuff also. Article published in 2010. (Added by Tommi)

2020 people profile - http://www.futuretimeline.net/21stcentury/2020.htm - A completely new generation will rule the world in the future. People more like you and me: less religious, more liberal. (Added by Goran)


7.8 Fall Presentations

Stanford Presentation Slides

Presented by: Sangram Patil, Stephanie Tomasetta, David Wang, Goran Bjelajac, Sifo Luo, Heikki Sjöman, Tommi Tuulenmäki

Assumptions
- User trust is established.
- Emergency situations are responded to by the AV.
- Vehicle-to-Vehicle and Vehicle-to-Infrastructure communication technology is implemented.

The Design Space: Transition, Trust, Cabin, Control

Transition - Testing the transition sequence from autonomous to manual control
- Transition: Gradual (_______ Display)
- Transition: Direct (______________ Switch)
Key Insights
- Visual representation of interactive information is too intrusive and distracting during driving.
- Performance is not based on the number of tasks being performed, but on the coherence between the sources of information.

Trust - Reasons for trusting autonomous systems; the experience of riding in an autonomous vehicle
Key Insights
- People did not feel like they were driving when prompted to mimic the AV controls.
- If the user sees what the car sees, then trust is built.
- Having anticipatory cues for out-of-the-ordinary situations leads to reassurance that everything is working.

Cabin Space - Explore the experience of having a moving chair in the cabin space
- Interactive Windshield __________________
- Movable Chair __________
- Cabin Activity Space _______________________ (working, relaxing, socializing, interacting with multimedia, driving)
Key Insights
- It is desirable to have the flexibility of moving around in the cabin space.
- Attaching the control input to the moving chair allows for the input to be used for multiple activities, including driving.

Control Mechanisms - Explore the possibility of a different control mechanism that is more fun and easier to transition
- Joystick Controlled Steering
- Mind Control to Levitate Ball (Mindflex Duel Game)
Key Insights
- Most non-conventional steering inputs are less precise and have a steep learning curve.
- For crash-proof manual driving in the future, the control input's primary function is to maintain the fun of driving.

Design Requirements
- Comfortable Transition
- Intuitive and Non-Intrusive Interface
- Situational Awareness
- Maintaining the Pleasure of Driving
- Maximizing Cabin Space: Organized, Personalized, and Adaptable Cabin Space

Next Steps for Audi Evolve
- Explore user interaction & cabin space transitioning ________________
- Redefine the business model to a subscription-based one ___
- Identify emerging technologies to integrate _____
- Define and categorize potential activities in AV __________
- Leverage the social impact of self-aware cars ____________________

Questions?

Aalto Presentation Slides



7.9 Winter Presentation


7.10 Winter Brochure

EVOLVE
The possibilities for using the time and space in an autonomous car are limited only by the size of the car and people's imagination. We found that the most common activities people wished to do in the car are working, relaxing, and socializing. Each of those activities requires a different body position and a slightly different interior arrangement. The space in the car is very limited, and there is just not enough room to move freely. Creating a comfortable and entertaining cabin environment is one of our tasks.
In order to ensure the quality and sophistication the Audi brand represents, we need to make the driver's experience equally interesting during the entire ride. That includes both the experience during activities in the car and the transitions between them.
The transition between autonomous and manual modes is another problem. If we look only at the transition from driving to autonomous mode, we can immediately see that it is not easy at all. Handing over control of your life to a machine requires much more than just letting go of the steering wheel. Drivers need a simplified sequence for changing modes, not only to shorten their learning curve but also to feel safe.

Vision
We envision cars in the year 2035 as moving personal spaces. Cars will be autonomous and will have two driving modes: manual and autonomous. Drivers will no longer see time spent on the road as wasted, because they will be able to use the travel time for something other than driving. Those activities are going to be mainly working, relaxing, and socializing. The design of the cabin will accommodate these activities as well as the size of the car allows. This requires us to radically restructure the interior of the car. There might be only one seat in front, and this driver's seat will have to be smart in the sense that it senses what the driver wants, easing the transition between activities and modes. Visual and audio guidance will also aid the transition by giving the driver enough information.


7.11 Dark Horse Prototype

7.11.1 Reconfiguro - Drag and Drop Interface Java Code
//Drag and Drop Interface
//ME310 Darkhorse Prototype
//By: VW-Audi 1/23/13
import acm.graphics.*;
import acm.program.*;
import java.awt.*;
import java.awt.event.*;
import javax.swing.*;
public class DragDrop extends GraphicsProgram implements DragDropConstants
{
private GPoint last_mouse_loc;
private GObject objectSelected;
private GRect dropArea;
private GLabel frontText;
private int rotateChair1Counter=0;
private int rotateChair2Counter=0;
private int flipBedCounter = 0;
private int rotateDeskCounter = 0;
private GCompound chairSymbol1 = new GCompound();
private GCompound chairSymbol2 = new GCompound();
private GCompound bedSymbol = new GCompound();
private GCompound tvSymbol = new GCompound();
private GCompound tableSymbol = new GCompound();
private GCompound deskSymbol = new GCompound();
private GCompound lampSymbol = new GCompound();
private GCompound cardSymbol = new GCompound();
private GCompound drinkSymbol = new GCompound();
private GLabel chairLabel;
private GLabel chairLabel2;
public GLabel bedLabel;
private GLabel deskLabel;
private GLabel lampLabel;
private GLabel tableLabel;


private GLabel tvLabel;


private GLabel cardLabel;
private GLabel drinkLabel;
private GCompound legend = new GCompound();
public void init()
{
JButton resetAllButton = new JButton("Reset All");
JButton resetSelectedButton = new JButton("Reset Selected");
JButton rotateButton = new JButton("Rotate Selected");
add(resetAllButton, NORTH);
add(resetSelectedButton, NORTH);
add(rotateButton, NORTH);
setupDropArea();
createObjects();
createLegend();
addActionListeners();
addMouseListeners();
}
public void run()
{
}
public void setupDropArea()
{
dropArea = new GRect(DROP_AREA_WIDTH, DROP_AREA_HEIGHT);
dropArea.setLocation(DROP_AREA_CENTER_X, DROP_AREA_CENTER_Y);
add(dropArea);
frontText = new GLabel("BACK OF VAN          FRONT OF VAN");
frontText.setFont("SansSerif-16");
add(frontText, DROP_AREA_CENTER_X + (DROP_AREA_WIDTH - frontText.getWidth())/2,
DROP_AREA_CENTER_Y + DROP_AREA_HEIGHT + frontText.getHeight());
}
public void createObjects()
{
createBedSymbol(bedSymbol, BED_INIT_LOC_X, BED_INIT_LOC_Y);
createChairSymbol(chairSymbol1, CHAIR1_INIT_LOC_X, CHAIR1_INIT_LOC_Y);
createChairSymbol(chairSymbol2, CHAIR2_INIT_LOC_X, CHAIR2_INIT_LOC_Y);
createTVSymbol(tvSymbol, TV_INIT_LOC_X, TV_INIT_LOC_Y);
createTableSymbol(tableSymbol, TABLE_INIT_LOC_X, TABLE_INIT_LOC_Y);
createDeskSymbol(deskSymbol, DESK_INIT_LOC_X, DESK_INIT_LOC_Y);

createLampSymbol(lampSymbol, LAMP_INIT_LOC_X, LAMP_INIT_LOC_Y);
createDeckCardsSymbol(cardSymbol, DECK_CARDS_INIT_LOC_X, DECK_CARDS_INIT_LOC_Y);
createDrinkSymbol(drinkSymbol, DRINK_INIT_LOC_X, DRINK_INIT_LOC_Y);
}
public void createLegend()
{
createLegendLabel();
}
public void createLegendLabel()
{
chairLabel = new GLabel("CHAIR SELECTED");
chairLabel2 = new GLabel("CHAIR SELECTED");
bedLabel = new GLabel("BED SELECTED");
tvLabel = new GLabel("TV PROJECTOR SELECTED");
tableLabel = new GLabel("LOW TABLE SELECTED");
deskLabel = new GLabel("DESK SELECTED");
lampLabel = new GLabel("LAMP SELECTED");
cardLabel = new GLabel("DECK OF CARDS SELECTED");
drinkLabel = new GLabel("DRINK SELECTED");
add(chairLabel);
add(chairLabel2);
add(bedLabel);
add(tvLabel);
add(tableLabel);
add(deskLabel);
add(lampLabel);
add(cardLabel);
add(drinkLabel);
chairLabel.setFont("SansSerif-20");
chairLabel2.setFont("SansSerif-20");
bedLabel.setFont("SansSerif-20");
tvLabel.setFont("SansSerif-20");
tableLabel.setFont("SansSerif-20");
deskLabel.setFont("SansSerif-20");
lampLabel.setFont("SansSerif-20");
cardLabel.setFont("SansSerif-20");
drinkLabel.setFont("SansSerif-20");
}
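The nine setFont calls above repeat the same operation once per label; a later revision could drive them from a collection instead. A minimal standalone sketch of that pattern, with plain strings standing in for the acm GLabel objects (the names here are illustrative, not from the prototype):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: apply one setting to many legend entries with a loop
// rather than one copy-pasted statement per label.
public class FontLoopDemo {
    public static void main(String[] args) {
        List<String> labels = new ArrayList<>();
        for (String name : new String[] {"CHAIR", "BED", "TV", "DESK"}) {
            // Stands in for: label.setFont("SansSerif-20"); add(label);
            labels.add(name + " SELECTED [SansSerif-20]");
        }
        System.out.println(labels.size());
        System.out.println(labels.get(0));
    }
}
```

Adding a new furniture symbol would then only mean adding one entry to the collection rather than editing several methods.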


public void createChairSymbol(GCompound chair, int xPosition, int yPosition)


{
GRect chair1 = new GRect(CHAIR_WIDTH, CHAIR_HEIGHT);
GLine line1 = new GLine(CHAIR_MARGIN, CHAIR_MARGIN, CHAIR_WIDTH - CHAIR_MARGIN, CHAIR_MARGIN); //top horiz line
GLine line2 = new GLine(CHAIR_MARGIN, CHAIR_MARGIN, CHAIR_MARGIN, CHAIR_HEIGHT - CHAIR_MARGIN); //left vert line
GLine line3 = new GLine(CHAIR_WIDTH - CHAIR_MARGIN, CHAIR_HEIGHT - CHAIR_MARGIN, CHAIR_WIDTH - CHAIR_MARGIN, CHAIR_MARGIN); //right vert line
chair.add(chair1);
chair.add(line1);
chair.add(line2);
chair.add(line3);
add(chair,xPosition, yPosition);
}
public void rotateChair()
{
int nextPosition = 0;
GLine line1 = null, line2=null, line3=null, line4=null, line5=null, line6=null;
GPoint objectLoc = objectSelected.getLocation();
double x = objectLoc.getX();
double y = objectLoc.getY();
GRect chair1 = new GRect(CHAIR_WIDTH, CHAIR_HEIGHT);
//Need to check which rotated position it is in
if(objectSelected == chairSymbol1)
{
nextPosition = rotateChair1Counter;
}
else if(objectSelected == chairSymbol2)
{
nextPosition = rotateChair2Counter;
}
switch(nextPosition)
{
case 0:

//****DOWN FACING CHAIR****/
line1 = new GLine(CHAIR_MARGIN, CHAIR_MARGIN, CHAIR_WIDTH - CHAIR_MARGIN, CHAIR_MARGIN); //upper horiz line
line2 = new GLine(CHAIR_MARGIN, CHAIR_MARGIN, CHAIR_MARGIN, CHAIR_HEIGHT - CHAIR_MARGIN); //left vert line
line3 = new GLine(CHAIR_WIDTH - CHAIR_MARGIN, CHAIR_HEIGHT - CHAIR_MARGIN, CHAIR_WIDTH - CHAIR_MARGIN, CHAIR_MARGIN); //right vert line
line4 = new GLine(CHAIR_MARGIN, CHAIR_MARGIN, CHAIR_WIDTH - CHAIR_MARGIN, CHAIR_MARGIN); //upper horiz line
line5 = new GLine(CHAIR_MARGIN, CHAIR_MARGIN, CHAIR_MARGIN, CHAIR_HEIGHT - CHAIR_MARGIN); //left vert line
line6 = new GLine(CHAIR_WIDTH - CHAIR_MARGIN, CHAIR_HEIGHT - CHAIR_MARGIN, CHAIR_WIDTH - CHAIR_MARGIN, CHAIR_MARGIN); //right vert line
break;
case 1:
//****LEFT FACING CHAIR****/
line1 = new GLine(CHAIR_MARGIN, CHAIR_MARGIN, CHAIR_WIDTH - CHAIR_MARGIN, CHAIR_MARGIN); //upper horiz line
line2 = new GLine(CHAIR_WIDTH - CHAIR_MARGIN, CHAIR_HEIGHT - CHAIR_MARGIN, CHAIR_WIDTH - CHAIR_MARGIN, CHAIR_MARGIN); //right vert line
line3 = new GLine(CHAIR_MARGIN, CHAIR_HEIGHT - CHAIR_MARGIN, CHAIR_WIDTH - CHAIR_MARGIN, CHAIR_HEIGHT - CHAIR_MARGIN); //bottom horiz line
line4 = new GLine(CHAIR_MARGIN, CHAIR_MARGIN, CHAIR_WIDTH - CHAIR_MARGIN, CHAIR_MARGIN); //upper horiz line
line5 = new GLine(CHAIR_WIDTH - CHAIR_MARGIN, CHAIR_HEIGHT - CHAIR_MARGIN, CHAIR_WIDTH - CHAIR_MARGIN, CHAIR_MARGIN); //right vert line
line6 = new GLine(CHAIR_MARGIN, CHAIR_HEIGHT - CHAIR_MARGIN, CHAIR_WIDTH - CHAIR_MARGIN, CHAIR_HEIGHT - CHAIR_MARGIN); //bottom horiz line
break;
case 2:
//****UP FACING CHAIR****/
line1 = new GLine(CHAIR_WIDTH - CHAIR_MARGIN, CHAIR_HEIGHT - CHAIR_MARGIN, CHAIR_WIDTH - CHAIR_MARGIN, CHAIR_MARGIN); //right vert line
line2 = new GLine(CHAIR_MARGIN, CHAIR_MARGIN, CHAIR_MARGIN, CHAIR_HEIGHT - CHAIR_MARGIN); //left vert line
line3 = new GLine(CHAIR_MARGIN, CHAIR_HEIGHT - CHAIR_MARGIN, CHAIR_WIDTH - CHAIR_MARGIN, CHAIR_HEIGHT - CHAIR_MARGIN); //bottom horiz line
line4 = new GLine(CHAIR_WIDTH - CHAIR_MARGIN, CHAIR_HEIGHT - CHAIR_MARGIN, CHAIR_WIDTH - CHAIR_MARGIN, CHAIR_MARGIN); //right vert line
line5 = new GLine(CHAIR_MARGIN, CHAIR_MARGIN, CHAIR_MARGIN, CHAIR_HEIGHT - CHAIR_MARGIN); //left vert line
line6 = new GLine(CHAIR_MARGIN, CHAIR_HEIGHT - CHAIR_MARGIN, CHAIR_WIDTH - CHAIR_MARGIN, CHAIR_HEIGHT - CHAIR_MARGIN); //bottom horiz line
break;
case 3:
//*****RIGHT FACING CHAIR*****/
line1 = new GLine(CHAIR_MARGIN, CHAIR_MARGIN, CHAIR_WIDTH - CHAIR_MARGIN, CHAIR_MARGIN); //upper horiz line
line2 = new GLine(CHAIR_MARGIN, CHAIR_MARGIN, CHAIR_MARGIN, CHAIR_HEIGHT - CHAIR_MARGIN); //left vert line
line3 = new GLine(CHAIR_MARGIN, CHAIR_HEIGHT - CHAIR_MARGIN, CHAIR_WIDTH - CHAIR_MARGIN, CHAIR_HEIGHT - CHAIR_MARGIN); //bottom horiz line
line4 = new GLine(CHAIR_MARGIN, CHAIR_MARGIN, CHAIR_WIDTH - CHAIR_MARGIN, CHAIR_MARGIN); //upper horiz line
line5 = new GLine(CHAIR_MARGIN, CHAIR_MARGIN, CHAIR_MARGIN, CHAIR_HEIGHT - CHAIR_MARGIN); //left vert line
line6 = new GLine(CHAIR_MARGIN, CHAIR_HEIGHT - CHAIR_MARGIN, CHAIR_WIDTH - CHAIR_MARGIN, CHAIR_HEIGHT - CHAIR_MARGIN); //bottom horiz line
break;
default:
break;
}
if(objectSelected == chairSymbol1)
{
remove(chairSymbol1);
chairSymbol1.removeAll();
chairSymbol1.add(chair1);
chairSymbol1.add(line1);
chairSymbol1.add(line2);
chairSymbol1.add(line3);
add(chairSymbol1, x, y);
}
else if(objectSelected == chairSymbol2)
{
remove(chairSymbol2);
chairSymbol2.removeAll();
chairSymbol2.add(chair1);
chairSymbol2.add(line4);
chairSymbol2.add(line5);
chairSymbol2.add(line6);
add(chairSymbol2, x, y);
}
}
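rotateChair only draws the orientation selected by rotateChair1Counter or rotateChair2Counter; the cycling of those counters is plain modular arithmetic. A standalone sketch of the counter logic we assume the "Rotate Selected" button performs (the method name nextOrientation is ours, not from the prototype):

```java
// Sketch: cycling a rotation counter through the four chair
// orientations used in rotateChair (0=down, 1=left, 2=up, 3=right).
public class RotateCounterDemo {
    static int nextOrientation(int counter) {
        return (counter + 1) % 4; // wraps 3 back to 0
    }

    public static void main(String[] args) {
        int c = 0;
        StringBuilder seq = new StringBuilder();
        for (int i = 0; i < 5; i++) {
            seq.append(c).append(" ");
            c = nextOrientation(c);
        }
        System.out.println(seq.toString().trim());
    }
}
```

The modulo keeps the counter in the range 0-3, so repeatedly pressing rotate cycles a chair through all four facings and back to the start.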
public void createBedSymbol(GCompound bed, int xPosition, int yPosition)
{
GRect bed1 = new GRect(BED_WIDTH, BED_HEIGHT);
GRect pillow = new GRect(BED_WIDTH/8, BED_HEIGHT*3/4);
pillow.move(BED_WIDTH/10, pillow.getHeight()/5);
GLine sheet = new GLine(30, 0,30,BED_HEIGHT);
pillow.setFilled(true);
bed.add(bed1);


bed.add(pillow);
bed.add(sheet);
add(bed, xPosition, yPosition);
}
public void flipBed()
{
int nextPosition = 0;
GRect bed = null, pillow = null;
GLine sheet = null;
GPoint objectLoc = objectSelected.getLocation();
double x = objectLoc.getX();
double y = objectLoc.getY();
//Need to check which rotated position it is in
if(objectSelected == bedSymbol)
{
nextPosition = flipBedCounter;
switch(nextPosition)
{
case 0:
bed = new GRect(BED_WIDTH, BED_HEIGHT);
pillow = new GRect(BED_WIDTH/8, BED_HEIGHT*3/4);
pillow.move(BED_WIDTH/10, pillow.getHeight()/5);
sheet = new GLine(30, 0,30,BED_HEIGHT);
pillow.setFilled(true);
break;
case 1:
bed = new GRect(BED_WIDTH, BED_HEIGHT);
pillow = new GRect(BED_WIDTH/8, BED_HEIGHT*3/4);
pillow.move(BED_WIDTH - BED_WIDTH/10 - pillow.getWidth(), pillow.getHeight()/5);
sheet = new GLine(BED_WIDTH - 30, 0, BED_WIDTH - 30,BED_HEIGHT);
pillow.setFilled(true);
break;
default:
break;
}
remove(bedSymbol);
bedSymbol.removeAll();
bedSymbol.add(bed);


bedSymbol.add(pillow);
bedSymbol.add(sheet);
add(bedSymbol, x, y);
}
}
public void createTVSymbol(GCompound tv, int xPosition, int yPosition)
{
GRect tv1 = new GRect(TV_WIDTH, TV_HEIGHT);
GOval frame = new GOval(2, 2, TV_WIDTH-5, TV_HEIGHT-5);
GLine line1 = new GLine(TV_WIDTH, TV_HEIGHT/2,TV_WIDTH + 10, TV_HEIGHT/2);
GLine line2 = new GLine(TV_WIDTH, TV_HEIGHT/2,TV_WIDTH + 10, TV_HEIGHT/2 + 10);
GLine line3 = new GLine(TV_WIDTH, TV_HEIGHT/2, TV_WIDTH + 10, TV_HEIGHT/2 - 10);
tv.add(tv1);
tv.add(frame);
tv.add(line1);
tv.add(line2);
tv.add(line3);
add(tv, xPosition, yPosition);
}
public void createTableSymbol(GCompound table, int xPosition, int yPosition)
{
GRect table1 = new GRect(TABLE_WIDTH, TABLE_HEIGHT);
GLabel tableText = new GLabel("T");
tableText.setFont("SansSerif-16");
double x = (table1.getWidth() - tableText.getWidth()/2);
double y = (table1.getHeight() + tableText.getAscent()/2);
table.add(table1);
table.add(tableText, x/2, y/2);
add(table, xPosition, yPosition);
}
public void createDeskSymbol(GCompound desk, int xPosition, int yPosition)
{
GRect desk1 = new GRect(DESK_WIDTH, DESK_HEIGHT);
GLabel deskText = new GLabel("D");
deskText.setFont("SansSerif-16");
double x = (desk1.getWidth() - deskText.getWidth()/2);
double y = (desk1.getHeight() + deskText.getAscent()/2);
desk.add(desk1);
desk.add(deskText,x/2, y/2 + deskText.getAscent()/8);


add(desk, xPosition, yPosition);


}
public void rotateDesk()
{
int nextPosition = 0;
GRect desk =null;
GLabel deskText = null;
double textX, textY;
GPoint objectLoc = objectSelected.getLocation();
double x = objectLoc.getX();
double y = objectLoc.getY();
//Need to check which rotated position it is in
if(objectSelected == deskSymbol)
{
nextPosition = rotateDeskCounter;
System.out.println(nextPosition);
switch(nextPosition)
{
case 0:
desk = new GRect(DESK_WIDTH, DESK_HEIGHT);
deskText = new GLabel("D");
deskText.setFont("SansSerif-16");
textX = (desk.getWidth() - deskText.getWidth()/2);
textY = (desk.getHeight() + deskText.getAscent()/2);
add(deskText,textX/2, textY/2 + deskText.getAscent()/8);
break;
case 1:
desk = new GRect(DESK_HEIGHT, DESK_WIDTH);
deskText = new GLabel("D");
deskText.setFont("SansSerif-16");
textX = (desk.getWidth() - deskText.getWidth()/2);
textY = (desk.getHeight() + deskText.getAscent()/2);
add(deskText, textX/2 - deskText.getWidth()/4, textY/2);
break;
default:
break;
}
remove(deskSymbol);
deskSymbol.removeAll();


deskSymbol.add(desk);
deskSymbol.add(deskText);
add(deskSymbol, x, y);
}
}
public void createLampSymbol(GCompound lamp, int xPosition, int yPosition)
{
GOval frame = new GOval(LAMP_WIDTH, LAMP_HEIGHT);
GOval lamp1 = new GOval(LAMP_WIDTH, LAMP_HEIGHT);
lamp1.setColor(Color.YELLOW);
lamp1.setFilled(true);
lamp.add(lamp1);
lamp.add(frame);
add(lamp, xPosition, yPosition);
}
public void createDeckCardsSymbol(GCompound card, int xPosition, int yPosition)
{
GRect card1 = new GRect(DECK_CARDS_WIDTH, DECK_CARDS_HEIGHT);
GLabel cardText = new GLabel("DC");
double x = (card1.getWidth() - cardText.getWidth()/2);
double y = (card1.getHeight() + cardText.getAscent()/2);
card.add(card1);
card.add(cardText,x,y);
add(card, xPosition, yPosition);
}
public void createDrinkSymbol(GCompound drink, int xPosition, int yPosition)
{
GOval drink1 = new GOval(DRINK_WIDTH, DRINK_HEIGHT);
GOval frame = new GOval(DRINK_WIDTH, DRINK_HEIGHT);
drink1.setColor(Color.BLUE);
drink1.setFilled(true);
drink.add(drink1);
drink.add(frame);
add(drink, xPosition, yPosition);
}
public void mousePressed(MouseEvent e)
{


last_mouse_loc = new GPoint(e.getPoint());


objectSelected = getElementAt(last_mouse_loc);
if(objectSelected != null)
{
if((objectSelected != dropArea) && (objectSelected != frontText))
{
objectSelected.sendToFront();
if(objectSelected == bedSymbol)
{
bedLabel.setVisible(true);
bedLabel.setLocation(TEXT_ALIGN_X - bedLabel.getWidth()/2, TEXT_ALIGN_Y - bedLabel.getAscent()/2);
}
else
bedLabel.setVisible(false);
if(objectSelected == chairSymbol1)
{
chairLabel.setVisible(true);
chairLabel.setLocation(TEXT_ALIGN_X - chairLabel.getWidth()/2, TEXT_ALIGN_Y - chairLabel.getAscent()/2);
}
else
chairLabel.setVisible(false);
if(objectSelected == chairSymbol2)
{
chairLabel2.setVisible(true);
chairLabel2.setLocation(TEXT_ALIGN_X - chairLabel2.getWidth()/2, TEXT_ALIGN_Y
- chairLabel2.getAscent()/2);
}
else
chairLabel2.setVisible(false);
if(objectSelected == tvSymbol)
{
tvLabel.setVisible(true);
tvLabel.setLocation(TEXT_ALIGN_X - tvLabel.getWidth()/2, TEXT_ALIGN_Y - tvLabel.getAscent()/2);
}


else
tvLabel.setVisible(false);
if(objectSelected == tableSymbol)
{
tableLabel.setVisible(true);
tableLabel.setLocation(TEXT_ALIGN_X - tableLabel.getWidth()/2, TEXT_ALIGN_Y - tableLabel.getAscent()/2);
}
else
tableLabel.setVisible(false);
if(objectSelected == deskSymbol)
{
deskLabel.setVisible(true);
deskLabel.setLocation(TEXT_ALIGN_X - deskLabel.getWidth()/2, TEXT_ALIGN_Y - deskLabel.getAscent()/2);
}
else
deskLabel.setVisible(false);
if(objectSelected == lampSymbol)
{
lampLabel.setVisible(true);
lampLabel.setLocation(TEXT_ALIGN_X - lampLabel.getWidth()/2, TEXT_ALIGN_Y - lampLabel.getAscent()/2);
}
else
lampLabel.setVisible(false);
if(objectSelected == cardSymbol)
{
cardLabel.setVisible(true);
cardLabel.setLocation(TEXT_ALIGN_X - cardLabel.getWidth()/2, TEXT_ALIGN_Y - cardLabel.getAscent()/2);
}
else
cardLabel.setVisible(false);
if(objectSelected == drinkSymbol)
{
drinkLabel.setVisible(true);
drinkLabel.setLocation(TEXT_ALIGN_X - drinkLabel.getWidth()/2, TEXT_ALIGN_Y - drinkLabel.getAscent()/2);


}
else
drinkLabel.setVisible(false);
}
}
else
{
bedLabel.setVisible(false);
chairLabel.setVisible(false);
chairLabel2.setVisible(false);
tvLabel.setVisible(false);
tableLabel.setVisible(false);
deskLabel.setVisible(false);
lampLabel.setVisible(false);
cardLabel.setVisible(false);
drinkLabel.setVisible(false);
}
}
public void mouseDragged(MouseEvent e)
{
if(objectSelected != null)
{
//Check whether the object is the drop area; if so, don't do anything and send it to the back
if((objectSelected != dropArea) && (objectSelected != frontText))
{
objectSelected.move(e.getX() - last_mouse_loc.getX(), e.getY() - last_mouse_loc.getY());
last_mouse_loc = new GPoint(e.getPoint());
if((objectSelected != dropArea) && (objectSelected != frontText))
{
objectSelected.sendToFront();
if(objectSelected == bedSymbol)
{
bedLabel.setVisible(true);
bedLabel.setLocation(TEXT_ALIGN_X - bedLabel.getWidth()/2,
TEXT_ALIGN_Y - bedLabel.getAscent()/2);
}
else
bedLabel.setVisible(false);


if(objectSelected == chairSymbol1)
{
chairLabel.setVisible(true);
chairLabel.setLocation(TEXT_ALIGN_X - chairLabel.getWidth()/2,
TEXT_ALIGN_Y - chairLabel.getAscent()/2);
}
else
chairLabel.setVisible(false);
if(objectSelected == chairSymbol2)
{
chairLabel2.setVisible(true);
chairLabel2.setLocation(TEXT_ALIGN_X - chairLabel2.getWidth()/2, TEXT_ALIGN_Y - chairLabel2.getAscent()/2);
}
else
chairLabel2.setVisible(false);
if(objectSelected == tvSymbol)
{
tvLabel.setVisible(true);
tvLabel.setLocation(TEXT_ALIGN_X - tvLabel.getWidth()/2,
TEXT_ALIGN_Y - tvLabel.getAscent()/2);
}
else
tvLabel.setVisible(false);
if(objectSelected == tableSymbol)
{
tableLabel.setVisible(true);
tableLabel.setLocation(TEXT_ALIGN_X - tableLabel.getWidth()/2,
TEXT_ALIGN_Y - tableLabel.getAscent()/2);
}
else
tableLabel.setVisible(false);
if(objectSelected == deskSymbol)
{
deskLabel.setVisible(true);
deskLabel.setLocation(TEXT_ALIGN_X - deskLabel.getWidth()/2,
TEXT_ALIGN_Y - deskLabel.getAscent()/2);
}
else


deskLabel.setVisible(false);
if(objectSelected == lampSymbol)
{
lampLabel.setVisible(true);
lampLabel.setLocation(TEXT_ALIGN_X - lampLabel.getWidth()/2,
TEXT_ALIGN_Y - lampLabel.getAscent()/2);
}
else
lampLabel.setVisible(false);
if(objectSelected == cardSymbol)
{
cardLabel.setVisible(true);
cardLabel.setLocation(TEXT_ALIGN_X - cardLabel.getWidth()/2,
TEXT_ALIGN_Y - cardLabel.getAscent()/2);
}
else
cardLabel.setVisible(false);
if(objectSelected == drinkSymbol)
{
drinkLabel.setVisible(true);
drinkLabel.setLocation(TEXT_ALIGN_X - drinkLabel.getWidth()/2,
TEXT_ALIGN_Y - drinkLabel.getAscent()/2);
}
else
drinkLabel.setVisible(false);
}
else
{
bedLabel.setVisible(false);
chairLabel.setVisible(false);
chairLabel2.setVisible(false);
tvLabel.setVisible(false);
tableLabel.setVisible(false);
deskLabel.setVisible(false);
lampLabel.setVisible(false);
cardLabel.setVisible(false);
drinkLabel.setVisible(false);
}
}


else
{
objectSelected.sendBackward();
}
}
}
public void mouseClicked(MouseEvent e)
{
last_mouse_loc = new GPoint(e.getPoint());
objectSelected = getElementAt(last_mouse_loc);
if(objectSelected != null)
{
if((objectSelected != dropArea) && (objectSelected != frontText))
{
objectSelected.sendToFront();
if(objectSelected == bedSymbol)
{
bedLabel.setVisible(true);
bedLabel.setLocation(TEXT_ALIGN_X - bedLabel.getWidth()/2, TEXT_ALIGN_Y - bedLabel.getAscent()/2);
}
else
bedLabel.setVisible(false);
if(objectSelected == chairSymbol1)
{
chairLabel.setVisible(true);
chairLabel.setLocation(TEXT_ALIGN_X - chairLabel.getWidth()/2, TEXT_ALIGN_Y - chairLabel.getAscent()/2);
}
else
chairLabel.setVisible(false);
if(objectSelected == chairSymbol2)
{
chairLabel2.setVisible(true);
chairLabel2.setLocation(TEXT_ALIGN_X - chairLabel2.getWidth()/2, TEXT_ALIGN_Y
- chairLabel2.getAscent()/2);
}


else
chairLabel2.setVisible(false);
if(objectSelected == tvSymbol)
{
tvLabel.setVisible(true);
tvLabel.setLocation(TEXT_ALIGN_X - tvLabel.getWidth()/2, TEXT_ALIGN_Y - tvLabel.getAscent()/2);
}
else
tvLabel.setVisible(false);
if(objectSelected == tableSymbol)
{
tableLabel.setVisible(true);
tableLabel.setLocation(TEXT_ALIGN_X - tableLabel.getWidth()/2, TEXT_ALIGN_Y - tableLabel.getAscent()/2);
}
else
tableLabel.setVisible(false);
if(objectSelected == deskSymbol)
{
deskLabel.setVisible(true);
deskLabel.setLocation(TEXT_ALIGN_X - deskLabel.getWidth()/2, TEXT_ALIGN_Y - deskLabel.getAscent()/2);
}
else
deskLabel.setVisible(false);
if(objectSelected == lampSymbol)
{
lampLabel.setVisible(true);
lampLabel.setLocation(TEXT_ALIGN_X - lampLabel.getWidth()/2, TEXT_ALIGN_Y - lampLabel.getAscent()/2);
}
else
lampLabel.setVisible(false);
if(objectSelected == cardSymbol)
{
cardLabel.setVisible(true);
cardLabel.setLocation(TEXT_ALIGN_X - cardLabel.getWidth()/2, TEXT_ALIGN_Y - cardLabel.getAscent()/2);


}
else
cardLabel.setVisible(false);
if(objectSelected == drinkSymbol)
{
drinkLabel.setVisible(true);
drinkLabel.setLocation(TEXT_ALIGN_X - drinkLabel.getWidth()/2, TEXT_ALIGN_Y - drinkLabel.getAscent()/2);
}
else
drinkLabel.setVisible(false);
}
}
else
{
bedLabel.setVisible(false);
chairLabel.setVisible(false);
chairLabel2.setVisible(false);
tvLabel.setVisible(false);
tableLabel.setVisible(false);
deskLabel.setVisible(false);
lampLabel.setVisible(false);
cardLabel.setVisible(false);
drinkLabel.setVisible(false);
}
}
/*-----------------------*/
/*BUTTON ACTION PERFORMED*/
/*-----------------------*/
public void actionPerformed(ActionEvent e)
{
String cmd = e.getActionCommand();
if(cmd.equals("Reset All"))
{
bedSymbol.setLocation(BED_INIT_LOC_X, BED_INIT_LOC_Y);
chairSymbol1.setLocation(CHAIR1_INIT_LOC_X, CHAIR1_INIT_LOC_Y);
chairSymbol2.setLocation(CHAIR2_INIT_LOC_X, CHAIR2_INIT_LOC_Y);
tvSymbol.setLocation(TV_INIT_LOC_X, TV_INIT_LOC_Y);
tableSymbol.setLocation(TABLE_INIT_LOC_X, TABLE_INIT_LOC_Y);


deskSymbol.setLocation(DESK_INIT_LOC_X, DESK_INIT_LOC_Y);
lampSymbol.setLocation(LAMP_INIT_LOC_X, LAMP_INIT_LOC_Y);
cardSymbol.setLocation(DECK_CARDS_INIT_LOC_X, DECK_CARDS_INIT_LOC_Y);
drinkSymbol.setLocation(DRINK_INIT_LOC_X, DRINK_INIT_LOC_Y);
}
else if(cmd.equals("Reset Selected"))
{
if(objectSelected != null)
{
if(objectSelected == bedSymbol)
bedSymbol.setLocation(BED_INIT_LOC_X, BED_INIT_LOC_Y);
else if(objectSelected == chairSymbol1)
chairSymbol1.setLocation(CHAIR1_INIT_LOC_X, CHAIR1_INIT_LOC_Y);
else if(objectSelected == chairSymbol2)
chairSymbol2.setLocation(CHAIR2_INIT_LOC_X, CHAIR2_INIT_LOC_Y);
else if(objectSelected == tvSymbol)
tvSymbol.setLocation(TV_INIT_LOC_X, TV_INIT_LOC_Y);
else if(objectSelected == tableSymbol)
tableSymbol.setLocation(TABLE_INIT_LOC_X, TABLE_INIT_LOC_Y);
else if(objectSelected == deskSymbol)
deskSymbol.setLocation(DESK_INIT_LOC_X,DESK_INIT_LOC_Y);
else if(objectSelected == lampSymbol)
lampSymbol.setLocation(LAMP_INIT_LOC_X, LAMP_INIT_LOC_Y);
else if(objectSelected == cardSymbol)
cardSymbol.setLocation(DECK_CARDS_INIT_LOC_X,
DECK_CARDS_INIT_LOC_Y);
else if(objectSelected == drinkSymbol)
drinkSymbol.setLocation(DRINK_INIT_LOC_X, DRINK_INIT_LOC_Y);
}
}
else if(cmd.equals("Rotate Selected"))
{
if(objectSelected != null)
{
if(objectSelected == chairSymbol1)
{
if(++rotateChair1Counter == 4)
rotateChair1Counter = 0;
rotateChair();


}
else if(objectSelected == chairSymbol2)
{
if(++rotateChair2Counter == 4)
rotateChair2Counter = 0;
rotateChair();
}
else if(objectSelected == bedSymbol)
{
if(flipBedCounter == 1)
flipBedCounter = 0;
else
flipBedCounter = 1;
flipBed();
}
else if(objectSelected == deskSymbol)
{
if(rotateDeskCounter == 1)
rotateDeskCounter = 0;
else
rotateDeskCounter = 1;
rotateDesk();
}
}
}
}


7.12 Funky Prototype


7.12.1 Arduino Code for FSR
#include <math.h>
#define STATESWITCH 22 //Digital pin 22 (physical pin 78)
#define ONSWITCH 23 //Digital pin 23 (physical pin 77)
// Force Sensor Pins
const int B[] = {A0,A1,A2,A3};
const int T[] = {A4,A5,A6,A7};
// arrays to store the values
int BValue[] = {0,0,0,0};
int TValue[] = {0,0,0,0};
int switchState = 0;
int onSwitch = 0;
char topForceString[30];
char bottomForceString[30];
char forceString[50];
void setup()
{
//Create a serial connection to display the data on the terminal.
Serial.begin(9600);
// Set the Analog pins to inputs
for(int i=0;i<4;i++)
{
pinMode(B[i], INPUT);
pinMode(T[i], INPUT);
}
pinMode(ONSWITCH, INPUT);
pinMode(STATESWITCH, INPUT);
}
void loop(){
onSwitch = digitalRead(ONSWITCH);
//Only send/collect data when switch is on
if(onSwitch)
{
for(int i=0;i<4;i++)
{
BValue[i] = analogRead(B[i]);


TValue[i] = analogRead(T[i]);
}
//Send switch state data to matlab
Serial.println("switch state");
switchState = digitalRead(STATESWITCH);
Serial.println(switchState);
Serial.println("force data");
sprintf(topForceString,"%u %u %u %u", TValue[0],TValue[1],TValue[2],TValue[3]);
sprintf(bottomForceString,"%u %u %u %u", BValue[0],BValue[1],BValue[2],BValue[3]);
sprintf(forceString,"%s %s", topForceString, bottomForceString);
Serial.println(forceString);
}
}
7.12.2 Matlab Data Collection Code (main.m)
clear;
close all;
name = 'MovingTest3';
serialPort = initialize_port();
forceCount = 1;
forceFlag = 0;
NCompare = 5;
while(1)
%Received value should be in format of attention value,meditation value
received = readSerial();
if(strncmpi(received,'switch state',NCompare))
received = received(1:length(received)-2);
state = readSerial();
state = sscanf(state,'%u'); %'%u' gets rid of the \r\n
writeToFile(state,received,name);
elseif(strncmpi(received,'force data',NCompare))
received = received(1:length(received)-2);
forceFlag = 1;
%Received data as a string in format (T1,T2,T3,T4,B1,B2,B3,B4)
values = readSerial();
writeToFile(values,received,name);
end
[date time] = strtok(datestr(clock));
writeToFile(time,'time stamp',name);
end


7.12.3 Matlab Data Collection Code (initialize_port.m)


%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%Initialize port
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function [serialPort] = initialize_port()
global serialPort
%port = '/dev/tty.usbmodem1411';
port = 'COM13';
baudrate = 9600;
databits = 8;
stopbits = 1;
parity = 'none';
flowcontrol = 'none';
timeout = 10;
terminator = 'CR/LF';
%detect if port is already in use (if program does not terminate correctly)
oldSerial = instrfind('Port', port);
if(~isempty(oldSerial))
if(~strcmp(get(oldSerial(1),'Status'),'open'))
delete(oldSerial(1));
else
fclose(oldSerial(1));
delete(oldSerial(1));
end
end
%Establish new connection
serialPort = serial(port,'BaudRate', baudrate, 'DataBits', databits, 'StopBits', stopbits, ...
'Parity', parity, 'FlowControl', flowcontrol, 'Timeout', timeout, 'Terminator', terminator);
fopen(serialPort);
%get(serialPort); %Check port settings
%fread(serialPort,serialPort.BytesAvailable);
return;
end


7.12.4 Matlab Data Collection Code (readSerial.m)


function [data] = readSerial()
global serialPort
while(1)
if(serialPort.BytesAvailable() ~= 0)
data = fscanf(serialPort);
disp(data);
break;
end
end
end
7.12.5 Matlab Data Collection Code (writeToFile.m)
function writeToFile(data, dataCategory,name)
if(strcmp(dataCategory, 'switch state'))
fwriteID = fopen([name,'/switch_state.txt'],'a+');
fprintf(fwriteID, '%u\r\n', data);
elseif(strcmp(dataCategory, 'force data'))
fwriteID = fopen([name,'/force_data.txt'],'a+');
%T1,T2,T3,T4,B1,B2,B3,B4
fprintf(fwriteID, '%s', data);
elseif(strcmp(dataCategory, 'time stamp'))
fwriteID = fopen([name,'/time_stamp.txt'],'a+');
fprintf(fwriteID, '%s\r\n', data);
end
fclose(fwriteID);
end
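The Arduino sketch tags each burst of serial output with a label line ("switch state" or "force data") followed by one line of values, and the MATLAB loop above dispatches on that label before logging. A minimal Python sketch of the same dispatch logic (the function name and the in-memory log are illustrative, not part of the project code):

```python
def parse_log_lines(lines):
    """Dispatch tagged serial lines the way main.m does: a label line
    ("switch state" / "force data") is followed by one line of values."""
    log = {"switch state": [], "force data": []}
    it = iter(lines)
    for line in it:
        label = line.strip()
        if label in log:
            values = next(it).split()
            log[label].append([int(v) for v in values])
    return log

# Example: one switch-state sample followed by one eight-sensor force burst
lines = ["switch state", "1", "force data", "10 20 30 40 50 60 70 80"]
print(parse_log_lines(lines))
```

The label-then-values framing keeps the receiver stateless: it never has to guess which category an unlabeled line of numbers belongs to.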
7.12.6 Matlab Post Process Code (post_proc.m)
close all;
name = 'MovingTest3';
stateSwitch = textread([name,'/switch_state.txt']);
forceData = textread([name,'/force_data.txt']);
changeNums = [];
count = 1;
lastState = stateSwitch(1);
for i=2:1:max(size(stateSwitch))


if(stateSwitch(i)~=lastState)
changeNums(count) = i;
count = count+1;
end
lastState = stateSwitch(i);
end
figure
subplot(4,1,1);
plot(forceData(:,1));
plotLines(changeNums);
ylabel('Sensor T1');
title('Top Sensors');
subplot(4,1,2);
plot(forceData(:,2));
plotLines(changeNums);
ylabel('Sensor T2');
subplot(4,1,3);
plot(forceData(:,3));
plotLines(changeNums);
ylabel('Sensor T3');
subplot(4,1,4);
plot(forceData(:,4));
plotLines(changeNums);
ylabel('Sensor T4');
figure
subplot(4,1,1);
plot(forceData(:,5));
plotLines(changeNums);
ylabel('Sensor B1');
title('Bottom Sensors');
subplot(4,1,2);
plot(forceData(:,6));
plotLines(changeNums);
ylabel('Sensor B2');
subplot(4,1,3);
plot(forceData(:,7));
plotLines(changeNums);
ylabel('Sensor B3');
subplot(4,1,4);


plot(forceData(:,8));
plotLines(changeNums);
ylabel('Sensor B4');
saveas(figure(1),[name,'/',name,'top.fig'])
saveas(figure(2),[name,'/',name,'bottom.fig'])
saveas(figure(1),[name,'/',name,'top.jpg'])
saveas(figure(2),[name,'/',name,'bottom.jpg'])
function plotLines(changeNums)
for i=1:1:max(size(changeNums))
line([changeNums(i),changeNums(i)],[0,800],'LineWidth',2,'Color','Red')
end
end
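post_proc.m locates the samples at which the logged switch state changes and draws a vertical line at each one. The change-detection step can be sketched in Python (using 0-based indices, whereas the MATLAB loop records 1-based ones):

```python
def change_indices(states):
    """Return the indices at which the value differs from the previous
    sample, mirroring the change-detection loop in post_proc.m."""
    changes = []
    for i in range(1, len(states)):
        if states[i] != states[i - 1]:
            changes.append(i)
    return changes

print(change_indices([0, 0, 1, 1, 1, 0]))  # -> [2, 5]
```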

7.13 Functional Prototype


7.13.1 Arduino Code - Anticipatory Chair
#include <avr/io.h>
#include <avr/interrupt.h>
#include <EEPROM.h>
#include <stdlib.h>
/*
T: Turn - Turn the chair and when back on pressure sensor turn back
D: Drive - Driving position turn off smart code and have steering wheel come out
R: Retract - Go back and recline slightly, steering wheel goes in and smart mode is on
S: Swipe
Arduino:
Recline by pushing on force sensors
Override switch
*/
//#define SMART_CODE
#define DUMB_CODE
//#define RESET_POS
#define CALIBRATE_CHAIR
// Motor control lines
#define M1_switch 4
#define M1_direc 5
#define M2_switch 6


#define M2_direc 7
#define M3_switch 8
#define M3_direc 9
#define M4_switch 10
#define M4_direc 11
#define trial_pin A1
// Motor encoder input lines
#define M1_PIN 53 // PCI0 on PCINT0
#define M2_PIN 3 // INT5 interrupt
#define M3_PIN A8 // PCI2 on PCINT16
#define M4_PIN 2 // INT4 interrupt
// FSR control lines
#define MUX_BIT3 18
#define MUX_BIT2 19
#define MUX_BIT1 20
#define MUX_BIT0 21
// FSR input line
#define FSR_input A0
// M2 constants
const int M2_default = 650;
const int M2_D = 550;
const int M2_R = 950;
// M4 constants
const int M4_default = 650;
const int M4_D = 550;
const int M4_R = 800;
// Int Array to store FSR values
int FSR_values[12] = {0,0,0,0,0,0,0,0,0,0,0,0};
int M4_moveComplete = 1;
int M4_moveDirec = 0;
int M4_moveDelta = 0;
unsigned long int M4_moveStartTime = 0;
unsigned long int M4_moveStopTime = 0;
int M4_moveLastPos = 0;
int M4_moveStopDeltaTime = 800;
int M4_low_thresh = 150;
int M4_high_thresh = 675;
int M2_moveComplete = 1;


int M2_moveDirec = 0;
int M2_moveDelta = 0;
unsigned long int M2_moveStartTime = 0;
int M2_moveLastPos = 0;
int M4pos = 0; int M4lastpos = 0;
int M2pos = 0; int M2lastpos = 0;
int dumb_flg = 0;
unsigned long int trial_count = 0;
unsigned long int trial_count2 = 0;
unsigned long int codeStartTime = 0;
char last_incomingByte = 'l';
char incomingByte = ' ';
int drive = -1;
int twist = 0;
int code_disabled = 0;
int smart_code_disabled = 1;
char incoming_array[5] = {' ', ' ', ' ', ' ', ' '};
int incoming_count = 0;
int switchvar = 0;
unsigned long int lastTime = 0;
int forceDflg = 0;
int drive_isLast = 0;
void setup()
{
pinMode(M1_switch, OUTPUT);
pinMode(M1_direc, OUTPUT);
pinMode(M1_PIN, INPUT);
pinMode(M2_switch, OUTPUT);
pinMode(M2_direc, OUTPUT);
pinMode(M2_PIN, INPUT);
pinMode(M3_switch, OUTPUT);
pinMode(M3_direc, OUTPUT);
pinMode(M3_PIN, INPUT);
pinMode(M4_switch, OUTPUT);
pinMode(M4_direc, OUTPUT);
pinMode(M4_PIN, INPUT);
// Set MUX pins DDR
pinMode(MUX_BIT3,OUTPUT);


pinMode(MUX_BIT2,OUTPUT);
pinMode(MUX_BIT1,OUTPUT);
pinMode(MUX_BIT0,OUTPUT);
pinMode(FSR_input, INPUT);
pinMode(trial_pin, OUTPUT);
noInterrupts(); //disable interrupts
//Set Pin Change Interrupt Enable 0,1,2
PCICR |= (1<<PCIE2) | (1<<PCIE1) | (1<<PCIE0);
//Enable appropriate pin for pin change interrupt
PCMSK2 |= (1<<PCINT16);
PCMSK1 |= (1<<PCINT8);
PCMSK0 |= (1<<PCINT0); // Digital Pin 53
//Set External Interrupt 0 enable
EIMSK |= (1<<INT4) | (1<<INT5); // Pins 2 and 3
//Sense Control for EXTint0
EICRB |= (1<<ISC40) | (1<<ISC50); // Set it to trigger on pin change
interrupts(); //enable global interrupts
Serial.begin(9600);
Serial3.begin(9600);
codeStartTime = millis();
#ifdef RESET_POS
if(M4pos != M4_default)
moveM4(abs(M4pos-M4_default),-(M4pos-M4_default)/abs(M4pos-M4_default));
if(M2pos != M4_default)
moveM2(abs(M2pos-M4_default),-(M2pos-M4_default)/abs(M2pos-M4_default));
#endif
Serial.println("Init");
#ifdef CALIBRATE_CHAIR
moveM2(6000,-1);
moveM4(6000,-1);
while(M4_moveComplete != 1 || M2_moveComplete != 1)
{
M4_stallCheck();
M2_stallCheck();
//Serial.print(M4_moveComplete);
//Serial.print("\t");


//Serial.println(M2_moveComplete);
}
Serial.println("Maxed Out");
EEPROM.write(0,0);
EEPROM.write(1,0);
EEPROM.write(2,0);
EEPROM.write(3,0);
Serial.println("EEPROM Updated");
M4pos = 0;
M2pos = 0;
if(M4pos != M4_default)
moveM4(abs(M4pos-M4_default),-(M4pos-M4_default)/abs(M4pos-M4_default));
if(M2pos != M2_default)
moveM2(abs(M2pos-M2_default),-(M2pos-M2_default)/abs(M2pos-M2_default));
while(M4_moveComplete != 1 || M2_moveComplete != 1)
{
M4_stallCheck();
M2_stallCheck();
//Serial.print(M4_moveComplete);
//Serial.print("\t");
//Serial.println(M2_moveComplete);
}
Serial.println("Calibrated");
#else
// Read the EEPROM for last position
int M4low = EEPROM.read(0);
int M4high = EEPROM.read(1);
int M2low = EEPROM.read(2);
int M2high = EEPROM.read(3);
M4pos = word(M4high,M4low);
M2pos = word(M2high,M2low);
#endif
}
void loop()
{
if(code_disabled == 1)
{
//Serial.println("Waiting");

Final Documentation

}
else if(code_disabled == 0)
{
readFSRvalues();
EEPROM_Update(M4pos,M2pos);
// implement state machine here
if (Serial.available() > 0 || Serial3.available() > 0)
{
if(Serial.available() > 0)
{
incomingByte = Serial.read();
}
else if(Serial3.available() > 0)
{
incomingByte = Serial3.read();
}
/*
incoming_array[(incoming_count++)%5] = incomingByte;
Serial.print(incoming_array[0]); Serial.print("\t");
Serial.print(incoming_array[1]); Serial.print("\t");
Serial.print(incoming_array[2]); Serial.print("\t");
Serial.print(incoming_array[3]); Serial.print("\t");
Serial.println(incoming_array[4]);
*/
//twist = int(incomingByte);
if(incomingByte == 'E')
{
EEPROM_Update(M4pos,M2pos);
code_disabled = 1;
}
else if(incomingByte == 'P')
{
smart_code_disabled = 1;
}
else if(incomingByte == 'O')
{
smart_code_disabled = 0;
}


/*
DELETE Later
//Redefine incoming byte to be D after some time since initiation
if(millis() - codeStartTime > 8000 && forceDflg == 0)
{
incomingByte = 'D';
forceDflg = 1;
Serial.print("Forcing a D"); Serial.print("\t"); Serial.println(last_incomingByte);
}
*/
//Redefine incoming byte to be R based on FSR inputs from steering wheel
// FSRs 4 and 5 - Steering front
// FSRs 6 and 10 - Steering back
//Serial.println(last_incomingByte);
if(drive_isLast == 1 && FSR_values[6] < 50 && FSR_values[10] < 50 && FSR_values[4] >
600 && FSR_values[5] > 600)
{
drive_isLast = 0;
incomingByte = 'R';
}
if(M4_moveComplete == 1 && M2_moveComplete == 1 && incomingByte != last_incomingByte && twist == 0)
{
last_incomingByte = incomingByte;
switch(incomingByte)
{
case 'Q': //default position reset
if(M4pos != M4_default)
moveM4(abs(M4pos-M4_default),-(M4pos-M4_default)/abs(M4pos-M4_default));
if(M2pos != M2_default)
moveM2(abs(M2pos-M2_default),-(M2pos-M2_default)/abs(M2pos-M2_default));
break;
case 'D':
if(M4pos != M4_D)
moveM4(abs(M4pos-M4_D),-(M4pos-M4_D)/abs(M4pos-M4_D));
if(M2pos != M2_D)
moveM2(abs(M2pos-M2_D),-(M2pos-M2_D)/abs(M2pos-M2_D));
drive = 1; // for the steering motor


drive_isLast = 1; // for the new retract code


smart_code_disabled = 1;
break;
case 'R':
if(M4pos != M4_R)
moveM4(abs(M4pos-M4_R),-(M4pos-M4_R)/abs(M4pos-M4_R));
if(M2pos != M2_R)
moveM2(abs(M2pos-M2_R),-(M2pos-M2_R)/abs(M2pos-M2_R));
drive = 0;
if(switchvar%2 == 1)
{
smart_code_disabled = 0;
}
break;
case 'T':
//twist = 1;
break;
}
}
}
// Code for controlling steering mechanism
int timeconst = 20;
if(M4_moveComplete == 1 && M2_moveComplete == 1 && drive == 1)
{
trial_count = trial_count + 1;
if(trial_count < timeconst)
{
digitalWrite(M3_direc, HIGH);
digitalWrite(M3_switch, HIGH);
}
else if(trial_count == timeconst)
{
digitalWrite(M3_switch,LOW);
drive = -1;
trial_count = 0;
}
}
if(M4_moveComplete == 1 && M2_moveComplete == 1 && drive == 0)


{
trial_count = trial_count + 1;
if(trial_count < timeconst*2)
{
digitalWrite(M3_direc, LOW);
digitalWrite(M3_switch, HIGH);
}
else if(trial_count == timeconst*2)
{
digitalWrite(M3_switch, LOW);
drive = -1;
trial_count = 0;
}
}
// Code for controlling chair twist
// Code for controlling steering mechanism
//Serial.println(twist);
int timeconst2 = 20;
if(M4_moveComplete == 1 && M2_moveComplete == 1 && twist == 1)
{
trial_count2 = trial_count2 + 1;
if(trial_count2 < timeconst2)
{
digitalWrite(M1_direc, HIGH);
digitalWrite(M1_switch, HIGH);
}
else if(trial_count2 == timeconst2)
{
digitalWrite(M1_switch,LOW);
twist = 2;
EEPROM.write(4,twist);
trial_count2 = 0;
}
}
if(M4_moveComplete == 1 && M2_moveComplete == 1 && twist == 2 && FSR_values[8] >
M4_high_thresh)
{
twist = 3;


EEPROM.write(4,twist);
}
if(M4_moveComplete == 1 && M2_moveComplete == 1 && twist == 3)
{
trial_count2 = trial_count2 + 1;
if(trial_count2 < timeconst2)
{
digitalWrite(M1_direc, LOW);
digitalWrite(M1_switch, HIGH);
}
else if(trial_count2 == timeconst2)
{
digitalWrite(M1_switch, LOW);
twist = 0;
EEPROM.write(4,twist);
trial_count2 = 0;
}
}
// FSR Test code delete later!
Serial.print(FSR_values[4]); Serial.print("\t");
Serial.print(FSR_values[5]); Serial.print("\t");
Serial.print(FSR_values[6]); Serial.print("\t");
Serial.println(FSR_values[10]);
#ifdef DUMB_CODE
// Dumb Code
if(dumb_flg == 0)
{
//moveM2(200,1);
//moveM4(500,1);
//digitalWrite(M3_direc, HIGH);
//digitalWrite(M3_switch,HIGH);
//digitalWrite(trial_pin, HIGH);
dumb_flg = 1;
}
#endif
if(smart_code_disabled == 0)
{
// Smart Code


//Serial.println(FSR_values[7]);
if(FSR_values[7] > M4_high_thresh && M4_moveComplete == 1)
{
moveM4(400,1);
//Serial.println("a");
}
else if(FSR_values[7] < M4_low_thresh && M4_moveComplete == 1)
{
moveM4(400,-1);
//Serial.println("b");
}
if(M4_moveComplete == 2 && millis()-M4_moveStopTime > M4_moveStopDeltaTime)
{
M4_moveComplete = 1;
}
//printFSRvalues();
}
// Read EEPROM values
int M4posval = EEPROM.read(0);
int M2posval = EEPROM.read(1);
//Serial.print(M4posval);
//Serial.print("\t");
//Serial.println(M2posval);
}
}
ISR(PCINT0_vect)
{
int debounceTime = 500;
if(millis()-lastTime > debounceTime)
{
switchvar = switchvar + 1;
Serial.println(switchvar);
if(switchvar%2 == 1)
{
Serial3.flush();
code_disabled = 0;
smart_code_disabled = 0;
}


else if(switchvar%2 == 0)
{
smart_code_disabled = 1;
}
lastTime = millis();
if(M4_moveComplete == 2)
{
M4_moveComplete = 1;
}
}
}
ISR(INT4_vect)
{
M4_moveStartTime = millis();
M4_moveLastPos = M4pos;
if(M4_moveDirec == -1)
M4pos = M4pos - 1;
else
M4pos = M4pos + 1;
//EEPROM_Update(M4pos,M2pos);
//Serial.println(M4_moveComplete);
if(smart_code_disabled == 0)
{
if((M4_moveComplete == 0 && M4_moveDirec == 1 && FSR_values[7] < M4_low_thresh) ||
(M4_moveComplete == 0 &&
M4_moveDirec == -1 && FSR_values[7] > M4_high_thresh))
{
digitalWrite(M4_switch, LOW);
M4_moveComplete = 2;
M4_moveStopTime = millis();
}
}
if(abs(M4pos - M4lastpos) > M4_moveDelta)
{
digitalWrite(M4_switch, LOW);
M4_moveComplete = 1;


}
}
ISR(INT5_vect)
{
M2_moveStartTime = millis();
M2_moveLastPos = M2pos;
if(M2_moveDirec == -1)
M2pos = M2pos - 1;
else
M2pos = M2pos + 1;
//EEPROM_Update(M4pos,M2pos);
if(abs(M2pos - M2lastpos) > M2_moveDelta)
{
digitalWrite(M2_switch, LOW);
M2_moveComplete = 1;
}
}
void readFSRvalues()
{
/*for(int i=0;i<12;i++)
{
PORTD = i; FSR_values[i] = analogRead(FSR_input);
}*/
int time_delay = 2;
// Select each of the 12 multiplexer channels by writing the channel
// index in binary to the four select lines, then sample the shared
// analog input. Channel i maps to MUX_BIT3..MUX_BIT0 = bits 3..0 of i.
for(int i = 0; i < 12; i++)
{
digitalWrite(MUX_BIT3, (i >> 3) & 1);
digitalWrite(MUX_BIT2, (i >> 2) & 1);
digitalWrite(MUX_BIT1, (i >> 1) & 1);
digitalWrite(MUX_BIT0, i & 1);
delay(time_delay);
FSR_values[i] = analogRead(FSR_input);
}
//Serial.println("Read done");
}
void printFSRvalues()
{
for(int i=0;i<6;i++)
{
Serial.print(FSR_values[i]);
Serial.print("\t");
}
Serial.println(" ");
}
void M4_stallCheck(void)
{
//Stall check
//Serial.print(M4pos); Serial.print("\t"); Serial.println(M4_moveLastPos);


if(M4_moveComplete != 1 && (millis()-M4_moveStartTime > 2000) && (millis()-M4_moveStartTime < 4000) && (abs(M4pos - M4_moveLastPos) < 100))
{
Serial.println("M4 Stall");
digitalWrite(M4_switch, LOW);
M4_moveComplete = 1;
}
}
void M2_stallCheck(void)
{
//Stall check
Serial.print(M2pos); Serial.print("\t"); Serial.println(M2_moveLastPos);
if(M2_moveComplete != 1 && (millis()-M2_moveStartTime > 2000) && (millis()-M2_moveStartTime < 4000) && (abs(M2pos - M2_moveLastPos) < 100))
{
Serial.println("M2 Stall");
digitalWrite(M2_switch, LOW);
M2_moveComplete = 1;
}
}
void moveM4(int counts, int direc)
{
M4_moveComplete = 0;
M4_moveDirec = direc;
M4_moveDelta = counts;
M4_moveStartTime = millis();
M4_moveLastPos = M4pos;
M4lastpos = M4pos;
if(direc == 1)
{
digitalWrite(M4_switch, HIGH);
digitalWrite(M4_direc, HIGH);
}
else if(direc == -1)
{
digitalWrite(M4_switch, HIGH);


digitalWrite(M4_direc, LOW);
}
}
void moveM2(int counts, int direc)
{
M2_moveComplete = 0;
M2_moveDirec = direc;
M2_moveDelta = counts;
M2_moveStartTime = millis();
M2_moveLastPos = M2pos;
M2lastpos = M2pos;
if(direc == 1)
{
digitalWrite(M2_switch, HIGH);
digitalWrite(M2_direc, HIGH);
}
else if(direc == -1)
{
digitalWrite(M2_switch, HIGH);
digitalWrite(M2_direc, LOW);
}
}
void EEPROM_Update(int M4position, int M2position)
{
int M4high = highByte(M4position);
int M4low = lowByte(M4position);
int M2high = highByte(M2position);
int M2low = lowByte(M2position);
EEPROM.write(0,M4low);
EEPROM.write(1,M4high);
EEPROM.write(2,M2low);
EEPROM.write(3,M2high);
}
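EEPROM_Update stores each 16-bit encoder position as two separate bytes because AVR EEPROM cells are 8 bits wide; on startup the sketch reassembles them with word(high, low). The split-and-rejoin round trip, sketched in Python:

```python
def high_byte(value):
    # upper 8 bits, as Arduino's highByte() does
    return (value >> 8) & 0xFF

def low_byte(value):
    # lower 8 bits, as Arduino's lowByte() does
    return value & 0xFF

def word(high, low):
    # reassemble a 16-bit value, as Arduino's word() does
    return (high << 8) | low

pos = 650  # an M2/M4 encoder position, like those stored by EEPROM_Update
stored = (high_byte(pos), low_byte(pos))
print(stored, word(*stored))  # -> (2, 138) 650
```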
void setPwmFrequency(int pin, int divisor) {
byte mode;
if(pin == 5 || pin == 6 || pin == 9 || pin == 10) {


switch(divisor) {
case 1: mode = 0x01; break;
case 8: mode = 0x02; break;
case 64: mode = 0x03; break;
case 256: mode = 0x04; break;
case 1024: mode = 0x05; break;
default: return;
}
if(pin == 5 || pin == 6) {
TCCR0B = TCCR0B & 0b11111000 | mode;
} else {
TCCR1B = TCCR1B & 0b11111000 | mode;
}
} else if(pin == 3 || pin == 11) {
switch(divisor) {
case 1: mode = 0x01; break;
case 8: mode = 0x02; break;
case 32: mode = 0x03; break;
case 64: mode = 0x04; break;
case 128: mode = 0x05; break;
case 256: mode = 0x06; break;
case 1024: mode = 0x7; break;
default: return;
}
TCCR2B = TCCR2B & 0b11111000 | mode;
}
}
9.10.3 Kinect Code
Functional_System_Prototype.pde
import SimpleOpenNI.*;
import processing.serial.*; //import the Serial library
Serial port; // declare a Serial object named port for the connection to the Arduino
//SkeletonPoser turn;
SkeletonPoser drive;
SkeletonPoser retract;
//SkeletonPoser swipe;
SimpleOpenNI kinect;


char state = 'q';
void setup() {
port = new Serial(this, "/dev/tty.usbserial-A6003SBp", 9600); // initializing the object by assigning a port and baud rate (must match that of Arduino)
size(640, 480);
kinect = new SimpleOpenNI(this);
kinect.enableDepth();
kinect.enableUser(SimpleOpenNI.SKEL_PROFILE_ALL);
kinect.setMirror(true);
// initialize the pose object
// turn = new SkeletonPoser(kinect);
drive = new SkeletonPoser(kinect);
retract = new SkeletonPoser(kinect);
// swipe = new SkeletonPoser(kinect);
/*
// rules for the right arm
turn.addRule(SimpleOpenNI.SKEL_RIGHT_HAND, PoseRule.BELOW, SimpleOpenNI.SKEL_RIGHT_ELBOW);
turn.addRule(SimpleOpenNI.SKEL_RIGHT_HAND, PoseRule.RIGHT_OF, SimpleOpenNI.SKEL_RIGHT_ELBOW);
turn.addRule(SimpleOpenNI.SKEL_RIGHT_ELBOW, PoseRule.BELOW, SimpleOpenNI.SKEL_RIGHT_SHOULDER);
turn.addRule(SimpleOpenNI.SKEL_RIGHT_ELBOW, PoseRule.RIGHT_OF, SimpleOpenNI.SKEL_RIGHT_SHOULDER);
turn.addRule(SimpleOpenNI.SKEL_RIGHT_HAND, PoseRule.BEHIND, SimpleOpenNI.SKEL_RIGHT_ELBOW);
turn.addRule(SimpleOpenNI.SKEL_RIGHT_ELBOW, PoseRule.BEHIND, SimpleOpenNI.SKEL_RIGHT_SHOULDER);
*/
drive.addRule(SimpleOpenNI.SKEL_RIGHT_HAND, PoseRule.ABOVE, SimpleOpenNI.SKEL_RIGHT_ELBOW);
drive.addRule(SimpleOpenNI.SKEL_RIGHT_ELBOW, PoseRule.ABOVE, SimpleOpenNI.SKEL_RIGHT_SHOULDER);
drive.addRule(SimpleOpenNI.SKEL_LEFT_HAND, PoseRule.ABOVE, SimpleOpenNI.SKEL_LEFT_ELBOW);
drive.addRule(SimpleOpenNI.SKEL_LEFT_ELBOW, PoseRule.ABOVE, SimpleOpenNI.SKEL_LEFT_SHOULDER);
retract.addRule(SimpleOpenNI.SKEL_RIGHT_ELBOW, PoseRule.BEHIND, SimpleOpenNI.SKEL_RIGHT_HAND);
retract.addRule(SimpleOpenNI.SKEL_RIGHT_SHOULDER, PoseRule.BEHIND, SimpleOpenNI.SKEL_RIGHT_ELBOW);
retract.addRule(SimpleOpenNI.SKEL_RIGHT_ELBOW, PoseRule.BELOW, SimpleOpenNI.SKEL_RIGHT_HAND);
retract.addRule(SimpleOpenNI.SKEL_RIGHT_HAND, PoseRule.ABOVE, SimpleOpenNI.SKEL_LEFT_HAND);
retract.addRule(SimpleOpenNI.SKEL_RIGHT_ELBOW, PoseRule.ABOVE, SimpleOpenNI.SKEL_LEFT_ELBOW);
retract.addRule(SimpleOpenNI.SKEL_LEFT_HAND, PoseRule.BELOW, SimpleOpenNI.SKEL_TORSO);
retract.addRule(SimpleOpenNI.SKEL_RIGHT_HAND, PoseRule.RIGHT_OF, SimpleOpenNI.SKEL_LEFT_HAND);
retract.addRule(SimpleOpenNI.SKEL_RIGHT_ELBOW, PoseRule.BEHIND, SimpleOpenNI.SKEL_RIGHT_HAND);
retract.addRule(SimpleOpenNI.SKEL_RIGHT_ELBOW, PoseRule.IN_FRONT_OF, SimpleOpenNI.SKEL_RIGHT_SHOULDER);
/*
swipe.addRule(SimpleOpenNI.SKEL_RIGHT_HAND, PoseRule.LEFT_OF, SimpleOpenNI.SKEL_LEFT_HAND);
swipe.addRule(SimpleOpenNI.SKEL_RIGHT_ELBOW, PoseRule.LEFT_OF, SimpleOpenNI.SKEL_RIGHT_SHOULDER);
swipe.addRule(SimpleOpenNI.SKEL_RIGHT_ELBOW, PoseRule.IN_FRONT_OF, SimpleOpenNI.SKEL_LEFT_ELBOW);
*/
strokeWeight(5);
state = 'q';
}
void draw() {
background(0);
kinect.update();
image(kinect.depthImage(), 0, 0);
IntVector userList = new IntVector();
kinect.getUsers(userList);
if (userList.size() > 0) {
int userId = userList.get(0);
if( kinect.isTrackingSkeleton(userId)) {
// check to see if the user


// is in the pose
/* if(turn.check(userId))
{
//if they are, set the color white
stroke(255);
state = 'T';
println(state);
} */
if(drive.check(userId))
{
//if they are, set the color blue
stroke(0,0,255);
state = 'D';
}
/* else if(retract.check(userId))
{
//if they are, set the color red
stroke(255,0,0);
state = 'R';
}
else if(swipe.check(userId))
{
//if they are, set the color yellow
stroke(250,250,19);
state = 'S';
} */
else
{
// otherwise set the color to green DEFAULT STATE
stroke(0,255,0);
state = 'q';
}
// draw the skeleton in whatever color we chose
drawSkeleton(userId);
port.write(state);
}
}
}
void drawSkeleton(int userId) {


kinect.drawLimb(userId, SimpleOpenNI.SKEL_HEAD, SimpleOpenNI.SKEL_NECK);
kinect.drawLimb(userId, SimpleOpenNI.SKEL_NECK, SimpleOpenNI.SKEL_LEFT_SHOULDER);
kinect.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_SHOULDER, SimpleOpenNI.SKEL_LEFT_ELBOW);
kinect.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_ELBOW, SimpleOpenNI.SKEL_LEFT_HAND);
kinect.drawLimb(userId, SimpleOpenNI.SKEL_NECK, SimpleOpenNI.SKEL_RIGHT_SHOULDER);
kinect.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_SHOULDER, SimpleOpenNI.SKEL_RIGHT_ELBOW);
kinect.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_ELBOW, SimpleOpenNI.SKEL_RIGHT_HAND);
kinect.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_SHOULDER, SimpleOpenNI.SKEL_TORSO);
kinect.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_SHOULDER, SimpleOpenNI.SKEL_TORSO);
kinect.drawLimb(userId, SimpleOpenNI.SKEL_TORSO, SimpleOpenNI.SKEL_LEFT_HIP);
//kinect.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_HIP, SimpleOpenNI.SKEL_LEFT_KNEE);
//kinect.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_KNEE, SimpleOpenNI.SKEL_LEFT_FOOT);
kinect.drawLimb(userId, SimpleOpenNI.SKEL_TORSO, SimpleOpenNI.SKEL_RIGHT_HIP);
//kinect.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_HIP, SimpleOpenNI.SKEL_RIGHT_KNEE);
//kinect.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_KNEE, SimpleOpenNI.SKEL_RIGHT_FOOT);
kinect.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_HIP, SimpleOpenNI.SKEL_LEFT_HIP);
}
void drawLimb(int userId, int jointType1, int jointType2)
{
PVector jointPos1 = new PVector();
PVector jointPos2 = new PVector();
float confidence;
// draw the joint position
confidence = kinect.getJointPositionSkeleton(userId, jointType1, jointPos1);
confidence = kinect.getJointPositionSkeleton(userId, jointType2, jointPos2);


line(jointPos1.x, jointPos1.y, jointPos1.z,
jointPos2.x, jointPos2.y, jointPos2.z);
}
// user-tracking callbacks!
void onNewUser(int userId) {
println("start pose detection");
kinect.startPoseDetection("Psi", userId);
}
void onEndCalibration(int userId, boolean successful) {
if (successful) {
println("User calibrated !!!");
kinect.startTrackingSkeleton(userId);
}
else {
println("Failed to calibrate user !!!");
kinect.startPoseDetection("Psi", userId);
}
}
void onStartPose(String pose, int userId) {
println("Started pose for user");
kinect.stopPoseDetection(userId);
kinect.requestCalibrationSkeleton(userId, true);
}
SkeletonPoser.pde
/*
pose.addRule(SimpleOpenNI.LEFT_HAND, SkeletonPoser.ABOVE, SimpleOpenNI.LEFT_ELBOW);
pose.addRule(SimpleOpenNI.LEFT_HAND, SkeletonPoser.LEFT_OF, SimpleOpenNI.LEFT_ELBOW);
if(pose.check(userId)){
// play the song
// with debounce
}*/
class SkeletonPoser {
SimpleOpenNI context;
ArrayList rules;
SkeletonPoser(SimpleOpenNI context){
this.context = context;
rules = new ArrayList();


}
void addRule(int fromJoint, int jointRelation, int toJoint){
PoseRule rule = new PoseRule(context, fromJoint, jointRelation, toJoint);
rules.add(rule);
}
boolean check(int userID){
boolean result = true;
for(int i = 0; i < rules.size(); i++){
PoseRule rule = (PoseRule)rules.get(i);
result = result && rule.check(userID);
}
return result;
}
}
class PoseRule {
int fromJoint;
int toJoint;
PVector fromJointVector;
PVector toJointVector;
SimpleOpenNI context;
int jointRelation; // one of:
static final int ABOVE = 1;
static final int BELOW = 2;
static final int LEFT_OF = 3;
static final int RIGHT_OF = 4;
static final int IN_FRONT_OF = 5;
static final int BEHIND = 6;
static final int SAME_HEIGHT = 7;
PoseRule(SimpleOpenNI context, int fromJoint, int jointRelation, int toJoint){
this.context = context;
this.fromJoint = fromJoint;
this.toJoint = toJoint;
this.jointRelation = jointRelation;
fromJointVector = new PVector();
toJointVector = new PVector();
}
boolean check(int userID){
// populate the joint vectors for the user we're checking


context.getJointPositionSkeleton(userID, fromJoint, fromJointVector);


context.getJointPositionSkeleton(userID, toJoint, toJointVector);
int theta = 32;
fromJointVector.y = (fromJointVector.y * cos(radians(theta))) + (fromJointVector.z *
sin(radians(theta)));
fromJointVector.z = (-fromJointVector.y * sin(radians(theta))) + (fromJointVector.z *
cos(radians(theta)));
toJointVector.y = (toJointVector.y * cos(radians(theta))) + (toJointVector.z *
sin(radians(theta)));
toJointVector.z = -(toJointVector.y * sin(radians(theta))) + (toJointVector.z *
cos(radians(theta)));
boolean result;
switch(jointRelation){
case ABOVE:
result = (fromJointVector.y > toJointVector.y);
break;
case BELOW:
result = (fromJointVector.y < toJointVector.y);
break;
case LEFT_OF:
result = (fromJointVector.x < toJointVector.x);
break;
case RIGHT_OF:
result = (fromJointVector.x > toJointVector.x);
break;
case IN_FRONT_OF:
result = (fromJointVector.z < toJointVector.z);
break;
case BEHIND:
result = (fromJointVector.z > toJointVector.z);
break;
case SAME_HEIGHT:
result = (fromJointVector.y == toJointVector.y);
break;
default:
result = false;
break;
}
return result;
}

}
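The check() method first rotates both joint vectors about the x-axis to compensate for the downward tilt of the Kinect (theta = 32 degrees here). A rotation must read the original y and z before writing either one, which is why temporaries matter. A small standalone Java sketch of that step (the TiltCorrection class and plain arrays instead of PVector are illustrative assumptions):

```java
// Standalone sketch of the tilt-correction step: rotate a point about
// the x-axis by theta degrees, using temporaries so the original y is
// not clobbered before z is computed.
public class TiltCorrection {
    static double[] rotateX(double[] v, double thetaDeg) {
        double t = Math.toRadians(thetaDeg);
        double y = v[1];
        double z = v[2];
        // standard 2D rotation in the y-z plane; x is unchanged
        return new double[] {
            v[0],
            y * Math.cos(t) + z * Math.sin(t),
            -y * Math.sin(t) + z * Math.cos(t)
        };
    }

    public static void main(String[] args) {
        double[] joint = {0.5, 1.0, 2.0};
        double[] corrected = rotateX(joint, 32);
        // a pure rotation preserves the vector's length
        double before = Math.sqrt(joint[0]*joint[0] + joint[1]*joint[1] + joint[2]*joint[2]);
        double after  = Math.sqrt(corrected[0]*corrected[0] + corrected[1]*corrected[1] + corrected[2]*corrected[2]);
        System.out.printf("length before: %.4f, after: %.4f%n", before, after);
    }
}
```

Checking that the length is unchanged is a quick way to confirm the rotation was applied correctly; updating y in place before computing z (as a naive version would) breaks that invariant.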

7.13.2 Arduino Code - Buckle it Out


#include <Servo.h>
Servo smallMotor;
Servo largeMotor;
const int buttonPin = 2; // the number of the pushbutton pin
const int ledPin = 6;    // Pin 6: Teensy++ 2.0 has the LED on pin 6
// variables will change:
int buttonState = 0; // variable for reading the pushbutton status
boolean wheelStateIn = true; // state of the wheel
int pos = 40;
// This is where you change the starting and driving angle of the wheel out
const int outposition = 100;
const int inposition = 25;
// Variables
int i = 0;
void setup() {
// Connecting servos to Teensy pins
smallMotor.attach(15);
largeMotor.attach(16);
// initialize the LED pin as an output:
pinMode(ledPin, OUTPUT);
// initialize the pushbutton pin as an input and activate the internal pullup resistor:
pinMode(buttonPin, INPUT_PULLUP);
// Initialize the starting position of the servos
smallMotor.write(inposition);
largeMotor.write(94);
Serial.begin(9600);
}
void outSequence()
{
// Debug command
Serial.println("OutSequence");
// Large motor sequence
for(i = 0; i < 6; i++)
{
if(i == 0)
largeMotor.write(99);
if(i == 5)
largeMotor.write(84);
delay(106);
}
largeMotor.write(94);
// Syncing time
delay(10);
// Small motor sequence
for(pos = inposition; pos < outposition; pos += 1)
{
smallMotor.write(pos);
delay(15);
}
}
void inSequence()
{
// Debug command
Serial.println("InSequence");
// Small motor sequence
for(pos = outposition; pos >= inposition; pos -= 1)
{
smallMotor.write(pos);
delay(15);
}
// Syncing time
delay(10);
// Large motor sequence
for(i = 0; i < 6; i++)
{
if(i == 0)
largeMotor.write(80);
if(i == 5)
largeMotor.write(97);
delay(110);
}
largeMotor.write(94);
}
void loop(){
// Button debounce while polling: read twice, 10 ms apart, and
// act only if both reads agree
buttonState = digitalRead(buttonPin);
delay(10);
if (buttonState == digitalRead(buttonPin))
{
// Debug command
Serial.println(buttonState);
// Random delay
delay(100);
if (buttonState == LOW) {
// LED confirmation of button for debugging
digitalWrite(ledPin, HIGH);
if(wheelStateIn == true) {
outSequence();
wheelStateIn = false;
}
else {
largeMotor.write(94);
}
}
else {
// LED confirmation of button for debugging
digitalWrite(ledPin, LOW);
if(wheelStateIn == false)
{
inSequence();
wheelStateIn = true;
}
else {
largeMotor.write(94);
}
}
}
// Securing motor to stop if malfunctioning
largeMotor.write(94);
}
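loop() above polls the button, debounces it by reading the pin twice 10 ms apart, and uses the wheelStateIn flag so that a press runs the out sequence exactly once and a release runs the in sequence exactly once. The same toggle logic can be sketched off-hardware in plain Java (the WheelToggle class and method names are illustrative, not part of the original firmware):

```java
// Toy model of the polling loop: a press (LOW) drives the wheel out,
// a release (HIGH) drives it back in, and repeated identical readings
// leave the state unchanged (the servo just holds position).
public class WheelToggle {
    static final int LOW = 0, HIGH = 1;
    boolean wheelIn = true; // wheel starts retracted

    // Returns the action taken for one debounced reading
    String onReading(int buttonState) {
        if (buttonState == LOW) {
            if (wheelIn) { wheelIn = false; return "outSequence"; }
            return "hold"; // already out: just hold position
        } else {
            if (!wheelIn) { wheelIn = true; return "inSequence"; }
            return "hold"; // already in: just hold position
        }
    }

    public static void main(String[] args) {
        WheelToggle w = new WheelToggle();
        int[] readings = {HIGH, LOW, LOW, HIGH};
        for (int r : readings) {
            System.out.println(w.onReading(r));
        }
    }
}
```

Running the sample sequence prints hold, outSequence, hold, inSequence: the state flag is what prevents a held button from re-triggering the servo sweep every cycle.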


7.14 Overview

The approach of Team Audi Evolve to creating a design solution began by first getting a better understanding of the problem statement. Understanding the existing technology surrounding autonomous driving and driver assistance systems helped us to see what the challenges in this design space truly were and what was currently being done, successfully or unsuccessfully, to overcome these challenges. This, in addition to research into the year 2035 for social trends and user needs, will provide valuable insights for the designs.

Further research and investigation was conducted in the following areas through benchmarking, prototyping, and needfinding:

- Control transitions from a human to a machine and a machine to a human
- Mechanisms for steering or controlling a car
- Confirmation cues indicating that a vehicle or machine is properly functioning
- Motion sickness prevention
- Cabin space design and use

The knowledge gained from these experiences will come together in the following quarter through prototypes that begin to integrate all components necessary to design a solution that provides an excellent riding and driving experience. The following sections describe in greater detail the approach the team took to understand the problem statement and begin to tackle it.

In the following chapters, the knowledge gained from these experiences will come together through prototypes, integrating all components necessary to design a solution that provides an excellent riding and driving experience. The topics mentioned above are described in greater detail in the needfinding and benchmarking chapters.


7.15 Design Reasoning

For old corporations, like Volkswagen Group, it is important to nurture their ecosystem in various ways. According to Growth Agenda Limited, nurturing the ecosystem can be divided into three different types of growth: Core Growth, New Growth, and Emerging Growth. These three growths are basically separated by their differences in radicalness and time frame.

Core Growth includes incremental innovations that do not offer dramatic changes but are enough to keep the products fresh. Volkswagen Group maintains core growth by launching a new car model every few years (Figure 13).

Figure 13: Audi A3 Series

New Growth offers substantial innovations through new products to consumers. For example, this was what Volkswagen Group's Audi did by introducing the new SUV Q7 in 2005, which introduced a new type of car to the Audi brand that they did not previously offer. Audi has continued introducing new vehicles in that pedigree branch (Q5 in 2009 and Q3 in 2011).

Long-term innovations that are radical and disrupt the current industry address Emerging Growth. This is at the center of autonomous vehicle design and research at Audi. Cabin space design is a crucial part of the whole experience in an autonomous car and it is part of Audi's future investment. This technology-based innovation and looking to the future for areas of emerging growth is what our corporate project for Audi is about. (Figure 14)

Figure 14: Focus of our project - Adapted from Growth Agenda Limited. Emerging Growth (the focus of our project) covers long term bets, strategic ventures, future investments and cross sector partnerships; New Growth covers business model innovation, adjacent sector disruption, new market entry and new capabilities; Core Growth covers new products and services, new customer acquisition and operational enhancement - all nurturing the ecosystem.

7.16 Development Strategy

To be able to give some direction to our thinking process, we needed to dig deep into the current state of the car industry, gathering all possible data that might be useful for our final prototype. In order to achieve this goal, we followed Stanford University's design process (Figure 15). First, we analysed the project brief and defined the problems we were solving. With such information we were able to direct our needfinding and benchmarking.

Needfinding includes finding out the users' needs through observation, interviews, surveys, perspective tools, and information search. Benchmarking, on the other hand, includes experimenting with current solutions and technologies. (Figure 16) With benchmarking we try to get the best possible knowledge of the current state of technology so that we are able to push the limits. With benchmarking we might also get the golden idea of how something could be done differently.

Figure 15: Stanford University (ME310) Design Process - (re)define the problem; needfinding and benchmarking to understand the users and design space; brainstorm to generate as many ideas as possible; rapid prototype to explore ideas; test and learn from prototypes; iterate to keep refining the concepts.

Figure 16: Division between needfinding and benchmarking - needfinding covers observation (self-observation and observing others), context, ideal persona, interviews & surveys, and information search; benchmarking covers physical, experimental, and functional benchmarking.

7.17 Future Assumptions

Assumptions help us build a picture of the possible future world. In order to get a feasible frame for our design work, especially when it involves designing something so far in the future for technology, we have to make assumptions about the world. These educated guesses are derived from the research, needfinding and benchmarking we have done. Sources for these iterations can be found among the appendices (Sections 8.2 & 8.3).


7.17.1 Future user

While it is important to predict the state of the future technology and urban mobility in general, it is also a good idea to predict the behavior and nature of the future user. The envisioned design solution needs to address the needs of this future user.

The current generation has grown up in the technology revolution. People are already used to the fast-paced development in technology, and the effect of these new technologies on our lives is continually growing. Smartphones and smart handheld devices have redefined the way we interact with digital media. If smart and self-aware cars of the future are well integrated in the lives of users, we can envision that people would show the same enthusiasm towards these autonomous machines and new upcoming technology as they are currently showing towards smartphones.

Internet and social networks have surpassed the boundaries of countries in terms of connecting people across distant areas of the world. Accessing a wide range of information has never been this easy and it is going to get better in the future. There is the concept of perpetual connectivity which is being predicted for the future. It is only a matter of time before all the existing technologies are developed to such an extent that they become a part of our lives and blend in so well that we cannot do without them. Specifically in terms of automotive experiences, it is already being seen that the current generation views driving as a distraction from texting rather than the other way around. Since being connected is so easy, people want to stay connected. This does not mean that the future users would not love driving. Experiencing moments of thrill and adventure would still be desirable in the future, but the essential difference would lie in doing things because users want to do them rather than being forced to do them. It is an extrapolation of the current scenario when people are forced to drive along freeways with all the traffic only because they have to travel to and from work every day. Based on this insight, it is highly probable that users will find it desirable to have an option to ride in a cabin space that is customized to their needs and the activities that they would want to do, to better utilize this time lost in commuting.

The internet age has also led to more liberal thinking. Non-conventional work options are being explored. It has been envisioned that new technology will lead to a great shift in working spaces, work cultures and procedures. The future users are most likely going to work in an environment where physical presence is no longer required on a daily basis. In such a situation and with the increasing influence of autonomous cars, it is highly probable that the future user is not a very good driver without the basic assistance systems. The perceived completely manual mode of driving is very different from the existing perception of manual driving. There will be many assistance systems in place in the future cars. This prediction can be justified on the basis of the experiences of pilots in airlines which were fit with autopilots and new assistance systems in the 20th century. There was a time when all the pilots were skeptical about adopting, getting used to and trusting this technology. But currently, pilots rely so much on this new technology that most of them cannot do without it.

7.17.2 Future infrastructure

- The number of cars will radically increase globally
- The number of cars in developed countries will decrease and in developing countries will rise
- Developed countries will shift towards electric vehicles, but in developing countries gasoline will still be the primary fuel
- China will be the biggest market for cars. It will also be the largest car manufacturer in the world
- The Middle East will lose its position in oil-based wealth, because of the lack of interest for oil and because the oil reserves are going to expire
- Because the number of people on the planet will rise, there will be huge demand for food, thus countries that are rich in farmland will be major powers in the future
- Because of global warming, ice in Siberia and the north of Canada and Europe will melt, which will make those countries new world leaders in farmland and as such the new global leaders
- By 2030, the transition between the real and virtual world will be complete. Users will feel and see with their senses everything that a person in the virtual world sees and feels. The technology will already be available by 2020, but it will not be safe and legal until 2030
- More and more people will work from home and live with parents, which will bring a major decline in marriages. People will have no opportunities to meet people of the opposite sex.

7.17.3 Future technology

Augmented reality - This technology is currently improving our perception of the world by using different technologies like sound, video, graphics, real time position data and haptics. In the context of the car, it might mean that the futuristic cabin space consists of a windshield that has been transformed into an interactive interface completely integrated into the driver's view of the road ahead. Automotive companies have already started working on HUDs (Head Up Displays). The most extreme case would be to use virtual reality to virtually transport the users to any place they want rather than being limited to the car cabin space. (Figure 18)

3d printing - The paradigm shift from copying to mass customization is offered by new technologies that can produce free form shapes with very little cost. In the future, it can be envisioned that there is automated manufacturing by robots and 3D printers of any imagined form from different materials. This will have a crucial effect on everything starting from nano products to high rise buildings and future infrastructure development. (Figure 17)

Figure 17: 3d Printing
Figure 18: Augmented Reality

Graphene - Graphene is a one-atom-thick layer of graphite. It is currently the best electrical and heat conductor in the world and it provides almost no electrical resistance. Also, it is transparent. In practice, it is predicted to be an essential part of all electronics in the near future. Because it is so thin and transparent, the thickness and transparency of the application depends on the material it is applied on.

Claytronics - Claytronics is programmable matter. It is constructed by having miniature spherical computers, CADAMS (a 1 mm diameter size is currently in development), that are connected to each other by magnetic forces and can be programmed to move and change shapes according to our needs. A first application might be a 3d fax machine that can send information with which the same 3d object is recreated somewhere else. In the car design field, it might be possible in the future to use this matter to have an adjustable interior that can change shape and color on command and communicate with us at the same time. In practice it means that the seat can be in a perfect shape for your comfort and at the same time move around providing you with a nice massage. The user can change the interior to create a bed to sleep on, or moving pieces can provide a treadmill to get in shape while traveling to work. Applications of this technology are limitless. At this point it is still under development. First prototypes used relatively large CADAMS, but new research is being done with 1 mm diameter pieces. Still, the magnetic forces between pieces are not strong enough to support large weight, but scientists are hoping to improve that in the near future. If the development of claytronics follows the curve of development of other computer technologies, it is predicted that there will exist a 3d fax machine within 5 years.


Robotics and Artificial Intelligence - Robots are getting more and more humanoid every day. Already there are a few robots in Japan that behave like people, but their artificial intelligence is still pretty limited. The evolution of robots is in close relation with the development of AI. The machines are learning and gaining experience through time. According to that, a machine responds to every situation in such a way that maximizes its chances of success. Autonomous cars are going to be just one of the applications of this robotic and artificial intelligence technology in the future. (Figure 19)

Figure 19: Humanoid Robot ASIMO

Telepresence - Telepresence refers to a set of technologies which allow a person to feel as if they were present, to give the appearance of being present, or to have an effect, via telerobotics, at a place other than their true location. Today, a common application is in video conferences. In the future, it might include robots mimicking human presence (telerobotics) or real time 3D holographic images being projected across the globe. In future cars, this might include having constant company while being driven somewhere, or communication between two drivers, via telepresence.

Artificial Intelligence - Artificial intelligence is the intelligence of machines and robots. The machine is learning and gaining experience through time. According to that, it is behaving in every situation in such a way that maximises its chances of success.

Wireless energy transfer - This is a technology which is already common in current portable electronics. Today's gadgets are charged with wireless technology. The gadgets are just placed on a pad which is plugged in, and they start charging. In the future, wireless technology can impact other markets, including the car industry. Parking spaces can be charging stations, cars can be without wires, and everything inside of a car can be automatically charged and powered. There is even the possibility of having dynamic wireless charging of car batteries.


7.17.4 Ideal Future Persona

The Audi project aims for users in the year 2035, at which time Audi's customer base will have shifted considerably from now. From the demographic perspective, the US market is nearly saturated, and China is likely to become the next largest car consuming country. From the consumer behavior perspective, the new consumer generation is one born in comfort, well educated, and familiar with IT since childhood. In addition, different from Audi's current business which mainly focuses on personal cars, the future business might be driven by mature autonomous driving technology and step into the car sharing area. Instead of owning a car, customers would be able to order a car online that allows great levels of customization. During the fall quarter, the team considered future users as the car owners, but next steps would involve developing the persona further based on predicted subscription based business models.


The first persona was created under conservative imagination. This persona was highly affected by our current view of the world, attitudes and stereotypes of the typical Audi users at the moment. Tommy Yuppie, a 35-year-old American investment banker, works on Wall Street, New York. He's married, and has a 5-year-old girl. In his free time, he likes to play golf, work out, and party with friends. Living outside of the city, he has to drive to work every morning. He always wants to use time efficiently instead of spending it on waiting. That's why he needs the self-driving car to drive him in rush hour so that he can do conference calls, prepare meetings, or take a nap. (Figure 20 - top)

The second persona integrates our advanced future predictions. Based on those, our ideal user is a 43-year-old Chinese man named Howard Huo. He works in a hospital doing nano medic research. He is divorced, and has one child living with her Mom. He is well paid from work, and loves to spend money eating in restaurants with colleagues and friends. He uses the car to locate the best restaurant in Guangzhou. Sometimes he gets drunk, and needs the car to take him home safely. Every Saturday, he picks his daughter up from her Mom's. They drive outside the city to enjoy nature and then Howard takes over the manual control. (Figure 20 - bottom)

Figure 20: First ideal persona (top); second ideal persona (bottom)


7.18 Needfinding

To get the best possible understanding of our viable future users, the team did many different types of needfinding. Needfinding is an exercise of understanding and building empathy for the target user group by conducting interviews and ethnographic studies. This is at the core of the ME310 design process of understanding the problem. This section introduces the key discoveries.

To get a perspective of the challenge which was handed to us, the team began by gathering information with different tools introduced by Bill Cockayne. First, the team used a tool called Context Map. The purpose of this tool is to start figuring out the central concerns in the corporate project brief. After realizing it was the cabin of the car, the team identified eight different stakeholders involved in the ideal Autonomous Car's Cabin. In conclusion we came up with the graph shown in Figure 21 below.

Figure 21: Context map of the Auto-Piloted Car Cabin and its stakeholders, including users, drivers, cars, workspaces, driver assistance, interfaces, motion sickness, and infrastructure.


7.18.1 Context Map

To get a perspective of the challenge which was handed to us, the team began by gathering information with different tools introduced at Stanford lectures. First, the team used a tool called Context Map. The purpose of this tool is to start figuring out the central concerns in the corporate project brief. After realizing it was the cabin of the car, the team identified eight different stakeholders involved in the ideal Autonomous Car's Cabin, as seen in the figure above.

7.18.2 Self-Observation

The team began the design process by gaining first hand experience in several key areas described in the design development overview. Understanding what it feels like to be the user has provided important insights into the design space.

No Visuals Experiment

The idea behind this experiment was to observe how it feels to be in public transportation without being able to see anything. A team member sat in the middle seat of the bus with eyes covered and closed. The team member remained awake for the duration of the experience. The bus ride was about an hour long through curvy roads. The purpose of this experiment was to see if the removal of the outside view from our future cabins would affect future users and in what ways.

Observations

- The team member started to feel motion sick relatively quickly (after about 5 minutes)
- The person doing the test had experienced motion sickness in moving vehicles before, but this experiment expedited it
- This may be caused by the incongruous information the brain is processing: feeling the movement of the bus, but not seeing the movement

Conclusions / Lessons Learned

- Sight cannot be blocked totally. People in a moving vehicle need to get some visual signals or confirmation of the movement

No Sound Experiment

The team wanted to experiment with blocking sound and see if it had an effect on the user like blocking sight did. For this experiment, the test person put noise cancelling plugs in and had a normal one-hour-long bus ride while sitting in the middle seat of the bus. The test person was able to see everything during the whole journey.


Observations

- Noise cancellation did not have a dramatic effect, since the test person did not feel motion sick at all
- The test person was more focused visually on the outside environment, since phonological sensations did not distract the test person's vigilance

Conclusions / Lessons Learned

- Blocking the sound does not negatively impact motion sickness
- Improving concentration on visuals may actually help prevent motion sickness

Normal Trip In a Bus I

The experiment was done to test the hypothesis that people get more motion sick when sitting in the back of the bus, by first testing how the user feels when performing activities while sitting in the front of the bus. The team member began their bus ride sitting in the front of the bus doing different activities such as reading, listening to music, playing with a smartphone, and using a laptop.

Observations

- Listening to music and being aware of the environment did not cause any sense of motion sickness
- Reading, playing with a smartphone, and using a laptop caused a minor feeling of motion sickness after about 15 minutes
- The feeling of motion sickness eased almost immediately after looking outside the bus windows

Conclusions / Lessons Learned

- Visual awareness proved to be the most crucial factor for motion sickness in this case
- Even a little clue of motion once in a while prevents the feeling of motion sickness

Normal Trip In a Bus II

This test was done the same way as Normal Trip In a Bus I (different activities involved), but now the team member sat in the back of the bus in order to determine if seat location impacted motion sickness.

Observations

- In this case, the test person started to feel minor motion sickness when just sitting in the rear part of the bus

Conclusions / Lessons Learned

- Motion sickness is highly subjective; some people get motion sick more easily. This hypothesis might get some assurance from our survey
- The feeling of motion sickness when just sitting in the rear part of the bus might be caused by several different factors, for example: 1. There is not that much room around the passenger (wall directly behind) 2. The sense of acceleration is greater in a turn (longer radius to the wheels) 3. Worse air ventilation


Normal Trip In a Bus III

This setup differed a bit from the first two Normal Trips. The team member sat backwards in the front part of the bus (i.e. with his back facing the direction of motion). This was to test the impact of travelling while facing the opposite direction of motion.

Observations

- The team member did not experience motion sickness for the majority of the trip, except slightly at the end
- Even though the passenger has awareness of motion, the passenger was not able to predict and prepare for upcoming turns

Conclusions / Lessons Learned

- The fact that the passenger cannot predict what is going to happen might be one of the reasons for motion sickness

7.18.3 Interviews

Interviews are key in acquiring personal data from users and beginning to empathize with them. The team tried to interview people that have experienced situations similar to driving in an autonomous vehicle. Also, speaking to futurists, future enthusiastic users, and experts provides even more insights into the motivations and needs of the Audi user in 2035.

Bus interviews

Riding in an autonomous car can resemble the experience of being driven in public transportation. Since there is no need for driving, users can focus on doing something else. During daily commutes, people can dedicate their time to whatever they need. In order to learn about current habits during commutes, the team interviewed people commuting on trains, buses and metros in Helsinki. (Figure 22)


Keynotes

- If there is no need for transferring, or if they are familiar with the exact duration of the travel, people tend to better utilize their time on public transport
- People organize their time better if it is a longer trip (a trip between cities)
- People rarely do work on public transportation; usually they listen to music, read or just sit and think
- One passenger said that she doesn't drive often and would feel more confused and insecure with all the driver assistance systems
- Every time she drives, she thinks about the safety of her dog in the back
- One girl thought it was boring to watch the same scenery every day while going to work
- Many people consider traveling time as time to rest

Car enthusiast I, II, and III

These interviews were about gaining insights into what makes car enthusiasts not only love Audis, but love driving. Racecar drivers and other car enthusiasts were the team's target users for a series of interviews. These users pay attention to certain details that a casual driver disregards or might not even appreciate.

Key Notes

- Car dynamics (e.g. the aerodynamics of the car) are really important in high speed driving
- In order to be able to drive on a track, the hands must be able to cross over each other
- On the track, power steering makes driving slightly worse, but is necessary in order to drive at low velocity
- Many current cars turn off the power steering at high velocity
- Drivers like to hold the bottom part of the steering wheel to get a better feeling of the road
- The steering wheel is the heart and soul of racecar driving

Car Laboratory interview

The Aalto team visited the Car Laboratory in Aalto University to speak with experts involved in research and innovations in the automotive industry. The goal was to learn what directions they see the automotive industry moving in over the next 25 years and what aspects are important to consider when designing future vehicles. The interviewee was already familiar with similar works in this area and he was very knowledgeable. He showed the team a presentation explaining in detail everything he thought was important in designing a car for the future. (Figure 23)

CarLab Interview

Key Notes

- Passive safety in the car wasn't as safe as expected. The explosion of the airbag is very powerful and often injures the passenger
- Feedback from the car movements comes from the hands on the wheel the fastest (0.1 - 0.3 sec), followed by the inner ear (balance) and sight
- Passive safety will be overtaken by active safety (if autonomous driving is so safe, there will be no need for seatbelts and airbags any more)
- The extreme ends of drivers will be younger and older than today, because autonomous driving will allow it. Together with new user demographics comes a certain change in driving habits
- Cultural considerations must be understood in designing for different regions - western cultures are individual and eastern cultures are more family oriented
- Steering with feet might be a more intuitive way to steer, because it resembles already learned walking movements

Figure 25: Pilot Interviews

Pilot interviews I and II

To understand the transition experience from autonomous mode to manual mode and from manual mode to autonomous mode, the team went to interview pilots at a private jet company called Jetflite Ltd. The team also interviewed an amateur pilot who flies planes as a hobby. The team hoped to get some insight into how it feels to hand over the controls to a machine and to be onboard when the machine is in control. (Figure 25)
Key Notes

- Autopilot is more like a cruise control; the pilot has to be aware of the situation
- There are two pilots in a private jet cabin just to make sure there is always at least one human who is aware of what the plane is doing
- What does an airplane autopilot do? Function 1: Controls the trim tab and stabilizes the plane (basic). Function 2: Keeps the plane at a certain altitude. Function 3: Flies a predefined route after take-off.
- Autopilot is turned on and off by pressing a button
- Usually autopilot is not used for takeoffs and landings
- When the plane is in autopilot mode, the pilot is occupied checking the weather, estimating landing times, checking fuel levels, etc.
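The three autopilot functions the pilots described stack naturally as layers of automation. As a toy illustration only (our sketch, not avionics code), such a layered mode model might look like:

```python
# Toy illustration (not real avionics): the three autopilot functions the
# pilots described, modeled as cumulative layers of automation.

AUTOPILOT_LAYERS = [
    "stabilize",      # Function 1: control the trim tab, keep the plane stable
    "hold_altitude",  # Function 2: keep the plane at a set altitude
    "follow_route",   # Function 3: fly a predefined route after take-off
]

def active_layers(level):
    """Level 0 = fully manual; each higher level adds the next function."""
    return AUTOPILOT_LAYERS[:max(0, min(level, len(AUTOPILOT_LAYERS)))]
```

The same cumulative-layer framing maps loosely onto how autonomous car features could be switched on one capability at a time.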

- Pilots imagined certain situations where the autonomous functions of the car are the most usable: a truck driver gets tired during long-distance travel, in traffic jams, parking, and for preventing human errors (semi-autonomous)
- In order to ensure the human trusts the autopilot, the machine creates situational awareness by providing regular information updates (current speed, gas status, distance to destination, weather, etc.)
- To make the transition between autopilot and manual mode as easy as possible, the driver should decide exactly when they wish to take over control
- Passive transition: in case the system goes wrong, the pilot is alerted beforehand. The alert can be sound, seat vibration, or a color change. E.g. if the car is about to run out of gas, the car should remind the driver of a mandatory stop at the next closest gas station.

Pilot Instructor at Palo Alto Regional Airport

The pilot instructor showed the team the cockpit of a Cessna airplane (two-seater). The experience in a small aircraft with the autopilot activated is very similar to the concept of an autonomous vehicle driving. The team was able to experience what kinds of activities the pilots perform in the cabin space while autopilot is engaged, and the different kinds of steering mechanisms used. (Figure 26)

Key Notes

- Different steering mechanisms - joystick and yoke steering (personal preference; both of them are very sensitive)
- It takes time for the pilot to synchronize manual steering with the previous plane movements from the autopilot. Especially in windy weather, when the plane has trimmed itself to fly against the wind, the pilot has to be able to adjust his steering to the weather conditions during the transition.
- Autopilot has three modes - ascending/descending, waypoints, and altitude
- Autopilot landing is required during bad weather
- Semi-autopilot - a guidance system which the pilot manually follows
- Pilots use smartphones and iPads in the cockpit; they don't always have their hands on the controls
- No jerk occurs when transitioning to manual control
- There is limited space surrounding the yoke
- Layered and redundant safety mechanisms exist to ensure safety
- In the flight autopilot, pilots perform a precautionary action - keeping their hands near the controls - during the transition from manual mode to autopilot

Figure 26: Cockpit of Cessna Plane

Conclusions
On longer trips, pilots perform activities like
reading and playing games on electronic
devices when in autopilot. Having a
comfortable and flexible cabin space is
important to be able to enjoy the riding
experience. On the transition side, the
interface with which the pilot interacts should
be more intuitive, non-redundant, and more
user friendly.


7.18.4 EMT (Emergency Medical Technician)

The team was able to view the back end of an ambulance at the Stanford Hospital. EMTs have to work in the back of these vehicles while being transported to the hospital in a fast manner, while performing medical procedures on patients. (Figure 27)

Key Notes

- They have medic catchers for secondary safety. Though there are seatbelts for primary safety, these seatbelts are not that safe and are often not used, since the medics need to reach across the patient to the other side of the ambulance.
- The height of the passenger cabin in the vehicle is also very important for more workspace and easy accessibility into and out of the ambulance for medics
- He noted that it never gets comfortable being in the back of an ambulance. They just deal with the discomfort.
- Stability is an issue, especially when performing delicate tasks like giving an IV
- He told us there was no real solution for stabilizing IV administration; they just do their best with what they have
- Accessibility of equipment is necessary

Conclusions/Lessons Learned

Having a workspace that is flexible, organized, and comfortable is important, especially when performing delicate tasks. A space that is designed for many activities instead of one particular activity is key.

Figure 27: EMT Vehicle Interior (shelves, patient bed, seat belts)


7.18.5 Survey

To discover users' potential needs while driving or riding in a car, the team created a survey about current drivers' driving habits, their favorite activities inside a vehicle, and unpleasant moments they have experienced as a driver or a passenger. The survey had 9 questions with no direct indication of autonomous driving. It was sent out to our Facebook friends. (Figure 28: Survey Results)
Results

- 62 responses. 77% of respondents are in the age group 18-26, and 21% are in the age group 27-35. 39% are female; 61% are male.
- Females enjoy driving because of: privacy, easy door-to-door transportation, music, good scenery, and hanging out with friends
- Males enjoy driving because of: having their own time, driving at the limits (speed and drifting), and the feeling of control
- Some crazy things they would like to do in the car: massage chairs, stretch, sleep, watch a movie, hang out with people in the back seat, prepare food, read, text while driving, shower, surf the internet, and game
- 60% of survey takers get motion sick while reading in the back of a car and on a bus. More than 80% of survey takers never get motion sick while reading in a train, metro, or plane.


Figure 28: Survey Results

What is your age? 18-26: 48 (77%); 27-35: 13 (21%); 36-45: 1 (2%)
What is your gender? Male: 38 (61%); Female: 24 (39%)

How often do you feel motion sick while reading in the following vehicles?
- Metro: Always 5%, Sometimes 10%, Hardly 85%
- Plane: Always 0%, Sometimes 15%, Hardly 85%
- Car front seat: Always 15%, Sometimes 21%, Hardly 65%
- Car back seat: Always 24%, Sometimes 35%, Hardly 40%
- Bus: Always 15%, Sometimes 44%, Hardly 42%
- Train: Always 3%, Sometimes 16%, Hardly 81%

Conclusions / Lessons Learned

- People want a more comfortable setup and more space in the car
- People want more communication and engagement with other passengers
- People want to experience a dangerous driving mode without sacrificing safety
- People want to use their time more efficiently on long driving trips (multi-tasking)
- People don't want to drive in situations like: boring scenery, long waiting, parking

As a passenger, what kind of thing have you done inside a vehicle?

- Phone calling & texting: 61 (98%)
- Internet surfing through mobile phone: 49 (79%)
- Listening to music: 61 (98%)
- Mobile gaming: 38 (61%)
- Reading: 42 (68%)
- Working (laptop or paper): 29 (47%)
- Chatting with friends: 55 (89%)
- Eating and drinking beverage: 55 (89%)
- Drinking alcohol: 27 (44%)
- Changing clothes: 39 (63%)
- Putting on make-up: 14 (23%)
- Other: 18 (29%)

People may select more than one checkbox, so percentages may add up to more than 100%


7.19 Benchmarking

From the brainstorming sessions that the team conducted, there were many areas of interest that needed to be explored for benchmarking. Benchmarking is an exercise of exploring as many areas as possible that relate to the problem statement, such as technology research, predictions, etc. Along with needfinding, these two form the base of design. The areas for benchmarking included: trust, workspace, transition, steering, motion sickness, and confirmation cues. The team identified that the areas of interest could be separated into three categories:

1. Physical - Steering and transition
2. Psychological - Trust and confirmation cues
3. Cabin/Environment - Motion sickness and workspace

(Diagram: benchmarking areas - trust, confirmation cues, workspace, motion sickness, transition, steering)


7.19.1 Steering Benchmarking

The team investigated various methods that could be used as steering mechanisms for vehicles. The methods include: gestural, mind control, voice commands, and haptic feedback. Steering benchmarking was defined as a physical aspect of the design, and these steering mechanisms could possibly show a more intuitive and effective way to transition from autonomous to manual mode instead of using the traditional steering wheel.

Gestural Steering Benchmark

In order to understand what it would feel like to drive using gestural commands, the team tried out an Xbox Kinect dancing game to see if gestures would be an efficient means of control. (Figure 30: Gestural Steering Benchmark)

Observations

- There was a noticeable reaction time between the dancer and the person mimicking the moves of the dancer
- The same concept applies to semi-autonomous features, which provide a guide to follow in order to help maneuver
- Initial training and warm-up is needed to get acclimated to the system
- The system needs some built-in error allowance so the user has a better chance at mimicking the commands

Conclusions/Insights

There are several insights that came from benchmarking a gesture-based system. A gesture-based steering system can work if the reaction time between the gesture and response is minimized for best performance. There also would be a learning curve associated with this system. A training mode can be implemented as a guide-to-follow system to help users get familiar with the system and gain more confidence, but it needs to compensate if the user cannot mimic the actions precisely.

Mind Control Benchmark

One non-obvious steering mechanism that the team brainstormed was the use of mind control. Being able to steer using brainwaves and mind control is advanced, but has the possibility to eliminate any physical steering mechanisms completely. The team tested this idea by playing the game Mindflex. It reads the levels of concentration from sensors in the headsets and uses that to control the altitude of a ball, as well as whether it travels forward or backwards along a track. It is comparable to how an EEG machine operates and senses a patient's vital signs. (Figure 31: Mind Control Benchmark)

Observations

- Being able to isolate your concentration to particular tasks is not easy
- Only one task can be done at a time; it is hard to concentrate on two things at once
- Once at extremes of high or low concentration, it is hard to move the other way
- It is possible to train yourself to control it properly based on the feedback of concentration levels

Conclusions

Although the Mindflex game was a great way to test out mind control activities, it was concluded that it is a very ineffective steering mechanism, since it was hard to concentrate on performing even simple tasks (i.e. moving a ball up or down, forward or backwards). It would not be feasible to apply this method to complex tasks and steering within a vehicle, even though it is possible to train yourself to control and concentrate properly.


Voice Command Steering Benchmark

To evaluate how effectively a user can steer based on voice commands, the team set up a blindfolded driving experiment. The experiment was performed in an empty parking lot on Stanford campus, in which the passenger guided the driver with their voice. The team tested whether the drivers trust themselves to perform this task and trust the vehicle to provide correct information. This helped to illuminate what types of cues should be used in order to assist in a transition, or to allow more flexibility in the driver position so that it is not necessary to be looking out the windshield when driving. (Figure 32: Blindfolded Test)

Observations

- Hard to figure out if the steering wheel was straightened out. It was also difficult for the driver to drive in a straight path, as they could not tell when to make the slight adjustments that were necessary.
- Difficult to tell whether the car was even moving (since it moved at slow coasting speeds)
- It was very disorienting and made the driver feel slightly off balance or uneasy
- The driver wanted some type of orientation cues to understand why they were being told to do certain tasks
- The driver was very hesitant to go fast even when told to just go straight
- The driver did not trust his or her own reaction time to the commands
- When told to turn left, the drivers would not turn all the way to the left. Instead they would make smaller incremental operations while turning left.
- It became easier when the voice commands used qualifiers like "hard left" to indicate the driver should turn the wheel a lot
- Interpretation of the commands was very clear

Conclusions

Several insightful conclusions came from this benchmark. The voice-based steering functionality is not effective, since appropriate feedback and confirmation cues were not provided. This leads to trust issues and the driver feeling unsafe with the situation. Although the voice commands were clear and concise (left, right, hard left, hard right), it was hard for the driver to determine to what degree the command should be taken. A hard right or left is very ambiguous, especially when there are many factors associated with driving, including speed, obstacles in the road, and the sensitivity of steering for a particular car. The number of levels in a steering feature/mechanism is important to be able to make small or large changes. The user input in current mechanical steering design in cars is a continuously varying input, as compared to the discrete degrees of input in the voice commands based on the phrases used. The ideal number of degrees required to steer will fall somewhere in between. Another important conclusion is that different drivers would respond differently to such a system based on past behavioral experiences. This leads to the issue of past experiences affecting driver psychology.
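The gap between discrete voice commands and continuous wheel input can be pictured with a small sketch (ours, not the team's implementation; the level values and rate limit are arbitrary assumptions): each phrase maps to a normalized steering level, and a per-step rate limit keeps the actual steering value continuous between those discrete targets.

```python
# Illustrative sketch: mapping the discrete voice commands from the benchmark
# onto a normalized steering value in [-1, 1], with a rate limit so the wheel
# moves continuously toward each discrete target.

COMMAND_LEVELS = {
    "hard left": -1.0,
    "left": -0.5,
    "straight": 0.0,
    "right": 0.5,
    "hard right": 1.0,
}

def step_steering(current, command, max_step=0.2):
    """Move the steering value one time step toward the commanded level."""
    target = COMMAND_LEVELS[command]
    delta = target - current
    # Clamp the change so steering stays continuous between discrete commands.
    delta = max(-max_step, min(max_step, delta))
    return current + delta
```

Adding more phrases to `COMMAND_LEVELS` is exactly the "number of levels" question raised above: too few levels makes commands ambiguous, too many makes them hard to speak and interpret.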


Haptic Command Steering Benchmark

The haptic command steering benchmark was part of the same experiment and setup as the voice command steering benchmark. Since it was hard to determine what degree of steering should be mapped to the voice commands (left, right, hard left, hard right), the team hypothesized that haptic feedback should provide better commands in terms of interpretation. This benchmark only used haptic commands and not voice commands. Tapping on the right shoulder meant turn right, and tapping on the left shoulder meant turn left. To indicate driving straight, both shoulders were tapped. To indicate stop, both shoulders were pressed down on at the same time. By tapping faster on the shoulders while turning or going straight, the driver knew to what degree it should be done. (Figure 33)

Observations

- The reaction time was longer in terms of noting the difference in command, or when to stop doing one thing and start doing another
- The transition from one command to another was not distinct enough
- Harder to perform when there were a lot of tapping tasks happening one after another
- Signals could be interpreted in different ways by different drivers; for instance, the stop tap signal could also be interpreted as a rapidly accelerate signal
- It was uncomfortable and annoying after long periods of driving
- A lot of concentration and focus was needed to interpret/process the tapping task and then to respond to it
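As an illustration only (hypothetical parameters, not the team's setup), the tap protocol above can be written down as a small decoder: tap side selects the command, tap rate sets its strength, and pressing both shoulders means stop.

```python
# Hypothetical sketch of the shoulder-tap protocol from the benchmark.
# Tap side selects direction; tap rate (taps per second) scales how
# strongly the command is applied; pressing both shoulders means stop.

def decode_taps(left, right, taps_per_sec, pressed=False):
    """Return a (command, intensity) pair for one observation window."""
    if pressed and left and right:
        return ("stop", 1.0)
    intensity = min(taps_per_sec / 4.0, 1.0)  # faster tapping -> stronger input
    if left and right:
        return ("straight", intensity)
    if left:
        return ("turn_left", intensity)
    if right:
        return ("turn_right", intensity)
    return ("none", 0.0)
```

The ambiguity noted in the observations shows up directly here: "stop" and "straight" share the same two-shoulder channel and differ only in press vs. tap, which is exactly the kind of distinction drivers found hard to separate.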


Figure 33: Haptic Command Benchmark

Conclusions

Compared to the voice command steering benchmark, although the reaction time to interpret, process, and respond to the tapping tasks was longer, the mapping of steering degree to the haptic feedback seemed to be more effective than the voice commands. Haptic and voice commands are possible steering mechanisms, but more indicators to alert the driver prior to the actual command would allow the driver to anticipate the command (turning or driving straight). Visual feedback still seems to be the ultimate solution, since the driver can process visual data faster and see the environment before an event happens.

Segway Benchmarking

The team realized that in order for drivers to quickly take over the controls, the control mechanism must be very easy to learn. The team looked for an existing example of a generally perceived intuitive vehicle in order to isolate what is essential to make a steering method intuitive. A local entertainment center was contacted and a time was scheduled to test Segways. The Segway uses a unique steering method and thus could be one of the future steering directions. In order to go forward or backward, the driver needs to lean forward or backward accordingly. Pulling the handle in front of the driver to the left or right steers left and right. The steering handle also serves to help balance the driver during the ride. (Figure 34)


Observations

- It is surprisingly easy to learn to drive a Segway
- Steering is very logical and similar to body movements during walking
- The small size of the vehicle and the position of the driver (above the vehicle) helps in precise navigation
- Although the speed is not fast (20 km/h maximum), it feels like it is moving at faster speeds

Conclusions
In order to make the driving experience
easy to learn, the team might try to use body
movements as one of the steering methods.
Being able to see the vehicle from above is
really useful in precise driving. If steering
is logical and similar to already learned
motoric body functions, it does not take long
to learn (it took about 2 minutes to learn to
drive Segways).
7.19.2 Motion Sickness Benchmark
The purpose of this benchmark was
to determine whether motion sickness
was an issue if no visuals of the outside
environment were given. The team initially
believed that focusing on a fixed visual while
having moving visuals in their peripherals
caused motion sickness. By eliminating the
moving visuals from the peripherals, motion
sickness could possibly be reduced. This
experiment involved covering the windows
of a car so the passenger in the back seat
could not see out of the windows towards
the front of the car. The passenger sat in the
back and read, while being driven around
campus. (Figure 35)


Observations

- Within 5 minutes of reading in the car, the passenger felt motion sick, even people who never experienced motion sickness in the past
- Good ventilation and lighting may have been a factor: the garbage bags blocked off airflow to where the passenger was riding, and the lighting produced moving shadows
- It may have been due to the extreme driving that the tester was doing as well
- Very difficult to read because you could not anticipate turns or stopping

Figure 35: Motion Sickness Benchmark

Conclusions

Contrary to our initial beliefs, motion sickness was still an issue even with the surrounding environment visuals being blocked. Other factors within our experimental set-up could have induced the motion sickness problem. In this case, the remedy would probably be to have screens for everyone to see out from the front into the horizon. Some sort of visual and anticipatory feedback is necessary to avoid motion sickness while doing work in the car. Some options to explore would be whether the complete view is required or only the view from one side of the car is sufficient. The problem with this is that in metropolitan cities, having the view of buildings zooming past the side could make people more motion sick. Motion sickness is definitely a problem and can be considered an extension to the team's design vision, since the cabin space will be designed to accommodate users doing desired activities.

7.19.3 Human-Machine Transition Benchmarking

The driver transition between autonomous and driving mode is an important aspect and area of interest. To be able to determine the steps that should be taken prior to and during the transition, the team observed two different events that simulate the human-machine interaction transition.

CNC Machine Operation Observation

The team observed fellow students using CNC machines in the workshops to view a scenario in which humans transfer control to a machine. The team hoped to learn about what kind of procedures occur before the students transfer controls, what happens when something goes wrong, and how they transfer control back to themselves.


Observations

- Dry runs of the actual program were performed first, before the final cut, to ensure the program worked properly
- During the dry run, the speeds could be controlled to be at 25%, 50% and 100% of the actual cutting speed for visual inspection
- The operator always had a hand on the stop button
- When something went wrong and the operator did not know how to fix the problem, they had to wait for a TA
- The operator frequently double-checked that the correct drilling tool was inserted

Conclusions

The team observed that repeated visual inspections and routines were performed initially to make sure that the program was correct and doing what it was supposed to do. In the situation of autonomous cars breaking down, some kind of immediate service or real-time diagnostic analysis could be performed on the spot. A major finding was that dry runs with controlled speeds could be extrapolated as building gradual trust with the autonomous system. It can either be different percentages of manual control with a gradual shift to autonomous mode, thus building complete trust, or it can be a completely autonomous mode with different levels of maximum speed. Trust in the system and having confidence that it is working properly is a main concern, and gradually maintaining that trust is important. Even though this experiment was focused on the transition between human and machine, it opened up many questions in terms of developing trust. (Figure 36)
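The dry-run idea could be extrapolated roughly as follows (a hypothetical sketch, not a proposed product feature): the autonomous mode's speed cap advances through stages, like a CNC dry run at 25%, 50% and 100%, and only moves up after enough fault-free runs at the current stage.

```python
# Hypothetical sketch: an autonomous mode earns authority in stages, mirroring
# CNC dry runs at fractions of full speed. The stage fractions and the number
# of required clean runs are arbitrary assumptions for illustration.

STAGES = [0.25, 0.50, 1.00]  # fraction of full autonomous speed allowed

def next_stage(stage_idx, clean_runs, runs_required=3):
    """Advance to the next speed cap after `runs_required` fault-free runs."""
    if clean_runs >= runs_required and stage_idx < len(STAGES) - 1:
        return stage_idx + 1, 0  # move up one stage, reset the counter
    return stage_idx, clean_runs
```

The staged cap is one concrete reading of "completely autonomous mode with different levels of maximum speed" from the conclusion above.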


Racing Game Player-to-Player Transition

In order to understand what it would feel like to take over control of driving after being in autonomous mode, the team decided to simulate the experience using a racing video game. One player would drive the route in the game, and then quickly give control over to another player. (Figure 37: Racing Game Player-to-Player Transition)

Observations

- Easier to switch if the other player was observing what was going on
- If the next player was not paying attention, the chances of crashing increased dramatically

Conclusions
From this experience, it seemed obvious
that a gradual shift of control is probably
better so that the drivers are aware of
the surrounding environments and of the
vehicle's control actions. This way the driver
can align themselves or the controls in that
direction. An incremental rather than a direct
transition of controls would allow the driver
to get acclimated with what is happening
around them.
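One way to picture the incremental handover suggested above (our sketch; the linear blend and the 5-second ramp time are arbitrary assumptions) is a gradual shift of control authority from the autonomous controller to the driver:

```python
# Illustrative sketch of an "incremental rather than direct" handover: over a
# ramp time, steering authority shifts linearly from the autonomous
# controller's input to the human driver's input.

def blended_steering(auto_input, driver_input, t, ramp_time=5.0):
    """Blend inputs during a handover that started at t = 0 (seconds)."""
    w = min(max(t / ramp_time, 0.0), 1.0)  # driver's share of authority
    return (1.0 - w) * auto_input + w * driver_input
```

At t = 0 the car still steers itself, so an inattentive driver cannot instantly crash; by the end of the ramp the driver has full authority, having had the ramp time to get acclimated.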

7.19.4 Confirmation Cue Benchmarking

Confirmation cue benchmarking is essential in determining the psychological effects on a driver in an autonomous car. Being able to maximize the enjoyment of the riding experience is important, while also keeping the user aware of the surrounding environment and the vehicle's control actions.


Light Indicator Experiment

The team wanted to test and understand how much information would be needed for the passenger in an autonomous car to ride comfortably when all visual cues (i.e. seeing out a window) are gone. The experiment involved a passenger in the back seat of the car who could not see out any window or through the front of the car. A console was lit up to indicate specific actions of the car or surrounding environment prior to the actions occurring. Three versions of this experiment setup were tested (Figure 38):

1. Console lit up to indicate what direction is being traveled (i.e. turning left, right, driving straight)
2. Console lit up to indicate the speed at which the car is travelling (i.e. slow to fast)
3. Console lit up to indicate in advance when something out of the ordinary was going to happen (i.e. speed bump, sharp turn, sudden braking)

Observations

- Left/right and speed indicators were not really useful, because the passenger felt the motion. In this case, real-time indication is not useful.
- Speed bump, sharp turn and braking signs were effective because they gave advance warning and were not real time
- One downside of this light system was that the passenger had to be looking at it, which made it irrelevant when trying to do something else in the car
- When the light wasn't indicating anything, it made the passenger feel like the system wasn't working, or unsure of what was going on

Conclusions

Although the indicators seemed to be effective at times, user acceptability will depend a lot on the type and the amount of information relayed, and how intrusive it is when doing desired activities inside the car during autonomous mode. One possible further benchmark to explore would be to understand the different tasks being performed in the autonomous car and to find a non-intrusive and effective way of giving notifications to the user. Having anticipatory cues for out-of-the-ordinary situations will lead to reassurance that everything is working fine.

Figure 38: Console in the car that indicated left, straight, right (posts were replaced with speed bump, sharp turn, and braking indicators)
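The finding that only advance warnings of out-of-the-ordinary events were useful can be sketched as a simple cue filter (illustrative only; the event names and the 5-second lead time are our assumptions):

```python
# Illustrative sketch of the finding: cue only out-of-the-ordinary events,
# and cue them in advance rather than in real time.

ORDINARY = {"turn_left", "turn_right", "accelerate"}

def cues_to_give(upcoming_events, now, lead_time=5.0):
    """upcoming_events: list of (time_of_event, name). Return cues to announce."""
    cues = []
    for t_event, name in upcoming_events:
        if name in ORDINARY:
            continue  # the passenger feels these anyway; real-time cues add nothing
        if 0.0 < t_event - now <= lead_time:
            cues.append(name)  # announce the event shortly before it happens
    return cues
```

Filtering out the ordinary events is what keeps the system from becoming intrusive while the passenger is doing other activities.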


Voice Indicator Experiment

The team tested not only how visual confirmation cues notify a passenger about the surrounding environment and the vehicle's control actions, but also the use of voice indicators. The motivation was to see whether giving only audio cues could provide the passenger with enough information for them to feel secure. The experiment used the same setup as the light indicator experiment, but the console did not project anything. The rider in the front of the car would say what was about to happen, for example, "stop approaching."

Observations

- Voice is nice because it gives a more precise description of what is going on
- It feels more secure and reassuring
- One concern is whether it is too intrusive when music is being played, or in general

Conclusions

Compared to the light indicator experiment, the voice indicator seemed to be more effective. User acceptability will still depend on the type and the amount of information relayed and how intrusive it is when doing desired activities inside the car during autonomous mode. The interface with which the user interacts should maximize the enjoyment of the riding experience but still allow the user to be aware of what's happening around them.


7.19.5 Trust Benchmarking

There are various assistive driving technologies implemented in current high-tech cars that allow the driver to be in less control. By benchmarking and experiencing existing driver assist technologies, the team gained a better understanding of how and why drivers will be able to trust autonomous vehicles in the future.

Adaptive Cruise Control (ACC)

The ACC was tested within a Porsche Panamera to show what feedback is currently given to help ease the driver's concerns about giving up control of the pedals. The team was interested in finding out how the driver engages the feature and how to disengage or override it. It also helped the team to experience what it feels like to give up control in a way that the team is not accustomed to. (Figure 39: ACC Control, Panamera Dashboard)

Observations

- There was no feedback letting the driver know that the ACC was working
- No visual indicated how far away the car in front was, or that the car intended to stop
- It was easy to transition to using ACC by pushing in a lever and using a scroll wheel to decide what the desired following distance was
- The driver can override it at any time by pressing on the accelerator or brake
- The visual of whether a car was in front or not was helpful, as well as the following distance setting visual
- The location of the visual on the dashboard was very convenient and only slightly distracting
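The engage/override behavior observed above can be reduced to a toy control sketch (ours, certainly not Porsche's algorithm): hold the set speed, scale speed down as the gap falls below the selected following distance, and yield immediately to any pedal input.

```python
# Toy sketch of adaptive cruise control logic as observed in the benchmark:
# hold a set speed, slow down to keep the selected following distance, and
# relinquish control instantly on any pedal input. Units are arbitrary.

def acc_command(set_speed, own_speed, gap, desired_gap, pedal_pressed):
    """Return a target speed for the next control step, or None on override."""
    if pedal_pressed:
        return None  # driver override: ACC gives control back immediately
    if gap < desired_gap:
        # Proportionally reduce speed as the gap closes below the setting.
        return own_speed * max(gap / desired_gap, 0.0)
    return set_speed
```

The `None` return mirrors the observation that pressing either pedal overrides the system at any time, which is central to why drivers are willing to trust it.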


Conclusions

the sounds are effective in alerting the driver,

It is important to have feedback to know that

it can get annoying. The interface that the

the system is working and what it plans on

driver interacts with is a crucial part in the

doing to make the driver feel comfortable

riding and driving experience, but being able

enough to relax and not be too vigilant by

to trust the systems autonomous functions

watching over the car. If the user sees what

are important too

the car sees, then trust is built.


Night Vision

The night vision assistance in the Audi A6 and
A7 models is displayed on the dashboard of
the car. It is meant to help drivers at night
easily identify obstacles in their path that
should be avoided. When a pedestrian is
identified, they are highlighted with a red
outline to alert the driver that there is
something in the road ahead. (Figure 41)

Observations
The use of infrared cameras is very good.
The image is clear, and the alerts and
identification work relatively well.

Conclusions
Although the system works properly and
effectively, it is a little distracting, since the
display was bright at night and within your
peripheral view, which could draw your eyes
away from the actual road. It could possibly
be more helpful when driving in fog, snow,
rain, or other hazardous weather conditions,
whether at night or during the day.

Parking Assistant

The parking assistant in the Porsche
Panamera has another feature that gave
the user more control and awareness while
parking. The display had yellow lines to
show the car's projected path according to
the position of the steering wheel, and
sensors notify the driver how close the
car is to an object. A beeping noise sounded
when the car was very close to hitting
something. (Figure 40: Parking assistant
display while backing up)

Observations
This is very helpful in knowing if you are
turning the wheel properly, especially for
parallel parking.
The sensor beeping is slightly annoying.
It is very cautious, and drivers realize that
even though the visual is red, they can
still move further.
The display of the car and the colors
showing proximity to an object are very
intuitive and easy to understand.

Conclusions
The display of information on top of a
real-time image of the actual environment
was very effective for parking and driving in
reverse during low-speed activities.


Night Vision Display

Parking Assistant display while backing up

Night Vision

Lane Assistant

The team evaluated the lane assist feature
in the Audi A6 to determine how well it
functioned and what the experience was
with it engaged.

Observations
At first, the lane assist seemed a bit
scary, but became more comfortable in time.
The more the feature was used, the
more confidence the driver had in the system.
It felt strange that a simple lane change
on the highway without a signal light was
so difficult when lane assist was on.
It was unexpected when the lane assistant
turned itself off.
Drivers did not instantly trust the left and
right warning lights while changing lanes,
since changing lanes is an unconscious,
learned movement.

Conclusions
In order to trust the system, the user should
be well aware of how it works. Confirmation
cues from the system were really important,
since the driver has to acknowledge that the
assistant systems were really on in order to
trust them. To really be able to do something
else besides driving, the cabin should really
be designed for it (space and tools).


Night Vision Display

Camera

Audi A8 Equipment

Display

Camera


Audi S8 2013 Benchmarking


The team got the chance to experience the
most expensive camera system (Audi Pre
Sense Plus) that Audi has to offer. This gave
good insight into the current state of camera
technology in cars and how it is used
for modeling the surroundings of the car.
(Figure 42)
Observations

Nowadays cameras can be quite useful


for observing the surroundings

Cameras make it possible for the


computer to model the surroundings

Corner cameras help to see what is


happening around the corner

If a person is walking in front of the

moving car, the human shape will turn
red to warn the driver

The car tries to recognize human


shapes in the surroundings (it cannot
recognize animals yet)

Conclusions
In the future, computers will be able to
combine multiple sources of information
so that the car will have better awareness of
its surroundings than a human driver. The
future car will most likely be able to predict
and calculate many different scenarios of
what will happen while driving.


Golf Cart Prototype

7.20 Critical Prototypes


Critical prototypes are physical prototypes
used in ME310 to identify critical functions
and experiences. Therefore, critical
prototypes are divided into CFPs (Critical
Function Prototypes) and CEPs (Critical
Experience Prototypes). A CFP is a physical
prototype of a fundamental element of the
design, which is required to ensure its
functionality, and a CEP is a physical prototype
of an experience of the design, which is
required to ensure usability.

7.20.1 Critical Function Prototype Transition Golf Cart (Figure 43)


Driver comfort, safety and ease of transition
are important issues during transitioning
between autonomous driving and manual
driving for an autonomous car. It is important
to test the sequence of actions that would
take place and the kind of interface that the
drivers would interact with during the actual
transition.
The critical function being tested in this
prototype is the transition from autonomous
to manual. Situational awareness and being
in line with the actions being applied by
the autonomous controller are important
for a smooth and safe transition. That is
why the team implemented the concept of


guided matching of controls (steering and
throttle) during and after the transition. This
was implemented by setting up a golf cart
with a transition switch and an interface in
front of the users which prompts them to
mimic the actions of the autonomous
controller. The hypothesis was that such a
guided matching interface would lead to
increased situational awareness of the user,
leading to a comfortable transition.
Autonomous mode was faked using a
passenger-seat driver. The actions of the
autonomous controller were displayed based
on pre-recorded data on the same track for
driving. The details of the setup can be found
in the appendices.

With the Transition Golf Cart we tried to
answer the following questions:
1. Do people feel safe and comfortable
making this transition using the prototype?
2. Does the setup increase situational
awareness for the driver?
3. Are people good at matching?
4. Is the interface intuitive?
5. What is the appropriate amount of
time required to make this transition safely?
6. Is visual representation a good way
of conveying information to the driver
during driving?

Display

Golf Cart Prototype Driving

Switch


The two types of transition sequences


that were tested using this prototype were
(Figure 45):
1) Gradual Transition
The user is given control of the pedals first
and a matching task is initiated. Once a
certain degree of accuracy is achieved in
matching the autonomous control actions,
the user is given the control of the steering
as well and a similar matching task is
initiated. After matching the steering, the
car goes into completely manual mode.
2) Direct Transition
The user is given control of both the control
inputs at the same time and a simultaneous
matching task is initiated. Once the user is
comfortable in matching, the interface goes
away and the car goes into completely
manual mode.
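The difference between the two sequences can be summarized as two small state machines. The following sketch is purely illustrative: the mode names and step functions are our own labels for this description, not code from the actual prototype.

```cpp
#include <cassert>

// Illustrative sketch of the two transition sequences as state machines.
// Names are hypothetical, not taken from the prototype code.
enum class Mode { Autonomous, MatchPedals, MatchSteering, MatchBoth, Manual };

// Gradual transition: match pedals first, then steering, then manual mode.
Mode stepGradual(Mode m, bool matched) {
    if (!matched) return m;  // stay in the current step until accuracy is reached
    switch (m) {
        case Mode::Autonomous:    return Mode::MatchPedals;
        case Mode::MatchPedals:   return Mode::MatchSteering;
        case Mode::MatchSteering: return Mode::Manual;
        default:                  return m;
    }
}

// Direct transition: match both inputs simultaneously, then manual mode.
Mode stepDirect(Mode m, bool matched) {
    if (!matched) return m;
    switch (m) {
        case Mode::Autonomous: return Mode::MatchBoth;
        case Mode::MatchBoth:  return Mode::Manual;
        default:               return m;
    }
}
```

The only structural difference is the number of matching steps before manual mode, which is what the prototype compared.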

Gradual and Direct Transition Transfer Sequences (figure: Autonomous Mode, Transition Initiated, Match, Manual Mode; the gradual sequence has two successive Match steps, the direct sequence one)

Final Documentation

273

Gradual and Direct matching interfaces (figure: Step 1 Pedal Matching and Step 2 Steering Matching for the gradual sequence; Simultaneous Matching with target regions for the direct sequence)

Observations
Users did not feel like they were driving and
felt unsafe during both transition sequences.
People are more engaged in the matching
task than in driving, which is supposed to be
the primary task in manual mode.
In the gradual transition, once the users are
given both the steering and pedals in step 2,
they neglect the pedals completely during
steering matching.
The matching error for the entire transition
sequence is lower for the direct transition.
People are better at doing simultaneous
matching than matching the control inputs
one at a time.
Users felt safer and more comfortable in the
direct transition.
In the gradual transition, the users had a
tendency to grab the steering wheel as soon
as they switched to manual.

Conclusions
The interactive interface was too distracting
during driving and needs to blend with the
environment more easily. Visual
representation of information is too intrusive
during driving.
The lower matching error in the direct
sequence might be because of the lower
transition time; the longer process involved
in the gradual transition delays complete
situational awareness.
The reason for grabbing the steering wheel
might be that having a feeling of controlling
the more sensitive input, which is steering,
makes the users feel safe.

Golf Cart Prototype Observations and Conclusions


The results were surprising because people
were better at and more comfortable with
performing multiple tasks at the same time
than doing one task at a time. The reason
for this was based on how information is
processed from various sources. As can be
seen in the flowcharts in Figure 46, there are
two sources of information involved: the
interface and the environment.

In the gradual transition, as shown in the
flowchart, it was expected that the users
would control the steering based on the
inputs from both sources and that they would
control the pedals based on feedback from
the environment. However, as observed
before, the visual interface was too distracting
and the pedal control was essentially an open
loop. This was confirmed by the fact that the
car came to a complete stop at times during
steering matching, which meant that pedal
control was being completely ignored. On the
other hand, in the direct transition, the source
of information for both the control inputs was
the interface, and the environment was
completely ignored. But the users' performance
was still better because there was just one
source of information. Performance is not
based on the number of tasks being performed
at the same time but on the coherence between
the sources of information. As shown in the
flowcharts, the ideal solution would lie where
the source of information overlaps between the
interface and the environment. Thus in the ideal
solution, the interface blends in with the
environment perfectly, so that the goal of
maintaining situational awareness while making
the users feel safe and comfortable is achieved.

Transition Sequence Flowchart (figure: expected and observed information sources, Interface and Environment, for the gradual, direct, and ideal transitions)

7.20.2 Steering Mechanisms

Steering and pedals have traditionally been
the primary control inputs in automotive
design. When designing the cabin space
and transition interface for an autonomous
car of the future, it might be desirable to
explore other designs for control inputs that
will make driving more fun and be easier and
more comfortable to transition to from any
other activity that is being performed in the
car space during autonomous mode. Having
a new control input might also open up new
options for cabin space design. This new
control input needs to be intuitive to use,
and the users should be able to clearly and
comfortably convey their intentions to the
car controller.

To test this idea, the team hooked up
different steering controllers to car gaming
interfaces, and users tried to steer the car
with these non-conventional control inputs. A
pre-interview was conducted to understand
the driving habits and experiences of the
user. After the user tried out each of the
prototypes, a post-test survey was conducted
which gathered information on the comfort
level of that particular method of steering
and its intuitiveness. The different types of
steering control inputs that were tested in
this prototype are:

Small hand controller

Steering with feet

Total joystick control

Joystick and control buttons mixed

Tilting feet to steer


Based on the motivation for this CFP and
some initial ideas, the team came up with
certain requirements that need to be satisfied
by a good control input design. Metrics for
testing these requirements were identified
based on the car gaming interface setup for
this prototype.

Small hand controller

We used a normal Playstation joystick with a
small steering stick which is used with thumb
movements. After initial testing with users,
we quickly realized that the tested steering
method was not satisfying our needs, and we
quickly discarded it. Driving was way too
imprecise using this method. (Figure 47)

Small Hand Controller

Steering with feet

We wanted to test if steering with feet could
be a more intuitive alternative to conventional
steering methods. We purchased a steering
wheel game controller with pedals and set
up the game so that turning left and right is
done by pressing the left and right pedals,
and acceleration and braking are done by
pressing buttons on the steering wheel.
Turning the wheel doesn't affect the game
whatsoever. (Figure 48)

Observations
Turning was really difficult. Users didn't
know how hard they should press the pedals.
Feet do not respond as quickly as hands
in such a dynamic situation.
The throttle button was binary and it
created some problems for the users. Many
of them wished to control the speed better.
Having the steering wheel in front of the
users without it having any function was
confusing. (The setup had a steering wheel
on the table where one button was
acceleration and turning was done by feet
by pressing the left and right pedals.)


Steering with Feet

One person experienced fatigue in this
exercise.
One person replied that it would have
been better if she could see her feet.
People often made mistakes and steered
to the wrong side.
People who have never driven before said
that they would like to have someone
controlling the gas and brake for them,
whereas people who know how to drive
would like to take more responsibility and
control the speed by themselves.
Steering with feet is actually quite hard
to adapt to and has a steep learning
curve associated with it.

Steering with joystick


The team wanted to test if steering with a
joystick would be a more intuitive alternative
to conventional steering methods. (Figure
49) A conventional joystick was purchased
and set up so that acceleration and braking
were done by pushing and pulling the
joystick, and turning was done by moving
the joystick left and right. In the second test,
turning was still done in the same manner,
but acceleration was changed to be the front
button on the joystick. (Figure - User test
with joystick / Button on the joystick)


Steering with Joystick

Observations

1. Control setting: four directions (push forward
to accelerate, move left/right to turn, pull back
to brake) (Figure 50)

Sometimes users didn't know if the car was
turning or accelerating. It was hard to isolate
only hand movement to the side without
moving it forward and backwards as well,
and moving forward and backwards is
accelerating and braking.
The joystick was not consistent in response.
When it was pushed lightly, the vehicle in the
game didn't react, and if the joystick was
pressed hard, the car moved too much.
(Maybe there was a problem with this specific
joystick or this game.) That circumstance
created a lot of trouble for users.
It was confusing for the user that when he
wants to brake he has to pull back on the
joystick, but the car in the game is still going
forward for some time.
Like in the previous testing, it was difficult
to keep the car on the track going forward
after turning. It was really hard to keep the
car going straight after turning; small
directional adjustments are really hard.
One person suggested having a joystick that
moves left and right only, just to have a
left-right tilting motion.
The hand hurts if the joystick is not placed
properly. Also it is painful to keep it pushed
forward all the time; the user needed to keep
it pressed in order to have constant speed.
Steering in this way was easy to do with
one hand.
It was hard to keep the same speed in turns.
When the user moves the joystick left and
right, he/she accidentally disrupts the speed
as well.
Some of the users brought the joystick
closer to their bodies.

1. Control setting (figure: Accelerate, Left, Right, Brake)

2. Control setting: joystick left/right to control
turning, buttons to control gas & brake

Some users were performing better when
they had gas and brake separated from
turning movements. One person improved
by 20 seconds per lap after this adjustment.
(Often they were still pushing the joystick
forward even if that had no effect on the
game.)
One user suggested that it would be better
to have a speed limit and try driving like that.
The braking button was not analog, so it was
always braking maximally instead of slowing
down the car.
Going backwards with a separate button is
confusing, and it takes time to realize what
to do and where the button for reverse is.
When users were able to see the whole car,
they almost didn't crash at all. However, in
the top view it is hard to see how much the
car turns, because the user can't see the
wheels in front of the car.

2. Control setting (figure: Accelerate, Left, Right, Brake)


Tilting Feet
The team wanted to simulate Segway-like
steering for this prototype. In this tilting-feet
setup the car was steered by tilting the feet
to simulate the intuitiveness of a Segway.
(Figure 51)

Observations
Accelerating by tilting forward while steering
at the same time proved to be almost
impossible.
The joystick position on the floor was almost
impossible to arrange in a way that is
comfortable to steer with.
It took too much effort to tilt the joystick to
its full extent with just feet.

Tilting Feet Controller


Conclusions / Lessons Learned from CFP

The assumed purpose of this prototype
proved to be quite irrelevant. It was
discovered that none of the non-conventional
steering options are that precise, and all of
them have a steep learning curve associated
with them. However, the most interesting
insight from this prototype was that manual
driving is redefined in the future with cars
being semi-autonomous and crash-proof. In
such a situation, it might even be desirable
to have a non-precise controller that gives
the user an illusion of precise control while
making driving more fun.

Another insight is that the listed requirements
from the CFP should not be applied in our
context, since we realized that intuitiveness
is not really an issue. These steering methods
can also be tested from the perspective of
fun in the future.

Opportunities

Steering, accelerating, and braking is
entertainment that cannot be done the same
way anywhere else. Simulators and games
are always missing some parts of the holistic
experience.

Intuitiveness is not the issue, since safe
driving will be enhanced with active safety
systems that the autonomous car and its
infrastructure can provide.

It might also be safe to assume that passive
safety systems (seat-belts, airbags, etc.) can
be removed and the freedom of cabin design
is increased.

Almost all of the cars in the future will be
part of the autonomous car system, since if
one person is allowed to drive outside the
system and make unpredictable moves, the
rest of the drivers will be in danger. This is
why intuitiveness is not really an issue from
the perspective of learning. After testing this
prototype we will approach steering from
another perspective: the fun perspective.
Autonomous technologies make totally safe
driving possible, and driving in a
semi-autonomous mode lets the driver push
the boundaries knowing there are no dangers
in driving because the car takes care of
safety. However, the driver should not feel
like the car is applying a lot of corrective
action even in manual mode. That needs to
be avoided, since the team aims at
maintaining the pleasure of driving through
the proposed design.


7.20.3 Critical Experience Prototype -
Reconfigurable Workspace

Workspace prototyping is an important
aspect of the design. Since the driver will be
in autonomous mode a majority of the time,
being able to do the driver's desired activities
within the cabin space of the vehicle is ideal.
The driver not only needs to be able to do
their activities in the space, but it needs to be
a comfortable and more intimate,
personalized experience.

To test this experience, the team made a
quick prototype of the envisioned cabin space
of a car. The environment was set up to have
a mock cabin space with an interactive
windshield display and a movable chair for
repositioning the user during autonomous
mode to a location that has been adapted for
performing some of the desired activities like
working and relaxing. The prototype is shown
in Figure 52. The details about how this
prototype was tested have been included in
the appendices.

Observations
The setup of the moving chair transitioning
to the cabin activity space table was an
interesting feeling; it is not possible to do
that in current cars.
The position of the person when moving
forward or backwards in transition is
awkward, especially with the feet dangling.
It was observed that the chair speed affects
the level of comfort users have with this
setup. It has to be fast enough for a quick
transition but slow enough to avoid
discomfort.
Users noted that the experience was not
like being confined in a typical car.
It is desirable to have more room and space
to do various activities.
Timing may be a problem when trying to
switch in transition suddenly, especially
when the user wants to take control as soon
as possible.
It would be desirable to have an interface
that displays various configurations of the
cabin space in an intuitive way for the user
to interact with.

Conclusions / Lessons Learned
One interesting concept that came out of
testing this prototype was that it might be
good to have the steering apparatus attached
to the moving chair so that you can control
the car from any location and orientation
within the car. Besides having a switch, there
can be other ways of conveying the user's
intention to the controller, like voice-based
and gesture-based commands.

Interactive
Windshield

Movable
Chair

Cabin Activity
Space

Reconfigurable Workspace


7.20.4 Critical Experience Prototype -
Mobile Workspace

The team wanted to explore the experience
of how it actually feels to work in a moving
vehicle and whether it is even possible to
do that comfortably. Our assumption is that
some of the time spent in an autonomous
car will be used for working. If working is
possible in a simple setup, it will certainly be
possible in a space designed for that. The
cabin space could also be used for
entertainment. The aim of this prototype was
to create a situation where a passenger in
the future car can do some physical activity
and maybe have fun during the drive. This
was implemented by trying to play with a
Nintendo Wii, because it requires physical
movement. Different physical activities were
simulated in the car by having different video
games on board.

Test Setup

The team rented a van in order to test the
possibility of having an office and working
in a moving vehicle. Within the van two
different scenarios were created (Figure 53):

An office scenario with a table, chairs and a
whiteboard in the back of the van, where a
meeting was conducted while the car was
moving slowly to simulate riding in an
autonomous vehicle.

An entertainment scenario that included a
gaming console (Nintendo Wii), a TV and a
fatboy bag inside the van.

Results

Nobody was motion sick, although it was
not possible to look outside.
It was easy to concentrate in the van
because of the lack of distraction.
It was hard to write on the whiteboard
while moving.
Some of the users were completely relaxed
during driving, but others were constantly
aware of the environment and tried to follow
the driver through the small window between
the front and back of the van. It might be
possible that confidence comes with a
familiar driver.

Conclusions / Lessons Learned

It is possible to work in the car while moving.
The team was not so sure after the car tests
made earlier, but when tested out with this
van moving slowly, it was really comfortable
to work there. The team needs to explore
further the reasons behind motion sickness
and what visual or other cues need to be
provided to prevent it.

Mobile workspace - Office Scenario (top), Entertainment Scenario (bottom)


7.21 Design Specifications

As described in Section 4.5.6, one of our
main CFPs was the one that tested out the
transition sequence by simulating
autonomous mode on a golf cart and
prompting the user to mimic the actions of
the autonomous controller for situational
awareness. This section lists the design
specifications and a detailed description of
this prototype.

7.21.1 Transition sequence prototype

The main objective of this prototype was to
test the comfort level and intuitiveness of a
gradual or direct transition from autonomous
to manual mode. As described in Section
4.8.1, both transition sequences consisted of
real-time data being displayed to the users,
along with pre-recorded data that they were
supposed to follow to make the transition
easier. This plan required real-time
information about the steering and pedals
being displayed to the user. The following
sections give details about the electronics
and the coding strategy used. The entire
code can be found in Appendix 7.6.2.

Mechanical Setup

As shown in Figure 54, the mechanical
design for this prototype consisted of
designing a means of indicating the desire to
switch to manual from autonomous by using
a switch. This switch needs to be placed in
a comfortable and accessible position for
the user. In this prototype the switch was
mounted on the cover that was designed to
protect all the electronic circuits that were
mounted on the steering wheel of the golf
cart. The proposed design for this prototype
also required an interface right in front of the
users while driving to prompt them to mimic

Transition Switch

Display Construction


the autonomous controller actions. A 7-inch
tablet was used for this purpose, as shown in
Figure 55. To create a mount for this tablet,
the team bent a sheet of acrylic to make a
holder for the tablet and added mounting
clips for securing it to the golf cart
windshield right in front of the user.
The way these two designed components
were mounted on the golf cart is shown in
Figure 56. It should be noted that the laptop
was just for real-time serial communication
with the sensors and was kept on the
passenger side of the cart in a way that did
not interfere with any of the user's actions
during testing.
Golf Cart Construction
Golf Cart Construction


As shown in Figure 57, the electronics
were mounted on the steering wheel
beneath the designed component for
protection from damage during testing. This
also turned out to be a useful shield from
rain during testing. The mechanical design
allows the flexibility of flipping open the
cover and working on the electronics if the
designer wants to change something on the
breadboard or check for loose connections.

Switch

Switch Construction


Description of the electronics

An Arduino Duemilanove microcontroller was
used for receiving data from the sensors and
communicating with the laptop. A software
platform called iDisplay on Android was used
to extend the screen of the laptop to the
tablet for displaying the interface, which was
coded in Visual C# and which responded to
the data received through serial
communication with the Arduino board.
There were two sensors used for getting the
steering and pedal data respectively. There
is an internal potentiometer mounted on the
throttle pedal of the golf cart to give the
throttle signal to its internal controller. An
external tap was created on the voltage
reading of this potentiometer and the value
was fed to the Arduino ADC. It should be
noted that only one pedal (throttle) was used
for this entire experiment and the brake
pedal was not used. The analog input sample
circuit has been shown in Figure 58.

Figure 58: Arduino Board
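As an illustration, reading the throttle tap reduces to scaling a raw 10-bit ADC value into a throttle percentage. The sketch below is a hypothetical example: the idle and full-press raw values are made up, and on the real cart they would be calibrated against the potentiometer tap (on the Duemilanove, `analogRead` returns 0-1023 for 0-5 V).

```cpp
#include <cassert>

// Convert a raw 10-bit ADC reading (0-1023, as returned by analogRead on
// an Arduino Duemilanove) into a throttle percentage. rawIdle and rawFull
// are illustrative calibration values, not measurements from the prototype.
int throttlePercent(int raw, int rawIdle = 200, int rawFull = 900) {
    if (raw <= rawIdle) return 0;    // pedal released (or below idle voltage)
    if (raw >= rawFull) return 100;  // pedal fully pressed
    return (raw - rawIdle) * 100 / (rawFull - rawIdle);
}
```

On the Arduino itself this function would simply wrap `analogRead(pin)`; keeping the scaling separate makes the calibration easy to adjust.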

The steering angle was detected using a
maximum acceleration vector detection
algorithm applied to the data received from
the accelerometer ADXL345. The
connections for this module have been
shown in Figure 42 above. When the system
is reset, it takes 5 seconds and notes down
the stationary initial net vector (which is
mainly just gravity). Once the algorithm
starts, it continuously calculates the angle of
the current net vector with this initial vector
and converts it to the steering angle. There
are two ways in which we will get inaccurate
readings from this method:

Acceleration of the car itself and sudden
bumps will make the net acceleration
vector jump around

Rotational acceleration while turning the
steering wheel will also cause problems

However, this was a basic prototype to show
proof-of-concept and a rough steering angle
calculation is fine for this purpose. It was
actually working pretty well considering that
the golf cart was not accelerating too much
on the track and there were no bumps. Also,
the accelerometer was mounted as close to
the center of the steering wheel as possible,
so there was little or no effect of rotational
acceleration on the Cartesian acceleration
vector that was being measured by the unit.
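The core of this method is the angle between the current net acceleration vector and the stored initial (gravity) vector. A minimal sketch of that step is shown below; the conversion from this raw angle to a calibrated steering angle, and the ADXL345 readout itself, are omitted, and the function name is our own.

```cpp
#include <cassert>
#include <cmath>

// Angle in degrees between the current net acceleration vector (ax, ay, az)
// and the initial stationary vector (gx, gy, gz) recorded at reset, which is
// mainly gravity. This assumes the cart is not accelerating, so the net
// vector rotates with the steering wheel.
double vectorAngleDeg(double ax, double ay, double az,
                      double gx, double gy, double gz) {
    const double pi = std::acos(-1.0);
    double dot  = ax * gx + ay * gy + az * gz;
    double magA = std::sqrt(ax * ax + ay * ay + az * az);
    double magG = std::sqrt(gx * gx + gy * gy + gz * gz);
    double c = dot / (magA * magG);
    if (c > 1.0)  c = 1.0;   // clamp against floating-point overshoot
    if (c < -1.0) c = -1.0;
    return std::acos(c) * 180.0 / pi;
}
```

Both failure modes noted above show up directly in this formula: any extra acceleration added to (ax, ay, az) changes the computed angle even if the wheel has not moved.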


Description of the code

Figure 59 outlines the flowchart used for the
code. In the recording phase, the car is run
once on the pre-decided track and the
steering and pedal data are recorded as a
reference. This recorded data is treated as
the ideal actions that the autonomous
controller will apply and which the user is
expected to follow while trying to retrace the
same path and transitioning into manual
mode.

The code is continuously communicating
with the microcontroller to get real-time
data, but it is not displayed until the
switch-to-manual command is received.
When the user flips the switch, the display
comes live and the transition sequence is
initiated. The accuracy of matching is
checked by incrementing a loop counter
inside the code and showing the score in a
progress bar on the right of the interface. If
the user goes outside the desired region, the
counter is reset to zero. Once the user
achieves the desired level of accuracy, the
interface switches off and the car goes into
completely manual mode.

Figure 59: Code flowchart (select mode; in recording mode, write data to file; otherwise wait for the switch, and once the switch-to-manual command is received, display the recorded and real-time data, incrementing the loop counter while within the threshold and resetting it when outside the threshold, until the accuracy is achieved, the display is removed, and the sequence stops)
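The counter logic described above can be sketched as follows. The names and the threshold/target values are illustrative; this is not the actual Visual C# interface code from Appendix 7.6.2.

```cpp
#include <cassert>

// Sketch of the matching-accuracy check: a counter accumulates while the
// user's input stays within a threshold of the recorded reference, resets
// to zero when it leaves the desired region, and the transition completes
// once the counter reaches a target count.
struct MatchTracker {
    int counter = 0;     // the "score" shown in the progress bar
    int target;          // how long the match must be held
    double threshold;    // half-width of the desired region
    MatchTracker(int target, double threshold)
        : target(target), threshold(threshold) {}

    // Call once per loop iteration; returns true when accuracy is achieved.
    bool update(double userInput, double reference) {
        double err = userInput - reference;
        if (err < 0) err = -err;
        if (err <= threshold) ++counter;  // within the desired region
        else counter = 0;                 // outside: reset the score
        return counter >= target;
    }
};
```

One tracker per control input (or a single tracker over a combined error) would cover both the gradual and direct sequences.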


7.21.2 Different Steering Controls

Another CFP was made to find out the most
intuitive steering method out of many. This
section lists the design specifications and a
detailed description of this prototype.

Joysteer prototype

The reasoning behind this prototype was to
discover the most intuitive way of steering,
since we assumed that drivers will drive
much worse in 2035 because of autonomous
features. The prototype was realized by
building a test setup of a video game and
different controllers to steer a car in the
game. The user testers were given one
controller and one track to finish. The
following section illustrates what kind of
test setup was built. (Figure 60; 61)

Basic Setup

Basic Setup - TV, Audi A4 car chair, Playstation 3, Colin Mcrae Dirt for PS3


Test Setup:
- Small joystick control (Figure 62)
- Large joystick control: Saitek Aviator PC (Figure 63)
- Thrustmaster Experience Racing Wheel with pedals (Figure 64)
- Sponge (Figure 65)
- Duct tape (Figure 66)
- Reboard

(Figures 62–66: small joystick control, large joystick control, racing wheel, sponge, duct tape)

294

Joystick as a tilting controller setup: sponge and reboard (figure)


Joystick as a tilting controller
For using the joystick as a tilting controller, a sponge was cut and paired with the large joystick. The tester puts their feet on top of a reboard and tilts their feet to steer and manage speed. (Figure 66: Joystick as a tilting controller setup)

Controllers (steering / gas and brake):
- Small joystick: analog control / buttons
- Large joystick: analog control / analog up & down
- Large joystick: analog control / buttons
- Pedals: left & right pedal control / buttons
- Feet tilting: analog control / analog up & down


Fall quarter was primarily used to gain more knowledge about the task we were given. We used the time to benchmark related industries and tested prototypes mainly based on our assumptions. During fall quarter we tested different ways of steering and discovered that the steering wheel was still the best way to steer.

At the beginning of the winter period we already had an understanding of the problem, our future assumptions, the user, and the technology that might be available in the future. We had a meeting with our corporate partners in January, and by then we had the entire picture of what we were expected to deliver and what our final goal might be.

We also tested different activities in the car and realized that the size of the vehicle was the only boundary we would face when it came to utilizing the time in the car. Space in the car was limited, and making the interior of the car suited for different activities was our biggest challenge.

In addition, we tested the transition between autonomous and manual driving. In order for the driver to be confident in the car's ability to steer itself, and to know that the transition of control has been made, there has to be a whole sequence of cues and reassurances. Because of its complexity, after the winter period the transition between autonomous and manual driving still remained a problem that we needed to research more.

7.22 Needfinding
7.22.1 Trip to Germany
After Christmas break, the team got an opportunity to visit Germany and have the first meeting with our corporate partners from Audi, which was the main purpose of this trip. Since we were already traveling to Munich, we also decided to drop in on the BMW Museum and BMW Welt, as well as the Audi Museum. In addition, we got an opportunity to visit the University of Munich and see their automobile laboratory and jet engine laboratory. The trip itself was genuinely hectic but also highly educational. The travel plan can be found in the Appendix.

List of our targets in chronological order:
1. BMW Welt
2. BMW Museum
3. Audi Museum (Figure 68)
4. 1st meeting with Audi contacts Lorenz Bohrer and Tilo Koch
5. Audi factory tour
6. 2nd meeting with Audi contacts Lorenz Bohrer and Ulrich Mueller
7. University of Munich, vehicle laboratory
8. University of Munich, jet engine laboratory

Key notes
- OLED technology is going to spread in car interior design
- We should narrow down our scope
- Audi's formula for successful products is sophistication, sportiness and progressiveness
- 1/3 of people get simulation sickness (same as motion sickness) because of the mismatch between the senses and the brain's predictions -> interesting statistics for us
- Audi is researching steer-by-wire (SbW) technology and it will most likely be introduced in future cars

More notes and some explanatory pictures can be found in the Appendix.

Conclusions / Lessons Learned
The trip taught the team about the long history of the German car industry and its significant focus on detailed design as well as high-end engineering. Audi itself is really passionate about finalized products that are high quality and well designed; they constantly emphasized these matters and wanted to be sure that the tangible prototype we provide for EXPE will meet their requests.


7.22.2 Geneva Trip
The Geneva car show (Figure 69), one of the biggest car shows in the world, annually attracts the most famous car manufacturers to present brand new car models and concepts; many of them have their world premiere there. It was a unique opportunity for us to see what the current and future trends in the car industry were, to speak with professionals from the field, to observe the design of the cars, and to locate visual elements that make a car look futuristic.

Geneva Car Show (figure)

Observations
- Electric cars are a reality. Every major car manufacturer already provides an electric vehicle.
- Car manufacturers are constantly expanding the number of models they offer. Many manufacturers already have small vehicles (the size of the Toyota iQ, Smart Fortwo, Aston Martin Cygnet...) or offer slightly bigger versions for four passengers (Fiat 500, Opel Adam, Citroen DS1...), but there are also many vehicles between already established classes, like small SUVs, expensive sport-car hatchbacks (Ferrari FF, Aston Martin Rapide...) or even cars between motorcycles and passenger vehicles (Renault Twizy, Toyota i-Road...).
- The current trend in the visual appearance of cars is to have lines which break the surfaces in a not-so-logical manner. Lines are not parallel to the ground any more, but flow freely on the surface of the car.
- LED lights and glowing lines in headlights are common.
- Even small, compact vehicles tend to look aggressive.


Conclusions / Lessons Learned
- A combination of colored LED lights (usually light blue), transparent materials, and a playful mixture of different lines and patterns on the chassis makes a car look more dynamic and futuristic.
- There are not many changes in car interiors. Mainly, the changes are related to introducing new technology, like touch screens.
- It is possible that in the future there will be more vehicles for one or two persons.

7.22.3 Dashboard Questionnaire
With the dashboard questionnaire we wanted to explore how aware users actually are of the different indicators on a dashboard. The rationale behind this was to see what information is actually valuable for the driver to have when using a car. We had two different kinds of questionnaires (Figure 70 and Figure 71). In questionnaire 1 we asked drivers to identify what each logo on the dashboard was for. In questionnaire 2 we asked drivers to arrange the meters in order, based on their opinions, from the most useful to the least useful.

Dashboard questionnaires (figure)

Results
- Half of the logos in questionnaire 1 were not known or were wrongly interpreted by the 12 testers.
- Indicators like engine oil temperature, ABS, cruise control, airbag, check engine light and battery level were not always needed.
- Motor temperature, oil temperature, and battery life were considered the least important indicators in questionnaire 2 by the 16 testers.

Conclusions / Lessons Learned
Compared with dashboards from 20 years ago, today's dashboards are simpler and show less information. We may get rid of the dashboard and only show the most important indicators on an augmented reality windshield. Drivers can choose the customized information that they want to see. Some information, like fuel condition and turning lights, is only shown when needed. This way we get more cabin space, and this area may be utilized as working space.


(Figure 72: Link & Go car concept — manual/autonomous mode changing, brake buttons, acceleration, forward/reverse and U-turn mode, rotatable chair)

7.23 Benchmarking
7.23.1 Akka Car Concept
Among the other cars at the Geneva car show, there was one car which focuses on the same issue as we do. It was a working prototype of an autonomous car, done by Akka Technologies in collaboration with other companies that provide the software. The concept is called Link & Go (Figure 72) and it offers both a manual and an autonomous driving experience.

Observations

Outlook of the car:
- The car doesn't look sporty or aggressive.
- It is slightly taller than normal cars. It provides more comfort and a sense of freedom.
- The whole car is larger and has more space in the interior.
- The back seat looks like a piece of furniture, and the whole car looks more room-like.
- The front seats are attached to each other and are rotatable 360 degrees. The driver can turn the front seat entirely around and socialize with people in the back.
- During the rotation of the seats, there is not enough place for legs.
- It is mainly a functional vehicle, made for convenience and not for pleasure.

Mode changing:
- Changing between modes is done before the drive; there is no option for changing modes on the go. There is a display attached to the front seats where the user can set up the mode.
- When in driving mode, the driver needs to manually pull the steering wheel closer in order to drive.
- When driving, the seat can be locked.
- The driver's seat is really not ergonomically designed for comfortable driving, but for multifunctionality and changing of the modes.

Steering:
- The steering wheel is always visible, but it doesn't spin when the car is in autonomous mode.
- The steering wheel moves closer to the dashboard when not in use.
- The acceleration and brake pedals are removed. Acceleration is done by pulling handles behind the wheel, and braking is done by pressing buttons with the thumbs.

Performance:
- The car is designed for city use. It has a maximum speed of 50 km/h in autonomous mode and 120 km/h when driven manually.
- The car is a working prototype. It has sensors in the front and a big sensor on the roof of the car.
- The car is meant for the year 2025 because of legal issues.


7.24 Dark Horse Prototypes

A dark horse prototype is the term we use to describe a prototype which reflects our most radical and risky ideas. It represents a leap of faith: something that might work and could perhaps be implemented in the final concept, or become the concept itself. In this exercise we were encouraged to think outside the box, and this was our last chance to experiment and test unconventional ideas before focusing on what would eventually become our final concept.
Because of the lack of time caused by our trip to Germany, we had limited time to develop a highly functional prototype; we basically had only one more sophisticated prototype. In addition, we decided to make a few rapid prototypes of our ideas and present them to users for feedback and further inspiration. In the following paragraphs we explain these prototypes briefly.


7.24.1 Dark Horse Prototype - Reconfiguro

The dark horse prototype tested whether the users of autonomous vehicles would value the concept of a reconfigurable interior space. A reconfigurable space would allow the users to pick and place the objects and furniture that would be an appropriate fit for the activity they would do in the vehicle. This prototype also determined what level of control the users would want to have over the reconfigurable space.
Setup
To create a realistic experience for this prototype, it was pertinent for the users to feel what it would be like to reconfigure the interior space from the outside and then perform various activities within the space. The team rented a U-Haul moving van to replicate the autonomous driving experience, as shown in Figure 73. The van already had a partition between the front and back spaces, so the user had no view of the front or the driver. The objects that were available for the user


Reconfiguro Setup

to configure the van's back interior space included two chairs, an inflatable bed with pillows, a low-rise nightstand, a high-rise desk, a TV projector, a desk lamp, a deck of cards, drinks, and magazines. The TV projector was used in conjunction with a laptop computer to display TV shows on the van's partition screen (if the TV projector was selected in the interface) or a front-view driving visual (if the projector was not selected), as shown in Figure 74.

The user interacted with a tablet interface prior to getting into the van to configure the space, to give them a realistic experience of what it would feel like to control the cabin space remotely from anywhere. The vehicle was also equipped with speakers, not only to play the audio from the TV shows and music, but also to amplify a text-to-speech application imitating a robotic car voice. The text-to-speech voice was used to communicate travel updates to the user, ask the user specific questions, and respond to any requests from the user.


Control Interfaces
Three different interfaces were tested within this experience to determine what level of control the users would want to have over the reconfigurable space. The first interface was a Java drag-and-drop application (shown in Figure 75), with which the user had full control over where, and in what orientation, the objects could be placed in the interior. The drag-and-drop application was designed to portray the objects in a proportionally realistic manner relative to the size of the van's interior space, which allowed the user a better perception of what could really fit in such a space. The code for the Java application can be found in APPENDIX Section 9.8.2.

The second interface was a parameter-based application (shown in Figure 76) in which the user would answer specific questions pertaining to the parameters of the activity or drive, such as how many passengers will be riding, how long they will be riding, and "Set car for:" (relax, sleeping, socializing, etc.). Unlike the drag-and-drop interface, this parameter-based application gives the user less control over the type of configuration that is possible within the interior space. The slides for this interface can be seen in APPENDIX Section 9.8.3.
Control Interface
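The parameter-based flow can be illustrated with a small sketch that maps survey-style answers to a furniture configuration. This is purely illustrative; the rules and thresholds are our own assumptions, not the team's actual application logic:

```python
def configure_cabin(passengers, duration_min, mode):
    """Pick a cabin configuration from survey parameters, trading
    fine-grained user control for simplicity, as the interface did."""
    config = ["chair"] * passengers          # one seat per rider by default
    if mode == "sleeping":
        config = ["inflatable bed", "pillows"]
    elif mode == "working":
        config += ["high-rise desk", "desk lamp"]
    elif mode == "socializing":
        config += ["low-rise nightstand", "deck of cards", "drinks"]
    if duration_min >= 20:                   # longer rides get entertainment
        config.append("TV projector")
    return config
```

The point of the sketch is the trade-off: the user answers three questions instead of placing each object, and the system fills in the rest.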


Control Interface - Parameter Based

The third interface that was tested was based on the concept of a lifestyle sync application, in which the autonomous vehicle knows the user's schedule for the day and rearranges the space based on what kinds of things the user will be doing throughout the day. The vehicle also talked to the user and intuitively knew if the user needed something specific. For example, the user might have just gone to get a Starbucks coffee while there is no cup holder in the previous configuration; the vehicle will know this and ask the user whether he or she needs a cup holder. With lifestyle syncing, the vehicle would train on and learn from what the user needs, requests, and actually does, to be able to configure the vehicle in the most optimal way.

The drag-and-drop and parameter-based interfaces were presented on a small tablet device to give the user a realistic experience of what it would be like to configure the vehicle's space remotely from anywhere. The lifestyle syncing interface was given to the user as a survey to determine what their schedule would be, and then the space was configured for those activities.
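The learning loop that lifestyle syncing implies can be sketched as a simple request counter that starts suggesting an item once the user has asked for it often enough. This is an illustrative sketch; the class, the `suggest_after` threshold, and the interface are our own assumptions:

```python
from collections import Counter

class LifeSync:
    """Tracks what the user requests and proactively suggests items
    they have asked for repeatedly (e.g. a cup holder)."""

    def __init__(self, suggest_after=2):
        self.requests = Counter()
        self.suggest_after = suggest_after

    def record_request(self, item):
        # every explicit user request is a training signal
        self.requests[item] += 1

    def suggestions(self, current_config):
        # propose frequently requested items missing from the cabin
        return [item for item, n in self.requests.items()
                if n >= self.suggest_after and item not in current_config]
```

After two recorded requests for a cup holder, `suggestions([])` would start returning it until a configuration containing one is chosen.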


Testing Procedure
The testing was performed on campus with different stop locations around campus:
1. Starbucks, to pick up a cup of coffee
2. Work, to pick up some books
3. A trip to pick up a friend

Figure 77 portrays the different experience flows for each of the interface interactions and demonstrates the behind-the-scenes procedure to reconfigure the space.

Experience flows 1 and 2 correspond to the drag-and-drop and parameter-based interfaces. The user was given a story that they had just received an autonomous car and were going on three different stops around town (the stops were performed on campus). The user dragged and dropped where they wanted the furniture to be placed within the cabin space, or answered questions pertaining to the parameters of the activity and drive. Once the space was configured accordingly, the user hopped in the back and proceeded to the first stop, a 5-minute journey. The user picked up a snack or coffee and, right before heading back to the van, reconfigured the space to their needs or desires. The next stop was a 20-minute journey, and the user performed similar tasks as during the first stop. The last stop consisted of picking up a friend and socializing.

Experience flow 3 corresponded to the lifesync application. The van knew the user's schedule and preferences and made an educated guess at how they wanted the cabin space to be configured. Flow 3 was very similar to flows 1 and 2, except that the user did not reconfigure the space. Another additional feature within flow 3 was the ability to communicate with the van by asking specific questions or making requests, to which the car would respond.

The behind-the-scenes action of how the whole experience flow worked was very important. Two of the team members operated the van environment; one person was driving while the other operated all the internal features. The third member followed the van on the routes with all furniture objects that were not in the cabin space. Once the van made a stop and the user went off to perform their tasks, the entire team reconfigured the cabin, swapping objects between car and van.


(Figure 77: Experience flows)

Experience flows 1 and 2:
1. User comes; introduction story narrated, schedule given.
2. User selects configuration from the tablet interface.
3. User reaches the car and starts with a short 5-minute journey to the grocery store.
4. User reaches stop 1 and goes shopping; uses the tablet interface to change the configuration.
5. Long journey to work started, based on the configuration chosen.
6. After work, car reconfigured for two people and socializing with a friend.

Experience flow 3:
1. User comes; introduction story narrated, schedule given.
2. User answers a preference survey.
3. User reaches the car and starts with a short 5-minute journey to the grocery store.
4. Users go shopping and return to find the space reconfigured for them.
5. Long journey to work started; the car prompts the user for things they would like to change, and changes are made by making a quick stop.
6. After work, car reconfigured for two people and socializing with a friend; the car still prompts for comfort with the current configuration and desired changes.

Behind the scenes:
1. Two team members ready near the van; two members of the team in the van, one member follows in another car with spare furniture.
2. Cabin space reconfigured based on what the user wants; team is in driving position.
3. One team member follows the van and helps with changes on the go.


Results
There were six user tests that interacted with the different interfaces: one test with the drag-and-drop application, one test with the parameter-based interface, and four tests with the lifesync application. More user tests were performed with the lifesync application because more valuable insights were collected during the first initial test. Figure 78 displays some of the users' cabin space configurations.

Common observations for experience flows 1 and 2:
- Having to configure before getting into the car interrupts users' daily activities and, in general, it is more effort on the user's side.

Drag-and-drop interface:
- It took the users a long time to decide on the configuration.
- It was hard to gauge how cluttered it would actually be inside the van.
- Initially the users tended to place as many objects as they could in the cabin space.
- When they were in the car, they spent a significant portion of their time discussing or thinking about what the next configuration would be.

Parameter-based interface:
- It was easier than the drag and drop.
- The configurations created for the users were acceptable, and they did not need that much level of control.

LifeSync interface:
- It made transitioning from one place to another in the daily schedule seamless.
- The users did not mind making minor adjustments to the configuration while travelling, to better suit their needs.

User Tests Configuration (bottom and right)


- The users liked the fact that the car anticipated their needs and gave suggestions for changes to the configuration that the user might desire.

General Experience Observations:
- While relaxing, the users did not always want to be lying down completely; they would prefer something like a reclining chair.
- Users reposition themselves a lot in the cabin space.
- Just the front projection of driving is not enough to comfort the users and make them feel safe.
- Maintaining visuals, not just for situational awareness and safety but also for pleasure, is important.
- People get used to voice commands while interacting with the car very quickly, and if it stops talking to them they feel that something is wrong.
- There is an emotional connection that develops with the car when the car talks to the users.
- Facing each other was most conducive for work meetings and socializing.
- Having the users sit right next to each other can be useful for multimedia or for romancing.
- The person that gets picked up views the other person, who was riding the autonomous car before, as the owner. (In one instance, when the car asked the users a question and they replied with different opinions, the person who owns the car got priority in the group dynamic. In a second instance, the friend did not really interact with the car as much as the owner, as they were under the assumption that the owner knows better how to interact with their car.)
- A rider became embarrassed when the car asked something that revealed personal information.
- When a user was riding sideways, their body was turned slightly towards the front.


- If the car did not establish rapport early in the ride, the users felt less comfortable speaking with the car. People got really comfortable very quickly after this.
- Since it does not feel like a car anymore, you need more accessories like trash cans, outlets, and wifi.
- While talking to the car, the users want to humanize it as much as possible.
- Things kept falling off the table.
- Having just a fixed cup holder makes things very constrained.
- The car should not interrupt the conversation of a group, or should know when to interrupt the conversation. It is essentially like another human that must follow all the social etiquette.
- Car-user confidentiality: what information is revealed in front of the user's friends and what is not.
- Changing the configuration when the car had stopped was better, so there is a need to be able to change the configuration safely while the car is moving.
- The users forgot that they were in the back of the van, and when they saw other cars in the projection they suddenly realized that they were also in a car; the projection was in fact distracting from the main activity in which they were engrossed.

Conclusion
People value the benefits of being able to optimize the cabin space for different activities, but they would like to do so with the least amount of effort and intrusion in their daily lives. From the insights collected from the dark horse prototype, the team wanted to focus on the aspects of mobility and flexibility within a smart cabin space design, and concluded that within the cabin the chair is an important factor that affects this design space. One possible solution was to design a chair that senses whether the user is falling asleep, or whether they want to rotate and sit in a different orientation, and just does that without the user inputting any specific commands. So, is it possible to sense a user's intentions and adapt the chair's position based on the user's body movements while not intruding on the user's activities?

7.24.2 Dark Horse Prototype - SbW Imitation
The night before leaving for Germany, we wanted to test with a quick prototype whether it is possible to steer in non-ordinary positions, for example sideways or facing backwards (Figure 79). This prototype was based on our assumption that future autopiloted cars will be aware of their surroundings, so that the optimal driving position is not necessarily required for safe driving. If driving is actually possible in various positions, we could perhaps place the steering elsewhere in the car and this way acquire more freedom in cabin design. We also wanted to test how it feels to have a contradiction between the expected vehicle movement and the actual vehicle movement.

After visiting Audi, they kindly suggested us to focus on some other factors inside the


cabin. They also told us that they already have their own laboratory focusing on this so-called steer-by-wire technology. The reason they are studying this is mostly because it allows extra freedom in the dashboard area, and also because future electric cars will mostly be built with this technology, since it reduces car weight and the car is already electric and would not function at all without electricity. This sort of digital technology replaced the analog speedometer in the late 1970s, and it is eventually going to replace analog steering as well.

Setup
- Skype call between an iPhone and an iPad -> a quick analog setup for a screen/camera solution
- The iPad was also used as the steering wheel
- One person was pushing and steering the vehicle according to the driver's gestures
- Before iterating the setup any further, we quickly tested steering sideways and facing backwards

(Figure: iPhone with camera giving a vision forward; iPad as a wheel and screen)


Observations
- The vehicle can be steered in non-ordinary positions at slow velocities without difficulties in adapting.
- Minor crashes were caused by the rough setup, since the external person steering cannot react right after the driver has given the input.
- There is also a little delay between the phone and the iPad, which also makes steering a little difficult.

Conclusions / Lessons Learned
We can assume that future electric cars are going to use this SbW technology, since it is already under a lot of research at Audi. The delay issue can also be solved with more sophisticated technology. The functionality of steering through screens in non-ordinary positions needs to be tested at higher velocities before validating that it could be done. If our project is heading in this direction, we need to test this.

7.24.3 Dark Horse Prototype - Sleeping Positions
Since almost everyone would like to sleep in the car if it were possible, the team wanted to build various sleeping setups to imitate the different possibilities for doing so. Two assumptions exist:
1. The driver sleeps in a comfortable position so that he/she can enjoy complete relaxation. However, in this situation it is hard for the car to wake the driver up in a relatively short time.
2. The driver sleeps in a relatively uncomfortable position so that he/she can enjoy a nap, and the car is able to wake the driver up in a relatively short time.

In first-generation autonomous cars, which are going to be introduced within 10 years, sleeping is most likely not going to be possible, since reorientation from sleep takes too much time (about 30 seconds) and early autonomous cars have to be able to give the controls to the driver in some situations. This is not a problem in later-generation autonomous cars, where our target year of 2035 is situated. In 2035, the infrastructure and cars' autonomous features are advanced enough to allow sleeping in the driver's seat.

Setup One (Figure 80)

Using the first assumption as a guideline, the team came up with a sleeping position sideways to the direction of movement. The driver's seat turns 90 degrees to the right from the forward-facing orientation and the chair transforms into a bed. This idea derives from the fact that beds are positioned like this in trains to prevent possible motion sickness.
Setup Two (Figure 80)
Under our second assumption, we observed that it is possible for a person to fall asleep in a straight sitting position as long as the user's head is supported so that it does not fall. Therefore, we made a head holder out of pillows for the user, to see whether it is possible to sleep like this or not.
Setup Three (Figure 80)
The third setup included a device that has a flat and soft surface to support the head, and the same kind of support for the chest and upper body (Figure: Sleeping Positions 3). This setup got its idea from the way people sleep in airplanes. Since the inclination of the seats is so minor, sleeping in a regular way is difficult, and in this kind of situation people are forced to figure out other ways to sleep comfortably. In airplanes you can see people sleeping on the table in front of them, leaning on their hands.


Observations
The first setup is not much different from the regular sleeping position. The issue with this setup is the required space in the car's cabin: the width of the car is not enough for this kind of setup, so it is impossible to sleep sideways.

In the second test setup, the user's neck gets tired really quickly, since the weight of the head is too much to support while the neck muscles relax during sleep. This setup is not good for even a quick nap, since it causes aches for the user.

In the third setup, sleeping actually works quite well if the support is just right for the user. The spine is more relaxed, as is the neck. However, the user's chest gets a little too much pressure, since the body weight rests mostly on the chest, and breathing causes chest movement and increases the load on the chest.

Conclusion / Lessons Learned
Replacing the traditional sleeping position in a rational way is not possible, since we would have to compromise the user's ergonomics, and we are not willing to do so. Sleeping in the driver's seat has to be done in somewhat the same way as it is done in the seat next to the driver.

7.24.4 Dark Horse Prototype - AR Imitation
Because augmented reality (AR) technology is going to be available in the future, and various applications are going to be possible with it, we wanted to see if it is possible to imitate it.

Setup
We tried to draw augmented information on the glass of a protection helmet. The details we drew on the glass were mainly driving related, such as speedometer, road and navigation information. (Figure 81)

Conclusion / Lessons Learned
We immediately realized that this technology is almost impossible to imitate like this, since the eye has to focus on the details drawn on the glass, blurring the environment. Therefore it is not useful to imitate augmented reality this way.

Top - Protection helmet split


Bottom - Transparent Foil


7.24.5 Dark Horse Prototype - Disappearing Steering Wheel
With this prototype we wanted to test one of
our ideas for the transition between what we
called manual and autonomous mode. The
idea was to create a mechanism that hides the
steering wheel, so the user can utilize the
dashboard space for some other activity when
he/she is not required to steer. The whole
purpose of this prototype was to visualize in a
tangible way what we had in mind, and then to
use that prototype to inspire conversation and
gather ideas.

Setup
This prototype is one of many quick-and-dirty
prototypes. We made a cardboard steering
wheel that we attached to a cardboard table,
which imitates the dashboard (Figure 82). We
made two cuts in the table and used tape as
hinges in order to make the board rotatable.
Observations
- When flipped, the steering wheel requires a
  lot of space below the dashboard.
- We could imagine that the cuts in the table
  would give the interior a very cheap look.
- The cuts would not be very convenient,
  especially when the table is used for eating.

Conclusions / Lessons Learned
Although the prototype was rough, we quickly
realized that this solution for utilizing the
dashboard was probably not the best one. The
construction gave a very mechanical look and
caused more problems than it solved. We
realized that we need a simpler and more
sophisticated solution to this problem.


7.24.6 Dark Horse Prototype - Magneto
The idea behind the Magneto prototype was to
test what users would do with the steering
wheel if it were possible to attach it anywhere
on the dashboard. The prototype acted mainly
as a conversation starter.

Setup
We made a dashboard out of cardboard with
metal underneath, and a steering wheel of
plywood with a magnet, in order to fake the
steering wheel mounting (Figure 83). The test
setup was organised on a table, and the mood
was set by explaining that there are two modes
of driving: manual mode, where you drive
yourself, and autonomous mode, where the
car drives itself. During the test we conversed
with the tester and tried to collect valuable
feedback (Figure 84).

Discussion Key Notes
Where would the user put the steering wheel?
- The user would put the steering wheel in
  some convenient place. Places suggested:
  the car's roof, on the bench next to the
  driver's seat, or in a personal bag or purse.

What would the user do with a detachable
steering wheel?
- Use the steering wheel as a control device
  for the car's infotainment system in leisure
  mode.
- Steer the car from outside, like a remotely
  controllable car.
- The wheel could have a picture of the car
  from a bird's-eye view, like a computer game.

What other functions does a detachable
steering wheel make possible?
- The steering wheel could be used as a
  personal tablet.
- Augmented reality in the wheel; like a
  personal tablet but with more futuristic
  technology.
- The steering wheel could also be a key that
  locks the car and prevents its usage if the
  steering wheel is not near the car.

Conclusions / Lessons Learned
User testing is a fundamental way of getting
feedback and ideas from prototypes. A
prototype does not have to be sophisticated to
open a conversation and communicate the
idea. Here we learned that building and testing
more means learning more. The ideas
generated by users are listed above.


7.25 Funky Prototypes
Funky prototype is a term used in ME310 for an
approximation prototype of the full system,
built without making a costly commitment to
any one configuration, technology, or
geometry. This prototype is still a
low-commitment, rapidly assembled concept
prototype that allows for objective evaluation
and testing.

7.25.1 Funky Prototype -
Anticipatory Chair
The team wanted to explore the concept of
facilitating seamless transitions between
activities by having a chair that senses
changes in your body position and language
and moves to accommodate those changes.
The hope is to create a feeling of more mobility
by removing the need to make chair
adjustments before continuing on to a different
activity. This prototype was trying to determine
if users value this anticipatory system and if it
could be achieved by using force sensors.

Test Setup
Users were asked to perform the list of
activities taped to the dashboard in any order
and at whatever interval they wanted. The list
of activities included driving, trying to take a
nap, reaching for your backpack in the
backseat, watching TV, texting/reading a
magazine, and talking to your passenger
(Figure 85). "User activities" displays the
activities being performed from an actual


User Positions in the Car

user test. The users were told that the chair
would adjust based on their movements in the
chair. There was a button (as shown in Figure
87) on the steering wheel to override the
actions being made by the chair, which users
were instructed to press if the chair ever
incorrectly moved or anticipated their needs. A
second switch was on the dashboard so that
the user could switch between windshield
mode and watching TV. The debrief interview
was conducted in the test setup, while the
chair adjusted to better suit this social
dynamic. The tests lasted approximately 20
minutes.

Technical Results
Three patterns were noticeable in the
post-processed force sensor data: sitting up,
rotating to reach something in the back seat,
and pushing back against the chair. The
following pages explain the data and patterns
that emerged. The graphs are arranged as the
force sensors are arranged on the chair: the
top two graphs represent the top two sensors
on the chair, and the two right graphs
represent the two right sensors placed on the
chair when looking straight on at the chair
(Figure 88).

Dashboard Test Setup


Force Sensors

Force Sensors on the Chair


Observations
- Users would pull at the wheel slightly when
  they wanted to move into driving position.
- Users would begin to push back against the
  chair if the chair moved in a direction that
  they did not want.
- Rotating the chair to reach a backpack was
  useful only when the backpack was out of
  reach. Otherwise the chair did not react fast
  enough to be helpful.
- Comfortable angles of recline differed for
  each user.

Results
The team discovered that force sensors seem
like a fairly feasible way to identify simple user
intentions, like pushing back on the chair to
recline, but a higher density of sensors, in
conjunction with additional types of sensors,
will be needed in order to better distinguish
between more complex intentions, like the
desire to socialize with passengers. In general,
users had a positive experience with this
anticipatory sensing chair prototype when the
chair matched their intentions. In instances
where the chair was initially incorrect in its
anticipatory action, users still did not want to
regain manual control over the chair and felt it
made their transitions easier.

Seat Back Data: sitting up = 0 force read; seat
hitting the lower back as the chair raises into
driving position = INCREASE in force.

Force Pattern 1: Sitting Up (from reclined
position to driving position)
These graphs (Figure 89) demonstrate leaning
up from the seat back and waiting for the seat
to move into driving position. When the person
sits up, the two upper sensors read no force,
since their back is not touching the chair. Once
the seat starts touching the person's back, the
two lower sensors become active.
For the seat cushion, as the person sits up
from a reclined position, all their weight is then
shown in the seat cushion force sensors,
which is why one can see an increase in all
four bottom sensors to a steady-state value.


Seat Cushion Data: sitting up = INCREASE in
force.
Force Pattern 1

Force Pattern 2: Rotating (to reach for
something in the back seat)
These graphs (Figure 90) demonstrate
rotating/leaning towards the back to grab
something. At the red transition mark, the
entire top right side sensor goes to zero while
T1 (the lower left sensor, in the direction the
user was leaning) increases. After the red mark
transition, the data shows a lot of movement
shifting. Since there were not enough sensors
to capture the entire chair, the variation makes
it hard to distinguish what exactly is going on.
Conclusion: the density of sensors on the chair
needs to be increased to help filter out small
movement shifts that are unintentional.

Seat Back Data: top right sensor = 0 force;
lower left sensor, as the user rotates in that
direction = INCREASE in force.
Force Pattern 2


Seat Cushion Data: rotating to this side =
INCREASE in force; rotating to the other side =
DECREASE in force on all right sensors.

Force Pattern 3: Pushing Back (to recline)
These graphs (Figure 91) demonstrate pushing
back on the seat. The top seat back sensors
(T2 and T4) increase as the user pushes
against the seat back.
The top seat cushion sensors (B2 and B4)
decrease, while the lower seat cushion
sensors (B1 and B3) should slightly increase.
This increase is seen in B3, but because of the
user's position in the chair, B1 did not detect
this increase.

Seat Back Data: pushing on seat back =
INCREASE in force.
Seat Cushion Data: pushing on seat back =
DECREASE in force on the top sensors and a
slight INCREASE in force on the lower sensors.
Force Pattern 3

7.25.2 Funky Prototype -
What should I do
The team wanted to find out what gestures and
subconscious movements users performed
within the cabin space, to validate whether
people perform unintentional gestures in
unfamiliar situations. This prototype was
supposed to encourage innovative thinking
and offer us new and different ideas for
triggering these two modes. The team believed
that this prototype might uncover intuitive
gestures that people do when they orient
themselves into a driving or leisure position.

Test Setup
The concept of the prototype was to design a
dashboard with a steering wheel that could
disappear by flipping the dashboard. It
consisted of a car seat and a dashboard that
had a steering wheel on one side and an
original dashboard on the other side. The
windshield was imitated with two different
sceneries, one with a dull road and one with a
fun-to-drive road. All test situations were
filmed with a camera to later observe the
users' reactions and body language (Figure).
The team tested the prototype in several
different locations, including Design Factory,
the Administration Building, and Motonet (a
local hardware retailer store).
The test started with an explanation of the
situation the user was in. We ended the
explanation by suggesting that the user try to
summon the steering wheel. After the user got
the wheel to appear, we reset the scenario and
asked the user to retract the steering wheel. In
our test there was no right gesture for
summoning the wheel, so we did not have a
locked gesture for summoning the steering
wheel. The test process is explained in the
picture on the next page.


To give the user a motivation to change
between autonomous and driving modes, we
explained to the user that they would have to
switch modes based on the road conditions.
The tester started in autonomous mode on a
boring road. Then our facilitator suddenly
swapped the driving scenario to the
fun-to-drive road and asked the tester to
summon the wheel. After the user got control
of the steering wheel, we changed back to the
boring road to trigger the urge to switch back
to autonomous mode, as shown in the figure
on the right.

[Test process diagram: situation explanation;
user performs the transition to manual mode;
user performs the transition to autonomous
mode; debriefing the test with the user.]

[Mode diagram: autonomous mode on the
boring road; the fun road triggers the transition
to manual mode; the boring road triggers the
transition back to autonomous mode.]

Test Results
We defined the inputs that the users gave us
as suggestions and categorized them into
different groups. There were 11 testers, with
suggestions pertaining to both driving and
autonomous mode. We have 44 suggestions
collected in the table below. To ease the
comparison between different suggestions, the
data is also collected into a graph (Figure 95).
These were mainly suggestions that users
gave us knowingly. Voice recognition and
regular buttons were most commonly
suggested. Most of the hand gesture
suggestions were performed subconsciously.
This was what we were looking for, but as can
be seen from the test data, it was not a
common suggestion. Although there were
hand gestures at some point, a gesture was
never the first suggestion.

Observations
- Testers were afraid to fail; they did not go
  wild on ideas as we thought they would.


[Figure 95: bar graph of suggestions grouped
into the categories Touch, Voice, Gesture, Eye
Recognition, and Brain Waves, broken down
into the 1st through 5th gesture suggestion
amounts and the amount in total; 44
suggestions were collected in total.]


- If the first suggestion they gave was not
  right, they thought quite a while before
  another try.
- Some users lifted their hands to imitate
  driving when they realized it was a mock-up
  of a car interior.
- People from Design Factory are a lot more
  open to wild ideas, and it is easier to get
  those from them than from people outside
  the university area.
- It seemed to be really hard for users to
  position themselves far in the future.
- Users tended to suggest methods that were
  already working functions for them. For
  example, one user praised voice control
  since he had had such good experiences
  with his voice-controllable GPS.

Debrief Feedback
- One tester would like to have some
  indication of when the controls are under his
  control, other than only the steering wheel
  appearing.
- Users can be taught any kind of behavior;
  for example, calling service lines every time
  there is a problem is a habit learned from
  work. There is always someone else
  responsible for matters that you are not
  responsible for.
- The test setup was too humiliating; one
  tester got frustrated when the suggestion he
  was sure was right did not work.
- Voice browsing through menus is too
  complicated and slow. Most likely the voice
  commands will not be like that, but more like
  direct commands for different actions.

Conclusions / Lessons Learned
Voice recognition and touchable buttons
seemed to still be the most intuitive ways for
users to transition from autonomous to driving
mode. One possible solution is to have a
combination of commands that map to
different user interactions. Another possible
solution is hand gestures that initiate various
modes. If the user does not have their hands
on the steering wheel, the car will clearly
increase autonomous functionality, and if the
hands are on the steering wheel, the car will
decrease the autonomous feeling. We also
realized that, especially in the car world, the
prototype has to be sophisticated enough to
communicate well with the tester; otherwise it
is really hard to get valuable feedback.
Testing with users outside the university
environment is a whole other matter. The
whole test setup needs to be very well
planned. In Design Factory it does not really
matter, since people are used to testing
different kinds of prototypes, but normal
citizens who just want to take care of their
daily business are a different matter. In the
future we have to be really careful that we do
not confuse testers with an unclear test setup.
Since the year 2035 is far in the future, it is
difficult for normal people to comprehend. We
have been living in the year 2035 for 4 months,
and the futuristic world is really familiar to us.
But for random people in a store it is almost
impossible to step into the future just like that.
For future testing we should not include the
futuristic aspect too much; it seemed to just
confuse the users.


7.26 Functional System Prototypes
- How well can people adapt to this new chair
  interaction?
- Are gestures an appropriate method to
  initiate the transition between driving and
  autonomous mode?

Functional System prototype is a term used for
an ME310 prototype that is still a bit crude and
obviously assembled from off-the-shelf parts;
however, this time decisions on technical
implementation are made with increased
sophistication.

7.26.1 Functional System Prototype -
Anticipatory System
Based on a successful funky prototype, the
team decided to move forward and test the
entire functional system, with full integration of
the steering wheel and Chair Sense
implemented in a closed loop. The concept
being tested is one which facilitates effortless
transitions between activities by sensing the
intentions of the user through intuitive body
command movements and intentional
mode-initiating gestures.

Prototype Description
The funky prototype was developed further to
include closed-loop commands for the chair
motions, based on inputs from the force
sensors on the chair and gestures recognized
by the Microsoft Kinect camera. Another
addition to the prototype is a functional
steering mechanism, which emerges from the
dashboard only when in driving mode and is
hidden in autonomous mode, as seen in
Figures 96 and 97 (prototype setup 1 & 2). The
complete functional system is also hooked up
to a driving simulator so users get the
experience of driving and transitioning
between autonomous and driving mode. A
projector has been used to mock up the
windshield in front of the user, as they would
experience it in a real vehicle. The team also
verified that the chair twist/rotational motions
work with a person sitting in the chair.


Test Setup
The user was asked to sit in the system and
had all of the interactions and gestures
described to them, and a brief introduction
phase occurred so the user could get the hang
of all of the gestures and interactions with the
chair. A driving simulation game called Crazy
Taxi was projected onto the screen in front of
them as a windshield, and they could take over
control of the moving car when the steering
wheel emerged. A team member drove the
videogame car behind the screen while the
user was in autonomous mode.
The gestures that the user was asked to
perform during the test were a calibration pose
to get the Kinect to recognize their body, a
drive pose to move into driving mode, and a
retract pose to move into autonomous mode.
These gestures can be seen on the next page
(Figure 98), which demonstrates the way the
Kinect reads these different gestures.


Calibrated Position


The user also tried to utilize the Chair Sense
feature. The team would adjust the placement
of the force sensors in order to ensure a better
data output, and to ensure that the amount of
pressure needed to make the chair incline or
decline was neither too high nor too sensitive.
The modes they could experience were driving
mode and autonomous mode. Driving mode
entailed having the chair move closer to the
dashboard into an upright driving position and
having the steering wheel emerge.

Drive Position

Autonomous mode entailed having the chair
move away from the dashboard and recline
into a more comfortable and relaxed seat
position, as well as having the steering wheel
retract and be hidden from the user's view and
out of their reach. Once the users were familiar
with all of the interactions with the system, the
team let them transition through the activities
and modes as they pleased.
Results
Chair Sense interaction:

Retract Position

- For tall or short people it was difficult to use
  the interaction.
- The force sensors were often not placed
  correctly to ensure proper pressure
  distribution.
- Due to the angle of the Kinect, it was unable
  to capture the user's entire upper body, so
  the Kinect could not understand their
  gestures.
- The responsiveness of the sensors was
  annoyingly sensitive.
- The chair oscillated around the position the
  user actually wanted to be in.
- Pushing back against the chair felt very
  natural.
- There is a "sandwich" threshold in which the
  user can get stuck, unable to push the chair
  back.
- If this threshold is met, users immediately
  want to turn off these smart features or get
  out of the chair.
- People are afraid of being crushed by the
  chair if it is allowed to go past this sandwich
  threshold.

Gestures recognized by the Kinect software 1, 2, & 3

Steering Mechanism Interaction:
- People wanted to keep both hands on the
  steering wheel while signaling that they
  wanted to go into autonomous mode.
- Removing their hands from the wheel made
  them feel unsafe and felt awkward.

Other Considerations:
- Most users wanted a twist motion.
- All users described several instances in
  which they would want this to exist, even in
  cars today.
- Users wished there was a way to lock and
  unlock Chair Sense.
- Users wanted a way to tell the chair to stop
  in the position they wanted, in the event that
  it reacts incorrectly.

In general, people enjoyed the experience and
concept. Specifically, they enjoyed moving
from driving mode into autonomous mode, as
it meant the chair moved to a more relaxed
position and the steering wheel went away. It
made them feel free to not worry about driving
anymore and instantly put them in a position
that was more enticing.

Conclusions
The team concluded that the thresholds set for
the sensitivity of the chair must be dynamic
and change as the chair position changes. The
data collection points also cannot be fixed to a
point on the chair, but should rather use
reference points from the user's body; for
instance, using pressure data where the chair
touches just below the top of the user's
shoulder. That way, height will not affect the
data and the chair's responsiveness. Another
important discovery was that gestures were
triggered randomly just when a person was
gesticulating. Therefore, we do not think
gestures should be used as mode initiators.
Instead, gestures could be used as a way of
weeding out unintentional body movements as
opposed to intentional body commands.

7.26.2 Functional System Prototype -
Buckle It Out
- What is needed for an effortless transition?
- What kinds of problems arise when changing
  between autonomous and driving mode with
  a seatbelt switch?

The two design teams conducted different
prototypes for the same purpose: to provide
an easy trigger of autonomous mode and
manual mode, and to test the transition
between mode changes. Both teams decided
to hide the steering wheel when in


autonomous mode, based on the rationale
that:
1) it is less confusing for the driver not to see
the steering wheel when in autonomous mode,
2) the self-spinning steering wheel will not be a
distraction when the car is driving
autonomously, and
3) a retracted/folded steering wheel saves
space for other activities.
Two different methods for triggering the mode
change were prototyped. The first one was the
seat belt and the second one was gesture
control. In addition, the two prototypes
provided different solutions for hiding the
steering wheel.

The team found out that if the steering wheel
was removed, there was 15% more cabin
space. The concept of this prototype was to
design the steering wheel so it folded inside
the table when the driver freed their hands
from driving and wanted to use the space to
do something else. The goal was to test what
kind of information drivers needed to smoothly
transition from one mode to another, and
whether our mode-changing trigger (the seat
belt) was easy to use. The assumptions were
that the key activities in the car are driving,
socializing and working, as can be seen in
Figure 99.


Test Process
The team tested four individuals from different
age categories. Test users were initially told
that they were now sitting in a self-driving car
on their way home, and could take control over
the car when in driving mode. When the car
was in autonomous mode, the steering wheel
was folded inside a table, and it retracted out
when in driving mode. A driving simulator was
displayed on a computer screen so the testers
could see the road conditions and take control
over the car. Instead of telling the testers how
to change mode, the team tried to let the users
find the trigger first.

Observations
- None of the testers realized that the seat belt
  was the way to summon the steering wheel.
- They either pulled the wheel, searched for a
  button, tried voice control or gestures, etc.
- The team had to tell the testers to fasten the
  seat belt.
- The wheel popping out surprised them.
- The team was surprised to see that not many
  people unbuckled the seat belt to change
  mode. Instead, they would rather push the
  wheel back.
- The team realized that the seat belt might not
  be the best way to activate one mode.

Results
- The seat belt as a trigger brought confusion.
  Although the seatbelt is fastened only when
  the driver is driving, drivers were confused
  about unbuckling the seat belt when the car
  was still moving.
- The seat belt interrupted the activity flow
  when switching to autonomous mode. The
  driver can't let go of the wheel while
  unbuckling the seat belt, so the transition
  from manual mode to autonomous mode is
  not smooth.
- A cooling-down time before the mode change
  can give the driver better preparation,
  physically and psychologically.
- The trigger for manual-to-auto might be
  different than the trigger for auto-to-manual,
  because it requires more attention and a
  more complicated safety system. A more
  intuitive, easily accessible trigger is needed.
- The autonomous function activation system
  might be different from the activation system
  for wheel movement.

Conclusions
To make the transition between modes as
easy as possible for the users, the team has to
really think through the transition process. The
process has to be simple and clear for the
user. Also, the user has to be aware at all
times of what the car is doing, what mode the
car is in, and how to change the car to
different modes. The same goes for different
activities within the cabin.


7.27 Design Specifications

7.27.1 Design Specifications For
Anticipatory Chair
The technical setup of this prototype included
the fully functional car seat that was hacked
into so it could be controlled manually from a
remote area. The chair was fitted with eight
force sensing resistors (FSRs) connected to an
Arduino Mega2560 board. The Arduino was
connected to a computer running Matlab that
collected the raw data from the sensors and
the timestamp marks that signaled the test
user's transitions between activities.
Post-processing scripts were created to
construct visual plots of the sensor data
versus time, to examine whether distinct
patterns relative to the user's intentional
movement positions were noticeable. If these
patterns were obvious, then having the chair
automatically make the adjustments based on
these patterns could be a possibility.

Chair Electronics Discovery
The chair that was used in the prototype was
from an older Audi car. All the electronics to
control the seat adjustments were fully
functional. The most technical part of this
prototype consisted of making the necessary
adjustments to the chair electronics to control
the adjustments from a remote area. The first
main objective was to figure out how the chair
worked and where all the motor and
adjustment control lines were routed. The
team reverse-engineered the seat without any
documentation or electronic schematics of the
seat controller. A pinout diagram of the seat
wiring harness was created to map out where
the signals from each controller were routed.
Figures "Connector diagrams from seat wiring
harness 1 & 2" display the necessary pinout
diagrams of all the connectors that were under
the seat. The only adjustments that were
considered were the back seat tilt, the bottom
seat tilt, the bottom height adjustment, and the
seat sliding function.

Connector Diagrams (from the seat wiring harness):

Motor Control Signals (black connector):
+12VDC, GND, and the Motor 1-4 P1/P2 drive
lines.

Motor Sensing Signals (blue connector): Motor
1-4 encoder lines, Motor 1/2/3 GND, Motor 4
GND, the ACC pins, and links to the red
connector (P2, P6, P7).

Motors: Motor 1 (front bottom tilt), Motor 2
(chair slide), Motor 3 (back bottom tilt), and
Motor 4 (chair tilt), each with Power (pin 1),
Return (pin 2), Encoder (pin 4), and GND
(pin 3).

Main incoming/outgoing (red connector):
1 - Blue conn P1, 2 - Blue conn P28, 3 - N/C,
4 - +12VDC, 5 - GND, 6 - +12VDC,
7 - Blue conn P2, 8 - Seat Belt, 9 - N/C,
10 - Seat Belt.

Adjustment Controller Connector: 1 - Back tilt,
2 - Chair slide, 3 - GND, 4 - Bottom height,
5 - Bottom tilt.


The second objective was to figure out how
the switches on the chair worked in
conjunction with the electronic controller.
Figure "Switch box controller schematic"
displays the circuit schematic that was
designed to work with a switch box the team
created: an SPDT switch connecting the
controller input through a 450-ohm or a
700-ohm resistor to GND. Since the controller
circuit schematics were proprietary and the
team could not get access to them, the team
tapped onto the seat control adjustment
connector while it was hooked up to the
controller. It was observed that no 12VDC
power line was being routed into the
connector, but a GND (ground) line was. This
immediately meant that the switches, when
closed, were switching to GND. The next
discovery was that the switches were
momentary single-pole double-throw (SPDT)
switches, and the resistances on each of the
throws could be measured. One resistance
was measured as ~700 ohms while the other
was measured as ~450 ohms. Since only five
signal lines were routed into the seat
adjustment connector and one was GND, the
other four signal lines corresponded to the
four different switches that could be activated.
Each switch has two options, either the
forward or the backward direction. This led the
team to uncover that the controller used an
onboard analog-to-digital converter to sense
whether the voltage from the switch side
corresponded to the 450-ohm or the 700-ohm
side of the switch, resulting in the motor
spinning forward or backward.
Once the switches were figured out, a manual
switch box was created so that the chair could
be controlled from a remote area rather than
from the switch controls on the chair. Figures
"Switch control box 1 & 2" display the manual
switch box that the team designed.


Force Sensing Resistors
The chair was outfitted with eight FSRs to
collect the user's weight distribution data
across the chair. Figure 103 displays the chair
with the sensors and the nomenclature used to
distinguish each sensor for data processing.
The FSRs were connected with connectors to
make it easier to debug/repair each individual
sensor. A proto-shield that connected to the
top of the Arduino Mega2560 was used to
breadboard the FSR circuits. Figure "FSR
proto shield" displays the complete
proto-shield, and Figure "Schematic for one
FSR circuit" displays the circuit schematic of
how each individual sensor is connected.
Switch control box 2
Force Sensors
Position of the Force Sensors
FSR Proto Shield
Schematic for one FSR circuit

FSR circuit: +5VDC through the FSR (R6) to
the tapped analog input (pin 3), with R7 (3.3K)
from the tap to GND.

The FSRs are essentially potentiometers
(variable resistors). The amount of force
applied to the sensor pad is inversely
proportional to the resistance across
the terminals of the FSR. Within the circuit
that was used, the more force applied,
the larger the voltage drop seen where the
connection to the analog input pin is tapped.
The onboard ADC was used to convert this
0-5V voltage to a 10-bit digital
value. A higher digital value corresponds to
more force applied. The Arduino collects the
data from each sensor approximately 40
times per second and sends the raw data via
serial communication to Matlab for further
post processing.
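The divider behavior described above can be modeled in a few lines. This is a minimal sketch assuming the FSR sits between +5V and the tap with the fixed 3.3K resistor to GND (as in the schematic); the function name and the example resistances are illustrative, not from the original code.

```python
def fsr_adc_reading(fsr_resistance, r_fixed=3300.0, vref=5.0, adc_max=1023):
    """Model the FSR divider used on the proto-shield.

    More force -> lower FSR resistance -> higher tap voltage -> higher
    10-bit ADC count, matching the behavior described in the text.
    """
    v_tap = vref * r_fixed / (r_fixed + fsr_resistance)  # voltage divider
    return round(v_tap / vref * adc_max)                 # 10-bit conversion
```

For example, an unloaded pad near 100 kilohms reads only a few dozen counts, while a firmly pressed pad near 1 kilohm reads close to 800.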
During initial user testing, it was observed that
the density of FSRs that were implemented
was not sufficient for determining the weight
distribution of the user over the entire chair.
The FSRs only detect how much pressure
is applied to the pad, not where on the pad
the pressure is being applied.
For FSRs to work, a higher density of sensors
needs to be placed in localized areas to
obtain a better weight distribution.
MATLAB Processing
MATLAB was used, via serial communication
with the Arduino, to collect the raw data
from the sensors and the timestamp
that marked the data to note the user's
transition between activities. MATLAB
saved the FSR data and time stamps in text
files. Post processing scripts (Appendices)
took the information within the text files
and constructed visual plots of the data,
separated by top and bottom. The team
analyzed these plots and the user test
videos together to verify that the user's

visual data represented the patterns in the
plot. Figure Plots of FSR data from a user
test displays a user's test data plotted for
analysis. The plots were constructed to the
same nomenclature outline to make them more
intuitive.

7.27.2 Design Specifications For Anticipatory System

Plots of User Test Data

This prototype system consists of 5
subsystems that all work in conjunction
to provide the user with an interactive,
responsive chair experience, as seen in
Figure Subsystem component diagram.
These subsystems include the Microsoft
Kinect (IR Camera), Force Sensors and
Board, Chair Rotation Mechanism, Steering
Wheel Mechanism, and Motor Relay Driver
with Motor Position Detection. The Microsoft
Kinect was used and coded to detect specific
poses or gestures that would indicate
whether the user wanted to be in driving
mode or autonomous mode. The force
sensors and board were used to collect
the data used to determine chair tilt body
intentions to recline or incline the seat back.
The chair rotation mechanism allowed the
chair to rotate approximately 20 degrees
to help users sit more comfortably
while talking to a friend or reaching something
from the back seat. The steering wheel
mechanism allowed the steering wheel to
be hidden when not in use and to emerge
when the user wants to drive. And lastly,
the motor relay driver and motor position
detection enable the motors so that
the chair can react to the user inputs (gesture
and intentional body commands) and force
sensor inputs.


(Subsystem component diagram: IR Camera (Kinect), Force Sensor Board, Arduino, Chair Rotation Mechanism, Steering Wheel Mechanism, Motor Relay Driver, Motor Position Sensing)

Relay Driver Circuit Schematics

Relay Driver Board

The team designed a custom motor driving
and position-sensing controller to allow
the closed loop functionality. The OEM
controller that came with the chair was
not used because modifications to the
microcontroller could not be accessed.
The team re-engineered the controller by
visual inspection of the unit and determined
exactly how the controller knew the
position of the chair's motors at all
times.
To be able to gain control over the motors,
H-Bridge circuits were created out of
DPDT switches to allow the motors to spin
backwards and forwards, as shown in Figure
108. SPDT switches were used to connect
+12VDC or GND to supply power or lock
the motor. The circuit schematic displays
only one relay setup. Since there are four
motors on the seat, one motor to control
the steering mechanism, and one to control
the rotation mechanism, six individual relay
circuits were designed in parallel on a
protoboard, as shown in Figure Relay driver
circuit schematic and board 1 & 2.
Load current measurements were obtained


(Figure: Relay driver circuit schematic: SPDT relay K1, ORWH-SH-105H-3F, enables +12VDC power; DPDT relay K2, RTE24005F, swaps the motor terminals for direction control; 1N5817 diodes D1/D3 provide flyback protection to GND)
prior to selecting parts to build the relay
board. Load currents while the motors
were running were approximately 3-4A
with an average weight of 120lbs. As the
motors reached and stalled at their running
limits, the stall current was approximately
9A. The DPDT and SPDT switches were
chosen with current ratings of 10A and with
a switching coil voltage of 5VDC. This coil
voltage was pertinent since the board was
designed so the microcontroller logic levels
would be able to control the switches. A 10A
current rating seemed reasonable since the
team knew that software would prohibit the
motors from running until they stalled out. 1N5817
Schottky diodes were applied across the coil
control inputs and between +12VDC and
GND. Since motors and coils have inductive
properties, as the motor switches between
the forward and backward positions,
back-electromagnetic-field spikes are generated.
These spikes can potentially damage parts
and disrupt/create noise on sensitive logic
signals in the nearby area.
During initial testing of the entire prototype,
the team discovered that to create a
more refined experience for the user, the
transitions between chair positions must be
smooth and non-intrusive. The on/off control
signals to the motor did not provide the
experience that the team ultimately desired,
so future designs of the relay board will
replace the SPDT switches that supplied
power to the motor with Darlington power
transistors and appropriate heat sinks to
dissipate the wasted energy. The relays
had a switching time of approximately 10ms.
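The relay sequencing described above can be sketched as a small state machine. This is a host-side Python simulation under stated assumptions: the class and method names are illustrative, and inserting a dead time equal to the ~10 ms relay switching time between cutting power and reversing the DPDT relay is a safe-sequencing assumption, not taken from the original firmware.

```python
class RelayMotorStage:
    """Sketch of the SPDT (power) / DPDT (direction) relay stage."""

    SWITCH_TIME_MS = 10  # measured relay switching time from the text

    def __init__(self):
        self.powered = False
        self.direction = "forward"

    def command(self, direction):
        """Return the ordered relay actions needed to run in `direction`."""
        steps = []
        if self.powered and direction != self.direction:
            steps.append(("power", False))              # cut +12V first
            steps.append(("wait_ms", self.SWITCH_TIME_MS))
        if direction != self.direction:
            steps.append(("dpdt", direction))           # swap motor terminals
            steps.append(("wait_ms", self.SWITCH_TIME_MS))
            self.direction = direction
        steps.append(("power", True))                   # re-enable +12V
        self.powered = True
        return steps
```

Reversing a running motor thus always passes through a locked, unpowered state before the terminals are swapped.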


Although PWM could be used to control
the motors, the motors did not run properly
at such a low PWM frequency, since the
motors required a much higher frequency to
run without sharp transition steps.

Motor Position Sensing Board
The next step after gaining control over the
motors with the relays was to determine how
the original seat controller knew the positions
of each motor. The team knew there was
some kind of encoder sensor on the motor
but did not know how it was possible, since
encoder sensors usually consist of three signal
lines: VCC, GND, and the sensing line.
The motor pinouts only had two lines for
the encoder: some signal line and GND.
After several different attempts, the team
discovered that power was being supplied
through the encoder signal line while the
signal line was tapped to the encoder sensor.
With the use of a potentiometer in series with
+5VDC hooked to the encoder line, the
optimal resistance value (~440 ohms) was
found that allowed a low frequency square-wave
pulse to be transmitted on the line.
The square-wave pulse had a DC offset of
~2.5V and a range from 2.5V to 5V. These
parameter ranges were not sufficient to use
with the Arduino logic levels, so a comparator
setup was used with a threshold detection
of 3V, as shown in Figure Motor position
sensing circuit schematic and board 1 &
2. This allowed the output to step between
0V and 5V for an input range of 2.5V to
5V. The circuit schematic only displays the
setup for one motor, but the board houses
four separate motor sensing circuits for the
chair motors, as shown in the figure below.

Motor Position Sensing Board 1


(Figure: Motor position sensing circuit: encoder line biased through 1K resistors from +5VDC; LP339 comparator U1A compares against a 1K/470 threshold divider and drives the 0-5V OUTPUT)
Figure 111: Motor Position Sensing Board 2

Force Sensing Resistor

The same square FSR and circuit schematic
were used as in the funky prototype. Four
additional FSRs, a total of 12 FSRs, were
placed in areas that would be beneficial for
obtaining weight distribution data. The
sensor layout can be seen in Figure FSR
layout on the chair 1 & 2.

(Sensor labels: T1-T6 on the seat back, B1-B6 on the seat bottom)

FSR Layout on the chair


FSR Board

Instead of the FSR circuits being attached to an
Arduino proto-shield, a dedicated board was
built. The previous proto-shield required eight
analog inputs to be used on the Arduino.
The best way to utilize I/O ports on the
microcontroller was to design a board that
incorporated a 16-to-1 analog multiplexer
(with digital control lines) to select between
each of the 12 FSRs. This multiplexer
required only one analog input rather than
directly connecting each FSR to an analog
input. Figure FSR circuit schematic and
board displays the newly re-designed FSR
board.
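The scanning pattern implied by the 16-to-1 multiplexer can be sketched as follows. This is an illustrative Python simulation of the Arduino-side loop: `select_channel` and `read_adc` are hypothetical stand-ins for the real pin writes and `analogRead` call, while the 12 channels and the four control lines A-D come from the text and the CD4067 part.

```python
def control_bits(channel):
    """Bits for the mux control lines (A = LSB ... D = MSB)."""
    return [(channel >> b) & 1 for b in range(4)]

def scan_fsrs(read_adc, select_channel, channels=range(12)):
    """Scan the 12 FSRs through a CD4067-style 16-to-1 analog mux.

    `select_channel(n)` would drive the four digital control lines with
    the bits of n; `read_adc()` samples the single shared analog input.
    """
    readings = {}
    for ch in channels:
        select_channel(ch)          # set control lines A..D = bits of ch
        readings[ch] = read_adc()   # one ADC pin serves all 12 sensors
    return readings
```

The design trade is one extra settling delay per channel in exchange for freeing eleven analog pins.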

(Figure: FSR board schematic: TI CD4067BE 16-to-1 analog multiplexer; FSR channels T1-T6 and B1-B6 on inputs IN0-IN11; control lines CONTROL_A-CONTROL_D on select pins A-D; single OUT line to the Arduino analog input; +5VDC on VDD, GND on VSS/INHIBIT)
FSR circuit schematics

Kinect Camera Software

The Kinect camera from Microsoft was
used in conjunction with the FSRs in order
to better detect the body positions and
movements of the user. During the funky
prototype user testing, the team determined
that the FSR data was not accurate enough
to distinguish between unintentional and
intentional movements. One way to help
filter out the noise and small movement
adjustments was to use another sensor (the
Kinect camera). The Kinect has been a very
popular device for developing gesture based
recognition applications for a variety of


purposes. The team used the Kinect to get
specific body positions and gestures related
to enabling driving and retracting mode, but
also to detect whether the user wanted to
grab something from the backseat. Once the
data is processed on the laptop, the state of
the user's body position is transferred
via serial communication to the Arduino that
controls the seat.

Kinect Position Setup For Testing
The Kinect is set up as shown in Figure
Kinect setup FOV. With the device above
the person and tilted downward, the depth
information for the joint positions must be
translated/rotated around the x-axis by
the amount the Kinect is tilted, which is
performed in software. This emulates a
coordinate system similar to the
Kinect being mounted directly in front of the
user within the FOV (Field of View). During
initial testing, it was observed that the
Kinect did not track the user's body position
as accurately and efficiently as when the Kinect
was mounted perpendicular to the user.

Integrated Development Environment Setup
The Kinect device and software were
used on Mac OSX. Since the Kinect is a Microsoft/
Windows based product, third party open
source development environments were used
to create the code. OpenNI/NITE was the
open source API library that was used to
gain access to most of the Kinect device
functionalities. Processing, a graphical
interface that was developed in the same
manner as the Arduino IDE, was used with
the SimpleOpenNI library to develop the
data processing code passed to the Arduino
controller. The SimpleOpenNI library for
Processing is a simple wrapper library that
maps simple call functions to the more
complex API library functions within OpenNI.
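The tilt correction described in the position setup, rotating each depth point about the x-axis by the mount angle, amounts to one rotation per joint. The sketch below is a minimal Python illustration; the function name and the sign convention of the rotation are assumptions (OpenNI's axis conventions depend on the setup), while the idea of undoing the downward tilt comes from the text.

```python
import math

def untilt(point, tilt_deg):
    """Rotate a Kinect depth point about the x-axis by the mount tilt
    angle, emulating a camera facing the user head-on.

    `point` is (x, y, z) with z pointing out of the camera.
    """
    t = math.radians(tilt_deg)
    x, y, z = point
    return (x,
            y * math.cos(t) - z * math.sin(t),   # rotate y/z plane
            y * math.sin(t) + z * math.cos(t))
```

Applying this to every tracked joint before the rule checks makes the pose rules independent of how steeply the camera is angled down.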
Software Description and Functionalities
The software was designed to detect
specific body positions and gestures. The
team incorporated detection for transitioning
into driving mode (summoning the wheel),
transitioning into autonomous mode
(retracting the wheel), and turning/reaching
back toward the backseat. The software was
designed around the code from Making
Things See by Greg Borenstein.
The software is split up into two different
files, functional_system_prototype.pde
(the main file) and skeletonposer.

Kinect setup FOV


pde (the checking function file). The main
file initializes all the necessary data
processing events, adds the different body
position rules for detection, and does the
high level checking of each state. The
skeletonposer file implements the rule
functions that are called within the main
function and checks each rule to give a
binary result of either PASSED or
FAILED. The rules are based on joint-to-joint
coordinate comparison. For example, if
the y-coordinate of the right hand is greater
than the y-coordinate of the right elbow and
right shoulder, it can be concluded that the
user is raising their right hand. Figure 63
shows two of the body positions that were
implemented in software to be detected. The
code checks the rules against the updated
coordinate comparisons at every iteration
of the code loop. Once the skeletonposer
file determines whether the rules have PASSED
or FAILED, the main file sends the
Arduino that controls the seat adjustments
a character via serial communication to
represent the state of the user. The states
are decoded as R for retract, D for drive,
and Q for the default state in which the user is
in a neutral position.
Once the software is running, the user
uses the calibration body position Psi,
which most resembles the "don't shoot
me" position, as shown in Figure Gestures
recognized by the Kinect software. The
Kinect always has to be calibrated to the
user's body for skeleton tracking to be
accurate and enabled.

(Figure: Calibrated Position, Drive Position, Retract Position)

Gestures Recognized by Kinect Software
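The joint-to-joint rule checking and state decoding described above can be sketched as follows. This is a Python illustration of the pattern, not the original Processing code; the joint-dictionary representation and function names are assumptions, while the raised-hand rule and the R/D/Q serial characters come from the text.

```python
def raising_right_hand(joints):
    """One joint-to-joint rule of the kind skeletonposer.pde checks:
    PASS if the right hand's y-coordinate is above both the right elbow
    and the right shoulder. `joints` maps joint name -> (x, y, z),
    assuming y grows upward."""
    hand_y = joints["right_hand"][1]
    return (hand_y > joints["right_elbow"][1]
            and hand_y > joints["right_shoulder"][1])

def classify_state(rules):
    """Map rule results to the characters sent to the seat Arduino."""
    if rules.get("drive_pose"):
        return "D"   # summon the steering wheel
    if rules.get("retract_pose"):
        return "R"   # retract the wheel
    return "Q"       # neutral / default state
```

Each rule returns a binary PASSED/FAILED result, and the main loop re-evaluates every rule against the freshly updated joint coordinates on each iteration.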


Rotational Mechanism
The rotation mechanism allowed users
to reach the backseat for an item, or
parents to reach back and unbuckle their
child's car seat. Placing the chair base on
a ball bearing turntable, which was also
attached to a motor base, comprised
the rotation mechanism, as seen in Figure
Rotational mechanism subcomponent. This
motor base was also where the motor and
gear were mounted. Attached to the chair base
was a large gear, which interacts with the
gear mounted to the motor. This system
only allowed for a 30-degree turn, and the
gears were 0.5"-thick laser-cut acrylic pieces, as
seen in Figure Laser cut gears for rotational
mechanism. Gear profile information and
motor specifications can be found in Section
9.10.4 Appendices.

Laser Cut Gears

Rotational mechanism subcomponent

(Figure labels: Motor, Seat Base, Motor Gear, Seat Gear)


Steering Mechanism
The team designed a retractable steering
wheel using a crank slider mechanism. The
main concept is that the steering wheel lies
flush against the dashboard when not in
driving mode. This default configuration is
shown in Figure Steering wheel tucked
away (disappeared) when in autonomous
mode. There is also a flap, which users
can use as a work desk if they want to use
that space in autonomous mode to perform
some activity.
As soon as the controller gets a driving
mode command it initiates the chair motion
first. Once the chair is in the driving position,
the steering wheel motor turns for a fixed
time interval (since the motor used for this
prototype does not have encoders on it) and
the steering wheel pops out for the user to
drive. In this prototype, the steering wheel
has been connected to an actual driving
simulator steering wheel so that users
can actually drive on the simulator, which
is projected in front of them. Giving
users this simulated driving experience is
another addition to the functional prototype
compared to the Funky prototype. The whole
steering assembly lies on a slider, the base
of which is connected to the platform on
which the entire setup has been mounted.
The motor has been mounted in a laser cut
acrylic structure, which is attached vertically
to a support that goes over the mechanism.
The two links and their fasteners have been
chosen so that there is no interference
during motion. The crank slider mechanism
is shown in Figure Crank slider
mechanism for steering wheel.

Crank slider mechanism for steering wheel.


Arduino Code Description

The motion control module has been
implemented using the Move Motor
functions. It takes as arguments the
distance to be moved, in terms of encoder
ticks, and the direction of motion. There
is a moveComplete variable for both
motors which is set to 0 when the motors
are moving. This functionality can be used
to ignore motion commands while the chair
is being adjusted for a certain mode.
There is a motor stall check Boolean function
that detects when a motion command is being
applied and the encoder line is not ticking.
When a motor stall is detected the controller
immediately shuts down that motor.
The Arduino code starts with a calibration
function that maxes out the two motors at
one limit position. This is detected using the
motor stall function. Once both motors
are maxed out, the code updates the
EEPROM values of the motor positions to
zero. The EEPROM is used to constantly
store the current motor position, so that
the next time the code starts it knows what
position the seat is currently in.
All the force sensors on the chair are
connected to the ADC pins on the Arduino.
The motor encoder lines are connected to
the External Interrupt pins corresponding to
INT4 and INT5. Only two motors are
currently used for this prototype: the front/
back and the chair tilt motors. The Arduino
code runs in a loop, reading the
force sensor values from the top portion of
the backside of the chair. The ADC is 10-bit,
so the force sensor values range from 0
to 1023. It was found through initial testing
that the normal value of this force sensor
reading on the back of the chair is around
450, so the upper and lower thresholds for
the algorithm were set to 700 and 150.
The software has been implemented in
the form of a state machine, which keeps
checking for gesture recognition commands
being sent from the Kinect. These commands
are in the form of characters received on the
Serial port of the Arduino. As soon as a new
character is detected, the code feeds
the corresponding motion commands to the
two motors. The force sensor smart algorithm
has an override over this functionality except
when in drive mode. If at any point the force
sensor reading crosses the upper threshold,
the algorithm initiates backward motion
of the chair tilt, and vice versa for tilting
forward. If the algorithm detects a low on
the force sensors while moving back, it
stops the motion in that direction and goes
to default mode. Figure Smart chair mode
thresholds displays the thresholds for the
chair positions.


(Figure: Smart chair mode thresholds: steering wheel and smart chair transitioning between the R and D states)

Smart chair mode thresholds
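The threshold logic above can be sketched as a small decision function. This is an illustrative Python reconstruction, not the original firmware: the function name and the exact recline/incline mapping of the two thresholds are assumptions, while the 150/700 thresholds, the ~450 resting value, and the drive-mode override come from the text.

```python
LOW_THRESHOLD, NORMAL, HIGH_THRESHOLD = 150, 450, 700  # 10-bit ADC counts

def tilt_command(force_reading, state):
    """Decide chair-tilt motion from the seat-back force reading.

    `state` is the serial character from the Kinect: D (drive),
    R (retract), or Q (default). The smart tilt is disabled in
    drive mode, as described in the text.
    """
    if state == "D":
        return "hold"                 # override: no smart tilt while driving
    if force_reading > HIGH_THRESHOLD:
        return "recline"              # user pressing hard into the seat back
    if force_reading < LOW_THRESHOLD:
        return "incline"              # user leaning away from the seat back
    return "hold"                     # near the ~450 resting value
```

Readings near the resting value produce no motion, so ordinary shifting in the seat does not trigger the chair.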

7.27.3 Design Specifications For Buckle It Out

System Setup
The prototype is meant to test with users
the experience of switching between
autonomous mode and manual-driving
mode. The system consists of two
sub-systems, mounted to a frame, that make
the steering wheel appear: the pushing
out mechanism and the steering mechanism.
Fastening a modified seat belt triggers all
this activity. The casing of the servo was
designed in detail to fit the ends of the
custom made steering instrument and the
rotating servo. After that, the electronics and
the software are explained in detail. The
final prototype design is shown in Figure
122. Figure 123 shows user tests with the
final prototype.



Final Prototype Design (top and middle)


The Frame
The base of the prototype was a chair from
a Fiat Doblo that was bolted to a wooden
frame. The measurements of the mock-up
cabin were taken from the Audi A8. Figure
124 displays a member of the group setting
the layout.
The frame had already been built for the earlier
Funky prototype, but the system to control
the steering wheel was now incorporated. The
team had two ideas for how to make the steering
wheel appear, as shown in Figure 125. One idea
was to have a cylinder-rolling dashboard
that had the steering wheel appear from the
bottom of the dashboard between the legs
of the driver. The concept was initially tested
but was too large to fit in a normal sized car.
The team took measurements from a BMW


500 taxi and validated that in this concept
the steering wheel would hit the driver's legs. The
cabin space approximations can be seen in
Figure 126. The other idea consisted of a
joint-like bar that brought the steering wheel
inside the dashboard and opened up like
a flower for the driver (Figure 125, middle
right). In the end, the team combined the
two ideas into an appearing steering
wheel that came out of the dashboard but
retracted vertically at the very end of the
mode-switching cycle.


(Figure 126: cabin space approximations: 48cm, 24cm, 64cm, 38cm, 77cm, 50cm)


Steering Mechanism

The steering mechanism was built on an
ordinary gaming steering wheel where the
original wheel was cut and the wires were
lengthened (Figure 127). The new steering
wheel was designed to appear from the
dashboard. After a flipping movement, the
wheel appeared in front of the user, showing
it is ready to be driven. Inside the casing
there is a servo and its wiring. The
wheel is a square shaped bent pipe that is
mounted to a servo with a friction coupling.
The servo casing was made out of
prototyping plastic by milling (Figures 128
and 129).

Gaming Steering Wheel

Servo Casing


Steering Wheel Mechanism


Pushing Out Mechanism

It was decided to build the appearing wheel
with a normal gaming steering wheel
mounted to a table that moved linearly
by a determinate amount. The prototype
consisted of a bar that was connected to a
rolling plate on a table, while the other end
was connected to a separate small piece of
table on guides from an Ikea drawer. The
construction was made with a 24V DC motor
connected to the bar that transferred the
rotational movement into a linear movement.
The power was supplied from a controllable
power supply. The motor was controlled
by a motor controller able to handle
24V and more than 10 Amps. The motor
controller was controlled by a Teensy 2.0++
with a PWM signal. The motor was operated
blindly, so no feedback was given back to
the microcontroller; the calibration was
a timed sequence determining how long
the motor had to run in a certain direction
to produce the needed 22 cm of linear
movement at the end of the bar. (Figure 131)

Pushing Out Mechanism


Seat belt switch

The seat belt switch was built from a standard
seat belt lock and an embedded touch-sensitive
switch. A hole was drilled in the back of the
casing and the switch was mounted with hot glue
so that it is pressed by the existing mechanism of
the lock. The switch was then connected to
the Teensy 2.0++ pins with pull-up resistors
and polled in the program code. (Figure 132)

Electronics
The electronics on the breadboard contain a
Teensy 2.0++, a push button for debugging,
wiring for the seat belt switch, and the motor
controller (pin 16) and servo (pin 15) wiring.

Teensy 2.0++ Code Description
The code used for controlling the motor can
be found in the appendix. The code
has a main loop that is a modified state
machine that triggers from the seatbelt
switch. The button has debounce control
that eliminates false alarms from
the interference of other electronics. The
calibration is measured by hand and all
the values are timed against the mechanical
feedback of the system.
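The debounce control mentioned above can be sketched in a few lines. This is an illustrative Python model of a polled debouncer, not the original Teensy code; the function name and the five-poll stability window are assumptions, while the idea of rejecting interference-induced false triggers comes from the text.

```python
def debounce(samples, stable_count=5):
    """Report a press only after the polled input has read 'pressed'
    for `stable_count` consecutive polls, filtering the electrical
    interference described in the text.

    `samples` is the polled pin history (True = pressed).
    """
    run = 0
    for s in samples:
        run = run + 1 if s else 0   # any bounce resets the run
        if run >= stable_count:
            return True
    return False
```

A short glitch on the line never accumulates enough consecutive samples to fire, while a real buckle press does within a few polling cycles.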


Thank you for your time!
