<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0, user-scalable=yes">
<title>PersonalizedLanguageLearning_GH2</title>
<style type="text/css">
body {
font-family: Helvetica, arial, sans-serif;
font-size: 14px;
line-height: 1.6;
padding-top: 10px;
padding-bottom: 10px;
background-color: white;
padding: 30px; }
body > *:first-child {
margin-top: 0 !important; }
body > *:last-child {
margin-bottom: 0 !important; }
a {
color: #4183C4; }
a.absent {
color: #cc0000; }
a.anchor {
display: block;
padding-left: 30px;
margin-left: -30px;
cursor: pointer;
position: absolute;
top: 0;
left: 0;
bottom: 0; }
h1, h2, h3, h4, h5, h6 {
margin: 20px 0 10px;
padding: 0;
font-weight: bold;
-webkit-font-smoothing: antialiased;
cursor: text;
position: relative; }
h1:hover a.anchor, h2:hover a.anchor, h3:hover a.anchor, h4:hover a.anchor, h5:hover a.anchor, h6:hover a.anchor {
background: url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAAGXRFWHRTb2Z0d2FyZQBBZG9iZSBJbWFnZVJlYWR5ccllPAAAA09pVFh0WE1MOmNvbS5hZG9iZS54bXAAAAAAADw/eHBhY2tldCBiZWdpbj0i77u/IiBpZD0iVzVNME1wQ2VoaUh6cmVTek5UY3prYzlkIj8+IDx4OnhtcG1ldGEgeG1sbnM6eD0iYWRvYmU6bnM6bWV0YS8iIHg6eG1wdGs9IkFkb2JlIFhNUCBDb3JlIDUuMy1jMDExIDY2LjE0NTY2MSwgMjAxMi8wMi8wNi0xNDo1NjoyNyAgICAgICAgIj4gPHJkZjpSREYgeG1sbnM6cmRmPSJodHRwOi8vd3d3LnczLm9yZy8xOTk5LzAyLzIyLXJkZi1zeW50YXgtbnMjIj4gPHJkZjpEZXNjcmlwdGlvbiByZGY6YWJvdXQ9IiIgeG1sbnM6eG1wPSJodHRwOi8vbnMuYWRvYmUuY29tL3hhcC8xLjAvIiB4bWxuczp4bXBNTT0iaHR0cDovL25zLmFkb2JlLmNvbS94YXAvMS4wL21tLyIgeG1sbnM6c3RSZWY9Imh0dHA6Ly9ucy5hZG9iZS5jb20veGFwLzEuMC9zVHlwZS9SZXNvdXJjZVJlZiMiIHhtcDpDcmVhdG9yVG9vbD0iQWRvYmUgUGhvdG9zaG9wIENTNiAoMTMuMCAyMDEyMDMwNS5tLjQxNSAyMDEyLzAzLzA1OjIxOjAwOjAwKSAgKE1hY2ludG9zaCkiIHhtcE1NOkluc3RhbmNlSUQ9InhtcC5paWQ6OUM2NjlDQjI4ODBGMTFFMTg1ODlEODNERDJBRjUwQTQiIHhtcE1NOkRvY3VtZW50SUQ9InhtcC5kaWQ6OUM2NjlDQjM4ODBGMTFFMTg1ODlEODNERDJBRjUwQTQiPiA8eG1wTU06RGVyaXZlZEZyb20gc3RSZWY6aW5zdGFuY2VJRD0ieG1wLmlpZDo5QzY2OUNCMDg4MEYxMUUxODU4OUQ4M0REMkFGNTBBNCIgc3RSZWY6ZG9jdW1lbnRJRD0ieG1wLmRpZDo5QzY2OUNCMTg4MEYxMUUxODU4OUQ4M0REMkFGNTBBNCIvPiA8L3JkZjpEZXNjcmlwdGlvbj4gPC9yZGY6UkRGPiA8L3g6eG1wbWV0YT4gPD94cGFja2V0IGVuZD0iciI/PsQhXeAAAABfSURBVHjaYvz//z8DJYCRUgMYQAbAMBQIAvEqkBQWXI6sHqwHiwG70TTBxGaiWwjCTGgOUgJiF1J8wMRAIUA34B4Q76HUBelAfJYSA0CuMIEaRP8wGIkGMA54bgQIMACAmkXJi0hKJQAAAABJRU5ErkJggg==) no-repeat 10px center;
text-decoration: none; }
h1 tt, h1 code {
font-size: inherit; }
h2 tt, h2 code {
font-size: inherit; }
h3 tt, h3 code {
font-size: inherit; }
h4 tt, h4 code {
font-size: inherit; }
h5 tt, h5 code {
font-size: inherit; }
h6 tt, h6 code {
font-size: inherit; }
h1 {
font-size: 28px;
color: black; }
h2 {
font-size: 24px;
border-bottom: 1px solid #cccccc;
color: black; }
h3 {
font-size: 18px; }
h4 {
font-size: 16px; }
h5 {
font-size: 14px; }
h6 {
color: #777777;
font-size: 14px; }
p, blockquote, ul, ol, dl, li, table, pre {
margin: 15px 0; }
hr {
background: transparent url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAYAAAAECAYAAACtBE5DAAAAGXRFWHRTb2Z0d2FyZQBBZG9iZSBJbWFnZVJlYWR5ccllPAAAAyJpVFh0WE1MOmNvbS5hZG9iZS54bXAAAAAAADw/eHBhY2tldCBiZWdpbj0i77u/IiBpZD0iVzVNME1wQ2VoaUh6cmVTek5UY3prYzlkIj8+IDx4OnhtcG1ldGEgeG1sbnM6eD0iYWRvYmU6bnM6bWV0YS8iIHg6eG1wdGs9IkFkb2JlIFhNUCBDb3JlIDUuMC1jMDYwIDYxLjEzNDc3NywgMjAxMC8wMi8xMi0xNzozMjowMCAgICAgICAgIj4gPHJkZjpSREYgeG1sbnM6cmRmPSJodHRwOi8vd3d3LnczLm9yZy8xOTk5LzAyLzIyLXJkZi1zeW50YXgtbnMjIj4gPHJkZjpEZXNjcmlwdGlvbiByZGY6YWJvdXQ9IiIgeG1sbnM6eG1wPSJodHRwOi8vbnMuYWRvYmUuY29tL3hhcC8xLjAvIiB4bWxuczp4bXBNTT0iaHR0cDovL25zLmFkb2JlLmNvbS94YXAvMS4wL21tLyIgeG1sbnM6c3RSZWY9Imh0dHA6Ly9ucy5hZG9iZS5jb20veGFwLzEuMC9zVHlwZS9SZXNvdXJjZVJlZiMiIHhtcDpDcmVhdG9yVG9vbD0iQWRvYmUgUGhvdG9zaG9wIENTNSBNYWNpbnRvc2giIHhtcE1NOkluc3RhbmNlSUQ9InhtcC5paWQ6OENDRjNBN0E2NTZBMTFFMEI3QjRBODM4NzJDMjlGNDgiIHhtcE1NOkRvY3VtZW50SUQ9InhtcC5kaWQ6OENDRjNBN0I2NTZBMTFFMEI3QjRBODM4NzJDMjlGNDgiPiA8eG1wTU06RGVyaXZlZEZyb20gc3RSZWY6aW5zdGFuY2VJRD0ieG1wLmlpZDo4Q0NGM0E3ODY1NkExMUUwQjdCNEE4Mzg3MkMyOUY0OCIgc3RSZWY6ZG9jdW1lbnRJRD0ieG1wLmRpZDo4Q0NGM0E3OTY1NkExMUUwQjdCNEE4Mzg3MkMyOUY0OCIvPiA8L3JkZjpEZXNjcmlwdGlvbj4gPC9yZGY6UkRGPiA8L3g6eG1wbWV0YT4gPD94cGFja2V0IGVuZD0iciI/PqqezsUAAAAfSURBVHjaYmRABcYwBiM2QSA4y4hNEKYDQxAEAAIMAHNGAzhkPOlYAAAAAElFTkSuQmCC) repeat-x 0 0;
border: 0 none;
color: #cccccc;
height: 4px;
padding: 0;
}
body > h2:first-child {
margin-top: 0;
padding-top: 0; }
body > h1:first-child {
margin-top: 0;
padding-top: 0; }
body > h1:first-child + h2 {
margin-top: 0;
padding-top: 0; }
body > h3:first-child, body > h4:first-child, body > h5:first-child, body > h6:first-child {
margin-top: 0;
padding-top: 0; }
a:first-child h1, a:first-child h2, a:first-child h3, a:first-child h4, a:first-child h5, a:first-child h6 {
margin-top: 0;
padding-top: 0; }
h1 p, h2 p, h3 p, h4 p, h5 p, h6 p {
margin-top: 0; }
li p.first {
display: inline-block; }
li {
margin: 0; }
ul, ol {
padding-left: 30px; }
ul :first-child, ol :first-child {
margin-top: 0; }
dl {
padding: 0; }
dl dt {
font-size: 14px;
font-weight: bold;
font-style: italic;
padding: 0;
margin: 15px 0 5px; }
dl dt:first-child {
padding: 0; }
dl dt > :first-child {
margin-top: 0; }
dl dt > :last-child {
margin-bottom: 0; }
dl dd {
margin: 0 0 15px;
padding: 0 15px; }
dl dd > :first-child {
margin-top: 0; }
dl dd > :last-child {
margin-bottom: 0; }
blockquote {
border-left: 4px solid #dddddd;
padding: 0 15px;
color: #777777; }
blockquote > :first-child {
margin-top: 0; }
blockquote > :last-child {
margin-bottom: 0; }
table {
padding: 0;border-collapse: collapse; }
table tr {
border-top: 1px solid #cccccc;
background-color: white;
margin: 0;
padding: 0; }
table tr:nth-child(2n) {
background-color: #f8f8f8; }
table tr th {
font-weight: bold;
border: 1px solid #cccccc;
margin: 0;
padding: 6px 13px; }
table tr td {
border: 1px solid #cccccc;
margin: 0;
padding: 6px 13px; }
table tr th :first-child, table tr td :first-child {
margin-top: 0; }
table tr th :last-child, table tr td :last-child {
margin-bottom: 0; }
img {
max-width: 100%; }
span.frame {
display: block;
overflow: hidden; }
span.frame > span {
border: 1px solid #dddddd;
display: block;
float: left;
overflow: hidden;
margin: 13px 0 0;
padding: 7px;
width: auto; }
span.frame span img {
display: block;
float: left; }
span.frame span span {
clear: both;
color: #333333;
display: block;
padding: 5px 0 0; }
span.align-center {
display: block;
overflow: hidden;
clear: both; }
span.align-center > span {
display: block;
overflow: hidden;
margin: 13px auto 0;
text-align: center; }
span.align-center span img {
margin: 0 auto;
text-align: center; }
span.align-right {
display: block;
overflow: hidden;
clear: both; }
span.align-right > span {
display: block;
overflow: hidden;
margin: 13px 0 0;
text-align: right; }
span.align-right span img {
margin: 0;
text-align: right; }
span.float-left {
display: block;
margin-right: 13px;
overflow: hidden;
float: left; }
span.float-left span {
margin: 13px 0 0; }
span.float-right {
display: block;
margin-left: 13px;
overflow: hidden;
float: right; }
span.float-right > span {
display: block;
overflow: hidden;
margin: 13px auto 0;
text-align: right; }
code, tt {
margin: 0 2px;
padding: 0 5px;
white-space: nowrap;
border: 1px solid #eaeaea;
background-color: #f8f8f8;
border-radius: 3px; }
pre code {
margin: 0;
padding: 0;
white-space: pre;
border: none;
background: transparent; }
.highlight pre {
background-color: #f8f8f8;
border: 1px solid #cccccc;
font-size: 13px;
line-height: 19px;
overflow: auto;
padding: 6px 10px;
border-radius: 3px; }
pre {
background-color: #f8f8f8;
border: 1px solid #cccccc;
font-size: 13px;
line-height: 19px;
overflow: auto;
padding: 6px 10px;
border-radius: 3px; }
pre code, pre tt {
background-color: transparent;
border: none; }
sup {
font-size: 0.83em;
vertical-align: super;
line-height: 0;
}
kbd {
display: inline-block;
padding: 3px 5px;
font-size: 11px;
line-height: 10px;
color: #555;
vertical-align: middle;
background-color: #fcfcfc;
border: solid 1px #ccc;
border-bottom-color: #bbb;
border-radius: 3px;
box-shadow: inset 0 -1px 0 #bbb
}
* {
-webkit-print-color-adjust: exact;
}
@media screen and (min-width: 914px) {
body {
width: 854px;
margin:0 auto;
}
}
@media print {
table, pre {
page-break-inside: avoid;
}
pre {
word-wrap: break-word;
}
}
</style>
</head>
<body>
<p><img src="imageData/cover.png" alt="Screen shots of final mockup for mobile application"></p>
<h1 id="toc_0">Personalized Language Learning</h1>
<p>UX Case Study | Iterative Research, Design, and Testing</p>
<hr>
<h2 id="toc_1">Background</h2>
<p>This is an individual project I began working on as an Apprentice in User Experience at Fresh Tilled Soil, a Boston-based digital design agency. </p>
<h2 id="toc_2">Challenge</h2>
<p>While gamified lesson plans are an effective and engaging way to learn for many, I wanted to explore alternatives to this format where material is learned more organically, is more freely and meaningfully associated, and is unique to every learner. </p>
<p>Beyond my personal interest in learning it, I chose to design an experience for learning American Sign Language because I was intrigued by the challenges it poses and the opportunities it may afford compared to a hands-free, spoken language. American Sign Language also comes with a rich culture to immerse yourself in and diverse communities of hearing, hard of hearing, and deaf individuals to connect with.</p>
<h2 id="toc_3">Approach</h2>
<p>I approached this project through iterative design sprints consisting of five phases:</p>
<ul>
<li><em>Understanding</em> user needs and motivations</li>
<li><em>Diverging</em> to generate many ideas</li>
<li><em>Converging</em> on one idea</li>
<li><em>Prototyping</em> that core user flow</li>
<li><em>Testing</em> my most dangerous assumptions</li>
</ul>
<hr>
<h2 id="toc_4">Sprint 1</h2>
<h3 id="toc_5">Understand</h3>
<p>During the initial sprint, I wanted to better understand people’s motivations for learning another language and explore what patterns of studying behavior existed. Through interviews, I discovered that people were studying in shorter, more frequent sessions, on the go, and often spontaneously rather than following a structured study schedule. </p>
<h3 id="toc_6">Diverge</h3>
<p>I began sketching ideas for short, engaging learning exercises and ways to capture moment-of-learning associations, such as what the user was doing, where she was, or how she was feeling during those exercises. Referred to as “hints”, these associations would always remain attached to the words, serving as memory aids when needed.</p>
<p><img src="imageData/diverge_1_muted.png" alt="Sketches and post-it notes"></p>
<h3 id="toc_7">Converge</h3>
<p>The core user flow began with learning the sign for a word, then adding a hint, and finally seeing that hint available during review exercises. </p>
<p><img src="imageData/coreUserFlow_1.jpg" alt="Diagram of core user flow"></p>
<h3 id="toc_8">Prototype</h3>
<p>Starting with paper prototypes, I moved to Keynote so that I could include videos of words being signed. Keeping this prototype at low-fidelity allowed me to test my assumptions and get the most valuable feedback regarding core user flow rather than UI design.</p>
<p><img src="imageData/converge_1.jpg" alt="paper prototypes"> </p>
<p><img src="imageData/prototype_1.png" alt="screenshot of low-fidelity mockup in Keynote"></p>
<h3 id="toc_9">Test</h3>
<h4 id="toc_10">Method</h4>
<p>The most dangerous assumption I wanted to test here was that users would find it engaging and useful to associate each new sign they learned with something specific about the moment they learned it. </p>
<p>I recruited five participants with no prior experience learning American Sign Language. Due to time constraints, this test was conducted remotely using Keynote and video/voice chat. </p>
<p>Participants were asked to think aloud while they used the prototype to learn new signs and complete review exercises. Because I was moderating, observing, listening for what participants were and weren’t saying, and taking notes, each session was also recorded with the participant’s consent.</p>
<h4 id="toc_11">Findings</h4>
<p>Overall, participants were confused by the idea of explicitly choosing an association to make with certain signs. Referring to them as <q>hints</q> and <q>memory aids</q> increased confusion, as participants expected that the hint would directly relate to the actual meaning of the sign rather than relate back to the moment they learned the sign. Every participant wanted more flexibility with creating hints, mainly the ability to take notes and draw. </p>
<h4 id="toc_12">Further insight</h4>
<p>With just a handful of participants, attempting to measure how well people remembered the signs they had learned using hints wasn't very effective. Particularly because of the confusion around what the hints were and how they were being used, it was much more valuable to simply hear participants' thoughts and observe their behaviors while testing the prototype. </p>
<p>In addition, this particular usability test would have been much more effective if it were conducted as a field study, which would allow participants to use real rather than simulated moment-of-learning experiences as associations. </p>
<hr>
<h2 id="toc_13">Sprint 2</h2>
<p>From the feedback, observations, and lessons learned in the first sprint, it only made sense to use this next sprint to test my next set of assumptions out in the real world, observing how people learn new signs and create hints at different times of day and in different types of environments.</p>
<h3 id="toc_14">Understand</h3>
<p>I first wanted to better understand how people use spare time throughout a typical day. Through in-depth interviews, I recorded detailed accounts of each individual's experience throughout a specific day, including what they were feeling and thinking. </p>
<p>I found that individuals used their lunch and commute time primarily to connect with close friends and family, sharing with others where they were, what they were doing, and how they were feeling. </p>
<h3 id="toc_15">Diverge</h3>
<p>What if these moments of sharing with others were also used for learning new signs? </p>
<p>What if the user is in control of the material she learns based on what she is doing, where she is, and how she is feeling? </p>
<p>In learning signs relevant to what is going on in the user's life at that moment, the user no longer needs to rely on adding hints for creating moment-of-learning associations.</p>
<h3 id="toc_16">Converge</h3>
<p>Instead of learning a pre-determined series of signs, the revised key user flow then begins with learning signs that are relevant to the user during that moment-of-learning. </p>
<p><img src="imageData/coreUserFlow_2.jpg" alt="diagram of core user flow"></p>
<p>This was done by prompting the user to answer questions like <q>Where are you?</q>, <q>What are you doing?</q>, or <q>How do you feel?</q>. </p>
<p>Adding hints becomes a practical way to help learn new signs in whatever format is most effective - notes, a sketch, image, or other media. </p>
<p><img src="imageData/wireflow_Prototype_2.png" alt="wireflow diagram showing flow of user interaction between low-fidelity screenshots"></p>
<h3 id="toc_17">Prototype</h3>
<p>From this wireflow, I created interactive prototypes with video layers using Flinto. Because I was still testing core user flow, I kept the fidelity level low. During usability testing, I gave participants paper prototypes for actually creating their hints. </p>
<h3 id="toc_18">Test</h3>
<h4 id="toc_19">Method</h4>
<p>The key assumptions I wanted to test were:</p>
<ul>
<li><p>Users will add hints for helping them remember the meaning of the sign. </p></li>
<li><p>Users are more likely to create hints and capture their environment if the environment is interesting and it is a more productive time of day (i.e., morning or afternoon rather than the end of the work day).</p></li>
<li><p>Users will rely mostly on notes and sketches for remembering what signs mean when they are using the app during their commute, rather than take photos. </p></li>
</ul>
<p>To do this, I observed four participants learning signs and creating hints while in different environments and times of day. The tests with participants A and B were divided into separate sessions on different days. </p>
<table>
<thead>
<tr>
<th style="text-align: center">Participant</th>
<th style="text-align: center">Morning Commute</th>
<th style="text-align: center">Afternoon Coffee/Lunch</th>
<th style="text-align: center">Evening Commute</th>
</tr>
</thead>
<tbody>
<tr>
<td style="text-align: center">A</td>
<td style="text-align: center"></td>
<td style="text-align: center">X</td>
<td style="text-align: center">X</td>
</tr>
<tr>
<td style="text-align: center">B</td>
<td style="text-align: center">X</td>
<td style="text-align: center">X</td>
<td style="text-align: center"></td>
</tr>
<tr>
<td style="text-align: center">C</td>
<td style="text-align: center"></td>
<td style="text-align: center">X</td>
<td style="text-align: center"></td>
</tr>
<tr>
<td style="text-align: center">D</td>
<td style="text-align: center"></td>
<td style="text-align: center">X</td>
<td style="text-align: center"></td>
</tr>
</tbody>
</table>
<p>For this usability testing, I teamed up with other apprentices for help with recording sessions and taking notes. </p>
<h4 id="toc_20">Findings</h4>
<h5 id="toc_21">User-generated hints</h5>
<p>Rather than add hints for helping them remember the meaning of the sign, most participants added hints for remembering <em>how</em> to sign the word they were learning. </p>
<p>Participants A, B, and C created hints for remembering how to sign words. Participants A and B described hand shape and movement by drawing diagrams and adding notes. Participant C drew diagrams as well, adding detailed written descriptions of hand shape and movement. </p>
<p>Only Participant D created hints for remembering what signs meant, using drawings and notes.</p>
<p><img src="imageData/test_hints_2.png" alt="photos of user-generated hints"></p>
<p>Overall, participants seemed to prefer adding notes as their primary hint, regardless of how the hint was to be used or the environmental conditions at the moment of learning. </p>
<h5 id="toc_22">Reviewing signs with user-generated and app-generated hints</h5>
<p>In the second sessions with participants A and B, hints they created for signs they learned in the previous session were used to help them remember how to sign the word. When asked to recall the meaning of signs, they could see other words they learned at the same time and when or where they learned that group of signs.</p>
<p><img src="imageData/prototype_2.png" alt="screenshot of user-generated hints appearing in app as memory aids for studying"></p>
<p>Incorporating user-generated and app-generated hints in this way was easy for both participants to understand and use. Both participants were willing, but less excited, to take a photo of themselves signing or of their surroundings during their morning and evening commutes. Not surprisingly, both were more interested in taking photos while having lunch. </p>
<p><img src="imageData/test_Paloma_2.png" alt="photos of participant using mockup app in a restaurant setting"></p>
<h4 id="toc_23">Further insight</h4>
<p>All but one participant were willing to take photos of themselves or their surroundings, particularly in the afternoon at lunchtime. All but one participant developed strategies for taking photos/videos of themselves signing a two-handed sign. In other words, people were able to find a way to accomplish something if they were motivated enough to do so. However, propping your phone up against a cup of coffee at the table, or placing it temporarily on your lap while on a moving train, isn't the best experience.</p>
<p>Other interesting findings were:</p>
<ul>
<li>Participants expressed that they felt comfortable learning to sign in busy public environments because no sound was necessary.</li>
<li>Participants confused left and right; some had much more trouble with this than others.</li>
<li>Participants had trouble initially learning a sign when it was presented by more than one person.</li>
<li>Participants enjoyed learning their names and seeing their names used throughout the prototype.</li>
</ul>
<hr>
<h2 id="toc_24">Sprint 3</h2>
<p>The focus of the final sprint was to build on the exploratory findings of the previous sprints, redefining the core user flow to create a personalized language learning experience for real users.</p>
<h3 id="toc_25">Understand</h3>
<p>I began research into who is practicing sign language, what their motivation is to learn/practice American Sign Language, and what their level of proficiency is. </p>
<p>From this research, three key personas and an important user story that connects them emerged.</p>
<p><img src="imageData/personasEmpathyMaps.png" alt="personas Ben, Jill, and Kate"></p>
<h4 id="toc_26">Ben and Jill</h4>
<p>Ben is deaf and fluent in ASL. He enjoys making new friends, encourages anyone interested in strengthening their ASL skills, and is actively involved in the deaf community.</p>
<p>Ben frequently shares photos on Facebook and Instagram. His cousin Jill, in California, often comments on his social media posts in an effort to stay connected. Every so often, Ben and Jill will find time to video chat. </p>
<p>Jill is hearing and just beginning to learn American Sign Language in order to connect with Ben and the family more. She is very motivated to learn sign language in order to build a stronger relationship with her cousin. </p>
<p>While video chat enables Jill to learn and practice new signs with Ben as they would in person, she and Ben struggle to have a meaningful conversation over what turns into a half-hour tutoring session. Infrequent video chatting adds to the struggle, since Jill goes weeks without practicing signing with another person. </p>
<p><img src="imageData/xMap_Ben&Jill.png" alt="experience map showing interactions between Ben and Jill"></p>
<p>In mapping the experience of daily, weekly, and monthly interactions between Ben and Jill, I was able to pinpoint opportunities for addressing specific pain points.</p>
<p>Ben and Jill frequently connect through sharing photos. Each time either shares a photo could be an opportunity for Jill to learn new signs and share what she is learning with Ben. With access to media created by users fluent in American Sign Language, Jill could describe the photo she is sharing with Ben using sign language. Having attached various signs to the description of her photo, she is able to review all of the signs she has learned, add notes and/or sketches, and more easily share what she is learning with others.</p>
<h4 id="toc_27">Kate</h4>
<p>Kate is hard of hearing. On a weekly or monthly basis she meets with others to practice ASL and stay connected with the deaf community. Due to limited signing practice, however, she often finds it difficult to establish more meaningful relationships. She needs a way to more easily and more frequently connect with others in the community.</p>
<p><img src="imageData/xMap_Kate.png" alt="experience map for Kate"></p>
<p>In mapping Kate's specific flow of interactions when practicing sign language in person, the key opportunities discovered were:</p>
<ul>
<li>Kate could have more meaningful conversations if photos were shared.</li>
<li>She could more effectively learn new signs while in conversation if she could better capture them to practice later.</li>
</ul>
<h3 id="toc_28">Diverge</h3>
<p>How does the core user flow need to evolve to support each of these use cases for these key personas?</p>
<p>How does this inform the core functionality for the app?</p>
<h3 id="toc_29">Converge</h3>
<p>A new core user flow that begins with the user taking a photo turns any moment a user wants to capture and share with others into a moment of learning. Signs are attached to the photo by adding a short description and hints can be added to each sign. </p>
<p><img src="imageData/coreUserFlow_3.jpg" alt="diagram of core user flow"></p>
<h3 id="toc_30">Prototype</h3>
<p>Starting in Balsamiq made it fast and easy for me to explore many design ideas for an image-based learning experience. </p>
<p><img src="imageData/onboardingX.png" alt="screenshots of medium-fidelity mockup made in Balsamiq"></p>
<h4 id="toc_31">Final Design</h4>
<p>In further developing the best idea, I wanted to design a more polished prototype without having to decide on every detail of the look and feel yet. To do this, I designed a prototype in Sketch using components from Google's Material Design system.</p>
<p><img src="imageData/Artboard_hifiPrototype.png" alt="screenshots of hi-fidelity mockup made in Sketch"></p>
<p>I chose list-based navigation to hierarchically structure three major content views: All items, Image item, and Sign item.</p>
<h5 id="toc_32">All items</h5>
<p>When an already on-boarded user starts the application, she is given a list view of her previous entries with a floating action button to easily add a new one. Provided with this three-line list, she can search her images by description, see the number of signs and hints added, and sort by the item's date.</p>
<h5 id="toc_33">Image item</h5>
<p>Selecting an image item allows her to view and edit the item's image, description, or signs. To drill down further to details for each sign, she selects a sign from the two-line list.</p>
<h5 id="toc_34">Sign item</h5>
<p>In viewing sign details, she can add or edit her hints or media of the sign.</p>
<hr>
<h2 id="toc_35">Reflection</h2>
<p>During this project I fully immersed myself in every aspect of the design sprint process and used it to drive the design of a user experience, from concept to working prototype. It was an opportunity to apply design thinking principles in practice, and use insights from user research and testing to challenge and inform each iteration of design. </p>
<p>I made mistakes and learned many lessons along the way in each phase, but perhaps the most exciting discovery I made was during testing. I found that allowing time after a test session to openly discuss with the participant and follow up with any questions I had as an observer proved invaluable. Paired directly with testing my assumptions, having that dialogue not only helped me better understand a user's motivations but also gave me further insight I otherwise would have missed.</p>
<hr>
</body>
</html>