TC-Usability. Introduction: in the product development process, from a specialist point of view
Contents
1.
2.
3.
4. 3 1 3 4
5. TC
6. EU 28 TC Usability UE
7. TC Usability 1 2 TC
8. Advantages: uncovers subjective preferences; easy to manage; quick to conduct. Drawbacks: self-report can be unreliable as a measure of performance; questionnair
9. e IEA JBMIA UPA e TC
10. 30 TC Usability UA MAE
11. Hm CURES 3 1 3 5 UA 31 TC Usability 3 1 3 5 3
12. c
13. amp 26 TC Usability RE c
14. TC TC E
15. 38 KJ F F1 39
16. 18013407 64 TC Usability HEGEN gS e Tuc I G LIEFIE Bing n n adi E pes 3CEJ 7 9 SH ipne 2E E Ebd LOM E im P ad Gt OERESOSH SH T OSHS SU EE ES le sm rsren IMAR ERBE TSH Let SRONEGIV TSH Bata BEAECOSH GC eR SH worm Bia OSH amp CISPZ Me EGA V C SH FIRE i amp IMLIAS_T LUSH Hli QE C3H43 BH 8 Z 5H UK Y 3C amp CFOVANM Ig RUE SER BO HYO MEOE SH O GL 02 4X8 E SH 1
17. 1S013407 CD FREZI
18. A Z b n PC IT
19. 3.1.3.7 ISO 13407 (2001-07).
3.1.4 UPA: the UPA body of knowledge. UPA (Usability Professionals' Association), http://www.upassoc.org. [Diagram of disciplines related to usability, including brand and experience design, audience analysis, training, ergonomics/human factors, HCI (human-computer interface), usability science, design, information architecture, library science and documentation.]
20. 19 3 TC Usability TC Usability LIE CU ES nA aA itae Md 7 11E ET ae MR DOMO NOR ET NE 7 Pareri ie tM RE ecc Ch IUS RE EE 8 ERROR RO RENE ARR e NEC eu EN E TE ET DN NUN ce Duc TANTO 9 VELIT UPS Arc MR 13 2d cU CUI lello 13 2 2 ale 14 2 KEO OWN Ei 14 222 SERED BART AEF WATT Lee 14 223 kk 15 2 24 TC 15 DDS V EP UD BEER UD SECHS ilaele ila 15 3 pd KERET OEREIN OR OL adest etia v cis 19 VISA A LA eM 3S 19 3 1 1 TEA International Ergonomics Association kk 19 92 2
21. TC TC 18
22. 2006 TC TC A B C HW D i TC TC TC
23. 3 4 IS013407 a b
24. PL 19
25. IS0 13407 1S0 18529 ISO 18152 UPA TC 14 2 2 3 2 2 4 2 2 5 TC Usability
26. TC IT IT TC Usability 2 2 2 2 1 2 2 2 TC
27. 40 42 No 1 2 3 1 98 TC Usability 4 2
28. 101 4 4 3 1 TC Usabihty 3 1 MES 3 2 No
29. ATIS T EORR e f e 1 IEA e e E 1 f m Hi 20 TC Usability 3 1 2 4
30. Advantages: quick to conduct; can be used from the early stages of a project; enhances communication and learning among the users, usability experts, designers and those responsible for the development. Drawbacks: may cause conflict between the parties; cannot collect task performance data during the use of the method.

h) Creativity methods. The aim of such methods is the elicitation of new products and systems features, usually extracted from group interactions. In the context of human-centred approaches, members of such groups are often users. Creativity methods are used in many fields to generate a list of ide
31. 3 3
32. Examples of these methods are the Keystroke-Level Model (KLM), Goals, Operators, Methods, Selection rules (GOMS), and the Analytical method of description (Méthode Analytique de Description, MAD); a KLM time estimate is sketched after this passage.

k) Expert evaluation. Expert evaluation, also called expertise, is based upon the knowledge of the usability specialist, with practical experience and skills in ergonomics. Expert evaluation can lead to the rapid identification of potential problems but, depending upon the skill profile of the usability specialist, it may also be used to eliminate the causes of the problems. For this reason it is recommended that several usability specialists are involved in such an approach, in order to share and exchange several perspectives of evaluation. These expert evaluation methods provide means to identify known types of usability problems and can be applied from early in the lifecycle. However, they are limited by the skill of the usability specialists and cannot be used to identify unpredi
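Since the KLM predicts expert, error-free task time simply by summing standard operator times, a small sketch can make the idea concrete. This is a minimal illustration; the operator codes and timing constants are the commonly cited textbook estimates, used here as assumptions, not figures taken from this report.

```python
# A minimal Keystroke-Level Model (KLM) sketch. Operator times (seconds) are
# commonly cited published estimates and are assumptions for illustration.
KLM_OPERATORS = {
    "K": 0.28,  # press a key (average typist)
    "P": 1.10,  # point with a mouse to a target
    "B": 0.10,  # press or release a mouse button
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental act of routine preparation
}

def klm_estimate(sequence, response_times=()):
    """Predict task time by summing operator times.

    `sequence` is a string of operator codes, e.g. "MHPBK";
    `response_times` lists any system response (R) delays in seconds.
    """
    return sum(KLM_OPERATORS[op] for op in sequence) + sum(response_times)

# Example: prepare mentally, reach for the mouse, point, click, type 5 characters.
print(round(klm_estimate("MHPB" + "K" * 5), 2))  # -> 4.35 s
```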
33. E2 31: HCD (Human-Centered Design), User-Centered Design, ISO 13407, ISO/TR 18529. E3 32: Human Factors, Ergonomics
34. Advantages: collects a quick overview of users' opinions; flexible, allows probing per users' responses. Drawbacks: detailed analysis is time-consuming; open to biases both in the questions and the answers; needs expertise to accurately interpret data.

f) Thinking aloud. This kind of survey provides understanding of the ways a task is performed and helps to validate or disprove assumptions. Most often it is based on verbal protocols (thinking aloud). Thinking aloud involves having users continuously verbalise their ideas, beliefs, expectations, doubts, discoveries, etc. during their activity when using the system. Thinking-aloud protocols provide valuable data with regard to why users are performing certain actions. This data is an import
35. e e e e e e e iH 3 1 2 5 1 e 1 1 1 2 1 3 e 14
36. G, H. ISO 13407: plan and manage the human-centred design process; understand and specify user and organisational requirements and context of use; produce design solutions; evaluate designs against requirements. UCD
37. 2 61 TC Usability
38. [Figure 3.2.4.1, Mayhew: The Usability Engineering Lifecycle. Phases: Requirements Analysis; Design/Testing/Development, with Screen Design Standards (SDS), Detailed User Interface Design (DUID), Style Guide, prototyping, mockups, iterative evaluation and unit/system testing, and decision points "Met usability goals?", "All functionality addressed?", "All issues resolved?"; Installation, with user feedback and enhancements.]
3.2.5 UPA. UPA, ISO 13407, CIF (ISO), Nigel
39. mx 3 1 3 6 32 TC Usability 2 314 15 EL Jk tp FCC255 508 1 IS09241 IS013407 IF s
40. 19 20 21
41. 33 Cognitive Psychology 34 Psychology
42. G G1 49 50 51 G2 52
43. ISO 13407, HCD1 to HCD7.
3.2.3 ISO 18152. ISO/TR 18529; ISO/PAS 18152, HSL (Human-System Lifecycle); PAS (Publicly Available Specification). 3.2.3.1 ISO 13407
44. Web 2 DVD 104 TC Usability 3 4 1 5 2
45. ii Bo d e 58 TC Usability 4 a b c a b c
46. TC TC 110
47. 3 4 4 18013407 72 TC Usability 3 3 3 3 1 IS016982 18013407 IS016982 IS013407
48. ISO 13407: Acquisition, Supply; Development (Requirements analysis); Development (Architectural design); Development (Qualification testing); Maintenance (Operation). Table 3, primary lifecycle processes: methods (Observation of users; Performance-related measurements; Collaborative design and evaluation; Creativity methods; Model-based methods; Automated evaluation; Thinking aloud) against lifecycle processes (Acquisition, Supply, ...). Document-based m
49. 3 e 3 1 e 32 21 TC Usability 4 e 4 1 4 2 e 4 3
50. 6 B 7 8 9 10
51. 2 3 4 5
52. 3.1.2 International Ergonomics Association (IEA), 2001, Core Competency in Ergonomics. URL: http://www.iea.cc/browse.php?contID=edu_introduction (Summary of Core Competencies in Ergonomics: Units and Elements of Competency). IEA
53. Unit Element Performance Criteria 3 1 2 3
54. j.3) Formal methods. Formal methods allow the abstraction of user behaviour or interface behaviour. These methods can be used either to specify and design the user interface at the early stages of the process, or to evaluate existing paper or software prototypes at later stages of design. When selecting methods, a number of issues and factors should be considered. Their use of formality leads to high internal validity if their results can be reproduced; on the other hand, their ecological validity is very low, since they do not take into account the real context of use. Most of these methods come from the cognitive sciences and have no link with software-engineering formal methods.
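As a concrete illustration of the kind of abstraction such formal approaches rely on, the sketch below models a small dialogue as a finite-state machine and checks that the user can always reach a terminal state. The states and events are hypothetical, invented only for this example.

```python
# A minimal sketch of a formal check on interface behaviour: the dialogue is
# abstracted as a finite-state machine (states/events below are hypothetical)
# and we verify that no state traps the user away from completion.
from collections import deque

TRANSITIONS = {          # state -> {event: next_state}
    "form":    {"submit": "confirm", "cancel": "done"},
    "confirm": {"ok": "done", "back": "form"},
    "done":    {},
}

def reachable(start, transitions):
    """Breadth-first search over the abstract dialogue model."""
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in transitions[queue.popleft()].values():
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Every state should be able to reach "done"; otherwise the design has a dead end.
for state in TRANSITIONS:
    assert "done" in reachable(state, TRANSITIONS), f"{state} cannot finish"
print("no dead-end states in the abstract dialogue model")
```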
55. e 9 2 e 9 3 22 TC Usability e 9 4 e 9 5 23 TC Usability 3 1 3 ft JBMIA 1960 35 2002 14 4 1 3
56. 44 47 TC Usability 45 F2 46 47 48
57. 35 36 46 TC Usability 37
58. TC TC 3 3 TIE 18013407 18016982 18013407 Nigel Bevan 3 1 3 2 3 3 109 TC Usability TC TC 4
59. 18018529 18018152 from cradle to grave TC 2006 TC 18013407 TC uu 1 BEX 3 2 6 2 TC 69 TC Usability TC TC
60. 0JT 0ff JT 0JT TC TC TC 3 2 1S0O 1S013407 1 018529 18018152 Mayhew UPA TC IS0O 19018529 19018152 IS013407 1 018529 18018152 TC
61. EE eu C 0 0 e 4 5 5 e 5 1 e 52 e 5 3 5 4 e EX 55 MMORIUCEMORIMA ME RANMA HOF a ARC e X35 6 6 1 e 62
62. 40 41 No 1 2 3 4 1 100 TC Usabihty 5 2 6
63. Gl DU XE D Gl a b c d e 2 57 TC Usability a
64. d e f g a b c d d 8 e _ 4 b
65. 35 38 40 41 42 43
66. 27 28 29 UT I M
67. 2004 2005 TC TC 3 1 IEA JBMIA TC UPA TC TC TC
68. s UE 3 1 3 3 RE 27 TC Usabi
69. 146 eee 28 4 Lo of 7 48 G G1 49 l OJT OffJT_ 51 5 ia6 3 1 5 4 TC TC 53 TC Usability 0JT 0ff JT 000009 O OJT OffJT A SET 3 1 5 5 0JT Off JD 0JT 0ff JT A
70. i.7) Tools supporting document-based methods. This type of method is a document-based analysis helped by a tool. The help can be simply the fact that the documentation is provided on-line, or it can be more sophisticated, using for instance knowledge-based systems. These tools make available information contained in documents (style guides, guidelines, handbooks, production rules extracted from the literature for interaction-object selection) in databases, hypertexts, expert systems and design environments, for the purpose of good human-computer interface (HCI) design.

j) Model-based methods
71. Nigel Bevan (Serco Usability Services, UK); Alan Colton (SurgeWorks); Donald Day (Intuit); Jonathan Earthy (Lloyd's Register, UK); Dane Falkner (SurgeWorks); Masaaki Kurosu (NIME, Japan); Julie Nowicki (Optavia); Stephanie Rosenbaum (Tec-Ed); Charlotte Schwendeman (Vertecon); Bill Saiff (FannieMae); Eric K. Strandt (Northwestern Mutual); Don Williams (Microsoft Corporation); Larry Wood (Brigham Young University). A B C D E F
72. 5 3 6 AE 99 4 3 42 TC Usabihty 42 MES
73. TC Usability 1 3 1 3 TC e TC Usability TC lt BR gt FF TC TC B7 27
74. b c d e f g h i IS013407 a b c d e
75. b) Performance-related measurements. Performance-related measurements are also called task-related measurements. The commonly used quantifiable performance measurements relating to effectiveness and efficiency include the following:
- time spent to complete a task
- number of tasks which can be completed within a predefined duration
- number of errors
- time spent recovering from errors
- time spent locating and interpreting information in the user's guide
- number of commands utilized
- number of system features which can be recalled
- frequency of use of support materials (documentation, help system, etc.)
- number of times the user task was abandoned
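To make a few of the measurements listed above concrete, the sketch below computes task completion rate, mean time on task and error counts from a small test log. The record layout and field names are assumptions made only for this illustration.

```python
# A minimal sketch computing some of the performance measures listed above
# from a hypothetical usability-test log (field names are assumptions).
from statistics import mean

sessions = [
    {"task": "register", "completed": True,  "seconds": 312, "errors": 2},
    {"task": "register", "completed": False, "seconds": 540, "errors": 5},
    {"task": "register", "completed": True,  "seconds": 287, "errors": 0},
]

completion_rate = sum(s["completed"] for s in sessions) / len(sessions)
mean_time = mean(s["seconds"] for s in sessions if s["completed"])
mean_errors = mean(s["errors"] for s in sessions)

print(f"effectiveness: {completion_rate:.0%} of attempts completed")
print(f"efficiency: {mean_time:.0f} s mean time for completed attempts")
print(f"errors: {mean_errors:.1f} per attempt on average")
```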
76. New / Medium / Established / Leader (Figure 3.1.4.8). UCD. Question 19: What should the scope of a usability certification be? (Please check all that you think should be included.)
1. Understanding and applying a user-centered design process: 87%
2. Identifying and analyzing user needs: 86%
3. Defining the context in which the system will be used: 73%
4. Conducting usability evaluations: 85%
5. Proposing design solutions: 72%
6. Interface design: 59%
Other (written in, grouped):
7. Research skills: 13 (1%)
8. Software development: 1%
9. Design prototyping: 1%
10. Evangelism or ROI: 1%
11. Information architecture: 1%
12. Cognitive psychology or human factors: 1%
Note: percentages show the number of respondents including this scope item. (Figure 3.1.4.9) Nothing; Under 50
77. 2 3 4 5 RE UE UA 3 1 3 2 y y HCD np a RE UE UA UI 3 1 3 2 3 25 TC Usability 3 1 3 2
78. 49 TC Usability 3K Hf BRUT Ih aS RB m pes H e a RA iS Rank 3 1 5 2 50 TC Usability 2005 0JT Off JT TC 2004 TC TC
79. 1 2 3 102 TC Usabihty 4 5 5 6
80. 19 TC Usability 3 1 2 1 IEA 3 1 2 2 IEA IEA IEA IEA 2001 Version 3 IEA Technical Committee BCPE RU CREE
81. 2 3 4 E3 32 33 34 35 36 37 38 7 F 8 9 39 10 40 11 41 12286821 5 6 B 13 14 15 43 44 45 46 47 48 G G1 49 50 51 G2
82. UE 29 TC Usability 3 1 3 4 UA
83. 55 TC Usability 3 2 3 2 1 18013407 IS013407 3 2 1 1 56 TC Usability 1 context of use a b
84. 53 0JT a1 Sociology 48 a2 a3 a4 ab TC Usability Anthropology Ethnography
85. RE gt gt PEAS EE EFIS 7190770 FOP OR Bl BE KH ZF E 10 TC Usability m 2m
86. a c d a 59 TC Usability b 18013407 60 TC Usability 18013407 a b c
87. 9 EPA 1957 Leiden Fitting the Job to the Worker IEA 1964 1 ISO IEA LR
88. RD 1 2 1
89. eu id TC eet eee ee a TC TC 41 TC Usability 3 1 6 TC TC TC UPA 2004 2005 2004 UPA IS013407 A 1
90. 2002. Jarrett, C. and Quesenbery, W., Analysis of Survey on Attitudes towards Certification. [Figure 3.1.4.3: respondents by job title/position: HCI Practitioner, HCI Manager, Closely Related Field, Software Related, Business.] [Figure 3.1.4.4: respondents by years of experience: less than 1, 1-2 years, 3-4 years, 5-10 years, 11-20 years, over 20 years.] CHFP (Certified Human Factors Professional, through BCPE): 12; CPE (Certified Professional Ergonomist, through BCPE): 10
91. e 6 3 7 e 7 1 e 7 2 e 7 3 8 8 1 82 1 o 8 3 8 4 8 5 9 e 9 1
92. f g 3 IS013407 a b c d a 0
93. TC 1 18013407 2 18013407 3 18
94. TC TC TC TC TC
95. 22 44 TC Usability 23 D 24 25 E E1 26 GUI
96. d 1
97. j.1) General. Two different uses of model-based approaches are described here: 1) user interface specification and design methods, which allow the modelling of users and data and the use of the resultant models at the specification and design steps of the process; 2) formal methods, which are based on models of users and tasks. Such methods allow the prediction of user performance. Advantages: widely available; standardizes comparisons and predicts performance; earlier integration with engineering approaches. Drawbacks: time-consuming; open to bias; needs expertise to build and i
98. 3 1 3 2 93 94 TC Usability TC Usability 4 GE TC TC 0JT 0Off JT gt 40 gt 41 gt 42 TC 3 1 3 2 HQL 95 4 1 40 TC Usabihty 40
99. Physical Environment Mode P Work Model Flow Model Sequence Pi Mode m Work Model Flow Model Sequence ioc Tr Mode ce lag KJ PA 6 Mitel eS Bayer and Holtzblatt Contextual Design 2 n Lo 97 TC Usability 4 2 41 41 MES
100. 1. Plan and manage the human-centered design process (ISO 13407). Competency: specify how human-centered activities fit into the system development process.
1.1 Identify and plan stakeholder and user involvement
1.2 Select human-centered methods and techniques
1.3 Provide human-centered design support for other processes
2. Understand and specify user and organizational requirements and context of use (ISO 13407). Competency: establish the requirements of the user, organization and other interested parties for the system, taking full account of the needs, competencies and working environment of each relevant stakeholder in the system; identify, clarify and record the context of use in which the system will operate.
1) Clarify and document system goals
2) Analyse
101. Advantages: flexible; allows probing per users' responses. Drawbacks: may be uncomfortable for some users; detailed analysis is time-consuming; cannot collect task performance data during use of the method.

g) Collaborative design and evaluation. Such methods consist of involving different types of participants (users, product developers and human factors specialists, etc.) to collaborate in the evaluation or design of systems. Collaborative methods stress the importance of the user playing an active role in design and evaluation. The reason for this is that the context of use and/or the tasks of the users might be difficult for the designer and those responsible for the development to understand, or the fact that users may have difficulty expressing their actual needs or requirements in the development process.
102. i.2) Handbooks, recommendations, guides. Compared with the style guides coming from the issuer of the software, handbooks, as they are more general, usually take into account more semantic aspects. The evaluation is run the same way as with style guides.
i.3) Standards. Standards can be used for document-based design and analysis when they contain lists of recommendations. These recommendations, as well as the original sources of documented guidance, are likely to become increasingly important with the growing acceptance of the standards.
i.4) Evaluation grids. Evaluation grids apply a list, as complete as possible, of properties of appropriate ergonomic interfaces. Each property is evaluated by providing a notation on a range o
103. Advantages: expertise not always required, but would enhance results; enhances communication among the users, developers and usability experts, and improves consistency; can be based on state-of-the-art knowledge. Drawbacks: does not cover every aspect of user interaction with the system; can be time-consuming if done exhaustively.

i.1) Style guides. Style guides involve expert design and evaluation using a style guide as reference. The guide can come from the provider of the software, or be defined (customized) in the company in which it will be used, possibly with the help of a human factors specialist.
104. 2 1 2 2 12 TC Usability TC Usability 2 2 1 TC Web
105. 103 TC Usability 4 5 3 2 3 2 MES 3 1 No 1
106. [Table of contents, extract] 3.1.3 ... 24; 3.1.4 UPA ... 34; 3.1.5 TC ... 41; 3.1.6 TC ... 42; 3.2 ... 56; 3.2.1 ISO 13407 ... 56; 3.2.2 ISO 18529 ... 63; 3.2.3 ISO 18152 ... 64; 3.2.4 Mayhew ... 66; 3.2.5 UPA ... 68; 3.2.6 TC ... 68; 3.3 ... 73; 3.3.1 ISO 16982 and ISO 13407 ... 73; 3.3.2 UPA ... 89; 3.3.3 TC ... 91; 4 ... 95; 4.1 ... 96; 4.2 ... 98; 4.3 ... 100; 4.4 ... 102; 4.5 ... 104; 5 ... 109; ... 111. 1.1, 1.2, 1.3
107. 43 e TC Usability De 11 12 BBA 13 14 15 16 17 last 18
108. 6 3 ok n 7 4 8 N di pa n RE Hi Ki 105 Tai TC Usability 106 ap 107 TC Usability TC Usability 108 SE f TC Usabihty dip
109. Advantages: quick to conduct; well adapted to the early stage of a project; can identify specific problems and recommend solutions. Drawbacks: high skills in ergonomics required; may miss important problems.

l) Automated evaluation. Based on algorithms focused on usability criteria, or using ergonomic knowledge-based systems, the automated evaluations diagnose the deficiencies of the system compared to predefined rules. The fact that the context of use is not addressed in these approaches implies the complementary use of other methods. Advantages: consistency on evaluat
110. c. multi-disciplinary.
3.2.2 ISO 18529. ISO/TR 18529, Human-centred lifecycle process descriptions, 2000-06 (an ISO Technical Report; cf. ISO/IEC TR 15504): HCD3 Specify stakeholder and organizational requirements; HCD4 Understand and specify the context of use; HCD7 Introduce and operate the system; HCD1 Ensure HCD content in systems strategy; HCD2 Plan and manage the HCD process; HCD6 Evaluate designs against requirements; HCD5 Produce desig
111. 4.1 Specify and validate the context of evaluation
4.2 Evaluate early prototypes in order to define and evaluate the requirements for the system
4.3 Evaluate prototypes in order to improve the design
4.4 Evaluate the system in order to check that the stakeholder and organizational requirements have been met
4.5 Evaluate the system in order to check that the required practice has been followed
4.6 Evaluate the system in use in order to ensure that it continues to meet organizational and user needs
5. Demonstrate professional skills (ISO 13407). Competency: enables HCD to be done in the organization through working at a professional level.
5.1 A degree of autonomy in the control of their own work
5.2 Having some influence on other people, a project or an organization
5.3 Cope with a degree of complexity, intricacy or complication in their work
5.4 Understanding of and skill in their role within the working and professional environment
ISO 13407:1999, User-centred design process for interactive systems; ISO/TR 18529:2000, Ergonomics of human-system interaction: human-centred lifecycle pro
112. 3) Automatic analysis of presentation quality. The purpose is to evaluate the ability of the representation to make clear the logical structure of a given set of information. The proposed model establishes a relationship between the abstract representation of the structure and the abstract methods of presentation. The structural relationships between the entities of a set of information are formalized in a semantic network, independently of their technical implementation.
3.3.2 UPA. Nigel Bevan. ISO 13407. 1 4 5
113. 8 Fr HEHE SH B Ili WEI Y SH Vad CaM st SES ER Y EACEReRERCYSH EROEREGUY VY SH XA CE SH l SSAEBNE LESH Md SEE BEBESH BaazSH CEISP UE BE V VSH SFIZI ISH Sole c FE TSH EEMESHE LI BRAESHE CIS gt BIER BG E SH EEH ZH ARH SH SFID ERY T L 8H NGYACYETLYA IS018152 3 2 3 1 65 TC Usability 3 2 4 Mayhew SO Mayhew 1999 Mayhew requirement analysis design evaluation development installation 3
114. E 16 TC Usability TC Usability 3 3 1 3 2 33 18 TC Usability TC Usability 3 3 1 3 1 1 IEA International Ergonomics Association IEA EEC European Economic Cooperation EPA European Productivity Agency EPA 1955 Human Factors Section 1956 Human Factors
115. Advantages: skill required, but the skill set is more available than for other methods; well adapted to the early stages of a project. Drawbacks: detailed analysis is time-consuming; open to bias.

i) Document-based methods. General: in the document-based methods, also called document-based analysis, the usability specialist uses existing documents in addition to his own judgement. The expert has to have enough experience to be in a position to use these documents in a way that is appropriate to the context of use, and to carry out the design or evaluation in an efficient way. These documents, based on commonly agreed rules or experimentally proven demonstrations, can come from the contributions of specialists (guidance, guides, check-lists) or from software suppliers.
116. 3 1 3 1 18013407 GIDL EE ETRAS SO13407 3 AWH RE UA 81 3 3 1 3 1 HI RE Requirement Engineering f RE 18013407 9241 UE Usability Engineering in WT SSA Aen to Zo TR ZZ Y Tiniha J UA Usability Assessment GOON UE URRH PRU ALKA A Se BR Bal OD BEAD BEA 3 1 3
117. 3 1 5 3 TC TU TC TC TC TC Web TO TC T amp U 5 TC Usability vs 1 gt A 3 5 6 pzxkgn niti 9 17 C 18 23 E a E PANERA 133 138 F 52 TC Usability F
118. TC Usability 1 1 TC Usability db E Fi WEB TC Web
119. 52 16 17 C 18 19 20 21 22 23 D 24 25 E E1 26 27 28 29 30 E2 a1 a2 a3 a4 a5 31 HCD UCD 3 1 5 1 TC TC amp 42 TC Usability A 1
120. gt AHFP Associate Human Factor Professional through BCPE 2 A gt 30 UPA CHI ACM HFES STC Organization Count Percentage UPA 630 649 7 60 SIGCHI 377 m 3996 50 ACM 236 24 40 HFES 132 TA nis 20 STC 111 11 10 British HCI 59 52 0 of UK respondents 3 1 4 5 37 TC Usability 35 E Extremely 30 Likely 250 B Somewhat h Likely 0 20 D Neutral Don t 15 Know 10 OSomewhat Unlikely 0 5 m Extremely 0 Unlikely 3 1 4 6 New to Field 1 5 105 Essa os O Leader 15 32 eg ue Sea semen age 3 1 4 7
121. a b c CG d SF ae TC sr E BP 71 TC Usability nh mk Tt H nh n The Db N J ob nb S FW AI FX ok DE VU qe Y n 3 3
122. B 0JT Off JT TC 54 TC Usability OE SE Hoke Te fp TERNER Ye SUE amp PERC Cub Bio ge ele s E Bd chu pe aoe E BuU oe SHE p OQ s ERAS AF na EASE 29 TAHE Tn ASEE ZI WERE TI tnr rakes D EAF fest pranise Y Ol granza 5 Pe HE HAEE tI Yee ES TAL EELEE OF ae ES RY i8 i CE MET ae ale tt be fil LP ISERNIA OF EE RIA il amp Tcacackmduc ap 244 F TE BREE rS Aarb Eh ya Ba Yu AAt F EF n Mani E ESAE i CERERE SP Fede e _ B TEENI RE Hales E UE e BB HERE Tkm i8 wes e B i IEEE EE 98 imo 6 Bg PEC dE iy 6 Bi SEWER 86 a Ge futu Gi ome Bhe BGC LL Ti OG Age I TH x tif A 1 math TI se AEDS FE MHs Teg bet SIE Aen EE B TOM TEH TE Via Ve Ane Pc Ae BET Sens amp E BOI See 52 WX E Bd Fu Vest 08 PER n a Yeu 88 3 1 5 6 TC TC
123. 50 to 100; 100 to 200; 200 to 500; 500 to 1000; 1000 to 2000; 2000 to 10,000; Don't Know (Figure 3.1.4.10). UPA: Important; Neutral/Don't Know; Not important; Very Unimportant (UPA members, Figure 3.1.4.11).
UPA Board of Directors Position Statement on Certification for Usability Professionals, July 7, 2002, Orlando, Florida: During the past 9 months UPA has investigated the need for a certification program for usability professionals. Based on feedback from members and other professionals, the UPA Board of Directors has decided that it is premature for UPA to lead an effort to develop a certification program at this time. However, this work also produced a strong consensus on related initiatives that would provide immediate value for the profession. Among these is developing a body of knowl
124. MES 41 42 No 1 Flow Model Sequence Work Holtzblatt Work Model 2 Model Cultural Model Model Artifact Model 96 TC Usability
125. 1 24 TC Usability 3 RE UE UA 3 U 3 1809241 3 1
126. Bevan 1S013407 UPA 3 2 6 TC ISO 18013407 Mayhew ZA IPSI gt LTU 4 we STU 1 EX Design Processes That Correspond to ISO13407 E 32 61 TC 68 TC Usability 19013407
127. stakeholders
3) Assess risk to stakeholders
4) Identify, document and analyse the context of use
5) Define the use of the system
6) Generate the stakeholder, user and organisational requirements
7) Set usability objectives
3. Produce design solutions (ISO 13407). Competency: create potential design solutions by drawing on established state-of-the-art practice, the experience and knowledge of the participants, and the results of the context-of-use analysis.
3.1 Allocate functions
3.2 Produce composite task model
3.3 Explore system design
3.4 Use existing knowledge to develop design solutions
3.5 Specify system and use
3.6 Develop prototypes
4. Evaluate designs against usability requirements (ISO 13407). Competency: collect feedback on the developing design. This feedback will be collected from end users and other representative sources.
128. 013407 3 1 3 2 Web DVD TC 70 TC Usability SHE TC
129. TC Usability UC IT IT
130. - number of digressions
- amount of idle time (it is important to distinguish between system-induced delays, delays due to thinking time and delays caused by external factors)
- number of total keystrokes
Advantages: collects quantifiable data; results are easy to compare. Drawbacks: does not necessarily uncover the cause of problems; requires some kind of working version.

c) Critical incidents. Critical incidents consist of a systematic collection of events which stand out against the background of user performance. The incidents are described in the form of short reports which provide an account of the facts surrounding the incident. The data can be collected from interviews with the user and from objective observations of the interaction. The incidents are then grouped and categorized. When performance-related measurements have current tasks and existing situations as the focus of interest, critical incident techniques enable the examination of significant e
131. - Thinking aloud: thinking aloud involves having users continuously verbalise their ideas, beliefs, expectations, doubts, discoveries, etc. during their activity when using the tested system.
- Collaborative design and evaluation: such methods consist of involving different types of participants (users, product developers and human factors specialists) to collaborate in the evaluation or design of systems.
- Creativity methods (Y/N): the aim of such methods is the elicitation of new products and systems features, usually extracted from group interactions. In the context of human-centred approaches, members of such groups are often users.
- Document-based methods (N): the usability specialist uses existing documents in addition to his own judgement.
- Model-based approaches (N): they are based on models which are abstract representations of the evaluated product and allow prediction of the users' performances (cognitive theorisation, cognitive model of the interaction, top-down approach).
- Expert evaluation (N): expert evaluation is based upon the knowledge, expertise and practical experience in ergonomics of the usability specialist.
- Automated evaluation (N): based on algorithms focused on usability criteria or using ergonomic knowledge-based systems, the automated evaluations diagnose the deficiencies of the product compared to predefined rules.

a) Observation of users. T
132. ant supplement to the objective data capture of the performed actions through observation, performance measurement, data logging, video or scan conversion. The instructions for getting users to think aloud have to be given before starting and repeated during the session. Advantages: quick to conduct; collects insights into users' mental process.
133. as to create new products and/or to solve a problem by changing perspectives and considering alternative options. They are not uniquely ergonomic methods, but they can be used in the context of the human-centered (user-centered) design approach. These methods work more effectively with users' involvement, but can also be run without users. They fit the conception stage of the design process particularly well and can be used in the early stages of a project. They help to create and define new products, their functionality and their interfaces. (HCD, UCD)
134. be useful to gather information from users using questionnaire items (questions and statements). The questionnaire items can be either open-ended statements or checklist (closed) questionnaire items and scales; the advantage of the former is that they allow people to give elaborate answers, but there is always a danger of collecting only cryptic statements which are difficult to interpret. For this reason the closed questionnaire-item format is often preferred. Standardized questionnaires can also be used for systematic comparisons. The type of data being collected is users' quantifications, suggestions, opinions and ratings of the system's features (user help, preferences, ease of use, etc.). The qualitative methods are indirect evaluation methods, in that they do not study the user interaction but only users' opinions about the user interface. There is also a need for building consistency checks in questionnaires; implementing consistency checks can be done by using different question formats referring to the same item. For this reason closed questions are often preferred.
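Standardized questionnaires of the kind mentioned above are scored with a fixed formula so that results can be compared across systems. As one widely used example, which is not named in this report and is used here purely for illustration, the sketch below applies the published System Usability Scale (SUS) scoring rule to ten made-up 1-to-5 responses.

```python
# A minimal sketch of scoring a standardized questionnaire. The formula is the
# published SUS scoring rule, used only as an example; the responses are made up.
def sus_score(responses):
    """`responses` are ten answers on a 1..5 agreement scale.

    Odd-numbered items are positively worded (score - 1), even-numbered items
    negatively worded (5 - score); the sum is rescaled to the 0..100 range.
    """
    if len(responses) != 10:
        raise ValueError("SUS uses exactly ten items")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)   # i is 0-based, so item 1 is i == 0
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0
```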
135. cess descriptions Skills Framework for the Information Age SFIA 90 TC Usability 3 3 3 TC 3 1 6 TC F IS013407 1 39 41 42 2 40 45 46 3 47 48 4 43 44 91 92 TC Usability TC Usability 4 4 1 4 2 4 3 4 4 4 5 40 41 42
136. ctable problems which only arise with users. There can be large differences between experts when diagnosing usability problems; these differences can be reduced by the use of the appropriate document-based methods. The evaluation is based on the background and knowledge of the expert. In this kind of evaluation the expert identifies the most frequently observed problems by reference to an optimum man-machine interface model he or she has in mind. The multi-expertise evaluation is based on the same rules as the previous one; the only difference is due to the number and variety of experts, which expands the scope and is a factor of better reliability. Domains like human factors, psychology, sociology, the cognitive approach and graphics can be addressed in the same analysis. In any case, usability experts are needed.
137. e 1 5 gt e 16 e 1 7 2 e 2 1 e 2 2 e 2 3 e 2 4 e 25
138. e items open to biases both in the questions and the answers.

e) Interviews. Interviews are similar to questionnaires, with greater flexibility and with a face-to-face procedure. There are many different forms of interview, from very structured to open-ended. Interviewing a user on an individual basis requires much more staff time than administering a questionnaire. Interviews have the advantage, however, of being more flexible, since the interviewer can explain difficult questions more deeply or reformulate a question if it is unclear to the user. Interviews can also allow interviewers to follow up answers that require further elaboration or that lead to new insights which had not been anticipated in the design of the interview.
139. edge to help usability practitioners grow professionally and help others understand usability better. A body of knowledge might include: a list of skills; prerequisite knowledge; a framework of usability life-cycle practices. This body of knowledge could then be used as the basis for a professional development plan, curriculum and self-assessment tools. The UPA is planning to move these initiatives forward.
3.1.5 TC. TC U
140. ethods, Questionnaires, Expert evaluation, Interviews; Development: Requirements analysis; Critical incidents; Development: Architectural design; Development: Qualification testing; Maintenance: Operation. Legend: recommended; appropriate; neutral (when the cell is empty); not recommended; not applicable (NA).

Table C.2, Synthetic description of the referenced methods (name of method; direct involvement of users; short description):
- Observation of users (Y): collect in a precise and systematic way information about the behaviour and the performance of users in the context of specific tasks during user activity.
- Performance-related measurements (Y): collect quantifiable performance measurements in order to understand their impacts on usability.
- Critical incidents (Y): systematic collection of specific events, positive or negative.
- Questionnaires (Y): indirect evaluation method which does not study the user interface itself but only users' opinions and perceptions about the user interface, as they are answered in pre-defined questionnaires.
- Interviews (Y): same as questionnaires, but with a face-to-face procedure.
- Thinking aloud
141. f values. More often, the properties come from agreed rules of ergonomics; a sketch of such a grid-based rating is given after this passage.
i.5) Ergonomic criteria. The approach is similar to the previous one; the only difference is due to the exclusively ergonomic criteria used in these lists.
i.6) Cognitive walkthroughs. The process is run by walking through the tasks the user has to perform with the system, taking account of the user's goals, knowledge and context of use. The application of cognitive walkthroughs is aimed at avoiding the risk of a biased view of user behavior based on the personal view of the person doing the design or evaluation.
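To make the evaluation-grid idea concrete, here is a minimal sketch in which an expert rates each ergonomic property on a fixed scale and the ratings are aggregated into a weighted score. The property names, weights and 0-4 scale are illustrative assumptions, not a grid defined in this report.

```python
# A minimal sketch of an evaluation grid: each ergonomic property is rated on
# a 0..4 scale by an expert; property names and weights are hypothetical.
GRID = {                        # property -> weight
    "consistency": 2.0,
    "error prevention": 2.0,
    "legibility": 1.0,
    "feedback": 1.5,
}

def grid_score(ratings, grid=GRID, scale_max=4):
    """Return a weighted 0..100 score plus the two worst-rated properties."""
    total = sum(grid[p] * r for p, r in ratings.items())
    best = scale_max * sum(grid.values())
    worst = sorted(ratings, key=ratings.get)[:2]
    return round(100 * total / best, 1), worst

score, to_fix = grid_score({"consistency": 3, "error prevention": 1,
                            "legibility": 4, "feedback": 2})
print(score, "priority issues:", to_fix)   # -> 57.7 with the two weakest properties
```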
142. his method consists of the precise and systematic collection of information about the behavior and the performance of users in the context of specific tasks during the user's activity, which may be carried out either in real-life situations or in laboratories. Such observation is structured and based on grids and protocols which enable the behavior to be classified. Much observation is based on taking detailed notes on what the users do and then analyzing the data later. Advantages: the method can be performed in real-world settings; real activity is reported. Drawbacks: time-consuming to analyze the data; needs expertise to accurately interpret data; no insight into mental data.
143. ion across projects. Drawbacks: may miss important problems; requires a working version or prototype. The following are examples of automated evaluation methods:
1) Knowledge-based. A knowledge-based system (KBS) helps to evaluate and automatically improve graphical views. It proposes guidance based on ergonomic rules stored in databases.
2) Automatic analysis of perceptive screen complexity. The screens are analyzed by programs which use agreed criteria: global density, local density, number of sets of characters, medium size of the groups, number of items, complexity of presentation, etc.
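As an illustration of the kind of agreed criteria such programs compute, the sketch below derives a global density, an item count and a mean group size from a character-grid screen. The screen content and the simplified metric definitions are assumptions made for this example, not the exact formulas of any particular tool.

```python
# A minimal sketch of automated analysis of perceptive screen complexity for a
# character-grid screen: global density, item count and mean group size.
screen = [
    "Name:   ________________     ",
    "Dept:   ________   Tel: ____ ",
    "                             ",
    "   [ OK ]        [ Cancel ]  ",
]

cells = sum(len(row) for row in screen)
filled = sum(ch != " " for row in screen for ch in row)
global_density = filled / cells

# Treat each run of non-blank characters as one displayed item (group).
groups = [g for row in screen for g in row.split() if g]
mean_group_size = sum(len(g) for g in groups) / len(groups)

print(f"global density:  {global_density:.0%}")
print(f"number of items: {len(groups)}")
print(f"mean group size: {mean_group_size:.1f} characters")
```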
144. lity 3 1 3 3 VE
145. ment Design Testing 3 1 41 UPA 2 34 TC Usability Self Assessment UCD Roles amp Curriculum Career Development Certification 3 1 4 2 UPA 2 2001 11 Salt Lake
146. n solutions 3 2 2 1 15013407 18013407 00 IS0 TR 18529 2000 TR Technical Report TR 7 HCD3 4 5 6 18013407 HCD2 1IS013407
147. nterpret models.
2) Usability specification and design methods. These specification and design methods may expand software-engineering methods (adapting the UML notation language), or are methods dedicated to the user interface, covering both the specification and the design stages (MUSE, Method for Usability Engineering, as an example). These methods use flow charts, UML class diagrams for the users' conceptual model, and interaction diagrams and state diagrams for task description; UML is improved to support user-interface properties. It is also possible to use another, more general method, like Petri nets, to define the procedure. (MUSE, UML)
148. vents, positive or negative. Advantages: collects causes of problems; focuses on events where demands on users are high; real activity is reported. Drawbacks: may require a long elapsed time to complete; successful completion uncertain due to insufficient events to report.

d) Questionnaires. There may be several occasions during development when it will