
Integrated Modelling Methodology - Version 2.0


Contents

3. learn@work. Information Society Technologies, Project Number 027023. APOSDLE: Advanced Process-Oriented Self-Directed Learning Environment. Integrated Project, IST Technology-enhanced Learning. Integrated Modelling Methodology, Version 2.0, Deliverable D 1.6. Due date: 2009-04-30. Actual submission date: 2009-04-29. Start date of project: 2006-03-01. Duration: 48 months. Revision: Final. Organisation name of lead contractor for this deliverable: FBK (Fondazione Bruno Kessler). Project co-funded by the European Commission within the Sixth Framework Programme (2002-2006). Dissemination Level: PU (Public); PP (Restricted to other programme participants, including the Commission Services); RE (Restricted to a group specified by the consortium, including the Commission Services); CO (Confidential, only for members of the consortium, including the Commission Services). Disclaimer: This document contains material which is copyright of certain APOSDLE consortium parties and may not be reproduced or copied without permission. The information contained in this document is the proprietary confidential information of certain APOSDLE consortium parties and may not be disclosed except in accordance with the consortium agreement. The commercial use of any information in this document may require a licence from the proprietor of that information. Neither the APOSDLE consortium as a…
4. 1 Introduction. The modelling wiki (MoKi) is a collaborative tool that provides support for enabling domain experts, who do not necessarily have knowledge engineering skills, to model business domains and simple processes directly. Wiki templates and functionalities guide the domain experts through the necessary steps. Some knowledge of the basic modelling steps as described in [1] is required; users unfamiliar with the steps of the Integrated Modelling Methodology for APOSDLE should refer to the description in [1] or ask their coach for a summary of the basic modelling process. 2 Wiki main functionalities. The functionalities of the wiki can be classified in 4 groups, which can be found in the left-hand side bar of Figure 1: (1) Wiki Import Functionalities: these support a compact import of groups of concepts and tasks to facilitate the early phases of modelling. (2) Domain Model Management: these support the management of domain model elements. (3) Task Model Management: these support the management of task model elements. (4) Wiki OWL Export Functionalities: these support the automatic export of knowledge about the domain model (task model, respectively) in OWL.
5. The process and domain scribble were useful (1 KE ISN). 8.1.1.2 Feedback both for the IMM of P2 and P3: As a positive side effect, the company thinks about implicit knowledge and implicit processes in the company in a structured manner (P2: 2 KE EADS, CCI; P3: 1 KE ISN, 1 Coach SAP). 8.1.1.3 Feedback on the IMM for P3. General: o It was challenging to negotiate the APOSDLE domain with different stakeholders (1 Coach KC). Objectives & Explanations: o We knew what domain concepts and tasks should look like (1 KE CCI). o We knew what was an appropriate domain for APOSDLE (1 KE CCI). o An explanation is given why a certain domain was selected (3 KE ISN, CCI, EADS; 1 Coach KC). o Defining the scope of the APOSDLE domain is still difficult but interesting (1 KE ISN). Collection of documents: o It was easy to find crucial documents (1 KE CCI). Knowledge elicitation techniques: o The Scope and Boundaries questionnaire was useful (1 Coach KC). Reuse of models: o The P2 model was used as the basis for the P3 model (1 KE EADS); there was no modelling effort because the model already existed (1 Coach SAP). Achievement of goals: o In our opinion the goals were fulfilled within this step (1 KE CCI). 8.2 Phase 1: Knowledge Acquisition. 8.2.1 Knowledge Acquisition from documents
7. 3.1 Basic knowledge of
3.2 Profound knowledge of
3.3 Know how to apply/use/do
3.4 Know how to produce
4 Modelling more economically by creating tasks with variables
4.1 When and why could variables be used
4.2 The idea of tasks with variables
4.3 Ground tasks, abstract tasks and specialised tasks
4.4 Where do variables come from
4.5 Creating tasks with variables
5 Description of Guidelines for specifying learning goals with the TACT
6 TACT
6.2 Your Knowledgebase: Files you need
6.3 Files that will be modified
8. Was the knowledge elicited useful? What were the main difficulties encountered in eliciting knowledge from resources? Were the explanations given clear enough? Was the goal of the step completed after this step? Was there something missing? Please rate from 0 to 5, where 0 is very easy and 5 is extremely difficult, how difficult this step was for you: 2. Positive Experiences: As the EADS domain for P3 remained the same as for P2, the main starting basis we used were the P2 version of the models and the notes/remarks written during the P2 evaluation. Negative Experiences: ADD your comments here. Please estimate your modelling/coaching efforts at this stage in terms of hours spent to perform the task: ADD your comments here. Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfactory manner? Please give reasons for your answer. Please state the main differences, if any, between performing this step this year and last year: ADD your comments here. 3 Step 1b: Knowledge elicitation from Domain Experts (DEs). The goal of this sub-step is to elicit knowledge directly from the DEs. The desired output is a refined task list and an extensive list of candidate domain concepts. Feedback: Please state positive and negative experiences of the IMM for this step in the space below. Use as much space as you need.
9. 4.4.1.2 Ontology Questionnaire. This questionnaire is meant to propose statements and questions to the Knowledge Experts that are extracted from the models contained in the MoKi, and aims to verify whether the Knowledge Experts agree with those statements; if not, this obviously triggers a request for some manual verification and revision of parts of the models contained in the MoKi. The questionnaire concerns the domain model. The purpose of the ontology questionnaire is to let a Knowledge Expert verify the knowledge that can be inferred from an ontology and remove it in case it was not intended. The rationale behind this is that the Knowledge Expert and the knowledge engineer might encode their knowledge in the ontology in such a way that they do not agree with everything that can be inferred from it. After seeing the inferred statements, the Knowledge Expert or the knowledge engineer might disagree with an inferred statement and want to remove it. This is not directly possible, because it is inferred and not stated. The ontology questionnaire finds the reason for an inferred statement and lets the user remove the reason for the inference. With the statements which lead to the unwanted inference removed, the unwanted inference is also removed from the ontology. The ontology questionnaire shows a list of inferences to the Knowledge Expert. The Knowledge Expert should read through these statements carefully. In case of disagreement, the knowledge engineer can…
10. 8.2.1.1 Feedback on the IMM for P2. General: o Modelling in this step took a lot of time (1 Coach SAP). Objectives & Explanations: o The objectives of the phase were clear (2 KE CCI, ISN; 2 Coaches SAP, KC). o Written descriptions for using the term extraction would have been sufficient; a workshop would not have been necessary (1 KE ISN). o The explanations were clear (1 Coach SAP). Knowledge extraction from documents: o Extracting knowledge from documented sources by hand led to useful concepts (1 KE CCI). o Extracting knowledge from documented sources by hand was tedious and difficult (1 KE CCI). o Basically, automatic term extraction with another tool than the DMT was a useful method (1 KE EADS). o Efforts should be put into providing more appropriate tools for term extraction (1 KE EADS). 8.2.1.2 Feedback both for the IMM of P2 and P3. Knowledge extraction from documents: o Knowledge from documented sources was extracted by hand (P2: 1 KE CCI; P3: 2 KE EADS, CCI). o Term extraction did not work properly (P2: 1 KE ISN, 2 Coaches KC, SAP; P3: 1 KE ISN). o The domain modelling tool could not be used for other languages than English (P2: 3 KE ISN, CCI, EADS, 2 Coaches KC, SAP; P3: 1 KE ISN). 8.2.1.3 Feedback on the IMM for P3: o It is advantageous that the term extractor is integrated into the MoKi…
11. Was the goal of the step completed after this step? Please rate from 0 to 5, where 0 is very easy and 5 is extremely difficult, how difficult this step was for you: 0. Positive Experiences: The OWL was a result of automatic exports; there were no problems and no effort. Negative Experiences: Please estimate your modelling/coaching efforts at this stage in terms of hours spent to perform the task: ADD your comments here. Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfactory manner? Please give reasons for your answer: Has to be evaluated by application partners. Please state the main differences, if any, between performing this step this year and last year: ADD your comments here. 7 Step 5: Modelling of Learning Goals (previously known as Formal Models Integration). The goal of this step is to obtain an OWL ontology of the learning goal model via the TACT tool. Feedback: Please state positive and negative experiences of the IMM for this step in the space below. Use as much space as you need. Some questions to keep in mind while providing the feedback: Were the objectives clear? Were the tools adequate? What were the main difficulties encountered in this stage? Were the explanations given clear enough? Was the goal of the step completed after this step? Please rate from 0 to…
12. Were the explanations given clear enough? Was the goal of the step completed after this step, i.e. was a refined task list and an extensive list of candidate domain concepts ready after this step? Please rate from 0 to 5, where 0 is very easy and 5 is extremely difficult, how difficult this step was for you. Positive Experiences. Negative Experiences. Please estimate your modelling/coaching efforts at this stage in terms of hours to perform the task. Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfactory manner? Please state the main differences, if any, between performing this step this year and last year. 4 Informal Modelling of Domain and Tasks in MoKi. Starting from the knowledge elicited in Step 1 (a, b), the main goal of this step is to obtain an informal but rather complete description of the domain model and task model in a Semantic MediaWiki called MoKi. After this modelling step, the informal concept model should only consist of relevant domain concepts (see 5.2). 5 Step 3: Informal Models Validation and Revision. The goal of this step is to have the domain model and task model validated (completeness and correctness) by the DEs. The step was supported by guidelines, by results from automatic checks…
13. knowledge about various creativity techniques and tools available; basic knowledge about addresses: knowledge that a company can have different addresses, knowledge about which different addresses a company can have, knowledge about different addresses of a company; basic knowledge about REACH interest agents: basic knowledge about organizations that exert political influence on the implementation of REACH; basic knowledge about exceptions of REACH: knowledge about substances that are not subject to the regulations of REACH; basic knowledge about model: knowledge about properties and elements of various models, knowledge about different types of models required for simulation building. 3.2 Profound knowledge of. Definition: The learning goal type 'Profound knowledge of' means to comprehend conceptual knowledge about the topic and its properties and the relationships to other topics in that domain. This includes, for instance, understanding the indication of a certain method or tool, knowing causes and effects of an error, or understanding the mechanisms of an engine. Example: If, for instance, this learning goal type is linked to the topic APOSDLE Wiki, the learning goal 'profound knowledge of APOSDLE Wiki' means that one understands the structure of the APOSDLE Wiki, the functionality of the icons, or that one is able to navigate in the Wiki. APOSDLE…
15. (1 Coach SAP). Role of the modeller: o Modelling cannot be done by people who have no experience in knowledge engineering (2 KE CCI, ISN; 1 Coach SAP). o The knowledge engineers were domain experts at the same time (1 KE CNM). o Modelling was done by a KE and not by a DE in every stage (3 KE ISN, CCI, EADS; 2 Coaches KC, FBK). Modelling tools: o It would be nice to have everything integrated in one tool (1 KE ISN). Model revision: o Changing the model once it is finished is still difficult (1 KE ISN). Content: Part 1 Initial questionnaire Scope & Boundaries; Part 2 MoKi User Manual; Part 3 Validation & Revision of Domain / Tasks; Part 4 TACT User Manual; Part 5 Validation & Revision of Learning Goals; Part 6 Filled Feedback Questionnaires on the Integrated Modelling Methodology (o CCI, o EADS, o ISN, o KC, o SAP, o FBK). Initial questionnaire Scope & Boundaries
16. Figure 13: Overview of the TACT User Interface (showing the task list with specialised tasks, the domain model elements, and the learning goal mapping for the task 'Einzelfallberatung'). 4.3.2 Explanations for learning goals. Since a number of learning goals are added automatically, TACT provides explanations for why a learning goal appears…
17. Do all tasks have a task identifier (number)? Coaches should provide support for this check. Long names: are there tasks with names longer than 30 characters? This is not necessarily an error, but a stimulus to consider whether the name can be shortened to ease the display in the user interface and make the task more comprehensible to the end users of APOSDLE. These questionnaires are meant to propose to the Knowledge Experts statements and questions that are extracted from the models contained in the MoKi, and aim to verify whether the Knowledge Experts (KE) agree with those statements. If not, this obviously triggers a request for some manual verification and revision of parts of the models contained in the MoKi. The questionnaire concerns only the domain model; APs should use it on line once the first revision (manual and automatic checks) is completed. The ontology questionnaire is made for the purpose of letting a Knowledge Expert verify the knowledge that can be inferred from an ontology and remove it in case it was not intended. The rationale behind this is that neither the knowledge expert nor the knowledge engineer explicitly state wrong things. Nevertheless, they might encode their knowledge in the ontology in such a way that they do not agree with everything that can be inferred from it. This can be due either to not knowing the used formalism (OWL DL) well, or to having a large and complex domain ontology. After seeing the inferred statements…
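The name-length check just described is easy to automate. The following is only an illustrative Java sketch of such a check; the task names are invented example data and this is not the actual APOSDLE check tool, although the 30-character threshold follows the description above.

```java
// Illustrative sketch of the "long names" check: flag task names longer than
// 30 characters as candidates for shortening. The task list is invented data.
import java.util.List;

public class TaskNameChecks {
    private static final int MAX_NAME_LENGTH = 30;

    public static void main(String[] args) {
        List<String> taskNames = List.of(
                "Prepare agenda",
                "Document the consulting case for industrial property rights");

        for (String name : taskNames) {
            if (name.length() > MAX_NAME_LENGTH) {
                // Not necessarily an error: a stimulus to consider a shorter name.
                System.out.println("Consider shortening task name ("
                        + name.length() + " chars): " + name);
            }
        }
    }
}
```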
18. Please estimate your modelling/coaching efforts at this stage in terms of hours spent to perform the task: 16h. Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfactory manner? Could be better; the results of the term extractor could still be improved. Please state the main differences, if any, between performing this step this year and last year: It was easier to upload the documents to extract the terms and then to add the terms to the ontology. 3 Step 1b: Knowledge elicitation from Domain Experts (DEs). The goal of this sub-step is to elicit knowledge directly from the DEs. The desired output is a refined task list and an extensive list of candidate domain concepts. Feedback: Please state positive and negative experiences of the IMM for this step in the space below. Use as much space as you need. Some questions to keep in mind while providing the feedback: Were the objectives clear? Were the tools adequate? Did the domain experts coincide with the person performing the modelling? What were the main difficulties encountered in this stage? Were the explanations given clear enough? Was the goal of the step completed after this step, i.e. was a refined task list and an extensive list of candidate domain concepts ready after this step? Please rate from 0 to 5, where 0 is very easy…
19. These questionnaires, accessible on line at an address which will be communicated to each Application Partner in due course, propose to the Knowledge Experts statements and questions that are extracted from the models contained in the MoKi, and aim to verify whether the Knowledge Experts agree with those statements; if not, this obviously triggers a request for some manual verification and revision of parts of the models contained in the MoKi. Steps 1 and 2 can be performed in parallel and concern both models (task and domain); Step 3 has to be executed after 1 and 2 and concerns only the domain model, as shown in Figure 1. Figure 1: The Revision Process (manual checks and automatic checks lead to revisions in the MoKi; the on-line questionnaires, which concern only the domain model, lead to further revisions in the MoKi). The two lists of checks described in this section are meant to guide Application Partners and Coaches to reflect upon the domain and task models created in the MoKi and evaluate whether some parts can be improved or need to be modified. Some of these suggestions require some understanding of knowledge engineering and therefore need an active involvement of the coaches together with the APs. 3.1 Domain Model. Completeness of concepts: are there relevant concepts missing from the MoKi? Granularity: are the concepts useful for supporting learning? Are they too broad or too detailed? Relations between concepts…
20. We wonder if there is room for a further formalisation of this hierarchy-creating step, e.g. by applying Formal Concept Analysis. Negative Experiences: Further formal guidance for transforming card sorting into a hierarchy would be helpful. Please estimate your modelling/coaching efforts at this stage in terms of hours spent to perform the task: ADD your comments here. Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfactory manner? Both CNM and CCI had a good basis for the proceeding steps in this stage. Please state the main differences, if any, between performing this step this year and last year: CCI reduced the effort to one domain model; CNM was extended to Rescue and iTEl. The effort for Rescue in those stages was still very low, since it just had to be transferred into P3. 4 Informal Modelling of Domain and Tasks in MoKi. Starting from the knowledge elicited in Step 1 (a, b), the main goal of this step is to obtain an informal but rather complete description of the domain model and task model in a Semantic MediaWiki called MoKi. After this modelling step, the informal concept model should only consist of relevant domain concepts (see 5.2). Feedback: Please state positive and negative experiences of the IMM for this step in the space below. Use as much space as you need. Some questions to keep in mind while providing the feedback: Were the objectives clear? Were…
21. engineers from ISN plus coaches from KC. In addition, FBK provided an additional knowledge engineer to supervise the entire modelling process for all the Application Domains. 3.2 Phase 0: Scope & Boundaries. 3.2.1 Goal. In this initial step of the methodology, the goal is to define the scope and boundaries of the respective application domains and to identify potential learning resources. The desired output of Phase 0 is threefold: a first preliminary list of tasks, called process scribbles, roughly specifies the tasks that have to be performed by workers in the application domain; further, a first list of learning goals describes abilities, skills and knowledge that should be present in people performing these tasks; moreover, a collection of representative documents should indicate relevant learning resources. Scope & Boundaries for the domain that are identified in this modelling phase have to be documented in an appropriate manner. In P2, for instance, the results were verbally described in a central wiki, whereas in P3 previously existing models plus short statements about intended changes/additions served as documentation. 3.2.2 Description. At the beginning, an initial questionnaire about target groups, target tasks and learning needs of the application domain was filled in by the KE in cooperation with the involved parties in the company (DEs, future learners, decision makers in the company). Since most of the knowledge engineers…
22. for workers have to be taken into account. Depending on the desired granularity of the model and on the granularity of the starting concept, laddering takes approximately 10-15 minutes per concept. 3.4 Phase 2: Modelling of domain & tasks. 3.4.1 Goal. Starting from the knowledge elicited in Phase 1 (see Section 3.3), the main goal of this step is to create a specification of the domain and the task models. This specification, which also contains a first alignment between domain elements and tasks (used in Phase 3, see Section 3.6, as a basis for the modelling of learning goals), is created using MoKi (see Section 4.2), the tool we have developed within the APOSDLE project to support this specific phase of the IMM. 3.4.2 Description. The goal of Phase 1 was to acquire as much information as possible. During Phase 2, the KEs have to process this information in order to produce a complete description of the application domain and of the tasks a user can perform. A set of guidelines is provided to support the KEs in processing the list of candidate domain concepts and tasks (see Sections 3.4.3.1 and 3.4.3.2). Once the list of concepts and tasks composing the models has been created, the KEs start creating the domain and task model with MoKi, the tool based on Semantic MediaWiki we have developed to support the modelling activities in this phase of the IMM. For each domain concept and task, a page in MoKi is created, and for each of these…
23. > 5 associated learning goals: tasks should be associated with a reasonable number of learning goals. Having a task associated with more than 5 learning goals could highlight a difference of granularity between the description of tasks and the one of learning goals, which needs to be fixed. This is not an error in general, but needs to be checked; coaches should provide support for this check. Maybe the tasks are really complex and need a lot of learning goals; maybe the learning goals are rather low-level and detailed; maybe the tasks are very high-level and abstract. Implications: the possible explanations should be looked at; it should be reconsidered whether the task should be broken down (maybe the task is a super task in the task model); this is one way to improve the TASK MODEL w.r.t. completeness. List of tasks with the same set of learning goals: if there are tasks that have exactly the same learning goals, this could suggest joining or deleting tasks with this property. List of learning goals not connected to any task: learning goals should always be associated with a task; therefore, if there are learning goals not connected to any task, this is an error and needs to be fixed. List of learning goals connected to only 1 task: if a huge number of learning goals are assigned to only one (or a few) tasks, the system has less information to discriminate between different learning goals and in particular to know which…
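To make these checks concrete, here is a minimal Java sketch of how they could be run over a task-to-learning-goal mapping. The data structures and the example entries are assumptions for illustration only; the actual APOSDLE check tool operates on the exported models and is not reproduced here.

```java
// Illustrative sketches of the checks described above, over an assumed
// task -> learning goals mapping. All example data is invented.
import java.util.*;

public class LearningGoalChecks {
    public static void main(String[] args) {
        Map<String, Set<String>> goalsPerTask = new LinkedHashMap<>();
        goalsPerTask.put("Prepare agenda", Set.of("basic knowledge of Meeting"));
        goalsPerTask.put("Moderate workshop", Set.of()); // no learning goals at all
        Set<String> allGoals = Set.of("basic knowledge of Meeting",
                                      "profound knowledge of Workshop");

        // Check 1: tasks with more than 5 learning goals (possible granularity mismatch).
        goalsPerTask.forEach((task, goals) -> {
            if (goals.size() > 5) System.out.println("Check granularity of task: " + task);
        });

        // Check 2: learning goals not connected to any task (always an error).
        Set<String> used = new HashSet<>();
        goalsPerTask.values().forEach(used::addAll);
        for (String goal : allGoals) {
            if (!used.contains(goal)) System.out.println("Learning goal not used: " + goal);
        }

        // Check 3: tasks sharing exactly the same set of learning goals (merge candidates).
        Map<Set<String>, List<String>> byGoalSet = new HashMap<>();
        goalsPerTask.forEach((task, goals) ->
                byGoalSet.computeIfAbsent(goals, g -> new ArrayList<>()).add(task));
        byGoalSet.forEach((goals, tasks) -> {
            if (tasks.size() > 1) System.out.println("Identical learning goals: " + tasks);
        });
    }
}
```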
24. means to comprehend conceptual knowledge about the topic and its properties and the relationships to other topics in that domain. This includes, for instance, understanding the indication of a certain method or tool, knowing causes and effects of an error, or understanding the mechanisms of an engine. Example: If, for instance, this learning goal type is linked to the topic APOSDLE Wiki, the learning goal 'profound knowledge of APOSDLE Wiki' means that one understands the structure of the APOSDLE Wiki, the functionality of the icons, or that one is able to navigate in the Wiki. APOSDLE use case: The learner wants to have a profound understanding of a <domain element>. The knowledge worker has a basic understanding of a topic, but he still has questions like: "OK, I know what it is, but how does this work? Why should I do it in a certain way? How did this happen? Why did this happen?" S/he searches for explanations that help him/her answer these questions. Or s/he just wants to know more about the topic: to be able to understand things s/he reads in documents, or to be able to communicate about the topic with co-workers, or to be able to generate new ideas. Therefore s/he searches for information that contains background/specialised information, historical data, trends and developments, relationships with other domain elements, etc. Material use types: explanation, more about. APOSDLE examples: profound knowledge of scenario techniques…
25. the KE or knowledge engineer might disagree with an inferred statement and wish to remove it. This is not directly possible, because it is inferred and not stated. The ontology questionnaire finds the reason for an inferred statement and lets the user remove the reason for the inference. In the following we use the terms axiom and statement interchangeably. 1.1 Conceptual walk-through the ontology questionnaire. The ontology questionnaire uses a reasoner on an OWL DL ontology to infer statements. Example: 'ANOVA subClassOf Test' is a statement. It states that the concept ANOVA is a subclass of the concept Test. Other ways of expressing this could be 'Everything which is an ANOVA is also a Test' in nearly natural language, or 'ANOVA ⊑ Test' in a formal language. It then shows the list of inferences to the knowledge expert. The knowledge expert should read through these statements carefully. In case of disagreement, the knowledge engineer can get the reason why this statement was inferred. Example: 'ANOVA subClassOf Test' was inferred because of the statements 'ANOVA subClassOf Parametric_Test' and 'Parametric_Test subClassOf Test'. If the KE disagrees, either of the two statements must be removed; then the offending statement 'ANOVA subClassOf Test' will not be inferred anymore from the ontology. APOSDLE-specific: One important point for the usage of the Questionnaire in APOSDLE is that…
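The entailment-and-justification workflow just described can be illustrated with a small, self-contained sketch using the OWL API and the HermiT reasoner (assumed to be on the classpath). The ontology file name, the namespace, and the naive remove-and-recheck justification strategy are illustrative assumptions; the actual web-based ontology questionnaire is not reproduced here.

```java
// Sketch of the idea behind the ontology questionnaire: check whether
// "ANOVA subClassOf Test" is entailed, and find asserted axioms whose removal
// would break that entailment (a naive justification).
import java.io.File;
import java.util.ArrayList;
import org.semanticweb.HermiT.ReasonerFactory;
import org.semanticweb.owlapi.apibinding.OWLManager;
import org.semanticweb.owlapi.model.*;
import org.semanticweb.owlapi.reasoner.OWLReasoner;
import org.semanticweb.owlapi.reasoner.OWLReasonerFactory;

public class OntologyQuestionnaireSketch {
    public static void main(String[] args) throws Exception {
        OWLOntologyManager man = OWLManager.createOWLOntologyManager();
        OWLDataFactory df = man.getOWLDataFactory();
        OWLOntology ont = man.loadOntologyFromOntologyDocument(new File("domain.owl"));
        OWLReasonerFactory rf = new ReasonerFactory(); // HermiT

        String ns = "http://example.org/domain#"; // hypothetical namespace
        OWLClass anova = df.getOWLClass(IRI.create(ns + "ANOVA"));
        OWLClass test  = df.getOWLClass(IRI.create(ns + "Test"));
        OWLAxiom inferred = df.getOWLSubClassOfAxiom(anova, test);

        OWLReasoner reasoner = rf.createReasoner(ont);
        if (reasoner.isEntailed(inferred)) {
            System.out.println("Entailed: ANOVA subClassOf Test");
            // Remove one asserted SubClassOf axiom at a time and see whether
            // the entailment disappears.
            for (OWLSubClassOfAxiom ax
                    : new ArrayList<>(ont.getAxioms(AxiomType.SUBCLASS_OF))) {
                man.removeAxiom(ont, ax);
                OWLReasoner check = rf.createReasoner(ont);
                if (!check.isEntailed(inferred)) {
                    System.out.println("  depends on asserted axiom: " + ax);
                }
                check.dispose();
                man.addAxiom(ont, ax); // restore the ontology
            }
        }
        reasoner.dispose();
    }
}
```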
26. the subtopic with which the specialised task specialises the task with the variable. This learning goal cannot be deleted except by deleting the variable; this, however, will also remove the specialised task. 4.4 Validation tools. These tools, some automatic checks and the Ontology Questionnaire, support the revision and validation of the entire APOSDLE knowledge base created. The automatic checks are performed via a Java tool, while the Ontology Questionnaire is a web-based tool. 4.4.1 Validation & Revision of Domain / Tasks. These checks are performed after the usage of the MoKi and concern the task and domain models. In addition to some manual guidelines used to validate the list of concepts contained in the domain model, as well as the list of tasks contained in the task model of the MoKi, we have implemented some tools to help users in revising the models created: (1) Automatic checks: this part consists of a list of automatic checks performed to verify certain properties of the concepts and tasks described in the MoKi. The results of these checks are sent to the Application Partners and coaches to help them with revising the models contained in the MoKi. (2) Ontology questionnaire: this questionnaire, accessible on line, proposes statements and questions to the Knowledge Experts that are extracted from the domain model contained in MoKi, and aims to verify whether the Knowledge Experts agree with those statements; if not, this obviously triggers…
27. tick the statement and, by clicking the 'Justify' button at the bottom of the list, he/she gets the reason why this statement was inferred. Figure 14 shows the page of the Ontology Questionnaire, containing in the top half the list of statements entailed by the domain ontology, and in the bottom half the explicitly modelled statements, displayed for user convenience. Figure 14: The Ontology Questionnaire page, with the box of entailed (inferred) statements (e.g. 'ANOVA subClassOf Test', 'One_Way_ANOVA subClassOf Parametric_Test') above the box of asserted axioms of the ontology, which can be selected and removed.
28. Agenda for Activity, independent of what the Activity is, such as 'know how to do/use/apply Agenda'. This small example illustrates the fact that in some cases tasks are modelled in a way that they might require knowledge independent of the concrete application of the task (e.g. 'know how to do/use/apply Agenda') and knowledge that is strongly related to the concrete application of the task (e.g. 'profound knowledge of Board Meeting'). This could result in two different modelling decisions. Ambiguous modelling: a very generic topic is used (modelled) as a learning goal. For instance, the task 'Prepare Agenda for Activity' requires the learning goal 'basic understanding about Activity', meaning that in a concrete situation (e.g. for preparing the agenda of a workshop) exactly one specific sub-topic of Activity (e.g. Workshop) is required for performing the task. APOSDLE cannot deal with this ambiguity. Detailed modelling: the task is broken down into more specific tasks. For instance, the task 'Prepare Agenda for Activity' can be broken down into 'Prepare agenda for Meeting', 'Prepare agenda for Board Meeting', 'Prepare agenda for Demo Meeting', 'Prepare agenda for Workshop'. Then learning goals are assigned to these more specific tasks. However, this causes extra work for the knowledge engineer, as s/he needs to model all tasks separately and as s/he needs to assign…
29. Experiences: ADD your comments here. Please estimate your modelling/coaching efforts at this stage in terms of hours spent to perform the task: ADD your comments here. Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfactory manner? In our opinion the goals were fulfilled. We had a task model which was coordinated with the CCI's workflow, and a great collection of relevant learning resources. For CNM we had a domain model and a deeper understanding about the domain. For Rescue there was no effort in this stage, since the models already existed. Please state the main differences, if any, between performing this step this year and last year: Application Partners were much more experienced and worked more independently. The iterative process using graphical tools from the beginning turned out to be much easier. The discussions with CNM helped to focus very early in the process and reduced effort. Main work was done by the application partners; the main effort for SAP in this stage was to get an overview/understanding of the CNM iTel process. 2 Step 1a: Knowledge elicitation from Digital Resources. The goal of this sub-step is to extract as much knowledge as possible from the digital resources provided by the Domain Experts. The desired output of Stage 1a is a number of candidate domain concepts. Feedback: Please state positive and…
30. 4 Informal Modelling of Domain and Tasks in MoKi. Starting from the knowledge elicited in Step 1 (a, b), the main goal of this step is to obtain an informal but rather complete description of the domain model and task model in a Semantic MediaWiki called MoKi. After this modelling step, the informal concept model should only consist of relevant domain concepts (see 5.2). Feedback: Please state positive and negative experiences of the IMM for this step in the space below. Use as much space as you need. Some questions to keep in mind while providing the feedback: Were the objectives clear? Were the tools adequate? Was the wiki used in a collaborative manner? What were the main difficulties encountered in this stage? Were the explanations given clear enough? Was the goal of the step completed after this step? Please rate from 0 to 5, where 0 is very easy and 5 is extremely difficult, how difficult this step was for you. Experience with variables: Did you use variables in the informal model? Please indicate why or why not. Did you find it difficult to understand how variables could be used in general? Did you find it difficult to insert variables in the MoKi? I was not deeply involved in coaching EADS during the conceptual analysis for adding variables to tasks. From the technical point of view it was easy to add variables to tasks, thanks also…
31. Feedback: Please state positive and negative experiences of the IMM for this step in the space below. Use as much space as you need. Some questions to keep in mind while providing the feedback: Were the objectives clear? Were the tools adequate? Was the wiki used in a collaborative manner? What were the main difficulties encountered in this stage? Were the explanations given clear enough? Was the goal of the step completed after this step? Please rate from 0 to 5, where 0 is very easy and 5 is extremely difficult, how difficult this step was for you: 1. Experience with variables: Did you use variables in the informal model? Please indicate why or why not. Did you find it difficult to understand how variables could be used in general? Did you find it difficult to insert variables in the MoKi? We did not use any variables; our philosophy was to keep the models as simple as possible. Positive Experiences: ADD your comments here. Using the Semantic MediaWiki was not difficult. We could easily transcribe our domain concepts and the task model into the Wiki. The Semantic MediaWiki was intuitive to use. The different illustrations of the domain concepts and the task model were very comfortable. We did not have any difficulties with this tool. We did not use variables because our domain model was not multidimensional; using variables would have made our…
32. TACT. At the end of this phase, a correct and complete learning goal model in OWL is available. This model is directly exported from TACT. 3.6.3 Supporting Tools, Techniques & Resources. The Task-Competence Tool (TACT) supports the activity of creating learning goals based on tasks and domain concepts. The TACT allows modifying the task model insofar as adding or removing task parameters is possible. Furthermore, modelling with task and learning goal parameters is supported in that users mostly need only model learning goals for abstract tasks, while the rest is automatically added by the TACT. The TACT also supports modelling by highlighting tasks which are not described through learning goals and domain concepts which are not used as part of learning goals. A manual of TACT, included in the Annex (Part 4 TACT User Manual), was distributed to the users; it contains guidelines for modelling and a technical description of how to use the TACT. Guidelines for modelling learning goals were also used for validation and revision support (see Section 3.7). 3.6.3.1 Guidelines for modelling learning goals. (1) Assign to a task all learning goals that are indispensable for the task, and do not assign learning goals that are only nice to have for performing the task. The easiest way to do this is by imagining several concrete situations where a person performs the task. For instance, the task Detecting…
33. a certain party of the APOSDLE consortium warrant that the information contained in this document is capable of use, nor that use of the information is free from risk, and accepts no liability for loss or damage suffered by any person using the information. This document does not represent the opinion of the European Community, and the European Community is not responsible for any use that might be made of its content. Imprint. Full project title: Advanced Process-Oriented Self-Directed Learning Environment. Title of work package: WP 1 Formal Models Creation. Document title; Document Identifier; Work package leader: SAP. List of authors: Barbara Kump (TUG), Viktoria Pammer (TUG), Henny Leemkuil (UT). Administrative Co-ordinator: Harald Mayer. Scientific Co-ordinator: Stefanie Lindstaedt. Copyright notice: 2006 APOSDLE consortium. Document History: Version 1, Date 2008-11-17, Reason of change: Document created (bkump, vpammer). Task-Competence Mapping Tool (TACT). Executive Summary. Table of contents: Executive Summary; Table of Contents; 1 Preliminary Notes; 2 Guidelines for specifying learning goals with the TACT; 3 Building Learning goals with Learning goal Types
34. already in an almost final form. This was a result of the intuitive MoKi. Please state the main differences, if any, between performing this step this year and last year: MoKi: much easier to use, fewer tools, more experience. 5 Step 3: Informal Models Validation and Revision. The goal of this step is to have the domain model and task model validated (completeness and correctness) by the DEs. The step was supported by guidelines, by results from automatic checks, and by an on-line ontology questionnaire. Feedback: Please state positive and negative experiences of the IMM for this step in the space below. Use as much space as you need. Some questions to keep in mind while providing the feedback: Were the objectives clear? Were the tools adequate? What were the main difficulties encountered in this stage? Were the explanations given clear enough? Was the goal of the step completed after this step? Please rate from 0 to 5, where 0 is very easy and 5 is extremely difficult, how difficult this step was for you: 0 for CNM and CCI. Positive Experiences: The tools were adequate for the models. Since the models were kept simple, the modelling process in general was much easier. There were just minor changes in this step. Negative Experiences: Change all CCI 'is part of' relations into 'is a' relations! This step did not produce much effort; we still…
35. and domain model in the MoKi. Please state the main differences, if any, between performing this step this year and last year: There is a huge difference between the MoKi and the Wiki we used last year. MoKi is easier to handle; a user can model very quickly due to the import function, and the browse functionalities allow a good conceptual overview over the models. 5 Step 3: Informal Models Validation and Revision. The goal of this step is to have the domain model and task model validated (completeness and correctness) by the DEs. The step was supported by guidelines, by results from automatic checks, and by an on-line ontology questionnaire. Feedback: Please state positive and negative experiences of the IMM for this step in the space below. Use as much space as you need. Some questions to keep in mind while providing the feedback: Were the objectives clear? Were the tools adequate? What were the main difficulties encountered in this stage? Were the explanations given clear enough? Was the goal of the step completed after this step? Please rate from 0 to 5, where 0 is very easy and 5 is extremely difficult, how difficult this step was for you: 3. Positive Experiences: The check report delivers a good overview; also, some hints from the coaches were useful. Some relations between concepts didn't make any sense and were detected through…
36. are contained in the Annex at the end of the document. 2.2 The vision for the IMM, Second Version. The construction of the APOSDLE knowledge base is inherently a collaborative activity performed by different actors, the so-called modelling team, composed of domain experts, coaches and knowledge engineers, each with different know-how, technical skills and roles. To support collaboration within the modelling team and to allow greater flexibility during the cooperative modelling activity, we developed the collaborative modelling paradigm illustrated in Figure 3. Figure 3: The IMM Collaborative Approach. This paradigm is inspired by recent Web 2.0 collaborative solutions, of which wikis are one example, and was proposed in Christl et al. 2008 and Rospocher et al. 2008 as a way to support modelling activities in an enterprise modelling setting. In this paradigm, all the actors asynchronously collaborate toward the creation of the APOSDLE knowledge base: by inserting knowledge (either formal or informal), by transforming knowledge from informal to formal, and by revising knowledge. The domain experts enter the missing knowledge, using a form of informal language, into the models, or provide feedback on the formal models created. (A detailed description of the current task model and of the reasons behind its simplification can be found in Appendix 1.) Asynchronously, the knowledge engineers can re…
37. can be found in Deliverable D6.9, Second Version of Application Domain Models. We also provide an overview of the teams who performed the modelling for each Application Partner. CCI has changed its application domain after the evaluation of the APOSDLE second prototype; the new domain is about Information and Consulting on Industrial Property Rights. The modelling task was done by domain experts and by knowledge engineers from CCI, plus coaches from SAP. CNM has chosen two application domains for the APOSDLE third prototype. One of them is the RESCUE domain, a methodology for Requirements Engineering developed by City University, which was already used by CNM for the APOSDLE second prototype. In addition, CNM decided to add a new domain about the Information Technology Infrastructure Library (ITIL V3). The modelling task was done by domain experts and by knowledge engineers from CNM, plus coaches from SAP. EADS IW decided to keep and elaborate the domain chosen for P2, the Simulation Domain, focusing for P3 on the electromagnetism physical domain. The modelling task was done by domain experts and by knowledge engineers from EADS, plus coaches from FBK. ISN has chosen a new domain on Innovation Management in a network of SMEs, with boundaries on consulting, project management and further education in the field of innovation management. The modelling task was done by domain experts and by knowledge
38. clear what should be achieved during this phase. For P2, the main problem was a lack of understanding of what a model should look like (KE CCI, EADS), what an adequate APOSDLE domain was (KE ISN, Coach KC), and how the models would be used in APOSDLE (KE CCI). Nonetheless, the knowledge elicitation techniques (questionnaire) and modelling methods (process and domain scribble) of P2 were seen as useful (1 KE ISN). Due to their experiences from P2, for P3 the KEs had a good understanding of what was an appropriate domain for APOSDLE. All of the KEs gave an explanation why they had selected a certain APOSDLE domain for P3; most of these explanations were based on the questions from the initial questionnaire applied in this phase. The results point to the necessity of good examples, both for adequate domains and for good models. They also point to the necessity of having at least a basic understanding of the target system in which the models are going to be used, in this case APOSDLE. This was the motivation to systematically develop a questionnaire, the initial questionnaire described in Phase 0, accompanied by guidelines on how different situations will potentially affect the performance and usefulness of the APOSDLE system. We see the initial questionnaire as a useful means for identifying an adequate APOSDLE domain. For P2, most of the KEs expressed difficulties in finding adequate documents, for several reasons
39. domain model. Parameters are denoted using '< >'. For instance, the task 'Prepare Agenda for <Activity>' is an abstract task that contains the parameter <Activity>. Specialised task: a specialised task is an instance of an abstract task. An abstract task is decomposed into specialised tasks by replacing the parameter with all sub-topics of the topic that the parameter is about. 7.3 Learning Goal Model. We regard a learning goal as the combination of learning goal type and domain concept. Each learning goal is defined as a pair <learning goal type, domain concept>, which are retrieved by means of the 'has learning goal type' and 'is about' relations in Figure 15. For a list of learning goal types see Section 7.4.1. The domain concept defines the content that the learning goal is about. The learning goal type specifies the type, or somehow the degree, of knowledge and skills the person needs to have about this topic for performing a specific task. For instance, a learning goal 'basic knowledge about APOSDLE Wiki' would describe the ability of a person to read and navigate in the APOSDLE Wiki: the person would know what is available on the APOSDLE Wiki and how to move back and forth between the pages. The learning goal 'basic knowledge about APOSDLE Wiki' would not include the ability to edit the content of the Wiki or to insert pages. In order to express…
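As an illustration of this pair structure, the sketch below builds one learning goal individual and links it to a learning goal type and a domain concept via two object properties. The IRIs, individual names and property names (hasLearningGoalType, isAbout) are hypothetical stand-ins for the 'has learning goal type' and 'is about' relations mentioned above, not the actual APOSDLE vocabulary; the OWL API is assumed to be available.

```java
// Sketch: a learning goal as the pair <learning goal type, domain concept>.
import org.semanticweb.owlapi.apibinding.OWLManager;
import org.semanticweb.owlapi.model.*;

public class LearningGoalSketch {
    public static void main(String[] args) throws Exception {
        OWLOntologyManager man = OWLManager.createOWLOntologyManager();
        OWLDataFactory df = man.getOWLDataFactory();
        String ns = "http://example.org/aposdle#"; // hypothetical namespace
        OWLOntology ont = man.createOntology(IRI.create(ns));

        OWLObjectProperty hasType = df.getOWLObjectProperty(IRI.create(ns + "hasLearningGoalType"));
        OWLObjectProperty isAbout = df.getOWLObjectProperty(IRI.create(ns + "isAbout"));

        OWLNamedIndividual goal  = df.getOWLNamedIndividual(IRI.create(ns + "LG_basicKnowledgeAbout_APOSDLE_Wiki"));
        OWLNamedIndividual type  = df.getOWLNamedIndividual(IRI.create(ns + "BasicKnowledgeAbout"));
        OWLNamedIndividual topic = df.getOWLNamedIndividual(IRI.create(ns + "APOSDLE_Wiki"));

        // learning goal = <learning goal type, domain concept>
        man.addAxiom(ont, df.getOWLObjectPropertyAssertionAxiom(hasType, goal, type));
        man.addAxiom(ont, df.getOWLObjectPropertyAssertionAxiom(isAbout, goal, topic));

        man.saveOntology(ont, System.out); // print the resulting OWL
    }
}
```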
40. don't understand why relations are provided at the beginning and have to be changed at the end because of technical reasons. Furthermore, the current model does not exactly express what CCI wanted to model. We mainly had to do the Rescue modelling for CNM. Please estimate your modelling/coaching efforts at this stage in terms of hours spent to perform the task: ADD your comments here. Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfactory manner? There were just minor changes in later steps. The check made us sure to have completed the informal model before going into the formal modelling phase. Please state the main differences, if any, between performing this step this year and last year: The MoKi's usability was much better than in the previous year; therefore the models were in a good shape from the beginning. 6 Step 4: From Informal to Formal. At the end of this step, the domain model and task model will be contained in two OWL ontologies. Feedback: Please state positive and negative experiences of the IMM for this step in the space below. Use as much space as you need. Some questions to keep in mind while providing the feedback: Were the objectives clear? Were the tools adequate? What were the main difficulties encountered in this stage? Were the explanations given clear enough?
41. due to the growing popularity of wiki-based web sites (e.g. Wikipedia), users are quite familiar with wikis and the editing of wiki pages. Furthermore, the SMW framework already provides several important functionalities, such as access control and permissions, tracing of the activity, semantic search, and so on, without the need to install specific client applications. Finally, only a web browser is required on the end-user side to use the system. The second important reason for choosing a semantic wiki was the fact that the wiki can provide a uniform tool and interface for the informal specification of the different components of the APOSDLE knowledge base, in particular domain and task models. This differs from the usual procedure, where dedicated but often disconnected modelling tools are used to model each aspect. As a final reason for implementing MoKi on top of a semantic wiki, the natural language descriptions inserted in a semantic wiki can be structured according to predefined templates with the help of semantic constructs like properties. As a consequence, the informal descriptions in natural language contain enough structure to be automatically translated into formal models, thus allowing the re-use of informal descriptions for automatic ontology creation. 4.2.1 Describing knowledge in a MoKi page. The main idea behind MoKi is to associate a wiki page to each simple or complex element of the formal model, so that this page contains an informal…
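To illustrate how structured wiki content can be turned into a formal model automatically, the sketch below maps page/property/value annotations, as they might be read from MoKi page templates, onto OWL axioms. The triple extraction, the property name 'subConceptOf', and the namespace are invented for illustration; MoKi's real export pipeline is not shown here.

```java
// Sketch: semantic annotations from wiki pages (page -> parent) become OWL
// SubClassOf axioms in the formal domain model. Example data is invented.
import java.util.Map;
import org.semanticweb.owlapi.apibinding.OWLManager;
import org.semanticweb.owlapi.model.*;

public class WikiToOwlSketch {
    public static void main(String[] args) throws Exception {
        OWLOntologyManager man = OWLManager.createOWLOntologyManager();
        OWLDataFactory df = man.getOWLDataFactory();
        String ns = "http://example.org/domain#"; // hypothetical namespace
        OWLOntology ont = man.createOntology(IRI.create(ns));

        // "subConceptOf" annotations as they might be read from MoKi pages.
        Map<String, String> subConceptOf = Map.of(
                "ANOVA", "Parametric_Test",
                "Parametric_Test", "Test");

        // One wiki annotation -> one SubClassOf axiom in the formal domain model.
        subConceptOf.forEach((child, parent) ->
                man.addAxiom(ont, df.getOWLSubClassOfAxiom(
                        df.getOWLClass(IRI.create(ns + child)),
                        df.getOWLClass(IRI.create(ns + parent)))));

        man.saveOntology(ont, System.out);
    }
}
```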
43. for the KEs (KE CCI, CNM, ISN; Coach SAP). In two domains the semantic MediaWiki was considered cumbersome for modelling concepts and tasks (CCI, CNM), whereas the KEs from the other two domains found it functional (ISN, EADS). Further comments on the semantic MediaWiki of P2 were that the models of different domains should be in separate Wikis (KE EADS) and that the functionality of the Wiki needed to be improved in general (KE EADS). Another problem stated by one of the KEs (EADS) was that the semantic MediaWiki did not support the iterative modelling process. Based on the answers in the questionnaires for P2, the semantic MediaWiki was replaced by the MoKi (see Section 4.2). The MoKi was considered to be effective and convenient by all respondents of P3's questionnaires. According to them, the user manual was good (KE EADS), the import functionality was helpful (KE EADS), the different visualizations of the domain concepts and tasks allowed for getting an overview over the models (KE CCI, EADS), and the MoKi supported the involvement of domain experts (Coach FBK). Minor problems reported in the questionnaires were that a description of how to use the 'is part of' relation in the MoKi was missing (KE CCI, Coach SAP) and that deleted concepts were still visible in the MoKi (EADS). One positive remark was made on the fact that there was a separate MoKi for each partner (KE EADS). None of the negative remarks from the questionnaires in P2…
44. for the modelling of learning goals. The specification is provided using the Modelling Wiki (MoKi) tool. o Phase 2a: Validation & Revision of Domain & Tasks. The domain and task models are validated and, if needed, revised. Guidelines for manual revision and validation checks are provided to help with the revision process. Phase 3: Modelling of Learning Goals. A specification of the learning goal model is created. This phase refines the initial alignment between domain elements and tasks produced in Phase 2 to specify detailed learning goals. The specification is provided by using the TACT tool. o Phase 3a: Validation & Revision of Learning Goals. The learning goal model is evaluated and, if needed, revised. Validation checks are provided to help with the revision process. Phases 0-3 are performed in a sequential manner; revision loops can originate from Phase 2a and Phase 3a, as shown by the arrows in Figure 4. Main roles: The roles used in the Integrated Modelling Methodology, Second Version, did not change compared with the ones described in the first version and reported in Section 2.1. The modelling team is therefore composed of Domain Experts (DEs), Coaches, and Knowledge Engineers (KEs). 3.1.1 The Knowledge Bases of the 3rd Prototype. We briefly summarise the five application domains chosen by the four Application Partners to be part of the APOSDLE 3rd Prototype. A full description of these domains and the models produced
45. for this step in the space below. Use as much space as you need. Some questions to keep in mind while providing the feedback: Were the objectives clear? Were the tools adequate? What were the main difficulties encountered in this stage? Were the explanations given clear enough? Was the goal of the step completed after this step? Please rate from 0 to 5, where 0 is very easy and 5 is extremely difficult, how difficult this step was for you. Experience with variables: Did you use variables in the learning goals? Please indicate why or why not. Did you find it difficult to understand how variables could be used in general? Did you find it difficult to insert variables in TACT? If you used variables, where did you find it more intuitive/easy to create tasks with variables: in the MoKi or in TACT? Positive Experiences (ADD your comments here): Using the TACT tool was not difficult. The manual was very useful to understand the meaning of learning goal types and how to use the tool. Negative Experiences (ADD your comments here): We had problems in understanding the inheritance within the tool. We thought that if a class has the learning goal type profound knowledge about, the subclasses would inherit this learning goal type. This was not the case. So we had to define the learning goal type for every subclass, although the class had already
46. goal of the task model is to contain a description of tasks and possibly of the decomposition of tasks into their components. Differently from the meta-model used for the development of the prototype, it only contains some information to retrieve the graphical ordering of sub-tasks within a process, but not the workflow information about processes. Concerning this change, we must stress that the task models as considered in APOSDLE are not comparable with ordinary workflow models. In fact, the APOSDLE system provides documents and templates for the current working step, to help the user in performing and learning the current task. Therefore each task or sub-task is assigned with several documents that contain learning content for the current work step. Here the user is not restricted to a specific task order: the system allows the user to move freely between the different tasks and sub-tasks. Therefore a deep modelling of temporal task orders would result in additional effort while not being used by the running APOSDLE system, except by the task viewer. For this reason we decided to adopt the OWL ontology language also for the formalisation of tasks, and not the YAWL workflow language. Each task has an attribute task description, used to store the textual description of the task. The value of task description is provided at modelling time by the modellers (domain experts). The ordering of tasks in a process graph is stored according to some int
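The deliverable does not show the OWL serialisation itself, so the following is only a minimal sketch, in Python with rdflib, of the kind of structure described above: a task carrying a textual description and a plain ordering attribute for its sub-tasks, instead of workflow constructs. The namespace and all property names (task_description, has_subtask, ordering_index) are invented for illustration and do not claim to match the real APOSDLE ontology.

from rdflib import Graph, Literal, Namespace, RDF
from rdflib.namespace import OWL, XSD

# Hypothetical namespace; the real APOSDLE ontology IRIs may differ.
APO = Namespace("http://example.org/aposdle/task#")

g = Graph()
g.bind("apo", APO)

# A task carries a textual description, not a full workflow specification.
g.add((APO.Task, RDF.type, OWL.Class))
g.add((APO.task_description, RDF.type, OWL.DatatypeProperty))
g.add((APO.has_subtask, RDF.type, OWL.ObjectProperty))
g.add((APO.ordering_index, RDF.type, OWL.DatatypeProperty))  # graphical ordering only

g.add((APO.ElicitRequirements, RDF.type, APO.Task))
g.add((APO.ElicitRequirements, APO.task_description,
       Literal("Collect requirements from the stakeholders", lang="en")))

g.add((APO.RunBrainstormingSession, RDF.type, APO.Task))
g.add((APO.ElicitRequirements, APO.has_subtask, APO.RunBrainstormingSession))
g.add((APO.RunBrainstormingSession, APO.ordering_index,
       Literal(1, datatype=XSD.integer)))

print(g.serialize(format="turtle"))

Serialising the graph to Turtle makes it easy to see that only the task/sub-task hierarchy and a simple ordering value are kept, which is exactly the simplification with respect to workflow languages discussed above.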
47. how difficult this step was for you. Positive Experiences: Negative Experiences: Please estimate your modeling/coaching efforts at this stage in terms of hours spent to perform the task. Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner? Please give reasons for your answer. Please state the main differences, if any, between performing this step this year and last year. 2 Step 1a: Knowledge elicitation from Digital Resources The goal of this sub-step is to extract as much knowledge as possible from the digital resources provided by the Domain Experts. The desired output of Stage 1a is a number of candidate domain concepts. I was not involved in this step. 3 Step 1b: Knowledge elicitation from Domain Experts (DEs) The goal of this sub-step is to elicit knowledge directly from the DEs. The desired output is a refined task list and an extensive list of candidate domain concepts. Feedback: Please state positive and negative experiences of the IMM for this step in the space below. Use as much space as you need. Some questions to keep in mind while providing the feedback: Were the objectives clear? Were the tools adequate? Did the domain experts coincide with the person performing the modeling? What were the main difficulties encountered in this stage?
48. in APOSDLE. 2.2 Connection between Task Model and Learning Goal Model. List of tasks without associated learning goals (tasks should be associated to at least 1 learning goal): please check the reason why there are tasks not related to any learning goal. List of tasks with 1 associated learning goal: if a task has only one learning goal assigned, this might have several reasons: • Performing the task requires only one learning goal. However, this should not be the case for too many tasks, because otherwise one of the models (task model, domain model) becomes redundant. • The task might not be central to the learning domain (it possibly requires a lot of other learning goals that are not modelled). • The task might be central to the learning domain; however, crucial learning goals are missing because the domain concepts have been forgotten in the domain model. • The task might be central to the learning domain; however, crucial learning goals are missing because there are no resources available. • The domain concept that the learning goal is about might be rather high level, and sub-concepts were also meant. Implications: the possible explanations should be looked at. • The modeller has to take the decision whether the task should be removed, whether the domain model should be broken down, whether further domain concepts should be picked up into the model, or whether the status quo is satisfactory. List of tasks with
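As an illustration of how a check like "tasks without associated learning goals" can be automated, here is a minimal sketch run as a SPARQL query with Python and rdflib over an exported OWL file. The file name, the namespace and the hasLearningGoal property are assumptions made for the example; the actual APOSDLE checks are implemented as Java tools based on Jena and Pellet.

from rdflib import Graph

g = Graph()
g.parse("aposdle-knowledge-base.owl")  # hypothetical export of the task and learning goal models

# Find tasks with no learning goal attached; such tasks should be reviewed
# by the modeller, as recommended in the check above.
QUERY = """
PREFIX apo: <http://example.org/aposdle/>
SELECT ?task WHERE {
    ?task a apo:Task .
    FILTER NOT EXISTS { ?task apo:hasLearningGoal ?goal }
}
"""
for row in g.query(QUERY):
    print("Task without learning goal:", row.task)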
49. in order to achieve more difficult learning goals This was a big problem for P2 because the pre requisite between learning goals was computed from the task learning goal assignment but should be less problematic this year as a different strategy is used to compute pre requisites between learning goals Therefore possible changes should be first discussed with the coaches 2 4 Learning Goal Types list of never used learning goal type this is not necessarily an error but should be checked Integrated Modelling Methodology Collection of feedback Integrated Modelling Methodology Collection of Feedback The goal of this questionnaire is to collect positive and negative feedback useful for the evaluation and improvement of the Integrated Modelling Methodology Please state your comments the way you prefer you can provide feedback in form of short sentences and bulleted lists as well as more complex descriptions Both formats are perfectly acceptable Note for APs Please provide feedback only for the steps you have already completed Partner Name CCI 1 Step 0 Scope amp Boundaries and Resources Collection The goal of this step is to define the scope and boundaries of the application domain to be modeled and to gather some resources related to the application domain The output should be a first preliminary list of tasks process scribble a first list of domain concepts and a collection of relevant learning resourc
50. know how to apply/use/do a, and know how to produce a. These specific learning goal types are used to define learning goals for topics. They have clear definitions and meanings, which are described hereinafter. There are qualitative differences between learning goals of the different types, in the sense that different cognitive processes are involved, but there is no explicit hierarchy among them. Although learning goal types are clearly specified, in several cases the final decision on which learning goal type to choose resides with the knowledge engineer. Descriptions of the topics may provide useful information and facilitate the selection of a learning goal type; in the current version of the TACT, if available, topic descriptions are displayed. Moreover, in the third APOSDLE prototype we have added a learning goal type that is called unspecified. This learning goal type can be used to express that a user, in order to perform the task, might need all kinds of information about the topic. In the TACT interface, by default, a learning goal is always of type unspecified and has to be changed into a specific learning goal type manually. This procedure is described in Section 6.6. Table 1: Learning Goal Types in different languages (English – German): profound knowledge of – Umfassendes Wissen über; know how to produce a – Anfertigen können von; unspecified – e
51. learning goals referring to the same topic by using different learning goal types. DO NOT RELY on the task-topic assignment that stems from the knowledge required section in the APOSDLE Wiki. Perform a SECOND TRIAL to review the task-learning goal assignment. Do first-cut task-learning goal mappings quickly and rather by intuition, without thinking too much, and do not strive for perfection; in a number of case studies this strategy has proven to lead to success. WHEN YOU ARE FINISHED with a first-cut mapping for all tasks, walk through all task-learning goal assignments a second time and, if necessary, add and remove learning goals. The first trial of your task-learning goal assignment will take between 2 and 3 hours, depending on the number of tasks in your model. The revision will take another 60 minutes approximately. In Section 5 these points are explained in more detail. This document describes the learning goal types that are used in the third APOSDLE prototype. Each learning goal consists of one learning goal type and one topic. One topic can theoretically be linked with all five learning goal types, thereby creating five different learning goals. Based on our experiences with the second prototype we revised the list of learning goal types. The remaining specific learning goal types are the following: basic knowledge about, profound knowledge of,
52. negative experiences of the IMM for this step in the space below. Use as much space as you need. Some questions to keep in mind while providing the feedback: Were the objectives clear? Were the tools adequate? Was the knowledge elicited useful? What were the main difficulties encountered in eliciting knowledge from resources? Were the explanations given clear enough? Was the goal of the step completed after this step? Was there something missing? Please rate from 0 to 5, where 0 is very easy and 5 is extremely difficult, how difficult this step was for you: 0 (CCI), 3 (CNM). Positive Experiences: With CNM we had a meeting to sort and extract the main knowledge about ITel. Using card sorting in several rounds, we led CNM to get an overview and structure the domain and contents themselves. Negative Experiences: We had to travel to Dortmund. This step took a lot of time, but was important to structure and facilitate the proceeding steps. Please estimate your modeling/coaching efforts at this stage in terms of hours spent to perform the task. ADD your comments here. Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner? Both for CNM and CCI we had a good overview and a structure of the domain after this step. Please state the main differences, if any, between performing this step this year and last year.
53. of actions performed in later steps. Unfortunately we don't know exactly when he or she revisits a given task (before and after which task). It may have an impact on the learning goals definition: the competencies are not the same if the simulation engineer performs a given task for the first time or for a second time. For Prototype 2 we decided to assign to the task all the learning goals that are relevant in both situations. However, it means that the competency model contains several tasks with a great number of learning goals. Of course TACT can be improved. Examples of possible improvements: visualization of the models, identification of super-tasks and sub-tasks, relations between concepts, presentation of the process order instead of the alphabetical list of tasks. Please estimate your modeling/coaching efforts at this stage in terms of hours spent to perform the task: 8 hours. ADD your comments here. Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner? Please give reasons for your answer: yes. Please state the main differences, if any, between performing this step this year and last year. ADD your comments here. Better tool (TACT), but the process remains the same. The main difference consists in using variables in the task and learning goals definition. 8 Step 6: Formal Models Validation At the end
54. of the APOSDLE Wiki was filled in at a rather early modelling stage. Consequently, those suggested topics might be incomplete or some of them might be wrong. Therefore the KE should not hesitate to re-assess their relevance for performing the task. 5. Perform a second trial to review the task-learning goal assignment. Iterative modelling is encouraged. Usually, at the beginning of a task-learning goal mapping one is rather uncertain about how to do it. During modelling, a sense arises for what is a meaningful mapping and what is not. Therefore a first-cut mapping of learning goals to tasks by intuition is suggested; in several case studies this strategy has proven to lead to success. Finally, at least one more walk-through is suggested. 3.7 Phase 3a: Validation & Revision of Learning Goals 3.7.1 Goal The goal of this phase is to validate, with respect to its correctness and completeness, the domain dependent part of the APOSDLE knowledge base created during the previous modelling phases. This validation phase may trigger a revision in MoKi of the domain and task models, or a revision in TACT of the learning goal model. 3.7.2 Description The validation and revision of the Learning Goals is built around a single activity supported by automatic checks, as described in Figure 8 (Revise in MoKi – Automatic checks – Revise in TACT). Figure 8: Validation & Revision of Learning Goals
55. sense to add parameters in order not to have too many learning goals and to refine the task. Did you find it difficult to understand how variables could be used in general? No. Did you find it difficult to insert variables in TACT? When you know where you have to do it (highlight the respective task and topic), then it is easy, but I first did not know how to do this. If you used variables, where did you find it more intuitive/easy to create tasks with variables: in the MoKi or in TACT? Both. It is very useful to have the variables created already as a hint in the TACT, but there was one task where I found out in the TACT that a variable would make sense. Positive Experiences: TACT is easy to use, and especially the descriptions and the automatically generated learning goals are very useful. Negative Experiences: There is still room for some usability improvement. Please estimate your modeling/coaching efforts at this stage in terms of hours spent to perform the task: 20h. Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner? Yes. All tasks now have learning goals, and the specific tasks have specific learning goals. Please state the main differences, if any, between performing this step this year and last year. Having the descriptions available and automatically generated learning goals is better. The learning goals
56. the different models, like the relation denoting required knowledge between elements of the task and the domain model. This natural language based but also structured description provides a natural bridge between formal and informal representation of knowledge. The user fills a page via forms, so he/she does not need to know any particular syntax or language to participate in the creation of the domain and task models. All the actors involved in the modelling activities can also interact with each other and exchange further ideas and comments using the SMW's built-in discussion functionality. Below, in Figure 6, an example of a MoKi page describing an element of the domain model is shown. Modify concept: Brainstorming. Annotations – Description: One of the techniques that can be used to acquire information and requirements for the future system. By this technique requirements engineers ask a group of stakeholders to generate ideas. Preconditions: suitable group of stakeholders. Strengths: good for eliciting high-level domain entities and questioning assumptions. Weaknesses: susceptible to group processes, possibly unsystematic. Synonyms: –. Hierarchical Structure – Is a: ACRE Methods; Is part of: –. Properties: Add another. Notes (free text). [ ] This is a minor edit. [ ] Watch this page. Save page / Show preview / Show changes / Cancel. Figure 9: The page of a domain element
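To make the "bridge" between the informal page and the formal model more concrete, the following sketch (Python with rdflib) shows how the fields of such a form could be exported: free-text fields such as the description stay annotations, while the structured "Is a" field becomes a subclass axiom. The export mapping, the namespace and the field names are assumptions made for illustration, not the actual MoKi export code.

from rdflib import Graph, Literal, Namespace, RDF, RDFS
from rdflib.namespace import OWL

APO = Namespace("http://example.org/aposdle/domain#")  # hypothetical namespace

# Field values as they might come out of the MoKi form for 'Brainstorming'
# (content taken from the example page above).
page = {
    "name": "Brainstorming",
    "description": "One of the techniques that can be used to acquire "
                   "information and requirements for the future system.",
    "is_a": ["ACRE_Methods"],
}

g = Graph()
g.bind("apo", APO)

concept = APO[page["name"]]
g.add((concept, RDF.type, OWL.Class))
# Free-text fields survive only as annotations ...
g.add((concept, RDFS.comment, Literal(page["description"], lang="en")))
# ... while the structured 'Is a' field becomes a formal subclass axiom.
for parent in page["is_a"]:
    g.add((concept, RDFS.subClassOf, APO[parent]))

print(g.serialize(format="turtle"))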
57. the information is structured according to pre-defined templates. 4.1 Templates for Concepts A typical template for a concept is shown in Figure 5. It shows the information stored in the wiki for the specific concept. The tabs listed in the upper part of the template provide basic functionalities to manage the template: • edit with form: this tab enables editing the information contained in the template with a form-based interface. Note: auto-completion is used to suggest how to fill the forms with already existing elements of the wiki. • edit: this tab enables editing the information contained in the template directly with the semantic media wiki syntax. • history: this tab enables seeing the history of changes of the current page. • delete: this tab enables deleting the current concept. • move: this tab allows renaming the current concept. Please refer to your coaches for a description of the information that must be added in the fields of the different templates. 4.2 List concepts The List concepts functionality creates a table which lists all the concepts contained in the wiki and their main properties. By clicking on a concept, the template of that concept is shown. 4.3 Add a concept The Add a concept functionality allows adding or editing a concept in the wiki. If the concept is new, then it is added to the wiki. The user is automatically shown the empty template with form for the new concept, so that the relevant information c
58. this domain concept help to differentiate between resources? 2. Does this domain concept refer to a learning goal of a hypothetical APOSDLE user? 2.1 Does this concept help APOSDLE to support the mastering of the learning goal? As a general rule, it is suggested to keep possibly irrelevant concepts rather than risking the removal of relevant ones. Nevertheless, it is clear that the consequence of keeping irrelevant information at this stage is an increase in the effort of modelling in every subsequent modelling stage. 3.4.3.2 Guidelines for choosing relevant tasks Starting from all possibly relevant tasks generated in Phase 1, the KE decides which ones are relevant tasks and which ones should be discarded. To help deciding about relevant tasks in the APOSDLE knowledge base, the following guidelines were given: 1. Does the task refer to a situation (task) in which learning supported by APOSDLE shall occur? 1.1 Does the task require knowledge that lies inside the specified learning domain? 1.2 Should APOSDLE be able to support this task? 2. Is the task recognisable for the future APOSDLE user? The statement "I am currently doing task X" should make sense to the future APOSDLE user and should provide a good insight into the correct granularity level of a task. For instance, the statement "I am currently performing an activity" would be too generic and it probably would not make sense to a user, while the statement "I'm currently pressing the ESC
59. to be over-expressive for the purpose of APOSDLE. From an analysis of the issues reported above, as well as from the analysis of the specific feedback items reported in D1.3, we identified several requirements for a revision of the IMM. A more integrated modelling process: most of the granularity issues reported by the Application Partners were in fact due to granularity mismatches between the Domain and the Task model. Thus, supporting the concurrent and inter-related modelling of these two modules of the APOSDLE knowledge base was deemed to be a necessary strategy to handle granularity issues. A more agile modelling process: this serves to reduce the workload of modelling and to strengthen an active collaboration between the members of the modelling team. Several complaints concerned the fact that the iteration between informal and formal was too time consuming for the APOSDLE modelling activities. Thus we decided to move away from a strongly structured waterfall paradigm and switch to a more agile, collaborative paradigm, illustrated in Section 2.2. Simplification of the modelling of tasks: the need to keep the modelling process as simple as possible, together with issues about the usage of the YAWL editor and the over-expressivity of YAWL, triggered a revision and a simplification of the way tasks are modelled. Note that the granularity of models is a typical problem of modelling for which there is no general solution. We do not ai
60. to the auto-completion functionality in the MoKi. Positive Experiences: The objectives were clear. The tool chosen is adequate for this step of the methodology, especially if domain experts are actively involved in the modeling phase. The tool has greatly improved with respect to last year's one. The MoKi has been used in a collaborative manner, since coaches were able to monitor and provide feedback on the work of application partners. Negative Experiences: ADD your comments here. Please estimate your modeling/coaching efforts at this stage in terms of hours spent to perform the task: ADD your comments here. 3 hours. Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner? Yes, as the AP has been able to produce quite detailed domain and task models. Please state the main differences, if any, between performing this step this year and last year. The tool has greatly improved (new templates, forms vs wiki syntax editing, new visualization functionalities, delete/rename support) with respect to last year's one, thus easing the work of people involved in modelling activities. 5 Step 3: Informal Models Validation and Revision The goal of this step is to have the domain model and task model validated (completeness and correctness) by the DEs. The step was supported by guidelines, by results from automatic checks, and by an on-line
61. twice in the topic browser. One and the same topic can be a sub-topic of two different topics. If two topics have the same name, they are one and the same topic, and no distinction needs to be made between them. How many learning goals should be assigned to a task? All learning goals that are mandatory should be assigned to a task; don't assign learning goals that would be nice to have. Learning goals are not inherited from super-tasks or from tasks that have to be performed before the task under consideration, but have to be modelled explicitly. The easiest way to realize this is by asking for each task: What do I have to know / be able to do for performing the task successfully? Lessons Learned: • Use variables sparingly; otherwise the number of tasks can easily become very large. Project Number 027023 APOSDLE Advanced Process Oriented Self Directed Learning Environment Integrated Project IST Technology enhanced Learning Information Society Technologies. Validation & Revision of Learning Goals, Integrated Modelling Methodology. APOSDLE Identifier: APOSDLE W10 JRS Agenda Plenary and GA Trento. Author/Partner: Chiara Ghidini (FBK), Barbara Kump (TUG), Marco Rospocher (FBK). Work Package/Task: WP1. Document Status: Draft. Confidentiality: Confidential. Version / Date / Reason of change: 1 – 2008-10-01 – Document created; 2 – 2008-10-10 – Distinction between manual and automatic checks
62. was made for the MoKi. We conclude from these results that the MoKi is a useful tool for informal modelling. For modelling P2, in order to enable a review of the models by all coaches and in order to facilitate collaborative modelling, the models were built in two languages (national language plus English), which was found to be very tedious by the KEs (KE CCI). In P3, models were only built in one language. Because of APOSDLE specific requirements in the modelling method for P2, the informal task model had to be transcribed into a YAWL model manually by SAP. The YAWL model was then converted automatically into OWL files. The informal P2 domain model in the semantic MediaWiki was automatically exported into OWL files. Then, if desired, the formal task and domain model could be manually edited. The analysis of the evaluation questionnaires for P2 showed that YAWL and Protégé were considered as appropriate tools for formal modelling (KE ISN, CCI, CNM; Coach KC). Minor technical problems with Protégé were reported by one of the KEs (CCI). One coach (KC) also stated that formal models were well prepared by the technical partners FBK and SAP. However, the transformation from the informal task model in the Wiki was time consuming, and an automatic translation from the Wiki to YAWL would have been helpful (Coach SAP). According to one KE (ISN), some relations got lost during the transformation of the domain model from the Wiki to an OWL file. Even t
63. were suggested by Cooke amp McDonald 1986 The original procedure of card sorting was described for example by Maiden amp Rugg 1996 We have slightly adapted it for our methodology Of course not all Knowledge elicitation techniques have to be applied in every domain Choosing knowledge elicitation techniques depends on the requirements of the respective model For instance if already many concepts have been brainstormed for the domain concept listing does not need to be applied Moki the wiki which supports Phase 2 for a full description see Section 4 2 10 learn work A anposcile 52 3 3 3 Supporting Tools Techniques amp Resources Here we briefly describe the techniques we propose in our IMM to support the Knowledge Acquisition phase Below we present the techniques for Knowledge Elicitation from Domain Experts while an extensive description of the text mining functionality supporting Knowledge Acquisition from Digital Sources is provided in Section 4 2 2 1 3 3 3 1 Structured Interviews Based on the first cut task list and process scribble that was generated in Phase 0 a more fine grained task list should be generated and further relevant domain concepts should be identified Therefore the KE conducted structured interviews with the DE Tasks were broken down into sub tasks by asking for each task in the first cut task list what were its subtasks This was performed until the KE and DE were able to obtain t
64. whether APOSDLE is suitable for this company On the other hand its purpose is to document the intended learning domain the target user group the tasks of the intended APOSDLE users etc The theoretical foundation of this questionnaire is detailed in the APOSDLE Deliverable 2 8 amp 3 5 2009 and the questionnaire itself is appended in the Annex Part 1 Initial questionnaire Scope amp Boundaries The questionnaire is designed in such a way that it can be either filled out remotely or be used as a guideline for a personal structured interview The questionnaire is accompanied by interpretation guidelines such that any person who is reasonably acquainted with APOSDLE can provide advice about the suitability of APOSDLE This questionnaire was given to all application partners for P3 filled out and interpreted by the coaches together with the knowledge engineers 3 3 Phase 1 Knowledge Acquisition 3 3 1 Goal The goal of the Knowledge Acquisition step is to extract as much knowledge as possible from both digital resources provided by the DEs and by eliciting knowledge directly from the DEs The results are a refined task list and an extensive list of candidate domain concepts that are documented in the modelling wiki 3 3 2 Description Phase 1 is subdivided into two different activities namely Knowledge Acquisition from Digital Sources and Knowledge Elicitation from Domain Experts The two activities are running in pa
65. which indicates that modelling was easier for the KE when modelling P3, which is probably due to (a) the experience in modelling, (b) the knowledge about APOSDLE and the role of models in APOSDLE, (c) the improvement of the modelling tools, (d) the reduction of complexity in the model as compared with their models from P2, and (e) the selection of a more appropriate, simpler domain which was easier to model. Moreover, also due to the scope and boundaries questionnaire, the domain fit better with the application scenario of APOSDLE (the P3 domain was more task driven than the P2 domain). The numbers in the table are mentioned here to give an approximate impression about the time needed for modelling. However, modelling time cannot be directly compared for different models, as the effort needed depends on a variety of factors, such as: the scope of the domain; the number of concepts, tasks and learning goals; access to relevant documents; availability of process descriptions or concept hierarchies; availability of experts for knowledge elicitation and model validation; complexity of the model (hierarchical depth of the domain model, usage of variables). As mentioned above, further improvement of the efficiency of the modelling methodology could be planned, e.g. by reducing the number of tools involved. Having everything in one tool should already speed up the modelling process. However, other facilities have to be invented which help in reducing the
66. whole nor a certain party of the APOSDLE consortium warrant that the information contained in this document is capable of use, nor that use of the information is free from risk, and accepts no liability for loss or damage suffered by any person using the information. This document does not represent the opinion of the European Community, and the European Community is not responsible for any use that might be made of its content. Imprint: Full project title: Advanced Process Oriented Self Directed Learning Environment. Title of work package: WP Work Processes. Document title: Integrated Modelling Methodology Version 2.0 (public version). Document Identifier: APOSDLE D1.6 FBK IMMv2.0_public. Work package leader: SAP. List of authors: Chiara Ghidini, Marco Rospocher (editors), Barbara Kump (KC), Viktoria Pammer (KC), Andreas Faatz (SAP), Andreas Zinnen (SAP). Administrative Co-ordinator: Harald Mayer. Scientific Co-ordinator: Stefanie Lindstaedt. Copyright notice: 2006-2009 APOSDLE consortium. Document History (Version / Date / Reason of change): 1 – 2009-03-16 – Document created; 2 – 2009-04-10 – Document sent for internal review; 3 – 2009-04-21 – Internal review comments received; 4 – 2009-04-29 – Final Version submitted. This document describes the second version of the APOSDLE Integrated Modelling Methodology. This methodology, which updates the previous version described in Deliverable D1.3 Integr
67. Objectives and explanations: o The explanations were clear (1 KE EADS). o The objectives of the phase were clear (1 KE EADS). Check report: o The check report was useful and gave a nice overview (2 KE ISN, EADS; 1 Coach FBK). o The result of the check report rather concerns the technology partners (1 KE EADS). TACT: o The Excel sheet produced by the TACT was very useful for checking the learning goal model (1 KE ISN). 8.7 General remarks 8.7.1.1 Feedback on the IMM for P2 o It was difficult to estimate the modelling efforts (2 KE CCI, ISN). o The organization of the modelling process was not clear (1 KE CNM). o We had difficulties with knowledge engineering because we had no experience with it (1 KE CCI). o There was a lack of understanding of the role of models in APOSDLE (1 Coach SAP). o The model was kept simple and was therefore easily manageable (1 KE CCI, 1 Coach SAP). 8.7.1.2 Feedback both for the IMM of P2 and P3 Complexity of the model: o The model does not allow depicting all the necessary complexity of the domain (P2: 1 KE EADS; P3: 1 KE EADS). 8.7.1.3 Feedback on the IMM for P3 Modelling efforts: o The modelling process is still time consuming (1 KE CCI, 1 Coach SAP). o Speeding up modelling is crucial for a real-world application (1 KE CCI). o Time to be invested in modelling is acceptable
68. A c Test. If it is unchecked, statements will be shown as Anova subClassOf Test. 1.4 Known issues and bugs The ontology questionnaire does not deal with imported ontologies; so, if an ontology contains imports, the reasoning is done only over the statements within the uploaded file. The ontology questionnaire does not store labels, comments or similar things. The ontology questionnaire relies on Pellet to do the reasoning: if Pellet cannot deal with an ontology, the questionnaire cannot either. In case uploading an ontology takes too long, try loading the ontology into Protégé 4 and classifying it with Pellet. If this works, you have discovered a bug in the ontology questionnaire. If Pellet in Protégé 4 also fails, then this ontology simply cannot be dealt with. C Task Competence Mapping Tool (TACT) User Manual. Disclaimer: This document contains material which is copyright of certain APOSDLE consortium parties and may not be reproduced or copied without permission. The information contained in this document is the proprietary confidential information of certain APOSDLE consortium parties and may not be disclosed except in accordance with the consortium agreement. The commercial use of any information in this document may require a licence from the proprietor of that information. Neither the APOSDLE consortium as a whole nor
69. [Scanned page, printed upside-down in the source PDF: interpretation guidelines for the Scope & Boundaries questionnaire, indicating which forms of learning and knowledge transfer APOSDLE does and does not support (e.g. task-based, self-directed learning and sharing collections of existing material are supported, whereas classic learning management system functionality is not). The text is not recoverable from this extraction.]
70. AP2009, Volume 426 of CEUR Workshop Proceedings. Pammer, V., Scheir, P. & Lindstaedt, S. (2007). Two Protégé plug-ins for supporting document-based ontology engineering and ontological annotation at document level. 10th International Protégé Conference, 2007. Protégé: Protégé ontology editor, protege.stanford.edu. 7 Appendix 1: The Meta-Model of the APOSDLE Knowledge Base The main elements of the meta-model are the elements of the different models and data structures, with particular focus on their mutual relationships, as illustrated in Figure 15. [Figure 15: The meta-model of the APOSDLE knowledge base, showing the User profile, Task model, Learning goal model, Domain model, Instructional types and Knowledge Artefacts, connected by relations such as requires, is about, has_parameter and has_learning_goal_type.] The main elements of the APOSDLE Knowledge Base meta-model are: 1. the Domain model; 2. the Task model; 3. the Learning goal model; 4. the APOSDLE instructional types; 5. the relations between all the elements above. In this section we describe these elements and their relations in detail. 7.1 Domain Model The domain model contains
71. Coach KC o Classic structured interviews might have been useful 1 KE ISN o Classic structured interviews might bring better results than card sorting 1 KE CCl 66 learn work A anposcile 52 8 2 2 2 Feedback both for the IMM of P2 and P3 Experts o Experts had very little time P2 3 KE ISN EADS CCl 2 Coaches SAP KC P3 1 KE CCI 8 2 2 3 Feedback on the IMM for P3 Modelling effort o Modelling in this step took a lot of time 1 KE CCI o It is challenging to reduce data from domain experts to tasks and topics 1 KE ISN o We had no coaching effort everything was done by the KE 1 Coach SAP Experts o Experts were interested because they had a better understanding of the APOSDLE system and the modelling process 1 KE CCl o Interviews with domain experts were useful 1 KE CCl o Different experts have different perspectives on the domain it is difficult to find overlaps 1 KE ISN Knowledge elicitation techniques o Knowledge elicitation workshops were useful 1 KE ISN o A modified version of card sorting was applied 2 KE ISN EADS 1 Coach SAP 8 3 Phase 2 Modelling of domain and tasks 8 3 1 1 Feedback on the IMM for P2 General o Informal modelling is not necessary we should have directly started with formal modelling 1 KE CCl o It is essential to start from informal modelling before creating formal mod
72. Concerning long labels: again, too complex names bear the risk of not meaning much to the user and could also result in being too long for a nice user interface. As a guideline, we will provide an automatic check which lists tasks with names longer than 30 characters. This is not necessarily an error, but a stimulus to think whether the name can be shortened (see automatic checks below). Relevance for learning: would people want to learn about the tasks in the task model? Granularity of the task model: is it too coarse or too fine grained? The decision whether the task list is fine grained enough depends on the intended use of the learning environment and on the intended target group, and hence rests with the knowledge engineer. As a rough guideline, the task should allow for a manageable amount of learning goals that can be acquired in a reasonable time by the intended target group, to allow for work-integrated learning. Descriptions: o Are the task descriptions correct? o Are the descriptions easy to understand, i.e. helpful for a person who wants to learn about the task? Variables: o Are there tasks that have variables but should not have variables? o Are there tasks currently without variables that should have variables? Coaches should provide support for both checks. Knowledge required: Are the concepts in the knowledge required section correct? Are they really required? These checks will be performed automatically
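A check of this kind is simple enough to sketch directly; the snippet below only flags names longer than 30 characters as candidates for shortening. The task names are invented for the example, and the real APOSDLE check runs over the MoKi/OWL export rather than over a Python list.

MAX_NAME_LENGTH = 30  # guideline value from the text above, not a hard error threshold

task_names = [
    "Run brainstorming session",
    "Prepare the requirements specification document for the customer review",
]

for name in task_names:
    if len(name) > MAX_NAME_LENGTH:
        # Long names are only flagged; the knowledge engineer decides whether to shorten them.
        print(f"Consider shortening ({len(name)} characters): {name}")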
73. DEs asking the DEs to sort the cards in different groups or piles The KE then asks the DEs to specify the criterion used to sort the cards into these groups Better results can be usually obtained by small groups of DEs working together The KE documents the piles obtained and the sorting criterion applied by the DEs Next the KE shuffles the cards again and gives them back to the DEs asking for a new sort according to a different criterion Sorting with the same set of cards should proceed as long as the DEs are still able to come up with different sorting criteria Card sorting takes approximately 10 15 minutes per sorting trial Typically DEs are able to sort objects according to 8 10 criteria For eliciting expert knowledge with respect to knowledge required for each task a preliminary list of tasks and topics already has to be modelled In preparation of the card sorting session one card is prepared for each of the tasks and the topics Cards for tasks must have different colours than cards for topics For the sort better results can be usually obtained by small groups of DEs working together because biases in a single expert s view of the domain can be identified more easily The procedure is similar to the one described above except that one task of interest is selected and the respective card is laid on the table in front of the two experts The experts are asked to describe what is meant by the task what are the input and output of the
74. [Figure 14: Entailed and explicitly modelled statements from the SDA domain, e.g. T_Test subClassOf Parametric_Test, involving concepts such as Two_Way_ANOVA, General_Linear_Model (GLM), Linear_Correlation, F_Test, Hypothesis_Testing and Linear_Regression.] The ontology questionnaire was deployed as a web application on a server hosted by the Know-Center. Its distribution was accompanied by a detailed manual, contained in the Annex Part 3 (Validation and Revision of Domain & Tasks). 4.4.2 Validation & Revision of Learning Goals These checks are performed at the end of the modelling phase involving the usage of the TACT, and are used to refine and tune the models modified and/or created in TACT. The change of the models triggered by the list of checks has to be performed manually in MoKi and/or in TACT. The checks are performed automatically via some Java tools: the OWL files of the whole knowledge base are given as input to these scripts, which are based on the Jena library and the Pellet reasoner. The tools actually implement some SPARQL queries on the OWL/RDF files describing the models. These scripts return a text file containing: General Statistics Section: some general statistics on the models (the number of Tasks, the number of Domain Elements, the number of Learning Goals
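The actual check scripts are Java tools based on the Jena library and the Pellet reasoner, as stated above; purely as an illustration of the kind of SPARQL query involved, the sketch below computes such general statistics in Python with rdflib. The class IRIs and the file names are assumptions made for the example.

from rdflib import Graph

g = Graph()
g.parse("aposdle-knowledge-base.owl")  # hypothetical OWL export of the whole knowledge base

# Count the instances of each (assumed) meta-model class, mirroring the
# "General Statistics Section" of the check report.
STATS_QUERY = """
PREFIX apo: <http://example.org/aposdle/>
SELECT ?type (COUNT(?x) AS ?n) WHERE {
    VALUES ?type { apo:Task apo:DomainElement apo:LearningGoal }
    ?x a ?type .
}
GROUP BY ?type
"""

with open("check-report.txt", "w") as report:
    for row in g.query(STATS_QUERY):
        report.write(f"{row.type}: {row.n}\n")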
75. Document revised adding steps. This document has the goal of guiding Application Partners and Coaches through a list of checks and suggestions to be used to refine and tune the models expressed in the Modelling WiKi (MoKi). All the checks should be performed between October 24 and November 7. The changes of the domain model and the task model triggered by the list of checks and suggestions described here should be made in the MoKi by the application partners, supervised by their coaches. In the following we provide an overall overview of the revision process and its main steps (Section 2: Brief description of the overall process); then we illustrate the steps in more detail, both for the domain model and for the task model. The revision process is divided into three main steps: 1. Manual checks. This part consists of a list of suggestions to manually check and validate the list of concepts contained in the domain model and the list of tasks contained in the task model of the MoKi. These suggestions and checks can trigger updates and modifications to improve the models directly in the MoKi. 2. Automatic checks. This part consists of a list of automatic checks that will be performed to verify certain properties of the concepts and tasks described in the MoKi. The results of these checks will be sent to the Application Partners and coaches to help them revise the models contained in the MoKi. 3. On-line questionnaires
76. Experts (DEs). The goal of this sub-step is to elicit knowledge directly from the DEs. The desired output is a refined task list and an extensive list of candidate domain concepts. Feedback: Please state positive and negative experiences of the IMM for this step in the space below. Use as much space as you need. Some questions to keep in mind while providing the feedback: Were the objectives clear? Were the tools adequate? Did the domain experts coincide with the person performing the modeling? What were the main difficulties encountered in this stage? Were the explanations given clear enough? Was the goal of the step completed after this step, i.e. was a refined task list and an extensive list of candidate domain concepts ready after this step? Please rate from 0 to 5, where 0 is very easy and 5 is extremely difficult, how difficult this step was for you: 2. Positive Experiences (ADD your comments here): To gather the knowledge of our domain experts we conducted several interviews and smaller workshops with them. These interviews were very comfortable and we got a lot of knowledge from them, so that we could gather new domain concepts. Again, this step was not so difficult, but we spent a lot of time interviewing and observing the domain experts. The domain experts were this time more open for interviews and knowledge elicitation because
77. Ki 1 KE ISN 8 2 2 Knowledge Acquisition from Experts 8 2 2 1 Feedback on the IMM for P2 General o APOSDLE vocabulary was difficult to understand for people not involved in modelling 1KE EADS o The purpose of modelling was not clear to people not involved in APOSDLE 1 KE CCI o The preparation of knowledge elicitation requires efforts and time 1 KE EADS o Knowledge elicitation from experts did not lead to a lot of useful results 1 KE CCl o Identification of task and concepts was easy for the expert 1 KE CNM Objectives and explanations o The objectives of the phase were clear 2 KE CCI ISN 1 Coach SAP o The explanations were clear 1 KE ISN o The explanations were clear for a person experienced in knowledge engineering but not for others 2 KE CCI CNM o Explanation of modelling processes and modelling methodology to non KE was difficult 1 KE CCI Experts o Experts had difficulties to make implicit knowledge explicit 2 KE ISN CCl o Experts had problems to think in processes 1 KE CCl Knowledge elicitation techniques o Knowledge elicitation techniques were adequate 1 KE EADS 1 Coach KC o Laddering was useful for the task model 2 Coaches SAP KC o Card sorting in a group was useful 1 KE ISN 1 Coach SAP o Card sorting is fun 1 KE CCl o Card sorting did not lead to useful results 1
78. LE uses the data contained in the APOSDLE knowledge base concerned the modelling of tasks. In the 2nd prototype, tasks were modelled with the workflow-based language YAWL. From the analysis of the experience of last year, the following issues emerged. Usability of the YAWL editor: the usability of the YAWL tool was rated poorly by the application partners. Reasons were the complex graphical interface and the long training needed to use the YAWL editor by domain experts with no knowledge engineering skills. YAWL is a language whose main function is to model business processes; processes in the e-learning area appear to be much more informal than those in the business sector. Parallelism and jumps can only be cumbersomely constructed in YAWL, and since these unusual constructs occur frequently in the domains of our application partners, they result in increased modelling effort and difficult-to-read models. Over-expressive power of workflows: the task models as considered in APOSDLE are not comparable with ordinary business models. First, the models are created mainly by domain experts with little modelling experience and few modelling skills, and they have to be kept simple. Second, and more important, the APOSDLE system does not use the expressive temporal information contained in workflows, but only needs to store simple information such as the task/sub-task hierarchy and some simple before/after relation between tasks. Thus the workflow constructs of YAWL seemed
79. M for this step in the space below. Use as much space as you need. Some questions to keep in mind while providing the feedback: Were the objectives clear? Were the tools adequate? Was the wiki used in a collaborative manner? What were the main difficulties encountered in this stage? Were the explanations given clear enough? Was the goal of the step completed after this step? Please rate from 0 to 5, where 0 is very easy and 5 is extremely difficult, how difficult this step was for you: 0. Experience with variables: Did you use variables in the informal model? Please indicate why or why not. Yes, we did use variables. We have some concepts, e.g. Creativity Techniques, which has about 15 subconcepts labeling different techniques; therefore it really makes sense to use these variables. Did you find it difficult to understand how variables could be used in general? No. Did you find it difficult to insert variables in the MoKi? No. Positive Experiences: The MoKi is very good. It is very easy to insert the relevant data and to have a good overview. Negative Experiences: Please estimate your modeling/coaching efforts at this stage in terms of hours spent to perform the task: 25h. Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner? Yes, we have hopefully a good task
80. Methodology Collection of feedback Do you think that the domain you have chosen is appropriate for learning support with APOSDLE From the evaluation of APOSDLE prototype 2 CCI learned that the specific APOSDLE e learning approach promises the highest benefits in a setting with academic and scientific domains and users solid domains and stable curriculums a well maintained document basis organisational culture rewarding one s own initiative autonomy and self monitoring Neither the scenario of learning event organisation nor the scenario of learning consulting on REACH covered those requirements Thus CCI changed the domain again and chose the new domain of Information and Consulting on Industrial Property Rights The task of informing and consulting on industrial property rights is carried out by six people at CCI two jurists one electrical engineer one economist one biologist and one commercial clerk in case of substitution academic qualification prevails It is a stroke of luck that the economist person 15 still quite new in this task The topic is well settled well documented and manageable Compared to REACH it is not very dynamic CCI has got a clear profile in the domain Unlike other consulting institutions like patent offices or industrial property agencies CCI informs especially on questions of patent strategy patent commercialisation and licensing CCI has built up genuine internal knowledge about pro
81. Modelling Methodology Collection of feedback Integrated Modelling Methodology Collection of Feedback The goal of this questionnaire is to collect positive and negative feedback useful for the evaluation and improvement of the Integrated Modelling Methodology Please state your comments the way you prefer you can provide feedback in form of short sentences and bulleted lists as well as more complex descriptions Both formats are perfectly acceptable Note for APs Please provide feedback only for the steps you have already completed Partner Name ISN 1 Step 0 Scope amp Boundaries and Resources Collection The goal of this step is to define the scope and boundaries of the application domain to be modeled and to gather some resources related to the application domain The output should be a first preliminary list of tasks process scribble a first list of domain concepts and a collection of relevant learning resources Feedback Please state positive and negative experiences of the IMM for this step in the space below Use as much space as you need Some questions to keep in mind while providing the feedback Were the objectives clear Were the tools adequate What where the main difficulties encountered in choosing a domain and collect the resources Were the explanation given clear enough Was the goal of the step completed after this step Was there something missing Please rate form 0 to 5 w
82. No redefinition of scope and no resources collection ADD your comments here Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner Please give reasons for your answer Please state the main differences if any between performing this step this year and last year ADD your comments here The main difference was to use the P2 models and domain for P3 and not beginning from scratch The list of tasks and concepts for P3 were updated according to P2 evaluation results and new meta model that enabled specifying an additional parameter in task model Moreover a questionnaire was filled by EADS Its objective was to decide whether Simulation is an appropriate learning domain for APOSDLE P3 About the scope of the domain after the evaluation of P2 we decided to focus only on the development process of simulation and to forget the part concerning the study process 2 Step 1a Knowledge elicitation from Digital Resources The goal of this sub step is to extract as much knowledge as possible from the digital resources provided by the Domain Experts The desired output of Stage la is a number of candidate domain concepts Feedback Please state positive and negative experiences of the IMM for this step in the space below Use as much space as you need Some questions to keep in mind while providing the feedback Were the objectives clear Were the tools adequate
83. o Automatic translation from the Wiki to YAWL would have been helpful (1 Coach SAP). o There were minor technical problems with Protégé (1 KE CCI). o Protégé and YAWL were adequate tools (1 Coach KC). o Some relations got lost during the transformation from Wiki to Protégé (1 KE ISN). o YAWL is a useful tool (1 KE CNM). o Formal models were well prepared by the technical partners FBK, SAP (1 Coach KC). 8.3.1.2 Feedback both for the IMM of P2 and P3 Objectives and explanations: o The objectives of the phase were clear (P2: 3 KE ISN, EADS, CCI; 1 Coach KC. P3: 1 KE EADS, 1 Coach FBK). o Explanations were clear (P2: 1 KE ISN. P3: 1 KE EADS). Wiki: o The Wiki was used in a collaborative manner by the knowledge engineers and their coaches (P2: 1 Coach SAP. P3: 1 Coach FBK). o The Wiki was not used in a collaborative manner among the KEs of different domains (P2: 2 KE CCI, ISN; 1 Coach KC. P3: 1 KE EADS). 8.3.1.3 Feedback on the IMM for P3 General: o Having fewer tools saved a lot of time (1 Coach SAP). Adaptation of methods: o MS Visio was used to build graphical domain and task models (2 KE CCI, ISN). Conceptual issues: o It was difficult to find labels for domain concepts (2 KE EADS, CCI). MoKi: o The MoKi was useful (3 KE CCI, ISN, EADS; 2 Coaches FBK, SAP). o Description of how t
84. P 5 1 3 2 Negative Feedback GENERAL Unsolved problem Experts knowledge is expanding continuously How could we transfer this knowledge growth to APOSDLE without repeating workshops and interviews continuously Source AP Comment This is a typical modelling problem which goes beyond this methodology and the scope of the APOSDLE project GENERAL It is a challenge to reduce the data from the domain experts to tasks and concepts also it is not very easy to find overlaps Each DE has his own view on the things and his own view on how important some things are Source AP Comment This is a general knowledge acquisition problem which goes beyond this methodology and the scope of the APOSDLE project Anyway support by coaches based on their experiences of supervising supporting this phase in previous deployment of the system could be effective to solve these issues ORGANIZATION This step is very time consuming Difficulties to find the right dates We had to travel This step took a lot of time Source APs Comment This phase is a crucial one in the methodology and the quality of the models created is highly influenced by its output Hence investing some time in it will be rewarding in the end Nevertheless a possibility could be to provide more time effective strategies to perform the entire knowledge elicitation acquisition phase Furthermore to reduce costs time spent travelling some techniques e g interviews cou
85. P. Comment: This is a usability issue of TACT. Further development of the TACT and its manual will profit from this feedback. Feedback on Phase 3a: Validation & Revision of Learning Goals. Positive Feedback. TOOL: The results of the automatic checks helped to reduce the effort. They provided good feedback on what points still had to be checked more closely. (Source: Coach) TOOL: The automatic formal model checks turned out to be very useful to fulfil the objectives. (Source: Coach, APs) TOOL: Anew, the check report gives a very nice overview; I also liked very much the spreadsheet that the TACT produces. With these two documents it is quite easy to check everything. (Source: AP) 5.1.7.2 Negative Feedback GENERAL: The whole process can only be performed by knowledge engineers and not by domain experts. (Source: Coach, APs) Comment: The methodology could provide more detailed guidelines, showing some typical examples of how to react to the results of the checks. Anyway, coach support is crucial in this phase. 5.2 Comparison of feedback for Prototype 2 and Prototype 3 A summarizing qualitative content analysis was applied to the answers of the KEs and Coaches in the questionnaires for the second APOSDLE Prototype (P2) and the third APOSDLE Prototype (P3). The qualitative analysis was conducted in a two-step process. First, each answer (mostly one sentence) of each respondent in the questionnaires w
86. [Screenshot content: the import form with an Is a / Part of selector, an example list of domain concepts (Vehicle, Car, Sport_Car, Truck), an "Insert list here" text area, an "Add list to the MoKi" button, and the hint "One term for each row; use the space to create a hierarchy", illustrated with the indented list Term A, Term A1, Term A11, Term A2.] Figure 2: Load List of Concepts. After clicking on the import button, all the terms will be added as concepts to the wiki, and templates for these concepts are automatically created. The list of all concepts present in the wiki can be accessed via the List Concepts functionality (see Section 4.2 at page 7). Indentation can be used to organise the concepts in an is-a hierarchy or in a part-of hierarchy. To select the appropriate hierarchy you can use the selector on the top of the page. Note: with the current version of the MoKi, the usage of the import functionality more than once could result in a loss of data stored in already created pages. Therefore we suggest to use the import functionality only once, at the start of the modelling activity, and to choose whether to import an is-a or a part-of hierarchy. Extensions of the wiki which overcome this problem and allow the import to be used several times while maintaining the consistency of data are planned and will be announced in due course. For more information about is-a and part-of hierarchies see Section 4. Note: the insertion of a flat (that is, not indented) list of terms will result in the insertion of a fl
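As an illustration of what the import step has to do with the indented list, here is a small self-contained sketch that turns such a list into (term, parent) pairs, which could then be added to the wiki as an is-a or part-of hierarchy. The two-space indentation step is an assumption made for the example; it is not taken from the MoKi implementation.

def parse_indented_list(text, indent=2):
    """Turn an indented term list into (term, parent) pairs; parent is None for top-level terms."""
    pairs, stack = [], []  # stack holds (level, term) of the current ancestors
    for line in text.splitlines():
        if not line.strip():
            continue
        level = (len(line) - len(line.lstrip())) // indent
        term = line.strip()
        while stack and stack[-1][0] >= level:
            stack.pop()
        parent = stack[-1][1] if stack else None
        pairs.append((term, parent))
        stack.append((level, term))
    return pairs

terms = """Term A
  Term A1
    Term A11
  Term A2"""

print(parse_indented_list(terms))
# [('Term A', None), ('Term A1', 'Term A'), ('Term A11', 'Term A1'), ('Term A2', 'Term A')]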
87. Phase 3 Integrated Modelling of Learning Goals TACT tool The learning goal model is created together with its alignment to the task mode This Phase uses the raw alignment between domain concepts and tasks produced in Phase 2 as a starting point to specify detailed learning goals Phase 3a Validation amp Revision Automatic Checks the learning goal model is evaluated and if necessary revised accordingly x n Figure 4 The IMM Second Version As can be seen from Figure 4 this restructuring also supports better than the steps of the IMM First Version the need for a coherent and integrated development of the different components of the APOSDLE knowledge base as it also focuses on the relation between the different models that have to be specified In addition it allows the provision of immediate and comprehensive feedback to the modellers thus increasing the effectiveness of the modelling activities learn work A anposcile 52 The work performed during the third year of the APOSDLE project has been focused around the refinement and implementation of the IMM Second Version and on a set of tools and the methodological support which realise the collaborative integrated approach depicted in Figure 3 While the realisation of this full vision was not attainable within a single year the current version of the IMM and of the suite of Modelling Tools provide a first concrete step towards
88. S CCl Coaches SAP KC and for P3 KE CCl is that experts have little time and are not available for knowledge elicitation No specific statements were made by the KEs on the clarity of objectives and explanations in this phase for P3 Again it can be assumed that the objectives were clear for them due to their experience from P2 Similarly when modelling P3 the experts were more interested because they had a better understanding of the APOSDLE system and the modelling process KE CCl In addition while knowledge elicitation did not lead to many useful results for P2 in the same domain CCl interviews with domain experts were useful for P3 CCl These results point out that for successful Knowledge elicitation from domain experts it is indispensable that the functionality of APOSDLE and the role and importance of the models for the success of the entire APOSDLE system are explained to the experts in a comprehensible manner without using APOSDLE terminology This is especially important given that the experts have little time in general and are therefore only willing to participate actively in knowledge elicitation if they can see the benefit of their effort According to the KEs this step also requires much effort and time both for the preparation P2 KE EADS as well as for structuring the information elicited from the experts P3 KE CCl ISN If different experts were involved they had different perspectives on the domain and it w
89. S pdf and collaborative 1 pdf have been selected so far for term extraction. These files are also available for downloading. Once the files have been uploaded, the user should select the language in which the documents are written. After that, terms can be extracted by clicking on Extract relevant terms or on Cluster documents and get relevant terms for clusters. The terms extracted can be viewed by clicking on the symbol next to Terms. By clicking on a concept, this is added to the MoKi and a template is shown where to enter all the information about that concept (see Section 4.1). [Screenshot of the Term Extractor page: Upload File / Choose File, Language (English), Group terms semantically, Maximum number of items (100), Extract relevant terms, Cluster documents and get relevant terms for clusters, Remove all uploaded files; uploaded files Modelling ExampleEADS pdf and collaborative pdf with Del links; an extracted Terms list with "add this term to ontology"] Figure 4: The Term Extractor.

The additional functionality Remove all uploaded files can be used to remove all the uploaded files at once. For further details on the Term Extractor please contact Viktoria Pammer. (A small sketch of a simple term-extraction approach follows below.)

4 Domain Model Management

These functionalities support the management of domain model elements. Each element is stored as a wiki page and
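The deliverable does not specify the algorithm used by the Term Extractor. Purely as an illustration of the kind of processing involved, the hypothetical sketch below ranks candidate terms in plain text by frequency after removing a few stop words; the real Term Extractor also supports PDF input, language selection and semantic grouping, which are not reproduced here.

```python
# Hypothetical sketch only -- NOT the APOSDLE Term Extractor. It ranks candidate
# terms in a plain-text document by frequency after removing a few stop words.
import re
from collections import Counter

STOP_WORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "for", "on"}

def extract_relevant_terms(text: str, max_items: int = 100):
    """Return up to max_items (term, frequency) pairs, most frequent first."""
    words = re.findall(r"[A-Za-z][A-Za-z_-]+", text.lower())
    counts = Counter(w for w in words if w not in STOP_WORDS)
    return counts.most_common(max_items)

if __name__ == "__main__":
    sample = "Simulation software solves the Maxwell equations; simulation scenarios ..."
    for term, freq in extract_relevant_terms(sample, max_items=10):
        print(term, freq)
```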
90. SDLE use case The learner wants to have a profound understanding of a lt domain element gt The Knowledge worker has a basic understanding of a topic but he still has questions like OK know what it is but how does this work Why should do it in a certain way How did this happen Why did this happen S he searches for explanations that help him to answer these questions Or s he just wants to know more about the topic to be able to understand things s he reads in documents or to be able to communicate about the topic with co workers or to be able to generate new ideas Therefore s he searches for information that contains backspecialised information historical data trends and developments relationships with other domain elements etc Material use types explanation more about APOSDLE Examples e profound knowledge of scenario techniques knowledge about the indication and principles of scenario techniques understand how scenario techniques work and why they work that way e profound knowledge of relation understand relations within a database system knowledge about which data are linked to which other data knowledge about properties of the relations e profound knowledge of REACH substance class understand the meaning of the classification of chemical substances in dependence on the date of registration amount of input toxicity environmental compatibility and intended purpose e profound knowledge o
91. [Remainder of a TACT screenshot: a use case diagram notation pane, a task description pane, the domain model elements list, and the buttons "Add as Learning Goal for", "Highlight Domain Model Elements not referenced in Learning Goals", "Learning Goal Mapping" and "Save"] Figure 6: Overview of the TACT.

... a variable can be added: Select a task (1) and an appropriate topic (2). (a) A topic can be used as a variable only if the task has no other variable, the topic has sub-topics, and the name of the topic occurs in the name of the task. (b) TACT will only let you add a variable in these cases; in all other cases the button Add Variable (3) is disabled. (A sketch of this check is given below.) 12. Click the Add Variable button (3). [TACT screenshot: the MAIN Tasks list with the Add Variable button and the description of the task "Identify resources available to achieve goals" ("During system goal modelling ... actors attain goals and achieve soft goals using resources. Therefore, to produce a complete system goal model, the analyst needs to detect all available resources that actors consume and ...")]
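The conditions under which TACT enables the Add Variable button can be expressed compactly. The following is an illustrative sketch only (TACT itself is a Java tool; the data structures and function below are hypothetical), encoding the three conditions listed above.

```python
# Hypothetical sketch of the "Add Variable" enabling rule described above.
# `task` and `subtopics_of` are simple stand-ins, not actual TACT classes.

def can_add_variable(task, topic, subtopics_of) -> bool:
    """A topic may be used as a variable of a task only if:
       1. the task has no other variable,
       2. the topic has sub-topics, and
       3. the name of the topic occurs in the name of the task."""
    has_no_other_variable = task.get("variable") is None
    has_subtopics = bool(subtopics_of.get(topic))
    name_occurs = topic.lower() in task["name"].lower()
    return has_no_other_variable and has_subtopics and name_occurs

if __name__ == "__main__":
    subtopics = {"Activity": ["Workshop", "Meeting", "Board Meeting", "Demo Meeting"]}
    task = {"name": "Prepare Agenda for Activity", "variable": None}
    print(can_add_variable(task, "Activity", subtopics))  # True
    print(can_add_variable(task, "Agenda", subtopics))    # False: Agenda has no sub-topics
```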
92. VVP 20VVorkspace VVP01 Process Modelling 20Tools TA CT 20 P3. Save the tact.jar file into a separate directory.

6.2 Your Knowledgebase Files

Files you need (XXX stands for CCI, EADS, ISN, RESCUE or SDA): the TACT will need to read the following files, so have them ready somewhere on your hard disk. They must all be in one directory:
- XXXaposdle ontology.owl: contains APOSDLE-specific concepts such as Task or Learning Goal
- XXXaposdle categories.owl: contains APOSDLE-specific categories such as learning goal types
- XXXdomain ontology.owl: contains your domain model
- XXXtask ontology.owl: contains your task model
- XXXknowrequ.txt: this is optional; it contains the knowledge required that you defined in the MoKi for your tasks

For each application partner these files are in the svn repository. Application partners: if you do not have access to the svn, please ask your coach about it.

6.3 Files that will be modified

When you save your work in TACT, XXXtask ontology.owl will be modified.

6.4 Files that will be created

When you save your work in TACT, the following files will be created in the directory where the other knowledge base files are:
- XXXtask learninggoal ontology.owl: contains the mappings between tasks and learning goals
- XXXlearninggoal ontology.owl: contains the descriptions of the required learning goals

These files are needed by FBK for stori
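Before starting TACT it can be useful to verify that all of the above files are present in one directory. The following small helper is illustrative only (it is not part of TACT); the file names are taken from the list above, with XXX as the partner prefix, and the exact separators inside the names may differ in your checkout.

```python
# Illustrative helper (not part of TACT): check that the knowledge base files
# listed above are present in a single directory before starting the tool.
# Adjust the name suffixes if your files use different separators.
from pathlib import Path

REQUIRED_SUFFIXES = [
    "aposdle ontology.owl",
    "aposdle categories.owl",
    "domain ontology.owl",
    "task ontology.owl",
]
OPTIONAL_SUFFIXES = ["knowrequ.txt"]

def check_knowledge_base(directory: str, prefix: str) -> list:
    """Return the list of missing required files for a partner prefix (e.g. 'CCI')."""
    base = Path(directory)
    missing = [s for s in REQUIRED_SUFFIXES if not (base / f"{prefix}{s}").exists()]
    for s in OPTIONAL_SUFFIXES:
        if not (base / f"{prefix}{s}").exists():
            print(f"note: optional file {prefix}{s} not found")
    return missing

if __name__ == "__main__":
    print(check_knowledge_base(".", "CCI"))
```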
93. Wiki editing it or creating links APOSDLE use case The learner who clicks on the learning goal wants to have basic knowledge about what a lt domain element gt is The knowledge worker has no or very limited knowledge about a topic and wants to have a basic understanding of it or wants to check whether his basic knowledge is accurate up to date To reach this goal s he searches for introductory texts about the topic definitions or examples Material use types introduction definition example what APOSDLE Examples e basic knowledge about creativity techniques knowledge about various creativity techniques and tools available basic knowledge about addresses knowledge that a company can have different addresses knowledge about which different addresses a company can have knowledge about different addresses of a company 58 learn work A anposcile 52 basic knowledge about REACH interest agents basic knowledge about organizations that exert political influence on the implementation of REACH basic knowledge about exceptions of REACH knowledge about substances that are not subject to the regulations of REACH e basic knowledge about model knowledge about properties and elements of a various models knowledge about different types of models required for simulation building 7 4 1 2 Profound knowledge of Definition The learning goal type Profound knowledge of
94. a base system e Know how to apply use do a MS Project ability to use the specific project management tool for project management Gantt charts resource planning etc e Know how to apply use do a substance fixtures ability to perform a survey of chemical substances in use 7 4 1 4 Know how to produce Definition The learning goal type Know how to produce means to be able to create produce or build a certain topic for instance a task model In this sense Know how to produce means the ability of a person to achieve a certain outcome without a specified rule or procedure Therefore know how to produce has to be linked to topics that refer to results e g project report or products e g build a software Example If this learning goal type is linked to a topic for instance Wiki content the learning goal know how to produce Wiki content means that a person knows the Wiki setup and notation and is able to edit the Wiki content In this case profound knowledge of Wiki content is a prerequisite of know how to produce Wiki content and therefore a task that requires the learning goal know how to produce Wiki content would also require the learning goal profound knowledge of Wiki content However this is no general rule The decision whether the ability to edit the content of the Wiki is specified by the learning goal know how to produce Wiki conten
95. able for the task could be basic knowledge about methods and basic knowledge about tools Of course it might also be convenient for the person to have profound knowledge of methods and basic knowledge about tools The knowledge engineer has to decide whether these learning goals are indispensable for performing the task or if they would be just nice to have The distinction between indispensable and dispensable learning goals is important since modelling learning goals that are just nice to have will impair the selection of adequate learning content in a concrete APOSDLE application Assign to a task learning goals for all topics and SUB TOPICS that are required for performing the task Sub topics are not inherited from their parent topics lf you want to express that a task requires knowledge about all sub topics of a certain topic e g all sub topics of MS Office in your domain model namely MS Word MS Excel and MS Power Point this has to be modelled explicitly In other words assigning the learning goal basic knowledge of MS Office does not include basic knowledge of MS Word or basic knowledge of MS Excel DIFFERENTIATE between learning goals referring to the same topic by using different learning goal types Specify learning goals by using diverse learning goal types For instance the EADS task Validate and test simulation might require
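As a hedged illustration of the guideline above (learning goals are not inherited by sub-topics and must be modelled explicitly), the following hypothetical helper expands a learning goal over all sub-topics of a topic so that each required sub-topic gets its own explicit learning goal. The topic names come from the MS Office example above; this is not APOSDLE tooling.

```python
# Hypothetical helper, not APOSDLE tooling: expand a learning goal explicitly to
# all sub-topics of a topic, since sub-topics do not inherit learning goals.

def expand_to_subtopics(goal_type, topic, subtopics_of):
    """Return explicit (goal_type, topic) pairs for the topic and all sub-topics."""
    goals = [(goal_type, topic)]
    for sub in subtopics_of.get(topic, []):
        goals.extend(expand_to_subtopics(goal_type, sub, subtopics_of))
    return goals

if __name__ == "__main__":
    hierarchy = {"MS Office": ["MS Word", "MS Excel", "MS Power Point"]}
    for goal in expand_to_subtopics("basic knowledge about", "MS Office", hierarchy):
        print(goal)
    # ('basic knowledge about', 'MS Office'), ('basic knowledge about', 'MS Word'), ...
```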
96. al of the step completed after this step Yes Please rate form 0 to 5 where 0 is very easy and 5 is extremely difficult how difficult this step was for you 0 hour Positive Experiences ADD your comments here The KE was strongly supported by technology partners in formal models development EADS owl model was provided by FBK Page 7 Integrated Modelling Methodology Collection of feedback Negative Experiences ADD your comments here Please estimate your modeling coaching efforts at this stage in terms of hours spent to perform the task ADD your comments here Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner Please give reasons for your answer The answer concerns more the technology partners involved in the operation necessary to look at the model in Prot g and to verify if nothing is missing after translation But the method seems to be very satisfying The only remark is that KE should have the possibility to export himself the MOKI s files in owl Currently this function 15 not available for the KE Please state the main differences if any between performing this step this year and last year ADD your comments here No major differences 7 Step 5 Modelling of Learning goals previously known as Formal Models Integration The goal of this step is to obtain an OWL ontology of the learning goal model via the TACT tool Feedbac
97. alidation and Revision of Domain Tasks ........ 44
5.2.5 Phase 3: Modelling of Learning Goals ........ 45
5.2.6 Phase 3a: Validation and Revision of Learning Goals ........ 45
... ........ 46
6 Conclusions ........ 48
Bibliography ........ 50
7 Appendix 1: The Meta-Model of the APOSDLE Knowledge Base ........ 51
7.1 Domain Model ........ 52
7.2 Task Model ........ 52
7.2.1 Task numbering to model a workflow ordering ........ 53
7.2.2 Modelling tasks with parameters ........ 54
7.3 Learning Goal Model ........ 56
7.3.1 Modelling learning goals with variables ........ 56
7.4 Instructional Types ........ 58
7.4.1 The learning goal types in the 3rd Prototype ........ 58
7.4.2 Material uses in the 3rd Prototype ........ 61
7.5 Relations ... ........ 62
8 Appendix 2: Statements in the Evaluation questionnaires of P2 and P3 ........ 63
8.1 Phase 0: Scope and Boundaries ........ 63
8.2 Phase 1: Knowledge Acquisition ........ 65
8.2.1 Knowledge Acquisition from documents
98. als Feedback Please state positive and negative experiences of the IMM for this step in the space below Use as much space as you need Some questions to keep in mind while providing the feedback Were the objectives clear Were the tools adequate What where the main difficulties encountered in this stage Were the explanations given clear enough Was the goal of the step completed after this step Please rate form 0 to 5 where 0 is very easy and 5 is extremely difficult how difficult this step was for you for a knowledge engineer 5 for a domain expert who can not understand the CClFormalModelsChecks txt document Negative Experiences ADD your comments here Can only be done by knowledge engineers and not by domain experts Please estimate your modeling coaching efforts at this stage in terms of hours spent to perform the task 3 4 hours Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner Please give reasons for your answer Please state the main differences if any between performing this step this year and last year ADD your comments here 9 General questions and remarks e Did the domain experts coincide with the person performing the modeling If yes did it happen in all the steps or only some o No domain experts and knowledge engineers were different persons in all steps of modelling Page 9 Integrated Modelling
99. also the following meta properties used in the templates are displayed: Description, Has default form, Has subtasks, Is a, Is part of, Synonyms. Please ignore them, as they are not exported as domain specific relations in OWL/Protégé.

4.7 Additional Functionalities

This section is empty in the current version.

5 Task Model Management

The functionalities of the Task Model Management are a subset of the ones explained for the Domain Model Management in Section 4 and have a similar behaviour. The main difference here is the different template used for tasks, shown in Figure 6. [Screenshot: the Edit Task model form for the task "Buchung" (booking), with a warning that the user is not logged in, an Annotations box (Description: "Bestellungen von Räumen, Technik, externen Dienstleistern", i.e. ordering rooms, equipment and external service providers), a Structural Information box (Concept to be used as parameter, Task id, Subtasks: Raum buchen, Externe Services buchen, Präsentationstechnik buchen) and a Knowledge required field] Figure 6: A Task Template. Please refer to your coaches for a description of the information that must be added in the fields of the different templates.

6 OWL Export Functionalities

These functionalities allow the information contained in the wiki to be exported automatically in OWL format: Export Domain Model exports the domain model, while Export Task Model exports the task model. Note: Currently only the export of the domain mod
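The deliverable does not show what the exported OWL looks like or how it is consumed downstream. As a hedged illustration, the snippet below uses the rdflib Python library (an assumption; rdflib is not a tool mentioned in this document) to load a hypothetically named exported domain model file and list the OWL classes it declares.

```python
# Illustrative only: inspecting an OWL file such as the one produced by
# "Export Domain Model". The file name is hypothetical; rdflib is an external
# library not mentioned in the deliverable.
from rdflib import Graph, RDF, OWL

def list_owl_classes(path: str):
    g = Graph()
    g.parse(path)  # rdflib guesses the RDF serialisation from the file
    return sorted(str(cls) for cls in g.subjects(RDF.type, OWL.Class))

if __name__ == "__main__":
    for iri in list_owl_classes("domain-model-export.owl"):
        print(iri)
```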
100. ameters are introduced for allowing knowledge engineers to model tasks in a compact manner i e without forcing them to specify too many specific tasks Nonetheless APOSDLE users will obtain information about specific tasks that they need in realistic learning situations For example the knowledge engineer creates the abstract task Prepare Agenda for Activity and assigns general learning goals to the task e g basic knowledge about Agenda Then s he indicates that there are different Activities i e the knowledge engineer defines the task variable and that for each of the specific activities the user needs to have profound understanding of the respective activity i e the knowledge engineer defines the learning goal variable Consequently using APOSDLE in the specific situation the user will receive general information about the abstract task e g basic knowledge about Agenda as well as specific information about the specific task For instance s he will receive information about what is a demo meeting but not the information about what is a workshop 4 3 Ground tasks abstract tasks and specialised tasks In order to deal with variables different types of tasks have to be defined Normal tasks without variables are ground tasks Tasks with variables are called abstract tasks and tasks created from tasks with variables are specialised tasks Ground task A ground task is a tas
101. an be entered. If the concept already exists, then its form is retrieved and the user can update it.

4.4 Delete a concept

The Delete a concept functionality allows a concept to be deleted from the wiki. In the current version the list of concepts is shown; the user can select the one to delete and then delete it from its template, as explained in Section 4.1. [Screenshot (Mozilla Firefox) of the MoKi development wiki page for the concept "Car": Description "This is a car", Synonyms "machine", Is a "Vehicle", the property "is driven by" with target "Driver", "Car has no subconcepts", "Car has no parts", the "Facts about Car" box, the wiki navigation side bar (wiki import, domain model management, task model management, wiki owl export functionalities) and the "edit with form" tab]
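The concept template fields shown in the screenshot above (Description, Synonyms, Is a, Is part of, domain-specific properties) can be pictured as a simple record. The sketch below is a hypothetical Python representation of such a concept page, using the Car example above; it is not how MoKi stores pages internally (MoKi uses Semantic MediaWiki pages).

```python
# Hypothetical illustration of the fields of a MoKi concept template; MoKi itself
# stores this information in Semantic MediaWiki pages, not in Python objects.
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class ConceptPage:
    name: str
    description: str = ""
    synonyms: List[str] = field(default_factory=list)
    is_a: Optional[str] = None          # parent in the is a hierarchy
    is_part_of: Optional[str] = None    # parent in the part of hierarchy
    properties: Dict[str, str] = field(default_factory=dict)  # relation -> target concept

car = ConceptPage(
    name="Car",
    description="This is a car",
    synonyms=["machine"],
    is_a="Vehicle",
    properties={"is driven by": "Driver"},
)
print(car.is_a)  # Vehicle
```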
102. are agenda for Workshop inherits the task-learning goal assignment: the specialised task Prepare agenda for Workshop gets the learning goal unspecified Workshop (an automatically created specialised learning goal, created from the abstract learning goal profound knowledge of Activity) and the learning goal know how to apply/use/do Agenda (inherited from the abstract task). Note: To each specialised task additional learning goals can be added (e.g. basic knowledge of Management), but inherited learning goals cannot be deleted. As the origin of learning goals (automatically created or manually created) might be confusing to the user of the TACT, we have added explanations that can be accessed by clicking on the Explanation button next to the learning goal of a task. These explanations are detailed in section 6.8.

The guidelines that were introduced in section 2 are described in more detail hereinafter.

Assign to a task ALL learning goals that are INDISPENSABLE, and DO NOT ASSIGN learning goals that are only nice to have for performing the task. This is easiest done by imagining several concrete situations where a person performs the task, and then assigning all learning goals that are required in ALL those situations. Example: The ISN task Detecting methods and tools would definitely always require knowledge about which different methods and tools are available. Therefore the learning goals that are indispens
103. as difficult to find overlaps KE ISN i e agreement between the experts In order to facilitate the preparation of modelling efforts future versions of the modelling method could include templates checklists and guidelines for the preparation and analysis of different knowledge elicitation techniques e g card sorting When modelling P2 some experts had difficulties to think in processes KE ISN and to make implicit knowledge explicit KE ISN CCI In the case where the knowledge engineers were domain experts themselves CNM the identification of tasks and concepts was easy for the expert No such experiences were reported for P3 This might also be related to the fact that the domain experts had a better understanding of the APOSDLE functionality and they therefore had a better understanding of what kind of knowledge they should provide Further statements of the KEs in the questionnaires concerned different knowledge elicitation techniques Knowledge elicitation techniques were adequate according to two KEs P2 KE EADS P3 KE ISN Whereas in some domains card sorting was useful P2 KE ISN Coach SAP and card sorting was fun P2 KE CCl it did not lead to many useful results in other domains P2 KE CCI Coach KC According to coaches laddering was effective for the task model P2 Coach SAP and for the domain model P2 Coach KC For modelling P3 a modified version of card sorting Section 3 3 3 5 was ap
104. as assigned to one of the modelling phases: Scope and Boundaries; Knowledge Elicitation (both from documents and experts); Modelling of domain and tasks; Validation and Revision of Domain/Tasks; Modelling of learning goals; and Validation and Revision of Learning Goals. Second, similar answers were paraphrased into statements. For instance, the answers of a respondent A ("Another problem was that experts don't have time") and of a respondent B ("each hour of the real experts is so hard to get") were summarised into the statement "Experts had very little time". The resulting statements are listed in Appendix 2. For our analyses we could draw upon the filled questionnaires of 4 knowledge engineers (ISN, CCI, EADS, CNM) and 2 coaches (KC, SAP) for P2, and upon the questionnaires of 3 knowledge engineers (EADS, ISN, CCI) and 3 coaches (SAP, KC, FBK) for P3. Table 1 gives an overview of the assignment of coaches to knowledge engineers (KE) for modelling APOSDLE's second and third prototype (P2 and P3). Table 1: Application Partner / Coach assignment for the modelling activities within APOSDLE. Whereas in the case of CNM the knowledge engineers were domain experts themselves, in all other domains (ISN, CCI, EADS) the knowledge engineers and the domain experts were different groups of persons.

5.2.1 Phase 0: Scope and Boundaries

Even though all KEs (CCI, ISN, EADS, CNM) stated that it was
105. ask models of P3 were translated into formal models in a fully automatic way This way modelling efforts were not duplicated anymore and fewer tools were used which was seen as beneficial Coach SAP Consequently it might have been the case that the benefits of informal modelling were more obvious in the Moki than they had been in the semantic Media Wiki from P2 A few more conceptual issues came up during the evaluation Two KEs stated that it was difficult to find labels for domain concepts P2 KE EADS CCl Another issue stated by the KEs was the question of granularity for the KEs it was hard to decide if a model has the right granularity P2 KE ISN CCl Even though these two issues were only brought up for P2 we assume that the questions of the right granularity and the right labels for concepts are general ones which were also present when modelling P3 We believe that the reason why the problems were not stated for P3 was that the experts had already experience both with modelling and with APOSDLE and they therefore had a better feeling for what is the right granularity of the models and what are good labels for concepts in APOSDLE However the question what is the right granularity and what are the right labels for concepts cannot be answered in general Different factors need to be taken into account such as the resources documents available or the target group of people who should learn with the APOSDLE instance For model
106. ass of T lt X gt

7.4 Instructional Types

The Instructional Types contain lists of learning goal types and material uses, which are used in the definition of learning goals and in the annotation of documents respectively. The list of learning goal types and material uses is known a priori, i.e. it is not defined at modelling time. The 1:n relation trigger is used to describe that a learning goal can trigger specific material uses (a sketch of this mapping is given below). The list of learning goal types in the 3rd prototype is contained in Section 7.4.1. The list of material uses is described in Section 7.4.2.

7.4.1 The learning goal types in the 3rd prototype

The learning goal types used in the 3rd prototype are:
- basic knowledge about
- profound knowledge of
- know how to apply/use/do a
- know how to produce
- unspecified

In the following we illustrate each one of them in detail.

7.4.1.1 Basic knowledge about

Definition: The learning goal type basic knowledge about means that a worker needs basic knowledge about the topic under consideration in order to perform the task successfully. Basic knowledge includes knowledge about dates, names, events, places, prices, titles, major theories. The learning goal type basic knowledge about does not include the ability to use, apply, edit or transform a topic. Example: For instance, basic knowledge about APOSDLE Wiki does not include navigating in the
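To make the trigger relation concrete, the following sketch collects the learning goal type to material use mappings that are spelled out in the use-case descriptions of this appendix (basic knowledge about, profound knowledge of, know how to apply/use/do a, know how to produce, unspecified). It is an illustrative Python table only, not APOSDLE code.

```python
# Illustrative only: the "trigger" relation between learning goal types and
# material use types, as listed in the use-case descriptions of this appendix.
TRIGGER = {
    "basic knowledge about":       ["introduction", "definition", "example what"],
    "profound knowledge of":       ["explanation", "more about"],
    "know how to apply/use/do a":  ["how do", "demonstration", "checklist", "example how"],
    "know how to produce":         ["guideline", "checklist", "template", "example how", "constraint"],
    # "unspecified" triggers every material use type that exists for a topic.
    "unspecified":                 ["<all material use types>"],
}

def material_uses_for(learning_goal_type: str, all_uses: list) -> list:
    uses = TRIGGER.get(learning_goal_type, [])
    return all_uses if uses == ["<all material use types>"] else uses
```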
107. asy and 5 is extremely difficult how difficult this step was for you 3 Positive Experiences We performed together with our Coach four very nice workshops using sophisticated knowledge elicitation techniques After these workshops we had a lot of useful data Negative Experiences It is a challenge to reduce the data from the domain experts to tasks and concepts also it is not very easy to find overlaps Each DE has his own view on the things and his own view on how important some things are Please estimate your modeling coaching efforts at this stage in terms of hours spent to perform the task 1 PM 140 h Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner Yes the data was very useful Please state the main differences if any between performing this step this year and last year Last year we didn t spend much effort in this step the problem is that I assume that the quality of the models depend highly from this step 4 Informal Modeling of Domain and Tasks in 1 Starting from the knowledge elicited in Step 1 a b the main goal of this step 15 to obtain an informal but rather complete description of the domain model and task model in a Semantic MediaWiki called After this modeling step the informal concept model should only consist of relevant domain concepts see 5 2 Feedback Please state positive and negative experiences of the IM
108. at list of concepts in the MoKi. The information about the is a and part of hierarchy is automatically saved in the templates of the relevant concepts (see Section 4.1).

3.2 Load List of Tasks

The behaviour of Load List of Tasks is analogous to the one of Load List of Concepts illustrated in Section 3.1. The main difference is that indentation is used here to store information about the sub-task hierarchy (see Figure 3). [Screenshot of the Load list of tasks page, showing an example indented list of tasks: Prepare experiment; Verify pre-conditions for experiment; Start experiment; End experiment; Collect results of experiment. One term per row; spaces create the hierarchy] Figure 3: Load List of Tasks. For more information about the sub-task hierarchy please contact your coaches.

3.3 Term Extractor

The Term Extractor allows terms to be extracted from documents and added to the concept list of the MoKi. Supported file formats are ascii and pdf. First the user has to upload the files from which the terms have to be extracted. To do that, a Choose files button allows the user to browse the local file system and select files. Once a file has been selected, it can be added to the list of current files by clicking on the Upload File link. The uploaded files are displayed in the section Files. In the example in Figure 4 two files, Modelling ExampleEAD
109. at one variable in the task is a useful means to reduce modelling efforts. However, we have learned (e.g. from the final number of tasks in the model) that using variables very easily leads to a huge number of tasks. This has to be taken into account for coaching. (Source: AP, Coach)

GENERAL: The tool chosen is adequate for this step of the methodology, especially if domain experts are actively involved in the modelling phase. (Source: Coach)

GENERAL: The models after this step were already in an almost final form. This was a result of the intuitive MoKi. (Source: Coach)

ORGANISATIONAL: Some internal documents were prepared on the task and domain model and were very useful to speed up the process. (Source: AP)

ORGANISATIONAL: There were fewer tools; especially not using YAWL saved a lot of time for us. (Source: Coach)

ORGANIZATION: It has been much better to have a separate MoKi for each partner. (Source: AP)

TOOL: MoKi much easier to use, fewer tools, more experience. (Source: Coach, AP)

TOOL: The MoKi has much improved since last year. It is as comfortable to use as Protégé. (Source: APs)

TOOL: There is a huge difference between the MoKi and the Wiki we used last year. MoKi is easier to handle, a user can model very quickly due to the import function, and the browse functionalities allow a good conceptual overview over the models. (Source: APs)

TOOL: Using the Semantic MediaWiki was very easy th
110. ated Modelling Methodology First Version guides the process of creation of the application domain dependent parts of the APOSDLE Knowledge Base The APOSDLE Knowledge Base provides the basis for reasoning within the APOSDLE System Compared with this first version several changes were made both in the structure of the methodology and in the tools which are used to support it These changes take into account the extensive feedback that was collected during the development of the first version of the Application Partner Domain Models which were used in the APSODLE Prototype 2 The second version of the methodology consists of four main phases which cover the entire process of model creation from the initial selection of the application domain to its final specification Phase 0 Scope amp Boundaries n this phase the scope and boundaries of the application domain are determined and documented The first step of this phase is to use questionnaires and workshops to elicit the main tasks and learning needs of the different Application Partners in order to identify candidate application domains for learning also called learning domains The candidate application domains are then discussed and the final domain is then chosen and briefly documented Furthermore resources which may be relevant for chosen learning domain are collected The key aspect of this phase is to support the Application Partners to identify a learning domain which is appro
111. b We could easily build a collection of relevant learning resources Please estimate your modeling coaching efforts at this stage in terms of hours spent to perform the task ADD your comments here 80 hours Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner Please give reasons for your answer In our opinion the goals were fulfilled within this step We had a simple hierarchical and completed domain model which was easily to understand for the domain experts We also had a tasks model which was concerted with the CCI s workflow and a great collection of relevant learning resources Please state the main differences if any between performing this step this year and last year We used both times the same methods 2 Step 1a Knowledge elicitation from Digital Resources The goal of this sub step is to extract as much knowledge as possible from the digital resources provided by the Domain Experts The desired output of Stage la is a number of candidate domain concepts Feedback Please state positive and negative experiences of the IMM for this step in the space below Use as much space as you need Some questions to keep in mind while providing the feedback Were the objectives clear Were the tools adequate Was the knowledge elicited useful What where the main difficulties encountered in eliciting knowledge from resources Were the explanat
112. back useful for the evaluation and improvement of the Integrated Modelling Methodology Please state your comments the way you prefer you can provide feedback in form of short sentences and bulleted lists as well as more complex descriptions Both formats are perfectly acceptable Note for APs Please provide feedback only for the steps you have already completed Partner Name SAP Coach of CNM and CCI 1 Step 0 Scope amp Boundaries and Resources Collection The goal of this step is to define the scope and boundaries of the application domain to be modeled and to gather some resources related to the application domain The output should be a first preliminary list of tasks process scribble a first list of domain concepts and a collection of relevant learning resources Feedback Please state positive and negative experiences of the IMM for this step in the space below Use as much space as you need Some questions to keep in mind while providing the feedback Were the objectives clear Were the tools adequate What where the main difficulties encountered in choosing a domain and collect the resources Were the explanation given clear enough Was the goal of the step completed after this step Was there something missing Please rate form 0 to 5 where is very easy and 5 is extremely difficult how difficult this step was for you CCI 0 CNM 3 Positive Experiences ADD your comments here Negative
113. ble manner Yes Please state the main differences if any between performing this step this year and last year No revision of the models has been necessary at this stage this year probably due to the higher quality models developed during the informal model phase 9 General questions and remarks e Did the domain experts coincide with the person performing the modeling If yes did it happen in all the steps or only some The real modelling activities have been performed only by Knowledge Engineers Page 5 Integrated Modelling Methodology Collection of feedback e Do you think that the domain you have chosen is appropriate for learning support with APOSDLE I can t really say much about this at this point However I have the feeling that the domain chosen is somehow too general Do you have any additional remarks or suggestions for improvement Page 6
114. Figure 5: A template for a concept.

When deleting a concept, the references to that concept will remain pending and have to be fixed by hand. To avoid this problem we suggest to delete only concepts which have no sub-classes or sub-parts. It is also better to ensure that the concept to be deleted does not provide domain and range information for properties. If a concept to be deleted has sub-class and/or sub-part concepts, the is a and part of browsers can be used to move the children concepts to their new location in the hierarchy, or this can be done by editing the same information in the templates.

4.5 IsA Browser / IsPartOf Browser

The IsA Browser functionality allows the is a hierarchy of concepts to be visualised and updated using a graphical interface and drag-and-drop functionalities. Once the hierarchy has been reorganised, the user has to click on the Save Tree button to save the changes in the MoKi; the information on the is a hierarchy in the templates is automatically updated. The behaviour of the IsPartOf Browser functionality is analogous: it allows the is part of hierarchy to be managed instead of the is a hierarchy.

4.6 List properties

The List properties functionality creates a list of all the domain specific properties (relations) contained in the wiki. Note: In the current version of the MoKi
115. character would probably be too specific and in most cases would not make sense for a user to be recognized as a separate task.

3.4.3.3 Domain concept Template

Figure 5 shows a screenshot of a filled form associated to a domain concept template in MoKi. [Screenshot of the "Modify concept ASERIS BE" form: an Annotations box with Description ("ASERIS BE EMC2000 software solves the Maxwell equations in the frequency domain by a finite boundary element method; ASERIS EMC2000 is widely used for EMC and antenna applications; applications are radio antennas on automobiles and aircraft, satellite telecommunication systems, telecommunication antenna radiation patterns, radar cross section, etc.") and Synonyms; a Hierarchical Structure box (Is a: Solver; Is part of); and a Properties box (Property: implements, Property target: BEM; Add another)] Figure 5: Screenshot of a form associated to the domain concept template in MoKi.

For each concept we ask for a Description and Synonyms in the Annotations box; these elements are modelled as properties of type String in MoKi. We also ask for some relations with other concepts, suggesting pre-defined relations such as Is a and Is part of in the Hierarchical Structure box, or allowing the user to add domain dependent relations in the Properties box. In the latter case the user can specify the relation
116. cified is linked to ALL material uses. For instance, if there is a learning goal that is called unspecified database, a user selecting the learning goal will receive snippets of all types, i.e. examples for databases, definitions, guidelines, checklists and all other snippets that are available for the topic.

3.1 Basic knowledge about

Definition: The learning goal type basic knowledge about means that a worker needs basic knowledge about the topic under consideration in order to perform the task successfully. Basic knowledge includes knowledge about dates, names, events, places, prices, titles, major theories. The learning goal type basic knowledge about does not include the ability to use, apply, edit or transform a topic.

Example: For instance, basic knowledge about APOSDLE Wiki does not include navigating in the Wiki, editing it or creating links.

APOSDLE use case: The learner who clicks on the learning goal wants to have basic knowledge about what a <domain element> is. The knowledge worker has no or very limited knowledge about a topic and wants to have a basic understanding of it, or wants to check whether his basic knowledge is accurate and up to date. To reach this goal s/he searches for introductory texts about the topic, definitions or examples.

Material use types: introduction, definition, example what.

APOSDLE Examples:
- basic knowledge about creativity techniques
117. ck Yes The models were quickly created and described on the MOKI Please state the main differences if any between performing this step this year and last year ADD your comments here Significant improvements were done for P3 informal modeling tool MOKT and process template has been better organized and it was not necessary to KE to know the detailed Wiki syntax to enter data KE was able to see the entire set of tasks their description with the learning goals domain concepts Possibility to create groups of concepts and task with the very easy List typing Each AP had his own MOKI 5 Step 3 Informal Models Validation and Revision The goal of this step is to have the domain model and task model validated completeness and correctness by the DEs The step was supported by guidelines by results from automatic checks and by an on line ontology questionnaire Feedback Please state positive and negative experiences of the IMM for this step in the space below Use as much space as you need Some questions to keep in mind while providing the feedback Were the objectives clear Yes Were the tools adequate Yes What where the main difficulties encountered in this stage Were the explanations given clear enough Yes Was the goal of the step completed after this step yes Please rate form 0 to 5 where 0 is very easy and 5 is extremely difficult how difficult this step was for you 1 no domain exper
118. come of chapter listing are mainly concepts which unlike in concept listing already have some structure Depending on the domain chapter listing takes approximately 30 45 minutes 11 learn work A anposcile 52 3 3 3 5 Card Sorting Card sorting is a technique that is very often applied in information design processes in order to generate an overall structure for information as well as suggestions for navigation menus and possible taxonomies In Phase 1 of the modelling methodology it was performed with the DEs to find relations between domain concepts and for identifying new relevant domain concepts Card sorting is a quick inexpensive and reliable method that can provide insight to the DE s view of the domain and that can make tacit knowledge explicit Card sorting was applied for two different purposes in the process For eliciting expert knowledge with respect to the structure of tasks and concepts ll For eliciting expert knowledge with respect to the knowledge required for each task For eliciting expert knowledge with respect to the structure of tasks and concepts sub concept hierarchies sub task hierarchies card sorting is performed as follows The KE with the help of coaches if applicable prepares a set of cards objects with a clear description for each one of them This set of objects could be a set of resources or a set of previously chosen domain task concepts The KE shuffles the cards and gives them to the
119. creates the learning goal with the type Unspecified per default. 11. Click Save to store all your changes (changes of learning goal types, new learning goals, removing old learning goals, etc.).

6.7 Creating the learning goal model: advanced mode with variables

This section deals with the possibility to add variables to tasks and explains what happens technically in this case; the conceptual meaning of variables is described in Section 4. (An illustrative sketch of the specialisation mechanism is given below.) [TACT screenshot: the MAIN Tasks list with task names and descriptions, e.g. "Gather data on human activity", "Identify resources available to achieve goals", "Analyse the scope/validity domain of the new tool", "Analyse the different action sequences envisaged in a certain situation", "Plan and prepare acquisition sessions, decide on acquisition method", "Finish HAM data gathering"; the Add Variable button; a Specialised Task indicator; the option "Show only Tasks without Learning Goals"; and the Domain Model Elements filter with "Also Known as" details]
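What happens technically when a variable is added is described in the surrounding sections: for each sub-topic of the variable topic a specialised task is created, which inherits the learning goals of the abstract task and gets an automatically created specialised learning goal for the sub-topic (with type unspecified by default). The sketch below is an illustrative Python rendering of that mechanism using the "Prepare Agenda for Activity" example from this deliverable; it is not TACT code, and the goal lists shown are examples only.

```python
# Illustrative sketch (not TACT code) of task specialisation with a variable:
# each sub-topic of the variable topic yields a specialised task that inherits
# the abstract task's learning goals and gets an automatically created
# specialised learning goal of type "unspecified" for the sub-topic.

def specialise(abstract_task, variable_topic, subtopics, inherited_goals):
    specialised_tasks = []
    for sub in subtopics:
        task_name = abstract_task.replace(variable_topic, sub)
        goals = list(inherited_goals)                      # inherited, cannot be removed
        goals.append(("unspecified", sub))                 # automatically created goal
        specialised_tasks.append({"name": task_name, "learning_goals": goals})
    return specialised_tasks

if __name__ == "__main__":
    tasks = specialise(
        abstract_task="Prepare Agenda for Activity",
        variable_topic="Activity",
        subtopics=["Workshop", "Meeting", "Board Meeting", "Demo Meeting"],
        inherited_goals=[("know how to apply/use/do", "Agenda"),
                         ("basic knowledge about", "Agenda")],
    )
    for t in tasks:
        print(t["name"], "->", t["learning_goals"])
```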
120. cult to insert variables in the MoKi Positive Experiences ADD your comments here Very good support to KE in knowledge structuring and formalization Some EADS internal document were prepared on task and domain model and were very useful to quicker the process Better to have a separate MOKI for each partner Useful Import functionality Good visualization of tasks in dedicated tableau Useful Is a and Part of browser Good manual user for the Moki Use of variables in informal model to reduce KE workload but limited to one due to potential complexity for visualization function No strong difficulties to insert variables in the MOKI usefull post checks Negative Experiences ADD your comments here Deleted concepts that still remain available on the wiki When creating them difficult to have clear vision of how variables will be used further in the P3 There is missing the moki to owl export tool this tool is interesting EADS Please estimate your modeling coaching efforts at this stage in terms of hours spent to perform the task about 1 day the preparatory document built before the MOKI with some expert feedback also strongly accelerate the modeling process ADD your comments here Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner Please give reasons for your answer Page 5 Integrated Modelling Methodology Collection of feedba
121. d last year Card Sorting with CNM to structure from the beginning 3 Step 1b Knowledge elicitation from Domain Experts Des The goal of this sub step is to elicit knowledge directly from the DEs The desired output is a refined task list and an extensive list of candidate domain concepts Feedback Please state positive and negative experiences of the IMM for this step in the space below Use as much space as you need Some questions to keep in mind while providing the feedback Were the objectives clear Were the tools adequate Did the domain experts coincide with the person performing the modeling What where the main difficulties encountered in this stage Were the explanations given clear enough Was the goal of the step completed after this step ie was a refined task list and an extensive list of candidate domain concepts ready after this step Please rate form 0 to 5 where is very easy and 5 is extremely difficult how difficult this step was for you 0 CCI 3 CNM Positive Experiences Page 3 Integrated Modelling Methodology Collection of feedback CCI everything was done by CCI not very much coaching effort CNM Review of an 1Tel document Main effort was in this Stepp to get a deeper understanding of the new field at CNM Therefore we transformed the Card Sorting Result into a hierarchy by understanding the sorting piles from the different dorting rounds as attributes
122. da and knowledge that is strongly related to the concrete application of the task e g profound knowledge of Board Meeting This could result in two different modelling decisions e Ambiguous modelling a very generic topic is used modelled as a learning goal Example The task Prepare Agenda for Activity requires the learning goal basic understanding about Activity meaning that in a concrete situation e g for preparing the agenda of a workshop exactly one specific sub topic of Activity e g VVorkshop is required for performing the task APOSDLE cannot deal with this ambiguity e Detailed modelling the task is broken down into more specific tasks Example The task Prepare Agenda for Activity is broken down into Prepare agenda for Meeting Prepare agenda for Board Meeting Prepare agenda for Demo Meeting Prepare agenda for Workshop Then learning goals are assigned to these more specific tasks However this causes extra work for the knowledge engineer as s he needs to model all tasks separately and as s he needs to assign the knowledge that is always required for preparing the agenda of an activity e g know how to do use apply Agenda to each of these more specific tasks by hand learn work A anosclle 42 The disadvantages of these two ways of modelling should overcome by using task variables 4 2 The idea of tasks with variables Tasks with variables also called par
123. diverse learning goals relating to the topic Simulation Software First the worker might need to have basic knowledge about Simulation Software i e she needs knowledge about the software Second the worker might also need to have profound knowledge of Simulation Software Finally she might have to Know how to apply use do a Simulation Software In another task e g Define the software and hardware architecture the worker might only need basic knowledge about Simulation Software and profound knowledge of Simulation Software but she might not need to apply it DO NOT RELY on the suggested topics that stem from the knowledge required section in the APOSDLE Wiki 13 15 aposcile learn work The knowledge required section of the APOSDLE Wiki was filled in a rather early modelling stage Consequently those suggested topics might be incomplete or some of them might be wrong Therefore please take those topics only as suggestions or hints and do not hesitate to re assess their relevance for performing the task It is absolutely normal if you as a knowledge engineer will change your point of view and your understanding of required knowledge Modelling is an iterative process that requires revisions at certain stages Perform a SECOND TRIAL to review the task learning goal assignment Usually at the beginning of a task learning goal mapping one is
124. e D1 3 Integrated Modelling Methodology First Version, where state-of-the-art methodologies for Enterprise Modelling are presented and the motivations underlying the development of the IMM are discussed. A detailed description of the meta-model (schema) of the APOSDLE knowledge base is contained in Appendix 1. The domain-specific parts of the APOSDLE knowledge base can be considered an example of an enterprise model as described in Fox & Grüninger (1998).

2.1 The IMM First Version

The First Version of the Integrated Modelling Methodology, presented in Deliverable D1.3 Integrated Modelling Methodology First Version and briefly summarised in Figure 2, was built around a strict waterfall paradigm. In this paradigm the starting point of the modelling activities is a collection of informal knowledge provided by knowledge experts; this knowledge is transformed by knowledge engineers into a set of formal statements, possibly with the support of semi-automatic transformation tools, which constitute the final model. Evaluation steps are also planned at specific stages of the process.

Phases: Phase 0 Scope & Boundaries: the scope and boundaries of the application domain are determined and documented. Phase 1 Knowledge Acquisition: knowledge is (i) elicited from domain experts, using techniques like interviews, card sorting and laddering, and (ii) extracted from available digital resources relevan
125. e evaluation of the Methodology is described This evaluation uses feedback obtained using the same questionnaire that was used for the evaluation of the Integrated Modelling Methodology First Version The use of similar questionnaires has allowed not only to collect feedback but also to make possible a comparative analysis between the two versions of the IMM 1 3 Related Documents This deliverable is related to the following documents e APOSDLE Deliverable D1 3 Integrated Modelling Methodology First Version e APOSDLE Deliverable D2 7 Conceptual Framework amp Architecture Version 2 APOSDLE Deliverable D6 9 Second Version of Application Partner Domain Models APOSDLE Deliverable 04 5 Software Architecture for 3rd APOSDLE Prototype APOSDLE Deliverable D1 9 3rd Prototype APOSDLE Work amp Modelling Tools APOSDLE Deliverable D2 8 amp D3 5 The APOSDLE Approach to Self directed Work integrated Learning learn work A anposcile 52 APOSDLE approach to work integrated learning is based on a general purpose domain independent learning platform plus a largely domain specific APOSDLE knowledge base whose coarse grained schema is illustrated in Figure 1 This Knowledge base formalises key aspects of the environment in which users operate the business domain in which they act e the tasks activities they may perform e the learning goals they may need to acquire li also formal
126. e g they stated that knowledge of the domain was not documented KE EADS no documents were available which could be used for learning and teaching KE CCI documents were not centrally stored KE CCl EADS no workflow descriptions were available KE CCl etc Despite these problems In the end knowledge engineers found more resources than they had expected Coach KC In the initial questionnaire used in P3 no more difficulties in this phase were mentioned by the KEs and coaches On the contrary the same KE CCI who had expressed great difficulties when collecting documents for P2 stated that It was easy to find crucial documents in P3 At this point the question remains if this is due to the methodology the experience of the KEs or due to addressing another domain with more documents In any case it can be assumed that the KEs did not report any more problems for this phase because they knew that they would find documents and they knew what to look for Even though the Scope amp Boundaries phase was described to be generally difficult for P2 and P3 by most KEs and coaches nevertheless it was seen by them as a positive experience All stated that as a positive side effect the company thinks about implicit knowledge and implicit processes in the company in a structured manner P2 EADS CCl P3 ISN SAP which would not have been the case without modelling their domains for APOSDLE 5 2 2 Phase 1 Knowledge Acquisiti
127. e guidelines checklists templates examples and or constraints that give some structure in performing the task without giving a recipe or prescription Material use types guideline checklist template example how constraint APOSDLE Examples e know how to produce final report ability to write a final project report for the customer includes the knowledge of standards and norms for layout organization references etc know how to produce scenario ability to generate simulation scenarios that enable identifying the major entities that must be represented by a simulation know how to produce REACH material for external consulting ability to generate documents which the IHK employees can hand on to the costumers during the consulting process 3 5 unspecified Definition The learning goal type unspecified is used to express that the task under consideration requires all kinds of knowledge about a certain topic Example For instance if there is a learning goal that is called unspecified wiki a user selecting the learning goal will receive snippets of all types i e examples of wikis definitions guidelines checklists and all other snippets that are available for the topic APOSDLE use case The learner wants to receive all snippets that are available for a specific topic Material use types All material use types APOSDLE Examples There are no specific examples thi
128. e resources related to the application domain The output should be a first preliminary list of tasks process scribble a first list of domain concepts and a collection of relevant learning resources Feedback Please state positive and negative experiences of the IMM for this step in the space below Use as much space as you need Some questions to keep in mind while providing the feedback Were the objectives clear Were the tools adequate What where the main difficulties encountered in choosing a domain and collect the resources Were the explanation given clear enough Was the goal of the step completed after this step Was there something missing Please rate form 0 to 5 where is very easy and 5 is extremely difficult how difficult this step was for you 3 Positive Experiences ADD your comments here The EADS domain for P3 remained the same as for P2 and we have used the P2 version of models as a basis for P3 modeling activity Negative Experiences ADD your comments here Page 1 Integrated Modelling Methodology Collection of feedback The major difficulty in EADS task and domain model building was gt To represent all the complexity of Simulation domain in a task model limited to only one variable gt To have a domain model readable and understandable for the end user Please estimate your modeling coaching efforts at this stage in terms of hours spent to perform the task
129. e topic Card Sorting a special knowledge elicitation technique the learning goal know how to apply use do a Card Sorting means to know how to do conduct a Card Sorting session with domain experts how to prepare Card Sorting sessions and how to log the results However Know how to apply use do a card sorting does not mean to Know in which situation Card Sorting is indicated or which are advantages and disadvantages of the technique Therefore the learning goal type Know how to apply use do a does not include Basic knowledge about or Profound knowledge of a certain topic APOSDLE use case The learner wants to know how to apply use do a lt domain element gt The knowledge worker wants to know what the next steps are in a procedure or a well defined task that s he has to perform but that s he is not able to carry out without some guidance S he searches for information that tells him which steps there are and which order they have to be completed This information is like a recipe or prescription Furthermore s he likes to have an example or demonstration of the procedure Material use types how do demonstration checklist example how APOSDLE Examples e Know how to apply use do a core learning goal analysis ability to perform a core learning goal analysis for a company e Know how to apply use do a Er2 ability to use the text data format er2 for loading data into another dat
130. e transferred to P3 This step took a lot of time 1 Coach SAP Objectives and explanations o Description of the inheritance in the TACT was missing 1 KE CCl TACT o Inheritance of learning goals for sub concepts was missing 1 KE CCl o Multihierarchies were not visible in the TACT although they were in the MoKi 1 KE CCl Variables o Variables were useful to reduce the modelling effort 1 KE ISN 1 Coach KC o Variables were not used to keep the model simple 1 KE CCI 1 Coach SAP o It was difficult to understand how variables would be used in APOSDLE 1 KE EADS o It was not difficult to understand how to use variables 1 KE ISN o It was not difficult to insert variables in the MoKi 2 KE ISN EADS 1 Coach FBK o Variables were inserted in the MoKi 2 KE ISN EADS o Using variables can drastically increase the number of tasks 1 Coach KC o Inserting variables in the TACT is easy 1 KE ISN o It is useful that you insert variables in the MoKi and get hints in the TACT 1 KE ISN 8 6 Phase 3a Validation and Revision l 8 6 1 1 Feedback on the IMM for P2 This step was performed after the evaluation of IMM for P2 o It is hard to say when a formal model is finished 1 Coach KC 8 6 1 2 Feedback both for the IMM of P2 and P3 This step was performed after the evaluation of IMM for P2 8 6 1 3 Feedback on the IMM for P3 7
131. eady this learning goal type This procedure took a lot of time Another difficulty was that our multihierarchies were not visible within the tool although they were within the Semantic Media Wiki Please estimate your modeling coaching efforts at this stage in terms of hours spent to perform the task ADD your comments here 80 hours Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner The result looks reasonable in the TACT tool and in the ongoing CCI FormalModelsCheck txt file even if the learning goal type basic knowledge about for the task Beratung erlernen has vanished in the check file The true result will only be visible but in the APOSDLE prototype Please state the main differences if any between performing this step this year and last year The TACT tool has much more improved from the point of usability Learning goal types are now clearer and more adequate for us Page 8 Integrated Modelling Methodology Collection of feedback 8 Step 6 Formal Models Validation At the end of this step all the models created domain task and learning goals should be formally correct and complete The goal of this step is to have the models validated completeness and correctness by the DEs The step was supported by guidelines and by results from automatic checks similar to the one of step 3 but which also involve checking the quality of learning go
132. ection 5 In Section 6 the TACT tool and its features are described Eventually we have added a lessons learned section Section 6 from our user test learn work A anosclle 42 Some brief preparation will make the task learning goal mapping much easier for you and will notably enhance the quality of the resulting task learning goal model Before starting the task learning goal assignment Carefully read through the TACT User Manual Make sure that you have a clear understanding of the meaning of the different learning goal types Quickly skim through the list of topics in order to have a full picture of which topics are modelled in your domain Quickly skim through the list of tasks Maybe you don t remember how exactly you modelled your tasks especially because you probably did a lot of revisions and cannot exactly remember the final version Try to bring to mind what you meant by each task Take a closer look at similar tasks and think of the differences between them This shall help you avoiding double work when using the TACT Assigning tasks and learning goals 1 Assign to a task ALL learning goals that are INDISPENSABLE and DO NOT ASSIGN learning goals that are only nice to have for performing the task Assign to a task learning goals for all topics and SUB TOPICS that are required for performing the task Sub topics are not inherited from their parent topics DIFFERENTIATE between
133. 2.2.3 Visualization Functionalities. These functionalities allow the user to generate different types of graphical overviews of the models; they help the actors to deal with a global view of the models and not only with single model elements. In particular the tool allows two kinds of overviews of the model. In the tabular-based view the user sees a table listing all the elements of the domain model or the task model, where for each element some relevant information is shown, e.g. its description, the concepts of which it is a specialisation (for domain elements), its subtasks (for tasks), and more. In the tree-based view, called IsA/PartOf Browser, a tree-like view shows the hierarchy of the domain elements according to either the subclass or the part-of relation. This tree-like view is dynamically created from the content of the MoKi pages. The user has the possibility to expand/collapse only parts of the tree, thus allowing him or her to efficiently browse even large and complex models. Actually, this is not just a static visualization, since the user can easily rearrange via drag and drop the taxonomy and partonomy of concepts in the domain model, and the changes performed within the browser are propagated to the pages describing the elements involved. Figure 12 shows an example of the tree-based view (the IsA Browser on the domain model).
134. ed for performing the task Sub topics are not inherited from their parent topics In order to express that a task requires knowledge about all sub topics of a certain topic e g all sub topics of MS Office in the domain model namely MS Word MS Excel and MS Power Point this has to be modelled explicitly In other words assigning the learning goal basic knowledge of MS Office does not include basic knowledge of MS Word or basic knowledge of MS Excel 3 Differentiate between learning goals referring to the same topic by using different learning goal types For instance the EADS task Validate and test simulation might require different learning goals relating to the topic Simulation Software First the worker might need to have basic knowledge about Simulation Software i e she needs knowledge about the software Second the worker might also need to have profound knowledge of Simulation Software Finally she might have to Know how to apply use do a Simulation Software In the task Define the software and hardware architecture on the other hand the worker might only need basic knowledge about Simulation Software and profound knowledge of Simulation Software but she might not need to apply it 4 Do not rely only on the suggested topics that stem from the knowledge required section in the MoKi The knowledge required section
135. edge Base provides the basis for reasoning within the APOSDLE System The Methodology has been accurately followed by each Application Partner to build their specific APOSDLE Knowledge Base The specific models created are described in Deliverable D6 9 Second Version of Application Partner Domain Models This document provides an overview of the evolution from the IMM First Version to the IMM Second Version Section 2 a detailed description of the current version of the Integrated Modelling Methodology together with the tools that were developed to support the Application Partners in their modelling activities Sections 3 and 4 and finally an evaluation of the Methodology and a comparison between the first and the second version Section 5 1 2 Scope of this document This deliverable is an updated version of the previous one D1 3 Integrated Modelling Methodology First Version Compared with this first version several changes are made both in the structure of the Methodology and in the tools which are used to support it These changes take into account the extensive feedback that was collected during the development of the first version of the Application Partner Domain Models which were used in the APOSDLE Prototype 2 In this document we focus on a detailed overview of the current version of the Integrated Modelling Methodology and on the APOSDLE Modelling Tools which were developed to support it In addition a qualitativ
136. 8.2.2 Knowledge Acquisition from Experts 66
8.3 Phase 2: Modelling of domain and tasks 67
8.4 Phase 2a: Validation and Revision 70
8.5 Phase 3: Modelling of learning goals 71
8.6 Phase 3a: Validation and Revision 73
8.7 General remarks 74
1.1 Purpose of this document
This document describes the second version of the APOSDLE Integrated Modelling Methodology (IMM). This methodology, which updates the previous version described in Deliverable D1.3 Integrated Modelling Methodology First Version, guides the process of creation of the application domain dependent parts of the APOSDLE Knowledge Base. The APOSDLE Knowl
137. el is implemented. Note: the behaviour of the export differs from browser to browser. In some browsers the OWL file is automatically downloaded, in others it is shown directly, and in others a white page is shown and the command "show page source" has to be used to see the OWL source. 7 Additional Concepts. 7.1 Is-a and Part-of hierarchies. Is-a hierarchy: the Is-a relationship (pronounced "is a") is a relationship that indicates a type/subtype relation, and should already be a familiar concept to persons with knowledge of Object Oriented Programming. The Is-a based approach to modelling recognizes that many types or classes of an individual entity can exist: for instance, in a vehicle domain an individual entity that is a specific vehicle can be a Car, or a Boat, or an Aircraft. In turn, Boats can be Sailboats or Yachts, and so on. This note is not intended to cover this topic extensively, but only to provide some basic examples of is-a relations which can be used as examples to build an is-a hierarchy. The Is-a example below is based on an example described in [3]. Example 1: Consider an over-simplified description of a vehicle dealership and its decomposition in sub-classes. The top class of this domain is the class Vehicle. Vehicles can be partitioned into Car, Boat and Aircraft. Within the Car class, the classes could be further partitioned into classes for Truck, Van and Sedan. Within the Boat class t
138. elligent numbering on the basis of the graphical representation of the task. To store the intelligent number of a task, an attribute task number is used. An in-depth description of task numbers is contained in Section 7.2.1. Task names contain at most one parameter. This parameter is a concept from the domain ontology. We refer to tasks with parameters as abstract tasks and to tasks without parameter as ground tasks. An in-depth description of the use of parameters with tasks is contained in Section 7.2.2. Tasks are decomposed into sub-tasks. The relation is subtask of is used to express the hierarchy of sub-tasks; is subtask of is formalised as a part-of relation. In the informal modelling phase a task is associated with some knowledge required, in the form of domain concepts, but this relation is not stored in the formal model and thus not formalised in the meta-model. The Knowledge required information is used in TACT to help with the definition of learning goals. 7.2.1 Task numbering to model a workflow ordering. The investigation of the task models for prototype 2 revealed that workflow constructs, as for example used in YAWL, seem to be over-expressive for our purposes. For APOSDLE prototype 3, only constructs were allowed which do not distinguish between parallelism, logical AND and logical OR, in order to keep the models simple for domain experts; hence an
139. elling Methodology Collection of Feedback The goal of this questionnaire is to collect positive and negative feedback useful for the evaluation and improvement of the Integrated Modelling Methodology Please state your comments the way you prefer you can provide feedback in form of short sentences and bulleted lists as well as more complex descriptions Both formats are perfectly acceptable Note for APs Please provide feedback only for the steps you have already completed Partner Name TUG Coach of ISN 1 Step 0 Scope amp Boundaries and Resources Collection The goal of this step is to define the scope and boundaries of the application domain to be modeled and to gather some resources related to the application domain The output should be a first preliminary list of tasks process scribble a first list of domain concepts and a collection of relevant learning resources Feedback Please state positive and negative experiences of the IMM for this step in the space below Use as much space as you need Some questions to keep in mind while providing the feedback Were the objectives clear Were the tools adequate What where the main difficulties encountered in choosing a domain and collect the resources Were the explanation given clear enough Was the goal of the step completed after this step Was there something missing Please rate form 0 to 5 where 0 is very easy and 5 is extremely difficult
140. els 1 KE EADS 1 Coach SAP o The intended granularity of the models was unclear 2 KE ISN CCl o There was a lack of understanding how the models would be used in APOSDLE 1 KE EADS o The translation into different languages was time consuming 1 KE CCl 67 15 apcescile learn 2 work Obiectives and explanations O O O O O Modelling rules for the Wiki were not clear 1 KE EADS A manual for the Wiki was missing 1 KE CNM It was not clear how to handle the Wiki 1 KE CCI Explanations seemed clear but results were partly wrong 1 Coach SAP The explanations were clear for a person experienced in knowledge engineering but not for others 1 KE CCI Wiki O O O The change of the semantic Wiki caused additional effort 3 KE CCI CNM ISN 1 Coach SAP AP models should be separated 1 KE EADS The semantic Wiki was inexpedient for modelling concepts and tasks 2 KE CCI CNM 1 Coach SAP The semantic Wiki was useful 2 KE ISN EADS 1 Coach KC It was unclear how to model relationships between concepts 2 KE CCI ISN It was difficult to find the functionality for renaming tasks and concepts 1 KE KC The Wiki does not support the iterative modelling process 1 KE EADS Wiki functionality needs to be improved 1 KE EADS Transferring the models to YAWL and Proteg
142. ere used by one of the KEs EADS According to this KE EADS automatic term extraction was a useful method and efforts should be put into providing more appropriate tools For P3 the text mining functionalities were integrated in the MoKi which was approved by one of the KEs ISN As no efforts were put into improving the functionality of the tool it still did not work properly for P3 which was pointed out by only one of the KE ISN From this fact we may conclude that the tool was not tried out by the other KEs because of their experiences during modelling P2 As term extraction by hand seems tedious and time consuming for future versions of the modelling methodology it might be worthwhile considering to put effort into the improvement of the text mining functionalities 5 2 2 2 Knowledge Acquisition from Experts In this phase when eliciting knowledge for P2 the objectives were clear for the KEs KE CCI ISN Coach SAP According to the KEs the explanations were clear for them ISN CCI CNM but would not have been clear for a person not experienced in knowledge engineering KE CCI CNM It was 41 learn work A anposcile 52 difficult to explain the purpose of modelling the process and the modelling methodology to people not involved in modelling EADS CCl Moreover the APOSDLE vocabulary was difficult to understand for people not involved in APOSDLE KE EADS Another general problem stated for both P2 KE ISN EAD
143. es Feedback Please state positive and negative experiences of the IMM for this step in the space below Use as much space as you need Some questions to keep in mind while providing the feedback Were the objectives clear Were the tools adequate What where the main difficulties encountered in choosing a domain and collect the resources Were the explanation given clear enough Was the goal of the step completed after this step Was there something missing Please rate form 0 to 5 where is very easy and 5 is extremely difficult how difficult this step was for you 1 Positive Experiences ADD your comments here During this step we could resort to the experience from APOSDLE Prototype 2 We knew how the domain concepts and the list of tasks have to look like CCI chose a new domain Information and Consulting on ndustrial Property Rights We started with an intense iterative process of building graphical domain and task models These were visualised as simple concept trees Therefore we used Microsoft Visio s block Page 1 Integrated Modelling Methodology Collection of feedback diagram stencils This simple visualisation turned out to be very helpful for the discussions with the domain experts For the collection of relevant learning resource we resorted to the relevant documents given within the CCI s intranet and the CCI s homepage We also searched for relevant documents in the World Wide We
144. es knowledge about the indication and principles of scenario techniques understand how scenario techniques work and why they work that way e profound knowledge of relation understand relations within a database system knowledge about which data are linked to which other data knowledge about properties of the relations e profound knowledge of REACH substance class understand the meaning of the classification of chemical substances in dependence on the date of registration amount of input toxicity environmental compatibility and intended purpose e profound knowledge of domain model understand the data model of a certain domain understand the structure the purpose of structuring and the logic of the data model 7 4 1 3 Know how to apply use do a Definition The learning goal type know how to apply use do a means to carry out procedural knowledge Therefore know how to apply use do a has to be linked only to topics that refer to a set of rules or learn work A anposcile 52 guidelines e g a computation rule the UML notation procedures e g the RESCUE requirements engineering process a method e g Systematic Interview or a tool e g Protege Know how to apply use do a is used when a procedure method exists The learning goal type can be used with methods formats applications calculations etc Example If this learning goal type for instance is linked to th
145. extension of the task model by an attribute TaskID is sufficient to model the temporal relationship between the tasks, as we explain in the following. In a nutshell, we create task models that briefly give an overview of the overall process; AND, OR and parallelism constructs are not distinguished, i.e. they are treated with the same modelling construct. Additionally, the task model should be expressive enough to define task/subtask relationships. We suggest that a numbering attribute is specified at the end of this modelling phase. The following example shows how the numbering works and fulfils the requirements summarized before. This could be a sample task model as application partners would define it in the first phase of the informal modelling, e.g. in Visio (Figure 16: An example of a task model). The example task model consists of eight tasks, starting with T1 and ending with T8. There is an OR split from T1 into T5 and T2, T3, T4. Note that T7 and T8 are subtasks of T2. The basic rules for the numbering are that: sequences are expressed by sequenced numbers, e.g. 2 following 1; AND, OR and parallelism are expressed by using the same number plus a suffix, i.e. 2a and 2b would be parallel or alternative tasks; subtasks are specified by using a dot notation, i.e. 1.1 would be a subtask of 1.
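The numbering rules above are simple enough to be checked mechanically. The following is a minimal sketch, not part of the IMM tooling, of how the subtask, alternative/parallel and sequence relations could be derived from TaskIDs of the form just described; all function names are illustrative only.

import re

def parse_task_id(task_id: str):
    """Split a TaskID like '2.1a' into its dot-separated levels and an
    optional letter suffix, e.g. '2.1a' -> ([2, 1], 'a')."""
    m = re.fullmatch(r"((?:\d+)(?:\.\d+)*)([a-z]?)", task_id)
    if not m:
        raise ValueError(f"Not a valid TaskID: {task_id}")
    return [int(x) for x in m.group(1).split(".")], m.group(2)

def is_subtask(child: str, parent: str) -> bool:
    """Dot notation: '2.1' is a subtask of '2' (suffixes are ignored)."""
    c, _ = parse_task_id(child)
    p, _ = parse_task_id(parent)
    return len(c) == len(p) + 1 and c[:len(p)] == p

def are_alternatives(a: str, b: str) -> bool:
    """Same number plus different suffixes, e.g. '2a' and '2b', marks
    AND/OR/parallel branches (the IMM does not distinguish between them)."""
    la, sa = parse_task_id(a)
    lb, sb = parse_task_id(b)
    return la == lb and bool(sa) and bool(sb) and sa != sb

def follows(later: str, earlier: str) -> bool:
    """Sequencing at the same level: '2' (or '2a') follows '1'."""
    ll, _ = parse_task_id(later)
    le, _ = parse_task_id(earlier)
    return len(ll) == len(le) and ll[:-1] == le[:-1] and ll[-1] == le[-1] + 1

# Example: '2.1' is a subtask of '2'; '2a' and '2b' are alternative branches.
assert is_subtask("2.1", "2")
assert are_alternatives("2a", "2b")
assert follows("2", "1")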
146. f domain model understand the data model of a certain domain understand the structure the purpose of structuring and the logic of the data model 3 3 Know how to apply use do a Definition The learning goal type know how to apply use do a means to carry out procedural knowledge Therefore know how to apply use do a has to be linked only to topics that refer to a set of rules or guidelines e g a computation rule the UML notation procedures e g the RESCUE requirements engineering process a method e g Systematic Interview or a tool e g Protege Know how to apply use do a is used when a procedure method exists The learning goal type can be used with methods formats applications calculations etc learn work A anosclle 42 Example If this learning goal type for instance is linked to the topic Card Sorting a special knowledge elicitation technique the learning goal know how to apply use do a Card Sorting means to know how to do conduct a Card Sorting session with domain experts how to prepare Card Sorting sessions and how to log the results However Know how to apply use do a card sorting does not mean to know in which situation Card Sorting is indicated or which are advantages and disadvantages of the technique Therefore the learning goal type Know how to apply use do a does not include Basic knowledge about or Profound knowledge of a certain
147. f knowledge extraction by providing better automatic knowledge elicitation from digital resources and by embedding state of the art knowledge elicitation tools in the MoKi To train expert coaches in order to guide the Application Partners in the application of the IMM 49 A anposcile 52 learn work Christl C Ghidini C Guss J Pammer V Rospocher M Lindstaedt S Scheir P S Serafini L 2008 Deploying semantic web technologies for work integrated learning in industry A comparison Sme vs large sized company In Proceedings of the 7th Int Semantic Web Conference ISWC 2008 In Use Track Volume 5318 Springer 709 722 Cooke N M amp McDonald J E 1986 A Formal Methodology for Acquiring and Representing Expert Knowledge Proceedings of the IEEE 74 10 1422 1430 Fox M S amp Gruninger M 1998 Enterprise modeling A Magazine 19 3 109 121 Ghidini C Kump B Lindstaedt S Mahbub N Pammer V Rospocher M amp Serafini L 2009 MoKi The Enterprise Modelling Wiki Proceedings of the 6th Annual European Semantic Web Conference ESWC2009 Demo Session Maiden N A M amp Rugg G 1996 ACRE selecting methods for requirements acquisition Software Engineering Journal 11 3 183 192 Rospocher M Ghidini C Serafini L Kump B Pammer V Lindstaedt S N Faatz A amp Ley T 2008 Collaborative enterprise integrated modelling Proceedings of SVV
148. fied 15
6.4 Files that will be ... 16
6.5 Startup 16
6.6 Creating the learning goal model (simple mode) 17
6.7 Creating the learning goal model (advanced mode with variables) 19
6.8 Explanations for learning goals 21
6.9 ... 21
6.9.1 Unselecting ... 21
6.9.2 Different ontology, task ontology or required knowledge files 21
7 FAQ and Lessons Learned 22
With the TACT we want to connect tasks with topics by specifying learning goals that are necessary for performing the tasks. This is a crucial step in the modelling process and therefore has to be performed carefully. How do we define a Learning Goal? We regard a learning goal as the combination of Learning Goal Type and Topic. The
149. fine the formal model by inserting new elements by modifying existing knowledge or by asking for clarification from the domain experts The usage of a robust collaborative technology as the one provided by the wiki allows the provision of state of the art functionality like simultaneous access and online communication via the platform To support these different actors we have proposed a system in which content can be represented at different degrees of formality This to enable domain experts to create review and modify models at a rather informal human intelligible level and to allow knowledge engineers to check the quality of the formal definitions and their correspondence with the informal parts they intend to represent In order to make this vision possible without increasing the overhead of human work necessary to cope with these different representations of knowledge the system must be able to maintain the alignment between the informal specification of the APOSDLE knowledge base and its formal version and should also make the translation between different levels of formality in an automated manner as smooth as possible This automatic alignment simplifies and makes more agile the interaction between the different actors of the modelling team as it removes the need of having to stick to rigid interaction protocols centred around the informal vs formal waterfall paradigm In addition this vision allows the actors of the modelling team to co
150. for an inferred statement and optionally delete it. If you want to know why a statement has been inferred, select the corresponding radio button and click the button Justify at the bottom of the Entailed Statements box (see Figure 4). You will be taken to the Justification View (see Figure 5). In the first line you will see for which statement you are shown the reason. In the rose box you find one or more groups of statements; each group represents one reason for the selected axiom. You can now simply go back to the list of entailed statements or to another view. If you want to delete the selected axiom from the ontology, you have two choices. You cannot directly delete an inferred axiom, because it is not explicitly stated in the ontology; you can only remove the reasons why this axiom was inferred. In the blue box there is a suggestion which axiom to remove. In order to accept this choice, click on the button Delete Minimum Hitting Set. In the rose box you find one or more groups of statements. As each group represents one reason for the inferred axiom, you must remove one line from each group. You can do nothing wrong: the radio buttons ensure that you have selected one from each group. Click on the button Remove to remove all selected axioms. Note that deleting axioms in the ontology does not change in any way your local ontology file. All changes are made on the server, on a temporary model. N
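The button name Delete Minimum Hitting Set reflects the underlying idea: each group of statements is one justification for the inferred axiom, and retracting the axiom requires removing at least one statement from every group, i.e. a hitting set of the justifications. The sketch below illustrates the idea with a simple greedy approximation; it is not the algorithm implemented in the tool (which proposes a minimum hitting set), and the axiom strings are placeholders.

from itertools import chain

def greedy_hitting_set(justifications):
    """Greedy approximation of a hitting set: repeatedly pick the axiom that
    occurs in the largest number of not-yet-hit justifications. Removing the
    returned axioms removes every listed reason for the entailment."""
    remaining = [set(j) for j in justifications]
    hitting_set = set()
    while remaining:
        counts = {}
        for axiom in chain.from_iterable(remaining):
            counts[axiom] = counts.get(axiom, 0) + 1
        best = max(counts, key=counts.get)
        hitting_set.add(best)
        remaining = [j for j in remaining if best not in j]
    return hitting_set

# Two justifications for the same unwanted entailment; 'A subClassOf B'
# occurs in both, so removing it alone is enough.
justs = [{"A subClassOf B", "B subClassOf C"},
         {"A subClassOf B", "B equivalentTo D"}]
print(greedy_hitting_set(justs))  # {'A subClassOf B'}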
151. Figure 1: The wiki main page. The screenshot shows the MoKi side bar with the four groups of functionalities: wiki import, domain model management, task model management and wiki OWL export. 3 Wiki Import Functionalities. These functionalities support a compact import of groups of concepts and tasks, to facilitate the early phases of modelling. Here we illustrate the behaviour of all of them. 3.1 Load List of Concepts. Load List of Concepts allows the user to enter a textual list of concepts and to import all of them into the wiki. To import the list of concepts you can type them, or paste them, into a dedicated textual form, one concept for each row (see Figure 2).
152. g Goals A list of automatic check is performed on the entire Knowledge base to verify certain properties of the concepts and tasks described in MoKi and of the learning goals created in TACT In particular these checks focus on the mappings between tasks and learning goals and the connection between tasks and domain concepts The results of these checks is provided to the KEs and their coaches which according to these results may decide to revise the models using the most appropriate tools either MoKi e g if a new task need to be added or TACT e g if a new learning goal needs to be added to a task 3 7 3 Supporting Tools Techniques amp Resources Similarly to the first validation and revision phase see Section 3 5 we designed a bunch of tools and guidelines documents to support the modelling activities in this phase of the IMM The entire process is described in the Validation amp Revision of Learning Goals Manual contained in the Annex Part 5 Validation amp Revision of Learning Goals which also presents the kind of automatic checks performed together with suggestions for a possible revision of the models according to the results of the checks The automatic checks are performed via a Java tool which performs some SPARQL queries over the OWL knowledge base created A detailed description of the automatic checks is available in Section 4 4 2 20 learn work A anposcile 52 4 1 Overview To support the crea
153. g Methodology Collection of Feedback The goal of this questionnaire is to collect positive and negative feedback useful for the evaluation and improvement of the Integrated Modelling Methodology Please state your comments the way you prefer you can provide feedback in form of short sentences and bulleted lists as well as more complex descriptions Both formats are perfectly acceptable Note for APs Please provide feedback only for the steps you have already completed Partner Name FBK Coach of EADS 1 Step 0 Scope amp Boundaries and Resources Collection The goal of this step is to define the scope and boundaries of the application domain to be modeled and to gather some resources related to the application domain The output should be a first preliminary list of tasks process scribble a first list of domain concepts and a collection of relevant learning resources Not involved 2 Step 1a Knowledge elicitation from Digital Resources The goal of this sub step is to extract as much knowledge as possible from the digital resources provided by the Domain Experts The desired output of Stage la is a number of candidate domain concepts Not involved 3 Step 1b Knowledge elicitation from Domain Experts Des The goal of this sub step is to elicit knowledge directly from the DEs The desired output is a refined task list and an extensive list of candidate domain concepts Not involved Page Integrated
154. g goal type unspecified is used to express that the task under consideration requires all kinds of knowledge about a certain topic. Example: For instance, if there is a learning goal that is called unspecified wiki, a user selecting the learning goal will receive snippets of all types, i.e. examples of wikis, definitions, guidelines, checklists and all other snippets that are available for the topic. APOSDLE use case: The learner wants to receive all snippets that are available for a specific topic. Material use types: All material use types. APOSDLE Examples: There are no specific examples; this type can be used for all types of topics. 7.4.2 The material uses in the 3rd prototype. The material uses used in the 3rd prototype are: Unspecified, Introduction, Example what, How to, Example how, Definition, More about, Constraint, Checklist, Template, Demonstration, Explanation, Guideline. In addition, custom material uses can be added for different learning domains. 7.5 Relations between models. A Task requires learning goals. A Task has parameter at most one domain concept. A learning goal has learning goal type a certain learning goal type and is about a certain domain concept. A learning goal has parameter at most one domain concept.
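Read as a data model, the relations in Section 7.5 amount to a small structure. The following sketch renders it in plain Python with illustrative names; it is not the OWL vocabulary actually used in the APOSDLE Knowledge Base, and the example task is hypothetical.

from dataclasses import dataclass, field
from typing import Optional, List

@dataclass
class LearningGoal:
    learning_goal_type: str           # e.g. "basic knowledge about"
    topic: str                        # the domain concept the goal is about
    parameter: Optional[str] = None   # at most one domain concept as parameter

@dataclass
class Task:
    name: str
    parameter: Optional[str] = None               # at most one domain concept
    requires: List[LearningGoal] = field(default_factory=list)

# A ground task (no parameter) requiring two learning goals about "Wiki".
t = Task(name="Set up a project wiki",
         requires=[LearningGoal("basic knowledge about", "Wiki"),
                   LearningGoal("know how to apply/use/do a", "Wiki")])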
155. g methods and tools would definitely always require Knowledge about which different methods and tools are available Therefore the learning goals that are indispensable for the task could be basic knowledge about methods and basic knowledge about tools Of course it might also be convenient for the person to have profound knowledge of methods and basic knowledge about tools The knowledge engineer has to decide whether these learning goals are indispensable for performing the task or if they would be just nice to have The distinction between indispensable and dispensable learning goals is important since modelling learning goals that are just nice to have will impair the selection of adequate learning content in a concrete APOSDLE application APOSDLE users vvho are seeking help for a task at hand may not be able to judge the relevance of a learning goal related to a task Additionally the APOSDLE system has no way to distinguish between necessary knowledge and optional knowledge If a user who cannot judge the relevance of learning goals for a task is provided with a list of learning goals some of which are only nice to have the user might be overloaded with a lot of information most of which is not necessary for the task at hand and the user would not be able to distinguish required information from optional information 2 Assign to a task learning goals for all topics and sub topics that are requir
156. h also contains the list of checks presented as questions to be performed in the Manual Checks activity and suggestions for possible revision according to the results of the manual and automatic checks The automatic checks are performed via a Java tool which performs some SPARQL queries over the OWL models created The on line questionnaires are implemented in the Ontology Questionnaire tool Both the automated checks and the Ontology Questionnaire tools are described in Section 4 4 1 A manual of the Ontology Questionnaire is contained in the Validation amp Revision of Domain Tasks Manual 3 6 Phase 3 Modelling of learning goals 3 6 1 Goal The goal of this phase is to produce the specification of the learning goal model Starting from the initial alignment between domain elements and tasks produced in Phase 2 the users specify in detail the learning goals using TACT which was developed within the project and is described in detail in Section 4 3 3 6 2 Description In this phase it can be assumed that the domain and task model are stable in that no relevant domain concepts or tasks are missing Changing labels and descriptions of tasks and domain as well as removing redundant tasks and domain concepts does not affect the learning goal model It is also assumed that the necessity of task parameters will sometimes only be recognised in this phase which is the reason why adding and removing task parameters is directly supported in
157. h the formal checks and the coaches Negative Experiences The formal check report was very long Please estimate your modeling coaching efforts at this stage in terms of hours spent to perform the task 16h Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner Yes we made a few quite valuable changes in the model and eliminated conceptual mistakes Please state the main differences if any between performing this step this year and last year don t remember well how we did this step last year but the ontology check was although it was a little bit too long quite useful Page 6 Integrated Modelling Methodology Collection of feedback 6 Step 4 From Informal to Formal At the end of this step the domain model and task model will be contained in two OWL ontologies Feedback Please state positive and negative experiences of the IMM for this step in the space below Use as much space as you need Some questions to keep in mind while providing the feedback Were the objectives clear Were the tools adequate What where the main difficulties encountered in this stage Were the explanations given clear enough Was the goal of the step completed after this step Please rate form 0 to 5 where is very easy and 5 is extremely difficult how difficult this step was for you 0 Positive Experiences We didn t do anything in t
158. hase 0 Scope and Boundaries 8 1 1 1 Feedback on the IMM for P2 General O A 2 day workshop is necessary in this phase 1 Coach SAP Objectives amp Explanations O O O The objectives of the phase were clear 3 KE CCI EADS ISN 1 Coach SAP The objectives were not clear 1 KE CNM The explanations were clear 1 KE ISN There was a lack of clear understanding how APOSDLE works 1 KE CCI There was a lack of good examples how a model should look like 2 KE CCI EADS There was a lack of good examples of what is an adequate domain 1 KE ISN We had no clear criteria which domain is suitable for APOSDLE 1 Coach KC After the models for P2 were finished there was a better understanding of what is an appropriate APOSDLE domain 1 KE ISN Collection of documents O O We had difficulties in finding crucial documents 1 KE ISN The collection of documents is time consuming because documents are not centrally stored 2 KE CCI EADS The knowledge about the domain is not documented 1 KE EADS There is no explicit teaching and learning material available 1 KE CCI There are no workflow descriptions available 1 KE CCl In the end knowledge engineers found more resources than they had expected 1 Coach KC Knowledge elicitation techniques O O The knowledge elicitation techniques were adequate 1 Coach SAP
159. he check found a lot of mistakes that had been caused by the deficits of the last version of the semantic wiki old wiki no separate wikis for the different ontologies gt concept project was used by several ontology with different notion wiki was quite confusing gt many relations knowledge required about were missing etc The good check results are due to the improved MOKI 6 Step 4 From Informal to Formal At the end of this step the domain model and task model will be contained in two OWL ontologies Feedback Please state positive and negative experiences of the IMM for this step in the space below Use as much space as you need Some questions to keep in mind while providing the feedback Were the objectives clear Were the tools adequate What where the main difficulties encountered in this stage Were the explanations given clear enough Was the goal of the step completed after this step Please rate form 0 to 5 where is very easy and 5 is extremely difficult how difficult this step was for you 0 The OWL ontologies were made by an automatic export of the informal CCI domain an task models from the Semantic MediaWiki MoKi i 7 Step 5 Modelling of Learning goals previously known as Formal Models Integration The goal of this step is to obtain an OWL ontology of the learning goal model via the TACT tool Feedback Please state positive and negative experiences of the IMM
160. he classes could be further partitioned into classes for Sailboats or Yachts; within the Aircraft class, the classes could be further partitioned into classes for Helicopter and Airplane. The Car class, because it IS A Vehicle, would inherit the properties of the Vehicle class. Analogously, the Van class IS A Car, which in turn IS A Vehicle, and therefore objects of the Van class inherit all behaviours relating to the Car and Vehicle classes. These Is-a relations can be represented in a hierarchical structure, as in the sketch below. Further properties related to is-a hierarchies and sub-classes can usually be specified in ontology building. Typical examples of these properties are: whether the subclasses provide a complete decomposition of the superclass (in our example we may want to specify whether the subclasses Car, Boat and Aircraft provide a complete decomposition of the class Vehicle, or whether there can be Vehicles which do not belong to any of these classes); and whether subclasses are disjoint (in our example we may want to specify that Boat and Aircraft are disjoint and there cannot be individuals which are both a Boat and an Aircraft, or we may want to allow the existence of individuals, e.g. a Seaplane, which can be considered both a Boat and an Aircraft). The current version of the MoKi does not allow you to specify these further properties of the is-a hierarchy, which will have to be added to the extracted OWL onto
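A minimal sketch of the vehicle hierarchy from Example 1, standing in for the figure that accompanied the original text; the dictionary encoding is only illustrative and is not how MoKi stores the models.

# Parent -> children, following the "is a" (subclass) relation.
is_a_hierarchy = {
    "Vehicle":  ["Car", "Boat", "Aircraft"],
    "Car":      ["Truck", "Van", "Sedan"],
    "Boat":     ["Sailboat", "Yacht"],
    "Aircraft": ["Helicopter", "Airplane"],
}

def ancestors(concept, hierarchy):
    """Every class a concept inherits from: a Van is a Car, which is a Vehicle."""
    for parent, children in hierarchy.items():
        if concept in children:
            return [parent] + ancestors(parent, hierarchy)
    return []

print(ancestors("Van", is_a_hierarchy))  # ['Car', 'Vehicle']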
161. he desired granularity of the task list Whether the task list is fine grained enough depends on the intended use of the learning environment and on the intended target group and the decision has to be made by the knowledge engineer As a rough guideline the task should require a manageable amount of learning goals that can be acquired in a reasonable time by the intended target group during work integrated learning However defining objective criteria for the ideal degree of granularity of the task list is still an open issue and is one of our research questions in future work Relations between tasks were identified by asking for each task and each sub task what input would be required and what the output was Relevant domain concepts and learning goals for a task or a sub task should be elicited by asking for each task what knowledge was needed for accomplishing the task and for guidelines on performing the task 3 3 3 2 Concept Listing Concept listing is a simple interview technique in the course of which the expert is asked to answer the question What topics does a person has to have knowledge about in order to do X For instance in APOSDLE the question could be What topics does a person has to have knowledge about in order to make a simulation of the effects of lightening on an airplane Unlike Cooke amp McDonald 1986 who asked the respondents to write down all the concepts on a sheet of paper we were logging the in
162. he user is asked to fill some templates using predefined forms Although the KEs provide the descriptions of the elements of the task and domain model in Natural Language since these descriptions are structured according to pre defined templates with the help of semantic constructs like properties they contain sufficient structure to be automatically translated in formal models OWL ontologies in the case of the APOSDLE KB Thus the KEs do not need to become experts in formal languages to create the domain and task models they just need to fill the templates one for each task and one for each concept via forms Note to enable the compact modelling of similar tasks a mechanism of task parametrization was introduced in the IMM Second Version A detailed description of this mechanism is contained in Appendix 1 Section 7 2 2 The main idea is to add a parameter also called variable from the domain model in the name of a task in order to use the knowledge present in the domain model to specify families of tasks in a compact manner to reduce modelling effort At the end of this phase MoKi contains a set of filled templates one for each task and one for each domain concept which are then automatically translated by the MoKi s OWL export functionality in an OWL domain model and an OWL task model 13 learn work A anposcile 52 3 4 3 Supporting Tools Techniques amp Resources To support this phase of the methodology we deve
163. here is very easy and 5 is extremely difficult how difficult this step was for you 3 Positive Experiences It is interesting to reflect the domain of the own company Defining which domain can be useful big enough but not too big it quite difficult but interesting Negative Experiences It would be good to have an APOSDLE with two domains because we have one regarding the whole project processing of innovation management projects in this field work mainly experts But in our use case we plan also to have a general Page 1 Integrated Modelling Methodology Collection of feedback APOSDLE for innovation management where people from outside costumers students can learn without having access to the confidential documents Please estimate your modeling efforts at this stage in terms of hours spent to perform the task 0 5PM 70h Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner Yes we did well designed Interviews with all domain experts and decided to take these parts for APOSDLE all of them agreed with Please state the main differences if any between performing this step this year and last year The experiences from last year were very useful Last year we didn t spend so much effort in this step We know know how APOSDLe and the models look like also the experts knew how APOSDLE looks like it is easier to define the scope and boundaries by knowi
164. his step The formalization was carried out by FBK Negative Experiences ADD your comments here Please estimate your modeling coaching efforts at this stage in terms of hours spent to perform the task ADD your comments here Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner Please give reasons for your answer Please state the main differences if any between performing this step this year and last year ADD your comments here Page 7 Integrated Modelling Methodology Collection of feedback 7 Step 5 Modelling of Learning goals previously known as Formal Models Integration The goal of this step is to obtain an OWL ontology of the learning goal model via the TACT tool Feedback Please state positive and negative experiences of the IMM for this step in the space below Use as much space as you need Some questions to keep in mind while providing the feedback Were the objectives clear Were the tools adequate What where the main difficulties encountered in this stage Were the explanations given clear enough Was the goal of the step completed after this step Please rate form 0 to 5 where 0 is very easy and 5 is extremely difficult how difficult this step was for you 2 Experience with variables Did you use variables in the learning goals Yes we did use variables because we have a few tasks where it really makes
165. his step the domain model and task model will be contained in two OWL ontologies Feedback Please state positive and negative experiences of the IMM for this step in the space below Use as much space as you need Some questions to keep in mind while providing the feedback Were the objectives clear Were the tools adequate What where the main difficulties encountered in this stage Were the explanations given clear enough Was the goal of the step completed after this step Please rate form 0 to 5 where is very easy and 5 is extremely difficult how difficult this step was for you 0 Positive Experiences This phase has been performed in a full automatic way thanks to the MoKi built in export functionalities without any effort from application partner and coaches Negative Experiences Please estimate your modeling coaching efforts at this stage in terms of hours spent to perform the task 0 Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner Please give reasons for your answer Yes the model developed into the MoKi has been translated into valid formal models Please state the main differences if any between performing this step this year and last year 7 Step 5 Modelling of Learning goals previously known as Formal Models Integration The goal of this step is to obtain an OWL ontology of the learning goal model via the TACT
166. hough the explanations how to handle the formal models seemed clear results in this stage of P2 were partly wrong according to one coach SAP For P3 the export from the Moki into owl files was further improved Formal task and domain models were generated automatically without additional effort neither for the KEs nor for the coaches which was mentioned positively in the evaluation questionnaires KE ISN EADS Coaches FBK SAP The fact that once the informal model is in the MoKi no additional effort is necessary to create formal task and domain models constitutes in our view a great improvement of the modelling methodology After modelling P2 different opinions existed on the necessity of the informal modelling stage i e of modelling tasks and topics in the semantic MediaWiki before they are translated into formal models some of the KE found informal modelling not necessary and would have preferred to directly start with formal modelling KE CCI Others found that it was essential to start from informal modelling before creating formal models KE EADS Coach SAP No such statements were made for P3 One explanation might be that informal modelling was much easier for P3 than it was for P2 and that models had to be provided only in one language In addition unlike for P2 where the task model had to be created manually from what was written in the semantic MediaWiki the informal domain and 43 learn work A anposcile 52 t
167. How to use the Moki. Chiara Ghidini. Draft of November 20, 2008. Thanks to Viktoria Pammer, Marco Rospocher and Andreas Zinnen for their valuable feedback.
Contents
1 Introduction 2
2 Wiki main functionalities 3
3 Wiki Import Functionalities 4
3.1 Load List of Concepts 4
3.2 Load List of Tasks 5
3.3 Term Extraction 5
4 Domain Model Management 7
4.1 Templates for Concepts 7
4.2 List Concepts 7
4.3 Add a concept 7
4.4 Delete a concept 7
4.5 IsA Browser / IsPartOf Browser 8
4.6 List Properties 8
4.7 Additional Functionalities 9
5 Task Model Management 10
6 Owl Export Functionalities 11
7 Additional Concepts 12
7.1 Is a and Part of hierarchies
168. ich learning goals are pre requisites of others that is which simple learning goals must be achieved first in order to achieve more difficult learning goals This was a big problem for P2 because the pre requisite between learning goals was computed from the task learning goal assignment but should be less problematic this year as a different strategy is used to compute pre requisites between learning goals Therefore possible changes should be first discussed with the coaches 2 3 Connection between Domain Model and Task Model list of domain elements not connected to any task domain elements not connected to any task will never be used in selecting learning material starting from a task but will only be used in the free search or in the Topic Selection of APOSDLE Suggests Moreover they will not be part of the prerequisite relation which orders the topics learning goals in ones being pre requisites of others The results of this test should be considered to check if some of the domain elements not connected to any task are instead needed as knowledge required for some task list of domain concepts connected to only 1 task If a huge number of domain concepts are assigned only to one few task the system has less information to discriminate between different learning goals and in particular to Know which learning goals are pre requisites of others that is which simple learning goals must be achieved first
169. ich should be considered by the entire APOSDLE consortium and not only related to the Integrated Modelling Methodology Importance of coaches The coaching effort is still large It would even be much large if the application partners would not be as experienced as they are Without a knowledge engineer the modelling process would not be possible Domain experts themselves cannot finish the modelling on their own However our overall impression is that the amount of time to be invested is acceptable All in all we can imagine the opportunity for an APOSDLE company i e exploitation strategy to sell modelling as a service which can be performed and sold supported by the IMM in due time and quite independently from the domain to be modelled From our experience with coaching developing several models with several partners we perceive the IMM as domain independent enough to be sold Source Coach Comment The methodology should provide guidelines and support also for coaches in order to improve the quality and reduce the effort of coaching The modelling process is still very time consuming This disadvantage has hardly changed This can be accepted for a test situation but not for a real world application Speeding up modelling is crucial Source APs Comment As we noted already in the Deliverable D1 3 for a system like APOSDLE it is impossible to completely eliminate modelling effort However according to the filled
170. iggers a request for some manual verification and revision of parts of the models contained in the MoKi. 4.4.1.1 Automatic checks. These checks are performed automatically via some Java tools based on the Jena library and the Pellet reasoner: the OWL models of task and domain are exported from the MoKi and these tools are applied off-line. The tools implement some SPARQL queries on the OWL/RDF files containing the models. The output consists of a text file containing:
Domain Model Section
- a list of all domain elements for which no description has been provided;
- a list of all the top-level concepts (that is, concepts at the first level in the class/subclass hierarchy) having no children.
Task Model Section
- a list of all tasks for which no description has been provided;
- a list of all tasks with variables for which: the concept used as variable is not in the list of domain concepts; the concept used as variable is not part of the name of the task; or the variable attached to the task is different from the variable used in its supertask;
- a list of all concepts which are used in the knowledge required field of any task but that are not in the list of domain concepts;
- a list of all tasks having an empty knowledge required field;
- a list of all tasks missing the Task ID (the number used to model the workflow of the processes);
- a list of all tasks having names longer than 30 characters.
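As an illustration of how one such check could be phrased, the sketch below runs a SPARQL query over an exported model. It uses Python and rdflib rather than the project's Java/Jena tooling, and the file name, namespace and property names (ex:DomainConcept, dc:description) are placeholders, not the actual vocabulary of the APOSDLE OWL export.

from rdflib import Graph

# Load a hypothetical OWL export of the domain model (RDF/XML serialisation).
g = Graph()
g.parse("domain_model.owl", format="xml")

# Check: domain elements for which no description has been provided.
query = """
PREFIX ex: <http://example.org/aposdle#>
PREFIX dc: <http://purl.org/dc/elements/1.1/>
SELECT ?concept WHERE {
    ?concept a ex:DomainConcept .
    FILTER NOT EXISTS { ?concept dc:description ?d }
}
"""
for row in g.query(query):
    print("No description provided for:", row.concept)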
171. ign the knowledge that is always required for preparing the agenda of an activity (e.g. know how to do/use/apply Agenda) to each of these more specific tasks by hand. The disadvantages of these two ways of modelling can be overcome by using task parameters. We refer to learning goals (LGs) with parameters as abstract learning goals and to learning goals without parameter as ground LGs. A learning goal can contain at most one parameter. This parameter is a concept from the domain ontology. The semantics of a learning goal with a parameter is the class of LGs obtained by replacing the parameter with all the sub-concepts (according to the is-a relation) of the concept used as parameter in the learning goal, including the parameter name itself. Example: the abstract LG Basic knowledge about activity contains the parameter activity. In the domain ontology, activity has the sub-concepts listed in Example 1. Then from the abstract LG Basic knowledge about activity we obtain the following ground LGs, ordered in the is-a hierarchy: Basic knowledge about activity; Basic knowledge about meeting; Basic knowledge about board meeting; Basic knowledge about demo meeting; Basic knowledge about workshop. A ground task cannot require an abstract learning goal with a parameter. An abstract task requires exactly one abstract learning goal, and the parameter of the abstract task and of the abstract learning goal are the same. The semantics of the requi
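The expansion of an abstract learning goal into ground learning goals can be pictured in a few lines of code. This is a sketch of the semantics just described, not of the actual APOSDLE implementation; the small is-a table merely restates the activity example.

def sub_concepts(concept, is_a):
    """The concept itself plus all of its (transitive) sub-concepts."""
    result = [concept]
    for child in is_a.get(concept, []):
        result.extend(sub_concepts(child, is_a))
    return result

def ground_learning_goals(lg_type, parameter, is_a):
    """Expand an abstract learning goal into its ground learning goals."""
    return [f"{lg_type} {c}" for c in sub_concepts(parameter, is_a)]

# The 'activity' sub-tree used in the example above.
is_a = {"activity": ["meeting", "workshop"],
        "meeting":  ["board meeting", "demo meeting"]}

print(ground_learning_goals("Basic knowledge about", "activity", is_a))
# ['Basic knowledge about activity', 'Basic knowledge about meeting',
#  'Basic knowledge about board meeting', 'Basic knowledge about demo meeting',
#  'Basic knowledge about workshop']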
172. in MoKi the list of concepts contained in the domain model and the list of tasks contained in the task model according to a list of suggestions If these suggestions and checks trigger the necessity of revising the models they update them directly in MoKi Automated checks A list of automated checks is performed on the models to verify certain properties of the concepts and tasks described in MoKi The result of these checks is provided to the KEs and their coaches which according to these results may decide to revise the models directly in MoKi 17 learn work A anposcile 52 On line questionnaires These questionnaires propose to the KEs statements and questions that are extracted from the domain models contained in the MoKi and aim to verify if the Knowledge Experts agree with those statements In case of disagreement the KEs need to manually verify and revise the domain model directly in MoKi Activities 1 and 2 can be performed in parallel and concern both models task and domain Activity 3 has to be executed after 1 and 2 and concerns only the domain model as described in Section 4 4 1 2 3 5 3 Supporting Tools Techniques amp Resources To support the modelling activities in this phase of the IMM we designed a bunch of tools and guideline documents The whole process is described in the Validation amp Revision of Domain Tasks Manual included in the Annex Part 3 Validation amp Revision of Domain Tasks whic
173. in the Property field and can specify the related concept in the Property Target field As a simple example in the case of concept Sweater Property may contain s made of and Property Target may contain Wool Multiple pairs Property Property Target can be added using the Add another button All these relations are modelled as a property of type Page in Moki which basically means they point to other pages in Moki The predefined relation ls a is introduced in the template with the subclass relationship of OWL in mind Therefore the informal Is a relation in the Semantic MediaWiki is used with the semantics of the subclass relationship of OWL For two concepts X and Y X Is a Y if everything that is an X is also a Y and is also automatically transformed in the is a subclass relationship of OWL The KE starts filling the forms providing information for the fields In particular the autocompletion functionality supports the KE in filling the Is a Is part of Property and Property Target fields 15 learn work A anposcile 52 3 4 3 4 Task Template Figure 6 shows a screenshot of a filled form associated to a task template in Moki Modify Task Build Model of Physics with Maxwell Equations Annotations Description This task consists in understanding the problem to be simulated in terms of EM and EM formal modelling characterizing the physical phen
174. ing to the changes 1 did im TACT Adding deleting variables the best thing would be the have one tool where everything is changed automatically Page 9 Integrated Modelling Methodology Collection of feedback Please estimate your modeling coaching efforts at this stage in terms of hours spent to perform the task 6 Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner yes Please state the main differences if any between performing this step this year and last year LAst year we did not have aclear overview which task has which learning goal therefore this was better Last year it was quite time consuming and confusing to check th formal models in Protege 9 General questions and remarks e Did the domain experts coincide with the person performing the modeling If yes did it happen in all the steps or only some Basically not But some steps e g formal checks were done by the person performing the modeling and not by the domain expert e Do you think that the domain you have chosen is appropriate for learning support with APOSDLE Yes the domain is appropriate The consulting area is a very knowledge intensive field for everyone it is useful to learn from previous projects e Do you have any additional remarks or suggestions for improvement See above Page 10 Integrated Modelling Methodology Collection of feedback Integrated Mod
175. ions given clear enough Was the goal of the step completed after this step Was there something missing Please rate form 0 to 5 where 0 is very easy and 5 is extremely difficult how difficult this step was for you Page 2 Integrated Modelling Methodology Collection of feedback Positive Experiences ADD your comments here We did not use a tool for terminology extraction we extracted the knowledge from the digital resource intellectually We analysed the structure of the resources and gathered a number of candidate domain concepts It was not difficult but it took a lot of time Negative Experiences ADD your comments here This step took a lot of time Please estimate your modeling coaching efforts at this stage in terms of hours spent to perform the task ADD your comments here 60 hours Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner Please give reasons for your answer We gathered many new domain concepts within this task Because of reading and concerning with the resources we could gather many information about nformation and Consulting on Industrial Property Rights Please state the main differences if any between performing this step this year and last year No difference we used the same methods We did not use the term extraction tool due to poor experiences with this tool last year 3 Step 1b Knowledge elicitation from Domain
is time. Changing content was more intuitive; the overall process was much faster. Source: Coach, APs. TOOL: Possibility to create groups of concepts and tasks with the very easy list typing. Source: AP. TOOL: It was easy to add variables to tasks, thanks also to the auto-completion functionality in the MoKi. Source: Coach, AP. TOOL: The MoKi has been used in a collaborative manner, since coaches were able to monitor and provide feedback on the work of application partners. Source: Coach.

5.1.4.2 Negative Feedback

GENERAL: When creating variables it is difficult to have a clear vision of how they will be used further in the P3. Source: AP. Comment: The methodology should provide better support and explanation, also providing some examples of how the use of variables reduces the modelling activities, especially when specifying learning goals and their connection with tasks. TOOL: There were deleted concepts that still remained available on the wiki. Source: AP. Comment: This issue has already been solved during the APOSDLE modelling activities. TOOL: The only remark is that the KE should have the possibility to export the MoKi's files in OWL himself; currently this function is not available for the KE. Source: AP. Comment: Due to a server configuration issue we have been forced to disable direct user access to this functionality. However, one of the aims of MoKi is to allow a user to directly obtain the OWL ve
177. ises the inter relationships between these elements as well as some specific Instructional Types which are needed to define learning goals and classify learning material APOSDLE knowledge base Learning Goal Model Task Model is about Instructional Types Domain element Learning goal type triggers Figure 1 The APOSDLE Knowledge Base Building the domain specific part of the APSODLE knowledge base namely the Domain Model the Task Model the Learning Goal model and their inter relations requires several skills These skills span from knowing the different aspects that have to be described in the models to having the ability of encoding such knowledge into formal statements and to having the ability of integrating ditferent aspects such as the domain elements the tasks and the learning goals into a uniform and coherent vision For this reason building the APSODLE knowledge base is inherently a collaborative activity performed by different actors and carried out based on some collaborative protocol which is usually described in the methodology used to support the modelling The Integrated Modelling Methodology IMM developed within the APOSDLE project guides the process of creation of the application domain dependent parts of the APOSDLE Knowledge Base illustrated in Figure 1 For an overview of the initial reasons which motivated the development of an APSODLE Modelling Methodology we refer the reader to Deliverabl
isfied:
- The task has no other variable (task names can contain at most one variable).
- The topic has sub-topics; otherwise the variable makes no sense.
- The name of the topic occurs in the name of the task.
If you want to add a variable with TACT, the tool will check these conditions and enable the Add Variable button only in case the conditions are satisfied.

4.5 Creating tasks with variables

The technical description of how to model tasks with variables with the TACT tool is given in Section 6.7. In this section we describe what happens if a variable is added to a task. Let us again consider the example from above: the knowledge engineer has modelled the abstract task Prepare Agenda for <Activity>, where Activity is the task variable. Then learning goals are assigned to the task. All topics from the ontology can be manually assigned to the task as learning goals. Moreover, some learning goals are created automatically.

Automatically created learning goals for abstract tasks: Once a task with a variable is defined and selected in TACT, TACT assigns to that abstract task an abstract learning goal, i.e. a learning goal with the same variable as the task. In our example the abstract learning goal unspecified Activity is created for the task Prepare Agenda for <Activity>. This automatically created learning goal is a placeholder for the specific learning goals of all specialised tasks related to it.
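To make the effect of adding a variable more tangible, the following small sketch (written for this document; the class, data and method names are invented and are not part of TACT or MoKi) expands an abstract task over the is-a hierarchy of its variable, producing the specialised tasks and, analogously, one ground learning goal per sub-topic.

import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class VariableExpansion {
    // Simplified is-a hierarchy from the running example (topic -> direct sub-topics)
    static Map<String, List<String>> subTopics = Map.of(
        "Activity", List.of("Meeting", "Workshop"),
        "Meeting",  List.of("Board Meeting", "Demo Meeting"));

    // Collect a topic together with all of its (transitive) sub-topics
    static List<String> expand(String topic) {
        List<String> result = new ArrayList<>();
        result.add(topic);
        for (String sub : subTopics.getOrDefault(topic, List.of())) {
            result.addAll(expand(sub));
        }
        return result;
    }

    public static void main(String[] args) {
        String abstractTask = "Prepare Agenda for <Activity>";
        String variable = "Activity";
        for (String topic : expand(variable)) {
            // Each sub-topic yields one specialised task ...
            System.out.println(abstractTask.replace("<" + variable + ">", topic));
            // ... and, analogously, one ground learning goal per abstract learning goal
            System.out.println("  learning goal: basic knowledge about " + topic);
        }
    }
}

Running the sketch prints the specialised tasks Prepare Agenda for Activity, Meeting, Board Meeting, Demo Meeting and Workshop, mirroring the expansion described above.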
its connection to the domain and task models is then validated and possibly revised in Phase 3a (Validation & Revision of Learning Goals) according to the results of the automatic validation checks performed. At the end of this phase the entire APOSDLE Knowledge Base is ready to be plugged into the APOSDLE system.

The second version of the Integrated Modelling Methodology has been accurately followed by each Application Partner to build their specific APOSDLE Knowledge Base. The specific models created are described in Deliverable D6.9 Second Version of Application Partner Domain Models. The feedback obtained from the experience of building the specific APOSDLE Knowledge Bases for the 3rd Prototype was used for a careful evaluation of the IMM second version, whose findings are reported in the final part of the deliverable.

Table of Contents
1.1 Purpose of this ...
1.2 Scope of this ...
1.3 Related ...
2 Integrated Modelling of Domain, Tasks and Learning Goals: A Collaborative and Integrated Process ...
180. its implementation as it supports 1 the access to the APOSDLE knowledge base at different levels of formality 2 the integrated modelling of several aspects of the APOSDLE knowledge base and 3 the coherent development of the formal part In Section 3 we illustrate in detail the different phases of the IMM Second Version while Section 4 focuses on the description of the Modelling Tools learn work A anposcile 52 3 1 Overview of the Integrated Modelling Methodology The Integrated Modelling Methodology Second Version consists of four phases as depicted in Figure 4 e Phase 0 Scope amp Boundaries The scope and boundaries of the application domain are determined and documented questionnaires and workshops are used to elicit the main tasks and learning needs in order to identify candidate application domains for learning e Phase 1 Knowledge Acquisition Knowledge is elicited from domain experts and extracted from available digital resources relevant for the chosen domain Knowledge elicitation techniques such as interviews card sorting and laddering are used to support knowledge elicitation from experts while a terms extractor tool is used to support knowledge elicitation from digital resources e Phase 2 Modelling of Domain Tasks A specification of the domain and the task models is created This specification also contains a first alignment between domain elements and tasks which is used in Phase 4 as a basis
181. k Please state positive and negative experiences of the IMM for this step in the space below Use as much space as you need Some questions to keep in mind while providing the feedback Were the objectives clear Yes Were the tools adequate Yes What where the main difficulties encountered in this stage Were the explanations given clear enough Yes Was the goal of the step completed after this step Yes Please rate form 0 to 5 where is very easy and 5 is extremely difficult how difficult this step was for you Experience with variables Page 8 Integrated Modelling Methodology Collection of feedback Did you use variables in the learning goals Please indicate why or why not Not in tact because they have been defined int the MOKI Did you find it difficult to understand how variables could be used in general Did you find it difficult to insert variables in TACT Ifyou used variables where did you find it more intuitive easy to create tasks with variables in the MoKi or in TACT Positive Experiences ADD your comments here The installation and use of TACT tool is very easy and the user manual contains quite clear definitions and examples of the different competency types Negative Experiences ADD your comments here The EADS simulation process is strongly iterative The simulation engineer has always a possibility to go to the step before and revisit this step as a result
182. k without a variable For example Prepare Agenda for Activity is a specialised task Abstract task An abstract task is a task with a variable that is about a certain topic such as Activity with sub topics in the domain model Variables are denoted using lt gt For instance the task Prepare Agenda for lt Activity gt is an abstract task that contains the variable lt Activity gt Specialised task A specialised task is the instance of an abstract task An abstract task is de composed into specialised tasks by replacing the variable with all sub topics of the topic that the variable is about Example The abstract task Prepare Agenda for Activity from the example above is a placeholder for the set of specialised tasks Prepare agenda for Meeting Prepare agenda for Board Meeting Prepare agenda for Demo Meeting Prepare agenda for Workshop 4 4 Where do variables come from There are two ways to define variables a Task variables are defined in the MOKI 10 learn work A anosclle 42 Task variables are defined in the TACT In case of a namely the task variable has been defined in the MOKI TACT will indicate that the task has a variable see Figure 6 5 in section 6 7 The technical description of how to model tasks with variables with the TACT tool is given in section 6 7 e g see Figure 6 4 In both cases a variable can only be added if the following conditions are sat
ks added
3  2008-10-11  Step on questionnaire added
4  2008-10-16  Document revised, adding steps

This document has the goal of guiding Application Partners and Coaches through a list of checks and suggestions to be used to refine and tune the models modified and/or created in TACT. The changes of the models triggered by the list of checks and suggestions described here should be made in the MoKi and/or in TACT by the application partners, supervised by their coaches. In the following we provide a description of all the checks performed.

These checks will be performed automatically, i.e. via appropriate scripts. The results will be provided by FBK at the beginning of the revision process to the coaches. The coaches will send them to the Application Partners, and together they will coordinate on how to revise the models in the MoKi and/or TACT accordingly.

2.1 General Statistics
- Number of Tasks
- Number of Domain Elements
- Number of Learning Goals
- Number of Task-Learning Goal Assignments

These statistics are provided to give an overview of the size of the models created with the MoKi and TACT. They are not meant to emphasise errors or problems. Nevertheless, if the numbers differ greatly (e.g. very few tasks but a large number of learning goals), this may be caused by some granularity discrepancy and should be discussed with the coaches to check whether this may cause problems.
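A rough sketch of how such figures could be gathered from the exported OWL files with the Jena ontology API is shown below. Counting the named classes of each file as its "elements", and the file names used, are simplifying assumptions made for this example only; they do not describe the actual scripts run by FBK.

import org.apache.jena.ontology.OntModel;
import org.apache.jena.ontology.OntModelSpec;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.riot.RDFDataMgr;

public class ModelStatistics {
    public static void main(String[] args) {
        // File names are illustrative; the real exports come from MoKi and TACT
        System.out.println("Domain elements: " + countNamedClasses("domain-model.owl"));
        System.out.println("Tasks:           " + countNamedClasses("task-model.owl"));
    }

    // Counts the named classes of an OWL file as a rough indicator of model size
    static int countNamedClasses(String file) {
        OntModel m = ModelFactory.createOntologyModel(OntModelSpec.OWL_MEM);
        RDFDataMgr.read(m, file);
        return m.listNamedClasses().toList().size();
    }
}

The same approach extends to counting learning goals and task-learning goal assignments once the corresponding model files are loaded.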
last year? ADD your comments here

9 General questions and remarks

Did the domain experts coincide with the person performing the modeling? If yes, did it happen in all the steps or only some? For CCI we had different persons. For CNM the domain expert also performed the modeling.

Page 9 Integrated Modelling Methodology Collection of feedback

Do you think that the domain you have chosen is appropriate for learning support with APOSDLE? Please give reasons for your answer. Has to be evaluated by application partners.

Do you have any additional remarks or suggestions for improvement? The coaching effort is still high. It would even be much higher if the application partners were not as experienced. Without a knowledge engineer the modeling process would not be possible; domain experts themselves cannot finish the modeling on their own. But in addition, our overall impression is that the amount of time to be invested is acceptable. All in all we can imagine the opportunity for an APOSDLE company (i.e. exploitation strategy) to sell modeling as a service which can be performed and sold, supported by the IMM, in due time and quite independently from the domain to be modeled; i.e. from our experience of coaching (developing several models with several partners) we perceive the IMM as domain independent enough to be sold.

Page 10 Integrated Modelling Methodology Collection of feedback

Integrated Modellin
185. ld be also performed via videoconferencing tools ORGANIZATION Further formal guidance for transforming card sorting into a hierarchy would be helpful Source Coach Comment The methodology should provide detailed guidelines to support the formalization of the card sorting results maybe investigating the state of the art for already available results documents ORGANIZATION DE should not be involved in too similar knowledge elicitation sessions after each other the DE might have the feeling that they are giving the same information again and again This means knowledge elicitation sessions at this stage need to be carefully planned Source Coach Comment The knowledge elicitation phase could be planned in detail in advance in order to avoid the above issue If there are a considerable number of domain experts available one 35 learn work A anposcile 52 possibility could to perform some technique with one group of experts and other techniques with other groups If the number of domain experts is very small an option could be to apply only one knowledge elicitation technique the most appropriate one instead of several as is done now TOOL There are some minor usability issues uploading a document and then adding the topics to the ontology might not be very easy and intuitive for some users The text mining functionality still doesn t work very well with long or German documents but it was sufficie
186. leteness and correctness by the DEs The step was supported by guidelines and by results from automatic checks similar to the one of step 3 but which also involve checking the quality of learning goals Page 8 Integrated Modelling Methodology Collection of feedback Feedback Please state positive and negative experiences of the IMM for this step in the space below Use as much space as you need Some questions to keep in mind while providing the feedback Were the objectives clear Were the tools adequate What where the main difficulties encountered in this stage Were the explanations given clear enough Was the goal of the step completed after this step Please rate form 0 to 5 where is very easy and 5 is extremely difficult how difficult this step was for you 0 Positive Experiences The results of automatic checks helped to reduce the effort They provided a good feedback what points still have to be closer checked Negative Experiences The whole process can only be performed by knowledge engineers and not by domain experts Please estimate your modeling coaching efforts at this stage in terms of hours spent to perform the task ADD your comments here Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner Please give reasons for your answer Please state the main differences if any between performing this step this year and
187. ling P3 two KEs CCI ISN reported that they used MS Visio to build graphical domain and task models which both found very helpful for the subsequent steps It is worth considering that a sub step is added in this phase for graphical modelling of the domain and tasks Both the semantic MediaWiki P2 and the Moki P3 were not used in a collaborative manner among the KEs of different domains P2 KE CCI ISN Coach KC P3 KE EADS Nonetheless the Wiki and Moki were used in a collaborative manner by the knowledge engineers and their coaches which was found to be very convenient P2 Coaches SAP KC P3 FBK 5 2 4 Phase 2a Validation and Revision of Domain Tasks The objectives of this phase were clear to the KEs for P2 KE ISN CCl EADS and for P3 KE EADS Coach FBK Some KEs also stated that the explanations were clear P2 KE ISN P3 KE EADS Coach FBK and that the explanations were clear at least for a person being experienced in knowledge engineering respectively P2 KE CCl Even though the KEs had a feeling for what they should do during this phase it was unclear how model validation should happen for P2 Coach KC because no criteria existed how to validate the informal models In addition it is hard to say when an informal model is finished Coach SAP For P2 the KEs stated that it was not possible to assess the quality of the model without the help of a coach or someone else KE ISN CNM Also they stated
Figure 11: Upload files, extract relevant terms and cluster the files (top); from the automatically extracted terms (bottom left) new concepts in the MoKi can be directly created (bottom right).

4.2.2.2 Model Management Functionalities

This set of functionalities provides the basic functionality each modelling tool necessarily provides: creating, editing and deleting model elements. Depending on the type of element (task or domain concept), pre-defined templates are loaded when an element is created or edited.
lly represents the task-learning goal model as OWL ontologies conforming to the structure of the APOSDLE KB (see Appendix 1). To make life easier for knowledge engineers, task-learning goal models are regularly backed up and a human-readable changelog collects changes to the models. A detailed manual of TACT (see Annex part) was given to the application partners; it describes both the conceptual and technical aspects as well as the usage of TACT.

4.3.1 TACT functionalities

A simple workflow within TACT consists of loading the APOSDLE Knowledge Base for a specific learning domain. An exemplary screenshot of TACT is given in Figure 13. The KE specifies for each task (1) which knowledge, i.e. domain concept (2) plus learning goal type (3), is required. Furthermore, learning goals can be added to tasks with parameters (or variables), in which case the specialised tasks inherit all learning goals of the parent task. TACT also supports the modification of tasks by either (i) adding or (ii) removing a variable. In the first case specialised tasks will be automatically created, while in the latter case all specialised tasks will be removed from the task model. The semantics of variables is explained in more detail in Appendix 1. TACT supports modelling by highlighting tasks without learning goals as well as highlighting domain concepts which are not part of any learning goal. Both highlighting functionalities can be switched on and off.
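Conceptually, the two highlighting functions boil down to simple set computations over the task-learning goal assignments. The sketch below illustrates the idea with invented in-memory data; TACT itself operates on the OWL representation of the knowledge base, so this is not its actual code, and the substring matching of concepts in learning goal names is a deliberate simplification.

import java.util.HashSet;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Set;

public class HighlightSupport {
    public static void main(String[] args) {
        // Task -> assigned learning goals (illustrative data only)
        Map<String, Set<String>> goalsOfTask = new LinkedHashMap<>();
        goalsOfTask.put("Prepare Agenda", Set.of("basic knowledge about Agenda"));
        goalsOfTask.put("Write Minutes", Set.of());

        Set<String> domainConcepts = Set.of("Agenda", "Minutes", "Meeting");

        // Tasks to highlight: no learning goal assigned yet
        goalsOfTask.forEach((task, goals) -> {
            if (goals.isEmpty()) System.out.println("Highlight task: " + task);
        });

        // Concepts to highlight: not used in any learning goal
        // (toy matching: a learning goal "uses" a concept if its text mentions it)
        Set<String> used = new HashSet<>();
        goalsOfTask.values().forEach(gs -> gs.forEach(g -> domainConcepts.forEach(c -> {
            if (g.contains(c)) used.add(c);
        })));
        domainConcepts.stream()
            .filter(c -> !used.contains(c))
            .forEach(c -> System.out.println("Highlight concept: " + c));
    }
}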
logy using an ontology editor, e.g. Protégé. For further instructions on how to model is-a hierarchies in Protégé please refer to [2].

Part of hierarchy

The part-of hierarchy is meant to represent part-whole relations between the concepts stored in the MoKi. The study of part-whole relations is an entire field in itself, called mereology. This note is not intended to cover this topic extensively, but only to provide some basic examples of part-whole relations which can be used as examples to build a part-of hierarchy. The part-of example below is based on an example described in [3].

Example 2: Consider an over-simplified description of a car and its decomposition into parts, subparts, etc.:
- Cars have parts: Engine, Headlight, Wheel
- Engines have parts: Crankcase, Carburetor
- Headlights have parts: Headlight bulb, Reflector

These part-whole relations can be represented in a hierarchical structure as follows:
Car
  Engine
    Crankcase
    Carburetor
  Headlight
    Headlight bulb
    Reflector
  Wheel

Index: Add Template concept 7 concept 7 auto completion 7 Term Extractor 5 Cluster relevant terms 6 Wiki Import Functionalities 4 Delete concept 7 Domain Model Management 7 Edit concept 7 Edit with form concept 7 Export Domain Model 11 Task Model 11 extract relevant terms 6 hierarchy Is a 4 Part of 4 14 history 7 IsA Browser 8 IsPartOf Browser 8 List concept 7 properties 8 Load List of Concepts 4 Load Li
191. loped e A set of guidelines to filter from the concepts and tasks acquired in Phase 1 of the Methodology those which will be included into the domain and task models A tool called MoKi Modelling WiKi based on Semantic MediaWiki see Section 4 2 for a detailed description of the tool and of the reasons which motivated our choice of developing it on top of Semantic MediaWiki which supports the activity of creating the domain and task models A manual of MoKi attached in the Annex Part 2 Moki User Manual which contains some modelling guidelines and a technical description of how to use MoKi and which is distributed to all the actors involved in the modelling activities Below we briefly present the guidelines provided to choose the relevant domain concepts and tasks and we describe the templates the KEs has to fill to create the formal domain and task models 3 4 3 1 Guildelines for choosing relevant domain concepts Starting from all possibly relevant terms generated in Phase 1 the KE decides which ones are relevant domain concepts and which ones should be discarded and prepares a list to be validated by the DEs To help deciding about relevant concepts in the APOSDLE knowledge base the following guidelines were given 1 Is this domain concept useful for retrieval 1 1 Are there resources dealing with this domain concept or is it reasonable to expect resources dealing with this domain concept in the future 1 2 Does
192. ls and the number of Task Learning Goal assignments Connection betvveen Task Model and Learning Goal Model Section a list of all tasks having no learning goals attached a list of all tasks having exactly one learning goal attached a list of all tasks having at least five learning goals attached a list of tasks having the same set of learning goals attached a list of all learning goals not attached to any task a list of all learning goals attached to exactly one task Connection between Domain Model and Task Model Section a list of all domain elements not connected via learning goals to any task a list of all domain elements connected via learning goals to exactly one task Learning Goal Types Section a list of the learning goal types never used 31 learn work A anposcile 52 At the end of the modelling activities for Prototype 3 similarly to what has been done for the first version of the IMM we asked the Application Partners and the Coaches to provide some feedback and commenis on the modelling methodology proposed The feedback has been collected from a questionnaire sent to the Application Partners and their Coaches The questionnaire we asked them to fill is basically the same as the one proposed at the end of the first version of the IMM except for some minor changes due to the different methodology structure and is composed of specific questions for each phase of the methodology The filled feedback ques
193. ls of the step in a satisfiable manner Please give reasons for your answer yes Please state the main differences if any between performing this step this year and last year Page 10 Integrated Modelling Methodology Collection of feedback ADD your comments here 9 General questions and remarks e Did the domain experts coincide with the person performing the modeling If yes did it happen in all the steps or only some No domain expert are not performing the modeling e Do you think that the domain you have chosen is appropriate for learning support with APOSDLE Please give reasons for your answer Absolutely yes role of learning is important especially within EADS IW Finding Applying and Creating new knowledge is part of Electromagnetic Simulation job Target group expected to continuously learn keep up to date with new developments The simulation activities as predictive and virtual oriented capabilities are more and more often used to support numerous business activities Broadly speaking the training or learning management system should supports developing and maintaining the right range of skills and competences needed for the EM Simulation Domain jobs Tasks that employees in the target group are performing are crucial to the success of success and safety of the products then as a consequence of the company e Do you have any additional remarks or suggestions for improvement Page 11 Integrated
194. m here at solving the problem in a general setting but to handle this problem in the specific APOSDLE context learn work A anposcile 52 in APOSDLE The YAWL language was dismissed in favour of the same ontology language OWL which is used to describe the other parts of the APOSDLE knowledge base The migration from YAWL to OWL also reduced the number of modelling tools In fact we decided to use one single tool MoKi to cover both the modelling of domain and tasks with a uniform interface and modelling style This intends to strengthen the integrated modelling of domain and tasks and to reduce the training needed to use the modelling tools e Better modelling tools This means to reduce the workload of modelling to support the more agile modelling process envisaged for the IMM Second Version and to increase the quality and integration of models e Better manuals guidelines This intends to help the identification of appropriate learning domains reduce the workload of modelling increase the quality of models and also to support the specific contributions and roles required by the different actors inside the modelling team Thus we decided to i improve the questionnaires used to support the identification of appropriate learning domains and ii write guidelines and manuals about the different phases and tools of the IMM to support coaches and domain experts in their modelling activities Questionnaires Manuals and Guidelines
195. main idea is to use the knowledge present in the domain model to support a detailed specification of tasks in a compact manner Thus we stimulated the domain experts to evaluate if their tasks could be extended by introducing a concept from the domain ontology for instance by changing create agenda to create agenda for lt activity gt Notationally we use lt A gt to denote a parameter in a task 54 learn work A anposcile 52 The intuitive semantics of a task with a parameter is the class of tasks obtained by replacing the parameter with all the sub concepts according to the is a relation of the concept used as parameter in the task name including the parameter name itself Example 1 The task Prepare agenda for lt activity gt contains the parameter activity The domain ontology contains the following taxonomic information indentation is used to graphically represent the is a relation activity meeting board meeting demo meeting workshop Then from the task Prepare agenda for lt activity gt we obtain the following tasks also ordered in is a hierarchy Prepare agenda for activity Prepare agenda for meeting Prepare agenda for board meeting Prepare agenda for demo meeting Prepare agenda for workshop Task parametrisation is introduced to allow the modelling of tasks in a compact manner i e without forcing the specification of a number of very similar tasks but at the same time to obtain i
196. matic checks and by an on line ontology questionnaire 6 Step 4 From Informal to Formal At the end of this step the domain model and task model will be contained in two OWL ontologies 7 Step 5 Modelling of Learning goals previously known as Formal Models Integration The goal of this step is to obtain an OWL ontology of the learning goal model via the TACT tool 8 Step 6 Formal Models Validation At the end of this step all the models created domain task and learning goals should be formally correct and complete The goal of this step is to have the models validated completeness and correctness by the DEs The step was supported by guidelines and by results from automatic checks similar to the one of step 3 but which also involve checking the quality of learning goals Page 4 Integrated Modelling Methodology Collection of feedback 9 General questions and remarks e Did the domain experts coincide with the person performing the modeling If yes did it happen in all the steps or only some e Do you think that the domain you have chosen is appropriate for learning support with APOSDLE Please give reasons for your answer Do you have any additional remarks or suggestions for improvement Page 5 Integrated Modelling Methodology Collection of feedback Integrated Modelling Methodology Collection of Feedback The goal of this questionnaire is to collect positive and negative feed
197. med works better than doing it in a random sequence The latter type of card sorting is a very interactive technique with quite some action According to the feedback of DEs it is even sometimes funny as far as KE can be funny due to the ludic and interactive character of the method 12 learn work A anposcile 52 3 3 3 6 Laddering Laddering is a semi structured interview technique that is employed in order to break down and refine identified domain concepts or to detect relationships between concepts The KE starts from one domain concept for example the high level concept Method For instance the question Which methods do exist leads to a number of domain concepts for instance Brainstorming Techniques or Knowledge Management Techniques that are connected to the starting concept by a relation of type is a This procedure is then repeated for the new concept for example by asking Which Knowledge Management Techniques do exist and also for the resulting domain concepts until the desired degree of granularity is obtained In so doing a cognitive ladder between different domain concepts is established As for the granularity of the task list the decision about the granularity of domain concepts is on the knowledge engineer For taking this decision several factors such as the resources available but also the documents to be created in the future or the potential learning goals
198. meetings with simulation experts Pierre and Richard P directly between KE and SE or with third party Barbara This could be considered as Knowledge elicitation from experts 4 Informal Modeling of Domain and Tasks in 1 Starting from the knowledge elicited in Step 1 a b the main goal of this step 15 to obtain an informal but rather complete description of the domain model and task model in a Semantic MediaWiki called MoKi After this modeling step the informal concept model should only consist of relevant domain concepts see 5 2 Feedback Please state positive and negative experiences of the IMM for this step in the space below Use as much space as you need Some questions to keep in mind while providing the feedback Were the objectives clear Yes Were the tools adequate Yes Page 4 Integrated Modelling Methodology Collection of feedback Was the wiki used in a collaborative manner No What where the main difficulties encountered in this stage Were the explanations given clear enough Yes Was the goal of the step completed after this step Yes Please rate form 0 to 5 where is very easy and 5 is extremely difficult how difficult this step was for you 2 Experience with variables Did you use variables in the informal model Please indicate why or why not Did you find it difficult to understand how variables could be used in general Did you find it diffi
199. modelling effort for the KEs at every stage in the modelling process 47 learn work A anposcile 52 In this document we described the second version of the APOSDLE Integrated Modelling Methodology This methodology developed within the APOSDLE project guides the process of creating the application domain dependent parts of the APOSDLE Knowledge Base The APOSDLE Knowledge Base provides the basis for reasoning within the APOSDLE System We provided a description of the approach taken in order to produce the second version of the Methodology and of the main differences between the two versions a detailed overview of the phases of the methodology and of the tools used to support it Finally we provided an overview of the feedback obtained from the evaluation of the modelling activities and an in depth comparison between the first and second version The feedback collected from Application Partners and coaches who have accurately followed the second version of the Methodology during the development of Prototype 3 provide the basis of future enhancements of Integrated Modelling Methodology and tools to be used for in further developments of the APOSDLE system As was illustrated in Section 5 the main results coming from the application of the Integrated Modelling Methodology second version show a considerable improvement when compared with the results obtained with the first version They also highlight specific points where improvements a
200. mpty cell unspezifiziert leeres Feld Note The learning goal types are not intended to be hierarchical i e for instance profound knowledge of a topic does not include basic knowledge about a topic If one wants to express that a user performing the task should be provided with learning content in order to acquire both basic knowledge and profound knowledge then this has to be modelled explicitly The role of learning goal types and material uses in the third APOSDLE prototype In the third APOSDLE prototype the main purpose of learning goal types is filtering the list of resources provided by APOSDLE suggests Learning goals of different type lead to different types of snippets i e different information For instance the learning goal type basic knowledge about is linked to the material use type definition This means for instance if a user selects the learning goal basic knowledge about databases s he might receive snippets that provide him her with definitions learn work A anosclle 42 of databases However if s he selects the learning goal know how produce a database APOSDLE might provide him her with information about constraints of databases Naturally APOSDLE can only provide such content if resources documents multimedia are available for the desired material uses In contrast to specific learning goal types the learning goal type unspe
201. n of the task model This attribute is modelled as a property of type String in MoKi For more details on the use of the numbering attribute with task see Section 7 2 1 in Appendix 1 Subtasks optional In this field the user can add other tasks separated by a comma For each of these tasks the relation between it and the task described in the form is as follows X has subtask Y means that Y is a more fine grained task that is part of X This relation is modelled as property of type Page in MoKi e Knowledge required optional This relation has been defined for anticipating the more complex Task Learning Goal Domain Concept mapping Doing this we want to capture early on in the modelling process the relation between tasks and domain concepis The semantics is as follows For task X and domain concept Y X Knowledge required Y means that in order to successfully perform X knowledge about the concept Y is necessary There is 16 15 apcescile learn 2 work no formal semantics for this relation hovvever as this relation vvill need to be re examined and formalised in Phase 3 see Section 3 6 This relation is modelled as a property of type Page in MoKi If the KE wants to specify knowledge about relevant domain elements already in this modelling phase s he can do this here The Knowledge required has to be specified in terms of one or more domain concepts This is potentially a good moment to discover releva
202. ncentrate on what knowledge they are modelling e g the business domain the activities the learning goals rather than concentrating on the language in which the knowledge is specified The choice of focusing on the what or in other words around the intention of having to think the same thing only once has enabled us to restructure the IMM around the parts of the knowledge base to be constructed and their inter relations as depicted in Figure 4 Phases Tools Phase 0 Scope amp Boundaries Guidelines and questionnaires the scope and boundaries of the application domain are determined and documented Phase 1 Knowledge Acquisition i Knowledge elicitation techniques knowledge is i elicited from domain experts using techniques like interviews card sorting and laddering and ii Know Miner tool embedded in Moki ii extracted from available digital resources relevant for the domain using tools like a terms extractor _ Phase 2 Integrated Modelling of Domain and Tasks Modelling WiKi tool Moki The domain and the task models are created This specification contains a first alignment between domain gt concepts and tasks which is used in Phase 4 as a basis for the modelling of learning goals Domain concept requires knowledge of Tasks Manual Gudelines Phase 2a Validation amp Revision Automatic Checks the domain and task models are evaluated and if necessary revised accordingly Ontology Questionnaire
203. nciples e g what is an ontology what is a model why is a model needed at the very beginning of the modelling process Another point that was mentioned in the evaluation questionnaires was the complexity of the model One KE EADS stated both in the P2 and in the P3 questionnaire that the model does not allow to depict the inherent complexity of the domain for instance in their domain one and the same process step would be performed several times in the process and would require different skills each time On the other hand the model of P2 was found to be too complex by another KE CCl For P3 it was kept simple e g they were not using variables and therefore it was easily manageable KE CCI Coach SAP The complexity of domains which can be realised within APOSDLE should be discussed during the scope and boundaries phase in order to make sure that APOSDLE is the right system to support learning in that domain One of the statements in the questionnaire was referring the number of modelling tools involved Even though the number of tools was already reduced for modelling P3 in comparison to P2 it would be nice to have everything in one tool KE ISN Another point related to this is the fact that changing the model e g the label of a task once the learning goal model is finished is still difficult and has to be done manually KE ISN Future work could be dedicated to integrate even more the different modelling tools with the long te
nformal but structured description of the element itself. A demo version of MoKi can be tried out on-line at the MoKi web site (moki.fbk.eu). A detailed description of the current version of MoKi is contained in the MoKi manual, available at the same web site (see also www.semantic-mediawiki.org and www.mediawiki.org).

The typical page contains:
- an informal description of the element in natural language; images or drawings can be attached as well. The purpose of this part is to document the model and clarify it to users not trained in the formal representation (e.g. reference to source documents, notes about modelling choices and open problems, etc.). Comments can be added by each user and are not translated to the formal model;
- a structured part where the element is described by means of triples of the form subject-relation-object, with the element itself playing the role of the subject. The purpose of this part is to represent the connection between elements of the same model (like class/sub-class relations between elements of the domain model, or task/sub-task relations between elements of the task model) as well as connections between elements of
205. nformation in the models about the real tasks that users are doing at run time In this way APOSDLE users will obtain information about specific tasks that they need in realistic learning situations For example the modeller creates the task Prepare agenda for lt activity gt and assigns general learning goals to the task e g basic knowledge about lt activity gt Then s he indicates that there are different activities i e activity is a parameter and that for each of the specific activities the user needs to have profound understanding of the respective activity i e the knowledge engineer defines the learning goal parameter Note Abstract tasks may trigger some reordering of the domain ontology as the more specific tasks are obtained using the is a hierarchy of domain concepts and should be used only if strictly necessary 7 2 2 1 Ground tasks abstract tasks and specialised tasks In order to deal with parameters different types of tasks have to be defined Normal tasks without parameters are ground tasks Tasks with parameters are called abstract tasks and tasks created from tasks with parameters are specialised tasks Ground task A ground task is a task without a parameter For example Prepare Agenda for Activity is a Specialised task Abstract task An abstract task is a task with a parameter that is about a certain topic Such as Activity with sub topics in the
Figure 12: The domain model IsA Browser.

4.2.2.4 Export Functionalities

These functionalities support the automatic export of knowledge of the domain and task model into standard knowledge representation languages. At the moment the formal representation for both models is an OWL ontology. The task model and the domain model can be exported separately.

4.3 TACT

The TAsk Competence Tool (TACT) supports the activity of assigning learning goals to tasks. It loads the generic part of the APOSDLE KB as well as the domain and the task model of a given learning domain. It also loads the preliminary task-learning goal model created within the MoKi. Of course it supports iterative modelling, i.e. TACT also allows saving and re-loading a task-learning goal model. It is programmed in Java and distributed as a runnable jar file. It is able to load the files produced by the MoKi and interna
ng entries:
- Upload Ontology
- List Entailed Statements
- Justification
- Save current ontology
- List Removed Axioms
- Options
These are the different views of the ontology. At every point in time the views that are open to you are displayed as links; click on them to go there. A plain text view is either closed to you or you are currently seeing it.

5.1.2.4 List inferred statements

You will automatically be transferred to the List Entailed Statements view (Figure 3). In this box you see the statements (axioms) that are entailed (inferred) by the specified ontology. You can get a justification by selecting one statement (axiom) and clicking the justify button at the end of the box. For seeing the axioms which are explicitly stated in this ontology, see the second box below. Examples of entailed statements shown in Figure 3: Rank_Correlation_Test subClassOf Test; Two_Way_ANOVA subClassOf Parametric_Test; Two_Way_ANOVA subClassOf Inferential_Statistics.

On the displayed page you see two boxes, one with the title Entailed Statements and one with the title Axioms (see Figure 4). We call the first the Entailed Statements box and the second the Explicit Statements box. The first box shows the sta
208. ng in the Structure Repository and we suggest adding these files to the subversion repository in the respective directories where the domain and task ontology are stored Application partners please send these files to your coach once you are finished with the TACT e backup Directory that contains previous versions of your model You can restore them by clicking the Restore old data version button in the opening dialog see 6 5 Startup view of TACT e tempi The following files will be created in the directory where you saved tact jar e TACT log A logfile In case of problems with the TACT send this logfile together with a clear and repeatable problem description to the Know Center vpammer know center at e TACT properties Contains the last directory which you opened with TACT Diff csv Contains a change log Please send this file to the Know Center bkump tugraz at once you are finished with the TACT 6 5 Startup view of TACT 1 Double click tact jar 16 Task Competence Mapping Tool TACT learn work aposclle JAGI fool Startup Partner Prefix 1 Data location Browse Start Restore old data version Import Learning Goals from Text 2 Figure 6 1 Start up view on TACT 2 Click on the Browse button 1 and choose XXXaposdle categories owl file in the directory where your knowledge base resides If you want to import learning goals that you entered as knowledge requi
209. ng the system at least a little bit 2 Step 1a Knowledge elicitation from Digital Resources The goal of this sub step is to extract as much knowledge as possible from the digital resources provided by the Domain Experts The desired output of Stage la is a number of candidate domain concepts Feedback Please state positive and negative experiences of the IMM for this step in the space below Use as much space as you need Some questions to keep in mind while providing the feedback Were the objectives clear Were the tools adequate Was the knowledge elicited useful What where the main difficulties encountered in eliciting knowledge from resources Were the explanations given clear enough Was the goal of the step completed after this step Was there something missing Please rate form 0 to 5 where 0 is very easy and 5 is extremely difficult how difficult this step was for you 1 Page 2 Integrated Modelling Methodology Collection of feedback Positive Experiences It is very nice that the termextractor is now integrated to the MOKI That makes it very easy to add the extracted concepts to the models Negative Experiences There are some minor usability issues uploading a document and then adding the topics to the ontology might not be very easy and intuitive for some users The termextractor still doesn t work very well with long or german documents but it was sufficient for us
210. nt domain concepts missing in the informal domain model The Knowledge required section does not have to be a complete list of domain concepts that are relevant for performing the task It should be regarded as a possibility to record task domain concept mappings that will be taken into account in Phase 3 A task s required knowledge should be specified at the lowest possible level of the task subtask hierarchy that is if a task has sub tasks then the required Knowledge should be specified only for the subtasks This is based on the assumption that the tasks granularity in the task model is such that each complete task is typically carried out by one and the same person 3 5 Phase 2a Validation amp Revision of Domain Tasks 3 5 1 Goal The goal of this phase is to validate with respect to their correctness and completeness the domain and task models created with MoKi during the previous modelling phase see Section 3 4 This validation phase may trigger a revision in MoKi of the models created 3 5 2 Description The process of validation and revision of the domain and task models is illustrated in Figure 7 Manual Checks On line Revise in Moki only domain Revise in questionnaires Moki only domain Automatic checks Figure 7 Validation and Revision of Domain Tasks The process is divided in three main activities Manual checks The KEs supported by the coaches manually check and validate
211. nt for us Source AP Comment The text mining functionality which is integrated with MoKi should be improved TOOL We did not use the text mining functionality due to poor experiences with this tool last year We extracted the knowledge from the digital resource intellectually We analysed the structure of the resources and gathered a number of candidate domain concepts It was not difficult but it took a lot of time Source AP Comment The text mining functionality should be improved Feedback on Phase 2 Modelling of domain tasks Positive Feedback GENERAL Significant improvements were done for P3 informal modelling tool MOKI and process template has been better organized and it was not necessary to KE to know the detailed Wiki syntax to enter data KE was able to see the entire set of tasks their description with the learning goals domain concepts Source AP GENERAL After this step we had all relevant domain concepts and tasks within clearly structured and transparent models Source AP GENERAL The task model was much easier just the numbers had to be added Source Coach GENERAL Very good support to KE in knowledge structuring and formalization Source AP GENERAL Use of variables in informal model reduced KE workload We have some concepts which has about 15 subconcepts labelling different techniques Therefore it really makes sense to use these variables Source APs GENERAL think th
212. nt suggests some improvement or modification of the methodology we propose one or more possible actions to take or to solve the problem in future versions of the methodology The feedback for the single phases concerns several different aspects it goes from comments on general aspects to specific comments on technical problems In order to help the reader we tag the comments for each phase with one of the following labels e GENERAL we tag with this label those comments about a general aspect of a phase TOOLS we tag with this label those comments about tools and techniques proposed to support the methodology e ORGANIZATION we tag with this label those comments about organizational aspects of the methodology Note that the feedback questionnaire stimulated the Application Partners and coaches to comment also on more general issues not necessarily related to the methodology In this section we process only the feedback that deals directly with the methodology 32 learn work A anposcile 52 51 1 General Feedback on the Methodology From the experience of producing the APOSDLE knowledge bases for Prototype 2 we have identified the following general issues Importance of knowing the APOSDLE system and the kind of models it needs The experiences from APOSDLE Prototype 2 were very useful We knew how the domain concepts and the list of tasks have to look like It was extremely helpful that all domain experts i
213. nvolved knew APOSDLE P2 its functionality and possibilities Source APs Coaches Furthermore Application Partners were much more experienced did work more independently Main Work was done by application partners Source Coach Comment The methodology should provide guidelines and resources for better explaining how the final models should be and what is the role of the knowledge base inside the APOSDLE system An option could be to show in the early stages of the modelling activities a demonstration of the APOSDLE system focusing on how the models influence the behaviour of the system and the quality of the learning support provided Involvement of domain experts in modelling activities Domain expert are not performing the modelling Modelling needs still a person with the qualification of a knowledge engineer Modelling should be so easy that domain experts can do it themselves knowledge engineering skills are quite rare in small and medium sized companies Source AP Comment The methodology as it is now does not require an active role of domain expert in the modelling activities However the methodology relies on a committed Knowledge Engineer from the company What to do if this person is not available is a general issue common to many complex systems development projects that require some effort to be customised A possible solution could be to provide external Knowledge Engineers instead of coaches but this is a decision wh
216. o 5, where 0 is very easy and 5 is extremely difficult, how difficult this step was for you: 0 (CCI), 2 (CNM)
Experience with variables: Did you use variables in the learning goals? Please indicate why or why not. Did you find it difficult to understand how variables could be used in general? Did you find it difficult to insert variables in TACT? If you used variables, where did you find it more intuitive/easy to create tasks with variables: in the MoKi or in TACT?
Positive Experiences: Using the TACT tool was not difficult. The manual was very useful to understand the meaning of learning goal types and how to use the tool.
Negative Experiences: We had to do the Rescue Modeling for CNM. The old models had to be transferred to the new version. This step was much effort for SAP.
Please estimate your modelling/coaching efforts at this stage in terms of hours spent to perform the task: ADD your comments here
Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner? The results look good; there are still some checks to perform for CNM.
Please state the main differences, if any, between performing this step this year and last year: ADD your comments here
8 Step 6 Formal Models Validation
At the end of this step all the models created (domain, task and learning goals) should be formally correct and complete. The goal of this step is to have the models validated comp
217. o use the is part of relation in the MoKi was missing (1 KE CCI, 1 Coach SAP)
o The user manual for the MoKi was good (1 KE EADS)
o The Import functionality was useful (1 KE EADS)
o It was better to have a separate MoKi for each partner (1 KE EADS)
o The different visualisations of the domain concepts and tasks were useful (2 KE CCI, EADS)
o MoKi supports the involvement of domain experts (1 Coach FBK)
o Deleted concepts were still visible in the MoKi (1 KE EADS)
Transferring the models to Protege
o Formal models were generated automatically, no effort (2 KE ISN, EADS; 2 Coaches FBK, SAP)
o MoKi-to-OWL export tool is not available for the KE (1 KE EADS)
8.4 Phase 2a Validation and Revision
8.4.1.1 Feedback on the IMM for P2
General
o The evaluation of this phase was unclear (1 Coach KC)
o It is hard to say when an informal model is finished (1 Coach SAP)
o The KE cannot assess the quality of the model without the help of a coach or someone else (1 KE ISN)
Objectives and explanations
o The explanations were clear for a person experienced in knowledge engineering, but not for others (1 KE CCI)
Experts
o Experts had very little time (1 KE CCI)
o Experts were not available for model validation (1 KE EADS)
o Experts were not interested in model revision (1 KE CCI)
o Exper
218. of this step all the models created (domain, task and learning goals) should be formally correct and complete. The goal of this step is to have the models validated (completeness and correctness) by the DEs. The step was supported by guidelines and by results from automatic checks similar to the one of step 3, but which also involve checking the quality of learning goals.
Feedback: Please state positive and negative experiences of the IMM for this step in the space below. Use as much space as you need. Some questions to keep in mind while providing the feedback:
Were the objectives clear? Yes
Were the tools adequate? Yes
What were the main difficulties encountered in this stage?
Were the explanations given clear enough? Yes
Was the goal of the step completed after this step? Yes
Please rate from 0 to 5, where 0 is very easy and 5 is extremely difficult, how difficult this step was for you: 1
The KE was strongly supported by technology partners in formal models validation. No DE involved.
Positive Experiences: ADD your comments here AUTOMATIC post-TACT CHECK results very useful
Negative Experiences: ADD your comments here The answer concerns more the technology partners involved in the operation.
Please estimate your modelling/coaching efforts at this stage in terms of hours spent to perform the task: hour ADD your comments here
Do you think that the outcome produced in this step fulfilled the goa
219. ology First Version several issues were identified and analysed We repeat below the main issues as stated in D 1 3 Identification of appropriate learning domains Guidelines and examples about what is an adequate learning domain scenario for APOSDLE are missing Source APs and Coaches e Granularity of models Guidelines for and examples of the granularity of the models are missing Therefore it is difficult to understand the right granularity at which to perform modelling Source Aps learn work A anposcile 52 Workload of modelling Too much workload effort is required to follow the methodology and perform modelling Furthermore an estimate of the workload effort for each phase is missing Source APs Importance of Knowledge Engineers The quality of the models created is much better when the APs can dedicate one or more persons to modelling activities Source Coaches Importance of coaches The role of the coach is really important in the methodology Good support from coaches was fundamental to achieve good modelling results Source APs and Coaches Importance of tools There is a close relation between the quality of the models created and familiarity with or a good understanding of the tools used in particular of the Semantic MediaWiki Source Coaches Another important factor that emerged from an analysis of modelling activities of the 2 Prototype and from an in depth evaluation of the way APOSD
220. omena and doing approximation with Maxwell's equations to obtain the model of the physics.
Figure 6 Screenshot of a form associated to the task template in MoKi (Annotation box; Structural Information box with the fields Concept to be used as parameter, Task id and Subtasks; Knowledge required box; filled with the example values Maxwell Equations, 5a 1, EM Disturbances, Problem To Be Simulated)
For each task we ask for a Description in the Annotation box, which is modelled as a property of type String in MoKi. In the Structural Information box we ask the KE to fill in the following fields, used to collect relations between the task and domain concepts and/or other tasks:
• Concept to be used as a parameter (optional). The user can insert in this field a domain concept to be used as a parameter in the task. In inserting the domain concept the user has to check that:
o Only one domain concept is allowed;
o The domain concept has sub-concepts with respect to the Is-A relation;
o The name of the topic occurs in the name of the task (this is needed to assign meaningful names to the specialized tasks).
This relation is modelled as a property of type Page in MoKi. For more details on the use of parameters within tasks see Section 7.2.2 in Appendix 1.
• Task id (required). This field is used to represent the tasks' workflow via numbering. This numbering attribute is specified at the end of the current modelling phase, before the formal creatio
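To make the three constraints on the parameter concept easier to follow, here is a minimal Python sketch of how they could be checked. It is only an illustration of the rules listed above, not MoKi code, and all names in it (validate_parameter, the is_a dictionary) are assumptions made for this example.

# Illustrative sketch only, not part of MoKi: checking the three constraints
# on the "Concept to be used as a parameter" field described above.

def has_subconcepts(concept, is_a):
    # is_a maps a concept to its direct sub-concepts (Is-A relation)
    return bool(is_a.get(concept))

def validate_parameter(task_name, parameter_concepts, is_a):
    if len(parameter_concepts) > 1:
        return "only one domain concept is allowed"
    if parameter_concepts:
        concept = parameter_concepts[0]
        if not has_subconcepts(concept, is_a):
            return "the concept must have sub-concepts with respect to Is-A"
        if concept.lower() not in task_name.lower():
            return "the concept name must occur in the task name"
    return "ok"

is_a = {"Activity": ["Meeting", "Workshop"], "Meeting": ["Board Meeting"]}
print(validate_parameter("Prepare agenda for Activity", ["Activity"], is_a))  # ok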
221. ompleted after this step? Please rate from 0 to 5, where 0 is very easy and 5 is extremely difficult, how difficult this step was for you: 0
Positive Experiences: ADD your comments here The tools were adequate for our domain model. We kept our models very simple (concept and task trees with some multi-hierarchies, no variables, no special semantic relations). Due to the simplicity and manageability of the models, and thanks to the close contact to the domain expert, we did not identify any further failures.
Negative Experiences: ADD your comments here We had to change all our is part of relations into is a relations. This was no big workload because it was done automatically. But it would have been better if we knew from the beginning that the MOKI could only process is a relations and not the offered is part of relations.
Please estimate your modelling/coaching efforts at this stage in terms of hours spent to perform the task: ADD your comments here 10 hours
Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner? Please give reasons for your answer. We discovered one or two minor mistakes. The check made us sure to have completed the informal model before going into the formal modeling phase.
Please state the main differences, if any, between performing this step this year and last year: Last year t
222. on 5 2 2 1 Knowledge Acquisition from documents According to the answers of the respondents of the P2 questionnaires the objectives of this phase were clear KE CCI ISN Coaches SAP KC No remarks were made concerning the objectives in this phase for P3 either so it can be assumed that the objectives were still clear probably even clearer because the KEs already knew what they needed to model in the subsequent steps Both for P2 and for P3 concepts and tasks were extracted from the documents manually KE CCI EADS ISN Even though extracting Knowledge from documented sources by hand was found to be tedious difficult and very time consuming P2 KE CCl this way of concept extraction lead to useful concepts according to the KE Explanations for the text mining functionalities within the domain modelling tool given in P2 during a modelling workshop were clear according to the respondents KE ISN Coach SAP One of the KEs ISN even stated that written descriptions for using the text mining functionalities would have been sufficient For P3 an overview of the text mining functionalities was given in the MoKi manual The text mining functionalities did not work properly for P2 KE ISN The tool could not be used for other languages than English and it was not able to identify relevant concepts also for English documents P2 KE ISN Coach KC SAP For P2 other tools than the text mining tools provided by the APOSDLE project w
223. ontology questionnaire.
Feedback: Please state positive and negative experiences of the IMM for this step in the space below. Use as much space as you need. Some questions to keep in mind while providing the feedback:
Were the objectives clear?
Were the tools adequate?
What were the main difficulties encountered in this stage?
Were the explanations given clear enough?
Was the goal of the step completed after this step?
Please rate from 0 to 5, where 0 is very easy and 5 is extremely difficult, how difficult this step was for you.
Positive Experiences: The objectives were clear, as well as the explanation. The automatic checks turned out to be very useful to fulfill the objectives. Also the visualization functionalities in the MoKi helped a lot in this phase.
Negative Experiences: ADD your comments here
Please estimate your modelling/coaching efforts at this stage in terms of hours spent to perform the task: 1-2 hours
Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner? Yes
Please state the main differences, if any, between performing this step this year and last year: Last year there were no automatic checks at this stage. Furthermore, there were no visualization functionalities in the MoKi to help revise the models.
6 Step 4 From Informal to Formal
At the end of t
224. operty of type String in the meta model.
7.2.2 Modelling tasks with parameters
The result of the modelling activities of P2 revealed the need for a better bundling of tasks which repeat their structure in different parts of the domain. We will now show and discuss such a bundling mechanism, called task parametrisation. Consider the following examples of tasks, extracted from the P2 task models of CCI and ISN respectively:
• sketch solution
• create agenda
Neither task defines a precise activity: in the first task, the consultant of CCI executing that task will probably work to sketch a solution for a certain <issue>, while in the second task the person from ISN will work to create an agenda for a certain <activity>, e.g. a workshop, a conference, a meeting, and so on. Depending on the <issue> to be solved or the <activity> for which the agenda is being prepared, the worker using APOSDLE may need to achieve different learning goals. Thus the modelling of tasks at a general level, as in the examples above, constitutes a problem for APOSDLE, as tasks are disconnected from the domain knowledge or the problem solving knowledge necessary to support the workers in their actual and concrete activities. On the other hand, forcing the domain experts to model activities at a very specific level would make the modelling of tasks cumbersome and not very natural. To address this problem we have introduced the notion of parameter. The
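As a rough illustration of the parametrisation idea introduced above, the following Python sketch expands a task containing a parameter such as <activity> into one specialised task per sub-concept of the parameter in the domain model. The data structures and function names are assumptions made only for this example, not APOSDLE code.

# Sketch of task parametrisation (assumed data, not APOSDLE code): a task with
# a parameter <X> is expanded into one specialised task per sub-concept of X.

def subconcepts(concept, is_a):
    # transitive closure over the Is-A hierarchy
    result = []
    for child in is_a.get(concept, []):
        result.append(child)
        result.extend(subconcepts(child, is_a))
    return result

def specialise(task_template, parameter, is_a):
    # task_template contains the placeholder "<parameter>"
    return [task_template.replace("<" + parameter + ">", sub)
            for sub in subconcepts(parameter, is_a)]

is_a = {"activity": ["meeting", "workshop"], "meeting": ["board meeting"]}
print(specialise("create agenda for <activity>", "activity", is_a))
# ['create agenda for meeting', 'create agenda for board meeting',
#  'create agenda for workshop']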
225. ort was useful and gave a nice overview (2 KE ISN, EADS; 1 Coach FBK)
o Some errors were detected through the formal checks (1 KE ISN)
o The formal check report was very long (1 KE ISN)
Ontology questionnaire
o The ontology questionnaire was not used (1 KE EADS)
8.5 Phase 3 Modelling of learning goals
8.5.1.1 Feedback on the IMM for P2
General
o Modelling learning goals was difficult because of a lack of understanding of the role of models in APOSDLE (1 KE)
Objectives and explanations
o Competency Types were sometimes unclear (1 KE CCI)
TACT
o There were minor technical problems with TACT (1 KE CCI)
8.5.1.2 Feedback both for the IMM of P2 and P3
Objectives and explanations
o The explanations were clear (P2: 1 KE ISN; P3: 1 KE EADS)
o The objectives of this phase were clear (P2: 1 KE ISN; P3: 1 KE EADS)
TACT
o The TACT tool was easy to implement and use (P2: 4 KE ISN, CCI, CNM, EADS; P3: 3 KE EADS, ISN, CCI; 1 Coach SAP)
o The TACT manual was very useful (P2: 2 KE CCI, EADS; P3: 2 KE EADS, CCI; 1 Coach SAP)
o There were minor usability issues in TACT (P2: 1 KE ISN; P3: 1 KE ISN)
o TACT can be improved (P2: 1 KE EADS; P3: 1 KE EADS)
226. ote that after removing the reason(s) for the selected axiom you can go back to the List Entailed Statements View and, if you check, you should not find it in the list anymore.
Figure 5 The Justification view: for a justified statement (e.g. ANOVA subClassOf Test) one reason is shown (ANOVA subClassOf Parametric_Test, Parametric_Test subClassOf Test) and can be removed from here
1.2.6 Undo and check which axioms you have already removed
Go to the List Removed Axioms View. You see a list of axioms (statements) which you have removed from the ontology since you uploaded it to the questionnaire. By checking the checkbox in front of one or more axioms and then clicking Reinsert, you can add them again to the ontology, thus undoing your changes.
Aposdle Specific: When you are finished with the questionnaire, i.e. when you have reviewed all inferred statements and are ready to assert the changes, go to the List Removed Axioms View. For each axiom that is listed there: if it says A subClassOf B, then go to the concept page of the concept A in the MoKi. In the line Is A you should see the concept B. Edit the concept description and remove the concept B. Please, for evaluation pur
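The remove/reinsert bookkeeping described above can be pictured with a small Python sketch based on rdflib. This is only an analogy to the questionnaire's behaviour, not its actual implementation, and the namespace and concept names are invented for the example.

# Analogy only (not the ontology questionnaire's code): removing an axiom,
# keeping it in a "removed axioms" list, and reinserting it to undo the change.
from rdflib import Graph, Namespace, RDFS

EX = Namespace("http://example.org/domain#")   # hypothetical namespace
g = Graph()
g.add((EX.ANOVA, RDFS.subClassOf, EX.Parametric_Test))
g.add((EX.Parametric_Test, RDFS.subClassOf, EX.Test))

removed = []                                   # the "List Removed Axioms" view

def remove_axiom(axiom):
    g.remove(axiom)
    removed.append(axiom)

def reinsert(axiom):                           # undo, as described above
    if axiom in removed:
        removed.remove(axiom)
        g.add(axiom)

remove_axiom((EX.ANOVA, RDFS.subClassOf, EX.Parametric_Test))
reinsert((EX.ANOVA, RDFS.subClassOf, EX.Parametric_Test))
print(len(removed))  # 0, the removal has been undone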
227. ould be worth considering for further improving the modelling method at this stage 5 2 7 General remarks General remarks for P2 were made on the fact that modelling efforts could not be estimated KE CCl ISN and that the organization of the modelling process was not clear KE CNM In addition in general there was a lack of understanding of the role of models in APOSDLE Coach SAP No remarks were made concerning these general issues for P3 This might be due to the fact that the KEs already had experience with the modelling methodology that they could estimate the modelling effort and that the organisation was therefore clear Nonetheless this is an important point when starting the modelling process in a new learning domain The role of the models the modelling process and the modelling efforts need to be clarified in advance in order to allow for a successful creation of models For P2 it was stated that the KEs had difficulties with knowledge engineering because they had no experience with it KE CCl Also for P3 it was stated that modelling cannot be done by people who have no experience in knowledge engineering KE CCI ISN Coach SAP A decision needs to be made whether it is intended that modelling can also be done by persons who have no experience in knowledge engineering If this is the case means and measures should be thought of which introduce people not experienced in knowledge engineering with the very basic ideas and pri
228. perty rights management for small and medium enterprises Exactly this CCI genuine subject matter should be imparted to consulting novices with help of the APOSDLE system Do you have any additional remarks or suggestions for improvement e The modeling process is still very time consuming this disadvantage has hardly changed This can be accepted for a test situation but not for a real world application Speed up modeling is crucial e Modelling needs still a person with the qualification of a knowledge engineer Modelling should be so easy that domain experts can do it themselves knowledge engineering qualification is quite rare in small and medium sized companies Page 10 Integrated Modelling Methodology Collection of feedback Integrated Modelling Methodology Collection of Feedback The goal of this questionnaire is to collect positive and negative feedback useful for the evaluation and improvement of the Integrated Modelling Methodology Please state your comments the way you prefer you can provide feedback in form of short sentences and bulleted lists as well as more complex descriptions Both formats are perfectly acceptable Note for APs Please provide feedback only for the steps you have already completed Partner Name EADS IW 1 Step 0 Scope amp Boundaries and Resources Collection The goal of this step is to define the scope and boundaries of the application domain to be modeled and to gather som
229. plied in three domains KE ISN EADS Coach SAP which was found to be very useful These results indicate that different knowledge elicitation techniques are useful in some situations why they are not in others and that there are personal preferences of domain experts e g some like card sorting some do not which need to be taken into account when preparing a knowledge elicitation session 5 2 3 Phase 2 Modelling of domain and tasks Both for P2 and for P3 the objectives of the phase were clear according to the KEs and coaches P2 KE ISN EADS CCI Coach KC P3 KE EADS Coach FBK However at this modelling stage in P2 the KEs still had problems to understand how the models would be used in APOSDLE KE EADS While the explanations of what to do also were clear according to the KE P2 KE ISN P3 KE EADS in P2 modelling rules for the semantic MediaWiki P2 were not clearly specified KE EADS Furthermore it was not clear how to handle the P2 Wiki KE CCl and a manual was lacking KE 42 learn work A anposcile 52 CNM More specifically the problems of the semantic MediaWiki from P2 mentioned by the respondents were that it was unclear how to model relationships between concepts KE CCl ISN and that it was difficult to find the functionality for renaming tasks and concepts Coach KC The main problem at this stage for P2 was the change of the Wiki to a semantic MediaWiki which caused additional effort
230. poses, copy and paste the list of removed axioms into an email and send it to Viktoria Pammer (vpammer@know-center.at).
Figure 6 The List Removed Axioms view: previously removed axioms (e.g. anova subClassOf Comparing_Means, anova subClassOf Parametric_Test) can be selected and reinserted
1.3 Additional features
1.3.1 Delete explicitly given statements
In the Explicit statements box (see Figure 4) you see statements that were explicitly given in the ontology. If you decide you do not want to state this after all, you can simply check the checkbox corresponding to the statement and click on the Delete Button at the bottom of the box.
1.3.2 Save current ontology
In case you want to save the changed ontology to your local system, go to the Save current ontology View. Depending on the browser you use, you will either be prompted directly to save the file or you will see a lot of text (RDF XML) in the browser window. In this case click on File and Save As to save the ontology. Note that the ontology questionnaire does not store labels, comments or similar things.
1.3.3 Options
In the Options View you can (un)check the option Use symbolic rendering engine. After changing the selection you must click Submit. If this checkbox is checked the statements will be shown as ANOV
231. priate for the learn work approach taken by APOSDLE Phase 1 Knowledge Acquisition The goal of this phase is the acquisition of knowledge about the application domains that have to be formalised and integrated in the APOSDLE knowledge base The proposed methodology aims to extract as much knowledge as possible from both Domain Experts and available digital resources identified by the Application Partners The elicitation of knowledge from Domain Experts is based on well known state of the art techniques like interviews card sorting laddering and concept step chapter listing while the extraction of Knowledge from digital resources is based on algorithms and tools for term extraction described in Pammer Scheir amp Lindstaedt 2007 The key aspect of this phase is twofold first the methodology has to support the effective and rapid Knowledge acquisition from Domain Experts who are often rarely available and scarcely motivated towards modelling second the methodology has to ease the process of modelling by reusing knowledge already present in digital format in the organisation Phase 2 Modelling of Domain Tasks Starting from the knowledge elicited in Phase 1 in this phase a complete formal description of the domain and task models which are part of the APOSDLE knowledge base is provided the domain model is about the specific work domain application domain a user wants to learn about with APOSDLE while the task model concerns
232. pts are the relations modelled in the MoKi correct? Are the different types of relations (is a and part of) used correctly? Would other, self-defined relations be more useful for expressing the relation between two concepts? Coaches should provide support to this check.
Hierarchy: Should some very similar concepts be grouped into a superordinate class? Coaches should provide support to this check.
Concepts vs tasks: Are all concepts domain concepts, or should some of them be modelled as tasks?
Descriptions:
o Are the descriptions comprehensible? Do they make sense? Suggestion: if possible, ask an external person to read the descriptions and see if they make sense.
o Are the descriptions correct for the given concept? As a fictitious example, assume that the concept Erfindungen von ArbeitnehmerInnen (Inventions of employees) has the description "legal regulation for inventions of employees". This description does not exactly describe its concept, and it seems that the Knowledge Expert means something different than what is indicated in the label. This may be a problem for APOSDLE, as it could lead to some ambiguity in the annotations and in the learning goals.
3.2 Task Model
Labels:
o Are the labels easy to understand for people who don't know APOSDLE?
o Are the labels too short/long? Please remember that very short labels bear the risk of not meaning much to the user. Analogously, unnecessary long or convolute
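Some of the label and description checks listed above are simple enough that they could, in principle, also be supported by small automated heuristics. The following Python sketch is only an illustration of that idea; the thresholds and messages are assumptions and are not part of the IMM or of MoKi.

# Illustrative heuristics for a few of the manual checks above (label length,
# missing descriptions). Thresholds are arbitrary example values.

def check_concept(label, description):
    warnings = []
    if len(label) < 3:
        warnings.append(label + ": label may be too short to mean much")
    if len(label.split()) > 6:
        warnings.append(label + ": label may be unnecessarily long or convoluted")
    if not description.strip():
        warnings.append(label + ": description is missing")
    return warnings

print(check_concept("EM", ""))
# ['EM: label may be too short to mean much', 'EM: description is missing']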
233. questionnaires received with respect to the first version of the methodology the time consumed dealing with technical aspects regarding the tools has substantially reduced Furthermore thanks to the experience gained in developing the models for P2 in P3 we have been able to provide a better quantification of the effort required for each step of the methodology and this has helped coaches and APs to better plan their work Supporting Tools have been remarkably improved There is a huge difference between the MOKI and the Wiki we used last year The TACT tool is much improved from the point of usability Source APs Coaches 33 learn work A anposcile 52 5 1 2 9 1 2 1 Importance of tools supporting validation and revision The automatic checks turn out to be very useful to reach the objectives Source APs Coaches One single tool covering all the phases It would be nice to have everything integrated in one tool After the final checks found a task where wanted to change the labeling this can just be done by technical partners Also now have to update the MOKI according to the changes did in TACT Adding deleting variables the best thing would be the have one tool where everything is changed automatically Source AP Comment A feasible solution could be to integrate in MoKi all the other modelling tools TACT Validation Tools in order to have a unique interface tool covering all the phases of the methodolog
234. r of tasks in the model is drastically increased Coach KC It is hard to say by now if this is really a problem This is something which needs to be investigated during the overall evaluation of APOSDLE 5 2 6 Phase 3a Validation and Revision of Learning Goals For P2 this step was performed after the evaluation questionnaires were given to the KEs and Coaches Thus no results exist for this phase The only statement for this phase in P2 was that it is hard to say when a formal model is finished Coach KC According to one KE EADS the objectives and explanations of this phase were clear No further statements were made by the respondents on the clarity of objectives and explanations in this phase which we take as an indicator that there were no major problems in this phase The check report was found to be helpful and to give a nice overview over the models KE ISN EADS Coach FBK even though it was very long KE ISN Another KE EADS stated that the result of the check report rather concerned the technology partners than the KEs From these results we conclude that effort needs to be put into the format of the check report and that further effort should be put into the explanation of the results to the KEs 45 learn work c apaescile One of the KEs ISN stated that the Excel sheet produced by the TACT was very useful for checking the learning goal model Exploiting the output of the TACT as a model validation method c
235. r instance by changing the presentation format of the automated check report 44 learn work A anposcile 52 As for knowledge elicitation experts were also involved in this step Again for P2 the KEs were facing the problems that experts had very little time and were not available for model revision CCI EADS and that they were not interested in it KE CCl Coach KC Nothing was stated about the availability of experts in this modelling step for P3 It can be assumed that the availability of experts is always a critical issue We interpret the absence of negative statements as an indicator that a the experts were more interested in model revision because they had more knowledge about APOSDLE and the role of models and b the validation tools provided made it easier for the KEs and experts to check their task and domain models 5 2 5 Phase 3 Modelling of learning goals For the phase of modelling learning goals we were facing the same situation as for the previous steps while the objectives of this phase were clear P2 KE ISN P3 KE EADS and also the explanations were understandable P2 KE ISN P3 KE EADS modelling learning goals in P2 was still perceived as difficult because of a lack of understanding about the role of models in APOSDLE Both versions of the TACT tool were easy to implement and use P2 KE ISN CCI CNM EADS P3 EADS ISN CCl Coach SAP with minor usability issues P2 KE ISN P3 KE ISN mino
236. r technical problems P2 CCl and suggestions for improvement were given P2 KE EADS P3 KE EADS like for instance that learning goals for sub concepts should be inherited from learning goals for high level concepts CCI When modelling learning goals for P2 learning goal types previously called competency types were unclear in some situations KE CCl For the P3 version of the TACT descriptions of inheritance mechanisms in the TACT were missing KE CCl In all these results indicate that the TACT generally fulfilled its purpose Suggestions for improvement should be considered in future versions of the tool For modelling P3 the concept of variables was introduced Some KEs used variables to reduce the modelling effort KE ISN Coach KC others avoided them in order to keep the models simple KE CCl Coach SAP The KEs found it was easy to understand how to use variables KE ISN and to insert them in the MoKi KE ISN EADS Coach FBK and in the TACT KE ISN According to one KE ISN the best procedure is to insert variables in the Moki and get them as hints in the TACT One difficulty reported about variables was that it was difficult to understand how variables would be used in APOSDLE KE EADS This again points to the necessity of making the functions of APOSDLE more transparent for the people involved in modelling A possible problem that occurred when using variables for learning goal modelling is that the numbe
237. rallel.
3.3.2.1 Knowledge Acquisition from Digital Sources
The KE uses text mining services, such as relevant term extraction and document clustering, to automatically elicit relevant topics from the digital resources collected in Phase 0. The available text mining functionalities are analogous to the functionality of the Discovery Tab Protege plug-in described in Pammer, Scheir & Lindstaedt (2007). The text mining functionalities support the English and the German language. The functionalities are: (i) extract relevant terms from a set of documents, (ii) group synonymous terms, (iii) cluster a set of documents, and (iv) extract relevant terms of each cluster. The text mining functionalities were embedded in the MoKi (see Section 4.2.2.1). This means that knowledge acquisition immediately delivers an input to modelling, since extracted terms can directly be saved as potential concepts if the KE deems them to be relevant.
3.3.2.2 Knowledge Elicitation from Domain Experts
In order to elicit knowledge from the DEs, the KE applied various techniques. Structured interviews were conducted by the KE for refining the list of domain concepts, the list of tasks, the relations between tasks, and the mappings between tasks and domain concepts. Special knowledge elicitation methods (e.g. card sorting, laddering, step listing, chapter listing) were applied for eliciting tacit knowledge from the DEs. Concept listing, step listing and chapter listing
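As a very rough sketch of what relevant term extraction does, the Python fragment below ranks candidate terms by plain frequency. The real text mining services (Pammer, Scheir & Lindstaedt 2007) are considerably more sophisticated; the stop word list and function names here are assumptions made only for the example.

# Toy term-frequency extraction, only to illustrate the idea of proposing
# candidate domain concepts to the KE from a set of documents.
from collections import Counter
import re

STOPWORDS = {"the", "a", "an", "of", "for", "and", "to", "in", "is", "with"}

def candidate_terms(documents, top_n=5):
    counts = Counter()
    for text in documents:
        for token in re.findall(r"[a-zA-Z]{3,}", text.lower()):
            if token not in STOPWORDS:
                counts[token] += 1
    return [term for term, _ in counts.most_common(top_n)]

docs = ["Maxwell equations describe electromagnetic disturbances",
        "Simulation of electromagnetic problems uses Maxwell equations"]
print(candidate_terms(docs))
# e.g. ['maxwell', 'equations', 'electromagnetic', ...]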
238. rarchy of the topics according to the is-a hierarchy. If you click on a topic you see its synonyms (6) and its description (7) to the right, if you entered them in the MoKi.
5. If you want to see only topics which are used to describe a learning goal, mark the corresponding checkbox below the topic description field (8).
6. If you want to add a learning goal to the selected task, select the topic that you want and click the "Add Learning Goal for <selected topic>" Button (9).
7. At the bottom left you see the selected task again (10). To its right you see the already assigned learning goals (10). The drop-down field shows the learning goal type, the text to its right shows the topic.
8. In order to assign to a task multiple learning goals about the same topic but with different learning goal types, proceed as follows: add a learning goal for the topic; its type is per default Unspecified (empty selection in TACT). Change its type. Then add the next learning goal, which is again created with the default type Unspecified. Change its type, etc.
9. You can delete a learning goal by clicking on the Trash Button (12), and you get an explanation why the learning goal is here by clicking on the Explanation Button (13). For more on explanations see Section 6.8.
10. The reason for this procedure is that you can not assign two exactly equal learning goals (same topic, same type) to one task. TACT always
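The rule stated above, that two exactly equal learning goals (same topic, same type) cannot be assigned to one task and that new learning goals start with the default type Unspecified, can be summarised in the following Python sketch. The class is a made-up illustration, not TACT code.

# Illustration only (not TACT code) of the assignment rule described above.

class TaskLearningGoals:
    def __init__(self, task):
        self.task = task
        self.goals = set()                    # pairs (topic, learning goal type)

    def add(self, topic, lg_type="Unspecified"):
        if (topic, lg_type) in self.goals:
            raise ValueError("identical learning goal already assigned")
        self.goals.add((topic, lg_type))

    def change_type(self, topic, old_type, new_type):
        self.goals.remove((topic, old_type))
        self.add(topic, new_type)

t = TaskLearningGoals("Prepare agenda for meeting")
t.add("Agenda")                               # created with default type Unspecified
t.change_type("Agenda", "Unspecified", "know how to apply/use/do")
t.add("Agenda")                               # a second goal about the same topic is allowed
print(sorted(t.goals))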
239. rather unsure about how to do this During modelling a sense arises for what is a meaningful mapping and what not Therefore do first cut task learning goal mappings quickly and rather by intuition without thinking too much and do not strive for perfection In a number of case studies this strategy has proven to lead to success WHEN YOU ARE FINISHED with a first cut mapping for all tasks walk through all task learning goal assignments a second time and if necessary add and remove learning goals Especially if two or more tasks have the same or very similar learning goals assigned think about it once again The first trial of your task learning goal assignment will take around 2 and 3 hours depending on the number of tasks in your model The revision will take another 60 min approximately 14 learn work A anosclle 42 This section provides you with technical information on how to install and use the TACT 6 1 Installation 6 1 1 Prerequisite In order to use the TACT you must have Java 6 installed Preferably you have Java 6 Update 10 which allows you to see a slightly nicer User Interface In Windows you can check this in the list of installed Software If you find that you do not have Java 6 installed you can download it from Sun s homepage http VVVVVV ava com de dovvnload manual isp 6 1 2 TACT The TACT distribution is located at https partner know center at KC Partner Space Aposdle 01
240. re needed One of the main results of the refinements of the Integrated Modelling Methodology is that it has allowed to instantiate even further the general task of modelling in the APSODLE context and to build the modelling phases which are needed to create an operational APOSDLE system Additional positive results are e An increased awareness of the aim of modelling and of the modelling process within APOSDLE Precise reasons for this increased awareness are difficult to provide They can be partly due to the restructuring and simplification of the IMM and to the improved guidelines manuals and guidance provided in this second version as well as to the fact that the KEs already had experience with the modelling activities performed for the 2 Prototype Nonetheless this is an important point when starting the modelling process in a new learning domain The role of the models the modelling process and the modelling efforts need to be clarified in advance in order to allow for a successful creation of models A reduction of the granularity issue Finding the right granularity level at which a domain has to be described is a general modelling issue which goes beyond the scope of APSODLE Nevertheless the integrated modelling approach taken for the IMM second version appears to have simplified the task of developing a coherent and integrated ASPODLE knowledge base thus reducing complaints about the right granularit
241. re easier to understand and better to apply.
8 Step 6 Formal Models Validation
At the end of this step all the models created (domain, task and learning goals) should be formally correct and complete. The goal of this step is to have the models validated (completeness and correctness) by the DEs. The step was supported by guidelines and by results from automatic checks similar to the one of step 3, but which also involve checking the quality of learning goals.
Feedback: Please state positive and negative experiences of the IMM for this step in the space below. Use as much space as you need. Some questions to keep in mind while providing the feedback:
Were the objectives clear?
Were the tools adequate?
What were the main difficulties encountered in this stage?
Were the explanations given clear enough?
Was the goal of the step completed after this step?
Please rate from 0 to 5, where 0 is very easy and 5 is extremely difficult, how difficult this step was for you: 2
Positive Experiences: Anew, the check report gives a very nice overview. I also liked very much the xls sheet that the TACT produces. With these two documents it is quite easy to check everything.
Negative Experiences: It would be nice to have everything integrated in one tool. After the final checks I found a task where I wanted to change the labeling; this can just be done by technical partners. Also I now have to update the MOKI accord
242. reby the topic defines the content that the learning goal is about The learning goal type specifies the type or somehow the degree of Knowledge and skills the person needs to have about this topic for performing a specific task For instance a learning goal basic knowledge about APOSDLE Wiki would describe the ability of a person to read and navigate in the APOSDLE Wiki The person would know what is available on the APOSDLE Wiki and how to move back and forth between the pages The learning goal basic knowledge about APOSDLE Wiki would not include the ability to edit the content of the Wiki or to insert pages In order to express the latter a learning goal know how to apply use do a APOSDLE Wiki would have to be defined A detailed description of how to build learning goals from topics and a taxonomy of learning goal types are given in Section 2 About this User Manual With this manual we want to give you some explanations and guidelines on what to keep in mind in order to obtain a meaningful and valid task learning goal assignment The manual is organized as follows At first brief guidelines are given on how to model learning goals using the TACT Section 2 Then learning goal types are defined and explained by means of examples Section 3 In Section 4 we gives a brief introduction about the how and why of using variables in tasks Further the guidelines from Section 2 are described in more detail S
243. red in the Moki click on Import Learning Goals from Text 3 Click Start 4 in order to start the TACT for the defined Knowledgebase If you have imported learning goals from the MoKi as described in Step 3 then you will get a reminder to save all learning goals with the TACT in order to obtain the formal models see Figure Figure 6 2 3 When you have already worked with this Knowledgebase in TACT before and have earlier versions you can restore them by clicking on Restore old data version 5 Then choose the directory which contains the desired backup The backup directories are labelled with dates so you have one backup per day yaeniiidin Learning goals have been imported into model you still need to save them using TACT Figure 6 2 When learning goals from the MoKi are loaded using a text file you still need to use the Save button in TACT in order to obtain the formal learning goal model 6 6 Creating the learning goal model simple mode This section describes the basic functionality of the TACT to create a learning goal models Most probably this will be all you need APOSDLE consortium all rights reserved page 17 aposcle learn work C Tasks m gt fi Einzelfallberatung Kundenproblemanalyse Zust ndigkeit 4 js Beratungsfall dokumentieren Standardauskuntt Description E Einzelfallberatung durchf hren 5 gt verweisen Beratung erle
244. res relation between an abstract task and an abstract learning goal is that, for an abstract task T<X> such that T<X> requires LG(lgtype, X), each specialised task T(Y) requires the learning goal LG(lgtype, Y), where X is the task parameter, Y is a subclass of X, and LG(lgtype, X) denotes a learning goal composed of the learning goal type lgtype and the domain concept X. Furthermore, since T(Y) specialises T<X>, it inherits all ground learning goals assigned to T<X>.
Example: The abstract task Prepare agenda for <activity> requires the abstract LG Basic knowledge about <activity>. Then we obtain the following pairing between ground tasks and ground LGs:
Prepare agenda for <activity> requires Basic knowledge about <activity>
Prepare agenda for meeting requires Basic knowledge about meeting
Prepare agenda for board meeting requires Basic knowledge about board meeting
Prepare agenda for demo meeting requires Basic knowledge about demo meeting
Prepare agenda for workshop requires Basic knowledge about workshop
In this way, abstract LGs allow modellers to model LGs and the pairing between tasks and LGs in a compact manner, i.e. without forcing them to specify all the ground LGs and tasks, but at the same time to obtain information in the models about the real tasks and LGs users require at run time. 8 In OWL, T(Y) is a subcl
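The expansion just described can be sketched in a few lines of Python. The representation of tasks as strings with an <X> placeholder and the helper names are assumptions made only to illustrate how the ground task / learning goal pairs are obtained from an abstract pair, not how APOSDLE implements it.

# Sketch (assumed data structures, not APOSDLE code): an abstract task T<X>
# with the abstract learning goal LG(lgtype, X) yields, for every sub-concept
# Y of X, the ground pair  T(Y) requires LG(lgtype, Y).

def subconcepts(concept, is_a):
    result = []
    for child in is_a.get(concept, []):
        result.append(child)
        result.extend(subconcepts(child, is_a))
    return result

def ground_requires(abstract_task, parameter, lg_type, is_a):
    pairs = []
    for sub in subconcepts(parameter, is_a):
        task = abstract_task.replace("<" + parameter + ">", sub)
        pairs.append((task, lg_type + " about " + sub))
    return pairs

is_a = {"activity": ["meeting", "workshop"],
        "meeting": ["board meeting", "demo meeting"]}
for task, goal in ground_requires("Prepare agenda for <activity>", "activity",
                                  "Basic knowledge", is_a):
    print(task, "requires", goal)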
245. rm goal to have only one modelling tool where the KEs can flexibly move back and forth between informal and formal task, domain and learning goal modelling. According to the statements of one of the KEs for P3, the modelling process is still time consuming, and speeding up the modelling process is seen as crucial for the success of a real world application (KE CCI). The coach of this KE (SAP) agreed that modelling was time consuming, but regarded the time to be invested in modelling as acceptable for a real world application. No remarks were made by the KEs and coaches of the other domains for P3. Table 2 shows the estimated modelling effort in hours of each KE for P2 and P3.
Table 2 Comparison of modelling efforts in hours for each phase in P2 and P3, as estimated by the knowledge engineers. The phases compared are: Scope & Boundaries and Resources Collection; Knowledge elicitation from Digital Resources; Knowledge elicitation from Domain Experts; Modelling of domain and task (Semantic Wiki); Validation and Revision; Modelling of domain and task (Protege and YAWL); Modelling of learning goals (TACT); Validation and Revision (N/A).
Looking at the table, one has to bear in mind that CCI was modelling from scratch for P3 and that only a third of the time was needed though
246. Figure 6.3 Overview of the TACT User Interface (screenshot showing the task browser with the sub-task hierarchy and Description field, the domain model element browser with filter, "Also known as" and "Description" fields, the "Add as Learning Goal for ..." button and the learning goal mapping area with its Save button; the example content is taken from the CCI domain, e.g. the task Einzelfallberatung and topics such as Beratung zum Thema Gewerbliche Schutzrechte)
On the top left you see the task browser. It shows the Sub-Task hierarchy (1). If you click on a task you see its description, if you entered a description in the MoKi, on the right (2). You see the name of the selected task once again below the task browser (3). If you want to see only those tasks to which no learning goals are yet assigned, mark the corresponding checkbox below the task browser (4).
4. In the middle on the left you see the topic browser. It shows the hie
247. rning Goals 30
5 Qualitative Evaluation and Comparison with the first version of the methodology 32
5.1 Qualitative Feedback 32
5.1.1 General Feedback on the Methodology 33
5.1.2 Feedback on Phase 0 Scope & Boundaries 34
5.1.3 Feedback on Phase 1 Knowledge Acquisition 34
5.1.4 Feedback on Phase 2 Modelling of domain tasks 36
5.1.5 Feedback on Phase 2a Validation & Revision of Domain Tasks 37
5.1.6 Feedback on Phase 3 Modelling of learning goals 38
5.1.7 Feedback on Phase 3a Validation & Revision of Learning Goals 39
5.2 Comparison of feedback for Prototype 2 and Prototype 3 39
5.2.1 Phase 0 Scope and Boundaries 40
5.2.2 Phase 1 Knowledge Acquisition 41
5.2.3 Phase 2 Modelling of domain and tasks 42
5.2.4 Phase 2a V
248. rocedure Therefore know how to produce has to be linked to topics that refer to results e g project report or products e g build a software Example If this learning goal type is linked to a topic for instance Wiki content the learning goal know how to produce Wiki content means that a person knows the Wiki setup and notation and is able to edit the Wiki content In this case profound knowledge of Wiki content is a prerequisite of Know how to produce Wiki content and therefore a task that requires the learning goal Know how to produce Wiki content would also require the learning goal profound knowledge of Wiki content However this is no general rule learn work A anosclle 42 The decision whether the ability to edit the content of the Wiki is specified by the learning goal know how to produce Wiki content or know how to apply use do APOSDLE Wiki is on the knowledge engineer and depends on whether there are clear rules or procedures for creating the Wiki know how to apply use do or not know how to produce APOSDLE use case The learner wants to know how to produce a lt domain element gt The knowledge worker has to produce something that is not clearly defined but has some constraints for example a plan an agenda for a meeting a design and wants to know what he has to keep in mind when performing such a task He searches for lessons learned by others lik
249. rs and some of the domains were the same as in P2 the subsequent steps of Phase 0 as followed for P2 were not carried out In principle however Phase 0 is still intended to contain a workshop in which the KEs are informed about the procedure of modelling this methodology and the roles of models in APOSDLE By means of concrete written scenarios tasks shall be identified and concrete learning needs derived Simultaneously resources should be collected that constitute potential learning materials 3 2 3 Supporting Tools Techniques amp Resources 3 2 3 1 Initial questionnaire As mentioned above an initial questionnaire was used to gain insight about properties of the application domain The questions identified target user groups and their tasks the application domain typical learning needs and high level learning goals central domain concepts The questions also asked about existing learning support and perceived insufficiencies e g bottlenecks experienced during learning from experts as well as existing digital resources containing knowledge about the application domain On the one hand its purpose is to ascertain the suitability of the learning domain for APOSDLE and the support of the company for the introduction of a work integrated learning system such as APOSDLE For instance if only a very few knowledge workers will benefit and learn work A anposcile 52 personnel turnover is very low in a company it is questionable
250. rs next to a task 1 This learning goal was created manually in the current session This learning goal was manually added since the current Knowledgebase has been last opened with the TACT 2 This learning goal was imported from an existing learning goal model This learning goal was already contained in the previous learning goal model It may have been that this learning goal was imported from text from the knowledge required in the MokKi Another case is that the learning goal model was saved previously and has now been reloaded 3 This learning goal was created automatically and contains the same variable as the task This learning goal contains a variable It is the same variable as the corresponding task All specialised tasks will get a learning goal with the corresponding subtopic of the variable This learning goal can not be deleted except by deleting the variable 4 This learning goal was created automatically and contains the topic of the variable in the task This learning goal is a ground learning goal which contains the topic of the variable It is assigned to the task with the variable This learning goal was added automatically based on a heuristic It can be deleted age 27 learn work A anposcile 52 5 This learning goal was created automatically and contains a sub concept of the variable in the task This learning goal is assigned to a specialised task It contains a subtopic of the variable namely
251. rsion of their model Feedback on Phase 2a Validation amp Revision of Domain Tasks Positive Feedback GENERAL Since the models were kept simple the modelling process in general was much easier There were just minor changes in this step Source Coach GENERAL The check made us sure to have completed the informal model before going into the formal modelling phase Source Coach AP GENERAL The models developed into the MoKi have been translated into valid OWL formal models without any effort from application partner and coaches Source Coach AP GENERAL The objective and explanation of this step was clear Source AP GENERAL No domain experts involved Source AP TOOL The good check results are due to the improved MOKI Source AP TOOL The MoKFs usability was much better than in the previous year Therefore the models were in a good shape from the beginning Source Coach TOOL The visualization functionalities in particular the Is a and Is part of browser participate in validation end revision Source Coach AP 37 learn work A anposcile 52 TOOL The tools were adequate for the models Source Coach APs TOOL The automatic checks turn out to be very useful to fulfil the objectives Source Coach APs TOOL The check report delivers a good overview also some hints from the coaches were useful Some relations between concepts didn t make any sense and were detec
252. s has Variable Resource Identify Non prescribed Resources available to achieve goals unu Save Figure 6 5 A task with a variable is selected Specialised tasks are shown Now the selected task is shown to have a variable 5 Also a number of specialised tasks are created 6 For each subtopic of the variable one specialised task is created Each specialised task specialises the task with the variable with respect to one subtopic of the variable For instance if Resource is the variable and Non Prescribed Resource is a subtopic the task Identify Non Prescribed Resources specialises the task Identify Resources with respect to Prescribed Resource Two learning goals are automatically added to the task with the variable 7 8 The first learning goal 7 contains the same variable as the task This learning goal can only be deleted by deleting the variable The meaning of this learning goal is that each specialised task will require a learning goal for the subtopic with respect to which it specialises the task with the variable For instance the task Identify Non prescribed Resources will automatically require a learning goal with the topic Non prescribed Resource 9 You can not delete this learning goal The second learning goal is a normal learning goal for the topic 8 This learning goal can be deleted Special notes e Note that a specialised task inheri
253. s type can be used for all types of topics learn work A anosclle 42 4 1 When and why could variables be useful Our modelling experiences have shown that very often the problem arises that a task in different situations requires different learning goals For instance consider the following example Example Consider the task Prepare Agenda for Activity The topic Activity in the domain model has a number of sub topics indentation is used to indicate the sub class hierarchy Activity Meeting Board Meeting Demo Meeting Workshop It is quite difficult to assign learning goals to the task Prepare Agenda for Activity For instance preparing the agenda for a board meeting might require quite specific knowledge about board meetings e g the learning goal profound knowledge of Board Meeting whereas it might require no knowledge at all about Workshop In contrast preparing a workshop of course might require profound knowledge of Workshop and no knowledge about Board Meeting Additionally there might be knowledge that is required for performing the task Prepare Agenda for Activity independent of what is the Activity such as know how to do use apply Agenda This small example illustrates the fact that in some cases tasks are modelled in a way that they might require knowledge independent of the concrete application of the task e g know how to do use apply Agen
255. se 2a Validation & Revision of Domain Tasks 17
3.5.1 Goal 17
3.5.2 Description 17
3.5.3 Supporting Tools, Techniques & Resources 18
3.6 Phase 3 Modelling of learning goals 18
3.6.1 Goal 18
3.6.2 Description 18
3.6.3 Supporting Tools, Techniques & Resources 18
3.7 Phase 3a Validation & Revision of Learning Goals 20
3.7.1 Goal 20
3.7.2 Description 20
3.7.3 Supporting Tools, Techniques & Resources 20
4 Modelling Tools 21
4.1 ... 21
4.2 ... 21
4.2.1 Describing Knowledge in a MoKi 21
4.2.2 MoKi functionalities 23
4.3 TACT 26
4.3.1 TACT functionalities 26
4.3.2 Explanations for learning goals 27
4.4 Validation Tools 28
4.4.1 Validation & Revision of Domain Tasks 28
4.4.2 Validation & Revision of Lea
256. sentation of process order instead of the alphabetical list of tasks Source APs Comment This is a usability issue of TACT Further development of the TACT will profit from this feedback TOOL We had problems in understanding the inheritance within the tool TACT We thought if a class has the learning goal type profound knowledge about the subclasses would inherit this learning goal type This was not the case So we had to define the learning goal type for every subclass although the class had already this learning goal type This procedure took a lot of time Source AP Comment This may be due to a misunderstanding of inheritance in general Learning goal types are not inherited by domain concept class Therefore it is possible that one domain concept is used together with different learning goal types for different tasks This is the intended behaviour but obviously the communication explanation to the knowledge engineers was lacking TOOL Another difficulty was that our multi hierarchies were not visible within the tool although they were within the Semantic Media Wiki Source AP Comment TACT currently only supports the visualisation of the OWL class subclass hierarchy Further development of the TACT may profit from this feedback TOOL about Adding variable in TACT When you know where you have to do it highlight respective task and topic then it is easy But first did not know how to do this Source A
257. sponding task. All specialised tasks will get a learning goal with the corresponding subtopic of the variable. You cannot delete this learning goal except if you delete the variable.

This learning goal was created automatically and contains the topic of the variable in the task: This learning goal is a normal learning goal which contains the topic of the variable. It is assigned to the task with the variable.

This learning goal was added based on a heuristic: You can delete it.

This learning goal was created automatically and contains a sub-concept of the variable in the task: This learning goal is assigned to a specialised task. It contains a subtopic of the variable, namely the subtopic with which the specialised task specialises the task with the variable. You cannot delete this learning goal except by removing the variable. This, however, will also remove the specialised task.

6.9 Trouble Shooting

6.9.1 Unselecting elements
In order to unselect any element, hold down the Ctrl key on your keyboard and click on the selected element.

6.9.2 Different ontology, task ontology or required knowledge files
If you want to do the mapping with a different domain ontology or a different task ontology: close the TACT, open it again and choose the Knowledgebase that you want to open.

Some questions arose and some lessons were learned during our user tests.

FAQ
- Why do some topics occur
258. ss the latter, a learning goal know how to apply/use/do a APOSDLE Wiki would have to be defined.

The learning goal model contains the list of learning goals which refer to a certain business domain and task model. Furthermore, the learning goal model connects tasks to learning goals by means of the requires relation, which is used to specify which tasks require which learning goals.

7.3.1 Modelling learning goals with parameters

Our modelling experiences have shown that very often the problem arises that a task in different situations requires different learning goals. For instance, consider the following example.

Example: Consider the task Prepare Agenda for Activity. The topic Activity in the domain model has a number of sub-topics (indentation is used to indicate the sub-class hierarchy):

Activity
    Meeting
        Board Meeting
        Demo Meeting
    Workshop

It is quite difficult to assign learning goals to the task Prepare Agenda for Activity. For instance, preparing the agenda for a board meeting might require quite specific knowledge about board meetings (e.g. the learning goal profound knowledge of Board Meeting), whereas it might require no knowledge at all about Workshop. In contrast, preparing a workshop of course might require profound knowledge of Workshop and no knowledge about Board Meeting. Additionally, there might be knowledge that is required for performing the task Prepare
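Viewed as data, the requires relation is simply a mapping from tasks to sets of learning goals, where each learning goal pairs a learning goal type with a domain topic. A minimal sketch of that structure, using learning goal types and examples mentioned elsewhere in this deliverable (illustrative names only, not the actual APOSDLE OWL vocabulary):

```python
# Illustrative sketch of the learning goal model: a learning goal is a pair
# (learning goal type, domain topic), and "requires" maps tasks to the
# learning goals they need. Not the APOSDLE OWL representation.
from typing import Dict, Set, Tuple

LearningGoal = Tuple[str, str]  # (learning_goal_type, domain_topic)

requires: Dict[str, Set[LearningGoal]] = {
    "Prepare Agenda for Meeting": {
        ("know how to apply/use/do", "Agenda"),
        ("profound knowledge of", "Meeting"),
    },
    "Write final report": {
        ("know how to produce", "Final Report"),
    },
}

def learning_goals_for(task: str) -> Set[LearningGoal]:
    """Which learning goals does a task require?"""
    return requires.get(task, set())

print(sorted(learning_goals_for("Prepare Agenda for Meeting")))
```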
259. st of Tasks 5, Move concept 7, rename concept 7, sub-task hierarchy 5, Task Model Management 10, 15

Bibliography

[1] Chiara Ghidini, Marco Rospocher, Luciano Serafini, Barbara Kump, Viktoria Pammer, Andreas Faatz and Joanna Guss. Integrated modelling methodology version 1. APOSDLE Deliverable 1.3, 2007.
[2] Matthew Horridge, Holger Knublauch, Alan Rector, Robert Stevens and Chris Wroe. A practical guide to building OWL ontologies using the Protégé OWL plugin and CO-ODE tools, edition 1.0, August 2004.
[3] Natasha Noy and Evan Wallace. Simple part-whole relations in OWL ontologies. http://www.w3.org/2001/sw/BestPractices/OEP/SimplePartWhole/simple-part-whole-relations-v1.5.html

Project Number 027023 APOSDLE Advanced Process Oriented Self Directed Learning Environment
Integrated Project, IST Technology-enhanced Learning, Information Society Technologies

Validation & Revision of Domain Tasks
Integrated Modelling Methodology
APOSDLE Identifier: APOSDLE W10 JRS Agenda Plenary and GA Trento
Author (Partner): Chiara Ghidini (FBK), Viktoria Pammer (KC), Barbara Kump (TUG)
Work Package / Task: WP1
Document Status: Draft
Confidentiality: Confidential

Version / Date / Reason of change:
1 (2008-10-01): Document created
2 (2008-10-10): Distinction between manual and automatic checks added
3 (2008-10-11): Step on questionnaire added
4 (2008-10-16)
260. t or know how to apply/use/do APOSDLE Wiki is up to the knowledge engineer and depends on whether there are clear rules or procedures for creating the Wiki (know how to apply/use/do) or not (know how to produce).

APOSDLE use case: The learner wants to know how to produce a <domain element>. The knowledge worker has to produce something that is not clearly defined but has some constraints (for example a plan, an agenda for a meeting, a design) and wants to know what he has to keep in mind when performing such a task. He searches for lessons learned by others, like guidelines, checklists, templates, examples and/or constraints, that give some structure in performing the task without giving a recipe or prescription.

Material use types: guideline, checklist, template, example how, constraint

APOSDLE Examples:
- know how to produce final report: ability to write a final project report for the customer; includes the knowledge of standards and norms for layout, organization, references etc.
- know how to produce scenario: ability to generate simulation scenarios that enable identifying the major entities that must be represented by a simulation
- know how to produce REACH material for external consulting: ability to generate documents which the IHK employees can hand over to the customers during the consulting process

7.4.1.5 unspecified

Definition: The learnin
261. t although the changed ontology can in principle be saved directly for APOSDLE, you must note the axioms you deleted and delete them manually in the MoKi. How this can best be done is also described in detail below; we expect that to be quite fast and easy, however.

1.2 Step by Step through the Questionnaire

1.2.1 Start the questionnaire

[Screenshot: a browser window opened at http://services.know-center.tugraz.at:8080/InteractiveOntologyQuestionnaire, showing the link "Click here to start the interactive ontology questionnaire".]

Figure 1: Click on the link "Click here to start the interactive ontology questionnaire".

1.2.2 Upload your domain ontology

Upload the domain ontology for which you want to verify the inferences.

[Screenshot: the Upload Ontology page, with the navigation "You are here: Upload Ontology | List Entailed Statements | Save current ontology | List Removed Axioms | Options", the text "Here you can specify the ontology you want to upload so you can operate on it", and Browse and Upload buttons.]

Figure 2: Click on Browse to open a file dialogue and browse for your ontology file. Click on Upload to upload it.

APOSDLE specific: If XXX is a prefix like EADS, ISN and so on for your company, this file is called XXXdomain ontology owl. If you do not know where you can find it, ask your coach for it.

1.2.3 Navigation

The header of the page shows the followi
262. t for the domain using tools like a terms extractor Phase 2 Informal Modelling an informal but rather complete description of the models is created using a Semantic MediaWiki Phase 3 Formal Modelling and Integration each formal model is built using the appropriate formal language Phase 4 Validation amp Revision the formal models are evaluated and if necessary revised accordingly Figure 2 The IMM First Version The IMM First Version also defines the actors who belong to the so called modelling team and collaboratively work in the modelling process e Domain Expert DE The DE provides the fundamental knowledge about the domain of the users of APOSDLE and their learning needs The DE also specifies the pool of resources to be used for knowledge extraction e Knowledge Engineer KE The KE helps the elicitation of knowledge from the DE and guides the entire modelling process e Coach The coach is a person who comes from the APOSDLE team and has the task of supporting Knowledge Engineers who are not completely skilled in modelling throughout the entire modelling process The IMM First Version was used by each Application Partner to build the specific APOSDLE Knowledge Bases for the APOSDLE 2 Prototype see Deliverable D6 8 Application Partner Domain Models From this experience and from the evaluation of the feedback collected and reported in Deliverable D1 3 Integrated Modelling Method
263. t have to be considered during the task numbering process. Obviously, T1: number 1. Since T5 and T2 are separated by an OR split, we could assign both tasks a number 2, as follows: T5: number 2 and T2: number 2 (parallel to T5). T3 and T4 follow T2. Therefore a logical consequence would be that T3: number 3 (after T2) and T4: number 3 (after T2, parallel to T3). Obviously this order leads to a conflict: T3 follows T2 (number 2) but is still parallel to T5 (number 2), which is not expressed using this way of numbering. Introducing an abstract task (see dashed oval in the following figure) we can easily solve this conflict. T1 is still the first task (T1: number 1). Since T5 is parallel to T2, T3 and T4, we introduce an abstract task including the three latter tasks. Note that in the following, letters are used to indicate splits in the task model; sub-tasks are indicated by different number levels separated by a dot. The following numbering is a result of this approach:

- T5: number 2a
- T2: number 2b.1 (parallel to T5)
- T7: number 2b.1.1
- T8: number 2b.1.2
- T3: number 2b.2a (after T2, parallel to T5)
- T4: number 2b.2b (after T2, parallel to T3, parallel to T5)
- T6: number 3

In this case 2b would be the abstract task that does not necessarily have to be specified by the application partners. Clearly, all temporal relations are expressed using this numbering. We introduced this numbering scheme as the attribute task number pr
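Read as an algorithm, the convention above walks the task structure: items in a sequence count up (1, 2, 3, ...), parallel branches share their number and differ by a letter, and sub-tasks append a further level. The sketch below reproduces the numbers of the example; the data structure and helper names are invented for illustration and are not part of the APOSDLE tooling.

```python
# Sketch (not APOSDLE code) of the numbering convention explained above:
# sequential tasks count up, parallel branches share a number and differ by
# a letter, and sub-tasks append a further ".n" level.
from string import ascii_lowercase

def number_seq(items, prefix=""):
    """Number the items of a sequence 1, 2, 3, ... below the given prefix."""
    numbers = {}
    for i, item in enumerate(items, start=1):
        numbers.update(number_item(item, f"{prefix}{i}"))
    return numbers

def number_item(item, label):
    numbers = {}
    if isinstance(item, tuple) and item[0] == "par":
        # Parallel branches: same number, distinguished by a letter.
        for branch, letter in zip(item[1], ascii_lowercase):
            numbers.update(number_item(branch, f"{label}{letter}"))
    elif isinstance(item, tuple) and item[0] == "seq":
        # An abstract task grouping a sequence: its children are sub-levels.
        numbers[item[2]] = label
        numbers.update(number_seq(item[1], f"{label}."))
    else:
        name, subtasks = item
        numbers[name] = label
        numbers.update(number_seq(subtasks, f"{label}."))
    return numbers

# The example task model: T1; then T5 parallel to an abstract task that
# contains T2 (with sub-tasks T7, T8) followed by T3 parallel to T4; then T6.
model = [
    ("T1", []),
    ("par", [("T5", []),
             ("seq", [("T2", [("T7", []), ("T8", [])]),
                      ("par", [("T3", []), ("T4", [])])],
              "abstract task")]),
    ("T6", []),
]

print(number_seq(model))
# {'T1': '1', 'T5': '2a', 'abstract task': '2b', 'T2': '2b.1',
#  'T7': '2b.1.1', 'T8': '2b.1.2', 'T3': '2b.2a', 'T4': '2b.2b', 'T6': '3'}
```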
264. t in MoKi

while Figure 10 gives an example of a page describing an element of the task model.

[Screenshot: the "Edit Task model" form for the task "Analyse the scope validity domain of the new tool", with fields for Annotations; Description ("When modelling human activities in the current system the analyst should consider the scope of the new computer-based system to be introduced and whether and how the new system will impact on the work and human activities being observed. This task advises on how to undertake such analyses."); Structural Information (Concept to be used as parameter, Task id 2b.1c, Subtasks, Knowledge Activity Description required); and the standard wiki controls (This is a minor edit, Watch this page, Save page, Show preview, Show changes, Cancel).]

Figure 10: The page of a task in MoKi

4.2.2 MoKi functionalities

MoKi provides several groups of functionalities to support modelling, all of which can be accessed via a wiki-style menu. This section contains a description of the functionalities currently available. Concerning future extensions, MoKi is built in a modular way in order to facilitate the plugging in of new or existing state-of-the-art tools.

4.2.2.1 Import Functionalities

We provide three types of import functionalities:

- Import of available domain/task formal models. With this functionality
265. task i e what does one have when starting the task what does one produce when the task is finished Next the DEs are asked to pick topics which are relevant for the task and to start to find a common solution i e a common agreement on the set of topics which are required for the task Typically this leads to discussions among the experts which can also serve as valuable source of information for the knowledge engineer For instance often experts arrive at conditions under which one set of topics is required for a task and conditions under which another set of topics would be more helpful From such discussions tasks can be identified which are too broad or too generic etc Once the experts arrive at a conclusion the result is documented e g photographed This procedure is repeated for all tasks of interest For each task the procedure leads to a preliminary task topic assignment Other possible outcomes side products of the card sort are tasks that need to be renamed tasks that are unnecessary a wrong task sequence of tasks missing tasks and concepts need to be refined or to be defined During 2 hours approximately 10 tasks can be worked on It is important to pre select tasks that shall be discussed during the session and to pre select the cards for these tasks from the list in order to warrant a smooth process Walking through the tasks in a regular sequence i e in the sequence in which they are usually perfor
266. ted through the formal checks and the coaches Source AP 5 1 5 2 Negative Feedback GENERAL s part of relations has to be changed into is a relations This step did not produce much effort we still don t understand why relations are provided at the beginning and have to be changed at the end because of technical reasons Furthermore the current model does not exactly express what the AP did want to model Source Coach AP Comment The guidelines and manuals supporting the methodology and tools should emphasize how the informal descriptions provided in MoKi are actually formalized in the OWL models TOOL The formal check report was very long Source AP Comment The length of the check results file depends on the number of the checks performed and the number of entries violating the checks While the first number is fixed the second depends on the models produced An option would be to find a more compact way to represents the results of the checks maybe a table instead of a plain text file Feedback on Phase 3 Modelling of learning goals Positive Feedback GENERAL Learning goal types are now clearer and more adequate for us Source AP TOOL The installation and use of TACT tool is very easy and the user manual contains quite clear definitions and examples of the different competency types Source AP TOOL The TACT manual was very useful to understand the meaning of learning goal types and how to use the
267. ted to the abstract task. Abstract learning goals can only be deleted by deleting the variable. A second learning goal is automatically assigned to the task, namely a specialised learning goal that is about the domain topic which is the variable in the task. This automatically created learning goal in the example from above would be unspecified Activity, meaning that performing the task Prepare Agenda for Activity in any case might require knowledge about Activity in general. This learning goal can be deleted. The learning goal types of the two automatically created learning goals can be modified manually.

Specialised tasks inherit learning goals from abstract tasks. Each abstract task is split into specialised tasks, e.g. the task Prepare agenda for <Activity> is decomposed into Prepare agenda for Meeting, Prepare agenda for Board Meeting, Prepare agenda for Demo Meeting, Prepare agenda for Workshop. The task learning goal mapping from abstract tasks is inherited by each specialised task.

For instance, consider the task learning goal mapping for the abstract task Prepare agenda for <Activity>:

- unspecified Activity (automatically created abstract learning goal)
- profound knowledge of Activity (automatically created specialised learning goal)
- know how to apply/use/do Agenda (manually created learning goal)

The specialised task Prep
268. tements which are inferred from the uploaded ontology. If you open the owl file with a text editor, you would not find these statements written there. The second box shows the statements which were explicitly given in the MoKi.

Entailed Statements: In this box you see the statements (axioms) that are entailed (inferred) by the specified ontology. You can get a justification by se… ontology (see the second box below).

[Screenshot: the Entailed Statements box, listing inferred statements such as "One Way ANOVA subClassOf Parametric Test", "One Way ANOVA subClassOf Inferential Statistics", "One Way ANOVA subClassOf Comparing Means", "One Way ANOVA subClassOf Test" and "Correlation subClassOf Inferential Statistics".]

Axioms: In this box you can see the axioms the specified ontology consists of; by checking one of the checkboxes and clicking the remove …

[Screenshot: the Explicit Statements (Axioms) box, with checkboxes next to asserted axioms such as "Standard Deviation subClassOf exists Square_root_of Variance", "isPartOf is transitive", and domain/range axioms for statistics concepts (Two Way ANOVA, General Linear Model (GLM), Linear Correlation, F Test, T Test, Hypothesis Testing, ANOVA, Linear Regression, Crosstab, Standard Deviation, Mean).]

Figure 4

1.2.5 Find out the reason
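Outside the web questionnaire, the same kind of check (which subClassOf statements are entailed by the ontology but not asserted in it) can be reproduced with an off-the-shelf OWL reasoner. The sketch below uses the owlready2 Python library with its bundled reasoner; the file path is a placeholder, and this is not the APOSDLE tool itself.

```python
# Sketch: list subClassOf statements that are entailed by an ontology but not
# asserted in it, using the owlready2 library and its bundled reasoner
# (requires Java). The file path below is a placeholder, not an APOSDLE file.
from owlready2 import get_ontology, sync_reasoner, Thing

onto = get_ontology("file:///path/to/XXXdomain-ontology.owl").load()

# Remember which superclasses are asserted for each class before reasoning.
asserted = {cls: set(cls.is_a) for cls in onto.classes()}

# Running the reasoner adds inferred parents to each class's is_a list.
sync_reasoner()

for cls in onto.classes():
    for parent in set(cls.is_a) - asserted[cls]:
        if parent is not Thing:
            print("entailed:", cls.name, "subClassOf", parent)
```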
269. terviewee s oral responses That way the interviewee can concentrate on brainstorming concepts and can speak them out immediately without having to write them down and more concepts can be listed The outcome of concept listing is an unstructured list of concepts relevant for the domain Depending on the size of the domain concept listing takes approximately 10 15 minutes 3 3 3 3 Step Listing Step listing is very similar to concept listing with the difference that the expert is asked to list all the steps in the process of doing X without worrying about their sequence The question that is asked to the respondent is What are the specific steps that a person has to do for performing X An APOSDLE example for this question would be What are the specific steps that a person has to do for performing innovation management The outcome of step listing is a possibly unstructured list of tasks that have to be performed by a person in the learning domain Depending on the domain step listing takes approximately 10 15 minutes 3 3 3 4 Chapter Listing In chapter listing the expert is asked to imagine that he or she wanted to write a book about the domain under consideration The expert then is told to come up with proposed chapter titles and subtitles for such a book For instance the question for an APOSDLE domain could be If you would write a book about requirements engineering what would be the chapters and sub chapters The out
270. that it was difficult to rate the relevance of single concepts KE CCI EADS and that a graphical representation of the models was missing for P2 KE EADS Coach SAP It seemed that the semantic MediaWiki of P2 did not support the validation of the task and domain model very well Getting an overview of the model was difficult in the Wiki Coaches SAP KC Indeed with the Wiki it was possible to extract relations between concepts so that a revision by domain experts could be done but it was difficult to extract relation pages KE ISN According to one of the coaches KC revision of P2 was only made from a formal perspective but not from a content perspective For P3 we tried to facilitate this modelling step by providing a better overview of the models in the MoKi and by preparing guidelines for manual and automated model checks as well as an ontology questionnaire One of the KEs EADS stated that the ontology questionnaire was not used but no negative statement was made with respect to the ontology questionnaire According to the KEs the MoKi especially the is a and part of browsers helped validation and revision KE CCl EADS Coach FBK The formal check report from the automated model checks was very long KE ISN but gave a nice overview KE ISN EADS Coach FBK From these statements we conclude that the support for model revision at this stage basically is useful but further effort is needed for improving them fo
271. the activities and tasks a user can perform in the organisation This description also contains a first alignment between domain elements and tasks which is used in Phase 3 as a basis for creating the learning goals model that is the model describing the learning goals a user can have in the organisation inside the specific application domain The model descriptions are created using the Modelling WiKi MoKi a tool developed within the project The MokKi allows users to describe the elements of the different models in an informal but structured manner using Natural Language It automatically translates these structured descriptions in formal models without requiring the Application Partners to become experts in the formal languages used to produce the formal models The domain and task models created are then validated and possibly revised in Phase 2a Validation amp Revision of Domain Tasks guidelines for manual revision and automatic validation checks are provided to support the Application Partners during the revision process Phase 3 Modelling of Learning Goals In this phase a formal specification of the learning goal model is produced Starting from the initial alignment between domain elements and tasks learn work A anposcile 52 produced in Phase 2 the users specify in details the learning goals using the Task And Competency Tool TACT a tool specifically developed within the project The learning goal model created and
272. the tools adequate Was the wiki used in a collaborative manner What where the main difficulties encountered in this stage Were the explanations given clear enough Was the goal of the step completed after this step Page 4 Integrated Modelling Methodology Collection of feedback Please rate form 0 to 5 where is very easy and 5 is extremely difficult how difficult this step was for you 0 CCI 0 CNM Experience with variables Did you use variables in the informal model Please indicate why or why not Did you find it difficult to understand how variables could be used in general Did you find it difficult to insert variables in the MoKi Positive Experiences Using the Sematic MediaWiki was very easy this time Changing content was more intuitive the overall process was much faster The task model was much easier just the numbers had to be added There were less tools especially not using YAWL did save a lot of time for us Neither for CNM nor for CCI variables were required Negative Experiences We had to do the Rescue Modeling for CNM We also added the iTel model into the MoKi There was just a check done by CNM Please estimate your modeling coaching efforts at this stage in terms of hours spent to perform the task ADD your comments here Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner The models after this step very
273. the user can set up MoKi with an already available domain or task model instead of starting modelling from scratch. From the technical point of view, the XML serialisation of the OWL formal model is parsed in order to obtain its relevant elements, and a page is created for each one of them.

- Input of structured lists of elements. With this functionality the user can create new elements of the models by inserting lists of concepts or tasks organized according to predefined semantic structures, e.g. a taxonomy or a partonomy, or a task/subtask decomposition structure.

- Textmining functionalities. To support the utilization of available unstructured knowledge relevant for the modelling activity, MoKi includes an extension which (i) extracts relevant terms from a set of documents, (ii) groups synonymous terms based on WordNet, (iii) clusters a set of documents, and (iv) extracts relevant terms of each cluster. Whichever functionality is used, the relevant outcome is a list of groups of words. By clicking on a word, the KE creates a new domain concept in the MoKi's domain model. Figure 11 shows screenshots of the central activities of the text mining extension.

[Screenshot: the MoKi left-hand side bar with the navigation (Main Page, Recent changes) and the wiki import functionality entries (Domain Model, Task Model, Load …).]
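As a toy illustration of step (ii), grouping synonymous terms via WordNet can be done with the NLTK library. The sketch below is not MoKi's actual text-mining extension: the candidate terms are invented, and it assumes the WordNet corpus has already been downloaded with nltk.download("wordnet").

```python
# Toy illustration (not MoKi's text-mining extension) of grouping candidate
# terms that share a WordNet synset. Requires NLTK and a one-off
# nltk.download("wordnet"); the candidate terms below are invented.
from collections import defaultdict
from nltk.corpus import wordnet as wn

def group_synonyms(terms):
    """Group extracted terms that share at least one WordNet synset."""
    by_synset = defaultdict(set)
    for term in terms:
        for synset in wn.synsets(term.replace(" ", "_")):
            by_synset[synset.name()].add(term)
    # Keep only groups in which two or more candidate terms coincide.
    return [sorted(group) for group in by_synset.values() if len(group) > 1]

candidate_terms = ["meeting", "workshop", "agenda", "docket", "schedule"]
print(group_synonyms(candidate_terms))
# e.g. [['agenda', 'docket', 'schedule']] if those terms share a synset
```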
274. the vocabulary and description of the business learning domain modelled in the APOSDLE knowledge base It is formalised as an OWL ontology The main elements of the domain model are concept elements Each concept has two attributes concept description and synonyms These are used to store the textual description and a list of synonyms of the concept itself The values of concept description and synonyms are provided at modelling time by the modellers domain experts The Is a and part of relations are used with their standard meanings Namely they are used to structure concepts in a hierarchy of sub concepts and to represent the components of a concept respectively In addition to concepts the domain model can also contain domain specific relations that are used to connect different concepts That is a domain specific relation say R can be used to specify that concept A is in relation R with concept B Domain concepts can be related via the is_prerequisite_of relation This relation is meant to identify prerequisite concepts for learning It is not modelled by domain experts at modelling time but is computed on demand after assigning learning goals to tasks The modellers can remove the automatically created prerequisite relationships during or after the task learning goal mapping done using the TACT tool 7 2 Task Model The task model contains a structured list of tasks which refer to a certain business The
275. they had a better understanding of the APOSDLE system and modeling process Negative Experiences ADD your comments here Very time consuming Difficulties to find the right dates Unsolved problem Experts knowledge is expanding continuously How could we transfer this knowledge growth to APOSDLE without repeating workshops and interviews continuously Please estimate your modeling coaching efforts at this stage in terms of hours spent to perform the task ADD your comments here 50 hours Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner Please give reasons for your answer We gathered a lot of domain concepts and also discussed the domain concepts made by the knowledge elicitation from Digital Resources In our opinion we could build harmonious domain concept Please state the main differences if any between performing this step this year and last year We concentrated on one domain instead of two Knowledge experts were stronger involved No methodological differences 4 Informal Modeling of Domain and Tasks in Moki Starting from the knowledge elicited in Step 1 a b the main goal of this step 15 to obtain an informal but rather complete description of the domain model and task model in a Semantic MediaWiki called After this modeling step the informal concept model should only consist of relevant domain concepts see 5 2 Page 4 Integrated
276. tion of the domain dependent part of the APOSDLE Knowledge Base we have developed a set of modelling tools. The set of modelling tools contains:

- The MOdelling WiKi (MoKi), a wiki-based tool which supports the creation of the domain and task models
- TACT (The Task Learning Goal Mappings Tool), a Java-based tool which supports the creation of the learning goal model
- Validation Tools: some automatic checks and the Ontology Questionnaire, which support the revision and validation of the whole APOSDLE knowledge base created; the former are some Java-based scripts, the latter is a web-based tool.

In the next three sections of the document we will describe in detail each one of these tools.

4.2 MoKi

MoKi (see Ghidini et al. 2009 for more details) is a wiki-based tool which extends Semantic MediaWiki (SMW) to support domain experts in creating the domain and task models. The choice for developing MoKi on top of a semantic wiki was made for several reasons. First of all, wikis provide an ideal and robust basis for the development of a collaborative tool. They are web-based systems, that is, they are accessible virtually from every place in the world; this feature is particularly suitable since the actors involved in modelling activities may not be located in the same building or even in the same town and may not be able to physically participate in meetings. Wikis provide a state-of-the-art robust collaborative tool and
277. tionnaires are collected in the Annex Part 6 Filled Feedback Questionnaires on the Integrated Modelling Methodology deliverable The feedback collected allowed us to provide a first evaluation of the second version of the methodology This evaluation is organized in two parts In Section 5 1 we report the general comments on the entire methodology as well as the specific feedback for each of its phases Although the modelling activities in APOSDLE has ended by the time of writing this deliverable and no future developments of the methodology and supporting tools within the project is currently scheduled we decided anyway to present possible improvements or changes of the methodology triggered by the feedback received which could be considered in some future enhancement of the APOSDLE system In Section 5 2 we present the results of the summarizing qualitative content analysis that was applied for the answers the Application Partners and Coaches provided in the questionnaires for both versions of the IMM in order to provide a comparative qualitative evaluation of the first and second version of the IMM 5 1 Qualitative Evaluation This section organizes the evaluation as follows first we consider some general comments on the entire methodology then for each phase of the methodology we report the specific feedback for that phase For each comment we report if it was made by Application Partners APs or by Coaches Furthermore if a comme
278. tomatically, i.e. via appropriate scripts. The results will be provided by FBK at the beginning of the revision process to the coaches. The coaches will send them to the Application Partners, and together they will coordinate on how to revise the models in the MoKi accordingly.

4.1 Domain Model

Descriptions: Do all concepts have descriptions? If there are missing descriptions, please add them in the MoKi.

Are there top-level concepts (that is, very general concepts at the first level in the hierarchy) with no children? Note: this is not necessarily an error, only a stimulus to consider whether these concepts should also have more specific sub-concepts or whether they should be discarded. Coaches should provide support to this check.

4.2 Task Model

Descriptions: Do all tasks have task descriptions? If there are missing descriptions, please add them in the MoKi.

Variables:
- Do all concepts that are listed as variable also exist in the list of concepts?
- Is the variable of a subtask the same variable as that of the supertask?
- Are the variables part of the task name?
Coaches should provide support to these three checks.

Knowledge required:
- Do all concepts that are listed in the knowledge required section also exist in the list of concepts?
- Are there tasks without knowledge required? This is not necessarily an error, but a stimulus to check if some knowledge required can be added.
Coaches should provide support to both checks.

Task identifier
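Checks like the ones above lend themselves to automation, and the text notes that the automatic checks are run via scripts. The following is a hypothetical, much simplified sketch (invented data structures and example entries, not the FBK scripts): it assumes the informal models have been exported into plain dictionaries and reports missing descriptions, unknown variables and unknown knowledge-required concepts.

```python
# Hypothetical, simplified check script (not the FBK validation scripts):
# assumes the MoKi content has been exported into plain dictionaries.
concepts = {
    "Activity": {"description": "Any activity carried out in the domain"},
    "Agenda": {"description": ""},               # missing description -> reported
}
tasks = {
    "Prepare Agenda for <Activity>": {"variable": "Activity",
                                      "knowledge_required": ["Agenda"]},
    "Write final report": {"variable": "Report",  # not a known concept
                           "knowledge_required": []},
}

def check_models(concepts, tasks):
    report = []
    for name, data in concepts.items():
        if not data.get("description"):
            report.append(f"concept without description: {name}")
    for name, data in tasks.items():
        variable = data.get("variable")
        if variable and variable not in concepts:
            report.append(f"task variable not in concept list: {name} ({variable})")
        for required in data.get("knowledge_required", []):
            if required not in concepts:
                report.append(f"knowledge required not in concept list: {name} ({required})")
        if not data.get("knowledge_required"):
            report.append(f"task without knowledge required (check manually): {name}")
    return report

print("\n".join(check_models(concepts, tasks)))
```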
279. tool Not involved Page 4 Integrated Modelling Methodology Collection of feedback 8 Step 6 Formal Models Validation At the end of this step all the models created domain task and learning goals should be formally correct and complete The goal of this step is to have the models validated completeness and correctness by the DEs The step was supported by guidelines and by results from automatic checks similar to the one of step 3 but which also involve checking the quality of learning goals Feedback Please state positive and negative experiences of the IMM for this step in the space below Use as much space as you need Some questions to keep in mind while providing the feedback Were the objectives clear Were the tools adequate What where the main difficulties encountered in this stage Were the explanations given clear enough Was the goal of the step completed after this step Please rate form 0 to 5 where is very easy and 5 is extremely difficult how difficult this step was for you 1 Positive Experiences The objectives where clear as well as the explanation The automatic formal models checks turn out to be very useful to fulfill the objectives Negative Experiences Please estimate your modeling coaching efforts at this stage in terms of hours spent to perform the task Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfia
280. tool Source Coach AP TOOL The TACT tool has much more improved from the point of usability Source AP TOOL t is very useful to have the variables created in MoKi already as a hint in the TACT Source AP TOOL Having the descriptions available and automatic generated learning goals is better The learning goals are easier to understand and better to apply Source AP 5 1 6 2 Negative Feedback GENERAL The learning goals are not the same if the user performs a given task for the first time or for a second time We decided to assign to the task all the learning goals that are relevant in both situations However it means that the learning goal model contains several tasks with a great number of learning goals Source AP Comment Assigning to a task all learning goals that are relevant for it is the correct way to model the learning goal model This is true even if a large number of learning goals is finally assigned to each task A quality indication for the learning goal model is that the same learning goals are assigned to multiple tasks Adaption to users within APOSDLE is not achieved by executing the same task multiple times but by executing different tasks with the same learning goals over time 38 learn work A anposcile 52 TOOL Of course TACT be improved As an example of possible improvements the models visualization identification of super tasks and subtasks relations between concepts pre
281. topic APOSDLE use case The learner wants to know how to apply use do a lt domain element gt The knowledge worker wants to know what the next steps are in a procedure or a well defined task that s he has to perform but that s he is not able to carry out without some guidance S he searches for information that tells him which steps there are and which order they have to be completed This information is like a recipe or prescription Furthermore s he likes to have an example or demonstration of the procedure Material use types how do 1 demonstration checklist example how APOSDLE Examples e Know how to apply use do a core learning goal analysis ability to perform a core learning goal analysis for a company e Know how to apply use do a Er2 ability to use the text data format er2 for loading data into another data base system e Know how to apply use do a MS Project ability to use the specific project management tool for project management gantt charts resource planning etc e Know how to apply use do a substance fixtures ability to perform a survey of chemical substances in use 3 4 Know how to produce Definition The learning goal type Know how to produce means to be able to create produce or build a certain topic for instance a task model In this sense know how to produce means the ability of a person to achieve a certain outcome without a specified rule or p
282. ts all learning goals of the task with the variable. Therefore, add all learning goals that all specialised tasks require only once to the task with the variable, and only additional learning goals required only by one specialised task to this specialised task. Once you have selected a specialised task, you must unselect it to add learning goals to the corresponding task with the variable. Do so by holding the Ctrl key of your keyboard and then clicking again on the selected specialised task.

6.8 Explanations for learning goals

Since a number of learning goals are added automatically, TACT provides you with explanations for why a learning goal appears next to a task.

This learning goal was created manually in the current session: You manually added this learning goal since you have opened the current Knowledgebase with the TACT.

This learning goal was imported from an existing learning goal model: When you opened the TACT, this learning goal was already contained in the previous learning goal model. It may have been that this learning goal was imported from text from the knowledge required in the MoKi. Another case is that you saved your learning goal model one day and reopened the same Knowledge Base another day.

This learning goal was created automatically and contains the same variable as the task: This learning goal contains a variable. It is the same variable as the corre
283. ts involved Positive Experiences ADD your comments here The objective and explanation of this step was clear No DE involved but 2 KE Is a and part of browser graphical tree representation participate in validation end revision AUTOMATIC CHECK results very useful No online ontology questionnaire used Negative Experiences ADD your comments here Page 6 Integrated Modelling Methodology Collection of feedback Please estimate your modeling coaching efforts at this stage in terms of hours spent to perform the task 2 hours ADD your comments here Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner Please give reasons for your answer Yes Please state the main differences if any between performing this step this year and last year ADD your comments here Not so much differences 6 Step 4 From Informal to Formal At the end of this step the domain model and task model will be contained in two OWL ontologies Feedback Please state positive and negative experiences of the IMM for this step in the space below Use as much space as you need Some questions to keep in mind while providing the feedback Were the objectives clear Yes Were the tools adequate Yes What where the main difficulties encountered in this stage Were the explanations given clear enough Yes Was the go
284. ts were reluctant (1 Coach: KC)

Conceptual
- It was difficult to rate the relevance of single concepts (2 KE: CCI, EADS)
- Feedback about the quality of the model was missing or not critical enough (1 KE: CNM)

Wiki
- Getting an overview of the model was difficult in the Wiki (2 Coaches: SAP, KC)
- With the semantic Wiki it was possible to extract the relations so that a revision by domain experts could easily be done (1 KE: ISN)
- It was difficult to extract relation pages (1 KE: ISN)

Graphical visualisation of the models
- A graphical visualisation of models was missing (1 KE: EADS; 1 Coach: SAP)

Model revision
- Revision was done rather on a formal level than on a content-wise level (1 Coach: KC)

8.4.1.2 Feedback both for the IMM of P2 and P3

Objectives and explanations
- The explanations were clear (P2: 1 KE ISN; P3: 1 KE EADS, 1 Coach FBK)
- The objectives of the phase were clear (P2: 3 KE ISN, CCI, EADS; P3: 1 KE EADS, 1 Coach FBK)

8.4.1.3 Feedback on the IMM for P3

General
- Hints from the coaches were useful (1 KE: ISN)
- There were just minor changes in this step (1 Coach: SAP)

MoKi
- The MoKi was useful for model revision (1 KE: CCI)
- The is-a and part-of browsers in the MoKi helped in validation and revision (1 KE: EADS; 1 Coach: FBK)

Check report
- The check rep
285. [Rotated table continued in the source PDF; the extracted text is not recoverable. It continues the list of indications and counter-indications for introducing APOSDLE in an application domain.]
286. 2.2 The vision for the IMM Second Version 5
3 The Integrated Modelling Methodology 8
3.1 Overview of the Integrated Modelling Methodology 8
3.1.1 Knowledge Bases of the 3rd Prototype 8
3.2 Phase 0: Scope & Boundaries 9
3.2.1 Goal 9
3.2.2 Description 9
3.2.3 Supporting Tools, Techniques & … 9
3.3 Phase 1: Knowledge Acquisition 10
3.3.1 Goal 10
3.3.2 Description 10
3.3.3 Supporting Tools, Techniques & … 11
3.4 Phase 2: Modelling of domain & tasks 13
3.4.1 Goal 13
3.4.2 Description 13
3.4.3 Supporting Tools, Techniques & … 14
3.5 Phase
287. uch space as you need Some questions to keep in mind while providing the feedback Were the objectives clear Were the tools adequate Did the domain experts coincide with the person performing the modeling What where the main difficulties encountered in this stage Were the explanations given clear enough Page 3 Integrated Modelling Methodology Collection of feedback Was the goal of the step completed after this step ie was a refined task list and an extensive list of candidate domain concepts ready after this step Please rate form 0 to 5 where is very easy and 5 is extremely difficult how difficult this step was for you 2 Positive Experiences ADD your comments here The evaluation of P2 both concerning the software and the EADS models created for P2 resulted in a kind of knowledge elicitation from expert Negative Experiences ADD your comments here Please estimate your modeling coaching efforts at this stage in terms of hours spent to perform the task ADD your comments here Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner Please give reasons for your answer Please state the main differences if any between performing this step this year and last year ADD your comments here Same remarks as steps amp 1a The P2 evaluation included also model evaluation Task Learning Goal Mapping Evaluation
288. ur model disproportionally complicate Negative Experiences ADD your comments here No negative expericences Please estimate your modeling coaching efforts at this stage in terms of hours spent to perform the task ADD your comments here 20 hours Do you think that the outcome produced in this step fulfilled the goals of the step in a satisfiable manner Please give reasons for your answer After this step we had all relevant domain concepts and tasks within clearly structured and transparent models Please state the main differences if any between performing this step this year and last year Page 5 Integrated Modelling Methodology Collection of feedback The MOKI has much improved since last year t is as comfortable to use as Prot g 5 Step 3 Informal Models Validation and Revision The goal of this step is to have the domain model and task model validated completeness and correctness by the DEs The step was supported by guidelines by results from automatic checks and by an on line ontology questionnaire Feedback Please state positive and negative experiences of the IMM for this step in the space below Use as much space as you need Some questions to keep in mind while providing the feedback Were the objectives clear Were the tools adequate What where the main difficulties encountered in this stage Were the explanations given clear enough Was the goal of the step c
289. urce AP GENERAL Using Card sorting in several rounds we lead the AP to get an overview and structure the domain and contents themselves Source Coach ORGANIZATION Everything was done by the AP not very much coaching effort Source Coach 34 learn work A anposcile 52 ORGANIZATION We performed together with our Coach four very nice workshops using sophisticated knowledge elicitation techniques After these workshops we had a lot of useful data Source AP ORGANIZATION We tried out several additional KE techniques improved version of Card Sorting Chapter Listing Step Listing etc Again it was very advantageous to start from an existing model which needed to be improved instead starting from scratch Source Coach ORGANIZATION To gather the knowledge of our domain experts we made several interviews and smaller workshops with them These interviews were very comfortable and we got a lot of knowledge from them so that we could gather new domain concepts Again this step was not so difficult but we spent a lot of time interviewing and observing the domain experts The domain experts were this time more open for interviews and knowledge elicitation because they had a better understanding of the APOSDLE system and modelling process Source AP TOOL t is very nice that the textmining functionality is now integrated to the MoKi That makes it very easy to add the extracted concepts to the models Source A
290. urces available to achieve goals _ Analyse the scope yalidity domain of the new tool Analyse the different action sequences envisaged in certain situations simple task provides simple advice for detecting Identify non prescribed tasks that operators putin place to achieve goals while cc Plan and prepare acquisition sessions decide on acquisition methods Finish HAM data gathering Identify non prescribed goals tasks strategies ofthe workers to acquire and proc eo these resources during SR modelling v gt Specialised Task Identify Non prescribed Resources available to achieve Identify resources available to achieve goals Identify Prescribed Resources available to achieve goal has Variable Resource LJ Show only Tasks without Learning Goals Domain Model Elements Filter Also Known as Use Case xi Use Case Diagram Extends gt Notation gt System Product a li Event Description 2 Synchronisation Checking Models gt fi Resource Means available to actors to achieve their goals They can be observable like co operation or inferred such as default knowledge or trust in the Fit Criterion l Activity Description P Add as Learning Goal for Identify Non prescribed Resources available to achieve goals LJ Highlight Domain Model Elements not referenced in Learning Goals Learning Goal Mapping Identify resources available to achieve goal
291. y Feedback on Phase 0 Scope amp Boundaries Positive Feedback GENERAL We could easily build a collection of relevant learning resources Source AP GENERAL We had created the questionnaire on APOSDLE application domains which to me was also a very useful tool for coaching the process selecting an adequate APOSDLE domain Source Coach GENERAL The discussions with the AP helped to focus very early in the process and reduced effort Source Coach GENERAL The iterative process using graphical tools from the beginning turned out to be much easier Source Coach AP 5 1 2 2 Negative Feedback GENERAL It was difficult to represent all the complexity of our task model due to the only one variable limitation Source AP Comment The inclusion of multiple variables would have led to a very high complexity of the meta model since it is not trivial to define inter dependencies of variables and desirable rules for inheritance propagation along multiple hierarchies We believe that the approach of representing multi variables in tasks within the structure of the domain model has been quite successful Feedback on Phase 1 Knowledge Acquisition Positive Feedback GENERAL We had a good overview and a structure of our AP domain after this step Source Coach GENERAL This step was important to structure and facilitate the proceeding steps Source Coach GENERAL Knowledge experts were strongly involved So
292. y of modelling.

- A positive evaluation of the tools. The evaluation of the tools developed to support the IMM was generally positive. In particular, the improved version of the MoKi has helped to remove all the comments on the usage of Semantic Wikis which were part of the evaluation of the IMM first version. The reports obtained from the validation tools helped a focused and quick revision of the models, thus speeding up the modelling process.

The main item where improvement is still required is to decrease as much as possible the effort of modelling and, at the same time, build good models for the APOSDLE system. Easing the task of modelling is a general problem in building knowledge-dependent systems which goes beyond the scope of APOSDLE. Nevertheless, the evaluation of the IMM second version has shown that the redesign of the methodology and of the tools was a first good step in this direction. It has also highlighted further improvements which could contribute to make modelling more effective and which can provide the basis for further improvements of the modelling tools. Among the most important are:

- To integrate all the tools and routines in a single modelling tool. A re-engineering of the MoKi to include all the different modelling tools in one tool would further reduce the modelling effort by simplifying and speeding up the construction of the APOSDLE knowledge base.
- To improve the task o
