Presentation: DSS success final




Presentation Transcript


ISESS, Lisbon, 23-27.5.2005
JAROSLAV MYSIAK, Fondazione Eni Enrico Mattei (FEEM), Venice, Italy; [email protected]
DECISION SUPPORT SYSTEMS for Integrated Water Resource Management: Success and Failure

Sketching the problem: 

Sketching the problem
30% of all software projects are cancelled, 50% come in over budget, 60% are considered failures by the organisations that initiated them, and 90% come in late. (Standish Group; Economist, Nov 25th 2004)
“We know why projects fail, we know how to prevent their failure. So why do they still fail?” (Martin Cobb, Treasury Board of Canada Secretariat)


Outline
DSS failure
Technology (acceptance)
Scientific policy advice
Common grounds: misunderstanding of how managers make decisions; miscomprehension of the policy makers’ needs; ignoring cognitive aspects; insufficient management of change; …

DSS Concept: 

DSS Concept
DSS were originally developed for tackling ill- and semi-structured problems (Gorry and Scott Morton, 1971): a computer system dealing with the structured part of a problem while human judgement was brought to bear on the unstructured part, constituting a human-machine system.
DSS successively evolved into an umbrella term for a cluster of research related to computer-mediated (scientific) decision support.

Disappointment, frustration, abandonment : 

Disappointment, frustration, abandonment
“My ideas about DSS have moved from enthusiasm to disillusionment to abandonment during the 20+ years” (Alter, 2004)
“I am still trying to find out what "decision support" in general is, and whether or not it is anything at all!” (Reitsma)
“Some people claim that DSS matured to the point that it can be considered as part of the mainstream and that it is losing its identity. DSS may disappear as a stand-alone field” (Carlsson and Turban, 2002)
“A persistent lack of confidence is heard in environmental circles. (DSS) are viewed as being somewhat mysterious and arcane” (Swayne, 2003)
The direct use and application of IA models in the policy area has rarely been realised successfully, which has often been frustrating. (Rotmans and van Asselt, 2001)

Evaluation of DSS benefits/impact: 

Evaluation of DSS benefits/impact How to define success? Factors influencing success … Relationships between the success factors …

Success measures: 

Success measures
System usage: a system is only used if the benefits outweigh the costs
Perceived usefulness: the extent to which people believe that the technology will help them perform their job better
Perceived ease of use: the degree to which a person believes that using a particular system would be free of effort

Common shortcomings: 

Common shortcomings
High system usage provides no guarantee of DSS effectiveness or added value to organisational performance
Users may have poor introspection, may not recognise good advice, and may dislike being corrected by computers (Potts et al., 2001)
Successes and failures are often interrelated, partial, or open to later reinterpretation (EC, 2003)

Factors influencing success (1): 

Factors influencing success (1)
Different factors are relevant in different stages of DSS development/implementation:
Unfreezing (establishing conditions for change), e.g. executive support / DSS champion, clear business objectives
Moving (design, development, tentative implementation), e.g. early involvement of future users; iterative specification, prototyping
Refreezing (institutionalising the new situation), e.g. management of resistance; adaptive capability, open environment
(also in Santhanam et al., 2000; Wierenga et al., 1997; Finlay and Forghani, 1998)

Factors influencing success (2): 

Factors influencing success (2)
Different factors apply to organisational and individual users’ impacts:
Individual level, e.g. user satisfaction; perceived individual benefits; impact on the job
Organisation/institution, e.g. impacts on business, institutional effectiveness
(also in Wierenga et al., 1997)

Factors influencing success (3): 

Factors influencing success (3)
Different factors are relevant for managers and analysts/technicians:
Manager diagnosis, e.g. resource availability; management of organisational resistance; operating sponsor; appropriate staff
Analyst diagnosis, e.g. development risks; management of system evolution and spread; appropriate technology; management of data
(based on Palvia and Chervany, 1995)

Causal chain : 

Causal chain
Beliefs: individuals’ subjective probabilities of the consequences if a particular behaviour is performed
Attitudes: individuals’ positive and negative feelings if a particular behaviour is performed
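The causal chain above can be read as an expectancy-value model of attitude formation (Fishbein-style): an attitude is the sum of belief strengths (subjective probabilities of consequences) weighted by how positively or negatively each consequence is felt. A minimal sketch; all consequence labels and numbers are invented for illustration:

```python
# Expectancy-value sketch of the causal chain; numbers are hypothetical.
# beliefs: subjective probability that using the DSS has each consequence
# evaluations: how each consequence is felt, on a -3 (bad) .. +3 (good) scale
beliefs     = [0.8, 0.6, 0.3]   # e.g. "saves time", "improves decisions", "deskills me"
evaluations = [+2,  +1,  -3]

# attitude = sum over consequences of (belief strength x evaluation)
attitude = sum(b * e for b, e in zip(beliefs, evaluations))
print(attitude)
```

A positive total suggests a favourable attitude toward using the system; in the fuller chain, attitude then feeds intention, which feeds behaviour.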

Technology Acceptance Theory : 

Technology Acceptance Theory
Usefulness and ease of use; external variables are mediated by beliefs; “ease of use” has an indirect influence through “usefulness”; attitude is not a relevant mediator (Davis, 1989)

Theory of Planned Behaviour : 

Theory of Planned Behaviour
Attitudinal beliefs and evaluations → attitude
Normative beliefs and motivation to comply → subjective norm
Control beliefs and perceived facilitation → perceived behavioural control
(Ajzen, 1985)

Decomposed Theory of Planned Behaviour : 

Decomposed Theory of Planned Behaviour
Attitude: ease of use, usefulness, user satisfaction, personal innovativeness
Subjective norm: peer influence, external influence
Perceived behavioural control: self-efficacy, facilitating conditions
(Taylor and Todd, 1995)


Evaluation
Altogether, up to 40% of the variability in use is explained
The influence of some factors varies at different stages of the IS implementation
The IS is considered as an independent issue in organisational dynamics
(based on Legris et al., 2003)
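"40% of variability in use explained" is a variance-explained (R²) figure from regressions of reported system use on the acceptance constructs. A minimal sketch of how R² is computed; the usage scores and model predictions below are invented for illustration:

```python
def r_squared(observed, predicted):
    """Share of variance in `observed` explained by `predicted` (R^2)."""
    mean = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean) ** 2 for o in observed)
    return 1 - ss_res / ss_tot

# Hypothetical self-reported usage vs. predictions from an acceptance
# model built on "usefulness" and "ease of use" scores
usage     = [2, 5, 3, 7, 6]
predicted = [3, 4, 4, 6, 6]
print(r_squared(usage, predicted))
```

An R² of 0.4 would mean the model accounts for 40% of the spread in usage, leaving the rest to factors outside the acceptance constructs.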



Learning by using technology: 

Learning by using technology (based on Mysiak, 2005)
A number of policy makers were asked to assess a set of policy measures without and with a decision aid (multiple-criteria decision analysis, MCA).
[Figure: initial holistic ranking vs. holistic ranking after MCA, for different MCA methods]

Learning by using technology: 

Learning by using technology (based on Mysiak, 2005)
Initially there was low correlation between individual rankings; after MCA, the correlation between individual rankings increased.
Although the methods were generally not trusted, the rankings before and after applying MCA did not correlate → policy makers changed their minds.
[Figure: initial holistic ranking vs. holistic ranking after MCA, for different MCA methods]
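Agreement between two rankings of the same measures is typically quantified with a rank correlation coefficient. A minimal sketch using Spearman's rho (assuming untied rankings; the before/after ranks are invented for illustration):

```python
def spearman_rho(rank_a, rank_b):
    """Spearman rank correlation between two rankings of the same n items.

    rank_a[i] / rank_b[i] give the rank of item i (1 = best), no ties.
    Returns 1 for identical rankings, -1 for fully reversed ones.
    """
    n = len(rank_a)
    d2 = sum((a - b) ** 2 for a, b in zip(rank_a, rank_b))
    return 1 - 6 * d2 / (n * (n * n - 1))

# One policy maker's hypothetical ranks of five measures, before/after MCA
before = [1, 2, 3, 4, 5]   # initial holistic ranking
after  = [4, 5, 1, 2, 3]   # holistic ranking after applying MCA
print(spearman_rho(before, after))   # low/negative: the ranking changed
```

A value near zero or below across many participants would match the slide's finding that pre- and post-MCA rankings did not correlate.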

DSS as catalyst of IWRM : 

DSS as a catalyst of IWRM
DSS renaissance in the water management field: DSS as a catalyst of integrated water resource management (IWRM)
“IWRM is a process which promotes the coordinated development and management of water, land and related resources, in order to maximize the resultant economic and social welfare in an equitable manner without compromising the sustainability of vital ecosystems” (Global Water Partnership, 2002)
EU Directive 2000/60/EC - Water Framework Directive (WFD)

Conditions of systems development : 

Conditions of systems development
IWRM is an evolving concept, and the WFD includes many novel aspects not familiar to policy makers → lack of a comprehensive system specification
The WFD imposes institutional changes, often accompanied by uncertainties and conflicts (institutional interplay) → intended users often not known
The guidance documents produced were found too generic to guide local implementation; policy makers pay attention to the first obligations, while most projects focused on the last stage, the selection of measures → misfit between users’ needs and project objectives

DSS developed – computerised part : 

DSS developed – computerised part
Process (based on Belton and Stewart, 2002): problem structuring (key issues, external environment, alternatives, values, goals, stakeholders) → model building (specifying alternatives, defining criteria, eliciting values) → using the model to inform and challenge thinking (synthesis of information, challenging of intuition, sensitivity analysis) → plan of action
Computerised part: environmental (mostly hydrological) models; decision models (multiple-criteria decision models, Bayesian networks, agent-based modelling)
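Of the decision models named on this slide, the simplest multiple-criteria method is a weighted sum: each alternative is scored on each criterion, the weights encode the elicited values, and the synthesis step ranks the alternatives. A minimal sketch; the measures, criteria, scores and weights are all invented for illustration:

```python
# Hypothetical policy measures scored 0..1 on hypothetical criteria
measures = {
    "restore wetlands":        {"cost": 0.4, "ecology": 0.9, "acceptance": 0.7},
    "upgrade treatment plant": {"cost": 0.6, "ecology": 0.6, "acceptance": 0.8},
    "water pricing":           {"cost": 0.9, "ecology": 0.4, "acceptance": 0.3},
}
weights = {"cost": 0.3, "ecology": 0.5, "acceptance": 0.2}  # elicited values

def score(criterion_scores):
    # Synthesis of information: weighted sum over the criteria
    return sum(weights[c] * v for c, v in criterion_scores.items())

ranking = sorted(measures, key=lambda m: score(measures[m]), reverse=True)
print(ranking)
```

Sensitivity analysis, the last step named in the slide, amounts to re-running the ranking under perturbed weights and checking whether the order is stable.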

DSS developed - accompanying methodologies : 

DSS developed - accompanying methodologies
Same process framework (based on Belton and Stewart, 2002): problem structuring → model building → using the model to inform and challenge thinking → plan of action
Accompanying methodologies: actor analysis, cognitive mapping, social network analysis, problem structuring methods, role games, social learning, participatory model building, conflict mitigation

Learning - different perceptions: 

Learning - different perceptions (based on Hare, 2004a and 2004b)
Scientists:
Knowledge amongst the modellers about the demands the WFD places on them is generally low
It should not be a problem for stakeholders to be faced with 3 different models and 3 different results; multiple models are rather the norm
Promoting uncertainty comes with the risk that people do not believe model results
Policy makers:
Policy makers want to use models but have a significant lack of confidence in them; confidence is about convincing people of the merits of models, and trust comes with experience
There is a need to agree on one model; new models are difficult to “sell” and explain to people
Model uncertainty is not as important an issue as data reliability

Policy makers: 

Policy makers
Lack of interest on the side of the stakeholders, and difficulty in keeping them engaged
Selection of stakeholders needs to be done carefully to avoid disappointments
Models are used to enforce laws and are therefore used by experts
Not all models need to be transparent
Trust in a model can be generated if somebody in the peer group has confidence in it and is well trusted
Can models including social aspects ever be accurate enough?
(based on Hare, 2004a)

Modellers/ DSS builders: 

Modellers/DSS builders
Stakeholders might lose their power to say no to a decision if they are part of the process from the beginning, and thus may not be willing to participate
Integrating human dimensions into models will increase the models’ uncertainty and make validation difficult
Models will not bring a solution to existing conflicts but will clarify facts
There is a danger that affected parties may produce a counter-model and experts may become part of opposing camps
Different groups of stakeholders should be worked with at different stages
(based on Hare, 2004a)

Achievements : 

Achievements
Critical insights gained through improved communication of the different perspectives of researchers and policy makers → dialogue
DSS development, implementation and application make conflicts between disciplinary worldviews apparent → mutual learning
DSS has been a catalyst of interdisciplinary, policy-oriented research, as well as a vehicle to implement (societal) changes
(based on Hare, 2004a)

Conclusions (1) : 

Conclusions (1)
“My main concern is not that DSS technology has failed to live up to initial expectations, but that the underlying skills in systems analysis have not been adequately exploited” (Cox, 1995)
DSS represents an interface between human and machine, but also between science and policy, and between the social and natural sciences and interdisciplinary research… for which no universally accepted quality standards exist
DSS evaluation is a multidisciplinary task
There is rarely a complete failure (or success); an assessment of DSS success (and failure) has to consider a variety of benefits, including mutual learning between scientists and policy makers (social learning)

Conclusions (2) : 

Conclusions (2)
More attention needs to be paid to change management, raising and maintaining the commitment of:
policy makers - to implement scientific policy advice
scientists - to carry out policy-relevant research (i.e. develop criteria to evaluate and reward such effort)
In light of the huge investments in ICT tools in water management, there is an urgent need to institutionalise the collected experience (e.g. databases of case studies, specified requirements, protocols from stakeholder meetings, etc.)


Thank you for your attention!! [email protected]
