L01 Introduction 2006


Methodology & Explanation: 

Methodology & Explanation
Richard Joiner
MSc Human Communication & Computing
Lecture 1: Introduction

Introduction: 

Introduction
Course Aim
Course Objectives
Course Assessment
Rationale
Laboratory Studies
Field Studies

Aim: 

Aim
To give students an introductory understanding of the research methods used in human-computer interaction and communication research.
To raise students' awareness of the scientific and engineering methods used in the context of human-human and human-computer interaction.

Objectives: 

Objectives
Apply appropriate techniques for the interpretation of material, including observational and ethnographic material.
Develop a critical understanding of the assumptions that underpin the development and application of models.

Objectives: 

Objectives
Apply methods of analysis, experimentation and model building.
Distinguish between descriptive, predictive and prescriptive models.
Design and carry out empirical studies, including experimental and observational approaches.

Objectives: 

Objectives
Apply analytical techniques to the analysis of human-human and human-computer interaction.
Construct descriptive, qualitative, quantitative and explanatory accounts of human-human and human-computer interaction.

Assessment: 

Assessment
Four practical assignments, to be submitted as a portfolio at the end of the unit, worth a total of 80%.
An oral presentation on an assigned reading of your choice (allocated on a first-come basis), in which the materials and approaches used are presented in a critical manner, worth 20%.

Moodle: 

Moodle
The course materials are located in a VLE (virtual learning environment) called Moodle: http://www.bath.ac.uk/e-learning/

Course Text Book: 

Course Text Book
Preece, J., Rogers, Y. & Sharp, H. (2002). Interaction Design: Beyond Human-Computer Interaction. Chapter 11. London: Wiley.

Methodology & Explanation: 

Methodology & Explanation
Richard Joiner
MSc Human Communication & Computing
Lecture 1: DECIDE, a framework for evaluation

Objectives: 

Objectives
Describe the evaluation paradigms and techniques used in interaction design.
Discuss the conceptual, practical and ethical issues to be considered when planning an evaluation.
Introduce the DECIDE framework to help you plan your evaluation.

Introduction: 

Introduction
Evaluation is not just about desktop computing. What other kinds of technologies are we evaluating?

Collaborative Technologies: 

Collaborative Technologies

Immersive technologies: 

Immersive technologies

Tangible interfaces: 

Tangible interfaces

Mobile and Wireless: 

Mobile and Wireless

Evaluation Paradigms: 

Evaluation Paradigms
Any evaluation is guided by a set of beliefs and practices, known as an evaluation paradigm.
Each paradigm has a set of techniques associated with it.

Evaluation Paradigms: 

Evaluation Paradigms
There are a number of evaluation paradigms:
Quick and Dirty Evaluation
Usability Testing
Field Studies
Predictive Evaluation

Evaluation Paradigms: 

Evaluation Paradigms
Quick and Dirty
Informal feedback from users.
Emphasis is on speed.
Usually descriptive and informal.
Often consultants are used.

Evaluation Paradigms: 

Evaluation Paradigms
Usability Testing
Measures user performance; measures include time on task and number of errors.
Strongly controlled by the evaluator, and typically carried out in a laboratory.
Quantitative data is collected.
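
The quantitative measures collected in usability testing (time on task, number of errors) are typically summarised with simple descriptive statistics. A minimal sketch in Python; the participant data below are invented for illustration:

```python
import statistics

# Hypothetical usability-test results: time on task (seconds) and
# number of errors for each of eight participants.
task_times = [42.1, 38.5, 55.0, 47.3, 61.2, 40.8, 52.6, 44.9]
errors = [1, 0, 3, 2, 4, 1, 2, 1]

mean_time = statistics.mean(task_times)   # average time on task
sd_time = statistics.stdev(task_times)    # spread across participants
mean_errors = statistics.mean(errors)     # average error count

print(f"Mean time on task: {mean_time:.1f} s (SD {sd_time:.1f})")
print(f"Mean errors per participant: {mean_errors:.2f}")
```

Reporting the standard deviation alongside the mean matters: two designs with the same mean time can differ greatly in how consistently users perform.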

Evaluation Paradigms: 

Evaluation Paradigms
Field Studies
Carried out in natural settings.
Help to identify opportunities for new technology, determine requirements for design, and facilitate the introduction of a new design.
Two distinct approaches: outsider and insider.

Evaluation Paradigms: 

Evaluation Paradigms
Predictive Evaluation
Experts apply their knowledge of typical users.
Often takes the form of heuristic evaluation.
Quick and relatively inexpensive.
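
In a heuristic evaluation, several experts independently inspect the interface and rate the severity of the problems they find, and the ratings are then aggregated so the worst problems are fixed first. A minimal sketch of that aggregation step; the problems and ratings below are invented:

```python
# Hypothetical severity ratings (0 = not a problem, 4 = usability
# catastrophe) given by three experts to three usability problems.
ratings = {
    "unclear error messages": [3, 4, 3],
    "inconsistent navigation": [2, 2, 3],
    "no undo for deletion": [4, 4, 4],
}

def mean_severity(problem):
    scores = ratings[problem]
    return sum(scores) / len(scores)

# Rank problems from most to least severe.
by_severity = sorted(ratings, key=mean_severity, reverse=True)
for problem in by_severity:
    print(f"{problem}: mean severity {mean_severity(problem):.1f}")
```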

Evaluation Techniques: 

Evaluation Techniques
Observing users
Asking users
Asking experts
Testing users' performance
Modelling users' performance

Slide 24:

[Table matching the evaluation techniques to the evaluation paradigms; only the Yes/No cell values survived in the transcript, so the table cannot be reconstructed.]

DECIDE Framework: 

DECIDE Framework
Well-planned evaluations are driven by clear goals and appropriate questions.
We are going to use the DECIDE framework.

DECIDE: 

DECIDE
Determine the overall goals
What are the overall goals? Who wants the evaluation, and why?
An evaluation to determine users' needs is different from one to fine-tune an interface, to find the best metaphor for a conceptual design, or to establish how technology will change work practices.

DECIDE: 

DECIDE
Explore the questions
Overall goals need to be broken down into questions that can be answered in order to satisfy them.
For example: why do people not use a computerised calendar? Perhaps they are too busy, or do not want others to know what they are doing.

DECIDE: 

DECIDE
Choose the evaluation paradigm
Having chosen the goals and the main questions, the next step is to choose the evaluation paradigm and techniques.
The evaluation paradigm determines the techniques used.
Practical and ethical issues must also be considered.

DECIDE: 

DECIDE
Identify practical issues
There are many practical issues. These include:
users
facilities and equipment
schedule and budget constraints
expertise

DECIDE: 

DECIDE
Decide how to deal with ethical issues
The BPS and the ACM have ethical codes, which they expect their members to abide by.
Don't do anything to others that you would not want done to yourself.

DECIDE: 

DECIDE
Evaluate: interpret and present the data
Reliability
Validity
Biases
Scope
Generalisability (external validity)
Ecological validity
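
Reliability is often checked by having two evaluators code the same observations independently and measuring how far they agree. A minimal sketch using simple percentage agreement (a fuller analysis would use a chance-corrected statistic such as Cohen's kappa); the codes below are invented:

```python
# Hypothetical codes assigned to ten observed events by two evaluators.
coder_a = ["error", "success", "success", "error", "success",
           "success", "error", "success", "success", "error"]
coder_b = ["error", "success", "error", "error", "success",
           "success", "error", "success", "success", "success"]

# Count the events on which the two coders gave the same code.
agreements = sum(a == b for a, b in zip(coder_a, coder_b))
agreement_rate = agreements / len(coder_a)
print(f"Inter-rater agreement: {agreement_rate:.0%}")
```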

Summary: 

Summary
Described the evaluation paradigms and techniques used in interaction design.
Discussed the conceptual, practical and ethical issues to be considered when planning an evaluation.
Introduced the DECIDE framework to help you plan your evaluation.
