
Presentation Transcript

Research Methods: 

Research Methods Michael Wood http://userweb.port.ac.uk/~woodm/rm/rm.ppt This file contains draft slides which will be updated.


Reading. There are many books available, e.g.:
- Saunders et al (2007)
- Robson (2002)
- Easterby-Smith et al (2002)
See also NORM: Notes on research methods at http://userweb.port.ac.uk/~woodm/rm/norm.doc


Contents
- Overview of academic business research
- Formulating research aims
- What must be in a project plan and a project?
- The design of research projects
- Evaluating research
- Statistical analysis for research
- Qualitative data analysis
- Some philosophy
- Questionnaire design
- Interview design and qualitative research
- Finally…
- Interviews and qualitative research – more detail
- More on literature reviews

Overview of academic business research: 

Overview of academic business research. Reading: browse through a book on research methods, e.g. Saunders et al (2007) or Robson (2002). These slides are intended as a brief summary of the important points. Reread them when you are starting your project.

Advice on research methods: 

Advice on research methods
- Common sense – don’t forget this
- Articles and books reporting similar research – these should be discussed in the project
- Books on research methods in general – focus on chapters relevant to your project

Purpose and characteristics of academic research: 

Purpose and characteristics of academic research. Purpose:
- Discover truth about something; and/or
- Find a good way of doing something
The research must be:
- Systematic, and as thorough and trustworthy as possible
- Clearly written, with sufficient detail for readers to check credibility
- Ethical

Types of research include …: 

Types of research include …
- Large scale surveys (of people, organisations, events, etc) analysed statistically
- Small scale surveys with emphasis on qualitative detail
- Case studies (to see how something works in detail)
- Experiments (change something to see what happens)
- Models can be set up, tested and used for …
- Participant observation (observe as a participant)
- Action research (combine research and action)
- Evaluation
- Anything else (??) necessary to meet your aims
Many projects combine several of these.

Sources of data: many possibilities : 

Sources of data: many possibilities
- Interviews – including focus groups, the Delphi technique (Robson, 2002:57), and various approaches to eliciting comments (e.g. “photo elicitation” – Sam Warren)
- Questionnaires, including via email (be careful …)
- Documents (minutes of meetings, company reports, etc)
- The web
- Databases – within the organisation, of share prices, etc
- Observations of various kinds
- Etc … Be imaginative!
Sources of literature is a different issue (Judith’s session is very important for this).

Experiments (RCTs) : 

Experiments (RCTs). Advantages of experiments over non-interventionist research:
- Can investigate new things
- Disentangle cause and effect
- Can control variables you haven’t even thought of (e.g. medical drug trials)
But:
- Hawthorne effect
- Often impractical or unethical
So use less rigorous quasi-experiments instead – e.g. in action research. See NORM, Wood (2003, Chapter 10).

Finding a suitable topic: 

Finding a suitable topic
- Interest
- Career
- Feasibility
- Usefulness

How to do research: 

How to do research
- Read about the topic
- Draft aims of research – clear, simple, focused
- Draft literature review
- Draft research plan – check it is really likely to meet your research aims. Check again.
- Do research/analysis
- Draft research/analysis and recommendations/conclusions
- Check it fits together and revise all sections

Practical issues: 

Practical issues
- Timing – plan this, remembering that your supervisor may suggest extensive changes. A Gantt chart may help.
- Ethics (remember the form!)
- Access to information – take care: this is often difficult!

Formulating research aims: 

Formulating research aims Reading – most research methods books, e.g. Saunders et al, 2007

Research aims or questions: 

Research aims or questions. Usually you start from a vague idea. Then formulate a clear aim, or list of aims, that your research will achieve; OR formulate a clear question or list of questions. This process may require some creative thinking – techniques like brainstorming and mind maps may be useful.

Aims, objectives, questions: 

Aims, objectives, questions. You can formulate your research aims as aims (or objectives if you prefer that word) or questions. These are different ways of saying the same thing. It doesn’t matter which you use, but don’t confuse things by having both aims and questions. It may be helpful to have a list or hierarchy of aims, but keep it simple.


Hypotheses Hypotheses are statements whose truth you want to test, or “predicted answers” to research questions (Robson, 2002:65) Occasionally may be appropriate as a top level research aim e.g. to test the hypothesis that “working at home improves quality of life” Usually best to avoid hypotheses when formulating main research aims because questions or aims tend to be more flexible e.g. “how does working at home affect quality of life?” Null hypotheses have a (controversial) role in some statistical analysis (… as you will see), but this is not relevant to formulating your overall research aims

Research aims or questions: 

Research aims or questions should:
- Be clearly and simply expressed
- Fit together (so that you have a coherent project)
- Clarify the intended outcome and scope of the research
Your research aims or questions should also:
- Be relevant to your degree
- Be achievable
- Present a reasonable level of challenge

Research aims or questions: 

Research aims or questions Must be research aims, not business or personal aims. However, business or personal aims may be part of the background motivating your research aims, and research aims would normally include the aim of making recommendations to people or organisations. Should generally have a limited scope or focus. The danger with general aims is that they lead to superficial research. May relate to theoretical issues.


Theory “Theory” includes models, explanatory frameworks, generalisations, recommendations … Examples …. Your research should link with any relevant theory. It may Use a theory Demonstrate that a theory is useful Test a theory Modify a theory or create a new theory

Also ask yourself: 

Also ask yourself Is the research worth doing? Are there any ethical or political problems? Is it possible? Have you got access to the necessary data?

Example of research aims: 

Example of research aims The aims of this research are to Describe the decision making strategies of small investors Determine the effectiveness of these strategies Any comments? Does this seem reasonable for a Masters project?

Another example of research aims: 

Another example of research aims The aims of this research project are to Evaluate Method X for planning mountaineering expeditions If necessary propose and justify Amended Method X for planning mountaineering expeditions

Another example of research aims: 

Another example of research aims What are the important quality problems in Company X? How serious are these problems? What is the best strategy for reducing these problems? Any comments? Does this seem reasonable for a Masters project? Does it matter that they are expressed as questions?

Three more examples of research aims: 

Three more examples of research aims
- The aim of this research is to investigate the role of the internet in banking.
- This research project aims to explain activity based costing.
- The aim of this project is to test the efficient market hypothesis for the Athens stock exchange, and determine how global warming will influence share prices.
Any comments? These are not reasonable for Masters projects! Why not?

Possible research topics: 

Possible research topics These are just some possibilities. There are more … Research in a specific organisation Best if they are likely to implement any recommendations Take care you have adequate access to data Easier if you have a recognised / paid job there and / or know key players well. Research based on publicly available data Eg share prices, the www, published atatistics Research based on surveys of the “public”

What must be in a project and a project plan?: 

What must be in a project and a project plan? Reading Project guidelines Proposal guidelines Saunders et al (2007), or another similar book

What must be in a project?: 

What must be in a project?
- Abstract (short summary of the project including conclusions)
- Background and aims (what you’re trying to find out and why it’s important)
- Literature review (of relevant previous research which you will build on or extend)
- Research methods – plan and justification (what you did to meet the aims, and why it was a sensible approach)
- Analysis (in detail, to convince sceptical readers and impress examiners: important tables, diagrams etc must be in the text, only details in the appendix)
- Results, conclusions, recommendations, limitations, further research
- References (list works cited in the text in alphabetical order)
- Appendices – ethics form, extra details for the reader
Flexible designs can be more flexible – but everything must be there!

Features of a good project: 

Features of a good project
- Obviously important and interesting
- Difficult to disagree with, because the arguments and analysis are detailed, clear and obviously valid, and possible objections are considered
- Fits together: aims met by methods (check this in your project plan), conclusions follow from analysis

References and citations: 

References and citations
- You must give references to publications which you draw on or quote
- Exact (word for word) quotes must be in “…” and the reference must be given – maximum one paragraph or so
- Use one of the standard referencing systems – preferably the Harvard (see library leaflet or university website)
- Copying word for word without “…” and a reference is treated as cheating and you will fail!

Harvard referencing system: 

Harvard referencing system Very important to use this (or another established system) Seems easy to me, but causes a lot of difficulty Check library website (search for Harvard) and/or copy an academic article or book. All references in text like Smith (2001) Then alphabetical list of references at the end. Should include everything referred to.

What must be in your project plan (proposal)?: 

What must be in your project plan (proposal)? See assignment description and Saunders et al (2003) Section 2.5 You may be able to put parts of it in your project! The literature review is important (just a summary here – more in final project – but must refer to literature!) You should describe and justify your research methods in as much detail as possible

Writing style (1): 

Writing style (1). Keep it simple: short sentences, paragraphs, clear subheadings. Read it through to make sure you can follow it. Swap with a friend and check each other’s work.

Writing style (2): 

Writing style (2)
1 I think the EMH was true in this situation …
2 In my opinion the EMH was true …
3 In the author’s opinion the EMH was true …
4 The evidence suggests that the EMH was true …
5 This shows that the EMH was true …
Use 4 or 5. Avoid 1, 2 or 3.

Writing style (3): 

Writing style (3)
1 I work for … and the problems are … / I interviewed three managers.
2 The author works for … and the problems are … / The author interviewed three managers.
3 ??? / Three managers were interviewed.
Opinions vary here. I (MW) prefer (1). Others prefer (2) or (3). Check with your supervisor.

Design of research projects: 

Design of research projects Design means deciding on the methods and approaches which will best achieve your aims Needs thinking out carefully starting from your aims Check the proposed design will achieve all your aims The design may require the use of a theoretical framework – which should be explained and its use justified Many designs incorporate several different approaches Some would advocate “flexible” designs (see Robson, 2002)

Designing research is not easy!: 

Designing research is not easy! Think about how you can best achieve your aims Be imaginative Think about it again … and again Check you’ve found the best way you can for meeting all your aims

A general design for a typical Masters degree project: 

A general design for a typical Masters degree project. If the aim is to find a good strategy to "improve" X in organisation Y, then a possible design may be:
1 Survey/case studies of Org Y to investigate problems and opportunities
2 Survey/case studies to see how other organisations do X and which approaches work well
3 Based on (1), (2), the literature, and perhaps creative inspiration, consultations within the organisation, simulation or modelling, devise a strategy likely to improve X
4 Try/test/pilot/monitor the proposed strategy

Then …: 

Then … Having designed your research get someone to act as a devil’s advocate and tell you What’s wrong with it – why it may fail to deliver what you are aiming for What may go wrong

Evaluating research: 

Evaluating research. Relevant to planning your own research (check it against the criteria in the following slides), and to critically reviewing research. These slides are intended as a checklist for your research and others’.

Good research should be:: 

Good research should be:
- As useful or interesting as possible
- As user-friendly as possible
- As uncriticisable (trustworthy) as possible
Trustworthiness or credibility is particularly important. Can you trust the conclusions? It is essential to give readers enough detail to check.

Check research using: 

Check research using:
- The 3 U’s
- The 2 CRITIC’s
- 2 extra checks
This applies to both your research and research by other people.

Usefulness of research: 

Usefulness of research Are the conclusions likely to be useful or interesting? Do they tell us something we didn’t know already? Do they have clear implications? Are they likely to lead to improved practice in the future?

User-friendliness of research: 

User-friendliness of research. How easy is it to follow the article or project report? Research should be presented in as user-friendly a manner as possible. This is important, but don’t forget that the difficulty of the topic may mean that the research is inevitably difficult to follow. Who are the intended readers? If you criticise a research article on the grounds that it is difficult to understand, you should be able to explain how it could be improved!

Uncriticisability (trustworthiness / credibility) of research: 

Uncriticisability (trustworthiness / credibility) of research. Are the results and conclusions likely to be valid (right)? Do you believe / accept them? Are there any major difficulties? Think carefully. Use your common sense – see also books on critical thinking (e.g. Cottrell, 2005).

Trustworthiness of research: main things to check: 

Trustworthiness of research: main things to check
- Are Cause-effect assumptions justified?
- Is the sample likely to be Representative of what is intended? (Or: to what other people / organisations / times / conditions / etc can the results be generalised?)
- Do measurements (Indicators) measure what they are supposed to measure?
- Are the Theoretical assumptions OK?
- Is information from Informants likely to be valid?
- Can you be sure that the results are not caused by Chance factors? (Could you be confident of getting the same results with another sample, at another time, with another set of interviewers?)


Jargon Most of these checks are covered by technical jargon, concepts and techniques – e.g. lots of types of validity (internal, external, construct, face …), lots of types of reliability, ideas about test and scale construction (see Robson, 2002), etc Read up only those areas which you think are relevant. I have largely avoided jargon here. This should always include sampling – always necessary to consider whether your sample is likely to be representative of your area of interest.

Deciding what is cause and what is effect: 

Deciding what is cause and what is effect Take care – may be more complicated than it appears Experiments or quasi-experiments (including a trial of a new innovation) may be necessary for definite answers but … Alternatively …

To ensure results Representative … check Sampling: 

To ensure results Representative … check Sampling Sampling is usually necessary (why?), but check how it’s done. A silly sample can make results meaningless. Occasionally a census may be possible, but … Normally we want the sample to be representative of the population or wider context—so you must check if this is likely. Need to consider how the sample is selected and its size.

How to sample: 

How to sample
1 Clarify the target population of people, times, organisations, shares (i.e. the whole group in which you are interested)
2 Decide how to sample. There are many methods of taking a sample from your target population, including:
- Random
- Stratified
- Purposive
- Convenience (opportunity)
- Cluster, snowball, quota, etc (see a book)
Usually random samples are best for large samples, and purposive samples for small samples analysed qualitatively.

Random sampling: 

Random sampling
- Make a numbered list of the target population (a sampling frame)
- Use random numbers to choose the sample
- Each member of the population has the same chance of being selected (and it’s independent of any biases)
- Each member of the sample is selected independently
What if some members of the sample can’t be found or won’t help? (Do not substitute more cooperative individuals because …) The principle is to ignore all variables and choose at random. This allows for all “noise” variables.
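The procedure above can be sketched in a few lines of Python. This is only an illustration: the sampling frame of numbered "organisations" is invented.

```python
import random

# A numbered list of the target population: the sampling frame
# (the organisation names here are invented for illustration).
sampling_frame = [f"Organisation {i}" for i in range(1, 651)]

# Draw a simple random sample: every member has the same chance
# of selection, and members are chosen independently of any bias.
random.seed(42)  # fixed seed only so the draw is reproducible
sample = random.sample(sampling_frame, k=20)

print(len(sample))       # 20 members
print(len(set(sample)))  # 20 - sampling is without replacement
```

Note that `random.sample` draws without replacement, which matches the slide: no member of the frame can be picked twice, and substituting "more cooperative" members afterwards would break the randomness.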

Sampling in practice: 

Sampling in practice
- Many samples are biased – which means the results will be biased regardless of the sample size – and so will not give a good idea of the population. E.g. TV audience research, word length, NRE, non-response bias in surveys, survivor bias in stock price samples.
- The ideal for large samples is random sampling, but this is often difficult to do properly. E.g. Iraq war death rate sample, TV audience research.
- Be suspicious of statistical results from purposive or convenience samples (especially small ones).
Is a large sample always better than a small sample? Is a sample of one ever OK?

A sampling problem: 

A sampling problem An MBA student sends out 100 questionnaires to 100 organisations asking if they would be interested in a particular service. Twenty are returned, and of these 6 indicated they may be interested in the service There are 650 firms in the relevant industry sector How big is the market for the service? Are you sure?

Measurements (Indicators): 

Measurements (Indicators)
- Are measurements of things like satisfaction, profit, quality, intelligence etc OK?
- If you are using something like the average response to a series of questions, take care! See the literature on tests and scales (e.g. Robson, 2002: 292-308). If possible use an existing measurement system.
- It is conventional to distinguish between Validity (are you measuring the right thing?) and Reliability (consistency).
- If possible use triangulation (check with information from a different source).

Reliability of measurements: 

Reliability of measurements
- Same answer at different times?
- If anything depends on subjective judgments, check agreement between different judges (e.g. marking projects)
- If you’re asking a number of questions to get at the same information, check the relationship between answers to these questions – with two questions use a correlation coefficient, with more than two use Cronbach’s alpha (if you are keen on stats!) – see http://www.statsoft.com/textbook/stathome.htm
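Cronbach's alpha is simple enough to compute by hand from its standard formula: alpha = (k/(k-1)) x (1 - sum of item variances / variance of totals). A minimal sketch, using invented responses from five people to three questions meant to tap the same attitude:

```python
from statistics import pvariance

# Invented data: 5 respondents x 3 questions on a 1-5 scale.
answers = [
    [4, 5, 4],
    [2, 2, 3],
    [5, 4, 5],
    [3, 3, 3],
    [1, 2, 1],
]

k = len(answers[0])  # number of items (questions)

# Variance of each question's answers, and of each respondent's total.
item_vars = [pvariance(col) for col in zip(*answers)]
total_var = pvariance([sum(row) for row in answers])

# Cronbach's alpha: internal consistency of the set of questions.
alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
print(round(alpha, 2))  # about 0.95 for this (deliberately consistent) data
```

The invented answers agree closely across questions, so alpha comes out high; scrambling one column would drag it down, which is exactly the "relationship between answers" check the slide describes.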

A measurement problem: 

A measurement problem How would you measure quality of service in a casino? How would you check if your proposed measure is valid / reliable / right / accurate?

A second measurement problem: 

A second measurement problem. Andy had answers from lots of questions on a SD, D, N, A, SA scale. He wanted a measure to tell him which questions produced responses which gave a clear overall view (COV) from his respondents. He defined his measurement as COV = abs(SD + D – A – SA) – N (where SD is the number of SD responses, etc, and abs = absolute value). He then highlighted questions for which COV > 0. Do you think this is a sensible measurement?
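Andy's COV measure is easy to try out. The response counts below are invented purely to see how the measure behaves on different answer patterns:

```python
def cov(sd, d, n, a, sa):
    """Andy's measure: COV = abs(SD + D - A - SA) - N,
    where each argument is the count of that response."""
    return abs(sd + d - a - sa) - n

# Invented response counts (SD, D, N, A, SA) for three questions.
questions = {
    "Q1": (10, 15, 2, 3, 1),   # mostly disagreement
    "Q2": (5, 6, 4, 7, 5),     # split opinion
    "Q3": (1, 2, 3, 12, 13),   # mostly agreement
}

for name, counts in questions.items():
    score = cov(*counts)
    print(name, score, "highlighted" if score > 0 else "not highlighted")
```

One thing to notice when answering the slide's question: a question split evenly between SD and SA scores negative (the SD+D and A+SA counts cancel), the same as a genuinely indifferent one, so the measure cannot distinguish polarised opinion from no opinion.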

Theoretical assumptions: 

Theoretical assumptions If the research uses a theory, is the theory right for the purpose? (And is it a “valid” theory?) A questionnaire or interview plan may be based on assumptions about what is relevant. Are these assumptions OK?

Validity of information from interviews or questionnaires: 

Validity of information from interviews or questionnaires
- Remember the distinction between facts, desires and opinions
- Are people telling you the truth? Perhaps they are biased (why might this be?), or don’t know and are guessing
- If possible check with another source (triangulation)

Take care with opinion surveys: 

Take care with opinion surveys. You can ask someone:
- What she did last week
- What she does in general terms
- Her opinion of what she does
- What she thinks other people do
- Her opinion of what she thinks other people do
- How she thinks things can be improved
- What she thinks about particular suggestions about how things can be improved
- What she likes / wants / values
- Etc
Think about what type of question you are using and whether it is really useful.

Making sure that you are not being misled by Chance: 

Making sure that you are not being misled by Chance
- Could your results just be due to chance?
- Have you taken account of sampling error? (If you repeated your research with another sample, are you sure the answer would be the same?)
- Is the sample large enough?
- Are measurements reliable?
Null hypothesis tests or confidence intervals can be used to answer these questions. (We will look at these later.)
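As a taster of the confidence intervals mentioned above, here is a sketch of an approximate 95% interval for a proportion, using the normal approximation. The survey numbers are invented for illustration.

```python
from math import sqrt

# Invented survey result: 60 of 150 respondents answered "yes".
yes, n = 60, 150
p = yes / n  # sample proportion: 0.4

# Approximate 95% confidence interval for the population proportion
# (normal approximation - reasonable for a sample of this size).
se = sqrt(p * (1 - p) / n)          # standard error of the proportion
low, high = p - 1.96 * se, p + 1.96 * se

print(f"{p:.2f} ({low:.2f} to {high:.2f})")
```

The interval (roughly 0.32 to 0.48 here) answers the slide's question directly: another random sample of 150 would very probably give a proportion inside this range, whereas a sample of 15 would give a far wider, less useful interval.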

The first CRITIC: 

The first CRITIC
- Cause and effect assumptions OK?
- Representative sample?
- Indicators (measurements) OK?
- Theoretical assumptions OK?
- Informants likely to be telling the truth?
- Chance ruled out as an explanation?
(The checks needed are mostly common sense – except for Chance.)

The second CRITIC: 

The second CRITIC
- C Claim?
- R Role of the claimant?
- I Information backing the claim?
- T Test?
- I Independent testing?
- C Cause proposed?
(From Wayne R. Bartz, “Teaching skepticism via the CRITIC acronym and the Skeptical Inquirer”, Skeptical Inquirer, Sept-Oct 2002.)

Two extra checks: 

Two extra checks
- Triangulation – compare results from different sources. Applies to data, methods, observers, theories (Robson, 2002: 174). Checks trustworthiness.
- Use of a devil’s advocate or critical friend. Get someone to try to be critical and find difficulties with your research – then fix or discuss the problems. Checks all 3 U’s.
Can you think of anything else?

Anything else…?: 

Anything else…? Is this list complete? Does it include all the flaws you have noticed in research?

Checklist: the 3 U’s, 2 CRITIC’s and Extra checks: 

Checklist: the 3 U’s, 2 CRITIC’s and extra checks
- Useful?
- User-friendly?
- Uncriticisable (trustworthy)?
- 2 CRITIC’s
- Extra checks: triangulation, devil’s advocate (critical friend)

To think about …: 

To think about … Choose a published research paper (e.g. the paper on TQM and small firms) and Check if it’s OK on the three U’s, and whether it can deal satisfactorily with the first CRITIC In particular, check the sampling method. What is sampled? Is the sample size and method OK? What would you have done? Are there any important issues that are not covered by the 3 U’s and the first CRITIC? How would you improve the research?

Critique of an article: 

Critique of an article Do you accept what the article says, or are there flaws in the research? Think about the article! Use your common sense. Remember lists of points to consider. Is it worth publishing? Could you do better? For a good mark read round the subject – eg other research on the same theme. Would the research benefit from some qualitative work, p values or confidence intervals, case studies, different perspectives, experiments…

Statistical data analysis: 

Statistical data analysis Go to http://userweb.port.ac.uk/~woodm/stats/StatNotes0.ppt

Qualitative data analysis: 

Qualitative data analysis
- The aim is detail and depth of understanding
- Demonstrate and understand possibilities, but not how frequently they occur
- Use direct quotes (“…”) as evidence and to reduce the danger of imposing your perspective
- Sometimes it may be helpful to summarise in a table or similar, or to use a coding scheme to analyse statistically (but be careful if the sample is very small!)
- More advanced possibilities in Saunders et al, Robson, www.qual.auckland.ac.nz/
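The "coding scheme" idea above can be sketched very simply: tag each transcript extract with a code, then count the codes. The extracts and the code labels here are entirely invented; with so few cases the counts only suggest possibilities, as the slide warns, not frequencies in any population.

```python
from collections import Counter

# Invented fragments from interview transcripts, each tagged with
# a code from a simple (hypothetical) coding scheme.
coded_extracts = [
    ("'the new system just slows me down'", "resistance"),
    ("'I wasn't shown how to use it'", "training"),
    ("'it saves me an hour a day'", "benefit"),
    ("'nobody asked us first'", "resistance"),
]

# Tally how often each code appears across the extracts.
counts = Counter(code for _, code in coded_extracts)
print(counts.most_common())
```

The direct quotes stay attached to their codes, so any pattern in the counts can be illustrated with the interviewees' own words rather than the researcher's paraphrase.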

Literature review: 

Literature review (see Saunders et al (2003) Chapter 3)
- Focus on relevant books, articles and theories
- Brief on general points
- More detailed on research of particular relevance to your project – you will need to search for articles using the library databases
- Critical
- Should lead into your method and analysis
- Must be properly referenced!

Philosophy of research: 

Philosophy of research. In the textbooks you will find discussions of positivism, social constructivism, phenomenology, etc. In my view, Robson (2002) is the best research methods text for philosophical concepts. Almost all the concepts and distinctions here are open to serious criticism – see Robson (2002). I wouldn’t suggest focusing on these ideas unless you are interested – in which case be critical of what you read! If you do want to go into philosophy, use a book like the Penguin Dictionary of Philosophy (Mautner, 2005) to check what the words mean. Also note that there are arguments against being prescriptive about research methods, in books with titles like After method: mess in social science research (Law, 2004) and Against method: outline of an anarchistic theory of knowledge (Feyerabend, 1975).

Some ideas which are worth mulling over: 

Some ideas which are worth mulling over Detailed study of a small sample vs less detailed study of a large sample Induction vs the Hypothetico-deductive method (Popper) vs Following a framework / paradigm / theory vs Deduction Subjective vs Objective; Facts vs Values

Some misguided platitudes: 

Some misguided platitudes. The following are often assumed (I think wrongly):
- There are two distinct kinds of research: Quantitative (=positivist=hypothetico-deductive) and Qualitative (=phenomenological=inductive). Instead …
- Positivist research (only) starts from hypotheses. Instead ...
Academics tend to disagree about many of these issues. If you do decide to go into them, please think hard, and don’t accept everything you read in the textbooks uncritically!

Qualitative vs quantitative: 

Qualitative vs quantitative Quantitative usually means statistical – usually with largish samples Qualitative means focusing on qualities – usually with smallish samples studied in depth Disadvantage with statistical approaches is that the data on each case is often very superficial Disadvantage with qualitative approaches is that case(s) studied may be untypical and can’t be used for statistical generalisation Often best to use both approaches This distinction often confused with other distinctions …

Regrettable tendency to reduce things to a simple dichotomy: 

Regrettable tendency to reduce things to a simple dichotomy If you’re a soft and cuddly person: Soft and cuddly (e.g. interpretivist, qualitative, inductivist …) … is good Hard and spiky (e.g. positivist, quantitative, deductivist, …) … is bad But if you are a hard person you will probably reverse good and bad above. There are really many different dichotomies. Reducing them all to one is neither right nor useful.

And …: 

And … To hard and spiky people, soft and cuddly research is lacking in rigour To soft and cuddly people, hard and spiky research is naïve and lacking in richness

Induction vs hypothetico-deductive method: 

Induction vs hypothetico-deductive method
- Generalise from the data without preconceptions (induction). Grounded theory. The rigour is in the process used to generate theory from data.
Versus
- Use data to test hypotheses or theories (the hypothetico-deductive method). Karl Popper. The rigour is in the testing.
Theory building vs theory testing is much the same distinction (see Saunders et al, 2007, pp 117-119). However, I don’t think these are the only choices …

Other useful approaches besides induction and hypothetico-deduction: 

Other useful approaches besides induction and hypothetico-deduction
- Use a framework or theory or “paradigm” (Kuhn, 1970) to define concepts, questions, and measurements, but without trying to test the theory. Arguably this is what most scientists do most of the time (cf. Kuhn, 1970). The rigour is in ensuring the theory is a good one, and in using it properly.
- Deduction from data, theories and frameworks. E.g. the differences between two quality standards can be deduced. The rigour is in checking the deduction and the information you start with. This differs from the hypothetico-deductive method in that the result is the deduction itself, not a confirmation, rejection or revision of a hypothesis or theory.
Note that this contradicts the assumption in Saunders et al (2007: 117-119) that there are just two approaches – “deductive” and “inductive”. I think they mean “hypothetico-deductive”, and they omit the two very important possibilities above.

An example …: 

An example … How would these four approaches work with a project of interest to you …

Karl Popper’s ideas (1): 

Karl Popper’s ideas (1) Science works by putting forward bold theories (or hypotheses) and then testing them as thoroughly as possible Provisionally accept theories that have withstood this testing process Theories must be sufficiently precise to be falsifiable – otherwise not proper science (eg Freud’s theories are too vague…)

Karl Popper’s ideas (2) - eg: 

Karl Popper’s ideas (2) – e.g. Einstein’s theory of general relativity predicts that light from a distant star will be bent by a small amount by passing close to the sun. Newton’s theory predicts the light will not be bent. It is only possible to check this during a total eclipse of the sun. In an eclipse in 1919 light was bent as Einstein’s theory predicted. Newton’s theory was falsified; Einstein’s lives on and seems much more credible.

Karl Popper’s ideas (3): 

Karl Popper’s ideas (3) Theories can come from anywhere – guesswork, intuition, other theories, etc The process of criticising theories and trying to show they are wrong is vital for science The method applies to both natural and social sciences How would you apply Popper’s ideas to a management research project? … in practice, has elements in common with a “critical” attitude …

Critical attitude: 

Critical attitude
- Try to anticipate and discuss criticisms
- Get a friend to act as a devil’s advocate
- Your work should be so convincing that it can’t be disputed!
- Think of any criticisms you have of articles you have read. Make sure the same faults don’t apply to your work.
(The word “critical” is sometimes used in a slightly different, more specific, sense.)

Questionnaire design: summary: 

Questionnaire design: summary
- Read a (chapter of a) book on questionnaires
- Develop a pilot. Remember questionnaires are far more difficult to design than they appear!
- Check with your pilot respondents: Is it clear? Is it interesting / appealing / user-friendly / not too long? Would you answer it? Does it provide (only) the information you want?
- Are you still sure a questionnaire is a good idea?

Questionnaire design (1): 

Questionnaire design (1)
- Write down what you want to find out
- Closed questions: tick boxes, rating (Likert) scales
- Open questions
- Pros and cons of each …
- Check your questions will enable you to find out what you need to for your research

Questionnaire design (2): 

Questionnaire design (2) Covering letter Pilot it 3-4 nice friendly people to tell you what’s wrong with it Pilot the analysis too Consider sample to send it to Anonymity / confidentiality How to send it / get it back (email?) What to do about non-response?

Questionnaire design (3): 

Questionnaire design (3) There are far too many questionnaires about - many of them very silly. What is the response rate likely to be? Would you fill it in? Are you sure a questionnaire is necessary??? Many companies have a policy of not responding to questionnaires Are there any alternatives? Check with your supervisor before sending it out
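The response-rate question above can be made concrete with a small calculation (the figures below are invented purely for illustration):

```python
# Hypothetical figures for a postal or email questionnaire
questionnaires_sent = 200
questionnaires_returned = 46

response_rate = 100 * questionnaires_returned / questionnaires_sent
print(f"Response rate: {response_rate:.0f}%")

# With a rate this low, ask whether the minority who replied are
# likely to differ systematically from the majority who did not -
# if so, the results may be badly biased.
```

Even a rough calculation like this, done before sending the questionnaire out, helps you judge whether the likely sample will be worth analysing.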

Interview design: in brief (1): 

Interview design: in brief (1) Read a (chapter of a) book on interviews May follow or precede a questionnaire, or stand alone Be clear what you want to find out Consider telephone interviews Small sample. Don’t do too many interviews. Plan your questions. Should be open-ended and flexible, and aim for a detailed understanding Probes and prompts

Interview design (2): 

Interview design (2) Ask for permission to tape record Transcribe interesting bits to get quotes for your project Get interviewee relaxed. Anonymity / confidentiality (take care here!) Check you’ve covered everything Send interviewee transcript afterwards? Some transcripts or parts of transcripts in appendix?


Finally… I wrote the following slides after reading several less-than-perfect projects…

Is it really going to be useful?: 

Is it really going to be useful? What use do you want the results to be? This may be a practical use – to help make more money, or to make life easier – or a contribution to theory, but it should be something that is really worth achieving. Then work backwards from what you want to achieve to the best methods to achieve it. It is very easy to waste a lot of time and energy getting results that don’t answer the real question and leave the reader asking “so what?” (Common sense is often the best guide.)

Take care with opinion surveys: 

Take care with opinion surveys Suppose your research is about risk management and its effectiveness. You decide to investigate by means of a questionnaire and come up with: “70% of people in the organisation think our risk management is unsatisfactory.” You then present this as the rationale behind a recommendation to improve risk management. Many projects are a bit like this. But … so what? How do they know? And besides, the point is how to improve it!

Be explicit about sources of data and method of analysis: 

Be explicit about sources of data and method of analysis You are doing formal research, which means others should be able to check the credibility of your work. Therefore you should give reasonable details of how the data was obtained and how you analysed it (as well as references to published work). Eg how samples were selected, how interviews were organised, quotes from questionnaires and interviewees, how statistics were calculated, etc.

And before you hand it in: 

And before you hand it in Have you shown your supervisor and another critic a draft and taken account of any comments? Are the references complete and in order? Is all verbatim quotation referenced? Have you described the analysis you’ve done in the main text? For a good mark we need to see that you have done a reasonable amount of rigorous, systematic analysis! Have you made your conclusions and recommendations clear? Do they refer back to the analysis and meet your aims? Have you mentioned any possible limitations, and made suggestions for future research?

Interviews and qualitative research: more detail: 

Interviews and qualitative research: more detail I am grateful to Alan Rutter for the next few slides, some of which I have edited slightly

Primary data collection: interviewing: 

Primary data collection: interviewing Useful for accessing people’s perceptions, meanings, definitions of situations, eliciting their constructions of reality, etc. Alternative types structured semi-structured in-depth Ethical considerations


Qualitative interviews Forms of qualitative interviews (after Saunders et al., 2000): One to one – face to face interviews, telephone interviews One to many – focus group interviews

Interview respondents: 

Interview respondents Who will be interviewed and why? How many will be interviewed and how many times? When and for how long will each person be interviewed? Where will each person be interviewed? How will access to the interview situation be organised?

Sampling for small sample qualitative research: 

Sampling for small sample qualitative research Usually best to use theoretical (purposive) sampling - the selection of individuals who you think will best contribute to the development of a theory Results apply to immediate situations May be tentatively generalised, but the small sample means …

Difficulties with interviews: 

Difficulties with interviews Mistrust by respondents e.g. researcher is a management spy Loyalty to organisation/colleagues Adherence to stereotypical views rather than their own inner feelings and knowledge Complete indifference An opportunity for respondent to ‘sell’ their ideas

Managing the interview: 

Managing the interview Preparation for the interview the interview schedule Beginning the interview - establishing rapport Communication and listening skills Asking questions sequence and types of questions Closing the interview

Verifying interview data: 

Verifying interview data Body language Material evidence e.g. company/factory tour Writing notes as soon as possible after interview Use informant verification and secondary sources


Remember Need to demonstrate rigour Good research acknowledges bias and the need to expose it. Devil’s advocates are useful for revealing bias and other problems, but are seldom used. …Is all research biased?

More on Literature reviews: 

More on Literature reviews I am grateful to Andreas Hoecht for the next 16 slides Don’t forget the literature review should be clearly focused on your research aims, and it should be critical in the sense that you should point out strengths and weaknesses where appropriate

Research methods: writing a literature review (Andreas Hoecht): 

Research methods: writing a literature review (Andreas Hoecht) 1. Finding material 2. Mapping relevant literatures 3. Evaluating literature 4. Some practical hints

Writing a literature review Finding material: 

Writing a literature review Finding material There is no prescribed number of sources you should use – it depends on the topic Be wary if you feel that you are drowning in material you found for your topic – it probably means you have not narrowed it down enough Be wary if you find no sources or very few sources. You normally need some academic sources to be able to write a meaningful literature review

What secondary sources should you use?: 

What secondary sources should you use? Books: Use textbooks only to get an overview of a topic Edited academic books (with chapters by different authors) can be very useful. They often explore a topic from different angles or cover different aspects of it Don’t use “airport bookstall” books as serious sources

Secondary sources: 

Secondary sources Journals: Peer-reviewed academic journal articles should normally be the backbone of your literature review They provide up-to-date discussions of topics and are usually more narrowly focused than textbooks “Trade journals” (non peer-reviewed) can provide good introductions to topics and overviews of developments but carry considerably less academic “weight” than academic journals.

(Secondary) sources: 

(Secondary) sources Sometimes you may be able to find article titles like “ …: A review of the literature” in academic journals. They can save you lots of work Internet: Make sure you are able to distinguish between credible sources and Joe Bloggs’s unsubstantiated views Reputable organisations’ websites – government agencies, international organisations – can be good sources of information (but may have a bias/self-interest)

(Secondary) sources: 

(Secondary) sources Dissertations and PhDs: Checking dissertations stocked in the library may help you to get a feel for what is expected in a dissertation as well as provide information on a topic Government reports/EU reports/other organisations’ reports can be very useful (but are sometimes biased).

Searching for literature: 

Searching for literature The key is the use of electronic databases Some databases are full text (you can download articles directly), others are bibliographic databases (you need to check with the library or use inter-library loan requests) Business Source Premier/Emerald Full Text/Econlit/Science Direct are all recommended Be patient and creative in the use of keywords

Searching for literature: 

Searching for literature CD-Rom newspaper databases (FT, Economist) can be useful tools Financial Data and Marketing databases mainly provide primary data

Mapping out relevant literatures: 

Mapping out relevant literatures Don’t put everything you find or everything you read in your literature review Time spent on familiarising yourself with and assessing literature for relevance is never wasted Only after you have gained a good overview of the literature will you be able to decide on your particular “angle” and your research questions

Mapping out relevant literature: 

Mapping out relevant literature Your database search should tell you how much and what type of literature is available For some well-researched topics you will be able to concentrate on the literature directly dealing with your specific topic For other research ideas, you may need to think about “related areas”, similar experiences in other industries, or possible insights from other subject disciplines

Mapping out relevant literature: 

Mapping out relevant literature A simple example: If you are interested in TQM and small firms you may wish to Look at the TQM literature in general for the pros and cons, constraints and motives Identify success and failure factors from the TQM literature Check the small business literature for general business conditions and constraints Check the small business literature to find out if these success factors apply there

Mapping out relevant literature: 

Mapping out relevant literature You can draw this as a conceptual map of overlapping circles or as a flow diagram if this suits your learning style Brainstorming and drawing conceptual maps is best done after you have gained a feel for the literature from your literature search

Evaluating literature: 

Evaluating literature This becomes easier with experience When reading literature, identify the key arguments that are made and the reason(s) for the conclusions drawn These should be derived from logical deduction (a conclusion following necessarily from premises) and/or based on empirical evidence

Evaluating literature: 

Evaluating literature Check the logic of the arguments made Does this necessarily follow? Check the supporting evidence Is this data relevant? Is it meaningful and accurate? Could it be interpreted in another way? Which data would I need to challenge this? Check for flaws: tautologies, simplistic analogies, redefinition of terms, moral judgements (ought to)

Some practical hints: 

Some practical hints Make sure you refer to key texts that are frequently cited in the literature Find out whether there are different “schools” or “camps” in the literature and cover their positions. Use your research questions to structure your literature review Check the validity (logic, empirical evidence) of arguments made Make clear on what basis you decide to side with a “camp” or author or why you remain unconvinced or oppose a judgement

Some practical hints: 

Some practical hints Don’t overstate your case and be realistic about what you can conclude Be particularly fair to views and arguments you don’t agree with (avoid being seen as biased) Don’t be shy about critiquing established “trade names” (academic gurus) Write your literature review for non-specialists and avoid jargon Make it well structured and easy to read
