Baxter, Jamie and John Eyles. 1997. “Evaluating Qualitative Research in Social Geography: Establishing ‘Rigour’ in Interview Analysis.” Transactions of the Institute of British Geographers 22 (4): 505-525.

We have a problem in qualitative social geography in maintaining standards that ensure rigor and generalizability. The paper works to develop a framework of methodological tools meant to improve, not constrain, qualitative evaluation.
Observes a tension between creative description of independent sites and evaluation, that is, standardized procedures of reporting. Reflexivity is good, but how much is too much, to the point where it overshadows the interpretation? Notes (without reifying) the difference between academic and lay accounts of events; reflexivity is necessary for successful interpretation of qualitative work, and how we choose to frame our interpretations speaks to our evaluative methods.
Evaluation works through: plausibility of research design (methodology, methods, analysis tools); plausibility of accounts (how well the description matches the event); and appeal to the interpretive community (fit with existing bodies of literature)… along with responsibility in reflexivity, accuracy of data description, and establishing trustworthiness.
Triangulation alone does not necessarily make a study more rigorous. How are people recruited, and how many are interviewed? The generic way these details are reported makes it hard to judge the plausibility of the research design.
TIPS:
– Balance interpretation fairly with the inclusion of quotations, as interviewees’ meanings can help lend credibility to accounts
– Detail your interview practices
– Discuss how you analyze your data (systematically)
– Explain length and depth of immersion in fieldwork
– Revisit your respondents and participant-check your data… use these visits not to expand, but merely to verify your data!
– Provide a rationale for how you verify or establish validity in your findings.
– Why are your quotes important? What do they bring to the findings?
– Appealing to or utilizing a lot of previous literature does not, by itself, make your own research rigorous.
– Back up your scientific constructs or definitions with lay definitions (use of participant-derived terms to aid in developing theoretical concepts – Rose 1982)
– Make explicit the connection between data (characteristics) and theory (explanatory schema – 511)

The drive to make these “hidden” or undisclosed means of interpretation explicit is a point of science: to make knowledge, and the development of knowledge, public.


[Table of criteria for establishing trustworthiness not reproduced here.] Source: Lincoln and Guba (1985).

Credibility – “the degree to which a description of human experience is such that those having the experience would recognize it immediately and those outside the experience can understand it” (512, Lincoln and Guba 1985) – however, some respondents may give biased or partial accounts to researchers (Miles and Crush 1993), or may contradict each other. Convenience or snowball samples may not be the best, as the most accessible respondents may not be the most informative. WHAT IS YOUR SAMPLING STRATEGY?

Use multiple sources, multiple methods, multiple researchers, and multiple theories (Denzin 1978).
Utilize peer debriefing, negative case analysis, referential adequacy (checking your data against pilot and previous studies), and member checking. However, member checking has been questioned: “To assume that respondents can validate or even falsify accounts in some definitive way is to forget the social character of the relationship between researcher and participants and to assume that they have privileged access to the truth. Neither of these assumptions is sustainable” (Hammersley 1992).

Multi-site studies can be used as a means to create generalizability, even in case studies. Low-inference descriptors: two or more types of recordings of the same accounts, which other researchers can use to compare interpretations. An inquiry audit is a detailed account of how the research was done, examined by a peer; it involves a long process of “checking” throughout the research, instead of just at the end.


(Note to self: abide by the questions above when opening up proposal work.)

CITES:
Denzin, N. 1978. The Research Act. New York: McGraw-Hill.
Hammersley, M. 1992. What’s Wrong with Ethnography? London: Routledge.
Lincoln, Y. and E. Guba. 1985. Naturalistic Inquiry. Beverly Hills, CA: Sage.
Miles, M. and J. Crush. 1993. “Personal Narratives as Interactive Texts: Collecting and Interpreting Life Histories.” Professional Geographer 45(1): 84-94.
Rose, G. 1982. Deciphering Sociological Research. London: Macmillan.
