Strauss, Anselm L. and Juliet M. Corbin. 1998. Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory. Thousand Oaks, CA: Sage Publications.
Part One: Basic Considerations
“How can I make sense out of all of this material? How can I have a theoretical interpretation while still grounding it in the empirical reality reflected by my materials? How can I make sure that my data and interpretations are valid and reliable? How do I break through the inevitable biases, prejudices, and stereotypical perspectives that I bring with me to the analytic situation? How do I pull all of my analyses together to create a concise theoretical formulation of the area under study?” (p. x)
Differentiates between methodology (“a way of thinking about and studying social reality” (3)) and methods (“a set of procedures and techniques for gathering and analyzing data” (3)). Coding as “the analytic processes through which data are fractured, conceptualized, and integrated to form theory” (3). Offering of researchers’ interpretations of science – though different, they impact how we think about phenomena and interpret theory in the future. Flexibility and openness in research and interpretation lead to ambiguity – however, this is a recognition of the complexity of the phenomenon and the taken-for-granted nature of the concepts. Self-reflexivity is critical, along with “appropriateness, authenticity, credibility, intuitiveness, receptivity, reciprocity and sensitivity” (Rew, Bechtel, and Sapp 1993) – writing for both academic and non-academic audiences – to analyze the situation critically, recognize bias, remain responsive to respondents, and stay absorbed in the work process. Grounded theory developed by Glaser and Strauss – Strauss influenced by the need for empirical pre-observation, theory’s relationship to data, respondents as agents, and the idea that people act based on meaning, which is defined through situational and (re)produced interactions – the relationship between structure and process. Glaser as more quantitative – the need to make comparisons to develop and relate ideas. Qualitative as any data not arrived at through statistical means – experience, behaviors, feelings, interactions – mostly interpretive of observations and interviews, film, census information. Grounded theory as “theory that was derived from data, systematically gathered and analyzed through the research process” (12) – does not start with a preconceived theory (unless the purpose is to extend existing theory), in hopes of resembling the ‘reality’ of the data. Coding procedures: build, don’t test theory; offer analytic tools for processing data; consider alternative meanings; identify and relate concepts.
Description, Conceptual Ordering, and Theorizing
Description: “The use of words to convey a mental image of an event, a piece of scenery, a scene, an experience, an emotion, or a sensation; the account related from the perspective of the person doing the depicting” (15) – details often un/consciously selected by the author. Descriptions should not include inference or analysis – this comes a little later. Conceptual ordering: “organizing (sometimes rating) of data according to a selective and specified set of properties and their dimensions” (15) – categorizing data to make sense of the patterns that emerge and to account for outliers – categories, stages/steps, actions. Theory: “a set of well-integrated concepts related through statements of relationship, which together constitute an integrated framework that can be used to explain or predict phenomena” (15) – Description is not a theory, nor is it a means to predict or explain. “To explain or predict, we need a theoretical statement, a connection between two or more concepts” (Hage 1972, 34). Theories study several phenomena and properties – they may be substantive (specific to a group or place) or formal (more generalizable, applicable to multiple areas). Remarks that feminism, structuralism, and interactionism are NOT theories, but merely philosophies – not a “well-developed and related set of explanatory concepts about how the world works” (24). Qualitative research can validate a theory – also, simply applying theory to data is not theorizing – one must build or broaden a theory. Who/what/where/how/why (to) interview? Triangulation not for the sake of triangulation, but as a means to develop comprehensive and integrated theory – good to use any/every method appropriate.
The Interplay Between Qualitative and Quantitative Theorizing
“However, one must remember that because emergence is the foundation of our approach to theory building, a researcher cannot enter an investigation with a list of preconceived concepts, a guiding theoretical framework, or a well thought out design. Concepts and design must be allowed to emerge from the data. Once relevant concepts and hypotheses have emerged from and been validated against data, the researcher might turn to quantitative measures and analysis if this will enhance the research process” (34).
Defines objectivity: “the ability to achieve a certain degree of distance from the research materials and to represent them fairly; the ability to listen to the words of respondents and to give them a voice independent of that of the researcher” (35). Differentiation between technical and non-technical literature – technical as background, supplemental information; non-technical (interviews, documents, etc.) as a source (or supplement) of data and emerging concepts. Sources of research questions: suggested by others, experiential, based in non/technical literature, or arising in the research itself. Justification for experiential RQ development: “Choosing a research problem through the professional or personal experience route might seem more hazardous than choosing one through the suggested or literature routes. This is not necessarily the case. The touchstone of one’s own experience might be a more valuable indicator of a potentially successful research endeavor than another’s more abstract source” (38). An RQ does not necessarily have DVs and IVs, but merely identifies the phenomenon to be studied. In GT, analysis is done through data collection, and vice versa. “It is the analysis that drives the data collection. Therefore, there is a constant interplay between the researcher and the research act. Because this interplay requires immersion in the data, by the end of the inquiry, the researcher is shaped by the data, just as the data are shaped by the researcher” (42) – brings up questions of objectivity and sensitivity.
S&C argue that objectivity is possible and that subjectivity should be minimized by ‘giving voice’ to respondents and by turning to the literature to describe/compare similar phenomena – such comparative examples do not give data, but merely a means of recognizing instances in data and viewing phenomena at dimensional, property-based levels. Minimizing subjectivity also means examining multiple viewpoints of the same phenomena; observing multiple and varied representatives of actors, processes, places, events, and times; and remaining skeptical and provisional (as all concepts seem to vary with conditions). Familiarity with the literature can be useful for catching subtleties, but may also block creativity in interpretation. Theoretical orientations and subject matters impact what/how we methodologically examine. Reciprocal and ongoing engagement with the literature – as a source, as a supplement, as a means to demonstrate data.
Part Two: Coding Procedures
Analysis Through Microscopic Examination of Data
Microanalysis: “the detailed line-by-line analysis necessary at the beginning of a study to generate initial categories (with their properties and dimensions) and to suggest relationships among categories; a combination of open and axial coding” (57). Line-by-line can mean by word, sentence, or paragraph; it is not static or rigid, and applies to field notes, interviews, pictures, etc. Analysis covers the data as collected, the observers’ interpretations of those events, and the interplay between observation and observer. Words may carry different meanings for different interpreters. “Doing microanalysis compels the analyst to listen closely to what the interviewees are saying and how they are saying it. This means that we are attempting to understand how they are interpreting certain events. This prevents us from jumping precipitously to our own theoretical conclusions, taking into account the interviewees’ interpretations. It helps us to avoid laying our first interpretations on data, forcing us to consider alternative explanations. Also, if we are fortunate, then participants will give us in vivo concepts that will further stimulate our analyses” (65). Use of theoretical and probing questions (who, what, why, etc.) works to extend dimensions of interpretation – giving words properties and consequences, making comparisons from concept to concept, and examining their use within the narrative offered. This surfaces gaps in the research and points of insecurity – items to research more thoroughly – and dispels common assumptions about the ‘meaning’ attached to a phrase or idea.
Basic Operations: Asking Questions and Making Comparisons
Theoretical sampling: “Sampling on the basis of emerging concepts, with the aim being to explore the dimensional range or varied conditions along which the properties of concepts vary” (73). Ask questions, make comparisons. Types of questions: sensitizing (what do the data indicate – who/what/where/when are processes taking place, who are the actors, what are the consequences); theoretical (offer process, variation, and interconnection of concepts – how do these items relate, what happens if, how do these things change, what’s the bigger picture); structural (related to next steps – what permissions and sources do I need, what is the saturation point, where are the gaps); and guiding (open-ended; work to provide examples of initial inquiries/concepts). Comparison as a point of classification, or to lead us into deeper theory and greater abstraction (yet specification and definition of the studied phenomenon), and to question our biases about studied concepts.
Techniques: specification (looking at one particular phrase); inverting the statement (if something is easy, what would it mean for it to be difficult?); comparing one example to another; thoroughly questioning the use of absolute terms (everyone, always, never, sometimes, etc. – what happens if this doesn’t hold, what works to make this so?); and examining what isn’t being said.
Open Coding
Open coding: “the analytic process through which concepts are identified and their properties and dimensions are discovered in data” (101). Properties: “characteristics of a category, the delineation of which defines and gives it meaning” (101). Dimensions: “the range along which general properties of a category vary, giving specification to a category and variation to the theory” (101). In open coding, data are broken down, closely examined through comparison, and grouped into categories. In axial and selective coding, data are reassembled by relating categories, and these relationships are now considered hypotheses. Conceptualizing as grouping similar events under classifications and related meanings. “Any particular object can be named and thus located in countless ways. The naming sets it within a context of quite differently related classes. The nature or essence of an object does not reside mysteriously within the object itself but is dependent upon how it is defined (Strauss, 1969, p. 20). But also, the direction of activity depends upon the particular ways that objects are classified…. It is the definition of what the object ‘is’ that allows action to occur with reference to what it is taken to be” (104). In vivo codes – when the words of respondents are used as a point of conceptualizing, for the imagery or meaning offered. In vivo coding is done by sorting data into meanings and groups based upon the imagery evoked.
Axial Coding
Axial coding: “the process of relating categories to their subcategories, termed ‘axial’ because coding occurs around the axis of a category, linking categories at the level of properties and dimensions” (123) – the reassembling of ideas broken apart during open coding; requires some categories to be formed, but also openness to how categories relate. The integration of structure and process (what exists, and why/how). Tip: look for words such as since, when, and because – these allude to process. Examination of conditions and consequences – “sets of events and happenings that create the situations, issues, and problems pertaining to a phenomenon and, to a certain extent, explain why and how persons or groups respond in certain ways… may arise out of time, place, culture, rules, regulations, beliefs, economics, power, or gender factors as well as the social worlds, organizations, and institutions in which we find ourselves along with our personal motivations and biographies” (130). These may shift or change over time, be micro or macro, and be a cause or an impact – relational hypotheses are derived from these categorical and conditional memberships. Saturation occurs when no (or few) new ideas, properties, dimensions, or conditions emerge.
Selective Coding
Selective coding: “the process of integrating and refining the theory” (143) – bringing data from individual instances into a full relational scheme; an abstraction process. Choice of a central category – one able to organize and relate to all other categories – what the paper is about; it appears frequently in the data, involves no ‘forcing’ of data, and is abstract and flexible. Developing a story through the data – writing as narrative, as a descriptive concept, through diagrams, by revisiting memos, and by returning to the literature. When refining the theory, we review for consistency and logical gaps (filling in the properties and dimensions of poorly developed categories with density – demonstrating variation as well as completeness), trimming the fat, and validating the scheme.
Coding For Process
Observing change and disparity within structures and conditions. Process as “a series of evolving sequences of action/interaction that occur over time and space, changing or sometimes remaining the same in response to the situation or context” (165). Not a separate coding, but happens simultaneously with other analyses. Verbs to the adjectives of coding. How people personalize the structures analyzed.
The Conditional/Consequential Matrix
Conditional/consequential matrix: “an analytic device to stimulate analysts’ thinking about the relationships between macro and micro conditions/consequences both to each other and to process” (181). Conditions and consequences are not isolated, but are always related to interaction and processes, and are integrated into the text. These relationships are rarely linear (i.e., condition → action/interaction → consequence); instead, they exist in clusters and associate/covary with each other and with related interactions. Interactions can be those of individuals, but can also be between institutions. Non-action does not mean non-response or non-consequence. Contextualized on many “area” levels – global, national, regional, familial, individual, etc.
Theoretical Sampling
Theoretical sampling: “data gathering driven by concepts derived from the evolving theory and based on the concept of ‘making comparisons,’ whose purpose is to go to places, people, or events that will maximize opportunities to discover variations among concepts and to densify categories in terms of their properties and dimensions” (201). Sampling is based on tools, time, and the population to be studied. Sensitivity developed from emerging concepts helps fine-tune perception toward indicators of concepts in data. “It is not unusual in the early stages of a project for the investigator to overlook the significance of certain events. Later, when more sensitivity has developed, the investigator can legitimately return to data and recode them in light of these new insights” (206). Considerable wait time in preparation for sampling – observations or interviewing prior to “real” sampling. Relational/variational sampling – examining relationships of a concept while in axial coding. Selective coding promotes deliberate sampling – more purposeful as the process continues.
Memos and Diagrams
It is necessary to memo after every analytic session, but memos do not have to be long. Memos can result from other memos. Memos should be dated and include references to where the idea was derived, the code number of the interview/observation/document, page and line numbers, and other identifying information. They should have headings noting the categories or concepts to which the memos relate. Keep a separate document for emerging themes.
Part Three: Gaining Closure
Criteria for Evaluation
Are concepts generated? Are concepts systematically related? Are there many conceptual linkages, and are categories well-developed? (Do categories have conceptual density?) Is variation built into the theory? Are the conditions under which variation can be found built into the study and explained (not just as background knowledge, but as a part of the body)? Has process been taken into account? Do the theoretical findings seem significant, and to what extent? Does the theory stand the test of time and become part of the discussions and ideas exchanged among relevant social and professional groups?
Student Questions and Answers to These
Process is not just inductive – it is a cyclical application of induction and deduction from what is learned.
Hage, J. 1972. Techniques and Problems of Theory Construction in Sociology. New York: John Wiley.
Strauss, A. 1969. Mirrors and Masks. Mill Valley, CA: Sociology Press.