This section discusses the selection and justification of the appropriate research paradigm. The term paradigm, as used in this research, refers to the philosophies and beliefs that provide guidelines and principles in relation to how research is conducted (Guba & Lincoln 1994; Hussey & Hussey 1997; Ticehurst & Veal 1999). A research paradigm is a framework of assumptions that guides researchers in their work (Healy & Perry 2000; Thompson & Perry 2004).
A number of research paradigms exist including positivism, realism, critical theory and constructivism (Healy & Perry 2000; Perry, Riege & Brown 1999); positivist and critical interpretive (Ticehurst & Veal 1999); and positivist and phenomenological (Hussey & Hussey 1997). Although there is much debate amongst researchers regarding which paradigm is ‘best’, it is perhaps more useful to regard the alternatives as points on a continuum, with a paradigm and associated methodologies being chosen because they best suit the task at hand (Gummesson 2003; Hussey & Hussey 1997; Ticehurst & Veal 1999). Table 3.1 summarizes the key features of the two paradigms considered for this research, positivistic and phenomenological.
Table 3.1 – Features of the Two Main Paradigms

Positivistic paradigm:
Tends to produce quantitative data.
Uses large samples.
Concerned with hypothesis testing.
Data is highly specific and precise.
The location is artificial.
Reliability is high.
Validity is low.
Generalizes from sample to population.

Phenomenological paradigm:
Tends to produce qualitative data.
Uses small samples.
Concerned with generating theories.
Data is rich and subjective.
The location is natural.
Reliability is low.
Validity is high.
Generalizes from one setting to another.

Source: Hussey & Hussey (1997, p. 54)
Phenomenological research is also referred to as interpretive (Hussey & Hussey 1997), or critical interpretive (Ticehurst & Veal 1999), while Gummesson’s (2003) view is that all research is interpretive.
Research paradigms have three elements that assist with distinguishing between them. They are: ontology, the ‘reality’ that researchers investigate; epistemology, the relationship between that reality and the researcher; and methodology, the technique used to discover reality (Healy & Perry 2000, p. 119). These elements can be used to assess the appropriateness of a paradigm for a particular research problem, and as criteria for judging quality in research (Healy & Perry 2000).
To ensure reliability, quantitative research, under the positivist paradigm, requires a large amount of data from a broad range of people and organizations (Ticehurst & Veal 1999).
The intent was to explore the phenomena in question through focusing on the individuals who participate in these management processes, to minimize the distance between researcher and subject, and to create a rich description of the participant’s experience.
In this way, the research went through a cycle of observation – induction – theory building – observation – deduction – theory testing (Gummesson 2003; Perry 1998; Ticehurst & Veal 1999), via data collection and analysis from the target population. The intent was to use an exploratory approach, which was progressively refined through analysis of the data to form an explanatory approach (Eisenhardt 1989; Parkhe 1993; Yin 2003).
A phenomenological paradigm was selected for this research due to the exploratory nature of the research. The objective was to describe real-world practice, using the data collected to refine the theoretical model developed. The section that follows will examine the possible methodologies available with this paradigm.
Figure 3.3 – A Representative Range of Methodologies and their Related Paradigms
Quality, Validity and Reliability
Quality, validity and reliability in qualitative research relate to the ability of the research to be repeated, and to its success, or otherwise, in answering the research question (Hussey & Hussey 1997). This research has been carefully designed to achieve these aims.
As with the debate amongst scholars regarding which paradigm is ‘best’, there is also debate about which criteria are most appropriate to the various paradigms (Golafshani 2003; Healy & Perry 2000). Table 3.12 summarizes the criteria for the various paradigms. Irrespective of the various views, it is important to ensure that research is fit for purpose, a principle followed in designing this research.
Table 3.12 – Research Quality Criteria
‘Truth’ or credibility
Neutrality or confirmability
Peer and key informant
Source: Adapted from Healy & Perry (2000, pp. 121-122)
Techniques from Grounded Theory have been used to analyze the data collected for this research. This involved coding the data and identifying categories until a central category emerged. The quality of the process was ensured using the tests of construct validity, internal and external validity, and reliability. The next section will discuss how ethical issues were addressed for this research.
Analyzing qualitative data
Qualitative research produces large amounts of textual data in the form of transcripts and observational field notes
The systematic and rigorous preparation and analysis of these data is time consuming and labour intensive
Textual data are typically explored inductively using content analysis to generate categories and explanations; software packages can help with analysis but should not be viewed as short cuts to rigorous and systematic analysis
High quality analysis of qualitative data depends on the skill, vision, and integrity of the researcher
Relation between analysis and qualitative data
Textual data (in the form of field-notes or transcripts) are explored using some variant of content analysis. In general, qualitative research does not seek to quantify data. Qualitative sampling strategies do not aim to identify a statistically representative set of respondents, so expressing results in relative frequencies may be misleading.
Qualitative research uses analytical categories to describe and explain social phenomena. The term grounded theory is used to describe the inductive process of identifying analytical categories as they emerge from the data (developing hypotheses from the ground or research field upwards rather than defining them a priori). Initially the data are read and reread to identify and index themes and categories: these may center on particular phrases, incidents, or types of behavior. Sometimes interesting or unfamiliar terms used by the group studied can form the basis of analytical categories. Becker and Geer’s classic study of medical training uncovered the specialist use of the term “crock” to denote patients who were seen as less worthwhile to treat by medical staff and students.4
All the data relevant to each category are identified and examined using a process called constant comparison, in which each item is checked or compared with the rest of the data to establish analytical categories. This requires a coherent and systematic approach. The key point about this process is that it is inclusive; categories are added to reflect as many of the nuances in the data as possible, rather than reducing the data to a few numerical codes. Sections of the data, such as discrete incidents, will typically include multiple themes, so it is important to have some system of cross indexing to deal with this. A number of computer software packages have been developed to assist with this process (see below).
Indexing the data creates a large number of “fuzzy categories” or units.5 Informed by the analytical and theoretical ideas developed during the research, these categories are further refined and reduced in number by grouping them together. It is then possible to select key themes or categories for further investigation, typically by “cutting and pasting”, that is, selecting sections of data on like or related themes and putting them together. Paper systems for this (using multiple photocopies, cardex systems, matrices, or spreadsheets), although considered somewhat old-fashioned and laborious, can help the researcher to develop an intimate knowledge of the data. Word processors can also facilitate data searching, and split screen functions make this a particularly appealing method for sorting and copying data into separate files.
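The cross indexing and “cutting and pasting” described above can be mirrored in simple software. The following is a minimal sketch, with invented transcript segments and theme labels, of how inclusively coded segments might be regrouped under each theme:

```python
# Hypothetical illustration of cross indexing and "cutting and pasting":
# each transcript segment may carry several theme codes, and segments
# are then regrouped under each theme for further investigation.
from collections import defaultdict

# Invented example data: (segment text, assigned theme codes)
segments = [
    ("Patient felt dismissed during the consultation", ["communication", "trust"]),
    ("Nurse explained the procedure step by step", ["communication"]),
    ("Family worried about long waiting times", ["access", "trust"]),
]

# Cross-index: one segment can appear under multiple themes (inclusive coding)
by_theme = defaultdict(list)
for text, codes in segments:
    for code in codes:
        by_theme[code].append(text)

for theme, texts in sorted(by_theme.items()):
    print(theme, "->", len(texts), "segment(s)")
```

Note that because coding is inclusive, a single segment appears under every theme it touches, rather than being reduced to one code.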
Stages in the analysis of field notes in a qualitative study of ear, nose, and throat surgeons’ disposal decisions for children referred for possible tonsillectomy and adenoidectomy (with examples)
(1) Provisional classification-for each surgeon all cases categorised according to disposal category used (tonsillectomy and adenoidectomy or adenoidectomy alone)
(2) Identification of features of provisional cases-common features of cases in each disposal category identified (most tonsillectomy and adenoidectomy cases found to have three main clinical signs)
(3) Scrutiny of deviant cases-include in (2) or modify (1) to accommodate deviant cases (tonsillectomy and adenoidectomy performed when only two of three signs present)
(4) Identification of shared features of cases-features common to other disposal categories (history of several episodes of tonsillitis)
(5) Derivation of surgeons’ decision rules-from the features common to cases (case history more important than physical examination)
(6) Derivation of surgeons’ search procedures (for each decision rule)-the particular clinical signs looked for by each surgeon
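The deviant-case step above can be thought of as testing a candidate decision rule against every coded case. The sketch below uses invented signs, a made-up two-of-three threshold, and fabricated cases purely for illustration; it echoes, but does not reproduce, the study summarized here:

```python
# Hedged sketch of deriving a surgeon's decision rule from coded cases.
# The sign counts, history flag, and threshold are invented examples.
cases = [
    {"signs": 3, "history_of_tonsillitis": True,  "disposal": "T&A"},
    {"signs": 2, "history_of_tonsillitis": True,  "disposal": "T&A"},  # deviant: only two signs
    {"signs": 1, "history_of_tonsillitis": False, "disposal": "adenoidectomy"},
]

def predicted_disposal(case):
    # Candidate rule after accommodating the deviant case:
    # T&A when at least two signs are present, or the history is positive.
    if case["signs"] >= 2 or case["history_of_tonsillitis"]:
        return "T&A"
    return "adenoidectomy"

# Constant comparison: check the candidate rule against every coded case
mismatches = [c for c in cases if predicted_disposal(c) != c["disposal"]]
print("rule accounts for all cases:", not mismatches)
```

If any mismatches remained, the analyst would return to stage (3) and either modify the rule or the provisional classification.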
Applied qualitative research
The framework approach has been developed in Britain specifically for applied or policy relevant qualitative research in which the objectives of the investigation are typically set in advance and shaped by the information requirements of the funding body. The timescales of applied research tend to be short and there is often a need to link the analysis with quantitative findings. For these reasons, although the framework approach reflects the original accounts and observations of the people studied (that is, “grounded” and inductive), it starts deductively from pre-set aims and objectives. The data collection tends to be more structured than would be the norm for much other qualitative research and the analytical process tends to be more explicit and more strongly informed by a priori reasoning (box).6 The analysis is designed so that it can be viewed and assessed by people other than the primary analyst.
Five stages of data analysis in the framework approach
Familiarization: immersion in the raw data (or typically a pragmatic selection from the data) by listening to tapes, reading transcripts, studying notes and so on, in order to list key ideas and recurrent themes
Identifying a thematic framework: identifying all the key issues, concepts, and themes by which the data can be examined and referenced. This is carried out by drawing on a priori issues and questions derived from the aims and objectives of the study as well as issues raised by the respondents themselves and views or experiences that recur in the data. The end product of this stage is a detailed index of the data, which labels the data into manageable chunks for subsequent retrieval and exploration
Indexing: applying the thematic framework or index systematically to all the data in textual form by annotating the transcripts with numerical codes from the index, usually supported by short text descriptors to elaborate the index heading. Single passages of text can often encompass a large number of different themes, each of which has to be recorded, usually in the margin of the transcript
Charting: rearranging the data according to the appropriate part of the thematic framework to which they relate, and forming charts. For example, there is likely to be a chart for each key subject area or theme with entries for several respondents. Unlike simple cut and paste methods that group verbatim text, the charts contain distilled summaries of views and experiences. Thus the charting process involves a considerable amount of abstraction and synthesis
Mapping and interpretation: using the charts to define concepts, map the range and nature of phenomena, create typologies and find associations between themes with a view to providing explanations for the findings. The process of mapping and interpretation is influenced by the original research objectives as well as by the themes that have emerged from the data themselves
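As a rough illustration of the charting stage, the sketch below arranges invented distilled summaries (not verbatim text) by theme and respondent; the theme and respondent names are purely hypothetical:

```python
# Illustrative "chart" for the framework approach: one chart per key
# theme, with a distilled summary (not a verbatim quote) per respondent.
chart = {
    "access to care": {
        "respondent_01": "travels over an hour to the clinic",
        "respondent_02": "cannot get same-week appointments",
    },
    "trust in staff": {
        "respondent_01": "relies on one named nurse",
    },
}

# Mapping and interpretation then reads across the charts, theme by theme
for theme, entries in sorted(chart.items()):
    print(theme)
    for respondent, summary in sorted(entries.items()):
        print("  ", respondent, "-", summary)
```

Because every entry is already a summary, the chart supports the abstraction and synthesis the stage calls for, and it can be inspected by analysts other than the person who built it.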
Whereas a narrative study reports the life of a single individual, a phenomenological study describes the meaning for several individuals of their lived experiences of a concept or a phenomenon. Phenomenologists focus on describing what all participants have in common as they experience a phenomenon. The basic purpose of phenomenology is to reduce individual experiences with a phenomenon to a description of the universal essence (a “grasp of the very nature of the thing,” van Manen, 1990, p. 177). To this end, qualitative researchers identify a phenomenon (an “object” of human experience; van Manen, 1990, p. 163). This human experience may be phenomena such as insomnia, being left out, anger, grief, or undergoing coronary artery bypass surgery (Moustakas, 1994). The inquirer then collects data from persons who have experienced the phenomenon, and develops a composite description of the essence of the experience for all of the individuals. This description consists of “what” they experienced and “how” they experienced it (Moustakas, 1994).
Beyond these procedures, phenomenology has a strong philosophical component to it. It draws heavily on the writings of the German mathematician Edmund Husserl (1859-1938) and those who expanded on his views, such as Heidegger, Sartre, and Merleau-Ponty (Spiegelberg, 1982). Phenomenology is popular in the social and health sciences, especially in sociology (Borgatta & Borgatta, 1992; Swingewood, 1991), psychology (Giorgi, 1985; Polkinghorne, 1989), nursing and the health sciences (Nieswiadomy, 1993; Oiler, 1986), and education (Tesch, 1988; van Manen, 1990). Husserl’s ideas are abstract, and, as late as 1945, Merleau-Ponty (1962) still raised the question, “What is phenomenology?”
Looking across all of these perspectives, however, we see that the philosophical assumptions rest on some common grounds: the study of the lived experiences of persons, the view that these experiences are conscious ones (van Manen, 1990), and the development of descriptions of the essences of these experiences, not explanations or analyses (Moustakas, 1994). At a broader level, Stewart and Mickunas (1990) emphasize four philosophical perspectives in phenomenology:
• The intentionality of consciousness. This idea is that consciousness is always directed toward an object. Reality of an object, then, is inextricably related to one’s consciousness of it. Thus, reality, according to Husserl, is not divided into subjects and objects, but into the dual Cartesian nature of both subjects and objects as they appear in consciousness.
• The refusal of the subject-object dichotomy. This theme flows naturally from the intentionality of consciousness. The reality of an object is only perceived within the meaning of the experience of an individual. An individual writing a phenomenology would be remiss not to include some discussion about the philosophical presuppositions of phenomenology along with the methods in this form of inquiry. Moustakas (1994) devotes over one hundred pages to the philosophical assumptions before he turns to the methods.
Types of Phenomenology
Two approaches to phenomenology are highlighted in this discussion: hermeneutic phenomenology (van Manen, 1990) and empirical, transcendental, or psychological phenomenology (Moustakas, 1994). An educator, van Manen, has written an instructive book on hermeneutical phenomenology in which he describes research as oriented toward lived experience (phenomenology) and interpreting the “texts” of life (hermeneutics) (van Manen, 1990, p. 4). Although van Manen does not approach phenomenology with a set of rules or methods, he discusses phenomenological research as a dynamic interplay among six research activities. Researchers first turn to a phenomenon, an “abiding concern” (p. 31), which seriously interests them (e.g., reading, running, driving, and mothering). In the process, they reflect on essential themes, asking what constitutes the nature of this lived experience. They write a description of the phenomenon, maintaining a strong relation to the topic of inquiry and balancing the parts of the writing to the whole. Phenomenology is not only a description, but it is also seen as an interpretive process in which the researcher makes an interpretation (i.e., the researcher “mediates” between different meanings; van Manen, 1990, p. 26) of the meaning of the lived experiences.
Moustakas’s (1994) transcendental or psychological phenomenology is focused less on the interpretations of the researcher and more on a description of the experiences of participants. In addition, Moustakas focuses on one of Husserl’s concepts, epoche (or bracketing), in which investigators set aside their experiences, as much as possible, to take a fresh perspective toward the phenomenon under examination. Hence, “transcendental” means “in which everything is perceived freshly, as if for the first time” (Moustakas, 1994, p. 34). Moustakas admits that this state is seldom perfectly achieved.
However, I see researchers who embrace this idea when they begin a project by describing their own experiences with the phenomenon and bracketing out their views before proceeding with the experiences of others. Besides bracketing, empirical, transcendental phenomenology draws on the Duquesne Studies in Phenomenological Psychology (e.g., Giorgi, 1985) and the data analysis procedures of Van Kaam (1966) and Colaizzi (1978).
The procedures, illustrated by Moustakas (1994), consist of identifying a phenomenon to study, bracketing out one’s experiences, and collecting data from several persons who have experienced the phenomenon. The researcher then analyzes the data by reducing the information to significant statements or quotes and combines the statements into themes. Following that, the researcher develops a textural description of the experiences of the persons (what participants experienced), a structural description of their experiences (how they experienced it in terms of the conditions, situations, or context), and a combination of the textural and structural descriptions to convey an overall essence of the experience.
Procedures for Conducting Phenomenological Research
I use the psychologist Moustakas’s (1994) approach because it has systematic steps in the data analysis procedure and guidelines for assembling the textural and structural descriptions. The conduct of psychological phenomenology has been addressed in a number of writings, including Dukes (1984), Tesch (1990), Giorgi (1985, 1994), Polkinghorne (1989), and, most recently, Moustakas (1994). The major procedural steps in the process would be as follows:
• The researcher determines if the research problem is best examined using a phenomenological approach. The type of problem best suited for this form of research is one in which it is important to understand several individuals’ common or shared experiences of a phenomenon. It would be important to understand these common experiences in order to develop practices or policies, or to develop a deeper understanding about the features of the phenomenon.
• A phenomenon of interest to study, such as anger, professionalism, what it means to be underweight, or what it means to be a wrestler, is identified. Moustakas (1994) provides numerous examples of phenomena that have been studied. The researcher recognizes and specifies the broad philosophical assumptions of phenomenology. For example, one could write about the combination of objective reality and individual experiences. These lived experiences are furthermore “conscious” and directed toward an object. To fully describe how participants view the phenomenon, researchers must bracket out, as much as possible, their own experiences.
• Data are collected from the individuals who have experienced the phenomenon. Often data collection in phenomenological studies consists of in-depth and multiple interviews with participants. Polkinghorne (1989) recommends that researchers interview from 5 to 25 individuals who have all experienced the phenomenon. Other forms of data may also be collected, such as observations, journals, poetry, music, and other forms of art. Van Manen (1990) mentions taped conversations, formally written responses, accounts of vicarious experiences of drama, films, poetry, and novels.
• The participants are asked two broad, general questions (Moustakas, 1994): What have you experienced in terms of the phenomenon? What contexts or situations have typically influenced or affected your experiences of the phenomenon? Other open-ended questions may also be asked, but these two, especially, focus attention on gathering data that will lead to a textural description and a structural description of the experiences, and ultimately provide an understanding of the common experiences of the participants.
• Phenomenological data analysis steps are generally similar for all psychological phenomenologists who discuss the methods (Moustakas, 1994; Polkinghorne, 1989). Building on the data from the first and second research questions, data analysts go through the data (e.g., interview transcriptions) and highlight “significant statements,” sentences, or quotes that provide an understanding of how the participants experienced the phenomenon. Moustakas (1994) calls this step horizonalization. Next, the researcher develops clusters of meaning from these significant statements into themes.
• These significant statements and themes are then used to write a description of what the participants experienced (textural description). They are also used to write a description of the context or setting that influenced how the participants experienced the phenomenon, called imaginative variation or structural description. Moustakas (1994) adds a further step: Researchers also write about their own experiences and the context and situations that have influenced their experiences. I like to shorten Moustakas’s procedures, and reflect these personal statements at the beginning of the phenomenology or include them in a methods discussion of the role of the researcher (Marshall & Rossman, 2006).
• From the structural and textural descriptions, the researcher then writes a composite description that presents the “essence” of the phenomenon, called the essential, invariant structure (or essence). Primarily this passage focuses on the common experiences of the participants. For example, it means that all experiences have an underlying structure (grief is the same whether the loved one is a puppy, a parakeet, or a child). It is a descriptive passage, a long paragraph or two, and the reader should come away from the phenomenology with the feeling, “I understand better what it is like for someone to experience that” (Polkinghorne, 1989, p. 46).
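As a bookkeeping illustration of horizonalization and clustering, the sketch below uses invented significant statements and theme labels; it is not a substitute for the interpretive work the steps above describe:

```python
# Hedged sketch of Moustakas-style analysis bookkeeping: significant
# statements (horizonalization) are clustered into themes. All
# statements and theme labels here are invented for illustration.
significant_statements = [
    ("I could not sleep the night before surgery", "anticipatory fear"),
    ("The ward felt unfamiliar and loud", "disorienting setting"),
    ("I kept replaying the surgeon's words", "anticipatory fear"),
]

# Cluster the statements into themes (clusters of meaning)
themes = {}
for statement, theme in significant_statements:
    themes.setdefault(theme, []).append(statement)

# The textural description draws on "what" was experienced; the
# structural description would address the conditions ("how").
textural = "Participants described " + " and ".join(sorted(themes))
print(textural)
```

The composite description of the essence would then be written from these clusters, together with the structural account of context and conditions.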
Grounded Theory Research
Although a phenomenology emphasizes the meaning of an experience for a number of individuals, the intent of a grounded theory study is to move beyond description and to generate or discover a theory, an abstract analytical schema of a process (or action or interaction, Strauss & Corbin, 1998). Participants in the study would all have experienced the process, and the development of the theory might help explain practice or provide a framework for further research. A key idea is that this theory-development does not come “off the shelf,” but rather is generated or “grounded” in data from participants who have experienced the process (Strauss & Corbin, 1998). Thus, grounded theory is a qualitative research design in which the inquirer generates a general explanation (a theory) of a process, action, or interaction shaped by the views of a large number of participants (Strauss & Corbin, 1998).
Thus, grounded theory provides for the generation of a theory (complete with a diagram and hypotheses) of actions, interactions, or processes through interrelating categories of information based on data collected from individuals.
More recently, Charmaz (2006) has advocated for a constructivist grounded theory, thus introducing yet another perspective into the conversation about procedures.
Another recent grounded theory perspective is that of Clarke (2005) who, along with Charmaz, seeks to reclaim grounded theory from its “positivist underpinnings”. Clarke, however, goes further than Charmaz, suggesting that social “situations” should form our unit of analysis in grounded theory and that three sociological modes can be useful in analyzing these situations: situational, social world/arenas, and positional cartographic maps for collecting and analyzing qualitative data. She further expands grounded theory “after the postmodern turn” and relies on postmodern perspectives (i.e., the political nature of research and interpretation, reflexivity on the part of researchers, a recognition of problems of representing information, questions of legitimacy and authority, and repositioning the researcher away from the “all knowing analyst” to the “acknowledged participant”) (pp. xxvii, xxviii). Clarke frequently turns to the postmodern, poststructural writer Michel Foucault (1972) to help turn the grounded theory discourse.
Procedures for Conducting Grounded Theory Research
Although Charmaz’s interpretive approach has many attractive elements (e.g., reflexivity, being flexible in structure, as discussed in Chapter 2), I rely on Strauss and Corbin (1990, 1998) to illustrate grounded theory procedures because their systematic approach is helpful to individuals learning about and applying grounded theory research.
• The researcher needs to begin by determining if grounded theory is best suited to study his or her research problem. Grounded theory is a good design to use when a theory is not available to explain a process. The literature may have models available, but they were developed and tested on samples and populations other than those of interest to the qualitative researcher. Also, theories may be present, but they are incomplete because they do not address potentially valuable variables of interest to the researcher.
On the practical side, a theory may be needed to explain how people are experiencing a phenomenon, and the grounded theory developed by the researcher will provide such a general framework.
• The research questions that the inquirer asks of participants will focus on understanding how individuals experience the process and identifying the steps in the process (What was the process? How did it unfold?). After initially exploring these issues, the researcher then returns to the participants and asks more detailed questions that help to shape the axial coding phase, questions such as: What was central to the process? (the core phenomenon); what influenced or caused this phenomenon to occur? (causal conditions); what strategies were employed during the process? (strategies); what effects occurred? (consequences)
• These questions are typically asked in interviews, although other forms of data may also be collected, such as observations, documents, and audiovisual materials. The point is to gather enough information to fully develop (or saturate) the model.
• The analysis of the data proceeds in stages. In open coding, the researcher forms categories of information about the phenomenon being studied by segmenting information. Within each category, the investigator finds several properties, or subcategories, and looks for data to dimensionalize, or show the extreme possibilities on a continuum of, the property.
• In axial coding, the investigator assembles the data in new ways after open coding. This is presented using a coding paradigm or logic diagram (i.e., a visual model) in which the researcher identifies a central phenomenon (i.e., a central category about the phenomenon), explores causal conditions (i.e., categories of conditions that influence the phenomenon), specifies strategies (i.e., the actions or interactions that result from the central phenomenon), identifies the context and intervening conditions (i.e., the narrow and broad conditions that influence the strategies), and delineates the consequences (i.e., the outcomes of the strategies) for this phenomenon.
• In selective coding, the researcher may write a “story line” that connects the categories. Alternatively, propositions or hypotheses may be specified that state predicted relationships.
• Finally, the researcher may develop and visually portray a conditional matrix that elucidates the social, historical, and economic conditions influencing the central phenomenon. It is an optional step and one in which the qualitative inquirer thinks about the model from the smallest to the broadest perspective.
• The result of this process of data collection and analysis is a theory, a substantive-level theory, written by a researcher close to a specific problem or population of people. The theory emerges with help from the process of memoing, a process in which the researcher writes down ideas about the evolving theory throughout the process of open, axial, and selective coding.
The substantive-level theory may be tested later for its empirical verification with quantitative data to determine if it can be generalized to a sample and population (see mixed methods design procedures, Creswell & Plano Clark, 2007). Alternatively, the study may end at this point with the generation of a theory as the goal of the research.