Chapter 2

** The Grounded Theory Method Process ** Classic grounded theory method is a "//general method of comparative analysis//", which has four central criteria: work (generality), relevance (understanding), fit (validity), and modifiability (control) (Glaser, 1978, pp. 4-5). //Fit// has to do with how closely concepts match the incidents they represent, which depends on how thoroughly the constant comparison of incidents to concepts is done. A //relevant// study deals with the real concerns and problems from the participants' perspective and captures the attention not only of academics but also of persons in the know, the knowledgeable layperson. The theory //works// when it explains how problems are being solved. A //modifiable// theory can be altered when new relevant data are compared to existing data. Grounded theory is unique among research methods in that it generates not findings but ideas (Glaser, 1978). Grounded theorists inductively create unique theories through constant comparative analysis. The word 'create' conjures up images of creativity and originality. According to Glaser, "It does not take a genius to generate a useful grounded theory. It does take some codification of the method of doing it, as well as recognition of its legitimacy for student training and academic careers" (Glaser, 1978, p. 11). What results from a grounded theory method analysis is an organized series of integrated theoretical concepts.

** Initial Research Setting ** The substantive area of interest of this study was technology-mediated distance education; the analysis began with the application of distance education to higher education in the province of Manitoba, Canada. Initial interview candidates were selected from students who studied through Campus Manitoba, a consortium of all public institutions providing higher education by distance in Manitoba. [1]

** Sources of Data ** One of the basic precepts of grounded theory is expressed by the idea that "All is data" (Glaser, 1978, p. 8; Glaser, 2007). Much of the data for this project was collected from participant interviews; however, consistent with the practice of the grounded theory method, data from many other sources were examined: web resources, policy documents, minutes of meetings, workshop and conference proceedings, collegial comments, anecdotal comparisons, and media messages in popular culture.

//Intensive Interviews// Data collection for this research began with interview data from individuals who had experience with distance education as students, faculty, or other participants. Individuals from other groups were interviewed as data collection progressed. Intensive interviews were conducted using a technique recommended and modeled by Phyllis Noerager Stern (personal communication, 2007).

//The Interview Question// Gathering data from initial interview subjects began with a broad, open-ended question, or grand tour question. This broad question invited respondents to provide data from their own experience rather than responding to a set of questions composed to evoke a certain answer set and thus confirm, rather than discover, a theory. Additionally, interviews were conducted with individuals who had contemplated using distance education but decided against it, opting to make arrangements for face-to-face participation. Further interview candidates were selected through "...convenience sampling to locate persons who have already gone through, or have observed the process" (Morse, 2007). Sampling subsequently became more purposeful in order to interview specific individuals of interest. Finally, theoretical sampling was used to test emerging categories against the experiences of specific individuals or data sets and to direct the selection of further sources of data (Glaser, 1978). Glaser and Strauss (1967) describe theoretical sampling as "...the process of data collection for generating theory whereby the analyst jointly collects, codes, and analyzes his data and decides what data to collect next and where to find them, in order to develop his theory as it emerges" (p. 45).

//Observations// Sensitized observations of relevant social facts in everyday settings are an important source of data for grounding theory. A grounded theory analyst makes observations with the emerging theory in mind and is continually alert for incidents that inform the theory or provide illustrations of concepts. Unstructured observations are used in grounded theory because structured observations are replete with preconceived relevancies (O. Simmons, personal communication, April 13, 2009). My observations for this research were recorded in memos and included in the comparative analysis.

//The Analyst's Perspective// The analyst's experience and perspective were included as sources of data in a number of ways. As the analyst, I conducted a self-interview, responding to the grand tour question as would any other interviewee. The resulting data were subjected to the same coding and inspection for indicators, categories, and dimensions. Analytic introspection provided another set of data as I became sensitized to the data, codes, memos, and processes.

//Collegial Comments// Once a core variable emerged, I compared notes with others engaged in grounded theory method inquiries. Such discussions sparked further insights into the core variable, were recorded as memos, and were subjected to comparative analysis.

//Literature from the Substantive Area// Literature from the substantive area of interest was included in the analysis, but only after the theory had been established and grounded. At that point, I considered literature from the field of distance education and integrated it into the emerged theory. I was conscious of following Glaser's (1978) caution to avoid confusing my theory with preconceptions and other theoretical frameworks (p. 35).

//Institutional Documents, Conference Proceedings and Web Resources// I examined data from annual reports, promotional literature and policy documents of various education institutions and agencies involved in distance education. I also considered and compared data from conference proceedings relating to computer-mediated distance education. Because the substantive area under examination involved the Internet, I naturally looked to institutional websites and online resources for relevant material. As the analysis progressed and once a tentative core variable emerged, I established a systematic automated online search, using the Google Alerts service to notify me of any online references to variations of the core variable. These notifications provided theoretical samples directing my search for insights into concepts and suggestions for further data collection and comparisons.

//Extant Theory// The creative process involved in grounded theory method requires that the initial analysis be performed without reference to existing theory. Extant theory can still be an important source of data for an emerging theory, but only if it earns its way into the analysis. The result is usually extending and transcending the extant theory rather than verifying a deduced hypothesis or replicating an earlier one. Thus scholarship in the same area starts after the emerging theory is sufficiently developed so that the theory will not be preconceived by preempting concepts (Glaser, 1978, p. 31).

Once the theory in this analysis emerged, I considered extant theory from a number of fields using the constant comparative process.

//Slices of Data// An important source of data for this study was gathered from shared stories, experiences, anecdotal comparisons, general knowledge, and reading. "...For generating theory this variety is highly beneficial, because it yields more information on categories than any one way of knowing (technique of collection)" (Glaser & Strauss, 1967, p. 66).

Data of this sort was used in the constant comparative process in the later stages of analysis to refine and delimit core categories.

** Elements of the Grounded Theory Process ** According to Glaser, grounded theory analysis is an iterative process, a series of "double back steps" (Glaser, 1978, p. 16). The process begins with the collection of research data, open coding of indicators in the data, and the sorting of codes into categories. Variables emerge that appear to explain the way people deal with problems and concerns. The concept that appears to explain the most variation is promoted to the core variable and forms the basis for the emerging theory. Selective and theoretical sampling are used to further develop the core variable and its dimensions. Copious memos are generated throughout the entire analysis and are sorted into a theoretical outline, which allows for the first written draft of the theory. The draft is reworked continuously to improve the integration and conceptual density of the theory. The final stage of the analysis is completed as the manuscript is prepared for publication.



Figure 1. Grounded theory method workflow

Figure 1 illustrates the process I used in the analysis presented in this paper. The process was not as linear as the diagram suggests, although the general flow of work is representative. Many of the processes involved iterations of tasks at various stages and a number of tasks were ongoing throughout the analysis. Interviews were analyzed for incidents which were coded (labeled) and sorted. The sorted and labeled incidents were tentatively assigned to categories, and constantly compared to new incidents and codes as they emerged. Fairly early in the analysis, one concept appeared to be explaining much of the variation that was being observed in the data. A tentative core variable emerged around the in vivo code “keeping your distance”.

I recorded early interviews with a digital recorder and took field notes at the same time. Reconciliation of recordings and notes indicated that recording was not adding value to the process, and subsequent data collection relied on field notes alone. I used standard notebooks for field notes, with raw data recorded on the right-hand page and codes and memos developed on the left-hand page. I used a word processor to compose memos and organized them with the help of concept mapping software. Initially descriptive, memos became more conceptual as categories and properties converged around the core variable. I organized the conceptual memos into a theoretical outline that could be adjusted to accommodate the emerging theory. Finally, I used the theoretical outline to develop a manuscript. What follows is a more detailed description of the various stages of the process.

Data collection and analysis occurred simultaneously. Analysis of data from interviews began with the first interview and continued as more interviews were conducted. The analysis directed the choice of the next interview direction and subject. While the data analysis relied heavily on induction, deductive processes were used to guide the selection of data sources for further comparisons.
//Data Collection// When collecting data through interviews, I used an intensive interview process that evolved from ethnography and was adapted by grounded theorists (P. Noerager Stern, personal communication, August 11, 2006). The interview began with a grand tour question, an invitation to the interviewee to disclose their experience, in this case with distance education. My initial question was, "Will you please tell me about your experience with distance education?" This open-ended request invited participants to discuss what most interested them. I used probing questions and active listening techniques to draw out more information and encourage interviewees to reflect at a deeper, more introspective level.

Data for analysis was also collected from the proceedings of various meetings, seminars, workshops and conferences related to education and distance education; much of this data was from my own collection of documents acquired working in the field. Ongoing constant comparative analysis was used to discern the categories and dimensions of the emerging concepts from all sources of data. [2]

//Interview Data Types// Data can take many forms, as "all is data" to a grounded theorist (Glaser, 1978, p. 8). Data collected from interviews can be characterized as one of four types: //baseline, properline, interpreted,// and //vague// (Glaser, 1998). //Baseline// data are the best possible data, gathered from a trusted source and accepted at face value. //Properline// data come from a source that lines or laces the data with what the respondent thinks is proper, rather than what truly is; I had to determine whether a person was telling me what they thought I wanted to hear. A third type is //interpreted// data, which are, as the name suggests, an interpretation by the participant, who provides data through a lens that only he or she can imagine. The final type of data is //vague//, the result of the participant's "vaguing out" (Glaser, 1998, p. 9). In some cases it appeared the participant 'pulled back' in their narrative when approaching something personal or confidential, switching from speaking about specific incidents to speaking in generalities.

All of the four types mentioned above are data to be analyzed, coded, and integrated into a conceptual framework. The fact that a participant has proper-lined or vagued-out can be data that gives a researcher insight into a participant's perspective.

At times it was difficult to determine exactly which type of data was being presented. A useful heuristic device was suggested by the work of Clandinin and Connelly (1998) in their analysis of practical and formal epistemologies. In their description of teacher narratives they distinguished between the "sacred story", the "secret story", and the "cover story" (p. 25). The sacred story is like the properline data that Glaser described, the authorized version of reality. The secret story is baseline data, the lived experience of the individual that may diverge significantly from the sacred story. The cover story is the version that reconciles the sacred and secret stories and protects the individual from censure. These various narratives were apparent in the data collected for this analysis.

With respect to the reliability of data, Glaser emphasized the difference between the grounded theory method and formal Qualitative Data Analysis (QDA) (Glaser, 2007). One of the principal differences was QDA's concern for what Glaser called "worrisome accuracy" or "distortion tyranny" and the difficulty inherent in removing or accounting for bias in collected data.

"(Grounded theory) gives this freedom from distortion tyranny to data that is sourced from interpretations, constructions, voice of participant, personal experience of various kinds, culturally differential, language differential, objective-subjective, value laden, behavior vs. spoken, truth vs. properline, credibility of informants, selective data collection, selective non-random sampling, multiple versions of the truth, historical, biographical, gender bias, varying interview and observational techniques. The reader may think of more sources. To repeat this in context, all that GT does is to rigorously generate conceptual hypotheses that get applied with fit, relevance and workability (explanatory)" (Glaser, 2007, p. 5).

Glaser’s statement regarding worrisome accuracy proved very helpful in this analysis as I endeavored to glean meaning from seemingly ambiguous and multifaceted data.

//Generating Codes from Data// "The essential relationship between data and theory is a conceptual code" (Glaser, 1978, p. 55). Coding fractures the data so that patterns begin to emerge. Initially I analyzed and coded interview data line by line, using the orienting questions suggested by Glaser (1978): "What is the data telling me?" and "What is actually happening in the data?" (p. 57). As substantive and open codes developed, I compared each to the codes from previous comparisons. Through the comparison of codes, latent patterns began to emerge and suggest categories of variables. With continuing comparisons, theoretical properties began to surface and a tentative core variable became discernible. As the theory emerged from the codes, ideas and concepts began to recur. At this point I began to consider certain categories saturated, and the process of selective coding began to further relate data to the other dimensions of the core variable (Glaser, 1978).

//Open Coding// Open coding was the first type of coding I used in this analysis, to fracture the initial data and push the process beyond preconceptions. Codes were generated by constructing labels from the processes and behaviors under examination, guided by the analytic orienting questions. I sorted codes into tentative and flexible categories. I initially used codes from general vocabulary, supplemented by lexical searches for synonyms, root words, and word pairs. To mitigate forcing data into preconceived categories, I developed my own codebook and avoided the prefabricated codebooks common in other forms of qualitative research.

//Selective Coding// When one category of codes in particular seemed to explain most of the variation I was seeing in the data, coding centered on that variable. This variable was the potential core variable and appeared to represent the highest available level of abstraction. Selective coding involved looking for specific open codes that continually reappeared in the data and related to the potential core. Other coded variables were considered as subcategories and/or dimensions. I selected codes that promised to move the analysis to more conceptual levels.

//Core Variable// The core variable is the "main theme of what is happening in the data" (Glaser, 1978, p. 94). There are several criteria to determine the plausibility of the core variable:
 * It must be central and account for the widest variation in a pattern of behavior.
 * It must reoccur frequently, appearing throughout the data as a pattern that is predominant in the coding.
 * It must have "clear and grabbing implications" for the theory; when the core variable is mentioned to the knowledgeable layperson, they can immediately see the implication (Glaser, 1978, p. 95).

When the core variable was determined, the focus of sampling and coding shifted. Subsequent coding concentrated on the core variable and its subcategories, relating theoretical and substantive codes to each other and eventually to the core variable.

//Theoretical Codes// Glaser (1978; 1998) compiled several families of codes generated by theorists, which can be used when beginning to generate theoretical codes. In this analysis I began by using one set of codes in particular, //the six C's// (causes, contexts, contingencies, consequences, co-variances and conditions), as a beginning framework (Glaser, 1978). The emerging concepts did not fit into all of these predetermined categories. In particular, it soon became obvious that the emerging theory worked across contexts, eliminating that code from further analysis.

//Integrating the Codes// The codes began to integrate once selective and theoretical coding commenced. As I made connections among the codes and their properties, the comparison between incidents shifted to comparisons of properties, categories, and dimensions. As new incidents arose from additional data collection, they were compared to the aggregated and more conceptual codes. As these connections began to emerge, they were written up in theoretical memos.

//Theoretical Sampling// "Theoretical sampling is the process of data collection for generating theory whereby the analyst jointly collects, codes, and analyzes his data and decides what data to collect next and where to find them in order to develop his theory as it emerges" (Glaser, 1978, p. 36).

I used theoretical sampling to direct the selection of the next interview subject or next document for data analysis, making connections between the core variable and its properties. The emerging patterns were of interest rather than the descriptive details.

Initial participants for this analysis were chosen from related groups: support staff, administration, and faculty involved in the distance education enterprise. Comparisons were sought on variables that were relevant to all of these groups. Glaser (1978) refers to the interchangeability of indicators that allows for comparisons between seemingly non-comparable groups and points out that differences and similarities hold theoretical value (p. 43). As Glaser (1978) puts it, "apples can be compared to oranges if the comparison is the kinds of vitamins beneath the skin" (p. 42). In the early theoretical sampling, it seemed appropriate to look beyond the initial groups from the distance education arena to test the generalizability of the core variable. This proved fruitful, not so much for a theory about distance education as for the more general theory that emerged from the data. Some common issues that emerged were the use of technology, the cynicism that arose from overblown institutional hyperbole and marketing, and the avoidance of superfluous interactions with non-compatible (non-coherent) group members.

I was interested to discover more about aspects of incidents from interview data that indicated dissonance as a result of mixed messages related to technology use in education. I used theoretical codes adapted from the work of Argyris (1976) in the field of Action Science. [3] The action science theoretical perspective directed me to new sources of data from various institutional websites, brochures and promotional materials.

//Memoing// "The bedrock of theory generation . . . is the writing of theoretical memos" (Glaser, 1978, p. 83). I began writing memos as soon as I commenced coding of interview data. Memos provided an outlet for ideas, thoughts, and questions about what I was analyzing, making theoretical connections among the codes, the categories, and the core variable as it emerged. I used memos to begin connecting categories and to examine their relationships.

//Refining Memos from Concrete to Conceptual// Memos were concrete and descriptive in the beginning but were revisited and refined as the analysis proceeded. They became increasingly conceptual with each iteration.

//Sorting the Memos// While I tentatively sorted memos as I wrote them, a more purposive sorting occurred when I began to plan the final written form of the theory. Sorting reduced the variety of concepts, discovering underlying similarities, making connections, and formulating or reformulating the developing theory at a more abstract conceptual level. Delimiting the theory to more abstract concepts enhanced its generalizability for application in social settings beyond distance education. I sorted each written memo into a theoretical outline according to its relation to other memos, using analytical rules.

//Analytical Rules// Memos are sorted according to analytical rules (Glaser, 1978). The most critical rule for sorting is the relationship of the memo to the core variable; if a memo is not related to the core variable or a property of the core variable, it is left out of the analysis. (In some instances, the abandoned memos may be useful in the development of another grounded theory relating to another pattern of social behavior.) The analyst establishes rules for the determination of the core variable, the one that explains the most variation, recognizing that "the goal is not to cover all possible theoretical possibilities nor explain //all// variation" (p. 122). Other rules relate to the integrative fit of ideas and are "based on the assumption that the social organization of the world is integrated and the job of the grounded theorist is to discover it" (p. 123). The application of rules to sorting generates other analytical rules, which result in further memos. Analytical rules also apply to theoretical completeness. The theorist "explains with the fewest possible concepts, and the greatest possible scope, as much variation as possible in the behavior and problem under study" (p. 125). A distinction is made between theoretical completeness and scholarly completeness, recognizing that it is impossible to consider all scholarly references to the area under consideration. The analyst's "job is to contribute to this literature, not completely to master it. His contribution is integrative and recognitive, not reverent" (p. 126). Analytical rules constantly emerge as they become relevant, while others are given up as they become unnecessary or useless. All relevant memos find a place in the outline, though their importance to the theory may vary. "The smaller the amount of concepts that account for the greatest variation in substantive behavior resolving the main concern is the goal" (Glaser, 1998).
The memos become the outline, and the writer must then merely connect and integrate the ideas into a formal theory.

//Reducing the Theory// Reducing the theory involved discovering underlying similarities, making connections, and formulating or reformulating the theory with a smaller set of higher-level concepts. Reduction helped to delimit the theory to concepts that have generalizability and application across contexts and social arenas. This analysis started with multiple substantive categories that, with reduction, collapsed into fewer, more conceptual dimensions. Other categories that did not fit with the identified core variable were set aside for another treatment.

//Writing the Theory// Writing the theory consisted of combining the sorted memos and the theoretical outline into a cohesive, accessible theory. The outline and the memos formed the backbone for the writing. I integrated examples from the data and the literature for the purpose of supporting, illustrating, and/or expanding the theory. I felt that it was important to provide adequate illustrations from my data without being overly concerned about providing proof or verification. I also felt a challenge in this regard: to respect the confidentiality of my respondents and to meet my assurances to them that nothing they said could personally identify them.

Glaser (1978) describes the objective of the grounded theory analyst when writing up a manuscript: "The credibility of the theory should be won by its integration, relevance and workability, not by illustration used as if it were proof. The assumption of the reader, he should be advised, is that all concepts are grounded and that this massive grounding effort could not be shown in a writing. Also that as grounded they are not proven; they are only suggested. The theory is an integrated set of hypotheses, not of findings. Proofs are not the point. Illustrations are only to establish imagery and understanding as vividly as possible when needed. It is not incumbent upon the analyst to provide the reader with description or information as to how each hypothesis was reached. Stating the method in the beginning or appendix is sufficient, perhaps with an example of how one went about grounding a code and an hypothesis" (p. 134).

//Accessible Language// To ensure that the theory generated by this analysis had the necessary grab and relevance, a conscious effort was made to use accessible language. The intention was that this theory be of use to knowledgeable laypeople; apprehending it does not presuppose any special technical or professional background.

//Developing Ontology// An important aspect of writing up a theory is the development of an ontology, or vocabulary, for the theory: naming its essential features, its categories, and the dependencies of its concepts. According to Lamp and Milton (2007), phenomenological ontology must be based on an examination of the essential features of any experience that could provide knowledge of such objects. The use of the gerund form when describing processes is characteristic of classical grounded theory manuscripts (Glaser & Caplan, 1996). The core variable and its dimensions are expressed in gerund form to emphasize that the theory is an expression of an active process. As I developed the vocabulary for this grounded theory, I elected to use terminology and language that would ensure the theory would be accessible and have the broadest possible utility in various academic and applied contexts.

//Summary// One of the most delightful aspects of using the grounded theory method was the ease of data collection and the rich variety of sources that were available. When I began my data collection using formal interviews, I had some trepidation that people would not comfortably share information with me. I am a large male with a dark complexion, and people occasionally report that they find me intense and intimidating. However, my concerns were quickly allayed when I discovered that people were more than willing to share their experiences with me. Overall, my interactions with interviewees were very satisfying. Glaser (1978) relates his experience gathering interview data: "mention a hospital and everyone has an experience to tell and often cannot be stopped until they do" (p. 51). I found this to be very much the case as I began my quest for data, particularly once the core variable had emerged and been tentatively confirmed. Whenever I described the emerging theory, people instantly began to relate and provided me with more data than I could easily assimilate, and certainly more than I could provide as evidence. I was especially gratified when I mentioned the emerging core variable to an experienced grounded theorist, who immediately began to relate on a personal level. Data from many such discussions were incorporated into the analysis.

Once again, I am not seeking to provide evidence, a process more suited to other forms of formal qualitative data analysis; rather, I am seeking to illuminate a hypothesis that has emerged from the constant comparative process. Examples are offered as illustrations of concepts.

A number of grounded theorists have remarked on an interesting phenomenon that occurs when gathering data for a grounded analysis: when a true core variable emerges, it is almost impossible to avoid seeing indicators everywhere. [4] Glaser and Strauss (1967) discuss an important source of data arising out of seemingly serendipitous encounters, data that might be considered trivial in other research methods. They describe "slice of life" data and data from anecdotal comparisons:

This kind of data can be trusted if the experience was “lived.” Anecdotal comparisons are especially useful in starting research and developing core categories. The researcher can ask himself where else has he learned about the category and make quick comparisons to start to develop it (core category) and sensitize himself to its relevancies. (p. 67)

An example of this from my experience is a casual coffee conversation I had with a group of friends. We had been enjoying a free-ranging discussion, and I mentioned what I was doing and described the core variable of keeping your distance that had recently begun to emerge from my coding of initial interviews. The conversation ranged on as such conversations will, and a number of single women in the group mentioned that they were actively involved in online dating. Being happily married and thankfully out of the dating game, I had little direct experience with online dating and asked about the appeal of such methods. One woman quickly offered that I should recognize the appeal, because we had just been talking about //keeping your distance//, and online dating is a useful way for people to view prospective partners while maintaining a comfortable distance. This incident was one of many slices of life that appeared as the theory emerged. An important criterion of a grounded theory is that it is modifiable, able to accommodate new data as they arise (Glaser, 1978, p. 5). I fully expect that more data will be revealed. What follows is my effort to articulate the dimensions and variations of the theory of //keeping your distance//.

[1] Interviewing was conducted following the requirements of the Tri-Council Policy for Research Involving Human Subjects (Canada, 2005). The major tenets of this policy ensure protection for vulnerable people, respect for human dignity, informed consent for all material used, respect for privacy and confidentiality, respect for justice and inclusiveness, and the minimization of harm and maximization of benefit. In particular, Section 3.2, the section that governs the collection of data in the form of private information, was observed. Certificates granted by the Institutional Review Boards of Brandon University, the administrative agent for Campus Manitoba, and of Fielding Graduate University are available in the appendix. Sampling began with a subject group of people who use Campus Manitoba to access and provide higher education in rural Manitoba. Selective sampling led to other populations.

[2] One valuable source of data is the analyst's experience. However, this experience should not preconceive the outcome of the analysis. It should be given the same consideration as any other source of data. This experience includes direct experience in the substantive area of distance education and the analyst's experience performing the analysis. An example of a preconception with respect to distance education was the notion that participating in distance education leaves the participants feeling distant and alienated, a fairly commonly reported criticism of this mode of delivery. When the reported experience of the interviewees was analyzed, the sense of distance was reported as real but not necessarily alienating. Participants saw distance not as a problem, but as a solution to a problem, that of maintaining personal autonomy. As this pattern of behavior began to emerge, I analyzed my experience with various educational formats and determined that distance was a feature that I valued as well, for a variety of reasons. I treated my experience as another form of data. Upon the recommendation of Dr. Paul Wishart, I interviewed myself with the same question that I was posing to the other interviewees and subjected the interview to the same analysis. As Glaser points out, each preconceived notion must earn its way into the analysis.

[3] The action science view is that all behavior is motivated by theories of the world in two broad categories: espoused theories, or what people say they believe, and theories-in-use, or the theories revealed by observation of their actions. Often, dissonance between these creates conflict or alienation. Theories-in-use are implicit, and the grounded theory method is suited to the examination of such obscured themes.

[4] I have been privileged to participate in a number of discussions with established grounded theorists and in workshops dedicated to the method. One conference in particular, held in Banff, Alberta in 2007, yielded important insights into the practices of grounded theorists of many stripes. The proceedings of that conference were published in Morse et al. (2009).