Mixed Methods Research Designs in Counseling Psychology

William E. Hanson, Department of Educational Psychology, University of Nebraska–Lincoln
John W. Creswell, Department of Educational Psychology and Office of Qualitative and Mixed Methods Research, University of Nebraska–Lincoln, and Department of Family Medicine, University of Michigan
Vicki L. Plano Clark, Department of Educational Psychology, Office of Qualitative and Mixed Methods Research, and Department of Physics and Astronomy, University of Nebraska–Lincoln
Kelly S. Petska, Department of Educational Psychology, University of Nebraska–Lincoln
J. David Creswell, Department of Psychology, University of California, Los Angeles

Corresponding authors — William E. Hanson, Counseling Psychology Program, 228 TEAC, University of Nebraska–Lincoln, Lincoln, NE 68588-0345, email [email protected], and John W. Creswell, Department of Educational Psychology, 241 TEAC, University of Nebraska–Lincoln, Lincoln, NE 68588-0345, email [email protected]

Abstract

With the increased popularity of qualitative research, researchers in counseling psychology are expanding their methodologies to include mixed methods designs. These designs involve the collection, analysis, and integration of quantitative and qualitative data in a single or multiphase study. This article presents an overview of mixed methods research designs. It defines mixed methods research, discusses its origins and philosophical basis, advances steps and procedures used in these designs, and identifies 6 different types of designs. Important design features are illustrated using studies published in the counseling literature. Finally, the article ends with recommendations for designing, implementing, and reporting mixed methods studies in the literature and for discussing their viability and continued usefulness in the field of counseling psychology.

Published in Journal of Counseling Psychology 52:2 (2005), pp. 224–235; doi 10.1037/0022-0167.52.2.224
Copyright © 2005 American Psychological Association. Used by permission. “This article may not exactly replicate the final version published in the APA journal. It is not the copy of record.”
An earlier version of this article was presented at the 111th Annual Convention of the American Psychological Association, Toronto, Ontario, Canada, August 2003. The authors thank Patricia Cerda and Carey Pawlowski, who assisted in identifying and locating published mixed methods studies, and Beth Haverkamp for her helpful conceptual feedback on this article.
Submitted October 27, 2004; revised December 6, 2004; accepted December 10, 2004.
Over the past 25 years, numerous calls for increased meth-
odological diversity and alternative research methods have
been made (Gelso, 1979; Goldman, 1976; Howard, 1983).
These calls have led to important discussions about incorpo-
rating qualitative methods in counseling research and includ-
ing qualitative studies in traditional publication outlets (Hosh-
mand, 1989; Maione & Chenail, 1999; Morrow & Smith, 2000).
They have also led to discussions about integrating quantita-
tive and qualitative methods, commonly referred to as mixed
methods research.
In the social sciences at large, mixed methods research has
become increasingly popular and may be considered a le-
gitimate, stand-alone research design (Creswell, 2002, 2003;
Greene, Caracelli, & Graham, 1989; Tashakkori & Teddlie,
1998, 2003). It may be defined as “the collection or analysis
of both quantitative and qualitative data in a single study in
which the data are collected concurrently or sequentially, are
given a priority, and involve the integration of the data at one
or more stages in the process of research” (Creswell, Plano
Clark, Gutmann, & Hanson, 2003, p. 212). When both quanti-
tative and qualitative data are included in a study, researchers
may enrich their results in ways that one form of data does not
allow (Brewer & Hunter, 1989; Tashakkori & Teddlie, 1998).
Using both forms of data, for example, allows researchers to
simultaneously generalize results from a sample to a popula-
tion and to gain a deeper understanding of the phenomenon
of interest. It also allows researchers to test theoretical mod-
els and to modify them based on participant feedback. Results
of precise, instrument-based measurements may, likewise, be
augmented by contextual, field-based information (Greene &
Caracelli, 1997).
Despite the availability of mixed-methods-related books,
chapters, and journal articles, virtually nothing has been writ-
ten about mixed methods research designs in applied psy-
chology, generally, or in counseling psychology, specifically.
Cursory examination of the three editions of the Handbook of
Counseling Psychology (e.g., Brown & Lent, 2000), of popular
research design texts (e.g., Heppner, Kivlighan, & Wampold,
1999), and of mainstream, peer-reviewed journals (e.g., Jour-
nal of Counseling & Development, The Counseling Psychologist) re-
inforces this assertion. The general absence of discussions on
mixed methods research designs may be due to a number of
factors, including the historical precedent of favoring quanti-
tative and experimental methods in psychology (Gergen, 2001;
Waszak & Sines, 2003), the difficulty in learning and applying
both types of methods (Behrens & Smith, 1996; Ponterotto &
Grieger, 1999), and the general lack of attention given to di-
verse methodological approaches in graduate education and
training (Aiken, West, Sechrest, & Reno, 1990). However, with
so few resources available, answers to the following types of
questions remain elusive and somewhat difficult to find: What
is mixed methods research? What types of mixed methods
studies have been published in counseling? How should mixed
methods studies be conducted and reported in the literature?
The purpose of this article is to help answer these questions
by introducing mixed methods research designs to counseling
psychologists.
Our goal is to help counseling researchers and
educators become more familiar with mixed methods termi-
nology, procedures, designs, and key design features. Articles
by Goodyear, Tracey, Claiborn, Lichtenberg, and Wampold
(2005) and Beck (2005) introduce two specific methodological
approaches—ideographic concept mapping and ethnographic
decision tree modeling, respectively—and serve to further fa-
miliarize researchers and educators with mixed methods re-
search designs.
The present article is divided into three sections. In the first
section, we present an overview of mixed methods research,
including its origins and philosophical basis, rationales, ba-
sic steps in designing a mixed methods study, and procedural
notations. We also present a typology for classifying different
types of mixed methods research designs. In the second sec-
tion, we use mixed methods studies published in counseling
to illustrate each of the designs and key design features dis-
cussed. In the third and final section, we offer recommenda-
tions for conducting and publishing mixed methods research.
Overview of Mixed Methods Research
The historical evolution of mixed methods research has
not been traced completely by any one author or source, al-
though Datta (1994) and Tashakkori and Teddlie (1998, 2003)
have identified many of the major developmental milestones.
The brief overview presented here attempts to incorporate and
build on their analyses.
Origins and Philosophical Basis
The use of multiple data collection methods dates back to
the earliest social science research. It was, however, Camp-
bell and Fiske’s (1959) study of the validation of psycho-
logical traits that brought multiple data collection methods
into the spotlight. In their classic study, the multitrait-multi-
method matrix was designed to rule out method effects; that
is, to allow one to attribute individual variation in scale scores
to the personality trait itself rather than to the method used
to measure it. Although Campbell and Fiske focused on col-
lecting multiple quantitative data, their work was instrumen-
tal in encouraging the use of multiple methods and the collec-
tion of multiple forms of data in a single study (Sieber, 1973).
Taken one step further, the term triangulation, borrowed from
military naval science to signify the use of multiple reference
points to locate an object’s exact position, was later used to
suggest that quantitative and qualitative data could be com-
plementary. Each could, for example, “uncover some unique
variance which otherwise may have been neglected by a single
method” (Jick, 1979, p. 603).
Over time, mixed methods research has gradually gained
momentum as a viable alternative research method. Over the
past 15 years, at least 10 mixed methods textbooks have been
published (Bamberger, 2000; Brewer & Hunter, 1989; Bryman,
1988; Cook & Reichardt, 1979; Creswell, 2002, 2003; Greene
& Caracelli, 1997; Newman & Benz, 1998; Reichardt & Ral-
lis, 1994; Tashakkori & Teddlie, 1998). Recently, the Hand-
book of Mixed Methods in Social and Behavioral Research was pub-
lished (Tashakkori & Teddlie, 2003). In addition, journals such
as Field Methods and Quality and Quantity are devoted to pub-
lishing mixed methods research. International online journals
(see Forum: Qualitative Social Research at http://qualitative-research.net)
and Web sites (e.g., http://www.fiu.edu/~bridges/
people.htm) provide easy access, resources, and hands-on ex-
periences for interested researchers. Despite this growth and
development, a number of controversial issues and debates
have limited the widespread acceptance of mixed methods
research.
Two important and persistent issues, the paradigm-method
fit issue and the “best” paradigm issue, have inspired consid-
erable debate regarding the philosophical basis of mixed meth-
ods research. The paradigm-method fit issue relates to the
question “Do philosophical paradigms (e.g., postpositivism,
constructivism) and research methods have to fit together?”
This issue first surfaced in the 1960s and 70s, primarily as a
result of the increasing popularity of qualitative research and
the identification of philosophical distinctions between tradi-
tional postpositivist and naturalistic research. Guba and Lin-
coln (1988), for example, identified paradigm differences
between postpositivist philosophical assumptions and natu-
ralistic assumptions in terms of epistemology (how we know
what we know), ontology (the nature of reality), axiology (the
place of values in research), and methodology (the process of
research). This led to a dichotomy between traditional inquiry
paradigms and naturalistic paradigms.
Some researchers have argued, for example, that a postpos-
itivist philosophical paradigm, or worldview, could be com-
bined only with quantitative methods and that a naturalistic
worldview could be combined only with qualitative meth-
ods. This issue has been referred to as the “paradigm debate”
(Reichardt & Rallis, 1994). From this perspective, mixed meth-
ods research was viewed as untenable (i.e., incommensura-
ble or incompatible) because certain paradigms and methods
could not “fit” together legitimately (Smith, 1983). Reichardt
and Cook (1979) countered this viewpoint, however, by sug-
gesting that different philosophical paradigms and meth-
ods were compatible. In their article, they argued that para-
digms and methods are not inherently linked, citing a variety
of examples to support their position (e.g., quantitative pro-
cedures are not always objective, and qualitative procedures
are not always subjective). Indeed, the perspective exists to-
day that multiple methods may be used in a single research
study to, for example, take advantage of the representative-
ness and generalizability of quantitative findings and the in-
depth, contextual nature of qualitative findings (Greene &
Caracelli, 2003).
The best paradigm issue relates to the question “What phil-
osophical paradigm is the best foundation for mixed methods
research?” This issue, like the paradigm-method fit issue, has
multiple perspectives (Tashakkori & Teddlie, 2003). One per-
spective is that mixed methods research uses competing par-
adigms intentionally, giving each one relatively equal foot-
ing and merit. This “dialectical” perspective recognizes that
using competing paradigms gives rise to contradictory ideas
and contested arguments, features of research that are to be
honored and that may not be reconciled (Greene & Caracelli,
1997, 2003). Such oppositions reflect different ways of making
knowledge claims, and we advocate for honoring and respect-
ing the different paradigmatic perspectives that researchers
bring to bear on a study. In an earlier publication, we identi-
fied six different mixed methods research designs and dis-
cussed how the underlying theoretical lenses, or paradigms,
may differ, depending on the type of design being used (Cre-
swell et al., 2003). This perspective maintains that mixed meth-
ods research may be viewed strictly as a “method,” thus
allowing researchers to use any number of philosophical foun-
dations for its justication and use. The best paradigm is de-
termined by the researcher and the research problem—not by
the method.
Another perspective is that pragmatism is the best para-
digm for mixed methods research (Tashakkori & Teddlie,
2003). Pragmatism is a set of ideas articulated by many peo-
ple, from historical figures such as Dewey, James, and Peirce
to contemporaries such as Murphy, Rorty, and West. It draws
on many ideas including using “what works,” using diverse
approaches, and valuing both objective and subjective knowl-
edge (Cherryholmes, 1992). Rossman and Wilson (1985) were
among the first to associate pragmatism with mixed meth-
ods research. They differentiated between methodologi-
cal purists, situationalists, and pragmatists. The purists be-
lieved that quantitative and qualitative methods derived
from different, mutually exclusive, epistemological and on-
tological assumptions about research. The situationalists be-
lieved that both methods have value (similar to the dialectical
perspective mentioned earlier) but that certain methods are
more appropriate under certain circumstances. The pragma-
tists, in contrast, believed that, regardless of circumstances,
both methods may be used in a single study. For many mixed
methods researchers, then, pragmatism has become the an-
swer to the question of what is the best paradigm for mixed
methods research. Recently, Tashakkori and Teddlie (2003)
have attempted to formally link pragmatism and mixed meth-
ods research, arguing that, among other things, the research
question should be of primary importance—more important
than either the method or the theoretical lens, or paradigm,
that underlies the method. At least 13 other prominent mixed
methods researchers and scholars also believe that pragma-
tism is the best philosophical basis of mixed methods research
(Tashakkori & Teddlie, 2003).
Rationales, Basic Steps in Designing a Mixed Methods Study, and
Procedural Notations
Rationales. In the mid-1980s, scholars began expressing con-
cern that researchers were indiscriminately mixing quantita-
tive and qualitative methods and forms of data without ac-
knowledging or articulating defensible reasons for doing so
(Greene et al., 1989; Rossman & Wilson, 1985). As a result, dif-
ferent reasons, or rationales, for mixing both forms of data in
a single study were identified. Greene et al. (1989), for exam-
ple, identified a number of rationales for combining data col-
lection methods. These rationales went above and beyond
the traditional notion of triangulation. Specifically, quanti-
tative and qualitative methods could be combined to use re-
sults from one method to elaborate on results from the other
method (complementarity), use results from one method to
help develop or inform the other method (development; see
Goodyear et al., 2005, and Beck, 2005), recast results from one
method to questions or results from the other method (initia-
tion), and extend the breadth or range of inquiry by using dif-
ferent methods for different inquiry components (expansion).
Thus, they provided not only rationales for mixing methods
and forms of data but also names for them.
Recently, mixed methods researchers have expanded
the reasons for conducting a mixed methods investiga-
tion (Mertens, 2003; Newman, Ridenour, Newman, & De-
Marco, 2003; Punch, 1998). We agree with Mertens (2003)
and Punch (1998), who suggested that mixed methods in-
vestigations may be used to (a) better understand a research
problem by converging numeric trends from quantitative
data and specic details from qualitative data; (b) identify
variables/constructs that may be measured subsequently
through the use of existing instruments or the development
of new ones; (c) obtain statistical, quantitative data and re-
sults from a sample of a population and use them to iden-
tify individuals who may expand on the results through
qualitative data and results; and (d) convey the needs of in-
dividuals or groups of individuals who are marginalized or
underrepresented.
For a comprehensive, in-depth discussion of rationale is-
sues, the reader is referred to Newman et al. (2003).
Basic steps in designing a mixed methods study. Designing a
mixed methods study involves a number of steps, many of
which are similar to those taken in traditional research meth-
ods. These include deciding on the purpose of the study, the
research questions, and the type of data to collect. Designing
a mixed methods study, however, also involves at least three
additional steps. These include deciding whether to use an
explicit theoretical lens, identifying the data collection pro-
cedures, and identifying the data analysis and integration
procedures (Creswell, 1999; Greene & Caracelli, 1997; Mor-
gan, 1998; Tashakkori & Teddlie, 1998). These steps occur
more or less sequentially, with one informing and influenc-
ing the others.
The rst step involves deciding whether to use an explicit
theoretical lens. As used here, the term theoretical lens refers
to the philosophical basis, or paradigm, (e.g., postpositivism,
constructivism, feminism) that underlies a researcher’s study
and subsequent methodological choices (Crotty, 1998). It is an
umbrella term that may be distinguished from broader epis-
temologies (e.g., objectivism, subjectivism), from narrower
methodologies (e.g., experimental research), and from, nar-
rower still, methods (e.g., random sampling, interviews). Rec-
ognizing that all researchers bring implicit theories and as-
sumptions to their investigations, researchers at this initial
stage must decide whether they are going to view their study
from a paradigmatic base (e.g., postpositivism, constructiv-
ism) that does not necessarily involve a goal of social change
or from an advocacy-based lens such as feminism. Our use of
the term advocacy is similar to what Ponterotto (2005) refers to
as a “critical/emancipatory” paradigm. In any event, the out-
come of this decision informs and influences the methodology
and the methods used in the study, as well as the use of the
study’s findings.
If, for example, a feminist lens is used in a mixed meth-
ods study, then the gendered perspective provides a deduc-
tive lens that informs the research questions asked at the be-
ginning of the study and the advocacy outcomes advanced at
the end (cf. Mertens, 2003). Within the field of counseling psy-
chology, the research question might be “How does a coun-
selor’s level of self-disclosure affect a client’s perception of
empowerment?” Answering this question may lead to more
empowering, research-informed, counselor-client interactions
and to overt attempts to change how counselors are trained
and supervised.
The second step involves deciding how data collection will
be implemented and prioritized. Implementation refers to the
order in which the quantitative and qualitative data are col-
lected, concurrently or sequentially, and priority refers to the
weight, or relative emphasis, given to the two types of data,
equal or unequal (Creswell et al., 2003; Morgan, 1998). A coun-
seling researcher could, in the example above, collect data se-
quentially, rst collecting quantitative survey data related to
clients’ postsession levels of perceived empowerment and
then collecting qualitative interview data. The interview data
could then be used to corroborate, refute, or augment findings
from the survey data. As a result, priority in this hypotheti-
cal study would be unequal. Unequal priority occurs when a
researcher emphasizes one form of data more than the other,
starts with one form as the major component of a study, or col-
lects one form in more detail than the other (Morgan, 1998).
Figure 1 shows many of the options related to this step.

Figure 1. Options related to mixed methods data collection procedures. QUAN = quantitative data was prioritized; QUAL = qualitative data was prioritized; qual = lower priority given to the qualitative data; quan = lower priority given to the quantitative data.
The third step involves deciding the point at which data
analysis and integration will occur. In mixed methods stud-
ies, data analysis and integration may occur by analyzing
the data separately, by transforming them, or by connecting
the analyses in some way (Caracelli & Greene, 1993; Onwueg-
buzie & Teddlie, 2003; Tashakkori & Teddlie, 1998). A coun-
seling researcher could, for example, analyze the quantitative
and qualitative data separately and then compare and con-
trast the two sets of results in the discussion. As an alterna-
tive strategy, themes that emerged from the qualitative inter-
view data could be transformed into counts or ratings and
subsequently compared to the quantitative survey data. An-
other option would be to connect the data analyses. To do this,
the researcher could analyze the survey data, create a categor-
ical variable that helps explain the outcome variance, and con-
duct follow-up interviews with individuals who were repre-
sentative of each of the categories. For example, on the basis of
results from the survey data, a typology of empowering and
disempowering counselor self-disclosures, or levels of self-dis-
closure, could be developed. The researcher could then inter-
view a subsample of clients (e.g., some who felt empowered
and some who felt disempowered). In this way, results from
the quantitative analysis would be connected to the qualitative
data collection and analysis, primarily by aiding in the identi-
fication and selection of individuals to participate in the fol-
low-up interviews.
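To make these three strategies concrete, the following sketch (written in Python, with entirely hypothetical participant codes, ratings, and themes of our own invention, not data from any study) shows in miniature what separate analysis, data transformation, and connected analysis might look like for the counselor self-disclosure example.

# Illustrative sketch only: hypothetical data and names, not drawn from the
# article, showing three ways the quantitative and qualitative strands of the
# self-disclosure/empowerment example might be analyzed and integrated.
from statistics import mean

# Hypothetical post-session empowerment ratings (quantitative strand).
survey = {"P01": 6.2, "P02": 2.8, "P03": 5.1, "P04": 3.4, "P05": 6.8}

# Hypothetical interview themes per participant (qualitative strand).
themes = {
    "P01": ["felt heard", "counselor shared own experience"],
    "P02": ["session felt one-sided"],
    "P03": ["felt heard"],
    "P04": ["session felt one-sided", "unsure of counselor's intent"],
    "P05": ["felt heard", "counselor shared own experience"],
}

# 1. Separate analysis: summarize each strand on its own; integration would
#    then happen later, when the two sets of results are compared.
print("Mean empowerment rating:", round(mean(survey.values()), 2))
print("Distinct themes:", sorted({t for ts in themes.values() for t in ts}))

# 2. Data transformation: turn qualitative themes into counts so they can sit
#    alongside the survey scores.
theme_counts = {pid: len(ts) for pid, ts in themes.items()}
print("Theme counts per participant:", theme_counts)

# 3. Connected analysis: use the quantitative results to build a categorical
#    variable and to select a purposeful follow-up interview subsample.
categories = {pid: ("empowered" if score >= 5 else "disempowered")
              for pid, score in survey.items()}
follow_up = [pid for pid, cat in categories.items() if cat == "disempowered"]
print("Follow-up interview subsample:", follow_up)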
Procedural notations. Reminiscent of the notation system de-
veloped by Campbell and Stanley (1966), which used Xs and
Os to represent different experimental procedures, Morse
(1991, 2003) developed a system for representing different
mixed methods procedures. Instead of Xs and Os, however,
her system uses plus (+) symbols and arrows (→) as well as
capital and lowercase letters. A plus sign indicates that quan-
titative and qualitative data are collected concurrently (at the
same time), and an arrow indicates that they are collected se-
quentially (one followed by the other). The use of capital let-
ters indicates higher priority for a particular method. Low-
ercase letters, in turn, indicate lower priority. By displaying
mixed methods procedures graphically, readers may identify,
at a glance, the implementation and the priority of the data
collection procedures (see Figure 1). For example, QUAN →
qual indicates a quantitatively driven sequential study, where
quantitative data collection is followed by qualitative data col-
lection with unequal priority, and QUAL + QUAN indicates
a qualitatively and quantitatively driven concurrent study,
where qualitative and quantitative data collection occur at the
same time and are given equal priority.
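For readers who find it useful to treat the notation as a small formal system, the sketch below encodes the two decisions it captures, implementation and priority, in a few lines of Python. The helper function and its names are our own illustration, not part of Morse's system or of any published software.

# A minimal sketch of Morse-style notation as a string-building helper;
# the function and its names are illustrative only.
def notation(first: str, second: str, concurrent: bool) -> str:
    """Each strand is 'QUAN', 'quan', 'QUAL', or 'qual'; capitals mark priority,
    '+' marks concurrent collection, and an arrow marks sequential collection."""
    connector = " + " if concurrent else " → "
    return first + connector + second

# QUAN → qual: a quantitatively driven sequential study with a lower-priority
# qualitative follow-up.
print(notation("QUAN", "qual", concurrent=False))
# QUAL + QUAN: a concurrent study with equal priority given to both strands.
print(notation("QUAL", "QUAN", concurrent=True))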
Types of Mixed Methods Research Designs
Several authors have developed typologies of mixed meth-
ods research designs, drawing mostly from approaches used
in evaluation (Greene et al., 1989), nursing (Morse, 1991), pub-
lic health (Steckler, McLeroy, Goodman, Bird, & McCormick,
1992), and education research (Creswell, 2002). Classification
systems that use acceptable, standardized names and descrip-
tive categories are still being developed. As one example, Cre-
swell et al. (2003) developed a parsimonious system for clas-
sifying mixed methods research designs. As shown in Figure
2, there are six primary types of designs: three sequential (ex-
planatory, exploratory, and transformative) and three con-
current (triangulation, nested, and transformative). Each var-
ies with respect to its use of an explicit theoretical/advocacy
lens, approach to implementation (sequential or concurrent
data collection procedures), priority given to the quantitative
and qualitative data (equal or unequal), stage at which the
data are analyzed and integrated (separated, transformed, or
connected), and procedural notations. Because mixed methods
designs are, generally speaking, complex, it is important to un-
derstand subtle differences and nuances between and among
them. To facilitate this understanding, we next describe each
of the six designs, beginning with sequential designs.

Figure 2. Typology for classifying mixed methods research designs. QUAN = quantitative data was prioritized; QUAL = qualitative data was prioritized; qual = lower priority given to the qualitative data; quan = lower priority given to the quantitative data.
Sequential designs. There are three types of sequential de-
signs: sequential explanatory, sequential exploratory, and se-
quential transformative. Sequential explanatory designs do
not use an explicit advocacy lens. In these designs, quantita-
tive data are collected and analyzed, followed by qualitative
data. Priority is usually unequal and given to the quantitative
data. Qualitative data are used primarily to augment quantita-
tive data. Data analysis is usually connected, and integration
usually occurs at the data interpretation stage and in the dis-
cussion. These designs are particularly useful for, as the name
suggests, explaining relationships and/or study findings, es-
pecially when they are unexpected.
Sequential exploratory designs also do not use an explicit
advocacy lens. In these designs, qualitative data are collected
and analyzed rst, followed by quantitative data. Priority is
usually unequal and given to the qualitative data. Quantita-
tive data are used primarily to augment qualitative data. Data
analysis is usually connected, and integration usually occurs at
the data interpretation stage and in the discussion. These de-
signs are useful for exploring relationships when study vari-
ables are not known, refining and testing an emerging theory,
developing new psychological test/assessment instruments
based on an initial qualitative analysis, and generalizing quali-
tative findings to a specific population.
In contrast to the other two sequential designs, sequen-
tial transformative designs use an explicit advocacy lens (e.g.,
feminist perspectives, critical theory), which is usually re-
flected in the purpose statement, research questions, and im-
plications for action and change. In these designs, quantitative
data may be collected and analyzed, followed by qualitative
data, or conversely, qualitative data may be collected and ana-
lyzed, followed by quantitative data. Thus, either form of data
may be collected rst, depending on the needs and preferences
of the researchers. Priority may be unequal and given to one
form of data or the other or, in some cases, equal and given to
both forms of data. Data analysis is usually connected, and in-
tegration usually occurs at the data interpretation stage and in
the discussion. These designs are useful for giving voice to di-
verse or alternative perspectives, advocating for research par-
ticipants, and better understanding a phenomenon that may
be changing as a result of being studied.
Concurrent designs. Similar to sequential mixed methods re-
search designs, there are three types of concurrent designs:
concurrent triangulation, concurrent nested, and concurrent
transformative. In concurrent triangulation designs, quanti-
tative and qualitative data are collected and analyzed at the
same time. Priority is usually equal and given to both forms of
data. Data analysis is usually separate, and integration usually
occurs at the data interpretation stage. Interpretation typically
involves discussing the extent to which the data triangulate or
converge. These designs are useful for attempting to confirm,
cross-validate, and corroborate study findings.
In concurrent nested designs, like concurrent triangulation
designs, quantitative and qualitative data are collected and an-
alyzed at the same time. However, priority is usually unequal
and given to one of the two forms of data—either to the quan-
titative or qualitative data. The nested, or embedded, forms of
data are, in these designs, usually given less priority. One rea-
son for this is that the less prioritized form of data may be in-
cluded to help answer an altogether different question or set
of questions. Data analysis usually involves transforming the
data, and integration usually occurs during the data analysis
stage. These designs are useful for gaining a broader perspec-
tive on the topic at hand and for studying different groups, or
levels, within a single study.
In contrast to the other two concurrent designs, concur-
rent transformative designs use an explicit advocacy lens (e.g.,
feminist perspectives, critical theory), which is usually re-
flected in the purpose statement, research questions, and im-
plications for action and change. Quantitative and qualitative
data are collected and analyzed at the same time. Priority may
be unequal and given to one form of data or the other or, in
some cases, equal and given to both forms of data. Data anal-
ysis is usually separate, and integration usually occurs at the
data interpretation stage or, if transformed, during data analy-
sis. Similar to sequential transformative designs, these designs
are useful for giving voice to diverse or alternative perspec-
tives, advocating for research participants, and better under-
standing a phenomenon that may be changing as a result of
being studied.
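Because all six designs vary along the same small set of dimensions, the typology can be restated compactly. The sketch below simply recasts the descriptions above (and Figure 2) as a small Python data structure; the field names and the encoding are our own study aid, not a standard representation.

# A compact restatement of the six-design typology described in the text;
# the field names and this encoding are ours, offered only as a study aid.
designs = {
    "sequential explanatory": dict(
        lens=None, notation="QUAN → qual",
        priority="usually quantitative", integration="interpretation/discussion"),
    "sequential exploratory": dict(
        lens=None, notation="QUAL → quan",
        priority="usually qualitative", integration="interpretation/discussion"),
    "sequential transformative": dict(
        lens="advocacy", notation="QUAN → QUAL or QUAL → QUAN",
        priority="either or equal", integration="interpretation/discussion"),
    "concurrent triangulation": dict(
        lens=None, notation="QUAN + QUAL",
        priority="usually equal", integration="interpretation"),
    "concurrent nested": dict(
        lens=None, notation="QUAN + qual or QUAL + quan",
        priority="unequal", integration="analysis (via transformation)"),
    "concurrent transformative": dict(
        lens="advocacy", notation="QUAN + QUAL (priority varies)",
        priority="either or equal", integration="interpretation or analysis"),
}

# Example query: which designs call for an explicit advocacy lens?
print([name for name, d in designs.items() if d["lens"] == "advocacy"])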
Illustration of Mixed Methods Research Designs
and Key Design Features
In this section, we use studies published in the counseling
literature to illustrate each of the six types of mixed methods
research designs. In so doing, conceptual issues, such as im-
plementation, priority, and data analysis and integration, may
become more concrete and easier to understand. We also use
these studies to highlight potential publication outlets and
topics; the extent to which they include an explicit purpose
statement, research questions, and rationale for using a mixed
methods design; the data collection procedures; and the data
analysis procedures. These design features are important ways
of characterizing mixed methods studies. They offer insights
into the complexities of this type of research and serve as sign-
posts and markers for identifying, understanding, and evalu-
ating the different types of designs.
To identify published mixed methods studies, we searched
the PsycINFO computer database three times between Au-
gust 2001 and May 2002, locating all counseling-related jour-
nal articles written in English. We then back-checked reference
lists of the articles to identify other studies that may have been
missed initially. This search procedure resulted in the identifi-
cation of 22 studies. These studies were published between
1986 and 2000. Table 1 lists the design features of each.

Table 1. Design Features of Mixed Methods Studies Published in Counseling

Study | Design | Topic | Purpose or RQs/rationale | Priority/analysis
Aspenson et al. (1993) | Concurrent nested | Training/supervision | Yes/yes | QUAL + quan/connected
Baker & Siryk (1986) | Concurrent nested | Assessment | Yes/no | QUAN + qual/connected
Balmer (1994) | Concurrent triangulation | Group counseling | No/yes | QUAN + QUAL/separate
Balmer et al. (1998) | Concurrent transformative | Group counseling | No/yes | QUAN + QUAL/separate
Balmer et al. (1996) | Concurrent triangulation | Individual counseling | No/yes | QUAN + QUAL/separate
Blustein et al. (1997) | Concurrent nested | Vocational/career | Yes/yes | QUAL + quan/CDT
Chusid & Cochran (1989) | Sequential explanatory | Vocational/career | Yes/yes | (qual→)quan→QUAL/connected
Daughtry & Kunkel (1993) | Sequential exploratory | Individual counseling | Yes/yes | qual→QUAN/connected
Gaston & Marmar (1989) | Concurrent nested | Individual counseling | Yes/yes | QUAN + qual/connected
Good & Heppner (1995) | Concurrent triangulation | Training/diversity | Yes/yes | QUAL + quan/SDT
Guernina (1998) | Concurrent nested | Individual counseling | Yes/yes | QUAN + qual/separate
Hill et al. (2000) | Concurrent triangulation | Individual counseling | Yes/yes | QUAN + QUAL/separate
Luzzo (1995) | Concurrent triangulation | Vocational/career | Yes/yes | QUAN + QUAL/separate
Martin et al. (1987) | Concurrent triangulation | Training/supervision | Yes/yes | QUAN + qual/SDT
Meier (1999) | Concurrent triangulation | Assessment/training | Yes/no | QUAN + QUAL/separate
Orndoff & Herr (1996) | Sequential explanatory | Vocational/career | Yes/yes | QUAN→QUAL/connected
Palmer & Cochran (1988) | Sequential explanatory | Vocational/career | Yes/no | QUAN→QUAL/separate
Paulson et al. (1999) | Sequential exploratory | Counseling process | Yes/yes | qual→QUAN/connected
Payne et al. (1991) | Sequential exploratory | Individual counseling | Yes/yes | (quan→)qual→QUAN/CDT
Poasa et al. (2000) | Sequential explanatory | Diversity | Yes/yes | quan→QUAL/separate
Wampold et al. (1995) | Sequential explanatory | Vocational/career | Yes/yes | QUAN→(quan + QUAL)/separate
Williams et al. (1997) | Concurrent nested | Training/supervision | Yes/yes | QUAL + quan/SDT

Note. Purpose or RQs (research questions)/rationale = whether or not the study included an explicit purpose statement, RQ, and/or rationale for using a mixed methods design. Priority/analysis = the weight, or relative emphasis, given to the quantitative and qualitative data/the point at which the data were analyzed and integrated. QUAL = qualitative data was prioritized; QUAN = quantitative data was prioritized; quan = lower priority given to the quantitative data; qual = lower priority given to the qualitative data; CDT = connected analyses with data transformation; SDT = separate analyses with data transformation.
Five of the six types of mixed methods research designs ap-
peared in the counseling literature during the designated time
period. Concurrent triangulation was the most common type
of design used (32%, n = 7), followed by concurrent nested de-
signs (27%, n = 6), sequential explanatory designs (23%, n =
5), sequential exploratory designs (14%, n = 3), and concurrent
transformative designs (4%, n = 1). No sequential transforma-
tive designs were used, and none of the studies used proce-
dural notations to depict their design.
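The percentages reported here follow directly from the design column of Table 1. The short tally below is our own bookkeeping, added only to make the arithmetic explicit; the counts themselves come from the review.

# Tally of design types across the 22 reviewed studies (counts from Table 1);
# this bookkeeping is ours and simply reproduces the arithmetic.
from collections import Counter

design_counts = Counter({
    "concurrent triangulation": 7,
    "concurrent nested": 6,
    "sequential explanatory": 5,
    "sequential exploratory": 3,
    "concurrent transformative": 1,
    "sequential transformative": 0,
})
total = sum(design_counts.values())  # 22 studies in all
for design, n in design_counts.most_common():
    print(f"{design}: n = {n}, {100 * n / total:.1f}% of the sample")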
Luzzo (1995) used a concurrent triangulation design to study
gender differences in career maturity and perceived barriers to
career development. Four hundred one undergraduate students
participated in the quantitative part of the study, and 128 par-
ticipated in the qualitative part. In this study, the author did not
use an advocacy lens, stated the study’s purpose and rationale
for using a mixed methods design, implemented data collec-
tion concurrently (QUAN and QUAL at the same time), prior-
itized the data equally, and integrated the data after analyzing
them (during the interpretation phase). Specifically, quantita-
tive data, in the form of scores on three different measures, and
qualitative data, in the form of tape-recorded responses to open-
ended questions, were collected to examine career-related gen-
der differences. After analyzing the quantitative and qualitative
data separately, the results were triangulated (i.e., integrated),
and consistent/overlapping gender differences were identified.
Balmer (1994), Balmer, Seeley, and Bachengana (1996), Good
and Heppner (1995), Hill et al. (2000), Martin, Goodyear, and
Newton (1987), and Meier (1999) are other examples of studies
that used concurrent triangulation designs.
Williams, Judge, Hill, and Hoffman (1997) also used a con-
current mixed methods research design. However, they used
a concurrent nested design to study “trainees’, clients’, and su-
pervisors’ perceptions of the trainees’ personal reactions and
management strategies during counseling sessions” (p. 391).
Seven doctoral trainees, 30 volunteer clients, and 7 supervisors
participated in the study. In this study, the authors did not use
an advocacy lens, stated the study’s purpose and rationale for
using a mixed methods design, reported three research ques-
tions (2 QUAL and 1 quan, which focused on different issues),
implemented data collection concurrently (quan and QUAL at
the same time), prioritized the qualitative data, and integrated
the data after analyzing/transforming them (during the in-
terpretation phase). Specifically, qualitative data, in the form
of written responses to open-ended questions, were collected
to examine two different issues: the kinds of personal reac-
tions trainees have during counseling sessions and the strat-
egies that they use to manage their reactions. Quantitative
data, in the form of pre- and postchange scores, were nested
and collected to examine changes in trainee anxiety, counsel-
ing self-efficacy, management of countertransference issues,
and general counseling skills. After analyzing the qualitative
and quantitative data separately, the results were used to help
answer the three research questions. Aspenson et al. (1993),
Baker and Siryk (1986), Blustein, Phillips, Jobin-Davis, Fin-
kelberg, and Rourke (1997), Gaston and Marmar (1989), and
Guernina (1998) are other examples of studies that used con-
current nested designs.
In contrast to Luzzo (1995) and Williams et al. (1997),
Palmer and Cochran (1988) used a sequential mixed meth-
ods research design. They used a sequential explanatory de-
sign to provide “an empirical test of parent effectiveness in
a structured career development program for their children”
(p. 71). Forty volunteer families participated in their study.
The experimental group completed a self-guided interven-
tion program, which was compared to a control group on
parent-child relationship measures and career development
outcomes. In this study, the authors used Bronfenbrenner’s
theory of human development and Super’s theory of career
development as explicit theoretical lenses, stated the study’s
purpose, implemented data collection sequentially (QUAN
followed by QUAL), prioritized the data equally, and inte-
grated the data after analyzing them (during the interpre-
tation phase and in the discussion). Specifically, quantita-
tive data, in the form of scores on three different measures,
were collected and analyzed, followed by qualitative data, in
the form of verbal responses to open-ended interviews. Af-
ter the quantitative data were analyzed, parents were inter-
viewed, either in person or by telephone, to “gain a narra-
tive description of how the program went, with attention to
problems and benefits. The questions were open-ended, in-
tended to invite general comments rather than definitive an-
swers” (Palmer & Cochran, 1988, p. 73). The qualitative data
were used to augment the quantitative data. The authors
noted that the “qualitative data from the interviews tended
to support quantitative results” (p. 74). The authors did not
report any research questions or specify a rationale for using
a mixed methods design. Chusid and Cochran (1989), Ornd-
off and Herr (1996), Poasa, Mallinckrodt, and Suzuki (2000),
and Wampold et al. (1995) are other examples of studies that
used sequential explanatory designs.
Paulson, Truscott, and Stuart (1999) also used a sequen-
tial mixed methods research design. However, they used a
sequential exploratory design to study clients’ perceptions of
helpful experiences in counseling. Thirty-six clients and 12
counselors participated in the study. In this study, the au-
thors did not use an advocacy lens, stated the study’s pur-
pose and rationale for using a mixed methods design, re-
ported one research question (combined qual and QUAN),
implemented data collection sequentially (qual followed by
QUAN), prioritized the quantitative data, and connected the
data analysis. Specifically, qualitative data, in the form of
transcribed responses to a single, open-ended question (i.e.,
“What was helpful about counseling?”), were collected and
analyzed, followed by quantitative data, in the form of a sort-
ing and rating task. Quantitative data were included to aug-
ment the qualitative data and to develop a concept map of
clients’ responses to the open-ended question. Daughtry and
Kunkel (1993) and Payne, Robbins, and Dougherty (1991) are
other examples of studies that used sequential exploratory
designs. The methodological approaches described by Good-
year et al. (2005) and Beck (2005) may also be considered ex-
amples of sequential exploratory designs.
In the only identied transformative mixed methods re-
search design, Balmer, Gikundi, Nasio, Kihuho, and Plum-
mer (1998) used a concurrent transformative design to “evaluate
group counseling, based upon a unified theory, as an inter-
vention strategy for men with an STD infection and to de-
velop a more detailed understanding of sexual behavior that
results in STD/HIV acquisition and transmission” (p. 34).
Two hundred forty-two men who were Kenyan and infected
with an STD and 6 counselors participated in this random-
ized clinical trial study. In this study, the authors used an ex-
plicit advocacy lens, stated the rationale for using a mixed
methods design, implemented data collection concurrently
(QUAN and QUAL at the same time), prioritized the data
equally, and integrated the data after analyzing them (dur-
ing the interpretation phase). Specifically, in terms of an ad-
vocacy (“participatory action research”) lens, “the qualitative
assessment process allowed the counseled groups to become
collaborators in a joint project and perhaps it increased their
commitment” (Balmer et al., 1998, p. 42). Thus, the research
participants’ perspectives were elicited and used to help val-
idate the findings. Moreover, the authors reported that the
participants changed as a result of their participation. In
terms of implementation (data collection), quantitative data,
in the form of pre- and postchange scores on five different
measures and medical statistics, and qualitative data, in the
form of observations, interviews, field notes, and documents,
were collected simultaneously. After analyzing the quantita-
tive and qualitative data separately, the results were triangu-
lated (i.e., integrated) and compared to the existing literature
in this area. The authors did not state the purpose explicitly
or report any research questions. No other examples of con-
current transformative designs were identied in our search
of the counseling literature.
No sequential transformative designs were identied ei-
ther. Consequently, to illustrate this design, a counseling-re-
lated study from the human development literature is de-
scribed here. In this study, Tolman and Szalacha (1999) used
a sequential transformative design to “understand the dimen-
sions of the experience of sexual desire for adolescent girls”
(p. 8). Thirty females who were in 11th grade and who at-
tended an urban high school (n = 15) and a suburban high
school (n = 15) participated in the study. In this study, the
authors used an explicit advocacy lens, stated the rationale
for using a mixed methods design, reported three research
questions (2 QUAL and 1 quan), implemented data col-
lection sequentially (QUAL followed by quan followed by
QUAL), prioritized the qualitative data, and connected the
data analysis. Specifically, in terms of the advocacy lens, it
was “explicitly feminist in nature,” using “a feminist orga-
nizing principle of listening to and taking women’s voices se-
riously…particularly in data collection and data reduction,
as well as in data analysis and interpretation” (p. 11). Thus,
a mixed methods design was used to create “an opportunity
for girls to put into words and to name their experience in
and questions about a realm of their lives that remains un-
spoken in the larger culture” (p. 13). Data were collected and
analyzed in three sequential phases. In the first and third
phases, qualitative data, in the form of transcribed narratives
of private, one-on-one, semistructured interviews, were col-
lected and analyzed. In the second phase, quantitative data,
in the form of coded frequency data, were collected and ana-
lyzed. Results from the first analysis were used to inform the
second phase of data collection, and similarly, results from
the second analysis were used to inform the third phase of
data collection. In the end, the results from the three analy-
ses were triangulated and used to help answer the three re-
search questions.
Journals, Purpose Statements, Research Questions, and Rationales
Mixed methods studies have been published in at least
seven counseling-related journals: Counselling Psychology
Quarterly (CPQ); Counselor Education and Supervision (CES);
Journal of Counseling & Development (JCD); Journal of Counsel-
ing Psychology (JCP); Professional Psychology: Research and Prac-
tice (PPRP), Psychotherapy: Theory, Research, Training, Prac-
tice; and The Counseling Psychologist (TCP). The investigations
have targeted a range of topics of interest to the field (e.g.,
individual counseling, vocational/career, training/supervi-
sion; see Table 1).
A particularly important design feature of mixed methods
studies is the extent to which they include an explicit pur-
pose statement, research questions (RQs), and rationale for
using both quantitative and qualitative methods and data in
a study (Creswell et al., 2003). As alluded to previously, pur-
pose statements and research questions serve as signposts
and markers for identifying, understanding, and evaluating
the different types of mixed methods research designs. They
also shape the analyses and integration of the results. Hav-
ing a well conceived rationale is also important because it
indicates to the reader that the quantitative and qualitative
methods and data were mixed intentionally and for defensi-
ble reasons.
In our sample, purpose statements, RQs, and rationales
were included in 19 (86%), 11 (50%), and 19 (86%) studies, re-
spectively. All 19 studies that stated a purpose stated it ex-
plicitly. For example, Wampold et al. (1995), in a two-part
study of differences in social skills across Holland types
(Study 1) and of how people who are task-oriented (e.g., C, R,
and I types) construct their social/work environments (Study
2), stated, “The purpose of Study 1 was to test the hypothe-
ses about relative strengths and weaknesses in specified so-
cial skills for various types of people” (p. 368) and “Study 2
was a qualitative study designed to examine the density and
nature of social interactions produced by chemists in an aca-
demic setting” (p. 371). Three studies (14%) did not include
purpose statements.
Across the 11 studies that included RQs, the number of RQs
ranged from one to five, with a mean of 2.64 RQs (SD = 1.36).
Five studies (45%) included both quantitative and qualitative
RQs. Three (27%) included only quantitative RQs, one (9%) in-
cluded only qualitative, and two (18%) included only combi-
nations of quantitative and qualitative.
Across the 19 studies that stated a rationale for mix-
ing methods and quantitative and qualitative data, 16 (84%)
stated it explicitly. For example, Gaston and Marmar (1989),
in a time-series study of therapeutic change events, mentioned
specifically the importance of including both forms of data:
The main thesis of this article is that quantitative and qual-
itative knowledge are both essential for the understanding
of the change process in psychotherapy. Ideally, information
from both paradigms should be acquired within single inves-
tigations. With the use of a study example, we attempt to il-
lustrate the dual advantages of richer process-outcome find-
ings provided by combining quantitative and qualitative
approaches. (p. 169)
Three (16%) of the 19 studies that reported a rationale did
not state it explicitly. In these studies, it was implied and had
to be inferred from the text. Three studies (14%) did not indi-
cate a rationale.
Data Collection Procedures
Fourteen mixed methods studies implemented data col-
lection procedures concurrently (64%), and 8 implemented
them sequentially (36%). Priority was distributed more or
less evenly across studies, with 7 prioritizing quantitative
data (32%), 6 prioritizing qualitative data (27%), and 9 pri-
oritizing both equally (41%). Quantitative data consisted pri-
marily of self-report, instrument-based data (n = 20; 91%),
followed by rating tasks (n = 5; 23%) and by observation-
(n = 1; 4%) and physiology-based data (n = 1; 4%). Qualita-
tive data consisted primarily of data based on individual or
group interviews (n = 17; 77%), followed by observations/
field notes (n = 9; 41%) and by data based on existing mate-
rials (n = 4; 18%), including official records, personal docu-
ments, and archival data.
Data Analysis Procedures
Ten mixed methods studies (45%) analyzed quantitative
and qualitative data separately, before all of the data were col-
lected or analyzed. Data analysis was connected in 7 studies
(32%), separated and transformed (e.g., qualitative data were
transformed into quantitative scores) in 3 studies (14%), and
connected and transformed in 2 studies (9%). Quantitative
data analysis consisted primarily of descriptive, or explor-
atory, procedures (n = 20; 91%), followed by inferential, or
conrmatory, procedures (n = 19; 86%). Qualitative data anal-
ysis consisted primarily of the identification of themes and re-
lationships (n = 17; 77%), using, for example, grounded theory
(Strauss & Corbin, 1990) and consensual qualitative research
(CQR; Hill, Thompson, & Williams, 1997), followed by thick
description (n = 8; 36%; Wolcott, 1994). Twenty (91%) of the
studies integrated the data at the interpretation stage, and 2
(9%) integrated the data at the analysis stage.
In considering the 22 studies cited in this section, a number
of general observations may be made. First, mixed methods
studies have indeed been published in counseling journals, the
majority of which were published in CPQ, JCP, JCD, or TCP
during the 1990s. Second, concurrent designs, where quanti-
tative and qualitative data are collected at the same time, were
the most common type of design used. Third, researchers who
published mixed methods studies tended to include purpose
statements, research questions, and rationales for using these
designs. None of the studies, however, used procedural nota-
tions to depict the design. Fourth, the priority for data collec-
tion was distributed equally between quantitative and quali-
tative data across the studies. Fifth, data analysis tended to
occur separately, and integration of the results (i.e., triangula-
tion) tended to occur at the interpretation stage and in the dis-
cussion—approaches to analysis and integration that are con-
sistent with concurrent triangulation designs, the single most
popular type of design that was used.
We are well aware that these observations are primarily
descriptive in nature. In reviewing the studies, we did not at-
tempt to critique or rate the quality of any of them. As descrip-
tive categories and standardized evaluative criteria continue
to evolve, it may become easier to offer more formal strengths-
and weaknesses-based observations. We are also aware that,
despite our systematic, 9-month-long literature search, it is
quite likely that we missed a few studies, especially ones that
have been published within the past few years. Despite these
limitations, we hope that this section of the article is of heuris-
tic value to readers.
Recommendations
The primary purpose of this article was to introduce mixed
methods research to counseling researchers and educators. On
the basis of our understanding of mixed methods procedures
and designs, as well as the general observations noted above,
we offer the following recommendations for designing, imple-
menting, and reporting a mixed methods study.
1. We recommend that researchers attend closely to theoreti-
cal/paradigmatic issues. Attention should be paid to the
theoretical lens that informs the investigation and to the
priority that is assigned to the quantitative and qualitative
data. Explicit statement of the researcher’s lens is informa-
tive. A postpositivist lens would, for example, be appro-
priate for a sequential explanatory design that prioritized
the quantitative data, whereas a constructivist lens would
be appropriate for a sequential exploratory design that pri-
oritized the qualitative data. For transformative designs,
an advocacy-based or transformative-emancipatory lens
would be required, regardless of whether the quantitative
or qualitative data were prioritized.
2. We recommend that researchers also attend closely to de-
sign and implementation issues, particularly to how and
when data are collected (e.g., concurrently or sequentially).
The study’s purpose plays an important role here (Cre-
swell, 1999). If, for example, the purpose is to triangulate
or converge the results, then the data may be collected con-
currently. However, elaboration of the results would re-
quire a sequential design.
3. In mixed methods studies, data analysis and integration
may occur at almost any point in time (Creswell et al.,
2003). As noted by Onwuegbuzie and Teddlie (2003), “The
point at which the data analysis begins and ends depends
on the type of data collected, which in turn depends on the
sample size, which in turn depends on the research design,
which in turn depends on the purpose” (p. 351). We rec-
ommend that researchers familiarize themselves with the
analysis and integration strategies used in the mixed meth-
ods studies cited in this article as well as with those recom-
mended by Caracelli and Greene (1993) and Onwuegbuzie
and Teddlie (2003).
4. Because mixed methods studies require a working knowl-
edge and understanding of both quantitative and quali-
tative methods, and because they involve multiple stages
of data collection and analysis that frequently extend over
long periods of time, we recommend that researchers work
in teams. Working in teams allows researchers with ex-
pertise in quantitative methods and analyses, qualitative
methods and analyses, and/or both to be involved directly
in designing and implementing a mixed methods study.
5. In preparing a mixed methods manuscript, we recommend
that researchers use the phrase mixed methods in the titles of
their studies. We also recommend that, early on, research-
ers foreshadow the logic and progression of their studies
by stating the study’s purpose and research questions in
the introduction. Clear, well-written purpose statements
and research questions that specify the quantitative and
qualitative aspects of the study help focus the manuscript.
6. We recommend that, in the introduction, researchers ex-
plicitly state a rationale for mixing quantitative and qual-
itative methods and data (e.g., to triangulate the results, to
extend the study’s results). It is best to specify the advan-
tages, for the specified research questions, that accrue from
using both methods and data. Examples of good rationales
may be found in Gaston and Marmar (1989) and Hill et al.
(2000).
7. We recommend that, in the methods, researchers specify the
type of mixed methods research design used (e.g., sequen-
tial explanatory mixed methods design) and include proce-
dural notations such as those shown in Figures 1 and 2. Doing
so will help the field build a common vocabulary and shared
understanding of the different types of
designs available.
8. Finally, we recommend that counseling researchers and ed-
ucators continue having candid discussions about the le-
gitimacy and viability of mixed methods research. As one
anonymous reviewer noted,
researchers [should] openly discuss their views on
the integration of potentially distinct epistemolog-
ical issues in using mixed designs. This may not
always be necessary when the methods are rela-
tively close with respect to assumptions about the
nature of knowledge. However, when the methods
are quite far apart…some exploration of the com-
plexities of merging methodological perspectives
would be quite helpful.
We strongly agree. Discussions of this nature may stimulate
additional interest and future advancements in this emerging
form of inquiry.
Many scholars have begun to describe mixed methods re-
search as a legitimate, stand-alone research design ready to
stand beside time-honored designs such as experiments, sur-
veys, grounded theory studies, and ethnographies (Datta, 1994;
Tashakkori & Teddlie, 1998, 2003). Despite numerous chal-
lenges and obstacles, it has emerged as a viable alternative to
purely quantitative or qualitative methods and designs. With
studies available in the literature, and in this issue, to serve as
models, and with the recommendations included here, coun-
seling researchers and educators may be on the verge of a new
generation of thinking about method and methodology.
References
Aiken, L. S., West, S. G., Sechrest, L., & Reno, R. R. (1990). Grad-
uate training in statistics, methodology, and measurement in
psychology: A survey of PhD programs in North America.
American Psychologist, 45, 721–734.
Aspenson, D. O., Gersh, T. L., Perot, A. R., Galassi, J. P., Schroeder,
R., Kerick, S., Bulger, J., & Brooks, L. (1993). Graduate psychol-
ogy students’ perceptions of the scientist-practitioner model of
training. Counselling Psychology Quarterly, 6, 201–215.
Baker, R. W., & Siryk, B. (1986). Exploratory intervention with a
scale measuring adjustment to college. Journal of Counseling
Psychology, 33, 31–38.
Balmer, D. H. (1994). The efficacy of a scientific and ethnographic
research design for evaluating AIDS group counselling. Coun-
selling Psychology Quarterly, 7, 429–440.
Balmer, D. H., Gikundi, E., Nasio, J., Kihuho, F., & Plummer, F. A.
(1998). A clinical trial of group counselling for changing high-
risk sexual behaviour in men. Counselling Psychology Quarterly,
11, 33–43.
Balmer, D. H., Seeley, J., & Bachengana, C. (1996). The role of
counselling in community support for HIV/AIDS in Uganda.
Counselling Psychology Quarterly, 9, 177–190.
Bamberger, M. (Ed.). (2000). Integrating quantitative and qualitative
research in development projects. Washington, DC: World Bank.
Beck, K. A. (2005). Ethnographic decision tree modeling: A re-
search method for counseling psychologists. Journal of Counsel-
ing Psychology, 52, 243–249.
Behrens, J. T., & Smith, M. L. (1996). Data and data analysis. In
D. Berliner & B. Calfee (Eds.), The handbook of educational psy-
chology (pp. 945–989). New York: Macmillan.
Blustein, D. L., Phillips, S. D., Jobin-Davis, K., Finkelberg, S. L.,
& Rourke, A. E. (1997). A theory-building investigation of
the school-to-work transition. The Counseling Psychologist, 25,
364–402.
Brewer, J., & Hunter, A. (1989). Multimethod research: A synthesis of
styles. Newbury Park, CA: Sage.
Brown, S. D., & Lent, R. W. (2000). Handbook of counseling psychol-
ogy (3rd ed.). New York: Wiley.
Bryman, A. (1988). Quantity and quality in social research. London:
Routledge.
Campbell, D. T., & Fiske, D. W. (1959). Convergent and discriminant
validation by the multitrait-multimethod matrix. Psychological
Bulletin, 56, 81–105.
Campbell, D. T., & Stanley, J. C. (1966). Experimental and quasi-
experimental designs for research. In N. L. Gage (Ed.),
Handbook of research on teaching (pp. 1–76). Chicago: Rand
McNally.
Caracelli, V. J., & Greene, J. C. (1993). Data analysis strategies for
mixed-method evaluation designs. Educational Evaluation and
Policy Analysis, 15, 195–207.
Cherryholmes, C. C. (1992). Notes on pragmatism and scientific
realism. Educational Researcher, 21, 13–17.
Chusid, H., & Cochran, L. (1989). Meaning of career changes from
the perspective of family roles and dramas. Journal of Counsel-
ing Psychology, 36, 34–41.
Cook, T. D., & Reichardt, C. S. (Eds.). (1979). Qualitative and quanti-
tative methods in evaluation research. Beverly Hills, CA: Sage.
Creswell, J. W. (1999). Mixed method research: Introduction and
application. In G. J. Cizek (Ed.), Handbook of educational policy (pp.
455–472). San Diego, CA: Academic Press.
Creswell, J. W. (2002). Educational research: Planning, conducting,
and evaluating quantitative and qualitative approaches to research.
Upper Saddle River, NJ: Merrill/Pearson Education.
Creswell, J. W. (2003). Research design: Qualitative, quantitative,
and mixed methods approaches (2nd ed.). Thousand Oaks, CA:
Sage.
Creswell, J. W., Plano Clark, V. L., Gutmann, M. L., & Hanson,
W. E. (2003). Advanced mixed methods research designs. In
A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods
in social and behavioral research (pp. 209–240). Thousand Oaks,
CA: Sage.
Crotty, M. (1998). The foundations of social research: Meaning and per-
spective in the research process. London: Sage.
Datta, L. (1994). Paradigm wars: A basis for peaceful coexistence
and beyond. In C. S. Reichardt & S. F. Rallis (Eds.), The qualita-
tive-quantitative debate: New perspectives (pp. 53–70). San Fran-
cisco: Jossey-Bass.
Daughtry, D., & Kunkel, M. A. (1993). Experience of depression in
college students: A concept map. Journal of Counseling Psychol-
ogy, 40, 316–323.
Gaston, L., & Marmar, C. R. (1989). Quantitative and qualitative
analyses for psychotherapy research: Integration through
time-series design. Psychotherapy, 26, 169–176.
Gelso, C. J. (1979). Research in counseling: Methodological and
professional issues. The Counseling Psychologist, 8, 7–36.
Gergen, K. J. (2001). Psychological science in a postmodern con-
text. American Psychologist, 56, 803–813.
Goldman, L. (1976). A revolution in counseling psychology. Jour-
nal of Counseling Psychology, 23, 543–552.
Good, G. E., & Heppner, M. J. (1995). Students’ perceptions of a
gender issues course: A qualitative and quantitative examina-
tion. Counselor Education and Supervision, 34, 308–320.
Goodyear, R. K., Tracey, T. J. G., Claiborn, C. D., Lichtenberg, J.
W., & Wampold, B. E. (2005). Ideographic concept mapping in
counseling psychology research: Conceptual overview, meth-
odology, and an illustration. Journal of Counseling Psychology,
52, 236–242.
Greene, J. C., & Caracelli, V. J. (Eds.). (1997). Advances in mixed-
method evaluation: The challenges and benets of integrating di-
verse paradigms. (New Directions for Evaluation, No. 74). San
Francisco: Jossey-Bass.
Greene, J. C., & Caracelli, V. J. (2003). Making paradigmatic sense
of mixed methods practice. In A. Tashakkori & C. Teddlie
(Eds.), Handbook of mixed methods in social and behavioral research
(pp. 91–110). Thousand Oaks, CA: Sage.
Greene, J. C., Caracelli, V. J., & Graham, W. F. (1989). Toward a
conceptual framework for mixed-method evaluation designs.
Educational Evaluation and Policy Analysis, 11, 255–274.
Guba, E. G., & Lincoln, Y. S. (1988). Do inquiry paradigms imply
inquiry methodologies? In D. M. Fetterman (Ed.), Qualitative
approaches to evaluation in education (pp. 89–115). New York:
Praeger Publishers.
Guernina, Z. (1998). Adolescents with eating disorders: A pilot
study. Counselling Psychology Quarterly, 11, 117–124.
Heppner, P. P., Kivlighan, D. M., Jr., & Wampold, B. E. (1999). Re-
search design in counseling (2nd ed.). Belmont, CA: Wadsworth.
Hill, C. E., Thompson, B. J., & Williams, E. N. (1997). A guide to
conducting consensual qualitative research. The Counseling
Psychologist, 25, 517–572.
Hill, C. E., Zack, J. S., Wonnell, T. L., Hoffman, M. A., Rochlen, A.
B., Goldberg, J. L., et al. (2000). Structured brief therapy with a
focus on dreams or loss for clients with troubling dreams and
recent loss. Journal of Counseling Psychology, 47, 90–101.
Hoshmand, L. L. S. T. (1989). Alternate research paradigms: A re-
view and teaching proposal. The Counseling Psychologist, 17,
3–79.
Howard, G. S. (1983). Toward methodological pluralism. Journal of
Counseling Psychology, 30, 19–21.
Jick, T. D. (1979). Mixing qualitative and quantitative methods:
Triangulation in action. Administrative Science Quarterly, 24,
602–611.
Luzzo, D. A. (1995). Gender differences in college students’ career
maturity and perceived barriers in career development. Jour-
nal of Counseling & Development, 73, 319–322.
Maione, P. V., & Chenail, R. J. (1999). Qualitative inquiry in psy-
chotherapy: Research on the common factors. In M. A. Hubble,
B. L. Duncan, & S. D. Miller (Eds.), The heart and soul of change:
What works in therapy (pp. 57–88). Washington, DC: American
Psychological Association.
Martin, J. S., Goodyear, R. K., & Newton, F. B. (1987). Clinical su-
pervision: An intensive case study. Professional Psychology: Re-
search and Practice, 18, 225–235.
Meier, S. T. (1999). Training the practitioner-scientist: Bridg-
ing case conceptualization, assessment, and intervention. The
Counseling Psychologist, 27, 846–869.
Mertens, D. M. (2003). Mixed methods and the politics of human
research: The transformative-emancipatory perspective. In
A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods
in social and behavioral research (pp. 135–164). Thousand Oaks,
CA: Sage.
Morgan, D. L. (1998). Practical strategies for combining qualitative
and quantitative methods: Applications to health research.
Qualitative Health Research, 8, 362–376.
Morrow, S. L., & Smith, M. L. (2000). Qualitative research for coun-
seling psychology. In S. D. Brown & R. W. Lent (Eds.), Hand-
book of counseling psychology (3rd ed., pp. 199–230). New York:
Wiley.
Morse, J. M. (1991). Approaches to qualitative-quantitative meth-
odological triangulation. Nursing Research, 40, 120–123.
Morse, J. M. (2003). Principles of mixed methods and multimethod
research design. In A. Tashakkori & C. Teddlie (Eds.), Handbook
of mixed methods in social and behavioral research (pp. 189–208).
Thousand Oaks, CA: Sage.
Newman, I., & Benz, C. R. (1998). Qualitative-quantitative research
methodology: Exploring the interactive continuum. Carbondale:
Southern Illinois University Press.
Newman, I., Ridenour, C. S., Newman, C., & DeMarco, G. M. P.,
Jr. (2003). A typology of research purposes and its relationship
to mixed methods. In A. Tashakkori & C. Teddlie (Eds.), Hand-
book of mixed methods in social and behavioral research (pp. 167–
188). Thousand Oaks, CA: Sage.
Onwuegbuzie, A. J., & Teddlie, C. (2003). A framework for ana-
lyzing data in mixed methods research. In A. Tashakkori &
C. Teddlie (Eds.), Handbook of mixed methods in social and behav-
ioral research (pp. 351–383). Thousand Oaks, CA: Sage.
Orndoff, R. M., & Herr, E. L. (1996). A comparative study of de-
clared and undeclared college students on career uncertainty
and involvement in career development activities. Journal of
Counseling & Development, 74, 632–640.
Palmer, S., & Cochran, L. (1988). Parents as agents of career devel-
opment. Journal of Counseling Psychology, 35, 71–76.
Paulson, B. L., Truscott, D., & Stuart, J. (1999). Clients’ perceptions
of helpful experiences in counseling. Journal of Counseling Psy-
chology, 46, 317–324.
Payne, E. C., Robbins, S. B., & Dougherty, L. (1991). Goal directed-
ness and older-adult adjustment. Journal of Counseling Psychol-
ogy, 38, 302–308.
Poasa, K. H., Mallinckrodt, B., & Suzuki, L. A. (2000). Causal attri-
butions for problematic family interactions: A qualitative, cul-
tural comparison of Western Samoa, American Samoa and the
United States. The Counseling Psychologist, 28, 32–60.
Ponterotto, J. G. (2005). Qualitative research in counseling psy-
chology: A primer on research paradigms and philosophy of
science. Journal of Counseling Psychology, 52, 126–136.
Ponterotto, J. G., & Grieger, I. (1999). Merging qualitative and
quantitative perspectives in a research identity. In M. Kopala &
L. Suzuki (Eds.), Using qualitative methods in psychology (pp. 49–
62). Thousand Oaks, CA: Sage.
Punch, K. F. (1998). Introduction to social research: Quantitative and
qualitative approaches. Thousand Oaks, CA: Sage.
Reichardt, C. S., & Cook, T. D. (1979). Beyond qualitative versus
quantitative methods. In T. D. Cook & C. S. Reichardt (Eds.),
Qualitative and quantitative methods in evaluation research (pp. 7–
32). Beverly Hills, CA: Sage.
Reichardt, C. S., & Rallis, S. F. (Eds.). (1994). The qualitative-quanti-
tative debate: New perspectives. San Francisco: Jossey-Bass.
Rossman, G. B., & Wilson, B. L. (1985). Numbers and words:
Combining quantitative and qualitative methods in a single
large-scale evaluation study. Evaluation Review, 9, 627–643.
Sieber, S. D. (1973). The integration of field work and survey meth-
ods. American Journal of Sociology, 78, 1335–1359.
Smith, J. K. (1983). Quantitative versus qualitative research: An at-
tempt to clarify the issue. Educational Researcher, 12, 6–13.
Steckler, A., McLeroy, K. R., Goodman, R. M., Bird, S. T., & Mc-
Cormick, L. (1992). Toward integrating qualitative and quan-
titative methods: An introduction. Health Education Quarterly,
19, 1–8.
Strauss, A., & Corbin, J. (1990). Basics of qualitative research:
Grounded theory procedures and techniques. Newbury Park, CA:
Sage.
Tashakkori, A., & Teddlie, C. (1998). Mixed methodology: Combin-
ing qualitative and quantitative approaches. Thousand Oaks, CA:
Sage.
Tashakkori, A., & Teddlie, C. (Eds.). (2003). Handbook of mixed
methods in social and behavioral research. Thousand Oaks, CA:
Sage.
Tolman, D. L., & Szalacha, L. A. (1999). Dimensions of desire:
Bridging qualitative and quantitative methods in a study of fe-
male adolescent sexuality. Psychology of Women Quarterly, 23,
7–39.
Wampold, B. E., Ankarlo, G., Mondin, G., Trinidad-Carrillo, M.,
Baumler, B., & Prater, K. (1995). Social skills of and social en-
vironments produced by different Holland types: A social per-
spective on person-environment fit model. Journal of Counsel-
ing Psychology, 42, 365–379.
Waszak, C., & Sines, M. C. (2003). Mixed methods in psycholog-
ical research. In A. Tashakkori & C. Teddlie (Eds.), Handbook
of mixed methods in social and behavioral research (pp. 557–576).
Thousand Oaks, CA: Sage.
Williams, E. N., Judge, A. B., Hill, C. E., & Hoffman, M. A. (1997).
Experiences of novice therapists in prepracticum: Trainees’,
clients’, and supervisors’ perceptions of therapists’ personal
reactions and management strategies. Journal of Counseling
Psychology, 44, 390–399.
Wolcott, H. F. (1994). Transforming qualitative data: Description, anal-
ysis, and interpretation. Thousand Oaks, CA: Sage.