11/5/2019
Week 5
Research Methods and Data Types
Health research is a systematic (it follows a sequential process) and principled (it is carried
out according to explicit rules) way to gather information to investigate health issues and
solve health-related problems. These rules are what constitute a research
method. In health research, the term method refers to a strict set of rules, including (a)
how knowledge should be acquired, (b) the form in which knowledge should be stated,
and (c) how the truth or validity of knowledge should be established (Polgar & Thomas,
2019). This is distinct from your project's methodology, which consists of the
research design, sample size, data collection procedures, instrumentation, and paradigm
(Glicken, 2003).
There are three overarching conceptual foundations of health research methods:
quantitative, qualitative and mixed methods.
Quantitative research involves the use of mathematics in the discovery of
relationships among a variety of variables.
This research method uses a positivist paradigm, experimental designs, and
the scientific method to generate results that may reveal such relationships
that are generalizable to other people, places, and events (Polgar & Thomas,
2019).
Qualitative research is the examination of phenomena within the cultural and social
environment in which they take place.
It is not designed to determine cause-effect relationships or the ability to
generalize to other people, places, and events (as is found in quantitative
research); rather, it incorporates participant observations, in-depth
interviews, focus group discussions, and textual (nonnumeric) data to
identify themes and patterns to eventually generate new theories (Jacobsen,
2017).
A mixed-methods approach combines both quantitative and qualitative methods.
Here, the researcher has the advantage of producing data that includes
both objective facts (statistical analysis) and subjective experiences
described by the study participants (Polgar & Thomas, 2019).
Books and Resources for this Week
Bazeley, P. (2015). Mixed methods in management research: Implications for the field. Electronic Journal of Business Research Methods, 13(1), 27-35.
Data can be derived from a primary study, secondary sources, or a systematic review of
the literature. A primary study involves the collection of firsthand, often never-before-seen
information by the person who actually conducted the research being reported. The
analysis of an existing (historical) dataset collected by someone other than the person
conducting the current research constitutes a secondary source. A systematic review uses a
predetermined, comprehensive, and transparent search and screening process to identify,
collect, critique, and synthesize all of the relevant studies on a particular topic. If the review
combines data from a number of studies into one calculated outcome, it is referred to as
a meta-analysis (Forister & Blessing, 2016).
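Since the distinction between a systematic review and a meta-analysis turns on that one calculated outcome, a small illustration may help. The sketch below shows a fixed-effect, inverse-variance pooling of study results; the effect sizes and standard errors are invented for illustration, and a real meta-analysis would be carried out with dedicated statistical software:

```python
# Fixed-effect, inverse-variance pooling of study effects.
# All effect sizes and standard errors below are invented for illustration.

def pooled_effect(effects, std_errors):
    """Combine per-study effect sizes into one weighted estimate.

    Each study is weighted by 1 / SE^2, so more precise studies
    (smaller standard errors) contribute more to the pooled result.
    """
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = (1.0 / sum(weights)) ** 0.5
    return pooled, pooled_se

# Three hypothetical studies reporting the same outcome measure
effects = [0.30, 0.10, 0.25]
std_errors = [0.10, 0.05, 0.20]

estimate, se = pooled_effect(effects, std_errors)
print(round(estimate, 3), round(se, 3))
```

Note that the more precise studies dominate the pooled estimate; dropping the weights would reduce this to a simple, and usually misleading, average.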
Be sure to review this week's resources carefully. You are expected to apply the information
from these resources when you prepare your assignments.
References
Forister, J. G., & Blessing, J. D. (2016). Introduction to research and medical literature for
health professionals (4th ed.). Burlington, MA: Jones & Bartlett Learning.
Glicken, M. D. (2003). Social research: A simple guide. Boston, MA: Pearson Education,
Inc.
Jacobsen, K. H. (2017). Introduction to health research methods: A practical guide (2nd
ed.). Burlington, MA: Jones & Bartlett Learning.
Polgar, S., & Thomas, S. A. (2019). Introduction to research in the health sciences (7th ed.).
Edinburgh: Elsevier, Ltd.
Goff, W. M., & Getenet, S. (2017). Design-based research in doctoral studies: Adding a new dimension to doctoral research. International Journal...
Humphrey, R., & Simpson, B. (2012). Writes of passage: Writing up qualitative data as a threshold concept in doctoral research. Teaching in Higher Edu
Landrum, B., & Garza, G. (2015). Mending fences: Defining the domains and approaches of quantitative and qualitative research. Qualitative Psychology,
Mligo, E. S. (2016). Introduction to research methods and report writing: A practical guide for students and researchers in social sciences and...
Wellington, J., & Szczerbinski, M. (2007). Research methods for the social sciences. London, UK: Continuum International Publishing Group.
Yoshikawa, H., Weisner, T. S., Kalil, A., & Way, N. (2013). Mixing qualitative and quantitative research in developmental science: Uses and...
Quantitative: Link
Qualitative: Link
Mixed Methods: Link
DHA-7008 Week 5 Assignment Word Document
Use of evidence.... PDF document
Week 5 - Assignment 1: Evaluate Research Methods Assignment
Due November 10 at 11:59 PM
For this assignment, you will review the quantitative, qualitative, and mixed methods
articles found in this week’s resources under Articles for Review.
For each article, you will need to provide the following:
Provide an APA-formatted reference list entry
Identify the problem under study
Identify the purpose statement
Emphasize key points of the study
Explain why the methodology was chosen
Discuss how researchers addressed ethical considerations in the study
Length: 4-6 pages, not including the title page. A reference page is not needed for this
assignment.
References: 3 references are provided
Your annotated bibliography should demonstrate thoughtful consideration of the ideas
and concepts that are presented in the course and provide new thoughts and insights
relating directly to this topic. Your response should reflect graduate-level writing and
APA standards.

Week 5 - Assignment 2: Determine Secondary Data Sources Assignment
Due November 10 at 11:59 PM
For this assignment, you will use the table provided and identify at least two possible
secondary data sources for each project type listed in the first column. You will need to
describe more specifically what your possible project may be for each project type in the
second column. This helps frame the data sources and your rationale for choosing that
source. Make sure you include the URL for those sources retrievable from the Web.
Project Type
Project Description
Secondary Data Source
Rationale
Complete APA Reference
Quality Improvement/Performance Management Project
This type of project leads to measurable improvement in healthcare systems, services, and/or health status of targeted populations.
Healthcare Policy Analysis/Policy Development
This type of project is broad and may include the analysis of the policy process or policy content, and links to health outcomes. It is used to help influence stakeholders' decisions.
Evaluation of the Effectiveness of a Project, Program, Intervention, Services, etc.
This type of project is used to gain insight, improve practice, assess effects, and/or build capacity.
Table 1. Project Type Table
Length: 1-2 pages, not including the title page. A reference page is not needed for this
assignment.
References: Include a minimum of 6 scholarly and/or professional data sources
Your table should demonstrate thoughtful consideration of the ideas and concepts that are
presented in the course and provide new thoughts and insights relating directly to this
topic. Your response should reflect graduate-level writing and APA standards.
Chapter 5: Financing Risk
Financing Risk
• Risk exists for a healthcare organization if there is an event or action that can have an impact on its financial or operational performance.
• Healthcare organizations work to balance this by covering the financial risk or transferring it.
– Financing risk means ensuring that adequate funds are available to cover costs related to unexpected events.
– Transferring risk is accomplished by purchasing insurance.
To Finance or Transfer Risk
• Management of risk is paramount to the healthcare organization and should be tailored to its specific needs and structure.
• The healthcare organization must determine what risk can and should be internally financed versus what risk should be transferred.
• The goal of risk management is to add value to the organization by appropriately and wisely managing risk.
Costs of Adverse Risk
• Defense Costs
• Settlement or Judgment
• Loss Reduction
• Employee Morale
• Opportunity Costs
Identifying Risk
• Risk managers work to identify areas of risk exposure in order to minimize the likelihood of adverse events, as well as how to cover costs should they occur, by monitoring:
– Adverse incident reports
– Patient safety data
– Quality indicators
– Insurance company claims
– Employee satisfaction/complaints
– Patient satisfaction/complaints
– Accreditation survey results
– Financial reports
– Professional literature
Financing the Risk
• The fiscal well-being of the organization determines how best to manage the financing of risk.
• Internal financing is not prudent if the organization does not have available funding to cover the risk.
• External financing of risk is less costly, yet it is still a financial expense to the organization and must be weighed against how much coverage is needed.
Analyzing How to Finance Risk
• Healthcare organizations evaluate the cost-effectiveness of available risk financing alternatives through:
– Quantitative analysis, which measures an event's risk variables
– Qualitative analysis, which measures the event's impact on the organization
Insurance Options
• Traditional Insurance Companies
– Fairly common
– Standard coverage
– Cost is relatively predictable
– Events not covered by insurance remain the
responsibility of the healthcare organization
• Self-Insurance or Self-Funding
– Requires a significant amount of capital and
financial reserves
Choosing an Insurance Plan
• Make sure the plan meets your needs in
terms of:
– Portability
– Flexibility
– Services provided
• Choose a company based on:
– Experience
– Staffing
– Technology
– Procedures
– Costs
– Protection
Total Cost of Risk
• In order to balance the need for risk financing with the cost, healthcare organizations need to estimate the total cost of risk by analyzing:
– Cost of risk transfer
– Cost of risk retention
– Administrative costs associated with managing both the exposure to risk and claims if adverse events occur
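The estimate described above is, at its simplest, a sum of the three cost components. A minimal sketch (all dollar figures are hypothetical):

```python
# Total cost of risk = transfer cost + retention cost + administrative cost.
# All figures below are hypothetical annual amounts, for illustration only.

def total_cost_of_risk(transfer_cost, retention_cost, admin_cost):
    """Sum the components a healthcare organization weighs when
    balancing the need for risk financing against its cost."""
    return transfer_cost + retention_cost + admin_cost

cost = total_cost_of_risk(
    transfer_cost=250_000,   # insurance premiums (risk transferred)
    retention_cost=100_000,  # reserves set aside for retained risk
    admin_cost=40_000,       # managing exposure and claims
)
print(cost)  # 390000
```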
Areas of Exposure
• Automobile Liability
• Aircraft Liability
• Business Interruption and Income
• Crime
• Cyber Liability
• Directors/Officers Liability
• Emergency Evacuation
• Employment (injury/illness, benefits,
practices)
Areas of Exposure
• Fiduciary Liability
• General Liability
• Licensing Board Discipline
• Media
• Medical Equipment Breakdown
• Patient Confidentiality
• Professional Liability
• Property
Insuring Agreements
• Insurance company will pay sums that the insured becomes legally obligated to pay.
• Occurrence Policies cover all injuries that occurred during the policy period, regardless of when they were reported.
• Claims Made policies cover injuries reported during the policy period that occurred after the policy retroactive date.
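The difference between the two policy types is purely a matter of which dates are compared. A hedged sketch of that logic (the function and its parameters are illustrative, not drawn from any actual insurance system):

```python
from datetime import date

def claim_covered(policy_type, policy_start, policy_end,
                  injury_date, report_date, retroactive_date=None):
    """Decide coverage under the two policy types described above.

    Occurrence policy: the injury must fall inside the policy period,
    regardless of when it is reported.
    Claims-made policy: the claim must be reported during the policy
    period, and the injury must occur on or after the retroactive date.
    """
    if policy_type == "occurrence":
        return policy_start <= injury_date <= policy_end
    if policy_type == "claims-made":
        reported_in_period = policy_start <= report_date <= policy_end
        after_retro = retroactive_date is None or injury_date >= retroactive_date
        return reported_in_period and after_retro
    raise ValueError(f"unknown policy type: {policy_type}")

# Injury during the policy period but reported years later:
# covered under occurrence, not under claims-made.
start, end = date(2015, 1, 1), date(2015, 12, 31)
injury, report = date(2015, 6, 1), date(2018, 3, 1)
print(claim_covered("occurrence", start, end, injury, report))      # True
print(claim_covered("claims-made", start, end, injury, report,
                    retroactive_date=date(2014, 1, 1)))             # False
```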
Summary
• Financing of risk is a major component of risk management.
• Determining the method of financing risk, as well as selecting the appropriate liability insurance company and plan, is essential.
Chapter 4: Communications to Reduce Risk
Communication is a risk?
• Lack of communication between
physicians and their patients can be a
critical factor leading to malpractice
lawsuits
– Lack of communication can lead to patient
dissatisfaction
– Dissatisfied patients are more likely to pursue
malpractice litigation
Barriers to Communication
• Lack of or poor listening skills
• Physical barriers
• Personal distractions
Communication depends on…
• Personality
• Age
• Environmental factors
– Income
– Education
– Social situation
• Intelligence
– Fluid intelligence
– Crystallized intelligence
Communication and Risk Management
• Understanding patients within their societal environment and culture is important to managing risk
– This will assist with communicating to the patient at their level of understanding
– Misunderstandings due to cultural or societal differences may be avoided with attention to proper communication
Why do Patients Sue?
Patients tend to sue when the Provider has caused them harm but also when they feel the Provider has:
• Deserted them
• Didn’t listen or devalued their view
• Didn’t give them necessary information or didn’t explain it
• Didn’t understand or acknowledge their perspective
Why is this important?
• Patients do not have the skills to
accurately identify ‘quality’ healthcare,
therefore they tend to view how they are
treated (customer service) as an indicator
of quality of care
Poor customer service = Poor quality
What are Patients looking for with litigation?
• Altruism
• Rationalization
• Recompense
• Accountability
Do unto others…
• Respect and civility can play a major role in risk management. Providers need to be civil and respectful of their patients' concerns by offering:
– Empathy
– Compassion
– Care
Cultural Awareness
• Providers need to have an understanding (sensitivity) of their patients’ backgrounds as cultural differences can lead to misunderstandings or non-compliance if not properly attended to.
– Cultural destructiveness
– Cultural incapacity
– Cultural blindness
Patient Empowerment
• Studies show that much patient dissatisfaction comes from deficient communication
• Empowering the patient to be an active participant in the provision of healthcare may lead to improved communication
• Programs are available to assist the patient in learning their role in the provision of healthcare:
– TJC: Speak Up program
– AHA: Patient's Bill of Rights
– Facility specific: Complaint/Grievance Process
Health Literacy
• Degree to which individuals have the capacity to obtain, process and understand basic health information.
– Approximately 1/3 of adults have basic or below basic skills for dealing with health material
• Health facilities must follow federal regulation to provide language services for those patients with limited English proficiency (LEP)
Informed Consent
• Informed Consent implies that the patient
understands the service to be rendered,
the risks involved and potential outcomes.
• Valid consent is given when the patient:
– Has been informed
– Is competent
– Has not been coerced
Why do risk managers care
about informed consent?
• Courts have decided that patients have a
right to control their own body and decide
about medical treatment
• An informed and educated patient is more
likely to have realistic expectations about
his condition/treatment
Patient Education
• Poor communication can increase patient non-compliance which can lead to harm
• Adherence to physician instructions can be improved with communication:
– Agree upon diagnosis through discussion
– Simplify regimen
– Written instructions in understandable language
– Motivate the patient to adhere to instructions
– Discuss potential risks, side effects and costs
Barriers to Patient Education
• Lack of time
• Health literacy of the patient
• Fear of materials being used against the provider
• Skepticism of patient’s ability to follow instructions
• Lack of adequate reimbursement
• Effects on the provider’s personal life
Difference of Opinion
• Due to their level of health literacy, patients and providers may see potential side effects or adverse reactions quite differently
– Provider sees an anticipated outcome
– Patient sees an error
• Physicians also have a different take on errors and tend to define them more narrowly
Disclosure
• Disclosure can show that the provider is not hiding anything and may serve to:
– Lessen the tendency to litigate
– Increase the tendency to settle
• Patients desire full disclosure of harmful errors:
– An acknowledgement that the error occurred
– What happened
– Why it happened
– Implication to patient’s health
– How it will be avoided in future
– An apology
Apologize
• If something has gone wrong, the patient has the right to an apology.
– Unfortunately, many providers are hesitant to do so out of concern that an apology would be an admission of guilt or wrongdoing
– Some states have enacted Apology laws which make physician apology inadmissible in court
• Disclosure, explanation and apology should come within a reasonable timeframe of the incident.
Key Issue – Patient Satisfaction
• Patients who are satisfied are less likely to
sue
• It has been found that anger, not injury, is the trigger for most malpractice claims.
• Studies suggest that empathy and good
interpersonal skills may decrease the
likelihood of malpractice claims
Consumer Information
• With the advent of the Internet and social media, healthcare consumers have access to a multitude of information, not only on healthcare topics but on their healthcare providers as well
• CMS website offers comparison of healthcare facilities based on reported quality indicators
• Accreditation also gives consumers information regarding their health facility providers
Issues with Web-based Information
• Consumer self-rating information is also available on the web though it is not vetted
• Social Media opens issues with confidentiality
• Courts have yet to set precedents regarding use of Social Media
Summary
• Appropriate and Positive Communication
is a valuable Risk Management Tool as it
can have an impact on patient satisfaction
– Dissatisfied patients are more likely to sue
even if there is no injury
– Satisfied patients with an adverse event are
less likely to sue
CHAPTER 4: Scheduling
© 2012 The McGraw-Hill Companies, Inc. All rights reserved.
Learning Outcomes
When you finish this chapter, you will be able to:
4.1 Describe the two methods used to schedule
appointments.
4.2 Explain the method used to classify patients as new
or established.
4.3 List the three categories of information new patients
provide during telephone preregistration.
4.4 Identify the information that needs to be verified for
established patients when making an appointment.
4.5 Describe covered and noncovered services under
medical insurance policies.
Learning Outcomes (Continued)
When you finish this chapter, you will be able to:
4.6 List the three main points to verify with the payer
regarding a patient’s benefits prior to a visit.
4.7 Explain when a preauthorization number or referral
document is required for a patient’s encounter.
4.8 List the four main areas of Medisoft Network
Professional’s Office Hours window.
4.9 Demonstrate how to enter an appointment.
4.10 Demonstrate how to book follow-up and repeating
appointments.
4.11 Demonstrate how to reschedule an appointment.
Learning Outcomes (Continued)
When you finish this chapter, you will be able to:
4.12 Demonstrate how to create a recall list.
4.13 Demonstrate how to enter provider breaks in the
schedule.
4.14 Demonstrate how to print a provider’s schedule.
Key Terms
• benefits
• capitation
• coinsurance
• copayment (copay)
• covered services
• deductible
• established patient (EP)
• fee-for-service
• health plan
• indemnity plan
• managed care
• medical insurance
• new patient (NP)
• noncovered services
• nonparticipating
(nonPAR) provider
• Office Hours break
• Office Hours calendar
• Office Hours patient
information
• out-of-network
• out-of-pocket
Key Terms (Continued)
• participating (PAR)
provider
• patient portal
• payer
• policyholder
• preauthorization
• preexisting condition
• premium
• preregistration
• preventive medical
services
• provider
• provider’s daily schedule
• provider selection box
• referral
• referral number
• schedule of benefits
4.1 Scheduling Methods
• Patient appointments may be scheduled via
telephone or online.
• Patient portal—secure website that enables
communication between patients and health
care providers for tasks such as scheduling,
completing registration forms, and making
payments
4.1 Scheduling Methods (Continued)
• Scheduling systems include these methods:
– Open hours
– Stream scheduling
– Double-booking
– Wave scheduling
4.2 New Versus Established Patients
• New patient (NP)—patient who has not
received professional services from a provider
(or another provider with the same specialty in
the practice) within the past three years
• Established patient (EP)—patient who has
received professional services from a provider
(or another provider with the same specialty in
the practice) within the past three years
• Preregistration—process of gathering basic
contact, insurance, and reason for visit
information before a new patient comes into the
office for an encounter
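The three-year rule in these definitions reduces to a date comparison. A minimal sketch, assuming for illustration that "three years" can be approximated as 3 × 365 days:

```python
from datetime import date, timedelta

THREE_YEARS = timedelta(days=3 * 365)  # approximation, for illustration

def patient_status(visit_date, last_service_date=None):
    """Classify a patient as 'new' (NP) or 'established' (EP).

    A patient is established if they received professional services
    from the provider (or a same-specialty provider in the practice)
    within the past three years; otherwise they are new.
    """
    if last_service_date is None:
        return "new"  # no prior professional services on record
    return "established" if visit_date - last_service_date <= THREE_YEARS else "new"

print(patient_status(date(2012, 5, 1), date(2010, 3, 15)))  # established
print(patient_status(date(2012, 5, 1), date(2008, 1, 10)))  # new
print(patient_status(date(2012, 5, 1)))                     # new
```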
4.3 Preregistration for New Patients
• During preregistration, new patients usually
provide three types of information:
– Demographic information
– Basic insurance information
– Reason for the visit (also known as the chief
complaint)
4.3 Preregistration for New Patients (Continued)
• Participating (PAR) provider—provider who
agrees to provide medical services to a payer’s
policyholders according to the terms of the
plan’s contract
• Nonparticipating (nonPAR) provider—
provider who chooses not to join a particular
government or other health plan
4.4 Appointments for Established Patients
• Medical offices verify established patients’
information prior to an appointment; such
information includes:
– changes to a patient’s address,
– changes to a patient’s health plan or employment.
• The reason for the visit should also be
established to schedule the correct amount of
time for the encounter.
• Patients’ account balances are checked as well.
4.5 Insurance Basics
• Medical insurance—financial plan that covers
the cost of hospital and medical care
• Policyholder—person who buys an insurance
plan; the insured, subscriber, or guarantor
• Health plan—individual or group plan that either
provides or pays for the cost of medical care
• Payer—health plan or program
• Premium—money the insured pays to a health
plan for a health care policy; usually paid
monthly
4.5 Insurance Basics (Continued)
• Benefits—amount of money a health plan pays
for services covered in an insurance policy
• Schedule of benefits—list of the medical
expenses that a health plan covers
• Provider—person or entity that supplies medical
or health services and bills for or is paid for the
services in the normal course of business
4.5 Insurance Basics (Continued)
• Covered services—medical procedures and
treatments that are included as benefits under
an insured’s health plan
– These may include primary care, emergency care,
medical specialists’ services, and surgery.
• Preventive medical services—care that is
provided to keep patients healthy or to prevent
illness, such as routine checkups and screening
tests
4.5 Insurance Basics (Continued)
• Noncovered services—medical procedures
that are not included in a plan’s benefits; these
things may include:
– Dental services, eye care, treatment for employment-related injuries, cosmetic procedures, infertility services, or experimental procedures
– Specific items such as vocational rehabilitation or
surgical treatment of obesity
– Prescription drug benefits
– Treatment for preexisting conditions—illnesses or
disorders of a beneficiary that existed before the
effective date of insurance coverage
4.5 Insurance Basics (Continued)
• Indemnity plan—type of medical insurance that
reimburses a policyholder for medical services
under the terms of its schedule of benefits
• Deductible—amount that an insured person
must pay, usually on an annual basis, for health
care services before a health plan’s payment
begins
• Coinsurance—portion of charges that an
insured person must pay for health care services
after payment of the deductible amount; usually
stated as a percentage
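The interaction of deductible and coinsurance is easiest to see with numbers. A hedged worked example (the plan terms are invented; real plans also involve copayments, out-of-pocket maximums, and allowed amounts):

```python
# Splitting one covered charge between patient and plan.
# All plan terms and amounts below are hypothetical, for illustration.

def patient_responsibility(charge, deductible_remaining, coinsurance_rate):
    """Split a covered charge between the patient and the health plan.

    The patient first pays any remaining deductible; the coinsurance
    percentage then applies to the balance of the charge.
    """
    toward_deductible = min(charge, deductible_remaining)
    after_deductible = charge - toward_deductible
    coinsurance = after_deductible * coinsurance_rate
    patient_pays = toward_deductible + coinsurance
    plan_pays = charge - patient_pays
    return patient_pays, plan_pays

# $1,000 covered charge, $300 of the annual deductible still unmet,
# 20% coinsurance after the deductible.
patient, plan = patient_responsibility(1000, 300, 0.20)
print(patient, plan)  # 440.0 560.0
```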
4.5 Insurance Basics (Continued)
• Out-of-pocket—expenses the insured must pay
before benefits begin
• Fee-for-service—health plan that repays the
policyholder for covered medical expenses
• Capitation—prepayment covering provider’s
services for a plan member for a specified
period
4.5 Insurance Basics (Continued)
• Managed care—system that combines the financing and delivery of appropriate, cost-effective health care services to its members; basic types include:
– Health maintenance organizations (HMOs)
– Point-of-service (POS) plans
– Preferred provider organizations (PPOs)
– Consumer-driven health plans (CDHPs)
• Out-of-network—provider that does not have a
participation agreement with a plan
4.5 Insurance Basics (Continued)
• Preauthorization—prior authorization from a
payer for services to be provided
• Copayment (copay)—amount that a health plan
requires a beneficiary to pay at the time of
service for each health care encounter
• Referral—transfer of patient care from one
physician to another
4.6 Eligibility and Benefits Verification
• Except in a medical emergency, the following
information should be obtained/verified from a
patient’s health plan before an encounter:
– Patient’s general eligibility for benefits
– Amount of the copayment for the visit, if one is
required
– Whether the planned encounter is for a covered
service that is medically necessary under the payer’s
rules
• Patients should be informed if their policy does
not cover a planned service.
4.7 Preauthorization, Referrals, and Outside Procedures
• Managed care payers often require
preauthorization before a patient:
– sees a specialist,
– is admitted to the hospital, or
– has a particular procedure.
• If the payer approves the service, it issues a
preauthorization number that must be entered in
the PM and included on the claim.
• Referral number—authorization number given
by a referring physician to the referred physician
4.8 Using Office Hours—Medisoft Network Professional's Appointment Scheduler
The Office Hours window contains four main
areas:
– Provider selection box—selection box that
determines which provider’s schedule is displayed in
the provider’s daily schedule
– Provider’s daily schedule—listing of time slots for a
particular day for a specific provider that corresponds
to the date selected in the calendar
– Office Hours calendar—interactive calendar that is
used to select or change dates in Office Hours
– Office Hours patient information—area that
displays information about the patient who is selected
in the provider’s daily schedule
4.9 Entering Appointments
To enter an appointment in Medisoft Clinical:
– Select the appropriate provider from within the Office
Hours program.
– Choose an appointment time slot.
– Complete the fields in the New Appointment Entry
dialog box.
– Click the Save button to enter the information on the
schedule.
4.10 Booking Follow-up and Repeating Appointments
• To create follow-up appointments in Office
Hours:
– Click the Go to a Date shortcut button on the toolbar;
the Go To Date dialog box will be displayed to allow a
choice of date.
– After a future date option is selected, click the Go
button to close the dialog box and begin the search.
– The future date will be located and displayed in the
calendar schedule accordingly.
4.10 Booking Follow-up and Repeating Appointments (Continued)
• To create repeating appointments in Office
Hours:
– Open the New Appointment Entry dialog box.
– Click the Change button; the Repeat Change dialog
box is displayed.
– Make selections and enter information in the Repeat
Change dialog box.
– When done, click the OK button, and then the Save
button, to enter the repeating appointments on the
schedule.
4.11 Rescheduling and Canceling Appointments
To locate an appointment that needs to be
rescheduled:
– Click the Appointment List option on the Office Hours
Lists menu; the Appointment List dialog box appears.
– Use the Cut and Paste commands to move an
appointment.
– Use the Cut command to cancel an appointment.
4.12 Creating a Patient Recall List
To create or maintain a recall list in MNP:
– Click Patient Recall on the Lists menu; the Patient
Recall List dialog box is displayed.
– Patients are added to the recall list by clicking the
New button in the Patient Recall List dialog box or by
clicking the Patient Recall Entry shortcut button; the
Patient Recall dialog box is displayed.
– After the information has been entered in the dialog
box, click the Save button.
4.13 Creating Provider Breaks
• Office Hours break—block of time when a
physician is unavailable for appointments with
patients
• To set up a break for a current provider:
– Click the Break Entry shortcut button; the New Break
Entry dialog box will appear.
– Enter the information in the dialog box, and click the
Save button to enter the break(s).
4.14 Printing Schedules
• To print a provider’s schedule within Office
Hours:
– Use the Appointment List option on the Office Hours
Reports menu to view a list of all appointments for a
provider for a given day.
– The report can be previewed on the screen or sent
directly to the printer.
• Alternatively, click the Print Appointment List
shortcut button.
CHAPTER 3: Introduction to Medisoft Clinical
Learning Outcomes
When you finish this chapter, you will be able to:
3.1 List the practice management and electronic health
record applications in Medisoft Clinical.
3.2 Discuss three security features in Medisoft Clinical
that protect patients’ health information.
3.3 List the menus in Medisoft Clinical Patient Records.
3.4 List the menus in Medisoft Network Professional.
3.5 Describe how pre-encounter tasks are completed in
Medisoft Clinical.
3.6 Describe how encounter tasks are completed in
Medisoft Clinical.
Learning Outcomes (Continued)
When you finish this chapter, you will be able to:
3.7 Describe how post-encounter tasks are completed in
Medisoft Clinical.
3.8 Explain how to create and restore backup files in
Medisoft Clinical.
3.9 Discuss the types of help available in Medisoft
Clinical.
Key Terms
• access levels
• Auto Log Off
• backing up
• chart
• chief complaint
• dashboard
• database
• disaster recovery plan
• knowledge base
• Medisoft Clinical
• Medisoft Clinical Patient
Records (MCPR)
• Medisoft Network
Professional (MNP)
• park
• password
• restoring
• user name
3.1 Medisoft Clinical: A Practice Management/Electronic Health Record Program
• Medisoft Clinical—integrated practice
management (PM) and electronic health record
(EHR) program
• Medisoft Network Professional (MNP)—
practice management application within Medisoft
Clinical
• Medisoft Clinical Patient Records (MCPR)—
electronic health record application within
Medisoft Clinical
3.2 Security Features in Medisoft Clinical
Medisoft Clinical has a number of built-in security features:
• User name—name that an individual uses for
identification purposes when logging onto a
computer or an application
• Password—confidential authentication
information
• Access levels—security option that determines
the areas of the program a user can access, and
whether the user has rights to enter or edit data
3.2 Security Features in Medisoft Clinical (Continued)
• Park—privacy and security feature in MCPR that
allows a user to leave a workstation for a brief
time without having to exit the program
• Auto Log Off—feature of MNP that
automatically logs a user out of the program
after a period of inactivity
3.3 Medisoft Clinical Patient Records
• Standard menu items in MCPR include:
– File
– View
– Task
– Maintenance
– Reports
– Window
– Help
3.3 Medisoft Clinical Patient Records (Continued)
• Database—collection of related bits of
information
• Chart—folder that contains all records
pertaining to a patient
• Dashboard—panel in MCPR that offers
providers a convenient view of important
information
3.4 Medisoft Network Professional
• Names of the menus in MNP are listed on the
menu bar, and include:
– File
– Edit
– Activities
– Lists
– Reports
– Tools
– Window
– Help
3.5 Using Medisoft Clinical to Complete Pre-Encounter Tasks
• Pre-encounter steps include preregistration and
appointment scheduling.
– To enter preregistration information about a new
patient, click the New Patient button, and complete
the Patient/Guarantor dialog box.
– To enter an appointment in Office Hours, select a
provider, select a date and time slot, and save.
• Chief complaint—patient’s description of the
symptoms or reasons for seeking medical care
3.6 Using Medisoft Clinical to Complete Encounter Tasks
• Encounter steps include all activities that take
place from the patient’s arrival until the patient’s
departure from the office, such as:
– Establishing financial responsibility—real-time
insurance eligibility can be checked
• Insurance information is entered in one or more of the Policy
tabs in the Case folder in MNP.
– Check-in—reviewing account balance, updating
patient information, recording documentation and
examination findings
• The Patient/Guarantor dialog box is updated as needed.
• SOAP notes are recorded.
3.6 Using Medisoft Clinical to Complete Encounter Tasks (Continued)
• Encounter steps (continued):
– Coding—assigning codes based on the services
provided and the provider’s determination
• In MCPR, codes are selected from lists provided on an
electronic encounter form.
– Checkout—payments are calculated and posted,
follow-up appointments and tests are scheduled,
materials are dispensed, and referrals are provided
• The Unprocessed Charges dialog box in MCPR is used to
post and review charges.
• Scheduling is performed in Office Hours.
• Referrals and prescriptions are created within MCPR.
3.7 Using Medisoft Clinical to Complete Post-Encounter Tasks
• After the patient visit is complete, activities focus
on payment for services, including:
– Preparing and transmitting claims
• In MNP, claim functions are located on the Activities menu.
• The Claim Management dialog box is used for current claims
and to create new claims.
• Claims are transmitted through MNP’s Revenue
Management feature.
– Monitoring payer adjudication
• Payer adjudication is tracked using the Deposit List window.
• Charges are applied in the Apply Payment/Adjustment to
Charges dialog box.
3.7 Using Medisoft Clinical to Complete Post-Encounter Tasks (Continued)
• Activities focused on payment for services
(continued):
– Generating patient statements
• In MNP, the Statement Management option on the Activities
menu contains options for creating and printing patient
statements.
• Selections in the Create Statements dialog box determine
which statements will be created.
– Following up on payments and collections
• In MNP, collection functions are located on the Activities
menu and on the Reports menu.
• The Collection List feature on the Activities menu is used to
place an account in collections.
3.8 Backing Up and Restoring Files
• Disaster recovery plan—plan for resuming
normal operations after a disaster such as a fire
or a computer malfunction
• Backing up—making a copy of data files at a
specific point in time that can be used to restore
data
– In MNP, the Backup Data option on the File menu can
be used to make a backup copy of the database.
• Restoring—process of retrieving data from a
backup storage device
– Files are restored using the Restore Data feature on
the File menu.
3.9 The Medisoft Clinical Help Feature
• MNP and MCPR offer built-in and online help
files.
– The built-in help feature is accessed via the Help menu.
– The Help menu also provides access to help available
online at the MNP website.
• Knowledge base—collection of up-to-date
technical information
Writes of passage: writing up qualitative data as a threshold concept in doctoral research
Robin Humphrey a* and Bob Simpson b
a Faculty of Humanities and Social Sciences, Newcastle University, Daysh Building, Newcastle upon Tyne, UK; b Department of Anthropology, Durham University, Dawson Building, South Road, Durham, UK
(Received 4 July 2011; final version received 14 March 2012)
Effective writing is an essential skill for all doctoral students, yet it is one that receives relatively little attention in training and supervision. This article explores extensive feedback from participants in a series of workshops for doctoral candidates engaged with writing up qualitative data. The themes arising from the data analysis are discussed in terms of the affective domain of writing, and the main claim is that writing up qualitative data has been identified as what Meyer and Land would call a threshold concept for doctoral candidates employing qualitative analysis. Drawing on Turner’s notion of liminality, the article concludes that interdisciplinary workshops can be instrumental in helping doctoral candidates understand the role of writing, and of writing up qualitative data in particular, in their development into independent, autonomous researchers.
Keywords: doctoral education; writing groups; qualitative research methods; threshold concepts; interdisciplinary study programme
Introduction
The impetus for the Writing Across Boundaries (WAB) project, about which we write
in this article, was the observation that writing poses particular challenges for
doctoral students in general and for those using qualitative data in particular. The
problem is most acute at the point where data collection has ended, writing begins in
earnest and the deadline for completion begins to loom. Both authors had this
experience when writing their own theses, have dealt with it as supervisors and
recognise it in the experience of others. When devising the WAB project, we were
both immersed in doctoral matters through our formal Faculty-level duties in our
respective universities, which allowed us to take a broad look at writing support
across the social sciences. We realised that the difficulties were being experienced not
just by doctoral candidates from our own disciplines of Anthropology and Sociology,
but by researchers from every social science discipline, and indeed from disciplines in
the natural and medical sciences and the humanities.
Whilst much research training goes into preparing doctoral students in the early
stages of their Ph.D. careers, relatively little attention has been paid to writing
post-fieldwork, which, for reasons we will outline later, we continue to call
the ‘writing-up’ stage despite much recent criticism of the term (Badley 2009;
*Corresponding author. Email: [email protected]
Teaching in Higher Education
Vol. 17, No. 6, December 2012, 735–746
ISSN 1356-2517 print/ISSN 1470-1294 online
© 2012 Taylor & Francis
http://dx.doi.org/10.1080/13562517.2012.678328
http://www.tandfonline.com
Thomson and Kamler 2010), and to how doctoral researchers might be helped in this
endeavour. The WAB project was an attempt to address this issue in practical ways
by offering doctoral students help in negotiating what can often appear a very ‘scary
gap’ in their doctoral training and one which has hitherto been self-negotiated (Simpson and Humphrey 2008, 2010).
Our interest in writing and analysis contributes to the development of the
post-Roberts agenda for UK doctoral training, and in particular to the newly launched
Researcher Development Framework (RDF) which seeks to articulate ‘the knowledge,
behaviours and attributes of effective and highly skilled researchers’ (Vitae
2011). The RDF is structured in four domains, within each of which are three
sub-domains and associated descriptors. Domain A encompasses the knowledge,
intellectual abilities and techniques required to produce work of a professional academic standard, and the three sub-domains are Knowledge Base (A1), Cognitive
Abilities (A2) and Creativity (A3). We were keen to understand writing up qualitative
data as a distinctive synthesis of these three sub-domains, and to situate our analysis
within the broad context of doctoral training. In short, we were not so much
interested in the ‘how to’ approach to writing, but were concerned rather to cultivate
a reflexive awareness of what happens for a doctoral student when they begin to write
up qualitative data and why this happens.
In the account that follows we provide an analysis of a body of data that was collected at three of the annual WAB workshops. This data provides insight into
the kinds of impediments that doctoral students encounter when writing up
qualitative data for incorporation into a thesis. Furthermore, the very positive
response that the workshops received gave us some important clues as to the place
of writing in the doctoral process and how the impediments might be more
effectively addressed.
Analysis of the data revealed significant cross-cutting themes which are
pertinent when it comes to understanding the process of doctoral study. The two themes which we discuss here relate to debates about the affective domain in the
writing process and the identification of writing up qualitative data as a threshold
concept in doctoral research. This last theme highlights what we see as an important
point in academic pedagogy and one which is critical for the development of
autonomous, professional researchers.
Situating our analysis in the literature
The themes outlined above were generated from our data, rather than derived prior
to the data analysis from the substantial body of literature that is now available on
the writing of doctoral theses. However, once the themes were identified, we then
sought to link them to concepts and discourses prevalent in the contemporary
literature.
As our workshop participants came from so many different disciplines, we turned
first to the work of Bernstein (1990, 2000), and in particular his typology of
knowledge structures in which the vertical knowledge structures of the natural sciences are distinguished from the radically different horizontal knowledge
structures found in the social sciences. Social science disciplines, Bernstein
concluded, share a common conceptual core although the boundaries between
them are far less rigid than are the boundaries between disciplines in the natural
sciences. As we will show, this differentiation between the natural and the social
sciences helped us to appreciate the workings of our interdisciplinary workshops in a
new light.
We then searched the burgeoning literature which addresses academic writing at doctoral level. Some of these texts concentrate on publishing pedagogy (Aitchison,
Kamler, and Lee 2010; Belcher 2009), while others focus on particular aspects of the
doctoral writing process, such as dissertation proposals (Krathwohl and Smith 2005)
and literature reviews (Kamler and Thomson 2006a; Machi and McEvoy 2008).
There is also an established research tradition which explores the ways in which
graduate students learn writing conventions in disciplinary settings (McAlpine, Paré,
and Starke-Meyerring 2008; Prior 1998). There were two bodies of work, however,
that had direct relevance to our themes: work addressing the relationship between text work and identity construction in doctoral research (Kamler and Thomson
2006a, 2006b); and work addressing the incorporation of the notion of threshold
concepts into the realm of doctoral pedagogy (Kiley 2009; Meyer and Land 2006).
Through their treatment of doctoral writing as a complex, institutionally
constrained social practice, rather than simply a set of skills and competences,
Kamler and Thomson (2006b, 2) employ the notion of ‘research as writing’, and seek
to remedy the situation where the development of scholarly writing has become a
major site of anxiety for doctoral candidates, and their supervisors. Their call for universities to address more seriously the question of research writing and to
establish ‘institutional writing cultures’ (144) will be addressed elsewhere. The aspects
of their work that we draw on here are the connections they make between academic
writing practices and the formation of the ‘doctoral researcher’.
Recognising the relationship explored by Kamler and Thomson between
successful doctoral writing and the development of the doctoral candidate’s identity
as an academic researcher allowed us to make the link between our findings and the
emerging literature on threshold concepts in doctoral research. Kiley (2009) argues that doctoral candidates undertake a series of rites of passage during their
candidature, and that there are times during their research education when they
demonstrate that they have undergone a change in the way they see themselves and
their research work. These changes, she argues, are the result of the candidate first
encountering, and then successfully crossing, a threshold which is critical for the
furtherance of the doctoral research process. The identification of discipline-specific
threshold concepts has been developed as a way of differentiating between core
learning outcomes that represent ‘seeing things in a transformed way’ and those that do not (Kiley and Wisker 2009; Meyer and Land 2006). A threshold concept is seen
as distinct from other core learning outcomes because ‘once grasped, [it] leads to a
qualitatively different view of the subject matter and/or learning experience and of
oneself as a learner’ (Kiley and Wisker 2009, 432).
We shall explore these concepts further when we discuss our findings, but first we
will outline briefly the content of our workshops and our methodological approaches
of generating feedback from the participants.
The workshops
The WAB workshops were the centrepiece of a project funded by the UK’s
Economic and Social Research Council as a part of its Researcher Development
Initiative.¹
Each two-day workshop was residential and comprised five participative
sessions:
i. An introduction by the organisers, including a panel conducted by recently successful doctoral researchers who reflected on their strategies for writing up qualitative data in their theses.
ii. Ordering text – delivered by a psychologist who is also a creative writer.
iii. Analysing the relationship between text and representation – delivered variously by a sociologist with an interest in narrative, and an anthropologist who is also a poet.
iv. Rhetoric and narrative in qualitative writing – delivered by a social anthropologist.
v. Data and theory – delivered by experienced and widely published qualitative researchers in Sociology and Education.²
The structure of the workshops did not change from their inception, largely
owing to the positive feedback and reinforcement gained after the first, and each
subsequent, workshop. The first two workshops were regional, and open to
applicants from the five universities in the North-East of England. The third
workshop was opened up to applicants from any university in the UK, and the fourth workshop was advertised throughout Europe.
Information about the workshop and details about applications were disseminated
via email distribution lists and via the WAB project website.³
The criteria used
for selection were that the doctoral candidate should have completed their fieldwork
and should be writing a thesis based in part or entirely on qualitative data. The
application form had to be submitted by the candidate’s supervisor, who was asked
to make a case explaining why the candidate would benefit from the workshop. Out
of a total of 237 applicants for the four workshops, 156 participants were drawn from 26 UK universities and, in the fourth workshop, from universities in the Netherlands,
Poland, Belgium, the Irish Republic and the Czech Republic.
The workshop participants were drawn from all but four of the 19 ESRC social
science disciplines. Although the majority of the participants were social scientists,
the workshops attracted some who had been trained and were located in the natural
sciences (particularly environmental science), the medical sciences (including health
services research, midwifery, physiotherapy, general medical practice, nursing and
pharmacy) and the humanities (including modern languages, history and design). Participants brought with them experience of a wide range of qualitative
methodologies, the most common being interviews (70%) and participant-observation
(42%). Most researchers (71%) were employing a combination of qualitative
methods, and some (12%) were combining qualitative analysis with that of
quantitative data.
Methodology
The strategy adopted for the formal evaluation of the workshops had three stages,
with each stage employing a different methodological approach in order to generate
multiple perspectives on the workshops. For stage one, two Ph.D. students attended
the workshop as participant observers. They took notes, and discussed what was
going on with participants both during the workshop sessions and in the less formal
periods during the residential weekend. Short reports were produced by the participant
observers following the workshops, which provided the foundation for debriefing
sessions where impressions and reflections were discussed with the project leaders.
For stage two, all the participants were asked to fill in a short questionnaire soon
after the workshop. This was filled in remotely on Durham University’s online
learning support platform, Blackboard. The questionnaire was designed to capture
immediate impressions of the workshop, and included a series of closed questions asking participants to rank their responses on five-point Likert scales, and some
open ended questions asking what they liked best about the workshop and how they
thought it could be improved. The overall response rate to the post-workshop online
questionnaire across the three years was 82%, and the responses generated both
quantitative and qualitative data. Significantly, the qualitative data was unusually
detailed and thoughtful by the standards of online feedback, and went beyond the
initial expectations of the project organisers.
For stage three, semi-structured telephone interviews were carried out six months
after the workshop. These were conducted not only with participants but also with
their supervisors to assess whether there had been any longer term impacts arising
from the workshop, rather than just a short-term ‘glow’. The response rates for stage
three were 53% for students and 50% for their supervisors.
These three exercises were repeated for each of the first three workshops, and
produced a longitudinal data set comprising a rich mix of quantitative and qualitative
data, the latter produced via ethnographic, self-completion and interview-based
approaches. All the data were processed and stored electronically on the software
package NVivo, in preparation for analysis. The breadth and depth of the data set
allowed emergent themes to be traced across the three cohorts of participants, and for
conclusions to be drawn with stronger claims to rigour and generalisability than is
often the case with qualitative studies based on single context or cohort studies.
From feedback to analysis
In terms of feedback on the success of the workshops, the data collected was
overwhelmingly positive. In the online quantitative feedback collected immediately
after the workshops, with the results for the first three years combined,
participants indicated through their ratings on the five-point Likert scales not only
that they had found the workshops very enjoyable (90%), but also that the workshops had
greatly increased their confidence in their ability to write up their Ph.D. (90%) and
had been very useful in helping them to develop strategies for writing
up qualitative data (91%). These results reassured the organisers that the workshops
were, at least in the short term, effective and that the project was largely meeting its
original aims. These results were further reinforced by a great many comments from
workshop participants pointing to how their attitude to writing had been changed
positively:
The impact [of the workshop] has been phenomenal. I was losing sleep before but when I came back I got straight on to it and wrote reams and reams, so it was like opening a floodgate – it gave me the opportunity to move on as a writer. (Doctoral Candidate in Education, Workshop 2)
Such responses were enormously gratifying, but made us curious as to why the
workshops were so successful. We would like to think that they were well organised
and presented, but there was a sense that we had touched on something that was
much more fundamental and dynamic among the groups that we had convened. Beyond specific organisational factors, we first turned to Bernstein’s typology of
knowledge structures to help us deepen our understanding of what had transpired in
the workshops. Bernstein (1990, 2000) contrasts the vertical knowledge structures of
the natural sciences with the horizontal knowledge structures found in the social
sciences. In the latter, disciplines share a common conceptual core and the
boundaries between them are weak and porous. Thus, bringing together 30–40 doctoral students created a significant opportunity for lateral communication to take
place and, as we go on to illustrate, for participants to use these encounters to generate positive momentum for themselves; in many respects we were merely
providing the crucible in which certain kinds of reactions could take place.
A key catalyst in these reactions was the sharing of broad methodological
approaches. These overlaps facilitated moves out of the vertical, disciplinary
knowledge domains which some participants brought to the workshop. What we
inferred from these moves was that focusing on a common processual problem for
doctoral students, in this case the writing up of qualitative data, can be profitably
undertaken where a mix of disciplines and methodologies is brought together. An important feature of the workshop in this respect was the academic level of
the doctoral participants, all of whom had already acquired threshold concepts at
undergraduate level in a wide range of academic disciplines (Cousin 2006). The
workshops were therefore characterised by a high level of cross disciplinary
exploration and boundary crossing (Engeström, Engeström, and Kärkkäinen 1995)
which, as we shall see below, constituted a rich resource from which participants
could draw in deepening their learning experience.
The participants who were crossing the most difficult boundaries were those who had been educated in the natural sciences, for whom cross-disciplinary moves of the
kind we discuss here were unfamiliar. For them, the intellectual task of writing up
qualitative data was particularly challenging:
As someone with a natural science background, qualitative data is still new to me and analysing and writing up ‘words’ rather than numbers is a daunting process. (Doctoral candidate in Environmental Science, Workshop 3)
Observing interactions at the workshop, gathering post-workshop impressions
and subsequently interviewing students and their supervisors gave us important
insights into the experience of writing at this critical phase in the doctoral process
and enabled us to illuminate some of the still largely uncharted areas of doctoral
research training.
Acquiring confidence and self belief: the affective domain in the writing process
Issues of confidence are evidently key when it comes to writing up qualitative data. This message was the clearest to emerge from the analysis of the feedback data.
There were 196 explicit references to acquiring confidence in writing from the
workshops in the anonymous online feedback and in the transcripts of the 63 phone
interviews subsequently conducted with participants, and many more comments
where it was strongly implied. The context in which this message was expressed,
however, took many forms.
As we have seen earlier, research using qualitative methods is now carried out by
doctoral candidates in disciplines outside of the social sciences. Many participants
from such disciplines referred to changes in their confidence levels in writing about
approaches which may well be deemed marginal in their academic environments:
Interestingly, one of the organisers said my approach was similar to an anthropologist’s approach. It was useful to find this out – finding out that it was acceptable to do what I was doing. That was really good. And good to speak to people with diverse backgrounds and find out that there’s lots of ways of doing it and you do what you need to, to fit the purpose. (Doctoral candidate in Design, Workshop 1)
Confidence through increased knowledge and understanding of the qualitative
research process was keenly felt by participants from disciplines where quantitative
research is dominant and follows scientific models for enquiry and data presentation:
I can’t explain very well but I knew the way I was writing in this rigid scientific structure wasn’t right for my data. Obviously I’m still doing active experimental research as well as writing up. Although I have one supervisor pushing it a lot more experimentally, I have the confidence now to say ‘no I have this [qualitative] data’ and it is an important part of the research too ... I think the difficulty I’ve had is with my setting, I’m in the medical setting. I feel more constrained, more reined in I think because you have this idea that it has to be rigorous and that means to write like this, only in this certain ‘scientific’ way. But seeing how other people were doing this, I thought ‘no, this works, this could work for me’. (Doctoral candidate in Physiotherapy, Workshop 3)
This greater appreciation of the nature of qualitative research was expressed well
by the General Medical Practitioner quoted below, who prior to the workshop had
clearly struggled with the differences between qualitative and quantitative data and,
more specifically, with what this meant in a context increasingly dominated by
evidence-based paradigms:
I am committed to qualitative research, but come from a discipline very closely allied with biomedicine, and this workshop . . . has given me the confidence to trust my data. There are many decisions which researchers have to make at all stages of the process. The writing stage is no different. I have the confidence to know that I don’t have to include everything in the thesis, I have to make judgments. It was helpful to consider the words ‘illustration’ and ‘explanation’, rather than ‘evidence’. As a health care professional, this word comes back to haunt us on a daily basis. (Doctoral candidate in General Medical Practice, Workshop 3)
This anxiety about writing up qualitative data was not dependent on the
innovative adoption of qualitative research methods by participants in disciplines
dominated by other methodological approaches, since there were comments from
doctoral candidates in Sociology, Anthropology and Human Geography showing
that familiarity with and acceptance of qualitative research within a discipline can
produce its own pressures:
As you know Anthropology has been talking representation for the last 20 years, so I worry more about how to write, how do I do this? But that was a good thing about the
workshop; it was very good at bringing confidence to me. I was anxious about writing. (Doctoral candidate in Anthropology, Workshop 2)
The emergence of confidence as a strong theme in our data analysis provides
powerful corroboration for claims that writing about qualitative data and analysis is
more than simply a matter of technical ability, but is a process in which the writer’s
attitudes and feelings about writing also play a significant part. In our view, paying
attention to this relationship is key to the development of skills in analysis and
communication for the aspiring social scientist. Following Bloom’s taxonomy of goals within education systems (Bloom 1956),
Wellington (2010) argues that the cognitive domain of skill and knowledge
development has dominated thinking on developing writing ability to the detriment
of understanding the role of affect in the writing process. Considering the affective
domain causes us to reflect on the role of feelings and emotions in learning and
teaching, yet it is a domain that has tended to be neglected in postgraduate
education, where academics have, perhaps, underestimated the extent to which
doctoral candidates need help with confidence, motivation and inspiration (Lillis and Turner 2001; Wellington 2010).
Our evidence suggests strongly that academic writing is not a skill that we can
assume is inbuilt and simply develops autonomously and individually (Nightingale
1988; Wellington 2010). Difficulties with writing are magnified when coupled with
the analysis of qualitative data. This distinctive post-fieldwork activity for
researchers working with qualitative data has its own complexities and difficulties
(Silverman 2010; Wolcott 2009), as the interplay of writing and analysis requires the
bringing together of rigorous data analysis and the nuanced and rhetorical use of language.
The anxieties provoked when trying to engage with this aspect of the research
process in a doctoral thesis are likely to be exacerbated by current thinking that
writing should start on ‘day one’ of a thesis, and that the notion of ‘writing up’ is
outmoded and potentially dangerous as it implies that writing only happens at the
end of the doctoral cycle (Badley 2009; Kamler and Thomson 2006b). While we
agree with Badley that the term ‘writing up’ is problematic since it can fail to convey
the nature of good academic writing (‘a problematical and tentative exercise in critical reflective thinking’, Badley 2009), our evidence suggests that this conflation
of ‘writing’ with ‘writing up’ can mask some important distinctions between the
mechanics of writing (literature reviews, accounts of methodology, contextualisation,
etc.), reflexive and reflective writing (in the form of diaries and fieldwork logs) and
‘writing up’ (the final synthesis of information and experience in the form of a
thesis). All of these stages of writing are important, and particularly so for
qualitative researchers, who typically produce words that describe words, rather
than words that describe numbers. Enabling students to be clearer about how the varieties of writing relate to each other was an important outcome of the workshop
and one which led us to think more analytically about writing as a threshold concept.
Writing up qualitative data as a threshold concept in doctoral research
Just as we were able to identify some of the affective problems associated with
writing, we were also interested to note some of the affective solutions that the
workshop generated. There were many comments that referred to a new feeling of
being able to overcome the intellectual and emotional challenges of post-fieldwork/
data-collection writing, and especially of having glimpsed new perspectives that
might impart the confidence to try new approaches and pursue more creative
directions:
I was scared before I got there, I felt challenged about things I didn’t know about, that I wouldn’t know enough. But when I got there I wasn’t intimidated at all, everyone was very willing to share. It gave me permission to let my creative intuition take me forward. To stop being worried that it’s not scientific or academic enough. But it just comes. They say that in the text books, you know, just write, but the workshop let me do that. I feel like I’m on a race course and all these hurdles keep popping up in front of me and I jump this one, and then that one, but the last one’s in sight and I’m heading for the finish line. (Doctoral candidate in Gerontology, Workshop 3)
The reference above to being given ‘permission’ is echoed in many comments, and
there were also references to ‘feeling liberated’ and ‘freed up’ after the workshop:
Definitely [had positive benefits from the workshop], because now I’m writing up – my thinking has changed – I feel freed up to write up differently than perhaps I might have before. I felt liberated after the workshops as I could write up more like myself. I think the workshop liberated me to write up as me. (Doctoral candidate in Health Care, Workshop 1)
The idea that participants were somehow deriving a sense of liberation and
perhaps even feeling that a kind of ‘permission’ was being given was as puzzling as it
was gratifying. We were curious as to where the authority for this licence to ‘write up
as me’ issued from as it was certainly not what we had in any way planned or
intended and a deeper reflection on what is happening here is instructive. In
particular, repeated reference to the idea of ‘voice’ gives some clue as to where
impediments may lie:
I am much more concerned about using my own voice, much more confident that I can write in my own voice, that it is distinctive. I’m more confident about doing that. My colleagues are always talking about making an original contribution. A big part of your original contribution is the way you communicate . . . I have more sense that I can put my own stamp on this now, put in my own voice. (Doctoral candidate in Nursing, Workshop 3)
The interviewer of this participant noted that he had used the phrase ‘finding my
own voice’ several times, and commented that for many doctoral candidates she had
spoken to ‘there seems to be a sense of coming to the final year of the Ph.D. having
spent so much time with other people’s words that they are unsure of how to find
their own words, or what standing or role they take in the final script’.
All of this is strong evidence that the workshops have been working around what
Kiley refers to as a threshold concept in doctoral research, in this case research that
employs qualitative methodology. Following Meyer and Land (2006), Kiley (2009)
adapts Turner’s notion of liminality (1979) and suggests that prior to crossing a
threshold of understanding doctoral candidates can enter a liminal space, in which
some can experience being ‘stuck’ for some time. Whether there is a single moment in
which one discovers one’s own voice and in a single epiphany acquires belief in one’s
own potential, thereby becoming ‘unstuck’, is questionable. However, it did seem
that through the medium of the workshops we had been able to create the space for
this kind of liminality. Drawing on the work of the anthropologist Victor Turner
(1967, 1969, 1974), we understand liminality as a place and a time which is outside of
the conventional structures of process and in which there is the opportunity to
engage in play and experimentation in relation to values and assumptions that might
otherwise be constrained by the structure and conventions that prevail in other
contexts. As Turner famously put it, the liminal is culture in the subjunctive mood,
that is ‘the mood of maybe, might be, as if, hypothesis, fantasy, conjecture, desire – depending on which of the trinity of cognition, affect, and conation is situationally
dominant’ (Turner 1986, 42).
In the face of the challenges of writing up data collected using qualitative
methodologies, the workshop appeared to provide a crack or interstice within which
a deeper reflection on self, writing and the doctoral process became possible.
Crucially, what doctoral students were able to create within this space was a boost to
confidence and self belief which would enable them to successfully cross a significant
threshold in writing up their qualitative data and take an important step towards
becoming academic researchers in their own right.
Conclusion
Analysis of participant feedback has not been presented simply to impress with the
evident success of the WAB workshops – the scale of the success of which came as a surprise to the organisers and, hopefully, offered reassurance to the project’s funders,
the ESRC. Rather, the feedback analysis has provided insights into a crucial stage of
the doctoral cycle.
We believe that our data have provided evidence for the claim that the writing up
of qualitative data is a threshold concept in this form of doctoral research, and that
achieving this is challenging for most doctoral researchers. We would contend that, in
enabling this to happen, training of the kind reported on here can play a crucial role.
As we have seen, confidence is key to taking control of the thesis as a textual
synthesis of data, theory and experience and requires the bringing together of the
skills and expertise underpinning all of the three sub-domains of Domain A of the
RDF, Knowledge and Intellectual Abilities: sound academic knowledge, cognitive
abilities and creativity (Vitae 2011). Acquiring the confidence to achieve this appears
to be significantly helped by removing the doctoral candidate for a short, but intense
time from their established environments of supervisors, immediate peers and
disciplinary arrangements. Our evidence confirms the pedagogical benefits for
doctoral candidates of breaking out of their disciplinary and institutional homes
in order to spend focused time with their peers and fellow travellers.
The fact that the workshops were purposefully multi-disciplinary, focusing more
on the nature of writing up qualitative data than on disciplinary perspectives or
processes, was undoubtedly important for the experience of the workshop
participants. However, it was also important for the potential generalisability of
the analytic conclusions presented here. In those disciplines that employ qualitative
methodologies, at least, the moment when the doctoral candidate begins to analyse
and write up their data is often a defining one in the move from novice to
independent social researcher.
Acknowledgements
The authors would like to thank all the participants in the WAB workshops, both for their participation and for the time they gave us in providing the feedback which gave us both the data and the inspiration for this article. We would also like to thank their supervisors, who initially nominated them for a place on the workshops and then allowed us to interview them over the phone six months later. We are indebted to Carolyn McAlhone and Clare Hardy, from the Graduate School at Durham University, for magnificent administrative help with the workshops over the years. We thank, and acknowledge the work of, the doctoral students who helped with the workshops and data processing, Victoria Wood, Mwenza Blell, Alison Jobe, Sally Atkinson and Rachel Douglas Jones, and acknowledge with thanks the help with NVivo given by Dr Jane Wilcockson, and the insightful advice given by Dr Stan Taylor, Director of the Centre for Research and Academic Practice at Durham University, regarding an early draft of this article.
Notes
1. The Writing across Boundaries project was funded by the ESRC Research Development Initiative, grant number RES 035 25 0013. Further details of the RDI can be found at: http://www.rdi.ac.uk/
2. Further elaboration and reflections on the content of the workshops can be found in earlier publications (Simpson and Humphrey 2008, 2010).
3. The project website can be found at http://www.dur.ac.uk/writingacrossboundaries/. Since its inception in June 2008, according to Google Analytics the 211 pages have been viewed 70,941 times, and its home page has recorded 13,026 unique page views, of which 7,049 have originated from outside the UK.
References
Aitchison, C., B. Kamler, and A. Lee, eds. 2010. Publishing pedagogies for the doctorate and beyond. London: Routledge.
Badley, G. 2009. Academic writing as shaping and reshaping. Teaching in Higher Education 14, no. 2: 209–19.
Belcher, W.L. 2009. Writing your journal article in 12 weeks: A guide to academic publishing success. Thousand Oaks, CA: Sage.
Bernstein, B. 1990. The structuring of pedagogic practice. London: Routledge.
Bernstein, B. 2000. Pedagogy, symbolic control and identity. Oxford: Rowman and Littlefield.
Bloom, B.S., ed. 1956. Taxonomy of educational objectives, the classification of educational goals – Handbook 1: Cognitive domain. New York: McKay.
Cousin, G. 2006. Threshold concepts, troublesome knowledge and emotional capital: An exploration into learning about others. In Overcoming barriers to student understanding: Threshold concepts and troublesome knowledge, ed. J.H.F. Meyer and R. Land, 134–47. London: Routledge.
Engeström, Y., R. Engeström, and M. Kärkkäinen. 1995. Polycontexuality and boundary crossing in expert cognition: Learning and problem solving in complex work activities. Learning and Instruction 5: 319–36.
Kamler, B., and P. Thomson. 2006a. Doctoral writing: Pedagogies for work with literatures. Paper presented at AERA annual meeting, 7–11 April, in San Francisco, CA.
Kamler, B., and P. Thomson. 2006b. Helping doctoral students write: Pedagogies for supervision. London: Routledge.
Kiley, M. 2009. Identifying threshold concepts and proposing strategies to support doctoral candidates. Innovations in Education and Teaching International 46, no. 3: 293–304.
Kiley, M., and G. Wisker. 2009. Threshold concepts in research education and evidence of threshold crossing. Higher Education Research and Development 28, no. 4: 431–41.
Krathwohl, D.R., and N.L. Smith. 2005. How to prepare a dissertation proposal: Suggestions for students in the social and behavioural sciences. Syracuse, NY: Syracuse Univ. Press.
Lillis, T., and J. Turner. 2001. Student writing in higher education: Contemporary confusion, traditional concerns. Teaching in Higher Education 6, no. 1: 57–68.
Machi, L.A., and B.T. McEvoy. 2008. The literature review: Six steps to success. Thousand Oaks, CA: Sage.
McAlpine, L., A. Paré, and D. Starke-Meyerring. 2008. Disciplinary voices: A shifting landscape for English doctoral education in the 21st century. In Changing practices in doctoral education, ed. D. Boud and A. Lee, 42–53. London: Routledge.
Meyer, J.H.F., and R. Land, eds. 2006. Overcoming barriers to student understanding: Threshold concepts and troublesome knowledge. London and New York: Routledge.
Nightingale, P. 1988. Understanding processes and problems in student writing. Studies in Higher Education 13, no. 3: 262–80.
Prior, P. 1998. Writing/disciplinarity: A sociohistoric account of literate activity in the academy. Mahwah, NJ: Erlbaum.
Silverman, D. 2010. Doing qualitative research, 3rd edn. London: Sage.
Simpson, R., and R. Humphrey. 2008. Writing across boundaries: Explorations in research, writing and rhetoric in qualitative research. Qualiti 8: 10–2.
Simpson, R., and R. Humphrey. 2010. Writing across boundaries: Reflections on the place of writing in doctoral research training for social scientists. Learning and Teaching: The International Journal of Higher Education in the Social Sciences 3, no. 1: 69–91.
Thomson, P., and B. Kamler. 2010. It’s been said before and we’ll say it again – research is writing. In The Routledge doctoral student’s companion: Getting to grips with research in education and the social sciences, ed. P. Thomson and M. Walker, 149–60. London: Routledge.
Thomson, P., and M. Walker, eds. 2010. The Routledge doctoral student’s companion: Getting to grips with research in education and the social sciences. London: Routledge.
Turner, V. 1967. The forest of symbols: Aspects of Ndembu ritual. Ithaca: Cornell Univ. Press.
Turner, V. 1969. The ritual process: Structure and anti-structure. Ithaca: Cornell Univ. Press.
Turner, V. 1974. Dramas, fields, and metaphors: Symbolic action in human society. Ithaca: Cornell Univ. Press.
Turner, V. 1979. Betwixt and between: The liminal period in rites of passage. In Reader in comparative religion, ed. W. Lessa and E. Vogt, 234–53. New York: Harper and Row.
Turner, V. 1986. Dewey, Dilthey and drama: An essay in the anthropology of experience. In The anthropology of experience, ed. V. Turner and E. Bruner, 33–44. Urbana and Chicago: Univ. of Illinois Press.
Vitae. 2011. Researcher development statement. http://www.vitae.ac.uk/CMS/files/upload/Researcher development statement.pdf (accessed June 27, 2011).
Wellington, J. 2010. More than a matter of cognition: An exploration of affective writing problems of post-graduate students and their possible solutions. Teaching in Higher Education 15, no. 2: 135–50.
Wolcott, H.F. 2009. Writing up qualitative research, 3rd edn. Thousand Oaks, CA: Sage.
Copyright of Teaching in Higher Education is the property of Routledge and its content may not be copied or
emailed to multiple sites or posted to a listserv without the copyright holder's express written permission.
However, users may print, download, or email articles for individual use.
Mending Fences: Defining the Domains and Approaches of Quantitative and Qualitative Research
Brittany Landrum and Gilbert Garza University of Dallas
In view of the increasing ubiquity of qualitative research, particularly mixed method designs, it is important to examine whether qualitative and quantitative models of research can be integrated and how this integration should take place. The recent adoption of best practices for mixed methods research by the NIH seems an opportune starting point for discussion of these questions. This article explores the notion that qualitative and quantitative research, while stemming from fundamentally different “approaches,” might yet find an appropriate complementary relationship. We argue, however, that such a complementary relationship depends on an understanding of the notion of approach and an insight into the fundamentally different guiding questions and domains of these 2 research models. Holding that “good fences make good neighbors,” this article explores the frontier between quantitative and qualitative research and the challenges attendant to designing and conducting mixed methods research.
Keywords: best practices, methodology, mixed methods research, qualitative research, quantitative research
Good fences make good neighbors. —Robert Frost (1919), “Mending Wall”
With the increasing ubiquity of qualitative research (Wertz, 2011) and the emergence of mixed methods research that utilizes both qualitative and quantitative analysis (Creswell, Klassen, Plano Clark, & Smith, 2011; see also Creswell, 2009; Creswell & Clark, 2007; Tashakkori & Teddlie, 2003; Tashakkori, Teddlie, & Sines, 2013), there is a growing need to address the boundaries and differences between these two types of research. Both types of research have a set of usually implicit philosophical suppositions (see Churchill & Wertz, 2002; Garza, 2004, 2007, 2011; Giorgi, 2009; von Eckartsberg, 1998; Wertz, 1985). Among others, Garza (2006) and Giorgi (2009) suggest that important differences exist between these two approaches to research. Following Giorgi, such differences would define different domains of research motivated by fundamentally different questions and producing fundamentally different knowledge claims. These different knowledge claims can “create a terrible mess” without an understanding of the philosophical foundations of both types of research (Greener, 2011, p. 3). Thus, this article seeks to delineate the domains of both approaches and discuss the combined use of quantitative and qualitative data and approaches in mixed methods research. An understanding of these differences with mutual respect for each domain will provide the necessary framework for discussing issues related to mixing both types of research. Finally, we will discuss the complementarity of strengths of both approaches, arguing for the necessity of methodological pluralism.
Defining Quantitative and Qualitative Domains and Approaches
Qualitative and quantitative research comprise two different (but not opposed) interpretative frameworks. At a fundamental level, what distinguishes the domains of qualitative and quantitative research are the implicit interpretative frames of reference that are brought to bear on their subject matter and methods (Giorgi,
Brittany Landrum and Gilbert Garza, Department of Psychology, University of Dallas.
Correspondence concerning this article should be addressed to Brittany Landrum, Department of Psychology, University of Dallas, 1845 East Northgate Drive, Irving, TX 75062. E-mail: [email protected]
This document is copyrighted by the American Psychological Association or one of its allied publishers. This article is intended solely for the personal use of the individual user and is not to be disseminated broadly.
Qualitative Psychology © 2015 American Psychological Association 2015, Vol. 2, No. 2, 199–209 2326-3598/15/$12.00 http://dx.doi.org/10.1037/qup0000030
2009)—what von Eckartsberg (1998) and Giorgi (1970) have called ‘approach.’
In previous descriptions, qualitative and quantitative research have been defined by the type of data used (non-numeric and numeric, respectively; see Greener, 2011) as well as by inductive and deductive frameworks (see Greener, 2011; Teddlie & Tashakkori, 2009). Another way to understand quantitative and qualitative approaches to research is in terms of the knowledge claims they make and the interpretive frameworks employed to bring these claims to light. At one end of a continuum describing the interface of knowledge and frame of reference are ‘purely’ quantitative studies. Such research examines relations of magnitude between variables measuring quantities1 (e.g., height, weight, number of behaviors, hippocampal volume, etc.) and uses the numeric analysis of data to test and verify these relations. At the other end of this continuum are ‘purely’ qualitative studies. This sort of research makes descriptive knowledge claims about meaning using ‘descriptive’ data, typically expressing these findings in linguistic narratives.
However, all these definitions meant to distinguish the two approaches are not mutually exclusive. Qualitative research does count and explore dimensions of magnitude (Sandelowski, 2001), and likewise quantitative research includes non-numeric data (e.g., categorical data2) and makes inferences about meaning based on dimensions of magnitude (Teddlie & Tashakkori, 2009). Furthermore, all scientific inquiry draws on both inductive and deductive frameworks (see Merleau-Ponty, 1961/1964), and we would argue that the interplay between data and the interpretative frame of reference is not always mutually exclusively quantitative or qualitative. The boundaries between these two approaches, more often than not, are not a clearly defined fence but rather a mixing of both types of data and approaches. Indeed, in both ‘pure’ cases described above, the kind of knowledge claimed fits well with the frame of reference used to establish and communicate its findings. Verification or confirmation of such studies can be achieved in terms of replication within the same analytic model. However, it is with regard to the middle regions on the continuum that epistemological clarity and explicitness are needed to interpret research findings and the light they shed on the topic under investigation (see Figure 1).
One of these middle positions is called quantitizing and occurs when research claims knowledge of an order of magnitude but uses a qualitative interpretive framework as the basis of such claims (e.g., performing numerical analyses based on frequency of themes, or “ratings of strength or intensity,” Teddlie & Tashakkori, 2009, p. 269; see also Sandelowski, Voils, & Knafl, 2009). The other ‘middle position’ is called qualitizing and occurs when research claims qualitative knowledge but uses a quantitative interpretive framework as the basis of such claims (e.g., categories based on range in magnitude, frequency count taken as a dimension of importance; Hesse-Biber, 2010; Sandelowski et al., 2009; Teddlie & Tashakkori, 2009). Because the knowledge claims of such research and the interpretive frames of reference used to establish and test them do not match, special care and epistemological knowledge must be used when interpreting such findings. For instance, Johnson and den Heyer (1980) emphasize the distinction between a statistical question and a psychometric question, pointing to the necessity of understanding the rubric of measurement when interpreting IQ scores.
1 We have deliberately chosen examples of measures whose relation to the scales which produce them is not under debate. There is widespread agreement that height and weight represent quantities on a ratio scale, for example. This is not always the case with scales such as the Likert type, which is discussed below.
2 Categorical data are often called qualitative or nominal data but are analyzed using specialized statistical methods within quantitative research (see Agresti, 2002). In this article, this scale of measurement classification is distinct from qualitative data and research, which describe non-numeric data that are to be used with methodologies developed by qualitative researchers.
An example contrasting a ‘purely’ quantitative relationship with instances where data and approach are mixed will help illustrate these concerns and the special care we are advocating. A regression coefficient of 1 between number of friends on Facebook and number of photos on one’s profile means that an increase of one friend predicts an increase of one photo posted; both of these variables are measured using ratio data, whereby 1 friend on Facebook and 1 photo are quantities and thus fall under the ‘purely’ quantitative approach. When the variable in question is on a Likert scale, the relationship is an increase or decrease in agreement based on the number people circle on average, not necessarily or directly in the construct it is taken to operationalize. Concerns have been raised about Likert-type data regarding the appropriate use of parametric or nonparametric statistics, resting upon whether such data are interval or ordinal, respectively (see, e.g., Carifio & Perla, 2008; Norman, 2010). For such data to be considered interval, one would have to be able to answer the question pointedly posed by Knapp (1990), “3 what?” in relation to a 3 circled on a Likert-type scale. This type of data is not quite a quantity like the number of friends or photos are; it is neither clear whether the steps on the scale are indeed equidistant from each other (see, e.g., Jamieson, 2004) nor whether the ‘degree’ of agreement is measuring a quantity of something and whether this quantity is the same for everyone who completes it. The answers on a Likert-type scale cannot escape the subjective understanding of the participant. We are not saying that Likert-type data should not be used in this way; rather, we are advocating for appropriately understanding the knowledge claims they make. Likert-type data fit somewhere between the two end points on the spectrum of the interface of knowledge and appear to be an example of quantitizing, whereby a dimension of agreement (qualitative) is rendered in terms of quantity (quantitative).
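The worry about what a Likert mean actually measures can be made concrete with a small sketch; the responses below are invented for illustration. Treating the scale as interval licenses an arithmetic mean, which silently assumes the steps between scale points are equidistant; an ordinal reading licenses only order-based summaries such as the median:

```python
# Hypothetical Likert-type responses: 1 = strongly disagree ... 5 = strongly agree
responses = [3, 4, 4, 2, 5, 4, 3, 5, 1, 4]

# Interval reading: assumes the distance from 1 to 2 equals the distance
# from 4 to 5, so averaging the numeric codes is meaningful.
mean_score = sum(responses) / len(responses)

# Ordinal reading: uses only the rank order of the categories, making no
# claim about the spacing between them.
ordered = sorted(responses)
n = len(ordered)
median_score = (ordered[n // 2 - 1] + ordered[n // 2]) / 2 if n % 2 == 0 else ordered[n // 2]

print(mean_score)    # 3.5
print(median_score)  # 4.0
```

Knapp’s “3 what?” question is precisely about whether the 3.5 above is a quantity of anything; the median makes the weaker, purely ordinal claim.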
While a psychometric question can be distinct from a statistical question, Merenda (n.d.) points to a more troubling example of quantitizing, concerning a case when the question of what one is measuring cannot be separated from the statistical problems that it raises. The case in point Merenda highlights is when data representing dichotomous categories, such as male and female, are included with other continuous predictor variables through ‘dummy coding’ in a regression analysis. To be used in statistical analyses that require continuous variables, these dichotomous variables are treated as though they were continuous, as though there were values somewhere between male and female. This is a violation of the assumption of continuous and discrete predictor variables in a regression analysis, thus presenting a questionable statistical result. He further adds that there is no substitute for conducting a separate analysis between males and females.
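Merenda’s point can be illustrated with a minimal sketch; the group labels and scores here are hypothetical. With a single 0/1 dummy as the only predictor, the least-squares slope is exactly the difference between the two group means, which lays bare how the dichotomy is being treated as if it sat on a number line with values between 0 and 1:

```python
# Hypothetical data: a dichotomous category and a continuous outcome.
group = ["f", "f", "f", "m", "m", "m"]
score = [10.0, 12.0, 11.0, 14.0, 15.0, 16.0]

# Dummy coding: f -> 0, m -> 1, so the category is treated as a number.
d = [1 if g == "m" else 0 for g in group]

def ols_slope(x, y):
    """Least-squares slope of y regressed on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)

slope = ols_slope(d, score)

# The separate-analysis alternative: report each group on its own terms
# instead of pretending values exist between 'f' (0) and 'm' (1).
mean_f = sum(s for s, g in zip(score, group) if g == "f") / 3
mean_m = sum(s for s, g in zip(score, group) if g == "m") / 3

print(slope)            # 4.0 -- identical to the difference in group means
print(mean_m - mean_f)  # 4.0
```

The identity of the two numbers shows that the dummy-coded regression is a group-mean comparison in disguise; nothing in the arithmetic corresponds to a value partway between the categories.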
In an example of qualitizing, Cialdini et al. (1976) calculated the frequency of ‘we’ and ‘non-we’ statements used to describe team and personal outcomes for players on a sport team. A dimension of quantity (counts/frequency) is rendered in terms of subjective ownership of instrumentality in a sport team’s victory or defeat. Similarly, in an example of quantitizing, Pollard, Nievar, Nathans, and Riggs (2014) counted the frequency of occurrences of various themes from qualitative narratives and concluded, based on nonsignificant chi-squared analyses, that the experiences of Hispanic and Caucasian mothers did not differ thematically. In this example, a quantitative rubric is utilized to make claims regarding dimensions of experience. In these examples, we see the need to take special care when interpreting the meanings of the statistical analysis and the operationalization of the constructs given that the data (quantitative or qualitative) and interpretation (qualitative or quantitative) do not coincide.
Figure 1. The possible configurations of data and interpretative frames of reference represented as a continuum: at one pole, ‘pure’ quantitative research (numerical analysis of data that are quantities); at the other, ‘pure’ qualitative research (descriptive analysis of data that are non-numeric); with a middle ground of qualitizing and quantitizing between them. The middle ground is of special concern regarding the practice of mixed methods.
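The quantitizing move described above can be sketched in miniature; the theme names and counts below are invented for illustration, not taken from Pollard et al. Qualitative narratives are reduced to a contingency table of theme frequencies, and a Pearson chi-squared statistic is then computed on the counts:

```python
# Hypothetical contingency table: rows are participant groups, columns are
# themes identified in their narratives (counts of coded occurrences).
observed = {
    "group_a": {"support": 12, "strain": 8, "hope": 10},
    "group_b": {"support": 10, "strain": 9, "hope": 11},
}

groups = list(observed)
themes = list(observed[groups[0]])

row_totals = {g: sum(observed[g].values()) for g in groups}
col_totals = {t: sum(observed[g][t] for g in groups) for t in themes}
grand = sum(row_totals.values())

# Pearson chi-squared: sum of (O - E)^2 / E over all cells, where the
# expected count E is row_total * col_total / grand_total.
chi2 = sum(
    (observed[g][t] - row_totals[g] * col_totals[t] / grand) ** 2
    / (row_totals[g] * col_totals[t] / grand)
    for g in groups
    for t in themes
)
print(round(chi2, 3))  # 0.288
```

Whatever its value, the statistic supports a knowledge claim about magnitudes of counts; reading it back as a claim about dimensions of experience is exactly where the special care discussed above is needed.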
Although we argue that neither method holds a privileged perspective on the world, these two modes of description are distinguished, for the most part, by their respective approaches. We hold that no inquiry can be undertaken from a perspective-less position (Merleau-Ponty, 1945/1962) and thus even natural science is not value free (see Kendler, 2005, who asserts this, and Garza, 2006, who refutes this position). Indeed, we would hold that an explicit acknowledgment of approach is necessary to assess the validity of any inquiry (Churchill, Lowery, McNally, & Rao, 1998; Garza, 2004).3 Specifically in qualitative research, validity comprises a coherence between the researcher’s frame of reference, the research question, the data, and the findings. Next, we will turn to some specific concerns with mixed methods.
Concerns Regarding the Intersection of Quantitative and Qualitative Frameworks
The Question of Hegemony of Approach
In a qualitative research training meeting, conducted for researchers who were for the most part both well-versed in quantitative research models and inexperienced in qualitative research, one individual expressed a concern with the notion of ‘interrater reliability’ and a desire to make sure all the ‘coders’ were naming themes the same way. This individual felt that if one coder named a theme ‘reluctance’ and another named it ‘resistance,’ the analysis would not be reliable, that is, the same. This individual proposed providing a list of themes that all coders would share before conducting the analysis. In quantitative research, interrater reliability is commonly computed as a correlation coefficient describing the degree of overlap between two variables (regardless of what scale of measurement the code comprises). The concern with the two words being the ‘same’ was a quantitative concern posed to a qualitative question. Here a qualitative claim is based on a quantitative rubric: the meanings are thematically related, but the rubric is a numeric one of the codes used in reliability analysis. The knowledge claim here is one of corresponding magnitudes of evaluations. Judging the reliability of the responses based on their thematic coherence instead allows us to recognize their ‘sameness’ while preserving the subtle and nuanced differences captured in different ways of expressing it, highlighting the different perspectives that are brought to bear when analyzing qualitative data. The potential of qualitative research to discern a complexity of meaning should not be hampered by the quantitative concern with reliability as correlation. Reliability in quantitative analysis rests on sameness, repetition; in qualitative research it rests on relatedness (further discussed in Churchill & Wertz, 2002; Garza, 2004, 2007, 2011; Giorgi, 2009). This example presents an opportunity to illuminate the challenges that arise when the approach of one research model is applied to the practices of the other. Answering the concern raised here requires understanding these differences in approach; making them explicit can help practitioners in this area avoid some of the common pitfalls we are addressing.
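A minimal sketch, with invented codes, shows what the correlational rubric does to the ‘reluctance’/‘resistance’ case: once themes are mapped to numeric codes, thematically related labels register as sheer numeric disagreement and pull the coefficient below 1, even though on a thematic-coherence reading the two coders largely agree:

```python
from math import sqrt

# Hypothetical numeric theme codes for six text segments:
# 1 = 'reluctance', 2 = 'resistance', 3 = 'enthusiasm'.
coder_1 = [1, 1, 3, 1, 3, 1]
coder_2 = [2, 1, 3, 2, 3, 2]  # names the closely related theme 'resistance'

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson(coder_1, coder_2)
print(round(r, 3))  # 0.857
```

The coefficient registers only sameness of numbers; it has no way to represent the relatedness of meanings that a thematic reading of the two code sets would recognize.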
In a particularly illustrative example, Fredrickson and Losada (2005) adopted formulas created for, and suitable to, fluid dynamics in physics to explain changes in attitudes over time. Resting on the presumption that attitudes are not only similar to but follow the same laws of nature as fluids, these researchers did not take into account the differing philosophical approaches that shape these two phenomena. Quite apart from whether attitudes are a physical 'thing' like water, for instance, the use and application of these mathematical formulas again highlights the hegemony of quantitative frameworks. Following the critiques raised by Brown, Sokal, and Friedman (2013), the utilization of these models raises serious epistemological and conceptual concerns.
Another example of hegemony of perspective is raised by Giorgi regarding the practice of some qualitative researchers to 'verify' their qualitative interpretative analyses by their participants or other 'judges' (Giorgi, 2008; Pollio, Henley, & Thompson, 1997). Giorgi (2008) astutely points out that participants are not versed in either the approach or the procedures used for the analysis and thus could not assess its validity.
3. Creswell and Clark (2007) also point to the importance of laying out the philosophical underpinnings of research. However, in the literature, neither quantitative nor qualitative research uniformly does this.
202 LANDRUM AND GARZA
Similarly, we would add that statistical results could not be verified by the participants, because we cannot presume sufficient statistical sophistication to make such a judgment. Although it might seem that we are singling out incursions of quantitative into qualitative practice, we suspect this is because the highly specialized language of statistics makes incursions in the other direction less likely; everyone speaks in narratives, but not everyone speaks in statistical narratives. In either case, instances of either incursion point to the need for methodological pluralism.
Counts
Our next concern is the use of counts in qualitative research (see Leech & Onwuegbuzie, 2011; Miles & Huberman, 1994; Sandelowski, 2001). A number of 'qualitative' articles include frequency counts of themes and 'quantitizing,' or assigning a numerical value to qualitative data that is then subjected to quantitative analysis (see Dutton & Winstead, 2011; Sandelowski et al., 2009). It would be a mistake to equate frequency with importance, or worse yet to conduct statistical analysis with these counts, as in what Sandelowski (2001) calls "acontextual counting." An example would be counting the number of times a particular word is said and taking the count to imply a greater importance of that dimension of meaning in that person's life. Often what an individual does not say is just as revealing and important as what they do say, and when counting something, this 'absence' is not taken into account. In a thesis workshop for senior undergraduates conducting phenomenological research, a participant provided a description of losing her virginity, and the most striking part was that she never mentioned the partner once in the entire description (Garza, 2004, Spring). Here, the lack of any mention of the other party involved reveals much about this phenomenon as meaningfully lived by the participant. We argue that as soon as one begins to count themes, one is no longer conducting qualitative research and not really conducting quantitative research either. This, in our minds, fails to respect the proper domains of both types of research.
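A short Python sketch (the transcript fragment is invented for illustration) shows why such acontextual counting is blind to absence: a frequency table can only register what is said, so a meaningfully missing word simply does not appear in it:

```python
import re
from collections import Counter

# Invented fragment of a participant description, for illustration only.
description = (
    "I remember the room, the music, how nervous I was. "
    "I kept thinking about what it meant for me, what I was leaving behind."
)

# Acontextual counting: tally every word, stripped of its context.
counts = Counter(re.findall(r"[a-z']+", description.lower()))
print(counts.most_common(3))  # frequency would say 'I' matters most

# Absence is invisible: a word that is never said (here, 'partner')
# registers as zero, indistinguishable from never having been relevant.
print(counts["partner"])  # 0
```

The count reports presence and magnitude only; the meaningful silence that the workshop example turns on leaves no trace in it.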
Another example of a heightened concern with numbers in qualitative research is what Sandelowski (2001) refers to as "analytic overcounting." This refers to the tendency of some qualitative researchers to count everything that could possibly be counted, to the detriment of a clear presentation of the qualitative findings. Examples include a focus on the precise number of themes identified, whereby the actual count is given greater emphasis than a description of the themes themselves. Sandelowski reports that even when patterns of meanings comprise the results, researchers sometimes become preoccupied with the number of participants who exhibit the themes, so that the focus is on frequency and less so on the meaning of the themes or patterns. All of these examples point to the need for researchers to be mindful of the type of data being gathered and the analytic approach undertaken, paying particular attention to the appropriate knowledge claims.
Confirmation and Validation
The practice of '(dis)confirming' and '(non-)validating' one set of findings with another set when the data and interpretive frameworks are not matched is widespread (see Ellis, Marsh, & Craven, 2009; Hastings, 2012; Riegel, Dickson, Kuhn, Page, & Worrall-Carter, 2010; Sechrist, Suitor, Riffin, Taylor-Watson, & Pillemer, 2011 for examples). In all of these examples, quantitative and qualitative data are used to explicitly 'confirm' and 'verify' each other and to assess 'concordance' of findings.
Wagner et al. (2012) argue against 'confirmation' of findings rooted in one approach by research rooted in the other, because conflicting results might initially appear problematic. If we examine the hippocampus from a neurophysiological point of view and find there are differences between those (including animal species) who hoard and those who do not (see, e.g., Brodin & Lundborg, 2003; Hampton, Sherry, Shettleworth, Khurgel, & Ivy, 1995; Volman, Grubb, & Schuett, 1997), it would not be appropriate to use qualitative data to confirm differences in the hippocampus. On the other hand, would the hippocampal findings confirm differences in memory found in qualitative data? Although these two sets of findings from two approaches shed light on each other, we do not believe one can confirm the other without implicitly holding that one type of data is more valid and thus the basis for such confirmation. If
203 QUANTITATIVE AND QUALITATIVE DOMAINS
hippocampal volume was found to be larger in those who hoard, survey data that revealed the importance of memory would not be surprising. But it would no more 'confirm' the findings related to hippocampal volume than a German translation of Shakespeare could confirm the Chinese translation; the point here is that a researcher must understand the differences in the languages used. The increase in hippocampal volume and the importance of memory are two complementary findings: they neither confirm nor disaffirm each other. Together, they expand our understanding of the role of memory in those who hoard.
Another example of this practice of 'confirmation' and 'validation' arises when attempting to interpret the percentage of concordance between qualitative and quantitative findings. In a mixed methods study examining self-care behaviors among patients with heart failure, Riegel et al. (2010) computed the percentage of agreement between the identification of a self-care theme in the participants' narratives and a cutoff score on a quantitative survey. Although two researchers independently analyzed the two types of data, the operationalization of self-care had already been defined in advance through the quantitative survey, and the calculation of 'concordance' rates presumes that the lived experiences provided through the narratives will touch on the same points raised by the survey items, and vice versa. Furthermore, the concordance rates are taken to indicate that the quantitative and qualitative methods are more valid, and thus more trustworthy, if a higher rate of concordance is reached. However, it is not immediately clear what this percentage of agreement means; for instance, if self-care maintenance reached 75% agreement but self-care confidence reached 95% agreement, what does the 20% difference mean? Assuming 100 participants in the sample, this difference would mean precisely that 20 more people provided evidence of this theme in their narratives and circled a higher number on the survey. Can this increase indicate that one piece of data is more valid? We suggest not, because the validity of either quantitative or qualitative methods rests upon the respective philosophical approach undergirding each type of method, and using one method cannot 'confirm' or 'validate' findings in the other. This practice renders the qualitative data into a dimension of magnitude, again
marking implicit adherence to a quantitative frame of reference. Additionally, this practice rests on the presumption that the number one circles for a group of items operationalized to measure a phenomenon will coincide with a description or narrative provided by the participants. How one narrates one's experiences may or may not match a list of items on the topic, and one of the benefits of conducting mixed methods would be to examine this possibility. However, holding this presumption of similarity across the two types of data collected shuts down the possibility of examining this dimension when the goal is to assess 'concordance.' What these researchers have rightly discerned is that there are similarities here, as well as a relationship between the two methods; however, similarity has both qualitative and quantitative dimensions, and a change in one does not necessarily map onto a manifestation in both types. As Rollo May points out when describing the differences between memory capacities in humans and sheep, a difference in terms of length of time or other quantitative distinctions also implies differences in quality, but given the distinct interpretive frames of reference, the two changes cannot be assumed to be 'the same' (May, 1979). When the two methods are used to validate each other, or one is usurped by the other, the strength of multiple perspectives is diminished, and the possibility of exploring amplification, differences, similarities, and so forth is eradicated, because both types of data are viewed from one perspective and thus conflated.
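The arithmetic behind such a 'concordance' rate is easy to reproduce. In this hypothetical Python sketch (the participant data are invented), percent agreement is simply the share of participants for whom the narrative theme and the survey cutoff give the same verdict; the number by itself says nothing about what the disagreements mean:

```python
def percent_agreement(theme_in_narrative, score_above_cutoff):
    """Share of participants for whom the two indicators agree:
    theme present and score above cutoff, or neither."""
    pairs = list(zip(theme_in_narrative, score_above_cutoff))
    return sum(a == b for a, b in pairs) / len(pairs)

# Invented data for 8 participants (True = theme found / cutoff cleared).
narrative = [True, True, False, True, False, True, True, False]
survey    = [True, False, False, True, False, True, True, True]
print(percent_agreement(narrative, survey))  # 0.75
```

A rate of 0.75 versus 0.95 tells us only how many verdicts coincided; whether the mismatched cases reflect richer narratives, a narrower operationalization, or something else entirely is exactly what the percentage cannot answer.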
Likewise, we contend that neither method can be used to confirm or disconfirm the other. Instead, we suggest that the frame of reference here is 'augmentation.' Consider the 'mountain' task used to assess developmental egocentrism as an analogy: a child sits at a square table with a three-dimensional mountain and is asked to describe the mountain from various viewpoints. The egocentric child cannot discern how a viewer sitting on any of the other three sides of the table would see anything different from what he or she sees from his or her own perspective. He or she might even be perplexed by the fact that such an observer could see something 'at odds' with what he or she sees. Similarly, an 'approach-centric' researcher might seek 'confirmation' of his or her own perspective when conducting mixed methods. We argue that a
methodologically pluralistic researcher would see that the complementary perspectives of other approaches need not 'confirm' his or her own perspectival view but can augment it, providing a more complex and full description of the phenomenon being investigated. Rather than having convergence or agreement as a goal of mixed methods, we advocate for complementarity in mixed methods.
Mixing Methods as Complementarity of Strengths: The Case for Methodological Pluralism
Both quantitative and qualitative research methods are limited in scope, assessing dimensions of magnitude and meaning, respectively (Giorgi, 2009). But it is perhaps more fruitful to think of these limits as domains of strength. These domains describe the frontiers of the two research models and set the stage for a complementarity of strengths whereby our understanding of the phenomena we research is more complete in view of the differences than that proffered by 'verification' or 'confirmation.' Complementarity requires that research conducted from any point along the continuum (a) acknowledge the differences between the approaches, (b) show respect for these differences, and (c) possess a mindfulness that the 'middle ground' we have described comprises complex intersections of knowledge claims, epistemological assumptions, and approach. We advocate that no position on this continuum is privileged and that methodological plurality allows researchers to more fully describe a phenomenon across this full continuum, generating a wide array of knowledge.
In the recent Best Practices for Mixed Methods Research in the Health Sciences, Creswell et al. (2011) describe three ways of integrating qualitative and quantitative data. Two of the three, connecting and embedding data, respect the boundaries of the two domains, whereby one type of inquiry informs the other at a subsequent or concurrent time, respectively. The third type of integration, merging data, mixes up the 'messy middle' by using one type of data to compare with and/or confirm the findings from the other type.
When connecting data, one type of data analysis is used to inform the collection of a second type of data at a subsequent time point. In this way, the data gathered are analyzed using the methods appropriate to the type of data gathered. In our own mixed methods research example below, our qualitative analysis illuminated a transformed meaning of home that suggests an additional variable to examine in future quantitative research. The connecting process does not violate the boundaries, as the type of data gathered (numeric vs. non-numeric) is appropriately analyzed (quantitatively vs. qualitatively, respectively), enabling the two approaches to mutually shed light on each other while neither confirming nor validating one approach over the other.
Likewise, in the embedding data method, one type of data analysis is deemed primary and the other secondary. The primary method is chosen appropriately given the type of data being collected, while the secondary method is chosen for supplemental and illuminating purposes. Like the connecting process above, the embedded process does not violate the boundaries between approaches.
However, the merging process can violate the boundaries we have outlined above. In this process, a researcher can transform a piece of qualitative data into counts that are then subject to quantitative analyses. In our view, this violates a fundamental difference between the two approaches: the non-numeric qualitative data, when transformed into the number of times a theme is mentioned, departs from a dimension of meaning (i.e., importance) and is converted into the currency of magnitude (i.e., counts). This merging process calls into question the boundaries that divide the two approaches and the different currencies that each trades in. As argued above, the act of using one approach to validate or confirm the other neglects how each has its own language, understanding, and philosophical foundations. However, this does not mean that the two approaches cannot be used concurrently in one research project. Rather than confirming or validating, where one approach is more highly valued, we feel that when both approaches and domains are respected, the two types of results can shed light on and illuminate the subject matter and provide a greater understanding than either approach could on its own.
Kendler (2005) argues that methodological plurality would create confusion and contradiction, and argues instead for a strict natural science approach to psychological research utilizing the methods of quantitative research. To our minds this is akin to saying that a meal could be accurately described either by a list of its ingredients or by the subjective experience of its deliciousness, but not both; reporting both would be 'confusing' or 'contradictory.' Despite these claims that the two types of research are incompatible (Kendler, 2005), we have illustrated that the goal of both types of research is to gain a more complete understanding of the phenomenon under investigation. Rather than rely on a 'monomethod,' we suggest that methodological plurality allows researchers to draw on the strengths of both quantitative and qualitative research.
As a case in point, Trend (1979, as cited in Teddlie & Tashakkori, 2009) explored program implementation and found discrepant results in the quantitative and qualitative analyses. Specifically, in the quantitative data the program was rated positively across sites and appeared successful. The qualitative data, however, gave the researchers the impression that the implementation of the program was not successful and that problems were encountered at the various sites. When attempting to reconcile these apparent differences, the researchers discovered that a contextual variable, the site's urban versus rural location, could account for the discrepancy: dimensions of meaning associated with this distinction, in terms of costs, income of families, ethnicity, and ease of recruitment, among others, revealed further nuances in the quantitative findings. By examining the qualitative data gathered for the implementation of the program at each site, rather than collapsing across sites as the initial quantitative analysis did, the researchers used both types of data to augment each other and to illuminate the contextual factors specific to each program. Only when both types of data, and thus both analyses, were incorporated and examined together could the findings give a more comprehensive picture of implementation. This example illuminates how posing a qualitative question can lead to a reevaluation of a quantitative analysis, providing further insights that would not have been possible if only one method had been applied.
Another example of the appropriately complementary relationship of quantitative and qualitative analysis in mixed methods research is a study we conducted on Facebook usage and its relationship to satisfaction with college life (Landrum & Garza, 2011). We began with a quantitative study, asking our participants to report on their Facebook (FB) usage, and gathered measures of social capital among other measures of demographics and college experience. We tested a structural equation model (SEM) and found that heavy users of FB who were connecting with friends from high school reported less satisfaction with college life than students who were connecting with fellow students and classmates at college, illuminating dimensions of magnitude. This quantitative finding suggested a fruitful avenue for exploring dimensions of meaning: what FB means to them. Our focus group analysis of structured interviews revealed a theme that could not have emerged from the quantitative analysis as we had conceived it. Our spontaneous interaction with participants and open-ended analysis allowed us to discern that for some students the meaning of home had transformed from their parents' home to their college residence. This qualitative finding shed new light on our interpretation of the SEM model, suggesting that what mattered was not so much how often students used FB but rather how they were using it: whether they were connecting with those in their current milieu or their past milieu, and which of these milieus was understood by participants as their home. This opens a whole new avenue of research of both kinds. The benefits of truly collaborative mixed methods cannot be realized when either model is corrupted to the purposes of the other. In both of these examples, the relationship between the two approaches is not one of confirmation or validation but of augmentation.
Just as describing the mountain scene from two sides of the table yields a more comprehensive description, the full potential of mixed methods research becomes possible when the boundaries are respected, the strengths honored, and the two models are thus mutually and truly complementary across the entire continuum of research approaches.
Like all who sojourn beyond their homes, methodological adventurers would be well advised to learn the language and customs of the domains they visit. The necessity of this only comes to light when one recognizes that a frontier has been crossed. To achieve a truly appropriate balance between quantitative and qualitative research methods, as well as in mixing the two approaches, we recommend methodological pluralism. Envisioning it as a sort of methodological multiculturalism, we call for other researchers in the field to join this discussion and engage in dialogue with each other. We argue that together, quantitative and qualitative approaches are stronger and provide more knowledge and insights about a research topic than either approach alone. While both approaches shed unique light on a particular research topic, we suggest that methodologically pluralistic researchers would be able to approach their interests in such a way as to reveal new insights that neither method nor approach could reveal alone. When both quantitative and qualitative researchers reach out to each other across the fence, learn the language, and respect the boundaries outlined above, we can start to make great strides in this emerging field. Only when both sides understand and respect the domains can the differences and uniqueness of both approaches be appreciated.
References
Agresti, A. (2002). Categorical data analysis (2nd ed.). Hoboken, NJ: John Wiley & Sons. http://dx.doi.org/10.1002/0471249688
Brodin, A., & Lundborg, K. (2003). Is hippocampal volume affected by specialization for food hoarding in birds? Proceedings of the Royal Society of London B: Biological Sciences, 270, 1555–1563. http://dx.doi.org/10.1098/rspb.2003.2413
Brown, N. J. L., Sokal, A. D., & Friedman, H. L. (2013). The complex dynamics of wishful thinking: The critical positivity ratio. American Psychologist, 68, 801–813. http://dx.doi.org/10.1037/a0032850
Carifio, J., & Perla, R. (2008). Resolving the 50-year debate around using and misusing Likert scales. Medical Education, 42, 1150–1152. http://dx.doi.org/10.1111/j.1365-2923.2008.03172.x
Churchill, S. D., Lowery, J. E., McNally, O., & Rao, A. (1998). The question of reliability in interpretive psychological research: A comparison of three phenomenologically based protocol analyses. In R. Valle (Ed.), Phenomenological inquiry in psychology: Existential and transpersonal dimensions (pp. 63–85). New York, NY: Plenum Press. http://dx.doi.org/10.1007/978-1-4899-0125-5_3
Churchill, S. D., & Wertz, F. J. (2002). An introduction to phenomenological research psychology: Historical, conceptual, and methodological foundations. In K. J. Schneider, J. F. T. Bugental, & J. F. Pierson (Eds.), The handbook of humanistic psychology: Leading edges in theory, research, and practice (pp. 247–262). Thousand Oaks, CA: Sage.
Cialdini, R. B., Borden, R. J., Thorne, A., Walker, M., Freeman, S., & Sloan, L. (1976). Basking in reflected glory: Three (football) field studies. Journal of Personality and Social Psychology, 34, 366–375. http://dx.doi.org/10.1037/0022-3514.34.3.366
Creswell, J. W. (2009). Research design: Qualitative, quantitative, and mixed methods approaches (3rd ed.). Thousand Oaks, CA: Sage.
Creswell, J. W., & Clark, V. (2007). Designing and conducting mixed methods research. Thousand Oaks, CA: Sage.
Creswell, J. W., Klassen, A. C., Plano Clark, V. L., & Smith, K. C. (2011). Best practices for mixed methods research in the health sciences. Bethesda, MD: National Institutes of Health, Office of Behavioral and Social Sciences Research. Retrieved from http://obssr.od.nih.gov/mixed_methods_research
Dutton, L. B., & Winstead, B. A. (2011). Types, frequency, and effectiveness of responses to unwanted pursuit and stalking. Journal of Interpersonal Violence, 26, 1129–1156. http://dx.doi.org/10.1177/0886260510368153
Ellis, L. A., Marsh, H. W., & Craven, R. G. (2009). Addressing the challenges faced by early adolescents: A mixed-method evaluation of the benefits of peer support. American Journal of Community Psychology, 44(1–2), 54–75. http://dx.doi.org/10.1007/s10464-009-9251-y
Fredrickson, B. L., & Losada, M. F. (2005). Positive affect and the complex dynamics of human flourishing. American Psychologist, 60, 678–686. http://dx.doi.org/10.1037/0003-066X.60.7.678
Frost, R. (1919). Mending wall. In L. Untermeyer (Ed.), Modern American poetry. New York, NY: Harcourt, Brace and Howe. Retrieved from http://www.bartleby.com/104/64.html
Garza, G. (2004). Thematic moment analysis: A didactic application of a procedure for phenomenological analysis of narrative data. The Humanistic Psychologist, 32, 120–168. http://dx.doi.org/10.1080/08873267.2004.9961749
Garza, G. (2004, Spring). Senior qualitative research workshop: Senior thesis. Lecture conducted from University of Dallas, Irving, TX.
Garza, G. (2006). A clarification of Heidegger's phenomenology. American Psychologist, 61, 255–256. http://dx.doi.org/10.1037/0003-066X.61.3.255
Garza, G. (2007). Varieties of phenomenological research at the University of Dallas: An emerging typology. Qualitative Research in Psychology, 4, 313–342. http://dx.doi.org/10.1080/14780880701551170
Garza, G. (2011). Thematic collation: An illustrative analysis of the experience of regret. Qualitative Research in Psychology, 8, 40–65. http://dx.doi.org/10.1080/14780880903490839
Giorgi, A. (1970). Psychology as a human science: A phenomenologically based approach. New York, NY: Harper & Row.
Giorgi, A. (2008). Difficulties encountered in the application of the phenomenological method in the social sciences. Indo-Pacific Journal of Phenomenology, 8. Retrieved from http://www.ipjp.org/index.php?option=com_jdownloads&view=viewdownload&catid=33&cid=124&Itemid=318
Giorgi, A. (2009). The descriptive phenomenological method in psychology. Pittsburgh, PA: Duquesne University Press.
Greener, I. (2011). Designing social research: A guide for the bewildered. Thousand Oaks, CA: Sage.
Hampton, R. R., Sherry, D. F., Shettleworth, S. J., Khurgel, M., & Ivy, G. (1995). Hippocampal volume and food-storing behavior are related in parids. Brain, Behavior and Evolution, 45, 54–61. http://dx.doi.org/10.1159/000113385
Hastings, L. J. (2012). Generativity in young adults: Comparing and explaining the impact of mentoring (Doctoral dissertation, Paper 84, Educational Administration: Theses, Dissertations, and Student). Retrieved from http://digitalcommons.unl.edu/cehsedaddiss/84
Hesse-Biber, S. N. (2010). Mixed methods research: Merging theory with practice. New York, NY: Guilford Press.
Jamieson, S. (2004). Likert scales: How to (ab)use them. Medical Education, 38, 1217–1218. http://dx.doi.org/10.1111/j.1365-2929.2004.02012.x
Johnson, R. W., & den Heyer, K. (1980). On the enduring untruth about measurement and parametric statistics. Canadian Psychology/Psychologie canadienne, 21, 134–135. http://dx.doi.org/10.1037/h0081084
Kendler, H. H. (2005). Psychology and phenomenology: A clarification. American Psychologist, 60, 318–324. http://dx.doi.org/10.1037/0003-066X.60.4.318
Knapp, T. R. (1990). Treating ordinal scales as interval scales: An attempt to resolve the controversy. Nursing Research, 39, 121–123. http://dx.doi.org/10.1097/00006199-199003000-00019
Landrum, B., & Garza, G. (2011, April). Retention and predictors of satisfaction with college life. Paper presented at the 57th Meeting of the Southwestern Psychological Association, San Antonio, TX.
Leech, N. L., & Onwuegbuzie, A. J. (2011). Beyond constant comparison qualitative data analysis: Using NVivo. School Psychology Quarterly, 26, 70–84. http://dx.doi.org/10.1037/a0022711
May, R. (1979). Psychology and the human dilemma. New York, NY: W. W. Norton & Co.
Merenda, P. F. (n.d.). Common errors of omission and commission observed in proposals, theses, and dissertations, 1965–1985. Retrieved from https://www.yumpu.com/en/document/view/12904682/common-errors-in-analysis-and-writing-amsci-ammons-scientific-
Merleau-Ponty, M. (1962). The phenomenology of perception (C. Smith, Trans.). Mahwah, NJ: The Humanities Press. (Original work published 1945)
Merleau-Ponty, M. (1964). The phenomenology of perception (J. Wild, Trans.). Chicago, IL: Northwestern University Press. (Original work published 1961)
Miles, M. B., & Huberman, A. (1994). Qualitative data analysis: An expanded sourcebook (2nd ed.). Thousand Oaks, CA: Sage.
Norman, G. (2010). Likert scales, levels of measurement and the "laws" of statistics. Advances in Health Sciences Education, 15, 625–632. http://dx.doi.org/10.1007/s10459-010-9222-y
Pollard, S. E., Nievar, M. A., Nathans, L. L., & Riggs, S. A. (2014). A comparison of White and Hispanic women's stories of adjustment to the birth of a child. Infant Mental Health Journal, 35, 193–209. http://dx.doi.org/10.1002/imhj.21437
Pollio, H. R., Henley, T. B., & Thompson, C. J. (1997). The phenomenology of everyday life: Empirical investigations of human experience. Cambridge, UK: Cambridge University Press. http://dx.doi.org/10.1017/CBO9780511752919
Riegel, B., Dickson, V. V., Kuhn, L., Page, K., & Worrall-Carter, L. (2010). Gender-specific barriers and facilitators to heart failure self-care: A mixed methods study. International Journal of Nursing Studies, 47, 888–895. http://dx.doi.org/10.1016/j.ijnurstu.2009.12.011
Sandelowski, M. (2001). Real qualitative researchers do not count: The use of numbers in qualitative research. Research in Nursing & Health, 24, 230–240. http://dx.doi.org/10.1002/nur.1025
Sandelowski, M., Voils, C. I., & Knafl, G. (2009). On quantitizing. Journal of Mixed Methods Research, 3, 208–222. http://dx.doi.org/10.1177/1558689809334210
Sechrist, J., Suitor, J. J., Riffin, C., Taylor-Watson, K., & Pillemer, K. (2011). Race and older mothers' differentiation: A sequential quantitative and qualitative analysis. Journal of Family Psychology, 25, 837–846. http://dx.doi.org/10.1037/a0025709
Tashakkori, A., & Teddlie, C. (Eds.). (2003). Handbook of mixed methods in social and behavioral research. Thousand Oaks, CA: Sage.
Tashakkori, A., Teddlie, C., & Sines, M. C. (2013). Utilizing mixed methods in psychological research. In J. A. Schinka, W. F. Velicer, & I. B. Weiner (Eds.), Handbook of psychology: Vol. 2. Research methods in psychology (2nd ed., pp. 428–450). Hoboken, NJ: Wiley.
208 LANDRUM AND GARZA
Teddlie, C., & Tashakkori, A. (2009). Foundations of mixed methods research. Thousand Oaks, CA: Sage.
Volman, S. F., Grubb, T. C., Jr., & Schuett, K. C. (1997). Relative hippocampal volume in relation to food-storing behavior in four species of woodpeckers. Brain, Behavior and Evolution, 49, 110–120. http://dx.doi.org/10.1159/000112985
von Eckartsberg, R. (1998). Introducing existential-phenomenological psychology. In R. Valle (Ed.), Phenomenological inquiry in psychology: Existential and transpersonal dimensions (pp. 3–20). New York, NY: Plenum Press. http://dx.doi.org/10.1007/978-1-4899-0125-5_1
Wagner, K., Davidson, P., Pollini, R., Strathdee, S., Washburn, R., & Palinkas, L. (2012). Reconciling incongruous qualitative and quantitative findings in mixed methods research: Exemplars from research with drug using populations. International Journal of Drug Policy, 23, 54–61. http://dx.doi.org/10.1016/j.drugpo.2011.05.009
Wertz, F. (1985). Method and findings in a phenomenological psychological study of a complex life event: Being criminally victimized. In A. Giorgi (Ed.), Phenomenology and psychological research (pp. 155–216). Pittsburgh, PA: Duquesne University Press.
Wertz, F. J. (2011). The qualitative revolution and psychology: Science, politics, and ethics. The Humanistic Psychologist, 39, 77–104. http://dx.doi.org/10.1080/08873267.2011.564531
Received November 10, 2013
Revision received March 23, 2015
Accepted June 30, 2015
209 QUANTITATIVE AND QUALITATIVE DOMAINS
Mixed Methods in Management Research: Implications for the Field
Pat Bazeley Centre for Primary Health Care and Equity, University of New South Wales, Sydney, Australia [email protected]
Abstract: Mixed methods approaches to research have been widely adopted in social sciences and professional studies disciplines. Using a combination of methods is assumed to offer the promise of greater flexibility in undertaking research, of generating better supported arguments from research data, and of increased relevance to a wider circle of stakeholders, claims that are at least partially supported by evidence of higher journal citation rates for mixed than monomethod articles. A review of eighty-three articles published eight years apart in the Academy of Management Journal (AMJ) and Administrative Science Quarterly (ASQ) suggests that organizational and management researchers have been slow to adopt mixed methods approaches to research. Articles for both periods and in both journals were clearly dominated by studies that employed statistical analyses of archival, database, experimental or survey data, with little change over the period. These results reflect those found in other studies. This review of articles raised wider issues. 1) Difficulty was experienced in classifying studies, leading to a refinement in emphasis for a definition of mixed methods. 2) Management researchers as a whole, as reflected in the style and referencing of these articles, have thorough training in the fine details of statistical methods of analysis; understanding of qualitative analysis is weaker and restricted to a few; and none appears to have any awareness of a growing literature on mixed methods, nor did any discuss the kinds of issues typically covered in qualitative and mixed methods articles in other journals. The results of this review have implications for the training of management and organization studies researchers who currently appear to have a quite limited repertoire of non-statistical methods on which to draw when undertaking research.
Keywords: methodology; methods; mixed methods; quantitative; qualitative; research training; management; organization studies
1. Mixed methods as a methodological approach
Mixed and multi-method research have a long history in both science and social science field research (Maxwell 2015) and in evaluation studies (Rallis and Rossman 2003), but the adoption of mixed methods research more widely and its establishment as a distinct methodological tradition across the behavioural and social sciences is more recent (Johnson and Gray 2010). Since the turn of the century, adoption in some fields within health and education has been rapid and widespread, but those working in psychology and management appear to have been somewhat more reticent in their adoption of mixed approaches (Molina Azorín 2011; Molina Azorin and Cameron 2015; van der Roest, Spaaij and van Bottenburg 2015). The desire to appear ‘scientific’ in one’s methods, and differences in ontology, epistemology, and in disciplinary traditions have hindered the willingness of some to engage with what appears to be a compromise (and paradigmatically compromised) position. Management researchers seeking to use a mixed methods approach can meet with resistance from the gatekeepers of a discipline in which research, if not practice, is heavily imbued with a positivist philosophy, a love of indices, and an expectation of elaborate statistical analyses of numeric data (Currall and Towler 2003). Molina Azorin and Cameron (2015) cite several sources of evidence to suggest, however, that perhaps business scholars are using mixed methods research to a greater extent than appears from counts of studies identified as mixed methods articles, with journal editors being partly to blame for lack of openness about having included some qualitative methods during an investigation that was eventually published as an ostensibly quantitative research article.
The obligation to be open to use of mixed methods is predicated on the need to find the most appropriate methodology and method(s) to meet the purpose of any specific research project, and to answer its questions. Most social science researchers using mixed methods claim they do so to add strength to their argument (Johnson, Onwuegbuzie and Turner 2007). It is of interest, for example, that more than half of the “exemplary studies” in organization science explored by Frost and Stablein (1992) employed multiple or mixed methods (although almost all of their authors experienced difficulty in having their work accepted for publication). In their recently published paper, Molina-Azorín and Cameron (2015) outlined four ways in which using mixed methods can benefit business research: preliminary qualitative data can provide a deeper understanding of context to inform context-specific studies in strategic management and entrepreneurship; attention to both process and outcome through mixed methods benefits theory-building, for example with qualitative methods contributing insights as to the mechanisms through which different variables contribute to a measured outcome; study of complex organizations would benefit from analyses that are integrated across micro and macro levels; and use of mixed methods helps to bridge the academic-practitioner divide through enhancing the interpretation and communication of results. A higher level of citations for mixed than monomethod articles from the same journals was also reported by these authors (based on an earlier study by Molina Azorin 2011) as evidence of the benefits gained.
ISSN 1477-7029 27 ©ACPIL
Reference this paper as Bazeley P. “Mixed Methods in Management Research: Implications for the Field” The Electronic Journal of Business Research Methods Volume 13 Issue 1 2015 (pp 27-35), available online at www.ejbrm.com
Electronic Journal of Business Research Methods Volume 13 Issue 1 2015
After declaring mixed methods to constitute a third methodological movement (Johnson and Onwuegbuzie 2004) and in an attempt to achieve consensus about the focus of this rapidly evolving movement, Johnson, Onwuegbuzie and Turner (2007) reviewed 19 definitions of mixed methods contributed by then leaders in the field, to arrive at the following composite definition:
Mixed methods research is the type of research in which a researcher or team of researchers combines elements of qualitative and quantitative research approaches (e.g., use of qualitative and quantitative viewpoints, data collection, analysis, inference techniques) for the purposes of breadth and depth of understanding and corroboration. (Johnson et al, 2007: 123)
In looking for what makes mixed methods distinctive (rather than to define it), Jennifer Greene described a mixed methods way of thinking as:
an orientation toward social inquiry that actively invites us to participate in dialogue about multiple ways of seeing and hearing, multiple ways of making sense of the social world, and multiple standpoints on what is important and to be valued and cherished. A mixed methods way of thinking rests on assumptions that there are multiple legitimate approaches to social inquiry and that any given approach to social inquiry is inevitably partial. … a mixed methods way of thinking actively engages us with difference and diversity in service of both better understanding and greater equity of voice. (Greene 2008: 20)
Greene particularly values the dialectical aspect of using multiple or mixed methods, suggesting that dissonance resulting from diverse perspectives is to be welcomed because these become a source of fresh insights in themselves, and through seeking their resolution.
At the practitioner level, Teddlie and Tashakkori (2012: 775) identified the following as a set of core characteristics of mixed methods research (several of which could be considered to be characteristic of any good research):
Methodological eclecticism
Paradigm pluralism
Iterative, cyclical approach to research
Set of basic “signature” research designs and analytical processes
Focus on the research question (or research problem) in determining the methods employed within any given study
Emphasis on continua rather than a set of dichotomies
Emphasis on diversity at all levels of the research enterprise
Tendency toward balance and compromise that is implicit within the “third methodological community”
Reliance on visual representations (e.g., figures, diagrams) and a common notational system.
Perhaps symptomatic of the youth of this third major methodological movement, or of the huge variation in backgrounds among its practitioners, debates nevertheless continue over just what constitutes or defines a mixed methods project. I have consistently emphasised the need for integration of the different methods employed (e.g., Bazeley 2009), suggesting this as a major criterion distinguishing mixed methods from multimethod and other forms of research. I define integration as occurring:
to the extent that different data elements and various strategies for analysis of those elements are combined throughout a study in such a way as to become interdependent in reaching a common theoretical or research goal, thereby producing findings that are greater than the sum of the parts. (Bazeley 2010: 432)
It has to be said, however, that integration at that level has been and remains an elusive goal in a high proportion of published studies where a mix of methods is claimed to have been used (O’Cathain, Murphy and Nicholl 2007).
The question of definition was relevant to, and became a challenge for the investigation reported here. Does defining a study as using mixed methods necessarily mean it has used both quantitative and qualitative sources or methods? What about studies that use multiple different qualitative (or, for that matter, quantitative) approaches to data gathering? Most, but not all leading mixed methods researchers look for both, despite also agreeing that the terms qualitative and quantitative are not necessarily polar opposites, but define boundaries on a multidimensional continuum. So how much of each different approach is needed? By what point in the research-writing process does integration need to have occurred?
This paper explores the adoption and application of mixed methods by organizational and management researchers by considering the incidence and style of articles published in two top-tier management journals over two recent and comparable periods, eight years apart. What can be learned from this exploration about the methodological training of researchers in management and organizational studies, their willingness to draw from a wider field of methodological literature, and their preparedness to adopt innovative approaches to research?
2. Methods for the review
The incidence and style of methods adopted in management research were identified, tabulated, and described, with a particular focus on the adoption of mixed methods. This review refines and updates a brief overview that was part of a contribution to The SAGE dictionary of qualitative management research (Bazeley 2008). Articles were reviewed in 2006 and again in late 2014, and included all those published (excluding editorials and book reviews) during a two-month period in 2006 and again in 2014 for the Academy of Management Journal (AMJ), and over a full year in 2005–6 and in 2014 for Administrative Science Quarterly (ASQ), generating 39 articles for 2006 (19 AMJ and 20 ASQ) and 44 for 2014 (24 AMJ and 20 ASQ).
Authors writing for these two journals have the luxury of being given enough space to provide a detailed rationale for and report of all aspects of their particular research project. Interestingly, they rarely provide an overview statement of their research design, although, with the consistent exception of when qualitative research is conducted as a preliminary to designing quantitative survey items or variables, they generally provide very detailed descriptions of data sources and analysis methods used in conducting their research. These articles are therefore ideal for a review of the type reported here. Authors’ reports of methods used were employed as a primary source of information for this analysis, but the study purpose(s) and background were also noted and consideration was given to the way in which results were reported and discussed, as further evidence of the approach used.
Despite the detail available to the reader, definitive classification of the studies proved difficult. Some refinements were made to the earlier (2006) attempt to classify the studies, with somewhat stricter criteria regarding level of integration of results being applied, a consequence of growing maturity in this methodological field over the intervening period (and of the author’s reflections on it). In particular, studies in which qualitative methods were employed to design and/or check variables for use in statistical analyses but which did not describe any analysis processes for that qualitative data nor make further meaningful reference to it thereafter were not considered to comprise mixed methods studies for this review because they did not meet the criteria outlined above. (Some had been classified previously as sequential qual>QUAN studies because of the often considerable extent of preliminary qualitative data collected in order to design survey questions, variables, or scales.) Similarly, the occasional reporting of numbers of observations (e.g., of themes coded) in qualitative studies was considered to be a common enough practice within that tradition that it therefore did not warrant classification as mixed methods. Where the data from varied sources were amalgamated into a single database without preliminary separate consideration, and then analysed using a single method, the study was regarded as using a mono-method approach and not mixed. Studies that used mixed methods within either a quantitative or qualitative approach (rather than including both) were noted: these form a group about which methodologists would have different opinions as to whether they should be defined as mixed, multiple, or mono-method studies.
Attempting to classify the design type of the mixed methods studies also proved difficult, despite there being a considerable amount of (primarily US) literature devoted to this kind of enterprise (e.g., Teddlie and Tashakkori 2009; Creswell and Plano Clark 2011). Information was often iteratively exchanged between
methods; studies might also combine more than one of the basic design patterns (e.g., triangulation, development, and expansion), especially if methods evolve during the course of the study. This reinforces the value of having methods described in detail as was done here, rather than being described in brief with a label attached as often occurs elsewhere.
3. Results
Changes in methodological approach taken by management researchers in empirical studies over the period reviewed were minimal, with both periods being strongly dominated by quantitative approaches to research (typically involving multivariate regression and its derivatives), even when mixed forms of data were employed during the research process. Table 1 provides an overview of the approaches taken as determined from the reports in each of these journals for the two periods studied. Additional methodological details then follow, with further details added again for mixed methods articles.
Table 1: Overview of methodology, 2006–2014

                                 2006                2014
Methodological approach    AMJ   ASQ   Both    AMJ   ASQ   Both      N      %
Quantitative                16    12     28     17    14     31     59   71.1
Qualitative                  3     4      7      6     3      9     16   19.3
Mixed methods                0     4      4      1     3      4      8    9.6
Total                       19    20     39     24    20     44     83  100.0
Articles for both periods and both journals were clearly dominated by quantitative studies that relied on statistical analyses of archival, database, experimental or survey data, proportions being similar for both journals and both time periods. Qualitative articles were distributed across both years and both journals, with figures too small to draw conclusions about comparative patterns. An even smaller number of articles were classified as using mixed methods (4 in each year) using moderately stringent criteria that looked for evidence of integration (i.e., interdependence between the methods used) – 7 of the 8 were found in ASQ. Because changes over the period were clearly non-significant in quantitative terms, the two year-based samples were amalgamated for further analyses.
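The claim that change over the period was non-significant can be illustrated directly from the Table 1 counts. The sketch below (my illustration, not part of the original review; pure Python, no external dependencies) computes a Pearson chi-square statistic for the association between period and methodological approach. Two expected cell counts fall below 5, so the test is approximate at best, but the statistic is so far below the df = 2 critical value of 5.99 that the conclusion is safe.

```python
# Observed counts from Table 1 (2006 and 2014, both journals combined).
counts = {
    "2006": {"Quantitative": 28, "Qualitative": 7, "Mixed": 4},
    "2014": {"Quantitative": 31, "Qualitative": 9, "Mixed": 4},
}
approaches = ["Quantitative", "Qualitative", "Mixed"]

# Marginal totals: 39 and 44 articles per period; 59/16/8 per approach; N = 83.
row_totals = {period: sum(cells.values()) for period, cells in counts.items()}
col_totals = {a: sum(counts[p][a] for p in counts) for a in approaches}
n = sum(row_totals.values())

# Pearson chi-square: sum over cells of (observed - expected)^2 / expected,
# where expected = row_total * column_total / N.
chi2 = sum(
    (counts[p][a] - row_totals[p] * col_totals[a] / n) ** 2
    / (row_totals[p] * col_totals[a] / n)
    for p in counts
    for a in approaches
)

print(f"N = {n}, chi-square = {chi2:.3f}")  # chi2 is roughly 0.10,
# nowhere near the 5.99 needed for significance at alpha = .05 with df = 2.
```

The same marginals also reproduce the percentage column of Table 1 (e.g., 59/83 = 71.1% quantitative).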
Further detail is provided for all methodological approaches in Table 2.
Table 2: Details of methods within methodological approaches

Methodological approach                                        AMJ   ASQ   Total
Mixed methods                                                    1     7       8
  field work followed by experiment to test                      1     –       1
  preliminary qual to inform and then illustrate and help
    explain quant (+ follow-up qual for some)                    –     7       7
Qualitative                                                      9     7      16
  purely qualitative                                             8     3      11
  qual with some counts                                          1     4       5
Quantitative                                                    33    26      59
  purely quantitative                                           23    18      41
  quant including variables derived from qual                    5     4       9
  quant with preliminary qual (design/check vars and/or
    context; some minimally interpret findings)                  5     4       9
Total                                                           43    40      83
This shows that the majority (69.5%) of quantitative articles were ‘purely’ so, in that both their sources and their analyses were numerical/statistical. Nine of the 59 quantitative articles drew some or all of their variables from qualitative sources, including interviews, open-ended survey questions, newspapers and other archival sources. In these articles these data typically were coded according to a limited a priori system or auto coded using some form of computerised content analysis, with the codes then used in statistical analyses only without any further reference to the original textual sources. A further 9 with primarily quantitative data and exclusively employing statistical analyses had gathered often quite extensive preliminary qualitative data to assist with initial contextual information and/or for designing, identifying, or checking relevant variables, but 7 of these made no further reference to the qualitative information while 2 provided just occasional brief illustrative comments or a rare quote in the results or discussion sections. In none of these latter cases was the use and integration of qualitative data considered to be sufficient to define the studies as mixed methods. Indeed, one of the authors who gathered extensive qualitative data and did include a brief reference to it in his results made a point of noting (in a footnote) that his study should not be considered to be a mixed method study. Similarly, 5 qualitative studies (of a total of 16) that included some counts (usually just frequency of codes) in their reporting were not classified as using mixed methods for this analysis.
Eight studies were classified as employing mixed methods insofar as they extensively reported information derived from different components of their data, and integrated it to some extent. None described their study as such, however, and only one author made even an incidental reference to ‘mixed methods’ in summing up his data sources. Seven of these eight studies, all in ASQ, were quantitative dominant; the one mixed methods article in AMJ gave equal priority to both approaches. Three were primarily sequential in design, two concurrent, and three are best described as iteratively moving between components. The majority reported results sequentially, and tended to integrate the qualitative material in the discussion rather than the results section of the article.
Table 3 outlines additional details of the ways in which methods were combined in these studies. Each included extensive qualitative as well as quantitative data. The descriptions of the quantitative data sources and analyses were always detailed, while the descriptions of the qualitative sources and analyses varied considerably in extent and depth. The level of integration in results was generally limited, typically comprising use of the qualitative data for contextual, illustrative and post hoc explanatory purposes. Indeed, at least one, and possibly two or three might not merit classification as mixed methods studies if stricter guidelines regarding integration (e.g., in the results section) were applied.
Table 3: Details of mixed methods studies

Almandoz, J. (2014), ASQ
  Topic: Context in which local bank founders’ financial vs. community focus impacts on risk-taking behaviour
  Data and analyses: Iterative moving ‘back and forth’ between quant/qual archival data, theories, and qual interviews to contextualise and explain quant findings.

Amabile, T.M., Barsade, S.G., Mueller, J.S. and Staw, B.M. (2005), ASQ
  Topic: Temporal relationships between affect and workplace creativity for knowledge workers in project teams
  Data and analyses: Quant and qual data collected daily from workers over several months, plus monthly peer-ratings. Qualitative descriptions transformed to quantitative variables for statistical analysis; also provided illustrative material and incidental results.

Brickson, S.L. (2005), ASQ
  Topic: Organizational identity orientation (independent, dyadic/relational, or communal) and relations with others
  Data and analyses: Mostly qualitative data collected by survey, inductively coded; codes then associated with identity orientations (detailed examples of coding provided) and transformed for statistical analyses.

Grant, A.M., Berg, J.M. and Cable, D.M. (2014), AMJ
  Topic: Benefits of flexible, self-reflective (fun) job titles for reducing burnout in stressful organizations
  Data and analyses: (a) Inductive field study (interviews, observations and archival documents) in a high-stress organization; (b) quasi-experiment testing the impact and role of causal mechanisms in a less stressful environment.

Obstfeld, D. (2005), ASQ
  Topic: Social networking, individual agency and innovation in engineering
  Data and analyses: Iterative working between interviews, survey, key informant reviews, network data, and ethnographic observations. Separate statistical analyses with qual integrated to develop explanations.

Ou, A.Y., Tsui, A.S., Kinicki, A.J., et al. (2014), ASQ
  Topic: Develop and test a measure of humility; validate with CEOs in industry in China
  Data and analyses: (a) Theoretical development of concept; qual discussions to assess content validity of scale; EFA/CFA to refine; (b) use of scale to test hypotheses; qual interviews to elaborate the quant results.

Ody-Brasier and Vermeulen (2014), ASQ
  Topic: Normative ‘rules’ governing differential pricing of champagne grapes by growers
  Data and analyses: Surveys and interviews with growers and other industry stakeholders, plus extensive contextual data, as basis for statistical modelling and interpretation of social and other influences on prices.

Tilcsik, A. (2014), ASQ
  Topic: Effect of resource culture when hired on problem solving, and persistence and effectiveness of learned patterns
  Data and analyses: Longitudinal personnel records related to qual and quant changes in environmental characteristics; complemented by interviews analysed using inductive coding and matrix construction to illustrate and explain behavioural differences.
If the definition of mixed methods is extended to include mixed within an overall approach, then at least a further five studies (3 quantitative, 2 qualitative, from those so counted earlier) could be so classified. Of the three that would merit the description of quantitative mixed methods, two comprised multiple quantitative substudies from which overall patterns were integrated and interpreted as a single set of results; the third used a second quantitative study to expand on findings from the first. Two qualitative studies that could be considered mixed drew on multiple sources of qualitative data supplemented by a limited amount of quantitative data, one within the context of an ethnography, and the other a case study. Each presented their results in the form of an integrated narrative based primarily on their various qualitative sources, occasionally illustrated by a quantitative statistic or table. Ethnography and case studies are generally classified as qualitative methodologies, but because each of these is likely to incorporate multiple sources of data, and depending on the type and handling of that data, they could also be described as ‘inherently mixed’ (Teddlie and Tashakkori 2009), as was the case with these two studies.
3.1 Additional observations
In the process of reviewing articles one inevitably notices patterns within them, beyond the bare-bones requirements for the review. Perhaps this was especially so in this case because I come from a social science rather than management background; being in ‘foreign territory’ always sensitises one to cultural patterns that are taken for granted by ‘native’ inhabitants (or practitioners, in this case). What follows are the points observed that potentially matter for management researchers.
Perhaps because those using multiple or mixed methods did not self-define as doing so, no articles included any references to a growing literature on mixed methods. Nor did any discuss the kinds of issues typically covered in mixed methods articles in other journals, or demonstrate common practices of mixed methods researchers noted by Teddlie and Tashakkori (2012). These include, for example, making reference to paradigmatic issues (ontology and epistemology), indicating their purpose for mixing, naming and describing the type of design and/or using diagrams to illustrate the timing of different methods and their points of interface.
Authors across all methodological approaches rarely provided an overview of their methodological orientation or research design at the beginning of the methods section of their papers, although abstracts in AMJ generally provided a brief statement of design. Rather, methods sections almost always started directly with some specific aspect of process, such as sample selection. Similarly, the results sections of quantitative studies almost universally started with ‘Table x presents descriptive data and correlations for all variables’ – that is, as with the methods sections, starting with a specific aspect of the data rather than an introduction to guide the reader.
Within qualitative studies, mention of methodological approaches (with one or two exceptions) was limited to case study or guidance from grounded theory. Is it that these are the primary qualitative approaches that are relevant for management research questions, or does it suggest a limited range of options are considered by (or available to) management researchers?
Most researchers using qualitative methods developed theoretical models to explain their data, as did Grant et al. (2014) in Stage 1 of their equal status mixed methods study, but those using qualitative methods to complement quantitative methods generally just reported coded categories or themes or contextual items of information. This indicates a more superficial approach to analysis of the qualitative data by the latter group, despite their frequent reference to use of grounded theory methods (a claim thoroughly critiqued by Suddaby 2006). Quantitative studies that employed only preliminary qualitative data generally did not mention how their qualitative data were analysed (if they were at all), except that where they were transformed to create quantitative variables the coding process used was described.
Details of all quantitative procedures were extremely thorough, for example in providing the logic for each analysis strategy and in covering every possible threat to validity, and these procedures were also meticulously referenced. Apart from exceptions in some of the studies defined as qualitative, however, referencing of qualitative procedures was almost entirely limited to one or two of the many available texts, these being Miles and Huberman (1994) and Strauss and Corbin (1990 or 1998). Rarely was there anything more than superficial reference to the actual analysis strategies described in either of those texts.
No attempt was made, in this review, to assess the added benefit of taking a mixed approach. To make a definitive assessment of benefit would require a comparative analysis of the extent to which alternative methods used for a similar purpose were effective in achieving their purpose – or might have been, in the absence of a comparative study – a very complex task. Nevertheless, it was disappointing to see quantitative studies that clearly could have made more effective use of the often extensive amounts of preliminary qualitative data gathered so as to illustrate or further refine patterns and relationships being revealed through statistical analyses. Iterative moving back and forth between data sources, especially during the processes of analysis and writing, will reveal ideas and trends not noticed or developed when a single method is used.
4. Discussion
Molina Azorin and Cameron (2015: 469) echo Daft and Lewin’s (1990) call for researchers in organization studies to break out of their “normal science straitjacket”. Management, insofar as it is represented by these two journals, clearly continues to be a field in which empirical research is dominated by deductive, quantitative-statistical approaches and stylised reporting. While qualitative methods have achieved a small degree of acceptance by management researchers, use of mixed methods approaches continues to lag well behind despite the growing popularity of this approach in other fields, particularly those that, like management, have implications for translation to practice. Furthermore, the majority of mixed methods studies reported in these journals continued to rely on quantitative data and statistical analyses as their primary tool, with qualitative data and analyses being secondary and integration, in several cases, being further limited by sequential (methods-determined) reporting.
When defining an article as using mixed methods (and, analogously, when writing a mixed methods article), attention must be focused on evidence of interdependence between the different methods or approaches taken to gathering and analysing data. Interdependence speaks to a meaningful exchange between the varied approaches, which will be reflected in the way the study is conducted and its results are reported. It is through this meaningful exchange and reporting that the benefit of mixing methods becomes apparent. Several of the studies had rich but largely untapped data sources available to them; had their authors carried through and paid some attention to the data generated by their less dominant method when doing their final analyses and writing their results and discussion sections, the studies would have been defined as mixed. This emphasis on evidence of integration coming through into the written results of the studies refines what is meant by integration and interdependence in defining mixed methods. Beyond definitions and classification, what is lost are the additional insights to be gained when data derived from different methods are viewed together, compared and contrasted with discrepancies explored, and coordinated into an integrated conclusion.
Difficulties in classification abound in this still emerging methodological movement, however. Can one classify an ethnography that uses multiple methods to gather and analyse data, including some basic numerical analyses, as mixed methods research or is it a qualitative study using an ethnographic methodology? Is statistically based content analysis of a priori or auto coded qualitative data a quantitative or a mixed method? These are questions still being debated, but perhaps there is no definitive answer. Rather than presume set definitions and boundaries, the better course is to always outline one’s parameters, to provide descriptions rather than labels. Having the space to do so freely is a luxury provided by the two journals that were studied for this review.
www.ejbrm.com 33 ©ACPIL
Electronic Journal of Business Research Methods Volume 13 Issue 1 2015
The results of this review have particular and important implications for the training of management researchers, who appear to have a quite limited repertoire of methodological approaches on which to draw when undertaking their research – a conclusion reached also by Molina Azorín and Cameron (2015). The complexity of the problems researched in management and the level of research capability evident in management researchers would suggest adoption of mixed methods approaches would be readily achievable. Should the training of management researchers be extended to include a more thorough introduction to qualitative and mixed methods, they would be able to take even better advantage of the multiple and rich sources of data that they typically use in their studies. For example, they could draw on the qualitative sources they so often use only as preliminary data as a resource to contribute illumination and explanation when presenting and reviewing their follow-on statistical analyses, and they might more appropriately describe and reference the methods they use. Adopting fully developed mixed methods approaches to their studies, perhaps even in the dialectical mode recommended by Greene (2008) and practised by Jermier (1985), would provide management researchers with better opportunities to match rigour with relevance (as sought by Tushman and O’Reilly 2007), thus rendering their work more accessible to a professional audience (McGahan 2007), and to be ‘counterintuitive’ in challenging established theory (as sought by Bartunek, Rynes and Ireland 2006), thus rendering their work more interesting to an academic audience.
References

Almandoz, J. (2014) ‘Founding teams as carriers of competing logics: When institutional forces predict banks’ risk exposure’, Administrative Science Quarterly, vol. 59, no. 3, pp. 442-473.
Amabile, T.M., Barsade, S.G., Mueller, J.S. & Staw, B.M. (2005) ‘Affect and creativity at work’, Administrative Science Quarterly, vol. 50, no. 3, pp. 367-403.
Bartunek, J.M., Rynes, S.L. & Ireland, R.D. (2006) ‘What makes management research interesting, and why does it matter?’ Academy of Management Journal Editors' Forum, Academy of Management Journal, vol. 49, no. 1, pp. 9-15.
Bazeley, P. (2008) ‘Mixed methods in management research’, in Thorpe, R. & Holt, R. (eds.), The SAGE dictionary of qualitative management research, pp. 133-136, London: Sage.
Bazeley, P. (2009) ‘Editorial: Integrating data analyses in mixed methods research’, Journal of Mixed Methods Research, vol. 3, no. 3, pp. 203-207.
Bazeley, P. (2010) ‘Computer assisted integration of mixed methods data sources and analyses’, in Tashakkori, A. & Teddlie, C. (eds.), Handbook of mixed methods in social and behavioral research, 2nd edition, pp. 431-467, Thousand Oaks, CA: Sage.
Brickson, S.L. (2005) ‘Organizational identity orientation: Forging a link between organizational identity and organizations’ relations with stakeholders’, Administrative Science Quarterly, vol. 50, no. 4, pp. 576-609.
Creswell, J.W. & Plano Clark, V.L. (2011) Designing and conducting mixed methods research, 2nd edition, Thousand Oaks, CA: Sage.
Currall, S.C. & Towler, A.J. (2003) ‘Research methods in management and organizational research: Toward integration of qualitative & quantitative techniques’, in Tashakkori, A. & Teddlie, C. (eds.), Handbook of mixed methods in social & behavioral research, pp. 513-526, Thousand Oaks, CA: Sage.
Daft, R.L. & Lewin, A.Y. (1990) ‘Can organization studies begin to break out of the normal science straitjacket? An editorial essay’, Organization Science, vol. 1, no. 1, pp. 1-9.
Frost, P.J. & Stablein, R.E. (1992) Doing exemplary research, Newbury Park, CA: Sage.
Grant, A.M., Berg, J.M. & Cable, D.M. (2014) ‘Job titles as identity badges: How self-reflective titles can reduce emotional exhaustion’, Academy of Management Journal, vol. 57, no. 4, pp. 1201-1225.
Greene, J.C. (2008) ‘Is mixed methods social inquiry a distinctive methodology?’ Journal of Mixed Methods Research, vol. 2, no. 1, pp. 7-22.
Jermier, J.M. (1985) ‘“When the sleeper wakes”: A short story extending themes in radical organization theory’, Journal of Management, vol. 11, no. 2, pp. 67-80.
Johnson, R.B. & Gray, R. (2010) ‘A history of the philosophical and theoretical issues for mixed methods research’, in Tashakkori, A. & Teddlie, C. (eds.), Handbook of mixed methods in social & behavioral research, 2nd edition, pp. 69-94, Thousand Oaks, CA: Sage.
Johnson, R.B. & Onwuegbuzie, A.J. (2004) ‘Mixed methods research: A research paradigm whose time has come’, Educational Researcher, vol. 33, no. 7, pp. 14-26.
Johnson, R.B., Onwuegbuzie, A.J. & Turner, L.A. (2007) ‘Toward a definition of mixed methods research’, Journal of Mixed Methods Research, vol. 1, no. 2, pp. 112-133.
Maxwell, J.A. (2015) ‘Expanding the history and range of mixed methods research’, Journal of Mixed Methods Research, online first. doi: 10.1177/1558689815571132
McGahan, A.M. (2007) ‘Academic research that matters to managers: On zebras, dogs, lemmings, hammers, and turnips’, Academy of Management Journal, vol. 50, no. 4, pp. 748-753.
Pat Bazeley
Miles, M.B. & Huberman, A.M. (1994) Qualitative data analysis: An expanded sourcebook, 2nd edition, Thousand Oaks, CA: Sage. (A more recent edition is: Miles, M.B., Huberman, A.M. & Saldaña, J. [2014] Qualitative data analysis: A methods sourcebook, 3rd edition, Thousand Oaks, CA: Sage.)
Molina Azorín, J.F. (2011) ‘The use and added value of mixed methods in management research’, Journal of Mixed Methods Research, vol. 5, no. 1, pp. 7-24.
Molina Azorín, J.F. & Cameron, R. (2015) ‘History and emergent practices of multimethod and mixed methods in business research’, in Hesse-Biber, S. & Johnson, R.B. (eds.), Oxford handbook of multimethod and mixed methods research inquiry, pp. 466-485, New York: Oxford University Press.
Obstfeld, D. (2005) ‘Social networks, the tertius iungens orientation, and involvement in innovation’, Administrative Science Quarterly, vol. 50, no. 1, pp. 100-130.
O'Cathain, A., Murphy, E. & Nicholl, J. (2007) ‘Integration and publications as indicators of “yield” from mixed methods studies’, Journal of Mixed Methods Research, vol. 1, no. 2, pp. 147-163.
Ody-Brasier, A. & Vermeulen, F. (2014) ‘The price you pay: Price-setting as a response to norm violations in the market for champagne grapes’, Administrative Science Quarterly, vol. 59, no. 1, pp. 109-144.
Ou, A.Y., Tsui, A.S., Kinicki, A.J., et al. (2014) ‘Humble chief executive officers’ connections to top management team integration and middle managers’ responses’, Administrative Science Quarterly, vol. 59, no. 1, pp. 34-72.
Rallis, S.F. & Rossman, G.B. (2003) ‘Mixed methods in evaluation contexts: A pragmatic framework’, in Tashakkori, A. & Teddlie, C. (eds.), Handbook of mixed methods in social and behavioral research, pp. 491-512, Thousand Oaks, CA: Sage.
Strauss, A.L. & Corbin, J. (1990; 1998) Basics of qualitative research, Thousand Oaks, CA: Sage. (More recent editions are: Corbin, J. & Strauss, A.L. [2008; 2015] Basics of qualitative research, Thousand Oaks, CA: Sage.)
Suddaby, R. (2006) ‘What grounded theory is not’, Academy of Management Journal, vol. 49, no. 4, pp. 633-642.
Teddlie, C. & Tashakkori, A. (2009) Foundations of mixed methods research, Thousand Oaks, CA: Sage.
Teddlie, C. & Tashakkori, A. (2012) ‘Common “core” characteristics of mixed methods research: A review of critical issues and call for greater convergence’, American Behavioral Scientist, vol. 56, no. 6, pp. 774-788.
Tilcsik, A. (2014) ‘Imprint-environment fit and performance: How organizational munificence at the time of hire affects subsequent job performance’, Administrative Science Quarterly, vol. 59, no. 4, pp. 639-668.
Tushman, M. & O’Reilly, C. (2007) ‘Research and relevance: Implications of Pasteur’s quadrant for doctoral programs and faculty development’, Academy of Management Journal, vol. 50, no. 4, pp. 769-774.
van der Roest, J.-W., Spaaij, R. & van Bottenburg, M. (2015) ‘Mixed methods in emerging academic subdisciplines: The case of sport management’, Journal of Mixed Methods Research, vol. 9, no. 1, pp. 70-90.
Volume 12, 2017
Accepting Editor: Erik Shefsky │ Received: November 17, 2016 │ Revised: March 4, April 4, May 13, May 14, 2017 │ Accepted: May 27, 2017. Cite as: Goff, W. M., & Getenet, S. (2017). Design based research in doctoral studies: Adding a new dimension to doctoral research. International Journal of Doctoral Studies, 12, 107-121. Retrieved from http://www.informingscience.org/Publications/3761
(CC BY-NC 4.0) This article is licensed to you under a Creative Commons Attribution-Noncommercial 4.0 International License. When you copy and redistribute this paper in full or in part, you need to provide proper attribution to it to ensure that others can later locate this work (and to ensure that others do not accuse you of plagiarism). You may (and we encourage you to) adapt, remix, transform, and build upon the material for any non-commercial purposes. This license does not permit you to use this material for commercial purposes.
DESIGN-BASED RESEARCH IN DOCTORAL STUDIES: ADDING A NEW DIMENSION TO DOCTORAL RESEARCH
Wendy M Goff Swinburne University of Technology, Rowville, VIC, Australia
Seyum Getenet* University of Southern Queensland, Springfield, Australia
* Corresponding author
ABSTRACT

Aim/Purpose We show a new dimension to the process of using a design-based research approach in doctoral dissertations.
Background Design-based research is a long-term and concentrated approach to educational inquiry. It is often recommended that doctoral students should not attempt to adopt this approach for their doctoral dissertations. In this paper, we document two doctoral dissertations that used a design-based research approach in two different contexts.
Methodology The study draws on a qualitative analysis of the methodological approaches of two doctoral dissertations through the lens of Herrington, McKenney, Reeves and Oliver's principles of the design-based research approach.
Contribution The findings of this study add a new dimension to using a design-based research approach in doctoral dissertations in shorter-term and less intensive contexts.
Findings The results of this study indicate that design-based research is not only an effective methodological approach in doctoral dissertations, but it also has the potential to guide future research direction beyond examination.
Recommendations for Practitioners The findings of this study demonstrate that the design-based research approach could bring researchers and practitioners together around a common purpose: to design context-based solutions to educational problems.
Impact on Society We show an alternative view and application of design-based research in doctoral dissertations. Also, we identify the benefits of this type of research for doctoral students after completing their dissertations.
Keywords design based research, doctoral study, doctoral dissertation
INTRODUCTION

The fundamental assumption of empirical research, in most educational settings, is that practitioners will apply theories and research findings. However, we find that there is often no clear link between changes in practice and results of research (Corbin & Strauss, 2014). Design-Based Research (DBR) evolved near the beginning of the 21st century as a practical research methodology that provided a bridge between theory and practice in a classroom context (Anderson & Shattuck, 2012). It is complex and multi-faceted work that has the dual goal of developing theoretical insights and a solution to a problem (McKenney & Reeves, 2012). The dual goal of generating knowledge about both theory and practice simultaneously means that a close relationship with theory and new research findings is a crucial and ongoing component of the research process. DBR examines and develops theories about processes while also analyzing the effectiveness of a research design with participants, to shape the processes they are studying (Gravemeijer & Cobb, 2006). The recognition of the potential of DBR to advance theory and practice has seen the approach gain momentum in recent years, particularly in the field of education (van den Akker, Gravemeijer, McKenney, & Nieveen, 2006).
Barab and Squire (2004) define DBR as a series of approaches, with the aim of producing new theories and practices that potentially impact learning and teaching in naturalistic settings. This series of approaches goes by many different labels, such as DBR (Kelly, 2006), development research (van den Akker, 1999), design research (Reeves, Herrington, & Oliver, 2005), developmental research (McKenney & van den Akker, 2005), and design experiments (Brown, 1992; Collins, 1992). It is often purported within the literature that DBR requires intensive and long-term collaboration between researchers and practitioners. This collaboration demands the development of solutions to practical problems in learning environments, together with the identification of reusable design principles, and this is a major driving force of the research. As a result, there is an assumption embedded within the educational field that DBR is a long-term and intensive approach to educational inquiry. This assumption often projects the notion that doctoral students should not attempt to adopt this approach for their dissertations (Herrington, McKenney, Reeves, & Oliver, 2007).
However, some studies have begun to highlight the appropriateness of DBR for doctoral studies. Herrington et al. (2007) and Kennedy-Clark (2013) have illustrated the importance and the components of a dissertation proposal utilizing a DBR approach. For example, Kennedy-Clark (2013) suggested that the DBR approach can provide a platform for higher degree research (HDR) students to apply a range of analysis methods, data collection tools, and techniques, leading to a better understanding of the relative strengths and weaknesses of these techniques within their study. Herrington et al. (2007) provided specific guidelines on preparing the DBR proposal, including a sequential and practical description of the proposed research. Similarly, Plomp (2007) and Reeves (2006) have provided some insight to explicitly articulate the differing phases of a DBR approach, making it easier for doctoral students to conceptualize the approach in action. However, there is still a limited amount of research on how to use the DBR approach, particularly in HDR or doctoral studies contexts (Kennedy-Clark, 2013).
This study's authors illustrate the practical use of DBR in two doctoral dissertations. The two dissertations' designs revolve around the notion of professional development, and each formulated a set of draft principles to guide educators involved in supporting the mathematical learning of children. Also, we illustrate the challenges of using the DBR approach during the research process of the two doctoral studies and highlight how this approach can be effectively utilized and adapted for doctoral dissertations. Our research question is, “How can researchers use DBR to approach different doctoral dissertations and other short-term, less intensive, research contexts?”
LITERATURE REVIEW

DBR is a research approach that extends existing methods to address the issue of linking theory and practice in educational research (Corbin & Strauss, 2014). Within the DBR process, researchers actively frame problems to improve a perceived or current situation or problem. Kennedy-Clark (2013) considered DBR a constructivist-based proposition of an alternative epistemology of practice, presented as a thoughtful conversation with the situation. In addition, it is a methodological approach grounded in a pragmatic epistemology. Pragmatists rejected the quest for certainty, suggesting that science becomes understandable when the conception of science as a system of absolute truths is dropped (Maxcy, 2003) and the existence of multiple subjective realities is appreciated (Tashakkori & Teddlie, 2010).
Pragmatism is a generative mode of inquiry that is concerned with lived experience and interpretation. In fact, lived experience is at the forefront of pragmatic inquiry, where the study participants are not subjects of a study, but are experts and knowledgeable participants within their communities (Metcalfe, 2008). At the core of the inquiry is a path into reality (Cobb, 2011). Therefore, an inquiry is not only practical and outcome-orientated, but also scientific, and should be entwined with, and lead to, developing new theoretical understandings (Johnson, Onwuegbuzie, & Turner, 2007).
Pragmatic inquiry does not prescribe how an inquiry should take place, but the pragmatic nature of the inquiry supports reasoning out how choices were made throughout the inquiry (Cobb, 2011). It is through this argument that new theoretical considerations emerge to examine, refine, and develop existing theoretical understandings. DBR provides a vehicle for pragmatic inquiry, through a series of methodological approaches that assist in the exploration of complex phenomena in real-life contexts and in collaboration with people engaged in everyday practice (Herrington et al., 2007). This series of approaches works simultaneously to improve practice, while also testing, refining, and developing a scientific practice (Barab & Squire, 2004). The following two sections describe the phases of DBR mapped against typical elements of an HDR proposal and model the use of DBR in doctoral studies.
DESIGN-BASED RESEARCH IN HIGHER DEGREE RESEARCH

In HDR study, the development of the research proposal is an integral part of the doctoral process. If done well, the research proposal provides the HDR student with a robust plan to implement. It also provides a reliable map to refer to as the candidature unfolds. Most universities tend to provide their HDR students with specific guidelines to follow when preparing the research proposal (Herrington et al., 2007). There are some variances between the requirements of different institutions and discipline areas in preparing the research proposal. However, typical guidelines generally include information about the development of aims and objectives, rationale, research questions, significance, literature review, theoretical framework, methodology, data collection and data analysis, timeline, and ethical considerations (e.g., Herrington et al., 2007; Kennedy-Clark, 2013).
University guidelines provide HDR students with some sound insight into developing a plan for conducting traditional research. However, if the conceptualized inquiry does not fit neatly into the traditional research design, the guidelines provided by universities for doctoral students are confusing rather than helpful when formulating the research proposal and conducting the inquiry (Herrington et al., 2007). Herrington and colleagues provided insight into this issue by outlining how HDR students adopting a DBR approach might formulate a DBR project within a traditional, predictive doctoral research proposal. To do this, they present four phases of DBR by breaking down the preliminary phase of the approach into two separate phases. The phases of DBR are then juxtaposed with the elements of a traditional and predictive research proposal to demonstrate ‘fit.’ Table 1 highlights this mapping, as illustrated by Herrington et al. (2007).
While the work of Herrington and colleagues (2007) has been helpful in demonstrating how DBR might fit within the traditional research proposal, it does not adequately highlight the strength of DBR in HDR work, nor does it show the versatility and flexibility of the approach in conducting the research inquiry. It is important for both HDR students and their supervisors to realize the potential and strength of the DBR approach to obtain a sound basis for research training.
Table 1. Design-based research and elements of a research proposal
(Phases of design-based research are from Reeves, 2006)

PHASE 1: Analysis of practical problems by researchers and practitioners in collaboration
- Topics/elements needing description: Statement of problem; consultation with researchers and practitioners; research questions; literature review
- Position in a research proposal: Statement of problem (or Introduction, Rationale, or Background); Research questions; Literature review

PHASE 2: Development of solutions informed by existing design principles and technological innovations
- Topics/elements needing description: Theoretical framework; development of draft principles to guide the design of the intervention; description of proposed intervention
- Position in a research proposal: Theoretical framework; Methodology

PHASE 3: Iterative cycles of testing and refinement of solutions in practice
- Topics/elements needing description: Implementation of intervention (first iteration); participants; data collection; data analysis; implementation of intervention (second and further iterations); participants; data collection; data analysis
- Position in a research proposal: Methodology

PHASE 4: Reflection to produce “design principles” and enhance solution implementation
- Topics/elements needing description: Design principles; designed artefact(s); professional development
- Position in a research proposal: Methodology
MODELS TO USE DESIGN-BASED RESEARCH IN DOCTORAL STUDIES

Abdallah and Wegerif (2014) highlight the flexibility of DBR by presenting a version of DBR for doctoral studies. In contrast to the guidance provided to doctoral students by Herrington and colleagues (2007), the model does not provide a particular format to follow but rather “flexibly outlines realistic methods and procedures followed in the study under the DBR umbrella” (Abdallah & Wegerif, 2014, p. 15).

Derived from the work of Plomp (2009), the model presented by Abdallah and Wegerif (2014) is structured around three specific phases:
1. Integrating literature and exploratory research to develop an initial theoretical framework for design,
2. Implementing this with careful evaluation of processes as well as products in two iterations, using the results of the study of the first iteration to refine the second iteration, and
3. Finally, using the results of the second iteration to produce a new and improved theoretical framework for design, ready for further research study, presented as the main outcome of the thesis.
The authors suggest that this model highlights the flexibility of DBR by incorporating the notion that the particular context, nature, and objectives of individual research inquiries are unique (Abdallah & Wegerif, 2014). However, in doctoral research, the model is too broad, and as a result, lacks the specific guidance needed to navigate the different stages of the doctoral process.
Kennedy-Clark (2013) provides an overview of DBR in the HDR experience and contextualizes the phases of DBR in doctoral studies. In a comparison of doctoral theses using DBR, Kennedy-Clark suggests that the three phases of DBR can provide enabling checkpoints that support HDR students in redefining and reflecting on their research as it progresses. In the following section, we illustrate the three phases of DBR, which are the preliminary phase, the prototyping phase, and the assessment phase.
Doctoral students consider the preliminary phase as the stage of the inquiry in which to formulate the investigation and prepare the research proposal. This stage incorporates the doctoral proposal and all of the typical requirements that the proposal entails (review of literature, conceptual framework, research question formulation, etc.).
According to Kennedy-Clark (2013), the next phase of the approach is the prototyping phase, which marks the commencement of the research. This phase also involves a number of iterations of an intervention or material, with each iteration being a micro cycle of the research. The final phase posited is the assessment phase. Kennedy-Clark suggests that the purpose at this stage of the inquiry is to conclude the inquiry by discussing how the investigation has answered the research questions. While the checkpoints provided by Kennedy-Clark give some more specific direction as to how doctoral students might use DBR in the doctoral process, they provide little insight into the flexibility of DBR as a pragmatic mode of inquiry.
The DBR models discussed make a vital contribution to the field of DBR and provide some guidance concerning the HDR candidature. However, more insight and guidance into the flexibility of DBR for doctoral research is needed to ensure that students and their supervisors understand the potential afforded through the DBR approach. Notably, understanding the challenges of using DBR in doctoral studies paves the way to its effective use in doctoral work. Anderson and Shattuck (2012) advise that one of the challenges for doctoral supervisors and their students in the uptake of a DBR approach lies in the requirement of multiple iterations juxtaposed alongside the time constraints of the HDR candidature. They further suggest that a partial solution is for established education researchers to develop multiyear DBR research agendas that have legitimate space and roles for graduate students to undertake and own significant pieces of this larger agenda (Anderson & Shattuck, 2012). While such a solution might be a viable way to assist doctoral supervisors and students in realizing the potential of the approach in doctoral work, it requires significant monetary and time investment that might not necessarily be afforded in educational research.
Others also indicate various challenges of the methodological approach that might be problematic for doctoral students. These challenges include issues around research data usage, the problem of objectivity, involving participants in the research process, and the time to complete the research process in its entirety (Dede, 2004; Design-Based Research Collective, 2003; Wang & Hannafin, 2005). Studies have also indicated that DBR researchers often find themselves playing the conflicting roles of advocate and critic (Design-Based Research Collective, 2003), which can be a difficult path for the novice researcher to navigate.
Despite the challenges, DBR can be drawn upon for a variety of purposes. For example, Thein, Barbas, Carnevali, Fox, Mahoney, and Vensel (2012) used DBR to investigate the effectiveness of teaching multicultural literature through a collaborative and iterative process of inquiry guided by theoretical principles. Hakkarainen (2009) showed its applicability in education to design, implement, and refine a problem-based learning course on educational digital video use and production. Similarly, Wang and Hannafin (2005) showed the effectiveness of the DBR approach in designing technology-enhanced learning environments. Each of the studies was iterative, but the duration of the iterations varied across the different approaches. This is an important insight for doctoral students, as it highlights the adaptability of the approach for different purposes.
METHOD

In this study, we examine the methodological approach of two doctoral dissertations against the phases and principles of DBR through a qualitative comparative descriptive approach. Table 1 shows the phases and principles of DBR suggested by Reeves (2006) and Herrington et al. (2007). The study also identifies the purposes, brief outcomes, and the challenges of each doctoral study and demonstrates how the approach can be effectively adopted and modified to suit the doctoral inquiry.
DATA ANALYSIS

The authors of this study analyzed the data from both doctoral studies alongside the different phases and principles of the DBR approach. The analysis involved identifying themes and actions that were evident across both studies within each of the DBR phases and principles. After the authors identified the themes, the two studies were re-analyzed to identify common challenges that were evident across each of the inquiries. In a later section, we then examine the data alongside the traditional doctoral process to establish a set of principles to guide doctoral work that draws on a DBR methodological approach.
BACKGROUND INFORMATION - THE TWO DOCTORAL STUDIES’ PURPOSE AND CONTEXT
Doctoral study 1: Tertiary level

In doctoral study 1 [DS1] (Getenet, 2015), DBR was used to design a Professional Development (PD) program and simultaneously study the research participants’ knowledge of technology-integrated mathematics teaching. Hence, in DS1, the analysis of a learning problem formulated the research proposal and prompted quite specific ideas for a PD program approach. Alongside the PD program, the researcher included the creation of particular teaching and learning materials and methods designed to realize the participants’ learning gains predicted by theory and research. The section below illustrates the activities conducted in DS1 in each phase.
The preliminary phase of the research involved a contextual and problem analysis of teachers’ knowledge of technology-integrated mathematics teaching, with the development of a conceptual framework based on a review of the research literature. During this phase of the study, the researcher reviewed the theoretical framework of the study and formulated the PD program guidelines. The guidelines included the formation of teams, identification of available ICT, and the consideration of web-based software. Developing the theoretical framework for the study involved a review of the relevant literature to frame the knowledge required of mathematics teacher educators to integrate technology into their teaching. It also involved the development of an associated instrument to measure mathematics teachers’ knowledge of technology-integrated teaching in mathematics. The researcher used Smith and Ragan’s (2005) categories of context to drive the PD program guidelines through a context analysis. These categories are the analysis of learning context, analysis of learners, and analysis of learning tasks.
The second phase of DS1 focused on setting out design guidelines and optimizing the planned PD program through cycles of design, evaluation, and revision (Kelly, 2006; Plomp, 2009). This phase of the study was iterative, with formative assessment aimed at improving the intervention. Assessing the quality of the PD program in terms of its practicality and validity was part of this phase. Participants in DS1 were actively involved in shaping each PD program guideline for possible improvement. The participants further recommended additional PD program guidelines that made the intervention more relevant to their contexts. For example, they suggested the formation of small teams based on their work arrangements.
The last phase of the study was a summative evaluation to determine the extent to which the PD program had met the pre-determined objectives. This phase resulted in recommendations for improvement to the PD program and also helped identify the program's effectiveness. It involved examining the impact of the PD program on the research participants' technology-integrated mathematics teaching practices by comparing results before and after participation. In this phase, samples of lessons were presented to demonstrate the effectiveness of the teacher educators' ICT-integrated mathematics teaching.
In summary, DS1 drew upon DBR to improve practices. The DBR process followed in DS1 helped teachers meet the learning needs of their students. It also gave practitioners a voice in the research process.
Doctoral study 2: Transition into the first year of school
DBR was used in doctoral study 2 [DS2] (Goff, 2016) to formulate a set of draft principles to guide educators and researchers involved in supporting the mathematical learning of children making the transition to school. The research proposal included the analysis of a problem that the researcher had experienced personally in her roles as both a first-year-of-school and a prior-to-school teacher. The problem centered on how best to support the current mathematical understandings, skills, and knowledge of children when they start formal education. The research proposal included a theoretical analysis of the problem, coupled with a proposed solution (intervention) that was given to practitioners for further refinement and implementation. The purpose of the study was not to test the effectiveness of the proposed intervention, but to draw on the refinement and implementation of the intervention in context to develop draft principles for future work in this area. The section below illustrates each phase of DBR as reflected in DS2.
The preliminary phase of the study involved establishing a sound theoretical construct, which focused on how the mathematical learning of children could best be supported during the transition to school. Once established, the theoretical construct was drawn upon to develop a tentative research plan: an intervention that focused on supporting the mathematical learning of children. In relation to the development of the research proposal, the researcher presented a tentative plan for the research, but she communicated in her candidature document that the study was emergent and that how it unfolded in practice could therefore not be predetermined.
After confirmation of candidature, the researcher took the research proposal to practitioners for discussion and refinement. It was at this stage of DS2 (Phase 2) that the researcher recruited participants into the study and primed the intervention for implementation in two different contexts. The researcher documented both Phase 1 and Phase 2, which formed a significant component of the final thesis document. The refined intervention involved the establishment of two research teams located at two different sites (each consisting of a prior-to-school teacher, a first-year-of-school teacher, and families of children making the transition to school). Each research team was provided with the design brief to create a plan that would support the existing mathematical understandings of children as they made the transition to school. The researcher's role within the project was to study what transpired at the two different sites.

Design-Based Research in Doctoral Studies
114
The third phase of DS2 involved implementing the refined intervention. As the implementation took place, four research team meetings were conducted at each of the two sites. These meetings provided an opportunity for participants to talk about their experiences, including some of the challenges and opportunities that were transpiring. They also provided an opportunity for further refinement of the intervention as it unfolded in practice, and afforded the researcher and participants various opportunities to guide or 'test out' the intervention in different ways. During this phase, the researcher documented the processes as the intervention unfolded at the two sites, including the documentation and 'on-the-run' analyses of all decisions.
The final phase of the project involved a retrospective analysis of the process from conception to completion. In this phase of DS2, the conceptual framework of the Cultural Interface (Nakata, 2007) was drawn upon to analyze what transpired in practice. The purpose of this retrospective analysis was not to test the effectiveness of the intervention but to derive a set of draft principles that could be further refined and drawn upon in later work. In the thesis, these principles were presented as a starting point for others embarking on similar work (Goff, 2016). The final phase also involved examining the draft principles alongside the current research literature, which provided a way to offer recommendations for utilizing the draft principles and suggest their application in future work.
The findings of DS2 resulted in six design principles. These results are not definitive but rather provide a starting point for those examining how best to support the mathematical learning of children making the transition to school. They also provide a research trajectory for the doctoral student (Goff, 2016) post-candidature. The six design principles focus on how adults come together to support the mathematical learning of children and provide insight into what might need to be considered by those embarking on similar work. The six principles are:
Principle 1 - The transition to school is recognized as a developmental context for adults.
Principle 2 - Relationship and partnership are recognized as independent but interconnected concepts.
Principle 3 - Adults are supported to navigate through their transition experience.
Principle 4 - A particular task to perform will provide a reason for ongoing interaction and facilitate partnership.
Principle 5 - Opportunities for ongoing interactions about mathematics are afforded; and
Principle 6 - Opportunities to reconstruct mathematics in a shared space are provided (Goff, 2016).
RESULTS
In this section, we present the results of the study by examining the methodological approaches of the two doctoral dissertations against the phases and principles of DBR. The results focus on the phases of DBR and the challenges in using the DBR approach as reflected in the two doctoral studies.
PHASES OF DBR IN THE TWO DOCTORAL STUDIES
In the two doctoral studies, each phase of the DBR approach was used to fit its purpose. For example, DS1 used the first phase to develop a PD program and conduct a contextual analysis, whereas DS2 used this phase to investigate a problem of practice and develop a solution to the problem.
Table 2 presents the activities in each of the doctoral studies and the themes that emerged across the two studies. The background information on the two doctoral studies described in the method section of this study helped identify the themes described in Table 2.
Table 2. Summary of themes that emerged alongside the phases of the DBR approach

Phase 1
  Doctoral study 1: Researcher develops the PD program and conducts a contextual analysis. Participants are recruited into the project.
  Doctoral study 2: Researcher investigates a problem of practice and develops a proposed solution (intervention).
  Themes and actions: Contextual analysis; Background/Groundwork; Formulation of the research project

Phase 2
  Doctoral study 1: Design guidelines are established, and the PD program is implemented. The PD program is refined with practitioners in real-world contexts through several iterations of redesign and implementation.
  Doctoral study 2: The problem of practice and the proposed solution are presented to participants for further investigation and refinement. Participants are recruited into the project.
  Themes and actions: Implementation of the research project; Data collection; Ongoing analysis and refinement

Phase 3
  Doctoral study 1: A summative evaluation of the PD program is conducted.
  Doctoral study 2: The intervention is implemented and further refined with participants in real-world contexts. The researcher documents and guides this process alongside participants through iterative cycles of refinement.
  Themes and actions: Evaluation

Phase 4
  Doctoral study 1: Design principles for the PD program are derived and presented.
  Doctoral study 2: Retrospective analysis and the development of draft principles.
  Themes and actions: Formulation of design principles
As shown in Table 2, even though both doctoral studies completed similar activities in Phases 1, 3, and 4, the activities completed in Phase 2 differed slightly. For example, the primary focus of DS1 in Phase 2 was designing the PD program guidelines and refining them with practitioners in real-world contexts, whereas the focus of DS2 in Phase 2 was presenting a proposed solution to participants for further investigation and refinement, as well as recruiting participants.
CHALLENGES IN CONDUCTING EACH DOCTORAL STUDY
Doctoral study 1
DS1 encountered a number of challenges. These included an extensive data set, questions about how the researcher should position himself as the study unfolded, collaborating with research participants in the research process, and the time constraints of completing the research project. In the following paragraphs, we illustrate each of these challenges.
In DS1, the use of multiple instruments and multiple data sources generated a significant amount of data. The data collection instruments were questionnaires, interviews, observations, and focus group discussions (including workshops). As a result, the researcher had to choose which aspects of the research to emphasize and select the data that addressed the research themes as encapsulated in the research questions. This did not mean that the doctoral study used only a small part of the data collected; rather, it documented the whole design process and reported key findings.
The second challenge in DS1 was the role of the researcher: whether to lean towards a more subjective or a more objectivist stance in the research process. In practice, the researcher played a significant and positive role in shaping the phenomena under study. In DS1, the researcher engaged in the creative activity of developing PD program guidelines, guided by existing scientific knowledge and practitioners' voices. Equally, the researcher was careful not to impose his own values and beliefs but acted as a facilitator throughout the process.
Deeply involving participants in the research process was another challenge identified in DS1. At the start of DS1, the participants preferred to hold on to their established teaching approaches rather than immersing themselves in a new approach that required the use of ICT in their teaching. It took a significant investment of time to engage the participants effectively in each activity of the research process.
DBR usually takes an extended period so that a PD program can be developed and refined through an iterative process and, as a result, have maximum impact on the participants' practices (Wang & Hannafin, 2005). Due to the restricted time frame of the doctoral candidature, the study was unable to incorporate multiple prototypes for the PD design (Getenet, 2015).
Doctoral study 2
In DS2 (Goff, 2016), similar challenges were encountered. These included decisions around the recruitment of participants, the large data set that emerged, and the time constraints imposed by the doctoral process. The recruitment of participants in DS2 was an intricate process that demanded a significant amount of planning. The close proximity in which the researcher works with participants in DBR necessitates that a sound rapport between researcher and participants is developed (Mitchell, 2010). Within the time constraints of the doctoral process, this required careful consideration and various purposeful steps. For example, the researcher initially met with the participants' supervisors (the school principal and the pre-school coordinator) to explain the project and detail the level of support afforded to the participants. Such steps were planned during the proposal stage of the research and were designed to maximize the time that could be spent developing and building rapport.
Similar to DS1, the use of multiple instruments (video-recorded team meetings, research field notes, participant diaries, and email data) and data sources generated a significant amount of data. As a result, choices had to be made about which aspects of the data collected were to be highlighted and emphasized. The researcher chose the relevant data by maintaining an explicit focus on the research questions under investigation, rather than opening the inquiry up to other possibilities that emerged as the investigation unfolded. Table 3 summarizes the challenges identified in DS1 and DS2.
Table 3. Challenges identified in each doctoral study

Challenge identified          Doctoral study 1   Doctoral study 2
Recruitment of participants   -                  Yes
Data usage/Large data set     Yes                Yes
Objectivity                   Yes                -
Creating collaboration        Yes                -
Time constraint               Yes                Yes
As shown in Table 3, large data sets and time constraints were challenges common to both doctoral studies. In relation to time constraints, DBR inquiries are assumed to be in-depth and longitudinal, and to involve a research team rather than a solitary researcher. In the two doctoral studies, however, such affordances were not present due to the nature of the doctoral candidature and the limited time the doctoral process imposes. To combat these issues, the researcher in DS2 considered the inquiry beyond the doctoral process. As such, the DBR process was broken down into two distinct components: the doctoral research and a post-doctoral plan. This resulted in a change in the purpose of the doctoral research, whereby the aim of the inquiry was to formulate a draft set of design principles that could be offered to others as a starting point for engaging in similar work and also serve as a starting point for post-doctoral studies. In doing so, the depth of the DBR approach was not lost to the time constraints imposed by the doctoral candidature, nor was the notion that DBR demands an investment of time if it is to generate long-lasting or significant change. Table 4 summarizes the principles for a doctoral dissertation, potential challenges as reflected in both doctoral studies, and possible actions to overcome those challenges.
Table 4. Principles for DBR in doctoral studies

Phase 1 - Formulation of the research proposal
  Potential challenges: Creating collaboration; Time constraint
  Actions: Contextual analysis and formulation of the research project

Phase 2 - Implementation of the research project
  Potential challenges: Recruitment of participants; Large data set generated; Objectivity
  Actions: Data collection and ongoing analysis; Focus on the research questions under investigation; Carefully define the researcher's role in influencing and shaping the phenomena under study

Phase 3 - Summative analysis of data; Formulation of design principles
  Potential challenges: Large data set; Data usage; Time constraints
  Actions: Evaluating the project; Answering the research questions

Phase 4 - Thesis finalized
  Potential challenges: Time constraints
  Actions: Retrospective analysis of the entire project
Formulating a plan that extended beyond the doctoral inquiry also provided the researcher with a trajectory for academic work beyond candidature. This is an important consideration for doctoral candidates seeking to build a career in academia.
In summary, both doctoral studies encountered similar challenges in using DBR as a methodological approach.
DISCUSSION
The two doctoral studies transpired in different contexts; however, both remained consistent with the DBR methodological approach. Both doctoral projects were aligned directly with the various phases and principles proposed by Reeves (2006) and Herrington et al. (2007). What differed, however, was the overall goal of the methodology for each project. The goal of DS1 was to formulate a set of design principles and complete the DBR process, whereas the goal of DS2 was to commence the DBR process by developing a set of draft principles that could be further refined, tested, and built upon post-candidature. The goals of the methodology are an important consideration when navigating the doctoral process, particularly for doctoral students who adopt DBR as their methodological approach. Thinking about the doctoral process in different ways, such as the beginning of a longitudinal inquiry, can facilitate this. Consistent with the recommendations of Anderson and Shattuck (2012), the authors noted that the doctoral students remained focused on the goals that they set out to achieve rather than becoming caught up in the iterative and emergent nature of the DBR approach. This implies that the methodology should support the research goals rather than create them.
In each doctoral study, the DBR methodological approach provided a solid basis from which to engage in educational inquiry. Each study was situated in the pragmatics of the everyday, was conducted in collaboration with practitioners, and was emergent in nature. The DBR principles were used to meet each doctoral study's goals while working with practitioners. Both doctoral studies were also interventionist and were designed to address a problem of practice. Identifying similarities across different inquiries that have adopted a DBR methodological approach provides a basis that other doctoral candidates can draw upon. The two doctoral studies presented in this paper, coupled with the various phases and activities presented by Reeves (2006) and Herrington et al. (2007), provide insight into this process.
Mapping the themes that emerged across the two doctoral approaches, alongside the traditional doctoral research approach, highlights the effectiveness and flexibility of DBR as a methodology for doctoral work. Explicating the different stages of the doctoral candidature provides a robust roadmap with which HDR students can align the specific milestones of their candidature to the various phases of the approach. It also provides a roadmap for HDR supervisors as they guide and support their students' candidature.
The design principles for doctoral work highlight the different stages of doctoral work alongside the various phases of the DBR approach. The actions and potential challenges presented provide insight into what activity should take place throughout the candidature as well as highlight the differing chal- lenges that the HDR student might face as the study unfolds.
The challenges that emerged in the doctoral studies presented here were not unlike those identified by others in the DBR research literature (see Anderson & Shattuck, 2012; Dede, 2004). Creating complex interventions with practitioners in real-world contexts is a challenging activity. Conducting and driving the research process adds to this complexity, particularly within the doctoral candidature, where the added pressures of limited time and the novice navigation of the research terrain are also present. To overcome some of these challenges, it is important that the doctoral candidate makes use of the human resources that surround them at this time, particularly experienced researchers and supervisors who are well positioned to provide guidance and mentorship. Maintaining a close focus on the research questions under investigation will also achieve this goal (see Table 4), as will a preparedness for the common challenges identified in this study. In the two doctoral studies, time constraints were overcome through, for example, owning significant pieces of a larger research agenda, consistent with the recommendations of Anderson and Shattuck (2012).
Unlike the common trend in educational research of emphasizing the study of practices rather than their improvement (e.g., Anderson & Shattuck, 2012; McKenney & Reeves, 2012), the two doctoral studies showed the potential of DBR to improve practices through the active involvement of practitioners. Similar to the findings of Kennedy-Clark (2013), the pragmatic nature of DBR enabled the DS1 and DS2 researchers to improve their research design and their understanding of the problem while working with practitioners united by a common purpose. For example, DS1 involved both research on practice and research in practice concurrently. That is, it involved research on a practice by working with practitioners to design a PD program, which then served as a springboard to improve the practitioners' competencies in using technology to teach mathematics. This implies that the DBR approach allowed the researchers to collaborate with practitioners to address practical problems and enhance their practices, rather than only doing research on them. This is another reason for doctoral students to use a DBR approach in their dissertations: it can influence practice as well as create long-term partnerships with practitioners.
CONCLUSION
In this study, we used Reeves's (2006) and Herrington et al.'s (2007) phases and principles of the DBR approach to examine the methodological approaches of two doctoral dissertations. As a result, the authors presented two different doctoral dissertations that adopted a DBR methodological approach. As evident throughout this study, DBR provides an important methodological approach for understanding and addressing problems of practice, particularly in the educational context, where a long-standing criticism of educational research is that it is often divorced from the reality of the everyday (Design-Based Research Collective, 2003). Doctoral candidates situated in Faculties and Schools of Education who adopt a DBR methodological approach are well positioned to begin to change this mantra and to generate authentic change within the educational context.
This study has also demonstrated that the DBR approach can be employed to bring researchers and practitioners together to design context-based solutions to educational problems, which have deep-rooted meaning for practitioners about the relationship between educational theory and practice. As shown in the two doctoral studies, the researchers and practitioners worked closely together, making meaningful changes to their practice. Thinking creatively about what DBR might look like within the doctoral process and encouraging doctoral students to explore the methodological approach is a way to facilitate this. The design principles presented in this study can help guide this exploration.
Even though DBR is assumed to be a long-term and intensive approach to educational inquiry, one that doctoral students are often advised not to adopt for their dissertations, the findings of this study demonstrate that DBR can be used in the shorter-term and less intensive context of doctoral research. The findings could contribute to understanding the challenges of using DBR in doctoral work and possible strategies for overcoming them.
REFERENCES
Abdallah, M. M. S., & Wegerif, R. B. (2014). Design-based research (DBR) in educational enquiry and technological studies: A version for PhD students targeting the integration of new technologies and literacies into educational contexts. ERIC: ED546471. Retrieved from http://files.eric.ed.gov/fulltext/ED546471.pdf
Anderson, T., & Shattuck, J. (2012). Design-based research: A decade of progress in education research? Educational Researcher, 41(1), 16-25. doi: 10.3102/0013189X11428813
Barab, S., & Squire, K. (2004). Design-based research: Putting a stake in the ground. The Journal of the Learning Sciences, 13(1), 1–14. doi: 10.1207/s15327809jls1301_1
Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. The Journal of the Learning Sciences, 2(2), 141–178. doi: 10.1207/s15327809jls0202_2
Cobb, P. (2011). Introduction. In E. Yackel, K. Gravemeijer, & A. Sfard (Eds.), A journey in mathematics education research - Insights from the work of Paul Cobb (pp. 9-17). Dordrecht: Springer.
Collins, A. (1992). Toward a design science of education. In E. Scanlon & T. O’Shea (Eds.), New directions in educational technology (pp. 15–22). New York: Springer-Verlag.
Corbin, J., & Strauss, A. (2014). Basics of qualitative research: Techniques and procedures for developing grounded theory. London, UK: Sage publications.
Dede, C. (2004). If design-based research is the answer, what is the question? A commentary on Collins, Joseph, and Bielaczyc; diSessa and Cobb; and Fishman, Marx, Blumenthal, Krajcik, and Soloway in the JLS special issue on design-based research. The Journal of the Learning Sciences, 13(1), 105-114. doi: 10.1207/s15327809jls1301_5
Design-Based Research Collective. (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher, 32(1), 5-8. doi: 10.3102/0013189X032001005
Getenet, S. (2015). Enhancing mathematics teacher educators' technological pedagogical content knowledge through collaborative professional development: Ethiopia. Unpublished doctoral dissertation, University of Tasmania, Launceston, Australia.
Goff, W. M. (2016). Partnership at the cultural interface - How adults come together to support the mathematical learning of children making the transition to school. Unpublished doctoral dissertation, Charles Sturt University, Bathurst, Australia
Gravemeijer, K., & Cobb, P. (2006). Design research from a learning design perspective. In J. Van den Akker, K. Gravemeijer, S. McKenney, & N. Nieveen (Eds.), Educational design research (pp. 17–51). London, England: Routledge.
Hakkarainen, P. (2009). Designing and implementing a PBL course on educational digital video production: Lessons learned from a design-based research. Educational Technology, Research and Development, 57(2), 211– 228. doi: 10.1007/s11423-007-9039-4
Herrington, J., McKenney, S., Reeves, T., & Oliver, R. (2007). Design-based research and doctoral students: Guidelines for preparing a dissertation proposal. In C. Montgomerie & J. Seale (Eds.), Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications 2007 (pp. 4089-4097). Chesapeake, VA: AACE.
Johnson, R. B., Onwuegbuzie, A. J., & Turner, L. A. (2007). Towards a definition of mixed methods research. Journal of Mixed Methods Research, 1(2), 112-133. doi: 10.1177/1558689806298224
Kelly, A. E. (2006). Quality criteria for design research: Evidence and commitments. In J. van den Akker, K. Gravemeijer, S. McKenney, & N. Nieveen (Eds.), Educational design research (pp. 107–118). Abingdon, UK: Routledge.
Kennedy-Clark, S. (2013). Research by design: Design-based research and the higher degree research student. Journal of Learning Design, 6(2), 26-32. doi: http://dx.doi.org/10.5204/jld.v6i2.128
Maxcy, S. J. (2003). Pragmatic threads in mixed methods research in the social sciences: The search for multiple modes of inquiry and the end of the philosophy of formalism. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 51-89). Thousand Oaks, CA: Sage.
McKenney, S. E., & Reeves, T. C. (2012). Conducting educational design research. New York, NY: Routledge.
McKenney, S., & Van den Akker, J. (2005). Computer-based support for curriculum designers: A case of developmental research. Educational Technology Research and Development, 53(2), 41-66. doi: 10.1007/bf02504865
Metcalfe, M. (2008). Pragmatic inquiry. The Journal of the Operational Research Society, 59(8), 1091-1099. doi: 10.1057/palgrave.jors.2602443
Mitchell, W. (2010). 'I know how I feel': Listening to young people with life-limiting conditions who have learning and communication impairments. Qualitative Social Work, 9(2), 185-203. doi: 10.1177/1473325009346460
Nakata, M. (2007). The cultural interface. The Australian Journal of Indigenous Education, 36(S1), 7-14. doi: 10.1017/S1326011100004646
Plomp, T. (2009). Education design research: An introduction. In T. Plomp & N. Nieveen (Eds.), An introduction to educational design research (pp. 9–35). Enschede, The Netherlands: Netherlands Institute for Curriculum Development.
Reeves, T. C. (2006). Design research from a technology perspective. Educational Design Research, 1(3), 52-66.
Reeves, T. C., Herrington, J., & Oliver, R. (2005). Design research: A socially responsible approach to instructional technology research in higher education. Journal of Computing in Higher Education, 16(2), 96–115. doi: 10.1007/bf02961476
Smith, P. L., & Ragan, T. J. (2005). Instructional design. Hoboken, NJ: Wiley & Sons.
Tashakkori, A., & Teddie, C. (Eds.). (2010). SAGE handbook of mixed methods in social and behavioural research. London: Sage.
Thein, A. H., Barbas, P., Carnevali, C., Fox, A., Mahoney, A., & Vensel, S. (2012). The affordances of design-based research for studying multicultural literature instruction: Reflections and insights from a teacher-researcher collaboration. English Teaching: Practice and Critique, 11(1), 121–135. Retrieved from http://files.eric.ed.gov/fulltext/EJ970235.pdf
Van den Akker, J. (1999). Principles and methods of development research. In J. van den Akker, R. Branch, K. Gustafson, N. Nieveen, & T. Plomp (Eds), Design Approaches and tools in education and training (pp. 1–15). Dordrecht: Kluwer Academic Publishers.
Van den Akker, J., Gravemeijer, K., McKenney, S., & Nieveen, N. (2006). Introducing educational design research. In J. V. D. Akker, K. Gravemeijer, S. McKenney, & N. Nieveen (Eds.), Educational design research (pp. 3–7). New York, NY: Routledge.
Wang, F., & Hannafin, M. (2005). Design-based research and technology-enhanced learning environments. Educational Technology Research and Development, 53(4), 5-23. doi: 10.1007/bf02504682
BIOGRAPHIES
Dr Wendy Goff has worked with schools to develop a variety of intervention and support programs targeting the social welfare needs of children and families. Her research focuses on adult relationships and how they might impact on the learning and development of children. Throughout her career she has worked as a social welfare worker, a preschool teacher, and a primary school teacher. She joined the tertiary setting in 2010. Wendy is particularly interested in the mathematical learning of children, and her recent work involved the implementation of an intervention focusing on adults noticing and supporting the
mathematical understandings of children making the transition to school. Wendy uses an Educational Design-Based Research (DBR) methodological approach in her work, and her research is embedded in real-world settings.
Dr Seyum Getenet has worked as a lecturer in a teacher education university for more than 9 years. He has also taught primary and secondary school mathematics. His research interest is in the pedagogical content knowledge of mathematics teachers and how professional development is used as an enabler for change. Most of his research work uses a Design-Based Research methodological approach. He is currently working as a Senior Lecturer in Mathematics Curriculum and Pedagogy.
Copyright of International Journal of Doctoral Studies is the property of Informing Science and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use.
Mixing Qualitative and Quantitative Research in Developmental Science: Uses and Methodological Choices
Hirokazu Yoshikawa Harvard Graduate School of Education
Thomas S. Weisner University of California, Los Angeles
Ariel Kalil University of Chicago
Niobe Way New York University
Multiple methods are vital to understanding development as a dynamic, transactional process. This article focuses on the ways in which quantitative and qualitative methodologies can be combined to enrich developmental science and the study of human development, focusing on the practical questions of “when” and “how.” Research situations that may be especially suited to mixing qualitative and quantitative approaches are described. The authors also discuss potential choices for using mixed quantitative–qualitative approaches in study design, sampling, construction of measures or interview protocols, collaborations, and data analysis relevant to developmental science. Finally, they discuss some common pitfalls that occur in mixing these methods and include suggestions for surmounting them.
Keywords: mixed methods, quantitative, qualitative
How does knowledge gleaned from words complement knowledge gleaned from numbers,
and vice versa? How and when does the combination of quantitative and qualitative data collection and analytic methods enrich developmental science? Our science increasingly relies on multimethod approaches to examining developmental processes (Garcia Coll, 2005; Society for Research in Child Development, 2005; Weisner, 2005). As a consequence, developmental scholars have broken new ground over the past decade in understanding the cognitive, linguistic, social, cultural, and biological processes related to human development and family life. In this article, we focus on the many productive ways in which quantitative and qualitative methods can be combined to study human development.
Several summaries and handbooks focusing on integrating qualitative and quantitative data collection and analysis methods in the social sciences have been published recently (Axinn & Pearce, 2006; Bernard, 1995, 1998; Creswell & Plano Clark, in press; Greene & Caracelli, 1997; Tashakkori & Teddlie, 1998, 2003). Onwuegbuzie and Leech (2005) argue for combining the contrasting “Qs” (polarized quantitative and qualitative methods tracks and courses) into, for example, integrated bilingual, pragmatic research methods courses in education. In this article, we focus specifically on the uses of
Hirokazu Yoshikawa, Harvard Graduate School of Education, Harvard University; Thomas S. Weisner, Departments of Psychiatry and Anthropology, University of California, Los Angeles; Ariel Kalil, Harris School of Public Policy, University of Chicago; Niobe Way, Department of Applied Psychology, New York University.
This article was partially based on a conference, “Mixed Methods Research on Economic Conditions, Public Policy, and Child and Family Well-Being,” sponsored by the National Poverty Center, University of Michigan, in June of 2005. Additional support for that conference was provided by the American Psychological Association and the National Institute of Child Health and Human Development to Ariel Kalil and Hirokazu Yoshikawa. Work on this article by Thomas S. Weisner was supported by the University of California, Los Angeles Field Work Training and Qualitative Data Lab, National Institute of Child Health and Human Development 5 P30 HD004612 and Semel Institute, Center for Culture & Health. Work on this article by Hirokazu Yoshikawa and Niobe Way was supported by National Science Foundation Behavioral and Cognitive Sciences Grant 021589 to the New York University Center for Research on Culture, Development, and Education. Work on the article by Hirokazu Yoshikawa and Ariel Kalil was also supported by Scholars grants from the William T. Grant Foundation.
Correspondence concerning this article should be addressed to Hirokazu Yoshikawa, Harvard Graduate School of Education, 14 Appian Way, Room 704, Cambridge, MA 02138. E-mail: [email protected]
This article is reprinted from Developmental Psychology, 2008, Vol. 44, No. 2, 344–354.
This document is copyrighted by the American Psychological Association or one of its allied publishers. This article is intended solely for the personal use of the individual user and is not to be disseminated broadly.
Qualitative Psychology, 2013, Vol. 1(S), 3–18. © 2013 American Psychological Association. 2326-3598/13/$12.00 DOI: 10.1037/2326-3598.1.S.3
mixed methods for developmental science. We answer practical questions of when and how: When might mixing qualitative and quantitative approaches be useful in a developmental study? What are the methodological choices involved in qualitative and quantitative inquiry in studies of human development?
By quantitative research, we mean methods of inquiry that analyze numeric representations of the world. Survey and questionnaire data as well as biological or physiological data are often analyzed in quantitative units. Inquiry that relies on qualitative methods collects and analyzes non-numeric representations of the world—words, texts, narratives, pictures, and/or observations. The epistemological assumption underlying our discussion of mixed methods is that in scientific endeavors, the world can be represented through both numbers and words and that numbers and words should be given equal status in developmental science. Developmental science is a holistic enterprise including the social, neurological, and biological sciences. Although particular disciplines may emphasize particular methods of data collection and analysis, this is no reason to limit a particular program of research in developmental science to a single method.
In this article, we make the distinction between qualitative and quantitative data and qualitative and quantitative data analysis (Axinn & Pearce, 2006). The world is not inherently qualitative or quantitative; it is the act of human representation through numbers or non-numeric signifiers like words that makes aspects of the scientific enterprise qualitative or quantitative. Behaviors or contexts relevant to human development are not inherently qualitative or quantitative, but the methods of representation through which behaviors or contexts are recorded in research are. In this article, we define qualitative data as information that has been collected not in numeric form but in texts, narratives, or observations (including pictures and video). We define quantitative data as information that has been collected in numeric form (e.g., counts, levels, or Likert-format responses). We define qualitative data analyses, similarly, as forms of analysis that do not rely on numeric representation and quantitative data analyses as forms that do. Qualitative approaches cover a wide range of methods, just as there is a wide range of quantitative methods.
An important corollary to this distinction between qualitative and quantitative data and data analysis is that all four combinations of these two categorizations are possible. That is, qualitative data can be analyzed through either qualitative or quantitative data analysis techniques, as can quantitative data. Interview transcripts can be reliably coded for the frequency of mention of themes, the numbers of words or keywords, or the complexity of vocabulary and statistically analyzed. Ethnographic data from the world’s cultures have been coded for quantitative analysis (Rogoff, Sellers, Pirotta, Fox, & White, 1975). Conversely, individuals above or below a cut-off on a Likert scale or continuous dimension can be analyzed and characterized qualitatively, without further numeric representation.
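One of these four combinations, qualitative data analyzed quantitatively, can be made concrete with a minimal sketch. The transcripts and the keyword coding scheme below are invented for illustration (nothing here comes from the studies cited); the point is only that interview text can be coded into theme counts suitable for statistical analysis.

```python
from collections import Counter
import re

# Hypothetical interview transcripts (invented for illustration).
transcripts = [
    "My mother expected me to finish school no matter what.",
    "School felt far away; my friends mattered more than school.",
]

# Hypothetical coding scheme: theme -> keywords that mark it.
themes = {
    "education": {"school", "finish", "study"},
    "peers": {"friends", "peer"},
}

def code_transcript(text, themes):
    """Count mentions of each theme's keywords in one transcript."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter()
    for theme, keywords in themes.items():
        counts[theme] = sum(1 for w in words if w in keywords)
    return counts

coded = [code_transcript(t, themes) for t in transcripts]
print(coded)
```

The reverse combination is equally possible: cases falling above or below a cut-off on such counts could then be selected for in-depth qualitative reading.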
Before turning to our primary questions, we begin with three general beliefs that guide our discussion of mixing qualitative–quantitative methods in studies of human development. First, integrating these approaches can bring us closer to understanding a developmental process than either set of methods can on its own. This belief goes beyond the commonly stated value of triangulation across methods, a strategy that focuses on convergence across methods on a particular finding, or separating out methods variance. Rather, our belief is that the combination of words and numbers can bring us closer to the complexity of developmental change by providing divergent as well as convergent data. Divergent data across methods can spur further inquiry and refinement of theory rather than simply representing disconfirming information (Sieber, 1973). Integrated methods can also make a study more believable to broader audiences, because they represent the world more completely.
Our second belief is that the particular research question concerning developmental processes should determine whether and how qualitative and quantitative methods should be combined. As with other forms of research, methods should follow the question rather than vice versa. This means that not all research studies in developmental science call for the use of both kinds of methods. We will describe certain common types of research questions that we think lend themselves to the process of mixing methods.
Our third belief is that the qualitative–quantitative distinction itself is somewhat arbitrary and limiting (see also Onwuegbuzie & Leech, 2005). There are other dimensions of research methods often associated with this distinction (e.g., small–large sample, primary data collection–secondary data analysis, ungeneralizable–generalizable, noncausal–causal, nonexperimental–experimental, and culture-specific–universal) that cross-cut the qualitative–quantitative distinction. Anthropologists have described methods as experience-near (representing the voices, intentions, meanings, and local rationality of parents and children in local settings) and experience-distant (representing the world of groups, institutions, and social address categories). Methods can be particularistic, capturing a part of some phenomenon, or holistic, attempting to capture the whole context or situation (Weisner, 1996b). In this article, we aim to challenge the overly simple conceptualization of a single qualitative–quantitative “divide.”
Research Circumstances in Developmental Science That Call for Mixing Qualitative and Quantitative Methods
Many, but certainly not all, research situations may be particularly suited to mixed qualitative and quantitative approaches. We discuss several such situations that may be particularly relevant to developmental science here.
Assessing Developmental or Contextual Constructs That Are Difficult to Measure Using Either Set of Methods Alone
Human development occurs through the reciprocal exchanges between individual growth and change in social contexts (Bronfenbrenner & Morris, 1998; Thelen & Smith, 2006). However, some aspects of individual behavior or contextual characteristics can be difficult to understand using only quantitative or only qualitative methods. For example, recent work by Kathryn Edin and Laura Lein (1997) focusing on single mothers’ economic strategies and household budgeting established patterns of household expenditure that have been difficult to measure using traditional survey methods because of the sensitive nature of this information. Their assessment of spending relied first
on meeting mothers in person and gaining their trust through interviews, which were usually repeated over several months, until a typical month’s budget was fully accounted for. Qualitative data collection methods (semi-structured interviews) allowed the development of rapport that in turn facilitated a more complete and accurate accounting of income sources and expenditures than prior survey studies had achieved. In other words, their research questions required collecting and analyzing quantitative and qualitative data.
Another example of research requiring both quantitative and qualitative information concerns studies of diurnal and nocturnal stress processes in human development. Physiological measures, such as those representing stress processes, provide information about the effects of stress on human development that cannot be reported by individuals (Gunnar & Vazquez, 2001). However, these types of data should be combined with self-report data that provide information on individuals’ perceptions of and responses to daily stressful events (Adam, Hawkley, Kudielka, & Cacioppo, 2006), thus allowing researchers to track how individual behavior, at both the psychological and physiological levels, corresponds to individual perceptions and meaning-making. McKenna and McDade (2005) review evidence on perceived norms regarding co-sleeping between mothers and infants, as well as evidence for contingent psychobiological attunement that occurs in these dyads as they sleep together. Quantitative data are necessary for the monitoring of sleeping parents’ and infants’ physiological and behavioral patterns. But to understand the meanings, practices, and contexts of sleep patterns between mothers and children, qualitative (ethnography, parent interviews) and quantitative (questionnaires, systematic home observations) data are necessary. The combination of qualitative and quantitative evidence provides both prevalence estimates and information on culturally based goals and beliefs: We know from the combination of these forms of data and analysis that worldwide, parents sleep with infants and toddlers to ensure their health and to facilitate breastfeeding; older children and parents sleep together for shared comfort and familiarity. These practices do not lead to excessive dependency or other outcomes that often worry U.S. parents (Morelli, Oppenheim, Rogoff, & Goldsmith, 1992; Okami, Weisner, & Olmstead, 2002).
Integrating the Study of Beliefs, Goals, and Practices in Socialization and Development
Shweder et al. (2006) note that the study of culture in human development benefits from the integration of symbolic (e.g., beliefs, goals, and rules) and behavioral (e.g., customs and behaviors) aspects of cultural communities. In this view, the shared meanings that are passed on from one generation to the next and that constitute culture have both symbolic and behavioral dimensions. Likewise, Super and Harkness’s concept of the developmental niche of child rearing integrates attention to the physical setting, behavioral customs, and caregivers’ psychology (Harkness & Super, 1996; Super & Harkness, 1986). In order to conduct integrated studies of beliefs and practices in human development, it is necessary to conduct close observation of behaviors and activities in natural settings as well as to explore the beliefs, intentions, meanings, and goals of children, their caregivers, and others over time (Weisner, 2002). Examining behavior and belief systems requires both quantitative and qualitative approaches to research: quantitative methods to understand the prevalence of particular practices, behaviors, and beliefs, and qualitative methods to understand meanings, functions, goals and intentions. Authors of classic cross-cultural studies of children’s development have fully integrated qualitative and quantitative methods to examine both beliefs and behaviors of children and their caregivers, resulting in a blend of local and universal knowledge (LeVine et al., 1994; Whiting & Edwards, 1988; Whiting & Whiting, 1975). Ethnographic studies of childhood have a deep and rich literature across cultures and in the United States (Burton & Jarrett, 2000; LeVine, 2007). Ethnographic studies of children are important precisely because developmental pathways and contexts do vary so widely across local populations, cultural communities, historical periods, and ecologies, and so require careful and systematic description and analysis.
Parenting and development include a direction and purpose along a life path, or a cultural career (Goldschmidt, 1992), which organizes both symbolic and behavioral aspects of development.
LeVine (2003) calls for the blending of the study of universals in development with local variations in both the goals and specific practices of socialization and parenting around the world. Normatively “healthy” relationships are thought to require a balance between opposed dimensions of autonomy and intimacy, which is the dominant cultural relational schema underlying successful development in the United States (Tamis-LeMonda et al., in press; Weisner, 2001). But there are other developmental goals promoting “healthy” development, including “symbiotic harmony,” as found in Japan (Rothbaum, Pott, Azuma, Miyake, & Weisz, 2000), or “socially distributed” caretaking and support, as found in many Latin American, African, and Asian countries (LeVine, Miller, & West, 1988; Serpell, 1993; Weisner, 1987).
Beliefs, goals, and practices are particularly interesting when they are not congruent. The combination of quantitative and qualitative evidence can shed light on why this is so. In a recent study, Hughes et al. (in press) examined beliefs regarding the importance of various ethnic and racial socialization practices, as well as frequencies of those practices themselves, in a sample of 210 Chinese, African American, European American, and Latino adolescent–parent pairs. Both survey and semi-structured interview data were collected from both teens and parents. The researchers first uncovered discrepancies in their survey data between levels of beliefs and practices within participants as well as levels of beliefs or reported practices across the teen and parent in a particular family. The semi-structured interview data helped shed light on why the discrepancies occurred. For example, it appeared that routine, everyday activities (revolving around food, books, films, or artifacts, for example) were often identified as associated with ethnicity but not perceived as examples of intentional cultural or ethnic socialization.
Estimating and Understanding Developmental Change at Multiple Time Scales
Developmental growth over time in populations is best discerned by estimating trajectories of changing competencies and skills. Such work is conducted most often using quantitative methods (Collins & Sayer, 2001; Singer & Willett, 2003).
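The trajectory-estimation idea can be sketched in miniature. This is a simplification relative to the multilevel growth models the cited methodological texts describe: it fits an ordinary least-squares line per child, and all ages and scores below are invented.

```python
import numpy as np

# Hypothetical longitudinal data (invented): assessment ages in years,
# and one row of scores per child across the four waves.
ages = np.array([4.0, 5.0, 6.0, 7.0])
scores = np.array([
    [10.0, 13.1, 15.9, 19.2],  # child A
    [ 8.0,  9.4, 11.1, 12.3],  # child B
])

# Fit score = intercept + slope * age for each child by least squares.
X = np.column_stack([np.ones_like(ages), ages])
coef, *_ = np.linalg.lstsq(X, scores.T, rcond=None)
intercepts, slopes = coef

# Each slope is one child's estimated growth rate (points per year).
print(slopes)
```

A full growth model would additionally pool information across children and estimate variance in the slopes; this sketch only shows the per-child trajectory step.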
However, events of developmental importance can occur at a multitude of time scales and at intervals that are difficult to predict.
Developmental change occurs in part as a result of the cumulative impact of innumerable interactions with parents, caregivers, teachers, siblings, and peers in the settings and at the time scale of the daily routine. Such interactions can be assessed using methods that quantify the data (e.g., structured tasks and time diaries) and with methods that aim to understand the quality of those interactions (e.g., observations and interviews; Johnson, 1996). The spot observation technique, in which random occurrences of behavior are sampled and described in detail, has been used in ethnographic research investigating child rearing and family life in cultures around the world (Jankowiak, 1993; Super, 1976; Whiting & Whiting, 1975). For example, analysis of hundreds of sampled events resulting from systematic participant observation indicated that the balance of sleep, arousal, and restraint among infants in different cultures varied greatly and was associated with patterns of infant motor development (Super, 1976).
Mixing quantitative and qualitative evidence can also shed light on changes that occur within and across entire developmental stages. Parents of children with disabilities in one longitudinal study described hundreds of particular accommodations (e.g., activities intended to alter their daily routine at meal time, seeking services, transportation, caretaking, etc.) and why parents made them, based on semi-structured ecocultural interviews. Interviewers asked parents to “walk us through your day,” describing how and why families maintain daily routines. Quantitative ratings based on these interviews showed that the frequency of family accommodations remained relatively stable across early to middle childhood, while the intensity of such accommodations declined. Quantitative measures also showed that cognitive assessments of the children did not predict sustainable accommodations, whereas assessments of socioemotional functioning levels did (Bernheimer & Weisner, 2007; Gallimore, Coots, Weisner, Garnier, & Guthrie, 1996; Weisner, Matheson, Coots, & Bernheimer, 2005).
In a longitudinal study of social and emotional development among urban, low-income adolescents, survey measures of friendship quality indicated that whereas girls reported higher levels of perceived support from their friends in early adolescence, by late adolescence girls and boys were reporting equal levels of friendship support. However, qualitative findings indicated that the meaning and function of friendship support during late adolescence was dramatically different for girls and boys (Way, Becker, & Greene, 2006; Way & Greene, 2006).
Examining Reciprocal Relationships Between Contextual and Individual-Level Factors
Transactional theories of development posit that individual and contextual characteristics influence each other in reciprocal causal processes across time (Ford & Lerner, 1992; Gottlieb, 1997). In recent years, quantitative methods for modeling such reciprocal influences have grown, such that studies modeling reciprocal associations between individuals and their family, peer, and other contexts have become relatively commonplace (e.g., Eisenberg et al., 2005). The strengths of the quantitative approach include the ability to estimate how the strength of reciprocal causal associations changes over time. For example, quantitative data can be used to estimate how the influence of child characteristics on parenting changes between the periods of early and middle childhood.
Mixing qualitative and quantitative methods can give a richer picture of such reciprocal associations by uncovering in detail the processes by which individuals select their own (or others’) environments. A study using national survey data examined the factors that predicted parents’ choice of center-based care; mothers with higher levels of education, lower levels of social support (e.g., from a co-resident grandparent), and those providing higher levels of cognitive stimulation in the home were more likely to select center-based care (Fuller, Holloway, & Liang, 1996). A complementary qualitative interview study found that parents valued safety and trust in their providers more than other structural or process indicators of quality (Mensing, French, Fuller, & Kagan, 2000). It is important to note that these findings suggest that most survey-based, quantitative studies of child care quality are overlooking factors that parents value the most—in other words, aspects of the caregiver–parent relationship.
Exploring Causal Associations and Their Mechanisms
Both words and numbers can shed light on causality. However, the contributions of qualitative and quantitative methods are different, and the combination can provide a richer picture of a causal association than either can alone (Axinn & Pearce, 2006). Quantitative methods are suited to estimating the direction and magnitude of a causal influence on development. Whether using classic, random-assignment experimental methods or a quasi-experimental approach, the goal is most often an unbiased estimate of the effect of a predictor on a developmental outcome (Foster, 2002; Rubin, 1974; Shadish, Cook, & Campbell, 2002).
Qualitative approaches to causal analysis, on the other hand, are most suited to uncovering mechanisms of cause and effect (what some have called “process analysis”; Brady & Collier, 2004). Many researchers who use quantitative analyses to understand causal impacts of a treatment or phenomenon intend to eliminate selection effects; in contrast, qualitative analysis is often aimed at describing in detail these same processes, taking into account human agency. In addition, quantitative approaches, testing particular hypotheses about a delimited number of mediating mechanisms, may not help discern the full range of explanatory processes that hold in any particular cause–effect relationship. Qualitative methods can help uncover such mechanisms. For example, a qualitative analysis using data from the Moving to Opportunity residential-mobility experiment explored why the offer of a move from a low-income to a high-income neighborhood had more positive effects on girls’ academic performance and social behavior than it did on boys’. The qualitative substudy found that boys of parents who took the offer to move from high- to low-poverty neighborhoods had more difficulty adjusting to new neighborhoods. Girls adapted more quickly to the new settings, developing school-based friendship networks whose members were less likely to engage in risky behaviors. Girls felt more harassed in their old neighborhoods and experienced less fear in the new ones. These were experiences that had not been anticipated in the survey but emerged from in-depth qualitative interviews (Clampet-Lundquist, Edin, Kling, & Duncan, 2006).
Another common situation in which quantitative and qualitative data are integrated is in the evaluation of the implementation quality of programs for children and youth. Implementation is partly a matter of what is offered in a program and partly a consequence of families’ perceptions about a program that determine whether they make use of it. Both can explain or moderate the causal effects of an intervention on children. This mix of examining what was offered and how it was perceived is well suited to a combination of quantitative and qualitative methods. Datta (2005), for example, reviewed evidence from quantitative experimental evaluations of the Comer approach to whole-school reform. Quantitative data indicated that the program did not achieve the intended results; merged qualitative and quantitative data indicated that the Comer principles were effective, but only when the approach was appropriately implemented (Datta, 2005, p. 66). The Early Head Start national evaluation (Love et al., 2002) used a series of in-depth and focus-group interviews with staff and program directors in each of the 17 experimental sites to characterize site-specific theories of change (that is, beliefs about how and through what mechanisms the local program was affecting targeted outcomes; Weiss, 1995). These were then categorized as focused on parent processes, child processes, or a combination. The resulting three-category variable was used as a moderator of the experimental impact.
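The moderator logic can be sketched in a regression frame. This is a generic illustration with simulated data, not the Early Head Start evaluation's actual model or results: a treatment indicator is interacted with dummies for a three-category site variable, so the estimated treatment effect is allowed to differ across categories.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 30000

# Invented data: random treatment assignment and a three-category site
# variable (e.g., 0 = parent-focused, 1 = child-focused, 2 = mixed).
treat = rng.integers(0, 2, n)
site = rng.integers(0, 3, n)

# Simulate an outcome whose treatment effect differs by site category.
true_effects = np.array([0.1, 0.5, 0.9])
y = true_effects[site] * treat + rng.normal(0.0, 1.0, n)

# Design matrix: intercept, site dummies, treatment, and the
# treatment x site-dummy interaction terms (the moderation).
d1 = (site == 1).astype(float)
d2 = (site == 2).astype(float)
X = np.column_stack([np.ones(n), d1, d2, treat, treat * d1, treat * d2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Treatment effect within each site category: the reference-category
# effect, then that effect shifted by each interaction coefficient.
effects = [beta[3], beta[3] + beta[4], beta[3] + beta[5]]
print(effects)
```

If the interaction coefficients differ reliably from zero, the site-level theory-of-change variable moderates the experimental impact, which is the pattern described in the passage above.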
Another example of a puzzling causal association addressed by the integration of quantitative and qualitative research emerged from a 6-year longitudinal study of adolescent mothers and their children (Way & Leadbeater, 1999). In this study, the survey data indicated that mothers who reported lower levels of emotional support from their own mothers at the time of the birth achieved higher levels of educational attainment after 6 years than their counterparts who reported more emotional support. The qualitative, in-depth interviews with these young mothers indicated that those mothers who reported the least amount of emotional support from their own mothers at birth had mothers who had the highest expectations for their daughters. Thus, the low amount of emotional support received from their own mothers at the birth of their children was due, in part, to the anger and disappointment that their own mothers felt about their daughters’ having become
pregnant at such a young age. The mothers who were perceived to be the most emotionally supportive of their daughters at the births of their daughters’ children, however, had few educational expectations for their daughters; thus, they did not frown on the arrival of a grandchild. The qualitative analysis indicated that the predictors of educational attainment may have had more to do with the expectations the mothers had for their daughters than with the level of emotional support they provided. These findings underscore the ways in which quantitative and qualitative methods can be mixed to produce a clearer understanding of an association uncovered using quantitative methods.
Integrating the Study of Developmental Phenomena That Occur With High Prevalence With Those That Occur in Isolated Cases
The distinction between quantitative and qualitative research has occasionally been described as variable- versus individual-centered, or nomothetic and idiographic. However, this distinction is not accurate in that both qualitative and quantitative research can be conducted at either the population or individual level of analysis. Nevertheless, one strength of qualitative research is its usefulness in identifying isolated cases that may uncover an entirely new area of inquiry (Pearce, 2002; Turner, 2004). For example, many quantitative methods used in developmental science summarize information about groups of individuals rather than identifying and exploring unusual cases in depth. The ability to identify and then conduct follow-up detailed exploration of atypical cases may be a particular strength of qualitative approaches. This can occur in two ways. First, a qualitative analysis can uncover a new developmental phenomenon. This can open up the opportunity to explore its prevalence, predictors, and sequelae in quantitative studies. For example, the Sturm und Drang theory of adolescent development as a process of individuation requiring conflict with parents was developed largely through data and theorizing from case studies in psychoanalytic scholarship. This theory was then tested in numerous developmental studies of adolescence, most of which employed quantitative methods, to the point at which it was discounted as a necessary feature of successful adolescent development in
middle-class U.S. cultures. Theories of adoles- cent development were enriched through this process.
Second, a quantitative analysis could uncover an unusual developmental phenomenon, with qualitative research employed to investigate it in more depth. An “outlier” or set of outliers in a quantitative analysis, for example, could be followed up with qualitative inquiry. Pearce (2002) conducted a study of the influence of religion on family life in Nepal in which a subsample of adults in a survey study were identified who preferred much larger family sizes than predicted in a regression analysis (predictors included demographics as well as religioethnic group and a variety of religious beliefs and activities). This subgroup was then interviewed using qualitative methods about their family size preferences. On the basis of the interview findings, which suggested that the proper unit of analysis for religious practice in the sample communities was the household, Pearce recoded activity variables to represent household-level activity and increased the predictive power of her quantitative analysis.
The United States is usually an outlier compared with the world of children and parents to which we should always hope to generalize. Cross-cultural and cross-national samples are needed to test findings from work done in a single community or nation. The United States is an isolated case for which we need much larger and more representative samples of the world’s children, parents, and contexts for development. Often, qualitative and quantitative evidence helps to put our own isolated U.S. case into perspective. The Sturm und Drang hypothesis, for example, required major modifications when it was compared with studies concerning the quantitative patterning as well as the qualitative meaning of adolescent–parent relationships in other cultures (Brown, Larson, & Saraswathi, 2002; Larson & Verma, 1999; Schlegel & Barry, 1991). Comparisons of child and family poverty across the affluent nations of Europe, the United States, and Canada show sharp differences in poverty levels and in some of the reasons (e.g., social investment and support programs, taxation and income redistribution, and the political makeup of elected bodies) for such variation (Rainwater & Smeeding, 2003). The notion that sibling caretaking would inevitably create rivalry and that caretakers would “lose their childhoods” is far from what cross-cultural or U.S. data show; there are costs in well-being but also many benefits for increased child nurturance, responsibility, and the values of socially distributed care and support (East, Weisner, & Reyes, 2006; Serpell, 1993; Weisner, 1996a).

SPECIAL SECTION: MIXING METHODS
Methodological Choices in Mixing Qualitative and Quantitative Approaches
Methodological choices are determined by the particular research question at hand. Here we discuss some of the critical choices one must make when designing studies, sampling, constructing measures or interview protocols, and analyzing data using a mixed quantitative–qualitative approach.
Research Design and Data Collection Modality
The productive mixing of qualitative and quantitative methods can occur in the context of a variety of research designs, including nonexperimental and experimental studies and prospective longitudinal as well as cross-sectional or retrospective studies. The choice of design should ideally be made a priori, with attention to the particular strengths of each design within the context of the research topic (causal inference, e.g., for experimental studies; the ability to model change for longitudinal studies). The use of integrated methods throughout the stages of a study, and an iterative, cumulative approach to inquiry, rather than the use of a new set of methods after the research design for the other part of the study has already been finalized, is likely to result in richer data and theory. Goldenberg, Gallimore, and Reese (2005) illustrate this in their 15-year longitudinal research program studying Latino children’s literacy development, in which an interest in the contexts that mattered for these children’s school success led to the use of ethnography in homes and schools; qualitative interviews with parents, teachers, and children; questionnaires; school records; and developmental assessments. The research team used an iterative process of data collection. For example, their quantitative findings indicated that parental personal and educational backgrounds at school entry significantly influenced the literacy beliefs and home literacy practices of parents and children and also their children’s early school achievement. These findings were enriched by qualitative data collection in the cities, small towns, and rural villages of origin of the parents in Mexico to figure out why these associations held.
Much has been written about the choice among the many qualitative methods and data collection modalities (e.g., ethnography, in-depth interview, structured open-ended questions, and focus groups) available (Bernard, 1998; Creswell, 1998; Denzin & Lincoln, 1998). In the context of a mixed qualitative–quantitative study, the match between kinds of quantitative and qualitative methods should be considered in addition to the usual match of method to research question. One method might be chosen specifically to fill in the gaps or shortcomings of another. For example, a survey study that examines parenting practices and child development without much attention to the physical context of the home or the community may benefit from participant observation that provides detailed, in-depth descriptions of these social contexts of parenting. For another example, if group process and discourse are important elements of a construct (e.g., peer perceptions) but have not been a focus of research using one set of methods, a data collection method that provides group dynamics data, such as sociometric ratings or focus groups, could be chosen for the next phase in the research.
Relationship Between Researcher and Participant
Developmental researchers should consider the nature of the relationship between themselves and their participants when choosing between qualitative and quantitative data collection and analysis strategies. Direct contact with participants is usually not an option when conducting secondary data analysis, particularly with survey or administrative data. A researcher may wish to complement such secondary analysis with a data collection strategy (qualitative or quantitative or both) that allows more direct contact with a particular population. This more direct contact can result in a more comprehensive understanding of a developmental phenomenon.
If the two sets of methods are to be used with the same participants, one issue to consider is how the relationship between researcher and participant changes across data collection modalities. This change in relationship quality may have consequences for data quality. On one hand, conducting qualitative interviews first can establish a level of rapport that is crucial for collecting rich and personal accounts. On the other hand, some quantitative methods may be more likely to provide confidentiality or anonymity (e.g., computer-assisted survey administration). Pilot samples and research testing different approaches and obtaining information on participants’ experience of the range of data collection methods can help inform choices regarding particular combinations.

YOSHIKAWA, WEISNER, KALIL, AND WAY
Sampling
Mixing qualitative and quantitative approaches brings up vexing tradeoffs regarding how to sample. Typically, qualitative samples are smaller than quantitative samples because of the time demands of qualitative data collection and analysis. However, this need not be the case. Some researchers (e.g., Edin & Lein, 1997; Way, Gingold, Rotenberg, & Kuriakose, 2005; Way & Pahl, 2001) collect both in-depth qualitative interviews and survey measures from entire samples of hundreds of participants. If that is not possible, two common alternatives are embedding or nesting a qualitative sample within a larger quantitative sample and drawing a separate qualitative sample with a similar sampling plan. There are some advantages to the nested design. First, one can examine the quantitative (e.g., survey) data of the qualitative sample. Subgroups of the qualitative sample, for example, can be drawn based on responses obtained from the survey (e.g., at an extreme or in the middle of the range of one or more measures; Miles & Huberman, 1994). Second, one can more easily generalize from one sample to the other if they are nested. Random subsampling can be especially useful in this regard. However, participant burdens are certainly lessened if a qualitative sample is drawn separately (e.g., Cherlin, Burton, Hurt, & Purvin, 2004).
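The nested design described above can be sketched in code. The illustrative Python fragment below draws a qualitative subsample from the extremes and the middle of one survey measure, then samples randomly within each stratum; the dataset, variable names, and stratum sizes are invented for illustration and do not come from any study cited here.

```python
import random

def nested_qualitative_subsample(survey, measure, k_per_stratum=5, seed=42):
    """Draw a qualitative subsample nested within a quantitative survey sample.

    Strata are the low extreme, middle, and high extreme of one survey
    measure -- one common way of selecting embedded qualitative cases.
    """
    rng = random.Random(seed)
    ranked = sorted(survey, key=lambda r: r[measure])
    n = len(ranked)
    strata = {
        "low_extreme": ranked[: n // 4],
        "middle": ranked[n // 4 : 3 * n // 4],
        "high_extreme": ranked[3 * n // 4 :],
    }
    # Random sampling within each stratum keeps the nested cases
    # generalizable back to the larger survey sample.
    return {name: rng.sample(group, min(k_per_stratum, len(group)))
            for name, group in strata.items()}

# Hypothetical survey sample of 200 respondents with one 0-100 measure.
rng = random.Random(0)
survey = [{"id": i, "score": rng.uniform(0, 100)} for i in range(200)]
subsample = nested_qualitative_subsample(survey, "score", k_per_stratum=6)
print({name: len(cases) for name, cases in subsample.items()})
```

The same ranked list could instead feed criterion-based selection (e.g., only cases above a clinical cutoff), mirroring the embedded designs discussed next.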
An embedded qualitative sample can be drawn based on particular criteria, such as family structure, risk level, or developmental status. For example, a recent qualitative investigation drew a subsample from a larger quantitative study of welfare recipients on the basis of women’s entry into marriage over the 5-year time frame of the larger study (Jayakody & Seefeldt, 2005). Another ethnographic subsample was drawn randomly from both conditions of an experimentally evaluated intervention in the New Hope antipoverty experiment (Duncan, Huston, & Weisner, 2006; Gibson-Davis & Duncan, 2005). In this case, the researchers argued strongly for incorporating both experimental and control-group members in the qualitative substudy in order to gain more powerful insights into the causal effects of the intervention. Over 1,300 program and control sample adults were eligible for the full survey; these adults were randomly assigned to either the New Hope or control group. Of those, over 800 had at least one child between the ages of 1 and 12 years (the focal age group for the child and family study). From this group, equal numbers of program and control families were randomly selected to participate in the ethnographic study (and continue in the survey study sample as well).
Network-based sampling (e.g., “snowball sampling,” in which respondents refer the researcher to other respondents) is quite common in qualitative research. By carefully selecting a range of starting cases, engaging in several stages of referrals from those cases, and halting referrals after only a few stages, researchers can represent a relatively wide range of variation on demographic characteristics in a particular population (Heckathorn, 1997). The choice between network-based sampling and population-based sampling should be informed by the type of population as well as the response rate obtained. For example, researchers may better sample a “hidden” or stigmatized population using network-based sampling than they may using population-based sampling, whereas the reverse may be true for a population from which one can obtain a higher response rate (Small, 2005).
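The stage-capped referral logic just described can be made concrete with a short sketch. The fragment below follows referrals breadth-first from a set of seed cases and stops after a fixed number of waves, so that no single network dominates the sample; the referral network and names are hypothetical, not data from Heckathorn (1997).

```python
from collections import deque

def snowball_sample(referrals, seeds, max_stages=3):
    """Network-based ('snowball') sampling with a capped number of
    referral stages: start from a range of seed cases, follow
    referrals breadth-first, and halt after max_stages waves.
    """
    sampled = set(seeds)
    frontier = deque((person, 0) for person in seeds)
    while frontier:
        person, stage = frontier.popleft()
        if stage >= max_stages:
            continue  # halt referrals beyond the allowed waves
        for referred in referrals.get(person, []):
            if referred not in sampled:
                sampled.add(referred)
                frontier.append((referred, stage + 1))
    return sampled

# Hypothetical referral network: each key lists the people they refer.
referrals = {
    "A": ["B", "C"], "B": ["D"], "C": ["E"], "D": ["F"],
    "F": ["G"],  # only reachable at later stages from seed "A"
}
print(sorted(snowball_sample(referrals, seeds=["A"], max_stages=2)))
# → ['A', 'B', 'C', 'D', 'E']
```

Raising `max_stages` trades breadth of coverage against dependence on the seeds’ networks, which is the tradeoff the text highlights.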
The systematic sampling of particular contexts to highlight variation in qualitative cases is a common approach when the topic of study is development within that context. This task becomes more complicated when it is conducted in combination with sampling in a quantitative study. For example, a qualitative study of child or youth development in neighborhood contexts that is conducted within a larger quantitative study may need a sample of a smaller number of communities than those represented in the larger sample. Neighborhoods may be selected on the basis of particular dimensions that are of interest in the study; the number of dimensions across which neighborhoods are chosen, however, will be more constrained in the qualitative study. Decisions on how many interviews or participants to sample per neighborhood depend, in turn, on the individual-level characteristics across which one would like to ensure variation. This is a topic that is not well understood and would benefit from new research. In the New Hope experiment (Gibson-Davis & Duncan, 2005), the ethnographic cases constituted a qualitative subsample of roughly 8%. The qualitative study sample size was a decision based on time, money, and intuition about how many families and children would be enough. With 44 cases, one could detect a program impact of about 0.6 standard deviation with a 95% confidence interval. However, it turned out that, using the full survey data sample, researchers found no program impacts as large as 0.6. The qualitative data could not be used to detect new experimental impacts in the ethnographic sample.
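A figure like the 0.6 standard deviation quoted for the New Hope subsample can be approximated with a standard minimum-detectable-effect calculation. The sketch below uses a two-sample normal approximation with 22 program and 22 control cases and a two-sided alpha of .05; the even split and the 50%-power convention (an effect just reaching significance) are our illustrative assumptions, not details reported by the study.

```python
from math import sqrt
from statistics import NormalDist

def minimum_detectable_effect(n1, n2, alpha=0.05, power=0.5):
    """Smallest standardized effect (in SD units) a two-group
    comparison can detect, using a normal approximation."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_power = NormalDist().inv_cdf(power)  # 0 at 50% power
    return (z_alpha + z_power) * sqrt(1 / n1 + 1 / n2)

# 44 ethnographic cases split evenly across program and control:
mde = minimum_detectable_effect(22, 22)
print(round(mde, 2))  # → 0.59, close to the 0.6 SD quoted above
```

Running the same function with the full survey sample sizes shows why the survey, but not the ethnographic subsample, could detect the smaller impacts actually observed.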
Variance and other features of the quantitative data also affect decisions about how, and how many, to select in a subset; such features often cannot be predicted before doing a study. Factors to consider when making decisions about the relative sample sizes of quantitative and qualitative samples include ensuring representation of the full range of the target population; allowing for variation in the variable or topic of interest; ensuring that fieldworkers have the time and resources to capture rich, complex, and nuanced developmental processes; and estimating statistical power a priori for key associations. These are methodological dilemmas specific to mixed-methods work for which few established guidelines are as yet available.
Measure Development
The development of assessment and measurement tools in one set of methods can be based on evidence from the other. Perhaps most common is the situation in which qualitative evidence is used to develop quantitative instruments. Pendleton, Poloma, and Garland (1980), for example, used interviews with 53 dual-earner couples to develop quantitative scales tapping aspects of work and family such as domestic responsibility, satisfaction, self-image, and career salience.
Qualitative evidence can also be used to improve on the limitations of measures that have historically been implemented in quantitative survey instruments. Lugo-Gil and Yoshikawa (2006), for example, analyzed qualitative interviews on expenditures on children conducted with immigrant and ethnically diverse parents. These interviews suggested multiple ways in which the standard U.S. survey approach to expenditure measurement (the Consumer Expenditure Survey) could be revised to better measure expenditures on children in diverse families. Revisions were made to time frames, definitions of household, and phrasing of questions, and categories particularly relevant to consumption in these families, such as informal contributions from others and remittances, were added. The survey measure based on the qualitative findings was then administered to estimate investments in children in a larger survey sample.
Qualitative protocols also can be developed from quantitative data. For example, participants can be asked how two domains of experience are related on the basis of quantitative study of the two domains in association with each other (e.g., asking adolescents how experiences of discrimination in their daily lives might relate to their well-being and their school engagement, a question that is best asked after extensive probing of each of these topics separately). Similarly, the constructs of “time” and “money” have long been studied as key components of family life and child development. In quantitative studies, these constructs are usually assessed separately with time diaries and expenditure grids. A recent qualitative study of unemployed middle-class fathers, however, asked parents about the tradeoffs they perceived in having time versus money as key parenting “investments” in their school-age children. The findings suggested a nuanced portrait of the conditions under which the merits of increased time for children as a result of parents’ job loss outweighed the loss of income from employment. For example, for unemployed fathers who had sufficient savings or other financial resources, and who were therefore experiencing relatively little economic stress, the unexpected chance to rekindle or strengthen relationships with children through spending more time together was viewed as a welcome opportunity that would have long-lasting positive consequences. In contrast, for those fathers whose financial obligations weighed more heavily on them, the increased time they were spending at home served only as a frustrating reminder of their unemployment status. These qualitative findings suggest, in turn, revising quantitative measures to assign “weights” to these important inputs in light of a particular family’s circumstances (Kalil, Spindel, & Hart, 2006).
Data Analysis
Quantitative and qualitative data analysis from a mixed-methods study can be sequenced in a variety of ways. As the examples throughout this article illustrate, there is no “best” way for the two kinds of analyses to inform each other. Studies range from two-stage models, in which the qualitative analyses follow the quantitative analyses or vice versa, to complex iterations in which, at different stages, qualitative and quantitative analyses accomplish different purposes. Odom et al. (2006) conducted a study of the experiences of preschool children with disabilities in inclusive classrooms. They gathered survey, observational, participant observation, and in-depth interview data on children’s experiences in inclusive classrooms. In their analyses, they first identified children at extremes of social acceptance and rejection in the survey data and then analyzed the two groups’ experiences holistically using a range of qualitative methods. Next, they conducted a quantitative cluster analysis to identify patterns of peer sociometric perceptions associated with acceptance or rejection and validated those clusters using participant observation methods. This complex set of analyses provided a rich picture of the experience of acceptance and rejection of children with disabilities in an educational setting.
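The clustering step in a sequence like the one Odom et al. describe can be illustrated with a minimal example. The fragment below runs a small Lloyd’s-algorithm k-means on hypothetical one-dimensional peer acceptance ratings; the data, the choice of k = 2, and the "accepted"/"rejected" framing are invented for illustration and are not drawn from the study, which would use richer sociometric data and standard statistical software.

```python
def kmeans_1d(values, k=2, iterations=50):
    """Minimal Lloyd's k-means on one-dimensional data, returning
    (centroids, assignments). Enough to split toy sociometric
    ratings into two clusters in this sketch."""
    # Spread the initial centroids across the sorted values.
    centroids = sorted(values)[:: max(1, len(values) // k)][:k]
    assignments = [0] * len(values)
    for _ in range(iterations):
        # Assign each value to its nearest centroid...
        assignments = [min(range(k), key=lambda c: abs(v - centroids[c]))
                       for v in values]
        # ...then move each centroid to the mean of its members.
        for c in range(k):
            members = [v for v, a in zip(values, assignments) if a == c]
            if members:
                centroids[c] = sum(members) / len(members)
    return centroids, assignments

# Hypothetical mean peer ratings (1 = rejected, 5 = accepted):
ratings = [1.2, 1.5, 1.8, 2.0, 4.1, 4.4, 4.6, 4.9]
centroids, labels = kmeans_1d(ratings, k=2)
print([round(c, 2) for c in sorted(centroids)])  # two well-separated clusters
```

The resulting cluster memberships are exactly the kind of grouping that could then be validated against participant observation, as in the study described above.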
Divergent findings from quantitative and qualitative methods do not necessarily represent a “problem” with the data. Miller, Khamarko, and Beard (2005) reported conflicting results from a mixed-methods evaluation of an HIV prevention program for adolescents and young adults. They found that although the program did not achieve substantial impacts on quantitative assessments of risk behavior, the community organizations involved felt that the program brought attention to a neglected health issue and catalyzed community prevention efforts. The divergence of evidence provided useful data that challenged the program’s established theory of change and revealed the plurality of values among the stakeholders involved.
One all-too-common way to mix qualitative and quantitative methods within a research project is to have separate analysts collect and analyze data, each using one set of methods. This division of labor is, in our view, not the best choice from scientific and training perspectives. It is preferable to integrate the two perspectives throughout the analysis phase of a research project and have each analyst conduct both quantitative and qualitative data analysis. This can lead to rich integration across methods and analyses. However, this approach also requires training across both sets of methods, a difficult task given the extensive skill sets and traditions within each set. If experts in quantitative methods partner with experts in qualitative methods and jointly explore common developmental questions, new findings as well as new skills can be learned by all. It is not necessary for each individual to be equally expert in all methods. Joint training in qualitative and quantitative methods can be accomplished productively by embedding training opportunities in mixed qualitative–quantitative studies. In a recent book on the effects of low-wage work on family processes and child development, each member of a small team of analysts engaged in both quantitative and qualitative analyses. Many members of the team were at the doctoral or postdoctoral level and were lead analysts on studies focusing on particular aspects of low-wage work and child development (Yoshikawa, Weisner, & Lowe, 2006). The whole team engaged in a core set of coding and analysis tasks using ethnographic field notes, as well as a core quantitative analysis of work and income trajectories and their effects on children (Yoshikawa, Lowe, et al., 2006). Each chapter author then expanded on these core analyses, using both quantitative and qualitative methods to examine a particular aspect of low-wage work and its effects on parents and children (job quality, nonstandard hours and schedules, job discrimination experiences, child care and work, work goals and values, budgeting, work and relationships or marriage, etc.). This approach served simultaneously as an efficient way to conduct mixed-methods analyses and a rich training opportunity across both sets of methods.
Common Pitfalls of Mixing Qualitative–Quantitative Methods and How to Surmount Them
In this section, we discuss four common pitfalls in research using mixed qualitative–quantitative methods. Although none of these pitfalls is specific to developmental science, we describe examples relevant to developmental research. For each, we suggest possible remedies.
Finding Publication Outlets and Funding
A common anxiety about conducting research across qualitative and quantitative methods is whether such work will be received well by reviewers and funders. Chapters, books, and reports often allow more latitude for mixed-methods studies than developmental journals do. Some journals may, unfortunately, have “gatekeeping” criteria that make it difficult in practice to present mixed-methods evidence, particularly when qualitative evidence with thick description is included. However, developmental journals are increasingly recognizing and publishing mixed-methods research, and some are recognizing the need for space for textual evidence. This is a two-way process. The more often developmental research that uses mixed methods is submitted to journals, the more likely it becomes that editors will accept such work for publication. Similarly, the more often developmentalists with a range of methodological expertise serve as reviewers for journals and funding agencies, the more likely it is that studies incorporating the different methods will be supported.
Balancing Participant Burdens
Participants perceive research “burdens” in different ways. Time is an important, but not the only, consideration in understanding participants’ sense of burden; participant engagement and involvement also matter. A long, structured, closed-ended survey, or hundreds of questionnaire items to fill out page after page, can be a burden for many. But a qualitative conversation, despite taking just as long, allows participants to tell their own stories, with a fieldworker listening closely to their concerns; the burden participants experience may be much less. And some forms of qualitative data collection (e.g., recording preschoolers’ naturally occurring conversations in preschool classrooms and playgrounds) place no direct burden on participants (Rizzo & Corsaro, 1995). The personal relationships that participants develop with fieldworkers are positive for many families. Multiple methods, however, often create greater burden; this can be particularly acute if more than one method is attempted in a single visit. Participant payments, support, and the number of contacts should be weighted accordingly. Providing meaningful payment, gifts, child care, and flexibility in scheduling can help.
Managing Time and Resources
Individual investigators can do multiple kinds of data collection and analysis themselves, but partnering with others who have complementary expertise is also valuable. When some members of a team are method-bilingual, barriers to the integration of data are lessened. Investing some time and resources across methods, even if the investment has to be small at times, can nonetheless have large payoffs. In practice, many developmentalists are trained in particular data collection and analysis techniques and gradually accrue other expertise over time. Learning a particular data collection technique to help answer a particular, delimited research question is more feasible (and less daunting) than learning all “qualitative research” or “quantitative research” methods.
Collaboration Among Researchers of Different Scientific Backgrounds
Research using mixed qualitative–quantitative methods often involves collaborations among researchers who have different scientific backgrounds. We use the term “scientific background” rather than “discipline” because researchers within a single discipline can differ greatly in their approach to qualitative and/or quantitative methods. Beliefs about and skills in using different methods are part of our social identities. Differences can occur in epistemological beliefs (e.g., positivism vs. constructivism or post-positivism; Guba & Lincoln, 1994; Onwuegbuzie & Leech, 2005); preferences in data collection approaches with participants; terminology and labeling of concepts, constructs, and methods; or experience with particular kinds of research. All of these issues can affect a multitude of design, implementation, and analysis decisions. Resolving them requires patience, perspective taking, and conflict resolution skills and, most important, the willingness to learn unfamiliar research practices and teach familiar practices to others.
Conclusion
Researchers can be specialists in a method or analysis technique and can advocate for that method without becoming “methodocentric,” that is, without confounding a method useful for understanding the research problem with the research problem itself. “Methodocentrism,” like ethnocentrism, can have some positive functions, such as building professional expertise and identity, but it has negative consequences too. The questions should focus on the empirical problem, the theory, and the study participants. What strong evidence does one have that will contribute to understanding the families, children, and participants in a particular study? What does one’s evidence tell us about theory and about the validity and reliability of all methods? How does each kind of evidence add to the emerging overall story? How does one’s evidence make a particular story more believable to a wider audience? The focus should be on the participants, the contexts in which they live, the theory, and the emerging story that the accumulated evidence tells, not on which method has been used to gather such evidence (we have argued, of course, that different methods and designs are good for making different parts of an empirical story more believable).
When partnering with others in mixed-methods work, it is useful to select partners whom all sides trust to hold a version of this stance toward the children, families, and contexts under study, who are not methodocentric but are curious about what different ways of representing the common research questions using mixed methods will reveal, and who focus on testing theories rather than on preapproving the one that might be favored by their discipline.
If researchers do so, developmental theory will be enriched through the expanded lens that mixing methods can provide on developmental phenomena. This work is just beginning with regard to mixing the study of words and numbers in scientific research. We believe that in future years, as the productive mixing of these methods continues to grow, our understanding of human development will be greatly enhanced.
References
Adam, E., Hawkley, L., Kudielka, B., & Cacioppo, J. (2006). Day-to-day dynamics of experience–cortisol associations in a population-based sample of older adults. Proceedings of the National Academy of Sciences, 103, 17058–17063.
Axinn, W. G., & Pearce, L. D. (2006). Mixed method data collection strategies. New York: Cambridge University Press.
Bernard, H. R. (1995). Research methods in anthropology: Qualitative and quantitative approaches. Walnut Creek, CA: AltaMira Press.
Bernard, H. R. (1998). Introduction: On method and methods in anthropology. In H. R. Bernard (Ed.), Handbook of methods in cultural anthropology (pp. 9–36). Walnut Creek, CA: AltaMira Press.
Bernheimer, L. B., & Weisner, T. S. (2007). “Let me just tell you what I do all day . . .”: The family story at the center of intervention research and practice. Infants & Young Children, 20, 192–201.
Brady, H. E., & Collier, D. (Eds.). (2004). Rethinking social inquiry: Diverse tools, shared standards. New York: Rowman and Littlefield.
Bronfenbrenner, U., & Morris, P. A. (1998). The ecology of developmental process. In W. Damon & R. M. Lerner (Eds.), Handbook of child psychology: Vol. 1. Theoretical models of human development (5th ed., pp. 993–1028). New York: Wiley.
Brown, B. B., Larson, R. W., & Saraswathi, T. S. (2002). The world’s youth: Adolescence in eight regions of the globe. Cambridge, England: Cambridge University Press.
Burton, L. M., & Jarrett, R. L. (2000). In the mix, yet on the margins: The place of family in urban neighborhood and child development research. Journal of Marriage and the Family, 62, 1114–1135.
Cherlin, A. J., Burton, L. M., Hurt, T. R., & Purvin, D. M. (2004). The influence of physical and sexual abuse on marriage and cohabitation. American Sociological Review, 69, 768–789.
Clampet-Lundquist, S., Edin, K., Kling, J., & Duncan, G. (2006). Moving at-risk teenagers out of high-risk neighborhoods: Why girls fare better than boys (Princeton IRS Working Paper 509). Princeton, NJ: Princeton University, Industrial Relations Section.
Collins, L. M., & Sayer, A. G. (Eds.). (2001). New methods for the analysis of change. Washington, DC: American Psychological Association.
Creswell, J. W. (1998). Qualitative inquiry and research design: Choosing among five alternatives. Thousand Oaks, CA: Sage.
Creswell, J. W., & Plano Clark, V. L. (in press). Designing and conducting mixed methods research. Thousand Oaks, CA: Sage.
Datta, L. (2005). Mixed methods, more justified conclusions: The case of the Abt evaluation of the Comer program in Detroit. In T. S. Weisner (Ed.), Discovering successful pathways in children’s development: Mixed methods in the study of childhood and family life (pp. 65–83). Chicago: University of Chicago Press.