Doctor of Business Administration
Doctoral Study Rubric and Research Handbook
FOREWORD
Walden University
DBA Doctoral Study Rubric and Research Handbook1
February 2019
This document consists of two components: the Doctoral Study Rubric2 and the Research Handbook, so its purpose is two-fold. First, the rubric guides DBA students and DBA Doctoral Study supervisory committees as they work together to develop high-quality proposals and Doctoral Study research. The committee will use the rubric to provide ongoing and flexible evaluation and reevaluation of the proposal and DBA Doctoral Study drafts. The University Research Reviewer (URR), who reviews the proposal/DBA Doctoral Study on behalf of the University, will also use this rubric to communicate feedback and any required revisions.
Second, the Research Handbook is an accompanying guide to the rubric that provides detailed instructions and knowledge pertaining to corresponding rubric components. The doctoral student is still responsible for utilizing self-identified resources to aid in the understanding and presentation of the rubric requirements. Elements in the Doctoral Study rubric correspond to elements in the Research Handbook. For example, one will find more detailed information on the Problem Statement (Heading # 1.3 in the DBA Rubric) in Heading # 1.3 (Problem Statement) of the Research Handbook. Using the Doctoral Study Rubric in conjunction with the Research Handbook when writing the proposal/Doctoral Study is highly recommended.
In the writing process, use the DBA Template and Rubric as a suggested outline for the DBA Proposal and Doctoral Study and as a basis for feedback on early drafts.
Before the Proposal Oral Conference or DBA Doctoral Study Oral Conference, the committee and URR will complete the rubric in MyDR and upload the proposal per the process checklist. Find the MyDR Process Checklist at http://academicguides.waldenu.edu/researchcenter/osra/dba. The guidance on orals is located at http://academicguides.waldenu.edu/researchcenter/osra/oraldefense.
After the Proposal Oral Conference or DBA Doctoral Study Oral Conference, and once the student completes any committee or methodologist revision requests for the proposal/Doctoral Study, the committee will review the proposal/Doctoral Study and make any needed modifications. When the committee members agree that the student met all of the rubric requirements for the proposal and passed the oral defense, the chair then notes in MyDR that the student passed the oral defense.
1 The DBA Rubric and Research Handbook video tutorial can be viewed at: http://youtu.be/KiiDGmLbRN0.
2 The guidance in the rubric supersedes any guidance depicted elsewhere. For example, the Problem Statement video tutorial on YouTube depicts a maximum word count of 250 for the Problem Statement, but the current recommendation is that the Problem Statement not exceed approximately 150 words. It is recommended to support claims and decisions with multiple scholarly peer-reviewed or seminal sources (as appropriate).
About consensus: For the final copy of the proposal or DBA Doctoral Study, there must be unanimous agreement by the DBA Doctoral Study supervisory committee before the student proceeds to the next step in the process checklist.
Timely Review and Return of Student Work
For research courses (i.e., KAMs, dissertations, and doctoral studies), the guideline for review and return of student research drafts is generally within 2 weeks; alternatively, the faculty member should provide a substantive overview of issues and concerns and an estimate of when the full review will be complete. The 2-week time frame is a guideline and represents what the university believes to be best practice. It is a desired practice for faculty members to respond to students upon receipt of research drafts and indicate when the draft will be returned. The faculty mentor or committee chair should provide students guidance on activities to work on that support student progress in the meantime. If a review of student research work requires significantly more time, for example, due to the length or complexity of the submission from one or more students, then faculty members are expected to notify the student of the additional time estimated to review their work.
Committee chairs or faculty mentors should set expectations early in the term for deadlines relating to submission and return of specified research documents that provide evidence of substantial academic progress. This is part of the term plan and should include deadlines for submission of designated documents and the final term report. Please note: Faculty members are not expected to review research drafts between terms, outside of what is required for end-of-term grading. Any research draft submitted within 5 days of the final day of the term may not receive detailed feedback until approximately 10 days into the subsequent term.
If the review takes place during any of the official Walden holidays (New Year’s Day; Martin Luther King, Jr. Day; Memorial Day; Independence Day; Labor Day; Thanksgiving Day; day after Thanksgiving; or Christmas Day), the holiday will not count in the review cycle. It is important to note that MyDR, which includes a general 14-day review timeline, does not adjust for holidays and end-of-terms, so any late notices received from the workflow as a result of a holiday are not an accurate reflection of the review time frame.
Note: As you consider your references, it is recommended that, for business research, 85% be published within the past 5 years. Other than data collected from the study site, students cannot use magazines, trade publications, summary textbooks, websites, or blogs as references.
TABLE OF CONTENTS
DBA RESEARCH HANDBOOK
SECTION 1: FOUNDATION OF THE STUDY
1.1 - Abstract
1.2 - Background of the Problem
Applied DBA Versus a Speculative/Theoretical PhD
Preparing the Background of the Problem
Strategy for Mapping to the Rubric
Aligning the Specific Business Problem With the Purpose Statement and RQ
1.4 - Purpose Statement
Six Elements of the Purpose Statement
Hypothetical Quantitative Example
Hypothetical Qualitative Example
1.6 - Research Question (Quantitative Only)
1.7 - Hypotheses (Quantitative/Mixed-Method Only)
1.8 - Research Question (Qualitative Only)
1.9 - Interview Questions (Qualitative Only)
Example Applied DBA Interview Questions
1.10 - Theoretical/Conceptual Framework
1.11 - Operational Definitions
1.12 - Assumptions, Limitations, and Delimitations
1.13 - Significance of the Study
1.14 - Review of the Professional and Academic Literature
1.15 - Transition
2.2 - Role of the Researcher
Data Saturation in Qualitative Study Designs
How to Use Multiple Sources to Support Claims and Decisions
2.6 - Population and Sampling (Quantitative Only)
2.7 - Population and Sampling (Qualitative Only)
Data Saturation and Sampling
2.9 - Data Collection - Instruments (Quantitative)
2.10 - Data Collection - Instruments (Qualitative)
2.11 - Data Collection Technique
2.12 - Data Organization Technique (Qualitative Only)
2.13 - Data Analysis (Quantitative Only)
2.14 - Data Analysis (Qualitative Only)
2.15 - Study Validity (Quantitative Only)
Internal Validity
2.16 - Reliability and Validity (Qualitative Only)
2.17 - Transition and Summary
SECTION 3: APPLICATION TO PROFESSIONAL PRACTICE AND IMPLICATIONS FOR CHANGE
3.2 - Presentation of Findings (Quantitative)
3.3 - Presentation of Findings (Qualitative)
3.4 - Application to Professional Practice
3.5 - Implications for Social Change
3.6 - Recommendations for Action
3.7 - Recommendations for Further Research
3.8 - Reflections
3.9 - Conclusion
3.10 - Appendices/Table of Contents
APPENDIX C: MAJOR QUANTITATIVE DESIGNS
APPENDIX D: SAMPLING TYPOLOGIES
APPENDIX E: SAMPLE POWER ANALYSIS
APPENDIX F: SAMPLE QUANTITATIVE LITERATURE REVIEW OUTLINE
APPENDIX G: SAMPLE APA TABLES
APPENDIX H: SAMPLE INTERVIEW PROTOCOL
BIBLIOGRAPHY: SUGGESTED READINGS LISTS
Assumptions, Limitations, and Delimitations
Data Saturation and Data Collection Sources
Ethical Considerations/IRB
Interview Protocol Sources
Qualitative Research Foundation
Qualitative and Quantitative Sources
Reliability, Validity, Transferability, and Generalizability Sources
Qualitative Software Analysis Sources
DBA DOCTORAL STUDY RUBRIC
Student and Committee Information3
Student’s Name (Last, First): |
|
Student ID (For office use only): |
|
Chairperson: |
|
Second Committee Member: |
|
University Research Reviewer: |
|
Student to provide total number of references: (As you consider your references, it is recommended that, for business research, 85% be published within the past 5 years.) |
|
Note: Provide the required information in the yellow highlighted column.
3 Chair will complete the yellow highlighted fields in this section before submitting the rubric. Be sure to include the names of all members of the committee.
Evaluation4
Date/Stage of the Rubric:5
Date of Review |
|
Before Proposal Oral Defense |
|
Before Proposal Oral (Revised)6 |
|
Before Doctoral Study Oral Defense |
|
Before Doctoral Study Oral (Revised)7 |
|
Note: Place an “X” in the column (yellow highlight) associated with the appropriate stage.
Evaluation of State of the DBA Doctoral Study or Proposal:
No changes required, advance to next step; rubric requirements met |
|
Changes required for resubmission; rubric requirements not met |
|
Note: Place an “X” in the column (yellow highlight) associated with the appropriate evaluation decision.
Member Information:
Name of member providing this review |
|
Role of the member providing this review |
|
Note: Enter the information in the yellow highlighted column.
4 Each member of the committee completes the evaluation.
5 Be sure to follow the Process Checklist (located at http://academicguides.waldenu.edu/researchcenter/osra) naming convention when sending the document through the review process. Following the naming convention is vital for tracking student progress throughout the doctoral study process.
6 Check when second and subsequent rubrics are needed if previous proposal defense was not passed.
7 Check when second and subsequent rubrics are needed if previous Doctoral Study defense was not passed.
Section 1 Foundation of the Study (FOR PROPOSAL & DBA DOCTORAL STUDY DOCUMENTS) Quality Indicators |
Type Met, Not Met, or N/A in Each Cell |
(1.1) Abstract (To be completed only after completion of Section 3) |
|
a. Includes a WOW statement illuminating the problem under study. |
|
b. Identifies the design (i.e., case study, phenomenological, quasi-experimental, correlation, etc.) NOTE: Do not mention the method (qualitative/quantitative) in the abstract. |
|
c. Identifies the study’s population and geographical location. |
|
d. Identifies theoretical (quantitative) or conceptual framework (qualitative) that grounded the study; theory/conceptual framework names are lower case. |
|
e. Describes the data collection process (e.g., interviews, surveys, questionnaires, etc.). |
|
f. Describes the data analysis process: in qualitative studies, the process used to identify themes (e.g., modified van Kaam method); in quantitative studies, the statistical tests used to report results (e.g., t test, ANOVA, or multiple regression). Omit software titles. |
|
i. Identifies two or three themes that emerged from the study (qualitative). |
|
j. Presents the statistical results for each research question (quantitative studies). |
|
k. Describes how these data may contribute to social change (use the words social change and be specific about who may benefit).8 |
|
l. Ensures the first line in the abstract is not indented. |
|
m. Ensures Abstract does not exceed one page. |
|
n. Uses plural verbs with data (e.g., the data were; data is the plural of datum). |
|
8 Begin this section as follows: “The implications for positive social change include the potential to…”.
o. Ensures all numbers are expressed in digits (e.g., 1, 2, 10, 20) and not spelled out unless beginning a sentence; ensures the Abstract does not include seriation (e.g., (a), (b), (c)). |
|
(1.2) Background of the Problem 9 Provides a brief and concise overview of the context or background of the problem. DBA Doctoral Studies are focused on applied business research; this heading sets the stage for the study and should be no more than one page in length. |
|
(1.3) Problem Statement Please review the video tutorial located at: http://youtu.be/IYWzCYyrgpo to aid you in preparing the Problem Statement. |
|
a. Provides a hook10 supported by a peer-reviewed or government citation 5 or fewer years old from the anticipated completion date (CAO approval). |
|
b. Provides an anchor11 supported by a peer-reviewed or government citation 5 or fewer years old from the anticipated completion date (CAO approval). |
|
c. States the general business problem. Note: This element should start as follows: The general business problem is… |
|
d. States the specific business problem. Be sure to state who has the specific problem (e.g., small business leaders, project managers, supply chain managers, etc.). Note: This element should start as follows: The specific business problem is that some (identify who has the problem)… |
|
9 Include an introductory paragraph before the Background of the Problem component. However, do not label this introductory paragraph with an APA Level 1 heading. The purpose of the background is to introduce the topic and problem you will address. Briefly indicate why the problem deserves new research. More important, the Doctoral Study must address applied research, so you will want to identify the need to solve an applied business problem. The goal of this section is to encourage readers to continue reading, to generate interest in the study, and to provide an initial frame of reference for understanding the entire research framework.
10 The hook should be a succinct WOW statement to catch the reader’s attention.
11 An anchor comprises a number, percentage, dollar value, ratio, index, etc.
e. Ensures the specific business problem aligns with the research question and purpose statement. |
|
f. Problem Statement should be clear and succinct (It is recommended to be approximately 150 words). |
|
· Check with Ulrich’s Periodical Directory (http://library.waldenu.edu/728.htm) to ensure citations are peer reviewed.12
· See the Problem Statement Video Tutorial at: http://youtu.be/IYWzCYyrgpo. |
|
(1.4) Purpose Statement Describes the intent of the research.13 The Purpose Statement is a mini story and is recommended to be approximately 200 words. The Purpose Statement must address the following six elements: |
|
a. Identifies the research method as qualitative14, quantitative15, or mixed-method. |
|
b. Identifies research design16 (e.g., case study, phenomenological, quasi-experimental, correlational, etc.). |
|
c. If quantitative or mixed method: Identifies a minimum of two17 independent (experimental/quasi-experimental designs) or predictor (correlational designs) variables and at least one dependent variable.18 Note: The quantitative study must include at least two independent/predictor variables.19 Ensures the independent variables appropriately align with the variables/constructs identified in component 1.10, Theoretical/Conceptual Framework. |
|
12 Ulrich’s is not 100% correct; the student must verify peer review status via the journal home page.
13 The first sentence of the purpose statement must align with the research question and specific business problem in the problem statement.
14 Visit the Center for Research Quality qualitative methodology tutorial at: http://academicguides.waldenu.edu/researchcenter/resources/Design
15 See the quantitative Research Primer located at Appendix B; Visit the Center for Research Quality quantitative methodology tutorial at: http://academicguides.waldenu.edu/researchcenter/resources/Design
16 See Appendix C for a depiction of basic quantitative designs and their characteristics.
17 Covariates, mediator, and moderator variables are types of independent/predictor variables; be sure to clearly identify these types of variables as applicable.
18 The terms “independent” and “predictor” variables are often used interchangeably in correlational studies. Please be consistent with the chosen terminology.
19 See Heading 1.6, Research Questions (Quantitative Only), in the Research Handbook.
d. Identifies specific population group for proposed study. |
|
e. Identifies geographic location of the study. |
|
f. Identifies contribution to social change. |
|
g. Ensures the first sentence links/aligns directly with the specific business problem. |
|
· See Purpose Statement Video Tutorial at: http://youtu.be/pLP4r0mfT9A. |
(1.5) Nature of the Study 20 Provides a brief discussion of the research method (i.e., quantitative or qualitative) and design (e.g., correlational for a quantitative study; phenomenological, case study, etc., for a qualitative study); cite a minimum of one source. (The method and design will be discussed in detail in Section 2.) Note: A single paragraph is sufficient for each component: one for the method and one for the design. |
|
a. Identifies the selection of one method (qualitative, quantitative, or mixed method) and why other methods would not work (cite a minimum of one source). |
|
b. Identifies the selection of the design 21 (within the method) and why it was selected over other designs (cite a minimum of one source). |
|
(1.6) Research Question (Quantitative Only) |
|
a. Lists research question(s) in about 10-15 words. |
|
20 A single paragraph can be used for each component: one for the method and one for the design.
21 See Appendix C for a brief depiction of the major research designs.
b. Ensures research question(s)22 align(s) with the specific business problem and first line of the Purpose Statement. |
|
c. Includes the independent/predictor and dependent/criterion variables as identified in the Purpose Statement; ensures the independent/predictor variables appropriately align with the constructs/variables identified in component 1.10, Theoretical/Conceptual Framework. |
|
(1.7) Hypotheses (Quantitative/Mixed-Method Only) States, in accurate format, the null and alternative hypotheses for each research question23. |
|
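For illustration only (not a rubric requirement), the null and alternative hypotheses for a hypothetical correlational study with two placeholder predictor variables, X1 and X2, and one dependent variable, Y, could be stated in conventional notation as follows.

```latex
% Hypothetical hypotheses for a multiple regression with two predictors.
% X1, X2, and Y are placeholders, not variables prescribed by the handbook.
H_0\colon \beta_1 = \beta_2 = 0
H_a\colon \beta_j \neq 0 \text{ for at least one } j \in \{1, 2\}
```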
(1.8) Research Question (Qualitative Only) |
|
a. Lists the overarching research question in approximately 10-15 words. |
|
b. Ensures research question aligns with the specific Business Problem and Purpose Statement. |
|
(1.9) Interview Questions (Qualitative Only) |
|
a. Lists each interview or focus group question. Questions must contribute knowledge to the research question, must be open-ended, and cannot be answered with a yes or no. |
|
b. Ensures interview/focus group questions align with the research question. |
|
22 The research question(s) must contain the independent/predictor and dependent/criterion variables identified in the Purpose Statement.
23 Hypotheses must include the variables identified in the research question.
(1.10) Theoretical/Conceptual Framework 24 Clearly and concisely identify the theoretical/conceptual framework. In quantitative studies, the theoretical framework is the appropriate term and in qualitative studies, the conceptual framework is the appropriate term. The student will articulate the theoretical/conceptual framework with concepts from the literature to ground and complement the applied business study. · This component should not exceed one page. It will be expanded upon in the literature review. See Theoretical/Conceptual Framework Video Tutorial at: http://youtu.be/P-01xVTIVC8 |
|
a. Identifies and describes the theory or conceptual model for theoretical/conceptual framework. |
|
b. Identifies theorist(s) of the theory or conceptual model for theoretical/conceptual framework. |
|
c. Identifies date of the theory or conceptual model for theoretical/conceptual framework (if applicable).25 |
|
d. Identifies key concepts/propositions/tenets of the theory or conceptual model for theoretical/conceptual framework26. |
|
e. Quantitative only - Ensures the theoretical constructs/variables underlying the theory are clearly identified and align with the constructs/variables (independent variables) identified in the Purpose Statement and Research Question(s). Note: The independent variables/constructs represent the underlying concepts of the theoretical framework in quantitative research. |
|
f. Identifies how/why the theory or conceptual model for the theoretical/conceptual framework is applicable and fits/applies to the study. |
|
(1.11) Operational Definitions |
24 The theoretical/conceptual framework informs the research (quantitative) and interview (qualitative) questions. Be sure to review the Theoretical/Conceptual Framework Video Tutorial at: http://youtu.be/P-01xVTIVC8
25 Some literature identifies the specific date the theorist introduced the theory; provide this date if this is the case. If date is missing, then requirement (c) is not applicable.
26 Ensures the independent variables appropriately align with the theoretical framework(s) identified in component 1.10, Theoretical/Conceptual Framework.
a. Presents technical terms, jargon, or special words used in the study. |
|
b. Lists in alphabetical order. Formats in italics followed by an italicized colon. The definition follows on the same line. (This is similar to an APA Level 5 heading with a colon replacing the period.) |
|
c. Provides citations (for each definition) from credible sources (peer-reviewed, seminal work/text, government sites, etc). |
|
d. Does not include terms found in a basic academic dictionary (e.g., Webster’s). |
|
e. Does not exceed 10 key operational definitions. |
|
(1.12) Assumptions, Limitations, and Delimitations |
|
a. Defines the term Assumptions and provides a citation; lists facts that the student assumes to be true but cannot actually verify. |
|
b. Defines the term Limitations and provides citation; lists potential weaknesses of the study that are not within the control of the researcher. |
|
c. Defines the term Delimitations and provides citation; identifies the bounds of the study. |
|
(1.13) Significance of the Study 27 |
|
a. States why the study findings may be of value to businesses. |
|
b. States how this study may contribute to effective practice of business (improvement of business practice). |
|
c. Identifies how the results might contribute to positive social change. |
|
27 This area is important in determining the Doctoral Study of the Year Award; justify well.
(1.14) Review of the Professional and Academic Literature 28 |
|
A. Literature Review Opening Narrative |
|
i. Contains a brief discussion of the literature that includes a critical analysis and synthesis of various sources (journals, reports, scholarly seminal books, etc.) to convince readers of the depth of inquiry. |
|
ii. Explains the organization of the review. |
|
iii. Explains the strategy for searching the literature. |
|
iv. The majority of references should be from peer-reviewed sources. (Suggested 85% of the total sources should be peer-reviewed.) |
|
v. The majority of references should be current. (As you consider your references, it is recommended that 85% should be within the past 5 years). |
|
B. Application to the Applied Business Problem |
|
i. Introduces the purpose of the study. |
|
ii. Identifies hypotheses if a quantitative/mixed method study. |
|
iii. Contains a critical analysis and synthesis of literature pertaining to the theoretical/conceptual framework the student identified in item #1.10, Theoretical/Conceptual Framework, above29. The student includes a critical analysis with supporting and contrasting theories/conceptual models for the theory in the theoretical/conceptual framework. |
|
28 The average length of a substantive literature review is between 30 and 40 pages (25 pages minimum); depth and breadth are required regardless of length. See the quantitative example in Appendix F and visit the Writing Center at http://writingcenter.waldenu.edu/50.htm for more information on writing the literature review.
29 A key portion of the Review of the Literature must focus on the specific theoretical/conceptual framework you are using in your study. This is a key requirement for you to be able to adequately address items 3.2g, Presentation of Findings (quantitative studies), and 3.3c, Presentation of Findings (qualitative studies).
iv. Contains a critical analysis and synthesis of literature pertaining to the independent variables (quantitative/mixed-method studies) the student identified in item # 4c (Purpose Statement). |
|
v. Contains a critical analysis and synthesis of literature pertaining to the dependent variable(s) (quantitative/mixed-method studies) the student identified in item # 4c (Purpose Statement). |
|
vi. Discusses measurement of variables (quantitative/mixed-method studies) the student identified in item # 4c (Purpose Statement). |
|
vii. Contains a critical analysis and synthesis of literature pertaining to potential themes and phenomena (qualitative studies) the student identified in the Purpose Statement. |
|
viii. Compares and contrasts different points of view, and the relationship of the study to previous research and findings (sample size/geographical location variance, etc.). |
|
ix. Provides a comprehensive critical analysis and synthesis of the literature. |
|
C. Relevancy of the Literature |
|
The literature review is well organized. Introduce the purpose of the study. Include hypotheses (if a quantitative/mixed method study) in the opening narrative. |
|
D. Literature Review Organization |
|
i. Presented in a well-organized manner. |
|
ii. Adheres to APA formatting standards. |
|
(1.15) Transition |
|
a. Ends with a Transition Heading that contains a concise summary30 of key points of Section 1. |
|
b. Provides an overview introducing Sections 2 and 3. |
|
30 A concise summary recaps the major elements of the review of the literature and does not introduce new information.
Section 2 The Project (FOR PROPOSAL & DBA DOCTORAL STUDY DOCUMENTS) Quality Indicators |
Type Met, Not Met, or N/A in Each Cell |
(2.1) Purpose Statement Begins Section 2 with a restatement of the Purpose Statement presented in Section 1. Note: Copy and paste the purpose statement from Section 1. |
|
(2.2) Role of the Researcher Describes the role of the researcher in the data collection process and provides a peer-reviewed or seminal source. Describes any relationship the researcher may have had with the topic, participants, or research area. |
|
a. Describes the role of the researcher in the data collection process and provides a peer-reviewed or seminal source. |
|
b. Describes any relationship the researcher may have had with the topic, participants, or research area. |
|
c. Provides a brief description of the researcher’s role related to ethics and the Belmont Report31 protocol. |
|
d. Qualitative studies: Describes how the student will mitigate bias and avoid viewing data through a personal lens/or perspective. |
|
e. Qualitative studies with interviews: Briefly describes the rationale for an interview protocol. |
|
f. It is recommended to support claims and decisions with multiple scholarly peer-reviewed or seminal sources (as appropriate). |
|
(2.3) Participants32 |
|
a. Describes the eligibility criteria for study participants. |
|
b. Discusses strategies for gaining access to participants. |
|
c. Identifies strategies for establishing a working relationship with participants. |
|
d. The participants’ characteristics must align with the overarching research question. |
|
31 See Belmont Report at: http://www.hhs.gov/ohrp/humansubjects/guidance/belmont.html.
32 Select “N/A” and explain why if participants are not used in the study.
e. It is recommended to support claims and decisions with multiple scholarly peer-reviewed or seminal sources (as appropriate). |
|
(2.4) Research Method Expands on the discussion in Heading 1.5 (Nature of the Study). |
|
a. Identifies the use of a specific research method by indicating whether the proposed study is quantitative, qualitative, or mixed methods. |
|
b. Justifies the use of the research method over the other research methods. |
|
c. It is recommended to support claims and decisions with multiple scholarly peer-reviewed or seminal sources (as appropriate). |
|
(2.5) Research Design Expands on the discussion in Heading 1.5 (Nature of the Study). |
|
a. Identifies the use of a specific research design. |
|
b. Justifies the use of the research design over other key designs for the study. |
|
c. For qualitative studies, identifies how the student will ensure data saturation. |
|
d. It is recommended to support claims and decisions with multiple scholarly peer-reviewed or seminal sources (as appropriate). |
|
(2.6) Population and Sampling (Quantitative Only) |
|
a. Describes the population from which the sample will come. |
|
b. Demonstrates that population aligns with the overarching research question. |
|
c. Describes and justifies the sampling method (i.e., probabilistic or nonprobabilistic) and specific subcategory (e.g., simple random or convenience). Addresses the strengths and weaknesses associated with the chosen sampling method and subcategory (see Appendix D, Sampling Typologies). |
|
d. Justifies sample size via power analysis (see example in Appendix E). Provides justification for the proposed effect size, alpha, and power levels; a brief computational sketch appears at the end of this subsection. |
|
e. Cites the source or tool used to calculate the sample size. |
|
f. It is recommended to support claims and decisions with multiple scholarly peer-reviewed or seminal sources (as appropriate). |
|
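Item d above can be satisfied with any defensible power analysis tool (the handbook's sample appears in Appendix E). The sketch below is a minimal, hypothetical illustration in Python using statsmodels; it assumes the planned analysis is an independent-samples t test, and the effect size (Cohen's d = 0.5), alpha (.05), and power (.80) are placeholder values that a student would justify from prior studies.

```python
# Hypothetical a priori power analysis for an independent-samples t test.
# The effect size, alpha, and power below are placeholders to be justified
# from the literature, not values prescribed by the DBA handbook.
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=0.5,          # Cohen's d (medium effect)
    alpha=0.05,               # Type I error rate
    power=0.80,               # desired statistical power
    alternative="two-sided",
)
print(f"Minimum sample size per group: {n_per_group:.1f}")  # about 64 per group
```

A study using a different design (e.g., multiple regression or ANOVA) would substitute the corresponding effect-size metric and power solver.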
(2.7) Population and Sampling (Qualitative Only) |
|
a. Justifies the number of participants:33
· Describes and justifies the sampling method (e.g., purposeful, snowball, etc.).
· Describes and justifies the number of participants.
· Identifies how the student will ensure data saturation. |
|
b. Demonstrates criteria for selecting participants and interview setting are appropriate to the study (Rich descriptions are encouraged) |
|
c. It is recommended to support claims and decisions with multiple scholarly peer-reviewed or seminal sources (as appropriate). |
|
(2.8) Ethical Research |
|
a. Discusses the informed consent process. Includes informed consent form in an appendix and lists in the Table of Contents. |
|
b. Discusses participant procedures for withdrawing from the study. |
|
c. Describes any incentives for participating. |
|
d. Clarifies measures that the student will use to assure that the ethical protection of participants is adequate. |
|
e. Refers to agreement documents in the (a) appendices, and (b) Table of Contents. |
|
f. Includes a statement that the student will store the data securely for 5 years to protect the confidentiality of participants. |
|
g. Final Doctoral Study includes the Walden IRB approval number. |
|
h. Identifies how the student will protect names of individuals or organizations to keep the participants and organizations confidential. |
|
i. It is recommended to support claims and decisions with multiple scholarly peer-reviewed or seminal sources (as appropriate). |
|
33 The DBA policy for phenomenological studies is a minimum of 20 participants.
(2.9) Data Collection Instruments (Quantitative Only) |
|
a. States the name of the instrument(s). |
|
b. Identifies name of publisher/developer(s) and year of development (if applicable). |
|
c. Discusses concept(s) measured by the instrument(s). |
|
d. Includes a detailed description of data that comprise each construct/variable measured by the instrument(s). |
|
e. Identifies scale of measurement (i.e., nominal, ordinal, interval, ratio) for each construct/variable measured by the instrument. Please see Scales of Measurement video tutorial at: http://youtu.be/PDsMUlexaMY . |
|
f. Discusses appropriateness to the current study (i.e., why is this the best instrument to use for measuring the variables/constructs?) |
|
g. Discusses instrument administration (e.g., how long, any special requirements/tools, special instructions, pencil and paper, online, etc.). |
|
h. Describes how scores are calculated and what the scores mean; identifies items to be reverse-coded (if applicable). |
|
i. Identifies where and/or with what populations the instrument was normed; identifies where and with what populations other researchers have used the instrument(s) for collecting data. |
|
j. Identifies published reliability (e.g., test-retest reliability, internal consistency, split-half, etc.) and validity properties (e.g., construct validity, concurrent validity, convergent validity, and discriminant validity) of the instrument(s)34. |
|
k. Identifies strategies used to assess validity (e.g., construct validity, concurrent validity, convergent validity, discriminant validity) and reliability (e.g., test-retest reliability, internal consistency, split-half, etc.); a brief computational sketch of internal consistency appears at the end of this subsection. |
|
l. Discusses and justifies any adjustments or revisions to the use of standardized research instruments. |
|
m. Identifies where in the appendices the instrument(s), or a copy of the permission to use or purchase the instrument(s), is (are) located. Ensures the Table of Contents lists the appendices. [Copies of the instrument may not be reproduced in an appendix without written permission.] |
|
34 Published reliability and validity properties might be found in the test review and in other studies where the instrument was used to collect data.
n. Describes where raw data will be available (appendices, tables, or by request from the researcher). |
|
o. It is recommended to support claims and decisions with multiple scholarly peer-reviewed or seminal sources (as appropriate). |
|
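As a minimal, hypothetical illustration of the internal-consistency evidence referenced in items j and k, the sketch below computes Cronbach's alpha from item-level responses; the simulated response matrix is a placeholder, not study data, and published reliability values should still be cited per item j.

```python
# Hypothetical sketch: Cronbach's alpha (internal consistency) for a
# multi-item survey scale. Rows are respondents; columns are Likert items.
# The response matrix is simulated placeholder data, not study data.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a 2-D array (respondents x items)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

rng = np.random.default_rng(seed=1)
responses = rng.integers(1, 6, size=(30, 8)).astype(float)  # 30 respondents, 8 items
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```

Conventional benchmarks for acceptable internal consistency (often an alpha of .70 or higher) should be supported with citations from the psychometric literature rather than asserted.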
(2.10) Data Collection Instruments (Qualitative Studies Only) |
|
a. In addition to identifying the student as the primary data collection instrument, identifies the data collection instrument/process (e.g., informal interview, semistructured interviews, phenomenological in-depth interviews, focus groups, company/archival documents, etc.). |
|
b. Clarifies how the student will use the data collection instrument/technique (the process/protocol). |
|
c. Identifies how the student will enhance the reliability and validity of the data collection instrument/process (e.g., member checking, transcript review, pilot test, etc.). |
|
d. Identifies where in appendices the instrument (e.g., interview protocol, focus group protocol, interview questions, etc.) is (are) located. Ensures Table of Contents lists appendices. |
|
e. It is recommended to support claims and decisions with multiple scholarly peer-reviewed or seminal sources (as appropriate). |
|
(2.11) Data Collection Technique |
|
a. Describes the technique used to collect data, such as an online/paper survey, interview, observation, site visit, or video recording (think recipe card: a step-by-step process, described richly). Provides an abridged interview protocol (see Appendix H), focus group protocol, observation protocol, etc., and identifies its location in an appendix. |
|
b. Describes advantages and disadvantages of data collection technique. |
|
c. As applicable, describes the process for conducting a pilot study after IRB approval. |
|
d. For qualitative studies, identifies how the student will use member checking of the data interpretation or transcript review (if applicable). |
|
e. It is recommended to support claims and decisions with multiple scholarly peer-reviewed or seminal sources (as appropriate). |
|
(2.12) Data Organization Technique (Qualitative Only) |
|
a. Describes the systems for keeping track of data and emerging understandings, such as research logs, reflective journals, and cataloging/labeling systems. |
|
b. Reminds readers all raw data will be stored securely for 5 years. |
|
c. It is recommended to support claims and decisions with multiple scholarly peer-reviewed or seminal sources (as appropriate). |
|
(2.13) Data Analysis (Quantitative Only) |
|
a. Restates the research questions and hypotheses from Section 1. |
|
b. Describes and defends, in detail, the statistical analyses that the student will conduct (e.g., multiple regression, two-way ANOVA, etc.). |
|
c. Describes and defends, in detail, why other statistical analyses are not appropriate. |
|
d. Provides explanation of data cleaning and screening procedures as appropriate to the study. |
|
e. Provides explanation for addressing missing data. |
|
f. Identifies and explains the assumptions pertaining to the statistical analyses. |
|
g. Identifies the process for testing/assessing the assumptions. |
|
h. Identifies appropriate actions to be taken if the assumptions are violated35. |
|
i. Describes how the student will interpret inferential results (e.g., key parameter estimates, effect sizes, confidence intervals, probability values, odds ratios, etc.); a worked sketch appears at the end of this subsection. |
|
j. Identifies statistical software and version that the student will use in the data analysis process (e.g., SPSS, Excel, R, etc.). |
|
k. It is recommended to support claims and decisions with multiple scholarly peer-reviewed or seminal sources (as appropriate). |
|
35 Bootstrapping can be used as an effective method for addressing violations of assumptions.
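As a minimal, hypothetical illustration of the quantitative analysis steps above (and of the bootstrapping remedy noted in footnote 35), the sketch below fits a multiple regression, checks two common assumptions, and bootstraps a confidence interval for one coefficient. The file name and variable names are placeholders; the rubric lists SPSS, Excel, and R as example software, so this Python version is only one possible tool.

```python
# Hypothetical quantitative analysis sketch (multiple regression).
# "study_data.csv" and its column names are placeholders, not handbook content.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy import stats
from statsmodels.stats.diagnostic import het_breuschpagan

df = pd.read_csv("study_data.csv")
y = df["sales_growth"]                                      # dependent variable
X = sm.add_constant(df[["engagement", "training_hours"]])   # predictor variables

model = sm.OLS(y, X).fit()
print(model.summary())          # coefficients, p values, R-squared, 95% CIs

# Assumption checks: normality of residuals and homoscedasticity.
_, shapiro_p = stats.shapiro(model.resid)
_, bp_p, _, _ = het_breuschpagan(model.resid, X)
print(f"Shapiro-Wilk p = {shapiro_p:.3f}; Breusch-Pagan p = {bp_p:.3f}")

# Footnote 35: bootstrap a 95% CI for one coefficient if assumptions are in doubt.
boot = []
for _ in range(2000):
    s = df.sample(n=len(df), replace=True)
    Xb = sm.add_constant(s[["engagement", "training_hours"]])
    boot.append(sm.OLS(s["sales_growth"], Xb).fit().params["engagement"])
low, high = np.percentile(boot, [2.5, 97.5])
print(f"Bootstrapped 95% CI for engagement: [{low:.3f}, {high:.3f}]")
```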
(2.14) Data Analysis (Qualitative Only) |
|
a. Identifies the appropriate data analysis process for the research design (e.g., one of the four types of triangulation for case study; modified van Kaam, van Maanen, etc. for phenomenology). |
|
b. Provides a logical and sequential process for the data analysis. |
|
c. Details the student’s conceptual plan or software (e.g., NVivo, ATLAS.ti, Ethnograph, Excel, etc.) for coding, mind-mapping, and identifying themes; a conceptual coding sketch appears at the end of this subsection. |
|
d. Identifies how the student will focus on the key themes, correlate the key themes with the literature (including new studies published since writing the proposal) and the conceptual framework. |
|
e. It is recommended to support claims and decisions with multiple scholarly peer-reviewed or seminal sources (as appropriate). |
|
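Items c and d above are normally carried out in qualitative analysis software, but the mechanical step of tallying coded passages to surface candidate themes can be sketched briefly; the participant labels and codes below are placeholders, and frequency counts alone do not establish a theme, which remains an interpretive judgment correlated with the literature and the conceptual framework.

```python
# Hypothetical sketch of one mechanical step in qualitative analysis:
# counting how often each code appears, and across how many participants.
# The (participant, code) pairs are placeholders produced by manual coding.
from collections import Counter

coded_passages = [
    ("P1", "communication"), ("P1", "training"), ("P2", "communication"),
    ("P2", "leadership support"), ("P3", "communication"), ("P3", "training"),
    ("P4", "leadership support"), ("P4", "communication"),
]

code_counts = Counter(code for _, code in coded_passages)
participants_per_code = {
    code: len({p for p, c in coded_passages if c == code}) for code in code_counts
}
for code, count in code_counts.most_common():
    print(f"{code}: {count} passages across {participants_per_code[code]} participants")
```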
(2.15) Study Validity (Quantitative Only) 36 |
|
a. Experimental/quasi-experimental designs only: Describes threats to external validity (e.g., testing reactivity, interaction effects of selection and experimental variables, specificity of variables, reactive effects of experimental arrangements, and multiple-treatment interference, as appropriate to the study) and how the student will address the threats to external validity. |
|
b. Experimental/quasi-experimental designs only: Describes threats to internal validity (e.g., history, maturation, testing, instrumentation, statistical regression, experimental mortality, and selection-maturation interaction, as appropriate to the study) and how the student will address the threats to internal validity. |
|
c. Discusses threats to statistical conclusion validity37 (e.g., factors that affect the alpha/Type I error rate) and how the student will address the threats to statistical conclusion validity. |
|
d. Describes the extent to which, and the rationale for why, research findings can be generalized to larger populations (external validity) and applied to different settings. |
|
36 Items “a” and “b” pertain to experimental and quasi-experimental designs only. Item “c” pertains to all quantitative designs. Discuss validity as it pertains to the study outcomes. This component is not to address the reliability and validity of the study instruments. The reliability and validity of the study instruments is addressed in item 2.9 (quantitative) and 2.10 (qualitative). Item “d”, external validity, pertains to all quantitative designs.
37 The three factors to be discussed are (a) reliability of the instrument, (b) data assumptions, and (c) sample size.
e. It is recommended to support claims and decisions with multiple scholarly peer-reviewed or seminal sources (as appropriate). |
|
(2.16) Reliability and Validity (Qualitative Only): A key difference from quantitative research lies in the reliability and validity headings. The analogous criteria for qualitative studies are credibility, transferability, dependability, and confirmability. These criteria are not measurable and need to be established using qualitative methods such as member checking (Marshall and Rossman, 2016, provide a good definition) and triangulation (data triangulation, investigator triangulation, theoretical triangulation, and methodological triangulation; see Norman Denzin’s (1978, 2009) works on triangulation). Please review more detailed information on qualitative validity at http://www.socialresearchmethods.net/kb/qualval.php |
|
Reliability |
|
a. Identifies how the student will address dependability (e.g., member checking of data interpretation, transcript review, pilot test, etc.). |
|
b. It is recommended to support claims and decisions with multiple scholarly peer-reviewed or seminal sources (as appropriate). |
|
Validity |
|
c. Identifies how the student will ensure credibility (i.e., member checking of the data interpretation, participant transcript review, triangulation, etc.). |
|
d. Identifies how the student will address transferability in relation to the reader and future research. |
|
e. Identifies how the student will address confirmability. |
|
f. Identifies how the student will ensure data saturation. |
|
g. It is recommended to support claims and decisions with multiple scholarly peer-reviewed or seminal sources (as appropriate). |
|
(2.17) Transition and Summary |
|
a. Ends with a Transition Statement that contains a summary of key points. |
|
b. Includes an overview of what the student will cover in Section 3. |
|
Proposal Stage. Before IRB approval, the paper is written in future tense; after IRB approval, the paper is changed to past tense. |
|
Writing Style. The paper is written in predominantly active voice without slang, euphemisms, or anthropomorphisms. |
|
Follows APA 6th edition in the text and in the reference list |
|
References: Of the total sources cited, a minimum of 85% should be peer reviewed, and it is recommended that, for business research, 85% be published within 5 years of the anticipated completion date. Each source on the References page should match an in-text citation and vice versa.
|
|
Congratulations! This ends the Proposal section. See the Process Checklist located at the Center for Research Quality website (see URL below). http://researchcenter.waldenu.edu/Documents/DBA_Process_Checklist.pdf |
Section 3 Application for Professional Practice and Implications for Social Change (FOR DBA DOCTORAL STUDY DOCUMENTS) Quality Indicators |
Type Met, Not Met, or N/A in Each Cell |
(3.1) Introduction |
|
a. Begins with the purpose of the study. Do not repeat the entire purpose statement. Typically, the first sentence of the purpose statement will suffice. |
|
b. Provides a brief summary of the findings (do not exceed one page). |
|
(3.2) Presentation of Findings (Quantitative) |
|
a. Describes the statistical test(s), the variables, and the purpose of the test(s) and how they relate to the hypotheses. |
|
b. Presents relevant descriptive statistics38 (e.g., mean and standard deviation for scale variables; frequencies and percentages for nominal variables); a brief computational sketch appears at the end of this subsection. |
|
c. Provides evaluation of statistical assumptions from Heading 2.13e. |
|
d. Reports inferential statistical analyses results, organized by research question, in proper APA statistical notation/format. Includes the alpha level chosen for the test, test value, p (significance level) values, effect size, degrees of freedom, confidence intervals (when appropriate), etc. |
|
e. Includes appropriate tables39 and figures to illustrate results, as per the current edition of the Publication Manual of the American Psychological Association. |
|
f. Summarizes answers to research questions. |
|
38 See the following link for further information on descriptive statistics: http://www.socialresearchmethods.net/kb/statdesc.php
39 See Appendix E for basic formatted descriptive and inferential statistic tables.
g.40 Describes in what ways findings confirm, disconfirm, or extend knowledge of the theoretical framework and relationship(s) among variables by comparing the findings with other peer-reviewed studies from the literature review, including studies addressed during the proposal stage and new studies published since writing the proposal. Ties findings or disputes findings to the existing literature on effective business practice. |
|
h. Analyzes and interprets the findings in the context of the theoretical framework, as appropriate. |
|
i. Ensures interpretations do not exceed the data, findings, and scope. |
|
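As a minimal, hypothetical illustration of item b, the sketch below computes the descriptive statistics the rubric calls for (means and standard deviations for scale variables; frequencies and percentages for nominal variables). The data file and column names are placeholders, and the output would be reported in APA-formatted tables (see Appendix G).

```python
# Hypothetical descriptive statistics for Section 3 reporting (item b).
# "study_data.csv" and its columns are placeholders for the student's data.
import pandas as pd

df = pd.read_csv("study_data.csv")

# Scale (interval/ratio) variables: report M and SD.
scale_vars = ["engagement", "training_hours", "sales_growth"]
print(df[scale_vars].agg(["mean", "std"]).round(2))

# Nominal variables: report frequencies and percentages.
counts = df["industry_sector"].value_counts()
percentages = (counts / len(df) * 100).round(1)
print(pd.DataFrame({"n": counts, "%": percentages}))
```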
(3.3) Presentation of Findings (Qualitative) |
|
a. Lists the overarching research question. |
|
b. Identifies each theme. Analyzes and discusses findings in relation to the themes. |
|
c.41 Describes in what ways findings confirm, disconfirm, or extend knowledge in the discipline by comparing the findings with other peer-reviewed studies from the literature review, including new studies published since writing the proposal. |
|
d. Ties findings to the conceptual framework |
|
e. Ties findings or disputes findings to the existing literature on effective business practice. |
|
(3.4) Application to Professional Practice42 Provides a detailed discussion on the applicability of the findings with respect to the professional practice of business. This major subsection provides a rich academic argument for why and how the findings are relevant to improved business practice. |
|
40 It is important to ensure the review of the literature is a critical analysis and synthesis of the theory and variables identified in the study.
41 It is important the student includes a critical analysis and synthesis of the new literature (studies) published since the proposal and correlates the literature with the findings in the study.
42 This is an important area for Doctoral Study of the Year Award.
(3.5) Implications for Social Change43 Expresses implications in terms of tangible improvements to individuals, communities, organizations, institutions, cultures, or societies, as the findings could beneficially affect social change/behaviors. |
|
(3.6) Recommendations for Action |
|
a. Ensures recommendations flow logically from the conclusions and contain steps to useful action. |
|
b. States who needs to pay attention to the results. |
|
c. Indicates how the results might be disseminated via literature, conferences, training, etc. |
|
(3.7) Recommendations for Further Research |
|
Lists recommendations for further study related to improved practice in business. Identifies how limitations identified in Section 1.12b, Limitations, can be addressed in future research.44 |
|
(3.8) Reflections |
|
Includes a reflection on the researcher's experience within the DBA Doctoral Study process, in which the researcher discusses possible personal biases or preconceived ideas and values, the possible effects of the researcher on the participants or the situation, and any changes to the researcher’s thinking after completing the study. |
|
(3.9) Conclusion |
|
Closes with a strong concluding statement making the take-home message clear to the reader. |
|
(3.10) Appendices |
|
a. Consent form(s) attached. (Redact/blackout all personal or identifying data.) |
|
b. Organizational permission (Blackout name). |
|
c. Sample of Instrument (i.e., survey, interview protocol with interview questions, observation protocol, etc.; copyrighted surveys cannot be included w/o written permissions.) |
|
43 This is an important area for Doctoral Study of the Year Award.
44 Limitations identified in section 1.12b, as a minimum, are ideal sources for future studies.
Doctor of Business Administration
Research Handbook
Section 1 – Foundation of the Study
Note: This handbook is not in the DBA Doctoral Study Template. Make certain that the proposal and study conform to the DBA Doctoral Study Template heading sequencing and formatting, with the correct margins and line spacing.
1.1 - Abstract
The abstract must not exceed one page. The abstract text must be double-spaced with no paragraph breaks. The first line must not be indented. Describe the overall research problem being addressed in the first couple of sentences and indicate why it is important (e.g., who would care if the problem were solved). You can include a general introduction of the issue in the first sentence, but you need to move to a clear statement of the research problem. Identify the purpose and theoretical foundations, summarize the key research question(s), and briefly describe the overall research design and data analytic procedures. Identify the key results, themes, one or two conclusions, and recommendations that capture the heart of the research. Conclude with a statement on the implications for positive social change. Here are some form and style tips: (a) limit the abstract to one page; (b) maintain the scholarly language used throughout the doctoral study; (c) keep the abstract concise, accurate, and readable; (d) use correct English; one may use passive voice in the abstract; (e) ensure each sentence adds value to the reader’s understanding of the research; (f) use the full name of any term and if the acronym is used more than once in the abstract include the acronym in parentheses. Do not include references or citations in the abstract. Per APA style, unless at the beginning of a sentence, use numerals in the abstract, and don’t identify the titles of any software. Do not include seriation (i.e., (a), (b), (c), etc.)
1.2 - Background of the Problem
The purpose of the background is to introduce the topic and problem you will address.
Briefly, you want to indicate why the problem deserves new research. More important, the Doctoral Study must address applied research, so you will want to identify the need to study how some business leaders are solving or have solved an applied business problem. The goal of this heading is to encourage readers to continue reading, to generate interest in the study, and provide an initial frame of reference for understanding the entire research framework.
Applied DBA Versus a Speculative/Theoretical PhD
A DBA study is an applied business study linking theory to professional practice.
Students can use the following criteria to ensure that they have a clear DBA business study rather than a PhD business study. In contrast to a DBA study, a PhD study is a hypothetical/theoretical study that leads to expanding or creating theory rather than solving a business problem.
Qualitative studies. A qualitative study about people’s perceptions on how to address a business problem is hypothetical and is a PhD study. In contrast, a qualitative study about a strategy that a business leader or manager has implemented or is implementing to solve a business problem is an applied DBA study.
Quantitative studies. A quantitative study that includes one or more variables that the leader or manager cannot change to solve a business problem is a hypothetical/theoretical PhD study. In contrast, a quantitative study that includes only variables that business leaders or business managers can manipulate or change to solve a business problem is an applied DBA study.
Preparing the Background of the Problem
The Background of the Problem can be effectively accomplished in no more than one page; brevity and clarity are essential. The Review of the Literature will provide a more detailed discussion of the literature pertaining to the topic/problem. Immersing yourself in the literature on your topic/problem is crucial to uncovering a viable business problem. Do not underestimate the importance of the literature in helping to identify a viable business problem.
The research topic is broad in nature; do not narrow the focus too quickly. You want to give readers, especially those not familiar with the topic, time to become familiar with it before transitioning to a more concise presentation of the specific business topic/problem under study. This component focuses on identifying why the study is important and how the study relates to previous research on the topic/problem, and it gives the reader a firm sense of what your study is going to address and why. The Background of the Problem contains information supporting the business problem. Do not describe, explain, justify, etc., the need for the study in the Problem Statement; provide these critical elements (description, explanation, justification, etc.) in the Background of the Problem component. As such, the Problem Statement can be written effectively in as little as four sentences: (a) hook, (b) anchor, (c) general business problem, and (d) specific business problem. Transfer the supporting references in the Background of the Problem to the Problem Statement, but present them concisely. For example, the hook and anchor references provided in the Background of the Problem should be used in the Problem Statement.
Include a transition statement that leads to the Problem Statement, which will provide more specificity regarding the problem identified in the Background of the Problem component. A well-written transition signals a change in content. It tells your readers that they have finished one main unit and are moving to the next, or it tells them that they are moving from a general explanation to a specific example or application. A transition from the background to the Problem Statement is often as brief as one sentence, as follows: The background to the problem has been provided; the focus will now shift to the Problem Statement. Tip: Many potential business topics/problems can be found in the Areas for Future Research heading of most peer-reviewed journal articles.
1.3 - Problem Statement
As shown in the following graphic, the Problem Statement must include four specific components: the (a) hook, (b) anchor, (c) general business problem, and (d) specific business problem. It is recommended that the Problem Statement be approximately 150 words. More important, ensure the problem statement reflects an applied business problem; avoid Rubric Creep45. You must ensure you map to the rubric requirements. This is the most critical component of the doctoral study and will be highly scrutinized in the review process. Again, the Problem Statement is not to identify causes of the problem, solutions to the problem, or any other superfluous information. A well-written problem statement can be presented in four to five sentences. Please review the training video (see link below) developed by the DBA methodology team to aid in writing your problem statement; the video will help add clarity and save you time. The Problem Statement Video Tutorial can be found at: http://youtu.be/IYWzCYyrgpo.
45 Rubric creep occurs when the problem statement does not reflect an applied business problem.
DBA students are seeking a degree in business and must ensure the problem statement is business focused. The problem statement must not represent a problem that has a social, psychological, educational, or other discipline-specific emphasis. A business problem is something that is a problem for a business from the perspective of the business managers or the industry’s leaders. Therefore, it is important to adopt a management perspective, not that of a social advocate. The perspective must be from the position of the managers and leaders of business who can address the problem.
Avoiding Rubric Creep
To ascertain if a problem addresses a business issue or has Rubric creep/Rubric drift, please consider the following:
· An important indicator that a business-related problem is a specific business problem is that the problem statement relates to a key business process that organizational leaders need to address in order to effectively meet the organization’s mission.
· A business problem relates to one or more critical success factors (CSFs). Business leaders rely on business processes to function effectively and to complete one or more CSFs needed to carry out their business mission.
· A business problem is one that a business manager/leader can solve.
Conduct a final check of the problem statement by putting the hook, anchor, general business problem, and specific business problem in bullet form and checking for alignment among the four bullets. Once the problem statement aligns throughout, write it in scholarly narrative form (no bullets).
Strategy for Mapping to the Rubric
· Read the rubric requirements for a heading.
· Read what you wrote in the heading.
· Read the rubric requirements for a heading again.
· Read what you wrote in the section and highlight (in the proposal and the rubric) the rubric elements that you addressed in the heading.
· Revise the heading as needed to include the rubric elements that you missed and eliminate superfluous narrative.
· Start the process at the top again until you have mastered the rubric elements in the heading.
Specific Business Problem
The specific business problem is the genesis of one’s study. It is vital that one has a clear and precise specific business problem. One will align the contents of the Research Question and Purpose Statement with the specific business problem.
The qualitative specific business problem. The qualitative specific business problem must be well defined and not contain multiple issues (variables in quantitative studies). The
following graphic depicts how to include the elements needed in a qualitative specific business problem.
The quantitative specific business problem. The quantitative specific business problem must be well defined and contain the key variables. The following graphic depicts how to include the elements needed in a quantitative specific business problem.
Aligning the Specific Business Problem With the Purpose Statement and RQ
Make certain that the specific business problem, Purpose Statement, and Research Question (RQ) align. A good technique to use to enhance the alignment is to put the specific business problem, RQ, and first sentence of the Purpose Statement together on a blank document to ensure that you are using the same words. Notice the suggested order differs from the order the headings appear in the study.
Qualitative alignment example. The graphic below provides an example of alignment among the Specific Business Problem, Research Question, and first sentence of the Purpose Statement using the same key words. Pay attention to the words one uses to identify what the leader lacks or has in limited supply. The word choice determines how one can collect data.
· Some business leaders lack understanding… To ascertain what one understands will require a quantitative design.
· Some business leaders lack knowledge… To ascertain a business leader’s knowledge will require a quantitative design.
· Some business leaders lack strategies (or have limited plans, processes, procedures)… To ascertain a business leader’s strategies may involve interviews, focus groups, company archival records and documents, company policies and procedures, company intranet/Internet site, and direct/participant observation (in some cases) to collect data. Usually interviews or focus groups are the primary data collection method.
· Some business leaders lack skills… To ascertain a business leader’s skills will involve direct/participant observation as the primary data collection method.
Quantitative alignment example. Notice how the Specific Business Problem, Research Question, and first sentence of the Purpose Statement use the same key words with the exception that the research question and subsequent first sentence in the purpose statement do not address the business leader—this is a difference between qualitative and quantitative studies. The following is an example of alignment for a quantitative correlational study.
1.4 - Purpose Statement
There is a difference in the rubric requirements for a quantitative versus a qualitative study. The Purpose Statement must include the following components: (a) methodology, (b) design, (c) independent and dependent variables (for quantitative studies only), (d) specific population and justification for using the chosen population, (e) geographical location, and (f) the study’s potential for effecting social change. It is recommended that the Purpose Statement be approximately 200 words. The Purpose Statement is to be a concise statement and must not include detailed design information (sample size, data collection, etc.). Please be sure to map to the rubric. Please review the purpose statement video at: http://youtu.be/pLP4r0mfT9A. This video tutorial will be helpful to you in preparing your Purpose Statement.
Six Elements of the Purpose Statement
As mentioned above, the Purpose Statement consists of six elements. These six elements, and their contents, are:
Methodology. The first element to be presented in the Purpose Statement is the research methodology. The methodology is the overall philosophical assumption the researcher uses for designing and developing the study. In other words, the methodology is a worldview of how knowledge is acquired. The qualitative method is a means for exploring and understanding the meaning individuals or groups ascribe to a business problem. The qualitative method involves researchers using open-ended questions to learn what a business leader is doing or has done to solve a business problem. The quantitative method involves researchers using closed-ended questions to test hypotheses. Mixed-method studies contain a qualitative study methodology and a quantitative study methodology and must meet the requirements of both methodologies.
Mixed-method studies are rarely conducted in the DBA program. You simply need to identify the methodology for your study in a single sentence. No other information is required beyond this single statement.
Design. The second element to be presented in the Purpose Statement is the research design. While there are numerous designs, the most common qualitative designs seen in DBA doctoral studies are the case study design, miniethnography, focus group, and the phenomenological design. The correlational design is the most common design for quantitative studies. You simply need to identify the design of your study. There is no other information required other than this single statement.
Variables (quantitative study only)46. A variable is any entity that can take on different values. Another definition of a variable is a characteristic or condition that changes or has different values for different individuals or units of analysis (i.e., sample units). Moreover, variables are the cornerstone of quantitative research, in which the researcher seeks to explain the relationships among variables or to compare group differences regarding a variable or variables
46 See section 1.6 “Research Questions” for more information on variable requirements.
of interest. Another important distinction is that between an independent and a dependent variable.
An independent variable is the variable you have control over (in experimental designs); it is what you can choose and manipulate. A dependent variable is also known as a response variable or explained variable. The independent variable is usually what you think will affect the dependent variable. In some cases, you may not be able to manipulate the independent variable. It may be something that is already there and is fixed (e.g., company size), something you would like to evaluate with respect to how it predicts, influences, impacts, or causes a change in the dependent variable (e.g., employee satisfaction).
As it applies to your research, the dependent variable is normally the problematic variable in DBA studies, where the researcher is trying to explain what influences, affects, causes, or can predict the problem. For example, if the specific business problem is low employee satisfaction, then employee satisfaction is the dependent variable. The researcher then selects independent variables that are thought to predict, influence, impact, or cause the dependent variable, in this case, employee satisfaction.
Thus, it is extremely important to identify clearly the independent and dependent variables in the Purpose Statement component of the proposal. Identification of the variables informs other research components such as sample size and type of statistical analysis that is to be conducted. See more on variables at: http://www.socialresearchmethods.net/kb/variable.php
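For readers who find a concrete illustration helpful, the following sketch (in Python, with hypothetical variable names drawn from the quantitative example above) shows how clearly identifying the independent and dependent variables fixes the layout of the data set and the form of the statistical model. It is offered only as an illustration, not as a required analysis approach.

import pandas as pd

# Hypothetical layout: each row is one sample unit (e.g., one company);
# each column is a variable named in the Purpose Statement.
data = pd.DataFrame({
    "leadership_style": [3.2, 4.1, 2.8],  # independent (predictor) variable
    "company_size": [120, 540, 75],       # independent (predictor) variable
    "revenue": [1.4, 6.2, 0.9],           # dependent (criterion) variable
})

# Naming the variables up front fixes the model form, which in turn drives the
# required sample size and the choice of inferential test.
model_formula = "revenue ~ leadership_style + company_size"
print(data.shape, model_formula)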
Targeted population. A population is the larger group that you are studying. The population is not to be misconstrued as the sample, or your study’s participants. You will select your sample, or study participants, from the larger population. For example, your population might be all small business leaders in New York. You will, however, select a subset of small business leaders in New York to serve as your sample or participants. Remember, you are to address the broader population in this component of the Purpose Statement.
In a qualitative ethnographic or case study, you will need to define the population within the scope of the study. For example, if you are conducting a single case study, the population will be the people that meet the participant criteria within that organization/company. Likewise, in a multiple case study, the population will be the people that meet the participant criteria within the organizations/companies in the study.
Examples for a case study with the following research question: What strategies do department store managers use to motivate their sales associates?
Single case study example. The population will be department store managers in one New England department store who have a strategy to motivate their sales associates.
Multiple case study example. The population will be department store managers in four New England department stores who have a strategy to motivate their sales associates.
Geographical location. The geographical location simply identifies the geographical location of your study’s participants. The participants might be in a particular country, region,
state, or city. Of course, this may vary based upon the purpose of your study. In deciding how to identify the geographic location, one must ensure the confidentiality of the company(ies) and participants. If one is conducting a study in an automotive manufacturing facility and there are only one or two such companies in the city or state (e.g., Alabama), one should define the geographic location broadly (e.g., southern United States) so that the specific sample units are not easily identifiable.
Social change. The final element of your Purpose Statement requires you to provide a positive social change statement. Positive social change involves improvement of human or social conditions by promoting the worth, dignity, and development of individuals, communities, organizations, institutions, cultures, or societies. Focus on explaining “WHO” may benefit, and “HOW” the “WHO” may benefit from your study’s findings and recommendations.
Quantitative hypothetical example. The purpose of this quantitative correlation study is to examine the relationship between leadership styles, size of business, and business revenue.
The independent variables are leadership style and size of business. The dependent variable is business revenue. The targeted population will consist of business leaders of microelectronic companies in the southeast United States. The implications for positive social change include the potential to (provide social change statement).
Note: DBA doctoral studies require the highest level of rigor and scholarship. One focus of rigor and scholarship is the number of predictor or independent variables examined in quantitative doctoral studies. Nonexperimental research (e.g., correlational, quasi-experimental) requires the use of at least two independent or predictor variables.
Qualitative hypothetical example (case study). The purpose of this qualitative multiple case study is to explore the strategies that department store managers use to motivate their sales associates. The targeted population will comprise department store managers from three department stores in the southeast region of the United States who have implemented strategies to motivate their sales associates. The implications for positive social change include the potential to (provide social change statement).
Note: In a case study, and often in ethnographic studies, the population is limited to those people meeting the participant criteria in the company or companies being studied. In a phenomenological or narrative study, the population includes all people who meet the participant criteria.
1.5 - Nature of the Study
The Nature of the Study component serves two purposes: (a) describing and justifying the methodology (i.e., quantitative, qualitative, or mixed-method) and (b) describing and justifying the design (e.g., case study, phenomenological, correlational, sequential explanatory). Therefore, a well-crafted Nature of the Study can be presented in two paragraphs and need not exceed one page.
The first paragraph describes and justifies the methodology, and the second paragraph describes and justifies the design. These two components should not be intermingled. A common error in this heading is to restate the purpose, identify variables and analyses, and include other superfluous information. Again, map to the rubric and only include the required content!
Remember that the Nature of the Study succinctly represents your defense of your choice of method and design; therefore, it must have depth. You must demonstrate to the reviewers that you have done the reading and research needed to support your research method and design. That evidence also includes discussing why you did not choose other methods and designs.
Keep this heading deep yet brief. You will have time to expand upon the Nature of the Study
later in the Research Method and Design heading.
Hypothetical Quantitative Example47
I chose a quantitative methodology for this study. Using a quantitative study enables one to identify results that can be used to describe or note changes in numerical characteristics of a population of interest; generalize to other, similar situations; provide explanations or predictions; and explain causal relationships (cite). Thus, the quantitative method is appropriate for this study because the purpose of the study is to analyze numerical data and infer the results to a larger population. A mixed-methods study contains the attributes of both quantitative and qualitative methods (cite). The qualitative method is appropriate when the research intent is to explore business processes, how people make sense and meaning, and what their experiences are like (cite). Therefore, the qualitative method and the qualitative portion of a mixed-method approach are not appropriate for this study.
Specifically, I chose the correlational design for this study. A correlational researcher examines the relationship between or among two or more variables (cite). The correlational design is appropriate for this study because a key objective of this study is to predict the relationship between a set of predictor variables (leadership style and size of business) and a dependent variable (company revenue). Other designs, such as experimental and quasi-experimental designs, are appropriate when the researcher seeks to assess a degree of cause and effect (cite). The principal objective of this study is to identify a predictive model; thus, the experimental and quasi-experimental designs are not appropriate.
Hypothetical Qualitative Example
The three research methods are qualitative, quantitative, and mixed methods (cite). I selected the qualitative method to use open-ended questions. Qualitative researchers use open-ended questions to discover what is occurring or has occurred (cite). In contrast, quantitative researchers use closed-ended questions to test hypotheses (cite). Mixed-methods research includes both a qualitative element and a quantitative element (cite). To explore (your topic), I will not test hypotheses, which is part of a quantitative study or the quantitative portion of a mixed-methods study.
47 Note: As you can see, the example clearly starts with topic sentences (red text) that foreshadow what is to be addressed in the paragraph. Notice that the quantitative method paragraph does not address the design, because the design is not foreshadowed in the topic sentence. Remember, a topic sentence alerts the reader to the main topic of the paragraph.
I considered four research designs that one could use for a qualitative study on (2-3 words identifying your topic): (a) miniethnography, (b) focus group, (c) narrative, and (d) case study. (Note: Select the designs that you considered and that are applicable to an applied qualitative study.) Miniethnography involves… (Briefly discuss miniethnography: 1 sentence defining it with a citation, and 1 sentence if needed on why it is or is not the optimal choice). Business researchers use focus groups to… (Briefly discuss focus groups: 1 sentence defining them with a citation, and 1 sentence if needed on why they are or are not the optimal choice). A narrative design entails… (Briefly discuss narrative designs: 1 sentence defining them with a citation, and 1 sentence if needed on why the design is or is not the optimal choice). Case study researchers… (Briefly discuss case study: 1 sentence defining it with a citation, and 1 sentence if needed on why it is or is not the optimal choice).
1.6 - Research Question (Quantitative Only)
DBA doctoral studies require the highest level of rigor and scholarship. One focus of rigor and scholarship is the number of predictor or independent variables examined in quantitative doctoral studies. Nonexperimental research (e.g., correlational, quasi-experimental) requires the use of at least two independent or predictor variables. This is due to the “third variable” problem. A third variable, also known as a confounding or mediating variable, can confound the relationship between the independent and dependent variables. This confounding can lead the researcher to interpret the results incorrectly, leading to an incorrect rejection of the null hypothesis.
As such, all DBA quantitative studies require the examination of at least two predictor or independent variables. This affects the statistical analysis, as simple bivariate correlations (correlational designs) or one-way ANOVAs cannot be used as the inferential statistical tests. Other statistical procedures, such as partial correlation, semipartial correlation, mediation and moderation analyses, or multiple regression, must be used, at a minimum, for correlational studies. Quasi-experimental, causal-comparative, and similar designs must employ statistical analyses (e.g., factorial ANOVAs), at a minimum, that examine more than one independent variable.
Below are appropriate and inappropriate examples of correlational and quasi-experimental research questions. These examples depict two predictor/independent variables: (a) employee job satisfaction and (b) leadership experience. The dependent variable is employee productivity.
· Appropriate Correlation Example (two predictor variables): Does a linear combination of employee job satisfaction and leadership experience significantly predict employee productivity?
· Inappropriate Correlation Example (only one predictor variable): Does employee job satisfaction significantly predict employee productivity?
· Appropriate Quasi-experimental Example (two independent variables): Do employee job satisfaction and leadership experience significantly influence employee productivity?
· Inappropriate Quasi-experimental Example (only one independent variable):
Does employee job satisfaction significantly influence employee productivity?
1.7 - Hypotheses (Quantitative/Mixed-Method Only)
Hypotheses
Two major elements in the research design are the hypotheses and the variables used to test them. A hypothesis is a provisional idea whose merit deserves further evaluation. Two hypotheses, the null (H0) and alternative (H1), are to be stated for each research question. Below are appropriate examples of correlational and quasi-experimental/causal-comparative null and alternative hypotheses; note how they mirror the research questions identified above in the quantitative Research Question heading. These examples depict two predictor/independent variables: (a) employee job satisfaction and (b) leadership experience. The dependent variable is employee productivity. The H0 and H1 reflect the appropriate statistical notation and are to be included. See more on hypotheses at: http://www.socialresearchmethods.net/kb/hypothes.php
Correlation
· Null Hypothesis (H0): The linear combination of employee job satisfaction and leadership experience will not significantly predict employee productivity.
· Alternative Hypothesis (H1): The linear combination of employee job satisfaction and leadership experience will significantly predict employee productivity.
Quasi-experimental
· Null Hypothesis (H0): Employee job satisfaction and leadership experience do not significantly influence employee productivity.
· Alternative Hypothesis (H1): Employee job satisfaction and leadership experience significantly influence employee productivity.
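To make the connection between the hypotheses and the required statistical analysis concrete, the following minimal sketch (Python, using simulated data and the hypothetical variable names from the correlation example above) shows how the overall F test from a multiple regression with two predictors addresses the correlation null hypothesis. It is an illustration under stated assumptions, not a prescribed analysis.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data so the example runs end to end; a real study uses collected data.
rng = np.random.default_rng(1)
n = 120
df = pd.DataFrame({
    "job_satisfaction": rng.normal(3.5, 0.8, n),     # predictor 1
    "leadership_experience": rng.normal(10, 4, n),   # predictor 2
})
df["productivity"] = (0.6 * df["job_satisfaction"]
                      + 0.2 * df["leadership_experience"]
                      + rng.normal(0, 1, n))

# The omnibus F test evaluates whether the linear combination of both
# predictors significantly predicts the outcome (i.e., it tests H0 above).
model = smf.ols("productivity ~ job_satisfaction + leadership_experience",
                data=df).fit()
print(f"R2 = {model.rsquared:.3f}, F = {model.fvalue:.2f}, p = {model.f_pvalue:.4f}")

If the p value for the omnibus F test falls below the chosen alpha level, the researcher rejects H0 in favor of H1, that the linear combination of the two predictors significantly predicts employee productivity.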
1.8 - Research Question (Qualitative Only)
In a qualitative study, the Research Question uses the same words as the Specific Business Problem to identify the specific business leader and what the leader lacks or has in limited supply. The following examples demonstrate how to align the research question with the specific business problem.
1.9 - Interview Questions (Qualitative Only)
In qualitative studies, the researcher must first identify the population for the study (business leaders who have solved or are solving the specific business problem) and align the interview questions with the population and the research question. Interview questions must (a) provide answers to the research question, (b) not go beyond the research question (i.e., no demographics if not part of the research question), (c) be in language (word choice) that the participant will understand, (d) be open-ended (not answerable with a simple yes or no), and (e) be applied DBA questions rather than speculative PhD questions (see the examples below).
Interview questions should be straightforward and ask what or how the business leader has addressed the research problem. Typically, case study and ethnographic interviews will be semistructured, semiformal, unstructured, or informal. Phenomenological studies use the phenomenological long interview, with only one to three questions, to sustain a longer discussion, gather in-depth data, and reach a state of epoché. Students should critically read about the different interviewing techniques and select the best technique for the study design.
Semistructured and semiformal interviews frequently include six to ten interview questions to allow time for probing questions. The final interview question in a semistructured or informal interview frequently asks the participant to share any additional information for addressing the research question(s): What additional information would you like to share about XYZ? One typically uses an unstructured or informal interview technique when having a more casual discussion often spreading the interview questions out over time during field visits (i.e., during a direct observation or participant observation phase in data collection).
In contrast, the phenomenological long interview typically has one or two interview questions. Although phenomenological interview questions are written as questions, the interview protocol involves creating an in-depth discussion (typically 1-2 hours) and reaching a state of epoché. The phenomenological long interview requires more study and preparation than the more traditional interviewing techniques used in ethnography and case study designs.
Be cautious not to confuse the interviewing process with the interview questions. The concept of semistructured questions or semistructured interview questions does not exist. Semistructured, semiformal, unstructured, and informal interviews are specific interviewing techniques/processes. All qualitative interview questions are open-ended; the interview questions themselves, however, are not semistructured.
Example Research Question
What strategies do department store managers use to motivate their sales associates?
Example Applied DBA Interview Questions
1. What strategies are you using to motivate your sales associates?
2. What method did you find worked best to motivate your sales associates?
3. How did your sales associates respond to your different motivation techniques?
Example Speculative/Theoretical PhD Questions (do not use)
1. What strategies should managers use to motivate sales associates?
2. What method do you think will work best to motivate sales associates?
3. How do you feel your sales associates respond to other motivation techniques?
1.10 - Theoretical/Conceptual Framework
A theoretical (for quantitative studies) or conceptual framework (for qualitative studies) offers a systematic view of a phenomenon. In other words, the framework provides a lens through which to view a phenomenon.
Identifying the Best Theory or Conceptual Model
Make certain that the theory aligns with the research question. Consider the following when searching for a theory or conceptual model for the conceptual framework.
· Critically read peer-reviewed studies related to your topic and identify the theories that those authors found aligned with their studies. After one has read and synthesized numerous peer-reviewed studies related to the topic for the annotated bibliography, one will notice a few theories (or conceptual models) that aligned with several studies.
· Critically read the seminal work on the theories (or conceptual models) that you found in peer-reviewed studies related to your topic.
· Related studies may be about the concept and not the specific industry.
· For example, if one is studying how family-owned wrecking yard leaders plan for succession, one could look at studies on leadership training and development in other types of organizations.
· Quantitative. Select the theory or conceptual model that best aligns with the research question and provides an interrelated set of constructs, variables, hypotheses, or propositions that offer an explanation for the phenomenon.
· Qualitative. Select the theory or conceptual model that best aligns with the research question.
As you can see, it is important to immerse yourself in the literature pertaining to your conceptual framework to gain a good understanding of the framework. More important, your literature review must include an exhaustive review of the literature pertaining to the conceptual framework you are proposing for your study. This is extremely important, as you will be required to discuss your findings as they confirm, disconfirm, extend, etc., the extant literature on your conceptual framework. You must critically analyze and synthesize the studies where your conceptual framework has been the lens through which the phenomenon has been viewed.
As outlined in the DBA Rubric, you are required to present a brief overview of your theory or conceptual framework in Section one of the proposal. Please note this is not to be a detailed review of your theory or framework. The detailed review is required in the Review of the Literature heading. Here, a model for presenting the theory or framework heading is offered.
You will want to state the name of the theory or identify the conceptual framework, identify the theorist if applicable, list key concepts of the theory or framework, identify any propositions or hypotheses, and identify how the theory or framework applies to your study. Please note there are obvious variations to this model depending upon your particular study and topic. However, the intent is to briefly present the key aspects of your theory and/or framework and show how it fits into your study.
Quantitative Example
Burns (1978) developed the transformational leadership theory. Burns used the theory to offer an explanation for leadership based upon the premise that leaders are able to inspire followers to change expectations, perceptions, and motivations to work toward common goals. Burns identified the following key constructs underlying the theory: (a) idealized attributes, (b) idealized behaviors, (c) intellectual stimulation, (d) inspirational motivation, and (e) individualized consideration. As applied to this study, the transformational leadership theory holds that I would expect the independent variables (transformational leadership constructs), measured by the Multifactor Leadership Questionnaire, to predict employee turnover intention because (provide a rationale based upon the logic of the theory and the extant literature). The following figure48 is a graphical depiction of the transformational leadership theory as it applies to examining turnover intentions.
48 Graphical models are useful for depicting the theoretical framework in quantitative studies.
Let’s examine the theoretical framework from the perspective of possible lenses through which to view phenomena. Assume the business problem or phenomenon is the failure rate of small businesses, an obvious business concern. There is a plethora of explanations that can be offered for the failure of small businesses. As the researcher, you have the choice of lens through which to view the problem. For example, you might hypothesize or rationalize that transformational leadership characteristics offer a systematic view of the failure of small businesses. Specifically, you hypothesize or rationalize that a leader’s transformational leadership characteristics are influential in the success of small businesses. As such, your study would be grounded in transformational leadership theory or a transformational leadership conceptual framework.
Or perhaps you hypothesize or rationalize that servant leadership characteristics offer a systematic view of the failure of small businesses. Specifically, you hypothesize or rationalize that a leader’s servant leadership characteristics are influential in the success of small businesses. As such, your study would be grounded in servant leadership theory or a servant leadership conceptual framework. Hence, the number of lenses through which a problem or phenomenon can be viewed is limitless. Only your imagination stands between you and selecting the theory or conceptual framework that can be used to connect your study to existing knowledge.
Perhaps one of the most misunderstood aspects of theory is how to apply it in the doctoral study. Researchers conducting a quantitative study grounded in transformational leadership theory must measure or assess the constructs underlying the theory. The broad constructs of transformational leadership theory are idealized attributes, idealized behaviors, intellectual stimulation, inspirational motivation, and individualized consideration.
Therefore, an instrument such as the Multifactor Leadership Questionnaire (MLQ) is appropriate to measure the underlying constructs of transformational leadership theory. Any instrument not proven to assess transformational leadership cannot be approved for use in a study grounded in transformational leadership theory. If you (inappropriately) used a nonvalidated instrument, you would not be testing the proposed transformational leadership theory, and your
study would not have construct validity. For example, the Servant Leadership Survey (SLS) instrument could not be approved for use in a study grounded in transformational leadership theory, as the SLS was validated for use in measuring constructs underlying servant leadership theory.
Qualitative Example
Example research question. What strategies do department store managers use to motivate their sales associates?
Example conceptual framework. Vroom (1959) developed the expectancy-valence theory, which he later called the expectancy motivation theory (Vroom, 1964). The expectancy motivation theory suggests that employees will exhibit positive performance behaviors when they believe that their work will result in certain rewards (Vroom, 1964). Building upon Vroom’s expectancy motivation theory, Gilbert (1978, 2013) published his behavioral engineering model, which provided a motivational foundation for the inputs that can lead to specific employee motives. Gilbert identified three categories covering information, instrumentation, and motivation. Within the manager’s scope of control are data, resources, and incentives. Within the employee’s scope of control are knowledge, capacity, and motives. Gilbert argued that if managers improved the availability of data, provided the tools and equipment, or offered incentives to perform, employees would exhibit a change in willingness to participate. Likewise, if employees have a change in knowledge or capacity to perform, they will exhibit a change in willingness to participate (Gilbert, 1978, 2013). Vroom’s (1964) expectancy motivation theory and Gilbert’s (1978) behavioral engineering model both align with this study, which explores the strategies that department store managers use to motivate their sales associates.
1.11 - Operational Definitions
Do not include terms found in a basic academic dictionary (e.g., Webster’s). List only terms that might not be understood by the reader. All definitions should come from professional/scholarly sources and appear in alphabetical order. Do not include more than 10 key operational definitions. Although one can use a maximum of 10 terms, there may be only a few terms pertinent to the study. A specific term introduced by only one or two sources in the literature review is likely not pertinent to the study and should not be listed in the operational definitions.
1.12 - Assumptions, Limitations, and Delimitations49
Assumptions are facts considered to be true, but which cannot actually be verified by the researcher. Assumptions carry risk and should be treated as such. A mitigation discussion would be appropriate. Identify all assumptions associated with the study. Limitations refer to potential study weaknesses, which cannot be addressed by the researcher. Identify all limitations
49 Review the following resource for more detailed information: Ellis, T. J., & Levy, Y. (2009). Towards a guide for novice researchers on research methodology: Review and proposed methods. Issues in Informing Science and Information Technology, 6, 323-337. Retrieved from http://www.informingscience.org/Journals/IISIT/Overview
associated with the study. Delimitations refer to the bounds or scope of the study. Describe the boundaries and what is in and out of your study’s scope.
1.13 - Significance of the Study
Contribution to Business Practice
Discuss how the findings, conclusions, and recommendations from your study could fill gaps in the understanding and effective practice of business.
Implications for Social Change
Provide a statement of your study’s potential for effecting positive social change or the improvement of human or social conditions by promoting the worth, dignity, and development of individuals, communities, organizations, institutions, cultures, or societies.
1.14 - Review of the Professional and Academic Literature
The literature review content needs to be a comprehensive and critical analysis and synthesis of the literature related to the theory and/or conceptual model from the Theoretical/Conceptual Framework as well as the existing body of knowledge regarding the research topic. What a literature review should not be is an amalgamation of essays on the topic. The approach to this heading may vary by the author’s specific purpose. For example, if your study is to be grounded in the transformational leadership theoretical or conceptual framework, you will be examining or exploring your phenomenon through a leadership lens. You want to report on extant research that was grounded in the transformational leadership theoretical/conceptual framework. You would want to report on literature that is as close to your topic/phenomenon as possible. In addition, if you are conducting a quantitative study, you need to include the literature for any other key variables. A basic outline is presented in Appendix A.50
Critical analysis and synthesis of the relevant literature will be an important element of the literature review. The review of the literature is not to be a regurgitation of what you have read. It is also not to teach about a topic; rather, it is to show your mastery of the previous and recent research on your topic and to provide a comprehensive, up-to-date literature review. Start with an introductory heading and then report the literature. This should be an exhaustive review of the literature using the chosen theoretical/conceptual framework and should consist of the key and recent writings in the field. Repeat this approach if there are any additional theories. In addition, in quantitative studies, there must be a critical analysis and synthesis for each variable.
There are three questions that students typically ask about the literature review: (a) length, (b) organizational structure, and (c) content. The length will depend upon the theoretical foundation related to the topic and scholarly studies related to the theory. Typically, for a doctoral study, a literature review will average 35-40 pages. However, demonstrating a rich and
50 Literature reviews will vary by topic, author, etc. However, Appendix A presents the minimum requirements for a quantitative study.
comprehensive review of the topic is more important than the number of pages in a literature review.
The most common ways to organize the literature review are to use a chronological structure, a topical structure, or a combination of the two. The literature review should be a succinct yet in-depth critical analysis of scholarly studies and authoritative seminal work, not a summary of one’s reading or an amalgamation of essays on the topic.
Typically, one half to two thirds of a good literature review will relate the theory or conceptual models to a critical analysis and synthesis of the topic and problem. One organizational strategy for the literature review is (a) one third discussing the theory or conceptual model (see figure below), (b) one third providing the topical foundation, and (c) one third discussing the topic in relation to the theory.
1.15 – Transition
This heading summarizes the key contents of Section 1. Do not introduce any new material in the summary, but do provide an overview of the primary objectives and contents of Sections 2 and 3.
Section 2 – The Project
2.1 - Purpose Statement
Simply cut-and-paste the Purpose Statement from Section 1.
2.2 - Role of the Researcher
The Role of the Researcher is an important part of your proposal and study. The content that you present in this subheading is important because it demonstrates that (a) you have done the research that is required, (b) you understand what your role is in the study design, and (c) you understand the limitations and challenges of this type of role and how any concerns may be mitigated to enhance the reliability and validity of your work.
One of the most challenging parts to write in this subheading concerns the use of a personal lens, primarily because novice researchers (such as students) assume that they have no bias in their data collection. However, it is important to remember that a participant’s as well as the researcher’s bias/worldview is present in all social research, both intentionally and unintentionally, which is why it is important to address strategies to mitigate bias.
To address the concept of a personal lens, remember that in qualitative research, the researcher is the data collection instrument and cannot separate themselves from the research, which brings up special concerns. Remember that the researcher operates among multiple worlds while engaging in research, which include the cultural world of the study participants as well as the world of one’s own perspective. A researcher's cultural and experiential background will contain biases, values, and ideologies that can affect the interpretation of a study’s findings.
Therefore, researcher bias is a concern because the data can reflect the researcher’s personal bias and concerns. It becomes imperative that the interpretation of the phenomena represent that of participants and not of the researcher. Hearing and understanding the perspective of others may be one of the most difficult dilemmas the researcher must address. The better a researcher is able to recognize his/her personal view of the world and to discern the presence of a personal lens, the better one is able to hear and interpret the behavior and reflections of others.
How you address and mitigate a personal lens/worldview during your data collection and analysis is important and a key component in the Role of the Researcher subheading. It is important that a novice researcher recognizes their own personal role in the study and mitigates any concerns during data collection. Part of your discussion in this subheading should address how this is demonstrated through using an interview protocol, member checking, transcript validation and review, reaching data saturation, enabling sense making, facilitating epoché, careful construction of interview questions, and other strategies to mitigate the use of one’s personal lens during the data collection process of the study.
It would be impossible to remove all bias because you are a human being. Rather, one mitigates bias as best one can. This is demonstrated by using an interview protocol, member checking, data saturation, and other strategies to mitigate the use of one’s personal lens during the data collection process of your study. Avoiding inadvertently driving participants to predetermined conclusions speaks to the same concern.
2.3 - Participants
The requirements are straightforward but often missed in the Participants heading. Consider the explanations in the following table.
Rubric Requirement |
Explanation |
a. Describes the eligibility criteria for study participants. |
The participants must meet the eligibility requirement within the scope of the population. Consider the research question: What strategies do department store managers use to motivate their sales associates? If one identified the population as department store managers who have worked in the field for 8 years and have a minimum of 5 years supervising sales associates, one would not necessarily be addressing the requirement. The criteria for the example research question would be department store managers who have successful strategies that they are using to motivate sales associates. The department store manager may have been in the field for 20 years or 1 month; the time in position has nothing to do with the study. Likewise, working with the employees does not mean that the department store manager is using a strategy to motivate the sales associates. |
b. Discusses strategies for gaining access to participants. |
Explain your plan for gaining access to participants. In a quantitative survey, one may use a professional association membership list or another type of list to access participants via email, phone, etc. For a qualitative study, one may also use professional associations, trade affiliations, etc., to gain access. For a case study, one may also use rosters inside the company(ies) and email, call, or visit in person. It is vital that you develop a strategy to determine that participants meet the study criteria before inviting participation. |
c. Identifies strategies for establishing a working relationship with participants. |
Once one gains access, one needs to develop a working relationship with the participants. This may be as simple as sending a survey link via email in a quantitative study, or it may involve how you will cover the informed consent form and set the stage for a qualitative interview (often referencing the interview protocol). |
d. The participants must align with the overarching research question. |
This requirement is a reminder that one must have the correct criteria for selecting the participants and that the criteria must align with the research question—nothing else should be included in the criteria. |
e. It is recommended to support claims and decisions with multiple scholarly peer-reviewed or seminal sources (as appropriate). |
While planning the study, one will make several decisions. In this heading, decisions include the participant criteria, how one will gain access to the participants, and how one will build a working relationship with the participants. It is recommended to support claims and decisions with multiple scholarly peer-reviewed or seminal sources. Fortunately, you have an annotated bibliography with peer-reviewed studies in which others have made similar decisions, as well as seminal sources on methodology. Tip: To represent your sources correctly, write about what you will do in one sentence and synthesize your sources supporting your decision in a separate sentence. |
2.4 - Research Method
This heading is an extension of the Nature of the Study. The first paragraph of the Nature of the Study required a description and justification of the methodology. Here you will extend that discussion by providing more information and additional resources. Remember to use multiple sources to support claims and decisions. It is important to have a strong case to support the rationale for the research method and design.
2.5 - Research Design
This section is an extension of the Nature of the Study. The second paragraph of the Nature of the Study required a description and justification of the design. Here you will extend that description by providing more information and additional resources. Remember to use multiple sources to support claims and decisions.
Data Saturation in Qualitative Study Designs
A vital prerequisite for a valid qualitative study is having a plan to ensure data saturation. Data saturation ensures validity in a qualitative study in much the same way that a statistically valid sample ensures validity in a quantitative study. See more on data saturation in the Population and Sampling heading below.
How to Use Multiple Sources to Support Claims and Decisions
Specifically stating multiple sources is one way to make it clear to the reviewers that you have mapped to the Rubric. However, what the reviewers are looking for is that students have done the required reading to justify the choice of research design that will best assist in collecting data to answer the research question. Rather than list name-date, name-date, name-date repeatedly, one should synthesize the concepts into one cohesive whole supported by sources in a somewhat indirect manner. For example:
Case studies are the preferred strategy researchers employ when asking how or what questions (Amerson, 2011; Andrade, 2009; Yin, 2009). These types of studies identify operational links among events over time (Andrade, 2009; Baxter & Jack, 2008; Yin, 2009). Case studies may be exploratory, explanatory, or descriptive and may involve one organization and location or multiple organizations and locations for a comparative case study (Amerson, 2011; Stake, 1995; Yin, 2009).
In other words, you are supporting your synthesis with multiple sources. Another way to support your design with a source is:
Ethnographic study is unique in that it includes fieldwork where all relevant participants are observed and interviewed informally rather than a specified number as in phenomenology (Fusch, 2001; Wolcott, 2011). Bernard (2012) stated that the number of participants needed for a qualitative study was a number he could not quantify, but that the researcher takes what he can get.
In other words, you support your synthesis in a more direct way. Note that Bernard's entire work is not represented in the text; rather, one important statement that he made is included, and it supports the chosen research design.
In both examples, the synthesis demonstrated depth of knowledge that is supported by published peer-reviewed work, which is what reviewers want to see in your work. Moreover, it is a demonstration of your scholarly research abilities. Note, you may use the same source to support more than one decision if applicable.
2.6 - Population and Sampling (Quantitative Only)
Population
Start by describing the population from which the sample will be drawn. Include any pertinent demographic variables (e.g., CEO, senior executive, mid-level manager, sales professional, front-line supervisor). Refer to p. 29 (Participant Characteristics) of the APA Manual (American Psychological Association, 2010) for other appropriate characteristics when applicable.
Sampling
The two broad categories of sampling methods are probabilistic sampling (random sampling) and non-probabilistic sampling (non-random sampling)51. Identify and defend your sampling method. You must address the strengths and weaknesses of your chosen sampling method. For example, if you will utilize a stratified random sampling technique, defend your reason for doing so. Also note why stratified sampling is more appropriate for your research situation than another sampling technique. You will need to refer to the literature pertaining to sampling techniques.
Describe and defend the sample size. This is where you discuss conducting a power analysis to determine the appropriate sample size. You will present your power analysis in this component. G*Power 3 is an excellent power analysis software tool and can be downloaded at: http://www.gpower.hhu.de/en.html . You will find a user’s manual and short tutorial at the same website. See Appendix B for an example power analysis.
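For students who want to understand what an a priori power analysis computes, the following sketch (Python, using SciPy's noncentral F distribution and Cohen's f-squared effect size convention) approximates the sample-size calculation for a multiple regression with two predictors. It is illustrative only; G*Power remains the recommended tool for the power analysis you report.

from scipy.stats import f as f_dist, ncf

def regression_power(n, predictors, f2, alpha=0.05):
    # Power of the omnibus F test in multiple regression (Cohen's f-squared).
    u = predictors            # numerator degrees of freedom
    v = n - predictors - 1    # denominator degrees of freedom
    lam = f2 * n              # noncentrality parameter
    f_crit = f_dist.ppf(1 - alpha, u, v)
    return 1 - ncf.cdf(f_crit, u, v, lam)

# Smallest N reaching .80 power for two predictors and a medium effect (f2 = .15).
n = 10
while regression_power(n, predictors=2, f2=0.15) < 0.80:
    n += 1
print(n)  # roughly 68, in line with the usual G*Power result for these inputs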
Describe the eligibility criteria for inclusion in the study. Discuss any exclusion criteria. Make the eligibility criteria clear, as the results of the study cannot be generalized beyond your targeted population. You need to make it clear as to who can, and who cannot, participate in your study.
2.7 - Population and Sampling (Qualitative Only)
Defining the Population
In this heading, one needs to define the population within the scope of the study. For example, in a phenomenological study, the population will be all the people within the scope of the study (e.g., a specific industry) who meet the participant criteria noted in the Participants heading (2.3) above. In an ethnographic study or case study, the population comprises all people who meet the participant criteria in one company for an ethnographic study or single case study and in multiple companies for a multiple case study. One should identify the number of companies in a multiple case study. Likewise, one should identify the approximate number of people (who meet the participant criteria) within the study’s population.
Sampling
One must describe and justify the sampling method (census, convenience, criterion, purposeful, quota, snowball, etc.). Once one defines the total population meeting the participant criteria within the scope of the study, one must identify the sample size that gives the researcher the best opportunity to reach data saturation. Neither a large sample size nor a small sample size guarantees that one will reach data saturation; rather, what matters is the composition of the sample and the richness of the data it yields. One must also select a sampling technique that supports the research design.
51 See Appendix B for a typology of sampling strategies.
For example, one may use a census sample for a single or multiple case study with a small population versus a convenience sample in an ethnographic study. A census sample is actually a census, which means that the study participants will include 100% of the population. For example, as depicted in the following graphic, if one identified the scope of a multiple case study to include five companies and the people who meet the participant criteria for the population as the CEOs of those five companies, there would be a census sample if all five of the CEOs participated.
Data Saturation and Sampling
In the Population and Sampling heading (as well as the Research Design and Validity headings), one must define how one will ensure data saturation. Although data saturation ensures validity in a qualitative study in much the same way that a statistically valid sample does in a quantitative study, there is no direct correlation between the sample size and reaching data saturation. Data saturation in qualitative research is a way to ensure that one obtained accurate and valid data. Using too small or too large a sample will not, by itself, ensure data saturation. One should critically read and obtain a clear understanding of data saturation before writing a qualitative proposal. Fusch and Ness (2015) synthesized the literature to identify some key characteristics of reaching data saturation, which include no new data, no new themes, no new coding, and the ability to replicate the study (provided one asks the same participants the same questions in the same timeframe). The study design (case study, miniethnography, phenomenological, etc.) will affect when and how one reaches data saturation. One may conduct interviews only in a phenomenological study, whereas one would use multiple data collection methods in a case study.
Although the DBA leadership requires a minimum of 20 participants in a phenomenological study, and although one may use member checking to enhance the richness of the data, one may have to interview many more participants to reach data saturation. In contrast, in a case study using a small census sample and multiple data collection methods, one may reach data saturation with one or a few participants. In qualitative studies, quality (rich data) is more important than quantity (thick data).
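The following bookkeeping sketch (Python, with hypothetical participant labels and codes) illustrates one of the Fusch and Ness (2015) indicators, no new coding, by tallying how many new codes each successive interview contributes. It is a simple illustration of the idea, not a substitute for a reasoned data saturation plan.

# Hypothetical codes assigned to each participant's interview transcript.
coded_interviews = {
    "P1": {"training", "recognition", "scheduling"},
    "P2": {"recognition", "incentives"},
    "P3": {"incentives", "scheduling"},
    "P4": {"recognition", "training"},
}

seen = set()
for participant, codes in coded_interviews.items():
    new_codes = codes - seen          # codes not contributed by earlier interviews
    seen |= codes
    print(participant, "new codes:", sorted(new_codes))
# Here P3 and P4 add no new codes, which would support (not prove) a claim
# that data saturation is being approached.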
2.8 - Ethical Research
Each research study comes with its own set of specific ethical issues. Thus, a rubric cannot address all possible scenarios. Therefore, it will be helpful to review the IRB Application Form before you complete this component to ensure you address any requirements not identified in the rubric or Research Handbook. However, as a minimum, discuss the informed consent process. Include a copy of the informed consent form in an appendix and list the informed
consent form in the Table of Contents. Discuss participant procedures for withdrawing from the study. Describe any applicable incentives. Clarify the measures for ensuring that the ethical protection of participants is adequate. Agreement documents are to be listed in the (a) text of the study, (b) appendices, and (c) Table of Contents. Include a statement that data will be maintained in a safe place for 5 years to protect the rights of participants. Ensure you indicate that the final doctoral manuscript will include the Walden IRB approval number. Ensure the document does not include names or any other identifiable information of individuals or organizations.
Each participant in your study must give written consent to take part in the data collection phase of the work. Moreover, as a researcher following the protocols of the Belmont Report, you must ensure that your participants have a full understanding of their part in the study. Finally, you must ensure that participants understand that they may withdraw from your study at any time without penalty, and how to withdraw from the study.
It is a good practice to complete the first draft of your IRB application while completing the ethics section as well as Section 2. Consider (a) writing a sentence about your plan to share a summary of the findings with the study participants, and (b) not using the term anonymous in qualitative studies if you will be interviewing participants or will otherwise know who they are. Qualitative researchers can protect the confidentiality, but not the anonymity, of participants because the researcher will know who the participants are. Depending upon the data collection method, quantitative researchers may be able to protect participants' anonymity.
2.9 - Data Collection—Instruments (Quantitative)
You will describe each instrument's purpose, intended populations, scales, scoring process, time needed to complete, etc. This heading will also address the psychometric issues surrounding the instrument, namely reliability and validity; this is very important. You will need to report the reliability and validity coefficients. Where possible, include the details of the reliability measures employed (e.g., test-retest, equivalent or alternate form, split-half, and internal consistency). Validity should include content validity, criterion-related validity, and construct validity. State briefly what these measures of validity are, and report their intercorrelation coefficients.
You will need to address any special requirements of the publisher. You will need to gain permission from the test publisher to use some instruments. This can be requested by sending a formal letter or email to the publisher. Alternatively, you may need to complete a training course or require your chair’s signature to acquire the instrument—be sure to include this information if applicable.
2.10 - Data Collection – Instruments (Qualitative)
The requirements are straightforward but often missed, as they are in the Participants heading. Consider the following rubric requirements and explanations.
Rubric requirement (a): In addition to identifying the student as the primary data collection instrument, identifies the data collection instrument/process (i.e., informal interview, semistructured interviews, phenomenological in-depth interviews, focus groups, company/archival documents, etc.).
Explanation: This requirement has two parts, and students sometimes miss one of them, which can lead to a revision request: (1) identifying that you are the primary data collection instrument, and (2) identifying all of the secondary, tertiary, etc. data collection instruments. Although a single data collection method is common in ethnographic research, in case studies students must use a minimum of two data collection methods.
Rubric requirement (b): Clarifies how the student will use the data collection instrument/technique (the process/protocol).
Explanation: Describe how you will use the instrument(s) by providing a brief definition of each instrument and referencing interview or focus group protocols, etc. The focus here should be on defining and using the instrument. For example, if you are using a specific type of interview, identify the interviewing technique specific to your chosen approach (i.e., unstructured or semistructured interviews). Keep this brief, but be sure to define the different data collection methods (with scholarly support); you will expand upon the process in the Data Collection Technique heading.
Rubric requirement (c): Identifies how the student will enhance the reliability and validity of the data collection instrument/process (i.e., member checking, transcript review, pilot test, etc.).
Explanation: Clarify how you will enhance the reliability and validity of the instruments, such as using an expert panel to validate interview questions, member-checking follow-up interviews after semistructured interviews, triangulation of multiple data collection methods (during the data analysis, as applicable to the research design), etc.
Rubric requirement (d): Identifies where in the appendices the instrument (i.e., interview protocol, focus group protocol, interview questions, etc.) is (are) located. Ensures the Table of Contents lists the appendices.
Explanation: As applicable, include interview protocols, focus group protocols, and direct/participant observation protocols in the appendices.
Rubric requirement (e): Supports every decision with a minimum of three scholarly peer-reviewed or seminal sources.
Explanation: During study planning, one will make several decisions. In this heading there are several decisions to make and support. Each of the following decisions will need scholarly support:
· Identifying that you are the primary data collection instrument.
· Identifying all of the secondary, tertiary, etc. data collection instruments (such as types of interviews, focus groups, company/archival documents, company marketing materials, etc.).
· Identifying how you will use the instruments by providing a brief definition of the instrument and referencing interview or focus group protocols, etc.
· Identifying how you will enhance the reliability and validity of the instruments, such as by using member-checking follow-up interviews after a semistructured interview.
Tip to represent your sources correctly: Write about what you will do in one sentence and synthesize your sources supporting your decision in a separate sentence. See the following examples:
Academic integrity code of conduct violation (misrepresenting sources) example 1:
I will use semistructured interviews to explore the strategies that department store managers use to motivate their sales associates (Johnson & Williams, 2013; Rubin & Rubin, 2012; Smith, 2014). Note that the sources did not discuss the student's study in their publications, so the example is a misrepresentation of the sources.
Correctly supporting a decision example 1. Cite (2014) used semistructured interviews to determine how sales managers motivate sales associates. Likewise, Cite (2013) found that semistructured interviews were a good approach to learn how department store managers motivate sales clerks. Rubin and Rubin (2012) argued that semistructured interviews are a good way for the researcher to focus on the details that address the research question. Therefore, I will use semistructured interviews to explore the strategies that department store managers use to motivate their sales associates. Note: please be sure to synthesize your sources to support your decisions.
Academic integrity code of conduct violation (misrepresenting sources) example 2:
I will be the primary data collection instrument in this study (Denzin, 2014; Marshall & Rossman, 2016; Wolcott, 2005). Note that the sources did not discuss the student's study in their publications and the example is a misrepresentation of the sources.
Correctly supporting a decision example 2. I will be the primary data collection instrument in this study. In qualitative research, the researcher is the primary data collection instrument because the researcher hears, sees, and interprets the data (Denzin, 2014; Marshall & Rossman, 2016; Wolcott, 2005). Note: please be sure to synthesize your sources to support your decisions.
2.11 - Data Collection Technique
Do not confuse the purpose of this heading with that for the explanation of procedures. You want to discuss the main approach to collecting your data. It is a good idea to restate the research question and then address the data collection process. Depending upon whether you are using a quantitative or qualitative method, you should discuss and support your decision for collecting the data.
Quantitative Studies
In a quantitative study, one would discuss (a) surveys, (b) structured record reviews to collect data (e.g., sales data, performance records, government databases, etc.), and (c) structured observations. Self-administered questionnaires and structured records are more prevalent in quantitative research. Indicate the process you will use to collect your data. State your rationale for selecting the process (e.g., in terms of strengths and weaknesses, cost, data availability, convenience, etc.).
Qualitative Studies
Describe the process for collecting the data (i.e., interviews, focus groups, direct or participant observations, and review of company/archival documents, performance indicators, sales reports, business plans, etc.). Provide an abridged interview protocol, focus group protocol, observation protocol, etc., and identify the location of the full protocols in an appendix.
2.12 - Data Organization Technique (Qualitative Only)
The Data Organization Technique heading can often be a short paragraph in which students address how they will organize all of the data they collect. There are typically two decisions in this section: (a) how one will securely store the data (electronic and hard copies) and (b) that the data will be destroyed after 5 years.
2.13 - Data Analysis (Quantitative Only)
Data analysis involves discussing the statistical test(s) you will use to answer each research question and justifying the selection of those tests. Indicate the nature of the scale for each variable (e.g., nominal, ordinal, interval, or ratio). Why is the selected statistical test more appropriate than another? (Hint: The statistical test is usually selected due to the nature of the question and the scale of measurement of the variables you defined.) Describe how you will deal with discrepant cases (missing data, data that cannot be interpreted, etc.). Identify the software that will be used to analyze the data. Be sure to discuss the data assumptions, how they will be assessed, and how you will address any violations (e.g., using bootstrapping).
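As an illustration of the data-screening step described above (handling discrepant cases before analysis), the following is a minimal sketch with hypothetical file and column names; the analysis itself can be run in SPSS or any comparable package.

```python
# A minimal data-screening sketch; the file name and column names are hypothetical.
import pandas as pd

raw = pd.read_csv("survey_responses.csv")            # hypothetical survey export
analysis_vars = ["job_satisfaction", "motivation", "turnover_intention"]
complete = raw.dropna(subset=analysis_vars)          # listwise deletion of missing data
print(f"{len(raw)} records received; {len(raw) - len(complete)} removed for missing data.")
print(complete[analysis_vars].describe())            # quick check of ranges and scales
```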
2.14 - Data Analysis (Qualitative Only)
The qualitative data analysis heading is critical for demonstrating doctoral-level competence and will help you prepare for Section 3. This heading must be deep yet can be covered in one or two succinct paragraphs. Reviewing the following requirements and explanations will help you develop and write your data analysis plan.
Rubric requirement (a): Identifies the appropriate data analysis process for the research design (i.e., one of the four types of triangulation for case studies; modified van Kaam, van Maanen, etc. for phenomenology).
Explanation: Different qualitative research designs require different data analysis processes. Critically read seminal works and other studies using your research design so that you can demonstrate that you are prepared to conduct the data analysis. For example, case study researchers will use methodological triangulation. Ethnographic researchers will likely use methodological triangulation; however, ethnographers may also use data triangulation.
Rubric requirement (b): Provides a logical and sequential process for the data analysis.
Explanation: Students must succinctly describe how they will perform the data analysis and must use all of the data in the analysis. Often students planning case studies or ethnographic studies discuss the data collection instruments and techniques above but then forget everything except the interview data in the data analysis section. Students should begin the data analysis heading by noting the data from the planned collection methods and how they will use the data analysis process (in either order). For a case study, one would start by discussing how one will use methodological triangulation for the information from the different data collection methods.
Rubric requirement (c): Details the student's conceptual plan or software (i.e., NVivo, Atlas.ti, Ethnograph, Excel, etc.) for coding, mind-mapping, and identifying themes.
Explanation: Or is the key word in this requirement. Explain either the classic data analysis method or the qualitative software analysis method (how you will do it). For the classic data analysis method, discuss sorting all of the concepts and ideas on separate sheets of paper into categorized piles; be sure to support your decision. Critically analyze the data using a large physical mind map (i.e., stacks, piles, or clusters of concepts and ideas on a wall or large room floor). For the qualitative software analysis method, code all of the concepts and ideas (all of the data, not just the interview questions); be sure to support your decision. Critically analyze the data in a graphical portrayal of categorized and coded concepts and ideas. To identify the themes, question the meaning of the recurring concepts and ideas. In effect, the compiling phase involves organizing the data in an order to create a database, while the disassembling phase involves dividing the compiled data into fragments and labels. The reassembling phase involves clustering and categorizing the labels into sequences and groups. The interpretation phase requires creating narratives from the sequences and groups, including conclusions.
Rubric requirement (d): Identifies how the student will focus on the key themes and correlate the key themes with the literature (including new studies published since writing the proposal) and the conceptual framework.
Explanation: This should be a one- or two-sentence plan on how you will correlate the key themes with recent studies and with the theory or conceptual models from your conceptual framework. This will help you prepare for the presentation of findings in Section 3.
Rubric requirement (e): Supports every decision with a minimum of three scholarly peer-reviewed or seminal sources.
Explanation: Critically reading seminal and authoritative work on data analysis in your selected research design is vital at this stage of your doctoral journey. You should have ample sources to support your decisions; there are suggested readings lists in the Bibliography-Suggested Readings Lists.
2.15 - Study Validity (Quantitative Only)
Internal Validity52
Internal validity is the approximate truth about inferences regarding cause-effect or causal relationships. Thus, internal validity is only relevant in studies in which researchers seek to examine causal relationships (i.e., experiments or quasi-experimental designs). Internal validity is not relevant in observational studies (i.e., correlation designs or descriptive studies). However, for studies in which researchers seek to assess the effects of programs or interventions, internal validity is perhaps the primary consideration. In those contexts, you would like to be able to conclude that your program or treatment made a difference, such as improving a business process or outcome.
Experiments/quasi-experiments. Experimental and quasi-experimental designs are susceptible to up to eight threats to internal validity, depending upon the specific design. These eight threats are (a) selection, (b) selection by maturation, (c) statistical regression, (d) mortality, (e) maturation, (f) history, (g) testing, and (h) instrumentation. You need to address each of these threats by briefly explaining what it is and, as relevant, the steps you will take in your study to address it. Again, some of the threats may not be applicable, depending upon your specific design. You can refer to a basic research design textbook to obtain a better understanding of these threats and how to combat them. Be sure to cite your sources. See the following link for further information: http://www.socialresearchmethods.net/kb/causeeff.php
If you are not conducting an experiment, then indicate that this is a nonexperimental design (i.e., correlation) and that threats to internal validity are not applicable. However, indicate that threats to statistical conclusion validity are of concern, and then address those threats.
Threats to statistical conclusion validity. Start by explaining what these threats are.
Threats to statistical conclusion validity are conditions that inflate the Type I error rate (rejecting the null hypothesis when it is in fact true) and the Type II error rate (failing to reject the null hypothesis when it is false). The three conditions that you need to cover here are (a) reliability of the instrument, (b) data assumptions, and (c) sample size.
52 See more on internal validity @ http://www.socialresearchmethods.net/kb/intval.php
Reliability of the instrument. You already reported the reliability properties of your instrument in the Instrumentation heading. However, you need to determine how reliable the instrument is for your specific sample. Here you will indicate that you will conduct an internal consistency reliability check of the instrument against your specific sample. The intent is to see how close the reported reliability coefficient (in Section 2.9, Instrumentation) is to your calculated reliability coefficient. State what an acceptable value is (e.g., > .70) and how you will check your instrument's reliability. There is a procedure (Analyze/Scale/Reliability Analysis) in SPSS that will allow you to compute Cronbach's alpha, one of several reliability coefficients.
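As a supplementary illustration (not the handbook's prescribed SPSS procedure), the following minimal sketch computes Cronbach's alpha for a set of scale items; the data file and item column names are hypothetical.

```python
# Cronbach's alpha sketch; the file name and item column names are hypothetical.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

responses = pd.read_csv("survey_responses.csv")                 # hypothetical export
alpha = cronbach_alpha(responses[["js1", "js2", "js3", "js4", "js5"]])
print(f"Cronbach's alpha = {alpha:.2f} (commonly judged acceptable above .70)")
```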
You will report the results of the reliability analysis in Section 3, in the Presentation of Findings heading. The degree of agreement or disagreement between the reported and calculated coefficients can provide information for your discussion, especially in the event of a nonsignificant result.
Data assumptions53 (varies by statistical test). You will state the assumptions that pertain to your tests and the effects that violations of those assumptions can have on your results. Indicate how you will check these assumptions. Refer to a basic statistics textbook for assumptions regarding various tests. For example, the Green and Salkind text used in the DDBA 8438 course is an excellent resource for identifying assumptions for most basic statistical tests. Pallant (2010)54 is an excellent text for instruction on performing parametric assumption testing. The following table contains the major assumptions and procedures for testing the assumptions for multiple linear regression and for ANOVA tests.
Table X
Statistical Test, Assumptions, and Procedures for Testing Assumptions
Statistical test | Assumptions | Testing
Multiple regression | Outliers | Scatterplot of the standardized residuals
 | Multicollinearity | Correlation coefficients among predictor variables
 | Normality | Normal probability plot (P-P) of the regression standardized residuals
 | Linearity | Normal probability plot (P-P) of the regression standardized residuals
 | Homoscedasticity | Normal probability plot (P-P) of the regression standardized residuals
 | Independence of residuals | Normal probability plot (P-P) of the regression standardized residuals
ANOVA | Normality | Histograms
 | Equality of variances | Levene's test of equality of variances
53 Data assumptions vary by statistical test.
54 Pallant, J. (2010). SPSS survival manual (4th ed.). Berkshire, England: McGraw-Hill.
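As a supplementary sketch of how the checks listed in the table above could be carried out outside SPSS, the following fragment is illustrative only; the data file, column names, and grouping variable are hypothetical.

```python
# Assumption-check sketch for multiple regression and ANOVA; names are hypothetical.
import pandas as pd
import statsmodels.api as sm
from scipy import stats

data = pd.read_csv("study_data.csv")                   # hypothetical screened data set
X = sm.add_constant(data[["age", "weight", "height"]])
model = sm.OLS(data["sleep_index"], X).fit()

# Multicollinearity: correlation coefficients among the predictors
print(data[["age", "weight", "height"]].corr().round(3))

# Outliers, homoscedasticity, independence: inspect standardized residuals
standardized = model.resid / model.resid.std()
print(standardized[abs(standardized) > 3])             # flag potential outliers

# Normality: P-P/Q-Q plot of residuals (graphical) or a formal supplementary test
print(stats.shapiro(model.resid))                      # Shapiro-Wilk statistic and p value

# ANOVA equality of variances: Levene's test across groups (hypothetical grouping column)
groups = [g["sleep_index"].values for _, g in data.groupby("group")]
print(stats.levene(*groups))
```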
Sample size. Include a brief explanation of the effects that using too small a sample size could have on your study's outcomes (refer to any basic statistics textbook). Then indicate that this threat has been addressed by conducting a power analysis to ensure you have a sufficient sample size. Be sure to cite your sources.
External Validity
External validity refers to the extent to which the study findings can be generalized to larger populations and applied to different settings. External validity is related to the sampling strategy (identified in Heading 2.6, Population and Sampling). Probability sampling strategies (random sampling) enhance external validity; conversely, nonprobabilistic sampling strategies hinder it. Discuss this relationship in this heading.
2.16 - Reliability and Validity (Qualitative Only)
A key difference from quantitative research is the reliability and validity headings. The analogous criteria for qualitative studies are dependability, credibility, transferability, and confirmability. These criteria are not measurable and need to be established using qualitative methods such as member checking (Marshall and Rossman, 2016, provide a good definition) and triangulation (data triangulation, investigator triangulation, theoretical triangulation, and methodological triangulation; see Norman Denzin's work on triangulation). Please review more detailed information on qualitative validity at: http://www.socialresearchmethods.net/kb/qualval.php
Reliability
Reliability refers to how one will address dependability. Some of the ways to enhance the dependability of the study are member checking of the data interpretation, transcript review, a pilot test, and expert validation of the interview questions, interview protocol, focus group protocol, or direct or participant observation protocol. Reaching data saturation will help assure the dependability of the findings. See the seminal literature on reliability.
Validity
Qualitative study validity refers to the credibility, transferability, and confirmability of the findings. Reaching data saturation will help assure the credibility, transferability, and confirmability of the findings. Please see seminal work on qualitative validity to ensure that you have a valid study.
Credibility. One can enhance credibility through member checking of the data interpretation, participant transcript review, triangulation, and use of an interview protocol, focus group protocol, or direct or participant observation protocol. Demonstrating qualitative credibility assures reviewers that one is addressing the findings from the perspective of the participants.
Confirmability. One can enhance confirmability by ensuring that the results can be confirmed or supported by others. Probing during interviews, follow-up member-checking interviews, questioning from different perspectives, triangulation, etc. are techniques one may use to enhance confirmability.
Transferability. Be sure to demonstrate how you will enable others to determine the transferability of the findings (i.e., meticulously adhering to the data collection and analysis techniques for the research design, using interview protocol, focus group protocol, direct or participant observation protocol, reaching data saturation, etc.). In contrast to quantitative studies where the researcher generalizes the findings, qualitative researchers do not generalize and do not state that the findings are transferable.
2.17 - Transition and Summary
End with a transition heading that contains a summary of key points and provides an overview introducing Section 3. Do not include any new information in the summary.
Section 3 –Application to Professional Practice and Implications for Change
3.1 - Introduction
Reacquaint the reader with the purpose of the study. For quantitative studies, simply restate the first two sentences of the Purpose Statement, followed by a brief summary of the study findings. For qualitative studies, simply restate the first sentence of the Purpose Statement and briefly summarize the findings.
Quantitative Example
The purpose of this quantitative correlation study was to examine the relationship between employee job satisfaction, employee motivation, and employee turnover intention. The independent variables were employee job satisfaction and employee motivation. The dependent variable was employee turnover intention. The null hypothesis was rejected and the alternative hypothesis was accepted. Employee job satisfaction and employee motivation significantly predicted employee turnover.
Qualitative Example
The purpose of this qualitative multiple case study was to explore the strategies that department store managers used to motivate their sales associates. The data came from manager interviews, manager-employee observations, and company documentation at five department stores in Texas. The findings showed methods that the managers used to motivate their sales employees to provide better customer service and increase sales.
3.2 - Presentation of Findings (Quantitative)
An example of an APA results write-up for a multiple regression analysis is provided. Assumptions vary by statistical test. Therefore, ensure you address the appropriate assumptions for your statistical test.
Quantitative Example
In this subheading, I will discuss testing of the assumptions, present descriptive statistics, present inferential statistic results, provide a theoretical conversation pertaining to the findings, and conclude with a concise summary. I employed bootstrapping, using 1,000 samples, to address the possible influence of assumption violations. Thus, bootstrapped 95% confidence intervals are presented where appropriate.
Tests of Assumptions
The assumptions of multicollinearity, outliers, normality, linearity, homoscedasticity, and independence of residuals were evaluated. Bootstrapping, using 1,000 samples, was employed to combat the possible influence of assumption violations.
Multicollinearity. Multicollinearity was evaluated by viewing the correlation coefficients among the predictor variables. All bivariate correlations were small to medium (Table X); therefore, a violation of the assumption of multicollinearity was not evident. The following table contains the correlation coefficients.
Table X
Correlation Coefficients Among Study Predictor Variables
Variable | Age | Weight | Height
Age | 1.00 | .151 | -.010
Weight | .151 | 1.00 | .562
Height | -.010 | .562 | 1.00
Note. N = 204.
Outliers, normality, linearity, homoscedasticity, and independence of residuals55.
Outliers, normality, linearity, homoscedasticity, and independence of residuals were evaluated by examining the Normal Probability Plot (P-P) of the Regression Standardized Residual (Figure 1) and the scatterplot of the standardized residuals (Figure 2). The examinations indicated there were no major violations of these assumptions. The tendency of the points to lie in a reasonably straight line (Figure 1), diagonal from the bottom left to the top right, provides supportive evidence the assumption of normality has not been grossly violated (Pallant, 2010). The lack of a clear or systematic pattern in the scatterplot of the standardized residuals (Figure 2) supports the tenability of the assumptions being met. However, 1,000 bootstrapping samples were computed to combat any possible influence of assumption violations and 95% confidence intervals based upon the bootstrap samples are reported where appropriate.
55 These are the same assumptions discussed in Section 2; the results of the assumption testing are now discussed. These assumptions differ by statistical test and the appropriate assumptions are to be discussed. Note, your specific discussion might differ. For example, there may be severe data assumption violations in the data you collected. Therefore, you would discuss appropriately.
Figure 1. Normal probability plot (P-P) of the regression standardized residuals.
Figure 2. Scatterplot of the standardized residuals.
Descriptive Statistics
In total, I received 207 surveys. Three records were eliminated due to missing data, resulting in 204 records for the analysis. Table X contains descriptive statistics of the study variables.
Table X
Means and Standard Deviations for Quantitative Study Variables
Variable | M | SD | Bootstrapped 95% CI (M)56
Sleep Index | 26.36 | 10.56 | [24.80, 27.94]
Age | 43.60 | 12.51 | [41.90, 45.28]
Weight | 72.34 | 15.21 | [70.23, 74.51]
Height | 169.12 | 10.00 | [167.68, 170.44]
Note. N = 204.
Inferential Results
Standard multiple linear regression,57 α = .05 (two-tailed), was used to examine the efficacy of age, weight, and height in predicting sleep index. The independent variables were age, weight, and height58. The dependent variable was sleep index59. The null hypothesis was that age, weight, and height would not significantly predict sleep index. The alternative hypothesis was that age, weight, and height would significantly predict sleep index. Preliminary analyses were conducted to assess whether the assumptions of multicollinearity, outliers, normality, linearity, homoscedasticity, and independence of residuals60 were met; no serious violations were noted (see Tests of Assumptions). The model as a whole was able to significantly predict sleep index, F(3, 200) = 4.778, p < .003, R2 = .06761. The R2 value of .067 indicated that approximately 7% of the variation in sleep index is accounted for by the linear combination of the predictor variables (age, weight, and height). In the final model, age and height were statistically
56 The 95% Bootstrap confidence intervals are produced when the bootstrapping procedure is selected in the SPSS regression process. See regression video tutorial located at: https://www.youtube.com/watch?v=1ItFMKlPG5k
57 Identify the test and the purpose of the test.
58 Restate the independent variables as presented in the purpose statement and research question; there is to be no deviation.
59 Restate the dependent variables as presented in the purpose statement and research question; there is to be no deviation.
60 Identify the assumptions and state how they were assessed.
61 State whether the model as a whole was able to predict (or not) the dependent variable. Report the appropriate statistics.
significant, with age (t = -3.892, p < .01) accounting for a higher contribution to the model than height (t = -2.595, p < .05). Weight did not explain any significant variation in sleep index. The final predictive equation was:
Sleep Index = 70.205 - .148(Age) + .109(Weight) - 2.303(Height).
Age. The negative slope for age (-.148) as a predictor of sleep index indicated there was about a .148 decrease in sleep index for each one-point increase in age. In other words, sleep index tends to decrease as age increases. The squared semi-partial coefficient (sr2) 62 that estimated how much variance in sleep index was uniquely predictable from age was .03, indicating that 3% of the variance in sleep index is uniquely accounted for by age, when weight and height are controlled.
Height. The negative slope for height (-2.303) as a predictor of sleep index indicated there was a 2.303 decrease in sleep index for each additional one-unit increase in height, controlling for age and weight. In other words, sleep index tends to decrease as height increases. The squared semi-partial coefficient (sr2) that estimated how much variance in sleep index was uniquely predictable from height was .04, indicating that 4% of the variance in sleep is uniquely accounted for by height, when age and weight are controlled. The following Table depicts the regression summary table.
Table X
62 Derived from the SPSS output.
Regression Analysis Summary for Predictor Variables
Variable | B63 | SE B | β64 | t65 | p66 | B 95%67 Bootstrap CI
Age | -.148 | 0.054 | -.393 | -3.892 | < .01 | [-.262, -.025]
Weight | .109 | 3.770 | -.038 | 0.371 | .712 | [-.008, .245]
Height | -2.303 | .888 | -.268 | -2.595 | .011 | [-.442, -.081]
Note. N = 204.
Analysis summary. The purpose of this study was to examine the efficacy of age, weight, and height in predicting sleep index. I used standard multiple linear regression to examine the ability of age, weight, and height to predict the value of sleep index. Assumptions surrounding multiple regression were assessed with no serious violations noted. The model as a whole was able to significantly predict sleep index, F(3, 200) = 4.778, p < .003, R2 = .067. Both age and height provide useful predictive information about sleep index. The conclusion from this analysis is that age and height are significantly associated with sleep index, even when weight is controlled (e.g. held constant).
Theoretical conversation on findings. 68Describe the ways in which the findings confirm, disconfirm, or extend knowledge of the theoretical framework and the relationship(s) among variables by comparing the findings with other peer-reviewed studies69 from the literature review, including studies addressed during the proposal stage and new studies published since writing the proposal. 70Tie the findings to, or dispute them against, the existing literature on effective business
63 Β values are to be used in the regression equation. These are the unstandardized coefficients in the SPSS output.
64 The beta weights identify which variables contribute more to the model. These are the standardized coefficients in the SPSS output.
65 The test statistic for the hypothesis test for the slope (Β); derived from the SPSS output; used to evaluate the significance of the Β weights, where p ≤ .05 is significant.
66 The probability value for the hypothesis test of the slope (B); derived from the SPSS output; a value of p ≤ .05 indicates a statistically significant predictor.
67 The 95% Bootstrap confidence intervals are produced when the bootstrapping procedure is selected in the SPSS regression process. See regression video tutorial located at: https://www.youtube.com/watch?v=1ItFMKlPG5k
68 Rubric item 3.2g
69 This rubric requirement substantiates the requirement to critically analyze, synthesize and “report” the results of the literature (studies) pertaining to the theory and variables (see rubric component 1.14, Review of the Professional and Academic Literature).
70 Rubric item 3.2h
practice. Analyze and interpret the findings in the context of the theoretical framework, as appropriate. 71Ensure that interpretations do not exceed the data, findings, and scope.
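The example write-up above assumes SPSS output. As a supplementary sketch only, the following fragment shows how a comparable regression and bootstrapped 95% confidence intervals (1,000 resamples) could be produced in Python; the data file and column names are hypothetical.

```python
# Regression and bootstrapped CIs sketch; the file name and column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

data = pd.read_csv("sleep_study.csv")                 # hypothetical: sleep_index, age, weight, height
y = data["sleep_index"]
X = sm.add_constant(data[["age", "weight", "height"]])

model = sm.OLS(y, X).fit()
print(model.summary())                                # F, p, R-squared, B, SE B, t, p per predictor

# Percentile bootstrap of the unstandardized coefficients (1,000 resamples)
rng = np.random.default_rng(seed=1)
boot_coefs = []
for _ in range(1000):
    idx = rng.choice(len(data), size=len(data), replace=True)
    boot_coefs.append(sm.OLS(y.iloc[idx], X.iloc[idx]).fit().params)
print(pd.DataFrame(boot_coefs).quantile([0.025, 0.975]).T)   # bootstrapped 95% CI per coefficient
```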
3.3 - Presentation of Findings (Qualitative)
There is a common misconception about Section 3. Reporting the results of the study findings is more complicated than it first appears because the findings must be related back to the body of knowledge as well as the conceptual framework. It is not a matter of telling the reader who said what and when; one must present an in-depth scholarly discussion of how the study findings contribute to the field.
Do not be misled: reporting the findings is not about listing the answers to the interview questions. The answers to the interview questions are your evidence, not the answer to the research question. Moreover, one should never list the interview questions in the presentation of findings.
Remember that the rubric asks about the research question, not the interview questions.
The research question is the overarching question that your study answers.
Also, remember that you are presenting your findings as themes —major, minor, unexpected, and/or serendipitous that are a result of your data—answers to interview questions, document review, journaling, observation notes, focus group answers, etc. Also, remember that it is a good practice when using a qualitative data analysis software program to include at least one table per theme from NVivo, Atlasti, Ethnograph, or others that illustrates the frequencies.
Finally, when appropriate, remember to integrate member checking.
To sum up: Present the theme, present the evidence from the findings that support the theme (including tables), then support both from the body of knowledge/conceptual framework.
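As a supplementary sketch of the per-theme frequency tables mentioned above, the following fragment assumes a hypothetical CSV export of coded segments from NVivo, Atlas.ti, or similar software; the file and column names are illustrative.

```python
# Per-theme frequency sketch; the export file and column names are hypothetical.
import pandas as pd

segments = pd.read_csv("coded_segments.csv")          # one row per coded segment (assumed layout)
freq = (segments.groupby("theme")
        .agg(references=("segment_id", "count"),      # how often each theme was coded
             sources=("participant", "nunique"))      # how many participants contributed
        .sort_values("references", ascending=False))
print(freq)                                           # basis for one APA table per major theme
```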
3.4 - Application to Professional Practice
Discuss how business leaders can apply the findings to aid in solving the specific business problem. Do not repeat the literature review; rather, focus on application. Researchers can often use this heading to help gain access by offering potentially participating company leaders a summary of the findings, including suggestions for professional practice.
3.5 - Implications for Social Change
Now that you have analyzed and discussed the findings, suggest potential implications in terms of tangible improvements for individuals, communities, organizations, institutions, cultures, or societies as the findings could catalyze beneficial social change/behaviors.
71 Rubric item 3.2i
3.6 - Recommendations for Action
This is where you can create a win-win for companies and individuals participating in your study. The rubric requires the following: (a) that you ensure the recommendations flow logically from the conclusions and contain steps to useful action, (b) that you state who needs to pay attention to the results (this can help you with a win-win to discuss when gaining access for the study), and (c) that you indicate how the results might be disseminated via literature, conferences, training, etc.
3.7 - Recommendations for Further Research
Discuss areas for future research. A starting point is to identify how the limitations (weaknesses) identified in Heading 1.12, Assumptions, Limitations, and Delimitations, could be addressed in future studies. Follow up this conversation by identifying other research possibilities illuminated while conducting the study. Do not repeat literature; rather, provide future researchers (e.g., other DBA students) with a potential research agenda for furthering the scholarly conversation pertaining to the business problem.
This is a good section to discuss serendipitous results, unanswered new questions that arose, and any finding that does not seem to align with a theory or conceptual model and thus warrants a recommendation for further research. Often this section can lead to postdoctoral research.
3.8 - Reflections
Per the rubric, this short heading includes a reflection on the researcher's experience within the DBA Doctoral Study process in which the researcher discusses possible personal biases or preconceived ideas and values, the possible effects of the researcher on the participants or the situation, and her/his changes in thinking after completing the study.
3.9 - Conclusion
Per the rubric, students should close with a strong concluding statement making the take-home message clear to the reader. This should be a conclusion and not a summary.
3.10 - Appendices/Table of Contents
Ensure all appendices appear in the order they are referenced in the proposal/doctoral study.
APPENDIX A: WALDEN UNIVERSITY DOCTOR OF BUSINESS ADMINISTRATION PROGRAM VIDEO TITLES AND URL ADDRESSES
 | Title | URL Address
1 | Walden DBA Rubric and Handbook Video Tutorial |
2 | Walden DBA Problem Statement Tutorial |
3 | Walden DBA Purpose Statement Tutorial |
4 | Walden DBA Theoretical/Conceptual Framework |
5 | Scales of Measurement |
6 | DDBA Week One Application |
7 | DDBA 8438 Week Two Application Video – Part 1 |
8 | Week Two Application Video – Part 2 |
9 | Part 1: Independent Samples T-Test |
10 | Part 2: Independent Samples T-Test |
11 | Part 1: Week Five One-way ANOVA |
12 | Part 2: Week Five One-way ANOVA |
13 | Walden University Doctor of Business Administration Multiple Linear Regression – Part 1 |
14 | Walden University Doctor of Business Administration Multiple Linear Regression – Part 2 |
Note: Titles in green are used in DDBA 8438 but can be applicable in the research process.
APPENDIX B: QUANTITATIVE RESEARCH PRIMER: PROBLEM STATEMENT, PURPOSE STATEMENT, RESEARCH QUESTION(S), AND HYPOTHESES
Doctor of Business Administration
Quantitative Research Primer: Problem Statement, Purpose Statement, Research Question, and Hypotheses
Prepared by the DBA Methodology Team: June 2014
DBA doctoral studies require the highest level of rigor and scholarship. One focus of rigor and scholarship is the number of predictor or independent variables72 examined in quantitative doctoral studies. Nonexperimental research (i.e., correlation73, quasi-experimental74, etc.) requires the use of at least two independent or predictor variables. This is due to the third variable problem. A third variable, also known as a confounding variable, can confound the relationship between the independent and dependent variable. This confounding effect can lead the researcher to incorrectly interpret the results, leading to an incorrect rejection of the null hypothesis (Type I error).
As such, all DBA quantitative studies require the examination of at least two predictor (correlation studies) or independent (i.e., quasi-experimental, causal comparative, etc., studies) variables. This affects the statistical analysis, as simple bivariate correlations (correlation designs) or one-way ANOVAs cannot be used as the inferential statistical tests. Other statistical procedures, such as multiple regression analyses, must be used for correlation studies. Quasi-experimental/causal comparative designs must, at a minimum, employ statistical analyses capable of examining more than one independent variable (e.g., factorial ANOVAs). Please be sure to discuss this with your chair!
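For readers who want to see what an analysis capable of handling two independent variables looks like in practice, the following is a minimal sketch of a factorial ANOVA; the data file and column names are hypothetical, and the handbook does not prescribe any particular software.

```python
# Factorial (two-way) ANOVA sketch; the file name and column names are hypothetical.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

data = pd.read_csv("turnover_study.csv")   # hypothetical: satisfaction_level, motivation_level, turnover_intention

# Both independent variables and their interaction enter the model
model = ols("turnover_intention ~ C(satisfaction_level) * C(motivation_level)", data=data).fit()
print(sm.stats.anova_lm(model, typ=2))     # Type II sums-of-squares ANOVA table
```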
Below are hypothetical examples of correlation and quasi-experimental research scenarios, which include the Problem Statement, Purpose Statement, Research Question, and Hypotheses. These examples depict two predictor (correlation studies)/independent (quasi-experimental) variables, which are (a) employee job satisfaction and (b) employee motivation. The dependent variable is employee turnover intentions. It may be helpful to use this model as a script and fill in the specifics as they apply to your study. The red underlined text is what you will need to change for your specific study. Footnotes (in red) are included to identify the required rubric elements.
Again, map to the rubric in this component and all components of your doctoral study. The rubric criteria are the basis for judging the quality of your study. Notice how each of the six rubric elements is included in the purpose statement and there is no superfluous information.
Please review the Problem Statement video tutorial at: http://youtu.be/IYWzCYyrgpo to aid you in preparing the Problem Statement.
Please review the Purpose statement video tutorial at: http://youtu.be/pLP4r0mfT9A to aid you in preparing the Purpose Statement.
72 Click the hyperlink to be taken to additional information.
73 Click the hyperlink to be taken to additional information.
74 Click the hyperlink to be taken to additional information.
Hypothetical Example (Correlation Design) Problem Statement
Organizations place great emphasis on retention because of the strategic value of intellectual capital and the costs of replacing valued employees (cite)75. Research in this domain is potentially valuable because turnover costs U.S. businesses billions of dollars per year (cite), and practices that promote retention can save even small companies millions of dollars annually (cite)76. The general business problem is that turnover intention has been shown to be among the best predictors of turnover (cite)77. The specific business problem is that some microelectronic business owners do not understand the relationship between job satisfaction, motivation, and employee turnover intentions78.
Purpose Statement
The purpose of this quantitative79 correlation80 study is to examine the relationship between employee job satisfaction, employee motivation, and employee turnover intentions. The independent variables are employee job satisfaction and employee motivation81. The dependent variable is employee turnover intention82. The targeted population will consist of mid-level employees of microelectronic companies83 located in the southeast United States. The implications for positive social change include the potential to better understand the correlates of employee turnover, thus increasing propensity for sustainability of the microelectronic industry 84.
Research Question
What is the relationship between employee job satisfaction, employee motivation, and employee turnover intentions?
Hypotheses
Null Hypothesis (H0): There is no statistically significant relationship between employee job satisfaction, employee motivation, and employee turnover intentions.
75 Hook
76 Anchor
77 General business problem
78 Specific business problem
79 Method
80 Design
81 Independent variables
82 Dependent variable
83 Targeted population
84 Social change statement
Alternative Hypothesis (H1): There is a statistically significant relationship between employee job satisfaction, employee motivation, and employee turnover intentions.
Hypothetical Example (Causal-Comparative Design)
Problem Statement
Organizations place great emphasis on retention because of the strategic value of intellectual capital and the costs of replacing valued employees (cite). Research in this domain is potentially valuable because turnover costs U.S. businesses billions of dollars per year (cite), and practices that promote retention can save even small companies millions of dollars annually (cite). The general business problem is that turnover intention has been shown to have a significant impact on employee turnover (cite). The specific business problem is that some microelectronic business owners do not understand the impact of job satisfaction and motivation on employee turnover intentions.
Purpose Statement
The purpose of this quantitative85 causal-comparative86 study is to examine the impact of employee job satisfaction and employee motivation on employee turnover intentions. The independent variables are employee job satisfaction and employee motivation87. The dependent variable is employee turnover intention88. The targeted population will consist of midlevel employees of microelectronic companies89 located in the southeast United States. The implications for positive social change include the potential to provide a better understanding of the correlates of employee turnover, thus increasing the propensity for sustainability of the microelectronic industry90.
Research Question
What is the impact of employee job satisfaction and employee motivation on employee turnover intentions?
Hypotheses
Null Hypothesis (H0): Employee job satisfaction and employee motivation have no significant impact on employee turnover intentions.
Alternative Hypothesis (H1): Employee job satisfaction and employee motivation have a statistically significant impact on employee turnover intentions.
85 Method
86 Design
87 Independent variables
88 Dependent variable
89 Targeted population
90 Social change statement
Research Tips
· Correlation designs use the term relationship
· Causal comparative designs use the terms impact or influence
· Variables are presented in temporal order; that is, the independent variables are presented first, followed by the dependent variable
· The word and (see bold text in Purpose Statement) separates the predictor variables from the dependent variable in correlation designs
· The word on (see bold text in Purpose Statement) separates the independent variables from the dependent variable in experimental/quasi-experimental designs
· The null and alternative hypotheses are almost mirror images of the research question
· The null hypothesis is the hypothesis of no difference; suggesting there will not be a significant result
· The alternative hypothesis is the hypothesis of difference; suggesting there will be a significant result
APPENDIX C: MAJOR QUANTITATIVE DESIGNS
Research design91 is the blueprint that enables the investigator to develop solutions to research problems and guides the researcher in the various stages of the research (Frankfort- Nachmias & Nachmias, 2008). The research design aids the researcher in structuring, analyzing, and interpreting the data (Frankfort-Nachmias & Nachmias, 2008). DeForge (2010) described research design as a plan for guiding researchers in addressing research problems and answering research questions.
Quantitative Methodology and Associated Designs
Design and characteristics:
Experimental
· Assess causal (cause-and-effect) relationships between an independent and dependent variable
· Defining feature: random assignment to group condition
· Manipulation of the independent variable
· Strongest in terms of internal validity; greatest confidence in causal inferences
· Requires power analysis to determine appropriate sample size
· Analyses can include, but are not limited to, ANOVA, ANCOVA, MANOVA, etc.
Quasi-experimental
· Assess causal relationships between an independent and dependent variable
· Defining feature: lack of random assignment to group condition
· Manipulation of the independent variable
· Weakened ability to make causal inferences
· Requires power analysis to determine appropriate sample size
Correlation
· Assess relationships between independent and dependent variables
· Defining feature: does not imply causality
· Requires power analysis to determine appropriate sample size
· Analyses can include, but are not limited to, (a) multiple regression, (b) logistic regression, and (c) discriminant analysis
Note. Correlation designs are the most commonly seen in DBA studies.
91 Review the Research Methods Knowledge Base at: http://www.socialresearchmethods.net/kb/design.php for more information pertaining to research design.
APPENDIX D: SAMPLING TYPOLOGIES92
Nonprobabilistic Sampling (Nonrandom)
Availability (Convenience): A nonprobabilistic sampling procedure in which units are selected from the target population based on their availability or the convenience of the researcher.
Purposive: A nonprobabilistic sampling procedure in which units are selected from the target population based on their fit with the purpose of the study and specific inclusion and exclusion criteria.
Quota: A nonprobabilistic sampling procedure in which the population is divided into mutually exclusive subcategories. Interviewers or other data collectors solicit participation in the study from members of the subcategories until a target number of elements to be sampled from each subcategory has been met.
Snowball: A nonprobabilistic sampling procedure in which elements are selected from the target population with the assistance of previously selected participants.
Probabilistic Sampling (Random)
Simple Random Sampling: A probability sampling procedure that gives every unit in the target population, and each possible sample of a given size, an equal chance of being selected.
Stratified Sampling: A probability sampling procedure in which the target population is first separated into mutually exclusive, homogeneous segments (strata), and then a simple random sample is selected from each segment (stratum).
Systematic Sampling: A probability sampling procedure in which a random selection is made of the first unit for the sample, and then subsequent units are selected using a fixed or systematic interval until the desired sample size is reached.
Cluster Sampling: A probability sampling procedure in which units of the target population are randomly selected in naturally occurring groups (clusters).
92 Adapted from Daniel, J. (2012). Sampling essentials: Practical guidelines for making sampling choices. Los Angeles, CA: SAGE.
APPENDIX E: SAMPLE POWER ANALYSIS
G*Power is a statistical software package quantitative researchers use to conduct an a priori sample size analysis (Faul, Erdfelder, Buchner, & Lang, 2009)93. A power analysis, using G*Power version 3.1.9 software, was conducted to determine the appropriate sample size for the study. An a priori power analysis, assuming a medium effect size (f2 = .15), α = .05, and 2 predictor variables, identified that a minimum sample size of 68 participants is required to achieve a power of .80. Increasing the sample size to 146 will increase power to .99. Therefore, the researcher will seek between 68 and 146 participants for the study (Figure 1).
Figure 1. Power as a function of sample size.
The use of a medium effect size (f2 = .15) is appropriate for this proposed study. The medium effect size was based on the analysis of X articles where (identify your variable) was the outcome measurement.
93 Faul, F., Erdfelder, E., Buchner, A., & Lang, A.-G. (2009). Statistical power analyses using G*Power 3.1: Tests for correlation and regression analyses. Behavior Research Methods, 41, 1149-1160. doi:10.3758/brm.41.4.1149
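For students without access to G*Power, the following is a minimal sketch of the same a priori calculation using SciPy's noncentral F distribution; the helper function and search loop are illustrative only, and G*Power remains the tool referenced above.

```python
# A priori power sketch for the omnibus regression F test; illustrative only.
from scipy import stats

def regression_power(n, predictors=2, f2=0.15, alpha=0.05):
    """Power of the F test that R^2 deviates from zero, with N = n participants."""
    df1, df2 = predictors, n - predictors - 1
    noncentrality = f2 * n                        # lambda = f^2 * N (G*Power convention)
    f_crit = stats.f.ppf(1 - alpha, df1, df2)     # critical F under the null hypothesis
    return 1 - stats.ncf.cdf(f_crit, df1, df2, noncentrality)

n = 10
while regression_power(n) < 0.80:                 # smallest N reaching 80% power
    n += 1
print(n, round(regression_power(n), 3))           # should land near the 68 reported by G*Power
```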
APPENDIX F: SAMPLE QUANTITATIVE LITERATURE REVIEW OUTLINE
Introduction
Provide an introduction containing a discussion of the content of the literature review (including the percentages of total references that are peer reviewed, and the percentage of total references that are published within 5 years of the expected year of CAO approval). Also discuss the organization of the review, and the strategy for searching the literature. The review of the literature will follow in appropriately formatted APA headings. Do not present the literature review in annotated bibliography format (i.e., presenting one study after another.) Rather, provide a critical analysis and synthesis of the literature.
Transformational Leadership Theory94
Introduce the theory. You can present the information provided in Heading 1-4, Theoretical/Conceptual Framework. However, this heading should be expanded, providing the reader with more depth pertaining to the theory. Descriptive information should be included here. The critical analysis and synthesis of the literature follows below.
Main point one.95 Conducting a good literature review involves identifying and separating the literature by similar ideas, themes, topics, etc. The similar ideas can be presented using appropriate APA L2 headings; use subordinate headings as appropriate. You are not to simply regurgitate the material you have read. The literature presented in each main topic heading must be a critical analysis and synthesis of the empirical observations (research studies) you have reviewed. Critical analysis and synthesis of the literature grounded in your theoretical framework will enable you to meet the requirements in the Presentation of Findings heading.
See the Doctoral Study Rubric for more information.
Main point two. The same information presented in main point one applies for main point two.
Main point three. The same information presented in main point one applies for main point three.
Rival Theories/Opponents of the Theoretical/Conceptual Framework
There are always rival theories, that is, rival/alternate lenses for examining a phenomenon. A good literature review comprises an inquiry into the major rival theories. Provide a very brief overview of two to three rival theories and then shift the discussion to one major rival theory. Questions you may consider addressing in this component are:
· What are the strengths and limitations of this theory?
94 APA Level 2 heading.
95 APA Level 3 heading.
· Why did you not choose to examine your problem through this theoretical lens?
· What do opponents (other authorities) in the field identify as the limitations or weakness of this rival theory?
Measurement
A good literature review must address the measurement instruments pertaining to the variables or constructs underlying the theoretical framework. Oftentimes, there is more than one measurement instrument available to measure the same variables or constructs. A review of the measurement instruments will facilitate your identifying appropriate instruments for your theoretical variables/constructs. Addressing the validity and reliability properties of the various instruments is a vital component of this heading. In addition, discussing the various populations for which the instruments were used is vital to addressing the requirements for this component.
For example, a study grounded in transformational leadership theory will undoubtedly uncover a plethora of literature in which previous researchers employed the Multifactor Leadership Questionnaire (MLQ) to measure the transformational leadership constructs. In many cases, you will identify more than one instrument purporting to measure the same variables or constructs. A critical analysis and synthesis will enable you to select the most appropriate instrument to measure the constructs underlying your study. Address the strengths and weaknesses of each instrument. The results of your critical analysis and synthesis will justify the selection of the instrument you propose to use for your study. Remember, many decisions you make in your study (i.e., selecting instruments) are grounded in the extant literature; these decisions are not to be made arbitrarily.
Independent Variable A (variable not underlying the theory)
The study may contain additional variables96 outside the umbrella of the theoretical framework. Therefore, discussions of these variables are warranted. An informed decision must be made to include variables in a study. As such, variables or constructs examined in a quantitative study are derived from extant literature; they are not arbitrarily selected for inclusion in a study. For example, assume job satisfaction is an independent or predictor variable in your study. If so, this variable must be substantiated from the literature. Therefore, you are to conduct a critical analysis and synthesis pertaining to the literature. This critical analysis and synthesis must support evidence of a relationship between each potential independent variable and the dependent variable in your study, or a variable closely related to the dependent variable in your study. In addition, there might be inconclusive evidence and you are to provide the support for including the independent or predictor variable in your study. Include APA sub headings for each independent and dependent variable.
96 It is important to understand you are not addressing variables underlying the theoretical framework. Here you are addressing any “additional” variables included in the study that are not aligned with the theoretical framework. In essence, there will be justification for every variable measured in the study.
Independent Variable B (variable not underlying the theory)
The same information in Independent Variable A applies for each independent or predictor variable in the study.
Independent Variable C (variable not underlying the theory)
The same information in Independent Variable A applies for each independent or predictor variable in the study.
Dependent Variable
The dependent variable must also be addressed in the literature review. This is normally the problematic variable in the study. Remember you are viewing this problematic variable through the identified theoretical lens. Again, this component is to include a critical analysis and synthesis of the empirical literature pertaining to the dependent variable.
Methodologies
Address the various methodologies (quantitative, qualitative, and mixed methods) through which previous researchers have examined the dependent variable. A literature review must not solely address the methodology that matches the intended study's design.
Remember, the literature review is to be an exhaustive review of the literature pertaining to a topic.
Summary
End with a transition heading that contains a summary of key points and provides an overview introducing Section 2 and Section 3. Do not include any new information in the summary.
APPENDIX G: SAMPLE APA TABLES
Properly formatted APA tables are a critical means of presenting descriptive and inferential statistics results. This appendix provides templates that serve as models for what is required for various types of statistical analyses. The examples are based on guidelines contained in the sixth edition of the Publication Manual of the American Psychological Association97. You can copy and paste these tables into the appropriate section of your proposal/doctoral study.98
97 American Psychological Association. (2010). Publication manual of the American Psychological Association. (6th ed.). Washington, DC: Author.
98 Tables will need to be adjusted for your particular analyses. For example, you may need to add/delete additional rows/columns as appropriate.
Basic One Group Descriptive Statistics Table for Quantitative Variables (Example Depicting 3 Variables)
Table X
The Table Title Goes Here and Is Italicized (N = XX)
Variable | n | M | M 95% Bootstrap CI | SD | SD 95% Bootstrap CI
Variable 1 | 23 | 2.4 | [1.85, 2.99] | .24 | [.11, .64]
Variable 2 | 34 | 2.8 | [1.56, 3.94] | .34 | [.22, .53]
Variable 3 | 34 | 2.9 | [2.05, 3.35] | .28 | [.25, .44]
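For illustration only (not a rubric requirement), the sketch below shows one way the M, SD, and 95% bootstrap confidence intervals reported in the table above might be computed, assuming Python with NumPy and SciPy; the variable name and the simulated scores are hypothetical.

```python
# Minimal sketch: M, SD, and 95% bootstrap CIs for one quantitative variable.
import numpy as np
from scipy import stats

def sample_sd(x, axis=0):
    # Sample standard deviation (ddof = 1), vectorized over the bootstrap axis
    return np.std(x, ddof=1, axis=axis)

rng = np.random.default_rng(1)
variable_1 = rng.normal(loc=2.4, scale=0.25, size=23)  # hypothetical scores

mean_ci = stats.bootstrap((variable_1,), np.mean, random_state=rng).confidence_interval
sd_ci = stats.bootstrap((variable_1,), sample_sd, random_state=rng).confidence_interval

print(f"n = {variable_1.size}, M = {variable_1.mean():.2f} "
      f"[{mean_ci.low:.2f}, {mean_ci.high:.2f}], "
      f"SD = {variable_1.std(ddof=1):.2f} [{sd_ci.low:.2f}, {sd_ci.high:.2f}]")
```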
Basic Descriptive Statistics Table for Qualitative Variables (Example Depicting 3 Variables)
Table X
The Table Title Goes Here and Is Italicized (N = XX)
Variable | n | %
Variable 1 | 32 | 32
Variable 2 | 34 | 34
Variable 3 | 34 | 34
Total | 100 | 100
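For illustration only, the sketch below shows one way to generate the n and % columns for a categorical (qualitative) variable, assuming Python with pandas; the category labels are hypothetical.

```python
# Minimal sketch: counts and percentages for a categorical (qualitative) variable.
import pandas as pd

responses = pd.Series(["Category A"] * 32 + ["Category B"] * 34 + ["Category C"] * 34)
table = pd.DataFrame({
    "n": responses.value_counts(),
    "%": (responses.value_counts(normalize=True) * 100).round(1),
})
table.loc["Total"] = [table["n"].sum(), table["%"].sum()]
print(table)
```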
Simultaneous Regression Table (2 Variables)
Table X
The Table Title Goes Here and Is Italicized (N = XX)
Variable | B | SE B | β | t | p | B 95% Bootstrap CI
Variable 1 | 0.00 | 0.00 | .00 | .00 | .00 | [00.00, 00.00]
Variable 2 | 0.00 | 0.00 | .00 | .00 | .00 | [00.00, 00.00]
Note. Type any notes here.
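For illustration only, the sketch below shows how a simultaneous (single-block) multiple regression might be estimated to obtain the B, SE B, β, t, and p values in the table above, assuming Python with statsmodels; the predictor and outcome names and the simulated data are hypothetical.

```python
# Minimal sketch: simultaneous multiple regression (all predictors entered at once).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
df = pd.DataFrame({
    "predictor_1": rng.normal(size=100),
    "predictor_2": rng.normal(size=100),
})
df["outcome"] = 0.4 * df["predictor_1"] + 0.2 * df["predictor_2"] + rng.normal(size=100)

# Unstandardized solution: B, SE B, t, and p appear in the summary output
X = sm.add_constant(df[["predictor_1", "predictor_2"]])
model = sm.OLS(df["outcome"], X).fit()
print(model.summary())

# Standardized coefficients (beta): refit the model on z-scored variables
z = (df - df.mean()) / df.std(ddof=1)
beta_model = sm.OLS(z["outcome"], sm.add_constant(z[["predictor_1", "predictor_2"]])).fit()
print(beta_model.params)  # the slopes here are the standardized betas
```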
Hierarchical Regression Table (2 Steps)
Table X
The Table Title Goes Here and Is Italicized (N = XX)
Variable | B | SE B | β | R2 | ∆R2
Step 1 | | | | |
Variable 1 | 0.00 | 0.00 | .00 | .00 | .00
Variable 2 | 0.00 | 0.00 | .00 | .00 | .00
Step 2 | 0.00 | 0.00 | .00 | .00 | .00
Variable 1 | 0.00 | 0.00 | .00 | .00 | .00
Variable 2 | 0.00 | 0.00 | .00 | .00 | .00
Variable 3 | 0.00 | 0.00 | .00 | .00 | .00
Note. Type any notes here.
The table above reflects a “Play It Safe99” hierarchical regression table with two variables in Step 1 and three variables in Step 2. You will need to make modifications according to your specific model.
99 The “Play It Safe” table is comprehensive and thus would be appropriate if the writer wanted to be as thorough as possible and was not concerned with brevity.
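For illustration only, the sketch below shows how the R2 and ∆R2 values in a two-step hierarchical regression might be obtained by fitting the model once per step, assuming Python with statsmodels; the variable names and the simulated data are hypothetical.

```python
# Minimal sketch: two-step hierarchical regression; delta R^2 = Step 2 R^2 - Step 1 R^2.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
df = pd.DataFrame(rng.normal(size=(120, 3)), columns=["var1", "var2", "var3"])
df["outcome"] = 0.3 * df["var1"] + 0.2 * df["var2"] + 0.4 * df["var3"] + rng.normal(size=120)

step1 = sm.OLS(df["outcome"], sm.add_constant(df[["var1", "var2"]])).fit()
step2 = sm.OLS(df["outcome"], sm.add_constant(df[["var1", "var2", "var3"]])).fit()

print(f"Step 1 R^2 = {step1.rsquared:.3f}")
print(f"Step 2 R^2 = {step2.rsquared:.3f}, delta R^2 = {step2.rsquared - step1.rsquared:.3f}")
```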
Two-Way ANOVA Table
Table X
The Table Title Goes Here and Is Italicized (N = XX)
Source | df | F | η | p
Between subjects | | | |
Variable 1 (A) | XX | 0.00 | 0.00 | .00
Variable 2 (B) | XX | 0.00 | 0.00 | .00
A x B | XX | | | .00
B within-group error | XX | | | .00
Within-subjects | | | |
 | XX | 0.00 | 0.00 | .00
 | XX | 0.00 | 0.00 | .00
 | XX | 0.00 | 0.00 | .00
Note. Type any notes here.
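For illustration only, the sketch below shows how the df, F, and p values for a two-way between-subjects ANOVA might be obtained, assuming Python with statsmodels; the factor names and data are hypothetical, and the added eta-squared column is a simple sum-of-squares-based estimate.

```python
# Minimal sketch: two-way between-subjects ANOVA with a classical eta-squared estimate.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(4)
df = pd.DataFrame({
    "factor_a": np.repeat(["a1", "a2"], 60),
    "factor_b": np.tile(np.repeat(["b1", "b2", "b3"], 20), 2),
    "outcome": rng.normal(size=120),
})

model = ols("outcome ~ C(factor_a) * C(factor_b)", data=df).fit()
anova_table = sm.stats.anova_lm(model, typ=2)   # sum_sq, df, F, PR(>F)
# Classical eta squared = SS_effect / SS_total (meaningful for the effect rows only)
anova_table["eta_sq"] = anova_table["sum_sq"] / anova_table["sum_sq"].sum()
print(anova_table)
```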
Correlation Table
Table X
The Table Title Goes Here and Is Italicized (N = XX)
Subscale | 1 | 2 | 3 | 4
Students (n = XX) | | | |
1. Variable 1 | 1.0 | .00 | .00 | .00
2. Variable 2 | .00 | 1.0 | .00 | .00
3. Variable 3 | .00 | .00 | 1.0 | .00
4. Variable 4 | .00 | .00 | .00 | 1.0
Older adults (n = XX) | | | |
1. Variable 1 | 1.0 | .00 | .00 | .00
2. Variable 2 | .00 | 1.0 | .00 | .00
3. Variable 3 | .00 | .00 | 1.0 | .00
4. Variable 4 | .00 | .00 | .00 | 1.0
Note. Type any notes here.
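For illustration only, the sketch below shows how grouped correlation matrices such as those in the table above might be produced, assuming Python with pandas; the group labels, variable names, and data are hypothetical.

```python
# Minimal sketch: one correlation matrix per subgroup.
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
df = pd.DataFrame(rng.normal(size=(80, 4)), columns=["var1", "var2", "var3", "var4"])
df["group"] = np.repeat(["students", "older_adults"], 40)

for label, subset in df.groupby("group"):
    print(f"\n{label} (n = {len(subset)})")
    print(subset[["var1", "var2", "var3", "var4"]].corr().round(2))
```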
Logistic Regression Table (6 Predictors)
Table X
The Table Title Goes Here and Is Italicized (N = XX)
Variable | B | SE | Wald | df | p | Odds Ratio | 95% CI for Odds Ratio [Lower, Upper]
Variable 1 | | | | | | |
Variable 2 | | | | | | |
Variable 3 | | | | | | |
Variable 4 | | | | | | |
Variable 5 | | | | | | |
Variable 6 | | | | | | |
Constant | | | | | | |
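For illustration only, the sketch below shows how a binary logistic regression might be estimated to obtain the B, SE, Wald, df, p, odds ratio, and odds-ratio confidence interval columns above, assuming Python with statsmodels; the predictor names and the simulated data are hypothetical.

```python
# Minimal sketch: binary logistic regression with odds ratios and their 95% CIs.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(6)
X = pd.DataFrame(rng.normal(size=(200, 2)), columns=["predictor_1", "predictor_2"])
logits = 0.8 * X["predictor_1"] - 0.5 * X["predictor_2"]
y = rng.binomial(1, 1 / (1 + np.exp(-logits)))  # hypothetical binary outcome

model = sm.Logit(y, sm.add_constant(X)).fit()
print(model.summary())          # B (coef), SE, z, p; Wald chi-square = z**2 with df = 1

odds_ratios = np.exp(model.params)
or_ci = np.exp(model.conf_int())            # 95% CI for the odds ratios
print(pd.concat([odds_ratios.rename("Odds Ratio"), or_ci], axis=1))
```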
APPENDIX H: SAMPLE INTERVIEW PROTOCOL
Interview Protocol
What you will do | What you will say (script)
Introduce the interview and set the stage (often over a meal or coffee) | Script XXXXXXXXXXXXXXXXXXXXX
· Watch for nonverbal cues · Paraphrase as needed · Ask follow-up probing questions to gain more depth | 1. Interview question
 | 2. Interview question
 | 3. Interview question
 | 4. Interview question
 | 5. Interview question
 | 6. Interview question
 | 7. Interview question
 | 8. Interview question
 | 9. Interview question
 | 10. Last interview question should be a wrap-up question, such as: What additional experiences have you had…?
Wrap up the interview, thanking the participant | Script XXXXXXXXXXXXXXXXXXXXX
Schedule follow-up member checking interview | Script XXXXXXXXXXXXXXXXXXXXX
Follow-up Member Checking Interview
Graphic by Gene E. Fusch, Ph.D.; not needed in the proposal or study, just a visual reminder during the proposal stage when creating the interview protocol.
Introduce the follow-up interview and set the stage | Script XXXXXXXXXXXXXXXXXXXXX
Share a copy of the succinct synthesis for each individual question. Bring in probing questions related to other information that you may have found; note that the information must be related so that you are probing and adhering to the IRB approval. Walk through each question, read the interpretation, and ask: Did I miss anything? Or, what would you like to add? | Script XXXXXXXXXXXXXXXXXXXXX
1. Question and succinct synthesis of the interpretation (perhaps one paragraph, or as needed) |
2. Question and succinct synthesis of the interpretation (perhaps one paragraph, or as needed) |
3. Question and succinct synthesis of the interpretation (perhaps one paragraph, or as needed) |
4. Question and succinct synthesis of the interpretation (perhaps one paragraph, or as needed) |
5. Question and succinct synthesis of the interpretation (perhaps one paragraph, or as needed) |
6. Question and succinct synthesis of the interpretation (perhaps one paragraph, or as needed) |
7. Question and succinct synthesis of the interpretation (perhaps one paragraph, or as needed) |
8. Question and succinct synthesis of the interpretation (perhaps one paragraph, or as needed) |
9. Question and succinct synthesis of the interpretation (perhaps one paragraph, or as needed) |
10. Question and succinct synthesis of the interpretation (perhaps one paragraph, or as needed) |
BIBLIOGRAPHY: SUGGESTED READINGS LISTS
Please note that these references are an amalgamation of input and suggestions. The purpose is to provide DBA students with additional reading sources to prepare for the doctoral study. Students are responsible for correctly referencing any sources per the APA publication manual (6th ed.). The reading lists that follow are organized by the following topics.
· Assumptions, Limitations, and Delimitations
· Case Study Sources
· Case Study Seminal Books
· Data Saturation and Data Collection Sources
· Ethical Considerations/IRB
· Ethnography Sources
· Focus Groups
· Interview Protocol Sources
· Interviews Sources
· Journaling Sources
· Member Checking Sources
· Mixed Methods Research
· Notetaking and Fieldwork
· Phenomenological Sources
· Pilot Studies
· Qualitative Research Foundation
· Qualitative and Quantitative Sources
· Reliability, Validity, Transferability, and Generalizability Sources
· Sampling and Incentives
· Sense-making
· Qualitative Software Analysis Sources
· Triangulation Sources
Assumptions, Limitations, and Delimitations
Assumptions
Abrams, L. S. (2010). Sampling hard to reach populations in qualitative research: The case of incarcerated youth. Qualitative Social Work, 9, 536-550. doi:10.1077/1473325010367821
Applebaum, M. (2012). Phenomenological psychological research as science. Journal of Phenomenological Psychology, 43(1), 36-72. doi:10.1163/156916212x632952
Arghode, V. (2012). Qualitative and quantitative research: Paradigmatic differences.
Global Education Journal, 2012(4), 155-163. Retrieved from http://franklinpublishing.net/globaleducation.html
Bansal, P., & Corley, K. (2011). The coming of age for qualitative research: Embracing the diversity of qualitative methods. Academy of Management Journal, 54, 233- 237. doi:10.5465/AMJ.2011.60262792
Bunniss, S., & Kelly, D. R. (2010). Research paradigms in medical education research.
Qualitative Research in Medical Education, 44, 358-366. doi:10.1111/j.1365- 2923.2009.03611.x
Castellan, C. M. (2010). Quantitative and qualitative research: A view for clarity.
International Journal of Education, 2(2), 1-14. Retrieved from http:// www.macrothink.org/ije
Cunliffe, A. L. (2011). Crafting qualitative research: Morgan and Smircich 30 years on.
Organizational Research Methods, 14, 647-673. doi:10.1177/1094428110373658
Diefenbach, T. (2009). Are case studies more than sophisticated storytelling?
Methodological problems of qualitative empirical research mainly based on semistructured interviews. Quality and Quantity, 43, 875-894. doi:10.1007/s11135-008-9164-0
Draper, A. A., & Swift, J. A. (2011). Qualitative research in nutrition and dietetics: Data collection issues. Journal of Human Nutrition & Dietetics, 24(1), 3-12. doi:10.1111/j.1365-277X.2010.01117.x
Ellis, T. J., & Levy, Y. (2009). Towards a guide for novice researchers on research methodology: Review and proposed methods. Issues in Informing Science & Information Technology, 323-337. Retrieved from http://informingscience.org/
Fan, X. (2013). “The test is reliable”; “The test is valid”: Language use, unconscious assumptions, and education research practice. The Asia-Pacific Education Researcher, 22, 217-218. doi:10.1007/s40299-012-0036-y
Gallop, S. (2011). Viewpoint: Assumptions. Journal of Behavioral Optometry, 22, 158-
160. Retrieved from http://www.oepf.org/journals
Grant, A. (2014). Troubling ‘lived experience’: A post-structural critique of mental health nursing qualitative research assumptions. Journal of Psychiatric and Mental Health Nursing, 21(6), 544-549 doi:10.1111/jpm.12113
Hodges, N. (2011). Qualitative research: A discussion of frequently articulated qualms (FAQs). Family and Consumer Sciences Research Journal, 40, 90-92. doi:10.1111/j.1552-3934.2011.02091.x
Lips-Wiersma, M., & Mills, A. J. (2013) Understanding the basic assumptions about human nature in workplace spirituality: Beyond the critical versus positive divide. Journal of Management Inquiry, 23(2), 148-161. doi:10.1177/1056492613501227
Kirkwood, A., & Price, L. (2013). Examining some assumptions and limitations of research on the effects of emerging technologies for teaching and learning in higher education. British Journal of Educational Technology, 44, 536-543. doi:10.1111/bjet.12049
Kouchaki, M., Okhuysen, G. A., Waller, M. J., & Tajeddin, G. (2012). The treatment of the relationship between groups and their environments: A review and critical examination of common assumptions in research. Group & Organization Management, 37, 171-203. doi:10.1177/1059601112443850
Marshall, C., & Rossman, G. B. (2016). Designing qualitative research (6th ed.).
Thousand Oaks, CA: Sage.
Martin, K., & Parmar, B. (2012). Assumptions in decision-making scholarship: Implications for business ethics research. Journal of Business Ethics, 105, 289- 306. doi:10.1007/s10551-011-0965-z
Pratt, M. G. (2009). For the lack of a boilerplate: Tips on writing up (and reviewing) qualitative research. Academy of Management Journal, 52, 856-862. doi:10.5465/AMJ.2009.44632557
Rocha Pereira, H. (2012). Rigour in phenomenological research: Reflections of a novice nurse researcher. Nurse Researcher, 19(3), 16-19. Retrieved from http://nurse researcher.rcnpublishing.co.uk
Wahyuni, D. (2012). The research design maze: understanding paradigms, cases, methods and methodologies. Journal of Applied Management Accounting Research, 10(1), 69-80. Retrieved from http://maaw.info/JAMAR.htm
Limitations
Aastrup, J., & Halldorsson, A. (2013). Quality criteria for qualitative inquiries in logistics. European Journal of Operational Research, 144, 321-332. doi:10.1016/S0377- 2217(02)00397-1
Anderson, C. (2010). Presenting and evaluating qualitative research. American Journal of Pharmaceutical Education, 74(8), 1-7. doi:10.5688/aj7408141
Brutus, S., Aguinis, H., & Wassmer, U. (2012). Self-reported limitations and future directions in scholarly reports analysis and recommendations. Journal of Management, 39(1) 48-75. doi:10.1177/0149206312455245
Brutus, S., Gill, H., & Duniewicz, K. (2010). State of science in industrial and organizational psychology: A review of self-reported limitations. Personnel Psychology, 63, 907-936. doi:10.1111/j.1744-6570.2010.01192.x
Bunniss, S., & Kelly, D. R. (2010). Research paradigms in medical education research.
Qualitative Research in Medical Education, 44, 358-366. doi:10.1111/j.1365- 2923.2009.03611.x
Castellan, C. M. (2010). Quantitative and qualitative research: A view for clarity.
International Journal of Education, 2(2), 1-14. Retrieved from http:// www.macrothink.org/ije
Connelly, L. M. (2013). Limitation section. Medsurg Nursing, 22, 325-325, 336.
Retrieved from http://www.medsurgnursing.net/cgi- bin/WebObjects/MSNJournal.woa
Cunliffe, A. L. (2011). Crafting qualitative research: Morgan and Smircich 30 years on.
Organizational Research Methods, 14, 647-673. doi:10.1177/1094428110373658
Diefenbach, T. (2009). Are case studies more than sophisticated storytelling?
Methodological problems of qualitative empirical research mainly based on semistructured interviews. Quality and Quantity, 43, 875-894. doi:10.1007/s11135-008-9164-0
Draper, A. A., & Swift, J. A. (2011). Qualitative research in nutrition and dietetics: Data collection issues. Journal of Human Nutrition & Dietetics, 24(1), 3-12. doi:10.1111/j.1365-277X.2010.01117.x
Ellis, T. J., & Levy, Y. (2009). Towards a guide for novice researchers on research methodology: Review and proposed methods. Issues in Informing Science & Information Technology, 323-337. Retrieved from http://informingscience.org/
Fan, X. (2013). “The test is reliable”; “The test is valid”: Language use, unconscious assumptions, and education research practice. The Asia-Pacific Education Researcher, 22, 217-218. doi:10.1007/s40299-012-0036-y
Finfgeld-Connett, D. (2010). Generalizability and transferability of meta-synthesis research findings. Journal of Advanced Nursing, 66, 246-254. doi:10.1111/j.1365-2648.2009.05250.x
Gibbs, L., Kealy, M., Willis, K., Green, J., Welch, N., & Daly, J. (2007). What have sampling and data collection got to do with good qualitative research? Australian and New Zealand Journal of Public Health, 31, 540-544. doi:10.1111/j.1753- 6405.2007.00140.x
Hodges, N. (2011). Qualitative research: A discussion of frequently articulated qualms (FAQs). Family and Consumer Sciences Research Journal, 40, 90-92. doi:10.1111/j.1552-3934.2011.02091.x
Houghton, C., Casey, D., Shaw, D., & Murphy, K. (2013). Rigour in qualitative case- study research. Nurse Researcher, 20(4), 12-17. doi:10.7748/nr2013.03.20.4.12.e326
Marshall, C., & Rossman, G. B. (2016). Designing qualitative research (6th ed.).
Thousand Oaks, CA: Sage.
O’Reilly, M., & Parker, N. (2012, May). Unsatisfactory saturation: A critical exploration of the notion of saturated sample sizes in qualitative research. Qualitative Research Journal, 1-8. doi:10.1177/1468794112446106
Polit, D. F., & Beck, C. T. (2010). Generalization in quantitative and qualitative research: Myths and strategies. International Journal of Nursing Studies, 47, 1451-1458. doi:10.1016/j.ijnurstu.2010.06.004
Pratt, M. G. (2009). For the lack of a boilerplate: Tips on writing up (and reviewing) qualitative research. Academy of Management Journal, 52, 856-862. doi:10.5465/AMJ.2009.44632557
Prowse, M., & Camfield, L. (2013). Improving the quality of development assistance: What role for qualitative methods in randomized experiments? Progress in Development Studies, 13(1), 51-61. doi:10.1177/146499341201300104
Rocha Pereira, H. (2012). Rigour in phenomenological research: Reflections of a novice nurse researcher. Nurse Researcher, 19(3), 16-19. Retrieved from http://nurse researcher.rcnpublishing.co.uk
Sabbour, S., Lasi, H., & Tessin, P. (2012). Business intelligence and strategic decision simulation. World Academy of Science, Engineering and Technology, 6, 980-987. Retrieved from http://waset.org/Publications?p=61
Delimitations
Barratt, M., Choi, T. Y., & Li, M. (2011). Qualitative case studies in operations management: Trends, research outcomes, and future research implications. Journal of Operations Management, 29, 329-342. doi:10.1016/j.jom.2010.06.002
Baxter, P., & Jack, S. (2008). Qualitative case study methodology: Study design and implementation for novice researchers. The Qualitative Report, 13, 544-559. Retrieved from http://www.nova.edu/ssss/QR/QR13-4/baxter
Bunniss, S., & Kelly, D. R. (2010). Research paradigms in medical education research.
Qualitative Research in Medical Education, 44, 358-366. doi:10.1111/j.1365- 2923.2009.03611.x
Castellan, C. M. (2010). Quantitative and qualitative research: A view for clarity.
International Journal of Education, 2(2), 1-14. Retrieved from http:// www.macrothink.org/ije
Cunliffe, A. L. (2011). Crafting qualitative research: Morgan and Smircich 30 years on.
Organizational Research Methods, 14, 647-673. doi:10.1177/1094428110373658
Diefenbach, T. (2009). Are case studies more than sophisticated storytelling?
Methodological problems of qualitative empirical research mainly based on semistructured interviews. Quality and Quantity, 43, 875-894. doi:10.1007/s11135-008-9164-0
Draper, A. A., & Swift, J. A. (2011). Qualitative research in nutrition and dietetics: Data collection issues. Journal of Human Nutrition & Dietetics, 24(1), 3-12. doi:10.1111/j.1365-277X.2010.01117.x
Ellis, T. J., & Levy, Y. (2009). Towards a guide for novice researchers on research methodology: Review and proposed methods. Issues in Informing Science & Information Technology, 323-337. Retrieved from http://informingscience.org/
Fan, X. (2013). “The test is reliable”; “The test is valid”: Language use, unconscious assumptions, and education research practice. The Asia-Pacific Education Researcher, 22, 217-218. doi:10.1007/s40299-012-0036-y
Marshall, C., & Rossman, G. B. (2016). Designing qualitative research (6th ed.).
Thousand Oaks, CA: Sage.
Hodges, N. (2011). Qualitative research: A discussion of frequently articulated qualms (FAQs). Family and Consumer Sciences Research Journal, 40, 90-92. doi:10.1111/j.1552-3934.2011.02091.x
Nenty, H., & Adedoyin, O. O. (2010). Research orientation and research-related behaviour of graduate education students at University of Botswana.
International Research Journal, 1, 577-585. Retrieved from http://interesjournals.org
Podsakoff, P. M., MacKenzie, S. B., & Podsakoff, N. P. (2012). Sources of method bias in social science research and recommendations on how to control
it. Annual Review of Psychology, 63, 539-569. doi:10.1146/annurev-psych- 120710-100452
Pratt, M. G. (2009). For the lack of a boilerplate: Tips on writing up (and reviewing) qualitative research. Academy of Management Journal, 52, 856-862. doi:10.5465/AMJ.2009.44632557
Rocha Pereira, H. (2012). Rigour in phenomenological research: Reflections of a novice nurse researcher. Nurse Researcher, 19(3), 16-19. Retrieved from http://nurse researcher.rcnpublishing.co.uk
Scotland, J. (2012). Exploring the philosophical underpinnings of research: Relating ontology and epistemology to the methodology and methods of the scientific, interpretive, and critical research paradigms. English Language Teaching, 5(9), 9-17. doi:10.5539/elt.v5n9p9
Small, M. (2009). How many cases do I need: On science and the logic of case selection in field-based research. Ethnography, 10(1), 5-38. doi:10.1177/1466138108099586
Spitzmüller, J., & Warnke, I. H. (2011). Discourse as a “linguistic object”: Methodical and methodological delimitations. Critical Discourse Studies, 8, 75-94. doi:10.1080/17405904.2011.558680
Case Study Sources
Alfonso, M., Nickelson, L., & Cohen, D. (2012). Farmers’ markets in rural communities: A case study. American Journal of Health Education, 43(3), 143-151. Retrieved from http://www.aahperd.org/aahe/publications/ajhe/
Almutairi, A. F., Gardner, G. E., & McCarthy, A. (2014). Practical guidance for the use of pattern-matching technique in case-study research: A case presentation. Nursing & Health Sciences, 16, 239-244. doi:10.1111/nhs.12096
Amerson, R. (2011). Making a case for the case study method. Journal of Nursing Education, 50, 427-428. doi:10.3928.01484834-20110719-01
Andrade, A. D. (2009). Interpretive research aiming at theory building: Adopting and adapting the case study design. The Qualitative Report, 14(1), 42-60. Retrieved from http://www.nova.edu/ssss/QR/QR14-1/diaz-andrade
Ates, O. (2013). Using case studies for teaching management to computer engineering students. International Journal of Business and Management, 8(5), 72-81. doi:10.5539/ijbm.v8n5p72
Baker, R. G., (2011). The contribution of case study research to knowledge of how to improve the quality of care. British Medical Journal Quality and Safety, 20, 30-35. doi:10.1136/bmjqs.2010.046490
Baxter, P., & Jack, S. (2008). Qualitative case study methodology: Study design and implementation for novice researchers. The Qualitative Report, 13, 544-559. Retrieved from http://www.nova.edu/ssss/QR/QR13-4/baxter
Beverland, M., & Lindgreen, A. (2010). What makes a good case study? A positivist review of qualitative case research published in Industrial Marketing Management, 1971-2006. Industrial Marketing Management, 39, 56-63. doi:10.1016/j.indmarman.2008.09.005
Boblin, S. L., Ireland, S., Kirkpatrick, H., & Robertson, K. (2013). Using Stake's qualitative case study approach to explore implementation of evidence-based practice. Qualitative Health Research, 23, 1267-1275. doi:10.1177/1049732313502128
Breslin, M., & Buchanan, R. (2011). On the case study method of research and teaching in design. Design Issues, 24(1), 36-40. Retrieved from http://www.mitjournals.org
Bucic, T., Robinson, L., & Ramburuth, P. (2010). Effects of leadership style on team learning. Journal of Workplace Learning, 22, 228-248. doi:10.1108/13665621011040680
Butvilas, T., & Zygmantas, J. (2011). An ethnographic case study in educational research. Acta Paedagogica Vilnensia, 27, 33-42. Retrieved from http://www.leidykla.eu/index.php?id=36
Cinneide, B. (2015). The role of effectiveness of case studies: Student performance in case study vs. “theory” examinations. Journal of European Industrial Training, 21(1), 3-13. Retrieved from http://www.emeraldinsight.com/journal.jeit
Cronin, C. (2014). Using case study research as a rigorous form of inquiry. Nurse Researcher, 21(5), 19-27. doi:10.7748/nr.21.5.19.e1240
Crowe, S., Cresswell, K., Robertson, A., Huby, G., Avery, A., & Sheikh, A. (2011). The case study approach. BMC Medical Research Methodology, 11(1), 1-9. doi:10.1186/1471-2288-11-100
Da Mota Pedrosa, A., Näslund, D., & Jasmand, C. (2012). Logistics case study based research: Towards higher quality. International Journal of Physical Distribution & Logistics Management, 42, 275-295. doi:10.1108/09600031211225963
Dasgupta, M. (2015). Exploring the relevance of case study research. Vision (09722629), 19(2), 147-160. doi:10.1177/0972262915575661
De Massis, A., & Kotlar, J. (2014). The case study method in family business research: Guidelines for qualitative scholarship. Journal of Family Business Strategy, 5(1), 15-29. doi:10.1016/j.jfbs.2014.01.007
Easton, G. (2010). Critical realism in case study research. Industrial Marketing Management, 39(1), 118-128. doi:10.1016/j.indmarman.2008.06.004
Eno, M., & Dammak, A. (2014). Debating the case study dilemma: Controversies and considerations. Veritas: The Academic Journal of St Clements Education Group, 5(3), 1-8. Retrieved from http://stclements.edu/Veritas/VERITAS%20October%202014
Gibbert, M., & Ruigrok, W. (2010). The what and how of case study rigor: Three strategies based on published work. Organizational Research Methods, 13, 710- 737. doi:10.1177/1094428109351319
Harland, T. (2014). Learning about case study methodology to research higher education. Higher Education Research & Development, 1-10. doi:10.1080/07294360.2014.911253
Hietanen, J., Sihvonen, A., Tikkanen, H., & Mattila, P. (2014). Managerial storytelling: How we produce managerial and academic stories in qualitative B2B case study research. Journal of Global Scholars of Marketing, 24. doi:10.1080/21639159.2014.911496
Houghton, C. E., Casey, D., Shaw, D., & Murphy, K. (2010). Ethical challenges in qualitative research: Examples from practice. Nurse Researcher, 18(1), 15-25. Retrieved from http://nurseresearcher.rcnpublishing.co.uk
Hyett, N., Kenny, A., & Dickson-Swift, V. (2014). Methodology or method? A critical review of qualitative case study reports. International Journal of Qualitative, 9. doi:10.3402/qhw.v9.23606
Järvensivu, T., & Törnroos, J. Å. (2010). Case study research with moderate constructionism: Conceptualization and practical illustration. Industrial Marketing Management, 39(1), 100-108. doi:10.1016/j.indmarman.2008.05.005
Ketokivi, M., & Choi, T. (2014). Renaissance of case research as a scientific method.
Journal of Operations Management, 32, 232-240. doi:10.1016/j.jom.2014.03.004
Moll, S. (2012). Navigating political minefields: Partnerships in organizational case study research. Work, 43, 5-12. doi:10.3233/wor-2012-1442
Morse, A. L., & McEvoy, C. D. (2014). Qualitative research in sport management: Case study as a methodological approach. The Qualitative Report, 19, 1-13. Retrieved from http://www.nova.edu/ssss/QR/QR19/morse17
Murakami, Y. (2013, March). Rethinking a case study method in educational research: A comparative analysis method in qualitative research. Educational Studies in Japan: International Yearbook, (7), 81-96. Retrieved from http://ci.nii.ac.jp/vol_issue/nels/AA12192695_en.html
Pan, S., & Tan, B. (2011). Demystifying case research: A structured-pragmatic- situational (SPS) approach to conducting case studies. Information and Organization, 21(3), 161-176. doi:10.1016/j.infoandorg.2011.07.001
Petty, N. J., Thomson, O. P., & Stew, G. (2012). Ready for a paradigm shift? Part 2: Introducing qualitative research methodologies and methods. Manual Therapy, 17, 378-384. doi:10.1016/j.math.2012.03.004
Piekkari, R., Plakoyiannaki, E., & Welch, C. (2010). 'Good' case research in industrial marketing: Insights from research practice. Industrial Marketing Management, 39, 109-117. doi:10.1016/j.indmarman.2008.04.017
Pratama, A., & Firman, A. (2010). Exploring the use of qualitative research methodology in conducting research in cross cultural management. International Journal of Interdisciplinary Social Sciences, 5, 331-342. Retrieved from http://www.iji.cgpublisher.com
Radley, A., & Chamberlain, K. (2012). The study of the case: Conceptualising case study research. Journal of Community & Applied Social Psychology, 22, 390– 399. doi:10.1002/casp.1106
Ridder, H. (2012). Case study research. Design and methods (book review of Robert Yin). Zeitschrift fur Personalforschung, 26(1), 93-95. Retrieved from http://www.zfp-personalforschung.de/de/
Rodrigues, G. N., Alves, V., Silveira, R., & Laranjeira, L. A. (2012). Dependability analysis in the Ambient Assisted Living Domain: An exploratory case study. Journal of Systems and Software, 85(1), 112-131. doi:10.1016/j.jss.2011.07.037
Sandelowski, M. (2011). "Casing" the research case study. Research in Nursing & Health, 34, 153-159. doi:10.1002/nur.20421
Sangster-Gromley, E. (2013). How case-study research can help to explain implementation of the nurse practitioner role. Nurse Researcher, 20(4), 6-11. doi:10.7748/nr2013.03.20.4.6.e291
Singh, A. S. (2014). Conducting case study research in non-profit organisations.
Qualitative Market Research: An International Journal, 17, 77-84. doi:10.1108/QMR-04-2013-0024
Small, M. (2009). How many cases do I need? On science and the logic of case selection in field-based research. Ethnography, 10(1), 5-38. doi:10.1177/1466138108099586
Snowden, A., & Martin C. R. (2011). Concurrent analysis: Towards generalizable qualitative research. Journal of Clinical Nursing, 20, 2868-2877. doi:10.1111/j.1365-2702.2010.03513.x
Snyder, C. (2012). A case study of a case study: Analysis of a robust qualitative research methodology. The Qualitative Report, 17(26), 1-21. Retrieved from http://www.nova.edu/ssss/QR/QR17/snyder
Stewart, J. (2012). Multiple-case study methods in governance-related research. Public Management Review, 14(1), 67-82. doi:10.1080/14719037.2011.589618
Street, C. T., & Ward, K. W. (2012). Improving validity and reliability in longitudinal case study timelines. European Journal of Information Systems, 21, 160-175. doi:10.1057/ejis.2011.53
Taylor, R. (2013). Case-study research in context. Nurse Researcher, 20(4), 4-5.
Retrieved from http://www.nursing-standard.co.uk/
Thomas, G. (2011). A typology for the case study in social science following a review of definition, discourse, and structure. Qualitative Inquiry, 17, 511-521. doi:10.1177/1077800411409884
Tight, M. (2010). The curious case of case study: A viewpoint. International Journal of Social Research Methodology, 13, 329-339. doi:10.1080/13645570903187181
Tsang, E. W. (2012, August 26). Case study methodology: Causal explanation, contextualization, and theorizing. Journal of International Management, 19, 195- 202. doi:10.1016/j.intman.2012.08.004
Tsang, E. W. (2014). Case studies and generalization in information systems research: A critical realist perspective. Journal of Strategic Information Systems, 23, 174- 186. doi:10.1016/j.jsis.2013.09.002
Verner, J. M., & Abdullah, L. M. (2012). Exploratory case study research: Outsourced project failure. Information and Software Technology, 54, 866-886. doi:10.1016/j.infsof.2011.11.001
Vissak, T. (2010). Recommendations for using case study methods in international business research. The Qualitative Report, 15, 370-388. Retrieved from http://nsuworks.nova.edu/cgi/viewcontent.cgi?article=1156&context=tqr
Vohra, V. (2014). Using the multiple case study design to decipher contextual leadership behaviors in Indian organizations. The Electronic Journal of Business Research Methods, 12, 54-65. Retrieved from http://www.ejbrm.com
Wahyuni, D. (2012). The research design maze: Understanding paradigms, cases, methods and methodologies. Journal of Applied Management Accounting Research, 10(1), 69-80. Retrieved from http://www.cmawebline.org/jamar
Welch, C., Piekkari, R., Plakoyiannaki, E., & Paavilainen-Mäntymäki, E. (2011).
Theorising from case studies: Towards a pluralist future for international business research. Journal of International Business Studies, 42, 740-762. doi:10.1057/jibs.2010.55
Westerman, M. A. (2014). Examining arguments against quantitative research: “Case studies” illustrating the challenge of finding a sound philosophical basis of a human sciences approach to psychology. New Ideas in Psychology, 32, 42-58. doi:10.1016/j.newideapsych.2013.08.002
Whiffin, C. J., Bailey, C., Ellis-Hill, C., & Jarrett, N. (2014). Challenges and solutions during analysis in a longitudinal narrative case study. Nurse Researcher, 21(4), 20-26. Retrieved from http://rcnpublishing.com/journal/nr
White, J., Drew, S., & Hay, T. (2009). Ethnography versus case study: Positioning research and researchers. Qualitative Research Journal, 9(1), 18-27. doi:10.3316/QRJ0901018
Woodside, A. G. (2010). Bridging the chasm between survey and case study research: Research methods for achieving generalization, accuracy, and complexity.
Industrial Marketing Management, 39(1), 64-75. doi:10.1016/j.indmarman.2009.03.017
Yadav, A., Shaver, G. M., & Meckl, P. (2010). Lesson learned: Implementing the case teaching method in a mechanical engineering course. Journal of Engineering Education, 99(1), 149-162. doi:10.1002/j.2168-9830.2010.tb01042.x
Yazan, B. (2015). Three approaches to case study methods in education: Yin, Merriam, and Stake. The Qualitative Report, 20(2), 134-152. Retrieved from http://nsuworks.nova.edu/tqr/vol20/iss2/12
Yin, R. K. (2013, July 10). Validity and generalization in future case study evaluations.
Evaluation, 19, 312-332. doi:10.1177/1356389013497081
Zivkovic, J. (2012). Strengths and weaknesses of business research methodologies: Two disparate case studies. Business Studies Journal, 4(2), 91-99. Retrieved from http://www.alliedacademies.org/public/journals/JournalDetails.aspx?jid=26
Case Study Seminal Books
Stake, R. E. (1995). The art of case study research. Thousand Oaks, CA: Sage.
Yin, R. K. (2012). Applications of case study research (3rd ed.). Thousand Oaks, CA: Sage.
Yin, R. K. (2014). Case study research: Design and methods (5th ed.). Thousand Oaks, CA: Sage.
Data Saturation and Data Collection Sources
Abowitz, D. A., & Toole, T. M. (2010). Mixed methods research: Fundamental issues of design, validity, and reliability in construction research. Journal of Construction Engineering & Management, 136(1), 108-116. doi:10.1061/(ASCE)CO.1943-
7862.0000026
Anderson, C. (2010). Presenting and evaluating qualitative research. American Journal of Pharmaceutical Education, 74(8), 4-7. Retrieved from http://www.ajpe.org/
Anyan, F. (2013). The influence of power shifts in data collection and analysis stages: A focus on qualitative research interview. The Qualitative Report, 18(18), 1-9.
Retrieved from http://www.nova.edu/sss/QR/index.html
Barratt, M., Choi, T. Y., & Li, M. (2011). Qualitative case studies in operations management: Trends, research outcomes, and future research implications. Journal of Operations Management, 29, 329-342. doi:10.1016/j.jom.2010.06.002
Bernard, R. H. (2011). Research methods in anthropology: Qualitative and quantitative approaches. Thousand Oaks: Sage.
Bowen, G. A. (2008). Naturalistic inquiry and the saturation concept: A research note.
Qualitative Research, 8(1), 137-152. doi:10.1177/1468794107085301
Brod, M., Tesler, L. E., & Christiansen, T. L. (2009). Qualitative research and content validity: Developing best practices based on science and experience. Quality of Life Research, 18, 1263-1278. doi:10.1007/s11136-009-9540-9
Carlsen, B., & Glenton, C. (2011). What about N? A methodological study of sample size reporting in focus group studies. BMC Medical Research Methodology, 11(1), 26-35. doi:10.1186/1471-2288-11-26
Cater, J. K. (2011). SKYPE - A cost-effective method for qualitative research.
Rehabilitation Counselors & Educators Journal, 4, 3. Retrieved from http://www.nationalrehab.org/cwt/external/wcpages/divisions/rcea.aspx
Chikweche, T., & Fletcher, R. (2012). Undertaking research at the bottom of the pyramid using qualitative methods. Qualitative Market Research: An International Journal, 15, 242-267. doi:10.1108/13522751211231978
Coast, J., & Horrocks, S. (2010). Developing attributes and levels for discrete choice experiments using qualitative methods. Journal of Health Services Research and Policy, 12(1), 25-30. doi:10.346457934563454
Couper, M. P. (2011). The future of modes of data collection. Public Opinion Quarterly, 75(5), 889-908. Retrieved from http://poq.oxfordjournals.org/
Covell, C. L., Sidani, S., & Ritchie, J. A. (2012). Does the sequence of data collection influence participants’ responses to closed and open-ended questions? A methodological study. International Journal of Nursing Studies, 49, 664-671. doi:10.1016/j.ijnurstu.2011.12.002
Dennis, B. (2010, June). Ethical dilemmas in the field: The complex nature of doing education ethnography. Ethnography and Education, 5(2), 123-127. doi:10.1080/17457823.2010.493391
Denzin, N. K. (2009). The research act: A theoretical introduction to sociological methods. New York, NY: Aldine Transaction.
Denzin, N. K. (2012). Triangulation 2.0. Journal of Mixed Methods Research, 6(2), 80-88. doi:10.1177/1558689812437186
Dibley, L. (2011). Analyzing narrative data using McCormack’s lenses. Nurse Researcher, 18(3), 13-19. Retrieved from http://nurseresearcher.rcnpublishing.co.uk/news-and- opinion/commentary/analysing-qualitative-data
Dixon, S. E. A., & Clifford, A., (2007). Ecopreneurship: A new approach to managing the triple bottom line. Journal of Organizational Change Management, 20(3), 326- 345. doi:10.1108/09534810710740164
Draper, A. A., & Swift, J. A. (2011). Qualitative research in nutrition and dietetics: Data collection issues. Journal of Human Nutrition & Dietetics, 24(1), 3-12. doi:10.1111/j.1365-277X.2010.01117.x
Edelman, B. (2012). Using Internet data for economic research. The Journal of Economic Perspectives, 26, 189-206. doi:10.1257/jep.26.2.189
Field, A. (2009). Discovering statistics using SPSS (3rd ed.). Thousand Oaks, CA: Sage.
Floden, R. E. (2009). Empirical research without certainty. Educational Theory, 59, 485- 498. doi:10.1111/j.1741-5446.2009.00332.x
Francis, J. J., Johnston, M., Robertson, C., Glidewell, L., Entwistle, V. Eccles, M. P., & Grimshaw, J. M. (2010). What is an adequate sample size? Operationalizing data saturation for theory-based interview studies. Psychology and Health, 25, 1229- 1245. doi:10.1080/08870440903194015
Fusch, G. E. (2008, December). What happens when the ROI model does not fit?
Performance Improvement Quarterly, 14(4), 60-76. doi:10.1111/j.1937- 8327.2001.tb00230.x
Fusch, P., & Ness, L. (2015). Are we there yet? Data saturation in qualitative research.
The Qualitative Report, 20, 1408-1416. Retrieved from http://tqr.nova.edu/
Gerring, J. (2011). How good is enough? A multidimensional, best-possible standard for research design. Political Research Quarterly, 64, 625-636. doi:10.1177/1065912910361221
Gibbert, M., & Ruigrok, W. (2010). The what and how of case study rigor: Three strategies based on published work. Organizational Research Methods, 13, 710-
737. doi:10.1177/1094428109351319
Gibbins, J., Bhatia, R., Forbes, K., & Reid, C. M. (2014). What do patients with advanced incurable cancer want from the management of their pain? A qualitative study. Palliative Medicine, 28(1), 71-78. doi:10.1177/0269216313486310
Gibbs, L., Kealy, M., Willis, K., Green, J., Welch, N., & Daly, J. (2007). What have sampling and data collection got to do with good qualitative research? Australian and New Zealand Journal of Public Health, 31, 540-544. doi:10.1111/j.1753- 6405.2007.00140.x
Guest, G., Bunce, A., & Johnson, L. (2006). How many interviews are enough? An experiment with data saturation and variability. Field Methods, 18(1), 59-82.
doi:10.1177/1525822X05279903
Halcomb, E., & Andrew, S. (2009). Practical considerations for higher degree research students undertaking mixed methods projects. International Journal of Multiple Research Approaches, 3, 153-162. Retrieved from http://mra.e- contentmanagement.com
Hannah, D., & Lautsch, B. (2011). Counting in qualitative research: Why to conduct it, when to avoid it, and when to closet it. Journal of Management Inquiry, 20, 14- 22. doi:10.1177/1056492610375988
Hayman, B., Wilkes, L., & Jackson, D. (2012). Journaling: Identification of challenges and reflection on strategies. Nurse Researcher, 19(3), 27-31. Retrieved from http://www.nursing-standard.co.uk
Holloway, I., Brown, L., & Shipway, R. (2010). Meaning not measurement: Using ethnography to bring a deeper understanding to the participant experience of festivals and events. International Journal of Event and Festival Management, 1(1), 74-85. doi:10.1108/17852951011029315
Kerr, C., Nixon, A., & Wild, D. (2010). Assessing and demonstrating data saturation in qualitative inquire supporting patient-reported outcomes research. Expert Review of Pharmacoeconomics & Outcomes Research, 10, 269-281. Retrieved from http://informahealthcare.com/loi/erp
Knight, J. (2012). Deletion, distortion and data collection: The application of the neuro- linguistic programming (NLP) meta-model in qualitative interviews. Australasian Journal of Market & Social Research, 20(1), 15-21. Retrieved from http://www.amsrs.com
Lasch, K. E., Marquis, P., Vigneux, M., Abetz, L., Arnould, B.,…Bayliss, M. (2010). PRO development: Rigorous qualitative research as the crucial foundation. Quality of Life Research, 19, 1087-1096. doi:10.1007/s11136-010-9677-6
Lunnay, B., Borlagdan, J., McNaughton, D., & Ward, P. (2015). Ethical use of social media to facilitate qualitative research. Qualitative Health Research, 25, 99-109. doi:10.1177/1049732314549031
Marshall, C. & Rossman, G. (2015). Designing qualitative research (6th ed.). Thousand Oaks: Sage.
Mason, M. (2010, September). Sample size and saturation in PhD studies using qualitative interviews. Forum: Qualitative Social Research, 11(3). Retrieved from http://www.qualitative-research.net/index.php/fqs/article/view/1428/3027
Morse, J. M. (2015). "Data were saturated..." Qualitative Health Research, 25, 587-588. doi:10.1177/1049732315576699
Nastasia, D. I., & Rakow, L. F. (2010). What is theory? Puzzles and maps as metaphors in communication theory. triple C, 8(1), 1-17. Retrieved from http://triple-c.at
Onwuegbuze, A. J., & Leech, N. L. (2007). A call for qualitative power analyses. Quality & Quantity, 41(1), 105-121. doi:10.1007/s11135-005-1098-1
Onwuegbuzie, A. J., Leech, N. L., & Collins, K. M. T. (2010). Innovative data collection strategies in qualitative research. The Qualitative Report, 15, 696-726. Retrieved from http://www.nova.edu/ssss/QR/QR15-3/onwuegbuzie
O’Reilly, M., & Parker, N. (2012, May). Unsatisfactory saturation: A critical exploration of the notion of saturated sample sizes in qualitative research. Qualitative Research Journal, 1-8. doi:10.1177/1468794112446106
Petty, N. J., Thomson, O. P., & Stew, G. (2012). Ready for a paradigm shift? Part 2: Introducing qualitative research methodologies and methods. Manual Therapy, 17, 378-384. doi:10.1016/j.math.2012.03.004
Pratama, A., & Firman, A. (2010). Exploring the use of qualitative research methodology in conducting research in cross cultural management. International Journal of Interdisciplinary Social Sciences, 5, 331-342. Retrieved from http://www.iji.cgpublisher.com
Robinson, O. (2014). Sampling in interview-based qualitative research: A theoretical and practical guide. Research in Psychology, 11(1), 25-41. doi:10.1080/14780887.2013.801543
Smith, A. (2012). Internet search tactics. Online Information Review, 36, 7-20. doi:10.1108/14684521211219481
Stavros, C., & Westberg, K. (2009). Using triangulation and multiple case studies to advance relationship marketing theory. Qualitative Market Research, 12, 307- 320. doi:10.1108/13522750910963827
Sterling, C. (2012). The handbook of Internet studies. Journalism and Mass Communication Quarterly, 89, 751-752. doi:10.1177/1077699012462100
Swift, J. A., & Tischler, V. (2010). Qualitative research in nutrition and dietetics: Getting started. Journal of Human Nutrition and Dietetics, 23, 559-566. doi:10.1111/j.1365-277X.2010.01116.X
Tilley, L., & Woodthorpe, K. (2011). Is it the end for anonymity as we know it? A critical examination of the ethical principle of anonymity in the context of 21st century demands on the qualitative researcher. Qualitative Research, 11, 197-212. doi:10.1177/1468794110394073
Tukey, J. W. (1977). Exploratory data analysis. Reading, MA : Addison-Wesley.
Turner, D. W. III. (2010). Qualitative interview design: A practical guide for novice investigators. The Qualitative Report, 3(2), 7-13. Retrieved from http://www.nova.edu/ssss/QR/QR15-3/qid
Walker, J. L. (2012). The use of saturation in qualitative research. Canadian Journal of Cardiovascular Nursing, 22(2), 37-46. Retrieved from http://www.cccn.ca
White, J., Drew, S., & Hay, T. (2009). Ethnography versus case study: Positioning research and researchers. Qualitative Research Journal, 9(1), 18-27. doi:10.3316/QRJ0901018
White, D. E., Oelke, N. D., & Friesen, S. (2012). Management of a large qualitative data set: Establishing trustworthiness of the data. International Journal of Qualitative Methods, 11, 244-258. Retrieved from http://ejournals.library.ualberta.ca/index.php/IJQM/article/view/9883
Whiteley, A. (2012). Supervisory conversations on rigour and interpretive research.
Qualitative Research Journal, 12, 251-271. doi:10.1108/14439881211248383
Williamson, K. (2006). Research in constructivist frameworks using ethnographic techniques. Library Trends, 55(1), 83-101. doi:10.1353/lib.2006.0054
Zikmund, W., Babin, B.J., Carr, J.C., & Griffin, M. (2010). Business research methods
(8th ed.). Mason, OH: Thomson/South-Western.
Ethical Considerations/IRB
Adams, P., Wongwit, W., Pengsaa, K., Khusmith, S., Fungladda, W., Chaiyaphan, W.,
... Kaewkungwal, J. (2013). Ethical issues in research involving minority populations: The process and outcomes of protocol review by the ethics committee of the faculty of tropical medicine, Mahidol University, Thailand. BMC Medical Ethics, 14(1). doi:10.1186/1472-6939-14-33
Ahern, K. (2012). Informed consent: Are researchers accurately representing risks and benefits? Scandinavian Journal of Caring Sciences, 26, 671-678. doi:10.1111/j.1471-6712.2012.00978.x
Alby, F., & Fatigante, M. (2014). Preserving the respondent’s standpoint in a research interview: Different strategies of ‘doing’ the interviewer. Human Studies, 37, 239-256. doi:10.1007/s10746-013-9292-y
Alcadipani, R., & Hodgson, D. (2009). By any means necessary? Ethnographic access, ethics and the critical researcher. Tamara:Journal of Critical Organization Inquiry, 7(4), 127-128. Retrieved from http://tamarajournal.com/
Aluwihare-Samaranayake, D. (2012). Ethics in qualitative research: A view of the participants’ and researchers’ world from a critical standpoint. International Journal of Qualitative Methods, 11(2), 64-81. Retrieved from https://ejournals.library.ualberta.ca/index.php/IJQM/index
Amon, J. J., Baral, S. D., Beyrer, C., & Kass, N. (2012). Human rights research and ethics review: Protecting individuals or protecting the state? PLoS Med, 9, e1001325. doi:10.1371/journal.pmed.1001325
Angelos, P. (2013). Ethical issues of participant recruitment in surgical clinical trials.
Annals of Surgical Oncology, 20, 3184-3187. doi:10.1245/s10434-013-3178-0
Barbour, A. (2010, June). Exploring some ethical dilemmas and obligations of the ethnographer. Ethnography and Education, 5(2), 159-173. doi:10.1080/17457823.2010.493399
Barker, M. (2013). Finding audiences for our research: Rethinking the issue of ethical challenges. Journal of the Communication Review, 16(1/2), 70-80. doi:10.1080/10714421.2013.757504
Beskow, L. M., Check, D. K., & Ammarell, N. (2014). Research participants’ understanding of and reactions to certificates of confidentiality. AJOB Primary Research, 5(1), 12-22. doi:10.1080/21507716.2013.813596
Blee, K., & Currier, A. (2011). Ethics beyond the IRB: An introductory essay. Qualitative Sociology, 34, 401-413. doi:10.1007/s11133-011-9195-z
Bloomer, M. J., Cross, W., Endacott, R., O’Connor, M., & Moss, C. (2012). Qualitative observation in a clinical setting: Challenges at end of life. Nursing and Health Sciences, 14, 25-31. doi:10.1111/j.1442-2018.2011.00653.x
Boyd, W. E., Parry, S., Burger, N., Kelly, J., Boyd, W., & Smith, J. (2013). Writing for ethical research: Novice researchers, writing, and the experience of experiential narrative. Creative Education, 4(12), 30-39. doi:10.4236/ce.2013.412a1005
Brakewood, B., & Poldrack, R. (2013). The ethics of secondary data analysis: Considering the application of Belmont principles to the sharing of neuroimaging data. Neuroimage, 82, 671-676. doi:10.1016/j.neuroimage.2013.02.040
Brewis, J. (2014). The ethics of researching friends: On convenience sampling in qualitative management and organization studies. Journal of British Management, 25, 849-862. doi:10.1111/1467-8551.12064
Bromley, E., Mikesell, L., Jones, F., Khodyakov, D. (2015). From subject to participant: Ethics and the evolving role of community in health research. American Journal of Public Health, 105, 900-908. Retrieved from http://www.ajph.aphapublications.org/
Cassidy, S. (2013). Acknowledging hubris in interpretative data analysis. Nurse Researcher, 20(6), 27-31. doi:10.7748/nr2013.07.20.6.27.e321
Chappy, S., & Gaberson, K. B. (2012). To IRB or not to IRB: That is the question.
AORN Journal, 95, 682-683. doi:10.1016/j.aorn/2012.03.012
Check, D. K., Wolf, L. E., Dame, L. A., & Beskow, L. M. (2014). Certificates of confidentiality and informed consent: Perspectives of IRB chairs and institutional legal counsel. IRB: Ethics and Human Research, 36(1), 1-8. doi:10.1038/gim.2014.102
Corman, J. (2010). Principles of ethical review. Applied Clinical Trials, 19(7), 8-9.
Retrieved from http://www.appliedclinicaltrialsonline.com
Cook, A. F., Hoas, H., & Joyner, J. C. (2013). The protectors and the protected: What regulators and researchers can learn from IRB members and subjects. Narrative Inquiry in Bioethics, 3(1), 51-65. doi:10.1353/nib.2013.0014
Cross, J., Pickering, K., & Hickey, M. (2014). Community-based participatory research, ethics, and Institutional Review Boards: Untying a Gordian knot. Critical Sociology, 1-20. doi:10.1177/0896920513512696.
Crow, G., Wiles, R., Heath, S., & Charles, V. (2006). Research ethics and data quality: The implications of informed consent. International Journal of Social Research Methodology, 9(2), 83-95. doi:10.1080/13645570600595231
Crowther, J. L., & Lloyd-Williams, M. (2012). Researching sensitive and emotive topics: The participants’ voice. Research Ethics, 8, 200-211. doi:10.1177/1747016112455887
Cseko, G., & Tremaine, W. (2013). The role of the Institutional Review Board in the oversight of the ethical aspects of human studies research. Nutrition in Clinical Practice, 28, 177-181. doi:10.1177/0884533612474042
Coupal, L. (2005). Practitioner-research and the regulation of research ethics: The challenge of individual, organizational, and social interests. Forum Qualitative Sozialforschung/Forum: Qualitative Social Research, 6(1). Retrieved from http://www.qualitative-research.net/index.php/fqs
Damianakis, T., & Woodford, M. R. (2012). Qualitative research with small connected communities generating new knowledge while upholding research
ethics. Qualitative Health Research, 22, 708-718. doi:10.1177/1049732311431444
Dekking, S. A., van der Graaf, R., & van Delden, J. J. (2014). Strengths and weaknesses of guideline approaches to safeguard voluntary informed consent of patients within a dependent relationship. BMC Medicine, 12(1). doi:10.1186/1741-7015-12-52
Das, N., & Das, S. (2014). Hiring a professional medical writer: Is it equivalent to ghostwriting? Biochemia Medica, 24(1), 19-24. doi:10.11613/BM.2014.004
Dennis, B. (2010, June). Ethical dilemmas in the field: The complex nature of doing education ethnography. Ethnography and Education, 5(2), 123-127. doi:10.1080/17457823.2010.493391
De Roubaix, J. A. (2011). Beneficence, non-maleficence, distributive justice and respect for patient autonomy-reconcilable ends in aesthetic surgery. Journal of Plastic Reconstructive & Aesthetic Surgery, 64(1), 11-16. doi:10.1016/j.bjps.2010.03.034
Deventer-Van, J. P. (2009). Ethical consideration during human centered covert and overt research. Quality and Quantity, 43, 45-57. doi:10.1007/s11135-006-9069-8
DuBois, J. M., Beskow, L., Campbell, J., Dugosh, K., Festinger, D., Hartz, S., … Lidz, C. (2012). Restoring balance: A consensus statement on the protection of vulnerable research participants. American Journal of Public Health, 102, 2220- 2225. doi:10.2105/AJPH.2012.300757
Eide, P., & Kahn, D. (2008). Ethical issues in the qualitative researcher—participant relationship. Nursing Ethics, 15(2), 199-207. doi:10.1177/0969733007086018
Elmir, R., Schmied, V., Jackson, D., & Wilkes, L. (2011). Interviewing people about potentially sensitive topics. Nurse Researcher, 19(1), 12-16. Retrieved from http://nurseresearcher.rcnpublishing.co.uk
Enama, M. E., Hu, Z., Gordon, I., Costner, P., Ledgerwood, J. E., & Grady, C. (2012, April 14). Randomization to standard and concise informed consent forms: Development of evidence-based consent practices. Contemporary Clinical Trials, 33, 895-902. doi:10.1016/j.cct.2012.04.005
Erlich, Y., & Narayanan, A. (2014). Routes for breaching and protecting genetic privacy.
Nature Reviews Genetics, 15, 409-421. doi:10.1038/nrg3723
Fein, E. C., & Kulik, C. T. (2011). Safeguarding access and safeguarding meaning as strategies for achieving confidentiality. Industrial and Organizational Psychology: Perspectives on Science and Practice, 4, 479-481. doi:10.1111/j.1754- 9434.2011.01378.x
Ferreira, R., Buttell, F., & Ferreira, S. (2015). Ethical considerations for conducting disaster research with vulnerable populations. Journal of Social Work Values and Ethics, 12, 379-384. Retrieved from http://jswve.org
Fouka, G., & Mantzorou, M. (2011). What are the major ethical issues in conducting research? Is there a conflict between the research ethics and the nature of nursing? Health Science Journal, 5, 3-14. Retrieved from http://www.hsj.gr
Freysteinson, W. M., Lewis, C., Sisk, A., Wuest, L., Deutsch, A. S., & Cesario, S. K. (2013). Investigator reflections: A final debriefing following emotionally sensitive mirror research. Holistic Nursing Practice, 27, 177-184. doi:10.1097/HNP.0b013e31828a0968
Gibson, S., Benson, O., & Brand, S. L. (2013). Talking about suicide confidentiality and anonymity in qualitative research. Nursing Ethics, 20, 18-29. doi:10.1177/0969733012452684
Goldblatt, H., Karnieli-Miller, O., & Neumann, M. (2011). Sharing qualitative research findings with participants: Study experiences of methodological and ethical dilemmas. Patient Education and Counseling, 82, 389-395. doi:10.1016/j.pec.2010.12.016
Grady, C. (2010). Do IRBs protect human research participants? Journal of the American Medical Association, 304(10), 1122-1123. doi:10.1001/jama.2010.1304
Greaney, A. M., Sheehy, A., Heffernan, C., Murphy, J., Mhaolrúnaigh, S. N., Heffernan, E., & Brown, G. (2012). Research ethics application: A guide for the novice researcher. British Journal of Nursing, 21, 38-43. Retrieved from http://info.britishjournalofnursing.com/
Greene, M. J. (2014). On the inside looking in: Methodological insights and challenges in conducting qualitative insider research. The Qualitative Report, 19(29), 1-13. Retrieved from http://nsuworks.nova.edu/tqr/vol19/iss29/3
Haahr, A., Norlyk, A., & Hall, E. (2013). Ethical challenges embedded in qualitative research interviews with close relatives. Nursing Ethics, 2(1), 6-15. doi:10.1177/0969733013486370
Hadidi, N., Lindquist, R., Treat-Jacobson, D., & Swanson, P. (2013). Participant withdrawal: Challenges and practical solutions for recruitment and retention in clinical trials. Creative Nursing, 19(1), 37-41. doi:10.1891/1078-4535.19.1.37
Hair, N., & Clark, M. (2007). The ethical dilemmas and challenges of ethnographic research in electronic communities. International Journal of Market Research, 49, 781-800. Retrieved from http://www.ijmr.com/
Halse, C., & Honey, A. (2014). Unraveling ethics: Illuminating the moral dilemmas of research ethics. Signs, 40(1). Retrieved from http://press.uchicago.edu/ucp/journals/journal/signs.html
Hammersley, M. (2014). On ethical principles for social research. International Journal of Social Research Methodology, 1-17. doi:10.1080/13645579.2014.924169
Head, E. (2009). The ethics and implications of paying participants in qualitative research. International Journal of Social Research Methodology, 12, 335-344. doi:10.1080/13645570802246724
Heggestad, A. K. T., Nortvedt, P., & Slettebo, A. (2012). The importance of moral sensitivity when including persons with dementia in qualitative research. Nursing Ethics. doi:10.1177/0969733012455564
Hegney, D., & Chan, T. W. (2010). Ethical challenges in the conduct of qualitative research. Nurse Researcher, 18(1), 4-7. Retrieved from http://nurseresearcher.rcnpublishing.co.uk
Holloway, I., Brown, L., & Shipway, R. (2010). Meaning not measurement: Using ethnography to bring a deeper understanding to the participant experience of festivals and events. International Journal of Event and Festival Management, 1(1), 74-85. doi:10.1108/17852951011029315
Houghton, C. E., Casey, D., Shaw, D., & Murphy, K. (2010). Ethical challenges in qualitative research: Examples from practice. Nurse Researcher, 18(1), 15-25. Retrieved from http://nurseresearcher.rcnpublishing.co.uk
Hoyland, S., Hollund, J. G., & Olsen, O. E. (2015). Gaining access to a research site and participants in medical and nursing research: A synthesis of accounts. Medical Education, 49, 224-232. doi:10.1111/medu.12622
Huang, X., O’Connor, M., Ke, L. S., & Lee, S. (2014). Ethical and methodological issues in qualitative health research involving children: A systematic review. Nursing Ethics. doi:10.1177/0969733014564102
Hunter, D. (2012). How not to argue against mandatory ethics review. Journal of Medical Ethics, 39, 521-524. doi:10.1136/medethics-2012-101074
Ignacio, J. J., & Taylor, B. J. (2013). Ethical issues in health-care inquiry: A discussion paper. International Journal of Nursing Practice, 18, 56-61. doi:10.1111/ijn.12017
Irwin, S. (2013). Qualitative secondary data analysis: Ethics, epistemology and context. Journal of Progress in Development Studies, 13, 295-306. doi:10.1177/1464993413490479
Jerolmack, C., & Khan, S. (2014). Talk is cheap: Ethnography and the attitudinal fallacy. Sociological Methods & Research, 43, 178-209. doi:10.1177/0049124114523396
Johnson, H., Douglas, J., Bigby, C., & Iacono, T. (2011). The challenges and benefits of using participant observation to understand the social interaction of adults with intellectual disabilities. AAC: Augmentative & Alternative Communication, 27, 267-278. doi:10.3109/07434618.2011.587831
Journot, V., Perusat-Villetorte, S., Bouyssou, C., Couffin-Cadiergues, S., Tall, A., Fagard, C., . . . Chene, G. (2013). Preserving participant anonymity during remote preenrollment consent form checking. Clinical Trials, 10, 460-462. doi:10.1177/1740774513480962
Juritzen, T. I., Grimen, H., & Heggen, K. (2011). Protecting vulnerable research participants: A Foucault-inspired analysis of ethics committees. Nursing Ethics, 18, 640-650. doi:10.1177/0969733011403807
Juros, T. (2011). Reporting on the issues of research rigour and ethics: The case of publications using qualitative methods in the Croatian social science journals. Revija Za Sociologiju, 41, 161-184. doi:10.5613/rzs.41.2.2
Kapoulas, A., & Mitic, M. (2012). Understanding challenges of qualitative research: Rhetorical issues and reality traps. Qualitative Market Research, 15, 354-368. doi:10.1108/13522751211257051
Kapp, M. B. (2006). Ethical and legal issues in research involving human subjects: Do you want a piece of me? Journal of Clinical Pathology, 59, 335-339. doi:10.1136/jcp.2005.030957
Kasim, A., & Al-Gahuri, H. A. (2015). Overcoming challenges in qualitative inquiry within a conservative society. Tourism Management, 50, 124-129. doi:10.1016/j.tourman.2015.01.004
Kelley, A., Belcourt-Dittloff, A., Belcourt, C., & Belcourt, G. (2013). Research ethics and indigenous communities. American Journal of Public Health. doi:10.2105/AJPH.2012.301522
Kelty, S. F., Julian, R., & Ross, A. (2013). Dismantling the justice silos: Avoiding the pitfalls and reaping the benefits of information-sharing between forensic science, medicine, and law. Forensic Science International, 230(1), 8-15. doi:10.1016/j.forsciint.2012.10.032
Kiguba, R., Kutyabami, P., Kiwuwa, S., Katabira, E., & Sewankambo, N. K. (2012). Assessing the quality of informed consent in a resource limited setting: A cross-sectional study. BMC Medical Ethics, 13(1), 21-27. doi:10.1186/1472-6939-13-21
King, T., Brankovic, L., & Gillard, P. (2012). Perspectives of Australian adults about protecting the privacy of their health information in statistical databases. International Journal of Medical Informatics, 81, 279-289. doi:10.1016/j.ijmedinf.2012.01.005
Kingsley, J., Phillips, R., Townsend, M., & Henderson-Wilson, C. (2010). Using a qualitative approach to research to build trust between a non-aboriginal researcher and aboriginal participants (Australia). Qualitative Research Journal, 10(1), 2-12. doi:10.3316/QRJ1001002
Klotz, A. C., Da Motta Veiga, S. P., Buckley, M. R., & Gavin, M. B. (2013). The role of trustworthiness in recruitment and selection: A review and guide for future research. Journal of Organizational Behavior, 34(Suppl 1), S104-S119. doi:10.1002/job.1891
Knepp, M. M. (2014). Personality, sex of participant, and face-to-face interaction affect reading of informed consent forms. Psychological Reports. doi:10.2466/17.07.PR0.114k13w1
Komesaroff, P. A. (2012). Comment on 'A research participant's rights as an ethical dilemma'. Australian & New Zealand Journal of Public Health, 36, 511. doi:10.1111/j.1753-6405.2012.00945.x
Kumar, N. K. (2013). Informed consent: Past and present. Perspectives in Clinical Research, 4(1), 21-25. doi:10.4103/2229-3485.106372
Lange, M., Rogers, W., & Dodds, S. (2013). Vulnerability in research ethics: A way forward. Bioethics, 27, 333-340. doi:10.1111/bioe.12032
Lavis, V. (2010). Multiple researcher identities: Highlighting tensions and implications for ethical practice in qualitative interviewing. Qualitative Research in Psychology, 7, 316-331. doi:10.1080/14780880902929506
Lipscomb, M. (2010). Participant overexposure and the role of researcher judgment. Nurse Researcher, 17(4), 49-59. Retrieved from http://nurseresearcher.rcnpublishing.co.uk
Lopez-Dicastillo, O., & Belintxon, M. (2014, May 15). The challenges of participant observations of cultural encounters within an ethnographic study. Procedia - Social and Behavioral Sciences, 132, 522-526. doi:10.1016/j.sbspro.2014.04.347
Lunnay, B., Borlagdan, J., McNaughton, D., & Ward, P. (2015). Ethical use of social media to facilitate qualitative research. Qualitative Health Research, 25, 99-109. doi:10.1177/1049732314549031
Lynch, H. F. (2013). Human research subjects as human research workers. Yale Journal of Health Policy, Law & Ethics, 14, 122-193. Retrieved from http://www.yale.edu/yjhple/
Malone, H., Nicholl, H., & Tracey, C. (2014). Awareness and minimization of systematic bias in research. British Journal of Nursing, 23, 279-282. Retrieved from http://www.britishjournalofnursing.com/
Mamo, L., & Fishman, J. R. (2013). Why justice? Introduction to the special issue on entanglements of science, ethics, and justice. Science, Technology and Human Values, 38, 159-175. doi:10.1177/0162243912473162
McDonald, K. E., Kidney, C. A., & Patka, M. (2013). ‘You need to let your voice be heard’: Research participants' views on research. Journal of Intellectual Disability Research, 57, 216-225. doi:10.1111/j.1365-2788.2011.01527.x
McRae, A. D., Bennett, C., Brown, J. B., Weijer, C., Boruch, R., Brehaut, J., . . . Taljaard, M. (2013). Researchers' perceptions of ethical challenges in cluster randomized trials: A qualitative analysis. Trials, 14(1), 1-7. doi:10.1186/1745-6215-14-1
Mealer, M., & Jones, J. (2014). Methodological and ethical issues related to qualitative telephone interviews on sensitive topics. Nurse Researcher, 21, 32-37. Retrieved from http://rcnpublishing.com/journal/nr
Menikoff, J. (2010). The paradoxical problem with multiple-IRB review. The New England Journal of Medicine, 363, 1591-1593. doi:10.1056/NEJMp1005101
Mero-Jaffe, I. (2011). ‘Is that what I said?’ Interview transcript approval by participants: An aspect of ethics in qualitative research. International Journal of Qualitative Methods, 10, 231-247. Retrieved from http://ejournals.library.ualberta.ca/index.php/IJQM/article/view/8449
Michalos, A. (2013). The business case for asserting the business case for business ethics. Journal of Business Ethics, 114, 599-606. doi:10.1007/s10551-013-1706-2
Mikesell, L., Bromley, E., & Khodyakov, D. (2013). Ethical community-engaged research: A literature review. American Journal of Public Health, 103(12), e7-e14. doi:10.2105/AJPH.2012.301605
Morse, J. M., & Coulehan, J. (2015). Maintaining confidentiality in qualitative publications. Qualitative Health Research, 25, 151-152. doi:10.1177/1049732314563489
Neusar, A. (2014). To trust or not to trust? Interpretations in qualitative research. Human Affairs, 24(2), 178-188. doi:10.2478/s13374-014-0218-9
Newman, P., & Tufford, L. (2012). Bracketing in qualitative research. Qualitative Social Work, 11(1), 80-96. doi:10.1177/1473325010368316
Nind, M., Wiles, R., Bengry-Howell, A., & Crow, G. (2013). Methodological innovation and research ethics: Forces in tension or forces in harmony? Qualitative Research, 13, 650-667. doi:10.1177/1468794112455042
Oliver, J., & Eales, K. (2008). Research ethics: Re-evaluating the consequentialist perspective of using covert participant observation in management research. Qualitative Market Research: An International Journal, 11, 344-357. doi:10.1108/13522750810879057
Orb, A., Eisenhauer, L., & Wynaden, D. (2001). Ethics in qualitative research. Journal of Nursing Scholarship, 33, 93-96. doi:10.1111/j.1547-5069.2001.00093.x
Owen, J. R., & Kemp, D. (2014). ‘Free prior and informed consent’, social complexity and the mining industry: Establishing a knowledge base. Resources Policy, 41, 91-100. doi:10.1016/j.resourpol.2014.03.006
Paechter, C. (2013). Researching sensitive issues online: Implications of a hybrid insider/outsider position in a retrospective ethnographic study. Qualitative Research, 13(1), 71-86. doi:10.1177/1468794112446107
Plankey-Videla, N. (2012). Informed consent as process: Problematizing informed consent in organizational ethnographies. Qualitative Sociology, 35(1), 1-21. doi:10.1007/s11133-011-9212-2
Pletcher, M. J., Lo, B., & Grady, D. (2015). Criteria for waiver of informed consent for quality improvement research-reply. JAMA Internal Medicine, 175, 143-143. doi:10.1001/jamainternmed.2014.6997
Pollock, K. (2012). Procedure versus process: Ethical paradigms and the conduct of qualitative research. BMC Medical Ethics, 13(1), 25-31. doi:10.1186/1472-6939-13-25
Pyer, M., & Campbell, J. (2012). Qualitative researching with vulnerable groups. International Journal of Therapy and Rehabilitation, 19, 311-316. Retrieved from http://www.ijtr.co/uk
Resnik, D. B. (2011). Scientific research and the public trust. Science and Engineering Ethics, 17, 399-409. doi:10.1007/s11948-010-9210-x
Resnick, B. (2014). Publishing a DNP capstone: After the where, what and how. The ethics and process of manuscript submission. Geriatric Nursing, 35(2), 91-92. doi:10.1016/j.gerinurse.2013.11.009
Rhodes, K. V., & Miller, F. G. (2012). Simulated patient studies: An ethical analysis. Milbank Quarterly, 90, 706-724. doi:10.1111/j.1468-0009.2012.00680.x
Roberts, J. (2009). An author’s guide to publication ethics: A review of emerging standards in biomedical journals. Headache, 49, 579-589. Retrieved from http://www.headachejournal.org
Sanjari, M., Bahramnezhad, F., Fomani, F. K., Shoghi, M., & Cheraghi, M. A. (2014). Ethical challenges of researchers in qualitative studies: The necessity to develop a specific guideline. Journal of Medical Ethics and History of Medicine, 8, 7-14. Retrieved from http://jmehm.tums.ac.ir/index.php/jmehm
Sherry, E. (2013). The vulnerable researcher: Facing the challenges of sensitive research. Qualitative Research Journal, 13, 278-288. doi:10.1108/QRJ-10-2012-0007
Sikes, P., & Piper, H. (2010). Ethical research, academic freedom and the role of ethics committees and review procedures in educational research. International Journal of Research & Method in Education, 33, 205-213. doi:10.1080/1743727X.2010.511838
Simundic, A. (2013). Bias in research. Biochemia Medica, 23(1), 12-15. doi:10.11613/BM.2013.003
Sim, J. (2010). Addressing conflicts in research ethics: Consent and risk of harm. Physiotherapy Research International, 15(2), 80-87. doi:10.1002/pri.481
Sims, J. M. (2010). A brief review of the Belmont report. Dimensions of Critical Care Nursing, 29, 173-174. doi:10.1097/DCC.0b013e3181de9ec5
Stern, S., & Lemmens, T. (2011). Legal remedies for medical ghostwriting: Imposing fraud liability on guest authors of ghostwritten articles. PLoS Medicine, 8(8), e1001070. doi:10.1371/journal.pmed.1001070
Stretton, S. (2014). Systematic review on the primary and secondary reporting of the prevalence of ghostwriting in the medical literature. BMJ Open, 4(7), 1-11. doi:10.1136/bmjopen-2013-004777
Tam, N., Huy, N., Thoa, L., Long, N., Trang, N., Hirayama, K., & Karbwang, J. (2015). Participants’ understanding of informed consent in clinical trials over three decades: Systematic review and meta-analysis. Bulletin of the World Health Organization, 93(3), 186-198. doi:10.2471/BLT.14.141390
Tang, Q. (2015). From ephemerizer to timed-ephemerizer: Achieve assured lifecycle enforcement for sensitive data. The Computer Journal, 58, 1003-1020. doi:10.1093/comjnl/bxu030
Taylor, S., & Land, C. (2014). Organizational anonymity and the negotiation of research access. Qualitative Research in Organizations and Management, 9(2), 98-109. doi:10.1108/QROM-10-2012-1104
Thrope, A. S. (2014). Doing the right thing or doing the thing right: Implications of participant withdrawal. Organizational Research Methods, 17, 255-277. doi:10.1177/1094428114524828
Trier-Bieniek, A. (2012). Framing the telephone interview as a participant-centered tool for qualitative research: A methodological discussion. Qualitative Research, 12, 630-644. doi:10.1177/1468794112439005
Tilley, L. & Woodthorpe, K. (2011). Is it the end for anonymity as we know it? A critical examination of the ethical principle of anonymity in the context of 21st century demands on the qualitative researcher. Qualitative Research, 11(2), 197-212. doi:10.1177/1468794110394073
Tomkinson, S. (2015). Doing field work on state organizations in democratic settings: Ethical issues of research in refugee decision making. Forum: Qualitative Social Research, 16(1), 144-166. Retrieved from http://www.qualitative-research.net/index.php/fqs
Tuchman, G. (2011). Ethical imperialism: Institutional review boards and the social sciences, 1965-2009. Contemporary Sociology, 40, 617-619. doi:10.1177/0094306111419111mm
Udo-Akang, D. (2013). Ethical orientation for new and prospective researchers. American International Journal of Social Sciences, 2(1), 54-64. Retrieved from http://www.aijssnet.com
Unluer, S. (2012). Being an insider researcher while conducting case study research. The Qualitative Report, 17, 58. Retrieved from http://www.nova.edu/ssss/QR/BackIssues/index.html
Valandra, V. (2012). Reflexivity and professional use of self in research: A doctoral student’s journey. Journal of Ethnographic and Qualitative Research, 6(4), 204-220. Retrieved from http://www.jeqr.org/
VanderWalde, A., & Kurzban, S. (2011). Paying human subjects in research: Where are we, how did we get here, and now what? Journal of Law, Medicine, & Ethics, 39, 543-558. doi:10.1111/j.1748-720X.2011.00621
Wainwright, D., & Sambrook, S. (2010). The ethics of data collection: Unintended consequences? Journal of Health Organization and Management, 24, 277-287. doi:10.1108/14777261011054617
Wallace, M., & Sheldon, N. (2014, February). Business research ethics: Participant observer perspectives. Journal of Business Ethics, 1-11. doi:10.1007/s10551-014-2102-2
Wester, K. L. (2011). Publishing ethical research: A step-by-step overview. Journal of Counseling & Development, 89, 301-307. Retrieved from http://www.counseling.org
White, J., & Fitzgerald, T. (2010). Researcher tales and research ethics: The spaces in which we find ourselves. International Journal of Research & Method in Education, 33, 273-285. doi:10.1080/1743727X.2010.511711
Wolf, L. (2010). The research ethics committee is not the enemy: Oversight of community-based participatory research. Journal of Empirical Research on Human Research Ethics, 5, 77-86. doi:10.1525/jer.2010.5.4.77
Wolf, L. E., Dame, L. A., Patel, M. J., Williams, B. A., Austin, J. A., & Beskow, L. M. (2012). Certificates of confidentiality: Legal counsels’ experiences with and perspectives on legal demands for research data. Journal of Empirical Research on Human Research Ethics: JERHRE, 7(4), 1-9. doi:10.1525/jer.2012.7.4.1
Zimmerman, E., & Racine, E. (2012). Ethical issues in the translation of social neuroscience: A policy analysis of current guidelines for public dialogue in human research. Accountability in Research: Policies & Quality Assurance, 19, 27-46. doi:10.1080/08989621.2012.650949
Zuraw, R. (2013). Consenting in the dark: Choose your own deception. The American Journal of Bioethics, 13(11), 57-59. doi:10.1080/15265161.2013.840016
Ethnography Sources
Ager, D. L. (2011). The emotional impact and behavioral consequences of post M & A integration: An ethnographic case study in the software industry. Journal of Contemporary Ethnography, 40, 199-230. doi:10.1177/0891241610387134
Alcadipani, R., & Hodgson, D. (2009). By any means necessary? Ethnographic access, ethics and the critical researcher. Tamara: Journal of Critical Organization Inquiry, 7(4), 127-128. Retrieved from http://tamarajournal.com/
Alfonso, M., Nickelson, L., & Cohen, D. (2012). Farmers’ markets in rural communities: A case study. American Journal of Health Education, 43, 143-151. Retrieved from http://www.aahperd.org/aahe/publications/ajhe/
Barbour, A. (2010, June). Exploring some ethical dilemmas and obligations of the ethnographer. Ethnography and Education, 5, 159-173. doi:10.1080/17457823.2010.493399
Boddy, C. R. (2011). “Hanging around with people”: Ethnography in marketing research and intelligence gathering. Marketing Review, 11, 151-163. doi:10.1362/146934711X589381
Bridges, J., Nicholson, C., Maben, J., Pope, C., Flatley, M., Wilkinson, C., & Tziggili, M. (2013). Capacity for care: Meta-ethnography of acute care nurses' experiences of the nurse-patient relationship. Journal of Advanced Nursing, 69, 760-772. doi:10.1111/jan.12050
Butvilas, T., & Zygmantas, J. (2011). An ethnographic case study in educational research. Acta Paedagogica Vilnensia, 27, 33-42. Retrieved from http://www.leidykla.eu/index.php?id=36
Campbell-Reed, E. R., & Scharen, C. (2013). Ethnography on holy ground: How qualitative interviewing is practical theological work. International Journal of Practical Theology, 17, 232-259. doi:10.1515/ijpt-2013-0015
Chikweche, T., & Fletcher, R. (2012). Undertaking research at the bottom of the pyramid using qualitative methods. Qualitative Market Research: An International Journal, 15, 242-267. doi:10.1108/13522751211231978
Cincotta, D. (2015). An ethnography: An inquiry into agency alignment meetings. Journal of Business Studies, 7(1), 95-106. Retrieved from http://alliedacademies.org/Public/Default.aspx
Cramer, H., Shaw, A., Wye, L., & Weiss, M. (2010). Over-the-counter advice seeking about complementary and alternative medicines (CAM) in community pharmacies and health shops: An ethnographic study. Health & Social Care in the Community, 18(1), 41-50. doi:10.1111/j.1365-2524.2009.00877.x
Cruz, E. V., & Higginbottom, G. (2013). The use of focused ethnography in nursing research. Nurse Researcher, 20(4), 36-43. doi:10.7748/nr2013.03.20.4.36.e305
Dennis, B. (2010, June). Ethical dilemmas in the field: the complex nature of doing education ethnography. Ethnography and Education, 5(2), 123-127. doi:10.1080/17457823.2010.493391
Doloriert, C., & Sambrook, S. (2012). Organisational autoethnography. Journal of Organizational Ethnography, 1, 83-95. doi:10.1108/20466741211220688
Dominguez, D., Beaulieu, A., Estalella, A., Gomez, E., Schnettler, B., & Read, R. (2007). Virtual ethnography. Forum: Qualitative Social Research, 8. Retrieved from http://www.qualitative-research.net/index.php/fqs/article/view/274/601
Dowden, A. R., Gunby, J. D., Warren, J. M., & Boston, Q. (2014). A phenomenological analysis of invisibility among African-American males: implications for clinical practice and client retention. The Professional Counsellor, 4, 58-70. doi:10.15241/ard.4.1.58
Down, S. (2012). A historiographical account of workplace and organizational ethnography. Journal of Organizational Ethnography, 1, 72-82. doi:10.1108/20466741211220679
Fields, D. A., & Kafai, Y. B. (2009). A connective ethnography of peer knowledge sharing and diffusion in a tween virtual world. Computer Supported Collaborative Learning, 4(1), 47-69. doi:10.1007/s11412-008-9057-1
Fitzgerald, J. L. (2009). Mapping the experience of drug dealing risk environments: An ethnographic case study. International Journal of Drug Policy, 20, 261-269. doi:10.1016/j.drugpo.2008.10.002
Forsey, M. G. (2010). Ethnography as participant listening. Ethnography, 11, 558-572. doi:10.1177/1466138110372587
Freeman, L., & Spanjaard, D. (2012). Bridging the gap: The case for expanding ethnographic techniques in the marketing research curriculum. Journal of Marketing Education, 34, 238-250. doi:10.1177/0273475312455334
Garcia, A. C., Standlee, A. I., Bechkoff, J., & Cui, Y. (2009). Ethnographic approaches to the internet and computer-mediated communication. Journal of Contemporary Ethnography, 38(1), 52-84. doi:10.1177/0891241607310839
Gibson, J. W. (2010). A winning combination for business researchers: A review of qualitative methods in business research. The Qualitative Report, 15, 1012-1015. Retrieved from http://www.nova.edu/ssss/QR/QR15-4/eriksson
Goodson, L., & Vassar, M. (2011). An overview of ethnography in healthcare and medical education research. Journal of Educational Evaluation for Health Professions, 8(4). doi:10.3352/jeehp.2011.8.4
Granot, E., Brashear, T. G., & Motta, P. C. (2012). A structural guide to in-depth interviewing in business and industrial marketing research. The Journal of Business & Industrial Marketing, 27, 547-553. doi:10.1108/08858621211257310
Hair, N., & Clark, M. (2007). The ethical dilemmas and challenges of ethnographic research in electronic communities. International Journal of Market Research, 49, 781-800. Retrieved from http://www.ijmr.com/
Hampshire, K. (2014). The interview as narrative ethnography: Seeking and shaping connections in qualitative research. International Journal of Social Research Methodology, 17(3), 215-231. doi:10.1080/13645579.2012.729405
Hays, D. G., & Wood, C. (2011). Infusing qualitative traditions in counseling research designs. Journal of Counseling & Development, 89, 288-295. doi:10.1002/j.1556-6678.2011.tb00091.x
Holloway, I., Brown, L., & Shipway, R. (2010). Meaning not measurement: Using ethnography to bring a deeper understanding to the participant experience of festivals and events. International Journal of Event and Festival Management, 1(1), 74-85. doi:10.1108/17852951011029315
Horst, H., Hjorth, L., & Tacchi, J. (2012). Rethinking ethnography: An introduction. Media International Australia, Incorporating Culture and Policy, 86-93. Retrieved from http://apo.org.au/research/rethinking-ethnography-introduction
Hoskins, M. L., & White, J. (2013). Relational inquiries and the research interview: Mentoring future researchers. Qualitative Inquiry, 19, 179-188. doi:10.1177/1077800412466224
Huby, G., Harries, J., & Grant, S. (2011). Contributions of ethnography to the study of public services management. Public Management Review, 13, 209-225. doi:10.1080/14719037.2010.532969
Jacob, S. A., & Furgerson, S. (2012). Writing interview protocols and conducting interviews: Tips for students new to the field of qualitative research. Qualitative Report, 17, 1-10. Retrieved from http://www.nova.edu/ssss/QR/QR17/jacob
Jerolmack, C., & Khan, S. (2014). Talk is cheap: Ethnography and the attitudinal fallacy. Sociological Methods & Research, 43, 178-209. doi:10.1177/0049124114523396
Johnson, B. C., Dunlap, E., & Benoit, E. (2010). Organizing mountains of words for data analysis, both qualitative and quantitative. Substance Use & Misuse, 45, 648-670. doi:10.3109/10826081003594757
Kisely, S., & Kendall, E. (2011). Critically appraising qualitative research: A guide for clinicians more familiar with quantitative techniques. Australasian Psychiatry, 19, 364-367. doi:10.3109/10398562.2011.562508
Klitmøller, A., & Lauring, J. (2013). When global virtual teams share knowledge: Media richness, cultural difference and language commonality. Journal of World Business, 48, 398-406. doi:10.1016/j.jwb.2012.07.023
Küster, I., & Vila, N. (2011). Successful SME web design through consumer focus groups. International Journal of Quality & Reliability Management, 28(2), 132-154. doi:10.1108/02656711111101728
Lahlou, S. (2011). How can we capture the subject’s perspective? An evidence-based approach for the social scientist. Social Science Information, 50, 607-655. doi:10.1177/0539018411411033
Lambert, V., Glacken, M., & McCarron, M. (2011). Employing an ethnographic approach: Key characteristics. Nurse Researcher, 19(1), 17-24. Retrieved from http://nursingstandard.rcnpublishing.co.uk
Lewis, S. J., & Russell, A. J. (2011). Being embedded: A way forward for ethnographic research. Ethnography, 12, 398-416. doi:10.1177/1466138110393786
Lopez-Dicastillo, O., & Belintxon, M. (2014, May 15). The challenges of participant observations of cultural encounters within an ethnographic study. Procedia - Social and Behavioral Sciences, 132, 522-526. doi:10.1016/j.sbspro.2014.04.347
Luo, H. (2011). Qualitative research on educational technology: Philosophies, methods and challenges. International Journal of Education, 3(2), 1-16. doi:10.5296/ije.v3i2.857
McCaslin, M. L., & Scott, K. W. (2003). The five-question method for framing a qualitative research study. The Qualitative Report, 8, 447-461. Retrieved from http://www.nova.edu/ssss/QR/QR8-3/mcaslin
Mannay, D., & Morgan, M. (2015). Doing ethnography or applying a qualitative technique? Reflections from the ‘waiting field’. Qualitative Research, 15(2), 166-182. doi:10.1177/1468794113517391
Mears, A. (2013). Ethnography as precarious work. The Sociological Quarterly, 54, 20-34. doi:10.1111/tsq.12005
Mendez, C. (2009). Anthropology and ethnography: Contributions to integrated marketing communications. Marketing Intelligence & Planning, 27, 633-648. doi:10.1108/02634500910977863
Murthy, D. (2013). Ethnographic research 2.0. Journal of Organizational Ethnography, 2(1), 23-36. doi:10.1108/JOE-01-2012-0008
Mutchler, M. G., McKay, T., McDavitt, B., & Gordon, K. K. (2013). Using peer ethnography to address health disparities among young urban Black and Latino men who have sex with men. American Journal of Public Health, 103, 849-852. doi:10.2105/AJPH.2012.300988
O'Connor, S. J. (2011). Context is everything: The role of auto-ethnography, reflexivity, and self-critique in establishing the credibility of qualitative research findings. European Journal of Cancer Care, 20, 421-423. doi:10.1111/j.1365-2354.2011.01261.x
Ojha, A. K., & Holmes, T. L. (2010). Don’t tease me, I’m working: Examining humor in a Midwestern organization using ethnography of communication. The Qualitative Report, 15, 279-300. Retrieved from http://www.nova.edu/ssss/QR/QR15-2/ojha
Petty, N. J., Thomson, O. P., & Stew, G. (2012). Ready for a paradigm shift? Part 2: Introducing qualitative research methodologies and methods. Manual Therapy, 17, 378-384. doi:10.1016/j.math.2012.03.004
Phelps, A., & Horman, M. (2010). Ethnographic theory-building research in construction. Journal of Construction Engineering & Management, 136(1), 58-65. doi:10.1061/(ASCE)CO.1943-7862.0000104
Prior, D. D., & Miller, L. M. (2012). Webethnography. International Journal of Market Research, 54, 503-520. doi:10.2501/IJMR-54-4-503-520
Pritchard, K. (2011). From “being there” to “being [. . . ] where?”: Relocating ethnography. Qualitative Research in Organizations and Management: An International Journal, 6, 230-245. doi:10.1108/17465641111188402
Robillard, C. (2010). The gendered experience of stigmatization in severe and persistent mental illness in Lima, Peru. Social Science & Medicine, 71, 2178-2186. doi:10.1016/j.socscimed.2010.10.004
Robinson, S. G. (2013). The relevancy of ethnography to nursing research. Nursing Science Quarterly, 26, 14-19. doi:10.1177/0894318412466742
Ronald, R. (2011). Ethnography and comparative housing research. International Journal of Housing Policy, 11, 415-437. doi:10.1080/14616718.2011.626605
Sandall, J. (2010). Normal birth, magical birth: The role of the 36-week birth talk in caseload midwifery practice. Midwifery, 26, 211-221. doi:10.1016/j.midw.2008.007.002
Sangasubana, N. (2011). How to conduct ethnographic research. The Qualitative Report, 16, 567-573. Retrieved from http://www.nova.edu/ssss/QR/QR16-2/sangasubana
Shover, N. (2012). Ethnographic methods in criminological research: Rationale, reprise, and warning. American Journal of Criminal Justice, 37, 139-145. doi:10.1007/s12103-012-9160-8
Smyth, J., & McInerney, J. (2013). Whose side are you on? Advocacy ethnography: some methodological aspects of narrative portraits of disadvantaged young people, in socially critical research. International Journal of Qualitative Studies in Education, 26, 1-20. doi:10.1080/09518398.2011.604649
Storesund, A., & McMurray, A. (2009). Quality of practice in an intensive care unit (ICU): A mini-ethnographic case study. Intensive and Critical Care Nursing, 25(3), 120-127. doi:10.1016/j.iccn.2009.02.001
Swinghurst, D., Greenhalgh, T., Russell, J., & Myall, M. (2011). Receptionist input to quality and safety in repeat prescribing in UK general practice: Ethnographic case study. British Medical Journal, 343(7831), 1-11. doi:10.1136/bmj.d6788
Symons, J., & Maggio, R. (2014). 'Based on a true story': Ethnography's impact as a narrative form. Journal of Comparative Research in Anthropology and Sociology, 5(2), 1-6. Retrieved from http://compaso.ro
Taber, N. (2010). Institutional ethnography, autoethnography, and narrative: An argument for incorporating multiple methodologies. Qualitative Research, 10, 5-25. doi:10.1177/1468794109348680
Talmy, S. (2010). Qualitative interviews in applied linguistics: From research instrument to social practice. Annual Review of Applied Linguistics, 30, 128-148. doi:10.1017/S0267190510000085
Thierbach, C., & Lorenz, A. (2014). Exploring the orientation in space. Mixing focused ethnography and surveys in social experiment. Historical Social Research, 39(2), 137-166. doi:10.12759/hsr.39.2014.2.137-166
Van Maanen, J. (2006). Ethnography then and now. Qualitative Research in Organizations and Management: An International Journal, 1, 13-21. doi:10.1108/17465640610666615
Van Maanen, J. (2010). Ethnography as work: Some rules of engagement. Journal of Management Studies, 48, 218-234. doi:10.1111/j.1467-6486.2010.00980.x
Visconti, L. M. (2010). Ethnographic case study (ECS): Abductive modeling of ethnography and improving the relevance in business marketing research. Industrial Marketing Management, 30(1), 25-39. doi:10.1016/j.indmarman.2008.04.019
Wainwright, D., & Sambrook, S. (2010). The ethics of data collection: Unintended consequences? Journal of Health Organization and Management, 24, 277-287. doi:10.1108/14777261011054617
Watson, T. J. (2011). Ethnography, reality and truth: The vital need for studies of how things work. Journal of Management Studies, 48, 202-217. doi:10.1111/j.1467-6486.2011.01015.x
Watson, T. J. (2012). Making organizational ethnography. Journal of Organizational Ethnography, 1(1), 15-22. doi:10.1108/20466741211220615
White, J., Drew, S., & Hay, T. (2009). Ethnography versus case study: Positioning research and researchers. Qualitative Research Journal, 9(1), 18-27. doi:10.3316/QRJ0901018
Wilson, W. J., & Chaddha, A. (2009). The role of theory in ethnographic research. Ethnography, 10, 549-564. doi:10.1177/1466138109347009
Wolcott, H. F. (2005). The art of fieldwork. Walnut Creek, CA: Altamira Press.
Wolcott, H. F. (2004). The ethnographic autobiography. Auto/Biography, 12, 93-106. doi:10.1191/0967550704ab004oa
Wolcott, H. F. (2010). Ethnography lessons: A primer. Walnut Creek, CA: Left Coast Press.
Wolcott, H. F. (1994). Transforming qualitative data: Description, analysis, and interpretation. Thousand Oaks: Sage.
Wolcott, H. F. (2009). Writing up qualitative research. Thousand Oaks: Sage Publications.
Wolfinger, N. H. (2002). On writing fieldnotes: Collection strategies and background expectancies. Qualitative Research, 2, 85-95. doi:10.1177/1468794102002001640
Yanow, D. (2012). Organizational ethnography between toolbox and world-making. Journal of Organizational Ethnography, 1(1), 31-42. doi:10.1108/20466741211220633
Zhou, D., & Sun, X. (2010). Group differences among Nongmingong: A follow-up ethnographic case study. International Journal of Business Anthropology, 1(1), 79-94. Retrieved from http://www.na-businesspress.com/ijbaopen.html
Zhu, Y., & Bargiela-Chiappini, F. (2013). Balancing emic and etic: Situated learning and ethnography of communication in cross-cultural management education. Academy of Management Learning & Education, 12, 380-395. doi:10.5465/amle.2012.0221
Zilber, T. B. (2014). Beyond a single organization: Challenges and opportunities in doing field level ethnography. Journal of Organizational Ethnography, 3(1), 96-113. doi:10.1108/JOE-11-2012-0043
Focus Groups
Bill, F., & Olaison, L. (2009). The indirect approach of semi-focused groups: Expanding focus group research through role-playing. Qualitative Research in Organizations and Management, 4(1), 7-26. doi:10.1108/17465640910951426
Boateng, W. (2012). Evaluating the efficacy of focus group discussion (FGD) in qualitative social research. International Journal of Business and Social Science, 7(3), 54-57. Retrieved from http://www.ijbssnet.com/journals
Bristol, T., & Fern, E. F. (2003). The effects of interaction on consumers’ attitudes in focus groups. Psychology and Marketing, 20, 433-454. doi:10.1002/mar.10080
Bruggen, E. (2009). A critical comparison of offline focus groups, online focus groups and e-Delphi. International Journal of Market Research, 51, 363-381. Retrieved from http://www.warc.com
Bussières, A. E., Patey, A. M., Francis, J. J., Sales, A. E., & Grimshaw, J. M. (2012). Identifying factors likely to influence compliance with diagnostic imaging guideline recommendations for spine disorders among chiropractors in North America: A focus group study using the theoretical domains framework. Implementation Science, 7, 82-93. doi:10.1186/1748-5908-7-82
Carlsen, B., & Glenton, C. (2011). What about N? A methodological study of sample size reporting in focus group studies. BMC Medical Research Methodology, 11(1), 26-35. doi:10.1186/1471-2288-11-26
Chikweche, T., & Fletcher, R. (2012). Undertaking research at the bottom of the pyramid using qualitative methods. Qualitative Market Research: An International Journal, 15, 242-267. doi:10.1108/13522751211231978
Coenen, M., Stamm, T. A., Stucki, G., & Cieza, A. (2012). Individual interviews and focus groups in patients with rheumatoid arthritis: A comparison of two qualitative methods. Quality of Life Research, 21, 359-370. doi:10.1007/s11136-011-9943-2
Coule, T. (2013). Theories of knowledge and focus groups in organization and management research. Qualitative Research in Organizations and Management, 8(2), 148-162. doi:10.1108/QROM-09-2011-1006
Dilshad, R. M., & Latif, M. I. (2013). Focus group interview as a tool for qualitative research: An analysis. Pakistan Journal of Social Sciences, 33(1), 191-198. Retrieved from http://www.bzu.edu.pk/
Doody, O., Slevin, E., & Taggart, L. (2013). Focus group interviews. Part 3: Analysis. British Journal of Nursing, 22, 266-269. Retrieved from http://www.internurse.com/cgi-bin/go.pl/library/article.cgi?uid=97512;article=BJN_22_5_266_269
Freeman, L., & Spanjaard, D. (2012). Bridging the gap: The case for expanding ethnographic techniques in the marketing research curriculum. Journal of Marketing Education, 34, 238-250. doi:10.1177/0273475312455334
Gany, F. M., Gill, P. P., Ahmed, A., Acharya, S., & Leng, J. (2013). “Every disease… man can get can start in this cab”: Focus groups to identify south Asian taxi drivers’ knowledge, attitudes and beliefs about cardiovascular disease and its risks. Journal of Immigrant and Minority Health, 15, 986-992. doi:10.1007/s10903-012-9682-7
George, M. (2013, June 20). Teaching focus group interviewing: Benefits and challenges. Teaching Sociology, 41, 257-270. doi:10.1177/0092055X12465295
Gill, P., Stewart, K., Treasure, E., & Chadwick, B. (2008). Methods of data collection in qualitative research: Interviews and focus groups. British Dental Journal, 204, 291-295. doi:10.1038/bdj.2008.192
Jarvinen, M., & Demant, J. (2011). The normalization of Cannabis use among young people: Symbolic boundary work in focus groups. Health, Risk, & Society, 13, 165-182. doi:10.1080/13698575.2011.556184
Jayawardana, A., & O’Donnell, M. (2009). Devolution, job enrichment and workplace performance in Sri Lanka’s garment industry. The Economic and Labour Relations Review, 19(2), 107-122. Retrieved from http://www.austlii.edu.au/au/journals/ELRRev/
Kehoe, W., & Lindgren, J. (2003). Focus groups in global marketing: Concept, methodology and implications. The Marketing Management Journal, 13(2), 14-28. Retrieved from http://www.mmaglobal.org/publications.html
Kitchen, M. C. (2013). Methods in focus group interviews in cross-cultural settings. Qualitative Research Journal, 13, 265-277. doi:10.1108/QRJ-2013-0005
Koskan, A. M., Rice, J., Gwede, C. K., Meade, C. D., Sehovic, I., & Quinn, G. P. (2014). Advantages, disadvantages, and lessons learned in conducting telephone focus groups to discuss biospecimen research concerns of individuals genetically at risk for cancer. The Qualitative Report, 19(10), 1-8. Retrieved from http://www.nova.edu/ssss/QR/QR19/koskan10
Küster, I., & Vila, N. (2011). Successful SME web design through consumer focus groups. International Journal of Quality & Reliability Management, 28(2), 132-154. doi:10.1108/02656711111101728
Lowery, D. R., & Morse, W. C. (2013). A qualitative method for collecting spatial data on important places for recreation, livelihoods, and ecological meanings: Integrating focus groups with public participation geographic information systems. Society & Natural Resources, 26, 1422-1437. doi:10.1080/08941920.2013.819954
Mangioni, V., & Mckerchar, M. (2013). Strengthening the validity and reliability of the focus group as a method in tax research. eJournal of Tax Research, 11(2), 176-190. Retrieved from https://www.business.unsw.edu.au/research/publications/atax-journal
Massey, O. T. (2011). A proposed model for the analysis and interpretation of focus groups in evaluation research. Evaluation and Program Planning, 34(1), 21-28. doi:10.1016/j.evalprogplan.2010.06.003
Nepomuceno, M., & Porto, J. (2010). Human values and attitudes toward bank services in Brazil. The International Journal of Bank Marketing, 28(3), 168-192. doi:10.1108/02652321011036459
O’hEocha, C., Conboy, K., & Wang, S. (2010). Using focus groups in studies of ISD team behavior. Electronic Journal of Business Research Methods, 8, 119-131. Retrieved from http://www.ejbrm.com
Onwuegbuzie, A. J., Dickinson, W. B., Leech, N. L., & Zoran, A. G. (2009). A qualitative framework for collecting and analyzing data in focus group research. International Journal of Qualitative Methods, 8(3), 1-21. Retrieved from http://[email protected]
Onwuegbuzie, A. J., Leech, N. L., & Collins, K. M. T. (2010). Innovative data collection strategies in qualitative research. The Qualitative Report, 15, 696-726. Retrieved from http://www.nova.edu/ssss/QR/QR15-3/onwuegbuzie
Packer-Muti, B. (2010). Conducting a focus group. The Qualitative Report, 15, 1023-1026. Retrieved from http://www.nova.edu/ssss/QR/QR15-4/packer
Petty, N. J., Thomson, O. P., & Stew, G. (2012). Ready for a paradigm shift? Part 2: Introducing qualitative research methodologies and methods. Manual Therapy, 17, 378-384. doi:10.1016/j.math.2012.03.004
Pratama, A., & Firman, A. (2010). Exploring the use of qualitative research methodology in conducting research in cross cultural management. International Journal of Interdisciplinary Social Sciences, 5, 331-342. Retrieved from http://www.iji.cgpublisher
Rakow, L. F. (2011). Commentary: Interviews and focus groups as critical and cultural methods. Journalism and Mass Communication Quarterly, 88, 416-428. doi:10.1177/107769901108800211
Redlich-Amirav, D., & Higginbottom, G. (2014). New emerging technologies in qualitative research. The Qualitative Report, 19(26), 1-14. Retrieved from http://www.nova.edu/ssss/QR/QR19/redlich-amirav12
Rodrigues, V. S., Piecyk, M., Potter, A., McKinnon, A., Naim, M., & Edwards, J. (2010). Assessing the application of focus groups as a method for collecting data in logistics. International Journal of Logistics: Research and Applications, 13(1), 75-94. doi:10.1080/13675560903224970
Rose-Anderson, C., Baldwin, J. S., & Ridgway, K. (2010). The effects of communicative interactions on meaning construction in group situations. Qualitative Research in Organizations and Management, 5, 196-215. doi:10.1108/17465641011068866
Ryan, K. E., Gandhal, T., Culbertson, M. J., & Carlson, C. (2013, December). Focus group evidence: Implications for design and analysis. American Journal of Evaluation, 1-18. doi:10.1177/1098214013508300
Sarvestani, A. S., Bufumbo, L., Geiger, J. D., & Sienko, K. H. (2012). Traditional male circumcision in Uganda: A qualitative focus group discussion analysis. Public Library of Science One, 7(10), 1-10. doi:10.1371/journal.pone.0045316
Schmidt, M. (2010). Quantification of transcripts from depth interviews, open-ended responses and focus groups. International Journal of Market Research, 52, 483-508. doi:10.2501/S1470785309201417
Shaha, M., Wenzel, J., & Hill, E. (2011). Planning and conducting focus group research with nurses. Nurse Researcher, 18(2), 77-87. doi:10.7748/nr2011.01.18.2.77.c8286
Sheppard, S., & Jones, H. (2013). Researching the ‘researched’ about research: A fresh perspective on the power of focus groups. Market & Social Research, 21(2), 40-47. Retrieved from http://www.amsrs.com.au
Sin, J. (2013). Focus group study of siblings of individuals with psychosis: Views on designing an online psychoeducational resource. Journal of Psychosocial Nursing & Mental Health Services, 51(6), 28-36. doi:10.3928/02793695-20130404-02
Stahl, B. C., Tremblay, M. C., & LeRouge, C. M. (2011). Focus groups and critical social IS research: How choice of method can promote emancipation of respondents and researchers. European Journal of Information Systems, 20, 378-394. Retrieved from http://aisel.aisnet.org
Sutton, S. G., & Arnold, V. (2013). Focus group methods: Using interactive and nominal groups to explore emerging technology-driven phenomena in accounting and information systems. Methodologies in AIS Research, 14(2), 81-88. doi:10.1016/j.accinf.2011.10.001
Thomas, R., & Quinlan, E. (2014). Teaching and learning focus group facilitation: An encounter with experiential learning in a graduate sociology classroom. Transformative Dialogues: Teaching & Learning Journal, 7(1), 1-15. Retrieved from www.kpu.ca/td
Vala, J. (2014). The interpretation of an old Japanese five-line poem with a focus group method. Social and Behavioral Sciences, 116, 3816-3819. doi:10.1016/j.sbspro.2014.01.847
Vicsek, L. (2010). Issues in the analysis of focus groups: Generalisability, quantifiability, treatment of context and quotations. The Qualitative Report, 15(1), 122-141. Retrieved from http://www.nova.edu/ssss/QR/QR15-1/vicsek
Webb, C., & Kevern, J. (2008). Focus groups as a research method: A critique of some aspects of their use in nursing research. Journal of Advanced Nursing, 33, 798-805. doi:10.1046/j.1365-2648.2001.01720.x
Weber, M. J. (2014). Defining the constructs of making, enabling, and keeping promises: A focus group application. Journal of Services Research, 13(2), 117-130. Retrieved from http://www.jsr-iimt.in/
Wong, E., Coulter, A., Cheung, A., Yam, C., Yeoh, E., & Griffiths, S. (2013, July). Item generation in the development of an inpatient experience questionnaire: A qualitative study. BMC Health Services Research, 13. doi:10.1186/1472-6963-13-265
Wooten, D. B., & Reed II, A. (2000). A conceptual overview of the self-presentational concerns and response tendencies of focus group participants. Journal of Consumer Psychology, 9, 141-153. Retrieved from http://www.elsevier.com
Interview Protocol Sources
Amerson, R. (2011). Making a case for the case study method. Journal of Nursing Education, 50, 427-428. doi:10.3928/01484834-20110719-01
Briggs, R. O., & Murphy, J. D. (2011). Discovering and evaluating collaboration engineering opportunities: An interview protocol based on the value frequency model. Group Decision and Negotiation, 20, 315-346. doi:10.1007/s10726-009-9158-x
Brown, D. A., Lamb, M. E., Lewis, C., Pipe, M., Orbach, Y., & Wolfman, M. (2013). The NICHD investigative interview protocol: An analogue study. Journal of Experimental Psychology: Applied, 19, 367-382. doi:10.1037/a0035143
Carlson, J. A. (2010). Avoiding traps in member checking. The Qualitative Report, 15, 1102-1113. Retrieved from http://www.nova.edu/ssss/QR/QR15-5/carlson
Chenail, R. (2011). Interviewing the investigator: Strategies for addressing instrumentation and researcher bias concerns in qualitative research. The Qualitative Report, 16, 255-262. Retrieved from http://www.nova.edu/ssss/QR/QR16-1/interviewing
De Ceunynck, T., Kusumastuti, D., Hannes, E., Janssens, D., & Wets, G. (2013). Mapping leisure shopping trip decision making: Validation of the CNET interview protocol. Quality and Quantity, 47, 1831-1849. doi:10.1007/s11135-011-9629-4
Hoffman, D. M. (2009). Multiple methods, communicative preferences and the incremental interview approach protocol. Forum: Qualitative Social Research, 10(1). Retrieved from http://www.qualitative-research.net/index.php/fqs/article/view/1220/2655
Jacob, S. A., & Furgerson, S. (2012). Writing interview protocols and conducting interviews: Tips for students new to the field of qualitative research. The Qualitative Report, 17, 1-10. Retrieved from http://www.nova.edu/ssss/QR/QR17/jacob
Platt, L. F., & Skowron, E. A. (2013). The family genogram interview: Reliability and validity of a new interview protocol. The Family Journal, 21(1), 35-45. doi:10.1177/1066480712456817
Rabionet, S. E. (2011). How I learned to design and conduct semi-structured interviews. The Qualitative Report, 16, 563-566. Retrieved from http://www.nova.edu/ssss/QR/WQR/rabionet
Turner, D. W. III. (2010). Qualitative interview design: A practical guide for novice investigators. The Qualitative Report, 3(2), 7-13. Retrieved from http://www.nova.edu/ssss/QR/QR15-3/qid
Interview Sources
Adams, E. (2010). The joys and challenges of semi-structured interviewing. Community Practitioner, 83(7), 18-21. Retrieved from http://www.scie-socialcareonline.org.uk
Anderson, C. (2010). Presenting and evaluating qualitative research. American Journal of Pharmaceutical Education, 74(8), 4-7. Retrieved from http://www.ajpe.org/
Anyan, F. (2013). The influence of power shifts in data collection and analysis stages: A focus on qualitative research interview. The Qualitative Report, 18(18), 1-9. Retrieved from http://www.nova.edu/sss/QR/index.html
Aripin, N., Mustafa, H., & Hussein, A. (2011). Virtual team and trust relationship: Focus group interviews in multimedia super corridor status companies. Journal of Technosocial, 3(2), 55-67. Retrieved from http://penerbit.uthm.edu.my/ojs/index.php/JTS
Barnham, C. (2012). Separating methodologies. International Journal of Market Research, 54, 736-738. doi:10.2501/IJMR-54-6-736-738
Berent, P. H. (1966). The depth interview. Journal of Advertising Research, 6(2), 32-39. doi:10.1108/13522750610640530
Bernard, R. H. (2011). Research methods in anthropology: Qualitative and quantitative approaches. Thousand Oaks: Sage.
Birjandi, P., & Bagherkazemi, M. (2011). From face-to-face to paired oral proficiency interviews: The nut is yet to be cracked. English Language Teaching, 4(2), 169-175. doi:10.5539/elt.v4n2p169
Block, E. S., & Erskine, L. (2012). Interviewing by telephone: Specific considerations, opportunities, and challenges. International Journal of Qualitative Methods, 11, 428-445. Retrieved from http://ejournals.library.ualberta.ca/index.php/IJQM/article/view/6863
Bowden, C., & Galindo-Gonzalez, S. (2015). Interviewing when you’re not face-to-face: The use of email interviews in a phenomenological study. International Journal of Doctoral Studies, 10, 79-92. Retrieved from http://ijds.org/Volume10/IJDSv10p079-092Bowden0684
Brod, M., Tesler, L. E., & Christiansen, T. L. (2009). Qualitative research and content validity: Developing best practices based on science and experience. Quality of Life Research, 18, 1263-1278. doi:10.1007/s11136-009-9540-9
Bryman, A. (2008). Of methods and methodology. Qualitative Research in Organizations and Management, 3(2), 159-168. doi:10.1108/17465640810900568
Buchbinder, E. (2011). Beyond checking: Experiences of the validation interview. Qualitative Social Work, 10, 106-122. doi:10.1177/1473325010370189
Bulpitt, H., & Martin, P. J. (2010). Who am I and what am I doing? Becoming a qualitative research interviewer. Nurse Researcher, 17(3), 7-16. Retrieved from http://nurseresearcher.rcnpublishing.co.uk
Butterfield, L. D., Borgen, W. A., & Amundson, N. E. (2009). The impact of a qualitative research interview on workers' views of their situation. Canadian Journal of Counselling, 43(2), 120-130. Retrieved from http://www.ucalgary.ca/ucpress/journals/CJC/index.html
Cachia, M., & Millward, L. (2011). The telephone medium and semi-structured interviews: A complementary fit. Qualitative Research in Organizations and Management, 6, 265-277. doi:10.1108/17465641111188420
Cairney, P., & St Denny, E. (2015). Reviews of what is qualitative research and what is qualitative interviewing. International Journal of Social Research Methodology: Theory and Practice, 18, 117-125. doi:10.1080/13645579.2014.957434
Cassell, C., Bishop, V., Symon, G., Johnson, P., & Buehring, A. (2009). Learning to be a qualitative management researcher. Management Learning, 40, 513-533. doi:10.1177/1350507609340811
Cater, J. K. (2011). SKYPE - A cost-effective method for qualitative research. Rehabilitation Counselors & Educators Journal, 4, 3. Retrieved from http://www.nationalrehab.org/cwt/external/wcpages/divisions/rcea.aspx
Chenail, R. (2011). Interviewing the investigator: Strategies for addressing instrumentation and researcher bias concerns in qualitative research. The Qualitative Report, 16(1), 255-262. Retrieved from http://www.nova.edu/ssss/QR/QR16-1/interviewing
Chikweche, T., & Fletcher, R. (2012). Undertaking research at the bottom of the pyramid using qualitative methods. Qualitative Market Research: An International Journal, 15, 242-267. doi:10.1108/13522751211231978
Coast, J., & Horrocks, S. (2010). Developing attributes and levels for discrete choice experiments using qualitative methods. Journal of Health Services Research and Policy, 12(1), 25-30. doi:10.346457934563454
Condie, J. (2012). Beyond rationalisations: Improving interview data quality. Qualitative Research in Accounting and Management, 9(2), 168-193. doi:10.1108/11766091211240379
Covell, C. L., Sidani, S., & Ritchie, J. A. (2012). Does the sequence of data collection influence participants’ responses to closed and open-ended questions? A methodological study. International Journal of Nursing Studies, 49, 664-671. doi:10.1016/j.ijnurstu.2011.12.002
Deakin, H., & Wakefield, K. (2013). Skype interviewing: Reflections of two PhD researchers. Qualitative Research, 14, 603-616. doi:10.1177/1468794113488126
Diefenbach, T. (2009). Are case studies more than sophisticated storytelling? Methodological problems of qualitative empirical research mainly based on semistructured interviews. Quality and Quantity, 43, 875-894. doi:10.1007/s11135-008-9164-0
Doody, O., & Noonan, M. (2013). Preparing and conducting interviews to collect data. Nurse Researcher, 20(5), 28-32. doi:10.7748/nr2013.05.20.5.28.e327
Dworkin, S. L. (2012). Sample size policy for qualitative studies using in-depth interviews. Archives of Sexual Behavior, 41, 1319-1320. doi:10.1007/s10508-012-0016-6
Elmir, R., Schmied, V., Jackson, D., & Wilkes, L. (2011). Interviewing people about potentially sensitive topics. Nurse Researcher, 19(1), 12-16. Retrieved from http://nurseresearcher.rcnpublishing.co.uk
Forsey, M. G. (2010). Ethnography as participant listening. Ethnography, 11, 558-572. doi:10.1177/1466138110372587
Francis, J. J., Johnston, M., Robertson, C., Glidewell, L., Entwistle, V., Eccles, M. P., & Grimshaw, J. M. (2010). What is an adequate sample size? Operationalizing data saturation for theory-based interview studies. Psychology and Health, 25, 1229-1245. doi:10.1080/08870440903194015
Frels, R. K., & Onwuegbuzie, A. J. (2013). Administering quantitative instruments with qualitative interviews: A mixed research approach. Journal of Counseling and Development, 91(2), 184-194. doi:10.1002/j.1556-6676.2013.00085.x
Gibbs, L., Kealy, M., Willis, K., Green, J., Welch, N., & Daly, J. (2007). What have sampling and data collection got to do with good qualitative research? Australian and New Zealand Journal of Public Health, 31, 540-544. doi:10.1111/j.1753-6405.2007.00140.x
Gill, P., Stewart, K., Treasure, E., & Chadwick, B. (2008). Methods of data collection in qualitative research: Interviews and focus groups. British Dental Journal, 204, 291-295. doi:10.1038/bdj.2008.192
Glogowska, M., Young, P., & Lockyer, L. (2011). Propriety, process and purpose: Considerations of the use of the telephone interview method in an educational research study. Higher Education, 62(1), 17-26. doi:10.1007/s10734-010-9362-2
Goodbody, L., & Burns, J. (2011). A disquisition on pluralism in qualitative methods: The troublesome case of a critical narrative analysis. Qualitative Research in Psychology, 8(2), 170-196. doi:10.1080/14780887.2011.575288
Granot, E., Brashear, T. G., & Motta, P. C. (2012). A structural guide to in-depth interviewing in business and industrial marketing research. The Journal of Business & Industrial Marketing, 27, 547-553. doi:10.1108/08858621211257310
Green, L. (2013). In their own words: Using interview materials when writing up qualitative research. Australian Journal of Communication, 40(3), 105-119. Retrieved from http://austjourcomm.org/index.php/ajc
Guest, G., Bunce, A., & Johnson, L. (2006). How many interviews are enough? An experiment with data saturation and variability. Field Methods, 18(1), 59-82. doi:10.1177/1525822X05279903
Haahr, A., Norlyk, A., & Hall, E. (2013). Ethical challenges embedded in qualitative research interviews with close relatives. Nursing Ethics, 2(1), 6-15. doi:10.1177/0969733013486370
Harvey, L. (2015). Beyond member checking: A dialogic approach to the research interview. International Journal of Research & Method in Education, 38, 23-38. doi:10.1080/1743727X.2014.914487
Harvey, W. S. (2011). Strategies for conducting elite interviews. Qualitative Research, 11, 431-441. doi:10.1177/1468794111404329
Hermanowicz, J. (2013). The longitudinal qualitative interview. Qualitative Sociology, 36, 189-208. doi:10.1007/s11133-013-9247-7
Higginbottom, G. M. A., Pillay, J. J., & Boadu, N. Y. (2013). Guidance on performing focused ethnographies with an emphasis on healthcare research. The Qualitative Report, 18(17), 1-16. Retrieved from http://www.nova.edu/ssss/QR/QR18/higginbottom17
Holt, A. (2010). Using the telephone for narrative interviewing: A research note. Qualitative Research, 10, 113-121. doi:10.1177/1468794109348686
Honan, E. (2014). Disrupting the habit of interviewing. Reconceptualizing Educational Research Methodology, 5(1), 1-17. doi:10.7577/rerm.929
Houghton, C., Casey, D., Shaw, D., & Murphy, K. (2013). Rigour in qualitative case-study research. Nurse Researcher, 20(4), 12-17. doi:10.7748/nr2013.03.20.4.12.e326
Irvine, A. (2011). Duration, dominance and depth in telephone and face-to-face interviews: A comparative exploration. International Journal of Qualitative Methods, 10, 202-220. Retrieved from http://ejournals.library.ualberta.ca/index.php/IJQM/article/view/10276
Irvine, A., Drew, P., & Sainsbury, R. (2013). ‘Am I not answering your questions properly?’ Clarification, adequacy and responsiveness in semi-structured telephone and face-to-face interviews. Qualitative Research, 13(1), 87-106. doi:10.1177/1468794112439086
Jacob, S. A., & Furgerson, S. (2012). Writing interview protocols and conducting interviews: Tips for students new to the field of qualitative research. Qualitative Report, 17, 1-10. Retrieved from http://www.nova.edu/ssss/QR/QR17/jacob
Jamshed, S. (2014). Qualitative research method-interviewing and observation. Journal of Basic and Clinical Pharmacy, 5(4), 87-88. doi:10.4103/0976-0105.141942
Janghorban, R., Roudsari, R. L., & Taghipour, A. (2014). Skype interviewing: The new generation of online synchronous interview in qualitative research. International Journal of Qualitative Studies on Health and Well-being, 9. doi:10.3402/qhw.v9.24152
Jepsen, D. M., & Rodwell, J. J. (2008). Convergent interviewing: A qualitative diagnostic technique for researchers. Management Research News, 31, 650-658. doi:10.1108/01409170810898545
Kim, Y. (2011). The pilot study in qualitative inquiry: Identifying issues and learning lessons for culturally competent research. Qualitative Social Work, 10, 190-206. doi:10.1177/1473325010362001
Knight, J. (2012). Deletion, distortion and data collection: The application of the neuro-linguistic programming (NLP) meta-model in qualitative interviews. Australasian Journal of Market & Social Research, 20(1), 15-21. Retrieved from http://www.amsrs.com
Knox, S., & Burkard, A. W. (2009). Qualitative research interviews. Psychotherapy Research, 19, 566-575. doi:10.1080/10503300802702105
Lamont, M., & Swidler, A. (2014). Methodological pluralism and the possibilities and limits of interviewing. Qualitative Sociology, 37(2), 153-171. doi:10.1007/s11133-014-9274-z
Lampropoulou, S., & Myers, G. (2013). Stance taking in interviews from the Qualidata Archive. Qualitative Social Research, 14(1), 1-23. Retrieved from http://www.qualitative-research.net/index.php/fqs/article/view/1813/3468
Latcheva, R. (2011). Cognitive interviewing and factor-analytic techniques: A mixed approach to validity of survey items measuring national identity. Quality & Quantity, 45, 1175-1199. doi:10.1007/s11135-009-9285-0
Lavis, V. (2010). Multiple researcher identities: Highlighting tensions and implications for ethical practice in qualitative interviewing. Qualitative Research in Psychology, 7, 316-331. doi:10.1080/14780880902929506
Mann, S. (2011). A critical review of qualitative interviews in applied linguistics. Applied Linguistics, 32(1), 6-24. doi:10.1093/applin/amq043
Marshall, C., & Rossman, G. B. (2016). Designing qualitative research (6th ed.). Thousand Oaks, CA: Sage.
Marshall, B., Cardon, P., Poddar, A., & Fontenot, R. (2013). Does sample size matter in qualitative research? A review of qualitative interviews in IS research. Journal of Computer Information Systems, 54(1), 11-22. Retrieved from http://www.iacis.org/jcis/jcis.php
Mason, M. (2010, September). Sample size and saturation in PhD studies using qualitative interviews. Forum: Qualitative Social Research, 11(3). Retrieved from http://www.qualitative-research.net/index.php/fqs/article/view/1428/3027
McCormack, C. (2000). From interview transcript to interpretive story: Part 1: Viewing the transcript through multiple lenses. Field Methods, 12, 282-297. doi:10.1177/1525822X0001200402
McCormack, C. (2000). From interview transcript to interpretive story: Part 2: Developing an interpretive story. Field Methods, 12, 298-315. doi:10.1177/1525822X0001200403
Mealer, M., & Jones, J. (2014). Methodological and ethical issues related to qualitative telephone interviews on sensitive topics. Nurse Researcher, 21, 32-37. Retrieved from http://rcnpublishing.com/journal/nr
Mikecz, R. (2012). Interviewing elites: Addressing methodological issues. Qualitative Inquiry, 18, 482-493. doi:10.1177/1077800412442818
Mishler, E. G. (1996). Research interviewing: Context and narrative. Boston: Harvard University Press.
Moore, N., & Stokes, P. (2012). Elite interviewing and the role of sector context: An organizational case from the football industry. Qualitative Market Research: An International Journal, 15, 438-464. doi:10.1108/13522751211257105
Nelson, J. A., Onwuegbuzie, A. J., Wines, L. A., & Frels, R. K. (2013). The therapeutic interview process in qualitative research studies. The Qualitative Report, 18(7), 1-17. Retrieved from http://www.nova.edu/ssss/QR/QR18/nelson79
Novick, G. (2008). Is there a bias against telephone interviews in qualitative research? Research in Nursing & Health, 31, 391-398. doi:10.1002/nur.20259
Oleinik, A. (2011). Mixing quantitative and qualitative content analysis: Triangulation at work. Quality and Quantity, 45, 859-873. doi:10.1007/s11135-010-9399-4
Oliphant, G. C., Hansen, K., & Oliphant, B. J. (2008). Predictive validity of a behavioral interview technique. Marketing Management Journal, 18(2), 93-105. Retrieved from http://www.mmaglobal.org
Onwuegbuzie, A. J., & Byers, V. T. (2014). An exemplar for combining the collection, analysis, and interpretation of verbal and nonverbal data in qualitative research. International Journal of Education, 6(1), 183-246. doi:10.5296/ije.v6i1.4399
Onwuegbuzie, A. J., Leech, N. L., & Collins, K. M. T. (2010). Innovative data collection strategies in qualitative research. The Qualitative Report, 15, 696-726. Retrieved from http://www.nova.edu/ssss/QR/QR15-3/onwuegbuzie
Onwuegbuzie, A. J., Leech, N. L., Slate, J. R., Stark, M., & Sharma, B. (2012). An exemplar for teaching and learning qualitative research. The Qualitative Report, 17(1), 16-77. Retrieved from http://www.nova.edu/ssss/QR/QR17-1/onwuegbuzie
Palys, T., & Atchison, C. (2010). Research decisions: Quantitative and qualitative perspectives (4th ed.). Scarborough, ON: Nelson Education.
Patrick, D. L., Burke, L. B., Gwaltney, C. J., Leidy, N. K., Martin, M. L., Molsen, E., & Ring, L. (2011). Content validity: Establishing and reporting the evidence in newly developed patient-reported outcomes (PRO) instruments for medical product evaluation. ISPOR PRO Good Research Practices Task Force report: Part 1: Eliciting concepts for a new PRO instrument. Value in Health, 14, 967-977. doi:10.1016/j.jval.2011.06.014
Pearson, M., & Coomber, R. (2010). The challenge of external validity in policy-relevant systematic reviews: A case study from the field of substance misuse. Addiction, 105(1), 136-145. doi:10.1111/j.1360-0443.2009.02713.x
Peredaryenko, M. S., & Krauss, S. E. (2013). Calibrating the human instrument: Understanding the interviewing experience of novice qualitative researchers. The Qualitative Report, 18(85), 1-17. Retrieved from http://www.nova.edu/ssss/QR/QR18/peredaryenko85
Petty, N. J., Thomson, O. P., & Stew, G. (2012). Ready for a paradigm shift? Part 2: Introducing qualitative research methodologies and methods. Manual Therapy, 17, 378-384. doi:10.1016/j.math.2012.03.004
Pezalla, A. E., Pettigrew, J., & Miller-Day, M. (2012). Researching the researcher-as-instrument: An exercise in interviewer self-reflexivity. Qualitative Research, 12(2), 165-185. doi:10.1177/1468794111422107
Pratama, A., & Firman, A. (2010). Exploring the use of qualitative research methodology in conducting research in cross cultural management. International Journal of Interdisciplinary Social Sciences, 5, 331-342. Retrieved from http://www.iji.cgpublisher.com
Qu, S. Q., & Dumay, J. (2011). The qualitative research interview. Qualitative Research in Accounting & Management, 8, 238-264. doi:10.1108/11766091111162070
Rabionet, S. E. (2011). How I learned to design and conduct semi-structured interviews. The Qualitative Report, 16(2), 563-566. Retrieved from http://www.nova.edu/ssss/QR/WQR/rabionet
Radford, M. L., Radford, G. P., Connaway, L. S., & DeAngelis, J. A. (2011). On virtual face-work: An ethnography of communication approach to a live chat reference interaction. Library Quarterly, 81, 431-453. Retrieved from http://www.oclc.org/content/dam/research/publications/library/2011/201109-lq.pdf?urlm=162960
Rakow, L. F. (2011). Commentary: Interviews and focus groups as critical and cultural methods. Journalism and Mass Communication Quarterly, 88, 416-428. doi:10.1177/107769901108800211
Ratislova, K., & Ratislav, J. (2014). Asynchronous email interviews as a qualitative research method in humanities. Human Affairs, 24, 452-460. doi:10.2478/s13374-014-0240-y
Riiskjær, E., Ammentorp, J., & Kofoed, P. (2012). The value of open-ended questions in surveys on patient experience: Number of comments and perceived usefulness from a hospital perspective. International Journal for Quality in Health Care, 24, 509-516. doi:10.1093/intqhc/mzs039
Robinson, O. (2014). Sampling in interview-based qualitative research: A theoretical and practical guide. Qualitative Research in Psychology, 11(1), 25-41. doi:10.1080/14780887.2013.801543
Rossetto, K. R. (2014). Qualitative research interviews: Assessing the therapeutic value and challenges. Journal of Social and Personal Relationships, 31, 482-489. doi:10.1177/0265407514522892
Roulston, K. (2010). Considering quality in qualitative interviewing. Qualitative Research, 10(2), 199-202. doi:10.1177/1468794109356739
Rowley, J. (2012). Conducting research interviews. Management Research Review, 35, 260-271. doi:10.1108/01409171211210154
Rubin, H. J., & Rubin, I. S. (2012). Qualitative interviewing: The art of hearing data (3rd ed.). Thousand Oaks, CA: Sage.
Sánchez-Fernández, J., Muñoz-Leiva, F., Montoro-Ríos, F.J., & Ibáñez-Zapata, J. Á. (2010). An analysis of the effect of pre-incentives and post-incentives based on draws on response to web surveys. Quality and Quantity, 44, 357-373. doi:10.1007/s11135-008-9197-4
Schilling, J. (2010). On the pragmatics of qualitative assessment: Designing the process for content analysis. European Journal of Psychological Assessment, 22(1), 28-37. Retrieved from http://www.hogrefe.com/periodicals/european-journal-psychological-assessment
Schmidt, M. (2010). Quantification of transcripts from depth interviews, open-ended responses and focus groups. International Journal of Market Research, 52, 483-508. doi:10.2501/S1470785309201417
Skalland, B. (2011). An alternative to the response rate for measuring a survey's realization of the target population. Public Opinion Quarterly, 75(1), 89-98. Retrieved from http://poq.oxfordjournals.org
Snowball, J. D., & Willis, K. G. (2011). Interview versus self-completion questionnaires in discrete choice experiments. Applied Economics Letters, 18, 1521-1525. doi:10.1080/13504851.2010.548770
Stacey, K., & Vincent, J. (2011). Evaluation of an electronic interview through electronic media with multimedia stimulus materials for gaining in-depth responses from professionals. Qualitative Research, 11, 605-624. doi:10.1177/1468794111413237
Stephens, N. (2007). Collecting data from elites and ultraelites: Telephone and face-to-face interviews with macroeconomists. Qualitative Research, 7, 203-216. doi:10.1177/1468794107076020
Sturges, J. E., & Hanrahan, K. J. (2004). Comparing telephone and face-to-face qualitative interviewing: A research note. Qualitative Research, 4, 107-118. doi:10.1177/146879410404111
Talmy, S. (2010). Qualitative interviews in applied linguistics: From research instrument to social practice. Annual Review of Applied Linguistics, 30, 128-148. doi:10.1017/S0267190510000085
Tope, D., Chamberlain, L. J., Crowley, M., & Hodson, R. (2005). The benefits of being there: Evidence from the literature on work. Journal of Contemporary Ethnography, 34, 470-493. doi:10.1177/089124605276692
Turner, D. W., III. (2010). Qualitative interview design: A practical guide for novice investigators. The Qualitative Report, 15(3), 754-760. Retrieved from http://www.nova.edu/ssss/QR/QR15-3/qid
Vogl, S. (2013). Telephone versus face-to-face interviews: Mode effect on semistructured interviews with children. Sociological Methodology, 43(1), 133-177. doi:10.1177/0081175012465967
Wagstaff, C., & Williams, B. (2014). Specific design features of an interpretative phenomenological analysis study. Nurse Researcher, 21(3), 8-12. Retrieved from http://www.nursing-standard.co.uk/
Warr, D., & Mann, R. (2011). Using peer‐interviewing methods to explore place‐based disadvantage: Dissolving the distance between suits and civilians. International Journal of Social Research Methodology, 14, 337-352. doi:10.1080/13645579.2010.537527
Weijters, B., Schillewaert, N., & Geuens, M. (2008). Assessing response styles across modes of data collection. Journal of the Academy of Marketing Science, 36, 409-422. doi:10.1007/s11747-007-0077
West, B. T., & Kreuter, F. (2013). Factors affecting the accuracy of interviewer observations: Evidence from the national survey of family growth. Public Opinion Quarterly, 77, 522-548. doi:10.1093/poq/
Wheeldon, J. (2010). Mapping mixed methods research: Methods, measures, and meaning. Journal of Mixed Methods Research, 4(2), 87-102. doi:10.1177/1558689809358755
Whiting, L. S. (2008). Semi-structured interviews: Guidance for novice researchers. Nursing Standard, 22(23), 35-40. Retrieved from http://www.nursing-standard.co.uk
Williamson, K. (2006). Research in constructivist frameworks using ethnographic techniques. Library Trends, 55(1), 83-101. doi:10.1353/lib.2006.0054
Wolgemuth, J. R. (2013, September). Analyzing for critical resistance in narrative research. Qualitative Research, 1-71. doi:10.1177/1468794113501685
Yii, S. B., Powell, M. B., & Guadagno, B. (2014). The association between investigative interviewers’ knowledge of question type and adherence to best-practice interviewing. Legal and Criminological Psychology, 19(2), 270-281. doi:10.1111/lcrp.12000
Yin, R. K. (2014). Case study research: Design and methods (5th ed.). Thousand Oaks, CA: Sage.
Journaling Sources
Anderson, J. (2012). Reflective journals as a tool for auto-ethnographic learning: A case study of student experiences with individualized sustainability. Journal of Geography in Higher Education, 36, 613-623. doi:10.1080/03098265.2012.692157
Applebaum, L. (2014). From whining to wondering: Reflective journaling with preservice educators. Journal of Jewish Education, 80(1), 5-23. doi:10.1080/15244113.2014.880140
Atkinson, M. J., Tally, S., Heichel, C. W., Kozak, I., Leich, J., & Levack, A. (2012). A qualitative investigation of visual tasks with which to assess distance-specific visual function. Quality of Life Research, 22, 437-453. doi:10.1007/s11136-012-0154-2
Berger, R. (2015). Now I see it, now I don't: Researcher's position and reflexivity in qualitative research. Qualitative Research, 15, 219-234. doi:10.1177/1468794112468475
Brennan, M. C., & Cotgrave, A. J. (2014). Sustainable development: A qualitative inquiry into the current state of the UK construction industry. Structural Survey, 32, 315-330. doi:10.1108/SS-02-2014-0010
Charles, J. P. (2010). Journaling: Creating space for "I". Creative Nursing, 16(4), 180-184. doi:10.1891/1078-4535.16.4.180
Charon, R., & Hermann, N. (2012). A sense of story, or why teach reflective writing? Academic Medicine: Journal of the Association of American Medical Colleges, 87, 5-7. doi:10.1097/ACM.0b013e31823a59c7
Covell, C. L., Sidani, S., & Ritchie, J. A. (2012). Does the sequence of data collection influence participants’ responses to closed and open-ended questions? A methodological study. International Journal of Nursing Studies, 49, 664-671. doi:10.1016/j.ijnurstu.2011.12.002
Cruz, E. V., & Higginbottom, G. (2013). The use of focused ethnography in nursing research. Nurse Researcher, 20(4), 36-43. doi:10.7748/nr2013.03.20.4.36.e305
Cumming-Potvin, W. (2013). "New basics" and literacies: Deepening reflexivity in qualitative research. Qualitative Research Journal, 13(2), 214-230. doi:10.1108/QRJ-04-2013-0024
Ellis, C. E., Adams, T. E., & Bochner, A. P. (2013). Autoethnography: An overview. Forum: Qualitative Social Research, 12(1). Retrieved from http://www.qualitative-research.net/index.php/fqs/article/view/1589/3095
Everett, M. C. (2013). Reflective journal writing and the first-year experience. International Journal of Teaching & Learning in Higher Education, 25, 213-222. Retrieved from http://www.isetl.org/ijtlhe/
Hayman, B., Wilkes, L., & Jackson, D. (2012). Journaling: Identification of challenges and reflection on strategies. Nurse Researcher, 19(3), 27-31. Retrieved from http://www.nursing-standard.co.uk
Lakshmi, B. S. (2014). Reflective practice through journal writing and peer observation: A case study. Turkish Online Journal of Distance Education (TOJDE), 15, 189-204. Retrieved from http://www.tojde.anadolu.edu.tr
Lamb, D. (2013). Research in the first person: Reflection on the research experience using a research journal. Market & Social Research, 21(2), 32-39. Retrieved from http://www.amsrs.com.au/documents/item/1284
Lamb, D. (2013). Promoting the case for using a research journal to document and reflect on the research experience. Electronic Journal of Business Research Methods, 11(2), 84-92. Retrieved from http://www.academic-conferences.org/ejournals.htm
Lasater, K. (2009). Reflective journaling for clinical judgement development and evaluation. Journal of Nursing Education, 48(10), 40-44. doi:10.3928/01484834-20090101-06
Li, J. (2008). Ethical challenges in participant observation: A reflection on ethnographic fieldwork. The Qualitative Report, 13(2), 100-115. Retrieved from http://www.nova.edu/ssss/QR/AR13-1/li.
Miller, W. R. (2014). Interactive journaling as a clinical tool. Journal of Mental Health Counseling, 36(1), 31-42. doi:10.17744/mehc.36.1.0k5v52l12540w218
Ortlipp, M. (2008). Keeping and using reflective journals in the qualitative research process. The Qualitative Report, 13, 695-705. Retrieved from http://www.nova.edu/ssss/QR/QR13-4/ortlipp
Peredaryenko, M. S., & Krauss, S. E. (2013). Calibrating the human instrument: Understanding the interviewing experience of novice qualitative researchers. The Qualitative Report, 18(43), 1-17. Retrieved from http://www.nova.edu/ssss/QR/
Slotnick, R. C., & Janesick, V. J. (2011). Conversations on method: Deconstructing policy through the researcher reflective journal. The Qualitative Report, 16, 1352-1360. Retrieved from http://www.nova.edu/ssss/QR/QR16-5/slotnick
Snyder, C. (2012). A case study of a case study: Analysis of a robust qualitative research methodology. The Qualitative Report, 17(26), 1-21. Retrieved from http://www.nova.edu/ssss/QR/QR17/snyder
Wall, C., Glenn, S., Mitchinson, S., & Poole, H. (2004). Using a reflective diary to develop bracketing skills during a phenomenological investigation. Nurse Researcher, 11(4), 20-29. doi:10.7748/nr2004.07.11.4.20.c6212
Member Checking Sources
Andraski, M. P., Chandler, C., Powell, B., Humes, D., & Wakefield, S. (2014). Bridging the divide: HIV prevention research and black men who have sex with men. American Journal of Public Health, 104, 708-714. Retrieved from http://ajph.aphapublications.org/
Carlson, J. A. (2010). Avoiding traps in member checking. The Qualitative Report, 15, 1102–1113. Retrieved from http://www.nova.edu/ssss/QR/QR15-5/carlson
Doyle, S. (2007). Member checking with older women: A framework for negotiating meaning. Health Care for Women International, 28, 888-908. doi:10.1080/07399330701615325
Goldblatt, H., Karnieli-Miller, O., & Neumann, M. (2011). Sharing qualitative research findings with participants: Study experiences of methodological and ethical dilemmas. Patient Education and Counseling, 82, 389-395. doi:10.1016/j.pec.2010.12.016
Harper, M., & Cole, P. (2012). Member checking: Can benefits be gained similar to group therapy? The Qualitative Report, 17(2), 510-517. Retrieved from http://www.nova.edu/ssss/QR/QR17-2/harper
Harvey, L. (2015). Beyond member checking: A dialogic approach to the research interview. International Journal of Research & Method in Education, 38, 23-38. doi:10.1080/1743727X.2014.914487
Jonsen, K., & Jehn, K. A. (2009). Using triangulation to validate themes in qualitative studies. Qualitative Research in Organizations and Management: An International Journal, 4(2), 123-150. doi:10.1108/17465640910978391
Koelsch, L. E. (2013). Reconceptualizing the member check interview. International Journal of Qualitative Methods, 12, 168-179. Retrieved from http://ejournals.library.ualberta.ca/index.php/IJQM/article/view/12327
McConnell-Henry, T., Chapman, Y., & Francis, K. (2011). Member checking and Heideggerian phenomenology: A redundant component. Nurse Researcher, 18(2), 28-37. doi:10.7748/nr2011.01.18.2.28.c8282
Mero-Jaffe, I. (2011). ‘Is this what I said?’ Interview transcript approval by participants: An aspect of ethics in qualitative research. International Journal of Qualitative Methods, 10, 231-247. Retrieved from http://ejournals.library.ualberta.ca/index.php/IJQM/
Reilly, R. C. (2013). Found poems, member checking and crises of representation. The Qualitative Report, 18(15), 10-18. Retrieved from http://www.nova.edu/ssss/qr
Yildirim, K. (2010). Raising the quality in qualitative research. Ilkogretim Online, 9(1), 79-92. Retrieved from http://ilkogretim-online.org
Mixed Methods Research
Abowitz, D. A., & Toole, T. M. (2010). Mixed methods research: Fundamental issues of design, validity, and reliability in construction research. Journal of Construction Engineering & Management, 136(1), 108-116. doi:10.1061/(ASCE)CO.1943-7862.0000026
Alexander, M., MacLaren, A., O’Gorman, K., & Taheri, B. (2011). “He just didn’t seem to understand the banter”: Bullying or simply establishing social cohesion? Tourism Management, 33, 1245-1255. doi:10.1016/j.tourman.2011.11.001
Arcidiacono, F., & De Gregorio, E. (2008). Methodological thinking in psychology: Starting from mixed methods. International Journal of Multiple Research Approaches, 2(1), 118-126. Retrieved from http://mra.e-contentmanagement.com
Arora, R., & Stoner, C. (2009). A mixed method approach to understanding brand personality. Journal of Product & Brand Management, 18, 272-283. doi:10.1108/10610420910972792
Azorin, J. M., & Cameron, R. (2010). The application of mixed methods in organisational research: A literature review. Electronic Journal of Business Research Methods, 8, 95-105. Retrieved from http://www.ejbrm.com/
Borrego, M., Douglas, E. P., & Amelink, C. T. (2011). Quantitative, qualitative, and mixed research methods in engineering education. Journal of Engineering Education, 41(1), 153-166. Retrieved from http://www.jee.org
Brent, J. J., & Kraska, P. B. (2010). Moving beyond our methodological default: A case for mixed methods. Journal of Criminal Justice Education, 21, 412-430. doi:10.1080/10511253.2010.516562
Cameron, R. (2011). Mixed methods research: The five Ps framework. Electronic Journal of Business Research Methods, 9(2), 96-108. Retrieved from http://www.ejbrm.com/main.html
Cameron, R., & Molina-Azorin, J. (2011). The acceptance of mixed methods in business and management research. International Journal of Organizational Analysis, 19, 256-271. doi:10.1108/19348831111149204
Caruth, G. D. (2013). Demystifying mixed methods research design: A review of the literature. Mevlana International Journal of Education, 3(2), 112-122. doi:10.13054/mije.13.35.3.2
Castro, F. G., Kellison, J. G., Boyd, S. J., & Kopak, A. (2010). A methodology for conducting integrative mixed methods research and data analyses. Journal of Mixed Methods Research, 4, 342-360. doi:10.1177/1558689810382916
Christ, T. W. (2013). The worldview matrix as a strategy when designing mixed methods research. International Journal of Multiple Research Approaches, 7(1), 110-118. doi:10.5172/mra.2013.7.1.110
Collins, K., & O'Cathain, A. (2009). Introduction: Ten points about mixed methods research to be considered by the novice researcher. International Journal of Multiple Research Approaches, 3(1), 2-7. Retrieved from http://mra.e-contentmanagement.com
Cooke, A., Smith, D., & Booth, A. (2012). Beyond PICO: The SPIDER tool for qualitative evidence synthesis. Qualitative Health Research, 22, 1435-1443. doi:10.1177/1049732312452938
Covell, C. L., Sidani, S., & Ritchie, J. A. (2012). Does the sequence of data collection influence participants’ responses to closed and open-ended questions? A methodological study. International Journal of Nursing Studies, 49, 664-671. doi:10.1016/j.ijnurstu.2011.12.002
Crede, E., & Borrego, M. (2013). From ethnography to items: A mixed methods approach to developing a survey to examine graduate engineering student retention. Journal of Mixed Methods Research, 7, 62-80. doi:10.1177/1558689812451792
Doyle, L., Brady, A. M., & Byrne, G. (2009). An overview of mixed methods research. Journal of Research in Nursing, 14(2), 175-185. doi:10.1177/1744987108093962
Farquhar, M., Ewing, G., & Booth, S. (2011). Using mixed methods to develop and evaluate complex intervention in palliative care research. Palliative Medicine, 25, 748-757. doi:10.1177/0269216311417919
Feilzer, M. Y. (2010). Doing mixed methods research pragmatically: Implications for the rediscovery of pragmatism as a research paradigm. Journal of Mixed Methods Research, 4, 6-16. doi:10.1177/1558689809349691
Fetters, M. D., Curry, L. A., & Creswell, J. W. (2013). Achieving integration in mixed methods designs, principles and practices. Health Services Research, 48, 2134-2156. doi:10.1111/1475-6773.12117
Fielding, N. (2010). Mixed methods research in the real world. International Journal of Social Research Methodology, 13(2), 127-138. doi:10.1080/13645570902996186
Frels, R. K., & Onwuegbuzie, A. J. (2013). Administering quantitative instruments with qualitative interviews: A mixed research approach. Journal of Counseling and Development, 91, 184-194. doi:10.1002/j.1556-6676.2013.00085.x
Green, C., Duan, N., Gibbons, R., Hoagwood, K., Palinkas, L., & Wisdom, J. (2014). Approaches to mixed methods dissemination and implementation research: Methods, strengths, caveats, and opportunities. Administration and Policy in Mental Health and Mental Health Services Research, 1-16. doi:10.1007/s10488-014-0552-6
Griensven, H. V., Moore, A. P., & Hall, V. (2014, May 11). Mixed methods research: The best of both worlds? Manual Therapy, 19, 367-371. doi:10.1016/j.math.2014.05.005
Halcomb, E., & Andrew, S. (2009). Practical considerations for higher degree research students undertaking mixed methods projects. International Journal of Multiple Research Approaches, 3(2), 153-162. Retrieved from http://mra.e-contentmanagement.com
Harrison, R. L., & Reilly, T. M. (2011). Mixed methods designs in marketing research. Qualitative Market Research: An International Journal, 14(1), 7-26. doi:10.1108/13522751111099300
Hayes, B., Bonner, A., & Douglas, C. (2013). An introduction to mixed methods research for nephrology nurses. Renal Society of Australasia Journal, 9(1), 8-14. Retrieved from http://www.renalsociety.org
Hesse-Biber, S. H. (2010). Qualitative approaches to mixed method practice. Qualitative Inquiry, 16, 455-468. doi:10.1177/1077800410364611
Heyvaert, M., Maes, B., & Onghena, P. (2011). Mixed methods research synthesis: Definitions, framework, and potential. Quality and Quantity, 47, 659-676. doi:10.1007/s11135-011-9538-6
Ihantola, E. M., & Kihn, L. A. (2011). Threats to validity and reliability in mixed methods accounting research. Qualitative Research in Accounting and Management, 8(1), 39-58. doi:10.1108/11766091111124694
Klassen, A. C., Creswell, J., Plano Clark, V. L., Smith, K. C., & Meissner, H. I. (2012). Best practices in mixed methods for quality of life research. Quality of Life Research, 21, 377-378. doi:10.1007/s11136-012-0122-x
Lund, T. (2012). Combining qualitative and quantitative approaches: Some arguments for mixed methods research. Scandinavian Journal of Educational Research, 56, 155-165. doi:10.1080/00313831.2011.568674
Malina, M. A., Nørreklit, H. S. O., & Selto, F. H. (2011). Lessons learned: Advantages and disadvantages of mixed method research. Qualitative Research in Accounting and Management, 8(1), 59-71. doi:10.1108/11766091111124702
Maxwell, J. A. (2010). Using numbers in qualitative research. Qualitative Inquiry, 16, 475-482. doi:10.1177/1077800410364740
Mertens, D. (2010). Philosophy in mixed methods teaching: The transformative paradigm as illustration. International Journal of Multiple Research Approaches, 4(1), 9-18. doi:10.5172/mra.2010.4.1.009
Molina-Azorin, J. F. (2011). The use and added value of mixed methods in management research. Journal of Mixed Methods Research, 5(1), 7-24. doi:10.1177/1558689810384490
Niglas, K. (2009). How the novice researcher can make sense of mixed methods designs. International Journal of Multiple Research Approaches, 3(1), 34-46. Retrieved from http://mra.e-contentmanagement.com
Oleinik, A. (2011). Mixing quantitative and qualitative content analysis: Triangulation at work. Quality and Quantity, 45, 859-873. doi:10.1007/s11135-010-9399-4
Onwuegbuzie, A. J., Bustamante, R. M., & Nelson, J. A. (2010). Mixed research as a tool for developing quantitative instruments. Journal of Mixed Methods Research, 4(1), 56-78. doi:10.1177/1558689809355805
Onwuegbuzie, A., Johnson, R., & Collins, K. (2009). Call for mixed analysis: A philosophical framework for combining qualitative and quantitative approaches. International Journal of Multiple Research Approaches, 3(2), 114-139. Retrieved from http://mra.e-contentmanagement.com
Onwuegbuzie, A. J., & Leech, N. L. (2005). On becoming a pragmatic researcher: The importance of combining quantitative and qualitative research methodologies. International Journal of Social Research Methodology, 8, 375-387. doi:10.1080/13645570500402447
Onwuegbuzie, A. J., & Leech, N. L. (2010). Generalization practices in qualitative research: A mixed methods case study. Quality and Quantity, 44, 881-892. doi:10.1007/s11135-009-9241-z
Onwuegbuzie, A. J., Slate, J. R., Leech, N. L., & Collins, K. T. (2007). Conducting mixed analyses: A general typology. International Journal of Multiple Research Approaches, 1(1), 4-17. Retrieved from http://mra.e-contentmanagement.com
Ostlund, U., Kidd, L., Wengstrom, Y., & Rowa-Dewar, N. (2011). Combining qualitative and quantitative research within mixed method research designs: A methodological review. International Journal of Nursing Studies, 48, 369-383. doi:10.1016/j.ijnurstu.2010.10.005
Plano Clark, V. L. (2010). The adoption and practice of mixed methods: U.S. trends in federally funded health-related research. Qualitative Inquiry, 16, 428-440. doi:10.1177/1077800410364609
Rice, E., Holloway, I. W., Barman-Adhikari, A., Fuentes, D., Brown, C. H., & Palinkas, L. A. (2014). A mixed methods approach to network data collection. Field Methods, 26, 252-268. doi:10.1177/1525822X13518168
Robinson, P. (2010). Conclusion: On hammers, nails and building sites: Teaching mixed methods. International Journal of Multiple Research Approaches, 4, 66-72. Retrieved from http://mra.e-contentmanagement.com
Ruffin, M. T., Creswell, J. W., Jimbo, M., & Fetters, M. D. (2009). Factors influencing choice for colorectal cancer screening among previously unscreened African and Caucasian Americans: Findings from a triangulation mixed methods investigation. Journal of Community Health, 34(2), 79-89. doi:10.1007/s10900-008-9133-
Salehi, K., & Golafshani, N. (2010). Commentary: Using mixed methods in research studies: An opportunity with its challenges. International Journal of Multiple Research Approaches, 4(3), 186-191. doi:10.5172/mra.2010.4.3.186
Simpson, S. H. (2011). Demystifying the research process: Mixed methods. Pediatric Nursing, 37(1), 28-29. Retrieved from http://www.pediatricnursing.net
Small, M. L. (2011). How to conduct a mixed methods study: Recent trends in a rapidly growing literature. Annual Review of Sociology, 37, 57-86. doi:10.1146/annurev.soc.012809.102657
Southam-Gerow, M. A., & Dorsey, S. (2014). Qualitative and mixed methods research in dissemination and implementation science: Introduction to the special issue. Journal of Clinical Child & Adolescent Psychology, 43, 845-850. doi:10.1080/15374416.2014.930690
Sparkes, A. C. (2014). Developing mixed methods research in sport and exercise psychology: Critical reflections on five points of controversy. Psychology of Sport and Exercise, 16, 49-58. doi:10.1016/j.psychsport.2014.08.014
Symonds, J. E., & Gorard, S. (2010). Death of mixed methods? Or the rebirth of research as a craft. Evaluation & Research in Education, 23(2), 121-136. doi:10.1080/09500790.2010.483514
Torrance, H. (2012). Triangulation, respondent validation, and democratic participation in mixed methods research. Journal of Mixed Methods Research, 6(2), 111-123. doi:10.1177/1558689812437185
Truscott, D. M., Swars, S., Smith, S., Thornton-Reid, F., Zhao, Y., Dooley, C., Williams, B., ... Matthews, M. (2010). A cross-disciplinary examination of the prevalence of mixed methods in educational research: 1995-2005. International Journal of Social Research Methodology, 13, 317-328. doi:10.1080/13645570903097950
Venkatesh, V., Brown, S. A., & Bala, H. (2013). Bridging the qualitative-quantitative divide: Guidelines for conducting mixed methods research in information systems. MIS Quarterly, 37(1), 21-54. Retrieved from http://www.misq.org/index.html
Wheeldon, J. (2010). Mapping mixed methods research: Methods, measures, and meaning. Journal of Mixed Methods Research, 4(2), 87-102. doi:10.1177/1558689809358755
Wisdom, J. P., Cavaleri, M. A., Onwuegbuzie, A. J., & Green, C. A. (2012). Methodological reporting in qualitative, quantitative, and mixed methods health services research articles. Health Services Research, 47, 721-745. doi:10.1111/j.1475-6773.2011.01344.x
Zachariadis, M., Scott, S., & Barrett, M. (2013). Methodological implications of critical realism for mixed-methods research. MIS Quarterly, 37, 855-879. Retrieved from http://www.misq.org/index.html
Zohrabi, M. (2013). Mixed method research: Instruments, validity, reliability and reporting findings. Theory and Practice in Language Studies, 3, 254-262. doi:10.4304/tpls.3.2.254-262
Notetaking and Fieldwork
Burghardt, G. M., Bartmess-LeVasseur, J. N., Browning, S. A., Morrison, K. E., Stec, C. L., Zachau, C. E., & Freeberg, T. M. (2012). Perspectives: Minimizing observer bias in behavioral studies: A review and recommendations. Ethology, 118, 511-517. doi:10.1111/j.1439-0310.2012.02040.x
Christie, C. D., Bemister, T. B., & Dobson, K. S. (2015). Record-informing and note-taking: A continuation of the debate about their impact on client perceptions. Canadian Psychology/Psychologie, 56(1). Retrieved from http://psycnet.apa.org
Cole, C. E. (2013). Stories from the lived and living fieldwork process. Qualitative Research in Organizations and Management, 8(1), 50-69. doi:10.1108/17465641311327513
Ivey, J. (2012). The value of qualitative research methods. Pediatric Nursing, 38, 319-344. Retrieved from http://www.pediatricnursing.org
Jackson, J. E. (1990). I am a fieldnote: Fieldnotes as a symbol of professional identity. In R. Sanjek (Ed.), Fieldnotes: The making of anthropology (pp. 3-33). Ithaca, NY: Cornell University Press.
Kawulich, B. B. (2005). Participant observation as a data collection method. Forum: Qualitative Social Research, 6(2). Retrieved from http://www.qualitative-research.net/index.php/fqs/article/view/466/996
Kennedy-Lewis, B. L. (2012). When a teacher becomes a researcher: Using self-narrative to define one’s role as participant observer. Theory Into Practice, 51(2), 107-113. doi:10.1080/00405841.2012.662865
Kouritzin, S. (2002). The half-baked concept of raw data in ethnographic observation. Canadian Journal of Education, 27(1), 119-138. Retrieved from http://www.csse-scee.ca/
Mulhall, A. (2003). In the field: Notes on observation in qualitative research. Journal of Advanced Nursing, 41, 306-313. doi:10.1046/j.1365-2648.2003.02514.x
Murthy, D. (2013). Ethnographic research 2.0. Journal of Organizational Ethnography, 2(1), 23-36. doi:10.1108/JOE-01-2012-0008
Punch, S. (2012). Hidden struggles of fieldwork: Exploring the role and use of field diaries. Emotion, Space and Society, 5(2), 86-93. doi:10.1016/j.emospa.2010.09.005
Walford, G. (2009). The practice of writing ethnographic field notes. Ethnography & Education, 4(2), 117-130. doi:10.1080/17457820902972713
Walshe, C., Ewing, G., & Griffiths, J. (2012). Using observation as a data collection method to help understand patient and professional roles and actions in palliative care setting. Palliative Medicine, 26, 1048-1054. doi:10.1177/0269216311432897
Wolcott, H. F. (1994). Transforming qualitative data: Description, analysis, and interpretation. Thousand Oaks, CA: Sage.
Wolcott, H. F. (1995). The art of fieldwork. New York, NY: Altamira Press.
Wolfinger, N. H. (2002). On writing fieldnotes: Collection strategies and background expectancies. Qualitative Research, 2, 85-95. doi:10.1177/1468794102002001640
Phenomenological Sources
Applebaum, M. (2012). Phenomenological psychological research as science. Journal of Phenomenological Psychology, 43(1), 36-72. doi:10.1163/156916212x632952
Audet, C. T., & Everall, R. D. (2010). Therapist self-disclosure and the therapeutic relationship: A phenomenological study from the client perspective. British Journal of Guidance & Counselling, 38, 327-342. doi:10.1080/03069885.2010.482450
Bevan, M. T. (2014). A method of phenomenological interviewing. Qualitative Health Research, 24, 136-144. doi:10.1177/1049732313519710
Bradbury-Jones, C., Irvine, F., & Sambrook, S. (2010). Phenomenology and participant feedback: Convention or contention. Nurse Researcher, 17(2), 25-33. Retrieved from http://rcnpublishing.com/journal/nr
Chan, N. N., & Walker, C. (2015). An exploration of students’ lived experiences of using smartphones in diverse learning contexts using a hermeneutic phenomenological approach. Computers & Education, 82, 96-106. doi:10.1016/j.compedu.2014.11.001
Chan, Z. C. Y., Fung, Y., & Chien, W. (2013). Bracketing in phenomenology: Only undertaken in the data collection and analysis process? The Qualitative Report, 18(59), 1-9. Retrieved from http://www.nova.edu/ssss/QR/QR18/chan59
Cloonan, T. F. (2012). The employment of the phenomenological psychological method in the service of art education. Journal of Phenomenological Psychology, 43, 73-129. doi:10.1163/156916212X632961
Conklin, T. A. (2013). Work worth doing: A phenomenological study of the experience of discovering and following one's calling. Journal of Management Inquiry, 21, 298-317. doi:10.1177/1056492611414426
Converse, M. (2012). Philosophy of phenomenology: How understanding aids research. Nurse Researcher, 20(1), 28-32. doi:10.7748/nr2012.09.20.1.28.c9305
Davidsen, A. (2013). Phenomenological approaches in psychology and health sciences. Qualitative Research in Psychology, 10, 318-339. doi:10.1080/14780887.2011.608466
Desjarlais, R., & Throop, C. (2011). Phenomenological approaches in anthropology. Annual Review of Anthropology, 40, 87-102. doi:10.1146/annurev-anthro-092010-153345
Dibley, L. (2011). Analyzing narrative data using McCormack’s lenses. Nurse Researcher, 18(3), 13-19. Retrieved from http://nurseresearcher.rcnpublishing.co.uk/news-and-opinion/commentary/analysing-qualitative-data
Dixon, S. E. A., & Clifford, A. (2007). Ecopreneurship: A new approach to managing the triple bottom line. Journal of Organizational Change Management, 20, 326-345. doi:10.1108/09534810710740164
Dowden, A. R., Gunby, J. D., Warren, J. M., & Boston, Q. (2014). A phenomenological analysis of invisibility among African-American males: Implications for clinical practice and client retention. The Professional Counsellor, 4, 58-70. doi:10.15241/ard.4.1.58
Dowling, M., & Cooney, A. (2012). Research approaches related to phenomenology: Negotiating a complex landscape. Nurse Researcher, 20(2), 21-27. doi:10.7748/nr2012.11.20.2.21.c9440
Eberle, T. S. (2010). The phenomenological life world analysis and the methodology of the social sciences. Human Studies, 33(1), 123-139. doi:10.1007/s10746-010-9146-9
Englander, M. (2012). The interview: Data collection in descriptive phenomenological human scientific research. Journal of Phenomenological Psychology, 43, 13-35. doi:10.1163/156916212X632943
Finlay, L. (2009). Exploring lived experience: Principles and practice of phenomenological research. International Journal of Therapy and Rehabilitation, 16, 474-481. doi:10.12968/ijtr.2009.16.9.43765
Fisher, W. P., Jr., & Stenner, A. J. (2011). Integrating qualitative and quantitative research approaches via the phenomenological method. International Journal of Multiple Research Approaches, 5, 85-99. doi:10.5172/mra.2011.5.1.89
Flood, A. (2010). Understanding phenomenology. Nurse Researcher, 17(2), 7-15. Retrieved from http://nurseresearcher.rcnpublishing.co.uk
Gee, J., Loewenthal, D., & Cayne, J. (2013). Phenomenological research: The case of empirical phenomenological analysis and the possibility of reverie. Counseling Psychology Review, 28(3), 52-62. doi:10.1111/j.2044-8341.2011.02053.x
Gill, M. J. (2014). The possibilities of phenomenology for organizational research. Organizational Research Methods, 17(2), 118-137. doi:10.1177/1094428113518348
Ginsberg, A., & Sinacore, A. L. (2013). Counseling Jewish women: A phenomenological study. Journal of Counseling & Development, 91, 131-139. doi:10.1002/j.1556-6676.2013.00081.x
Giorgi, A. (2012). The descriptive phenomenological psychological method. Journal of Phenomenological Psychology, 43, 3-12. doi:10.1163/156916212X632934
Hayman, B., Wilkes, L., & Jackson, D. (2012). Journaling: Identification of challenges and reflection on strategies. Nurse Researcher, 19(3), 27-31. Retrieved from http://www.nursing-standard.co.uk
Hays, D. G., & Wood, C. (2011). Infusing qualitative traditions in counseling research designs. Journal of Counseling & Development, 89, 288-295. doi:10.1002/j.1556-6678.2011.tb00091.x
Husserl, E. (2012). Ideas: General introduction to pure phenomenology. London, U.K.: Routledge.
Ilkay, J. (2013). Identifying motives of mothers who purchase healthy convenience snacks for their children: A phenomenological study. Journal of Business Studies Quarterly, 5, 237-246. Retrieved from http://jbsq.org/
Iwamoto, D. K., Negi, N. J., Partail, R. N., & Creswell, J. W. (2013). The racial and ethnic identity formation process of second-generation Asian Indian Americans: A phenomenological study. Journal of Multicultural Counseling and Development, 41, 224-239. doi:10.1002/j.2161-1912.2013.00038.x
Kafle, N. P. (2013). Hermeneutic phenomenological research method simplified. Bodhi: An Interdisciplinary Journal, 5, 181-200. doi:10.3126/bodhi.v5i1.8053
Khan, S. N. (2014). Qualitative research method - phenomenology. Asian Social Science, 10, 298-310. doi:10.5539/ass.v10n21p298
Kumar, A. (2012). Using phenomenological research methods in qualitative health research. International Journal of Human Sciences, 9, 790-804. Retrieved from http://www.j-humansciences.com
Lien, B. Y., Pauleen, D. J., Kuo, Y., & Wang, T. (2014). The rationality and objectivity of reflection in phenomenological research. Quality and Quantity, 48(1), 189-196. doi:10.1007/s11135-012-9759-
McGowan, T. (2013). The presence of phenomenology: Hegel and the return to metaphysics. Mosaic: A Journal for the Interdisciplinary Study of Literature, 46(1), 95-111. doi:10.1353/mos.2013.0010
Moustakas, C. (1994). Phenomenological research methods. Thousand Oaks, CA: Sage.
Nicholls, D. (2009). Qualitative research: Part two—methodologies. International Journal of Therapy and Rehabilitation, 16, 586-592. Retrieved from http://www.ijtr.co.uk
Norlyk, A., Dreyer, P., Haahr, A., & Martinsen, B. (2011). Understanding the creative processes of phenomenological research: The life philosophy of Logstrup. International Journal of Qualitative Studies on Health and Wellbeing, 6(4), 1-8. doi:10.3402/quhw.v6i4.7320
Petty, N. J., Thomson, O. P., & Stew, G. (2012). Ready for a paradigm shift? Part 2: Introducing qualitative research methodologies and methods. Manual Therapy, 17, 378-384. doi:10.1016/j.math.2012.03.004
Phillips-Pula, L., Strunk, J., & Pickler, P. H. (2011). Understanding phenomenological approaches to data analysis. Journal of Pediatric Health Care, 25, 67-71. doi:10.1016/j.pedhc.2010.09.004
Pietkiewicz, I., & Smith, J. (2014). A practical guide to using interpretative phenomenological analysis in qualitative research psychology. Czasopismo Psychologiczne Psychological Journal, 20, 7-14. doi:10.14691/CPPJ.20.1.7
Pringle, J., Drummond, J., McLafferty, E., & Henry, C. (2010). Interpretative phenomenological analysis: A discussion and critique. Nurse Researcher, 18(3), 20-26. doi:10.7748/nr2011.04.18.3.20.c8459
Pringle, J., Hendry, C., & McLafferty, E. (2011). Phenomenological approaches: Challenges and choices. Nurse Researcher, 18(2), 7-18. doi:10.7748/nr2011.01.18.2.7.c8280
Reiter, S., Stewart, G., & Bruce, C. (2011). A strategy for delayed research method selection: Deciding between grounded theory and phenomenology. Electronic Journal of Business Research Methods, 9(1), 35-46. Retrieved from http://www.ejbrm.com
Rennie, D. L. (2012). Qualitative research as methodical hermeneutics. Psychological Methods, 17, 385-398. Retrieved from http://www.psycnet.apa.org
Roberts, T. (2013). Understanding the research methodology of interpretative phenomenological analysis. British Journal of Midwifery, 21, 215-218. Retrieved from http://www.britishjournalofmidwifery.com
Robertson, J. H., & Thomson, A. M. (2014). A phenomenological study of the effects of clinical negligence litigation on midwives in England: The personal perspective. Midwifery, 30, e121-e130. doi:10.1016/j.midw.2013.12.003
Rocha Pereira, H. (2012). Rigour in phenomenological research: Reflections of a novice nurse researcher. Nurse Researcher, 19(3), 16-19. Retrieved from http://nurseresearcher.rcnpublishing.co.uk
Schrag, C. O. (2012). Celebrating fifty years of the Society for Phenomenology and Existential Philosophy. Journal of Speculative Philosophy, 26(2), 86-92. doi:10.5325/jspecphil.26.2.0086
Smith, J. A. (2011). Evaluating the contribution of interpretative phenomenological analysis. Health Psychology Review, 5(1), 9-27. doi:10.1080/17437199.2010.51065
Snelgrove, S. R. (2014). Conducting qualitative longitudinal research using interpretative phenomenological analysis. Nurse Researcher, 22, 20-25. Retrieved from http://rcnpublishing.com/journal/nr
Starks, H., & Trinidad, S. B. (2007). Choose your method: A comparison of phenomenology, discourse analysis, and grounded theory. Qualitative Health Research, 17, 1372-1380. doi:10.1177/1049732307307031
Stierand, M. B., & Dorfler, V. (2010). Research in brief: Reflecting on phenomenological study of creativity and innovation in haute cuisine. International Journal of Contemporary Hospitality Management, 24, 946-957. doi:10.1108/09596111211247254
Tan, H., Wilson, A., & Olver, I. (2009). Ricoeur’s theory of interpretation: An instrument for data interpretation in hermeneutic phenomenology. International Journal of Qualitative Methods, 8(4), 1-15. Retrieved from https://ejournals.library.ualberta.ca/index.php/IJQM
Tembo, A. C., Parker, V., & Higgins, I. (2013). The experience of sleep deprivation in intensive care patients: Findings from a larger hermeneutic phenomenological study. Intensive and Critical Care Nursing, 29, 310-316. doi:10.1016/j.iccn.2013.05.003
Tirgari, V. (2012). Information technology policies and procedures against unstructured data: A phenomenological study of information technology professionals. Academy of Information & Management Sciences Journal, 15(2), 87-106. Retrieved from http://www.alliedacademies.org/public/journals/journaldetails.aspx?jid=10
Tomkins, L., & Eatough, V. (2013). The feel of experience: Phenomenological ideas for organizational research. Qualitative Research in Organizations and Management: An International Journal, 8, 258-275. doi:10.1108/QROM-04-2012-1060
Tufford, L., & Newman, P. (2012). Bracketing in qualitative research. Qualitative Social Work, 11, 80-96. doi:10.1177/1473325010368316
Vagle, M. D. (2009). Validity as intended: “Bursting forth toward” bridling in phenomenological research. International Journal of Qualitative Studies in Education, 22, 585-605. doi:10.1080/09518390903048784
Van Manen, M. (2007). Phenomenology of practice. Phenomenology & Practice, 1(1), 11-30. Retrieved from https://ejournals.library.ualberta.ca/index.php/pandpr
Wagstaff, C., & Williams, B. (2014). Specific design features of an interpretative phenomenological analysis study. Nurse Researcher, 21(3), 8-12. Retrieved from http://www.nursing-standard.co.uk/
Whittemore, A. H. (2014). Phenomenology and city planning. Journal of Planning, Education, and Research, 34, 301-308. doi:10.1177/0739456X14536989
Willis, P. (2001). The “things themselves” in phenomenology. Indo-Pacific Journal of Phenomenology, 1(1), 1-12. Retrieved from http://www.ajol.info/index.php
Wilson, D., & Washington, G. (2007). Retooling phenomenology: Relevant methods for conducting research with African American women. Journal of Theory Construction and Testing, 11, 63-66. Retrieved from http://tuckerpub.com/jtct.htm
Pilot Studies
Arain, M., Campbell, M. J., Cooper, C. L., & Lancaster, G. A. (2010). What is a pilot or feasibility study? A review of current practice and editorial policy. BMC Medical Research Methodology, 10. doi:10.1186/1471-2288-10-67
Chenail, R. (2011). Interviewing the investigator: Strategies for addressing instrumentation and researcher bias concerns in qualitative research. The Qualitative Report, 16, 255-262. Retrieved from http://www.nova.edu/ssss/QR/QR16-1/interviewing
Cleary, M., Horsfall, J., & Hayter, M. (2014). Data collection and sampling in qualitative research: Does size matter? Journal of Advanced Nursing, 70, 473-475. doi:10.1111/jan.12163
Kim, Y. (2011). The pilot study in qualitative inquiry: Identifying issues and learning lessons for culturally competent research. Qualitative Social Work, 10, 190-206. doi:10.1177/1473325010362001
Leon, A. C., Davis, L. L., & Kraemer, H. C. (2011). The role and interpretation of pilot studies in clinical research. Journal of Psychiatric Research, 45, 626-629. doi:10.1016/j.jpsychires.2010.10.008
Morin, K. H. (2013). Value of a pilot study. Journal of Nursing Education, 52, 547-548. doi:10.3928/01484834-20130920-10
Rao, U. (2012). Concepts in sample size determination. Indian Journal of Dental Research, 23, 660-664. doi:10.4103/0970-9290.107385
Schroder, C., Medves, J., Paterson, M., Vaughan, B., Chapman, C., O’Riordan, A., … Kelly, C. (2011). Development and pilot testing of the collaborative practice assessment tool. Journal of Interprofessional Care, 25(3), 189-195. doi:10.3109/13561820.2010.532620
Secomb, J. M., & Smith, C. (2011). A mixed method pilot study: The researchers’ experiences. Contemporary Nurse: A Journal for the Australian Nursing Profession, 39, 3-35. Retrieved from http://www.contemporarynurse.com
Thabane, L., Ma, J., Chu, R., Cheng, J., Ismaila, A., Rios, L. P., & Goldsmith, C. H. (2010). A tutorial on pilot studies: The what, why and how. BMC Medical Research Methodology, 10. doi:10.1186/1471-2288-10-1
van Teijlingen, E., & Hundley, V. (2002). The importance of pilot studies. Nursing Standard, 16(40), 33-36. doi:10.7748/ns2002.06.16.40.33.c3214
Qualitative Research Foundation
Alcadipani, R., & Hodgson, D. (2009). By any means necessary? Ethnographic access, ethics and the critical researcher. Tamara Journal, 7(4), 127-146. Retrieved from https://www.escholar.manchester.ac.uk
Barnham, C. (2010). Qualis? The qualitative understanding of essence. International Journal of Marketing Research, 52, 757-773. doi:10.2501/S1470785310201648
Bernard, H. R. (2013). Social research methods: Qualitative and quantitative approaches (2nd ed.). Thousand Oaks, CA: Sage.
Bertolotti, F., & Tagliaventi, M. R. (2007). Discovering complex interdependencies in organizational settings: The role of social network analysis in qualitative research. Qualitative Research in Organizations and Management: An International Journal, 2(1), 43-46. doi:10.1108/174656407107491261
Carr, W., & Kemmis, S. (1986). Becoming critical. London: Falmer Press.
Chenail, R. J. (2011). Ten steps for conceptualizing and conducting qualitative research studies in a pragmatically curious manner. The Qualitative Report, 16, 1713-1730. Retrieved from http://www.nova.edu/
Cherryholmes, C. H. (1992). Notes on pragmatism and scientific realism. Educational Researcher, 21(6), 13-17. Retrieved from http://www.educ.ttu.edu
Cox, R. (2012). Teaching qualitative research to practitioner-researchers. Theory into Practice, 51(2), 129-139. doi:10.1080/00405841.2012.662868
Crescentini, A., & Mainardi, G. (2009). Qualitative research articles: Guidelines, suggestions, and needs. Journal of Workplace Learning, 21, 431-439. doi:10.1108/13665620910966820
Cunliffe, A. L. (2011). Crafting qualitative research: Morgan and Smircich 30 years on. Organizational Research Methods, 14, 647-673. doi:10.1177/1094428110373658
Davidson, C. (2009). Imperatives for qualitative research. International Journal of Qualitative Methods, 8, 36-50. Retrieved from http://ejournals.library.ualberta.ca/index.php/IJQM/article/view/4205
Denzin, N. K., & Lincoln, Y. S. (2011). The Sage handbook of qualitative research (4th ed.). Thousand Oaks, CA: Sage
Denzin, N. K. (2009). The research act: A theoretical introduction to sociological methods. New Brunswick, NJ: AldineTransaction.
Ellis, T. J., & Levy, Y. (2008). Framework of problem-based research: A guide for novice researchers on the development of a research-worthy problem. Informing Science: The International Journal of an Emerging Transdiscipline, 11, 323-337. Retrieved from http://inform.nu
Eshlaghy, T. E., Chitsaz, S., Karimian, L., & Charkhchi, R. (2011). A classification of qualitative research methods. Research Journal of International Studies, 20, 106-123. Retrieved from http://www.eurojournals.com/rjis
Fassinger, R., & Morrow, S. L. (2013). Toward best practices in quantitative, qualitative, and mixed-method research: A social justice perspective. Journal for Social Action in Counseling & Psychology, 5(2), 69–83. Retrieved from http://jsacp.tumblr.com/
Gringeri, C., Barusch, A., & Cambron, C. (2013). Examining foundations of qualitative research: A review of social work dissertations, 2008–2010. Journal of Social Work Education, 49, 760–773. doi:10.1080/10437797.2013.812910
Guba, E. G., & Lincoln, Y. S. (1994). Competing paradigms in qualitative research. In N. K. Denzin & Y. S. Lincoln (Eds.), Handbook of qualitative research (pp. 105-117). Thousand Oaks, CA: Sage.
Hegney, D., & Chan, T. W. (2010). Ethical challenges in the conduct of qualitative research. Nurse Researcher, 18(1), 4-7. Retrieved from http://nurseresearcher.rcnpublishing.co.uk/
Hodson, R. (1998). Organizational ethnographies: An underutilized resource in the sociology of work. Social Forces, 76(4), 1173-1208. doi:10.1093/sf/76.4.1173
Hussein, A. (2009). The use of triangulation in social sciences research: Can qualitative and quantitative methods be combined? Journal of Comparative Social Work, 1, 1-12. Retrieved from http://www.jcsw.no
Lietz, C. A., & Zayas, L. E. (2010). Evaluating qualitative research for social work practitioners. Advances in Social Work, 11, 188-202. Retrieved from http://journals.iupui.edu/index.php/advances
Jackson, J. E. (1990). "I am a fieldnote": Fieldnotes as a symbol of professional identity.
In R. Sanjek (Ed.), Fieldnotes: The making of anthropology (pp. 3-33). Ithaca: Cornell University Press.
Jansen, H. (2010). The logic of qualitative survey research and its position in the field of social research methods. Forum: Qualitative Social Research, 11(2), 1-22. Retrieved from http://www.qualitative-research.net/index.php/fqs/article/view/1450/2946
Johnston, J. (2010). Qualitative research methods. Radiologic Technology, 82(2), 188–189. Retrieved from http://www.asrt.org
Kahlke, R. (2014). Generic qualitative approaches: Pitfalls and benefits of methodological mixology. International Journal of Qualitative Methods, 13, 37-52. Retrieved from http://ejournals.library.ualberta.ca/index.php/IJQM/article/view/19590
Kisely, S., & Kendall, E. (2011). Critically appraising qualitative research: A guide for clinicians more familiar with quantitative techniques. Australasian Psychiatry, 19, 364–367. doi:10.3109/10398562.2011.562508
Kramer-Kile, M. L. (2012). Research column: Situating methodology within qualitative research. Canadian Journal of Cardiovascular Nursing, 22(4), 27-31. Retrieved from http://pappin.com/journals/cjcn.php
Logie-MacIver, L., Piacentini, M., & Eadie, D. (2012). Using qualitative methodologies to understand behaviour change. Qualitative Market Research: An International Journal, 15, 70–86. doi:10.1108/13522751211192008
Maindonald, J. H. (2011). Qualitative research from start to finish by Robert K. Yin. International Statistical Review, 79, 499-500. doi:10.1111/j.1751-5823.2011.00159_20.x
Marcus, G. E. (2002). Beyond Malinowski and after writing culture: On the future of cultural anthropology and the predicament of ethnography. The Australian Journal of Anthropology, 13, 191-199. doi:10.1111/j.1835-9310.2002.tb00199.x
Marcus, G. E. (1999). What is at stake - and is not - in the idea and practice of multi-sited ethnography. Canberra Anthropology, 6-14. doi:10.1080/03149099909508344
Marshall, C., & Rossman, G. B. (2016). Designing qualitative research (6th ed.). Thousand Oaks, CA: Sage.
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook (2nd ed.). Thousand Oaks, CA: Sage.
Mishler, E. G. (1986). Research interviewing: Context and narrative. Cambridge, MA: Harvard University Press.
Moustakas, C. (1994). Phenomenological research methods. Thousand Oaks, CA: Sage.
Niaz, M. (2009). Qualitative methodology and its pitfalls in educational research. Quality and Quantity, 43(4), 535-551. doi:10.1007/s11135-007-9136-9
Patton, M. Q. (2002). Qualitative research & evaluation methods. Thousand Oaks, CA: Sage.
Powdermaker, H. (1966). Stranger and friend: The way of an anthropologist. New York, NY: W. W. Norton.
Richardson, L., & Adams St. Pierre, E. (2008). Writing: A method of inquiry. In N. K. Denzin, & Y. S. Lincoln (Eds.), Collecting and interpreting qualitative materials (3rd ed., pp. 473-500). Thousand Oaks, CA: Sage.
Rubin, H. J., & Rubin, I. S. (2012). Qualitative interviewing: The art of hearing data (3rd ed.). Thousand Oaks, CA: Sage.
Sangasubana, N. (2011). How to conduct ethnographic research. The Qualitative Report, 16(2), 567-573. Retrieved from http://www.nova.edu/ssss/QR
Shenton, A. K. (2004). Strategies for ensuring trustworthiness in qualitative research projects. Education for Information, 22(2), 63-75. Retrieved from http://www.iospress.nl
Sherrod, M. M. (2011). Using multiple methods in qualitative research design. Journal of Theory Construction and Testing, 10(1), 22-25. doi:425.7544788132
Slevitch, L. (2011). Qualitative and quantitative methodologies compared: Ontological and epistemological perspectives. Journal of Quality Assurance in Hospitality & Tourism, 12(1), 73-81. doi:10.1080/1528008X.2011.541810
Snow, D. A., Morrill, C., & Anderson, L. (2003). Elaborating analytic ethnography: Linking fieldwork and theory. Ethnography, 4(2), 181-200. doi:10.1177/14661381030042002
Srivastava, A., & Thomson, S. B. (2009). Framework analysis: A qualitative methodology for applied policy research. Journal of Administration and Governance, 4(2), 72-79. Retrieved from http://www.joaag.com/Home_Page.php
Tewksbury, R. (2009). Qualitative versus quantitative methods: Understanding why qualitative methods are superior for criminology and criminal justice. Journal of Theoretical and Philosophical Criminology, 1, 38-58. Retrieved from http://www.jtprcrim.org
Verdinelli, S. (2013). Data display in qualitative research. International Journal of Qualitative Methods, 12, 359-381. Retrieved from ejournals.library.ualberta.ca
Wengraf, T. (2001). Qualitative research interviewing: Biographic narratives and semi-structured methods. Thousand Oaks, CA: Sage.
Whyte, W. F. (1955). Street corner society (2nd ed.). Chicago: University of Chicago Press.
Wolcott, H. F. (2002). Sneaky kid and its aftermath: Ethics and intimacy in fieldwork. Walnut Creek, CA: AltaMira Press.
Wolcott, H. F. (2004). The ethnographic autobiography. Auto/Biography, 12, 93-106. doi:10.1191/0967550704ab004oa
Wolcott, H. F. (2005). The art of fieldwork (2nd ed.). Walnut Creek, CA: AltaMira Press. (Seminal work in ethnography)
Wolcott, H. F. (2008). Ethnography: A way of seeing. Lanham, MD: AltaMira Press.
Wolcott, H. F. (2009). Writing up qualitative research. Thousand Oaks, CA: Sage.
Wolcott, H. F. (2010). Ethnography lessons: A primer. Walnut Creek, CA: Left Coast Press, Inc.
Yilmaz, K. (2013). Comparison of quantitative and qualitative research traditions: Epistemological, theoretical, and methodological differences. European Journal of Education, 48, 311-325. doi:10.1111/ejed.12014
Yin, R. K. (2012). Applications of case study research (3rd ed.). Thousand Oaks, CA: Sage.
Yin, R. K. (2014). Case study research: Designs and methods (5th ed.). Thousand Oaks, CA: Sage.
Qualitative and Quantitative Sources
Adams, J., Broom, A., & Jennaway, M. (2012). Qualitative methods in research: One framework for future inquiry. Journal of Manipulative and Physiological Therapeutics, 18(3), 55-60. Retrieved from http://www.jmptonline.org
Ahmed, S. P., & Ahmed, M. T. Z. (2014). Qualitative research: A decisive element to epistemological & ontological discourse. Journal of Studies in Social Sciences, 8, 298-313. Retrieved from http://www.infinitypress.info/index.php/jsss/index
Alasuutari, P. (2010). The rise and relevance of qualitative research. International Journal of Social Research Methodology, 13, 139-155. doi:10.1080/13645570902966056
Allwood, C. M. (2012). The distinction between qualitative and quantitative research methods is problematic. Quality and Quantity, 46, 1417-1429. doi:10.1007/s11135-011-9455-8
Amitabh, M., & Gupta, R. K. (2010). Research in strategy-structure-performance construct: Review of trends, paradigms and methodologies. Journal of Management and Organization, 16, 744-763. Retrieved from http://jmo.e-contentmanagement.com/
Arghode, V. (2012). Qualitative and quantitative research: Paradigmatic differences. Global Education Journal, 2012(4), 155-163. Retrieved from http://franklinpublishing.net/globaleducation.html
Astalin, P. K. (2013). Qualitative research designs: A conceptual framework. International Journal of Social Science and Interdisciplinary Research, 2(1), 118-124. Retrieved from http://www.indianresearchjournals.com
Bailey, L. F. (2014). The origin and success of qualitative research. International Journal of Market Research, 56, 167-184. doi:10.2501/ijmr-2014-013
Bansal, P., & Corley, K. (2011). The coming of age for qualitative research: Embracing the diversity of qualitative methods. Academy of Management Journal, 54, 233-237. doi:10.5465/AMJ.2011.60262792
Bansal, P., & Corley, K. (2012). Publishing in AMJ - Part 7: What's different about qualitative research? Academy of Management Journal, 55, 509-513. doi:10.5465/amj.2012.4003
Barnham, C. (2012). Separating methodologies. International Journal of Market Research, 54, 736-738. doi:10.2501/IJMR-54-6-736-738
Bleijenbergh, I., Korzilius, H., & Verschuren, P. (2011). Methodological criteria for the internal validity and utility of practice oriented research. Quality and Quantity, 45, 145-156. doi:10.1007/s11135-010-9361-5
Bloomer, M. J., Cross, W., Endacott, R., O’Connor, M., & Moss, C. (2012). Qualitative observation in a clinical setting: Challenges at end of life. Nursing and Health Sciences, 14, 25-31. doi:10.1111/j.1442-2018.2011.00653.x
Borrego, M., Douglas, E. P., & Amelink, C. T. (2011). Quantitative, qualitative, and mixed research methods in engineering education. Journal of Engineering Education, 41, 153-166. Retrieved from http://www.jee.org
Bristowe, K., Selman, L., & Murtagh, F. E. (2015). Qualitative research methods in renal medicine: An introduction. Nephrology Dialysis Transplantation, 30, 1424-1431. doi:10.1093/ndt/gfu410
Bytheway, A. (2013). Qualitative research without money: Experiences with a home-grown qualitative content analysis tool. The Journal of Community Informatics, 9(4). Retrieved from http://ci-journal.net/index.php/ciej/article/view/978/1058
Cairney, P., & St Denny, E. (2015). Reviews of what is qualitative research and what is qualitative interviewing. International Journal of Social Research Methodology: Theory and Practice, 18, 117-125. doi:10.1080/13645579.2014.957434
Carrera-Fernandez, M. J., Guardia-Olmos, J., & Pero-Cebollero, M. (2013). Qualitative research in psychology: Misunderstandings about textual analysis. Quality & Quantity, 47, 1589-1603. doi:10.1007/s11135-011-9611-1
Carus, A. W., & Oglivie, S. (2009). Turning qualitative into quantitative evidence: A well-used method made explicit. The Economic History Review, 62, 893-925. doi:10.1111/j.1468-0289.2009.00486.x
Camfield, L., & Palmer-Jones, R. (2013). Improving the quality of development research: What could archiving qualitative data for reanalysis and revisiting research sites contribute? Progress in Development Studies, 13, 323-328. doi:10.1177/1464993413490481
Castellan, C. M. (2010). Quantitative and qualitative research: A view for clarity. International Journal of Education, 2(2), 1-14. Retrieved from http://www.macrothink.org/journal/index.php/ije/article/download/446/361
Cheu-Jey, G. L. (2012). Reconsidering constructivism in qualitative research. Educational Philosophy and Theory, 44, 403-412. doi:10.1111/j.1469-5812.2010.00720.x
Cleary, M., Horsfall, J., & Hayter, M. (2014). Qualitative research: Quality results? Journal of Advanced Nursing, 70, 711-713. doi:10.1111/jan.12172
Cokley, K., & Awad, G. H. (2013). In defense of quantitative methods: Using the “master’s tools” to promote social justice. Journal for Social Action in Counseling and Psychology, 5(2), 26-41. Retrieved from http://jsacp.tumblr.com/
Cole, C., Chase, S., Couch, O., & Clark, M. (2011). Research methodologies and professional practice: Considerations and practicalities. Electronic Journal of Business Research Methods, 9(2), 141-151. Retrieved from http://www.ejbrm.com
Cope, D. G. (2014). Methods and meanings: Credibility and trustworthiness of qualitative research. Oncology Nursing Forum, 41, 89-91. doi:10.1188/14.ONF.89-91
Corley, K. (2011). The coming of age for qualitative research: Embracing the diversity of qualitative methods. Academy of Management Journal, 54, 233-237. doi:10.5465/AMJ.2011.60262792
Cox, R. (2012). Teaching qualitative research to practitioner-researchers. Theory into Practice, 51(2), 129-139. doi:10.1080/00405841.2012.662868
DeForge, B. R. (2010). Research design principles. In N. J. Salkind (Ed.), Encyclopedia of research design (Vol. 3, p. 1252). Los Angeles, CA: Sage.
Echambadi, R., Campbell, B., & Agarwal, R. (2012). Encouraging best practice in quantitative management research: An incomplete list of opportunities. Journal of Management Studies, 23, 801-820. doi:10.1111/j.1467-6486.2006.00660.x
Erickson, F. (2012). Qualitative research methods for science education. Second International Handbook of Science Education, 24, 1451-1469. doi:10.1007/978-1-4020-9041-7_93
Erlingsson, C., & Brysiewicz, P. (2013). Orientation among multiple truths: An introduction to qualitative research. African Journal of Emergency Medicine, 3, 92-99. doi:10.1016/j.afjem.2012.04.005
Eshlaghy, T. E., Chitsaz, S., Karimian, L., & Charkhchi, R. (2011). A classification of qualitative research methods. Research Journal of International Studies, 20, 106-123. Retrieved from http://www.eurojournals.com/rjis
Fassinger, R., & Morrow, S. L. (2013). Toward best practices in quantitative, qualitative, and mixed-method research: A social justice perspective. Journal for Social Action in Counseling & Psychology, 5(2), 69-83. Retrieved from http://jsacp.tumblr.com/
Fisher, W. P., Jr., & Stenner, A. J. (2011). Integrating qualitative and quantitative research approaches via the phenomenological method. International Journal of Multiple Research Approaches, 5, 85-99. doi:10.5172/mra.2011.5.1.89
Frankfort-Nachmias, C., & Nachmias, D. (2008). Research methods in the social sciences (7th ed.). New York, NY: Worth Publishers.
Frels, R. K., & Onwuegbuzie, A. J. (2013). Administering quantitative instruments with qualitative interviews: A mixed research approach. Journal of Counseling and Development, 91(2), 184-194. doi:10.1002/j.1556-6676.2013.00085.x
Freshwater, D., Cahill, J., Walsh, E., & Muncey, T. (2010). Qualitative research as evidence: Criteria for rigour and relevance. Journal of Research in Nursing, 15, 497-508. doi:10.1177/1744987110385278
Gergen, K. J., Josselson, R., & Freeman, M. (2015). The promises of qualitative inquiry. American Psychologist, 70(1), 1-9. doi:10.1037/a0038597
Gerring, J. (2011). How good is enough? A multidimensional, best-possible standard for research design. Political Research Quarterly, 64, 625-636. doi:10.1177/1065912910361221
Gibson, J. W. (2010). A winning combination for business researchers: A review of qualitative methods in business research. The Qualitative Report, 15(4), 1012-1015. Retrieved from http://www.nova.edu/ssss/QR/QR15-4/eriksson
Gibson, S., Benson, O., & Brand, S. L. (2013). Talking about suicide confidentiality and anonymity in qualitative research. Nursing Ethics, 20, 18-29. doi:10.1177/0969733012452684
Gioia, D. A., Corley, K. G., & Hamilton, A. L. (2012). Seeking qualitative rigor in inductive research: Notes on the Gioia methodology. Organizational Research Methods, 16, 15-31. doi:10.1177/1094428112452151
Goertz, G., & Mahoney, J. (2012). Methodological Rorschach tests: Contrasting interpretations in qualitative and quantitative research. Comparative Political Studies, 46, 236-251. doi:10.1177/0010414012466376
Goffin, K., Raja, J. Z., Claes, B., Szwejczewski, M., & Martinez, V. (2012). Rigor in qualitative supply chain management research. International Journal of Physical Distribution & Logistics Management, 42, 804-827. doi:10.1108/09600031211269767
Gringeri, C., Barusch, A., & Cambron, C. (2013). Epistemology in qualitative social work research: A review of published articles, 2008-2010. Social Work Research, 37, 55-63. doi:10.1093/swr/svs032
Grossoehme, D. H. (2014). Overview of qualitative research. Journal of Health Care Chaplaincy, 20(3), 109-122. doi:10.1080/08854726.2014.925660
Guercini, S. (2014). New qualitative research methodologies in management. Management Decision, 52, 662-674. doi:10.1108/MD-11-2013-0592
Hazzan, O., & Nutov, L. (2014). Teaching and learning qualitative research: Conducting qualitative research. Qualitative Report, 19, 1-29. Retrieved from http://www.nova.edu/ssss/QR/QR19/hazzan1
Hossain, D. M. (2011, September). Qualitative research process. Postmodern Openings, 7, 143-156. Retrieved from http://postmodernopenings.com
Humphrey, C. (2014). Qualitative research-mixed emotions. Qualitative Research in Accounting & Management, 11, 51-70. doi:10.1108/QRAM-03-2014-0024
Ihantola, E. M., & Kihn, L. A. (2011). Threats to validity and reliability in mixed methods accounting research. Qualitative Research in Accounting and Management, 8(1), 39-58. doi:10.1108/11766091111124694
Isaacs, A. N. (2014). An overview of qualitative research methodology for public health researchers. International Journal of Medicine & Public Health, 4, 318-323. doi:10.4103/2230-8598.144055
Johnson, B. C., Dunlap, E., & Benoit, E. (2010). Organizing mountains of words for data analysis, both qualitative and quantitative. Substance Use & Misuse, 45, 648-670. doi:10.3109/10826081003594757
Kelemen, M., & Rumens, N. (2012). Pragmatism and heterodoxy in organization research: Going beyond the quantitative/qualitative divide. International Journal of Organizational Analysis, 20, 5-12. doi:10.1108/19348831211215704
Kemparaj, U., & Chavan, S. (2013). Qualitative research: A brief description. Indian Journal of Medical Sciences, 67(3), 89-98. doi:10.4103/0019-5359.121127
Kisely, S., & Kendall, E. (2011). Critically appraising qualitative research: A guide for clinicians more familiar with quantitative techniques. Australasian Psychiatry, 19, 364-367. doi:10.3109/10398562.2011.562508
Kozlowski, S. W. J., Chao, G. T., Grand, J. A., Braun, M. T., & Kuljanin, G. (2013). Advancing multilevel research design: Capturing the dynamics of emergence. Organizational Research Methods, 16, 581-615. doi:10.1177/1094428113493119
Krivokapic-Skoko, B., & O'Neill, G. (2011). Beyond the qualitative-quantitative distinction: Some innovation methods for business and management research. International Journal of Multiple Research Approaches, 5, 290-300. doi:10.5172/mra.2011.5.3.290
Labaree, D. F. (2011). The lure of statistics for educational researchers. Educational Theory, 61, 621-632. doi:10.1111/j.1741-5446.2011.00424.
Logie-MacIver, L., Piacentini, M., & Eadie, D. (2012). Using qualitative methodologies to understand behaviour change. Qualitative Market Research: An International Journal, 15, 70-86. doi:10.1108/13522751211192008
Malagon-Maldonado, G. (2014). Qualitative research in health design. HERD: Health Environments Research & Design Journal, 7(4), 120-134. doi:10.1177/193758671400700411
Michell, J. (2011). Qualitative research meets the ghost of Pythagoras. Theory & Psychology, 21, 241-259. doi:10.1177/0959354310391351
Nalbone, D. P. (2012). A quantitative look at a new qualitative methodology. PsycCRITIQUES, 57, 329-335. doi:10.1037/a0026557
Nicholls, D. (2009). Qualitative research: Part two—methodologies. International Journal of Therapy and Rehabilitation, 16, 586-592. Retrieved from http://www.ijtr.co.uk
Pathak, V., Jena, B., & Kalra, S. (2013). Qualitative research. Perspectives in Clinical Research, 4(3), 192. doi:10.4103/2229-3485.115389
Petty, N. J., Thomson, O. P., & Stew, G. (2012). Ready for a paradigm shift? Part 2: Introducing qualitative research methodologies and methods. Manual Therapy, 17, 378-384. doi:10.1016/j.math.2012.03.004
Polit, D. F., & Beck, C. T. (2010). Generalization in quantitative and qualitative research: Myths and strategies. International Journal of Nursing Studies, 47, 1451-1458. doi:10.1016/j.ijnurstu.2010.06.004
Plowman, D. A., & Smith, A. D. (2011). The gendering of organizational research methods. Qualitative Research in Organizations and Management, 6(1), 64-82. doi:10.1108/17465641111129399
Poortman, C. C., & Schildkamp, K. K. (2012). Alternative quality standards in qualitative research? Quality & Quantity, 46, 1727-1751. doi:10.1007/s11135-011-9555-5
Reeves, S., Kuper, A., & Hodges, B. D. (2008). Qualitative research: Qualitative research methodologies: Ethnography. BMJ, 337, 511-514. doi:10.1136/bmj.a1020
Roulston, K. (2010). Considering quality in qualitative interviewing. Qualitative Research, 10(2), 199-202. doi:10.1177/1468794109356739
Sargeant, J. (2012). Qualitative research part II: Participants, analysis, and quality assurance. Journal of Graduate Medical Education, 4(1), 1-3. doi:10.4300/JGME-D-11-00307.1
Secomb, J. M., & Smith, C. (2011). A mixed method pilot study: The researchers’ experiences. Contemporary Nurse: A Journal for the Australian Nursing Profession, 39, 31-35. Retrieved from http://www.contemporarynurse.com
Sarma, S. K. (2015). Qualitative research: Examining the misconceptions. South Asian Journal of Management, 22, 176-191. Retrieved from http://www.sajm-amdisa.org
Simpson, S. H. (2011). Demystifying the research process: Mixed methods. Pediatric Nursing, 37(1), 28-29. Retrieved from http://www.pediatricnursing.net
Slevitch, L. (2011). Qualitative and quantitative methodologies compared: Ontological and epistemological perspectives. Journal of Quality Assurance in Hospitality & Tourism, 12(1), 73-81. doi:10.1080/1528008X.2011.541810
Smythe, L. (2012). Discerning which qualitative approach fits best. New Zealand College of Midwives, 46, 5-12. Retrieved from http://www.midwife.org.nz
Swafford, L. G. (2014). Elements and evaluation of qualitative research. Radiation Therapist, 23(1), 90-91. Retrieved from http://www.asrt.org/
Terrell, S. R. (2012). Mixed-methods research methodologies. The Qualitative Report, 17(1), 254-280. Retrieved from http://www.nova.edu/ssss/QR/QR17-1/terrell
Thamhain, H. J. (2014). Assessing the effectiveness of quantitative and qualitative methods for R&D project proposal evaluations. Engineering Management Journal, 26(3), 3-12. Retrieved from http://www.asem.org/asemweb-emj.html
Thomas, R. (2012). Five ways of doing qualitative analysis: Phenomenological psychology, grounded theory, discourse analysis, narrative research, and intuitive inquiry. British Journal of Psychology, 103, 291-292. doi:10.1111/j.2044-8295.2012.02104.x
Toloie-Eshlaghy, A., Chitsaz, S., Karimian, L., & Charkhchi, R. (2011). A classification of qualitative research methods. Research Journal of International Studies, 20, 106-123. Retrieved from http://kgma.kz/en/2748.html
Tracy, S. J. (2010). Qualitative quality: Eight “big-tent” criteria for excellent qualitative research. Qualitative Inquiry, 16, 837-851. doi:10.1177/1077800410383121
Tuli, F. (2010). The basis of distinction between qualitative and quantitative research in social science: Reflection on ontological, epistemological and methodological perspectives. Ethiopian Journal of Education and Science, 6(1), 97-108. Retrieved from http://www.ajol.info/index.php/ejesc/article/download/65384/53078
Wahyuni, D. (2012). The research design maze: Understanding paradigms, cases, methods and methodologies. Journal of Applied Management Accounting Research, 10(1), 69-80. Retrieved from http://www.cmawebline.org/jamar
Waite, D. (2014). Teaching the unteachable: Some issues of qualitative research pedagogy. Qualitative Inquiry, 20, 267-281. doi:10.1177/1077800413489532
Watkins, D. C. (2012). Qualitative research: The importance of conducting research that doesn’t count. Health Promotion Practice, 13, 153-158. doi:10.1177/1524839912437370
Westerman, M. A., & Yanchar, S. C. (2011). Changing the terms of the debate: Quantitative methods in explicitly interpretive research. Theory & Psychology, 21(2), 139-154. doi:10.1177/0959354310393565
White, J., & Drew, S. (2011). Collecting data or creating meaning? Qualitative Research Journal, 11(1), 3-12. doi:10.3316/ARJ1101003
White, D. E., Oelke, N. D., & Friesen, S. (2012). Management of a large qualitative data set: Establishing trustworthiness of the data. International Journal of Qualitative Methods, 11, 244-258. Retrieved from http://ejournals.library.ualberta.ca/index.php/IJQM/article/view/9883
Wiles, R., Crow, G., & Pain, H. (2011). Innovation in qualitative research methods: A narrative review. Qualitative Research, 11, 587-604. doi:10.1177/1468794111413227
Woodside, A. G. (2010). Bridging the chasm between survey and case study research: Research methods for achieving generalization, accuracy, and complexity. Industrial Marketing Management, 39(1), 64-75. doi:10.1016/j.indmarman.2009.03.017
Wuest, J. (2011). Are we there yet? Positioning qualitative research differently. Qualitative Health Research, 21, 875-883. doi:10.1177/1049732311401424
Yilmaz, K. (2013). Comparison of quantitative and qualitative research traditions: Epistemological, theoretical, and methodological differences. European Journal of Education, 48, 311-325. doi:10.1111/ejed.12014
Reliability, Validity, Transferability, and Generalizability Sources
Abowitz, D. A., & Toole, T. M. (2010). Mixed methods research: Fundamental issues of design, validity, and reliability in construction research. Journal of Construction Engineering & Management, 136(1), 108-116. doi:10.1061/(ASCE)CO.1943-7862.0000026
Ali, A. M., & Yusof, H. (2011). Quality in qualitative studies: The case of validity, reliability and generalizability. Issues in Social and Environmental Accounting, 5(1/2), 25-64. Retrieved from http://isea.icseard.uns.ac.id
Amerson, R. (2011). Making a case for the case study method. Journal of Nursing Education, 50, 427-428. doi:10.3928/01484834-20110719-01
Andrade, A. D. (2009). Interpretive research aiming at theory building: Adopting and adapting the case study design. The Qualitative Report, 14(1), 42-60. Retrieved from http://www.nova.edu/ssss/QR/QR14-1/diaz-andrade
Anney, V. (2014). Ensuring the quality of the findings of qualitative research: Looking at trustworthiness criteria. Journal of Emerging Trends in Educational Research and Policy Studies, 5, 272-281. Retrieved from http://jeteraps.scholarlinkresearch.com
Aravamudhan, N. R., & Krishnaveni, R. (2015). Establishing and reporting content validity evidence of training and development capacity building scale (TDCBS). Management Journal of Contemporary Management Issues, 20(1), 131-158. Retrieved from http://hrcak.srce.hr/management
Aust, F., Diedenhofen, B., Ullrich, S., & Musch, J. (2013). Seriousness checks are useful to improve data validity in online research. Behavior Research Methods (Online), 45, 527-35. doi:10.3758/s13428-012-0265-2
Azham, A., & Hamidah, Y. (2011). Quality in qualitative studies: The case of validity, reliability and generalizability. Issues in Social & Environmental Accounting, 5(1/2), 25-64. Retrieved from http://isea.icseard.uns.ac.id
Bekhet, A. K., & Zauszniewski, J. A. (2012). Methodological triangulation: An approach to understanding data. Nurse Researcher, 20(2), 40-43. Retrieved from http://www.nursing-standard.co.uk
Bisman, J. (2010). Postpositivism and accounting research: A personal primer on critical realism. Australasian Accounting Business & Finance Journal, 4(4), 3-25. Retrieved from http://ro.uow.edu.au
Bleijenbergh, I., Korzilius, H., & Verschuren, P. (2011). Methodological criteria for the internal validity and utility of practice oriented research. Quality and Quantity, 45(1), 145-156. doi:10.1007/s11135-010-9361-5
Bouckenooghe, D., Clercq, D. D., Willem, A., & Buelens, M. (2007). An assessment of validity in entrepreneurship research. The Journal of Entrepreneurship, 16(2), 147-171. doi:10.1177/097135570701600202
Brahma, S. S. (2009). Assessment of construct validity in management research: A structured guideline. Journal of Management Research, 9, 59-71. Retrieved from http://www.indianjournals.com/ijor.aspz?target=ijor:jmr&type=home
Branthwaite, A., & Patterson, S. (2011). The power of qualitative research in the era of social media. Qualitative Market Research, 14, 430-440. doi:10.1108/13522751111163245
Brod, M., Tesler, L. E., & Christiansen, T. L. (2009). Qualitative research and content validity: Developing best practices based on science and experience. Quality of Life Research, 18, 1263-1278. doi:10.1007/s11136-009-9540-9
Burchett, H. E., Mayhew, S. H., Lavis, J. N., & Dobrow, M. J. (2013). When can research from one setting be useful in another? Understanding perceptions of the applicability and transferability of research. Health Promotion International, 28, 418-430. doi:10.1093/heapro/das026
Burghardt, G. M., Bartmess-LeVasseur, J. N., Browning, S. A., Morrison, K. E., Stec, C. L., Zachau, C. E., & Freeberg, T. M. (2012). Perspectives – minimizing observer bias in behavioral studies: A review and recommendations. Ethology, 118, 511-517. doi:10.1111/j.1439-0310.2012.02040.x
Cahoon, M. V., Bowler, J. L., & Bowler, M. C. (2012). A reevaluation of assessment center construct-related validity. International Journal of Business and Management, 7(9), 3-19. doi:10.5539/ijbm.v7n9p3
Chenail, R. J. (2010). Getting specific about qualitative research generalizability. Journal of Ethnographic and Qualitative Research, 5(1), 1-11. Retrieved from http://www.jeqr.org
Cho, J., & Trent, A. (2011). Validity in qualitative research revisited. Qualitative Research, 6, 319-340. doi:10.1177/1468794106065006
Cook, K. E. (2012). Reliability assessments in qualitative health promotion research. Health Promotion International, 27, 90-101. doi:10.1093/heapro/dar027
Crowson, H. M. (2009). Does the DOG scale measure dogmatism? Another look at construct validity. The Journal of Social Psychology, 149, 265-283. doi:10.3200/SOCP.149.3.365-383
Da Mota Pedrosa, A., Näslund, D., & Jasmand, C. (2012). Logistics case study based research: Towards higher quality. International Journal of Physical Distribution & Logistics Management, 42, 275-295. doi:10.1108/09600031211225963
Donatelli, R. E., & Lee, S. J. (2013). How to report reliability in orthodontic research: Part 1. American Journal of Orthodontics and Dentofacial Orthopedics, 144(1), 156-161. doi:10.1016/j.ajodo.2013.03.014
Dressman, M., McCarthey, S., & Prior, P. (2011). Generalizability or a thousand points of light? The promises and dilemmas of qualitative literacy research. Research in the Teaching of English, 45, 349-352. Retrieved from http://www.ncte.org
Drost, E. A. (2011). Validity and reliability in social science research. Education Research and Perspectives, 38(1), 105-124. Retrieved from http://www.education.uwa.edu.au/research/journal
El Hussein, M., Jakubec, S. L., & Osuji, J. (2015). Assessing the FACTS: A mnemonic for teaching and learning the rapid assessment of rigor in qualitative research studies. The Qualitative Report, 20, 1182-1184. Retrieved from http://nsuworks.nova.edu/tqr/vol20/iss8/3
Elo, S., Kaariainen, M., Kanste, O., Polkki, T., Utriainen, K., & Kyngas, H. (2014, January-March). Qualitative content analysis: A focus on trustworthiness. SAGE Open, 1-10. doi:10.1177/2158244014522633
Feldt, R. C., & Koch, C. (2011). Reliability and construct validity of the college student stress scale. Psychological Reports, 108, 660-666. doi:10.2466/02.08.13.16.PRO.108.2.660-666
Gheondea-Eladi, A. (2014). Is qualitative research generalizable? Journal of Community Positive Practices, 14(3), 114-124. Retrieved from http://jppc.ro/?lang=en
Gibbert, M., & Ruigrok, W. (2010). The what and how of case study rigor: Three strategies based on published work. Organizational Research Methods, 13, 710-737. doi:10.1177/1094428109351319
Gibbert, M., Ruigrok, W., & Wicki, B. (2008). What passes as a rigorous case study? Strategic Management Journal, 29, 1465-1474. Retrieved from http://smj.strategicmanagement.net/
Golafshani, N. (2003). Understanding reliability and validity in qualitative research. The Qualitative Report, 8, 597-607. Retrieved from http://www.nova.edu/ssss/QR/QR8-4/golafshani
Green, L. W., & Glasgow, R. E. (2006). Evaluating the relevance, generalization, and applicability of research: Issues in external validation and translation methodology. Evaluation & the Health Professions, 29(1), 126-153. doi:10.1177/0163278705284445
Hodges, N. (2011). Qualitative research: A discussion of frequently articulated qualms (FAQs). Family and Consumer Sciences Research Journal, 40, 90-92. doi:10.1111/j.1552-3934.2011.02091.x
Holloway, I., Brown, L., & Shipway, R. (2010). Meaning not measurement: Using ethnography to bring a deeper understanding to the participant experience of festivals and events. International Journal of Event and Festival Management, 1(1), 74-85. doi:10.1108/17852951011029315
Houghton, C., Casey, D., Shaw, D., & Murphy, K. (2013). Rigour in qualitative case-study research. Nurse Researcher, 20(4), 12-17. doi:10.7748/nr2013.03.20.4.12.e326
Humble, A. M. (2009). Technique triangulation for validation in directed content analysis. International Institute for Qualitative Methodology, 8(3), 34-51. Retrieved from http://ejournals.library.ualberta.ca/index.php/IJQM/article/viewFile/1480/5586
Ihantola, E. M., & Kihn, L. A. (2011). Threats to validity and reliability in mixed methods accounting research. Qualitative Research in Accounting and Management, 8(1), 39-58. doi:10.1108/11766091111124694
Jensen, H. I., Ammentorp, J., Erlandsen, M., & Ording, H. (2012). End of life practices in Danish ICUs: Development and validation of a questionnaire. BMC Anesthesiology, 12(1), 16-22. doi:10.1186/1471-2253-12-16
Kane, M. (2012). All validity is construct validity. Or is it? Measurement, 10(1/2), 66-70. doi:10.1080/15366367.2012.681977
Kelemen, M., & Rumens, N. (2012). Pragmatism and heterodoxy in organization research: Going beyond the quantitative/qualitative divide. International Journal of Organizational Analysis, 20, 5-12. doi:10.1108/19348831211215704
Kihn, L. & Ihantola, E. (2015). Approaches to validation and evaluation in qualitative studies of management accounting. Qualitative Research in Accounting & Management, 12(3), 230-255. doi:10.1109/QRAM-03-2013-0012
Kornbluh, M. (2015). Combatting challenges to establishing trustworthiness in qualitative research. Qualitative Research in Psychology, 12, 397-414. doi:10.1080/14780887.2015.1021941
Krippendorff, K. (2011). Agreement and information in the reliability of coding. Communications Methods & Measures, 5(2), 93-112. doi:10.1080/19312458.2011.568376
Larsson, S. (2009). A pluralist view of generalization in qualitative research. International Journal of Research & Method in Education, 32(1), 25-38. doi:10.1080/17437270902759931
Lasch, K. E., Marquis, P., Vigneux, M., Abetz, L., Arnould, B., Bayliss, M., Crawford, B., & Rosa, K. (2010). PRO development: Rigorous qualitative research as the crucial foundation. Quality of Life Research, 19, 1087-1096. doi:10.1007/s11136-010-9677-6
Molina-Azorin, J. F. (2011). The use and added value of mixed methods in management research. Journal of Mixed Methods Research, 5(1), 7-24. doi:10.1177/1558689810384490
Morse, J. M., Barrett, M., Mayan, M., Olson, K., & Spiers, J. (2002). Verification strategies for establishing reliability and validity in qualitative research. International Journal of Qualitative Methods, 1(2), 13-22. Retrieved from http://ejournals.library.ualberta.ca/index.php/IJQM/index
Nakkeeran, N., & Zodpey, S. P. (2012). Qualitative research in applied situations: Strategies to ensure rigor and validity. Indian Journal of Public Health, 56(1), 4-11. doi:10.4103/0019-557X.96949
Noble, H., & Smith, J. (2015). Issues of validity and reliability in qualitative research. Evidence-Based Nursing, 18(2), 34-35. doi:10.1136/eb-2015-102054
Oleinik, A. (2011). Mixing quantitative and qualitative content analysis: Triangulation at work. Quality and Quantity, 45, 859-873. doi:10.1007/s11135-010-9399-4
Oliphant, G. C., Hansen, K., & Oliphant, B. J. (2008). Predictive validity of a behavioral interview technique. Marketing Management Journal, 18(2), 93-105. Retrieved from http://www.mmaglobal.org
Oluwatayo, J. A. (2012). Validity and reliability issues in education research. Journal of Educational and Social Research, 2, 391-399. doi:10.5901/jesr.2012.v2n2.391
Onwuegbuzie, A. J., & Leech, N. L. (2007). Validity and qualitative research: An oxymoron? Quality & Quantity: International Journal of Methodology, 41, 233-249. doi:10.1007/s11135-006-9000-3
Pearson, M., & Coomber, R. (2010). The challenge of external validity in policy-relevant systematic reviews: A case study from the field of substance misuse. Addiction, 105(1), 136-145. doi:10.1111/j.1360-0443.2009.02713.x
Polit, D. F., & Beck, C. T. (2010). Generalization in quantitative and qualitative research: Myths and strategies. International Journal of Nursing Studies, 47, 1451-1458. doi:10.1016/j.ijnurstu.2010.06.004
Rennie, D. L. (2012). Qualitative research as methodical hermeneutics. Psychological Methods, 17, 385-398. Retrieved from http://www.psycnet.apa.org
Riege, A. M. (2003). Validity and reliability tests in case study research: A literature review with “hands-on” applications for each research phase. Qualitative Market Research: An International Journal, 6(2), 75-86. doi:10.1108/13522750310470055
Rocha Pereira, H. (2012). Rigour in phenomenological research: Reflections of a novice nurse researcher. Nurse Researcher, 19(3), 16-19. Retrieved from http://nurseresearcher.rcnpublishing.co.uk
Roe, B. E., & Just, D. R. (2009). Internal and external validity in economics research: Tradeoffs between experiments, field experiments, natural experiments, and field data. American Journal of Agricultural Economics, 91, 1266-1271. doi:10.1111/j.1467-8276.2009.01295.x
Rossiter, J. R. (2008). Content validity of measures of abstract constructs in management and organizational research. British Journal of Management, 19, 380-388. doi:10.1111/j.1467-8551.2008.00587.x
Shadish, W. R. (2011). The truth about validity. New Directions for Evaluation, 2011(130), 107-117. doi:10.1002/ev.369
Slater, S., & Yani-de-Soriano, M. (2010). Researching consumers in multicultural societies: Emerging methodological issues. Journal of Marketing Management, 26, 1143-1160. doi:10.1080/0267257X.2010.509581
Slone, D. J. (2009). Visualizing qualitative information. The Qualitative Report, 14, 489-497. Retrieved from http://www.nova.edu/ssss/QR/QR14-3/slone
Soter, A. O., Connors, S. P., & Rudge, L. (2012). Use of coding manual when providing a meta-interpretation of internal-validity mechanisms and demographic data used in qualitative research. Journal of Ethnographic and Qualitative Research, 17(6), 69-80. doi:10.24584593467.567945
Steckler, A., & McLeroy, K.R. (2008). The importance of external validity. American Journal of Public Health, 98(1), 9-10. doi:10.2105/AJP.2007.126847
Stone-Romero, E., & Rosopa, P. J. (2010). Research design options for testing mediation models and their implications for facets of validity. Journal of Managerial Psychology, 25, 697-712. doi:10.1108/02683941011075256
Street, C. T., & Ward, K. W. (2012). Improving validity and reliability in longitudinal case study timelines. European Journal of Information Systems, 21(2), 160-175. doi:10.1057/ejis.2011.53
Thomas, E., & Magilvy, J. K. (2011). Qualitative rigor or research validity in qualitative research. Journal for Specialists in Pediatric Nursing, 16(2), 151-155. doi:10.1111/j.1744-6155.2011.00283.x
Tiira, K., & Lohi, H. (2014). Reliability and validity of a questionnaire survey in canine anxiety research. Applied Animal Behavior Science, 155, 82-92. doi:10.1016/j.applanim.2014.03.007
Tomasik, T. (2010). Reliability and validity of the Delphi method in guideline development for family physicians. Quality in Primary Care, 18, 317-326. Retrieved from http://www.ingentaconnect.com
Woolcock, M. (2013). Using case studies to explore the external validity of ‘complex’ development interventions. Evaluation, 19, 229-248. doi:10.1177/1356389013495210
Yildirim, K. (2010). Raising the quality in qualitative research. Ilkogretim Online, 9(1), 79-92. Retrieved from http://ilkogretim-online.org.tr/vol9say1/v9s1m8.pdf
Yin, R. K. (2013, July 10). Validity and generalization in future case study evaluations. Evaluation, 19, 312-332. doi:10.1177/1356389013497081
Yu, C., Jannasch-Pennell, A., & DiGangi, S. (2011). Compatibility between text mining and qualitative research in the perspectives of grounded theory, content analysis, and reliability. The Qualitative Report, 16, 730-744. Retrieved from http://www.nova.edu/ssss/QR/QR16-3/yu
Sampling and Incentives
Abrams, L. S. (2010). Sampling hard to reach populations in qualitative research: The case of incarcerated youth. Qualitative Social Work, 9, 536-550. doi:10.1077/1473325010367821
Acharya, A. S., Prakash, A., Saxena, P., & Nigam, A. (2013). Sampling: Why and how of it? Indian Journal of Medical Specialties, 4(2), 330-333. doi:10.7713/ijms.2013.0032
Anderson, R. B., & Hartzler, B. M. (2014). Belief bias in the perception of sample size adequacy. Thinking & Reasoning, 20, 297-314. doi:10.1080/13546783.2013.787121
Angelos, P. (2013). Ethical issues of participant recruitment in surgical clinical trials. Annals of Surgical Oncology, 20, 3184-3187. doi:10.1245/s10434-013-3178-0
Ardern, C. I., Nie, J. X., Perez, D. F., Radhu, N., & Ritvo, P. (2013). Impact of participant incentives and direct and snowball sampling on survey response rate in an ethnically diverse community: Results from a pilot study of physical activity and the built environment. Journal of Immigrant and Minority Health, 15(1), 207-214. doi:10.1007/s10903-011-9525-y
Baltar, F., & Brunet, I. (2012). Social research 2.0: Virtual snowball sampling method using Facebook. Internet Research, 22, 57-74. doi:10.1108/10662241211199960
Brewis, J. (2014). The ethics of researching friends: On convenience sampling in qualitative management and organization studies. Journal of British Management, 25, 849-862. doi:10.1111/1467-8551.12064
Burmeister, E., & Aitken, L. M. (2012). Sample size: How many is enough? Australian Critical Care, 25, 271-274. doi:10.1016/j.aucc.2012.07.002
Cader, H. A., & Leatherman, J. C. (2011). Small business survival and sample selection bias. Small Business Economics, 37, 155-165. doi:10.1007/s11187-009-9240-4
Carlsen, B., & Glenton, C. (2011). What about N? A methodological study of sample size reporting in focus group studies. BMC Medical Research Methodology, 11(1), 26-35. doi:10.1186/1471-2288-11-26
Cleary, M., Horsfall, J., & Hayter, M. (2014). Data collection and sampling in qualitative research: Does size matter? Journal of Advanced Nursing, 70, 473-475. doi:10.1111/jan.12163
Cohen, N., & Arieli, T. (2011). Field research in conflict environments: Methodological challenges and snowball sampling. Journal of Peace Research, 48, 423-435. doi:10.1177/0022343311405698
Dworkin, S. L. (2012). Sample size policy for qualitative studies using in-depth interviews. Archives of Sexual Behavior, 41, 1319-1320. doi:10.1007/s10508-012-0016-6
Emerson, R. W. (2015). Convenience sampling, random sampling, and snowball sampling: How does sampling affect the validity of research? Journal of Visual Impairment & Blindness, 109(2), 164-168. Retrieved from http://www.afb.org/jvib/jvib_main.asp
Fetters, M. D., Curry, L. A., & Creswell, J. W. (2013). Achieving integration in mixed methods designs, principles and practices. Health Services Research, 48, 2134-2156. doi:10.1111/1475-6773.12117
Francis, J. J., Johnston, M., Robertson, C., Glidewell, L., Entwistle, V., Eccles, M. P., & Grimshaw, J. M. (2010). What is an adequate sample size? Operationalizing data saturation for theory-based interview studies. Psychology and Health, 25, 1229-1245. doi:10.1080/08870440903194015
Fugard, A., & Potts, H. (2015). Supporting thinking on sample sizes for thematic analysis: A quantitative tool. International Journal of Social Research Methodology, 18, 669-684. doi:10.1080/13645579.2015.1005453
Gibbs, L., Kealy, M., Willis, K., Green, J., Welch, N., & Daly, J. (2007). What have sampling and data collection got to do with good qualitative research? Australian and New Zealand Journal of Public Health, 31, 540-544. doi:10.1111/j.1753-6405.2007.00140.x
Gillet, J., Cartwright, E., & Van Vugt, M. (2011). Selfish or servant leadership? Evolutionary predictions on leadership personalities in coordination games. Personality and Individual Differences, 51, 231-236. doi:10.1016/j.paid.2010.06.003
Griffith, D. A. (2013). Establishing qualitative geographic sample size in the presence of spatial autocorrelation. Annals of the Association of American Geographers, 103, 1107-1122. doi:10.1080/00045608.2013.776884
Guyll, M., Spoth, R., & Redmond, C. (2003). The effects of incentives and research requirements on participation rates for a community-based preventive intervention research study. Journal of Primary Prevention, 24. doi:10.1023/A:1025023600517
Handcock, M. S., & Gile, K. J. (2011). Comment: On the concept of snowball sampling. Sociological Methodology, 41, 367-371. doi:10.1111/j.1467-9531.2011.01243.x
Hanson, J., Balmer, D., & Giardino, A. (2011). Qualitative research methods for medical educators. Academic Pediatrics, 11, 375-386. doi:10.1016/j.acap.2011.05.001
Harsh, S. (2011). Purposeful sampling in qualitative research synthesis. Qualitative Research Journal, 11, 63-75. doi:10.3316/QRJ1102063
Head, E. (2009). The ethics and implications of paying participants in qualitative research. International Journal of Social Research Methodology, 12, 335-344. doi:10.1080/13645570802246724
Hochwarter, W. (2014). On the merits of student‐recruited sampling: Opinions a decade in the making. Journal of Occupational and Organizational Psychology, 87(1), 27- 33. doi:10.1111/joop.12043
Hodges, N. (2011). Qualitative research: A discussion of frequently articulated qualms (FAQs). Family and Consumer Sciences Research Journal, 40, 90-92. doi:10.1111/j.1552-3934.2011.02091.x
Hyat, M. J. (2013). Understanding sample size determination in nursing research. Western Journal of Nursing Research, 35, 943-956. doi:10.1177/0193945913482052
Jawale, K. V. (2012). Methods of sampling design in the legal research: Advantages and disadvantages. Online International Interdisciplinary Research Journal, 2(6), 183-190. Retrieved from http://www.oiirj.org/oiirj/?page_id=924
Jessiman, W. (2013). ‘To be honest, I haven’t even thought about it’ - recruitment in small-scale, qualitative research in primary care. Nurse Researcher, 21(2), 18-23. doi:10.7748/nr2013.11.21.2.18.e226
Kadam, P., & Bhalerao, S. (2010). Sample size calculation. International Journal of Ayurveda Research, 1(1), 55-57. doi:10.4103/0974-7788.59946
Klotz, A. C., Da Motta Veiga, S. P., Buckley, M. R., & Gavin, M. B. (2013). The role of trustworthiness in recruitment and selection: A review and guide for future research. Journal of Organizational Behavior, 34(Suppl 1), S104-S119. doi:10.1002/job.1891
Larson, A. J., & Sachau, D. A. (2009). Effects of incentives and the Big Five personality dimensions on internet panelists’ ratings. International Journal of Market Research, 51, 687-706. Retrieved from http://www.ijmr.com
Marshall, B., Cardon, P., Poddar, A., & Fontenot, R. (2013). Does sample size matter in qualitative research? A review of qualitative interviews in IS research. Journal of Computer Information Systems, 54(1), 11-22. Retrieved from http://www.iacis.org/jcis/jcis.php
Mason, M. (2010, September). Sample size and saturation in PhD studies using qualitative interviews. Forum: Qualitative Social Research, 11(3). Retrieved from http://www.qualitative-research.net/index.php/fqs/article/view/1428/3027
McQuarrie, E. F., & McIntyre, S. H. (2014). What can you project from small sample qualitative research? Marketing Insights, 26(2), 34-39. Retrieved from https://www.ama.org/publications/MarketingInsights/Pages/what-can-you-project-from-small-sample-qualitative-research-mi-march-april.aspx
Michaelidou, N., & Dibb, S. (2006). Using email questionnaires for research: Good practice in tackling non-response. Journal of Targeting, Measurement & Analysis for Marketing, 14, 289-296. doi:10.1057/palgrave.jt.5740189
Molenberghs, G., Kenward, M., Aerts, M., Verbeke, G., Tsiatis, A., Davidian, M., & Rizopoulos, D. (2014). On random sample size, ignorability, ancillarity, completeness, separability, and degeneracy: Sequential trials, random sample sizes, and missing data. Statistical Methods in Medical Research, 23, 11-41. doi:10.1177/0962280212445801
Monroe, M. C., & Adams, D. C. (2012). Increasing response rates to web-based surveys. Journal of Extension, 50(6), 6-7. Retrieved from http://www.joe.org/joe/2012december/tt7.php
Namageyo-Funa, A., Rimando, M., Brace, A. M., Christiana, R. W., Fowles, T. L., Davis, T. L., Martinez, L. M., & Sealy, D. A. (2014). Recruitment in qualitative public health research: Lessons learned during dissertation sample recruitment. The Qualitative Report, 19(1), 1-17. Retrieved from http://www.nova.edu/ssss/QR/QR19/namageyo-funa1
Nolen, A., & Talbert, T. (2011). Qualitative assertions as prescriptive statements. Educational Psychology Review, 23, 263-271. doi:10.1007/s10648-011-9159-6
Olsen, R., Orr, L., Bell, S., & Stuart, E. (2012). External validity in policy evaluations that choose sites purposively. Journal of Policy Analysis and Management, 32, 107-121. doi:10.1002/pam.21660
Oppong, S. H. (2013). The problem of sampling in qualitative research. Asian Journal of Management Sciences and Education, 2, 202-210. Retrieved from http://www.ajmse.leena-luna.co.jp/
O’Reilly, M., & Parker, N. (2012, May). Unsatisfactory saturation: A critical exploration of the notion of saturated sample sizes in qualitative research. Qualitative Research Journal, 1-8. doi:10.1177/1468794112446106
Palinkas, L. A., Horwitz, S. M., Green, C. A., Wisdom, J. P., Duan, N., & Hoagwood, K. (2013, November). Purposeful sampling for qualitative data collection and analysis in mixed method implementation research. Administration and Policy in Mental Health and Mental Health Services Research, 1-12. doi:10.1007/s10488-013-0528-y
Perez, D. F., Nie, J. X., Ardern, C. I., Radhu, N., & Ritvo, P. (2013). Impact of participant incentives and direct and snowball sampling on survey response rate in an ethnically diverse community: Results from a pilot study of physical activity and the built environment. Journal of Immigrant and Minority Health, 15(1), 207-214. doi:10.1007/s10903-011-9525-y
Polit, D. F., & Beck, C. T. (2010). Generalization in quantitative and qualitative research: Myths and strategies. International Journal of Nursing Studies, 47, 1451-1458. doi:10.1016/j.ijnurstu.2010.06.004
Poulis, K., Poulis, E., & Plakoyiannaki, E. (2013). The role of context in case study selection: An international business perspective. International Business Review, 22, 304-314. doi:10.1016/j.ibusrev.2012.04.003
Pritchard, K., & Whiting, R. (2012). Autopilot? A reflexive review of the piloting process in qualitative e-research. Qualitative Research in Organizations and Management, 7, 338-353. doi:10.1108/17465641211279798
Robinson, O. (2014). Sampling in interview-based qualitative research: A theoretical and practical guide. Qualitative Research in Psychology, 11(1), 25-41. doi:10.1080/14780887.2013.801543
Roy, K., Zvonkovic, A., Goldberg, A., Sharp, E., & LaRossa, R. (2015). Sampling richness and qualitative integrity: Challenges for research with families. Journal of Marriage and Family, 77(1), 243-260. doi:10.1111/jomf.12147
Sánchez-Fernández, J., Muñoz-Leiva, F., Montoro-Ríos, F. J., & Ibáñez-Zapata, J. Á. (2010). An analysis of the effect of pre-incentives and post-incentives based on draws on response to web surveys. Quality and Quantity, 44, 357-373. doi:10.1007/s11135-008-9197-4
Suen, L. W., Huang, H., & Lee, H. (2014). A comparison of convenience sampling and purposive sampling. Hu Za Zhi, 61(3), 105-111. doi:10.6224/JN.61.3.105
Suri, H. (2011). Purposeful sampling in qualitative research synthesis. Qualitative Research Journal (RMIT Training Pty Ltd Trading As RMIT Publishing), 11(2), 63-75. doi:10.3316/QRJ1102063
Swift, J. A., & Tischler, V. (2010). Qualitative research in nutrition and dietetics: Getting started. Journal of Human Nutrition and Dietetics, 23, 559-566. doi:10.1111/j.1365-277X.2010.01116.x
Szolnoki, G., & Hoffmann, D. (2013). Online, face-to-face and telephone surveys: Comparing different sampling methods in wine consumer research. Wine Economics and Policy, 2(2), 57-66. doi:10.1016/j.wep.2013.10.001
Teddlie, C., & Yu, F. (2007). Mixed methods sampling: A typology with examples. Journal of Mixed Methods Research, 1(1), 77-100. doi:10.1177/2345678906292430
Tongco, D. C. (2008). Purposive sampling as a tool for informant selection. Ethnobotany Research & Applications, 5, 147-158. Retrieved from http://scholarspace.manoa.hawaii.edu/handle/10125/227
Trotter, R. T. (2012). Qualitative research sample design and sample size: Resolving and unresolved issues and inferential imperatives. Preventive Medicine, 55, 398- 400. doi:10.1016/j.ypmed.2012.07.003
Uprichard, E. (2013). Sampling: Bridging probability and non-probability designs. International Journal of Social Research Methodology, 16(1), 1-11. doi:10.1080/13645579.2011.633391
Weijters, B., Schillewaert, N., & Geuens, M. (2008). Assessing response styles across modes of data collection. Journal of the Academy of Marketing Science, 36, 409- 422. doi:10.1007/s11747-007-0077-6
Sensemaking
Abolafia, M. (2010). Narrative construction as sensemaking. Organization Studies, 31, 349-367. doi:10.1177/0170840609357380
Angus-Leppan, T., Metcalf, L., & Benn, S. (2010). Leadership styles and CSR practice: An examination of sensemaking, institutional drivers and CSR leadership. Journal of Business Ethics, 93(2), 189-213. doi:10.1007/s10551-009-0221-y
Bisel, R. S., & Arterburn, E. N. (2012). Making sense of organizational members’ silence: A sensemaking-resource model. Communication Research Reports, 29(3), 217-226. doi:10.1080/08824096.2012.684985
Bryant, E. M., & Sias, P. M. (2011). Sensemaking and relational consequences of peer co-worker deception. Communication Monographs, 78(1), 115-137. doi:10.1080/03637751.2010.542473
Colville, I., Brown, A. D., & Pye, A. (2011). Simplexity: Sensemaking, organizing and storytelling for our time. Human Relations, 65(1), 5-15. doi:10.1177/0018726711425617
Conroy, S. A., & O'Leary-Kelly, A. M. (2014). Letting go and moving on: Work-related identity loss and recovery. Academy of Management Review, 39(1), 67-87. doi:10.5465/amr.2011.0396
Dana, J., Dawes, R., & Peterson, N. (2013). Belief in the unstructured interview: The persistence of an illusion. Judgment and Decision Making, 512-520. Retrieved from http://journal.sjdm.org
Das, T. K., & Kumar, R. (2010). Interpartner sensemaking in strategic alliances: Managing cultural differences and internal tensions. Management Decision, 48(1), 17-36. doi:10.1108/00251741011014436
DeKrey, S. J., & Portugal, E. J. (2014). Strategic sensemaking: Challenges faced by a new leader of SME. Procedia – Social and Behavioral Sciences, 150(15), 56-65. doi:10.1016/j.sbspro.2014.09.007
Ivanova, M., & Torkkeli, L. (2013). Managerial sensemaking of interaction within business relationship: A cultural perspective. European Management Journal, 31, 717-727. doi:10.1016/j.emj.2013.07.007
Kelley, K. M., & Bisel, R. S. (2014). Leaders’ narrative sensemaking during LMX role negotiations: Explaining how leaders make sense of who to trust and when. The Leadership Quarterly, 25, 433-448. doi:10.1016/j.leaqua.2013.10.011
Kezar, A. (2013). Understanding sensemaking/sensegiving in transformational change processes from the bottom up. Higher Education, 65, 761-780. doi:10.1007/s10734-012-9575-7
Klein, G., Phillips, J. K., Rall, E. L., & Peluso, D. A. (2007). A data-frame theory of sensemaking. In R. R. Hoffman (Ed.). Expertise out of context: Proceedings of the Sixth International Conference on Naturalistic Decision Making (pp. 113-155). Mahwah, NJ, US: Lawrence Erlbaum Associates Publishers.
Krush, M. T., Agnihotri, R., Trainor, K. J., & Nowlin, E. L. (2013, January 30). Enhancing organizational sensemaking: An examination of the interactive effects of sales capabilities and marketing dashboards. Industrial Marketing Management, 42, 824-835. doi:10.1016/j.indmarman.2013.02.017
Landau, D., & Drori, I. (2008). Narratives as sensemaking accounts: The case of an R & D laboratory. Journal of Organizational Change Management, 21, 701-720. doi:10.1108/09534810810915736
Lockett, A., Currie, G., Finn, R., Martin, G., & Waring, J. (2014). The influence of social position on sensemaking about organizational change. Academy of Management Journal, 57, 1102-1129. doi:10.5465/amj.2011.0055
Lüscher, L. S., & Lewis, M. W. (2008). Organizational change and managerial sensemaking: Working through paradox. Academy of Management Journal, 51(2), 221-240. doi:10.5465/AMJ.2008.31767217
Mantere, S., Schildt, H., & Sillince, J. (2012). Reversal of strategic change. Academy of Management Journal, 55, 173-196. doi:10.5465/amj.2008.0045
Marshall, A. (2014). Sensemaking in second life. Procedia Technology, 13, 107-111. doi:10.1016/j.protcy.2014.02.014
Moon, M. Y. (2009). Making sense of common sense for change management buy-in. Management Decision, 47, 518-532. doi:10.1108/00251740910946769
Olson-Buchanan, J. B., & Boswell, W. R. (2008). An integrative model of experiencing and responding to mistreatment at work. Academy of Management Review, 33(1), 76-96. Retrieved from http://www.aom.pace.edu/
Paull, M., Boudville, I., & Sitlington, H. (2013). Using sensemaking as a diagnostic tool in the analysis of qualitative data. The Qualitative Report, 18(27), 1-12. Retrieved from http://www.nova.edu/ssss/QR/QR18/paull54
Rodríguez, C., & Bélanger, E. (2014). Stories and metaphors in the sensemaking of multiple primary health care organizational identities. BMC Family Practice, 15(1), 41-61. doi:10.1186/1471-2296-15-41
Rouleau, L., & Balogun, J. (2011). Middle managers, strategic sensemaking, and discursive competence. Journal of Management Studies, 48, 953-983. doi:10.1111/j.1467-6486.2010.00941.x
Santelli, A. G., Struthers, C. W., & Eaton, J. (2009). Fit to forgive: Exploring the interaction between regulatory focus, repentance, and forgiveness. Journal of Personality & Social Psychology, 96, 381-394. doi:10.1037/a0012882
Steigenberger, N. (2015). Emotion in sensemaking: A change management perspective. Journal of Organizational Change Management, 28, 432-451. doi:10.1108/JOCM-05-2014-0095
Stigliani, I., & Ravasi, D. (2012). Organizing thoughts and connecting brains: Material practices and the transition from individual to group-level prospective sensemaking. Academy of Management Journal, 55, 1232-1259. doi:10.5465/amj.2010.0890
Thiel, C. E., Bagdasarov, Z., Harkrider, L., Johnson, J. F., & Mumford, M. D. (2012). Leader ethical decision-making in organizations: Strategies for sensemaking. Journal of Business Ethics, 107(1), 49-64. doi:10.1007/s10551-012-1299-1
Thurlow, A., & Mills, J. H. (2009). Change, talk and sensemaking. Journal of Organizational Change Management, 22, 459-579. doi:10.1108/09534810910983442
Tsang, E. W. (2012, August 26). Case study methodology: Causal explanation, contextualization, and theorizing. Journal of International Management, 19, 195-202. doi:10.1016/j.intman.2012.08.004
Weick, K. E. (2011). Organized sensemaking: A commentary on processes of interpretive work. Human Relations, 65(1), 141-153. doi:10.1177/0018726711424235
Welch, C., Piekkari, R., Plakoyiannaki, E., & Paavilainen-Mäntymäki, E. (2011). Theorising from case studies: Towards a pluralist future for international business research. Journal of International Business Studies, 42, 740-762. doi:10.1057/jibs.2010.55
Wetzel, R., & Dievernich, E. F. (2014). Mind the gap. The relevance of postchange periods for organizational sensemaking. Journal of Systems Research and Behavioral Science, 31, 280-300. doi:10.1002/sres.2198
Qualitative Software Analysis Sources
Abu Bakar, A., & Ishak, N. M. (2012). Qualitative data management and analysis using NVivo: An approach used to examine leadership qualities among student leaders. Education Research Journal, 2(3), 94-103. Retrieved from http://www.resjournals.com/ERJ
Bak, K., Murray, E., Gutierrez, E., Ross, J., & Warde, P. (2014). IMRT utilization in Ontario: Qualitative deployment evaluation. International Journal of Health Care Quality Assurance, 27, 742-759. doi:10.1108/IJHCQA-12-2013-0140
Bergin, M. (2011). NVivo 8 and consistency in data analysis: Reflecting on the use of a qualitative data analysis program. Nurse Researcher, 18(3), 6-12. Retrieved from http://journals.rcni.com
Brennan, M. C., & Cotgrave, A. J. (2014). Sustainable development: A qualitative inquiry into the current state of the UK construction industry. Structural Survey, 32, 315-330. doi:10.1108/SS-02-2014-0010
Burnap, P., Avis, N. J., & Rana, O. F. (2013). Making sense of self-reported socially significant data using computational methods. International Journal of Social Research Methodology, 16, 215-230. doi:10.1080/13645579.2013.774174
Cambra-Fierro, J., & Wilson, A. (2011). Qualitative data analysis software: Will it ever become mainstream? Evidence from Spain. International Journal of Market Research, 53(1), 17-24. doi:10.2501/IJMR-53-1-017-024
Carcary, M. (2011). Evidence analysis using CAQDAS: Insights from a qualitative researcher. Electronic Journal of Business Research Methods, 9(1), 10-24. Retrieved from http://www.ejbrm.com
Crofts, K., & Bisman, J. (2010). An illustration of the use of Leximancer software for qualitative data analysis. Qualitative Research in Accounting & Management, 7(2), 180-207. doi:10.1108/11766091011050859
Davis, N. W., & Meyer, B. B. (2009). Qualitative data analysis: A procedural comparison. Journal of Applied Sport Psychology, 21(1), 116-124. doi:10.1080/10413200802575700
Derobertmasure, A., & Robertson, J. E. (2014). Data analysis in the context of teacher training: Code sequence analysis using QDA miner(R). Quality and Quantity, 48, 2255-2276. doi:10.1007/s11135-013-9890-9
Dierckx de Casterlé, B., Gastmans, C., Bryon, E., & Denier, Y. (2012). QUAGOL: A guide for qualitative data analysis. International Journal of Nursing Studies, 49, 360-371. doi:10.1016/j.ijnurstu.2011.09.012
Fielding, N. (2012). The diverse worlds and research practices of qualitative software. Forum: Qualitative Social Research, 13(2). Retrieved from www.qualitative-research.net
Fielding, J., Fielding, N., & Hughes, G. (2013). Opening up open-ended survey data using qualitative software. Quality & Quantity, 47, 3261-3276. doi:10.1007/s11135-012-9716-1
Franzosi, R., Doyle, S., McClelland, L., Putnam Rankin, C., & Vicari, S. (2013). Quantitative narrative analysis software options compared: PC-ACE and CAQDAS (ATLAS.ti, MAXqda, and NVivo). Quality & Quantity, 47, 3219-3247. doi:10.1007/s11135-012-9714-3
Glaser, J., & Laudel, G. (2013). Life with and without coding: Two methods for early-stage data analysis in qualitative research aiming at causal explanations. Forum: Qualitative Social Research, 14(2). Retrieved from http://www.qualitativeresearch.net/index.php/fqs/article/view/1886/3528
Hilal, A. H., & Alabri, S. S. (2013). Using NVivo for data analysis in qualitative research. International Interdisciplinary Journal of Education, 2(2), 181-186. Retrieved from http://iijoe.org/index.htm
Housley, W., & Smith, R. J. (2011). Telling the CAQDAS code: Membership categorization and the accomplishment of ‘coding rules’ in research talk. Discourse Studies, 13, 417-434. doi:10.1177/1461445611403258
Humble, A. (2015). Review essay: Guidance in the world of computer-assisted qualitative data analysis software (CAQDAS) programs. Forum: Qualitative Social Research, 16(2). Retrieved from http://www.qualitative-research.net/index.php/fqs/index
Hutchison, A., Johnston, L., & Breckon, J. (2010). Using QSR-NVivo to facilitate the development of a grounded theory project: An account of a worked example. International Journal of Social Research Methodology, 13, 283-302. doi:10.1080/13645570902996301
Iovu, M., & Runcan, P. L. (2012). The potential use of computer-assisted qualitative data analysis software (CAQDAS) to analyze children’s perceptions of maltreating families. Social Work Review, 67-77. Retrieved from http://cswr.columbia.edu
Kikooma, J. F. (2010). Using qualitative data analysis software in a social constructionist study of entrepreneurship. Qualitative Research Journal, 10(1), 40-51. doi:10.3316/QRJ1001040
Koenig, T. (2011). CAQDAS comparison. Retrieved from the University of Southampton, ReStore: A sustainable Web Resources Repository program funded by the Economic and Social Research Council website: http://www.restore.ac.uk/
Leech, N., & Onwuegbuzie, A. (2011). Beyond constant comparison qualitative data analysis: Using NVivo. School Psychology Quarterly, 26(1), 70-84. doi:10.1037/a0022711
Leong, D., Bahl, V., Jiayan, G., Siang, J., & Lan, T.M. (2013, July). Secure data sanitization for archaic storage devices. Global Science and Technology Journal, 1(1), 41-52. Retrieved from http://www.gstjpapers.com/static/documents/July/2013/7.Vikram.pdf
Mungal, A. (2009). ATLAS.ti: Using QDA software to manage & analyze your research material. NYU Research Digest. Retrieved from http://www.nyu.edu/about/news-publications
Neyeloff, J. L., Fuchs, S. C., & Moreira, L. B. (2012). Meta-analyses and forest plots using a Microsoft Excel spreadsheet: Step-by-step guide focusing on descriptive data analysis. BMC Research Notes, 5, 52-57. doi:10.1186/1756-0500-5-52
Odena, O. (2013). Using software to tell a trustworthy, convincing and useful story. International Journal of Social Research Methodology, 16, 355-372. doi:10.1080/13645579.2012.706019
Pierre, E. A. S., & Jackson, A. Y. (2014). Qualitative data analysis after coding. Qualitative Inquiry, 20, 715-719. doi:10.1177/1077800414532435
Rabinovich, M., & Kacen, L. (2013). Qualitative coding methodology for interpersonal study. Psychoanalytic Psychology, 30, 210-231. doi:10.1037/a0030897.
Rademaker, L. L., Grace, E. J., & Curda, S. K. (2012). Using computer-assisted qualitative data analysis software (CAQDAS) to re-examine traditionally analyzed data: Expanding our understanding of the data and of ourselves as scholars. The Qualitative Report, 17(22), 1-11. Retrieved from http://www.nova.edu/ssss/QR/QR17/rademaker
Rush, S. C. (2014). Review of Transana: Qualitative analysis software for video and audio. Educational Psychology in Practice, 30(2), 213-214. doi:10.1080/02667363.2014.903587
Saillard, E. K. (2011). Systematic versus interpretive analysis with two CAQDAS packages: NVivo and MAXQDA. Forum: Qualitative Social Research, 12(1). Retrieved from http://www.qualitative-research.net/index.php/fqs/article/view/1518/3133
Schmidt, M. (2010). Quantification of transcripts from depth interviews, open-ended responses and focus groups. International Journal of Market Research, 52, 483-508. doi:10.2501/S1470785309201417
Sinkovics, R. R., & Alfoldi, E. A. (2012). Progressive focusing and trustworthiness in qualitative research: The enabling role of computer-assisted qualitative data analysis software (CAQDAS). Management International Review, 52, 817–845. doi:10.1007/s11575-012-0140-5
Sinkovics, R. R., & Penz, E. (2011). Multilingual elite-interviews and software-based analysis. International Journal of Market Research, 53, 705-724. doi:10.2501/IJMR-53-5-705-724
Sotiriadou, P., Brouwers, J., & Le, T. (2014). Choosing a qualitative data analysis tool: A comparison of NVivo and Leximancer. Annals of Leisure Research, 17, 218-234. doi:10.1080/11745398.2014.902292
Talanquer, V. (2014). Using qualitative analysis software to facilitate qualitative data analysis. Tools of Chemistry Education Research, 1166. doi:10.1021/bk-2014-1166.ch005
Turner, B. L., Hyunjung, K., & Andersen, D. F. (2014). Improving coding procedures for purposive text data: Researchable questions for qualitative system dynamics modeling. Systems Dynamics Review, 29, 253-263. doi:10.1002/sdr.1506
White, D. E., Oelke, N. D., & Friesen, S. (2012). Management of a large qualitative data set: Establishing trustworthiness of the data. International Journal of Qualitative Methods, 11, 244-258. Retrieved from http://ejournals.library.ualberta.ca/index.php/IJQM/article/view/9883
Wiles, R., Crow, G., & Pain, H. (2011). Innovation in qualitative research methods: A narrative review. Qualitative Research, 11, 587-604. doi:10.1177/1468794111413227
Wong, L. P. (2008). Data analysis in qualitative research: A brief guide to using NVivo. Malaysian Family Physician, 3, 1985-2274. Retrieved from http://www.ejournal.afpm.org.my/
Woods, M., Paulus, T., Atkins, D. P., & Macklin, R. (2015). Advancing qualitative research using qualitative data analysis software (QDAS)? Reviewing potential versus practice in published studies using ATLAS.ti and NVivo, 1994-2013. Social Science Computer Review, 1-21. doi:10.1177/0894439315596311
Zamawe, F. C. (2015). The implication of using NVivo software in qualitative data analysis: Evidence-based reflections. Malawi Medical Journal, 27(1), 13-15. doi:10.4314/mmj.v27i1.4
Triangulation Sources
Almajali, D. A., & Dahalin, Z. M. (2011). Applying the triangulation approach in IT-business strategic alignment and sustainable competitive advantage. IBIMA Business Review. doi:10.5171/2011.214481
Alvarez, J., Canduela, J., & Raeside, R. (2012). Knowledge creation and the use of secondary data. Journal of Clinical Nursing, 21, 2699-2710. doi:10.1111/j.1365-2702.2012.04296
Arcidiacono, F., & De Gregorio, E. (2008). Methodological thinking in psychology: Starting from mixed methods. International Journal of Multiple Research Approaches, 2, 118-126. Retrieved from http://mra.e-contentmanagement.com
Bannon, W. (2015). Missing data within a quantitative research study: How to assess it, treat it, and why you should care. Journal of the American Association of Nurse Practitioners, 27, 230-232. doi:10.1002/2327-6924.12208
Baxter, P., & Jack, S. (2008). Qualitative case study methodology: Study design and implementation for novice researchers. The Qualitative Report, 13, 544-559. Retrieved from http://www.nova.edu/ssss/QR/QR13-4/baxter
Bekhet, A. K., & Zauszniewski, J. A. (2012). Methodological triangulation: An approach to understanding data. Nurse Researcher, 20(2), 40-43. Retrieved from http://www.nursing-standard.co.uk
Burau, V., & Andersen, L. B. (2014). Professions and professionals: Capturing the changing role of expertise through theoretical triangulation. American Journal of Economics & Sociology, 73, 264-293. doi:10.1111/ajes.12062
Carter, N., Bryant-Lukosius, D., DiCenso, A., Blythe, J., & Neville, A. J. (2014). The use of triangulation in qualitative research. Oncology Nursing Forum, 41, 545-547. doi:10.1188/14.ONF.545-547
Denzin, N. (2006). Sociological methods: A sourcebook (5th ed.). New York, NY: Aldine Transaction.
Denzin, N. K. (2009). The research act: A theoretical introduction to sociological methods. New York, NY: Aldine Transaction.
Denzin, N. K. (2012). Triangulation 2.0. Journal of Mixed Methods Research, 6(2), 80-88. doi:10.1177/1558689812437186
Fehrmann, L., Gregoire, T. G., & Kleinn, C. (2012). Triangulation based inclusion probabilities: A design-unbiased sampling approach. Environmental and Ecological Statistics, 19(1), 107-123. doi:10.1007/s10651-011-0177-9
Fielding, N. G. (2012). Triangulation and mixed methods designs: Data integration with new research technologies. Journal of Mixed Methods Research, 6, 124-136. doi:10.1177/1558689812437101
Foster, D. J., Hayes, T., & Alter, F. (2013). Facing the methodological challenges of reusing previously collected data in a qualitative inquiry. Qualitative Research Journal, 13, 33-48. doi:10.1108/14439881311314522
Fusch, G. E. (2008, December). What happens when the ROI model does not fit? Performance Improvement Quarterly, 14(4), 60-76. doi:10.1111/j.1937-8327.2001.tb00230.x
Fusch, P., & Ness, L. (2015). Are we there yet? Data saturation in qualitative research. The Qualitative Report, 20, 1408-1416. Retrieved from http://tqr.nova.edu/wp-content/uploads/2015/09/fusch1
Gorissen, P., van Bruggen, J., & Jochems, W. (2013). Methodological triangulation of the students’ use of recorded lectures. International Journal of Learning Technology, 8(1), 20-40. doi:10.1504/IJLT.2013.052825
Heale, R., & Forbes, D. (2013). Understanding triangulation in research. Evidence Based Nursing, 16(4), 98. doi:10.1136/eb-2013-101494
Hoque, Z., Covaleski, M. A., & Gooneratne, T. N. (2013). Theoretical triangulation and pluralism in research methods in organizational and accounting research. Accounting, Auditing & Accountability Journal, 26, 1170-1198. doi:10.1108/AAAJ-May-2012-01024
Horne, C., & Horgan, J. (2012). Methodological triangulation in the analysis of terrorist networks. Studies in Conflict & Terrorism, 35. doi:10.1080/1057610x.2012.639064
Houghton, C., Casey, D., Shaw, D., & Murphy, K. (2013). Rigour in qualitative case-study research. Nurse Researcher, 20(4), 12-17. doi:10.7748/nr2013.03.20.4.12.e326
Humble, A. M. (2009). Technique triangulation for validation in directed content analysis. International Journal of Qualitative Methods, 8(3), 34-51. Retrieved from http://ejournals.library.ualberta.ca/index.php/IJQM/article/viewFile/1480/5586
Hussein, A. (2009). The use of triangulation in social sciences research: Can qualitative and quantitative methods be combined? Journal of Comparative Social Work, 1, 1-12. Retrieved from http://www.jcsw.no
Irwin, S. (2013). Qualitative secondary data analysis: Ethics, epistemology and context. Progress in Development Studies, 13, 295-306. doi:10.1177/1464993413490479
Jonsen, K., & Jehn, K. A. (2009). Using triangulation to validate themes in qualitative studies. Qualitative Research in Organizations and Management: An International Journal, 4, 123-150. doi:10.1108/17465640910978391
Lloyd, S. (2011). Triangulation research to inform corporate reputation and practice. Corporate Reputation Review, 14, 221-223. doi:10.1057/crr.2011.16
Manganelli, J., Threatt, A., Brooks, J., Healy, S., Merino, J., Yanik, P., & Green, K. (2014). Confirming, classifying, and prioritizing needed over-the-bed table improvements via methodological triangulation. Health Environments Research & Design Journal, 8, 94-114. Retrieved from http://www.herdjournal.com
Marshall, C., & Rossman, G. (2016). Designing qualitative research (6th ed.). Thousand Oaks, CA: Sage.
Mathison, S. (1988). Why triangulate? Educational Researcher, 17(2), 13-17. doi:10.3102/0013189X017002013
Modell, S. (2005). Triangulation between case study and survey methods in management accounting research: An assessment of validity implications. Management Accounting Research, 16, 231-254. doi:10.1016/j.mar.2005.03.001
Modell, S. (2015). Theoretical triangulation and pluralism in accounting research: a critical realist critique. Accounting, Auditing & Accountability Journal, 28(7). doi:10.1108/AAAJ-10-2014-1841
Oleinik, A. (2011). Mixing quantitative and qualitative content analysis: Triangulation at work. Quality and Quantity, 45, 859-873. doi:10.1007/s11135-010-9399-4
Ostlund, U., Kidd, L., Wengstrom, Y., & Rowa-Dewar, N. (2011). Combining qualitative and quantitative research within mixed method research designs: A methodological review. International Journal of Nursing Studies, 48, 369-383. doi:10.1016/j.ijnurstu.2010.10.005
Simpson, S. H. (2011). Demystifying the research process: Mixed methods. Pediatric Nursing, 37(1), 28-29. Retrieved from http://www.pediatricnursing.net
Stake, R. E. (1995). The art of case study research. Thousand Oaks, CA: Sage.
Stavros, C., & Westberg, K. (2009). Using triangulation and multiple case studies to advance relationship marketing theory. Qualitative Market Research, 12, 307-320. doi:10.1108/13522750910963827
Street, C. T., & Ward, K. W. (2012). Improving validity and reliability in longitudinal case study timelines. European Journal of Information Systems, 21, 160-175. doi:10.1057/ejis.2011.53
Torrance, H. (2012). Triangulation, respondent validation, and democratic participation in mixed methods research. Journal of Mixed Methods Research, 6(2), 111-123. doi:10.1177/1558689812437185
Walsh, K. (2013). When I say … triangulation. Medical Education, 47, 866-866. doi:10.1111/medu.12241
Wilson, V. (2014). Research methods: Triangulation. Evidence Based Library and Information Practice, 9(1), 74-75. Retrieved from http://ejournals.library.ualberta.ca/index.php/EBLIP
Yildirim, K. (2010). Raising the quality in qualitative research. Ilkogretim Online, 9(1), 79-92. Retrieved from http://ilkogretim-online.org.tr/vol9say1/v9s1m8.pdf
Yin, R. K. (2013, July 10). Validity and generalization in future case study evaluations. Evaluation, 19, 312-332. doi:10.1177/1356389013497081
DBA RESEARCH HANDBOOK
SECTION 1: FOUNDATION OF THE STUDY
SECTION 2: THE PROJECT
SECTION 3: APPLICATION TO PROFESSIONAL PRACTICE AND IMPLICATIONS FOR CHANGE
Unit 2 Overview
Self-Diagnostic and Learning Contract
This unit is designed to help you explore your own needs related to the topic of adult learning and to develop a personal learning contract based upon those needs.
Preparing the self-diagnostic and learning contract
• Core Competency Diagnostic and Planning Guide
• Personal Adult Learning Styles Inventory
• Use the results of these two assessments to create a personal learning contract for the course
• Grading criteria: thorough, complete, and reflective of graduate-level work
Personal learning contract overview
• Will contain 4-7 personal learning objectives (related to adult learning)
• Will be for the duration of this course
• Will be aligned to your personal needs (from the assessments in Chapters 16 and 17 in this unit)
• Will be used to personalize two class assignments:
  • Research presentation in Unit 6
  • Application project in Unit 7
Required reading from the textbook: Chapters 15, 16, and 17
Diagnosing Your Needs: Chapter 16 Core Competency Diagnostic and Planning Guide
R = the level required for your role; P = your present level. The need score is the gap between the two (Need = R - P), as the following examples show.
Example: Facilitator B5, the ability to engage learners responsibly in diagnosis of needs for learning
• Student 1: R = 5 and P = 0, so Need = 5
• Student 2: R = 5 and P = 4, so Need = 1
Example: Administrator B2, the ability to make and monitor financial plans and procedures
• Student 1: R = 0 and P = 1, so Need = -1
• Student 2: R = 3 and P = 3, so Need = 0
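If you want to tabulate the gaps across many competency items at once, the following minimal Python sketch may help. It is not part of the course materials; the item labels and ratings are simply the hypothetical Student 1 scores from the examples above, and it assumes the need score is computed as R minus P, which is what the worked examples show.

# Hypothetical sketch: need score = R (level required by your role) - P (present level)
def need_score(required, present):
    return required - present

# Student 1's ratings from the two examples above, as (R, P) pairs
ratings = {
    "Facilitator B5": (5, 0),
    "Administrator B2": (0, 1),
}

# List items from largest to smallest learning need
for item, (r, p) in sorted(ratings.items(), key=lambda kv: -need_score(*kv[1])):
    print(f"{item}: need = {need_score(r, p)}")
# Prints:
# Facilitator B5: need = 5
# Administrator B2: need = -1

Sorting by the need score simply surfaces the competencies with the largest gaps, which are the strongest candidates for learning objectives in your contract.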
The completed diagnostic table is not required as a graded assignment.
Learning journal reflection questions: Which areas did you identify as strengths? Which areas did you identify as your greatest need?
Chapter 17 Personal Adult Learning Styles Inventory
• Complete the survey and calculate the results; scoring directions are on page 273.
• Graph the results and calculate the component results on page 274.
• Learning journal reflection questions: How consistent are your results with what you imagined your style to be? (explain and elaborate) How would you like your style to grow and change in the future? (explain and elaborate)
Unit 2 Learning Contract Assignment (20 points)
• Chapter 15 describes guidelines for learning contracts.
• Step 1 (page 255): diagnose needs. Use your results from Chapters 16 and 17.
• Step 2 (page 256): specify objectives. "Be sure your objectives describe what you will learn, not what you will do."
  • Write 4-7 learning objectives based on your needs.
  • Choose objectives that can be accomplished during the course.
  • Use the syllabus to help pick appropriate and realistic objectives for the course.
• Step 3 (page 257): specify learning resources and strategies. Use the syllabus for ideas that relate directly to the course content.
• Steps 4 and 5: specify evidence of accomplishment and the means of validating that evidence.
  • See Table 15-1 on page 258 for Step 4 ideas.
  • See the Step 5 narrative on page 257 for Step 5 ideas.
Learning Contract Template
Unit 2 Learning Journal reflection questions
1. From the Chapter 16 Diagnostic, which areas did you identify as strengths? Which areas did you identify as your greatest need?
2. From the Chapter 17 Inventory, how consistent are your results with what you imagined your style to be? (explain and elaborate) How would you like your style to grow and change in the future? (explain and elaborate)
3. Reflect on the process you went through to create your learning contract. How were you able to incorporate the information from the needs assessments?
These assigned reflection questions will be answered in the Unit 2 Learning Journal (15 points).
Summary of Unit 2 (check syllabus for due dates)
• Reading Chapters 15, 16, and 17 of your textbook.
• Completing and analyzing the Chapter 16 Core Competency Diagnostic and Planning Guide and the Chapter 17 Personal Adult Learning Styles Inventory. You will use these results to help you put together your learning contract.
• Completing your personal learning contract and submitting via the Assignments area of BBLearn. (20 points)
• Completing the Unit 2 Learning Journal. (15 points)