Quick Start: Pick Your Method in 5 Minutes
Step 1: What’s your main question?
- “How many?” or “How much?” → Use numbers (quantitative)
- “Why?” or “How does it feel?” → Use stories (qualitative)
- Both → Mix methods
Step 2: Check your resources
- Short on time? → Surveys or existing data
- Limited budget? → Online surveys or interviews
- Small team? → Simple approaches
Step 3: Match to your field
- Sciences → Usually numbers
- Social work/education → Often stories
- Health research → Often mixed
Not sure? Start with interviews to explore, then survey to measure.
Stop Overthinking Your Research Method
Picking the wrong research method wastes months of work. This guide helps you choose the right one the first time.
What You’ll Get
- Simple rules for matching methods to questions
- When numbers work better than stories (and vice versa)
- Step-by-step plans for collecting good data
- How to write methods sections that reviewers love
- The mistakes that kill research projects
Understanding Research Methodology
Research Methodology Basics
Your methodology is your game plan. Think of it as:
- Your approach - Will you count things or understand experiences?
- Your design - How will you actually study this?
- Your tools - Surveys? Interviews? Experiments?
- Your analysis - How will you make sense of what you find?
Why This Matters
- People believe you: Good methods make your findings credible
- You measure the right thing: The right tools actually answer your question
- Others can copy you: Clear methods let others repeat your work
- Your work matters: Solid methods help your findings apply to the real world
The Research Process Overview
- Research question development
- Literature review and theory
- Methodology selection
- Research design
- Data collection
- Data analysis
- Interpretation and reporting
Research Paradigms: The Foundation
Research Worldviews
Your research worldview shapes everything. It’s basically:
- What you think reality looks like
- How you think we can understand the world
- What tools you trust most
- How your beliefs influence your work
Major Research Paradigms
“Just Give Me Numbers” Approach
- There’s one right answer and we can measure it
- Use experiments and surveys
- Keep feelings and opinions separate
- Example: Does this medicine work?
“Numbers Plus Context” Approach
- There’s probably a right answer, but we’re not perfect
- Use mostly numbers with some stories
- Admit we might miss things
- Example: Does this teaching method work?
“Everyone’s Different” Approach
- People create their own realities
- Focus on understanding experiences
- Use interviews and observation
- Example: What’s it like being a new parent?
“Whatever Solves the Problem” Approach
- Use any method that gets useful answers
- Mix numbers and stories freely
- Focus on what actually helps
- Example: How do we reduce homelessness?
Pick Your Approach
Ask yourself:
- Am I proving something or exploring something?
- Do I need hard data or deep understanding?
- What works in my field?
- What can I actually do with my time and money?
Quick rules:
- Proving something works → Use numbers
- Understanding how people feel → Use stories
- Solving messy real-world problems → Use whatever works
- Complicated situations → Mix both
Working with Numbers (Quantitative)
When to Use Numbers
Use numbers when you want to:
- Prove something actually works
- Measure relationships between things
- Apply your findings to lots of people
- Use statistics to back up your claims
- Show that X causes Y
Types of Quantitative Research
True Experiments
- Test if something causes something else
- Randomly assign people to groups
- Very controlled but sometimes artificial
- Example: Testing if a new teaching method works
Quasi-Experiments
- Like experiments but less control
- Use when true experiments aren’t possible
- Good for real-world settings
- Example: Comparing schools with different programs
Surveys
- Describe what people think or do
- Ask lots of people the same questions
- Can’t prove causation
- Example: Opinion polls, customer feedback
Correlation Studies
- See if things go together
- Don’t manipulate anything
- Can’t prove one thing causes another
- Example: Do better students study more?
Study Designs
Different groups:
- Each person only in one condition
- Need more people
- Good for comparing treatments
- Example: Half get new drug, half get placebo
Same group over time:
- Same people in all conditions
- Need fewer people
- Watch for practice effects
- Example: Test everyone before and after training
Snapshot studies:
- Collect data once
- Quick and cheap
- Can’t show what causes what
- Example: Survey students about stress right now
Follow-up studies:
- Collect data multiple times
- Takes longer and costs more
- Can track changes
- Example: Follow students from freshman to senior year
Choosing People for Your Study
Random sampling (anyone could be picked):
True random:
- Everyone has equal chance
- Best for applying results widely
- Need list of everyone
- Example: Pick phone numbers randomly
Groups first:
- Pick groups, then people within groups
- Cheaper when people are spread out
- Less accurate than true random
- Example: Pick schools, then students
Non-random sampling (you pick specific people):
Convenient:
- Pick whoever’s easy to reach
- Quick and cheap
- Can’t apply results broadly
- Example: Students in your class
On purpose:
- Pick people with specific traits
- Good when you need experts
- Can’t generalize
- Example: Only experienced teachers
How Many People Do You Need?
Depends on:
- How confident you want to be of detecting a real effect (statistical power, usually 80%)
- How big an effect you expect to find
- How much people vary
- Your budget and time
Figure it out ahead of time:
- Use free software like G*Power (or script it yourself; see the sketch after this list)
- Better to have too many than too few
- But don’t waste resources on way too many
- Consider who you can actually reach
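If you'd rather script the calculation than point-and-click through G*Power, here's a minimal sketch in Python using the statsmodels library. The medium effect size (d = 0.5) is an assumption for illustration, not a recommendation:

```python
# Estimate the sample size needed per group for an independent-samples t-test.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(
    effect_size=0.5,  # assumed medium effect (Cohen's d); plug in your own estimate
    alpha=0.05,       # significance level
    power=0.80,       # the usual 80% chance of detecting a real effect
)
print(f"Needed per group: {n_per_group:.0f} participants")  # ~64 per group
```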
Data Collection Methods
Surveys and questionnaires:
- Advantages: Large samples, standardized, cost-effective
- Disadvantages: Response bias, limited depth
- Best for: Attitudes, behaviors, demographics
- Examples: Likert scales, multiple choice, ranking
Experiments:
- Advantages: High control, establish causation
- Disadvantages: Artificial settings, limited generalizability
- Best for: Testing interventions, comparing treatments
- Examples: Lab experiments, field experiments
Observations:
- Advantages: Natural behavior, objective measurement
- Disadvantages: Observer effects, time-intensive
- Best for: Behavioral frequency, duration, patterns
- Examples: Structured observation, behavioral coding
Secondary data:
- Advantages: Large datasets, cost-effective
- Disadvantages: Limited control, may not fit research questions
- Best for: Population trends, historical analysis
- Examples: Census data, administrative records
Quantitative Data Analysis
Descriptive statistics:
- Central tendency: Mean, median, mode
- Variability: Standard deviation, range
- Distribution: Skewness, kurtosis
- Purpose: Summarize and describe data
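All of these summaries are one-liners in Python with pandas; the scores below are invented for illustration:

```python
import pandas as pd

scores = pd.Series([72, 85, 90, 68, 77, 85, 95, 60, 85, 79])  # hypothetical data

print(scores.mean())                # central tendency: mean
print(scores.median())              # central tendency: median
print(scores.mode())                # central tendency: mode
print(scores.std())                 # variability: standard deviation
print(scores.max() - scores.min())  # variability: range
print(scores.skew())                # distribution: skewness
print(scores.kurt())                # distribution: kurtosis
```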
Inferential statistics:
t-tests:
- Compare means between groups
- One-sample, independent samples, paired samples
- Assumes normal distribution
- Example: Comparing test scores before/after intervention
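A minimal sketch of the before/after example with SciPy (the scores are invented):

```python
# Paired-samples t-test: anxiety scores before vs. after an intervention.
from scipy import stats

before = [24, 30, 28, 35, 22, 31, 27, 29]  # hypothetical scores
after  = [20, 27, 26, 30, 21, 28, 24, 25]

t_stat, p_value = stats.ttest_rel(before, after)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```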
ANOVA (Analysis of Variance):
- Compare means across multiple groups
- One-way, factorial, repeated measures
- Controls for multiple comparisons
- Example: Comparing multiple teaching methods
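A minimal sketch of the teaching-methods example with SciPy (scores invented):

```python
# One-way ANOVA: compare test scores across three teaching methods.
from scipy import stats

lecture    = [70, 75, 80, 72, 68]  # hypothetical scores per method
discussion = [78, 82, 85, 80, 79]
flipped    = [74, 79, 77, 81, 76]

f_stat, p_value = stats.f_oneway(lecture, discussion, flipped)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```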
Correlation:
- Measures relationship strength
- Pearson (continuous), Spearman (ordinal)
- Range from -1 to +1
- Example: Relationship between study time and grades
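Both coefficients are one call each in SciPy; the study-time data below are invented:

```python
from scipy import stats

study_hours = [2, 5, 1, 8, 4, 6, 3, 7]        # hypothetical data
grades      = [65, 80, 60, 92, 75, 85, 70, 88]

r, p = stats.pearsonr(study_hours, grades)         # continuous data
rho, p_s = stats.spearmanr(study_hours, grades)    # ordinal or non-normal data
print(f"Pearson r = {r:.2f}, Spearman rho = {rho:.2f}")
```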
Regression:
- Predicts outcome from predictors
- Linear, multiple, logistic
- Provides effect sizes
- Example: Predicting job performance from multiple factors
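A minimal multiple-regression sketch with statsmodels (all values invented for illustration):

```python
# Multiple linear regression: predict job performance from two predictors.
import numpy as np
import statsmodels.api as sm

experience  = [1, 3, 5, 2, 8, 6, 4, 7]           # hypothetical predictors
test_score  = [60, 70, 75, 65, 90, 80, 72, 85]
performance = [55, 68, 74, 60, 92, 81, 70, 86]   # hypothetical outcome

X = sm.add_constant(np.column_stack([experience, test_score]))  # add intercept
model = sm.OLS(performance, X).fit()
print(model.summary())  # coefficients double as unstandardized effect sizes
```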
Chi-square:
- Tests associations between categorical variables
- Goodness of fit, independence
- Non-parametric test
- Example: Relationship between gender and career choice
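A minimal sketch of the independence test with SciPy; the contingency-table counts are invented:

```python
# Chi-square test of independence on a 2x3 contingency table.
from scipy.stats import chi2_contingency

# Rows: gender; columns: career choice (counts are invented)
table = [[30, 45, 25],
         [50, 30, 20]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.3f}")
```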
Working with Stories (Qualitative)
When to Use Stories
Use stories when you want to:
- Understand how people really feel
- Explore something nobody’s studied much
- Come up with new ideas about how things work
- Study messy real-world situations
- Let people explain things in their own words
Types of Story Research
Understanding experiences:
- What’s it like to go through something?
- Use deep interviews
- Find common themes
- Example: What’s it like having chronic illness?
Building theories:
- How do processes work?
- Interview and observe
- Build theories from what you see
- Example: How do people cope with losing jobs?
Studying groups:
- How do communities work?
- Spend time with people
- Watch and participate
- Example: How do hospital teams work together?
Deep dives:
- Study specific situations closely
- Use multiple information sources
- Example: How one school adopted new technology
Life stories:
- How do people make sense of their lives?
- Collect and analyze stories
- Example: How teachers change careers
Qualitative Sampling
Purposive sampling strategies:
Maximum variation:
- Include diverse cases
- Captures range of experiences
- Identifies common patterns
- Example: Different types of organizations
Homogeneous:
- Similar cases
- Reduces variation
- Focuses on specific group
- Example: First-time mothers
Critical case:
- Cases that make logical point clearly
- “If this case doesn’t work, nothing will”
- Strategically important
- Example: Best-practice organizations
Theoretical:
- Guided by emerging theory
- Sample to test or refine theory
- Common in grounded theory
- Example: Cases that challenge emerging patterns
Sample size considerations:
- Saturation: When no new information emerges
- Depth over breadth: Fewer cases, more detail
- Typical ranges: 5-25 interviews, 1-10 cases
- Context matters: Complex phenomena need larger samples
Qualitative Data Collection
In-depth interviews:
Structured interviews:
- Predetermined questions in fixed order
- Standardized responses
- Limited flexibility
- Good for comparability
Semi-structured interviews:
- Interview guide with flexibility
- Can explore unexpected topics
- Balance structure and openness
- Most common approach
Unstructured interviews:
- Conversational approach
- Maximum flexibility
- Follows participant interests
- Good for exploratory research
Interview best practices:
- Start with easy, descriptive questions
- Use open-ended questions
- Ask for specific examples
- Avoid leading questions
- Practice active listening
- Record and transcribe accurately
Focus groups:
- Advantages: Group interaction, efficient
- Disadvantages: Dominant voices, social desirability
- Best for: Exploring group norms, generating ideas
- Size: 6-10 participants
- Duration: 90-120 minutes
Participant observation:
- Complete participant: Full involvement, hidden role
- Participant observer: Known researcher, some involvement
- Observer participant: Known researcher, limited involvement
- Complete observer: No involvement, known or unknown
Document analysis:
- Public documents: Reports, newspapers, websites
- Private documents: Diaries, letters, emails
- Physical artifacts: Objects, images, spaces
- Digital traces: Social media, online behavior
Qualitative Data Analysis
Thematic analysis:
- Familiarization: Read and re-read data
- Initial coding: Generate initial codes
- Theme development: Group codes into themes
- Theme review: Refine and define themes
- Writing: Tell the story of the data
Content analysis:
- Conventional: Codes emerge from data
- Directed: Uses existing theory
- Summative: Counts and interprets words (see the sketch after this list)
- Quantitative element: Frequency counts
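The counting step in summative content analysis is easy to script; here's a minimal sketch using only Python's standard library (the transcript snippet and keywords are invented):

```python
# Count occurrences of analytically interesting words across transcripts.
from collections import Counter
import re

transcript = """I felt supported by my mentor, but the workload was
overwhelming. Support from peers helped when the workload spiked."""

words = re.findall(r"[a-z']+", transcript.lower())
counts = Counter(words)

for keyword in ["support", "supported", "workload", "overwhelming"]:
    print(keyword, counts[keyword])
```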
Grounded theory analysis:
- Open coding: Break down data
- Axial coding: Relate categories
- Selective coding: Integrate categories
- Constant comparison: Compare incidents
- Theoretical sampling: Sample to develop theory
Narrative analysis:
- Structural: How stories are told
- Thematic: What stories are about
- Dialogical: Who stories are for
- Visual: How stories are presented
Ensuring Qualitative Rigor
Credibility (internal validity):
- Prolonged engagement: Spend adequate time
- Triangulation: Multiple data sources
- Member checking: Participant feedback
- Peer debriefing: Colleague input
Transferability (external validity):
- Thick description: Rich, detailed description
- Maximum variation: Diverse cases
- Clear context: Detailed setting description
- Reader judgment: Let readers assess fit
Dependability (reliability):
- Audit trail: Document all decisions
- Consistent procedures: Standardize methods
- Multiple coders: Inter-rater reliability (see the kappa sketch after this list)
- External audit: Independent review
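Cohen's kappa is a common agreement statistic for the multiple-coders check; a minimal sketch with scikit-learn, using invented codes from two hypothetical coders:

```python
# Inter-rater reliability: agreement between two coders on the same excerpts.
from sklearn.metrics import cohen_kappa_score

coder_a = ["coping", "support", "coping", "barrier", "support", "coping"]
coder_b = ["coping", "support", "barrier", "barrier", "support", "coping"]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa = {kappa:.2f}")  # values above ~0.60 are often read as substantial
```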
Confirmability (objectivity):
- Reflexivity: Acknowledge researcher influence
- Data audit: Trace findings to sources
- Negative case analysis: Search for disconfirming evidence
- Bracketing: Set aside preconceptions
Mixing Numbers and Stories
When to Mix Both Approaches
Sometimes you need both numbers and stories to get the full picture.
Use mixed methods when:
- Numbers alone don’t tell the whole story
- You want to explain surprising results
- You need to explore first, then measure
- Your topic is complex
- Different people want different types of evidence
Three Ways to Mix Methods
Do both at the same time:
- Survey people AND interview some of them
- Compare results to see if they match
- Example: Survey about job satisfaction + interviews about why
Numbers first, then stories:
- Start with a survey
- Interview people to explain surprising results
- Example: Test scores look weird, so interview students about what happened
Stories first, then numbers:
- Start with interviews to understand the topic
- Create a survey based on what you learned
- Example: Interview patients, then survey more patients about what you discovered
Integration Strategies
At design level:
- Choose complementary methods
- Plan timing and priority
- Consider resource allocation
- Design integration points
At methods level:
- Connect sampling strategies
- Link data collection instruments
- Coordinate data collection timing
- Plan for data comparability
At interpretation level:
- Compare results across methods
- Look for convergence and divergence
- Explain discrepancies
- Draw meta-inferences
Mixed Methods Sampling
Identical samples:
- Same participants in both phases
- Good for convergent designs
- Allows direct comparison
- May burden participants
Nested samples:
- Qualitative subset of quantitative sample
- Common in explanatory sequential
- Efficient use of resources
- Need to select strategic subset
Concurrent samples:
- Different but related samples
- Good for convergent designs
- Reduces participant burden
- Need to ensure comparability
Sequential samples:
- Second sample informed by first
- Common in exploratory sequential
- Allows targeted sampling
- May limit comparability
Data Analysis and Integration
Side-by-side comparison:
- Present quantitative and qualitative results separately
- Discuss similarities and differences
- Simplest integration approach
- Risk of superficial integration
Data transformation:
- Convert qualitative data to quantitative (counting themes; see the sketch after this list)
- Convert quantitative data to qualitative (creating categories)
- Allows statistical comparison
- May lose richness in conversion
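"Quantitizing" coded qualitative data can be as simple as tallying themes per participant; a minimal sketch with pandas, using invented codes:

```python
# Turn coded interview themes into per-participant counts for statistical use.
import pandas as pd

coded = pd.DataFrame({
    "participant": ["P1", "P1", "P2", "P2", "P2", "P3"],
    "theme": ["isolation", "support", "support", "support", "isolation", "support"],
})

theme_counts = pd.crosstab(coded["participant"], coded["theme"])
print(theme_counts)  # a matrix you can feed into quantitative comparisons
```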
Joint displays:
- Visual representations of integration
- Tables, matrices, figures
- Show relationships between data types
- Facilitate interpretation
Meta-inferences:
- Higher-level interpretations
- Draw from both data types
- Address original research questions
- Represent true integration
Writing Mixed Methods Results
Convergent design reporting:
- Present quantitative results
- Present qualitative results
- Compare and contrast findings
- Discuss convergence/divergence
- Draw integrated conclusions
Sequential design reporting:
- Present first phase results
- Explain how results informed second phase
- Present second phase results
- Integrate findings across phases
- Discuss overall conclusions
Challenges in Mixed Methods
Practical challenges:
- Requires multiple skill sets
- Time-intensive
- Resource-demanding
- Complex analysis
Methodological challenges:
- Paradigm tensions
- Integration difficulties
- Quality criteria unclear
- Timing and priority decisions
Solutions:
- Collaborate with experts
- Plan carefully
- Use established designs
- Focus on research questions
Simple Decision Guide
What Do You Want to Know?
“How many people think X?” → Survey (quantitative)
“Why do people think X?” → Interviews (qualitative)
“Does treatment X work?” → Experiment (quantitative)
“How do people experience treatment X?” → Interviews or observations (qualitative)
“Does it work AND how do people feel about it?” → Mix both approaches
Practical Constraints
Limited time?
- Quick: Online surveys, existing data
- Medium: Interviews, focus groups
- Long: Experiments, following people over time
Limited budget?
- Cheap: Online surveys, interviews
- Moderate: Focus groups, small experiments
- Expensive: Large studies, lab experiments
Hard to find participants?
- Easy: Students, online communities
- Harder: Professionals, specific populations
- Very hard: Experts, sensitive topics
Common Mistakes to Avoid
Don’t pick your method first
- Start with your question
- Then pick the best method to answer it
Don’t ignore your limits
- Be realistic about time and money
- Plan for things to take longer than expected
Don’t copy others blindly
- What works for them might not work for you
- Consider what makes sense for your situation
Writing Your Methods Section
What to Include
Your methods section needs to:
- Explain why you chose this approach
- Describe what you did clearly
- Give enough detail so others could repeat your study
- Address ethical issues
Simple Structure
1. Overview
- Numbers, stories, or mixed approach
- Why this approach fits your question
2. People in your study
- Who you studied and why
- How you found them
- How many people
3. How you collected data
- What tools you used (surveys, interviews, etc.)
- Step-by-step procedures
- When and where
4. How you analyzed data
- What software or approach
- Specific steps you took
5. Ethics
- Approval from ethics board
- How you protected participants
Writing Tips
Use past tense:
- “I interviewed 20 teachers…”
- “Participants completed surveys…”
- “Data were collected over 3 months…”
Be specific:
- “15-minute interviews” not “short interviews”
- “Google Forms survey” not “online questionnaire”
- “Every Tuesday for 6 weeks” not “regularly”
Explain your choices:
- Why interviews instead of surveys?
- Why these participants?
- Why this analysis approach?
- What are the limitations?
Research Design Section Examples
Quantitative example: “This study employed a randomized controlled trial design to examine the effectiveness of a mindfulness intervention on reducing anxiety in college students. The quantitative approach was chosen to test the hypothesis that mindfulness training would significantly reduce anxiety scores compared to a control condition, allowing for statistical inference about intervention effectiveness.”
Qualitative example: “This study used a phenomenological approach to understand the lived experiences of first-generation college students. The qualitative methodology was selected to capture the depth and complexity of participants' experiences, providing rich insights into how these students navigate higher education challenges.”
Mixed methods example: “This study employed an explanatory sequential mixed methods design, beginning with a quantitative survey phase followed by qualitative interviews. This approach was chosen to first measure the prevalence of burnout among teachers and then explore the underlying factors contributing to burnout in greater depth.”
Participants Section Examples
Quantitative sampling: “Participants were recruited through stratified random sampling from a population of 2,500 undergraduate students at a large public university. The sample was stratified by academic year (freshman through senior) to ensure representation across class levels. A total of 320 students participated (response rate = 68%), with ages ranging from 18 to 24 years (M = 20.1, SD = 1.8). The sample was 62% female, 35% White, 28% Hispanic, 22% Asian, and 15% other ethnicities.”
Qualitative sampling: “Participants were selected using purposive sampling to include individuals with diverse experiences of career transitions. Recruitment continued until data saturation was achieved, resulting in 18 participants. Ages ranged from 28 to 55 years (median = 42), with 11 women and 7 men representing various professional backgrounds including education, healthcare, business, and technology.”
Data Collection Section Examples
Survey data collection: “Data were collected through an online survey administered via Qualtrics. The survey remained open for three weeks, with reminder emails sent at one-week intervals. Participants accessed the survey through a unique link and could complete it at their convenience. Average completion time was 15 minutes, with a completion rate of 89% among those who started the survey.”
Interview data collection: “Semi-structured interviews were conducted via video conferencing using a standardized interview protocol. Interviews lasted 45-90 minutes (M = 62 minutes) and were audio-recorded with participant permission. The interview guide included open-ended questions about career transition experiences, with follow-up probes to explore responses in depth. All interviews were transcribed verbatim by a professional transcription service.”
Data Analysis Section Examples
Quantitative analysis: “Data were analyzed using SPSS version 28. Descriptive statistics were calculated for all variables, including measures of central tendency and variability. Prior to hypothesis testing, assumptions of normality, linearity, and homoscedasticity were assessed. A repeated measures ANOVA was conducted to examine changes in anxiety scores over time, with post-hoc comparisons using Bonferroni correction for multiple comparisons.”
Qualitative analysis: “Interview transcripts were analyzed using thematic analysis following Braun and Clarke’s (2006) six-phase approach. Analysis began with familiarization through repeated reading of transcripts, followed by initial coding using NVivo 12. Codes were then grouped into potential themes, which were reviewed and refined through team discussion. Final themes were defined and named to capture the essence of participants' experiences.”
Common Methodology Writing Mistakes
Too brief or vague:
- “Interviews were conducted” (no details about duration, setting, procedure)
- “Data were analyzed qualitatively” (no specific approach described)
Lack of justification:
- Describing what you did without explaining why
- Not connecting methods to research questions
- Ignoring alternative approaches
Missing important details:
- No information about recruitment
- Unclear sample characteristics
- Missing reliability/validity information
- No mention of ethical considerations
Inappropriate level of detail:
- Too much detail about obvious procedures
- Not enough detail about novel or complex procedures
- Including results in methodology section
Quality Indicators for Methodology Sections
Clear and logical organization:
- Follows standard structure
- Logical flow between sections
- Clear headings and subheadings
Sufficient detail for replication:
- Exact procedures described
- Timing and sequence clear
- Materials and instruments specified
Appropriate justification:
- Rationale for methodological choices
- Connection to research questions
- Acknowledgment of limitations
Ethical considerations addressed:
- IRB approval mentioned
- Participant protection described
- Confidentiality measures explained
Revision Checklist
Content completeness:
□ Research design clearly described
□ Sample/participants thoroughly described
□ Data collection procedures detailed
□ Analysis approach explained
□ Ethical considerations addressed
Writing quality:
□ Clear and concise language
□ Appropriate use of past tense
□ Logical organization
□ Sufficient detail for replication
Methodological rigor:
□ Choices justified with literature
□ Limitations acknowledged
□ Quality measures described
□ Bias mitigation addressed
Real Examples
Example 1: Testing if Something Works
Question: Does mindfulness training reduce student anxiety?
Method: Experiment
- Randomly split 100 students into two groups
- One group gets mindfulness training, other gets regular study skills
- Measure anxiety before and after with validated questionnaire
- Compare changes between groups using statistics
Example 2: Understanding Experiences
Question: How do first-generation college students navigate university?
Method: Interviews
- Interview 15 first-generation students
- Ask about challenges, strategies, support
- Look for common themes in responses
- Report themes with quotes as examples
Example 3: Mixed Approach
Question: Why are teachers leaving the profession?
Method: Survey then interviews
- Survey 500 teachers about job satisfaction
- Interview 20 teachers who scored low on satisfaction
- Use interviews to explain survey results
- Report both numbers and stories
Essential Tools
Analysis Software Quick Guide
For Numbers (Quantitative):
- SPSS - Easy to learn, click-and-point interface
- R - Free, powerful, but requires learning programming
- Excel - Good for simple analysis, widely available
For Stories (Qualitative):
- NVivo - Most comprehensive, expensive
- MAXQDA - User-friendly, good for mixed methods
- Free options - Taguette, QDA Miner Lite
For Surveys:
- Google Forms - Free, simple
- Qualtrics - Powerful, often free through universities
- SurveyMonkey - User-friendly, moderate cost
Picking the Right Tool
Start with:
- Your budget (free vs. paid)
- Your tech skills (beginner vs. advanced)
- What your university provides
- What others in your field use
Learning tip: Start simple, learn as you go
Avoid These Common Mistakes
Wrong Method for Your Question
Mistake: Using surveys to understand experiences
Fix: Match your method to what you want to know
Example: Want to know how students feel about stress? Interview them, don’t just survey them.
Too Few People
Mistake: Not having enough participants
Fix: Calculate sample size ahead of time
For numbers: Use power analysis
For stories: Keep going until you stop learning new things
Measuring the Wrong Thing
Mistake: Your questions don’t actually measure what you want
Fix: Use proven questionnaires when possible
Test first: Try your questions with a few people before the real study
Data Collection Issues
Problem: Low response rates
- Issue: Threatens representativeness and statistical power
- Solutions: Multiple contact attempts, incentives, personalized invitations
- Prevention: Consider participant burden, use clear communication
Problem: Interviewer bias
- Issue: Researcher influence distorts participant responses
- Solutions: Standardized protocols, multiple interviewers, reflexivity
- Prevention: Training, practice interviews, awareness of biases
Problem: Measurement error
- Issue: Inaccurate or inconsistent data collection
- Solutions: Standardized procedures, reliability checks, calibration
- Prevention: Pilot testing, training, quality assurance protocols
Analysis Challenges
Problem: Assumption violations
- Issue: Statistical tests used inappropriately
- Solutions: Check assumptions, use appropriate alternatives
- Prevention: Understand test requirements, consider non-parametric options
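A minimal sketch of an assumption-checking workflow with SciPy (data invented; the 0.05 cutoffs are conventional, not mandatory):

```python
# Check normality and equal variances before a t-test; fall back if violated.
from scipy import stats

group_a = [12, 15, 14, 10, 13, 18, 16, 11]  # hypothetical data
group_b = [22, 19, 25, 30, 21, 28, 35, 24]

_, p_norm_a = stats.shapiro(group_a)        # normality, group A
_, p_norm_b = stats.shapiro(group_b)        # normality, group B
_, p_var = stats.levene(group_a, group_b)   # equal variances

if min(p_norm_a, p_norm_b) < 0.05:
    stat, p = stats.mannwhitneyu(group_a, group_b)  # non-parametric fallback
elif p_var < 0.05:
    stat, p = stats.ttest_ind(group_a, group_b, equal_var=False)  # Welch's t-test
else:
    stat, p = stats.ttest_ind(group_a, group_b)
print(f"p = {p:.3f}")
```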
Problem: Multiple comparisons
- Issue: Increased risk of Type I error
- Solutions: Adjust significance levels, focus on planned comparisons
- Prevention: Plan analyses in advance, limit exploratory testing
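A minimal sketch of a Bonferroni adjustment with statsmodels (the raw p-values are invented):

```python
# Bonferroni adjustment for a family of planned comparisons.
from statsmodels.stats.multitest import multipletests

p_values = [0.012, 0.034, 0.048, 0.20]  # hypothetical raw p-values

reject, p_adjusted, _, _ = multipletests(p_values, alpha=0.05,
                                         method="bonferroni")
for raw, adj, sig in zip(p_values, p_adjusted, reject):
    print(f"raw p = {raw:.3f} -> adjusted p = {adj:.3f}, significant: {sig}")
```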
Problem: Confirmation bias in qualitative analysis
- Issue: Finding only evidence that supports expectations
- Solutions: Systematic coding, negative case analysis, team coding
- Prevention: Acknowledge preconceptions, use structured approaches
Ethical Considerations
IRB/Ethics approval:
- Required for all research involving human participants
- Submit applications early (can take months)
- Include all materials and procedures
- Plan for modifications and amendments
Informed consent:
- Clear explanation of research purpose
- Description of procedures and risks
- Voluntary participation emphasized
- Right to withdraw without penalty
Confidentiality and privacy:
- Secure data storage and transmission
- De-identification procedures (see the hashing sketch after this list)
- Limited access to identifiable data
- Data retention and destruction plans
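One common de-identification step is replacing direct identifiers with salted one-way hashes; a minimal sketch using Python's standard library (the salt and ID below are placeholders):

```python
# Replace direct identifiers with salted one-way hashes before analysis.
import hashlib

SALT = "store-this-secret-separately"  # keep the salt out of the shared dataset

def pseudonymize(participant_id: str) -> str:
    digest = hashlib.sha256((SALT + participant_id).encode()).hexdigest()
    return digest[:12]  # short, stable code; not reversible without the salt

print(pseudonymize("jane.doe@example.edu"))
```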
Vulnerable populations:
- Additional protections required
- Consider capacity to consent
- Minimize risks and maximize benefits
- Special approval procedures
Quality Assurance Strategies
Pre-data collection:
- Pilot test all procedures
- Train data collectors
- Establish reliability protocols
- Plan quality checks
During data collection:
- Monitor data quality regularly
- Check for missing data patterns
- Verify data entry accuracy
- Address problems immediately
Post-data collection:
- Clean and verify data
- Check for outliers and errors (see the sketch after this list)
- Document all decisions
- Conduct sensitivity analyses
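A minimal sketch of the interquartile-range rule for flagging outliers during cleaning, using NumPy (values invented):

```python
# Flag outliers with the interquartile-range rule during data cleaning.
import numpy as np

values = np.array([12, 14, 13, 15, 11, 14, 90, 13])  # hypothetical; 90 is suspect

q1, q3 = np.percentile(values, [25, 75])
iqr = q3 - q1
lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr

outliers = values[(values < lower) | (values > upper)]
print("Outliers to investigate:", outliers)  # document whatever you decide to do
```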
Methodology Evaluation Criteria
Internal Validity (Quantitative)
Threats to address:
- History: External events affecting results
- Maturation: Participants changing over time
- Testing: Effects of repeated measurement
- Instrumentation: Changes in measurement
- Selection: Non-random group differences
- Mortality: Differential dropout
Strategies to enhance:
- Random assignment
- Control groups
- Standardized procedures
- Blinding when possible
External Validity (Quantitative)
Factors affecting generalizability:
- Sample representativeness
- Setting characteristics
- Time and context
- Treatment variations
Strategies to enhance:
- Representative sampling
- Multiple sites/settings
- Diverse populations
- Replication studies
Credibility (Qualitative)
Strategies to enhance:
- Prolonged engagement: Adequate time in field
- Triangulation: Multiple sources/methods
- Member checking: Participant feedback
- Peer debriefing: Colleague review
Transferability (Qualitative)
Strategies to enhance:
- Thick description: Rich contextual detail
- Maximum variation: Diverse cases
- Clear boundaries: Explicit scope
- Reader assessment: Detailed reporting
Research Ethics Deep Dive
Ethical Principles
Respect for persons:
- Treat individuals as autonomous agents
- Protect those with diminished autonomy
- Informed consent requirements
- Right to withdraw
Beneficence:
- Maximize benefits, minimize harms
- Favorable risk-benefit ratio
- Protect participant welfare
- Social value of research
Justice:
- Fair distribution of research benefits and burdens
- Equitable participant selection
- Avoid exploitation of vulnerable groups
- Access to research results
Special Considerations
Vulnerable populations:
- Children and adolescents
- Pregnant women
- Prisoners
- Individuals with impaired decision-making
- Economically disadvantaged
Sensitive topics:
- Trauma and abuse
- Illegal activities
- Stigmatized behaviors
- Personal relationships
- Mental health issues
Digital research ethics:
- Online privacy expectations
- Social media data use
- Digital consent processes
- Data security requirements
Future Directions in Research Methodology
Emerging Approaches
Computational methods:
- Big data analytics
- Machine learning applications
- Natural language processing
- Network analysis
Digital data collection:
- Mobile data collection
- Ecological momentary assessment
- Digital traces and footprints
- Virtual reality environments
Participatory approaches:
- Community-based participatory research
- Co-design methods
- Citizen science
- Action research
Methodological Innovation
Integration of approaches:
- Multi-method designs
- Cross-disciplinary collaboration
- Novel data visualization
- Real-time analysis
Open science practices:
- Preregistration of studies
- Open data sharing
- Reproducible analysis
- Transparent reporting
Your Next Steps
Good methods make people trust your research. Start with a clear question, then pick the approach that fits.
Truth: No method is perfect. Pick the best one you can actually do and be upfront about what you couldn’t control.
Stop Struggling with Research Design
Fynman helps you:
- Find studies that tried what you’re planning
- Pick the right method for your specific question
- Design data collection that actually works
- Plan analysis that makes sense
Good research starts with smart planning. Take time to think it through, admit what you can’t control, and match your methods to your questions.