# Scientific Critical Thinking
## Overview
Critical thinking is a systematic process for evaluating scientific rigor. This skill assesses methodology, experimental design, statistical validity, biases, confounding, and evidence quality using the GRADE and Cochrane Risk of Bias (RoB) frameworks. Apply it for critical analysis of scientific claims.
## When to Use This Skill
This skill should be used when:
- Evaluating research methodology and experimental design
- Assessing statistical validity and evidence quality
- Identifying biases and confounding in studies
- Reviewing scientific claims and conclusions
- Conducting systematic reviews or meta-analyses
- Applying GRADE or Cochrane risk of bias assessments
- Providing critical analysis of research papers
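GRADE's core rating logic can be sketched in a few lines: evidence starts at "high" certainty for randomized trials and "low" for observational studies, then moves down one level per downgrade factor (risk of bias, inconsistency, indirectness, imprecision, publication bias) and up per upgrade factor (large effect, dose–response, residual confounding strengthening the result). A minimal sketch — the function name and integer encoding are illustrative, not part of GRADE itself:

```python
# Hedged sketch of GRADE certainty rating. Levels are ordered lowest to highest;
# the start level depends on study design, then downgrades/upgrades shift it.
LEVELS = ["very low", "low", "moderate", "high"]

def grade_certainty(design: str, downgrades: int = 0, upgrades: int = 0) -> str:
    start = 3 if design == "rct" else 1        # RCTs start high, observational low
    score = max(0, min(3, start - downgrades + upgrades))
    return LEVELS[score]

# An RCT downgraded once (e.g. serious risk of bias) rates as "moderate"
print(grade_certainty("rct", downgrades=1))
```

Real GRADE judgments are qualitative and domain-specific; this sketch only captures the bookkeeping, not the judgment calls behind each downgrade.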
## Core Capabilities
### 1. Methodology Critique
Evaluate research methodology for rigor, validity, and potential flaws.
**Apply when:**
- Reviewing research papers
- Assessing experimental designs
- Evaluating study protocols
- Planning new research
**Evaluation framework:**
1. **Study Design Assessment**
   - Is the design appropriate for the research question?
   - Can the design support the causal claims being made?
   - Are comparison groups appropriate and adequate?
   - Consider whether an experimental, quasi-experimental, or observational design is justified
2. **Validity Analysis**
   - **Internal validity:** Can we trust the causal inference?
     - Check randomization quality
     - Evaluate confounding control
     - Assess selection bias
     - Review attrition/dropout patterns
   - **External validity:** Do the results generalize?
     - Evaluate sample representativeness
     - Consider the ecological validity of the setting
     - Assess whether conditions match the target application
   - **Construct validity:** Do the measures capture the intended constructs?
     - Review measurement validation
     - Check operational definitions
     - Assess whether measures are direct or proxy
   - **Statistical conclusion validity:** Are the statistical inferences sound?
     - Check statistical power and sample size
     - Verify that test assumptions are met
     - Watch for multiple comparisons
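For the power side of statistical conclusion validity, a quick normal-approximation check is often enough to flag an underpowered study. A stdlib-only sketch (the normal approximation slightly overstates power at small sample sizes, where the t distribution matters):

```python
from statistics import NormalDist

def approx_power(d: float, n_per_group: int, alpha: float = 0.05) -> float:
    """Approximate power of a two-sided two-sample test of effect size d
    (Cohen's d) with n_per_group subjects per arm, via the normal approximation."""
    z = NormalDist()
    z_crit = z.inv_cdf(1 - alpha / 2)
    ncp = d * (n_per_group / 2) ** 0.5   # noncentrality under the alternative
    return (1 - z.cdf(z_crit - ncp)) + z.cdf(-z_crit - ncp)

# A "medium" effect (d = 0.5) with 64 per group gives roughly 80% power
print(round(approx_power(0.5, 64), 2))
```

Running the study side of this backwards is the common use in critique: if a paper reports a small effect from a small sample, a check like this shows whether the design ever had a realistic chance of detecting it.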