Polk Library

AI in Faculty Research

This page is a resource for UWO faculty who want to use AI tools thoughtfully in their research workflows. AI can accelerate discovery, synthesis, and writing, but it does not replace scholarly judgment. You remain the author; AI is one of your tools.

 

An archaeological dig metaphor is used throughout as a learning framework. Many of the tools referenced below offer both free and subscription tiers.

 

Archaeological Dig Workflow (How These Tools Fit Together)

  1. Surveying the Site (Discovery & Orientation): Semantic Scholar, ResearchRabbit, LitMaps, Connected Papers
  2. Excavating & Sorting (Extraction): Elicit
  3. Mapping the Landscape (Networks): ResearchRabbit, LitMaps, Connected Papers
  4. Authentication & Context (Verification): OpenAlex, Scite + DOI/venue checks
  5. Library as Conservation Authority (Full Text & Authority): PsycINFO, CINAHL, Web of Science, library databases
  6. Synthesis & Interpretation (Evidence Summaries & Writing): Consensus, NotebookLM, Copilot, ChatGPT
  7. Scholar as Final Authority (Judgment & Authorship): You (verification, disclosure, responsibility)

Key idea: AI can speed up discovery and drafting, but library systems + human judgment secure credibility and scholarly responsibility.

I. Ethical, Transparent & Reproducible AI Use


This section provides grounding in ethical, transparent, and reproducible AI practices for UW Oshkosh faculty. The goal is to help you use AI tools confidently while aligning with scholarly, disciplinary, and institutional norms. While policies vary across journals and agencies, several shared principles have emerged. The sections below summarize widely accepted trends and provide faculty-friendly guidance.

πŸ‘€ A. Humans Are Always the Authors

Across disciplines and publishers, there is unanimous consensus: AI cannot be listed as an author. Human researchers retain full responsibility for:

  • verifying factual accuracy,
  • editing and interpreting all AI-assisted content,
  • ensuring originality and scholarly integrity.

Humans must maintain 100% accountability for published work.

πŸ”’ B. Protect Unpublished Work & Ensure Originality

Do not upload confidential manuscripts, proposals, student work, or peer reviews into public AI tools. This includes:

  • unpublished data or manuscripts,
  • grant applications or reviews,
  • manuals, proprietary materials, or IRB-sensitive content.

Your manuscript must always reflect your own scholarly contribution.

πŸ“„ C. Disclose Meaningful AI Assistance

Most journals require disclosure when AI meaningfully shapes the intellectual content of a manuscript.

Disclosure typically required for AI use in:

  • drafting or rewriting text,
  • summarizing or restructuring content,
  • brainstorming or research design,
  • coding, data analysis, or image processing.

Disclosure typically not required for:

  • light grammar, spelling, or formatting edits,
  • citation-style conversions (e.g., APA ↔ MLA),
  • basic proofreading.

Typical disclosure placement: acknowledgments or methods section.

Must include:

  • tool name,
  • model or version (e.g., GPT-5.1),
  • purpose and scope of use,
  • confirmation of human verification.

πŸ“ D. Example AI Use Statements

Sample statements you can adapt:

“ChatGPT (GPT-5.1) was used to summarize background literature on [topic] and to suggest alternative phrasings. All AI-generated outputs were reviewed, verified, and revised by the author.”
“Consensus was used to review peer-reviewed evidence on [topic]. Final interpretation and synthesis were performed by the author.”
“NotebookLM was used to generate summary tables from a curated set of uploaded PDFs. All interpretations, analyses, and final text were developed by the author.”

πŸ“š E. Well-Known Publications & Bodies with AI Guidance

Many major journals and funding bodies now publish guidance on AI use. While details vary, common themes include: AI cannot be an author, meaningful AI assistance must be disclosed, and reviewers must not upload confidential manuscripts or proposals into AI tools.

Each entry lists the discipline, the publication or body, a summary of its AI guidance, and where to find the policy:

  • Economics – American Economic Review (AER): AI software may not be listed as an author. Use for drafting or editing must be briefly described/disclosed during submission. Authors are solely accountable for fact-checking AI outputs. Policy page: American Economic Review editorial policy (AI guidance).
  • Political Science – American Journal of Political Science (AJPS): AI must be disclosed for any element, including copyediting or writing code. Authors should avoid using AI to write the manuscript or substantial elements such as the literature review. Reviewers cannot use AI to directly evaluate or write any part of the review report. Policy page: AJPS author guidance and policies (check AI disclosure requirements).
  • Humanities / Language – PMLA (Modern Language Association): AI tools cannot be listed as an author. Authors must fully cite all AI-created content (text, images, data) used in the manuscript. Policy page: PMLA manuscript submission guidelines (AI content citation/disclosure).
  • Medicine / Health – JAMA Network Journals: AI cannot be listed as an author. Authors must report AI use in manuscript preparation (tool, version, dates, description) in the Acknowledgment section. AI used in research methods must be detailed in the Methods section for reproducibility. Policy page: JAMA guidance on generative AI use (reporting and disclosure).
  • Chemistry / ACS – ACS Publications Journals: AI tools cannot be authors. All use must be disclosed in the Acknowledgments/Methods. AI-generated table-of-contents graphics are prohibited. Reviewers must not upload manuscripts to generative AI tools (a breach of confidentiality). Policy page: ACS Publications AI policy (authors and reviewers).
  • Technology / Engineering – IEEE Publications: Disclosure is required in the acknowledgments for AI-generated content (text, figures, images, code), identifying the system and extent of usage. Disclosure is recommended even for AI editing/grammar enhancement. Policy page: IEEE author guidelines for AI-generated text (disclosure requirements).
  • Multi-Disciplinary (Publisher) – Elsevier Journals: AI tools cannot be authors. Disclosure is required for writing/preparation use. Generative AI cannot create or alter images (unless AI itself is the subject of the research). Reviewers must not upload manuscripts to generative AI tools. Policy page: Elsevier generative AI policies for journals (author/reviewer guidance).
  • Multi-Disciplinary (Publisher) – Nature Portfolio Journals (Springer Nature): LLMs do not satisfy authorship criteria. Generative AI images are generally not permitted for publication. Reviewers are asked not to upload manuscripts into generative AI tools. Policy page: Nature Portfolio editorial policy on AI (authorship and images).
  • Multi-Disciplinary (Open Access) – PLOS Journals: AI tools cannot serve as authors or reviewers. AI use must be disclosed in the Methods or Acknowledgements (including tool name and how outputs were evaluated). Editors and reviewers should not upload submissions to generative AI tools due to confidentiality. Policy page: PLOS ethical publishing practices (AI disclosure and confidentiality).
  • Research Funding (Review) – National Institutes of Health (NIH): Reviewers are prohibited from using generative AI to analyze or formulate critiques for grant applications due to confidentiality concerns. Policy page: NIH notice NOT-OD-23-149 (prohibited generative AI use in peer review).
  • Research Funding (Review) – National Science Foundation (NSF): Reviewers are prohibited from uploading content from proposals or review records to non-approved generative AI tools (this violates confidentiality). Proposers are encouraged to disclose AI use in proposal development. Policy page: NSF notice to the research community on AI (proposal and review guidance).

 

II. Steps 1–5: Discovery & Literature Review Tools


Use these tools in sequence to move from broad discovery to verified, citable scholarship. AI can reveal patterns and connections quickly; library databases provide authoritative full text and support reproducibility.

Step 1: Surveying the Site (Discovery & Orientation)

Use discovery tools to get oriented: key papers, related work, and foundational sources.

πŸ“š A. Semantic Scholar & Connected Papers – “Trustworthy Topic Overviews”

These tools provide academically aligned topic overviews without relying on more controversial AI services.

Semantic Scholar (Allen Institute for AI)

  • Identify influential and highly cited articles.
  • Browse concise paper summaries and key terms.
  • Spot “must read” foundational papers.

Connected Papers

  • Generate a graph of related papers based on co-citations.
  • Trace a topic’s intellectual lineage (“prior” & “derivative” works).
  • Find emerging or understudied areas at graph edges.
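
Semantic Scholar also exposes a free Graph API for scripted discovery. The sketch below is a minimal, unofficial Python example, assuming the public paper-search endpoint and its title, year, and citationCount fields; it ranks search results by citation count so influential papers stand out.

```python
import json
import urllib.request
from urllib.parse import urlencode

S2_SEARCH = "https://api.semanticscholar.org/graph/v1/paper/search"

def search_url(query, limit=10):
    """Build a Graph API search URL requesting only the fields
    needed to spot influential papers."""
    params = {"query": query, "limit": limit,
              "fields": "title,year,citationCount"}
    return S2_SEARCH + "?" + urlencode(params)

def top_cited(query, limit=10):
    """Fetch search results and sort them by citation count, descending."""
    with urllib.request.urlopen(search_url(query, limit), timeout=10) as resp:
        data = json.load(resp)
    papers = data.get("data", [])
    return sorted(papers, key=lambda p: p.get("citationCount") or 0,
                  reverse=True)
```

For example, `top_cited("first-generation students belonging STEM")` returns a citation-ranked list you can skim for “must read” papers before turning to Connected Papers for intellectual lineage.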

Step 2: Excavating & Sorting (Extraction)

Use Elicit to pull out study-level fragments quickly (methods, measures, samples, outcomes) and organize them for comparison.

πŸ”Ž B. Elicit – “The Methods & Patterns Tool”

Best for: Rapidly orienting to a topic and extracting study details.

  • Surface relevant studies from a research question.
  • Extract methods, outcomes, measures, and sample sizes.
  • Build quick tables of study characteristics.
  • Spot gaps, contradictions, and methodological patterns.

Great for: early-stage lit reviews, grant methods sections, and quick topical scanning.


Step 3: Mapping the Landscape (Citation Networks)

Use mapping tools to see clusters, influential hubs, and how ideas connect across time.

πŸ—ΊοΈ B. ResearchRabbit & LitMaps – “The Citation Network Explorers”

Best for: Seeing the landscape and evolution of a field.

  • Visualize clusters of related research around a topic. (both)
  • Identify foundational “hub” articles and leading authors. (both)
  • Trace citation trails forward and backward in time. (both)
  • Surface topically similar papers that do not cite each other. (ResearchRabbit)
  • Build a reproducible, citation-only expansion from a seed paper. (LitMaps)
  • Maintain a living map with automated alerts for new citing work. (LitMaps)

Great for: entering a new subfield, scoping reviews, and mentoring graduate researchers.


Step 4: Authentication & Context (Verification)

Use verification helpers to confirm sources exist, assess credibility, and understand how findings are received in later scholarship.

βœ” D. OpenAlex & Scite – “Verification Helpers”

After AI or mapping tools suggest possible articles, use these to confirm that sources are real, relevant, and credible.

OpenAlex

  • Verify that a citation actually exists.
  • Check concepts, fields, and publication venues.
  • Locate related works for deeper searching.

Scite (Smart Citations)

  • See whether later scholarship supports, contradicts, or mentions a study.
  • Assess whether a finding is accepted, debated, or uncertain.
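
These checks can also be scripted against the free OpenAlex REST API. The sketch below is a minimal, unofficial Python example, assuming the public works endpoint and its standard fields (title, publication_year, cited_by_count, primary_location); a 404 response is a strong hint that an AI-suggested citation does not exist.

```python
import json
import urllib.error
import urllib.request

OPENALEX_WORKS = "https://api.openalex.org/works/"

def openalex_url(doi):
    # OpenAlex accepts a "doi:" prefix on the works endpoint.
    return OPENALEX_WORKS + "doi:" + doi.lower()

def verify_citation(doi):
    """Return basic metadata if the DOI exists in OpenAlex, else None."""
    try:
        with urllib.request.urlopen(openalex_url(doi), timeout=10) as resp:
            work = json.load(resp)
    except urllib.error.HTTPError:
        return None  # a 404 here usually means the citation is not real
    source = (work.get("primary_location") or {}).get("source") or {}
    return {
        "title": work.get("title"),
        "venue": source.get("display_name"),
        "year": work.get("publication_year"),
        "cited_by": work.get("cited_by_count"),
    }
```

A None result does not always mean fabrication (very new or niche works can be missing), so treat it as a flag for manual checking, not a verdict.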

Step 5: Library as Conservation Authority (Full Text & Citable Record)

Use library databases to retrieve vetted full text, apply precise filters, and document a reproducible search.

πŸ”— E. AI (e.g., Microsoft Copilot) + Library Databases – “Your High-Precision Search Team”

AI can help you shape and refine search strategies. Library databases deliver the vetted, peer-reviewed, citable scholarship.

AI can help you:

  • Brainstorm keywords and related concepts.
  • Translate a vague idea into a structured question.
  • Draft Boolean search strings.
  • Identify alternate populations, outcomes, or contexts.

Library databases help you:

  • Retrieve authoritative, peer-reviewed articles.
  • Apply precise filters (method, date, population, study type).
  • Access full text and citation tools.
  • Ensure accuracy, comprehensiveness, and reproducibility.

Try this prompt:

“Turn this research question into a precise Boolean search strategy suitable for PsycINFO: ‘How do first-generation college students experience belonging in STEM disciplines?’”
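
The same transformation can be sketched in code. The function below is a simple, database-agnostic illustration (a real PsycINFO strategy would also use controlled vocabulary and field tags): synonyms within a concept are OR-ed, multi-word phrases are quoted, and concept groups are AND-ed.

```python
def boolean_string(concepts):
    """Combine concept synonym lists into a Boolean search string.

    Each inner list holds synonyms for one concept (OR-ed together);
    the concept groups are then AND-ed.
    """
    groups = []
    for synonyms in concepts:
        quoted = [f'"{s}"' if " " in s else s for s in synonyms]
        groups.append("(" + " OR ".join(quoted) + ")")
    return " AND ".join(groups)

# The example question above, broken into three concepts.
query = boolean_string([
    ["first-generation college students", "first-generation students"],
    ["belonging", "sense of belonging", "social integration"],
    ["STEM", "science education", "engineering education"],
])
print(query)
```
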
Suggested discovery workflow:

 

  1️⃣ Semantic Scholar / Connected Papers → orient to key papers & themes
  2️⃣ Elicit → extract methods and study details
  3️⃣ ResearchRabbit/LitMaps → map the citation landscape
  4️⃣ OpenAlex & Scite → verify and contextualize
  5️⃣ Library databases → retrieve and evaluate full text
  6️⃣ Zotero → organize, annotate, and cite

 

III. Steps 6–7: Synthesis, Interpretation & Scholarly Authorship
 

Step 6: Synthesis & Interpretation (Evidence Summaries & Writing)

Use these tools after you have identified and verified your sources. They help you compare studies, summarize evidence, structure an argument, and revise writing—while you remain responsible for accuracy and interpretation.

πŸ“˜ A. Consensus – “The Evidence Synthesizer”

Best for: Understanding what the research collectively says on a question. Consensus can also be compared against the literature set you have already acquired; it serves as a comparison and triangulation layer that helps you see how your findings align with the broader evidence base.

  • Summarizes findings across multiple studies on a specific question.
  • Shows areas of agreement, disagreement, or uncertainty in the literature.
  • Helps you distinguish well-supported claims from emerging or contested ones.
  • Provides citations you can export into Zotero or other reference managers.

Great for: evidence-based claims in lit reviews, grant background sections, and teaching or policy statements that must be clearly supported by research.

πŸ“‚ B. NotebookLM – “The Corpus-Based Synthesizer”

Best for: Deep synthesis within a curated set of documents you choose.

  • Upload PDFs, research notes, transcripts, or article collections.
  • Ask questions that stay strictly within your selected corpus.
  • Generate tables of methods, samples, measures, or findings.
  • Compare themes, constructs, or theoretical frameworks across studies.

Great for: building a private “focus corpus” for a literature review, scoping review, or grant project, and then interrogating that corpus in a structured way.

πŸ“ C. Microsoft Copilot – “The Drafting & Revision Partner”

Best for: Structuring, revising, and adapting writing directly within Word, PDF, PowerPoint, and Excel using the institution-supported subscription.

  • Summarizes long documents (articles, literature review drafts, grant sections).
  • Drafts and revises outlines, section headings, and argument structure.
  • Rewrites paragraphs for clarity, concision, or different audiences.
  • Translates jargon-heavy passages for reviewers, students, or non-specialists.
  • Explains complex content in plainer language while preserving technical accuracy.

Great for: revising manuscripts, IRB applications, grant proposals, and teaching materials while staying within the UW Oshkosh ecosystem.

⚠ D. Caution with Drafting – “You Stay the Author”

AI can help you draft, revise, and synthesize—but it cannot take responsibility for your scholarship. As you use any of these tools:

  • Always cross-check AI-generated claims against the original sources.
  • Verify that citations exist and are appropriate for your topic.
  • Confirm interpretations through your own reading and disciplinary judgment.
  • Remember that you are responsible for the arguments, framing, and conclusions.

Use AI to accelerate the mechanics of synthesis and writing, while preserving your role as the scholar, interpreter, and author.

Suggested Step 6 workflow:

 

  1️⃣ Library databases → retrieve and read full text
  2️⃣ Zotero → annotate, tag, and organize what you will actually use
  3️⃣ Consensus → scan what the broader literature tends to conclude (where appropriate)
  4️⃣ NotebookLM → interrogate your curated PDF set for patterns, comparisons, and tables
  5️⃣ Copilot → outline, draft, and revise with your verified sources open

Note: For citation-context evaluation (e.g., whether later work supports or contradicts a study), use Scite in Step 4: Authentication & Context.


Step 7: The Scholar as Final Authority

AI can assist with drafting and synthesis, but you remain responsible for interpretation, evidence selection, framing, and final claims. When AI contributes meaningfully, disclose use and verify all citations against the original sources.

 

IV. Prompt Library for Faculty Research


Workflow position: This section primarily supports Step 2 (Excavating & Sorting), Step 5 (Library as Conservation Authority), and Step 6 (Synthesis & Interpretation).

 

These prompts are starting points designed to accelerate your research, writing, and evaluation workflows. Customize them with your discipline, methods, datasets, and preferred tools. Always verify AI outputs and integrate them with your own scholarly expertise and database searching.

πŸ”Ž A. Literature Review Prompts

Evidence Mapping Prompt

“You are an expert research assistant in [discipline]. Using ONLY the articles I provide, create a concise synthesis of the main findings on [topic]. Highlight areas of agreement and disagreement among studies, and identify any gaps or unresolved questions noted by the authors.”

Search Term Expansion Prompt

“Based on this research question—[paste question]—generate synonyms, related terms, and possible subject headings for database searching in [discipline]. Return results in a table with the columns: concept | synonyms | possible subject terms.”

πŸ’° B. Grant & Funding Prompts

NIH-Style Specific Aims Starter

“Using NIH ‘Specific Aims’ conventions, help me draft a 1-page aims outline for a project on [topic] in [discipline]. Emphasize significance, innovation, and a clear 2–3 aim structure. Ask me for missing details before drafting.”

Reviewer-Friendly Summary

“Rewrite this technical abstract for a multidisciplinary review panel. Maintain accuracy but emphasize significance, clarity, and broader impacts: [paste abstract].”

πŸ§ͺ C. Methodology & Research Design Prompts

Design Comparison Prompt

“Compare at least three research designs suitable for studying [topic] in [discipline]. For each design, list advantages, limitations, typical sample sizes, and threats to validity. Include citations to standard methods texts where possible.”

Instrument / Measure Scan Prompt

“List commonly used instruments or measures for studying [construct] in [population]. For each, describe what it measures, typical reliability/validity evidence, and relevant citations.”

πŸ“Š D. Data Interpretation & Communication Prompts

Plain-Language Results Explanation

“Explain the following results in plain, accessible language suitable for a grant reviewer or educated public, focusing on meaning and implications rather than technical detail: [paste results].”

Limitations and Next Steps Prompt

“Given these findings—[brief summary]—identify 3–5 realistic limitations and 3–5 logical directions for future research, aligned with norms in [discipline].”

 

V. Integrating AI with Zotero and Library Services


Workflow position: Zotero and library services support every step of the workflow, with particular importance for Step 5 (Library as Conservation Authority) and Step 6 (Synthesis & Writing).

 

AI tools become significantly more powerful when paired with the licensed databases, research infrastructure, and expert guidance available through Polk Library. This section outlines practical ways to integrate AI-assisted discovery and synthesis with Zotero, library databases, and librarian support.

πŸ“š A. Using Zotero to Manage AI-Discovered Literature

Zotero helps transform AI-suggested articles into a verified, organized, and citable research library.

  • Import AI-suggested citations using DOI, PMID, or the Zotero Connector.
  • Create collections such as “to verify,” “include,” “exclude,” and “maybe.”
  • Attach PDFs from library databases.
  • Use notes and tags to document screening and relevance decisions.
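
For teams that want an audit trail alongside Zotero, the screening statuses above can be modeled in a few lines. This is a hypothetical sketch, not Zotero API code; the statuses simply mirror the suggested collection names.

```python
from dataclasses import dataclass, field

STATUSES = {"to verify", "include", "exclude", "maybe"}

@dataclass
class ScreeningRecord:
    """One AI-suggested source and its screening decision."""
    doi: str
    status: str = "to verify"   # every new suggestion starts unverified
    notes: list = field(default_factory=list)

    def move(self, status, note=""):
        """Record a screening decision, with an optional reason."""
        if status not in STATUSES:
            raise ValueError(f"unknown status: {status!r}")
        self.status = status
        if note:
            self.notes.append(note)

# Hypothetical DOI, for illustration only.
rec = ScreeningRecord("10.1234/example")
rec.move("include", "DOI resolves; full text retrieved via library database")
```

The notes list doubles as documentation of why each source was kept or dropped, which supports reproducible, defensible screening decisions.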

πŸ”— B. Using AI to Strengthen Database Searching

AI can help refine search logic, but library databases ensure rigor and reproducibility.

  • Generate keyword lists, synonyms, and Boolean strings with AI.
  • Translate research questions into database-ready queries.
  • Apply controlled vocabularies and filters in PsycINFO, CINAHL, Web of Science, etc.
  • Verify journals, populations, and methods against disciplinary norms.

🀝 C. Working with Librarians for AI-Informed Research Support

Polk Library supports ethical, effective AI use across the research lifecycle. For guidance, reach out to Joe Pirillo, Polk Library’s AI lead contact. Librarians can help you:

 

  • Co-develop AI-informed search strategies.
  • Evaluate AI-generated citations and claims.
  • Support Zotero workflows and citation management.
  • Advise on AI disclosure and transparency statements.
  • Provide workshops for departments or research teams.

πŸ“ D. Putting It All Together

Used together, AI tools, library databases, Zotero, and librarian expertise form a complete scholarly research ecosystem:

  • AI identifies candidate articles →
  • Library databases verify and retrieve full text →
  • Zotero organizes and documents decisions →
  • AI assists synthesis and drafting →
  • You provide interpretation, judgment, and authorship.