Using AI for literature reviews in Psych? The ethics of it all.

Aseko

New member
Joined
Feb 28, 2026
Messages
11
I'm in the middle of writing my first "real" research paper for my Cognitive Psych class. It's on memory consolidation during sleep (fascinating stuff, honestly). The problem is the literature review. There are just so. many. studies. I feel like I'm drowning in PDFs and I'm spending all my time organizing them instead of actually understanding the concepts. 😵

I've started experimenting with some AI tools that claim to summarize research papers. I uploaded a few PDFs, and the summaries were... surprisingly good. They picked out the hypotheses, methods, and conclusions way faster than I could. It felt like magic, but then I started wondering about the ethics of it. 🧙‍♀️

If I use an AI to summarize a paper, and then I cite that paper in my work, is that okay? I'm not having the AI write my paper for me, just helping me digest the information faster so I can synthesize it myself. But is that considered a shortcut? Am I not "doing the work" if I don't struggle through the full text myself? My professor is big on understanding the methodology, and the AI summaries sometimes gloss over the limitations section, which is the most important part!

How are other people in the social sciences handling this? Do you see AI as a legitimate research assistant, or is it a crutch that will hurt us in the long run? I want to be a good researcher, but I also want to be efficient. Where's the line?
 
Aseko, the line is actually pretty clear: AI as research assistant? Ethical. AI as scholar? Unethical. You're using it to digest information faster, not to generate arguments or conclusions. That's literally what grad students used to pay research assistants to do. The key is verification: if you cite a paper, you need to have engaged with enough of it to accurately represent it. Summaries are a starting point, not the destination.
 