- Meetings with the client and steering committee to define goals and gather feedback (roughly every three weeks).
- Research into peer review practices, scientific publishing issues, and training needs.
- Technical feasibility testing for hosting, performance, and Django-based implementation.
Project Description
The current scientific publishing system is severely outdated: PhD students receive too little training in peer review, researchers face heavy pressure to churn out papers, and there is an overreliance on inaccurate AI writing and reviewing. We intend to revolutionize this system with the Continuous Community Review Compendium, a collection of articles published alongside their peer reviews. The compendium will train new researchers and flag low-quality articles so that reviewers don't waste time on content another reviewer has already found to be substandard.
In addition to improving transparency and efficiency, the platform will be actively used in our sponsor's lab to train researchers. By reading peer reviews, contributing their own, and interacting with the database, lab members will learn proper peer review practices, improve their research assessment skills, and quickly identify trustworthy scientific work.
Overall, this system saves time and reduces duplication of effort, while providing a practical, hands-on training resource for researchers in the lab. Screenshots, diagrams, and interface graphics will be included to illustrate how the platform works and how it supports training and research evaluation. The platform is intended to be expanded to multiple departments across Northern Arizona University and the state of Arizona.
Link to Original Proposal
Project Introduction
Below is a short project introduction slideshow that gives some background to the motivation behind the Continuous Community Review Compendium!
Open in Google Slides
Problem Statement
The scientific peer review process is essential for maintaining research quality, yet the current system is slow, opaque, and offers very little training for new researchers. Graduate students often have limited opportunities to practice peer review, and published scientific articles rarely include their reviews, making it difficult to learn what strong feedback looks like. At the same time, the rise of AI-generated writing has introduced serious credibility concerns, further complicating the review process.
Without a transparent way to view reviews, evaluate article credibility, or practice reviewing in a guided environment, both students and researchers face unnecessary challenges. A modern, accessible system is needed to organize articles, display reviews, support training, and improve the integrity of scientific literature.
Requirements
Our requirements were gathered through meetings with our client and the graduate-student steering committee, along with additional research and technical feasibility testing. This process helped us identify the system's essential functions, quality expectations, and project constraints.
- User authentication, including advisor/guarantor verification.
- Upload, browse, and view research articles.
- Submit, store, and view peer reviews.
- Search for articles by title, author, or tags.
- Copy-ready article citation generation.
- Flagging for AI-generated or low-credibility content.
- Performance: Article search completes in under roughly 0.5 seconds; uploads in under roughly 1 second.
- Usability: New users should be able to learn the interface in about 10 minutes.
- Reliability: Managed cloud hosting for stability (DigitalOcean / AWS).
- Maintainability: Django framework for extensibility and long-term growth.
- Security: Protected user data and verified identity linking.
- System must scale to multiple departments at NAU.
- Copyright limitations restrict external article integration; only public-domain or user-uploaded files allowed.
- Hosting must support large data volumes and long-term expansion.
- Must maintain an environment free of AI-written academic content per client requirement.
These requirements provide the structure for our system design and development plan, while guiding future iterations with the client and steering committee.
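To make the Django and data-handling requirements above more concrete, the sketch below shows one way the core entities could be modeled. The model names, fields, and relationships are assumptions for illustration only, not the final schema.

```python
# Hypothetical Django models for the core entities implied by the requirements;
# every name and field here is illustrative, not the final schema.
from django.conf import settings
from django.db import models


class Article(models.Model):
    title = models.CharField(max_length=300)
    authors = models.CharField(max_length=500)
    tags = models.CharField(max_length=300, blank=True)   # comma-separated tags for search
    pdf = models.FileField(upload_to="articles/")          # user-uploaded, public-domain only
    flagged_ai = models.BooleanField(default=False)        # AI / low-credibility flag
    uploaded_by = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
    uploaded_at = models.DateTimeField(auto_now_add=True)


class PeerReview(models.Model):
    class Status(models.TextChoices):
        DRAFT = "draft"          # saved but not yet submitted
        PENDING = "pending"      # awaiting advisor approval
        PUBLISHED = "published"  # visible to all users

    article = models.ForeignKey(Article, related_name="reviews", on_delete=models.CASCADE)
    reviewer = models.ForeignKey(settings.AUTH_USER_MODEL, related_name="authored_reviews",
                                 on_delete=models.CASCADE)
    advisor = models.ForeignKey(settings.AUTH_USER_MODEL, null=True, blank=True,
                                related_name="supervised_reviews", on_delete=models.SET_NULL)
    body = models.TextField()
    status = models.CharField(max_length=10, choices=Status.choices, default=Status.DRAFT)
    created_at = models.DateTimeField(auto_now_add=True)
```

Under a schema along these lines, searching by title, author, or tag reduces to straightforward queryset filtering over the Article fields.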
User Stories
Easily Navigable
As a user, I want to navigate the website easily and find where I am trying to go without confusion so that I can use the website effectively and not get frustrated.
Account Creation
As a student or researcher, I would like to be able to make an account so that I can perform actions on the website.
Upload Articles
As a researcher or student, I want to upload a new research article to the system so that others can review and reference it.
Search for Articles
As a user, I want to search for articles by title, tag, or author so that I can easily find specific research to review or cite.
View Articles
As a reader, I want to open an article's page to see details about it and its authors and to read its peer reviews, so that I can understand the work and its feedback as well as learn about its credibility.
Post Peer Reviews
As a user (specifically a student or researcher), I want to post a peer review for a specific article so that I can contribute a critique or correction to the academic discussion.
View Peer Reviews
As a student or researcher, I want to view others' peer reviews for an article so that I can learn from what others are saying and learn about the article's credibility.
Copy Article Citation
As a researcher or student, I want to be able to copy the citation for an article I am viewing so that I can reference it in my own work or paper.
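As a rough illustration of this story, the snippet below builds a copy-ready citation string from stored article metadata. The citation format and the helper name are assumptions; the client has not yet specified a required style.

```python
# Illustrative helper for the "Copy Article Citation" story; the format shown
# here is a placeholder, not a confirmed citation style.
def format_citation(authors: list[str], year: int, title: str, venue: str) -> str:
    author_part = ", ".join(authors)
    return f"{author_part} ({year}). {title}. {venue}."


print(format_citation(["Doe, J.", "Smith, A."], 2025,
                      "Open Peer Review in Practice", "Example Journal"))
# Doe, J., Smith, A. (2025). Open Peer Review in Practice. Example Journal.
```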
Advisor Peer Review Management
As an advisor/guarantor training students for peer review, I want to review and approve my students' peer reviews before they are official, so that quality and accuracy are maintained and the students are able to properly learn to write peer reviews.
Advisor/Student Account Linking
As an advisor, I would like to link my students' accounts to mine so that I can easily see what they post and be able to easily review their peer reviews.
Flag Articles for AI
As a researcher, I want to be able to flag an article for AI usage so that everyone has transparency on what was written by AI and what was not. I want to help maintain academic credibility.
Save to Bibliography
As a researcher or student, I would like to save articles to a bibliography so that I have all of my citations in one place for ease of use and access.
Peer Review Draft Saving
As a student, I would like to be able to save my peer review at the draft stage so that I can return to it later, while still having it posted for my advisor to review.
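The advisor-approval and draft-saving stories above could be handled as simple status transitions on the review record. The sketch below assumes the hypothetical PeerReview model and Status values shown in the Requirements section, not a finalized workflow.

```python
# Hypothetical status transitions for the draft -> pending -> published flow;
# assumes the PeerReview model and Status choices sketched earlier.
def submit_for_approval(review):
    """Student finishes a draft and sends it to the linked advisor."""
    review.status = review.Status.PENDING
    review.save(update_fields=["status"])


def approve(review, advisor):
    """Advisor publishes the review after checking its quality."""
    if review.advisor_id != advisor.id:
        raise PermissionError("Only the linked advisor can approve this review.")
    review.status = review.Status.PUBLISHED
    review.save(update_fields=["status"])
```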
Comment on Peer Reviews
As a user, I would like to comment on others' peer reviews so that I can point out things they missed, ask clarifying questions, or just tell them I appreciate their review.
Upvote/Downvote Peer Reviews
As a user, I would like to upvote and downvote peer reviews so that the most helpful reviews for a specific article appear at the top and inaccurate or unhelpful reviews are pushed to the bottom.
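One possible way to express this ordering in Django, assuming a hypothetical Vote model that stores a +1 or -1 value per user per review and a `votes` reverse relation on each review:

```python
# Illustrative query ordering an article's reviews by net vote score; assumes
# a hypothetical Vote model with an integer `value` field (+1 / -1).
from django.db.models import Sum, Value
from django.db.models.functions import Coalesce


def reviews_by_helpfulness(article):
    return (
        article.reviews
        .annotate(score=Coalesce(Sum("votes__value"), Value(0)))
        .order_by("-score", "-created_at")
    )
```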
Full External Journal Integration
For reasons relating to copyright, the development team will not provide full external journal integration; the system will only include article details that fall in the public domain.
Solution Overview
Our solution, the Continuous Community Review Compendium, is a website that connects academic articles directly to their peer reviews. It modernizes the peer review process by making reviews transparent, searchable, and easy for students to learn from as they write their own, solving key challenges identified by our client.
The platform provides a secure environment for uploading articles, submitting peer reviews, exploring feedback, and identifying AI-generated or low-credibility content. By gathering articles and reviews into one accessible system, the solution supports both research workflows and peer review education within the client’s lab.
For more information, please see the dedicated solutions page below:
View Full Solution Details
Competitive Products and Related Solutions
While several platforms attempt to improve aspects of scientific publishing, none address both transparency and student peer review training in a single, unified system. Below is a comparison of related tools and the gaps our solution fills:
- OpenReview: Provides open peer reviews for select conferences, but does not support student training, advisor validation, or article uploads outside of curated venues.
- Publons / Web of Science: Tracks reviewers’ contributions but does not display reviews publicly or provide practice environments for students.
- ResearchGate / Academia.edu: Allow article sharing but do not offer review tools, AI-credibility features, or structured review workflows.
- Journal submission systems (Elsevier, Springer, etc.): Handle formal peer review but are opaque, inaccessible to students, and do not provide training or review transparency.
The Continuous Community Review Compendium stands apart by combining transparent review access, educational tools, advisor-managed validation, and credibility protections (including AI flagging). This makes it uniquely suited for both research support and peer review training at scale.
Schedule, Resources, and Budget
Schedule: Planning and initial prototyping complete by Dec 2025. Development and implementation complete by May 2026. As of right now, all of our must-have functional requirements are complete and we are ready to start user testing. Beginning next semester, we will start implementing the should-have requirements and make changes to the current system upon client request.
This Gantt Chart shows our current development plan for Spring 2026!
Open in Google Sheets
Resources/Budget: Our hosting will be through DigitalOcean and AWS. The budget is still unknown at this point; grants related to this project, its hosting, and its continuation are currently being submitted.
Codebase & Demo
The codebase remains private per client request.