Stylized visualization of a classroom social graph used by a GCN to predict student performance
Shenzhen, China, September 1, 2025
A lightweight two-layer Graph Convolutional Network (GCN) can predict four levels of classroom performance with strong accuracy by combining student attributes and social interaction data. Tested on a cleaned dataset of 732 students and a social graph of 5,184 edges, the model uses a 16-feature input matrix and achieves AUC scores near 0.91–0.92 and an F1 around 87%. The approach outperforms GAT and GraphSAGE, and ablation shows social ties are critical. The study highlights interpretability via GNNExplainer, notes limits in scale and multimodality, and recommends ethical adaptation before wider deployment.
Researchers have built a lightweight Graph Convolutional Network (GCN) that combines students’ personal data and their social interactions to predict classroom performance in four categories with strong accuracy. The study was published in Scientific Reports, volume 15, Article number 32044 (2025), DOI 10.1038/s41598-025-17903-4, and appeared online on 01 September 2025.
The GCN model reached an area under the curve of approximately 0.91–0.92 in cross-validation, with precision near 88.5%, recall about 86.5%, and an F1 score around 87.3%. The model is aimed at making classroom grade evaluation more objective by adding structured social and online behavior data to traditional measures.
The team treated each student as a node in a graph and used different kinds of interactions as weighted edges. They combined data from school teaching systems, classroom observation notes, and online learning platforms to form a multi-source dataset. Platforms named in the work include the Smart Education Platform for Primary and Secondary Schools of Shenzhen and the xueleyun Teaching Platform.
Performance labels were built by fusing mid-term and final exam scores (mid-term weight 0.4, final weight 0.6) and then combining that result with teacher, self, and peer evaluations into a single fused score. The fused score was mapped to four classes: Excellent (≥90), Good (80–89), Qualified (70–79), and To be improved (<70).
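For concreteness, here is a minimal sketch of the exam-score fusion and four-class binning in Python. The 0.4/0.6 weights and the class boundaries are from the paper; the teacher/self/peer combination step is omitted because its weights are not reported, and the function names are illustrative:

```python
import numpy as np

def fuse_exam_scores(midterm: np.ndarray, final: np.ndarray) -> np.ndarray:
    """Fuse exam scores with the reported weights: 0.4 mid-term + 0.6 final."""
    return 0.4 * midterm + 0.6 * final

def to_performance_class(fused: np.ndarray) -> np.ndarray:
    """Map fused scores to the four reported bands:
    0 = Excellent (>=90), 1 = Good (80-89),
    2 = Qualified (70-79), 3 = To be improved (<70)."""
    # np.digitize returns 0 for <70, 1 for 70-79, 2 for 80-89, 3 for >=90
    return 3 - np.digitize(fused, [70, 80, 90])

# Worked example: fused scores 89.2 and 67.4 fall in Good (1) and To be improved (3)
midterm = np.array([85.0, 62.0])
final = np.array([92.0, 71.0])
print(to_performance_class(fuse_exam_scores(midterm, final)))  # [1 3]
```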
The main edge-weight method combined three interaction indicators: observed cooperation in class, online interaction frequency, and peer ratings. These three were normalized and weighted by coefficients of 0.4, 0.3 and 0.3 respectively to build a weighted adjacency matrix. Alternative graph strategies tested included cosine similarity of interaction vectors, peer evaluation graph, Pearson similarity graph, and fully connected graph.
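A short sketch of the main edge-weighting step follows. The 0.4/0.3/0.3 coefficients are from the paper; min-max normalization is an assumption (the paper says only that the three indicators are normalized before weighting), and the helper names are illustrative:

```python
import numpy as np

def minmax(m: np.ndarray) -> np.ndarray:
    """Scale a matrix to [0, 1]. The exact normalization scheme is assumed."""
    span = m.max() - m.min()
    return (m - m.min()) / span if span > 0 else np.zeros_like(m)

def build_weighted_adjacency(coop, online, peer):
    """Combine the three interaction indicators with the reported coefficients:
    0.4 observed cooperation, 0.3 online frequency, 0.3 peer ratings."""
    return 0.4 * minmax(coop) + 0.3 * minmax(online) + 0.3 * minmax(peer)

# Toy example with 4 students: symmetric interaction matrices, zero diagonal
rng = np.random.default_rng(0)

def random_interactions(n=4):
    m = rng.integers(0, 10, (n, n)).astype(float)
    m = (m + m.T) / 2        # symmetrize
    np.fill_diagonal(m, 0)   # no self-loops
    return m

A = build_weighted_adjacency(random_interactions(),
                             random_interactions(),
                             random_interactions())
```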
The model is a compact two-layer GCN with hidden sizes of 128 and 64, ReLU activation, and 0.5 dropout. Training used cross-entropy loss with L2 regularization (weight decay 0.0005) and the Adam optimizer (initial learning rate 0.01 with decay). The reported hardware environment included four NVIDIA A100 GPUs, dual Intel Xeon Gold processors, and 256 GB of RAM; the software stack included PyTorch 2.1, PyG 2.3, CUDA 12.1, and cuDNN 8.9.
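Based on that reported configuration, a PyTorch Geometric sketch of the model and one training step might look like the following. The final linear classification head and the masked cross-entropy training loop are assumptions; the paper reports only the layer sizes, activation, dropout, optimizer, and weight decay:

```python
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv

class TwoLayerGCN(torch.nn.Module):
    """Compact GCN per the reported setup: 16 input features, hidden sizes
    128 and 64, ReLU, dropout 0.5, four output classes. The final linear
    head is an assumption; the paper reports only the [128, 64] layers."""

    def __init__(self, in_dim: int = 16, num_classes: int = 4):
        super().__init__()
        self.conv1 = GCNConv(in_dim, 128)
        self.conv2 = GCNConv(128, 64)
        self.head = torch.nn.Linear(64, num_classes)

    def forward(self, x, edge_index, edge_weight=None):
        x = F.relu(self.conv1(x, edge_index, edge_weight))
        x = F.dropout(x, p=0.5, training=self.training)
        x = F.relu(self.conv2(x, edge_index, edge_weight))
        return self.head(x)

model = TwoLayerGCN()
# Adam with the reported initial learning rate and L2 weight decay
optimizer = torch.optim.Adam(model.parameters(), lr=0.01, weight_decay=5e-4)

def train_step(data, train_mask):
    """One optimization step; `data` is a torch_geometric.data.Data object
    carrying x, edge_index, optional edge_weight, and labels y."""
    model.train()
    optimizer.zero_grad()
    out = model(data.x, data.edge_index, data.edge_weight)
    loss = F.cross_entropy(out[train_mask], data.y[train_mask])
    loss.backward()
    optimizer.step()
    return loss.item()
```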
The study used tools such as GNNExplainer and t-SNE to highlight which neighbors and features mattered most. For high-performing students, group cooperation, class participation, and assignment timeliness were often most influential. The authors present the two-layer GCN as a practical option for school settings because it balances performance and computation.
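As a rough illustration of how such per-node explanations can be produced with PyG 2.3's explanation interface, the snippet below continues the model sketch above (reusing `model` and a PyG `Data` object `data`); it is not the authors' exact pipeline:

```python
from torch_geometric.explain import Explainer, GNNExplainer

# Wrap the trained model from the sketch above in PyG's explanation interface.
explainer = Explainer(
    model=model,
    algorithm=GNNExplainer(epochs=200),
    explanation_type='model',
    node_mask_type='attributes',   # per-node feature importance
    edge_mask_type='object',       # per-edge importance
    model_config=dict(
        mode='multiclass_classification',
        task_level='node',
        return_type='raw',         # the sketch model outputs raw logits
    ),
)

# Explain the prediction for a single student node (index 0 is arbitrary).
explanation = explainer(data.x, data.edge_index, index=0)
print(explanation.node_mask.shape)  # (num_nodes, num_features) importances
print(explanation.edge_mask.shape)  # (num_edges,) importances
```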
The work was approved by a school ethics committee and all online behavior data were collected with authorization and privacy protections. The corresponding author can provide datasets on reasonable request by email: wushuying1234@126.com. The authors reported no competing interests. Funding came from local and provincial education projects and school support in Shenzhen.
Limitations include reliance on questionnaires and logs rather than richer multimodal signals, and potential scaling challenges for much larger networks. Suggested future work includes richer data types, distributed training, and stronger interpretability tools.
Q: What does the model predict?
A: It classifies students into four performance groups: Excellent, Good, Qualified, and To be improved, using a graph-based neural network.

Q: How accurate is the model?
A: The model reported AUC values around 0.91–0.92, precision near 88.5%, recall about 86.5%, and F1 about 87.3% on cross-validation.

Q: What data was used?
A: Multi-source data from teaching management systems, classroom observations, and online platforms covering 732 cleaned student records across 12 classes over two semesters.

Q: Is the dataset available?
A: The dataset is available from the corresponding author on reasonable request via email: wushuying1234@126.com.

Q: Were ethics and privacy addressed?
A: Yes. The project had ethics approval, written consent from participants, and authorization for online data collection.

Q: Should schools adopt this system now?
A: The research shows promise, but real-world adoption should consider local privacy rules, technical capacity, and the need for transparent explanations alongside model outputs.
| Item | Detail |
| --- | --- |
| Article | Application of artificial intelligence graph convolutional network in classroom grade evaluation |
| Journal | Scientific Reports, vol. 15, Article 32044 (2025) |
| DOI | 10.1038/s41598-025-17903-4 |
| Published | 01 September 2025 |
| Final sample | 732 student records; graph with 732 nodes and 5,184 edges |
| Input features | 16 features (personal, classroom, online behavior) |
| Model | Lightweight two-layer GCN ([128, 64]) with dropout and L2 regularization |
| Top performance | AUC ≈ 0.91–0.92; F1 ≈ 87.3% |
| Data access | Available on reasonable request: wushuying1234@126.com |
| Ethics | Approved by school ethics committee; written consent obtained |