TABLE OF CONTENTS
- Accessing your class report
- Report summary
- Understanding key metrics
- School and district averages
- Performance bands
- Status tags
- Navigating the report tabs
- Evaluating Open-Ended responses
- End Ongoing Attempts
- Adding a student record (Scribe Mode)
- Tips for using your class report
The Teacher Class Report provides a detailed view of how your class performed on a Common Assessment. It includes accuracy metrics calculated from unique participants and best attempts, benchmarks against school and district averages, performance band visualizations, and tools for evaluating Open-Ended responses. This article explains how to access, read, and act on your class report.
This article is for: Teachers (Educators) who proctor Common Assessments.
Accessing your class report
You can access your class report from the Common Assessments page:
Go to Common Assessments in the left navigation pane.
Click on the Common Assessment you want to review.
On the Common Assessment Details page, click "View class report" for the class you want to see.
You can also access the report directly from the Explore page if the Common Assessment appears under "Common Assessments assigned to you."
The class report consolidates data across all test sessions for a given class. Even if the assessment spanned multiple sessions (for example, due to pausing and resuming), you see a single unified view of student performance.
The report opens to the Overview tab by default. If the assessment is still active, a "Go to live session" or "Resume Session" button appears in the top-right corner.
Report summary
When the report opens, a summary bar at the top displays four key metrics:
Accuracy: Average accuracy across all unique participants, based on each student’s best attempt.
Participation Rate: Percentage of rostered students who attempted the assessment.
Participants: Count of students who participated out of total class strength (e.g., "13 of 14").
Questions: Total number of questions in the assessment.
Below these metrics, the Class Insight line provides a quick status summary — for example: "13 completed, 0 incomplete, 1 unattempted." This tells you at a glance whether all students have finished.
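The Participants and Participation Rate figures are simple arithmetic over the roster. A minimal sketch in Python, using an illustrative roster (the student names and data structure here are hypothetical, not the product's API):

```python
# Hypothetical roster: True if the student attempted the assessment at least once.
# Names and values are illustrative only.
attempted = {"Ava": True, "Ben": True, "Cal": False, "Dia": True, "Eli": True}

roster_size = len(attempted)            # total class strength
participants = sum(attempted.values())  # unique students with at least one attempt
participation_rate = participants / roster_size * 100

print(f"Participants: {participants} of {roster_size}")   # Participants: 4 of 5
print(f"Participation Rate: {participation_rate:.0f}%")   # Participation Rate: 80%
```

Each student counts once toward Participants no matter how many attempts they made, which is why the denominator for Participation Rate is the full roster while Accuracy (below) uses participants only.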
Report actions
The action bar below the metrics provides these options:
View content — Preview the assessment questions.
Print — Print the report.
Download — Export the report for offline use.
Email all parents — Send assessment results to all parents.
Share report — Share the report with colleagues.
Pending evaluation banner
If the assessment contains Open-Ended questions, you may see a yellow banner at the top of the report: "X participants have pending evaluations and are excluded from relevant aggregates." This means those students’ Open-Ended responses haven’t been scored yet, so their data is temporarily excluded from accuracy calculations.
Click "View details" to begin evaluating, or see the dedicated article: How to Evaluate Open-Ended Responses on Common Assessments.
Understanding key metrics
The class report calculates metrics using unique participants and their best attempts, giving you an accurate picture of class performance.
Participants
The participant count reflects the number of unique students who submitted at least one attempt. Each student is counted only once, regardless of how many attempts they made.
Class accuracy
Class accuracy is the average accuracy across all unique participants, calculated using each student’s best attempt.
Formula:
Accuracy = Sum of points earned by participants ÷ (Maximum points possible × Number of participants)
The denominator is based on unique participants (not total class strength), so the metric reflects only students who actually took the assessment.
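For example, with three participants on a 24-point assessment, the formula works out as follows (a sketch with made-up best-attempt scores):

```python
# Best-attempt points for each unique participant (illustrative values).
best_attempt_points = [22, 18, 24]
max_points = 24  # maximum points possible on the assessment

# Accuracy = sum of points earned / (max points x number of participants)
accuracy = sum(best_attempt_points) / (max_points * len(best_attempt_points)) * 100
print(f"{accuracy:.1f}%")  # (22 + 18 + 24) / (24 * 3) = 64/72, i.e. 88.9%
```

Note that a rostered student who never attempted the assessment appears in the Participation Rate but does not lower this Accuracy figure, because they are absent from both the numerator and the denominator.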
Item accuracy
Item accuracy shows how students performed on each individual question. It is calculated using each student’s best attempt only.
Standard accuracy
Standard accuracy shows how students performed across all items tagged to a given standard. It is calculated using each student’s best attempt across all items within the standard.
School and district averages
Your class accuracy is displayed alongside the school average and district average at three levels:
Assessment level: Overall class accuracy compared to school and district averages.
Item level: Per-item accuracy with school and district benchmarks.
Standard level: Per-standard accuracy with school and district benchmarks.
In the report, these appear as rows in the Overview tab’s comparison grid (District Average, School Average, Class Average) and as rows in the Standards tab’s student performance heatmap.
These benchmarks help you identify strengths and improvement areas more meaningfully by contextualizing your class results within the broader school and district performance.
Performance bands
Performance bands categorize student performance into four levels based on accuracy:
Did Not Meet — Student performance falls below the expected threshold.
Partially Met — Student shows emerging understanding but hasn’t fully met expectations.
Met — Student demonstrates solid understanding of the assessed concepts.
Exceeded — Student exceeds expectations with strong accuracy.
Performance bands appear in two places on your class report:
Overview section — on the Class Accuracy row, showing how your class distributes across the four bands.
Standards tab — showing performance band distribution for each standard.
A color-coded legend at the bottom of the report shows the default ranges: Did Not Meet (0%–40%), Partially Met (40%–70%), Met (70%–90%), Exceeded (90%–100%). Your Common Assessment creator may have configured different ranges.
These visualizations help you instantly see how your class performance aligns with the defined bands, making it easier to identify which students or standards need attention.
Status tags
Assessment-level status
The class report displays the assessment’s status at the top:
Active — The Common Assessment is between its start and end date.
Ended — The Common Assessment has passed its end date.
Student-level status
On the Participants tab, each student record displays their current state:
Pending Evaluation — The student’s submission contains Open-Ended questions that have not yet been evaluated. The accuracy and points fields remain blank until evaluation is complete.
Not Submitted — The student has an ongoing attempt but has not submitted it.
Not Started — The student has not begun any attempt.
Scored students — Once a student's responses are fully evaluated, their row displays the accuracy percentage (in a circular indicator), points (e.g., "22/24"), and an accuracy distribution bar showing each question's result. The data shown reflects the student's best attempt.
Navigating the report tabs
The report is organized into five tabs — Overview, Participants, Questions, Standards, and Anti-cheating. Each tab gives you a different view of your class’s performance.
Overview
The Overview tab displays a question-by-question heatmap benchmarked against school and district averages.
Comparison grid (top):
Three benchmark rows appear for every question in the assessment:
District Average — How students across the entire district performed on each question.
School Average — How students across your school performed on each question.
Class Average — How your class performed on each question.
Each cell is color-coded by performance band, making it easy to spot questions where your class is above or below the school or district benchmark.
Student heatmap (bottom):
Each row is a student, each column is a question. Cells show a checkmark (✓) for correct or a cross (✗) for incorrect, color-coded by performance band. Students are sorted by accuracy by default — you can change the sort order using the "Sort by" dropdown.
Participants
The Participants tab shows every student’s performance at a glance. Each student row displays:
Name
Accuracy distribution bar — A color-coded strip showing each question’s result (Correct in green, Partially correct in yellow, Incorrect in red, Unattempted in gray), with a count summary below (e.g., "✓ 22 ✗ 2").
Accuracy percentage in a circular indicator.
Points — Score out of total (e.g., "22/24").
Evaluate button — Opens the student’s full response detail for review.
Students who have not started appear at the bottom of the list with a "Not started" label.
Click "Evaluate" on any student to open a detailed modal showing their response to each question, the correct answer, time taken, and whether the response was correct, partially correct, or incorrect.
Questions
The Questions tab provides a per-question analysis of student responses. Each question is displayed as a card showing:
Question type (e.g., Multiple Choice, Math Response, Drag and Drop), points, and accuracy percentage.
Average time students spent on the question.
Evaluate button (for questions with Open-Ended responses).
Expand any question to see the question text, correct answer, and response analysis — a breakdown showing how many students answered correctly vs. incorrectly. For Multiple Choice questions, each answer option is listed with the number of students who selected it, helping you identify specific misconceptions.
Use the "Sort by" dropdown to sort by question order or accuracy.
Standards
The Standards tab is available when questions are tagged to standards. It displays two sections:
Standards summary (top):
Each standard shows:
Standard code and the number of questions tagged to it (e.g., "TEKS.6.3.D (4 questions)").
Participant distribution bar (color-coded by performance band).
Accuracy percentage in a circular indicator.
Click on any standard to open a detail panel showing the individual questions tagged to that standard, with accuracy for each item and response distribution. A summary line at the top confirms tag coverage (e.g., "24/24 questions tagged, 6 standards").
Student performance heatmap (bottom):
Rows show District Average, School Average, Class Average, and each individual student. Columns are the standards. Each cell shows the accuracy percentage, color-coded by performance band.
Use the attempts dropdown in the heatmap header (labeled "Showing Best attempts" by default) to switch between:
Best — Each student’s highest score across attempts (default).
First — Each student’s first attempt only.
Last — Each student’s most recent attempt only.
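The three options correspond to a simple selection over each student's attempt history. A sketch of the idea (the attempt record shape is hypothetical):

```python
# Hypothetical attempt history for one student: attempt order and score.
attempts = [
    {"order": 1, "score": 16},
    {"order": 2, "score": 21},
    {"order": 3, "score": 19},
]

best = max(attempts, key=lambda a: a["score"])   # highest score across attempts
first = min(attempts, key=lambda a: a["order"])  # earliest attempt
last = max(attempts, key=lambda a: a["order"])   # most recent attempt

print(best["score"], first["score"], last["score"])  # 21 16 19
```

Comparing First against Best or Last in the heatmap is a quick way to see whether retakes actually improved a student's mastery of a standard.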
Anti-cheating
The Anti-cheating tab shows students who triggered integrity alerts during the assessment. Each row displays:
Student name
Alert type — Tab switch, Left fullscreen, or Window resize.
Total alerts — How many alerts this student received.
Last alert — When the most recent alert occurred.
Actions — Option to remove the student from the session.
Click on any student to open a detailed view showing device and browser information, an alert summary table, and a question-by-question view with warning banners on questions where alerts were detected.
A Settings gear icon (top-right of the tab) allows you to adjust anti-cheating preferences.
Evaluating Open-Ended responses
If the assessment includes Open-Ended questions (text, audio, video, or drawing responses), those responses must be reviewed and scored by you before they factor into accuracy calculations. Until evaluated, these students are marked as "Pending Evaluation" and are excluded from accuracy roll-ups to maintain the integrity of your report metrics. If a student response is pending evaluation, the points field appears blank.
Clear on-screen messages guide you through the evaluation process, indicating which responses need your attention.
A yellow warning banner at the top of the report tells you how many participants have pending evaluations. You can click "Evaluate" on the Participants or Questions tab, or click "View details" on the banner to begin.
For a complete walkthrough — including how to use AI-suggested scores, assign manual scores, and understand how pending evaluations affect your metrics — see the dedicated article: How to Evaluate Open-Ended Responses on Common Assessments.
End Ongoing Attempts
The End Ongoing Attempts button appears on the report when students have active, unsubmitted attempts from a prior session. It saves their in-progress responses and submits the assessment on their behalf.
This button is visible only when students still have active, unsubmitted attempts.
Once all attempts are ended, the button disappears.
After using this button, the students’ saved responses are submitted and their data becomes available in the report.
This is useful for ensuring all student data is captured — for example, when students left a session without clicking Submit.
Adding a student record (Scribe Mode)
If the Common Assessment creator has enabled the "Allow Teacher to Add Student Records" setting, you can record responses on behalf of students who require accommodations (such as scribe support) or who completed the assessment offline.
To add a student record:
Click the "Add Student Record" button in the class report.
Select the student and provide a reason (e.g., accommodation, detention).
Enter the student experience view and input answers on the student’s behalf.
Submit the record. Submission is final; you cannot add another record for the same student.
The reason is logged and visible to admins in the Common Assessment Admin Report.
Tips for using your class report
Evaluate Open-Ended questions promptly — Pending evaluations delay accurate metrics. Review and score Open-Ended responses as soon as possible to get a complete picture of class performance.
Use school and district averages for context — A 65% class accuracy means something different if the district average is 60% versus 80%. Use the benchmarks to identify where your class stands relative to peers.
Leverage performance bands for grouping — Use the band distribution to quickly identify which students need intervention (Did Not Meet, Partially Met) and which are ready for enrichment (Exceeded).
Start with the Overview — Scan the student heatmap for patterns before diving into specific tabs. Red columns reveal hard questions; red rows reveal struggling students.
Use the Standards tab for instructional planning — Standards where your class is below the school or district average may indicate areas for reteaching. The Best/First/Last toggle helps you see whether retakes improved understanding.
Address "Not Submitted" students — Use the End Ongoing Attempts feature if students have lingering unsubmitted attempts, so their data is captured in the report.
Review anti-cheating alerts — Use the Anti-cheating tab to identify students who may have had integrity issues during the test.