Accessing and Understanding the Common Assessment Admin Report

Modified on Tue, 31 Mar at 7:52 AM

You’ve administered a Common Assessment across your district. Students have completed it, teachers have proctored it — and the data is in. The District Report is where you turn that data into action.

This report gives you a clear, organized view of how students performed across standards, schools, classes, and demographic subgroups. Use it to prepare for your next Professional Learning Community meeting, identify schools that need support, surface equity gaps, track progress against your district’s accountability targets, and pinpoint individual students for intervention or enrichment.

This article is for: System Admins, School Admins, Assessment Coordinators, Department Heads, and Instructional Coaches.


Quick navigation

  • Know who can see this report → Who Can Access This Report?

  • Access the report → How Do I Get to the Report?

  • Understand overall performance → What’s the Big Picture?

  • Analyze by standard → Which Standards Need Attention?

  • Compare schools → How Are My Schools Performing?

  • Find classes needing support → How Are Individual Classes Doing?

  • Identify equity gaps → Are There Equity Gaps?

  • Analyze patterns and trends → How Can I Explore Performance Patterns? (Deep Dive)

  • Review item-level data → How Did Students Perform on Each Question?

  • Track accountability goals → Are We Meeting Our Accountability Targets?

  • Share with my team → How Do I Share This Data With My Team?

  • Understand pending evaluations → What Does "Pending Evaluation" Mean?

  • Understand metric calculations → How Are the Numbers Calculated?

  • Narrow the report to specific cohorts → Using the Filter



Who can access this report?

Different roles see different levels of data. Here’s what each role can access:

  • System Admin — Sees all Common Assessments in the organization. Report scope: organization-wide (all students, all schools).

  • School Admin / Principal — Sees all Common Assessments connected to your school. Report scope: school-wide (all students in your school).

  • Assessment Coordinator — Sees all Common Assessments you created, plus all Common Assessments assigned to you. Report scope: organization-wide for Common Assessments you created; class-wide for Common Assessments assigned to you.

  • Teacher / Proctor — Sees all Common Assessments assigned to you. Report scope: class-wide only (see: Understanding Your Teacher Class Report).


Note: School Admins gain visibility based on the schools selected during Common Assessment sharing — regardless of when they are added to Wayground. System Admins always have access to all Common Assessments across the district.


How do I get to the report?

  1. Click Common Assessments in the left navigation.

  2. Click the assessment you want to review.

  3. On the Common Assessment Details page, click View Report.

The report opens to the Summary tab by default. Data updates once daily overnight, so recent submissions may take up to a day to appear. Check the "Last updated" timestamp in the report header before presenting data in meetings.

What’s the big picture?

In a nutshell

The top of the report shows overall accuracy, how many students participated, and how they’re distributed across performance bands.


Overall accuracy

A single percentage representing district-wide performance — the average of points earned by all student participants across the district, divided by the total possible points, expressed as a percentage.

Accuracy = (Sum of points earned by all participants) ÷ (Maximum possible points × Number of participants)

The formula uses each student’s best attempt only and counts only unique participants who submitted at least one attempt.
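As a quick worked sketch of the formula above (the point values and scores here are made up for illustration, not from any real report):

```python
# Hypothetical example of the accuracy formula:
# a 20-point assessment with three unique participants whose
# best-attempt scores are 14, 17, and 9 points.
best_attempt_points = [14, 17, 9]  # one entry per unique participant
max_points = 20

accuracy = sum(best_attempt_points) / (max_points * len(best_attempt_points))
print(f"{accuracy:.1%}")  # 40 earned / 60 possible -> 66.7%
```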

Performance band distribution

A color-coded bar showing what share of students fall into each proficiency level:

  • Did Not Meet (Red) — below 40%: key concepts may need reteaching.

  • Partially Met (Yellow) — 40–70%: understanding is emerging but incomplete.

  • Met (Green) — 70–90%: performance aligns with expectations.

  • Exceeded (Blue) — above 90%: ready for enrichment or extension.


Below the bar, you’ll see the participant count for each band (e.g., "24 participants - Did not meet," "267 participants - Partially met"). Hover over any segment to see details.

Note: These are default thresholds. Your Common Assessment creator may have configured different ranges during assessment setup. Performance bands can be updated at any time — changes reflect in reports within one day.
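If it helps to see the default thresholds concretely, here is a minimal sketch of how a score maps to a band. Exact boundary handling (for example, whether exactly 70% counts as "Met") is an assumption here, and your assessment creator may have configured different ranges:

```python
# Sketch of the DEFAULT performance-band thresholds described above.
# Boundary behavior at 40/70/90 is an assumption for illustration.
def performance_band(accuracy_pct: float) -> str:
    if accuracy_pct < 40:
        return "Did Not Meet"    # Red
    elif accuracy_pct < 70:
        return "Partially Met"   # Yellow
    elif accuracy_pct <= 90:
        return "Met"             # Green
    else:
        return "Exceeded"        # Blue

for pct in (35, 55, 82, 95):
    print(pct, "->", performance_band(pct))
```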


Participant count

The total number of unique students included in the report (e.g., "496 participants").

Report header information

The header displays:

  • Assessment title and metadata (Grade, Subject, Date range)

  • Creator name

  • Status indicator (e.g., "Ended")

  • Last updated timestamp — check this before presenting data in meetings

Report header actions

  • View resource — Preview the assessment content.

  • Download — Export the report as PDF or CSV.

  • Filter — Narrow the report to specific schools, classes, or subgroups.


See "Pending Evaluation" in the legend? Some students have Open-Ended questions teachers haven’t scored yet. Those students are temporarily excluded from accuracy. See "What Does 'Pending Evaluation' Mean?" below.


Which standards need attention?

In a nutshell

See accuracy and student distribution for each curriculum standard — so you know where students are strong and where they need support. Click into any standard to drill down to the school, class, and student level.


Go to Summary > Standards. Each standard shows:

  • Standard code and description (e.g., "TEKS.MATH.6.3D")

  • Number of items tagged to this standard (e.g., "4 items")

  • Accuracy percentage in a circular indicator

  • Participant distribution bar (color-coded by performance band)

Look for standards where a large share of students fall into "Did Not Meet" or "Partially Met" — these are your priority areas.

Drill down into a standard

Click on any standard row to open its drilldown panel. This lets you move from "this standard is underperforming" to "here’s exactly where the issue is."

The Standards drilldown shows:

Header: Standard code and description, Standard Accuracy percentage, Participant Distribution bar (color-coded by performance band, with counts).

Three sub-tabs:

  • Schools — Which schools are strong vs. struggling on this standard? Each school shows participant count and "Students in Met + Exceeded" percentage.

  • Classes — Which specific classes within those schools are driving the results?

  • Students — Individual students grouped by performance band (Did not meet, Partially met, Met, Exceeded), with expandable sections.


Items section:

  • Lists each question tagged to this standard

  • Shows "% Students Awarded Points" for each item

If an accountability goal is set, the panel organizes schools and classes into clear sections:

  • "Meeting accountability goal" — Lists cohorts above your target, with count (e.g., "8 of 8 schools")

  • "Not meeting accountability goal" — Lists cohorts below your target, with count and a "View all →" link

In your Professional Learning Community

"On TEKS.MATH.6.3D, all 8 schools are meeting your accountability target. But on TEKS.MATH.6.4H, only 5 of 6 schools are meeting the goal — let’s focus the discussion there."



How are my schools performing?

In a nutshell

See each school’s accuracy ranked against the district average — and click into any school for a full breakdown by standards and classes.


Go to Summary > Schools. Each school shows:

  • School name and participant count

  • Accuracy percentage displayed as a horizontal bar

  • District Average reference line (vertical dashed line) for quick comparison

Schools are listed in ranked order, making it easy to see which are above or below the district average.

Drill down into a school

Click on any school to open its drilldown panel. This is where you go from "this school is below average" to "here’s exactly which standards and classes are driving the gap."

The Schools drilldown shows:

Header: School name and participant count, School Accuracy percentage with comparison to district (e.g., "↑ 6% above district average"), Participant Distribution bar (color-coded by performance band).

"View school report →" link: Opens the school-level report in a new tab — great for sharing with principals.

Three sub-tabs:

  • Standards — This school’s accuracy on each standard, compared to the district. If an accountability goal is set, standards are organized into "Meeting" and "Not meeting" sections.

  • Classes — Per-class accuracy within this school.

  • Students — Individual students grouped by performance band, with expandable sections.



How are individual classes doing?

In a nutshell

Compare classes side-by-side to identify which are thriving and which need coaching support.


Go to Summary > Classes. Each class shows:

  • Class name (format: TeacherName Grade+Section SUBJECT SchoolCode)

  • Participant count

  • Accuracy as a horizontal bar

  • District Average reference line for comparison

Use this when: Planning coaching conversations, identifying high-performing classes whose instructional practices might be worth replicating, or spotting classes that may benefit from additional coaching or support.


Are there equity gaps?

In a nutshell

See performance by demographic group to identify disparities that need attention.


Go to Summary > Sub groups. Each demographic group shows:

  • Group name (e.g., Asian, Hispanic, Two or More, English Learners, Special Education)

  • Accuracy percentage in a circular indicator

  • Participant distribution bar (color-coded by performance band)

Click on any group to drill down and see school-by-school and class-by-class performance for that subgroup.

Sub group data is also available throughout the rest of the report:

  • Filter — Narrow the entire report (Summary, Deep Dive, Items) to a single sub group.

  • Drilldowns — Sub group data is reflected in Standards and Schools drilldowns.

  • Deep Dive — Filter the heatmap by sub group for focused analysis.

Use this when: Preparing equity reports, addressing board questions about achievement gaps, or prioritizing intervention resources.

Tip

Pair sub group data with the Student Deep Dive (below) to identify individual students within an underperforming group who need intervention.



How can I explore performance patterns? (Deep Dive)

In a nutshell

The Deep Dive is a heatmap that lets you spot patterns across multiple dimensions at once — schools vs. standards, schools vs. items, or students vs. standards.


Click the Deep Dive tab to access this view.

What you’ll see:

  • Sub-tabs: Standard Accuracy | Item Accuracy

  • "Group by" dropdown: Schools (default) | Classes | Students

  • Sort dropdown: Ascending | Descending

  • Heatmap grid: Rows are schools (or classes/students), columns are standards (or items)

Reading the heatmap:

  • The first row shows the District Average with overall accuracy

  • Each cell shows accuracy and is color-coded by performance band: Red = Did Not Meet, Yellow = Partially Met, Green = Met, Blue = Exceeded

Pattern interpretation:

  • Red column (low accuracy across all schools) — Curriculum gap or pacing issue with this standard.

  • Red row (one school low across all standards) — School needs broader support.

  • Isolated red cells — Specific school-standard combinations to investigate.

  • Green/blue clusters — Strengths to celebrate and share.


Use Filter (top-right) to narrow the Deep Dive to a specific subgroup before analyzing patterns.

Tip

Group by "Students" to view individual student performance across all standards. Compare students side-by-side to identify patterns, strengths, and needs. Filter students based on accuracy ranges to find specific cohorts for intervention and extension.



How did students perform on each question?

In a nutshell

See which questions students got right and wrong — and identify items that may reveal specific misconceptions.


Go to the Items tab to view question-by-question results.

What you’ll see:

  • Each question with its accuracy percentage

  • Questions tagged to specific standards

  • Response patterns across the district

How to interpret patterns:

  • Low accuracy on a single item — Possible confusing wording or an unfamiliar concept.

  • Low accuracy across items in one standard — The standard needs reteaching.

  • High accuracy across all classes — The standard was well taught; you can reduce review time.


Scroll down or click on the Response analysis area for any item to see:

  • The question type, the standard it was tagged to, and the points

  • The percentage and number of students who selected each option

You can also access item-level data through the Standards drilldown — each standard’s panel shows the items tagged to it with "% Students Awarded Points."

In your Professional Learning Community

Use item analysis to move from vague observations ("students struggled with fractions") to precise diagnoses ("Q6 had only 38% accuracy — let’s look at that question").



Are we meeting our accountability targets?


In a nutshell

Accountability goals let you set a performance target (e.g., "70% of students in Met or Exceeded"), and the report automatically flags which schools, classes, and standards are hitting it — and which are not.


If an accountability goal has been set for this Common Assessment, you see it reflected throughout the Summary tabs and drilldowns:

  • "Meeting accountability goal" sections show cohorts above your target with a count (e.g., "8 of 8 schools")

  • "Not meeting accountability goal" sections show cohorts below your target with a count and "View all →" link

  • Each cohort displays its Students in Met + Exceeded percentage, so you can see exactly how close (or far) each one is from your target

Setting or updating goals: Accountability goals are configured during Common Assessment creation under Reporting > Accountability Goals. You can update them at any time — changes reflect in reports within one day.

Use this when: Preparing for a school board presentation, updating your district improvement plan, or running a data review meeting where you need to show progress against targets.
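The "Meeting" vs. "Not meeting" flagging is simple to reason about. Here is a minimal sketch, assuming a 70% "Met + Exceeded" target; the school names and percentages are hypothetical:

```python
# Hypothetical sketch of accountability-goal flagging.
# TARGET and the per-school figures below are made-up examples.
TARGET = 0.70  # goal: 70% of students in Met or Exceeded

met_exceeded_share = {  # per-school share of students in Met + Exceeded
    "North Elementary": 0.78,
    "South Elementary": 0.64,
    "Westside Middle": 0.71,
}

meeting = [s for s, share in met_exceeded_share.items() if share >= TARGET]
not_meeting = [s for s, share in met_exceeded_share.items() if share < TARGET]

print(f"Meeting accountability goal: {len(meeting)} of {len(met_exceeded_share)} schools")
print(f"Not meeting: {not_meeting}")
```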


How do I share this data with my team?

In a nutshell

Export as PDF for meetings, or CSV for spreadsheet analysis.


Click the Download button (top-right) and choose:

  • PDF — A Wayground-branded report with summaries across Standards, Schools, Classes, Subgroups, and Item Analysis. Ready for leadership meetings and improvement plans.

  • CSV — Raw data for custom analysis in Excel or Google Sheets.

Great for: Professional Learning Community discussions, leadership team meetings, board presentations, school improvement plan documentation, and sharing with principals who want a quick overview.

Best practice: Use the PDF for stakeholder presentations and Professional Learning Community discussions. Use the CSV when you need to combine Common Assessment data with other sources or for custom analytics. For deeper interactive exploration (like drilldowns and Deep Dive), team members can log in to Wayground and use the in-product report.


What does "Pending Evaluation" mean?

Some assessments include Open-Ended questions (text, audio, video, or drawing responses). These require manual scoring by teachers before they can factor into accuracy calculations.

If you see "Pending evaluation" in the report:

  • What does it mean? Some students have submitted, but their teacher hasn’t scored all Open-Ended items yet.

  • What happens? Those students are temporarily excluded from accuracy at the assessment, standard, and item levels, so unscored responses don’t distort your data.

  • What should I do? Encourage teachers to complete evaluations. Once scored, those students’ data automatically rolls into the aggregates.


Note: Accuracy may shift as teachers complete evaluations. Check the "Last updated" timestamp in the report header before presenting data in meetings.



How are the numbers calculated?

Accuracy formula:

Accuracy = (Sum of points earned by participants) ÷ (Max points possible × Number of participants)

Calculation rules:

  • Participants — Only unique students who submitted at least one attempt are counted.

  • Best attempt — When a student has multiple attempts, only their highest score is used.

  • Open-Ended questions — Excluded from accuracy until manually evaluated by a teacher.


Why this matters:

  • Multiple retries don’t artificially inflate or deflate scores

  • Participation counts reflect actual students, not duplicate attempts

  • Open-Ended question integrity is maintained through teacher evaluation
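Putting the three rules together, here is a minimal sketch of the calculation. The data shape (a flat list of attempts with a per-student pending flag) is an assumption for illustration only:

```python
# Sketch combining the calculation rules above: keep each unique
# student's best attempt, and exclude students whose Open-Ended
# responses are still pending evaluation. Data below is hypothetical.
attempts = [
    {"student": "A", "points": 12, "pending": False},
    {"student": "A", "points": 15, "pending": False},  # A's best attempt
    {"student": "B", "points": 10, "pending": True},   # unscored Open-Ended
    {"student": "C", "points": 18, "pending": False},
]
MAX_POINTS = 20

best = {}
for a in attempts:
    if a["pending"]:
        continue  # excluded until a teacher completes evaluation
    if a["student"] not in best or a["points"] > best[a["student"]]:
        best[a["student"]] = a["points"]

accuracy = sum(best.values()) / (MAX_POINTS * len(best))
print(f"{len(best)} participants, accuracy {accuracy:.1%}")
```

Once student B's Open-Ended items are scored, B's best attempt would roll into `best` and the accuracy would update, which is why the number can shift between daily refreshes.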


Quick reference: report sections at a glance

  • How is my district performing overall? → Summary (top): overall accuracy, band distribution, participant count.

  • Which standards need attention? → Summary > Standards: per-standard accuracy and bands; click to drill down to schools/classes/students.

  • How are my schools doing? → Summary > Schools: per-school accuracy vs. District Average; click to drill down to standards/classes.

  • How are individual classes doing? → Summary > Classes: per-class accuracy vs. District Average.

  • Are there equity gaps? → Summary > Sub groups: accuracy by demographic group.

  • Where are patterns and trends? → Deep Dive: heatmap of schools × standards (or items).

  • Which questions were hardest? → Items tab or Standards drilldowns: per-question accuracy and % students awarded points.

  • Am I meeting accountability targets? → Summary drilldowns: "Meeting" / "Not meeting" sections in Standards and Schools panels.

  • How do I share this? → Download button: PDF for meetings; CSV for custom analytics.



Using the filter

Click Filter (top-right) to narrow the report to specific cohorts.

Filter options:

  • Schools — Show only selected schools.

  • Classes — Show only selected classes.

  • Sub groups — Show only selected demographic groups.


Use the search box to quickly find specific schools or classes. Click Apply Filters to update the report, or Reset Filters to clear selections.

Filters apply across all tabs (Summary, Deep Dive, Items), so you can analyze a specific cohort throughout the entire report.


