NWEA Frequently Asked Questions
What is NWEA?
NWEA stands for Northwest Evaluation Association, a company whose sole focus is developing growth-based assessment tools for educators. We chose NWEA after examining the quality of their metrics and their ongoing commitment to refining these tools.
What is the NWEA MAP?
In Manhasset, we use the NWEA MAP (Measures of Academic Progress) to measure reading and math growth in grades K-6. The MAP assessments are comprehensive, nationally normed, and research-based; they are designed to determine students’ skill levels in reading and math. NWEA results provide teachers and students with data that can be used in the classroom to target students’ needs and track student growth over time.
How long is the test?
The MAP is untimed, but most students take approximately 45 minutes to complete each section of the assessment. The assessment is given over two days.
How often do we assess students?
We assess students three times a year in the fall, winter, and spring.
How is the assessment administered?
Unlike most paper-and-pencil tests, where all students are asked the same questions and spend a fixed amount of time taking the assessment, MAP Growth is computer adaptive. This means that every student receives a unique set of test questions based on his or her responses to previous questions. As the student answers correctly, questions get harder; if the student answers incorrectly, the questions get easier. By the end of the test, most students will answer about half the questions correctly, as is common on adaptive tests.
What is a RIT (Rasch Unit) score?
When students finish their MAP Growth test, they receive a RIT score for each area in which they are tested (reading and math). This score represents the level at which a student answered questions correctly 50% of the time. The RIT scale is a stable scale, like feet and inches, that accurately measures student performance regardless of age or grade level. Just as marking height on a growth chart lets you see how tall your child is at various points in time, RIT scores let you see how much your child has grown between tests.
Is MAP Growth a standardized test? How is it different from ‘high-stakes’ tests?
Unlike most standardized tests, MAP Growth is administered periodically during the school year, and it adjusts to each student’s performance rather than asking all students the same questions. When we talk about ‘high-stakes’ tests, we are usually talking about a test designed to measure grade-level proficiency. MAP Growth is designed to measure student achievement in the moment and growth over time, regardless of grade level, so it is quite different.
How do schools and teachers use MAP Growth scores?
NWEA provides many different reports to help schools and teachers use MAP Growth information. Teachers see the progress of individual students and of their classes as a whole. Students with similar MAP Growth scores are generally ready for instruction in similar skills and topics. Principals and administrators can use the scores to see the performance and progress of a classroom, grade level, school, or the entire district.
My child’s current administration RIT score is lower than his/her previous assessment score. Should I be concerned?
The MAP is designed to be given several times a year so that schools can examine student performance trends across multiple administrations, which minimizes the effects of testing inaccuracies. Manhasset uses MAP data in conjunction with other academic data to develop a comprehensive picture of student achievement. While we examine student trends carefully, MAP is never the sole measure used to determine students’ instructional needs. You may see dips between administrations. One possible cause for a perceived lack of growth is that a student scored very high on the initial administration and was therefore more likely to score lower on a second administration, a statistical phenomenon known as “regression to the mean.” Additionally, the MAP tests are designed to assess content commonly taught in specific grade bands, so scores may not reflect the content or skills currently being taught, especially for students learning skills beyond their grade level. This “dip” does not necessarily mean student growth has stalled; the MAP assessment may simply not have captured what your child has learned at this time. As always, concerns about your child’s growth should be discussed with your child’s teacher, principal, or content area coordinator.
I noticed that the percentiles from previous administrations listed on my child’s Fall 2020 report are different from the percentiles listed on progress reports from previous years. Why did the percentiles change?
NWEA updates its norms studies every three to five years. The most recent norms study was released in July 2020, and students’ RIT scores from all administrations, including those from previous years, are compared to this updated nationally normed reference data. Percentiles from the 2017-2019 school years were originally computed based on 2015 norms data. Whenever NWEA releases new norms, it is likely that the overall distribution of RIT scores will change, meaning that percentile ranks for some students will change as well. Changes in the norms reflect a change in the standard, not a change in student performance. There are three key differences between the 2015 and 2020 norms studies:
1. Updated Data: The 2020 norms use more recent data (Fall 2015 through Spring 2018). Using recent data ensures that students’ scores and percentiles are aligned with updated estimates of achievement and growth norms.
2. Larger sample size: The 2020 study is based on data from more than 24,500 public schools across 5,800 districts in all 50 states. The 2015 study drew from 23,500 schools across 6,000 districts in 49 states.
3. Better Alignment with District Calendars: The 2020 norms use 2.5x more data points tied to district testing calendars. Incorporating more data tied to district calendars gives NWEA a more accurate measure of instructional exposure, resulting in more accurate norms.
Why does NWEA use norms studies?
Up-to-date norms allow educators to compare achievement status—and changes in achievement status (growth)—to students’ performance in the same grade at a comparable stage of the school year. Norms help us see if students are growing at an expected pace, regardless of where a student started, and allow us to make predictions about what kind of growth is typical and atypical.
Each time NWEA researchers create new norms, they work to improve the utility, accuracy, and generalizability of the norms for schools. For example, in the 2015 norms, they refined the use of school calendars to provide growth information that better reflects the number of instructional days available to the student between tests. NWEA modeled the norms in a way that is more sensitive to the impact of summer vacations on learning. They included both student-level norms and school norms. Additionally, to make sure the norms are of the highest quality, both in terms of accuracy and utility, NWEA subjected the findings and report to numerous internal reviews and engaged external experts to review and critique the report.
We welcome additional questions regarding your child’s progress. Please contact the following individuals with your comments and concerns:
For help with:

Your child’s learning profile and specific strengths/concerns in the classroom:
Your child’s teacher

Testing administration questions and general concerns:
Your child’s principal (Munsey Park: Mr. Chad Altman; Shelter Rock: Mr. Rich Roder)

Interpretation of test reports or testing administration inquiries:
Mr. Adam Kuranishi, Administrator of Assessment and Data Analysis

Questions about the ELA program or specific skill concerns:
Dr. Rebecca Chowske, District Coordinator of English Language Arts and Reading

Questions about the math program or specific skill concerns:
Ms. Lauren Tallarine, District Coordinator of Mathematics and Business

Dr. Gaurav Passi, Assistant Superintendent for Curriculum & Instruction