Copilot Frequently Asked Questions

Conditions that Support Learning

What are learning conditions?

Research shows that many conditions affect students’ desire and ability to learn. When deciding which learning conditions to focus on, PERTS considers several factors:

  • Is there scientific evidence that the learning condition affects learning? The condition must have a well-established relationship to student success.
  • Can educators directly influence the learning condition? Many conditions affect whether students can learn effectively. Some of those conditions, like hunger and sleep deprivation, can be difficult for educators to directly influence. Other learning conditions, meanwhile, are very much within educators’ sphere of influence. For example, educators’ decisions affect whether students get the frequent, specific feedback they need to quickly improve their understanding, whether students feel like they belong in class, or whether students believe they can be successful in class.

Why measure learning conditions?

Even when educators recognize that specific learning conditions are important for student success—and even when educators try their best to create these conditions—doing so can be challenging. One of the main barriers is that, often, the messages educators intend to send are not the messages students receive. In other words, there’s often a big gap between educators’ intentions and students’ experiences. Absent feedback from students, it can be very hard to recognize or close that gap.

Copilot makes it easy for educators to collect student feedback using validated measures. It also protects students’ confidentiality so that they are comfortable being honest.

How does Copilot measure learning conditions?

Students are asked carefully designed questions that have been developed and tested by leading researchers. The questions help educators identify how students’ experiences in the classroom could be supporting or getting in the way of learning. This enables educators to take targeted actions to influence these experiences and, thereby, improve learning conditions over time.

What else are students asked on the survey?

Copilot surveys periodically ask students several questions in addition to the learning condition questions described above. Some of these questions enable us to disaggregate data in different ways (e.g., to show educators how students from different groups might experience a learning environment differently). To keep the surveys short, most of these questions are only asked periodically or only of random subsets of students. These questions ask about students’ race, gender, prior grades, learning mindsets, comfort answering questions honestly, level of effort in a given class, perceptions of their teachers’ mindsets, and other similar factors. You can always preview the survey.

How can I preview the student survey?

To preview the student survey, sign into Copilot, click “Settings” on the navigation bar on the left, then click “Survey Settings,” and then click “Preview Survey.”

What if learning conditions improve but learning outcomes don’t?

Fostering student success is less like turning on a light switch, and more like growing a garden. It’s a developmental process that takes time and is influenced by multiple factors:

  1. Time Course: Effective learning is a collection of behaviors that depends on abilities and habits that take time to develop. For example, even if a student decides to study more, it might take her time to realize that she has to start going to the library after school instead of going back to a distracting home environment. Therefore, even if a student is more inclined to be engaged, it might take time for those desires to translate into the habits and study skills that are markers of effective learning. Give it time.
  2. Multiple Determinants: Creating hospitable conditions for learning is like cultivating a garden: water, sunlight, and a variety of nutrients are each necessary but insufficient for all plants to grow optimally. Successful gardeners do not simply focus on providing sufficient water or providing sufficient sunlight; they attend carefully to each of those conditions with respect to the needs of each of their plants. Furthermore, they are especially mindful of the special care certain plants might need, e.g., because of prior neglect. In the same way, we suggest you work systematically over time to establish multiple hospitable learning conditions.

Eligibility and Cost

How much does it cost to use Copilot?

Copilot is free to U.S. schools and colleges thanks to support from the Raikes Foundation, Overdeck Foundation, and Gates Foundation. For more information, please contact

What grade levels can use Copilot?

Most Copilot surveys are appropriate for students in grades six and up; reading comprehension is the main barrier to younger students participating. Copilot-Elevate is optimized for students in grades 6-12, while the College Student Experience Tracker (C-SET) is designed for college students of any age.

What subject areas are the surveys relevant to?

Our surveys are primarily designed for use by academic instructors of math, humanities, foreign languages, and the physical and social sciences. Regardless of subject area, we strongly recommend against surveying students in more than 4 of their classes at the same time because of survey fatigue.

Are PERTS Programs available in multiple languages?

Our materials are only available in English at this time.


What is Copilot?

Copilot is an advanced professional learning platform that helps educators create supportive and equitable conditions for learning. Copilot enables educators to get rapid feedback from their students about how they are experiencing key learning conditions. Pilot data suggests that Copilot helps educators systematically improve learning conditions and that better learning conditions promote higher and more equitable academic achievement. You can access Copilot at

What is a cycle?

It takes significant skill to create a supportive and equitable learning environment that enables all students to reach their potential. To master that complex skill, educators need ongoing feedback and practice. That’s why Copilot is set up as an iterative process, organized around multiple cycles of inquiry and action.

In each cycle, educators start by collecting feedback from their students through a student survey. Then educators can investigate and test ways to improve those experiences. In the next cycle, educators use the survey to see if their test was successful and to continue to make adjustments to their practice. Cycles typically last 2-6 weeks, depending on the preferences of the educators using Copilot.

What if we only want to measure a subset of learning conditions?

In the Copilot “Settings” screen for your project, the project lead or host will see a section called “Survey”. In this section, they can control which learning conditions are measured in the survey, turning specific learning conditions on or off. In the C-SET program, measures cannot yet be turned off.

What about survey fatigue?

Students get fatigued by surveys that are long and repetitive. They also grow tired of answering questions if no one acknowledges their responses or addresses their concerns. Unfortunately, many school surveys are designed without regard to students’ or teachers’ experiences, and we (at PERTS) find that troubling.

We would be frustrated if someone asked our opinion over and over in five different ways and then gave no indication that anything meaningful would change as a result of our answers. We empathize with the students and with the educators who are sometimes required to take, or to administer, less-than-useful surveys.

In sharp contrast, we designed our surveys to be part of an empowering process that gives students voice and provides educators with rapid insights that they can act on immediately. Unlike many other surveys, Copilot surveys:

  • Are short. Surveys take students less than 10 minutes to complete, and even less time if certain questions are turned off.
  • Give students voice in shaping their classroom experience. We provide guidance to educators on how to frame the surveys for their students (see How do I administer surveys to my students?). When educators follow this guidance, it helps ensure their students understand that their opinions are valued and that their responses will influence what happens in class.

How do I administer surveys to my students?

Set up in advance. Make sure each of your students can access an internet-enabled device like a laptop, tablet, or smartphone. Students do not all need to take the survey at the same time, so you don’t need one device per student as long as each student can access a device at some point. Finally, make sure your students have their Roster ID handy so that they can log in to take the survey.

Administering the survey. It’s important to appropriately introduce students to the survey so that they understand and buy into its purpose. You can find detailed recommendations for administering the survey (including unique sign-in codes) by following these steps:

  1. Sign into Copilot and select your project (you’ll be automatically redirected if you only have one).
  2. On the menu to the left, select the cycle for which you are currently surveying students.
  3. Click the “Survey Instructions” button.

Handling makeups. Students who were absent can take the survey at a different time, as long as it is within the same cycle.

Receiving reports. You should receive new reports on the Monday following your student survey.

Debriefing. To get authentic feedback from your students consistently, students need to trust that their responses are being taken to heart. Intentionally debriefing your report reflections with students helps cultivate that trust: it demonstrates to students that you value their voice and care about their experiences in your class. Here are a few suggestions for engaging students in this type of dialogue.

  • Demonstrate to your students that you’re listening by sharing back what you learned. Of course, you don’t have to share the actual reports or even specific figures with students. But you can share with them where you’re doing well and what you’re working to improve based on the feedback you received.
  • Dig deeper with students through class discussions or interviews.

Data Disaggregation

Why disaggregate results?

In our school system—and in society more generally—systematic disparities exist between the opportunities afforded to members of different groups. Disparities in opportunity are especially pronounced by race, gender, and socioeconomic status. For example, Black, Latinx, and Native American students are less likely to receive the support they need to reach high academic standards. Copilot reports help educators compare the experiences of members of different groups so that, if significant disparities exist, educators can take additional steps to understand and mitigate those disparities.

How does Copilot disaggregate results?

By default, Copilot reports disaggregate student experience data by gender and by race-ethnicity group membership. Certain Copilot programs also disaggregate results by financial stress, while others enable educators to create a custom target group.

How are race-ethnicity data collected?

The first time that students complete a Copilot survey, they are asked which racial and ethnic groups they identify with. Our priority in asking students to identify their racial and ethnic background is to (1) help educators understand how students from structurally disadvantaged groups are experiencing the educational environment; and (2) provide students from all racial, ethnic, and nationality backgrounds the chance to select at least one option that closely matches their self-identification and lived experience. In doing this, we recognize that the experiences of sub-populations within large race and ethnicity categories may be meaningfully different from one another and that it is important to reflect to students an understanding of this diversity, rather than forcing them to identify with broad categories that may not resonate with their lived experiences. For these reasons, the Copilot survey asks about racial and ethnic group membership using the following question.

With which group(s) do you identify? Please select the box(es) that apply.

[Screenshot of survey question]

Why are race data grouped in reports?

When students self-identify their race and ethnicity in PERTS surveys, they can choose from 17 different categories and identify with multiple races (see How are race-ethnicity data collected?). However, results in reports are not broken out by those fine-grained categories because that level of disaggregation would make it impossible to maintain the confidentiality of students’ responses. (To maintain students’ privacy, Copilot reports only show results for a given group when that group has at least five members. That means, for example, that reports could not show results from Native American students if there were fewer than five Native American students on a given roster.)
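The minimum-group-size rule described above can be sketched in a few lines. This is an illustrative sketch only; the function and field names are hypothetical, not Copilot’s actual implementation:

```python
# Illustrative sketch of a minimum-group-size suppression rule (n >= 5).
# All names here are hypothetical; this is not Copilot's actual code.

MIN_GROUP_SIZE = 5

def reportable_groups(responses, group_field):
    """Count responses per group, then suppress any group with fewer
    than MIN_GROUP_SIZE members so small groups stay confidential."""
    counts = {}
    for response in responses:
        group = response[group_field]
        counts[group] = counts.get(group, 0) + 1
    return {g: n for g, n in counts.items() if n >= MIN_GROUP_SIZE}

# A roster with 3 Native American students and 11 White students:
roster = [{"race": "Native American"}] * 3 + [{"race": "White"}] * 11
print(reportable_groups(roster, "race"))  # {'White': 11}
```

With fewer than five Native American students on this hypothetical roster, that group is dropped from the report entirely rather than shown.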

Even though we cannot break out data by each distinct race and ethnicity for privacy reasons, we believe it is important to give educators insight into whether students’ experiences in their classrooms are reinforcing or interrupting the nationally observed disparities in educational opportunity.

As a compromise between the need to maintain student confidentiality and the desire to provide teachers with disaggregated data about opportunity gaps, Copilot reports group together certain races and ethnicities. Using national statistics on disparities in academic and disciplinary outcomes as a guide, reports group together students who self-identify as Black, Latinx, and/or Native American. Reports contrast their experiences to those of White and Asian students (who are comparatively advantaged according to the same national statistics).

This compromise enables educators to get some insight into opportunity gaps in their classrooms while simultaneously protecting student confidentiality. To support more fine-grained analysis, certain Copilot programs enable educators to create a custom target group that more accurately represents the groups of students in their local context who are situated farthest from opportunity.
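The grouping logic described above can be expressed as a simple mapping. This sketch uses hypothetical category labels, and the treatment of a student who selects categories from both groups is an assumption on our part, not documented behavior:

```python
# Illustrative sketch of the report grouping described above.
# Category labels are hypothetical; counting a student who selects
# categories from both groups in the combined group is an assumption.

GROUPED = {"Black", "Latinx", "Native American"}
COMPARISON = {"White", "Asian"}

def report_group(selected_races):
    """Map a student's self-identified race(s) to a report grouping."""
    races = set(selected_races)
    if races & GROUPED:  # assumed precedence for multiracial students
        return "Black, Latinx, and/or Native American"
    if races & COMPARISON:
        return "White and Asian"
    return None  # other categories fall outside this particular contrast

print(report_group(["Latinx", "White"]))  # Black, Latinx, and/or Native American
```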

What is a target group on Copilot?

In many schools and colleges, specific groups of students are situated further from opportunity. Often, these students are members of specific racial, ethnic, or gender groups; English language learners; or students who are members of multiple intersecting demographics (for example, men of color). If educators are aware that a specific group of students is situated further from opportunity in their context and want to disaggregate their results separately to better attend to their experiences, we encourage them to use the “Target Group” feature in Copilot to define a custom target group that specifically includes those students. The Target Groups feature is intended to help educators adopt a Targeted Universalist approach towards creating a more supportive and equitable learning environment.

When a target group is set, reports will display results separately for students in the target group and for those who are not. This enables educators to see whether members of the target group experience their classroom differently. In this way, educators can recognize whether and how students’ experiences in their own classes may be reinforcing—or mitigating—the opportunity gaps observed in other data.

How do I set up a target group in my reports?

To set up a target group, the project host should:

  1. Sign into Copilot
  2. Click “Settings” on the menu to the left (you might need to select a project first)
  3. Click “Report Settings”
  4. Define the target group

Note that the target group feature is not yet available in C-SET.


Privacy

We take privacy extremely seriously for both students and educators.

What about educator privacy?

To protect educator privacy, each roster entered into Copilot has a main contact who receives survey instructions and reports for that roster. The only person in a Copilot project who can see a roster’s reports is the main contact of that roster. PERTS encourages project members to share and discuss their reports with colleagues in order to get advice and share insights to the extent they feel comfortable doing so.

Note: it is possible for the project host or project lead to change the main contact setting for a roster. This means that project host Alice could, in theory, change the main contact of Bob’s roster to be Alice herself, which would allow Alice to see Bob’s reports. However, in that event, Bob would get an automated email informing him that Alice changed his roster settings.

How do you safeguard student privacy?

We take student privacy extremely seriously both because it’s the law, and also because it’s the right thing to do! Our technical security measures are described at, and our legally-binding privacy policy can be found at

Our strict privacy policy is specifically designed to meet the requirements spelled out in the Family Educational Rights and Privacy Act (FERPA), and hundreds of schools and colleges around the United States have implemented PERTS programs under the legal coverage provided by our privacy policy. Under the “school official” exemption in FERPA, schools can share identifiable information with an individual or organization that is acting in the capacity of a school official. To qualify as a school official under FERPA, the individual or organization must be acting on behalf of the school, must keep the information confidential, must only use that information for the purposes approved by the school, and must enable the school to control that information. For example, the school must be allowed to review, change, or delete the personal information. Under no circumstances can the information be shared with third parties without the school’s permission. The PERTS privacy policy meets all of those requirements. Check it out at

Can I see individual students’ responses?

We generally do not share individual students’ responses (with the exception of open-ended responses) because doing so could incentivize students to be dishonest and undermine the integrity of the process. Of course, we encourage you to have frank face-to-face conversations with your students, but one of the goals of the surveys is to give you an easy, confidential way to get honest feedback.

Why do my students need to log in to surveys with a Roster ID?

Copilot needs a way to recognize individual students over time so that it can accurately calculate how students’ experiences change over time. Our privacy policy forbids us from using the Roster IDs (or any other potentially identifiable information) for purposes not authorized by the school.

What can I use as a Roster ID?

PERTS strongly recommends that you use students’ school email addresses as their Roster IDs. This will make it easier for students to remember their Roster ID, and it will prevent ID collisions (where two students are accidentally given the same ID, resulting in their responses overwriting each other).
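The collision problem mentioned above is easy to illustrate. In a system that keys responses by Roster ID (a hypothetical data model, sketched here purely for illustration), a duplicate ID silently overwrites the other student’s answers:

```python
# Illustrates why duplicate Roster IDs cause responses to overwrite
# each other when responses are keyed by ID (hypothetical data model,
# not Copilot's actual storage scheme).

def store_responses(submissions):
    """Store the latest response per Roster ID, as an ID-keyed system would."""
    stored = {}
    for roster_id, answers in submissions:
        stored[roster_id] = answers  # a duplicate ID silently overwrites
    return stored

submissions = [
    ("jsmith@school.org", {"q1": 4}),  # unique email: safe
    ("jsmith", {"q1": 2}),             # ambiguous short ID
    ("jsmith", {"q1": 5}),             # second student given the same ID
]
result = store_responses(submissions)
# result["jsmith"] now holds only the second student's answers;
# the first student's responses are lost.
```

Using school email addresses as Roster IDs avoids this because each address is unique to one student.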

If you would like to use anything other than a school email address as the Roster ID, please contact to confirm the suitability of the alternate Roster ID. As per our strict privacy policy, PERTS only shares student login information with authorized individuals who have a legitimate need to access them, like the educators who set up the rosters (for details, please see our privacy policy).

Do I need IRB approval?

Schools and colleges have different rules about when an Institutional Review Board (IRB) needs to review and approve an activity. Typically, IRBs only need to review formal research activities that are intended to result in generalizable knowledge. In contrast, IRBs typically do not need to review quality improvement activities—activities that are intended to support an organization in improving its own practices and processes. PERTS Copilot programs are intended to help educators collect and use information for quality improvement purposes; therefore, IRB permission should not be required under most circumstances. However, if you intend to use the information you collect for formal research purposes, then you probably do need IRB approval. Also, some schools and colleges ask IRBs to review even quality improvement activities.


Why create a Copilot community?

A Copilot “community” is a group of projects that can be administered together by a community administrator. For example, say that three different instructors in the same math department each have their own Copilot project so that they can decide on their own measures and cycle cadence. The chair of the math department may set up a new community and ask those instructors to attach their projects to that math department community. Doing so would enable the department chair (and any other community administrators) to review and compare results across each of those projects and to adjust project settings, as needed.

How do you create a community?

  1. Sign into Copilot
  2. Click “home” in the top right
  3. Click “add” on the “Community” list
  4. Name the community and click “save”

How do you attach a project to a community?

  1. Get the community code from a community administrator. The community administrator will need to:
    1. Sign into Copilot
    2. Click home
    3. Click into the relevant community from the “Community” list
    4. Click the “add” button on the “Associated Projects” list.
    5. Provide the community code to the host(s) or lead(s) of the project(s) that will be associated with the community.
  2. Use the community code to associate a project with a community. The project host or lead will need to:
    1. Sign into Copilot
    2. Click home
    3. Click into the relevant project from the “Projects” list
    4. Click “Settings” on the left
    5. Click “Associated Communities”
    6. Paste in the community code

How do permissions work for communities?

When a project joins a community, it grants the community’s administrator(s) the same permissions over its settings and report data as are granted to the project lead or host.

Community administrators are able to:

  • See project-level reports
  • Change survey settings and project membership (including who is the project lead/host)
  • Update rosters

Can a project be associated with multiple communities?

Yes, you can associate a project with up to 5 communities. For example, maybe you want your project to be accessible to an instructional coach who works with three projects and also to a district leader who supervises twenty projects.


How can I help an educator build student buy-in for surveys?

Survey reports prominently flag problems with student buy-in. For example, they show if students are not comfortable being honest or if they do not believe their educator will use the data to improve their learning experiences. If an educator is struggling to get high student buy-in, we recommend trying some of these strategies:

  • Visit the educator’s classroom to see how they are framing the surveys to students.
  • Work with the educator, as needed, to create a plan for improving student buy-in. (The Survey Instructions page on Copilot has some strategies that have worked for other teachers.)

Why can't I access Copilot?

If you have issues accessing Copilot, a firewall may be blocking access to the site. Check with your IT department to see if they can resolve the issue. You can test your network access here:


About PERTS

Copilot was developed by the Project for Education Research That Scales (PERTS) in partnership with leading researchers and educational advocacy organizations. PERTS is a non-profit research and development institute that translates insights from psychological science into cutting-edge tools, measures, and recommendations that educators anywhere can use to foster student success effectively and equitably.

Key Partners & Advisors

Susan Colby, Founder & CEO, Imagine Worldwide

The College Transition Collaborative

Carol Dweck, Professor, Stanford University

Camille Farrington, Managing Director, UChicago Consortium for School Research

Becky Margiotta, Principal, Billions Institute

The National Equity Project

Jason Okonofua, Assistant Professor, UC Berkeley

Shift Results

University of Chicago Consortium for School Research

Greg Walton, Associate Professor, Stanford University

Key Funders

The Raikes Foundation

The Overdeck Family Foundation

The Bill & Melinda Gates Foundation