
Frequently Asked Questions


Quantitative Findings November 2021

Qualitative Findings May 2022

Answers

What was the IDEAL DEI Survey? When was it administered? To whom? Why?

In May 2021, Stanford administered a diversity, equity, and inclusion (DEI) survey as part of the IDEAL initiative. The survey primarily focused on how racial and ethnic identities shape the experiences of community members at Stanford. It included questions about demographic identities, experiences of inclusion and exclusion, and experiences with harassing and discriminatory behavior. The survey was administered to all current undergraduate and graduate students, including students in professional programs and students on leave; postdoctoral scholars; staff and academic staff (including SLAC and Stanford bargaining unit employees); and faculty.

What was the response rate for the 2021 IDEAL DEI Survey? Did you find any bias in who responded to the survey? 

The overall survey response rate was 36%: 14,907 respondents completed the survey out of the 41,052 current Stanford affiliates invited to take it. For the purposes of analysis, only survey respondents who completed and submitted their survey were included. At a broad level, survey respondents were representative of the campus community. Response rates among staff (44%) and faculty (38%) were slightly higher than for students (29% for undergraduate and graduate students and 31% for postdocs). Survey response rates were similar across racial or ethnic identity groups; the response rate was over 30% for community members with known racial or ethnic identities (in university records). (The response rate was 25% for respondents with “Unknown” race or ethnicity in university records and 25% for student and postdoc respondents classified as “International.”) The response rate was higher for respondents identified in university records as female (44%) than as male (29%). The survey data presented in the reports and dashboards on the survey website are not weighted to account for these differences. Note: University records currently contain only biological sex, so calculating survey response rates relative to the total university population required using biological sex instead of gender identity. The survey itself collected data on gender identity, which can be found on the survey project website.
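As a quick arithmetic check (not part of the survey methodology itself), the overall response rate quoted above is simply completed surveys divided by invitations:

```python
# Overall response rate: completed surveys / invitations, as reported above.
respondents_completed = 14_907
invited = 41_052

rate_percent = 100 * respondents_completed / invited
print(round(rate_percent))  # 36
```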

What is included in the report and dashboards? 

The reports and dashboards on this website primarily present quantitative findings from the survey, including data on demographics and identity, belonging and inclusion, microaggressions, and discriminatory and harassing behaviors. The report and dashboards do not currently include findings from the open-ended, qualitative data collected in the final section of the survey. We are taking extra care with these open-ended responses due to concerns about survey respondent privacy. Responses are being analyzed by a third party (an independent contractor) to help ensure the anonymity of survey participants and the confidentiality of their responses. We will post the findings from this analysis to the survey website during Winter Quarter of the 2021-22 academic year.

There were open-ended questions on the survey; what are you doing with responses to those questions?

There were multiple kinds of open-ended questions on the survey. For example, the survey analysis team processed the text provided by respondents offering further details about their race or ethnicity and other identity questions where open-ended responses were encouraged.  The last several questions on the survey asked participants to provide information about their experiences at Stanford and to let us know more about things that the University should do differently in the future, as well as to describe programs, resources, places, etc. that they felt have been particularly effective in improving the climate for diversity, equity, and inclusion at Stanford.  We have been especially careful with responses from these questions in order to make sure we protect the privacy of survey participants.  We engaged an external research firm to help analyze and summarize the responses. Reports on the findings from these analyses are now available on this website. You can find more answers about the qualitative survey reports below.

What are microaggressions? How did you measure them?

Survey respondents were asked about four types of negative interpersonal interactions they may have had at Stanford. These four types of interactions are referred to as microaggressions in the survey findings. The four questions asked in the survey were derived from existing literature on microaggressions. (For more about how other behaviors were defined in the survey, see our Definitions page.) The four questions are:

During the last two years you have been at Stanford (or fewer, depending on when you started at Stanford), has someone associated with Stanford…

Invalidated your individual lived experiences due to your racial or ethnic identity? 

For example:
Someone told me that they “don’t see color” or we should not think about race anymore
Someone told me that people of color don't experience racism anymore
Others assume that people of my racial background would succeed if they simply worked harder
Someone assumed that I had a particular skill set due to my race or ethnicity (e.g., good at math and science, athletic ability)

Assumed you were inferior due to your racial or ethnic identity?

For example:    
Someone told me that I was “articulate” after she/he/they assumed I wouldn’t be
Someone acted surprised at my scholastic or professional success
Someone assumed that I was poor
Someone assumed I come from a disadvantaged background

Acted as if they were afraid or wary of you due to your racial or ethnic identity? 

For example:
Someone avoided walking near me on the street
Someone clenched her/his/their purse or wallet upon seeing me
Someone’s body language showed they were scared of me
I was singled out by police or security personnel
A store owner followed me around the store

Made you feel othered or exoticized due to your racial or ethnic identity?

For example:  
Someone did not believe me when I told them I was born in the U.S.
Someone assumed that I spoke a language other than English
Someone wanted to date me only because of my race/ethnicity
Someone suggested I was "exotic"
Someone told me that all people in my racial group look alike or are all the same

What are harassing and discriminatory behaviors? How did you measure them? 

The survey asked about harassing and discriminatory behaviors as described below. We chose this approach, rather than using university definitions, because we did not want to limit the purview of the survey to the narrow definitions provided by university policy.

During the last two years you have been at Stanford (or fewer, depending on when you started at Stanford), have you ever experienced…

Verbal, written, or online harassing behaviors by someone associated with Stanford?

For example:
Someone made a derogatory remark or gesture in person or online
Someone sent me a derogatory email, text, or social media post  
Someone defaced property with derogatory graffiti  
I was embarrassed, humiliated, or threatened by someone in person or online

Physical harassing behaviors by someone associated with Stanford?

For example:
I was threatened with physical violence
I experienced physical violence
Someone tried to touch me without my consent
I was touched in a way that I did not want

Discriminatory behaviors by someone associated with Stanford?

For example:
Denied or overlooked for a promotion or leadership opportunity
Denied necessary accommodations
Denied or overlooked for a professional development or mentorship opportunity

(See Definitions page or Instrument for full list of examples of discriminatory behaviors provided in the survey.)

Can I get a copy of the survey instrument? 

Yes, we have posted the instrument on the website. Note that we have provided the instrument in several forms.

User experience: If you would like to get a sense of how respondents experienced the survey, click through the full survey as it originally appeared.  

Screenshots of user experience: If you would prefer to scroll through the entire survey, without having to click through the actual survey itself, view this PDF version of the survey with screenshots of the survey items, showing how the survey appeared to respondents. 

Technical appendix: If you would like to read the survey in its entirety, view this PDF version of the survey with all possible questions.

How were people assigned to demographic groups?

All demographic data are self-reported and derived from answers to questions in the survey. Note there were several demographic questions for which respondents could select more than one category (e.g., racial or ethnic identity, gender identity, sexual identity, religious identity). In the report and the dashboards, respondents are represented in every category they selected. For example, a respondent who identifies as both Black or African American and Hispanic or Latino/a is represented in both categories.

Why does the list of demographic groups for comparison vary by chart? Can I see the results by additional breakout groups? 

The decisions about which breakout groups to include in each dashboard were largely driven by a desire to show as much data as possible, while also ensuring the privacy of respondents. In the demographics dashboards, we have included all of the primary demographic questions asked in the survey. When transitioning to the dashboards that explore respondents’ experiences, fewer demographic identities are offered and some demographic categories are aggregated to protect the privacy of respondents, given small numbers of respondents in some categories among certain populations. The option to view data across all roles is only available where the same questions were asked of each population. (For more information on aggregation, see Definitions page.)

Why have you not shown results by School/VP unit or department?

The IDEAL survey was specifically designed to collect data about individuals' experiences. Survey findings show that the prevalence of these experiences differs substantially by the racial or ethnic identity (among other identities) of survey respondents. For the vast majority of departments and work units at Stanford, there were too few survey respondents to report findings broken out at the department/unit level across racial or ethnic identities while systematically maintaining the privacy of individual survey respondents. Protecting the privacy of survey participants is an important tenet of the survey process. In most instances, reporting survey findings at the department level by racial or ethnic identity, gender identity, or other key identities/demographics would put at risk the anonymity of individuals who participated and the privacy of their survey responses.

I am now worried about the anonymity of my responses. How are you ensuring the confidentiality of respondents?

Strict measures are in place to protect your privacy. Individual data will not be shared with anyone. The report and dashboards were designed to protect survey-takers’ privacy. The results are presented in summary form so that no individual can be identified. We are not reporting at the department level, and in charts we only show results for groups with more than 10 respondents. (For groups with 10 or fewer respondents, the responses are suppressed and the bar in the chart appears as blank.)  In addition, responses to long-form open-ended survey questions have been given to a third-party independent contractor for analysis, to further prevent the identification of individuals. A summary of the responses from these questions will be posted to the survey website during the Winter Quarter of the 2021-22 academic year. Individual responses will not be made available.

How do I interpret percentages? What is the denominator? How do I interpret questions in which respondents could select more than one category?

Among the demographic dashboards:

Generally speaking, the percentage should be interpreted as the proportion who selected a particular category out of everyone who responded to the question. Respondents who either did not answer the question or selected “Prefer not to say” are not included in the denominator.  

For some survey questions (low-income background, from the U.S., English as a native language), a respondent could select only one category. In these instances, the percentage is the proportion of respondents who selected the category in question out of all respondents who answered. For example, the percent of respondents who identified as coming from a low-income background is computed as the number of respondents who selected “Yes” out of the total number of respondents who selected any category other than “Prefer not to say” (i.e., who answered the question).

For some survey questions (racial or ethnic identity, religious identity, gender identity, and sexual identity), a respondent could select multiple categories. In these instances, the denominator is the number of respondents who selected at least one category other than “Prefer not to say.” If a respondent selected multiple categories, they are reflected in each percentage. For example, if someone identified as Black or African American and Hispanic or Latino/a, they are counted:

  • Once towards the total number of respondents who answered the question
  • Once towards the number of respondents who identified as Black or African American
  • Once towards the number of respondents who identified as Hispanic or Latino/a
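The multi-select counting rule above can be sketched in Python; the respondent data here are hypothetical, and the logic is only an illustration of the denominator and per-category counting described in this FAQ:

```python
# Sketch of the multi-select percentage rule: each respondent's answer is a set
# of selected categories. "Prefer not to say" selections and empty answers are
# excluded from the denominator. (All data below are hypothetical.)
responses = [
    {"Black or African American", "Hispanic or Latino/a"},  # counted in both groups
    {"Asian"},
    {"Prefer not to say"},  # excluded from the denominator
    set(),                  # did not answer; excluded
    {"White"},
]

# Denominator: respondents who selected at least one category other than
# "Prefer not to say".
answered = [r - {"Prefer not to say"} for r in responses]
answered = [r for r in answered if r]
denominator = len(answered)

def percent(category):
    """Proportion of answering respondents whose selections include `category`."""
    return 100.0 * sum(category in r for r in answered) / denominator

print(denominator)                           # 3
print(percent("Black or African American"))  # 33.33...
print(percent("Hispanic or Latino/a"))       # 33.33...
```

Note how the first respondent contributes once to the denominator but appears in the numerator of two different categories, which is why multi-select percentages can sum to more than 100%.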

Among the Belonging and Experience dashboards:

Generally speaking, the percentage should be interpreted as the proportion who selected a particular answer or category out of everyone who responded to the question. Respondents who either did not answer the question or selected “Prefer not to say” are not included in the denominator.  In each dashboard, the number of respondents who answered the question will be represented either by a chart with dark gray bars or in a table on the right side of the view. 

When comparing results across demographic categories,  the percentage should be interpreted as the proportion within a particular demographic group who selected a particular answer out of everyone in that demographic group who responded to the question. 

In the dashboards, why are there some charts where a bar is not displayed?

Groups for which 10 or fewer people responded have been suppressed in order to ensure the privacy of respondents.
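A minimal sketch of that suppression rule, using hypothetical group names and counts (the threshold of 10 comes from the text above):

```python
# Suppress any group with 10 or fewer respondents before charting.
# Group names and counts below are hypothetical.
group_counts = {"Group A": 250, "Group B": 9, "Group C": 42, "Group D": 10}

SUPPRESSION_THRESHOLD = 10  # groups at or below this size are not displayed

displayable = {g: n for g, n in group_counts.items() if n > SUPPRESSION_THRESHOLD}
print(displayable)  # {'Group A': 250, 'Group C': 42}
```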

What do you mean by “role at the university”? How were people assigned to a role?

Role at the university refers to the primary affiliation of the survey respondent with the university. This is predominantly determined by self-reported data in the survey. Respondents could have selected one of the following five categories:

  • Undergraduate or coterm student
  • Graduate student (including professional school student)
  • Postdoc
  • Staff member (including academic/teaching staff such as lecturers and visiting  scholars)
  • Faculty member or clinician educator

Institutional data were used to distinguish between faculty and clinician educators. This decision was made in part to reflect that clinician educators’ experiences are split across the academic and clinical realms.

Note: We are aware that survey respondents may identify with more than one role at the University. The survey allowed the respondent to select a single role for which they wanted to take the survey.

Can I see aggregated data for underrepresented minority survey respondents?

The survey data show that the prevalence of experiences reported by survey respondents differs substantially by racial or ethnic identity (among other identities). Therefore, we cannot meaningfully group multiple racial or ethnic identity categories to draw comparisons; doing so would lead to misinterpretation.

Will we receive more results from the survey data, other than what is in this report?

Yes, we will continue to analyze the results. We also hope to have ongoing conversations with the community to discuss what additional analyses would be helpful. For example, during Winter Quarter we will release a summary of the information provided in many of the open-ended questions from the survey.

I have a disability that makes it difficult to access the information on this site. Are the survey results accessible in a different format?

Stanford University is committed to providing an online environment that is accessible to everyone, including individuals with disabilities. If you cannot access this content or use any features on this site, please email ideal_deisurvey@stanford.edu to obtain alternate formats.

Will I be able to access anonymized, row-level survey data for my research?

No. Study participation consent precludes sharing row-level data and the use of project data for publication. If you have additional questions or ideas for analysis, please reach out to the research team at ideal_deisurvey@stanford.edu or use the anonymous feedback form.

I did not participate in the survey, is there a way for me to provide feedback to the university on these issues? Can I participate in the Action Steps and other initiatives that are being planned?

We know that many people at Stanford did not take the IDEAL DEI Survey. We welcome feedback from anyone in the Stanford community about the issues covered by this survey. If you would like to provide input to the survey team or to the broader IDEAL effort, please consider using our (anonymous) feedback form or send an email to ideal_deisurvey@stanford.edu. Regardless of whether or not you took the survey, we encourage the participation of everyone at Stanford in the actions and initiatives that are currently underway, as well as those that will come in the future.

Who can I contact if I still have questions not answered by these FAQs? How can I provide feedback?

If you have additional questions or feedback regarding the survey or report, please email ideal_deisurvey@stanford.edu or use the anonymous feedback form.

Who did the analysis for the qualitative findings?

The IDEAL DEI survey team removed any additional identifying survey data and provided the survey comments to an independent research firm, Actionable Insights (AI). In total, 6,686 survey respondents (46% of all survey respondents) provided comments across the three questions, totaling more than 680,000 words of text. The firm reviewed the text and developed a taxonomy of themes across the content (see the Qualitative Analysis Methodology section of this website for further details).

In the qualitative findings, how were the quotes selected?

After Actionable Insights analyzed the comments for thematic content, the IDEAL Survey team worked with the research firm to select a sample of comments that were representative of each theme. We present the key findings that emerged from survey comments by using respondents’ own words to illustrate broader themes. The quotes provided in the reports on this website should be viewed as representative of many other very similar comments.

The quotes selected for this report:

  • Relate personal, lived experiences of individuals in the Stanford community.
  • Illustrate how race, ethnicity, and other identities can influence interactions, behaviors, and opinions at Stanford.
  • Are drawn from all roles at the University represented in the survey data: students, faculty, postdocs, and staff.
  • Illustrate the main topics within a relevant theme as well as variation across comments within a theme.
  • Convey a community-member’s experience without revealing identifying details.

Note: The comments represent a sample of survey respondents’ experiences. No single comment should be viewed as representative of the experiences of all members of an identity or group at Stanford, nor are the themes and individual quotes presented necessarily generalizable to Stanford community members who did not take the survey.

Who provided comments on the survey? Were they representative of all survey takers?

This FAQ is about the comments on the survey; for more information about the overall representativeness of the IDEAL DEI Survey, see the About the Survey section on the project website.

The 6,686 respondents to the open-ended questions were demographically similar to the 14,907 survey respondents overall. White respondents, women, and native English speakers were slightly over-represented in the survey comments compared to the overall number who completed the survey, while Asian/Asian American respondents and men were slightly under-represented.

[Chart: commenter demographics of race, gender, and English language status were similar to those of the overall survey population]

Note: When asked about race/ethnicity and gender identity in the survey, respondents were asked to select all that apply. In these charts, a respondent is represented in every identity they selected. For example, someone who identified as both Black or African American and Hispanic or Latino/a is represented in the percentage in both bars.

How did you ensure the privacy of respondents’ quotes?

The qualitative reports summarize the most common themes that emerged along with direct quotations from survey respondents to illustrate the theme. Multiple research team members reviewed the comments provided to ensure respondent privacy, and some quotes in this document have been lightly redacted (without altering the substance or sentiment of the quote) to preserve the anonymity of the respondent. If you are concerned about the privacy of your responses to this survey, please contact ideal_deisurvey@stanford.edu.

Why did you choose an external research firm to analyze the survey comments? How did you select the research firm?

Due to the volume of the survey responses and the need to ensure respondent privacy, we decided it was important to engage an external research firm. The IDEAL Survey team put out a request for proposals to multiple external research firms that specialize in qualitative analysis. Out of two finalist firms, Actionable Insights was chosen based on their extensive qualitative research experience, DEI subject matter expertise, and track record of delivering research findings that supported decision-making efforts at large organizations.

What did we learn from the comments written by survey participants?

Some of the important findings that emerged from participants’ comments were:

  • The specific ways that community members experience microaggressions, harassment, and discrimination.
  • The personal and professional impact of harmful experiences.
  • Aspects of university structure and organization that create harmful dynamics and reduce accountability.
  • Specific strategies and approaches the community would like the university to pursue.
  • Examples of programs and initiatives that are working well.

Did participants provide feedback about the survey itself?

Yes, some participants provided comments about the survey design and their experience taking the survey. These comments will be used to help inform future IDEAL survey research.