Notes

  1. Historically underrepresented groups are those whose representation in STEM is smaller than their representation in the U.S. population and has been so over time. These groups include women, blacks, Hispanics, and American Indians or Alaska Natives, among others.

  2. The ECLS-K sample is not nationally representative of all fifth graders. Statistics cited here are nationally representative of the population of students who were first-time kindergarteners in the 2010–11 school year and who were in fifth grade in 2016; this population does not include students who repeated or skipped a grade.

  3. Family poverty level was determined in spring 2011, when students were in kindergarten. Family income data were not collected in subsequent years of the study.

  4. Scale scores convert the total number of correct answers (the raw score) to a standardized score, which allows comparison of test scores across different editions of the test over time. Scale scores are used to compare demographic groups and to examine changes in scores over time.
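
     A minimal sketch of the idea, with hypothetical numbers: operational assessments such as NAEP derive scale scores through item response theory (IRT) models rather than the simple linear rescaling shown here, so this example only illustrates how standardizing within a test edition places scores from different editions on a common scale.

     ```python
     # Illustrative only: real assessments use IRT scaling, not this linear map.
     def to_scale_score(raw, raw_mean, raw_sd, scale_mean=150.0, scale_sd=35.0):
         """Rescale a raw score to a common reporting scale.

         raw_mean and raw_sd describe the raw-score distribution of one
         test edition; scale_mean and scale_sd are hypothetical constants
         chosen for the reporting scale.
         """
         z = (raw - raw_mean) / raw_sd      # standardize within this edition
         return scale_mean + scale_sd * z   # place on the common scale

     # Two editions of different difficulty: the same relative performance
     # yields the same scale score, so scores can be compared over time.
     print(to_scale_score(raw=40, raw_mean=35, raw_sd=10))  # 167.5
     print(to_scale_score(raw=28, raw_mean=24, raw_sd=8))   # 167.5
     ```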

  5. The scale for the main NAEP mathematics assessment at grade 8 is 0–500. In 2017, 80% of students scored between 233 and 333 (Table 1-2).

  6. Student eligibility for a free lunch program is an imperfect measure of socioeconomic status (SES) (Harwell and LeBeau 2010).

  7. The National Assessment Governing Board (NAGB), as directed by NAEP legislation, has developed achievement levels for NAEP since 1990. A broadly representative panel of teachers, education specialists, and the public helps define and review achievement levels. As provided by law, the achievement levels are to be used on a trial basis and should be interpreted and used with caution until the NCES commissioner determines that the levels are reasonable, valid, and informative to the public. This determination will be based on a congressionally mandated, rigorous, and independent evaluation. More information about NAEP achievement levels is available at https://nces.ed.gov/nationsreportcard/achievement.aspx.

  8. Beginning in 2009, NAEP administered a new science assessment to keep pace with advances in both science and cognitive research, the growth in national and international science assessments, advances in innovative assessment approaches, and the need to incorporate accommodations so that the widest possible range of students could be fairly assessed. Because this assessment is not comparable to the prior assessments administered beginning in 1996, long-term trends in science achievement cannot be reported.

  9. Although technology and engineering are important aspects of STEM, they receive less coverage in this report because few national data sources address these topics. The NAEP Technology and Engineering Literacy (TEL) assessment began providing national data for eighth graders when it was first administered in 2014.

  10. Actual scores for male and female students in 2014 were 148.6 and 151.4, respectively, a difference of 2.8 points, which rounds up to 3 points.

  11. NAEP TEL data are available at https://www.nationsreportcard.gov/tel/student-questionnaires/.

  12. Although the International Monetary Fund (IMF) does not include Russia among the world’s advanced economies, this analysis includes it because Russia is a large economy with high levels of student achievement and of science and technology capability. Other countries with high and rising levels of science and technology capability, such as India and China, are not included because they do not participate in the TIMSS assessment.

  13. The teachers of the eighth-grade students participating in the NAEP mathematics assessments were asked to complete a teacher questionnaire (see https://nces.ed.gov/nationsreportcard/bgquest.aspx). Because sampling for the teacher questionnaires was based on participating students, the responses to a particular teacher questionnaire do not necessarily represent all teachers of that subject at that grade level in the nation. In all NAEP reports, the student is the unit of analysis, even when information from the teacher or school questionnaire is reported.

  14. For more information about dual enrollment, see Shivji and Wilson (2019) and Fink, Jenkins, and Yanagiura (2017).

  15. This rate, known as the immediate college enrollment rate, is defined as the annual percentage of high school completers aged 16–24, including GED recipients, who enroll in 2- or 4-year colleges by the October after high school completion.
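
      As a minimal sketch with hypothetical counts, the rate is a simple percentage of the completer cohort:

      ```python
      # Hypothetical counts; the actual rate is estimated from survey data.
      def immediate_enrollment_rate(enrolled_by_october, completers):
          """Percentage of high school completers (ages 16-24, including GED
          recipients) enrolled in a 2- or 4-year college by the October
          after completing high school."""
          return 100.0 * enrolled_by_october / completers

      # Example: 2.1 million of 3.0 million completers enrolled by October.
      print(f"{immediate_enrollment_rate(2_100_000, 3_000_000):.1f}%")  # 70.0%
      ```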

  16. The analysis presented here is restricted to students who had enrolled in postsecondary education by December 2013, which captures students who had been enrolled in postsecondary education for up to 3 years after high school. This analysis uses NSF’s definition of STEM majors, which includes mathematics, natural sciences, engineering, computer and information sciences, psychology, economics, sociology, and political science. Students are considered to have declared a STEM major if the first or second major field of study they most recently reported was a STEM field.

  17. “Credits” refers to Carnegie credits. A Carnegie credit is equivalent to a 1-year academic course taken one period a day, 5 days a week.

  18. The skilled technical workforce (STW) definition used here combines the NCSES S&E and S&E-related occupations with a list of occupations derived using the methodology in Jonathan Rothwell’s Defining Skilled Technical Work, prepared in 2015 for the National Academies Board on Science, Technology, and Economic Policy project “The Supply Chain for Middle-Skilled Jobs: Education, Training, and Certification Pathways” (Rothwell 2015).

  19. The sample includes both students who earned a diploma or a GED before leaving high school and those who did not.

  20. NCES defines career and technical education as courses at the high school level that focus on the skills and knowledge required for specific jobs or fields of work.

  21. The STW section uses a significance level of 0.1 when testing comparisons. Its smaller sample sizes produce larger standard errors, which may cause comparisons not to be significant at the 0.05 level used in the other sections of this report. NCSES accepts comparisons at the 0.1 level, particularly when findings are of substantive interest to policymakers, educators, and the public.
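
      The sketch below, with hypothetical estimates and standard errors, shows a two-sided z-test of the kind commonly used to compare survey estimates (the report does not specify its exact test); with large standard errors, a difference can be significant at the 0.1 level but not at the 0.05 level.

      ```python
      from scipy.stats import norm

      def compare_estimates(est_a, se_a, est_b, se_b):
          """Two-sided z-test for the difference of two independent estimates."""
          diff = est_a - est_b
          se_diff = (se_a**2 + se_b**2) ** 0.5   # SE of the difference
          z = diff / se_diff
          p = 2 * (1 - norm.cdf(abs(z)))         # two-sided p-value
          return diff, p

      # Hypothetical percentage-point estimates with large standard errors.
      diff, p = compare_estimates(52.0, 1.5, 48.0, 1.8)
      print(f"difference = {diff:.1f} points, p = {p:.3f}")  # p ~ 0.088
      print("significant at 0.10:", p < 0.10)  # True
      print("significant at 0.05:", p < 0.05)  # False
      ```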