Introduction

Research publications and presentations at conferences are the main mechanisms for disseminating research findings, and conference presentations enter the published research literature through conference proceedings. Published literature is an indicator of scientific activity and global research partnerships. Additionally, analysis of how published literature is cited provides insight into the impact of research output. Scientific publications serve as a key linkage enabling public uses of scientific output (Yin et al. 2022).

This report presents data on research publication output by region, country, or economy and by scientific field; impact measures; and international collaboration. The first section examines comparative region, country, or economy data on publication output across science and engineering (S&E) fields and includes a sidebar on federal funding acknowledgments. The second section analyzes scientific impact as measured by bibliographic citations in research publications. The third section focuses on collaboration between researchers in the United States and those in other regions, countries, or economies by examining coauthorship and citation patterns. This section also includes a sidebar on artificial intelligence (AI) publication output and the collaboration network.

Bibliometric Data Preliminaries

This report analyzes S&E publications and citations using bibliometric data in Scopus, a database of scientific literature with English-language titles and abstracts (Science-Metrix 2021a). Because research activities are complex and multifaceted processes, the knowledge and social benefits that they produce are difficult to measure directly. Bibliometric data, including publications and citations, provide valuable indicators of research output because they are available across regions, countries, or economies and over time. Nonetheless, bibliometric analyses of publication and citation data remain proxy measures for the knowledge and social benefits produced by research activities, and they carry certain limitations. Publications themselves may represent differing “amounts” of research output: depending on field conventions and incentive structures, the same novel research findings may be reported in more or fewer publications. In addition, publications do not represent all types of research products, such as data sets (Franzoni, Scellato, and Stephan 2011; Sugimoto and Larivière 2018).

This report analyzes nearly 44 million English-language articles published from 2003 to 2022. The analysis included papers published in conference proceedings and research articles published in peer-reviewed scientific and technical journals (collectively referred to as articles). The analysis excluded editorials, errata, letters, and other materials that do not typically present new scientific data, theories, methods, apparatuses, or experiments. It also excluded working papers and preprints, which typically have not yet been peer reviewed, and articles published in journals that lack substantive peer review, sometimes referred to as predatory journals (Grudniewicz et al. 2019). Even with robust coverage and filtering, bibliometric data may retain biases or gaps in coverage, including a bias toward English-speaking regions, countries, or economies. Publication data are therefore best interpreted through longer-term trends, as year-to-year differences may be due to the process by which the information is indexed in Scopus. Additional details regarding document selection, limitations, and sources of bias are available in the Technical Appendix.
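As a schematic illustration (not the production pipeline used for this report), a document-type and publication-year filter of the kind described above might be expressed as follows. The record fields, type labels, and sample data are hypothetical and do not reflect Scopus's actual schema.

```python
import pandas as pd

# Hypothetical bibliographic records; column names and document-type labels
# are illustrative only and do not reflect Scopus's actual schema.
records = pd.DataFrame({
    "title": ["Study of X", "Erratum to Study of X",
              "Letter to the editor", "Proceedings paper on Y"],
    "doc_type": ["article", "erratum", "letter", "conference paper"],
    "year": [2021, 2022, 2020, 2010],
})

# Keep peer-reviewed journal articles and conference proceedings papers
# published 2003-2022; drop editorials, errata, letters, and similar
# material that does not present new research.
INCLUDED_TYPES = {"article", "conference paper"}
analyzed = records[
    records["doc_type"].isin(INCLUDED_TYPES)
    & records["year"].between(2003, 2022)
]
print(analyzed[["title", "year"]])
```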

Information about how research was produced (such as the field; the region, country, or economy of origin; and collaboration) may also be inferred from bibliometric data. For example, author affiliation data were used to determine publication output by region, country, or economy (through fractional counting) and international collaboration (through whole counting). The supplemental tables include calculations using both whole and fractional counting for the various indicators to illustrate the differences in results. Articles were categorized into S&E fields corresponding to the 14 fields of science in the National Center for Science and Engineering Statistics (NCSES) Taxonomy of Disciplines (TOD) (Science-Metrix 2019). Additional details regarding fractional and whole counting, field categorization, and limitations are available in the Technical Appendix.
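The distinction between the two counting methods can be shown with a short sketch. The example below assumes that fractional counting splits one credit per article according to each country's share of the article's author affiliations; the exact fractionation scheme used for the indicators in this report is described in the Technical Appendix.

```python
from collections import Counter

def count_publications(articles):
    """Credit articles to countries by whole and fractional counting.

    Each article is given as a list of author-affiliation country codes.
    Whole counting: every country appearing on an article receives 1 credit.
    Fractional counting (assumed here to split credit by affiliation share):
    each country's credit is its share of the article's affiliations, so the
    credits for any one article sum to 1.
    """
    whole = Counter()
    fractional = Counter()
    for countries in articles:
        # Whole counting: one credit per distinct country on the article.
        for country in set(countries):
            whole[country] += 1
        # Fractional counting: split a single credit by affiliation share.
        for country, n in Counter(countries).items():
            fractional[country] += n / len(countries)
    return whole, fractional

# An article with two U.S.-affiliated authors and one China-affiliated author
# counts as 1 for each country under whole counting, but as 2/3 (US) and
# 1/3 (CN) under fractional counting.
whole, fractional = count_publications([["US", "US", "CN"], ["US"], ["CN", "DE"]])
print(whole)       # Counter({'US': 2, 'CN': 2, 'DE': 1})
print(fractional)  # US ~1.67, CN ~0.83, DE 0.5
```

Whole counting credits every participating country in full, which is why it is suited to measuring collaboration, whereas fractional counting keeps the world total of article credits equal to the number of articles, which is why it is suited to comparing output across regions, countries, or economies.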