August 2025
Overview
The Executive Order “Restoring Gold Standard Science,” issued on May 23, 2025, aims to ensure that “federally funded research is transparent, rigorous, and impactful, and that Federal decisions are informed by the most credible, reliable, and impartial scientific evidence available.” The White House Office of Science and Technology Policy’s (OSTP) June 23, 2025, memorandum, “Agency Guidance for Implementing Gold Standard Science in the Conduct & Management of Scientific Activities,” provides guidance to agencies on implementing the Executive Order and sets out the requirements for developing and reporting on an implementation plan.
The General Services Administration (GSA) is committed to ensuring scientific activities adhere to the highest standards to enhance the rigor, transparency, and reproducibility of scientific research and reporting. Although GSA is not a recognized statistical or scientific agency and does not extensively conduct or fund scientific activities, we are committed to adhering to the Gold Standard Science (GSS) tenets, building on the robust standards, policies, and practices already in place for scientific activities. For instance, all program evaluation activities adhere to the GSA evaluation policy, which provides evaluation principles and standards aligned with Federal evaluation standards. Its principles of rigor, relevance and utility, transparency, independence and objectivity, and ethics also align with the GSS core tenets. In addition, GSA is actively developing a Data Quality Policy that includes data quality standards and a data quality assessment framework that closely align with the GSS core tenets.
This report outlines the GSA GSS implementation plan, detailing how the nine core tenets will be addressed, along with evaluation metrics, training protocols, technological integrations, and anticipated implementation challenges. It incorporates and builds on existing policies and practices, creating a plan that will be used to ensure that GSA supports rigorous, evidence-based science. Through this effort, GSA seeks to strengthen internal scientific activities, enhance cross-sector collaboration, and ensure that science used in decision-making is independent and driven by the best available evidence. This effort supports GSA’s dedication to fostering a culture of scientific excellence and accountability in service of the public interest.
GSS tenets
The implementation of GSA GSS is guided by nine essential tenets, as outlined in the Executive Order and its accompanying memorandum. These tenets are: reproducibility, transparency, clear communication of error and uncertainty, collaboration and interdisciplinarity, skepticism toward findings and assumptions, structuring for falsifiability of hypotheses, unbiased peer review, acceptance of negative results as positive outcomes, and freedom from conflicts of interest.
For GSS to be effectively integrated, these tenets must be reflected throughout the agency, including in its culture, funding opportunities, budget and resource allocation, and award selection processes. GSA’s agency culture already integrates these principles into its scientific activities, and the agency will continue to enhance adherence to them. For example, GSA offices engaging in scientific activities frequently maintain protocols and procedures that produce the comprehensive project documentation essential for study reproduction and replication. GSA is committed to ensuring that all offices involved in scientific activities develop these protocols, thorough documentation, statistically robust analysis plans, and appropriate controls.
In addition, GSA’s Evidence-Based Data Governance Executive Board (EDGE Board) was established as a result of the Foundations for Evidence-Based Policymaking Act of 2018 (Evidence Act). This board is dedicated to fostering an evidence-based, transparent culture among executives and data practitioners, promoting the sharing of data, methodologies, and best practices throughout the agency. The EDGE Board plays a crucial role in advancing GSA’s strategic data priorities, which support the agency’s mission and strategic plan. This board prioritizes funding opportunities, manages budgets, and allocates other resources to enhance the overall data utility and analytics across the agency. The board adjusts GSA’s approach, resourcing, and priorities as needed to maximize the use of data as a strategic asset, while ensuring adherence to established policies, protocols, and procedures.
GSA will continue to leverage the EDGE Board to enhance its data quality and capabilities, to incorporate the GSS tenets where applicable, and to ensure that scientific activities meet the highest standards across the agency. GSA’s management of the Analytics, Artificial Intelligence, and Generative AI Communities of Practice facilitates knowledge sharing of best practices in data preparation, data quality, analytical methods, and AI applications (among other subjects), fostering a culture that supports the GSS tenets at GSA and across Government. Evidence is essential to GSA’s mission and directly supports its strategic priorities to optimize the Federal real property portfolio, modernize procurement, and drive efficiencies across Government. GSA will ensure adherence to each of the GSS tenets defined below and provides specific examples of how the agency is currently implementing each tenet or plans to enhance adherence in the future.
- Reproducible. This tenet includes two essential pillars: reproducibility and replicability. Reproducibility is the ability to repeat a study using the same data and methods and achieve the same result, whereas replicability is the ability to test a hypothesis through independent studies or methods and achieve consistent results. GSA adheres to important practices for ensuring reproducible and replicable science, including ensuring that protocols, methods, and statistical approaches are well documented and shared when possible. For example, as part of the Office of Evaluation Sciences (OES) core project process, an evaluator blind to the evaluation team’s initial results conducts an independent analysis to ensure reproducibility of all primary study findings and to confirm that reported results are not dependent upon researcher discretion.
- Transparent. Transparency in science “entails the open, accessible, and comprehensive sharing of all components of the research process,” including methods, analytical tools, data, and findings. Transparency is incorporated into GSA’s culture, including through the aforementioned evaluation policy. GSA is transparent in planning, implementation, and reporting to enable accountability and learning. GSA keeps a public record of priority evaluations fielded and shares findings for those priority evaluations in a timely way.
Scientific activities will make data and technical documentation associated with work products publicly available (where appropriate), include clear and comprehensive explanations of the analytical methodologies associated with deliverables in language accessible to a nontechnical audience, and ensure that technical documentation complies with GSA data quality and metadata quality standards. For example, OES posts accessible analysis plans publicly for all of its evaluations prior to beginning analysis; these plans specify the methods, tools, data, and hypotheses for each evaluation.
- Communicative of Error & Uncertainty includes full disclosure of study limitations, potential sources of error, underlying assumptions, and measurement limitations to enable critical assessment of findings. GSA is committed to ensuring that all scientific activities include quantitative estimates of error and uncertainty in statistical analyses and reporting. Technical documentation will clearly explain the methodology for constructing these uncertainty estimates in language accessible to a nontechnical audience. Public reports will contextualize findings within their inherent uncertainty, cautioning against overinterpretation. The publicly available OES analysis plans include anticipated study design limitations, including precise descriptions of how uncertainty in findings will be estimated and reported. All unanticipated sources of uncertainty and their remedies are described in detail in the evaluation summaries. A minimal illustrative sketch of reporting an estimate together with its uncertainty appears after this list.
- Collaboration & Interdisciplinarity refers to the strategic integration of a wide range of expertise, methodologies, and perspectives across disciplines and sectors to address complex scientific challenges. GSA’s limited number of staff working on scientific activities collaborate internally and, as appropriate, externally with other Federal agencies and industry partners. Fostering and expanding interdisciplinary collaboration is a priority, specifically through joint projects and the adoption of interoperable data-sharing platforms. For example, before implementation, all OES study designs undergo review and improvement by independent researchers whose methodological and substantive expertise differs from that of the primary evaluation team. This extensive interdisciplinary collaboration is a hallmark of all OES evaluations.
- Skeptical of its Findings and Assumptions refers to critical and open-minded evaluation of research findings, methodologies, and underlying assumptions to ensure their validity, robustness, and reliability. Within GSA, a culture of constructive skepticism and the testing of alternative hypotheses is actively encouraged. OES ensures the transparency of its findings by disclosing its hypotheses (as per tenet 6) and uncertainty estimation methods prior to data analysis, clarifying underlying assumptions. All analyses are subjected to robustness checks, and comprehensive results are publicly released, subject to data privacy considerations.
- Structured for Falsifiability of Hypotheses enables hypotheses to be carefully tested and potentially disproven through appropriately designed research studies and experiments. Any scientific or research activities will use clearly articulated research designs that include testable hypotheses and specify criteria for falsification. Final reports must address the original hypotheses, the observed outcomes, and the interpretation of any falsifying results. For all confirmatory evaluations, OES posts clear, falsifiable hypotheses and specifies the evidence that will be used to evaluate those hypotheses in its analysis plans prior to beginning analysis.
- Subject to Unbiased Peer Review refers to qualified experts providing impartial and independent review of proposals and manuscripts of federally supported research. Reviewers of scientific activities will provide formal conflict-of-interest disclosures, and review panels will include interdisciplinary teams with a variety of perspectives, with review criteria and scoring rubrics clearly identified in advance. As an example of unbiased peer review processes, OES prioritizes independent feedback and impartial review from qualified external experts throughout its project process. Reviews are conducted using pre-specified criteria and rubrics. OES is also a leader in facilitating external replications and meta-analyses, providing study materials and analysis code to external researchers whenever permissible.
- Accepting of Negative Results as Positive Outcomes includes recognizing and valuing null or unexpected findings that fail to support a hypothesis. The GSA culture incorporates this into program evaluation activities, with null results for evaluation projects publicly posted on the OES website. For instance, OES reports and posts publicly (when privacy allows) all evaluation findings, including null or negative results. This transparency makes OES evaluations a trusted resource for understanding Government evaluations, and they are widely used by external researchers conducting systematic meta-analyses. Evaluations with null or negative findings are often among the most informative and are crucial for understanding and improving Government efficiency.
- Without Conflicts of Interest ensures that research is designed, executed, reviewed, and reported free from financial, personal, or institutional influences that could bias outcomes or undermine objectivity. As described in prior sections, full disclosure of financial and institutional interests is required for all staff, grantees, and reviewers involved with scientific activities.
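As referenced under the error-and-uncertainty tenet above, the sketch below illustrates in general-purpose terms what that tenet asks of quantitative reporting: pairing a point estimate with an explicit uncertainty interval. It is a hypothetical example using simulated data and illustrative group labels, not a description of any specific GSA or OES analysis or tooling.

```python
# Hypothetical illustration of reporting a point estimate with its uncertainty.
# The data are simulated and the group labels are illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=0)
group_a = rng.normal(loc=0.55, scale=0.10, size=200)  # simulated outcomes
group_b = rng.normal(loc=0.50, scale=0.10, size=200)

# Difference in means and its standard error
diff = group_a.mean() - group_b.mean()
se = np.sqrt(group_a.var(ddof=1) / len(group_a) + group_b.var(ddof=1) / len(group_b))

# Approximate 95% confidence interval using the normal quantile
z = stats.norm.ppf(0.975)
ci_low, ci_high = diff - z * se, diff + z * se

# Report the estimate together with its uncertainty, not the estimate alone
print(f"Estimated difference: {diff:.3f} (95% CI: {ci_low:.3f} to {ci_high:.3f})")
```

Reporting both the estimate and its interval, as above, is the kind of disclosure the tenet calls for; the specific statistical methods and tools will vary by study.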
Metrics and evaluation mechanisms
Developing key metrics and evaluation mechanisms is crucial to assess how effectively GSA is integrating the GSS tenets and their influence on scientific quality. This will enable us to identify areas for continued improvement of our scientific activities.
The initial step involves identifying all offices and key personnel engaged in scientific activities. For each of those offices:
- Catalog types of scientific activities and specific ongoing or planned scientific projects.
- Identify existing policies, practices, and protocols that address GSS tenets and review for alignment.
- Ensure staff involved with scientific activities are aware of GSS and the GSA implementation plan and metrics.
A data call will be sent to all GSA offices and key staff conducting scientific activities for next year’s report. The exact questions will be developed during FY26, though critical concepts collected will include the following:
- The GSA offices conducting scientific activities and the types of scientific activities being developed and implemented.
- The extent to which the nine tenets have been incorporated into scientific activities.
- Any changes in practices since implementation of GSS.
- How technology has been incorporated into projects, particularly in facilitating data collection and analysis.
- Challenges with implementing the GSS tenets.
Training and resources
We plan to provide training and resources to ensure that agency personnel understand and adhere to the tenets of GSS. Below are current plans for training and resources, which may evolve as we understand staff needs related to GSS.
GSA has established a mandatory data literacy training course that all employees complete annually. This training covers essential concepts including data quality, data lineage, metadata management, understanding the GSA data governance structure, the role of the GSA Chief Data Officer and other data-centric roles, and the data pipeline (collect, clean, analyze, disseminate). In addition, the course covers the basics of artificial intelligence (AI). The training directly supports several GSS principles by building foundational knowledge that enhances transparency through proper documentation, promotes the ability to repeat studies by teaching employees to understand and trace where data comes from, and strengthens the communication of limitations and uncertainties by providing employees with foundational knowledge of data quality concepts. GSA will review the content in more depth and add to this training any additional information needed to ensure agency staff understand the GSS tenets and adhere to them when involved with scientific activities. GSA is also exploring opportunities to integrate GSS principles directly into its broader portfolio of data and analytics training offerings, further reinforcing these scientific standards across the organization.
Furthermore, GSA is dedicated to elevating the role of the data steward, acknowledging its critical function in executing some of the tenets of GSS. To that end, GSA has established an inter-agency working group focused on defining roles and responsibilities for data stewards. GSA is also developing a Data Stewards Community of Practice to provide a platform for data stewards to access training materials, collaborate on best practices, and share expertise in areas such as data cataloging and data quality improvement.
In addition to publicly posting the GSS implementation plan, we will work closely with the small number of offices conducting scientific activities to determine what resources their staff might need to ensure robust adherence to the GSS tenets.
Leveraging technology for implementing GSS
To effectively implement the GSS tenets, GSA is leveraging technology through the development of an enterprise data solution. This centralized platform will provide data scientists and analysts with streamlined access to data, fostering collaboration, open communication, and the sharing of resources. The platform facilitates data-driven decision-making by making data products readily available to business decision-makers. With a focus on reproducibility and auditability, GSA is actively ingesting raw source data and capturing the full lineage of data products, which can include study results. This complete traceability increases the likelihood that studies can be successfully reproduced, yielding consistent results from identical data, methods, and conditions.
In support of transparency, GSA has implemented an enterprise data catalog to curate metadata about GSA datasets and data products. This catalog provides comprehensive metadata about data products, which can include study results, methodologies, data, analytical tools, and findings, enabling rigorous scrutiny, validation, and reuse across GSA. The data catalog can also be used to enhance transparency by communicating any errors or uncertainties associated with study results, including limitations. Furthermore, GSA routinely publishes datasets on data.gov, promoting transparency of study results with external agencies and the public by releasing finalized data products that do not contain Controlled Unclassified Information.
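As a rough, hypothetical sketch of the idea, a catalog entry for a data product might capture fields like the following. The field names and values here are illustrative assumptions and do not reflect GSA’s actual catalog schema or tooling.

```python
# Hypothetical sketch of a data-catalog record for a data product.
# Field names and values are illustrative; they do not reflect GSA's actual schema.
from dataclasses import dataclass

@dataclass
class CatalogRecord:
    dataset_id: str            # stable identifier for the data product
    title: str                 # human-readable name
    methodology: str           # plain-language summary of analytical methods
    source_systems: list[str]  # lineage: where the raw inputs originated
    known_limitations: str     # disclosed sources of error and uncertainty
    contains_cui: bool         # whether the product contains Controlled Unclassified Information

record = CatalogRecord(
    dataset_id="example-0001",
    title="Illustrative evaluation result",
    methodology="Pre-specified regression analysis; see the posted analysis plan.",
    source_systems=["source_system_a", "source_system_b"],
    known_limitations="Administrative data may lag by one reporting quarter.",
    contains_cui=False,
)
print(record)
```

Capturing methodology, lineage, and known limitations alongside the data itself is what allows a catalog to support the transparency and error-communication tenets described above.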
Challenges with GSS implementation
We do not anticipate major challenges with GSS implementation. As previously described, the small number of GSA staff involved with scientific activities already adhere to key scientific policies and procedures aligned with the GSS tenets. There may be some minor challenges in tracking and reporting annually on GSS metrics, given that GSA has a reduced number of staff involved with scientific activities and no dedicated science office or system for collecting this information.