The Royal Society, the UK’s national academy of sciences, has warned that the use of AI in science is likely to make significant results harder to reproduce.
In its report titled “Science in the age of AI,” the centuries-old organization argues that the incorporation of AI into scientific research has hindered reproducibility (the principle that a specific result can be repeated by a different research team in a different location) by limiting access to necessary computing infrastructure and resources, by making it difficult to understand how AI tools arrive at their conclusions, and through inadequate documentation.
The tech industry has actively promoted the notion that AI can accelerate research. A breakthrough reported in December of last year, for instance, was held up as evidence that large language models such as ChatGPT may advance science faster than people alone.
Professor Alison Noble, who chairs the Royal Society’s Science in the Age of AI Working Group, is concerned that the technology’s rapid adoption in science has raised issues around its rigorous and safe application.
“A growing body of irreproducible studies is raising concerns regarding the robustness of AI-based discoveries,” she said.
Because many AI technologies are proprietary, the ability to reproduce AI-assisted findings is limited.
“Barriers such as insufficient documentation, limited access to essential infrastructures (eg code, data, and computing power) and a lack of understanding of how AI tools reach their conclusions (explainability) make it difficult for independent researchers to scrutinise, verify and replicate experiments,” the paper states.
The report cautions that these barriers may result in “inflated expectations, exaggerated claims of accuracy, or research outputs based on spurious correlations.”
“In the case of AI-based research, being able to reproduce a study not only involves replicating the method, but also being able to reproduce the code, data, and environmental conditions under which the experiment was conducted (eg computing, hardware, software).”
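To make that concrete, here is a minimal Python sketch, not taken from the report, of how a team might record such conditions alongside its results; the recorded fields and the output file name are assumptions for illustration.

    import json
    import platform
    import sys

    import numpy as np
    import sklearn

    # Capture the software environment alongside experimental results so that
    # another team can recreate the conditions under which a model was trained.
    # The fields and the file name "environment.json" are illustrative choices.
    environment = {
        "python": sys.version,
        "platform": platform.platform(),
        "numpy": np.__version__,
        "scikit-learn": sklearn.__version__,
    }

    with open("environment.json", "w") as f:
        json.dump(environment, f, indent=2)

Hardware details such as GPU model and driver versions would typically be logged the same way, since the quote above notes that computing and hardware conditions also matter.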
The Royal Society was established in 1660, and its past presidents include the chemist Humphry Davy and the physicist Ernest Rutherford, who discovered the atomic nucleus.
In its AI report, the Society cautions that reproducibility issues not only jeopardize the quality of individual studies but could also distort future research.
It cites a single research project, a study by the Princeton University Center for Statistics and Machine Learning, which demonstrates how “data leakage,” a major source of errors in machine learning applications, may have affected 294 publications across 17 scientific fields, including high-stakes ones like medicine.
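Data leakage arises when information from the evaluation data inadvertently influences training, inflating reported accuracy. The Python sketch below, an illustrative example rather than code from the Princeton study, shows one common form: fitting a preprocessing step on the full dataset before splitting off a test set.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler

    # Synthetic data purely for illustration.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = (X[:, 0] + rng.normal(scale=0.5, size=200) > 0).astype(int)

    # Leaky: the scaler is fitted on ALL rows, so statistics computed from
    # the future test set contaminate the training features.
    X_leaky = StandardScaler().fit_transform(X)

    # Correct: split first, fit the scaler on the training rows only, then
    # apply that same transformation to the held-out rows.
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    scaler = StandardScaler().fit(X_train)
    model = LogisticRegression().fit(scaler.transform(X_train), y_train)
    print("held-out accuracy:", model.score(scaler.transform(X_test), y_test))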
Commercially developed models may exacerbate the issue. The majority of leading LLMs, for example, are built by major tech firms such as Microsoft, Google, Meta, and OpenAI, and because they are proprietary, they disclose only limited information about their training data, model architecture, and decision-making procedures, details that could improve understanding.
To tackle these obstacles, scientists must embrace open research principles, such as the UNESCO Recommendation on Open Science. Grand challenges like the ML Reproducibility Challenge, which asks participants to replicate papers from 11 prestigious machine learning conferences, may also be helpful, according to the report.
In August of last year, academics warned that low-quality data posed additional challenges for AI-based research, and that the random, or stochastic, techniques used to train deep learning models led to problems reproducing AI-assisted results.
Standardized practices and experimental design, according to the Stanford computer science team, can mitigate such problems.
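One concrete piece of such standardization is pinning every source of randomness before training. The sketch below assumes a PyTorch workflow, which the Stanford team did not specify; note that seeding alone does not guarantee bit-identical results across different hardware.

    import random

    import numpy as np
    import torch

    def set_seed(seed: int = 42) -> None:
        """Pin every pseudo-random source a typical deep learning run touches."""
        random.seed(seed)
        np.random.seed(seed)
        torch.manual_seed(seed)
        torch.cuda.manual_seed_all(seed)
        # Ask cuDNN for deterministic kernels; this trades speed for repeatability.
        torch.backends.cudnn.deterministic = True
        torch.backends.cudnn.benchmark = False

    set_seed(42)  # call once, before any data loading or model construction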
The Royal Society’s report also states that “open source initiatives that release open models, datasets, and education programs are another direction towards improving reproducibility.”