
IU researchers determine which statistical tests support ethical conduct in the care and use of animals in research

IU researchers from the School of Public Health conduct research funded by the National Institute on Aging to determine which statistical test yields results with proper (and small) error rates while using the fewest animal subjects.

May 11, 2022

According to the American Psychological Association’s (APA’s) guidelines for ethical conduct in the care and use of nonhuman animals in research, psychologists should “use procedures that minimize the number of nonhuman animals in research.” At the same time, Emily Sena, a researcher at the University of Edinburgh who leads the Collaborative Approach to Meta-Analysis and Review of Animal Data in Experimental Studies (CAMARADES), says, “I don’t think it’s ethical to do an experiment with five animals in each group when that’s [underpowered]. That’s a nuanced message that hasn’t been easy to get across. I have a few folks who misinterpret my stance. They think it’s, ‘Don’t do animal studies.’ I’m not saying that at all. I’m saying do them properly, and you probably need to do bigger ones.” With this background in mind, Drs. David Allison, Andrew Brown, and Keisuke Ejima conducted research funded by the National Institute on Aging to determine which statistical test would yield results with proper (and small) error rates using the fewest animal subjects.
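To make “underpowered” concrete, here is a small illustrative calculation (our own sketch, using an assumed standardized effect size of 0.8 and an alpha of 0.05, not figures from the study) of the power of a two-sample t-test with only five animals per group, and of roughly how many animals per group would be needed to reach the conventional 80% power target:

from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Power of a two-sided, two-sample t-test with 5 animals per group,
# assuming a standardized effect size (Cohen's d) of 0.8 and alpha = 0.05.
power_n5 = analysis.power(effect_size=0.8, nobs1=5, alpha=0.05, ratio=1.0)
print(f"Power with 5 animals per group: {power_n5:.2f}")  # roughly 0.2

# Animals per group needed to reach the conventional 80% power target.
n_needed = analysis.solve_power(effect_size=0.8, power=0.8, alpha=0.05, ratio=1.0)
print(f"Animals per group for 80% power: {n_needed:.0f}")  # roughly 26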

Dr. David Allison speaking to students from the IU School of Public Health, Bloomington, Indiana.

Dr. Allison is Dean of the School of Public Health at Indiana University Bloomington, Professor of Epidemiology and Biostatistics, and an obesity and statistics researcher. Dr. Brown is an Assistant Professor with the Department of Applied Health Science in the School of Public Health. Much of his work focuses on the interface between nutrition and obesity as well as issues of rigor, reproducibility, and transparent science. Dr. Ejima is an Assistant Research Scientist in the School of Public Health. His background is in mathematical engineering, and he uses his skills to solve issues in public health, including infectious diseases, obesity, and nutrition.

Dr. Keisuke Ejima, Assistant Research Scientist, School of Public Health, Indiana University Bloomington

It [only] took approximately two days to run all of the simulations with Karst. It was also really helpful to be able to use my own computer while the simulations were running on Karst.

Dr. Keisuke Ejima

In an effort to determine the best statistical tests to use in different circumstances, Drs. Allison, Brown, and Ejima simulated a vast number of experiments using different statistical tests and then compared the results to find out which test had the best performance characteristics with small numbers of mice. Conducting so many simulations required substantial computing power. Dr. Ejima remarks, “At first I tried to run the simulations on my desktop computer in my office, but the number of simulations was too large and it was easily going to take over a month to process them. That’s when I realized that it wasn’t realistic to use my personal computer to run the simulations, so I decided to use Karst.” Dr. Ejima continues, “It took approximately two days to run all of the simulations with Karst. It was also really helpful to be able to use my own computer while the simulations were running on Karst.”
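For readers curious what such a simulation looks like in miniature, the sketch below is a simplified stand-in for the authors’ work (their simulations drew on real mouse data, and the tests shown here are common choices rather than necessarily the exact set they compared). It draws two small groups from the same skewed distribution, so the null hypothesis of no difference is true by construction, and tallies how often each test rejects at alpha = 0.05; that rejection rate estimates each test’s type I error.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_per_group, n_sims, alpha = 5, 20_000, 0.05
rejections = {"Student t": 0, "Welch t": 0, "Wilcoxon rank-sum": 0}

for _ in range(n_sims):
    # Both groups come from the same skewed (lognormal) distribution,
    # so any rejection at alpha = 0.05 is a false positive.
    a = rng.lognormal(mean=0.0, sigma=1.0, size=n_per_group)
    b = rng.lognormal(mean=0.0, sigma=1.0, size=n_per_group)
    rejections["Student t"] += stats.ttest_ind(a, b, equal_var=True).pvalue < alpha
    rejections["Welch t"] += stats.ttest_ind(a, b, equal_var=False).pvalue < alpha
    rejections["Wilcoxon rank-sum"] += stats.mannwhitneyu(a, b, alternative="two-sided").pvalue < alpha

for test, count in rejections.items():
    print(f"{test}: estimated type I error = {count / n_sims:.3f}")

Any rejection rate noticeably above 0.05 indicates type I error inflation for that test at this sample size.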

Dr. Andrew Brown, Assistant Professor, Department of Applied Health Science, School of Public Health, Indiana University Bloomington

“To add to that,” says Dr. Brown, “it’s not uncommon to have a mistake in a calculation. Imagine if we made a mistake in a calculation and it took a month to run the simulation on Dr. Ejima’s computer. Once we realized something was wrong with the calculation, we would have to run the simulation for another month to fix the issue. That’s one of the huge benefits of using supercomputers. They enable us to do the calculations more than once in a reasonable amount of time to make sure we’re getting the right answers.” Indiana University’s Karst research supercomputer was retired from service on December 18, 2020; Dr. Ejima has since been using IU’s Carbonate cluster for his work.

This type of research requires true collaboration among researchers. The simulations were based on two different datasets from real mice so that they would better reflect how real-world data would behave with these statistical tests. For one paper, the data were obtained from the Mouse Phenome Database at The Jackson Laboratory. For the other, the data came from Dr. de Cabo’s research at the National Institute on Aging. Dr. Allison muses, “Dr. de Cabo is most familiar with mouse experiments, but not the computer. I am familiar with genetic statistics, but not the computer. Dr. Ejima can run the computer, but is less familiar with mouse experiments or genetic statistics. In order to increase the level of sophistication of the research, we can’t work alone. We have a true collaboration of people working together to figure out how to provide meaningful information for what real investigators need.”


As for the results of their investigation, according to their paper, they observed, “… type I error inflation for all tests, except the bootstrap test, with small samples (≤ 5). Type I error inflation decreased as sample size increased (≥ 8), but remained. Due to these discoveries, the bootstrap test is recommended for small sample sizes to avoid type I error inflation, but this benefit comes at the cost of lower power. When the sample size is large enough, the Welch’s t test is recommended because of high power with minimal type I error inflation.” Drs. Allison, Brown, and Ejima hope this information will help future researchers use as few animals as possible while still obtaining meaningful statistical inferences.
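As a rough illustration of the two recommended options, the sketch below (our own simplified version; the paper’s exact bootstrap procedure may differ) computes a two-sided bootstrap p-value for a difference in means by centering each group on its own mean to enforce the null hypothesis before resampling, and compares it with the ordinary Welch’s t-test p-value on hypothetical body-weight data:

import numpy as np
from scipy import stats

def bootstrap_welch_test(a, b, n_boot=10_000, seed=0):
    """Two-sided bootstrap test of equal means based on the Welch t-statistic."""
    rng = np.random.default_rng(seed)
    t_obs = stats.ttest_ind(a, b, equal_var=False).statistic
    # Center each group on its own mean so the null hypothesis holds, then
    # resample with replacement to build the statistic's null distribution.
    a0, b0 = a - a.mean(), b - b.mean()
    t_boot = np.empty(n_boot)
    for i in range(n_boot):
        ra = rng.choice(a0, size=a0.size, replace=True)
        rb = rng.choice(b0, size=b0.size, replace=True)
        t_boot[i] = stats.ttest_ind(ra, rb, equal_var=False).statistic
    return float(np.mean(np.abs(t_boot) >= abs(t_obs)))

# Hypothetical body weights (grams) for two small groups of mice.
group_a = np.array([27.1, 31.4, 25.9, 29.3, 33.0])
group_b = np.array([30.2, 35.8, 28.7, 36.1, 32.5])
print("Bootstrap p-value:", bootstrap_welch_test(group_a, group_b))
print("Welch's t-test p-value:", stats.ttest_ind(group_a, group_b, equal_var=False).pvalue)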

Screenshot of the IU Supercomputing Pathfinder, found at https://hpceverywhere.iu.edu

The UITS Research Technologies department provides a service called the Supercomputing Pathfinder to help researchers determine which high performance computing (HPC) resources will best fit their needs. Check out the service today to find the HPC resource that will work best for you.
