Has the Olympics changed how it measures false starts in track? A Q&A with a biomechanics expert who has researched reaction times

In 2011, James Ashton-Miller, a University of Michigan engineer, helped reveal that Olympic starting-line technology created a different experience for male and female sprinters: it did not accurately detect false starts by women. His latest work provides insights into what may, or may not, have happened since.

Ashton-Miller is the Albert Schultz Collegiate Research Professor & Distinguished Research Scientist in mechanical engineering and biomedical engineering.

A paper you published in 2011 analyzed data compiled during the 2008 Summer Olympics in Beijing from track and field events. Those data indicated that female sprinters had slower reaction times than their male counterparts. What troubled you about your own findings in that paper?


Ashton-Miller: It happens that, as part of a previous study on the biomechanics of falls, our team measured lower extremity reaction times in a lab. Those data unequivocally showed that for small forces, women reacted faster than men. But if a much larger force was required to register a reaction, then it took women longer than men to reach that threshold.

Now, both plates on the starting blocks at the Olympics are equipped with a force sensor. When the athlete adopts their starting position, one foot goes against each plate and, in order to accelerate forward, the athlete has to push off against these instrumented blocks. If the blocks register a certain level of force less than 100 milliseconds after the starting gun fires, it is deemed a false start.
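Conceptually, the detection logic reduces to checking whether block force crosses a threshold within 100 milliseconds of the gun. A minimal Python sketch follows; Swiss Timing's actual algorithm is proprietary, and the signal format and threshold here are assumptions for illustration:

```python
# Minimal sketch of threshold-based false-start detection.
# NOT Swiss Timing's proprietary algorithm; the sampling format
# and threshold value are assumptions for illustration.

FALSE_START_LIMIT_S = 0.100  # force must not register within 100 ms of the gun

def time_to_threshold(force_samples, sample_rate_hz, threshold_n):
    """Time (in seconds after the gun) at which block force first
    reaches the threshold, or None if it never does."""
    for i, force in enumerate(force_samples):
        if force >= threshold_n:
            return i / sample_rate_hz
    return None

def is_false_start(force_samples, sample_rate_hz, threshold_n):
    """A start is false if the threshold is crossed in under 100 ms,
    i.e. faster than human reaction to the gun allows."""
    t = time_to_threshold(force_samples, sample_rate_hz, threshold_n)
    return t is not None and t < FALSE_START_LIMIT_S
```

Note that the measured "reaction time" here is really a time-to-threshold: the same push-off registers later when the threshold is set higher, which is the crux of the questions that follow.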

Our analysis showing slower reaction times by women in Beijing surprised us, but initially we accepted the finding. Eventually, based on our lab study, we speculated that Omega sister company Swiss Timing (the official Olympic timekeeper) must have used the same force level on the starting blocks for men and women in determining whether a false start had taken place. That would mean that the sex difference in reaction time was likely an artifact.

Once you realized what might be causing the difference in women versus men’s reaction times, what step did you take?


Ashton-Miller: Prior to publishing that article, we reached out to Omega to ask what force threshold they used on their starting blocks and whether it was the same for men and women. They replied that the information was proprietary.

Then, using results from our earlier lab study, which measured lower extremity reactions, we calculated that reducing the force threshold for women at the Olympics by 21 percent should make the reaction gap disappear. We published that suggestion in the 2011 paper hoping that it might spur Omega to make the competition more equitable for women, since with the current system a woman could false start by approximately 20 milliseconds and not be caught. That would obviously be unfair to other women in the race.
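As a toy illustration of why a shared threshold matters, suppose that after the true reaction onset, block force ramps up linearly at a sex-specific rate of force development. All numbers below are hypothetical, chosen only to show the mechanism; they are not values from the paper:

```python
# Toy model: after the true reaction onset, block force ramps up
# linearly at a sex-specific rate of force development (RFD).
# All values are hypothetical, chosen only to show the mechanism.

def measured_rt(true_rt_s, rfd_n_per_s, threshold_n):
    # Measured reaction time = true onset + time to ramp up to threshold.
    return true_rt_s + threshold_n / rfd_n_per_s

THRESHOLD_N = 120.0   # hypothetical shared force threshold
TRUE_RT_S = 0.120     # identical true reaction onset for both sexes

rt_men = measured_rt(TRUE_RT_S, rfd_n_per_s=6000.0, threshold_n=THRESHOLD_N)
rt_women = measured_rt(TRUE_RT_S, rfd_n_per_s=4740.0, threshold_n=THRESHOLD_N)
print(f"men:   {rt_men * 1000:.0f} ms")    # 140 ms
print(f"women: {rt_women * 1000:.0f} ms")  # 145 ms -> apparent sex gap

# Lowering the women's threshold by 21 percent removes the artifact:
rt_women_adj = measured_rt(TRUE_RT_S, 4740.0, THRESHOLD_N * 0.79)
print(f"women, lower threshold: {rt_women_adj * 1000:.0f} ms")  # 140 ms
```

In this toy model both sexes react at exactly the same instant, yet the shared threshold reports women as slower; the same logic applied to real force-time curves is what produces an exploitable window on the order of the 20 milliseconds described above.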

So what questions did your team tackle next?


Ashton-Miller: Once we knew about this bias in measuring women's reaction times, we wanted to see if the results from the 2008 Olympics were consistent with other Olympic Games before and after.

We also wanted to see if Swiss Timing had continued to use the same reaction time calculation in subsequent Olympic Games. So I asked graduate student Payam Mirshams Shahshahani to compare reaction data from the '04, '08, '12 and '16 Summer Games using a statistical approach that could take advantage of repeated measures across Games.
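The paper's exact statistical model is not reproduced here, but a repeated-measures comparison of this kind could be set up as a linear mixed-effects model; the data layout and column names below are assumptions for illustration:

```python
# Hedged sketch of a repeated-measures comparison across Games using
# a linear mixed-effects model. The data layout (one row per start,
# with columns 'rt_ms', 'sex', 'games', 'athlete') is an assumption,
# not the paper's actual dataset or model specification.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("olympic_reaction_times.csv")  # hypothetical file

# Fixed effects for sex, Games, and their interaction; a random
# intercept per athlete accounts for sprinters who appear repeatedly.
model = smf.mixedlm("rt_ms ~ C(sex) * C(games)", data=df,
                    groups=df["athlete"])
result = model.fit()
print(result.summary())

# A significant sex-by-Games interaction would indicate the sex
# difference changed between Games -- e.g., vanishing after 2008.
```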

What did you find?


Ashton-Miller: Our findings showed the same difference, with women showing slower reaction times than men, in 2004 and 2008. But in 2012, a year after our first paper appeared, the difference between the sexes disappeared. Women's reaction times had become significantly faster, while men's times remained consistent.

In our most recent paper, we argue that these Olympic reaction times have not become faster due to improved training but instead likely reflect changes in the computer algorithm used by Swiss Timing.

We conclude that it is important to set the force threshold for women lower than for men in order to have a fair women's competition. If a female sprinter knew that the men's threshold was being used, she could intentionally false start by up to 20 milliseconds without being flagged by the electronic timing system.

What ramifications does your analysis have outside the world of track and field?


Ashton-Miller: Acknowledging that, depending on the force involved, there can be a sex difference in reaction time is also important for designing interfaces by which humans interact with machines: women should not be disadvantaged in implementing time-critical decisions because of the need for high force thresholds. This may be particularly important for the elderly, whose longer reaction times mean they have less time to mount a response in time-critical tasks.

The new study was done in collaboration with Payam Mirshams Shahshahani, a doctoral student in mechanical engineering; David Lipps, a professor of kinesiology and biomedical engineering; and Andrzej Galecki, a research professor in gerontology and biostatistics. It is published in the journal PLOS ONE, titled "On the apparent decrease in Olympic sprinter reaction times."

Credits:

  • By James Lynch, Research News & Feature Writer, (734) 763-1652
  • Original article: https://news.engin.umich.edu/2018/08/olympics-measures-false-starts-in-track/

$3.46M to Combine Supercomputer Simulations with Big Data

A new way of computing could lead to immediate advances in aerodynamics, climate science, cosmology, materials science and cardiovascular research. The National Science Foundation is providing $2.42 million to develop a unique facility for refining complex, physics-based computer models with big data techniques at the University of Michigan, with the university providing an additional $1.04 million.

The focal point of the project will be a new computing resource, called ConFlux, which is designed to enable supercomputer simulations to interface with large datasets while running. This capability will close a gap in the U.S. research computing infrastructure and place U-M at the forefront of the emerging field of data-driven physics. The new Center for Data-Driven Computational Physics will build and manage ConFlux.

Turbulence simulations for a vortex such as a tornado, a galaxy, or the swirls that form at the tips of airplane wings. Courtesy of Karthik Duraisamy, Aerospace Engineering.

The project will add supercomputing nodes designed specifically to enable data-intensive operations. The nodes will be equipped with next-generation central and graphics processing units, large memories and ultra-fast interconnects.

A three-petabyte hard drive will seamlessly handle both traditional and big data storage. Advanced Research Computing – Technology Services at the University of Michigan provided critical support in defining the technical requirements of ConFlux. The project exemplifies the objectives of President Obama's new National Strategic Computing Initiative, which has called for the use of vast data sets in addition to increasing brute-force computing power.

The common challenge among the five main studies in the grant is a matter of scale. The processes of interest can be traced back to the behaviors of atoms and molecules, billions of times smaller than the human-scale or larger phenomena that researchers want to understand.

Even the most powerful computer in the world cannot handle these calculations without resorting to approximations, said Karthik Duraisamy, an assistant professor of aerospace engineering and director of the new center. “Such a disparity of scales exists in many problems of interest to scientists and engineers,” he said.

But approximate models often aren’t accurate enough to answer many important questions in science, engineering and medicine. “We need to leverage the availability of past and present data to refine and improve existing models,” Duraisamy explained.

Data from hospital scans, when fed into a computer model of blood flow, can become a powerful predictor of cardiovascular disease. Courtesy of Alberto Figueroa, Biomedical Engineering.

This data could come from accurate simulations with a limited scope, small enough to be practical on existing supercomputers, as well as from experiments and measurements. The new computing nodes will be optimized for operations such as feeding data from the hard drive into algorithms that use the data to make predictions, a technique known as machine learning.
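As a schematic of that pattern (not ConFlux's actual software stack), a cheap approximate model can be augmented with a correction term learned from a limited set of high-fidelity results:

```python
# Schematic of the data-driven pattern ConFlux is built to support:
# a cheap approximate model augmented with a correction learned from
# a limited set of high-fidelity results. The models and data here
# are synthetic stand-ins, not any of the grant's actual codes.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
x_train = rng.uniform(0.0, 1.0, size=(500, 1))  # synthetic flow conditions

def cheap_model(x):
    # Fast but biased approximation (think: a simple turbulence closure).
    return np.sin(3.0 * x[:, 0])

def high_fidelity(x):
    # Expensive "truth", affordable only for a limited training set.
    return np.sin(3.0 * x[:, 0]) + 0.3 * x[:, 0] ** 2

# Learn the discrepancy between approximation and truth...
residual = high_fidelity(x_train) - cheap_model(x_train)
correction = RandomForestRegressor(n_estimators=100, random_state=0)
correction.fit(x_train, residual)

# ...then predict cheaply everywhere as model + learned correction.
x_new = rng.uniform(0.0, 1.0, size=(5, 1))
improved = cheap_model(x_new) + correction.predict(x_new)
```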

“Big data is typically associated with web analytics, social networks and online advertising. ConFlux will be a unique facility specifically designed for physical modeling using massive volumes of data,” said Barzan Mozafari, an assistant professor of computer science and engineering, who will oversee the implementation of the new computing technology.

The faculty members spearheading this project come from departments across the University, but all are members of the Michigan Institute for Computational Discovery and Engineering (MICDE), which was launched in 2013.

“MICDE is the home at U-M of the so-called third pillar of scientific discovery, computational science, which has taken its place alongside theory and experiment,” said Krishna Garikipati, MICDE’s associate director for research.

The following projects will be the first to utilize the new computing capabilities:

  • Cardiovascular disease. Noninvasive imaging such as MRI and CT scans could enable doctors to deduce the stiffness of a patient’s arteries, a strong predictor of diseases such as hypertension. By combining the scan results with a physical model of blood flow, doctors could have an estimate of arterial stiffness within an hour of the scan (a toy version of this parameter-fitting step is sketched at the end of this article). The study is led by Alberto Figueroa, the Edward B. Diethrich M.D. Research Professor of Biomedical Engineering and Vascular Surgery.
  • Turbulence. When a flow of air or water breaks up into swirls and eddies, the pure physics equations become too complex to solve. But more accurate turbulence simulation would speed up the development of more efficient airplane designs. It will also improve weather forecasting, climate science and other fields that involve the flow of liquids or gases. Duraisamy leads this project.
  • Clouds, rainfall and climate. Clouds play a central role in whether the atmosphere retains or releases heat. Wind, temperature, land use and particulates such as smoke, pollen and air pollution all affect cloud formation and precipitation. Derek Posselt, an associate professor of atmospheric, oceanic and space sciences, and his team plan to use computer models to determine how clouds and precipitation respond to changes in the climate in particular regions and seasons.
  • Dark matter and dark energy. Dark matter and dark energy are estimated to make up about 96 percent of the universe. Galaxies should trace the invisible structure of dark matter that stretches across the universe, but the formation of galaxies plays by additional rules – it’s not as simple as connecting the dots. Simulations of galaxy formation, informed by data from large galaxy-mapping studies, should better represent the roles of dark matter and dark energy in the history of the universe. August Evrard and Christopher Miller, professors of physics and astronomy, lead this study.
  • Material property prediction. Materials scientists would like to be able to predict a material’s properties based on its chemical composition and structure, but supercomputers aren’t powerful enough to scale atom-level interactions up to bulk qualities such as strength, brittleness or chemical stability. An effort led by Garikipati and Vikram Gavini, a professor and an associate professor of mechanical engineering, respectively, will combine existing theories with the help of data on material structure and properties.

“It will enable a fundamentally new description of material behavior—guided by theory, but respectful of the cold facts of the data. Wholly new materials that transcend metals, polymers or ceramics can then be designed with applications ranging from tissue replacement to space travel,” said Garikipati, who is also a professor of mathematics.
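Returning to the cardiovascular item above, calibrating a quantity such as arterial stiffness against scan-derived measurements is at heart an inverse problem. The sketch below fits a single hypothetical stiffness parameter by least squares; the one-line forward model is a stand-in, not an actual blood-flow solver:

```python
# Toy inverse problem in the spirit of the cardiovascular project:
# tune a stiffness parameter so a forward model's predictions match
# measurements derived from imaging. The one-line forward model is a
# made-up stand-in, not an actual blood-flow solver.
import numpy as np
from scipy.optimize import least_squares

pressures = np.array([80.0, 100.0, 120.0, 140.0])  # mmHg (synthetic)
measured = np.array([1.05, 1.28, 1.55, 1.76])      # scan-derived (synthetic)

def forward_model(stiffness, p):
    # Hypothetical: predicted vessel response at each pressure.
    return p / stiffness

def residuals(params):
    return forward_model(params[0], pressures) - measured

fit = least_squares(residuals, x0=[50.0])
print(f"estimated stiffness parameter: {fit.x[0]:.1f}")
```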