Closest look yet at killer T-cell activity could yield new approach to tackling antibiotic resistance
An in-depth look at the work of T-cells, the body's bacteria killers, could provide a roadmap to effective drug treatments.

In a study that could provide a roadmap for combating the rising threat of drug-resistant pathogens, researchers have discovered the specific mechanism the body’s T-cells use to kill bacteria.

University of Michigan researchers, in collaboration with colleagues at Harvard University, have discovered a key difference between the way immune cells attack bacteria and the way antibiotics do. Where drugs typically attack a single process within bacteria, T-cells attack a host of processes at the same time.

On Thursday, the journal Cell published findings from a team headed by U-M’s Sriram Chandrasekaran and Harvard’s Judy Lieberman. It’s a study with potential implications for drug-resistant pathogens—a problem projected to kill as many as 10 million people annually across the globe by the year 2050.

“We have a huge crisis of antibiotic resistance right now in that most drugs that treat diseases like tuberculosis or listeria, or pathogens like E. coli, are not effective,” said Chandrasekaran, an assistant professor of biomedical engineering. “So there is a huge need for figuring out how the immune system does its work. We hope to design a drug that goes after bacteria in a similar way.”

“We’ve reached a point where we take what antibiotics can do for granted, and we can’t do that anymore.” Sriram Chandrasekaran

Killer T-cells, formally known as cytotoxic lymphocytes, attack infected cells by producing the enzyme granzyme B. How this enzyme triggers death in bacteria has not been well understood, Chandrasekaran said.

Proteomics (a technique that measures protein levels in a cell) and computer modeling allowed the researchers to see granzyme B’s multi-pronged attack on multiple bacterial processes.

Chandrasekaran and his team monitored how T-cells deal with three different threats: E. coli, listeria and tuberculosis.

“When exposed to granzyme B, the bacteria were unable to develop resistance to the multi-pronged attack, even after exposure over multiple generations,” Chandrasekaran said. “This enzyme breaks down multiple proteins that are essential for the bacteria to survive.

“It’s essentially killing several birds with one stone.”
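The logic behind that multi-pronged attack can be illustrated with a simple back-of-the-envelope calculation. This is not the study's model (the paper pairs proteomics with metabolic modeling); it is only a sketch of why a pathogen that must mutate several essential targets at once is vastly less likely to evolve resistance than one that needs a single mutation. The mutation rate, population size and generation count below are assumed purely for illustration.

```python
# Illustrative only: why an attack on several essential processes at once is
# far harder to escape than a single-target drug. All numbers are assumptions.
import math

MUTATION_RATE = 1e-8   # assumed chance a given target mutates per cell per generation
POPULATION = 1e9       # assumed number of bacterial cells under attack
GENERATIONS = 200      # assumed generations of exposure

def p_resistance(n_targets: int) -> float:
    """Chance that at least one cell escapes, assuming escape requires an
    independent resistance mutation at every one of n_targets processes."""
    p_cell = MUTATION_RATE ** n_targets            # one cell, one generation
    cell_generations = POPULATION * GENERATIONS
    # 1 - (1 - p_cell)^cell_generations, computed in a numerically safe way
    return -math.expm1(cell_generations * math.log1p(-p_cell))

for n in (1, 2, 5):
    print(f"{n} target(s): chance resistance arises ~ {p_resistance(n):.2g}")
```

With these assumed numbers, a single-target drug is essentially guaranteed to meet resistance in a large population, while an attack on five essential targets at once is, for practical purposes, impossible to escape by mutation alone.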

The possible applications of the new T-cell findings run the gamut from creating new medications to repurposing previously approved drugs in combinations that mimic granzyme B’s attack on infections.

Chandrasekaran’s team is now looking at how bacteria hide to avoid T-cell attacks.

And the need for a new approach in some form is dire. World Health Organization officials describe antibiotic resistance as “one of the biggest threats to global health, food security and development today.”

Sriram Chandrasekaran, Assistant Professor of Biomedical Engineering, shows a computer model of a pathway for a potential disease or infection. Photo: Joseph Xu

Each year, an estimated 700,000 deaths are linked to antibiotic-resistant bacteria, according to the World Health Organization. Projections show that number skyrocketing to 10 million by 2050.

England’s top health official, Sally Davies, recently said the lost effectiveness of antibiotics would mean “the end of modern medicine.”

“We really are facing—if we don’t take action now—a dreadful post-antibiotic apocalypse,” she was quoted saying earlier this month. “I don’t want to say to my children that I didn’t do my best to protect them and their children.”

Of particular concern is the fact that there are few new antibiotics in the pipeline. The heyday of new antibiotics ran from the 1940s through the 1960s, with new releases grinding almost to a halt by the end of the twentieth century.

“We’ve reached a point where we take what antibiotics can do for granted, and we can’t do that anymore,” Chandrasekaran said. “We’re taking inspiration from the human immune system, which has been fighting infections for thousands of years.”

The paper is titled, “Granzyme B disrupts central metabolism and protein synthesis in bacteria to promote an immune cell death program.” The research is funded by the National Institutes of Health, Harvard University and the University of Michigan.

Understanding pediatric pulmonary hypertension
Creating new imaging and modeling tools to improve diagnosis and management

Image caption: Multi-scale modeling framework of the cardiopulmonary system. Credit: Figueroa et al.

by Kim Roth

Pulmonary hypertension (PH), a lung disorder that causes high blood pressure in the pulmonary arteries, affects an estimated 15 million to 50 million individuals worldwide. Its progressive nature, impact on quality of life, and life-threatening long-term consequences make it an important focus of basic scientific and translational research.

“Pulmonary hypertension is a relatively rare disease, but the incidence is likely underestimated, since definitive diagnosis currently requires an invasive heart catheterization,” says C. Alberto Figueroa, the Edward B. Diethrich M.D. Associate Professor of Biomedical Engineering and Vascular Surgery.

In addition, non-invasive diagnostic tests, and those used to assess severity, can be highly subjective. Existing treatments mainly target symptoms rather than the underlying cause, which can also be hard to identify. Over time, PH can lead to heart failure; in many cases, patients require a heart or lung transplant.

Particularly in children, diagnosing and treating PH poses unique challenges. Their smaller size and faster heart rate make imaging more difficult than in adult patients.

With U-M colleague Adam Dorfman, MD, associate professor of pediatric cardiology, and colleagues at Michigan State University and Nationwide Children’s Hospital, Figueroa is developing a comprehensive multiscale model of the cardiopulmonary system in pediatric PH.

Using data from MRI and heart catheterization studies in 25 patients – 20 with PH and five cardiac transplant controls – computational models will integrate clinical information, including vessel stiffness and geometry and heart structure and function. The result will be high-resolution simulations of both blood flow dynamics and tissue mechanics of the entire cardiopulmonary system.
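The team's framework couples full 3-D models to the rest of the circulation, but the basic link between vessel stiffness, resistance and pressure can be sketched with a classic lumped-parameter (three-element Windkessel) model. The sketch below only illustrates that relationship; the parameter values, flow waveform and the healthy-versus-PH comparison are assumptions, not data from the study.

```python
# A minimal sketch, not the study's model: a three-element Windkessel with
# proximal resistance R1, compliance C and distal resistance R2, driven by an
# assumed pulsatile flow waveform. Stiffer, more resistant pulmonary vessels
# (lower C, higher R2) raise mean pulmonary artery pressure.
import math

def mean_pa_pressure(R1, R2, C, heart_rate=120, mean_flow=50.0, beats=20, dt=1e-4):
    """Mean inlet pressure (mmHg) for flow in mL/s, resistance in mmHg*s/mL,
    compliance in mL/mmHg. All default values are illustrative guesses."""
    period = 60.0 / heart_rate
    systole = 0.35 * period
    peak_q = mean_flow * (period / systole) * (math.pi / 2)   # half-sine with the right mean
    pc, t, p_sum, n = 10.0, 0.0, 0.0, 0
    while t < beats * period:
        phase = t % period
        q = peak_q * math.sin(math.pi * phase / systole) if phase < systole else 0.0
        pc += dt * (q - pc / R2) / C        # C * dPc/dt = Q - Pc/R2
        p_sum += pc + R1 * q                # inlet pressure P = Pc + R1*Q
        n += 1
        t += dt
    return p_sum / n

print("healthy-like parameters:", round(mean_pa_pressure(R1=0.03, R2=0.27, C=2.0), 1), "mmHg")
print("PH-like parameters     :", round(mean_pa_pressure(R1=0.05, R2=0.75, C=0.6), 1), "mmHg")
```

In a lumped model like this, mean pressure settles near mean flow times total resistance; the patient-specific models replace these handfuls of assumed constants with geometry, stiffness and flow measured from each child.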

Over the four-year study, the team will investigate well-known mechanistic factors at work in PH.

“We know that PH is characterized by smooth muscle hypertrophy, endothelial dysfunction and deposition of collagen and elastin, which result in biomechanical alterations in the system, such as increased resistance and stiffness. While we know that these mechanistic parameters play a critical role, we don’t yet have a full understanding of how they interact and potentially lead to decompensated right ventricular failure,” says Figueroa. “One of our goals is to identify a series of mechanistic markers – rather than the existing subjective assessment tools – to use for patient stratification.”

The work builds upon Figueroa’s previous research. Prior to joining the U-M faculty in 2014, he developed new algorithms to perform simulations of fluid-structure interactions in cardiovascular models constructed from image data. Thanks to the algorithms, simulation of blood flow and artery dynamics in full-scale models became possible.

The exceptional computational resources within the College of Engineering and the world-class clinical expertise in PH management, in both adult and pediatric populations, make U-M the right place to carry out this latest study, Figueroa says.

Ultimately, the goal is to create new imaging and computational modeling tools to improve diagnosis and management of PH on a patient-specific basis.

“If our effort is successful, we might reduce or eliminate the need for risky and invasive catheterization procedures.” Alberto Figueroa

“If our effort is successful, we might reduce or eliminate the need for risky and invasive catheterization procedures,” says Figueroa. The findings also will be applicable to systemic hypertension, which affects some 36 percent of Americans.

Longer term, Figueroa and Dorfman hope to create a patient-specific computational framework to test the efficacy of new drugs.

“Once we understand the mechanisms better,” says Dorfman, “we can work toward more effective ways of treating pediatric PH. Because, really, at the end of the day, we’re trying to help kids be kids.”

The effort is funded by a $2.4 million U01 grant from the National Institutes of Health, U01HL135842: Image-Based Multi-Scale Modeling Framework of the Cardiopulmonary System: Longitudinal Calibration and Assessment of Therapies in Pediatric Pulmonary Hypertension.


Keeping drugs on the job


LABLOG | PREVENTING DRUGS FROM CRYSTALLIZING
Mandal’s simulation shows how well the polymers slow down the formation of crystals. Polymers are shown in red and green while drug molecules are maize and blue.
Computer simulations developed at the University of Michigan reveal how well drug additives stop the active ingredients from crystallizing in the digestive tract. The researchers tested the simulations on the anti-seizure drug phenytoin.

“Most drugs are hydrophobic, so the mix of water and drugs is not stable,” said Taraknath Mandal, a research fellow in chemical engineering. “They spontaneously form a crystal.”

In the water-based environment of the stomach and small intestine, the active ingredients stick together, and then they don’t make it into the bloodstream. Pharma companies add polymers to drugs to get in the way of crystal growth, but the range of possible designs for these polymers is huge.

The computer simulations enable researchers to try out different polymers, looking for the best performance.

“These tools have never been used before on this problem,” said Ron Larson, the A. H. White Distinguished University Professor of Chemical Engineering and the George Granger Brown Professor of Chemical Engineering, who led the work. He and his team worked closely with researchers at Dow Chemical Company in Midland, Michigan, including WW (Trey) Porter, the team lead on the Dow side.

They tested the simulation with several drugs, focusing on how phenytoin interacted with a cellulose-based polymer – the stuff that makes up the cell walls in plants. They compared how different side groups on the cellulose chain affected how well the polymer kept phenytoin from crystallizing in the simulated gut environment.

The team tested two side-groups head to head – succinyl and acetyl. Both are needed to make a polymer effective in both the acidic environment of the stomach and the more neutral environment of the small intestine.

In the stomach, the succinyl groups have the full complement of hydrogen atoms, so they are not charged. They help the polymer interact with a hydrophobic drug like phenytoin. A charged polymer will be attracted to the water rather than the drug.

But once the polymers hit the small intestine, some hydrogen atoms strike out on their own, leaving the succinyl with a negative charge. Here, the acetyl groups take over, helping to keep the polymer and drug together even as the now-charged succinyl groups are drawn toward the water. Larson and his group developed computer models to find the best balance.
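The pH switch Larson's team exploits can be sketched with the Henderson-Hasselbalch relation, which gives the fraction of acid groups that have lost their hydrogen (and so carry a negative charge) at a given pH. The pKa and pH values below are generic textbook-style assumptions, not numbers from the papers.

```python
# Minimal sketch of the pH-dependent charging of succinyl groups described
# above. The pKa and the stomach / intestine pH values are assumed figures.
def fraction_charged(pH: float, pKa: float = 5.0) -> float:
    """Henderson-Hasselbalch: fraction of acid groups deprotonated (charged)."""
    return 1.0 / (1.0 + 10.0 ** (pKa - pH))

for site, pH in [("stomach", 1.5), ("small intestine", 6.5)]:
    print(f"{site:15s} pH {pH}: {fraction_charged(pH):.1%} of succinyl groups charged")
```

With these assumed values, succinyl groups are essentially uncharged in the stomach and almost fully charged in the small intestine, which is why the acetyl groups have to pick up the slack there.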

Of course, patients don’t gain anything if the drug additives keep the drug from crystallizing but instead trap it in a web of polymers. Or, if the drugs are released too quickly, they can crystallize in the small intestine. Wenjun Huang, a doctoral student in chemical engineering, created a second simulation to test how well drugs leave the polymer webs.

“Our simulation results are the first ones to directly show the role of each individual functional group in the polymer,” said Huang. “We can see the polymer-drug aggregates form, and we can see how drugs are released from the aggregates.”
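One simple way to quantify the aggregates Huang describes, offered here only as a hypothetical sketch of the kind of post-processing coarse-grained simulations allow (the published work uses far more detailed models and analyses), is to cluster drug beads by a distance cutoff and track the size of the largest cluster over time.

```python
# Hypothetical analysis sketch: given bead coordinates from one coarse-grained
# snapshot, group drug beads into aggregates with a simple distance cutoff and
# report the largest aggregate. A steadily growing largest aggregate would
# signal that the drug is clumping toward crystallization.
import numpy as np

def largest_aggregate(coords: np.ndarray, cutoff: float = 1.0) -> int:
    """Size of the biggest cluster, two beads counting as connected when they
    lie within `cutoff` of each other (arbitrary simulation units)."""
    n = len(coords)
    dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    adjacency = dists < cutoff
    seen, best = np.zeros(n, dtype=bool), 0
    for start in range(n):
        if seen[start]:
            continue
        stack, size = [start], 0
        seen[start] = True
        while stack:                              # flood fill one cluster
            i = stack.pop()
            size += 1
            neighbors = np.where(adjacency[i] & ~seen)[0]
            seen[neighbors] = True
            stack.extend(neighbors.tolist())
        best = max(best, size)
    return best

# Demo on random coordinates standing in for a single simulation frame.
rng = np.random.default_rng(1)
frame = rng.uniform(0.0, 10.0, size=(200, 3))
print("largest aggregate in this frame:", largest_aggregate(frame))
```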

Green and red polymers help keep maize and blue drug molecules from crystallizing. Credit: Taraknath Mandal, Larson Lab, University of Michigan.
Again, a balance between the succinyl and acetyl groups is needed to achieve the right pace, and the balance is different for different drugs.

“We are beginning to learn how to determine computationally which modifications to the polymer will be most effective at interacting with a given drug and thereby help in the design of better polymers to enhance drug delivery,” said Larson.

These studies were funded primarily by the Dow Chemical Company. The team also relied on Advanced Research Computing at U-M, funded by the National Science Foundation.

The paper on the model that investigates how well different polymers work to prevent drug crystallization is titled “A framework for multi-scale simulation of crystal growth in the presence of polymers,” and it was published in Soft Matter.

The paper on the computer model that explores whether drugs are released from the web of polymers is titled “Computational Modeling of Hydroxypropyl-Methylcellulose Acetate Succinate (HPMCAS) and Phenytoin Interactions: A Systematic Coarse-Graining Approach,” and it was published in the journal Molecular Pharmaceutics.

Larson is also a professor of macromolecular science and engineering, biomedical engineering, and mechanical engineering, and is a member of the Biointerfaces program.

From: Kate McAlpine, Michigan Engineering

Link: http://www.engin.umich.edu/college/about/news/stories/2017/march/keeping-drugs-on-the-job


‘5-D protein fingerprinting’ could give insights into Alzheimer’s, Parkinson’s

ANN ARBOR—In research that could one day lead to advances against neurodegenerative diseases like Alzheimer’s and Parkinson’s, University of Michigan engineering researchers have demonstrated a technique for precisely measuring the properties of individual protein molecules floating in a liquid.

Proteins are essential to the function of every cell, so measuring their properties in blood and other body fluids could unlock valuable information. The body manufactures proteins in a variety of complex shapes that transmit messages between cells, carry oxygen and perform other important functions.

Sometimes, however, proteins don’t form properly. Scientists believe that some types of these misshapen proteins, called amyloids, can clump together into masses in the brain. The sticky tangles block normal cell function, leading to brain cell degeneration and disease.

But the processes by which amyloids form and clump together are not well understood, in part because there is currently no good way to study them. Researchers say current methods are expensive, time-consuming and difficult to interpret, and can only provide a broad picture of the overall level of amyloids in a patient’s system.

The University of Michigan and University of Fribourg researchers who developed the new technique believe that it could help solve the problem by measuring an individual molecule’s shape, volume, electrical charge, rotation speed and propensity for binding to other molecules.

They call this information a “5-D fingerprint” and believe that it could uncover new information that may one day help doctors track the status of patients with neurodegenerative diseases and possibly even develop new treatments. Their work is detailed in a paper published in Nature Nanotechnology.

“Imagine the challenge of identifying a specific person based only on their height and weight,” said David Sept, a U-M biomedical engineering professor who worked on the project. “That’s essentially the challenge we face with current techniques. Imagine how much easier it would be with additional descriptors like gender, hair color and clothing. That’s the kind of new information 5-D fingerprinting provides, making it much easier to identify specific proteins.”

Michael Mayer, the lead author on the study and a former U-M researcher who’s now a biophysics professor at Switzerland’s Adolphe Merkle Institute, says identifying individual proteins could help doctors keep better tabs on the status of a patient’s disease, and it could also help researchers gain a better understanding of exactly how amyloid proteins are involved with neurodegenerative disease.

This illustration depicts the device used to measure individual proteins. The inset shows proteins (in red) flowing through a nanopore.

To take the detailed measurements, the research team uses a nanopore 10-30 nanometers wide—so small that only one protein molecule can fit through at a time. The researchers filled the nanopore with a salt solution and passed an electric current through the solution.

As a protein molecule tumbles through the nanopore, its movement causes tiny, measurable fluctuations in the electric current. By carefully measuring this current, the researchers can determine the protein’s unique five-dimensional signature and identify it nearly instantaneously.
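In resistive-pulse sensing of this kind, the depth of the current blockade scales with the volume the molecule excludes inside the pore, so a rough volume estimate follows from a simple relation. The sketch below uses one commonly quoted approximation; the pore dimensions, buffer conductivity, voltage and blockade depth are all assumed for illustration and are not the paper's values.

```python
# A minimal sketch of turning a current blockade into a molecular volume using
# a common resistive-pulse approximation (all numbers below are assumptions):
#   delta_I ~ sigma * V * gamma * volume / l_eff**2,  l_eff = l_pore + 0.8 * d_pore
# sigma: electrolyte conductivity, V: applied voltage, gamma: shape factor
# (1.5 for a sphere), volume: the particle's excluded volume.
def excluded_volume(delta_I, sigma=10.0, voltage=0.1, gamma=1.5,
                    l_pore=30e-9, d_pore=20e-9):
    """Estimate excluded volume (m^3) from a blockade delta_I (amperes),
    with conductivity in S/m, voltage in volts, pore dimensions in meters."""
    l_eff = l_pore + 0.8 * d_pore
    return delta_I * l_eff ** 2 / (sigma * voltage * gamma)

# Example: an assumed 100 pA blockade with the assumed pore and buffer above.
print(f"estimated excluded volume ~ {excluded_volume(100e-12) * 1e27:.0f} nm^3")
```

Shape, charge and rotation come from richer features of the same signal, such as how the blockade depth fluctuates as the molecule tumbles, rather than from its average depth alone.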

“Amyloid molecules not only vary widely in size, but they tend to clump together into masses that are even more difficult to study,” Mayer said. “Because it can analyze each particle one by one, this new method gives us a much better window to how amyloids behave inside the body.”

Ultimately, the team aims to develop a device that doctors and researchers could use to quickly measure proteins in a sample of blood or other body fluid. This goal is likely several years off; in the meantime, they are working to improve the technique’s accuracy, honing it in order to get a better approximation of each protein’s shape. They believe that in the future, the technology could also be useful for measuring proteins associated with heart disease and in a variety of other applications as well.

“I think the possibilities are pretty vast,” Sept said. “Antibodies, larger hormones, perhaps pathogens could all be detected. Synthetic nanoparticles could also be easily characterized to see how uniform they are.”

The study is titled “Real-time shape approximation and fingerprinting of single proteins using a nanopore.” Funding for the project was provided by the Miller Faculty Scholar Award, Air Force Office of Scientific Research, National Institutes of Health, National Human Genome Research Institute, a Rackham Pre-Doctoral Fellowship from U-M and the Microfluidics in Biomedical Sciences Training Program from the National Institutes of Health and National Institute of Biomedical Imaging and Bioengineering.



Taking the Guesswork out of Surgical Planning
How BME professor Alberto Figueroa's patient-specific blood flow simulations help clinicians find the ideal surgical path

 
by Aimee Balfe

Alberto Figueroa’s BME lab has achieved an important goal – using computer-generated blood flow simulations to plan complex cardiovascular procedures.

“We’re now using virtual surgical planning in the clinical realm, not as a retrospective theoretical exercise,” says Figueroa.

Using patients’ medical and imaging data, Figueroa can create a model of their unique vasculature and blood flow, then use it to guide surgeons and cardiologists through specific operations and procedures. One type of procedure involves placing grafts in the inferior vena cava to help alleviate complications from pulmonary arteriovenous malformations (PAVMs).

“We’re now using VIRTUAL SURGICAL PLANNING in the clinical realm, NOT AS A RETROSPECTIVE THEORETICAL EXERCISE.” Alberto Figueroa

PAVMs – abnormal connections between a patient’s veins and arteries – are a common complication of a procedure performed early in the lives of children born with only a single functioning ventricle. Called the Fontan procedure, the operation rewires patients’ pulmonary circulation so that the venous return bypasses the heart and is connected directly to the pulmonary arteries for transport to the lungs.

While these surgeries can be lifesavers, the long-term consequences depend heavily on how evenly blood flow is distributed between a patient’s lungs. Patients with ideal hemodynamics do well; those with less-than-perfect flow patterns suffer a string of life-threatening complications, such as low blood oxygen and elevated cardiac output.

Figueroa’s technique can help those with complications by better balancing flow to the lungs.

Simulation & Outcome: A Perfect Match

“What we bring to the table in operations like this is, instead of going in blind, we can simulate multiple different ways of doing the procedure to see if there is an optimal one.”

Figueroa makes use of detailed anatomical data such as CT scans, Doppler data on velocity in various vessels, invasive catheterization data that shows pressures at multiple locations, and perfusion data from nuclear medicine tests. His lab creates hemodynamic models of each patient that match these data points precisely. They then simulate multiple ways of placing a stent graft using U-M's high-performance Flux computing cluster, provided by Advanced Research Computing, to identify the best outcome.
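The clinical simulations solve full 3-D blood flow in each patient's anatomy; the sketch below is only a schematic of the decision step, in which each candidate graft placement is reduced to a pair of assumed effective resistances toward the two lungs and the most evenly balanced option wins. The candidate names and numbers are hypothetical.

```python
# Purely schematic, not the lab's solver: compare hypothetical graft placements
# by how evenly they split hepatic venous flow between the lungs, treating each
# path to a lung as a lumped resistance (made-up values, mmHg*s/mL).
CANDIDATES = {
    "graft angled toward RPA": (0.90, 0.30),   # (R_to_left_lung, R_to_right_lung)
    "graft centered":          (0.55, 0.45),
    "graft angled toward LPA": (0.35, 0.80),
}

def left_lung_fraction(r_left: float, r_right: float) -> float:
    """Parallel-branch split: flow favors the lower-resistance path."""
    return r_right / (r_left + r_right)

best = min(CANDIDATES, key=lambda name: abs(left_lung_fraction(*CANDIDATES[name]) - 0.5))
for name, (r_left, r_right) in CANDIDATES.items():
    share = left_lung_fraction(r_left, r_right)
    note = "  <- most balanced" if name == best else ""
    print(f"{name:26s} left lung receives {share:.0%} of hepatic flow{note}")
```

In the lab's workflow, the flow split for each candidate comes from the patient-specific 3-D simulation rather than from assumed constants, which is what makes the comparison clinically useful.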

“During these procedures, the surgeons use angiograms to illuminate the blood flow,” says Figueroa. “This has shown that the results match what our computer simulation predicted.” (See image.)

“The clinicians were amazed, but we told them we were just solving Newton’s law.”  Alberto Figueroa

 

Before (left) and after (right) images from an angiogram (top) and a surgical simulation (bottom). Note the tight correlation between the simulations and angiograms as well as the significantly more even distribution of hepatic venous flow between the two lungs after a simulation-guided procedure. Credit: Kevin Lau, Alberto Figueroa.

Better Primary Surgeries

In addition to corrective surgeries, these simulation techniques can also allow surgeons to optimize initial procedures like the Fontan so that complications may never happen at all.

Of the tens of thousands of patients undergoing Fontan operations each year, he says, roughly half experience major complications after 10 years. That’s because it’s almost impossible for surgeons to know exactly how to perform the procedure on patients with vessels of various sizes, shapes, and flows.

By accounting for these differences, Figueroa hopes his simulations will show surgeons where in the vasculature to make the surgical connections so that blood flow is ideally balanced between the lungs in each patient. He plans to work with U-M colleagues on patient-specific Fontan planning.

And because his simulations add a layer of insight to any procedure where cardiologists and surgeons find that doing things the same way works in some patients and not others, Figueroa hopes they’ll soon become a ubiquitous precision planning tool, much like imaging is today.

Additional Applications

As promising as it is, surgical planning is only the tip of the iceberg for Figueroa. His lab also works to further develop its simulation software and to use it to understand disease progression, always with an eye toward devising better treatments.

In the software arena, his lab is working on enhancements that will account for dynamic changes in blood flow caused by anything from a change in posture to anesthesia.

One of the lab’s clinical fellows is studying how blood vessels remodel in response to the grafts used in thoracic aneurysm repair. Another is modeling aortic dissection, aiming to discover precisely how the flap that shears from the vessel wall moves, deforms the aorta, and affects blood flow. This understanding is a first step toward designing a device specifically for this condition.

His lab also hosts BME students who are developing tools to better understand blood flow in the brain, clot-development in veins, and the progression of hypertension, including which types of vessels sustain various degrees of damage over time. Figueroa has recently submitted a collaborative grant to explore the progression of pulmonary hypertension, as well.

The breadth and clinical relevance of his work are in many ways why Figueroa came to U-M from King's College London two years ago. Named the Edward B. Diethrich M.D. Research Professor of Biomedical Engineering and Vascular Surgery, Figueroa, who holds a PhD, was drawn by his 50/50 appointment in BME and vascular surgery at an institution where medicine and engineering are deeply integrated.

It’s because of this connection that rapid-response surgical planning is made possible, he says. It’s also given him ready access to talented students from the medical and engineering schools – and to usually hard-to-reach study participants, like aortic dissection patients, to gain critical insight into this and other life-threatening conditions.


$3.46M to Combine Supercomputer Simulations with Big Data

A new way of computing could lead to immediate advances in aerodynamics, climate science, cosmology, materials science and cardiovascular research. The National Science Foundation is providing $2.42 million to develop a unique facility for refining complex, physics-based computer models with big data techniques at the University of Michigan, with the university providing an additional $1.04 million.

The focal point of the project will be a new computing resource, called ConFlux, which is designed to enable supercomputer simulations to interface with large datasets while running. This capability will close a gap in the U.S. research computing infrastructure and place U-M at the forefront of the emerging field of data-driven physics. The new Center for Data-Driven Computational Physics will build and manage ConFlux.

Turbulence simulations for a vortex such as a tornado, a galaxy, or the swirls that form at the tips of airplane wings. Courtesy of Karthik Duraisamy, Aerospace Engineering.

The project will add supercomputing nodes designed specifically to enable data-intensive operations. The nodes will be equipped with next-generation central and graphics processing units, large memories and ultra-fast interconnects.

A three-petabyte hard drive will seamlessly handle both traditional and big data storage. Advanced Research Computing – Technology Services at the University of Michigan provided critical support in defining the technical requirements of ConFlux. The project exemplifies the objectives of President Obama’s new National Strategic Computing Initiative, which has called for the use of vast data sets in addition to increasing brute-force computing power.

The common challenge among the five main studies in the grant is a matter of scale. The processes of interest can be traced back to the behaviors of atoms and molecules, billions of times smaller than the human-scale or larger questions that researchers want to answer.

Even the most powerful computer in the world cannot handle these calculations without resorting to approximations, said Karthik Duraisamy, an assistant professor of aerospace engineering and director of the new center. “Such a disparity of scales exists in many problems of interest to scientists and engineers,” he said.

But approximate models often aren’t accurate enough to answer many important questions in science, engineering and medicine. “We need to leverage the availability of past and present data to refine and improve existing models,” Duraisamy explained.

Data from hospital scans, when fed into a computer model of blood flow, can become a powerful predictor of cardiovascular disease. Courtesy of Alberto Figueroa, Biomedical Engineering.

This data could come from accurate simulations with a limited scope, small enough to be practical on existing supercomputers, as well as from experiments and measurements. The new computing nodes will be optimized for operations such as feeding data from the hard drive into algorithms that use the data to make predictions, a technique known as machine learning.
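At its simplest, "using data to refine a model" means learning a correction so that a cheap, approximate model reproduces trusted high-fidelity results. The toy sketch below does this with ordinary least squares on synthetic data; it stands in conceptually for the much larger machine-learning workflows ConFlux is designed to run, and every number and functional form in it is invented for illustration.

```python
# Toy illustration (not any funded project's method): learn a correction term
# for an approximate physics model from "high-fidelity" data, then check that
# the corrected model tracks the truth better than the original.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 40)                  # some input parameter

def approximate_model(x):
    return 1.0 + 0.5 * x                       # cheap model: misses the curvature

truth = 1.0 + 0.5 * x + 0.3 * x**2             # stand-in for expensive simulations
data = truth + rng.normal(scale=0.02, size=x.size)   # noisy "measurements"

# Fit a quadratic correction to the model's residual by least squares.
features = np.vstack([np.ones_like(x), x, x**2]).T
residual = data - approximate_model(x)
coeffs, *_ = np.linalg.lstsq(features, residual, rcond=None)
corrected = approximate_model(x) + features @ coeffs

print("mean error, approximate model:", round(float(np.abs(truth - approximate_model(x)).mean()), 3))
print("mean error, data-corrected   :", round(float(np.abs(truth - corrected).mean()), 3))
```

The funded projects replace this toy regression with far richer learning algorithms trained on large volumes of simulation and measurement data.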

“Big data is typically associated with web analytics, social networks and online advertising. ConFlux will be a unique facility specifically designed for physical modeling using massive volumes of data,” said Barzan Mozafari, an assistant professor of computer science and engineering, who will oversee the implementation of the new computing technology.

The faculty members spearheading this project come from departments across the University, but all are members of the Michigan Institute for Computational Discovery and Engineering (MICDE), which was launched in 2013.

“MICDE is the home at U-M of the so-called third pillar of scientific discovery, computational science, which has taken its place alongside theory and experiment,” said Krishna Garikipati, MICDE’s associate director for research.

The following projects will be the first to utilize the new computing capabilities:

  • Cardiovascular disease. Noninvasive imaging such as MRI and CT scans could enable doctors to deduce the stiffness of a patient’s arteries, a strong predictor of diseases such as hypertension. By combining the scan results with a physical model of blood flow, doctors could have an estimate for arterial stiffness within an hour of the scan. The study is led by Alberto Figueroa, the Edward B. Diethrich M.D. Research Professor of Biomedical Engineering and Vascular Surgery.
  • Turbulence. When a flow of air or water breaks up into swirls and eddies, the pure physics equations become too complex to solve. But more accurate turbulence simulation would speed up the development of more efficient airplane designs. It will also improve weather forecasting, climate science and other fields that involve the flow of liquids or gases. Duraisamy leads this project.
  • Clouds, rainfall and climate. Clouds play a central role in whether the atmosphere retains or releases heat. Wind, temperature, land use and particulates such as smoke, pollen and air pollution all affect cloud formation and precipitation. Derek Posselt, an associate professor of atmospheric, oceanic and space sciences, and his team plan to use computer models to determine how clouds and precipitation respond to changes in the climate in particular regions and seasons.
  • Dark matter and dark energy. Dark matter and dark energy are estimated to make up about 96 percent of the universe. Galaxies should trace the invisible structure of dark matter that stretches across the universe, but the formation of galaxies plays by additional rules – it’s not as simple as connecting the dots. Simulations of galaxy formation, informed by data from large galaxy-mapping studies, should better represent the roles of dark matter and dark energy in the history of the universe. August Evrard and Christopher Miller, professors of physics and astronomy, lead this study.
  • Material property prediction. Material scientists would like to be able to predict a material’s properties based on its chemical composition and structure, but supercomputers aren’t powerful enough to scale atom-level interactions up to bulk qualities such as strength, brittleness or chemical stability. An effort led by Garikipati and Vikram Gavini, a professor and an associate professor of mechanical engineering, respectively, will combine existing theories with the help of data on material structure and properties.

“It will enable a fundamentally new description of material behavior—guided by theory, but respectful of the cold facts of the data. Wholly new materials that transcend metals, polymers or ceramics can then be designed with applications ranging from tissue replacement to space travel,” said Garikipati, who is also a professor of mathematics.