Teaching a computer program to track cells
Following the minuscule movements of every cell in a petri dish would be a painstaking task for any human. But teach a set of computer programs to do the job, and they can complete it quickly and even observe things that the human eye would miss.
Scientists at Gladstone Institutes have developed such an approach, which uses "neural nets"—artificial intelligence programs that can detect patterns—to analyze the locations of hundreds of cells growing together in a colony. When they applied the technique to a group of stem cells, the program revealed that a small number of cells act as "leaders," able to direct the movements of their neighbors.
"This technique gives us a much more comprehensive view of how cells behave, how they work cooperatively, and how they come together in physical space to form complex organs," says Gladstone Senior Investigator Todd C. McDevitt, Ph.D., senior author of a new paper published in the journal Stem Cell Reports.
Clusters of stem cells have the ability to form any tissue in the human body when exposed to the right mixture of signaling molecules. But researchers have a poor understanding of how those cells form patterns in space to eventually give rise to complex three-dimensional organs.
Traditionally, to study how cells move in space over time, cell biologists tag cells with fluorescent molecules that make them easy to track. Then, they watch those cells under a microscope to see how they divide and migrate. However, a human observer can only follow a small handful of cells at a time before it becomes too challenging to distinguish different cells and track their movements. This means scientists often have to extrapolate how an entire colony moves based on the movements of just a few of its members.
In the new paper, McDevitt's group trained three different neural networks to follow the motions of individual cells within colonies of thousands of cells. Each network had its own strengths and weaknesses, and individually none of them outperformed a person. Combined, however, the three networks were slightly more accurate at tracking cells: they found 94 percent of all cells in sequential frames, meaning they could follow those cells' movements over time, while human observers could track only 90 percent of cells between frames. What's more, the combined neural networks were about 500 times faster than a person, averaging 0.35 seconds per frame to identify all cells, compared with roughly 3 minutes per frame for a human.
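The paper's actual networks and matching procedure aren't reproduced here, but the two core steps it describes—merging the predictions of several detectors, then linking detections between consecutive frames—can be sketched in a few lines. This is a minimal illustration, not the authors' method: the majority-vote merge, the greedy nearest-neighbour linking, and all thresholds are assumptions.

```python
import numpy as np

def merge_detections(det_a, det_b, det_c, radius=5.0):
    """Combine three detectors' cell-center predictions by majority vote:
    keep a point only if at least one other detector places a cell within
    `radius` pixels, and drop near-duplicates already accepted.
    (Illustrative ensembling, not the published architecture.)"""
    merged = []
    for pts, others in [(det_a, (det_b, det_c)),
                        (det_b, (det_a, det_c)),
                        (det_c, (det_a, det_b))]:
        for p in pts:
            votes = sum(any(np.linalg.norm(p - q) <= radius for q in o)
                        for o in others)
            if votes >= 1 and not any(np.linalg.norm(p - m) <= radius
                                      for m in merged):
                merged.append(p)
    return np.array(merged)

def link_frames(prev_pts, next_pts, max_move=10.0):
    """Greedy nearest-neighbour linking of cell centers between two
    consecutive frames; returns (prev_index, next_index) pairs.
    The fraction of prev_pts that get a link is the 'tracked' rate."""
    links, used = [], set()
    for i, p in enumerate(prev_pts):
        dists = [np.linalg.norm(p - q) for q in next_pts]
        j = int(np.argmin(dists))
        if dists[j] <= max_move and j not in used:
            links.append((i, j))
            used.add(j)
    return links
```

With real data, the ratio `len(links) / len(prev_pts)` plays the role of the tracking rate the article quotes (94 percent for the ensemble versus 90 percent for humans).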
When the researchers used the networks to study new colonies of stem cells, they were surprised to see lots more action than previous cell-tracking techniques had identified. While the colonies looked fairly static to the naked eye, the neural networks showed that nearly every cell was on the move—and much of the movement looked random.
"Going in, we didn't really expect there to be that much cell motion, so we had to come up with new approaches to understand the apparent chaos of the cells," says Gladstone Graduate Student David Joy, first author of the new paper.
Cells nearest the edges of each colony moved the most, McDevitt's group discovered. Cells also started and stopped more often than the researchers expected: on average, each cell moved for approximately 15 minutes, then remained quiescent for about 10 minutes before another active phase began.
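Start-and-stop behavior like this can be read directly off a cell's position track by thresholding its step-to-step displacement. The sketch below is an assumption about how one might do it—the frame interval, speed threshold, and function name are illustrative, not taken from the paper.

```python
import numpy as np

def movement_phases(positions, dt_min=5.0, speed_thresh=0.5):
    """Split one cell's track into active/quiescent runs.
    positions: (T, 2) array of x,y coordinates, one row per frame.
    dt_min: minutes between frames (assumed value).
    Returns a list of (is_active, duration_in_minutes) tuples."""
    steps = np.linalg.norm(np.diff(positions, axis=0), axis=1)
    active = steps > speed_thresh  # True where the cell moved this interval
    runs, start = [], 0
    for k in range(1, len(active) + 1):
        # close a run when the active/quiescent state flips (or track ends)
        if k == len(active) or active[k] != active[start]:
            runs.append((bool(active[start]), (k - start) * dt_min))
            start = k
    return runs
```

Averaging the active and quiescent run lengths over every cell in a colony would yield summary figures like the roughly 15-minute active and 10-minute quiescent phases reported above.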
The researchers went on to show how changing the conditions of the cells' environment—by exposing the cells to different nutrients or drugs—can change how cells move. They also used the neural networks to track stem cell colonies over 24 hours as they began to form the multiple layers of different cell types that appear in an early embryo. The team found that cells have a wide range of movement profiles.
"Some cells move with a lot of persistence in once direction, while others move around and around but never get far from where they started," says Ashley Libby, Ph.D., a former graduate student in McDevitt's lab who helped lead the work. The diversity surprised the team; they had expected most cells to follow similar patterns of movement, she says.
What's more, some cells acted as "leaders" while others behaved more like "followers," the researchers say. The motion of a small number of cells spread outward to their neighbors, eventually shifting the dynamics of the entire colony. It's a pattern that wouldn't have been obvious if just a few cells had been tracked over time by a human observer.
The new findings are just a small sampling of the kinds of observations that will be possible as artificial intelligence approaches are applied to cell tracking, says McDevitt. And the knowledge that comes from these future experiments will be useful to researchers trying to coax cells to come together into complex organoids and organs—for both research and therapeutic purposes.
"If I wanted to make a new human heart right now, I know what types of cells are needed, and I know how to grow them independently in dishes," says McDevitt, who is also a professor of bioengineering and therapeutic sciences at UC San Francisco. "But we really don't know how to get those cells to come together to form something as complex as a heart. To accomplish that, we need more insights into how cells work cooperatively to arrange themselves."
As a next step, McDevitt's team is planning future studies that use the neural networks to analyze movements within stem cell cultures that have genetic mutations, to help show the effect of different genes on cell organization.
More information: David A. Joy et al, Deep neural net tracking of human pluripotent stem cells reveals intrinsic behaviors directing morphogenesis, Stem Cell Reports (2021). DOI: 10.1016/j.stemcr.2021.04.008
Journal information: Stem Cell Reports
Provided by Gladstone Institutes