- Rachel Wallach
If you built the largest, most powerful computer currently feasible, it would have about the same number of transistors as there are synapses in the brain of a 3-year-old child. It would also be slightly larger than a tennis court, and consume 10⁶ times the power needed by the preschooler's brain.
All of which means that it might be time to look in a new direction for the next generation of computers. The power and efficiency of the brain as a computing device suggest that biology is worth exploring as an inspiration, says Rebecca Schulman, an associate professor in Johns Hopkins University's Department of Chemical and Biomolecular Engineering. The National Science Foundation agrees: through a funding stream directed toward innovative biology-based information technologies, it awarded Schulman and three colleagues $1.5 million to design a computing system based on living cells.
"Our thought was that people have started programming cells, and now it is possible to create whole new genomes. What if we start over and engineer cultured cells like yeast, where the goal is to make them computing units?" asks Schulman, who will serve as principal investigator.
The grant, made in partnership with the Semiconductor Research Corporation, is designed to facilitate ultra-low-energy computing, storage, and signal processing systems built on developments in the fields of biology, chemistry, and engineering. Schulman's team will explore the creation of a new generation of dense, inexpensive, and highly energy-efficient computers made of large, three-dimensional yeast cell colonies grown from simple raw materials. Additional researchers are Joshua Vogelstein from the Johns Hopkins Department of Biomedical Engineering, Eric Klavins from the University of Washington, and Andrew Ellington from the University of Texas at Austin.
Previous related efforts have focused on using neurons for computing, but programming neurons is difficult and researchers have yet to successfully direct their information processing. Easy-to-grow cells like yeast solve the issue of programmability but present other challenges: they are prone to error and they divide and die, making it difficult to build a reliable computing architecture.
The team proposes to combine the advantages of neurons—which can process information redundantly, minimizing communication errors—with those offered by yeast cells, which can grow efficiently in culture and be easily reconfigured using modern genome engineering techniques. Their goal is to develop "yeastons": Saccharomyces cerevisiae cells that can collectively emulate neural networks. The networks will be designed to tolerate the variability found in single-cell biomolecular information processing, and the computing architectures will grow organically, so no patterning or higher-order spatial organization will be required.
The researchers expect to show that a large enough colony of yeastons could perform arbitrarily complex computations. Models offered by neuroscience will provide design principles for assembling robust yeaston networks and scaling laws for yeaston computing. Because the architectures will be modeled on the brain, where computation and memory are distributed across billions of neurons, the overall computing process will stand up to the variability and flux in individual cells' behavior, replication, and death.
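The fault-tolerance idea described above—distributing a computation across many unreliable cells so that individual error, division, and death don't corrupt the result—can be illustrated with a toy simulation. Everything here is a hypothetical sketch, not the project's actual design: the function names, the NAND task, and the noise and death parameters are all invented for illustration.

```python
import random

def yeaston_output(x, flip_prob):
    """One hypothetical 'yeaston' computing NAND on a pair of binary
    inputs, but reporting the wrong value with probability flip_prob
    (a stand-in for single-cell biomolecular noise)."""
    correct = 0 if (x[0] and x[1]) else 1
    flipped = 1 if random.random() < flip_prob else 0
    return correct ^ flipped

def colony_output(x, n_cells=1001, flip_prob=0.2, death_prob=0.1):
    """A redundant colony: poll the cells that are still alive and
    take a majority vote. Even with 20% per-cell error and 10% cell
    death, the collective answer is almost always correct."""
    votes = [yeaston_output(x, flip_prob)
             for _ in range(n_cells)
             if random.random() > death_prob]  # this cell died
    return int(2 * sum(votes) > len(votes))

if __name__ == "__main__":
    random.seed(0)
    for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
        print(x, "->", colony_output(x))
```

The point of the sketch is the scaling behavior the researchers describe: as the colony grows, the probability that a majority of noisy cells are simultaneously wrong shrinks exponentially, so reliable computation can emerge from unreliable parts without any spatial patterning.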
The project is expected to provide an example of very low-power computing: A yeaston-powered computer could one day offer the functions of a supercomputer while occupying only a fraction of the space. The project will also add to fundamental knowledge about intercellular coordination and how collective intercellular communication processes can allow for coordinated behavior.
Posted in Science+Technology
Tagged chemical engineering, biomolecular engineering, computers