In the early 1990s I was living in Boston and reflecting on the relative progress of information technology and the life sciences. It appeared to me then that IT was sputtering a bit; the Internet hadn’t yet transformed commerce, and AI was in one of its winters. Life sciences, on the other hand, seemed to be at the beginning of a boom—at least in Boston and Cambridge. Genomics, personalized medicine, gene editing, and so forth hadn’t fully appeared in life sciences yet, but they were on the horizon. So I toyed with the idea that I should abandon my research, teaching, and consulting on IT to focus on life sciences. Not that I have the biochemical expertise to make fundamental contributions to that field, but I could at least focus on the business strategy and organizational effectiveness issues in the life sciences industry.
That would have been a big mistake on my part. To be sure, the life sciences have taken off in a big way. If you’ve visited Kendall Square in Cambridge recently, you will know what a biotech boomtown it has become. But almost every person I meet in the region’s biotech companies has the term “informatics” in their title. IT has become a dominant force in almost every type of innovation. In fact, a recent visit to Kendall Square for the 2019 Emtech MIT conference confirmed my belief that information technology, big data, and AI are powering developments in almost every area of science and technology.
The specific provocation for this observation was the 2019 “35 Innovators Under 35” awards, which are announced annually by MIT Technology Review (which puts on the Emtech conferences). Thirty-one of the 35 young innovators were present at the conference, and each gave a short talk. Their science and technology domains were quite varied, ranging from measuring moisture levels in harvested grain to mapping the human brain. Some, of course, worked in information technology fields like artificial intelligence. But I was struck that almost every one of these innovators used information technology to help achieve their goals.
Take, for example, the mapping of the human brain for the purpose of better understanding neurological disorders. That’s the focus of Archana Venkataraman, a professor at Johns Hopkins University. She’s using AI, deep learning models in particular, to analyze EEG data and to pinpoint the time and location of epileptic seizure onset in the brain. She hopes this will help to diagnose and treat epilepsy, as well as other neurological conditions like schizophrenia, brain tumors, spinal cord injury, and autism.
Similarly, measuring moisture levels in harvested grain in Africa wouldn’t seem to be an IT innovation. But Isaac Sesi from Ghana has a product called GrainMate that helps farmers and grain purchasers to measure moisture in their grains, which helps keep them from spoiling after harvesting. It uses an electronic grain-moisture meter and a mobile app.
There were many other examples of IT-enabled innovation. Riana Lynn uses nutrient data and AI to make more nutritious snacks. Tim Ellis employs machine learning and automation to power a 3D metal printer that makes rocket components. Silvia Caballero applies bioinformatics to identify gut bacteria that can control antibiotic-resistant infections. Himabindu Lakkaraju develops AI models to check for bias in important decisions. Counting both the innovators who develop IT itself (AI models, robots, user interfaces, quantum computing, cybersecurity, and so on) and those who use it in service of something else, the great majority of these innovators are applying information technology to solve important problems.
Of course, there were some exceptions to the prominence of IT. At least in the short talks I heard, there wasn’t much IT in the materials science behind textile-based building blocks or in a new approach to filtering dirty water. However, I’d guess that even those innovators used IT in some important way. And while I was at MIT on this visit, I noticed a big effort in the Materials Science and Engineering Department to use machine learning to identify new materials.
Virtually every technological or scientific investigation today involves the generation, collection, and analysis of data. That data is too big and complex to be analyzed solely by the human brain—we need computers to chew through it and make sense of it. To research and develop new technologies in cognitive science, genetics, or medicine is also to research how information and computers can shed new light on those areas.