When it comes to drug discovery and development, scientists are busy furrowing their lab-goggled brows trying to understand what’s real and what’s hype about the power and potential of AI.
This *Resonance Test* conversation perfectly dramatizes the situation. In this episode, Emma Eng, VP of Global Data & AI, Development at Novo Nordisk, and scientist and strategist Chris Waller provide a candid view of drug development in the AI era.
“We're standing on a revolution,” says Eng, reminding us that “we've done it so many other times” with the birth of the computer and the birth of the internet. It’s prudent, she cautions, not to rush to judgement guided by either zealots or skeptics.
Waller says, of the articles about AI and leadership in *Harvard Business Review,* one could do “a search and replace ‘AI’ with any other technological change that's happened in the last 30 years. It's the same kind of trend and processes and characteristics that you need in your leadership to implement the technology appropriately to get the outcomes that you're looking for.”
Which means, for pharma, much uncertainty and much experimentation.
“I think experimentation is good,” says Eng, who then adds that we always need to keep track of what it is we’re experimenting on. She says that the word “experimentation” can “sound very fluid” but in fact, “It's a very structured process. You set up some very clear objectives and you either prove or don't prove those objectives.”
Waller references the various revolutions pharma has already seen (high-throughput screening, combinatorial chemistry, data, and analytics) and says: “We've all held out hope for each and every one of these revolutions that the drug discovery process is going to be shrunk by 50% and cost half as much. And every time we turn around, it's still 12 to 15 years, $1.5 to $2 billion.”
Will AI make the big difference, finally?
“Maybe we need to be revolutionized as an industry,” says Eng. “It can be hard to make much of a difference as long as there are few big players.” Just a few big players, she says, is “the nature of pharma.”
Of course, our scientists are measured in their assessments about industry change. After all, as Waller says, the systems involved—the human body, the regulatory environment, the commercial ecosystems—are all “super-complicated.”
Eng notes that an important side effect of the AI hype is corporate interest in data. “Now it's much easier to put that topic on the table saying, ‘If you want to do AI, you need to take care of your data and you need to treat it like an asset.’”
Listen on as they test topics such as regional and regulatory challenges in AI adoption, change management, and future tech and long-term impact (watch out for quantum, everyone!).
In the end, Eng returns to the idea of revolutions. “You think you want so much change in the beginning which you don't get because it takes time,” says Eng. This makes us underestimate what will happen later. Having such a farseeing mindset is significant, she says, because “these technology shifts will have a large impact on the long term.”
Host: Alison Kotin
Engineer: Kyp Pilalas
Producer: Ken Gordon
The Resonance Test 88: Scott Loughlin, Sam Rehman, and Brian Imholte on Privacy, Education, and AI
Sam Rehman—a frequent voice on this podcast network and EPAM’s Chief Information Security Officer and SVP—was in the classroom recently, teaching students, and in the process was “surprised by the density of PII that's in the system.”
This led Rehman to realize that “at least here in California,” higher education’s investment in cybersecurity is “substantially behind.”
Catching up is a theme of today’s conversation about privacy, education, and artificial intelligence.
Speaking for the (cyber)defense, with Rehman, is today’s guest on *The Resonance Test,* Scott Loughlin, Partner and Global Co-Lead of the Privacy & Cybersecurity Practice at the law firm Hogan Lovells.
“It took a long time to get people to understand that the easiest thing to do is not always the right thing to do to protect the company’s interest and protect the company’s data,” says Loughlin. “And that is an experience that we'll all have with respect to generative AI tools.”
Loughlin and Rehman are put through their conversational paces by questions from Brian Imholte, our Head of Education & Learning Services.
They have much to say about data governance (“Data is not by itself anymore, it's broken up in pieces, combined, massaged, and then pulled out from a model,” says Rehman), data pedigree, the laws—and lack thereof—regarding privacy and generative AI. They also kick around the role that FERPA assumes here. “You’re trying to deploy this old framework against this new technology, which is difficult,” says Loughlin, adding: “There are some key areas of tension that will come up with using generative AI with student data.”
So where might an educational publisher or school begin?
“Focus on your value first,” says Rehman. Do your experiments, but do them in small pieces, he says: “And then within those small pieces, know what you're putting into the model.”
This informative and spirited conversation is even occasionally funny. Loughlin brings up a court case about whether a selfie-taking monkey would own the copyright to its photo. “The court said no,” notes Loughlin, adding that US copyright law is “designed to protect the authorship of humans, not of monkeys, and in this case not of generative AI tools.”
Download now: It’s sure to generate some new thoughts.
Host: Kenji Ross
Engineer: Kyp Pilalas
Producer: Ken Gordon