AI in Business: Moving Past the Hype and False Promises
AI’s impact on chip development and healthcare
There’s no doubt that AI sparks mixed feelings among the public. To quantify this, we can look at the United States as an example. A 2024 survey by YouGov breaks down the four most common emotions toward AI: caution, concern, skepticism, and curiosity.
For instance, the following chart shows that 14% of people believe AI is already smarter than humans.
What’s more, younger generations are the ones who expect the most positive impact from AI in areas like society, their personal lives, and the economy.
Whatever the global numbers say, a quick look online reveals that many people feel aversion or even outright hostility toward AI. It’s not uncommon to see discussions about companies lying or making exaggerated claims about what AI can really do, alongside conversations about businesses looking to replace workers with AI.
That said, these examples only scratch the surface of what everyone’s talking about. So the question is: What’s happening with the latest developments? To understand that, we need to dive deeper and take a closer look at the more complex aspects of AI.
What I aim to do here is shed light on the positive real-world effects of AI and showcase what this technology is already achieving in fields like assistive medicine, better chip design, and other innovative projects. It’s important to note that many of these initiatives are still in their early stages, so things could evolve in the future, but conceptually, they’re worth exploring.
The New AI Approach in Healthcare
In healthcare, neural networks are showing a lot of potential. They’re improving how diseases are diagnosed and treated, and how health is managed on a global scale.
The chart below highlights areas where healthcare is already seeing an impact, like medical equipment preparation and patient care. But we can also see that technical roles are being affected, too.
Google’s MedLM is a family of AI models designed for healthcare tasks like quickly generating patient notes and helping identify key diseases. The aim is to make clinicians’ work easier and more efficient. Along the same lines, NVIDIA has introduced AI tools to make surgeries and medical imaging “smarter” and faster.
Companies like Johnson & Johnson and GE Healthcare are adopting these technological advancements in the pharmaceutical and biopharmaceutical industries. They’ve noted that AI has gone from being just a buzzword to a real game changer in the span of about 18 months. So, what do they mean by that?
Typically, discovering new drugs is a lengthy process involving research, clinical trials, and testing—costing pharmaceutical companies millions of dollars, with a high risk of failure. But with AI, both time and costs can be reduced, addressing one of the industry's biggest challenges.
Companies like Recursion are using NVIDIA’s BioHive-2 AI supercomputer to speed up drug discovery. Another example is BenevolentAI, which uses AI to review drug candidates that are nearly ready for clinical trials, focusing on diseases that currently lack effective treatments. Then there’s Atomwise, which applies deep learning to predict atomic-level binding affinity, a key step in drug discovery that helps surface promising compounds more efficiently.
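To make “predicting binding affinity with a neural network” a bit more concrete, here is a minimal, illustrative sketch: a small feed-forward network that maps a molecular fingerprint (a fixed-length bit vector describing a compound) to a predicted affinity score. The data is synthetic and the model is a toy stand-in, not Atomwise’s actual pipeline.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Toy stand-in data: each "compound" is a 256-bit molecular fingerprint,
# and the label is a synthetic binding affinity score. Real pipelines
# derive fingerprints from actual molecular structures and assay data.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(1000, 256)).astype(float)
true_weights = rng.normal(size=256)
y = X @ true_weights + rng.normal(scale=0.1, size=1000)  # synthetic affinities

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A small feed-forward network: fingerprint in, predicted affinity out.
model = MLPRegressor(hidden_layer_sizes=(128, 64), max_iter=500, random_state=0)
model.fit(X_train, y_train)

print(f"R^2 on held-out compounds: {model.score(X_test, y_test):.3f}")
```

The appeal of this setup is that scoring a candidate compound becomes a cheap forward pass, so millions of molecules can be screened computationally before anything touches a lab bench.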
McKinsey published a study showing that AI has a positive impact on productivity and increases success rates in biopharma research. This is a quiet shift happening behind the scenes, far from the mainstream AI hype.
In fact, there are nearly 270 companies working on AI-driven drug discovery. If we combine this with rapid technological advances—and set aside corporate greed—we could see new drugs with fewer side effects and lower prices. I know, it’s a big ask, but I’m hopeful it will eventually happen.
Take Delfi, for example, which has developed an AI-based test to identify early signs of lung cancer. Similarly, researchers at Children’s National Hospital in Washington, D.C., have built an AI tool to diagnose rheumatic heart disease in children.
In robotic surgery, AI is helping doctors and surgeons improve precision. The da Vinci robotic surgical system has been in operating rooms for years. What’s new is that its latest generation, the da Vinci 5, integrates machine learning for image recognition, surgical planning, and real-time adjustments that adapt to a patient’s unique anatomy.
It’s important to point out that in all these cases, data training is critical. Data is essential for AI solutions in healthcare, but its effectiveness can be limited by factors like demographics, age, gender, and environmental conditions. So while I think these early breakthroughs are promising, we need to stay realistic—it will probably take some time before the average person sees a significant, lasting impact.
Smart Prosthetics as a Life-Changing Opportunity
One of the most remarkable ways AI could make a difference is by potentially restoring mobility to those who have lost it. This can happen through AI-powered prosthetics.
Take Sarah’s story. Two years ago, at 45, she lost her arm in a subway accident. She now uses an AI-assisted bionic arm that trains itself and improves the more she uses it. One of the challenges, however, is the weight of the prosthetic. But overall, it gives her the ability to do daily tasks that were once impossible—simple things like making coffee, brushing her hair, or hugging her daughter.
You’ve probably seen or even used a prosthetic before, but how are they now powered by AI?
These prosthetics use neural networks and machine learning to interpret the electrical signals from the patient’s muscles, allowing for more precise and intuitive control. AI essentially gives the prosthetic a “brain.” Imagine a prosthetic leg that knows when you’re climbing stairs, or a hand that can gently grip a cup without spilling it; that’s neural networks in action. Sensors inside the limb read muscle activity and environmental cues, and the AI maps those readings to movement. Because the system learns, it continuously adapts to different environments and situations, getting better over time.
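To illustrate the control idea, here is a minimal sketch of one piece of that loop: extracting classic time-domain features from a window of muscle (EMG) signal and classifying which motion the user intends. The signals, gesture names, and model below are simplified stand-ins; real prosthetic controllers use richer sensors and far more capable models.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

GESTURES = ["rest", "open_hand", "close_hand"]  # illustrative intent classes

def emg_features(window: np.ndarray) -> np.ndarray:
    """Classic time-domain EMG features for one window of samples."""
    mav = np.mean(np.abs(window))                # mean absolute value
    rms = np.sqrt(np.mean(window ** 2))          # root mean square
    zc = np.sum(np.diff(np.sign(window)) != 0)   # zero crossings
    return np.array([mav, rms, zc])

# Synthetic training data: each gesture produces windows with a different
# signal amplitude. Real systems record labeled EMG from the user instead.
rng = np.random.default_rng(1)
X, y = [], []
for label, amplitude in enumerate([0.1, 0.5, 1.0]):
    for _ in range(200):
        window = rng.normal(scale=amplitude, size=200)
        X.append(emg_features(window))
        y.append(label)

clf = LogisticRegression(max_iter=1000).fit(np.array(X), y)

# At runtime, the prosthetic classifies each incoming signal window
# and maps the predicted intent to a motor command.
new_window = rng.normal(scale=0.9, size=200)
print("Predicted intent:", GESTURES[clf.predict([emg_features(new_window)])[0]])
```

The “learning over time” the article describes would come from periodically retraining a model like this on the wearer’s own recorded signals, so control keeps improving with use.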
Ottobock, for example, offers a bionic hand that uses AI to improve functionality and control. Another company, Ossur, makes AI-powered prosthetics for lower limbs.
Estimates of the number of amputees worldwide run into the tens of millions, with about 2 million in the United States alone, a figure expected to double by 2050. So, think about how many lives could be changed by AI and advanced prosthetics.
That said, despite the huge potential, this technology isn’t yet accessible to everyone who needs it. One of the main reasons, as you might have guessed, is the cost. Building a robotic system that can fully mimic natural human movement is incredibly complex. So, it may be decades before we reach that level. But even with the technology available today, it’s already life-changing for those who can access it.
As manufacturing costs decrease, more people will be able to afford these advanced prosthetics, breaking down the barriers to entry.
Using AI to Build the Chips That Drive AI
At this point, it’s no secret that AI is largely powered by NVIDIA GPUs, especially during the training phase. That’s why NVIDIA’s market value has skyrocketed alongside the AI boom itself; the two track each other closely. These GPUs rely on silicon, the key component that enables all the AI capabilities we’ve seen. However, the process of making these chips is now taking an unexpected turn.
What I’m saying is that AI is now improving the very chips that power its training. It’s a fascinating cycle to observe. In the chip manufacturing process, there’s a crucial step called lithography, where detailed circuit designs are printed onto semiconductor wafers, and a heavy computational step behind it that works out the photomasks used to do that printing. This is where NVIDIA’s cuLitho library comes in: a GPU-accelerated approach that, by NVIDIA’s own figures, speeds up computational lithography by up to 40x, cutting the time needed to compute a photomask from weeks to a matter of hours, with generative AI techniques pushing it further still.
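To get a feel for why GPUs help here: the heart of computational lithography is simulating how the optics project a mask pattern onto the wafer, which at its simplest reduces to convolving the mask with an optical kernel. Below is a toy version of that inner step, with a Gaussian kernel standing in for the real optical model and every parameter chosen purely for illustration.

```python
import numpy as np

def simulate_aerial_image(mask: np.ndarray, sigma: float = 2.0) -> np.ndarray:
    """Toy lithography step: blur a binary mask with a Gaussian
    point-spread function via FFT convolution. Production tools use
    far more detailed optical models, but the structure is similar."""
    n = mask.shape[0]
    freqs = np.fft.fftfreq(n)
    fx, fy = np.meshgrid(freqs, freqs)
    # Gaussian optical transfer function in frequency space.
    otf = np.exp(-2 * (np.pi * sigma) ** 2 * (fx ** 2 + fy ** 2))
    return np.real(np.fft.ifft2(np.fft.fft2(mask) * otf))

# A simple test mask: one rectangular feature on an otherwise empty field.
mask = np.zeros((256, 256))
mask[100:156, 120:136] = 1.0

image = simulate_aerial_image(mask)
print("Peak intensity after optics:", image.max().round(3))
```

The reason acceleration matters is scale: a production flow runs computations like this across billions of mask features, over and over, which is exactly the kind of massively parallel workload GPUs excel at.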
This AI-driven approach isn’t just about speed—it’s about making what once seemed physically impossible, possible. TSMC, the world’s largest independent semiconductor foundry, is already using this accelerated process. Other big industry players like ASML are also joining in, setting the stage for the next generation of semiconductors.
And if NVIDIA is indeed using AI to design its own chips, we can expect a further boost in performance.
A chart comparing NVIDIA’s GPU performance against Moore’s Law shows gains running well ahead of the classic curve. It’s hard to know exactly how much of that is down to AI, but it’s clearly making a difference.
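For reference, Moore’s Law-style growth is just compound doubling, so you can back out the implied doubling time from any two points on a chart like that. The numbers below are made up purely for illustration:

```python
import math

def implied_doubling_time(perf_start: float, perf_end: float, years: float) -> float:
    """How many years it takes performance to double, given two data points."""
    growth = perf_end / perf_start
    return years * math.log(2) / math.log(growth)

# Illustrative (made-up) numbers: a 1000x performance gain over 10 years
# implies doubling roughly every year, well ahead of Moore's Law's
# classic two-year doubling cadence.
print(f"{implied_doubling_time(1, 1000, 10):.2f} years per doubling")
```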
So, what’s the bottom line?
We could say that what’s fast today is about to get a whole lot faster. AI might push the computing power of tomorrow to levels we’ve only dreamed of.
Final Thoughts
At this point, it’s safe to say we’ve been bombarded with AI-powered consumer apps, with every company adding AI as some kind of shiny new feature. Honestly, it’s overhyped and kind of annoying, because the real impact is hard to see. I get that some people might find it fun to generate stickers or send emails with AI. But for AI to tackle some of humanity’s bigger challenges, it needs to be focused on solving real, large-scale problems, like the ones we’ve discussed in this article.