During Nvidia’s fourth-quarter earnings call earlier this year, CEO Jensen Huang declared, “The next frontier of AI is physical AI.”
Huang was referring to a relatively new field that combines artificial intelligence with everything from humanoid robots and industrial machines to cars and even internet of things (IoT) devices. Data from those physical systems is fed into the AI’s brain — a large language model — after which the technology can make decisions in real time.
The AI-infused combination differs from traditional robotics because it uses a bidirectional learning system instead of fixed rules, making it more adaptable and capable in complex, unpredictable situations, according to Albert Meige, an associate director with global consultancy Arthur D. Little.
The consultancy recently released a research report on physical AI in which it explained that while generative AI (genAI) is good at creating digital content, it lacks real-world awareness. Physical AI bridges that gap.
Although it’s gotten less attention than genAI, physical AI’s long-term impact could be greater because it enhances how we act in the world, not just how we generate content, Meige explained. “In the past one to two years, we’ve entered the era of agentic AI. Physical AI is the next chapter — embedding intelligence into devices that sense, decide, and act in the real world.”
Physical AI is already reshaping robotics, tackling challenges in productivity, aging, and the environment by taking over repetitive or risky tasks — reshaping jobs, not necessarily replacing them.
AI is driving robotics across a variety of sectors by making machines more autonomous. Physical AI is gaining ground in areas such as logistics, healthcare, and manufacturing, where it handles tasks and gathers real-world data — helping it break through data limitations and making its use increasingly essential.
The main challenge for physical AI is its need for massive data sets to build a usable “abstract representation of the physical world essential for effective functioning,” Meige said.
Physical AI also plays a crucial role in collecting fresh real-world data that goes beyond just high-quality training datasets. In that sense, its deployment is becoming inevitable, Meige said.
Amazon Robotics, which claims to be the world’s largest manufacturer of industrial robots, recently reached a milestone with the deployment of its one millionth robot. It’s even working on a robot with a sense of touch, and it has launched a generative AI model to optimize its warehouse robot fleet. (Amazon said its new model, DeepFleet, boosts robot efficiency by 10%, speeding up deliveries and cutting costs across 300+ fulfillment centers.)
Amazon said its robots mainly handle repetitive tasks and heavy lifting, making work safer and easier for employees — allowing those workers to focus on delivering better service to customers.
Even so, Amazon CEO Andy Jassy recently said that because of AI “there will be fewer people doing some of the jobs that the technology actually starts to automate,” though it could create others in fields like robotics and machine learning.
Amazon recently opened its third robotic fulfillment center, this one in Massachusetts. The nearly three-million-square-foot facility features hundreds of robots equipped to lift up to 1,500 pounds each. In May, Amazon began construction on a robotics facility in Virginia — its fourth in that state. And the company is building a robotics fulfillment center in North Carolina; it’s set to open in 2026.
Amazon Robotics is building agentic AI to let robots understand natural language and act autonomously. Soon, workers could simply say, “Pick items from the yellow tote,” and robots like its pizza-box-shaped “Proteus” model will do it — freeing up humans for higher-level tasks.
To help train its robots to perform more complex tasks, Amazon has partnered with MIT’s Industrial Performance Center (IPC). The IPC lab trains robots to understand and predict human instructions, helping to create a future where humans and machines work together — not separately, as is often the case today.
AI and robotics are already converging to handle complex, unpredictable tasks, but to succeed, AI must learn and reason beyond its training data, drawing on multiple data types — much as doctors do with text, speech, and visuals. Traditional factory robots, for example, struggle to sort parts that aren’t exactly where they should be on a conveyor or work area. Researchers are now demonstrating that an AI-aided robotic arm can pick randomly placed parts and sort them based on their uses.
Will robots take jobs?
“AI-enabled robots promise to be more flexible,” said IPC Executive Director Ben Armstrong. “They don’t do the same thing over and over again. Instead, they can adapt to their conditions to do a different thing in time based on what the suitable response is and a suitable different path in time.”
Since 2018, MIT’s task force has also explored whether AI will replace jobs. The simple answer: no.
“In many cases, they’re slower than humans,” Armstrong explained. But robots are consistent and can run overnight without supervision; so instead of replacing jobs, they can free up technicians for higher-value tasks, he said.
Arthur D. Little’s Meige agreed, saying AI-infused robotics is “shifting the task landscape in a way that transforms human roles rather than eliminating them. These systems are already having tangible impact in terms of adaptability, accuracy, and autonomy across industries. More importantly, they are a key source of fresh, real-world data, which is critical as we reach the limits of current training datasets,” Meige said.
An Amazon spokesperson said its robotics are designed to automate tasks in an effort to continue improving safety, reducing repetition, “and freeing our employees up to deliver for customers in more skilled ways.
“Since introducing robots within Amazon’s operations, we’ve continued to hire hundreds of thousands of employees to work in our facilities and created many new job categories worldwide, including positions like flow control specialists, floor monitors, and reliability maintenance engineers,” the spokesperson said in an email to Computerworld.
The adoption of AI-powered robotics has been slower than expected, and its impact on workers has been gradual, “not drastic,” Armstrong said.
He pointed to a 2018 study by Oxford scholars that indicated 47% of all jobs could be done by machines “over the next decade or two.”
“We’re getting close to 2030 now, and that’s not the case at all,” Armstrong said. “Even given some of the categories of jobs that they said would be entirely automatable by AI, we see that the number of people in those jobs [has] actually gone up since AI has been adopted and become more popular.”
In fact, an MIT survey of more than 6,000 workers in nine countries last year found that 60% believe AI and robotics will positively impact their safety, careers, and productivity.
Today, 20% of factories with between 50 and 150 employees have industrial robots installed; that’s about half the rate of factories employing more than 1,000 workers, according to a report in the Financial Times.
The rise of ‘humanoid’ robots?
Another area where physical AI is advancing involves “humanoids” — robots that resemble and act like human beings; it’s a market that’s expected to surpass $5 trillion by 2050, according to Morgan Stanley. (That figure includes sales from supply chains and networks for repair, maintenance and support.)
Tesla CEO Elon Musk has said his company will start selling its humanoid robot — Optimus — in 2026. It’s already working autonomously, handling batteries at a Tesla facility and performing other chores.
China, with strong government support, is currently leading in the development of humanoids powered by AI — 90% of them are used for industrial and commercial purposes. And while the adoption of humanoids is likely to accelerate in the late 2030s with improved technology as well as greater regulatory and societal support, the hype around them could exceed the reality.
Armstrong, for example, isn’t particularly bullish on humanoid robots.
“They’ve been hyped for a long time, and it really feels like a solution in search of a problem,” he said. “A lot of the real challenges that robotics can solve don’t necessarily need a human form factor, and a human form factor might actually be disadvantageous to solve those problems.
“To me, it’s driven by this idea that the human form is the best solution to all sorts of problems, but in manufacturing, I’m not sure that’s always the case,” Armstrong said. “So, I think there are troubles with those theories. One is that we’re not very good at predicting the future.”