Alex West is a senior principal analyst for the manufacturing technology group at Omdia, a division of Informa TechTarget. Opinions are the author’s own.
As digital technologies increasingly converge with physical ones, robotics vendors are looking to the next phase: physical AI, the convergence of physical and decision-making automation. At the same time, companies are looking at how their artificial intelligence can operate in the physical world.
Multiple recent developments highlight efforts to introduce a next generation of technology combining physical and decision-making capabilities. These investments will lead to solutions that make integrating automation into the manufacturing process easier and more flexible.
Nvidia’s physical AI partnerships
In a major announcement for Japan’s tech sector, Fujitsu and Nvidia recently launched a partnership to co-develop a next-generation AI chip by 2030. The effort will combine Fujitsu’s high-performance computing capabilities with Nvidia’s graphics processing units.
Significantly, the companies said they’re also exploring a collaboration with robot manufacturer Yaskawa Electric. Integrating new AI technologies into its robot systems would allow Yaskawa to translate the Fujitsu–Nvidia alliance into real-world “physical AI” applications on the factory floor. It would also give Yaskawa customers access to digital twin technology through Nvidia's Omniverse platform.
This week, Nvidia also announced plans to support technologies for designing and simulating factory digital twins, including the simulation of robot fleets, as an expansion of its Mega Nvidia Omniverse Blueprint. The company referenced Fanuc and Foxconn as early supporters of its OpenUSD technology for 3D-based robot digital twins.
Softbank’s ABB investment
Earlier this month, ABB Group also announced plans to spin off its ABB Robotics division as a separate entity that will be sold to SoftBank for nearly $5.4 billion. The deal, which is expected to close in “mid-to-late 2026,” will combine one of the top two industrial robot vendors with SoftBank’s expansive portfolio of AI and compute capabilities, driving new developments in AI-enabled robotics.
The robots of today aren’t the ones of tomorrow
Industrial robots are commonplace in factories, supporting the automation of work, especially in sectors such as automotive and electronics, and increasingly in consumer packaged goods. But they still require human effort, skills and resources that can be both expensive and in short supply.
Traditional robots are programmed to perform predetermined tasks with limited flexibility, necessitating exact positioning and static environments. Any change to a production line, the products produced or their form factors means reprogramming the robot. This often requires manual intervention during a changeover, as well as effort to set up and commission robots in new environments.
This month’s investment and partnership news, along with other robot vendors’ efforts to develop intelligent machines, is the latest sign that industrial robotics is maturing.
Having started with blind, unintelligent robots, the market is now seeing the integration of machine vision capabilities that allow robots to “see” and respond to their surroundings. Robots can identify objects, check positioning and make basic adjustments, which opens the door to quality control applications and more complex manufacturing tasks that require environmental awareness.
Physical AI, or adaptive robotics, can help manufacturers battling more volatile supply chains, as well as demand for a greater variety of goods and product customization. Meeting that demand requires greater flexibility in production and automation that can adapt quickly to a dynamic environment without extended downtime and the costs associated with additional robot programming.
As new adaptive robots are introduced, they will enable rapid adaptation to a greater breadth of products, production lines and changing environments by self-adjusting taught trajectories with no human involvement.
With digital twins, companies can also simulate entire robotic workflows in a virtual environment before deploying them on actual production lines. This significantly reduces the time and expense of the trial-and-error phase of new automation implementations, eliminates the risk of disrupting live production and allows companies to optimize how and where robots can best be introduced.
Looking ahead
Developments in physical AI will lead to robots that can operate more collaboratively and intelligently alongside a human workforce. This will optimize work and reduce some of the need for safety zones around equipment, shrinking the overall footprint of systems. The skillsets required by manufacturers will evolve, reducing reliance on specialized robotics programmers and increasing demand for professionals with AI expertise.
Omdia also expects continued activity on the supply side as robot vendors invest to ensure their portfolios extend beyond manual task automation into decision-making through AI capabilities. We expect further acquisitions and investments, as many hardware vendors have struggled to build up in-house AI capabilities.
As Nvidia CEO Jensen Huang said in 2024, “the next wave of AI is physical AI.” In an interesting twist, as illustrated by this month’s partnership announcements, this time we may see AI companies buying in physical automation capabilities.