The first AI chip that both learns and infers has been developed, combining FeCAPs (ferroelectric capacitors) and memristors for efficient, adaptive edge computing.
According to Robin Mitchell, one of the most significant barriers to progress in artificial intelligence is not algorithmic, but physical.
The challenge lies in running AI systems efficiently on hardware: current approaches rely on vast amounts of specialized silicon, massive energy consumption, and complicated cooling systems.
This approach is unsustainable, and if AI continues to scale at its current pace, it could lead to serious resource shortages.
Data centers already strain power grids, and the amount of heat generated requires industrial-scale cooling, often supported by water-intensive processes that introduce environmental and economic issues.
Training remains the most energy-intensive AI process.