AI Hardware & Edge AI Summit: Event Recap and Key Takeaways

After sitting in on 20-odd presentations from innovators and vendors across the technology industry at this year’s AI Hardware & Edge AI Summit, here’s what we at Solidigm learned. Every good AI pitch starts with a graph: a line charging up and to the right.

The statistic might be: 

  • Investment
  • Adoption
  • Use case variety
  • Model proliferation
  • Parameter count
  • GPU compute capability

or one of a host of others – but the graph will be there. 

As surely as retired NBA star Allen Iverson is counting his lucky stars for a pair of initials that have created a slew of unexpected mid-life ad spokesman opportunities, it will be there.

Also inevitable is the enthusiastic refrain “in the era of AI,” whether as a slide title or declared breathlessly from the podium while describing the previously mentioned graph.

People are rightly excited about AI, both for its social and technical potential to reshape our lives for the better and for its business potential to shape the fortunes of the companies that are first to market with meaningful innovations. Certainly, we at Solidigm are excited too.

Energy, scalability, and efficiency

But here’s what else we heard, something that Solidigm has been saying for some time now and that seems to be gaining traction across the ecosystem: The AI rocket ship is headed for a scalability barrier, imposed by the amount of energy available on the planet. Without significant improvements to efficiency, the critical path to AI utopia will be determined by the power budget, not hardware performance or the brilliant ideas of data scientists.

Hosted by the London-based Kisaco Research on Sept. 9-12 in San Jose, the summit kicked off with a day devoted to efficient generative AI. Framing the problem nicely was Dr. Neeraj Kumar, chief data scientist at Pacific Northwest National Laboratory, who described the rate of improvement in AI model capabilities as “Moore’s law on steroids.” Underpinning that progress, Kumar said, is the challenging reality that training a single model may use as much energy as 1,000 U.S. homes consume in a year. He pointed to forecasts that suggest data centers may account for 10% or more of global power use by 2030. These concerns have prompted the U.S. Department of Energy to declare AI energy efficiency a “critical scientific challenge” that his team is studying.

Most who follow AI infrastructure trends will understand the contribution of power-hungry GPUs to the problem as they churn away by the thousands in data centers. But running inference on deployed models may present the bigger challenge, as explained by Ankur Gupta, senior VP at Siemens EDA. He shared World Economic Forum data indicating that up to 80% of AI’s environmental impact falls on the inference side – after all, each individual inference may consume only a tiny fraction of the power required for a training run, but training is generally done once, while inference occurs potentially millions of times. According to Gupta, energy demand is outpacing supply, and we may be headed for a wall by 2040 without significant improvements to device efficiency and workload optimization.
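
To make that training-versus-inference trade-off concrete, here is a minimal back-of-envelope sketch in Python. Every number in it is a placeholder assumption chosen purely for illustration – none of these figures come from Gupta, the World Economic Forum, or any measured deployment. The point is simply that a small per-query energy cost, multiplied by a large query volume, quickly eclipses a one-time training cost.

```python
# Back-of-envelope sketch: one-time training energy vs. cumulative inference energy.
# All constants below are illustrative assumptions, not measured or reported data.

TRAINING_ENERGY_KWH = 10_000_000   # assumed one-time training cost, in kWh
ENERGY_PER_QUERY_KWH = 0.003       # assumed cost of a single inference, in kWh
QUERIES_PER_DAY = 50_000_000       # assumed serving volume


def breakeven_days(training_kwh: float, per_query_kwh: float, queries_per_day: float) -> float:
    """Days of serving after which cumulative inference energy exceeds training energy."""
    return training_kwh / (per_query_kwh * queries_per_day)


def lifetime_inference_share(training_kwh: float, per_query_kwh: float,
                             queries_per_day: float, days: float) -> float:
    """Fraction of total (training + inference) energy spent on inference after `days` of serving."""
    inference_kwh = per_query_kwh * queries_per_day * days
    return inference_kwh / (inference_kwh + training_kwh)


if __name__ == "__main__":
    days = breakeven_days(TRAINING_ENERGY_KWH, ENERGY_PER_QUERY_KWH, QUERIES_PER_DAY)
    share = lifetime_inference_share(TRAINING_ENERGY_KWH, ENERGY_PER_QUERY_KWH,
                                     QUERIES_PER_DAY, days=365)
    print(f"Inference energy overtakes training energy after ~{days:.0f} days")
    print(f"Inference share of lifetime energy after 1 year: {share:.0%}")
```

With these assumed numbers, inference overtakes training energy after roughly two months of serving and accounts for about 85% of the lifetime total after a year – the same ballpark as the 80% figure cited above, which is exactly why deployment efficiency matters so much.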

Data infrastructure to reduce power consumption

What is Solidigm’s position on the AI energy crunch? Don’t overlook your data infrastructure. A recent paper from Microsoft Azure and Carnegie Mellon University suggested that fully a third of data center operational emissions may be attributable to the storage subsystem. For more detail on how our high-density NAND SSDs reduce power consumption compared to legacy storage options, check out our AI page.

Other compelling trends to keep an eye on, as identified by summit speakers:

  • The divergent requirements that enterprises are identifying for fine-tuning and deploying use-case-specific solutions based on foundation models
  • The possibilities offered by a hardware/software co-design approach to performance and efficiency
  • Alternative paradigms for challenging the current compute hegemony, including wafer-scale chips

The conference will return next year as the rebranded AI Infra Summit, slated for Sept. 9-11, 2025. Whatever the next 12 months hold, one thing is certain: It’ll be anything but dull. As Google VP Partha Ranganathan succinctly put it in his session on future systems design: In AI, “Last week was a very busy year.”