Locally run AI systems, known as ‘LLMs on the edge’, could help ease the strain on data centers, but it may take some time before this approach goes mainstream.
There has been plenty of coverage of the power problem AI poses for data centers. One way to ease the strain is ‘LLMs on the edge’, an approach that runs AI models natively on PCs, tablets, laptops, and smartphones rather than in the cloud.
The obvious benefits of LLMs on the edge include lower inference costs, reduced query latency, enhanced user privacy, and improved reliability, since queries never have to leave the device. (DCK)
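To make the idea concrete, here is a minimal sketch of on-device inference using the open source llama-cpp-python bindings, one common way to run quantized models locally. The model file path is a placeholder for whatever GGUF-format model has been downloaded to the device, not a specific recommendation.

```python
# Minimal sketch of on-device LLM inference with llama-cpp-python
# (pip install llama-cpp-python). The model path below is a placeholder;
# any small quantized GGUF model stored on local disk would work.
from llama_cpp import Llama

# Load the quantized model entirely from local storage -- no cloud API calls.
llm = Llama(
    model_path="models/llama-3-8b-instruct.Q4_K_M.gguf",  # hypothetical file
    n_ctx=2048,  # context window size
)

# Run the query on the device itself, keeping the prompt (and any
# private data it contains) off the network.
output = llm(
    "Summarize the benefits of running LLMs on local hardware.",
    max_tokens=128,
)
print(output["choices"][0]["text"])
```

Quantized 4-bit models like the one assumed above trade a little accuracy for a memory footprint small enough to fit on a laptop or phone, which is what makes edge deployment feasible in the first place.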