Google and Microsoft are looking to improve search engine results with large language models they say will distill “complex” information while responding in a human-like fashion to queries.
Merging AI with the load of search engine queries could increase the computing power companies like Google and Microsoft require by up to five times, experts told Wired.
“It requires processing power as well as storage and efficient search,” Alan Woodward, professor of cybersecurity at the University of Surrey, told Wired. “Every time we see a step change in online processing, we see significant increases in the power and cooling resources required by large processing centers. I think this could be such a step.”
The new search engines will also require more data centers to store data. Martin Bouchard, founder of data center company QScale, told Wired that AI would result in “at least four or five times more computing per search.”
In a statement to Insider, Jane Park, a spokesperson for Google, said the company would initially be launching a “lighter” version of Bard that would require less computing power.
“We have also published research detailing the energy costs of state-of-the-art language models, including an earlier and larger version of LaMDA,” Park said in a statement.
“Our findings show that combining efficient models, processors, and data centers with clean energy sources can reduce the carbon footprint of a ML system by as much as 1000X.”