CORVALLIS, Ore. (KATU) — From Gemini to ChatGPT, AI uses a lot of energy. The International Energy Agency projected that electricity consumption by data centers could double by 2026, reaching 1,000 terawatt-hours, comparable to Japan’s total annual electricity consumption.
But not all hope is lost: Oregon State University announced the development of a new chip designed to cut in half the energy consumed by artificial intelligence’s large language models.
According to OSU, data rate demands keep increasing, while improvements in the energy required to transmit that data have not kept pace…