SAN JOSE: A single text prompt to Google’s artificial intelligence (AI) software, Gemini, consumes roughly as much electricity as watching television for just under nine seconds, the company said on Thursday, reported German Press Agency (dpa).
Google estimated the median energy use at 0.24 watt-hours per text prompt and said around five drops of water, or 0.26 millilitres, are also used to cool the data centres that power the AI.
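As a back-of-envelope check (assuming a flat-screen television drawing roughly 100 watts, a figure not stated in the report), the comparison holds:

\[
t = \frac{0.24\ \text{Wh}}{100\ \text{W}} \times 3600\ \tfrac{\text{s}}{\text{h}} \approx 8.6\ \text{s}
\]

which is consistent with Google's "just under nine seconds" of television viewing.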
Concerns over the growing electricity and water consumption of artificial intelligence applications have been raised for years. The tech industry is seeking to reassure the public, pointing out, for example, that data centres are becoming increasingly efficient.
OpenAI, the creator of ChatGPT, offered a similar comparison in June, estimating that an average AI prompt consumes 0.34 watt-hours – roughly the same as running an oven for a second.
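A similar rough calculation, again a sketch since the report does not state the assumed appliance rating, shows the oven power implied by OpenAI's comparison:

\[
P = \frac{0.34\ \text{Wh}}{1\ \text{s}} \times 3600\ \tfrac{\text{s}}{\text{h}} \approx 1.2\ \text{kW}
\]

which corresponds to a small oven, or a larger oven element running at partial power.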
Even though individual queries are likely to require less energy over time thanks to improvements in chip and server technology, the sheer volume of AI usage is driving a sharp rise in power demand at data centres.
Moreover, the figures from both Google and OpenAI do not account for the massive amounts of electricity consumed during the initial training of AI models on huge datasets. – BERNAMA-dpa