Green Prompting: How you can keep the energy consumption of AI in check

The article Green Prompting: How you can keep the energy consumption of AI in check first appeared at the online magazine Basic Thinking. You can start the day well every morning via our newsletter update.


After the hype around the capabilities of language models, the problems are increasingly coming to the fore. Among them: false information in the form of hallucinations and the enormous energy consumption of AI. The user inputs, the so-called prompts, determine how much energy artificial intelligence consumes.

When it comes to the energy consumption of AI, the focus is on the so-called inference process, that is, generating answers to user queries. According to a recent study titled “Green Prompting”, however, the energy hunger of language models can be kept in check by deliberately formulated inputs.

Prompts: What influences the energy consumption of AI?

According to the results, the energy consumption of AI can vary significantly depending on the input. The length, the complexity and the task type of user inputs have a significant impact. However, a longer prompt does not necessarily cause higher energy consumption.

Rather, the semantic content of the inputs is crucial. A prompt like “Explain quantum mechanics using three examples so that a twelve-year-old understands it” consumes significantly more energy than a simple yes/no question, even if both are the same length.

According to the researchers, creative and open-ended tasks above all drive up electricity consumption, since AI models have to generate longer and more complex texts for them. This applies, among other things, to storytelling, philosophical discussions or opinion pieces.
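The relationship described above, with energy scaling with the length of the generated output rather than the input, can be illustrated with a rough back-of-the-envelope estimate. The per-token energy figure below is a hypothetical placeholder for illustration, not a measured value from the study:

```python
# Rough sketch: inference energy grows with the number of generated
# output tokens, not primarily with the length of the input prompt.
# ENERGY_PER_TOKEN_J is a hypothetical placeholder, not a measured value.

ENERGY_PER_TOKEN_J = 0.3  # assumed joules per generated token (illustrative only)

def estimated_energy(output_tokens: int) -> float:
    """Very rough energy estimate for one AI response, in joules."""
    return output_tokens * ENERGY_PER_TOKEN_J

# A terse yes/no answer vs. an open-ended explanation for a twelve-year-old:
short_answer = estimated_energy(output_tokens=5)
long_answer = estimated_energy(output_tokens=400)

print(f"Yes/no answer:     ~{short_answer:.1f} J")
print(f"Open-ended answer: ~{long_answer:.1f} J")
print(f"Factor:            {long_answer / short_answer:.0f}x")
```

Under this (assumed) linear model, an answer that is 80 times longer costs roughly 80 times the energy, regardless of how long the prompt itself was.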

Certain words or phrases in a prompt can also drive up the energy consumption of AI. OpenAI boss Sam Altman, for example, replied to a user on X (formerly Twitter) that ChatGPT causes several million US dollars in electricity costs just because users say please and thank you. According to the study, however, this effect depends on the model and the task.


Green Prompting: How you can save energy with prompts

With so-called “green prompts”, users can not only achieve better results but also help to keep the energy consumption of AI in check. According to the study results, even small adjustments by users have an impact on the energy efficiency of language models, and thus indirectly on the CO2 footprint of AI. The following tips can be helpful:

  1. Precise and targeted formulations: Instead of asking vague or very open questions (“Tell me something about environmental protection”), it is much more efficient to formulate prompts as precisely as possible. An example: “Name three short facts about CO₂ emissions from cars”. A clear task not only leads to better answers but also saves computing power. Rambling and ambiguous inputs should be avoided.
  2. Use context sparingly: Long background texts or conversations can massively increase energy consumption during prompting. The reason: long conversations force an AI model to process all the previous information again. If possible, the context should be limited to the bare essentials.
  3. Avoid repetitions: Instead of asking several consecutive follow-up questions (“What do you mean exactly?”, “Can you explain that again?”), it is more ecological to formulate a precise prompt from the start. The reason: many corrections increase energy consumption.
  4. Prefer simple answer formats: Structured response formats such as lists, tables or yes/no questions can keep the power consumption of AI in check. The reason: clear specifications of what an answer should look like can usually be processed faster and in a more resource-saving way. This also applies to requirements for the length of the answer.
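The four tips above can be sketched as a small helper that trims conversation context and constrains the answer format before a request is sent. The function and parameter names here are illustrative assumptions, not part of any real AI API:

```python
# Sketch of "green prompting" hygiene: keep context short, ask precisely,
# and constrain the answer format. All names here are illustrative.

def build_green_prompt(question: str, history: list[str],
                       max_context_messages: int = 2,
                       answer_format: str = "three short bullet points") -> str:
    """Assemble a compact prompt from a question and a trimmed history."""
    # Tip 2: use context sparingly - keep only the most recent messages.
    trimmed = history[-max_context_messages:]
    context = "\n".join(trimmed)
    # Tips 1 and 4: a precise task plus an explicit, simple answer format.
    prompt = f"{context}\n{question}\nAnswer as {answer_format}."
    return prompt.strip()

prompt = build_green_prompt(
    question="Name three short facts about CO2 emissions from cars.",
    history=["(long earlier conversation)", "msg A", "msg B"],
)
print(prompt)
```

Asking one precise, format-constrained question (tip 3) replaces several rounds of follow-up corrections, and the trimmed history keeps the model from reprocessing the entire conversation each time.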




