Tutors : Romain Rouvoy, Clément Quinton

Studies on Large Language Models' energy consumption and code efficiency

Tristan Coignion

2nd year PhD student

AI and Large Language Models

Large Language Model (LLM):

An AI technology that generates text.

Code assistant: An LLM that integrates into a developer's workflow (e.g. GitHub Copilot)

Helps developers create software


Study on the performance of LLM-generated code

Studied the performance of LLM-generated solutions to Leetcode algorithmic problems

  • No (or very few) differences between LLMs in terms of code speed
  • LLMs appear to produce faster code than humans on average (caution: the evidence is weak)
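To make the comparison concrete, here is a minimal sketch of how the speed of two candidate solutions to a Leetcode-style problem (Two Sum) can be compared. The functions and inputs are illustrative, not the study's actual benchmark:

```python
import timeit

def two_sum_bruteforce(nums, target):
    """O(n^2) baseline: check every pair of indices."""
    for i in range(len(nums)):
        for j in range(i + 1, len(nums)):
            if nums[i] + nums[j] == target:
                return [i, j]

def two_sum_hashmap(nums, target):
    """O(n) candidate, as an LLM typically generates it."""
    seen = {}
    for i, x in enumerate(nums):
        if target - x in seen:
            return [seen[target - x], i]
        seen[x] = i

# Time both candidates on the same input to compare "code speed".
nums = list(range(1000))
slow = timeit.timeit(lambda: two_sum_bruteforce(nums, 1997), number=5)
fast = timeit.timeit(lambda: two_sum_hashmap(nums, 1997), number=5)
```

Timing comparisons of this kind are one way to quantify what "code speed" means when ranking LLM-generated solutions against each other and against human submissions.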

Study on the energy consumption of a code assistant

Below is a depiction of an LLM's electricity usage:

The code assistant's generations happen in the cloud

Their electrical consumption is therefore hidden from the user

Study on the energy consumption of a code assistant

How do we measure?

A human participant develops using GitHub Copilot

We collect GitHub Copilot's telemetry, which allows us to "replay" the generations with other AIs

We replay the generations using multiple AIs and scenarios on servers we own, and measure the electrical consumption
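As a sketch of what "measuring the electrical consumption" can look like: on Linux servers, CPUs expose cumulative energy counters (e.g. Intel RAPL under /sys/class/powercap). The helper below is a minimal, illustrative version; the reader function and file path are assumptions, not our actual tooling:

```python
def measure_energy(workload, read_energy_uj):
    """Run `workload` and return (result, joules consumed).

    `read_energy_uj` must return a cumulative energy counter in
    microjoules, e.g. by reading
    /sys/class/powercap/intel-rapl:0/energy_uj on a machine with
    Intel RAPL support (an assumption; the counter also wraps
    around, which a production tool must handle).
    """
    before = read_energy_uj()
    result = workload()
    after = read_energy_uj()
    return result, (after - before) / 1e6  # microjoules -> joules

# Example with a fake counter standing in for the RAPL file:
fake_counter = iter([1_000_000, 3_500_000])
result, joules = measure_energy(lambda: "generation",
                                lambda: next(fake_counter))
```

Sampling the counter before and after each replayed generation is what lets the per-generation energy cost be attributed to a given AI and scenario.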

Study on the energy consumption of a code assistant

Next steps

  1. Performance of software produced by LLMs
     
  2. Environmental impact of using a code assistant
     
  3. Making the code assistants use less electricity

Thank you for listening!

Team Seminar Autumn 2023

By Tristan Coignion
