What are some ways to estimate how much (bad) effect my AI use is having on the environment, even if it's only a rough estimate? I use Claude Code, ChatGPT and Gemini.
1. Are there any APIs or datasets that can tell me the per-API-call impact (e.g., X kWh per million tokens, gallons of water per million tokens, noise levels, etc.)?
2. Are there any tools I can install on my computer that can track (again, even roughly) how many tokens I am sending to LLMs?
And so on.
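To make the question concrete, here's the kind of back-of-envelope math I have in mind. The coefficients below are made-up placeholders, not vendor-published numbers; I'd swap in whatever estimates turn out to be credible:

```python
# Rough back-of-envelope: convert a token count into energy/water estimates.
# WARNING: both coefficients are PLACEHOLDER assumptions for illustration,
# not figures published by any provider.

WH_PER_1K_TOKENS = 0.3   # assumed watt-hours per 1,000 tokens (placeholder)
LITERS_PER_KWH = 1.8     # assumed liters of cooling water per kWh (placeholder)

def estimate_impact(total_tokens: int) -> dict:
    """Return rough energy (kWh) and water (liters) for a token count."""
    kwh = total_tokens / 1000 * WH_PER_1K_TOKENS / 1000
    return {"kwh": kwh, "liters": kwh * LITERS_PER_KWH}

print(estimate_impact(1_000_000))  # one million tokens
```

So with those (invented) numbers, a million tokens would be roughly 0.3 kWh and half a liter of water. What I'm really asking is where to get defensible values for those two constants.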
Is it possible to do the same across a network? Say my company has 50 employees: how would I track the total?
We can ignore LLMs running locally for this experiment.
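For the company-wide case, I imagine it reduces to collecting per-employee token counts (from API usage logs or a proxy) and summing them, then applying the same kind of placeholder coefficient. A minimal sketch, assuming the counts land in a CSV with hypothetical columns `employee,tokens`:

```python
import csv
import io

# Hypothetical input: one row per employee with their monthly token count.
# The CSV columns and the per-token coefficient are assumptions for
# illustration only.
WH_PER_1K_TOKENS = 0.3  # placeholder assumption, as above

sample = io.StringIO("employee,tokens\nalice,250000\nbob,400000\n")
total_tokens = sum(int(row["tokens"]) for row in csv.DictReader(sample))
kwh = total_tokens / 1000 * WH_PER_1K_TOKENS / 1000

print(total_tokens, round(kwh, 4))  # 650000 tokens ≈ 0.195 kWh (with made-up numbers)
```

The hard part, as far as I can tell, is not the arithmetic but getting trustworthy token counts from 50 machines and a coefficient that isn't pure fiction.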
I fully understand that MegaCorps like Google, OpenAI etc. will not share how much they are fucking up the environment; they won't disclose the resources used (unless the numbers make them look good). I am only trying to get an idea, even if it is very rough.