GenAI is a Bazooka at a Knife Fight

  • Katrina Ingram
  • 5 days ago
  • 4 min read

Updated: 4 days ago

In early September, Ontario Premier Doug Ford attempted to pour out a whole bottle of Crown Royal as a publicity-stunt protest of an upcoming plant closure that would impact jobs. It had the intended effect of garnering media attention. It also got me thinking about the ways in which we are wasteful, and how this idea of wastefulness is entangled with Generative AI.


There’s growing concern about the environmental impacts of AI. We’re hearing about the push for more data centres, as well as the enormous amounts of water and energy needed to power them. That alone should raise alarm bells, especially as companies abandon their net-zero climate pledges in order to chase computation.


But there’s another layer to this concern. Using a lot of resources might be fine in pursuit of something incredibly worthwhile - a well-defined goal that would unequivocally benefit humanity. Instead, we seem to be chasing vague notions of productivity and efficiency to complete low-level administrative tasks.


Bazooka at a knife fight


Consider this analogy - bringing a bazooka to a knife fight. Instead of bringing your trusty blade, you show up with a military-grade rocket launcher and go scorched earth on your opponent. Yes, it does the trick, but there’s also a lot of collateral damage and waste. Now let’s look at what we’re doing with GenAI…


Instead of doing our own writing, we turn to ChatGPT - a tool so computationally intensive that producing 100 words of text requires the equivalent of a bottle of water to cool the servers.

Instead of calling a friend to brainstorm, we prompt ChatGPT to ideate, generating iteration after iteration. With billions of prompts a day, this starts to add up. ChatGPT’s daily power use is estimated to be the equivalent of powering 180,000 US households.


We could keep going - images, video, audio, code - there’s no shortage of examples. Everyone can produce more content (add that to time saved), but we still need to review it for accuracy (subtract that from time saved). At the end of the day, is the value of producing a bunch of content really worth all of these material resources for Every. Single. Little. Task?


Everything has a cost


We already do a zillion Google searches a day, and rely on cloud computing so much that when AWS goes down, people can’t work, bank, or adjust their smart beds (really?). How is generative AI all that different from the myriad ways we already blow through resources to make our lives a little more convenient, efficient or comfortable?


The short answer - it’s a change in scale. We might be 10xing the costs of using digital technology.


By weaving generative AI into every aspect of our lives, we turn every situation in which a knife might have served us perfectly well into a place where a bazooka becomes our default choice. The more we embrace a ‘bazooka first’ mentality, the bigger the collateral damage.

There are places where it’s appropriate to use a bazooka. A rocket launcher has utility. Bruce Cockburn riffed on getting one to right a few wrongs. However, at this point in time, we’re not being thoughtful enough about which situations actually demand heavy artillery. We should be asking ourselves: do we really need a Large Language Model just because we don’t feel like writing our own marketing copy? Just because we’d prefer a summary rather than reading a long document?


Lest you think I’m trying to sit above the fray, I’ll confess that I’m not immune to the siren song of generative AI. But when I have succumbed to using it because I don’t want to put in the effort, I feel the same way I do when I have microwave popcorn for dinner instead of being a responsible adult and making a proper meal.


Part of the problem is that we have not been offered any other options. The big tech companies that build AI tools have their own agenda. It should not be surprising that the companies whose core business is cloud computing - providing data storage and computation - are the same companies that have developed large language models which just happen to use a lot of data and computation! They can sell more of their core services as organizations roll out AI tools. Trillions of dollars are being spent to build out AI infrastructure. A handful of AI-related companies are propping up the S&P 500. This really big bet had better pay off, and the way it pays off is by getting us to forget about the costs and just embrace it. Feel the AI.


But what about…


The sales pitch is that once we offload all those ‘chores’, we will be able to level up and do the really important work - the game-changing stuff that might justify our use of generative AI despite its many impacts. This line of thinking is based on speculation. I know that when I have more time on my hands, I don’t always spend it well. I think social media offers a general example of how that works at the societal level.


There’s also the idea that we can just ‘tech-harder’ and use AI to solve the climate issues that have been exacerbated by AI. That too is speculation. I do think we could make better design choices - smaller models, ethically acquired data, and renewable energy sources - in order to minimize impacts. Coupled with wiser use, maybe we can find a happy middle ground.


By Katrina Ingram, CEO, Ethically Aligned AI

 

Ethically Aligned AI is a social enterprise aimed at helping organizations make better choices about designing and deploying technology. Find out more at ethicallyalignedai.com

© 2025 Ethically Aligned AI Inc. All rights reserved.