
Never Use Generative AI

July 12, 2024 — Last Updated: July 27, 2024



Hey there, if you’re reading this because you were redirected here from my Commissions page, then let me cut to the chase and tell you what I can accept as reference images:

  • A crudely drawn sketch, no matter how messy it is
  • Stock photos*
  • Picrew/dollmaker images
  • Screenshots from a character creator builder in a video game
  • Official art/screenshots from an established IP*
*CAUTION: This is where you might stumble upon generative AI images.

Every single example is infinitely more creative—and above all else—more human than anything made by a generative artificial intelligence system, aka genAI. I am putting my foot down on this now because genAI has progressed past the phase where it was merely pathetically corporate. Today, genAI is actively harmful to our environment, and it will stay that way for the foreseeable future, because this is a multifaceted problem.

Ten years ago, genAI was harmless (and stupid) enough to be used for memes or jokes because the general populace dug into genAI expecting to get a jumbled mess. I knew this when I watched someone receive a generated manuscript that was essentially fan fiction. I knew this when I made a short video meme with a genAI system that could imitate character voices from SpongeBob SquarePants. Generative AI was a technology used for fun until the 2020s—and it was just as stupid back then as it is today. But genAI is a threat now because the corporate world is taking it seriously as a capitalist apparatus. Memes die as soon as corporations use them for capital gain, and that’s most assuredly what happened to genAI.

This comic strip is direct, but it sums it up perfectly.


No Thoughts, Head Empty

At first, genAI posed some moral issues with its usage because most genAI models pooled content from any website on the internet, so there’s no telling what content a genAI model used to generate something. Any user-generated content posted online has likely been scraped for genAI training, including images, videos, audio, and text. That alone should be a terrifying thought because it gives the immediate impression that generative AI models couldn’t care less about human output, favoring computer-generated content made without human involvement...while at the same time basing their entire knowledge base on anything that a human created in the past! That’s how you get distorted generative AI images that don’t understand human anatomy or voicebanks that sound eerily similar to a real-life celebrity.

Before I keep going, I should stress an important distinction. AI is short for “artificial intelligence,” a term that has described various computer algorithms for decades.

  • We’ve used search engines for as long as Google has existed. These systems pool data together and return a list of websites based on an inputted search term.
  • Video games do this all the time! Developers create AI systems that determine how non-playable characters behave, or build other bespoke systems like in-game character creators.
  • Spell checkers built into Microsoft Word or external systems like Grammarly can help improve your spelling and grammar.
(These are all rudimentary examples, but they go to show that we’ve been used to this kind of thing for a long time.)

Another fine example can be found in Spider-Man: Into the Spider-Verse. The various pen strokes that define the features of a character’s face come from a pool of data composed of hand-drawn pen strokes. A machine learning system is trained on this library of linework so that the lines can be rendered in real time without a 2D artist painstakingly overlaying pen strokes on every frame of the 3D renders. There are many more use cases I’m unaware of, some of them referenced in an article by The Conversation, like smart irrigation or detecting toxic contaminants in drinking water, so I can see AI systems working in our favor.

AI, in general, can be used with internally sourced data or for largely innocuous purposes. It has existed in our day-to-day lives for years because, at the end of the day, AI is just a tool. I am perfectly fine with this when it simplifies a human’s workload or improves productivity.

However, people and companies are taking that logic to its absolute extreme through genAI—to the detriment of everyone involved. Unlike a standard and vaguely defined AI system, generative AI systems are supposed to “create” something new from the data you feed them. So just like any other tool, AI can be used with different intents, even if that comes at the expense of others.

Generative AI models require a ton of digital content to train on so they can output something that resembles it, which means that genAI at large is really good at imitating something rather than creating something new that only a human can conceive. It reminds me of that genAI animation made by Corridor Digital, where they made their own version of a 2D animation by, in essence, transforming footage of live actors into an uncanny-looking anime, using genAI like an Instagram filter. Their video used a vast collection of screenshots from Vampire Hunter D: Bloodlust (currently, you can watch this film for free, by the way), which explains why their outputted footage looks incredibly detailed but is a mere facsimile of the anime they shamelessly ripped off. Corridor’s final video looked like a mess, and there’s no way they could perfectly replicate an intricately hand-drawn animated film from 2000 without learning traditional 2D animation.

Examples like Corridor Digital drive home the point that genAI cannot think, much less create. It can only regurgitate something based on what you feed it. Human thought can reason, render, and observe details when making artwork; it is capable of self-expression and exhibiting a particular feeling. Generative AI, however, cannot do any of that because it is purely observational without any logical reasoning or self-expression.

I couldn’t care less about how “good” a generated AI piece turns out because if you used genAI, then that tells me that you are morally and creatively bankrupt. The joy behind artwork comes not from learning how long and hard the process is, or from the trial and error that comes with it, but from how satisfying it is to look at something you made and think, “Wow! I did that!” And then you can take the lessons you’ve learned to try something different or make something better than what you did before.

Generative AI reduces that entire process to a string of text prompts, which completely robs you of the ability to understand why and how people learn to create illustrations, photography, literature, music, and artwork in general. It’s also unreliable because genAI chatbots, for example, can state factually incorrect claims with the utmost confidence. Nobody should rely on whatever bullshit a genAI tells you when science communities find it so untrustworthy that they will always fact-check it.

GenAI is the definition of “no thoughts, head empty” because if something needs a massive pool of data to make a mountain out of a molehill, then it cannot think like a human. Furthermore, I would go so far as to say that you shouldn't even be using genAI for developmental or conceptual purposes because of reasons like…


Replacing Artists with genAI

There have been numerous instances in the industry where people used genAI to make assets for animations, film, and advertisements. Does this mean that genAI will be coming for every artist’s job? I sure hope not. However, genAI usage has already been actively replacing jobs in other industries, based on this CNN article, so I suppose anything is possible. Still, I imagine that if genAI does replace an artist at a company, it would be because of some executive wingnut who understands nothing about the creative process. In my experience with genAI discussions, the most praise genAI gets comes from people who are not experienced creative artists. It’s sad, considering that artists, in general, have frequently been getting the short end of the stick.

These guys have mastered the art of clickbait. Congratulations! You got me. Now it is MY turn to get you. 🫵

As if making a genAI animation wasn’t already insulting enough, Niko from Corridor Digital claims that 2D animation is “the most creatively liberated medium [that] is also the least democratized.” This quote bothers me just as much as it confuses me, but it’s not nearly as bad as the title of the BTS video that it comes from: Did We Just Change Animation Forever? To answer that question: no, you didn’t. You just found a way for techbros like you to bypass art production without hiring experienced artists who know what they’re doing. This video feels weird in isolation, but it makes sense that they would do this. Corridor Digital is a video production crew known for their expertise in visual effects (aka VFX), which is often a technical artist field, so the most logical step they could take in imitating animation was to look for a system that’s good at doing that.

But much like film and animation, entertainment production is a multi-disciplinary group effort. You can’t make a big-budget film with just a director and actors; you need camera operators, audio engineers, lighting artists, costume designers, etc. You can’t create a 2D animation with just the animators; you need colorists, storyboard artists, sound designers, composers, compositors, etc. You can’t make a video game quickly with just one guy; you need coders, sound designers, game designers, quality assurance testers, etc. Nobody is truly alone in this because they always need outside help from at least one person.

If you know that entertainment production is a multi-disciplinary task—especially knowing how rough the animation industry has been lately—then you’ll see why Niko’s breakthrough sentiment reads more like a backhanded compliment. However, the sheer incompetence coming from techbros leveraging genAI is not limited to just Corridor Digital. That YouTube channel, at the very least, is merely popularizing (and cashing in on) the genAI trend. In creative circles and industry workplaces, people have taken genAI as an excuse to cut legitimate artist expertise out of the equation (even if that means cutting out actual artists themselves) and consider this a shortcut.

Remember when a guy won a contest award because he submitted a genAI piece? That same guy believes that “illustration design jobs are very tedious,” and as advice to new and emerging artists, he says, “It’s not about being artistic, you are a tool.” Dude, if you come from a computer science & finance background and you’re saying this bullshit to artists, to people trying to get into a field that you have zero experience in, then it is you who is being a tool. He’s not wrong about illustration design being tedious, but I’ve already been over this.

This is what that guy’s genAI image looks like. He won $300 USD out of it.

These are the kinds of people that existing artists have to deal with. That jerkwad contest winner is far from the only person who thinks that way. Just do a quick search on Twitter and I’m sure you’ll find a dozen people like him in no time. While genAI has pulled creative fields into a social (and occasionally industrial) quagmire, this pales in comparison to the final point I want to make.


Electricity & Water Hoarding

Three years ago, I tried to write a video essay about how resource-hungry the cryptocurrency industry is (among other things that are wrong with it), but after spending a year working on it, the topic felt mostly irrelevant, so I never finished it beyond the script. Remember when that was a thing? Especially when NFTs got their moment of mainstream popularity? It used to be that crypto mining facilities, which are industrial-scale sites with rows of graphics cards outfitted for mining, were the largest power consumers in the tech industry. Nowadays, AI data centers, which run a similar hardware setup, are positioned to overtake crypto mines as enormous power consumers.

A 2024 report by the International Energy Agency (IEA) estimated that data centers worldwide consumed 460 terawatt-hours (TWh) in 2022. An article by The Verge, drawing on the IEA’s study, points out that this figure could climb to 620-1,050 TWh by 2026. Crypto mining, by comparison, consumed 100-150 TWh in 2022. Before the IEA gathered its hard estimates on crypto mining power consumption, I looked at ongoing energy consumption graphs, like the Bitcoin Energy Consumption Index by Digiconomist. This website helps contextualize how energy-intensive Bitcoin is, but the overall power consumption estimates would be slightly higher if you account for all cryptocurrencies.
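To put those numbers side by side, here’s a quick back-of-the-envelope comparison in Python. It only plugs in the rounded estimates cited above, so treat it as illustrative arithmetic rather than official data:

```python
# Rough comparison of the consumption figures cited above.
# All values are in terawatt-hours (TWh) and are rounded estimates.

data_centers_2022 = 460          # IEA estimate for data centers in 2022
data_centers_2026 = (620, 1050)  # projected range for 2026
crypto_2022 = (100, 150)         # crypto mining estimate for 2022

# Data centers already used roughly 3-4.6x the power of crypto mining in 2022...
ratio_low = data_centers_2022 / crypto_2022[1]   # ~3.1
ratio_high = data_centers_2022 / crypto_2022[0]  # ~4.6

# ...and the projected 2026 range is roughly 1.3-2.3x the 2022 figure.
growth_low = data_centers_2026[0] / data_centers_2022   # ~1.3
growth_high = data_centers_2026[1] / data_centers_2022  # ~2.3

print(f"2022: data centers used {ratio_low:.1f}-{ratio_high:.1f}x the power of crypto mining")
print(f"2026 projection: {growth_low:.1f}-{growth_high:.1f}x their own 2022 consumption")
```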

As for water consumption, Digiconomist’s current estimate for Bitcoin’s annual water consumption is over 684 billion gallons of water (or about 2,592 gigaliters). I could only find future estimates for the amount of water that genAI could consume, with one study suggesting that global water demand from AI could reach somewhere between 1.1 and 1.7 trillion gallons (4.2 to 6.6 billion cubic meters) by 2027.
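If you want to double-check the unit math, here’s a small Python sketch. The gallon-to-liter constant is the standard US definition, and the inputs are just the rounded figures quoted above, so small differences from the published numbers come down to rounding:

```python
# Sanity check on the unit conversions quoted above.
# 1 US gallon = 3.785411784 liters (standard definition).
GALLON_TO_LITER = 3.785411784

# Bitcoin: ~684 billion gallons per year -> gigaliters (1 GL = 1e9 L)
bitcoin_gallons = 684e9
bitcoin_gigaliters = bitcoin_gallons * GALLON_TO_LITER / 1e9
print(f"Bitcoin: ~{bitcoin_gigaliters:,.0f} GL of water per year")  # ~2,589 GL

# Projected AI water demand by 2027:
# 1.1-1.7 trillion gallons -> billions of cubic meters (1 m^3 = 1,000 L)
for gallons in (1.1e12, 1.7e12):
    billion_m3 = gallons * GALLON_TO_LITER / 1000 / 1e9
    print(f"{gallons / 1e12:.1f} trillion gallons is about {billion_m3:.1f} billion cubic meters")
# Prints roughly 4.2 and 6.4, in line with the study's 4.2-6.6 billion m^3 range.
```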

However, one good reason why cryptocurrencies (and crypto-adjacent systems like NFTs) have died down in general popularity is that they’re not user-friendly. Investing in crypto coins requires a firm understanding of finance, and I just know that not everybody is interested in stocks or financial trading. At this point, I don’t think you should fret about a bunch of libertarian dudebros who consume computer hardware for fun. It’s the corporations you should be worried about—because, as we all know—it’s all fun and games until the big companies that control our lives get involved.

It always comes back to corporations being incompetent and inconsiderate of the environment, doesn’t it? First, big tech corporations like Microsoft, Google, Amazon, Meta, Adobe, Nvidia, and numerous others have forced genAI or chatbots down everybody’s throats. Of course, most of these companies have been using popular genAI systems or large language models from the likes of OpenAI, Midjourney, and more. GenAI’s proliferation in our daily lives is so bad that Google is less reliable than ever. I don’t think I can trust Google to give me good image search results without running into a genAI image.

Second, these same tech companies are all buying renewable energy sources and land developments for proprietary data centers. They do this in the name of being “carbon neutral,” but I’ve long known that this is usually a farce. Companies aim for carbon neutrality to wash their hands of running on fossil fuel energy sources instead of combating energy-intensive systems. At worst, they’re often building new energy-intensive systems! Like genAI!! Corporations are not accomplishing anything by proclaiming to be carbon neutral; they’re just delaying the inevitable effects of climate change, if not compounding them.

I don’t think that tech companies are stroking whatever Shrek’s implying 🙄, but they might as well be stroking their ego.

Case in point: the companies that keep buying land for AI data centers, and renewable energy sources to power them, are leaving little to no room for the general population to get electricity, forcing some energy producers to turn back to fossil fuel plants. Also, AI data centers are becoming major freshwater consumers because they use water as a coolant for their hardware. Sure, the water used by a liquid cooling system is negligible for one desktop computer, but not for an entire data center. Companies that consume tons of water are dangerous, considering that arid regions like parts of Mexico are confronting water scarcity due to droughts, sometimes forcing locals to buy water, and regions like Arizona usually don’t have a game plan for when they run out of water.

Ultimately, I believe that these corporations are still advancing the effects of climate change: they’re taking up land for power-hungry data centers that run a technology nobody needs, they’re endangering vital resources like water, and they’re pushing energy producers back to fossil fuels. Also, if you admit that your new technology needs a breakthrough energy source to make it viable (especially if you’re the CEO of OpenAI), despite what researchers have said for years, let me be the first to say that it does not instill confidence! Either you’re blind to the consequences of your actions, or you’re dangerously stupid.

If the moral and job security concerns were not bad enough, environmental endangerment is the last straw. I don’t see anything to gain from generative AI if it’s widely abundant but highly unreliable and resource-intensive. This is why I am now taking a stand: not just avoiding genAI and chatbots myself, but also standing against anyone who uses them. Contributing to a system that appears beneficial but is ultimately harmful will only intensify its popularity, especially if it benefits a capitalist corporation. So the easiest thing you can do is stop using it, period. Do not hang around people who unashamedly use genAI. I know there are systems people still rely on that are propped up by profit-driven corporations, but generative AI is certainly not one of them.

To bring it back to my art commissions: I will reject any reference images made with generative AI, because if genAI can waste a ton of resources, then I can waste the time of anyone who uses it. I’m sorry, but use your noggin for once. Don’t depend on a genAI system or an unfiltered Google search; make me a shitty sketch instead, because hey, guess what?

My job as an artist is to make something that only a human can create.

Support artists, y’all. They will be thankful that you did. 😘







Works Cited

Alexander, Elliot. “Google Is Taking a Huge Risk With AI Search.” XDA, 28 June 2024, www.xda-developers.com/google-is-taking-a-huge-risk-with-ai-search

@alexkrokus. “Life of a meme.” Twitter, twitter.com/alexkrokus/status/1686018963250040833

Barr, Alistair. “Llamas Don’t Drink Much Water. Meta’s New AI Version Is Damn Thirsty.” Business Insider, 19 July 2023, www.businessinsider.com/meta-llama2-ai-uses-almost-twice-water-as-llama-2023-7

Burford, Doc. “Using ChatGPT and Other AI Writing Tools Makes You Unhireable. Here’s Why.” Medium, 26 July 2023, docseuss.medium.com/using-chatgpt-and-other-ai-writing-tools-makes-you-unhireable-heres-why-d66d33e0ddb9

@CaptDedEyes. “Hey, coming at you late at night just to say that I am OFFICIALLY shelving this video essay project.” Twitter, 17 Aug. 2022, x.com/CaptDedEyes/status/1560104025814695936

Cole, Samantha. “Netflix Made an Anime Using AI Due to a ‘Labor Shortage,’ and Fans Are Pissed.” Vice, 1 Feb. 2023, www.vice.com/en/article/bvmqkv/netflix-anime-dog-and-the-boy-ai-generated-art

Collins, Katie. “Google’s AI Push Puts Climate Goals in Jeopardy. It Could Do so Much Better.” CNET, 5 July 2024, www.cnet.com/tech/services-and-software/googles-ai-push-puts-climate-goals-in-jeopardy-it-could-do-so-much-better

Corridor Crew. “Did We Just Change Animation Forever?” YouTube, 26 Feb. 2023, www.youtube.com/watch?v=_9LX9HSQkWo

Corridor Digital. “ANIME ROCK, PAPER, SCISSORS.” YouTube, 26 Feb. 2023, www.youtube.com/watch?v=GVT3WUa-48Y

Crawford, Kate. “Generative AI’s Environmental Costs Are Soaring — and Mostly Secret.” Nature, vol. 626, no. 8000, Feb. 2024, p. 693. https://doi.org/10.1038/d41586-024-00478-x

Digiconomist. “Bitcoin Energy Consumption Index.” Digiconomist, 10 Jan. 2024, digiconomist.net/bitcoin-energy-consumption

Duboust, Oceane. “'Unreliable research assistant’: False outputs from AI chatbots pose risk to science, report says.” Euronews, 24 Nov. 2023, www.euronews.com/next/2023/11/20/unreliable-research-assistant-false-outputs-from-ai-chatbots-pose-risk-to-science-report-s.  Archived version: https://archive.ph/IysEU

Egan, Matt. “AI is replacing human tasks faster than you think.” CNN, 20 June 2024, www.cnn.com/2024/06/20/business/ai-jobs-workers-replacing

“Electricity 2024 – Analysis.” IEA, 1 Jan. 2024, www.iea.org/reports/electricity-2024

Gupta, Joyeeta, et al. “AI’s excessive water consumption threatens to drown out its environmental contributions.” The Conversation, theconversation.com/ais-excessive-water-consumption-threatens-to-drown-out-its-environmental-contributions-225854

Hale, Craig. “Adobe Users Are Furious About the Company’s Terms of Service Change to Help It Train AI.” TechRadar, 7 June 2024, www.techradar.com/pro/adobe-users-are-furious-about-the-companys-terms-of-service-change-to-help-it-train-ai

Halper, Evan, and Caroline O’Donovan. “AI Is Exhausting the Power Grid. Tech Firms Are Seeking a Miracle Solution.” Washington Post, 24 June 2024, www.washingtonpost.com/business/2024/06/21/artificial-intelligence-nuclear-fusion-climate. Archived version: https://archive.ph/BTn4G

@jzellis. “Almost 25 years ago, I moved to Las Vegas.” Twitter, 3 Mar. 2024, twitter.com/jzellis/status/1764274817455255712

Lang, Jamie. “2024 Animation Industry Layoff Tracker.” Cartoon Brew, 8 Mar. 2024, www.cartoonbrew.com/artist-rights/2024-animation-industry-layoff-tracker-236827.html

Li, Pengfei, et al. “Making AI Less ‘Thirsty’: Uncovering and Addressing the Secret Water Footprint of AI Models.” arXiv.org, 6 Apr. 2023, arxiv.org/abs/2304.03271

@NOAANCEI. “Drought and abnormal dryness expanded across much of Mexico.” Twitter, 8 July 2024, twitter.com/NOAANCEI/status/1810389526084272414

Pskowski, Martha, and Merula Furtado. “Coca-Cola Sucks Wells Dry in Chiapas, Forcing Residents to Buy Water.” Truthout, 13 Sept. 2017, truthout.org/articles/coca-cola-sucks-wells-dry-in-chiapas-forcing-residents-to-buy-water

Rathi, Akshat, and Dina Bass. “Microsoft’s AI Investment Imperils Climate Goal as Emissions Jump 30%.” Bloomberg, 15 May 2024, www.bloomberg.com/news/articles/2024-05-15/microsoft-s-ai-investment-imperils-climate-goal-as-emissions-jump-30. Archived version: https://archive.is/kJRvG

Salesforce. “‘Gold Rush’ :30 | Ask More of AI With Matthew McConaughey | Salesforce.” YouTube, 20 Nov. 2023, www.youtube.com/watch?v=FvG41iEXFrU

Sharma, Dhruv. “Late Night With the Devil’s AI Controversy Explained.” ScreenRant, 22 Mar. 2024, screenrant.com/late-night-with-the-devil-ai-controversy-explained

Shilov, Anton. “Nvidia’s H100 GPUs Will Consume More Power Than Some Countries — Each GPU Consumes 700W of Power, 3.5 Million Are Expected to Be Sold in the Coming Year.” Tom’s Hardware, 26 Dec. 2023, www.tomshardware.com/tech-industry/nvidias-h100-gpus-will-consume-more-power-than-some-countries-each-gpu-consumes-700w-of-power-35-million-are-expected-to-be-sold-in-the-coming-year

Stiffler, Lisa. “Amazon’s carbon footprint shrank 3% last year — but AI-driven climate challenges loom.” GeekWire, 10 July 2024, www.geekwire.com/2024/amazons-carbon-footprint-shrinks-3-last-year-but-ai-driven-climate-challenges-loom

Syed, Nabiha. “The Secret Water Footprint of AI Technology.” The Markup, 15 Apr. 2023, themarkup.org/hello-world/2023/04/15/the-secret-water-footprint-of-ai-technology

Vallance, Chris. “‘Art Is Dead Dude’ - the Rise of the AI Artists Stirs Debate.” BBC News, 13 Sept. 2022, www.bbc.com/news/technology-62788725

Vincent, James. “How Much Electricity Do AI Generators Consume?” The Verge, 16 Feb. 2024, www.theverge.com/24066646/ai-electricity-energy-watts-generative-consumption

WIRED. “How animators created the Spider-Verse | WIRED.” YouTube, 22 Mar. 2019, www.youtube.com/watch?v=l-wUKu_V2Lk