Will Black Friday Kill AI?
The Question Everyone’s Asking
Will Black Friday kill AI? It’s a dramatic question — but a fair one.
Every November, the tech and retail worlds collide in one of the most intense economic moments of the year. For decades, Black Friday has been when retailers finally go into the black — when the sheer volume of sales pushes their books from red ink to profit.
But now that volume has shifted online. What used to be a retail arms race for shelf space and TV ads has become a digital knife fight across Google, Meta, and—now—the LLMs.
From Paid to Prompt
We’re entering an era where AI Optimization (AIO) — the discipline of structuring, tagging, and training your brand’s data so it surfaces in LLMs like ChatGPT, Perplexity, Gemini, Claude, Grok, and others — is replacing traditional SEO and Search as the new battleground.
So this Black Friday isn’t just a test for retailers. It’s also a stress test for AI itself.
For the first time, millions of people won’t just Google it. They’ll ask their AI:
“What’s the best Black Friday deal on a 65-inch TV under $800?”
“Where can I buy size 9 Nike Dunks near me today?”
“What’s a great gift under $30 for my six-year-old nephew who likes STEM and soccer?”
That shift — from links to direct answers — has massive cost implications.
Why It’ll Be Expensive for the LLMs
First: compute is not free.
Across ChatGPT, Gemini, Claude, Copilot, Grok, and others, we're already at billions of queries per day. Even using conservative analyst benchmarks, each holiday query to a large model costs anywhere from a fraction of a cent to a couple of cents in real infrastructure expense once you factor in bigger models, retrieval, and longer answers.
Now run the holiday math:
Roughly 4.5–5 billion queries per day across major LLMs during peak season.
Stretch that across ≈ 40 days (Nov 20 – Dec 31): 180–200 billion queries.
Use a realistic blended cost of $0.005–$0.02 per query.
That yields:
≈ $1 billion on the low end to $4 billion-plus on the high end in Q4 holiday inference costs, just to answer people's questions.
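The arithmetic above can be sketched in a few lines. A minimal back-of-envelope calculation, using the article's own estimates (not measured platform data):

```python
# Holiday inference cost for text queries, using the article's estimates.
queries_per_day = (4.5e9, 5e9)   # daily LLM queries, low/high estimate
days = 40                        # roughly Nov 20 - Dec 31
cost_per_query = (0.005, 0.02)   # blended $ per query, low/high estimate

total_queries = (queries_per_day[0] * days, queries_per_day[1] * days)
low = total_queries[0] * cost_per_query[0]
high = total_queries[1] * cost_per_query[1]

print(f"{total_queries[0]/1e9:.0f}-{total_queries[1]/1e9:.0f} B queries")
print(f"${low/1e9:.1f} B - ${high/1e9:.1f} B")  # → $0.9 B - $4.0 B
```

Even the low end of that range is real money, and it scales linearly with every extra day of holiday shopping.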
So when someone types,
“Find me the best 65-inch TV under $800 that can arrive before Sunday,” that’s not just a cute AI moment. That’s real margin getting burned at scale.
The Visual Generation Surge
Second: the holiday content machine.
People aren’t just asking questions; they’re creating:
Custom Christmas cards
Branded “Happy Holidays” posts
Family photos remixed with AI
New Year countdown graphics
Short AI-generated clips for socials
Here’s where it gets brutal:
2–3 billion AI images could be generated globally across all tools in the holiday window.
Each one effectively costs $0.03–$0.20 in compute → $60 million to $600 million.
Add 50–100 million short AI videos or animations at $0.25–$1.00 each → another $12.5 million to $100 million.
Stack that on top of the text queries, and:
The 2025 holiday season alone could drive $1–$5+ billion in incremental AI inference spend across major platforms.
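Stacking the three workloads gives the combined range. A short sketch, again using the article's estimated volumes and unit costs rather than published platform figures:

```python
# Combine text, image, and video workloads into one Q4 cost range.
def cost_range(vol_low, vol_high, unit_low, unit_high):
    """Return (low, high) total spend for a workload."""
    return vol_low * unit_low, vol_high * unit_high

text   = cost_range(180e9, 200e9, 0.005, 0.02)  # shopping queries
images = cost_range(2e9, 3e9, 0.03, 0.20)       # image generations
videos = cost_range(50e6, 100e6, 0.25, 1.00)    # short videos

low  = sum(w[0] for w in (text, images, videos))
high = sum(w[1] for w in (text, images, videos))
print(f"${low/1e9:.2f} B - ${high/1e9:.2f} B")  # → $0.97 B - $4.70 B
```

Note how the text queries dominate: images and videos together add well under a billion dollars even at the high end.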
For an industry still searching for a sustainable business model, that’s not a rounding error — it’s an existential line item.
The Scariest Time of the Year (for the LLMs)
So when I say the holidays might kill AI, I don’t mean the tech collapses. I mean the economics get exposed.
If you could peek at the Q4 infrastructure bills for OpenAI, Google, Anthropic, xAI, Midjourney, and others, they'd read like a horror script. Halloween may be over, but for LLM infrastructure teams this is the real scary season: a wave of long, complex, high-intent prompts with no clear monetization path yet.
Holiday AI by the Numbers (2025 Estimate)
| Category | Volume | Estimated Unit Cost | Total Compute Spend |
|---|---|---|---|
| Text / Shopping Queries | 180–200 B | $0.005–$0.02 | $1–$4 B |
| AI Image Generations | 2–3 B | $0.03–$0.20 | $60 M–$600 M |
| AI Video Generations | 50–100 M | $0.25–$1.00 | $12.5 M–$100 M |
| **Total (Q4 Compute Load)** | | | ≈ $1–$5+ B |
The Takeaway
Black Friday won’t kill AI — but it will show who’s ready to operate in this new reality.
For the platforms: a reckoning around scalability and profitability. Platforms running bloated models without smart routing or caching will bleed.
For brands: proof that visibility in LLMs is now as critical as visibility in search. The brands that cleaned up their product feeds, used structured data, and optimized for natural-language prompts will quietly win when LLMs recommend their inventory.
For consumers: the moment AI-powered discovery becomes the default way we shop.
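Caching is one of the cheapest levers the platforms have: millions of shoppers ask near-identical questions during the same week. A minimal sketch of the idea, assuming exact-match caching (production systems typically use semantic similarity, not string equality):

```python
import hashlib

class PromptCache:
    """Exact-match cache: a repeated prompt skips the expensive model call."""

    def __init__(self):
        self._store = {}
        self.hits = 0
        self.misses = 0

    def _key(self, prompt: str) -> str:
        # Normalize lightly so trivial whitespace/case differences still hit.
        return hashlib.sha256(prompt.strip().lower().encode()).hexdigest()

    def get_or_compute(self, prompt: str, generate):
        key = self._key(prompt)
        if key in self._store:
            self.hits += 1
            return self._store[key]
        self.misses += 1
        answer = generate(prompt)  # the costly inference call
        self._store[key] = answer
        return answer

cache = PromptCache()
fake_model = lambda p: f"answer to: {p}"  # stand-in for a real model call
cache.get_or_compute("Best 65-inch TV under $800?", fake_model)
cache.get_or_compute("best 65-inch TV under $800?", fake_model)
print(cache.hits, cache.misses)  # → 1 1 (second ask was free)
```

Every cache hit during Black Friday week is a query that costs effectively nothing instead of up to two cents, which is exactly the kind of margin defense the table above implies the platforms will need.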
Because the next time someone asks an LLM, “What’s the best deal this Black Friday?” you want your brand to be the answer — not the cost.