The Creative Lever Is Everything

🎯 Meta changed the rules of creative testing, plus how to build real-time controllable game videos

Hello Readers 🥰

Welcome to today's edition, bringing the latest growth stories fresh to your inbox.

If your pal sent this to you, then subscribe to be the savviest marketer in the room😉


In Partnership with Tatari

Why Winning DTC Brands Are Getting Pickier About Their TV Partners

If you've been eyeing TV for 2026, you've seen the flood: hundreds of platforms promising to "unlock CTV for your brand." Here's what they're not saying: who you pick determines whether TV becomes a growth channel or an expensive experiment.

Most new players are programmatic-only – limited transparency, low-quality supply, and measurement that crumbles the second you scale.

The brands actually winning on TV (Jones Road Beauty, Tecovas, Ridge Wallet, Calm) use partners who offer:

  • Not just programmatic, but linear, streaming, and direct publisher inventory
  • Full visibility into where ads ran
  • Outcomes measurement that actually works at scale

2026 will reward marketers who are thoughtful about their TV partners. The space is crowded, but the right choice actually moves the needle.

And don't forget to gut-check those shiny offers. Free ad credits? Performance guarantees? If it sounds too good to be true, it is.

Start smart or switch to a reliable partner: Tatari.tv


📝 The Andromeda Era Made Creative The Growth Engine

Meta ads used to be a game of knobs and levers. You could outsmart the system with tight targeting, stacked interests, lookalikes, and manual bid tweaks. As long as your creative wasn’t terrible, the strategy could still work.

That world is gone.

With Andromeda and modern Meta optimization, broad campaigns can outperform the old hyper-granular playbooks. The algorithm is now the best media buyer in the room. Which means your edge is no longer targeting. It’s creative.

But here’s the nuance most teams miss: Meta doesn’t reward you for uploading more ads. It rewards you for giving it more distinct ideas to pattern match.

Cropping the same image is not a new idea. Minor copy swaps are not a new idea. Color changes rarely create new learning.

If your “testing” is just clones, the system sees one concept and runs out of room to explore.

Steps to Build High-Performing Ads in the Andromeda Era

1️⃣ Test ideas, not iterations:
Start with different hooks, different angles, and different visual stories. Save micro tweaks for later.

2️⃣ Design for attention in 0.3 seconds:
Win with familiar formats that feel native, then add something off-pattern. Think recognizable interfaces or everyday scenes with a twist.

3️⃣ Build memorability:
Distinctiveness plus a simple structure makes the message retellable. Humor and sharp analogies help your ad stick.

4️⃣ Trigger real emotion:
People don’t buy because they studied your features. They buy because they feel relief, desire, fear, or hope.

5️⃣ Keep tension alive:
Promise an outcome but don’t reveal the full mechanism too early. Your job is to make the click feel inevitable.

The Takeaway
Andromeda turns Meta into a creative lab. Your advantage is shipping genuinely different concepts that spark human responses and letting the system find the pockets of performance.


📝 Build Real-Time Controllable Game Videos

Real-time video generation is moving from “render and wait” to “steer and iterate.” Hunyuan-GameCraft is an open-source system designed for high-dynamic, interactive game video generation using hybrid history conditioning, meaning it can use prior frames and context to keep motion and scenes coherent as you guide the output.

If you’ve been watching interactive video models like Genie-style systems, this is the practical shift: longer consistency, controllability, and faster iteration, with artifacts you can actually run and test locally. The project also provides model releases on Hugging Face, making it easier to experiment without building everything from scratch.

Steps to Try Hunyuan-GameCraft:

1️⃣ Start with the technical report:
Open the arXiv page first and scan the “method” and “experiments” sections to understand what’s controllable, what inputs it expects, and what “history conditioning” is doing for stability. 

2️⃣ Pull the model + assets from Hugging Face:
Go to the Hugging Face repo, review the model files, and check any listed inference notes or requirements before downloading weights.

3️⃣ Run a baseline generation:
Begin with a default example (or the simplest provided configuration) to confirm your environment is correct. Capture VRAM usage, FPS, and output stability; a rough logging sketch follows these steps.

4️⃣ Test “control” with small changes:
Change one variable at a time: camera motion, action type, scene complexity, or prompt constraints. Track which changes improve consistency versus which cause drift. 

5️⃣ Iterate toward your use case:
Once stable, build toward what you actually need: gameplay-like sequences, controllable transitions, or longer-horizon clips, then lock the settings that preserve identity, layout, and motion.
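
Below is a minimal sketch of that logging step, assuming a Python environment with PyTorch and the huggingface_hub package installed. The repo id and the generation call are placeholders, not the project's actual API; swap them for the repo linked from the Hunyuan-GameCraft page and its documented inference entry point.

    # Minimal sketch: download the released weights, then wrap a generation
    # call so each run logs peak VRAM and a rough frames-per-second figure.
    import time
    import torch
    from huggingface_hub import snapshot_download

    # Pull the model files locally (repo id below is an assumption; use the
    # one linked from the Hunyuan-GameCraft project page).
    local_dir = snapshot_download(
        repo_id="tencent/Hunyuan-GameCraft-1.0",
        local_dir="./hunyuan-gamecraft",
    )

    def measure_run(generate_fn, num_frames):
        """Run one generation, then report peak VRAM (GB) and effective FPS."""
        torch.cuda.reset_peak_memory_stats()
        start = time.perf_counter()
        output = generate_fn()  # placeholder for the model's real inference call
        elapsed = time.perf_counter() - start
        peak_gb = torch.cuda.max_memory_allocated() / 1024 ** 3
        print(f"peak VRAM: {peak_gb:.1f} GB | {num_frames / elapsed:.2f} FPS")
        return output

Keeping this wrapper around your baseline and every "one variable changed" run gives you a consistent record for the consistency-versus-drift comparison in step 4.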

The Takeaway
Hunyuan-GameCraft is a strong signal that interactive, controllable video is becoming accessible through open releases, meaning experimentation can happen faster, cheaper, and closer to real product workflows, not just demos.


We'd love to hear your feedback on today's issue! Simply reply to this email and share your thoughts on how we can improve our content and format.

Have a great day, and we'll be back again with more such content 😍