Welcome to Jumble, your go-to source for AI news updates. Fans called out Taylor Swift over suspected AI in a Google scavenger hunt, raising fair questions about disclosure and creative trust. Meanwhile, Figure unveiled its third-gen humanoid for home tasks. Let’s dive in ⬇️

In today’s newsletter:
🎤 Taylor Swift promo videos spark AI debate
🤖 Figure 03 aims for your home
🏢 New enterprise platform from Google
💰 The AI “deal circle” fuels the economy
🎯 Weekly Challenge: Help your AI know you better

🎤 Taylor Swift Fans Spot AI Glitches

A Google-run scavenger hunt for Taylor Swift’s new album led fans to a series of short city-themed promo videos. Within hours, Swifties cataloged telltale artifacts (objects vanishing mid-shot, distorted text, odd shadows) and launched #SwiftiesAgainstAI, accusing the campaign of using generative video.

🔍 What the Evidence Suggests

Fans pointed to specific inconsistencies in the videos (e.g., a bartender’s hand phasing through a napkin, a two-headed carousel horse), and an AI-forensics firm weighed in that the use of generative tools is “highly likely.”

Some defenders argue it could be CGI or quick compositing, but the visible glitches—and subsequent removals or restrictions—fueled the perception that AI was involved.

🧭 Why This Hit a Nerve

Swift has previously criticized AI misuse around her likeness, so fans saw a mismatch between message and marketing tactics. The episode spotlights a broader shift: major rollouts increasingly blend live-action, CGI, and generative tools. 

When brands don’t label methods, trust erodes—especially among communities trained to hunt for easter eggs. Even if the clips were partially CGI, the lack of disclosure created room for the worst assumption.

💡 What Creators Can Take Away

If you’re shipping promo videos: disclose when AI is used, set consent rules for cameos, and publish a provenance note (tools, VFX vs gen-AI, who approved). Expect platforms to weigh provenance and watermarks in ranking.

For fans, the practical move is to check text, hand interactions, and reflections for consistency, since those are common failure points in current video models.

🤖 Figure 03 Humanoid Robot Changes the Game

Figure unveiled Figure 03, a ground-up redesign meant for domestic and commercial tasks, with softer exterior materials, articulated hands with palm cameras, tactile finger pads, and voice interaction.

The company positions 03 as its first mass-producible household humanoid, trained via Helix, a vision-language-action system that learns from large volumes of first-person footage. TIME’s profile underlines the promise, as well as the open questions about safety, privacy, and labor impacts.

🧪 What It Can Do Today

Launch materials and hands-on reporting show 03 loading a dishwasher, folding shirts, tidying surfaces, and moving slowly for safety at home while working faster in factory scenarios.

Reviewers were impressed by manipulation but cautious about speed, reliability, and timelines; the company hasn’t given a consumer ship date, and initial deployments still look enterprise-first.

🌏 Where It’s Going and Who Else Is Racing

Figure’s near-term arc likely mirrors others: paid pilots in logistics and manufacturing, then controlled home trials.

In China, Unitree continues to push with H1 and G-series machines and is pursuing an IPO; UBTECH’s Walker line is active in factories; AgiBot has a commercial humanoid line and publishes on brain-and-cerebellum control stacks. Expect a split path: industrial reliability first, domestic convenience later.

🧭 What It Means if They Succeed

If companies nail manipulation, safety, and cost curves, early household roles will center on repetitive chores, elder support, and basic fetching or sorting. The gating factors are price, service networks, and liability frameworks. Until then, watch dexterity demos and tool-use benchmarks; they’re the truest signal of real-world usefulness.

This Week’s Scoop 🍦

🧩 Weekly Challenge: Teach Your Favorite LLM How to Be You

Challenge: Build a “you-simulator” that can mirror your style, preferences, and priorities.

Here’s what to do:

🧠 Step 1: Define Your Core

Ask your favorite LLM:

“Help me describe my personal operating system — what I value, how I make decisions, and what motivates me.”

Spend five minutes answering its follow-ups honestly.
Save that text — it’s your Profile Prompt.

✍️ Step 2: Give It Your Voice

Paste in three of your own texts, posts, or emails.
Then prompt:

“Rewrite these three examples in your best impression of my tone and phrasing. What patterns do you notice?”

This helps the AI surface your writing rhythm and emotional cues.

💬 Step 3: Test Its Understanding

Ask your new “you-simulator” something personal:

“How would I likely respond if a colleague asked me to work this weekend?”

If it gets it wrong, explain why — and refine your Profile Prompt.

🧱 Step 4: Save the Persona

Once it feels right, name it. Something fun and accurate — Mini-Me, MirrorGPT, Inner Compass.

Copy your Profile Prompt and store it in your notes app or your LLM’s custom instructions.
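Bonus for the tinkerers: if you’d rather wire the persona into your own scripts than paste it into custom instructions, here’s a minimal sketch. It assumes the OpenAI Python SDK, a hypothetical profile.txt file holding your Profile Prompt, and a made-up helper name (ask_as_me); swap in whichever model and provider you actually use.

```python
# Minimal sketch: reuse your Profile Prompt as a system prompt via the OpenAI Python SDK.
# Assumptions: the Profile Prompt is saved in profile.txt and OPENAI_API_KEY is set.
from pathlib import Path
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

profile_prompt = Path("profile.txt").read_text()

def ask_as_me(question: str) -> str:
    """Answer a question the way 'you' would, guided by the Profile Prompt."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # any chat model works; this one is just an example
        messages=[
            {"role": "system", "content": profile_prompt},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

# Quick test, mirroring Step 3:
print(ask_as_me("A colleague asked me to work this weekend. How would I respond?"))
```

The idea is the same as custom instructions: the Profile Prompt rides along as the system message, so every answer gets filtered through “you” without re-pasting it each time.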

Want to sponsor Jumble?

Click below ⬇️

Did the Taylor Swift campaign take it too far with suspected AI videos, or was this kind of blend always inevitable? And could you see yourself buying a Figure 03 robot one day? We’d love to hear your thoughts. See you next time! 🚀

Stay informed, stay curious, and stay ahead with Jumble!

Zoe from Jumble
