People are shifting their AI use from technical use cases to emotional and personal ones.
Therapy/companionship has shot to the #1 position, with “Organizing my life” and “Finding purpose” entering the top 5 as new entries. This reflects a broader movement toward self-actualization through AI, with users turning to these tools not just for practical assistance but for emotional support. Other top use cases include:
- Mental health support, especially in regions with limited healthcare access
- Creating personalized timelines for household organization
- Enhancing learning by getting explanations for complex topics
- Building customized meal plans based on specific macro needs
- Generating detailed travel itineraries with hidden gems
- Writing successful appeals for traffic violations
There are two takeaways for SEOs from this:
1. From a “search intent” perspective, it’s fascinating that these are all higher-level, task-based intents that trickle down into more specific recommendations from AI services.
2. Fundamentally, I think we’re moving beyond “search” as people form more complex emotional attachments to their AI services.
Wikipedia is trying to dissuade AI companies from scraping its site by giving them the data directly.
The Wikimedia Foundation announced a partnership with Kaggle (Google’s data science community platform) to release a beta dataset of structured Wikipedia content specifically optimized for machine learning.
It’s an interesting solution to what’s becoming an increasingly common problem. Wikipedia’s servers are getting hammered by relentless AI bots consuming bandwidth, and these “well-structured JSON representations of Wikipedia content” should be a more attractive alternative to “scraping or parsing raw article text.”
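To make the appeal concrete, here’s a minimal sketch of what consuming a structured dump like this could look like. It assumes a JSON Lines file and hypothetical field names (`title`, `sections`); the actual schema of the Kaggle beta dataset may differ.

```python
import json

# Minimal sketch of consuming a structured Wikipedia dump in JSON Lines form.
# The field names below (title, sections) are assumptions for illustration;
# the real Kaggle beta dataset's schema may differ.
with open("wikipedia_structured.jsonl", encoding="utf-8") as f:
    for line in f:
        article = json.loads(line)
        # Each record arrives pre-parsed, so there's no HTML to scrape
        # and no wiki markup to strip from raw article text.
        print(article["title"], len(article.get("sections", [])))
```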
This gives me serious flashbacks to the SEO vs Sysadmin Divide we discussed a few weeks ago. On one side, we have content creators desperately wanting their stuff to be found and indexed by AI, and on the other, we have infrastructure folks trying to keep their servers from melting down under the scraping load.
In the “tech giants eating other tech giants” news of the week, OpenAI has thrown its hat into the ring as a potential buyer of Chrome if Google is forced to sell it off. Nick Turley, head of ChatGPT, confirmed this at a hearing in Google’s monopoly case when someone asked the million-dollar question. His response? “Yes, we would, as would many other parties.”
This is fascinating for a couple of reasons:
- OpenAI already has a ChatGPT plugin for Chrome, but Turley mentioned they could create “deeper integrations” if they owned the browser outright.
- Under their ownership, Chrome could “introduce users into what an AI-first experience looks like,” which is either exciting or slightly terrifying, depending on your perspective.
The whole Google breakup saga is unfolding at the typical speed of justice (read: glacially slow). Google has been found to have monopolies in search and online ad tech, and now vultures are circling to pick up potentially discarded pieces.
ChatGPT and Shopify appear to be building a frictionless shopping assistant right inside your chats.
Developers spotted integration code in ChatGPT’s files with “buy_now” buttons and Shopify checkout URLs in both web and Android versions.
The seamless flow would let users get recommendations, see details, and complete purchases without leaving their conversation. For Shopify merchants, this means instant distribution in a major AI interface with zero integration work.
Following industry trends toward agent-led commerce, this partnership could create a powerful new e-commerce channel where shopping is as simple as asking for “best wireless earbuds under $200” and getting a direct checkout link with zero friction.
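For illustration only, here’s a rough sketch of the shape a “buy_now” action could take. Every field name below is hypothetical, inferred from the strings developers reported spotting; neither OpenAI nor Shopify has published an API for this. The one grounded detail is the checkout URL, which follows Shopify’s existing cart-permalink format.

```python
# Hypothetical shape of an in-chat product card with a "buy_now" action.
# Field names are illustrative; none come from a published OpenAI or
# Shopify API. The checkout URL uses Shopify's standard cart permalink
# format: /cart/{variant_id}:{quantity}.
product_card = {
    "title": "Wireless Earbuds X200",
    "price": "$149.00",
    "merchant": "example-store.myshopify.com",
    "actions": [
        {
            "type": "buy_now",
            "checkout_url": "https://example-store.myshopify.com/cart/1234567890:1",
        }
    ],
}

def render_card(card: dict) -> str:
    """Render the card roughly the way a chat UI might surface it."""
    buy = next(a for a in card["actions"] if a["type"] == "buy_now")
    return f"{card['title']} ({card['price']}) -> Buy now: {buy['checkout_url']}"

print(render_card(product_card))
```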
Perplexity just launched its voice assistant for iOS, which works on older iPhones that Apple Intelligence ignores.
It’ll book restaurant tables with a single command, draft emails for specific contacts, and mark recommended spots directly on your maps - all while requiring just that final confirmation tap.
Unlike the Android version, it can’t “see” your surroundings yet, but that feature’s likely coming soon. This further reinforces the new UX surfaces that are emerging for search.
Meta’s hardware division feels split from its AI and search efforts, but we’re starting to see the pieces come together with its Ray-Ban AR glasses.
While it’s not live, context-aware search, their launch of live language translation shows the possibilities they’re working towards.