This is what we have today Team. Enjoy your weekend and Stay Curious.
Why did Meta acquire WaveForms AI?
AI Tools - Drone Technology
Meta’s Superintelligence Team and its implications.
Learning Corner - Full Stack Deep Learning
📰 AI News and Trends
GPT-5 is here. Now what?
Meta acquires AI audio startup WaveForms
Google's AI coding agent Jules is now officially launched with a new pricing structure after completing its beta phase.
OpenAI is offering ChatGPT to US government workers for $1 per year. Anthropic is reportedly planning to offer its product, Claude, to federal agencies for as little as $1 as well.
Tesla Disbands Dojo Supercomputer Team, Unwinding Key AI Effort
Microsoft integrates the newly launched GPT-5 model across its entire Copilot suite for all users.
🌐 Other Tech news
Pinterest Q2 revenue +17% YoY to $998M, net income +336% to $39M, shares plummet
Public companies commit $98.4B to crypto buys in 2025 YTD, ~3x all prior years combined. BTC ATH under the current administration.
Trump signs order allowing alternative assets like cryptocurrencies in private equity like 401(k)s
The President demands Intel CEO resign, alleging conflicts over China ties
Apple hit by string of departures in AI talent war
Meta Acquires WaveForms AI to Decode Human Emotion Through Voice
Meta has acquired WaveForms, an AI voice startup focused on passing the “Speech Turing Test,” the ability to make AI-generated voices indistinguishable from human speech. Founded just 8 months ago, WaveForms had already raised $40M at a $160M pre-money valuation from Andreessen Horowitz. The startup’s co-founders include ex-OpenAI and Meta researcher Alexis Conneau (who helped build GPT-4o’s Advanced Voice Mode) and ex-Google ad strategist Coralie Lemaitre.
This marks Meta’s second major AI audio acquisition in a month, following PlayAI, and is part of its ramp-up of Superintelligence Labs, its all-encompassing AI division focused on voice, vision, reasoning, and superagent infrastructure.
WaveForms specializes in “Emotional General Intelligence” tech designed to detect and understand emotional states through voice signals. Combined with Meta’s billions of voice, video, and message data points across its platforms, the implications are massive.
Here’s why it matters:
Meta’s app ecosystem:
Facebook – 3.05B monthly active users
WhatsApp – 2.8B monthly active users
Instagram – 2.4B monthly active users
Ray-Ban Meta Smart Glasses – 1M+ shipped in under 6 months
Oculus / Meta Quest – ~10M+ headsets sold
These platforms offer Meta an unparalleled multimodal dataset of how we speak, write, react, and feel.
Personally, I use WhatsApp and IG the most, and when traveling outside the US you realize how much more the rest of the world relies on WhatsApp than any other app for business and personal communication. In LATAM, billboards and TV ads, which are pricey, feature companies' WhatsApp QR codes or numbers, because that is the standard and expected way to communicate.
Imagine all these voice messages being run through WaveForms to gauge how a client or potential client feels at that particular moment, then having your AI chatbot, which already generates audio that is hard to tell apart from a human, respond accordingly to please that client and upsell.
So let the bots respond. All this data can also be sold back to us: we use these apps to communicate, and they sell us data about how we (and others) feel in certain situations to improve our ad efforts.
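To make the idea concrete, here is a minimal sketch of that pipeline: detect a customer's emotional state from a voice message and route it to a chatbot response strategy. This is purely illustrative; WaveForms' actual models work on raw audio signals and are not public, so a simple keyword heuristic over an (assumed already transcribed) message stands in for the emotion model.

```python
# Hypothetical emotion-aware routing sketch. The keyword heuristic below is a
# toy stand-in for a real voice-emotion model such as the one WaveForms builds.

FRUSTRATED = {"refund", "broken", "waiting", "cancel", "unacceptable"}
HAPPY = {"love", "great", "thanks", "perfect", "awesome"}

def classify_emotion(transcript: str) -> str:
    """Toy stand-in: infer an emotional state from transcript keywords."""
    words = set(transcript.lower().split())
    if words & FRUSTRATED:
        return "frustrated"
    if words & HAPPY:
        return "happy"
    return "neutral"

def response_strategy(emotion: str) -> str:
    """Map the detected emotion to a chatbot tone/tactic."""
    return {
        "frustrated": "apologize, de-escalate, offer a concrete fix",
        "happy": "reinforce, thank, suggest an upsell",
        "neutral": "answer plainly, ask a clarifying question",
    }[emotion]

msg = "I have been waiting two weeks and the product arrived broken"
print(classify_emotion(msg), "->", response_strategy(classify_emotion(msg)))
```

The point of the sketch is the routing step: once an emotion label exists for every incoming voice message, tailoring the bot's tone (and the upsell timing) is a one-line lookup, which is exactly why this data is so commercially valuable at Meta's scale.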
Meta's vision: build a universal AI engine capable of real-time emotional analysis, memory, decision-making, and interaction across text, voice, video, AR glasses, and VR environments. Wearables like smart glasses are key, capturing what we see, hear, and say, and even how we feel.
This acquisition gives Meta the missing piece in voice-emotion modeling, something smaller startups couldn’t scale due to data limitations. Now, with WaveForms’ neural architectures and Meta’s scale, we may see AI that not only hears us, but understands us, and predicts us.
🧠 Learning Corner
Full Stack Deep Learning (FSDL) is a free, hands-on course that teaches you how to build and deploy real-world AI systems, from data pipelines to LLMOps and model monitoring. Used by engineers at OpenAI and top startups, it’s ideal for going beyond just training models.
Meta’s Superintelligence Team Now Called TBD Lab
Meta’s elite AI unit is now officially named TBD Lab, short for “to be determined.” The team is leading development on the next version of Meta’s Llama model, internally dubbed Llama 4.5 or 4.x, and sits under the company’s new Superintelligence Labs (MSL) umbrella.
Led by Chief AI Officer Alexandr Wang (brought in via Meta’s $14B deal with Scale AI), TBD Lab has aggressively recruited top talent from OpenAI, Google, and others—reportedly offering pay packages in the hundreds of millions. Jack Rae, formerly at Google, is leading the Llama project.
Meta says TBD Lab will focus on frontier model development, advanced reasoning, and building powerful AI agents. But this isn't just about technological progress; it's about AI supremacy. As users, we're the ones feeding these systems with data, yet the power and profits concentrate in the hands of a few. Gathering the top minds under one corporate roof to chase dominance raises red flags: monopoly risks, ethical blind spots, and a lack of accountability.
The real question: How do we democratize AI when the company with the deepest pockets sets the pace? And how do we ensure these breakthroughs serve the public good, not just shareholder value? Are these naive questions?
🧰 AI Tools
Drone Technology
Skydio Autonomy (by Skydio) - Autonomous navigation & obstacle avoidance. Visual SLAM, deep learning. Use for law enforcement, inspections, cinematography
DJI Terra + AI Detection (by DJI) - Terrain mapping, AI-powered object detection. 3D Reconstruction and AI for Infrastructure Analysis. Use for construction, agriculture, and mining
Sentera FieldAgent (by Sentera) - Crop health & AI yield prediction. NDVI imaging + AI classification. Use for precision agriculture
Percepto Autonomous Inspection & Monitoring (AIM) - Industrial site monitoring with autonomous drones. AI anomaly detection, machine vision. Use cases are Oil & gas, solar farms, utilities
Azur Drones Skeyetech System - Autonomous drone-in-a-box for security & surveillance. AI-based perimeter breach detection. Replaces manual patrols, active in 10+ countries
Download our list of 1000+ Tools for free.