Google's Gemini robot now thinks

PLUS: Perplexity's new Search API, and Copilot for the command line


Google DeepMind has introduced a new model that moves robots from simple instruction-following to genuine problem-solving. Gemini Robotics 1.5 gives physical agents the ability to reason, research, and autonomously handle complex, multi-step tasks.

This development marks a significant leap from pre-programmed machines to dynamic agents capable of operating in the physical world. With robots now able to learn and transfer skills, how quickly will we see these more adaptable assistants appear in our homes and workplaces?

Today in AI:
  • Google's new 'thinking' robotics model

  • Perplexity's new Search API

  • Copilot comes to the command line

PRESENTED BY MINDSTREAM

Turn AI Into Your Income Stream

The AI economy is booming, and smart entrepreneurs are already profiting. Subscribe to Mindstream and get instant access to 200+ proven strategies to monetize AI tools like ChatGPT, Midjourney, and more. From content creation to automation services, discover actionable ways to build your AI-powered income. No coding required; just practical strategies that work.

What’s new? Google DeepMind just unveiled its new robotics model, Gemini Robotics 1.5, which gives robots the power to reason, research, and complete multi-step physical tasks on their own.

What matters?

  • The model enables a robot to tackle multi-step challenges by first researching information, like checking local recycling rules online, before planning and executing the physical sorting.

  • Skills learned by one robot are designed to automatically transfer to different hardware, which significantly accelerates the learning process across various robot designs.

  • The system’s ability to reason isn't just theoretical: in DeepMind's demo videos, the robot can be seen working out how to complete tasks it has never encountered before.
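The research-plan-act loop described in the bullets above can be sketched in miniature. This is purely an illustration of the pattern; every function, data structure, and value below is a hypothetical stand-in, not Gemini Robotics 1.5's actual interface:

```python
# Illustrative sketch of a research -> plan -> act loop, like the
# recycling example above. All names here are hypothetical stand-ins.

def research(task: str) -> dict:
    """Stand-in for an online lookup (e.g., fetching local recycling rules)."""
    return {
        "compost": ["banana peel"],
        "recycling": ["soda can"],
        "trash": ["wrapper"],
    }

def plan(task: str, facts: dict, items: list) -> list:
    """Turn researched facts into an ordered list of (item, bin) actions."""
    steps = []
    for item in items:
        for bin_name, contents in facts.items():
            if item in contents:
                steps.append((item, bin_name))
    return steps

def execute(steps: list) -> list:
    """Stand-in for physical actuation; here we just log each action."""
    return [f"place {item} in {bin_name}" for item, bin_name in steps]

facts = research("sort the trash")
steps = plan("sort the trash", facts, ["soda can", "banana peel"])
print(execute(steps))
# prints ['place soda can in recycling', 'place banana peel in compost']
```

The point of the pattern is the ordering: information gathering happens first, planning consumes that information, and only then does the agent act in the physical world.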

Why it matters

This development marks a major shift from robots that follow pre-programmed instructions to agents that can dynamically solve problems in the physical world. Giving robots the ability to reason and learn on the fly opens the door for more adaptable and useful assistants in our homes and workplaces.

PRESENTED BY SUPERHUMAN AI

Go from AI overwhelmed to AI savvy professional

AI will eliminate 300 million jobs in the next 5 years.

Yours doesn't have to be one of them.

Here's how to future-proof your career:

  • Join the Superhuman AI newsletter - read by 1M+ professionals

  • Learn AI skills in 3 mins a day

  • Become the AI expert on your team

What’s new? AI search engine Perplexity has launched its own Search API, giving developers direct access to its powerful, real-time search and answer capabilities.

What matters?

  • The API provides developers with access to Perplexity's indexed information, complete with real-time updates.

  • It offers sub-document precision, enabling applications to retrieve exact passages and answers instead of just a list of links.

  • This move positions Perplexity as a direct challenger to Google by offering its core technology as a building block for other developers.
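As a rough sketch of how a developer might assemble a call to such an API: the endpoint URL, header names, and request fields below are assumptions for illustration, not Perplexity's documented interface, so check the official API reference before using them:

```python
import json

# Hypothetical request builder for a Perplexity-style Search API.
# The endpoint and field names are illustrative assumptions only.
API_URL = "https://api.perplexity.ai/search"  # assumed endpoint

def build_search_request(query: str, api_key: str, max_results: int = 5) -> dict:
    """Assemble the URL, headers, and JSON body for a hypothetical search call."""
    return {
        "url": API_URL,
        "headers": {
            "Authorization": f"Bearer {api_key}",  # bearer auth is an assumption
            "Content-Type": "application/json",
        },
        "body": json.dumps({"query": query, "max_results": max_results}),
    }

request = build_search_request("local recycling rules", api_key="YOUR_KEY")
print(request["url"])
```

An application would hand this request to any HTTP client; the appeal of sub-document precision is that the response could carry exact passages rather than a list of links to parse.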

Why it matters

This launch empowers developers to create a new class of applications with instant, precise answers built in. It also signals a significant shift in the search landscape, pushing the market towards more accessible and integrated AI-powered information.

What’s new? GitHub is bringing its AI coding assistant out of the editor and into the command line, launching the Copilot CLI public preview for all developers.

What matters?

  • Developers can now get AI assistance directly within the terminal, a core environment where they spend a significant amount of their time.

  • The tool offers real-time suggestions for shell commands, managing repositories, and debugging common errors.

  • This integration closes the gap between writing code in an editor and executing tasks in the command line, reducing context switching.

Why it matters

Integrating AI directly into the terminal streamlines the development lifecycle by reducing friction and increasing speed. This signals a future where AI assistants are embedded in every step of the software creation process, not just the code editor.

Everything else in AI

Meta introduced Vibes, a new experimental feed for discovering, remixing, and sharing short AI-generated videos directly within its app.

Cloudflare announced its Content Signals policy, a new tool giving creators legal control over whether AI companies can use their content for model training.

Skild AI claims it has developed an "omni-bodied robot brain," a single AI system that can control any robot body without specific pre-programming.

1X seeks a $1B funding round at a reported $10B valuation to advance home trials of its NEO humanoid robot.


Work with us

Reach 100k+ engaged tech professionals, engineers, managers, and decision-makers. Join brands like MorningBrew, HubSpot, Prezi, Nike, Ahrefs, Roku, 1440, Superhuman, and others in showcasing your product to our audience. Get in touch now →