• Perle Labs Secures $17.5M in Funding
  • Join The Beta Launch
The AI Loop #1: Nvidia’s $5T Milestone, OpenAI’s Trillion‑Dollar Vision, & AWS Project Rainier

A weekly digest of the biggest moves, model releases, and power shifts shaping the future of AI.

Welcome to the first edition of The AI Loop, your weekly capsule roundup of the most relevant AI news.

Each week, we cover the biggest developments shaping the AI landscape, from model releases and infrastructure to research, policy, and partnerships. Our goal is to help you stay informed with fast, accurate updates that make sense of where the industry is heading.

The AI Loop is published by Perle Labs, building technology that connects human expertise with machine intelligence to help shape a smarter, more transparent future for AI.

Perle Labs Update

Big week at Perle Labs. We just launched the beta of our contributor platform, opening a new way for people to earn rewards by providing verified training data for AI systems. Within a day, more than 30,000 people applied for whitelist access. The early response speaks volumes about how much appetite there is for building AI that’s powered by verified human input.

The beta begins with an initial slate of tasks where contributors label and classify audio and video clips to help train real-world AI. New task types will roll out as the platform expands to other data domains in the coming weeks. Stay tuned.

Read more: Earn Rewards for Training Verified AI

This Week’s Big Moves

Nvidia hits $5 trillion as AI boom reshapes markets

Nvidia just made history as the first company to cross a $5 trillion valuation, fueled by a 12x surge in its stock since ChatGPT’s debut.

CEO Jensen Huang announced $500 billion in AI chip orders and plans to build seven new supercomputers for the U.S. government. The milestone cements Nvidia as the backbone of global AI infrastructure and one of the biggest winners of this AI boom.

Still, analysts warn that export controls and shifting market sentiment could test how long the rally lasts.

Sources: CNBC, Reuters

OpenAI restructures for a trillion‑dollar build‑out

OpenAI just hit reset on its corporate structure. The company is now a public-benefit corporation, a move that frees it from fundraising limits and sets the stage for an eventual IPO.

Microsoft keeps a 27% stake and around 20% of revenue, but loses its exclusive compute deal. CEO Sam Altman says OpenAI faces $1.4 trillion in infrastructure obligations as it races to build roughly 30 gigawatts of data-center capacity.

It’s a bold play to secure the world’s compute future, and one that raises big questions about how far this build-out can go.

Sources: Reuters, The Guardian

AWS Project Rainier arms Anthropic with custom chips

Amazon Web Services just rolled out Project Rainier, a massive AI compute cluster spanning multiple U.S. data centers and packed with nearly 500,000 Trainium2 chips.

Anthropic, part-funded by Amazon, plans to scale past one million Trainium2 chips by late 2025 to train and deploy its Claude models. The project marks AWS’s biggest move yet to expand AI infrastructure on its own silicon instead of Nvidia’s.

By fusing custom chips with its cloud backbone, Amazon is tightening control of the AI supply chain and pushing deeper into the race for affordable, high-performance compute.

Sources: Amazon Web Services, Barron’s

Quick Bites

MiniMax launches M2, an enterprise-scale open-source LLM

Chinese startup MiniMax publicly released M2, its most advanced LLM to date, under an MIT license. Described as "built for agents and code," the model offers high performance and cost efficiency. MiniMax says M2 can run at 8% of the cost of Claude Sonnet, with double the inference speed. M2 now tops the Intelligence Index for open models, performing close to GPT-4 and Claude.

Sources: VentureBeat, MiniMax

Nvidia releases 650+ open models on Hugging Face

At its GTC conference in Washington, DC, Nvidia made a major play for open innovation, releasing more than 650 AI models and 250 datasets to Hugging Face. The drop spans language (Nemotron), physics simulations (Cosmos), robotics (Isaac GR00T), and biomedical AI (Clara), opening up access to tools that have powered some of Nvidia’s most advanced systems.

Source: Nvidia

Stay in the Loop

That’s a wrap for the first edition of The AI Loop. Each week, we’ll track the breakthroughs, power plays, and shifts shaping the world of AI.

Join the Perle Labs community on X, Discord, and Telegram to stay connected and catch the next drop of insights from across the ecosystem.
