Saturday, February 14, 2026

On Our "Virtual Route 99" (Special W-End Edition): On Artificial Intelligence, Robots & The Future


On this special edition of our Virtual Route 99 on the Tech Scene (and as we extend best wishes to all on this Valentine's Day 2026), our team is pleased to present thoughts from Dr. Peter Diamandis, as we also note an admonition from the Microsoft AI Chief, Mustafa Suleyman, during a recent conversation he had with the Editor of the Financial Times:


I just spent the afternoon at Figure headquarters in San Jose with Brett Adcock and David Blundin, and I’m still processing what I saw.

We’re not talking about concept robots. We’re talking about fully autonomous humanoid robots running neural networks end-to-end, doing kitchen work, unloading dishwashers, organizing packages – for hours at a time, with no human intervention.

Today? Figure’s robots are doing 67 consecutive hours of autonomous work. One error in 67 hours. That’s not a demo. That’s a product.

And here’s what most people don’t understand: the gap between “doing one task really well” and “doing every task a human can do” is collapsing at exponential speeds.

Let me explain why…

NOTE: Brett has been a past Faculty Member at my Abundance Summit, where leaders like him share insights years before the mainstream catches on. In-person seats for the 2026 Summit next month are nearly sold out. Learn more and apply.

The Death of C++ and the Rise of the Neural Net

When I first visited Figure, they had several hundred thousand lines of C++ code controlling the robots. Handwritten. Expensive. Brittle.

Every new behavior required engineers to anticipate edge cases, write more code, test it, debug it. It was the software equivalent of teaching a toddler to walk by writing an instruction manual.

In the last year, Figure deleted 109,000 lines of C++ code.

All of it. Gone.

What replaced it? A single neural network that controls the entire robot: hands, arms, torso, legs, feet. Full-body coordination. Real-time planning. Dynamic response to unexpected situations.

This is Helix 2, their latest AI model, and it’s a fundamentally different approach to robotics.

Here’s why this matters: neural nets learn from experience, not instructions.

You don’t code a robot to “grab a cup.” You show it thousands of examples of grasping objects—different shapes, weights, materials—and the neural net extracts the underlying patterns. It learns what “grasping” is at a representational level.

And once it understands grasping? It can generalize to objects it’s never seen before.

Brett put it simply: “If you can teleoperate the robot to do a task, you can train the neural net to learn it.”

That’s the unlock. If the hardware is capable—if the motors, sensors, and joints can physically perform the movement—then the AI can learn it from data.

Compare that to traditional robotics, where you’d need to write thousands of lines of code for every single new task. That approach doesn’t scale. Neural nets do.
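
To make the teleoperation-to-training idea concrete, here is a minimal behavior-cloning sketch in Python/PyTorch. It illustrates the general technique, not Figure's actual Helix code; the observation and action dimensions, the network shape, and the data loader are assumptions.

```python
# Minimal behavior-cloning sketch (illustrative; not Figure's Helix code).
# Assumption: teleoperation logs are (observation, action) pairs, where the
# observation packs camera features and proprioception into one vector and
# the action is a vector of whole-body joint targets.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

OBS_DIM, ACT_DIM = 512, 38  # hypothetical sizes

policy = nn.Sequential(      # one network maps what the robot senses...
    nn.Linear(OBS_DIM, 256), nn.ReLU(),
    nn.Linear(256, 256), nn.ReLU(),
    nn.Linear(256, ACT_DIM), # ...to full-body joint commands
)

def train_from_teleop(observations, actions, epochs=10):
    """Fit the policy to imitate recorded teleoperation demonstrations."""
    data = DataLoader(TensorDataset(observations, actions),
                      batch_size=64, shuffle=True)
    opt = torch.optim.Adam(policy.parameters(), lr=1e-4)
    for _ in range(epochs):
        for obs, act in data:
            loss = nn.functional.mse_loss(policy(obs), act)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return policy

# Usage (hypothetical loader): obs, act = load_teleop_logs(...)
# train_from_teleop(obs, act)
```

The point is the shape of the pipeline: no hand-written task controller, just demonstrations in and a policy out.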

The implication: Every robot in the fleet learns from every other robot’s experience. When one Figure robot masters folding laundry, every Figure robot on the planet instantly knows how to fold laundry.

Humans don’t work like this. Robots do.
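
The fleet claim boils down to shared weights. Here is a hedged sketch of that pipeline with hypothetical names; how Figure actually aggregates data and ships updates isn't described in this piece.

```python
# Illustrative fleet-learning loop (assumed architecture, not Figure's).
# Every robot logs its episodes into a shared pool; one policy is retrained
# centrally and the updated weights are pushed back to the entire fleet.
from dataclasses import dataclass, field

@dataclass
class Robot:
    robot_id: str
    episodes: list = field(default_factory=list)  # locally logged experience
    policy_version: int = 0

    def deploy(self, version: int):
        self.policy_version = version             # same weights everywhere

def fleet_update(robots, train_policy, version):
    pooled = [ep for r in robots for ep in r.episodes]  # aggregate experience
    weights = train_policy(pooled)                      # retrain once, centrally
    for r in robots:
        r.deploy(version)                               # broadcast to the fleet
    return weights

# When one robot's laundry-folding episodes land in the pool, the retrained
# policy, and therefore every robot running it, benefits from them.
```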

Hardware Built Around the Brain

Most people think you design the robot first, then figure out the AI.

Figure did the opposite.

Brett’s team looked at the neural network architecture they wanted to run and asked: “What hardware do we need to make this work?”

That’s why Figure 3 exists. It’s not an incremental upgrade. It’s a complete redesign built around Helix.

Here’s what changed from Figure 2 to Figure 3:

  • 90% cost reduction in manufacturing

  • ~20 pounds lighter (135 lbs total)

  • Palm cameras for occluded grasping

  • Tactile sensors in every fingertip

  • Passive toe joint for better range of motion

  • Soft-wrapped body to eliminate pinch points

  • Onboard inference compute (no cloud dependency)

And critically: designed for data collection at scale.

Because here’s the thing — if you’re betting on neural nets, you’re betting on data. The more diverse, high-quality data you collect, the better the robot generalizes.

Figure built their robot to be a data-gathering machine. Every sensor, every camera, every interaction feeds back into the training loop.

And they’re not using off-the-shelf parts. They manufacture their own actuators, hands, battery systems, embedded compute—everything.

Why? Because the technology readiness of existing robotics components is too low. If a vendor’s motor fails in the field, you’re stuck waiting for them to fix it. If you built it yourself, you iterate overnight.

This is vertical integration at its finest. And it’s the only way to move fast enough to win.

The Manufacturing Ramp: From Thousands to Millions

Walking through Figure’s BotQ manufacturing facility was surreal.

Four production lines. Capacity for 50,000 robots per year when fully ramped.

But Brett’s not stopping there. He’s already planning the next facility. Tens of thousands. Then hundreds of thousands. Then millions.

And here’s the kicker: Figure will use its own robots to build more robots.

They’re putting humanoids on the production lines this year. Robots assembling robots. Robots testing robots. Robots packaging robots.

Why? Because if you’re trying to scale to a billion units, you can’t rely on human labor. You need an exponential manufacturing curve, and the only way to get there is recursive self-improvement.

Think about it: every improvement Figure makes to the robot’s dexterity, speed, and reliability makes it better at building the next generation of robots.

It’s a flywheel. And once it starts spinning, it’s nearly impossible to stop.
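
Purely as a back-of-the-envelope, here is what that flywheel looks like if each year's robots make the factory meaningfully more productive. The 50,000/year starting point is the figure above; the annual improvement factor is an assumption for illustration, not Figure's number.

```python
# Toy model of the recursive manufacturing flywheel (growth rate is hypothetical).
def ramp(initial_capacity=50_000, annual_gain=1.8, years=8):
    capacity = initial_capacity
    for year in range(1, years + 1):
        capacity = int(capacity * annual_gain)  # compounding, not linear, growth
        print(f"Year {year}: ~{capacity:,} robots/year")

ramp()
# At an assumed 1.8x yearly gain, 50,000/year passes a million/year around
# year 6; the point is the compounding shape, not the specific numbers.
```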

Brett estimates they could ship a billion robots today if the AI were fully general-purpose. The demand is there. The capital markets (via leasing models) can finance it. The constraint is solving general robotics.

And that’s exactly what they’re working on.

General Robotics: The Only Milestone That Matters

Here’s the thing about humanoid robots that most people—and most companies—don’t get:

Teleoperation is not impressive. Open-loop behaviors are not impressive. One-minute demos are not impressive.

What’s impressive is closed-loop, autonomous work in unseen environments over long time horizons.

Let me break that down.

Closed-loop means the robot is continuously sensing its environment and adjusting in real-time. It’s not replaying a pre-programmed sequence. It’s thinking.

Autonomous means no human in the loop. No remote operator in Tennessee. The robot is making decisions on its own.

Unseen environments means you can drop the robot into a random Airbnb or factory floor it’s never visited, and it figures out how to navigate and work there.

Long time horizons means hours, days, weeks of continuous operation. Not 30-second clips stitched together in post-production.

This is what Brett calls “general robotics,” and it’s the only milestone that matters.

If you can’t do this, you don’t have a product. You have a very expensive remote-controlled toy.
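
In control terms, the distinction is whether there is a loop at all. A minimal sketch, with hypothetical robot and policy interfaces since the piece doesn't describe Figure's actual stack:

```python
# Open-loop replay vs. closed-loop autonomy, schematically (interfaces hypothetical).
import time

def open_loop(robot, recorded_actions):
    # Replays a fixed sequence; nothing the robot senses changes what it does next.
    for action in recorded_actions:
        robot.act(action)

def closed_loop(robot, policy, hz=30):
    # Sense, decide, act, repeat: every step reacts to the current state of the world.
    period = 1.0 / hz
    while robot.task_in_progress():
        observation = robot.sense()   # cameras, tactile sensors, proprioception
        action = policy(observation)  # onboard inference, no cloud round trip
        robot.act(action)
        time.sleep(period)            # hold a fixed control rate
```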

Figure’s current benchmark: four to five hours of continuous neural network operation in logistics, kitchen work, and manufacturing tasks.

Their 2026 goal: Drop a robot into an unseen home and have it do useful work for days with minimal human intervention.

Once they hit that, the game is over. Because if the robot can generalize to any home, it can generalize to any environment. Factories. Warehouses. Hospitals. Senior care facilities. Mining operations. Space stations.

The hard part isn’t building a robot that can do one thing well. The hard part is building a robot that can do everything a human can do.

And Figure is closer than anyone else on the planet.

The Timeline: When Will You Have One?

Everyone wants to know: when can I buy a Figure robot for my home?

Brett’s answer: “Not yet. And I’m not going to ship slop.”

Here’s his roadmap:

2026: Alpha testing in homes. A small number of robots doing long-horizon work (cleaning, organizing, laundry, dishes) in real households. The goal is to measure human interventions – how often does someone need to step in and help?

Right now, industrial deployments see occasional errors. The target for home deployment is orders of magnitude better.
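
One natural way to score those alpha tests is interventions per hour of autonomy, or its inverse, mean time between interventions. A small sketch, using the 67-hour, one-error run mentioned above as the example:

```python
# Intervention-rate bookkeeping for an autonomy trial.
def intervention_metrics(autonomous_hours, interventions):
    rate = interventions / autonomous_hours                 # interventions per hour
    mtbi = autonomous_hours / interventions if interventions else float("inf")
    return rate, mtbi

# The 67-hour run with a single error cited earlier:
rate, mtbi = intervention_metrics(autonomous_hours=67, interventions=1)
print(f"{rate:.3f} interventions/hour; mean time between interventions ≈ {mtbi:.0f} h")
# Home deployment would need that mean time to be orders of magnitude longer.
```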

2027-2028: Scaled home pilots. Tens, then hundreds, then thousands of units. Iterative design based on real-world feedback. Safety validation. Privacy validation. Reliability validation.

2028-2029: Mass production and broad availability.

Why so cautious?

Because Brett learned this lesson at Archer (his eVTOL company): you don’t ship safety-critical systems until they’re ready.

A humanoid robot in your home is around your kids, your pets, your elderly parents. If it drops a pot of boiling water, that’s catastrophic. If it steps on your cat, that’s catastrophic. If it gets hacked and streams your private conversations to the internet, that’s catastrophic.

So Figure is taking the time to get it right.

And honestly? I respect the hell out of that.

Because when Figure does ship, they’ll have a decade head start on everyone else in terms of safety track record, reliability data, and customer trust.

That’s a brand moat you can’t buy.

The Business Model: Leasing Humanoids Like Humans

Here’s the beautiful part of Figure’s go-to-market strategy:

They’re leasing robots the same way you lease humans — by the hour.

Think about it. You don’t “buy” an employee. You pay them a salary. You lease their time and capability.

Figure is doing the same thing. You don’t buy a $20,000 robot. You pay ~$300/month to lease one. That’s $10/day. Forty cents an hour.

Compare that to minimum wage ($15-20/hour in higher-wage US states and cities). A Figure robot is roughly 50x cheaper and works 24/7 with no breaks, no benefits, no turnover.
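
The arithmetic behind those figures, as a quick check; the lease price and wage range are the numbers quoted above, and round-the-clock utilization is assumed:

```python
# Lease economics quoted above, worked through (assumes 24/7 utilization).
monthly_lease = 300.0            # dollars per month, as cited above
hours_per_month = 30 * 24        # ~720 hours at full utilization
robot_hourly = monthly_lease / hours_per_month
print(f"Robot: ~${robot_hourly:.2f}/hour")          # about $0.42/hour

for wage in (15, 20):            # hourly wage range cited above
    print(f"vs ${wage}/hour labor: ~{wage / robot_hourly:.0f}x cheaper")
# Roughly 36x to 48x cheaper on wages alone, in the ballpark of the ~50x
# above, before counting breaks, benefits, or turnover.
```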

And because it’s a lease, Figure retains ownership. They can remotely update the software. They can recall and upgrade units. They can monitor performance and safety in real-time.

It’s the Apple model applied to robotics. And it’s going to print money.

Brett estimates the market for humanoid robots is half of global GDP: roughly $50 trillion annually. Because half of all economic activity is human labor.

And here’s the thing: the demand is already there.

Figure has signed multiple commercial clients. They’re deploying robots into factories, warehouses, and logistics operations this year. These aren’t pilots. These are revenue-generating contracts.

The bottleneck isn’t demand. It’s supply. It’s solving general robotics and scaling manufacturing fast enough to meet the orders.

When that happens? This becomes the largest economy in the world.

What Comes Next: The Age of Abundance

Let me paint the picture of where this is going.

2030: Every household in the developed world has access to a humanoid robot. You lease it for $300/month. It does your laundry, cleans your house, organizes your kitchen, runs errands.

You come home from work and your robot has already meal-prepped dinner based on your biometric data from your wearable. It knows you’re low on magnesium, so it adjusted the recipe.

2035: There are 10 billion humanoid robots on the planet. Five billion in homes. Five billion in commercial and industrial settings.

The cost of goods and services collapses. Why? Because labor is no longer a constraint. Robots mine the materials, manufacture the products, transport them, and deliver them to your door.

You want a custom piece of furniture? A robot designs it, fabricates it, and assembles it in your garage overnight. Cost: materials + energy. Labor: free.

2040: Robots are building robots. Robots are designing robots. Robots are optimizing supply chains, managing logistics, exploring asteroids, constructing orbital habitats.

Humans are freed from drudgery. We do what we’re best at: creating, exploring, connecting, imagining.

This is the age of Abundance.

And it starts in 2026.

Brett Adcock and his team at Figure are building it. Right now. In San Jose.

I’ve seen it. I’ve walked the floors. I’ve watched the robots work.

And I’m telling you: this is real.

The future doesn’t care if you believe in it. It’s coming anyway.

The only question is: are you ready?

To an Abundant future,

Peter

Saturday, February 7, 2026

View of the Week (Special W-End Edition)

 


As SpaceX has acquired xAI, a snapshot of the big players in the IPO space over the last 27 years...

Wednesday, February 4, 2026

View of the Week (Special Edition)

 




The Briefing

By Martin Peers

 


Greetings!

If quarterly earnings announcements were a contest, Mark Zuckerberg would have won Wednesday. You might even say investors are coming to terms with Meta Platforms’ you-only-live-once approach to spending on AI. The owner of Facebook and Instagram reported stronger-than-projected fourth-quarter revenue growth of 24% and projected that revenue growth would accelerate to 30% in the first quarter. Meta hasn’t grown that fast since the go-go days of 2021. To be sure, some, though not all, of the growth is coming from the dollar’s weakness, which boosts overseas revenue when translated back into dollars. Meta’s ad business is also getting a lift from AI tech, executives detailed. Investors were impressed: Meta stock jumped as much as 10% in after-hours trading.

It was a different story for Microsoft, which also reported Wednesday. Its top line slowed just a tad, to 17% from 18% in the previous quarter, thanks to a slight slowdown at its Azure cloud unit. Investors sold the stock down more than 6% after-hours. Microsoft stock hasn’t been a world-beater for much of the past year, so that selling is notable. On a conference call, analysts made it clear investors are worried about whether Microsoft is getting a return on its AI spending and about its cloud unit’s reliance on OpenAI as a customer. Investors should think more broadly. Microsoft’s profit margins are higher from software than from server rentals in Azure. What Microsoft needs is to persuade its software customers to buy AI-powered features, which it is starting to do. The company disclosed it had 15 million paying subscribers for Office 365 Copilot, which isn’t huge in the big picture (in 2024, the company said it had 400 million Office 365 paying subscribers). But it’s a start. (As Aaron’s story today showed, CEO Satya Nadella and his colleagues are moving fast to improve Copilot-related products using advanced technology from Anthropic.)

One worry is that Microsoft’s margins are coming under pressure. Its overall gross margin fell very slightly in the most recent quarter. You can blame its cloud unit for that. Microsoft reported that the gross profit margin at its “intelligent cloud unit”—which includes Azure—fell more than 4 percentage points thanks to higher AI infrastructure costs. Azure is growing faster than the rest of the intelligent cloud unit, but that’s actually hurting the unit’s margin, Microsoft acknowledged in a securities filing. Commentary on Microsoft’s call indicated that margins will come under more pressure in the March quarter. 

Still, if we’re going to talk about margins, the picture is uglier for Meta. Its operating profit margin dropped 7 percentage points in the fourth quarter, thanks to a 40% increase in operating expenses, mostly due to higher compensation costs and increased AI server costs. And, as Meta warned three months ago, its spending is going to mushroom this year. Meta forecast its capital expenditures would balloon to between $115 billion and $135 billion this year, compared with $72 billion in 2025. Meta expects its operating expenses, meanwhile, to rise 40% this year. So even assuming its business strengthens through the year, the company’s bank accounts may look a little emptier by the end of the year.

True, Meta executives forecast higher operating income for 2026 than for 2025, despite the higher operating expenses. But let’s face it: The operating income measure is something of an accounting fiction, given how accounting rules require it to be calculated. A more meaningful metric is free cash flow. On that metric, Meta reported a 16% dip in 2025 to $43.6 billion. It’s a good bet free cash flow will shrink further this year given the huge capex ramp. It’s no coincidence the company’s debt more than doubled in the fourth quarter.
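
The direction of that bet can be sanity-checked using only the numbers quoted above, treating free cash flow as roughly operating cash flow minus capex. The operating cash flow figure below is implied, not reported, so treat it as an approximation:

```python
# Rough free-cash-flow check using only the figures cited above (approximate).
fcf_2025 = 43.6                         # $B, reported, down 16% from 2024
capex_2025 = 72.0                       # $B
capex_2026 = (115.0, 135.0)             # $B, company guidance range

implied_ocf_2025 = fcf_2025 + capex_2025    # ~$115.6B if FCF ≈ OCF - capex
for capex in capex_2026:
    needed_ocf = fcf_2025 + capex           # OCF needed just to hold FCF flat
    growth = needed_ocf / implied_ocf_2025 - 1
    print(f"Capex ${capex:.0f}B: operating cash flow must grow ~{growth:.0%} "
          f"to keep free cash flow at ${fcf_2025}B")
# Roughly +37% to +54% operating cash flow growth just to stand still, which is
# why further shrinkage in free cash flow looks like a good bet.
```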

The bottom line is that for both companies, growth is important, but the cost of that growth is even more important.

Tesla’s Pivot 

Will Tesla sell cars a decade from now? At this rate, I’m not completely sure. 

Elon Musk said on the electric automaker’s analyst call Wednesday that Tesla is ending production of its Model S and Model X luxury vehicle models, neither of which was selling well. The vast majority of the cars Tesla now sells are its Model 3 and Model Y models. Tesla will convert the California production lines where it’s been making the S and X into a facility for Optimus robots, eventually churning out a million bots per year, Musk said. (See a full write-up of the results here, but suffice it to say revenue dipped, as was expected from a previously reported sales drop.)

At the same time, Musk and other executives repeatedly talked up Tesla’s autonomous driving software and its nascent Robotaxi service. At several points in the call, Tesla executives made the company sound more like a subscription software business than a car company. Tesla is a “transportation as a service company,” according to Lars Moravy, vehicle engineering vice president, while “autonomous software will be the driver for growth” in the future rather than vehicle sales, according to Vaibhav Taneja, chief financial officer. 

What’s more, Tesla is about to stop letting vehicle owners buy lifetime access to its supervised Full Self-Driving software. Starting Feb. 14, drivers will only have the option to subscribe to FSD by paying a monthly fee. So why not go all the way and become a car-as-a-service company? Eventually, I wouldn’t be shocked if the only ways to get in a Tesla become hailing a Robotaxi or signing a lease for a car that includes a self-driving subscription. 

Elsewhere on the call, Musk was asked why Tesla is investing $2 billion in another Musk company, xAI. He gave a typically Muskian answer: xAI’s Grok models will be the “orchestra conductor” that will manage future workforces of Optimus robots. Again, that’s not exactly selling cars!—Theo Wayt

In Other News

Amazon announced on Wednesday that it was laying off 16,000 employees. The layoffs follow an earlier round in October, when the company said it was laying off approximately 14,000 corporate employees.

China has approved imports of the first batch of Nvidia’s H200 AI chips, according to Reuters. The approval was granted as Nvidia CEO Jensen Huang visits China this week, after weeks of uncertainty over whether, and at what scale, Beijing would allow the powerful chips in.

ServiceNow shares dipped 7% in after-hours trading after its fourth-quarter earnings, possibly because investors weren’t impressed by the enterprise software provider’s revenue growth forecast for its current fiscal year, which would be the same as its recently concluded one. ServiceNow is forecasting subscription sales growth between 20% and 21% for its current fiscal year, after growing 21% in its recently concluded year.

 

Thursday, January 22, 2026

On Our "Virtual Route 99" With #RandomThoughts For the Week

 


People probably thought Marcus Aurelius was strange. The time he spent alone in his room. The long walks he took by himself. We know they thought it was strange that he was seen reading and writing in the Colosseum, ignoring the carnage of the games below.

“The world today does not understand, in either man or woman,” Anne Morrow Lindbergh writes in Gift from the Sea, “the need to be alone.” Perhaps we ourselves don’t understand it. We don’t quite see the point. Or as much as we enjoy it, we don’t see it as much of a priority. As we discussed over at Daily Dad in an email recently, parents will manage to make time for so many things…but quiet time by or for themselves is written off as an impossible indulgence.

“Actually,” Lindbergh writes, “these are among the most important times in one’s life—when one is alone. Certain springs are tapped only when we are alone. The artist knows he must be alone to create; the writer to work out his thoughts; the musician, to compose; the saint, to pray.” There would be no Meditations without this quiet solitude, but more alarming, there would have been no Marcus Aurelius, either.

He had to take the time to retreat into his own soul, as he said, to rejoice in perfect stillness. He needed to step away. He needed to reflect and evaluate, prepare and anticipate. He was an extremely busy man with an endless amount of demands on his person and his schedule. But he insisted on stillness, because he knew it was the key to his health and his happiness—and his leadership depended on it.

The same is true for you.

Tuesday, January 20, 2026

On Our "Virtual Route 99" (Weekly Edition): On Artificial Intelligence, Gartner & Developments on the Tech Scene

Dana Point, California (Copyright the Daily Outsider January 2026) 

For this special edition of the "Virtual Route 99", our team chose Vinod Khosla as he reflects upon Artificial Intelligence, alongside Gartner's Top 10 Technology Trends, along with tech developments courtesy of the team at TechExpresso





💥 OpenAI launches ChatGPT Translate to rival Google Translate LINK
  • OpenAI has released ChatGPT Translate, a new standalone translation tool that works like Google Translate but adds AI-powered features to help users adjust their translated text for different purposes.
  • The tool offers one-tap prompt options that let users reshape translations to sound more fluent, formal, simple for children, or suited for academic readers, then opens ChatGPT for deeper changes.
  • ChatGPT Translate currently lacks key features Google offers, including document uploads, website translation, real-time conversations, and broad language support, with Google supporting over 50 more languages in total.
🚫 X blocks Grok from generating sexualized images of real people LINK
  • X has blocked Grok from generating sexualized images of real people after users exploited the AI chatbot to create non-consensual explicit pictures, including images of minors, often by tagging Grok directly under photos posted on the platform.
  • The company now restricts image creation and editing through Grok to paid subscribers only, and has added geoblocking in places where generating images of real people in bikinis or underwear is illegal.
  • Despite these changes, Grok can still remove or alter clothing from uploaded photos, and regulators in California, the UK, Australia, and several other countries have opened investigations into xAI over potential law violations.
🔄 Two Thinking Machines Lab cofounders return to OpenAI LINK
  • Two cofounders of Thinking Machines Lab, the AI startup led by former OpenAI CTO Mira Murati, are returning to OpenAI, along with a third key employee named Sam Schoenholz.
  • Thinking Machines Lab fired cofounder Barret Zoph before his departure, citing "unethical conduct" and sharing "confidential company information with competitors," though OpenAI CEO Fidji Simo says she has no such concerns.
  • The startup has now lost half its cofounders while reportedly trying to raise funds at a $50 billion to $60 billion valuation, which could hurt both investor interest and future recruiting efforts.
📚 Microsoft and Meta pay Wikipedia for AI training data LINK
  • Microsoft, Meta, Amazon, and AI startups Perplexity and Mistral AI are now paying Wikipedia through its enterprise product to access content for training their AI models.
  • Tech companies scraping Wikipedia's 65 million articles for free has increased server demand and costs for the non-profit, which relies mainly on small public donations to operate.
  • The Wikimedia Foundation spent time developing the right features to move companies from free access to a paid commercial platform that handles their large-scale AI training needs.
📰 Digg relaunches as new Reddit competitor LINK
  • Digg, the early internet community that once competed with Reddit, is relaunching as a public beta on Wednesday under the ownership of its original founder Kevin Rose and Reddit co-founder Alexis Ohanian.
  • The new platform will use zero-knowledge proofs and signals from mobile devices to verify real users and build trust, aiming to address social media toxicity and prevent AI bots from taking over.
  • Anyone can now join and create their own communities on any topic, with community managers able to set rules and share moderation logs publicly so members can see what decisions are made.
💰 OpenAI signs $10 billion computing deal with Cerebras LINK
  • OpenAI has signed a multi-year deal worth over $10 billion with AI chipmaker Cerebras, which will provide 750 megawatts of compute power to the company starting this year through 2028.
  • Both companies say the partnership will deliver faster outputs for OpenAI's customers, with Cerebras adding a dedicated low-latency inference solution that speeds up responses requiring more processing time.
  • Cerebras, which claims its AI-specific chips are faster than GPU-based systems like Nvidia's, has delayed its 2024 IPO while reportedly seeking another billion dollars at a $22 billion valuation.