
Sunday Robotics Secures $165M Series B: Solving the Household Generalization Problem

The Sunday Robotics Tech Stack

  • 🧤 Skill Capture Glove: High-fidelity haptic and proprioceptive data collection from human demonstrations.
  • 👁️ VLM Reasoning: 12B-parameter Vision-Language Model for high-level task planning and semantic understanding.
  • 🔄 Sim-to-Real+: Proprietary reinforcement learning loop that bridges the gap between digital training and physical execution.
  • 🔋 Edge Efficiency: Native execution on custom NPUs, allowing a 45W operational power draw during active tasks.

Household robotics is entering its "GPT moment." Sunday Robotics has announced a massive $165 million Series B funding round to commercialize its autonomous home assistants, leveraging a unique "Skill Capture" methodology that sidesteps the data bottlenecks of traditional robotics.

The Data Problem in Home Robotics

While robots have excelled in factories for decades, the unstructured environment of a typical home—with its messy counters, varied lighting, and unpredictable pets—has remained an unsolved challenge. Traditional training requires millions of hours of simulation, which often fails to capture the "feel" of delicate tasks like handling eggs or folding silk.

The Innovation: The Skill Capture Glove

Sunday Robotics' breakthrough is the Skill Capture Glove. Instead of relying on reinforcement learning in simulation alone, human "Teachers" wear these gloves while performing tasks. The gloves capture 1,000 data points per second, including pressure, orientation, and subtle muscular adjustments. This proprioceptive data is then fed into a transformer-based model, allowing the robot to "feel" the physics of a task before it ever attempts it.
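To make the pipeline concrete, here is a minimal sketch of what one glove sample and its flattening into a model input might look like. The schema, field names, and channel counts are illustrative assumptions, not Sunday Robotics' actual format.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical schema for one glove reading at 1 kHz (1,000 samples/second).
# Field names and dimensions are assumptions for illustration only.
@dataclass
class GloveSample:
    timestamp_us: int                 # microsecond timestamp; ~1000 us apart at 1 kHz
    fingertip_pressure: List[float]   # pressure per fingertip (5 values)
    orientation_quat: List[float]     # wrist orientation quaternion (w, x, y, z)
    joint_angles: List[float]         # finger joint angles in radians
    emg_activation: List[float]       # normalized muscular-activation channels

def to_feature_vector(s: GloveSample) -> List[float]:
    """Flatten one sample into the feature vector a transformer could consume."""
    return (s.fingertip_pressure + s.orientation_quat
            + s.joint_angles + s.emg_activation)
```

A sequence of these vectors, windowed over time, is the kind of input a transformer-based policy would attend over.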

Vision-Language Models (VLMs) for Reasoning

The robot doesn't just copy movements; it understands intent. By integrating a multi-modal VLM, a user can say, "Clean up the red wine spill but don't use the good napkins." The model translates this natural language into a series of visual sub-goals, identifies the spill, locates a rag, and executes the physical movements captured via the Teacher sessions.
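The instruction-to-sub-goal decomposition described above might look something like the following sketch. The `SubGoal` schema and the hard-coded plan are hypothetical; a real system would query the VLM rather than return a fixed list.

```python
from dataclasses import dataclass
from typing import List

# Illustrative sketch of a VLM planner's output: each sub-goal pairs a
# visually verifiable condition with a low-level skill learned from
# Teacher sessions. Schema and skill names are assumptions.
@dataclass
class SubGoal:
    description: str   # what the robot should visually verify
    skill: str         # low-level manipulation skill to invoke

def plan_cleanup(instruction: str) -> List[SubGoal]:
    # A real planner would condition on the instruction and camera feed;
    # here we hard-code the article's example for clarity.
    return [
        SubGoal("locate the red wine spill on the table", "scan_surface"),
        SubGoal("select a rag, avoiding the good napkins", "grasp_cloth"),
        SubGoal("wipe the spill until the surface reads clean", "wipe_motion"),
    ]
```

The key design point is that each sub-goal is checkable from vision, so the robot can verify progress between skill executions.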


Performance Benchmarks

In pilot tests conducted across 500 varied household layouts, Sunday's "Sunday v2" hardware achieved staggering results:

  • Task Generalization: 92% success rate in kitchen tasks (unloading dishwashers, sorting recycling) in homes the robot had never seen before.
  • Tactile Sensitivity: Successfully handled objects as fragile as a grape and as heavy as a 10 lb cast-iron skillet without recalibration.
  • Battery Endurance: 6 hours of continuous operation per charge, with a 30-minute "Boost Charge" capability to 80%.

Architectural Deep-Dive: The "Proprioceptive Transformer"

Technically, Sunday Robotics utilizes what they call a Proprioceptive Transformer (PT). Unlike standard LLMs that predict the next token, the PT predicts the next set of motor torques based on a window of historical tactile feedback and current visual input. This creates a closed-loop system that can adjust in real-time if a glass slips or a table is bumped.
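The closed-loop structure described above can be sketched as a rolling window of tactile feedback feeding a torque predictor. The `predict_torques` placeholder stands in for the Proprioceptive Transformer itself (its architecture is not public); the windowing and control loop are the point of the example, and the 7-joint arm and window size are assumptions.

```python
import collections
from typing import Deque, List

WINDOW = 50   # assumed length of the tactile-history window, in samples
NUM_JOINTS = 7  # assumed arm degrees of freedom

def predict_torques(tactile_window: List[List[float]],
                    vision_features: List[float]) -> List[float]:
    """Placeholder for the Proprioceptive Transformer: maps tactile history
    plus current vision features to the next torque command per joint."""
    latest = tactile_window[-1]
    grip = sum(latest) / max(len(latest), 1)  # toy proxy for grip state
    return [grip * 0.1] * NUM_JOINTS

def control_step(history: Deque[List[float]],
                 tactile_now: List[float],
                 vision_features: List[float]) -> List[float]:
    # Newest feedback enters the window; oldest falls off automatically,
    # so a slipping glass changes the command within one step.
    history.append(tactile_now)
    return predict_torques(list(history), vision_features)

history: Deque[List[float]] = collections.deque(maxlen=WINDOW)
```

Because the window updates every control tick, a disturbance (a bumped table, a slipping glass) shows up in the tactile history immediately and alters the very next torque prediction, which is what makes the loop "closed."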

Conclusion: Towards a General Purpose Home OS

Sunday Robotics isn't just building a robot; they are building a Physical OS. With $165M in the bank, the goal is to drive down hardware costs to under $20,000 per unit, making general-purpose household help a reality for the upper-middle class by late 2027. The era of the "Robot-as-a-Service" for chores has officially begun.

For more on how AI is interacting with the physical world, read about the Tesla Optimus Gen 3.