Transcript
And as for the secret to their success, Figure employs a cutting-edge strategy of deploying its humanoids in simulated environments known as physical digital twins. These are virtual replicas of customer sites that allow the robots to refine their movements and decision-making, hardening their operational readiness before live deployments. But even more important is the Figure 02’s advanced artificial intelligence, which has been fine-tuned with custom models developed in collaboration with OpenAI, letting its onboard microphones and speakers carry on seamless one-on-one conversations for extremely natural and intuitive interactions.
And while it’s not quite the personal confidant of the future just yet, it does provide a level of conversational AI comparable to the leading GPT models, making it approachable for both business users and homeowners out of the box. Its hardware is just as impressive: the robot’s vision system is powered by a vision-language model that interprets data from a total of six RGB cameras. For forward vision, two of these cameras are embedded in its animated face, giving the robot a more human-like appearance, while the others are strategically placed around its body, allowing the robot to create a 360-degree map of its surroundings, recognize objects, and make informed decisions, all completely autonomously.
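Figure has not detailed how those six camera feeds are actually fused, but the general pattern of merging per-camera detections into a single 360-degree map can be sketched roughly as follows; the camera names, yaw offsets, and the detect_objects stub are hypothetical placeholders, not Figure's real layout or model.

# Hypothetical sketch: fuse detections from six body cameras into one
# 360-degree map keyed by robot-frame bearing. Camera yaws and the
# detect_objects() stub are placeholders, not Figure's actual design.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # e.g. "box", as named by the vision-language model
    bearing_deg: float  # bearing relative to that camera's optical axis
    range_m: float      # estimated distance to the object

# Assumed mounting yaws for six cameras (two forward in the face, four on the body).
CAMERA_YAWS_DEG = {"face_left": -15, "face_right": 15, "right": -90,
                   "left": 90, "rear_right": -165, "rear_left": 165}

def detect_objects(camera_name):
    """Placeholder for running the onboard vision-language model on one frame."""
    return [Detection("box", bearing_deg=5.0, range_m=2.1)]  # dummy output

def build_surround_map():
    surround = []
    for name, yaw in CAMERA_YAWS_DEG.items():
        for det in detect_objects(name):
            world_bearing = (yaw + det.bearing_deg) % 360  # rotate into the robot frame
            surround.append((world_bearing, det.range_m, det.label))
    return sorted(surround)  # one 360-degree view, ordered by bearing

print(build_surround_map())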
And the robot’s physical capabilities are equally impressive, with its sleek, gunmetal-gray exterior concealing a remarkable number of engineering feats, from its high-torque joints to its ultra-dexterous hands. In fact, each shoulder joint delivers 50 Nm of torque with a 148-degree range of motion, while its knee and hip joints offer even greater power and flexibility, enabling the robot to handle complex tasks like lifting, carrying, and climbing stairs with a heavy load in its hands. Its hands are the robot’s greatest engineering feat, featuring 16 degrees of freedom and having been specially upgraded to perform a wide range of human tasks, from delicate operations like picking up fragile objects to strength-based maneuvers that require human-equivalent power.
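For a rough sense of what 50 Nm at the shoulder means in practice, here is a quick back-of-the-envelope calculation; the 0.6-meter effective arm length is an assumed value for illustration, not a published specification.

# Hedged static estimate of holding force from the published 50 Nm shoulder torque.
# The 0.6 m effective lever arm is an assumption, not a spec from Figure.
SHOULDER_TORQUE_NM = 50.0   # published shoulder torque
ARM_LENGTH_M = 0.6          # assumed effective lever arm, fully extended
GRAVITY_MS2 = 9.81

force_n = SHOULDER_TORQUE_NM / ARM_LENGTH_M   # torque = force * lever arm
mass_kg = force_n / GRAVITY_MS2               # equivalent mass held statically
print(f"~{force_n:.0f} N at the hand, roughly {mass_kg:.1f} kg held statically per arm")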
And most important is its extreme computational power, boasting a threefold increase over its predecessor that enables the robot to process complex AI tasks entirely autonomously. Plus, this leap in performance is complemented by a custom 2.25 kWh battery pack housed in its torso, which promises over 50% more energy delivery. And while Figure has not disclosed the exact runtime per charge, the gains in energy efficiency suggest the robot will be capable of multiple hours of precise physical labor.
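Since no runtime has been disclosed, the following is only a hedged back-of-the-envelope estimate from the 2.25 kWh figure; the average power draws are assumptions chosen purely for illustration.

# Rough runtime estimates from the published 2.25 kWh pack.
# The average power draws below are assumptions for illustration only;
# Figure has not disclosed actual consumption.
BATTERY_KWH = 2.25

for avg_draw_w in (300, 450, 600):                 # assumed average draw while working
    hours = BATTERY_KWH * 1000 / avg_draw_w        # energy / power = time
    print(f"At ~{avg_draw_w} W average draw: ~{hours:.1f} h of runtime")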
But the Astribot S1 robot is also making waves with its newest ability to prepare coffee fully autonomously, leveraging the breakthrough π0 model from Physical Intelligence. In fact, this advancement aims to create a foundation for robots to perform any task, for an upcoming era of runaway robot learning. As for specifications, the S1 stands at 5 feet 8 inches and showcases superhuman precision, achieving repeatability within just 0.03 millimeters, a stark contrast to human accuracy of about 1 millimeter, with this precision being crucial for real-world tasks that demand meticulous detail and dexterity. And having been trained on the Open X-Embodiment dataset, the largest open-source robot dataset, featuring over a million real robot trajectories across 22 different robot types, the S1 is well-equipped to handle a growing variety of tasks, including everything from single-arm operations to complex bimanual tasks and quadruped movements.
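To give a sense of what training on that dataset involves, here is a minimal sketch of streaming episodes from one Open X-Embodiment sub-dataset in its public RLDS/TFDS release; the GCS path, sub-dataset name, and version follow the project's published examples but should be treated as assumptions, and this is not Astribot's actual training pipeline.

# Minimal sketch: load one Open X-Embodiment sub-dataset (RLDS format) via
# tensorflow_datasets. The builder directory below is taken from the project's
# public examples and is an assumption here; swap in the sub-dataset you need.
import tensorflow_datasets as tfds

builder = tfds.builder_from_directory(
    builder_dir="gs://gresearch/robotics/fractal20220817_data/0.1.0"  # assumed path
)
episodes = builder.as_dataset(split="train[:10]")  # a handful of episodes for inspection

for episode in episodes:
    for step in episode["steps"]:  # RLDS: each episode is an ordered sequence of steps
        observation, action = step["observation"], step["action"]
        # ... feed (observation, action) pairs into an imitation-learning pipeline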
By automating everyday chores such as cleaning, cooking, and general household management, the Astribot S1 promises to transform the way consumers manage their homes. Priced at just under $100,000, the S1 is a significant investment, but as more robots enter the market, prices are expected to fall to roughly that of a used car, making robotic assistance increasingly accessible. And AI is transforming more than just robots, as Microsoft just announced a series of groundbreaking updates aimed at boosting productivity, with the first of these being Copilot Actions, a feature currently in private testing that simplifies task automation across Microsoft 365 apps through natural language prompts.
In fact, by eliminating the complexity of traditional manual configurations, Copilot Actions enables users to delegate routine tasks like generating meeting summaries, creating weekly reports, or compiling missed communication updates after time away from work. Building on this capability, Microsoft is also introducing specialized AI assistants powered by Copilot Actions. A SharePoint Assistant will search files and folders to answer user-specific questions, while a Teams Translation Assistant, set for public testing in early 2025, will provide real-time multilingual translations and even replicate participants’ voices for seamless communication. Additionally, other assistants in development include an employee self-service agent for HR and IT support and tools for managing projects and taking detailed meeting notes.
Plus, to address the complexities of hybrid work, Microsoft also just unveiled Places, an AI-powered platform designed to optimize team collaboration and office resource management. Integrated with Microsoft 365, Places helps teams coordinate in-person meetings by analyzing co-workers’ schedules and suggesting optimal times for face-to-face collaboration, and the system even integrates with Teams to display which colleagues are in the same building, encouraging spontaneous in-person interactions. For managers, Places offers tools to streamline office planning by analyzing attendance patterns, enabling smarter space utilization and cost reduction, with Microsoft emphasizing that Places will help businesses create more efficient and sustainable office environments overall.
Meanwhile, for developers, Microsoft introduced Azure AI Foundry, a user-friendly interface for designing AI workflows, with the platform’s software development kit now available for testing and its upcoming Azure AI Agent Service planned for release soon. And finally, the AI-powered answer engine Perplexity now includes direct shopping capabilities to simplify online purchases, allowing users to search for products, view AI-generated recommendations, and complete transactions seamlessly. In fact, the platform connects to e-commerce sites like Shopify, aggregating and analyzing product reviews based on user queries. And for US subscribers, the new Buy with Pro feature enables one-click checkout with free shipping once payment and shipping details are added, while products outside this system redirect users to merchant websites to complete their purchases instead.
On top of this, Perplexity also just released a new feature called Snap to Shop, allowing users to upload photos of items to find similar products, even without knowing their names (a generic sketch of how photo-based product matching typically works follows below). Importantly, the company emphasizes that its product recommendations aren’t sponsored, ensuring unbiased results. And for merchants, a dedicated program enables direct integration of product data, increasing visibility in search results and supporting one-click checkout. Sellers also gain free access to Perplexity’s analytics, providing insights into shopping trends. Anyways, like and subscribe to be spared by our future ASI rulers.
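Perplexity hasn’t described Snap to Shop’s internals, so the following is only a minimal, generic sketch of embedding-based visual product search using open-source CLIP, not Perplexity’s actual pipeline; the catalog and query image paths are placeholders.

# Generic visual-similarity sketch with open-source CLIP embeddings.
# This illustrates the general idea behind photo-based product search;
# it is NOT Perplexity's implementation. Image paths are placeholders.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

def embed(paths):
    images = [Image.open(p) for p in paths]
    inputs = processor(images=images, return_tensors="pt")
    with torch.no_grad():
        feats = model.get_image_features(**inputs)
    return torch.nn.functional.normalize(feats, dim=-1)  # unit vectors for cosine similarity

catalog_paths = ["catalog/mug.jpg", "catalog/lamp.jpg", "catalog/chair.jpg"]  # placeholder catalog
catalog = embed(catalog_paths)
query = embed(["snapshot.jpg"])                 # the user's uploaded photo

scores = (query @ catalog.T).squeeze(0)         # cosine similarity to each catalog item
best = scores.argmax().item()
print(f"Closest catalog item: {catalog_paths[best]} (score {scores[best]:.3f})")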