Transcript
Boston Dynamics just unveiled its newest Atlas humanoid robot, an all-electric version designed to push the boundaries of what intelligent machines can do. Plus, this cutting-edge machine is born from a partnership with Hyundai, promising to be stronger, more agile, and more dexterous than its predecessors, with huge implications to match. But the unveiling of the electric Atlas also marks a turning point in Boston Dynamics' pursuit of the limits of whole-body mobility and bimanual manipulation.
In fact, for over a decade, the company has remained at the forefront of robotics, continuously raising the bar with each new machine and use case. From the legendary PETMAN, which tested protective clothing, to the recently retired hydraulic Atlas that performed amazing parkour feats, Boston Dynamics has consistently set new standards for the state of the art in humanoid robotics. And the latest incarnation of Atlas is more than just a few minor improvements.
Powered by electric motors instead of hydraulics, the new Atlas boasts unprecedented strength and an expanded range of motion, allowing it to tackle tasks that would challenge most human workers. But that's just the beginning, because what truly sets the electric Atlas apart is its ability to seamlessly integrate cutting-edge artificial intelligence into its operation, such as reinforcement learning and computer vision. Plus, by harnessing the latest methods in these powerful intelligence paradigms, this new robot can adapt to complex real-world situations with relative ease, learning and evolving with every task it undertakes.
This fusion of advanced hardware and software promises to unlock a world of possibilities, enabling Atlas to tackle the dull, dirty, and dangerous jobs that have long been the domain of human workers. And Boston Dynamics isn't just focused on creating a superior robot, as it's also building an entire ecosystem to support it. Specifically, the company's recently introduced Orbit software serves as a central platform for managing robot fleets, location maps, and other vital data, ensuring seamless integration and efficient operation in a wide range of industrial and commercial settings.
Most importantly, the introduction of the electric Atlas is a direct challenge to other providers of humanoid robots, such as Elon Musk's highly anticipated Tesla Optimus robot. And as the humanoid robotics revolution swings into full gear, this competition isn't just pushing the envelope on the technology; it's also bringing down the price of each unit, meaning the barrier to entry for brand-new competitors keeps shrinking.
And as the electric Atlas prepares for real-world testing and tasks alongside Hyundai and its other partners, the company will likely deploy these humanoids first in nearby manufacturing plants. Nevertheless, Boston Dynamics is raising the bar and, more importantly, keeping up with the increasingly difficult goal of delivering intelligent robots uniquely capable of tackling the tasks that humans can't or shouldn't. But what can these robots accomplish without the ability to generalize? Towards this end, researchers from Stanford University and Google DeepMind have just unveiled remarkably generalizable robots in an unprecedented demo of dexterity, which could finally enable the first-ever self-sustaining robotic systems.
In fact, the ALOHA Unleashed project is pushing the limits of dexterity and task complexity with its new fleet of advanced ALOHA 2 manipulator arms. And of course, these new robot arms outperform their predecessors in terms of durability and performance. Plus, they've been meticulously upgraded to enable fleet-scale data collection on more complex tasks. Further enhancements to the ALOHA 2 include replacing the traditional scissor grippers with low-friction rails for improved grasping, integrating smaller RealSense cameras with a broader field of view, and incorporating a passive gravity compensation setup and an overhead vision camera.
Additionally, the team developed a new simulation model with improved accuracy and visuals, further enhancing the learning capabilities of these remarkable machines. In fact, for the past year, the researchers have been demonstrating the impressive feats these robots can achieve, culminating in a series of awe-inspiring videos that showcase the ALOHA 2's ability to perform intricate tasks autonomously, adapting to slips and placement issues in real time with remarkable dexterity.
Among the most impressive feats demonstrated is the ability of these AI-powered robots to repair and maintain other robots. For example, the manipulator can be seen assisting DeepMind's SARA-RT model by seamlessly slotting in a replacement gripper or finger. This demonstration may even suggest that a future where robots can autonomously repair and maintain themselves is no longer a distant dream, but rather a rapidly approaching reality. But the ALOHA Unleashed project's accomplishments didn't stop there, as the researchers also tackled two of the most dexterous tasks: tying shoelaces and manipulating clothing.
And despite not having trained on those exact clothing items, the AI-powered robots proved their abilities by successfully generalizing and completing each task with coordination and precision. In another impressive demonstration, the ALOHA manipulators were tasked with hanging a shirt on a nearby rack. But the training policy only accounted for differently colored items and didn't include any examples of adult shirts. Undeterred, one of the researchers removed their sweater and tossed it into the work zone, and the model seamlessly generalized and completed the task, showcasing the remarkable adaptability of these AI-powered systems.
But the implications of this breakthrough are more far-reaching than doing your laundry. In just the next year or two, autonomous robotic maintenance and repair capabilities could revolutionize industries ranging from manufacturing to exploration, where self-sustaining and self-repairing robotic systems could unlock new frontiers of efficiency, safety, and cost-effectiveness. As the ALOHA Unleashed project continues to push the boundaries of what is possible, the world watches in awe, eagerly anticipating the next groundbreaking achievement from these pioneering researchers.
With each remarkable feat, the future of robotics grows brighter, and the dream of truly autonomous, self-sustaining robotic systems inches ever closer to reality. Finally, the world of AI-generated art is about to experience a seismic shift as Stability AI announces the availability of Stable Diffusion 3 via API, and the forthcoming release of the model weights for self-hosting. This groundbreaking development promises to democratize access to cutting-edge image generation capabilities, empowering artists, creators, and innovators alike.
Through a strategic partnership with Fireworks AI, Stability AI has made the new Stable Diffusion 3 model and its Turbo version accessible via the Stability AI developer platform API. This collaboration ensures an enterprise-grade API solution guaranteeing 99.9% service availability, thereby providing users with a reliable and robust platform for their creative endeavors. According to Stability AI, Stable Diffusion 3 boasts performance on par with leading image models such as DALL·E 3 and Midjourney, while offering distinct advantages in prompt following and typography.
This remarkable feat is achieved through a novel multimodal diffusion transformer that utilizes separate weight sets to represent images and text, enabling unprecedented control and fidelity in the creative process. Furthermore, the company is inviting a select group of users to participate in the Stable Assistant beta pre-release. This limited release will grant participants access to Stability AI's newest image and language models, allowing them to create content using the combined power of Stable Diffusion 3 and advanced language processing capabilities.
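To make the "separate weight sets" idea concrete, here is a toy NumPy sketch of one multimodal attention step: each modality gets its own query/key/value projection weights, but attention runs jointly over the concatenated sequence so image and text tokens can exchange information. The dimensions, token counts, and single-step setup are purely illustrative assumptions, not the actual MMDiT implementation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def proj_weights(rng, d):
    """One modality's own Q/K/V projection matrices (the 'separate weight set')."""
    return [rng.normal(size=(d, d)) for _ in range(3)]

rng = np.random.default_rng(0)
d = 16                                # embedding dimension (illustrative)
img = rng.normal(size=(4, d))         # stand-in image patch embeddings
txt = rng.normal(size=(3, d))         # stand-in text token embeddings

# Separate weight sets: images and text are projected independently.
Wq_i, Wk_i, Wv_i = proj_weights(rng, d)
Wq_t, Wk_t, Wv_t = proj_weights(rng, d)

# Joint attention: projected tokens are concatenated into one sequence,
# so every image token can attend to every text token and vice versa.
q = np.concatenate([img @ Wq_i, txt @ Wq_t])
k = np.concatenate([img @ Wk_i, txt @ Wk_t])
v = np.concatenate([img @ Wv_i, txt @ Wv_t])
out = softmax(q @ k.T / np.sqrt(d)) @ v

print(out.shape)  # (7, 16): 4 image + 3 text tokens, jointly mixed
```

The point of the design is that each modality keeps representations suited to its own statistics while still being fused at every attention layer, which is what the announcement credits for the improved prompt following and typography.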
And to top it all off, Stability AI claims it has taken reasonable steps towards making the API safe against bad actors.
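For readers who want to try the API access described above, here is a minimal sketch of calling Stable Diffusion 3 through Stability AI's developer platform with the `requests` library. The endpoint path, form-field names, and model identifiers below are assumptions based on the announcement, not details given in this transcript, so check the official API reference before relying on them.

```python
import requests

# Assumed endpoint for SD3 generation on the Stability developer platform.
API_URL = "https://api.stability.ai/v2beta/stable-image/generate/sd3"

def build_request(prompt: str, api_key: str, model: str = "sd3") -> dict:
    """Assemble headers and multipart form fields for one generation call."""
    return {
        "headers": {
            "Authorization": f"Bearer {api_key}",  # key from the developer platform
            "Accept": "image/*",                   # ask for raw image bytes back
        },
        "files": {"none": (None, "")},             # force multipart/form-data encoding
        "data": {
            "prompt": prompt,
            "model": model,          # "sd3" or the Turbo variant (assumed names)
            "output_format": "png",
        },
    }

def generate(prompt: str, api_key: str) -> bytes:
    """Send the request and return the image bytes (raises on HTTP errors)."""
    req = build_request(prompt, api_key)
    resp = requests.post(API_URL, **req)
    resp.raise_for_status()
    return resp.content
```

Usage would look like `open("out.png", "wb").write(generate("a humanoid robot in a factory", my_key))`, with `my_key` being your own platform API key.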