

Transcript
So this thing is not moving super fast, but it is able to take voice commands and generalize. She says, “Please clean up the spill too,” and it can execute. Again, at 6x speed, if you’re actually timing this thing, you can see it’s not quite at the threshold of commercial viability yet, but it is definitely getting there. And this isn’t about doing new tasks; it’s more about completing tasks in unseen environments. For instance, here you can see it’s told to clean up the room or make the bed, while in other examples it’s asked to clean up a bedroom, put a plate in a drawer, put away a spatula, and, in a fourth, clean up the bedroom.
And here you can see it operating at 10x, so let’s time how long it actually takes to execute these different tasks. The problem is that these uncut videos switch between different speeds, so it’s hard to gauge, but roughly each of these tasks takes about 30 seconds at 7 to 10x speed, which works out to somewhere around three and a half to five minutes in real time. So this is something you could probably tell to do different tasks while you were away from home, like cleaning something up, and it would work in the background until you got back.
But I’m about to walk you through the seven features that make this all possible, because the problem before was that there wasn’t enough training data to actually make these things work. Now that you’ve seen some basic demos, let’s start with feature 1: co-training on heterogeneous data. pi.5 learns from all different types of data sources, including web data like image captioning and object detection as well as verbal instructions from humans, and this co-training teaches the robot not just how to move its arms but also how to understand the context of a task.
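The actual pi.5 data pipeline isn’t shown in the video, but the co-training idea can be sketched in a few lines: a weighted sampler that mixes web vision-language examples, robot trajectories, and verbal instructions into each training batch. Everything here, including the data entries, the mixture weights, and the `sample_batch` helper, is invented for illustration.

```python
import random

# Hypothetical examples of the heterogeneous sources mentioned in the video.
WEB_DATA = [  # image-captioning / object-detection style examples
    {"type": "caption", "image": "img_001.jpg", "text": "dirty dishes in a sink"},
    {"type": "detection", "image": "img_002.jpg", "labels": ["mug", "sponge"]},
]
ROBOT_DATA = [  # action trajectories, possibly from other robot embodiments
    {"type": "trajectory", "instruction": "put the mug away", "actions": [0.1, -0.2, 0.05]},
    {"type": "trajectory", "instruction": "stack the plates", "actions": [0.0, 0.3, -0.1]},
]
VERBAL_DATA = [  # step-by-step verbal coaching from humans
    {"type": "instruction", "text": "pick up the plate, then put it in the sink"},
]

def sample_batch(batch_size, weights=(0.4, 0.4, 0.2), seed=None):
    """Draw one co-training batch that mixes all three sources.

    `weights` sets the fraction of web, robot, and verbal examples;
    the real mixture used by pi.5 is not public, so these are made up.
    """
    rng = random.Random(seed)
    sources = [WEB_DATA, ROBOT_DATA, VERBAL_DATA]
    batch = []
    for _ in range(batch_size):
        source = rng.choices(sources, weights=weights, k=1)[0]
        batch.append(rng.choice(source))
    return batch

batch = sample_batch(8, seed=0)
```

The point of mixing sources in every batch, rather than training on each in sequence, is that the model is forced to learn one representation that serves both perception and control.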
Take picking up a spoon: you may not think about this, but it matters for knowing which objects belong inside a kitchen sink. So, for example, it might learn from a video of a robot stacking plates, a photo labeled “dirty dishes,” or a text prompt saying “put the mug away.” But none of this is any good without feature 2: semantic context understanding. This is where pi.5 doesn’t just follow orders but actually thinks about what they mean, inferring the high-level structure of a task and breaking it down into steps.
For instance, making a bed involves picking up a pillow, then smoothing the sheets, fluffing the blankets, and so on. This semantic know-how lets the robot decide exactly where to put objects, like dirty dishes belonging in a sink instead of a fridge, or infer what tools to use for a task, like picking up a sponge for a spill. It’s like giving the robot a mental checklist for each chore. What could be even more important is feature 3, its ability to transfer this into physical behaviors, and here’s where pi.5 gets clever, because it learns from other robots, even simpler robots that, for example, have just one arm or don’t have any wheels. It can pull these skills from data collected in less diverse settings, like a lab or a factory, and then apply them to cluttered homes.
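The “mental checklist” idea above can be illustrated with a toy. In pi.5 the subtask plan is produced by learned chain-of-thought reasoning, not a lookup table; this hard-coded table, with entirely invented task names, steps, and destinations, only shows the shape of the output.

```python
# Toy illustration of semantic task decomposition. All entries are
# invented: a real model infers these steps with learned reasoning.
TASK_PLANS = {
    "make the bed": ["pick up the pillow", "smooth the sheets", "fluff the blanket"],
    "clean the kitchen": ["collect dirty dishes", "put dishes in the sink", "wipe the counter"],
}

# Where objects semantically belong, or which tool a situation calls for.
OBJECT_KNOWLEDGE = {
    "dirty dish": "sink",
    "spatula": "drawer",
    "spill": "sponge",  # the tool to use, not a destination
}

def decompose(command):
    """Return the subtask checklist for a high-level command.

    Unknown commands pass through unchanged, standing in for the model
    treating them as a single low-level step.
    """
    return TASK_PLANS.get(command, [command])

steps = decompose("make the bed")
```

The interesting part in the real system is that neither the plans nor the object knowledge is hand-written; both fall out of the co-training on web and robot data described earlier.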
For instance, it might use a static robot’s trick for grasping a cup to pick up a mug in a kitchen it has never seen, and this cross-embodiment learning stretches the robot’s abilities without needing endless hours of home-specific data.

Feature number four is verbal instruction following, because none of this would be very good if you weren’t able to actually tell the robot what to do. pi.5 lets the robot listen like a roommate: it can handle natural-language commands ranging from broad prompts like “clean the kitchen” to detailed step-by-step prompts like “pick up the plate and then put it in the sink.” In tests, humans were even able to coach it through tasks step by step, and it followed along, making it user-friendly.

What makes all this possible is feature five, its high-level and low-level control. Unlike older systems that split decision-making and movement, pi.5 does both with the same model. It first decides on the big picture, for instance that it’s been told to wipe the counter, breaks that down with chain-of-thought reasoning, and then translates it into the precise motor commands the robot’s joints need to execute. It uses discrete decoding for its high-level choices and continuous flow matching for its motor control, which keeps everything streamlined and cohesive.

All of this allows for robust generalization in new environments, which is feature number six and basically pi.5’s superpower. In experiments it was able to clean new environments and handle tasks it may never have approached before. In fact, it generalized so well that after training on just 100 different environments, pi.5 almost matched a model that was trained directly on the test environment. This is a huge deal, because it means pi.5 can generalize with very little data: about 400 hours of mobile-manipulation training plus other sources, to be exact.

But the most important feature of all is probably feature number seven: the robot can adapt its policies in new environments, because life is unpredictable. pi.5 can take the policies it already has and adapt to slight changes, for instance a human bumping a dish out of place or spilling something. The video showed that even when humans get in the way, the robot can pick up and continue the task after these little interferences, and it can respond to instructions at different levels of detail. In fact, the team tested pi.5 in two scenarios: full cleaning tasks, like putting dishes in the sink or clearing a bedroom floor, and out-of-distribution challenges, like moving specific objects to a drawer. Success was measured by how many subtasks the robot completed, like getting 80% of objects to the right spots, and by how well it performed with language prompts. What they found was that adding web data boosted its ability to recognize new objects, while robot data from other setups improved its actions.

The reason pi.5 is such a big deal is that it’s a proof of concept for what’s possible in the future. pi.5 is obviously just a test project, and it makes mistakes, like misjudging a task’s semantics or fumbling a motor command, but it points to a new ability for robots to learn from diverse data sources. Moving into the future, the team is already exploring better ways to transfer knowledge and tap into even more varied data sources, and if they succeed, we could see robots not just in homes but also in grocery stores, hospitals, and businesses, adapting to all different types of tasks in the real world. It also means other robots will be able to apply physical intelligence from pi.5, so this is a system that could be applied to business fairly soon. Anyway, keep an eye out for
this robot in the real world, and tell us in the comments how much you would pay for something like this in your home. Like and subscribe, and check out this video here if you want to know more about the latest in AI news.
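The two-level control loop described earlier, discrete decoding for the high-level choice and continuous flow matching for the motor command, can be sketched in miniature. Flow matching generates an action by integrating a learned velocity field from random noise toward the action distribution; in this toy, both the high-level decoder and the velocity field are hand-written stand-ins (the command table, the fixed target action, and all function names are invented), purely to show the shape of the sampling loop.

```python
import random

def high_level_step(command):
    """Stand-in for discrete decoding: map a command to a subtask token.

    In pi.5 this token is produced autoregressively by the same model
    that controls the arms; here it is a hand-written placeholder.
    """
    return {"wipe the counter": "reach_towards_counter"}.get(command, "idle")

def velocity_field(action, t, target):
    """Stand-in for the learned flow-matching velocity field.

    A trained field predicts, at flow time t, the velocity that carries
    a noisy action toward the data distribution. This toy version
    ignores t and simply points straight at a fixed target action.
    """
    return [tgt - a for a, tgt in zip(action, target)]

def sample_action(target, dim=3, steps=10, seed=0):
    """Integrate the velocity field from Gaussian noise with Euler steps."""
    rng = random.Random(seed)
    action = [rng.gauss(0.0, 1.0) for _ in range(dim)]  # start from noise
    dt = 1.0 / steps
    for i in range(steps):
        t = i * dt
        v = velocity_field(action, t, target)
        action = [a + dt * vi for a, vi in zip(action, v)]
    return action

subtask = high_level_step("wipe the counter")
action = sample_action(target=[0.5, -0.2, 0.1])
```

Because the toy field always points at the target, each Euler step shrinks the gap by a factor of 0.9, so after ten steps the sampled action sits much closer to the target than the initial noise did; a trained field would instead carry the noise to a context-dependent action distribution.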

