Optimus, Tesla’s bipedal robot, can now use vision to self-calibrate its limbs.
The company announced the development in a post from its official X account, saying that the robot can now use encoders in its joints, together with vision, to track and locate its limbs in space. Optimus is powered entirely by on-board neural networks, which enable the robot to perform tasks such as sorting objects on its own.
Optimus can now sort objects autonomously.

Its neural network is trained fully end to end: video in, controls out.

Join us to help develop Optimus and improve how it practices yoga.
https://t.co/dBhQqg1qya pic.twitter.com/1Lrh0dru2r
— Tesla Optimus (@Tesla_Optimus) September 23, 2023
The robot’s ability to calibrate dynamically—automatically locating and adjusting its limbs even after parts are moved to a new position—is striking. As the yoga poses at the end of the video show, this also lets the robot move through its full range of motion.
The robot has come a long way from its early demos, which consisted mainly of a person in a robot costume. Updates have been released sporadically since the project was unveiled at AI Day in 2021. In March of this year, Tesla showcased several of the robots moving independently and cooperating to complete tasks quickly.