NIX Solutions: Facebook’s Interactive Virtual Learning Environment

Developers from Facebook AI Research and several other organizations have released Habitat 2.0, a new version of their virtual environment for testing robots. It simulates rooms the robot can interact with: for example, the robot can open containers, take objects out of them, and move them to a new place. N+1 notes that the new environment simulates the robot’s behavior much faster than other publicly available simulators – 55 times faster than real time when running on a single graphics card. A brief description of the environment is available on the Facebook AI site, and the authors have also published a preprint of the paper.

In recent years, machine learning has made a big leap forward in how robots, self-driving cars, and other machines perform tasks on their own. As with human learning, success in machine learning depends on the amount of data and the amount of practice. But for many tasks, gaining that experience requires a lot of resources. For example, one of the main and still not fully solved problems in robotics is grasping objects encountered in everyday life or at work. Objects vary in shape, size, stiffness, reflectivity, and other parameters, so it takes a huge amount of practice and time for a robot to learn a universal grasping skill that applies to all of them. This problem can be addressed by scaling, and a few years ago it was tackled head-on by running several identical robots performing the same task in parallel.

But in recent years, progress in algorithms has made it possible to simulate the behavior of robots in a virtual environment fairly accurately and, more importantly, to transfer the skills learned there to a real device. Given enough computing power, this speeds up learning by several orders of magnitude. The approach is already used for various kinds of machines, including self-driving cars and robotic arms. In 2019, Facebook developers created their Habitat virtual environment to simulate the behavior of domestic robots in realistic interiors. They also created the Replica dataset, which consists of high-precision 3D scans of real rooms in houses, capturing the shape, color, and reflectivity of surfaces. But the environment had a drawback: it was static and only supported certain types of tasks, such as navigation and object recognition.

In the new version, the developers changed their approach: they took one room from the Replica dataset and recreated it with high accuracy as a set of interactive 3D objects, says NIX Solutions. In total, the room contains 92 objects, such as furniture, dishes, and books, each with physical parameters: mass, shape, and surface friction properties. All objects also carry semantic data (what type of object it is) and a simplified shape for collision calculations. Several objects have a model of how their parts move, for example, the refrigerator door opening. By rearranging furniture and other interior items, the developers produced 111 different versions of the room on which the robot can learn.
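To make the description above concrete, here is a minimal sketch of the kind of per-object metadata being described: mass, friction, semantic type, a simplified collision shape, and optional articulation. The class and field names are hypothetical illustrations, not the actual Habitat 2.0 / habitat-sim API or file format.

```python
# Illustrative data model only; names are hypothetical, not the habitat-sim API.
from dataclasses import dataclass
from typing import Optional

@dataclass
class InteractiveObject:
    name: str                        # e.g. "kitchen_refrigerator"
    semantic_category: str           # what type of object this is
    mass_kg: float                   # physical mass used by the dynamics solver
    friction_coefficient: float      # surface friction property
    render_mesh: str                 # high-detail mesh used for rendering
    collision_mesh: str              # simplified shape used for collision checks
    articulated_part: Optional[str] = None  # e.g. a hinged door, if the object has moving parts

# One of the 92 objects in the recreated room could be described like this:
fridge = InteractiveObject(
    name="kitchen_refrigerator",
    semantic_category="refrigerator",
    mass_kg=80.0,
    friction_coefficient=0.6,
    render_mesh="fridge.glb",
    collision_mesh="fridge_convex.glb",
    articulated_part="door_hinge",
)
print(fridge.semantic_category, fridge.mass_kg)
```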

The environment itself has also changed. It can simulate rigid-body interactions and various types of motion, including rotation. Thanks to this, the robot can interact with every element of the room and perform tasks typical of robotic assistants, such as taking a box of food from the refrigerator and bringing it to the table.
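The fridge-to-table task mentioned above is essentially a sequence of high-level skills. The sketch below expresses that decomposition in Python; the Robot class and its methods are hypothetical placeholders for illustration, not the actual Habitat 2.0 task API.

```python
# Hypothetical skill interface; method names are placeholders, not the Habitat API.
class Robot:
    def navigate_to(self, target: str) -> None:
        print(f"navigating to {target}")

    def open(self, container: str) -> None:
        print(f"opening {container}")

    def pick(self, obj: str) -> None:
        print(f"picking up {obj}")

    def place_on(self, surface: str) -> None:
        print(f"placing object on {surface}")

def fetch_food_box(robot: Robot) -> None:
    """Take a box of food from the refrigerator and bring it to the table."""
    robot.navigate_to("refrigerator")
    robot.open("refrigerator")   # relies on the door's articulation model
    robot.pick("food_box")       # rigid-body interaction with an object in the room
    robot.navigate_to("table")
    robot.place_on("table")

fetch_food_box(Robot())
```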

The developers note that they deliberately sacrificed some aspects of the simulation for speed. Among other things, the environment cannot simulate non-rigid (deformable) interactions or fluid behavior, and it does not model the details of wheel contact with the floor. The authors claim that this makes the new virtual environment the fastest of its kind. As an example, they report the simulation speed on a computer with an Intel Xeon Gold 6226R processor and an NVIDIA GeForce RTX 2080 Ti graphics card: when simulating the Fetch robot and its interaction with objects in the environment, the computer calculated 1660 “steps per second” (SPS). Since the environment computes physical interactions at 30 steps per second, 1660 SPS means the simulation runs 55.3 times faster than real time. As before, the developers have published the documentation and code under a free license on GitHub.
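The real-time factor quoted above follows directly from the two numbers in the paragraph; this small snippet reproduces the arithmetic.

```python
# Reproduce the speed-up arithmetic: SPS divided by the 30 Hz physics step rate
# gives how many times faster than real time the simulation runs.
simulated_steps_per_second = 1660   # measured SPS on the reported hardware
physics_step_rate_hz = 30           # each simulated step covers 1/30 of a second

real_time_factor = simulated_steps_per_second / physics_step_rate_hz
print(f"{real_time_factor:.1f}x faster than real time")  # ~55.3x
```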