From today's AI frontier paper picks by 爱可可

[RO] RT-1: Robotics Transformer for Real-World Control at Scale

A Brohan, N Brown, J Carbajal, Y Chebotar…
[Google]

Key points:

  1. Transferring knowledge from large, diverse, task-agnostic datasets makes it possible to solve downstream tasks at a high level of performance;
  2. A key to success in robotics is pairing a high-capacity architecture with open-ended, task-agnostic training;
  3. RT-1 can absorb large amounts of data and generalize to new tasks, environments, objects, and other robot morphologies.

Abstract:

By transferring knowledge from large, diverse, task-agnostic datasets, modern machine learning models can solve specific downstream tasks either zero-shot or with small task-specific datasets to a high level of performance. While this capability has been demonstrated in other fields such as computer vision, natural language processing or speech recognition, it remains to be shown in robotics, where the generalization capabilities of the models are particularly critical due to the difficulty of collecting real-world robotic data. We argue that one of the keys to the success of such general robotic models lies with open-ended task-agnostic training, combined with high-capacity architectures that can absorb all of the diverse, robotic data. In this paper, we present a model class, dubbed Robotics Transformer, that exhibits promising scalable model properties. We verify our conclusions in a study of different model classes and their ability to generalize as a function of the data size, model size, and data diversity based on a large-scale data collection on real robots performing real-world tasks. The project's website and videos can be found at this http URL
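To make the idea of a high-capacity, instruction-conditioned Transformer policy more concrete, below is a minimal PyTorch sketch of that kind of model. It is an illustration only, not the authors' RT-1 implementation: the toy image tokenizer, the single instruction token, the layer sizes, and the 256-bin action discretization are all assumptions made for this example.

```python
# Minimal, illustrative sketch of a Transformer-based robot policy
# (hypothetical; not the RT-1 reference implementation).
import torch
import torch.nn as nn


class ToyRoboticsTransformer(nn.Module):
    def __init__(self, num_action_dims=7, num_bins=256, d_model=256,
                 history_len=6, tokens_per_image=8):
        super().__init__()
        self.tokens_per_image = tokens_per_image
        self.num_action_dims = num_action_dims
        self.num_bins = num_bins
        # Toy image tokenizer: a small CNN that stands in for a large
        # pretrained vision backbone, producing a few tokens per frame.
        self.image_tokenizer = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=8, stride=8),        # (B*t, 32, H/8, W/8)
            nn.ReLU(),
            nn.AdaptiveAvgPool2d((tokens_per_image, 1)),       # (B*t, 32, T, 1)
        )
        self.token_proj = nn.Linear(32, d_model)
        # Instruction conditioning: assume a precomputed 512-d sentence
        # embedding from any text encoder, projected to one token.
        self.instr_proj = nn.Linear(512, d_model)
        self.pos_emb = nn.Parameter(
            torch.zeros(1, history_len * tokens_per_image + 1, d_model))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=8, batch_first=True)
        self.transformer = nn.TransformerEncoder(encoder_layer, num_layers=4)
        # One classification head per action dimension over discretized bins.
        self.action_head = nn.Linear(d_model, num_action_dims * num_bins)

    def forward(self, images, instr_emb):
        # images: (B, history_len, 3, H, W); instr_emb: (B, 512)
        b, t = images.shape[:2]
        frames = images.flatten(0, 1)                          # (B*t, 3, H, W)
        feats = self.image_tokenizer(frames)                   # (B*t, 32, T, 1)
        feats = feats.squeeze(-1).transpose(1, 2)              # (B*t, T, 32)
        tokens = self.token_proj(feats).reshape(
            b, t * self.tokens_per_image, -1)                  # (B, t*T, d_model)
        instr = self.instr_proj(instr_emb).unsqueeze(1)        # (B, 1, d_model)
        seq = torch.cat([instr, tokens], dim=1)
        seq = seq + self.pos_emb[:, : 1 + t * self.tokens_per_image]
        out = self.transformer(seq)                            # (B, L, d_model)
        logits = self.action_head(out[:, 0])                   # pool via instruction token
        return logits.view(b, self.num_action_dims, self.num_bins)


# Usage: predict discretized actions for a batch of 2 observation histories.
model = ToyRoboticsTransformer()
images = torch.randn(2, 6, 3, 128, 128)
instr_emb = torch.randn(2, 512)
action_logits = model(images, instr_emb)    # (2, 7, 256)
action_bins = action_logits.argmax(dim=-1)  # integer bin index per action dim
```

The sketch treats control as classification: each action dimension is predicted as an index into a fixed set of bins, which is one common way Transformer policies emit continuous actions from a token sequence.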

Paper link: https://arxiv.org/abs/2212.06817
