Robotics simulation in Unity is as easy as 1, 2, 3!

Robot development workflows rely on simulation for testing and training, and we want to show you how roboticists can use Unity for robotics simulation. In this first blog post of a new series, we describe a common robotics development workflow. Plus, we introduce a new set of tools that make robotics simulation in Unity faster, more effective, and easier than ever.

How to leverage Unity for robotics

Because it is costly and time-consuming to develop and test applications using a real robot, simulation is becoming an increasingly important part of robotic application development. Validating the application in simulation before deploying to the robot can shorten iteration time by revealing potential issues early. Simulation also makes it easier to exercise edge cases and scenarios that may be too dangerous to try in the real world.

Key elements of effective robotics simulation include the robot’s physical attributes, the scene or environment where the robot operates, and the software that runs on the robot in the real world. Making these three elements match their real-world counterparts as closely as possible is vital for valid testing and training.

One of the most common frameworks for robot software development is the Robot Operating System (ROS). It provides standard formats for robot descriptions, messages, and data types used by thousands of roboticists worldwide, for use cases as varied as industrial assembly, autonomous vehicles, and even entertainment. A vibrant user community contributes many open source packages for common functionalities that can bootstrap the development of new systems.
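As a concrete illustration (this sketch is ours, not part of the original post), the minimal rospy node below subscribes to the conventional /joint_states topic, which carries standard sensor_msgs/JointState messages. Because both the topic name and the message type are standardized, the same node can run unchanged against a real robot or a simulator.

```python
#!/usr/bin/env python
# Minimal sketch of a ROS node built on a standard interface.
# /joint_states and sensor_msgs/JointState are standard ROS conventions,
# so this node works against a real robot or a simulator alike.
import rospy
from sensor_msgs.msg import JointState

def on_joint_states(msg):
    # msg.name and msg.position are parallel lists: joint names and angles.
    rospy.loginfo("Joint positions: %s", dict(zip(msg.name, msg.position)))

if __name__ == "__main__":
    rospy.init_node("joint_state_listener")
    rospy.Subscriber("/joint_states", JointState, on_joint_states)
    rospy.spin()
```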

Roboticists often architect a robot application as a modular set of ROS nodes that can be deployed both to real robots and to computers that interface with simulators. In a simulation, developers build a virtual world that mirrors the real robot’s target use case. By testing in this simulated ecosystem, users can iterate on designs quickly before testing in the real world and ultimately deploying to production.

A common robotics development workflow, where testing in simulation happens before real-world testing

This blog post uses the example of a simple pick-and-place manipulation task to illustrate how users can leverage Unity for this simulation workflow.

1: Defining the robot’s task

Following the above workflow, let’s say that our robot’s task is to pick up an object and place it in a given location. The six-axis Niryo One educational robot serves as the robot arm. The environment is minimal: an empty room, a table on which the robot sits, and a cube (i.e., the target object). For the motion-planning portion of the task, we use a popular set of ROS packages collectively called MoveIt. When we are ready to start the task, we send a planning request from the simulator to MoveIt. The request contains the current state of the robot’s joints, the cube’s pose, and the target position for the cube. MoveIt then computes a motion plan and sends it back to the simulator for execution.
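On the ROS side, the planning step can be sketched with the moveit_commander Python API, as below. This is our illustrative sketch rather than code from the post: the planning-group name "arm" and the cube pose are hypothetical placeholders (the real group names come from the Niryo One MoveIt configuration), and the request/response plumbing between the simulator and this node is omitted.

```python
#!/usr/bin/env python
# Illustrative sketch: ask MoveIt to plan a motion to a target pose.
# The group name and target pose below are placeholder assumptions.
import sys

import moveit_commander
import rospy
from geometry_msgs.msg import Pose

def plan_to_pose(group, target_pose):
    """Have MoveIt compute a trajectory to the given end-effector pose."""
    group.set_pose_target(target_pose)
    plan = group.plan()  # return type varies across MoveIt versions
    group.clear_pose_targets()
    return plan

if __name__ == "__main__":
    moveit_commander.roscpp_initialize(sys.argv)
    rospy.init_node("pick_and_place_planner")

    # "arm" is an assumed planning-group name defined in the robot's
    # MoveIt configuration package.
    arm = moveit_commander.MoveGroupCommander("arm")

    # Hypothetical pose of the cube to pick up, in the planning frame.
    pick_pose = Pose()
    pick_pose.position.x = 0.25
    pick_pose.position.z = 0.15
    pick_pose.orientation.w = 1.0

    trajectory = plan_to_pose(arm, pick_pose)
    # In the full workflow, this trajectory would be sent back to the
    # simulator (e.g., over a ROS service or topic) for execution.
```

In practice, the simulator-facing interface is typically a ROS service: the simulator sends the joint states and object poses in the request, and the computed trajectory comes back in the response.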

Now that we’ve set up the problem, let’s walk through how to use Unity in this simulation workflow.

2: Bringing your robot into simulation

A robotics simulation starts with setting up a virtual environment: a basic room, as in this example, or something more complex, such as a factory floor with conveyor belts, bins, tools, and parts.
