Have you ever received a STEP or IGES file from a colleague or client, only to find you need to edit the design quickly without the parametric history? Or perhaps you created a complex design in a hurry, resulting in a parametric history littered with redundant features? At Autodesk Research, we believe that machine learning may help us with these and other challenges.
The Fusion 360 Gallery Dataset is a re-release of public designs from our community in a machine-learning-ready format.
Recently, we announced the Fusion 360 Gallery Dataset, a re-release of designs that our community shared publicly on the Autodesk Online Gallery website. This dataset, now available, is the first of its kind in a machine-learning-ready format. In this follow-up to our original post, we’ll share how we began using the dataset in early research. Keep in mind that nothing you see below is available or planned for release in Fusion 360, but these early experiments show how machine learning might improve CAD in the future.
The first set of problems we explored are forms of reverse engineering. This could include recovering the parametric history from a STEP or IGES file, or even using a 3D scan of a part to build a parametric model. As a first step, we tackled a very narrow version of this problem by attempting to recover the parametric history for simple ‘sketch and extrude’ designs.
In our experiments we use machine learning to recover the parametric history from a solid model.
We take a plain solid model as input, such as a STEP or IGES file, and try to recover the sequence of sketch and extrude steps that reconstruct it exactly. The hope is that a machine learning system can learn from the user data we show it to perform the reconstruction as a human designer might. When we train the machine learning model, we show it extrude operations that go from face to face. For example, to make the whistle shape below we show the model two extrusions, one for the main body and one for the small hole.
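To make the face-to-face idea concrete, here is a minimal sketch of how such a reconstruction sequence might be represented. This is a toy illustration, not the dataset's actual schema or the Fusion 360 API; the face names and the `reconstruct` helper are assumptions for the example.

```python
from dataclasses import dataclass

# Toy stand-in for one 'sketch and extrude' step: each extrusion is
# described by the face it starts from and the face it ends at.
# (Illustrative only -- not the dataset's real representation.)
@dataclass(frozen=True)
class Extrude:
    start_face: str
    end_face: str

def reconstruct(steps):
    """Replay a sequence of face-to-face extrusions. A real system would
    drive a geometry kernel; here we just collect the faces each step
    touches, as a placeholder for the rebuilt solid."""
    touched = set()
    for step in steps:
        touched.update({step.start_face, step.end_face})
    return touched

# The whistle example from the text: one extrusion for the main body,
# one for the small hole.
whistle = [
    Extrude("body_bottom", "body_top"),
    Extrude("hole_front", "hole_back"),
]
```

Replaying `whistle` through `reconstruct` walks the two extrusions in order, mirroring how a designer would rebuild the part step by step.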
Our machine learning model is trained to predict extrusions from solid model faces.
Our machine learning model learns to predict the faces to extrude. However, it occasionally makes the wrong prediction; even the best machine learning models make mistakes once every 100 times or so. To address this, we employ a search. For a very simple model like the whistle shape, there are only a few faces, and it doesn’t take long to search all combinations and find the original shape. For complex models, however, an exhaustive search can take a long time. To solve this, we use the machine learning model to guide the search: we take the best guesses the model provides and try those first. Although the first guess may be incorrect, the second or third might work.
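The guided search described above can be sketched as a best-first search over candidate extrusion sequences, popping the model's highest-scoring guesses first. Everything here is an assumed placeholder rather than Autodesk's actual implementation: `score` stands in for the learned model's confidence, and `matches_target` for a geometric check against the input solid.

```python
import heapq
import itertools

def guided_search(candidate_faces, score, matches_target, max_depth=4):
    """Best-first search over extrusion sequences, trying the learned
    model's best guesses first. `score(seq, face)` is a stand-in for the
    network's confidence in extruding `face` next; `matches_target(seq)`
    is a stand-in for comparing the rebuilt solid to the input."""
    tie = itertools.count()  # break score ties by insertion order
    frontier = [(0.0, next(tie), [])]
    while frontier:
        neg_score, _, seq = heapq.heappop(frontier)  # best guess first
        if matches_target(seq):
            return seq
        if len(seq) >= max_depth:
            continue
        for face in candidate_faces:
            # Accumulate negated scores so that higher-confidence
            # sequences are popped earlier by the min-heap.
            heapq.heappush(
                frontier,
                (neg_score - score(seq, face), next(tie), seq + [face]),
            )
    return None  # exhausted the search without a match
```

With a scorer that rewards the correct face at each step, the search reaches the target sequence after trying the top-ranked guesses, rather than enumerating every combination of faces up front.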
So how does our experimental model perform? On simple designs, shown on the left side below, we do pretty well. Our reconstruction may take a different path than the original human design, including one that employs fewer CAD operations. On more complex designs, shown on the right side below, we still have more work to do. In these examples we can’t recover the original design.