How to use AI to control digital manufacturing?
Scientists and engineers are constantly creating new materials with special properties that can be used for 3D printing, but getting those materials to print well can be a challenging and costly task.
To find the parameters that consistently produce the best print quality for a new material, expert operators often have to run manual trial-and-error experiments, sometimes making thousands of prints. These parameters include the print speed and the amount of material the printer deposits.
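To make the scale of that search concrete, the sketch below shows the kind of brute-force parameter sweep an operator effectively performs by hand. It is only an illustration: the parameter names, ranges, and the placeholder scoring function are assumptions, not values from the research.

```python
import itertools

# Hypothetical parameter ranges an operator might sweep by hand.
print_speeds = [20, 40, 60, 80, 100]          # mm/s
flow_rates   = [0.8, 0.9, 1.0, 1.1, 1.2]      # material-deposition multiplier
nozzle_temps = [190, 200, 210, 220, 230]      # deg C

def print_and_score(speed, flow, temp):
    """Stand-in for physically printing a test part and grading its quality."""
    # In reality this step costs material and hours of machine time per combination.
    return 0.0  # placeholder quality score

best = None
for speed, flow, temp in itertools.product(print_speeds, flow_rates, nozzle_temps):
    score = print_and_score(speed, flow, temp)
    if best is None or score > best[0]:
        best = (score, speed, flow, temp)

# 5 x 5 x 5 = 125 physical test prints for even this coarse grid, and the winning
# combination only holds for this material batch, machine, and environment.
print(best)
```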
Now, MIT researchers are using AI to simplify this process. They developed a machine learning system that uses computer vision to watch the manufacturing process and correct errors in how it handles the material in real time.
After using simulations to teach a neural network how to adjust printing parameters to reduce errors, they applied the controller to a real 3D printer.
This work sidesteps the need to print tens or hundreds of millions of real objects to train the neural network. It could also make it easier for engineers to incorporate novel materials into their prints, allowing them to create products with unique chemical or electrical properties, and it could help technicians adjust the printing process on the fly if the material or the settings change unexpectedly.
Because of the trial and error involved, selecting the best parameters for a digital manufacturing process can be one of the most expensive steps in that process. And once a technician finds a combination that works well, those parameters are only optimal for that specific situation: there is little data on how the material behaves in other environments, on other equipment, or whether a new batch has different properties.
Using a machine learning system brings its own challenges. The researchers first needed a way to measure, in real time, what was actually happening at the printer.
To do this, they developed a machine vision system with two cameras aimed at the nozzle of the 3D printer. The system illuminates the material as it is deposited and determines its thickness from how much light passes through it.
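One way to turn "how much light passes through" into a thickness value is a Beer-Lambert-style attenuation model. The sketch below illustrates that idea only; it is not the researchers' actual pipeline, and the attenuation coefficient and image names are assumptions.

```python
import numpy as np

# Assumed attenuation coefficient of the material (per mm); in practice this
# would be calibrated by imaging samples of known thickness.
MU = 0.35

def thickness_map(backlit_image, reference_image, mu=MU):
    """Estimate deposited-material thickness from transmitted light.

    backlit_image:   camera frame with material between the light and the camera
    reference_image: frame of the bare, illuminated build surface
    Both are 2D arrays of pixel intensities.
    """
    # Transmittance T = I / I0; Beer-Lambert gives T = exp(-mu * d),
    # so thickness d = -ln(T) / mu.
    transmittance = np.clip(backlit_image / np.maximum(reference_image, 1e-6), 1e-6, 1.0)
    return -np.log(transmittance) / mu

# Toy usage with synthetic frames.
reference = np.full((4, 4), 200.0)
backlit = np.full((4, 4), 120.0)
print(thickness_map(backlit, reference))
```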
Training a neural network-based controller to understand this manufacturing process from real prints alone would require millions of them, a prohibitively data-intensive operation, so the researchers trained the controller in simulation instead.
Their controller was trained using a method called reinforcement learning, which teaches the model through trial and error using a reward signal. The model's task was to choose printing parameters that produce a given object in the virtual environment; given the expected result, it was rewarded for selecting parameters that minimized the deviation between its print and that target.
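As a rough illustration of that training signal, and not the researchers' implementation, the sketch below rewards a parameter-selecting policy for shrinking the gap between a simulated print and the target. The one-parameter simulator and policy are deliberately toy-sized assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_print(target, flow_rate):
    """Toy simulator: deposited material scales with the commanded flow rate."""
    return np.clip(target * flow_rate, 0.0, 1.0)

def reward(printed, target):
    """Higher reward for a smaller deviation between the print and the target."""
    return -np.abs(printed - target).mean()

# A one-parameter "policy": the mean of a Gaussian over the flow-rate action.
policy_mean, learning_rate = 1.5, 2.0
target = rng.random((8, 8))

for episode in range(200):
    # Sample an action, run the simulated print, and score it against the target.
    action = policy_mean + 0.1 * rng.standard_normal()
    r = reward(simulate_print(target, action), target)
    baseline = reward(simulate_print(target, policy_mean), target)
    # Crude policy-gradient-style update: shift the policy toward actions
    # that scored better than the current mean.
    policy_mean += learning_rate * (r - baseline) * (action - policy_mean)

print(policy_mean)  # should drift toward 1.0, the flow rate that reproduces the target
```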
In this case, "error" means the model either deposited too much material, filling spaces that should remain empty, or deposited too little, leaving gaps that should have been filled.
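In terms of that error definition, a simulated or measured print can be compared against the target layout roughly as follows; the array names and threshold are illustrative assumptions.

```python
import numpy as np

def fill_errors(printed, target, threshold=0.5):
    """Split the deviation into over-fill and under-fill, per the error definition above.

    printed, target: 2D arrays where values >= threshold mean "material present".
    """
    printed_solid = printed >= threshold
    target_solid = target >= threshold
    overfill = np.logical_and(printed_solid, ~target_solid)   # material where there should be none
    underfill = np.logical_and(~printed_solid, target_solid)  # gaps where material was needed
    return overfill.mean(), underfill.mean()

target = np.array([[1, 1, 0, 0],
                   [1, 1, 0, 0]], dtype=float)
printed = np.array([[1, 1, 1, 0],
                    [1, 0, 0, 0]], dtype=float)
print(fill_errors(printed, target))  # (0.125, 0.125): one over-filled and one under-filled cell
```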
The real world, however, is messier than a simulation: in practice, conditions constantly shift because of small fluctuations and noise in the printing process. So the researchers built this noise into their simulation, which led to more accurate results.
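A common way to realize this kind of "simulated noise" is to randomize the simulator's behaviour during training so the controller never sees a perfectly clean machine. The sketch below perturbs a toy deposition model in that spirit; the noise sources and scales are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def noisy_simulate_print(target, flow_rate, rng):
    """Toy deposition model with the kinds of disturbances a real printer shows."""
    # Slow drift in how much material actually comes out for a commanded flow rate.
    effective_flow = flow_rate * (1.0 + 0.05 * rng.standard_normal())
    deposited = target * effective_flow
    # High-frequency jitter on each deposited region (vibration, material variation).
    deposited += 0.02 * rng.standard_normal(target.shape)
    return np.clip(deposited, 0.0, 1.0)

# Training against many such noisy episodes, rather than one clean simulator,
# is what lets a learned controller cope with a real machine's fluctuations.
target = np.ones((4, 4))
print(noisy_simulate_print(target, 1.0, rng))
```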
When the controller was tested, it printed objects more accurately than any other control strategy they examined. It was particularly effective at printing infill, the material that forms the inside of an object. While some of the other controllers deposited so much material that the printed part bulged upward, the researchers' controller adjusted the printing path so that the object remained level.
The control strategy can even learn how the material spreads after it has been deposited and adjust its parameters accordingly.
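One simple way to picture that post-deposition spreading in a simulator is a smoothing step applied after each deposit; a controller trained against such a model learns to deposit less where spreading would otherwise overfill. The leak-to-neighbours kernel below is purely illustrative, not the researchers' material model.

```python
import numpy as np

def spread(deposited, fraction=0.2):
    """Toy post-deposition spreading: each cell leaks a fraction of its material
    evenly to its four neighbours (cells at the edge simply lose the leaked share)."""
    out = deposited * (1.0 - fraction)
    share = deposited * (fraction / 4.0)
    out[1:, :]  += share[:-1, :]   # material flowing down
    out[:-1, :] += share[1:, :]    # up
    out[:, 1:]  += share[:, :-1]   # right
    out[:, :-1] += share[:, 1:]    # left
    return out

# A single thick blob spreads into its neighbours after being deposited, so a
# controller aiming for a sharp edge must account for this when choosing how
# much material to lay down.
blob = np.zeros((5, 5))
blob[2, 2] = 1.0
print(np.round(spread(blob), 2))
```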
Having demonstrated the effectiveness of this approach for 3D printing, the researchers now intend to develop controllers for other manufacturing processes. They also want to study how to adapt the strategy to situations with multiple layers of material, or with several materials printed at once. In addition, their current approach assumes each material has a constant viscosity; a future version might use AI to detect and account for viscosity in real time.