This 3D Printer Can Watch Itself Fabricate Objects
Computer vision makes contact-free 3D printing possible, letting engineers build with high-performance materials they couldn’t use before.
MIT researchers have developed a new kind of 3D printer that monitors itself with machine vision, printing faster and with a wider range of materials than comparable printers.
The approach lets engineers work with materials that were previously off-limits, enabling more complex and functional devices, such as a hand-like robotic gripper driven by flexible, durable “tendons,” according to the university.
Most inkjet 3D printers work by depositing tiny droplets of resin onto a surface, smoothing each layer with a scraper or roller, and then curing the resin with UV light. But if a material cures slowly, the roller can squash or smear it, which limits these printers to a narrow range of fast-curing materials.
MIT’s new contactless 3D printing system, by contrast, needs no mechanical parts to smooth the resin, so it can use materials that cure more slowly than the acrylates typically used in 3D printing.
Slower-curing materials often have superior properties, such as greater elasticity and durability.
In a paper, the researchers, from MIT, the MIT spinout Inkbit, and ETH Zurich, describe using the system to build intricate, high-resolution composite structures and robots, including tendon-driven hands, pneumatically actuated walking manipulators, heart-like pumps, and metamaterial structures.
The team also says the new printer operates 660 times faster than comparable 3D inkjet printing systems.
Printing Complex Devices:
The study builds on MultiFab, a low-cost, multimaterial 3D printer the researchers first demonstrated in 2015.
With its thousands of nozzles, MultiFab deposited tiny droplets of resin that were then cured with UV light, enabling high-resolution printing with up to ten materials at once.
For their latest project, the team set out to create a contactless printing method that works with a wider range of materials and can produce more intricate devices.
Their new approach, called vision-controlled jetting, uses four high-frame-rate cameras and two lasers to continuously scan the printing surface as the nozzles eject tiny droplets of resin.
In under a second, the system’s computer vision converts these images into a high-resolution depth map, which is then compared against the CAD model of the part being fabricated.
This lets the system adjust exactly how much resin it deposits so the print matches the intended design. The automated process can fine-tune each of the printer’s 16,000 nozzles, giving users precise control over even the smallest details.
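To make that loop concrete, here is a minimal sketch of how such scan-and-correct feedback could work. The array layout, correction rule, and function names are illustrative assumptions, not the researchers’ published implementation.

```python
import numpy as np

# Minimal sketch of a scan-and-correct feedback loop (illustrative
# assumptions only, not the actual vision-controlled jetting code).

def plan_next_pass(measured: np.ndarray,
                   target: np.ndarray,
                   drop_height: float) -> np.ndarray:
    """Choose per-nozzle droplet counts so low spots get more material.

    measured    -- scanned surface height at each nozzle position (mm)
    target      -- height the CAD model expects after this pass (mm)
    drop_height -- nominal height one droplet adds (mm)
    """
    error = target - measured                   # where the print is low or high
    drops = np.round(error / drop_height)       # droplets needed to close the gap
    return np.clip(drops, 0, None).astype(int)  # never jet a negative droplet count

# Example: a 4-nozzle strip where one spot printed low and one high.
measured = np.array([1.00, 0.95, 1.02, 1.00])  # depth-map heights (mm)
target = np.full(4, 1.03)                      # CAD slice target (mm)
print(plan_next_pass(measured, target, drop_height=0.03))  # -> [1 3 0 1]
```

The real system closes this loop across all 16,000 nozzles on every layer; the sketch captures only the shape of the feedback: scan, compare against the model, redistribute material.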
Artificial Intelligence for Printing:
The MIT project is one of many efforts to apply machine learning to 3D printing. Nat Trask, an engineering professor at the University of Pennsylvania, said in an interview that AI-generated designs can be printed immediately, allowing rapid testing and iteration.
“For metamaterial design in particular, people print small, complicated, tiled patterns that work together to give a good bulk mechanical response,” Trask said.
The approach has been explored for a while, but the patterns and geometries have been limited by how hard it is for humans to devise designs beyond a certain complexity.
With generative models, the same class of tools that lets DALL-E conjure pictures of cats playing basketball on the moon, designers can explore far more complex designs across different materials.
In the traditional design process, Trask said, a person would spend days building a computer model of a design and then running simulations that solve large systems of equations.
In the past few years, machine learning (ML) models have produced such estimates up to 1,000 times faster than those simulations.
“Within the next few years, I expect to see machine learning tools that can predict how a part will behave ‘on the fly.’ This will let AI/ML not only suggest new print shapes but also have a feedback loop that tests designs using online physics models,” Trask said.
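As a rough illustration of the workflow Trask describes, the sketch below pairs a cheap learned predictor with a simple design-search loop. The surrogate here is a toy stand-in function, and every name in it is hypothetical rather than taken from any published system.

```python
import numpy as np

def surrogate_response(design: np.ndarray) -> float:
    """Stand-in for a trained ML model that predicts a bulk mechanical
    response from design parameters in milliseconds (a toy quadratic)."""
    return float(1.0 - np.sum((design - 0.6) ** 2))

def search_designs(n_params: int, n_iters: int = 1000, seed: int = 0):
    """Random search driven by the fast surrogate instead of a slow solver."""
    rng = np.random.default_rng(seed)
    best, best_score = None, -np.inf
    for _ in range(n_iters):
        candidate = rng.uniform(0.0, 1.0, size=n_params)
        score = surrogate_response(candidate)
        if score > best_score:
            best, best_score = candidate, score
    return best, best_score

best, score = search_designs(n_params=4)
print(f"best predicted response: {score:.3f}")
# In the loop Trask envisions, the winner would then be validated against
# a high-fidelity physics model (or a test print) before fabrication.
```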
In an interview, Ben Schrauwen, senior vice president and general manager of the 3D printing software company Oqton, said that AI improves image-based process monitoring and closed-loop control.
He said AI models that capture the structures and interactions of molecules and atoms can accelerate the discovery and development of new plastics and metals.
He also said, “For example, you could have ChatGPT-like interfaces to interact with and come up with ideas around large amounts of research literature.”
“AI is also being used to automatically design dental parts and to recognize and segment 3D images of the human body.”
Along these lines, AI can already automatically analyze 3D models and recommend how best to segment them, place supports, orient them, and nest parts for 3D printing.
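As a toy example of one such geometric check, the sketch below flags mesh faces that overhang too steeply to print without support. The 45-degree rule and the data layout are common conventions assumed here for illustration; this is not Oqton’s method.

```python
import numpy as np

def faces_needing_support(normals: np.ndarray,
                          max_overhang_deg: float = 45.0) -> np.ndarray:
    """Flag faces whose unit normals tilt toward the build plate more
    steeply than the printer can bridge unsupported."""
    # A downward-facing face overhangs when its normal lies within
    # max_overhang_deg of straight down, i.e. n_z < -cos(max angle).
    threshold = -np.cos(np.radians(max_overhang_deg))
    return normals[:, 2] < threshold

# Example: unit normals for a flat top, a vertical wall, and a
# downward-facing ceiling (the classic case that needs support).
normals = np.array([[0.0, 0.0, 1.0],
                    [1.0, 0.0, 0.0],
                    [0.0, 0.0, -1.0]])
print(faces_needing_support(normals))  # -> [False False  True]
```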
Oqton has built AI-based software that lets dental labs prepare files for 3D printing instantly, rather than relying on operators and technicians to do it by hand. Automated file preparation can significantly streamline manufacturing workflows.
“With AI-based automated support generation, dental labs have been able to get rid of hours of daily manual work,” Schrauwen said. As a result, labs are producing more parts with the same equipment.
As more businesses and organizations learn how fast and inexpensive the technology can be, more of them can be expected to adopt 3D printing.
Additionally, he said, “AI has made additive manufacturing (AM) workflows more predictable. Technicians can send a job to be printed overnight and know that it will be ready for the next steps in the morning.”
Schrauwen also said AI plays a major role in improving quality control, for example by letting operators manage all 3D printing tasks from a single, unified view.
“This makes it simple to see live sensor values, see video feeds of the build platform as jobs move forward, and know the status of all jobs in the process,” he said.
“Unlike traditional Manufacturing Execution Systems (MES), a next-generation solution powered by AI can combine data from different machines and apps into a single view that can find issues and suggest fixes.”