The “Internet of Mannequins” has begun, by the looks of French start-up Euveka’s body-morphing design form — a connected robot that can change size and shape at will.
Drawing on the static dress form long used in couture studios, founder Audrey-Laure Bergenthal conceived the technology six years ago. A former lawyer specializing in industrial property and the art market, she went on to study fashion design. After years in the business, she identified a specific problem.
“Prototyping is time-consuming,” said Bergenthal, “and it’s costly and labor-intensive.” As she sees it, those issues feed into greater challenges facing the textile industry: As trends move faster, designers have even less time to create and refine prototypes, making it harder to reach the right style and fit.
She got to work creating the high-tech, shape-shifting form and the software driving it. The form — comprising mechanized parts, silicone and a custom fabric outer layer — was designed to match the realistic curves of the female body, and can swell or shrink to mirror different weights, ages and morphotypes. According to Euveka, the mannequin’s range covers 80 percent of the morphologies of Caucasian and Asian women, from size 36 to 46 (European) and body types spanning ages 17 to 77.
In the spring, the connected mannequin rolled out in Europe, and the start-up now works with major couture and luxury brands in France, including Etam and Chanel. Next, Bergenthal hopes to lay groundwork in the U.S.
“This is the new generation of mannequin,” she said. “The robot evolves on all the key axes of the body and the construction of the garment, and it can reproduce a body-aging process, any kind of morphology, the beginning of a pregnancy. … Either scan a person or enter the measurements, and the robot will evolve instantly.”
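The workflow Bergenthal describes, entering or scanning a set of measurements and letting the form morph to match, can be pictured in code. This is a minimal Python sketch with hypothetical names, fields and ranges; it is not Euveka’s actual software, only an illustration of how a target profile might be clamped to a supported range and reached through intermediate actuator setpoints.

```python
from dataclasses import dataclass

# Hypothetical measurement profile; the field names and units are
# illustrative assumptions, not Euveka's real data model.
@dataclass
class MeasurementProfile:
    bust_cm: float
    waist_cm: float
    hips_cm: float

def clamp(value: float, low: float, high: float) -> float:
    """Keep a measurement within the range the hardware can reproduce."""
    return max(low, min(high, value))

def morph_steps(current: MeasurementProfile,
                target: MeasurementProfile,
                steps: int = 4):
    """Linearly interpolate from the current shape to the target,
    yielding the intermediate setpoints the actuators would pass through."""
    for i in range(1, steps + 1):
        t = i / steps
        yield MeasurementProfile(
            bust_cm=current.bust_cm + t * (target.bust_cm - current.bust_cm),
            waist_cm=current.waist_cm + t * (target.waist_cm - current.waist_cm),
            hips_cm=current.hips_cm + t * (target.hips_cm - current.hips_cm),
        )
```

A body scan would simply populate the same profile automatically instead of relying on hand-entered numbers.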
Apart from the electronics and mechanized parts, the figure is made of silicone and a soft synthetic fabric, a proprietary blend with antistatic features. The software allows the operator to fill out the belly, the backside, the hips or a combination. If a larger bustline is needed, more silicone can be added to the bust.
“It helps the true atelier, but also ready-to-wear, to prototype faster,” she added. “Normally, in ready-to-wear, you have no time to produce sizes above 40, above L. It’s hard to have a [traditional] mannequin that gives a real sense of what a size 40 and above actually is.” The system was designed to make easy work of seeing and adapting a collection to every size. “To use less fabric, also,” she added. “It has a real ecological impact by reducing by half the number of prototypes. And it helps them also to take the turn of the mass customization revolution.”
The Euveka mannequin can wirelessly communicate with other Euveka mannequins, in the same facility or across the world. For Bergenthal, a prime example is a European design house connecting to a production mannequin in China. Instead of relying solely on files, the robots at the vendor and the brand’s headquarters can sync, assuring that they’re working from the exact same measurements.
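The reconciliation Bergenthal outlines, a design house and a faraway vendor keeping their robots on identical numbers rather than trading files, resembles a simple state-sync problem. The following Python sketch is purely illustrative, with hypothetical class and field names rather than Euveka’s actual protocol: each mannequin carries a revision counter, and syncing copies the newer measurement set onto the stale peer.

```python
import copy

class MannequinState:
    """Illustrative stand-in for one networked mannequin's settings."""

    def __init__(self, site: str, measurements: dict, revision: int = 0):
        self.site = site
        self.measurements = dict(measurements)
        self.revision = revision

    def update(self, measurements: dict) -> None:
        """Apply local edits and bump the revision counter."""
        self.measurements.update(measurements)
        self.revision += 1

def sync(a: MannequinState, b: MannequinState) -> None:
    """Copy the newer measurement set onto the stale peer, so both
    sites end up working from exactly the same numbers."""
    newer, older = (a, b) if a.revision >= b.revision else (b, a)
    older.measurements = copy.deepcopy(newer.measurements)
    older.revision = newer.revision
```

In the scenario from the article, the design house in Europe would play the role of one peer and the production mannequin in China the other; after a sync, a fitting adjusted in Paris is reproduced exactly at the factory.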
Because its platform of interconnected industrial mannequins relies on cloud infrastructure for data, Euveka took pains to establish a cybersecurity system. To protect the confidentiality of the brands it works with, the company set up what Bergenthal describes as medical-grade security, conforming to strict standards.
Data security will be key, especially as the mannequin maker eyes the American market. The company is speaking with design schools, luxury brands, sports brands and even the medical sector — any segment dealing with apparel. It hopes to release the mannequin in the U.S. by the end of the year. The high cost — 104,000 euros, or roughly $127,000 — will keep ownership an industrial affair for enterprise clients, though rentals come in far below that, at $3,500.
It won’t be long before Bergenthal finds out how the U.S. receives her bot. But as technologists often do, she’s already looking further into the future.
“My dream is that one day, on the mannequin, you will see the pattern on the robot, and there will be a real merge between reality and software through connected objects,” she mused. “I really believe that someday, we won’t use screens anymore. We will be able to say to the robot, ‘Please, I want this garment. Show me the size. I want to see the pattern.’ And you will be able to have a fusion.”