It’s Alive: Patterns That Speak (and Preserve Identity)
Project Idea Metadata
- Project Idea Name: It’s Alive: Patterns That Speak (and Preserve Identity)
- Date: 9/4/2025 4:08:22 PM
Project Idea Description
Problem: The Lack of Sustainability in DPPs and Personalization
With Digital Product Passports (DPPs) becoming mandatory for all garments sold within the EU by 2030, and with rising consumer demand for wardrobes that are expressive yet sustainable, there is a need to move beyond representing digital garment information in a simple cloth tag accessible only to the wearer.
DPPs are most effective when production information remains accessible throughout a garment’s entire lifecycle. Yet standard cloth tags pose a major point of failure for DPPs — tags fade, tear, or are cut off, and rarely survive years of wear and washing. Modern alternatives like RFID chips aren’t consumer-facing, as they require industrial scanners. NFC chips, while less fragile than cloth tags, are still unlikely to last a garment’s lifecycle given they are prone to damage from washing, stress, and body heat — and can be uncomfortable if placed awkwardly. The fact that they must be stitched into garments also adds cost.
Beyond their fragility, cloth tags and NFC chips are also limited by their lack of outward visibility. Realistically, only the owner can access information from the inside tag or from an up-close scan of the chip — and typically only when the garment is not being worn. The impact of public-facing digital garment intelligence is thus limited by tags’ and chips’ invisibility during everyday use.
This lack of public-facing digital intelligence is critical. While interest in hyper-personalization in fashion is increasing, it can be a fraught goal. The principal motivation of fashion is to connect to broader aesthetic ideas, to embed oneself within a culturally-shared visual language of identity, meaning, and power. Clients turn to designers like Glenn Martens or Stella McCartney precisely to inhabit the exact aesthetic worlds they create. To physically edit the colors or fit of a Rick Owens piece to meet one’s personal color or visual graphic preferences is anathema to many. Physically over-personalizing each and every garment we own risks reducing the self-referential and shared symbolic language that gives fashion its power — and risks massive over-production of customized clothing.
There is thus an opportunity to preserve garments’ culturally-shared identity in the physical world, while augmenting how they are personalized in the digital one. Such alive garments can meet consumer needs for personalization without rampant over-production and the loss of shared physical symbols.
Technological Solution: The Pattern as ID
Instead of relying on fragile, internal-facing tags we aim to encode a garment’s identity into pattern motifs themselves. The encoded information will be readable by a generic smartphone camera or even by Augmented Reality (AR) glasses. The impact of our pattern-as-ID approach is two-fold: First, DPP information will be more lastingly preserved throughout the product’s lifecycle — even after repair or resale — because we will not have to rely on fickle tags or chips. Second, we will enable garments to be easily scannable by public audiences and by others across the street during Fashion Week or at fandom events like Comicon, creating hybrid garments with publicly-accessible digital companions. Thus, the pattern will both speak its DPP information as well as its digital intelligence.
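The pattern-as-ID idea can be made concrete with a toy scheme. The actual StegaMatter encoding is proprietary, so the rule below — dot radius encodes one bit — is invented purely for illustration of the principle that visual geometry can carry a decodable identity:

```python
# Toy illustration of pattern-as-ID: dot radius encodes one bit.
# (The real StegaMatter encoding is proprietary; this rule is invented.)

SMALL, LARGE = 4.0, 8.0  # dot radii in arbitrary pixel units

def encode_bits(bits):
    """Map a bit string to a sequence of dot radii (the printed motif)."""
    return [LARGE if b == "1" else SMALL for b in bits]

def decode_radii(radii, threshold=6.0):
    """Recover the bit string from measured dot radii, e.g. as estimated
    by blob detection on a smartphone camera frame."""
    return "".join("1" if r > threshold else "0" for r in radii)

garment_id = "1011"            # a 4-bit toy garment ID
dots = encode_bits(garment_id)
assert decode_radii(dots) == garment_id
```

In practice the decoder must tolerate perspective distortion, fabric folds, and lighting — which is exactly the robustness research scoped in Work Package 1.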
In the same way consumers use “bag charms” to accessorize handbags without changing the underlying recognizable looks of the bags, our work will unlock endlessly personalizable digital “garment charms.” These alive garments will allow for swapping out various digital charms using the same physical garment, helping to mitigate the potential waste of mass hyper-personalization.
Our core technological approach relies on existing proprietary algorithms from StegaMatter for encoding and decoding information via the visual complexity of shapes and sizes. The viability of this core approach has already been verified, and it is currently being patented (more details in Future Plans).
What remains unresearched, however, is how designers can leverage AI tools to generate these encodings into their pattern motifs without compromising their artistic vision. If a designer has textual descriptions or image-based exemplars of how they want their pattern motif to look, our goal is to allow them to iteratively collaborate with a generative AI system to refine that motif. In each motif iteration the AI system produces, encoding logic rules will be followed behind the scenes, freeing the designer to focus on aesthetics. Also unknown are the use cases that designers, brands, and end-consumers will find most desirable for these alive garments.
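The iterate-and-verify loop described above can be sketched as follows. This is a hypothetical stand-in, not the planned system: `propose_motif` simulates a generative model with random jitter on dot radii (a real pipeline would call a controlled diffusion model), and candidates whose encoding no longer decodes are simply rejected:

```python
import random

# Hypothetical sketch of the generate-and-verify loop: aesthetic variation
# is simulated by random jitter on dot radii, and a candidate motif is only
# accepted if it still decodes to the intended garment ID. A real system
# would replace propose_motif with a controlled diffusion model.

SMALL, LARGE, THRESHOLD = 4.0, 8.0, 6.0

def propose_motif(bits, jitter, rng):
    """Stand-in for a generative model: dot radii with aesthetic jitter."""
    base = [LARGE if b == "1" else SMALL for b in bits]
    return [r + rng.uniform(-jitter, jitter) for r in base]

def decodes_correctly(radii, bits):
    decoded = "".join("1" if r > THRESHOLD else "0" for r in radii)
    return decoded == bits

def refine_until_valid(bits, jitter=3.0, max_iters=100, seed=0):
    """Sample candidate motifs until one still carries the encoded ID."""
    rng = random.Random(seed)
    for _ in range(max_iters):
        candidate = propose_motif(bits, jitter, rng)
        if decodes_correctly(candidate, bits):
            return candidate
    raise RuntimeError("no valid motif found within the iteration budget")

motif = refine_until_valid("1011")
assert decodes_correctly(motif, "1011")
```

The design point the sketch captures is that the verification step is cheap and automatic, so the designer only ever sees candidates that already satisfy the encoding rules.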
Our group is uniquely situated to ensure the viability of these new results in collaborative generative AI systems and fashion technology desirability studies. EPFL’s Digital Humanities Lab (DHLAB) is particularly equipped to assist in the research of controlled diffusion tools for designers, and StegaMatter and the DHLAB have extensive experience leading user interviews and prototyping sessions of novel technologies.
In sum, our major research impacts for this grant are the collaborative designer/AI encoded pattern generation and the desirability research with designers and end-consumers on alive garments. Once the identity of a garment is decoded from its pattern, our prototype will then connect that ID to relevant DPP information and a user-chosen AR visual. For feasibility, we assume that DPP information will be provided by designers/brands. In addition, to actually display AR visuals on a garment in the smartphone/glasses, we will make use of existing technologies that already enable this, as opposed to reinventing that wheel. Our unlock is the ability to go from designer idea to an encoded pattern to digital ID; with that digital ID, the downstream options are then wide-open.
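The downstream step — connecting a decoded ID to DPP information and a user-chosen AR visual — amounts to a simple registry lookup. A minimal sketch, with all record fields and names being hypothetical placeholders rather than a defined schema:

```python
from dataclasses import dataclass

# Minimal sketch of the downstream lookup: a decoded pattern ID keys into
# brand-provided DPP data plus a wearer-chosen AR "garment charm".
# All field names here are hypothetical placeholders.

@dataclass
class GarmentRecord:
    dpp: dict                   # brand-supplied Digital Product Passport data
    active_charm: str = "none"  # wearer-selected AR overlay

registry: dict[str, GarmentRecord] = {}

def register(garment_id, dpp):
    registry[garment_id] = GarmentRecord(dpp=dpp)

def set_charm(garment_id, charm):
    """Swap the digital charm without touching the physical garment."""
    registry[garment_id].active_charm = charm

def scan(garment_id):
    """What a scanning smartphone would receive for a decoded ID."""
    rec = registry[garment_id]
    return {"dpp": rec.dpp, "charm": rec.active_charm}

register("1011", {"material": "organic cotton", "made_in": "CH"})
set_charm("1011", "comicon-dragon")
result = scan("1011")
```

Swapping `active_charm` is the "garment charm" behavior from the previous section: one physical garment, many digital personalizations.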
Prototype’s Focus and Target Audiences
We choose to focus on the new behaviors that emerge when we make these pattern IDs as visible to others as possible. That is, we explicitly choose to focus on designs with large motifs like polka dots and stripes that would be easily picked up by others’ smartphones/AR glasses. Future work can explore how to embed pattern encodings subtly into less conspicuous motifs or into clothing threads; however, the more subtle the pattern’s encoding, the closer others have to be to scan it, which is not desirable for testing new consumer behavior.
Our core audiences span creators and consumers of alive garments. First are Comicon/cosplay makers — consumers who are already blending physical builds with digital lore. Their highly detailed outfits and public display culture at conferences make them ideal for bold, visible motif-IDs and swappable garment-charms.
Luxury fashion influencers and luxury pattern motif manufacturers form another key audience, as alive garments unlock new modes of digital storytelling and personalization. Our work offers an interactive layer of expression — enabling followers to reveal behind-the-scenes narratives, limited-edition drops, or AR transformations tied to a specific outfit. For luxury brands, this creates opportunities for authenticated storytelling and dynamic, artistic digital extensions of their collections.
Environmental activists and AR-glasses companies are adjacent stakeholders: the former will value fewer physical products being made for personalization; the latter will gain a compelling, on-garment content layer.
Sustainable Development Goals (SDGs)
The project aligns with EU DPP regulation and Swiss innovation culture by combining research in sustainability, AI tools, and aesthetic design, in particular aligning with the following SDGs:
- SDG 9 – Industry, Innovation, and Infrastructure: By eliminating reliance on fragile tags through our pattern-as-ID innovation, we extend garment lifecycle visibility and better preserve DPP information.
- SDG 12 – Responsible Consumption and Production: We shift fashion personalization towards multiple “garment charms” layered on one physical garment, so far fewer new items need to be manufactured.
- SDG 8 – Decent Work and Economic Growth: We open new value chains in digital craft (e.g., personalizable AR asset creation) and upskill designers in controlled generative AI tools to co-lead the AI transition.
Impact of the Innovation Booster
The Innovation Booster will support our transition from foundational research to a fully functional, desirability-tested pilot pipeline for designers. It will provide material support for co-design experiments, funding for exploratory research and usability studies, access to industry networks, and guidance from circular design mentors — all while enabling cross-disciplinary dialogue between computer science researchers and fashion startups.
StegaMatter and EPFL DHLAB aspire to redefine how AI and digital information lives within the material world — making the fabric of fashion not only beautiful, but also truthful, sustainable, and intelligent.
Implementation Plan: Work Packages, Deliverables, Risks/Mitigations
Work Package 1 (Nov 2025 - Feb 2026)
- Novel AI research on how to control generative diffusion outputs in a way that blends designers’ creative control of pattern motifs with encoding logic rules → Deliverable #1
- Robustness research on existing proprietary encoding and decoding rules in visual patterns → Deliverable #2
Work Package 2 (Mar - Jun 2026)
- Integration of the full pipeline from designers’ textual/image pattern descriptors to beautiful encoded patterns → Deliverable #2
- Internal beta testing of the controlled diffusion tool with professional designers → Deliverable #4
- Physical prototyping/manufacturing of garments with encoding/decoding → Deliverable #3
Work Package 3 (Jul - Nov 2026)
- Beta version software release of the controlled diffusion tool → Deliverable #2
- Physical fashion item market test launch → Deliverable #3
- Collecting initial market response and user/stakeholder feedback → Deliverable #4
- Encapsulation and valorisation of results for future readiness → Deliverable #4
Deliverables:
#1 Open-Source AI Image Diffusion Research: EPFL DHLAB will lead open-source AI research on ways for designers to control diffusion models to achieve their desired pattern visions
#2 Robust, Full Software Pipeline: EPFL DHLAB and StegaMatter will collaborate to create a full software pipeline that takes a designer from pattern idea to encoded pattern, along with an easy way to decode that pattern
#3 Physically Manufactured Prototypes of Patterns that Speak: StegaMatter will lead the physical manufacturing of prototypes of decodable patterns
#4 Desirability Use Case Research of Designers and End-Consumers: EPFL DHLAB and StegaMatter will collaborate, using the Executional Voucher (detailed more below), to evaluate desirability of the pipeline and prototypes
Risks and Mitigation:
| Risk | Likelihood | Severity | Description | Mitigation |
|------|------------|----------|-------------|------------|
| Challenges in controlling diffusion to follow logical encoding rules | Medium | High | It may be hard to wrangle generative AI tools into predictable outputs | Follow existing research paradigms from ControlNet and Blended Latent Diffusion, which have demonstrated fine-grained control of diffusion image generation, as well as research in multi-step agentic AI systems where logical steps can be independently verified |
| Designers are uncomfortable with AI collaboration in their workflows | Low | Medium | Some designers may be wary of ceding aspects of creative decisions to AI systems | Discuss the benefits of alive garments as a novel way to enhance end-consumer creativity and personalization, and to ensure longevity of DPP information; gather early feedback from designers to understand how much control over the AI generations they need to be comfortable |
| Unstable or malfunctioning decoding algorithms | Low | High | Decoding algorithms may fail to recognize the encoding | Robustness research will be conducted on the existing decoding algorithm to identify and correct technical weaknesses |
| Tepid consumer interest in alive garments | Medium | High | There may be limited customer interest in such a new, digital experience | Gather early feedback from fashion-adjacent communities, such as eco-climate activists, Comicon attendees, and technology-forward brands like Coperni |
Budget Proposal
Cash contributions (CHF 23,000) will cover:
- Material prototyping (pattern fabric printing and item prototyping): CHF 1,500
- StegaMatter pipeline integration costs: CHF 10,000
- EPFL PhD Research Salary part-time contribution: CHF 11,500
Vouchers (CHF 5,000):
Executional Voucher (CHF 2,500) will support:
- Consumer desirability tests to determine the most appealing garment item: Should it be a long-sleeve polka-dot shirt? A simple brooch you can add to a Comicon costume? A high-end luxury scarf? We want to understand who, where, and how consumers are most willing to engage with alive garments
- Rigorous co-design sessions with fashion designers to assess comfort and creative usability of the controlled diffusion pattern motif tool
Launchpad Voucher (CHF 2,500) will support:
- Scaling-up of production-ready prototypes with Swiss fashion partners
- Exploitation of these prototypes and the software tool at design/innovation events and Comicon conferences
Garments currently rely on external tags, like cloth labels with QR codes, to store production and authentication data. But these tags are fragile. In thrifted or vintage clothing, cloth tags are often missing or destroyed, and some new tags are removed within hours by comfort-seeking wearers. Modern alternatives like RFID chips require industrial scanners, while NFC chips — especially when placed in awkward or wear-prone spots — rarely survive years of use and washing. Once these external tags are gone, the production information and digital identity disappear with them. We propose a new approach: embedding information directly into the material itself. We will build generative AI tools for designers to create bold, visible pattern motifs that follow specific visual complexity rules, which our computer vision tools can then decode from a smartphone camera. The pattern thus becomes the ID. A user can wear a polka-dot shirt they love, and when it is scanned, those dots reveal verified product data to its owner.