26 Mar 2026

AI, Data & Digital Design: The New Prestige Frontier in Materials Innovation

The global push toward sustainable, high-performance materials - whether for packaging, textiles, or consumer goods - has never been more urgent, or more complex. Historically, balancing performance, cost, safety, regulatory constraints, and environmental impact has demanded decades of trial-and-error experimentation. But across the materials industry, pioneers are converging on the same conclusion: AI and data-driven design are transforming not just discovery, but the entire product development lifecycle.

Ahead of the upcoming Rethinking Materials summit on April 28-29, experts from TNO, IBM, Intellegens, and Alchemy Cloud shared how digital tools are not only accelerating discovery but reshaping the entire materials innovation landscape.

AI as a catalyst for high-performance, sustainable design
Across the panel, AI emerged not as a future ambition but as a practical accelerator already delivering tangible impact.

Ellen de Ruiter (TNO | polySCOUT) underscored the urgency: “We need to accelerate the design of safe and sustainable polymer materials, while simultaneously meeting performance requirements, cost constraints, feedstock availability, and end-of-life considerations.”

polySCOUT’s approach combines polymer science, AI-driven design, and high-throughput experimentation, enabling prediction and validation of polymer behaviour across thermal, mechanical, rheological, and structural properties. 

Sasha Novakovich (Alchemy Cloud) identified the potential of AI in predicting material properties like “viscosity, mechanical strength, thermal behaviour, as well as performance trade-offs across target properties. The real value comes when it’s embedded directly in the R&D workflow, so predictions are made in the context of actual experiments, not in isolation. This allows teams to move from trial-and-error to AI-guided iteration much faster.”

IBM’s Kristin Schmidt highlighted AI’s reach across chemistry and formulation science. “Modern foundation models can evaluate single molecule properties or the behavior of complex mixtures… predicting toxicity, persistency, biodegradability, and long term stability.” These capabilities enable inverse design - where AI proposes materials that inherently meet sustainability and performance constraints.

For industry, the commercial value is already tangible. Tom Whitehead (Intellegens) noted: “AI is bringing concrete business ROI… optimising parameters that are difficult or impossible to predict using traditional methods.” He emphasised its advantage in predicting wear, long-term behaviour, and the performance of complex geometries - areas historically dominated by physical testing.
 

Closing the digital–physical loop
Across all speakers, one message is clear: AI predictions must be grounded in real-world validation.

De Ruiter explained polySCOUT’s continuous refinement loop: “Polymer candidates predicted using AI are validated in the lab, and the results are fed back into the model to improve the next round of designs.”
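polySCOUT's internals are not public, but the predict-validate-feed-back loop de Ruiter describes can be sketched in a few lines of Python. Everything here is illustrative: `lab_measurement` stands in for a real wet-lab experiment, and the inverse-distance surrogate for a real property model.

```python
def lab_measurement(x):
    # Stand-in for a lab experiment: the "true" property curve.
    # In practice this is a physical measurement, not a formula.
    return -(x - 0.7) ** 2 + 1.0

def surrogate_predict(x, data):
    # Tiny surrogate model: inverse-distance weighted average
    # of the measurements gathered so far.
    if not data:
        return 0.0
    weights = [(1.0 / (abs(x - xi) + 1e-6), yi) for xi, yi in data]
    total = sum(w for w, _ in weights)
    return sum(w * yi for w, yi in weights) / total

def design_loop(candidates, rounds=5):
    data = []  # (composition parameter, measured property)
    for _ in range(rounds):
        tested = {x for x, _ in data}
        untested = [c for c in candidates if c not in tested]
        # 1. The model proposes the most promising untested candidate.
        best = max(untested, key=lambda c: surrogate_predict(c, data))
        # 2. The candidate is validated in the lab.
        result = lab_measurement(best)
        # 3. The result is fed back to improve the next round of designs.
        data.append((best, result))
    return data

candidates = [i / 10 for i in range(11)]
history = design_loop(candidates)
```

Each iteration, the surrogate's greedy proposal drifts toward better-performing compositions as the measurements accumulate - the "continuous refinement loop" in miniature.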

Novakovich stressed a key challenge in ensuring reliability: “The challenge is that lab-scale and production-scale conditions often diverge, so capturing process parameters and context is critical. Without that, even accurate models can fail when applied to real-world manufacturing.”

Schmidt echoed the emphasis on evidence: “Validating AI predictions follows a staged, evidence-driven process… automated laboratories can accelerate this loop by rapidly generating or validating large volumes of high quality data.” AI also plays a role in prioritizing the most informative experiments, improving efficiency and reducing uncertainty.

Whitehead stressed the necessity of model confidence: “Materials science datasets are typically small… high quality uncertainty estimates enable scientists to understand when their model is - and crucially, is not - reliable.”

Designing for scalability from day one
While AI accelerates discovery, translating digital concepts into manufacturable materials remains a critical challenge.

According to de Ruiter, “AI-powered design can significantly reduce the time required for trial-and-error R&D… By only including commercially available monomers and integrating cost, toxicity, sustainability, and processability, scale up can be accelerated.” She also highlighted the importance of moving beyond a traditional technology-push model toward designing from market needs and end applications, increasing the likelihood of commercial success.

For Novakovich, it’s critical that implementation of AI translates into something that can be produced at scale. “AI needs to operate across the full workflow, from synthesis to processing to final application, not just optimise a single step. When done right, it reduces late-stage failures and shortens the path to commercialisation.”

Schmidt emphasised the role of digital twins to bridge lab scale and industrial reality: “Most AI models learn from laboratory conditions… digital twins allow simulations of operating conditions, stress testing assumptions, and predicting failure modes at scale.” She also notes that integrating processing-related properties - such as rheology, mixing, and extrusion - into AI workflows remains an important area for further development.

Whitehead framed AI as a multi-objective optimiser that inherently considers performance, cost, and environmental indicators in parallel rather than sequentially - an essential capability for commercially viable circular materials.
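As a concrete illustration of optimising objectives in parallel rather than sequentially, the sketch below filters hypothetical material candidates down to their Pareto front - the set of designs where no objective can be improved without worsening another. The candidate tuples and objective names are invented for the example.

```python
def dominates(a, b):
    # a dominates b if it is at least as good on every objective
    # and strictly better on at least one (all objectives minimised here).
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    # Keep only candidates not dominated by any other: the best
    # available trade-offs, evaluated in parallel rather than
    # optimising one objective after another.
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o is not c)]

# Hypothetical candidates: (1 - performance, cost, environmental impact),
# all expressed so that lower is better.
materials = [
    (0.2, 5.0, 1.1),   # strong but costly
    (0.4, 2.0, 0.6),   # cheaper, moderate performance
    (0.3, 3.0, 0.9),   # a middle-ground trade-off
    (0.5, 2.5, 0.7),   # dominated by the second candidate
]
front = pareto_front(materials)
```

A sequential approach (optimise performance first, then cost) would discard the middle-ground trade-offs that a parallel, Pareto-style view keeps on the table.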


Data as the new high value component
While AI provides extraordinary computational power, its performance ultimately depends on the calibre of the data behind it. Across the panel, each speaker returned to the same principle: premium materials innovation requires premium data foundations. 

De Ruiter stated it plainly: “If you put garbage in, you get garbage out… Our models learn from an ever-growing, expert-validated dataset spanning a large chemical space.” These datasets combine internal research, technical data sheets, and scientific literature, with active learning guiding further data generation.

Schmidt highlighted how modern foundation models elevate what’s possible: “These models start from broad chemical knowledge and are then fine-tuned… Model uncertainty provides essential context and is a prerequisite for building trust.” In other words, the sophistication of today’s models doesn’t replace the need for good data - it amplifies its value.

Novakovich emphasised the role of combining historical data with active learning to improve efficacy. “Confidence comes from transparency, traceability, and continuously improving through learning loops of data<>results, not from treating the model as a black box of a linear R&D workflow.”

Tom Whitehead distilled the challenge for the sector: “High quality experimental data is the bedrock… but architecture must support statistically robust insights from small datasets.”

In materials science, where large datasets are rare, data quality, model design, and uncertainty estimation become critical differentiators.
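One common way to obtain the kind of uncertainty estimates Whitehead describes from a small dataset is bootstrap resampling: refit the model many times on resampled copies of the data and read the spread of the predictions as a confidence signal. The sketch below is a generic illustration, not Intellegens' method, and the additive-loading dataset is invented.

```python
import random
import statistics

def fit_line(points):
    # Ordinary least squares for y = a*x + b on a small dataset.
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points)
    sxy = sum((x - mx) * (y - my) for x, y in points)
    a = sxy / sxx if sxx else 0.0
    return a, my - a * mx

def bootstrap_predict(points, x_new, n_models=200, seed=1):
    # Refit the model on resampled copies of the (small) dataset;
    # the spread of the predictions is an uncertainty estimate.
    rng = random.Random(seed)
    preds = []
    for _ in range(n_models):
        sample = [rng.choice(points) for _ in points]
        if len({x for x, _ in sample}) < 2:
            continue  # degenerate resample: cannot fit a slope
        a, b = fit_line(sample)
        preds.append(a * x_new + b)
    return statistics.mean(preds), statistics.stdev(preds)

# Hypothetical small dataset: additive loading vs. measured tensile strength.
data = [(0.0, 30.0), (1.0, 34.0), (2.0, 39.0), (3.0, 41.0)]
inside = bootstrap_predict(data, 1.5)   # interpolating within the data
outside = bootstrap_predict(data, 8.0)  # extrapolating far beyond it
```

The spread is much wider for the extrapolated point - exactly the signal a scientist needs to know when the model is, and crucially is not, reliable.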

AI as an enabler of circularity
Every speaker highlighted AI’s growing role in designing sustainability into materials from inception.

De Ruiter emphasised: “We need to design new materials for more than just performance… including safety, feedstock origin, recyclability, and biodegradability.”

Schmidt showcased AI’s capacity to optimise chemistry itself: “AI based reaction tools can reduce waste, minimise hazardous by products, and favour renewable or recycled feedstocks.”

Whitehead added that circularity must no longer be an afterthought: “AI enables all relevant objectives to be considered throughout the development process, ensuring circularity is built in from the start.”

What’s next: A more intelligent, integrated R&D landscape
Looking ahead, the panellists converged on a vision of deeply integrated digital R&D.

Novakovich envisions: “The biggest impact will come from connecting previously siloed data, linking chemistry, processing, and performance into a unified AI-enabled system, enabling the fundamental transformation of materials innovation.” 

Schmidt predicted that “AI is expected to become an everyday collaborator… AI driven robotic labs will operate increasingly autonomously, closing loops between predictions and physical validation.”

De Ruiter sees transformative potential across the value chain, including synthesis, compounding, and process optimisation.

Whitehead anticipates AI enabled strategies that reduce material use, rethink product architectures, and prioritise circularity at the system level.
________________________________________

The insights from innovation leaders at TNO, IBM, Intellegens, and Alchemy Cloud make one thing clear: AI, data, and digital design are no longer auxiliary tools - they are becoming the backbone of modern materials innovation.

As the demands on material performance, sustainability, and scalability grow, the ability to navigate vast design spaces, integrate complex constraints, and reduce experimental cycles will define the leaders of the next decade.

Across polymer innovation, industrial process modelling, automated labs, and circular design, a new premium standard is emerging: materials innovation where intelligence is embedded end to end.

Don’t miss your chance to explore how AI, automation and digital design are among the trends defining the next decade of material innovation and reuse across industries. Register now to secure your place and be part of the movement transforming packaging for a circular future.
