As construction materials become more complex and testing volumes increase, artificial intelligence is emerging as a practical tool to improve consistency, speed, and compliance in materials testing labs. Mehdi Dargahi, national sales manager at ZwickRoell, explains how AI-assisted testing works in practice, what it means for standards compliance, and how labs can get started. From real-time crack detection in hole expansion tests to AI-guided tensile test setup, Dargahi outlines a vision where AI acts as an experienced colleague—supporting decisions and catching errors—without replacing the human expert or the standardized test method.
Materials testing has been around for decades. Why is AI becoming practical and relevant now?

AI is becoming practical in materials testing now because our testing environment has become far more digital and data-rich. In mechanical testing, customers are generating huge volumes of test data, and AI can analyze that data faster, spotting patterns and deviations that are easy to miss in manual reviews. This is especially relevant as labs and production environments become more automated, where throughput and consistency matter.
Just as important, we now have the infrastructure to make AI useful in day-to-day work: centralized storage of test and machine data and flexible interfaces are the foundation for reliable AI support.
And the performance is there. For example, in our AI-assisted hole expansion testing to ISO 16630, we integrated a neural network into our testXpert software to detect cracks in real time, processing each image in ~50 ms on a standard test computer.
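To make the real-time pattern concrete, here is a minimal sketch (not ZwickRoell's implementation; `classify_frame` is a hypothetical stand-in for the neural-network inference step): each camera frame is classified against a per-image latency budget, and the test is flagged as soon as a crack appears.

```python
import time

FRAME_BUDGET_MS = 50  # per-image latency target mentioned in the interview


def classify_frame(frame):
    """Stand-in for the crack classifier (hypothetical).

    Returns True when a 'crack' is detected. Here a trivial intensity
    rule on a synthetic frame; a real system would run CNN inference
    on the camera image.
    """
    return max(frame) > 200


def run_hole_expansion_monitor(frames):
    """Process frames in order; return the index of the first crack frame."""
    for i, frame in enumerate(frames):
        start = time.perf_counter()
        crack = classify_frame(frame)
        elapsed_ms = (time.perf_counter() - start) * 1000
        if elapsed_ms > FRAME_BUDGET_MS:
            print(f"frame {i}: over latency budget ({elapsed_ms:.1f} ms)")
        if crack:
            return i  # crack detected -> stop / flag the test here
    return None  # no crack observed


# Synthetic 'frames': lists of pixel intensities; crack appears at frame 3.
frames = [[10, 20, 30], [15, 25, 35], [40, 50, 60], [90, 250, 120]]
print(run_hole_expansion_monitor(frames))  # → 3
```

The point of the sketch is the shape of the loop, not the classifier: keeping inference within the frame budget is what makes per-image detection usable during a running test.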
What does “AI-assisted testing” actually mean in practice? Does the machine run tests autonomously?
In practice, AI-assisted testing means we keep the standardized test method, and AI supports specific tasks that are traditionally time-consuming, subjective, or error-prone, especially setup checks and evaluation. In our vision for an AI-assisted tensile test, we use AI like an experienced colleague: it asks what test we’re planning, what material we’re testing, and which standards apply. It can then recommend parameters and configure the software accordingly.
That concept also shows how AI can improve compliance and reduce avoidable errors. An optical AI system can assist with clamping and verify the setup (e.g., flagging a missing extensometer or misalignment) before the test proceeds.
So, no. AI assistance doesn’t automatically mean the lab is “hands-off.” It’s about guided correctness and more consistent decisions, not removing the expert from the loop.
How can organizations get started with AI-assisted testing?
We recommend starting with one or two high-value, well-bounded use cases, such as automated evaluation or anomaly detection, where AI can quickly improve consistency and speed. AI can identify deviations and patterns in test data to support predictive quality assurance (e.g., early detection of batch deviations or faulty samples).
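As a rough illustration of what "anomaly detection on test data" can mean at its simplest (a sketch with made-up numbers, not a ZwickRoell algorithm), a lab could start with a plain z-score check over a batch of tensile-strength results and flag samples that deviate too far from the batch mean:

```python
from statistics import mean, stdev


def flag_outliers(values, z_threshold=3.0):
    """Return indices of measurements whose z-score exceeds the threshold."""
    mu, sigma = mean(values), stdev(values)
    return [
        i for i, v in enumerate(values)
        if sigma > 0 and abs(v - mu) / sigma > z_threshold
    ]


# Synthetic tensile strengths (MPa) for one batch; sample 5 is faulty.
rm = [512, 515, 509, 514, 511, 450, 513, 510]
print(flag_outliers(rm, z_threshold=2.0))  # → [5]
```

Production systems would use richer features and models, but even this baseline shows the value proposition: deviations are caught systematically rather than depending on whoever happens to review the results.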
Next, make sure your foundation is in place: test and machine data must be reliable and captured with minimal user influence, because AI is only as good as the data it learns from and evaluates.
Finally, choose solutions designed for controlled use in testing. For example, in hardness testing our approach relies on defined, centrally validated models to avoid uncontrolled changes and keep results consistent.
You’ve developed an AI concept machine for tensile testing. What can labs see when they experience it?
When labs experience our AI-assisted tensile-test concept, they see a workflow where AI acts as a true testing companion, not just a feature in the background. We demonstrate what’s already possible today, while being transparent that it’s an early concept and not a finished product.
In the demo, setup begins with a conversation. The AI asks what test we’re planning, what material we’re working with, which standards apply, and what we should consider. It then provides guidance, explains standard-related details, suggests parameters, and can automatically configure the testing software (testXpert) based on that context.
We also show how optical AI can help verify compliance-critical setup steps, assisting with specimen clamping, checking the configuration via camera, and flagging issues like a missing extensometer or misalignment before the test runs. During and after testing, the AI can support plausibility checks and highlight anomalies or unexplained deviations for review. It doesn’t do the job for you, but makes sure the job is done correctly!
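The pre-test verification step can be sketched as a simple gate: the camera system's detections (hypothetical structure here, not the actual testXpert interface) are checked against required components and alignment limits, and the test only proceeds with an empty issue list.

```python
def verify_setup(detections, required=("specimen", "extensometer"),
                 max_tilt_deg=1.0):
    """Pre-test check on hypothetical camera detections.

    Flags missing components and specimen misalignment before the
    test is allowed to start. An empty return list means 'OK to run'.
    """
    issues = []
    for name in required:
        if name not in detections:
            issues.append(f"missing: {name}")
    tilt = detections.get("specimen", {}).get("tilt_deg", 0.0)
    if abs(tilt) > max_tilt_deg:
        issues.append(f"misalignment: specimen tilted {tilt:.1f} deg")
    return issues


# Extensometer not detected, specimen tilted: two issues, test blocked.
print(verify_setup({"specimen": {"tilt_deg": 2.3}}))
```

The design choice matters more than the code: compliance-critical checks run before the test, so an invalid setup produces a blocked run rather than an invalid result.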
How do you ensure AI-assisted results remain compliant with ASTM, ISO, and industry standards?
We start with a non-negotiable principle: AI must not replace what the standard defines. It must support evaluation within the prescribed method. Our ecos AI approach in hardness testing is designed exactly this way. It meets normative requirements according to ISO, ASTM, and NADCAP, and we do not use an alternative measurement approach. The software measures exactly as the standards prescribe.
We also design AI solutions to be stable and auditable in a lab setting. For example, ecos AI runs offline on a standard PC, keeps data local, and operates using defined models that do not continue learning independently, helping prevent uncontrolled changes that could compromise repeatability.
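One common way to make a frozen model auditable (a generic sketch under assumed byte-level model storage, not a description of ecos AI internals) is to fingerprint the deployed model file and compare it against the centrally validated value before each session, so any uncontrolled change is detected immediately:

```python
import hashlib


def model_fingerprint(model_bytes):
    """SHA-256 fingerprint of a frozen model file.

    Comparing it to the centrally validated value detects any change
    to the deployed model, intentional or not.
    """
    return hashlib.sha256(model_bytes).hexdigest()


# Hypothetical frozen model weights as bytes.
validated = b"frozen-model-weights-v1"
approved_hash = model_fingerprint(validated)

# Before each test session, verify the deployed model is unchanged.
deployed = b"frozen-model-weights-v1"
print(model_fingerprint(deployed) == approved_hash)   # → True

tampered = b"frozen-model-weights-v2"
print(model_fingerprint(tampered) == approved_hash)   # → False
```

Because the model does not retrain in the field, the same fingerprint should hold across sessions; a mismatch is a hard stop for the lab, not a warning.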
Where a standard includes critical interpretation steps, we align training and labeling to that standard. In our ISO 16630 hole expansion solution, crack annotations strictly adhere to ISO 16630, supporting standard-conform results while reducing operator dependency.
