AtoZ Blog

The AI Act Is Changing the Development of Mobile Machinery – Is Software Testing Keeping Up?

Written by Tommi Puonti | Dec 15, 2025 8:24:33 AM

The EU Artificial Intelligence Act (AI Act) entered into force in August 2024 and will be implemented in phases through 2026. Many people associate it only with chatbots like ChatGPT, but its impact also extends to industry and mobile machinery.

As forestry machines, tractors, and mining drilling equipment become more intelligent, their software testing must also evolve to meet this new reality. The level of automation is increasing, and machines are making increasingly independent decisions. At the same time, regulation is tightening: the new Machinery Regulation (EU 2023/1230) and the AI Act together define boundaries that require more rigorous demonstration of safety.

How Do the AI Act and the Machinery Regulation Complement Each Other?

It is important to understand that these are not two separate bureaucratic constructs. In the development of mobile machinery, they form a unified whole.

The AI Act defines risk categories for AI systems, data requirements, and explainability—that is, the boundaries within which “intelligence” must operate.

The Machinery Regulation (2023/1230), on the other hand, governs the physical safety of machinery, CE marking, and defines when software is classified as a safety component.

In practice, the AI Act guides the quality and operational logic of artificial intelligence, while the Machinery Regulation defines the overall safety of the machine and the manufacturer’s responsibilities.

You can read more about the Machinery Regulation in an earlier blog post: The New Machinery Regulation Tightens Software Testing Requirements.

What Does This Mean in Practice for Software Development and Quality Assurance?

When Code Becomes a Safety Component

The AI Act classifies AI systems based on risk. In the case of mobile machinery, systems almost always fall into the high-risk category.

For example, if a machine vision system functions as a safety component—such as camera-based obstacle detection that stops a machine when a person approaches—the strictest requirements of the AI Act apply. The software must demonstrate:

  • Risk management

  • Data quality and representativeness

  • Implementation of human oversight

  • Robust operation even in fault conditions

This has a direct impact on software testing. We are no longer testing only whether a specific feature works, but whether it is safe in all possible situations.
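As a minimal sketch of what this shift looks like in test code, consider a camera-based obstacle detector acting as a safety component. The names and the stop policy below are hypothetical, not from any real machine's codebase; the point is that safety tests assert fail-safe behavior, not just the happy path:

```python
# Hypothetical sketch: a camera-based obstacle detector acting as a
# safety component must command a stop whenever a person is detected.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g. "person", "rock", "unknown"
    confidence: float  # 0.0–1.0

def safe_stop_required(detections, confidence_threshold=0.5):
    """Return True if the machine must stop.

    Conservative policy: any 'person' detection at or above the
    threshold, or any object we cannot classify, triggers a stop.
    """
    for d in detections:
        if d.label == "person" and d.confidence >= confidence_threshold:
            return True
        if d.label == "unknown":
            return True  # fail safe on unclassified objects
    return False

# A safety test asserts behavior across situations, not one happy path:
assert safe_stop_required([Detection("person", 0.9)]) is True
assert safe_stop_required([Detection("rock", 0.9)]) is False
assert safe_stop_required([Detection("unknown", 0.2)]) is True  # fail safe
```

Note the design choice in the sketch: ambiguity resolves toward stopping. That is the kind of property the AI Act's high-risk requirements push you to state explicitly and test.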

A New Set of Requirements for Software Testing

The AI Act introduces requirements that traditional “happy path” testing does not cover. In software testing for mobile machinery, the following three areas must now be specifically addressed under both the Machinery Regulation and the AI Act:

1. Ensuring Data Quality and Representativeness

AI systems are only as good as the data they are trained on. Testers must ensure that the training data is not biased.

Example: Does the machine’s vision system detect obstacles equally reliably in bright sunlight, snowfall, and on a dusty construction site? Test data must cover the entire operational environment—not just perfectly modeled test conditions.

2. Robustness and Fault Tolerance

According to the AI Act, systems must be “resilient to errors, faults, and inconsistencies.”

Example: How does the vision system react if one sensor becomes dirty or produces noisy data? Traditional software might crash or raise an error message, but an AI-based control system must not make a dangerous decision even with incomplete information. Test plans must include cases where the system is deliberately exposed to abnormal inputs.
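A fault-injection test for this might look like the following sketch. The speed thresholds and the `sensor_ok` health flag are invented for illustration; the principle is that degraded input must map to a safe output, never to normal operation:

```python
# Hypothetical sketch: fault injection for an AI-based distance estimate.
# The controller must never command full speed on degraded sensor data,
# even if the raw reading happens to look "safe".
import math

def safe_speed(distance_m, sensor_ok):
    """Pick a speed limit (m/s) from a possibly degraded distance reading."""
    if not sensor_ok or distance_m is None or math.isnan(distance_m):
        return 0.0   # degraded or invalid input: fail safe, stop
    if distance_m < 2.0:
        return 0.0   # obstacle too close
    if distance_m < 10.0:
        return 1.0   # creep speed near obstacles
    return 5.0       # normal operating speed

# Deliberately abnormal inputs belong in the test plan:
assert safe_speed(float("nan"), sensor_ok=True) == 0.0   # noisy/invalid data
assert safe_speed(50.0, sensor_ok=False) == 0.0          # dirty sensor flagged
assert safe_speed(50.0, sensor_ok=True) == 5.0           # nominal case
```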

3. Explainability and Logging

The system must be able to demonstrate how and why it made a specific decision.

In practice: The system must log its decision-making process in an understandable way. If the machine turns left, the logs must show which sensor data led to that decision.
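The "machine turns left" example can be sketched as a decision function that records its evidence alongside its output. The sensor fields and log schema here are hypothetical, but the pattern, inputs and decision stored together in a serializable record, is what makes an after-the-fact audit possible:

```python
# Hypothetical sketch: log every control decision together with the
# sensor evidence that produced it, so an audit can reconstruct "why".
import json
import time

def decide_and_log(sensor_readings, log):
    """Steer away from the side with the nearer obstacle, and log why."""
    left, right = sensor_readings["left_m"], sensor_readings["right_m"]
    decision = "turn_left" if right < left else "turn_right"
    log.append({
        "timestamp": time.time(),
        "inputs": sensor_readings,   # the evidence behind the decision
        "decision": decision,
    })
    return decision

log = []
# Obstacle is nearer on the right, so the machine turns left...
assert decide_and_log({"left_m": 8.0, "right_m": 3.0}, log) == "turn_left"
# ...and the log entry shows exactly which readings led to that decision.
entry = json.loads(json.dumps(log[0]))  # records stay serializable for audit
assert entry["decision"] == "turn_left"
assert entry["inputs"]["right_m"] == 3.0
```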

The Growing Role of Simulation

Not all hazardous situations can—or should—be tested in real operating environments. This is why simulation-based testing is becoming increasingly important. In simulations, a wide range of scenarios can be executed, including extreme edge cases that would not be practical or safe to test on real equipment.
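A common way to organize such simulation runs is a scenario matrix: every combination of weather, speed, and obstacle type is swept automatically. The simulator below is a stand-in stub with one deliberately injected edge-case failure, purely to show how an exhaustive sweep surfaces combinations no one would test on real hardware:

```python
# Hypothetical sketch: a scenario matrix for simulation-based testing.
# Extreme combinations that are unsafe on a real machine can be swept
# exhaustively in simulation.
import itertools

WEATHERS = ["clear", "snow", "fog"]
SPEEDS_KMH = [5, 15, 30]
OBSTACLES = ["none", "static", "moving_person"]

def run_scenario(weather, speed_kmh, obstacle):
    """Stand-in for a real simulator run; returns True if the machine
    stopped in time. Here it always succeeds except for one injected
    edge case: a moving person in fog at high speed."""
    if obstacle == "moving_person" and weather == "fog" and speed_kmh >= 30:
        return False
    return True

# Sweep the full matrix (3 x 3 x 3 = 27 scenarios) and collect failures.
failures = [
    combo for combo in itertools.product(WEATHERS, SPEEDS_KMH, OBSTACLES)
    if not run_scenario(*combo)
]
assert failures == [("fog", 30, "moving_person")]  # the edge case surfaced
```

In a real pipeline the stub would be replaced by calls into an actual simulation environment, and the failing combinations would feed back into the risk-management file required under the regulations.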

TL;DR: Compliance Is a Competitive Advantage

The AI Act should not be seen only as a burden. It forces higher quality standards in software development and testing and sparks necessary discussion. A safe, well-tested, and well-documented machine is more reliable for end customers and helps build the manufacturer’s reputation as a developer of safe mobile machinery.

Further information on the AI Act and the Machinery Regulation 2023/1230: