Companies that want to optimize their processes, make more informed decisions or develop innovative business models are increasingly relying on AI systems. However, the technological opportunities are also accompanied by considerable legal challenges, particularly with regard to warranty law. Anyone wishing to acquire an AI system for their company or train their own neural network on platforms such as Azure or AWS should carefully examine the legal framework.
The special nature of AI systems in warranty law
German warranty law is fundamentally based on the provisions of sales or work contract law. The main question here is whether the goods delivered or the work created are free from defects. This is comparatively easy to assess for classic products. However, AI systems are in a category of their own. They are characterized by the fact that they are dynamic and continuously evolve by processing new data. This ability to learn can lead to unexpected behavior and makes the assessment of a defect much more complex.
A purchase or work contract, however, only makes sense where a quasi-finished solution is procured that can be integrated into the company without major adjustments or ongoing modifications. This must be distinguished from the case where a company wants to operate a neural network and train it on an ongoing basis, integrating the results into its operational processes using its own scripts. Here the focus of the performance is not the delivery of a finished product but the training and continuous calibration of the network. This ongoing adjustment constitutes a service, which is assessed differently in law from a contract for work and services.
In the case of defective services, various claims for damages come into consideration. Under Section 280 (1) BGB, the creditor can demand compensation if the debtor has breached an obligation arising from the contractual relationship and is responsible for this breach of duty. In addition, under the conditions of Section 281 BGB, damages can be claimed in lieu of performance if the debtor fails to render a service that is due, or does not render it as owed, and a reasonable period set for subsequent performance has expired without success. If the obligation to perform is excluded under Section 275 BGB, damages may be claimed under Section 283 BGB. Finally, under Section 284 BGB the creditor can demand compensation for futile expenses incurred in reliance on receiving the service that have become useless due to the defective performance.
Integration of AI into existing systems
Many companies acquire AI systems as ready-made solutions that have to be integrated into the existing IT landscape. This raises not only technical but also legal questions. A central aspect is the definition of the nature of the AI system. This is about what specific requirements are placed on the performance of the AI. How precise should predictions be? How quickly should the system react? Such questions should be clearly regulated in the contract in order to avoid disputes later on.
The environment in which the AI system is operated also plays a decisive role. The performance of an AI can depend significantly on whether it runs on suitable hardware and works with compatible software. If such requirements are not clearly defined, there is a risk that responsibility for potential functional problems will remain unclear. Companies should ensure that such aspects are taken into account in the contract.
Another key point concerns the data that the AI system is to process. It is often necessary to prepare the data before it is used, as unstructured or incorrect data can significantly impair the performance of the AI. It is advisable to contractually define who is responsible for data preparation and which quality standards must be met.
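To make the idea of a contractually agreed data-quality standard concrete, the following is a minimal sketch in Python. All field names, thresholds, and the notion of "prepared" data here are hypothetical assumptions for illustration, not anything the article or any specific contract prescribes; the point is only that such standards can be made measurable and therefore enforceable.

```python
# Illustrative sketch: a minimal data-quality check of the kind a contract
# might reference when allocating responsibility for data preparation.
# All field names and thresholds below are hypothetical assumptions.

from dataclasses import dataclass


@dataclass
class QualityReport:
    total: int
    missing: int    # records lacking a required field entirely
    malformed: int  # records where a required field is present but empty

    @property
    def error_rate(self) -> float:
        return (self.missing + self.malformed) / self.total if self.total else 0.0


def check_records(records: list[dict], required_fields: tuple[str, ...]) -> QualityReport:
    """Count records that fail the agreed completeness criteria."""
    missing = malformed = 0
    for rec in records:
        if any(f not in rec for f in required_fields):
            missing += 1
        elif any(not str(rec[f]).strip() for f in required_fields):
            malformed += 1
    return QualityReport(total=len(records), missing=missing, malformed=malformed)


# Example: a contract might stipulate that at most 5% of delivered records
# may fail such a check before the data counts as "prepared".
report = check_records(
    [{"id": "1", "text": "ok"}, {"id": "2", "text": " "}, {"text": "no id"}],
    required_fields=("id", "text"),
)
print(round(report.error_rate, 3))  # 2 of 3 records fail -> 0.667
```

A measurable criterion of this kind turns a vague duty ("the customer shall provide suitable data") into something both sides can verify before a dispute arises.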
Training neural networks with your own data
Anyone who decides to train their own neural network on platforms such as Azure or AWS faces further legal challenges. One key question concerns responsibility for the training data. If this is incomplete or of inferior quality, this can significantly impair the performance of the finished model. Companies should therefore make clear agreements on how the data is prepared and provided.
Another important issue is the ownership of the results. When using cloud platforms, it is often not clearly regulated who owns the rights to the trained models or the data used. Providers such as AWS sometimes reserve the right to reuse the data. This can be particularly problematic when sensitive or competitively relevant information is involved. Clear contractual regulations are essential here.
The question also arises as to how errors in the training process or defects in the finished model are to be legally assessed. As the success of the training process depends on many factors, the contract should contain clear guarantees as to what results can be expected and who is liable in the event of deviations. In the case of services such as neural network training, the above-mentioned provisions of Sections 280, 281, 283 and 284 of the German Civil Code (BGB) may be relevant in order to enforce legal claims in the event of poor performance.
AI in procurement: it’s not that simple
Companies should be clear about what they actually want to procure: a standalone solution purchased "out of the box" – or training on existing, possibly purchased infrastructure (such as from AWS)? And where training is involved: has continuous calibration been contractually agreed?
The perspective of the provider who develops or sells a solution is different: virtually every IT service provider I know prefers to provide a service (or sometimes a rental solution). However, this is not always the ideal scenario, and with AI systems in particular, where "only" the training is owed, things become difficult when dysfunctional behavior occurs: when was the training technically correct, and when were faulty results genuinely unforeseeable? On top of this come numerous duties to instruct the customer, which are ideally already fulfilled in the contract itself. Future disputes can be thankless and very expensive.
Data protection and liability
Another key aspect in the use of AI systems is the handling of data. Especially when personal data is processed, companies must ensure that all legal requirements, in particular the General Data Protection Regulation (GDPR), are complied with. This applies not only to obtaining consent and the purpose limitation of data processing, but also to the question of how long the data is stored and who has access to it.
At the same time, companies should clarify how liability issues are regulated if the AI makes incorrect decisions. Such errors can have far-reaching consequences, for example if they lead to incorrect diagnoses or inaccurate financial forecasts. It is important to specify in the contract who is liable for such errors and what the consequences are.

A look at the legal literature is somewhat surprising: the defectiveness of AI systems is currently hardly discussed at all, and where it is, only in essays that are years old. It is not yet recognized that the performance actually rendered in practice is often to be qualified as a service rather than a purchase or work contract – with considerable consequences for the contracts to be drafted!
Conclusion
The implementation of AI systems offers great opportunities, but also entails considerable legal risks. Companies investing in AI should ensure that all relevant aspects – from the nature of the system and integration into the IT landscape to the use and processing of data – are clearly regulated in the contract. This is the only way to avoid potential conflicts and ensure the performance of the systems. Sound legal advice is essential in order to overcome the various legal challenges and fully exploit the potential of AI systems.