News, January 29, 2026

AI in Dietary Supplement Marketing – How to Balance Innovation with Legal Compliance?

The use of generative artificial intelligence (AI) has rapidly become widespread in activities requiring the creation and processing of text, graphics, photographs, audio works, and audiovisual content, as well as in advertising personalization, process automation, and data analysis (including scientific data). As a result, tasks in advertising campaigns—such as creating social media materials, product descriptions and labeling, data analysis, or customer service—can be performed faster and more easily. There is no doubt that AI is a powerful tool for cost optimization. At the same time, one must not overlook the risks associated with the use of AI tools, particularly in the hands of an unaware user, and of subsequent users of AI-generated content, for whom these risks may compound. This is especially important for the dietary supplements industry, which in Poland is one of the most dynamic yet also one of the most heavily regulated sectors. In this industry, AI therefore represents not only a time-saving solution, but also a new compliance challenge.

Who is the author in the age of algorithms?

The key question is no longer whether to use AI, but how to classify the results of such work under existing regulations. The first challenge concerns intellectual property protection, which in the Polish legal system is based on the definition of a work and the concept of an author.

Under the Polish Act on Copyright and Related Rights, the subject matter of copyright protection is any manifestation of creative activity of an individual character, fixed in any form, regardless of its value, purpose, or manner of expression (a “work”). A work may consist of both creative and non-creative elements, but it must always be assessed as a whole. If the result is purely technical or reproductive work, even if it requires specific knowledge and skills, it will not qualify as a work. Moreover, according to the dominant view in legal doctrine and case law, a work must be the result of human activity (that of the author). AI, like advanced graphic editors, is merely a tool; however, the nature of its operation drastically changes the proportion between human and automated work. In addition, AI models are trained on vast datasets, meaning that generated graphics or texts may resemble existing works. The more tasks are delegated to the algorithm, the stronger the human creative contribution must be for the final result to qualify as a copyrighted work at all. Therefore, to ensure legal protection for a dietary supplement label or advertisement, the human role cannot be limited to issuing a simple command—it must retain a dominant and creative influence on the final form of the work, thereby compensating for the reduced technical effort.

The consequences may be significant. If campaign content does not qualify as a work, there is no right to prohibit competitors from copying it. Another dietary supplement manufacturer may legally use the same materials. It should be noted, however, that the lack of copyright protection does not entirely eliminate the risk of disputes—in certain circumstances, copying a competitor’s overall concept may be assessed under the Act on Combating Unfair Competition, for example as unfair imitation or misleading conduct.

Standard marketing services agreements often include clauses on the “transfer of economic copyrights in all specified fields of exploitation” (or the granting of a license to the same extent). In the case of AI-generated content, such provisions may prove legally ineffective. A creator cannot transfer rights that never arose, in accordance with the Roman law principle nemo plus iuris in alium transferre potest quam ipse habet (no one can transfer more rights than they possess). It is therefore advisable to specify in contracts that the contractor guarantees a certain level of human contribution to the creative process and documents the creation process, so that in the event of a dispute it can be demonstrated that AI was merely a tool rather than an autonomous creator. Problems may also arise in the context of accounting for an employee’s creative working time. In order to apply the 50% tax-deductible costs (KUP), it is necessary for a work to be created within the meaning of copyright law and for evidence of its creation to be available.

Regardless of the above, by disseminating AI-generated material, a dietary supplement manufacturer assumes the risk that the model was trained on copyrighted works of other artists. In an industry where brand image is built on trust, allegations of plagiarism—even unintentional—may prove costly both reputationally and financially. Due to the uncertainty surrounding the emergence of copyright protection, contracts should include not only declarations on the transfer of rights, but above all broad warranty clauses and an obligation on the creator to assume full liability for damages (including legal defense costs) in the event of third-party claims or challenges to the legal nature of the materials.

Pitfalls in platform terms and conditions

Another issue concerns the right to distribute a work created by AI. Even if the result of AI-assisted work qualifies as a work under the Act, a dietary supplement manufacturer must also contend with restrictions arising from the terms of service of specific AI platforms. These may include provisions limiting the use of outputs exclusively to non-commercial purposes (so-called Personal Use). The use of an advertising slogan or packaging graphic may therefore constitute a breach of the platform’s terms and, consequently, the agreement with the technology provider. Moreover, such terms are often unilaterally amended, meaning that a change in licensing policy by the AI provider may affect the right to continue distributing materials. For entities in the dietary supplements industry that invest significant resources in branding, reliance on an unstable contractual basis clearly constitutes a risk.

In addition, “as-is” clauses are standard in such terms and conditions. Tool providers do not guarantee that generated content does not infringe third-party rights, provide no warranties or legal guarantees, and often reserve the right to use prompts and outputs to train their models (which raises the risk of trade secret leakage).

Implementing AI in marketing processes therefore requires an audit of terms and conditions to ensure explicit permission for commercialization, the ability to grant sublicenses, and the right to modify and combine outputs with other works (e.g. brand logos).

It should also be remembered that the risk of intellectual property infringement is not limited to copyright alone. AI generating images of people (e.g. experts) may rely on the features of real individuals without their consent. The use of so-called synthetic faces may lead to allegations of infringement of personal rights or misleading consumers as to the authenticity of endorsements.

Hallucinations and health claims

In the dietary supplements industry, every word used in marketing communication is subject to strict verification for compliance with the EFSA register of health claims. This is where the phenomenon of so-called hallucinations comes into play. Language models do not understand scientific facts or legal norms and tend to generate content that sounds highly credible but is substantively incorrect. They may therefore generate content attributing medicinal or disease-preventing properties to supplements, which is prohibited under Article 7 of Regulation (EU) No 1169/2011 and strictly enforced by Polish supervisory authorities. For AI, the line between a permissible health maintenance claim (e.g. “contributes to the normal functioning of the immune system”) and a prohibited medicinal claim (e.g. “has anti-inflammatory properties”) is extremely thin and often unnoticed by the algorithm. The lack of rigorous legal oversight of generated content may lead to situations where, in pursuit of marketing appeal, the algorithm uses wording too close to the definition of a medicinal product. In this context, AI should be treated solely as a tool for generating preliminary drafts that must undergo mandatory expert review.
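The pre-screening step described above can be partially automated before material reaches a reviewer. The sketch below is purely illustrative: the phrase list is a hypothetical example, not the EFSA register or any authoritative wordlist, and a naive substring check like this can only flag drafts for mandatory expert review, never replace it.

```python
# Illustrative pre-screening sketch: flag draft marketing copy containing
# wording suggestive of prohibited medicinal claims, before expert review.
# The phrase list is a hypothetical example, NOT the EFSA register.

# Hypothetical examples of wording that tends toward medicinal claims
SUSPECT_PHRASES = [
    "anti-inflammatory",
    "treats",
    "cures",
    "prevents disease",
    "heals",
]

def flag_suspect_claims(copy_text: str) -> list[str]:
    """Return suspect phrases found in the draft (naive, case-insensitive substring match)."""
    lowered = copy_text.lower()
    return [phrase for phrase in SUSPECT_PHRASES if phrase in lowered]

draft = "Our supplement has anti-inflammatory properties and supports immunity."
print(flag_suspect_claims(draft))  # prints ['anti-inflammatory']
```

In practice such a filter would only route flagged drafts into the human review queue; permissible claims such as "contributes to the normal functioning of the immune system" would pass through untouched, which is exactly why the final legal assessment must remain with an expert.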

Transparency under the AI Act

Attention should also be paid to obligations being phased in under the EU Artificial Intelligence Act (AI Act). Although the AI Act entered into force in August 2024, key transparency obligations relevant to marketing will become fully applicable from August 2026. One such provision is Article 50(4), which requires entities using an AI system that generates or manipulates images, audio, or video constituting deepfake content to disclose that such content has been artificially generated or manipulated. Where the content forms part of material of an explicitly artistic, creative, satirical, fictional, or similar nature, the obligation will be limited to disclosing the existence of such generated or manipulated content in an appropriate manner that does not hinder the display or use of the work.

In advertising practice, this may mean the need to clearly inform consumers that the image of a person promoting a product or the presented graphic was created by an algorithm. Concealing the use of generative technologies—especially when creating user reviews or expert images—may be considered an unfair market practice or misleading action (or omission).

Therefore, to avoid allegations of lack of transparency in consumer communication, dietary supplement manufacturers should use the coming period to audit their creative processes and prepare appropriate labeling for generative content.

What should entrepreneurs do now? Four key recommendations

Document the creative process – it is worth collecting evidence of human contribution (sketches, prompt iterations) in order to defend copyright in AI-assisted outputs. This is also crucial for the correct accounting of creative working time and the safe application of 50% tax-deductible costs (KUP). Failure to demonstrate the employee’s dominant role in the creation process may expose the employer to the risk of the preferential tax rate being challenged by tax authorities.

Conduct ongoing audits of tool terms and conditions – even purchasing a paid subscription does not guarantee full rights to commercial use of outputs. Terms of service must be regularly reviewed, as AI tool providers often reserve the right to amend them unilaterally at any time. It is essential to ensure that a given plan—under the current wording of the terms—actually permits monetization of outputs and does not impose additional restrictions on sensitive industries.

Introduce expert oversight – every AI-generated draft should be approved by an expert with regard to health claims. AI does not recognize the boundaries between a dietary supplement and a medicinal product, making this stage critical for avoiding penalties from the Chief Sanitary Inspectorate (GIS) or the Office of Competition and Consumer Protection (UOKiK).

Implement internal, transparent rules – it is advisable to develop standards for labeling AI-generated content and rules for its creation now. This allows companies to get ahead of AI Act requirements, standardize internal processes, and build a trustworthy brand image.
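The first recommendation, documenting the creative process, can be supported with a simple technical audit trail. The sketch below is a hypothetical illustration: the file name, record fields, and workflow are assumptions, not a prescribed standard, but recording each prompt iteration with a timestamp, an author, and a content hash produces the kind of evidence of human contribution discussed above.

```python
# Minimal audit-trail sketch: append each prompt iteration to a JSON Lines
# log with a timestamp and a content hash, as evidence of human contribution.
# The file name and record fields are hypothetical, for illustration only.
import hashlib
import json
from datetime import datetime, timezone

def log_prompt_iteration(log_path: str, author: str, prompt: str, note: str) -> dict:
    """Append one prompt iteration to a JSON Lines audit log and return the record."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "author": author,
        "prompt_sha256": hashlib.sha256(prompt.encode("utf-8")).hexdigest(),
        "prompt": prompt,
        "note": note,  # e.g. what the human changed in this iteration and why
    }
    with open(log_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record, ensure_ascii=False) + "\n")
    return record
```

Such a log, kept alongside sketches and draft versions, is one way to show in a later dispute, or during a tax audit of 50% tax-deductible costs (KUP), that AI served as a tool under dominant human direction.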

For more details and support in the safe implementation of AI in your organization, we invite you to contact our Law Firm.

Author:
Attorney-at-law Edyta Oleszczuk-Romańska – AJ Law Uchańska Diskau Law Firm
