Date of Last Revision: April 17, 2026
Mixbook applies content moderation measures to content submitted through its services in order to maintain safety, legality, and compliance with Mixbook’s Terms of Service and related policies. Projects may be reviewed to ensure compliance with applicable law and Mixbook’s content quality guidelines, including design issues that could affect the integrity of the final product, such as, but not limited to, empty text boxes, cut-off text, and poor photo quality. However, Mixbook does not proofread customer projects for editorial choices such as spelling, grammar, or layout.
Mixbook prohibits content that is illegal, infringing, harmful, or otherwise incompatible with Mixbook’s Terms of Service, Copyright Complaints and Illegal Content Policy, and other applicable policies. Depending on the nature of the content and the service involved, Mixbook may investigate, restrict, remove, disable access to, refuse to print or fulfill, suspend, terminate, or otherwise limit access to the relevant content, order, project, or account without notice.
Without limiting the generality of the foregoing, the following categories of content may be restricted:
Content that is unlawful under applicable law, including content identified in valid legal notices or orders from competent authorities, may be restricted or removed. Mixbook does not tolerate illegal content uploaded through the Service and will investigate notices of alleged infringement and other illegal content.
Content that infringes copyright, trademark, or other intellectual property rights, or that is the subject of a valid infringement notice, may be removed, disabled, canceled, or otherwise restricted.
Mixbook does not allow full nudity or explicit images in printed books, cards, or calendars. Tasteful boudoir projects may be permitted if private areas are not visible and the project otherwise complies with Mixbook’s guidelines.
Mixbook uses moderation and quality-assurance processes to identify certain categories of objectionable sexualized imagery for further review. This may include content that appears to depict nudity, graphic nudity, sexual activity, or other explicit sexualized content.
Mixbook may restrict content involving children where the content appears illegal, exploitative, sexualized, unsafe, or otherwise incompatible with Mixbook’s Terms or policies. Because automated systems may not always reliably distinguish unlawful or prohibited material from benign family contexts, such content may require human review and contextual assessment.
Content may also be restricted where it is incompatible with Mixbook’s published policies or creates legal, safety, abuse, fraud, or platform-integrity concerns, even if it is not specifically enumerated in this section.
Mixbook recognizes that content moderation may require contextual evaluation. In some cases, content that is benign, artistic, documentary, or family-related may resemble prohibited content when evaluated by automated systems. For that reason, Mixbook may use human review in addition to automated tools when assessing whether content is illegal or incompatible with its Terms or policies.
Mixbook uses a combination of automated tools and human review as part of its moderation and quality-assurance processes. Content submitted through Mixbook’s services may be analyzed using automated systems designed to identify potentially objectionable or unlawful material. Automated detection may be used to flag content for further review, but such tools are used as part of a broader moderation workflow rather than as the sole basis for every decision.
For printer orders, Mixbook’s moderation and review process generally follows this workflow:
Mixbook’s Quality Assurance team may review flagged projects, orders, or other content to determine whether they violate Mixbook’s Terms of Service, content guidelines, or applicable law, with or without notifying the customer before or after review. Human review may be used to assess context, reduce false positives, and determine the appropriate action in individual cases.
Mixbook recognizes that automated moderation can generate false positives. Content moderation systems may flag content that is ultimately permissible, including some family, artistic, documentary, candid, or boudoir content. Mixbook uses safeguards intended to improve accuracy and reduce unnecessary restrictions, which may include thresholding, escalation criteria, contextual review, and human review before or after action is taken, as appropriate.
Where Mixbook determines that content may be illegal or incompatible with its Terms of Service or related policies, Mixbook may investigate, restrict, remove, disable access to, refuse to print or fulfill, suspend, terminate, or otherwise limit access to the relevant content, project, order, or account, as appropriate and in accordance with applicable law.
In the printer-order context, Mixbook may take different actions depending on the category of issue:
If Mixbook determines that an order presents a fraud-related issue, Mixbook may place the order on hold, cancel the order, or take other protective measures without contacting the customer.
If Mixbook determines that an order violates Mixbook’s content policies or involves copyright infringement or similar rights issues, Mixbook may cancel the order and reach out to the customer where appropriate.
If Mixbook determines that content may require involvement of law enforcement or other competent authorities, Mixbook may preserve relevant information and report the matter to the appropriate local authorities, in accordance with applicable law.
Mixbook may also refer suspected fraudulent, abusive, or illegal activity, as well as illegal or harmful content, to appropriate law-enforcement authorities.
Mixbook maintains a repeat infringer policy and may terminate accounts of users deemed to be repeat infringers or otherwise limit access to the service for users who infringe intellectual property rights or upload illegal content.
Persons who believe that content available through the Service is illegal or infringes intellectual property rights may submit a notice using Mixbook’s designated reporting form or other published reporting channel. Mixbook will review and investigate notices of alleged infringement and other illegal content and will take appropriate action under applicable law.
Where Mixbook removes or restricts content, suspends or terminates an account, refuses to fulfill an order, or otherwise materially limits access to the Service on the ground that the content is illegal or incompatible with Mixbook’s Terms of Service or related policies, Mixbook will provide the affected user with a statement of reasons in accordance with applicable law. That statement may include, as applicable, the nature of the decision, the basis for it, whether automated tools were used in the review, and information on how the decision may be challenged.
If Mixbook removes or disables access to content, restricts visibility of content, refuses to print or fulfill an order, suspends or terminates a service, suspends or terminates an account, or otherwise materially restricts access to the Service on the ground that content is illegal or incompatible with Mixbook’s Terms of Service or policies, the affected user may challenge that decision through Mixbook’s internal complaints procedure.
Complaints must be submitted electronically using Mixbook’s designated complaint or counter-notice form. The complaint should identify the challenged decision and include enough information for Mixbook to locate the affected content, order, project, or account and evaluate the challenge.
A complaint should include, as applicable:
Complaints may be submitted for at least six (6) months after the user is notified of the challenged decision.
Mixbook will review complaints in a timely, diligent, non-discriminatory, non-arbitrary, and objective manner. Complaint decisions will be made under the supervision of appropriately qualified personnel and will not be based solely on automated means. Where relevant, Mixbook may consider the original notice, the challenged content or project, applicable law, Mixbook’s Terms of Service and policies, prior related correspondence, and any contextual information submitted by the complainant.
After review, Mixbook may:
Where a complaint shows sufficient grounds to conclude that the original decision should be reversed, Mixbook will reverse the decision without undue delay.
Mixbook will notify the complainant of its reasoned decision without undue delay. That notice will, where applicable, explain whether the original decision is upheld or reversed, the basis for the outcome, and the further avenues of redress available.
Where required by applicable law, Mixbook will inform complainants of the possibility of out-of-court dispute settlement and other available avenues of redress.
The internal complaints procedure is electronic and free of charge.
Mixbook will publish annual transparency reports in a machine-readable and easily accessible format covering the content moderation undertaken during the relevant period, as required by applicable law. These reports may include, as applicable, information regarding legal notices and orders, own-initiative moderation, restrictions imposed on content or accounts, internal complaints, and the use of automated means for moderation, including their purposes and safeguards.
Mixbook has designated a representative for DSA-related inquiries and published the representative’s contact details on its DSA page.
Mixbook may update these Content Moderation Practices from time to time to reflect legal, operational, technical, or product changes. Material updates will be reflected in Mixbook’s public policies and terms as required by applicable law.