Most creators discover how platforms actually work when something goes wrong: a video removed, an account restricted, a monetization program suspended without explanation. At that point, understanding the legal framework is no longer academic.
Brazil's Marco Civil da Internet (Law No. 12,965/2014) is the foundational statute governing platform liability, content regulation, and the rights of both creators and users on the internet in Brazil. For foreign companies building platforms that operate in Brazil, and for foreign creators with Brazilian audiences, this law defines the rules of engagement.
What the Marco Civil Is and Why It Matters
The Marco Civil is Brazil's Internet framework law. It establishes principles, rights, and duties for the use of the internet in Brazil, regardless of where the platform is incorporated. Its two most significant provisions for the creator economy:
- Art. 19: application providers (platforms) are not liable for third-party content unless they fail to comply with a specific court order to remove it
- Art. 21: a different, stricter rule applies to non-consensual intimate image content — here, liability arises from notification by the victim, without requiring a court order
The Art. 19 framework is the core protection for platforms. It means that platforms such as YouTube, Instagram, and TikTok are not automatically liable for user-generated content in Brazil — but they lose that protection if they ignore a judicial removal order.
For foreign platforms: the Marco Civil applies to services provided to Brazilian users regardless of the platform's country of incorporation.
Creators Are Generally Not Liable for Follower Comments
A common concern among creators: if a follower posts defamatory, fraudulent, or harassing content in the comments of a post, is the creator liable?
Under the Marco Civil, generally no. The creator operating a profile on a platform is treated as an application provider in relation to the content posted by others. Liability for that third-party content arises only if, after a specific court order to remove it, the creator fails to take action.
That said, liability exposure increases in specific situations:
- Active moderation: if the creator actively moderates comments (approves, edits, or deletes selectively), the creator's relationship to third-party content becomes closer, and liability analysis shifts
- Pinning or featuring: highlighting or pinning a third-party comment that contains harmful content may be treated as editorial adoption of that content
- Community rules: if the profile operates as a structured community with rules and governance, the creator's role resembles that of an application provider with moderation obligations
A case-by-case assessment is always needed. The default protection exists, but is not absolute.
When Platforms Are Liable Under Art. 19
The Art. 19 framework protects platforms from liability for third-party content. But it also defines the boundary: if a court orders removal of specific content, and the platform fails to comply, the protection dissolves and the platform becomes liable for the resulting harm.
This framework has a practical consequence for creators: reporting content through a platform's native interface does not create Art. 19 liability for the platform if it ignores the report. Only a specific judicial order creates that liability threshold. This is one reason why formal takedown channels — backed by legal documentation — are more effective than interface reports for content that crosses into defamation, copyright infringement, or privacy violation.
The Marco Civil's design reflects a deliberate policy choice: platforms are protected from automatic liability for third-party content, but are expected to comply with judicial orders promptly. This is structurally similar to the general design of Section 230 of the US Communications Decency Act — though the enforcement mechanisms and exceptions differ.
Content Removal: Arbitrary vs. Legitimate
Platforms have a contractual right to moderate and remove content under their terms of service. The removal of a post that violates community guidelines is not automatically an unlawful act — even if the creator disagrees with the classification.
The legal line is crossed when:
- Removal is arbitrary: no identifiable rule was violated; the removal was inconsistent with the platform's own precedents
- Removal is discriminatory: the same content is tolerated for some creators and removed for others with no objective basis
- Removal is disproportionate: a minor violation triggers consequences (suspension, demonetization) vastly out of proportion to the actual harm
In any of these cases, the creator has grounds to claim breach of contract and, if the platform is considered a service provider under the Consumer Protection Code (CDC), consumer protection violations. The size of the claim depends on the documented harm: lost revenue, audience impact, and the causal link between the removal and those consequences.
Account Termination Without Notice
Account termination is the most severe platform action. It eliminates access to content, followers, and communication channels — and, for monetized accounts, recurring revenue.
Legally, abrupt termination without cause may constitute unilateral rescission of a service contract — particularly when:
- The creator has established significant audience and revenue through the platform
- There is a pre-existing monetization agreement with the platform (creator fund, ad-share, channel memberships)
- The termination prevents the creator from accessing content they created and own
In Brazil, the CDC may apply to the creator-platform relationship when the creator is considered a consumer of the platform's service. This adds a layer of consumer protection, including the prohibition on abusive contract practices.
The challenge: most platforms' terms of service include unilateral termination rights and forum-selection clauses that point to US or Irish courts. Pursuing claims against major platforms in Brazil is legally viable but practically complex — particularly for smaller accounts. The cost-benefit calculation is case-specific.
The CDC in the Creator-Platform Relationship
Beyond the Marco Civil, the Consumer Protection Code (CDC, Law No. 8,078/1990) applies to the relationship between creators and platforms when the creator is positioned as a consumer of the platform's service — which is common when platforms offer monetization programs, creator funds, or partner programs.
When the CDC applies:
- Abusive clauses in the terms of service are null (Art. 51)
- Sudden unilateral changes to the terms of service without adequate notice may be challenged
- Arbitrary or abusive service termination can trigger liability for material and moral damages
For foreign companies building creator-facing platforms in Brazil: if your platform offers monetization, partnership programs, or any form of business relationship to Brazilian creators, the CDC likely applies to that relationship — regardless of your terms of service's governing law clause.
Action Plan: From Internal Appeal to Litigation
When a creator faces content removal, account restriction, or account termination, the recommended sequence:
- Document immediately: screenshots of the action taken, platform notifications, account status, current follower count, and most recent revenue statements
- Exhaust internal appeals: use the platform's internal review process — this creates a documented record and may resolve the issue without legal action
- Formal extrajudicial notice: a written notice from counsel demanding justification and, if warranted, reactivation — sent to the platform's legal or compliance contact. Effective in many cases; creates the evidentiary record for any subsequent action
- Assess litigation: evaluate jurisdiction (where is the platform's Brazilian entity, if any?), the forum-selection clause in the terms of service, the documented harm, the cost of proceedings, and the realistic timeline
- Consider provisional measures: in cases of documented, ongoing financial harm, the Brazilian Code of Civil Procedure allows for urgent injunctive relief (tutela de urgência) — but the showing required is specific: likelihood of the asserted right (probabilidade do direito) and risk of harm from delay (perigo de dano)
Preventive Documentation
The most effective time to document is before any problem occurs:
- Account: screenshots of follower count, engagement metrics, account standing, and any platform-issued partner or monetization status
- Revenue: monthly statements from platform monetization programs, bank transfers, and any contracts with the platform
- Content: local backups of all original content — not just what lives on the platform
- Contracts: any written agreement with the platform (creator fund terms, partner agreements, brand safety guidelines accepted)
Documentation created after the fact is significantly weaker in any proceeding. Maintaining it as a routine practice costs very little and matters enormously when needed.
We assist creators and companies in platform disputes, content removal, and account enforcement issues. Our practice covers digital law, the creator economy, and strategic litigation. See also: Creator Brand Deals in Brazil: Contract Essentials.
FAQ
Is a creator liable for comments posted by followers?
Generally, no. Brazil's Marco Civil da Internet (Law No. 12,965/2014) provides that application providers are liable for third-party content only if, after a specific court removal order, they fail to make it unavailable. The creator's exposure increases with active moderation, with pinning or featuring of third-party comments, or when the profile operates as a community with its own rules. A case-by-case analysis is always needed.
Can a platform lawfully remove a creator's content?
Platforms hold a contractual right to moderate and remove content under their terms of service — and removal alone may not be unlawful. However, arbitrary, discriminatory, or disproportionate removal can create platform liability, especially when it constitutes a breach of contract or violates a legitimate expectation created by the platform itself. The path starts with a documented internal appeal.
Can a creator claim damages for account termination without notice?
Possibly. Abrupt termination may constitute unilateral, unjustified termination of a service contract, particularly when the creator has built significant audience and revenue on the platform. Legal consequences depend on the accepted terms and the platform's jurisdiction. In Brazil, the CDC may apply when the creator is a consumer of the platform's services. Prior documentation of the account, monetization contracts, and revenue history is critical for any action.
Can statements made in videos or podcasts create legal liability?
Yes. Statements in videos and podcasts can lead to civil liability for moral and material damages if they constitute defamation, slander, insult, or misleading consumer claims. Brazil's Civil Code, Criminal Code, and CDC may apply concurrently. Comparisons with competitors, promises of results, and implicit endorsements carry the highest risk. Maintaining a source archive for published information is a relevant preventive measure.
Is it worth suing a platform over a removal or termination?
It depends on the case, the harm, and the out-of-court alternatives available. The recommended path: (i) exhaust the internal appeal; (ii) gather documentation of the account, contracts, and revenue history; (iii) send a formal extrajudicial notice demanding justification and reactivation; (iv) then assess litigation — jurisdiction, forum-selection clause, cost, timeline, and likelihood of success. In some cases, the platform's own terms mandate arbitration.
