European Union

July 1, 2023 – December 31, 2023

Published:

April 25, 2024

Last Updated:

April 25, 2024

Welcome to our European Union (EU) transparency page, where we publish information required under the EU Digital Services Act (DSA), the Audiovisual Media Services Directive (AVMSD), and the Dutch Media Act (DMA). Please note that the most up-to-date version of these Transparency Reports is the en-US version.

Legal Representative

To comply with the DSA, Snap Group Limited has appointed Snap B.V. as its Legal Representative. You can contact the representative at dsa-enquiries [at] snapchat.com for the DSA, at vsp-enquiries [at] snapchat.com for the AVMSD and the DMA, through our Support Site here, or at:

Snap B.V.
Keizersgracht 165, 1016 DP
Amsterdam, The Netherlands

If you are a law enforcement agency, please follow the steps outlined here. Please communicate in Dutch or English when contacting us.

Regulatory Authorities

For the DSA, we are regulated by the European Commission and the Netherlands Authority for Consumers and Markets (ACM). For the AVMSD and the Dutch Media Act, we are regulated by the Dutch Media Authority (CvdM).

DSA Transparency Report

Snap is required by Articles 15, 24, and 42 of the DSA to publish reports containing prescribed information regarding Snap's content moderation for the Snapchat services that are considered "online platforms," i.e., Spotlight, For You, Public Profiles, Maps, Lenses, and Advertising. This report must be published every six months, beginning October 25, 2023.

Snap publishes transparency reports twice a year to provide insight into Snap's safety efforts and the nature and volume of content reported on our platform. Our latest report, covering H2 2023 (July 1 – December 31), can be found here. For metrics and data specific to the Digital Services Act, please see this page.

Average Monthly Active Recipients
(DSA Articles 24.2 and 42.3)

As of December 31, 2023, we have 90.9 million average monthly active recipients ("AMAR") of our Snapchat app in the EU. This means that, on average over the last 6 months, 90.9 million registered users in the EU have opened the Snapchat app at least once during a given month.

This figure breaks down by Member State as follows:

These figures were calculated to meet current DSA rules and should only be relied on for DSA purposes. We have changed how we calculate this figure over time, including in response to changing internal policy, regulator guidance and technology, and figures are not intended to be compared between periods. This may also differ from the calculations used for other active user figures we publish for other purposes.


Member States Authority Requests
(DSA Article 15.1(a))

Takedown Requests 

During this period, we have received 0 takedown requests from EU member states pursuant to DSA Article 9. 

Information Requests 

During this period, we have received the following information requests from EU member states:

The median turnaround time to inform authorities of receipt of Information Requests is 0 minutes — we provide an automated response confirming receipt. The median turnaround time to give effect to Information Requests is ~10 days. This metric reflects the time period from when Snap receives an IR to when Snap believes the request is fully resolved. In some cases, the length of this process depends in part on the speed with which law enforcement responds to any requests for clarification from Snap necessary to process their request.
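As an illustration only, a median turnaround of the kind reported above can be derived from pairs of received/resolved timestamps; the function and the sample data below are hypothetical assumptions, not Snap's actual reporting pipeline.

```python
# Illustrative sketch only: computing a median turnaround time from
# hypothetical (received, resolved) timestamp pairs.
from datetime import datetime
from statistics import median

def median_turnaround_days(requests: list[tuple[datetime, datetime]]) -> float:
    """Median of (resolved - received) durations, expressed in days."""
    durations = [
        (resolved - received).total_seconds() / 86400
        for received, resolved in requests
    ]
    return median(durations)

# Example: three hypothetical information requests resolved in 4, 10,
# and 20 days; the median is 10 days.
requests = [
    (datetime(2023, 7, 1), datetime(2023, 7, 5)),
    (datetime(2023, 7, 1), datetime(2023, 7, 11)),
    (datetime(2023, 7, 1), datetime(2023, 7, 21)),
]
```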

Content Moderation 


All content on Snapchat must adhere to our Community Guidelines and Terms of Service, as well as supporting terms, guidelines and explainers. Proactive detection mechanisms and reports of illegal or violating content or accounts prompt a review, at which point, our tooling systems process the request, gather relevant metadata, and route the relevant content to our moderation team via a structured user interface that is designed to facilitate effective and efficient review operations. When our moderation teams determine, either through human review or automated means, that a user has violated our Terms, we may remove the offending content or account, terminate or limit the visibility of the relevant account, and/or notify law enforcement as explained in our Snapchat Moderation, Enforcement, and Appeals Explainer.  Users whose accounts are locked by our safety team for Community Guidelines violations can submit a locked account appeal, and users can appeal certain content enforcements.

Content and Account Notices (DSA Article 15.1(b))

Snap has put into place mechanisms to allow users and non-users to notify Snap of content and accounts violating our Community Guidelines and Terms of Service on the platform, including those they consider illegal pursuant to DSA Article 16.  These reporting mechanisms are available in the app itself (i.e. directly from the piece of content) and on our website.

During the relevant period, we received the following content and account notices in the EU:

In H2’23, we handled 664,896 notices solely via automated means. All of these were enforced against our Community Guidelines because our Community Guidelines encapsulate illegal content. 

In addition to user-generated content and accounts, we moderate advertisements if they violate our platform policies. Below are the total ads that were reported and removed in the EU. 

Trusted Flaggers Notices (Article 15.1(b))

For the period of our latest Transparency Report (H2 2023), there were no formally appointed Trusted Flaggers under the DSA. As a result, the number of notices submitted by such Trusted Flaggers was zero (0) in this period.

Proactive Content Moderation (Article 15.1(c))

During the relevant period, Snap enforced against the following content and accounts in the EU after engaging in content moderation at its own initiative:

All of Snap’s own-initiative moderation efforts leveraged humans or automation. On our public content surfaces, content generally goes through both auto-moderation and human review before it is eligible for distribution to a wide audience. With regards to automated tools, these include:

  • Proactive detection of illegal and violating content using machine learning;

  • Hash-matching tools (such as PhotoDNA and Google's CSAI Match);

  • Abusive Language Detection to reject content based on an identified and regularly updated list of abusive key words, including emojis


Appeals (Article 15.1(d))

During the relevant period, Snap processed the following content and account appeals in the EU via its internal complaint-handling systems:


*Stopping child sexual exploitation is a top priority. Snap devotes significant resources toward this and has zero tolerance for such conduct. Special training is required to review CSE appeals, and there is a limited team of agents who handle these reviews due to the graphic nature of the content. During the fall of 2023, Snap implemented policy changes that affected the consistency of certain CSE enforcements, and we have addressed these inconsistencies through agent re-training and rigorous quality assurance. We expect that the next transparency report will reveal progress toward improving response times for CSE appeals and improving the precision of initial enforcements.

Automated Means Used for Content Moderation (Article 15.1(e))

On our public content surfaces, content generally goes through both auto-moderation and human review before it is eligible for distribution to a wide audience. Automated tools include:

  • Proactive detection of illegal and violating content using machine learning;

  • Hash-matching tools (such as PhotoDNA and Google's CSAI Match);

  • Abusive Language Detection to reject content based on an identified and regularly updated list of abusive keywords, including emojis.
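By way of illustration only, a minimal keyword-and-emoji screen of the kind described in the abusive-language bullet above might look like the following sketch; the deny-lists and the normalization rules here are hypothetical placeholders, not Snap's actual detection logic.

```python
# Illustrative sketch only: a minimal keyword/emoji deny-list screen.
# ABUSIVE_TERMS and ABUSIVE_EMOJI are hypothetical placeholders; real
# lists are far larger and regularly updated by policy teams.
import re
import unicodedata

ABUSIVE_TERMS = {"abusiveword1", "abusiveword2"}  # hypothetical entries
ABUSIVE_EMOJI = {"\U0001F92C"}                    # hypothetical emoji entry

def normalize(text: str) -> str:
    """Lowercase and strip combining accents so simple obfuscations match."""
    decomposed = unicodedata.normalize("NFKD", text)
    return "".join(c for c in decomposed if not unicodedata.combining(c)).lower()

def flags_abusive_language(text: str) -> bool:
    """Return True if the text matches the keyword or emoji deny-list."""
    words = set(re.findall(r"[a-z0-9']+", normalize(text)))
    if words & ABUSIVE_TERMS:
        return True
    return any(emoji in text for emoji in ABUSIVE_EMOJI)
```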


Across all harms, the accuracy of our automated moderation technologies was approximately 96.61%, and the error rate was approximately 3.39%.


Content Moderation Safeguards (Article 15.1(e))

We recognize that content moderation carries risks, including risks to freedom of speech and assembly posed by bias in automated and human moderation and by abusive reports from governments, political constituencies, or individuals. Snapchat is generally not a place for political or activist content, particularly in our public spaces.


Nevertheless, to safeguard against these risks, Snap conducts testing and training and maintains robust, consistent procedures for handling reports of illegal or violating content, including those from law enforcement and government authorities. We continually evaluate and evolve our content moderation algorithms. Although potential harms to freedom of expression are difficult to detect, we are not aware of any significant issues, and we provide avenues for our users to report problems.


Our policies and systems promote consistent and fair enforcement and, as described above, give Snapchatters an opportunity to meaningfully dispute enforcement outcomes through notice and appeals processes that aim to safeguard the interests of our community while protecting individual Snapchatters' rights.

We continually strive to improve our enforcement policies and processes, and have made great strides in combating potentially harmful and illegal content and activity on Snapchat. This is reflected in the upward trend in our total reporting and enforcement figures shown in our latest Transparency Report, and in the declining prevalence of violations on Snapchat.


Out-of-Court Settlements (Article 24.1(a))

For the period of our latest Transparency Report (H2 2023), there were no formally appointed out-of-court dispute settlement bodies under the DSA. As a result, the number of disputes submitted to such bodies was zero (0) in this period, and we are unable to provide outcomes, median turnaround times for settlements, or the share of disputes in which we implemented the body's decisions.



Account Suspensions (Article 24.1(b))

During H2 2023, we had no account suspensions imposed pursuant to Article 23. Snap's Trust & Safety team has procedures in place to limit the possibility of accounts frequently submitting notices or complaints that are manifestly unfounded. These procedures include restricting duplicate report creation and using email filters to prevent users who have frequently submitted manifestly unfounded reports from continuing to do so. Snap takes appropriate enforcement action against accounts as explained in our Snapchat Moderation, Enforcement, and Appeals Explainer, and further information regarding the level of Snap's account enforcement can be found in our Transparency Report (H2 2023). Such measures will continue to be reviewed and refined.


Moderator Resources, Expertise, and Support (Article 42.2)

Our content moderation team operates across the globe, enabling us to help keep Snapchatters safe 24/7. Below is the breakdown of our human moderation resources by the language specialties of moderators (note that some moderators specialize in multiple languages) as of December 31, 2023:

The above table includes all moderators who support EU member state languages as of December 31, 2023. In situations where we need additional language support, we use translation services.

Moderators are recruited using a standard job description that includes a language requirement (depending on the need). The language requirement states that the candidate should be able to demonstrate written and spoken fluency in the language and have at least one year of work experience for entry-level positions. Candidates must meet the educational and background requirements in order to be considered. Candidates also must demonstrate an understanding of current events for the country or region of content moderation they will support.

Our moderation team applies our policies and enforcement measures to help protect our Snapchat community. Training is conducted over a multi-week period, in which new team members are educated on Snap’s policies, tools, and escalations procedures. After the training, each moderator must pass a certification exam before being permitted to review content. Our moderation team regularly participates in refresher training relevant to their workflows, particularly when we encounter policy-borderline and context-dependent cases. We also run upskilling programs, certification sessions, and quizzes to ensure all moderators are current and in compliance with all updated policies. Finally, when urgent content trends surface based on current events, we quickly disseminate policy clarifications so teams are able to respond according to Snap’s policies.

We provide our content moderation team – Snap’s “digital first responders” – with significant support and resources, including on-the-job wellness support and easy access to mental health services. 

Child Sexual Exploitation and Abuse (CSEA) Media Scanning Report


Background

The sexual exploitation of any member of our community, especially minors, is illegal, abhorrent, and prohibited by our Community Guidelines. Preventing, detecting, and eradicating Child Sexual Exploitation and Abuse (CSEA) on our platform is a top priority for Snap, and we continually evolve our capabilities to combat these and other crimes.


We use PhotoDNA robust hash-matching and Google’s Child Sexual Abuse Imagery (CSAI) Match to identify known illegal images and videos of child sexual abuse, respectively, and report them to the U.S. National Center for Missing and Exploited Children (NCMEC), as required by law. NCMEC then, in turn, coordinates with domestic or international law enforcement, as required.
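To illustrate only the lookup pattern involved (PhotoDNA and CSAI Match are proprietary robust matchers that can also catch altered copies), the sketch below uses SHA-256 exact matching against a hypothetical known-hash set.

```python
# Illustrative sketch only: exact hash-matching against a known-hash set.
# SHA-256 is a stand-in for proprietary robust matchers, and KNOWN_HASHES
# is a hypothetical placeholder database.
import hashlib

# In production this would be populated from a vetted hash-sharing program.
KNOWN_HASHES: set[str] = set()

def matches_known_media(media_bytes: bytes) -> bool:
    """Hash the uploaded media and check it against the known-hash set."""
    return hashlib.sha256(media_bytes).hexdigest() in KNOWN_HASHES
```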


Report

The below data is based on the results of proactive scanning, using PhotoDNA and/or CSAI Match, of media uploaded from a user's camera roll to Snapchat.

Stopping child sexual exploitation is a top priority. Snap devotes significant resources toward this and has zero tolerance for such conduct.  Special training is required to review CSE appeals, and there is a limited team of agents who handle these reviews due to the graphic nature of the content.  During the fall of 2023, Snap implemented policy changes that affected the consistency of certain CSE enforcements, and we have addressed these inconsistencies through agent re-training and rigorous quality assurance.  We expect that the next transparency report will reveal progress toward improving response times for CSE appeals and improving the precision of initial enforcements.  

Content Moderation Safeguards

The safeguards applied for CSEA Media Scanning are set out in the above “Content Moderation Safeguards” section under our DSA Report.


European Union Terrorist Content Online Transparency Report

Published: June 17, 2024

Last Updated: June 17, 2024

This Transparency Report is published in accordance with Articles 7(2) and 7(3) of Regulation 2021/784 of the European Parliament and of the Council of the EU, addressing the dissemination of terrorist content online (the Regulation). It covers the reporting period of January 1 – December 31, 2023.


General Information
  • Article 7(3)(a): information about the hosting service provider’s measures in relation to the identification and removal of or disabling of access to terrorist content

  • Article 7(3)(b): information about the hosting service provider’s measures to address the reappearance online of material which has previously been removed or to which access has been disabled because it was considered to be terrorist content, in particular where automated tools have been used


Terrorists, terrorist organizations, and violent extremists are prohibited from using Snapchat. Content that advocates, promotes, glorifies, or advances terrorism or other violent, criminal acts is prohibited under our Community Guidelines. Users are able to report content that violates our Community Guidelines via our in-app reporting menu and our Support Site. We also use proactive detection to attempt to identify violative content on public surfaces like Spotlight and Discover.


Regardless as to how we may become aware of violating content, our Trust & Safety teams, through a combination of automation and human moderation, promptly review identified content and make enforcement decisions. Enforcements may include removing the content, warning or locking the violating account, and, if warranted, reporting the account to law enforcement. To prevent the reappearance of terrorist or other violent extremist content on Snapchat, in addition to working with law enforcement, we take steps to block the device associated with the violating account and prevent the user from creating another Snapchat account. 


Additional details regarding our measures for identifying and removing terrorist content can be found in our Explainer on Hateful Content, Terrorism, and Violent Extremism and our Explainer on Moderation, Enforcement, and Appeals.



Reports & Enforcements 
  • Article 7(3)(c): the number of items of terrorist content removed or to which access has been disabled following removal orders or specific measures, and the number of removal orders where the content has not been removed or access to which has not been disabled pursuant to the first subparagraph of Article 3(7) and the first subparagraph of Article 3(8), together with the grounds therefor


During the reporting period, Snap did not receive any removal orders, nor were we required to implement any specific measures pursuant to Article 5 of the Regulation. Accordingly, we were not required to take enforcement action under the Regulation.


The following table describes enforcement actions taken, based on user reports and proactive detection, against content and accounts, both in the EU and elsewhere around the world, that violated our Community Guidelines relating to terrorism and violent extremism content:

Enforcement Appeals
  • Article 7(3)(d): the number and the outcome of complaints handled by the hosting service provider in accordance with Article 10

  • Article 7(3)(g): the number of cases in which the hosting service provider reinstated content or access thereto following a complaint by the content provider


Because we had no enforcement actions required under the Regulation during the reporting period as noted above, we handled no complaints pursuant to Article 10 of the Regulation and had no associated reinstatements.


The following table contains information relating to appeals and reinstatements, both in the EU and elsewhere around the world, involving terrorist and violent extremist content enforced under our Community Guidelines.

Judicial Proceedings & Appeals
  • Article 7(3)(e): the number and the outcome of administrative or judicial review proceedings brought by the hosting service provider

  • Article 7(3)(f): the number of cases in which the hosting service provider was required to reinstate content or access thereto as a result of administrative or judicial review proceedings


As we had no enforcement actions required under the Regulation during the reporting period, as noted above, we had no associated administrative or judicial review proceedings, and we were not required to reinstate content as a result of any such proceedings.

EU DSA: Average Monthly Active Recipients (August 2024)
(DSA Articles 24.2 and 42.3)

As of 1 August 2024, we have 92.4 million average monthly active recipients (“AMAR”) of our Snapchat app in the EU. This means that, on average over the last 6 months, 92.4 million registered users in the EU have opened the Snapchat app at least once during a given month.

This figure breaks down by Member State as follows: