# 法國《資訊操弄報告》50 個建議翻譯計畫
2018 年 9 月 4 日，法國發佈了[針對資訊操弄的相關報告](https://www.diplomatie.gouv.fr/en/french-foreign-policy/manipulation-of-information/article/joint-report-by-the-caps-irsem-information-manipulation-a-challenge-for-our)，此報告探討了資訊操弄的起因、後果與應對方式，最後列出 50 項行動建議。為了讓更多人了解如何應對資訊操弄，在此發起針對這 50 個建議的翻譯計畫。
下方許多翻譯是 Google 翻譯的成果，歡迎大家協作修改成信達雅的翻譯。翻譯者若願意具名，請自行在下方「貢獻者」區域加上您的大名。
翻譯成果未來將會以 CC0 的方式釋出，方便各界利用。
## 貢獻者
* Eric Liao（部分審訂）

## 名詞對照
* information manipulation：資訊操弄
* information disorder：資訊亂象、新聞亂象
* **mis**information：錯誤訊息、錯誤新聞（原文定義: *false & not intended to harm*）
* **dis**information：假訊息（原文定義: *false & intended to harm*）
* **mal**information：惡意訊息、惡意新聞（原文定義: *not false & used to harm*）
* fact-check tag: 事實查核標籤，檢查文件或評論中陳述的事項是否屬實
* honest broker：中立協調者、斡旋者
* information ecosystem: 資訊生態系統（資訊產生、流通與消費所構成的整體環境）[Reference](https://www.lehigh.edu/~dac511/pages/research/infoecol.html)
* metadata：中介資料（用來描述資料本身屬性的資料） [metadata 附註](http://metadata.teldap.tw/project/filebox/BSMI/Symposium.pdf)
* troll account：網軍帳號、引戰帳號
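上述 mis/dis/mal-information 三種類別，可用兩個軸線區分：內容是否為假（false）、是否意圖造成傷害（intent to harm）。以下是一個說明性的小程式（非報告內容，僅用來幫助理解定義）：

```python
def classify(is_false: bool, intends_harm: bool) -> str:
    """依報告的兩軸定義，將一則資訊歸類。"""
    if is_false and not intends_harm:
        return "misinformation"   # 錯誤訊息：為假，但無意造成傷害
    if is_false and intends_harm:
        return "disinformation"   # 假訊息：為假，且意圖造成傷害
    if not is_false and intends_harm:
        return "malinformation"   # 惡意訊息：不假，但被用來造成傷害
    return "information"          # 一般資訊

print(classify(True, False))   # misinformation
print(classify(True, True))    # disinformation
```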
# 50 RECOMMENDATIONS from *INFORMATION MANIPULATION*
《資訊操弄報告》的 50 項建議
## I. General recommendations
1. Define and clearly distinguish the terms, as we sought to do in the introduction. This should help counter widespread relativism, in other words, the claim that “everything is propaganda” and that all the media spread disinformation. We must not condemn the defense of national interests (Russian media have a legitimate right to defend Russian viewpoints, including those of the Russian government) but the information manipulation. Running a “DIDI” diagnostic (Deception, Intention, Disruption, Interference), as recommended by the Swedish MSB and Lund University, could help with differentiating real information manipulation from more benign influence activities.
1. 定義並清楚地區分用詞（術語），就像我們在本報告的引言中所做的，有助於抑制廣為流通的相對主義——亦即，宣稱「一切都是宣傳」，並認為所有媒體皆傳播假訊息（disinformation）的主義。我們不應該譴責任何人捍衛自己的國家利益：例如俄羅斯媒體本來就有合法權利捍衛俄羅斯的觀點，包括俄羅斯政府的觀點，我們應該譴責的是資訊操弄。根據瑞典 MSB（民事應急局）和隆德大學的建議，實行「DIDI」診斷（Deception, Intention, Disruption, Interference：欺騙、意圖、破壞、干擾）有助於在相對良性的活動當中，辨識出真正的資訊操弄。
2. Do not underestimate the threat, even though it may not be perceptible on an everyday basis. The Finnish Security Strategy for Society insists that a good preparation against information manipulation depends on an accurate evaluation of the threat. To understand the threat, it recommends regularly envisioning possible threatening scenarios and planning for the potential risks and conflicts that they would involve.
3. See beyond the short term. Influence operations serve both long-term and short-term goals. The short-term goals relate to specific events, often an election, an armed conflict, a social protest, a natural disaster, an assassination (Nemtsov) or an attempted assassination (Skripal), a plane crash (MH17), etc. Fake internet accounts and hoaxes are thus more conspicuous, more aggressive, and less subtle because they have an inherently limited lifespan and are bound to be exposed or suppressed once the goal has been achieved. Long-term operations, on the other hand, are aimed at undermining certain ideas and opinions, or at exacerbating tensions and divisions within targeted communities. They have insidious, incremental subversive effects, steered by more subtle and discreet actors, and with consequences that are more difficult to assess. Those long-term operations are the most dangerous ones. They follow a pattern of erosion: it is through repetition and persistence over a long period of time that water eventually wears down rock. Hence it is important to go beyond short-term approaches, often through the prism of electoral cycles (i.e. that tackle only those informational threats that arise during elections), in order to understand the daily nature of the challenge.
3. 不要只看短期。影響力操作同時具有長期和短期目標。短期目標涉及具體事件，通常是選舉、武裝衝突、社會抗議、自然災害、暗殺（俄羅斯政治家鮑里斯·涅姆佐夫謀殺案）或暗殺未遂（英國前俄羅斯間諜斯克里帕爾毒殺案）、飛機失事（馬來西亞航空 17 號班機空難）等。因此，網路假帳號和騙局更醒目、更具侵略性、更不隱微，因為它們本質上有時限，一旦達到目標就必然會被揭露或壓制。另一方面，長期運作旨在削弱某些想法和意見，或加劇目標社群內的緊張局勢和分歧。它們具有陰險、漸進的顛覆效果，幕後黑手的手段更微妙謹慎，造成的後果也更難以評估。長期的資訊操作是最危險的，循著既定的侵損模式，只要持續不斷、長期重覆進行，滴水亦可穿石。因此，重要的是我們不能以管窺天，侷限於對抗短期的資訊操弄（也就是說：不能只對抗選舉期間出現的威脅），才能了解我們所遇到的挑戰在本質上是日積月累的。
> [name=Ru Lu] 對不起我看不懂那個稜鏡到底在比喻什麼，硬翻好像會很意義不明，所以我直接超譯(?)了，假如有更好的翻法可以改掉XD
> [name=Eric Liao] 提供意譯，請過目。
> [name=Ru Lu] 喔喔喔～ 原來如此！！ 感謝～
4. Strengthen the resilience of our societies. Information manipulation feeds off of divisions and tensions that run through the fabric of our societies. Hence, we cannot fight back effectively, or durably, against these forms of manipulation without the political will to increase resilience within our societies. From this point of view, we have much to learn from certain States, in particular Finland, who has made resilience against so-called “hybrid” threats into a national concept.
5. Do not surrender the internet to extremists. Conspiracy theories prosper all the more easily if they are not contradicted. “Internet users who exercise a form of scientific rationality consider the exchange of views with ‘believers’ to be a waste of time and they prefer to mock or ignore them. In a similar fashion, ‘liberal’ internet users do not necessarily deem it worthy to engage in debates with racist, sexist or homophobic users in order to deconstruct their arguments. As a result, online debate is saturated with lies and aggressive content.”
It is necessary, however, to also give due consideration to the risk of the “boomerang effect,” for to refute is also to reiterate. Every correction indirectly increases the circulation of the false information. This propagation effect cannot be avoided and it is therefore important to pick one’s battles, that is, to focus on counteracting those instances of information manipulation that are most dangerous.
6. Do not yield to the temptation of counter-propaganda. As Fred Iklé wrote in 1989, “truth is democracy’s best POLWAR [political war] and PSYOP [psychological war] weapon,” for “the goals of democracy can only be accomplished with methods that are compatible with democracy.” For democracies, then, the best possible response to information manipulation is always “a persuasive factual proof released at the right time.”
6. 不要受誘惑而急於發起反宣傳。正如 Fred Iklé 在 1989 年所寫的，「真理是民主最好的 POLWAR［政治戰］和 PSYOP［心理戰］武器」，因為「民主的目標只能以與民主相容的方法來實現」。對於民主國家來說，對資訊操弄的最佳回應始終是「在合適的時間發布有說服力的事實證據」。
7. Do not rely on “technological solutionism,” as Evgeny Morozov warns us in his evocatively titled book, *To Save Everything, Click Here*. There is no one solution to contemporary information issues; the response must be multi-dimensional (just as the problem is multi-dimensional).
7. 不要依賴「技術解決主義」，如 Evgeny Morozov 在他題名意味深長的《拯救一切，點擊這裡》一書中所警告的。當代的資訊問題並沒有單一的解決方案；應對方式必須是多面向的（正如問題本身是多面向的）。
## II. Recommendations for Governments
8. Avoid heavy handedness. Civil society (journalists, the media, online platforms, NGOs, etc.) must remain the first shield against information manipulation in liberal, democratic societies. The most important recommendation for governments is that they should make sure they retain as light a footprint as possible—not just in keeping with our values, but also out of a concern for effectiveness. As one of the roots of the problem is distrust of elites, any “top down” approach is inherently limited. It is preferable to champion horizontal, collaborative approaches, relying on the participation of civil society. This also relates to attacks against the population: the largest investigation on the subject (with 74,000 respondents in 37 countries in 2018) shows that respondents feel that in the fight against information manipulation, the main responsibility falls unto the media (75%) and digital platforms (71%) and then governments, especially in Europe (60%) and Asia (63%), followed by the United States (40%).
8. 避免高壓手段。民間社會（記者、媒體、線上平台、非政府組織等）必須是自由民主社會中防止資訊操弄的第一道盾牌。對政府來說，最重要的建議是，他們應該盡可能保持最少程度的干預：不只是為了符合我們的價值觀，更是為了有效性。由於問題的根源之一是對菁英的不信任，任何「自上而下」的方法本質上都是有限的。最好是依靠民間社會的參與，採取橫向、協作的方法。這也與針對民眾的攻擊有關：該主題規模最大的調查（2018 年涵蓋 37 個國家的 74,000 名受訪者）顯示，受訪者認為在打擊資訊操弄上，主要責任落在媒體（75%）和數位平台（71%），其次才是政府；這種看法在歐洲（60%）和亞洲（63%）尤其普遍，在美國則較低（40%）。
It is important to acknowledge the intrinsic limitations of any purely governmental response, which is bound to be regarded as biased and propagandist. The response therefore needs to be holistic. This is nothing new: in 1952, the Director of the Information Research Department (a then-secret section within the British Foreign Office, which employed up to 300 people tasked with offsetting Soviet influence in the United Kingdom) declared at a conference on counter-propaganda that “we have to dispel any idea that the fundamental issues, and the action that flows from them, are simply the business of governments and government-controlled agencies. Government-sponsored information, tendentious hand-outs, statements of opinion and all obvious attempts to influence free opinion are worse than useless, or should be.”
重要的是要承認任何純粹由政府做出的回應都有其內在侷限，因為它必然被視為帶有偏見的宣傳。因此，回應必須是全面的。這並不是什麼新鮮事：1952 年，訊息研究部（英國外交部當時的一個秘密部門，僱用了多達 300 人，負責抵消蘇聯在英國的影響力）主任在一場關於反宣傳的會議上表示：「我們必須消除一種想法，即認為這些根本問題以及由此而生的行動，僅僅是政府和政府控制機構的事務。政府資助的訊息、帶有傾向性的文宣、意見聲明，以及一切影響自由輿論的明顯企圖，都比無用更糟糕，或理應如此。」
In this way, it is preferable that States design a choice architecture without enforcing a particular choice, in accordance with the “nudge approach” in behavioral economics.
9. Create a dedicated structure. Most of the States concerned have already done so. Those that have not should establish a national entity responsible for the detection and countering of information manipulation. This entity can take various forms—from the network of competent people presently scattered across distinct services to the creation of a dedicated center endowed with its own staff. As it may involve bureaucratic rivalries, one crucial aspect relates to the issue of institutional affiliation. In the present international landscape, some entities are supervised by an inter or supra-ministerial body, while others are hosted within a particular ministry. The nature of the link (executive powers or merely a secretariat role) also varies. It is possible, however, to discern a number of features which are key to the success of a good network:
9. 建立專責機構。大多數相關國家已經這樣做了。尚未這樣做的國家，應該建立負責偵測和打擊資訊操弄的國家級實體。該實體可以採取各種形式：從目前分散在不同部門的專業人員網絡，到建立一個擁有自己編制人員的專門中心。由於可能涉及官僚體系之間的競爭，其中一個關鍵面向是機構隸屬的問題。就目前的國際格局而言，一些實體由跨部會或超部會機構監督，另一些實體則設於特定部會之內。隸屬關係的性質（具行政權力或僅具秘書處功能）也各不相同。不過，仍可歸納出一個良好網絡得以成功的幾項關鍵特徵：
a) durability: structures that are permanent and that have clearly defined competencies and goals work better than ad hoc initiatives that often tend to dilute responsibility;
b) variable geometry: the existing networks are usually made up of a security-leaning “core” (Foreign Affairs, Defense, Interior, Intelligence) who meet up on a regular basis and, depending on the agenda, involve other relevant ministries (Education, Culture, Justice), or even members of parliament and civil society actors;
c) a wide focus: networks usually make public that they are fighting information manipulation in general, even though they are often, in reality, focused on Russia. In theory, they are also capable of dealing with other state actors (China, Iran, etc.) as well as non-state ones (jihadist groups). Indeed a number of networks are working on establishing bridges between the fight against information manipulation and the fight against radicalization;
d) set-up: successful networks bring together a small number of people who are well-versed in digital matters and who know and trust one another. When the group is too large or too hierarchically heterogeneous, discussions tend to diminish in quality and efficiency. The interdisciplinary team should also include information system experts who tend to be confined to crisis resolution while they should in fact take part in the strategic thinking. Finally, the groups that work well are those that comprise at least a handful of permanent members who work full time on the subject;
e) production: in addition to meeting and the sharing of information, the best networks are productive. Three types of internal publications could be devised: warning notes, periodic reviews and thematic reports. The entity in question could also manage the publication of an annual report on information manipulation (some intelligence services—such as the Estonian KAPO—and even some armed forces—in Lithuania— already do this);
f) communication: given the crucial role played by transparency in dispelling conspiracy theories, these networks are public and sometimes engage in external communication. In those countries that are most exposed to foreign pressure, in Eastern and Central Europe and, more particularly, in the Baltic States, the role of the security and military forces is emphasized. In contrast, other countries prefer to highlight the work of institutions closer to civil society so as to reassure their populations. In Canada, the brunt of responsibility in the fight against disinformation falls unto the Ministry of Democratic Institutions—insofar as information manipulation threatens elections, and thus the integrity of democratic processes.
f）溝通：鑑於透明度在消除陰謀論上的關鍵作用，這些網絡是公開的，有時也會進行對外溝通。在最容易受到外國施壓的國家，如東歐和中歐、尤其是波羅的海國家，安全與軍事力量的角色被特別強調。相比之下，其他國家更願意突顯較貼近民間社會的機構的工作，以安定民心。在加拿大，打擊假訊息的責任主要落在民主制度部：因為資訊操弄威脅選舉，從而威脅民主程序的完整性。
Notwithstanding the institutional affiliation of the dedicated entity, the Ministry of Foreign Affairs has an important role to play in monitoring and providing early warning, especially in instances of malign campaigns targeting national interests abroad. Diplomatic networks can be effectively mobilized to warn about coalescing campaigns (antennas) as well as to propagate the Ministry’s strategic communication (loudspeaker).
10. Scan the web to identify the communities that propagate the stories. It is difficult to anticipate threats. Nevertheless, they can be detected and the goal is to do so as early as possible. To achieve this, probing antennas must be extended into “risk communities” (extremist, conspiratorial and religious groups). These probes can be passive accounts, which only listen, or active ones, which take part in discussions. There are a number of technical solutions to monitor social networks (DigiMind, AmiSoftware, Linkfluence, etc.).
10. 掃描網路以識別傳播這些敘事的社群。誠然，威脅難以在事發前被察覺。然而，它們依舊可以被偵測到，而目標是儘早將潛在威脅找出來。為實現這一目標，必須將探測觸角伸入「風險社群」（極端主義、陰謀論和宗教團體）。這些探測帳號可以是只負責傾聽的被動帳號，也可以是會參與討論的主動帳號。市面上有許多監控社群網路的技術方案（DigiMind、AmiSoftware、Linkfluence 等）。
Official responses (websites, pages, accounts) have only limited efficacy. Clandestine operations, aiming for instance at manipulating the manipulators, are risky because, if exposed (and it is becoming increasingly difficult to prevent this in the long-run), they can jeopardize the very credibility of the source and invigorate conspiratorial actors—which would end up strengthening the very actors one aimed at undermining. What should therefore be done?
The first step is to survey the web in order to better grasp the communities that propagate false information on the social networks: identify the main actors (which can mean different things: those with the largest following, the most active ones, the most quoted, etc.), ascertain the type of community in question—its structure (is it centralized, hierarchical, horizontal, tribal, etc.?) and its spirit (is it cooperative or competitive? This distinction is important because in a competitive community, within which members compete for the recognition of others, the withdrawal of a key member will have little effect as he or she will simply be replaced by somebody else). Such painstaking work is essential in order to understand the channels of propagation, but also to enable anticipation and adequate action.
第一步是調查網路，以便更好地掌握在社群網路上傳播假訊息的社群：識別主要參與者（這可以指不同的對象：追隨者最多的、最活躍的、最常被引用的等等），並確認該社群的類型：包括其結構（是集中式、層級式、橫向式還是部落式？）及其精神（是合作式還是競爭式？這個區別很重要，因為在競爭式社群中，成員彼此競逐他人的認可，移除某個關鍵成員影響甚微，因為他或她很快就會被其他人取代）。這種艱苦的工作對於理解傳播渠道至關重要，同時也是預判威脅與採取適當行動的前提。
It is then possible to a) identify accounts that are the source of manipulations and, conversely, like-minded or at least more neutral and rational accounts that enjoy a significant audience; b) neutralize the former (cyberattacks, suspension) and support the latter (e.g. by offering them training); c) disclose the manipulation attempt, name its source (naming and shaming) and discredit the content of the fake news story—either directly, in an official manner, or indirectly, via like-minded accounts.
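上文提到的「主要參與者」可以用不同方式界定（追隨者最多、最活躍、最常被引用等）。以下用一個假設的互動紀錄（帳號名稱皆為虛構，非報告內容）示意其中兩種排序方式：

```python
from collections import Counter

# 假想的引用／轉推紀錄，以 (來源帳號, 被引用帳號) 表示，
# 例如可由上文提到的監控工具蒐集而得。
interactions = [
    ("troll_A", "hub_X"), ("troll_B", "hub_X"), ("user_1", "hub_X"),
    ("troll_A", "hub_Y"), ("user_2", "hub_Y"),
    ("troll_A", "hub_Z"),
]

def rank_actors(pairs):
    """計算兩種排序：「最常被引用」（入連結數）與「最活躍」（出連結數）。"""
    most_quoted = Counter(dst for _, dst in pairs)
    most_active = Counter(src for src, _ in pairs)
    return most_quoted, most_active

quoted, active = rank_actors(interactions)
print(quoted.most_common(1))  # [('hub_X', 3)]：最常被引用的帳號
print(active.most_common(1))  # [('troll_A', 3)]：最活躍的帳號
```

實務上的「主要參與者」判準（社群結構、合作或競爭精神等）當然比這複雜得多，此處僅示意排序邏輯。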
11. Communicate better. We will lose the information war if we only respond and react. In order to win this war, it is not only necessary to ensure a continuous presence on the web, to have a communication strategy, disseminate targeted messages, and be able to refute false information. It is equally important to be proactive by drawing the adversary out of their comfort zone. For example, whenever government services detect trolls or dormant bots, they should be exposed publicly before they are even used.
When under attack, communication is key. Defense personnel can tend to classify—rather than use—information. It is possible to condemn an attack without revealing its source and then leave it to the media to do their work. This was one of the reasons for the En Marche! campaign’s successful response to the interference attempt during the French presidential election. This was also the approach the Germans adopted during their pre-electoral period. Proactive communication is now widely recognized as the strategy to follow.
For States who do not have English as their official language, it is also important to communicate information in English about their doctrine, national strategy and experience.
12. Legislate when required. States must be able to implement the following measures when necessary:
a) adopt a law against “fake news” if there is none, or adapt the existing legislation to meet the challenges of the digital era;
b) penalize more strictly the wrongdoings of the media, by following the example of the British Ofcom (which sanctioned RT on several occasions with some success, i.e. a dissuasive effect) and reinforce legislation which punishes online harassment, in particular towards journalists;
c) consider making registration compulsory for foreign media, by following the American example, which would not affect the circulation of these media (and would thereby not constitute censorship) but would simply provide a transparency tool. The public has a right to know who speaks, similar to the logic that prevails in matters of food safety—the traceability of information must be a measure of its quality.
c）考慮效法美國的例子，強制要求外國媒體進行登記。這不會影響這些媒體的傳播（因而不構成審查），而只是提供一項透明度工具。公眾有權知道發言者是誰，這類似於食品安全領域通行的邏輯：資訊的可追溯性必須成為衡量其品質的標準。
**Develop our legal system**
“I have decided that we would make changes to our legal system so as to protect democratic life from fake news. A law will soon be proposed on this issue. During the electoral period, [...] platforms will be required to meet obligations of increased transparency regarding all sponsored content so as to make public the identity of advertisers and those who control them. Platforms will also have to limit the sums devoted to such content. [...] In the event of the propagation of fake news, it will be possible to take legal action which, if necessary, will include deleting the content in question, dereferencing the website, closing the user account in question and even blocking access to the website. The regulating powers, which will be thoroughly reshaped in 2018, will be increased to manage attempts at destabilization by television services controlled or influenced by foreign States. This will allow the reworked CSA [French media regulatory authority], in particular, to refuse to conclude agreements with such services by assessing the content published by said services, including on the internet. It will also enable the regulator, in the event of an act likely to affect the outcome of the ballot—whether in the pre-election or election period—to suspend or cancel an agreement. [...] This new mechanism will involve a duty of intervention on the part of intermediaries to quickly remove any illicit content brought to their attention.”
「我已經決定修改我們的法律制度，以保護民主生活免受假新聞的侵害。很快將就此問題提出一項法律。在選舉期間，[……] 平台將必須履行義務，提高所有贊助內容的透明度，公開廣告主及其背後控制者的身分。平台也必須限制投入此類內容的金額。[……] 一旦有假新聞傳播，將可以採取法律行動，必要時包括刪除相關內容、將網站自搜尋引擎除名、關閉相關使用者帳號，甚至封鎖對該網站的訪問。監管權力將在 2018 年徹底改組並獲得強化，以因應受外國政府控制或影響的電視服務破壞穩定的企圖。這尤其將使改組後的 CSA［法國媒體監管機構］得以透過評估這類服務所發布的內容（包括在網路上發布的內容），拒絕與其締結協議。監管機構也將能夠在出現可能影響投票結果的行為時（無論是在選前還是選舉期間），暫停或取消協議。[……] 這項新機制將要求中介機構負起干預義務，迅速刪除任何經通報的非法內容。」
(Emmanuel Macron, President of France, New Year’s Address to the Press, 4 January 2018.)
（法國總統 Emmanuel Macron，向新聞界發表之新年賀詞，2018 年 1 月 4 日。）
We must nevertheless be careful to not overregulate. In other words, we must preserve the equilibrium between protecting the population and respecting civil liberties, which are the foundations of our liberal democracies. Overregulation is a real danger, and even a trap set by our adversaries: far from being bothered with overzealous regulations, they will actually benefit from the controversy and divisions that it will create. We must be mindful of the risk of our actions having such unintended effects.
13. Conduct parliamentary inquiries. The American and British examples show that public inquiries offer many benefits in terms of raising citizens’ awareness, accumulating knowledge, and providing deterrence.
14. Hold digital platforms accountable. The role of social networks in information manipulation is now widely recognized. They have become the principal source of information, and hence of disinformation, for a majority of the population. Although information manipulation is costly for their reputation and despite the self-regulation pledges these platforms have made in recent times, it is unclear whether digital platforms actually want to curb these practices. It is our responsibility to find the right levers with which to compel them, at the European level, to:
a) make the sources of their advertising public—by demanding the same level of transparency as is required of traditional media;
b) implement adequate measures with which to fight information manipulation on their websites and contribute to the improvement of media literacy and the awareness of the general public of these issues.
It is up to legislators to strike the right balance between freedom of expression and the need for a greater accountability when it comes to digital platforms in the fight against information manipulation.
15. Share information with digital platforms. We cannot, on the one hand, wait for digital platforms to do more in the fight against information manipulation while, on the other hand, not providing them with information that is sometimes necessary for them to move forward. Public-private cooperation is of capital importance and demands knowledge-sharing in both directions. This is one of the recommendations made to the Trump administration by two former senior officials of the Obama administration, in the context of the midterm elections of 2018.
15. 與數位平台共享資訊。我們不能一方面期待數位平台在打擊資訊操弄上做得更多，另一方面卻不向它們提供有時是它們採取行動所必需的資訊。公私合作至關重要，且需要雙向的知識共享。這是歐巴馬政府的兩位前高級官員，在 2018 年期中選舉的背景下，向川普政府提出的建議之一。
16. Go international. In recent years, the issue of information manipulation has been raised primarily by the same group of States on the international stage: Central, Eastern, and Northern European States alongside the U.K. and the United States. France and Spain are in the process of stepping up their international presence because they too have been the target of attacks. Other States should not wait to be attacked; they should become more active now. This implies:
a) increasing their participation in existing initiatives: send an expert to EU institutions, as a priority the East StratCom Task Force; contribute to the work of the European Centre of Excellence for Countering Hybrid Threats (Hybrid CoE); take part in important annual meetings (StratCom Summit in Prague, Riga StratCom Dialogue, the Atlantic Council’s StratCom in Washington DC);
a）增加對現有倡議的參與：向歐盟機構派遣專家，並優先支援東方戰略傳播工作小組（East StratCom Task Force）；為歐洲反制混合威脅卓越中心（Hybrid CoE）的工作做出貢獻；參加重要的年度會議（布拉格 StratCom 峰會、里加 StratCom 對話、華盛頓特區大西洋理事會 StratCom）；
b) increase meetings between regional communities. The Euro-Atlantic scene dominates but it is not the only one: there are many interesting developments in Asia, with Singapore being increasingly seen as a point of reference. Not only are authorities proactive and outward-looking, as is demonstrated by the parliamentary hearings and the fact that the Ministry of Defense will soon be sending a resident expert to the NATO Excellence Center in Riga, but so too has civil society been actively involved. The Centre of Excellence for National Security (CENS) at the S. Rajaratnam School of International Studies (RSIS) organizes an annual seminar on disinformation which is one of the very rare meeting points between research and practitioner communities from Europe, North America, Asia and Africa. This diversity is quite refreshing for those accustomed to the Euro-Atlantic scene, which tends to only view the subject through the Russian lens. Each situation is, of course, unique (information manipulation in India, Burma or Indonesia are concerning but endogenous, and thus far removed from the Russian interferences in Europe and North America), but as China presents itself as an ever-increasing threat in the region, such as the Australian case illustrates, there are interesting parallels with Russia to be explored, including to find out what these two countries are learning from each other.
b）增加區域社群之間的交流。歐洲-大西洋圈目前居於主導地位，但並非唯一：亞洲有許多有趣的發展，新加坡越來越被視為參考指標。新加坡當局不僅積極主動且具外向性（國會聽證會，以及國防部即將派遣駐地專家前往里加的北約卓越中心，即為明證），民間社會也積極參與。S. Rajaratnam 國際研究學院（RSIS）的國家安全卓越中心（CENS）每年舉辦一場關於假訊息的研討會，這是歐洲、北美、亞洲和非洲的研究社群與實務社群之間極少數的交流平台之一。對於習慣歐洲-大西洋場景、往往只透過俄羅斯視角看待這個主題的人來說，這種多樣性相當令人耳目一新。當然，每種情況都是獨特的（印度、緬甸或印尼的資訊操弄雖令人擔憂，但屬內生問題，與俄羅斯對歐洲和北美的干預相去甚遠），但隨著中國在此區域的威脅日益升高（如澳洲的案例所示），中國與俄羅斯之間有許多值得探究的相似之處，包括了解這兩個國家正在相互學習什麼。
c) innovate through the creation of new mechanisms. Information manipulation often has an inherently international scope. For this reason, coordination is critical. An international early warning mechanism could be established, connecting all of the networks, centers and agencies of the EU and NATO Member States. It might not be necessary to create a new network: from the EU’s East StratCom Task Force to the Helsinki and Riga Excellence Centers, there are already various valuable hubs and interfaces for national teams.
Some groups, mostly in the United States, have suggested the creation of an international coalition. In their January 2018 report, Democratic U.S. Senators recommended the creation of “an international coalition against hybrid threats,” which would be spearheaded by the United States. They urged the American President to convene an annual world summit on hybrid threats, modelled on the summits of the Global Coalition against Daesh or against violent extremism, which have been held annually since 2015. Representatives from civil society and private actors would be invited to take part.
Two months later, Fried and Polyakova made a similar suggestion: the creation of a “counter-disinformation coalition” by “the United States and Europe,” “a public-private group bringing together on a regular basis like-minded national government and nongovernmental stakeholders, including social media companies, traditional media, ISP firms, and civil society.” The idea of creating a network involving nongovernmental actors is excellent. However, articulated in these terms, it appears problematic, not just because it excludes Canada, but because such a transatlantic alliance already exists (NATO) and also because it would require an explanation to Moscow. Moscow will certainly ask to join or why it cannot be part of this “coalition of the willing.” The coalition would run the risk of looking like an anti-Russian—rather than an anti-disinformation—alliance. Existing structures, within the EU or NATO, are less susceptible to such criticism.
兩個月後，Fried 和 Polyakova 提出了類似的建議：由「美國和歐洲」建立一個「反假訊息聯盟」，即「一個定期聚會的公私混合團體，匯集志同道合的各國政府與非政府利害關係人，包括社群媒體公司、傳統媒體、網路服務供應商（ISP）和民間社會」。建立一個納入非政府行為者的網絡是絕佳的想法。然而，以這種方式表述卻有問題：不僅因為它排除了加拿大，也因為這樣的跨大西洋聯盟早已存在（北約），還因為它需要向莫斯科作出解釋。莫斯科肯定會要求加入，或質問為什麼它不能成為這個「意願聯盟」的一部分。該聯盟將面臨看起來像反俄聯盟（而非反假訊息聯盟）的風險。歐盟或北約內部的現有結構則較不易受到這種批評。
In May 2018, former U.S. Vice President Joe Biden, former Secretary of Homeland Security Michael Chertoff and former NATO Secretary General Anders Fogh Rasmussen created a transatlantic “Commission on Election Integrity.” This Commission is a new actor worth watching, even though it is too soon to assess the role it will play.
Finally, the G7 offers an obvious platform from which to share best practices and formulate common approaches to countering information manipulation. Canada made the issue one of the priorities of its Presidency of the G7 in 2018, by proposing various mechanisms for exchanges and joint action. France, which will take over the G7 Presidency in 2019, should build on these initial results in order to carry out the joint efforts begun within this forum, which are predicated on the preservation and defense of democracy.
17. Train adults as well as children (media literacy and critical thinking). The promotion of media literacy in schools stands as one of the most widely agreed-upon recommendations, despite its unequal application by governments, as can be demonstrated by the Open Society Institute country ranking. However, if we strictly limit ourselves to media-literacy obtained through schooling, as is often the case, it is a long term measure whose effects will only be visible once the children reach adulthood. It is important to consider media literacy and, more broadly, the development of critical thinking, for the whole population, at all stages of life. The education of teenagers and students is particularly important because they tend to be the most vulnerable to information manipulation for a variety of reasons (lack of experience, the need to assert independence, socio-cultural environment) and they have not necessarily benefited from media literacy training in their early years. Offering a core curriculum first-year course in university (text and image analysis, identification of sources) would be useful and easy to implement, at least in social sciences programs.
The idea is to ensure that any person faced with a piece of information can assess its validity (arguments, evidence) and its source (reliability, motivations). This is a public hygiene measure—just as people in the 19th century learned to wash their hands. One possibility would be to follow the Swedish model and publish a “digital hygiene guide” for use by politicians and political parties.
目的是確保任何面對一則資訊的人都能評估其有效性（論據、證據）及其來源（可靠性、動機）。這是一種公共衛生措施，就像 19 世紀的人們學會洗手一樣。一種可能的做法是遵循瑞典模式，出版供政治人物和政黨使用的「數位衛生指南」。
In other words, it is crucial to educate the general public from a very early age but also at different stages of life, about image, audiovisual media, critical thinking and rational argumentation. The assessment of information is a skill that can be learned. Courses in critical thinking and rational argumentation are widely available in some countries and even considered an indispensable prerequisite in university. These courses teach students how to recognize a paralogism or a sophism and to detect fallacious reasoning. Such measures of “intellectual self-defense” must be developed.
a) Generally speaking, the actions implemented have been hampered by at least two factors: teachers are inadequately trained, and they do not have enough time at their disposal to include this activity in the program. Governments must be mindful of this situation and seek to resolve it.
b) Part of this education must include making people mindful of the mechanisms that exist (trolls, bots, deep fake, etc.). In school, children should be taught how to construct as well as deconstruct false information and conspiracy theories. This would enable them to break down and relativize them. (If they can construct a false information themselves, they will then understand that adults can arguably do it even better.) Children should also learn to use Google image so as to verify the source of any given image. They should also learn not only how to decode/interpret but also how to engage in debate and particularly online debate, through workshops, simulations, etc.
c) Media literacy must include a technological dimension so that young people can understand the operation of social network algorithms (personalization, filter bubbles). It is undoubtedly a challenge to explain such workings to children when even adults struggle to understand them.
d) Go beyond the classroom: to improve its effectiveness, education on information verification should be communicated through a range of media, including television, which after all continues to reach the youngest members of the public. There could be awareness-raising messages played before YouTube videos or sent by digital platforms as private messages, e.g. on Snapchat or Instagram.
e) It is possible to reach out to adults through public campaigns around particular events or through training programs. In that regard, the activities of the NGO Baltic Centre for Media Excellence, which trains journalists and teachers across the region, provide an interesting example. In public service and, in particular, in the Ministries and services most concerned, it is crucial to train staff members so as to reinforce overall “digital hygiene” and develop an internal expertise enabling them to act in an autonomous manner. This involves new recruitment criteria as well as a new range of training programs, public-private partnerships and mobility programs enabling civil servants to acquire new skills from innovative companies. Institutions similar to the French Institute for Higher National Defence Studies (IHEDN) could offer training sessions dedicated to informational threats.
f) The recreational aspect is important, because information manipulation is often entertaining and responses to it are likely to miss their target if they appear too boring (see recommendation n°20). In this way, games, such as the ones developed for Facebook by the NATO Strategic Communications Centre of Excellence, can be quite effective at garnering the interest of young and old alike. Yet another example of this is the BuzzFeed media and news company, which produces a highly successful weekly “Fake News Quiz.”
18. Develop research. Our immune system against information infection is not only grounded in a capacity to monitor and analyze the information space—which requires us to allocate more intelligence resources to these activities—but also in an ability to comprehend those who manipulate information and, above all, Russia. Therefore, it is necessary to support research on Russia and the post-Soviet sphere at large. This does not mean reviving “sovietology,” but acknowledging that it is possible to respond adequately only to that which we understand well.
18. 開展研究。我們抵禦資訊感染的免疫系統，不僅建立在監測和分析資訊空間的能力上（這需要我們為這些活動分配更多的情報資源），還建立在理解那些操弄資訊者（尤其是俄羅斯）的能力上。因此，有必要支持對俄羅斯及整個後蘇聯地區的研究。這並不意味著復興「蘇聯學」，而是承認：唯有對某事物有充分理解，才可能做出適切的回應。
In concrete terms, this means that States must increase research funding and introduce calls for tenders aimed at studies on predetermined topics or even fund PhDs and/or postdoctoral research projects as well as events (symposiums) and publications on the subject. The connection to information manipulation can either be direct (when it is the topic of research), or indirect, as it can be useful to support sub-projects in the information field, in social psychology or in political science—adding yet another piece to the puzzle.
具體而言，這意味著各國必須增加研究經費，針對預定主題的研究發起招標，甚至資助博士和／或博士後研究計畫，以及相關主題的活動（研討會）與出版品。這些研究與資訊操弄的關聯可以是直接的（以資訊操弄為研究主題），也可以是間接的：支持資訊領域、社會心理學或政治學中的子計畫同樣有用，等於為這幅拼圖再添一塊。
19. Marginalize foreign propaganda organizations. Firstly, it is necessary to call out these organs for what they are. This is what the French President did in front of Vladimir Putin at Versailles, in the wake of his election, in a public statement which attracted international attention:
Russia Today and Sputnik have been organs of influence during this campaign that have, on several occasions, produced untruthful statements about myself and my campaign [...] It is a matter of serious concern that we have foreign news organizations—under whatever influence, I do not know—interfering in a democratic campaign by spreading serious lies. And on this issue, I will yield no ground, no ground whatsoever [...] Russia Today and Sputnik did not act as news organizations and journalists, they acted as organs of influence and propaganda, and of lying propaganda, no more, no less.
今日俄羅斯電視台和俄羅斯衛星通訊社在這次競選期間一直是影響力機構，曾多次對我本人和我的競選活動作出不實陳述[...]外國新聞機構（受到何種勢力影響，我不得而知）藉由散佈嚴重的謊言來干預一場民主選戰，這是令人嚴重關切的問題。在這個問題上，我寸步不讓，絕不退讓[...]今日俄羅斯電視台和俄羅斯衛星通訊社的所作所為不是新聞機構和記者，而是影響力與宣傳機構，是說謊的宣傳機構，不多也不少。
Consequences ought to be drawn, by not accrediting or inviting organs of influence to press conferences reserved to journalists.
20. Use humor. Counter-measures are often criticized for not being entertaining and, for this reason, missing their target audience. On the other hand, stories involving false information are usually amusing. Many people consume fake news like they would junk food: knowing full-well that it is bad for them, but giving in to the pleasure. RT and Sputnik practice “infotainment,” a combination of information and entertainment, compared to which corrective measures can appear very stern. Yet experience in Europe and North America tells us that humor, satire, jokes and mockery work remarkably well against information manipulation. This is something civil society understands: there are a range of satirical programs (“Derzites tam!” in Lithuania), satirical prizes (the “Putin’s Champion Award” of the European Values think tank), as well as numerous satirical accounts on social networks (Darth Putin on Twitter, who provides such advice as “Do not believe *anything* until the Kremlin denies it”), etc. The EU’s task force also uses humor on its website EUvsDisinfo and on social networks. Therefore, even though this veers from their usual pitch, States should consider communicating through humor in some circumstances (Sweden does an excellent job of myth-busting certain clichés on its website Sweden.ru, for example).
20. 善用幽默。因應措施常被批評不具娛樂性，因而錯失目標群眾。另一方面，夾帶不實訊息的故事通常很有趣。許多人把假新聞當垃圾食物消費：明知它有害，卻仍沉溺於其中的樂趣。今日俄羅斯電視台和俄羅斯衛星通訊社實行「資訊娛樂」（infotainment），一種資訊和娛樂的結合；相較之下，糾正措施可能顯得非常嚴肅。然而，歐洲和北美的經驗告訴我們，幽默、諷刺、笑話和嘲弄在對抗資訊操弄時非常有效。民間社會深明此理：有一系列諷刺節目（立陶宛的「Derzites tam!」）、諷刺獎項（歐洲價值觀智庫的「普丁冠軍獎」），以及社群網路上眾多的諷刺帳號（推特上的 Darth Putin 提供諸如「在克里姆林宮否認之前，不要相信*任何事*」的建議）等等。歐盟的工作小組也在其網站 EUvsDisinfo 和社群網路上使用幽默。因此，儘管這偏離了官方慣常的語調，各國仍應考慮在某些情況下以幽默的方式溝通（例如，瑞典在其網站 Sweden.ru 上破除某些刻板印象，就做得非常出色）。
21. Be aware of your own vulnerabilities. Information manipulation exploits the vulnerabilities of our democratic societies. For this reason, it is necessary to map out, locate and understand these vulnerabilities in order to anticipate and try to prevent hostile actions. The ability to put ourselves in the shoes of the adversary is, therefore, essential in order to better predict their next moves. To this end, we must not only study them through research and intelligence but also test our procedures through “red teams,” i.e. teams that play the part of the opponent by trying to identify and manipulate our weaknesses.
22. Remember what we are fighting for. Information manipulation tries to systematically instill doubt in the values and principles of the communities it targets. The best way to combat these manipulation attempts is, firstly, to have a clear idea of what we wish to protect.
23. Acknowledge the unavoidable reversal and diversion of our counter-measures. It is important to recognize that our counter-measures will, in turn, be manipulated by the enemy. Sometimes there will be a mirror effect (RT has its own FakeCheck in four languages, the Russian Ministry of Foreign Affairs’ website launched a section entitled “Published materials that contain false information about Russia” in February 2017, etc.). Sometimes the counter-measures will be distorted by the enemy or third States (illiberal forces taking advantage of the situation to push restrictive laws). Therefore, it is necessary to encourage positive approaches that promote the free circulation of high quality information, in contrast to the fragmentation that currently dominates the internet.
24. Pay attention to weak signals beyond the Russian prism (other States, non-state actors) as well as those working against our interests outside of Europe (notably in Africa and in the Middle East).
25. Listen to civil society, especially journalists. Establishing a regular and open dialogue between journalists and policymakers can help to fight against information manipulation. In Sweden, a Media Council meets on a regular basis, bringing together media leaders and politicians to identify the challenges they face and, crucially, to coordinate their fact-checking efforts. The Belgian group of experts recommends creating a “discussion forum” joining all the actors involved (“universities, the media, journalists and journalism schools, NGOs, digital platforms”). This excellent idea—which would nevertheless be easier to implement in smaller countries, where the actors are less numerous—would also provide the State with a point of contact, allowing them to regularly consult this discussion platform.
26. Keep other forms of influence in check. Information manipulation is but one element in a complex system; it feeds off of other forms of influence. In the case of Russia, targeted States should reduce their energy dependence on Russia as well as target corruption and the Russian financial circuits that contribute to the funding of influence operations.
27. In external operations, nurture relationships with the local population. It is important to never forget that “every action projects an image, generates a perception for the adversary, for local populations but also today with domestic and international audiences. Troops deployed in an external military operation are therefore the first actors of influence, and their actions are not strictly non-lethal.” In the context of NATO’s Enhanced Forward Presence in Baltic countries, American soldiers in Latvia have performed practical services for the Russian-speaking communities (such as chopping wood), which has enhanced their popularity and contributed to undermining anti-American propaganda circulated by Russian media among those communities.
28. Punish those responsible for serious interference, during, for example, an electoral process and if responsibility can be clearly assigned, through economic sanctions or legal proceedings (American Special Prosecutor Robert Mueller indicted 13 Russians and three Russian entities in February 2018, along with 12 officers of the GRU in July 2018).
## III. Recommendations for civil society
29. Understand and reinforce digital confidence-building measures. Information manipulation is both a cause and a symptom of the crisis of confidence in the digital arena. Effectively fighting against these manipulations will have the end result of increasing confidence. At the same time, this first requires having an understanding of the psychological mechanisms that underpin trust, by placing oneself in the users’ position, and promoting good practices that will build trust. In this way, it would be useful to seek enhanced cooperation which would allow the establishment of reliability indices for online content.
30. Enhance fact-checking while remaining aware of its limitations. As most people tend not to accept the correction (and this tendency is even more pronounced if the correct information challenges deeply-held beliefs), fact-checking can be effective on a given individual provided that two conditions are met: firstly, the correction must not directly undercut one’s vision of the world (otherwise it can even have the perverse effect of reinforcing the person’s primary beliefs—this was observed in the case of Iraq’s weapons of mass destruction, and discussions on climate change and vaccination). Secondly, the correction must entail an explanation of why and how disinformation was spread.
30. 加強事實查核，同時認知其局限。由於大多數人傾向於不接受糾正（若正確的訊息挑戰到根深蒂固的信念，這種傾向更加明顯），只有在滿足兩個條件時，事實查核才會對特定個人有效：首先，糾正不能直接牴觸此人的世界觀（否則甚至可能產生反效果，反而強化其原有信念，這在伊拉克大規模毀滅性武器，以及關於氣候變遷和疫苗接種的討論中都曾觀察到）。其次，糾正必須解釋假訊息為何傳播以及如何傳播。
31. Develop simple tools allowing citizens to expose information manipulation attempts themselves, such as knowing who is responsible for a particular advertisement (whotargets.me) or detecting trafficked videos (such as the AFP’s project InVID).
> [name=林傑]trafficked videos在中文似乎沒有準確對應詞，類似是為了利益而做的假影片或者操弄輿論的影片
32. Develop normative initiatives (rankings, indexes, labels, etc.) while recognizing that a proliferation of competing norms and standards will only weaken the overall effort. Therefore, the objective should be to put forward a small number of tools of reference, possibly in connection with reputable NGOs. The Reporters Without Borders (RSF) initiative is, in this respect, very promising.
33. Adopt an international charter of journalistic ethics, in a collaborative manner (by involving both major traditional and online media). The majority of major media platforms have charters of good editorial practices and ethics. The 1971 Munich Charter can provide a useful foundation, but it needs to be adapted to the contemporary media landscape and, notably, the rise of digital media.
33. 以協作方式（同時納入主要傳統媒體和線上媒體）制定一部國際新聞倫理憲章。大多數主要媒體平台都有良好編輯實務與倫理的規章。1971 年的《慕尼黑憲章》可以提供有用的基礎，但它需要因應當代媒體格局，特別是數位媒體的興起，加以調整。
34. Train journalists to better understand the risks of information manipulation, in journalism schools and throughout their careers. How should one cover a massive leak, detect a fake profile or react to extremist content? There are concrete answers to these questions, which may serve as a basis for teaching material.
> [name=林傑]massive leak不確定是指大量的資訊操弄或是後面大量的fake profile extremist content，不過leak這邊應翻作資訊揭露或公開的意思
35. Build confidence in journalism by enhancing transparency.
The Trust Project, a consortium that brings together news companies such as The Economist, The Globe and Mail, La Repubblica or The Washington Post, recommends revealing sources of funding (similarly, The Conversation also requires researchers who publish on their website to disclose any potential conflicts of interest, a common practice in scientific journals), the profiles of the journalists, proof of their expertise on the subject matter, providing a clear distinction between an opinion, an analysis, or sponsored content, how the sources were accessed, why the journalist chose a particular hypothesis over another, etc. The idea is that the readers want to know how journalists work, and how they know what they know. This transparency in terms of practices, methods, and journalistic procedures can help to build trust.
The Trust Project 是一個匯集了《經濟學人》、《環球郵報》、《共和報》、《華盛頓郵報》等新聞機構的聯盟，它建議揭露：資金來源（同樣地，The Conversation 也要求在其網站上發表文章的研究者揭露任何潛在的利益衝突，這是科學期刊的常見做法）、記者的簡歷及其在相關主題上的專業證明、意見／分析／業配內容之間的明確區分、消息來源如何取得、記者為何選擇某一假設而非另一假設等等。其理念是：讀者想知道記者如何工作，以及他們如何知道他們所知道的。這種在實務、方法和新聞流程上的透明度，有助於建立信任。
36. Develop tools with which to counter “trolling,” such as Perspective by Jigsaw, which uses machine-learning and self-learning as tools to identify toxic messages that can then be isolated, stopped before publication and then submitted to moderators. The New York Times and other major papers use such tools on their websites. Another method consists of the publication of lists of accounts identified as trolls.
36. 開發對抗「釣魚」（trolling）行為的工具，例如 Jigsaw 開發的 Perspective，它以機器學習與自我學習來辨識有害訊息，這些訊息可被隔離、在發布之前攔截，再提交給版主審核。《紐約時報》和其他主要報紙在其網站上使用這類工具。另一種方法是公布被認定為釣魚帳號的清單。
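第 36 點所述「在發布前攔截有害訊息、再交由版主審核」的流程，可以用下方的 Python 草圖簡單示意。其中的 `toxicity_score` 與 `BLOCKLIST` 純屬假設性替身（實務上會換成 Perspective 這類機器學習分類器），僅用來展示分流邏輯，並非任何實際系統的實作。

```python
# 極簡示意（假設性實作）：先對留言評分，超過門檻者暫扣送審，
# 其餘直接發布。評分函式僅以虛構詞表計算命中比例。

BLOCKLIST = {"idiot", "moron", "scum"}  # 假設性的詞表，僅供示意

def toxicity_score(text: str) -> float:
    """回傳命中（假設性）詞表的字詞比例，作為毒性分數的替身。"""
    words = [w.strip(".,!?").lower() for w in text.split()]
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in BLOCKLIST)
    return hits / len(words)

def triage(comment: str, threshold: float = 0.2):
    """分數達門檻者隔離並排入版主待審佇列，否則發布。"""
    score = toxicity_score(comment)
    if score >= threshold:
        return ("held_for_moderation", score)
    return ("published", score)
```

重點在於流程（先攔截、後人審）而非評分方式本身；門檻 `threshold` 的取捨即是「誤攔正常留言」與「放行有害留言」之間的權衡。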
37. Use artificial intelligence and automatic language processing tools in the detection of manipulation attempts and fact-checking. The profusion of fake or biased news is such that journalists, analysts and researchers together will never be numerous enough to spot and deal with all of the threats. Detection software, such as Storyzy, are continuously multiplying and being perfected. With respect to fact-checking, certain software can automatically compare the suspicious news story with all others that were already “debunked” so as to avoid repeating the same work for nothing. This assumes that there is shared access to databases— hence the need for verification networks. Automated verification saves time, but nevertheless still requires, for the time being, a human at the end of the process to validate its results.
37. 運用人工智慧和自動語言處理工具來偵測操弄行為並協助事實查核。虛假或帶偏見的新聞數量龐大，記者、分析師和研究人員加起來也永遠不足以發現並處理所有威脅。偵測軟體（例如 Storyzy）不斷增加且日趨完善。在事實查核方面，某些軟體可以自動將可疑新聞與所有已被「闢謠」的新聞進行比對，以避免重複做白工。這有賴於資料庫的共享存取，因此需要建立查核網絡。自動化查核可以節省時間，但就目前而言，流程的最後仍需要人工確認其結果。
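第 37 點提到「將可疑訊息與已闢謠資料庫自動比對」，最陽春的做法是詞袋餘弦相似度。以下為假設性的 Python 草圖（`match_debunked`、`threshold` 等名稱皆為本文虛構，並非 Storyzy 或任何實際查核系統的介面）；實際系統會使用更精密的語意比對模型，且如文中所述，最終仍需人工驗證。

```python
# 極簡示意（假設性實作）：把可疑訊息向量化後，
# 與闢謠資料庫中每一則既有紀錄計算餘弦相似度，
# 相似度達門檻即回傳對應的舊查核結果，避免重複查核。
import math
from collections import Counter

def cosine_similarity(a: str, b: str) -> float:
    """兩段文字的詞袋（bag-of-words）餘弦相似度。"""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    common = set(va) & set(vb)
    dot = sum(va[w] * vb[w] for w in common)
    na = math.sqrt(sum(v * v for v in va.values()))
    nb = math.sqrt(sum(v * v for v in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

def match_debunked(claim: str, debunked: list, threshold: float = 0.5):
    """回傳最相近的已闢謠說法；若都不夠相近則回傳 None。"""
    best = max(debunked, key=lambda d: cosine_similarity(claim, d), default=None)
    if best is not None and cosine_similarity(claim, best) >= threshold:
        return best
    return None
```

這也說明文中為何強調資料庫的共享存取：比對品質完全取決於 `debunked` 清單的涵蓋範圍。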
38. Develop surveys and polls aimed at assessing public sensitivity to information manipulation. Collecting precise data on a regular basis would improve the effectiveness of counter-measures.
39. Enhance pluralism through tools promoting information diversity, in order to combat the phenomenon of “filter bubbles:” several projects, including Ghent University’s “NewsDNA,” allow citizens to adjust the degree of diversity in the news that they consume.
40. Rethink the economic model behind journalism, so as to reconcile the preservation of freedom of expression, free market competition and the fight against information manipulation.
41. Incite researchers to intervene in public debates. Pseudo-science proliferates because it occupies a space that is too often left vacant by actual scientists: in particular the dissemination of scientific knowledge (popular science). There are far too many researchers who neglect this activity, considering media exposure to be unethical and a hindrance to their career. However, in the context of this ambiguity and confusion, the social responsibility of academics was never greater: they are obliged to provide non-specialists access to the results of their research and to insert themselves in the public debate. In line with this exercise of disseminating research, higher education institutions must also organize media training courses, to teach the specific skills needed to best interact with the media. Moreover, the dissemination of research must be increasingly valorized in the career, as well as constitute a major criterion for evaluation, in order to incite academics to practice this exercise.
## IV. Recommendations for private actors
42. Rethink the status of digital platforms: take platforms at their word and exercise decisive political pressure to compel them to ensure, through strict codes of conduct, that their asserted missions are indeed reflected at the operational level (algorithms, the role of moderators, policing of networks, etc.). In addition, it is necessary to come up with a hybrid status—something between media and host—that enables us to take into account the public service mission that digital platforms have de facto come to assume (digital agora). The possibility of an anti-trust regulation proposed by the European Commission expert group (see above) also deserves consideration.
42. 重新思考數位平台的定位：要求平台兌現自己的承諾，並施加決定性的政治壓力，透過嚴格的行為準則，迫使它們確保其宣稱的使命確實反映在營運層面（演算法、版主的角色、網路治理等）。此外，有必要提出一種介於媒體與託管服務之間的混合定位，使我們得以正視數位平台事實上已承擔的公共服務使命（數位公共廣場）。歐盟執委會專家小組（見上文）提出的反托拉斯監管的可能性也值得考慮。
43. Demand the establishment of a new contract with users that is founded on new digital rights. The terms of reference must be reassessed so as to make them intelligible to all and more explicit as regards issues of access to and management of personal data. It is critical that internet users reclaim control over the future use of their data (an opt-in system could be devised, a fee-paying service performing one or several of the following functions: data confidentiality, advertisement blocking, traceability of personal data).
43. 要求與使用者建立一份以新數位權利為基礎的新契約。必須重新評估使用條款，使其對所有人清楚易懂，並在個人資料的存取與管理問題上更加明確。讓網路使用者重新掌控其資料的後續利用至關重要（可以設計選擇加入〔opt-in〕制度，由付費服務提供以下一項或多項功能：資料保密、廣告攔截、個人資料的可追溯性）。
44. Impose a high level of transparency. In the aftermath of the Cambridge Analytica scandal, wishful appeals for more transparency are no longer good enough. Internet users must be informed of the campaigns that can affect them and the reasons for such targeting. Given the challenge this poses for democratic life, political advertising connected to the exploitation of big data must be subjected to specific regulation. In this context, the possibility has been raised of establishing a public mediator who would be granted access to algorithms under the condition of strict confidentiality.
45. Increase the cost of information manipulation while ensuring the protection of vulnerable individuals and movements. More systematic action must be undertaken against the agents of manipulation, drawing on the concept of “threat actor,” a term that comes from the field of cybersecurity. (This concept allows for the identification of chains of command and infrastructures that are shared between various operations. Rather than censoring contentious content one by one [a “whack-a-mole approach”], platforms could conduct inquiries that lead to the identification of a hostile actor and then suppress all of those actor’s online outlets. We might follow the model set by the deletion of all Facebook pages linked to the IRA.) Whistle-blowers and organizations that are targeted by an information manipulation campaign must, on the other hand, be warned in advance through a special detection system. They must also benefit from protective procedures (hotline) that will enable them to defend themselves.
46. Enhance and better remunerate quality journalism: the current system is unsustainable. Digital platforms have appropriated the bulk of the advertising revenue, which used to be allocated to the funding of traditional media. These platforms have also capitalized on these media’s primary content without remunerating them. It is important to think about new methods of redistribution of information from digital platforms to quality media.
47. Require platforms to contribute to the funding of quality journalism, by requiring them to provide funding for fact-checking, for example.
48. Require platforms to contribute to the funding of independent research: experts agree on the need to access platforms’ data in order to measure the impact of information manipulation campaigns, understand how the information goes viral and assess the effectiveness of measures aimed at countering false information. Platforms must contribute to the funding of this research effort without trying to impose any hidden conditionality as regards the orientation of this research or the political positions of researchers.
49. Consider the creation of “safe zones”: given the present situation of information asymmetry, the challenge online disinformation poses to democracies cannot be met without the cooperation of digital platforms. This requires us creating the conditions for a constructive dialogue. It is, therefore, necessary to devise new forums in which platforms’ intellectual property rights would be guaranteed, in exchange for easier access to their data, software and algorithms. These new spaces should foster cooperation between researchers, civil society and digital platforms. This entails, particularly in the wake of the Cambridge Analytica scandal, the establishment of a preliminary framework for ethical research based on the model by which doctors access their patients’ medical files.
49. 考慮建立「安全區」：鑑於當前資訊不對稱的情況，若沒有數位平台的合作，就無法應對線上假訊息對民主國家帶來的挑戰。這要求我們為建設性對話創造條件。因此，有必要設計新的交流場域，在其中保障平台的智慧財產權，以換取更容易存取其資料、軟體與演算法。這些新空間應促進研究人員、民間社會和數位平台之間的合作。特別是在 Cambridge Analytica 醜聞之後，這需要仿照醫師取得病患病歷的模式，為符合倫理的研究建立一個初步框架。
50. Explore redirection methods so as to ensure that those who seek fake news also come across debunking. Google Redirect, for example, is thought to have efficiently curbed the attraction of ISIS by identifying potential recruits (thanks to their search history) and by exposing them to YouTube videos that demystify ISIS. The idea is to apply such methods to other cases of information manipulation.
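第 50 點「轉向」（redirection）做法的核心，是把尋找特定不實資訊的查詢導向闢謠內容。以下假設性的 Python 草圖僅示意最簡單的關鍵詞對應；`REDIRECTS` 中的網址為虛構示例，實際的 Redirect Method 是透過搜尋廣告投放等機制運作，遠比這裡複雜。

```python
# 極簡示意（假設性實作，非 Google Redirect 本身）：
# 若查詢字串同時包含某組觸發關鍵詞，即改推對應的闢謠連結。

REDIRECTS = {
    frozenset({"vaccines", "autism"}): "https://example.org/debunk/vaccines",
    frozenset({"earth", "flat"}): "https://example.org/debunk/flat-earth",
}

def redirect_for(query: str):
    """查詢命中已知不實資訊主題時回傳闢謠連結，否則回傳 None。"""
    words = set(query.lower().split())
    for keywords, url in REDIRECTS.items():
        if keywords <= words:  # 該主題的所有觸發關鍵詞都出現在查詢中
            return url
    return None
```

實務上判斷「誰在找什麼」依賴的是搜尋紀錄與廣告定向，而非這種單純的字面比對；此草圖只為說明「辨識意圖、改推反制內容」這一基本思路。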
## V. Responding to objections
In many countries, responses to information manipulation raise concerns—sometimes sincere and other times feigned and calculated. In all cases, however, these responses are legitimate objects of democratic debate. In the following pages, we list the principal criticisms and provide some answers to these objections.
在許多國家，對資訊操弄的回應引發了疑慮，這些疑慮有時出於真誠，有時則是刻意作態、別有算計。然而無論如何，這些回應都是民主辯論的正當議題。在接下來的篇幅中，我們列出主要的批評，並對這些異議提供一些回答。
The critique of responses to information manipulation can be categorized along four lines: 1) the issue is irrelevant, the real problem lies elsewhere; 2) the proposed solutions are not efficient; 3) these solutions are counterproductive, and even dangerous; 4) other arguments of a more polemical, yet nonetheless common sort.
### A. An irrelevant cause?
“Nothing new under the sun”: the political use of information is an age-old practice. What is happening today is nothing new when compared to the Cold War period.
→ The current situation presents at least three fundamental differences in comparison with the past and, in particular, the Cold War years:
- social networks ramp up the effects of information manipulation (speed of propagation, scale and diversity of audiences reached; high impact for very low costs);
- the objective today is no longer the defense of a particular ideology or system (the USSR) but the denigration of the West and the polarization of societies;
- non-state actors play a crucial role in the present phase: they interact with one another and with States in a manner that is at once more systematic and more diluted (see Vladimir Putin’s statements on the “Russian patriots” online).
The role of disinformation in recent crises (Brexit, American elections) has been overstated. There is no conclusive research demonstrating that fake news has a direct and tangible impact on internet users. Conversely, by responding to disinformation in a conspicuous manner, we risk granting the stories undue importance.
→ Recent experience has demonstrated, on the contrary, that it is important to not underestimate the seriousness of information manipulation. The Lisa Case has had very real consequences on the rise of anti-migrant sentiment in Germany and such effects are often irreversible, despite later efforts to restore the truth.
→相反地，近來的經驗表明，重要的是不要低估資訊操弄的嚴重性。Lisa 案對德國反移民情緒的升高產生了非常實際的影響，而且儘管後來有還原真相的努力，這類影響往往是不可逆的。
The Obama Administration chose, for a variety of reasons, not to alert the public to the information manipulation campaign targeting the country, thereby easing the course of an ongoing democratic destabilization effort. On the other hand, the German Chancellor referred publicly to the manipulation threats in the wake of the 2015 attack on the Bundestag.
It was the latter model that France followed during the “Macron Leaks” and is a model that has proven itself effective.
Digital platforms are the ideal scapegoats to blame for the evils of society. However, technology is neutral, these platforms are nothing but spaces without preferences within which internet users can express themselves freely.
→ To use the words of the whistleblower who revealed the Cambridge Analytica scandal: “the knife may be neutral, but it can be used to cook —or to kill somebody.” This very neutrality requires strong principles and clear rules to prevent it from being diverted towards malicious goals or from serving projects that are hostile to our democracies and our citizens’ welfare. It is high time that platforms take responsibility and that governments draw all the lessons from this type of scandals.
### B. Ineffective solutions?
The proposed solutions (media literacy, promotion of quality content) will only impact those who are already convinced and will have no impact on those audiences who are most exposed to disinformation (conspiracy theorists, radical groups, etc.).
→ Contemporary information manipulation campaigns succeed in sowing seeds of doubt in a variety of audiences—not just conspiracy theorists, alternative and radical communities. Measures that support media literacy, fact-checking and quality journalism reinforce the resilience and immunity of the wider public to manipulation threats. We are conscious of the fact that the most radical or pro-conspiracy theory segments of public opinion will not be convinced, but they are a minority and must remain so.
→當代的資訊操弄行動成功地在各種受眾中播下懷疑的種子，而不僅僅是陰謀論者、另類與激進社群。支持媒體素養、事實查核和高品質新聞的措施，能強化廣大公眾對操弄威脅的韌性與免疫力。我們清楚，輿論中最激進或最傾向陰謀論的群體不會被說服，但他們是少數，也必須維持是少數。
Counter-productive effects: projects (such as RSF’s) aimed at ranking and indexing reliable sources of information may backfire: public distrust of “the establishment” might actually encourage many internet users to seek their information from any source except those officially designated as reliable.
→ In the current state of information chaos, it is essential for the public to have at their disposal objective references with which to assess the reliability of information sources. Initiatives led by non-governmental and independent organizations such as RSF, that seek to create consensus within the profession on objective criteria for quality journalism (working methods, cross-checking information, error correction procedures, media governance, etc.), are very valuable in this context. In order to avoid counter-productive effects, ranking and labeling schemes must offer guarantees of the transparency of the process, the quality of the criteria, and demonstrate the inclusivity and diversity of those assessing these criteria.
→在目前資訊混亂的狀態下，公眾必須擁有客觀的參考依據來評估訊息來源的可靠性。由 RSF 等非政府獨立組織牽頭的倡議，試圖就高品質新聞的客觀標準（工作方法、訊息交叉查核、錯誤更正程序、媒體治理等）在新聞業界內建立共識，在此脈絡下極具價值。為避免適得其反，排名和標章制度必須保證程序透明與標準品質，並展現標準評估者的包容性與多樣性。
The diversion argument: the topic of information manipulation makes the media headlines and thus diverts attention from more substantive topics, in particular the concentration of media ownership in the hands of private interests.
→ Giving due consideration to the grave issues raised by information manipulation does not entail turning a blind eye to the other dimensions of an apparently profound crisis of political communication in the 21st century. The French President, in his New Year’s address to the press, on 4 January 2018, referenced the issue of conflicts of interest between shareholders and editorial boards and suggested some possible courses of action by which to guarantee the full editorial independence of the media.
→適當考慮資訊操弄引起的嚴重問題，並不需要對 21 世紀明顯深刻的政治溝通危機的其他方面視而不見。法國總統在新年致新聞界的講話中，於 2018 年 1 月 4 日提到了股東與編委會之間的利益衝突問題，並提出了一些可行的行動方案，以保證媒體的完全編輯獨立性。
### C. A threat to liberties?
The threat to freedom argument: beneath the cover of the fight against fake news, we are witnessing a reassertion of state control over the field of information, which threatens our freedom of expression. In Egypt, the regime ordered the closure of 21 information websites under the accusation of spreading fake news. Among these censored sites was MadaMisr, an independent, progressive newspaper who had voiced opposition towards the current regime. The cure is therefore worse than the disease.
威脅自由的論點：在打擊假新聞的掩護下，我們正在目睹國家重新加強對資訊領域的控制，這威脅到我們的言論自由。在埃及，該政權以散佈假新聞的指控，下令關閉 21 個資訊網站。在這些被審查的網站中有 MadaMisr，這是一家曾表態反對現政權的獨立進步報紙。因此，解藥比疾病本身更糟。
> [name=林傑]The cure is therefore worse than the disease為英文諺語
> [name=林傑]我只有找到madamasr的網站，原文是寫MadaMisr不確定是否有誤 https://madamasr.com/en
→ In France, the parliamentary bill against information manipulation currently under consideration offers many guarantees. Its provisions are time-limited, applying only to electoral campaigns. The bill also relies on a reinforcement of the powers of the ordinary judge, the guardian of liberty, and the powers of the CSA, the independent public authority responsible for ensuring freedom of audiovisual expression. The fundamental goal of this legislative proposal is simply to protect the honesty and integrity of the ballot, so that it faithfully reflects the popular will. It is not, therefore, about creating a “Ministry of Truth.”
→ Media and civil society actors are involved in the new legislation’s drafting process, which acts as a guarantee that the State will not infringe upon civil liberties in the process of fighting information manipulation.
> [name=林傑]civil society actor不確定是否有專有名詞
The boomerang effect: the denunciation of fake news hurts journalists themselves. The fake news anathema has become a convenient tool with which dictators and illiberal regimes justify censorship.
迴力鏢效應：對假新聞的譴責反而傷害了記者自身。「假新聞」這個罪名已成為獨裁者與不自由政權為審查制度辯護的便利工具。
→ This is a real risk and one that we take very seriously. We made a conscious decision to respond to information manipulation in a transparent and democratic manner, by cooperating with civil society and the media. Grounded as it is in the rule of law and in the values of open societies, our response is by nature more difficult to flip around in an authoritarian setting. In tackling information manipulation, we turn (as described above) either to the ordinary judge, who is the guardian of liberty, or to the CSA, an independent regulation authority whose mission is to protect freedom of audiovisual expression. France will remain vigilant, at every stage of the response, to ensure that the potential risks to civil liberties in an illiberal/authoritarian context are duly taken into account. Standing alongside the Swedish MSB, “we advocate vigilance, not paranoia.”
→ 這是一個實際存在的風險，我們也非常認真看待。我們有意識地選擇透過與公民社會及媒體合作，以透明且民主的方式回應資訊操弄。我們的回應立基於法治與開放社會的價值觀，因此本質上更難被專制政權挪作他用。在處理資訊操弄時，我們（如上所述）或是求助於作為自由守護者的普通法官，或是求助於 CSA（以保護視聽表達自由為使命的獨立監管機構）。在回應的每個階段，法國都將保持警惕，確保在不自由／專制脈絡下對公民自由的潛在風險獲得適當考量。與瑞典 MSB 的立場一致：「我們提倡警惕，而非偏執。」
Concerns regarding the pluralism of information. In our keenness to define “good information” and to promote “quality content,” we run the risk of reducing the diversity of sources and of effectively homogenizing them.
對資訊多元性的疑慮：在急於定義「良好資訊」並推廣「優質內容」之際，我們可能面臨來源多樣性減少、資訊實質同質化的風險。
→ This is a bogus accusation: the fundamental principles of freedom of expression and opinion as well as our democratic attachment to the pluralism of information remain unchanged. The various initiatives mentioned in this report aim at fostering quality content, not at censoring biased or false content.
→ 這是不實的指控：言論與意見自由的基本原則，以及我們對資訊多元性的民主堅持，皆維持不變。本報告提及的各項倡議旨在促進優質內容，而非審查有偏見或不實的內容。
## D. Polemical arguments
Double standards: you accuse RT and Sputnik of propaganda, yet Al-Jazeera, CNN, the BBC and France 24 do exactly the same thing.
雙重標準：你們指控 Russia Today（RT）和 Sputnik 進行政治宣傳，但 Al-Jazeera、CNN、BBC 及 France 24 做的也是同樣的事。
→ We are not talking about propaganda, but about information manipulation. Al-Jazeera, CNN, the BBC or France 24 contribute to the influence of Qatar, the United States, the United Kingdom or France, but these media outlets retain their editorial independence and respect professional journalistic standards. Furthermore, they do not resort to the methods frequently used by RT and Sputnik, such as the fabrication of facts and the falsification of documents, translations and interviews, the use of edited photos, or fake experts. It is these instances of information manipulation, and these alone, that we denounce; not the fact that these outlets have a particular point of view.
→ 我們談的不是政治宣傳，而是資訊操弄。Al-Jazeera、CNN、BBC 或 France 24 固然有助於卡達、美國、英國或法國的影響力，但這些媒體保有編輯獨立性，並遵守專業新聞標準。此外，它們不會使用 RT 和 Sputnik 常用的手法，例如捏造事實，偽造文件、翻譯與訪談內容，使用變造過的照片，或起用假專家。我們譴責的正是這些資訊操弄的實例，而且僅止於此；我們譴責的並不是這些媒體抱持特定觀點的事實。
The scapegoat argument: you blame Moscow for all of the Western world’s evils.
代罪羔羊論：你們把西方世界的一切弊病都歸咎於莫斯科。
→ Those actors who are behind information manipulation campaigns—and who are oftentimes easily identifiable—are not the source of our societies’ evils, but they do amplify them. They deliberately identify the fault-lines intrinsic to each society (religious and linguistic minorities, historical issues, inequality, separatist tendencies, racial tensions, etc.) and then seek to further polarize public opinion around these divisive issues.
→ 資訊操弄行動背後的行為者（往往不難辨識）並非我們社會弊病的根源，但他們確實放大了這些弊病。他們刻意找出每個社會固有的斷層線（宗教與語言少數群體、歷史爭議、不平等、分離主義傾向、種族緊張等），再試圖圍繞這些分歧議題進一步使輿論兩極化。
→ The fight against information manipulation must also take into account other actors, potential or known, who are likely to undertake information manipulation campaigns.
→ 打擊資訊操弄時，也必須將其他可能發動資訊操弄行動的潛在或已知行為者納入考量。
Your response proves that you take your citizens for fools who are unable to “think correctly.”
你們的回應證明，你們把公民當成無法「正確思考」的傻瓜。
→ Our approach does not involve any value judgment: our citizens are entirely free to make their own choices and form their own opinions. We are an open and pluralist society, and herein lies our strength. Nevertheless, our duty is to protect our democratic institutions and our national interests from hostile information manipulation as well as to foster the development of programs by civil society and public institutions, enabling citizens and young people, in particular, to fully exercise their critical thinking in the field of information.
→ 我們的做法不涉及任何價值判斷：公民完全可以自由做出選擇、形成自己的意見。我們是開放且多元的社會，而這正是我們的力量所在。儘管如此，我們有責任保護民主制度與國家利益免受敵意資訊操弄的侵害，並促進公民社會與公共機構發展相關計畫，使公民（尤其是年輕人）能在資訊領域充分發揮批判性思考。
You are not innocent: Western nations, and France in particular, did not hesitate to resort to state propaganda in the colonial context.
你們也不清白：西方國家，尤其是法國，在殖民脈絡下也毫不猶豫地訴諸國家宣傳。
→ Like all democracies, France is open to any discussion of its past behavior so long as that discussion is scientifically rigorous. This is the remit of historians who study and shall continue to study all the chapters of our national history. Today, we are faced with a new, specific challenge, which we must tackle not only by drawing upon the lessons of the past, but also by looking towards the future.
→ 與所有民主國家一樣，只要討論在科學上嚴謹，法國樂於面對關於自身過去作為的任何討論。這是歷史學家的職責，他們已經並將繼續研究我國歷史的每一個篇章。今日我們面對的是一項全新的特定挑戰，不僅須汲取過去的教訓，也須放眼未來加以因應。