# Wizard of Wikipedia: Knowledge-Powered Conversational Agents

###### tags: `筆記`, `NLP`, `ICLR 2019`

## Abstract

- Motivation: In open-domain dialogue, intelligent agents should exhibit the use of knowledge, but there are few convincing demonstrations of this to date. The paper addresses the challenge of creating agents that can engage in open dialogue grounded in knowledge, which has been difficult due to the lack of a supervised learning benchmark.
- Approach: The authors introduce a large dataset of conversations grounded in knowledge retrieved from Wikipedia, and design architectures capable of retrieving knowledge, reading and conditioning on it, and generating natural responses.

## Introduction

- Background: For humans to talk to machines, machines need to master several skills: comprehending language, employing memory to retain and recall knowledge, and reasoning about concepts together.
- Goal: The paper aims to build dialogue agents that can converse intelligently on open-domain topics by employing direct knowledge memory mechanisms.

## Methodology

- Models Introduced: The paper presents Transformer Memory Network architectures, which combine Memory Network architectures for knowledge retrieval with Transformer architectures for text representation.
- Key Features: These models can retrieve from a large memory, carefully read and attend over the retrieved set of knowledge, and then generate the next dialogue utterance.

## Experiments

- Datasets: [WoW](https://parl.ai/projects/wizard_of_wikipedia/), a supervised dataset of human-human conversations collected from crowd-sourced workers, comprising 201,999 utterances about diverse topics linked to Wikipedia articles.
- Metrics: The models' ability to conduct knowledgeable discussions is evaluated with both automatic metrics and human evaluations.
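The "read and attend over the retrieved knowledge" step above can be sketched as dot-product attention between an encoded dialogue context and each encoded candidate knowledge sentence. This is a minimal illustration only: the bag-of-words encoder, the function names, and the example sentences are stand-ins I made up, not the paper's actual Transformer encoder or data.

```python
import numpy as np

def embed(sentence, dim=16):
    # Toy bag-of-words encoder: average of deterministic pseudo-random
    # word vectors. Stands in for the paper's Transformer encoder.
    vecs = []
    for tok in sentence.lower().split():
        rng = np.random.default_rng(sum(ord(c) for c in tok))
        vecs.append(rng.standard_normal(dim))
    return np.mean(vecs, axis=0)

def attend_over_knowledge(context, knowledge):
    # Memory-network-style "read": score each candidate knowledge
    # sentence against the dialogue context, then softmax the scores
    # into attention weights over the memory.
    c = embed(context)
    K = np.stack([embed(k) for k in knowledge])
    scores = K @ c
    weights = np.exp(scores - scores.max())
    return weights / weights.sum()

# Hypothetical retrieved candidates for an example dialogue context.
knowledge = [
    "Toto is a rock band formed in Los Angeles.",
    "Blue cheese has a strong smell.",
]
weights = attend_over_knowledge("I love the band Toto", knowledge)
selected = knowledge[int(np.argmax(weights))]
```

In the paper's models, the attended knowledge representation is then fed, together with the dialogue context, into a Transformer decoder that generates the next utterance; that generation step is omitted here.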
## Takeaways

- Main Contribution: The paper tackles knowledge use in open-domain dialogue by collecting a large dataset and designing new architectures; the proposed Transformer Memory Network models perform well on both automatic metrics and human evaluations.
- Future Directions: Further research could explore improving retrieval efficiency and the accuracy of knowledge selection, and making the dialogue more natural and engaging.

> The contents shared herein are quoted verbatim from the original authors and are intended solely for personal note-taking and reference following a thorough reading. Any interpretation or annotation provided is strictly personal and does not claim to reflect the authors' intended meaning or context.