Open Access
MATEC Web Conf.
Volume 232, 2018
2018 2nd International Conference on Electronic Information Technology and Computer Engineering (EITCE 2018)
Article Number: 02047
Number of pages: 6
Section: 3D Images Reconstruction and Virtual System
DOI: https://doi.org/10.1051/matecconf/201823202047
Published online: 19 November 2018
