Open Access
MATEC Web Conf.
Volume 76, 2016
20th International Conference on Circuits, Systems, Communications and Computers (CSCC 2016)
Article Number 02041
Number of page(s) 7
Section Systems
Published online 21 October 2016
  1. Y. Bui & J. Park, An assessment of metadata quality: a case study of the National Science Digital Library Metadata Repository, In Haidar Moukdad (Ed.) CAIS/ACSI 2006 Information Science Revisited: Approaches to Innovation. Proceedings of the 2005 annual conference of the Canadian Association for Information Science held with the Congress of the Social Sciences and Humanities of Canada at York University, Toronto, Ontario (2005) [Google Scholar]
  2. N. Fuhr, G. Tsakonas, T. Aalberg, M. Agosti, P. Hansen, S. Kapidakis, C.-P. Klas, L. Kovács, M. Landoni, A. Micsik, C. Papatheodorou, C. Peters & I. Sølvberg, Evaluation of Digital Libraries, International Journal on Digital Libraries, Springer-Verlag, vol. 8, no. 1, pp. 21–38 (2007) [CrossRef] [Google Scholar]
  3. B. Hughes, Metadata quality evaluation: experience from the open language archives community, Berlin: Springer. Lecture Notes in Computer Science vol. 3334. ISBN 978-3-540-24030-3. doi: 10.1007/b104284 (2005) [Google Scholar]
  4. S. Kapidakis, Comparing Metadata Quality in the Europeana Context, Proceedings of the 5th ACM international conference on PErvasive Technologies Related to Assistive Environments (PETRA 2012), Heraklion, Greece, June 6-8 2012, ACM International Conference Proceeding Series; vol. 661 (2012) [Google Scholar]
  5. S. Kapidakis, Rating Quality in Metadata Harvesting, Proceedings of the 8th ACM international conference on PErvasive Technologies Related to Assistive Environments (PETRA 2015), Corfu, Greece, July 1-3 2015, ACM International Conference Proceeding Series; ISBN 978-1-4503-3452-5 (2015) [Google Scholar]
  6. S. Kapidakis, Exploring Metadata Providers Reliability and Update Behavior, Proceedings of the International Conference on Theory and Practice of Digital Libraries (TPDL 2016), September 5-9, to appear (2016) [Google Scholar]
  7. C. Lagoze, D. Krafft, T. Cornwell, N. Dushay, D. Eckstrom & J. Saylor, Metadata aggregation and “automated digital libraries”: a retrospective on the NSDL experience, Proceedings of the 6th ACM/IEEE-CS joint conference on Digital libraries (JCDL 06), pp. 230–239 (2006) [Google Scholar]
  8. B. L. Moreira, M. A. Goncalves, A. H. F. Laender & E. A. Fox, Automatic evaluation of digital libraries with 5SQual, Journal of Informetrics, vol. 3, no. 2, pp. 102–123 (2009) [CrossRef] [Google Scholar]
  9. X. Ochoa & E. Duval, Automatic evaluation of metadata quality in digital repositories, International Journal on Digital Libraries, vol. 10(2/3), pp. 67–91 (2009) [CrossRef] [Google Scholar]
  10. J. Ward, A quantitative analysis of unqualified dublin core metadata element set usage within data providers registered with the open archives initiative, Proceedings of the 3rd ACM/IEEE-CS joint conference on Digital libraries (JCDL 03), ISBN:0-7695-1939-3, pp. 315–317 (2003) [Google Scholar]
  11. Y. Zhang, Developing a holistic model for digital library evaluation, Journal of the American Society for Information Science and Technology, vol. 61, no. 1, pp. 88–110 (2010) [Google Scholar]