Ontologies, Ontology Development and Evaluation

A special issue of Algorithms (ISSN 1999-4893).

Deadline for manuscript submissions: closed (15 September 2021)

Special Issue Editors


Dr. Haridimos Kondylakis
Guest Editor
Institute of Computer Science, Foundation for Research and Technology-Hellas (FORTH), Science and Technology Park of Crete, N. Plasthra 100, Vassilika Vouton, GR 700 13 Heraklion, Greece
Interests: big data management; semantic interoperability; data series; information integration; AI

Dr. Nikolaos Papadakis
Guest Editor
Department of Electrical and Computer Engineering, Hellenic Mediterranean University, 71410 Estavromenos, Greece
Interests: databases; artificial intelligence; software engineering; semantic web; distributed algorithms; communication protocols; fault tolerance; multimedia databases; methods of automatic optimization of software; interactive verifiers

Special Issue Information

Dear Colleagues,

According to Gruber, an ontology is an explicit specification of a conceptualization. Ontologies play a key role in the development of the Semantic Web, as they attempt to represent entities, ideas and events, and they offer a way of semantically uplifting the available data, whose quality and value grow over time.

For this Special Issue of Algorithms, we invite authors to contribute articles on the design, implementation and use of ontologies that address real-world problems, as well as on methodologies guiding ontology development. Of particular interest are methodologies and systems for evaluating existing ontologies, leading to better-structured and higher-quality ontologies in the long term.

Dr. Haridimos Kondylakis
Dr. Nikolaos Papadakis
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Algorithms is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Ontology engineering
  • Ontology evaluation
  • Ontologies
  • Knowledge graphs

Published Papers (3 papers)


Research

19 pages, 2780 KiB  
Article
SENSE: A Flow-Down Semantics-Based Requirements Engineering Framework
by Kalliopi Kravari, Christina Antoniou and Nick Bassiliades
Algorithms 2021, 14(10), 298; https://doi.org/10.3390/a14100298 - 15 Oct 2021
Abstract
The processes involved in requirements engineering are some of the most, if not the most, important steps in systems development. The need for well-defined requirements remains a critical issue for the development of any system. Describing the structure and behavior of a system can prove vague, leading to uncertainties, restrictions, or improper functioning of the system that would be hard to fix later. In this context, this article proposes SENSE, a framework based on standardized expressions of natural language with well-defined semantics, called boilerplates, that supports a flow-down procedure for requirement management. The framework integrates sets of boilerplates and proposes the most appropriate of them depending, among other considerations, on the type of requirement and the system under development, while providing validity and completeness verification checks using the minimum consistent set of formalities and languages. SENSE is a consistent and easily understood framework that allows engineers to use formal languages and semantics rather than traditional natural language and machine learning techniques, optimizing requirements development. The main aim of SENSE is to provide a complete process for the production and standardization of requirements by using semantics, ontologies, and appropriate NLP techniques. Furthermore, SENSE performs the necessary verifications by using SPARQL (SPIN) queries to support requirement management.
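The abstract refers to SPIN/SPARQL verification checks over boilerplate-structured requirements without reproducing the queries themselves. Purely as an illustration of that style of check, the following Python sketch (using rdflib and a hypothetical req: vocabulary, neither of which is taken from the paper) reports requirements that leave a mandatory boilerplate slot unfilled.

```python
# Minimal sketch (not SENSE's actual queries): flag requirements that are
# missing a mandatory boilerplate slot, using rdflib's SPARQL support.
from rdflib import Graph, Namespace, Literal, RDF

REQ = Namespace("http://example.org/req#")  # hypothetical vocabulary

g = Graph()
g.bind("req", REQ)
# Two toy requirements: only the first fills the mandatory <system> slot.
g.add((REQ.R1, RDF.type, REQ.Requirement))
g.add((REQ.R1, REQ.hasSystem, Literal("payment service")))
g.add((REQ.R2, RDF.type, REQ.Requirement))

incomplete = g.query("""
    PREFIX req: <http://example.org/req#>
    SELECT ?r WHERE {
        ?r a req:Requirement .
        FILTER NOT EXISTS { ?r req:hasSystem ?s }
    }
""")
for row in incomplete:
    print(f"Incomplete requirement: {row.r}")  # reports req:R2
```

SENSE itself expresses such checks as SPIN queries attached to its ontology; the sketch only conveys the underlying pattern of querying for missing slots.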

19 pages, 2942 KiB  
Article
Property-Based Semantic Similarity Criteria to Evaluate the Overlaps of Schemas
by Lan Huang, Yuanwei Zhao, Bo Wang, Dongxu Zhang, Rui Zhang, Subhashis Das, Simone Bocca and Fausto Giunchiglia
Algorithms 2021, 14(8), 241; https://doi.org/10.3390/a14080241 - 17 Aug 2021
Abstract
Knowledge graph-based data integration is a practical methodology for building integrated services over heterogeneous legacy databases. However, it is neither efficient nor economical to build a new cross-domain knowledge graph on top of the schemas of each legacy database for a specific integration application rather than reusing existing high-quality knowledge graphs. Consequently, a question arises as to whether an existing knowledge graph is compatible with cross-domain queries and with the heterogeneous schemas of the legacy systems. An effective criterion is urgently needed to evaluate such compatibility, as it sets an upper bound on the quality of the integration. This research studies the semantic similarity of schemas from the perspective of their properties. It provides a set of in-depth criteria, namely coverage and flexibility, to evaluate the pairwise compatibility between schemas. It takes advantage of the properties of knowledge graphs to evaluate the overlaps between schemas and defines weights for entity types in order to perform precise compatibility computation. The effectiveness of the criteria in evaluating the compatibility between knowledge graphs and cross-domain queries is demonstrated using a case study.
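The abstract names coverage and flexibility as property-based criteria but does not state their formulas. As a rough sketch only, assuming a schema can be modelled as a map from entity types to property sets and that entity types carry weights (this representation and metric are illustrative, not the paper's definitions), a weighted property-overlap coverage score could look like this:

```python
# Illustrative only: a schema is modelled as {entity type: set of property
# names}, each entity type carries a weight, and "coverage" is the weighted
# fraction of the query schema's properties found in the KG schema.
from typing import Dict, Set

def coverage(kg_schema: Dict[str, Set[str]],
             query_schema: Dict[str, Set[str]],
             weights: Dict[str, float]) -> float:
    total, covered = 0.0, 0.0
    for etype, props in query_schema.items():
        w = weights.get(etype, 1.0)
        total += w * len(props)
        covered += w * len(props & kg_schema.get(etype, set()))
    return covered / total if total else 0.0

kg = {"Person": {"name", "birthDate", "worksFor"}, "Organization": {"name"}}
query = {"Person": {"name", "email"}, "Organization": {"name", "address"}}
print(coverage(kg, query, {"Person": 2.0, "Organization": 1.0}))  # 0.5
```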

21 pages, 4274 KiB  
Article
Ontology Based Governance for Employee Services
by Eleftherios Tzagkarakis, Haridimos Kondylakis, George Vardakis and Nikolaos Papadakis
Algorithms 2021, 14(4), 104; https://doi.org/10.3390/a14040104 - 25 Mar 2021
Abstract
Advances in computers and communications have significantly changed almost every aspect of our daily activity. In this maze of change, governments around the world cannot remain indifferent. Public administration is evolving and taking on a new form through e-government. A large number of organizations have set up websites, establishing an online interface with the citizens and businesses with which they interact. However, most organizations, especially the decentralized agencies of the ministries and the local authorities, do not offer their information electronically, even though they provide many information services that are not integrated with other e-government services. Moreover, these services are mainly focused on serving citizens and businesses and less on providing services to employees. In this paper, we describe the process of developing an ontology to support the administrative procedures of decentralized government organizations. Finally, we describe the development of an e-government portal that provides employees with services that are processed online, using the above ontology for modeling and data management.
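The abstract does not list the classes or properties of the ontology itself. As a purely hypothetical illustration of the kind of vocabulary an employee-services portal could build on (the egov: names below are invented, not taken from the paper), a minimal RDFS/OWL model might be sketched in Python with rdflib as follows:

```python
# Illustrative sketch only: a tiny RDFS/OWL vocabulary for employee services,
# serialized as Turtle, of the kind such a portal could use for modelling.
from rdflib import Graph, Namespace, RDF, RDFS, OWL

EGOV = Namespace("http://example.org/egov#")  # hypothetical namespace
g = Graph()
g.bind("egov", EGOV)

# Core classes: employees and the administrative procedures they initiate.
for cls in (EGOV.Employee, EGOV.AdministrativeProcedure, EGOV.LeaveRequest):
    g.add((cls, RDF.type, OWL.Class))
g.add((EGOV.LeaveRequest, RDFS.subClassOf, EGOV.AdministrativeProcedure))

# An employee submits an administrative procedure online.
g.add((EGOV.submits, RDF.type, OWL.ObjectProperty))
g.add((EGOV.submits, RDFS.domain, EGOV.Employee))
g.add((EGOV.submits, RDFS.range, EGOV.AdministrativeProcedure))

print(g.serialize(format="turtle"))
```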
