Semantic research sets world standards

(PhysOrg.com) -- European researchers have created new tools for semantic technology development which are helping to shape the next generation of official standards. The tools also clear some key bottlenecks in semantic technology.

The next generation of the World Wide Web will be a cyberspace full of meaning, thanks to the semantic technologies currently rolling out. Semantic technology creates labels for web-page elements that machines can read and ‘understand’ on their own.

This will have a huge impact on the quality and range of accessible information. Right now, if you type in the search term ‘fruit’, you will get a list of pages where this term appears. But in the semantic web, you would get lists of pages with apples, oranges, pineapples and everything else relevant. You don’t need to type every single relevant term when the computer ‘knows’ the meaning of the word ‘fruit’.

Better still, you would also receive listings of pictures, videos and audio which are relevant to ‘fruit’, even if that word never appears directly in the title. This is the power of machine-readable data accessible from your browser. It overcomes the problem of implicit information, which is usually hidden from the machine.
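
To make that concrete, here is a minimal sketch, in Python with the rdflib library, of how a machine-readable ‘fruit’ label lets a machine retrieve relevant resources that never mention the word itself. The tiny class hierarchy, namespace and resource names are invented for illustration and are not from the project.

```python
# A minimal sketch of semantic retrieval using the Python rdflib library.
# The 'fruit' hierarchy and resource names below are illustrative only.
from rdflib import Graph, Namespace, Literal, RDF, RDFS

EX = Namespace("http://example.org/")
g = Graph()

# Machine-readable labels: Apple and Orange are declared kinds of Fruit.
g.add((EX.Apple, RDFS.subClassOf, EX.Fruit))
g.add((EX.Orange, RDFS.subClassOf, EX.Fruit))

# Web resources (a page, a photo) tagged with those labels.
g.add((EX.page1, RDF.type, EX.Apple))
g.add((EX.photo1, RDF.type, EX.Orange))
g.add((EX.photo1, RDFS.label, Literal("orchard-harvest.jpg")))

# Ask for everything that is a Fruit: the subclass path lets the machine
# return the apple and orange resources even though 'Fruit' never appears on them.
results = g.query("""
    PREFIX ex: <http://example.org/>
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    SELECT ?resource WHERE {
        ?resource a/rdfs:subClassOf* ex:Fruit .
    }
""")
for row in results:
    print(row.resource)
```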

The semantic web has been the primary focus of a large portion of web development over the past five years. But as the range and ambition of semantic research expand, the technology and tools used to develop it are finding it hard to keep up, because the ontologies at the heart of the semantic web are becoming larger, more complex and more demanding.

Ontologies are large dictionaries of machine-readable labels defining every aspect of a specific domain, such as medicine or engineering. These domains can be populated with further sub-ontologies, such as neurology or mechanical engineering.
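
As a rough illustration of what such a dictionary looks like to a machine, the snippet below defines a tiny, made-up ‘medicine’ ontology in Turtle and loads it with the Python rdflib library; the terms and the namespace are illustrative only, not a real medical ontology.

```python
# A minimal sketch of an ontology as a machine-readable dictionary, written in
# Turtle and loaded with the Python rdflib library. The medical terms and the
# http://example.org/med# namespace are invented for illustration.
from rdflib import Graph

ONTOLOGY = """
@prefix med:  <http://example.org/med#> .
@prefix owl:  <http://www.w3.org/2002/07/owl#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .

med:Medicine  a owl:Class ;
    rdfs:label   "Medicine" ;
    rdfs:comment "Top-level domain covering all medical knowledge." .

# A sub-ontology populates a narrower part of the same domain.
med:Neurology a owl:Class ;
    rdfs:subClassOf med:Medicine ;
    rdfs:label   "Neurology" ;
    rdfs:comment "Sub-domain dealing with the nervous system." .
"""

g = Graph()
g.parse(data=ONTOLOGY, format="turtle")
print(f"Loaded {len(g)} statements")  # every label and definition is now machine-readable
```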

Multiple, large-scale and complex

“The increased demand for multiple, large-scale and complex ontologies poses novel challenges on all ontology tasks, such as their design, maintenance, merging, and integration,” explains Diego Calvanese of the Free University of Bozen-Bolzano, coordinator of the TONES project.

The TONES project set out to develop a series of tools to make the development, management, integration and operation of large ontologies much simpler.

“The starting point of TONES was to develop a logical formalisation of ontologies. And using this logical formalisation, we can allow machines to understand and reason with the knowledge that is represented [ontologically],” he notes.

Using a formal logic means that a machine can understand how two or more different terms relate to each other in a given ontology, and this can provide enormous benefits in ontology management.
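
A toy example of what such reasoning buys: given explicit axioms relating terms, a machine can derive relations that were never written down. The Python sketch below uses invented terms and a deliberately simplified notion of entailment, not the description logics actually used in the project.

```python
# A minimal sketch of what a logical formalisation buys: from explicit axioms
# relating terms, a machine can derive relations that were never stated.
# The axioms below are made-up illustrations, not TONES ontologies.

axioms = {
    ("Neurologist", "Physician"),   # every neurologist is a physician
    ("Physician",   "Clinician"),   # every physician is a clinician
}

def entails(sub: str, sup: str, axioms: set) -> bool:
    """Check whether 'every sub is a sup' follows from the axioms by transitivity."""
    if sub == sup or (sub, sup) in axioms:
        return True
    return any(mid == sub and entails(nxt, sup, axioms) for mid, nxt in axioms)

# Never written down, but logically entailed:
print(entails("Neurologist", "Clinician", axioms))   # True
```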

For example, very often ontologies are merged when the work of one group is integrated with the work of another, or when two ontologies are combined for a new task. This exercise is fraught with risk, however, because often terms can conflict or lead to redundancies in the system.

Semi-automatic conflict detection

By using logical formalisation, however, a computer can semi-automatically detect potential conflicts or repetitions, and in fact TONES developed a ‘debugging’ tool that performs this function. It makes the management and integration of ontologies much simpler, much faster and more reliable.
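
The following Python sketch, using the rdflib library and invented food-classification terms, illustrates the kind of clash such a debugging tool can flag when two ontologies are pooled; it is an illustration of the idea, not the TONES tool itself.

```python
# A minimal sketch of semi-automatic conflict detection when two ontologies are
# merged, using the Python rdflib library. Terms and namespaces are illustrative.
from rdflib import Graph, Namespace
from rdflib.namespace import OWL, RDFS

EX = Namespace("http://example.org/food#")

# Ontology A: a tomato is a fruit; fruit and vegetable are disjoint.
a = Graph()
a.add((EX.Tomato, RDFS.subClassOf, EX.Fruit))
a.add((EX.Fruit, OWL.disjointWith, EX.Vegetable))

# Ontology B (developed by another group): a tomato is a vegetable.
b = Graph()
b.add((EX.Tomato, RDFS.subClassOf, EX.Vegetable))

# Naive merge: pool all statements into one graph.
merged = Graph()
for triple in a:
    merged.add(triple)
for triple in b:
    merged.add(triple)

# Flag every class that sits under two classes declared disjoint.
for cls1, _, cls2 in merged.triples((None, OWL.disjointWith, None)):
    for child, _, _ in merged.triples((None, RDFS.subClassOf, cls1)):
        if (child, RDFS.subClassOf, cls2) in merged:
            print(f"Conflict: {child} is placed under both {cls1} and {cls2}")
```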

Merging ontologies, particularly large ones, is another common and very difficult task, but new tools developed by the EU-funded TONES project make it much easier. The group also developed a modularisation tool to break a large ontology into distinct parts.
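
To give a flavour of modularisation, the sketch below pulls out the fragment of a toy ontology that a chosen term depends on by following subclass links. Real module-extraction techniques are considerably more sophisticated, and the engineering terms and namespace here are invented.

```python
# A rough sketch of the idea behind modularisation: extract the part of a
# large ontology that a chosen term depends on. This simply follows subclass
# edges; it is not the TONES modularisation tool.
from rdflib import Graph, Namespace
from rdflib.namespace import RDFS

EX = Namespace("http://example.org/eng#")

big = Graph()
big.add((EX.GearBox, RDFS.subClassOf, EX.MechanicalComponent))
big.add((EX.MechanicalComponent, RDFS.subClassOf, EX.EngineeringArtifact))
big.add((EX.Resistor, RDFS.subClassOf, EX.ElectricalComponent))  # unrelated branch

def extract_module(graph: Graph, seed) -> Graph:
    """Collect every statement reachable from 'seed' by following subclass edges."""
    module, frontier, seen = Graph(), [seed], set()
    while frontier:
        term = frontier.pop()
        if term in seen:
            continue
        seen.add(term)
        for s, p, o in graph.triples((term, None, None)):
            module.add((s, p, o))
            frontier.append(o)
    return module

mechanical = extract_module(big, EX.GearBox)
print(len(mechanical))   # 2 statements: the Resistor branch is left out
```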

Standards, or the lack of them, have been a big stumbling block in semantic development, so TONES worked extensively on this problem. That work laid the foundations for the new official standard language, called OWL 2. Like HTML, OWL 2 is ultimately governed by the World Wide Web Consortium (W3C).

“What the TONES project has done is not aimed at the user sitting at home [with] an internet browser,” Calvanese states. “It is more directed to the design of complex applications. So what TONES has been doing is developing technologies that can be taken up and used to make end-user tools.”

This is not the first time these types of tools have been developed, but Calvanese suggests earlier attempts struggled to keep pace as the field evolved.

“The earlier technology did not scale,” he reveals. “It could not handle large ontologies and large amounts of data accessed through them. Ontologies are becoming more and more numerous and people are building larger and larger domains, with more definitions in them. TONES developed new algorithms and new techniques to deal with large ontologies.”

The team has also tested its work on real-world ontologies, and the early results are promising, though testing is ongoing. Nonetheless, technologies developed in TONES have already appeared in several commercial and open-source applications.

Currently, the team is looking at ways to extend the work of TONES. In the meantime, European research is setting the standard for ontology development and management.

More information: TONES project

Provided by ICT Results

