After the Ontology Summit 2013 hackathon

In this post I am going to briefly present the outcomes of the Ontology Summit 2013 Hackathon. As I said in my previous post, OOPS! was involved in two projects: “HC-03 Evaluation of OOPS!, OQuaRE and Other Tools for FIBO Ontologies” and “HC-07 Ontohub-OOR-OOPS! Integration”.

During the first project, we scanned a merged version of the FIBO OWL ontologies with OOPS! and analysed and discussed every pitfall detected. After this process, the FIBO development team established that “most of the metrics were ones we would want to apply to the FIBO Business Conceptual Ontologies, not just operational ontologies.” The next steps were to also apply OQuaRE and OntoQA metrics to the FIBO ontologies. Finally, we took another working day to determine how to apply OQuaRE characteristics to the FIBO ontologies and map them to OntoQA metrics and OOPS! pitfalls. This set of slides summarises the good and intensive work that the HC-03 team carried out during the project, which was awarded the “First IAOA Best OntologySummit Hackathon-Clinic Prize” during the Ontology Summit 2013 Symposium.
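
As an illustration of the kind of preprocessing involved, here is a minimal sketch of merging several OWL modules into a single file with Python's rdflib before handing the result to OOPS!; the module URLs below are placeholders, not the actual FIBO module list.

```python
from rdflib import Graph

# Placeholder URLs for illustration only; the real FIBO module list differs.
fibo_modules = [
    "https://example.org/fibo/module-a.owl",
    "https://example.org/fibo/module-b.owl",
]

# Parse every module into one graph, so the scanner sees a single
# merged ontology rather than many separate files.
merged = Graph()
for url in fibo_modules:
    merged.parse(url, format="xml")  # modules published as RDF/XML

merged.serialize(destination="fibo-merged.owl", format="xml")
print(f"Merged {len(fibo_modules)} modules into {len(merged)} triples")
```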

During the second project, OOPS! was integrated into the Ontohub web interface and an API for the Ontohub-OOR integration was proposed. The great work, mainly done by the Ontohub development team, is summarised in these slides. The OOPS! team's role in this project was mostly to support the Ontohub-OOPS! integration, providing details about the OOPS! RESTful web service when needed.
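
For reference, this is roughly what a call to the OOPS! RESTful web service looks like. It is a minimal sketch assuming the endpoint URL and XML request format the service documented at the time (both may have changed since), with a placeholder ontology URL:

```python
import requests  # third-party HTTP client (pip install requests)

# Endpoint and request format as documented for the OOPS! REST service
# at the time; treat both as assumptions that may have changed.
OOPS_ENDPOINT = "http://oops.linkeddata.es/rest"

REQUEST_BODY = """<?xml version="1.0" encoding="UTF-8"?>
<OOPSRequest>
  <OntologyUrl>http://www.example.org/ontology.owl</OntologyUrl>
  <OntologyContent></OntologyContent>
  <Pitfalls></Pitfalls>
  <OutputFormat>XML</OutputFormat>
</OOPSRequest>"""

response = requests.post(
    OOPS_ENDPOINT,
    data=REQUEST_BODY.encode("utf-8"),
    headers={"Content-Type": "application/xml"},
    timeout=300,  # scanning a large ontology can take a while
)
response.raise_for_status()
print(response.text)  # XML report listing the pitfalls detected, if any
```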

Finally, here is an example of an ontology analysed with OOPS! within the Ontohub portal. In the first screenshot there is a “Test with OOPS!” button, shown before the ontology is scanned.

Example ontology before being scanned

While OOPS! is scanning the ontology, the Ontohub interface shows the status information “OOPS State: pending” as in this screenshot:

Example of ontology during the scanning process

When the process is done, the number of pitfalls detected, if any, is displayed (“5 responses” in this example), and an explanation of each is provided when clicking on the ontology element affected by the pitfall, as shown in the last screenshot:

Example of results for an object property
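
The interaction follows a simple asynchronous pattern: trigger the scan, poll while the state is “pending”, then read the responses. The sketch below illustrates that pattern only; the Ontohub routes and JSON fields used here are hypothetical, invented for illustration, and do not reflect Ontohub's actual API.

```python
import time
import requests

# Hypothetical base URL and route, invented purely for illustration.
ONTOHUB = "https://ontohub.example.org"
ONTOLOGY = "my-ontology"

# 1. Trigger the evaluation (what the "Test with OOPS!" button does).
requests.post(f"{ONTOHUB}/ontologies/{ONTOLOGY}/oops", timeout=30)

# 2. Poll while the scan is still running ("OOPS State: pending").
while True:
    status = requests.get(
        f"{ONTOHUB}/ontologies/{ONTOLOGY}/oops", timeout=30
    ).json()
    if status["state"] != "pending":
        break
    time.sleep(5)

# 3. Read the responses, one per pitfall affecting an ontology element.
print(f'{len(status["responses"])} responses')
```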

Finally, these and other results from the Ontology Summit were presented at the Ontology Summit 2013 Symposium together with the Ontology Summit 2013 Communiqué.

Getting involved in the Ontology Summit 2013 hackathon

The Ontolog Forum is “an open, international, virtual community of practice devoted to advancing the field of ontology, ontological engineering and semantic technology, and advocating their adoption into mainstream applications and international standards”. The forum was reconstituted in 2002 and has organized an annual series of events called Ontology Summits since 2006. An Ontology Summit is “an organized thinking machine that works from January to April every year to brainstorm a topic of interest for the ontology engineering community” (see source). This year’s summit topic is “Ontology Evaluation Across the Ontology Lifecycle”.

Fortunately, I was invited to give a talk at the “Intrinsic Aspects of Ontology Evaluation: Practice and Theory” session about the work done in OOPS! (OntOlogy Pitfall Scanner!). At the end of the session, the Ontology Summit organizers invited all the participants to get involved in the hackathon they were planning to carry out. At that moment there was little information about it; indeed, it was more of an imprecise plan for an event like a “hackathon”, without a clear idea of when, who, or how… but with the definite aim of creating something ‘real’ and ‘useful’ for ontology evaluation. So we accepted the invitation… or the challenge?

Soon we had some more information. There were three types of projects:

  • “Hackathon”: its goal is to create new code, a new API, or a new ontology relevant to this Ontology Summit and/or this year’s “Ontology Evaluation” theme.
  • “Ontology Evaluation Clinic” (abbrev. “Ontology Clinic”): aims at evaluating ontologies or gold standard ontologies with an “evaluation tool”, studying the results, diagnosing problems with the ontology, and seeing how the ontology, and the tool, may be improved.
  • “Ontology-based Application Evaluation Clinic” (abbrev. “Application Clinic”): helps users evaluate whether the ontologies they already have in mind are fit for the intended purpose and whether the quality of those ontologies is satisfactory, and provides appropriate recommendations.

Participants had to write a proposal for the type of project they were interested in. As a result, 8 hackathon projects, 4 ontology clinics and 3 application clinics were proposed. After aligning proposals and schedule restrictions, 7 projects were selected to be carried out over the three selected weekends. Finally, OOPS! got involved in two of them, one ontology clinic and one hackathon project. The first, “Evaluation of OOPS!, OQuaRE and Other Tools for FIBO Ontologies”, aims to explore the application of ontology quality measures to ontologies produced under the Financial Industry Business Ontology (FIBO) umbrella, while “Ontohub-OOR-OOPS! Integration” aims at integrating OOPS! into the Ontohub and OOR ontology repositories. These two projects will take place on the 13th of April 2013.

Now it is time to do some real work and get some tangible outcomes. Results… in the next posts.