(Thanks for the comments; I am updating my code and question. Maybe it will be useful for others.)
Update
I am doing several edit operations on one ontology, and at the end I realized the ontology is inconsistent. Now I need to get rid of all the unsatisfiable classes.
I searched a lot, but there is no ready-made code to resolve the inconsistencies.
Following the approach of this paper and the library of Matthew Horridge, and this, I am still struggling with the issue.
I have updated my code and question; this is my solution. Are there any further suggestions for it?
What I am doing to solve the inconsistencies:
Run a reasoner to find unsatisfiable classes
From all the unsatisfiable classes, find only the root unsatisfiable classes
Find explanation axioms for them
Rank them and select the axioms which should be deleted (or rewritten) by the user
In my code below (which reuses other existing solutions), I only need to implement the rank function (the remaining open issue in the inconsistency resolver).
For example, my ontology has 25 unsatisfiable classes, of which only 2 are the roots of the inconsistencies. These 2 root classes have 11 explanation axioms (which I get via getExplanations).
import java.io.File;
import java.util.Iterator;
import java.util.Set;
import org.semanticweb.owl.explanation.impl.rootderived.CompleteRootDerivedReasoner;
import org.semanticweb.owl.explanation.impl.rootderived.StructuralRootDerivedReasoner;
import org.semanticweb.owlapi.apibinding.OWLManager;
import org.semanticweb.owlapi.model.*;
import org.semanticweb.owlapi.reasoner.OWLReasoner;
import org.semanticweb.owlapi.reasoner.OWLReasonerFactory;
import com.clarkparsia.owlapi.explanation.BlackBoxExplanation;
import com.clarkparsia.owlapi.explanation.HSTExplanationGenerator;
import com.clarkparsia.pellet.owlapiv3.PelletReasonerFactory;
public class InconsistencyTest {
public static void main(String areg[]) throws Exception {
File fileM = new File("address\\to\\my\\ontology\\myOnt.owl");
OWLOntologyManager tempManager = OWLManager.createOWLOntologyManager();
OWLOntology ont = tempManager.loadOntologyFromOntologyDocument(fileM);
// run a reasoner
OWLReasonerFactory reasonerFactory = new PelletReasonerFactory();
OWLReasoner reasoner = reasonerFactory.createNonBufferingReasoner(ont);
if (reasoner.isConsistent()) {
// an ontology can be consistent, but have unsatisfiable classes.
if (reasoner.getUnsatisfiableClasses().getEntitiesMinusBottom().size() > 0) {
// means: the ontology is consistent but contains unsatisfiable classes
System.out.println("The ontology FAILED satisfiability test with Pellet reasoner. \n Unsatisfiable classes detected: "
+ reasoner.getUnsatisfiableClasses().getEntitiesMinusBottom().size());
// This line returns all unsatisfiable classes, but I do not need it
// Iterator<OWLClass> aList = reasoner.getUnsatisfiableClasses().getEntitiesMinusBottom().iterator();
// get the root unsatisfiable classes, and get their explanations
BlackBoxExplanation bb = new BlackBoxExplanation(ont, reasonerFactory, reasoner);
HSTExplanationGenerator multExplanator = new HSTExplanationGenerator(bb);
CompleteRootDerivedReasoner rdr = new CompleteRootDerivedReasoner(tempManager, reasoner, reasonerFactory);
for (OWLClass cls : rdr.getRootUnsatisfiableClasses()) {
System.out.println("**** ROOT! " + cls);
int maxEntailment = 5;
Set<Set<OWLAxiom>> exSet = multExplanator.getExplanations(cls, maxEntailment);
OWLAxiom deletedAxiom = rankAxiom(exSet);
//return these axioms to user to delete or rewrite them
}
} else {
System.out.println("The ontology PASSED the consistency test.");
}
} else {
System.out.println("The ontology FAILED the consistency test, please review the Axioms or debug using Protege");
}
reasoner.dispose();
}
}
Am I doing everything correctly? Are there any suggestions for ranking the axioms?
Many thanks.
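PS: for the rank function, one heuristic I am considering (my own assumption, not taken from the paper or library above) is to count how often each axiom occurs across the explanation sets and propose the most frequent one for deletion. A minimal sketch of such a method for the class above (it additionally needs java.util.HashMap and java.util.Map imports):
private static OWLAxiom rankAxiom(Set<Set<OWLAxiom>> explanations) {
    // count in how many explanation sets each axiom occurs
    Map<OWLAxiom, Integer> frequency = new HashMap<>();
    for (Set<OWLAxiom> explanation : explanations) {
        for (OWLAxiom axiom : explanation) {
            Integer count = frequency.get(axiom);
            frequency.put(axiom, count == null ? 1 : count + 1);
        }
    }
    // propose the axiom that occurs most often as the deletion/rewrite candidate
    OWLAxiom candidate = null;
    int best = 0;
    for (Map.Entry<OWLAxiom, Integer> entry : frequency.entrySet()) {
        if (entry.getValue() > best) {
            best = entry.getValue();
            candidate = entry.getKey();
        }
    }
    return candidate;
}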
Related
Can someone please point me to a complete example of Spring Data MongoDB bulk operations?
I am trying to switch to bulk updates using Spring Data MongoDB, but I am not able to find a good example.
Thank you.
BulkOperations in Spring Data MongoDB uses bulkWrite() from MongoDB.
From the MongoDB documentation ->
So when you want to update many entities with different updates in one query, you can do that via bulkOps.
Let us see an example, even though it may not be a perfect one. Let's say you have an Employee collection with the employees working in a company. After an appraisal there will be a change in salary for all employees, each employee's salary change will be different, and (pretending there is no percentage-wise hike involved) if you want to apply the changes in one go, you can use bulkOps.
import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.mongodb.core.BulkOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.util.Pair;
public class Example {
@Autowired
MongoTemplate mongoTemplate;
public int bulkUpdateEmployee(List<Pair<Query, Update>> updates){
return mongoTemplate.bulkOps(BulkOperations.BulkMode.UNORDERED, Employee.class, "employees").updateMulti(updates).execute().getModifiedCount();
}
}
Here we can prepare the pair of query and update, one for each employee:
- "query": the id of the employee is blabla
- "update": set salary to xxx
Sharing the code for bulk operations which worked for me
BulkOperations bulkOps = mongoTemplate.bulkOps(BulkMode.UNORDERED, Person.class);
for(Person person : personList) {
Query query = new Query().addCriteria(new Criteria("id").is(person.getId()));
Update update = new Update().set("address", "new Address as per requirement");
bulkOps.updateOne(query, update);
}
BulkWriteResult results = bulkOps.execute();
I think the following code is a simple example that anybody can understand.
Note: ensure that the custom Mongo repository is correctly configured.
@Autowired
MongoTemplate mongoTemplate;
public int bulkUpdate(String member)
{
Query query = new Query();
Criteria criteria=Criteria.where("column name").is(member);
query.addCriteria(criteria);
Update update = new Update();
update.set("column name",true);
return mongoTemplate.bulkOps(BulkOperations.BulkMode.UNORDERED, YourModelClass.class,"name of collection").updateMulti(query,update).execute().getModifiedCount();
}
There are some elegant ways to perform bulk operations in Spring Data MongoDB (see the linked reference).
An excerpt from the reference:
Starting in version 2.6, MongoDB servers support bulk write commands for insert, update, and delete in a way that allows the driver to implement the correct semantics for BulkWriteResult and BulkWriteException.
There are two types of bulk operations, ordered and unordered bulk operations.
Ordered bulk operations execute all the operations in order and error out on the first write error.
Unordered bulk operations execute all the operations and report any errors. Unordered bulk operations do not guarantee the order of execution.
Sample bulk operation covering most of the features
import com.mongodb.BasicDBObject;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.BulkWriteOptions;
import com.mongodb.client.model.DeleteOneModel;
import com.mongodb.client.model.InsertOneModel;
import com.mongodb.client.model.ReplaceOneModel;
import com.mongodb.client.model.UpdateOneModel;
import com.mongodb.client.model.Updates;
import com.mongodb.client.model.WriteModel;
import org.bson.Document;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.aggregation.Fields;
import org.springframework.stereotype.Repository;
import java.util.Arrays;
import java.util.Date;
import java.util.List;
import static org.springframework.data.mongodb.core.aggregation.Fields.UNDERSCORE_ID;
@Repository
public class BulkUpdateDemoRepository {
@Autowired
private MongoOperations mongoOperations;
public void bulkUpdate() {
MongoCollection<Document> postsCollection = mongoOperations.getCollection("posts");
//If no top-level _id field is specified in the documents, the Java driver automatically
// adds the _id field to the inserted documents.
Document document = new Document("postId", 1)
.append("name", "id labore ex et quam laborum")
.append("email", "Eliseo#gardner.biz")
.append("secondary-email", "Eliseo#gardner.biz")
.append("body", "laudantium enim quasi est");
List<WriteModel<Document>> list = Arrays.asList(
//Inserting the documents
new InsertOneModel<>(document),
//Adding a field in document
new UpdateOneModel<>(new Document(UNDERSCORE_ID, 3),
new Document().append("$set",
new BasicDBObject("postDate", new Date()))),
//Removing field from document
new UpdateOneModel<>(new Document(Fields.UNDERSCORE_ID, 4),
Updates.unset("secondary-email")),
//Deleting document
new DeleteOneModel<>(new Document(Fields.UNDERSCORE_ID, 2)),
//Replacing document
new ReplaceOneModel<>(new Document(Fields.UNDERSCORE_ID, 3),
new Document(Fields.UNDERSCORE_ID, 3)
.append("secondary-email", "Eliseo-updated#gardner.biz")));
//By default bulk Write operations is ordered because the BulkWriteOptions'
// ordered flag is true by default, by disabling that flag we can perform
// the Unordered bulk operation
//ordered execution
postsCollection.bulkWrite(list);
//2. Unordered bulk operation - no guarantee of order of operation - disabling
// BulkWriteOptions' ordered flag to perform the Unordered bulk operation
postsCollection.bulkWrite(list, new BulkWriteOptions().ordered(false));
}
}
Here is a piece of my ontology subclasses request, using Java 7 and the OWL API library:
import org.semanticweb.owlapi.reasoner.OWLReasoner;
import org.semanticweb.owlapi.reasoner.OWLReasonerFactory;
import org.semanticweb.owlapi.reasoner.ConsoleProgressMonitor;
import org.semanticweb.owlapi.reasoner.OWLReasonerConfiguration;
...
...
OWLReasonerFactory reasonerFactory = new StructuralReasonerFactory();
ConsoleProgressMonitor progressMonitor = new ConsoleProgressMonitor();
OWLReasonerConfiguration config = new SimpleConfiguration(progressMonitor);
OWLReasoner reasoner = reasonerFactory.createReasoner(myontology, config);
Set<OWLClass> subclasses = reasoner.getSubClasses(myClazz, true).getFlattened();
Here is my question:
Why does the set of subclasses returned by the OWLReasoner.getSubClasses(...) method contain all the subclasses of myClazz, but always also include the OWLClass with URI http://www.w3.org/2002/07/owl#Nothing? I have defined this class nowhere.
Thanks in advance.
owl:Nothing is the class defined to be a subclass of all classes in OWL, so it is included as a subclass of every satisfiable class (and it is equivalent to every unsatisfiable class).
To skip it during iteration, Node has a getEntitiesMinusBottom() method that leaves out owl:Nothing.
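For example, a minimal sketch (reusing the reasoner and myClazz from your snippet; Node and NodeSet come from org.semanticweb.owlapi.reasoner) that prints the subclasses without owl:Nothing:
NodeSet<OWLClass> subClasses = reasoner.getSubClasses(myClazz, true);
for (Node<OWLClass> node : subClasses) {
    // getEntitiesMinusBottom() leaves out owl:Nothing from each node
    for (OWLClass subclass : node.getEntitiesMinusBottom()) {
        System.out.println(subclass);
    }
}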
At w3.org owl semantics, you can find more info about the owl:Nothing class: https://www.w3.org/TR/2004/REC-owl-semantics-20040210/#owl_Nothing
I am using the FlatSpec trait to create my tests, and I would like to create a base class that would automatically tag any tests in that class with a particular tag.
For example, any tests in classes that inherit from the IntegrationTest class would automatically be tagged appropriately. So instead of:
class ExampleSpec extends FlatSpec {
"The Scala language" must "add correctly" taggedAs(IntegrationTest) in {
val sum = 1 + 1
assert(sum === 2)
}
}
I would like to do this and still have the test tagged as an IntegrationTest:
class ExampleSpec extends IntegrationSpec {
"The Scala language" must "add correctly" in {
val sum = 1 + 1
assert(sum === 2)
}
}
Thanks!
If you're willing to use a direct annotation on the test class, rather than a parent class, you can use the example at https://github.com/kciesielski/tags-demo. Adapted somewhat for your example, you need to declare a Java annotation:
package tags;
import java.lang.annotation.Retention;
import java.lang.annotation.Target;
import static java.lang.annotation.ElementType.METHOD;
import static java.lang.annotation.ElementType.TYPE;
import static java.lang.annotation.RetentionPolicy.RUNTIME;
@org.scalatest.TagAnnotation
@Retention(RUNTIME)
@Target({METHOD, TYPE})
public @interface MyAnnotation {
}
Then you use it to annotate the Scala test class:
@tags.MyAnnotation
class ExampleSpec extends FlatSpec {
"The Scala language" must "add correctly" in {
val sum = 1 + 1
assert(sum === 2)
}
}
You then have to use the actual string tags.MyAnnotation to specify the tag you want run (or ignored).
I tried to annotate a parent class instead, but I couldn't get it to work. I can imagine that being a significant problem for you or not, depending on what else you're trying to do.
Actually, the online docs for the org.scalatest.Tag class do a fair job of describing all this, although I say that after getting it to work by following the above project on GitHub.
Since ScalaTest 2.2.0 tags can be inherited (http://www.scalatest.org/release_notes/2.2.0).
Add @Inherited to your annotation definition.
package tags;
import java.lang.annotation.Inherited;
import java.lang.annotation.Retention;
import java.lang.annotation.Target;
import static java.lang.annotation.ElementType.METHOD;
import static java.lang.annotation.ElementType.TYPE;
import static java.lang.annotation.RetentionPolicy.RUNTIME;
@Inherited
@org.scalatest.TagAnnotation
@Retention(RUNTIME)
@Target({METHOD, TYPE})
public @interface RequiresIntegrationStuff {
}
Annotate your base spec.
@RequiresIntegrationStuff
class IntegrationSpec extends FlatSpec {}
Just use your base spec as a base class.
class ExampleSpec extends IntegrationSpec {
"The Scala language" must "add correctly" in {
val sum = 1 + 1
assert(sum === 2)
}
}
After that, ExampleSpec will be tagged as tags.RequiresIntegrationStuff.
You will find working project here: https://github.com/wojda/tags-demo (based on https://github.com/kciesielski/tags-demo from Spiro Michaylov's answer)
I basically have a large file with a few thousand names, each on a new line, in a .txt file. I am using Protege to build my ontology and I want a quicker way to insert these names as individuals into the concept 'Person' in my ontology. Is there any way this can be done using Protege or the OWL API? Clicking the add button in Protege, typing/copying each name, and then adding it to the 'Person' concept will take some time.
Thanks for any suggestions.
If using the OWL API, there is an example of how to do just this in the examples provided in the documentation:
public void shouldAddClassAssertion() throws OWLOntologyCreationException,
OWLOntologyStorageException {
// For more information on classes and instances see the OWL 2 Primer
// http://www.w3.org/TR/2009/REC-owl2-primer-20091027/#Classes_and_Instances
// In order to say that an individual is an instance of a class (in an
// ontology), we can add a ClassAssertion to the ontology. For example,
// suppose we wanted to specify that :Mary is an instance of the class
// :Person. First we need to obtain the individual :Mary and the class
// :Person Create an ontology manager to work with
OWLOntologyManager manager = OWLManager.createOWLOntologyManager();
OWLDataFactory dataFactory = manager.getOWLDataFactory();
// The IRIs used here are taken from the OWL 2 Primer
String base = "http://example.com/owl/families/";
PrefixManager pm = new DefaultPrefixManager(base);
// Get the reference to the :Person class (the full IRI will be
// <http://example.com/owl/families/Person>)
OWLClass person = dataFactory.getOWLClass(":Person", pm);
// Get the reference to the :Mary class (the full IRI will be
// <http://example.com/owl/families/Mary>)
OWLNamedIndividual mary = dataFactory
.getOWLNamedIndividual(":Mary", pm);
// Now create a ClassAssertion to specify that :Mary is an instance of
// :Person
OWLClassAssertionAxiom classAssertion = dataFactory
.getOWLClassAssertionAxiom(person, mary);
// We need to add the class assertion to the ontology that we want
// specify that :Mary is a :Person
OWLOntology ontology = manager.createOntology(IRI.create(base));
// Add the class assertion
manager.addAxiom(ontology, classAssertion);
// Dump the ontology to stdout
manager.saveOntology(ontology, new StreamDocumentTarget(
new ByteArrayOutputStream()));
}
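Adapting that idea to your names file, here is a minimal sketch (my own adaptation, not from the documentation; the file name names.txt, the base IRI, and the output file people.owl are assumptions, and names may need sanitizing to form valid IRIs):
public void addPersonsFromFile() throws Exception {
    OWLOntologyManager manager = OWLManager.createOWLOntologyManager();
    OWLDataFactory dataFactory = manager.getOWLDataFactory();
    String base = "http://example.com/owl/families/";
    PrefixManager pm = new DefaultPrefixManager(base);
    OWLClass person = dataFactory.getOWLClass(":Person", pm);
    OWLOntology ontology = manager.createOntology(IRI.create(base));
    // one name per line; blank lines are skipped
    for (String line : java.nio.file.Files.readAllLines(
            java.nio.file.Paths.get("names.txt"), java.nio.charset.StandardCharsets.UTF_8)) {
        String name = line.trim();
        if (name.isEmpty()) {
            continue;
        }
        OWLNamedIndividual individual = dataFactory.getOWLNamedIndividual(":" + name, pm);
        manager.addAxiom(ontology, dataFactory.getOWLClassAssertionAxiom(person, individual));
    }
    manager.saveOntology(ontology, IRI.create(new java.io.File("people.owl")));
}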
I have been googling for a while, and I am not sure whether Spring Data MongoDB supports bulk save.
I need to save a collection of documents into Mongo atomically: either all are saved or none are.
Can anyone share a link or some sample code for this?
When you do a save through the MongoDB Java driver you can only pass a single document to MongoDB.
When you do an insert, you can pass a single element or you can pass an array of elements. The latter is what will result in a "bulk insert" (i.e. a single insert command from the client results in multiple documents being inserted on the server).
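On the Spring Data side, a minimal sketch of such a batch insert (assuming a Person entity class with the shown constructor and an injected mongoTemplate; java.util.Arrays and java.util.List are implied) would be:
List<Person> people = Arrays.asList(new Person("Alice"), new Person("Bob"));
// one insert command carrying all the documents; still not atomic as a whole
mongoTemplate.insert(people, Person.class);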
However, since MongoDB does not support a notion of transaction, if one of the inserts fails there is no way to indicate that previously inserted documents should be deleted or rolled back.
For the purposes of atomicity, each document insert is a separate operation and there is no supported way to make MongoDB either insert all or none.
If this is something that your application requires, there may be other ways to achieve it:
- change your schema so that these are subdocuments of a single parent document (then there is technically only one "insert", of the parent document; see the sketch after this list)
- write the transaction semantics into your application code
- use a database which natively supports two phase commit transactions.
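As a rough illustration of the first option, a sketch (the collection name "batches" and the field names are assumptions, and it assumes a Spring Data MongoDB version where getCollection returns the driver's MongoCollection; Document is org.bson.Document) where all items are embedded in one parent document, so the whole batch is written by a single atomic insert:
Document parent = new Document("batchId", "batch-1")
        .append("items", Arrays.asList(
                new Document("name", "doc1"),
                new Document("name", "doc2")));
mongoTemplate.getCollection("batches").insertOne(parent);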
We have used Spring Data and the Mongo driver to copy data from one database server to another.
import com.mongodb.MongoBulkWriteException;
import com.mongodb.MongoClient;
import com.mongodb.MongoException;
import com.mongodb.bulk.BulkWriteResult;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.BulkWriteOptions;
import com.mongodb.client.model.InsertOneModel;
import com.mongodb.client.model.WriteModel;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.stereotype.Component;
import org.bson.Document;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.List;
import java.util.stream.Collectors;
@Component
public class DataCopy {
private static final Logger log = LoggerFactory.getLogger(DataCopy.class);
public void copyData(MongoTemplate sourceMongo,MongoTemplate destinationMongo ){
Class cls = EmployeeEntity.class;
String collectionName = sourceMongo.getCollectionName(cls);
MongoCollection<Document> collection = destinationMongo.getCollection(collectionName);
Query findQuery = new Query();
Criteria criteria = new Criteria();
criteria.andOperator(Criteria.where("firstName").is("someName"),
Criteria.where("lastName").is("surname"));
findQuery.addCriteria(criteria);
Pageable pageable = PageRequest.of(0, 10000);
findQuery.with(pageable);
List<?> pagedResult = sourceMongo.find(findQuery, cls);
while (!pagedResult.isEmpty()) {
try {
BulkWriteResult result = collection.bulkWrite(
pagedResult.
stream().map(d -> mapWriteModel(d, destinationMongo)).collect(Collectors.toList()),
new BulkWriteOptions().ordered(false));
} catch (Exception e) {
log.error("failed to copy", e);
}
pageable = pageable.next();
findQuery.with(pageable);
pagedResult = sourceMongo.find(findQuery, cls);
}
}
private WriteModel<? extends Document> mapWriteModel(Object obj,
MongoTemplate mongoTemplate
) {
Document document = new Document();
mongoTemplate.getConverter().write(obj, document);
return new InsertOneModel<>(document);
}
}
// Code example to create mongo templates for source and target databases
MongoClient targetClient = new MongoClient("databaseUri");
MongoTemplate destinationMongo = new MongoTemplate(targetClient, "databaseName");
Hope this is helpful to you.