Exporting data from an ontology - Protégé

How do I export data from an ontology to an Excel sheet using Protégé?
For example, I want a table with two columns:
Class rdf:ID, Super Class rdf:ID.

Solved! From the SPARQL Query tab,
type the query:
SELECT ?subject ?object
WHERE { ?subject rdfs:subClassOf ?object }
then execute it and copy the result directly into an Excel sheet :)
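If you'd rather script the export, here is a minimal sketch using the rdflib Python library, assuming the ontology is saved locally as ontology.owl (a hypothetical file name); it runs the same query and writes a CSV that Excel opens directly:

import csv
from rdflib import Graph

g = Graph()
g.parse("ontology.owl")  # hypothetical file name; rdflib guesses the format

results = g.query("""
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    SELECT ?subject ?object
    WHERE { ?subject rdfs:subClassOf ?object }
""")

# Write the two columns (Class, Super Class) to a CSV file Excel can open
with open("classes.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Class", "Super Class"])
    for subject, obj in results:
        writer.writerow([subject, obj])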

Related

Relation IDs mismatch - Mapping OWL to Oracle DB with Ontop

As part of my little app, I'm trying to map data between my ontology and an Oracle DB with Ontop, but my first mapping is not accepted by the reasoner and it's not clear why.
My first attempt uses the following target:
:KIS/P_PVPAT_PATIENT/{PPVPAT_PATNR} a :Patient .
and the following source:
select * from P_PVPAT_PATIENT
Here KIS is the schema, P_PVPAT_PATIENT the table, and PPVPAT_PATNR the key. The error is:
Caused by: it.unibz.inf.ontop.exception.InvalidMappingSourceQueriesException:
Error: Relation IDs mismatch: P_PVPAT_PATIENT v "KIS"."P_PVPAT_PATIENT" P_PVPAT_PATIENT
Problem location: source query of triplesMap
[id: MAP_PATIENT
target atoms: triple(s,p,o) with
s/RDF(http://www.semanticweb.org/grossmann/ontologies/kis-ontology#KIS/P_PVPAT_PATIENT/{}(TmpToVARCHAR2(PPVPAT_PATNR)),IRI), p/<http://www.w3.org/1999/02/22-rdf-syntax-ns#type>, o/<http://www.semanticweb.org/grossmann/ontologies/kis-ontology#Patient>
source query: select * from P_PVPAT_PATIENT]
As the error says, my source query was wrong because I forgot to include the schema in my SQL.
The correct SQL is:
select * from kis.P_PVPAT_PATIENT
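Putting it together, the working mapping keeps the target unchanged and only adds the schema to the source:

:KIS/P_PVPAT_PATIENT/{PPVPAT_PATNR} a :Patient .
select * from kis.P_PVPAT_PATIENT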

Load unmapped tables in Symfony with Doctrine

I have tables in my database that are not managed by Symfony; there are no entities for these tables. They come from another application; I import them and use Symfony to generate statistics from the data in those tables.
How do I access them?
Can I use Doctrine and a regular repository for this?
I only want to read data, not update it.
Right now I'm using plain mysqli_connect and mysqli_query, but that just doesn't feel right in Symfony 5.
You should just be able to query with raw SQL. The following example comes straight from the docs:
// src/Repository/ProductRepository.php
namespace App\Repository;

use App\Entity\Product;
use Doctrine\Bundle\DoctrineBundle\Repository\ServiceEntityRepository;
use Doctrine\Persistence\ManagerRegistry;

class ProductRepository extends ServiceEntityRepository
{
    public function __construct(ManagerRegistry $registry)
    {
        parent::__construct($registry, Product::class);
    }

    public function findAllGreaterThanPrice(int $price): array
    {
        $conn = $this->getEntityManager()->getConnection();

        $sql = '
            SELECT * FROM product p
            WHERE p.price > :price
            ORDER BY p.price ASC
        ';

        // executeQuery() returns a Result; in DBAL 3, execute() no longer returns rows
        $stmt = $conn->prepare($sql);
        $result = $stmt->executeQuery(['price' => $price]);

        // returns an array of arrays (i.e. a raw data set)
        return $result->fetchAllAssociative();
    }
}
https://symfony.com/doc/current/doctrine.html#querying-with-sql

Adding comments to database columns and retrieving from AWS Glue

I'm trying to incorporate an AWS Glue Data Catalog into the data lake I'm building. I'm using a few different databases and would like to add COMMENTs to columns in a few of these tables. These databases include Redshift and MySQL. I usually add a comment to a column by doing something along the lines of:
COMMENT ON COLUMN table.column_name IS 'This is the comment';
Now I know that Glue has a comment field that shows in the GUI. Is there a way to sync the comment field in Glue with the comments I add to the columns in a DB?
In order to update meta information about a table defined in the AWS Glue Data Catalog, you can use a combination of the get_table() and update_table() methods, with boto3 for example.
Here is the most naive approach to do that:
import boto3
from pprint import pprint
glue_client = boto3.client('glue')
database_name = "__SOME_DATABASE__"
table_name = "__SOME_TABLE__"
response = glue_client.get_table(
    DatabaseName=database_name,
    Name=table_name
)
original_table = response['Table']
Here original_table adheres to the response syntax defined by get_table(). However, we need to remove some fields from it so that it passes validation when we call update_table(). The list of allowed keys can be obtained by passing original_table directly to update_table() without any changes and reading the resulting validation error:
allowed_keys = [
    "Name",
    "Description",
    "Owner",
    "LastAccessTime",
    "LastAnalyzedTime",
    "Retention",
    "StorageDescriptor",
    "PartitionKeys",
    "ViewOriginalText",
    "ViewExpandedText",
    "TableType",
    "Parameters"
]
updated_table = dict()
for key in allowed_keys:
    if key in original_table:
        updated_table[key] = original_table[key]
For simplicity's sake, we will change the comment of the very first column in the table:
new_comment = "Foo Bar"
updated_table['StorageDescriptor']['Columns'][0]['Comment'] = new_comment
response = glue_client.update_table(
    DatabaseName=database_name,
    TableInput=updated_table
)
pprint(response)
Obviously, if you want to add a comment to a specific column, you would extend this to:
new_comment = "Targeted Foo Bar"
target_column_name = "__SOME_COLUMN_NAME__"
for col in updated_table['StorageDescriptor']['Columns']:
    if col['Name'] == target_column_name:
        col['Comment'] = new_comment
response = glue_client.update_table(
    DatabaseName=database_name,
    TableInput=updated_table
)
pprint(response)
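If you want to actually sync the comments from a source database rather than hard-code them, one possible sketch (assuming a MySQL source reachable with PyMySQL; the host, credentials, and table names below are all hypothetical) is to read the comments from information_schema and apply them with the same update_table() pattern:

import pymysql  # assumption: PyMySQL, or any DB-API driver, is available

# Hypothetical connection details for the source MySQL database
conn = pymysql.connect(host="mysql.example.com", user="reader",
                       password="secret", database="source_db")

# MySQL keeps column comments in information_schema
with conn.cursor() as cur:
    cur.execute(
        """
        SELECT COLUMN_NAME, COLUMN_COMMENT
        FROM information_schema.COLUMNS
        WHERE TABLE_SCHEMA = %s AND TABLE_NAME = %s
        """,
        ("source_db", "my_table"),
    )
    comments = dict(cur.fetchall())

# Copy each non-empty comment onto the matching Glue column,
# reusing updated_table built as above
for col in updated_table['StorageDescriptor']['Columns']:
    if comments.get(col['Name']):
        col['Comment'] = comments[col['Name']]

glue_client.update_table(DatabaseName=database_name, TableInput=updated_table)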

How to check if an element is in a list defined as a JSONField?

I use peewee to map an existing table:
import peewee
from playhouse.postgres_ext import *
class Rules(peewee.Model):
    channels = JSONField(null=True)
    remark = peewee.CharField(max_length=500, null=True)

    class Meta:
        database = db
        db_table = 'biz_rule'
        schema = 'opr'
For example, in my table the channels column of one record holds:
["A012102","C012102","D012102","E012102"]
I want to check whether "A012102" is in that list. How do I write the code?
If you're using PostgreSQL 9.4+, you can use the jsonb data type via the corresponding postgres_ext.BinaryJSONField peewee field type. It has contains_any() and contains_all() methods that correspond to the PostgreSQL ?| and ?& operators (see the PostgreSQL JSON docs). So I think it'd be something like this:
from playhouse.postgres_ext import BinaryJSONField
class Rules(peewee.Model):
    channels = BinaryJSONField(null=True)
    ...
query = Rules.select().where(Rules.channels.contains_all('A012102'))
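A quick usage sketch, assuming the model above and a configured database connection (the remark printout is just illustrative):

# True if at least one row's channels array contains "A012102"
print(query.exists())

# Or iterate over the matching rows
for rule in query:
    print(rule.remark)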

GQL query for repeated StructuredProperty

How do I write the following query in GQL? [1]
Contact.query(Contact.address == Address(city='San Francisco',
                                         street='Spear St'))
[1] Filtering for Structured Property Values
Quoting https://cloud.google.com/appengine/docs/python/ndb/queries#gql: "To query models containing structured properties, you can use foo.bar in your GQL syntax to reference subproperties." So if I understand your task correctly,
'''SELECT * FROM Contact
   WHERE address.city = 'San Francisco' AND
         address.street = 'Spear St'
'''
should work. Doesn't it?
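To run it from Python, a small sketch assuming the standard App Engine ndb client and the Contact model from the question:

from google.appengine.ext import ndb

# GQL equivalent of the structured-property filter
query = ndb.gql(
    "SELECT * FROM Contact "
    "WHERE address.city = 'San Francisco' AND address.street = 'Spear St'"
)

for contact in query:
    print(contact)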
