How do I pull all entities linked from another entity in Datomic?

I don't know quite how to word my question.
:host/id has a link to :server/id; I want to pull all servers linked to a specific host.
I've tried several approaches, but I get either an empty result, all results, or an IllegalArgumentExceptionInfo: :db.error/not-a-keyword Cannot interpret as a keyword.
I tried following the documentation, but I keep getting lost. Here are my attempts so far:
Returns all hosts:

(d/q '[:find (pull ?server [{:host/id [:host/hostname]}])
       :in $ ?hostname
       :where
       [?host :host/hostname ?hostname]
       [?server :server/name]]
     db "myhost")

Throws IllegalArgumentExceptionInfo:

(d/q '[:find (pull ?server [{:host/id [:host/hostname]}])
       :in $ ?hostname
       :where
       [?server :server/name ?host]
       [?host :host/hostname ?hostname]]
     db "myhost")

Returns []:

(d/q '[:find (pull ?host [{:host/id [:host/hostname]}])
       :in $ ?hostname
       :where
       [?host :host/hostname ?hostname]
       [?host :server/name]]
     db "myhost")

Assuming you have these entities in Datomic:

(d/transact conn [{:host/name "host1"}])

(d/transact conn [{:server/name "db1"
                   :server/host [:host/name "host1"]}
                  {:server/name "web1"
                   :server/host [:host/name "host1"]}])
And assuming each server has a reference to its host (see the schema below), you can query which servers are linked to a host by using the reverse lookup syntax, i.e. prefixing the ref attribute's name with an underscore (:server/_host):
(d/q '[:find (pull ?h [* {:server/_host [:server/name]}])
       :in $ ?hostname
       :where
       [?h :host/name ?hostname]]
     (d/db conn)
     "host1")

will give you:

[[{:db/id 17592186045418,
   :host/name "host1",
   :server/_host [#:server{:name "db1"} #:server{:name "web1"}]}]]
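For comparison, a sketch of the same lookup done "from the server side", binding the server entities through :server/host instead of using reverse pull (written against the schema below):

(d/q '[:find (pull ?s [:server/name])
       :in $ ?hostname
       :where
       [?h :host/name ?hostname]
       [?s :server/host ?h]]
     (d/db conn)
     "host1")
;;=> #{[#:server{:name "db1"}] [#:server{:name "web1"}]}

This returns one tuple per server rather than nesting the servers under the host entity.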
Here is the sample schema for your reference:
(def uri "datomic:free://localhost:4334/svr")
(d/delete-database uri)
(d/create-database uri)
(def conn (d/connect uri))

(d/transact conn [{:db/ident :server/name
                   :db/cardinality :db.cardinality/one
                   :db/unique :db.unique/identity
                   :db/valueType :db.type/string}
                  {:db/ident :server/host
                   :db/cardinality :db.cardinality/one
                   :db/valueType :db.type/ref}
                  {:db/ident :host/name
                   :db/cardinality :db.cardinality/one
                   :db/unique :db.unique/identity
                   :db/valueType :db.type/string}])

Related

Datomic query - find all records (entities) with value

Query:

(d/q '[:find [?e ...]
       :in $ ?value
       :where [?e _ ?value]]
     db "Germany")

returns nothing, while:

(d/q '[:find [?e ...]
       :in $ ?value
       :where [?e :country/name ?value]]
     db "Germany")

returns a list of entities as expected.
Shouldn't the _ serve as a wildcard for any attribute name and return every entity that holds the value?
I read Datomic query: find all entities with some value, but I can't figure out how to pass an actual value in as a parameter.
Datomic version: datomic-pro-0.9.5966
I figured out this dirty, time-consuming method, but it does the job:
(defn all-by-value
  "Returns all entities in db that have value under any attribute.
  Tries every installed ident as the attribute and skips attributes
  whose value type is incompatible with value."
  [db value]
  (reduce
   (fn [res ident]
     (try
       (->> (d/q '[:find [?e ...] :in $ ?a ?v :where [?e ?a ?v]] db ident value)
            (map #(d/entity db %))
            (concat res))
       (catch Exception _ res)))
   []
   ;; every :db/ident in the database, attributes included
   (d/q '[:find [?e ...] :where [?e :db/ident]] db)))
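Calling it with the value from the question (hypothetical data) looks like:

(all-by-value db "Germany")
;;=> a sequence of d/entity maps, one per entity holding "Germany" under some attribute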
Hope some of you will find it useful.

Working with Python in Azure Databricks to Write DF to SQL Server

We just switched away from Scala and moved over to Python. I've got a dataframe that I need to push into SQL Server. I did this multiple times before, using the Scala code below.
var bulkCopyMetadata = new BulkCopyMetadata
bulkCopyMetadata.addColumnMetadata(1, "Title", java.sql.Types.NVARCHAR, 128, 0)
bulkCopyMetadata.addColumnMetadata(2, "FirstName", java.sql.Types.NVARCHAR, 50, 0)
bulkCopyMetadata.addColumnMetadata(3, "LastName", java.sql.Types.NVARCHAR, 50, 0)

val bulkCopyConfig = Config(Map(
  "url"               -> "mysqlserver.database.windows.net",
  "databaseName"      -> "MyDatabase",
  "user"              -> "username",
  "password"          -> "*********",
  "dbTable"           -> "dbo.Clients",
  "bulkCopyBatchSize" -> "2500",
  "bulkCopyTableLock" -> "true",
  "bulkCopyTimeout"   -> "600"
))

df.bulkCopyToSqlDB(bulkCopyConfig, bulkCopyMetadata)
That's documented here: https://learn.microsoft.com/en-us/azure/sql-database/sql-database-spark-connector
I'm looking for an equivalent Python script to do the same job. I searched for the same, but didn't come across anything. Does someone here have something that would do the job? Thanks.
Please refer to the official PySpark document JDBC To Other Databases to write a PySpark dataframe directly to SQL Server via the MS SQL Server JDBC driver.
Here is the sample code.
spark_jdbcDF.write \
    .format("jdbc") \
    .option("url", "jdbc:sqlserver://yourserver.database.windows.net:1433") \
    .option("dbtable", "<your table name>") \
    .option("user", "username") \
    .option("password", "password") \
    .save()
Or
jdbcUrl = "jdbc:sqlserver://{0}:{1};databaseName={2}".format(jdbcHostname, jdbcPort, jdbcDatabase)
connectionProperties = {
    "user": jdbcUsername,
    "password": jdbcPassword,
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver"
}

spark_jdbcDF.write \
    .jdbc(url=jdbcUrl, table="<your table name>",
          properties=connectionProperties)

Note that .jdbc() performs the write itself, so no trailing .save() is needed.
Hope it helps.
Here is complete PySpark code to write a Spark DataFrame to a SQL Server database, including where to put the database name and schema name:
df.write \
    .format("jdbc") \
    .option("url", "jdbc:sqlserver://<servername>:1433;databaseName=<databasename>") \
    .option("dbtable", "[<optional_schema_name>].<table_name>") \
    .option("user", "<user_name>") \
    .option("password", "<password>") \
    .save()
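One caveat: DataFrameWriter's default save mode is errorifexists, so running this a second time against an existing table fails. A sketch with an explicit mode, using the same placeholder names as above:

(df.write
    .format("jdbc")
    .mode("append")  # or "overwrite"; the default errors out if the table already exists
    .option("url", "jdbc:sqlserver://<servername>:1433;databaseName=<databasename>")
    .option("dbtable", "[<optional_schema_name>].<table_name>")
    .option("user", "<user_name>")
    .option("password", "<password>")
    .save())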

ADODB SQLServer connection with Seek and Index

In my program I want to connect to SQL Server using ADODB.Connection, read the data using an ADODB.Recordset, and use Index and Seek to find the record I'm searching for.
I tried using the connection string Provider=SQLNCLI11;Data Source=localhost;Initial Catalog=TestDB;Integrated Security=SSPI;Persist Security Info=False;, with adOpenDynamic as the CursorType, adLockOptimistic as the LockType, and adCmdTable as the CommandType.
But when I call the Supports method on the recordset with adSeek or adIndex, it returns false.
Is there a way to connect to SQL Server and open a Recordset with support for Seek and Index?
Edit: here is the code:
LOCAL oCn, nCursor, nLock, oRS
oCn := CreateObject( "ADODB.Connection" )
oCn:ConnectionString := "Provider=SQLNCLI11;Data Source=localhost;Initial Catalog=TLPosWin;Integrated Security=SSPI;Persist Security Info=False;"
oRS := CreateObject( "ADODB.RecordSet" )
nCursor := adOpenDynamic
nLock := adLockOptimistic
oRS:CursorLocation := adUseServer
oRS:Open("Articoli", oCn, nCursor, nLock, adCmdTable)
? "seek", oRS:Supports(0x200000), "index", oRS:Supports(0x100000)  // adSeek, adIndex: both false
oRS:Index := "Articoli_ARTART"  // Error
oRS:Seek('=', 1, '000611')      // Error

Can't Open Connection String to SQL Server in VBScript

I have a .vbs script that runs the following SQL query:

Select COUNT (*) from sys.objects

which counts the rows in sys.objects. If any rows are found (> 0), the script generates an alert in SCOM using the PropertyBag scripting runtime.
The problem is, when debugging the script (using cscript), I get the following error message:
(11,1) Microsoft OLE DB Provider for ODBC Drivers:
[Microsoft][ODBC SQL Server Driver][Shared Memory]SQL Server does not exist or access denied.
although the connection string seems to be correct:
strConnection = "Driver={SQL Server};Server=SCOMSRVDB01;Database=DBABee;Trusted_Connection=TRUE"
Here is the Full VBScript:
Dim objCN, strConnection
Dim oAPI, oBag

Set objCN = CreateObject("ADODB.Connection")
Set oAPI = CreateObject("MOM.ScriptAPI")
Set oBag = oAPI.CreatePropertyBag()

strConnection = "Driver={SQL Server};Server=SCOMSRVDB01;Database=DBABee;Trusted_Connection=TRUE"
objCN.Open strConnection

Dim strSQLQuery
strSQLQuery = "Select COUNT (*) from sys.objects"

Dim objRS
Set objRS = objCN.Execute(strSQLQuery)

Do Until objRS.EOF
    'WScript.Echo objRS.Fields("No column name")
    If objRS.Fields("No column name") > 0 Then
        'WScript.Echo "evaluated as bad"
        Call oBag.AddValue("State", "BAD")
        Call oAPI.Return(oBag)
    Else
        Call oBag.AddValue("State", "GOOD")
        Call oAPI.Return(oBag)
    End If
    objRS.MoveNext
Loop
objRS.Close
It's worth mentioning that in our company you can't connect to a SQL Server without specifying the port number. But when I tried to add it (port 2880) to the connection string:
strConnection = "Driver={SQL Server};Server=SCOMSRVDB01,2880;Database=DBABee;Trusted_Connection=TRUE"
the script returned the following error:
(23,17) ADODB.Recordset: Item cannot be found in the collection corresponding to the requested name or ordinal.
The ADODB error indicating that the item cannot be found means that you successfully connected to the DB, but it can't find the column you requested. This is what it can't find: objRS.Fields("No column name")
Change your query and name the column:
strSQLQuery = "Select COUNT (*) as countStuff from sys.objects"
Then change what you are looking for:
if objRS.Fields("countStuff") > 0 then
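Putting both changes together, the affected part of the script becomes (a sketch of just those lines, using the variable names from the question):

strSQLQuery = "Select COUNT (*) as countStuff from sys.objects"
Set objRS = objCN.Execute(strSQLQuery)

Do Until objRS.EOF
    If objRS.Fields("countStuff") > 0 Then
        Call oBag.AddValue("State", "BAD")
    Else
        Call oBag.AddValue("State", "GOOD")
    End If
    Call oAPI.Return(oBag)
    objRS.MoveNext
Loop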

Query using bigint attribute returns empty for certain values

I created a minimal entity with one attribute of bigint type; my problem is that the query fails for certain values. This is the schema:
[{:db/ident :home/area,
  :db/valueType :db.type/bigint,
  :db/cardinality :db.cardinality/one,
  :db/doc "the doc",
  :db.install/_attribute :db.part/db,
  :db/id #db/id[:db.part/db -1000013]}]
I inserted a sample value:
(d/transact (d/connect uri2)
            [{:db/id #db/id[:db.part/user]
              :home/area 123456789000000N}])
I confirmed that it was created by using the Datomic console. However, the following query doesn't return the previously inserted entity as expected:
(d/q '[:find ?e
       :in $ ?h
       :where
       [?e :home/area ?h]]
     (d/db (d/connect uri2))
     123456789000000N)
;;=> #{}
Maybe I'm missing something in the way the value is expressed. Another test using a different value for :home/area, like 100N, returns the correct answer:
(d/transact (d/connect uri2)
            [{:db/id #db/id[:db.part/user]
              :home/area 100N}])

(d/q '[:find ?e
       :in $ ?h
       :where
       [?e :home/area ?h]]
     (d/db (d/connect uri2))
     100N)
;;=> #{[17592186045451]}
It also works fine with the value 111111111111111111111111111111111111N, which is confusing to me.

Datomic version: "0.9.5390"
java version "1.8.0_05"
Java(TM) SE Runtime Environment (build 1.8.0_05-b13)
Java HotSpot(TM) 64-Bit Server VM (build 25.5-b02, mixed mode)
MySQL as storage service

Thanks in advance for any suggestions.
For Clojure users, the name :db.type/bigint can be misleading, since it actually maps to java.math.BigInteger, not clojure.lang.BigInt.
I reproduced the same steps and I can't tell you why the Datalog query fails on 123456789000000N but not 100N and 111111111111111111111111111111111111N. It seems however that the following always works:
(d/q '[:find ?e
       :in $ ?h
       :where
       [?e :home/area ?h]]
     (d/db (d/connect uri2))
     (.toBigInteger 123456789000000N))
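Equivalently, clojure.core's biginteger function coerces a Clojure BigInt to java.math.BigInteger without the interop call (a sketch of the same query):

(d/q '[:find ?e
       :in $ ?h
       :where
       [?e :home/area ?h]]
     (d/db (d/connect uri2))
     (biginteger 123456789000000N))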
I ran your example and got different results (it worked in all cases). I am not sure why, but maybe adding my example will help. The only changes I made were to use uri instead of uri2, slurp the schema, and perform a (d/create-database uri) and a (def conn (d/connect uri)). I assume you performed similar steps, which is why I don't know why my example worked:
Clojure 1.8.0
user=> (use '[datomic.api :only [q db] :as d])
nil
user=> (use 'clojure.pprint)
nil
user=> (def uri "datomic:mem://bigint")
#'user/uri
user=> (d/create-database uri)
true
user=> (def conn (d/connect uri))
#'user/conn
user=> (def schema-tx (read-string (slurp "path/to/the/schema.edn")))
#'user/schema-tx
user=> @(d/transact conn schema-tx)
{:db-before datomic.db.Db@b8774875,
 :db-after datomic.db.Db@321a2712,
 :tx-data [#datom[13194139534312 50 #inst "2016-08-14T18:53:23.158-00:00" 13194139534312 true]
           #datom[63 10 :home/area 13194139534312 true] #datom[63 40 60 13194139534312 true]
           #datom[63 41 35 13194139534312 true] #datom[63 62 "the doc" 13194139534312 true]
           #datom[0 13 63 13194139534312 true]],
 :tempids {-9223367638809264717 63}}
(d/transact (d/connect uri)
            [{:db/id #db/id[:db.part/user]
              :home/area 123456789000000N}])
#object[datomic.promise$settable_future$reify__6480 0x5634d0f4
        {:status :ready,
         :val {:db-before datomic.db.Db@321a2712,
               :db-after datomic.db.Db@f6ef3cd8,
               :tx-data [#datom[13194139534313 50 #inst "2016-08-14T18:53:34.325-00:00" 13194139534313 true]
                         #datom[17592186045418 63 123456789000000N 13194139534313 true]],
               :tempids {-9223350046623220288 17592186045418}}}]
(d/q '[:find ?e
       :in $ ?h
       :where
       [?e :home/area ?h]]
     (d/db (d/connect uri))
     123456789000000N)
#{[17592186045418]}
(d/transact (d/connect uri)
            [{:db/id #db/id[:db.part/user]
              :home/area 100N}])
#object[datomic.promise$settable_future$reify__6480 0x3b27b497
        {:status :ready,
         :val {:db-before datomic.db.Db@f6ef3cd8,
               :db-after datomic.db.Db@2385c058,
               :tx-data [#datom[13194139534315 50 #inst "2016-08-14T18:54:13.347-00:00" 13194139534315 true]
                         #datom[17592186045420 63 100N 13194139534315 true]],
               :tempids {-9223350046623220289 17592186045420}}}]
(d/q '[:find ?e
       :in $ ?h
       :where
       [?e :home/area ?h]]
     (d/db (d/connect uri))
     100N)
#{[17592186045420]}
user=>
Can you run (first schema-tx) at the REPL to confirm your schema transacted? I noticed you were using the console, and I am wondering whether the bigint attribute did not get defined, or whether you were looking at the first URI (since you had a uri2, I am assuming you have multiple examples).
