Saving and retrieving JSONB columns using SqlKata - npgsql

I'm manually serializing objects to and from strings and trying to store them in a Postgres database using SqlKata:
var obj = new { Id = 3, JsonB = "{\"a\": 5}" };
await dbInstance.Query("TableName").InsertAsync(obj);
The column JsonB is of type jsonb. When I attempt to do it this way, I get the following error:
Exception data:
Severity: ERROR
SqlState: 42804
MessageText: column "DataField" is of type jsonb but expression is of type text
I don't see anywhere in the documentation or the code where I can cast types on insert, and I'm at a loss as to how to save JSON strings/objects to the appropriate fields.

I had the same problem. For inserts into a table with a json column, I switched to NpgsqlCommand plus the Npgsql.Json.NET library, which serializes objects on the fly. Not SqlKata, but it works:
// prod is the object that Npgsql.Json.NET serializes on the fly
Product prod = new Product();
using (NpgsqlCommand command =
    new NpgsqlCommand("INSERT INTO products (json_col) VALUES (@json_col)", connection))
{
    command.Parameters.Add(
        new NpgsqlParameter("json_col", NpgsqlTypes.NpgsqlDbType.Json) { Value = prod });
    await command.ExecuteNonQueryAsync();
}
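For the object value to be serialized automatically, the Json.NET plugin also has to be registered with Npgsql's type mapper once at startup; a minimal sketch, assuming Npgsql 4.x with the Npgsql.Json.NET package:
// Register the Json.NET plugin so CLR objects passed as NpgsqlDbType.Json are serialized on the fly
// (assumes Npgsql 4.x and the Npgsql.Json.NET package; the registration API may differ in newer versions)
NpgsqlConnection.GlobalTypeMapper.UseJsonNet();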

I don't know SqlKata, but indeed you need to tell Npgsql that you're sending a json (or jsonb) type, since strings are mapped to the PostgreSQL text type by default. PostgreSQL is quite type-strict and will not implicitly cast between most types - you will need to find out how to set NpgsqlDbType via SqlKata.
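If the JSON is already a plain string, one workaround (a sketch that bypasses SqlKata for this one statement; the table and column names are just the ones from the question) is to send it as an explicitly typed parameter, or to cast the text parameter in the SQL itself:
// Option 1: declare the parameter as jsonb even though the CLR value is a string
using (var cmd = new NpgsqlCommand(
    "INSERT INTO \"TableName\" (\"Id\", \"JsonB\") VALUES (@id, @json)", connection))
{
    cmd.Parameters.AddWithValue("id", 3);
    cmd.Parameters.Add(new NpgsqlParameter("json", NpgsqlTypes.NpgsqlDbType.Jsonb)
    {
        Value = "{\"a\": 5}"
    });
    await cmd.ExecuteNonQueryAsync();
}

// Option 2: keep a plain text parameter and add an explicit cast in the SQL:
// INSERT INTO "TableName" ("Id", "JsonB") VALUES (@id, @json::jsonb)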

Related

Unsupported Scan, storing driver.Value type []uint8 into type *guid.GUID

I work with Golang and SQL Server.
My struct in Golang:
type Role struct {
    Id          guid.GUID        `gorm:"primaryKey;column:Id;type:uniqueidentifier" json:"id"`
    RoleName    string           `gorm:"column:RoleName;not null;unique" json:"roleName"`
    IsEnable    bool             `gorm:"column:IsEnable" json:"isEnable"`
    Permissions []RolePermission
}
I use gorm to query the data but receive this error:
unsupported Scan, storing driver.Value type []uint8 into type *guid.GUID.
I used uuid before, but the id data comes back wrong when I query it (guid to uuid).
Is there any way to store and work with a GUID using Golang and SQL Server?
Early versions of go-gorm (v0.2) included UUID/GUID support in SQLTag, with isUUID() testing the type name ("uuid" or "guid").
But that code is no longer present in the current go-gorm v2.0.
You might need to implement a custom data type with a Scanner / Valuer, or use an existing one like google/uuid:
import (
    "github.com/google/uuid"
    "github.com/lib/pq"
)

type Post struct {
    ID    uuid.UUID      `gorm:"type:uuid;default:uuid_generate_v4()"`
    Title string
    Tags  pq.StringArray `gorm:"type:text[]"`
}

EF Core Projecting optional property in jsonb column

I need to project some fields from a jsonb column, where a few of them are optional.
I'm using EF Core 3.1 and Npgsql, and so far I have this:
var shipments = _dbContext.Shipments.Select(x => new
{
    ShipmentNo = x.ShipmentNumber,
    ReportNum = x.ShipmentData.RootElement.GetProperty("reportNumber"),
    ShipmentValue = x.ShipmentData.RootElement.GetProperty("shipmentMetadata").GetProperty("value").GetString(),
});
However, value is optional and this throws an exception. I see the .TryGetProperty(...) method, but it requires an output variable and, I presume, would force evaluation on the client side. I wonder if there is a way to handle this so the query runs completely in Postgres.
You've forgotten to add GetInt32 (or whatever the type is) for reportNumber, just like you have a GetString after the shipmentMetadata. In order for this to be translatable to SQL, you need to tell the provider which type you expect to come out of the JSON element.
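A sketch of the corrected projection, assuming reportNumber is an integer (the names are taken from the question):
var shipments = _dbContext.Shipments.Select(x => new
{
    ShipmentNo = x.ShipmentNumber,
    // GetInt32() tells the provider the expected CLR type, so the access can be translated to SQL
    ReportNum = x.ShipmentData.RootElement.GetProperty("reportNumber").GetInt32(),
    ShipmentValue = x.ShipmentData.RootElement.GetProperty("shipmentMetadata").GetProperty("value").GetString(),
});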

How to attach schema to a Flink DataStream - on the fly?

I am dealing with a stream of database mutations, i.e., a change log stream. I want to be able to transform the values using a SQL query.
I am having difficulty putting together the following three concepts:
RowTypeInfo, Row, and DataStream.
NOTE: I don't know the schema beforehand. I construct it on the fly using the data within the Mutation object (Mutation is a custom type).
More specifically, I have code that looks like this:
val execEnv = StreamExecutionEnvironment.getExecutionEnvironment
val tableEnv: StreamTableEnvironment = TableEnvironment.getTableEnvironment(execEnv)
// Mutation is a custom type
val mutationStream: DataStream[Mutation] = ...
// toRows returns an object of type org.apache.flink.types.Row
val rowStream:DataStream[Row] = mutationStream.flatMap({mutation => toRows(mutation)})
tableEnv.registerDataStream("spinal_tap_table", rowStream)
tableEnv.sql("select col1 + 2")
NOTE: The Row object is positional and doesn't have a placeholder for column names.
I couldn't find a place to attach the schema to the DataStream object.
I want to pass some sort of a struct similar to Row that contains the complete information {columnName: String, columnValue: Object, columnType: TypeInformation[_]} for the query.
In Flink SQL a table schema is mandatory when the Table is defined. It is not possible to run queries on dynamically typed records.
Regarding the concepts of RowTypeInfo, Row and DataStream:
Row is the actual record that holds the data
RowTypeInfo is a schema description for Rows. It contains names and TypeInformation for each field of a Row.
DataStream is a logical stream of records. A DataStream[Row] is a stream of rows. Note that this is not the actual stream but just an API concept to represent a stream.

How to read a `geography` column by `SqlDataReader`?

I have a SQL Server 2008 database with a geography column which is generated by System.Data.Entity.Spatial.DbGeography in Entity Framework 6.0.0-alpha3.
Now I need to read that column with a SqlDataReader. But I have no idea how to do this. Using the old context is not an option. I tried to cast it as DbGeography:
Location = (DbGeography)reader.GetValue(index)
But I get this error:
Unable to cast object of type 'Microsoft.SqlServer.Types.SqlGeography'
to type 'System.Data.Entity.Spatial.DbGeography'
Do you have any suggestion?
Well, it was simple. I was just confused. But instead of deleting the question, I'll post the answer for others who have the same question.
// read the value as dynamic:
dynamic temp = reader.GetValue(index);
// the temp contains Lat and Long properties:
var text = string.Format("POINT({0:R} {1:R})", temp.Long, temp.Lat);
// the temp also contains the STSrid as coordinate system id:
var srid = temp.STSrid.Value;
// the rest is really simple:
Location = System.Data.Entity.Spatial.DbGeography.PointFromText(text, srid);
If your geography is a point, you can simply select the coordinates:
SELECT MyColumn.Lat, MyColumn.Long ...
and read them as doubles:
var lat = reader.GetDouble(0);
var lng = reader.GetDouble(1);

LINQ deletion - Can delete one way, can't using Single enumerable

This works:
var i = (from x in db.Test
where x.Id == 1
select x).First();
db.Test.DeleteOnSubmit(i);
db.SubmitChanges();
I get a cast error for this (int/string):
var i = db.Test.Single(x => x.Id == 1);
db.Test.DeleteOnSubmit(i);
db.SubmitChanges();
I was also able to make an update using Single successfully on the same table. Any clues?
Update
Here's more of the error message:
[InvalidCastException: Unable to cast object of type 'System.Int32' to type 'System.String'.]
System.Data.Linq.SingleKeyManager`2.TryCreateKeyFromValues(Object[] values, V& v) +134
System.Data.Linq.IdentityCache`2.Find(Object[] keyValues) +57
System.Data.Linq.StandardIdentityManager.Find(MetaType type, Object[] keyValues) +51
System.Data.Linq.CommonDataServices.GetCachedObject(MetaType type, Object[] keyValues) +113
System.Data.Linq.ChangeProcessor.GetOtherItem(MetaAssociation assoc, Object instance) +235
System.Data.Linq.ChangeProcessor.BuildEdgeMaps() +510
System.Data.Linq.ChangeProcessor.SubmitChanges(ConflictMode failureMode) +137
System.Data.Linq.DataContext.SubmitChanges(ConflictMode failureMode) +453
System.Data.Linq.DataContext.SubmitChanges() +38
What's the data type of your id field?
I just ran the following code against a dummy database that has data in it where my key field is of type int:
using (var db = new dbDataContext())
{
var x = db.items.Single(i => i.fld_key == 1);
db.items.DeleteOnSubmit(x);
db.SubmitChanges();
var y = (from i in db.items
where i.fld_key == 2
select i).First();
db.items.DeleteOnSubmit(y);
db.SubmitChanges();
}
I had no problems with either, and checking the database, items 1 and 2 were both deleted successfully.
Are you attempting to delete an item in the second code block that doesn't exist because you already deleted it with the first? This would mean that y is actually null and you're passing null to DeleteOnSubmit, which gives you an invalid cast since you can't cast null.
Try deleting a different item in your second code block; I would imagine it would work in that case.
Change .Single to .SingleOrDefault and wrap your delete lines in a null check:
var x = db.items.SingleOrDefault(i => i.fld_key == 1);
if (x != null)
{
    db.items.DeleteOnSubmit(x);
    db.SubmitChanges();
}
I would wager this will fix the problem you're seeing.
Edit: Looking at the exception you've posted, I'm wondering if you changed the data type of your table's key column to int after you generated your dbml. Reload the table into your dbml and see if that fixes the issue. Either that, or the issue is the other way around and you changed it to VarChar/NVarChar and haven't updated your dbml.
I would definitely take a look at your entities and make sure that the data types match those of the table in your underlying database.
Have you changed the datatypes in your database lately? (Maybe those types currently differ from your generated classes)
Similarly, if you're using a stored procedure for deleting items, make sure the parameter types match up to your generated types.
