I have a NUMERIC(20) field on MSSQL and I'm trying to read it as a string to get the full number, because of JavaScript's integer limits:
code: {
type: DataTypes.STRING
}
But the returned value is a truncated integer:
[ {code: 4216113112911594000 } ]
No matter which data type I choose, it is returned as a truncated integer.
The original value is 4216113112911594192. This is a Java UUID.
How can I convert this value when using model.findAll()?
The table was already created for another application and I'm trying to read it using Sequelize.
Here you go :
code: {
type: Sequelize.INTEGER, // <------ keep it as the DB's datatype
get() { // <------ use a getter method to modify the output of the query
return this.getDataValue('code').toString();
}
}
I think this might help you:
model.findOne({
attributes: [[db.sequelize.literal('cast([code] as varchar(20))'), 'code_string']], // MSSQL uses [brackets], not backticks
where : { id : YOUR_ID }
}).then(data => {
console.log(data); // <------ Check your output here
})
I have not fully understood what you are trying to achieve, nor is my code tested.
Idea 1: convert into a float.
Idea 2: append an alphabetic character to the original integer value before sending it to JavaScript.
The original value is 4216113112911594192,
so the string becomes 'A' + '4216113112911594192' = 'A4216113112911594192'.
Now you can work with 'A4216113112911594192'.
There are many JavaScript libraries that support big integers, e.g.
BigInteger
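In fact, on any reasonably recent runtime no library is needed at all: native BigInt handles this. A minimal sketch with the value from the question, kept as a string until it reaches BigInt:

```javascript
// A plain Number silently loses precision past 2^53 - 1,
// while native BigInt preserves every digit:
const asNumber = Number('4216113112911594192');
const asBigInt = BigInt('4216113112911594192');

console.log(asNumber.toString()); // precision already lost here
console.log(asBigInt.toString()); // "4216113112911594192"
```

The key point is that the value must arrive as a string; once it has been parsed into a Number, the lost digits cannot be recovered.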
This worked for Postgres (thanks Bessonov & vitaly-t!)
Just add this before you initialize sequelize
import pg from 'pg'
// Parse bigints and bigint arrays
pg.types.setTypeParser(20, BigInt) // Type Id 20 = BIGINT | BIGSERIAL
const parseBigIntArray = pg.types.getTypeParser(1016) // 1016 = Type Id for arrays of BigInt values
pg.types.setTypeParser(1016, (a) => parseBigIntArray(a).map(BigInt))
A little background:
I didn't want Sequelize to parse bigint columns as strings, so I added pg.defaults.parseInt8 = true. That made pg start parsing bigints as regular numbers, but it truncated values that were too big. I ended up going with the solution above.
Related
I used ExtJS 7.4. When I load data into an ExtJS store, the last digits of bigint values are truncated.
It doesn't matter whether the model field type is int or number; bigint values are only displayed correctly if the type is string. But I can't use the field as a string in the idProperty of the data model. Does anybody have an idea?
Maybe it is a limit of JavaScript, not ExtJS. In fact, if you try to create a new object with a bigint property you get a truncated number:
var record = {
numericValue: 9223372036854775807,
stringValue: "9223372036854775807"
};
console.log(record);
It prints:
{
numericValue: 9223372036854776000,
stringValue: "9223372036854775807"
}
---EDIT---
A solution could be to use the convert config of the BigInt field defined in the store's model. Note that the property should initially be read by the store as a string. Done this way, the property will correctly hold BigInt values:
Ext.define("MyStore",{
extend: "Ext.data.Store",
fields: [
{
name: "bigIntProp",
convert: function (value) {
return BigInt(value);
}
}
]
});
var store = new MyStore();
store.add({ bigIntProp: '9223372036854775807' });
// This correctly prints the BigInt value now
console.log(store.getAt(0).get("bigIntProp"));
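The convert function itself is plain JavaScript, so the idea can be checked without ExtJS. A minimal stand-in sketch, assuming the raw value arrives as a string (convertBigInt here is a hypothetical helper, not an ExtJS API):

```javascript
// Minimal stand-in for the store's convert step (no ExtJS involved).
// The raw value must arrive as a string; BigInt then preserves all digits.
function convertBigInt(value) {
  return BigInt(value);
}

const prop = convertBigInt('9223372036854775807');
console.log(prop.toString()); // "9223372036854775807"
console.log(typeof prop);     // "bigint"
```

If the same value were first parsed as a Number, the conversion would already be lossy, which is why the string-in requirement matters.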
The following is to get one record out of the database by matching an enum type.
TYPES = (
('ABC_ABC', 'abc abc'),
('XYZ_XYZ', 'xyz xyz'),
)
class Hello(models.Model):
type = models.CharField(max_length=8, choices=TYPES, blank=True)
database:
'1', 'ABC_ABC', 'other data'
Queryset:
qset = Q(type__in=('ABC_ABC'))
hello = models.Hello.objects.filter(qset)
Output:
print('count: {}'.format(hello.count()))
Result is 0. It should be 1. What's wrong?
Try changing your Q filter as follows:
qset = Q(type='ABC_ABC')
Explanation:
When using the in lookup, Django expects an iterable. Since you provided a bare string, it iterated over the characters of 'ABC_ABC', which is why you didn't get any hit.
If you really want to use the in lookup, which is unnecessary in this case, add a trailing comma to force creating a 1-tuple:
qset = Q(type__in=('ABC_ABC',))
Further thoughts
As this query is quite basic, using a Q object is superfluous. You can simply call:
.filter(type='ABC_ABC')
I'm using MSSQL and my table has a NUMERIC(20) field. I'm trying to convert it to a string in the getter, but this.getDataValue('code') returns a truncated integer (8985353050864662000):
code: {
type: DataTypes.NUMBER(20),
get() {
return this.getDataValue('code').toString();
}
}
The original number is a Java UUID (8985353050864661894).
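A quick way to see why a getter alone can't help: by the time the getter runs, the driver has already parsed the NUMERIC(20) value into a JavaScript number, and the precision is gone. A sketch, with a literal standing in for what the driver hands back:

```javascript
// By the time a Sequelize getter runs, the driver has already parsed the
// NUMERIC(20) value into a JS number, so toString() can't recover the digits:
const parsed = 8985353050864661894; // stand-in for what the driver returns
console.log(parsed.toString());            // a truncated value, not "8985353050864661894"
console.log(Number.isSafeInteger(parsed)); // false: beyond 2^53 - 1
```

The fix therefore has to happen before the number reaches JavaScript, e.g. by casting to varchar in the query itself, as in the answer above.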
My application reads JSON data in a particular format. I am pulling data from a database to dynamically create the data. I have the data; I just don't know the proper way to put it all into the following format.
Note: the first two values are pulled from one query, and the "variables" section is what needs to be looped through to populate, as the variable names AND values are in their own fields.
Sample Tables
Master Table
ID | Custom_Col1 | Custom_Col2
1 custom_val1 custom_val2
Variables Table
ID | Name | Value
1 var_name1 var_value1
2 var_name2 var_value2
3 var_name3 var_value3
4 var_name4 var_value4
5 var_name5 var_value5
6 var_name6 var_value6
{"Custom_Col1":"custom_val1", "Custom_Col2":"custom_val2","variables":{"var_name1":"var_value1","var_name2":"var_value2","var_name3":"var_value3", "var_name4":"var_value4","var_name5":"var_value5","var_name6":"var_value6"}}
I was able to get the looped values in using the following, but I just don't know how to get the other values in. I'm sure it's simple; I've just never worked with JSON before. I've scoured the internet and found examples, but they have only gotten me so far:
var json = {}
while loop
{
json[name]= value.toString();
}
var stringJson = JSON.stringify(json);
You can create the nested collection in the same way you did with var json:
var json = {}
while loopMasterTable
{
json[name]= value.toString();
}
json['variables'] = {}
while loopVariableTable
{
json['variables'][name]= value.toString();
}
var stringJson = JSON.stringify(json);
This will give you the following JSON:
{
"Custom_Col1": "custom_val1",
"Custom_Col2": "custom_val2",
"variables": {
"var_name1": "var_value1",
"var_name2": "var_value2",
"var_name3": "var_value3",
"var_name4": "var_value4",
"var_name5": "var_value5",
"var_name6": "var_value6"
}
}
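Outside of a real database loop, the same shape can be sketched with plain arrays standing in for the two result sets (the row data below is made up for illustration):

```javascript
// Hypothetical stand-ins for the two query results:
const masterRow = { Custom_Col1: 'custom_val1', Custom_Col2: 'custom_val2' };
const variableRows = [
  { Name: 'var_name1', Value: 'var_value1' },
  { Name: 'var_name2', Value: 'var_value2' },
];

// Top-level keys come from the master table row:
const json = {};
for (const [name, value] of Object.entries(masterRow)) {
  json[name] = value.toString();
}

// Nested keys come from the variables table rows:
json.variables = {};
for (const row of variableRows) {
  json.variables[row.Name] = row.Value.toString();
}

const stringJson = JSON.stringify(json);
console.log(stringJson);
```

The only structural trick is assigning a fresh empty object to json.variables before the second loop, so the second set of name/value pairs nests under it instead of landing at the top level.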
According to the Dapper documentation, you can get a list of dynamic rows back from Dapper using the code below:
var rows = connection.Query("select 1 A, 2 B union all select 3, 4");
((int)rows[0].A)
.IsEqualTo(1);
((int)rows[0].B)
.IsEqualTo(2);
((int)rows[1].A)
.IsEqualTo(3);
((int)rows[1].B)
.IsEqualTo(4);
What, however, is the use of dynamic if you have to know the field names and data types of the fields?
If I have:
var result = Db.Query("Select * from Data.Tables");
I want to be able to do the following :
Get a list of the field names and data types returned.
Iterate over it using the field names and get back data in the following ways :
result.Fields
["Id", "Description"]
result[0].values
[1, "This is the description"]
This would allow me to get:
result[0]["Id"].Value
which would give 1 and be of type e.g. Int32, and
result[0]["Id"].Type // what data type is the returned value
result[0]["Description"]
which would give "This is the description" and be of type string.
I see there is a result[0].table which has a DapperRow object with an array of the field names, and there is also a result.values, which is an object[2] with the values in it, but it cannot be accessed. If I add a watch on the drilled-down column name, I can get the Id. The automatically created watch is:
(new System.Collections.Generic.Mscorlib_CollectionDebugView<Dapper.SqlMapper.DapperRow>(result as System.Collections.Generic.List<Dapper.SqlMapper.DapperRow>)).Items[0].table.FieldNames[0] "Id" string
So I should be able to call result[0].Items[0].table.FieldNames[0] and get "Id" back.
You can cast each row to an IDictionary<string, object>, which should provide access to the names and the values. We don't explicitly track the types currently - we simply don't have a need to. If that isn't enough, consider using the dapper method that returns an IDataReader - this will provide access to the raw data, while still allowing convenient call / parameterization syntax.
For example:
var rows = ...
foreach(IDictionary<string, object> row in rows) {
Console.WriteLine("row:");
foreach(var pair in row) {
Console.WriteLine(" {0} = {1}", pair.Key, pair.Value);
}
}