Spring SpEL and BigDecimal expressions - spring-el

We're using Spring Expression Language (v3.1.2) to validate Object values in a map. The two test cases below outline a scenario where we are asserting the value of a BigDecimal object. The first case passes but the second test fails. I think the way we have defined the expression is correct, but I suspect the implementation is not correctly casting the object value.
@Test
public void testBigDecimalValueLess() {
Map<String,Object> map = new HashMap<String,Object>();
map.put("premiums",new BigDecimal("400000.000000"));
StandardEvaluationContext stdContext = new StandardEvaluationContext();
stdContext.setVariables(map);
// Set expression
String ruleExpression = "#premiums>=new java.math.BigDecimal('500000')";
// Evaluate the SpEL expression
ExpressionParser parser = new SpelExpressionParser();
Expression expression = parser.parseExpression(ruleExpression);
Boolean returnValue = expression.getValue(stdContext,Boolean.class);
Assert.assertEquals(Boolean.FALSE,returnValue);
}
@Test
public void testBigDecimalValueGreater() {
Map<String,Object> map = new HashMap<String,Object>();
map.put("premiums",new BigDecimal("119000000000.000000"));
StandardEvaluationContext stdContext = new StandardEvaluationContext();
stdContext.setVariables(map);
// Set expression
String ruleExpression = "#premiums>=new java.math.BigDecimal('500000')";
// Evaluate the SpEL expression
ExpressionParser parser = new SpelExpressionParser();
Expression expression = parser.parseExpression(ruleExpression);
Boolean returnValue = expression.getValue(stdContext,Boolean.class);
Assert.assertEquals(Boolean.TRUE,returnValue);
}
Is there a recommended way we can achieve this on the current version? Can I pass extra information
to the StandardEvaluationContext object?
EDIT 1
I have written this test to show how the BigDecimal values are cast to 'int' and 'long':
@Test
public void bigDecimalIntValue(){
System.out.println(new BigDecimal("1000").intValue());
System.out.println(new BigDecimal("100000").intValue());
System.out.println(new BigDecimal("10000000").intValue());
System.out.println(new BigDecimal("1000000000").intValue());
System.out.println(new BigDecimal("100000000000").intValue()); --> 1215752192
System.out.println(new BigDecimal("100000000000").longValue()); --> 100000000000
System.out.println(new BigDecimal("713027290000.000000").intValue()); --> 62718864
System.out.println(new BigDecimal("713027290000.000000").longValue()); --> 713027290000
}
I suspect I just need to ensure my expression casts both numbers to a Long before the operation is evaluated.
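Something like the following might express that cast directly in the expression (an untested sketch):
String ruleExpression = "#premiums.longValue() >= 500000";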

I suggest using the compareTo(BigDecimal val) method of the BigDecimal class. For details of this method, take a look at http://docs.oracle.com/javase/7/docs/api/java/math/BigDecimal.html#compareTo%28java.math.BigDecimal%29 and change your expression to something like this:
String ruleExpression = "#premiums.compareTo(new java.math.BigDecimal('500000')) >= 0";
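For context, here is the second test rewritten with the compareTo-based expression; a minimal sketch assuming the same JUnit 4 and Spring 3.1.2 setup as the tests above.
import java.math.BigDecimal;
import java.util.HashMap;
import java.util.Map;

import org.junit.Assert;
import org.junit.Test;
import org.springframework.expression.Expression;
import org.springframework.expression.ExpressionParser;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.expression.spel.support.StandardEvaluationContext;

public class BigDecimalRuleTest {
    @Test
    public void testBigDecimalValueGreaterWithCompareTo() {
        Map<String, Object> map = new HashMap<String, Object>();
        map.put("premiums", new BigDecimal("119000000000.000000"));
        StandardEvaluationContext stdContext = new StandardEvaluationContext();
        stdContext.setVariables(map);
        // compareTo keeps the comparison in BigDecimal, so no narrowing to int occurs
        String ruleExpression = "#premiums.compareTo(new java.math.BigDecimal('500000')) >= 0";
        ExpressionParser parser = new SpelExpressionParser();
        Expression expression = parser.parseExpression(ruleExpression);
        Boolean returnValue = expression.getValue(stdContext, Boolean.class);
        Assert.assertEquals(Boolean.TRUE, returnValue);
    }
}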

Related

EF6 IQueryable dynamic linq Where(predicate, values) with DbFunctions.TruncateTime

In my simplified example I have an object with the following properties:
Name (string)
BirthDateTimeStamp (datetime)
I need to be able to build dynamic queries in the following way:
var predicate = "Name = @0";
var values = new object[]{"Ed"};
myIQueryableDataSource.Where(predicate, values)
This works fine. Now I want to compare my datetime:
var predicate = "BirthDateTimeStamp >= @0";
var values = new object[]{someDateTime};
This also works fine. But what I actually want to do when comparing the datetimes (and this issue shows itself better when doing an equals) is to compare just the date part.
var predicate = "BirthDateTimeStamp.Date >= @0";
This is not possible, since the Date property is not recognized when EF translates the query to SQL Server.
var predicate = "System.Data.Entity.DbFunctions.TruncateTime(BirthDateTimeStamp) >= @0";
This is also not working, since I can only access my object's properties in the predicate.
How can I solve this so that the predicate stays in string format? This code is just part of a big existing parser for my queries and cannot be completely rewritten.
See https://stackoverflow.com/a/26451213/525788
System.Linq.Dynamic is parsing the expression that you give as C# but does not recognize the class DbFunctions. However you can patch in DbFunctions as a predefined type:
var type = typeof( DynamicQueryable ).Assembly.GetType( "System.Linq.Dynamic.ExpressionParser" );
FieldInfo field = type.GetField( "predefinedTypes", BindingFlags.Static | BindingFlags.NonPublic );
Type[] predefinedTypes = (Type[])field.GetValue( null );
Array.Resize( ref predefinedTypes, predefinedTypes.Length + 1 );
predefinedTypes[ predefinedTypes.Length - 1 ] = typeof( DbFunctions );
field.SetValue( null, predefinedTypes );
Then you can use
var predicate = "DbFunctions.TruncateTime(BirthDateTimeStamp) >= #0";
The answer given by @RockResolve works, but it is kind of a hack. System.Linq.Dynamic provides functionality to add custom types:
public class CustomTypeProvider: IDynamicLinkCustomTypeProvider
{
public HashSet<Type> GetCustomTypes()
{
HashSet<Type> types = new HashSet<Type>();
// adding custom types
types.Add(typeof(DbFunctions));
return types;
}
}
// use below line to add this to linq
System.Linq.Dynamic.GlobalConfig.CustomTypeProvider = new CustomTypeProvider();
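Once the provider is registered as above, DbFunctions can be used directly inside string predicates. A usage sketch, reusing the question's myIQueryableDataSource and someDateTime:
// DbFunctions is now resolvable by the dynamic LINQ expression parser.
var predicate = "DbFunctions.TruncateTime(BirthDateTimeStamp) >= @0";
var filtered = myIQueryableDataSource.Where(predicate, someDateTime.Date);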

How can I set the value and the datatype when creating a new SqlParameter?

I have my code setting up a parameter for QuestionsCount and then for Questions:
parameterList.Add(new SqlParameter("#QuestionsCount", chunkSize));
var p = new SqlParameter("#Questions", questions);
p.TypeName = "dbo.QuestionList";
parameterList.Add(p);
Is there a way that I can combine the last three lines and create a new SqlParameter with the TypeName? I was looking at the constructor definitions but cannot find one that takes both the value (questions) and the TypeName "dbo.QuestionList".
There are quite a few overloads for the constructor for SqlParameter, some of which allow you to specify a datatype and optionally a length:
var p = new SqlParameter("#Questions", SqlDbType.VarChar, 100);
p.Value = ":.......";
parameterList.Add(p);
Check out the MSDN documentation on SqlParameter for details.
Another option is to use an object initializer to supply property values that are not available in the constructor overloads:
parameterList.Add(new SqlParameter( "#Questions", SqlDbType.Structured ) { TypeName = "dbo.QuestionList", Value = questions });

How do I use Activator.CreateInstance to do the following please

I use a similar style of code many times in my application to read in records from a database.
WorkoutResultsRecord inherits from a class called BaseRecord. One of the base constructors takes an IDataReader parameter to read fields into the class (as seen below).
What I want to do is define a generic function that will do the following for any/all of my 60+ xxxRecord type classes, i.e. I can pass in a type as a parameter and it will return objects of the correct type as a typed List. Is this possible with the Activator class? I've not used it before and my attempts just wouldn't compile.
protected List<WorkoutResultsRecord> ReadRecordList(string sql,
IDbConnection connection)
{
var results = new List<WorkoutResultsRecord>();
using (IDbCommand command = GetCommand(sql, connection))
using (IDataReader reader = command.ExecuteReader())
while (reader.Read())
results.Add(new WorkoutResultsRecord(reader));
return results;
}
My really bad, failed attempt :(
private void sfsdf(Type type)
{
List<typeof(type)> lst = new List<type>();
Activator.CreateInstance(List<typeof(type)>);
}// function
This should work:
private void sfsdf(Type type)
{
Type genericType = typeof(List<>).MakeGenericType(type);
System.Collections.IList theList = (IList) Activator.CreateInstance(genericType);
// do whatever you like with this list...
}
Note: as the type is known only at runtime, it's not possible to declare a List<T> of it when you write the code, so use the IList interface instead; the created object theList will still be of the expected generic type.
The following is the full function with all the generics done as I wanted. This will save a lot of typing! Thanks very much.
allResults = (List<WorkoutResultsRecord>)FillList(typeof(WorkoutResultsRecord),
sql, connection, new KVP("FROMDATE", fromUtf.Date),
new KVP("TODATE", endDate.AddDays(1).Date));
IList FillList(Type type,string sql,IDbConnection connection,
params KVP[] parameters)
{
Type genericType = typeof(List<>).MakeGenericType(type);
IList results = (IList)Activator.CreateInstance(genericType);
using (var command= Command(sql,connection))
{
foreach(KVP parameter in parameters)
CreateParam(command,parameter.Key,parameter.Value);
using (IDataReader reader = command.ExecuteReader())
while (reader.Read())
results.Add(Activator.CreateInstance(type,reader));
}
return results;
}
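If the element type is known at the call site, a thin generic wrapper over the same FillList avoids the cast in the calling code. This is just a sketch built on the helpers above:
private List<T> FillList<T>(string sql, IDbConnection connection, params KVP[] parameters)
{
    // The IList returned by FillList(Type, ...) really is a List<T>, so this cast is safe.
    return (List<T>)FillList(typeof(T), sql, connection, parameters);
}
// Usage:
// allResults = FillList<WorkoutResultsRecord>(sql, connection,
//     new KVP("FROMDATE", fromUtf.Date), new KVP("TODATE", endDate.AddDays(1).Date));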

Request server using Axis2 RPC way, parameter order in xml packets not correct

For example, I will send a Fruits object to the server side.
The code looks like this:
public static <T> T call(String url, String ns, String method, Fruits fruits, Class<T> clz) throws AxisFault
{
RPCServiceClient client = new RPCServiceClient();
Options option = client.getOptions();
EndpointReference erf = new EndpointReference(url);
option.setTo(erf);
QName name = new QName(ns, method);
Object[] object = new Object[]{fruits};
Class[] returnTypes = new Class[]{clz};
Object[] reto = client.invokeBlocking(name, object, returnTypes);
T t = (T)reto[0];
return t;
}
The object looks like this:
public class Fruits implements Serializable
{
private int pear;
private int banana;
private int apple;
public void setPear(int pear){this.pear=pear;}
public int getPear(){return this.pear;}
...
}
The XML part should be this:
...
<fruits>
<pear>10</pear>
<banana>20</banana>
<apple>60</apple>
</fruits>
...
But in fact it is like this:
...
<fruits>
<apple>60</apple>
<banana>20</banana>
<pear>10</pear>
</fruits>
...
Axis2 serializes the object's properties in alphabetical order, but the server doesn't accept that. I can't modify the server side; it is an ESB.
The only way to make a successful request is to use the Axis2 generated code. I used to use WSDL2Java, but it produces too much redundant code and is difficult to maintain, so I want to refactor.
I have also tried CXF, but it also puts the object's properties in alphabetical order, not the order defined in the WSDL/XSD or the DTO.
I've found the reason why CXF orders them this way: it uses java.beans.BeanInfo to get the properties of the object, such as:
...
BeanInfo beanInfo = Introspector.getBeanInfo(Fruits.class);
PropertyDescriptor[] propertyDescriptors = beanInfo.getPropertyDescriptors();
...
The properties in the array are already in alphabetical order.
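A minimal, self-contained sketch of that introspection (assuming the Fruits class above is on the classpath) shows the ordering directly:
import java.beans.BeanInfo;
import java.beans.Introspector;
import java.beans.PropertyDescriptor;

public class PropertyOrderDemo {
    public static void main(String[] args) throws Exception {
        BeanInfo beanInfo = Introspector.getBeanInfo(Fruits.class);
        for (PropertyDescriptor pd : beanInfo.getPropertyDescriptors()) {
            // Prints apple, banana, class, pear - alphabetical, not declaration order
            System.out.println(pd.getName());
        }
    }
}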
Does anyone know how to make Axis2 serialize the Fruits properties in the correct order?
Thanks in advance!
Not sure on Axis2, but if you are using CXF with the JAXB databinding, you can add an annotation like:
#XmlType(name = "fruits", propOrder = { "apple", "banana", "pear" }})
to the Fruits class to tell JAXB what order you need/want them output.
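Applied to the Fruits class from the question, that would look roughly like this (a sketch assuming a JAXB databinding that honors the annotation):
import javax.xml.bind.annotation.XmlType;

@XmlType(name = "fruits", propOrder = { "pear", "banana", "apple" })
public class Fruits implements java.io.Serializable
{
    private int pear;
    private int banana;
    private int apple;

    public void setPear(int pear){ this.pear = pear; }
    public int getPear(){ return this.pear; }
    // getters/setters for banana and apple omitted, as in the question
}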

SQLiteDataReader GetFieldType() returns Int64, but then fails on GetInt64() - is this a bug or feature?

When reading from an SQLiteDataReader I'm experiencing some odd behaviour whereby GetFieldType(0) returns typeof(Int64), GetValue(0) returns an Int64, but GetInt64(0) fails with a System.InvalidCastException.
It has taken me a rather long time to reproduce this behaviour:
using System;
using System.Data.SQLite;
using NUnit.Framework;
namespace Test
{
[TestFixture]
public class SQLiteType
{
[Test]
public void A()
{
var sqlConnection = new SQLiteConnection("Data Source=:memory:;Version=3;");
sqlConnection.Open();
var create = sqlConnection.CreateCommand();
create.CommandText = "CREATE TABLE FOO (x INTEGER)";
create.ExecuteNonQuery();
var insert = sqlConnection.CreateCommand();
insert.CommandText = "INSERT INTO FOO VALUES (?)";
var param = insert.CreateParameter();
param.Value = new TimeSpan(0); // NOTE INSERTING TIMESPAN DIRECTLY instead of .Ticks
insert.Parameters.Add(param);
insert.ExecuteNonQuery();
var select = sqlConnection.CreateCommand();
select.CommandText = "SELECT x FROM FOO";
var dr = select.ExecuteReader();
while (dr.Read())
{
var valueObject = dr.GetValue(0);
Assert.AreEqual(typeof (Int64), valueObject.GetType());
var valueType = dr.GetFieldType(0);
Assert.AreEqual(typeof (Int64), valueType);
var value = dr.GetInt64(0); // throws System.InvalidCastException
}
}
}
}
It seems to occur when the row is created by inserting a TimeSpan value directly into an INTEGER column (instead of e.g. TimeSpan.Ticks, which might be more meaningful). Despite this, the data reader is still telling me that the column is an Int64.
I'm not exactly sure what the contract is for SQLiteDataReader, but I had previously assumed that if GetFieldType() returns typeof(Int64), then GetInt64() should not fail. Perhaps this is not the case? (It seems quite odd that GetValue() still returns an Int64.) Maybe it is an artifact of SQLite's dynamic typing system.
Certainly it is not hard to avoid, but for pedagogical reasons I am curious why this is happening.
The root cause may have to do with how types are handled with SQLite:
http://www.sqlite.org/datatype3.html#affinity
Even then, this looks like a bug to me; if:
dr.GetValue(0).GetType() == typeof(System.Int64)
then it should certainly follow that dr.GetInt64(0) doesn't throw an exception. I would send an email to sqlite-users@sqlite.org as described here: http://www.sqlite.org/src/wiki?name=Bug+Reports
Please note though that if you replace:
param.Value = new TimeSpan(0);
with
param.Value = new TimeSpan(0).Ticks;
then
var value = dr.GetInt64(0);
works fine. I'm bringing this up because I'm not sure there is any conversion assumption to make when you assign that TimeSpan. For instance, there is no implicit or explicit conversion from TimeSpan to long.
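Until that is resolved, one workaround (a sketch) is to read the boxed value and convert it yourself, since GetValue() does return an Int64 here:
while (dr.Read())
{
    // Avoid the GetInt64() code path; convert the boxed value instead.
    long value = Convert.ToInt64(dr.GetValue(0));
}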
