Mybatis - Passing String[] to Oracle stored procedure - arrays

I have an Oracle stored procedure with the parameters below:
create type stringArray as table of varchar2(30)
/
CREATE OR REPLACE PROCEDURE create_deliverable
(
in_p_name varchar2,
in_p_filename stringArray
)AS
ret_ID number;
BEGIN
...
END;
/
where the filename parameter is a string array on the Oracle side.
The bean is defined as below:
@Data
public class BaseEntity {
private String name;
private String[] filename;
}
I want to pass the entire bean to Oracle stored procedure.
In my mapper.java
@Mapper
public interface BaseMapper {
void add(BaseEntity d);
}
In my BaseMapper.xml
<select id="add" statementType="CALLABLE" parameterType="BaseEntity">
call create_deliverable(
#{name},
#{filename,jdbcType=ARRAY, typeHandler=ArrayTypeHandler}
)
</select>
I attempted to write a type handler to deal with this case, but I could not get it to work.
Here's the failing handler:
public class ArrayTypeHandler extends BaseTypeHandler<Object> {
@Override
public void setNonNullParameter(PreparedStatement ps, int i, Object parameter, JdbcType jdbcType)
throws SQLException {
Class<?> componentType = parameter.getClass().getComponentType();
String arrayTypeName = resolveTypeName(componentType);
Array array = ps.getConnection().createArrayOf(arrayTypeName, (Object[]) parameter);
ps.setArray(i, array);
array.free();
}
}
The failure happens in the createArrayOf call; arrayTypeName resolves to VARCHAR, which is correct.
Here's the error message:
Could not set parameters for mapping: ParameterMapping{property='filename', mode=IN, javaType=class java.util.ArrayList, jdbcType=ARRAY, numericScale=null, resultMapId='null', jdbcTypeName='null', expression='null'}. Cause: org.apache.ibatis.type.TypeException: Error setting non null for parameter #2 with JdbcType ARRAY
Any input will be greatly appreciated.
Thanks

@ave has given the correct answer.
Since I'm using Spring Boot, here's a recap of the full solution:
1) I'm using the default Hikari pool and hadn't touched the datasource before; it's autowired. It needs to be overridden, otherwise the error message below pops up:
Cause: java.lang.ClassCastException: class
com.zaxxer.hikari.pool.HikariProxyConnection cannot be cast to class
oracle.jdbc.OracleConnection
(com.zaxxer.hikari.pool.HikariProxyConnection and
oracle.jdbc.OracleConnection are in unnamed module of loader 'app'
Here's my application.yml
spring:
  application:
    name: tools
  datasource:
    username: myuser
    password: mypassword
    url: jdbc:oracle:thin:@(DESCRIPTION=(ADDRESS=(PROTOCOL=tcp)(HOST=localhost)(PORT=1521))(CONNECT_DATA=(SERVICE_NAME=orclpdb)))
    driver-class-name: oracle.jdbc.OracleDriver
Here's my datasource bean (the fields and the @Bean method live in a @Configuration class):
import java.sql.SQLException;
import javax.sql.DataSource;
import oracle.jdbc.pool.OracleDataSource;

@Value("${spring.datasource.username}")
String username;
@Value("${spring.datasource.password}")
String password;
@Value("${spring.datasource.url}")
String url;

@Bean
DataSource oracleDataSource() throws SQLException {
    OracleDataSource dataSource = new OracleDataSource();
    dataSource.setUser(username);
    dataSource.setPassword(password);
    dataSource.setURL(url);
    return dataSource;
}
2) Here's my entity bean (I created new ones so as not to change the previous ones):
@Data
public class TestEntity {
private String name;
private String[] filename;
}
3) Here's my mapper interface:
@Mapper
public interface TestMapper {
void add(TestEntity t);
}
4) Here's my mapper XML:
<select id="add" statementType="CALLABLE" parameterType="org.ssc.gss.entity.TestEntity">
call create_deliverable_test(
#{name,mode=IN},
#{filename,mode=IN,jdbcType=ARRAY,javaType=ArrayList, typeHandler=OracleStringArrayTypeHandler}
)
</select>
Please note that the filename parameter needs an explicit typeHandler, as the MyBatis default one won't work.
5) The OracleStringArrayTypeHandler: see ave's answer. There are a few more methods to implement, but the key part is setting the parameter, and I used
import oracle.jdbc.OracleConnection;
instead of
oracle.jdbc.driver.OracleDriver;
as the latter has been deprecated.
The Oracle stored procedure part:
create type stringArray as table of varchar2(30)
/
CREATE OR REPLACE PROCEDURE create_deliverable_test
(
in_p_name varchar2,
in_p_filename stringArray
)AS
ret_ID number;
BEGIN
...
END;
/
Thanks to @ave again. Without @ave's help I couldn't have moved forward; I had been stuck for days. I had previously been thinking of using a delimiter-separated string instead, but a String[] is much cleaner and more convenient.

Oracle's JDBC driver (as of version 19.8.0.0) does not support java.sql.Connection#createArrayOf() which is used by MyBatis' built-in ArrayTypeHandler, unfortunately.
So, you need to write a custom type handler.
I just tested and the following implementation worked.
package test;
import java.sql.Array;
import java.sql.CallableStatement;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import org.apache.ibatis.type.BaseTypeHandler;
import org.apache.ibatis.type.JdbcType;
import oracle.jdbc.OracleConnection;
public class OracleStringArrayTypeHandler extends BaseTypeHandler<String[]> {
@Override
public void setNonNullParameter(PreparedStatement ps, int i,
String[] parameter, JdbcType jdbcType) throws SQLException {
OracleConnection conn = ps.getConnection().unwrap(OracleConnection.class);
Array array = conn.createOracleArray("STRINGARRAY", parameter);
ps.setArray(i, array);
array.free();
}
...
And specify the type handler in the parameter reference.
call create_deliverable(
#{name},
#{filename,jdbcType=ARRAY,typeHandler=test.OracleStringArrayTypeHandler}
)
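The remaining BaseTypeHandler methods (elided above with "...") handle reading results back. A minimal sketch, assuming the array column maps back to a VARCHAR2 collection, could look like this:
@Override
public String[] getNullableResult(ResultSet rs, String columnName) throws SQLException {
    return toStringArray(rs.getArray(columnName));
}
@Override
public String[] getNullableResult(ResultSet rs, int columnIndex) throws SQLException {
    return toStringArray(rs.getArray(columnIndex));
}
@Override
public String[] getNullableResult(CallableStatement cs, int columnIndex) throws SQLException {
    return toStringArray(cs.getArray(columnIndex));
}
// Helper: java.sql.Array.getArray() returns an Object[] here, so copy it into a
// String[] (requires an additional import of java.util.Arrays).
private String[] toStringArray(Array array) throws SQLException {
    if (array == null) {
        return null;
    }
    Object[] raw = (Object[]) array.getArray();
    return Arrays.copyOf(raw, raw.length, String[].class);
}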

Related

Why Spring is turning my object into an array of attributes? [duplicate]

I'm developing a Spring Boot application with Spring Data JPA. I'm using a custom JPQL query to group by some field and get the count. Following is my repository method.
@Query(value = "select count(v) as cnt, v.answer from Survey v group by v.answer")
public List<?> findSurveyCount();
It's working and result is obtained as follows:
[
[1, "a1"],
[2, "a2"]
]
I would like to get something like this:
[
{ "cnt":1, "answer":"a1" },
{ "cnt":2, "answer":"a2" }
]
How can I achieve this?
Solution for JPQL queries
This is supported for JPQL queries within the JPA specification.
Step 1: Declare a simple bean class
package com.path.to;
public class SurveyAnswerStatistics {
private String answer;
private Long cnt;
public SurveyAnswerStatistics(String answer, Long cnt) {
this.answer = answer;
this.cnt = cnt;
}
}
Step 2: Return bean instances from the repository method
public interface SurveyRepository extends CrudRepository<Survey, Long> {
#Query("SELECT " +
" new com.path.to.SurveyAnswerStatistics(v.answer, COUNT(v)) " +
"FROM " +
" Survey v " +
"GROUP BY " +
" v.answer")
List<SurveyAnswerStatistics> findSurveyCount();
}
Important notes
Make sure to provide the fully-qualified path to the bean class, including the package name. For example, if the bean class is called MyBean and it is in package com.path.to, the fully-qualified path to the bean will be com.path.to.MyBean. Simply providing MyBean will not work (unless the bean class is in the default package).
Make sure to call the bean class constructor using the new keyword. SELECT new com.path.to.MyBean(...) will work, whereas SELECT com.path.to.MyBean(...) will not.
Make sure to pass attributes in exactly the same order as that expected in the bean constructor. Attempting to pass attributes in a different order will lead to an exception.
Make sure the query is a valid JPA query, that is, it is not a native query. @Query("SELECT ..."), or @Query(value = "SELECT ..."), or @Query(value = "SELECT ...", nativeQuery = false) will work, whereas @Query(value = "SELECT ...", nativeQuery = true) will not work. This is because native queries are passed without modifications to the JPA provider, and are executed against the underlying RDBMS as such. Since new and com.path.to.MyBean are not valid SQL keywords, the RDBMS then throws an exception.
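As a usage sketch (the controller class and request path here are illustrative, not part of the original setup), the DTO list can be returned directly and serialized to the desired JSON shape, provided SurveyAnswerStatistics exposes getters (e.g. via Lombok's @Getter or hand-written accessors):
import java.util.List;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class SurveyStatisticsController {
    private final SurveyRepository surveyRepository;

    public SurveyStatisticsController(SurveyRepository surveyRepository) {
        this.surveyRepository = surveyRepository;
    }

    // Produces JSON like [{"answer":"a1","cnt":1},{"answer":"a2","cnt":2}]
    @GetMapping("/survey/statistics")
    public List<SurveyAnswerStatistics> statistics() {
        return surveyRepository.findSurveyCount();
    }
}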
Solution for native queries
As noted above, the new ... syntax is a JPA-supported mechanism and works with all JPA providers. However, if the query itself is not a JPA query, that is, it is a native query, the new ... syntax will not work as the query is passed on directly to the underlying RDBMS, which does not understand the new keyword since it is not part of the SQL standard.
In situations like these, bean classes need to be replaced with Spring Data Projection interfaces.
Step 1: Declare a projection interface
package com.path.to;
public interface SurveyAnswerStatistics {
String getAnswer();
int getCnt();
}
Step 2: Return projected properties from the query
public interface SurveyRepository extends CrudRepository<Survey, Long> {
@Query(nativeQuery = true, value =
"SELECT " +
" v.answer AS answer, COUNT(v) AS cnt " +
"FROM " +
" Survey v " +
"GROUP BY " +
" v.answer")
List<SurveyAnswerStatistics> findSurveyCount();
}
Use the SQL AS keyword to map result fields to projection properties for unambiguous mapping.
This SQL query would return a List<Object[]>.
You can do it this way:
@RestController
@RequestMapping("/survey")
public class SurveyController {
@Autowired
private SurveyRepository surveyRepository;
@RequestMapping(value = "/find", method = RequestMethod.GET)
public Map<Long,String> findSurvey(){
List<Object[]> result = surveyRepository.findSurveyCount();
Map<Long,String> map = null;
if(result != null && !result.isEmpty()){
map = new HashMap<Long,String>();
for (Object[] object : result) {
map.put((Long) object[0], (String) object[1]);
}
}
return map;
}
}
I know this is an old question and it has already been answered, but here's another approach:
#Query("select new map(count(v) as cnt, v.answer) from Survey v group by v.answer")
public List<?> findSurveyCount();
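With this approach each row comes back as a Map keyed by the aliases in the select clause, so a caller might consume the result roughly like this (a sketch; it assumes Hibernate as the JPA provider, since new map(...) is a Hibernate extension, and that both expressions are aliased, e.g. v.answer as answer):
@SuppressWarnings("unchecked")
List<Map<String, Object>> rows = (List<Map<String, Object>>) surveyRepository.findSurveyCount();
for (Map<String, Object> row : rows) {
    Long cnt = (Long) row.get("cnt");
    String answer = (String) row.get("answer");
    // build {"cnt":1,"answer":"a1"}-style output from cnt and answer here
}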
Define a custom POJO class, say SurveyQueryAnalytics, and store the values returned by the query in it:
@Query(value = "select new com.xxx.xxx.SurveyQueryAnalytics(s.answer, count(s)) from Survey s group by s.answer")
List<SurveyQueryAnalytics> calculateSurveyCount();
I do not like Java type names in query strings, so I handle it with a specific constructor. Spring Data JPA implicitly calls the constructor with the query result in a HashMap parameter:
@Getter
public class SurveyAnswerStatistics {
public static final String PROP_ANSWER = "answer";
public static final String PROP_CNT = "cnt";
private String answer;
private Long cnt;
public SurveyAnswerStatistics(HashMap<String, Object> values) {
this.answer = (String) values.get(PROP_ANSWER);
this.cnt = (Long) values.get(PROP_CNT);
}
}
@Query("SELECT v.answer as "+PROP_ANSWER+", count(v) as "+PROP_CNT+" FROM Survey v GROUP BY v.answer")
List<SurveyAnswerStatistics> findSurveyCount();
The code needs Lombok for resolving @Getter.
@Repository
public interface ExpenseRepo extends JpaRepository<Expense,Long> {
List<Expense> findByCategoryId(Long categoryId);
@Query(value = "select category.name,SUM(expense.amount) from expense JOIN category ON expense.category_id=category.id GROUP BY expense.category_id",nativeQuery = true)
List<?> getAmountByCategory();
}
The above code worked for me.
I used a custom DTO (interface) to map the native query to; it's the most flexible and refactoring-safe approach.
The problem I had with this is that, surprisingly, the order of the getters in the interface and of the columns in the query matters. I got it working by ordering the interface getters alphabetically and then ordering the columns in the query the same way, as sketched below.
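An illustrative sketch of that alignment (the names here are hypothetical): the interface declares its getters in alphabetical order, and the native query lists its columns, with matching aliases, in the same order.
// Projection interface: getters declared alphabetically (answer, then cnt).
public interface SurveyAnswerView {
    String getAnswer();
    Long getCnt();
}

// Matching repository method: columns aliased and ordered the same way.
public interface SurveyViewRepository extends CrudRepository<Survey, Long> {
    @Query(nativeQuery = true, value =
        "SELECT v.answer AS answer, COUNT(*) AS cnt FROM survey v GROUP BY v.answer")
    List<SurveyAnswerView> findSurveyCounts();
}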
I just solved this problem:
Class-based projections don't work with native queries (@Query(value = "SELECT ...", nativeQuery = true)), so I recommend defining a custom DTO using an interface.
Before using the DTO, verify that the query is syntactically correct.
Get data with column names and their values (as key-value pairs) using JDBC:
/*Template class with a basic set of JDBC operations, allowing the use
of named parameters rather than traditional '?' placeholders.
This class delegates to a wrapped {@link #getJdbcOperations() JdbcTemplate}
once the substitution from named parameters to JDBC style '?' placeholders is
done at execution time. It also allows for expanding a {@link java.util.List}
of values to the appropriate number of placeholders.
The underlying {@link org.springframework.jdbc.core.JdbcTemplate} is
exposed to allow for convenient access to the traditional
{@link org.springframework.jdbc.core.JdbcTemplate} methods.*/
@Autowired
protected NamedParameterJdbcTemplate jdbc;
@GetMapping("/showDataUsingQuery/{Query}")
public List<Map<String,Object>> ShowColumNameAndValue(@PathVariable("Query")String Query) throws SQLException {
/* MapSqlParameterSource class is intended for passing in a simple Map of parameter values
to the methods of the {@link NamedParameterJdbcTemplate} class*/
MapSqlParameterSource msp = new MapSqlParameterSource();
// this query used for show column name and columnvalues....
List<Map<String,Object>> css = jdbc.queryForList(Query,msp);
return css;
}
// In the service:
public List<DevicesPerCustomer> findDevicesPerCustomer() {
LOGGER.info(TAG_NAME + " :: inside findDevicesPerCustomer : ");
List<Object[]> list = iDeviceRegistrationRepo.findDevicesPerCustomer();
List<DevicesPerCustomer> out = new ArrayList<>();
if (list != null && !list.isEmpty()) {
DevicesPerCustomer mDevicesPerCustomer = null;
for (Object[] object : list) {
mDevicesPerCustomer = new DevicesPerCustomer();
mDevicesPerCustomer.setCustomerId(object[0].toString());
mDevicesPerCustomer.setCount(Integer.parseInt(object[1].toString()));
out.add(mDevicesPerCustomer);
}
}
return out;
}
// In the repository:
@Query(value = "SELECT d.customerId,count(*) FROM senseer.DEVICE_REGISTRATION d where d.customerId is not null group by d.customerId", nativeQuery=true)
List<Object[]> findDevicesPerCustomer();

JPA2 CriteriaBuilder: Using LOB property for greaterThan comparison

My application is using SQLServer and JPA2 in the backend. The app makes use of a timestamp column (in the SQLServer sense, which is equivalent to rowversion) per entity to keep track of freshly modified entities. NB: SQLServer stores this column as binary(8).
Each entity has a respective timestamp property, mapped as #Lob, which is the way to go for binary columns:
@Lob
@Column(columnDefinition="timestamp", insertable=false, updatable=false)
public byte[] getTimestamp() {
...
The server sends incremental updates to mobile clients along with the latest database timestamp. The mobile client will then pass the old timestamp back to the server on the next refresh request so that the server knows to return only fresh data. Here's what a typical query (in JPQL) looks like:
select v from Visit v where v.timestamp > :oldTimestamp
Please note that I'm using a byte array as a query parameter and it works fine when implemented in JPQL this way.
My problems begin when trying to do the same using the Criteria API:
private void getFreshVisits(byte[] oldVersion) {
EntityManager em = getEntityManager();
CriteriaBuilder cb = em.getCriteriaBuilder();
CriteriaQuery<Visit> cq = cb.createQuery(Visit.class);
Root<Visit> root = cq.from(Visit.class);
Predicate tsPred = cb.gt(root.get("timestamp").as(byte[].class), oldVersion); // compiler error
cq.where(tsPred);
...
}
The above results in a compiler error because gt can only be used with Number. One could instead use the greaterThan method, which merely requires the params to be Comparable, but byte[] is not Comparable, so that results in yet another compiler error.
So to sum it up, my question is: how can I use the criteria api to add a greaterThan predicate for a byte[] property? Any help will be greatly appreciated.
PS. As to why I'm not using a regular DateTime last_modified column: because of concurrency and the way synchronization is implemented, this approach could result in lost updates. Microsoft's Sync Framework documentation recommends the former approach as well.
I know this was asked a couple of years back but just in case anyone else stumbles upon this.. In order to use a SQLServer rowver column within JPA you need to do a couple of things..
Create a type that will wrap the rowver/timestamp:
import com.fasterxml.jackson.annotation.JsonIgnore;
import javax.xml.bind.annotation.XmlTransient;
import java.io.Serializable;
import java.math.BigInteger;
import java.util.Arrays;
/**
* A RowVersion object
*/
public class RowVersion implements Serializable, Comparable<RowVersion> {
@XmlTransient
@JsonIgnore
private byte[] rowver;
public RowVersion() {
}
public RowVersion(byte[] internal) {
this.rowver = internal;
}
@XmlTransient
@JsonIgnore
public byte[] getRowver() {
return rowver;
}
public void setRowver(byte[] rowver) {
this.rowver = rowver;
}
@Override
public int compareTo(RowVersion o) {
return new BigInteger(1, rowver).compareTo(new BigInteger(1, o.getRowver()));
}
@Override
public boolean equals(Object o) {
if (this == o) return true;
if (o == null || getClass() != o.getClass()) return false;
RowVersion that = (RowVersion) o;
return Arrays.equals(rowver, that.rowver);
}
@Override
public int hashCode() {
return Arrays.hashCode(rowver);
}
}
The key here is that it implement Comparable if you want to use it in calculations (which you definitely do)..
Next create an AttributeConverter that will move from a byte[] to the class you just made:
import javax.persistence.AttributeConverter;
import javax.persistence.Converter;
/**
* JPA converter for the RowVersion type
*/
@Converter
public class RowVersionTypeConverter implements AttributeConverter<RowVersion, byte[]> {
@Override
public byte[] convertToDatabaseColumn(RowVersion attribute) {
return attribute != null ? attribute.getRowver() : null;
}
@Override
public RowVersion convertToEntityAttribute(byte[] dbData) {
return new RowVersion(dbData);
}
}
Now let's apply this RowVersion attribute/type to a real world scenario. Let's say you wanted to find all Programs that have changed on or before some point in time.
One straightforward way to solve this would be to use a DateTime field in the object and timestamp column within db. Then you would use 'where lastUpdatedDate <= :date'.
Suppose that you don't have that timestamp column or there's no guarantee that it will be updated properly when changes are made; or let's say your shop loves SQLServer and wants to use rowver instead.
What to do? There are two issues to solve: one, how to generate a rowver, and two, how to use the generated rowver to find Programs.
Since the database generates the rowver, you can either ask the db for the 'current max rowver' (a custom SQL Server thing) or you can simply save an object that has a RowVersion attribute and then use that object's generated RowVersion as the boundary for the query to find the Programs changed after that time. The latter solution is more portable and is what is shown below.
The SyncPoint class snippet below is the object that is used as a 'point in time' kind of deal. So once a SyncPoint is saved, the RowVersion attached to it is the db version at the time it was saved.
Here is the SyncPoint snippet. Notice the annotation to specify the custom converter (don't forget to make the column insertable = false, updatable = false):
/**
* A sample super class that uses RowVersion
*/
@MappedSuperclass
public abstract class SyncPoint {
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
private Long id;
// type is rowver for SQLServer, blob(8) for postgresql and h2
@Column(name = "current_database_version", insertable = false, updatable = false)
@Convert(converter = RowVersionTypeConverter.class)
private RowVersion currentDatabaseVersion;
@Column(name = "created_date_utc", columnDefinition = "timestamp", nullable = false)
private DateTime createdDate;
...
Also (for this example) here is the Program object we want to find:
@Entity
@Table(name = "program_table")
public class Program {
@Id
private Integer id;
private boolean active;
// type is rowver for SQLServer, blob(8) for postgresql and h2
@Column(name = "rowver", insertable = false, updatable = false)
@Convert(converter = RowVersionTypeConverter.class)
private RowVersion currentDatabaseVersion;
@Column(name = "last_chng_dt")
private DateTime lastUpdatedDate;
...
Now you can use these fields within your JPA criteria queries just like anything else.. here is a snippet that we used inside a spring-data Specifications class:
/**
* Find Programs changed after a synchronization point
*
* @param filter that has the changedAfter sync point
* @return a specification or null
*/
public Specification<Program> changedBeforeOrEqualTo(final ProgramSearchFilter filter) {
return new Specification<Program>() {
@Override
public Predicate toPredicate(Root<Program> root, CriteriaQuery<?> query, CriteriaBuilder cb) {
if (filter != null && filter.changedAfter() != null) {
// load the SyncPoint from the db to get the rowver column populated
SyncPoint fromDb = synchronizationPersistence.reload(filter.changedBeforeOrEqualTo());
if (fromDb != null) {
// real sync point made by database
if (fromDb.getCurrentDatabaseVersion() != null) {
// use binary version
return cb.lessThanOrEqualTo(root.get(Program_.currentDatabaseVersion),
fromDb.getCurrentDatabaseVersion());
} else if (fromDb.getCreatedDate() != null) {
// use timestamp instead of binary version cause db doesn't make one
return cb.lessThanOrEqualTo(root.get(Program_.lastUpdatedDate),
fromDb.getCreatedDate());
}
}
}
return null;
}
};
}
The specification above works with both the binary current database version or a timestamp.. this way I could test my stuff and all the upstream code on a database other than SQLServer.
That's it really: a) type to wrap the byte[] b) JPA converter c) use attribute in query.
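Tying this back to the original getFreshVisits example, a criteria query over the wrapped type might look roughly like this (a sketch; it assumes the Visit entity maps its timestamp column to RowVersion via the converter above):
private List<Visit> getFreshVisits(RowVersion oldVersion) {
    EntityManager em = getEntityManager();
    CriteriaBuilder cb = em.getCriteriaBuilder();
    CriteriaQuery<Visit> cq = cb.createQuery(Visit.class);
    Root<Visit> root = cq.from(Visit.class);
    // RowVersion implements Comparable, so greaterThan now compiles and
    // compares the binary row versions correctly
    cq.where(cb.greaterThan(root.<RowVersion>get("timestamp"), oldVersion));
    return em.createQuery(cq).getResultList();
}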

How does one make NHibernate stop using nvarchar(4000) for insert parameter strings?

I need to optimize a query that is being produced by a save (insert query) on a domain entity. I've configured NHibernate using Fluent NHibernate.
Here's the query generated by NHibernate during the insertion of a user's response to a poll:
exec sp_executesql N'INSERT INTO dbo.Response (ModifiedDate, IpAddress, CountryCode,
IsRemoteAddr, PollId) VALUES (@p0, @p1, @p2, @p3, @p4); select SCOPE_IDENTITY()',N'@p0
datetime,@p1 nvarchar(4000),@p2 nvarchar(4000),@p3 bit,@p4 int',
@p0='2001-07-08 03:59:05',@p1=N'127.0.0.1',@p2=N'US',@p3=1,@p4=2
If one looks at the input parameters for IpAddress and CountryCode, one will notice that NHibernate is using nvarchar(4000). The problem is that nvarchar(4000) is far larger than I need for either IpAddress or CountryCode and due to high traffic and hosting requirements I need to optimize the database for memory usage.
Here's the Fluent NHibernate auto-mapping overrides for those columns:
mapping.Map(x => x.IpAddress).CustomSqlType("varchar(15)");
mapping.Map(x => x.CountryCode).CustomSqlType("varchar(6)");
This isn't the only place that I see unnecessary nvarchar(4000)'s popping up.
How do I control NHibernate's usage of nvarchar(4000) for string representation?
How do I change this insert statement to use the proper sized input parameters?
Specify the Type as NHibernateUtil.AnsiString with a Length instead of using a CustomSqlType.
This issue can cause a huge performance problem in queries if it forces SQL Server to perform a table scan instead of using an index. We use varchar throughout our database so I created a convention to set the type globally:
/// <summary>
/// Convert all string properties to AnsiString (varchar). This does not work with SQL CE.
/// </summary>
public class AnsiStringConvention : IPropertyConventionAcceptance, IPropertyConvention
{
public void Accept(IAcceptanceCriteria<IPropertyInspector> criteria)
{
criteria.Expect(x => x.Property.PropertyType.Equals(typeof(string)));
}
public void Apply(IPropertyInstance instance)
{
instance.CustomType("AnsiString");
}
}
Okay, this is what we had to do: the SqlClientDriver ignores the length property of the SqlType. So we created our own driver class inheriting from SqlClientDriver and overrode the GenerateCommand method. Something like this:
public override IDbCommand GenerateCommand(CommandType type, NHibernate.SqlCommand.SqlString sqlString, SqlType[] parameterTypes)
{
var dbCommand = base.GenerateCommand(type, sqlString, parameterTypes);
SetParameterSizes(dbCommand.Parameters, parameterTypes);
return dbCommand;
}
private static void SetParameterSizes(IDataParameterCollection parameters, SqlType[] parameterTypes)
{
for (int index = 0; index < parameters.Count; ++index)
SetVariableLengthParameterSize((IDbDataParameter)parameters[index], parameterTypes[index]);
}
private static void SetVariableLengthParameterSize(IDbDataParameter dbParam, SqlType sqlType)
{
SetDefaultParameterSize(dbParam, sqlType);
if (sqlType.LengthDefined && !IsText(dbParam, sqlType) && !IsBlob(dbParam, sqlType))
dbParam.Size = sqlType.Length;
if (!sqlType.PrecisionDefined)
return;
dbParam.Precision = sqlType.Precision;
dbParam.Scale = sqlType.Scale;
}
Here is a workaround if you want to replace all nvarchar parameters with varchar:
public class Sql2008NoNVarCharDriver : Sql2008ClientDriver
{
public override void AdjustCommand(IDbCommand command)
{
foreach (System.Data.SqlClient.SqlParameter x in command.Parameters)
{
if (x.SqlDbType == SqlDbType.NVarChar)
{
x.SqlDbType = SqlDbType.VarChar;
}
}
base.AdjustCommand(command);
}
}
Then plug it into your config
var cfg = Fluently.Configure()
.Database(MsSqlConfiguration.MsSql2008.ConnectionString(connectionString)
.Driver<Sql2008NoNVarCharDriver>())
...

Grails constraints GORM-JPA always sorting alphabeticaly

In my Grails app, in which I use GORM-JPA, I cannot control the order of the class's properties using the constraints. If I autogenerate the views, the fields are all sorted alphabetically instead of in the defined order. Here's my source class:
package kbdw
import javax.persistence.*;
// import com.google.appengine.api.datastore.Key;
@Entity
class Organisatie implements Serializable {
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
Long id
@Basic
String naam
@Basic
String telefoonnummer
@Basic
String email
@Basic
OrganisatieType type
@Basic
String adresLijnEen
@Basic
String adresLijnTwee
@Basic
String gemeente
@Basic
String postcode
@Basic
String faxnummer
static constraints = {
id visible:false
naam size: 3..75
telefoonnummer size: 4..18
email email:true
type blank:false
adresLijnEen size:5..250
adresLijnTwee blank:true
gemeente size: 2..100
postcode size: 4..10
faxnummer size: 4..18
}
}
enum OrganisatieType {
School,
NonProfit,
Bedrijf
}
The variable names are in Dutch, but it should be clear (Organisatie = organisation, naam = name, adres = address, ...).
How do I force the app to use that order of properties? Do I need to use @ annotations?
Thank you!
Yvan
(ps: it's for deploying on the Google App Engine ;-) )
Try installing and hacking scaffolding, and use DomainClassPropertyComparator in your GSPs. Scaffold templates do a Collections.sort() with the default comparator, but you can use an explicit one.
The absence of Hibernate might be the cause: without it, DomainClassPropertyComparator won't work, and Grails uses SimpleDomainClassPropertyComparator - I'm looking at DefaultGrailsTemplateGenerator.groovy
You can, for sure, provide another Comparator that will compare the order of declared fields.
EDIT:
For example, after installing scaffolding I have a file <project root>\src\templates\scaffolding\edit.gsp. Inside, there are such lines:
props = domainClass.properties.findAll{ ... }
Collections.sort(props, comparator. ... )
where comparator is variable provided by Grails scaffolding. You can do:
props = ...
Collections.sort(props, new PropComparator(domainClass.clazz}))
where PropComparator is something like
class PropComparator implements Comparator {
private Class clazz
PropComparator(Class clazz) { this.clazz = clazz }
int compare(Object o1, Object o2) {
clazz.declaredFields.findIndexOf{it.name == o1.name} -
clazz.declaredFields.findIndexOf{it.name == o2.name}
}
}

How to update a postgresql array column with spring JdbcTemplate?

I'm using Spring JdbcTemplate, and I'm stuck at the point where I have a query that updates a column that is actually an array of int. The database is postgres 8.3.7.
This is the code I'm using :
public int setUsersArray(int idUser, int idDevice, Collection<Integer> ids) {
int update = -666;
int[] tipi = new int[3];
tipi[0] = java.sql.Types.INTEGER;
tipi[1] = java.sql.Types.INTEGER;
tipi[2] = java.sql.Types.ARRAY;
try {
update = this.jdbcTemplate.update(setUsersArrayQuery, new Object[] {
ids, idUser, idDevice }, tipi);
} catch (Exception e) {
e.printStackTrace();
}
return update;
}
The query is "update table_name set array_column = ? where id_user = ? and id_device = ?".
I get this exception :
org.springframework.dao.DataIntegrityViolationException: PreparedStatementCallback; SQL [update acotel_msp.users_mau set denied_sub_client = ? where id_users = ? and id_mau = ?]; The column index is out of range: 4, number of columns: 3.; nested exception is org.postgresql.util.PSQLException: The column index is out of range: 4, number of columns: 3.
Caused by: org.postgresql.util.PSQLException: The column index is out of range: 4, number of columns: 3.
I've looked into spring jdbc template docs but I can't find any help, I'll keep looking, anyway could someone point me to the right direction? Thanks!
EDIT:
Obviously the order was wrong, my fault...
I tried both your solutions. In the first case I got this:
org.springframework.jdbc.BadSqlGrammarException: PreparedStatementCallback; bad SQL grammar [update users set denied_sub_client = ? where id_users = ? and id_device = ?]; nested exception is org.postgresql.util.PSQLException: Cannot cast an instance of java.util.ArrayList to type Types.ARRAY
Trying the second solution I got this:
org.springframework.jdbc.BadSqlGrammarException: PreparedStatementCallback; bad SQL grammar [update users set denied_sub_client = ? where id_users = ? and id_device = ?]; nested exception is org.postgresql.util.PSQLException: Cannot cast an instance of [Ljava.lang.Object; to type Types.ARRAY
I suppose I need an instance of java.sql.Array, but how can I create it using JdbcTemplate?
After struggling with many attempts, we settled on a little helper, ArraySqlValue, to create Spring SqlValue objects for Java array types.
Usage is like this:
jdbcTemplate.update(
"UPDATE sometable SET arraycolumn = ?",
ArraySqlValue.create(arrayValue))
The ArraySqlValue can also be used in MapSqlParameterSource for use with NamedParameterJdbcTemplate.
import static com.google.common.base.Preconditions.checkNotNull;
import java.sql.Array;
import java.sql.JDBCType;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.Locale;
import org.springframework.jdbc.core.StatementCreatorUtils;
import org.springframework.jdbc.support.SqlValue;
public class ArraySqlValue implements SqlValue {
private final Object[] arr;
private final String dbTypeName;
public static ArraySqlValue create(final Object[] arr) {
return new ArraySqlValue(arr, determineDbTypeName(arr));
}
public static ArraySqlValue create(final Object[] arr, final String dbTypeName) {
return new ArraySqlValue(arr, dbTypeName);
}
private ArraySqlValue(final Object[] arr, final String dbTypeName) {
this.arr = checkNotNull(arr);
this.dbTypeName = checkNotNull(dbTypeName);
}
@Override
public void setValue(final PreparedStatement ps, final int paramIndex) throws SQLException {
final Array arrayValue = ps.getConnection().createArrayOf(dbTypeName, arr);
ps.setArray(paramIndex, arrayValue);
}
@Override
public void cleanup() {}
private static String determineDbTypeName(final Object[] arr) {
// use Spring Utils similar to normal JdbcTemplate inner workings
final int sqlParameterType =
StatementCreatorUtils.javaTypeToSqlParameterType(arr.getClass().getComponentType());
final JDBCType jdbcTypeToUse = JDBCType.valueOf(sqlParameterType);
// lowercasing typename for Postgres
final String typeNameToUse = jdbcTypeToUse.getName().toLowerCase(Locale.US);
return typeNameToUse;
}
}
this code is provided in the Public Domain
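For example, with NamedParameterJdbcTemplate and MapSqlParameterSource the helper might be used roughly like this (a sketch; the table and column names are the ones from the question):
// Spring detects SqlValue instances among the parameter values and calls
// setValue() when binding, so the array is created on the target connection.
MapSqlParameterSource params = new MapSqlParameterSource()
        .addValue("ids", ArraySqlValue.create(new Integer[] {1, 2, 3}))
        .addValue("idUser", idUser)
        .addValue("idDevice", idDevice);
namedParameterJdbcTemplate.update(
        "update table_name set array_column = :ids where id_user = :idUser and id_device = :idDevice",
        params);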
private static final String ARRAY_DATATYPE = "int4";
private static final String SQL_UPDATE = "UPDATE foo SET arr = ? WHERE d = ?";
final Integer[] existing = ...;
final DateTime dt = ...;
getJdbcTemplate().update(new PreparedStatementCreator() {
@Override
public PreparedStatement createPreparedStatement(final Connection con) throws SQLException {
final PreparedStatement ret = con.prepareStatement(SQL_UPDATE);
ret.setArray(1, con.createArrayOf(ARRAY_DATATYPE, existing));
ret.setDate(2, new java.sql.Date(dt.getMillis()));
return ret;
}
});
This solution is kind of a workaround using a PostgreSQL built-in function, and it definitely worked for me.
1) Convert the String array to a comma-separated String
If you are using Java 8, it's pretty easy:
String commaSeparatedString = String.join(",", stringArray); // Java 8 feature
2) Use the PostgreSQL built-in function string_to_array()
You can find other PostgreSQL array functions in the documentation.
// tableName ( name text, string_array_column_name text[] )
String query = "insert into tableName(name, string_array_column_name) values(?, string_to_array(?,',') )";
int[] types = new int[] { Types.VARCHAR, Types.VARCHAR };
Object[] psParams = new Object[] { "Dhruvil Thaker", commaSeparatedString };
jdbcTemplate.update(query, psParams, types); // assuming you have a JdbcTemplate instance
The cleanest way I found so far is to first convert the Collection into an Integer[] and then use the Connection to convert that into an Array.
Integer[] idArray = ids.toArray(new Integer[0]);
Array idSqlArray = jdbcTemplate.execute(
(Connection c) -> c.createArrayOf(JDBCType.INTEGER.getName(), idArray)
);
update = this.jdbcTemplate.update(setUsersArrayQuery, new Object[] {
idSqlArray, idUser, idDevice })
This is based on information in the documentation: https://jdbc.postgresql.org/documentation/head/arrays.html
The argument types and arguments are not matching.
Try changing the argument type order:
int[] tipi = new int[3];
tipi[0] = java.sql.Types.ARRAY;
tipi[1] = java.sql.Types.INTEGER;
tipi[2] = java.sql.Types.INTEGER;
or use
update = this.jdbcTemplate.update(setUsersArrayQuery, new Object[] {
ids.toArray(), idUser, idDevice })
and see if it works
http://valgogtech.blogspot.com/2009/02/passing-arrays-to-postgresql-database.html explains how to create a java.sql.Array for PostgreSQL.
Basically, Array.getBaseTypeName should return int and Array.toString should return the array content in "{1,2,3}" format.
After you create the array you can set it using PreparedStatement.setArray(...)
from a PreparedStatementCreator, e.g.
jdbcTemplate.update(
new PreparedStatementCreator() {
public PreparedStatement createPreparedStatement(Connection connection) throws SQLException {
...
Good Luck ..

java.sql.Array intArray = connection.createArrayOf("int", existing);
List<Object> values = new ArrayList<Object>();
values.add(intArray);
values.add(dt);
getJdbcTemplate().update(SQL_UPDATE, values);
