Using Microsoft.SqlServer.TransactSql.ScriptDom to parse a query with errors - sql-server

I use the following code to get a list of statements in the query:
using System;
using System.Collections.Generic;
using System.IO;
using System.Windows.Forms;
using Microsoft.SqlServer.TransactSql.ScriptDom;

namespace SqlTokenazer
{
    public partial class Form1 : Form
    {
        public Form1()
        {
            InitializeComponent();
        }

        private void Form1_Load(object sender, EventArgs e)
        {
            Tokenaze();
        }

        private void Tokenaze()
        {
            rtbLog.Clear();
            string script = "select * from dbo.Mytable where columnName = 0 delete from dbo.Mytable where columnName = 0";
            var sqlScript = ParseScript(script);
            PrintStatements(sqlScript);
        }

        public TSqlScript ParseScript(string script)
        {
            IList<ParseError> parseErrors;
            TSql100Parser tsqlParser = new TSql100Parser(true);
            TSqlFragment fragment;
            using (StringReader stringReader = new StringReader(script))
            {
                fragment = tsqlParser.Parse(stringReader, out parseErrors);
            }
            if (parseErrors.Count > 0)
            {
                var retMessage = string.Empty;
                foreach (var error in parseErrors)
                {
                    retMessage += error.Number + " - " + error.Message + " - position: " + error.Offset + ";\r\n";
                }
                rtbLog.Text += retMessage;
            }
            return (TSqlScript)fragment;
        }

        public void PrintStatements(TSqlScript tsqlScript)
        {
            if (tsqlScript != null)
            {
                foreach (TSqlBatch batch in tsqlScript.Batches)
                {
                    if (batch.Statements.Count == 0) continue;
                    foreach (TSqlStatement statement in batch.Statements)
                    {
                        rtbLog.Text += string.Format("{0}\r\n", statement.GetType().ToString());
                    }
                }
            }
        }
    }
}
Results:
Microsoft.SqlServer.TransactSql.ScriptDom.SelectStatement
Microsoft.SqlServer.TransactSql.ScriptDom.DeleteStatement
But when I make a mistake in the query, the list of statements is empty :(
string script = "select * from dbo.Mytable where ...
delete from dbo.Mytable where columnName = 0";
How can I get a list of statements if the query is wrong?
Thanks!

I know this is an old question, but I came across it while Googling, so I figured I'd answer it.
If your question is how to get a list of statements when the SQL can't be parsed, the short answer is that you can't - the parser has no idea what the list of statements would be. You'd have to look at the errors and figure it out.
If your question is what's wrong with the input code, it's that the select and delete statements are all on the same line. If you separate them with a semicolon or break them into two lines, it'll work and you'll get your two statements, as in the sketch below.
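A minimal sketch of both points (the semicolon-separated script is an assumed correction of the original input): with the separator the parser returns both statements, and when parsing fails, each ParseError carries Line and Column so you can at least locate where the script broke.
string script = "select * from dbo.Mytable where columnName = 0; delete from dbo.Mytable where columnName = 0";
var parser = new TSql100Parser(true);
IList<ParseError> parseErrors;
TSqlFragment fragment;
using (var reader = new StringReader(script))
{
    fragment = parser.Parse(reader, out parseErrors);
}
foreach (var error in parseErrors)
{
    // Line and Column map each error back to the source text
    Console.WriteLine("{0} at line {1}, column {2}: {3}",
        error.Number, error.Line, error.Column, error.Message);
}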

Related

Insert CSV using Apex Batch Class Salesforce for OpportunityLineItem

I want to add a button to my opportunity header record that is called Insert Products. This will send the opportunity ID to a visualforce page which will have a select file button and an insert button that will loop through the CSV and insert the records to the related opportunity.
This is for non technical users so using Data loader is not an option.
I got this working using a standard Apex class, however I hit a limit when I load over 1,000 records (which would happen regularly).
I need to convert this to a batch process, however I am not sure how to do this.
Anyone able to point me in the right direction? I understand a batch should have a start, execute and finish. However, I am not sure where I should split the CSV, and where to read and load?
I found this link, which I could not work out how to translate into my requirements: http://developer.financialforce.com/customizations/importing-large-csv-files-via-batch-apex/
Here is the code i have for the standard apex class which works.
public class importOppLinesController {

    public List<OpportunityLineItem> oLiObj {get;set;}

    public String recOppId {
        get;
        // *** setter is NOT being called ***
        set {
            recOppId = value;
            System.debug('value: ' + value);
        }
    }

    public Blob csvFileBody {get;set;}
    public string csvAsString {get;set;}
    public String[] csvFileLines {get;set;}
    public List<OpportunityLineItem> oppLine {get;set;}

    public importOppLinesController() {
        csvFileLines = new String[]{};
        oppLine = new List<OpportunityLineItem>();
    }

    public void importCSVFile() {
        PricebookEntry pbeId;
        String unitPrice = '';
        try {
            csvAsString = csvFileBody.toString();
            csvFileLines = csvAsString.split('\n');
            for (Integer i = 1; i < csvFileLines.size(); i++) {
                OpportunityLineItem oLiObj = new OpportunityLineItem();
                string[] csvRecordData = csvFileLines[i].split(',');
                String pbeCode = csvRecordData[0];
                pbeId = [SELECT Id FROM PricebookEntry WHERE ProductCode = :pbeCode AND Pricebook2Id = 'xxxx HardCodedValue xxxx'][0];
                oLiObj.PricebookEntryId = pbeId.Id;
                oLiObj.Quantity = Decimal.valueOf(csvRecordData[1]);
                unitPrice = String.valueOf(csvRecordData[2]);
                oLiObj.UnitPrice = Decimal.valueOf(unitPrice);
                oLiObj.OpportunityId = recOppId;
                insert (oLiObj);
            }
        }
        catch (Exception e)
        {
            ApexPages.Message errorMessage = new ApexPages.Message(ApexPages.severity.ERROR, e + ' - ' + unitPrice);
            ApexPages.addMessage(errorMessage);
        }
    }
}
The first problem I can sense is that the insert DML statement is inside the for loop. Can you put each new "oLiObj" into a List that is declared before the for loop starts, and then try inserting the list after the loop, as in the sketch below?
It should bring some more sanity into your code.
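A minimal Apex sketch of that bulkified pattern (field population elided; the SOQL query inside the loop would need the same treatment to stay within governor limits):
List<OpportunityLineItem> newLines = new List<OpportunityLineItem>();
for (Integer i = 1; i < csvFileLines.size(); i++) {
    String[] csvRecordData = csvFileLines[i].split(',');
    OpportunityLineItem oLi = new OpportunityLineItem();
    // ... populate oLi from csvRecordData as in the original code ...
    newLines.add(oLi);
}
insert newLines; // one DML statement for the whole file instead of one per row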

SqlDependency Data Flood when using COUNT_BIG() in query

FIXED: Code updated below now works.
Trying to set up a websocket for a management dashboard, where I need queries using count_big() fields and GROUP BY clauses. Standard recordset lists work great, but once I add the count_big() the websocket doesn't stop sending data. I have read this post about limitations, and count_big() appears OK to use. TIA
using Microsoft.Web.WebSockets;
using System;
using System.Collections.Generic;
using System.Configuration;
using System.Data;
using System.Data.SqlClient;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Web;
using System.Web.Http;

namespace DatabaseChangeNotification.Controllers
{
    public class DatabaseNotificationController : ApiController
    {
        public HttpResponseMessage Get()
        {
            HttpContext.Current.AcceptWebSocketRequest(new ChatWebSocketHandler());
            return Request.CreateResponse(HttpStatusCode.SwitchingProtocols);
        }

        class ChatWebSocketHandler : Microsoft.Web.WebSockets.WebSocketHandler
        {
            public string wsData = null;
            public SqlCommand gblCommand = null;

            public ChatWebSocketHandler()
            {
                SetupNotifier();
            }

            protected void SetupNotifier()
            {
                using (var connection = new SqlConnection(ConfigurationManager.ConnectionStrings["DefaultConnection"].ConnectionString))
                {
                    connection.Open();
                    // DO NOT USE any "*" in queries
                    // When using count the variable was converted to string. Got data flood
                    //
                    // Testing count_big data type
                    // GetString failed
                    //
                    using (SqlCommand command = new SqlCommand(@"select [address], count_big(*) as [CurrentTotal] from dbo.users where address = 'main st' group by address", connection))
                    {
                        command.Notification = null;
                        SqlDependency dependency = new SqlDependency(command);
                        dependency.OnChange += new OnChangeEventHandler(dependency_OnChange);
                        if (connection.State == ConnectionState.Closed)
                        {
                            connection.Open();
                        }
                        //SqlCommand gblCommand = command;
                        wsData = null;
                        using (SqlDataReader reader = command.ExecuteReader())
                        {
                            while (reader.Read())
                            {
                                /* MUST MATCH column count and column data type */
                                // wsData += reader.GetString(0) + " " + reader.GetString(1) + " " + reader.GetString(2);
                                /* THIS WORKS FOR GETTING NUMERIC VARIABLES */
                                wsData += reader.GetValue(0) + " " + int.Parse(reader.GetValue(1).ToString());
                                // wsData += reader.GetString(0) + "</br>"; // works but we get data flood and no numbers
                            }
                            // reader.Close();
                        }
                        _chatClients.Broadcast("data: " + wsData);
                    }
                }
            } // SetupNotifier

            private static WebSocketCollection _chatClients = new WebSocketCollection();

            public override void OnOpen()
            {
                _chatClients.Add(this);
            } // OnOpen

            public override void OnMessage(string msg)
            {
            } // OnMessage

            private void dependency_OnChange(object sender, SqlNotificationEventArgs e)
            {
                if (e.Type != SqlNotificationType.Change)
                {
                    _chatClients.Broadcast("Returning, not a change notification ");
                    return;
                }
                /*
                 * Must remove dependency. Only works once.
                 */
                SqlDependency dependency = sender as SqlDependency;
                dependency.OnChange -= dependency_OnChange;
                // reset for next message.
                SetupNotifier();
            } // dependency_OnChange
        } // ChatWebSocketHandler
    } // DatabaseNotificationController
}
NOTE: This was happening before the code was fixed.
The web socket returned an infinite listing:
data: main st 1
data: main st 1
data: main st 1
data: main st 1
data: main st 1
{ .....}
You must check the SqlNotificationEventArgs members. Not all notifications indicate an update. Some notifications (like the ones you're getting) indicate invalid conditions or an invalid query. You are getting a notification for an invalid query and resubscribing, just to be immediately notified again for the same reason. Ad nauseam.
Inspecting the notification would point toward the problem. In your case the problem is listed in the very first bullet point of the link you yourself posted:
table names must be qualified with two-part names
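A minimal sketch of that inspection (the logging is illustrative): check Info and Source before resubscribing, and stop when the subscription itself was rejected.
private void dependency_OnChange(object sender, SqlNotificationEventArgs e)
{
    // Type, Info and Source together say why the notification fired
    System.Diagnostics.Debug.WriteLine(
        string.Format("Type={0} Info={1} Source={2}", e.Type, e.Info, e.Source));
    if (e.Info == SqlNotificationInfo.Invalid)
    {
        // The statement didn't meet the query-notification requirements
        // (e.g. unqualified table names); resubscribing here just recreates the flood
        return;
    }
    // ... safe to detach the handler and call SetupNotifier() as before ...
}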

Prevent SQL injection when building query

I usually know how to prevent it using prepared statements, but now I have a method like this for building queries. For example, in Java:
private String buildQuery(String where) {
    String query = "SELECT id, name FROM someTable";
    if (where.length() > 0) {
        query = query + " WHERE " + where;
    }
    return query;
}
The 'where' string is something like 'variable = value'. How can I prevent injection here? I thought of passing the variable and value separately, creating a prepared statement from them, and then somehow returning that prepared statement as a string, but I'm not sure.
This is not specific to any one DB API.
TL;DR: Don't pass "SQL fragments" around.
Rather than passing complete clauses for a select statement, or (sub-)expressions to add into a select clause, pass the components, keeping the user data separate from the identifiers.
In this case do not pass "name = value"; pass them separately. Then validate that name is a valid column for the table, and generate a parameter for the value part.
Thus, pseudo-code (my Java is rusty):
function BuildCommand(string column, object value) {
    if (!IsValidColumn("theTable", column)) throw InvalidOperation(...)
    string sql = "Select column from theTable where " + column + " = @p0";
    SqlCommand cmd = new SqlCommand(sql);
    cmd.Parameters.Add("@p0", value);
    return cmd;
}
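For reference, a hedged Java sketch of the same idea (the whitelist contents and table name are illustrative assumptions): the identifier is checked against a fixed set, and the value is always bound as a parameter.
import java.sql.*;
import java.util.*;

private static final Set<String> ALLOWED_COLUMNS =
        new HashSet<>(Arrays.asList("id", "name"));

private PreparedStatement buildCommand(Connection con, String column, Object value)
        throws SQLException {
    // Identifiers cannot be bound as parameters, so whitelist them instead
    if (!ALLOWED_COLUMNS.contains(column)) {
        throw new IllegalArgumentException("Unknown column: " + column);
    }
    PreparedStatement ps = con.prepareStatement(
            "SELECT id, name FROM someTable WHERE " + column + " = ?");
    ps.setObject(1, value); // user data stays a bound parameter
    return ps;
}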
You can use a map to pass your values and build a PreparedStatement. Check the code below; it should be something similar to that logic.
public static PreparedStatement buildQuery(String where, Map<Integer, String> cond)
        throws SQLException {
    PreparedStatement stat = null;
    // con is assumed to be an existing Connection
    String query = "SELECT id, name FROM someTable WHERE " + where;
    try {
        stat = con.prepareStatement(query);
        for (Map.Entry<Integer, String> e : cond.entrySet()) {
            stat.setString(e.getKey(), e.getValue());
        }
    } catch (SQLException e) {
        // Handle ex
    } finally {
    }
    return stat;
}

public static void main(String[] a) throws SQLException {
    Map<Integer, String> cond = new HashMap<Integer, String>();
    cond.put(1, "val22");
    cond.put(2, "val2");
    buildQuery("col1 = ? and col2 = ?", cond);
}
My suggestion is, if you take an array of where-clause columns as the parameter, to rewrite the function as:
private String buildQuery(String[] where) {
    String query = "SELECT id, name FROM someTable";
    query = query + " WHERE ";
    for (int i = 0; i < where.length; i++) {
        if (i > 0) {
            query = query + " AND ";
        }
        query = query + where[i] + " = ?";
    }
    return query;
}
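A hypothetical usage of that builder (column names and values are made up), assuming an open Connection con; the values are bound afterwards, so user data never lands in the SQL string:
String sql = buildQuery(new String[] { "col1", "col2" }); // columns should still be whitelisted
try (PreparedStatement ps = con.prepareStatement(sql)) {
    ps.setString(1, "val1");
    ps.setString(2, "val2");
    try (ResultSet rs = ps.executeQuery()) {
        // ... read the results ...
    }
}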

Dapper calls sp_executesql when I have parameters, is there a way around that?

When I call
connection.Execute(sql);
Dapper executes and everything is fine. When I call
connection.Execute(sql, new { UserId = _userId });
it executes with sp_executesql.
The issue is when it uses sp_executesql it's in its own scope. If it creates a temporary table, it's not accessible to subsequent queries that use the same connection. I could get around it by using global temporary tables, but I don't want to risk having two processes interfere with each other.
Does anybody know a way around that?
Update: I have the same problem when I use SqlCommand objects without Dapper. I wrote a unit test that illustrates the problem I'm having. WorksWithParameters fails with System.Data.SqlClient.SqlException : Invalid object name '#TEMP_OBJECTLIST'.
[TestFixture]
public class DapperTest
{
    private const string TestObjectType = "S";
    private const string ConnectionString = "XXXXXXXXX";

    private static void CreateTempTableWithoutParameters(SqlConnection connection)
    {
        const string sql = "SELECT TOP 10 * INTO #TEMP_OBJECTLIST FROM sys.objects WHERE TYPE = 'S'";
        connection.Execute(sql);
    }

    private static void UseTempTableWithoutParameters(SqlConnection connection)
    {
        const int expectedCount = 10;
        const string sql = "SELECT COUNT(*) FROM #TEMP_OBJECTLIST WHERE TYPE = 'S'";
        var count = connection.Query<int>(sql).First();
        Assert.AreEqual(expectedCount, count);
    }

    private static void CreateTempTableWithParameters(SqlConnection connection)
    {
        const string sql = "SELECT TOP 10 * INTO #TEMP_OBJECTLIST FROM sys.objects WHERE TYPE = @OBJECT_TYPE";
        connection.Execute(sql, new { OBJECT_TYPE = TestObjectType });
    }

    private static void UseTempTableWithParameters(SqlConnection connection)
    {
        const int expectedCount = 10;
        const string sql = "SELECT COUNT(*) FROM #TEMP_OBJECTLIST WHERE TYPE = @OBJECT_TYPE";
        var param = new { OBJECT_TYPE = TestObjectType };
        var count = connection.Query<int>(sql, param).First();
        Assert.AreEqual(expectedCount, count);
    }

    [Test]
    public void WorksWithParameters()
    {
        using (var connection = new SqlConnection(ConnectionString))
        {
            connection.Open();
            CreateTempTableWithParameters(connection);
            UseTempTableWithParameters(connection);
        }
    }

    [Test]
    public void WorksWithoutParameters()
    {
        using (var connection = new SqlConnection(ConnectionString))
        {
            connection.Open();
            CreateTempTableWithoutParameters(connection);
            UseTempTableWithoutParameters(connection);
        }
    }
}
One way around the temp table scope problem is to create the temp table with one dummy column in the outer scope, then use ALTER TABLE statements to add all the desired columns, and then use it, as sketched below.
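A minimal Dapper sketch of that workaround (the added columns are assumptions): the parameterless statements run without sp_executesql, so the table is created in the connection's own scope and the later parameterized batches can see it.
// No parameters, so no sp_executesql: the temp table outlives this batch
conn.Execute("CREATE TABLE #TEMP_OBJECTLIST (dummy INT)");
conn.Execute("ALTER TABLE #TEMP_OBJECTLIST ADD name SYSNAME, object_id INT");
conn.Execute("ALTER TABLE #TEMP_OBJECTLIST DROP COLUMN dummy");
// Parameterized statements can now fill and query the existing table
conn.Execute(
    "INSERT INTO #TEMP_OBJECTLIST (name, object_id) SELECT TOP 10 name, object_id FROM sys.objects WHERE TYPE = @OBJECT_TYPE",
    new { OBJECT_TYPE = "S" });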
Additionally, "How to share data between procedures" by Erland Sommarskog may be useful to you or anyone else looking at different options for sharing data.
I ran into the same problem with Dapper, but it's not Dapper's fault. sp_executesql is called by ADO.NET and this switches the "scope" so temp tables become invisible.
As a workaround:
//no parameters, so it runs without sp_executesql
conn.Execute("CREATE TABLE #temp BLAHBLAH");
//do your thing
conn.Execute("INSERT INTO #temp BLAHBLAH", parameters);
//cleanup (no parameters again)
conn.Execute("DROP TABLE #temp");

How to update a PostgreSQL array column with Spring JdbcTemplate?

I'm using Spring JdbcTemplate, and I'm stuck at the point where I have a query that updates a column that is actually an array of int. The database is PostgreSQL 8.3.7.
This is the code I'm using:
public int setUsersArray(int idUser, int idDevice, Collection<Integer> ids) {
    int update = -666;
    int[] tipi = new int[3];
    tipi[0] = java.sql.Types.INTEGER;
    tipi[1] = java.sql.Types.INTEGER;
    tipi[2] = java.sql.Types.ARRAY;
    try {
        update = this.jdbcTemplate.update(setUsersArrayQuery, new Object[] {
                ids, idUser, idDevice }, tipi);
    } catch (Exception e) {
        e.printStackTrace();
    }
    return update;
}
The query is "update table_name set array_column = ? where id_user = ? and id_device = ?".
I get this exception:
org.springframework.dao.DataIntegrityViolationException: PreparedStatementCallback; SQL [update acotel_msp.users_mau set denied_sub_client = ? where id_users = ? and id_mau = ?]; The column index is out of range: 4, number of columns: 3.; nested exception is org.postgresql.util.PSQLException: The column index is out of range: 4, number of columns: 3.
Caused by: org.postgresql.util.PSQLException: The column index is out of range: 4, number of columns: 3.
I've looked into spring jdbc template docs but I can't find any help, I'll keep looking, anyway could someone point me to the right direction? Thanks!
EDIT:
Obviously the order was wrong, my fault...
I tried both your solutions. In the first case I had this:
org.springframework.jdbc.BadSqlGrammarException: PreparedStatementCallback; bad SQL grammar [update users set denied_sub_client = ? where id_users = ? and id_device = ?]; nested exception is org.postgresql.util.PSQLException: Cannot cast an instance of java.util.ArrayList to type Types.ARRAY
Trying the second solution I had this:
org.springframework.jdbc.BadSqlGrammarException: PreparedStatementCallback; bad SQL grammar [update users set denied_sub_client = ? where id_users = ? and id_device = ?]; nested exception is org.postgresql.util.PSQLException: Cannot cast an instance of [Ljava.lang.Object; to type Types.ARRAY
I suppose I need an instance of java.sql.Array, but how can I create it using JdbcTemplate?
After struggling with many attempts, we settled on a little helper, ArraySqlValue, that creates Spring SqlValue objects for Java array types.
Usage is like this:
jdbcTemplate.update(
        "UPDATE sometable SET arraycolumn = ?",
        ArraySqlValue.create(arrayValue));
The ArraySqlValue can also be used in MapSqlParameterSource for use with NamedParameterJdbcTemplate.
import static com.google.common.base.Preconditions.checkNotNull;

import java.sql.Array;
import java.sql.JDBCType;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.Locale;
import org.springframework.jdbc.core.StatementCreatorUtils;
import org.springframework.jdbc.support.SqlValue;

public class ArraySqlValue implements SqlValue {

    private final Object[] arr;
    private final String dbTypeName;

    public static ArraySqlValue create(final Object[] arr) {
        return new ArraySqlValue(arr, determineDbTypeName(arr));
    }

    public static ArraySqlValue create(final Object[] arr, final String dbTypeName) {
        return new ArraySqlValue(arr, dbTypeName);
    }

    private ArraySqlValue(final Object[] arr, final String dbTypeName) {
        this.arr = checkNotNull(arr);
        this.dbTypeName = checkNotNull(dbTypeName);
    }

    @Override
    public void setValue(final PreparedStatement ps, final int paramIndex) throws SQLException {
        final Array arrayValue = ps.getConnection().createArrayOf(dbTypeName, arr);
        ps.setArray(paramIndex, arrayValue);
    }

    @Override
    public void cleanup() {}

    private static String determineDbTypeName(final Object[] arr) {
        // use Spring Utils similar to normal JdbcTemplate inner workings
        final int sqlParameterType =
                StatementCreatorUtils.javaTypeToSqlParameterType(arr.getClass().getComponentType());
        final JDBCType jdbcTypeToUse = JDBCType.valueOf(sqlParameterType);
        // lowercasing typename for Postgres
        final String typeNameToUse = jdbcTypeToUse.getName().toLowerCase(Locale.US);
        return typeNameToUse;
    }
}
This code is provided in the public domain.
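For illustration, a hedged sketch of the MapSqlParameterSource usage mentioned above (table, column and parameter names are assumptions):
import org.springframework.jdbc.core.namedparam.MapSqlParameterSource;

MapSqlParameterSource params = new MapSqlParameterSource()
        .addValue("arr", ArraySqlValue.create(new Integer[] { 1, 2, 3 }));
namedParameterJdbcTemplate.update("UPDATE sometable SET arraycolumn = :arr", params);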
private static final String ARRAY_DATATYPE = "int4";
private static final String SQL_UPDATE = "UPDATE foo SET arr = ? WHERE d = ?";

final Integer[] existing = ...;
final DateTime dt = ...;

getJdbcTemplate().update(new PreparedStatementCreator() {
    @Override
    public PreparedStatement createPreparedStatement(final Connection con) throws SQLException {
        final PreparedStatement ret = con.prepareStatement(SQL_UPDATE);
        ret.setArray(1, con.createArrayOf(ARRAY_DATATYPE, existing));
        ret.setDate(2, new java.sql.Date(dt.getMillis()));
        return ret;
    }
});
This solution is kind of a workaround using a PostgreSQL built-in function, which definitely worked for me.
reference blog
1) Convert the string array to a comma separated string
If you are using Java 8, it's pretty easy. Other options are here.
String commaSeparatedString = String.join(",", stringArray); // Java 8 feature
2) Use the PostgreSQL built-in function string_to_array()
You can find other PostgreSQL array functions here.
// tableName ( name text, string_array_column_name text[] )
String query = "insert into tableName(name, string_array_column_name) values (?, string_to_array(?, ',') )";
int[] types = new int[] { Types.VARCHAR, Types.VARCHAR };
Object[] psParams = new Object[] { "Dhruvil Thaker", commaSeparatedString };
jdbcTemplate.update(query, psParams, types); // assuming you have a jdbcTemplate instance
The cleanest way I found so far is to first convert the Collection into an Integer[] and then use the Connection to convert that into an Array.
Integer[] idArray = ids.toArray(new Integer[0]);
Array idSqlArray = jdbcTemplate.execute(
        (Connection c) -> c.createArrayOf(JDBCType.INTEGER.getName(), idArray)
);
update = this.jdbcTemplate.update(setUsersArrayQuery, new Object[] {
        idSqlArray, idUser, idDevice });
This is based on information in the documentation: https://jdbc.postgresql.org/documentation/head/arrays.html
The argument types and the arguments do not match.
Try changing the argument type order:
int[] tipi = new int[3];
tipi[0] = java.sql.Types.ARRAY;
tipi[1] = java.sql.Types.INTEGER;
tipi[2] = java.sql.Types.INTEGER;
or use
update = this.jdbcTemplate.update(setUsersArrayQuery, new Object[] {
        ids.toArray(), idUser, idDevice });
and see if it works.
http://valgogtech.blogspot.com/2009/02/passing-arrays-to-postgresql-database.html explains how to create a java.sql.Array for PostgreSQL.
Basically, Array.getBaseTypeName should return int, and Array.toString should return the array content in "{1,2,3}" format.
After you create the array you can set it using preparedStatement.setArray(...) from a PreparedStatementCreator, e.g.:
jdbcTemplate.update(new PreparedStatementCreator() {
    public PreparedStatement createPreparedStatement(Connection connection) throws SQLException {
        java.sql.Array intArray = connection.createArrayOf("int", existing);
        PreparedStatement ps = connection.prepareStatement(SQL_UPDATE);
        ps.setArray(1, intArray);
        ps.setDate(2, new java.sql.Date(dt.getMillis()));
        return ps;
    }
});
or collect the values in a list and let JdbcTemplate bind them:
java.sql.Array intArray = connection.createArrayOf("int", existing);
List<Object> values = new ArrayList<Object>();
values.add(intArray);
values.add(dt);
getJdbcTemplate().update(SQL_UPDATE, values.toArray());
Good Luck ..
