I have a SQL Server 2008 database. This database has a stored procedure that updates several records. The ids of these records are passed in as a comma-delimited string parameter, and the property values associated with each of these ids are passed in via two other comma-delimited strings. It is assumed that the token counts and the order of the values in all three strings line up. For instance, the three strings may look like this:
Param1='1,2,3,4,5'
Param2='Bill,Jill,Phil,Will,Jack'
Param3='Smith,Li,Wong,Jos,Dee'
My challenge is that I'm not sure what the best way to actually go about parsing these three CSVs and updating the corresponding records is. I have access to a procedure named ConvertCSVtoTable, which converts a CSV to a temp table of records. So Param1 would return
1
2
3
4
5
after the procedure was called. I thought about a cursor, but that seems to get really messy.
Can someone tell me/show me what the best way to address this problem is?
I'd give some thought to reworking the inputs to your procedure. Since you're running SQL 2008, my first choice would be to use a table-valued parameter. My second choice would be to pass the parameters as XML. Your current method, as you already know, is a real headache and is more error-prone.
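As an illustration, here is a minimal sketch of the table-valued-parameter approach; the type, procedure, table, and column names are made up for the example and would need to match your actual schema:

CREATE TYPE dbo.PersonUpdate AS TABLE
(
    Id        int PRIMARY KEY,
    FirstName nvarchar(50),
    LastName  nvarchar(50)
);
GO

CREATE PROCEDURE dbo.UpdatePersons
    @updates dbo.PersonUpdate READONLY
AS
BEGIN
    -- One set-based update instead of parsing three parallel CSV strings
    UPDATE p
    SET p.FirstName = u.FirstName,
        p.LastName  = u.LastName
    FROM dbo.Persons AS p
    JOIN @updates AS u ON u.Id = p.Id;
END

From ADO.NET you can fill a DataTable with those three columns and pass it as a single structured parameter, so neither side has to do any string parsing.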
You can bulk load the values into a temp table, then PIVOT them and insert into the proper one.
I have an SSRS report with 20 different datasets, each with some calculated columns.
I want to take a few fields from all of the datasets, including some of the calculated columns, and insert them into a SQL table.
I want to do this for each month so that I can see trends over a period. Is there any way to do that without editing the datasets?
Can I reference the fields I need by referring to Textbox4 and insert them into a SQL table? What is the easiest way to do this without touching the datasets?
There is most likely a far better solution than using SSRS to update a SQL database. I am not proposing this as the best solution, but rather as a way to achieve what was asked.
You could create a dataset that runs a stored procedure to which you pass the data as parameters. The stored procedure would do the insert into your chosen table, and you can pass in the parameters from your original dataset however you see fit. You could even set up a second report with the stored-procedure dataset and call it on demand by putting an action event on an item. (I had a subreport embedded in a column of a tablix, configured so it would only update with values from that row, for instance.)
To clarify:
Create a subreport that accepts the data you want to insert as a parameter for each column
Instead of adding a normal dataset, have it call a stored procedure that does the insert you require (a sketch follows this list)
Add the subreport to your main report to be called once it's run, and configure the required parameters to be passed through.
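As a bare-bones sketch (the table and parameter names here are hypothetical), the stored procedure behind that subreport's dataset could be as simple as:

CREATE PROCEDURE dbo.InsertReportSnapshot
    @ReportMonth datetime,
    @Field1      nvarchar(100),
    @Field2      decimal(18, 2)
AS
BEGIN
    -- Each subreport execution inserts one row of report values
    INSERT INTO dbo.ReportTrends (ReportMonth, Field1, Field2)
    VALUES (@ReportMonth, @Field1, @Field2);
END

Each report field you want to capture becomes one parameter, mapped from the subreport's parameters.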
There will be better, more efficient, cleaner ways to do this, but I found the above to work for my purposes since I was limited by time and resources. But I would still recommend you seek other solutions if possible.
So, similar to "SQL Server compare results of two queries that should be identical", I need to compare the output of two stored procedures to ensure the new version is generating equivalent output to the old version. The tricky part is that my SP outputs six tables of differing widths.
I started writing a hybrid version of them that would compare each of the tables individually, but it's a pretty complex SP, so I was hoping there was an easier way.
I tried using EXCEPT as in the linked question, but it looks like that will only compare one table to one other table.
Easy option 1: Output the stored procedure results to a text file (one per procedure version) and use a diff tool/editor to make sure they are the same.
Easy option 2: Write the stored procedure results to a table/temp table (one per returned table per procedure) and write SQL to compare the results. Count the rows in each result table, then count the UNION (not UNION ALL) of both tables; if all three counts match, the tables hold the same rows. Repeat for each result table.
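A minimal sketch of that check for one pair of captured result tables, assuming they were written to temp tables named #old_result1 and #new_result1:

DECLARE @OldCount int, @NewCount int, @UnionCount int;

SELECT @OldCount = COUNT(*) FROM #old_result1;
SELECT @NewCount = COUNT(*) FROM #new_result1;

-- UNION (not UNION ALL) removes duplicates, so if both tables hold the
-- same rows the union is no bigger than either table alone. Caveat:
-- duplicate rows within a single table also collapse, so this check
-- assumes each result table has no duplicate rows.
SELECT @UnionCount = COUNT(*)
FROM (SELECT * FROM #old_result1
      UNION
      SELECT * FROM #new_result1) AS u;

IF @OldCount = @NewCount AND @NewCount = @UnionCount
    PRINT 'Result set 1 matches'
ELSE
    PRINT 'Result set 1 differs'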
You can capture multiple result sets in .NET (C# or VB) quite easily. You can create a DataAdapter and DataSet, and use the DataAdapter.Fill() method to populate the DataSet. Each result set will be stored as a DataTable within that DataSet. Then you just need to loop through the DataTables collection in each DataSet and compare them. You can find more info on this MSDN page: Populating a DataSet from a DataAdapter
This can be done in SQLCLR if you want to run it as a stored procedure or user-defined function, OR it can be a stand-alone console application. Running it as a SQLCLR stored procedure is quite convenient, but given that you will be storing all results for all six result sets, for both stored procedures that you are testing, it might require too much memory. In that case, the console app is the way to go.
The only thing I can think of is to add an additional parameter to both of your (new/old) stored procedures to control which result set is returned, like:
EXEC usp_proc @var1, @var2, @ResultSet = 1
The above execution should return the first result set; if you pass @ResultSet = 2 it should return the second result set, and so on.
Do this with both stored procedures and then compare the result sets pair by pair (using EXCEPT will do the trick).
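A rough sketch of one such comparison (the procedure names and the @var1/@var2 arguments are placeholders, and the temp tables must be created up front to match the shape of result set 1):

-- Capture the first result set from each version
INSERT INTO #old_rs1 EXEC dbo.usp_proc_old @var1, @var2, @ResultSet = 1;
INSERT INTO #new_rs1 EXEC dbo.usp_proc_new @var1, @var2, @ResultSet = 1;

-- Two empty results here mean result set 1 is identical in both versions
SELECT * FROM #old_rs1 EXCEPT SELECT * FROM #new_rs1;
SELECT * FROM #new_rs1 EXCEPT SELECT * FROM #old_rs1;

Repeat with @ResultSet = 2 through 6 for the remaining result sets.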
I have a table with more than 50 columns, and it is already normalized. Most of the columns have the data type nvarchar.
Now, I need to write a stored procedure that inserts a record in the same table.
I am not sure whether:
1) I should write an SP with 50 parameters, or
2) I should write an SP that takes XML as a parameter and extracts the record to be inserted into the table, as mentioned here:
How to insert xml data into table in sql server 2005
Please advise in terms of performance. Thanks.
According to this link: http://msdn.microsoft.com/en-us/library/ms143432.aspx, you can have up to 2100 parameters, so that won't be a problem.
In terms of performance, it'll take extra processing to extract each node from the XML, but it certainly gives more flexibility (and readability) if you'd like to produce dynamic inserts (even if that could be done with standard parameters, with a bit more work, I guess).
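For what it's worth, here is a minimal sketch of option 2, trimmed to two columns for brevity (the table, column, and procedure names are placeholders):

CREATE PROCEDURE dbo.InsertPersonFromXml
    @data xml
AS
BEGIN
    -- Shred one <row> element into one table row
    INSERT INTO dbo.Person (FirstName, LastName)
    SELECT r.value('(FirstName)[1]', 'nvarchar(50)'),
           r.value('(LastName)[1]',  'nvarchar(50)')
    FROM @data.nodes('/row') AS x(r);
END
GO

-- Example call:
EXEC dbo.InsertPersonFromXml
    @data = '<row><FirstName>Bill</FirstName><LastName>Smith</LastName></row>';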
I need to create a stored procedure which receives a parameter (named @codes).
This is a string which contains a list of codes separated by semicolons.
I need to look inside a table and return all rows whose code (in the column EANcodes) appears in the @codes parameter.
Can anyone help me get started. My knowledge of stored procedures is very limited.
Thanks in advance.
Ideally, I'd prefer to see the parameter passed in another way: either using a table-valued parameter (assuming SQL 2008) or XML, which can be easily shredded into a table.
Alternatively, use a SQL split function (one example is here) to parse the string into a temp table, then join against that table in your select query.
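For example, assuming a split function named dbo.SplitString that takes the string and a delimiter and returns a single-column table named Value (the function and the Products table are placeholders; EANcodes is the column from the question):

CREATE PROCEDURE dbo.GetProductsByCodes
    @codes varchar(max)
AS
BEGIN
    -- Join against the parsed codes instead of searching the raw CSV string
    SELECT p.*
    FROM dbo.Products AS p
    JOIN dbo.SplitString(@codes, ';') AS s
        ON s.Value = p.EANcodes;
END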
Stored procedures aren't really meant to handle a list of strings as a parameter. You'd be better off splitting it up in your app code and then calling the stored procedure once per value.
However, if you feel the need to do it this way, you could loop through the string, using CHARINDEX to find the next semicolon and SUBSTRING to extract the next code. Then you could use a CTE for the matching rows at each iteration and, when the loop is done, simply return the result. This is pretty hacky, but I can't think of any other way to do this.
(Those are the T-SQL string functions)
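Here is a rough sketch of that loop, collecting the parsed codes into a table variable instead of a CTE for simplicity (the Products table is a placeholder; EANcodes is the column named in the question):

CREATE PROCEDURE dbo.GetProductsByCodeList
    @codes varchar(max)
AS
BEGIN
    DECLARE @pos int;
    DECLARE @parsed TABLE (Code varchar(50));

    -- Peel one token off the front of the string per iteration
    WHILE LEN(@codes) > 0
    BEGIN
        SET @pos = CHARINDEX(';', @codes);
        IF @pos = 0
        BEGIN
            INSERT INTO @parsed VALUES (@codes);  -- last (or only) token
            SET @codes = '';
        END
        ELSE
        BEGIN
            INSERT INTO @parsed VALUES (SUBSTRING(@codes, 1, @pos - 1));
            SET @codes = SUBSTRING(@codes, @pos + 1, LEN(@codes));
        END
    END

    SELECT p.*
    FROM dbo.Products AS p
    JOIN @parsed AS c ON c.Code = p.EANcodes;
END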
For info on the string manipulation functions (in T-SQL): http://msdn.microsoft.com/en-us/library/ms186323.aspx
And here are similar functions in MySQL: http://dev.mysql.com/doc/refman/5.1/en/string-functions.html
I am working on SQL Server 2005.
Here I need to pass a two-dimensional array (e.g. Emp No, Emp Name) containing multiple records from my ASP.NET application to a stored procedure.
Would you please let me know if there is a way to do it?
Many Thanks,
Regards.
Anusha.
There are 6 different approaches outlined here
I personally like the XML approach outlined by Nestor, as it can handle an array of any dimensions, although others may prefer fn_split, passing a delimited string and unpacking it into a table variable to join against for simple cases.
All of this is solved in SQL 2008, where a set-based language finally allows a table variable as an input parameter!
This sounds roughly equivalent to sending a table with two columns:
table Employees
.No
.Name
In SQL 2008 you can use a table type parameter to pass this sort of data, but I don't believe this is available in SQL 2005.
If you're using .NET, you can use the SqlBulkCopy class to send a batch of records.
If none of these work, you may have to execute a single stored procedure call per row (per employee in this case). This assumes the data needs to wind up in a table, as opposed to being parsed or reformatted on the SQL server. If the latter is the case, you should really perform such operations on the client instead of the database server.
If you are using 2005, you could create a temporary table, have proc 1 populate it, and have proc 2 select from it. In 2008 you can pass table-valued parameters around. Some people may say you could pass in XML too.
In 2005 I don't think there is a way of doing this elegantly. In the past I have done it by passing in lists of values as text strings, e.g. "1,2,3". We have a user-defined function that splits lists and returns a table. Just be careful that the list doesn't exceed your parameter's length.
Pass the array in XML format, like:
<array>
<row>
<cell>1</cell>
<cell>2</cell>
<cell>3</cell>
</row>
<row>
<cell>4</cell>
<cell>5</cell>
<cell>6</cell>
</row>
</array>
through an argument of type xml. SQL Server 2005 has XML manipulation operators that you can use inside the stored procedure to parse the array.
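A minimal sketch of that shredding (the procedure name is made up; for the question's two columns, the first two <cell> values map to Emp No and Emp Name):

CREATE PROCEDURE dbo.LoadEmployees
    @arr xml
AS
BEGIN
    -- One output row per <row> element
    SELECT r.value('cell[1]', 'int')           AS EmpNo,
           r.value('cell[2]', 'nvarchar(100)') AS EmpName
    FROM @arr.nodes('/array/row') AS x(r);
END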