Multiple Selects In Query Not Giving All Result Sets - sql-server
In my code there is a loop that reads the results of a SQL query using SqlCommand.
However, some of the queries I need to run contain multiple SELECT statements. For example, the entire statement might look like this:
Dim query As String = "
Select * from people
Select * from places
Select * from items
Select * from foods"
cmd = New SqlCommand(query, connect)
cmd.Connection.Open()
reader = cmd.ExecuteReader()
While reader.HasRows()
    'various logic
    While reader.Read()
        'Do logic here
    End While
End While
When my query is run I get the results for the first two SELECTs, but since the third one has no results it kicks the application out of the loop and I never get the results of the fourth SELECT. I need the results of the fourth SELECT as well.
Edit: UNION will not work in this case because I need to be able to differentiate between the result sets in my logic.
Use SqlDataReader.NextResult() to advance to the next result set; it returns false once there are no more, so an empty result set no longer ends the loop early:

static void Main(string[] args)
{
    cmd = new SqlCommand("zp_multiple_results", connect);
    cmd.Connection.Open();
    reader = cmd.ExecuteReader();
    do
    {
        if (reader.HasRows)
        {
            while (reader.Read())
            {
                Console.WriteLine(reader[0].ToString());
            }
        }
    } while (reader.NextResult()); // advance to the next result set
    cmd.Connection.Close();
    cmd.Connection.Dispose();
    cmd.Dispose();
    Console.ReadLine();
}
Related
Unexpected end of JSON input: asp.net core
When the query select * from Permissions where AppId='yZVwUoxKQCu' FOR JSON PATH, INCLUDE_NULL_VALUES is executed directly in SQL Server, it returns the full data with the array wrapper and no "unexpected end" error. When the same query is run from C# through the method below, the result is cut off and parsing it fails with "unexpected end of JSON input". The SQL is executed like this:

var sql = $"select * from Permissions where AppId='{AppId}' FOR JSON PATH,INCLUDE_NULL_VALUES";
var res = Connection.ExecuteScalarCommand(sql);

public static String ExecuteScalarCommand(string sql)
{
    string CS = DbConnectionString;
    SqlConnection con = new SqlConnection(CS);
    string val = "";
    try
    {
        con.Open();
        SqlCommand cmd = new SqlCommand(sql, con);
        val = cmd.ExecuteScalar().ToString();
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message.ToString());
    }
    finally
    {
        con.Close();
    }
    return val;
}

The output from SQL Server directly is the expected, complete array (abridged here; it runs from Id 49 through Id 48 and is properly terminated):

[{"Id":49,"IsCreatable":0,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":29,"Title":"Season","UserId":60,"PermissionTitleId":1,"AppId":"yZVwUoxKQCu"}, ... ,{"Id":48,"IsCreatable":0,"IsViewable":0,"IsDeletable":1,"IsUpdatable":0,"RoleId":28,"Title":"Roles","UserId":60,"PermissionTitleId":7,"AppId":"yZVwUoxKQCu"}]

But the output from ASP.NET Core is the same array cut off mid-object, ending like this:

[{"Id":49,"IsCreatable":0,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":29,"Title":"Season","UserId":60,"PermissionTitleId":1,"AppId":"yZVwUoxKQCu"}, ... ,{"Id":61,"IsCreatable":1,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":30,"Title":"User","UserId":60,"PermissionT
ExecuteScalar is the problem: it returns "the first column of the first row in the result set, or a null reference (Nothing in Visual Basic) if the result set is empty." With FOR JSON, SQL Server splits long JSON output across multiple rows in chunks of roughly 2,033 characters, so ExecuteScalar returns only the first chunk. To overcome that you will need to use a different method, such as ExecuteReader, and concatenate the rows.
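A minimal sketch of that fix, assuming the same DbConnectionString field as the question (the method name ExecuteJsonCommand is made up for illustration):

```csharp
public static string ExecuteJsonCommand(string sql)
{
    var json = new StringBuilder();
    using (var con = new SqlConnection(DbConnectionString))
    using (var cmd = new SqlCommand(sql, con))
    {
        con.Open();
        using (var reader = cmd.ExecuteReader())
        {
            // FOR JSON streams long output split across multiple rows;
            // append every row to rebuild the complete document.
            while (reader.Read())
            {
                json.Append(reader.GetString(0));
            }
        }
    }
    return json.ToString();
}
```

Calling this with the same FOR JSON query should return the full, well-formed array instead of the first 2,033-character chunk.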
Best solution for multiple insert/update statements
I'm a beginner struggling to understand C# and Npgsql. The examples show:

// Insert some data
using (var cmd = new NpgsqlCommand())
{
    cmd.Connection = conn;
    cmd.CommandText = "INSERT INTO data (some_field) VALUES (@p)";
    cmd.Parameters.AddWithValue("p", "Hello world");
    cmd.ExecuteNonQuery();
}

The syntax for batching more than one INSERT or UPDATE statement is clear so far: cmd.CommandText = "INSERT INTO data (some_field) VALUES (@p); INSERT INTO data1 ...; INSERT INTO data2 ..." and so on. But what is the right way to handle one statement inside a loop? This does not work:

// Insert some data
using (var cmd = new NpgsqlCommand())
{
    foreach (var s in SomeStringCollectionOrWhatever)
    {
        cmd.Connection = conn;
        cmd.CommandText = "INSERT INTO data (some_field) VALUES (@p)";
        cmd.Parameters.AddWithValue("p", s);
        cmd.ExecuteNonQuery();
    }
}

It seems the parameter values are "concatenated" or remembered, and I cannot see any way to "clear" the existing cmd object. My second idea was to wrap the whole using block inside the loop, but then every cycle creates a new object, which seems ugly to me. So what is the best solution for my problem?
To insert lots of rows efficiently, take a look at Npgsql's bulk copy feature. Its API is more suitable (and more efficient) for inserting large numbers of rows than concatenating INSERT statements into a batch as you're trying to do.
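As a sketch of that API (assuming the same data table and conn connection as above, and Npgsql 4+, where the importer exposes Complete()):

```csharp
using (var writer = conn.BeginBinaryImport(
    "COPY data (some_field) FROM STDIN (FORMAT BINARY)"))
{
    foreach (var s in SomeStringCollectionOrWhatever)
    {
        writer.StartRow();                  // begin a new row
        writer.Write(s, NpgsqlDbType.Text); // write the single column
    }
    writer.Complete(); // commits the import; disposing without it rolls back
}
```

This pushes all rows through a single COPY operation instead of issuing one INSERT round trip per value.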
If you want to rerun the same SQL with changing parameter values, you can do the following:

using (var cmd = new NpgsqlCommand("INSERT INTO data (some_field) VALUES (@p)", conn))
{
    var p = new NpgsqlParameter("p", DbType.String); // Adjust DbType according to type
    cmd.Parameters.Add(p);
    cmd.Prepare(); // This is optional but will optimize the statement for repeated use
    foreach (var s in SomeStringCollectionOrWhatever)
    {
        p.Value = s;
        cmd.ExecuteNonQuery();
    }
}
If you need lots of rows and performance is key, I would recommend Npgsql's bulk copy capability, as @Shay mentioned. But if you are looking for a quick way to do this without bulk copy, I would recommend Dapper. Consider the example below. Let's say you have a class called Event and a list of events to add:

List<Event> eventsToInsert = new List<Event>
{
    new Event() { EventId = 1, EventName = "Bday1" },
    new Event() { EventId = 2, EventName = "Bday2" },
    new Event() { EventId = 3, EventName = "Bday3" }
};

The snippet that adds the list to the DB is shown below:

var sqlInsert = "Insert into events( eventid, eventname ) values (@EventId, @EventName)";
using (IDbConnection conn = new NpgsqlConnection(cs))
{
    conn.Open();
    // Execute is an extension method supplied by Dapper.
    // It adds all the entries in the eventsToInsert list, matching values by property name.
    // The only caveat is that the POCO's property names must match the placeholder names in the SQL statement.
    conn.Execute(sqlInsert, eventsToInsert);

    // If we want to retrieve the data back into a list:
    List<Event> eventsAdded;
    // This Dapper extension returns an IEnumerable, so cast it to a List.
    eventsAdded = conn.Query<Event>("Select * from events").ToList();
    foreach (var row in eventsAdded)
    {
        Console.WriteLine($"{row.EventId} {row.EventName} was added");
    }
}

HTH
Multi-threading NpgsqlConnections and readers producing duplicate results. C#
I have thousands of queries I need to run against thousands of different schemas across 10 databases. I am trying to thread these queries and write the results into a BlockingCollection, while another thread reads from that collection and writes to disk, since the result sets are too large to hold in memory. Here is the problem area in my code:

public class Node
{
    public string ConnectionString;
    public string Query;

    public Node(string databaseDetails, string query)
    {
        // Cannot put in actual logic, but this part is fine
        ConnectionString = {logic for connection string}
        Query = "set search_path to {schema from databaseDetails};" + query;
    }
}

public void runQuery(string query, BlockingCollection<Dictionary<string, object>> producer)
{
    List<Node> nodes = getNodes(query);
    Parallel.ForEach(nodes, node =>
    {
        NpgsqlConnection conn = new NpgsqlConnection(node.ConnectionString);
        conn.Open();
        NpgsqlCommand npgQuery = new NpgsqlCommand(node.Query, conn);
        NpgsqlDataReader reader = npgQuery.ExecuteReader();
        while (reader.Read())
        {
            Dictionary<string, object> row = new Dictionary<string, object>();
            for (int i = 0; i < reader.FieldCount; i++)
            {
                row[reader.GetName(i)] = reader.GetValue(i);
            }
            producer.Add(row);
        }
        conn.Close();
    });
    producer.CompleteAdding();
}

This code runs and retrieves all of the results, but it also duplicates a lot of them, so the blocking collection ends up with 5-10 times more records than it should. Any help would be greatly appreciated.
So I was just being an idiot: I was comparing my generated result set against the UNION of all the queries I was running, not the UNION ALL, so my "true" result set had no duplicates only because the UNION was removing them. :/
my combo box is duplicating the strings
Help, my combobox just keeps adding items. I tried using removeAllItems, but after that I can't put anything into the first combobox.

public class Function {
    public void combofillsect(JComboBox section, String year) {
        Connection conn = null;
        PreparedStatement pst = null;
        ResultSet rs = null;
        String query;
        try {
            query = "Select Section from asd where Year=?";
            Class.forName("com.mysql.jdbc.Driver");
            conn = DriverManager.getConnection("jdbc:mysql://localhost:3306/test", "root", "");
            pst = conn.prepareStatement(query);
            pst.setString(1, year);
            rs = pst.executeQuery();
            while (rs.next()) {
                section.addItem(rs.getString("Section"));
            }
        } catch (Exception e) {
            JOptionPane.showMessageDialog(null, e);
            section.addItem(e.toString());
        }
    }
}

Function funct = new Function();
{
    funct.combofillsect(jComboBox1, String.valueOf(jComboBox2.getSelectedItem()));
}

Why can't I post an image?
Are you programming in C#? If so, you can use Clear, like this: yourComboBox.Items.Clear(), to delete all the current items. I don't know if that will solve your problem, but your technique for getting the data from your database seems odd to me. If you used a DataSet, you could call dataset.Tables(0).Rows.Count() to get the number of entries, then set your loop's exit condition to counter < dataset.Tables(0).Rows.Count() with counter++ at the end of the loop (maybe that's why you say your combobox won't stop filling, but I don't know what the next() function does). I don't know the C# code, but here is my VB.NET function:

Public Function getAll() As DataSet
    ConnectionDB()
    Dim cmd As SqlClient.SqlCommand
    cmd = New SqlClient.SqlCommand("SELECT * FROM table", Connect) 'Connect is a System.Data.SqlClient.SqlConnection, or my connection string
    Dim adapter As New Data.SqlClient.SqlDataAdapter
    Dim dataset As New DataSet
    adapter.SelectCommand = cmd
    adapter.Fill(dataset)
    adapter.Dispose()
    cmd.Dispose()
    Connect.Close()
    Return dataset
End Function

I don't know if I helped; I didn't really understand what your problem was, and you didn't even mention which language you use. Good luck.

Edit: if you can't post images, that's because you don't yet have 10 reputation points; you can read about reputation here: https://stackoverflow.com/help/whats-reputation. You can still post a link to a picture, which will allow users to click on it.
SQL server refusing to cache plan for a fixed length parameterized IN clause
Using .NET 4.0, I have defined the following SqlCommand. When I execute it multiple times consecutively without making any changes, SQL Server refuses to cache the query plan.

string[] colors = new string[] { "red", "blue", "yellow", "green" };
string cmdText = "SELECT * FROM ColoredProducts WHERE Color IN ({0})";
string[] paramNames = colors.Select((s, i) => "@color" + i.ToString()).ToArray();
string inClause = string.Join(",", paramNames);
using (SqlCommand cmd = new SqlCommand(string.Format(cmdText, inClause)))
{
    for (int i = 0; i < paramNames.Length; i++)
    {
        cmd.Parameters.AddWithValue(paramNames[i], colors[i]);
    }
    // Execute query here
}

I know it's refusing to cache the plan because the following query ran in a fraction of the time after consecutive runs:

string[] colors = new string[] { "red", "blue", "yellow", "green" };
string cmdText = "SELECT * FROM ColoredProducts WHERE Color IN ({0})";
string inClause = string.Join(",", colors);
using (SqlCommand cmd = new SqlCommand(string.Format(cmdText, inClause)))
{
    // Execute query here
}

In my actual test case the parameter list is fixed at exactly 2000 entries. The scenario I am trying to optimize is selecting a specific set of 2000 records from a very large table. I would like the query to be as fast as possible, so I really want the plan cached.

Edit: The question is, why wouldn't this plan get cached? And yes, I have confirmed that the query is not in the cache using sys.dm_exec_cached_plans and sys.dm_exec_sql_text.
Here is an idea using a table-valued parameter. Please let us know if this approach performs better than your huge string. There are other ideas too, but this is the closest to treating your set of colors as an array.

In SQL Server:

CREATE TYPE dbo.Colors AS TABLE
(
    Color VARCHAR(32) PRIMARY KEY -- be precise here! Match ColoredProducts.Color
);
GO

CREATE PROCEDURE dbo.MatchColors
    @colors AS dbo.Colors READONLY
AS
BEGIN
    SET NOCOUNT ON;
    SELECT cp.* -- use actual column names please!
    FROM dbo.ColoredProducts AS cp -- always use schema prefix
    INNER JOIN @colors AS c ON cp.Color = c.Color;
END
GO

Now in C#:

DataTable tvp = new DataTable();
tvp.Columns.Add(new DataColumn("Color"));
tvp.Rows.Add("red");
tvp.Rows.Add("blue");
tvp.Rows.Add("yellow");
tvp.Rows.Add("green");
// ...
using (connectionObject)
{
    SqlCommand cmd = new SqlCommand("dbo.MatchColors", connectionObject);
    cmd.CommandType = CommandType.StoredProcedure;
    SqlParameter tvparam = cmd.Parameters.AddWithValue("@colors", tvp);
    tvparam.SqlDbType = SqlDbType.Structured;
    // execute query here
}

I can almost guarantee this will perform better than an IN list with a large number of parameters, regardless of the length of the actual string in your C# code.