QSqlQuery does not work - database

I am trying to add a row to my database, but it does not work.
My database is hosted by alwaysdata. I use Qt Creator to develop my program (which prints no error) and MySQL for viewing the database.
Via MySQL Query Browser I entered:
INSERT INTO `mmr` VALUES (NULL,'musictest','albumtest','timetest','datetest');
and it works,
but in my program this code does not work:
void MainWindow::b_clicked()
{
    QSqlDatabase db = QSqlDatabase::addDatabase("QSQLITE");
    db.setHostName("mysql1.alwaysdata.com");
    db.setDatabaseName("mymusicrecognition_mmr");
    db.setUserName("xxx");
    db.setPassword("yyyy");
    if (!db.open())
    {
        QMessageBox::information(this, "Message", "Not connected...");
    }
    else
    {
        QSqlQuery query;
        query.exec(QString("INSERT INTO `mmr` VALUES (NULL,'%1','%2','%3','%4')")
                   .arg("musictest").arg("albumtest").arg("timetest").arg("datetest"));
        QMessageBox::information(this, "Message", "Connected !!!");
    }
}
I do get the "Connected !!!" message box.
This must be a beginner's mistake.

Instead of constructing a QString from args, try a prepared QSqlQuery and bind the values.
http://qt-project.org/doc/qt-5/qsqlquery.html#approaches-to-binding-values
QString doesn't know anything about SQL escaping. Also, you are using the SQLite driver ("QSQLITE"), so instead of connecting to your host name, Qt tried to open a local SQLite file; for a MySQL server you want the "QMYSQL" driver.
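A minimal sketch of that combination, assuming the QMYSQL driver plugin is installed and the table columns match the original INSERT:
QSqlDatabase db = QSqlDatabase::addDatabase("QMYSQL"); // MySQL driver, not QSQLITE
db.setHostName("mysql1.alwaysdata.com");
db.setDatabaseName("mymusicrecognition_mmr");
db.setUserName("xxx");
db.setPassword("yyyy");

if (db.open()) {
    QSqlQuery query(db);
    query.prepare("INSERT INTO `mmr` VALUES (NULL, ?, ?, ?, ?)");
    query.addBindValue("musictest");
    query.addBindValue("albumtest");
    query.addBindValue("timetest");
    query.addBindValue("datetest");
    if (!query.exec())
        qDebug() << query.lastError().text(); // the driver handles quoting/escaping of bound values
}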

From the code you have here it looks like you're not passing the database to QSqlQuery.
Instead of
QSqlQuery query;
try this
QSqlQuery query(db);
I'd also recommend that you take this a step further by creating a separate variable for your sql statement, and then passing that in as an argument along with the database. Like the following:
QString insertSql = QString("INSERT INTO `mmr` VALUES (NULL,'%1','%2','%3','%4')")
.arg("musictest").arg("albumtest").arg("timetest").arg("datetest");
QSqlQuery query(insertSql, db);
This will also allow you to verify that you have a properly constructed sql statement in the debugger before you try to run it.
After this, you can call exec() and store the result in a bool (as others on here have mentioned).
bool checkSuccess = query.exec();
Your code above will run without failing. exec() will just return false since it didn't have anything to run the sql statement against.
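If exec() returns false, the query's error object tells you why; for example:
bool checkSuccess = query.exec();
if (!checkSuccess)
    qDebug() << query.lastError().text(); // e.g. reports the missing connection or a SQL syntax problem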

Related

System.NotSupportedException: Commands with multiple queries cannot have out parameters

I ran into an issue using a data reader around a sproc with multiple ref cursors coming out: I am getting a NotSupportedException. I can see where it is thrown in the Npgsql source code, but I am not sure I agree with throwing that exception. The code we have written works with Oracle (both the fully managed and managed flavors) and SQL Server. Any help is appreciated in keeping this consistent for an API across those key flavors of DBMS.
sproc body
CREATE OR REPLACE FUNCTION public.getmultipleresultsets (
    v_organizationid integer)
RETURNS Setof refcursor
LANGUAGE 'plpgsql'
AS $BODY$
declare
    cv_1 refcursor;
    cv_2 refcursor;
BEGIN
    open cv_1 for
        SELECT a.errorCategoryId, a.name, a.bitFlag
        FROM ErrorCategories a
        ORDER BY name;
    RETURN next cv_1;
    open cv_2 for
        SELECT *
        FROM StgNetworkStats;
    RETURN next cv_2;
END;
$BODY$;
Key reader code that wraps the PostgreSQL call (EntLib implementation over Npgsql):
private IDataReader DoExecuteReader(DbCommand command, CommandBehavior cmdBehavior)
{
    try
    {
        var sql = new StringBuilder();
        using (var reader = command.ExecuteReader(CommandBehavior.SequentialAccess))
        {
            while (reader.Read())
            {
                sql.AppendLine($"FETCH ALL IN \"{reader.GetString(0)}\";");
            }
        }
        command.CommandText = sql.ToString();
        command.CommandType = CommandType.Text;
        IDataReader reader2 = command.ExecuteReader(cmdBehavior);
        return reader2;
    }
    catch (Exception)
    {
        throw;
    }
}
The command building code is shown below:
Helper.InitializeCommand(cmd, 300, "getmultipleresultsets");
db.AddReturnValueParameter(cmd);
db.AddInParameter(cmd, "organizationId", DbType.Int32, ORGANIZATIONID);
db.AddCursorOutParameter(cmd, "CV_1");
db.AddCursorOutParameter(cmd, "CV_2");
The code that adds the refcursor parameter goes something like this:
public override void AddCursorOutParameter(DbCommand command, string RefCursorName)
{
    NpgsqlParameter parameter = (NpgsqlParameter)CreateParameter(RefCursorName, false);
    parameter.NpgsqlDbType = NpgsqlDbType.Refcursor;
    parameter.NpgsqlValue = DBNull.Value;
    parameter.Direction = ParameterDirection.Output;
    command.Parameters.Add(parameter);
}
Your post mixes the PostgreSQL function with the .NET client code that reads its result, which makes it hard to follow.
Regardless, your function is declared to return a set of refcursors - that is not the same thing as two output parameters; you seem to be confusing the name of the cursor (cursors have names, but ints, for example, do not) with the name of the parameter (int parameters do have names).
Please note that PostgreSQL does not actually have output parameters - a function always returns a single table, and that's it. PostgreSQL does have a function syntax with output parameters, but that is only a way to construct the schema of the output table. This is unlike SQL Server, which apparently can return both a table and a set of named output parameters. To facilitate portability, when reading results, if Npgsql sees any NpgsqlParameter with direction out, it will attempt to find a resultset with the name of the parameter and will simply populate the NpgsqlParameter's Value with the first row's value for that column. This practice has zero added value over simply reading the resultset yourself - it's just there for compatibility.
To sum it up, I'd suggest you read the refcursors with your reader and then fetch their results as appropriate.
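A hedged sketch of that approach with plain Npgsql (connection string and the organization id are placeholders; the FETCH must run inside the same transaction that opened the cursors):
using (var conn = new NpgsqlConnection("Host=myhost;Database=mydb;Username=me;Password=secret"))
{
    conn.Open();
    using (var tx = conn.BeginTransaction())
    {
        var cursorNames = new List<string>();

        // First resultset: one row per refcursor name returned by the function
        using (var cmd = new NpgsqlCommand("SELECT public.getmultipleresultsets(@orgId)", conn, tx))
        {
            cmd.Parameters.AddWithValue("orgId", 42);
            using (var reader = cmd.ExecuteReader())
                while (reader.Read())
                    cursorNames.Add(reader.GetString(0));
        }

        // Then fetch each cursor's rows while the transaction is still open
        foreach (var name in cursorNames)
        {
            using (var fetch = new NpgsqlCommand($"FETCH ALL IN \"{name}\"", conn, tx))
            using (var rows = fetch.ExecuteReader())
                while (rows.Read())
                {
                    // consume the rows of this resultset here
                }
        }

        tx.Commit();
    }
}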

file is encrypted or is not a database, Qt

Sorry for posting this even though there are already a few posts about it, but they are not helping.
I'm using SQLite with Qt. I wanted to do a simple query, but it gives me this error:
file is encrypted or is not a database Unable to execute statement
Some posts' comments say the DB may be corrupt, so I used another program and created a test table from scratch, but I still have the same problem. This is the code:
QSqlDatabase db = QSqlDatabase::addDatabase("QSQLITE");
db.setDatabaseName("AdmissionTestDB.sql");
db.open();
if (db.isOpen())
{
    QSqlQuery q = db.exec("SELECT * FROM USERS");
    if (!q.lastError().isValid())
    {
        qDebug() << "works!";
        while (q.next())
        {
            qDebug() << q.value(8).toString();
        }
    }
    else
        qDebug() << "---db failed to open! , error: " << q.lastError().text();
    db.close();
    return true;
}
qDebug() << "db failed to open! , error: " << db.lastError().text();
return false;
More information that will hopefully help solve this:
1- I'm using SQLite 3
2- The problem happens when I use QSqlQuery q = db.exec("SELECT * FROM USERS"); so the DB actually opens!
3- I used two GUI programs to create the DB; one of them is the latest version of SQLiteStudio, which is 3.0.4

How to check the defined error class of a connection?

How to check the defined error class of a connection (DSI) of Sybase Replication Server?
I use alter connection to change the error class associated with the DSI, but I want a command that lists the connection's information so I can confirm that the error class was correctly associated.
How can I do that?
You can see it by calling rs_getconn in the RSSD database.
I don't see any rs_getconn stored proc in an RSSD database for a 15.7.1 repserver.
I do see a ral_connection_details proc which seems to do the job (example output connecting with sqsh):
> ral_connection_details "MY_ASE_SERVER", "MyDB"; -mvert
dsname: MY_ASE_SERVER
dbname: MyDB
error class: rs_sqlserver_error_class
function string class: rs_sqlserver_function_class
username: mylogin
password: NULL -- Note, the password *isn't* null, whatever
dbid: 123
Not sure where rs_getconn and ral_connection_details came from (3rd party app? custom proc? one of a slew of fly-by-night SRS management tools Sybase provided over the years?), but the simplest (and guaranteed to be in any RSSD going back eons - unless someone dropped it) is the rs_helpdb stored proc.
rs_helpdb generates a result set showing dsname, dbname, did, prs, error and function classes; for RS 15.7+ you'll also get connid (associated with multipath rep / alternate connections) and the repserver error class.
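For example (a sketch; the server, login and RSSD database names are placeholders):
isql -S MY_RSSD_SERVER -U rssd_login -P rssd_pwd -D my_rssd
1> rs_helpdb
2> go
The error class column in that output is enough to confirm that the alter connection change took effect.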

Problems deleting data from database

I am using Hibernate to access my database. I would like to delete a set of records based on a criterion. My database is PostgreSQL and my Java code is:
public void deleteAttr(String parameter) {
    Configuration cfg = new Configuration();
    cfg.configure(resource.getString("hibernate_config_file"));
    SessionFactory sessionFactory = cfg.buildSessionFactory();
    session = sessionFactory.openSession();
    Transaction tx = session.beginTransaction();
    tx.begin();
    String sql = "delete from attribute where timestamp > to_date('" + parameter + "','YYYY-MM-DD')";
    session.createSQLQuery(sql);
    tx.commit();
}
The method runs, but it doesn't delete data from the database. I have also checked the SQL statement in pgAdmin and it works there, but not in the code. Why? Can someone help me?
Thanks in advance!
It's because you're creating a query, but you don't execute it:
String sql = "delete from attribute where timestamp > to_date('" + parameter + "','YYYY-MM-DD')";
Query query = session.createSQLQuery(sql);
query.executeUpdate();
You should really use bound named parameters rather than string concatenation to pass parameters into your query: it's usually more efficient and much more robust, but above all, it doesn't open the door to SQL injection attacks.
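For example, a hedged sketch with a named parameter, reusing the session and tx from the question:
String sql = "delete from attribute where timestamp > to_date(:cutoff, 'YYYY-MM-DD')";
Query query = session.createSQLQuery(sql);
query.setParameter("cutoff", parameter); // bound value, no string concatenation
int deleted = query.executeUpdate();
tx.commit();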

Execute multiple stored procedures with single trip to database

I have a lot of legacy data access code, mainly SqlCommand with stored procedure calls, that we use to execute a lot of INSERT statements into a database.
As long as the SQL Server has been on the same machine as the application, performance has been acceptable, but now we are trying to move some of the data to SQL Azure.
The problem is that our code calls an SP for every record to insert, which results in quite a few trips to the database, and when the database is not located on the same server this takes some time.
var conn = new SqlConnection("connString");
var cmd = new SqlCommand("spMyStoreProc", conn);
cmd.CommandType = CommandType.StoredProcedure;
cmd.Parameters.Add("@a", SqlDbType.VarChar, 10);
cmd.Parameters.Add("@b", SqlDbType.Int);
using (conn)
{
    conn.Open();
    foreach (var rec in recordsToInsert)
    {
        cmd.Parameters["@a"].Value = rec.A;
        cmd.Parameters["@b"].Value = rec.B;
        cmd.ExecuteNonQuery();
    }
    conn.Close();
}
I have tried the code above with and without Transactions.
I have also tried to use a "batch" SQL statement to execute several SPs in every trip to the server.
Like this:
var cmd = conn.CreateCommand();
cmd.CommandText = "EXEC spMyStoreProc @a='a', @b=2; EXEC spMyStoreProc @a='b', @b=4;";
It greatly increases the performance of the operation, but since I have quite a few SPs, each with about 20-50 parameters, it gets quite tedious to write this code for all the insert commands in this data access component.
Is this the best way to achieve this, or can I somehow tell ADO.NET I want to execute my calls as a batch (I haven't found anything suggesting it's possible, but I feel I should at least ask) to avoid network latency etc. between every single SP call?
If not, does anybody know a good way to achieve this without having to write it "by hand"? Since it's a legacy application, I cannot change the data layer completely.
Are there any applications that can take SqlCommands with parameters and generate the T-SQL they would execute?
Thanks in advance
You should probably have one stored procedure that calls all the other stored procedures - that will probably be the least amount of work. From the code you then only call the stored procedure once. Given that you are passing the same parameters every time (your code seems to imply that), you would basically do something like this:
CREATE PROCEDURE sp_RunBatch(@param1, @param2, etc [all the parameters you need])
AS
exec spMyStoreProc @a='a'
exec spMyStoreProc2 @b='b'
The advantages of this are many, among them that it's all centralized, and you can even wrap all of the calls in a transaction so as not to do dirty inserts (given that they all depend on each other).
Also, if you don't feel like passing 20-30 parameters to each SP, you may want to create a user-defined table type for each set of parameters that you can pass. Then each SP gets 1 or 2 parameters, and the code becomes much simpler and more readable.
EDIT:
This is a good reference for the user-defined table types: http://msdn.microsoft.com/en-us/library/bb675163.aspx
And this is how to pass the table valued types to SQL server: http://msdn.microsoft.com/en-us/library/bb675163.aspx
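A hedged sketch of the C# side of that idea (the table type dbo.MyRecordList and the procedure dbo.spMyStoreProcBatch are made-up names; the type and procedure have to be created in SQL Server first):
// Fill one DataTable matching the columns of the user-defined table type
var table = new DataTable();
table.Columns.Add("a", typeof(string));
table.Columns.Add("b", typeof(int));
foreach (var rec in recordsToInsert)
    table.Rows.Add(rec.A, rec.B);

using (var cmd = new SqlCommand("dbo.spMyStoreProcBatch", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    var p = cmd.Parameters.AddWithValue("@records", table);
    p.SqlDbType = SqlDbType.Structured; // table-valued parameter
    p.TypeName = "dbo.MyRecordList";
    cmd.ExecuteNonQuery();              // the whole batch travels in one round trip
}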
An alternative to M.R.'s approach would be to send all your parameters as an XML document, then parse the XML document to extract your parameters. This may simplify the interface a bit.
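A hedged sketch of what the SQL Server side of that could look like (procedure, table and element names are made up):
CREATE PROCEDURE dbo.spMyStoreProcXml
    @payload XML
AS
    -- Shred the XML: one <record a="..." b="..."/> element per row to insert
    INSERT INTO dbo.MyTable (a, b)
    SELECT r.value('@a', 'VARCHAR(10)'),
           r.value('@b', 'INT')
    FROM @payload.nodes('/records/record') AS t(r);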
However, I think you were on to something when you discussed the possibility of chaining all the commands in a single string. Instead of manually building them, consider writing an extension method on the SqlCommand object that returns a single string for execution, leveraging the sp_executesql syntax, and execute the entire string in a single pass.
So you would have a loop that looks like this, and you would call a new ToInlineSql extension method:
string sqlCommand = "";
foreach (var rec in recordsToInsert)
{
    cmd.Parameters["@a"].Value = rec.A;
    cmd.Parameters["@b"].Value = rec.B;
    sqlCommand += cmd.ToInlineSql();
}
// execute sqlCommand
The ToInlineSql extension method could look like this (pseudo-code; you will have to add certain things such as checking the data type, quoting, and so on - see the documentation for sp_executesql):
public static class SqlCmdExt
{
    public static string ToInlineSql(this SqlCommand cmd)
    {
        // Target shape: exec sp_executesql N'<statement>', N'<param declarations>', <param values>
        string declarations = "";
        string values = "";
        foreach (SqlParameter p in cmd.Parameters)
        {
            if (declarations.Length > 0) { declarations += ", "; values += ", "; }
            declarations += p.ParameterName + " " + p.SqlDbType; // ParameterName already carries the "@"
            values += p.ParameterName + " = '" + p.Value + "'";
        }
        return "exec sp_executesql N'" + cmd.CommandText + "', N'" + declarations + "', " + values + ";";
    }
}
