Error initializing SQLite with CN1 Data Access - codenameone

I'm used to Codename One, but not to the database part; this is the first time I'm using it and I've run into some problems with the CN1 Data Access lib.
I'm using its versioning capability, and I'm stuck. I'm trying to create a very simple table to achieve some image caching, and I always get this exception:
java.io.IOException: [SQLITE_ERROR] SQL error or missing database (near ")": syntax error)
at com.codename1.impl.javase.SEDatabase.execute(SEDatabase.java:107)
at ca.weblite.codename1.db.DAOProvider.getDatabaseSchema(DAOProvider.java:192)
at ca.weblite.codename1.db.DAOProvider.loadSchema(DAOProvider.java:248)
at ca.weblite.codename1.db.DAO.<init>(DAO.java:109)
at our.app.managers.dao.TarotImgDAO.<init>(TarotImgDAO.java:14)
at our.app.managers.DatabaseHelper.initialize(DatabaseHelper.java:78)
at our.app.managers.DatabaseHelper.getInstance(DatabaseHelper.java:43)
at our.app.App.init(App.java:95)
Here is the SQL file:
--Version:1
CREATE TABLE tarot_img (
id INTEGER PRIMARY KEY AUTOINCREMENT,
img_path VARCHAR NOT NULL,
img BLOB
);
--
I've tried it on the SQLite db with DB Browser for SQLite, and it works.
Here is the calling code:
Database db = Display.getInstance().openOrCreate(DB_FILE);
DBinstance = db;
DAOProvider provider = new DAOProvider(db, "/liquibase.sql", DB_VERSION);
provider.set("tarot_img", new TarotImgDAO(provider)); // <-- IOException thrown here
And the TarotImgDAO constructor is:
public TarotImgDAO(DAOProvider provider) throws IOException {
    super("tarot_img", provider);
}
I really don't get why this super basic SQL fails; maybe it's something with the parsing?
I've also had to replace the original lib with the one rebuilt by kaya18 on GitHub, to overcome the NumberFormatException caused by a bad trim in the lib.
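To narrow it down, I can run the same statement directly through Codename One's Database API, bypassing the DAOProvider schema parsing entirely. A minimal diagnostic sketch (test.db is just a scratch name): if this succeeds, the SQL itself is fine and the versioned-file parsing is the likely culprit.
import java.io.IOException;
import com.codename1.db.Database;
import com.codename1.ui.Display;

// Diagnostic only: execute the CREATE TABLE without DAOProvider.
void testSchemaDirectly() throws IOException {
    Database db = Display.getInstance().openOrCreate("test.db");
    try {
        db.execute("CREATE TABLE IF NOT EXISTS tarot_img ("
                + "id INTEGER PRIMARY KEY AUTOINCREMENT, "
                + "img_path VARCHAR NOT NULL, "
                + "img BLOB)");
    } finally {
        db.close();
    }
}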
Any help would be welcome :)

Related

Using ServiceStack OrmLite, is there a way to get an execution plan?

Using ServiceStack OrmLite 6.4 and Azure SQL Server with SQLServerDialect2012, we have an issue with an enum causing excessive stalling and timeouts.
If we just convert it to a string, it's as quick as it should be.
var results = db.Select(q => q.SomeColumn == enum.value); // -> 3.5 seconds
var results2 = db.Select(q => q.SomeColumn.ToString() == enum.value.ToString()); // -> 0.08 seconds
We are using default settings, so the enum in the db is defined as a varchar(255).
Both queries give the same result.
To track the issue down we wanted to see what it's actually firing, but all we get is a query with some #1, #2 etc. with no indication of what parameters were used or how they are defined.
All our attempts to get a 1:1 SQL string we can use to manually test the query and see the results have failed... MiniProfiler was the closest, as it shows the parameter values...
but it does not contain the details necessary to recreate the used query and reproduce the issue we have (manually recreating the query gives 80ms, as above).
Trying to get the execution plan alongside the query also fails.
db.ExecuteSql("SET STATISTICS PROFILE ON;");
var results = db.Select(q => q.SomeColumn == enum.value);
db.ExecuteSql("SET STATISTICS PROFILE OFF;");
This only returns data, not any of the extra info I was hoping for.
I have not been able to find any sites or threads that explain how others get any kind of debug info.
What is the correct next step here?
OrmLite's Logging & Introspection page shows how you can view the SQL generated by OrmLite.
E.g. Configuring a debug logger should log the generated SQL and params:
LogManager.LogFactory = new ConsoleLogFactory(debugEnabled:true);
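If full logging is too noisy, OrmLite also lets you inspect statements directly. A small sketch (SomeTable and SomeEnum mirror the names in the question; they are not a real schema):
using ServiceStack.OrmLite;

// Print each command's SQL and parameter values before it executes.
OrmLiteConfig.BeforeExecFilter = dbCmd => Console.WriteLine(dbCmd.GetDebugString());

var results = db.Select<SomeTable>(q => q.SomeColumn == SomeEnum.Value);

// Or inspect just the most recent statement after the fact.
Console.WriteLine(db.GetLastSql());
The string from GetDebugString / GetLastSql is what you can paste into SSMS alongside SET STATISTICS PROFILE ON to compare plans for the parameterised and literal versions.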

SQLAlchemy - cannot reflect a SQL Server DB running on Amazon RDS

My code is simple:
app = Flask(__name__)
app.config.from_object('config')
db = SQLAlchemy(app)
db.metadata.reflect()
And it throws no errors. However, when I inspect the metadata after this reflection, it returns an empty immutabledict object.
The parameters in my connection string are 100% correct, and the code works with non-RDS databases.
It seems to happen to others as well but I can't find a solution.
Also, I have tried to limit the reflection to specific tables using the "only" parameter in the metadata.reflect function, and this is the error I get:
sqlalchemy.exc.InvalidRequestError: Could not reflect: requested table(s) not available in mssql+pyodbc://{connection_string}: (users)
I've fixed it. The reflect() method of the SQLAlchemy class has a parameter named 'schema'. Setting this parameter (to "dbo" in my case) solved it.
I am using Flask-SQLAlchemy, which does not have the said parameter in its reflect() method. You can follow this post to gain access to that parameter and others, such as 'only'.
This error occurs when reflect is called without the schema name provided. For example, this will cause the error to happen:
metadata.reflect(only=[tableName])
It needs to be updated to use the schema of the table you are trying to reflect, like this:
metadata.reflect(schema=schemaName, only=[tableName])
You have to set the schema='dbo' parameter for reflect:
db.Model.metadata.reflect(bind=engine, schema='dbo', only=['User'])
and then create model of your table:
class User(db.Model):
    __table__ = db.Model.metadata.tables['dbo.User']
and to access data from that table:
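For example, using Flask-SQLAlchemy's standard query API (assuming the reflected User model defined above):
# Query the reflected model like any other Flask-SQLAlchemy model.
users = User.query.all()          # all rows mapped onto dbo.User
first_user = User.query.first()   # or a single row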

Cannot SQLBulkCopy Error 40197 with %d code of 4815 (Connection Forcibly Closed)

Developing with a VS 2013 ASP.NET MVC 5 web project and a separate Azure-hosted SQL Server database.
At the bottom is all my error information from Visual Studio 2013. I've narrowed down the problem and found a link to the Microsoft description of the problem, without a solution. I'm developing with Database First and Entity Framework 6, ASP.NET 4 MVC & Razor. I connect to a SQL Azure database - I think this is what's falling over; I've already checked the logs for the Azure website etc.
I have delimited text files (that were uploaded to APP_DATA) that I load into a DataTable, then use SQL Bulk Copy to dump the content into the Azure database. All works 100% fine as long as my files contain only a few hundred records. But I need to insert 20MB files with approx 200,000 rows. When I try the big files I get an error at the point where ASP.NET is performing the bulk copy. No matter what I set for batch size etc., it bails around the 4000-row mark every time. I've exhausted all options and am at my wits' end; I even tried scaling up the Azure database to Business from FREE web, and scaling up the website too. Here is the code:
public void BatchBulkCopy(DataTable dataTable, string DestinationTbl, int batchSize, int identity)
{
    try
    {
        // Set the timeout.
        System.Diagnostics.Debug.WriteLine("Start SQL Bulk Copy");
        using (SqlBulkCopy sbc = new SqlBulkCopy("Server=tcp:eumtj4loxy.database.windows.net,1433;Database=AscWaterDB;User ID=HIDDEN#HIDDEN;Password=XXXXXXX;Trusted_Connection=False;Encrypt=True;Connection Timeout=900;", SqlBulkCopyOptions.TableLock))
        {
            sbc.DestinationTableName = DestinationTbl;
            sbc.BulkCopyTimeout = 0;
            // Number of records to be processed in one go
            sbc.BatchSize = 1000;
            // Add your column mappings here
            sbc.ColumnMappings.Add("D2001_SPID", "SupplyPointId");
            sbc.ColumnMappings.Add("D2002_ServiceCategory", "D2002_ServiceCategory");
            sbc.ColumnMappings.Add("D2025_NotifyDisconnection/Reconnection", "D2025_NotifyDisconnectionReconnection");
            sbc.ColumnMappings.Add("WaterBatchId", "WaterBatchId");
            sbc.ColumnMappings.Add("D2003_Schedule3", "D2003_Schedule3");
            sbc.ColumnMappings.Add("D2004_ExemptCustomerFlag", "D2004_ExemptCustomerFlag");
            sbc.ColumnMappings.Add("D2005_CustomerClassification", "D2005_CustomerClassification");
            sbc.ColumnMappings.Add("D2006_29e", "D2006_29e");
            sbc.ColumnMappings.Add("D2007_LargeVolAgreement", "D2007_LargeVolAgreement");
            sbc.ColumnMappings.Add("D2008_SICCode", "D2008_SICCode");
            sbc.ColumnMappings.Add("D2011_RateableValue", "D2011_RateableValue");
            sbc.ColumnMappings.Add("D2015_SPIDVacant", "D2015_SPIDVacant");
            sbc.ColumnMappings.Add("D2018_TroughsDrinkingBowls", "D2018_TroughsDrinkingBowls");
            sbc.ColumnMappings.Add("D2019_WaterServicesToCaravans", "D2019_WaterServicesToCaravans");
            sbc.ColumnMappings.Add("D2020_OutsideTaps", "D2020_OutsideTaps");
            sbc.ColumnMappings.Add("D2022_TransitionalArrangements", "D2022_TransitionalArrangements");
            sbc.ColumnMappings.Add("D2024_Unmeasurable", "D2024_Unmeasurable");
            sbc.ColumnMappings.Add("D2014_FarmCroft", "D2014_FarmCroft");
            // Finally write to server
            System.Diagnostics.Debug.WriteLine("Write Bulk Copy to Server " + DateTime.Now.ToString());
            sbc.WriteToServer(dataTable); // Fails here when I upload a 20MB CSV with 190,000 rows
            sbc.Close();
        }
        // Ignore this; I don't get to this code unless loading a file that's only got a few records
        WaterBatch obj = GetWaterBatch(identity); // Now we can get the WaterBatch
        obj.StopDateTime = DateTime.Now;
        Edit(obj);
        Save();
        System.Diagnostics.Debug.WriteLine("Finished " + DateTime.Now.ToString());
    }
    catch (Exception ex)
    {
        // Unwrap to the innermost exception so the real cause gets logged
        Exception ex2 = ex;
        while (ex2.InnerException != null)
        {
            ex2 = ex2.InnerException;
        }
        Console.WriteLine(ex2.Message);
        throw;
    }
}
My $exception says:
$exception {"A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - An existing connection was forcibly closed by the remote host.)"} System.Exception {System.Data.SqlClient.SqlException}
My InnerException is below; if I drill into inner exception after inner exception, it's the same message with an HResult of -2146232060, then -2147467259:
InnerException {"An existing connection was forcibly closed by the remote host"} System.Exception {System.ComponentModel.Win32Exception}
UPDATED INFO:
Microsoft's explanation of the error is below. I am getting error number 40197, and Microsoft says to look for the embedded %d code, which I get to be 4815. The question is what now: where can I go from here to get info on a 40197 with a %d of 4815?
I got the following info regarding my error from this link: http://msdn.microsoft.com/en-us/library/windowsazure/ff394106.aspx
Error code: 40197
Severity: 17
Description: The service has encountered an error processing your request. Please try again. Error code %d.
You will receive this error, when the service is down due to software or hardware upgrades, hardware failures, or any other failover problems. The error code (%d) embedded within the message of error 40197 provides additional information about the kind of failure or failover that occurred. Some examples of the error codes embedded within the message of error 40197 are 40020, 40143, 40166, and 40540.
Reconnecting to your SQL Database server will automatically connect you to a healthy copy of your database. Your application must catch error 40197, log the embedded error code (%d) within the message for troubleshooting, and try reconnecting to SQL Database until the resources are available, and your connection is established again.
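Based on that guidance, a retry wrapper around my bulk-copy call would presumably look something like the sketch below (the retry count and delay are arbitrary illustrations, and "dbo.WaterData" stands in for my real destination table):
// Sketch: catch transient Azure SQL error 40197, log the embedded
// %d code carried in the message, back off, and retry.
// Requires: using System.Data.SqlClient;
const int maxRetries = 3;
for (int attempt = 1; ; attempt++)
{
    try
    {
        BatchBulkCopy(dataTable, "dbo.WaterData", 1000, identity);
        break; // success
    }
    catch (SqlException ex)
    {
        if (ex.Number != 40197 || attempt == maxRetries)
            throw;
        System.Diagnostics.Debug.WriteLine("Transient 40197: " + ex.Message);
        System.Threading.Thread.Sleep(TimeSpan.FromSeconds(5 * attempt));
    }
}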
I was getting the exact same error during a Bulk Insert. In my case, it was a varchar column that was overflowing. I just needed to increase the character limit and the problem was solved.
Just increase the length of the column, even if the value being stored is much smaller than the size of the column; that worked for me.
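To confirm which column is overflowing before widening anything, a quick T-SQL check can help (the table and column names below are placeholders, not from the question):
-- List declared lengths for the destination table's character columns.
SELECT COLUMN_NAME, CHARACTER_MAXIMUM_LENGTH
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'WaterData' AND CHARACTER_MAXIMUM_LENGTH IS NOT NULL;

-- Widen an offending column (example only).
ALTER TABLE WaterData ALTER COLUMN D2008_SICCode varchar(500);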

How to connect an MS Access database with MATLAB (transfer data from a GUI and save it in the database)

Hello, I am trying to work with databases and I am new to MATLAB.
I want to manipulate databases created in MS Access, but I don't know how; I hope to find a way to enter data from a GUI (created using MATLAB) and save it in the database.
I've designed the user interface in MATLAB and created a database in MS Access.
The problem is I do not know how to connect the database and MATLAB.
I found some code showing how to connect them:
dbpath = ['C:\Users\Esra\Documents\Esra.accdb'];
conurl = [['jdbc:odbc:Driver={Microsoft Access Driver (*.mdb, *.accdb)};DSN='';DBQ='] dbpath];
con = database('','','','sun.jdbc.odbc.JdbcOdbcDriver', conurl);
I hope to find good code or a book about this.
Finally, I do not know if this is the correct place for my question or not; if not, please move my question to the correct place.
You need to run SQL queries on the database; you can do this with database.fetch (and a few other friends).
The example query from the docs:
conn = database('dbtoolboxdemo','','');
setdbprefs('DataReturnFormat','cellarray')
results = fetch(conn, 'select productdescription from producttable')
% Not in the example in the docs: this syntax, which I prefer, is also supported
results = conn.fetch('select productdescription from producttable');
Note that you will also need to know how to write SQL. For that, there are plenty of resources online - you just have to search for them.
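For the saving half of the question, the Database Toolbox can also write back to Access. A rough sketch using the classic insert function and reusing conn from above (the table name, columns, and handles fields are hypothetical placeholders for your own GUI):
% Push values captured from a MATLAB GUI into an Access table.
% 'contacts', 'name', 'age' and handles.* are made-up examples.
colnames = {'name', 'age'};
data = {get(handles.nameEdit, 'String'), str2double(get(handles.ageEdit, 'String'))};
insert(conn, 'contacts', colnames, data);
close(conn);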

db.SubmitChanges() not updating the database in Windows Phone 8

My question might be similar to many questions in a Google search, but I have a specific query. I have written my code like this, where dB is the database and Items is the table having FileName as one property.
var query = from fs in dB.Items
            where fs.FilePath.Trim() == strOldpath.ToString()
            select fs;
foreach (var fs in query)
{
    fs.FileName = txtrename.Text.ToString();
}
try
{
    dB.SubmitChanges();
}
catch (Exception e)
{
    // Note: this empty catch silently swallows any submit failure
}
This code runs fine, but after debugging I stop the emulator and run this in the command prompt:
ISETool.exe ts xd 19xxxx-b6f2-474b-a747-6axxxxxxx E:\Practise\WinPhone\PhoneApp3\
It creates the *.sdf in the specified folder, and I can open it in Server Explorer. But I can see that instead of the updated file name it shows the old file name, even though the code runs fine. Any idea why the file name is not updated? I have set the primary key for the table as well.
You appear to have hit a known issue with trying to update the results of a read-only query:
Workaround for LINQ to SQL Entity Identity Caching and Compiled Query Bug?
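The linked thread has the details of that workaround; as a general first check, make sure the DataContext is actually change-tracking when you query, otherwise SubmitChanges has nothing to flush. A minimal sketch (assuming dB is a System.Data.Linq.DataContext):
// Must be set before the first query; entities materialised while
// tracking is disabled are read-only and SubmitChanges() ignores
// edits to them. (true is the default, but read-only/compiled-query
// setups sometimes turn it off.)
dB.ObjectTrackingEnabled = true;

var item = dB.Items.First(fs => fs.FilePath.Trim() == strOldpath.ToString());
item.FileName = txtrename.Text;
dB.SubmitChanges(); // now persists the change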
