IronPython - What kind of database is usable?

I'm using IronPython 2.6 for .NET 4 to build a GUI logging application.
The application receives data via a serial port and stores it in an sqlite3 database, while showing the last 100 received items in a listview. The listview gathers its data via an SQL SELECT from the database every 100 ms, and it only queries data that is not already visible in the listview.
At first the sqlite3 module worked well, but I'm now stuck with several issues that I can't solve.
After a while, the sqlite3 module throws exceptions like:
database disk image is malformed
database or disk is full.
These errors occur sporadically, and never under high system load.
I've been stuck with this kind of issue for some weeks now, and I'm looking for an alternative way to store binary and ASCII data in a database-like object.
Does somebody know a good database solution I could use with IronPython 2.6 for .NET 4?
Thanks

That is highly subjective without far more detailed requirements.
You should be able to use any database with .NET support, whether out of the box (notably SQL Server Express and Compact) or installed separately (other editions of SQL Server, DB2, MySQL, Oracle, ...).
Ten SELECT commands per second should be easily within reach for any of the databases above, unless there is some performance issue (e.g., a huge amount of data with no usable index).

If you don't need compatibility with CPython, then SQL Server Compact is probably your best bet given your requirements.
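For illustration, here is a minimal sketch of creating and writing to a SQL Server Compact database from IronPython over ADO.NET. It assumes the SQL Server Compact runtime is installed; the file name and table are placeholders, not part of the original answer:

    import clr
    clr.AddReference("System.Data.SqlServerCe")
    from System.Data.SqlServerCe import SqlCeConnection, SqlCeCommand, SqlCeEngine
    from System.IO import File

    conn_str = "Data Source=log.sdf"

    # Create the database file on first run.
    if not File.Exists("log.sdf"):
        SqlCeEngine(conn_str).CreateDatabase()

    conn = SqlCeConnection(conn_str)
    conn.Open()

    # Illustrative schema for a logging application.
    SqlCeCommand("CREATE TABLE log_items (id INT IDENTITY PRIMARY KEY, "
                 "payload NVARCHAR(4000))", conn).ExecuteNonQuery()
    conn.Close()

Unlike the sqlite3 module, everything here goes through ADO.NET, which is the natively supported data-access path on .NET 4.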

Related

Convert SQL Server queries to Postgres on the fly

I have a scenario where I get queries on a webservice that need to be executed on a database.
The source of these queries is a physical device, so I can't really change the input to my queries.
I get the queries from the device in the MSSQL dialect. Earlier the backend was SQL Server, so things were pretty straightforward: queries would come in and get executed as-is on the DB.
Now we have migrated to Postgres, and we don't have the option to modify the input data (the SQL queries).
What I want to know is: is there any library that will do this SQL Server/T-SQL translation for me, so I can run the SQL Server queries through it and execute the resulting Postgres queries on the database? I searched a lot but couldn't find much that would do this. (There are libraries that convert a schema from one to the other, but what I need is to translate SQL Server queries to Postgres on the fly.)
I understand there are quite a few nuances that differ between T-SQL and Postgres, so a translator will be needed in between. I am open to libraries in any language (preferably ones that run on Linux :) ), and any other suggestions on how to go about this would also be welcome.
Thanks!
If I were in your position, I would look at upgrading your SQL Server to 2019 ASAP (as of today, you can find on Twitter that the officially supported, production-ready version is available on request). Then have a look at the PolyBase feature they (re)introduced in this version. In short, it allows you to connect your MSSQL instance to other data sources (like Postgres) and query the data as if it were a "normal" SQL Server DB (via T-SQL); in the background your queries are translated into native pgsql and run against your real source.
There are not many resources on this feature (in its 2019 version) yet, but it seems to be one of the most powerful features coming with this release.
This is what BOL says about it (unfortunately, it mostly covers the old 2016 version).
There is an excellent, yet very short, presentation on this topic by Bob Ward (Principal Architect @ Microsoft) from SQLBits 2019.
The only thing I can think of that might be worth trying is SQL::Translator. It's a set of Perl modules that has been around for ages but seems to be still maintained. Whether it does what you want will depend on how complex those queries are.
The no-brainer solution is to keep a SQL Server Express instance in place and introduce triggers that call out to the Postgres database.
If this is too heavy, you can look at creating a Tabular Data Stream (TDS is SQL Server's network transport) gateway with limited functionality and map each possible incoming query, with any parameters, to a static Postgres query. This limits testing to a small, finite number of cases.
This way there is no SQL Server at all, and you have more control than with the trigger option; a sketch of the mapping idea follows below.
If your devices only ever issue a limited set of query shapes, this may be practical. Attempting a general translation is very likely to cost more than the devices would cost to replace (unless you have zillions already deployed).
There is an open implementation, FreeTDS, that you could use if you are happy with C or Java.
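To make the mapping idea concrete, here is a minimal sketch in Python, assuming the devices issue a small, known set of query shapes; the patterns, table names, and queries are illustrative, not from the original answer:

    import re

    # Each known incoming T-SQL shape maps to a hand-written Postgres form.
    QUERY_MAP = [
        (re.compile(r"SELECT TOP (\d+) \* FROM readings", re.I),
         "SELECT * FROM readings LIMIT {0}"),
        (re.compile(r"SELECT GETDATE\(\)", re.I),
         "SELECT NOW()"),
    ]

    def translate(tsql):
        """Map a known T-SQL query to its static Postgres equivalent."""
        for pattern, pg_template in QUERY_MAP:
            match = pattern.match(tsql.strip())
            if match:
                return pg_template.format(*match.groups())
        raise ValueError("Unrecognized query; extend QUERY_MAP: %r" % tsql)

    print(translate("SELECT TOP 100 * FROM readings"))
    # -> SELECT * FROM readings LIMIT 100

Because every accepted query shape is enumerated up front, an unrecognized query fails loudly instead of being mistranslated, which is exactly what keeps the testing surface finite.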

How to build a big and complex database in SQL, the easy way?

I have installed Oracle XE. I build a small database every day from the command prompt to practice, but now I want more: a bigger database with a lot of different data to practice on and make exercises with.
So, is it possible to get a big data file from somewhere and load it into an XE database?
You can't get 'big' data into Oracle Express Edition, as it is limited to 4 GB (10g) or 10 GB (11g).
That said, there are public datasets available. Personally I like the FAA data on registered aircraft owners/operators.
As you are practicing with Oracle, perhaps a good solution (which will also generate exactly the data you need) would be to write your own stored procedures to generate your data in a loop (or similar construct); see the sketch below.
You could then generate as much as you like, while also practicing your handling of large datasets and your writing of efficient PL/SQL and SQL code.
This way your data will also match your current database structure, without your having to build a new database around whichever dataset you download from the web.
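A minimal sketch of the idea, driving an anonymous PL/SQL loop from Python with cx_Oracle; the credentials, table, and columns are placeholders for your own schema:

    import cx_Oracle

    conn = cx_Oracle.connect("practice/practice@localhost/XE")
    cur = conn.cursor()

    # Anonymous PL/SQL block that bulk-generates rows in a loop.
    cur.execute("""
        BEGIN
            FOR i IN 1 .. 100000 LOOP
                INSERT INTO customers (id, name, created_at)
                VALUES (i, 'Customer ' || i, SYSDATE - MOD(i, 365));
            END LOOP;
            COMMIT;
        END;
    """)
    conn.close()

The same block can of course be wrapped in a stored procedure and run directly from SQL*Plus if you would rather stay entirely inside the database.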
IIRC there are sample schemas, such as HR, that can be enabled. See this.

Entity Framework, no SQL Server, what do I do?

Is there seriously no way of using a shared-access, non-server-driven database file format without having to use SQL Server? The Entity Framework is great, and it wasn't until I had completely finished designing my database model and getting SQL Server Compact Edition 4.0 to work with Visual Studio that I found out it basically cannot be run off a network drive and used by multiple users. I appreciate I should have done some research!
The only other way, as far as I can tell, is to set up an SQL server, something I doubt I would be able to do. I'm searching for possible ways to use EF with Access databases (which can be shared on a network drive), but this seems either difficult or impossible.
Would I have to go back to typed DataSets, or even to hand-coding the SQL?
Install SQL Server Express. Access is not supported by EF at all, and my experience with file-based databases (Access, SQL Server CE) is mostly this:
If you need to persist some very small, mostly read-only data, they are fine (good for code tables, though such data can just as easily be stored in XML).
If you expect concurrent traffic, frequent writes to the DB, and larger data sets, their performance and usability drop quickly. They are mostly useful as local storage for a single user.
I'm not sure how this compares to, for example, SQLite. To generate a database from a model for SQLite you need a special T4 template (one that emits the correct SQL syntax).
Have you tried SQLite? It has an ADO.NET provider, and as far as I know EF supports any provider. Since it's file-based, that might be a plausible solution. It's also free.

How to eager load entire database with EF

My database consists of 5 tables with ~10000 rows combined. It takes ~1 MB in SQL Server CE, which is on a shared folder. The database itself is hierarchical: Country-Region-City-Street-Building. I am using Entity Framework 4.
Because the database is small, users are able to explore and edit all 2000 cities in a WPF ListView. But with every approach I have tried so far the GUI is sluggish (because of the many database round-trips; with dummy data the GUI is lightning-fast). How can I load the entire database into memory with one or a few round-trips?
I tried multiple Include() calls, but I noticed a great performance penalty, as described here.
Should I write my own lightweight ORM? I could also use plain ASCII CSV files instead of a database, but that would obviously rule out concurrency.
Honestly, I've done something like this myself, and the answer for me was to copy the whole database locally and work on it.
If you're looking not only to read but also to write, I'd definitely suggest ditching CE and installing one of the Express versions of SQL Server. They are designed for this kind of situation; CE is not*.
*SP1 is better for concurrent access, but over the network it will never be performant for large datasets.
I re-asked this question on the Microsoft forum and they were kind enough to give me some guidance.
Basically my question can be restated as follows:
Read from the database only once, at application start.
Do all subsequent queries against the local data, not the database (for performance).
Write to both the context and the database each time an entity is added or deleted.
With plain EF this is not possible, because every query goes to the database. This implies that I must read the data quickly at startup and then cache it.
Implementation details:
The best way seems to be to use ESQL to import the data quickly and then cache it, for example using entities not connected to a context. From my first experiments this seems to work well.
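Setting the EF specifics aside, the read-once/write-through pattern itself is easy to state; here is a language-agnostic sketch of it in Python (the data-access callables are placeholders for whatever layer actually talks to the database):

    class CachedTable:
        def __init__(self, load_all, insert_row):
            self._insert_row = insert_row
            self._rows = list(load_all())   # one round-trip at startup

        def query(self, predicate):
            # All reads are served from memory, never from the database.
            return [row for row in self._rows if predicate(row)]

        def add(self, row):
            # Writes go to both the cache and the database (write-through).
            self._insert_row(row)
            self._rows.append(row)

The GUI then binds to the in-memory rows, so scrolling and filtering never touch the network; only adds and deletes pay for a round-trip.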

Data migration process for an application whose architecture is changing?

After having used an application for over 10 years, and been constantly limited by its lack of extensibility, we have decided to rewrite it fully from scratch. Because the new architecture differs from the old application's, the database is also different. Here comes the problem: is there any industrial process for migrating the data from the previous database to the new one? Some tables are alike; others are not. Overall, we need a process that will help us make sure that no data or logical constraints are lost during the migration.
PS: The old and new databases are both Oracle databases.
Although you don't specify this in your question, I assume that you're going to develop the new version of your application/database, and then at some switchover point you need to migrate all of the live data from your old database into your new database.
If this is the case, then you're really asking about two distinct processes: the migration (with some modifications) of the database structure, followed later by the migration of the data itself.
For the first process, the best tool is you, the developer (I don't mean you're a "tool" - you know what I mean). You could bring over the structure of the old database and then change it as necessary for the new version; however, this approach in general tends to leave too much of the old structure behind. I think it's better to take advantage of the situation and rebuild the database from the ground up, using the original database just as a general reference.
For the second process, I would treat the data migration as a separate task requiring a separately-written and -tested application. This application could be a set of scripts or a compiled application or whatever is most convenient for you. Because your old and new databases will not have the same structure (and may in fact be very different), there are no commercial tools out there that will handle this task for you automagically. By treating this as a distinct application that you write yourself, you can test the data conversion process many times before your "go live" date.
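As an illustration of such a hand-written, testable migration script, here is a minimal Python sketch using cx_Oracle; the credentials, tables, and the transform rule are all placeholders, since the real mapping depends on your two schemas:

    import cx_Oracle

    src = cx_Oracle.connect("olduser/oldpass@oldhost/OLDDB")
    dst = cx_Oracle.connect("newuser/newpass@newhost/NEWDB")
    read_cur = src.cursor()
    write_cur = dst.cursor()

    # 1. Extract from the old structure.
    read_cur.execute("SELECT cust_id, cust_name, region_code FROM old_customers")

    # 2. Transform each row to fit the new structure (illustrative rule).
    rows = [(cid, name.strip().upper(), code or "UNKNOWN")
            for cid, name, code in read_cur]

    # 3. Load into the new structure, then verify the row count.
    write_cur.executemany(
        "INSERT INTO customers (id, name, region) VALUES (:1, :2, :3)", rows)
    dst.commit()

    write_cur.execute("SELECT COUNT(*) FROM customers")
    assert write_cur.fetchone()[0] == len(rows), "row count mismatch"

Because the script is an ordinary program, you can run it against a copy of the live data as many times as you need, adding checks (row counts, checksums, constraint validations) until you trust it.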
I've heard of several different ways to attack problems like this. The simplest solution I've seen is to use a Microsoft Access database with ODBC connections to both the new and old Oracle databases. You can then use Access to migrate and transform the data as you need.
The more elegant solution involves installing the Microsoft SQL Server development tools. You can use Business Intelligence Development Studio to create an SSIS package with two Oracle endpoints. SSIS can handle the heavy lifting of transforming the data between the databases, and you can run the package locally, so you don't need an instance of SQL Server running anywhere.
There's a tutorial series for SSIS at:
http://www.developerdotstar.com/community/node/364
You might also want to check out Oracle Warehouse Builder (OWB). The name is a little confusing, but it's Oracle's ETL (Extract, Transform, and Load) package. I've never used it personally, but it might do what you're looking to do as well.
