MySQL + Dapper Extensions: Error in SQL syntax

I am trying to do CRUD operations using Dapper Extensions, but I get the following error while inserting data into a MySQL database:
Error: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near [......] at line [....]
If I use an MSSQL database, Dapper Extensions works correctly. Why am I getting this error with MySQL?

The error occurs because Dapper Extensions generates the query for SQL Server (by default), whereas you are actually connected to MySQL. There are syntax differences between these two RDBMSs, hence the error. You have to tell Dapper Extensions that you are connecting to MySQL.
Set the dialect somewhere at the startup of your application.
//Synchronous
DapperExtensions.DapperExtensions.SqlDialect = new DapperExtensions.Sql.MySqlDialect();
//Asynchronous
DapperExtensions.DapperAsyncExtensions.SqlDialect = new DapperExtensions.Sql.MySqlDialect();
As you can note, you need to configure this separately for the synchronous and asynchronous methods. You can read more about this on GitHub.
This will instruct Dapper Extensions to generate the queries according to MySQL syntax. This is not specific to Dapper Extensions; a similar setting is necessary for many ORMs that support query generation for multiple RDBMSs.
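For illustration, here is roughly what this looks like end to end. The Product class, table and connection string are made up for the example, and it assumes the MySql.Data and DapperExtensions packages:

using System;
using DapperExtensions;
using DapperExtensions.Sql;
using MySql.Data.MySqlClient;

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class Program
{
    public static void Main()
    {
        // Tell Dapper Extensions to generate MySQL-flavoured SQL once, at startup.
        DapperExtensions.DapperExtensions.SqlDialect = new MySqlDialect();

        using (var connection = new MySqlConnection("Server=localhost;Database=shop;Uid=app;Pwd=secret;"))
        {
            connection.Open();
            // With the MySQL dialect set, Insert emits backtick-quoted identifiers
            // and LAST_INSERT_ID() instead of SQL Server style syntax.
            var id = connection.Insert(new Product { Name = "Widget" });
            Console.WriteLine(id);
        }
    }
}

Without the dialect line, the same Insert call produces SQL Server flavoured SQL, which is exactly what MySQL rejects with the syntax error above.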
Apart from this, you may also consider implementing logging, which may help you diagnose such issues. MiniProfiler is a good tool for this purpose. You may find more details in my other answer.
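If you try MiniProfiler, a common pattern is to wrap the connection in a ProfiledDbConnection so that every statement Dapper Extensions generates is captured. A rough sketch, assuming the MiniProfiler 4.x API and a made-up connection string:

using System;
using MySql.Data.MySqlClient;
using StackExchange.Profiling;
using StackExchange.Profiling.Data;

public static class ProfilingDemo
{
    public static void Main()
    {
        var profiler = MiniProfiler.StartNew("dapper-extensions-demo");

        // Dapper / Dapper Extensions calls made through the wrapped connection
        // are recorded along with the SQL text they generate.
        using (var connection = new ProfiledDbConnection(
            new MySqlConnection("Server=localhost;Database=shop;Uid=app;Pwd=secret;"),
            MiniProfiler.Current))
        {
            connection.Open();
            // ... run your Dapper Extensions CRUD here ...
        }

        profiler.Stop();
        Console.WriteLine(profiler.RenderPlainText());
    }
}

Seeing the exact SQL that was sent usually makes it obvious when the wrong dialect is in play.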

Related

Can I replicate from Postgres to MS SQL?

I will have a Postgres database in production but want to use MS SQL (whatever edition) for reporting. So, I would like to have replication set up where MS SQL subscribes to Postgres. Is this possible?
All heterogeneous replication scenarios are deprecated by Microsoft, and they now recommend building solutions using SSIS and CDC instead.
We load data from PostgreSQL into our SQL Server reporting database using SSIS and it works well, although we had to use a commercial OLE DB provider because of limitations (at that time) in the open-source one.
Actually copying the data is usually the easy part; most of the work comes in gathering requirements, understanding the data, transforming it, implementing logging and error handling, etc. SSIS can do some things for you right away (e.g. logging), but my general advice would be to use it primarily as a workflow tool and for simple data copying with minimal transformation logic (e.g. data type conversion). If something seems too difficult or clumsy in SSIS, you can put it into a stored procedure or script and call that from SSIS instead.
I've been using and following PostgreSQL for several years and am not aware of such a solution. If one exists, I'm concerned that it might be complex or fragile. I would recommend regular export/imports via cron. In between the export and import, you would need to take care of translating between the formats.
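To give a rough idea of the export/import approach, the copy step itself can be a small scheduled job that streams rows from Postgres straight into SQL Server. The sketch below is illustrative only: table, column and connection names are made up, and it assumes the Npgsql package:

using System.Data.SqlClient;
using Npgsql;

public static class NightlyCopy
{
    public static void Main()
    {
        const string pgConnString = "Host=pg-prod;Database=app;Username=report;Password=secret";
        const string msConnString = "Server=report-sql;Database=Reporting;Integrated Security=true";

        using (var source = new NpgsqlConnection(pgConnString))
        using (var target = new SqlConnection(msConnString))
        {
            source.Open();
            target.Open();

            // Refresh the reporting copy: clear it, then stream the rows across.
            using (var truncate = new SqlCommand("TRUNCATE TABLE dbo.Orders;", target))
            {
                truncate.ExecuteNonQuery();
            }

            using (var select = new NpgsqlCommand(
                "SELECT order_id, customer_id, placed_at, total FROM orders;", source))
            using (var reader = select.ExecuteReader())
            using (var bulk = new SqlBulkCopy(target) { DestinationTableName = "dbo.Orders", BatchSize = 5000 })
            {
                // ADO.NET covers most of the type translation; anything awkward
                // (money, timezone-aware timestamps) can be cast in the SELECT.
                bulk.WriteToServer(reader);
            }
        }
    }
}

Run it from cron / Task Scheduler, or put the same logic inside an SSIS script task if you want SSIS to own logging and scheduling.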
If your reporting actually happens in MS Excel or MS Access, I recommend looking into connecting them directly to PostgreSQL via ODBC.

Execute querys to DBF files on SQL Server

As I've posted in this thread, I need to keep an SQL Server database (not only SQL Server; feel free to recommend another server that does the same thing) and VFP DBF tables synchronized for use by the company's systems.
As #alex-k said, Linked Server doesn't support INSERT, UPDATE or DELETE, but I need CRUD statements for the systems to work. I've already tried #alex-k's suggestion, but it returns an error.
Since a lot of things in computing can be done in more than one way, what do you recommend for my situation?
Thanks.
I've actually just installed SQL 2008 R2 on an old machine, successfully created a linked server, and tried the update directly from within SQL. I ran into your same error that it won't work. So I looked around too and found something on the Tek-Tips forum that described opening a rowset to apply an update via this link.
When trying that, I got the follow-up error listed below.
SQL Server blocked access to STATEMENT 'OpenRowset/OpenDatasource' of component 'Ad Hoc Distributed Queries' because this component is turned off as part of the security configuration for this server. A system administrator can enable the use of 'Ad Hoc Distributed Queries' by using sp_configure. For more information about enabling 'Ad Hoc Distributed Queries', see "Surface Area Configuration" in SQL Server Books Online.
Since I don't want to open up security-related features, even on a machine used just for sample purposes, I'll leave it there; you might be more open to enabling this feature and trying the suggestion from Tek-Tips...
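For reference, if you do decide to turn the feature on (and accept the security trade-off), these are the sp_configure steps the error message points to. Here they are wrapped in a small C# snippet, though they can just as easily be run from SSMS; the server name is made up and you need ALTER SETTINGS (sysadmin/serveradmin) permission:

using System.Data.SqlClient;

public static class EnableAdHocQueries
{
    public static void Main()
    {
        const string script = @"
            EXEC sp_configure 'show advanced options', 1;
            RECONFIGURE;
            EXEC sp_configure 'Ad Hoc Distributed Queries', 1;
            RECONFIGURE;";

        using (var connection = new SqlConnection("Server=localhost;Integrated Security=true;"))
        using (var command = new SqlCommand(script, connection))
        {
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}

Setting both options back to 0 afterwards reverses the change.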
Good luck otherwise.
DBF files are not widely supported today. As far as I know, Microsoft Access still supports dBASE DBF files, which should be compatible with VFP DBFs.
You could build some kind of gateway in MS Access responsible for synchronizing the VFP and SQL Server databases.
I have not used them since their pre-Apollo days (it used to be called SuccessWare or SDE), but this company might have something that will help. The problem you are going to encounter with other solutions that can read/write the DBFs is that they might not be able to keep the index files up to date (CDX, NDX, etc.). If they don't have a driver that helps, you might have to resort to creating an extended stored procedure to natively call their interface. Hopefully you won't have to resort to that.
-don
Have you considered building a small Visual FoxPro application to do this? Visual FoxPro can read and write both SQL Server and DBFs. We have done this numerous times. The key is understanding which records have changed and using surrogate primary keys on both sets of data.
Rick
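For what it's worth, the same change-detection idea can be sketched outside of VFP as well, e.g. a small C# job that reads recently changed rows through the Visual FoxPro OLE DB provider and upserts them into SQL Server by surrogate key. Everything here is hypothetical (paths, table, column and key names, and the last_upd column the VFP app would have to maintain), and it only covers the DBF-to-SQL-Server direction:

using System;
using System.Data.OleDb;
using System.Data.SqlClient;

public static class DbfSync
{
    public static void Main()
    {
        var lastRun = DateTime.Today.AddDays(-1); // normally persisted between runs

        using (var dbf = new OleDbConnection(@"Provider=VFPOLEDB.1;Data Source=C:\data\;"))
        using (var sql = new SqlConnection("Server=.;Database=Erp;Integrated Security=true;"))
        {
            dbf.Open();
            sql.Open();

            // Pull only the rows touched since the last sync.
            var select = new OleDbCommand(
                "SELECT cust_pk, name, balance FROM customer WHERE last_upd >= ?", dbf);
            select.Parameters.AddWithValue("?", lastRun);

            const string upsert = @"
                UPDATE dbo.Customer SET Name = @name, Balance = @balance WHERE CustPk = @pk;
                IF @@ROWCOUNT = 0
                    INSERT INTO dbo.Customer (CustPk, Name, Balance) VALUES (@pk, @name, @balance);";

            using (var reader = select.ExecuteReader())
            {
                while (reader.Read())
                {
                    using (var cmd = new SqlCommand(upsert, sql))
                    {
                        cmd.Parameters.AddWithValue("@pk", reader["cust_pk"]);
                        cmd.Parameters.AddWithValue("@name", reader["name"]);
                        cmd.Parameters.AddWithValue("@balance", reader["balance"]);
                        cmd.ExecuteNonQuery();
                    }
                }
            }
        }
    }
}

Because this only reads the DBFs, the index-file concern mentioned above does not apply; writing back into the DBFs is where a native VFP component, as suggested above, is much safer.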

Doctrine2 SQL Server Mapping Generation

I have a large (100+ tables) SQL Server 2005 database that I would like to start mapping with Doctrine. Right now I've done a manual job of a few tables (no relations yet, just disparate tables), using PHPDOC annotation mapping inside my entities. Manually it works like a charm ... however it really will take ages to get everything mapped out and I'm looking for an easier way.
I looked into ORM Designer, but it doesn't seem to offer imports from a SQL Server database. I also looked at using the Doctrine CLI and doing the "reverse engineering" mentioned here. Finally, I tried using orm:convert-mapping --from-database with no luck. It appears the last two fail because the sqlsrv drivers (running on IIS7 here) raise an error on tables that have no index: PDOException: The active result for the query contains no fields.
Is it possible that I can load up Doctrine on an Ubuntu machine, and use whatever drivers Linux has to connect to SQL Server 2005 ... then perhaps the orm:convert-mapping feature wouldn't die on me?
Any help would be much appreciated!
Try these drivers for PHP (hint: use the non-thread-safe build).
Also check your connection parameters.
I worked on a SQL Server 2005 Express project with Symfony2 and mapped all my tables in reverse with no trouble at all.
Well, actually I had to implement a new Doctrine type for datetime, as described here.
Good luck!

Programming for various databases

I'm wondering how those enterprise programs manage to link to various types of databases just by stating the connection string.
Issues like different syntax and variable types will definitely be there.
Apart from stored procedures for each type of database, how else do they handle this in their programming?
One way that came to my mind is just if/else checking of the database type in order to build a different query.
I'm asking because I'm curious, as I'm using an engine built in C++ and JSP that supports SQL Server, Access, MySQL and Oracle.
ORMs tackle this problem by introducing a level of abstraction between the database and the domain model. For example, with Hibernate you change the connection string and the dialect, and HQL queries and Criteria API calls are automatically translated into the proper SQL for the target database.
Of course this assumes you never write a single line of SQL in your application or anything which is specific to the database.
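To make that concrete, here is a stripped-down illustration of the dialect idea. It is not any real ORM's API, just a sketch of how one neutral request can be rendered differently per database:

using System;

// A tiny, illustrative dialect layer: the application asks for a query in
// neutral terms and each dialect renders vendor-specific SQL.
public interface ISqlDialect
{
    string LimitQuery(string select, int rows);
}

public sealed class SqlServerDialect : ISqlDialect
{
    public string LimitQuery(string select, int rows) =>
        select.Replace("SELECT", "SELECT TOP " + rows);
}

public sealed class MySqlDialect : ISqlDialect
{
    public string LimitQuery(string select, int rows) =>
        select + " LIMIT " + rows;
}

public sealed class OracleDialect : ISqlDialect
{
    public string LimitQuery(string select, int rows) =>
        "SELECT * FROM (" + select + ") WHERE ROWNUM <= " + rows;
}

public static class Demo
{
    public static void Main()
    {
        // A real ORM picks the dialect from configuration rather than hard-coding it.
        ISqlDialect dialect = new MySqlDialect();
        Console.WriteLine(dialect.LimitQuery("SELECT id, name FROM customer", 10));
    }
}

The if/else approach from the question works too, but it scatters the per-database knowledge across the code base instead of isolating it behind one interface.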

Recommendations for supporting both Oracle and SQL Server in the same ASP.NET app with NHibernate

Our client wants to support both SQL Server and Oracle in the next project. Our experience is with the .NET/SQL Server platform. We will hire an Oracle developer, but our concern is with the data access code. Will NHibernate make the DB engine transparent for us? I don't think so, but I would like to hear from developers who have faced similar situations.
I know this question is a little vague, because I don't have Oracle experience, so I don't know what issues we will find.
You can easily use NHibernate to make your application database-agnostic by following some basic practices:
Design your object model first.
Do not use any database-specific code. You need somebody with good C# experience, not an Oracle developer. Do not rely on stuff like triggers, stored procedures, etc.
Let NHibernate generate the DB schemas, at least initially (you can tweak things like indexes later). It will choose the best available data types for each DB.
Use a DB-agnostic POID generator (hilo or guid) instead of sequences or identity.
Try to avoid using SQL. HQL and Linq work fine in 99% of the cases.
Avoid NH features that are not supported by all of your target DBs (for example, Future, MultiCriteria, etc.).
NHibernate has a great community. You can always ask your questions in http://groups.google.com/group/nhusers besides posting here.
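For example, here is roughly how the dialect can be switched purely from configuration using NHibernate's loquacious configuration API (NHibernate 3.x or later). The connection strings, driver choices and mapped assembly are assumptions for the sketch:

using NHibernate;
using NHibernate.Cfg;
using NHibernate.Dialect;
using NHibernate.Driver;

public static class SessionFactoryBuilder
{
    public static ISessionFactory Build(bool useOracle, string connectionString)
    {
        var cfg = new Configuration();

        cfg.DataBaseIntegration(db =>
        {
            db.ConnectionString = connectionString;
            if (useOracle)
            {
                db.Dialect<Oracle10gDialect>();
                db.Driver<OracleClientDriver>();
            }
            else
            {
                db.Dialect<MsSql2008Dialect>();
                db.Driver<SqlClientDriver>();
            }
        });

        // Mappings (hbm.xml, mapping-by-code, Fluent, ...) are added here and
        // stay identical for both databases.
        cfg.AddAssembly(typeof(SessionFactoryBuilder).Assembly);

        return cfg.BuildSessionFactory();
    }
}

SchemaExport can be pointed at the same Configuration object to generate the initial schema for whichever database is selected.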
There are three things to consider: the ISession object, the SQL queries that are generated, and your plain old CLR objects that are mapped to tables.
NHibernate will generate the required SQL queries based upon the chosen database dialect. If you configure NHibernate to use the SQL Server dialect, it will generate SQL statements that are correct for SQL Server. This can easily be configured dynamically at runtime based on configuration.
You also need to configure your session to connect to the right type of database. Again, various configuration methods can support dynamic ISession creation at runtime.
Your actual data objects, which are mapped to tables, should not need to change based on the database choice. One of NHibernate's strengths is the flexibility it provides in supporting multiple databases via a (fairly) simple configuration change and some up-front architectural thought.
See http://codebetter.com/blogs/karlseguin/archive/2009/03/30/using-nhibernate-with-multiple-databases.aspx for some examples of how you might abstract the underlying database away from the creation and usage of NHibernate.
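Tying the pieces together, the calling code never has to know which engine it is talking to. This fragment assumes the SessionFactoryBuilder sketched above, hypothetical appSettings keys and a mapped Customer entity:

using System.Configuration;
using NHibernate;

public static class Program
{
    public static void Main()
    {
        // Which engine to target is decided purely by configuration.
        bool useOracle = ConfigurationManager.AppSettings["TargetDb"] == "Oracle";
        string connectionString = ConfigurationManager.ConnectionStrings["App"].ConnectionString;

        ISessionFactory factory = SessionFactoryBuilder.Build(useOracle, connectionString);

        using (ISession session = factory.OpenSession())
        using (ITransaction tx = session.BeginTransaction())
        {
            // Plain HQL; NHibernate renders it as Oracle SQL or T-SQL as needed.
            var customers = session.CreateQuery("from Customer").List();
            tx.Commit();
        }
    }
}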
