Playframework evolutions + SQL Server. Generate correct schema script? - sql-server

I've been trying, with no success, to use evolutions + Slick to generate the schema for an MS SQL Server database.
I am using Play 2.3.x, Scala 2.11.6, Slick, and SQL Server 2014.
I can get it to connect, but the generated script contains lots of "errors" related to data types, such as the use of BOOLEAN and TIMESTAMP, which are types that SQL Server does not support.
The script should use the types BIT instead of BOOLEAN, DATETIME instead of TIMESTAMP, and UNIQUEIDENTIFIER instead of UUID.
Does anyone know a workaround for that?

These data types are all database-specific, so there is no general workaround here.
You have to change the data types to match the target database; otherwise you will run into the same kind of error with another database as well.
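If it helps, a Play evolution script is just plain SQL (typically under conf/evolutions/default/), so one workaround is to hand-edit or post-process the generated DDL and swap the generic types for their SQL Server equivalents. Here is a minimal sketch of that substitution step; the file path and the type mapping are assumptions based on the question, not anything Play or Slick provides:

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical post-processing step: rewrite a generated evolution script so
// the DDL uses SQL Server types instead of the generic ones that were emitted.
public class FixEvolutionTypes {
    public static void main(String[] args) throws IOException {
        Path script = Paths.get("conf/evolutions/default/1.sql"); // assumed location
        String sql = new String(Files.readAllBytes(script), StandardCharsets.UTF_8);

        // Generic type -> SQL Server type, as described in the question.
        Map<String, String> typeMap = new LinkedHashMap<>();
        typeMap.put("BOOLEAN", "BIT");
        typeMap.put("TIMESTAMP", "DATETIME");
        typeMap.put("UUID", "UNIQUEIDENTIFIER");

        for (Map.Entry<String, String> e : typeMap.entrySet()) {
            // Word boundaries avoid touching identifiers that merely contain the word.
            sql = sql.replaceAll("\\b" + e.getKey() + "\\b", e.getValue());
        }

        Files.write(script, sql.getBytes(StandardCharsets.UTF_8));
    }
}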

Related

Use a Calendar object in Java as a parameter for a SQL Server date column?

I have a project that needs to work on both Oracle and SQL Server. It currently runs on Oracle, and my job is to make the queries compatible with both Oracle and SQL Server.
The Oracle database has a table with a column called study_date, which has the DATE type. So in the Java code I create the parameter like this:
MapSqlParameterSource parameters = new MapSqlParameterSource();
Calendar calendar = new GregorianCalendar(TimeZone.getTimeZone("America/New_York"));
calendar.setTimeInMillis(event.getTime()); //returns long value of time
parameters.addValue("study_date", calendar);
...
...
return parameters;
This works and inserts a row with the given time. However, if I run the same code against SQL Server, it doesn't work. I found that there are many date types in SQL Server: https://learn.microsoft.com/en-us/sql/t-sql/data-types/date-and-time-types?view=sql-server-ver15, but none of them works with a Calendar object in Java.
I was able to insert a row by creating a Timestamp from the Calendar object I created. That works, but it means I have to write two different sets of code for Oracle and SQL Server. I can do that, but I am looking for the best way to handle this.
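For what it's worth, the Timestamp route mentioned above should let a single code path serve both databases, since java.sql.Timestamp is the standard JDBC mapping for date/time columns. A minimal sketch based on the snippet above (event is the same object as in the question; the explicit Types.TIMESTAMP hint is optional):
import java.sql.Timestamp;
import java.sql.Types;
import java.util.Calendar;
import java.util.GregorianCalendar;
import java.util.TimeZone;
import org.springframework.jdbc.core.namedparam.MapSqlParameterSource;

MapSqlParameterSource parameters = new MapSqlParameterSource();
Calendar calendar = new GregorianCalendar(TimeZone.getTimeZone("America/New_York"));
calendar.setTimeInMillis(event.getTime()); // returns long value of time, as above

// Bind a java.sql.Timestamp instead of the Calendar itself; the driver maps it
// onto Oracle's DATE and SQL Server's DATETIME without database-specific code.
parameters.addValue("study_date", new Timestamp(calendar.getTimeInMillis()), Types.TIMESTAMP);
...
return parameters;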

Export SQLite integer as DateTime

I have an SQLite3 database. I also have an SQL Server database with the same structure. I need to export the data from SQLite and insert it into the SQL Server database.
The export from SQLite and the modification of the generated export needs to be 100% scripted. Inserting into the SQL Server database will be done manually through SQL Server Management Studio.
I have a mostly good dump of the database through this answer here. I can modify most of the script as needed with sed.
The one thing I'm stuck on right now is that the SQLite database stores timestamps as number of seconds since UNIX epoch. The equivalent column in SQL Server is DATETIME. As far as I know, inserting an integer into a DateTime won't work.
Is there a way to specify that certain fields be converted a certain way upon dumping from SQLite? Meaning, specify that the integer fields be dumped as proper DateTime strings that SQL Server will understand?
Or, is there something I can run on the Linux command line that will somehow find these Integer timestamps and convert them?
EDIT: Anything that runs in a Bash script on Ubuntu is acceptable.
There are three basic solutions: (1) modify the data before the dump; (2) manipulate the file after the dump; or (3) modify the data on import. Which you choose will depend on how much freedom you have to modify the schemas.
If you wish to do it in SQLite, I'd suggest adding text columns with the dates stored as needed for import into SQL Server, then ignoring or removing the original columns on dump. The SQLite doc page for datetime() may help, as might the answers to this question.
Or, you can write a function in SQL Server that handles the import. Perhaps set it on an insert trigger.
Otherwise, a script that manipulates your dump file would work too. It sounds like you have a good handle on how to do this.
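If you end up rewriting the dump yourself rather than using SQLite's datetime(), the conversion itself is small: SQL Server's DATETIME will parse an ISO-style literal such as '2023-11-14T22:13:20'. A minimal sketch of that step, assuming the stored values are whole seconds in UTC (class and method names are just placeholders):

import java.time.Instant;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

// Hypothetical helper: turn a SQLite epoch-seconds value into a literal that
// SQL Server's DATETIME accepts, e.g. 1700000000 -> 2023-11-14T22:13:20.
public class EpochToDatetime {
    private static final DateTimeFormatter SQL_SERVER_LITERAL =
            DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss");

    static String toDatetimeLiteral(long epochSeconds) {
        return Instant.ofEpochSecond(epochSeconds)
                      .atOffset(ZoneOffset.UTC)
                      .format(SQL_SERVER_LITERAL);
    }

    public static void main(String[] args) {
        System.out.println(toDatetimeLiteral(Long.parseLong(args[0])));
    }
}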

MSSQL & SQL data type alternatives?

I have to work on a project connecting to a SQL Server DB, while working with PHP and the Laravel framework.
My issue is with the data types and whether I would be able to change them into fully functional and more 'conventional' SQL data types. Let's take NVARCHAR, for example: would I be able to change it into a normal VARCHAR?
The types I have are:
NCHAR
NVARCHAR
GEOGRAPHY
I've read over here that:
Laravel uses db-library (if it's available) to connect to Sql Server
which cannot receive unicode data from MSSQL. (1,2)
Is there anyone in the crowd who works with Laravel and has performed such a task?
You can use the following mapping I found from MSSQL data types to MySQL data types:
NCHAR => CHAR/LONGTEXT
NVARCHAR => VARCHAR/MEDIUMTEXT/LONGTEXT
I still couldn't find a solution for the GEOGRAPHY type. I'll keep you posted.
I found this on GEOGRAPHY, but it clearly doesn't mention a counterpart to it.

SQLBulkCopy can't convert Time to DateTime

I am writing a small utility to copy data from a proprietary ODBC database into a SQL Server database. Everything is working great, except when I use SQLBulkCopy to copy over the data. It works in most cases, but not when it's trying to copy a TIME field into a SQL DATETIME column. I get this error:
The given value of type TimeSpan from the data source cannot be converted to type datetime of the specified target column.
When I create the schema in SQL Server I have to make the DATE and TIME fields DATETIME types, so there is no way around this. I wanted to use SQLBulkCopy so I didn't have to manually read through every record in every table and wrap logic around the special cases. Before I go down that road, is there another way I can do this? I have no control at all over the source database's schema or content.
I assume you are dealing with pre-SQL Server 2008. SQL Server 2008 has DATE and TIME data types.
I think you would have to use a DataTable that matches the SQL Server schema, load it from your source reader, and change any TIME to a DATETIME by adding date information (e.g. 1/1/1900). Then use WriteToServer(DataTable). You might want to do it in batches, since reading it all into a DataTable can use a lot of memory.
Any particular reason you can't use SSIS?

Importing data into Oracle via Web Enterprise Manager with unique constraints

I am not at all familiar with Oracle, so bear with me!
I am using Oracle 10g with the web front end called Enterprise Manager. I have been given some CSV files to import. When I use the Load Data from User Files option I think I can set everything up, but when the job runs it complains about unique constraint violations, I guess because duplicate data is being inserted.
How can I get the insert to create a new primary key similar to a MSSQL auto inc number?
Oracle does not have an analog to the MSSQL auto-incrementing field; the feature has to be simulated with triggers and Oracle sequences. Some options here are to:
create a trigger to populate the columns you want auto incremented from a sequence
delete the offending duplicate keys in the table
change the values in your CSV file.
You might look at this related SO question.
There is no autoinc type in Oracle. You have to use a sequence.
By using a before insert trigger, you could get something similar to what you get by using an autoinc in SQL Server.
You can see here how to do it.
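Here is a minimal sketch of that sequence-plus-trigger combination, issued over JDBC so it is scriptable; the connection details, table, and column names are placeholders:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// Hypothetical example: emulate an auto-incrementing primary key in Oracle 10g
// with a sequence and a BEFORE INSERT trigger, so imported CSV rows get fresh keys.
public class CreateAutoIncrementKey {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//localhost:1521/ORCL", "user", "password");
             Statement stmt = conn.createStatement()) {

            // Sequence that supplies the key values.
            stmt.execute("CREATE SEQUENCE my_table_seq START WITH 1 INCREMENT BY 1");

            // Trigger that fills the primary key before each insert, so rows can
            // be loaded without supplying an ID at all.
            stmt.execute(
                "CREATE OR REPLACE TRIGGER my_table_bi " +
                "BEFORE INSERT ON my_table FOR EACH ROW " +
                "BEGIN " +
                "  SELECT my_table_seq.NEXTVAL INTO :NEW.id FROM dual; " +
                "END;");
        }
    }
}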
