FreeTDS date format issues from C

I have the following SQL statement:
SELECT getdate()
go
When I run it from SQL Server Management Studio it gives:
"Jul 27 2016 22:00:00.860"
When I run the same statement from sqsh it gives:
"Jul 27 2016 10:00PM"
sqsh uses FreeTDS to connect to SQL Server from my Linux box. I have a C program which also uses FreeTDS, and it works fine when the date is retrieved in 24-hour format.
I guess there are FreeTDS settings to get the date in the required format. Can someone please suggest how to accomplish that?

Here's what I just did on my Ubuntu Linux box:
Type this command at the bash shell: locale
My result (shortened): LANG=en_US
Copy the example locales.conf file to the config directory: sudo cp /usr/share/doc/freetds-common/examples/locales.conf /etc/freetds/
Open the /etc/freetds/locales.conf file in an editor.
Comment out the old date format defined there (I've used ";" as the comment character) and define the date format you need in the corresponding section:
[en_US]
;date format = %b %e %Y %I:%M:%S:%z%p
date format = %Y-%m-%d %H:%M:%S
Restart your web server process.
Now I get from sqsh:
SELECT getdate();
: 2016-08-01 11:37:45

Currently, the default date format returned by FreeTDS is configured in the locales.conf file. See http://www.freetds.org/userguide/locales.htm for details.
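The format codes in locales.conf appear to be modeled on strftime(3), with some FreeTDS-specific twists (e.g. %z in the sample above produces milliseconds, not a timezone offset). For the standard codes, you can preview a candidate format string with a small C program before touching FreeTDS; a minimal sketch that only exercises the C library, not FreeTDS itself:
#include <stdio.h>
#include <time.h>
int main(void)
{
    char buf[64];
    time_t now = time(NULL);
    struct tm *tm = localtime(&now);
    /* The 24-hour format used in locales.conf above */
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", tm);
    printf("%s\n", buf);
    return 0;
}
Compile with cc datefmt.c -o datefmt and compare the output to what sqsh prints after the config change.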

Related

SQL Server OPENROWSET error reading bcp file

I'm trying to transfer table data from one SQL Server to another using the bcp utility. This is purely to transfer data between two identical schemas, but I'm not able to use something like SSDT; I need something scriptable and portable so it can be run by others with just SQL Server and SSMS access.
I am generating a native output file and format file like so:
$> bcp database.TableName OUT c:\data\bcp\TableName.bcp -T -N -S SQLINSTANCE
$> bcp database.TableName format nul -f c:\data\bcp\TableName.fmt -T -N
Then in Management Studio I am trying to in turn read the files like this:
SELECT
*
FROM
OPENROWSET (BULK 'c:\data\bcp\TableName.bcp',
FORMATFILE = 'c:\data\bcp\TableName.fmt') AS t1
But I am getting this error:
The bulk load failed. The column is too long in the data file for row 6, column 19. Verify that the field terminator and row terminator are specified correctly.
I have followed this process before successfully, and it works for other tables, but I'm running into an issue with this one. The column mentioned is of datatype nvarchar(max). I can inspect what I think is the "problem" record in the source data, and it's just a very long string, but I don't see anything else special about it.
Is there something else I should be doing when generating the format file or what else am I missing?
If you are only exporting for the purpose of importing to another SQL Server, native format is the way to go, and in this case you don't need to use format files. Just do a native export and import.
Note that you are specifying a capital -N, which is Unicode native format (widenative), not plain native. Plain native is lowercase -n.
You should export using something like:
bcp database.Schema.TableName OUT c:\data\bcp\TableName.bcp -T -n -S SQLINSTANCE
Then on the importing side I suggest using BULK INSERT, which doesn't need a format file for native format at all:
BULK INSERT TargetDB.dbo.TargetTable
FROM 'c:\data\bcp\TableName.bcp'
WITH (DATAFILETYPE = 'native');
If you can't use BULK INSERT and absolutely must go with OPENROWSET, you need a format file. bcp can generate that for you, but again, use lowercase -n:
bcp database.Schema.TableName format nul -f c:\data\bcp\TableName.fmt -T -n -S SQLINSTANCE
Now your OPENROWSET should work.
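That is, with the export and format file both regenerated in native (-n) mode, the query from the question should run unchanged:
SELECT *
FROM OPENROWSET (BULK 'c:\data\bcp\TableName.bcp',
    FORMATFILE = 'c:\data\bcp\TableName.fmt') AS t1;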

What is wrong with the code in this BCP utility running on SQL Server?

I have created a BCP utility and I have wrapped it in a bat file. I have then created a daily task using Task Scheduler in Windows Server 2012.
The function of the BCP utility is to rename a file called 'myfile.csv' (located in C:\) by adding a date stamp to it, and then to recreate the file with the result of a SQL query.
The code currently stands as follows:
cd:\Program Files\ Microsoft SQL Server\Client SDK\ODBC\110\Tools\Binn
set vardate=%DATE:~4,10%
set varDateWithoutSlashes=%vardate:/=-%
ren C:\myfile.csv myfile_%varDateWithoutSlashes%.csv
bcp "SELECT TOP 100 ReservationStayID,NameTitle,FirstName,LastName,ArrivalDate,DepartureDate FROM MyDatabase.dbo.GuestNameInfo" queryout C:\myfile.csv -t, -c -S [ipaddress] -U sa -P 1234
My problem is that when the task runs, it renames the file correctly with a date stamp, but it seems that the SELECT query does not run, as the file is empty (except for the headers, which have been pre-loaded, by the way).
What is wrong with my code?
I should also add the following:
Are the double quotes in the select statement above correct? Or should they be single quotes?
Should the ipaddress in my codes above be in square brackets or should I remove them?
I have left the "Location" field 'as is' in the Task Scheduler. Should that be filled in? If yes, with what?
Thanks for helping out!
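For reference, here is how the batch file might look with the path quoted and cd /d used so the directory change also switches drives if needed (a sketch, not a diagnosis: [ipaddress] stays the question's placeholder, and the bcp flags are unchanged):
@echo off
rem Quote paths containing spaces; /d also switches the current drive
cd /d "C:\Program Files\Microsoft SQL Server\Client SDK\ODBC\110\Tools\Binn"
set vardate=%DATE:~4,10%
set varDateWithoutSlashes=%vardate:/=-%
rem Archive yesterday's file under a date-stamped name
ren C:\myfile.csv myfile_%varDateWithoutSlashes%.csv
rem Double quotes around the query are what bcp expects; -S takes the server
rem name or IP directly (square brackets are an SSMS quoting convention)
bcp "SELECT TOP 100 ReservationStayID,NameTitle,FirstName,LastName,ArrivalDate,DepartureDate FROM MyDatabase.dbo.GuestNameInfo" queryout C:\myfile.csv -t, -c -S [ipaddress] -U sa -P 1234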

SQL Server 2014 : Not able to import any data into LocalDB using bcp and format file (zero rows, no errors)

I'm trying to use a non-XML bcp format file to import data into LocalDB on Win7 64 bit. Simplest possible use case.
OS Name: Microsoft Windows 7 Home Premium
OS Version: 6.1.7601 Service Pack 1 Build 7601
LocalDB version: Microsoft SQL Server 2014 (12.0.2000.8)
BCP version: 12.0.2000.8
Basically, latest version of everything downloaded from Microsoft SQL Server 2014 site a few days ago.
I'm able to connect to the LocalDB instance via bcp to create a format file, but the generated format file doesn't work when trying to re-import the simplest possible data using it. No matter what I try, bcp loads zero rows, fails silently and prints no error information to the specified error file.
/* create the table */
use try_db;
create table try(num integer);
/* create the format file based on the table. */
bcp try_db.dbo.TRY format nul -n -T -f TRY.fmt -S (localdb)\default_db
/* above command creates a file TRY.fmt with the following contents */
12.0
1
1 SQLINT 1 4 "" 1 num ""
/* then I create a file data.txt, with just the number 99 in it, followed by a Windows line terminator (\r\n) */
/* then try importing the file into the table */
bcp try_db.dbo.TRY in data.txt -f TRY.fmt -T -S (localdb)\default_db -e errors.txt
Result:
Starting copy...
0 rows copied.
Network packet size (bytes): 4096
Clock Time (ms.) Total : 1
Nothing is written to errors.txt. I am just not able to get bcp to import anything at all using a format file!
I haven't tried it with SQL Server itself (only with LocalDB) but it shouldn't matter for stuff as simple as this.
I tried editing the TRY.fmt file line as follows:
1 SQLINT 1 4 "\r\n" 1 num ""
But that didn't help either.
I am able to get it to successfully import using -c instead of -f:
bcp try_db.dbo.TRY in data.txt -c -T -S (localdb)\default_db -e errors.txt
Any thoughts on (a) why bcp won't import using the format file, and (b) why it prints no errors to the specified error file? There must be something really simple I'm getting wrong here.
Please, no recommendations to use BULK INSERT or SSIS (etc) instead. bcp should just work as documented!
The format file describes the source data file, not the destination table. When you use -c or DATAFILETYPE = 'char', your input data types must be SQLCHAR. Native data types are only valid when using -n or DATAFILETYPE = 'native'. A source file in native format is always binary, so bcp needs to know the data type of each field in order to read the correct number of bytes and interpret them correctly.
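So for the text file described in the question (one int column, \r\n row terminator), a working non-XML format file would plausibly look like this (a sketch; the host-file length of 12 is an assumption, generous enough for any 32-bit integer's digits and sign):
12.0
1
1       SQLCHAR       0       12      "\r\n"     1     num       ""
The columns are: field order, host-file data type, prefix length, host-file data length, terminator, server column order, server column name, and collation.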
I think I found the answer. The bcp format file doesn't work the way you'd expect! Even for numeric or datetime import fields, you have to specify SQLCHAR as the data type in the .fmt file when the data file is text. Using the .fmt file generated by "bcp format" as-is is hopeless here: if it gives you back SQLINT or SQLDATE lines, you have to replace them with SQLCHAR for the import to work, even if the DB columns are in fact numeric or date/datetime types.
What a crock!

How to connect to MS SQLServer using Vim dbext on Mac OSX?

I use MacVim and the dbext plugin to connect to Oracle, and it works well. Now I need to connect to MS SQL Server, but it shows this error:
Connection: T(SQLSRV) H(localhost) U(user) at 14:38
/bin/bash: osql: command not found
Anyone know how to do this?
Make sure you have one of the FreeTDS CLI programs. I think tsql is more full-featured than osql, but the same approach should work with either.
Create a shell script to wrap tsql. Put it somewhere in your path.
Then add the dbext config values to your .vimrc
" I'm using mssql.sh as the wrapper program.
" Re-title to whatever you name yours
let g:dbext_default_SQLSRV_bin = "mssql.sh"
" FreeTDS options for osql/tsql are not as feature rich as dbext expects
let g:dbext_default_SQLSRV_cmd_options = ' '
" set 'host' in you profile to the FreeTDS server config, which will be altered in the script
The wrapper I whipped up is nothing special, but it's tested and works.
#!/bin/bash
# -S is better for FreeTDS than -H
options=$( echo "$@" | sed -e 's/-H /-S /' -e 's/ -i.*//' )
# osql/tsql in FreeTDS don't seem to accept a file flag,
# so pull the script filename out of the -i argument
sql_scratch=$( echo "$@" | sed 's|^.* -i||' )
# and execute...
cat "$sql_scratch" | tsql $options
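Make the wrapper executable and confirm your shell can find it (the name mssql.sh matches the .vimrc above; the location is up to you, as long as it's on your PATH):
chmod +x ~/bin/mssql.sh
command -v mssql.sh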
osql does come with FreeTDS, but you will probably then be greeted by another error: "Illegal option -w".
You can use ODBC instead of SQLSRV in the type parameter of the dbext connection. (The other option is using the DBI Perl interface.)
Install iodbc and build freetds with the --with-iodbc option.
Edit your odbc.ini file; you can find it using iodbc-config --iodbcini or find / -name odbc.ini.
My working odbc.ini (please take care with file names, I'm on a FreeBSD box):
[MYDSNNAME]
Driver = /usr/local/lib/libtdsodbc.so
Description = Sample OpenLink MT DSN
Server = 192.168.100.4
Port = 50436
TDS_Version = 8.0
Database = initial_db
ServerOptions =
ConnectOptions =
Options =
ReadOnly = no
And my dbext connection on .vimrc:
let g:dbext_default_profile_CONN = 'type=ODBC:dsnname=MYDSNNAME:user=domain\user:passwd=pass:dbname=initial_db'
You can also configure dbext to use the richer sqsh instead of the osql program from FreeTDS. An example connection profile that does so can be found in :h dbext by searching for "sqsh". You should, of course, already have sqsh in working order.
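For instance, reusing the binary override shown earlier (a sketch; check :h dbext for the full profile example, as the exact keys may differ):
" Point dbext's SQLSRV type at sqsh instead of osql/tsql
let g:dbext_default_SQLSRV_bin = 'sqsh'
" sqsh understands the usual -S/-U/-P flags, so no wrapper script is needed
let g:dbext_default_SQLSRV_cmd_options = ' '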

Configuring Locales on Linux for PostgreSQL

I'm having trouble getting a particular database set up and running. I'm trying to restore a PostgreSQL dump I got from somebody else. I've tried a few methods, to no avail.
Straight from pg_restore
pg_restore -C -d postgres --exit-on-error maggie_prod_20111221.dump.sql
Creating the database and tablespace first
createdb -T template0 maggieprod -E LATIN1
SQL: CREATE TABLESPACE magdat OWNER maggie LOCATION '/somewhere/magdat';
pg_restore -v -d template1 maggie_prod_20110121.dump.sql
Using the first method I get the following:
pg_restore: [archiver (db)] Error while PROCESSING TOC:
pg_restore: [archiver (db)] Error from TOC entry 2308; 1262 16386 DATABASE maggieprod postgres
pg_restore: [archiver (db)] could not execute query: ERROR: encoding LATIN1 does not match locale en_CA.utf8
DETAIL: The chosen LC_CTYPE setting requires encoding UTF8.
Command was: CREATE DATABASE maggieprod WITH TEMPLATE = template0 ENCODING = 'LATIN1' TABLESPACE = magdat;
And using the second, when I try to create the database I get:
createdb: database creation failed: ERROR: encoding LATIN1 does not match locale en_CA.utf8
DETAIL: The chosen LC_CTYPE setting requires encoding UTF8.
So it seems that I cannot create a LATIN1-encoded database? Why is that? I am new to locales and encodings and don't know very much about them. I just know that the dump was taken from a LATIN1 database.
The output of locale is:
LANG=en_CA.utf8
LC_CTYPE="en_CA.utf8"
LC_NUMERIC="en_CA.utf8"
LC_TIME="en_CA.utf8"
LC_COLLATE="en_CA.utf8"
LC_MONETARY="en_CA.utf8"
LC_MESSAGES="en_CA.utf8"
LC_PAPER="en_CA.utf8"
LC_NAME="en_CA.utf8"
LC_ADDRESS="en_CA.utf8"
LC_TELEPHONE="en_CA.utf8"
LC_MEASUREMENT="en_CA.utf8"
LC_IDENTIFICATION="en_CA.utf8"
LC_ALL=
And the output of locale -a is:
C
en_AG
en_AG.utf8
en_AU.utf8
en_BW.utf8
en_CA.utf8
en_DK.utf8
en_GB.utf8
en_HK.utf8
en_IE.utf8
en_IN
en_IN.utf8
en_NG
en_NG.utf8
en_NZ.utf8
en_PH.utf8
en_SG.utf8
en_US.utf8
en_ZA.utf8
en_ZW.utf8
POSIX
I don't see LATIN1 in the output of the second command; should I? If so, how would I go about adding it? Is it correct to assume that I need to change the locale on my computer? If so, is there a way to do that only for PostgreSQL? Also, when I open the dump I see a lot of garbage characters; I'm assuming this is because of the encoding. How would I view it properly?
Thanks for any help.
You need to create the database with a locale that matches the encoding, e.g.,
createdb -T template0 maggieprod -E LATIN1 --locale=en_CA
Since you don't have all locales installed, I guess you are using Debian or Ubuntu. In that case, run dpkg-reconfigure locales or install the locales-all package.
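On Debian/Ubuntu, generating the missing latin1 locale might look like this (a sketch; the exact locale name your system expects may vary, see /etc/locale.gen):
sudo locale-gen en_CA
sudo dpkg-reconfigure locales
locale -a | grep en_CA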
Alternatively, create the database with encoding UTF8. As long as all your clients set the client encoding correctly, it shouldn't make a difference.
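That alternative would look something like this (a sketch; pg_restore applies the client encoding recorded in the dump, so the LATIN1 data should be converted on the way in):
createdb -T template0 maggieprod -E UTF8 --locale=en_CA.utf8
pg_restore -d maggieprod --exit-on-error maggie_prod_20111221.dump.sql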
I had trouble using the createdb syntax from The_Denominater, so I did it the following way:
CREATE DATABASE maggieprod WITH ENCODING = 'LATIN1'
LC_CTYPE = 'en_CA' LC_COLLATE = 'en_CA'
TEMPLATE template0;
If you are still interested, the recode command will transform your database dump to the character set of your choice before you import it into your new database. See this link: http://blog.e-shell.org/134
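A hedged example of that recode step (assuming the dump is a plain-text SQL dump; recode edits the file in place and won't help with a custom-format pg_dump archive):
recode latin1..utf8 maggie_prod_20111221.dump.sql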
