I am using the SQL Server Import and Export Wizard to transfer data from an Excel file into a table. I wrote the following query in the wizard's query option:
Select
ISNULL([Col 1], [Col 2]),
[Col 3]
FROM [myExcelWorkSheet$]
It says "this SQL statement is not a query". In fact, no functions seem to work at all, including COALESCE and even CAST.
Does the SQL Server Import and Export Wizard not accept functions?
You are missing the schema on the table. It must be something like this:
Select
ISNULL([Col 1], [Col 2]),
[Col 3]
FROM [dbo].[myExcelWorkSheet$];
Recently I found an anomaly with SQL Server database creation. If I create a database with the SQL query
create database 6033SomeDatabase;
It throws an error.
But with the Management Studio UI, I can manually create a database with a name of 6033SomeDatabase.
Is this expected behaviour or is it a bug? Please throw some light on this issue.
Try it like this:
IF DB_ID('6033SomeDatabase') IS NULL
CREATE DATABASE [6033SomeDatabase]
I'll try to give you a detailed answer.
SQL syntax imposes some restrictions on the names of databases, tables, and fields. For example:
SELECT * FROM SELECT, FROM WHERE SELECT.Id = FROM.SelectId
The SQL parser won't parse this query. You have to rewrite it:
SELECT * FROM [SELECT], [FROM] WHERE [SELECT].Id = [FROM].SelectId
Another example:
SELECT * FROM T1 WHERE Code = 123e10
Is 123e10 the name of a column in T1, or is it a numeric constant for 123×10^10? The parser doesn't know.
Therefore, there are rules for naming. If you need an unusual database or table name, you can enclose it in square brackets.
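For instance, brackets settle both of the examples above (assuming T1 really does have a column named 123e10):
CREATE DATABASE [6033SomeDatabase];
SELECT * FROM T1 WHERE Code = [123e10]; -- unambiguously the column, not the constant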
I am trying to export the result of a SQL query (MS SQL 2014) into Excel 2010. The problem is that column values contain line breaks, so the remaining data gets pushed onto the next line in Excel. Is there a way to get rid of this while keeping the column as it is? Or maybe a way of encapsulating the column so that it is treated as one cell and the line breaks are ignored?
Here is my SQL Query:
select * from tbl_case
where (casenature not like '%<strong>%'
and casenature not like '%<br />%'
and casenature like '%from:%')
and userid in (select employeelogin from tbl_employees where riding='15010')
This works fine if I use the normal way to import data from MSSQL into Excel, which is: in Excel, Data -> From Other Sources -> From SQL Server.
To import data resulting from an arbitrary SQL query:
At the last step of the wizard (where you select the range), press Properties...
In the resulting Connection properties window:
Definition -> Command type: SQL
In the Command text field, write your query
You can replace the line-break characters with spaces in the SELECT statement, and then export to Excel.
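For example, a sketch of the query above, assuming the breaks are CR/LF (CHAR(13)/CHAR(10)) and naming the affected column explicitly instead of using *:
-- strip carriage returns and line feeds so Excel sees a single cell
select replace(replace(casenature, char(13), ' '), char(10), ' ') as casenature
from tbl_case
where (casenature not like '%<strong>%'
and casenature not like '%<br />%'
and casenature like '%from:%')
and userid in (select employeelogin from tbl_employees where riding='15010')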
I have one table in SQL Server and 5 tables in Teradata. I want to join those 5 tables in Teradata with the SQL Server table and store the result in a Teradata table.
I have the SQL Server name, but I don't know how to simultaneously run a query on both SQL Server and Teradata.
I want to do this:
SQL Server table query:
Select distinct store
from store_Desc
Teradata tables:
select cmp_id,state,sde
from xyz
where store in (
select distinct store
from sql server table)
You can create a table (or a volatile table if you do not have write privileges) to do this. Export the result from SQL Server as text or into the language of your choice.
CREATE VOLATILE TABLE store_table (
column_1 datatype_1,
column_2 datatype_2,
...
column_n datatype_n);
You may need to add ON COMMIT PRESERVE ROWS before the ; to the above depending on your transaction settings.
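For instance, a concrete version with a single illustrative store column (the DEC(4, 0) type is an assumption):
CREATE VOLATILE TABLE store_table (
store DEC(4, 0))
PRIMARY INDEX (store)
ON COMMIT PRESERVE ROWS;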
From a programming language you can loop over the statement below or do an executemany.
INSERT INTO store_table VALUES(value_1, value_2, ..., value_n);
Or you can import from a text file using Teradata SQL Assistant by going to File and selecting Import. Then execute the statement below and navigate to your file.
INSERT INTO store_table VALUES(?, ?, ..., ?);
Once you have inserted your data you can query it by simply referencing the table name.
SELECT cmp_id,state,sde
FROM xyz
WHERE store IN(
SELECT store
FROM store_table)
The DISTINCT is most easily applied on export from SQL Server, to minimize the number of rows you need to upload.
EDIT:
If you are doing this many times, you can do it with a script. Here is a very simple example in Python:
import pyodbc
con_ss = pyodbc.connect('sql_server_odbc_connection_string...')
crs_ss = con_ss.cursor()
con_td = pyodbc.connect('teradata_odbc_connection_string...')
crs_td = con_td.cursor()
# pull data from sql server
data_ss = crs_ss.execute('''
SELECT distinct store AS store
from store_Desc
''').fetchall()
# create table in teradata
crs_td.execute('''
CREATE VOLATILE TABLE store_table (
store DEC(4, 0)
) PRIMARY INDEX (store)
ON COMMIT PRESERVE ROWS;''')
con_td.commit()
# insert values; you can also use an execute many, but this is easier to read...
for row in data_ss:
    crs_td.execute('''INSERT INTO store_table VALUES(?)''', row)
con_td.commit()
# get final data
data_td = crs_td.execute('''SELECT cmp_id,state,sde
FROM xyz
WHERE store IN(
SELECT store
FROM store_table);''').fetchall()
# from here write to file or whatever you would like.
Is fetching the data from SQL Server through ODBC an option?
The best option may be to use Teradata Parallel Transporter (TPT) to fetch data from SQL Server using its ODBC operator (as the producer) combined with the Load or Update operator (as the consumer) to insert it into an intermediate table on Teradata. You must then perform the rest of the operations on Teradata; for those, you can use BTEQ/SQLA to store the results in the final Teradata table. You can also put the same SQL in TPT's DDL operator instead of BTEQ/SQLA and get the whole thing done in a single job script.
To allow tables residing on separate DB environments (in your case SQL Server and Teradata) to be used in a single SELECT statement, Teradata has recently released Teradata QueryGrid. But I'm not sure about the exact level of support for SQL Server, and it will involve licensing hassle and quite a learning curve for this simple job.
When importing a .csv file into SQL Server 2008 I am running into a problem.
In my .csv file the decimals are written like this (ex. 1234,34112), and it seems that SQL Server does not understand the ',' as a decimal separator.
My solution has been to import the column with BULK INSERT as VARCHAR and convert it to decimal afterwards. It works, but I guess there may be a better solution that I am not able to find.
Could you help me with that?
Thanks in advance
There are only two ways of doing it. One you have already mentioned: import into SQL Server as-is and then do something like this...
CREATE TABLE ImportTable (Value NVARCHAR(1000)) --<-- Import the data as it is
GO
INSERT INTO ImportTable VALUES
('1234,34112'),('12651635,68466'),('1234574,5874')
GO
-- Add a column of NUMERIC type, then populate it by
-- swapping the decimal comma for a decimal point
ALTER TABLE ImportTable ADD NewColumn NUMERIC(28,8)
GO
UPDATE ImportTable
SET NewColumn = CAST(REPLACE(Value , ',', '.') AS NUMERIC(28,8))
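If some rows might not be clean numbers, a quick hypothetical sanity check before the UPDATE will flag anything the CAST would choke on:
SELECT Value
FROM ImportTable
WHERE Value LIKE '%[^0-9,]%' -- anything besides digits and a comma
OR Value LIKE '%,%,%' -- more than one comma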
Or you can change it in your Excel sheet before you import it.
Unless you are using SSIS to import the data, it is always best to get your data into SQL Server first using loose datatypes and then do any data manipulation needed.
SQL Server Management Studio 17 provides a new direct option to import flat files that handles decimal CSV columns for you. Right-click your database, then click Tasks > Import Flat File...
I import data from a TSV file with SQL Server 2008.
NULL is replaced by 0 in integer columns when I check the table after the import.
How can I import the values as NULL? Please help me!
Using bcp, use the -k switch
Using BULK INSERT, use KEEPNULLS
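For example, a minimal BULK INSERT sketch; the table name, file path, and terminators are placeholders for your own:
BULK INSERT dbo.MyTable
FROM 'C:\data\file.tsv'
WITH (
FIELDTERMINATOR = '\t',
ROWTERMINATOR = '\n',
KEEPNULLS -- empty fields stay NULL instead of getting the column default
);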
After comment:
Using SSIS "Bulk insert" task, options page, "Keep nulls" = true
This is what the import wizard uses, but you'll have to save the package and edit it first, because I see no such option in my SSMS 2005 wizard.
This can be set in the OLE DB Destination editor... there is a 'Keep nulls' option.
Alternative for those using the Import and Export Wizard on SQL Server Express, or anyone who finds themselves too lazy to modify the SSIS package:
Using text-editing software before you run the wizard, replace the NULLs with a valid value that you know doesn't appear in your dataset (e.g. 987654; be sure to do a search first!), and then run the Import and Export Wizard normally. If your data contains every single possible value (maybe bits or tinyints), you'll have some data massaging ahead of you, but it's still possible by using a temporary table with datatypes that can store a greater range of values. Once the data is in SQL Server, use commands like
UPDATE TempTable
SET Column1 = NULL
WHERE Column1 = 987654
to get those NULLs where they belong. If you've used a temporary table, use INSERT INTO or MERGE to get your data into your end table.
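For instance, a sketch with hypothetical staging and destination names:
-- move the cleaned-up rows from the staging table into the final table
INSERT INTO dbo.FinalTable (Column1, Column2)
SELECT Column1, Column2
FROM TempTable;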