I've imported some data into SQL Server using the data import/export tool.
I'm exporting from a .xls workbook to SQL Server 2012.
The issue is that two fields containing large(ish) numbers have been imported in a weird format.
Original Data:
532842549
What SQL Server Shows (Or Similar):
1.8327e+006
So far I have tried formatting the cells in Excel as Text, Number, and General, but nothing seems to work.
Is there a way I can cast this value directly in SQL server back to the original value?
The destination field is a varchar(50) field.
I tried exporting Excel data to MSSQL Server and it went smoothly. See the following steps.
Created a file with the following data.
Went to MSSQL Server and started the Import Data wizard: right-click on the database => Tasks => Import Data...
On the Column Mappings step of the wizard, I changed the type to bigint.
It's done. Check the final output.
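If re-importing isn't an option, you may be able to recover values already sitting in the varchar(50) column with a cast along these lines. This is only a sketch with made-up table and column names, and it assumes the scientific-notation text still carries the full value:

    -- Convert the scientific-notation text to a whole number,
    -- then back to text to fit the varchar(50) column.
    UPDATE dbo.ImportedData
    SET LargeNumber = CAST(CAST(CAST(LargeNumber AS FLOAT) AS BIGINT) AS VARCHAR(50))
    WHERE LargeNumber LIKE '%e+%';

If Excel rounded the value before export (as a literal like 1.8327e+006 would suggest), the lost digits can't be recovered this way, and fixing the column type at import time as described above is the safer route.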
First SSIS experience, so I'm willing to accept I'm doing things completely wrong here:
Using SSIS:
- I'm importing from an Excel sheet
- and exporting to a client's SQL Server database
- The data has >250 columns
- The client's database columns are all various nvarchar lengths like 3, 5, 8, etc.
- I can assume that the Excel data will fit into the database properly, so if I truncate I won't lose any data
What I think I have to do here is truncate the data using a "Data Conversion" transform. The problem is that it's going to take me hours to do this in the Data Conversion editor window because I'm dealing with so many columns, when it would only take me a few minutes in a text editor.
Is there any way to bulk update the Data Conversion settings? Am I doing this the wrong way?
The solution I ended up with was:
- Change the package to not fail on truncation.
- Once I did this, I could get rid of the transform.
- In the database, I created a staging table with the Excel column names to import into, so that I didn't have to manually match everything up in SSIS (a rough sketch of the database side is below).
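On the database side, that staging approach can be as simple as an all-nvarchar staging table plus a single INSERT ... SELECT that truncates with LEFT(). This is only a rough sketch with invented table and column names:

    -- Staging table with the same column names as the Excel sheet,
    -- wide enough that nothing is rejected on the way in.
    CREATE TABLE dbo.ExcelStaging (
        CustomerCode nvarchar(255),
        RegionCode   nvarchar(255)
        -- ...and so on for the remaining columns
    );

    -- Truncate to the client's column widths in one statement;
    -- a long SELECT list like this is quick to generate in a text editor.
    INSERT INTO dbo.ClientTable (CustomerCode, RegionCode)
    SELECT LEFT(CustomerCode, 5), LEFT(RegionCode, 3)
    FROM dbo.ExcelStaging;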
Is it even possible to import .csv flat file data into a SQL Server 2014 table using only the SSMS or SSIS Import/Export Wizards, if the table contains a varbinary(max) column?
I have searched for hours and hours, and tried numerous configurations and different data types (e.g. DT_IMAGE) in both the SSMS and SSIS Import/Export Wizards to perform a simple, quick-n-dirty import of .csv file data into a four-column table containing a varbinary(max) column in SQL Server.
I realize there are various other means to accomplish this by writing Transact-SQL, using bulk copy, adding column-import tasks, etc., but I find it hard to believe that I can't use the simple point-n-click configuration of the Import/Export Wizard, simply because the data happens to contain a varbinary(max) field, so I assume I must be doing something wrong.
Below are screenshots from the SSMS Import/Export Wizard... I get the same error in both SSMS and SSIS:
You can use DT_TEXT, an ANSI/MBCS character string with a maximum length of 2^31-1 (2,147,483,647) characters.
See Integration Services Data Types.
It will be varbinary(max) in the database.
I used the sample data you provided and imported it into the database.
I accidentally deleted YEARS of data in SQL Server Management Studio from a table. Unfortunately, the person in this position before me didn't back anything up, and neither did I before I tried to fix an issue. I understand that it cannot be retrieved from SQL, but I have all the data I need in a separate file on my desktop. Is there any way to get that data and input it back into the table that is in SQL? Or is there a query I can run to insert the data into the table again? I'm not sure if I am making any sense :/
You can also use Management Studio without SSIS. Right-click on the database in SSMS and select Tasks -> Import Data. You should then be able to select the type of source (flat file) and the format. The rest of the wizard is pretty self-explanatory.
If it is a flat file like .txt or .csv, or even an Excel file (.xls), you can build an SSIS package and dump the data into a new table. It depends on what kind of data you have in hand.
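If the file on your desktop is a delimited flat file, there is also a plain T-SQL route once the file is somewhere the SQL Server service account can read. This is only a sketch; the file path, table name, and delimiters below are assumptions:

    -- Re-load the saved rows into the existing table.
    BULK INSERT dbo.MyTable
    FROM 'C:\Restore\my_table_data.csv'
    WITH (
        FIELDTERMINATOR = ',',   -- column delimiter in the file
        ROWTERMINATOR   = '\n',  -- row delimiter
        FIRSTROW        = 2      -- skip the header row, if there is one
    );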
I have a table already stored in SQL Server with all the columns and the data types of the columns. I am now trying to import a txt file into SQL Server. But since all the fields in the file are in string format, it is not able to import the data into the server. Is there any way to change the data types coming from the txt file to match those of the table in SQL Server? There are 150 columns to import.
You may need to look into creating an SSIS package and adding a Data Conversion transformation to cast/convert the data being imported into the correct data types.
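If SSIS feels heavy for 150 columns, another common pattern is to bulk load the txt file into an all-varchar staging table and then convert the types in a single INSERT ... SELECT. A sketch only, with invented names; TRY_CONVERT needs SQL Server 2012 or later (use CONVERT on older versions):

    INSERT INTO dbo.TargetTable (OrderId, OrderDate, Amount)
    SELECT
        TRY_CONVERT(int, OrderId),
        TRY_CONVERT(date, OrderDate),
        TRY_CONVERT(decimal(10, 2), Amount)
        -- ...repeat for the remaining columns
    FROM dbo.TextStaging;

The SELECT list for 150 columns can be generated from INFORMATION_SCHEMA.COLUMNS rather than typed by hand.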
I have a SQL Server 2005 instance, into which I am trying to import data from a SQL Server 2008 instance using an SQL Query. I am using the 2008 management studio, and the import/export data wizard.
If I run the select query separately in the management studio, it correctly returns the ~88k rows that are required. The query returns the data with the exact column names and types required by the destination table.
When I run the import wizard, the SQL query parses correctly, and the 'Preview' button correctly shows the data. There are no errors or warnings in the conversion section. The task is set to fail if there are any failures in conversion.
When I run the task, no errors are displayed. However, it shows '0 rows transferred' and no data is imported.
Any ideas why?
edit: tried importing to a table created on import in a fresh new db, and still the same result. I'm wondering if the direction of movement from 2008 to 2005 is important (i.e. 2005 can't handle a 2008 feed correctly).
If you have "USE [database]" as part of your query, then this single line is all that gets executed during the import/export.
The solution is to remove the "USE" statement from your query.
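In other words, a source query shaped like the first pair of statements below transfers nothing, while the second behaves as expected (hypothetical names):

    -- Only the USE statement is executed by the wizard; 0 rows are transferred.
    USE SourceDb;
    SELECT Col1, Col2 FROM dbo.SourceTable;

    -- Works: drop the USE and fully qualify the object instead.
    SELECT Col1, Col2 FROM SourceDb.dbo.SourceTable;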
I've never had much luck with SQL Server Management Studio's import features under 2005/2008.
Lately, I do one of two things.
Either I just use it to import the data into a brand new table on the target server (not even a table of the exact same structure), and then run an insert statement to transfer from that newly created table into the destination.
Or I use a tool like Visual Studio for Database Professional (or Redgate) to transfer the data.
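For the first option, the transfer step is just an ordinary INSERT ... SELECT from the freshly imported table into the real destination, which is also a convenient place to rename, cast, or clean up columns (names below are invented):

    INSERT INTO dbo.Destination (CustomerId, CustomerName)
    SELECT CAST(cust_id AS int), LTRIM(RTRIM(cust_name))
    FROM dbo.ImportedStaging;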
I've used the import successfully. Chris has a good suggestion, though, about using a middle table or file (this is the process I typically follow). That gives you the ability to do some simple transforms using queries rather than SQL Server's transform tools. It also gives you a buffer in case things go wrong or you need to isolate issues.
I had a similar problem using the import utility for 2008. We had Oracle native SQL that ran fine in TOAD but imported 0 rows using the SQL Server import utility. The culprit ended up being Oracle's date format. The SQL referred to dates such as '03-JUN-2013'. Once I changed them to use the TO_DATE function, such as TO_DATE('06/03/2013','MM/DD/YYYY'), we had a successful execution and import.
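For example, a filter written like the first query below imported 0 rows for us, while the second worked (an illustration against a made-up table):

    -- Imported 0 rows through the wizard:
    SELECT * FROM orders WHERE order_date >= '03-JUN-2013';

    -- Worked once the literal went through TO_DATE:
    SELECT * FROM orders WHERE order_date >= TO_DATE('06/03/2013', 'MM/DD/YYYY');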