Exporting a query from SQL Server 2000 to a CSV file

This is the first time I've tried to create a .csv file myself. I want to create it using a query that I run against a database in SQL Server. Can anyone tell me how to do so? One more thing: if I change any data in the database, will the data in the .csv file update automatically?

You can use the bcp utility to export data from the database to a CSV file. More details can be found in the bcp documentation.
However, the file won't be auto-updated if data changes. You will need to regenerate the file.
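As a minimal sketch, assuming a hypothetical table dbo.Orders in a database called Sales and Windows authentication, a query can be exported to CSV from the command line like this:
Code:
bcp "SELECT OrderID, CustomerID, OrderDate FROM Sales.dbo.Orders" queryout orders.csv -c -t, -S myserver -T
Here -c writes character data, -t, sets a comma as the field terminator, and -T uses a trusted (Windows) connection; swap -T for -U/-P if you use SQL logins.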

Related

Importing data to an Oracle DB using Unix

I want to import data on a weekly basis into an Oracle DB.
I receive this data at a specific location on a server, in EDR format. For now I upload it manually using the Toad for Oracle upload wizard. Is there any way to upload it automatically using Unix or some kind of scripting?
I would suggest trying out SQL*Loader through a shell script.
Code:
sqlldr username/password@server control=loader.ctl
There are two important files:
a. the data file to be uploaded;
b. the control file, which states the table to be inserted into, the delimiter character, the column fields, etc. Basically, it describes how to load the data.
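For example, a minimal loader.ctl might look like this; it assumes the EDRs can be rendered as comma-delimited text in a hypothetical file edr_data.csv, loaded into a hypothetical table edr_records:
Code:
LOAD DATA
INFILE 'edr_data.csv'
APPEND INTO TABLE edr_records
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(record_id, event_time, duration)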
Oracle Reference
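To automate the weekly upload, a small wrapper script scheduled with cron would do. This is a sketch with hypothetical paths and credentials:
Code:
#!/bin/sh
# load_edr.sh: load every pending EDR file, then archive it
cd /data/incoming || exit 1
for f in *.edr; do
  sqlldr username/password@server control=loader.ctl data="$f" log="$f.log"
  mv "$f" /data/processed/
done
A crontab entry such as 0 2 * * 1 /home/oracle/load_edr.sh would then run it every Monday at 02:00.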

Does BCP create the destination object?

I apologize if I have a fundamental misunderstanding, but basically I want to know if there is a way to go directly from ONLY .bcp files to creating a SQL Server database. I have not dealt with .bcp files before; I have no format files, and I know nothing about the schema of the database we are trying to re-create. Is there some sort of utility within SQL Server Management Studio that can do what I am asking, or do I just not have enough resources to create a database out of this data? Any help is appreciated.
BCP only bulk-inserts the data; it does not create the destination objects. So you first need a database with the target tables already created before you can load the .bcp files into it.
You can then use the bcp command line to load those files. But if the database has foreign keys, you need to know the order in which to load the files.
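As a sketch, assuming a native-format file Customers.bcp and an already-created table dbo.Customers in a hypothetical database MyDb, the load would look like:
Code:
bcp MyDb.dbo.Customers in Customers.bcp -n -S myserver -T
The -n flag expects native-format data; use -c instead if the files turn out to be character format.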

How to copy data from one database to another on a different server?

I have 2 DBs with the same schema on different servers.
I need to copy data from table T to the same table T in a test database on a different server and network.
What is the easiest way to do it?
I heard that data can be dumped to a flat file and then inserted into the database. How does that work?
Can this be achieved using SQL*Plus and an Oracle database?
Thank you!
Use Oracle export to export a whole table to a file, copy the file to serverB and import.
http://www.orafaq.com/wiki/Import_Export_FAQ
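A sketch using the classic exp/imp utilities, with hypothetical credentials and TNS aliases:
Code:
exp user/password@serverA tables=T file=t.dmp
# copy t.dmp to serverB, then:
imp user/password@serverB tables=T file=t.dmp ignore=y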
You can use rsync to copy Oracle .dbf files to another server, but syncing an individual datafile is problematic; syncing the complete set of files works more reliably.
For groups of records, write a query to build a pipe-delimited (or whatever delimiter suits your data) file with the rows you need to move. Copy that file to serverB. Write a control file for sqlldr and use sqlldr to load the rows into the table. sqlldr is part of the Oracle installation.
http://www.thegeekstuff.com/2012/06/oracle-sqlldr/
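For the extract step, a SQL*Plus spool script is the usual trick. This sketch assumes table T with hypothetical columns col1 through col3:
Code:
SET PAGESIZE 0 FEEDBACK OFF HEADING OFF TRIMSPOOL ON
SPOOL /tmp/t_rows.dat
SELECT col1 || '|' || col2 || '|' || col3 FROM t WHERE somecolumn = 'somevalue';
SPOOL OFF
EXIT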
If you have db listeners up on each server and tnsnames knows about both, you can directly:
Code:
insert into mytable@remote
select * from mytable
where somecolumn=somevalue;
Look at the remote table section:
http://docs.oracle.com/cd/B19306_01/server.102/b14200/statements_9014.htm
If this is going to be an ongoing thing, create a db link from instance@serverA to instance@serverB.
You can then do anything you have permissions for with data on one instance or the other or both.
http://psoug.org/definition/CREATE_DATABASE_LINK.htm
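A sketch of creating such a link on serverA; the link name remote and the TNS alias serverB_alias are hypothetical:
Code:
CREATE DATABASE LINK remote
  CONNECT TO username IDENTIFIED BY password
  USING 'serverB_alias';

insert into mytable@remote select * from mytable;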

Speeding up ETL from DB2 to SQL Server?

I came across this blog post when looking for a quicker way of importing data from a DB2 database to SQL Server 2008.
http://blog.stevienova.com/2009/05/20/etl-method-fastest-way-to-get-data-from-db2-to-microsoft-sql-server/
I'm trying to figure out how to achieve the following:
3) Create a BULK Insert task, and load up the file that the Execute Process task created. (Note: you have to create a .FMT file for a fixed-width import. I created a .NET app to load the FDF file (the transfer description), which auto-creates a .FMT file for me, and a SQL CREATE statement as well, saving time and tedious work.)
I've got the data in a TXT file and a separate FDF with the details of the table structure. How do I combine them to create a suitable .FMT file?
I couldn't figure out how to create suitable .FMT files.
Instead, I ended up creating replica tables from the source DB2 system in SQL Server and ensured that the column order was the same as what was coming out of the IBM File Transfer Utility.
Using an Excel sheet to control which file transfers/tables should be loaded, allowing me to enable/disable them as I please, along with a Foreach Loop in SSIS, I've got a suitable solution to load multiple tables quickly from our DB2 system.
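For reference, a non-XML format file for a fixed-width import looks roughly like this; the column names, widths, and version header (9.0 is SQL Server 2005) here are hypothetical:
Code:
9.0
3
1  SQLCHAR  0  10  ""      1  AccountId   ""
2  SQLCHAR  0  25  ""      2  CustomerNm  ""
3  SQLCHAR  0  8   "\r\n"  3  OpenDate    ""
Each row gives the host field order, data type, prefix length, data length, terminator, server column order, server column name, and collation. It is then referenced from the load, e.g. BULK INSERT dbo.Accounts FROM 'C:\data\accounts.txt' WITH (FORMATFILE = 'C:\data\accounts.fmt');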

Loading many flat files into SQL Server 2005

I have a very annoying task. I have to load more than 100 CSV files from a folder into a SQL Server database. The files have column names in the first row, and the data type can be varchar for all columns. The table names in the database can just be the file names of the CSVs. What I currently do is use the Import/Export Wizard from SSMS: I choose Flat File from the dropdown box, choose the file, Next -> Next -> Next, and Finish! Any ideas how I can automate such a task in Integration Services or with any other practical method?
Note: The files are on my local PC and the DB server is somewhere else, so I cannot use BULK INSERT.
You can use an SSIS Foreach Loop container to enumerate the file names, filtering on a particular file-name pattern. Use a variable that the loop fills dynamically with each file name. Then, in a Data Flow Task, use a Flat File Source as the source and an OLE DB Destination as the destination.
Please post some sample file names so that I can understand the pattern and guide you properly.
Thanks
Achudharam
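As an alternative sketch: unlike BULK INSERT, bcp reads the data file on the client side, so a batch loop run from the local PC can work. This assumes the target tables (named after the files) have already been created:
Code:
@echo off
rem hypothetical loader: one pre-created table per CSV file
for %%f in (C:\data\*.csv) do (
  bcp MyDb.dbo.%%~nf in "%%f" -c -t, -F 2 -S dbserver -T
)
Here -F 2 skips the header row, and %%~nf expands to the file name without its extension.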
