Loading files into an existing table - Sybase

In Sybase ASE, is there any way to load a file into a table without using bcp?
I read something about the LOAD TABLE statement, but it doesn't work for Sybase ASE.

Related

Load all CSVs from path on local drive into AzureSQL DB w/Auto Create Tables

I frequently need to validate CSVs submitted by clients to make sure that the headers and values in the file meet our specifications. Typically I do this by using the Import/Export Wizard and having the wizard create the table based on the CSV (the file name becomes the table name, and the headers become the column names). Then we run a set of stored procedures that checks the information_schema for said table(s) and matches that up with our specs, etc.
Most of the time this involves loading multiple files at a time for a client, which quickly becomes time-consuming and laborious with the Import/Export Wizard. I tried using an xp_cmdshell SQL script to load everything from a path at once to get the same result, but xp_cmdshell is not supported by Azure SQL DB.
https://learn.microsoft.com/en-us/azure/azure-sql/load-from-csv-with-bcp
The above says that one can load using bcp, but it also requires the table to exist before the import... I need the table structure to mimic the CSV. Any ideas here?
Thanks
If you want to load the data into your target SQL DB, you can use Azure Data Factory (ADF) to upload your CSV files to Azure Blob Storage, and then use a Copy Data activity to load the data from those CSV files into Azure SQL DB tables - without creating those tables upfront.
ADF supports 'auto create' of sink tables.

How to copy data from one database to another on a different server?

I have 2 DBs with the same schema on different servers.
I need to copy data from table T to the same table T in a test database on a different server and network.
What is the easiest way to do it?
I heard that data can be dumped to a flat file and then inserted into the database. How does that work?
Can this be achieved using SQL*Plus and an Oracle database?
Thank you!
Use Oracle export to export a whole table to a file, copy the file to serverB, and import it there.
http://www.orafaq.com/wiki/Import_Export_FAQ
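For example, with the classic exp/imp utilities the round trip looks roughly like this (the user, password, connect strings, and file names below are placeholders, not from the original post):
On serverA, export table T from the OS shell:
exp scott/tiger@SOURCEDB tables=T file=t_export.dmp log=t_export.log
Copy t_export.dmp to serverB, then import it into the target instance:
imp scott/tiger@TARGETDB tables=T file=t_export.dmp log=t_import.log ignore=y
(ignore=y lets imp load the rows even though table T already exists on the target.)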
You can use rsync to sync an Oracle .dbf file or files to another server. This approach has problems, though, and syncing all of the datafiles works more reliably than syncing individual ones.
For groups of records, write a query to build a pipe-delimited (or whatever delimiter suits your data) file with the rows you need to move. Copy that file to serverB. Write a control file for sqlldr and use sqlldr to load the rows into the table; sqlldr is part of the Oracle installation.
http://www.thegeekstuff.com/2012/06/oracle-sqlldr/
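As a rough sketch of that flow (the extra column name here is made up for illustration): spool the delimited rows out on serverA with SQL*Plus:
set heading off feedback off pagesize 0 trimspool on
spool /tmp/mytable.dat
select somecolumn || '|' || othercolumn from mytable where somecolumn = somevalue;
spool off
Then a control file (mytable.ctl) on serverB could look like:
LOAD DATA
INFILE '/tmp/mytable.dat'
APPEND INTO TABLE mytable
FIELDS TERMINATED BY '|'
(somecolumn, othercolumn)
and you run it with: sqlldr userid=scott/tiger control=mytable.ctl log=mytable.log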
If you have DB listeners up on each server and tnsnames knows about both, you can do it directly:
insert into mytable@remote
select * from mytable
where somecolumn = somevalue;
Look at the remote table section:
http://docs.oracle.com/cd/B19306_01/server.102/b14200/statements_9014.htm
If this is going to be an ongoing thing, create a db link from the instance on serverA to the instance on serverB.
You can then do anything you have permissions for with data on one instance or the other or both.
http://psoug.org/definition/CREATE_DATABASE_LINK.htm
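A minimal sketch of both steps (the link name, credentials, and TNS alias are placeholders):
-- on the serverA instance, pointing at the serverB instance
create database link remote
connect to scott identified by tiger
using 'SERVERB_TNS_ALIAS';
-- then copy rows straight across the link
insert into mytable@remote
select * from mytable
where somecolumn = somevalue;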

Speeding Up ETL DB2 to SQL Server?

I came across this blog post when looking for a quicker way of importing data from a DB2 database to SQL Server 2008.
http://blog.stevienova.com/2009/05/20/etl-method-fastest-way-to-get-data-from-db2-to-microsoft-sql-server/
I'm trying to figure out how to achieve the following:
3) Create a BULK Insert task, and load up the file that the execute process task created. (Note you have to create a .FMT file for fixed-width import. I created a .NET app to load the FDF file (the transfer description), which auto-creates a .FMT file for me, and a SQL CREATE statement as well – saving time and tedious work.)
I've got the data in a TXT file and a separate FDF with the details of the table structure. How do I combine them to create a suitable .FMT file?
I couldn't figure out how to create suitable .FMT files.
Instead, I ended up creating replica tables from the source DB2 system in SQL Server and ensured that the column order was the same as what was coming out of the IBM File Transfer Utility.
Using an Excel sheet to control which file transfers/tables should be loaded (letting me enable/disable them as I please), along with a Foreach Loop in SSIS, I've got a suitable solution for loading multiple tables quickly from our DB2 system.
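For reference, in case someone still wants the .FMT route: a non-XML format file for a fixed-width import is just a text file listing each field's width and target column, and BULK INSERT points at it with the FORMATFILE option. A hedged sketch for a hypothetical three-column file (the widths, names, and paths are made up; in practice you would generate this from the FDF):
10.0
3
1  SQLCHAR  0  10  ""      1  CustomerId  SQL_Latin1_General_CP1_CI_AS
2  SQLCHAR  0  30  ""      2  Name        SQL_Latin1_General_CP1_CI_AS
3  SQLCHAR  0  8   "\r\n"  3  OrderDate   SQL_Latin1_General_CP1_CI_AS
(10.0 is the format-file version for SQL Server 2008; the fourth column is the fixed width of each field, and only the last field carries the row terminator.)
The matching load would then be:
BULK INSERT dbo.ReplicaTable
FROM 'C:\db2\export.txt'
WITH (FORMATFILE = 'C:\db2\export.fmt', TABLOCK);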

SSIS Data Validation and Data Loading

I need suggestions on the best approach from the options listed below. I need to validate Excel file data and load it into SQL Server.
Validations include:
Non-duplicate columns
Mandatory fields present
Fields not present in the database
In case of an error, I would write to an error-log table in the database.
Below is my approach:
Load the data into a temp table in the database
Run the validations
Log the errors
On success, load into the main tables
Please let me know if you have any better ideas for this scenario.
Here are a couple of approaches that are possible:
1. Using SSIS
Create an Excel connection manager, then use a Data Flow Task with an OLE DB source, a Lookup transform (to eliminate the records NOT needed), and an OLE DB destination directly into the main table.
You can also choose to redirect or ignore rows that do not satisfy the transformations.
(You can use a Bulk Insert Task instead if the Excel file is really large, rather than dealing with it RBAR.)
2. Using T-SQL
Use BULK INSERT, bcp, or OPENROWSET to load into a staging table. Beware that you need the appropriate drivers installed (JET for 32-bit or ACE for 64-bit SQL Server).
Then do the error handling by logging to the error table (RAISERROR, TRY...CATCH) before loading into the main table.
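A rough T-SQL sketch of that second approach (the staging, main, and error-log table names, the Excel path, and the validation rules are all hypothetical):
-- 1. Load the Excel sheet into a staging table (needs the ACE provider and ad hoc distributed queries enabled)
SELECT *
INTO dbo.StagingCustomers
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=C:\files\customers.xlsx;HDR=YES',
                'SELECT * FROM [Sheet1$]');
-- 2. Validate, log any error, and only then load the main table
BEGIN TRY
    IF EXISTS (SELECT CustomerId FROM dbo.StagingCustomers
               GROUP BY CustomerId HAVING COUNT(*) > 1)
        RAISERROR('Duplicate CustomerId values in the uploaded file.', 16, 1);
    IF EXISTS (SELECT 1 FROM dbo.StagingCustomers WHERE CustomerId IS NULL)
        RAISERROR('Mandatory field CustomerId is missing in some rows.', 16, 1);
    INSERT INTO dbo.Customers (CustomerId, Name)
    SELECT CustomerId, Name FROM dbo.StagingCustomers;
END TRY
BEGIN CATCH
    INSERT INTO dbo.ErrorLog (ErrorMessage, ErrorDate)
    VALUES (ERROR_MESSAGE(), GETDATE());
END CATCH;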

Loading many flatfiles into SQL Server 2005

I have a very annoying task. I have to load >100 CSV files from a folder into a SQL Server database. The files have column names in the first row. The data type can be varchar for all columns. The table names in the database can just be the file names of the CSVs. What I currently do is use the Import/Export Wizard from SSMS: I choose flat file from the dropdown box, choose the file, next -> next -> next, and finish! Any ideas how I can automate such a task in Integration Services or with any other practical method?
Note: The files are on my local PC and the DB server is somewhere else, so I cannot use BULK INSERT.
You can use SSIS: a Foreach Loop container to enumerate the file names (filtered to a particular pattern), with a variable that is dynamically filled with each file name. Then, in a Data Flow Task, use a Flat File source and an OLE DB destination.
Please post some sample file names so that I can understand and guide you properly.
Thanks
Achudharam
