How to export a table with code into a file

I would like to export a table from a Firebird database into a CSV file. With MySQL, I can use an SQL statement with additional clauses such as INTO OUTFILE. Here is an example:
SELECT a,b,a+b INTO OUTFILE '/tmp/result.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM test_table;
This query can be run from several SQL query tools (MySQL Workbench, HeidiSQL, ccenter).
Is it possible to submit an SQL statement like the one above, for example from FlameRobin, to export data from Firebird?
I could export via FlameRobin's menus or with another tool such as FBExport, but I prefer a single-statement solution that doesn't require clicking through menus or installing additional tools.

I know it's a bit late... but there is a way to export data using the ISQL OUTPUT command. It looks like this:
`OUTPUT C:\file_name.txt;`
`SELECT A, B, A+B FROM EXAMPLE_TABLE;`
`OUTPUT;`
The result of your SELECT statement will be written to that text file.
Remember to create the empty file before issuing the OUTPUT command.
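To keep the one-statement, no-menu workflow the question asks for, these commands can be saved to a script file and fed to Firebird's command-line isql non-interactively. A minimal sketch, where the script path, database path, and credentials are all placeholders:
`isql -user SYSDBA -password masterkey C:\data\mydb.fdb -i C:\scripts\export.sql`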

Firebird does not support that directly. You'll have to use some tool (either GUI or CLI) to do it.

Related

SQL Server export table to Excel

I'm trying to export a table to Excel/CSV, but I'm having trouble because of one column: it is long and has been concatenated with "char(10) + char(13)" as a delimiter for new lines. When I copy all the data from SQL Server Management Studio and use "Save As" to a CSV file, the output gets broken: every place there is a newline, the output spills across more than one row and breaks the column positions.
I also tried the Export Wizard (don't know if it makes a difference), but with no success: the export keeps failing on the last step with a warning of "potential lost conversion from nvarchar to longtext" and an error of "data conversion failed ..".
To allow multiline fields in csv, those fields have to be enclosed in quotes:
123,"multiline
field",456
789,second record,147
If this is not the case in your generated csv you might have to tell the generator to quote the fields.
If the quotes are already there the csv is valid and any decent reader should take care of those multiline fields. Of course, if you open the file in Notepad you'll still see multiple lines per record, which is normal.
To avoid such issues altogether, you can clean the data by replacing the carriage returns (char(13)) and line feeds (char(10)) in your SELECT statement, using a query like the following:
SELECT replace(replace([ColumnName], char(10), ''), char(13), '')
FROM [dbo].[yourTableName]
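If stripping the characters outright would run words together, a variation on the same query (same placeholder table and column) can substitute a space instead:
SELECT replace(replace(replace([ColumnName], char(13) + char(10), ' '), char(10), ' '), char(13), ' ')
FROM [dbo].[yourTableName]
The CRLF pair is replaced first so that it becomes a single space rather than two.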

Importing a *.DAT file into SQL Server

I am trying to import a *.DAT file (as a flat file source) into SQL Server using the SQL Server Import and Export Wizard. It has DC4 as the delimiter, which causes an error when the wizard tries to separate the columns and their respective data during the import.
Are there any settings to change during the import process?
If you don't have to use the wizard, you can script it like:
BULK INSERT [your_database].[your_schema].[your_table]
FROM 'your file location.dat'
WITH (
    ROWTERMINATOR = '0x04',  -- DC4 character
    MAXERRORS = 0,
    FIELDTERMINATOR = 'þ',
    TABLOCK,
    CODEPAGE = 'RAW'
);
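Note that BULK INSERT resolves the file path on the machine running SQL Server, not on the client issuing the statement. After the load, a quick count against the same placeholder table is an easy sanity check:
SELECT COUNT(*) FROM [your_database].[your_schema].[your_table];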
The wizard uses SSIS under the hood. Instead of executing the package directly, choose CrLF as the row delimiter, then choose to save the package as a file. Open the file and edit it with any text editor; it's a simple XML file.
It's not clear whether 0x04 is the column delimiter or the row delimiter. Assuming it's the row delimiter,
Replace all instances of
Delimiter="_x000D__x000A_"
with
Delimiter="_x0004_"
There are two instances: DTS:HeaderRowDelimiter and DTS:ColumnDelimiter.
Save the file and execute it with a double-click or "Open with: Execute Package Utility". I tested this solution on my PC using an account with limited permissions.

Exporting Specific Columns, Specific Rows from a Specific Table of a Specific Database in MySQL

I need to export only a subset of columns from a very large table that contains a large number of columns. The table also contains millions of rows, so I want to export only specific rows from it.
I have recently started using MySQL; earlier I was working with Oracle.
This worked for me:
mysql -u USERNAME --password=PASSWORD --database=DATABASE \
--execute='SELECT `field_1`, `field_2` FROM `table_name`' -X > file.xml
And then import the file with this command:
LOAD XML LOCAL INFILE '/pathtofile/file.xml'
INTO TABLE table_name(field_1, field_2, ...);
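Since the question also asks for specific rows, the same command accepts a WHERE clause in the query; here `field_3` and the criterion are placeholders:
mysql -u USERNAME --password=PASSWORD --database=DATABASE \
--execute='SELECT `field_1`, `field_2` FROM `table_name` WHERE `field_3` > 100' -X > file.xml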
What format do you need the data in? You could get a CSV using a query. For example:
SELECT column1, column2, column3, ... FROM table WHERE column1 = criteria1 AND ...
INTO OUTFILE '/tmp/output.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
http://www.tech-recipes.com/rx/1475/save-mysql-query-results-into-a-text-or-csv-file/
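Keep in mind that INTO OUTFILE writes the file on the database server, not on the client. On recent MySQL versions the target directory is also restricted by the secure_file_priv variable, which you can check with:
SHOW VARIABLES LIKE 'secure_file_priv';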
An administration tool like phpMyAdmin (http://www.phpmyadmin.net/) could also be used to run the query and then export the results in a variety of formats.

How to export data from an ancient SQL Anywhere?

I'm tasked with exporting data from an old application that uses SQL Anywhere, apparently version 5, maybe 5.6. I have never worked with this database before, so I'm not sure where to start. Does anybody have a hint?
I'd like to export it in more or less any text representation that then I can work with. Thanks.
I ended up exporting the data by using isql and these commands (where #{table} stands for each of the tables, from a list I built manually):
SELECT * FROM #{table};
OUTPUT TO "C:\export\#{table}.csv" FORMAT ASCII DELIMITED BY ',' QUOTE '"' ALL;
SELECT * FROM #{table};
OUTPUT TO "C:\export\#{table}.txt" FORMAT TEXT;
I used the CSV to import the data itself and the TXT to pick up the field names (parsing only the first line). The TXT can become rather large if you have a lot of data.
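The table list could probably be generated from the system catalog instead of built by hand. A sketch that works on later SQL Anywhere versions (the catalog layout in version 5 may differ):
SELECT table_name FROM SYS.SYSTABLE WHERE table_type = 'BASE';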
Have a read of http://www.lansa.com/support/tips/t0220.htm

SQL Server CSV extract of tables with newline, doublequotes and commas in columns?

I extracted some 10 tables to CSV with " as the text qualifier. The problem is that my extract does not look right in Excel because of special characters in a few columns: some columns break onto a new row when they should stay in the column.
I've been doing it manually using the Management Studio export feature, but what's the best way to extract the 10 tables to CSV with the double-quote qualifier using a script?
Will I have to escape commas and double quotes? What's the best way to do this?
How should I handle newline characters in my columns? We need them for migration to a new system, but the PM wants to open the files and make modifications in Excel. Can they have it both ways?
I understand that much of the problem is that Excel is interpreting the file, whereas a load utility into another database might not do anything special with newlines. But what about double quotes and commas in the data: if I don't care about Excel, must I escape those?
Many Thanks.
If you are using SQL Server 2005 or later, the Export Wizard will export the Excel file for you.
Right click the database, select Tasks-> Export Data...
Set the source to be the database.
Set the destination to excel.
At the end of the wizard, select the option to create an SSIS package. You can then create a job to execute the package on a schedule or on demand.
I'd suggest never using commas as your delimiter; they show up too frequently in other places. Use a tab, since a tab isn't easy to include in Excel tables by accident.
Make sure you never start a field with a space unless you want that space in the field.
Try changing the literal line feeds in your text into the two-character sequence \n. That is:
You might have:
0,1,"Line 1
Line 2", 3
I suggest you want:
0 1 "Line 1\nLine 2" 3
(assuming the separators between fields are tabs)
Good luck
As far as I know, you cannot have a newline inside a CSV column. If you know a column could contain a comma, double quote, or newline, you can use this SQL statement to extract the value as valid CSV:
SELECT '"' + REPLACE(REPLACE(REPLACE(CAST([yourColumnName] AS VARCHAR(MAX)), '"', '""'), char(13), ''), char(10), '') + '"' FROM yourTable
