Export query result in Pervasive to txt / csv file - export

I'm using Pervasive 10 with PCC (Pervasive Control Center) and I need to export a large result set (over 100,000 rows) to a TXT file. I know it's possible to use "Execute in Text", but this feature does not work for me because the program stops after exporting about 20,000 records. I have also changed the settings in PCC (Windows -> Preferences -> Text Output -> Maximum number of rows to display = 500,000).
Anyone know a way to export my query result to a txt file?

You should be able to use the Export Data function. Right-click on the table name in the PCC and select Export Data. From there, you can either execute the standard "select * from <table name>" or write a more complex query to pull only the data you need. You can set the delimiter to Comma, Tab, or Colon.
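For example, the query you run in the Export Data dialog can be any SQL you like; a minimal sketch (the table and column names below are just placeholders, use your own):
SELECT id, last_name, first_name
FROM persons
WHERE last_name LIKE 'S%'
Export Data should then write every row the query returns to the delimited file, rather than stopping at the display limit.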

Nice answer mirtheil, I was wondering about this myself as well.
To add something to the answer:
It does not matter which table you right-click and choose "Export Data" on, because your query will override the default table query.

Related

Turn off thousands separator in Snowflake Snowsight

I really like Snowflake's new Snowsight web console. One minor issue is that all the numeric columns are displayed with commas as thousands separators rather than just outputting the raw number.
For example, I have a bunch of UNIX epochs stored in a column called created_time. For debugging purposes I'd like to quickly copy and paste them into a WHERE clause, but I have to manually remove the commas from 1,666,719,883,332 to get 1666719883332.
Sure, it's a minor thing, but doing it several dozen times a day is really starting to add up to minutes.
I realize I could cast the column to a VARCHAR, but I'd rather find a setting that turns off this automatic thousands-separator behavior.
Does anyone know a way to turn it off?
Here is an example:
CREATE TABLE log (
    CREATED_TIME NUMBER(38,0),
    MSG VARCHAR(20000)
);
INSERT INTO log VALUES (1666719883332, 'example');
SELECT * FROM log;
which outputs:
CREATED_TIME        MSG
1,666,719,883,332   example
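For reference, the cast workaround I mentioned would look something like this against that table (Snowflake's :: cast or TO_VARCHAR both work), though I'd rather not have to change every query:
SELECT created_time::VARCHAR AS created_time, msg
FROM log;
That displays 1666719883332 without the separators, at the cost of the column no longer being numeric.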
Prepare to be amazed! The option to show/hide the thousands (000) separator is in the left corner.
I'd like to quickly copy and paste them into a WHERE clause, but I have to manually remove the commas from 1,666,719,883,332 to be 1666719883332.
The way I use it is the preview pane and the Copy button.

Import a CSV comma-delimited file into a SQL Server table

I need to import a .CSV file into a SQL Server table and I'm having problems due to a " appearing within a string.
I have found the problem: lines containing ,"32" Leather Bike Trs ", never get split into the correct columns.
I've been trying to solve this for hours; what am I missing here?
If it can't be done with the SSMS Import wizard, can it be done in SSIS, perhaps by importing each row as one big column and then using SQL or a C# script? What would be my next step to research?
Thanks.
Below is a sample line to put into a CSV file to try.
"Company","Customer No","Store No","Store Name","Channel","POS Terminal No","Currency Code","Exchange Rate","Sales Order No","Date of Sales Order","Date of Transaction","Transaction No","Line No","Division Code","Item Category Code","Budget Group Description","Item Description","Item Status","Item Variant Season Code","Item No","Variant Code","Colour Code","Size","Original Price","Price","Quantity","Cost Amount","Net Amount","Value Including Tax","Discount Amount","Original Store No","Original POS Terminal No","Original Trans No","Original Line No","Original Sales Order No","Discount Code","Refund Code","Web Return Description" "Motor City","","561","Outback","In-store","P12301","HKD","1","","","20160218","185","10000","MT","WW","Jeans","32" Leather Bike Trs ","In Stock","9902","K346T4","BK12","BK","12","180.00000000000000000000","149.00000000000000000000","1.00000000000000000000","34.12500000000000000000","135.45000000000000000000","149.00000000000000000000",".00000000000000000000","","","0","0","","","",""
You're right, the issue comes from a single " placed in your text. The fun fact is, if you had two " characters in your text, SSMS could handle it (as could many other tools).
Maybe you should consider changing the text qualifier of your file before implementing an SSIS package?
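If you do regenerate the file with a different text qualifier (say |), one way to load it afterwards without SSIS is BULK INSERT with the CSV format options; a sketch, assuming SQL Server 2017 or later and made-up table and path names:
-- File regenerated with | as the text qualifier, so the stray " inside the field is just data
BULK INSERT dbo.SalesImport
FROM 'C:\data\sales.csv'
WITH (
    FORMAT = 'CSV',      -- RFC 4180 style CSV parsing (SQL Server 2017+)
    FIELDQUOTE = '|',    -- the new text qualifier used when the file was regenerated
    FIRSTROW = 2,        -- skip the header row
    ROWTERMINATOR = '\n'
);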

Need help printing query results to a file/text

This SSMS newbie is trying to print the results of a query instead of having them directed to the grid.
I followed these steps:
Management Studio -> Tools -> Options ->
Query Results -> General -> Results to text, with C:\works as the default location for saving query results
Query Results -> SQL Server -> Results to Text -> Include column headers when copying or saving...
Yet when the query is executed, I don't see the results at all.
Could someone please shed some light on how I can get the query results saved to a file that I can print later on?
If you send the results direct to file, you can't see them in the Results pane in SSMS.
You have three choices:
Results to Text (plain text, in the Results pane)
Results to Grid (grid view, with resizeable columns & rows similar to Excel)
Results to File (writes direct to file, results not displayed)
You can choose between these options from the Query -> Results To menu, buttons on the Standard toolbar, or keyboard shortcuts (CTRL-T, CTRL-D, CTRL-SHIFT-F, in the order above). Select your output "mode", then execute the query.
With the first two options, you can right-click the results and save to a file from there. Or copy/paste elsewhere.
With Results to File, it will output the results in a file in your default location (c:\works\ in your case) but it should prompt you with the standard Windows File Save dialog.
You need to select "Results to File" not "Results to Text". When you then go back and run your query, you will not see any query results, just a prompt for the file name you want to save the results as.
You don't want Results to Text. You want Results to File. Then, when you execute the query, you'll be prompted for the file name to save under.
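If you prefer driving it from the query text itself, SSMS also has a SQLCMD Mode (Query -> SQLCMD Mode) where an :OUT command redirects the results to a file. A small sketch, with a placeholder path and query:
:OUT C:\works\results.txt
SELECT name, create_date FROM sys.databases;
GO
After the GO, the results land in C:\works\results.txt instead of the Results pane.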

Script output to file when using SQL-Developer

I have a SELECT query producing a big output and I want to execute it in SQL Developer and get all the results into a file.
SQL Developer does not allow a result bigger than 5,000 lines, and I have 100,000 lines to fetch...
I know I could use SQL*Plus, but let's assume I want to do this in SQL Developer.
Instead of using Run Script (F5), use Run Statement (Ctrl+Enter). Run Statement fetches 50 records at a time and displays them as you scroll through the results...but you can save the entire output to a file by right-clicking over the results and selecting Export Data -> csv/html/etc.
I'm a newbie SQLDeveloper user, so if there is a better way please let me know.
This question is really old, but posting this so it might help someone with a similar issue.
You can store your query in a query.sql file and run it as a script. Here is a sample query.sql:
spool "C:\path\query_result.txt";
select * from my_table;
spool off;
In Oracle SQL Developer you can run this script as shown below, and you should be able to get the result in your query_result.txt file.
@"C:\Path\to\script.sql"
Yes, you can increase the limit by changing the setting Tools -> Preferences -> Database -> Worksheet -> Max rows to print in a script (set it as high as you need).
Mike G's answer will work if you only want the output of a single statement.
However, if you want the output of a whole SQL script with several statements, SQL*Plus reports, and some other output formats, you can use the spool command the same way it is used in SQL*Plus.
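If you go the spool route for a big extract, a few SQL*Plus-style SET commands (most of which SQL Developer's script engine also honors) keep the spooled file clean; a sketch reusing the my_table placeholder from the example above:
set echo off
set feedback off
set pagesize 0
set linesize 32767
set trimspool on
spool "C:\path\query_result.txt"
select * from my_table;
spool off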

SQL 2005 CSV Import Quote Delimited with inner Quotes and Commas

I have a CSV file with quote text delimiters. Most of the 90,000 rows are fine, but I have a few rows that have a text field that contains both a quote and a comma. For example, the field's value would be:
AB",AB
When delimited, this becomes:
"AB"",AB"
When SQL 2005 attempts to import this I get errors such as...
Messages
Error 0xc0202055: Data Flow Task: The column delimiter for column "Column 4" was not found.
(SQL Server Import and Export Wizard)
This only seems to happen when a quote and comma are in a text value together. Values like
AB"AB which becomes "AB""AB"
or
AB,AB which becomes "AB,AB"
work fine.
Here are some example rows...
"1464885","LEVER WM","","B","MP17"
"1465075",":PLT-BC !!NOTE!!","","B",""
"1465076","BRKT-STR MTR !NOTE!","","B",""
"1465172",":BRKT-SW MTG !NOTE!","","B","MP16"
"1465388","BUSS BAR !NOTE!","","B","MP10"
"1465391","PLT-BLKHD ""NOTE""","","B","MP20"
"1465564","SPROCKET:13TEETH,74MM OD,66MM","ID W/.25"" SETSCR","B","MP6"
"S01266330002","CABLE:224"",E122/261,8 CO","","B","MP11"
The last row is an example of the problem - the "", causes the error.
I've had MAJOR problems with SSIS. Things that Access, Excel and even DTS seemed to do very well, SSIS chokes on. Variable record-length data is another problem but, yes, these embedded qualifiers are a major problem, especially if you do not have access to the import files because they're on someone else's server that you pay to gain access to, and they might even be 4 to 5 GB in size! You can't just do a "replace all" on that every import.
You may want to check into this tool from Microsoft Downloads called "UnDouble", and here is another workaround you might try.
It seems that with SSIS in SQL Server 2008 the bug is still there. I don't know why they haven't addressed this in the parser, but it's like we went back in time with SSIS in terms of basic import functionality.
UPDATE 11-18-2010: This bug still exists in SSIS. Amazing.
How about just:
Search/replace all "", with ''; (fix all the broken fields)
Search/replace all ;''; with ,"", (to "unfix" properly empty fields.)
Search/replace all '';''; with "","", (to "unfix" properly empty fields which follow a correct encapsulation of embedded delimiters.)
That converts your original to:
"1464885","LEVER WM","","B","MP17"
"1465075",":PLT-BC !!NOTE!!","","B",""
"1465076","BRKT-STR MTR !NOTE!","","B",""
"1465172",":BRKT-SW MTG !NOTE!","","B","MP16"
"1465388","BUSS BAR !NOTE!","","B","MP10"
"1465391","PLT-BLKHD ""NOTE""","","B","MP20"
"1465564","SPROCKET:13TEETH,74MM OD,66MM","ID W/.25"" SETSCR","B","MP6"
"S01266330002","CABLE:224'';E122/261,8 CO","","B","MP11"
Which seems to run the gauntlet fine in SSIS. You may have to apply step 3 recursively to account for three empty fields in a row ('';'';'';, etc.), but the bottom line here is that when you have embedded text qualifiers, you have to either escape them or replace them. Let this be a lesson for your CSV creation processes going forward.
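For illustration only, here is roughly what those three replacements could look like in T-SQL if you first load each raw CSV line into a single-column staging table (the table, column, and loading step are assumptions, not part of the original answer):
-- Hypothetical staging table: one raw CSV line per row, loaded by whatever means you prefer
CREATE TABLE dbo.RawCsv (line NVARCHAR(MAX));

DECLARE @q CHAR(1) = '''';   -- a single-quote character, kept in a variable so the literals stay readable

-- Step 1: "",  ->  '';   (fix all the broken fields)
UPDATE dbo.RawCsv SET line = REPLACE(line, '"",', @q + @q + ';');
-- Step 2: ;'';  ->  ,"",  (unfix properly empty fields)
UPDATE dbo.RawCsv SET line = REPLACE(line, ';' + @q + @q + ';', ',"",');
-- Step 3: '';'';  ->  "","",  (unfix empty fields that follow a correct embedded qualifier)
UPDATE dbo.RawCsv SET line = REPLACE(line, @q + @q + ';' + @q + @q + ';', '"","",');
The fixed lines can then be exported back out, or split into columns, with whichever tool you trust most.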
Microsoft says doubled double quotes inside double-quote-delimited fields just don't work. A fix is planned for the end of 2011...
In the meantime we will have to use workarounds like those described in the other answers.
I would just do a search/replace for ", and replace it with ,
Do you have access to the original file?
