How to import a file into an Oracle table

I have a .tbl file with data and I'm trying to import this data into a table. I'm using SQL Developer and running this command:
load data infile "C:\path\users.tbl"
insert into table users fields terminated by "|" lines terminated by "\r\n";
But nothing happens: the data is not loaded and no errors are shown... Do you see why it's not working?

That looks like SQL*Loader syntax.
For that to work, you'd have to run SQL*Loader, which is a separate command-line program available in your ORACLE_HOME/bin directory.
If you don't have an ORACLE_HOME, you'll need to install the client. Then open a shell/cmd window, and run your command there.
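For example, a minimal sketch: put the directives into a control file, say users.ctl (the column names here are hypothetical):
LOAD DATA
INFILE 'C:\path\users.tbl'
INSERT INTO TABLE users
FIELDS TERMINATED BY '|'
(user_id, user_name, email)
Then run it from a cmd window (the connect string is a placeholder):
sqlldr scott/tiger@orcl control=users.ctl log=users.log
Note that SQL*Loader takes its directives from a control file, and the record terminator defaults to the newline, so the lines terminated by clause isn't needed.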
OR, if you want to use SQL Developer, you can use our wizard to read the file and insert the data, row-by-row.

Related

How to generate Insert statement from PGAdmin4 Tool?

We are writing a new application, and while testing it we will need a bunch of dummy data. I've added that data by using MS Access to dump Excel files into the relevant tables in the Postgres database.
What should I do now to generate Insert statements from the pgAdmin4 tool, similar to what SQL Studio allows us to generate for SQL Server? There are no options available to me, and I can't use the closest one, which is to export and import the data via CSV.
I understand that you cannot import the CSV file into the actual DB, as that needs to be done through ASP.NET Core EF. You can, however, create a test schema and import the CSV file into it. Once you have the data in the test schema, you can use it to generate SQL statements with the steps below:
Right-click on the target table and select "Backup".
Select a file path to store the backup. You can name the file data.backup.
Choose "Plain" as the format.
Open the "Options" tab and check "Use Column Inserts".
Click the Backup button.
Once the file is generated, you can open it with Notepad++ or VSCode to get the SQL insert statements.
You can use the generated statements and then delete the test schema you created.
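The same export can be scripted with pg_dump, which is what pgAdmin's Backup dialog runs under the hood. A sketch, assuming the data was imported into a table named test_schema.users in a database named appdb (both names hypothetical):
pg_dump --data-only --column-inserts --table=test_schema.users --file=data.sql appdb
The --column-inserts flag corresponds to the "Use Column Inserts" option and makes pg_dump emit one INSERT statement per row.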
Here is a resource that might help you load data from an Excel file into PostgreSQL, if you still need to take that path: Transfer Data from Excel to PostgreSQL

Change the file encoding of the file which is created using SSIS Log provider for Text Files

I am new to SSIS. I have designed a package and configured the SSIS Log provider for Text Files.
This works fine and the log files are generated successfully.
We have a monitoring team that uses these log files, but they are unable to read them because the files are written in Unicode.
They expect a non-Unicode format for their monitoring.
I tried changing the existing log file's encoding to ANSI, but when I re-run the package the log file is recreated with Unicode encoding.
Is there any way to create log files with the SSIS Log provider for Text Files in a non-Unicode encoding? Kindly suggest a workaround; I have been unable to find a solution for the past two days.
Trying to figure out the issue
Since the SSIS Log provider for Text Files uses a File connection manager for logging, you cannot set the file encoding within the SSIS package, because this type of connection manager can be used for different file formats (Excel, text, ...).
From searching on this issue, it appears that if the log file is created for the first time by SSIS, it is written as Unicode.
why are my log files getting generated with a space between every two characters?
Why is my SSIS text logfile formatted in this way?
Possible workaround
Try creating an empty text file using Notepad and saving it with ANSI encoding.
Then select this file in the SSIS logging configuration.
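If you'd rather script that step than click through Notepad, a sketch from a cmd prompt (the path is hypothetical):
type nul > C:\logs\ssis_package_log.txt
The assumption is that an empty file with no Unicode byte-order mark behaves like the Notepad-created ANSI file in the experiments below, so SSIS appends plain text to it instead of recreating it as Unicode.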
Other helpful links
Change the default of encoding in Notepad
Add Logging with SSIS
Update 1 - Experiments
To test the workaround I proposed, I ran the following experiments:
I added SSIS logging and created a new log file.
After executing the package, the file was created in Unicode (to check this, I opened the file in Notepad and clicked Save As; the encoding shown in the combo box was Unicode).
I created a new file using Notepad and saved it with ANSI encoding, as mentioned above.
In SSIS, I changed the File connection manager to Use Existing instead of Create New and selected the file I had created.
After executing the package, the log was written to the file and the encoding was still ANSI.
I executed the package several more times and the encoding did not change.
TL;DR: Create a file with ANSI encoding outside the SSIS package. Within the package, create a File connection manager, select the Use Existing option, and choose the created file. Use this file connection manager for logging purposes.

Why is SQLite3 automatically encrypting a database I create?

I am trying to get a thorough understanding of sqlite3 so that I can run some basic queries through DB Browser for SQLite (http://sqlitebrowser.org/).
To do so, I took one month of NYC Taxi data and tried (for many hours) to import it into sqlite3:
.mode csv <Table_Name>
.import <path/to/file/data.csv> <Table_Name>
Once that finishes, I issue the following SQL statement:
.out <path/to/file/data.db>
select * from <table_name>;
Then, when I try to use DB Browser for SQLite to verify that the database has been populated with data, I get a prompt:
SQLCipher Encryption
Please enter the key used to encrypt the database
Why is it getting auto-encrypted? Is there another way to get my csv file into a database?
The message means that the file is not recognized as a database file. This can happen if the file is encrypted.
But in this case, the output generated by .output is just what would otherwise be printed on the screen; it is not a database file at all.
To get a copy of the entire database file, use .backup.
To get a copy of a single table, use .dump tablename, then execute the resulting SQL statements against a new database:
sqlite3 data.db < file_generated_by_dump
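For the original goal of producing a database file that DB Browser for SQLite can open, a minimal sketch: start the sqlite3 shell on the database file itself, then import the CSV into it (the CSV path and table name are placeholders):
sqlite3 data.db
sqlite> .mode csv
sqlite> .import C:/nyc/taxi_data.csv trips
sqlite> .quit
Because the shell was started on data.db, the imported rows land in an actual database file rather than in the text file produced by .output.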

Import SQL Server database from large script

A large SQL script generated by the SQL Server publishing wizard can't be opened in Management Studio; it returns an error about there not being enough available storage to open it.
Is there some other way to import the database from a large script (command line, maybe)?
Is this something you have to edit? If so, you may want to open it in Notepad++, TextPad, or EditPlus.
Here are some options I can think of:
Use the batch separator GO between sets of commands. Without GO, SSMS tries to execute the entire script as a single batch, which puts a much heavier load on memory than multiple smaller batches would (see the sketch after this list).
To run the script, you can use SQLCMD from the command line.
Also, for large scripts that load data, you may want to ensure that you have COMMIT commands in the script (where appropriate).
Consider splitting your script into multiple scripts.
If you split the script into multiple files, you can build the SQLCMD command lines into a single batch file and run all the scripts fairly quickly.
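For illustration, a sketch of both points, with hypothetical object, server, and file names. First, a script broken into GO-separated batches:
INSERT INTO dbo.Users (Id, Name) VALUES (1, 'alice');
INSERT INTO dbo.Users (Id, Name) VALUES (2, 'bob');
GO
INSERT INTO dbo.Users (Id, Name) VALUES (3, 'carol');
GO
Then a one-line batch file that runs every split script with SQLCMD (-E uses Windows authentication, -i names the input script, -o the log file):
for %%f in (C:\scripts\script_*.sql) do sqlcmd -S server\instance -d MyDatabase -E -i "%%f" -o "%%f.log"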
Have you tried using the OSql tool?

How can I parse Serv-U FTP logs with SSIS?

A while back I needed to parse a bunch of Serv-U FTP log files and store them in a database so people could report on them. I ended up developing a small C# app to do the following:
Look for all files in a dir that have not been loaded into the db (there is a table of previously loaded files).
Open a file and load all the lines into a list.
Loop through that list and use RegEx to identify the kind of row (CONNECT, LOGIN, DISCONNECT, UPLOAD, DOWNLOAD, etc.), parse it into a specific kind of object corresponding to that row type, and add that object to another list.
Loop through each of the different object lists and write each one to the associated database table.
Record that the file was successfully imported.
Wash, rinse, repeat.
It's ugly but it got the job done for the deadline we had.
The problem is that I'm in a DBA role and I'm not happy with running a compiled app as the solution to this problem. I'd prefer something more open and more DBA-oriented.
I could rewrite this in PowerShell but I'd prefer to develop an SSIS package. I couldn't find a good way to split input based on RegEx within SSIS the first time around and I wasn't familiar enough with SSIS. I'm digging into SSIS more now but still not finding what I need.
Does anybody have any suggestions about how I might approach a rewrite in SSIS?
I have to do something similar with Exchange logs. I have yet to find an easier all-SSIS solution. Having said that, here is what I do:
First, I use LogParser from Microsoft together with the bulk copy functionality of SQL Server 2005.
I copy the log files to a directory where I can work with them.
I created a SQL file that parses the logs. It looks similar to this:
SELECT TO_Timestamp(REPLACE_STR(STRCAT(STRCAT(date,' '), time),' GMT',''),'yyyy-M-d h:m:s') AS DateTime,
    [client-ip], [Client-hostname], [Partner-name], [Server-hostname], [server-IP],
    [Recipient-Address], [Event-ID], [MSGID], [Priority], [Recipient-Report-Status],
    [total-bytes], [Number-Recipients],
    TO_Timestamp(REPLACE_STR([Origination-time], ' GMT',''),'yyyy-M-d h:m:s') AS [Origination Time],
    Encryption, [service-Version], [Linked-MSGID], [Message-Subject], [Sender-Address]
INTO '%outfile%'
FROM '%infile%'
WHERE [Event-ID] IN (1027;1028)
I then run the previous SQL with LogParser:
logparser.exe file:c:\exchange\info\name_of_file_goes_here.sql?infile=c:\exchange\info\logs\*.log+outfile=c:\exchange\info\logs\name_of_file_goes_here.bcp -i:W3C -o:TSV
This outputs a BCP file.
Then I bulk copy that BCP file into a premade table in SQL Server with this command (use -T for a trusted connection instead of -U/-P if you prefer Windows authentication):
bcp databasename.dbo.table in c:\exchange\info\logs\name_of_file_goes_here.bcp -c -t"\t" -F 2 -S server\instance -U userid -P password
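For reference, bcp loads into an existing table, so the target table has to be created beforehand with columns in the same order as the BCP file. A partial sketch; the table name and data types here are assumptions:
CREATE TABLE dbo.ExchangeLog (
    [DateTime] datetime,
    [client-ip] varchar(50),
    [Client-hostname] varchar(255),
    [Partner-name] varchar(255),
    [Server-hostname] varchar(255),
    [server-IP] varchar(50),
    [Recipient-Address] varchar(320),
    [Event-ID] int,
    [MSGID] varchar(255)
    -- ...and so on for the remaining columns in the SELECT list above
);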
Then I run queries against the table. If you can figure out how to automate this with SSIS, I'd be glad to hear what you did.
