A while back I needed to parse a bunch of Serv-U FTP log files and store them in a database so people could report on them. I ended up writing a small C# app to do the following:
Look for all files in a dir that have not been loaded into the db (there is a table of previously loaded files).
Open a file and load all the lines into a list.
Loop through that list, use a regex to identify the kind of row (CONNECT, LOGIN, DISCONNECT, UPLOAD, DOWNLOAD, etc.), parse it into a specific kind of object corresponding to the kind of row, and add that object to another list.
Loop through each of the different object lists and write each one to the associated database table.
Record that the file was successfully imported.
Wash, rinse, repeat.
It's ugly but it got the job done for the deadline we had.
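To make the flow concrete, here is a rough shell sketch of the same pipeline. It is purely illustrative: every name in it is invented, and the real app parsed each row into typed objects rather than bulk copying raw lines.
#!/bin/sh
# Hypothetical sketch of the same flow: split each log by row type, bulk load, record it.
for f in /ftplogs/*.log; do
  base=$(basename "$f" .log)
  # (a real version would first check the file against the previously-loaded table)
  for kind in CONNECT LOGIN DISCONNECT UPLOAD DOWNLOAD; do
    # classify rows by regex, one staging file per row type
    grep -E " $kind " "$f" > "/tmp/$base.$kind"
    # load each per-type file into its matching staging table
    bcp logdb.dbo.stage_$kind in "/tmp/$base.$kind" -c -T -S myserver
  done
  # finally, record the file as imported in the tracking table
done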
The problem is that I'm in a DBA role and I'm not happy with running a compiled app as the solution. I'd prefer something more open and more DBA-oriented.
I could rewrite this in PowerShell but I'd prefer to develop an SSIS package. I couldn't find a good way to split input based on RegEx within SSIS the first time around and I wasn't familiar enough with SSIS. I'm digging into SSIS more now but still not finding what I need.
Does anybody have any suggestions about how I might approach a rewrite in SSIS?
I have to do something similar with Exchange logs, and I have yet to find an easier all-SSIS solution. Having said that, here is what I do:
First, I use LogParser from Microsoft and the bulk copy functionality of SQL Server 2005.
I copy the log files to a directory where I can work with them.
I created a SQL file that parses the logs. It looks similar to this:
SELECT TO_Timestamp(REPLACE_STR(STRCAT(STRCAT(date,' '), time),' GMT',''),'yyyy-M-d h:m:s') as DateTime,
    [client-ip], [Client-hostname], [Partner-name], [Server-hostname], [server-IP],
    [Recipient-Address], [Event-ID], [MSGID], [Priority], [Recipient-Report-Status],
    [total-bytes], [Number-Recipients],
    TO_Timestamp(REPLACE_STR([Origination-time], ' GMT',''),'yyyy-M-d h:m:s') as [Origination Time],
    Encryption, [service-Version], [Linked-MSGID], [Message-Subject], [Sender-Address]
INTO '%outfile%'
FROM '%infile%'
WHERE [Event-ID] IN (1027;1028)
I then run the previous sql with logparser:
logparser.exe file:c:\exchange\info\name_of_file_goes_here.sql?infile=c:\exchange\info\logs\*.log+outfile=c:\exchange\info\logs\name_of_file_goes_here.bcp -i:W3C -o:TSV
This outputs a bcp file.
Then I bulk copy that bcp file into a premade database table in SQL server with this command:
bcp databasename.dbo.table in c:\exchange\info\logs\name_of_file_goes_here.bcp -c -t"\t" -T -F 2 -S server\instance -U userid -P password
Then I run queries against the table. If you can figure out how to automate this with SSIS, I'd be glad to hear what you did.
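Short of a proper SSIS answer, the two steps at least chain mechanically. A sketch in shell syntax, reusing the placeholder names from above; the same two lines drop straight into a .cmd file or a SQL Agent CmdExec step:
#!/bin/sh
# Hypothetical wrapper: only run the bulk copy if the parse step succeeded.
logparser.exe "file:c:\exchange\info\name_of_file_goes_here.sql?infile=c:\exchange\info\logs\*.log+outfile=c:\exchange\info\logs\name_of_file_goes_here.bcp" -i:W3C -o:TSV || exit 1
bcp databasename.dbo.table in c:\exchange\info\logs\name_of_file_goes_here.bcp -c -t"\t" -T -F 2 -S "server\instance" -U userid -P password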
Using SQL Manager ver. 18.4 on SQL Server 2019 servers.
Is there an easier way to allow an end user with NO access to anything SQL-related to fire off some SQL commands that:
1) create and update a SQL table, and
2) then create a file from that table (CSV in my case) that they can access in a folder share?
Currently I do this using xp_cmdshell with bcp commands in a cloud-hosted environment, hence I am not in control of ANY permissions or access, etc. For example:
declare @bcpCommandIH varchar(200)
set @bcpCommandIH = 'bcp "SELECT * from mydb.dbo.mysqltable order by 1 desc" queryout E:\DATA\SHARE\test\testfile.csv -S MYSERVERNAME -T -c -t, '
exec master..xp_cmdshell @bcpCommandIH
So how I achieve this now is by letting the end users run a Crystal Report which fires a SQL stored procedure; that runs some code to create and update a SQL table, then creates a CSV file that the end user can access. Creating and updating the table is easy. Getting the table into the hands of the end user is nothing but trouble in this hosted environment.
We always end up with permission or other folder-share issues, and it's a complete waste of time. The cloud service admins tell me "this is a huge security issue and you need to start and stop xp_cmdshell with some commands every time you want to generate this file, to be safe".
Well, this is nonsense to me. I don't want to have to touch any of this, and it needs to be AUTOMATED for the end user, start to finish.
Is there some easier way to AUTOMATE a process for an END USER to create and update a SQL table and simply get the contents of that table exported to a CSV file without all the administration trouble?
Are there other, simpler options than xp_cmdshell and bcp to achieve this?
Thanks,
MP
Since the environment allows you to run a Crystal Report, you can use the report to create a table via ODBC Export. There are 3rd-party tools that allow that to happen even if the table already exists (giving you an option to replace or append records to an existing target table).
But it's not clear why you can't get the data directly into the Crystal report and simply export to csv.
There are free/inexpensive tools that allow you to automate/schedule the exporting/emailing/printing of a Crystal Report. See list here.
Here are the details:
The database has to be archived such that records older than 6 months are copied to a new database and deleted from the main (production) database. The complexity here is copying all rows, in all tables, that reference each other. After that, the copied rows from some of the tables (which are really huge and whose data is no longer needed) will be deleted.
The postgres database is an Amazon RDS instance.
What is the best way to achieve this?
I was thinking either a Spring Boot application,
OR
having postgresql.conf invoke a shell script which invokes a SQL batch.
For the second approach, I am not sure how to edit an Amazon RDS postgresql.conf file or where to specify the shell script. Where would the SQL batch be written? This is a little new to me; I'd appreciate any pointers.
It will be much faster if you do everything server-side instead of using a Spring Boot application. The problem is not the dump/restore, which you could easily do with the pg_dump utility or with
psql -d dbname -t -A -F";" -c "SELECT * FROM yourdata WHERE cutdate<=current_timestamp-interval '6 months'" > output.csv
But you have to guarantee that everything that is exported is loaded into the second database and that you do not delete anything that has not been exported.
I would first SELECT a subset of primary keys into a temporary table, then use the server-side COPY command to export the preselected keys (and all their dependencies):
COPY (SELECT d.* FROM yourdata d INNER JOIN temporal t ON d.pk = t.pk) TO '/tmp/yourdata.csv' WITH CSV DELIMITER ',';
After all the export files have been generated:
DELETE FROM yourdata WHERE pk IN (SELECT pk FROM temporal)
Then, on the backup database, do:
COPY yourdata(column1,column2,column3) FROM '/tmp/yourdata.csv' DELIMITER ',' CSV
You can write a script that invokes all those commands on the server side using the psql command-line tool, and finally move the exported files into a permanent location in case something goes wrong and you need to process them again.
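A minimal sketch of that script, assuming psql can reach the RDS endpoint (connection flags omitted) and using the table and column names from above. Note that on RDS the server-side COPY ... TO 'file' form is generally unavailable (it needs filesystem access on the server), so this uses psql's client-side \copy to the same effect:
#!/bin/sh
# Hypothetical archive script: select keys, export, and delete, all in one transaction.
psql -d dbname <<'EOF'
BEGIN;
CREATE TEMP TABLE temporal AS
  SELECT pk FROM yourdata WHERE cutdate <= current_timestamp - interval '6 months';
\copy (SELECT d.* FROM yourdata d INNER JOIN temporal t ON d.pk = t.pk) TO '/tmp/yourdata.csv' WITH (FORMAT csv)
DELETE FROM yourdata WHERE pk IN (SELECT pk FROM temporal);
COMMIT;
EOF
# Then, against the backup database:
psql -d backupdb -c "\copy yourdata(column1,column2,column3) FROM '/tmp/yourdata.csv' WITH (FORMAT csv)"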
See Save PL/pgSQL output from PostgreSQL to a CSV file and How to import CSV file data into a PostgreSQL table?
First, forgive me for my English; it is a little bad. Second, forgive my ignorance; I'm new to Postgres.
I'm having trouble when I try to bring a backup database up as another database. I need to dump the database just to get one table, but I only have the files that were in /var/lib/pgsql/data/base/
Here is what I tried:
I created a database named "test" with OID 227763, and I put the files of the old database (which had another OID) into this new database's directory. I fixed the folder and file permissions, but when I log into "test" and run select * from pg_tables; the tables do not appear. And when I try to create the table in phpPgAdmin, I get
ERROR: relation already exists
I'm trying to do this because I need to know which of these files is the table that I want. I will log into the database and run SELECT oid,* from pg_class; to get the OID.
I found the old database's OID in /var/lib/pgsql/data/global/pg_database.
If anyone can help me, I thank you.
There are many ways to back up and restore an entire database or a single table. It sounds like you need to be using pg_dump instead of working on individual files. A file-level copy is likely to corrupt your database if the server is not in backup mode and if you are not copying the entire thing plus archive logs.
If you MUST copy it by files, make sure the database is shut down for maximum safety.
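On the original question of working out which file belongs to which table: the files under base/<database-oid>/ are named after pg_class.relfilenode, which is not always the same as the table's OID, so a catalog query is the reliable map. For example (mytable is a placeholder; pg_relation_filepath needs PostgreSQL 9.0 or later):
-- list every ordinary table with the file name it is stored in
SELECT relname, relfilenode FROM pg_class WHERE relkind = 'r';
-- or get the path (relative to the data directory) for one table
SELECT pg_relation_filepath('mytable');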
For me, if I had one table to back up, I'd use pg_dump:
pg_dump -U {user-name} {source_db} -f {dumpfilename.sql}
You can use the -t flag to dump a single table if you like.
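For instance, with the same placeholder convention as above:
pg_dump -U {user-name} -t {table-name} {source_db} -f {dumpfilename.sql}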
I use a WordPress plugin called 'Shopp'. By default it stores product images in the database rather than the filesystem; I didn't think anything of this until now.
I have to move servers, and so I made a backup, but restoring the backup is proving a horrible task. I need to restore one table called wp_shopp_assets, which is 18MB.
Any advice is hugely appreciated.
Thanks,
Henry.
For large operations like this it is better to go to command line. phpMyAdmin gets tricky when lots of data is involved because there are all sorts of timeouts in PHP that can trip it up.
If you can SSH into both servers, then you can do a sequence like the following:
Log in to server1 (your current server) and dump the table to a file using "mysqldump" --- mysqldump --add-drop-table -uSQLUSER -pPASSWORD -hSQLSERVERDOMAIN DBNAME TABLENAME > BACKUPFILE
Do a secure copy of that file from server1 to server2 using "scp" ---
scp BACKUPFILE USER@SERVER2DOMAIN:FOLDERNAME
Log out of server 1
Log into server 2 (your new server) and import that file into the new DB using "mysql" --- mysql -uSQLUSER -pPASSWORD DBNAME < BACKUPFILE
You will need to replace the UPPERCASE text with your own info. Just ask in the comments if you don't know where to find any of these.
It is worthwhile getting to know some of these command line tricks if you will be doing this sort of admin from time to time.
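A variation on the same sequence, if you don't mind the password on the command line: pipe the dump straight into the new server over SSH and skip the intermediate file.
mysqldump --add-drop-table -uSQLUSER -pPASSWORD -hSQLSERVERDOMAIN DBNAME TABLENAME | ssh USER@SERVER2DOMAIN "mysql -uSQLUSER -pPASSWORD DBNAME"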
try HeidiSQL http://www.heidisql.com/
connect to your server and choose the database
go to menu "import > Load sql file" or simply paste the sql file into the sql tab
execute sql (F9)
HeidiSQL is an easy-to-use interface and a "working-horse" for web-developers using the popular MySQL-Database. It allows you to manage and browse your databases and tables from an intuitive Windows® interface.
EDIT: Just to clarify: this is a desktop application, and you connect to your database server remotely. You won't be limited by PHP's max script runtime or upload size limit.
Use BigDump.
Create a folder on your server which is not easy to guess, like "BigDump_D09ssS" or whatever.
Download the http://www.ozerov.de/bigdump.php importer file and add it to that directory after reading the instructions and filling out your config information.
FTP the .sql file to that folder alongside the BigDump script, then go to your browser and navigate to that folder.
Selecting the file you uploaded will start importing the SQL in split chunks, which is a much faster method!
Or, if that is an issue, I recommend the other answer about the SSH and mysql -u -p -n -f method!
Even though this is an old post, I would like to add that it is recommended not to use database storage for images once you have more than about 10 product images.
Instead of exporting and importing such a huge file, it would be better to switch the Shopp installation to file storage for images before transferring.
You can use this free plug-in to help you. Always backup your files and database before performing this action.
What I do is open the file in a code editor, then copy and paste it into a SQL window within phpMyAdmin. Sounds silly, but I swear by it for large files.
I'm looking for a command line tool to generate DDL for both tables and indexes (nothing more complicated is needed) for some Sybase tables in databases that I take care of. I have access to GUI tools for viewing the individual DDLs, and I could cut and paste them, but I would like something that will go through all the tables in a database and generate some nice text files that I can get checked into CVS.
I tried using a tool called ddlgen, which was provided by Sybase, but it just threw exceptions like this:
bash-3.00# ./ddlgen -SdatabaseServer:4100 -Uusername -PsecretPassword -TDB -NdatabaseName
U64: null: databaseName.dbo.firstTable
U64: null: databaseName.dbo.firstTable
at com.sybase.ddlgen.container.UserTableContainer.getDependentDDL(UserTableContainer.java:1065)
at com.sybase.ddlgen.container.UserTableContainer.open(UserTableContainer.java:1364)
at com.sybase.ddlgen.container.UserTableMetaContainer.open(UserTableMetaContainer.java:94)
at com.sybase.ddlgen.container.DDLBaseContainer.load(DDLBaseContainer.java:76)
at com.sybase.ddlgen.container.DatabaseContainer.addChildren(DatabaseContainer.java:552)
at com.sybase.ddlgen.container.DatabaseContainer.open(DatabaseContainer.java:104)
at com.sybase.ddlgen.container.DatabaseMetaContainer.open(DatabaseMetaContainer.java:114)
at com.sybase.ddlgen.DDLThread.run(DDLThread.java:89)
which wasn't very helpful. I keep thinking that there must be a nice Perlish way to do this, but I don't know what that would be.
You can also use the Perl-based dbschema.pl
http://www.isug.com/Sybase_FAQ/ASE/section9.html#9.3.2
Use the command below to get the definition:
defncopy -P tester1 -S sqppdb2 -U pmestr -D ppdb2 -o tab4 ppdb2..tab4
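To cover every table in the database, which is what the original question asks for, defncopy can be driven from a small loop. A rough sketch reusing the connection details above, assuming the isql client is available; the output filtering is crude, and defncopy's flag spellings and table support vary by version:
#!/bin/sh
# Hypothetical: write one DDL file per user table.
TABLES=$(isql -Upmestr -Ptester1 -Ssqppdb2 <<EOF | sed 's/ //g' | grep -Ev '^$|^name$|^-+$|affected'
use ppdb2
go
select name from sysobjects where type = 'U'
go
EOF
)
for t in $TABLES; do
  defncopy -Upmestr -Ptester1 -Ssqppdb2 out "$t.sql" ppdb2 "$t"
done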
Thanks
Download an evaluation version of Embarcadero DBArtisan and use its extract feature to get the DDL out.
You can turn logging on in DBArtisan (Logfile -> Log SQL) and then see what SQL it's sending to Sybase to get the table DDL. Copy and paste the SQL from the logfile into a script that you run from the command line, and that might work.
Apologies in advance if you are not using Windows...DBArtisan is Windows-only.
Another way of doing this is MyGeneration, a code generator (like CodeSmith, but open source) which uses templates to create code. That code could be anything you like: SQL, C#, etc. I use SQL Server, and I've used some of the freely available templates to create DDL as you specify, and to automagically create NHibernate mapping files too. Brilliant.
ddlgen will give you what you require and works very well. You seem to be having an environment issue with Java. Try again and post the error that you have in its entirety.