I have a SQLite database created from a honeypot. The database contains malware files. How can I extract these files from the SQLite database? Any help would be appreciated.
You can dump the whole database with:
echo .dump | sqlite3 database.sqlite > database.dump
Or just view the structure with:
echo .schema | sqlite3 database.sqlite
To get the files, you'll probably need a small script to extract the BLOBs into files. Post the schema of the database if you need help.
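As a minimal sketch, assuming a hypothetical table named files with an id column and a data BLOB column (adjust both to whatever your actual schema shows): recent builds of the sqlite3 shell ship a writefile() function that writes a BLOB straight to disk, so a one-liner can do the extraction:

sqlite3 database.sqlite "SELECT writefile('sample_' || id || '.bin', data) FROM files;"

Run this as an unprivileged user in an empty, isolated directory, since the extracted files are live malware.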
The sqlite3 command can easily interrogate an SQLite database: the .dump command will dump a given table, and the .output command lets you select a filename for the output before dumping.
If the data came from a honeypot, be very careful about the tools you use to inspect the contents: flaws have been found in terminals that allow malicious content to gain privileges on the system. Simply using 'cat' to inspect a file on such a terminal could grant the malicious program your complete set of privileges.
So, at a minimum, please use an unprivileged user account with no access to other data on the system. Using a tool such as AppArmor, SMACK, TOMOYO, SELinux, or LIDS to confine your tools to a small subset of system resources would be a good idea too. Virtualization could also work, but there have been plenty of 'breakouts' from those tools as well.
My question is about the following: I need to migrate a database from Oracle Database 10g Express Edition to 11g. I was given the backup on a pendrive; it is a file with the extension .dmp (a dump file).
I installed 11g Express Edition on a new server, which also installed the database that comes with it (XE).
I want to restore the 10g database to a drive other than C, which is where the Oracle 11g database is installed. I also want this new database to "replace" the XE one (I do not know if that is the correct way to say it).
I have only found instructions for adjustments and location changes within the same drive.
Any guidance would be very useful.
Thank you.
Judging from the comments, it sounds like you have been given a Database Dump file (.dmp) from a database on a pendrive, and you need to figure out how to get that file into a database, correct?
First, let's go over some background. What is a dump file (.dmp)? From Oracle:
The dump file set is made up of one or more disk files that contain table data, database object metadata, and control information. The files are written in a proprietary, binary format. During an import operation, the Data Pump Import utility uses these files to locate each database object in the dump file set.
At a high level, that .dmp file is a collection of DDL and DML statements that will recreate whatever data and objects were exported. .dmp files make it easier to transport and move large amounts of data between databases using Data Pump. But what is Data Pump? Again, from Oracle:
Oracle Data Pump technology enables very high-speed movement of data and metadata from one database to another.
Basically, Data Pump is a set of utilities (EXPDP & IMPDP) that are used to move data between databases. The .dmp file you have was likely created using EXPDP. You will need to use IMPDP to import that .dmp file into a database.
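For context, a dump file like yours is typically produced with an export command along these lines (the bracketed values are placeholders, matching the style of the import example further down):

expdp [USERNAME]/[PASSWORD]@[DATABASE] directory=DATA_PUMP_DIR dumpfile=[FILENAME].dmp logfile=[LOGFILE].log schemas=[SCHEMA]

You don't need to run this yourself; it's just to show where the file came from.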
Here's where it gets interesting - you say that you already have an 11g database, correct? If so, you should be able to import the 10g dump file directly into your 11g database without any issues. The reason is that Oracle tends to be backwards compatible: generally speaking, anything you do with one version of Oracle will be compatible with the version that immediately succeeds it. Jumping from something like Oracle 8i to 11g won't work, but you can always go from 8i to 9i, from 9i to 10g, and so on.
If you want to import that dump file into your 11g database, here's what you'll need to do:
Create a DBA account, or have an account that has been granted Data Pump privileges explicitly.
Move the .dmp file to the server where your 11g database lives. If you want to make it even easier for yourself, you can move the .dmp file to your database's datapump directory. If you don't know where that is, execute the following query on your database:
select * from all_directories where directory_name = 'DATA_PUMP_DIR';
This query will return a directory. You don't have to use this directory; it will just make it easier.
Once you have the dump file in place and you have all of the necessary database and operating system privileges, you are ready to import the dmp file. Open a new command line window, set your Oracle home if it is not already set, and then navigate to the directory where you placed the .dmp file. Your import command will look something like this:
impdp [USERNAME]/[PASSWORD]@[DATABASE] directory=[DIRECTORY] dumpfile=[FILENAME].dmp logfile=[LOGFILE].log
Where [USERNAME]/[PASSWORD] are your credentials, [DATABASE] is the name of the database you're importing the dump file into, [DIRECTORY] is whatever directory you placed the dump file in, [FILENAME] is the name of the .dmp file, and [LOGFILE] is whatever name you chose for your log file.
Assuming your database has everything necessary for the .dmp file, the import should begin and you will start seeing status updates that look similar to this:
Starting [USERNAME]."SYS_IMPORT_FULL_01": [USERNAME]/******** directory=DATA_PUMP_DIR dumpfile=[FILENAME].dmp logfile=[LOGFILE].log
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Note that this is just an example; your results may look different. Assuming all goes well, you will see a message like this at the end:
Job [USERNAME]."SYS_IMPORT_FULL_01" completed
If you don't want to import it into your existing 11g database, you could always spin up a new database and import the .dmp file to that one using these same guidelines.
That should be enough to get you started down the right path. Hope this helps, and good luck!
P.S. Here is a great FAQ on the Data Pump utilities as well: http://www.orafaq.com/wiki/Import_Export_FAQ
First and foremost, apologies for a very novice question here. I was just starting to get the hang of how the data pump dump worked in 11g when the customers I support moved to 12c. Darn the luck. :)
So a hopefully quick question: I'm using a coworker's Windows server running 12c to try to import a customer's Data Pump dump (I have their dump and log files), but I have no idea where to place the dump for import. When I run:
select * from dba_directories where directory_name='DATA_PUMP_DIR';
this is the output returned:
c:\ade\aime_v\oracle/admin/seeddata/dpdump/
That directory does not exist anywhere on this machine, plus it looks like an unusual directory path. (My coworker is on vacation, else I'd just ask of course).
So has something changed in 12c where it treats directories a bit differently? I keep thinking surely the one who created this server couldn't have just pointed the data pump dir to a non-existent path like that. I'm presently googling myself in circles, so I suppose the short question is simply where do I begin as a first step in figuring out where my directory is for dropping my dump import file?
Again, apologies for the embarrassingly newbie question, and thanks in advance for entertaining my question.
Use the following SQL query:
select * from dba_directories where directory_name='DATA_PUMP_DIR';
Oracle will let you create a directory metadata entry that points to a non-existent physical directory, which is what happened here. So whatever user has the CREATE ANY DIRECTORY privilege would need to (re-)create the Oracle directory metadata to point at an existing directory, and ensure the operating system permissions on the physical directory allow the Oracle user ID to read and write to it. As @OldProgrammer suggested, it looks like someone combined half Unix and half Windows path separators.
If there is a directory c:\ade\aime_v\oracle\admin\seeddata\dpdump (and your dump file is there), then the following should work; if you are importing as a non-DBA ID (some_user in the sample below), you will need to have the DBA grant read on that directory as well:
drop directory data_pump_dir;
create directory data_pump_dir as 'c:\ade\aime_v\oracle\admin\seeddata\dpdump';
-- grant read on data_pump_dir to some_user;
OrientDB can create a database from console commands by running the console.bat file in "..\orientdb\bin\". However, I need to create a separate .bat file that runs a command like this:
create database remote:localhost/test <root> <password> local graph
And then run that .bat file from PHP.
Thanks in advance!
Based on the old documentation at https://code.google.com/p/orient/wiki/ImportFromRDBMS it seems this is possible and is similar to what you want to do. You can run:
console.bat database.sql
where all your SQL commands are in database.sql. The link above includes a sample for porting a full database over, but I don't see why it should not work for similar scripts as well.
That said, I have not tested this yet.
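As a rough, untested sketch (the paths are placeholders, and <root>/<password> stand for your actual credentials): put the command in a script file, say create_db.sql, containing the single line
create database remote:localhost/test <root> <password> local graph
and then have PHP invoke the console with something like:
exec('C:\\orientdb\\bin\\console.bat C:\\scripts\\create_db.sql', $output, $returnCode);
exec() is a standard PHP function, but whether it is allowed depends on your server's PHP configuration (it is often blocked via disable_functions).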
I'm trying to code an MSSQL job that does something using the files in a specific directory. But I don't know the names of the files; they will vary over time.
I've found the xp_cmdshell command, but I cannot use it for security reasons.
Is there any other way in T-SQL to check whether a directory contains .txt files (and if so, to get their names)?
Thanks in advance,
Without access to the xp_ stored procedures, no. The other way would be to use sp_OACreate to instantiate a COM Scripting.FileSystemObject, but access to this may well be restricted too, as it's also a security issue.
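If OLE Automation does happen to be enabled (the 'Ole Automation Procedures' server option), a basic existence check looks roughly like this - a sketch only, with a made-up path:

DECLARE @fso INT, @exists INT;
EXEC sp_OACreate 'Scripting.FileSystemObject', @fso OUTPUT;
-- FolderExists returns nonzero if the path exists
EXEC sp_OAMethod @fso, 'FolderExists', @exists OUTPUT, 'C:\data';
EXEC sp_OADestroy @fso;
SELECT @exists AS folder_exists;

Enumerating the individual file names through sp_OAMethod is considerably more awkward, which is why the batch-file route below is usually the better option.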
As you're describing this as an MSSQL job, I'm assuming this is going to be a scheduled task of some description? If so, your best option is probably going to be a standard Windows batch file (.BAT), scheduled in SQL Server Agent, that does the existence checking and passes whatever files are found into your SQL script via sqlcmd/osql.
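A minimal sketch of such a batch file, with made-up paths, server name, and stored procedure (adjust all of these to your environment):

@echo off
rem Hand the name of every .txt file in the watch folder to the database
for %%f in (C:\data\*.txt) do (
    sqlcmd -S MYSERVER -d MyDatabase -E -Q "EXEC dbo.ProcessTextFile '%%f'"
)

Schedule it as a SQL Server Agent job step of type 'Operating system (CmdExec)'.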
I use a WordPress plugin called 'Shopp'. It stores product images in the database rather than the filesystem by default; I didn't think anything of this until now.
I have to move servers, and so I made a backup, but restoring the backup is proving a horrible task. I need to restore one table called wp_shopp_assets, which is 18MB.
Any advice is hugely appreciated.
Thanks,
Henry.
For large operations like this it is better to go to command line. phpMyAdmin gets tricky when lots of data is involved because there are all sorts of timeouts in PHP that can trip it up.
If you can SSH into both servers, then you can do a sequence like the following:
Log in to server1 (your current server) and dump the table to a file using "mysqldump" --- mysqldump --add-drop-table -uSQLUSER -pPASSWORD -hSQLSERVERDOMAIN DBNAME TABLENAME > BACKUPFILE
Do a secure copy of that file from server1 to server2 using "scp" ---
scp BACKUPFILE USER@SERVER2DOMAIN:FOLDERNAME
Log out of server 1
Log into server 2 (your new server) and import that file into the new DB using "mysql" --- mysql -uSQLUSER -pPASSWORD DBNAME < BACKUPFILE
You will need to replace the UPPERCASE text with your own info. Just ask in the comments if you don't know where to find any of these.
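For instance, a hypothetical run (every name and credential below is made up) might look like:

mysqldump --add-drop-table -ushoppuser -psecret -hdb1.example.com wordpress wp_shopp_assets > wp_shopp_assets.sql
scp wp_shopp_assets.sql deploy@server2.example.com:/tmp/
mysql -ushoppuser -psecret wordpress < /tmp/wp_shopp_assets.sql

(the mysql import is run on server2 after you log in there).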
It is worthwhile getting to know some of these command line tricks if you will be doing this sort of admin from time to time.
Try HeidiSQL: http://www.heidisql.com/
Connect to your server and choose the database.
Go to the menu "Import > Load SQL file", or simply paste the SQL into the SQL tab.
Execute the SQL (F9).
HeidiSQL is an easy-to-use interface and a "working horse" for web developers using the popular MySQL database. It allows you to manage and browse your databases and tables from an intuitive Windows® interface.
EDIT: Just to clarify, this is a desktop application; you will connect to your database server remotely. You won't be limited by PHP's max script runtime or the upload size limit.
Use BigDump.
Create a folder on your server with a name that is not easy to guess, like "BigDump_D09ssS" or whatever.
Download the importer file from http://www.ozerov.de/bigdump.php and add it to that directory after reading the instructions and filling out your config information (a sketch of that config block follows below).
FTP the .sql file to that folder alongside the BigDump script, then go to your browser and navigate to that folder.
Selecting the file you uploaded will start importing the SQL in split chunks, which is a much faster method!
Or, if that is an issue, I recommend the other answer about SSH and the mysql -u -p method!
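For reference, the configuration block near the top of bigdump.php looks roughly like this (variable names quoted from memory of the script, so double-check them against your copy; all values below are made up):

$db_server   = 'localhost';           // MySQL host
$db_name     = 'wordpress';           // database to import into
$db_username = 'shoppuser';           // made-up credentials
$db_password = 'secret';
$filename    = 'wp_shopp_assets.sql'; // the dump file you uploaded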
Even though this is an old post, I would like to add that it is recommended not to use database storage for images once you have more than about 10 product images.
Instead of exporting and importing such a huge file, it would be better to switch the Shopp installation to file storage for images before transferring.
You can use this free plug-in to help you. Always back up your files and database before performing this action.
What I do is open the file in a code editor, then copy and paste it into a SQL window within phpMyAdmin. Sounds silly, but I swear by it, even with large files.