Database generator with PHP and MySQL - database

How do I create a database generator that allows a user to create his own table in the database, without access to phpMyAdmin and without any knowledge of PHP or MySQL?

At the very least you need an SQL file to run so that the database can be created.
Create the SQL file (the whole backup of the database along with its subroutines), then just load the database via this command:
mysql -u'root' -p'password' databaseName < /path/to/file.sql
In order to run mysql, it has to be on your PATH environment variable.
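For illustration, such a dump file just contains plain CREATE statements; a minimal sketch could look like the following (the table, column and procedure names here are made up for the example, not taken from any real schema):
-- /path/to/file.sql: one example table plus one stored routine (the "subroutines" mentioned above)
CREATE TABLE IF NOT EXISTS products (
    id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    name VARCHAR(255) NOT NULL,
    price DECIMAL(10,2) NOT NULL DEFAULT 0.00
) ENGINE=InnoDB DEFAULT CHARSET=utf8;

DELIMITER $$
CREATE PROCEDURE add_product(IN p_name VARCHAR(255), IN p_price DECIMAL(10,2))
BEGIN
    -- insert one row; the caller supplies the name and price
    INSERT INTO products (name, price) VALUES (p_name, p_price);
END$$
DELIMITER ;
Loading this file with the mysql command above creates the table and the procedure in databaseName in one go.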

Related

Same tables with same data in different SQL Server instances on different PCs

I have one PC as the main database server, to which all clients are logging data in the main table. I have another two PCs lying around and I want to use them as backup servers. These backup servers will hold the data from the main table on the main database server. I am not sure how to achieve such a process and would really appreciate the help. My database server is Microsoft SQL Server Express Edition and the incoming data comes from APIs in ASP.NET Core. Usually, I would use Microsoft SQL Server Management Studio, extract a data-tier application from the table, and import that data-tier application on another PC with the same table name.
Main Database (Main PC) -> Second Backup Database (Second PC) and Third Backup Database (Third PC)
I have never done this before and I can't find the solution yet. I want to replicate the table from the main PC onto the other two PCs, not replicate the whole database.
I found that there is no replication feature in Express Edition. Is there any possible approach for this backup process?
As I said in my comment, you are going in the wrong direction.
First of all, you said:
I have another two pcs lying around and I want to use them as backup servers.
A backup server does not mean "replicate the table from the main PC onto the other two PCs, not the whole database": what can you do with the copy of one table if something happens to your main server?
A backup server should contain a transactionally consistent copy of your database; only this way can you redirect your applications to the backup server, and they will be able to work with it in case of a disaster on your main server. This means you should back up your database on the main server and restore it on the backup server. Backup/restore will provide you with a transactionally consistent copy of the database, and a bacpac won't.
As you are on Express Edition and cannot use SQL Server Agent you can write 2 scripts to backup and restore and launch them using sqlcmd. To schedule it you can use Windows scheduler.
Your backup script can look like this:
backup database MyDB to disk = 'path-to-backup-file' with init;
And your restore script looks like this:
restore database MyDB from disk = 'path-to-backup-file'
with move 'MyDB' to 'db-copy-path\MyDB.mdf',
move 'MyDB_log' to 'db-copy-path\MyDB_log.ldf',
replace;
Your cmd command looks like this:
sqlcmd -S myServer\instanceName -i C:\myScript.sql -U login_name -P password
Here you pass your backup or restore command in the file myScript.sql
You said your source address is 10.11.20.181 and the port is 5001. This means that to execute your backup script you should use the following:
sqlcmd -S 10.11.20.181,5001 -i C:\myBackupScript.sql -U login_name -P password
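As a hedged example of the scheduling step mentioned above, the Windows scheduler task could be registered from the command line roughly like this (the task name "MyDbBackup" and the 6-hour interval are placeholders of mine, not from the answer):
schtasks /create /tn "MyDbBackup" /tr "sqlcmd -S 10.11.20.181,5001 -i C:\myBackupScript.sql -U login_name -P password" /sc hourly /mo 6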
SQL Server does not provide SQL Server Agent in Express Edition either.
CREATE a linked server on your destination database to connect to the primary database.
Schedule an operating system scheduler task to execute a database script. In your database script you need to fetch the new records from the source database through the linked server, based on which rows were inserted or updated in the last n minutes.
Check those rows against your tables using a LEFT JOIN; if they do not exist, insert them into the table.
For better performance, insert the fetched data into a temp table first, then use the query below (a sketch of the linked-server setup follows after it).
INSERT INTO your_table
SELECT t.*
FROM #temp t
LEFT JOIN your_table y ON t.id = y.id
WHERE y.id IS NULL
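A minimal sketch of the linked-server piece referenced above, assuming the main server is reachable at 10.11.20.181,5001 and the table carries a last-modified timestamp; the alias MAINSRV, the credentials, the database name MainDb and the column name last_modified are my assumptions, not from the answer:
-- run once on the destination (backup) server: register the main server as a linked server
EXEC sp_addlinkedserver
    @server = 'MAINSRV',
    @srvproduct = '',
    @provider = 'SQLNCLI',
    @datasrc = '10.11.20.181,5001';
EXEC sp_addlinkedsrvlogin
    @rmtsrvname = 'MAINSRV',
    @useself = 'FALSE',
    @rmtuser = 'login_name',
    @rmtpassword = 'password';

-- run by the scheduled script: pull rows changed in the last n (here 30) minutes into #temp
SELECT *
INTO #temp
FROM MAINSRV.MainDb.dbo.your_table
WHERE last_modified >= DATEADD(MINUTE, -30, GETDATE());
-- then run the INSERT ... LEFT JOIN query shown above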
I tested a solution that fulfills my requirement with minimal steps.
I copied the PowerShell script from this link.
I also installed SqlPackage from Microsoft.
.\SqlPackage.exe /a:Export /ssn:ServerName /sdn:DatabaseName /tf:path-to-backup-folder\mybackup$(get-date -f dd-MM-yyyy-HH-mm-s).bacpac
and I created a Task Scheduler task on my backup PC to execute this script every 6 hours. I have another script that imports this data back into the database on the backup PC every 12 hours and deletes those bacpac files after the import. One thing to consider with this method is how big your database is, since I am exporting all the data every six hours; if your database is huge, this could cause performance issues, and I don't know what will happen if new rows are inserted or updated while this operation is executing.
I am really not sure what kind of errors will occur in the long run.
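For reference, the import half could be a similar one-liner; the server, database and file names below are placeholders of mine, and as far as I know SqlPackage's Import action expects the target database to be empty or not to exist yet:
.\SqlPackage.exe /a:Import /tsn:BackupServerName /tdn:DatabaseName /sf:path-to-backup-folder\mybackup.bacpac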

How to mirror a whole database cluster in PostgreSQL

I'm using a PostgreSQL (9.6) database in my project, which is currently in the development stage.
For production I want to use an exact copy/mirror of the database cluster with a slightly different name.
I am aware of the fact that I can make a backup and restore it under a different cluster name, but is there something like a mirror function via the psql client or pgAdmin (v4) that mirrors all my schemas and tables and puts them under a new cluster name?
In PostgreSQL you can use any existing database (which needs to be idle in order for this to work) on the server as a template when you want to create a new database with that content. You can use the following SQL statement:
CREATE DATABASE newdb WITH TEMPLATE someDbName OWNER dbuser;
But you need to make sure no user is currently connected to or using that database, otherwise you will get the following error:
ERROR: source database "someDbName" is being accessed by other users
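If you need to free the database for this, one possible approach (my addition, not part of the original answer) is to terminate the other sessions first and then retry the CREATE DATABASE:
SELECT pg_terminate_backend(pid)
FROM pg_stat_activity
WHERE datname = 'someDbName'
  AND pid <> pg_backend_pid();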
Hope that helped ;)

Extract Oracle database ERD with SQL*Plus

I want to ask about extracting an Oracle ER diagram (ERD) from SQL*Plus. My situation is that I must access the Oracle database from my VPS (the Oracle database is on another server). I need to use the VPS because it provides a domain that the Oracle database server uses to grant access to its database. What I know is that an ER diagram can be generated from Oracle SQL Developer, but my current VPS provides a CLI only. Unfortunately SQL Developer uses a GUI, so I can't use it. So how can I extract an ER diagram from an Oracle database without using SQL Developer, where the solution must not use a GUI application?
You can generate a plain text file containing the description for each of your tables, download this file from your remote machine to your local machine and load the content in an ERD tool such as Oracle DataModeler.
A common script to obtain a table description (including your goal, the foreign key constraints) is:
set heading off;
set echo off;
Set pages 1000;
set long 50000;
SET linesize 150;
spool My_ddl.sql;
select DBMS_METADATA.GET_DDL('TABLE','<your table 1>','<schema>') from DUAL;
select DBMS_METADATA.GET_DDL('TABLE','<your table 2>','<schema>') from DUAL;
select DBMS_METADATA.GET_DDL('TABLE','<your table 3>','<schema>') from DUAL;
.
.
.
spool off;
So basically you are still going to use a GUI tool to build your diagram, but the trick resides in how you get your source scripts.
If you are worried about how to generate the get_ddl instruction for all of your 10,000 tables, fear not, as you can run the following:
SELECT 'SELECT DBMS_METADATA.GET_DDL(''TABLE'','''||table_name||''', ''<LOCAL/FOREIGN SCHEMA>'') FROM dual;' as get_ddls
FROM all_tables
WHERE owner = '<User of the foreign schema where tables are stored>';
and then just copy-paste the results into the previous script.
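If copy-pasting thousands of generated lines is impractical, a variation I would suggest (same DBMS_METADATA call, just spooled into a second script and executed; the file names are arbitrary) is:
set heading off;
set feedback off;
set echo off;
set pages 0;
set long 50000;
set linesize 200;
spool gen_ddls.sql;
SELECT 'SELECT DBMS_METADATA.GET_DDL(''TABLE'','''||table_name||''','''||owner||''') FROM dual;'
FROM all_tables
WHERE owner = '<User of the foreign schema where tables are stored>';
spool off;
spool My_ddl.sql;
@gen_ddls.sql
spool off;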
In order to retrieve the file generated server side, you can always ask (kindly, of course) your DBA to hand it over to you, unless you have a way to access the server's directories/files directly.
Good luck.

Import DB2 files to SQL Server

Given the DAT file and the DDL file for each table in a DB2 database, can I import this data to SQL Server? I have no access to the original server or any copy of a DB2 server so connecting to a live instance isn't an option.
Can I do this without a live instance of DB2 or should I go back to the client and ask for CSV files? Is there a procedure or tool that makes this process smoother? I've tried to find a file-based connection string to use to connect to a set of DB2 files with no luck. I've also tried SwissSQLDB2ToSQLServer and SqlLinesData to see if they have a file-based option built in.
OK, given the comment above: you can't import DB2's container files (DAT, LRG, or anything else) directly. You need a CSV or equivalent. Yes, one way to get this is to run the EXPORT utility on a live DB2 database. HTH!
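For anyone who does get access to a live DB2 instance, the two halves could look roughly like this; the schema, table and file names are placeholders of mine, and quoted character fields in the DEL file may need extra handling on the SQL Server side:
-- DB2 side (run through the db2 command line processor): export one table to a delimited file
EXPORT TO /tmp/mytable.csv OF DEL SELECT * FROM MYSCHEMA.MYTABLE;

-- SQL Server side: bulk-load the delimited file into a table created from the supplied DDL
BULK INSERT dbo.MyTable
FROM 'C:\import\mytable.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');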

I have a 18MB MySQL table backup. How can I restore such a large SQL file?

I use a WordPress plugin called 'Shopp'. It stores product images in the database rather than the filesystem as standard; I didn't think anything of this until now.
I have to move servers, so I made a backup, but restoring the backup is proving to be a horrible task. I need to restore one table called wp_shopp_assets, which is 18MB.
Any advice is hugely appreciated.
Thanks,
Henry.
For large operations like this it is better to go to command line. phpMyAdmin gets tricky when lots of data is involved because there are all sorts of timeouts in PHP that can trip it up.
If you can SSH into both servers, then you can do a sequence like the following:
Log in to server1 (your current server) and dump the table to a file using "mysqldump" --- mysqldump --add-drop-table -uSQLUSER -pPASSWORD -h SQLSERVERDOMAIN DBNAME TABLENAME > BACKUPFILE
Do a secure copy of that file from server1 to server2 using "scp" ---
scp BACKUPFILE USER@SERVER2DOMAIN:FOLDERNAME
Log out of server 1
Log into server 2 (your new server) and import that file into the new DB using "mysql" --- mysql -uSQLUSER -pPASSWORD DBNAME < BACKUPFILE
You will need to replace the UPPERCASE text with your own info. Just ask in the comments if you don't know where to find any of these.
It is worthwhile getting to know some of these command line tricks if you will be doing this sort of admin from time to time.
try HeidiSQL http://www.heidisql.com/
connect to your server and choose the database
go to menu "import > Load sql file" or simply paste the sql file into the sql tab
execute sql (F9)
HeidiSQL is an easy-to-use interface and a "working-horse" for web-developers using the popular MySQL-Database. It allows you to manage and browse your databases and tables from an intuitive Windows® interface.
EDIT: Just to clarify. This is a desktop application, you will connect to your database server remotely. You won't be limited to php script max runtime, or upload size limit.
Use BigDump.
Create a folder on your server which is not easy to guess, like "BigDump_D09ssS" or whatever.
Download the importer file from http://www.ozerov.de/bigdump.php and add it to that directory after reading the instructions and filling out your config information.
FTP the .sql file to that folder alongside the bigdump script, then go to your browser and navigate to that folder.
Selecting the file you uploaded will start importing the SQL in split chunks, which is a much faster method!
Or, if that is an issue, I recommend the other answer about SSH and the mysql -u -p -n -f method!
Even though this is an old post, I would like to add that it is recommended not to use database storage for images when you have more than about 10 product images.
Instead of exporting and importing such a huge file, it would be better to switch the Shopp installation to file storage for images before transferring.
You can use this free plug-in to help you. Always back up your files and database before performing this action.
What I do is open the file in a code editor, then copy and paste it into an SQL window within phpMyAdmin. Sounds silly, but I swear by it for large files.
