I'm starting to use PostgreSQL now, and I wonder if I can schedule tasks / work (in SQL) to be done by the db without having to use pgAgent.
I'm working on a system where administrators need to schedule promotions. For example, from day X to day Y there is a Z promotion. This must be done from the system's interface (UI), on a page that will send the command to the database. All I need is to perform a SQL command when the proper time comes.
I have searched on the internet, and all I find is about pgAgent or how to configure it. I don't want that. From what I saw, pgAgent only works through the pgAdmin interface, and system administrators cannot lay a finger on pgAdmin... or can they? (I'm new to PostgreSQL.) :/
In pgAdmin, when creating a new job, I also clicked the Help button, but it doesn't say much beyond how to set everything up through the pgAdmin interface.
Is there any way to achieve this? Are there alternatives?
Thank you for your attention.
PgAgent doesn't work just through PgAdmin; rather, PgAdmin is the only (current) GUI that interfaces with the PgAgent tables. PgAgent is a service that interacts exclusively with its own set of tables, which can be modified and reported on by any software, not just PgAdmin. PgAgent can be very useful since it implements multi-step jobs, and the results are stored in the database, where they can be queried or turned into custom reports.
There are many alternatives, from developing your own PgAgent-like tool, to using cron in Linux/Unix/Cygwin or Scheduled Tasks in Windows.
For example, in Linux, a daily table export can be implemented in cron by adding a script in /etc/cron.daily/ like
sudo -i -u postgres psql -c "copy foo.bar to '/var/lib/dbexports/foo-bar.csv' with csv header" foo_db
or in a file in /etc/cron.d/ to run that export specifically on Mondays at 5:30, like
30 5 * * 1 postgres psql -c "copy foo.bar to '/var/lib/dbexports/foo-bar.csv' with csv header" foo_db
or similar on any user's crontab.
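To connect that back to the promotions use case: one sketch of a solution (the promotions table and its columns here are made up for illustration) is to have the UI insert rows with each promotion's start and end dates, and let a frequent cron entry in /etc/cron.d/ apply whatever is due:

# Hypothetical: every 5 minutes, flag promotions whose window is open
*/5 * * * * postgres psql -d foo_db -c "update promotions set active = (now() between starts_at and ends_at)"

That keeps the scheduling in one generic cron line, while the actual schedule lives in the database where the administrators' UI can edit it.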
Related
How do you schedule a job in SQL Server 2016? I've done this in 2005 but going through the tree in SSMS I don't see anything that resembles any type of scheduling that I am familiar with.
Searching finds me nothing for 2016. In older versions I saw references to Jobs and Agent, but I don't see any of those choices. Could I not have permission? Do they have new names? I also can't find the Activity Monitor, which I found very useful (especially for terminating my processes during debugging sessions).
As it turns out, I went through a similar situation and had to find a workaround.
It is actually quite simple. Just create a batch file that runs the query via the SQL Server command-line client, then schedule it to run through the OS.
Assuming you're running on Windows, use Task Scheduler to run a file that goes:
sqlcmd -S server_name -d database_name -Q "Query or procedure here"
I don't know if the nature of the job and the permissions you have would make this not feasible, but nevertheless, maybe it will be of help.
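To sketch that out a bit (the script path, task name, and procedure name are made up here): save the sqlcmd line in a batch file and register it with schtasks, e.g.

rem C:\scripts\run_job.bat (hypothetical) -- runs the job via sqlcmd
sqlcmd -S server_name -d database_name -Q "EXEC dbo.MyNightlyProc"

rem Register it once, from an elevated prompt, to run daily at 02:00:
schtasks /Create /SC DAILY /ST 02:00 /TN "NightlyDbJob" /TR "C:\scripts\run_job.bat"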
I am very new to Google App Engine; please help me solve my problem.
I have created an instance in Google Cloud SQL. When I import a SQL file, it shows me an error like this:
ERROR 1227 (42000) at line 1088: Access denied; you need (at least one of) the SUPER privilege(s) for this operation
How do I add the SUPER privilege to my instance?
As stated in the Cloud SQL documentation:
The SUPER privilege is not supported.
You can take a look at this page that explains how to import data to a Cloud SQL instance.
I also faced the same issue, but the problem was in the dumped SQL file. When exporting the database, use these flags:
--hex-blob --skip-triggers --set-gtid-purged=OFF
Here is the complete documentation on how to do it: https://cloud.google.com/sql/docs/mysql/import-export/importing. Once the data is exported, it can be imported using the command line, the gcloud shell, or the import option in the Cloud SQL console.
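Put together, the export command might look something like this (host, user, and database names are placeholders):

mysqldump --hex-blob --skip-triggers --set-gtid-purged=OFF -h HOST -u USER -p DATABASE > dump.sql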
I used the import feature of the Cloud SQL console and it worked for me.
I ran into the same error when backporting a gzipped dump (procured with mysqldump from a 5.1 version of MySQL) into a Google Cloud SQL instance of MySQL 5.6. The following statement in the sql file was the problem:
DEFINER=`username`@`%`
The solution that worked for me was removing all instances of it using sed:
cat db-2018-08-30.sql | sed -e 's/DEFINER=`username`@`%`//g' > db-2018-08-30-CLEANED.sql
After the removal, the backport completed with no errors. Apparently the SUPER privilege, which isn't available in Google Cloud SQL, is needed to use DEFINER.
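If your dump contains more than one definer, a more general (untested) variant of the same sed approach strips them all, whatever the user and host:

sed -e 's/DEFINER=`[^`]*`@`[^`]*`//g' db-2018-08-30.sql > db-2018-08-30-CLEANED.sql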
Another reference: Access denied; you need (at least one of) the SUPER privilege(s) for this operation
Good luck!
I faced the same issue. You can try giving the SUPER privilege to the user, but it isn't available in GCP Cloud SQL.
The statement
DEFINER=`username`@`%`
is an issue in your backup dump.
The workaround is to remove all such entries from the SQL dump file and import the data from the GCP console.
cat DUMP_FILE_NAME.sql | sed -e 's/DEFINER=`<username>`@`%`//g' > NEW-CLEANED-DUMP.sql
After removing the entries from the dump, you can try reimporting; it should complete successfully.
For the use case of copying between databases within the same instance, it seems the only way to do this is using mysqldump, to which you have to pass some special flags so that it works without SUPER privileges. This is how I copied from one database to another:
DB_HOST=... # set to 127.0.0.1 if using cloud-sql proxy
DB_USER=...
DB_PASSWORD=...
SOURCE_DB=...
DESTINATION_DB=...
mysqldump --hex-blob --skip-triggers --set-gtid-purged=OFF --column-statistics=0 -h $DB_HOST -u $DB_USER -p"$DB_PASSWORD" $SOURCE_DB \
| mysql -h $DB_HOST -u $DB_USER -p"$DB_PASSWORD" $DESTINATION_DB
Or if you just want to dump to a local file and do something else with it later:
mysqldump --hex-blob --skip-triggers --set-gtid-purged=OFF --column-statistics=0 -h $DB_HOST -u $DB_USER -p"$DB_PASSWORD" $SOURCE_DB \
> $SOURCE_DB.sql
See https://cloud.google.com/sql/docs/mysql/import-export/exporting#export-mysqldump for more info.
It's about how you export the data. When you export from the console, it exports the whole instance, not just the schema, which requires the SUPER privilege for the project in which it was created. To export data to another project, simply export by targeting the schema(s) in the advanced options. If you run into a "could not find storage or object" error, save the exported schema locally, then upload it to your other project's storage and select it from there.
In case somebody is searching for this in 2018 (at least August), the solution is:
Create a database. You can do this from the UI: just go to the Database menu and click "Create a database".
After you click "Import" and select your SQL dump (previously saved in a bucket), press "Show advanced options" and select your DB (not that advanced, are they?!). Otherwise, the default is the system mysql database, which of course cannot accept the import.
Happy importing.
I solved this by creating a new database in the SQL instance. (The default database is sys for MySQL.)
Steps (non-CLI version):
1) In GCP > SQL > Databases, create a new database, e.g. newdb
2) At the top of your SQL script, add: USE newdb;
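If you'd rather not edit the dump by hand, something like this (a sketch, assuming GNU sed) prepends the statement:

sed -i '1i USE newdb;' dump.sql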
Hope that helps someone
The SUPER privilege is reserved exclusively for GCP itself.
To answer your question: you need to import the data into your own database, one in which you have permissions.
I installed the DB2 client on my system for personal use. I can't work out where to create a database so that I can create tables in it and play around with SQL queries.
This is purely for personal use and learning purposes.
Could you explain how to create a dummy database to play around with using the DB2 client?
Regards,
Are you on Windows? Are you using DB2 Express-C? I'm assuming you are, since you say this is for personal work.
After you have the DB2 binaries installed, you should have been prompted with the "DB2 First Steps" application, where you could have the application go through a GUI to create the database for you. If you missed it, you should be able to find it in your start menu with Start -> Programs -> IBM DB2 -> DB2COPY1 (Default) -> Set-up Tools -> First Steps.
If you prefer the command line, you can use the following:
First, determine if your installation created a default instance (on Windows, this will be called DB2) by using the db2ilist command. In the DB2 Command Window, if you're not on the instance you want to create a database in, you can switch with the following command:
set db2instance=DB2
Be sure to not include spaces around the equal sign.
Now, in order to create a database in the current instance, you use this command:
db2 create database mydatabase
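After that, still in the DB2 Command Window, you can connect and start experimenting, for example:

db2 connect to mydatabase
db2 "create table test (id int, name varchar(50))"
db2 "insert into test values (1, 'hello')"
db2 "select * from test"
db2 connect reset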
For further reading, IBM has produced a Getting Started ebook, and I would highly recommend you check it out!
The DB2 client is just that - a client only. It does not include the database engine.
To create a database you have to install the server. The server includes the client portions.
I use a WordPress plugin called 'Shopp'. It stores product images in the database rather than the filesystem as standard, and I didn't think anything of this until now.
I have to move servers, and so I made a backup, but restoring the backup is proving a horrible task. I need to restore one table called wp_shopp_assets, which is 18MB.
Any advice is hugely appreciated.
Thanks,
Henry.
For large operations like this it is better to go to command line. phpMyAdmin gets tricky when lots of data is involved because there are all sorts of timeouts in PHP that can trip it up.
If you can SSH into both servers, then you can do a sequence like the following:
Log in to server1 (your current server) and dump the table to a file using "mysqldump" --- mysqldump --add-drop-table -uSQLUSER -pPASSWORD -h SQLSERVERDOMAIN DBNAME TABLENAME > BACKUPFILE
Do a secure copy of that file from server1 to server2 using "scp" ---
scp BACKUPFILE USER@SERVER2DOMAIN:FOLDERNAME
Log out of server 1
Log into server 2 (your new server) and import that file into the new DB using "mysql" --- mysql -uSQLUSER -pPASSWORD DBNAME < BACKUPFILE
You will need to replace the UPPERCASE text with your own info. Just ask in the comments if you don't know where to find any of these.
It is worthwhile getting to know some of these command line tricks if you will be doing this sort of admin from time to time.
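As a variation on the same trick (same placeholders as above, and assuming the same credentials work on server2), steps 1-4 can be collapsed into a single pipeline that streams the dump straight into the new server over SSH, with no intermediate file:

mysqldump --add-drop-table -uSQLUSER -pPASSWORD -h SQLSERVERDOMAIN DBNAME TABLENAME | ssh USER@SERVER2DOMAIN "mysql -uSQLUSER -pPASSWORD DBNAME"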
try HeidiSQL http://www.heidisql.com/
connect to your server and choose the database
go to the menu "Import > Load SQL file", or simply paste the SQL file's contents into the SQL tab
execute the SQL (F9)
HeidiSQL is an easy-to-use interface and a "working-horse" for web developers using the popular MySQL database. It allows you to manage and browse your databases and tables from an intuitive Windows® interface.
EDIT: Just to clarify. This is a desktop application, you will connect to your database server remotely. You won't be limited to php script max runtime, or upload size limit.
Use BigDump.
Create a folder on your server with a name that is not easy to guess, like "BigDump_D09ssS" or whatever.
Download the importer file from http://www.ozerov.de/bigdump.php and add it to that directory after reading the instructions and filling out your config information.
FTP the .sql file to that folder alongside the BigDump script, then navigate to that folder in your browser.
Selecting the file you uploaded will start importing the SQL in split chunks, which is a much faster method!
Or, if that is an issue, I recommend the other answer about the SSH and mysqldump/mysql method!
Even though this is an old post, I would like to add that it is recommended not to use database storage for images when you have more than about 10 product images.
Instead of exporting and importing such a huge file, it would be better to switch the Shopp installation to file storage for images before migrating.
You can use this free plug-in to help you. Always back up your files and database before performing this action.
What I do is open the file in a code editor, then copy and paste it into a SQL window within phpMyAdmin. Sounds silly, but I swear by it, even for large files.
I am working on my first WinForms application with a Firebird database shared over a network. Now I wonder how I should handle database backup and restore.
Up till now, my applications used embedded databases (SQLite), so I was sure that only my application accessed the database. The application itself was responsible for the backups and restores. I could simply copy the database file and that's it.
The backup was made:
Automatically at each application start
Automatically every week
Manually by user
When the user wanted to restore from a backup, he could do so anytime and could choose from any type of backup, all directly from my application.
For the new application, I've moved from SQLite to Firebird. I've chosen Firebird because the application will run with an embedded database by default, but it can also be used with a classic server. With Firebird, I can use both embedded and server with the same database file.
The problem is that when the database runs on a server, there can be many users working with the database at the same time, so I don't know how to handle backup and restore. Should I omit the backup/restore functionality in my app and let the admin make the backups on the server? Or should my app include backup and restore?
A shared database is simply totally new to me, so I don't know the best practices. Anyway, the database will be pretty small and there will be only a few users working at the same time.
Thanks, Petr
Don't just copy the database file while the server is running; the copy can end up corrupt.
Firebird is a relational database server. gbak is the official application to run hot backups.
Check this out: http://firebirdfaq.org/cat5/
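The basic gbak pattern (a sketch with placeholder paths and the default SYSDBA credentials) looks like this:

# Back up a running database into a portable backup file:
gbak -b -user SYSDBA -password masterkey /data/mydb.fdb /backups/mydb.fbk
# Restore the backup file into a new database file:
gbak -c -user SYSDBA -password masterkey /backups/mydb.fbk /data/mydb_restored.fdb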
On a shared server, you have several options for making backups:
Use a file-backup tool that supports Microsoft Volume Shadow Copy. This will take a snapshot of your database. Firebird was designed to "survive" such backups. However, restoring such a backup is like recovering from a power failure. On the other hand, if you need to instruct an IT department how to do it and monitor it, this is a serious option.
Use gbak.exe to make a copy of the database into a backup file while it is in use. Then, make a backup of that file. This is the recommended method, but in order for it to work properly, you need to inspect the exit code of gbak.exe to check that no error happened. Not all IT departments are able to do that.
However, on a shared server, you must always be paranoid: Most backups in big organizations cannot be restored, and usually the problem is that humans make mistakes. Therefore, I can recommend the third option, which is basically the combination of the first two:
Use gbak.exe to make a copy of the database into a backup file. If possible, monitor the exit code of gbak.
Use a Microsoft Volume Shadow Copy enabled backup program to make a backup of both the primary database and the backup file.
This should give you a nice backup file to restore, and if gbak should have failed and no one noticed, you can fall back to the raw snapshot of the running database file. Several people must make several mistakes for this to fail.
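A sketch of what the gbak step with exit-code surveillance could look like on Windows (paths, credentials, and the logging are made up for illustration):

rem backup_db.bat (hypothetical) -- back up the database and record whether gbak succeeded
gbak.exe -b -user SYSDBA -password masterkey C:\data\mydb.fdb C:\backups\mydb.fbk
if errorlevel 1 (echo gbak FAILED >> C:\backups\backup.log) else (echo gbak OK >> C:\backups\backup.log)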
If you are using a shared database, then you should probably take the backup/restore process out of your application, otherwise one user could corrupt or eliminate the work of another user.
You can use nbackup in C# as follows:
using System.Diagnostics; // required for Process

const String Usuario = "SYSDBA";
const String Contrasena = "masterkey";
// nivelRespaldo is the nbackup backup level (0 = full, 1+ = incremental)
String argumentos = String.Format(@"/C nbackup -U {0} -P {1} -B {2} BD.FDB",
    Usuario, Contrasena, (Int32) nivelRespaldo);
Process process = new Process();
process.StartInfo.WindowStyle = ProcessWindowStyle.Hidden;
process.StartInfo.FileName = "cmd.exe";
process.StartInfo.Arguments = argumentos;
process.Start();
process.WaitForExit(); // wait for nbackup to finish
process.Close();
In case you want to lock the database while making the backup:
String argumentos = String.Format(@"/C nbackup -U {0} -P {1} -L {2}",
    NombreArchivoRespaldo.Usuario, NombreArchivoRespaldo.Contrasena, Glo.NombreBaseDatos);
Don't forget to unlock it afterwards:
String argumentos = String.Format(@"/C nbackup -U {0} -P {1} -N {2}",
    NombreArchivoRespaldo.Usuario, NombreArchivoRespaldo.Contrasena, Glo.NombreBaseDatos);
If you want to close all the connections, try:
FbConnection.ClearAllPools();