I'm monitoring a few VMs with one Zabbix server, and I need to create another Zabbix server on another network. I think it's possible to export all of the configuration (templates, actions, discovery rules, ...) without the data and import all of that into the new Zabbix server.
I have seen the export method, but I don't know how to export actions. Maybe I could export directly from the database, but from which table?
I'm using Zabbix version 4.2 and it's a MySQL database.
You can use https://github.com/maxhq/zabbix-backup to back up the configuration from the first server and restore it on the second one.
This will copy everything, including hosts, and requires the two servers to use the same database type (in your case, both MySQL). The second server can run the same release as the first one, or a later one: on startup, zabbix-server is able to upgrade the database schema.
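As I understand it, the script is essentially a mysqldump wrapper that dumps the configuration tables in full and the large history/trends tables schema-only. A minimal manual sketch of the same round trip (database name, user and file name are my assumptions, not from the question):

```shell
# Manual sketch of the dump/restore round trip that zabbix-backup
# automates (the script additionally skips the big history/trends data).
# Database name, user and file name are assumptions - adjust as needed.
DUMP_CMD="mysqldump -u zabbix -p zabbix"
RESTORE_CMD="mysql -u zabbix -p zabbix"

# Shown as a dry run; drop the echo to actually execute each pipeline.
echo "$DUMP_CMD | gzip > zabbix_cfg.sql.gz"       # on the old server
echo "gunzip < zabbix_cfg.sql.gz | $RESTORE_CMD"  # on the new server
```

In practice, prefer the script itself, since it knows which Zabbix tables hold configuration and which hold collected data.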
I have an application that cannot be modified that connects to a SQL Server database using a hardcoded connection string with windows authentication.
I need to move the database to another server but as I cannot modify the hardcoded connection string - I am looking for something to act as a local connection that will then relay the query to the remote database and return the result back to the app.
The only other way I can see to do this is to upgrade from SQL Server Express and use database replication, but that would be an expensive option for what I need.
Can anyone suggest any software to do this or recommend an alternative method?
Thanks
Update:
The connection string also uses Windows authentication, which will not work on the remote server.
If your workstations don't need access to the old server, you could perhaps solve this with DNS, using a CNAME record to point the old server name at the new one. If you can't do this organization-wide, you might be able to use entries in the hosts file on the affected workstations.
I just saw this in the comments:
the database server is the same machine where the app runs (ie, 'localhost')
In that case, you want to figure out what server name the connection string is using, and a hosts entry should be able to accomplish this.
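For example, a hosts entry that points the old server name at the new machine would look like this (the IP address and host name are placeholders):

```
# /etc/hosts (Linux) or C:\Windows\System32\drivers\etc\hosts (Windows)
# Point the old SQL Server host name at the new server's address.
192.0.2.10   OLDSQLSERVER
```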
On the old server you can define a linked server pointing to the new one and create queries that reference the remote tables; you can use different credentials for it. You may run into some performance problems (update statements in particular may slow down).
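A minimal sketch of that setup, to be run against the old instance (server, database and table names and credentials are placeholders, not from the question):

```shell
# Linked-server sketch for the OLD instance; all names and credentials
# below are placeholders. Printed as a dry run here - pipe the script
# into sqlcmd (or paste it into SSMS) to actually execute it.
LINKED_SQL=$(cat <<'SQL'
EXEC sp_addlinkedserver   @server = N'NEWSRV', @srvproduct = N'',
                          @provider = N'SQLNCLI', @datasrc = N'newhost\SQLEXPRESS';
EXEC sp_addlinkedsrvlogin @rmtsrvname = N'NEWSRV', @useself = N'FALSE',
                          @rmtuser = N'sqluser', @rmtpassword = N'secret';
-- Remote tables are then reachable via four-part names:
SELECT * FROM NEWSRV.MyDb.dbo.MyTable;
SQL
)
echo "$LINKED_SQL"
```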
I'm trying to migrate the data from one instance of Oracle 11g Server to another. In a normal scenario, I'd use SQL Developer and export the database.
But in the current scenario, I'm not allowed to use SQL Developer and can access the Oracle DB only through the command-line SQL*Plus.
Can someone advise how to migrate data from one Oracle 11g server to another using the command line?
Data Pump is probably the best solution for this task. If you want to export the entire database, the command would look something like this:
expdp [USER]/[PASS]@[DATABASE] full=Y directory=TEST_DIR dumpfile=DataBaseExport.dmp logfile=DataPumpLog.log
More info can be found here: https://docs.oracle.com/cd/B28359_01/server.111/b28319/dp_export.htm#i1006388
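The matching import on the target server is impdp with the same kind of parameters; note that the directory object must exist on the target first. A sketch, shown as a dry run, with the same placeholders as the export above:

```shell
# Dry-run sketch of the import side; placeholders as in the export above.
# The directory object has to be created on the target first, e.g. in
# SQL*Plus as a privileged user:
#   CREATE DIRECTORY TEST_DIR AS '/path/to/dumps';
IMPDP_CMD="impdp [USER]/[PASS]@[DATABASE] full=Y directory=TEST_DIR dumpfile=DataBaseExport.dmp logfile=DataPumpImport.log"
echo "$IMPDP_CMD"
```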
I have been given Oracle database credentials as a read/write-only user.
Now, for my experiments, I want to export the data from the Oracle server to a local VM.
I tried copying and inserting the tables into Postgres using Pentaho, but that failed.
Is there any way I can export that Oracle data and import it locally?
Can I install some free Oracle edition on Ubuntu and then do something to get that data?
I don't know Oracle much.
There are certainly more sophisticated ways to export a database or a table on Oracle, but I usually export and copy databases with the help of Oracle SQL Developer. Just go to the Tools menu and select Database Export or Database Copy. Keep in mind that this is not a full db export, i.e. the users are usually not copied with this procedure.
The good thing is that you may connect this tool to any database server that has a Java connector, and that makes it easy to export the data into SQL Server or Postgres.
If you want a more sophisticated way to export the data, check out Recovery Manager (RMAN) and the documentation for its DUPLICATE command here: RMAN on Oracle
Use Oracle Data Pump. That is the full documentation for Oracle 10 Data Pump. Also you can find the documentation for 11g here and here.
impdp and expdp work from the command line. You can invoke them directly on the server, or from a machine that has the Oracle client installed and a connection to the server.
Considering you have limited rights on the server, you will not be able to do a full database export, but I guess you do not need to; you only need to export your working schema/tables. The documentation shows how to do that: use expdp to export only a schema and/or only some objects.
Once the export is done, you can copy the dmp file to any other computer (including a VM) and use impdp to import all the data to another Oracle database. You need to set up the new database before the import (import does not create it for you).
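For example, a schema-level export/import pair might look like this (the user, schema, connect strings and file names are placeholders; DATA_PUMP_DIR is a directory object that exists by default on most installations):

```shell
# Dry-run sketch; user, schema, connect strings and file names are
# placeholders - substitute your own. Drop the echoes to execute.
EXP_CMD="expdp scott/tiger@orcl schemas=SCOTT directory=DATA_PUMP_DIR dumpfile=scott.dmp logfile=scott_exp.log"
IMP_CMD="impdp scott/tiger@localdb schemas=SCOTT directory=DATA_PUMP_DIR dumpfile=scott.dmp logfile=scott_imp.log"
echo "$EXP_CMD"   # run against the source server
echo "$IMP_CMD"   # run against the local database after copying the dmp
```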
Basically, I'm developing a website using ASP.NET MVC4; when creating a model, I need to establish a connection to the database first.
When I selected the database name 7CH3LM1 (default in the drop down list), it said:
cannot connect to the database
but when I wrote it as 7CH3LM1\SQLEXPRESS, it let me through.
That raised a question: what's the real difference? What does 7CH3LM1 mean, and what's the point of adding SQLEXPRESS?
Any ideas are very welcome!
7CH3LM1 is your machine name, and it helps identify the instance of SQL Server you're connecting to. In this case, you are connecting to a named instance of SQL Server running Express Edition, and its instance name is SQLEXPRESS. A machine can run multiple instances of SQL Server concurrently, and it can have one default instance and 0, 1 or multiple named instances. To connect to the default instance, you would use 7CH3LM1, and to connect to a named instance, you would use 7CH3LM1\<instance name>, as you have with 7CH3LM1\SQLEXPRESS.
I typically rename my machines to something a little more memorable and meaningful, as that can make things much easier, but that's just me.
An instance of SQL Server can contain many databases. You can specify the database in your connection string as well (recommended), or you can rely on your login's default database to define initial context (not recommended, since that database could be offline, single user, detached, etc. - which will affect your ability to log in).
Ultimately, your software will need both an instance and a database in order to read / write data.
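For example, a typical ADO.NET connection string that names both the instance and the database might look like this (MyDatabase is a placeholder; Integrated Security=True means Windows authentication):

```
Data Source=7CH3LM1\SQLEXPRESS;Initial Catalog=MyDatabase;Integrated Security=True
```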
You're talking about an INSTANCE name. Multiple instances of SQL Server can exist on the same machine. If you do not specify the instance, you will connect to the DEFAULT instance of SQL Server on the machine (which may or may not exist). When you do specify one (after the "\"), it's known as a NAMED instance.
You can read more about instances here: msdn.microsoft.com/en-us/library/hh231298.aspx
In summary, the syntax for selecting an instance on MSSQL servers is
SERVERNAME\INSTANCENAME
or for the DEFAULT instance:
SERVERNAME
And, personally, I think the docs for SQL 2000 are much easier to read about DEFAULT vs. NAMED instances. You can find them here: http://msdn.microsoft.com/en-us/library/aa174516(v=sql.80).aspx
I'm being required to develop new software that must be built on top of SharePoint and use Microsoft SQL Server 2012.
I have a DB in Postgres, and some of its tables will be used in this new project, so I must import these tables every day. I'd like to use a web service to do it, but they want it done DB-to-DB directly.
Postgres-to-Postgres is already done and it "works", but importing into Microsoft SQL Server is proving troublesome.
Does anybody know of an MSSQL tool that can connect to Postgres and do the import?
Typically for this sort of situation (assuming it'll be a process that's repeated on a regular basis) you'd use SSIS (included with most editions of MS SQL Server). Have a look at the first several hits on this Google search, especially the first one.
Another option is to connect to the Postgres database directly from your application via ODBC, and eliminate the redundant copy of the data - and get real-time updates instead of having to wait for the next execution of the SSIS job.
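For the ODBC route, a connection string using the PostgreSQL ODBC driver would look roughly like this (the driver name depends on what's installed; host, port, database and credentials are placeholders):

```
Driver={PostgreSQL Unicode};Server=pghost;Port=5432;Database=mydb;Uid=appuser;Pwd=secret;
```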