For SQL Server, we can usually just send a copy of the database over to the offshore staff fairly easily.
Is this possible with the AS/400, or can they only VPN in to work?
Every database engine has a slightly different version of SQL. DB2 for i at V5R4 has differences to DB2 LUW 9.7 and both are different to SQL Server and MySQL at any version. So the quick answer is no, you can't simply make a copy of a DB2 for i database and run it on MySQL or SQL Server. You'd normally do exactly as you are doing with SQL Server: Have one machine here and another machine there and unload/reload the data as needed.
Having said that, the differences between SQL dialects are not usually crippling. Use the IBM Navigator for i and extract all of the DDL for the IBM database, then try to execute the DDL on the SQL Server machine. You'll have some syntax problems, but you should be able to work them out with someone who is knowledgeable in both dialects. Keep track of the changes to the DDL because you'll need them in order to extract the data from the IBM side.
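To give you a feel for the kind of dialect differences you'll hit, here's a made-up example: the first statement is roughly what the DDL generator on the IBM side might produce, the second is a hand-translated SQL Server equivalent (library becomes schema, TIMESTAMP becomes DATETIME2, and the default clause changes syntax). The table and column names are purely illustrative.

```
-- Roughly what generated DDL might look like on DB2 for i (names are made up)
CREATE TABLE MYLIB.CUSTOMER (
    CUSTNO   DECIMAL(7, 0) NOT NULL,
    NAME     CHAR(40)      NOT NULL,
    CHANGED  TIMESTAMP     NOT NULL WITH DEFAULT,
    PRIMARY KEY (CUSTNO)
);

-- A hand-translated SQL Server equivalent of the same table
CREATE TABLE dbo.CUSTOMER (
    CUSTNO   DECIMAL(7, 0) NOT NULL PRIMARY KEY,
    NAME     CHAR(40)      NOT NULL,
    CHANGED  DATETIME2     NOT NULL DEFAULT SYSDATETIME()
);
```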
Once you have the empty database created on the new machine, it's time to extract the data. Write some CL programs that use CPYTOIMPF to generate CSV files, flat files, or whatever it is that SQL Server wants in order to import properly. Then FTP that data to the new machine and write some scripts to do the import.
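On the SQL Server side, the import script can be as simple as a BULK INSERT per table, assuming the CSV files use plain comma delimiters. This is only a sketch; the path and table name are examples, and if the export quotes strings or has embedded commas you'll need a format file or different delimiters.

```
-- Sketch of the import side: load a CSV that was generated by CPYTOIMPF and FTP'd over.
-- Path, table and delimiters are examples only; adjust to match the actual export.
BULK INSERT dbo.CUSTOMER
FROM 'C:\Transfer\CUSTOMER.CSV'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    TABLOCK
);
```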
As you can tell, this is not going to be a simple process and it will take some time to develop and debug. I'd go with having the offshore staff using a VPN to the local IBM machine.
The easiest way I can think of would be to create a Save File (SAVF), then FTP that save file to the other IBM i and [restore it](http://pic.dhe.ibm.com/infocenter/iseries/v6r1m0/index.jsp?topic=/cl/rstobj.htm).
In the PC world this is similar to zipping up a directory, FTPing it to another machine and then unzipping it.
If this isn't what you mean, can you elaborate on what you're wanting?
The offshore site probably has their own SQL Server, probably running the same version as you.
But unless they also have an IBM Power System running the same release of IBM i, then they will most likely need to access your system.
I have a database on a server in Microsoft SQL Server Management Studio that I need to download so I can work on it locally.
When I right-click on the database and go to Tasks -- Export Data, I get the SQL Server Import and Export Wizard. I am able to pick a source, but I can't find the right destination option that would let me download the file locally.
I don't want to transfer the files to another server, I just want a local file to work with.
Is this the right approach? Or is there a better way to handle this task?
I don't want to transfer the files to another server....
A SQL Server database is a complex binary structure. To read it or work with it, you need a copy of SQL Server on the machine you want to work with it on. The Developer edition would be a good option for a local machine, or you could install the free Express edition and export data into a local database.
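If you go the Developer edition route, the usual approach is a plain backup and restore; something like the sketch below, where the database name, paths and logical file names are examples only (run RESTORE FILELISTONLY against the .bak to see the real logical names).

```
-- On the server: take a copy-only full backup to a file you can copy down
BACKUP DATABASE SalesDb
TO DISK = N'D:\Backups\SalesDb.bak'
WITH COPY_ONLY, INIT;

-- On your local Developer edition instance, after copying the .bak file over
RESTORE DATABASE SalesDb
FROM DISK = N'C:\Temp\SalesDb.bak'
WITH MOVE 'SalesDb'     TO 'C:\SQLData\SalesDb.mdf',
     MOVE 'SalesDb_log' TO 'C:\SQLData\SalesDb_log.ldf';
```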
If what you are trying to do is to just see the data without any of the SQL Server functionality, then you can export them to different types of files through the import/export functionality. However, unless the amount of data involved is quite small, I would really recommend against this. The organization and cross-referencing of data can be quite extensive depending on how the database was designed.
These are about your only two options. What you end up doing may depend on what you are planning to do with the data.
Our company has a SQL Server 2008 R2 database on one of our servers. We would like to be able to make a copy of this database and open it on a local machine; however, the size is greater than 10 GB, so the Express version won't do it. Is there a way we can open this locally without paying for another full license, since we do have one for running the database itself?
There is no way to restore a database over Express's 10 GB limit without a licence.
If we had more information about what you were trying to do with it, we might be able to suggest alternatives.
Is it for analysis? Perhaps you can come up with an SSIS package that copies over only the tables that you need (there's a rough T-SQL sketch of the same idea below).
Do you need it for backup? Perhaps there is another way to verify things.
Is it for testing? Well, you almost certainly need another licence in this case.
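For the analysis case, the "only what you need" idea can also be done in plain T-SQL over a linked server if you don't want to build an SSIS package. Everything here is hypothetical; the server, database, table and date filter are just placeholders.

```
-- Hypothetical: PRODSRV is the remote 2008 R2 instance, and we only pull what analysis needs
EXEC sp_addlinkedserver @server = N'PRODSRV', @srvproduct = N'SQL Server';

SELECT *
INTO dbo.Orders_Copy
FROM PRODSRV.SalesDb.dbo.Orders
WHERE OrderDate >= '20120101';   -- trim the copy down so it stays under Express's 10 GB limit
```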
The current application I'm working on, let's call it X, is an archiving application for the data kept in another application, say Y. Both are very old applications, developed about 8-odd years back. So far in my reading of the documentation, I have learnt that the transfer process works like this: a snapshot of the SQL Server database tables is taken into flat files, then these flat files are FTP'd to the correct Unix box, where, through CTL files, various insert statements are generated for the Oracle database, and that's how the data is transferred. It uses the bcp utility. I wanted to know if there is a better and faster way this could be accomplished. There should be a way to transfer the data directly; I feel the whole process of dumping to files, transferring them, and inserting must be really slow and painstaking. Any insights?
Create a DB link from your Oracle database to the SQL Server database, and you can transfer the data via selects/inserts.
Schedule the process using DBMS_SCHEDULER if this needs to be done on a periodic basis.
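As a rough sketch of what that looks like on the Oracle side (the link name MSSQL_LINK, the staging table and the schedule are all assumptions, and the target table has to exist already):

```
-- One-off pull over the database link (object names over an HS link are case sensitive,
-- hence the double quotes)
INSERT INTO staging_orders
SELECT * FROM "dbo"."Orders"@MSSQL_LINK;
COMMIT;

-- Run the same pull every night at 02:00 via DBMS_SCHEDULER
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'PULL_MSSQL_ORDERS',
    job_type        => 'PLSQL_BLOCK',
    job_action      => 'BEGIN INSERT INTO staging_orders SELECT * FROM "dbo"."Orders"@MSSQL_LINK; COMMIT; END;',
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=DAILY; BYHOUR=2',
    enabled         => TRUE);
END;
/
```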
You can read data from a lot of different database vendors using Heterogeneous Services. To use this, you create a service on the Unix box that uses ODBC (in this case) to connect to the SQL Server database.
You define this service in the listener.ora and you create a tns alias that points to this service. The alias looks pretty normal, except for the extra line (HS = OK). In your database you create a database link that uses this tns alias as its connect string.
UnixODBC in combination with the FreeTDS driver works fine.
The exact details vary between releases: for 10g look for hs4odbc, for 11g dg4odbc.
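Roughly, the pieces fit together as below. The alias, host, credentials and table are examples only, and the SID in CONNECT_DATA has to match the SID_NAME you configured for the gateway in listener.ora.

```
# tnsnames.ora entry on the Oracle server; note the extra (HS = OK) line
SQLSRV =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = oraclehost)(PORT = 1521))
    (CONNECT_DATA = (SID = SQLSRV))
    (HS = OK)
  )
```

```
-- Database link in the Oracle database that uses the alias above
CREATE DATABASE LINK mssql_link
  CONNECT TO "sqluser" IDENTIFIED BY "sqlpassword"
  USING 'SQLSRV';

SELECT COUNT(*) FROM "dbo"."Orders"@mssql_link;
```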
I have a client that uses a point-of-sale solution involving an Access database for its back-end storage. I am trying to provide this client with a service that involves, for SLA reasons, the need to copy parts of this Access database into tables in my own database server which runs SQL Server 2008. I need to do this on a periodic basis, probably about 5 times a day.
Is there an easy programmatic way to do this, or an available tool? I don't want to handcraft what I assume is a relatively common task.
I am running this on SQL Azure, so there's no way for me to run prepackaged software on the server. It would either have to be open source and portable to Azure or executable on the client's computer.
I'm unfortunately thinking I'm going to have to roll my own tool to do this. Before I go ahead, any suggestions, or are there other tools out there that can already do this?
David, I looked at multiple solutions for a similar problem, converting from DBF to MySQL. Here are 3 solutions (all commercial, but relatively inexpensive) that could work for you:
Full Convert
SQL Manager
ESF
Other than that, I couldn't find a good, robust data conversion tool that is open source or free. At least not for DBF to MySQL conversion; there might be something out there for SQL/Access. You could roll your own solution, but is it worth your time?
DISCLOSURE: I ended up using Full Convert.
Also, all of these products generate some sort of batch file, which can be scheduled using the Windows Task Scheduler.
There are two things to consider:
connectivity
ETL tool
For connectivity, you will need to establish a VPN tunnel of some sort between the client's server and your server.
Then use SSIS to connect to MS Access and create packages that pull data from MS Access into the SQL Server database. On SQL Server, you will need to create a new schema that mirrors, or stays close to, the MS Access schema.
On the connectivity side, another option: since the MS Access db is just a file, you may be able to FTP the file to your server and point SSIS at the file.
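If the file does end up on a full SQL Server box (this won't work on SQL Azure itself), a lighter-weight alternative to a full SSIS package is to query the .accdb directly with OPENROWSET and the ACE OLE DB provider. This is just a sketch; it assumes the provider is installed and ad hoc distributed queries are allowed, and the path and table names are placeholders.

```
-- Requires the Microsoft.ACE.OLEDB.12.0 provider and 'Ad Hoc Distributed Queries' enabled;
-- path and table names are examples only
EXEC sp_configure 'show advanced options', 1;  RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1;  RECONFIGURE;

SELECT *
INTO dbo.PosSales_Staging
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'C:\Transfer\pos_backend.accdb'; 'Admin'; '',
                'SELECT * FROM Sales');
```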
Hello,
I'm new here, so sorry if my question is too basic. However, maybe you have some advice, examples, or links which could help me... I've been trying to find something helpful for a few days, but no results so far.
I'm working in a distributed environment. I have an Oracle server hundreds of miles away and a MS SQL server close to me. I'm writing an application using Visual Web Developer 2008 Express. I need some data from Oracle. It's not worth querying the Oracle server every time I need some data from it. I'd prefer to run some Oracle queries once each night and store the results in some local (SQL Server) tables. I assume I should run the queries through the standard Windows scheduler (Windows Server 2008). I have the basic connectivity - I can open the Oracle database from local Visual Studio.
The questions are:
How do I write a query/procedure/function that would get data from Oracle and put it into a SQL Server table (possibly recreated before each query run)?
How can I run such a query from the command line (or otherwise run it from the scheduler)?
What naming conventions are applicable? In VS I use something like //IP.IP.IP.IP/Name and a user with password.
Thanks for any help or advice.
Regards,
Matteo
I suggest you speak to the DBAs of the Oracle and SQL Server databases, as there may be other considerations you need to bear in mind (data integrity, security, ownership, etc.).
One route you could follow would be to implement DTS (for older versions) or SSIS (for newer versions of SQL Server) processes to copy the data across on the schedule you want. (This is pretty much what they were built for.)
How much data are we talking about?
If there is a small quantity that you need to transfer every day, you can write a simple fetch-and-insert script in the language of your choice.
You only need to search for better solutions if the "sync" would take too many resources.
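To make the fetch-and-insert idea concrete: one common way with SQL Server 2008 Express is a linked server to Oracle plus a small T-SQL script, run nightly by the Windows Task Scheduler (Express has no SQL Server Agent). All names below - the linked server ORA_LINK, the table and the columns - are placeholders.

```
-- nightly_pull.sql : refresh a local copy of an Oracle table
-- ORA_LINK is a linked server that has already been set up against the Oracle instance
IF OBJECT_ID('dbo.OracleOrders') IS NOT NULL
    DROP TABLE dbo.OracleOrders;

SELECT *
INTO dbo.OracleOrders
FROM OPENQUERY(ORA_LINK, 'SELECT order_id, customer_id, amount FROM orders');
```

The scheduled task then just runs something like `sqlcmd -S .\SQLEXPRESS -d MyAppDb -i nightly_pull.sql` once a night.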
Thanks...
I'm the DBA for the SQL Server, which will serve only my application. For Oracle I just want to read data, and I have enough privileges and an agreement with the DBAs. Security, ownership and integrity are not an issue for now. I just need some technical advice on how to get data from Oracle into MSSQL tables on a schedule.
I use MS SQL Server 2008 Express SP1. I'm very close to solving my problem - I have established the connections and everything is installed and working. I just don't know how to run a query that gets data from Oracle and puts it into MSSQL on a regular basis, without manual interaction.
I've some experience in programming, but not much in databases (except creating complex SQL queries). Therefore some examples or links to detailed descriptions would be helpful. I'm not sure about naming conventions, the differences between procedures, functions and queries, command line options to run db automation procedures, and so on. I'm also not sure about which mechanisms or technologies are available in the MS SQL Server 2008 Express edition.