How can I migrate MariaDB to Aurora - database

I have a running MariaDB database and I want to migrate it to Amazon Aurora. How can I do that?

This migration is doable in multiple ways. A few questions to begin with:
Which version of MariaDB are you currently on, and which version of Aurora MySQL are you planning to move to?
Are you already on RDS MariaDB? If yes, the migration might be simpler with a snapshot restore.
If you are not currently on RDS MariaDB:
MariaDB 10.0 to Aurora 5.6, and MariaDB 10.1, 10.2, or 10.3 to Aurora 5.7, should be doable without much complication. This is based on the compatibility shown here: https://mariadb.com/kb/en/mariadb-vs-mysql-compatibility/. You would most likely export your data using mysqldump and then import it via SQL statements or the LOAD DATA FROM S3 feature that Aurora supports.
For any other combination, you may need additional steps, since you'd be crossing compatibility boundaries.
Doc: https://docs.aws.amazon.com/dms/latest/sbs/CHAP_MySQL2Aurora.html
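The dump-and-load path mentioned above can be sketched roughly like this (all hosts, database names, and credentials here are placeholders, not real values):

```shell
# 1) Export schema and data from the source MariaDB.
mysqldump -h mariadb.example.com -u admin -p"${SRC_PASSWORD:-changeme}" \
  --single-transaction --routines --triggers mydb > mydb.sql

# 2a) Import the dump into the Aurora cluster's writer endpoint.
mysql -h myaurora.cluster-abc123.us-east-1.rds.amazonaws.com \
  -u admin -p"${DST_PASSWORD:-changeme}" mydb < mydb.sql

# 2b) Alternatively, for large tables, stage CSV files in S3 and use
#     Aurora's LOAD DATA FROM S3 (requires an IAM role attached to the cluster):
aws s3 cp orders.csv s3://my-migration-bucket/orders.csv
mysql -h myaurora.cluster-abc123.us-east-1.rds.amazonaws.com \
  -u admin -p"${DST_PASSWORD:-changeme}" mydb \
  -e "LOAD DATA FROM S3 's3://my-migration-bucket/orders.csv'
      INTO TABLE orders
      FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';"
```

The `--single-transaction` flag keeps the export consistent without locking InnoDB tables for the whole dump.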
If you are already on RDS MariaDB:
Try a snapshot restore. A direct snapshot restore might not get you all the way there, but it's the starting point for you.
Doc: https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/AuroraMySQL.Migrating.RDSMySQL.Import.html
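The snapshot path can be driven with the AWS CLI along these lines (identifiers are hypothetical, and whether the restore is accepted depends on your engine/version combination):

```shell
# 1) Snapshot the existing RDS MariaDB instance.
aws rds create-db-snapshot \
  --db-instance-identifier my-mariadb \
  --db-snapshot-identifier my-mariadb-pre-migration

aws rds wait db-snapshot-available \
  --db-snapshot-identifier my-mariadb-pre-migration

# 2) Restore the snapshot into a new Aurora MySQL cluster.
aws rds restore-db-cluster-from-snapshot \
  --db-cluster-identifier my-aurora-cluster \
  --snapshot-identifier my-mariadb-pre-migration \
  --engine aurora-mysql

# 3) A restored cluster has no instances yet, so add a writer.
aws rds create-db-instance \
  --db-instance-identifier my-aurora-writer \
  --db-cluster-identifier my-aurora-cluster \
  --db-instance-class db.r5.large \
  --engine aurora-mysql
```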

Related

Migrating MicroStrategy reports from on-prem with Teradata as backend to MicroStrategy Cloud with Snowflake as backend

We have MicroStrategy reports pointing to Teradata as the backend. The plan is to migrate MicroStrategy from on-prem to MicroStrategy Cloud with Snowflake as the backend.
I wanted to know the steps and the various ways/processes to do this migration.
For the data, there are two basic approaches:
Export the data to files on a cloud storage platform and then use Snowflake’s COPY INTO command to load the files into Snowflake
Use a tool that can read from Teradata and write to Snowflake. This could be a dedicated ETL tool or a generic coding tool like Python
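The first approach can be sketched with the snowsql CLI (stage, table, account, and file names below are all hypothetical; this assumes the Teradata export has already produced local CSV files):

```shell
# Stage the exported CSV files and load them with COPY INTO.
snowsql -a myaccount -u loader -d MYDB -s PUBLIC -q "
  CREATE OR REPLACE STAGE td_stage FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
  PUT file:///exports/sales_*.csv @td_stage;
  COPY INTO sales FROM @td_stage PATTERN = '.*sales_.*[.]csv';
"
```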
For the metadata backup, you can use the "mstr bak" tool to do the migration. This is officially supported for migrating from on-prem to cloud. Refer to https://www2.microstrategy.com/producthelp/Current/Cloud/en-us/Content/upgrade_mstr_back_up.htm
For the warehouse migration (Teradata -> Snowflake), I believe there are other tools that support this.

Legacy SSMigrationAssistant for Oracle 7.3?

I'm unable to get any connectivity to Oracle 7.3 using the current SSMA from Microsoft. There is no legacy download anymore, and I think I need an older version, 6.0. I keep getting two-task connection protocol errors when I use the tnsnames method to define a service. This works with the schema manager in the legacy Oracle tools, but no matter what I try, I'm unable to get a connection in the new SSMA without seeing that error. I should say I'm on a Windows 7 VM for legacy reasons.
I have tried using a linked server in SQL Server 2008 and 2012, which would allow me to make views. I could then bring it into Entity Framework, which is the ultimate goal, but with the same result as above.
Anyone had a similar issue and happen to have an older SSMA?
Client/Server Interoperability Support [ID 207303.1]
For an Oracle 7.3 server you need Oracle client 7.3, 8.0.5-8.1.7, or 9.0.
Oracle8i 8.1.7 Documentation
To configure the Oracle client (SQL*Net V1/V2, tnsnames.ora), read the SQL*Net FAQ.
I installed Oracle 7.3 a very long time ago, back in 1996 :).
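For reference, a SQL*Net V2 service definition in tnsnames.ora for an old server looks roughly like this (host and SID are placeholders):

```
ORA73 =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = legacy-host)(PORT = 1521))
    (CONNECT_DATA = (SID = ORCL))
  )
```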
Update
You can migrate to the free Oracle 11g XE edition, or use an RDS instance of Oracle in the Amazon cloud with the free one-year tier.
Use the exp and imp utilities. They are no longer being developed and are not supported, but they are necessary for migrating from very old versions.
Migration plan:
Identify the schemas, users, and tablespaces to migrate in the source database.
Install the free version of Oracle 11 or create an RDS instance of Oracle in the Amazon cloud with free 1-year service.
Create the required tablespaces.
Create profiles, users and roles.
Import the dump file.
Then run the migration to MS SQL Server (e.g. with SSMA).
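The exp/imp round trip in the plan above looks roughly like this (connect strings, users, and file names are hypothetical; run exp with a client old enough to talk to 7.3):

```shell
# 1) Export the application schema from the Oracle 7.3 source.
exp system/manager@ORA73 owner=APPUSER file=appuser.dmp log=exp.log

# 2) Import the dump into the new Oracle 11g XE / RDS instance
#    (tablespaces, profiles, users, and roles created beforehand).
imp system/manager@ORA11 fromuser=APPUSER touser=APPUSER \
  file=appuser.dmp log=imp.log
```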

Copy SQL Server database to Amazon AWS RDS without downtime

I have been handed a task that requires copying a 300 GB SQL Server 2008 R2 Standard Edition database to Amazon's AWS RDS.
The thing is, a couple of weeks ago AWS finally released a feature that allows restoring a .bak file generated by T-SQL's BACKUP command (or SSMS's). This is a good start, but the procedure of exporting the DB, copying the .bak file to S3, and then restoring it to RDS takes about 6 hours.
During that time, our application servers must be down so that the databases stay in sync; as we are talking about our website's database, 6 hours is something very difficult to cope with.
We have already tried AWS's DMS service and RedGate's Data Compare to no avail...
Does anyone have an idea how this can be done, or do we really have to accept the 6-hour downtime?
The features and limitations of SQL Server Standard Edition on AWS are detailed in this TaskLance blog post:
Issues with MS SQL Server Standard edition in AWS: http://tasklance.com/index.php/2016/03/16/issues-with-ms-sql-server-standard-edition-in-aws-2/
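For reference, the native backup/restore path the question describes can be driven from sqlcmd roughly like this (server names, bucket, and ARN are hypothetical):

```shell
# 1) Take a full backup on premises and copy it to S3.
sqlcmd -S onprem-sql -Q "BACKUP DATABASE MyDb TO DISK = N'D:\backups\MyDb.bak'"
aws s3 cp /backups/MyDb.bak s3://my-bak-bucket/MyDb.bak

# 2) Restore on RDS via the stored procedure exposed by native backup/restore.
sqlcmd -S myrds.abc123.us-east-1.rds.amazonaws.com \
  -U admin -P "${DB_PASSWORD:-changeme}" -Q \
  "exec msdb.dbo.rds_restore_database
     @restore_db_name = 'MyDb',
     @s3_arn_to_restore_from = 'arn:aws:s3:::my-bak-bucket/MyDb.bak'"
```

This path still implies downtime while the restore runs; it only automates the steps.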

Does Amazon RDS for SQL Server support SSIS?

Reading some conflicting answers from Google searches, I'm not sure if the answer is yes, no, or maybe.
I thought it was pretty clear when reading this:
Amazon RDS currently does not support the following SQL Server features:
The ability to run Reporting, Analysis, Integration, or Master Data Services on the same server as the DB instance. If you need to do this, we recommend that you either install SQL Server on an EC2 instance or use an on-premise SQL Server instance to act as the Reporting, Analysis, Integration, or Master Data Services server.
Amazon now supports SSIS on RDS as of May 2020.
More info here
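Enabling SSIS on RDS for SQL Server goes through an option group, along these lines (names are hypothetical; it also requires S3 integration and an appropriate IAM role on the instance):

```shell
# Create an option group for the instance's engine and version.
aws rds create-option-group \
  --option-group-name my-ssis-og \
  --engine-name sqlserver-se \
  --major-engine-version 14.00 \
  --option-group-description "SSIS option group"

# Add the SSIS option to it.
aws rds add-option-to-option-group \
  --option-group-name my-ssis-og \
  --options OptionName=SSIS \
  --apply-immediately

# Attach the option group to the DB instance.
aws rds modify-db-instance \
  --db-instance-identifier my-sql-instance \
  --option-group-name my-ssis-og \
  --apply-immediately
```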

Can I use SQL Server database with Apache Mahout?

I chose to use Apache Mahout as my recommendation engine, but at the same time, for several reasons, it would be easier if I could store my data in a SQL Server DB. Can Mahout be connected to SQL Server without any problems?
The documentation says that it can be connected to other DB engines through a JDBC driver, but all the articles and books I see use MySQL, and the bundled data models seem to support MySQL only.
How to convert MySQL to SQL Server databases:
SQL Import/Export Wizard through ODBC (http://www.mssqltips.com/sqlservertutorial/2205/mysql-to-sql-server-data-migration/)
SQL Server Migration Assistant (http://msdn.microsoft.com/en-us/library/hh313125(v=sql.110).aspx)
Here is the JDBC Driver for SQL Server: http://msdn.microsoft.com/en-us/sqlserver/aa937724.aspx
Changing DB input format/driver in Hadoop cluster: http://blog.cloudera.com/blog/2009/03/database-access-with-hadoop/
There are also numerous example of using Mahout with an Azure Hadoop Cluster via HDInsight:
http://bluewatersql.wordpress.com/2013/04/12/installing-mahout-for-hdinsight-on-windows-server/
http://www.codeproject.com/Articles/620717/Building-A-Recommendation-Engine-Machine-Learning
I have just started my experiments with Mahout. I managed to run some book examples after replacing the in-memory data models with SQL92JDBCDataModel or SQL92BooleanPrefJDBCDataModel shipped with Mahout 0.9.
I passed an instance of SQLServerDataSource to the constructors of those data models. This class is included in the Microsoft JDBC Drivers for SQL Server package (I used version 4.1).
However, the SQL92JDBCDataModel documentation states that it is "not optimized for performance".
