I'm working on a project migrating Zeppelin notebooks to Azure Databricks, and I haven't found any documentation on this.
Any guidance would be much appreciated.
There is no proven tool to migrate/convert Zeppelin notebooks to Databricks notebooks.
Both are JSON, but they follow different schemas.
Databricks notebooks use the IPython (Jupyter) format: https://nbformat.readthedocs.io/en/latest/
I did find a GitHub repo, https://github.com/rdblue/jupyter-zeppelin, that addresses your issue, but it has not been updated recently, so I am not sure it will work with the latest versions.
You can give it a try and modify the code if required.
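For orientation, here is a minimal sketch of the kind of conversion that repo performs. It assumes Zeppelin's note.json layout (a top-level "paragraphs" list whose items carry a "text" field) and the Jupyter nbformat v4 layout; real notes also need handling for interpreter magics like %spark or %sql, which this sketch ignores.

```python
import json
import sys

def zeppelin_to_ipynb(note_path, out_path):
    """Convert a Zeppelin note.json into a bare Jupyter notebook."""
    with open(note_path) as f:
        note = json.load(f)
    cells = []
    for para in note.get("paragraphs", []):
        text = para.get("text") or ""
        # Every Zeppelin paragraph becomes one code cell; outputs are dropped.
        cells.append({
            "cell_type": "code",
            "metadata": {},
            "execution_count": None,
            "outputs": [],
            "source": text.splitlines(keepends=True),
        })
    nb = {"cells": cells, "metadata": {}, "nbformat": 4, "nbformat_minor": 5}
    with open(out_path, "w") as f:
        json.dump(nb, f, indent=1)

if __name__ == "__main__":
    # Usage: python convert.py note.json out.ipynb
    zeppelin_to_ipynb(sys.argv[1], sys.argv[2])
```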
Do you have information on how to integrate Snowflake with Azure DevOps for CI/CD? I don't see much information on docs.snowflake.com. I am interested in a step-by-step process or guide for implementing Azure DevOps with Snowflake.
There are different ways to do this. You can install and execute SnowSQL, dbt, or schemachange on ADO agents. Please see this guide for an example with schemachange: https://quickstarts.snowflake.com/guide/devops_dcm_schemachange_azure_devops/index.html?index=..%2F..index#0
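As a rough sketch of what such an agent step could look like, the snippet below shells out to schemachange from Python. The account, user, role, warehouse, folder, and secret names are placeholders, and the flag names follow the schemachange README at the time of writing, so verify them against the version you install.

```python
import os
import subprocess

# Hypothetical deploy step for an ADO agent, assuming schemachange is
# pip-installed and the migration scripts live under ./migrations.
# schemachange reads the password from the SNOWFLAKE_PASSWORD env var.
os.environ["SNOWFLAKE_PASSWORD"] = os.environ["PIPELINE_SECRET"]  # placeholder secret name

subprocess.run(
    [
        "schemachange",
        "-f", "./migrations",
        "-a", "my_account",     # placeholder Snowflake account
        "-u", "deploy_user",    # placeholder user
        "-r", "DEPLOY_ROLE",    # placeholder role
        "-w", "DEPLOY_WH",      # placeholder warehouse
        "-d", "TARGET_DB",      # placeholder database
    ],
    check=True,
)
```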
I have a project that requires migrating all the stored procedures from SQL Server to the Hadoop ecosystem.
My main concern is whether HPL/SQL has been discontinued or is out of date: http://www.hplsql.org/new shows the latest release as HPL/SQL 0.3.31, September 2017.
Has anyone been using this open-source tool, and is this kind of migration feasible based on your experience? Your input would be highly appreciated.
I am facing the same issue of migrating a huge number of stored procedures from a traditional RDBMS to Hadoop.
First of all, this project is still active. It has been included in Apache Hive since version 2.0, which is why standalone releases stopped in September 2017. You can download the latest version of HPL/SQL from the Hive repo.
You may have a look at the Git history for new features and bug fixes.
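For what it's worth, once you have a Hive 2.x+ installation, HPL/SQL is available as the bin/hplsql launcher, so a migrated procedure can be exercised roughly like this. The install path and script name below are placeholders, not something from this thread.

```python
import subprocess

# Run a migrated HPL/SQL script through the launcher bundled with Hive.
# /opt/hive and migrated_procedure.sql are illustrative placeholders.
result = subprocess.run(
    ["/opt/hive/bin/hplsql", "-f", "migrated_procedure.sql"],
    capture_output=True,
    text=True,
    check=True,
)
print(result.stdout)
```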
I made a website that uses a SQLite3 database, and I'm trying to get my program onto AWS using Elastic Beanstalk. I've been googling but can't find any instructions/tutorials on how to get a SQLite3 database running on AWS. Does AWS support SQLite3? Is there some trick to making it work? And if not, what do you recommend? Many thanks.
You can refer to the documentation below, which will help you get to the Beanstalk console and add SQLite3 on AWS. It is written for MySQL, but you can change the database engine to SQLite3 from the Database settings.
https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/using-features.managing.db.html
I am not entirely sure whether this is possible because I have not done it before, but I'll point you in the right direction.
There is documentation that shows you how to get started with a custom Amazon Machine Image (AMI) for your Elastic Beanstalk environment. So what I would recommend doing is:
install sqlite3 on an EC2 instance,
configure sqlite3 to your requirements (a minimal sketch of the application side follows this list),
ensure the instance starts the sqlite3 service on boot,
create an AMI of the instance,
follow this documentation:
https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/using-features.customenv.html
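For reference, SQLite is an embedded, file-based engine, so the application simply opens a database file on the instance's local disk. A minimal sketch of what that looks like, with a hypothetical file path:

```python
import sqlite3

# SQLite is embedded and file-based: the database is just a file on the
# instance's local disk. /var/app/current/app.db is a hypothetical path.
conn = sqlite3.connect("/var/app/current/app.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS visits (id INTEGER PRIMARY KEY, ts TEXT)"
)
conn.execute("INSERT INTO visits (ts) VALUES (datetime('now'))")
conn.commit()

for row in conn.execute("SELECT id, ts FROM visits"):
    print(row)

conn.close()
```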
Please let me know how you go and I may be able to help if you get stuck along the way.
It would be epic if AWS released a service or intermediate server for it. I love SQLite.
However, the problem is that SQLite does not reliably support transactions over NFS, because its file locking breaks on many NFS implementations. I actually tried storing a SQLite database on AWS EFS and then mounting EFS from both AWS Lambda and Batch, so I hit this wall organically.
Given that cloud environments are largely multi-machine/node, you really start to see the benefit of a server-based approach like PostgreSQL.
I am using Redgate DLM Automation for database CI in a SQL Server and Visual Studio Team Services environment. I can easily deploy to multiple databases in a single environment, but apparently DLM Automation does not support multiple environments out of the box. Redgate support suggested using VSTS post-scripts in PowerShell, sqlcmd, or something called "account_y" (I'm not sure what this refers to) to potentially add multiple environments.
Has anyone tried using DLM Automation for multiple environments? I have explored the PowerShell cmdlets, looked at SQL Compare options and filters, and thought about using VSTS's Tokenizer for script alterations, but I am still struggling with how to put all of this together to deploy to more than one environment.
Any experience or guidance would be greatly appreciated.
Thank you!
You definitely can deploy to multiple environments; however, the issue of needing different user accounts for different environments is not a trivial problem to solve. Ultimately, whatever you source control will be deployed to each environment, so if you need different user accounts, you will need to take care of it yourself with some sort of post-deployment script.
I would suggest not source-controlling user accounts, and instead adding a custom step after deployment to add the users, either from the command line using sqlcmd or with the equivalent PowerShell cmdlets; a rough sketch of such a step is below.
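To make that concrete, here is a hedged sketch of a post-deployment step driven from Python. The server, database, and login names are placeholders, and it assumes sqlcmd is on the agent's PATH; adapt it to however your pipeline passes environment details.

```python
import subprocess

# Placeholder mapping from environment name to the login each one needs.
ENV_USERS = {
    "test": r"MYDOMAIN\test_app_user",
    "prod": r"MYDOMAIN\prod_app_user",
}

def add_user(server, database, login):
    # Guarded so rerunning the step is harmless if the user already exists.
    sql = (
        "IF NOT EXISTS (SELECT 1 FROM sys.database_principals "
        f"WHERE name = N'{login}') CREATE USER [{login}] FOR LOGIN [{login}];"
    )
    subprocess.run(["sqlcmd", "-S", server, "-d", database, "-Q", sql], check=True)

# Example: after DLM Automation deploys to the test environment.
add_user("test-sql01", "MyDatabase", ENV_USERS["test"])
```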
There are some blog posts that go into detail regarding this problem and their answers are probably more detailed than anything I can provide. I'd suggest that you have a read of them.
https://www.red-gate.com/blog/building/source-controlling-database-permissions
http://workingwithdevs.com/source-controlling-database-users/
I hope this helps.
I have a large database project and I am trying to publish it to Azure. I have done the following:
In the project settings, changed Target Platform to Windows Azure
On Azure, made sure I am on the Standard tier (S0)
Made sure the Server Version is V12
Changed the Timeout for the publish
Tried creating a new SQL Server database
All result in a ton of errors, a couple of which are:
ForeignKey: [xxxxxx] has an unresolved reference
ROWGUIDCOL is not supported for the targeted platform
I tried searching and tried everything I saw with no luck. Also, I need to be able to publish as things change, so using the migration tool is not an option.
Thank you
Eric
That was correct. I just needed to update my SSDT. I posted this as the answer before, but it never showed up.
Thank you