Hi, I was wondering how I can deploy angular-fullstack to Azure. I have looked around but can't find any tutorials. Can anybody provide instructions or examples?
https://github.com/DaftMonk/generator-angular-fullstack
My understanding is that you have two problems to solve to get this into Azure:
How do I best host my MongoDB?
How do I publish my Node.js application as an Azure app?
Answers.
Deploy your MongoDB either on a VM or via a managed MongoDB service from the marketplace.
a. Worker role option: https://github.com/mongodb/mongo-azure (MongoDB running in an Azure worker role)
b. Managed MongoDB hosted Azure service: you can find this in the marketplace from your Azure console/portal. (Note that not all regions apply and it's not cheap.)
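Whichever option you pick, your Node app ultimately just needs the Mongo connection string. A rough sketch with the classic Azure PowerShell cmdlets; the site name, credentials, and the MONGODB_URI setting name are all placeholders here (use whatever variable your app's config actually reads):
# Log in with the classic (Service Management) cmdlets
Add-AzureAccount
# Expose the Mongo connection string to the Node app as an app setting
Set-AzureWebsite -Name "my-fullstack-app" -AppSettings @{ MONGODB_URI = "mongodb://user:pass@host:27017/mydb" }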
To deploy your Node.js app you have two options.
a. Create an Azure app and hook up the source control so that it deploys on every commit. This is the step-by-step guide from MS: https://azure.microsoft.com/en-gb/documentation/articles/cloud-services-nodejs-develop-deploy-app/ (a git sketch follows at the end of this answer).
b. To publish, run the Publish-AzureServiceProject cmdlet as follows:
Publish-AzureServiceProject -ServiceName NodeHelloWorld -Location "East US" -Launch
-ServiceName specifies the name for the deployment. This must be a unique name, otherwise the publish process will fail.
-Location specifies the datacenter that the application will be hosted in. To see a list of available datacenters, use the Get-AzureLocation cmdlet.
-Launch opens a browser window and navigates to the hosted service after deployment has completed.
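As a hedged aside, listing those datacenters with the classic Azure PowerShell module looks roughly like this (assuming Add-AzureAccount has already been run):
# Print the names of the available datacenter locations
Get-AzureLocation | Select-Object -ExpandProperty Name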
To get the full info on deploying Node apps to Azure, check out the Microsoft docs; it's pretty comprehensive, with screenshots too.
https://azure.microsoft.com/en-gb/documentation/articles/cloud-services-nodejs-develop-deploy-app/
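For option (a), once the web app exists and deployment credentials are configured, hooking up the source control side with a plain git remote looks roughly like this (site and user names are placeholders; the scm URL is the Kudu endpoint Azure shows you in the portal):
git remote add azure https://deployuser@my-fullstack-app.scm.azurewebsites.net:443/my-fullstack-app.git
git push azure master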
Not sure if this is stupid to ask, but I'm running a Neo4j database server (using Apollo Server) from my React application. Currently I run it using node in a separate terminal (and I can navigate to it on localhost), then run npm start in a different terminal to get my application going. How can I keep the database up and running at all times, so that if customers use the product they can always access the database? Or, if this isn't good practice, how can I establish the database connection while I run my client code?
Technologies being used: ReactJS, Neo4j Database, GraphQL + urql
I tried moving the Apollo server code into the App.tsx file of my application to run it from there directly when my app is launched, but this was giving me errors. I'm not sure if this is the proper way to do it, as I think it should be abstracted out of the client code?
If you want to run your server in the cloud so that customers can access your React application, you need two things:
One server/service to run your database, e.g. Neo4j AuraDB (Free/Pro) or one of the cloud marketplaces: https://neo4j.com/docs/operations-manual/current/cloud-deployments/
A service to run your React application, e.g. Netlify, Vercel, or one of the cloud providers (GCP, AWS, Azure), which you then have to configure with the server URL and credentials of your Neo4j server.
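If you just need the API server to stay up on a machine you control (rather than a managed host), a process manager such as pm2 is one common option. A sketch, assuming your Apollo server entry point is server.js (a hypothetical file name):
npm install -g pm2
# Keep the GraphQL/Apollo server running and restart it if it crashes
pm2 start server.js --name graphql-api
# Persist the process list so it can be restored after a reboot (pair with pm2 startup)
pm2 save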
You can run neo4j-admin dump --to=database.dump on your local instance to create a copy of your database content and upload it to the cloud service. For 5.x the syntax is different (see the sketch below).
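A hedged sketch of both forms (the database name neo4j and the backup paths are placeholders):
# Neo4j 4.x
neo4j-admin dump --database=neo4j --to=/backups/neo4j.dump
# Neo4j 5.x
neo4j-admin database dump neo4j --to-path=/backups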
I have an application written in WPF (C#) and I deploy this application using Squirrel:
https://intellitect.com/deploying-app-squirrel/
Now I build the application using these commands:
.\nuget pack nuget\HelloWorld.nuspec
Squirrel --releasify HelloWorld.1.0.0.nupkg --releaseDir "C:\SquirrelReleases"
on my local machine. But my application is stored in Azure.
How to make deployment using Squirrel on Azure?
Reading the documentation here, it seems there's no direct way to deploy your files to Azure Storage, at least as of answering this question. Even for Amazon S3, the documentation just says to upload the files manually into an S3 bucket:
5. upload the files from the Squirrel Releases directory into the S3 bucket.
I guess you can do something similar for Azure Storage as well. I have not tried it but I believe this is what you would need to do (based on their documentation for Amazon S3):
Create a blob container in your Azure Storage account and set its access level to either Blob (recommended) or Public.
Update the package location on the UpdateManager in your application to use the blob container URL (https://account.blob.core.windows.net/blob-container-name).
Upload the files to the blob container. There are many options for doing that: the available storage explorers, AzCopy, the Azure PowerShell/CLI tools, or writing code yourself with any of the available SDKs. A PowerShell sketch follows below.
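A minimal sketch of step 3 with the Az PowerShell module; the resource group, storage account, and container names are placeholders:
# Authenticate and grab a context for the storage account
Connect-AzAccount
$ctx = (Get-AzStorageAccount -ResourceGroupName "my-rg" -Name "mysquirrelstore").Context
# Upload everything Squirrel produced (RELEASES, *.nupkg, Setup.exe)
Get-ChildItem "C:\SquirrelReleases" -File |
    Set-AzStorageBlobContent -Container "releases" -Context $ctx -Force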
I want to deploy (basically) the bin folder to a VM hosted on Azure. I want to deploy from my TFS 2015 server to a Windows 7 VM; both are hosted in Azure. I have set up a Machine File Copy task and tried to point it at the correct VM, but nothing seems to work. It continually comes back with the error:
2015-12-29T14:27:30.5763871Z ##[debug]Initiating copy on machine-name
2015-12-29T14:27:57.0124127Z ##[debug]Finished copy operation on machine-name
2015-12-29T14:27:57.0280368Z ##[debug]Deployment logs for copy operation on machine-name
2015-12-29T14:27:57.0280368Z ##[debug]System.AggregateException: Failed to execute the powershell script. Consult the logs below for details of the error.
2015-12-29T14:27:57.0280368Z System.Management.Automation.RuntimeException: Failed to connect to the path \\machine-name with the user username for copying. System error 53 has occurred.
2015-12-29T14:27:57.0280368Z The network path was not found.
I changed the machine name and username in the above snippet.
I have followed most of the steps in these articles:
http://www.codewrecks.com/blog/index.php/2015/06/20/build-vnext-support-for-deploying-bits-to-windows-machines/
http://blogs.msdn.com/b/visualstudioalm/archive/2015/07/31/dev-test-in-azure-and-deploy-to-production-on-premises.aspx
But I could not see the option for Azure File Copy so that's out the window. When I use PsPing I can connect from the TFS Server to the target VM when targeting the specific WinRM port 5985.
I am not sure if what I am trying to achieve is even possible with TFS + Azure, but I would have imagined this is exactly the type of scenario Azure was built for?!
Any help would be greatly appreciated.
EDIT: Here is the add Step Dialog:
You need to use the Azure File Copy task to copy files to Azure VMs. Open the build definition and click Add build step..., then go to the Deploy page in the ADD TASKS dialog. There you will find the Azure File Copy task.
However, you mentioned that the task is missing; can you share a screenshot of the ADD TASKS dialog?
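As an aside on the original error: system error 53 ("the network path was not found") from Machine File Copy usually means the target VM's admin share is not reachable over SMB, which is separate from WinRM on 5985. A quick check from the TFS server (machine-name is the placeholder from the log above):
# Machine File Copy uses SMB shares; port 445 must be open between TFS and the VM
Test-NetConnection machine-name -Port 445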
While configuring SQL Server 2012 Master Data Services, I am having the following problem:
The required .svc handler mappings are not installed in IIS.
What I want to do is query my database using a URL, so that I can retrieve data directly via the URL itself, just like passing query string parameters through to SQL Server.
How do I deal with this? I have followed several documents but haven't found a solution.
To fix this issue, open a command prompt and go to the .NET directory
(for example %windir%\Microsoft.NET\Framework64\v4.0.30319).
Run the command: aspnet_regiis -i
For further details check: SVC Handler mapping error in MDS Configuration Manager
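If re-registering ASP.NET alone does not bring the .svc mappings back, re-registering WCF's handlers is worth trying as well; a hedged suggestion, with the path where the .NET 3.0 WCF tools normally live:
# Re-register the WCF (.svc) handler mappings in IIS
"%windir%\Microsoft.NET\Framework64\v3.0\Windows Communication Foundation\ServiceModelReg.exe" -r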
I've come across these types of errors a few times when installing MDS. The problem usually comes about because just having IIS installed is not enough: there are loads of other role services and features that you need to enable and install as well, which the setup program doesn't tell you about.
Thankfully they are all documented here:
Web Application Requirements (Master Data Services)
And, if you've missed any, you can go back, install them and then re-launch the configuration tool to complete the setup without having to re-install MDS from scratch.
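On Windows Server you can also script those role services and features instead of clicking through Server Manager. A sketch covering the usual suspects; treat the linked requirements page as the authoritative list:
# Install IIS plus the ASP.NET, Windows auth, and WCF activation pieces MDS needs
Install-WindowsFeature Web-Server, Web-Static-Content, Web-Default-Doc, Web-Asp-Net45, Web-Net-Ext45, Web-ISAPI-Ext, Web-ISAPI-Filter, Web-Windows-Auth, NET-WCF-HTTP-Activation45 -IncludeManagementTools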
OK, since I am in a holding pattern on this issue, perhaps someone has seen these symptoms and can provide some sage advice. (Note: I have learned only enough about Active Directory to build this feature, and I only have read access to the Active Directory.)
I updated the company intranet to allow the automatic entry/modification of employee phone/address information; it uses a web service to connect to the company Active Directory so I can call it from multiple locations in the main application.
The AD has two domains (A and B) in the same forest. Each domain has an ‘ADS update user’ group and an ‘ADSupdate’ account (which belongs to ‘ADS update user’).
Problem: Entries in Domain A update fine from Local Development Servers, Test Servers, and Production Servers. Entries in Domain B update only when run from Local Development Servers. When you run the same code (verified multiple times) on either Test or Production, you get a general access denied error.
The domain name is stored in the employee record so the exact same code is called for all employees.
All Local Development Servers, Test, and Production servers reside in Domain A.
This has the Active Directory admin for Domain B stumped, and to be honest I am thankful that the Local Development Servers are able to update the Active Directory entries in Domain B. It proves that the code works in at least one location.
I have looked at machine permissions, permissions on the group and user, and IIS and I can spot no significant differences.
Any help would be appreciated…
Is integrated authentication enabled on any of the web service applications?
Is the production application on domain A installed on a domain controller?
Do the updates from the development workstation work when you call the web service from a remote machine?
This was not caused by any code changes. The Production and Test servers were upgraded and run a newer version of IIS (6.0). The newer version of IIS will not work across Active Directory domains.
My development machine is running the older version of IIS (5.1).
This explains why everything was working last year and then suddenly stopped working. There are so few employees in the other domain that it was not immediately noticed.