azure logic apps and ftp to azure VM on subnet - azure-logic-apps

Is it possible to connect an Azure Logic App to an FTP server residing on an Azure VM that lives in a subnet under a private IP address? Trawling the internet, I cannot find many references (if any at all) to anybody trialling this, although we felt it would be a valid test of our need to access SFTP under unusual network circumstances. We have such a server, initially created to act as a 'remote' development VM, reachable only via another Azure VM that connects to our on-premises network and acts as a 'jump' server.
It is not a showstopper if we cannot do this - we can, after all, set up non-private VMs to act as FTP servers. And I understand that 'private' might be private for a reason! But I was surprised not to find any definitive statement, from Microsoft or anyone else, on whether this is possible: it seems the only way to engineer such a connection might be through an expensive Azure Integration Account or App Service Environment. The hint that a connection might be possible via those additional Azure facilities has prompted my organisation to investigate what can be done with the usual Logic App connector features. Thanks

No, your supposition is correct. Private means private: you can't directly reach a private VM on Azure, for the same reason you can't reach a private VM on an internal network.
But all the same facilities are available. If it's a Windows VM, you can use the on-premises data gateway with the File System connector.

Related

On-prem SQL Server to Azure

I have 2-3 source systems which are on-premises databases. I am planning to use Logic Apps to connect to these source systems. As per the Azure documentation, we need to install an on-premises data gateway on a local computer.
I am skeptical of this approach, as it demands a dedicated machine, so I am not sure whether it works in a real production scenario.
Can you please suggest the right way to do this?
Here is how to connect to on-premises data sources:
If the services are accessible over the internet, you can call the service endpoint over HTTP or HTTPS from Azure Logic Apps. This article details the steps to follow: https://learn.microsoft.com/en-us/azure/connectors/connectors-native-http
If they are not accessible over the internet, this article walks through the process step by step: https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-gateway-connection
Before you can access on-premises data sources from your logic apps, you need to create an Azure resource after installing the on-premises data gateway on a local computer. Your logic apps then use this Azure gateway resource in the triggers and actions provided by the on-premises connectors available for Azure Logic Apps.
You may also want to consider the costs.

How to securely access an on-prem database from an Azure App Service

Is there a way to securely access an on-prem SQL Server from an App Service?
The IT guys are nervous about an App Service having access to our on-premises database.
I am not a networking guy, and am trying to come up with a solution.
The only thing I have thought of is creating a new database (CDS_API). The App Service is given a connection string to this database, and this database in turn has access to the primary database (CDS).
If the App Service has only execute permissions on CDS_API, this seems secure to me. Am I missing something?
Is there a better way to do this?
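For illustration, the proposed pattern narrows the app's access to a set of stored procedures. A minimal C# sketch of the client side, assuming a hypothetical dbo.GetCustomer procedure in CDS_API and a login that has been granted EXECUTE only:
using System;
using System.Data;
using System.Data.SqlClient;

class CdsApiClient
{
    static void Main()
    {
        // The app's login has EXECUTE permission only, so ad-hoc SELECT/INSERT
        // against tables is denied at the database-permission level.
        using (var conn = new SqlConnection("Server=cds-api-host;Database=CDS_API;Integrated Security=true"))
        using (var cmd = new SqlCommand("dbo.GetCustomer", conn) { CommandType = CommandType.StoredProcedure })
        {
            cmd.Parameters.AddWithValue("@CustomerId", 42); // hypothetical parameter
            conn.Open();
            using (var reader = cmd.ExecuteReader())
                while (reader.Read())
                    Console.WriteLine(reader["Name"]);
        }
    }
}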
The simple solution is to use an App Service Hybrid Connection:
Hybrid Connections can be used to access application resources in any network that can make outbound calls to Azure over port 443. Hybrid Connections provides access from your app to a TCP endpoint and does not enable a new way to access your app. As used in App Service, each Hybrid Connection correlates to a single TCP host and port combination. This enables your apps to access resources on any OS, provided it is a TCP endpoint. The Hybrid Connections feature does not know or care what the application protocol is, or what you are accessing. It simply provides network access.
Alternatively, you can integrate your app with an Azure virtual network which is connected securely to your on-prem networks, either with a Site-to-Site VPN or over ExpressRoute.
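Once the Hybrid Connection is configured (one per host:port pair), the application code does not change at all: the app simply opens a connection to the on-premises host name as if it were local, and the relay carries the traffic. A minimal sketch, assuming a hypothetical on-prem host onprem-sql01 listening on 1433:
using System;
using System.Data.SqlClient;

class HybridConnectionDemo
{
    static void Main()
    {
        // The Hybrid Connection for onprem-sql01:1433 is configured in the portal;
        // the connection string just names that endpoint and the relay does the rest.
        var connStr = "Server=onprem-sql01,1433;Database=CDS;User Id=app_user;Password=...";
        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();
            using (var cmd = new SqlCommand("SELECT @@VERSION", conn))
                Console.WriteLine(cmd.ExecuteScalar());
        }
    }
}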

Separate SQL server too slow in Google Cloud

I was moving a website to Google Cloud and encountered a performance problem.
I set up a VM instance on Compute Engine and a Cloud SQL server.
I connected the Joomla website on the VM to the Cloud SQL server using the provided IP address (which seems to be a public IP).
The performance is really slow compared to the same site using a local database inside the VM itself.
So, my question is: is there a way to get a local IP to connect to Cloud SQL, since our web server is also on Google Cloud infrastructure?
Or is the only option to stick with the database inside the VM?
Update
I set up the Cloud SQL Proxy using this guide.
I can connect to the mysql prompt through the proxy now.
But I still cannot find a way to make Joomla use the proxy to connect to the database.
The fastest, easiest, and most secure way to connect to your Cloud SQL instance from your Compute instance is by using the Cloud SQL Proxy. There are multiple reasons for this, but here are the main ones:
Secure connections: The proxy automatically encrypts traffic to and from the database using TLS 1.2 with a 128-bit AES cipher; SSL certificates are used to verify client and server identities.
Easier connection management: The proxy handles authentication with Google Cloud SQL, removing the need to provide static IP addresses.
There's also the fact that you only have a small, static number of instances (one in your case) connecting to the database, so you don't need to overcomplicate your setup: you can just drop the binary onto your instance, run it as a daemon, and instantly have a fast lane to your Cloud SQL instance (I say "fast lane" because the traffic goes through Google Cloud's internal network).
Setting up the Cloud SQL Proxy comes down to enabling the Cloud SQL API, giving your instance's service account access to the Cloud SQL API, making sure the binary has execution permissions (chmod +x), and giving it the connection string of the Cloud SQL instance. You seem to be having issues using the proxy, so if you need more troubleshooting ideas, you can find them in the documentation. The tutorial you followed should have detailed instructions for these steps.
After all of that and after making sure the Proxy is running, connecting Joomla to the database should be similar to how you do it via the MySQL client. You should point your Joomla installation to localhost (or 127.0.0.1), give it a set of credentials to access the database itself (you can create database users via the Console), give your Joomla database's name, and that should be it!
Don't forget that the Proxy needs to be running in TCP mode! That should be as simple as adding =tcp:LOCAL_PORT_TO_LISTEN_ON to the connection string parameter you're passing to the Proxy. Here's an example of how to run the Proxy:
./cloud_sql_proxy -instances=<INSTANCE_CONNECTION_NAME>=tcp:3306
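Putting the steps above together on the VM, the whole setup is only a few commands (a sketch: the instance connection name my-project:us-central1:my-sql is hypothetical, and the download URL assumes the 64-bit Linux binary):
# download the proxy, make it executable, and run it listening on the MySQL port
wget https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64 -O cloud_sql_proxy
chmod +x cloud_sql_proxy
./cloud_sql_proxy -instances=my-project:us-central1:my-sql=tcp:3306 &
With that running, Joomla's database host is simply 127.0.0.1 on port 3306, exactly as if MySQL were installed locally.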
Using a Virtual Private Cloud (VPC) can also help with performance.
Private Google access enables virtual machine (VM) instances on a subnetwork to reach Google APIs and services using an internal IP address rather than an external IP address. You can use Private Google access to allow VMs without internet access to reach Google services.
More details here: https://cloud.google.com/vpc/docs/private-google-access
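If you go this route, note that Private Google access is enabled per subnetwork. A sketch, with hypothetical subnet and region names:
gcloud compute networks subnets update my-subnet \
    --region=us-central1 \
    --enable-private-ip-google-access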

SSAS with Kerberos delegation gets connection timeout error

I have a situation where clients connecting to my web service (which lives on another server) must access SQL Server databases and SSAS servers.
The service must use the credentials of the calling client when accessing the SQL Server databases and SSAS cubes.
For this to work I do
var winId = HttpContext.Current.User.Identity as WindowsIdentity;
using (var ctx = winId.Impersonate())
{
    // Access Database/SSAS under the caller's credentials
} // disposing the context reverts the impersonation (same as ctx.Undo())
in my service, which works fine when accessing SQL Server databases.
However, when I access the SSAS servers I get
"The connection either timed out or was lost"
There are a number of posts like
http://denglishbi.wordpress.com/2009/03/31/windows-server-2008-kerberos-bug-%E2%80%93-transport-connection-issues-with-ssas-data/
http://sqlblogcasts.com/blogs/drjohn/archive/2009/03/28/kerberos-kills-large-mdx-queries-on-windows-server-2008-ssas.aspx
on this, but the server where my service lives runs Windows Server 2008 R2, so this should not be a problem, as that bug should have been fixed by Microsoft.
Any information as to how to best diagnose this problem would be appreciated.
To clarify: the SSAS servers do have SPNs. This was actually working at one point but has now stopped. There appears to be no sign of duplicate SPNs or anything like that.
What is interesting is that it works intermittently on one SSAS server but seems to work all the time on the other.
Both have SPNs for their named instances, as described in this document:
https://support.pyramidanalytics.com/entries/22056243-Configuring-Kerberos-Delegation-for-Named-Instances-of-SSAS-with-Active-Directory-and-additional-pro
My production environment is load-balanced (and under heavy load) on a very large corporate Active Directory network. It took a lot of testing to finally nail down settings that work.
I also run on Windows Server 2008 R2.
My web services are ASP.NET in IIS. For authentication I enabled "Windows Auth" and "ASP.NET Impersonation". Kernel mode is disabled and the provider is "Negotiate:Kerberos".
SPNs and trusted delegation are set up for an AD account. My AD account looks like sys_myservice (sys_ is just a naming convention at my company).
The application pool identity is set to use the sys_myservice account.
After you make all these changes in your dev environment, restart the entire server. For some odd reason this is always necessary when we bring on new servers and configure them.
With this setup my web services access SSAS, SQL Server, and other services that use Windows Kerberos auth, and all queries are performed correctly under the user's credentials.
The difference between my setup and yours is that ASP.NET Impersonation is enabled at the IIS level. I had trouble doing the impersonation in code, which is what you are trying to do. If you get code-level impersonation to work with your workflow, I would be really interested in seeing you post an update.
Forgot to mention: my services are in an MVC application, and I apply a global filter to all action methods to force the application to authenticate all connections.
public static void RegisterGlobalFilters(GlobalFilterCollection filters)
{
    filters.Add(new HandleErrorAttribute());
    filters.Add(new System.Web.Mvc.AuthorizeAttribute());
}
and in my web.config system.web section
<authentication mode="Windows" />
<identity impersonate="true" />
I agree that the intermittently 'successful' SSAS instance is suspicious. I wonder if it's really using Kerberos all the time. It could be using a combination of Negotiate/NTLM and Kerberos, with one auth method actually working and the other failing. It might be worth another look at the SPNs. This link might help: http://msdn.microsoft.com/en-us/library/dn194200.aspx
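For the SPN double-check, the stock setspn tool covers both angles; a quick sketch, with a hypothetical service account name:
:: list every SPN registered to the SSAS service account
setspn -L MYDOMAIN\ssas_service
:: search the domain for duplicate SPNs
setspn -X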
Did you try using Wireshark or another network analysis tool to see whether anything suspicious happens at the point of failure? It would help if you shared more troubleshooting observations from your end.
Also, do your web services sit behind a load balancer?

How to secure a database using web services?

An application is currently connected to a database server on the same LAN and performs selects and inserts.
The database will be moved to a remote location accessible over the internet. Performance degradation will be addressed by reducing the number of operations to the db. It is not possible to use a VPN or to configure access-in rules based on client IP on the firewall of the network where the database server will be moved. So it seems to me it is necessary to create a database front end in order to protect it. I suppose one way to achieve this is to create a web service.
Are there easier alternatives?
I'm new to web services: it would run in a Glassfish server while the client would be a C# application.
I have read a bit about securing a web service but I'm confused.
One method I found on the internet is to use Glassfish's built-in authentication mechanism and configure web.xml to limit access to the web service URL to a group of users.
It seems an easy approach; are there any drawbacks?
Is it easy to use this type of authentication from the C# client?
Other existing web services want a key parameter in the request. The key is compared against the valid ones, and if the check succeeds the request is accepted.
Is this approach more secure than the previous one?
Another alternative is to use WSIT, but at first glance it seems over-complicated, and all its security mechanisms need a server certificate.
It does look more secure, though; does it fit well with JAX-RS and RESTful web services?
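On the C# side, the container-managed (BASIC) option is usually the easiest to consume: you attach credentials to the request and let Glassfish's realm validate them. A minimal sketch, with a hypothetical endpoint and credentials (only safe over HTTPS, since BASIC credentials are plain text on the wire):
using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

class SecuredServiceClient
{
    static async Task Main()
    {
        // Respond to the server's 401 challenge with BASIC credentials;
        // Glassfish validates them against the realm configured in web.xml.
        var handler = new HttpClientHandler
        {
            Credentials = new NetworkCredential("api_user", "secret") // hypothetical
        };
        using (var client = new HttpClient(handler))
        {
            var response = await client.GetAsync("https://example.com/myservice/resource");
            response.EnsureSuccessStatusCode();
            Console.WriteLine(await response.Content.ReadAsStringAsync());
        }
    }
}
The key-in-request approach is no stronger by itself: either way the secret travels with the request, so TLS is what actually protects it.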
You can use an L2TP or PPTP VPN in this case.
First, let me show the network topology:
Client (connecting to the firewall with L2TP or PPTP) -----> Firewall (L2TP or PPTP VPN server) ---> Firewall LAN where your server is placed
In this setup all clients come in through the VPN, so it is secure, and on the firewall you configure the VPN-to-LAN rule on a per-client basis.
