Can I connect with Redshift from Angular for reporting? - angularjs

I need to connect Angular with Redshift for historical reporting. Can this be achieved, and what are the prerequisites?

This is possible in theory using the Redshift Data API, but you should consider whether you really want your client machine to be writing SQL and executing it directly against Redshift.
For that to work, the following would need to be true:
The client machine sends the SQL to be executed, so a malicious actor could modify it; permissions would therefore be critical.
You would need to generate IAM credentials via a service like Cognito for the client to interact with the API directly.
It would be more appropriate to create an API of your own that communicates with Redshift and controls which SQL can be executed.
This could use API Gateway and Lambda to keep it simple, with your frontend calling this API instead of writing the SQL itself (a sketch of such a handler is below).
More information is available in the Announcing Data API for Amazon Redshift post.
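As a rough illustration of that approach, here is a minimal C# Lambda handler sketch that runs a fixed, server-side query through the Redshift Data API. The cluster name, database, secret ARN, and query are placeholders, and the type names follow the usual AWS SDK for .NET conventions (AWSSDK.RedshiftDataAPIService and Amazon.Lambda.APIGatewayEvents packages) but should be treated as assumptions to verify against the current SDK:

```csharp
using System.Threading.Tasks;
using Amazon.Lambda.APIGatewayEvents;
using Amazon.Lambda.Core;
using Amazon.RedshiftDataAPIService;
using Amazon.RedshiftDataAPIService.Model;

public class ReportHandler
{
    private static readonly AmazonRedshiftDataAPIServiceClient Client =
        new AmazonRedshiftDataAPIServiceClient();

    // Invoked via API Gateway; the SQL lives here, so the browser never sends SQL of its own.
    public async Task<APIGatewayProxyResponse> GetMonthlySales(
        APIGatewayProxyRequest request, ILambdaContext context)
    {
        var result = await Client.ExecuteStatementAsync(new ExecuteStatementRequest
        {
            ClusterIdentifier = "reporting-cluster",               // placeholder cluster
            Database = "analytics",                                // placeholder database
            SecretArn = "arn:aws:secretsmanager:...:report-user",  // credentials stay server-side
            Sql = "SELECT order_month, SUM(total) AS total " +
                  "FROM sales.orders GROUP BY order_month"
        });

        // The Data API is asynchronous; return the statement id so the caller
        // (or a follow-up endpoint) can poll for the result set.
        return new APIGatewayProxyResponse
        {
            StatusCode = 200,
            Body = "{\"statementId\": \"" + result.Id + "\"}"
        };
    }
}
```

Your Angular app would then call the API Gateway URL with ordinary HTTP requests, and authentication is handled at the gateway (for example with Cognito authorizers) rather than by the browser holding Redshift credentials.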

Related

Calling API from Azure SQL Database (as opposed to SQL Server)

So I have an Azure SQL Database instance that I need to run a nightly data import on, and I was going to schedule a stored procedure to make a basic GET request against an API endpoint, but it seems like the OLE object isn't present in the Azure version of SQL Server. Is there any other way to make an API call available in Azure SQL Database, or do I need to put something in place outside of the database to accomplish this?
There are several options. I do not know whether a PowerShell job, as suggested in the first comment on your question, can execute HTTP requests, but I do know of at least a couple of alternatives:
Azure Data Factory allows you to create scheduled pipelines to copy/transform data from a variety of sources (like HTTP endpoints) to a variety of destinations (like Azure SQL databases). This involves little or no scripting.
Azure Logic Apps allows you to do the same:
With Azure Logic Apps, you can integrate (cloud) data into (on-premises) data storage. For instance, a logic app can store HTTP request data in a SQL Server database.
Logic apps can be triggered on a schedule as well and involve little or no scripting.
You could also write an Azure Function that is executed on a schedule, calls the HTTP endpoint, and writes the result to the database. Multiple languages are supported for writing functions, such as C# and PowerShell (a minimal sketch follows below).
All of those options also allow you to trigger an execution manually, outside the schedule.
In my opinion, Azure Data Factory (no coding) or an Azure Function (code only) are the best options given the need to parse a lot of JSON data. Bear in mind, though, that Azure Functions on a Consumption Plan are limited to a maximum execution time of 10 minutes per invocation.
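As an illustration of the Azure Function option above, here is a minimal sketch of a timer-triggered function (in-process C# model) that calls an HTTP endpoint and writes the raw result into a staging table. The schedule, endpoint URL, connection-string setting, and table name are placeholders:

```csharp
using System;
using System.Data.SqlClient;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class NightlyImport
{
    private static readonly HttpClient Http = new HttpClient();

    // CRON expression: every night at 02:00 UTC (placeholder schedule).
    [FunctionName("NightlyImport")]
    public static async Task Run([TimerTrigger("0 0 2 * * *")] TimerInfo timer, ILogger log)
    {
        // Basic GET against the API endpoint (placeholder URL).
        var payload = await Http.GetStringAsync("https://api.example.com/v1/export");

        // Store the raw payload in a staging table; shredding the JSON can then be
        // done in T-SQL (e.g. OPENJSON) or in further C# code.
        var connectionString = Environment.GetEnvironmentVariable("SqlConnectionString");
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "INSERT INTO dbo.ImportStaging (Payload, ImportedAtUtc) VALUES (@payload, SYSUTCDATETIME())",
            conn))
        {
            cmd.Parameters.AddWithValue("@payload", payload);
            await conn.OpenAsync();
            await cmd.ExecuteNonQueryAsync();
        }

        log.LogInformation("Nightly import completed at {time}", DateTime.UtcNow);
    }
}
```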

Using SOAP web services to get data from SQL Server 2008 database

I'm a newbie at SOAP and web services (2 day experience).
I use Bonita Open Solution as a BPMS, in which I have a 'WebServer SOAP 1.2' connector. I need to get and write data from/into a database using SOAP. I don't want to use the 'SQL Server' connector, which is based on JDBC, because it would tightly couple the system.
Is there any already implemented SOAP web service in SQL Server 2008 to do that or should I develop my own? In case I should develop my own, I'm guessing the best way to do so is using ASP.NET, am I right?
Before you do anything, you need to decide exactly which data is required by the BPMS system and what access it requires. For instance, it may need read access to some data, but read and write to other data. Your service should only expose the data and operations which are actually required, and nothing more.
Your data is precious - don't expose more of it than necessary.
I recommend that you use Entity Framework in a database-first mode, but only add the required tables to the model. Then, simplify the model by removing columns which are not required, simplifying relationships, etc. Thus, you are exposing a conceptual model of your data which makes sense to the consumer, rather than having to expose every implementation detail of your database (do you really need to expose every junction table, for instance?)
It is then pretty simple to write a WCF service that uses Entity Framework to do the hard work of data access.
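A minimal sketch of that idea, assuming a database-first EF context named ReportingEntities and an Orders table (both hypothetical names), might look like this:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.Serialization;
using System.ServiceModel;

// Expose only the fields the BPMS actually needs, not the whole table.
[DataContract]
public class OrderDto
{
    [DataMember] public int Id { get; set; }
    [DataMember] public decimal Total { get; set; }
    [DataMember] public DateTime OrderedOn { get; set; }
}

[ServiceContract]
public interface IOrderService
{
    [OperationContract]
    IList<OrderDto> GetOrdersForCustomer(int customerId);
}

public class OrderService : IOrderService
{
    public IList<OrderDto> GetOrdersForCustomer(int customerId)
    {
        // ReportingEntities is the generated database-first context (hypothetical name).
        using (var db = new ReportingEntities())
        {
            return db.Orders
                     .Where(o => o.CustomerId == customerId)
                     .Select(o => new OrderDto
                     {
                         Id = o.Id,
                         Total = o.Total,
                         OrderedOn = o.OrderedOn
                     })
                     .ToList();
        }
    }
}
```

The Bonita SOAP connector can then call the operations exposed by the service's WSDL, keeping the process definitions decoupled from the physical schema.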
Even though they are deprecated, SQL Server 2008 has native SOAP web services (see Native XML Web Services: Deprecated in SQL Server 2008).
You need to balance the risk of a SQL Server upgrade against the cost of developing (and maintaining) a custom service.

One ASP.NET Web API Application, Multiple SignalR Backplanes (SQL Server databases)

Is it possible to create multiple backplanes within SignalR?
We're working on an ASP.NET Web API SaaS application and are looking to implement SignalR for "real-time" web functionality. Since we'll be hosting the application in a web farm, client-connection state will be managed through a SQL Server backplane.
The application is multi-tenant, but the database is not. The application determines which connection string to use, and all client requests talk to their appropriate database. The code for configuring the SignalR SQL Server backplane within Application_Start() is:
GlobalHost.DependencyResolver.UseSqlServer(connectionString);
Does anyone know if it's possible to create multiple backplanes with SignalR, basically looping through each connection string and calling the code above?
Thanks for checking this out!
If you need to eliminate the single point of failure, I suggest setting up a failover server in case the primary SQL Server machine goes down. Reference: http://technet.microsoft.com/en-us/library/hh231721.aspx
If you simply need more performance than a single SQL Server instance can provide, I suggest using Redis as the backplane.
In either case, I doubt attempting to use "multiple backplanes" will be helpful, unless you intend to map certain hubs to certain backplanes for load distribution.

Exposing SQL Data to clients

We have an internal SQL Server 2008 R2 database that we'd like to expose (partially - only some tables) to our clients via the Internet, so they can feed their Excel reports. What are our best options? How should we provide security (i.e. should we create another, staging DB server in a DMZ for this)? As for the quantity of data to transfer, it's very small (< 100 records).
Here is one simple way to start if they need live, real-time access:
Create a custom SQL user account for web access, locked down with read-only access to the relevant tables or stored procedures.
Create a REST web service that connects to the database using the SQL Account above. Expose methods for each set of data that can be retrieved.
Make sure the web service runs over SSL (HTTPS) and requires username/password authentication - for example via BASIC auth with a custom hard-coded account per client.
Then, when the clients need to retrieve data, they can access a specific URL and receive data in CSV format or whatever is convenient for their reports. Also, REST web services are easily accessed via the XMLHTTP object if you have clients who are technically savvy and can write VBA macros.
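For example, a minimal ASP.NET Web API controller along those lines might look like the sketch below. The connection string, read-only account, and table are placeholders, and SSL plus BASIC authentication would be configured at the host or via a message handler rather than inside the controller:

```csharp
using System.Data.SqlClient;
using System.Net;
using System.Net.Http;
using System.Text;
using System.Web.Http;

public class SalesReportController : ApiController
{
    // Uses the locked-down, read-only SQL account described above (placeholder values).
    private const string ConnectionString =
        "Server=db01;Database=Reporting;User Id=excel_readonly;Password=...;";

    // GET api/salesreport/recent -> small CSV payload Excel can consume directly.
    [HttpGet]
    public HttpResponseMessage Recent()
    {
        var csv = new StringBuilder("OrderId,Customer,Total\r\n");

        using (var conn = new SqlConnection(ConnectionString))
        using (var cmd = new SqlCommand(
            "SELECT TOP 100 OrderId, Customer, Total FROM dbo.RecentOrders", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    csv.AppendFormat("{0},{1},{2}\r\n",
                        reader["OrderId"], reader["Customer"], reader["Total"]);
                }
            }
        }

        return new HttpResponseMessage(HttpStatusCode.OK)
        {
            Content = new StringContent(csv.ToString(), Encoding.UTF8, "text/csv")
        };
    }
}
```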
If the data is not needed in real time - for instance, if once a day is often enough - you could probably just generate .csv output files and host them somewhere the clients can download them manually through their web browser, for instance an FTP site or a simple IIS website with BASIC authentication.
If the data is not needed in real time, another alternative is to use SSIS or SSRS to export an Excel file and email it to your clients.

How to query SQL Server via REST to get XML

We have been using a web application framework to build apps that need to be able to query a SQL Server database and get the results as XML.
In the past, the framework provided that capability. But that capability is now deprecated.
So we were thinking, the framework allows us to easily query a REST service over HTTP, so why not use a SQL Server HTTP Endpoint. However, we then read that HTTP Endpoints are deprecated, as of SQL Server 2008. Not a platform on which to design an architecture for the future.
SQL Azure (formerly SQL Data Services) was going to offer similar services, but now only supports the TDS protocol, not HTTP. So no REST to be found in Azure.
The suggested alternative is to develop a custom app using WCF Data Services (formerly ADO.NET Data Services). But that would mean a whole additional app to develop, deploy, and maintain, presumably with its own authentication setup separate from SQL Server's, and its own source code repository... using a technology we have no experience with, therefore with its own pretty deep learning curve.
Can you suggest any other way to query a SQL Server database via REST/HTTP, that is not deprecated, and that would return results as XML?
Thanks for any help.
Read here: Creating an OData API for StackOverflow including XML and JSON in 30 minutes. Basically, the road forward is for REST to be offered by the app layer (WCF on top of EF, which provides the OData mapping). IMHO, straight HTTP access into the engine was a very bad idea to start with; nobody liked the HTTP endpoints of SQL Server 2005, and they were as misguided as it gets. One cannot map the HTTP error model, security, and type system onto SQL and expect smooth interoperability. Having the HTTP layer live in a dedicated app pushes the responsibility of handling the HTTP ecosystem onto a component specialized in that (WCF), and the logic of mapping the REST model to the DB model onto a component specialized in that job (EF).
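If a full OData/WCF Data Services stack feels like too much, the same app-layer idea can be sketched with something smaller. As an illustration only (a substitution, not what the answer above describes), a single ASP.NET Web API action can run a FOR XML query and return the result as application/xml; the connection string and table names below are placeholders:

```csharp
using System.Data.SqlClient;
using System.Net;
using System.Net.Http;
using System.Text;
using System.Web.Http;
using System.Xml.Linq;

public class CustomersXmlController : ApiController
{
    private const string ConnectionString =
        "Server=db01;Database=Sales;Trusted_Connection=True;";   // placeholder

    // GET api/customersxml -> XML produced by SQL Server itself via FOR XML.
    [HttpGet]
    public HttpResponseMessage Get()
    {
        string xml;
        using (var conn = new SqlConnection(ConnectionString))
        using (var cmd = new SqlCommand(
            "SELECT CustomerId, Name FROM dbo.Customers " +
            "FOR XML PATH('Customer'), ROOT('Customers')", conn))
        {
            conn.Open();
            // ExecuteXmlReader avoids the truncation ExecuteScalar can hit on large FOR XML results.
            using (var xmlReader = cmd.ExecuteXmlReader())
            {
                xml = XDocument.Load(xmlReader).ToString();
            }
        }

        return new HttpResponseMessage(HttpStatusCode.OK)
        {
            Content = new StringContent(xml, Encoding.UTF8, "application/xml")
        };
    }
}
```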
It sounds like you may be wedded to an MS stack, but if you're not, you can use restSQL in a Java EE container (Tomcat, WebLogic, etc.) on top of MySQL or PostgreSQL. restSQL has a full HTTP API with JSON or XML encoding. It offers two twists: updatable composite views and hierarchical composite views. The framework is extensible to other databases, and SQL Server support is a planned addition. Check out http://restsql.org.
Another option is something like Dreamfactory. They have a SOAP to REST solution that allows you to connect to any database or service. I have used their free hosted solution in the past for projects. They also have an open source solution available. The cool thing about the service is that they use Swagger 3.0 to create service definitions in a nice front-end solution so you can test and create new endpoints.
I have used the OpenAPI 3.0 definitions to connect to 3rd party SOAP and REST services as well. They also support stored procedures and server-side scripting in the SQL Server environments.
Anyways, might be another option for you.
