AppFabric without SQL Server whatsoever - sql-server

I have a VPS with limited memory, and my WCF service is hosted using AppFabric.
Since memory is limited and I am not using SQL Server for anything other than the AppFabric prerequisite, I am thinking about uninstalling SQL Server (the instance can eat up to 200 MB of memory at times). I am not using any database-related features of AppFabric such as the dashboard or caching. I do like the IIS extensions and the simplicity they bring to managing the WCF service, however, and I believe those do not actually require SQL Server.
I am unable to just try it out, so I wonder if someone has experience with this or can predict how uninstalling SQL Server would affect AppFabric's behaviour.

Instead of uninstalling SQL Server, you could just stop the SQL Server service and set it to manual startup.
That way, if you need SQL Server in the future, you can simply start the service again.
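For reference, stopping the service can also be scripted; here is a minimal C# sketch, assuming the default instance's service name MSSQLSERVER (a named instance would be MSSQL$INSTANCENAME). Switching the startup type to Manual is done separately, in the Services console or with "sc config MSSQLSERVER start= demand".

using System;
using System.ServiceProcess;   // add a reference to System.ServiceProcess.dll

class StopSqlServer
{
    static void Main()
    {
        // Assumes the default instance; a named instance would be "MSSQL$INSTANCENAME".
        using (var sql = new ServiceController("MSSQLSERVER"))
        {
            if (sql.Status != ServiceControllerStatus.Stopped)
            {
                sql.Stop();
                sql.WaitForStatus(ServiceControllerStatus.Stopped, TimeSpan.FromMinutes(2));
            }
        }
    }
}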

As Shiraz Bhajiji alludes to, if you are using SQL Server as the configuration store, you will need to reconfigure AppFabric to use file-based configuration instead. It sounds like you are only running a single AppFabric instance, but if you were running (or needed to run) multiple instances, the configuration file would have to be accessible to all of them.
Again, this isn't necessarily relevant to you, but if you do have multiple AppFabric instances, the SQL Server configuration option is the much more robust approach. With the file-based approach, if you configure things incorrectly, one AppFabric node going down can take down the entire cluster. The SQL Server approach does represent a single point of failure, but if you are using clustering or similar you can easily mitigate that. I appreciate I'm getting a little off topic here.

Related

One ASP.NET Web API Application, Multiple SignalR Backplanes (SQL Server databases)

Is it possible to create multiple backplanes within SignalR?
We're working on an ASP.NET Web API SaaS application and are looking to implement SignalR for "real-time" web functionality. Since we'll be hosting the application in a web farm, client connection state will be managed through a SQL Server backplane.
The application is multi-tenant, but the database is not: the application determines which connection string to use, and all client requests talk to their appropriate database. The code for configuring the SignalR SQL Server backplane within Application_Start() is currently:
GlobalHost.DependencyResolver.UseSqlServer(connectionString);
Does anyone know if it's possible to create multiple backplanes with SignalR, basically looping through each connection string and calling the above code?
Thanks for checking this out!
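For clarity, the loop described in the question would amount to something like the sketch below, where tenantConnectionStrings is a hypothetical collection of per-tenant connection strings. Note that each call to UseSqlServer registers a message bus with the same global resolver, so the last registration most likely wins rather than producing one backplane per tenant.

// Hypothetical: tenantConnectionStrings holds one connection string per tenant database.
foreach (var connectionString in tenantConnectionStrings)
{
    // Each call replaces the previously registered SQL Server message bus,
    // so this loop most likely does not yield multiple active backplanes.
    GlobalHost.DependencyResolver.UseSqlServer(connectionString);
}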
If you need to eliminate the single point of failure, I suggest setting up a failover server in case the primary SQL Server machine goes down. Reference: http://technet.microsoft.com/en-us/library/hh231721.aspx
If you simply need more performance than a single SQL Server instance can provide, I suggest using Redis as the backplane.
In either case, I doubt attempting to use "multiple backplanes" will be helpful, unless you intend to map certain hubs to certain backplanes for load distribution.
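If the Redis route is taken, the switch is a one-line change in Application_Start(); a sketch, assuming the Microsoft.AspNet.SignalR.Redis package and placeholder connection details:

// Placeholders: replace the host, port, password and event key with real values.
GlobalHost.DependencyResolver.UseRedis("redis-host", 6379, "password", "MyApp");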

Are there alternatives to ODBC for MS Access/SQL Server Connection?

My question is this: Are there alternatives to ODBC that would allow us to connect our SQL Server to MS Access?
Here's the situation: my company works with a proprietary SQL database (ProVenue) whose vendor up and decided to "no longer support ODBC" connections to MS Access, our front-end tool, without telling us.
We are currently migrating away from ProVenue, but in the meantime we're stuck with a vendor that "no longer supports" our ODBC connection(s). The vendor also has no incentive to help, since we're leaving in several months.
I've devised a workaround where I manually export the ProVenue tables (as ASCII), proof them (yes, the export utility pulls data unreliably), convert them, and upload them into Access on a daily basis. That said, it is unreasonably time-consuming given the number of tables; this workaround could be a full-time job.
Do you know of any alternatives?
Do NOT consider using ADP. It has been dropped from Access 2013 and hence is a technology with no future.
From what you're saying, you don't "own" your own MSSQL database - you're simply connecting to an instance that the provider manages, correct? I would guess that they've disabled ODBC connections to MSSQL because they don't like the load placed on their servers and/or that they've decided they want to change some underlying structures and don't want to have to cope with anybody whining about those changes.
That said, do they allow direct MSSQL connections, via SQL Management Studio for example? If so, you should be able to define an export and import process that is less buggy than theirs, and simply re-point your Access database at the local copy of the data. True, this would still require some (possibly automated) import process, so you'd be out of sync with the server, but it would give you a workable solution.
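For example, if that local copy lives in a SQL Server database that Access links to, the nightly import could be scripted rather than done by hand. A rough sketch, assuming a tab-delimited export file and a pre-created staging table with character columns; all names, paths and the connection string are placeholders:

using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

class NightlyImport
{
    static void Main()
    {
        // Placeholders: adjust the path, delimiter, connection string and table name.
        const string exportFile = @"C:\exports\customers.txt";
        const string connectionString = @"Data Source=.;Initial Catalog=LocalCopy;Integrated Security=True";

        // Load the delimited export into a DataTable (every column is read as text,
        // so the staging table is assumed to use character columns).
        var table = new DataTable();
        using (var reader = new StreamReader(exportFile))
        {
            foreach (var column in reader.ReadLine().Split('\t'))   // header row
                table.Columns.Add(column);

            string line;
            while ((line = reader.ReadLine()) != null)
                table.Rows.Add(line.Split('\t'));
        }

        // Bulk-load the rows into the local staging table.
        using (var bulk = new SqlBulkCopy(connectionString))
        {
            bulk.DestinationTableName = "dbo.CustomersStaging";
            bulk.WriteToServer(table);
        }
    }
}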
You might try connecting an .adp file to the server, to see if they'll still let you access things in that manner. That would possibly require significant modifications to your Access solution, but would also be a bit easier on their servers than linked tables via ODBC.
You could have a look at Access Data Projects (ADP) which are tied directly to one SQL Server database. I don't think they use ODBC at all, but they have their own limitations, and of course, aren't available in older Access versions.

Entity Framework, no SQL Server, what do I do?

Is there seriously no way of using a shared-access, non-server-driven database file format without having to use SQL Server? Entity Framework is great, but it wasn't until I had completely finished designing my database model and got SQL Server Compact Edition 4.0 working with Visual Studio that I found out it basically cannot be run off a network drive and used by multiple users. I appreciate I should have done some research!
The only other option, as far as I can tell, is to set up a SQL Server, something I doubt I would be able to do. I'm searching for possible ways to use Entity Framework with Access databases (which can be shared on a network drive), but this seems either difficult or impossible.
Would I have to go back to typed DataSets, or even hand-writing the SQL?
Another alternative is to try using SQL
Install SQL Server Express. Access is not supported by EF at all, and my experience with file-based databases (Access, SQL Server CE) is essentially this:
If you need some very small, mostly read-only data to persist in a database, they are fine (good for code tables, although such data can just as easily be stored in XML).
If you expect concurrent traffic, frequent writes to the database and larger data sets, their performance and usability drop quickly. They are mostly useful as local storage for a single user.
I'm not sure how this relates to, for example, SQLite. To generate a database from the model for SQLite you need a special T4 template (one that uses the correct SQL syntax).
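To illustrate the SQL Server Express suggestion, here is a minimal code-first sketch (code-first only for brevity; the same idea applies to a designed model). It assumes a local .\SQLEXPRESS instance and placeholder entity, context and database names:

using System.Data.Entity;   // EntityFramework package

// Placeholder entity.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// The connection string points at a shared SQL Server Express instance
// instead of a Compact/Access file sitting on a network drive.
public class ShopContext : DbContext
{
    public ShopContext()
        : base(@"Data Source=.\SQLEXPRESS;Initial Catalog=Shop;Integrated Security=True")
    {
    }

    public DbSet<Customer> Customers { get; set; }
}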
Have you tried SQLite? It has an ADO.NET provider, and as far as I know EF supports any compatible provider. Since it's file-based, that might be a plausible solution. It's also free.

Using SQL Server for WSS 3.0 instead of Windows Internal Database

There are actually two related questions:
Is it possible or advisable to use a full-blown, stand-alone SQL Server for SharePoint Services (WSS 3.0) instead of the supplied Windows Internal Database it comes with? The client I am working for is asking to utilize their existing SQL Server for all WSS content databases, to potentially minimize admin effort and improve performance.
Also, would you advise installing WSS on one physical server and the content database on another? Is there any gain in performance? Practicality? etc. By default, WSS and all of its databases are installed on the same single server. We don't really need a MOSS farm setup, because the WSS capabilities are enough for our needs.
Thanks,
Val
Yes. When you run the installation, choose the "Web Front End" option; it will then prompt you to select a location for the SQL database. Just point it at whichever server you want.
I would definitely recommend putting it on a non-SQL Express instance. The Express edition only scales to 4 GB, limits the maximum number of connections, etc. If your client is going to do much with it at all, you will eventually hit that limit. Full-blown SQL Server has other advantages too, such as better backup support.
Yes and yes.
Keeping the SQL and WSS servers separate saves resources on both, and neither is a lightweight application. It also allows you to easily move to a clustered/distributed environment if your usage increases, as well as follow a least-privilege principle, keep product patches separate, etc.
As an addendum, you say you don't need a MOSS farm because WSS fits your needs, but be aware that it's just as easy to set up a distributed WSS environment as a MOSS one; MOSS only adds capabilities to the application. It's usually a good idea to have at least two WFEs (web front ends) in the farm, if for nothing else than redundancy in case of failure.
Yes, you can use a 'full-blown' SQL Server instead of the free and limited SQL Server Express that is delivered with Windows SharePoint Services 3.0 (WSS 3.0).
It's even better to separate the database from the actual website: it's more scalable (if you upgrade to MOSS), easier to manage, and carries fewer security risks.

Microsoft Sync Services - good solution for me?

We upload sales transactions from our stores to the headoffice server. At the moment we use DTS (SQL Server Data Transformation Services), but we're planning on replacing that with Microsoft Sync Services for ADO.NET, as this seems to be Microsoft's preferred solution for this type of setup and we want to follow the standard (which will hopefully be around for a long time).
Here are the details of our setup and what we’re planning. I’m looking for some advice, especially about whether Sync Services fits into our solution.
Situation
Each store has a 3rd party EPOS system which stores sales in a Microsoft Access 2000 database, which we can access. Our headoffice database is SQL Server 2005, but will be upgraded to 2008. The headoffice is not on a VPN with all the stores, but we can open up our firewall to the stores’ IP addresses, so that they can send data directly to SQL Server. The stores are always connected to the internet via ADSL, although they do lose connection and we don’t want to lose sales data.
We are only uploading transactions from the store – definitions do not need to be downloaded.
Current solution
We have written a Windows service that runs on the store PC. This service downloads a DTS package from the server (which contains all the details of the upload) and runs it in the store – and this will upload sales to our server.
We chose DTS, because it is free when you install MSDE. We can’t use SSIS, because that would require a SQL Server licence at every store.
Another reason we chose DTS is that the details of the upload (i.e. which tables and fields to include) are stored on our headoffice server, so if we need to change things we can do that centrally and don't need to install anything new at the stores. This isn't a showstopper, but it would be nice to have this ability in our new solution.
Potential solution - Microsoft Sync services for ADO.NET
We are currently building a proof of concept with Microsoft Sync services for ADO.NET. The idea is to put SQL CE (SQL Server Compact 3.5) in each store (client) and sync that to the headoffice SQL Server 2005 database (server). We’ll get the data into the SQL CE database either by (1) syncing it with the Access 2000 database or (2) getting the EPOS system developers to write sales straight into the SQL CE database – probably (2). But our main concern is getting the data from the store to the headoffice server. This method seems to be Microsoft’s preferred solution for occasionally connected systems and that is what made us look seriously at Sync Services.
I’m hoping that using this will mean that most of the work needed to upload the sales will be built into Sync Services and we won’t have to re-invent the wheel.
Potential solution - Upload to a custom webservice
There is also the possibility of uploading the sales transactions to a custom web service on our headoffice server and then into our SQL Server database. This means that we would have to build our own mechanism for determining which rows are new, as well as caching for when the systems are disconnected. We might also be missing out on other functionality that comes built into Sync Services.
Please let me know if you have any advice that will help, especially on the question "Is Sync Services the right solution?". The problem we are trying to solve seems very generic (uploading sales from stores), and I'd like to solve it with a generic solution.
Microsoft Sync Services is more than you need, but it will certainly do what you want, and it was built with your type of application in mind.
As with most new technologies out of Microsoft (caution: generalization!), you may find that it's not as mature as you might like. It'll do what you need it to, but you may run into issues that aren't easily resolved because it hasn't been put through the wringer. As an early adopter, though, you may find that the Sync developers are eager to help you out when you get stuck, so this isn't as big a problem as it might seem.
Make sure you read through all the literature on it, some of which is here, or linked in the following sites:
http://msdn.microsoft.com/en-us/sync/default.aspx
http://msdn.microsoft.com/en-us/sync/bb887608.aspx
http://en.wikipedia.org/wiki/Microsoft_Sync_Framework
Given your one-way flow of information and centralized layout, though, I expect you should have few, if any, issues setting it up and using it.
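To give a feel for the client side of an upload-only setup, here is a rough sketch, assuming Sync Services for ADO.NET with a SQL Server Compact client; the server-side provider and its change-tracking commands are deliberately omitted, and all names are placeholders:

using Microsoft.Synchronization.Data;              // Sync Services for ADO.NET
using Microsoft.Synchronization.Data.SqlServerCe;

class StoreUploader
{
    static void UploadSales(string sqlCeConnectionString, ServerSyncProvider headOfficeProvider)
    {
        var agent = new SyncAgent();

        // Local SQL CE database at the store.
        agent.LocalProvider = new SqlCeClientSyncProvider(sqlCeConnectionString);

        // Headoffice side: a DbServerSyncProvider (or a ServerSyncProviderProxy over
        // a web service) configured elsewhere with its change-tracking commands.
        agent.RemoteProvider = headOfficeProvider;

        // Only push sales up to the server; nothing is downloaded to the store.
        var sales = new SyncTable("Sales") { SyncDirection = SyncDirection.UploadOnly };
        agent.Configuration.SyncTables.Add(sales);

        SyncStatistics stats = agent.Synchronize();
    }
}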
Be sure to report your experience back here!
-Adam
