I think this should be simple but I cannot really find a way to do it. I have some devs connecting to a remote SQL Server 2017 using SQL Server Management Studio. Sometimes they need to run queries that take several hours to complete.
However, their laptops go to sleep if they leave them overnight, which sometimes breaks the process. I do not want them to RDP to the server. Is there a way (without server/SQL admin intervention) for their queries to keep running on the server even when the local laptops sleep or disconnect?
I know they could just configure their laptops not to go to sleep. However, the sleep setting is enforced by a company security policy, and it is a huge pain in the butt to get a laptop excluded from it.
Any suggestion is welcome.
Thanks!
What kind of database is it? For example, if it is an MS SQL Server instance, you can use SQL Server Agent, which runs scheduled jobs (with steps). Is it Azure SQL? Then use an Elastic Job. Figure that out first before you waste your time looking into a tool that is not right for your database.
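For the SQL Server Agent route, the whole thing can be done in T-SQL through the stored procedures in msdb, so nobody has to RDP to the server; devs who are members of the SQLAgentUserRole role in msdb can create and start their own jobs. A minimal sketch, where the job name, the ReportsDB database and the dbo.LongRunningLoad procedure are all placeholders:

    USE msdb;
    GO
    -- Create a one-step job that runs the long query on the server itself
    EXEC dbo.sp_add_job
         @job_name = N'Overnight long query';
    EXEC dbo.sp_add_jobstep
         @job_name      = N'Overnight long query',
         @step_name     = N'Run query',
         @subsystem     = N'TSQL',
         @database_name = N'ReportsDB',                -- placeholder database
         @command       = N'EXEC dbo.LongRunningLoad;'; -- placeholder procedure
    EXEC dbo.sp_add_jobserver
         @job_name    = N'Overnight long query',
         @server_name = N'(LOCAL)';
    -- Start it; the job keeps running on the server even if the laptop sleeps
    EXEC dbo.sp_start_job @job_name = N'Overnight long query';

Once sp_start_job has been called, the work runs under the Agent service on the server, and the SSMS session on the laptop can disconnect.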
Related
So I'm using SQL Server Express as the Azure DevOps database, and there's a 10 GB limit which I didn't know about and which has now been reached. The obvious solution is to use a full SQL Server edition instead - I know.
However, for now (this is an emergency) I need a quick and dirty solution, so I deleted a bunch of stuff in Azure DevOps, but that has had no effect. Is there a way in SQL Server Express to force the cleanup job that presumably runs once a day, so that some disk space is freed?
At work we load data into a SQL Server 2012 database and create .bak files that are exported. Yes, that is correct: due to compatibility issues, we need to use SQL Server 2012.
This process, which runs for roughly 3-4 hours per day, currently runs on an on-premises machine, but we want to move it to Azure.
However, SQL databases in Azure are v2017+, but I have read that it's possible to run SQL Server 2012 in a Docker container. Before I invest a lot of time into this idea, has anyone tried to host an old SQL Server version in a Docker container in Azure?
As said, use a VM. Microsoft maintains VM images all the way back to SQL Server 2008, plus you can integrate backup and automatic updates for the OS and SQL Server. The images are listed here, and you can pay as you go or bring your own license:
https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/sql-server-on-azure-vm-iaas-what-is-overview
In the pay-as-you-go model you can shut down (deallocate) the VM (i.e. the VM itself, not just the OS) and you won't get charged for the VM or the SQL Server license. You will still get charged for storage. See here:
https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/pricing-guidance#pay-per-usage
I'm currently working on a project proposal which would require moving multiple Access databases into a new MS SQL Server database. The idea is to keep the front end program as MS Access so that the users are familiar with the process of inputting data and creating reports.
However, things get complicated because the areas where the survey will be collected have poor internet connectivity, and the connection will be out from time to time. I have thought of a few ways of solving this issue, but all of them are cumbersome:
1) Having a PC with a router that stores an offline copy of the SQL Server database; the data entry PCs connect to that PC through the router, and the PC with the offline database then backs the data up to the server whenever it has an internet connection.
2) Adding the data to MS Access databases that can then be merged into the SQL Server database at set intervals (this would probably cause some issues).
We've done option 1 before for similar projects but never for connecting to an SQL Server database in offline mode. However, it seems feasible.
My question is: does anyone know of a way of using Access as a front-end application for SQL Server while still being able to update data during periods without internet connectivity? The SQL Server database would automatically assign primary keys, so duplicate key values shouldn't be an issue while syncing the data.
Thanks for your help. I've been having a hard time finding an answer on Google, and syncing databases is complicated at the best of times. I'm really just looking for a starting point to see if there are easier ways of accomplishing this.
I would run the free edition of SQL Server Express on all the laptops, so the Access database would be the front end to the local copy of SQL Express. SQL Express can be a subscriber to the "main" SQL database, so you use SQL Server replication to sync those local copies of SQL Server to the master server. Of course, the main SQL Server can't be the free edition: to publish the database for replication you can't use the free edition, but the free editions can certainly be used as subscribers.
This approach eliminates the need to build or write special sync software for the Access application. You do a traditional migration of the Access back end (data tables) to SQL Server, and then simply run the Access application locally with SQL Express installed on each laptop. You then fire off a sync to the main edition of SQL Server when the laptops are back at the office.
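Merge replication of this sort is configured with system stored procedures. A very rough sketch of the publisher side and one laptop's pull subscription, where all the names (SurveyDB, SurveyPub, Responses, MAINSQL) are placeholders, and leaving out the distributor and snapshot/merge agent configuration that a real deployment also needs:

    -- On the main (publisher) server, which cannot be the free edition:
    EXEC sp_replicationdboption
         @dbname  = N'SurveyDB',          -- placeholder database
         @optname = N'merge publish',
         @value   = N'true';
    EXEC sp_addmergepublication
         @publication = N'SurveyPub',     -- placeholder publication name
         @retention   = 14;               -- days a laptop may stay offline before expiring
    EXEC sp_addmergearticle
         @publication   = N'SurveyPub',
         @article       = N'Responses',
         @source_object = N'Responses';   -- placeholder table

    -- On each laptop's SQL Express instance: a pull subscription to the publication
    EXEC sp_addmergepullsubscription
         @publication  = N'SurveyPub',
         @publisher    = N'MAINSQL',      -- placeholder publisher server name
         @publisher_db = N'SurveyDB';

When a laptop is back on the office network, running the merge agent for its subscription pushes the local changes up and pulls everyone else's changes down.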
The other possibility would be to adopt and use the Microsoft Sync Framework. This would also allow syncing and would eliminate the need to run SQL Express on each machine. I still think the least amount of effort is to sync the local copies of SQL Express with the main edition of SQL Server running at the office (but that office edition of SQL Server can't be a free edition).
Does the SQL Server Agent affect the performance of SQL Server?
Do I need to stop it to improve SQL Server performance?
If yes, please give me more details.
SQL Server Agent is a service which executes any jobs you have configured on the server. It will not inherently affect performance by itself, as it sits idle when no jobs are running.
When it runs a job, the performance hit will be the same as if you ran the commands in the job yourself.
If you were to disable it, you wouldn't be able to run any jobs. Probably not a great idea.
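If you want to check whether the Agent is actually doing anything on your instance, msdb keeps the job list and run history. A rough sketch (the encodings in msdb are a bit odd: run_date is YYYYMMDD and run_duration is HHMMSS, both stored as ints):

    -- List configured Agent jobs and their most recent outcomes
    SELECT  j.name,
            j.enabled,
            h.run_date,        -- int, YYYYMMDD
            h.run_duration,    -- int, HHMMSS
            h.run_status       -- 1 = succeeded, 0 = failed
    FROM    msdb.dbo.sysjobs AS j
    LEFT JOIN msdb.dbo.sysjobhistory AS h
           ON h.job_id = j.job_id
          AND h.step_id = 0    -- step 0 is the overall job outcome row
    ORDER BY j.name, h.run_date DESC, h.run_time DESC;

If nothing shows up here, the Agent has no work to do and stopping it buys you essentially nothing.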
We have migrated our production environment to Azure (using SQL Azure instead of SQL Server).
Our local development environment uses SQL Server. We write change scripts when database changes need to occur during a release.
PROBLEM
The issue now is that some T-SQL commands/statements/keywords don't work when run on SQL Azure. This is constantly disrupting our release process.
We are educating everyone to stick to the subset of T-SQL that SQL Azure supports, but these problems continue to crop up.
Is it not possible for us to validate our SQL scripts as 'Azure SQL compatible', for example by parsing them, before we run them?
Thanks
If you use the new "SQL Server Database Project" type, there is a feature in Visual Studio that allows you to set the "Target Platform" to SQL Azure. This lets you build all your T-SQL scripts and check them for SQL Azure compatibility. Any incompatibilities will raise build errors, and the project can produce a bacpac/dacpac/T-SQL script that is compatible with SQL Azure.
In order to take advantage of this, you have to manage your code as a "SQL Server Database Project" in Visual Studio.
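As a concrete illustration (my example, not from the post): with the target platform set to SQL Server 2017 the script below should build cleanly, but with the target platform set to Microsoft Azure SQL Database it should be flagged at build time, because user-defined filegroups are not supported there:

    -- [ARCHIVE_FG] is a hypothetical user-defined filegroup;
    -- the ON clause is what the Azure target platform rejects
    CREATE TABLE dbo.OrderArchive
    (
        OrderId   int          NOT NULL PRIMARY KEY,
        CreatedAt datetime2(0) NOT NULL
    )
    ON [ARCHIVE_FG];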
You are not going to like this answer... but the best bet is to move your development environment to SQL Azure. We started down the same path, and you will just have a constant battle. We used Redgate's SQL Compare tools; I don't believe they will solve your problem, but they may be worth a shot as they are constantly getting better.
Even if you have multiple devs and they each need their own SQL database, I would still recommend getting everyone signed up with Azure and paying the cost of it. Thankfully the databases are not that expensive to run, and you might fit under the free tier. It is still cheaper for us to pay for the SQL Azure dev databases than to deal with the pain and wasted time at release.