Performing Operations On Azure Database

I have a database hosted on Azure. I have an MVC 4 website where users log in and interact with the database. I need something that I can use to traverse a table in my database, check for certain conditions, and then make the required changes to my database. What framework or coding language could I use to achieve this? My hope is that I could have this script run continually or at certain time intervals.

There are multiple ways of achieving this, and they all depend on the scale/resources/etc. that this script needs.
You can code the job in whatever language you like (including batch, if you utilize sqlcmd or some such).
You can schedule the job to run anywhere you want, including a machine in your office, data center, etc.
You can utilize Aditi's newly released job scheduler (it's free, called Scheduler, and is available in the Azure app store).
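Whichever host you pick, the job itself can be small. Here is a minimal sketch in C# of a console job that checks a condition over a table and applies the required change; the table and column names (Orders, IsProcessed, DueDate) and the connection string are hypothetical placeholders. It could be scheduled with Windows Task Scheduler or the Azure scheduler mentioned above:

```csharp
using System;
using System.Data.SqlClient;

class SweepJob
{
    static void Main()
    {
        // Connection string for the Azure SQL database (placeholder values).
        var connStr = "Server=tcp:yourserver.database.windows.net;" +
                      "Database=yourdb;User ID=user;Password=pass;Encrypt=True;";

        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();

            // Check the condition and make the required change in one
            // set-based statement rather than traversing row by row.
            var cmd = new SqlCommand(
                "UPDATE Orders SET IsProcessed = 1 " +
                "WHERE IsProcessed = 0 AND DueDate <= @today", conn);
            cmd.Parameters.AddWithValue("@today", DateTime.UtcNow.Date);

            Console.WriteLine("{0} rows updated.", cmd.ExecuteNonQuery());
        }
    }
}
```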

Related

Application with SQL Server express and scheduling a backup everyday

I have an application that is used by more than one user, and backups need to be performed twice a day. I don't have SQL Server Agent, and I was wondering if I should create an exe that would run in the background.
I have read other posts about using the scheduler, but I would like this to be kept away from the end user, with a simple exe used instead, as I am looking to incorporate sending an email once the backup is complete.
Basically, what is the best approach for creating a backup on SQL Server, and how do other applications allow for this?
Update
Would it be a good idea to create a Windows service that checks the time and performs the backup when the time matches? I have never created a Windows service.
What are the positives and negatives?
Will this start on PC boot?
Any other suggestions?
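For reference, the backup itself is a single T-SQL statement that the proposed background exe or Windows service could issue. A minimal sketch in C# (instance name, database name, and backup path are placeholders; the email step is only indicated):

```csharp
using System;
using System.Data.SqlClient;

class BackupRunner
{
    static void Main()
    {
        // SQL Server Express instance and database name are placeholders.
        var connStr = @"Server=.\SQLEXPRESS;Database=master;Integrated Security=True;";
        var backupPath = @"C:\Backups\MyDb_" +
                         DateTime.Now.ToString("yyyyMMdd_HHmm") + ".bak";

        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand(
            "BACKUP DATABASE [MyDb] TO DISK = @path WITH INIT", conn))
        {
            cmd.Parameters.AddWithValue("@path", backupPath);
            cmd.CommandTimeout = 0; // backups can exceed the default 30s timeout
            conn.Open();
            cmd.ExecuteNonQuery();
        }

        // Once the backup succeeds, a confirmation email could be sent here,
        // e.g. with System.Net.Mail.SmtpClient.
    }
}
```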

Rebuilding an unstable tool from scratch (Currently Access based - can go anywhere)

I have inherited a custom built tool that is poorly designed and unstable, and I have a great opportunity to rebuild it from scratch. This is an internal tool only that works almost entirely in Access, and its purpose is to provide higher detail on parts that cost the company over a certain dollar amount.
How it works:
1) The raw data (new part numbers) gets pulled nightly from the EDW via macros in Access.
2) The same macros then join two tables (part numbers from one, names from another). Any part under a certain dollar amount is removed, and the new data is appended to the existing Access database.
3) During the day employees can then open a custom Access form to add more details about the part. Different questions are asked depending on the part category.
4) The completed form is forwarded to management, and the information entered is retained in the Access database – it does not write back to the EDW.
5) Managers can also pull some basic reports from the database, based on overall costs.
The problems:
1) Currently everyone has to have Access installed on their work stations, and whenever there is an update the new database gets pushed to their stations. This is not considered an ideal situation by management or IT.
2) If anyone accidentally leaves the tool open at the end of the day, the database is locked, so the macros cannot run and the tool cannot be updated with new part numbers.
3) If the tool cannot update for a few days in a row, the database can become corrupted. We can restore from the last good backup, but in the past this has resulted in the loss of multiple days of work.
Ideally we want to take the tool completely out of Access. I am building a SharePoint site that can host the tool, which (if I can get it right) will eliminate the need for Access, and database pushes, on end-user stations. However, the SharePoint form would need read/write access to the database.
The big question is: How do I build this?
I have a completely open path of possibilities – I can design it to work any way I want, using any tools or platform I want, as long as it works. It does not have to update automatically, as I already run a number of SQL scripts at the start of my day and adding one more is inconsequential.
The resources I have at my immediate disposal are: SharePoint (with designer), Access, Toad, and SQL Server. The database can be hosted on a shared network drive.
I am a recent college graduate with basic SQL knowledge. I have about a year to produce a final product, but would like to get it up and running far sooner if possible.
Any advice on what direction to pursue would be very helpful, thank you.
Caveat: I've never worked with SQL Server, so I don't know all of its capabilities (I'm an Oracle developer).
What I'd do in your situation is something like the following (although not necessarily in this exact order):
1) Get a SQL Server database set up to host your tables.
2) Create the tables etc.
3) Migrate test data across (I'm assuming you have a dev/uat/test environment for your current system! If you haven't, make sure you set up at least a test environment separate from prod for your new db!)
4) Write stored procs to do the work of adding new parts, updating existing data, etc.
5) Set up an automated job on the db (I'm assuming SQL Server can do this!) to do the overnight processing.
6) Create a separate db user with the necessary permissions to call the stored procedures.
7) Get your frontend to call the stored procs with the relevant parameters, using the db user you created in step 6 to connect to the db (a sketch follows this list).
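As an illustration of step 7, here is a minimal sketch of calling a stored procedure from .NET as the restricted user from step 6; the proc name, parameters, and connection details are all hypothetical:

```csharp
using System.Data;
using System.Data.SqlClient;

class PartDetailClient
{
    static void Main()
    {
        // Connect as the restricted db user created in step 6 (placeholders).
        using (var conn = new SqlConnection(
            "Server=myserver;Database=PartsDb;User ID=parts_app;Password=...;"))
        // Hypothetical proc that records extra detail for a part.
        using (var cmd = new SqlCommand("dbo.AddPartDetail", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@PartNumber", "ABC-123");
            cmd.Parameters.AddWithValue("@Detail", "Vendor quote attached");

            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```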
You'd also have to think about transaction control to try to mitigate the case where users go home at the end of the day without committing their work - does the db handle the commits/rollbacks, or does SharePoint?
Once you've worked out everything in your test environment, it's then a case of creating the prod db, users and objects, and then working out the best way of migrating the prod data across.
Good luck.
Don't forget to get backups for the new db set up as well.

Win Form Application Runs a Method Everyday Even If It is Not Running

I have a VB .Net Windows Forms application with MS SQL Server for the database part of it. I need to run a method which, depending on some date-sensitive data in the database, may or may not create a notification email to be sent to one or more recipients. This application may not be used every day, so ideally I don't want that method to be bound to, say, the Form Load of the main Form. How can this be achieved?
You probably don't want that logic in your client application. Three ways come to mind:
put the logic in SQL Server and create a job that is scheduled to run every day
create a small utility application and schedule it to run once each day
create a Windows service that runs all the time and handles these jobs for you
If all the data necessary to make the determination of whether or not to send the notification e-mails is available in the database, and you have access to create a job on the SQL server, I would recommend that route.
However if there are external components that you need in order to make the determinations or to send the e-mails then either approach 2 or 3 will be the way to go. Creating an application and scheduling it to run each day would be easier to implement but a service has the benefit of not requiring an interactive logon session (i.e. doesn't need a user to be actively logged in on the computer) which is preferable on a server.
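A minimal sketch of option 2, a small console utility that Task Scheduler runs once a day; the table, columns, and SMTP details are placeholders:

```csharp
using System;
using System.Data.SqlClient;
using System.Net.Mail;

class NotifyJob
{
    static void Main()
    {
        using (var conn = new SqlConnection(
            "Server=myserver;Database=MyDb;Integrated Security=True;"))
        {
            conn.Open();

            // Hypothetical query: items due today that still need a notification.
            var cmd = new SqlCommand(
                "SELECT RecipientEmail FROM Reminders " +
                "WHERE DueDate = @today AND Notified = 0", conn);
            cmd.Parameters.AddWithValue("@today", DateTime.Today);

            using (var reader = cmd.ExecuteReader())
            using (var smtp = new SmtpClient("smtp.example.com"))
            {
                while (reader.Read())
                {
                    smtp.Send("noreply@example.com", reader.GetString(0),
                              "Reminder", "An item in the system is due today.");
                }
            }
            // A follow-up UPDATE setting Notified = 1 would go here.
        }
    }
}
```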

Azure VM availability, mirroring

Apologies for the noob question, I've never dealt with failover before.
Currently we have a single hardware server running Windows Server, SQL Server, ASP.NET and a single (very large) web application. We are considering migrating this to an Azure VM.
I see in the SLA that Microsoft will only guarantee 99.95% availability if I am running more than one instance of an Azure VM, to allow for failure and reboots etc.
Does this mean I therefore would have two servers to manage and maintain? For example, two versions of SQL with a database on each, and two sets of ASP.NET application files? If correct, this puts the price up dramatically.
I assume there is no way to 'mirror' one server across to the other to reduce this workload?
Also, our hardware server has 25,000 uploaded files on it. Would we need to put these on a VHD then 'link' them to whichever live server was running, or does Azure do this automatically? Or do they have to be mirrored from the live server to the failover server?
Any pointers would be appreciated. I've already read all the Azure documentation but it hasn't really made things much clearer...
It seems like there are several topics to look at here.
Let's start with the database. The easiest option would be to migrate your SQL Server database to SQL Azure. Then you would not need to maintain it, nor the machines it would otherwise run on.
This also gives you the advantage that this central component can be used by one or many applications.
Second are your uploaded files. I assume your application lets users upload files for sharing or something similar. The best approach would be to write these files to Windows Azure blob storage. This often means rewriting a connector, but it centralizes another component.
As a first step you could make the files available so clients can download them via a link; if not, your application could fetch them from storage and deliver them to the customer.
If you don't want to rewrite your component, you would have to use a VHD. A VHD can only have one lease, so only one instance can use it at a time. A common pattern I have seen is that when the application starts, it tries to acquire ("recover") the lease, trial-and-error style.
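A minimal sketch of writing an uploaded file to blob storage with the classic Microsoft.WindowsAzure.Storage client (account credentials, container name, and paths are placeholders; the exact UploadFromFile signature varies between SDK versions):

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class UploadExample
{
    static void Main()
    {
        // Storage account connection string (placeholder values).
        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=...");
        var client = account.CreateCloudBlobClient();

        // One container for all user uploads; created on first use.
        var container = client.GetContainerReference("uploads");
        container.CreateIfNotExists();

        // Upload the file; the blob URI can be handed out as the download link.
        var blob = container.GetBlockBlobReference("report.pdf");
        blob.UploadFromFile(@"C:\uploads\report.pdf");
        System.Console.WriteLine(blob.Uri);
    }
}
```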
Last but not least, your ASP.NET application. Here I would look into cloud service instances rather than VMs, because with VMs you have to do all the management yourself; VMs are IaaS. A .NET application should be easy to convert and deploy as instances.
Then you don't have to think about failover and so on: just deploy 2 instances and the load balancer will do the rest.
If you are able to "outsource" the SQL Server, you can get by with smaller machines for the ASP.NET application. Try to scale out rather than scale up; that is, use more smaller nodes rather than one big one (if possible).
If you really do go the VM route, you have to manage all of this yourself, and yes, then you need 2 VMs. You may even need 3 VMs, because there is no auto-load-balancer, and if you only have 2, just one machine can expose port 80.
HTH

Scheduling tasks Advice? .Net, SQL Job?

I am creating a system where users can set up mailings to go out at specific times. Before I begin, I wanted to get some advice. First, is there already a .Net component that will handle scheduling jobs (either running another application or calling a URL) that will do what I am suggesting (open source would be cool)? If there isn't, is it better to schedule a job in SQL and run some sort of script, create a .Net service that will look at an XML file or db for schedules, or have an application create scheduled tasks? There could be a ton of tasks, so I am thinking creating scheduled tasks or SQL jobs might not be a good idea.
Here is a typical scenario: a user wants to send a newsletter to their clients. The user creates the newsletter on a Saturday, but doesn't want it to go out until Monday. The user wants that same e-mail to go out every Monday for a month.
Thanks for looking!
Check out Quartz.NET
Quartz.NET is a full-featured, open source job scheduling system that can be used from smallest apps to large scale enterprise systems.
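For the Monday-newsletter scenario above, a minimal sketch against the Quartz.NET 2.x-style API; the job class and cron expression are illustrative:

```csharp
using Quartz;
using Quartz.Impl;

public class NewsletterJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // Load the pending mailings from the database and send them.
    }
}

class Program
{
    static void Main()
    {
        var scheduler = StdSchedulerFactory.GetDefaultScheduler();
        scheduler.Start();

        var job = JobBuilder.Create<NewsletterJob>()
            .WithIdentity("weekly-newsletter")
            .Build();

        // Fire every Monday at 08:00; each user-defined mailing would get
        // its own trigger (and an end date after a month, per the scenario).
        var trigger = TriggerBuilder.Create()
            .WithCronSchedule("0 0 8 ? * MON")
            .Build();

        scheduler.ScheduleJob(job, trigger);
    }
}
```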
If you want to use the readily available services in Windows itself, check out the article A New Task Scheduler Task Library on CodeProject on how to create scheduled tasks in Windows from your C# application.
You probably have more flexibility and power if you use C# and scheduled tasks in Windows, rather than limiting yourself to what can be done in SQL Server. SQL Server Agent Jobs are great - for database specific stuff, mostly - maintenance plans and so forth.
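If you go the scheduled-task route without a library, the task can also be registered from C# by shelling out to the built-in schtasks.exe; the task name and program path below are placeholders:

```csharp
using System.Diagnostics;

class RegisterTask
{
    static void Main()
    {
        // Create a task that runs the mailer every day at 08:00.
        Process.Start("schtasks",
            "/Create /SC DAILY /ST 08:00 /TN \"SendMailings\" " +
            "/TR \"C:\\apps\\Mailer.exe\"");
    }
}
```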
You can build your own Windows service that schedules and executes jobs. Be sure to make good abstractions. In a similar project, I used an abstraction where scheduling items are modeled as jobs composed of tasks. For example, sending a newsletter may be a job, whereas sending the newsletter to each subscriber can be considered a task. Then you need to run the jobs and tasks in defined threading models, preferably using thread-pool threads or the Task Parallel Library. Be sure to use asynchronous APIs for IO whenever possible. Also separate your scheduling logic from the abstractions, so that the scheduling logic can execute arbitrary types of jobs and their constituent tasks (see the sketch below).
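A minimal sketch of that job/task abstraction using the Task Parallel Library; all names are illustrative:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// The scheduler only knows about IJob; a job is composed of tasks.
public interface IJob
{
    IEnumerable<Func<Task>> GetTasks();
}

// Sending the newsletter is the job; each subscriber is one task.
public class NewsletterJob : IJob
{
    private readonly string[] subscribers = { "a@example.com", "b@example.com" };

    public IEnumerable<Func<Task>> GetTasks()
    {
        foreach (var email in subscribers)
        {
            var recipient = email; // avoid capturing the loop variable
            yield return () => SendAsync(recipient);
        }
    }

    private Task SendAsync(string recipient)
    {
        Console.WriteLine("Sending to " + recipient);
        return Task.FromResult(0); // placeholder for real asynchronous SMTP IO
    }
}

public static class JobRunner
{
    // Generic scheduling logic: runs any job's tasks in parallel.
    public static Task RunAsync(IJob job)
    {
        var running = new List<Task>();
        foreach (var start in job.GetTasks())
            running.Add(start());
        return Task.WhenAll(running);
    }
}
```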
