Detecting vandalism in SQL Server [closed]

Over the past several months I've seen quite a few unexpected bugs popping up in a legacy application, most of which are related to inconsistencies between the application code (classic ASP) and the underlying SQL Server database.
For example, a user reported a 500 error on a page last week that has been working correctly for five years. I discovered that the page in question was looking for a column in a result set named "AllowEditDatasheets", while the real column name was "AllowDatasheetEdit".
Normally I'd attribute this to publishing untested code but, as I said, the page has been working correctly for a very long time.
I've run across this quite a few times recently: pages that should never have worked, but have been working.
I'm starting to suspect that another employee is making subtle changes to the database, such as renaming columns. Unfortunately, there are several applications that use a common login that was granted SA rights, and removing those rights would break a lot of code (Yes, I know this is poor design - don't get me started), so simply altering account permissions isn't a viable solution.
I'm looking for a way to track schema changes. Ideally, I'd be able to capture the IP address of the machine that makes these sorts of changes, as well as the change that was made and the date/time when it occurred.
I know I can create a scheduled process that will script the database and commit the scripts to our source control system, which will let me know when these changes occur, but that doesn't really help me find the source.
Any suggestions?

The default trace already tracks schema changes.
In Management Studio, you can right-click the database of interest and, from the Reports menu, view the "Schema Changes History" report, which pulls its data from there.
If the information recorded there is not sufficient, you can add a DDL trigger to perform your own logging (e.g. recording HOST_NAME(), though that can be spoofed).
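A minimal sketch of such a DDL trigger, assuming an invented logging table named dbo.SchemaChangeLog:

CREATE TABLE dbo.SchemaChangeLog (
    EventDate datetime NOT NULL DEFAULT GETDATE(),
    LoginName sysname NOT NULL,
    HostName  sysname NULL,   -- client-supplied, so it can be spoofed
    EventData xml NOT NULL
);
GO
CREATE TRIGGER trgLogSchemaChanges
ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
BEGIN
    -- EVENTDATA() returns an XML document describing the DDL statement
    INSERT INTO dbo.SchemaChangeLog (LoginName, HostName, EventData)
    VALUES (SUSER_SNAME(), HOST_NAME(), EVENTDATA());
END;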

If you are using SQL Server 2008 and above, you can use SQL Server Audit.
With earlier versions, you may be able to add triggers to the system tables that hold schema information and log changes to those.
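For the 2008+ route, a hedged sketch of setting up such an audit for schema changes (the audit name, file path, and database name are all invented):

USE master;
CREATE SERVER AUDIT SchemaChangeAudit
    TO FILE (FILEPATH = 'C:\AuditLogs\');
ALTER SERVER AUDIT SchemaChangeAudit WITH (STATE = ON);
GO
USE ClientDb;
CREATE DATABASE AUDIT SPECIFICATION SchemaChangeSpec
    FOR SERVER AUDIT SchemaChangeAudit
    ADD (SCHEMA_OBJECT_CHANGE_GROUP)   -- fires on CREATE/ALTER/DROP of schema objects
    WITH (STATE = ON);
GO
-- Later, read the captured events back
SELECT event_time, server_principal_name, statement
FROM sys.fn_get_audit_file('C:\AuditLogs\*', DEFAULT, DEFAULT);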

That's just as bad as GRANT DBA TO PUBLIC! You'd best rewrite the code and restrict SA privileges to one or a few DBAs! Column renaming is not the only thing they could wreak havoc with! Having a common login ID is also a bad idea, because you have no way of pinpointing exactly who did what.

Related

Convention for Database Creation [closed]

We're developing a new product at work and it will require the use of a lightweight database. My coworkers and I, however, got into a debate over the conventions for database creation. They were of the mindset that we should just build a quick outline of the database, then go in and indiscriminately add and delete tables until it looks like what we want. I told them the proper way to do it was to make a script that follows a format similar to this:
Drop database;
Create Tables;
Insert Initial Data;
I said this was better than randomly changing tables: you should only make changes to the script and re-run it every time you want to update the design of the database. They said it was pointless and that their way was faster (which holds a bit of weight since the database is kind of small, but I still feel it is a bad way of doing things). Their BIGGEST concern was that I was dropping the database; they were upset that I was going to delete the random data they had put in there for testing purposes. That's when I clarified that you include inserts as part of the script to act as initial data. They were still unconvinced. They told me that in all of their time with databases they had NEVER heard of such a thing. The truth is we all need more experience with databases, but I am CERTAIN that this is the proper way to develop a script and create a database. Does anyone have any online resources that clearly explain this method and can back me up? If I am wrong about this, then please feel free to correct me.
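For concreteness, here is a minimal sketch of the kind of script I mean, assuming SQL Server and entirely invented names:

-- rebuild.sql: drop, recreate, and seed (assumes no open connections to the database)
IF DB_ID('ProductDb') IS NOT NULL
    DROP DATABASE ProductDb;
GO
CREATE DATABASE ProductDb;
GO
USE ProductDb;
GO
CREATE TABLE dbo.Customer (
    CustomerId int IDENTITY(1,1) PRIMARY KEY,
    Name       nvarchar(100) NOT NULL
);
GO
-- initial data, so nothing of value is lost when the database is rebuilt
INSERT INTO dbo.Customer (Name) VALUES (N'Test customer');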
Well, I don't know the details of your project, but I think it's pretty safe to assume you're right on this one, for a number of very good reasons.
If you don't have a script that dictates how the database is structured, how will you create new instances of it? What happens when you deploy to production, or it gets accidentally deleted, or the server crashes? Having a script means you don't have to remember all the little details of how it was set up (remembering them all is pretty unlikely even for small databases).
It's way faster in the long run. I don't know about you, but in my projects I'm constantly bringing new databases online for things like unit testing, new branches, and deployments. If I had to recreate the database by hand every time, it would take forever. Yes, it takes a little extra time to maintain a database script, but it will almost always save you time over the life of the project.
It's not hard to do. I don't know what database you're using, but many of them support exporting your schema as a DDL script. You can just start with that and modify it from then on. No need to type it all up. If your database won't do that, it's worth a quick search to see if a 3rd-party tool that works with your database will do it for you.
Make sure you check your scripts into your source control system. They're just as important as any other part of your source code.
I think having a data seeding script like you mentioned is a good idea. But keep it as a separate script from the database creation script. This way you can have a developer seed script, a unit testing seed script, a production seed script, etc., as sketched below.
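As a sketch of how that plays out (server and file names invented), each environment runs the same creation script plus its own seed script:

rem developer machine
sqlcmd -S localhost -i create_database.sql
sqlcmd -S localhost -i seed_dev.sql

rem build server, before the unit tests run
sqlcmd -S buildserver -i create_database.sql
sqlcmd -S buildserver -i seed_unittest.sql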

Log file not stored [closed]

I need to do research about log files that are not stored in a database. I do not know much about database systems, so I need someone to give me at least some ideas about it. What I was told is that some of the log files were not written to a bank's database. The log files come from various sources, such as ATMs, the website, etc. For example, the reason could be a high rate of data flow causing some data to be left out.
The question is: what are the reasons behind this, and what could be the solutions?
I would really appreciate it if you could share some articles about it.
Sorry if I could not explain it well. Thanks in advance.
Edit: what I meant was not that some system intentionally avoids writing some of the log files to the database. What I meant is that some of the log files are not written to the database, the reason is not known, and my intention is to identify the possible reasons and solutions. The database belongs to a bank and, as you can imagine, lots of data flows into it every second.
Well, the question is not very clear, so let me rephrase it:
What are the reasons why application logs are not stored in a database?
It depends on the context, and there are different reasons.
First question: why might you store logs in a database at all? Usually you do it because they contain data that is relevant to you and that you want to manipulate.
So why not always store that data:
you are not interested in the logs except when something goes wrong, and then it's more about debugging than storing logs.
you don't want to mix business data (users, transactions, etc.) with less important / less relevant data.
the volume of logs is too large for your current system, and putting them in a database might bring it down completely.
you might want to use another system to dig into the logs, with a different type of storage (Hadoop, big data, NoSQL).
when you do a database backup, you usually back up the whole database. Logs are not as important as other critical data, are bigger, and would take up too much space.
there is no need to always put logs in a database. Plain text files and some other tools (web server logs, for instance) are usually more than enough.
So it's for these reasons that logs are generally not stored in the same database as the application's data.

How can I set up a database that is able to import large amounts of data automatically on a daily basis? [closed]

I am new to databases and programming and am now supposed to create a database that can store large amounts of data. The critical problem for me is that I need to update the database every day, adding 150 sets of data to 60 different tables. The datasets all come in different formats, though (.csv, .row, .sta...).
I would like to be able to create a GUI that can automatically import the datasets every day and update the tables. So far I have found a lot of information on exporting data from databases, but not so much on importing it.
Does someone have a pointer?
You haven't given a reasonable definition of "best"; what are your criteria? Is cost a major concern (I'm guessing not, if you're considering Matlab, but maybe)? Is the UI for you only, or are you turning end users loose on it? If end users, how smart/reliable are they? When you say manual import, do you mean a mostly automatic process that you occasionally manually initiate, or will it have to ask a lot of questions and handle many different combinations?
I import lots of data daily from different sources. I sometimes have to manually re-launch a process because a user has made a change and needs to see it reflected immediately, but my set of defined sources doesn't change often. I've had good luck using the SSIS (SQL Server Integration Services) tool in Microsoft's SQL Server, and it can certainly handle large amounts of data.
The basic functionality is that you write a "package" containing definitions of what your source is, how it's configured (i.e. if you are importing from a text file, tell it the name and path, whether it's fixed-field or delimited, what the delimiter or width of each field is, which fields to skip, how many rows to skip, etc.), and where to put the data (DB name and table, field mappings, etc.). I then set the schedule (mine all run overnight) in SQL Agent, and it is all automatic from there unless something changes, in which case you edit your package to account for the changes.
I can also manually start any Package at any time and with very little effort.
And the range of import sources is pretty impressive. I pull in data from CSV files, Lotus Notes, and DB2, and it's all automatic every night. It's also a fairly graphical "builder", which is frustrating for a hardcore coder, but if you're new to programming it's probably easier than a more code- or script-oriented approach.
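If a full SSIS package feels like overkill at first, plain T-SQL can load a delimited file too; a minimal sketch, with the table name and file path invented:

BULK INSERT dbo.DailyMeasurements
FROM 'C:\imports\measurements.csv'
WITH (
    FIRSTROW = 2,            -- skip the header row
    FIELDTERMINATOR = ',',   -- CSV delimiter
    ROWTERMINATOR = '\n'
);

A SQL Agent job can run one such statement per file on a daily schedule, though you'd still need a convention for the non-CSV formats.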
Good luck, and welcome to the dark side. We have cookies.

How to catch a SQL server cracker? [closed]

Quick synopsis:
The guys on my team have been working on a production database (SQL Server 2005). We've added various things such as constraints, triggers, etc.
Now we've found that someone or something has been rolling back our changes at various times. The problem is we all share a common admin login (dumb, yeah, I know; we're fixing this). It's causing tons of frustration; at this point we just want to find out whodunnit.
How would you go about tracking down the guilty party?
NOTE: I'm NOT looking for a way to fix this, that's already being done. I'm looking for a way to track down the culprit.
Stay away from production databases. Create your scripts and email them to the DBA in charge (if you don't have one, get one). Developers with access to a production database is a recipe for disaster; I don't have it and don't want to have it.
What you're tracking down is obviously a symptom and not the cause. Since it's a SQL Server 2005 database, there should be a 'default' trace that runs out of the box. It's very lightweight, but it does include some object creation and deletion events. You can find it via the sys.traces view using the following query:
SELECT *
FROM sys.traces
WHERE id = 1
It rolls over after only a few MB, so its usefulness will depend on how much activity there is on the server.
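To see the actual events rather than just the trace metadata, you can read the trace file back; a sketch along these lines (the column selection is illustrative):

DECLARE @path nvarchar(260);
SELECT @path = path FROM sys.traces WHERE id = 1;

SELECT te.name AS EventName,
       t.LoginName, t.HostName, t.ObjectName, t.StartTime
FROM sys.fn_trace_gettable(@path, DEFAULT) AS t
JOIN sys.trace_events AS te
  ON t.EventClass = te.trace_event_id
WHERE te.name LIKE 'Object:%'   -- Object:Created, Object:Altered, Object:Deleted
ORDER BY t.StartTime DESC;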
Presumably, the real cause is not having your changes scripted and in version control.
I agree with the other posters who mentioned that all changes to a production database should be made only by an admin, not individual developers.
I'll assume that you have an audit log with change-data-capture-esque features. This will be keeping track of the who, what, and when of each change.
Are the rollbacks intermittent or consistent? Any chance you have auto commit turned off and forget to commit your changes?
There can't be that many people that have sufficient permissions to do such a thing. Find out who can do it and ask. Better than any technology you can put in place.
Hacker? It's more likely somebody on the inside. If someone outside your firewall has access to that database, you need to talk to your network people.
Try adding a monitor to that URL and port to see what requests come through.
The thing you have to watch out for is that if someone is maliciously altering the database and they have admin access, you have to assume they are smart enough to cover their tracks. At this point you can stop further damage, but if the attacker is any good at all, you'll either blame the wrong person because the log files have been altered, or all the evidence pointing to the right person will be gone.
The best way to do it is to ensure that no one has direct admin access to the production database. We have a system set up so that no account has administrative access by default, and everyone has their own accounts. No one gets to use the SA account.
Someone has to grant the account access, and it is automatically removed 24 hours after being granted. Ideally, the person who grants access shouldn't be the same one who gets administrative access to the database. That way, two people always have to be involved to make changes to the system.
Ideally, two people should always be involved in making changes. That way the second person can verify what the first does. (It's easy to make mistakes at 10 at night after working several hours).
People will counter this by saying that sometimes they "need" to be able to make quick changes. In most places this is not the case. It may take an extra 10 minutes to get a second person involved and explain the situation. It will take years to clean up a reputation for stealing/altering corporate data.
By adding user-level security like you should have.
Can you cross-reference roll-back times with the whereabouts of people on the team?
Or alternatively - just ask everyone?
SQL Server 2005 added DDL triggers, which together with DML triggers let you track who's modifying the data structure as well as the data itself.
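As a minimal sketch of the DML side (the table and log names are invented), a trigger can at least record who touched a table and from where:

CREATE TABLE dbo.AccountChangeLog (
    ChangedAt datetime NOT NULL DEFAULT GETDATE(),
    LoginName sysname  NOT NULL,
    HostName  sysname  NULL
);
GO
CREATE TRIGGER trgAuditAccount
ON dbo.Account
AFTER INSERT, UPDATE, DELETE
AS
    -- records who made the change; HOST_NAME() is client-supplied
    INSERT INTO dbo.AccountChangeLog (LoginName, HostName)
    VALUES (SUSER_SNAME(), HOST_NAME());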
If you're fixing it -- and by "fixing it" I mean locking down the production database and following some of the other practices mentioned here -- then don't worry about finding the culprit. It was probably accidental anyway and when you lock it down someone will start wondering why something doesn't work.
Tracking down the user who did it won't solve anything. If it was malicious, they'll lie and say it was accidental.
The root cause is the security on the database so the group at fault is the one that allowed the database to be so susceptible.
Asking everyone isn't useful; people lie and/or don't know they are screwing things up. We assume it's malicious but hope it's not.
Wow, you've got a real problem then. If you can't trust your own people...
Time to turn off all the IDs except one. Make sure that person knows what they're doing and doesn't lie.
In addition to what you've already received in responses, my vote is that it's nobody; you're simply mistaken about how you're using the system.
Now, don't get me wrong, I'm not talking about incompetence here. What I do mean, though, is that there may well be scripts running periodically, and as someone rightly mentioned, auto-commit may be on versus off and someone's getting fooled.
I also believe you are asking for trouble by mixing ANY development work into the production environment. Disk space is CHEAP; a terabyte is less than $300 these days! You don't need whiz-bang performance for development work in most circumstances...

Should I give a client a SQL Server login with the 'db_owner' role? [closed]

One of our clients has requested that we include the 'db_owner' role on the database login that their website uses, so that they can upload a script (an ASP page) to run some database changes. Normally the logins for the databases hosted on our server only include 'db_reader' and 'db_writer'. Is this ok, or should I request that they forward us the sql script to run on their behalf?
Or am I being too protective? Thanks
I would suggest that you act as a filter between them and anything they might want to do to the database such as uploading and running those scripts. If they get db_owner and hose it all up, it will still probably be your head on the chopping block for letting them have it to begin with.
I think I would want a service level agreement that is acceptable to everyone before giving out that much control over the database. For example, you could specify that if the client damages their database in a way they can't fix, your response would be limited to restoring it to a backup point of their choosing within a certain timeframe. You might also require them to maintain a specific technical contact for database issues who will be the first contact for their developers, etc. The SLA should spell out the various risks, including loss of data, inherent in having this level of capability.
In general, I'm in favor of giving more control, rather than less, if the client is willing to accept the responsibility. As a person who uses such services, I know that it can definitely improve productivity if I'm allowed to make the changes that need to be made without having to jump through hoops. I'm also willing to accept the risks involved, but I clearly know what the implications are.
What kind of scripts are they running?
Rather than providing them direct access, you could provide some kind of interface, as TheTXI suggested. I would be very concerned about giving db_owner access unnecessarily.
That might be you, or a team member, or depending on the type of scripts you may be able to provide them some kind of web interface (thus allowing you to at least wrap some validation around the script).
But if they directly run something on the system that you don't want them to, it will most likely be on you (whether that's just managing a restore or something more serious).
You can get more granular with your permissions to let them do only what you want. It depends on how often they want to make changes and how responsible you are for their data. I would not want to grant dbo to someone unless there was a really good reason.
Check whether they would be the actual owner of the database, not just a member of the db_owner role. If DB_CHAINING is on in another database with the same owner, they could turn chaining on in their own database and end up with dbo permissions in that other database.
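A quick sketch of how to check for that exposure (the database name is invented):

-- which databases have cross-database ownership chaining on, and who owns them
SELECT name, is_db_chaining_on, SUSER_SNAME(owner_sid) AS OwnerLogin
FROM sys.databases;

-- make sure the client's database can't take part in a chain
ALTER DATABASE ClientDb SET DB_CHAINING OFF;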
Or, make them sign an agreement saying that any damage to the data or the database schema caused by them, or by anyone logged in under said db account, is not your fault and no blame can be put on you, etc. At least if they mess something up, you're covered and the client stays happy. Though you might want to give them a separate login for this, so that they can't blame incorrect changes on the website code.
There's a word for DBAs who are overprotective: "Employed"
The rest, not so much.
