How to roll back/tear down/clear database changes after a system test runs? - sql-server

I have a test method, using NUnit and Selenium, which opens a browser on our website (which is on the production server), registers a user, and verifies that the registration is successful.
(I know ideally the system tests should run on a separate Test Server rather than production but here they want to test whether the prod system works!)
The problem is how to roll back the database changes made by this test. In other words, the state of my database before and after running the test should be the same.
I thought of 3 possible options, but none is practical:
1) Writing SQL queries to delete from the affected tables before the test starts (SetUp) and after it runs (TearDown); this is my current approach.
The problem with this approach is that I have to know exactly which tables each system test touches, and this can quickly become very complex because a single test may affect more than one table.
2) Writing transactional code
This is not an option, since the database changes are made by the website itself, not by the test code.
3) Taking a snapshot of the existing database (SQL Server 2008 R2) before each test starts, then restoring that snapshot once the test finishes.
This idea would be fine if we could run the tests only in the Staging environment, but the tests have to run on Production and may take around 5 minutes in total; restoring the snapshot would be a bad idea, because any real changes made during those 5 minutes would be lost!
What would be the best approach to resolve this problem? Is there a 4th option?
Thanks,

Option 4: never, ever run tests on a production server. It's a recipe for disaster (see the thousands of stories on the internet, funny only if you are not the protagonist, about how this can go horribly wrong). The right thing to do is to configure the test and production servers identically.

There is a fifth option. If the website receives a registration for a user such as "WeAreTestingOutSite", it does everything except actually adding the user to the database.
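A minimal sketch of that idea in Go (the function shape, table layout, and placeholder syntax are illustrative assumptions, not the site's actual code):

package site

import "database/sql"

// registerUser runs the normal registration flow but skips the INSERT when
// the well-known test account registers, so a production smoke test leaves
// no rows behind. Validation, emails, etc. would still run before this point.
func registerUser(db *sql.DB, username, email string) error {
	if username == "WeAreTestingOutSite" {
		return nil // everything except actually persisting the user
	}
	_, err := db.Exec("INSERT INTO Users (Username, Email) VALUES (?, ?)", username, email)
	return err
}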
To be honest, as was said above, there are better ways to check that a production site is operational than running bots that register users.

I would recommend going with a 4th option: introduce a new feature that allows a user to be deleted - probably not by the users themselves, but by system admins (back-office users). That way you can test that a user can be registered - and deleted afterwards - without caring that much about SQL clean-up scripts.
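A sketch of what the test tear-down could then call, assuming a hypothetical back-office endpoint (the URL shape and status handling are made up for illustration):

package systemtest

import (
	"fmt"
	"net/http"
	"net/url"
)

// deleteTestUser asks the back-office API to remove the account the test
// registered, returning the database to its pre-test state.
func deleteTestUser(adminBaseURL, username string) error {
	req, err := http.NewRequest(http.MethodDelete, adminBaseURL+"/admin/users/"+url.PathEscape(username), nil)
	if err != nil {
		return err
	}
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusOK {
		return fmt.Errorf("delete failed: %s", resp.Status)
	}
	return nil
}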

Related

Testing database interactions

I have an API that has a storage layer. It only handles the database interactions and performs the CRUD operations. Now I want to test these functions.
In my path API/storage/, I have different packages with functions to interact with different tables; tables A, B, and C are all in the same database.
My file hierarchy goes like:
--api
--storage
--A
--A.go
--A_test.go
--B
--C
--server
--A
--testData
--A.sql
--B.sql
This way I want to test the whole storage layer using the command
go test ./...
The approach I was following is to have a function RefreshTables which first truncates the table, then fills it with the fixed test data I keep in the testData folder. For truncating I do:
db.Exec("SET FOREIGN_KEY_CHECKS = 0;")
db.Exec("truncate " + table)
db.Exec("SET FOREIGN_KEY_CHECKS = 1;")
As go test runs the test functions of different packages in parallel by default, multiple SQL connections get created, and the TRUNCATE may run on a different connection than the SET FOREIGN_KEY_CHECKS statements, each grabbed at random from the connection pool.
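Note that SET FOREIGN_KEY_CHECKS is a session variable, so it only affects the TRUNCATE if both statements run on the same connection, which database/sql only guarantees if you pin one explicitly. A sketch of that (assuming MySQL; names are illustrative):

package storage

import (
	"context"
	"database/sql"
)

// refreshTable pins a single connection from the pool so that the
// session-level FOREIGN_KEY_CHECKS toggle and the TRUNCATE are
// guaranteed to run in the same session.
func refreshTable(ctx context.Context, db *sql.DB, table string) error {
	conn, err := db.Conn(ctx) // one dedicated connection until Close
	if err != nil {
		return err
	}
	defer conn.Close()
	if _, err := conn.ExecContext(ctx, "SET FOREIGN_KEY_CHECKS = 0"); err != nil {
		return err
	}
	if _, err := conn.ExecContext(ctx, "TRUNCATE "+table); err != nil {
		return err
	}
	_, err = conn.ExecContext(ctx, "SET FOREIGN_KEY_CHECKS = 1")
	return err
}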
My tests do not pass when run together, but they all pass when run alone or package by package.
If I do:
go test ./... -p 1
which makes test functions run one by one, all the tests pass.
I have also tried using a transaction for the truncate, and locking the table before truncating.
I checked this article (https://medium.com/kongkow-it-medan/parallel-database-integration-test-on-go-application-8706b150ee2e), where the author suggests creating a separate database in every test function and dropping it when the function ends. I think this would be very time-consuming.
It would be really helpful if someone could suggest the best method for testing database interactions in Go.
I don't have much experience with integration testing, and I'm not sure if mocking the database driver could work for you, but if so: I've been using the go-sqlmock package for mocking SQL database results in unit tests, and it works like a charm. You could use it and literally have a separate "database engine" for each of your tests. It's a bit time-consuming, since you have to manually tell the mock what queries to expect and what to return, but trust me, it's a good time investment.
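A minimal example of the kind of setup involved (the query, columns, and values here are made up for illustration):

package storage

import (
	"testing"

	sqlmock "github.com/DATA-DOG/go-sqlmock"
)

func TestGetUser(t *testing.T) {
	// sqlmock.New returns a *sql.DB whose driver replays scripted results,
	// so no real database is touched.
	db, mock, err := sqlmock.New()
	if err != nil {
		t.Fatalf("opening sqlmock: %v", err)
	}
	defer db.Close()

	rows := sqlmock.NewRows([]string{"id", "name"}).AddRow(1, "alice")
	mock.ExpectQuery("SELECT id, name FROM users").WithArgs(1).WillReturnRows(rows)

	var (
		id   int
		name string
	)
	if err := db.QueryRow("SELECT id, name FROM users WHERE id = ?", 1).Scan(&id, &name); err != nil {
		t.Fatalf("query: %v", err)
	}
	if err := mock.ExpectationsWereMet(); err != nil {
		t.Errorf("unmet expectations: %v", err)
	}
}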
As I said before, I'm not sure whether this strategy suits your case: if you are interested in how your application behaves in a real database scenario, like verifying that rows are actually saved, then mocking the database results is of little use.

Can an Oracle test database import be done without adversely affecting the production Oracle instances?

On a production database (11gR2) I have exported everything via SQL Developer into file.sql. I just took all the defaults.
I have a test server with 11gR2 that I am going to copy the .sql dump file over to.
Is there anything contained in an export - the one with everything in it: all the objects, data, and so on - that would cause problems for the production environment when I import the data into the test environment?
In other words, I don't want to break my production. I don't have tnsnames.ora set up on my test server. I only want the schema, the data, and all the rest mentioned.
EDIT:
SELECT * FROM DBA_SCHEDULER_WINDOWS;
Showed nothing active.
DBA_JOBS
shows APEX jobs about mail, stock jobs I think, and one for EMD_MAINTENANCE.EXECUTE_EM_DBMS_JOB_PROCS();
SELECT * FROM DBA_DB_LINKS;
There is a link, but I know what it is from and it is no longer being used.
Thanks for the info you gave. I feel better now.
The standard things I would think of are:
use another system (not the same VM/server as production)
disable all DBA_SCHEDULER_JOBS and don't re-enable them until you have reviewed their code
disable all DBA_JOBS and don't re-enable them until you have reviewed their code
point DBA_DB_LINKS (both public and private database links) from the production databases to the corresponding test databases, or delete them; these sometimes use tnsnames.ora, but sometimes bypass it.

Unit Testing TSQL

Is there anybody out there writing unit tests for their TSQL stored procedures, triggers, functions, etc.?
I've recently made database restores and installs part of our automated CruiseControl build process. Now I'm thinking about taking it to the next level, where we do the install and then run through a list of stored procedure tests.
I was going to just roll my own using MSBuild extensions to invoke the tests. However, I'm aware of http://www.tsqltest.org/ and http://tsqlunit.sourceforge.net/. I'm also aware that TFS has SQL testing.
I just wanted to see what people in the real world are doing and if they have any suggestions.
Thanks
The critical parts:
Make it automated and integrated with your build/test (so you get a green or red from your build)
Make it easy to add a new test
Keep your tests up-to-date
Advanced:
test failure conditions in your code
make sure your tests clean up after themselves (TSqlTest's example scripts use #beforeCount and #afterCount variables to validate the clean-up)
Stored procedures. I generally include test queries in comments in the SP header, and record correct results and query times. This still leaves it as a manual exercise, however.
Functions. Again, put SQL statements in the header with the same info.
Triggers. I avoid them for a number of reasons, one of them being that they are so hard to test and debug for so little benefit compared to putting the same logic in another tier. It's like asking how to test for Referential Integrity.
This is still a manual process, however. But since I think one should intentionally design SQL artifacts to be totally uncoupled (e.g. no SPs calling SPs, same with functions, and another strike against triggers IMHO) it's relatively less complex.
I have used the database testing that is built into Visual Studio 2008 Database Edition on a project here. It works well, but feels more like a third party bolt-on to Visual Studio than a native component. Some of the pains I felt with it are:
Because SQL code lives in the res files and a single code file can include multiple tests, it is not as easy to search for tests based on table/column names.
Because multiple tests live in the same code file, you get some annoying variable name collisions (e.g., if you have two tests in a single code file, all of the assertions for those tests have to have unique names; that means your assertion names will probably look like "testname_assertionname", which really shouldn't be necessary).
Refactoring your tests is not easy - for example, if you want to move a test from one code file to another, the easiest way is to create the test from scratch in the new file because there are bits and pieces of the test scattered about the res file and the code file.
All of that said, as I started with - It does work well. Unfortunately, we have not added these tests to our continuous integration server yet, so I can't comment on how easy it is to automate the running of these tests. We are using TFS for CI, and I am assuming that automation of the tests would work very similar to automation of standard unit tests; In other words, it seems like there should be an MSTest command line that would run the tests.
Of course, this is only an option if you are licensed to run Visual Studio 2008 DB Edition (which I understand is now included in the VS 2008 Pro license).
I've done this in Java, using DbUnit.
Basically, anything you do in the database either:
returns a result set
or alters the state of the database.
The state of the database can be described as all the values in all the rows in all the tables in all the schemas of a database; the state of any subset is the state of all the data affected by some test.
So, start with a database filled with enough test data that you can perform your tests; call this the baseline. Extract a snapshot with dbunit or the tool of your choice.
Given that your database is at baseline, any result set is deterministic (as long as your SP is deterministic; less so if it does a "select random();").
Get the baseline result set of all your SPs, save those as snapshots with dbunit or whatever tool you're using.
To test operations that don't change state, just test that the result set you get is the one you initially got. To test operations that change the database, test that baseline + operation = expected change. After each test that potentially changes the db, restore it to baseline.
Basically, the ability to restore to a baseline makes the testing possible.
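The shape of such a test, sketched in Go rather than Java with dbunit (the driver, connection string, baseline script, stored procedure, and expected values are all illustrative assumptions):

package sptest

import (
	"database/sql"
	"os"
	"testing"

	_ "github.com/denisenkom/go-mssqldb" // registers the "sqlserver" driver
)

// restoreBaseline re-runs a SQL script that rebuilds the known test data,
// putting the database back at the baseline.
func restoreBaseline(t *testing.T, db *sql.DB) {
	t.Helper()
	script, err := os.ReadFile("testdata/baseline.sql")
	if err != nil {
		t.Fatal(err)
	}
	if _, err := db.Exec(string(script)); err != nil {
		t.Fatal(err)
	}
}

// TestTransfer checks that baseline + operation = expected change.
func TestTransfer(t *testing.T) {
	db, err := sql.Open("sqlserver", os.Getenv("TEST_DSN"))
	if err != nil {
		t.Fatal(err)
	}
	defer db.Close()
	restoreBaseline(t, db)

	if _, err := db.Exec("EXEC dbo.Transfer @from = 1, @to = 2, @amount = 10"); err != nil {
		t.Fatal(err)
	}
	var balance int
	if err := db.QueryRow("SELECT Balance FROM Accounts WHERE Id = 1").Scan(&balance); err != nil {
		t.Fatal(err)
	}
	if want := 90; balance != want { // the baseline gives account 1 a balance of 100
		t.Errorf("balance = %d, want %d", balance, want)
	}
}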
Have you tried using the red-gate.com API?
They have a bunch of products for comparing things in SQL Server and the API allows virtually the same functionality programmatically.
http://help.red-gate.com/help/SQLDataCompareAPIv5/4/en/GettingStartedAPI.html

Best Practice for seeing live data on the dev server?

Assumption: live/production web app suppresses errors being shown to end-users.
Suppose your tech support team wants to see live data but through the eyes of the development-side of the application (maybe you want to see what errors are occurring, or want to see when you've got an issue fixed using an end-user's data).
Right now we've got one database serving both the dev and live boxes (not my idea - I know it's gross).
Ideas?
Edit: Best/handy tools for implementing your suggestion?
We replicate the data back to a different database. Yes, there is a delay, but it keeps people's hands out of the production servers. This also allows us to "hide" information that tech support (and other people, for that matter) aren't supposed to see.
In addition to replicating data down, on production we check who's logged into the application, and if it's a member of the company, we send them to the real error page instead of the happy kitten playing with a ball of yarn that apologizes.
Back up and restore from live to dev on a regular basis (once or twice a day). It doesn't need to be real-time (you might be entering data from the dev side anyway, which could cause problems).
If you have PCI or HIPAA data, make sure you don't put that in your dev environment -- that might break laws.
I generally like to have a 3-tier system for web development:
Development
Testing
Live
Most of the time Testing is an exact copy of the live system, except that errors are turned on. When a new version is about to be moved live, Testing is replaced with the new version BEFORE Live is, to detect upgrade issues.
Development is completely separate from live, to allow for major changes to things like the database, or changes to the production environment.
I would first make sure errors are either emailed to someone, with details of how the user got there, or at minimum logged, so you can watch the error log while you perform similar actions and see whether you get the same messages.
And yes, copying the database to the dev server/site is probably your only option. You don't want the development team making changes to live data, and you'll probably also have changes that won't work with the production database at some point.
I wouldn't recommend doing a nightly copy, as a developer might be in the middle of a new feature where they have added data, only to have it erased that night. I usually copy the production database(s) to dev each time a major version is released. This also allows me to do speed testing with a lot of live data. On some systems I also change everyone's password to a default so I can log in easily as any user.
If your configuration permits it:
a. Add a logging function (if there isn't one already) to write messages of interest to a log file.
b. Run the unix command
tail -f < logfile.txt
which will stream the growing log file to your console.
http://www.monkey.org/cgi-bin/man2html?tail
If you have Windows, you might try this:
http://tailforwin32.sourceforge.net/
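For (a), the logging side can be as small as this Go sketch (the file name and message are illustrative):

package main

import (
	"log"
	"os"
)

func main() {
	// Append messages of interest to the same file that tail -f is watching.
	f, err := os.OpenFile("logfile.txt", os.O_APPEND|os.O_CREATE|os.O_WRONLY, 0o644)
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()
	log.SetOutput(f)
	log.Println("user 42 hit checkout error: card declined")
}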

How to Test Web Code?

Does anyone have some good hints for writing test code for database-backend development where there is a heavy dependency on state?
Specifically, I want to write tests for code that retrieves records from the database, but the answers will depend on the data in the database (which may change over time).
Do people usually make a separate development system with a 'frozen' database so that any given function should always return the exact same result set?
I am quite sure this is not a new issue, so I would be very interested to learn from other people's experience.
Are there good articles out there that discuss this issue of web-based development in general?
I usually write PHP code, but I would expect all of these issues are largely language and framework agnostic.
You should look into DBUnit, or try to find a PHP equivalent (there must be one out there). You can use it to prepare the database with a specific set of data representing your test data, so each test no longer depends on the database having some particular existing state. This way, each test is self-contained and will not break as the database is used further.
Update: A quick google search showed a DB unit extension for PHPUnit.
If you're mostly concerned with data layer testing, you might want to check out this book: xUnit Test Patterns: Refactoring Test Code. I was always unsure about it myself, but this book does a great job to help enumerate the concerns like performance, reproducibility, etc.
I guess it depends on what database you're using, but Red Gate (www.red-gate.com) makes a tool called SQL Data Generator. It can be configured to fill your database with sensible-looking test data. You can also tell it to always use the same seed in its random number generator, so your 'random' data is the same every time.
You can then write your unit tests to make use of this reliable, repeatable data.
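The fixed-seed trick is easy to reproduce in code as well; for example, in Go (the schema and value ranges are made up):

package main

import (
	"fmt"
	"math/rand"
)

func main() {
	// A fixed seed means the "random" test data is identical on every run,
	// so tests can make assertions against it reliably.
	r := rand.New(rand.NewSource(42))
	for id := 1; id <= 3; id++ {
		fmt.Printf("INSERT INTO Users (Id, Age) VALUES (%d, %d);\n", id, 18+r.Intn(50))
	}
}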
As for testing the web side of things, I'm currently looking into Selenium (selenium.openqa.org). This appears to be a cross-browser capable test suite which will help you test functionality. However, as with all of these web site test tools, there's no real way to test how well these things look in all of the browsers without casting a human eye over them!
We use an in-memory database (hsql: http://hsqldb.org/). Hibernate (http://www.hibernate.org/) makes it easy for us to point our unit tests at the testing db, with the added bonus that they run as quick as lightning.
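A comparable trick outside the JVM: an in-memory SQLite database, sketched in Go (assuming the mattn/go-sqlite3 driver):

package main

import (
	"database/sql"
	"log"

	_ "github.com/mattn/go-sqlite3" // registers the "sqlite3" driver
)

func main() {
	// ":memory:" creates a throwaway database that vanishes on close, so
	// every test run starts from a clean slate and runs fast.
	db, err := sql.Open("sqlite3", ":memory:")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()
	if _, err := db.Exec("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)"); err != nil {
		log.Fatal(err)
	}
}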
I have the exact same problem with my work, and I find that the best idea is to have a PHP script to re-create the database, and then a separate script where I throw crazy data at it to see if it breaks.
I have never used unit testing or the like, so I cannot say whether it works, sorry.
If you can set up the database with a known quantity of data prior to running the tests and tear it down at the end, then you'll know what data you are working with.
Then you can use something like Selenium to easily test from your UI (assuming a web app here, but there are a lot of UI testing tools out there for other UI flavours) and detect the presence of certain records pulled back from the database.
It's definitely worth setting up either a test version of the database - or make your test scripts populate the database with known data as part of the tests.
You could try http://selenium.openqa.org/. It is more of a GUI testing tool than a data-layer testing application, but it records your actions, which can then be played back to automate tests across different platforms.
Here's my strategy (I use JUnit, but I'm sure there's a way to do the equivalent in PHP):
I have a method that runs before all of the Unit Tests for a specific DAO class. It puts the dev database into a known state (adds all test data, etc.). As I run tests, I keep track of any data added to the known state. This data is cleaned up at the end of each test. After all the tests for the class have run, another method removes all the test data in the dev database, leaving it in the state it was in before the tests were run. It's a bit of work to do all this, but I usually write the methods in a DBTestCommon class where all of my DAO test classes can get to them.
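In Go the same shape falls out of testing.TestMain; a sketch (the setup and teardown bodies are stand-in stubs):

package daotest

import (
	"os"
	"testing"
)

func setUpKnownState()   { /* insert all shared test data; illustrative stub */ }
func removeAllTestData() { /* delete the data added above; illustrative stub */ }

// TestMain mirrors the approach above: put the database into a known state
// before any test runs, and remove the test data once they have all finished.
func TestMain(m *testing.M) {
	setUpKnownState()
	code := m.Run()
	removeAllTestData()
	os.Exit(code)
}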
I would propose to use three databases. One production database, one development database (filled with some meaningful data for each developer) and one testing database (with empty tables and maybe a few rows that are always needed).
A way to test database code is:
Insert a few rows (using SQL) to initialize state
Run the function that you want to test
Compare expected with actual results. Here you could use your normal unit testing framework
Clean up the rows that were changed (so the next run won't see the previous run)
The cleanup could be done in a standard way (of course, only in the testing database) with DELETE FROM table.
In general I agree with Peter, but for creating and deleting test data I wouldn't use SQL directly. I prefer to use the same CRUD API that is used in the product, so the data created is as similar to production as possible...
