I am creating an application using the Grails framework, and I plan to use the database that Grails provides by default. I just want to be sure of the advantages and disadvantages before proceeding.
Does using the internal database invite issues?
Thanks!
By default, Grails has an in-memory database, which means that whenever you shut down your application, all your data is lost... probably not what you want.
You could change this to a file-based database, which would by default end up in your Grails application root. If you deploy this to an app server and later undeploy it, your data is lost... again, probably not what you want.
I would recommend installing a MySQL database. It's easy, and it keeps your data separate from your application.
The internal database is an in-memory database, so all your data disappears when the server is restarted. It seems very unlikely that you would want this behaviour for a real application, so I recommend MySQL, PostgreSQL, or something similar instead.
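For reference, pointing Grails at an external database is just a change to grails-app/conf/DataSource.groovy, plus the JDBC driver on the classpath. A minimal sketch for MySQL (the URL, credentials, and dbCreate settings are assumptions you would adjust for your setup):

```groovy
// grails-app/conf/DataSource.groovy -- a sketch, not a drop-in config
dataSource {
    pooled = true
    driverClassName = "com.mysql.jdbc.Driver"
    url = "jdbc:mysql://localhost:3306/myapp"   // assumption: local MySQL, db "myapp"
    username = "myapp"                           // assumption: your DB user
    password = "secret"                          // assumption: your DB password
}
environments {
    development {
        dataSource {
            dbCreate = "update"   // let Hibernate evolve the schema in dev
        }
    }
    production {
        dataSource {
            dbCreate = "none"     // manage the production schema yourself
        }
    }
}
```

You would also need the MySQL connector JAR in the lib directory (or declared as a dependency) for the driver class to resolve.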
Our application from a vendor uses SQL Server 2008 R2 as its backend database. Occasionally an issue comes up that we think is related to their database (a performance issue, for example). We cannot modify their proprietary database, but we want to detect transactions going into that proprietary (vendor) database and react accordingly. Question: how can we achieve this in their proprietary SQL Server database?
UPDATE
I probably did not phrase the question well, so I'm writing this update. What I mean is the following: in a similar scenario, another application uses a proprietary Oracle database in its backend, and there we use Oracle's LogMiner utility to detect transactions going into that proprietary (vendor) database and react accordingly.
You can't. You can talk to the vendor and see if they'll give you an admin login/password into the database directly, but they aren't likely to want to for fear that you will break something and then blame them for it. Your best bet is to file a bug report with the vendor and pressure them to fix the underlying problem.
I have a VPS with limited memory, and my WCF service is hosted using AppFabric.
Since memory is limited and I am not using SQL Server for anything other than the AppFabric prerequisite, I'm thinking about uninstalling SQL Server (the instance can consume up to 200 MB of memory at times). I am not using any DB-related features of AppFabric such as the dashboard or caching. However, I do like the IIS extensions and the simplicity they bring to managing WCF services, and I suspect those do not actually require SQL Server.
I am unable to just try it out, so I wonder whether someone has experience with this, or can predict how uninstalling SQL Server would affect AppFabric's behaviour.
Instead of uninstalling SQL Server, you could just stop the SQL Server service and set it to manual startup.
That way, if you need SQL Server in the future, you can simply start it again.
As @Shiraz Bhajiji alludes to, if you are using SQL Server as the configuration store, you will need to reconfigure AppFabric to use file-based configuration instead. It sounds like you are only using a single AppFabric instance, but if you are (or ever need to be) running multiple instances, the configuration file would need to be accessible to all of them.
Again, it isn't necessarily relevant to you, but if you have multiple AppFabric instances, the SQL Server configuration option is a much more robust approach. With the file-based approach, if you configure things incorrectly, one AppFabric node going down can take down the entire cluster. The SQL Server approach does represent a single point of failure, however; if you are using clustering etc., you can easily mitigate this. Again, I appreciate I'm getting a little off topic here.
With an ASP.NET MVC3 application, how do I deploy the database into production, and how do I manage the schema changes?
When I'm developing my application, I see an aspnetmvc.mdf (and .ldf) file in App_Data. This has the aspnet_* tables, and also my tables (which I hand-created in SQL Server Express). This file is 10 MB, and it doesn't seem to me that I should simply upload it to my production machine.
Should I instead keep schema (and seed data) changes in a .SQL file and (somehow) run them on the server? Should I use NHibernate's methods for auto-generating tables? (If so, what about the ASP.NET standard tables?)
What's the best way to manage this? Ideally, I'd like something like LiquiBase or Rails' DB migrations, where I can isolate changes and run them in isolation. But I've never put a from-scratch ASP.NET MVC site into production, so I'm not sure what to do.
My thoughts on NHibernate's Schema Update are here.
There is no one right solution, but SchemaUpdate can get you about 90% of the way there. For the other 10%, I currently use hand-written SQL files (named by the date they were created), but there are other, more sophisticated options (such as Red Gate's SQL Compare or the data tools built into some versions of Visual Studio).
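A common shape for those hand-written, date-named SQL files (the table and column names below are purely illustrative) is one script per change, plus a small tracking table so each script is applied to a given database exactly once:

```sql
-- 20110315_add_customer_email.sql  (illustrative change script)
ALTER TABLE Customer ADD Email nvarchar(256) NULL;

-- assumes you maintain a SchemaChanges table to record applied scripts
INSERT INTO SchemaChanges (ScriptName, AppliedOn)
VALUES ('20110315_add_customer_email.sql', GETDATE());
```

At deploy time you run, in date order, only the scripts not yet listed in the tracking table; this gives you the isolate-and-run-in-order property of LiquiBase or Rails migrations with nothing but plain SQL.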
So I'm inexperienced at hosting databases; I've always had the luxury of someone else handling the database setup.
I was going to help a friend get a web page set up. I've got experience with ASP.NET MVC, so I'm going with that. They want a search page that queries a database and displays the results. My question is about getting the database set up and hosted. They currently just have the Access DB on a local computer, and there is basically only one table that would need to be queried for the search.
What is the best approach to making this table/DB accessible? They would like to keep the main copy of the database on the local machine, so copying the entire DB over to the hosted site would be time-consuming; could just the one table that's needed be copied to the host? Should I try to convince them to make changes on the hosted DB and just make copies of that for their local machines? Any suggestions are welcome; again, I'm a total noob when it comes to hosting databases.
Thanks
Added: they are using MS Access 2000, and the page will have access restrictions. Thanks for the responses.
How about SQL Server Express? I think you can connect remotely from Access and just push the data over.
I wouldn't use Access on a web server in any case.
I would strongly recommend against using Access for web work; it's just not designed for it, and given that SQL Server Express is free, there is no reason not to give it a go.
You can migrate the data over using the SQL Server Upsizing Wizard; here is a link for help on using that feature:
http://support.microsoft.com/kb/237980
It depends on what you mean by "web work". Access 2010 can build scalable, browser-neutral web applications that can scale to thousands of users. In fact, you can even park the web sites on Microsoft's new cloud hosting options and scale out to as many users as you need.
Here is a video of an application I wrote in Access 2010. Note how at the halfway point I run the same application, including the Access forms, in a standard web browser. This application was built 100% inside of the Access client, and the end result needs no ActiveX or Silverlight to run.
http://www.youtube.com/watch?v=AU4mH0jPntI
So, the above shows that Access can now be used to build scalable web sites (you can ignore the confusing answers from the other two posters here; they are not quite up to speed on how Access now works).
However, for your case, I would continue to have the Access database on the desktop. You can simply link to tables that are hosted on the web server; those tables can live in MySQL or SQL Server. As long as the web site supports external ODBC connections (many do), the desktop application can use the live data from the web server. If connecting to the live data at all times is an issue, then you could certainly set something up to push new records (or the whole table) up on some kind of interval, or perhaps the reverse, and pull new records down from the web site on an interval (depending on which way you need to go). Connecting to MySQL or SQL Server is quite easy as long as the web host and site permit external ODBC connections. I do this all the time, and it works quite well.
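For example, linking an Access table to a MySQL database on the web server is done with a DSN-less ODBC connect string along these lines (the driver version, server, database, and credentials are placeholders you would replace with your own):

```
ODBC;DRIVER={MySQL ODBC 5.1 Driver};SERVER=www.example.com;PORT=3306;DATABASE=mydb;UID=webuser;PWD=secret;
```

This assumes the MySQL Connector/ODBC driver is installed on the desktop machine and that the host allows remote connections on port 3306.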
As mentioned, new for Access 2010 is the ability to build web sites, but that does require Access Services running on SharePoint.
You don't need to upgrade to Access 2010. One option is to use the EQL Data plugin to sync the database up to the server. Then you can write an ASP.NET, PHP, or whatever application that queries the table using the EQL API and prints the results however you want. This kb article describes how to use the EQL API from a web app.
The nice thing is that the database is still totally usable (and at full speed) even when you're not online, and then you can sync the new data up to the web occasionally. It only uploads the changes, not the entire database every time, so it's fast.
Disclaimer: I work at EQL Data so I'm a bit biased. But this kind of use case is the whole reason the company exists.
As the post title implies, I have a legacy database (not sure if that matters), I'm using Fluent NHibernate and I'm attempting to test my mappings using the Fluent NHibernate PersistenceSpecification class.
My question is really a process one: I want to run these tests when I build locally in Visual Studio, using the built-in unit testing framework for now. Obviously this implies (I think) that I'm going to need a database. What are some options for getting this into the build? If I use an in-memory database, does NHibernate or Fluent NHibernate have some mechanism for extracting the schema from a target database, or can the in-memory database do this itself? Will I need to obtain the schema manually to feed to an in-memory database?
Ideally I would like to get this set up so that the other developers don't really have to think about it, other than when they break the build because the tests don't pass.
NHibernate isn't able to re-create your database's schema and because it's a legacy system you likely won't be able to generate the schema from NH. The best approach is to do your integration tests in transactions and roll them back when complete. We run integration tests against our dev and test databases which are periodically refreshed from the live system.