I have a web application that will be similar to a forum. I plan to have a few thousand users on this application, and I wonder if I can use MS SQL Express - here is the list of its limitations:
Leaving the CPU limit aside, I wonder whether the 1 GB memory limit is enough. Can anyone tell me if it might be, or maybe give me some examples of workloads where 1 GB would be enough.
I think it should be fine. Depending on how many columns you have in each row and their types, you should be able to get at least 1 million rows into 1 GB of disk space. The database will only load as much of that table into memory as it needs. If it reaches its memory limit, it will start paging data in and out of the buffer pool.
If you are using SQL 2008 R2, then the actual database can only grow up to 10 GB, which is the real limit you should be concerned about.
In our database, tables that are around 1 GB contain about 4 million rows. We have two databases that are 50 GB each; one takes up 16 GB of RAM and the other 2 GB. So it depends on which tables are accessed and how often.
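If you would rather measure than guess, sp_spaceused reports the row count and space a table actually uses, and sys.database_files shows how close you are to the Express database size cap. A minimal sketch - the table name dbo.Posts is just a placeholder:

    -- Rows and space used for one table (dbo.Posts is a placeholder name).
    EXEC sp_spaceused N'dbo.Posts';

    -- Space allocated and used per database file, to compare against the
    -- Express size limit (database pages are 8 KB each).
    SELECT name,
           size * 8 / 1024                            AS size_mb,
           FILEPROPERTY(name, 'SpaceUsed') * 8 / 1024 AS used_mb
    FROM sys.database_files;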
Our site has a few stats tables that log users, page views, etc.
They're simple tables (a couple of id numbers and a datetime) but they have lots of rows (e.g. 194,000,000) and the data footprint is getting quite big (6 GB+).
Most activity is writing data (i.e. logging the stats) and performing count(*) operations filtered by an indexed id field.
This question has some info about SQL Server 2008 limitations. But it mentions data in terms of TB and gives the max number of rows as "Limited by available storage". Does that mean there is no problem as long as we have enough disk space?
I can't shake the concern that these massive tables might lead to slower queries and increased load on the server as they grow.
My Questions
Realistically, how big can a table grow before causing problems? Is it really only limited by available storage? Are there any sensible limits (in GB / rows) given the table types I've mentioned?
Given that we only perform writes and count(*) queries, what type of problems could a massive table cause?
I'm guessing that count(*) queries will become more resource intensive the larger the table is. Is that correct? And to what extent?
Say, for example, a table was 500 GB with 300 gazillion rows. Could that cause any other problems for SQL Server other than slower queries on the table itself?
There is so much variation in how tables are designed and what kinds of queries you can run over them that you're going to have a difficult time finding any guidance worth anything. My suggestion would be to load up a table in a non-production database with the amount of data you're expecting to have "live" and try the types of queries that you plan to run against your production table. If they work well, you're set. And if they don't, you'll see why they don't and perhaps be able to work around it with a different approach (e.g. different indexing, partitioning, etc.).
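A minimal sketch of that kind of dry run, assuming a stats table shaped like the one described (two ids and a datetime); all names and row counts here are made up:

    -- Hypothetical stats table: two ids and a datetime, indexed on PageId.
    CREATE TABLE dbo.PageViewLog
    (
        UserId   int      NOT NULL,
        PageId   int      NOT NULL,
        ViewedAt datetime NOT NULL DEFAULT (GETDATE())
    );
    CREATE INDEX IX_PageViewLog_PageId ON dbo.PageViewLog (PageId);

    -- Generate a million dummy rows by cross-joining a large catalog view.
    INSERT INTO dbo.PageViewLog (UserId, PageId, ViewedAt)
    SELECT TOP (1000000)
           ABS(CHECKSUM(NEWID())) % 100000,                              -- fake user id
           ABS(CHECKSUM(NEWID())) % 10000,                               -- fake page id
           DATEADD(SECOND, -(ABS(CHECKSUM(NEWID())) % 31536000), GETDATE())
    FROM sys.all_objects a
    CROSS JOIN sys.all_objects b;

    -- Measure the kind of query the question describes.
    SET STATISTICS IO ON;
    SET STATISTICS TIME ON;
    SELECT COUNT(*) FROM dbo.PageViewLog WHERE PageId = 42;
    SET STATISTICS IO OFF;
    SET STATISTICS TIME OFF;

Scale the TOP clause up towards your real row counts. If an exact count(*) turns out to be too expensive at production volumes, sys.dm_db_partition_stats can return an approximate row count almost for free, though only for whole tables, not counts filtered by an id.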
My database has 16 MB of space left.
I used to just truncate it, as I was taught, but I found these links that advise against it:
http://www.sqlskills.com/blogs/paul/why-you-should-not-shrink-your-data-files/
http://blog.sqlauthority.com/2011/01/19/sql-server-shrinking-database-is-bad-increases-fragmentation-reduces-performance/
Is there anything else I can do to reduce the size of my database other than deleting table records? Any advice would be greatly appreciated.
You should most definitely look for ways to increase storage space as a long-term solution. Having only 15 MB left in the database can easily lead to corruption or to the database going offline.
If you’re on SQL Server 2008+ you can try to enable its native data compression.
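A minimal sketch of enabling page compression on one table; dbo.BigTable is a placeholder name, and note that in SQL Server 2008/2008 R2 data compression is, as far as I recall, an Enterprise edition feature:

    -- Estimate the savings before committing to a rebuild.
    EXEC sp_estimate_data_compression_savings
         @schema_name      = 'dbo',
         @object_name      = 'BigTable',
         @index_id         = NULL,
         @partition_number = NULL,
         @data_compression = 'PAGE';

    -- Rebuild the table and its indexes with page compression.
    ALTER TABLE dbo.BigTable REBUILD WITH (DATA_COMPRESSION = PAGE);
    ALTER INDEX ALL ON dbo.BigTable REBUILD WITH (DATA_COMPRESSION = PAGE);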
According to MSDN, the memory limit for SQL Server 2008 Standard edition is 64 GB. Does anyone know whether this total is for SQL Server only, for each service you run on that instance (SQL, SSAS, SSIS), or a single total that is shared among all services you run within that instance?
For example, if I want to allocate 64 GB of memory to SQL, will there be any memory available under my license for me to run any other services on that instance?
Note: This is not a question about physical memory limitations, as my server has more than enough physical memory to meet my allocation requirements. I am only curious to know if I will be limited by the license itself.
It looks like the 64 GB limit is per instance, not per server. Other people were asking the same question in this article: http://www.brentozar.com/archive/2010/06/sql-server-r-standard-supports-less-memory/
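If the goal is to carve the box up between the database engine and SSAS/SSIS, the engine's share is controlled with the 'max server memory' setting; a minimal sketch, where 57344 MB is only an illustrative cap that leaves headroom for the other services and the OS:

    -- 'max server memory' is an advanced option, so expose it first.
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;

    -- Cap the database engine at roughly 56 GB (57344 MB is an example value).
    EXEC sp_configure 'max server memory (MB)', 57344;
    RECONFIGURE;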
What are the main reasons for high memory consumption in SQL Server 2005?
The SQL Rocks article Memory Use in SQL Server will probably answer your question.
I think this is one of the important parts:
SQL Server's caching behavior is the reason for the substantial memory use. This is by design and is not a bug, memory leak nor incorrect configuration. Every time SQL Server needs to read a page from disk it caches the page in memory so that the slow disk operation will be unnecessary should SQL Server need that page again.
SQL Server is just memory hungry. The more memory you give it, the more it will use. SQL Server should probably always be run on its own server if it is doing anything non-trivial. In other words, don't install SQL Server on your domain controller, file server or source control repository (unless your source control repository uses SQL Server).
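To see where that cached memory is actually going on a given instance, the DMVs introduced in 2005 can break it down; a sketch, using the column names from SQL Server 2005/2008 (sys.dm_os_memory_clerks was reshaped in later versions):

    -- Buffer pool usage per database: each cached data page is 8 KB.
    SELECT DB_NAME(database_id) AS database_name,
           COUNT(*) * 8 / 1024  AS buffer_pool_mb
    FROM sys.dm_os_buffer_descriptors
    GROUP BY database_id
    ORDER BY buffer_pool_mb DESC;

    -- Biggest memory clerks overall (buffer pool, plan cache, locks, ...).
    SELECT TOP (10)
           type,
           SUM(single_pages_kb + multi_pages_kb) / 1024 AS memory_mb
    FROM sys.dm_os_memory_clerks
    GROUP BY type
    ORDER BY memory_mb DESC;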
Mainly the buffer pool: data pages, cached plans, and locks.
Can you add some background please?
Which free database would you use for a relatively small datawarehouse system?
Are there any 'special' databases e.g. multidimensional databases freely available?
Which of the free relational databases is best suited for the job?
By datawarehouse system I mean a system that will receive some inserts, few updates, next to no deletes and plenty of complex selects. Structured in star schemas (if the database is relational).
By small I mean about 100,000 records in the main fact table, maybe 10 dimensions, the largest containing 5,000 records.
By free I mean free of charge for internal commercial use.
Edit: Since so far I have mostly gotten just a list of free databases, let me specify some features that would be interesting / needed:
outer joins (must)
inline views / subselects (almost must)
materialized views (nice)
smart query optimizer (the smarter the better)
support for dimensions, roll up, cube queries (nice)
analytic functions (that's the name in Oracle; I don't know what they are called in other databases) (nice; see the sketch after this list)
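For what it's worth, here is a minimal sketch of what such an analytic function query looks like over a hypothetical star schema (Oracle calls these analytic functions, the SQL standard calls them window functions, and modern PostgreSQL versions support them as well; all table and column names are made up):

    -- Yearly sales per product, with a running total per product and a rank
    -- within each year. This is the kind of query analytic/window functions
    -- exist for.
    SELECT d.calendar_year,
           f.product_id,
           SUM(f.amount) AS yearly_amount,
           SUM(SUM(f.amount)) OVER (PARTITION BY f.product_id
                                    ORDER BY d.calendar_year) AS running_total,
           RANK() OVER (PARTITION BY d.calendar_year
                        ORDER BY SUM(f.amount) DESC) AS rank_in_year
    FROM sales_fact f
    JOIN date_dim d ON d.date_key = f.date_key
    GROUP BY d.calendar_year, f.product_id
    ORDER BY f.product_id, d.calendar_year;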
We have had very good results with Firebird. It's free, open source, runs on all major platforms and has support for all important database functions.
There are excellent tools available to manage the databases, like IB-Expert which has a free (limited, but good enough) version.
SQLite
HSQLDB
MySQL
PostgreSQL
What about SQL Server Express?
If the total amount of data is less than 4 gigabytes, you can use Oracle XE.
Edit: Jens Schauder came up with new 'demands'. I believe that PostgreSQL, MySQL and SQLite don't support analytics.
If the purpose of the data warehouse is not to improve the finances of the company in some way, then you should save yourself the effort and not waste your time.
If the data warehouse is actually going to make money for the company in some way, then spending a few bucks on a real system is probably not unreasonable.