I need to run a kernel function 1 million times. In each kernel instance I need to create a dynamic array of 1000 elements. How can I solve this? Should I create a buffer of 1 million * 1000 elements and access the slices using the work-item id? But that is a lot of memory, and I don't have that much. Or is there another solution?
Yes, you have to create a buffer of 1 million * 1000 elements; at least that's what I experienced.
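For reference, here is a minimal sketch of how such a flat buffer is usually indexed inside the kernel, assuming OpenCL (OpenCL C is a C dialect); the kernel name work, the buffer name scratch and the CHUNK constant are just placeholders for this example:

// Each work item uses its own 1000-element slice of one big global buffer
// instead of allocating memory dynamically inside the kernel.
#define CHUNK 1000

__kernel void work(__global float *scratch,   // size: global_size * CHUNK
                   __global float *out)
{
    size_t gid = get_global_id(0);
    __global float *my = scratch + gid * CHUNK;   // this work item's slice

    float acc = 0.0f;
    for (int i = 0; i < CHUNK; ++i) {
        my[i] = (float)i * (float)gid;            // placeholder computation
        acc += my[i];
    }
    out[gid] = acc;
}

If the full 1 million x 1000 buffer does not fit in device memory, a common workaround is to enqueue the work in batches (say 100k work items at a time) and reuse a smaller scratch buffer sized for one batch, or to use a __private or __local array if 1000 elements of your type fit there.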
INSERT INTO #ApplicationFiles (OurBranchID, ApplicationFileNo)
SELECT
    ApplicationDetails.x.value('OurBranchID[1]','nVarchar(12)'),
    ApplicationDetails.x.value('ApplicationFileNo[1]','nVarchar(20)')
FROM
    #ApplicationDetails.nodes('/JSON/dt_ApplicationDetails') AS ApplicationDetails(x)
The query is taking a long time to complete, and I want to improve its performance.
I have a requirement where I need to dump a large array of user IDs into one of the columns of a table in a PostgreSQL DB. Let's say the maximum number of user IDs would be 100,000, and each user ID is at most 50 characters long. I won't be performing any operations on that table; it is just for logging purposes.
I've used a text[] column to dump that array of user IDs. I don't know if that is the best way to do it. I'm worried that some "max size limit reached" error will be thrown if the length of the array increases in the future.
Please suggest a better way to achieve this :)
In SQL Server we have the Index Seek operator, which works very well for search operations.
How many operations does SQL Server need to perform in order to get a value? I assume it should be the height of the tree.
Nobody can give a single definitive answer, because it depends on many parameters:
Index type (clustered or non-clustered)
Unique or not
Nullable or not
Which pages the expected rows are stored in
There is a well-explained article about the Index Seek operator below:
https://sqlserverfast.com/epr/index-seek/
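As a rough illustration of the "height of the tree" point: the number of page reads for a single seek is roughly the B-tree depth, which you can estimate as ceil(log_F(N)) for N rows and a fanout of F keys per page. A back-of-the-envelope sketch in C (the fanout of 500 keys per page is just an assumed value for a narrow key):

#include <math.h>
#include <stdio.h>

/* Rough estimate of B-tree depth: ceil(log base fanout of the row count).
   The real depth also depends on key size, fill factor, included columns, etc. */
static int estimated_index_depth(double rows, double keys_per_page)
{
    return (int)ceil(log(rows) / log(keys_per_page));
}

int main(void)
{
    /* Assumed fanout of ~500 keys per 8 KB page (narrow integer key). */
    printf("%d\n", estimated_index_depth(10000000.0, 500.0));   /* prints 3 */
    return 0;
}

So even for 10 million rows a seek typically touches only three or four pages, which is why the logarithmic shape matters more than the exact parameters listed above.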
I am looking for an example of data loss caused by concurrent access from two or more threads. Does anyone have an idea how I could produce that (in C)?
In a second step I want to fix the problem with a mutex or something like that.
But for now it would already help just to have an idea of how to cause the data loss!
Greetings
Implement a simple linked list and start two threads: one adding nodes and one deleting them.
I guarantee you a beautiful SIGSEGV after a while.
The mutex will then help you keep your linked list consistent.
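If a full linked list feels like too much to start with, the classic lost-update race on a shared counter shows the same kind of data loss with less code. A minimal pthreads sketch (the names and the iteration count are arbitrary); the commented-out mutex calls are the fix for the second step:

#include <pthread.h>
#include <stdio.h>

#define ITERATIONS 1000000

static long counter = 0;                          /* shared, unprotected data */
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void *worker(void *arg)
{
    (void)arg;
    for (int i = 0; i < ITERATIONS; ++i) {
        /* pthread_mutex_lock(&lock); */          /* uncomment both calls to fix it */
        counter++;                                /* unsynchronized read-modify-write */
        /* pthread_mutex_unlock(&lock); */
    }
    return NULL;
}

int main(void)
{
    pthread_t t1, t2;
    pthread_create(&t1, NULL, worker, NULL);
    pthread_create(&t2, NULL, worker, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);

    /* Expected 2 * ITERATIONS; without the mutex the printed value is usually
       smaller, because increments from the two threads overwrite each other --
       that is the data loss. Compile with -pthread and without optimization
       to see it most clearly. */
    printf("counter = %ld (expected %d)\n", counter, 2 * ITERATIONS);
    return 0;
}

This version will not crash like the linked list, but it makes the lost updates easy to measure; the linked-list variant additionally corrupts pointers, which is where the SIGSEGV comes from.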
Does anyone have an idea about storing an email body in SQL Server? The email body is about 15 lines. What has to be done in order to maintain a table with 40 different emails' contents?
Example:
a : some content should be sent
b : some other content
You'll probably want an nvarchar(max) column to store the contents of the body. This allows you to store up to 2GB worth of text...which is kind of a lot of text, so you should be good.