I've hit an error while working on a table that uses a text column.
If I were getting the length of a varchar column, I could do:
var result = (from t in context.tablename select t.fullname.Length)
However, if I run the same query on a text field:
var result = (from t in context.tablename select t.biography.Length)
I get the error:
Argument data type text is invalid for argument 1 of len function.
Having read up on the subject, I understand why SQL Server raises this error, but I'm not sure of the best way around it. I know I could return the result and then get the length of the resulting string, but surely there is an easier way?
I think your best option is to update the column data type to VARCHAR(MAX) if it is TEXT, or NVARCHAR(MAX) if it is NTEXT. There are plenty of resources on how to do this, but generally you add a new [N]VARCHAR(MAX) column, copy all your data across into it, drop the old column, and finally rename the new column to the old name.
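A sketch of that copy-and-rename approach, assuming the table and column names from the queries above:

```sql
-- Assumed names: dbo.tablename with a TEXT column called biography
ALTER TABLE dbo.tablename ADD biography_new VARCHAR(MAX) NULL;

UPDATE dbo.tablename
SET biography_new = CAST(biography AS VARCHAR(MAX));

ALTER TABLE dbo.tablename DROP COLUMN biography;
EXEC sp_rename 'dbo.tablename.biography_new', 'biography', 'COLUMN';
```

On SQL Server 2005 and later, a direct `ALTER TABLE dbo.tablename ALTER COLUMN biography VARCHAR(MAX)` from TEXT may also work, which avoids copying the data at all.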
If you can't change the table schema, then you will need to create a view and do the type casting in the SELECT of that view... but then you might as well have just changed the column data type as mentioned above (unless you're not the db owner and you create the view in a different database). Be mindful that EF doesn't always play as nicely with views as it does with tables.
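For example, a view along these lines (table and column names taken from the queries above) would expose the TEXT column as VARCHAR(MAX), so LEN works against it:

```sql
CREATE VIEW dbo.tablename_view
AS
SELECT fullname,
       CAST(biography AS VARCHAR(MAX)) AS biography
FROM dbo.tablename;

-- LEN works against the cast column
SELECT LEN(biography) FROM dbo.tablename_view;
```

Incidentally, if you just need the length in plain SQL, DATALENGTH(biography) works on TEXT columns directly, though it returns bytes rather than characters.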
Related
I'm trying to add a new custom field to the Customer page. The field is a "comment" field with formatted text in it. I know I'll have to use the PXRichTextEdit in my page and I need to add a SQL column of type TEXT, not VARCHAR.
My problem is that when I try to add a custom column to the table, the list of available data types is this:
By the way, I tried selecting "string", but that creates a column of type VARCHAR, and I get errors about data being truncated when trying to save something in my column.
I managed to do it by creating the column manually in my SQL Server database (with SSMS), but I really don't want to do that when deploying to the production environment. I would prefer to have it in my customization package.
Is there a way to include it in the project? Am I missing something obvious?
The limitation I have comes from PXRichTextEdit: it needs a SQL column sized to MAX. The solution I found is to create a string column in the customization screen. Although the size there is limited to 4,000 characters, I created a database script to change the column size to MAX. With this, I no longer get the error about truncating data, and the SQL column is of type varchar.
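The database script can be a single statement; the table and column names below are placeholders for whatever the customization screen generated:

```sql
-- Widen the generated column from VARCHAR(4000) to VARCHAR(MAX)
ALTER TABLE Customer ALTER COLUMN UsrComment VARCHAR(MAX);
```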
I'm trying to use SqlBulkCopy.WriteToServer to insert data into a table in a SQL Server database. To do so, I have a DataTable populated with the records I need to save to the database. Basically my code is based on Microsoft's example provided for the SqlBulkCopy.WriteToServer method.
The problem is that I have two DateTime fields in the SQL table, and I don't know how to represent them when defining the columns of the DataTable. I tried both System.String and System.DateTime, but when the code runs it says it cannot convert a String to a DateTime. The DataTable columns are defined in the following way (code taken from the example linked above):
Dim productID As DataColumn = New DataColumn()
productID.DataType = System.Type.GetType("System.Int32")
How can I do that? What is the correct type to use for a DataTable column corresponding to a SQL DateTime field?
Previously, I used an SQL command to map every field, for example:
' Fields initialization
SqlCmd.Parameters.Add("@Field1", SqlDbType.DateTime)
[...]
SqlCmd.Parameters.Add("@FieldN", SqlDbType.NChar, 255)
' After opening the transaction
SqlCmd.Parameters("@Field1").Value = MyDateTimeSavedInAString
[...]
SqlCmd.Parameters("@FieldN").Value = "NTHVALUE"
Thanks in advance.
[UPDATE1] The DateTime column now works, but the same kind of error comes from another column that is saved to a time field in the SQL Server table. What VB.NET type should I use to map the DataTable column to a SQL Server column of type time?
[UPDATE2] I tried using a SQL table with every field set to the nvarchar data type, but it still gives the same error. In fact it says that it is impossible to convert the String type of the source column to the nvarchar type of the destination column.
Use DateTime; DateTime-to-DateTime conversion works like a charm. SqlBulkCopy does not want to do a lot of modifications to the data - it bypasses most of SQL Server's processing for raw performance.
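For reference, here is a sketch of the DataTable column definitions covering both cases (the column names are made up): SQL Server's datetime maps to System.DateTime, and its time type maps to System.TimeSpan in ADO.NET:

```vb
Dim table As New DataTable("MyTable")

' SQL Server datetime column -> System.DateTime
table.Columns.Add(New DataColumn("CreatedAt", GetType(DateTime)))

' SQL Server time column -> System.TimeSpan
table.Columns.Add(New DataColumn("StartTime", GetType(TimeSpan)))

' Parse strings up front rather than letting SqlBulkCopy try to convert them
table.Rows.Add(DateTime.Parse("2013-05-01 10:30:00"), TimeSpan.Parse("10:30:00"))
```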
And you can avoid using a DataTable - it takes about an hour or two to write your own object wrapper ;) DataTables are not exactly efficient.
And try to wrap it up more - SqlBulkCopy is terrible in that it puts an exclusive lock on the target table. I have my own wrapper that creates a temporary table, bulk copies into that, and then uses a simple INSERT ... SELECT to move the data into the final table in one short atomic operation.
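A minimal sketch of that staging pattern (the target table name is assumed):

```sql
-- Create an empty staging table with the same schema as the target
SELECT * INTO #staging FROM dbo.TargetTable WHERE 1 = 0;

-- ... point SqlBulkCopy.WriteToServer at #staging here ...

-- Move the rows across in one short, atomic operation
BEGIN TRANSACTION;
INSERT INTO dbo.TargetTable
SELECT * FROM #staging;
COMMIT TRANSACTION;
```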
And be aware - below around 1,000 rows it makes no sense to use SqlBulkCopy; the overhead is too high. Instead, build one long multi-row INSERT statement.
I have created a table with two columns of data type varbinary(max). I am saving PDF files in binary format in these columns. There is no issue inserting the PDF files. But when I select even a single record, with only one of the varbinary columns in the select list, it takes around one minute to fetch the record. The inserted PDF file is about 1 MB. Here is the SQL query to fetch a single record:
select binarypdffile from gm_packet where jobpacketid=1
Kindly suggest if there is a way to improve performance with the varbinary data type.
Could you try and time the following queries:
SELECT cnt = COUNT(*) INTO #test1 FROM gm_packet WHERE jobpacketid = 1
SELECT binarypdffile INTO #test2 FROM gm_packet WHERE jobpacketid = 1
The first one tests how long it takes to find the record. If it's slow, add an index on the jobpacketid field. Assuming these values come in sequentially I wouldn't worry about performance as records get added in the future. Otherwise you might need to rebuild the index once in a while.
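If the lookup does turn out to be slow, an index along these lines (index name assumed) should fix it:

```sql
CREATE NONCLUSTERED INDEX IX_gm_packet_jobpacketid
ON dbo.gm_packet (jobpacketid);
```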
The second tests how long it takes to fetch the data from the table (and store it back into another table). Since no data goes out of 'the system' it should show 'raw database performance' without any "external" influence.
Neither should take very long. If they don't, but it still takes a long time to run your original query in SSMS and get the binary data into the grid, then I'm guessing it's either a network issue (wifi?) or SSMS simply being very bad at rendering the blob in the GUI; it's been noticed before =)
I used PRAGMA table_info('table_name') to get a table's column names, data types, and other info. It gives the expected values for tables, but for views the returned data type is always "numeric". What might be the cause of this problem? Is there another way to get the column data types of a view? Please help.
I'm guessing it's a by-product of the fact that columns aren't really strongly typed in SQLite. While you can declare a type for a column, it won't prevent you from putting data of other types into that column. In other words, the data type is associated with each individual value rather than the column it's in. The specifics of how types are determined vary a bit depending on which version of SQLite you're using:
SQLite 2:
http://www.sqlite.org/datatypes.html
SQLite 3:
http://www.sqlite.org/datatype3.html
I have a table that has an ntext field. MSDN says that ntext is deprecated and suggests other data types:
ntext, text, and image data types will be removed in a future version of Microsoft SQL Server. Avoid using these data types in new development work, and plan to modify applications that currently use them. Use nvarchar(max), varchar(max), and varbinary(max) instead.
In my particular case it was decided to switch to varbinary(max). I tried to alter the table definition, but that didn't work:
ALTER TABLE MyTable ALTER COLUMN MyColumn VARBINARY(MAX);
What are the possibilities for changing the type to varbinary(max)? I tried changing the type from ntext -> nvarchar(max) and then from nvarchar(max) -> varbinary(max), but that is not possible (error: Implicit conversion from data type nvarchar(max) to varbinary(max) is not allowed).
The only working solution is to add a new column of type varbinary(max), convert the existing values into the new column, and then drop the old column. This takes WAY TOO MUCH time (about 30 minutes on my dataset of roughly 15 GB). That's why I am investigating other ways to achieve the same result (ideally in-place, i.e. without moving and converting the data).
I presume you went with varbinary(max) because your ntext column had non-textual data in it? In that case, I think you're going to have to add a separate varbinary(max) column to your table, then run a conversion operation to copy the data from the ntext column to the new one. Then delete the old column and rename the new column to the old name.
"Implicit conversion from data type nvarchar(max) to varbinary(max) is not allowed" means that you're going to have to be explicit about the conversion.
It seems this conversion is going to have to happen at some point. If you search, you'll find many people going from text to varchar(max) who report that it takes 20+ minutes. This is my two cents after researching for a few minutes, so don't take it as gospel.
If your table only takes inserts, you could convert the existing data into a holding table and then rename the tables so the holding table becomes the production one. Then move any newly created data across from the old table during your downtime.
Handling updates makes things more complex of course.
Adding the extra column is probably the best way to go.
I favour doing this kind of thing in steps to reduce risk:
Add the column varbinary(max) as nullable
Modify your insert code to populate both columns
At your leisure, say overnight, run the UPDATE statement with a CAST
Remove all code support for the old column, and ensure the new column is the one being read
Drop the old column, and change the new column to be non null if needed
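Sketched in T-SQL, using the MyTable/MyColumn names from the question (the double CAST is there because, as the error above shows, there is no direct path from ntext to varbinary(max)):

```sql
-- Step 1: add the new column as nullable
ALTER TABLE MyTable ADD MyColumn_new VARBINARY(MAX) NULL;

-- Step 3: backfill overnight, casting via nvarchar(max)
UPDATE MyTable
SET MyColumn_new = CAST(CAST(MyColumn AS NVARCHAR(MAX)) AS VARBINARY(MAX));

-- Step 5: drop the old column and take over its name
ALTER TABLE MyTable DROP COLUMN MyColumn;
EXEC sp_rename 'MyTable.MyColumn_new', 'MyColumn', 'COLUMN';
```

Note that the binary result of the double cast is the UTF-16 byte representation of the original text; if the ntext column actually held binary data stored as text, you may need a different conversion.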