Multiply Varchar(50) column and int column in SQL Server

select convert(int, Price_Each) * Quantity_Ordered as Total_Sale
from data
I am trying to multiply two columns, where the data type of Price_Each is varchar(50) and the data type of Quantity_Ordered is int.
Even after using CONVERT and CAST I keep getting the same error:
Conversion failed when converting the varchar value ' $11.95' to data type int.
My table name is "DATA"
Order ID | Product                    | Quantity Ordered | Price Each
---------|----------------------------|------------------|-----------
176558   | USB-C Charging Cable       | 2                | $11.95
176559   | Bose SoundSport Headphones | 1                | $99.99
My problem statement is: create a new column named total sales per person (by multiplying Quantity Ordered with Price Each).
Could anyone please help me out?

As mentioned in the comment by @Jeroen Mostert, you need to convert to money, which handles the currency symbol $. The query would be:
select convert(money, Price_Each) * Quantity_Ordered as Total_Sale
from data
You should also consider changing the column's data type to money; then you can multiply without casting, and you will get further benefits in any other processing of this column.
alter table [DATA]
alter column Price_Each money
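For reference, a minimal end-to-end sketch using the sample rows from the question (the underscored column names follow the original query; Order_ID as the grouping key is an assumption):
-- Hypothetical sample table matching the question's data
CREATE TABLE [DATA] (
    Order_ID         int,
    Product          varchar(100),
    Quantity_Ordered int,
    Price_Each       varchar(50)
);
INSERT INTO [DATA] VALUES
    (176558, 'USB-C Charging Cable', 2, ' $11.95'),
    (176559, 'Bose SoundSport Headphones', 1, ' $99.99');

-- convert(money, ...) accepts the leading space and currency symbol, so this works
SELECT Order_ID,
       convert(money, Price_Each) * Quantity_Ordered AS Total_Sale
FROM   [DATA];

-- Grouped variant for the "total sales per ..." requirement
SELECT Order_ID,
       SUM(convert(money, Price_Each) * Quantity_Ordered) AS Total_Sale
FROM   [DATA]
GROUP BY Order_ID;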

Related

How to cast each row of type varchar to array?

So I have this problem. I have a table called Orders, where I have a column product_id, but it's of type varchar, because there can be many of them, e.g. '12,3,4,345'. They are all separated with ','.
But my question is: how can I cast each row of varchar to an array in the query?
I tried it with the string_to_array function, but it only casts one row. I need to cast all of the rows in the given table.
My code
SELECT o.product_id, string_to_array(
(select product_id
from orders), ',' ) as array
from orders o ;
I am working in IntelliJ, using the JDBC driver on a PostgreSQL database.
It would be nice if you attached sample data and expected output. As far as I understand your problem, you want to "put" the contents of each row into an array.
SELECT product_id,
(string_to_array(product_id, ',')::int[]) as array
from orders;
Check out my dbfiddle and let me know if the result is different from what you want.
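For illustration, a small self-contained example of what that query does (the sample values are made up):
-- Hypothetical sample data: product_id stored as a comma-separated varchar
CREATE TABLE orders (product_id varchar(100));
INSERT INTO orders VALUES ('12,3,4,345'), ('7,8');

-- string_to_array runs once per row; ::int[] casts each element to integer
SELECT product_id,
       string_to_array(product_id, ',')::int[] AS array
FROM orders;

--  product_id  |    array
-- -------------+--------------
--  12,3,4,345  | {12,3,4,345}
--  7,8         | {7,8}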

SQL Server changes the value in the float column when converting to varchar

I have a column in my table that is of float type. The table was automatically generated when I imported the spreadsheet (Excel) data to my database. Thus there is a column I wish to change from float to varchar, but when I try to do this, I get an error:
'tblInvoices' table
Unable to create index 'IX_tblInvoices'.
The CREATE UNIQUE INDEX statement terminated because a duplicate key was found for the object name 'dbo.tblInvoices' and the index name 'IX_tblInvoices'.
The duplicate key value is (1.00001e+006). The statement has been terminated.
It is a unique column and is set that way (not set as the primary key, for reasons). I have already run queries to search for and delete duplicate fields, but there are none. The query I ran is as follows:
WITH CTE AS
(
    SELECT
        Invoice,
        RN = ROW_NUMBER() OVER (PARTITION BY Invoice ORDER BY Invoice)
    FROM
        dbo.tblInvoices
)
DELETE FROM CTE
WHERE RN > 1
So the value within the Invoice column is 1000010 and when I run the following query a single row is found.
SELECT *
FROM [TradeReceivables_APR_IFRS9].[dbo].[tblInvoices]
WHERE Invoice = 1.00001e+006
Note that I have searched for the value in the error, 1.00001e+006, and not 1000010.
So my question is: why does the DBMS do this? Why does it change the value like that? When I remove that column, the same thing happens with another column, and so on (about 40,000 rows in total). How can I change the column from float to varchar without changing the data and getting errors?
Any help will be greatly appreciated!
It seems that the field holds integer values, so you can cast it to BIGINT before casting to VARCHAR:
Declare @Invoice as float = 1.00001e+006
print cast(@Invoice as varchar)                 -->> Result : 1.00001e+006
print cast(cast(@Invoice as bigint) as varchar) -->> Result : 1000010
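Applied to the table from the question, one way to avoid the scientific-notation strings is to route the values through BIGINT while changing the column (a sketch only; the varchar(20) length and the new column name are assumptions, and the unique index would need to be recreated afterwards):
-- Add a new varchar column, fill it from the float column via BIGINT,
-- then (after verifying the data) drop the old column and rename
ALTER TABLE dbo.tblInvoices ADD Invoice_chr varchar(20);
GO
UPDATE dbo.tblInvoices
SET    Invoice_chr = CAST(CAST(Invoice AS bigint) AS varchar(20));
GO
-- ALTER TABLE dbo.tblInvoices DROP COLUMN Invoice;
-- EXEC sp_rename 'dbo.tblInvoices.Invoice_chr', 'Invoice', 'COLUMN';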

Converting bigint to smallint shows an error

ALTER TABLE employee
ALTER COLUMN emp_phoneNo SMALLINT;
I am trying to alter the data type from BIGINT to SMALLINT and it is showing this error:
Arithmetic overflow error converting expression to data type int.
I am not able to understand what is wrong.
You have existing rows with values in that specific column that are bigger than the new data type allows.
You need to update or delete the rows that are currently "oversized"
(or not perform the column alter at all, because most likely you don't want to lose the information).
You can find the rows with this query:
SELECT 'CurrentlyOverSized' as MyLabel, * FROM dbo.employee WHERE ABS(emp_phoneNo) > 32767
Note a phone number like : 5555555555 (which would be numeric for 555-555-5555) would be greater than the 32767 number.
Even 5555555 (for 555-5555 (no area code)) is too big for 32767.
Also: number vs. string for storing phone numbers is a debatable topic. Check out this link for food for thought:
What datatype should be used for storing phone numbers in SQL Server 2005?
Personally I think numeric is the wrong data type for phone numbers.
Whatever you do, be consistent. If you go with a string (varchar(xyz)), for example, store them all the same way: with no extra characters (5555555555), with hyphens (555-555-5555), or with dots (555.555.5555), but store them all the same would be my advice.
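If you do go with a string, a minimal sketch of the switch (the varchar(20) length and the constraint name are assumptions):
-- Widening to a string avoids the overflow; values coming from bigint arrive as plain digits
ALTER TABLE dbo.employee
ALTER COLUMN emp_phoneNo varchar(20);

-- Optional: enforce the "digits only" convention going forward (hypothetical constraint name)
ALTER TABLE dbo.employee
ADD CONSTRAINT CK_employee_phone_digits CHECK (emp_phoneNo NOT LIKE '%[^0-9]%');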

SQL Server query coding in master detail table

I have two tables master and detail
Master table has columns:
id primary
mmrec
billno
billdate
billtype
name
address
amount
Details table has columns:
id primary
purchrecno (references master.mmrec)
itemcode
itemqty
amount
transtype (There will always be 4 types for this )
taxtype
Besides there is a tax table which has
taxcode primary
taxdesc
percent
What I want to achieve is a summary report which will have four rows, one per billtype.
There will be two sets of columns, giving twice as many columns as there are taxtypes used in the period. One set will show the base amount (the sum of amount for a particular type of tax); the other set will show the sum of tax (the tax is calculated from the amount using the percent in the tax table, so that base amount + tax amount equals the amount in the details table).
Obviously the master table will provide the id to select the details table transactions and the date for those transactions, and the tax table will provide the tax information and the percent for the tax calculation.
What I have conceived is a UNION of four queries, one per transtype,
but somehow or other I could not work it out properly,
and I need your help in coding this query.
I am open to using a stored procedure or view, but would prefer a single query that gets me the 4 rows.
Thanks
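For what it's worth, a minimal conditional-aggregation sketch of the kind of query described above (table and column names are taken from the question; the 'T1'/'T2' tax codes and the date range are hypothetical, and it assumes details.amount is the gross amount, i.e. base + tax, and that details.taxtype holds the taxcode):
SELECT m.billtype,
       SUM(CASE WHEN d.taxtype = 'T1'
                THEN d.amount * 100.0 / (100.0 + t.[percent]) ELSE 0 END) AS base_T1,
       SUM(CASE WHEN d.taxtype = 'T1'
                THEN d.amount * t.[percent] / (100.0 + t.[percent]) ELSE 0 END) AS tax_T1,
       SUM(CASE WHEN d.taxtype = 'T2'
                THEN d.amount * 100.0 / (100.0 + t.[percent]) ELSE 0 END) AS base_T2,
       SUM(CASE WHEN d.taxtype = 'T2'
                THEN d.amount * t.[percent] / (100.0 + t.[percent]) ELSE 0 END) AS tax_T2
FROM   [master] m
       JOIN details d ON d.purchrecno = m.mmrec
       JOIN tax t     ON t.taxcode = d.taxtype
WHERE  m.billdate >= '20230101' AND m.billdate < '20240101'   -- hypothetical period
GROUP BY m.billtype;   -- or GROUP BY d.transtype if the four rows are meant per transtype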

SQL Server Row Length

I'm attempting to determine the row length in bytes of a table by executing the following stored procedure:
CREATE TABLE #tmp
(
[ID] int,
Column_name varchar(640),
Type varchar(640),
Computed varchar(640),
Length int,
Prec int,
Scale int,
Nullable varchar(640),
TrimTrailingBlanks varchar(640),
FixedLenNullInSource varchar(640),
Collation varchar(256)
)
INSERT INTO #tmp exec sp_help MyTable
SELECT SUM(Length) FROM #tmp
DROP TABLE #tmp
The problem is that I don't know the table definition (data types, etc.) of the result set returned by sp_help.
I get the following error:
Insert Error: Column name or number of supplied values does not match table definition.
Looking at the sp_help stored procedure does not give me any clues.
What is the proper CREATE TABLE statement to insert the results of a sp_help?
How about doing it this way instead?
CREATE TABLE tblShowContig
(
ObjectName CHAR (255),
ObjectId INT,
IndexName CHAR (255),
IndexId INT,
Lvl INT,
CountPages INT,
CountRows INT,
MinRecSize INT,
MaxRecSize INT,
AvgRecSize INT,
ForRecCount INT,
Extents INT,
ExtentSwitches INT,
AvgFreeBytes INT,
AvgPageDensity INT,
ScanDensity DECIMAL,
BestCount INT,
ActualCount INT,
LogicalFrag DECIMAL,
ExtentFrag DECIMAL
)
GO
INSERT tblShowContig
EXEC ('DBCC SHOWCONTIG WITH TABLERESULTS')
GO
SELECT * from tblShowContig WHERE ObjectName = 'MyTable'
GO
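As an aside, DBCC SHOWCONTIG is deprecated on newer SQL Server versions; the same record-size figures are exposed by sys.dm_db_index_physical_stats (a sketch, assuming a table named dbo.MyTable):
SELECT ips.index_id,
       ips.min_record_size_in_bytes,
       ips.max_record_size_in_bytes,
       ips.avg_record_size_in_bytes
FROM   sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('dbo.MyTable'), NULL, NULL, 'DETAILED') AS ips
WHERE  ips.index_level = 0;   -- the leaf level holds the data rows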
Try this:
-- Sum up lengths of all columns
select SUM(sc.length)
from syscolumns sc
inner join systypes st on sc.xtype = st.xtype
where id = object_id('table')
-- Look at various items returned
select st.name, sc.*
from syscolumns sc
inner join systypes st on sc.xtype = st.xtype
where id = object_id('table')
No guarantees though, but it appears to be the same length that appears in sp_help 'table'
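On newer versions, the deprecated syscolumns/systypes views have sys.columns/sys.types equivalents; a minimal sketch of the same sum (assuming a table named dbo.MyTable):
-- max_length is the declared maximum size in bytes (-1 for varchar(max) and similar)
SELECT SUM(c.max_length) AS declared_row_bytes
FROM   sys.columns AS c
WHERE  c.object_id = OBJECT_ID('dbo.MyTable');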
DISCLAIMER:
Note that I read the article linked by John Rudy; in addition to the maximum sizes here, you also need other things like the NULL bitmap to get the actual row size. Also, the sizes here are maximum sizes: if you have a varchar column, the actual size is less on most rows.
Vendoran has a nice solution, but I do not see the maximum row size anywhere (based on table definition). I do see the average size and all sorts of allocation information, which is exactly what you need to estimate DB size for most things.
If you are interested in just what sp_help returns for Length and adding it up, then I think (I'm not 100% sure) that the query against syscolumns returns those same numbers. Do they represent the full maximum row size? No, you are missing things like the NULL bitmap. Do they represent a realistic measure of your actual data? No. Again, VARCHAR(500) does not take 500 bytes if you are only storing 100 characters. Also, TEXT fields and other fields stored separately from the row do not show their actual size, just the size of the pointer.
None of the aforementioned answers is correct or valid.
The question is one of determining the number of bytes consumed per row by each column's data type.
The only method(s) I have that work are:
exec sp_help 'mytable' - then add up the Length field of the second result set (If working from Query Analyzer or Management Studio - simply copy and paste the result into a spreadsheet and do a SUM)
Write a C# or VB.NET program that accesses the second resultset and sums the Length field of each row.
Modify the code of sp_help.
This cannot be done using Transact SQL and sp_help because there is no way to deal with multiple resultsets.
FWIW: The table definitions of the resultsets can be found here:
http://msdn.microsoft.com/en-us/library/aa933429(SQL.80).aspx
I can't help you with creating a temp table to store sp_help information, but I can help you with calculating row lengths. Check out this MSDN article; it helps you calculate such based on the field lengths, type, etc. Probably wouldn't take too much to convert it into a SQL script you could reuse by querying against sysobjects, etc.
EDIT:
I'm redacting my offer to do a script for it. My way was nowhere near as easy as Vendoran's. :)
As an aside, I take back what I said earlier about not being able to help with the temp table. I can: You can't do it. sp_help outputs seven rowsets, so I don't think you'll be able to do something as initially described in the original question. I think you're stuck using a different method to come up with it.
This will give you all the information you need
Select * into #mytables
from INFORMATION_SCHEMA.columns
select * from #mytables
drop table #mytables
UPDATE:
The answer I gave was incomplete, NOT incorrect. If you look at the data returned, you'd realize that you could write a query using CASE to calculate a row's size in bytes. It has all you need: the datatype, size, and precision. BOL has the bytes used by each datatype.
I will post the complete answer when I have a chance.
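As a starting point, a rough sketch of such a CASE-based estimate over INFORMATION_SCHEMA.COLUMNS (declared maximum sizes only; it ignores the NULL bitmap, row overhead, and off-row data, and the byte counts would need extending per BOL):
-- Note: CHARACTER_MAXIMUM_LENGTH is -1 for varchar(max)/nvarchar(max) columns
SELECT TABLE_NAME,
       SUM(CASE
               WHEN DATA_TYPE IN ('varchar', 'char')             THEN CHARACTER_MAXIMUM_LENGTH
               WHEN DATA_TYPE IN ('nvarchar', 'nchar')           THEN CHARACTER_MAXIMUM_LENGTH * 2
               WHEN DATA_TYPE = 'int'                            THEN 4
               WHEN DATA_TYPE IN ('bigint', 'datetime', 'float') THEN 8
               WHEN DATA_TYPE = 'smallint'                       THEN 2
               WHEN DATA_TYPE IN ('tinyint', 'bit')              THEN 1
               WHEN DATA_TYPE IN ('decimal', 'numeric')          THEN 9   -- rough: varies with precision
               ELSE 0                                            -- extend with the remaining types as needed
           END) AS approx_max_row_bytes
FROM   INFORMATION_SCHEMA.COLUMNS
WHERE  TABLE_NAME = 'MyTable'
GROUP BY TABLE_NAME;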
