I am working on a SELECT statement where I have to show the average rating per genre, and genres with no ratings must show as 0.0.
My code is
SELECT genreName AS 'Genre'
, CAST(CASE WHEN
AVG(rateValue) IS NULL
THEN 0
ELSE ROUND(FORMAT(AVG(rateValue),2),0.0)
END AS FLOAT) AS 'Average Rating'
FROM [IMDB].[FilmGenre] flmgnr
LEFT JOIN [IMDB].[FilmGenreAllocation] flmgnrall
ON(flmgnrall.genreId = flmgnr.genreId)
LEFT JOIN [IMDB].[Film] flm
ON(flm.filmId = flmgnrall.filmId)
LEFT JOIN [IMDB].[FilmReview] flmrvw
ON(flmrvw.filmId = flmgnrall.filmId)
GROUP BY genreName;
GO
and the result must be
+---------+----------------+
| Genre | Average Rating |
+---------+----------------+
| Action | 0.00 |
| Comedy | 4.67 |
| Crime | 4.50 |
| Drama | 4.50 |
| Family | 0.00 |
| Mystery | 4.40 |
+---------+----------------+
Agree with Zohar Peled. This should really be done in the presentation layer.
However, if you must, one option is to use the FORMAT() function if you're on SQL Server 2012+, or just cast to decimal:
...Format(IsNull(AVG(rateValue),0),'#0.00')
or
...Cast(IsNull(AVG(rateValue),0) as decimal(18,2))
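For illustration, here is how the two options could be dropped into the query from the question (table and column names taken from there; note that FORMAT returns nvarchar, so it is for display only, and if rateValue is an integer column you would want to cast it before averaging):

```sql
SELECT genreName AS [Genre]
     -- display-only string, always two decimals:
     , FORMAT(ISNULL(AVG(rateValue), 0), '#0.00') AS [Average Rating (text)]
     -- numeric, still sortable:
     , CAST(ISNULL(AVG(rateValue), 0) AS decimal(18,2)) AS [Average Rating]
FROM [IMDB].[FilmGenre] flmgnr
LEFT JOIN [IMDB].[FilmGenreAllocation] flmgnrall ON flmgnrall.genreId = flmgnr.genreId
LEFT JOIN [IMDB].[Film] flm ON flm.filmId = flmgnrall.filmId
LEFT JOIN [IMDB].[FilmReview] flmrvw ON flmrvw.filmId = flmgnrall.filmId
GROUP BY genreName;
```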
Don't cast your values to float, but rather to DECIMAL(18,2), like so:
CAST(CASE WHEN
AVG(rateValue) IS NULL
THEN 0
ELSE AVG(rateValue)
END AS DECIMAL(18,2)) AS 'Average Rating'
Instead of
THEN 0
try
THEN 0.0
With 0 you are telling SQL Server that the CASE expression is of the int data type.
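A quick way to see this (a throwaway check, not part of the original answer) is to ask SQL Server for the base type of each expression:

```sql
-- All branches are int literals, so the whole CASE is typed int:
SELECT SQL_VARIANT_PROPERTY(CASE WHEN 1 = 0 THEN 0 ELSE 1 END, 'BaseType');
-- A decimal literal in one branch promotes the CASE via data type precedence:
SELECT SQL_VARIANT_PROPERTY(CASE WHEN 1 = 0 THEN 0.0 ELSE 1 END, 'BaseType');
```

The first reports an int type and the second a decimal type, because a CASE expression takes the highest-precedence data type among its branches.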
Or, better still, replace this:
CAST(CASE WHEN
AVG(rateValue) IS NULL
THEN 0
ELSE ROUND(FORMAT(AVG(rateValue),2),0.0)
END AS FLOAT)
with:
1.0 * ROUND(ISNULL(AVG(rateValue), 0), 1) -- ISNULL takes its result data type from the first argument
Related
I have SQL code like this:
SELECT
CASE WHEN TRY_CONVERT(int, strCol) IS NULL
THEN strCol
ELSE CONVERT(VARCHAR, CONVERT(int, strCol))
END
The table is as follows:
| strCol |
|--------|
| 000373 |
| 2AB38 |
| C2039 |
| ABC21 |
| 32BC |
I wish to drop all the leading 0s in the rows that are purely numeric:
| strCol |
|--------|
| 373 |
| 2AB38 |
| C2039 |
| ABC21 |
| 32BC |
But I got the following error:
Conversion failed when converting the varchar value '2AB38' to data type int.
I don't quite understand; it should never even enter the ELSE branch, should it?
Yet another option is try_convert in concert with a coalesce
Example
Declare @YourTable Table ([strCol] varchar(50))

Insert Into @YourTable Values
 ('000373')
,('2AB38')
,('C2039')
,('ABC21')
,('32BC')

Select *
      ,NewVal = coalesce(left(try_convert(int,strCol),10),strCol)
 From @YourTable
Returns
strCol NewVal
000373 373
2AB38 2AB38
C2039 C2039
ABC21 ABC21
32BC 32BC
Thank you so much @Dale K and @Programnik.
CASE WHEN ISNUMERIC(strCol) = 0
THEN strCol
ELSE TRY_CONVERT(VARCHAR, TRY_CONVERT(int, strCol))
END AS strCol
This is the piece of code that got the work done.
All branches may be evaluated; there is no guarantee CASE will short-circuit. @Dale K
You can use ISNUMERIC in SQL Server. @Programnik
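One caveat, since ISNUMERIC is doing the guarding here: ISNUMERIC returns 1 for strings such as '$' or '1e4' that still fail TRY_CONVERT(int, ...), so those rows would come back NULL. A variant that leans only on TRY_CONVERT sidesteps that edge case (a sketch; YourTable stands in for the real table):

```sql
SELECT COALESCE(CONVERT(varchar(50), TRY_CONVERT(int, strCol)), strCol) AS strCol
FROM YourTable;
```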
I am still struggling to work out the best way to calculate a running balance.
I am going to use this code in a rent statement produced in SSRS, but I can't seem to work out how to achieve the running balance.
SELECT rt.TransactionId,
rt.TransactionDate,
rt.PostingDate,
rt.AccountId,
rt.TotalValue,
rab.ClosingBalance,
ROW_NUMBER()OVER(PARTITION BY rt.AccountId ORDER BY rt.PostingDate desc) AS row,
CASE WHEN ROW_NUMBER()OVER(PARTITION BY rt.AccountId ORDER BY rt.PostingDate desc) = 1
THEN ISNULL(rab.ClosingBalance,0)
ELSE 0 end
FROM RentTransactions rt
--all accounts for the specific agreement
INNER JOIN (select raa.AccountId
from RentAgreementEpisode rae
inner join RentAgreementAccount raa on raa.AgreementEpisodeId = rae.AgreementEpisodeId
where rae.AgreementId=1981
) ij on ij.AccountId = rt.AccountId
LEFT JOIN RentBalance rab on rab.AccountId = rt.AccountId AND rt.PostingDate BETWEEN rab.BalanceFromDate AND isnull(rab.BalanceToDate,dateadd(day, datediff(day, 0, GETDATE()), 0))
What this gives me are the results below (screenshot omitted).
So my code is sorting my transactions in the order I want and also is row numbering them in the correct order as well.
Where the row number is 1, I need it to pull back the balance on that account at that point in time, which is what I am doing. BUT I am then unsure how to get my code to start subtracting the succeeding rows: in this case, the current figure of 1118.58 would need the Total Value in row 2 (91.65) subtracted from it, so the running balance for row 2 would be 1026.93, and so on.
Any help would be greatly appreciated.
Assuming your query returns all the transactions, you can calculate a running total using the OVER clause; you just need to start at the beginning of your dataset rather than working backwards from your current balance:
declare @t table(d date, v decimal(10,2));

insert into @t values
('20170101',10),('20170102',20),('20170103',30),
('20170104',40),('20170105',50),('20170106',60),
('20170107',70),('20170108',80),('20170109',90);

select *
      ,sum(v) over (order by d
                    rows between unbounded preceding
                             and current row
                   ) as RunningTotal
from @t
order by d desc;
Output:
+------------+-------+--------------+
| d | v | RunningTotal |
+------------+-------+--------------+
| 2017-01-09 | 90.00 | 450.00 |
| 2017-01-08 | 80.00 | 360.00 |
| 2017-01-07 | 70.00 | 280.00 |
| 2017-01-06 | 60.00 | 210.00 |
| 2017-01-05 | 50.00 | 150.00 |
| 2017-01-04 | 40.00 | 100.00 |
| 2017-01-03 | 30.00 | 60.00 |
| 2017-01-02 | 20.00 | 30.00 |
| 2017-01-01 | 10.00 | 10.00 |
+------------+-------+--------------+
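Applied to the question's tables, the same idea would look roughly like this (a sketch; it assumes PostingDate plus TransactionId give a deterministic order within each account, and uses only column names from the question):

```sql
SELECT rt.TransactionId,
       rt.PostingDate,
       rt.AccountId,
       rt.TotalValue,
       -- running balance per account, accumulated oldest-first
       SUM(rt.TotalValue) OVER (PARTITION BY rt.AccountId
                                ORDER BY rt.PostingDate, rt.TransactionId
                                ROWS BETWEEN UNBOUNDED PRECEDING
                                         AND CURRENT ROW) AS RunningBalance
FROM RentTransactions rt
ORDER BY rt.AccountId, rt.PostingDate DESC;
```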
For some reason, even though the data in the ORIGINAL_BOOK column has 2 decimal places (e.g. 876.76), the statement below truncates the decimals. I want the decimals to be visible as well. Can someone suggest how to fix this, please?
Case
When [DN].[SETTLEMENT_DATE] > [DN].[AS_OF_DATE]
Then Cast([DN].[ORIGINAL_BOOK] as decimal(28, 2))
Else Cast([DN].[CURRENT_BOOK] as decimal(28, 2))
End
I can't be sure, because you only say that the type of the fields involved is NUMERIC, without specifying any precision or scale. However, if your source fields really are plain NUMERIC, SQL Server defaults to NUMERIC(18,0) (as per the MSDN documentation), so you will only be able to store values with a scale of zero (i.e. nothing after the decimal point), and any values written to these fields with a greater scale (i.e. data after the decimal point) will be rounded accordingly:
CREATE TABLE dn (
ORIGINAL_BOOK NUMERIC,
CURRENT_BOOK NUMERIC
)
INSERT INTO dn
SELECT 876.76, 423.75
UNION
SELECT 0, 0
UNION
SELECT 1.1, 6.5
UNION
SELECT 12, 54
UNION
SELECT 5.789, 6.321
SELECT CAST(dn.ORIGINAL_BOOK AS DECIMAL(28,2)) AS ORIGINAL_BOOK_CONV,
CAST(dn.CURRENT_BOOK AS DECIMAL(28,2)) AS CURRENT_BOOK_CONV
FROM dn
DROP TABLE dn
gives results:
/----------------------------------------\
| ORIGINAL_BOOK_CONV | CURRENT_BOOK_CONV |
|--------------------+-------------------|
| 0.00 | 0.00 |
| 1.00 | 7.00 |
| 6.00 | 6.00 |
| 12.00 | 54.00 |
| 877.00 | 424.00 |
\----------------------------------------/
Increasing the scale of the field in the table will allow values with greater numbers of decimal places to be stored and your CAST call will then reduce the number of decimal places if appropriate:
CREATE TABLE dn (
ORIGINAL_BOOK NUMERIC(28,3),
CURRENT_BOOK NUMERIC(28,3)
)
INSERT INTO dn
SELECT 876.76, 423.75
UNION
SELECT 0, 0
UNION
SELECT 1.1, 6.5
UNION
SELECT 12, 54
UNION
SELECT 5.789, 6.321
SELECT CAST(dn.ORIGINAL_BOOK AS DECIMAL(28,2)) AS ORIGINAL_BOOK_CONV,
CAST(dn.CURRENT_BOOK AS DECIMAL(28,2)) AS CURRENT_BOOK_CONV
FROM dn
DROP TABLE dn
gives results:
/----------------------------------------\
| ORIGINAL_BOOK_CONV | CURRENT_BOOK_CONV |
|--------------------+-------------------|
| 0.00 | 0.00 |
| 1.10 | 6.50 |
| 5.79 | 6.32 |
| 12.00 | 54.00 |
| 876.76 | 423.75 |
\----------------------------------------/
If you are sure that your table fields can hold numeric values with more than zero decimal places (i.e. scale > 0), please post the CREATE TABLE script for the table (you can get this from SSMS) or a screenshot of the column listing, so we can see the true type of the underlying fields. It would also be useful to see values SELECTed from the fields without any CASTing, so we can see how the data is presented without any conversion.
I'd like to write a query that compares decimal values stored in varchar columns.
The comparison should check, for each GroupID, whether the maximum age is higher than the minimum age * 10.
for example (the age column is varchar):
ID | Name | Age | GroupID
--------------------------
1 | AAA | 10.1 | 1
2 | BBB | 11 | 1
3 | CCC | 31.2 | 1
4 | DDD | 30.4 | 2
This is what I have tried after searching for a solution:
SELECT TOP 10 * FROM Groups g
JOIN People p1 ON p1.GroupID = g.ID
JOIN People p2 ON p2.GroupID = g.ID
WHERE CONVERT(DECIMAL(8,2),ISNULL(p1.Age,0)) > (CONVERT(DECIMAL(8,2),ISNULL(p2.Age,0)) * 10)
Thanks in advance!
I think you want to use window functions for this:
select t.*
from (select p.*,
             min(case when isnumeric(Age) = 1 then cast(Age as float) end) over
                 (partition by GroupID) as minage
      from People p
     ) t
where case when isnumeric(Age) = 1 then cast(Age as float) end > 10 * minage;
The CASE expression helps prevent errors in the event that ages are not numeric. Also, I figure float is a good enough data type for the age, although you could also use decimal.
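A self-contained sketch of the above, using the sample data from the question (the table-variable name is mine):

```sql
DECLARE @People TABLE (ID int, Name varchar(10), Age varchar(10), GroupID int);
INSERT INTO @People VALUES
    (1, 'AAA', '10.1', 1),
    (2, 'BBB', '11',   1),
    (3, 'CCC', '31.2', 1),
    (4, 'DDD', '30.4', 2);

SELECT t.*
FROM (SELECT p.*,
             MIN(CASE WHEN ISNUMERIC(Age) = 1 THEN CAST(Age AS float) END)
                 OVER (PARTITION BY GroupID) AS minage
      FROM @People p
     ) t
-- the same guarded cast on the outer comparison avoids conversion errors
WHERE CASE WHEN ISNUMERIC(Age) = 1 THEN CAST(Age AS float) END > 10 * minage;
```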
I am using SQL Server 2008. I have a table AdvanceEntry.
--------------------------------------------------------------------------------
Code | PaidDate | Amount | ReceiveDate | ReceiveAmount
--------------------------------------------------------------------------------
102 | 15-04-2004 | 3000 | 20-04-2004 | 2000
104 | 23-05-2006 | 1000 | NULL | 0.00
104 | 25-05-2005 | 1500 | 12-06-2005 | 500
When a person takes a loan, the loan amount is stored in the Amount column, the date in PaidDate, and the person's code in the Code column. When that person pays an amount back, it is stored in ReceiveAmount and the date in ReceiveDate.
Now I want to create a report like ledger of a specific code.
For example code 102
----------------------------------------------------------------------------
PaidDate / ReceiveDate | Amount | ReceiveAmount | Balance
----------------------------------------------------------------------------
15-04-2004 | 3000 | 0 | 3000
20-04-2004 | 0 | 2000 | 1000
And for code 104
----------------------------------------------------------------------------
PaidDate / ReceiveDate | Amount | ReceiveAmount | Balance
----------------------------------------------------------------------------
23-05-2006 | 1000 | 0 | 1000
25-05-2005 | 1500 | 0 | 2500
12-06-2005 | 0 | 500 | 2000
How can I do this? Please help me.. Thanks
Here's one way of doing it:
with Paid as
(
select Code
, PaidDate
, Amount
from AdvanceEntry
where PaidDate is not null
), Received as
(
select Code
, ReceiveDate
, ReceiveAmount
from AdvanceEntry
where ReceiveDate is not null
), Details as
(
select Code = coalesce(p.Code, r.Code)
, CodeDate = coalesce(p.PaidDate, r.ReceiveDate)
, Amount = sum(p.Amount)
, ReceiveAmount = sum(r.ReceiveAmount)
from Paid p
full join Received r on p.PaidDate = r.ReceiveDate and p.Code = r.Code
group by coalesce(p.Code, r.Code)
, coalesce(p.PaidDate, r.ReceiveDate)
)
select d.Code
, PayReceiveDate = d.CodeDate
, Amount = isnull(d.Amount, 0.0)
, ReceiveAmount = isnull(d.ReceiveAmount, 0.0)
, Balance = isnull(b.Balance, 0.0)
from Details d
outer apply (select Balance = sum(isnull(b.Amount, 0.0) - isnull(b.ReceiveAmount, 0.0))
from Details b where d.Code = b.Code and d.CodeDate >= b.CodeDate) b
order by d.Code, d.CodeDate
SQL Fiddle with demo.
It also looks like you had a slight typo in your data; I've changed it slightly in the fiddle to get your expected results.
Also worth mentioning that if you are only getting one pay/receive action per day per code you can get away without any GROUP BY in the query.
try this (untested):
;with cte as (
    select Code, PaidDate as [Date], Amount as Dr, 0 as Cr, Amount as Net
    from AdvanceEntry
    where PaidDate is not null
    union all
    select Code, ReceiveDate as [Date], 0 as Dr, ReceiveAmount as Cr, -ReceiveAmount as Net
    from AdvanceEntry
    where ReceiveDate is not null
)
select
    t1.Code, t1.[Date], t1.Dr, t1.Cr, sum(t2.Net) as Balance
from cte as t1
left join cte as t2 on t2.Code = t1.Code and t2.[Date] <= t1.[Date]
where t1.Code = @Code
group by t1.Code, t1.[Date], t1.Dr, t1.Cr;