How to include a new incremented column using DENSE_RANK() in Synapse - sql-server

In my Synapse workspace, I have a schedule table that stores information about all programs for a given day. It would be helpful if I could include a rank/dense-rank column in the output from this table, based on the title and ordered by event number and program date. Here are the table script and sample records. Could you please help me produce the 'Expected' column shown in my attempted query below?
CREATE TABLE [Prod].[Schedule]
(
[EventNo] [int] NOT NULL,
[ProgramDate] [date] NOT NULL,
[PlannedStartDateTime] [datetime2](3) NOT NULL,
[PlannedEndDateTime] [datetime2](3) NOT NULL,
[PlannedDuration] [varchar](15) NOT NULL,
[Title] [varchar](500) NOT NULL,
[Type] [varchar](10) NOT NULL
)
WITH
(
DISTRIBUTION = ROUND_ROBIN
);
INSERT INTO [Prod].[Schedule] VALUES(1,'2023-01-27','2023-01-27 06:00:00','2023-01-27 06:20:00','00:20:00:00','Breakfast - Episode 49','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(7,'2023-01-27','2023-01-27 06:22:00','2023-01-27 06:35:00','00:13:00:00','Breakfast - Episode 49','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(11,'2023-01-27','2023-01-27 06:37:00','2023-01-27 06:50:00','00:13:00:00','Breakfast - Episode 49','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(16,'2023-01-27','2023-01-27 06:52:00','2023-01-27 07:20:00','00:28:00:00','Breakfast - Episode 49','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(25,'2023-01-27','2023-01-27 07:23:30','2023-01-27 07:35:00','00:11:30:00','Breakfast - Episode 49','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(32,'2023-01-27','2023-01-27 07:38:30','2023-01-27 07:50:00','00:11:30:00','Breakfast - Episode 49','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(40,'2023-01-27','2023-01-27 07:53:30','2023-01-27 08:20:00','00:26:30:00','Breakfast - Episode 49','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(48,'2023-01-27','2023-01-27 08:23:30','2023-01-27 08:35:00','00:11:30:00','Breakfast - Episode 49','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(56,'2023-01-27','2023-01-27 08:38:30','2023-01-27 08:50:00','00:11:30:00','Breakfast - Episode 49','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(63,'2023-01-27','2023-01-27 08:53:30','2023-01-27 09:10:00','00:16:30:00','Breakfast - Episode 49','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(68,'2023-01-27','2023-01-27 09:13:30','2023-01-27 09:26:30','00:13:00:00','Breakfast - Episode 49','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(76,'2023-01-27','2023-01-27 09:30:00','2023-01-27 09:56:30','00:26:30:00','Briefing - Episode 336','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(84,'2023-01-27','2023-01-27 10:00:00','2023-01-27 10:20:00','00:20:00:00','Friday Morning - Episode 20','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(86,'2023-01-27','2023-01-27 10:22:00','2023-01-27 10:35:00','00:13:00:00','Friday Morning - Episode 20','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(91,'2023-01-27','2023-01-27 10:37:00','2023-01-27 10:50:00','00:13:00:00','Friday Morning - Episode 20','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(95,'2023-01-27','2023-01-27 10:52:00','2023-01-27 11:20:00','00:28:00:00','Friday Morning - Episode 20','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(97,'2023-01-27','2023-01-27 11:23:00','2023-01-27 11:35:00','00:12:00:00','Friday Morning - Episode 20','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(103,'2023-01-27','2023-01-27 11:37:00','2023-01-27 11:58:00','00:21:00:00','Friday Morning - Episode 20','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(109,'2023-01-27','2023-01-27 12:00:00','2023-01-27 12:20:00','00:20:00:00','Friday Afternoon - Episode 12','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(111,'2023-01-27','2023-01-27 12:22:00','2023-01-27 12:35:00','00:13:00:00','Friday Afternoon - Episode 12','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(115,'2023-01-27','2023-01-27 12:37:00','2023-01-27 12:50:00','00:13:00:00','Friday Afternoon - Episode 12','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(118,'2023-01-27','2023-01-27 12:52:00','2023-01-27 13:20:00','00:28:00:00','Friday Afternoon - Episode 12','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(120,'2023-01-27','2023-01-27 13:22:00','2023-01-27 13:35:00','00:13:00:00','Friday Afternoon - Episode 12','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(123,'2023-01-27','2023-01-27 13:37:00','2023-01-27 13:58:00','00:21:00:00','Friday Afternoon - Episode 12','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(131,'2023-01-27','2023-01-27 14:00:00','2023-01-27 14:20:00','00:20:00:00','The Briefing - Episode 62','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(133,'2023-01-27','2023-01-27 14:22:00','2023-01-27 14:35:00','00:13:00:00','The Briefing - Episode 62','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(138,'2023-01-27','2023-01-27 14:37:00','2023-01-27 14:58:00','00:21:00:00','The Briefing - Episode 62','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(142,'2023-01-27','2023-01-27 15:00:00','2023-01-27 15:20:00','00:20:00:00','Friday Show - Episode 59','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(144,'2023-01-27','2023-01-27 15:23:00','2023-01-27 15:35:00','00:12:00:00','Friday Show - Episode 59','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(148,'2023-01-27','2023-01-27 15:37:00','2023-01-27 15:50:00','00:13:00:00','Friday Show - Episode 59','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(153,'2023-01-27','2023-01-27 15:52:00','2023-01-27 16:20:00','00:28:00:00','Friday Show - Episode 59','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(155,'2023-01-27','2023-01-27 16:22:00','2023-01-27 16:35:00','00:13:00:00','Friday Show - Episode 59','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(161,'2023-01-27','2023-01-27 16:37:00','2023-01-27 16:50:00','00:13:00:00','Friday Show - Episode 59','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(165,'2023-01-27','2023-01-27 16:52:00','2023-01-27 17:20:00','00:28:00:00','Friday Show - Episode 59','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(167,'2023-01-27','2023-01-27 17:22:00','2023-01-27 17:35:00','00:13:00:00','Friday Show - Episode 59','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(172,'2023-01-27','2023-01-27 17:37:00','2023-01-27 17:58:00','00:21:00:00','Friday Show - Episode 59','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(176,'2023-01-27','2023-01-27 18:00:00','2023-01-27 18:15:00','00:15:00:00','Send to Phone - Episode 392','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(182,'2023-01-27','2023-01-27 18:17:00','2023-01-27 18:30:00','00:13:00:00','Send to Phone - Episode 392','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(186,'2023-01-27','2023-01-27 18:32:00','2023-01-27 18:45:00','00:13:00:00','Send to Phone - Episode 392','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(191,'2023-01-27','2023-01-27 18:47:00','2023-01-27 18:58:00','00:11:00:00','Send to Phone - Episode 392','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(196,'2023-01-27','2023-01-27 19:00:00','2023-01-27 19:15:00','00:15:00:00','Listen to Music - Episode 18','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(198,'2023-01-27','2023-01-27 19:18:00','2023-01-27 19:30:00','00:12:00:00','Listen to Music - Episode 18','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(205,'2023-01-27','2023-01-27 19:33:00','2023-01-27 19:45:00','00:12:00:00','Listen to Music - Episode 18','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(213,'2023-01-27','2023-01-27 19:48:00','2023-01-27 19:57:00','00:09:00:00','Listen to Music - Episode 18','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(215,'2023-01-27','2023-01-27 20:00:00','2023-01-27 20:15:00','00:15:00:00','Tonight Game - Episode 16','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(222,'2023-01-27','2023-01-27 20:18:00','2023-01-27 20:30:00','00:12:00:00','Tonight Game - Episode 16','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(229,'2023-01-27','2023-01-27 20:33:00','2023-01-27 20:45:00','00:12:00:00','Tonight Game - Episode 16','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(236,'2023-01-27','2023-01-27 20:48:00','2023-01-27 20:57:00','00:09:00:00','Tonight Game - Episode 16','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(242,'2023-01-27','2023-01-27 21:00:00','2023-01-27 21:15:00','00:15:00:00','Tonight Game - Episode 16','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(249,'2023-01-27','2023-01-27 21:18:00','2023-01-27 21:30:00','00:12:00:00','Tonight Game - Episode 16','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(256,'2023-01-27','2023-01-27 21:33:00','2023-01-27 21:45:00','00:12:00:00','Tonight Game - Episode 16','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(263,'2023-01-27','2023-01-27 21:48:00','2023-01-27 21:57:00','00:09:00:00','Tonight Game - Episode 16','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(271,'2023-01-27','2023-01-27 22:00:00','2023-01-27 22:15:00','00:15:00:00','Tonight Game - Episode 16','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(279,'2023-01-27','2023-01-27 22:18:00','2023-01-27 22:30:00','00:12:00:00','Tonight Game - Episode 16','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(286,'2023-01-27','2023-01-27 22:33:00','2023-01-27 22:45:00','00:12:00:00','Tonight Game - Episode 16','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(292,'2023-01-27','2023-01-27 22:48:00','2023-01-27 22:57:00','00:09:00:00','Tonight Game - Episode 16','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(300,'2023-01-27','2023-01-27 23:00:00','2023-01-27 23:15:00','00:15:00:00','Head - Episode 418','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(305,'2023-01-27','2023-01-27 23:17:00','2023-01-27 23:30:00','00:13:00:00','Head - Episode 418','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(310,'2023-01-27','2023-01-27 23:32:00','2023-01-27 23:45:00','00:13:00:00','Head - Episode 418','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(314,'2023-01-27','2023-01-27 23:47:00','2023-01-27 23:57:00','00:10:00:00','Head - Episode 418','LIVE');
INSERT INTO [Prod].[Schedule] VALUES(322,'2023-01-27','2023-01-27 00:00:00','2023-01-27 00:15:00','00:15:00:00','Listen to Music Replay - Episode 18','REPLAY');
INSERT INTO [Prod].[Schedule] VALUES(324,'2023-01-27','2023-01-27 00:18:00','2023-01-27 00:35:00','00:17:00:00','Listen to Music Replay - Episode 18','REPLAY');
INSERT INTO [Prod].[Schedule] VALUES(326,'2023-01-27','2023-01-27 00:38:00','2023-01-27 00:45:00','00:07:00:00','Listen to Music Replay - Episode 18','REPLAY');
INSERT INTO [Prod].[Schedule] VALUES(328,'2023-01-27','2023-01-27 00:48:00','2023-01-27 00:57:00','00:09:00:00','Listen to Music Replay - Episode 18','REPLAY');
INSERT INTO [Prod].[Schedule] VALUES(330,'2023-01-27','2023-01-27 01:00:00','2023-01-27 01:15:00','00:15:00:00','Head Replay - Episode 418','REPLAY');
INSERT INTO [Prod].[Schedule] VALUES(332,'2023-01-27','2023-01-27 01:17:00','2023-01-27 01:30:00','00:13:00:00','Head Replay - Episode 418','REPLAY');
INSERT INTO [Prod].[Schedule] VALUES(334,'2023-01-27','2023-01-27 01:32:00','2023-01-27 01:45:00','00:13:00:00','Head Replay - Episode 418','REPLAY');
INSERT INTO [Prod].[Schedule] VALUES(336,'2023-01-27','2023-01-27 01:47:00','2023-01-27 01:57:00','00:10:00:00','Head Replay - Episode 418','REPLAY');
INSERT INTO [Prod].[Schedule] VALUES(338,'2023-01-27','2023-01-27 02:00:00','2023-01-27 02:15:00','00:15:00:00','Tonight Game Replay - Episode 16','REPLAY');
INSERT INTO [Prod].[Schedule] VALUES(340,'2023-01-27','2023-01-27 02:18:00','2023-01-27 02:30:00','00:12:00:00','Tonight Game Replay - Episode 16','REPLAY');
INSERT INTO [Prod].[Schedule] VALUES(342,'2023-01-27','2023-01-27 02:33:00','2023-01-27 02:45:00','00:12:00:00','Tonight Game Replay - Episode 16','REPLAY');
INSERT INTO [Prod].[Schedule] VALUES(344,'2023-01-27','2023-01-27 02:48:00','2023-01-27 02:57:00','00:09:00:00','Tonight Game Replay - Episode 16','REPLAY');
INSERT INTO [Prod].[Schedule] VALUES(346,'2023-01-27','2023-01-27 03:00:00','2023-01-27 03:15:00','00:15:00:00','Tonight Game Replay - Episode 16','REPLAY');
INSERT INTO [Prod].[Schedule] VALUES(348,'2023-01-27','2023-01-27 03:18:00','2023-01-27 03:30:00','00:12:00:00','Tonight Game Replay - Episode 16','REPLAY');
INSERT INTO [Prod].[Schedule] VALUES(350,'2023-01-27','2023-01-27 03:33:00','2023-01-27 03:45:00','00:12:00:00','Tonight Game Replay - Episode 16','REPLAY');
INSERT INTO [Prod].[Schedule] VALUES(352,'2023-01-27','2023-01-27 03:48:00','2023-01-27 03:57:00','00:09:00:00','Tonight Game Replay - Episode 16','REPLAY');
INSERT INTO [Prod].[Schedule] VALUES(354,'2023-01-27','2023-01-27 04:00:00','2023-01-27 04:15:00','00:15:00:00','Tonight Game Replay - Episode 16','REPLAY');
INSERT INTO [Prod].[Schedule] VALUES(356,'2023-01-27','2023-01-27 04:18:00','2023-01-27 04:30:00','00:12:00:00','Tonight Game Replay - Episode 16','REPLAY');
INSERT INTO [Prod].[Schedule] VALUES(358,'2023-01-27','2023-01-27 04:33:00','2023-01-27 04:45:00','00:12:00:00','Tonight Game Replay - Episode 16','REPLAY');
INSERT INTO [Prod].[Schedule] VALUES(360,'2023-01-27','2023-01-27 04:48:00','2023-01-27 04:57:00','00:09:00:00','Tonight Game Replay - Episode 16','REPLAY');
INSERT INTO [Prod].[Schedule] VALUES(362,'2023-01-27','2023-01-27 05:00:00','2023-01-27 05:15:00','00:15:00:00','Head Replay - Episode 418','REPLAY');
INSERT INTO [Prod].[Schedule] VALUES(364,'2023-01-27','2023-01-27 05:17:00','2023-01-27 05:30:00','00:13:00:00','Head Replay - Episode 418','REPLAY');
INSERT INTO [Prod].[Schedule] VALUES(366,'2023-01-27','2023-01-27 05:32:00','2023-01-27 05:45:00','00:13:00:00','Head Replay - Episode 418','REPLAY');
INSERT INTO [Prod].[Schedule] VALUES(368,'2023-01-27','2023-01-27 05:47:00','2023-01-27 05:57:00','00:10:00:00','Head Replay - Episode 418','REPLAY');
INSERT INTO [Prod].[Schedule] VALUES(373,'2023-01-27','2023-01-27 05:59:00','2023-01-27 06:00:00','00:01:00:00','UK National Anthem - Episode 369','LIVE');
SELECT EventNo,ProgramDate,PlannedStartDateTime,PlannedEndDateTime,PlannedDuration,Title,Type,DENSE_RANK() OVER(ORDER BY Title,Type ASC) Expected
FROM [Prod].[Schedule] WHERE ProgramDate = '2023-01-27' ORDER BY EventNo,ProgramDate;

A plain DENSE_RANK() over the title can't produce that column: it ranks titles alphabetically and gives every row with the same title the same number, whereas the expected value should increase each time the title changes in EventNo order (a title that airs in two separate blocks, like 'Head Replay - Episode 418' at EventNo 330-336 and again at 362-368, should get two different numbers). A running total of title changes does what you want:
;with cte as (
select *,
       -- 1 when this row starts a new title block, 0 when it continues the previous one
       case when coalesce(lag(Title) over (order by EventNo), Title) = Title
            then 0 else 1 end as sameShow
from [Prod].[Schedule]
)
select EventNo, ProgramDate, PlannedStartDateTime, PlannedEndDateTime, PlannedDuration, Title, Type,
       -- running count of title changes; +1 so the first block is numbered 1
       sum(sameShow) over (order by EventNo) + 1 as expected
from cte
order by EventNo
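For the sample data above, the last five title blocks come out as follows (derived by hand from the INSERT statements, so worth verifying against an actual run). Note the two separate 'Head Replay' blocks getting distinct numbers, which DENSE_RANK over Title could never do:
EventNo range  Title                                expected
322-328        Listen to Music Replay - Episode 18  11
330-336        Head Replay - Episode 418            12
338-360        Tonight Game Replay - Episode 16     13
362-368        Head Replay - Episode 418            14
373            UK National Anthem - Episode 369     15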

Related

How do you calculate the number of days between dates on different rows

I have a table with 3 columns: clientid, start_date, end_date
Each row is an episode of care, and clients can have multiple episodes of care. I am trying to write a query in SQL Server that will display the start and end date of the latest episode of care only if it starts 21 days after the previous episode's end date. Can anyone help please?
Comparing 'the end date from one episode (row 1) with the start date of the next episode (row 2)' is straightforward with LAG, but you cannot use window functions (or a difference calculated from one) in the WHERE clause, which means using a subquery (or a CTE) to do the bulk of the work:
drop table if exists t
go
create table t
(clientid int, start_date date, end_date date);
go
insert into t values
(1,'2022-09-01','2022-09-01'),
(1,'2022-09-10','2022-09-10')
go
-- window functions can't appear in WHERE, so compute the gap in a derived table and filter outside
select * from
(
select *,
lag(end_date) over (partition by clientid order by start_date) lage,
datediff(d, lag(end_date) over (partition by clientid order by start_date), start_date) diff
from t
) s
where diff = 9 -- the sample rows are 9 days apart; see the sketch below for the 21-day requirement
clientid start_date end_date lage diff
----------- ---------- ---------- ---------- -----------
1 2022-09-10 2022-09-10 2022-09-01 9
(1 row(s) affected)
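For the stated 21-day requirement (show the latest episode per client only when it starts at least 21 days after the previous episode ended), a sketch building on the same pattern - assuming 'latest' means the most recent start_date per client:
select clientid, start_date, end_date
from (
select *,
datediff(day, lag(end_date) over (partition by clientid order by start_date), start_date) as diff,
row_number() over (partition by clientid order by start_date desc) as rn
from t
) s
where rn = 1     -- latest episode per client
and diff >= 21   -- starts 21+ days after the previous episode ended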
LAG worked for me. Thanks for the help.

Under what conditions will the gin index in opengauss be used?

CREATE TABLE tsearch.pgweb(id int, body text, title text, last_mod_date date);
CREATE TABLE
omm=# INSERT INTO tsearch.pgweb VALUES(1, 'China, officially the People''s Republic of China(PRC), located in Asia, is the world''s most populous state.', 'China', '2010-1-1');
INSERT 0 1
omm=# INSERT INTO tsearch.pgweb VALUES(2, 'America is a rock band, formed in England in 1970 by multi-instrumentalists Dewey Bunnell, Dan Peek, and Gerry Beckley.', 'America', '2010-1-1');
INSERT 0 1
omm=# INSERT INTO tsearch.pgweb VALUES(3, 'England is a country that is part of the United Kingdom. It shares land borders with Scotland to the north and Wales to the west.', 'England','2010-1-1');
-- To speed up text searches, GIN indexes can be created (specify the english configuration to parse and normalize strings)
omm=# CREATE INDEX pgweb_idx_1 ON tsearch.pgweb USING gin(to_tsvector('english', body));
CREATE INDEX
-- concatenated columns index
omm=# CREATE INDEX pgweb_idx_3 ON tsearch.pgweb USING gin(to_tsvector('english', title || ' ' || body));
CREATE INDEX
At this point, executing explain SELECT body FROM tsearch.pgweb WHERE to_tsvector(body) @@ to_tsquery('america'); shows that the GIN index is not used.
I would like to ask: under what circumstances will such an index be used? (The index was also not used when testing with 10,000 rows inserted.)
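One thing worth checking, assuming openGauss follows PostgreSQL's matching rules for expression indexes: the planner can only use such an index when the query contains the same expression that was indexed. pgweb_idx_1 indexes the two-argument to_tsvector('english', body), but the query above calls the one-argument to_tsvector(body), whose result depends on default_text_search_config; that is a different expression, so the index cannot match. A query written against the indexed expression should be able to use it (though the planner may still prefer a sequential scan on very small tables):
omm=# EXPLAIN SELECT body FROM tsearch.pgweb WHERE to_tsvector('english', body) @@ to_tsquery('english', 'america');
The same applies to pgweb_idx_3: the predicate would need to use to_tsvector('english', title || ' ' || body) verbatim.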

Lots of incorrect syntax errors along with an IDENTITY_INSERT issue

I'm trying to create my first database and it's my first time using SQL Server. The software is a little confusing to me; right now I have a database script, but when I execute it I get a lot of errors. I tried following a guide, and it didn't show any errors when the author ran it.
I'm trying to create a database for a streaming website. My errors and my script are below. I don't understand any of them - how is my syntax wrong? I've only used Java before, so this is all a little overwhelming.
Msg 156, Level 15, State 1, Line 14
Incorrect syntax near the keyword 'ON'.
Msg 102, Level 15, State 1, Line 19
Incorrect syntax near 'User_Password'.
Msg 102, Level 15, State 1, Line 29
Incorrect syntax near ')'.
Msg 208, Level 16, State 1, Line 32
Invalid object name 'Content'.
Msg 2714, Level 16, State 6, Line 45
There is already an object named 'ProfileSettings' in the database.
Msg 544, Level 16, State 1, Line 52
Cannot insert explicit value for identity column in table 'ProfileSettings' when IDENTITY_INSERT is set to OFF.
USE master
GO
if exists ( select * from sysdatabases where name = 'Detour')
drop database Detour
GO
CREATE DATABASE [Detour]
GO
ALTER DATABASE [Detour] SET COMPATIBILITY_LEVEL = 140
GO
SET IDENTITY_INSERT ON [dbo].[orders]
GO
CREATE TABLE dbo.Account (
UserName INT IDENTITY (1,1) PRIMARY KEY
User_Password VARCHAR (20),
GO
INSERT INTO Account(Username, Password) VALUES ('ReedKinney', 'PASSWORD123')
CREATE TABLE Content (
Content_Cycle INT IDENTITY (1,1) PRIMARY KEY,
TV VARCHAR(50),
MOVIES VARCHAR(50),
GENRES VARCHAR(50),
HIGHESTRATED VARCHAR(50)
GO
INSERT INTO Content(GENRES, TV, MOVIES, HIGHESTRATED) VALUES('ACTION', 'THE WITCHER', 'BLADE RUNNER', 'SHERLOCK HOLMES')
INSERT INTO Content(GENRES, TV, MOVIES, HIGHESTRATED) VALUES('CLASSICS', 'TWIN PEAKS', 'TAXI DRIVER', 'THE LONGEST YARD')
INSERT INTO Content(GENRES, TV, MOVIES, HIGHESTRATED) VALUES('COMEDIES', 'THE OFFICE', 'JUST FRIENDS', 'WATERBOY')
INSERT INTO Content(GENRES, TV, MOVIES, HIGHESTRATED) VALUES('DOCUMENTARIES', 'TIGER KING', 'ICARUS', 'ZEITGEIST')
INSERT INTO Content(GENRES, TV, MOVIES, HIGHESTRATED) VALUES('DRAMAS', 'GAME OF THRONES', 'THE KINGS SPEECH', 'MARRIAGE STORY')
INSERT INTO Content(GENRES, TV, MOVIES, HIGHESTRATED) VALUES('HORROR', 'HAUNTED MANSION', 'HOUSE ON THE LEFT', 'PARANORMAL ACTIVITY')
INSERT INTO Content(GENRES, TV, MOVIES, HIGHESTRATED) VALUES('ROMANCE', 'GILMORE GIRLS', 'HER', 'SILVER LNININGS PLAYBOOK')
INSERT INTO Content(GENRES, TV, MOVIES, HIGHESTRATED) VALUES('THRILLER', 'OZARK', 'SE7EN', 'THE GIRL WITH THE DRAGON TATTOO')
INSERT INTO Content(GENRES, TV, MOVIES, HIGHESTRATED) VALUES('SPORTS', 'BASKETBALL', 'MONEYBALL', 'NACHO LIBRE')
INSERT INTO Content(GENRES, TV, MOVIES, HIGHESTRATED) VALUES('SCIENCE', 'THE UNIVERSE', 'THE MARTIAN', 'OUR PLANET')
GO
CREATE TABLE ProfileSettings (
ProfileName INT IDENTITY (1,1) PRIMARY KEY,
LanguageChoice VARCHAR(50),
MaturitySetting VARCHAR(50) )
GO
INSERT INTO ProfileSettings (ProfileName, LanguageChoice, MaturitySetting) VALUES('REED', 'ENGLISH', 'ALL CONTENT')
GO
You should use a decent IDE such as SSMS, which highlights these errors pretty well. I have corrected the syntax errors and annotated the changes inline. If this is an accurate transcription of the guide's code, I suggest you get another guide.
if exists ( select * from sys.databases where name = 'Detour') -- sys.databases: this checks for a database; sysobjects (which lists objects) would be wrong here
drop database Detour
GO
CREATE DATABASE [Detour]
GO
ALTER DATABASE [Detour] SET COMPATIBILITY_LEVEL = 140
GO
SET IDENTITY_INSERT [dbo].[orders] on --moved ON to end of statement; note this script never creates dbo.orders, so the statement will fail and can probably be removed
GO
CREATE TABLE dbo.Account (
UserName INT IDENTITY (1,1) PRIMARY KEY, --
User_Password VARCHAR (20) -- remove ,
) -- added
GO
INSERT INTO Account(User_Password) VALUES ('PASSWORD123') --password >> user_password; also dropped Username: it is an int IDENTITY column and cannot take the string 'ReedKinney'
CREATE TABLE Content (
Content_Cycle INT IDENTITY (1,1) PRIMARY KEY,
TV VARCHAR(50),
MOVIES VARCHAR(50),
GENRES VARCHAR(50),
HIGHESTRATED VARCHAR(50)
) --added
GO
INSERT INTO Content(GENRES, TV, MOVIES, HIGHESTRATED) VALUES('ACTION', 'THE WITCHER', 'BLADE RUNNER', 'SHERLOCK HOLMES')
INSERT INTO Content(GENRES, TV, MOVIES, HIGHESTRATED) VALUES('CLASSICS', 'TWIN PEAKS', 'TAXI DRIVER', 'THE LONGEST YARD')
INSERT INTO Content(GENRES, TV, MOVIES, HIGHESTRATED) VALUES('COMEDIES', 'THE OFFICE', 'JUST FRIENDS', 'WATERBOY')
INSERT INTO Content(GENRES, TV, MOVIES, HIGHESTRATED) VALUES('DOCUMENTARIES', 'TIGER KING', 'ICARUS', 'ZEITGEIST')
INSERT INTO Content(GENRES, TV, MOVIES, HIGHESTRATED) VALUES('DRAMAS', 'GAME OF THRONES', 'THE KINGS SPEECH', 'MARRIAGE STORY')
INSERT INTO Content(GENRES, TV, MOVIES, HIGHESTRATED) VALUES('HORROR', 'HAUNTED MANSION', 'HOUSE ON THE LEFT', 'PARANORMAL ACTIVITY')
INSERT INTO Content(GENRES, TV, MOVIES, HIGHESTRATED) VALUES('ROMANCE', 'GILMORE GIRLS', 'HER', 'SILVER LNININGS PLAYBOOK')
INSERT INTO Content(GENRES, TV, MOVIES, HIGHESTRATED) VALUES('THRILLER', 'OZARK', 'SE7EN', 'THE GIRL WITH THE DRAGON TATTOO')
INSERT INTO Content(GENRES, TV, MOVIES, HIGHESTRATED) VALUES('SPORTS', 'BASKETBALL', 'MONEYBALL', 'NACHO LIBRE')
INSERT INTO Content(GENRES, TV, MOVIES, HIGHESTRATED) VALUES('SCIENCE', 'THE UNIVERSE', 'THE MARTIAN', 'OUR PLANET')
GO
CREATE TABLE ProfileSettings (
ProfileName INT IDENTITY (1,1) PRIMARY KEY,
LanguageChoice VARCHAR(50),
MaturitySetting VARCHAR(50)
)
GO
INSERT INTO ProfileSettings (LanguageChoice, MaturitySetting) VALUES('ENGLISH', 'ALL CONTENT') --dropped ProfileName: it is an int IDENTITY column, which is what raised Msg 544
GO
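As a side note, on SQL Server 2016 and later (this script targets compatibility level 140, i.e. SQL Server 2017), the existence check and drop at the top can be collapsed into a single statement:
USE master
GO
DROP DATABASE IF EXISTS Detour
GO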

SQL Server - Find and Replace, If row contains a value from another table it needs to be removed

I have a table that contains a list of retailers (200 or so), for example:
Cambridge Food Gauteng
Cambridge Food Klerksdorp & Carletonville
Cambridge Food KZN
Cambridge Food Mitchells Plain
Cambridge Food Nelspruit & Seshego
Cambridge Food Tembisa
Boxer Super Store Eastern Cape
Boxer Super Store Free State
Boxer Super Store Gauteng
Boxer Super Store KZN
Boxer Super Store Limpopo
Boxer Super Store Mpumalanga
Boxer Super Store North-West
Checkers Eastern Cape
Checkers Eastern Cape, Northern Cape & KwaZulu-Natal
Checkers Gauteng, Mpumalanga, Limpopo, North West
Checkers Hyper
Checkers Hyper Western Cape
Checkers KwaZulu - Natal
Checkers KwaZulu-Natal
Checkers Medirite Specials
Checkers Western Cape
Checkers Western Cape & Inland
Checkout Eastern Cape
My objective is to remove the area name(s) from the retailer name (Gauteng, Klerksdorp etc) so that it only shows Cambridge Food.
I have a table that contains all these areas (154 areas).
Example of my area list:
Cape Town
Carletonville
Centurion
Chatsworth
Claremont
Cresta
Dolphin Coast
Durban
Durban North
East London
East Rand
Eldos
Empangeni
My thinking was to check each retailer row: if it contains an area name from my area table, replace the area with nothing / remove it, ultimately leaving me with a list of retailer names without any region name.
Ideally it would be great to do this without a loop, but if that is the only option, it will have to do.
In short: if the retailer name contains an area from the area table, the area needs to be removed.
Hope it makes sense - thanks for helping.
If every row contains the area name after the word 'Food', then you can simply ignore everything past that (the expression below keeps a trailing space; wrap it in RTRIM if that matters).
SUBSTRING(RETAILER, 1, CHARINDEX('FOOD', RETAILER, 1)+4)
OR
try the following:
DECLARE @RETAILERS TABLE (RETAILER VARCHAR(200)) -- table variables are declared with @, not #
INSERT INTO @RETAILERS SELECT 'Cambridge Food Gauteng'
INSERT INTO @RETAILERS SELECT 'Cambridge Food Klerksdorp & Carletonville'
INSERT INTO @RETAILERS SELECT 'Cambridge Food KZN'
INSERT INTO @RETAILERS SELECT 'Cambridge Food Mitchells Plain'
INSERT INTO @RETAILERS SELECT 'Cambridge Food Nelspruit & Seshego'
INSERT INTO @RETAILERS SELECT 'Cambridge Food Tembisa'
INSERT INTO @RETAILERS SELECT 'Cambridge Food NO AREA'
DECLARE @REGION TABLE (REGION VARCHAR(100))
INSERT INTO @REGION SELECT 'Gauteng'
INSERT INTO @REGION SELECT 'Klerksdorp & Carletonville'
INSERT INTO @REGION SELECT 'KZN'
INSERT INTO @REGION SELECT 'Mitchells Plain'
INSERT INTO @REGION SELECT 'Nelspruit & Seshego'
INSERT INTO @REGION SELECT 'Tembisa'
-- strip the matching region from each retailer name (RTRIM drops the space left behind)
UPDATE R
SET RETAILER = RTRIM(REPLACE(R.RETAILER, RG.REGION, ''))
FROM @RETAILERS R
INNER JOIN @REGION RG ON CHARINDEX(RG.REGION, R.RETAILER, 1) > 0
SELECT * FROM @RETAILERS
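A single pass removes at most one matching area per row. If a retailer name could contain more than one area, repeat the update until nothing changes - a sketch reusing the tables above:
-- loop only until no retailer name contains any region
WHILE 1 = 1
BEGIN
    UPDATE R
    SET RETAILER = RTRIM(REPLACE(R.RETAILER, RG.REGION, ''))
    FROM @RETAILERS R
    INNER JOIN @REGION RG ON CHARINDEX(RG.REGION, R.RETAILER, 1) > 0
    IF @@ROWCOUNT = 0 BREAK
END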

Getting measures from parent table using child table dimensions

I have a cube built in SSAS2008 that is built upon 2 tables - Accounts and Payments.
Both tables have dimensions and measures, and it appears to be working relatively well - I can get payments for accounts broken down into dimensions for either payments or accounts, and account measures broken down into account dimensions.
What I can't do is view measures for accounts where a relationship exists with the child payments table. For example, see the balance of all accounts that have at least 1 payment.
I understand I may need a separate cube for this, but I still can't see how my data source would need to be configured.
Ideally I'd prefer not to completely reformat the data into a fact/dimension snowflake schema, as I'm not entirely sure how to do that with the relational data I have; however, any suggestions on this would be welcome.
Thanks.
Update: Bounty added due to lack of interest...
My answer takes into account that you don't want to reformat your data into a traditional data warehouse schema. If it gets you further down the road then good for you but I suspect you'll run into more of these problems as you grow your project. It might be worth tinkering with how you might transform the data into a star schema before you need it.
I can suggest a few options. The first that comes to mind is to make a degenerate dimension in the accounts cube that is based on the payments fact table. The following example answers your "all accounts that have a payment" problem but this should work for similar questions. I assumed an account statement date as the last day of each calendar month so you'll want to count payments made in each calendar month.
create table accounts_fact
( account_id int not null,
statement_date datetime not null,
bal int not null,
constraint acc_pk primary key (account_id, statement_date)
)
create table payments_fact
( account_id int not null,
payment_date datetime not null,
amount money not null
)
insert into accounts_fact values (1, '20100131', 100)
insert into accounts_fact values (1, '20100228', 120)
insert into accounts_fact values (1, '20100331', 0)
insert into accounts_fact values (2, '20100131', 100)
insert into accounts_fact values (2, '20100228', 20)
insert into accounts_fact values (2, '20100331', 50)
insert into accounts_fact values (3, '20100131', 10)
insert into accounts_fact values (3, '20100228', 30)
insert into accounts_fact values (3, '20100331', 50)
insert into payments_fact values (1, '20100112', 50)
insert into payments_fact values (1, '20100118', 60)
insert into payments_fact values (1, '20100215', 70)
insert into payments_fact values (1, '20100318', 80)
insert into payments_fact values (1, '20100331', 90)
insert into payments_fact values (2, '20100112', 50)
insert into payments_fact values (2, '20100215', 60)
insert into payments_fact values (2, '20100320', 70)
insert into payments_fact values (3, '20100101', 50)
insert into payments_fact values (3, '20100118', 60)
insert into payments_fact values (3, '20100318', 70)
go
-- CREATE VIEW must be the only statement in its batch, hence the GO above
create view dim_AccountPayments
as
select acc.account_id, acc.statement_date,
sum(case when pay.payment_date IS NULL THEN 0
else 1
end) as payment_count
from accounts_fact acc
left outer join payments_fact pay on acc.account_id = pay.account_id
-- count payments falling in the month that ends on this statement date
and pay.payment_date >= dateadd(mm, -1, dateadd(dd, 1, acc.statement_date))
and pay.payment_date <= acc.statement_date
group by acc.account_id, acc.statement_date
go
select * from dim_AccountPayments
This produces the following results:
account_id  statement_date           payment_count
1           2010-01-31 00:00:00.000  2
1           2010-02-28 00:00:00.000  1
1           2010-03-31 00:00:00.000  2
2           2010-01-31 00:00:00.000  1
2           2010-02-28 00:00:00.000  1
2           2010-03-31 00:00:00.000  1
3           2010-01-31 00:00:00.000  2
3           2010-02-28 00:00:00.000  0
3           2010-03-31 00:00:00.000  1
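Before wiring this into the cube, the original question ('balance of all accounts that have at least one payment') can be sanity-checked directly in SQL - a sketch joining the view back to the fact table:
-- total balance per statement date over accounts with >= 1 payment that month
select acc.statement_date, sum(acc.bal) as total_balance
from accounts_fact acc
inner join dim_AccountPayments dap
on dap.account_id = acc.account_id
and dap.statement_date = acc.statement_date
where dap.payment_count > 0
group by acc.statement_date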
It should now be a doddle to make a payment-count dimension in your accounts cube. For extra points, remove the GROUP BY and SUM from the view to let SSAS do the aggregation; it suited me to show the results table above. Use the view's SQL in your data source view if you don't have CREATE VIEW permission in the source database.
Option 2 would be to make the payment count from the view above a measure in the accounts cube. You can do this similarly to the above solution, except make your accounts fact use a view similar to dim_AccountPayments. This time you must group by all key fields and aggregate the measures in the database... very ugly. I don't recommend it, but it is possible.
If you go with option 1, it's simple enough to make a named set in the account payments dimension called 'Made a payment this month', which is the children of the All member filtered to remove 0.
I hope I understood your question. I did have to make quite a few assumptions about your data structures but I hope it's useful.
Good luck.
