Convert string date to datetime on SQL Server

I have something like this:
I need to concatenate the DATA and ORA fields and convert them into one value, because I'll insert them into another table that has just one field.
My problem is converting them, because I haven't found any format that works.
Also, the customer uses Italian month names, as in the photo: "Apr" is "Aprile" (April).
Does someone have a possible solution?
Unfortunately, I can't modify the format of the two fields.
EDIT: the table fields are VARCHAR(MAX). The point is that I need to insert into another table whose date field is in datetime format, and the year is supposed to always be the current one.
EDIT 2: I create and drop this small table every time, and the data is brought in by a bulk insert from a .csv.
EDIT 3: I'm sorry, it's my first question =)... BTW, the output should be like this table, with DATA in datetime format.
EDIT 4: DDL:
create table notaiTESTCSV(
NUMERO_FINANZIAMENTO varchar(MAX),
DATA varchar(MAX),
ORA varchar(MAX)
)
EDIT 5: this is how I take the data from the csv:
bulk insert notaiTESTCSV from 'path\SPEDIZIONE NOTAI.csv' with
(firstrow = 2, fieldterminator = ';', rowterminator = '\n')
PS: sorry for my bad English; it's not my first language.
Thank you in advance!

SQL Server is remarkably robust in the ways it can manage datetime data. This gets ugly by the end, so I tried to break it down some to show what it's doing in steps.
Here's what each piece does by itself:
DECLARE @data varchar(100) = '19-apr',
        @ora varchar(100) = '9,00',
        @dt datetime,
        @tm datetime;
--The date component: append the current year, then cast
SET @data = CONCAT(@data,'-',CAST(YEAR(GETDATE()) AS VARCHAR(4)));
SET @dt = CAST(@data as DATETIME);
--The time component: '9,00' becomes '9:00:00'
SET @ora = CONCAT(REPLACE(@ora,',',':'),':00');
SET @tm = CAST(@ora as DATETIME);
Then a little help from our friends, showing that the math works: adding two datetime values sums their offsets from the base date (1900-01-01), so a date at midnight plus a time-only value yields the combined datetime.
How to combine date from one field with time from another field - MS SQL Server
SELECT @dt + @tm AS [MathWorks];
Results:
+-------------------------+
| MathWorks               |
+-------------------------+
| 2018-04-19 09:00:00.000 |
+-------------------------+
Bringing it all into one statement:
DECLARE @data varchar(100) = '19-apr',
        @ora varchar(100) = '9,00';
SELECT CAST(CONCAT(@data,'-',CAST(YEAR(GETDATE()) AS VARCHAR(4))) as DATETIME)
       +
       CAST(CONCAT(REPLACE(@ora,',',':'),':00') as DATETIME) AS [CombinedDateTime]
Results:
+-------------------------+
| CombinedDateTime        |
+-------------------------+
| 2018-04-19 09:00:00.000 |
+-------------------------+
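One caveat from the question: Italian month abbreviations that differ from English (e.g. "mag" for maggio, "gen" for gennaio) will fail the CAST under English language settings, even though "apr" happens to match in both. A hedged sketch, since I can't verify the customer's exact abbreviations: SET LANGUAGE switches the session's month-name parsing.
SET LANGUAGE Italian;  -- month names and abbreviations now parse as Italian

DECLARE @data varchar(100) = '19-mag';  -- hypothetical sample: 'mag' = maggio (May)
SET @data = CONCAT(@data,'-',CAST(YEAR(GETDATE()) AS VARCHAR(4)));
SELECT CAST(@data AS DATETIME) AS [ItalianDate];

SET LANGUAGE us_english;  -- restore the session default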

Related

Order by results are out-of-range when converting varchar to datetime with missing time

I have a varchar column in a log DB that normally looks like 17.09.2020 00:00:01, and I can CAST() it to datetime and ORDER BY it without problems, e.g. this simplified query:
SELECT CAST(my_date_and_time AS datetime)
FROM my_table
ORDER BY CAST(my_date_and_time AS datetime) DESC
Now the problem is that tonight I got a log entry at exactly 00:00:00, and the logging script or database cut off the time from the varchar field, so it now contains only 17.09.2020 instead of 17.09.2020 00:00:00.
When I try to cast this one to datetime, it results in an out-of-range error, probably because the time part is missing?
Unfortunately I have no access to the logging script to change it there.
So is there any way to convert this to a usable format within the query?
EDIT: Here's an example of what the my_date_and_time column looks like:
|---------------------|
| my_date_and_time    |
|---------------------|
| 17.09.2020 05:29:53 |
|---------------------|
| 17.09.2020 00:02:11 |
|---------------------|
| 17.09.2020          |
|---------------------|
| 16.09.2020 23:59:38 |
|---------------------|
| 16.09.2020 23:59:18 |
|---------------------|
EDIT2: With further testing I was able to narrow the error down. It is NOT caused by the timestamp with the missing time information, as initially assumed. The problem actually affects records that are not compliant.
These records are missing the leading 0 for the month and have one dot too many after the year,
e.g. 16.9.2020. 14:22:23
I'm on the verge of ripping off the head of whoever programmed this ..
Is there any conceivable way to get the correct values back via a query anyway?
Sorry for the misunderstandings, and thanks to everyone who has contributed to solving the problem so far.
As I mention in the comments, really you should be fixing your data type.
If all your dates are in the format dd.MM.yyyy hh:mm:ss then you can use CONVERT with a style code to convert them regardless of language or date settings:
SELECT TRY_CONVERT(datetime,Your_varchar,103)
FROM dbo.YourTable;
If any return NULL they either have the value NULL, or failed to convert.
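To list just the rows that fail to convert (a small sketch using the same assumed names):
SELECT Your_varchar
FROM dbo.YourTable
WHERE Your_varchar IS NOT NULL
  AND TRY_CONVERT(datetime, Your_varchar, 103) IS NULL;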
But back to fixing your data.
Firstly, let's do this in a new column:
ALTER TABLE dbo.YourTable ADD Actual_date_and_Time datetime NULL
Now we can UPDATE that value by coverting the value:
UPDATE dbo.YourTable
SET Actual_date_and_Time = TRY_CONVERT(datetime,Varchar_Date_and_time,103);
Then you can query an actual datetime column.
You can then review your new data, see if any of it is bad, and amend it:
SELECT Varchar_Date_and_time
FROM dbo.YourTable
WHERE Varchar_Date_and_time IS NOT NULL
  AND Actual_date_and_Time IS NULL;
Then, finally, you can DROP the old column and rename the new one (if you want to):
ALTER TABLE dbo.YourTable DROP COLUMN Varchar_Date_and_time;
EXEC sys.sp_rename N'dbo.YourTable.Actual_date_and_Time',N'Varchar_Date_and_time','COLUMN';
Let's try the CONVERT function (Date and Time Conversions Using SQL Server):
declare @my_date_and_time varchar(19) = '17.09.2020 00:00:01';
select convert(date, @my_date_and_time, 104)
or try setting dateformat to dmy (day/month/year):
set dateformat dmy
declare @my_date_and_time varchar(19) = '17.09.2020 00:00:01';
select cast(@my_date_and_time as date)
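For the non-compliant rows from EDIT2 (e.g. 16.9.2020. 14:22:23), a hedged sketch: style 104 tolerates the single-digit month, so stripping the stray ". " after the year before converting may be enough. Compliant rows are unaffected, since they never contain a dot followed by a space:
declare @bad varchar(25) = '16.9.2020. 14:22:23';  -- sample malformed value from EDIT2
select try_convert(datetime, replace(@bad, '. ', ' '), 104);  -- -> 2020-09-16 14:22:23.000
Any rows that still come back NULL after this are malformed in some other way and need a look by hand.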

SQL Server XML Column Performance

Converting ntext columns that contained XML to the XML data type has resulted in worse performance in SQL Server.
I am currently working on a project where ntext columns have been used to store valid XML. I have successfully migrated these columns to the XML data type. However, according to SQL Profiler, the performance of the XML data type is worse than using ntext or nvarchar(max) to store the XML. Everything I have read implies that this should not be the case.
To verify this, I created two tables with the same indexes etc.:
Table Name Order1
[id] [int] IDENTITY(1,1) NOT NULL,
[uid] [varchar](36) NOT NULL,
[AffiliateId] [varchar](36) NOT NULL,
[Address] [ntext] NOT NULL,
[CustomProperties] [ntext] NOT NULL,
[OrderNumber] [nvarchar](50) NOT NULL,
...
Table Name Order2
[id] [int] IDENTITY(1,1) NOT NULL,
[uid] [varchar](36) NOT NULL,
[AffiliateId] [varchar](36) NOT NULL,
[Address] [xml] NOT NULL,
[CustomProperties] [xml] NOT NULL,
[OrderNumber] [nvarchar](50) NOT NULL,
...
I then copied the data using a select/insert statement and rebuilt the indexes on both tables. I then created a script with the following SQL.
DBCC DROPCLEANBUFFERS
GO
--Part1
Select id, uid, AffiliateId, Address, CustomProperties, OrderNumber from [dbo].[Order1] where uid = 'F96045F8-A2BD-4C02-BECB-6EF22C9E473F'
Select id, uid, AffiliateId, Address, CustomProperties, OrderNumber from [dbo].[Order1] where uid = 'A3B71348-EB68-4600-9550-EC2CF75698F4'
Select id, uid, AffiliateId, Address, CustomProperties, OrderNumber from [dbo].[Order1] where uid = 'CB114D91-F000-4553-8AFE-FC20CF6AD8C0'
Select id, uid, AffiliateId, Address, CustomProperties, OrderNumber from [dbo].[Order1] where uid = '06274E4F-E233-4594-B505-D4BAA3770F0A'
DBCC DROPCLEANBUFFERS
GO
--Part2
Select id, uid, AffiliateId, Address, OrderNumber,
CAST(CustomProperties AS xml).query('CustomProperty/Key[text()="AgreedToTerms"]/../Value/text()') as "TermsAgreed"
from Order1
DBCC DROPCLEANBUFFERS
GO
--Part3
Insert Into Order1 (uid, AffiliateId, Address, CustomProperties, OrderNumber)
Select NewId(), AffiliateId, Address, CustomProperties, OrderNumber + 'X' from [dbo].[Order1] where uid = 'F96045F8-A2BD-4C02-BECB-6EF22C9E473F'
Insert Into Order1 (uid, AffiliateId, Address, CustomProperties, OrderNumber)
Select NewId(), AffiliateId, Address, CustomProperties, OrderNumber + 'X' from [dbo].[Order1] where uid = 'A3B71348-EB68-4600-9550-EC2CF75698F4'
Insert Into Order1 (uid, AffiliateId, Address, CustomProperties, OrderNumber)
Select NewId(), AffiliateId, Address, CustomProperties, OrderNumber + 'X' from [dbo].[Order1] where uid = 'CB114D91-F000-4553-8AFE-FC20CF6AD8C0'
Insert Into Order1 (uid, AffiliateId, Address, CustomProperties, OrderNumber)
Select NewId(), AffiliateId, Address, CustomProperties, OrderNumber + 'X' from [dbo].[Order1] where uid = '06274E4F-E233-4594-B505-D4BAA3770F0A'
DBCC DROPCLEANBUFFERS
GO
-- Part4: this updates a 0.5M-row table.
Update [dbo].[Order1] Set CustomProperties = Cast(CustomProperties as NVARCHAR(MAX)) + CAST('' as NVARCHAR(MAX)), Address = Cast(CustomProperties as NVARCHAR(MAX)) + CAST('' as NVARCHAR(MAX))
The average results from SQL Profiler are as follows:
NTEXT
+-------+-------------+-------------+-------------+-------------+
| Test | CPU | Reads | Writes | Duration |
+-------+-------------+-------------+-------------+-------------+
| Part1 | 281.3333333 | 129.3333333 | 0 | 933 |
| Part2 | 78421.66667 | 5374306 | 10.66666667 | 47493.66667 |
| Part3 | 281.6666667 | 616 | 27.66666667 | 374.6666667 |
| Part4 | 40312.33333 | 15311252.67 | 320662 | 67010 |
| Total | | | | 115811.3333 |
+-------+-------------+-------------+-------------+-------------+
XML
+-------+-------------+-------------+-------------+-------------+
| Test | CPU | Reads | Writes | Duration |
+-------+-------------+-------------+-------------+-------------+
| Part1 | 282 | 58.33333333 | 0 | 949.3333333 |
| Part2 | 21129.66667 | 180143.3333 | 0 | 76048.66667 |
| Part3 | 297 | 370.3333333 | 14.66666667 | 378 |
| Part4 | 112578.3333 | 8908940.667 | 145703.6667 | 114684.3333 |
| Total | | | | 192060.3333 |
+-------+-------------+-------------+-------------+-------------+
Is the test script flawed? Or is there some other optimisation that needs to be carried out for xml data type columns, outside of https://learn.microsoft.com/en-us/previous-versions/sql/sql-server-2005/administrator/ms345115(v=sql.90)
I would expect the XML column type to outperform ntext.
So this might not be an answer, at least not a solution, but it will hopefully help you understand what's going on...
The most expensive part with XML is the initial parsing, in other words: the transformation between the textual representation and the technical storage.
Important to know: native XML is not stored as the text you see, but as a hierarchy table. This needs very heavy processing when you pass textual XML into SQL Server. Rendering this XML for a human reader needs the opposite process. Storing this string in a string column (be aware that NTEXT has been deprecated for ages) is faster than storing it as native XML, but you will lose many advantages.
So to your script:
I assume that you ran the same script but just changed Order1 to Order2. Is this correct?
Part 1 measures a simple SELECT.
In order to offer a readable representation, SQL Server (or rather SSMS) will transform any value to some kind of text. If your tables included INTs, GUIDs or a DATETIME, you would not see the actual bit pattern, would you? SSMS uses quite expensive actions to create something readable for you. The expensive part is the transformation. Strings do not need this, so NTEXT will be faster.
Part 2 measures the .query() method (also in terms of "how to present the result").
Did you use the CAST( AS XML) with Order2 too? With such a need, XML should be faster, because NTEXT has to do the heavy parsing over and over, while XML is stored in a queryable format already... But your XQuery is rather sub-optimal (due to the backward navigation ../Value). Try this:
.query('/CustomProperty[Key[text()="AgreedToTerms"]]/Value/text()')
This will look for a <CustomProperty> where there is a <Key> with the given content, and will read the <Value> below <CustomProperty> without the need for ../
I'd surely expect XML to outperform NTEXT with a CAST here... The very first call against completely new tables and indexes might return biased results...
Part 3 measures inserts
Here I would expect roughly the same performance... Moving a string value into another string column is simple copying, and moving native XML into another XML column is simple copying too.
Part 4 measures updates
This looks rather weird... What are you trying to achieve? The code has to transform your native XMLs to strings and re-parse them to be stored as XML. Doing the same with NTEXT does not need these expensive actions at all...
Some general thoughts
If you get some XML from outside, read it from a file, and need to query it just once, string methods on string types can be faster; but if you want to store XML permanently in order to use and manipulate its values more often, the native XML type will be much better.
In many cases performance measurements do not measure what you think they do...
Try to create your tests in such a way that the presentation of the results is not part of the test (e.g. do an INSERT against a temp table, stop the clock, and only then push the output from the temp table).
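For example, a minimal sketch of that pattern against the Order2 table above, reusing the improved XQuery from earlier (timings will of course vary):
DECLARE @d DATETIME2 = SYSUTCDATETIME();

-- Materialize the result first, so SSMS rendering stays outside the timed window
SELECT id,
       CustomProperties.query('/CustomProperty[Key[text()="AgreedToTerms"]]/Value/text()') AS TermsAgreed
INTO #measured
FROM dbo.Order2;

PRINT DATEDIFF(millisecond, @d, SYSUTCDATETIME());  -- query time only

SELECT * FROM #measured;  -- presentation happens after the clock stopped
DROP TABLE #measured;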
UPDATE Another test for "Part 2"
Try this test script:
USE master;
GO
CREATE DATABASE testShnugo;
GO
USE testShnugo;
GO
CREATE TABLE dbo.WithString(ID INT,SomeXML NTEXT);
CREATE TABLE dbo.WithXML(ID INT,SomeXML XML);
GO
--insert 100,000 rows into both tables
WITH Tally(Nmbr) AS (SELECT TOP 100000 ROW_NUMBER() OVER(ORDER BY (SELECT NULL)) FROM master..spt_values v1 CROSS JOIN master..spt_values v2)
INSERT INTO dbo.WithXML(ID,SomeXML)
SELECT Nmbr,(SELECT Nmbr AS [@nmbr],CONCAT('hallo',Nmbr) AS [SomeTest/@FindMe],CONCAT('SomeTestValue',Nmbr) As [SomeTest] FOR XML PATH('row'),ROOT('root'),TYPE)
FROM Tally
--copy everything to the second table
INSERT INTO dbo.WithString(ID,SomeXML) SELECT ID,CAST(SomeXML AS NVARCHAR(MAX)) FROM dbo.WithXML;
GO
--check the actual content
SELECT * FROM dbo.WithString;
SELECT * FROM dbo.WithXML;
GO
DECLARE @d DATETIME2=SYSUTCDATETIME();
SELECT * FROM dbo.WithString WHERE SomeXML LIKE '%FindMe="hallo333"%'
PRINT 'String-Method LIKE '
PRINT DATEDIFF(millisecond,@d,SYSUTCDATETIME());
SET @d=SYSUTCDATETIME();
SELECT * FROM dbo.WithString WHERE CAST(SomeXML AS xml).exist('/root/row[SomeTest[@FindMe="hallo333"]]')=1
PRINT 'CAST NTEXT to XML and .exist()'
PRINT DATEDIFF(millisecond,@d,SYSUTCDATETIME());
SET @d=SYSUTCDATETIME();
SELECT * FROM dbo.WithXML WHERE CAST(SomeXML AS nvarchar(MAX)) LIKE '%FindMe="hallo333"%'
PRINT 'String-Method LIKE after CAST XML to NVARCHAR(MAX)'
PRINT DATEDIFF(millisecond,@d,SYSUTCDATETIME());
SET @d=SYSUTCDATETIME();
SELECT * FROM dbo.WithXML WHERE SomeXML.exist('/root/row[SomeTest[@FindMe="hallo333"]]')=1
PRINT 'native XML with .exist()'
PRINT DATEDIFF(millisecond,@d,SYSUTCDATETIME());
GO
USE master;
GO
DROP DATABASE testShnugo;
First I create tables and fill them with 100,000 XMLs like this:
<root>
<row nmbr="1">
<SomeTest FindMe="hallo1">SomeTestValue1</SomeTest>
</row>
</root>
My results
String-Method LIKE
836
CAST NTEXT to XML and .exist()
1962
String-Method LIKE after CAST XML to NVARCHAR(MAX)
1079
native XML with .exist()
911
As expected, the fastest approach is a string method against a string type on very tiny strings. But, of course, this will not be as powerful as an elaborate XQuery and will not be able to deal with namespaces, multiple occurrences and so on.
The slowest is the cast of NTEXT to XML with .exist()
A string method against the native XML after a cast to string is not that bad actually, but this depends on the XML's size. This one was very tiny...
And 100,000 non-trivial XQuery calls against 100,000 different XMLs is almost as fast as the pure string approach.
UPDATE 2: larger XMLs
I repeated the test with larger XMLs, just by changing one line in the code above:
SELECT Nmbr,(SELECT TOP 100 Nmbr AS [@nmbr],CONCAT('hallo',x.Nmbr) AS [SomeTest/@FindMe],CONCAT('SomeTestValue',x.Nmbr) As [SomeTest] FROM Tally x FOR XML PATH('row'),ROOT('root'),TYPE)
Now each and every XML consists of 100 <row> elements:
<root>
<row nmbr="1">
<SomeTest FindMe="hallo1">SomeTestValue1</SomeTest>
</row>
<row nmbr="2">
<SomeTest FindMe="hallo2">SomeTestValue2</SomeTest>
</row>
<row nmbr="3">
<SomeTest FindMe="hallo3">SomeTestValue3</SomeTest>
</row>
...more of them
With a search for FindMe="hallo333" this won't return anything, but the time it takes to find that there is nothing to return is enough for us:
String-Method LIKE
71959
CAST NTEXT to XML and .exist()
74773
String-Method LIKE after CAST XML to NVARCHAR(MAX)
104380
native XML with .exist()
16374
The fastest - by far! - is now the native XML. The string approaches get lost due to the size of the strings.
Please let me know your results too.

How to solve An INSERT EXEC statement cannot be nested SQL error?

I have one stored procedure, and I tried to get its result set into a temporary table, but I am getting the error message below.
ERROR:
Msg 8164, Level 16, State 1, Procedure spLocal_MES_UnitOEE, Line 200
An INSERT EXEC statement cannot be nested.
This is my SQL statement.
Declare @tempJanMonth_OEE table(CurrentStatusIcon int, UnitName Varchar(20), ProductionAmount float, AmountEngineeringUnits varchar(20), ActualSpeed float, IdealProductionAmount float, IdealSpeed float,
SpeedEngineeringUnits varchar(50), PerformanceRate float, WasteAmount int, QualityRate int, PerformanceTime varchar(20), RunTime varchar(20), LoadingTime varchar(20), AvailableRate float, PercentOEE float, HighAlarmCount int,
MediumAlarmCount int, LowAlarmCount int, UnitId varchar(200), CategoryID int, Production_Variable varchar(100), SummaryRow int);
Insert into @tempJanMonth_OEE
Exec spLocal_MES_UnitOEE '14;15;16;3;7;9;4;5;24;25;','2017-05-14 07:00:00 AM','2017-05-15 07:00:00 AM',1,NULL,1,NULL
Select * from @tempJanMonth_OEE
Can you please help me solve this issue?
If what you are looking for is the output of the proc in a table, what I normally end up doing is using the code definition of the procedure itself, but rather than performing a SELECT, I insert the data into a table for further processing. (As an example, I was interested in part of the output from sp_spaceused, but wanted to perform additional calculations on the output and leave out some columns; when attempting to just store the output in a temporary table or table variable, the same error message popped up.)
To do this, in SSMS, expand the database, then the Programmability folder, then "Stored Procedures", and find the proc in question. Right-click it and pick "Modify". This scripts the definition of the proc to a new query window, where you can then alter it to your heart's content.
The obvious disadvantage is that if the logic in the original proc changes, your newly created copy has to be updated too, because you have essentially duplicated the code.
If you search the web, you will find suggestions for rather complicated workarounds, like: Errors: "INSERT EXEC statement cannot be nested." and "Cannot use the ROLLBACK statement within an INSERT-EXEC statement." How to solve this?
But the short answer is NO. Perhaps this is a good indication that it is time for a review of the current implementation to see if it can be improved.
What about doing the following:
Exec spLocal_MES_UnitOEE '14;15;16;3;7;9;4;5;24;25;','2017-05-14 07:00:00 AM','2017-05-15 07:00:00 AM',1,NULL,1,NULL
Insert into @tempJanMonth_OEE
select * from dbo.temp
drop table dbo.temp
Where the procedure spLocal_MES_UnitOEE, instead of doing
select *
from #result
does
select *
into dbo.temp
from #result
I do not know very much about the topic, but I think it should work (though it may not be a good solution).
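One caveat with this workaround: dbo.temp is shared between sessions, so two concurrent callers can clobber each other's rows. A hedged sketch of one way to guard against that, shown as a self-contained toy (all names here are hypothetical, not from the original proc):
CREATE TABLE dbo.temp (spid int, ins int, fin int);
GO
CREATE PROCEDURE dbo.TEST2 @VAL int
AS
BEGIN
    DELETE dbo.temp WHERE spid = @@SPID;   -- clear this session's leftovers
    INSERT INTO dbo.temp (spid, ins, fin)
    SELECT @@SPID, @VAL, @VAL + 1;         -- write to the table instead of SELECTing
END
GO
EXEC dbo.TEST2 2;
SELECT ins, fin FROM dbo.temp WHERE spid = @@SPID;  -- read back only our own rows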
As per your requirement, I guess you should put a SELECT of your result in the stored procedure, so it can be inserted into a table while executing. I have an example; hope it solves your issue:
This procedure takes an input, adds 1 to the value, and selects the result as follows:
CREATE PROCEDURE TEST
@VAL INT
AS
BEGIN
DECLARE @NEW INT
SET @NEW=@VAL+1
SELECT @VAL AS ins, @NEW AS fin
END
Now i have created a temp table and inserted the executed result:
declare #temps table (ins int)
insert into #temps exec test '2'
insert into #temps exec test '3'
insert into #temps exec test '4'
insert into #temps exec test '5'
select * from #temps
The result is:
+-----+-----+
| ins | fin |
+-----+-----+
|  2  |  3  |
+-----+-----+
|  3  |  4  |
+-----+-----+
|  4  |  5  |
+-----+-----+
|  5  |  6  |
+-----+-----+

Inserting value to all rows in one column

I am using VB.net 2013 and SQL Server 2012. I have a table tblEmployeeInfo with two columns, EmployeeName and Date:
EmployeeName   Date
-------------------
Jay            Null
Mike           Null
Paul           Null
When I input a date value in Textbox1, like 3/20/2017, it should insert that value into the Date column of all rows:
EmployeeName   Date
------------------------
Jay            3/20/2017
Mike           3/20/2017
Paul           3/20/2017
Can anyone please help me out? I still have no idea how to code this in VB.net. My idea was just to update using a WHERE clause, but how do I set all rows at once? Thank you guys.
A simple approach:
DECLARE @NewDate DATETIME=GETDATE();
UPDATE tblEmployeeInfo SET [Date]=@NewDate;
This will set all rows to the same value. Is that really what you are trying to achieve?
If you want to hit only the rows where this value is NULL, you can add a predicate:
UPDATE tblEmployeeInfo SET [Date]=@NewDate WHERE [Date] IS NULL;
Quite probably you have some kind of grouping field where you want to set the new value selectively...? In that case, just add an appropriate WHERE clause.
Looks like you need an update statement:
update tblEmployeeInfo set Date = @myDate
And add @myDate as a parameter when calling the query.
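As plain T-SQL, a hedged sketch of that parameterized call (in VB.net the same statement would get its value through a SqlCommand parameter instead of a literal):
EXEC sp_executesql
    N'UPDATE tblEmployeeInfo SET [Date] = @myDate;',
    N'@myDate datetime',
    @myDate = '2017-03-20';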

Create report using dynamic SQL

I have a table example as such:
State   Project   Build Type    ACTUAL0    ACTUAL1     ACTUAL2
-----   -------   -----------   --------   ---------   -------
Ohio    154214    Residential   1/5/2013   2/25/2013   7/12/12
Utah    214356    Commercial    7/08/13    6/9/13      7/1/12
I am trying to create a report that takes the column headers beginning with the word ACTUAL and gets a count of how many dates are less than a specific date. I have a temp table that I fill with the column headers beginning with the word ACTUAL. This is just an example; there are over 250 columns named ACTUAL-something. So the table looks like this:
MilestoneNmbr
-------------
ACTUAL1
ACTUAL2
ACTUAL3
Now, what I think would work is to take each row as a variable for the column header and pass a date into a function. Here is the function I created:
CREATE FUNCTION [dbo].[GetMSActualCount]
(
@ACTUAL nvarchar(16),
@DATE nvarchar(16)
)
RETURNS int
AS
BEGIN
DECLARE @ACTUALRETURN int
DECLARE @SQL nVarchar(255) =
'SELECT COUNT(' + @ACTUAL + ') AS Expr1
FROM [CASPR_MILESTONES_000-036]
WHERE '+ @ACTUAL +' > ' + @DATE
exec sp_executesql @SQL, N''
SET @ACTUALRETURN = @SQL
-- Return the result of the function
RETURN @ACTUALRETURN
END
If I run the following query:
DECLARE @DATE varchar(20)
SET @DATE = '''1/1/2013'''
SELECT MilestoneNmbr, dbo.GetMSActualCount(MilestoneNmbr, @Date) from #List_CASPR_Milestones
So my error is that I can't use dynamic SQL in a function. That being so, what can I do? This easy query will, I think, otherwise turn into hundreds of lines. Is there another easy way to do this?
EDIT:
The result I am looking for is something like this:
MilestoneNmbr   CountofDate
-------------   -----------
ACTUAL1         200
ACTUAL2         344
ACTUAL3         400
You are right, you can't use dynamic SQL in a function. There are two answers:
First, your table with 250 columns named ACTUAL plus a number is a nightmare. You can't use any of the built-in stuff that SQL does well to help you. You should have two tables: a Projects table that has an ID column plus columns for State, Project, and BuildType, and then a table of ProjectDates with a ProjectID column that references the first table and a column for ActualDate. Reporting from that should be easy.
Given that you probably can't fix the structure, try writing a stored procedure; that can use dynamic SQL. Even better, your stored procedure can create temp tables like the above and then use them to do statistics.
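A hedged sketch of such a procedure, stitching one dynamic UNION ALL statement together from the milestone list (table and column names are taken from the question; it assumes #List_CASPR_Milestones already exists in the calling session, that the ACTUAL columns hold date-comparable values, and uses < per the stated "less than a specific date" requirement):
CREATE PROCEDURE dbo.GetMSActualCounts
    @Date date
AS
BEGIN
    DECLARE @SQL nvarchar(max);

    -- One SELECT per ACTUAL column, combined with UNION ALL
    SELECT @SQL = STUFF((
        SELECT ' UNION ALL SELECT ''' + MilestoneNmbr + ''' AS MilestoneNmbr,'
             + ' COUNT(' + QUOTENAME(MilestoneNmbr) + ') AS CountofDate'
             + ' FROM [CASPR_MILESTONES_000-036]'
             + ' WHERE ' + QUOTENAME(MilestoneNmbr) + ' < @Date'
        FROM #List_CASPR_Milestones
        FOR XML PATH(''), TYPE).value('.', 'nvarchar(max)'), 1, 11, '');

    EXEC sp_executesql @SQL, N'@Date date', @Date = @Date;
END
One dynamic round trip replaces 250 function calls, and the output matches the MilestoneNmbr/CountofDate shape shown in the question.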
I agree 100% with Charles. If you CAN change the structure, this is what I would do:
If possible, have a build-type table (ID / Build_Type); don't have text columns unless you need them as text for something. Anything that can be coded, code it.
The two tables:
Project_header (Proj_ID (bigint) / State (int or char(2)) / Build_Type (int)), with the primary key either Proj_ID by itself or a new ID if it's not unique (Proj_ID & State together would not be too useful as a PK).
Project_date (Proj_ID (same as the PK above) / Date_ID (int) / Actual_Date (datetime))
So your second example would be:
Project_Header:
214356 / UT / 2 (being 1 Residential, 2 Commercial, 3 Industrial ...)
Project_Date:
214356 / 0 / '07/08/13'
214356 / 1 / '06/09/13'
214356 / 2 / '07/01/12'
The latest build date by project would be:
Select TOP 1 Actual_date
from Project_date
where Proj_ID = 'nnn'
order by Date_ID DESC;
Your query would be something like (if the dates are in incremental order):
Select Proj_ID, max(Date_ID)
From Project_date
Where Actual_date < @date
Group by Proj_ID
You can see it's pretty straightforward.
If you CAN'T change the structures but you CAN make new tables, I would make an SP that takes that ugly table and generates Project_date x times per day (or you could even tie it to a trigger on insert/update of the first table) and Project_header once per day (or more often if needed). This would take considerably less time and effort than what you are attempting, plus you could use it for other queries.
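A hedged sketch of that fill step using UNPIVOT (names assumed from the question's example; UNPIVOT requires all the ACTUAL columns to share one type, and it conveniently skips NULLs):
-- Rebuild Project_date from the wide table (only three ACTUAL columns shown)
INSERT INTO Project_date (Proj_ID, Date_ID, Actual_Date)
SELECT Project,
       CAST(REPLACE(MilestoneNmbr, 'ACTUAL', '') AS int),  -- ACTUAL0 -> 0, ACTUAL1 -> 1, ...
       Actual_Date
FROM [CASPR_MILESTONES_000-036]
UNPIVOT (Actual_Date FOR MilestoneNmbr IN (ACTUAL0, ACTUAL1, ACTUAL2)) AS u;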
To solve this, I created a table housing the ACTUAL dates. I then looped through each row in the List_ACTUAL table to get the names, and selected the count of dates greater than the variable I pass in. I will be converting this to a PROC. This is how:
DECLARE @MS nvarchar(16)
DECLARE MSLIST CURSOR LOCAL FOR SELECT MilestoneNmbr FROM List_ACTUAL
DECLARE @SQL nvarchar(max)
DECLARE @DATE nvarchar(16)
SET @DATE = '1/1/2013'
CREATE TABLE #TEMP (Milestones nvarchar(16), Frozen int)
OPEN MSLIST
FETCH NEXT FROM MSLIST INTO @MS
WHILE @@FETCH_STATUS=0
BEGIN
SELECT @SQL = 'INSERT INTO #TEMP VALUES (''' +@MS+ ''', (SELECT COUNT(' +@MS+ ') FROM PROJECTDATA WHERE ' +@MS+ ' > ''' + @DATE + '''))'
EXEC sp_executesql @SQL, N''
FETCH NEXT FROM MSLIST INTO @MS
END
CLOSE MSLIST
DEALLOCATE MSLIST
SELECT Milestones, Frozen FROM #TEMP
Hope this helps someone.
