I have an existing table => zoo [design]. I added a new column to it:
zooName nvarchar(50) allow null
Then I updated the server, but when I go into zoo [data] and push refresh, I can't see the zooName column there.
But if I make a new query
select * from zoo
it displays the zooName for me, but I can't edit it in zoo [data].
If I close the project and open it again, then it's OK.
I don't want to close and open the project every time I need to add a new row.
Refresh your database node in Object Explorer if you are using MSSQL.
Where can I find the list of breaking changes from version 2 and/or 3 to version 4 of IdentityServer4?
I am trying to upgrade a project from IdentityServer4 version 2 to version 4.
I made a migration from version 2.2 to 4.1.2 two years ago and I remember finding some breaking changes. Others were imposed by the framework upgrade I made at the same time (.NET Core 2.1 to 3.1). These are some of the changes related only to IdentityServer4:
Database schema. If you have data in production you'll need to preserve that data in the migration. Automatic migrations will delete and recreate tables mercilessly. There are table and column renames, new columns, deleted columns, changes in the indexes...
Client CORS origins validation. There is a new validation that forces all URLs configured in ClientCorsOrigins to comply with an origin format. If even one of them does not comply with the format, an exception is thrown. You should review your production values to avoid failures (a check-query sketch follows the format examples below).
// format:
<protocol>://<domain>:<port>
// good examples:
http://localhost:5000
http://example.com
https://anotherdomain.com
http://example.com:1234
// bad examples
http://example.com/
https://example.com/mypath
example.com:1234
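Since reviewing production values by hand is error-prone, here is a minimal check-query sketch, assuming the default IdentityServer4 EF table ClientCorsOrigins with an Origin column (adjust names to your schema):
-- Flag origins with no scheme, or with anything after host:port (a path or a trailing slash)
Select [Id], [ClientId], [Origin]
From [ClientCorsOrigins]
Where CharIndex('://', [Origin]) = 0
   Or CharIndex('/', [Origin], CharIndex('://', [Origin]) + 3) > 0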
Some code changes:
AuthorizationRequest.ClientId -> AuthorizationRequest.Client.ClientId.
ResourceValidationResult groups the ApiResources and IdentityResources properties into a common property called Resources.
ValidatedTokenRequest renames its property Scopes to RequestedScopes.
GetAllUserConsentsAsync -> GetAllUserGrantsAsync.
In your UI some view models will need to be updated to the new schema. If you started with the QuickStart.UI you can compare it with the new version to add the new features.
If you have an admin application you'll have to adapt it to the new schema as well.
Migrations
I created the migrations automatically and then edited them to reorder the steps and to add manual scripts that preserve the data (for example, creating a new table before deleting the old one and moving the data across).
These are the scripts I had to insert manually for the Up migration.
Reorder the code to create the ApiResourceScopes table before deleting the ApiResourceId column from the ApiScopes table, then copy the data across:
Insert Into [ApiResourceScopes] ([ApiResourceId], [Scope]) Select [ApiResourceId], [Name] From [ApiScopes]
As ApiScopes has a new field called Enabled, which takes 0 by default, you'll want to enable all of them. Run this script just after creating the Enabled column:
Update [ApiScopes] set [Enabled] = 1
ApiSecrets must be moved to the new table ApiResourceSecrets, so you should run this script before deleting ApiSecrets:
Insert Into [ApiResourceSecrets] ([Description], [Value], [Expiration], [Type], [ApiResourceId], [Created]) Select [Description], [Value], [Expiration], [Type], [ApiResourceId], GetDate() From [ApiSecrets]
The table IdentityClaims is renamed to IdentityResourceClaims, so you'll need to run this script after creating IdentityResourceClaims and before deleting IdentityClaims:
Insert Into [IdentityResourceClaims] ([Type], [IdentityResourceId]) Select [Type], [IdentityResourceId] From [IdentityClaims]
For the Down migration you need to do exactly the reverse:
Restore ApiScopes. Move the data back from ApiResourceScopes.ApiResourceId, joining on the Scope and Name fields respectively:
Update apis Set apis.[ApiResourceId] = apir.[ApiResourceId] From [ApiScopes] apis Inner Join [ApiResourceScopes] apir On apis.[Name] = apir.[Scope]
Restore ApiSecrets. Move the data after creating the ApiSecrets table and before deleting ApiResourceSecrets:
Insert Into [ApiSecrets] ([Description], [Value], [Expiration], [Type], [ApiResourceId]) Select [Description], [Value], [Expiration], [Type], [ApiResourceId] From [ApiResourceSecrets]
Restore IdentityClaims. Move the data after creating IdentityClaims and before deleting IdentityResourceClaims:
Insert Into [IdentityClaims] ([Type], [IdentityResourceId]) Select [Type], [IdentityResourceId] From [IdentityResourceClaims]
I am using SQL Server and I created a table:
create table data
(
description nvarchar(100),
url varchar(500)
)
Then I insert data:
INSERT INTO data (description, url)
VALUES ('google', 'https://www.google.com'),
('yahoo', 'https://www.yahoo.com'),
('baidu', 'https://www.baidu.com')
I want to get a clickable url when executing:
select url
from data
where description = 'google'
In other words, when I click the returned URL, a browser should open and go to the Google site.
I found that there is a method, but it requires the user to click twice.
What should I do?
The one that worked for me is returning the URL as the xml data type, so that it is clickable: clicking it opens a new query window, and there, clicking the URL with CTRL+K makes it open in the SQL Server window. For a URL the most appropriate data type would be varchar with a suitable length, but unfortunately plain varchar is not clickable within SSMS.
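A minimal sketch of that idea, using the table from this question (note the cast will fail for URLs containing characters that are special in XML, such as &, unless they are escaped):
-- Cast the varchar URL to xml so SSMS renders it as a clickable link in the results grid
select cast(url as xml) as url
from data
where description = 'google'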
You can check the following link; it covers clicking URLs in the query editor and describes a few more alternatives:
https://social.msdn.microsoft.com/Forums/sqlserver/en-US/8c78549d-0aa2-4fed-acab-51ad11cb59a4/is-url-clickable-somewhere-in-ssms?forum=transactsql
I have database records that store page URLs, but we have a problem in deployment: the developer used localhost when storing the page field in the database, and because of that we end up with
http://localhost:8080/wow/page1.aspx
http://localhost:8080/wow/page2.aspx
http://localhost:8080/wow/page3.aspx
Just wondering, can I write T-SQL to loop through the records to remove the localhost part from the URL and update it with the new URL?
You don't need a loop; a simple UPDATE will do the trick. E.g.
UPDATE WEB_TABLE SET URL_FIELD = REPLACE(URL_FIELD, 'localhost:8080', 'www.mysite.com')
You can execute a SQL UPDATE query using the REPLACE function.
UPDATE [your table] SET [your field] = REPLACE([your field],'http://localhost:8080','')
That will update them all in one go.
Or, if you really want to loop, you could use a CURSOR (a sketch follows).
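For completeness, a minimal cursor sketch, reusing the hypothetical WEB_TABLE and URL_FIELD names from the answer above (the single UPDATE remains the better choice):
-- Row-by-row alternative to the set-based UPDATE
Declare @url varchar(500)
Declare url_cursor Cursor For
    Select URL_FIELD From WEB_TABLE Where URL_FIELD Like 'http://localhost:8080%'
    For Update Of URL_FIELD
Open url_cursor
Fetch Next From url_cursor Into @url
While @@FETCH_STATUS = 0
Begin
    Update WEB_TABLE
    Set URL_FIELD = Replace(@url, 'localhost:8080', 'www.mysite.com')
    Where Current Of url_cursor
    Fetch Next From url_cursor Into @url
End
Close url_cursor
Deallocate url_cursor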
I am trying to use Dapper to support my data access in my server app.
Alongside my server app, another application drops records into my database at a rate of 400 per minute.
My app pulls them out in batches, processes them, and then deletes them from the database.
Since data continues to flow into the database while I am processing, I don't have a good way to say delete from myTable where allProcessed = true.
However, I do know the PK value of the rows to delete, so I want to do a delete from myTable where Id in @listToDelete.
The problem is that if my server goes down for even 6 minutes, then I have over 2100 rows to delete.
Since Dapper takes my @listToDelete and turns each entry into a parameter, my call to delete fails. (Causing my data purging to get even further behind.)
What is the best way to deal with this in Dapper?
NOTES:
I have looked at Table-Valued Parameters, but from what I can see they are not very performant. This piece of my architecture is the bottleneck of my system and I need it to be very, very fast (see the sketch below for what the TVP approach looks like).
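For reference, a minimal sketch of the table-valued parameter approach mentioned above; the type name dbo.IdList is hypothetical and not from the original post:
-- One-time setup: a table type that carries a list of IDs
Create Type dbo.IdList As Table (Id bigint Primary Key)
GO
-- Usage: the client passes the whole list as a single structured parameter
Declare @ids dbo.IdList
Insert Into @ids (Id) Values (1), (2), (3)
Delete t
From myTable t
Inner Join @ids i On i.Id = t.Id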
One option is to create a temp table on the server and then use the bulk load facility to upload all the IDs into that table at once. Then use a join, EXISTS or IN clause to delete only the records that you uploaded into your temp table.
Bulk loads are a well-optimized path in SQL Server and it should be very fast.
For example:
Execute the statement CREATE TABLE #RowsToDelete(ID INT PRIMARY KEY)
Use a bulk load to insert keys into #RowsToDelete
Execute DELETE FROM myTable where Id IN (SELECT ID FROM #RowsToDelete)
Execute DROP TABLE #RowsToDelete (the table will also be dropped automatically when you close the session)
(Assuming Dapper) code example:
conn.Open();
var columnName = "ID";
// Create the temp table (#IDs) that will hold the keys to delete
conn.Execute(string.Format("CREATE TABLE #{0}s({0} INT PRIMARY KEY)", columnName));
// Bulk-load all the keys into the temp table in a single round trip
using (var bulkCopy = new SqlBulkCopy(conn))
{
    bulkCopy.BatchSize = ids.Count;
    bulkCopy.DestinationTableName = string.Format("#{0}s", columnName);
    var table = new DataTable();
    table.Columns.Add(columnName, typeof(int));
    bulkCopy.ColumnMappings.Add(columnName, columnName);
    foreach (var id in ids)
    {
        table.Rows.Add(id);
    }
    bulkCopy.WriteToServer(table);
}
// or do other things with your table instead of deleting here
conn.Execute(string.Format(@"DELETE FROM myTable where Id IN
    (SELECT {0} FROM #{0}s)", columnName));
conn.Execute(string.Format("DROP TABLE #{0}s", columnName));
To get this code working, I went over to the dark side.
Since Dapper turns my list into parameters, and SQL Server can't handle a lot of parameters (I have never needed even double-digit parameter counts before), I had to go with dynamic SQL.
So here was my solution:
string listOfIdsJoined = "("+String.Join(",", listOfIds.ToArray())+")";
connection.Execute("delete from myTable where Id in " + listOfIdsJoined);
Before everyone grabs their torches and pitchforks, let me explain.
This code runs on a server whose only input is a data feed from a Mainframe system.
The list I am dynamically creating is a list of longs/bigints.
The longs/bigints are from an Identity column.
I know constructing dynamic SQL is bad juju, but in this case, I just can't see how it leads to a security risk.
Dapper accepts a list of objects whose properties match the query parameters, so in the above case a list of objects with an Id property will work (Dapper then executes the statement once per element in the list):
connection.Execute("delete from myTable where Id in (@Id)", listOfIds.AsEnumerable().Select(i => new { Id = i }).ToList());
This will work.
I have a SQL Server CE database file and an ADO.NET Entity Framework object named History. I perform the following to get the latest ID:
var historyid = (from id in db.Histories // Get the current highest history Id
select (Int32?)(id.Id)).Max();
HistoryID = Convert.ToInt32(historyid);
if (HistoryID == 0) // No current record exists
{
HistoryID = 1; // Set the starter history Id
}
(Not the best way, but it is just to test).
This works well and returns the correct value (if I manually enter an ID of 1, it returns 1; same with 2, etc.).
The issue is with the following code:
History history = new History();
history.Id = 1;
history.SoftwareInstalled = "test";
db.AddToHistories(history);
db.SaveChanges();
The above works and runs through fine, but the changes are not saved to the database! Why is this?
Here is my db variable: dbDeSluggerEntities db = new dbDeSluggerEntities();
Thanks.
Edit: Here is an update: when I run the program and debug it, it shows that a history record has been created and inserted, and it also gives an error that the primary key already exists. This happens each time I run the application. However, when I go to Server Explorer, right-click the database and select 'Show Table Data', it shows no data. When the program is run again, there is no primary key error and no history records show up in the debugger!
I was having exactly the same problem. By default the database file is copied into your bin folder when you run the app, and that copy is the one that gets used.
My solution was to go into the properties of the database file, set Copy to Output Directory to 'Do not copy', and put the full file path to the database in the connection string in the app's config file (a sketch follows).
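A minimal app.config sketch of the idea; the model name, provider version, and file path are placeholders, not from the original post:
<connectionStrings>
  <!-- Data Source points at the project's .sdf directly, instead of |DataDirectory| (the bin folder) -->
  <add name="dbDeSluggerEntities"
       connectionString="metadata=res://*/Model.csdl|res://*/Model.ssdl|res://*/Model.msl;provider=System.Data.SqlServerCe.3.5;provider connection string=&quot;Data Source=C:\Full\Path\To\MyDatabase.sdf&quot;"
       providerName="System.Data.EntityClient" />
</connectionStrings>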