Recently I've been using ASP.NET Boilerplate for my project. I found that an audit log is created automatically after the API function ends. However, the audit value inserted into the database is truncated, with the end replaced by three dots.
I've gone through a lot of forums but haven't found a solution.
Is there any setting we can change to stop the data from being truncated to three dots?
Example below:
{"input":{"id":12345,"modificationDT":"2022-08-15T10:00:00.000000+08:00","isChangesMade":false,"details":{"status":13,"emailSentDT":"2021-05-0T00:00:00","emailSentId":12,"msgSentDT":"ASAP","remark":null,"itemsRepresentativeId":15,"itemsRepresentativeName":"xxxxxx"},"itemsChosen":[{"id":527,"itemDescription":"xxxxx xxxxx xxxxx xxxxxxx ","isItemInUse":true,"itemPrice":13.2,"itemQuantity":25,"remarks":null}],"return":[{"itemId":595,"itemCategory":"11","itemReturnRemarks":"","isItemBroken":false,"returningPolicyAcceptanceStatus":true,"isCompensationNeeded":false,"compensationMethod":0,"adminRemarksOnItem":"","CompensationAmount":0.0,"userUpdate":true,"itemImage":[{"compensationId":900,"itemImageURL":"https://ksuHVUJH-jnsadkna.KBidbwJBK!#.OLjba7s87/HBBDA/hbjdas-!#!!##!j-jb3123-31231knc^&kn/jkkdwqjbdkjq(-60.jpg","imageId":109231}],"imageCreationDT":"2021-01-01T15:29:23.728136","itemModifiedDT":"2021-05-05T10:10:10.120912+08:00","UserAcceptanceId":"JDSAJBD-FKAJBFKB-FKQJFBKBWF-KSJABKFBAS-XXXX","adminId":89182},{"itemId":907,"itemCategory":"21","itemReturnRemarks":"XXXXXX","isItemBroken":true,"returningPolicyAcceptanceStatus":true,"isCompensationNeeded":true,"compensationMethod":3,...
You can increase AuditLogActionConsts.MaxParametersLength in the PreConfigureServices method of your module.
The default values from AuditLogActionConsts.cs are:
public static int MaxServiceNameLength { get; set; } = 256;
public static int MaxMethodNameLength { get; set; } = 128;
public static int MaxParametersLength { get; set; } = 2000;
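For example, a minimal sketch of how that could look in an ABP module (the module class name is illustrative, and the exact namespaces may vary by ABP version):

using Volo.Abp.AuditLogging;
using Volo.Abp.Modularity;

public class MyProjectWebModule : AbpModule
{
    public override void PreConfigureServices(ServiceConfigurationContext context)
    {
        // Allow longer serialized parameters before they are truncated to "...".
        AuditLogActionConsts.MaxParametersLength = 20000;
    }
}

If the audit log table was created with the old 2000-character limit, the column length may also need to grow (e.g. via a migration) for the longer values to fit.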
I've got a relatively basic model - Users and Tags. There is a fixed list of Tags. A User can have multiple Tags and a Tag can be used by multiple users.
I went with the structure below and am finding performance issues when returning results.
public class User
{
public string Id { get; set; }
public virtual List<UserTag> UserTags { get; set; }
}
public class UserTag
{
public string UserId { get; set; }
public User User { get; set; }
public int TagId { get; set; }
public Tag Tag { get; set; }
}
public class Tag
{
[Key]
public int TagId { get; set; }
public string Name { get; set; }
public virtual List<UserTag> UserTags { get; set; }
}
I have the following query, which is taking a long time (several seconds):
var x = db.Users.Include(u => u.UserTags).ThenInclude(ut => ut.Tag).ToList<User>();
I have tried writing it as such, which has improved the time, however it is still taking too long:
db.UserTags.Load();
db.Tags.Load();
var x = db.Users.ToList<User>();
Is there any other way to speed this up? Running the query directly in SSMS is almost instant (e.g.
select * from Users u left outer join UserTags t on t.UserId = u.Id)
In terms of data rows, it is approx. Tags: 100, UserTags: 50,000, Users: 5,000.
First you can check how EF translates your request to SQL Server, for example with SQL Server Profiler.
Then you could use the generated query to check whether a missing index would speed it up.
You can also try writing a join (or a projection) instead of ThenInclude and see how the query behaves; see the sketch below.
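For example, a hedged sketch of the projection idea (property names follow the question's model; this is not from the original answer):

var x = db.Users
    .Select(u => new
    {
        u.Id,
        // Pull only the tag data you actually need for each user.
        Tags = u.UserTags.Select(ut => new { ut.TagId, ut.Tag.Name }).ToList()
    })
    .ToList();

Projecting only the needed columns skips change tracking and lets EF translate everything into a single SQL statement, which is usually much closer to the raw join timed in SSMS.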
Best regards,
Jimmy
I'm using the C# NuGet wrapper around Azure Search. My problem is that I have an index of products:
public class ProductDocument
{
[System.ComponentModel.DataAnnotations.Key]
public string Key { get; set; }
[IsSearchable]
public string Sku { get; set; }
[IsSearchable]
public string Name { get; set; }
[IsSearchable]
public string FullDescription { get; set; }
[IsSearchable]
public List<CustomerSkuDocument> CustomerSkus { get; set; }
}
public class CustomerSkuDocument
{
[IsSearchable]
public int AccountId { get; set; }
[IsSearchable]
public string Sku { get; set; }
}
Example data would be:
new ProductDocument() { Key = "100", Name = "Nail 101", Sku = "CCCCCCCC", CustomerSkus = new List<CustomerSkuDocument>()
{
new CustomerSkuDocument() { AccountId = 222, Sku = "BBBB" },
new CustomerSkuDocument() { AccountId = 333, Sku = "EEEEEEE" }
}};
So the problem is around CustomerSkuDocument.
When I search I need to pass the AccountId as well as the search term; however, the AccountId is only used when searching the CustomerSkus.
Basically, an account can have different customer SKUs, but each is only associated with that account - I don't want a separate index per account.
So my call would be something like /AccountId=222&term=BBBB which would find the match.
However /AccountId=333&term=BBBB would not find a match.
So I'm calling it like:
SearchParameters sp = new SearchParameters();
sp.SearchMode = SearchMode.Any;
sp.QueryType = QueryType.Full;
DocumentSearchResult<ProductDocument> results =
productIndexClient.Documents.Search<ProductDocument>(term, sp);
where term is the normal search term. I tried adding the AccountId to it, but it doesn't work.
Azure Search does not support repeating data structures nested under a property of the outer document. We're working on this (see https://feedback.azure.com/forums/263029-azure-search/suggestions/6670910-modelling-complex-types-in-indexes), but we still have some work to do before we can release that.
Given that, the example you're showing is probably not indexing the nested parts. Can you post the search index definition you're using? While we work on direct support for complex types, you can see your options here: https://learn.microsoft.com/en-us/azure/search/search-howto-complex-data-types
From the above you'll arrive at an index structure that will also guide your query options. If all you need is equality, perhaps you can simply include the AccountId and the SKU in the same field, and use a collection field so you can have multiple instances. For your query you would issue a search query that requires the AccountId and has the rest as optional keywords.
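For example, a hedged sketch of that flattened approach with the SDK from the question (the combined "222 BBBB" token scheme and field shape are illustrative, not part of the original answer):

// Flatten each account/SKU pair into a single entry of a searchable collection.
public class ProductDocument
{
    [System.ComponentModel.DataAnnotations.Key]
    public string Key { get; set; }

    [IsSearchable]
    public string Sku { get; set; }

    [IsSearchable]
    public string Name { get; set; }

    [IsSearchable]
    public string FullDescription { get; set; }

    // One entry per account, e.g. "222 BBBB", "333 EEEEEEE".
    [IsSearchable]
    public string[] CustomerSkus { get; set; }
}

// Query with a phrase so the account id and the SKU have to match together.
SearchParameters sp = new SearchParameters
{
    SearchMode = SearchMode.Any,
    QueryType = QueryType.Full
};
DocumentSearchResult<ProductDocument> results =
    productIndexClient.Documents.Search<ProductDocument>("\"222 BBBB\"", sp);

With this shape, account 222 searching for BBBB matches, while account 333 searching for BBBB does not, because "333 BBBB" never appears as a phrase in any document.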
I have a BlobEntity table that contains paths to files for many other tables (tableX, tableY, tableZ, etc...) in my application.
The relation between all the other tables to BlobEntity table is one to many.
Example:
tableX -> BlobTable (OTM)
tableY -> BlobTable (OTM)
tableZ -> BlobTable (OTM)
and the relation is:
public virtual ICollection<BlobEntity> BlobEntity { get; set; }
I'm not sure if this is an issue, but Entity Framework Code First creates a new FK column in the BlobEntity table for each source table.
In my case, BlobEntity contains three FK columns, one each for tableX, tableY and tableZ.
To be efficient, I'd rather create one column in BlobEntity that contains the FK to the source table.
Is it reasonable?
Please advise...
Thanks.
No, you can't do this even in plain old SQL.
You can't have a foreign key pointing to more than one table; that's why you need three columns.
If you want to do a "trick" like this, you have to manage the relation manually (I mean, no real FK), but you can't map it in EF.
What about this?
public class EntityA
{
    public EntityA()
    {
        // ReSharper disable once VirtualMemberCallInConstructor
        Files = new List<MyFiles>();
    }

    public int Id { get; set; }
    public virtual ICollection<MyFiles> Files { get; set; }
}

public class EntityB
{
    public EntityB()
    {
        // ReSharper disable once VirtualMemberCallInConstructor
        Files = new List<MyFiles>();
    }

    public int Id { get; set; }
    public virtual ICollection<MyFiles> Files { get; set; }
}

public class MyFiles
{
    public int Id { get; set; }

    // One nullable FK per source table; only one is set on any given row.
    public int? EntityAId { get; set; }
    public int? EntityBId { get; set; }

    public virtual EntityA EntityA { get; set; }
    public virtual EntityB EntityB { get; set; }
}
This way you have real FKs in place and you can easily manage multiple entities.
Obviously, if the same file can belong to more than one entity, you can go with an N-to-N relationship instead.
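If you want the relationships to be explicit rather than left to convention, a hedged EF6 Fluent API sketch for the classes above (assumes Code First with DbModelBuilder):

protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    base.OnModelCreating(modelBuilder);

    // Each entity owns many files; the nullable FK lives on the MyFiles (blob) side.
    modelBuilder.Entity<EntityA>()
        .HasMany(e => e.Files)
        .WithOptional(f => f.EntityA)
        .HasForeignKey(f => f.EntityAId);

    modelBuilder.Entity<EntityB>()
        .HasMany(e => e.Files)
        .WithOptional(f => f.EntityB)
        .HasForeignKey(f => f.EntityBId);
}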
I store animals in an SQL Server database table created by Entity Framework Code First. This is the POCO for it:
[Table("Animal")]
public class Animal
{
[Key]
[DatabaseGenerated(DatabaseGeneratedOption.Identity)]
public int ID { get; set; }
public string SomeData { get; set; }
public byte[] OtherData { get; set; }
public int ExternalSourceId { get; set; }
public string ExternalSourceAnimalId { get; set; }
}
Some animals are regularly updated by an external source (there are a few sources, but a specific animal is updated by at most one source). Also, when an external source has a new animal it needs to be inserted. Some animals are maintained locally; these have no external source.
An external source identifies an animal by ExternalSourceAnimalId. These are unique within one source, but two sources might use overlapping IDs, so ExternalSourceId is also there to make the two of them a combined natural key.
So I get a huge list of animals from an external source and I need to insert or update them in the local database depending on whether the specific external key already exists there.
This is my current solution (in a class inheriting from SharpRepository):
public void InsertOrUpdateAnimal(ExternalAnimal exAnimal, int externalSourceId)
{
var animal = DbSet.SingleOrDefault(o => o.ExternalSourceId == externalSourceId && o.ExternalSourceAnimalId == exAnimal.Id);
if (animal != null)
{
CopyData(exAnimal, animal); // copies the properties from exAnimal to animal
this.Update(animal);
}
else
{
animal = new Animal();
CopyData(exAnimal, animal);
this.Add(animal);
}
}
Since this is a bulk operation and takes quite some time, I was wondering if there is a faster solution. For example, is there a way to upsert in a single database operation?
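One hedged way to cut the per-animal round trips is to fetch all existing animals for the source in a single query and branch in memory (a sketch reusing the question's DbSet, CopyData and repository methods; it is not a single-statement upsert):

public void InsertOrUpdateAnimals(IEnumerable<ExternalAnimal> exAnimals, int externalSourceId)
{
    // One query for all existing animals from this source, keyed by the external id.
    var existing = DbSet
        .Where(o => o.ExternalSourceId == externalSourceId)
        .ToDictionary(o => o.ExternalSourceAnimalId);

    foreach (var exAnimal in exAnimals)
    {
        Animal animal;
        if (existing.TryGetValue(exAnimal.Id, out animal))
        {
            CopyData(exAnimal, animal);
            this.Update(animal);
        }
        else
        {
            // ExternalSourceId is set here; drop this if CopyData already handles it.
            animal = new Animal { ExternalSourceId = externalSourceId };
            CopyData(exAnimal, animal);
            this.Add(animal);
        }
    }
}

A true single-statement upsert (SQL Server MERGE) would have to bypass the repository and go through raw SQL or a bulk-operations library.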
I have this excerpt from a POCO class with many fields:
public class Call
{
public int Id { get; set; }
public string Customer { get; set; }
public int StatusId { get; set; }
public int UserAssignedToId { get; set; }
public string UserAssignedToName { get; set; }
}
However, my stored procedure returns different names from the properties above (in this case the "Id" comes first):
IdCall
IdStatus
IdUserAssignedTo
This is the code I am using to execute the stored procedure:
var call = conn.Query<Call>("CallSPName", new { IdCall = callId }, commandType: CommandType.StoredProcedure).First();
How can I specify a mapping to say that "IdStatus" from my stored procedure should map to "StatusId" in my POCO class, "IdCall" to "Id", and so on?
I don't have access to change the stored procedures as they are controlled by DBAs and older legacy systems are using them which would break if the fields got changed in the stored procedure.
Any ideas/thoughts appreciated.
The closest thing that comes to mind is to have private fields mapped to the columns returned by the stored procedure, and public properties with the names you want that get and set those private fields:
// ...
private int IdStatus;
public int StatusId {
get { return IdStatus; }
set { IdStatus = value; }
}
// ...
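Another hedged option is Dapper's custom type map, which keeps the POCO free of extra backing fields (the column-to-property dictionary below is illustrative):

using System;
using System.Collections.Generic;
using Dapper;

// Translate the stored procedure's column names to the POCO property names.
var columnToProperty = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
{
    ["IdCall"] = "Id",
    ["IdStatus"] = "StatusId",
    ["IdUserAssignedTo"] = "UserAssignedToId"
};

SqlMapper.SetTypeMap(typeof(Call), new CustomPropertyTypeMap(
    typeof(Call),
    (type, columnName) => type.GetProperty(
        columnToProperty.TryGetValue(columnName, out var propertyName) ? propertyName : columnName)));

Register the map once at startup; afterwards the existing conn.Query<Call>(...) call picks up the renamed columns automatically.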