Is there any way I can do $each with Update.addToSet() in Spring Data MongoDB?
Currently, when I pass an array object as the addToSet parameter,
it only uses $addToSet without $each, resulting in the passed array being added as a single nested element of the existing array.
I've got a working version as a custom Update class,
mixed with the solution proposed in https://jira.springsource.org/browse/DATAMONGO-471:
public class MyUpdate extends Update {

    // Converts the value with the Mongo converter first, so custom types are serialized correctly.
    public MyUpdate myAddToSet(String key, Object value, MongoOperations ops) {
        BasicDBObject dbObject = new BasicDBObject();
        ops.getConverter().write(value, dbObject);
        this.addToSet(key, dbObject);
        // don't do this, it will cause a serialization exception
        // System.out.println(this.getUpdateObject().toString());
        return this;
    }

    // Wraps the converted values in $each so addToSet appends each element instead of nesting the array.
    public MyUpdate myAddToSetAll(String key, Collection<Object> values, MongoOperations ops) {
        BasicDBList eachList = new BasicDBList();
        for (Object value : values) {
            BasicDBObject dbObject = new BasicDBObject();
            ops.getConverter().write(value, dbObject);
            eachList.add(dbObject);
        }
        this.addToSet(key, BasicDBObjectBuilder.start("$each", eachList).get());
        // don't do this, it will cause a serialization exception
        // System.out.println(this.getUpdateObject().toString());
        return this;
    }

    public MyUpdate myPushAll(String key, Object[] values) {
        // note: $pushAll is deprecated in newer MongoDB versions; $push with $each replaces it
        super.addMultiFieldOperation("$pushAll", key, values);
        return this;
    }

    public MyUpdate myPush(String key, Object[] values) {
        super.addMultiFieldOperation("$push", key, values);
        return this;
    }

    public MyUpdate myPush(String key, Object value) {
        super.addMultiFieldOperation("$push", key, value);
        return this;
    }
}
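For context, a rough usage sketch (the Post entity, tags field, postId, and the injected mongoOperations are assumptions, not part of the original code) showing how the custom update could be applied through MongoOperations:

// Append each converted tag document to the "tags" array instead of nesting the whole list.
MyUpdate update = new MyUpdate().myAddToSetAll("tags", new ArrayList<Object>(tags), mongoOperations);
mongoOperations.updateFirst(
        Query.query(Criteria.where("_id").is(postId)),
        update,
        Post.class);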
I have a DynamoDB table with a primary key (id : integer) and a secondary key (dateTo : String). I've made a class that utilizes DynamoDBMapper:
@DynamoDBTable(tableName="MyItems")
public class MyItemsMapper {

    private int id;
    private String dateTo;
    private String name;

    @DynamoDBHashKey(attributeName="id")
    public void setId(int id) { this.id = id; }
    public int getId() { return id; }

    @DynamoDBAttribute(attributeName="dateTo")
    public void setDateTo(String dateTo) { this.dateTo = dateTo; }
    public String getDateTo() { return dateTo; }

    @DynamoDBAttribute(attributeName="name")
    public void setName(String name) { this.name = name; }
    public String getName() { return name; }

    public boolean saveItem(MyItemsMapper item) {
        try {
            DynamoDBMapper mapper = new DynamoDBMapper(client); // <-- This connects to the DB. This works fine.
            item.setId(generateUniqueNumber()); // <-- This generates a unique integer. Also seems to work fine.
            mapper.save(item);
            logger.info("Successfully saved item. See info below.");
            logger.info(item.toString());
            return true;
        } catch (Exception e) {
            logger.error("Exception while trying to save item: " + e.getMessage());
            e.printStackTrace();
            return false;
        }
    }
}
I then have a manager class that uses the bean above, like so:
public class MyManager {

    public boolean recordItem(int id, String dateTo, String name) {
        MyItemsMapper myItemsMapper = new MyItemsMapper();
        myItemsMapper.setId(id);
        myItemsMapper.setDateTo(dateTo);
        myItemsMapper.setName(name);
        return myItemsMapper.saveItem(myItemsMapper);
    }
}
I am running the manager class in a JUnit test:
public class MyManagerTest {

    @Test
    public void saveNewItemTest() {
        MyManager myManager = new MyManager();
        myManager.recordItem(1234567, "2018-01-01", "Anthony");
    }
}
When I use the saveItem method above via my manager by running my JUnit test, I get the following error:
com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMappingException: MyItemsMapper; no mapping for HASH key
I'm not really sure what it's referring to, as I definitely have a primary key for my table and my secondary key always has a value as well.
How do I get this to work?
More Info:
It's worth noting that I can record data into my DynamoDB table via the Item object. If I do the below, my data gets recorded into the database:
DynamoDB dynamoDB = new DynamoDBClient().connectToDynamoDB(); // <-- Connection. Works fine.
Table table = dynamoDB.getTable("MyItems");

Item item = new Item();
item.withPrimaryKey("id", 1234567);
item.withString("dateTo", "2018-01-01");
item.withString("name", "Anthony");

PutItemOutcome outcome = table.putItem(item);
However, I'm trying to use DynamoDBMapper because I've read that it is a more organized and better way to access data.
I'm not sure if this is causing the problem, but you are creating the myItemsMapper object and then passing a reference to that object to itself.
I would suggest removing your saveItem method. The MyItemsMapper class should be a plain old Java object. Then make MyManager like this:
public class MyManager {

    public boolean recordItem(int id, String dateTo, String name) {
        MyItemsMapper myItemsMapper = new MyItemsMapper();
        myItemsMapper.setId(id);
        myItemsMapper.setDateTo(dateTo);
        myItemsMapper.setName(name);

        DynamoDBMapper mapper = new DynamoDBMapper(client);
        mapper.save(myItemsMapper);
        return true;
    }
}
If you particularly want to keep the saveItem method, make it like this:
public boolean saveItem() {
    try {
        DynamoDBMapper mapper = new DynamoDBMapper(client);
        mapper.save(this);
        logger.info("Successfully saved item. See info below.");
        logger.info(this.toString());
        return true;
    } catch (Exception e) {
        logger.error("Exception while trying to save item: " + e.getMessage());
        e.printStackTrace();
        return false;
    }
}
And then in MyManager do
MyItemsMapper myItemsMapper = new MyItemsMapper();
myItemsMapper.setId(id);
myItemsMapper.setDateTo(dateTo);
myItemsMapper.setName(name);
myItemsMapper.saveItem();
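As an aside, the snippets above reference a client variable without showing how it is built; a minimal sketch using the AWS SDK for Java v1 (the region is a placeholder) might look like this:

import com.amazonaws.regions.Regions;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper;

// Build the low-level client once and reuse it for every DynamoDBMapper.
AmazonDynamoDB client = AmazonDynamoDBClientBuilder.standard()
        .withRegion(Regions.US_EAST_1) // placeholder region
        .build();
DynamoDBMapper mapper = new DynamoDBMapper(client);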
I am trying to convert a custom object that contains a map with dotted keys, using Spring Data MongoDB 1.7.2.
Setting a dot replacement doesn't seem to do the job. Here's my code:
class FakeUser {
Map<String, String> map = new LinkedHashMap<>();
void addValue(String key, String value) {
this.map.put(key, value);
}
}
FakeUser fakeUser = new FakeUser();
fakeUser.addValue("test.dot.for.key", "test.dot.for.value");
this.mappingMongoConverter.setMapKeyDotReplacement(":");
Object convertedObject = this.mappingMongoConverter.convertToMongoType(fakeUser);
System.out.println("convertedObject: " + convertedObject.getClass() + ":" + convertedObject);
And the output:
convertedObject: class com.mongodb.BasicDBObject:{ "map" : { "test.dot.for.key" : "test.dot.for.value"}}
And I also tried:
class FakeUser {
Map<String, String> map = new LinkedHashMap<>();
void addValue(String key, String value) {
this.map.put(key, value);
}
}
FakeUser fakeUser = new FakeUser();
fakeUser.addValue("test.dot.for.key", "test.dot.for.value");
this.mappingMongoConverter.setMapKeyDotReplacement(":");
BasicDBObject dbo = new BasicDBObject();
this.mappingMongoConverter.write(fakeUser, dbo);
System.out.println("dbo: " + ":" + dbo.toMap());
And the output: dbo: :{_class=app.security.MyClass$1FakeUser, map={ "test.dot.for.key" : "test.dot.for.value"}}
I was expecting "test.dot.for.key" to become "test:dot:for:key", so what did I do wrong?
You need to make adjustments to the converter in your SpringMongoDBConfig (which should extend AbstractMongoConfiguration) config file, rather than in the class you're trying to serialize/deserialize. If you're using annotation-driven configuration, you can set up the converter with a custom bean like so:
@Bean
@Override
public MappingMongoConverter mappingMongoConverter() throws Exception
{
    DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory());
    MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mongoMappingContext());
    converter.setCustomConversions(customConversions());
    // mongo won't accept key values with dots (.) in them, so configure it to store them as :
    converter.setMapKeyDotReplacement("\\:");
    return converter;
}
Once the converter is setup, it will handle both serialization and deserialization for you automagically.
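As a rough illustration (assuming the converter bean above is the instance injected as mappingMongoConverter, and using ":" as the replacement as in the question), writing the FakeUser from the question would then produce escaped map keys:

BasicDBObject dbo = new BasicDBObject();
mappingMongoConverter.write(fakeUser, dbo);
// expected (map keys escaped on write): { "_class" : "...FakeUser", "map" : { "test:dot:for:key" : "test.dot.for.value" } }
System.out.println(dbo);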
If you really want to do the conversion in-line (not recommended), you can just use a string replace function to re-write the string like so:
class FakeUser
{
Map<String, String> map = new LinkedHashMap<>();
void addValue(String key, String value) {
key = key.replace(".",":");
this.map.put(key, value);
}
}
Although, if you do it inline you'll likely have issues deserializing when you get it back out of mongo.
I'm using Spring AOP and trying to define a good approach to having all my tables audited without much hassle. Example scenario:
I have a table named Person and its respective table PersonLog, which, for each update, will store the Person values along with the user who made the change, when it happened, and the type of the event.
Simply put, my question is:
I'm trying to come up with a way for my advice class to be smart enough to handle any new table being audited without needing any modification. Say I create a Car table and its CarLog table: if I could avoid changing anything in my advice implementation, it would automatically identify Car as being audited and be able to persist a CarLog entity. I can identify the Car table as being audited pretty easily (by annotation), but I'm struggling to find a way to create and persist a CarLog instance dynamically.
Can anyone think of a way to accomplish that? Thanks.
This is called "change data capture" or CDC.
Personally, I don't think this is a good use for Spring or AOP. I think it would be better done in the database itself, especially if the database is shared/modified by more than one application.
You don't say which database you're using, but I'd recommend digging into your vendor's docs to find out what they have out of the box to support CDC.
I had a similar requirement in a project where I was supposed to take a snapshot of a complex object graph before saving.
The solution I applied:
1) Developed a custom annotation @Archivable with attributes like nullify, ignore, original, and setArchiveFlag.
2) Wrote a Hibernate deep-cloner utility which creates a replica of the object graph and inserts it into the same tables. The deep cloner works on a simple trick: serialize and then deserialize the object, which creates new instances, and then set the id and version to null.
3) Used the cloner utility in an entity interceptor to decide whether to archive or not.
Below is some of that code.
@Retention(RetentionPolicy.RUNTIME)
@Target({ ElementType.TYPE })
public @interface Archivable {

    /** Properties to set to null on the clone. */
    public String[] nullify() default {};

    /**
     * If a property is archivable but should not be archived from this
     * enclosing entity, list it here to ignore it.
     */
    public String[] ignore() default {};

    /**
     * Sets a reference to the original on the clone, for referring back to the
     * source data. Only applicable to the root entity from which archiving started.
     *
     * @return
     */
    public String original() default "";

    /**
     * If true, marks the cloned entity as archived; assumes the flag property is "isArchived".
     * @return
     */
    public boolean setArchiveFlag() default false;
}
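For illustration, a hypothetical entity using the annotation (the Person, comments, attachments, and sourcePerson names are assumptions, not part of the original code):

@Archivable(nullify = { "comments" }, ignore = { "attachments" },
        original = "sourcePerson", setArchiveFlag = true)
public class Person extends BaseEntity {
    private String name;
    private List<String> comments;        // nulled out on the archived clone
    private List<Attachment> attachments; // not followed while archiving
    private Person sourcePerson;          // set on the clone to point back at the original row
    private Boolean isArchived;           // flipped to true because setArchiveFlag = true
    // getters and setters omitted
}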
@Component
public class ClonerUtils {
private static final String IS_ARCHIVED = "isArchived";
@Autowired
private SessionFactory sessionFactory;
public Object copyAndSave(Serializable obj) throws Exception {
List<BaseEntity> entities = new ArrayList<BaseEntity>();
Object clone=this.copy(obj,entities);
this.save(clone, entities);
return clone;
}
public Object copy(Serializable obj,List<BaseEntity> entities) throws Exception{
recursiveInitliaze(obj);
Object clone = SerializationHelper.clone(obj);
prepareHibernateObject(clone, entities);
if(!getOriginal(obj).equals("")){
PropertyUtils.setSimpleProperty(clone, getOriginal(obj), obj);
}
return clone;
}
private void save(Object obj,List<BaseEntity> entities){
for (BaseEntity baseEntity : entities) {
sessionFactory.getCurrentSession().save(baseEntity);
}
}
@SuppressWarnings("unchecked")
public void recursiveInitliaze(Object obj) throws Exception {
if (!isArchivable(obj)) {
return;
}
if(!Hibernate.isInitialized(obj))
Hibernate.initialize(obj);
PropertyDescriptor[] properties = PropertyUtils.getPropertyDescriptors(obj);
for (PropertyDescriptor propertyDescriptor : properties) {
Object origProp = PropertyUtils.getProperty(obj, propertyDescriptor.getName());
if (origProp != null && isArchivable(origProp) && !isIgnore(propertyDescriptor, obj)) {
this.recursiveInitliaze(origProp);
}
if (origProp instanceof Collection && origProp != null) {
for (Object item : (Collection) origProp) {
this.recursiveInitliaze(item);
}
}
}
}
@SuppressWarnings("unchecked")
private void prepareHibernateObject(Object obj, List entities) throws Exception {
if (!isArchivable(obj)) {
return;
}
if (obj instanceof BaseEntity) {
((BaseEntity) obj).setId(null);
((BaseEntity) obj).setVersion(null);
if(hasArchiveFlag(obj)){
PropertyUtils.setSimpleProperty(obj, IS_ARCHIVED, true);
}
entities.add(obj);
}
String[] nullifyList = getNullifyList(obj);
for (String prop : nullifyList) {
PropertyUtils.setProperty(obj, prop, null);
}
PropertyDescriptor[] properties = PropertyUtils.getPropertyDescriptors(obj);
for (PropertyDescriptor propertyDescriptor : properties) {
if (isIgnore(propertyDescriptor, obj)) {
continue;
}
Object origProp = PropertyUtils.getProperty(obj, propertyDescriptor.getName());
if (origProp != null && isArchivable(origProp)) {
this.prepareHibernateObject(origProp, entities);
}
/** This code is for element collection */
if(origProp instanceof PersistentBag){
Collection elemColl=createNewCollection(origProp);
PersistentBag pColl=(PersistentBag) origProp;
elemColl.addAll(pColl.subList(0, pColl.size()));
PropertyUtils.setSimpleProperty(obj, propertyDescriptor.getName(), elemColl);
continue;
}
if (origProp instanceof Collection && origProp != null) {
Collection newCollection = createNewCollection(origProp);
PropertyUtils.setSimpleProperty(obj, propertyDescriptor.getName(), newCollection);
for (Object item : (Collection) origProp) {
this.prepareHibernateObject(item, entities);
}
}
}
}
@SuppressWarnings("unchecked")
private Collection createNewCollection(Object origProp) {
try {
if(List.class.isAssignableFrom(origProp.getClass()))
return new ArrayList((Collection)origProp);
else if(Set.class.isAssignableFrom(origProp.getClass()))
return new HashSet((Collection)origProp);
else{
Collection tempColl=(Collection) BeanUtils.cloneBean(origProp);
tempColl.clear();
return tempColl;
}
} catch (Exception e) {
e.printStackTrace();
}
return new ArrayList();
}
private boolean isIgnore(PropertyDescriptor propertyDescriptor,Object obj){
String propertyName=propertyDescriptor.getName();
String[] ignores=getIgnoreValue(obj);
return ArrayUtils.contains(ignores, propertyName);
}
private String[] getIgnoreValue(Object obj) {
String[] ignore=obj.getClass().getAnnotation(Archivable.class).ignore();
return ignore==null?new String[]{}:ignore;
}
private String[] getNullifyList(Object obj) {
String[] nullify=obj.getClass().getAnnotation(Archivable.class).nullify();
return nullify==null?new String[]{}:nullify;
}
public boolean isArchivable(Object obj) {
return obj.getClass().isAnnotationPresent(Archivable.class);
}
private String getOriginal(Object obj) {
String original=obj.getClass().getAnnotation(Archivable.class).original();
return original==null?"":original;
}
private boolean hasArchiveFlag(Object obj) {
return obj.getClass().getAnnotation(Archivable.class).setArchiveFlag();
}
@SuppressWarnings({ "unchecked", "unused" })
private Collection getElemColl(Object obj, Object origProp) {
Collection elemColl=createNewCollection(origProp);
for (Object object : (Collection)origProp) {
elemColl.add(object);
}
return elemColl;
}
@SuppressWarnings("unused")
private boolean isElementCollection(Object obj, String name) {
try {
Annotation[] annotations=obj.getClass().getDeclaredField(name).getAnnotations();
for (Annotation annotation : annotations) {
if(annotation.annotationType() == ElementCollection.class)
return true;
}
} catch (Exception e) {
e.printStackTrace();
}
return false;
}
}
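For context, a rough sketch of how the utility might be invoked from a service or interceptor before an update is applied (the updatePerson method and Person name are assumptions):

@Autowired
private ClonerUtils clonerUtils;

public void updatePerson(Person person) throws Exception {
    // Take a snapshot of the current state before the real changes are flushed.
    if (clonerUtils.isArchivable(person)) {
        clonerUtils.copyAndSave(person);
    }
    // ... apply and persist the actual changes to person here
}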
Hibernate Envers is what you require for auditing purposes.
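For reference, a minimal sketch of what Envers-based auditing looks like (entity and field names are illustrative): annotating an entity with @Audited makes Hibernate write every change to a corresponding _AUD revision table.

import javax.persistence.Entity;
import javax.persistence.Id;
import org.hibernate.envers.Audited;

@Entity
@Audited // every insert/update/delete of Car is recorded in CAR_AUD plus a shared revision table
public class Car {
    @Id
    private Long id;
    private String model;
    // getters and setters omitted
}

Past versions can then be read back through AuditReaderFactory.get(entityManager).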
I'm new to Entity Framework, and I think there is something that I misunderstand here.
I'm trying to insert a row into a table, and everywhere I find code examples they call the method InsertOnSubmit(), but the problem is that I can't find the method InsertOnSubmit, or SubmitChanges, anywhere.
The error tells me:
System.Data.Objects.ObjectSet does not contain a definition for InsertOnSubmit, ...
What am I doing wrong?
http://msdn.microsoft.com/en-us/library/bb763516.aspx
GMR_DEVEntities CTX;
CTX = new GMR_DEVEntities();
tblConfig Config = new tblConfig { ID = Guid.NewGuid(), Code = "new config code" };
CTX.tblConfigs.InsertOnSubmit(Config); // Error here
Edit:
Using Visual Studio 2010 on FW 4.0
InsertOnSubmit is a LINQ to SQL method and is not part of Entity Framework.
However, since our project was a conversion from LINQ to SQL, we have some extension methods that might help:
public static class ObjectContextExtensions
{
public static void SubmitChanges(this ObjectContext context)
{
context.SaveChanges();
}
public static void InsertOnSubmit<T>(this ObjectQuery<T> table, T entity)
{
table.Context.AddObject(GetEntitySetName(table.Context, entity.GetType()), entity);
}
public static void InsertAllOnSubmit<T>(this ObjectQuery<T> table, IEnumerable<T> entities)
{
var entitySetName = GetEntitySetName(table.Context, typeof(T));
foreach (var entity in entities)
{
table.Context.AddObject(entitySetName, entity);
}
}
public static void DeleteAllOnSubmit<T>(this ObjectQuery<T> table, IEnumerable<T> entities) where T : EntityObject, new()
{
var entitiesList = entities.ToList();
foreach (var entity in entitiesList)
{
if (null == entity.EntityKey)
{
SetEntityKey(table.Context, entity);
}
var toDelete = (T)table.Context.GetObjectByKey(entity.EntityKey);
if (null != toDelete)
{
table.Context.DeleteObject(toDelete);
}
}
}
public static void SetEntityKey<TEntity>(this ObjectContext context, TEntity entity) where TEntity : EntityObject, new()
{
entity.EntityKey = context.CreateEntityKey(GetEntitySetName(context, entity.GetType()), entity);
}
public static string GetEntitySetName(this ObjectContext context, Type entityType)
{
return EntityHelper.GetEntitySetName(entityType, context);
}
}
Where EntityHelper is as per the MyExtensions open source library.
Hello, this works for me:
Entity db = new Entity();
TABLE_NAME table = new TABLE_NAME
{
COLUMN1 = "TEST",
COLUMN2 = "test"
//etc...
};
db.TABLE_NAME.Add(table);
db.SaveChanges();
Finally found what was wrong: my Entity data model was an edmx file and not a dbml file. I do not understand why, but as long as it works. (Need to buy a new book, I guess.) – Hugo Feb 17 at 19:40
I also had the same problem; we can insert by using Add:
GMR_DEVEntities CTX;
CTX = new GMR_DEVEntities();
tblConfig Config = new tblConfig { ID = Guid.NewGuid(), Code = "new config code" };
CTX.tblConfigs.Add(Config);
I am new to the Dapper micro ORM. So far I am able to use it for simple ORM-related stuff, but I am not able to map the database column names to the class properties.
For example, I have the following database table:
Table Name: Person
person_id int
first_name varchar(50)
last_name varchar(50)
and I have a class called Person:
public class Person
{
public int PersonId { get; set; }
public string FirstName { get; set; }
public string LastName { get; set; }
}
Please note that my column names in the table are different from the property name of the class to which I am trying to map the data which I got from the query result.
var sql = #"select top 1 PersonId,FirstName,LastName from Person";
using (var conn = ConnectionFactory.GetConnection())
{
var person = conn.Query<Person>(sql).ToList();
return person;
}
The above code won't work as the column names don't match the object's (Person) properties. In this scenario, is there anything I can do in Dapper to manually map the column names to object properties (e.g. person_id => PersonId)?
Dapper now supports custom column to property mappers. It does so through the ITypeMap interface. A CustomPropertyTypeMap class is provided by Dapper that can do most of this work. For example:
Dapper.SqlMapper.SetTypeMap(
typeof(TModel),
new CustomPropertyTypeMap(
typeof(TModel),
(type, columnName) =>
type.GetProperties().FirstOrDefault(prop =>
prop.GetCustomAttributes(false)
.OfType<ColumnAttribute>()
.Any(attr => attr.Name == columnName))));
And the model:
public class TModel {
[Column(Name="my_property")]
public int MyProperty { get; set; }
}
It's important to note that the implementation of CustomPropertyTypeMap requires that the attribute exist and match one of the column names or the property won't be mapped. The DefaultTypeMap class provides the standard functionality and can be leveraged to change this behavior:
public class FallbackTypeMapper : SqlMapper.ITypeMap
{
private readonly IEnumerable<SqlMapper.ITypeMap> _mappers;
public FallbackTypeMapper(IEnumerable<SqlMapper.ITypeMap> mappers)
{
_mappers = mappers;
}
public SqlMapper.IMemberMap GetMember(string columnName)
{
foreach (var mapper in _mappers)
{
try
{
var result = mapper.GetMember(columnName);
if (result != null)
{
return result;
}
}
catch (NotImplementedException nix)
{
// the CustomPropertyTypeMap only supports a no-args
// constructor and throws a not implemented exception.
// to work around that, catch and ignore.
}
}
return null;
}
// implement other interface methods similarly
// required sometime after version 1.13 of dapper
public ConstructorInfo FindExplicitConstructor()
{
return _mappers
.Select(mapper => mapper.FindExplicitConstructor())
.FirstOrDefault(result => result != null);
}
}
And with that in place, it becomes easy to create a custom type mapper that will automatically use the attributes if they're present but will otherwise fall back to standard behavior:
public class ColumnAttributeTypeMapper<T> : FallbackTypeMapper
{
public ColumnAttributeTypeMapper()
: base(new SqlMapper.ITypeMap[]
{
new CustomPropertyTypeMap(
typeof(T),
(type, columnName) =>
type.GetProperties().FirstOrDefault(prop =>
prop.GetCustomAttributes(false)
.OfType<ColumnAttribute>()
.Any(attr => attr.Name == columnName)
)
),
new DefaultTypeMap(typeof(T))
})
{
}
}
That means we can now easily support types that require mapping via attributes:
Dapper.SqlMapper.SetTypeMap(
typeof(MyModel),
new ColumnAttributeTypeMapper<MyModel>());
Here's a Gist to the full source code.
This works fine:
var sql = #"select top 1 person_id PersonId, first_name FirstName, last_name LastName from Person";
using (var conn = ConnectionFactory.GetConnection())
{
var person = conn.Query<Person>(sql).ToList();
return person;
}
Dapper has no facility that allows you to specify a Column attribute; I am not against adding support for it, providing we do not pull in the dependency.
For some time, the following should work:
Dapper.DefaultTypeMap.MatchNamesWithUnderscores = true;
I do the following using dynamic and LINQ:
var sql = #"select top 1 person_id, first_name, last_name from Person";
using (var conn = ConnectionFactory.GetConnection())
{
List<Person> person = conn.Query<dynamic>(sql)
.Select(item => new Person()
{
PersonId = item.person_id,
FirstName = item.first_name,
LastName = item.last_name
})
.ToList();
return person;
}
Here is a simple solution that doesn't require attributes, allowing you to keep infrastructure code out of your POCOs.
This is a class to deal with the mappings. A dictionary would work if you mapped all the columns, but this class allows you to specify just the differences. In addition, it includes reverse maps so you can get the field from the column and the column from the field, which can be useful when doing things such as generating sql statements.
public class ColumnMap
{
private readonly Dictionary<string, string> forward = new Dictionary<string, string>();
private readonly Dictionary<string, string> reverse = new Dictionary<string, string>();
public void Add(string t1, string t2)
{
forward.Add(t1, t2);
reverse.Add(t2, t1);
}
public string this[string index]
{
get
{
// Check for a custom column map.
if (forward.ContainsKey(index))
return forward[index];
if (reverse.ContainsKey(index))
return reverse[index];
// If no custom mapping exists, return the value passed in.
return index;
}
}
}
Setup the ColumnMap object and tell Dapper to use the mapping.
var columnMap = new ColumnMap();
columnMap.Add("Field1", "Column1");
columnMap.Add("Field2", "Column2");
columnMap.Add("Field3", "Column3");
SqlMapper.SetTypeMap(typeof (MyClass), new CustomPropertyTypeMap(typeof (MyClass), (type, columnName) => type.GetProperty(columnMap[columnName])));
An easy way to achieve this is to just use aliases on the columns in your query.
If your database column is PERSON_ID and your object's property is ID, you can just do
select PERSON_ID as Id ...
in your query and Dapper will pick it up as expected.
Taken from the Dapper Tests which is currently on Dapper 1.42.
// custom mapping
var map = new CustomPropertyTypeMap(typeof(TypeWithMapping),
(type, columnName) => type.GetProperties().FirstOrDefault(prop => GetDescriptionFromAttribute(prop) == columnName));
Dapper.SqlMapper.SetTypeMap(typeof(TypeWithMapping), map);
Helper method to get the name off the Description attribute (I personally have used Column like @kaleb's example):
static string GetDescriptionFromAttribute(MemberInfo member)
{
if (member == null) return null;
var attrib = (DescriptionAttribute)Attribute.GetCustomAttribute(member, typeof(DescriptionAttribute), false);
return attrib == null ? null : attrib.Description;
}
Class
public class TypeWithMapping
{
[Description("B")]
public string A { get; set; }
[Description("A")]
public string B { get; set; }
}
Before you open the connection to your database, execute this piece of code for each of your poco classes:
// Section
SqlMapper.SetTypeMap(typeof(Section), new CustomPropertyTypeMap(
typeof(Section), (type, columnName) => type.GetProperties().FirstOrDefault(prop =>
prop.GetCustomAttributes(false).OfType<ColumnAttribute>().Any(attr => attr.Name == columnName))));
Then add the data annotations to your poco classes like this:
public class Section
{
[Column("db_column_name1")] // Side note: if you create aliases, then they would match this.
public int Id { get; set; }
[Column("db_column_name2")]
public string Title { get; set; }
}
After that, you are all set. Just make a query call, something like:
using (var sqlConnection = new SqlConnection("your_connection_string"))
{
var sqlStatement = "SELECT " +
"db_column_name1, " +
"db_column_name2 " +
"FROM your_table";
return sqlConnection.Query<Section>(sqlStatement).AsList();
}
Messing with mapping is borderline moving into real ORM land. Instead of fighting with it and keeping Dapper in its true simple (fast) form, just modify your SQL slightly like so:
var sql = #"select top 1 person_id as PersonId,FirstName,LastName from Person";
If you're using .NET 4.5.1 or higher, check out Dapper.FluentColumnMapping for mapping in the LINQ style. It lets you fully separate the db mapping from your model (no need for annotations).
This is piggy backing off of other answers. It's just a thought I had for managing the query strings.
Person.cs
public class Person
{
public int PersonId { get; set; }
public string FirstName { get; set; }
public string LastName { get; set; }
public static string Select()
{
return $"select top 1 person_id {nameof(PersonId)}, first_name {nameof(FirstName)}, last_name {nameof(LastName)}from Person";
}
}
API Method
using (var conn = ConnectionFactory.GetConnection())
{
var person = conn.Query<Person>(Person.Select()).ToList();
return person;
}
The simple solution to the problem Kaleb is trying to solve is just to accept the property name if the column attribute doesn't exist:
Dapper.SqlMapper.SetTypeMap(
typeof(T),
new Dapper.CustomPropertyTypeMap(
typeof(T),
(type, columnName) =>
type.GetProperties().FirstOrDefault(prop =>
prop.GetCustomAttributes(false)
.OfType<ColumnAttribute>()
.Any(attr => attr.Name == columnName) || prop.Name == columnName)));
An easier way (same as @Matt M's answer, but corrected and with a fallback to the default map):
// override TypeMapProvider to return custom map for every requested type
Dapper.SqlMapper.TypeMapProvider = type =>
{
// create fallback default type map
var fallback = new DefaultTypeMap(type);
return new CustomPropertyTypeMap(type, (t, column) =>
{
var property = t.GetProperties().FirstOrDefault(prop =>
prop.GetCustomAttributes(typeof(ColumnAttribute))
.Cast<ColumnAttribute>()
.Any(attr => attr.Name == column));
// if no property matched - fall back to default type map
if (property == null)
{
property = fallback.GetMember(column)?.Property;
}
return property;
});
};
For all of you who use Dapper 1.12, here's what you need to do to get this done:
Add a new column attribute class:
[AttributeUsage(AttributeTargets.Field | AttributeTargets.Property)]
public class ColumnAttribute : Attribute
{
public string Name { get; set; }
public ColumnAttribute(string name)
{
this.Name = name;
}
}
Search for this line:
map = new DefaultTypeMap(type);
and comment it out.
Write this instead:
map = new CustomPropertyTypeMap(type, (t, columnName) =>
{
PropertyInfo pi = t.GetProperties().FirstOrDefault(prop =>
prop.GetCustomAttributes(false)
.OfType<ColumnAttribute>()
.Any(attr => attr.Name == columnName));
return pi != null ? pi : t.GetProperties().FirstOrDefault(prop => prop.Name == columnName);
});
I know this is a relatively old thread, but I thought I'd throw what I did out there.
I wanted attribute-mapping to work globally. Either you match the property name (aka default) or you match a column attribute on the class property. I also didn't want to have to set this up for every single class I was mapping to. As such, I created a DapperStart class that I invoke on app start:
public static class DapperStart
{
public static void Bootstrap()
{
Dapper.SqlMapper.TypeMapProvider = type =>
{
return new CustomPropertyTypeMap(type,
(t, columnName) => t.GetProperties().FirstOrDefault(prop =>
{
return prop.Name == columnName || prop.GetCustomAttributes(false).OfType<ColumnAttribute>()
.Any(attr => attr.Name == columnName);
}
));
};
}
}
Pretty simple. Not sure what issues I'll run into yet as I just wrote this, but it works.
Kaleb Pederson's solution worked for me. I updated the ColumnAttributeTypeMapper to allow a custom attribute (I had a requirement for two different mappings on the same domain object) and updated the properties to allow private setters in cases where a field needed to be derived and the types differed.
public class ColumnAttributeTypeMapper<T,A> : FallbackTypeMapper where A : ColumnAttribute
{
public ColumnAttributeTypeMapper()
: base(new SqlMapper.ITypeMap[]
{
new CustomPropertyTypeMap(
typeof(T),
(type, columnName) =>
type.GetProperties( BindingFlags.NonPublic | BindingFlags.Public | BindingFlags.Instance).FirstOrDefault(prop =>
prop.GetCustomAttributes(true)
.OfType<A>()
.Any(attr => attr.Name == columnName)
)
),
new DefaultTypeMap(typeof(T))
})
{
//
}
}