Specified cast is not valid (Enum with int value, Dapper) - dapper

I have a class with a (simple, first cut) implementation of user roles:
class User {
    public Role Role { get; set; }
    // ...
    public User() { this.Role = Role.Normal; }
    public void Save() { Membership.CreateUser(...) } // System.Web.Security.Membership
}

enum Role : int {
    Invalid = 0,
    Normal = 1,
    SuperUser = 4096
}
Before adding the role, everything worked fine (if that matters).
Now, when I try to fetch users, this line fails:
toReturn = conn.Query<User>("SELECT TOP 1 * FROM dbo.UserProfile WHERE 1=1");
The stack trace (from ELMAH):
System.Data.DataException: Error parsing column 2 (Role=1 - Int16) ---> System.InvalidCastException: Specified cast is not valid.
at Deserialize06df745b-4fad-4d55-aada-632ce72e3607(IDataReader )
--- End of inner exception stack trace ---
at Dapper.SqlMapper.ThrowDataException(Exception ex, Int32 index, IDataReader reader) in c:\Dev\Dapper\Dapper\SqlMapper.cs:line 2126
at Deserialize06df745b-4fad-4d55-aada-632ce72e3607(IDataReader )
at Dapper.SqlMapper.<QueryInternal>d__d`1.MoveNext() in c:\Dev\Dapper\Dapper\SqlMapper.cs:line 827
at System.Collections.Generic.List`1..ctor(IEnumerable`1 collection)
at System.Linq.Enumerable.ToList[TSource](IEnumerable`1 source)
at Dapper.SqlMapper.Query[T](IDbConnection cnn, String sql, Object param, IDbTransaction transaction, Boolean buffered, Nullable`1 commandTimeout, Nullable`1 commandType) in c:\Dev\Dapper\Dapper\SqlMapper.cs:line 770
In the database, the column type for Role is smallint.
I'm using Dapper 1.12.1 from NuGet.

Gah. The answer was to make the database and class definitions match.
For smallint (which is what MigratorDotNet generated for me), I needed the enum to derive from short, not int. Everything works now.
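For reference, the corrected definition looks like this (a minimal sketch based on the fix above; the enum's underlying type now matches the smallint column, so the Int16 value Dapper reads casts cleanly):
enum Role : short {
    Invalid = 0,
    Normal = 1,
    SuperUser = 4096
}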
Possibly useful Google Code issue: https://code.google.com/p/dapper-dot-net/issues/detail?id=32

Related

stored procedure 'auto_pk_for_table' not found

I don't know why I'm receiving this error:
org.apache.cayenne.CayenneRuntimeException: [v.4.0.M5 Feb 24 2017 07:47:55] Commit Exception
[...]
Caused by: java.sql.SQLException: Procédure stockée 'auto_pk_for_table' introuvable. (Stored procedure 'auto_pk_for_table' not found.)
[...]
I'm using Cayenne:
<dependency>
    <groupId>org.apache.cayenne</groupId>
    <artifactId>cayenne-server</artifactId>
    <version>4.0.M5</version>
</dependency>
and jTDS for SQL Server:
<dependency>
    <groupId>net.sourceforge.jtds</groupId>
    <artifactId>jtds</artifactId>
    <version>1.3.1</version>
</dependency>
The connection is OK:
avr. 10, 2017 2:36:30 PM org.apache.cayenne.datasource.DriverDataSource getConnection
INFOS: +++ Connecting: SUCCESS.
I'm trying to create a new user (I'm starting with the basics!), so my code is below (I've cut it down a bit, it's too long!):
public abstract class _UserInfo extends CayenneDataObject {
    public static final String ADDRESS_PROPERTY = "address";

    public void setAddress(String address) {
        writeProperty(ADDRESS_PROPERTY, address);
    }

    public String getAddress() {
        return (String) readProperty(ADDRESS_PROPERTY);
    }
}

public class UserInfo extends _UserInfo implements Serializable {
    private static final long serialVersionUID = 1L;

    public String address;

    public String getAdress() {
        return address;
    }

    public void setAddress(String address) {
        super.setAddress(address);
    }

    // I have the hashCode and equals too
}
Then I used Vaadin to create my form:
public class UserAddView extends CustomComponent implements View {
    private static final long serialVersionUID = 1L;

    private TextField address;
    private Button save;

    public static final String USERVIEW = "user";

    public boolean checkValidation() {
        if (!checkTextFieldValid(address))
            return false;
        return true;
    }

    public boolean checkTextFieldValid(TextField element) {
        if (element == null || element.isEmpty()) {
            Notification.show(
                    "You should register a " + element.getDescription(),
                    Type.WARNING_MESSAGE);
            return false;
        }
        return true;
    }

    public UserAddView() {
        VerticalLayout mainLayout = new VerticalLayout();
        mainLayout.setSizeFull();
        setCompositionRoot(mainLayout);

        final VerticalLayout vlayout = new VerticalLayout();
        address = new TextField("Address:");
        address.setDescription("Address");
        vlayout.addComponent(address);

        save = new Button("Save");
        vlayout.addComponent(save);

        mainLayout.addComponent(new HeaderMenu());
        mainLayout.addComponent(vlayout);
        addListeners();
    }

    private void addListeners() {
        save.addClickListener(new ClickListener() {
            private static final long serialVersionUID = 1L;

            @Override
            public void buttonClick(ClickEvent event) {
                if (checkValidation() == true) {
                    ServerRuntime cayenneRuntime = ServerRuntime.builder()
                            .addConfig("cayenne-myapplication.xml").build();
                    ObjectContext context = cayenneRuntime.newContext();

                    UserInfo user = context.newObject(UserInfo.class);
                    user.setAddress(address.getValue());
                    user.getObjectContext().commitChanges();

                    Notification.show(
                            "Has been saved, We will send you your password by email. Your user login is: "
                                    + email.getValue(), Type.TRAY_NOTIFICATION);
                    getUI().getNavigator().navigateTo(HomepageView.MAINVIEW);
                }
            }
        });
    }

    @Override
    public void enter(ViewChangeEvent event) {
        // TODO Auto-generated method stub
    }
}
EDIT, additional information: In my user object I have a userid (primary key); in Cayenne I declared it as the primary key too, as a smallint. This error seems to be linked... https://cayenne.apache.org/docs/3.1/api/org/apache/cayenne/dba/sybase/SybasePkGenerator.html
The error happens when you insert a new object. For each new object Cayenne needs to generate a value of the primary key. There are various strategies to do this. The default strategy depends on the DB that you are using. For SQLServer (and for Sybase, as you've discovered :)) that strategy is to use a special stored procedure.
To create this stored procedure (and other supporting DB objects), go to CayenneModeler, open your project, and select "Tools > Generate Database Schema". In "SQL Options" tab, uncheck all checkboxes except for "Create Primary Key Support". The SQL you will see in the window below the checkboxes is what you need to run on SQL server. Either do it from Cayenne modeler or copy/paste to your favorite DB management tool.
There's also an alternative that does not require a stored procedure - using DB auto-increment feature. For this you will need to go to each DbEntity in the Modeler and under the "Entity" tab select "Database-Generated" in the "Pk Generation Strategy" dropdown. This of course implies that your PK column is indeed an auto-increment in the DB (meaning you may need to adjust your DB schema accordingly).

How can I make Dapper.NET throw when result set has unmapped columns?

Using the example code below as context... When I run this query I get the 'Id' field coming back as its default value (which is 0 for an int). I would like to tell Dapper to throw an exception if there is a column in the result set that does not get mapped to a property on my result object. (I understand that the issue is just that I need to remove the extra 'd' in the SQL query, but I'm interested in having this expose itself more explicitly.)
I've been unable to find anything on this topic. Please let me know if this is even possible with Dapper.
Thanks in advance (besides this issue, and for anyone who hasn't taken the plunge, Dapper really is the greatest thing since sliced bread!).
class CustomerRecord
{
    public int Id { get; set; }
    public string Name { get; set; }
}

CustomerRecord[] GetCustomerRecords()
{
    CustomerRecord[] ret;
    var sql = @"SELECT
        CustomerRecordId AS Idd,
        CustomerName as Name
        FROM CustomerRecord";
    using (var connection = new SqlConnection(this.connectionString))
    {
        ret = connection.Query<CustomerRecord>(sql).ToArray();
    }
    return ret;
}
You could create your own type map where you use Dapper's DefaultTypeMap and throw an exception when it cannot find the member:
public class ThrowWhenNullTypeMap<T> : SqlMapper.ITypeMap
{
    private readonly SqlMapper.ITypeMap _defaultTypeMap = new DefaultTypeMap(typeof(T));

    public ConstructorInfo FindConstructor(string[] names, Type[] types)
    {
        return _defaultTypeMap.FindConstructor(names, types);
    }

    public ConstructorInfo FindExplicitConstructor()
    {
        return _defaultTypeMap.FindExplicitConstructor();
    }

    public SqlMapper.IMemberMap GetConstructorParameter(ConstructorInfo constructor, string columnName)
    {
        return _defaultTypeMap.GetConstructorParameter(constructor, columnName);
    }

    public SqlMapper.IMemberMap GetMember(string columnName)
    {
        var member = _defaultTypeMap.GetMember(columnName);
        if (member == null)
        {
            // No property or field matches this result-set column: fail loudly instead of ignoring it.
            throw new InvalidOperationException($"No member of {typeof(T).Name} maps to column '{columnName}'.");
        }
        return member;
    }
}
The downside of this is that you have to configure the type map for every entity:
SqlMapper.SetTypeMap(typeof(CustomerRecord), new ThrowWhenNullTypeMap<CustomerRecord>());
This could be configured using reflection, however.
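For example, here is a minimal sketch of that reflection-based registration (the assembly scan and the namespace filter are assumptions; adjust the filter to however your entity types are organized):
using System;
using System.Linq;
using System.Reflection;
using Dapper;

public static class TypeMapRegistration
{
    // Register ThrowWhenNullTypeMap<T> for every entity type found in the given
    // assembly/namespace, so each map doesn't have to be listed by hand.
    public static void RegisterThrowingTypeMaps(Assembly assembly, string entityNamespace)
    {
        var entityTypes = assembly.GetTypes()
            .Where(t => t.IsClass && !t.IsAbstract && t.Namespace == entityNamespace);

        foreach (var type in entityTypes)
        {
            var mapType = typeof(ThrowWhenNullTypeMap<>).MakeGenericType(type);
            var map = (SqlMapper.ITypeMap)Activator.CreateInstance(mapType);
            SqlMapper.SetTypeMap(type, map);
        }
    }
}
Calling RegisterThrowingTypeMaps(typeof(CustomerRecord).Assembly, "MyApp.Records") once at startup would then cover every record type in that (hypothetical) namespace.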
I came here after I solved this same problem for the IEnumerable<dynamic> methods in Dapper. Then I found the proposal to solve the issue for Query<T>; but that doesn't seem to be going anywhere.
My answer builds on the answer proposed by @HenkMollema, and uses his class in the solution, so credit to him for that...
To solve the IEnumerable<dynamic> scenario, I had created a "SafeDynamic" class (follow the link above to see that). I refactored the static "Create" method into an extension method:
public static class EnumerableDynamicExtensions
{
    public static IEnumerable<dynamic> Safe(this IEnumerable<dynamic> rows)
    {
        return rows.Select(x => new SafeDynamic(x));
    }
}
and then I created a DapperExtensions class to provide 'Safe' versions of Query and Read (Read is used after QueryMultiple), to give me...
internal static class DapperExtensions
{
    public static IEnumerable<dynamic> SafeQuery(this IDbConnection cnn, string sql, object param = null, IDbTransaction transaction = null, bool buffered = true, int? commandTimeout = default(int?), CommandType? commandType = default(CommandType?))
    {
        return cnn.Query(sql, param, transaction, buffered, commandTimeout, commandType).Safe();
    }

    public static IEnumerable<dynamic> SafeRead(this SqlMapper.GridReader gridReader, bool buffered = true)
    {
        return gridReader.Read(buffered).Safe();
    }
}
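For illustration, a hypothetical call site for the dynamic version (this assumes SafeDynamic throws when an unknown member is accessed, as described in the linked answer):
var rows = connection.SafeQuery("SELECT CustomerRecordId, CustomerName FROM CustomerRecord");
foreach (var row in rows)
{
    Console.WriteLine(row.CustomerName);   // fine
    Console.WriteLine(row.DoesNotExist);   // would throw instead of silently returning null
}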
So to solve this issue I added a "SafeQuery<T>" method to DapperExtensions, which takes care of setting up that type mapping for you:
private static readonly IDictionary<Type, object> TypesThatHaveMapper = new Dictionary<Type, object>();

public static IEnumerable<T> SafeQuery<T>(this IDbConnection cnn, string sql, object param = null, IDbTransaction transaction = null, bool buffered = true, int? commandTimeout = default(int?), CommandType? commandType = default(CommandType?))
{
    if (TypesThatHaveMapper.ContainsKey(typeof(T)) == false)
    {
        SqlMapper.SetTypeMap(typeof(T), new ThrowWhenNullTypeMap<T>());
        TypesThatHaveMapper.Add(typeof(T), null);
    }
    return cnn.Query<T>(sql, param, transaction, buffered, commandTimeout, commandType);
}
So if the original poster changes the call from Query to SafeQuery, it should do what he requested.
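For example (hypothetical usage, adapting the GetCustomerRecords method from the question):
ret = connection.SafeQuery<CustomerRecord>(sql).ToArray();
// With the extra 'd' ("Idd") still in the SQL, this now throws instead of silently leaving Id = 0.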
Edit 25/1/17
Improvements to avoid threading issues on the static dictionary:
private static readonly ConcurrentDictionary<Type, object> TypesThatHaveMapper = new ConcurrentDictionary<Type, object>();

public static IEnumerable<T> SafeQuery<T>(this IDbConnection cnn, string sql, object param = null, IDbTransaction transaction = null, bool buffered = true, int? commandTimeout = default(int?), CommandType? commandType = default(CommandType?))
{
    TypesThatHaveMapper.AddOrUpdate(typeof(T), AddValue, UpdateValue);
    return cnn.Query<T>(sql, param, transaction, buffered, commandTimeout, commandType);
}

private static object AddValue(Type type)
{
    SqlMapper.SetTypeMap(type, XXX); // Apologies... XXX is left to the reader, as my implementation has moved on significantly.
    return null;
}

private static object UpdateValue(Type type, object existingValue)
{
    return null;
}
I'd like to expand on @Richardissimo's answer by providing a Visual Studio project that includes his "SafeQuery" extension to Dapper, wrapped up nice and neat and tested.
https://github.com/LarrySmith-1437/SafeDapper
I use this in all my projects now to help keep the DAL clean of mismapped data, and felt the need to share. I would have posted a NuGet package, but the dependency on Dapper itself makes it much easier to post the project so that consumers can update the reference to the Dapper version they want. Consume in good health, all.
Based on this thread and some other resources on SO, I've created an extension method without any custom mapper. What I needed was to throw when some property of my DTO was not set, for example because the SQL query was missing that column in its SELECT statement.
Otherwise the DTO property would silently be left at its default value, and that's kind of dangerous.
The code could be simplified a little by not checking up front that all properties are present in the result, but instead throwing the exception in the final Select call, where we could iterate through the properties of our type and check whether the query result contains each one.
public static class Extensions
{
    public static async Task<IEnumerable<T>> SafeQueryAsync<T>(
        this IDbConnection cnn,
        string sql,
        object param = null,
        IDbTransaction transaction = null,
        int? commandTimeout = default(int?),
        CommandType? commandType = default(CommandType?))
        where T : new()
    {
        Dictionary<string, PropertyInfo> propertySetters = typeof(T)
            .GetProperties().Where(p => p.CanRead && p.CanWrite)
            .ToDictionary(p => p.Name.ToLowerInvariant(), p => p);
        HashSet<string> typeProperties = propertySetters
            .Select(p => p.Key)
            .ToHashSet();

        var rows = (await cnn.QueryAsync(sql, param, transaction, commandTimeout, commandType)).ToArray();
        if (!rows.Any())
        {
            return Enumerable.Empty<T>();
        }

        var firstRow = rows.First();
        HashSet<string> rowColumns = ((IDictionary<string, object>) firstRow)
            .Select(kvp => kvp.Key.ToLowerInvariant()).ToHashSet();

        var notMappedColumns = typeProperties.Except(rowColumns).ToArray();
        if (notMappedColumns.Any())
        {
            throw new InvalidOperationException(
                $"Not all type properties had corresponding columns in SQL query. Query result lacks [{string.Join(", ", notMappedColumns)}]");
        }

        return rows.Select(row =>
        {
            IDictionary<string, object> rowDict = (IDictionary<string, object>) row;
            T instance = new T();
            rowDict.Where(o => propertySetters.ContainsKey(o.Key.ToLowerInvariant()))
                .ToList().ForEach(o => propertySetters[o.Key.ToLowerInvariant()].SetValue(instance, o.Value));
            return instance;
        }).AsEnumerable();
    }
}
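Hypothetical usage, reusing the query from the original question:
var customers = (await connection.SafeQueryAsync<CustomerRecord>(sql)).ToArray();
// Throws InvalidOperationException because no column in the result maps to CustomerRecord.Id
// (the SELECT aliases it as "Idd").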

Retrieving XML from database with Dapper

I am using Dapper to query a table that includes an XML field:
CREATE TABLE Workflow
(
    Guid uniqueidentifier not null,
    State xml not null
)
which is then mapped to a property of type XDocument:
public class Workflow
{
    public Guid InstanceId { get; set; }
    public XDocument State { get; set; }
}
but when I try to query the table, I get the following error:
Error parsing column 1 (State= - String)
at Dapper.SqlMapper.ThrowDataException(Exception ex, Int32 index, IDataReader reader, Object value) in d:\Dev\dapper-dot-net\Dapper NET40\SqlMapper.cs:line 4045
at Deserialize038b29f4-d97d-4b62-b45b-786bd7d50e7a(IDataReader )
at Dapper.SqlMapper.<QueryImpl>d__11`1.MoveNext() in d:\Dev\dapper-dot-net\Dapper NET40\SqlMapper.cs:line 1572
at System.Collections.Generic.List`1..ctor(IEnumerable`1 collection)
at System.Linq.Enumerable.ToList[TSource](IEnumerable`1 source)
at Dapper.SqlMapper.Query[T](IDbConnection cnn, String sql, Object param, IDbTransaction transaction, Boolean buffered, Nullable`1 commandTimeout, Nullable`1 commandType) in d:\Dev\dapper-dot-net\Dapper NET40\SqlMapper.cs:line 1443
at MyProject.DapperBase.Query[TResult](String command, DynamicParameters parameters, IDbTransaction transaction, Boolean buffered, Int32 commandTimeout) in d:\MyProject\DapperBase.cs:line 122
at MyProject.WorkflowData.Get(Guid identifier) in d:\MyProject\WorkflowData.cs:line 41
at MyProject.WorkflowLogic.Save(Workflow workflow) in d:\MyProject\WorkflowLogic.cs:line 34
at MyProject.WorkflowsController.Save(Guid id, WorkflowRequest request) in d:\MyProject\WorkflowsController.cs:line 97
InnerException: Invalid cast from 'System.String' to 'System.Xml.Linq.XDocument'.
at System.Convert.DefaultToType(IConvertible value, Type targetType, IFormatProvider provider)
at System.String.System.IConvertible.ToType(Type type, IFormatProvider provider)
at System.Convert.ChangeType(Object value, Type conversionType, IFormatProvider provider)
at System.Convert.ChangeType(Object value, Type conversionType)
at Deserialize038b29f4-d97d-4b62-b45b-786bd7d50e7a(IDataReader )
Other than modifying my POCO to use a string datatype and then convert the string into an XDocument elsewhere, is there a way of getting Dapper to correctly deserialise the XML from the database?
In the end, I just brute-forced it:
public class Workflow
{
    public Guid InstanceId { get; set; }
    public XDocument StateIn { set { State = value.ToString(); } }
    public string State { get; set; }
    public XDocument StateOut { get { return XDocument.Parse(State); } }
}
Dapper plays with the State value, and I just set the value on StateIn and read it off StateOut. I feel a little bit dirty coming up with a solution like this, but hey, it works.
Perhaps creating a custom type handler can help? Something like:
public class XDocumentTypeHandler : SqlMapper.TypeHandler<XDocument>
{
    public override void SetValue(IDbDataParameter parameter, XDocument value)
    {
        // Set the value on the db parameter (the XML is written as a string).
        parameter.Value = value.ToString();
    }

    public override XDocument Parse(object value)
    {
        // Parse the value from the db into an XDocument.
        return XDocument.Parse((string)value);
    }
}
You have to add the type handler with SqlMapper.AddTypeHandler().
See a sample implementation.
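The registration itself is typically a one-liner at application startup, for example:
SqlMapper.AddTypeHandler(new XDocumentTypeHandler());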

JPA2 CriteriaBuilder: Using LOB property for greaterThan comparison

My application is using SQL Server and JPA2 in the backend. The app makes use of a timestamp column (in the SQL Server sense, which is equivalent to row version; see here) per entity to keep track of freshly modified entities. NB: SQL Server stores this column as binary(8).
Each entity has a respective timestamp property, mapped as #Lob, which is the way to go for binary columns:
@Lob
@Column(columnDefinition="timestamp", insertable=false, updatable=false)
public byte[] getTimestamp() {
    ...
The server sends incremental updates to mobile clients along with the latest database timestamp. The mobile client will then pass the old timestamp back to the server on the next refresh request so that the server knows to return only fresh data. Here's what a typical query (in JPQL) looks like:
select v from Visit v where v.timestamp > :oldTimestamp
Please note that I'm using a byte array as a query parameter and it works fine when implemented in JPQL this way.
My problems begin when trying to do the same using the Criteria API:
private void getFreshVisits(byte[] oldVersion) {
    EntityManager em = getEntityManager();
    CriteriaBuilder cb = em.getCriteriaBuilder();
    CriteriaQuery<Visit> cq = cb.createQuery(Visit.class);
    Root<Visit> root = cq.from(Visit.class);
    Predicate tsPred = cb.gt(root.get("timestamp").as(byte[].class), oldVersion); // compiler error
    cq.where(tsPred);
    ...
}
The above will result in a compiler error, as the gt method can only be used with Number. One could instead use the greaterThan method, which merely requires the parameters to be Comparable, but that results in yet another compiler error (byte[] does not implement Comparable).
So to sum it up, my question is: how can I use the criteria api to add a greaterThan predicate for a byte[] property? Any help will be greatly appreciated.
PS. As to why I'm not using a regular DateTime last_modified column: because of concurrency and the way synchronization is implemented, this approach could result in lost updates. Microsoft's Sync Framework documentation recommends the former approach as well.
I know this was asked a couple of years back but just in case anyone else stumbles upon this.. In order to use a SQLServer rowver column within JPA you need to do a couple of things..
Create a type that will wrap the rowver/timestamp:
import com.fasterxml.jackson.annotation.JsonIgnore;

import javax.xml.bind.annotation.XmlTransient;
import java.io.Serializable;
import java.math.BigInteger;
import java.util.Arrays;

/**
 * A RowVersion object
 */
public class RowVersion implements Serializable, Comparable<RowVersion> {

    @XmlTransient
    @JsonIgnore
    private byte[] rowver;

    public RowVersion() {
    }

    public RowVersion(byte[] internal) {
        this.rowver = internal;
    }

    @XmlTransient
    @JsonIgnore
    public byte[] getRowver() {
        return rowver;
    }

    public void setRowver(byte[] rowver) {
        this.rowver = rowver;
    }

    @Override
    public int compareTo(RowVersion o) {
        return new BigInteger(1, rowver).compareTo(new BigInteger(1, o.getRowver()));
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (o == null || getClass() != o.getClass()) return false;
        RowVersion that = (RowVersion) o;
        return Arrays.equals(rowver, that.rowver);
    }

    @Override
    public int hashCode() {
        return Arrays.hashCode(rowver);
    }
}
The key here is that it implements Comparable, which you need if you want to use it in comparisons (which you definitely do).
Next, create an AttributeConverter that will convert between a byte[] and the class you just made:
import javax.persistence.AttributeConverter;
import javax.persistence.Converter;

/**
 * JPA converter for the RowVersion type
 */
@Converter
public class RowVersionTypeConverter implements AttributeConverter<RowVersion, byte[]> {

    @Override
    public byte[] convertToDatabaseColumn(RowVersion attribute) {
        return attribute != null ? attribute.getRowver() : null;
    }

    @Override
    public RowVersion convertToEntityAttribute(byte[] dbData) {
        return new RowVersion(dbData);
    }
}
Now let's apply this RowVersion attribute/type to a real world scenario. Let's say you wanted to find all Programs that have changed on or before some point in time.
One straightforward way to solve this would be to use a DateTime field in the object and timestamp column within db. Then you would use 'where lastUpdatedDate <= :date'.
Suppose that you don't have that timestamp column or there's no guarantee that it will be updated properly when changes are made; or let's say your shop loves SQLServer and wants to use rowver instead.
What to do? There are two issues to solve: one is how to generate a rowver, and the other is how to use the generated rowver to find Programs.
Since the database generates the rowver, you can either ask the db for the 'current max rowver' (a custom SQL Server thing) or you can simply save an object that has a RowVersion attribute and then use that object's generated RowVersion as the boundary for the query to find the Programs changed after that time. The latter solution is more portable and is what is shown below.
The SyncPoint class snippet below is the object that is used as a 'point in time' kind of deal. So once a SyncPoint is saved, the RowVersion attached to it is the db version at the time it was saved.
Here is the SyncPoint snippet. Notice the annotation to specify the custom converter (don't forget to make the column insertable = false, updatable = false):
/**
 * A sample super class that uses RowVersion
 */
@MappedSuperclass
public abstract class SyncPoint {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    // type is rowver for SQLServer, blob(8) for postgresql and h2
    @Column(name = "current_database_version", insertable = false, updatable = false)
    @Convert(converter = RowVersionTypeConverter.class)
    private RowVersion currentDatabaseVersion;

    @Column(name = "created_date_utc", columnDefinition = "timestamp", nullable = false)
    private DateTime createdDate;
    ...
Also (for this example) here is the Program object we want to find:
@Entity
@Table(name = "program_table")
public class Program {

    @Id
    private Integer id;

    private boolean active;

    // type is rowver for SQLServer, blob(8) for postgresql and h2
    @Column(name = "rowver", insertable = false, updatable = false)
    @Convert(converter = RowVersionTypeConverter.class)
    private RowVersion currentDatabaseVersion;

    @Column(name = "last_chng_dt")
    private DateTime lastUpdatedDate;
    ...
Now you can use these fields within your JPA criteria queries just like anything else.. here is a snippet that we used inside a spring-data Specifications class:
/**
 * Find Programs changed after a synchronization point
 *
 * @param filter that has the changedAfter sync point
 * @return a specification or null
 */
public Specification<Program> changedBeforeOrEqualTo(final ProgramSearchFilter filter) {
    return new Specification<Program>() {
        @Override
        public Predicate toPredicate(Root<Program> root, CriteriaQuery<?> query, CriteriaBuilder cb) {
            if (filter != null && filter.changedAfter() != null) {
                // load the SyncPoint from the db to get the rowver column populated
                SyncPoint fromDb = synchronizationPersistence.reload(filter.changedBeforeOrEqualTo());
                if (fromDb != null) {
                    // real sync point made by database
                    if (fromDb.getCurrentDatabaseVersion() != null) {
                        // use binary version
                        return cb.lessThanOrEqualTo(root.get(Program_.currentDatabaseVersion),
                                fromDb.getCurrentDatabaseVersion());
                    } else if (fromDb.getCreatedDate() != null) {
                        // use timestamp instead of binary version cause db doesn't make one
                        return cb.lessThanOrEqualTo(root.get(Program_.lastUpdatedDate),
                                fromDb.getCreatedDate());
                    }
                }
            }
            return null;
        }
    };
}
The specification above works with either the binary current database version or a timestamp; this way I could test my stuff and all the upstream code on a database other than SQL Server.
That's it really: a) type to wrap the byte[] b) JPA converter c) use attribute in query.

Error in SQL Server to WCF development

I am testing a DB that has two tables (Satellite and Channel) to be exposed as I need using WCF. Unfortunately, I have tried everything I know and everything I found online for more than a week now, and I can't solve the problem.
This is the service contract IService.cs
[ServiceContract]
public interface IService
{
    [OperationContract]
    List<Satalite> SelectSatalite(int satNum);

    [OperationContract]
    List<Satalite> SataliteList();

    [OperationContract]
    List<Channel> ChannelList(int satNum);

    [OperationContract]
    String Sat(int satNum);
}
And this is the Service.svc.cs file
public class Service : IService
{
    DataDbDataContext DbObj = new DataDbDataContext();

    public List<Satalite> SataliteList()
    {
        var satList = from r in DbObj.Satalites
                      select r;
        return satList.ToList();
    }

    public List<Satalite> SelectSatalite(int satNum)
    {
        var satList = from r in DbObj.Satalites
                      where r.SateliteID == satNum
                      select r;
        return satList.ToList();
    }

    public List<Channel> ChannelList(int satNum)
    {
        var channels = from r in DbObj.Channels
                       where r.SateliteID == satNum
                       select r;
        return channels.ToList();
    }

    public String Sat(int satNum)
    {
        Satalite satObj = new Satalite();
        satObj = DbObj.Satalites.Single(p => p.SateliteID == satNum);
        return satObj.Name;
    }
}
Whenever I try to run the first three I get an error when testing them using WcfTestClient.exe; the last one works with no issues.
The underlying connection was closed: The connection was closed
unexpectedly.
Server stack trace:
at System.ServiceModel.Channels.HttpChannelUtilities.ProcessGetResponseWebException(WebException
webException, HttpWebRequest request, HttpAbortReason abortReason)
at System.ServiceModel.Channels.HttpChannelFactory.HttpRequestChannel.HttpChannelRequest.WaitForReply(TimeSpan
timeout)
at System.ServiceModel.Channels.RequestChannel.Request(Message message,
TimeSpan timeout)
at System.ServiceModel.Dispatcher.RequestChannelBinder.Request(Message
message, TimeSpan timeout)
at System.ServiceModel.Channels.ServiceChannel.Call(String action,
Boolean oneway, ProxyOperationRuntime operation, Object[] ins,
Object[] outs, TimeSpan timeout)
at System.ServiceModel.Channels.ServiceChannelProxy.InvokeService(IMethodCallMessage
methodCall, ProxyOperationRuntime operation)
at System.ServiceModel.Channels.ServiceChannelProxy.Invoke(IMessage
message)
Exception rethrown at [0]:
at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage
reqMsg, IMessage retMsg)
at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData&
msgData, Int32 type) at IService.SelectSatalite(Int32 satNum)
at ServiceClient.SelectSatalite(Int32 satNum)
Inner Exception: The underlying connection was closed: The connection
was closed unexpectedly.
at System.Net.HttpWebRequest.GetResponse()
at System.ServiceModel.Channels.HttpChannelFactory.HttpRequestChannel.HttpChannelRequest.WaitForReply(TimeSpan
timeout)
What I understand is that the error happens for the custom classes which map to the DB tables; if I use a type known to the .NET compiler (e.g. int or string) it works with no problems. Unfortunately, I haven't found a solution.
The error appears to have one of two causes:
a timeout since you're returning too much data, e.g. the selection of the data from the database takes too long for the service method to complete in time
or:
the message size is too large, because you're selecting too much data, and thus the WCF communication aborts before the whole data has been returned
My solution:
don't select all data from the tables! Return only as much data as you can really handle / display, e.g. 10 rows, 20 rows or a maximum of 100 rows....
Try this - if you change your method to:
public List<Satalite> SataliteList(int count)
{
    var satList = (from r in DbObj.Satalites
                   select r).Take(count);
    return satList.ToList();
}
Can you call this from the WCF Test Client with e.g. count = 10 or count = 50 ??
Adjusting the timeout settings on the server and client side will help you.
On the server side, adjust the sendTimeout attribute of the binding element; on the client side, adjust the receiveTimeout attribute of the binding element. A rough programmatic sketch is shown below.
Thanks,
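For illustration, a minimal sketch of raising those limits in code on the client (the binding type, address, and values are assumptions; the equivalent attributes can also be set on the binding element in the config file):
using System;
using System.ServiceModel;

// Hypothetical client-side setup: raise the timeouts and the message size quota
// before creating the channel. Adjust the binding type and address to your service.
var binding = new BasicHttpBinding
{
    SendTimeout = TimeSpan.FromMinutes(2),
    ReceiveTimeout = TimeSpan.FromMinutes(2),
    MaxReceivedMessageSize = 10 * 1024 * 1024  // allow larger result sets (10 MB)
};
var factory = new ChannelFactory<IService>(binding, new EndpointAddress("http://localhost/Service.svc"));
IService client = factory.CreateChannel();
var satellites = client.SataliteList();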
