Salesforce Commerce Cloud/Demandware - OCAPI query orders by date range

I am trying to query the Demandware (SFCC) API to extract orders using a date range. The POST to orders_search works, but it seems terribly inefficient: first I pull ALL the data and then I filter the results.
I would like to simply query for the date range, but I cannot figure out how to do that. Help.
{
  "query": {
    "filtered_query": {
      "query": {
        "term_query": { "fields": ["creation_date"], "operator": "is_not_null" }
      },
      "filter": {
        "range_filter": {
          "field": "creation_date",
          "from": "2017-08-12T00:00:00.000Z",
          "to": "2017-08-13T00:00:00.000Z",
          "from_inclusive": true
        }
      }
    }
  }
}
EDIT: While I solved the initial question, this ends up being more complicated because the service only returns 200 results per request. So first you have to make a request to find out how many results there are, then call the service multiple times to get the data. Below is the code as used with C#. The date ranges are passed in as variables.
var m_payload_count = "{ query : { filtered_query: { query: { term_query: { fields: [\"creation_date\"], operator: \"is_not_null\" } }, filter: { range_filter: { field: \"creation_date\", from: \"" + strBeginDateTime + "\", to: \"" + strEndDateTime + "\", from_inclusive: true } } } } }";
// can only get 200 responses at a time, so make a basic call first to get the total
m_response_count = apiClient.UploadString(m_url, m_payload_count);
dynamic m_jsoncount = JsonConvert.DeserializeObject(m_response_count);
// determine the number of API calls needed, rounding up; the begin/end dates and the paging start are substituted into each payload
int m_records = (int)m_jsoncount["total"];
int m_numbercalls = (m_records + (200 - 1)) / 200;
dynamic m_json;
for (int i = 0; i < m_numbercalls; i++)
{
var m_payload = "{ query : { filtered_query: { query: { term_query: { fields: [\"creation_date\"], operator: \"is_not_null\" } }, filter: { range_filter: { field: \"creation_date\", from: \"" + strBeginDateTime + "\", to: \"" + strEndDateTime + "\", from_inclusive: true } } } }, select: \"(**)\", start: " + i * 200 + ", count: 200 }";
m_response = apiClient.UploadString(m_url, m_payload);
m_json = JsonConvert.DeserializeObject(m_response);
// ... process the results in m_json here ...
}
The remainder of the code is omitted, but it is essentially iterating through the m_json object.
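For reference, here is the same count-then-page flow as a Node sketch (hedged: the endpoint URL and bearer token are placeholders, Node 18+ is assumed for the global fetch, and the response field names follow the OCAPI search response shape of total/hits):

// Minimal sketch of the count-then-page flow against an OCAPI order search
// endpoint. `url` and `token` are hypothetical inputs, not from the original code.
async function fetchOrders(url, token, from, to) {
  // Base search payload, same shape as the C# version above.
  const query = {
    query: {
      filtered_query: {
        query: { term_query: { fields: ['creation_date'], operator: 'is_not_null' } },
        filter: { range_filter: { field: 'creation_date', from: from, to: to, from_inclusive: true } }
      }
    },
    select: '(**)'
  };
  const post = async (body) => {
    const res = await fetch(url, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json', Authorization: `Bearer ${token}` },
      body: JSON.stringify(body)
    });
    return res.json();
  };
  // First call only reads the total so we know how many pages to request.
  const { total } = await post(query);
  const hits = [];
  for (let start = 0; start < total; start += 200) {
    const page = await post({ ...query, start: start, count: 200 });
    hits.push(...(page.hits || []));
  }
  return hits;
}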

A simpler variant uses match_all_query instead of the is_not_null term query, with an open-ended range:
{
  "query": {
    "filtered_query": {
      "query": { "match_all_query": {} },
      "filter": {
        "range_filter": {
          "field": "creation_date",
          "from": "2016-01-01T00:00:00.000Z"
        }
      }
    }
  },
  "select": "(**)"
}

Related

Formatting return Type of the column in Sequelize

I have used Sequelize to state the type of one of my columns as follows:
userTime: DataTypes.TIME
This declares the column in MS SQL as time(7) format.
I add data to the respective column as a string, e.g. "12:00".
However, when I retrieve the data from the DB it comes out in the following format:
userTime: "1970-01-01T12:00:00.000Z"
How can I change the output to be of the following format, using Sequelize:
userTime: "12:00"
Is there a way to format the return type in Sequelize?
What I would suggest is to create a getter via getterMethods; it works like a virtual field that you get with each query.
var moment = require('moment'); // needed for the getter below

var User = sequelize.define('users', {
    userTime: {
        type: DataTypes.TIME
    },
    name: {
        type: DataTypes.STRING,
        allowNull: false
    },
    image: {
        type: DataTypes.STRING,
        allowNull: true
    },
    .... // other fields
}, {
    getterMethods: {
        modifiedUserTime() {
            return moment.utc(this.userTime).format('HH:mm'); // e.g. "12:00"; format however you need
        }
    }
});
Actually, I solved the problem myself; it was way easier than I thought.
userTime: {
type: DataTypes.TIME,
allowNull: true,
get() {
const userTime = this.getDataValue('userTime');
if(userTime == null) return userTime;
const time = new Date(userTime);
let hours = addZero(time.getUTCHours());
let minutes = addZero(time.getUTCMinutes());
let combinedTime = hours + ":" + minutes;
return combinedTime;
}
}
function addZero(i) {
if (i < 10) {
i = "0" + i;
}
return i;
}
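With that getter in place, reading the attribute returns the formatted string; a quick usage sketch (inside an async function, with the lookup criteria made up for illustration):

// Hypothetical usage of the model with the userTime getter above.
const user = await User.findOne({ where: { id: 1 } });
console.log(user.userTime); // "12:00" rather than "1970-01-01T12:00:00.000Z"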

How to write more than 25 items/rows into Table for DynamoDB?

I am quite new to Amazon DynamoDB. I currently have 20000 rows that I need to add to a table. However, based on what I've read, it seems that I can only write up to 25 rows at a time using BatchWriteItem class with 25 WriteRequests. Is it possible to increase this? How can I write more than 25 rows at a time? It is currently taking about 15 minutes to write all 20000 rows. Thank you.
You can only send up to 25 items in a single BatchWriteItem request, but you can send as many BatchWriteItem requests as you want at one time. Assuming you've provisioned enough write throughput, you should be able to speed things up significantly by splitting those 20k rows between multiple threads/processes/hosts and pushing them to the database in parallel.
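A minimal sketch of that idea with the JavaScript SDK (hedged: the table name and item shape are placeholders, and production code should also retry anything returned in UnprocessedItems):

// Slice the items into groups of 25 (the BatchWriteItem ceiling),
// then fire all the requests at once instead of one after another.
const AWS = require('aws-sdk');
const db = new AWS.DynamoDB.DocumentClient({ region: 'us-east-1' });

async function parallelBatchWrite(table, items) {
  const batches = [];
  for (let i = 0; i < items.length; i += 25) {
    batches.push(items.slice(i, i + 25));
  }
  await Promise.all(
    batches.map((batch) =>
      db.batchWrite({
        RequestItems: { [table]: batch.map((Item) => ({ PutRequest: { Item } })) }
      }).promise()
    )
  );
}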
It may be a bit heavyweight for a dataset this small, but you can use AWS Data Pipeline to ingest data from S3. It basically automates the process of creating a Hadoop cluster to pull your data from S3 and send it to DynamoDB in a bunch of parallel BatchWriteItem requests.
I was looking for some code to do this with the JavaScript SDK. I couldn't find it, so I put it together myself. I hope this helps someone else!
function multiWrite(table, data, cb) {
var AWS = require('aws-sdk');
var db = new AWS.DynamoDB.DocumentClient({region: 'us-east-1'});
// Build the batches
var batches = [];
var current_batch = [];
var item_count = 0;
for(var x in data) {
// Add the item to the current batch
item_count++;
current_batch.push({
PutRequest: {
Item: data[x]
}
});
// If we've added 25 items, add the current batch to the batches array
// and reset it
if(item_count%25 == 0) {
batches.push(current_batch);
current_batch = [];
}
}
// Add the final partial batch if it has any records (a full batch of 25
// was already pushed and reset inside the loop)
if(current_batch.length > 0) batches.push(current_batch);
// Handler for the database operations
var completed_requests = 0;
var errors = false;
function handler(request) {
return function(err, data) {
// Increment the completed requests
completed_requests++;
// Set the errors flag
errors = errors || err;
// Log the error if we got one
if(err) {
console.error(JSON.stringify(err, null, 2));
console.error("Request that caused database error:");
console.error(JSON.stringify(request, null, 2));
}
// Make the callback if we've completed all the requests
if(completed_requests == batches.length) {
cb(errors);
}
}
}
// Make the requests
var params;
for(var x in batches) {
// Items go in params.RequestItems.id array
// Format for the items is {PutRequest: {Item: ITEM_OBJECT}}
params = '{"RequestItems": {"' + table + '": []}}';
params = JSON.parse(params);
params.RequestItems[table] = batches[x];
// Perform the batchWrite operation
db.batchWrite(params, handler(params));
}
}
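A quick usage sketch for the function above (the table name and items array are placeholders):

// Hypothetical call to the multiWrite helper defined above.
multiWrite('my-table', arrayOfItems, function (errors) {
  if (errors) console.error('one or more batches failed');
  else console.log('all batches written');
});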
// Assumes lodash (_), the async library, a configured DocumentClient
// (docClient) and a TABLES map are in scope.
function putInHistory(data, cb) {
var arrayOfArray25 = _.chunk(data, 25); // split into chunks of 25
async.every(arrayOfArray25, function(arrayOf25, callback) {
var params = {
RequestItems: {
[TABLES.historyTable]: []
}
};
arrayOf25.forEach(function(item){
params.RequestItems[TABLES.historyTable].push({
PutRequest: {
Item: item
}
})
});
docClient.batchWrite(params, function(err, data) {
if (err){
console.log(err);
callback(err);
} else {
console.log(data);
callback(null, true);
};
});
}, function(err, result) {
if(err){
cb(err);
} else {
if(result){
cb(null,{allWritten:true});
} else {
cb(null,{allWritten:false});
}
}
});
}
You can use lodash to make chunks of 25 elements from the array and then use the async library's each/every method to do a batchWrite on each chunk, as shown above.
Using the aws cli and aws-vault, this is what I do.
Let's imagine you have the following file (data.json) with 1000 rows:
{ "PutRequest": { "Item": { "PKey": { "S": "1" }, "SKey": { "S": "A" }}}},
{ "PutRequest": { "Item": { "PKey": { "S": "2" }, "SKey": { "S": "B" }}}},
{ "PutRequest": { "Item": { "PKey": { "S": "3" }, "SKey": { "S": "C" }}}},
... to 1000
and you need to split it into chunk files with 25 rows each.
I use the following C# code in LINQPad to generate the .sh file and JSON chunks so they can be inserted into DynamoDB using the aws cli:
void Main()
{
var sourcePath = @"D:\data\whereYourMainJsonFileIsLocated\";
var sourceFilePath = @"data.json";
var awsVaultProfileName = "dev";
var env = "dev";
var tableName = "dynamodb-table-name";
var lines = System.IO.File.ReadAllLines(sourcePath + sourceFilePath);
var destinationPath = Path.Combine(sourcePath, env);
var destinationChunkPath = Path.Combine(sourcePath, env, "chunks");
if (!System.IO.Directory.Exists(destinationChunkPath))
System.IO.Directory.CreateDirectory(destinationChunkPath);
System.Text.StringBuilder shString= new System.Text.StringBuilder();
for (int i = 0; i < lines.Count(); i = i+25)
{
var pagedLines = lines.Skip(i).Take(25).ToList().Distinct().ToList();
System.Text.StringBuilder sb = new System.Text.StringBuilder();
sb.AppendLine("{");
sb.AppendLine($" \"{tableName}\": [");
foreach (var element in pagedLines)
{
if (element == pagedLines.Last())
sb.AppendLine(element.Substring(0, element.Length-1));
else
sb.AppendLine(element);
}
sb.AppendLine("]");
sb.AppendLine("}");
var fileName = $"chunk{i / 25}.json";
System.IO.File.WriteAllText(Path.Combine(destinationChunkPath, fileName), sb.ToString(), Encoding.Default);
shString.AppendLine($@"aws-vault.exe exec {awsVaultProfileName} -- aws dynamodb batch-write-item --request-items file://chunks/{fileName}");
}
System.IO.File.WriteAllText(Path.Combine(destinationPath, $"{tableName}-{env}.sh"), shString.ToString(), Encoding.Default);
}
The result is a set of chunk files named chunk0.json, chunk1.json, etc.:
{
"dynamodb-table-name": [
{ "PutRequest": { "Item": { "PKey": { "S": "1" }, "SKey": { "S": "A" }}}},
{ "PutRequest": { "Item": { "PKey": { "S": "2" }, "SKey": { "S": "B" }}}},
{ "PutRequest": { "Item": { "PKey": { "S": "3" }, "SKey": { "S": "C" }}}}
]
}
and a .sh file:
aws-vault.exe exec dev -- aws dynamodb batch-write-item --request-items file://chunks/chunk0.json
aws-vault.exe exec dev -- aws dynamodb batch-write-item --request-items file://chunks/chunk1.json
aws-vault.exe exec dev -- aws dynamodb batch-write-item --request-items file://chunks/chunk2.json
Finally, just run the .sh file and you have all the data in your table.
Building on the answer from @Geerek, here is the solution as a Lambda function:
exports.handler = (event, context, callback) => {
console.log(`EVENT: ${JSON.stringify(event)}`);
var AWS = require('aws-sdk');
AWS.config.update({ region: process.env.REGION })
var docClient = new AWS.DynamoDB.DocumentClient();
const { data, table } = event // note: a JSON event cannot carry a callback function
// Build the batches
var batches = [];
var current_batch = [];
var item_count = 0;
for (var i = 0; i < data.length; i++) {
// Add the item to the current batch
item_count++
current_batch.push({
PutRequest: {
Item: data[i],
},
})
// If we've added 25 items, add the current batch to the batches array
// and reset it
if (item_count % 25 === 0) {
batches.push(current_batch)
current_batch = []
}
}
// Add the last batch if it has records and is not equal to 25
if (current_batch.length > 0 && current_batch.length !== 25) {
batches.push(current_batch)
}
// Handler for the database operations
var completed_requests = 0
var errors = false
function handler (request) {
console.log('in the handler: ', request)
return function (err, data) {
// Increment the completed requests
completed_requests++;
// Remember whether any request failed
errors = errors || err;
// Log the error if we got one
if(err) {
console.error(JSON.stringify(err, null, 2));
console.error("Request that caused database error:");
console.error(JSON.stringify(request, null, 2));
}
// Invoke the Lambda callback exactly once, after the last request finishes
// (calling it per batch would end the invocation after the first response)
if(completed_requests === batches.length) {
callback(errors || null, 'all batches processed');
}
}
}
// Make the requests
var params;
for (var j = 0; j < batches.length; j++) {
// Items go in params.RequestItems.id array
// Format for the items is {PutRequest: {Item: ITEM_OBJECT}}
params = '{"RequestItems": {"' + table + '": []}}'
params = JSON.parse(params)
params.RequestItems[table] = batches[j]
console.log('before db.batchWrite: ', params)
// Perform the batchWrite operation
docClient.batchWrite(params, handler(params))
}
};
I wrote an npm package that should work as a simple drop-in replacement for the batchWrite method; you just need to pass the DynamoDB instance as the first parameter and things should work:
https://www.npmjs.com/package/batch-write-all
Check the example in the project readme file:
// Use the call below instead of: dynamodb.batchWrite(params).promise();
batchWriteAll(dynamodb, params).promise();
const { dynamoClient } = require("./resources/db");
const { v4: uuid } = require("uuid");
const batchWriteLooper = async () => {
let array = [];
for (let i = 0; i < 2000; i++) {
array.push({
PutRequest: {
Item: {
personId: uuid(),
name: `Person ${i}`,
age: Math.floor(Math.random() * 100),
gender: "Male",
createdAt: new Date().toISOString(), // DynamoDB has no native Date type; store ISO strings
updatedAt: new Date().toISOString(),
},
},
});
}
var perChunk = 20; // items per chunk, safely under the 25-item batch limit
var result = array.reduce((resultArray, item, index) => {
const chunkIndex = Math.floor(index / perChunk);
if (!resultArray[chunkIndex]) {
resultArray[chunkIndex] = []; // start a new chunk
}
resultArray[chunkIndex].push(item);
return resultArray;
}, []);
// return the promise so callers can await completion
return Promise.all(
result.map(async (chunk) => {
const params = {
RequestItems: {
"persons": chunk,
},
};
return await dynamoClient.batchWrite(params).promise();
})
).then(() => {
console.log("done");
});
};
batchWriteLooper();

Firebase.util - similar intersections with completely different results

I'm working on an Angular project that uses Firebase as its sole backend, AngularFire for some synchronisation cases, and the Firebase.util tool for dealing with shared resources. My case is this:
{
  users : {
    user1 : {
      tasks : {
        active : {
          task1 : true,
          task2 : true
        },
        archived : {
          task3 : true,
          task4 : true
        }
      },
      ...
    },
    ...
  },
  tasks : {
    task1 : {
      users : {
        user1 : true,
        user2 : true
      }
    },
    ...
  }
}
and I'm dealing with the query like this:
var tasksRef = new $window.Firebase(FIREBASE_URL + '/tasks');
function _userActiveTasksRef(userId) {
return new $window.Firebase(FIREBASE_URL + '/users/' + userId + '/tasks/active');
}
function _userArchivedTasksRef(userId) {
return new $window.Firebase(FIREBASE_URL + '/users/' + userId + '/tasks/archived');
}
function getActive(userId) {
var interRef = $window.Firebase.util.intersection(_userActiveTasksRef(userId), tasksRef);
return $firebase(interRef).$asArray();
}
function getArchived(userId) {
var interRef = $window.Firebase.util.intersection(_userArchivedTasksRef(userId), tasksRef);
return $firebase(interRef).$asArray();
}
In the first case, when I intersect the active tasks with the "all tasks" ref, everything works fine, but when I try to perform the same operation with the archived tasks, the intersection is always empty. I've already logged the individual queries and everything is working as expected; only the intersection doesn't seem to work. Is there any caveat that I'm missing? The two queries are being loaded at the same time, if that matters. The result is being stored in a controller like this:
this.tasks = {
active: tasks.getActive(currentUser.uid),
archived: tasks.getArchived(currentUser.uid)
};

How can I make a grid with checkboxes that maintain state between page loads?

How can we make checkboxes remain checked when the page is refreshed in a Sencha ExtJS 3.3.0 GridPanel?
I have a GridPanel which displays some information with checkboxes. When the page is refreshed, the checkbox should still be checked.
Any suggestions, ideas, or code samples?
I had the same problem and fixed it by manually saving the IDs of the selected records in cookies. The solution is not beautiful, but it works for me.
store.on({
'beforeload': function () {
var checkeditems = [];
for(var i=0;i<gridResources.selModel.selected.length;i++) {
checkeditems.push(gridResources.selModel.selected.items[i].data.ID);
}
if(checkeditems.length>0)
setCookie("RDCHECKBOXES", checkeditems.join("|"));
},
'load': function () {
if (getCookie("RDCHECKBOXES")) {
var checkeditems = getCookie("RDCHECKBOXES").split("|");
for (var i = 0; i<gridResources.store.data.items.length && checkeditems.length>0; i++) {
for(var j=0;j<checkeditems.length;j++) {
if (gridResources.store.data.items[i].data.ID == checkeditems[j]) {
gridResources.selModel.select(gridResources.store.data.items[i], true);
checkeditems.splice(j, 1);
break;
}
}
}
}
}
});
Here is the code for the getCookie() and setCookie() functions:
// Example:
// setCookie("foo", "bar", "Mon, 01-Jan-2001 00:00:00 GMT", "/");
function setCookie (name, value, expires, path, domain, secure) {
document.cookie = name + "=" + escape(value) +
((expires) ? "; expires=" + expires : "") +
((path) ? "; path=" + path : "") +
((domain) ? "; domain=" + domain : "") +
((secure) ? "; secure" : "");
}
// Example:
// myVar = getCookie("foo");
function getCookie(name) {
var cookie = " " + document.cookie;
var search = " " + name + "=";
var setStr = null;
var offset = 0;
var end = 0;
if (cookie.length > 0) {
offset = cookie.indexOf(search);
if (offset != -1) {
offset += search.length;
end = cookie.indexOf(";", offset)
if (end == -1) {
end = cookie.length;
}
setStr = unescape(cookie.substring(offset, end));
}
}
return(setStr);
}
Have you looked at all at the ExtJS documentation or the included samples? There's a sample grid using the CheckColumn extension that does exactly what you ask.
In the example linked, take note that the checkbox column is linked to a boolean record field
// in your record
{name: 'indoor', type: 'bool'}
and represented in the grid's column model by a CheckColumn:
// in the grid's column model
xtype: 'checkcolumn',
header: 'Indoor?',
dataIndex: 'indoor',
width: 55
This way, when boolean data comes into the store from the server in JSON or XML, the values are represented as checkboxes in the grid. As long as you write your changes to the server, your checkbox boolean values will be preserved.
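For example, a store wired with a writer will push each checkbox edit straight back to the server (a hedged ExtJS 3.x sketch; the URL, root, and field names are placeholders, not from the question):

// Hedged ExtJS 3.x sketch: autoSave plus a JsonWriter persist checkbox edits.
var store = new Ext.data.JsonStore({
    url: '/plants/save-load',                    // placeholder endpoint
    root: 'records',
    idProperty: 'id',
    fields: [{ name: 'indoor', type: 'bool' }],  // the checkcolumn field
    writer: new Ext.data.JsonWriter({ encode: true }),
    autoSave: true                               // write each edit immediately
});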

Sorting in grid panel

var store = new FMP.AspNetJsonStore({
fields: [
{ name: 'AssetID' },
{ name: 'AssociationID' },
{ name: 'Image' },
{ name: 'StatusName' },
{ name: 'ModelName' },
{ name: 'IPAddress' },
{ name: 'InScope', type: 'boolean' },
{ name: 'ServicePlanName' },
{ name: 'PricePlanName' },
{ name: 'PricePlanDescription' },
{ name: 'Program' },
{ name: 'ServicePlanID' },
{ name: 'Customer' },
{ name: 'Black', type: 'float' },
{ name: 'Cyan', type: 'float' },
{ name: 'Magenta', type: 'float' },
{ name: 'Yellow', type: 'float' },
{ name: 'BlackPct' },
{ name: 'CyanPct' },
{ name: 'MagentaPct' },
{ name: 'YellowPct' },
{ name: 'PrinterMarkerSupplies' },
{ name: 'PageCount' },
{ name: 'BlackImpressions' },
{ name: 'ColorImpressions' },
{ name: 'PricePlanID' },
{ name: 'ResponsibilityForAction' },
{ name: 'PrinterSerialNumber' }
],
totalProperty: "TotalCount",
autoLoad: { params: { start: 0, limit: myPageSize} },
//autoLoad: true,
proxy: new Ext.data.HttpProxy({
// Call web service method using GET syntax
url: 'GetPrintersGrid.asmx/buildGrid',
// Ask for Json response
headers: { 'Content-type': 'application/json' },
method: "GET"
}),
remoteSort: true,
//sortInfo: { field: 'PageCount', direction: "DESC" },
groupField: 'Customer',
root: 'Records'
});
store.setDefaultSort('PageCount', 'DESC');
I am using a web service to do the sorting, and I am getting this error:
{"Message":"Invalid JSON primitive: DESC.","StackTrace":" at System.Web.Script.Serialization.JavaScriptObjectDeserializer.DeserializePrimitiveObject()\r\n at System.Web.Script.Serialization.JavaScriptObjectDeserializer.DeserializeInternal(Int32 depth)\r\n at System.Web.Script.Serialization.JavaScriptObjectDeserializer.BasicDeserialize(String input, Int32 depthLimit, JavaScriptSerializer serializer)\r\n at System.Web.Script.Serialization.JavaScriptSerializer.Deserialize(JavaScriptSerializer serializer, String input, Type type, Int32 depthLimit)\r\n at System.Web.Script.Services.RestHandler.GetRawParamsFromGetRequest(HttpContext context, JavaScriptSerializer serializer, WebServiceMethodData methodData)\r\n at System.Web.Script.Services.RestHandler.GetRawParams(WebServiceMethodData methodData, HttpContext context)\r\n at System.Web.Script.Services.RestHandler.ExecuteWebServiceCall(HttpContext context, WebServiceMethodData methodData)","ExceptionType":"System.ArgumentException"}
Can anyone help me with this issue?
I am using the Ext.ux.AspWebServiceProxy class as the proxy in the store, and the web service is also defined in the user control's ScriptManagerProxy. I am getting an error saying GetPrintersGrid is undefined. I am using the following example for reference:
http://osman.in/aspnet/using-extjs-grid-with-aspnet-ajax-wcf-webservices-c/
/// <reference path="ExtJS/ext-all.js" />
Ext.namespace('Ext.ux');
Ext.ux.AspWebServiceProxy = function(conn)
{
Ext.ux.AspWebServiceProxy.superclass.constructor.call(this);
Ext.apply(this, conn);
};
Ext.extend(Ext.ux.AspWebServiceProxy, Ext.data.DataProxy,
{
load : function (params, reader, callback, scope, arg)
{
var userContext = {
callback: callback,
reader: reader,
arg: arg,
scope: scope
};
var proxyWrapper = this;
//Handles the response we get back from the web service call
var webServiceCallback = function(response, context, methodName)
{
proxyWrapper.loadResponse(response, userContext, methodName);
}
var serviceParams = [];
//Convert the params into an array of values so that they can be used in the call (note assumes that the properties on the object are in the correct order)
for (var property in params)
{
serviceParams.push(params[property]);
}
//Add the webservice callback handlers
serviceParams.push(webServiceCallback);
serviceParams.push(this.handleErrorResponse);
//Make the actual ASP.Net web service call
this.webServiceProxyMethod.apply(this.webServiceProxy, serviceParams);
},
handleErrorResponse : function(response, userContext, methodName)
{
alert("Error while calling method: " + methodName + "\n" + response.get_message());
},
loadResponse : function (response, userContext, methodName)
{
var result = userContext.reader.readRecords(response);
userContext.callback.call(userContext.scope, result, userContext.arg, true);
}
});
var dataStore = new Ext.data.Store(
{
//Note that I have renamed the web service proxy class
proxy: new Ext.ux.AspWebServiceProxy(
{
webServiceProxy: GetPrintersGrid,
webServiceProxyMethod: GetPrintersGrid.buildGrid
}),
remoteSort: true
});
<asp:ScriptManagerProxy ID="PageScriptManager" runat="server">
<Services>
<asp:ServiceReference Path="~/GetPrintersGrid.asmx" />
</Services>
<Scripts>
<asp:ScriptReference Path="~/Ext.ux.AspWebServiceProxy.js" />
</Scripts>
</asp:ScriptManagerProxy>
This is the source code I used:
FMP.AspNetJsonReader = Ext.extend(Ext.data.JsonReader, {
read: function(response) {
// Assuming ASP.NET encoding - Data is stored as
var json = response.responseText;
var o = Ext.decode(json);
if (!o) {
throw { message: "AspNetJsonReader.read: Json object not found" };
}
if (!o.d) {
throw { message: "AspNetJsonReader.read: Root element d not found" };
}
return this.readRecords(o.d);
}
});
FMP.AspNetJsonStore = Ext.extend(Ext.data.GroupingStore, {
/**
* @cfg {Ext.data.DataReader} reader @hide
*/
constructor: function(config) {
FMP.AspNetJsonStore.superclass.constructor.call(this, Ext.apply(config, {
reader: new FMP.AspNetJsonReader(config)
}));
}
});
I am using ASP.NET on the server side.
Here is my web service:
public PagedResult<FMPAsset> buildGrid(int start, int limit, string sortfield, string dir)
{
var a=5;
Guid AccountID = (Guid)Session["AccountID"];
//string sortdir;
//if( dir == "DESC")
//{
// sortdir = dir.Substring(0, 4).Trim().ToUpper();
//}
//else
//{
// sortdir = dir.Substring(0, 3).Trim().ToUpper();
//}
string SortExpression = sortfield + " " + (!String.IsNullOrEmpty(dir) ? dir : String.Empty);
//string whereClause = "SELECT value a FROM XSP_AssetList_V AS a WHERE a.AccountID = GUID'" + AccountID + "' order by a.PageCount = '" + + "'";
string whereClause = "SELECT value a FROM XSP_AssetList_V AS a WHERE a.AccountID = GUID'" + AccountID + "' Order By a."+SortExpression;
//string whereClause = "SELECT value a , ROW_NUMBER() OVER(ORDER BY" + " " + SortExpression + ") As RowNumber FROM XSP_AssetList_V AS a WHERE a.AccountID = GUID'" + AccountID + "'";
//string whereClause = "SELECT value a FROM XSP_AssetList_V AS a WHERE a.AccountID = GUID'" + AccountID + "'";
List<FMPAsset> fmpAssets = new List<FMPAsset>();
using (XSPAssetModel.XSPAssetEntities assetEntities = new XSPAssetEntities(b.BuildEntityConnectionString1("XSMDSN")))
{
ObjectQuery<XSP_AssetList_V> assets = new ObjectQuery<XSP_AssetList_V>(whereClause, assetEntities);
//var assetOrder = assets.OrderBy(x => x.StatusName).ToList();
var assetPage = assets.Skip(start).Take(limit);
//var totalAssetCount = assets.Count();
currentAssets = assetPage.ToList();
int currentAssetsCount = currentAssets.Count;
string imgprefix = System.Configuration.ConfigurationManager.AppSettings["ImgPrefix"];
char[] separators = { '/' };
string appname = "";
int lastloc = imgprefix.Substring(0, imgprefix.Length - 1).LastIndexOfAny(separators);
if (lastloc > 6)
{
appname = imgprefix.Substring(lastloc + 1);
}
FMPAsset asset = new FMPAsset();
//StreamWriter sw = new StreamWriter("C:\\test.txt");
XSPPrinterMarkerSupplyModel.XSPPrinterMarkerSupplyEntities markerCtx = new XSPPrinterMarkerSupplyModel.XSPPrinterMarkerSupplyEntities(b.BuildEntityConnectionString1("XSMDSN"));
for (int x = 0; x < currentAssetsCount; x++)
{
asset = new FMPAsset();
asset.AssetID = currentAssets[x].AssetID.ToString();
asset.PricePlanID = currentAssets[x].PricePlanID.ToString();
asset.AssociationID = currentAssets[x].AssociationID;
asset.ModelName = currentAssets[x].ModelName;
asset.ResponsibilityForAction = currentAssets[x].ResponsibilityForAction;
asset.IPAddress = (String.IsNullOrEmpty(currentAssets[x].PrinterIPAddress)) ? "No IP" : currentAssets[x].PrinterIPAddress;
if (currentAssets[x].InScope)
{
asset.InScope = b.GetString("SDE_YES");
}
else
{
asset.InScope = b.GetString("SDE_NO");
}
asset = SetStatus(appname, asset, x);
asset.PricePlanName = currentAssets[x].Program;
asset.PricePlanDescription = currentAssets[x].PricePlanDescription;
asset.ServicePlanName = currentAssets[x].ServicePlanName;
if (currentAssets[x].PrinterSerialNumber != null)
{
asset.PrinterSerialNumber = currentAssets[x].PrinterSerialNumber;
}
else
{
asset.PrinterSerialNumber = "-";
}
//sw.WriteLine("ChargebackDescription: " + DateTime.Now.Millisecond);
if (this.b.UseChargebackDescription && !String.IsNullOrEmpty(currentAssets[x].CustomerChargebackDescription) && currentAssets[x].CustomerChargebackDescription != "Generated by OUT Integration")
{
asset.Customer = currentAssets[x].CustomerChargebackDescription;
if (asset.Customer.IndexOf(Environment.NewLine) > -1)
{
asset.Customer = asset.Customer.Substring(0, asset.Customer.IndexOf(Environment.NewLine));
}
}
else
{
asset.Customer = currentAssets[x].CustomerChargeBackEntryName;
}
if (this.b.UsePricePlanDescription && !String.IsNullOrEmpty(currentAssets[x].PricePlanDescription))
{
asset.Program = currentAssets[x].PricePlanDescription;
if (asset.Program.IndexOf(Environment.NewLine) > -1)
{
asset.Program = asset.Program.Substring(0, asset.Program.IndexOf(Environment.NewLine));
}
}
else
{
asset.Program = currentAssets[x].Program;
}
asset.BlackPct = -3;
asset.CyanPct = -3;
asset.MagentaPct = -3;
asset.YellowPct = -3;
Guid id = currentAssets[x].AssetID;
asset = SetCMYKvalues(asset, x);
BuilldImpressionsValues(currentAssets[x], ref asset);
fmpAssets.Add(asset);
}
var totalAssetCount = assets.Count();
var y = new PagedResult<FMPAsset>();
y.Records = fmpAssets;
y.TotalCount = totalAssetCount;
return y;
// CommonGrid1.BindDataSource(SortByStatusName(fmpAssets));
}
}
This error is happening when your store is making the call to your web service. Whatever JSON is being sent is not valid for some reason (or .NET does not think it is), hence the server error when ASP.NET is trying to deserialize the data into a valid argument list for your method. I would first look in Firebug to see exactly what JSON is being passed to the server -- that might give you a clue as to what the issue is. If the JSON being sent is not valid then it's a client/Ext issue -- if it is valid, then it's a .NET issue.
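If the bare DESC string turns out to be the culprit, one hedged workaround is to JSON-encode each argument before the request goes out, since ASP.NET's RestHandler expects every GET parameter to be a JSON literal (strings must be quoted). A sketch against the store above, with parameter names matching the buildGrid signature; treat the wiring as an assumption to verify:

// Hedged sketch: quote the string arguments so ASP.NET can deserialize them.
store.on('beforeload', function (store, options) {
    options.params = options.params || {};
    Ext.apply(options.params, {
        sortfield: Ext.encode(store.sortInfo ? store.sortInfo.field : 'PageCount'),
        dir: Ext.encode(store.sortInfo ? store.sortInfo.direction : 'DESC')
    });
});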
