Visual Studio 2014 deployment error - sql-server

I'm trying to process my very first dimension with Visual Studio 2014 and it's not going smoothly. I did all the checks I could (changed it to a specific Windows account for authentication, checked the data source and provider, made sure that SQL Server is running, ...), but it's still not working. I'm a beginner and the error messages do not make much sense to me yet.
Below are the details I got when I tried to deploy my dimension.
---------------Details------------------------------------
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
<Parallel>
<Process xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:ddl2="http://schemas.microsoft.com/analysisservices/2003/engine/2" xmlns:ddl2_2="http://schemas.microsoft.com/analysisservices/2003/engine/2/2" xmlns:ddl100_100="http://schemas.microsoft.com/analysisservices/2008/engine/100/100" xmlns:ddl200="http://schemas.microsoft.com/analysisservices/2010/engine/200" xmlns:ddl200_200="http://schemas.microsoft.com/analysisservices/2010/engine/200/200" xmlns:ddl300="http://schemas.microsoft.com/analysisservices/2011/engine/300" xmlns:ddl300_300="http://schemas.microsoft.com/analysisservices/2011/engine/300/300" xmlns:ddl400="http://schemas.microsoft.com/analysisservices/2012/engine/400" xmlns:ddl400_400="http://schemas.microsoft.com/analysisservices/2012/engine/400/400">
<Object>
<DatabaseID>MVA_SSAS</DatabaseID>
<DimensionID>Date</DimensionID>
</Object>
<Type>ProcessFull</Type>
<WriteBackTableCreation>UseExisting</WriteBackTableCreation>
</Process>
</Parallel>
</Batch>
Processing Dimension 'Date' completed.
Start time: 22/06/2016 00:26:05; End time: 22/06/2016 00:26:05; Duration: 0:00:00
Processing Dimension Attribute '(All)' completed.
Start time: 22/06/2016 00:26:05; End time: 22/06/2016 00:26:05; Duration: 0:00:00
Processing Dimension Attribute 'Month Number Of Year' completed.
Start time: 22/06/2016 00:26:05; End time: 22/06/2016 00:26:05; Duration: 0:00:00
Errors and Warnings from Response
Internal error: The operation terminated unsuccessfully.
The following system error occurred: There are currently no logon servers available to service the logon request.
Errors in the high-level relational engine. A connection could not be made to the data source with the DataSourceID of 'Adventure Works DW2014', Name of 'Adventure Works DW2014'.
Errors in the OLAP storage engine: An error occurred while the dimension, with the ID of 'Date', Name of 'Date' was being processed.
Errors in the OLAP storage engine: An error occurred while the 'Month Number Of Year' attribute of the 'Date' dimension from the 'MVA_SSAS' database was being processed.
Server: The current operation was cancelled because another operation in the transaction failed.

I solved the problem. It was that I didn't have a password set on my Windows account. I thought this might be useful if someone gets a similar error. :)

I guess you should worry about this:
 
A connection could not be made to the data source with the DataSourceID of 'Adventure Works DW2014', Name of 'Adventure Works DW2014'.
Test that data connection again. Check the impersonation account, and save the password if needed.
 

Related

SSIS 2012 - Package Task Run Time Status

In the table [catalog].[execution_component_phases] there is a column called Phase. The values of the Phase column are:
PreExecute
Validate
ProcessInput
ReleaseConnection
AcquireConnection
Can someone please suggest which value indicates that a specific task in a package is in a running state?
Is there any value which states that a task has started but not completed yet?
Regards
Regards
Based on the official documentation of the [catalog].[execution_component_phases] view:
Displays the time spent by a data flow component in each execution phase.
This view displays a row for each execution phase of a data flow component, such as Validate, Pre-Execute, Post-Execute, PrimeOutput, and ProcessInput. Each row displays the start and end time for a specific execution phase.
Based on my experience, I assume the order of the execution phases is:
AcquireConnection : Acquiring the related connections required
Validate : Validating the Task/Component
Pre-Execute
ProcessInput : Processing phase
PrimeOutput : Generating outputs
Post-Execute
ReleaseConnection : Release acquired connections
In the official documentation they provide the following query to read the time spent in each phase:
use SSISDB
select package_name, task_name, subcomponent_name, execution_path,
SUM(DATEDIFF(ms,start_time,end_time)) as active_time,
DATEDIFF(ms,min(start_time), max(end_time)) as total_time
from catalog.execution_component_phases
where execution_id = 1841
group by package_name, task_name, subcomponent_name, execution_path
order by package_name, task_name, subcomponent_name, execution_path
Based on the information above, you can, for example, check a task's current phase to know whether it is still running or not.
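As a sketch of that check (this is an assumption on my part, since the documentation doesn't state how in-flight phases are recorded), a phase row that has a start_time but no end_time yet can be read as still running:

```sql
-- Assumption: a phase that has started but not yet finished is recorded
-- with a NULL end_time. List components whose current phase appears to
-- be in progress for a given execution.
use SSISDB
select package_name, task_name, subcomponent_name, phase, start_time
from catalog.execution_component_phases
where execution_id = 1841   -- example execution_id, as in the query above
  and end_time is null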
References
catalog.execution_component_phases

Error processing tabular model - From SQL Server Agent

We have a tabular cube; processing the database (full) in SSMS works fine, but when processing from a SQL Server Agent job, it throws the following error.
<return xmlns="urn:schemas-microsoft-com:xml-analysis">
  <root xmlns="urn:schemas-microsoft-com:xml-analysis:empty">
    <Messages xmlns="urn:schemas-microsoft-com:xml-analysis:exception">
      <Warning WarningCode="1092550744" Description="Cannot order ''[] by [] because at least one value in [] has multiple distinct values in []. For example, you can sort [City] by [Region] because there is only one region for each city, but you cannot sort [Region] by [City] because there are multiple cities for each region." Source="Microsoft SQL Server 2016 Analysis Services Managed Code Module" HelpFile="" />
    </Messages>
  </root>
</return>
Here is the script used from the SQL Server Agent job.
{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "DBName"
      }
    ]
  }
}
Can anyone suggest how to eliminate this error or ignore this error/warning?
Thanks,
I had the same issue: tabular model in VS 2015, cube in SSAS. It builds fine when I process the database, but the SQL Server Agent was bringing up this error. A couple of forums had some mention of the error, but no steps for deeper investigation and resolution, which is particularly difficult when the 'Cannot order' part of the message is blank. I opened the model in VS, selected every column in turn, and looked for any sorting operation in either the filter or the 'Sort By Column' button, which is easy to miss. I removed all the sorts and it built fine. Take a note of the ones removed, as you may have a data issue.
Use SQL Server Integration Services (SSIS) for processing: just create a package with an "Analysis Services Processing Task". This task processes the model the same way SSMS does.
The error message correctly explains the problem, but unhelpfully doesn't tell you which attribute is the offending one. I was sorting account names by account number, but because there were a few accounts with the same name but different numbers, I got this same error. Setting keepUniqueRows didn't help.
Removing the offending sortBy fixes the problem when processing with a SQL Server Agent job. What's interesting is that with the sortBy in place, when I processed the model with SSMS the accounts were sorted as expected. This led me to think the SQL Agent job interprets the warning as an error and does a rollback, while SSMS ignores it. The SSIS task probably ignores the warning just like SSMS, so processing succeeds.
Try this,
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Type>ProcessFull</Type>
  <Object>
    <DatabaseID>DBName</DatabaseID>
  </Object>
</Process>
I also faced the same problem. I just changed the type from "full" to "automatic" and it started working.
{
  "refresh": {
    "type": "automatic",
    "objects": [
      {
        "database": "AU MY Model"
      }
    ]
  }
}

ColdFusion 11 SOLR indexing logging conspicuous errors

I am running a scheduled task to refresh my SOLR collection. The coldfusion-out.log file has quite a few of the following errors:
Jan 12, 2016 10:10:21 AM Warning [ajp-bio-8014-exec-5] - WARNING: Could not index D:\PDFs\9310.PDF in SOLR. Check the exception for more details: An error occurred during GETINFO operation in the cfpdf tag.
Jan 12, 2016 10:10:51 AM Warning [ajp-bio-8014-exec-5] - WARNING: Could not index D:\PDFs\9887.PDF in SOLR. Check the exception for more details: An error occurred during the EXTRACTTEXT operation in the cfpdf tag.
The task code to retrieve the file locations:
<cfsetting requesttimeout="3600" />
<cfquery name="getDocs" datasource="#APPLICATION.DataSource#">
    SELECT DISTINCT
        SKU,
        STATUS,
        'D:\PDFs\' CONCAT TRIM(DOCID) CONCAT '.PDF' AS document_file
    FROM MYTABLE
</cfquery>
<cfindex
query="getDocs"
collection="prudocs3"
action="refresh"
type="file"
key="document_file"
custom1="SKU"
custom2="STATUS" />
Any idea what is causing this?
While googling more about this, I came across this post from the wise Raymond Camden:
http://www.raymondcamden.com/2014/07/11/Issue-with-CFINDEX/
Once I read this, I took a closer look at the PDFs being used: they indeed did not have the Extraction permission turned on. I have contacted the vendor about updating their PDFs so that they can be properly indexed and included in search results.

Cannot SQLBulkCopy Error 40197 with %d code of 4815 (Connection Forcibly Closed)

Developing with a VS 2013 ASP.NET MVC 5 web project and a separate Azure-hosted SQL Server database.
At the bottom is all my error information from Visual Studio 2013. I've narrowed down the problem and found a link to Microsoft's description of the problem, but without a solution. I'm developing database-first with Entity Framework 6, ASP.NET 4 MVC and Razor. I connect to a SQL Azure database; I think this is what's falling over. I've already checked the logs for the Azure website, etc.
I have delimited text files (uploaded to APP_DATA) that I load into a DataTable, then use SqlBulkCopy to dump the content into the Azure database. Everything works 100% fine as long as my files contain only a few hundred records, but I need to insert 20 MB files with approximately 200,000 rows. When I try the big files, I get an error at the point where ASP.NET performs the bulk copy. No matter what I set for batch size etc., it bails around the 4,000-row mark every time. I've exhausted all options and I'm at my wits' end; I even tried scaling the Azure database up from Free web to Business, and I tried scaling up the website too. Here is the code:
public void BatchBulkCopy(DataTable dataTable, string DestinationTbl, int batchSize, int identity)
{
    try
    {
        System.Diagnostics.Debug.WriteLine("Start SQL Bulk Copy");
        using (SqlBulkCopy sbc = new SqlBulkCopy("Server=tcp:eumtj4loxy.database.windows.net,1433;Database=AscWaterDB;User ID=HIDDEN#HIDDEN;Password=XXXXXXX;Trusted_Connection=False;Encrypt=True;Connection Timeout=900;", SqlBulkCopyOptions.TableLock))
        {
            sbc.DestinationTableName = DestinationTbl;
            // No timeout on the bulk copy itself (0 = unlimited)
            sbc.BulkCopyTimeout = 0;
            // Number of records to be processed in one go
            sbc.BatchSize = 1000;
            // Add your column mappings here
            sbc.ColumnMappings.Add("D2001_SPID", "SupplyPointId");
            sbc.ColumnMappings.Add("D2002_ServiceCategory", "D2002_ServiceCategory");
            sbc.ColumnMappings.Add("D2025_NotifyDisconnection/Reconnection", "D2025_NotifyDisconnectionReconnection");
            sbc.ColumnMappings.Add("WaterBatchId", "WaterBatchId");
            sbc.ColumnMappings.Add("D2003_Schedule3", "D2003_Schedule3");
            sbc.ColumnMappings.Add("D2004_ExemptCustomerFlag", "D2004_ExemptCustomerFlag");
            sbc.ColumnMappings.Add("D2005_CustomerClassification", "D2005_CustomerClassification");
            sbc.ColumnMappings.Add("D2006_29e", "D2006_29e");
            sbc.ColumnMappings.Add("D2007_LargeVolAgreement", "D2007_LargeVolAgreement");
            sbc.ColumnMappings.Add("D2008_SICCode", "D2008_SICCode");
            sbc.ColumnMappings.Add("D2011_RateableValue", "D2011_RateableValue");
            sbc.ColumnMappings.Add("D2015_SPIDVacant", "D2015_SPIDVacant");
            sbc.ColumnMappings.Add("D2018_TroughsDrinkingBowls", "D2018_TroughsDrinkingBowls");
            sbc.ColumnMappings.Add("D2019_WaterServicesToCaravans", "D2019_WaterServicesToCaravans");
            sbc.ColumnMappings.Add("D2020_OutsideTaps", "D2020_OutsideTaps");
            sbc.ColumnMappings.Add("D2022_TransitionalArrangements", "D2022_TransitionalArrangements");
            sbc.ColumnMappings.Add("D2024_Unmeasurable", "D2024_Unmeasurable");
            sbc.ColumnMappings.Add("D2014_FarmCroft", "D2014_FarmCroft");
            // Finally write to server
            System.Diagnostics.Debug.WriteLine("Write Bulk Copy to Server " + DateTime.Now.ToString());
            sbc.WriteToServer(dataTable); // Fails here when I upload a 20MB CSV with 190,000 rows
            sbc.Close();
        }
        // Ignore this; I don't get to this code unless I'm loading a file that has only a few records
        WaterBatch obj = GetWaterBatch(identity); // Now we can get the WaterBatch
        obj.StopDateTime = DateTime.Now;
        Edit(obj);
        Save();
        System.Diagnostics.Debug.WriteLine("Finished " + DateTime.Now.ToString());
    }
    catch (Exception ex)
    {
        // Walk down to the innermost exception and log it
        Exception ex2 = ex;
        while (ex2.InnerException != null)
        {
            ex2 = ex2.InnerException;
        }
        Console.WriteLine(ex2);
        throw;
    }
}
My $exception says:
$exception {"A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - An existing connection was forcibly closed by the remote host.)"} System.Exception {System.Data.SqlClient.SqlException}
My InnerException is below; drilling into inner exception after inner exception gives the same message, with HResult -2146232060 and then -2147467259:
InnerException {"An existing connection was forcibly closed by the remote host"} System.Exception {System.ComponentModel.Win32Exception}
UPDATED INFO:
Microsoft's explanation of the error is below. I am getting error number 40197; Microsoft says to look for the %d code, which I find to be 4815. The question is: what now? Where can I go from here to get information on a 40197 with a %d of 4815?
I got the following info regarding my error from this link: http://msdn.microsoft.com/en-us/library/windowsazure/ff394106.aspx
40197
17
The service has encountered an error processing your request. Please try again. Error code %d.
You will receive this error, when the service is down due to software or hardware upgrades, hardware failures, or any other failover problems. The error code (%d) embedded within the message of error 40197 provides additional information about the kind of failure or failover that occurred. Some examples of the error codes embedded within the message of error 40197 are 40020, 40143, 40166, and 40540.
Reconnecting to your SQL Database server will automatically connect you to a healthy copy of your database. Your application must catch error 40197, log the embedded error code (%d) within the message for troubleshooting, and try reconnecting to SQL Database until the resources are available, and your connection is established again.
I was getting the exact same error during a bulk insert. In my case, it was a varchar column that was overflowing; I just needed to increase the character limit and the problem was solved.
Just increase the length of the column, even if the value being stored is much smaller than the current size; that worked for me.
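A minimal sketch of that fix, assuming the overflow is in one of the mapped destination columns (the table name and new length here are illustrative, not from the original post):

```sql
-- Hypothetical example: widen the overflowing varchar column on the
-- destination table, then re-run the bulk copy.
ALTER TABLE dbo.WaterSupplyPoints              -- illustrative table name
ALTER COLUMN D2008_SICCode varchar(255) NULL;  -- new length is an example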

Business intelligence Cube and Dimensions

What do I have to do to fix these problems? (I'm a beginner in BI.)
Errors
Error 3 Internal error: The operation terminated unsuccessfully.
Error 4 Errors in the high-level relational engine. The following exception occurred while the managed IDbConnection interface was being used: Login failed for user 'NT Service\MSSQLServerOLAPService'.
Error 5 Errors in the high-level relational engine. A connection could not be made to the data source with the DataSourceID of 'Karol s10016 Rafal s10017', Name of 'Karol s10016 Rafal s10017'.
Error 6 Errors in the OLAP storage engine: An error occurred while the dimension, with the ID of 'Kampanie Marketingowe', Name of 'Kampanie Marketingowe' was being processed.
Error 7 Errors in the OLAP storage engine: An error occurred while the 'Id Kampanie Marketingowe' attribute of the 'Kampanie Marketingowe' dimension from the 'Karol_Rafal_Analysis_Servicies' database was being processed.
Error 8 Server: The current operation was cancelled because another operation in the transaction failed.
According to the error, the SSAS service account (NT Service\MSSQLServerOLAPService) cannot connect to the data source 'Karol s10016 Rafal s10017' when you try to process the cube.
Make sure the SSAS service account can access this data source to prevent the error.
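As a sketch of that step (the source database name below is illustrative; your Data Source connection names the real one), you could create a login for the service account on the source SQL Server and grant it read access:

```sql
-- Hypothetical example: let the SSAS service account read the source database.
CREATE LOGIN [NT Service\MSSQLServerOLAPService] FROM WINDOWS;

USE SourceDW;  -- illustrative: the database the Data Source points at
CREATE USER [NT Service\MSSQLServerOLAPService]
    FOR LOGIN [NT Service\MSSQLServerOLAPService];
ALTER ROLE db_datareader ADD MEMBER [NT Service\MSSQLServerOLAPService];
```

Alternatively, change the data source's impersonation settings to an account that already has access.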

Resources