Spark Tasks Failing Intermittently With Connection Reset Error - sql-server

I am using the Spark dataframe reader to pull data from a SQL Server database, apply some minor changes like column renaming and data type casting, and save the dataframe to S3 in Delta Lake format. The job is triggered from Airflow using LivyOperator.
The code for the dataframe reader is as follows:
val table_source = spark.read
.format("com.microsoft.sqlserver.jdbc.spark")
.option("Driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
.option("dbtable", select_query)
.option("inferschema", "true")
.option("url", ds_url)
.option("user", ds_user)
.option("password", ds_pass)
.option("numPartitions", num_partitions)
.option("partitionColumn", "RNO")
.option("lowerBound", 0)
.option("upperBound", rowCount).load()
Then I create a temporary view on top of this data, add a few more standard columns like client_id, timestamp, etc., and return a dataframe. The dataframe is then saved as Parquet files using the 'delta' format.
table_source.createOrReplaceTempView("table_name")
val table_mapping = spark.sql(mapping_query)
table_mapping
.write.format("delta")
.mode("append")
.save(path)
So, now the issue is that for tables with around 50k rows or more, the job always gets stuck at this 'save' stage. Say I provide numPartitions=8; in the Spark UI I can see that Spark generates 8 tasks for this stage. Some of these tasks finish successfully within a few minutes. The remaining tasks hang for more than 2 hours and then fail with a 'connection reset' error caused by SQLServerException. Spark then retries these tasks; some complete right away, and again some hang for two more hours, and so on until the stage eventually completes.
Note: There is no limit on the maximum concurrent connections on the source server.
In the end, the job takes around 2+, 4+, or 6+ hours to complete. Enabling Spark speculation helped bring the completion time down to about 1 hour, but that is still far too long for the volume of data we are dealing with. For comparison, we tested fetching the same data in the same environment as the EMR cluster using SSIS; it took just 20 minutes to complete.
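For reference, these are roughly the speculation settings we enabled (passed via spark-submit --conf; the interval/multiplier/quantile values shown are Spark's defaults, included only to make the sketch concrete):
spark.speculation=true
spark.speculation.interval=100ms    # how often Spark checks for speculatable tasks
spark.speculation.multiplier=1.5    # how many times slower than the median a task must be
spark.speculation.quantile=0.75     # fraction of tasks that must finish before speculation kicks in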
When the tasks are stuck, we observed the following thread lock in the executor thread dump:
Thread ID: ##
Thread Name: Executor task launch worker for task 1.0 in stage 17.0 (TID 17)
Thread State: RUNNABLE
Thread Locks: Lock(java.util.concurrent.ThreadPoolExecutor$Worker@881566968),
Monitor(com.microsoft.sqlserver.jdbc.TDSReader@1700342603)
When I expand this thread, I see the following trace.
> java.net.SocketInputStream.socketRead0(Native Method)
> java.net.SocketInputStream.socketRead(SocketInputStream.java:116)
> java.net.SocketInputStream.read(SocketInputStream.java:171)
> java.net.SocketInputStream.read(SocketInputStream.java:141)
> com.microsoft.sqlserver.jdbc.TDSChannel.read(IOBuffer.java:1819)
> com.microsoft.sqlserver.jdbc.TDSReader.readPacket(IOBuffer.java:5461)
> => holding Monitor(com.microsoft.sqlserver.jdbc.TDSReader@648027762)
> com.microsoft.sqlserver.jdbc.TDSReader.nextPacket(IOBuffer.java:5371)
> com.microsoft.sqlserver.jdbc.TDSReader.ensurePayload(IOBuffer.java:5347)
> com.microsoft.sqlserver.jdbc.TDSReader.readBytes(IOBuffer.java:5640)
> com.microsoft.sqlserver.jdbc.TDSReader.readWrappedBytes(IOBuffer.java:5662)
> com.microsoft.sqlserver.jdbc.TDSReader.readInt(IOBuffer.java:5603)
> com.microsoft.sqlserver.jdbc.TDSReader.readUnsignedInt(IOBuffer.java:5620)
> com.microsoft.sqlserver.jdbc.PLPInputStream.readBytesInternal(PLPInputStream.java:313)
> com.microsoft.sqlserver.jdbc.PLPInputStream.getBytes(PLPInputStream.java:129)
> com.microsoft.sqlserver.jdbc.DDC.convertStreamToObject(DDC.java:438)
> com.microsoft.sqlserver.jdbc.ServerDTVImpl.getValue(dtv.java:2965)
> com.microsoft.sqlserver.jdbc.DTV.getValue(dtv.java:206)
> com.microsoft.sqlserver.jdbc.Column.getValue(Column.java:130)
> com.microsoft.sqlserver.jdbc.SQLServerResultSet.getValue(SQLServerResultSet.java:2087)
> com.microsoft.sqlserver.jdbc.SQLServerResultSet.getValue(SQLServerResultSet.java:2072)
> com.microsoft.sqlserver.jdbc.SQLServerResultSet.getString(SQLServerResultSet.java:2413)
> org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.$anonfun$makeGetter$12(JdbcUtils.scala:444)
> org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.$anonfun$makeGetter$12$adapted(JdbcUtils.scala:442)
> org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$$Lambda$1086/1697796400.apply(Unknown Source)
> org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anon$1.getNext(JdbcUtils.scala:352)
> org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anon$1.getNext(JdbcUtils.scala:334)
> org.apache.spark.util.NextIterator.hasNext(NextIterator.scala:73)
> org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
> org.apache.spark.util.CompletionIterator.hasNext(CompletionIterator.scala:31)
> org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
> org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:35)
> org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:832)
> org.apache.spark.sql.execution.datasources.FileFormatWriter$.$anonfun$executeTask$1(FileFormatWriter.scala:277)
> org.apache.spark.sql.execution.datasources.FileFormatWriter$$$Lambda$1243/1672950527.apply(Unknown Source)
> org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1473)
> org.apache.spark.sql.execution.datasources.FileFormatWriter$.executeTask(FileFormatWriter.scala:286)
> org.apache.spark.sql.execution.datasources.FileFormatWriter$.$anonfun$write$15(FileFormatWriter.scala:210)
> org.apache.spark.sql.execution.datasources.FileFormatWriter$$$Lambda$1085/1036621064.apply(Unknown Source)
> org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
> org.apache.spark.scheduler.Task.run(Task.scala:131)
> org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:497)
> org.apache.spark.executor.Executor$TaskRunner$$Lambda$465/565856309.apply(Unknown Source)
> org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1439)
> org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:500)
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> java.lang.Thread.run(Thread.java:750)
> Version Information
> EMR: 6.4.0
> Spark: 3.1.2
> mssql-jdbc: 9.2.1.jre8
> spark-mssql-connector_2.12: 1.2.0
> delta-core: 1.0.0
We have tried setting QueryTimeout on the Spark JDBC reader, to no avail. We tried increasing executor and driver cores/memory, and also using dynamic allocation, but ended up with the same results. We also tried removing the partitioning options; same issue. We have been at this for weeks; I would greatly appreciate any pointers on solving this problem.
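For completeness, the timeout was set on the reader roughly like this (queryTimeout is the standard Spark JDBC reader option, in seconds; the value here is illustrative):
.option("queryTimeout", 600) // seconds the driver waits for a statement to execute; 0 means no limit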

Related

Why does sys.sp_describe_first_result_set return 1 on the very first time only, and not always?

An SSIS package that imports CSV files has been configured with SQL Server Agent to run every 2 minutes.
We are experiencing a suspended SQL process with result sys.sp_describe_first_result_set;1.
When: ONLY the very first time (when the Azure VM is created), with more than one CSV file being imported.
When not:
If any single CSV file import has happened before.
Once point 1 has been satisfied, we can rapidly import any number of CSV files (no SQL suspend).
SQL process suspended with sys.sp_describe_first_result_set returning 1.
Details of the suspension:
Blkby = -2 = orphaned distributed transaction
Suspended: the session is waiting for an event to complete
Command: Execute
The issue was resolved by removing <DTS:TransactionOption = "2" from the SSIS package.
It was not properly implemented in my case; see this link for more details:
https://learn.microsoft.com/en-us/sql/integration-services/integration-services-transactions?view=sql-server-ver16
The TransactionOption values are:
NotSupported: 0
Supported: 1
Required: 2
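For anyone looking for where that attribute lives: it sits in the package's .dtsx XML. A rough, hand-abbreviated sketch (not copied from a real package; surrounding attributes elided):
<DTS:Executable DTS:ObjectName="Package"
    DTS:TransactionOption="2">  <!-- 2 = Required; removing it falls back to the default, Supported (1) -->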

NiFi connection to SqlServer for ExecuteSQL

I'm trying to import some data from different SqlServer databases using ExecuteSQL in NiFi, but it's returning an error. I've already imported a lot of other tables from MySQL databases without any problem, and I'm trying to use the same workflow structure for the SqlServer dbs.
The structure is as follows:
There's a .txt file with the list of tables to be imported.
This file is fetched, split, and updated, so there's a FlowFile for each table of each db that has to be imported.
These FlowFiles are passed into ExecuteSQL, which executes their contents.
For example:
file.txt
table1
table2
table3
is turned into 3 different FlowFiles:
FlowFile1
SELECT * FROM table1
FlowFile2
SELECT * FROM table2
FlowFile3
SELECT * FROM table3
which are passed to ExecuteSQL.
Here is the configuration of ExecuteSQL (identical for SqlServer tables and MySQL ones):
[ExecuteSQL processor configuration screenshot]
As the only difference from the MySQL imports lies in the connectors, this is how a generic MySQL connector has been configured:
SETTINGS / PROPERTIES
Database Connection URL: jdbc:mysql://00.00.00.00/DataBase?zeroDateTimeBehavior=convertToNull&autoReconnect=true
Database Driver Class Name: com.mysql.jdbc.Driver
Database Driver Location(s): file:///path/mysql-connector-java-5.1.47-bin.jar
Database User: user
Password: Sensitive value set
Max Wait Time: 500 millis
Max Total Connections: 8
Validation query: No value set
And this is how a SqlServer connector has been configured:
SETTINGS / PROPERTIES
Database Connection URL: jdbc:jtds:sqlserver://00.00.00.00/DataBase;useNTLMv2=true;integratedSecurity=true;
Database Driver Class Name: net.sourceforge.jtds.jdbc.Driver
Database Driver Location(s): /path/connectors/jtds-1.3.1.jar
Database User: user
Password: Sensitive value set
Max Wait Time: -1
Max Total Connections: 8
Validation query: No value set
It should be noted that one (and only one!) SqlServer connector works, and the ExecuteSQL processor imports its data without any problem. The even stranger thing is that the database reached via this connector is located in the same place as two others (the connection URL and user/psw are identical), but only the first one works.
Note that I've also tried appending ?zeroDateTimeBehavior=convertToNull&autoReconnect=true to the SqlServer connections, supposing it was a problem with date types, but it didn't bring any positive change.
Here is the error that is being returned:
12:02:46 CEST ERROR f1553b83-a173-1c0f-93cb-1c32f0f46d1d
00.00.00.00:0000 ExecuteSQL[id=****] ExecuteSQL[id=****] failed to process session due to null; Processor Administratively Yielded for 1 sec: java.lang.AbstractMethodError
Error retrieved from logs:
ERROR [Timer-Driven Process Thread-49] o.a.nifi.processors.standard.ExecuteSQL ExecuteSQL[id=****] ExecuteSQL[id=****] failed to process session due to java.lang.AbstractMethodError; Processor Administratively Yielded for 1 sec: java.lang.AbstractMethodError
java.lang.AbstractMethodError: null
at net.sourceforge.jtds.jdbc.JtdsConnection.isValid(JtdsConnection.java:2833)
at org.apache.commons.dbcp2.DelegatingConnection.isValid(DelegatingConnection.java:874)
at org.apache.commons.dbcp2.PoolableConnection.validate(PoolableConnection.java:270)
at org.apache.commons.dbcp2.PoolableConnectionFactory.validateConnection(PoolableConnectionFactory.java:389)
at org.apache.commons.dbcp2.BasicDataSource.validateConnectionFactory(BasicDataSource.java:2398)
at org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:2381)
at org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:2110)
at org.apache.commons.dbcp2.BasicDataSource.getConnection(BasicDataSource.java:1563)
at org.apache.nifi.dbcp.DBCPConnectionPool.getConnection(DBCPConnectionPool.java:305)
at org.apache.nifi.dbcp.DBCPService.getConnection(DBCPService.java:49)
at sun.reflect.GeneratedMethodAccessor1696.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:84)
at com.sun.proxy.$Proxy449.getConnection(Unknown Source)
at org.apache.nifi.processors.standard.AbstractExecuteSQL.onTrigger(AbstractExecuteSQL.java:195)
at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1165)
at org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:203)
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

How to specify which database to query from the BaseX console?

How do I switch between databases? Or, more specifically, if multiple databases are open, how would I specify which database to run a query against?
thufir@dur:~/basex$
thufir@dur:~/basex$ basex
[warning] /usr/bin/basex: Unable to locate /usr/share/java/jing.jar in /usr/share/java
BaseX 9.0.1 [Standalone]
Try 'help' to get more information.
>
> LIST
Name Resources Size Input Path
----------------------------------------------------------
books99 1 61253 /home/thufir/basex/db.books.xml
foo 1 61253 /home/thufir/basex/db.books.xml
new 1 61253 /home/thufir/basex/db.books.xml
3 database(s).
>
> OPEN foo
Database 'foo' was opened in 72.11 ms.
>
> OPEN new
Database 'new' was opened in 16.43 ms.
>
> CLOSE foo
Stopped at , 1/6:
Syntax: CLOSE
Close current database.
Closes the currently opened database.
>
> CLOSE
Database 'new' was closed.
>
> exit
Enjoy life.
thufir@dur:~/basex$
Mostly I'm just running XQUERY / from the BaseX console at the moment, establishing the existence of data.
Perhaps not a complete answer, but to run a query against a specific database, open that database and run the query:
thufir@dur:~/basex$
thufir@dur:~/basex$ basex
[warning] /usr/bin/basex: Unable to locate /usr/share/java/jing.jar in /usr/share/java
BaseX 9.0.1 [Standalone]
Try 'help' to get more information.
>
> LIST
Name Resources Size Input Path
----------------------------------------------------------------
books 1 61253 /home/thufir/basex/db.books.xml
bookstore 1 6164 /home/thufir/basex/db.bookstore.xml
2 database(s).
>
> OPEN books
Database 'books' was opened in 67.74 ms.
>
> XQUERY /bookstore/book/title
Query executed in 217.72 ms.
>
> OPEN bookstore
Database 'bookstore' was opened in 2.2 ms.
>
> XQUERY /bookstore/book/title
<title lang="en">Everyday Italian</title>
<title lang="en">Harry Potter</title>
<title lang="en">XQuery Kick Start</title>
<title lang="en">Learning XML</title>
Query executed in 6.08 ms.
>
> OPEN books
Database 'books' was opened in 2.48 ms.
>
> XQUERY /bookstore/book/title
Query executed in 1.09 ms.
>
> exit
Have a nice day.
thufir@dur:~/basex$
Notably, I never closed a database. I suppose the context is whichever database was most recently opened?
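It may also be worth noting that a query can name its target database explicitly with the db module's db:open function, so the result doesn't depend on which database was opened last; a sketch in the same console style:
> XQUERY db:open("bookstore")/bookstore/book/title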

Cannot create Team Foundation 2012 project?

I restored TFS 2012 from an old backup after a new installation. Everything works just fine, but I cannot create a new project, due to the following error. I think there is wrong credential info somewhere in the TFS or ReportServer databases that doesn't match the new Report credentials.
Does anybody know where to change this info?
http://abc:7979/ReportServer -> ok
http://abc:7979/Reports -> ok
> Module: Engine Event Description: TF30162: Task "Populate Reports"
> from Group "Reporting" failed Exception Type:
> Microsoft.TeamFoundation.Client.PcwException Exception Message: The
> Project Creation Wizard encountered an error while creating reports to
> the SQL Server Reporting Services on
> http://server:7979/ReportServer/ReportService2005.asmx. Exception
> Details: The Project Creation Wizard encountered a problem while
> creating reports on the SQL Server Reporting Services on
> http://server:7979/ReportServer/ReportService2005.asmx. The reason
> for the failure cannot be determined at this time. Because the
> operation failed, the wizard was not able to finish creating the SQL
> Server Reporting Services site. Stack Trace: at
> Microsoft.VisualStudio.TeamFoundation.PCW.RosettaReportUploader.Execute(ProjectCreationContext
> context, XmlNode taskXml) at
> Microsoft.VisualStudio.TeamFoundation.PCW.ProjectCreationEngine.TaskExecutor.PerformTask(IProjectComponentCreator
> componentCreator, ProjectCreationContext context, XmlNode taskXml)
> at
> Microsoft.VisualStudio.TeamFoundation.PCW.ProjectCreationEngine.RunTask(Object
> taskObj)
> -- Inner Exception -- Exception Message: TF30225: Error uploading report 'Backlog Overview':
> System.Web.Services.Protocols.SoapException: The current action cannot
> be completed. The user data source credentials do not meet the
> requirements to run this report or shared dataset. Either the user
> data source credentials are not stored in the report server database,
> or the user data source is configured not to require credentials but
> the unattended execution account is not specified. --->
> Microsoft.ReportingServices.Diagnostics.Utilities.InvalidDataSourceCredentialSettingException:
> The current action cannot be completed. The user data source
> credentials do not meet the requirements to run this report or shared
> dataset. Either the user data source credentials are not stored in the
> report server database, or the user data source is configured not to
> require credentials but the unattended execution account is not
> specified. at
> Microsoft.ReportingServices.Library.ReportingService2005Impl.SetCacheOptions(String
> Report, Boolean CacheReport, ExpirationDefinition Expiration, Guid
> batchId) at
> Microsoft.ReportingServices.WebServer.ReportingService2005.SetCacheOptions(String
> Report, Boolean CacheReport, ExpirationDefinition Expiration) (type
> ReportingUploaderException) Exception Stack Trace: at
> Microsoft.TeamFoundation.Client.Reporting.ReportingUploader.UploadReport(XmlNode
> report) at
> Microsoft.TeamFoundation.Client.Reporting.ReportingUploader.HandleCreateReports(XmlNode
> node) at
> Microsoft.TeamFoundation.Client.Reporting.ReportingUploader.Run()
> at
> Microsoft.VisualStudio.TeamFoundation.PCW.RosettaReportUploader.Execute(ProjectCreationContext
> context, XmlNode taskXml)
>
> Inner Exception Details:
>
> Exception Message: System.Web.Services.Protocols.SoapException: The
> current action cannot be completed. The user data source credentials
> do not meet the requirements to run this report or shared dataset.
> Either the user data source credentials are not stored in the report
> server database, or the user data source is configured not to require
> credentials but the unattended execution account is not specified.
> ---> Microsoft.ReportingServices.Diagnostics.Utilities.InvalidDataSourceCredentialSettingException:
> The current action cannot be completed. The user data source
> credentials do not meet the requirements to run this report or shared
> dataset. Either the user data source credentials are not stored in the
> report server database, or the user data source is configured not to
> require credentials but the unattended execution account is not
> specified. at
> Microsoft.ReportingServices.Library.ReportingService2005Impl.SetCacheOptions(String
> Report, Boolean CacheReport, ExpirationDefinition Expiration, Guid
> batchId) at
> Microsoft.ReportingServices.WebServer.ReportingService2005.SetCacheOptions(String
> Report, Boolean CacheReport, ExpirationDefinition Expiration) (type
> SoapException)SoapException Details: <detail><ErrorCode
> xmlns="http://www.microsoft.com/sql/reportingservices">rsInvalidDataSourceCredentialSetting</ErrorCode><HttpStatus
> xmlns="http://www.microsoft.com/sql/reportingservices">400</HttpStatus><Message
> xmlns="http://www.microsoft.com/sql/reportingservices">The current
> action cannot be completed. The user data source credentials do not
> meet the requirements to run this report or shared dataset. Either the
> user data source credentials are not stored in the report server
> database, or the user data source is configured not to require
> credentials but the unattended execution account is not
> specified.</Message><HelpLink
> xmlns="http://www.microsoft.com/sql/reportingservices">http://go.microsoft.com/fwlink/?LinkId=20476&EvtSrc=Microsoft.ReportingServices.Diagnostics.Utilities.ErrorStrings&EvtID=rsInvalidDataSourceCredentialSetting&ProdName=Microsoft%20SQL%20Server%20Reporting%20Services&ProdVer=11.0.2100.60</HelpLink><ProductName
> xmlns="http://www.microsoft.com/sql/reportingservices">Microsoft SQL
> Server Reporting Services</ProductName><ProductVersion
> xmlns="http://www.microsoft.com/sql/reportingservices">11.0.2100.60</ProductVersion><ProductLocaleId
> xmlns="http://www.microsoft.com/sql/reportingservices">127</ProductLocaleId><OperatingSystem
> xmlns="http://www.microsoft.com/sql/reportingservices">OsIndependent</OperatingSystem><CountryLocaleId
> xmlns="http://www.microsoft.com/sql/reportingservices">1033</CountryLocaleId><MoreInformation
> xmlns="http://www.microsoft.com/sql/reportingservices"><Source>ReportingServicesLibrary</Source><Message
> msrs:ErrorCode="rsInvalidDataSourceCredentialSetting"
> msrs:HelpLink="http://go.microsoft.com/fwlink/?LinkId=20476&EvtSrc=Microsoft.ReportingServices.Diagnostics.Utilities.ErrorStrings&EvtID=rsInvalidDataSourceCredentialSetting&ProdName=Microsoft%20SQL%20Server%20Reporting%20Services&ProdVer=11.0.2100.60"
> xmlns:msrs="http://www.microsoft.com/sql/reportingservices">The
> current action cannot be completed. The user data source credentials
> do not meet the requirements to run this report or shared dataset.
> Either the user data source credentials are not stored in the report
> server database, or the user data source is configured not to require
> credentials but the unattended execution account is not
> specified.</Message></MoreInformation><Warnings
> xmlns="http://www.microsoft.com/sql/reportingservices" /></detail>
> Exception Stack Trace: at
> Microsoft.TeamFoundation.Client.Channels.TfsHttpClientBase.HandleReply(TfsClientOperation
> operation, TfsMessage message, Object[]& outputs) at
> Microsoft.TeamFoundation.Client.Channels.TfsHttpClientBase.Invoke(TfsClientOperation
> operation, Object[] parameters, TimeSpan timeout, Object[]& outputs)
> at
> Microsoft.TeamFoundation.Client.Reporting.ReportingService.Invoke(TfsClientOperation
> operation, Object[] outputs) at
> Microsoft.TeamFoundation.Client.Reporting.ReportingService.SetCacheOptions(String
> Report, Boolean CacheReport, ExpirationDefinition Expiration) at
> Microsoft.TeamFoundation.Client.Reporting.ReportingUploader.UploadReport(XmlNode
> report)
UPDATE: I fixed it.
When you open http:///Reports/Pages/Folder.aspx, you may find the datasources Tfs2010OlapReportDS and Tfs2010ReportDS. Open each datasource and make sure that it uses "Credentials stored securely in the report server" and that "Use as Windows credentials when connecting to the data source" is checked.
Last time I used Windows Integrated security, so the web service might not have been able to connect.
Not sure about the 2012 version, but in the 2010 version this kind of info is set in the Team Foundation Administration Console. Give it a try; it shouldn't be much different in the 2012 version.

Oracle Identity Federation - RCU OID Schema Creation Failure

I am trying to install OIF (Oracle Identity Federation) as per the OBE http://www.oracle.com/webfolder/technetwork/tutorials/obe/fmw/oif/11g/r1/oif_install/oif_install.htm
I have installed Oracle 11gR2 11.2.0.3 with charset = AL32UTF8, a db_block_size of 8K, and nls_length_semantics=CHAR, and created the database and listener needed.
Installed WebLogic 10.3.6.
Started the installation of OIM (Oracle Identity Management), choosing the install-and-configure option and the schema creation options.
The installation goes fine, but it fails during configuration. Below is the relevant part of the logs.
I have tried multiple times, only to fail again and again. If someone could kindly shed some light on what is going wrong here, I would be grateful. Please let me know if you need more info on the setup...
_File : ...//oraInventory/logs/install2013-05-30_01-18-31AM.out_
ORA-01450: maximum key length (6398) exceeded
Percent Complete: 62
Repository Creation Utility: Create - Completion Summary
Database details:
Host Name : vccg-rh1.earth.com
Port : 1521
Service Name : OIAMDB
Connected As : sys
Prefix for (non-prefixable) Schema Owners : DEFAULT_PREFIX
RCU Logfile : /data/OIAM/installed_apps/fmw/Oracle_IDM1_IDP33/rcu/log/rcu.log
RCU Checkpoint Object : /data/OIAM/installed_apps/fmw/Oracle_IDM1_IDP33/rcu/log/RCUCheckpointObj
Component schemas created:
Component Status Logfile
Oracle Internet Directory Failed /data/OIAM/installed_apps/fmw/Oracle_IDM1_IDP33/rcu/log/oid.log
Repository Creation Utility - Create : Operation Completed
Repository Creation Utility - Dropping and Cleanup of the failed components
Repository Dropping and Cleanup of the failed components in progress.
Percent Complete: 93
Percent Complete: -117
Percent Complete: 100
RCUUtil createOIDRepository status = 2
-------------------------------------------------
java.lang.Exception: RCU OID Schema Creation Failed
at oracle.as.idm.install.config.IdMDirectoryServicesManager.doExecute(IdMDirectoryServicesManager.java:792)
at oracle.as.install.engine.modules.configuration.client.ConfigAction.execute(ConfigAction.java:375)
at oracle.as.install.engine.modules.configuration.action.TaskPerformer.run(TaskPerformer.java:88)
at oracle.as.install.engine.modules.configuration.action.TaskPerformer.startConfigAction(TaskPerformer.java:105)
at oracle.as.install.engine.modules.configuration.action.ActionRequest.perform(ActionRequest.java:15)
at oracle.as.install.engine.modules.configuration.action.RequestQueue.perform(RequestQueue.java:96)
at oracle.as.install.engine.modules.configuration.standard.StandardConfigActionManager.start(StandardConfigActionManager.java:186)
at oracle.as.install.engine.modules.configuration.boot.ConfigurationExtension.kickstart(ConfigurationExtension.java:81)
at oracle.as.install.engine.modules.configuration.ConfigurationModule.run(ConfigurationModule.java:86)
at java.lang.Thread.run(Thread.java:662)
_File : ...///fmw/Oracle_IDM1_IDP33/rcu/log/oid.log_
CREATE UNIQUE INDEX rp_dn on ct_dn (parentdn,rdn)
*
ERROR at line 1:
ORA-01450: maximum key length (6398) exceeded
Update:
I looked at the logs again and tracked down which SQL statements were leading to the above error:
CREATE BIGFILE TABLESPACE "OLTS_CT_STORE" EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO DATAFILE '/data/OIAM/installed_apps/db/oradata/OIAMDB/gcats1_oid.dbf' SIZE 32M AUTOEXTEND ON NEXT 10240K MAXSIZE UNLIMITED;
CREATE TABLE ct_dn (
EntryID NUMBER NOT NULL,
RDN varchar2(1024) NOT NULL,
ParentDN varchar2(1024) NOT NULL)
ENABLE ROW MOVEMENT
TABLESPACE OLTS_CT_STORE MONITORING;
CREATE UNIQUE INDEX rp_dn on ct_dn (parentdn,rdn)
TABLESPACE OLTS_CT_STORE
PARALLEL COMPUTE STATISTICS;
I ran these statements from sqlplus and was able to create the index without issues, and as per the tablespace creation statement, autoextend is on. Yet when RCU (the Repository Creation Utility) runs to create the needed schemas, it fails with the same error as before. Any pointers?
Setting NLS_LENGTH_SEMANTICS=BYTE worked. (This makes sense of the ORA-01450: with nls_length_semantics=CHAR and the AL32UTF8 charset, each varchar2(1024) column can occupy up to 4096 bytes, so the two-column index key can reach 8192 bytes, above the 6398-byte maximum for an 8K block; with BYTE semantics the key is at most 2048 bytes.)
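A sketch of how the parameter can be changed instance-wide before rerunning RCU (assumes the instance uses an spfile; it can also be set per session via ALTER SESSION):
ALTER SYSTEM SET NLS_LENGTH_SEMANTICS='BYTE' SCOPE=SPFILE;
-- restart the instance so RCU's sessions pick up the new semantics
SHUTDOWN IMMEDIATE;
STARTUP;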
