TypeSafe Activator ScalaTest: "value in is not a member of String"

I am trying to run the tests of a project I cloned. I am new to Scala and Activator and I cannot figure out what is happening.
In the root folder I have this build.sbt:
name := "xxxxx"

version := "2.0-SNAPSHOT"

scalaVersion := "2.11.5"

doc in Compile <<= target.map(_ / "none")

scalacOptions ++= Seq("-unchecked", "-deprecation", "-feature")

lazy val root = (project in file(".")).enablePlugins(PlayScala)

libraryDependencies ++= Seq(
  jdbc,
  anorm,
  cache,
  "org.scalatest" % "scalatest_2.11" % "2.2.4" % "test",
  "json-module" %% "json-module" % "2.0.0"
)

instrumentSettings

ScoverageKeys.excludedPackages in ScoverageCompile := "<empty>;views*;views.*;views*html;Routes*;controllers*routes*;controllers*Reverse*;controllers*javascript*;controller*ref*"

ScoverageKeys.highlighting := true
This is an example test file:
// imports added for completeness
import play.api.libs.json.Json
import play.api.mvc.Result
import play.api.test.FakeRequest
import play.api.test.Helpers._
import scala.concurrent.Future

class TestExampleSpec extends AbstractSpec {
  "TestExample#get(:testId)" must {
    val url = controllers.routes.TestExample.get(1, 1, 10).url
    "return 400 without X-Client-Id header" in {
      val fakeRequest = FakeRequest("GET", url).withTextBody("")
      val result: Future[Result] = controllers.TestExample.get(1, 1, 10)(fakeRequest)
      status(result) mustBe 400
      contentType(result) mustBe Some("application/json")
      val body = contentAsString(result)
      val bodyJson = Json.parse(body)
      (bodyJson \ "error").asOpt[Boolean] mustBe Some(true)
      (bodyJson \ "message").asOpt[String] mustBe Some("Missing X-Client-Id header")
    }
    // ... (rest of the spec elided)
  }
}
I tried these commands:
$ activator reload
$ activator clean
$ activator test
But none of them worked; I get this kind of message:
home/username/code/testing-project/test/TestExampleSpec.scala:10: value in is not a member of String
"return 400 without X-Client-Id header" in {
Thank you

I think the problem was related to the project having been upgraded from Play 2.2 to Play 2.3.8. I added this dependency:
"org.scalatestplus" %% "play" % "1.2.0" % "test"
which seems to have done the trick.
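For context, here is a minimal sketch of what the AbstractSpec base class can look like once that dependency is on the test classpath (the exact mixins are an assumption; the point is that PlaySpec pulls in the WordSpec-style must/in DSL the compiler could not find):

import org.scalatestplus.play._

// Hypothetical base spec: PlaySpec (from scalatestplus-play) mixes in
// WordSpec and MustMatchers, which provide the `must` and `in` methods
// used in TestExampleSpec above.
abstract class AbstractSpec extends PlaySpec with OneAppPerSuite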


How to make ALE work with mixed Windows/Cygwin and tricore-gcc?

EDITED: I am running on Windows 10, working with gvim, but the overall project is built in Cygwin. The compiler I am using is tricore-gcc. With a lot of pain I managed to build a compile_commands.json file that apparently works fine (if I use a linter that cannot parse compile_commands.json, I immediately get an error on the first #include, whereas if I use e.g. clangtidy such an error disappears).
Nevertheless, I tried messing around with a .c file by removing some ; or } to check whether the linter is working, but nothing happens.
I tried to debug a bit by running the commands displayed in ALEInfo from both PowerShell and Cygwin, but I got the following error:
C:\> clang-tidy C:\Project1\Root\software\ComponentONE\src\fileA.c -p C:\Project1\Root
Skipping C:\Project1\Root\software\ComponentONE\src\fileA.c. Compile command not found.
I also tried to adjust the compile_commands.json by replacing / with \ (and the same for the commands run from PowerShell), I tried to replace tricore-gcc with gcc, etc., but I always got the same error as above.
If I just run
C:\>clang-tidy.exe
clang-tidy.exe: Not enough positional command line arguments specified!
Must specify at least 1 positional arguments: See: C:\Program Files (x86)\LLVM\bin\clang-tidy.exe -help
which means that clang-tidy is recognized.
Finally, if I run the real compiler command:
C:\> tricore-gcc.exe (various flags) C:\Project1\Root\software\ComponentONE\src\fileA.c
from both PowerShell and Cygwin, I don't get any error.
My ALEInfo output follows:
Current Filetype: c
Available Linters: ['cc', 'ccls', 'clangd', 'clangtidy', 'cppcheck', 'cquery', 'flawfinder']
Linter Aliases:
'cc' -> ['gcc', 'clang']
Enabled Linters: ['clangtidy']
Ignored Linters: []
Suggested Fixers:
'astyle' - Fix C/C++ with astyle.
'clang-format' - Fix C/C++ and cuda files with clang-format.
'clangtidy' - Fix C/C++ and ObjectiveC files with clang-tidy.
'remove_trailing_lines' - Remove all blank lines at the end of a file.
'trim_whitespace' - Remove all trailing whitespace characters at the end of every line.
'uncrustify' - Fix C, C++, C#, ObjectiveC, ObjectiveC++, D, Java, Pawn, and VALA files with uncrustify.
Linter Variables:
let g:ale_c_always_make = 0
let g:ale_c_build_dir = ''
let g:ale_c_build_dir_names = ['build', 'bin']
let g:ale_c_clangtidy_checks = []
let g:ale_c_clangtidy_executable = 'clang-tidy'
let g:ale_c_clangtidy_extra_options = ''
let g:ale_c_clangtidy_options = ''
let g:ale_c_parse_compile_commands = 1
let g:ale_c_parse_makefile = 0
Global Variables:
let g:ale_cache_executable_check_failures = v:null
let g:ale_change_sign_column_color = 0
let g:ale_command_wrapper = ''
let g:ale_completion_delay = 100
let g:ale_completion_enabled = 0
let g:ale_completion_max_suggestions = 50
let g:ale_disable_lsp = 0
let g:ale_echo_cursor = 1
let g:ale_echo_msg_error_str = 'Error'
let g:ale_echo_msg_format = '%code: %%s'
let g:ale_echo_msg_info_str = 'Info'
let g:ale_echo_msg_warning_str = 'Warning'
let g:ale_enabled = 1
let g:ale_fix_on_save = 0
let g:ale_fixers = {'c': ['clangtidy'], '*': ['remove_trailing_lines', 'trim_whitespace']}
let g:ale_history_enabled = 1
let g:ale_history_log_output = 1
let g:ale_keep_list_window_open = 0
let g:ale_lint_delay = 200
let g:ale_lint_on_enter = 1
let g:ale_lint_on_filetype_changed = 1
let g:ale_lint_on_insert_leave = 1
let g:ale_lint_on_save = 1
let g:ale_lint_on_text_changed = 'normal'
let g:ale_linter_aliases = {}
let g:ale_linters = {'c': ['clangtidy']}
let g:ale_linters_explicit = 0
let g:ale_linters_ignore = {}
let g:ale_list_vertical = 0
let g:ale_list_window_size = 10
let g:ale_loclist_msg_format = '%code: %%s'
let g:ale_max_buffer_history_size = 20
let g:ale_max_signs = -1
let g:ale_maximum_file_size = v:null
let g:ale_open_list = 0
let g:ale_pattern_options = v:null
let g:ale_pattern_options_enabled = v:null
let g:ale_root = {}
let g:ale_set_balloons = 0
let g:ale_set_highlights = 1
let g:ale_set_loclist = 1
let g:ale_set_quickfix = 0
let g:ale_set_signs = 1
let g:ale_sign_column_always = 0
let g:ale_sign_error = '>>'
let g:ale_sign_info = '--'
let g:ale_sign_offset = 1000000
let g:ale_sign_style_error = '>>'
let g:ale_sign_style_warning = '--'
let g:ale_sign_warning = '--'
let g:ale_sign_highlight_linenrs = 0
let g:ale_statusline_format = v:null
let g:ale_type_map = {}
let g:ale_use_global_executables = v:null
let g:ale_virtualtext_cursor = 0
let g:ale_warn_about_trailing_blank_lines = 1
let g:ale_warn_about_trailing_whitespace = 1
Command History:
(executable check - success) clang-tidy
(finished - exit code 0) 'cmd /s/c "clang-tidy C:\Project1\Root\software\ComponentONE\src\fileA.c -p C:\Project1\Root"'
<<<NO OUTPUT RETURNED>>>
and my _vimrc:
let g:ale_linters = {
\ 'c': ['clangtidy'],
\}
let g:ale_fixers = ['clangtidy']
let g:ale_fixers={
\ 'c': ['clangtidy'],
\ '*': ['remove_trailing_lines', 'trim_whitespace'],
\}
and an extract of my compile_commands.json file:
{
"directory": ".",
"command": "c:/Hightec/tricore/v4.6.6.1/bin/tricore-gcc.exe (number of flags) -c software/ComponentONE/src/fileA.c",
"file": "software/ComponentONE/src/fileA.c"
},
If I run :pwd from vim, I get C:\Project1\Root.
FIXED!
I had to change the compile_commands.json file in this way:
{
"directory": "C:/Project1/Root",
"command": "c:/Hightec/tricore/v4.6.6.1/bin/tricore-gcc.exe (number of flags) -c software/ComponentONE/src/fileA.c",
"file": "software/ComponentONE/src/fileA.c"
},
or in this way
{
"directory": ".",
"command": "c:/Hightec/tricore/v4.6.6.1/bin/tricore-gcc.exe (number of flags) -c software/ComponentONE/src/fileA.c",
"file": "C:/Project1/Root/software/ComponentONE/src/fileA.c"
},
to make it work (clang-tidy resolves the relative "file" and "command" paths against "directory", so at least one of them has to anchor the entry to the real project root).
Obviously there are now other issues, such as
error: unknown argument: '-fstrict-volatile-bitfields' [clang-diagnostic-error]
but this is because the option -fstrict-volatile-bitfields exists for gcc but not for clang.
I would say problem solved.
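As far as I know, clang-tidy has no option to drop compiler flags it does not understand, so one workaround is to filter the gcc-only flags out of the compilation database before linting. A minimal sketch (the flag list is an assumption; extend it as needed):

# strip_gcc_flags.py: remove gcc-only flags that clang rejects
import json

GCC_ONLY = {"-fstrict-volatile-bitfields"}  # hypothetical list; extend as needed

with open("compile_commands.json") as f:
    db = json.load(f)

for entry in db:
    entry["command"] = " ".join(
        arg for arg in entry["command"].split() if arg not in GCC_ONLY
    )

with open("compile_commands.json", "w") as f:
    json.dump(db, f, indent=2)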

PyFlink - Issue using Scala UDF in JAR

I'm trying to register a Scala UDF in PyFlink from an external JAR as follows, but I get the error below.
Scala UDF:
package com.dummy
import org.apache.flink.table.functions.ScalarFunction
class dummyTransform(factor: Int) extends ScalarFunction {
  def eval(s: String): Int = {
    s.hashCode()
  }
}
build.sbt:
name := "hello_scala_for_flink"
version := "0.1"
scalaVersion := "2.12.11"
libraryDependencies += "org.apache.flink" % "flink-table-common" % "1.11.2" % "provided"
assembly.sbt:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")
Scala project structure:
hello_scala_for_flink/
  project/
  src/
    main/
      resources/
      scala/
        com.dummy/
          dummyTransform
          hello_scala
    test/
  target/
  build.sbt
shell:
cd hello_scala_for_flink
sbt assembly
cp ./target/scala-2.12/hello_scala_for_flink-assembly-0.1.jar /Users/py-r/opt/anaconda3/envs/venv_pyflink_37/lib/python3.7/site-packages/pyflink/lib/
Python:
from pyflink.dataset import ExecutionEnvironment
from pyflink.table import BatchTableEnvironment, TableConfig  # imports added for completeness

exec_env = ExecutionEnvironment.get_execution_environment()
#exec_env = StreamExecutionEnvironment.get_execution_environment()
exec_env.set_parallelism(1)
t_config = TableConfig()
table_env = BatchTableEnvironment.create(exec_env, t_config)
table_env.register_java_function("hash_code", "com.dummy.dummyTransform")
Error (incl. after restarting Anaconda):
Py4JJavaError: An error occurred while calling o12.newInstance.
: java.lang.InstantiationException: com.dummy.dummyTransform
at java.base/java.lang.Class.newInstance(Class.java:598)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:564)
at org.apache.flink.api.python.shaded.py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at org.apache.flink.api.python.shaded.py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at org.apache.flink.api.python.shaded.py4j.Gateway.invoke(Gateway.java:282)
at org.apache.flink.api.python.shaded.py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at org.apache.flink.api.python.shaded.py4j.commands.CallCommand.execute(CallCommand.java:79)
at org.apache.flink.api.python.shaded.py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.base/java.lang.Thread.run(Thread.java:832)
Caused by: java.lang.NoSuchMethodException: com.dummy.dummyTransform.<init>()
at java.base/java.lang.Class.getConstructor0(Class.java:3427)
at java.base/java.lang.Class.newInstance(Class.java:585)
... 11 more
Versions:
jdk = 1.8.0_151.jdk
scala = 2.12.11
python = 3.7
apache-beam = 2.19.0
apache-flink = 1.11.2
Any idea what the issue might be?
Thanks for your support.
It looks like I found the issue myself: Flink instantiates the UDF class reflectively, so it needs a no-argument constructor (hence the NoSuchMethodException: com.dummy.dummyTransform.<init>() above). Adding an auxiliary no-arg constructor was all that was required:
class dummyTransform(factor: Int) extends ScalarFunction {
  def eval(s: String): Int = {
    s.hashCode() * factor
  }
  def this() = this(1)
}
Also, I changed to Scala 2.11.12 due to another error. Now everything seems to work. Awesome!
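With that in place, a quick smoke test from the Python side might look like this (the table and column names are made up for illustration; string expressions like this still work in Flink 1.11):

# Hypothetical check: apply the registered Scala UDF to an in-memory table
t = table_env.from_elements([("hello",), ("world",)], ["s"])
result = t.select("hash_code(s)")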

SparkSQL with MS SQL Server: "No suitable driver" error at runtime

build.sbt
name := "BigData"
version := "0.1"
scalaVersion := "2.12.7"
libraryDependencies += "com.github.tototoshi" %% "scala-csv" % "1.3.5"
// https://mvnrepository.com/artifact/org.apache.spark/spark-core
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.0"
// https://mvnrepository.com/artifact/org.apache.spark/spark-sql
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.0"
// https://mvnrepository.com/artifact/com.microsoft.sqlserver/mssql-jdbc
libraryDependencies += "com.microsoft.sqlserver" % "mssql-jdbc" % "6.1.0.jre8" % Test
SparkMSSQL.scala
import org.apache.spark.sql.SparkSession

object SparkMSSQL {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession
      .builder.master("local[*]")
      .appName("Simple Application")
      .getOrCreate()
    val url = "jdbc:sqlserver://localhost;databaseName=scalatest;integratedSecurity=true"
    // Define the database table to load into a DataFrame
    val jdbcDbTable = "dbo.user_profiles"
    val df = spark
      .read
      .format("jdbc")
      .option("url", url)
      .option("dbtable", jdbcDbTable)
      .load()
    df.printSchema()
  }
}
Error when running:
Exception in thread "main" java.sql.SQLException: No suitable driver
at java.sql.DriverManager.getDriver(DriverManager.java:315)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.$anonfun$driverClass$2(JDBCOptions.scala:105)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:105)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:35)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:32)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:318)
at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:223)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:211)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:167)
Please advise, what is wrong with my code?
First, you have the JDBC driver in the Test scope, so the jar is probably not on the runtime classpath. Also, Spark needs the driver class name to create the JDBC connection, so try adding the following option to the DataFrame initializer:
.option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")

Waf: create custom parallel tasks

In Waf, how can I create multiple custom tasks that can run in parallel (with --jobs=JOBS)?
Sources = ["C:\\src1.c", "C:\\Mod1\\src2.c", ...] # ~30 source files; one task per file
Incl_Paths = ["Mod1", "Mod2"] # all include paths are passed to every call
INCL_ST = "-I%s" # how to format an include path as an argument
Ext_out = "_loc" # output file extension
The goal:
C:\\LOC.exe -IMod1 -IMod2 C:\\src1.c > build\\src1.c_loc //or better src1_loc
C:\\LOC.exe -IMod1 -IMod2 C:\\Mod1\src2.c > build\\src2.c_loc //or better src2_loc
...
I couldn't get it to work:
def build(bld):
    for i in Sources:
        bld.new_task_gen(
            source = i,
            rule = 'C:\\LOC.exe ${INCL_ST:Incl_Paths} ${SRC} > ' + i + Ext_out,
        )
I also couldn't get find_program to locate the exe:
# find_program(self, filename, path_list=[], var=None, environ=None, exts=''):
cfg.find_program("C:\\LOC.exe", var='LOC')
To change from:
rule='C:\\LOC.exe ...'
To:
rule='${LOC} ...'
Something like this should work with waf 1.7:
from waflib.Task import Task
from waflib.TaskGen import extension

Ext_out = "_loc" # output file extension

def configure(conf):
    # loc.exe must be in the system path for this to work
    conf.find_program(
        'loc',
        var = "LOC",
    )
    conf.env.Incl_Paths = ["Mod1", "Mod2"]
    conf.env.INCL_ST = "-I%s"

@extension('.c')
def process_loc(self, node):
    out_node = node.change_ext(Ext_out)
    tsk = self.create_task('loc')
    tsk.set_inputs(node)
    tsk.set_outputs(out_node)

class loc_task(Task):
    ext_in = ['.c']
    ext_out = ['_loc']
    run_str = "${LOC} ${INCL_ST:Incl_Paths} ${SRC} > ${TGT}"

def build(bld):
    bld(source = ["src1.c", "src2.c"])
Well, it works for me on Linux, faking loc...
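For reference, with the snippets above in a wscript at the project root, each .c file becomes its own loc task, and waf's scheduler runs independent tasks in parallel:

waf configure
waf build --jobs=4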

Test page for mod_wsgi

mod_python has a test page script which emits information about the server configuration. You can put
SetHandler mod_python
PythonHandler mod_python.testhandler
into your .htaccess and it displays the page.
Now my question: does something similar exist for mod_wsgi as well?
No. You can create something kind of helpful by iterating over the keys of environ, though (note this is Python 2 code; on Python 3, use env.items() instead of env.iteritems()):
def application(env, respond):
    respond('200 OK', [('Content-Type', 'text/plain')])
    return ['\n'.join('%s: %s' % (k, v) for (k, v) in env.iteritems())]
I have now put together something like a test page. For your convenience, I'll share it with you here:
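# tag('td')(content) renders '<td>content</td>'; keyword arguments become HTML attributes (via repr quoting)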
def tag(t, **k):
    kk = ''.join(' %s=%r' % kv for kv in k.items())
    format = '<%s%s>%%s</%s>' % (t, kk, t)
    return lambda content: format % content
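# Render a mapping as a three-column HTML table (key, escaped value, escaped repr), sorted by key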
def table(d):
    from cgi import escape
    escq = lambda s: escape(s, quote=True)
    tr = tag('tr')
    th = tag('th')
    td_code = lambda content: tag('td')(tag('code')(content))
    return tag('table', border='1')(''.join((
        '\n\t' + tr(th('Key') + th('Value') + th('Repr')) + '\n',
        ''.join(('\t' + tr(td_code('%s') + td_code('%s') + td_code('%s')) + '\n') % (k, escq(str(v)), escq(repr(v))) for k, v in sorted(d.items())),
    ))) + '\n'
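# The WSGI entry point: yields process information and the WSGI environ as HTML tables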
def application(environ, start_response):
    import os
    l = []
    from wsgiref.headers import Headers
    h = Headers(l)
    h.add_header('Content-Type', 'text/html')
    start_response('200 OK', l)
    yield '<html><head><title>my mod_wsgi test page</title></head><body>\n'
    # yield '<h3>General information</h3>\n'
    # yield table({})
    yield '<h3>Process info</h3>\n'
    yield table(dict(
        wd=os.getcwd(),
        pid=os.getpid(),
        ppid=os.getppid(),
        uid=os.getuid(),
        gid=os.getgid(),
    ))
    yield '<h3>Environment</h3>\n'
    yield table(environ)
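For completeness, one way to serve this script with mod_wsgi (the URL prefix and file path are assumptions; the script must define application as above):

WSGIScriptAlias /wsgi-test /var/www/wsgi/testpage.wsgi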
