How to move a slider using RSelenium?

I am trying to move the slider to its maximum value of 2000 px.
driver <- rsDriver(browser = c("firefox"))
remDr <- driver[["client"]]
remDr$navigate('https://www.qrcode-monkey.com/#url')
webElem <- remDr$findElement("xpath", "//span[@class='rz-pointer rz-pointer-min']")
I have tried these commands, but neither works:
webElem$buttondown(2)
webElem$setElementAttribute("aria-valuetext", 2000)

We can simply click on the pointer and press the End key:
# launch browser
driver <- rsDriver(browser = c("firefox"))
remDr <- driver[["client"]]
remDr$navigate('https://www.qrcode-monkey.com/#url')
# click the slider handle
remDr$findElement(using = "xpath", '/html/body/div[2]/div[2]/div[2]/div/div/div[2]/div[2]/div[1]/div/div[1]/span[5]')$clickElement()
# press the End key
webElem <- remDr$findElement(using = "xpath", '/html/body/div[2]/div[2]/div[2]/div/div/div[2]/div[2]/div[1]/div/div[1]/span[5]')
webElem$sendKeysToElement(list(key = "end"))
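If pressing End does not land on the exact value you need, a hedged alternative sketch is to nudge the handle with repeated arrow-key presses after clicking it; the number of presses depends on the slider's step size, so the loop count below is only illustrative.
# Illustrative sketch: nudge the slider one step at a time
# (50 presses is an assumption, not a value from the original answer)
for (i in seq_len(50)) {
  webElem$sendKeysToElement(list(key = "right_arrow"))
}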


How do I add a blueprint into workflow_set in tidymodels?

I tried to follow the examples in:
Link 1 - Sparse matrix support: https://www.tidyverse.org/blog/2020/11/tidymodels-sparse-support/
Link 2 - Workflow sets: https://www.tmwr.org/workflow-sets.html
but I had trouble including the blueprint in the workflow set.
This is how the workflow_set is defined in the examples in link 2:
no_pre_proc <-
  workflow_set(
    preproc = list(simple = model_vars),
    models = list(MARS = mars_spec, CART = cart_spec, CART_bagged = bag_cart_spec,
                  RF = rf_spec, boosting = xgb_spec, Cubist = cubist_spec)
  )
and this is how the blueprint is added to a workflow in link 1:
wf_sparse <-
  workflow() %>%
  add_recipe(text_rec, blueprint = sparse_bp) %>%
  add_model(lasso_spec)

wf_default <-
  workflow() %>%
  add_recipe(text_rec) %>%
  add_model(lasso_spec)
Where and how do I add the "blueprint = sparse_bp" option in the workflow_set above?
My attempt was:
no_pre_proc <-
  workflow_set(
    preproc = list(simple = model_vars),
    models = list(MARS = mars_spec, CART = cart_spec, CART_bagged = bag_cart_spec,
                  RF = rf_spec, boosting = xgb_spec, Cubist = cubist_spec)) %>%
  option_add(update_blueprint(blueprint = sparse_bp))
Running the racing tune gave me this error:
Error: Problem with `mutate()` column `option`.
i `option = purrr::map(option, append_options, dots)`.
x All options should be named.
Run `rlang::last_error()` to see where the error occurred
<error/rlang_error>
There were 9 workflows that had no results.
Backtrace:
1. ggplot2::autoplot(...)
2. workflowsets:::autoplot.workflow_set(...)
3. workflowsets:::rank_plot(...)
4. workflowsets:::pick_metric(object, rank_metric, metric)
6. workflowsets:::collect_metrics.workflow_set(x)
7. workflowsets:::check_incompete(x, fail = TRUE)
8. workflowsets:::halt(msg)
Run `rlang::last_trace()` to see the full context.
> rlang::last_trace()
<error/rlang_error>
There were 9 workflows that had no results.
Backtrace:
x
1. +-ggplot2::autoplot(...)
2. \-workflowsets:::autoplot.workflow_set(...)
3. \-workflowsets:::rank_plot(...)
4. \-workflowsets:::pick_metric(object, rank_metric, metric)
5. +-tune::collect_metrics(x)
6. \-workflowsets:::collect_metrics.workflow_set(x)
7. \-workflowsets:::check_incompete(x, fail = TRUE)
8. \-workflowsets:::halt(msg)
>
Thanks.
Thank you for asking this question; we definitely are not supporting this use case (passing non-default arguments to the recipe or model) very well right now. We've opened an issue here where you can track our work on this.
In the meantime, you could try a bit of a hacky workaround by manually using update_recipe() on the workflow you are interested in:
library(tidymodels)
#> Registered S3 method overwritten by 'tune':
#>   method                   from
#>   required_pkgs.model_spec parsnip

data(parabolic)

set.seed(1)
split <- initial_split(parabolic)
train_set <- training(split)
test_set <- testing(split)

glmnet_spec <-
  logistic_reg(penalty = 0.1, mixture = 0) %>%
  set_engine("glmnet")

rec <-
  recipe(class ~ ., data = train_set) %>%
  step_YeoJohnson(all_numeric_predictors())

sparse_bp <- hardhat::default_recipe_blueprint(composition = "dgCMatrix")

wfs_orig <-
  workflow_set(
    preproc = list(yj = rec,
                   norm = rec %>% step_normalize(all_numeric_predictors())),
    models = list(regularized = glmnet_spec)
  )

new_wf <-
  wfs_orig %>%
  extract_workflow("yj_regularized") %>%
  update_recipe(rec, blueprint = sparse_bp)
Created on 2021-12-09 by the reprex package (v2.0.1)
Then (I know this feels hacky for now) manually take this new_wf and stick it into the wfs_orig$info[[1]]$workflow slot to replace what is there.
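A minimal sketch of that manual replacement, assuming the workflow is stored as a list-column in the info tibble (the internal layout of a workflow_set can change between versions, so inspect wfs_orig$info first):
# Hypothetical replacement of the stored workflow for the "yj_regularized" row;
# the slot layout is an assumption, not a documented API
wfs_orig$info[[1]]$workflow <- list(new_wf)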

R: How to properly create an rx_forest_model object?

I'm trying to do a churn analysis with R and SQL Server 2016.
I have uploaded my dataset to a database on a local SQL Server and have done all the preliminary work on this dataset.
Now I have this function, trainModel(), which I would like to use to estimate my random forest model:
trainModel <- function(sqlSettings, trainTable) {
    sqlConnString <- sqlSettings$connString
    trainDataSQL <- RxSqlServerData(connectionString = sqlConnString,
                                    table = trainTable,
                                    colInfo = cdrColInfo)
    ## Create training formula
    labelVar <- "churn"
    trainVars <- rxGetVarNames(trainDataSQL)
    trainVars <- trainVars[!trainVars %in% c(labelVar)]
    temp <- paste(c(labelVar, paste(trainVars, collapse = "+")), collapse = "~")
    formula <- as.formula(temp)
    ## Train the random forest (rxDForest) on the SQL data source
    library(RevoScaleR)
    rx_forest_model <- rxDForest(formula = formula,
                                 data = trainDataSQL,
                                 nTree = 8,
                                 maxDepth = 16,
                                 mTry = 2,
                                 minBucket = 1,
                                 replace = TRUE,
                                 importance = TRUE,
                                 seed = 8,
                                 parms = list(loss = c(0, 4, 1, 0)))
    return(rx_forest_model)
}
But when I run the function, I get this output with a warning:
> system.time({
+     trainModel(sqlSettings, trainTable)
+ })
   user  system elapsed
   0.29    0.07   58.18
Warning message:
In tempGetNumObs(numObs) :
  Number of observations not available for this data source. 'numObs' set to 1e6.
Along with this warning message, the function trainModel() does not create the rx_forest_model object.
Does anyone have any suggestions on how to solve this problem?
After several attempts, I found the reason why the function trainModel() did not work properly.
It is not a connection string problem, and it is not a data source type issue either.
The problem is in the syntax of the function trainModel().
It is enough to remove the statement
return(rx_forest_model)
from the body of the function.
This way, the function returns the same warning message, but it creates the rx_forest_model object correctly.
So, the correct function is:
trainModel <- function(sqlSettings, trainTable) {
    sqlConnString <- sqlSettings$connString
    trainDataSQL <- RxSqlServerData(connectionString = sqlConnString,
                                    table = trainTable,
                                    colInfo = cdrColInfo)
    ## Create training formula
    labelVar <- "churn"
    trainVars <- rxGetVarNames(trainDataSQL)
    trainVars <- trainVars[!trainVars %in% c(labelVar)]
    temp <- paste(c(labelVar, paste(trainVars, collapse = "+")), collapse = "~")
    formula <- as.formula(temp)
    ## Train the random forest (rxDForest) on the SQL data source
    library(RevoScaleR)
    rx_forest_model <- rxDForest(formula = formula,
                                 data = trainDataSQL,
                                 nTree = 8,
                                 maxDepth = 16,
                                 mTry = 2,
                                 minBucket = 1,
                                 replace = TRUE,
                                 importance = TRUE,
                                 seed = 8,
                                 parms = list(loss = c(0, 4, 1, 0)))
}
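A short usage sketch, assuming sqlSettings and trainTable are set up as in the question; assigning the call's result keeps the fitted forest available in the calling environment:
# Call the function and keep the fitted forest (names as in the question)
rx_forest_model <- trainModel(sqlSettings, trainTable)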

How to import data from SQL Server into quantmod?

I'm looking for some guidance and hope I did the right thing posting it here. I want to load data into quantmod via getSymbols() from SQL Server. I'm new to R but have a background working with SQL Server; not a pro, but finding my way.
I have imported all my data into one table in SQL Server named Quotes with the following columns:
- Ticker Varchar(10)
- Name varchar(50)
- [Date] datetime
- [Open] Decimal(19,9)
- High Decimal(19,9)
- Low Decimal(19,9)
- [Close] Decimal(19,9)
- Volume Decimal(19,9)
- Signal Decimal(19,9)
I'm able to connect to the database using the RODBC package:
(cn <- odbcDriverConnect(connection="Driver={SQL Server Native Client 11.0};server=localhost;database=DB;trusted_connection=yes;"))
and run various SELECT statements in R, but I'm lost on getting the data into quantmod without workarounds like exporting to CSV from SQL. Importing the data from Yahoo is a problem, as I cannot find a complete Yahoo ticker list.
Is there a way to get data directly into R and quantmod from SQL Server?
Something like this should do the trick.
library(RODBC)
library(xts)

getPrices.DB <- function(Symbol, from = NA) {
  cn <- "add your connection info here"
  qry <- sprintf("select [Date], [Open], [High], [Low], [Close], [Volume], [Signal] from MarketPrice where Ticker = '%s'", Symbol)
  if (!is.na(from)) { qry <- paste(qry, sprintf(" and [Date] >= '%s'", from)) }
  DB <- odbcDriverConnect(cn)
  r <- sqlQuery(DB, qry, stringsAsFactors = FALSE)
  odbcClose(DB)
  if (!is.null(r) && NROW(r) >= 1) {
    x <- xts(r[, 2:7], order.by = as.POSIXct(r[, 1], tz = "UTC")) # can eliminate tz if you want the local timezone
    indexFormat(x) <- "%Y-%b-%d %H:%M:%OS3 %z" # optional; I find it useful for debugging
    colnames(x) <- paste(Symbol, c("Open", "High", "Low", "Close", "Volume", "Signal"), sep = ".")
    return(x)
  } else {
    return(NULL)
  }
}
Now hook into the quantmod infrastructure:
library(quantmod)

getSymbols.DB <- function(Symbols, env, ...) {
  importDefaults("getSymbols.DB")
  this.env <- environment()
  for (var in names(list(...))) { assign(var, list(...)[[var]], this.env) }
  if (!hasArg(from)) from <- NA
  if (!hasArg(verbose)) verbose <- FALSE
  if (!hasArg(auto.assign)) auto.assign <- FALSE
  for (i in 1:length(Symbols)) {
    if (verbose) cat(paste("Loading ", Symbols[[i]], paste(rep(".", 10 - nchar(Symbols[[i]])), collapse = ""), sep = ""))
    x <- getPrices.DB(Symbols[[i]], from = from)
    if (auto.assign) assign(Symbols[[i]], x, env)
    if (verbose) cat("done\n")
  }
  if (auto.assign)
    return(Symbols)
  else
    return(x)
}
Example usage:
AAPL <- getSymbols("AAPL", src = "DB", auto.assign = FALSE)
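As a small convenience (not part of the original answer), quantmod's setDefaults() can register "DB" as the default source so src no longer needs to be passed each time:
# Optional sketch: make "DB" the default getSymbols source
setDefaults(getSymbols, src = "DB")
AAPL <- getSymbols("AAPL", auto.assign = FALSE)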

Execute Microsoft SQL query on R Shiny

I am writing an R Shiny app. Can someone tell me how to execute a Microsoft SQL query in R Shiny?
This is what I have done so far:
data <- reactive({
  conn <- reactive({ databaseOpen(serverName = "[serverName]", databaseName = [dbName]) })
  qr <- reactive({ SELECT * from myTable })
  res <- reactive({ databaseQuery(conn = conn, query = qr) })
  close(conn)
  View(res)
})
Any help is appreciated !
I was able to call a query by creating a function outside of the server and ui functions (in other words, in global.R). The server function can then call that query function, passing it one of the inputs.
Here is my code:
library(shiny)
library(RODBC)
library(data.table)

queryfunction <- function(zipper) {
  odbcChannel <- odbcConnect("myconnection")
  querydoc <- paste0("
        SELECT distinct *
        FROM mydb
        where substring(nppes_provider_zip,1,2) = '43'
        and [provider_type] = 'General Practice'
        ")
  pricetable <- sqlQuery(odbcChannel, querydoc)
  close(odbcChannel)
  pricetable[which(substring(pricetable$nppes_provider_zip, 1, 5) == zipper), ]
}

server <- shinyServer(function(input, output) {
  output$mytable1 <- renderDataTable(data.table(queryfunction(input$zip)))
})
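The answer does not show a ui; here is a hypothetical minimal counterpart, assuming the input id "zip" and output id "mytable1" used in the server above:
ui <- fluidPage(
  textInput("zip", "ZIP code:", "43210"),   # "43210" is only a placeholder default
  dataTableOutput("mytable1")
)
shinyApp(ui = ui, server = server)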
I figured it out. It can be done as:
server.r
serverfun <- function(input, output) {
  # Store query results in the myData reactive
  myData <- reactive({
    # Open the database connection
    conn <- databaseOpen(serverName = "myServer", databaseName = "myDB")
    # Sample query which uses an input value
    qr <- paste("SELECT name FROM Genes g WHERE Id = ", input$myId, " ORDER BY name")
    # Run the query and store the results
    res <- databaseQuery(conn = conn, query = qr)
    # Close the database connection
    databaseClose(conn)
    # Return the results
    res
  })
  output$tbTable <- renderTable({
    # Return nothing if myData is NULL
    if (is.null(myData())) return()
    # Otherwise return myData
    myData()
  })
}
ui.r
library("shiny")
shinyUI(
pageWithSidebar(
headerPanel("Hide Side Bar example"),
sidebarPanel(
textInput("Id", "Enter ID below","1234")
),
mainPanel(
tabsetPanel(
tabPanel("Data", tableOutput("tbTable"))
)
)
)
)
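If you prefer a single-file app to separate ui.r and server.r files, the same pieces can be wired together with shinyApp(); a sketch under that assumption:
library(shiny)
shinyApp(
  ui = pageWithSidebar(
    headerPanel("Hide Side Bar example"),
    sidebarPanel(textInput("myId", "Enter ID below", "1234")),
    mainPanel(tabsetPanel(tabPanel("Data", tableOutput("tbTable"))))
  ),
  server = serverfun
)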

gWidgets return multiple responses for multiple columns

I posted about this before and was not able to arrive at a solution with the single posted response. I have tried another approach that mostly gets me what I want. The current solution would work better if it were in a single dialog box, but I am not sure how to do this. I am putting together other solutions I found on Nabble and Stack Overflow and just cannot get what I need. I want the following to offer the choices in a single window, instead of having the user respond to three separate boxes. Please help.
options(guiToolkit = "RGtk2")
library(gWidgets)

# Create selection function
select <- function(x, multiple = TRUE, Title, ...) {
  ans <- new.env()
  x1 <- ggroup(horizontal = TRUE)  # no parent container here
  x2 <- gcheckboxgroup(x, multiple = multiple, con = x1, expand = TRUE)
  ret <- gbasicdialog(title = Title, widget = x1, handler = function(h, ...) {
    value <- svalue(x2)
    if (length(value) == 0) value <- ""
    assign("selected", value, env = h$action$env)
    dispose(x1)
  }, action = list(env = ans))
  ans
}

# Create a list to store results
Days. <- c("Day1", "Day2", "Day3")
Outputs_ <- list()
SelectionOptions. <- c("Bicycle Event1", "Construction Nearby", "Path Closure")

# Cycle through each day
for (day in Days.) {
  ans <- select(SelectionOptions., Title = day)
  Outputs_[[day]] <- ans$selected
}

# Return results of the selection
unlist(Outputs_)
You are making it a bit more complicated than it needs to be.
You are better off making things asynchronous (in most cases), using a handler that is called when the values are selected.
However, if you want it done this way, make a container for your selection widgets, then add them to that container. The container is then passed into gbasicdialog(). That way, everything shows in one window. Using gWidgets2 makes working with the basic dialog a lot easier.
Okay, here is an example, though I'm not quite sure what problem you are trying to solve, so it may be off:
library(gWidgets2)

Days. <- c("Day1", "Day2", "Day3")
SelectionOptions. <- c("Bicycle Event1", "Construction Nearby", "Path Closure")

w <- gwindow()
g <- gformlayout(cont = w)
days <- gradio(Days., cont = g, label = "Days:")
opts <- gcombobox(SelectionOptions., cont = g, label = "Options:")
btn <- gbutton("Show them", cont = g, label = "")
addHandlerClicked(btn, handler = function(h, ...) {
  values <- lapply(list(days, opts), svalue)
  print(values)
})
## or using a modal dialog
dlg <- gbasicdialog(handler = function(h, ...) {
  values <- lapply(list(days, opts), svalue)
  print(values)
})
g <- gformlayout(cont = dlg)
days <- gradio(Days., cont = g, label = "Days:")
opts <- gcombobox(SelectionOptions., cont = g, label = "Options:")
visible(dlg, TRUE)
## third time is a charm? (Though you could clean this up)
library(gWidgets2)

e <- new.env()
e$Days <- ""; e$Sel <- ""

Days. <- c("Day1", "Day2", "Day3")
SelectionOptions. <- c("Bicycle Event1", "Construction Nearby", "Path Closure")

dlg <- gbasicdialog(title = "test", do.buttons = FALSE)
g <- ggroup(cont = dlg, horizontal = FALSE)
fl <- gformlayout(cont = g)
days <- gradio(Days., cont = fl, label = "Days:")
cb <- gcombobox(SelectionOptions., selected = 0, cont = fl, label = "Options:")

handler <- function(h, ...) {
  e$Days <- svalue(days)
  e$Sel <- ifelse(length(svalue(cb)) > 0, svalue(cb), "")
  if (all(sapply(e, function(x) nchar(x) > 0)))
    dlg$dispose_window()
}
sapply(list(days, cb), addHandlerChanged, handler = handler)

visible(dlg, TRUE)
Take 4
library(gWidgets2)

days <- paste("Day", 1:3)
events <- c("none", "Bicycle Event", "Construction Nearby", "Path Closure")

l <- list()
handler <- function(h, ...) {
  l[[h$action]] <<- svalue(h$obj)
  if (length(l) == 3)
    print(l)
}

w <- gwindow()
g <- gvbox(cont = w)
for (day in days) {
  # use a separate name for the label so the results list `l` is not overwritten
  lab <- glabel(day, cont = g); font(lab) <- list(weight = "bold")
  gradio(events, cont = g, handler = handler, action = day)
}
