I am trying to create a variable as a Python dictionary and post it to a server in JSON format.
The server isn't accepting a dict containing a list.
This is the code:
b9RuleData = {'hash': 'fea66a01c8091683539597d529557b8f2f21270e', 'fileState': 3, 'policyIds': '11'}
b9RuleData2 = {'hash': 'ff9586097b762b7d534cad008fc5f0382b5dcff8', 'fileState': 3, 'policyIds': '12'}
b9RuleData3 = {'hash': 'fea66a01c8091683539597d529557b8f2f21270e', 'fileState': 3, 'policyIds': '13'}
So, what's the best way to write this? Let's assume I have 100 file hashes; I can't keep adding each one as a separate variable.
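One way to avoid hand-writing a hundred variables is to keep the hashes in a list and build the dicts in a loop. Since the server reportedly rejects a dict containing a list, each dict can then be posted in its own request. This is only a sketch: the endpoint URL, the use of the requests library, and the policyIds numbering are assumptions for illustration.

import requests  # assumed HTTP client

URL = "https://example.com/api/fileRules"  # hypothetical endpoint

hashes = [
    'fea66a01c8091683539597d529557b8f2f21270e',
    'ff9586097b762b7d534cad008fc5f0382b5dcff8',
    # ... up to 100 hashes
]

# Build one dict per hash instead of b9RuleData, b9RuleData2, b9RuleData3, ...
rules = [
    {'hash': h, 'fileState': 3, 'policyIds': str(policy_id)}
    for policy_id, h in enumerate(hashes, start=11)
]

# Post each dict separately as JSON, since the server rejects a list.
for rule in rules:
    response = requests.post(URL, json=rule)
    response.raise_for_status()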
I am trying to parse an API response which is JSON. The JSON looks like this:
{
    'id': 112,
    'name': 'stalin-PC',
    'type': 'IP4Address',
    'properties': 'address=10.0.1.110|ipLong=277893412|state=DHCP Allocated|macAddress=41-1z-y4-23-dd-98|'
}
Its length is 1200, so if I convert it I should get 1200 rows. My goal is to parse this JSON like below:
id name type address iplong state macAddress
112 stalin-PC IP4Address 10.0.1.110 277893412 DHCP Allocated 41-1z-y4-23-dd-98
I am getting the first three elements, but I'm having an issue with the "properties" key, whose value is pipe-delimited. I have tried the code below:
for network in networks:  # here networks = response.json()
    network_id = network['id']
    network_name = network['name']
    network_type = network['type']
    print(network_id, network_name, network_type)
It works fine and gives me the result:
112 stalin-PC IP4Address
But when I tried to parse the properties key with the code below, it's not working.
for network in networks:
    network_id = network['id']
    network_name = network['name']
    network_type = network['type']
    for line in network['properties']:
        properties_value = line.split('|')
        network_address = properties_value[0]
    print(network_id, network_name, network_type, network_address)
How can I parse the pipe-delimited properties key? Would anyone help me, please?
Thank you
Using str methods
Ex:
network = {
    'id': 112,
    'name': 'stalin-PC',
    'type': 'IP4Address',
    'properties': 'address=10.0.1.110|ipLong=277893412|state=DHCP Allocated|macAddress=41-1z-y4-23-dd-98'
}

for n in network['properties'].split("|"):
    key, value = n.split("=")
    print(key, "-->", value)
Output:
address --> 10.0.1.110
ipLong --> 277893412
state --> DHCP Allocated
macAddress --> 41-1z-y4-23-dd-98
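Applied to the original loop, the same split can feed a dict so each property is available by name. This is only a sketch: it assumes networks is the parsed response.json() list from the question, and it skips the empty segment left by the trailing "|" in that sample data.

for network in networks:
    # Turn 'address=10.0.1.110|ipLong=...' into a dict of property values.
    props = dict(
        pair.split("=", 1)
        for pair in network['properties'].split("|")
        if pair  # skip the empty segment left by a trailing "|"
    )
    print(network['id'], network['name'], network['type'],
          props.get('address'), props.get('ipLong'),
          props.get('state'), props.get('macAddress'))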
I would like to put my summary statistics into a table using the kable function, but I cannot because it comes up as an array.
```{r setup options, include = FALSE}
knitr::opts_chunk$set(fig.width = 8, fig.height = 5, echo = TRUE)
library(mosaic)
library(knitr)
```
```{r}
sum = summary(SwimRecords$time) # generic data set included with mosaic package
kable(sum) # I want this to be printed into a table
```
Any suggestions?
You can do so easily with the broom package, which is built to "tidy" these stats-related objects:
# install.packages("broom")
broom::tidy(sum)
I have a line like:
Name:sample Location:(xyz)
I want to to convert it to dictionary as follows:
{'Name':'sample','Location':'(xyz)'}
I want to do this using a Python script, so please suggest how I can make this possible. The platform I am working on is Linux.
# First split at whitespace ==> ['Name:sample', 'Location:(xyz)']
# Next split each item at ':' and convert them into a list of tuples
# ==> [('Name', 'sample'), ('Location', '(xyz)')]
# Finally, convert the list of tuples to a dictionary
sample_string = "Name:sample Location:(xyz)"
split_sample_string = sample_string.split()
tuple_string = [tuple(item.split(":")) for item in split_sample_string]
final_dictionary = dict(tuple_string)
print(final_dictionary)
# final_dictionary = {'Name': 'sample', 'Location': '(xyz)'}
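The same transformation can also be written in one pass by feeding a generator straight into dict(); this is just a compact variant of the answer above, splitting each token once at the first ':' so values that themselves contain ':' are preserved.

sample_string = "Name:sample Location:(xyz)"

# Split on whitespace, then split each token once at the first ':'.
result = dict(token.split(":", 1) for token in sample_string.split())
print(result)  # {'Name': 'sample', 'Location': '(xyz)'}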
I have many data.frames that I am trying to send to a MySQL database via RMySQL.
# Sends data frame to database without a problem
dbWriteTable(con3, name="SPY", value=SPY , append=T)
# stock1 contains a character vector of stock names...
stock1 <- c("SPY.A")
But when I try to loop it:
i = 1
while (i <= length(stock1)) {
  # converts "SPY.A" into SPY
  name <- print(paste0(str_sub(stock1, start = 1, end = -3))[i], quote = F)
  # sends data.frame to database
  dbWriteTable(con3, paste0(str_sub(stock1, start = 1, end = -3))[i], value = name, append = T)
  i <- 1 + i
}
The following warning is returned and nothing is sent to the database:
In addition: Warning message:
In file(fn, open = "r") :
cannot open file './SPY': No such file or directory
However, I believe the problem is with passing value to dbWriteTable(), since dbWriteTable(con3, "SPY", SPY, append=T) works but dbWriteTable(con3, "SPY", name, append=T) does not...
You are probably using a non-base package for str_sub and I'm guessing you get the same behavior with substr. Does this succeed?
dbWriteTable(con3, substr( stock1, 1,3) , get(stock1), append=T)
Hiya, I have made a program that stores the player name and strength. Here is the code:
data = {
    "PLAYER": name2,
    "STRENGTH": str(round(strength, 2)),
}

with open("data2.txt", "w", encoding="utf-8") as file:
    file.write(repr(data))
So this stores the data. What do I do if I want to append/change the value after a certain action such as a 'BATTLE'?
Is it possible to get the value of 'STRENGTH' and then change the number?
At the moment, to read data from the external file 'data1.txt' I am using this code:
with open("data1.txt", "r", encoding="utf-8") as file:
data_string = file.readline()
data = eval(data_string)
# (data["STRENGTH"])
S1 = (float(data["STRENGTH"]))
file.close()
Now I can do something with the variable 'S1'.
Here is the external text file 'data1.txt':
{'PLAYER': 'Oreo', 'STRENGTH': '11.75'}
... But I want to change the strength value after a "battle". Many thanks.
Maybe you're not understanding Python dict semantics?
Seems to me you're doing a lot of unnecessary things like S1 = (float(data['STRENGTH'])) to try to manipulate and change values when you could be doing really simple stuff.
>>> data = {'PLAYER': 'Oreo', 'STRENGTH': '11.75'}
>>> data['STRENGTH'] = float(data['STRENGTH'])
>>> data
{'PLAYER': 'Oreo', 'STRENGTH': 11.75}
>>> data['STRENGTH'] += 1
>>> data
{'PLAYER': 'Oreo', 'STRENGTH': 12.75}
Maybe you should give Native Data Types -- Dive Into Python 3 a read to see if it clears things up.
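To persist the change after a battle, the full cycle is read, update, rewrite. The sketch below assumes the file format shown above; it uses ast.literal_eval rather than eval as a safer way to parse the stored dict literal, and the +1.5 bonus is just an illustrative value.

import ast

# Read the stored dict back in (ast.literal_eval safely parses a dict literal).
with open("data1.txt", "r", encoding="utf-8") as file:
    data = ast.literal_eval(file.readline())

# Convert the stored string to a number, apply the battle result, write it back.
data["STRENGTH"] = float(data["STRENGTH"]) + 1.5  # assumed battle bonus
with open("data1.txt", "w", encoding="utf-8") as file:
    file.write(repr(data))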