I am sending files from JavaScript to my Go server:
for (var value of formData.values()) {
console.log(value);
}
// File {name: 'img/<hash_key>.png', lastModified: 1635043863231, lastModifiedDate: Sat Oct 23 2021 23:51:03 GMT-0300 (Brasilia Standard Time), webkitRelativePath: '', size: 969, …}
// ...
var request = new Request( serverEndpoint, { body: formData, method: "POST", ... })
return fetch(request).then(response => { ... })
On my Go server, I use the following code to read the files from the multipart form data of the request:
if err := r.ParseMultipartForm(32 << 20); err != nil {
...
}
for _, fileHeader := range r.MultipartForm.File["files"] {
...
}
I expected to read the files in Go with the same filenames, e.g. img/<hash_key>.png, but my server parses the multipart form into the following struct:
f = {*mime/multipart.Form | 0xc000426090}
├── Value = {map[string][]string}
└── File = {map[string][]*mime/multipart.FileHeader}
├── 0 = files -> len:1, cap:1
│ ├── key = {string} "files"
│ └── value = {[]*mime/multipart.FileHeader} len:1, cap:1
│ └── 0 = {*mime/multipart.FileHeader | 0xc000440000}
│ ├── Filename = {string} "<hash_key>.png" // notice how Filename is missing the 'img/' prefix
│ └── ...
└── ...
I am trying to figure out why this is happening and how to prevent the prefix from being stripped, since I need it to correctly resolve the upload path for my files.
Edit:
Closer inspection revealed that my server IS in fact getting the files with the correct name. After calling r.ParseMultipartForm(32 << 20), I get the following in r.Body.src.R.buf:
------WebKitFormBoundary1uanPdXqZeL8IPUH
Content-Disposition: form-data; name="files"; filename="img/upload.svg"
---- notice the img/ prefix
Content-Type: image/svg+xml
<svg height="512pt" viewBox= ...
However, r.MultipartForm.File["files"][0].Filename shows up as upload.svg.
The directory is removed in Part.FileName():
// RFC 7578, Section 4.2 requires that if a filename is provided, the
// directory path information must not be used.
return filepath.Base(filename)
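For illustration (not part of the original handler), filepath.Base keeps only the last element of the path, which is exactly why the img/ directory disappears:

package main

import (
	"fmt"
	"path/filepath"
)

func main() {
	fmt.Println(filepath.Base("img/upload.svg")) // prints "upload.svg"
}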
Work around Part.FileName() by parsing the Content-Disposition header directly:
for _, fileHeader := range r.MultipartForm.File["files"] {
	_, params, err := mime.ParseMediaType(fileHeader.Header.Get("Content-Disposition"))
	filename := params["filename"]
	if err != nil || filename == "" {
		// TODO: Handle an unexpected Content-Disposition header
		// (missing header, parse error, missing filename param).
	}
	// filename keeps the directory prefix, e.g. "img/upload.svg".
}
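For completeness, here is a minimal sketch of how the parsed filename could be used to resolve the upload path from inside the loop above. saveUpload and uploadRoot are hypothetical names, and the error handling is only indicative:

// saveUpload writes one uploaded part under uploadRoot, preserving the
// directory prefix (e.g. "img/") taken from the Content-Disposition filename.
// Requires "fmt", "io", "mime/multipart", "os", "path/filepath", "strings".
func saveUpload(fileHeader *multipart.FileHeader, filename, uploadRoot string) error {
	// Reject path traversal; filepath.Join alone does not prevent it.
	if strings.Contains(filename, "..") {
		return fmt.Errorf("invalid filename %q", filename)
	}

	src, err := fileHeader.Open()
	if err != nil {
		return err
	}
	defer src.Close()

	dstPath := filepath.Join(uploadRoot, filename) // e.g. <uploadRoot>/img/upload.svg
	if err := os.MkdirAll(filepath.Dir(dstPath), 0o755); err != nil {
		return err
	}

	dst, err := os.Create(dstPath)
	if err != nil {
		return err
	}
	defer dst.Close()

	_, err = io.Copy(dst, src)
	return err
}

Called as saveUpload(fileHeader, filename, "/srv/uploads") inside the loop (the destination root is just an example), this stores the file under the full img/ path instead of the stripped basename.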
Related
I want to use mssql in a vue-electron project.
I installed mssql with npm.
When I want to run the app, I get the following error:
ERROR Failed to compile with 2 errors
error in ./node_modules/mssql/lib/tedious/connection-pool.js
Module parse failed: Unexpected token (39:63)
You may need an appropriate loader to handle this file type, currently no loaders are configured to process this file. See https://webpack.js.org/concepts#loaders
| cfg.options.database = cfg.options.database || this.config.database
| cfg.options.port = cfg.options.port || this.config.port
> cfg.options.connectTimeout = cfg.options.connectTimeout ?? this.config.connectionTimeout ?? this.config.timeout ?? 15000
| cfg.options.requestTimeout = cfg.options.requestTimeout ?? this.config.requestTimeout ?? this.config.timeout ?? 15000
| cfg.options.tdsVersion = cfg.options.tdsVersion || '7_4'
# ./node_modules/mssql/lib/tedious/index.js 4:23-51
# ./node_modules/mssql/index.js
# ./src/modules/db.js
# ./src/background.js
# multi ./src/background.js
error in ./node_modules/mssql/lib/tedious/request.js
Module parse failed: Unexpected token (446:15)
You may need an appropriate loader to handle this file type, currently no loaders are configured to process this file. See https://webpack.js.org/concepts#loaders
| const req = new tds.Request(command, err => {
| // tedious v15 has started using AggregateErrors to wrap multiple errors into single error objects
> (err?.errors ? err.errors : [err]).forEach((e, i, { length }) => {
| // to make sure we handle no-sql errors as well
| if (e && (!errors.length || (errors.length && errors.length >= length && e.message !== errors[errors.length - length + i].message))) {
# ./node_modules/mssql/lib/tedious/index.js 6:16-36
# ./node_modules/mssql/index.js
# ./src/modules/db.js
# ./src/background.js
# multi ./src/background.js
Can someone please help me?
I imported the ontime airlines dataset from here: https://clickhouse.com/docs/en/getting-started/example-datasets/ontime/
Then I created a dictionary mapping the two-letter airline codes to company names, like this:
id,code,company
1,UA,United Airlines
2,HA,Hawaiian Airlines
3,OO,SkyWest
4,B6,Jetblue Airway
5,QX,Horizon Air
6,YX,Republic Airway
7,G4,Allegiant Air
...
..
I used this query to create it, and it seems to be working:
CREATE DICTIONARY airlinecompany
(
id UInt64,
code String,
company String
)
PRIMARY KEY id
SOURCE(FILE(path '/var/lib/clickhouse/user_files/airlinenames.csv' format 'CSVWithNames'))
LAYOUT(FLAT())
LIFETIME(3600)
The main table (ontime) looks like this:
SELECT Reporting_Airline AS R_air
FROM ontime
GROUP BY R_air
LIMIT 4
┌─R_air─┐
│ UA │
│ HA │
│ OO │
│ B6 │
└───────┘
What I want to do is take R_air's two-letter code and look it up against the airlinecompany dict to create a mapping, i.e.
R_Air | Company
UA | United Airlines
HA | Hawaiian Airlines
OO | SkyWest
...
..
But I can't seem to form this query correctly:
SELECT
Reporting_Airline AS R_Air,
dictGetString('airlinecompany', 'company', R_Air) AS company
FROM ontime
GROUP BY R_Air
Received exception from server (version 22.3.3):
Code: 6. DB::Exception: Received from localhost:9000. DB::Exception: Cannot parse string 'UA' as UInt64: syntax error at begin of string. Note: there are toUInt64OrZero and toUInt64OrNull functions, which returns zero/NULL instead of throwing exception.: while executing 'FUNCTION dictGetString('airlinecompany' :: 1, 'company' :: 2, Reporting_Airline :: 0) -> dictGetString('airlinecompany', 'company', Reporting_Airline) String : 4'. (CANNOT_PARSE_TEXT)
What am I missing? I don't know why it thinks 'UA' is a UInt64.
Use LAYOUT(COMPLEX_KEY_HASHED()). The FLAT layout requires a UInt64 key, so dictGetString tries to parse 'UA' as the numeric id. Make code the primary key with a complex-key layout and pass the key as a tuple:
CREATE DICTIONARY airlinecompany
(
id UInt64,
code String,
company String
)
PRIMARY KEY code
SOURCE(FILE(path '/var/lib/clickhouse/user_files/airlinenames.csv' format 'CSVWithNames'))
LAYOUT(COMPLEX_KEY_HASHED())
LIFETIME(3600)
SELECT dictGet('airlinecompany', 'company', tuple('UA'))
┌─dictGet('airlinecompany', 'company', tuple('UA'))─┐
│ United Airlines │
└───────────────────────────────────────────────────┘
SELECT Reporting_Airline AS R_Air,
dictGetString('airlinecompany', 'company', tuple(R_Air)) AS company
FROM ontime
LIMIT 4;
┌─R_Air─┬─company───────────┐
│ B6 │ Jetblue Airway │
│ G4 │ Allegiant Air │
│ HA │ Hawaiian Airlines │
│ OO │ SkyWest │
└───────┴───────────────────┘
LOAD CSV WITH HEADERS FROM
'file:///epl_mataches.csv' as row
MATCH (c1:Club {name:row.`Team1`}), (c2:Club {name:row.`Team2`})
MERGE (c1) -[f:FEATURED{
round:toInteger(row.Round),
date:row.Date,
homeTeamFTScore: toInteger(split(row.FT,"-" [0])),
awayTeamFTScore: toInteger(split(row.FT,"-" [1])),
homeTeamHTScore: toInteger(split(row.HT,"-" [0])),
awayTeamHTScore: toInteger(split(row.HT,"-" [1]))
}] -> (c2)
The error appears when I try to create the relationships and pull the required information through from the data file.
Neo.ClientError.Statement.SyntaxError
Type mismatch: expected List<T> but was String (line 7, column 45 (offset: 248))
" homeTeamFTScore: toInteger(split(row.FT,"-" [0])),"
There is a typo in your script. Instead of
homeTeamFTScore: toInteger(split(row.FT,"-" [0])),
use:
homeTeamFTScore: toInteger(split(row.FT,"-") [0])
Note that the closing parenthesis of split() goes before [0], not after it.
For example:
RETURN toInteger(split("2-test","-") [0]) as sample
result:
╒════════╕
│"sample"│
╞════════╡
│2       │
└────────┘
This validation block works for a single input variable.
variable "mytestname" {
validation {
condition = length(regexall("test$", var.mytestname)) > 0
error_message = "Should end in test"
}
}
I need it to work inside a for_each, or I need some workaround to accomplish this. The issue is that the condition statement is restricted: it has to refer to the input variable itself (i.e. it cannot accept an each.value).
variable "mytestnames" {
listnames = split(",",var.mytestnames)
for_each = var.listnames
validation {
condition = length(regexall("test$", each.value)) > 0
error_message = "Should end in test"
}
}
The above snippet does not work. I need a way to iterate over a list of values and validate each of them. It looks like the newly introduced validation block does not work on lists of input variables. There must be a way to do this without a validation block...
I believe it will not work. The only arguments that can be defined in a variable block are type, description, and default, so we cannot define an additional argument such as "listnames" dynamically:
variable "mytestnames" {
listnames = split(",",var.mytestnames)
}
$ terraform validate
Error: Unsupported argument
on hoge.tf line 3, in variable "mytestnames":
3: listnames = split(",",var.mytestnames)
An argument named "listnames" is not expected here.
We can validate the whole list with a loop inside a single validation block:
variable "mytestnames" {
type = string
description = "comma separated list of names"
# default = "nametest,name1test,name2test"
default = "nametest,nametest1,nametest2"
validation {
condition = alltrue([
for n in split(",", var.mytestnames) :
can(regex("test$", n)) # can't use a local var 'can only refer to the variable itself'
])
error_message = "Should end in test" # can't use local var here either
}
}
│ Error: Invalid value for variable
│
│ on main.tf line 5:
│ 5: variable "mytestnames" {
│ ├────────────────
│ │ var.mytestnames is "nametest,nametest1,nametest2"
│
│ Should end in test
│
... but we can do better by using an output precondition:
locals { name_regex = "test$" }
output "mytestnames_valid" {
value = "ok" # we can output whatever we want
precondition {
condition = alltrue([
for n in split(",", var.mytestnames) :
can(regex(local.name_regex, n)) # in an output we can use a local var
])
error_message = format("invalid names: %s",
join(",", [
for n in split(",", var.mytestnames) :
n if !can(regex(local.name_regex, n)) # we can reference local AND make a list of bad names
]
)
)
}
}
│ Error: Module output value precondition failed
│
│ on main.tf line 23, in output "mytestnames_valid":
│ 23: condition = alltrue([
│ 24: for n in split(",", var.mytestnames) :
│ 25: can(regex(local.name_regex, n)) # in an output we can use a local var
│ 26: ])
│ ├────────────────
│ │ local.name_regex is "test$"
│ │ var.mytestnames is "nametest,nametest1,nametest2"
│
│ invalid names: nametest1,nametest2
I'm trying to upload a file to one of my Azure containers.
This is one of my requests, sent with Ajax:
headers: Object
Authorization: "SharedKey MYACCOUNT:ENC_KEY"
Content-Type: "application/octet-stream"
data: File
x-ms-blob-type: "BlockBlob"
x-ms-date: "Mon, 19 Oct 2015 13:54:53 GMT"
x-ms-version: "2009-09-19"
type: "PUT"
url: "https://MYACCOUNT.blob.core.windows.net/data-test"
For the ENC_KEY I use:
authorizationHeader =
compute: (options, xhrOptions) ->
sig = @_computeSignature(options, xhrOptions)
result = 'SharedKey ' + options.storageAccount + ':' + sig
result
_computeSignature: (options, xhrOptions) ->
sigString = @_getSignatureString(options, xhrOptions)
key = CryptoJS.enc.Base64.parse(options.primaryKey)
hmac = CryptoJS.algo.HMAC.create(CryptoJS.algo.SHA256, key)
hmac.update sigString
hash = hmac.finalize()
result = hash.toString(CryptoJS.enc.Base64)
result
any ideas?
EDIT:
1) All the code for authorizationHeader:
https://gist.github.com/F4Ke/88debcede3b7e2312b11
2) Error response:
PUT https://MYACCOUNT.blob.core.windows.net/data-test 403 (Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.)
3) Output of _getCanonicalizedHeadersString:
x-ms-blob-type:BlockBlob
x-ms-date:Mon, 19 Oct 2015 14:33:15 GMT
x-ms-version:2009-09-19
Output of _getSignatureString:
PUT
application/octet-stream
x-ms-blob-type:BlockBlob
x-ms-date:Mon, 19 Oct 2015 14:35:19 GMT
x-ms-version:2009-09-19
/MYACCOUNT/MYACCOUNT.blob.core.windows.net/data-test
The REST documentation describing how to sign a request can be found here. You can also take a look at the Node.js shared key implementation here; it might help you as well.