Saving the request sent in a variable in Tavern

I have the below tavern test file:
test_name: Create a new book and get the book

stages:
  - name: Create a book
    request:
      url: http://localhost:8000/books
      method: POST
      json:
        name: "Tell me your dreams"
        author: "Sidney Sheldon"
        genres:
          - Fiction
          - Thriller
        published_year: "1997"
        description: "Some description"
    response:
      status_code: 201
      save:
        json:
          book_link: link
          inserted_book: "{tavern.request_vars.json}"
  - name: Get a non existent book
    request:
      url: http://localhost:8000/book/123
      method: GET
    response:
      status_code: 400
  - name: Get the correct book
    request:
      url: "{book_link}"
      method: GET
    response:
      status_code: 200
      verify_response_with:
        function: integration_utils:validate_book_response
        extra_kwargs:
          inserted_book: "{inserted_book}"
  - name: Delete the created book
    request:
      url: "{book_link}"
      method: DELETE
    response:
      status_code: 200
And my function for checking the response for the book is like so:

import json
import string

def validate_book_response(response, inserted_book):
    response_data = response.json()
    replaced_inserted_data = inserted_book.replace("\'", "\"")
    inserted_data = json.loads(replaced_inserted_data)
    for key in inserted_data:
        if key == "author":
            assert response_data.get("author") == string.capwords(inserted_data.get("author"))
        else:
            assert response_data.get(key) == inserted_data.get(key)
    assert "link" in response_data
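As the warning in the log below notes, the saved request body is coerced to a string (the repr of a Box object), which is why the function above swaps single quotes for double quotes before calling json.loads. That swap breaks as soon as any value contains an apostrophe; since the coerced string is a Python repr, ast.literal_eval (stdlib) is a sketch of a sturdier parse. The payload here is a hypothetical example:

```python
import ast

# String repr of a dict, as Tavern would coerce it (hypothetical payload).
inserted_book = "{'name': 'Tell me your dreams', 'author': 'Sidney Sheldon'}"

# literal_eval parses Python literal syntax directly -- no quote juggling,
# and values containing apostrophes survive intact.
inserted_data = ast.literal_eval(inserted_book)
print(inserted_data["author"])  # Sidney Sheldon
```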
However it seems like the variable inserted_book is not saved, as I get this error:
test_create_and_get_the_book.tavern.yaml::Create a new book and get the book FAILED
======================================================================= FAILURES =======================================================================
__ /home/subhayan/Codes/mongo-based-book-store-fastapi/tests/integration/test_create_and_get_the_book.tavern.yaml::Create a new book and get the book __
Format variables:
book_link = 'http://127.0.0.1:8000/book/59a507de-0bd1-11ec-8d37-0242ac180003'
inserted_book = '???'
Source test stage (line 27):
- name: Get the correct book
request:
url: "{book_link}"
method: GET
response:
status_code: 200
verify_response_with:
function: integration_utils:validate_book_response
extra_kwargs:
inserted_book: "{inserted_book}"
Missing format vars for stage
Errors:
E tavern.util.exceptions.MissingFormatError: inserted_book
------------------------------------------------------------------ Captured log call -------------------------------------------------------------------
WARNING tavern.util.dict_util:dict_util.py:45 Formatting 'tavern.request_vars.json' will result in it being coerced to a string (it is a <class 'box.box.Box'>)
ERROR tavern.util.dict_util:dict_util.py:35 Failed to resolve string '{inserted_book}' with variables '{'tavern': {'env_vars': {'SHELL': '/bin/bash', 'SESSION_MANAGER': 'local/subhayan-SCHENKER-SLIM14-SSL14L19:#/tmp/.ICE-unix/1965,unix/subhayan-SCHENKER-SLIM14-SSL14L19:/tmp/.ICE-unix/1965', 'QT_ACCESSIBILITY': '1', 'COLORTERM': 'truecolor', 'XDG_CONFIG_DIRS': '/etc/xdg/xdg-ubuntu:/etc/xdg', 'XDG_MENU_PREFIX': 'gnome-', 'GNOME_DESKTOP_SESSION_ID': 'this-is-deprecated', 'CONDA_EXE': '/home/subhayan/anaconda3/bin/conda', '_CE_M': '', 'MANDATORY_PATH': '/usr/share/gconf/ubuntu.mandatory.path', 'LC_ADDRESS': 'en_GB.UTF-8', 'GNOME_SHELL_SESSION_MODE': 'ubuntu', 'LC_NAME': 'en_GB.UTF-8', 'SSH_AUTH_SOCK': '/run/user/1000/keyring/ssh', 'XMODIFIERS': '#im=ibus', 'DESKTOP_SESSION': 'ubuntu', 'LC_MONETARY': 'en_GB.UTF-8', 'SSH_AGENT_PID': '1921', 'GTK_MODULES': 'gail:atk-bridge', 'PWD': '/home/subhayan/Codes/mongo-based-book-store-fastapi/tests/integration', 'LOGNAME': 'subhayan', 'XDG_SESSION_DESKTOP': 'ubuntu', 'XDG_SESSION_TYPE': 'x11', 'GPG_AGENT_INFO': '/run/user/1000/gnupg/S.gpg-agent:0:1', 'XAUTHORITY': '/run/user/1000/gdm/Xauthority', 'GJS_DEBUG_TOPICS': 'JS ERROR;JS LOG', 'WINDOWPATH': '2', 'HOME': '/home/subhayan', 'USERNAME': 'subhayan', 'IM_CONFIG_PHASE': '1', 'LC_PAPER': 'en_GB.UTF-8', 'LANG': 'en_US.UTF-8', 'LS_COLORS': 
'rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:mi=00:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arc=01;31:*.arj=01;31:*.taz=01;31:*.lha=01;31:*.lz4=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.tzo=01;31:*.t7z=01;31:*.zip=01;31:*.z=01;31:*.dz=01;31:*.gz=01;31:*.lrz=01;31:*.lz=01;31:*.lzo=01;31:*.xz=01;31:*.zst=01;31:*.tzst=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.alz=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.cab=01;31:*.wim=01;31:*.swm=01;31:*.dwm=01;31:*.esd=01;31:*.jpg=01;35:*.jpeg=01;35:*.mjpg=01;35:*.mjpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=00;36:*.au=00;36:*.flac=00;36:*.m4a=00;36:*.mid=00;36:*.midi=00;36:*.mka=00;36:*.mp3=00;36:*.mpc=00;36:*.ogg=00;36:*.ra=00;36:*.wav=00;36:*.oga=00;36:*.opus=00;36:*.spx=00;36:*.xspf=00;36:', 'XDG_CURRENT_DESKTOP': 'ubuntu:GNOME', 'VTE_VERSION': '6003', 'GNOME_TERMINAL_SCREEN': '/org/gnome/Terminal/screen/78a1cd47_1f03_4fa3_aebf_60ffa123f941', 'INVOCATION_ID': 'fb17b0ca1013400ca6df59cfad054277', 'MANAGERPID': '1739', 'GJS_DEBUG_OUTPUT': 'stderr', 'LESSCLOSE': '/usr/bin/lesspipe %s %s', 'XDG_SESSION_CLASS': 'user', 'TERM': 'xterm-256color', 'LC_IDENTIFICATION': 'en_GB.UTF-8', '_CE_CONDA': '', 'DEFAULTS_PATH': '/usr/share/gconf/ubuntu.default.path', 'LESSOPEN': '| /usr/bin/lesspipe 
%s', 'USER': 'subhayan', 'GNOME_TERMINAL_SERVICE': ':1.153', 'CONDA_SHLVL': '1', 'DISPLAY': ':0', 'SHLVL': '1', 'LC_TELEPHONE': 'en_GB.UTF-8', 'QT_IM_MODULE': 'ibus', 'LC_MEASUREMENT': 'en_GB.UTF-8', 'PAPERSIZE': 'a4', 'CONDA_PYTHON_EXE': '/home/subhayan/anaconda3/bin/python', 'XDG_RUNTIME_DIR': '/run/user/1000', 'LC_TIME': 'en_GB.UTF-8', 'JOURNAL_STREAM': '8:51643', 'XDG_DATA_DIRS': '/usr/share/ubuntu:/usr/local/share/:/usr/share/:/var/lib/snapd/desktop', 'PATH': '/home/subhayan/anaconda3/envs/mongo-book-store-fastapi/bin:/home/subhayan/anaconda3/condabin:/home/subhayan/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/local/go/bin:/home/subhayan/.dotfiles/submodules/antigen/bundles/robbyrussell/oh-my-zsh/lib:/home/subhayan/.dotfiles/submodules/antigen/bundles/esc/conda-zsh-completion:/home/subhayan/.dotfiles/submodules/antigen/bundles/robbyrussell/oh-my-zsh/plugins/git:/home/subhayan/.dotfiles/submodules/antigen/bundles/robbyrussell/oh-my-zsh/plugins/git-auto-fetch:/home/subhayan/.dotfiles/submodules/antigen/bundles/robbyrussell/oh-my-zsh/plugins/vi-mode:/home/subhayan/.dotfiles/submodules/antigen/bundles/zsh-users/zsh-autosuggestions:/home/subhayan/.dotfiles/submodules/antigen/bundles/zsh-users/zsh-syntax-highlighting:/home/subhayan/.dotfiles/submodules/antigen/bundles/zsh-users/zsh-history-substring-search:/home/subhayan/.dotfiles/submodules/antigen/bundles/romkatv/powerlevel10k', 'GDMSESSION': 'ubuntu', 'DBUS_SESSION_BUS_ADDRESS': 'unix:path=/run/user/1000/bus', 'LC_NUMERIC': 'en_GB.UTF-8', 'OLDPWD': '/home/subhayan/Codes/mongo-based-book-store-fastapi', 'P9K_TTY': 'old', 'PAGER': 'less', 'LESS': '-R', 'LSCOLORS': 'Gxfxcxdxbxegedabagacad', 'P9K_SSH': '0', 'CONDA_PREFIX': '/home/subhayan/anaconda3/envs/mongo-book-store-fastapi', 'CONDA_DEFAULT_ENV': 'mongo-book-store-fastapi', 'CONDA_PROMPT_MODIFIER': '(mongo-book-store-fastapi) ', '_': 
'/home/subhayan/anaconda3/envs/mongo-book-store-fastapi/bin/python', 'MONGODB_URI': 'mongodb://server:27017', 'BASE_URI': 'http://127.0.0.1:8000/', 'PYTEST_CURRENT_TEST': 'test_create_and_get_the_book.tavern.yaml::Create a new book and get the book (call)'}, 'request_vars': {'method': 'GET', 'url': 'http://127.0.0.1:8000/book/59a507de-0bd1-11ec-8d37-0242ac180003', 'verify': True, 'stream': False, 'allow_redirects': False}}, 'book_link': 'http://127.0.0.1:8000/book/59a507de-0bd1-11ec-8d37-0242ac180003'}'
ERROR tavern.util.dict_util:dict_util.py:38 Key(s) not found in format: inserted_book
Can someone please tell me how I can save the request sent in an earlier stage in a variable and then use it in a later stage?
Thanks in advance.

I would say inserted_book is out of indentation by 2 spaces.
Yours:

save:
  json:
    book_link: link
    inserted_book: "{tavern.request_vars.json}"

should be:

save:
  json:
    book_link: link
  inserted_book: "{tavern.request_vars.json}"
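For intuition, the MissingFormatError is essentially Python's own string formatting failing on an absent key: because of the indentation, inserted_book never lands in the saved-variables dict, so the later stage's "{inserted_book}" template has nothing to substitute. A minimal stdlib sketch of the same failure, with hypothetical values:

```python
# Saved variables after the first stage, if inserted_book was swallowed
# by the wrong indentation level (hypothetical values):
saved_vars = {"book_link": "http://127.0.0.1:8000/book/123"}

template = "{inserted_book}"
try:
    template.format(**saved_vars)
except KeyError as missing:
    print("Missing format var:", missing)  # Missing format var: 'inserted_book'
```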

Related

Axios post request is being denied by flask made api. CORS error coming up [duplicate]

This question already has an answer here:
cors enable in Request header field Access-Control-Allow-Origin is not allowed by Access-Control-Allow-Headers in preflight response
(1 answer)
Closed 2 years ago.
I am trying to add a user from my react application through an API made with flask, but the post request fails with the following error:
'http://localhost:5000/api/v1.0/add' from origin 'http://localhost:3000' has been blocked by CORS policy: Request header field access-control-allow-origin is not allowed by Access-Control-Allow-Headers in preflight response.
my axios code is the following:

const params = {
    user_name : '5678234121',
    passwd : 'password',
    location : 'Kolkata',
    pin_code : '700019',
    secret_ques: 'What is your mother s maiden name?',
    answr : 'aba',
    status : 'Active',
    remarks : 'test data'
};

const res = await Axios.post(
    'http://localhost:5000/api/v1.0/add', params, {
        headers: {
            'content-type': 'application/json',
            'Access-Control-Allow-Origin' : '*',
            'Access-Control-Allow-Methods' : 'GET,PUT,POST,DELETE,PATCH,OPTIONS',
        },
    });
console.log(res.data);
my flask code is the following:

@app.route('/api/v1.0/add', methods=["POST"])
def add():
    con = None
    db = datadir + datafile
    try:
        _json = request.json
        _name = _json['user_name']
        _psswd = _json['passwd']
        _locatn = _json['location']
        _pincd = _json['pin_code']
        _secrt = _json['secret_ques']
        _ans = _json['answr']
        _stat = _json['status']
        _remks = _json['remarks']
        # validate the received values
        if _name and _psswd and _pincd and request.method == 'POST':
            # do not save password as a plain text
            _hashed_password = base64.b64encode(_psswd.encode("utf-8"))
            # save edits
            sql = '''INSERT INTO user_mast(user_name, passwd, location, pin_code, secret_ques, answr, status, remarks ) VALUES (?, ?, ?, ?, ?, ?, ?, ?)'''
            data = (_name, _hashed_password.decode('ASCII'), _locatn, _pincd, _secrt, _ans, _stat, _remks, )
            con = sqlite3.connect(db)  # Connection to database
            cur = con.cursor()
            cur.execute(sql, data)
            con.commit()
            resp = jsonify({'Status': 'User added successfully!'})
            resp.status_code = 200
        else:
            resp = jsonify({'Status': 'Mandatory fields: Name,Password,Pincode missing..'})
            resp.status_code = 502
    except sqlite3.Error as e:
        resp = jsonify({'Status': 'Database Error'})
        resp.status_code = 500
    except Exception as e:
        print(e)
        resp = jsonify({'Status': 'Unknown Error : Contact Administrator'})
        resp.status_code = 501
    finally:
        cur.close()
        con.close()
    return resp
Please help me fix this error; I'm clueless about it.
If you're new to this, I'd recommend just adding Flask-CORS to your application and not futzing around with the headers. Note that Access-Control-Allow-Origin and Access-Control-Allow-Methods are response headers set by the server; sending them from axios as request headers is exactly what the preflight rejection is complaining about.

SQL Server using TypeORM: Error Timeout: Request failed to complete when inserting records by batch

The issue I am having is that when I run the migration command I always get an ETIMEOUT error, but if I comment out the await populateTable() call, the code runs without any error.
I already tried to increase the requestTimeout from 15sec to 300sec but that didn't help.
Here's my code:
Create the following:
// ormconfig.json
[
    {
        "name": "app",
        "type": "mssql",
        "host": "127.0.0.1",
        "port": 1433,
        "username": "root",
        "password": "root",
        "database": "app",
        "logging": true,
        "options": {
            "useUTC": true
        },
        "entities": ["src/**/*.entity.ts"],
        "migrations": ["migration/**/*.ts"],
        "cli": {
            "migrationsDir": "migration"
        }
    }
]
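As an aside, the "failed to complete in 15000ms" in the error below is the mssql/tedious driver's default requestTimeout; when raising it for this driver it belongs inside options, in milliseconds. A sketch against the config above (the 300-second value is just the one the poster mentions trying):

```json
{
    "options": {
        "useUTC": true,
        "requestTimeout": 300000
    }
}
```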
// root-dir/src/grouping/grouping.entity.ts
import { Column, Entity, OneToMany, PrimaryGeneratedColumn } from 'typeorm';

@Entity({ name: 'grouping' })
export class Grouping {
    @PrimaryGeneratedColumn({ name: 'id', unsigned: true })
    id: number;

    @Column({ name: 'name', unique: true })
    name: string;
}
// root-dir/migration/tables/grouping.ts
import { getRepository, QueryRunner, Table, TableIndex } from 'typeorm';
import { Grouping } from '../../src/grouping/grouping.entity';

export async function up(queryRunner: QueryRunner): Promise<any> {
    await createTable(queryRunner);
    await createIndexes(queryRunner);
    await populateTable();
}

export async function down(queryRunner: QueryRunner): Promise<any> {
    await queryRunner.dropIndex('grouping', 'IDX_GROUP');
    await queryRunner.dropTable('grouping');
}

async function createTable(queryRunner: QueryRunner) {
    return queryRunner.createTable(
        new Table({
            name: 'grouping',
            columns: [
                {
                    name: 'id',
                    type: 'integer',
                    isPrimary: true,
                    isGenerated: true,
                    generationStrategy: 'increment',
                    unsigned: true,
                },
                {
                    name: 'name',
                    type: 'varchar',
                    isUnique: true,
                },
            ],
        }),
        true,
    );
}

async function createIndexes(queryRunner: QueryRunner) {
    return await queryRunner.createIndex(
        'grouping',
        new TableIndex({
            name: 'IDX_GROUP',
            columnNames: ['name'],
        }),
    );
}

async function populateTable() {
    await getRepository(Grouping, 'app').save([{ name: 'classification' }, { name: 'categorization' }]);
}
// root-dir/migration/initial-migration.ts
import { MigrationInterface, QueryRunner } from 'typeorm';
import * as groupingTable from './tables/grouping';

export class InitialMigration1550229771145 implements MigrationInterface {
    async up(queryRunner: QueryRunner): Promise<any> {
        await groupingTable.up(queryRunner);
    }

    async down(queryRunner: QueryRunner): Promise<any> {
        await groupingTable.down(queryRunner);
    }
}
Run the migration command.
ts-node ./node_modules/typeorm/cli.js migration:run -c
This should create the Grouping table and insert 2 records but what I am getting is this error:
Query: CREATE TABLE "grouping" ("id" integer NOT NULL IDENTITY(1,1), "name" varchar(255) NOT NULL, CONSTRAINT "UQ_07314fe287a837177015c041131" UNIQUE ("name"), CONSTRAINT "PK_135d73da7246e0250716afdc0ab" PRIMARY KEY ("id"))
query: SELECT SCHEMA_NAME() AS "schema_name"
query: SELECT DB_NAME() AS "db_name"
query: SELECT * FROM "app"."INFORMATION_SCHEMA"."TABLES" WHERE ("TABLE_SCHEMA" = 'dbo' AND "TABLE_NAME" = 'grouping')
query: SELECT * FROM "app"."INFORMATION_SCHEMA"."COLUMNS" WHERE ("TABLE_SCHEMA" = 'dbo' AND "TABLE_NAME" = 'grouping')
query: SELECT "columnUsages".*, "tableConstraints"."CONSTRAINT_TYPE", "chk"."definition" FROM "app"."INFORMATION_SCHEMA"."CONSTRAINT_COLUMN_USAGE" "columnUsages" INNER JOIN "app"."INFORMATION_SCHEMA"."TABLE_CONSTRAINTS" "tableConstraints" ON "tableConstraints"."CONSTRAINT_NAME" = "columnUsages"."CONSTRAINT_NAME" LEFT JOIN "app"."sys"."check_constraints" "chk" ON "chk"."name" = "columnUsages"."CONSTRAINT_NAME" WHERE (("columnUsages"."TABLE_SCHEMA" = 'dbo' AND "columnUsages"."TABLE_NAME" = 'grouping' AND "tableConstraints"."TABLE_SCHEMA" = 'dbo' AND "tableConstraints"."TABLE_NAME" = 'grouping')) AND "tableConstraints"."CONSTRAINT_TYPE" IN ('PRIMARY KEY', 'UNIQUE', 'CHECK')
query: SELECT "fk"."name" AS "FK_NAME", 'app' AS "TABLE_CATALOG", "s1"."name" AS "TABLE_SCHEMA", "t1"."name" AS "TABLE_NAME", "col1"."name" AS "COLUMN_NAME", "s2"."name" AS "REF_SCHEMA", "t2"."name" AS "REF_TABLE", "col2"."name" AS "REF_COLUMN", "fk"."delete_referential_action_desc" AS "ON_DELETE", "fk"."update_referential_action_desc" AS "ON_UPDATE" FROM "app"."sys"."foreign_keys" "fk" INNER JOIN "app"."sys"."foreign_key_columns" "fkc" ON "fkc"."constraint_object_id" = "fk"."object_id" INNER JOIN "app"."sys"."tables" "t1" ON "t1"."object_id" = "fk"."parent_object_id" INNER JOIN "app"."sys"."schemas" "s1" ON "s1"."schema_id" = "t1"."schema_id" INNER JOIN "app"."sys"."tables" "t2" ON "t2"."object_id" = "fk"."referenced_object_id" INNER JOIN "app"."sys"."schemas" "s2" ON "s2"."schema_id" = "t2"."schema_id" INNER JOIN "app"."sys"."columns" "col1" ON "col1"."column_id" = "fkc"."parent_column_id" AND "col1"."object_id" = "fk"."parent_object_id" INNER JOIN "app"."sys"."columns" "col2" ON "col2"."column_id" = "fkc"."referenced_column_id" AND "col2"."object_id" = "fk"."referenced_object_id"
query: SELECT "TABLE_CATALOG", "TABLE_SCHEMA", "COLUMN_NAME", "TABLE_NAME" FROM "app"."INFORMATION_SCHEMA"."COLUMNS" WHERE COLUMNPROPERTY(object_id("TABLE_CATALOG" + '.' + "TABLE_SCHEMA" + '.' + "TABLE_NAME"), "COLUMN_NAME", 'IsIdentity') = 1 AND "TABLE_SCHEMA" IN ('dbo')
query: SELECT "NAME", "COLLATION_NAME" FROM "sys"."databases"
query: SELECT 'app' AS "TABLE_CATALOG", "s"."name" AS "TABLE_SCHEMA", "t"."name" AS "TABLE_NAME", "ind"."name" AS "INDEX_NAME", "col"."name" AS "COLUMN_NAME", "ind"."is_unique" AS "IS_UNIQUE", "ind"."filter_definition" as "CONDITION" FROM "app"."sys"."indexes" "ind" INNER JOIN "app"."sys"."index_columns" "ic" ON "ic"."object_id" = "ind"."object_id" AND "ic"."index_id" = "ind"."index_id" INNER JOIN "app"."sys"."columns" "col" ON "col"."object_id" = "ic"."object_id" AND "col"."column_id" = "ic"."column_id" INNER JOIN "app"."sys"."tables" "t" ON "t"."object_id" = "ind"."object_id" INNER JOIN "app"."sys"."schemas" "s" ON "s"."schema_id" = "t"."schema_id" WHERE "ind"."is_primary_key" = 0 AND "ind"."is_unique_constraint" = 0 AND "t"."is_ms_shipped" = 0
query: CREATE INDEX "IDX_GROUP" ON "grouping" ("name")
query: BEGIN TRANSACTION
query: INSERT INTO "grouping"("name") OUTPUT INSERTED."id" VALUES (@0), (@1) -- PARAMETERS: [{"value":"classification","type":"nvarchar","params":[]},{"value":"categorization","type":"nvarchar","params":[]}]
query failed: INSERT INTO "grouping"("name") OUTPUT INSERTED."id" VALUES (@0), (@1) -- PARAMETERS: [{"value":"classification","type":"nvarchar","params":[]},{"value":"categorization","type":"nvarchar","params":[]}]
error: { RequestError: Timeout: Request failed to complete in 15000ms
at Request.tds.Request.err [as userCallback] (C:\Users\me\Workspace\app\api\node_modules\mssql\lib\tedious.js:629:19)
at Request.callback (C:\Users\me\Workspace\app\api\node_modules\tedious\lib\request.js:37:27)
at Connection.message (C:\Users\me\Workspace\app\api\node_modules\tedious\lib\connection.js:2136:24)
at Connection.dispatchEvent (C:\Users\me\Workspace\app\api\node_modules\tedious\lib\connection.js:1084:36)
at MessageIO.messageIo.on (C:\Users\me\Workspace\app\api\node_modules\tedious\lib\connection.js:984:14)
at MessageIO.emit (events.js:189:13)
at MessageIO.EventEmitter.emit (domain.js:441:20)
at Message.message.on (C:\Users\me\Workspace\app\api\node_modules\tedious\lib\message-io.js:32:14)
at Message.emit (events.js:194:15)
at Message.EventEmitter.emit (domain.js:441:20)
code: 'ETIMEOUT',
number: 'ETIMEOUT',
state: undefined,
originalError:
{ RequestError: Timeout: Request failed to complete in 15000ms
at RequestError (C:\Users\me\Workspace\app\api\node_modules\tedious\lib\errors.js:32:12)
at Connection.message (C:\Users\me\Workspace\app\api\node_modules\tedious\lib\connection.js:2136:33)
at Connection.dispatchEvent (C:\Users\me\Workspace\app\api\node_modules\tedious\lib\connection.js:1084:36)
at MessageIO.messageIo.on (C:\Users\me\Workspace\app\api\node_modules\tedious\lib\connection.js:984:14)
at MessageIO.emit (events.js:189:13)
at MessageIO.EventEmitter.emit (domain.js:441:20)
at Message.message.on (C:\Users\me\Workspace\app\api\node_modules\tedious\lib\message-io.js:32:14)
at Message.emit (events.js:194:15)
at Message.EventEmitter.emit (domain.js:441:20)
at endReadableNT (C:\Users\me\Workspace\app\api\node_modules\tedious\node_modules\readable-stream\lib\_stream_readable.js:1077:12)
message: 'Timeout: Request failed to complete in 15000ms',
code: 'ETIMEOUT' },
name: 'RequestError',
precedingErrors: [] }
query: ROLLBACK
query: ROLLBACK
Error during migration run:
{ QueryFailedError: Error: Timeout: Request failed to complete in 15000ms
at new QueryFailedError (C:\Users\me\Workspace\app\api\src\error\QueryFailedError.ts:9:9)
at C:\Users\me\Workspace\app\api\src\driver\sqlserver\SqlServerQueryRunner.ts:221:37
at _query (C:\Users\me\Workspace\app\api\node_modules\mssql\lib\base.js:1346:25)
at Request.tds.Request.err [as userCallback] (C:\Users\me\Workspace\app\api\node_modules\mssql\lib\tedious.js:671:15)
at Request.callback (C:\Users\me\Workspace\app\api\node_modules\tedious\lib\request.js:37:27)
at Connection.message (C:\Users\me\Workspace\app\api\node_modules\tedious\lib\connection.js:2136:24)
at Connection.dispatchEvent (C:\Users\me\Workspace\app\api\node_modules\tedious\lib\connection.js:1084:36)
at MessageIO.messageIo.on (C:\Users\me\Workspace\app\api\node_modules\tedious\lib\connection.js:984:14)
at MessageIO.emit (events.js:189:13)
at MessageIO.EventEmitter.emit (domain.js:441:20)
message: 'Error: Timeout: Request failed to complete in 15000ms',
code: 'ETIMEOUT',
number: 'ETIMEOUT',
state: undefined,
originalError:
{ RequestError: Timeout: Request failed to complete in 15000ms
at RequestError (C:\Users\me\Workspace\app\api\node_modules\tedious\lib\errors.js:32:12)
at Connection.message (C:\Users\me\Workspace\app\api\node_modules\tedious\lib\connection.js:2136:33)
at Connection.dispatchEvent (C:\Users\me\Workspace\app\api\node_modules\tedious\lib\connection.js:1084:36)
at MessageIO.messageIo.on (C:\Users\me\Workspace\app\api\node_modules\tedious\lib\connection.js:984:14)
at MessageIO.emit (events.js:189:13)
at MessageIO.EventEmitter.emit (domain.js:441:20)
at Message.message.on (C:\Users\me\Workspace\app\api\node_modules\tedious\lib\message-io.js:32:14)
at Message.emit (events.js:194:15)
at Message.EventEmitter.emit (domain.js:441:20)
at endReadableNT (C:\Users\me\Workspace\app\api\node_modules\tedious\node_modules\readable-stream\lib\_stream_readable.js:1077:12)
message: 'Timeout: Request failed to complete in 15000ms',
code: 'ETIMEOUT' },
name: 'QueryFailedError',
precedingErrors: [],
query:
'INSERT INTO "grouping"("name") OUTPUT INSERTED."id" VALUES (@0), (@1)',
parameters:
[ MssqlParameter { value: 'classification', type: 'nvarchar', params: [] },
MssqlParameter { value: 'categorization', type: 'nvarchar', params: [] } ] }
npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! app-api@0.1.0 migration:run: ts-node ./node_modules/typeorm/cli.js migration:run -c "app_engine"
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the app-api@0.1.0 migration:run script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
npm ERR! A complete log of this run can be found in:
npm ERR! C:\Users\me\AppData\Roaming\npm-cache\_logs\2019-06-04T18_56_36_972Z-debug.log
Any help is greatly appreciated!
Update:
I found this https://github.com/typeorm/typeorm/issues/3100#issuecomment-446309812 while browsing through the previous issues and it is now working.
Try using a legacy version of the mssql driver: npm i mssql@5.1.1 --save

How can I create a user with the WordPress REST API?

I need to create a user on WordPress. I use the WP REST API, which is the default API for WP. You can look at it at "YOUR_SITE/wp-json/".
I have an ionic3 project with the following function:
onSubmit(values){
    this.http.post(Config.WORDPRESS_URL + 'wp-json/jwt-auth/v1/token',{
        username: 'admin',
        password: 'pass'
    })
    .subscribe(
        res => {
            let token = res.json().token;
            let header : Headers = new Headers();
            header.append('Authorization','Basic ' + token);
            this.http.post(Config.WORDPRESS_REST_API_URL + 'users?token=' + res.json().token,{
                username: values.username,
                name: values.displayName,
                email: values.email,
                password: values.password,
            },header)
            .subscribe(
                result => {
                    console.log(result.json());
                },
                error => {
                    console.log(error.json());
                }
            )
        },
        err => {
            console.log(err);
        }
    )
}
But I always get this error:
code: "rest_cannot_create_user",
message: "Sorry, but you can't create new user"
status: 401
admin:pass is an admin account on the site and has the admin role.
Also I added to .htaccess
SetEnvIf Authorization "(.*)" HTTP_AUTHORIZATION=$1
RewriteCond %{HTTP:Authorization} ^(.*)
RewriteRule ^(.*) - [E=HTTP_AUTHORIZATION:%1]
Please help me to find the mistake.
I found my issue and solution. I deleted the TOKEN param from the URL and created an options header:

let header = new Headers({"Authorization": "Bearer " + token});
let options = new RequestOptions({headers: header});

My request is now the following:

this.http.post(Config.REGISTER, {
    username: username,
    name: displayName,
    email: email,
    password: password,
    nonce: nonce
}, options)

It works for me.

Angularjs Fileupload change upload URL?

I use angular to upload the file as below. If I run the following code, I get 403 (Forbidden):
var contextRoot = "http://localhost\\:6060/nomunoli";
...
...
uploadFile : function(taglist, description, userid, file) {
    return $upload.upload({
        url : contextRoot + "/auth/insertNMNL001FUN02",
        fields : {
            'description' : description,
            'userId' : userid,
            'tagList' : taglist,
        },
        file : file,
        fileFormDataName : 'file'
    });
},
...
...
In debugger console
POST http://localhost/:6060/nomunoli/auth/insertNMNL001FUN02 403 (Forbidden)
b @ angular.min.js:79
s @ angular.min.js:74
c.$get.f @ angular.min.js:71
l.promise.then.J @ angular.min.js:101
(anonymous function) @ angular.min.js:102
a.$get.h.$eval @ angular.min.js:113
a.$get.h.$digest @ angular.min.js:110
a.$get.h.$apply @ angular.min.js:113
(anonymous function) @ angular.min.js:195
n.event.dispatch @ jquery-2.1.3.min.js:3
n.event.add.r.handle @ jquery-2.1.3.min.js:3
When I change the code as below, it is OK.
...
uploadFile : function(taglist, description, userid, file) {
    return $upload.upload({
        url : "http://localhost\\:6060/nomunoli/auth/insertNMNL001FUN02",
        fields : {
            'description' : description,
            'userId' : userid,
            'tagList' : taglist,
        },
        file : file,
        fileFormDataName : 'file'
    });
},
...

Specifying bigquery table schema that resides on a file in a multipart http request

I have a text file schema.txt in which the schema for the table that I want to create is defined.
I want to include this file in the multipart HTTP request that I'm using to create my table.
How do I specify the schema.txt file in the multipart HTTP request?
Below is what I'm currently doing (not working though):
def loadTable(service, projectId, datasetId, targetTableId, sourceCsv, filenm):
    try:
        jobCollection = service.jobs()
        jobData = {
            'projectId': projectId,
            'configuration': {
                'load': {
                    'sourceUris': [sourceCsv],
                    'schema': filenm,
                    'destinationTable': {
                        'projectId': projectId,
                        'datasetId': datasetId,
                        'tableId': targetTableId
                    },
                    'createDisposition': 'CREATE_IF_NEEDED',
                    'writeDisposition': 'WRITE_TRUNCATE',
                    'encoding': 'UTF-8'
                }
            }
        }
Where filenm will be 'schema.txt'.
I know I can specify the schema directly as:
'schema': {
    'fields': [
        {
            'name': 'level',
            'type': 'STRING',
        },
        {
            'name': 'message',
            'type': 'STRING',
        }
    ]
},
But instead I want to specify the file containing the schema.
Hmm, not sure why you need a "multipart HTTP request" unless you are ingesting directly from a file. Here you are specifying a CSV input, indicating a Cloud Storage object.
See here for more info:
https://developers.google.com/bigquery/docs/developers_guide#storageimport
In any case, this is not really a BigQuery question, more of a Python question. Do you mean this?
import json

def loadTable(project_id, dataset_id, target_table, source_csv, filename):
    # Read the schema fragment from the file and wrap it in braces
    with open(filename, 'r') as f:
        schema = f.read()
    schema_json = json.loads('{%s}' % schema)
    job_data = {
        "projectId": project_id,
        "configuration": {
            "load": {
                "sourceUris": [source_csv],
                "schema": schema_json,
                "destinationTable": {
                    "projectId": project_id,
                    "datasetId": dataset_id,
                    "tableId": target_table
                },
                "createDisposition": "CREATE_IF_NEEDED",
                "writeDisposition": "WRITE_TRUNCATE",
                "encoding": "UTF-8"
            }
        }
    }
    print(json.dumps(job_data, indent=2))

loadTable('project_id', 'dataset_id', 'target_table', 'source_csv', '/tmp/schema.txt')
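To make the file-reading step concrete, here is a self-contained sketch of the same trick with a temporary stand-in for schema.txt. It follows the answer's assumption that the file holds the "fields": [...] body without the outer braces; the field names are hypothetical:

```python
import json
import os
import tempfile

# Write a stand-in schema.txt (normally this file already exists on disk).
schema_text = '"fields": [{"name": "level", "type": "STRING"}, {"name": "message", "type": "STRING"}]'
fd, path = tempfile.mkstemp(suffix=".txt")
with os.fdopen(fd, "w") as f:
    f.write(schema_text)

# Read it back and wrap it in braces, exactly as in the answer above.
with open(path) as f:
    schema_json = json.loads("{%s}" % f.read())

print(schema_json["fields"][0]["name"])  # level
os.remove(path)
```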
