DeadlineExceededError despite using ndb.put_multi_async - google-app-engine

I have the following POST method in my API. After sending a big JSON collection of around 350 entries to the API, I get a DeadlineExceededError exception.
I am already using ndb.put_multi_async. I don't know what else I could do to speed this up.
def post(self):
    arguments = self.reqparser.parse_args()
    json_records = arguments.get('records')
    user = User.query(User.email == request.authorization.username).get()
    if user:
        records_put_list = []
        events_put_list = []
        for json_record in json_records:
            record_id = json_record['record_id']
            rec = Record.get_or_insert(record_id,
                                       user=user.key,
                                       record_date=record_date,
                                       timestamp=record_timestamp)
            for json_event in json_record['events']:
                event = Event.get_or_insert(json_event['event_id'],
                                            parent=rec.key,
                                            user=user.key,
                                            is_deleted=json_event['is_deleted'])
                if event.timestamp < json_event['timestamp']:
                    event.user = user.key
                    event.record = rec.key
                    event.date_time = event_dt
                    event.timestamp = json_event['timestamp']
                    event.parent = rec.key
                    events_put_list.append(event)
        ndb.put_multi_async(records_put_list)
        ndb.put_multi_async(events_put_list)
        return '', 201
    else:
        return '', 401
This is the exception. Any advice on what I could do?
Traceback (most recent call last):
File "/base/data/home/runtimes/python27/python27_lib/versions/1/google/appengine/runtime/wsgi.py", line 266, in Handle
result = handler(dict(self._environ), self._StartResponse)
File "/base/data/home/apps/s~feeltracker1/1-0-1-0.376950929493492366/lib/flask/app.py", line 1836, in __call__
return self.wsgi_app(environ, start_response)
File "/base/data/home/apps/s~feeltracker1/1-0-1-0.376950929493492366/lib/flask/app.py", line 1817, in wsgi_app
response = self.full_dispatch_request()
File "/base/data/home/apps/s~feeltracker1/1-0-1-0.376950929493492366/lib/flask/app.py", line 1475, in full_dispatch_request
rv = self.dispatch_request()
File "/base/data/home/apps/s~feeltracker1/1-0-1-0.376950929493492366/lib/flask/app.py", line 1461, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/base/data/home/apps/s~feeltracker1/1-0-1-0.376950929493492366/lib/flask_restful/__init__.py", line 397, in wrapper
resp = resource(*args, **kwargs)
File "/base/data/home/apps/s~feeltracker1/1-0-1-0.376950929493492366/lib/flask/views.py", line 84, in view
return self.dispatch_request(*args, **kwargs)
File "/base/data/home/apps/s~feeltracker1/1-0-1-0.376950929493492366/lib/flask_restful/__init__.py", line 487, in dispatch_request
resp = meth(*args, **kwargs)
File "/base/data/home/apps/s~feeltracker1/1-0-1-0.376950929493492366/application/http_basic_auth.py", line 36, in decorated
return f(*args, **kwargs)
File "/base/data/home/apps/s~feeltracker1/1-0-1-0.376950929493492366/application/rest_api_view.py", line 362, in post
is_deleted=json_event['is_deleted'])
File "/base/data/home/runtimes/python27/python27_lib/versions/1/google/appengine/ext/ndb/model.py", line 3401, in _get_or_insert
return cls._get_or_insert_async(*args, **kwds).get_result()
File "/base/data/home/runtimes/python27/python27_lib/versions/1/google/appengine/ext/ndb/tasklets.py", line 325, in get_result
self.check_success()
File "/base/data/home/runtimes/python27/python27_lib/versions/1/google/appengine/ext/ndb/tasklets.py", line 320, in check_success
self.wait()
File "/base/data/home/runtimes/python27/python27_lib/versions/1/google/appengine/ext/ndb/tasklets.py", line 304, in wait
if not ev.run1():
File "/base/data/home/runtimes/python27/python27_lib/versions/1/google/appengine/ext/ndb/eventloop.py", line 235, in run1
delay = self.run0()
File "/base/data/home/runtimes/python27/python27_lib/versions/1/google/appengine/ext/ndb/eventloop.py", line 197, in run0
callback(*args, **kwds)
File "/base/data/home/runtimes/python27/python27_lib/versions/1/google/appengine/ext/ndb/tasklets.py", line 474, in _on_future_completion
self._help_tasklet_along(ns, ds_conn, gen, val)
File "/base/data/home/runtimes/python27/python27_lib/versions/1/google/appengine/ext/ndb/tasklets.py", line 371, in _help_tasklet_along
value = gen.send(val)
File "/base/data/home/runtimes/python27/python27_lib/versions/1/google/appengine/ext/ndb/context.py", line 1015, in transaction
ok = yield tconn.async_commit(options)
DeadlineExceededError

You are not using the right logic. Do not put get_or_insert inside the loop: each call is a separate, synchronous datastore round trip (run in its own transaction, as the traceback shows), so it will never work for anything except the smallest number of records, and it is generally an anti-pattern to avoid. Instead, create the entities and append them to the lists inside the loop. Once that is done, put_multi takes each list and handles all the puts in parallel. There is a question about how many items a put_multi list can tolerate, but 350 is far below what I remember as the limit.
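A minimal sketch of that restructuring, inside the handler where user and json_records are already available. It assumes the record_date/timestamp values come from json_record (hypothetical keys), and that it is acceptable to overwrite entities with those ids; unlike get_or_insert, this does not check for an existing entity first.
records_put_list = []
events_put_list = []
for json_record in json_records:
    # Construct the entity locally with an explicit id; no datastore round trip here.
    rec = Record(id=json_record['record_id'],
                 user=user.key,
                 record_date=json_record.get('record_date'),   # assumed source of this value
                 timestamp=json_record.get('timestamp'))       # assumed source of this value
    records_put_list.append(rec)
    for json_event in json_record['events']:
        events_put_list.append(Event(id=json_event['event_id'],
                                     parent=rec.key,
                                     user=user.key,
                                     record=rec.key,
                                     timestamp=json_event['timestamp'],
                                     is_deleted=json_event['is_deleted']))

# Two batched, asynchronous writes instead of one synchronous RPC per entity.
record_futures = ndb.put_multi_async(records_put_list)
event_futures = ndb.put_multi_async(events_put_list)
ndb.Future.wait_all(record_futures + event_futures)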

You can use the Task Queue API to split this processing job into smaller batches (e.g. 100 entries per task; experiment to find the optimum number).
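A hedged sketch of that approach using the deferred library. It assumes the deferred builtin is enabled in app.yaml, and save_batch / build_entities are hypothetical helpers that wrap the entity construction shown above.
from google.appengine.ext import deferred, ndb

BATCH_SIZE = 100  # tune this to stay comfortably under the per-task deadline

def save_batch(user_key, records_chunk):
    # Hypothetical helper: build the Record/Event entities for this chunk
    # (same construction as above) and write them in one batched call.
    entities = build_entities(user_key, records_chunk)
    ndb.put_multi(entities)

# In the POST handler: enqueue one task per chunk and return immediately.
for i in range(0, len(json_records), BATCH_SIZE):
    deferred.defer(save_batch, user.key, json_records[i:i + BATCH_SIZE])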

Related

ODOO 12 server error regarding invoice sequencing

I am trying to change the sequence of my invoicing: instead of resetting it each new year, I want the count to keep going upwards continuously.
For example:
inv/2021/0001   date 1/1/2023   (this one should be 2366)
inv/2021/2365   date 31/12/2022
Researching the subject, I found out that I need to go into Technical -> Sequences to get the invoice numbers I want.
But my problem is that once I click Sequences I get the following server error:
Error:
Odoo Server Error
Traceback (most recent call last):
File "/odoo/odoo-server/odoo/api.py", line 1039, in get
value = self._data[key][field][record._ids[0]]
KeyError: 254
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/odoo/odoo-server/odoo/fields.py", line 981, in __get__
value = record.env.cache.get(record, self)
File "/odoo/odoo-server/odoo/api.py", line 1041, in get
raise CacheMiss(record, field)
odoo.exceptions.CacheMiss: ('ir.sequence(254,).number_next_actual', None)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/odoo/odoo-server/odoo/http.py", line 656, in _handle_exception
return super(JsonRequest, self)._handle_exception(exception)
File "/odoo/odoo-server/odoo/http.py", line 314, in _handle_exception
raise pycompat.reraise(type(exception), exception, sys.exc_info()[2])
File "/odoo/odoo-server/odoo/tools/pycompat.py", line 87, in reraise
raise value
File "/odoo/odoo-server/odoo/http.py", line 698, in dispatch
result = self._call_function(**self.params)
File "/odoo/odoo-server/odoo/http.py", line 346, in _call_function
return checked_call(self.db, *args, **kwargs)
File "/odoo/odoo-server/odoo/service/model.py", line 97, in wrapper
return f(dbname, *args, **kwargs)
File "/odoo/odoo-server/odoo/http.py", line 339, in checked_call
result = self.endpoint(*a, **kw)
File "/odoo/odoo-server/odoo/http.py", line 941, in __call__
return self.method(*args, **kw)
File "/odoo/odoo-server/odoo/http.py", line 519, in response_wrap
response = f(*args, **kw)
File "/odoo/odoo-server/addons/web/controllers/main.py", line 904, in search_read
return self.do_search_read(model, fields, offset, limit, domain, sort)
File "/odoo/odoo-server/addons/web/controllers/main.py", line 926, in do_search_read
offset=offset or 0, limit=limit or False, order=sort or False)
File "/odoo/odoo-server/odoo/models.py", line 4589, in search_read
result = records.read(fields)
File "/odoo/odoo-server/odoo/models.py", line 2791, in read
vals[name] = convert(record[name], record, use_name_get)
File "/odoo/odoo-server/odoo/models.py", line 5117, in __getitem__
return self._fields[key].__get__(self, type(self))
File "/odoo/odoo-server/odoo/fields.py", line 985, in __get__
self.determine_value(record)
File "/odoo/odoo-server/odoo/fields.py", line 1098, in determine_value
self.compute_value(recs)
File "/odoo/odoo-server/odoo/fields.py", line 1052, in compute_value
self._compute_value(records)
File "/odoo/odoo-server/odoo/fields.py", line 1043, in _compute_value
getattr(records, self.compute)()
File "/odoo/odoo-server/odoo/addons/base/models/ir_sequence.py", line 96, in _get_number_next_actual
seq.number_next_actual = _predict_nextval(self, seq_id)
File "/odoo/odoo-server/odoo/addons/base/models/ir_sequence.py", line 68, in _predict_nextval
self.env.cr.execute(query % {'seq_id': seq_id})
File "/odoo/odoo-server/odoo/sql_db.py", line 148, in wrapper
return f(self, *args, **kwargs)
File "/odoo/odoo-server/odoo/sql_db.py", line 225, in execute
res = self._obj.execute(query, params)
psycopg2.ProgrammingError: relation "ir_sequence_1000015" does not exist
LINE 6: FROM ir_sequence_1000015
I believe it could be a database error but I am not sure what this is about. Any idea?
Thanks!

I can't create a new table in sqlite3 from Django

When I added a new field in models.py, its column was not created in sqlite3. What did I do wrong this time? :)
I did run:
python manage.py makemigrations
python manage.py migrate
It happened when I didn't enter proper default values after the message:
Please enter the default value now as valid Python.
The datetime and django.utils.timezone module are available so you can do e.g. timezone.now.
I entered just 1 and pressed Enter. After that it broke.
If you need more information, please let me know.
Environment:
Request Method: POST
Request URL: http://127.0.0.1:8000/admin/meetups/meetup/add/
Django Version: 3.2.8
Python Version: 3.10.0
Installed Applications:
['django.contrib.admin',
'django.contrib.auth',
'django.contrib.contenttypes',
'django.contrib.sessions',
'django.contrib.messages',
'django.contrib.staticfiles',
'meetups']
Installed Middleware:
['django.middleware.security.SecurityMiddleware',
'django.contrib.sessions.middleware.SessionMiddleware',
'django.middleware.common.CommonMiddleware',
'django.middleware.csrf.CsrfViewMiddleware',
'django.contrib.auth.middleware.AuthenticationMiddleware',
'django.contrib.messages.middleware.MessageMiddleware',
'django.middleware.clickjacking.XFrameOptionsMiddleware']
Traceback (most recent call last):
File "C:\install\Projects_1\env\lib\site-packages\django\db\backends\utils.py", line 84, in _execute
return self.cursor.execute(sql, params)
File "C:\install\Projects_1\env\lib\site-packages\django\db\backends\sqlite3\base.py", line 423, in execute
return Database.Cursor.execute(self, query, params)
The above exception (table meetups_meetup has no column named enormus) was the direct cause of the following exception:
File "C:\install\Projects_1\env\lib\site-packages\django\core\handlers\exception.py", line 47, in inner
response = get_response(request)
File "C:\install\Projects_1\env\lib\site-packages\django\core\handlers\base.py", line 181, in _get_response
response = wrapped_callback(request, *callback_args, **callback_kwargs)
File "C:\install\Projects_1\env\lib\site-packages\django\contrib\admin\options.py", line 616, in wrapper
return self.admin_site.admin_view(view)(*args, **kwargs)
File "C:\install\Projects_1\env\lib\site-packages\django\utils\decorators.py", line 130, in _wrapped_view
response = view_func(request, *args, **kwargs)
File "C:\install\Projects_1\env\lib\site-packages\django\views\decorators\cache.py", line 44, in _wrapped_view_func
response = view_func(request, *args, **kwargs)
File "C:\install\Projects_1\env\lib\site-packages\django\contrib\admin\sites.py", line 232, in inner
return view(request, *args, **kwargs)
File "C:\install\Projects_1\env\lib\site-packages\django\contrib\admin\options.py", line 1657, in add_view
return self.changeform_view(request, None, form_url, extra_context)
File "C:\install\Projects_1\env\lib\site-packages\django\utils\decorators.py", line 43, in _wrapper
return bound_method(*args, **kwargs)
File "C:\install\Projects_1\env\lib\site-packages\django\utils\decorators.py", line 130, in _wrapped_view
response = view_func(request, *args, **kwargs)
File "C:\install\Projects_1\env\lib\site-packages\django\contrib\admin\options.py", line 1540, in changeform_view
return self._changeform_view(request, object_id, form_url, extra_context)
File "C:\install\Projects_1\env\lib\site-packages\django\contrib\admin\options.py", line 1586, in _changeform_view
self.save_model(request, new_object, form, not add)
File "C:\install\Projects_1\env\lib\site-packages\django\contrib\admin\options.py", line 1099, in save_model
obj.save()
File "C:\install\Projects_1\env\lib\site-packages\django\db\models\base.py", line 726, in save
self.save_base(using=using, force_insert=force_insert,
File "C:\install\Projects_1\env\lib\site-packages\django\db\models\base.py", line 763, in save_base
updated = self._save_table(
File "C:\install\Projects_1\env\lib\site-packages\django\db\models\base.py", line 868, in _save_table
results = self._do_insert(cls._base_manager, using, fields, returning_fields, raw)
File "C:\install\Projects_1\env\lib\site-packages\django\db\models\base.py", line 906, in _do_insert
return manager._insert(
File "C:\install\Projects_1\env\lib\site-packages\django\db\models\manager.py", line 85, in manager_method
return getattr(self.get_queryset(), name)(*args, **kwargs)
File "C:\install\Projects_1\env\lib\site-packages\django\db\models\query.py", line 1270, in _insert
return query.get_compiler(using=using).execute_sql(returning_fields)
File "C:\install\Projects_1\env\lib\site-packages\django\db\models\sql\compiler.py", line 1416, in execute_sql
cursor.execute(sql, params)
File "C:\install\Projects_1\env\lib\site-packages\django\db\backends\utils.py", line 98, in execute
return super().execute(sql, params)
File "C:\install\Projects_1\env\lib\site-packages\django\db\backends\utils.py", line 66, in execute
return self._execute_with_wrappers(sql, params, many=False, executor=self._execute)
File "C:\install\Projects_1\env\lib\site-packages\django\db\backends\utils.py", line 75, in _execute_with_wrappers
return executor(sql, params, many, context)
File "C:\install\Projects_1\env\lib\site-packages\django\db\backends\utils.py", line 79, in _execute
with self.db.wrap_database_errors:
File "C:\install\Projects_1\env\lib\site-packages\django\db\utils.py", line 90, in __exit__
raise dj_exc_value.with_traceback(traceback) from exc_value
File "C:\install\Projects_1\env\lib\site-packages\django\db\backends\utils.py", line 84, in _execute
return self.cursor.execute(sql, params)
File "C:\install\Projects_1\env\lib\site-packages\django\db\backends\sqlite3\base.py", line 423, in execute
return Database.Cursor.execute(self, query, params)
Exception Type: OperationalError at /admin/meetups/meetup/add/
Exception Value: table meetups_meetup has no column named enormus
Errors when I run makemigrations and migrate:
(env) PS C:\install\Projects_1> python manage.py makemigrations
System check identified some issues:
WARNINGS:
meetups.Meetup.participant: (fields.W340) null has no effect on ManyToManyField.
You are trying to add a non-nullable field 'dates' to meetup without a default; we can't do that (the database needs something to populate existing rows).
Please select a fix:
1) Provide a one-off default now (will be set on all existing rows with a null value for this column)
2) Quit, and let me add a default in models.py
Select an option: 1
Please enter the default value now, as valid Python
The datetime and django.utils.timezone modules are available, so you can do e.g. timezone.now
Type 'exit' to exit this prompt
>>> '2021-10-10'
You are trying to add a non-nullable field 'organizer_emails' to meetup without a default; we can't do that (the database needs something to populate existing rows).
Please select a fix:
1) Provide a one-off default now (will be set on all existing rows with a null value for this column)
2) Quit, and let me add a default in models.py
Select an option: 1
The datetime and django.utils.timezone modules are available, so you can do e.g. timezone.now
Type 'exit' to exit this prompt
>>> test#.test.com
Invalid input: invalid syntax (<string>, line 1)
>>> 'test#.test.com'
Migrations for 'meetups':
meetups\migrations\0011_auto_20211026_2116.py
- Remove field enormus from meetup
- Add field dates to meetup
- Add field organizer_emails to meetup
(env) PS C:\install\Projects_1> python manage.py migrate
System check identified some issues:
WARNINGS:
meetups.Meetup.participant: (fields.W340) null has no effect on ManyToManyField.
Operations to perform:
Apply all migrations: admin, auth, contenttypes, meetups, sessions
Running migrations:
Applying meetups.0005_auto_20211026_1234...Traceback (most recent call last):
File "C:\install\Projects_1\manage.py", line 22, in <module>
main()
File "C:\install\Projects_1\manage.py", line 18, in main
execute_from_command_line(sys.argv)
File "C:\install\Projects_1\env\lib\site-packages\django\core\management\__init__.py", line 419, in execute_from_command_line
utility.execute()
File "C:\install\Projects_1\env\lib\site-packages\django\core\management\__init__.py", line 413, in execute
self.fetch_command(subcommand).run_from_argv(self.argv)
File "C:\install\Projects_1\env\lib\site-packages\django\core\management\base.py", line 354, in run_from_argv
self.execute(*args, **cmd_options)
File "C:\install\Projects_1\env\lib\site-packages\django\core\management\base.py", line 398, in execute
output = self.handle(*args, **options)
File "C:\install\Projects_1\env\lib\site-packages\django\core\management\base.py", line 89, in wrapped
res = handle_func(*args, **kwargs)
File "C:\install\Projects_1\env\lib\site-packages\django\core\management\commands\migrate.py", line 244, in handle
post_migrate_state = executor.migrate(
File "C:\install\Projects_1\env\lib\site-packages\django\db\migrations\executor.py", line 117, in migrate
state = self._migrate_all_forwards(state, plan, full_plan, fake=fake, fake_initial=fake_initial)
File "C:\install\Projects_1\env\lib\site-packages\django\db\migrations\executor.py", line 147, in _migrate_all_forwards
state = self.apply_migration(state, migration, fake=fake, fake_initial=fake_initial)
File "C:\install\Projects_1\env\lib\site-packages\django\db\migrations\executor.py", line 227, in apply_migration
state = migration.apply(state, schema_editor)
File "C:\install\Projects_1\env\lib\site-packages\django\db\migrations\migration.py", line 126, in apply
operation.database_forwards(self.app_label, schema_editor, old_state, project_state)
File "C:\install\Projects_1\env\lib\site-packages\django\db\migrations\operations\fields.py", line 104, in database_forwards
schema_editor.add_field(
File "C:\install\Projects_1\env\lib\site-packages\django\db\backends\sqlite3\schema.py", line 330, in add_field
self._remake_table(model, create_field=field)
File "C:\install\Projects_1\env\lib\site-packages\django\db\backends\sqlite3\schema.py", line 191, in _remake_table
self.effective_default(create_field)
File "C:\install\Projects_1\env\lib\site-packages\django\db\backends\base\schema.py", line 324, in effective_default
return field.get_db_prep_save(self._effective_default(field), self.connection)
File "C:\install\Projects_1\env\lib\site-packages\django\db\models\fields\__init__.py", line 842, in get_db_prep_save
return self.get_db_prep_value(value, connection=connection, prepared=False)
File "C:\install\Projects_1\env\lib\site-packages\django\db\models\fields\__init__.py", line 1271, in get_db_prep_value
value = self.get_prep_value(value)
File "C:\install\Projects_1\env\lib\site-packages\django\db\models\fields\__init__.py", line 1266, in get_prep_value
return self.to_python(value)
File "C:\install\Projects_1\env\lib\site-packages\django\db\models\fields\__init__.py", line 1228, in to_python
parsed = parse_date(value)
File "C:\install\Projects_1\env\lib\site-packages\django\utils\dateparse.py", line 75, in parse_date
match = date_re.match(value)
TypeError: expected string or bytes-like object

How to create a Tensorflow Dataset without labels? Input 'filename' of 'ReadFile' Op has type float32 that does not match expected type of string

Using TensorFlow 2.3, I'm trying to create a tf.data.Dataset without labels.
I have my .png files in a folder './Folder/'. For a minimal working example, I think the only relevant line is the one where I call tf.keras.preprocessing.image_dataset_from_directory. The class definition is here.
dataset = tf.keras.preprocessing.image_dataset_from_directory('./Folder/',label_mode=None,batch_size=100)
When the Python interpreter reaches the line above, it returns this error message:
Traceback (most recent call last):
File "/home/roi/.local/lib/python3.8/site-packages/tensorflow/python/framework/op_def_library.py", line 465, in _apply_op_helper
values = ops.convert_to_tensor(
File "/home/roi/.local/lib/python3.8/site-packages/tensorflow/python/framework/ops.py", line 1473, in convert_to_tensor
raise ValueError(
ValueError: Tensor conversion requested dtype string for Tensor with dtype float32: <tf.Tensor 'args_0:0' shape=() dtype=float32>
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "04-vaeAnomalyScores.py", line 135, in <module>
historicKLD, encoder, decoder, vae = artVAE_Instance.run_autoencoder() # Train
File "/media/roi/9b168630-3b62-4215-bb7d-fed9ba179dc7/images/largePatches/artvae.py", line 386, in run_autoencoder
trainingDataSet = self.loadImages(self.trainingDir)
File "/media/roi/9b168630-3b62-4215-bb7d-fed9ba179dc7/images/largePatches/artvae.py", line 231, in loadImages
dataset = tf.keras.preprocessing.image_dataset_from_directory(dir[:-1]+'Downscaled/',label_mode=None,batch_size=self.BATCH_SIZE)
File "/home/roi/.local/lib/python3.8/site-packages/tensorflow/python/keras/preprocessing/image_dataset.py", line 192, in image_dataset_from_directory
dataset = paths_and_labels_to_dataset(
File "/home/roi/.local/lib/python3.8/site-packages/tensorflow/python/keras/preprocessing/image_dataset.py", line 219, in paths_and_labels_to_dataset
img_ds = path_ds.map(
File "/home/roi/.local/lib/python3.8/site-packages/tensorflow/python/data/ops/dataset_ops.py", line 1695, in map
return MapDataset(self, map_func, preserve_cardinality=True)
File "/home/roi/.local/lib/python3.8/site-packages/tensorflow/python/data/ops/dataset_ops.py", line 4041, in __init__
self._map_func = StructuredFunctionWrapper(
File "/home/roi/.local/lib/python3.8/site-packages/tensorflow/python/data/ops/dataset_ops.py", line 3371, in __init__
self._function = wrapper_fn.get_concrete_function()
File "/home/roi/.local/lib/python3.8/site-packages/tensorflow/python/eager/function.py", line 2938, in get_concrete_function
graph_function = self._get_concrete_function_garbage_collected(
File "/home/roi/.local/lib/python3.8/site-packages/tensorflow/python/eager/function.py", line 2906, in _get_concrete_function_garbage_collected
graph_function, args, kwargs = self._maybe_define_function(args, kwargs)
File "/home/roi/.local/lib/python3.8/site-packages/tensorflow/python/eager/function.py", line 3213, in _maybe_define_function
graph_function = self._create_graph_function(args, kwargs)
File "/home/roi/.local/lib/python3.8/site-packages/tensorflow/python/eager/function.py", line 3065, in _create_graph_function
func_graph_module.func_graph_from_py_func(
File "/home/roi/.local/lib/python3.8/site-packages/tensorflow/python/framework/func_graph.py", line 986, in func_graph_from_py_func
func_outputs = python_func(*func_args, **func_kwargs)
File "/home/roi/.local/lib/python3.8/site-packages/tensorflow/python/data/ops/dataset_ops.py", line 3364, in wrapper_fn
ret = _wrapper_helper(*args)
File "/home/roi/.local/lib/python3.8/site-packages/tensorflow/python/data/ops/dataset_ops.py", line 3299, in _wrapper_helper
ret = autograph.tf_convert(func, ag_ctx)(*nested_args)
File "/home/roi/.local/lib/python3.8/site-packages/tensorflow/python/autograph/impl/api.py", line 255, in wrapper
return converted_call(f, args, kwargs, options=options)
File "/home/roi/.local/lib/python3.8/site-packages/tensorflow/python/autograph/impl/api.py", line 532, in converted_call
return _call_unconverted(f, args, kwargs, options)
File "/home/roi/.local/lib/python3.8/site-packages/tensorflow/python/autograph/impl/api.py", line 339, in _call_unconverted
return f(*args, **kwargs)
File "/home/roi/.local/lib/python3.8/site-packages/tensorflow/python/keras/preprocessing/image_dataset.py", line 220, in <lambda>
lambda x: path_to_image(x, image_size, num_channels, interpolation))
File "/home/roi/.local/lib/python3.8/site-packages/tensorflow/python/keras/preprocessing/image_dataset.py", line 228, in path_to_image
img = io_ops.read_file(path)
File "/home/roi/.local/lib/python3.8/site-packages/tensorflow/python/ops/gen_io_ops.py", line 574, in read_file
_, _, _op, _outputs = _op_def_library._apply_op_helper(
File "/home/roi/.local/lib/python3.8/site-packages/tensorflow/python/framework/op_def_library.py", line 492, in _apply_op_helper
raise TypeError("%s expected type of %s." %
TypeError: Input 'filename' of 'ReadFile' Op has type float32 that does not match expected type of string.
Thank you so much for your help.
One way to fix this that I found is to put all your images in another sub-directory inside the directory whose path you are feeding to image_dataset_from_directory.
Taking your example, you would create a new folder, let's call it new_folder, inside ./Folder/ and put all your images there, so that the path to all your images is now ./Folder/new_folder/. Then you can call the image_dataset_from_directory method with exactly the same arguments as in your question:
tf.keras.preprocessing.image_dataset_from_directory(
    './Folder/',
    label_mode=None,
    batch_size=100
)
I found this to work for me so hopefully someone else will also find it helpful!
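If it helps, here is a hedged sketch of that rearrangement (new_folder is an arbitrary name; the script just moves the .png files one level down and then calls image_dataset_from_directory exactly as before):
import os
import shutil
import tensorflow as tf

src = './Folder/'
dst = os.path.join(src, 'new_folder')
os.makedirs(dst, exist_ok=True)

# Move every .png one level down so the parent folder contains a single sub-directory.
for name in os.listdir(src):
    if name.lower().endswith('.png'):
        shutil.move(os.path.join(src, name), os.path.join(dst, name))

dataset = tf.keras.preprocessing.image_dataset_from_directory(
    src,              # still pass the parent folder, not the new sub-directory
    label_mode=None,  # no labels, just batches of images
    batch_size=100)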

Discord Error Loading JSON File: json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

When I execute my code I get the following error. Does anyone have an idea of what I did wrong in this code?
Ignoring exception in on_member_join
Traceback (most recent call last):
File "C:\Users\Adryan\AppData\Local\Programs\Python\Python37\lib\site-packages\discord\client.py", line 270, in _run_event
await coro(*args, **kwargs)
File "C:\Users\Adryan\Desktop\bot\cogs\events.py", line 51, in on_member_join
users = json.load(f)
File "C:\Users\Adryan\AppData\Local\Programs\Python\Python37\lib\json\__init__.py", line 296, in load
parse_constant=parse_constant, object_pairs_hook=object_pairs_hook, **kw)
File "C:\Users\Adryan\AppData\Local\Programs\Python\Python37\lib\json\__init__.py", line 348, in loads
return _default_decoder.decode(s)
File "C:\Users\Adryan\AppData\Local\Programs\Python\Python37\lib\json\decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "C:\Users\Adryan\AppData\Local\Programs\Python\Python37\lib\json\decoder.py", line 355, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
async def on_member_join(self, member):
    with open('users.json', 'r') as f:
        users = json.load(f)

App Engine Development Environment - db.get() ReferenceProperty Query random error

I have the following datastore model:
class FeatureCategory(db.Model):
    name_eng = db.StringProperty(required=True)
    name_spa = db.StringProperty()
    name_por = db.StringProperty()
    device_type = db.ReferenceProperty(DeviceType, required=True, collection_name='feature_categories')

class Feature(db.Model):
    name = db.StringProperty(required=True)
    category = db.ReferenceProperty(FeatureCategory, required=True, collection_name='features')
    desc_eng = db.StringProperty()
    desc_spa = db.StringProperty()
    desc_por = db.StringProperty()
I've been experiencing errors after doing a couple of db.get(db.Key('key_string')) calls and then accessing the referenced object, like:
feats = dbmodel.Feature.all()
for feat in feats:
    cat = feat.category
in the development environment. If I stop the server and restart it, it will work for some queries and then go back to throwing the error below. Any ideas on how I can fix this?
Traceback (most recent call last):
File "/Applications/Development/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/lib/webapp2/webapp2.py", line 1536, in __call__
rv = self.handle_exception(request, response, e)
File "/Applications/Development/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/lib/webapp2/webapp2.py", line 1530, in __call__
rv = self.router.dispatch(request, response)
File "/Applications/Development/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/lib/webapp2/webapp2.py", line 1278, in default_dispatcher
return route.handler_adapter(request, response)
File "/Applications/Development/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/lib/webapp2/webapp2.py", line 1102, in __call__
return handler.dispatch()
File "/Applications/Development/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/lib/webapp2/webapp2.py", line 572, in dispatch
return self.handle_exception(e, self.app.debug)
File "/Applications/Development/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/lib/webapp2/webapp2.py", line 570, in dispatch
return method(*args, **kwargs)
File "/Users/danielgarcia/Documents/workspace/rfpbuilder/src/get.py", line 50, in get
cat = feat.category
File "/Applications/Development/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/ext/db/__init__.py", line 3686, in __get__
instance = get(reference_id)
File "/Applications/Development/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/ext/db/__init__.py", line 1536, in get
return get_async(keys, **kwargs).get_result()
File "/Applications/Development/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/api/apiproxy_stub_map.py", line 592, in get_result
return self.__get_result_hook(self)
File "/Applications/Development/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/datastore/datastore_rpc.py", line 1467, in __get_hook
entities = rpc.user_data(entities)
File "/Applications/Development/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/api/datastore.py", line 600, in local_extra_hook
return extra_hook(result)
File "/Applications/Development/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/ext/db/__init__.py", line 1506, in extra_hook
model = cls1.from_entity(entity)
File "/Applications/Development/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/ext/db/__init__.py", line 1441, in from_entity
return cls(None, _from_entity=entity, **entity_values)
File "/Applications/Development/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/ext/db/__init__.py", line 973, in __init__
prop.__set__(self, value)
File "/Applications/Development/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/ext/db/__init__.py", line 613, in __set__
value = self.validate(value)
File "/Applications/Development/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/ext/db/__init__.py", line 2815, in validate
value = super(StringProperty, self).validate(value)
File "/Applications/Development/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/ext/db/__init__.py", line 640, in validate
raise BadValueError('Property %s is required' % self.name)
BadValueError: Property name is required
Please indent the code next time. The error message Property name is required indicates that some of your Feature entities don't have a name property, which is marked as required in your model.
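If you want to track down the offending entities, here is a hedged sketch using the low-level datastore API, which, unlike the db model class, will still load entities that are missing a required property; the kind name 'Feature' is taken from the model above.
import logging
from google.appengine.api import datastore

# Low-level entities behave like dicts, so a missing property is simply an absent key.
for entity in datastore.Query('Feature').Run():
    if 'name' not in entity:
        logging.warning('Feature %s has no name property', entity.key())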
