Failing to import generated .ics file to Google Calendar and Outlook

I'm trying to generate .ics files; see a sample here: http://deap.nu/myprogram.ics
It validates fine here: (URL removed: service no longer exists)
but when trying to import it into a Google Calendar I get the message
"Failed to import events: Unable to process your ical/CSV file."
which isn't very informative. I've tried to slim the file down and googled a lot, but can't find what's wrong. Any input on this is appreciated.
Importing into Outlook doesn't work either.

After some sleep and more trial and error, I managed to pinpoint the problem to the ORGANIZER elements in the .ics file. So I removed them for now, adding the information to the description instead.
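If anyone needs the same workaround, here is a minimal sketch (Node/TypeScript; the file names are placeholders) that strips the ORGANIZER properties, including any folded continuation lines, before importing:

```typescript
// Sketch: strip ORGANIZER properties from an .ics file before importing.
// "myprogram.ics" is a placeholder path; adjust to your own file.
import { readFileSync, writeFileSync } from "fs";

const input = readFileSync("myprogram.ics", "utf8");

// iCalendar allows long properties to be "folded" onto continuation lines
// that start with a space or tab, so drop those along with the ORGANIZER line.
const lines = input.split(/\r?\n/);
const kept: string[] = [];
let skippingFolded = false;

for (const line of lines) {
  if (/^ORGANIZER[;:]/i.test(line)) {
    skippingFolded = true;          // drop the property itself
    continue;
  }
  if (skippingFolded && /^[ \t]/.test(line)) {
    continue;                       // drop its folded continuation lines
  }
  skippingFolded = false;
  kept.push(line);
}

// RFC 5545 prescribes CRLF line endings, so write them back consistently.
writeFileSync("myprogram-noorganizer.ics", kept.join("\r\n"));
```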

I have another recommendation for a successful import into Google Calendar (calendar.google.com). I failed several times to import an .ics file, getting only meaningless error messages.
I tried everything to make the .ics file standards-conformant, left out all UID lines, etc., but nothing helped.
I was using Google Chrome, which is my normal everyday browser.
I thought: which browser could be better to use with Google services?
But when I logged in to Google Calendar using Microsoft Edge (Chromium-based), the ics import just worked out of the box.
Hope it helps others with the same problem...

I just got past a problem with the same description. I could leave ORGANIZER in, but while editing the file with Emacs on Linux I noticed that the DESCRIPTION contained carriage returns (^M), whereas the other line breaks had been normalized to the Linux convention when I saved the .ics file. I removed the ^Ms and it then imported fine.
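For reference, a tiny sketch of that cleanup (Node/TypeScript; the path is a placeholder). The key point is that the line endings must not be mixed:

```typescript
// Sketch: strip stray carriage returns (^M) so every line break is consistent.
// The poster's fix left plain LF throughout; strictly, RFC 5545 prescribes CRLF,
// so the important part is simply that the endings are not mixed.
import { readFileSync, writeFileSync } from "fs";

const raw = readFileSync("event.ics", "utf8");   // placeholder path
writeFileSync("event-fixed.ics", raw.replace(/\r/g, ""));
```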

I realize this is an old post, but I'm adding this to help others who might run into issues trying to import an .ics file.
As for Google Calendar, the errors are not descriptive, or are outright misleading. I had a permissions error, but it was actually due to a UID (unique event ID) being in the file. As described below (from Google's Calendar Community Forum), the solution was to remove that line from the file, save it, and then try the import again. It worked!
Some of Google Calendar's error messages are not to be taken at face value... this particular error often means that your UIDs (unique event IDs) are not formatted to Google Calendar's liking, so my question is: where was this ics file generated?
I would try opening the file in a good text editor and deleting every line starting with UID, as they are not needed for inserting new events and will cause this error if not formatted correctly.

Related

I am unable to export my REDCap data and am receiving an error notification. The notification says too much data, although it is a tiny project. Solutions?

The problem: I am trying to export my REDCap data to a CSV file and am unable to do so. I am receiving an error notification that says there is too much data, although it is a tiny project. Help will be much appreciated.
The full error text: " We are sorry, but apparently the data export is not able to complete successfully. It may simply be that there is too much data trying to be exported at once, in which it is causing REDCap to crash. If this error occurs again, it is recommended that you attempt to export a smaller data set (fewer fields and/or perhaps fewer records) so that this error does not occur. Our apologies for this inconvenience."
What I have tried:
I have made sure I have the necessary user rights.
I have tried through a colleague's REDCap user (who has the necessary user rights).
I have tried exporting only one instrument (no success).
I have created a test project with only 2 questions; I receive the same notification in the new test project as well.
I could not export data in either development mode or production.
Any ideas?
Many thanks!
The institution had blocked the option for file upload for security reasons. Apparently REDCap's exporting system uses the uploading mechanism, and it therefore ended up being disabled as well.
The local storage folder location was pointing to a folder that was missing on the server. I just created the folder, and the upload and the subsequent problem were fixed.

Looking for an editor that can handle .doc or .docx files

I am writing a web application with React where users can write protocols for their appointments. The current system is: the web application saves the Word file to the local file system, the user edits it, and then uploads it via a macro in Word.
That seems a bit clunky to me, and I am not so sure about the security implications of letting the browser directly access the local file system.
So I wanted to let the users edit the files directly in the browser, with an editor similar to Google Docs.
The problem is:
Documents have to remain on premises.
Converting .doc files to a format that can be displayed in a browser and back seems to cause some formatting issues.
The user must be able to download the file and edit it, in case they have an appointment without internet access, and upload it later. So it has to be at least convertible to a document that can easily be edited in Word.
There are so many rich-text editors, but from what I've seen, none is designed for this use case. So my question is: is what I want to do even possible, and if so, does anyone know a good editor or library for it?

What defines a message as 'cached' in discord.js

As we all know, the change from v11 to v12 introduced the cache; however, I can't find anything online that explains exactly what it is as a concept. Can anyone explain how it works?
I have just taken a look at the D.JS source, and thankfully it is heavily commented.
In many files it suggests that any message sent by a user, or any message mentioning a user, caches said user. I believe this works the same way with guilds.
This means that in order to find a user in a server using #get or other cache lookups (as opposed to fetching from the API), the user must have been mentioned or have typed a message.
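To make the practical difference concrete, here is a rough sketch against the discord.js v12-style API (the user ID and token are placeholders): a cache lookup only succeeds for users the bot has already seen, while fetching goes to the Discord API.

```typescript
// Sketch (discord.js v12-style API): cache.get() only finds members the bot has
// already seen (e.g. they sent or were mentioned in a message), while fetch()
// asks the Discord API and caches the result. IDs and the token are placeholders.
import { Client } from "discord.js";

const client = new Client();

client.on("message", async (message) => {
  const guild = message.guild;
  if (!guild) return;

  const userId = "123456789012345678"; // placeholder ID

  // Cache lookup: returns undefined unless the member is already cached.
  const cached = guild.members.cache.get(userId);

  // API lookup: works even if the member was never cached, and caches them.
  const member = cached ?? (await guild.members.fetch(userId));

  console.log(`already cached: ${Boolean(cached)}, resolved tag: ${member.user.tag}`);
});

client.login("YOUR_BOT_TOKEN"); // placeholder token
```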
BaseManager.js source file here
Client.js source file here
GuildManager.js source file here

Google Drive API method Drive.Files.update can silently fail to update file content

Using the Java Google Drive API (v2-rev109-1.16.0-rc), we have found that occasionally files are not updated correctly using the Drive.Files.update method. This seems to have occurred sporadically among customers since 22 April 2014. Before then it was working just fine.
We are updating the file with new content and metadata. The metadata updates OK, but the file content is not touched. There are no errors, and as far as we can tell the operation succeeds without issue.
It is only when a user goes to access the file that we see a problem. If, for example, the new file content is larger than the previous content, the metadata (file size) will be incorrect and the read will fail.
We are really struggling to create a reproduction scenario. Has anyone else experienced this problem, or know of a fix, or if Google has found an underlying problem that is (hopefully) fixed?
I am not convinced this issue has been reliably fixed.
I would really like some clarity from Google. This is a very bad problem: data appears to be being lost, and the file metadata and the actual data stream are out of sync.
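One way to at least detect the mismatch (not a fix) is to hash the uploaded content and compare it against the checksum Drive reports afterwards. A rough sketch for illustration, using the Node client for Drive v2 rather than the Java client from the report; the auth setup and file ID are placeholders:

```typescript
// Sketch: detect a content update that silently did not "take" by comparing
// the MD5 of the uploaded bytes against what Drive v2 reports afterwards.
// Auth setup and fileId are placeholders.
import { google } from "googleapis";
import { createHash } from "crypto";

async function verifyUpdate(fileId: string, uploadedContent: Buffer): Promise<boolean> {
  const auth = new google.auth.GoogleAuth({
    scopes: ["https://www.googleapis.com/auth/drive"],
  });
  const drive = google.drive({ version: "v2", auth });

  const localMd5 = createHash("md5").update(uploadedContent).digest("hex");

  const { data } = await drive.files.get({ fileId });

  if (data.md5Checksum !== localMd5) {
    // Metadata may have been updated even though the content was not.
    console.warn(
      `Content mismatch for ${fileId}: local md5 ${localMd5}, ` +
        `remote md5 ${data.md5Checksum}, remote size ${data.fileSize}`
    );
    return false;
  }
  return true;
}
```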

BigQuery Throwing Import Error, No Information Provided

I am trying to import a CSV file into my BigQuery Table. This import has worked in the past, but now I am getting the following error message:
{"message":"Too many errors encountered. Limit is: 0.","reason":"invalid"}
All other fields are empty when I run the debugger.
This is... not helpful. I am unaware of any issues with the data itself, as the export/import data has not changed. Curiously, when trying to use a previous Job Template and run through the web console, the web console itself hangs and the dialog never goes away once I hit the blue "Submit" button.
Job Id: job_e0faf560d3df424ea74519e1b24a23f7
I am generating a CSV and exporting it to Google Cloud Storage. I am using App Engine and have switched to the new Google Cloud Storage Client Library. I uploaded the file using GcsFileOptions.getDefaultInstance() as well as by constructing my own GSFileOptions with the content type set to CSV.
After the failure, I downloaded the file from Google Cloud Storage, changed the encoding (tried ASCII and UTF-8), and still got the same result.
I am using AppEngine 1.8.1.1 and the BigQuery Library (google-api-services-bigquery-v2-rev89-1.15.0-rc). This was working as expected previously, so I'm not sure what has happened. Any suggestions are welcome. Thank you!
There are two error fields on the BigQuery job. The first is the error result, which tells you whether (and why) the job failed. The error result in your case is that the job failed due to encountering too many input errors during the import.
The second field is the error stream, which tells you about errors encountered during the job. If you had set the maxBadRecords field, for example, you could have errors in the error stream, but the actual job might succeed.
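For illustration, here is a sketch of reading back both fields with the Node BigQuery client (the question uses the Java library; the job ID is a placeholder). status.errorResult is set only when the job as a whole failed, while status.errors carries the per-record error stream:

```typescript
// Sketch: inspect both error fields on a finished BigQuery load job.
// status.errorResult => why the whole job failed (if it failed);
// status.errors      => the per-row error stream (can be non-empty even on
//                       success when maxBadRecords allows some bad rows through).
// The job ID is a placeholder.
import { BigQuery } from "@google-cloud/bigquery";

async function inspectJob(jobId: string): Promise<void> {
  const bigquery = new BigQuery();
  const [metadata] = await bigquery.job(jobId).getMetadata();

  const { errorResult, errors } = metadata.status;

  if (errorResult) {
    console.error(`Job failed: ${errorResult.reason} - ${errorResult.message}`);
  }
  for (const err of errors ?? []) {
    // e.g. "Too few columns: expected 80 column(s) but got 1 column(s)"
    console.warn(`Error stream: ${err.message}`);
  }
}
```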
I looked up your job in the BigQuery logs, and was able to find that the error stream indicates an error on line 6253: "Too few columns: expected 80 column(s) but got 1 column(s). For additional help: http://goo.gl/RWuPQ"
Can you verify that line 6253 is correct?
-- Jordan Tigani / BigQuery Engineer
Today there is a general problem with App Engine:
"We are still investigating the issue with Google App Engine, primarily (but not restricted to) Datastore latency.
We will provide another status update in the next two hours."
https://groups.google.com/forum/#!topic/google-appengine-downtime-notify/1pJZnl4EMKk
