iCloud calendar and tasks in Mozilla Thunderbird/Lightning

Does anyone know how I can get the calendar and tasks from iCloud into Mozilla Thunderbird/Lightning so that they are editable?
I know how to make a public agenda in iCloud, but that agenda is read-only.
I hope someone can help.

There is a German guide that recently worked for me, while Ronny's link unfortunately does not work for me, since I simply get no response from the server: https://www.nico-beuermann.de/blogging/archives/115-Zugriff-auf-iCloud-Kalender-mit-Thunderbird.html
You will need three ingredients for Thunderbird:
a. your calendar's server (specifically, the number xx in pxx-caldav.icloud.com)
b. your user ID (dsid)
c. the calendar's UID (guid)
In short (and in English):
Log in to your iCloud calendar in the browser; this is the first important step.
You can use either Firefox's or Chrome's network inspection tools; I will continue with Chrome for this explanation.
Press Ctrl + Shift + I and click on the Network tab; it might be empty at first, since logging may only start after the tools are open; if it is empty, just reload the page with the Network tab still open.
In my case, it was helpful to right-click on a column header to additionally show the Domain column.
By checking the domain or just hovering over the requests, you can find the xx in pxx; this is your specific server, and the rest of the domain does not matter (information a).
I then clicked on an entry with the domain pxx-calendarws.icloud.com.
In the sub-window that opens, you can click on Headers and find dsid under Query String Parameters; alternatively, you can find it in the URL; dsid corresponds to your user ID (information b).
Now switch from "Headers" to "Response"; you might need to check several GET requests to find one with the right response, but you will find a number of guid entries, which correspond to your calendar IDs (information c).
Now you have all three pieces of information. In Thunderbird with the Lightning calendar add-on, add a network calendar of type "CalDAV" (iCalendar will not work!). There, enter the following URL with your three pieces of information filled in:
https://pxx-caldav.icloud.com/dsid/calendars/guid
Et voilà. You will need to enter your iCloud credentials. As of the date of this post, this works with Thunderbird 45.8.0 and Lightning 4.7.8.
Update 18 July 2017:
Apparently, sometime in June Apple changed something in the system. You now need to activate two-factor authentication in order to create app-specific passwords. You can do that on any of your Apple devices. Then log in to your Apple account and, in the security section, generate an app-specific password. Use this password, together with your email address, to log in to CalDAV in Thunderbird. This seems to work fine.
Without two-factor authentication, you cannot create app-specific passwords. And without an app-specific password, CalDAV no longer seems to be accessible.
Update 05 September 2017: It seems that sometimes pGuid works for the calendar ID while guid does not, so take care here.
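If you want to sanity-check the three pieces of information before touching Thunderbird, here is a minimal PowerShell sketch that assembles the CalDAV URL and probes it with an app-specific password; the server number, dsid and guid below are placeholders you must replace.

# Minimal sketch: assemble the CalDAV URL from the three pieces of information
# and probe it with an app-specific password. All values below are placeholders.
$server = "p43"                 # (a) the xx from pxx-caldav.icloud.com
$dsid   = "123456789"           # (b) your user ID (dsid)
$guid   = "REPLACE-WITH-GUID"   # (c) the calendar's GUID

$url  = "https://$server-caldav.icloud.com/$dsid/calendars/$guid"
$cred = Get-Credential          # Apple ID email address + app-specific password

try {
    # Anything other than a 401 suggests the URL and credentials are plausible;
    # a 401 points to wrong credentials (or a missing app-specific password).
    $response = Invoke-WebRequest -Uri $url -Credential $cred -Method Head
    "HTTP {0}" -f $response.StatusCode
} catch {
    $_.Exception.Message
}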

After a lot of research, I have found a solution.
This is provided by https://icloud.niftyside.com/
They have a PHP script where you can fill in your credentials and get the right CardDAV/contacts servers, which you can then add to Mozilla Thunderbird/Lightning.


You currently do not have a developer account in QBO

I am trying to obtain my production keys for my QBO app, following the steps linked here.
However, when I try to start the "App Assessment Questionnaire", I get the error message below:
You currently do not have a developer account, please click here to complete setting up your developer account. Once that is complete you will be able to access the help pages.
This is what I see, and I DO have a developer account. It won't let me continue.
Please help!
UPDATE
I see this error in the address bar:
ErrorCode=ERROR_CREATING_USER&ErrorDescription=License+Limit+Exceeded
UPDATE #2
I tried creating a brand new account and a new app on a different PC, and the same thing happened. So this is not a cache issue.
UPDATE #3
Created 2 support tickets for this issue
#00114423
#00114415
I had to use a different account to access the help site
https://help.developer.intuit.com
I've had the same issue since Friday (02/18) and have spent hours trying to figure out what the problem is. I have:
tried from different browsers and different IP addresses
made a brand new developer account to test with it
had a 1+ hour chat session with QB support (but not developer support)
sent an email to an address received from the chat assistant
sent feedback at https://www.surveymonkey.com/r/AppAssess
According to the browser's developer tools:
the Start questionnaire button opens this URL:
https://developers.intuit.com/app/developer/appdetail/prod/questionnaire?appId=xxxxx:UUID_of_app
then it redirects to:
https://login.salesforce.com/services/auth/sso/yyyyyyyyyyy/Intuit_Enterprise?community=https://help.developer.intuit.com
finally, SSO to salesforce fails and it redirects back to:
https://developer.intuit.com/app/developer/qbpayments/docs/qbms-payments/hosted-paypage/faqs/help-redirect?ErrorCode=ERROR_CREATING_USER&ErrorDescription=License+Limit+Exceeded+-+Customer+Community+Login&ProviderId=xxxxxx&startURL=%2Fs%2Fquestionnaire%3Fapp%yyyyyyyyyyyyy
So it seems QB has reached a license limit at Salesforce, which prevents new logins from being created and the questionnaire from loading.
And the funny part is: the same thing happens when I try to create a support ticket using the "Ask a question" button at https://help.developer.intuit.com/s/
This means I can't start the questionnaire and can't open a ticket about the error either.
I guess developer accounts that created support tickets or started the questionnaire before the license limit was reached already have an SSO login account at Salesforce and can fill in the form or open new support tickets, but the rest of us are stuck because of the license limit.
If somebody has a working QB developer account and is able to start a support ticket, please do, and link this page in it.
Or maybe we should contact Salesforce support so they let QB know about the license limit.
I'll give it a try.
This seems to have been fixed. I tried running the questionnaire and it worked.
I have also been having this problem the last several days and had the same lack of success with QB support. The URL callback error I see is:
ErrorCode=REGISTRATION_HANDLER_ERROR&ErrorDescription=Please+sign+the+terms+of+service+before+you+login+to+community
I don't see anywhere I can sign a TOS on my account page - it's possible that in fact QBO hasn't signed a TOS with Salesforce. What a joke.

Which AD user attribute does NPS check to validate an account, and can we change it?

For a school, I implemented eduroam two years ago, and from time to time we add new students to AD.
Five days ago I added 40 new students, but I changed the format of the CN (what New-ADUser calls "-Name") from "name.surname" to "SURNAME, NAME" (quotes excluded). Hence,
earlier it was
CN=name.surname, OU=CLASS_A, OU=STUDENTS, DC...
now it is
CN=SURNAME, NAME, OU=CLASS_A, OU=STUDENTS, DC...
An eduroam username is normally <string with no blanks>@<yourschool>.<tld>, so that the RADIUS proxies can route the auth request based on @<yourschool>.<tld>; so I must keep that format.
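To make the change concrete, here is a rough sketch of the two naming schemes as New-ADUser calls; the names, UPN and OU path are made up, and only the -Name (CN) format differs:

# Old scheme: CN = name.surname (all values hypothetical)
New-ADUser -Name "jane.doe" -SamAccountName "jane.doe" `
    -UserPrincipalName "jane.doe@yourschool.tld" `
    -Path "OU=CLASS_A,OU=STUDENTS,DC=yourschool,DC=tld"

# New scheme: CN = "SURNAME, NAME" - only the CN changes; sAMAccountName
# and UPN keep the blank-free, eduroam-friendly format.
New-ADUser -Name "DOE, JANE" -SamAccountName "jane.doe" `
    -UserPrincipalName "jane.doe@yourschool.tld" `
    -Path "OU=CLASS_A,OU=STUDENTS,DC=yourschool,DC=tld"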
Now, the new users cannot be authenticated anymore by NPS.
All the tests I ran support my hypothesis (i.e. that NPS uses the CN to authenticate), but I cannot find any Microsoft document that states this.
1. Could anybody share a link to such a document?
2. Is there a way to change the check from the CN (if point 1 is confirmed) to another user attribute like sAMAccountName or UPN?
I'm sure I'm touching something deep in AD, but I hope somebody has run into this issue and found an answer.
TIA
P.S. I guess the alternative would be to use FreeRADIUS, but I would rather explore the options for making this work within NPS/AD.
• Please check the Windows Server Security event log for more details on the NPS authentication failures, as that might shed some light on the actual issue you are facing. In the meantime, please clear the cache and temporary files from the server and restart the whole NPS infrastructure, i.e. the domain controller, the NPS server, the access points and the other devices through which users log in via NPS.
• Once everything has restarted, try to authenticate an allowed user through NPS again and check the result. Also, since you are using NPS as a RADIUS server proxy, check the attribute manipulation rules for message forwarding, given that the CN format has changed in your AD. This concerns in particular the User-Name attribute, which is provided by the access client and included by the NAS in the RADIUS Access-Request message; its value is a character string that typically contains a realm name and a user account name.
• To correctly replace or convert realm names in the username of a connection request, you must configure attribute manipulation rules for the User-Name attribute on the appropriate connection request policy.
Also, see the links below regarding which attribute you can use for authentication with NPS; they clearly state that, as a best practice, the user principal name should be used:
https://learn.microsoft.com/en-us/windows-server/networking/technologies/nps/nps-best-practices#performance-tuning-nps
https://learn.microsoft.com/en-us/windows-server/networking/technologies/nps/nps-best-practices#using-nps-in-large-organizations
Please also check the following documentation for your scenario:
https://learn.microsoft.com/en-us/windows-server/networking/technologies/nps/nps-plan-proxy#key-steps-3
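If the UPN turns out to be the attribute to match on, a minimal sketch (assuming the RSAT ActiveDirectory module; the OU and realm below are placeholders) for making sure every affected student has a UPN in the eduroam realm format could look like this:

# Sketch only: give every student without a UPN one derived from sAMAccountName,
# e.g. jane.doe@yourschool.tld, independent of the new CN format.
Import-Module ActiveDirectory

$realm = "yourschool.tld"   # placeholder eduroam realm
Get-ADUser -SearchBase "OU=STUDENTS,DC=yourschool,DC=tld" -Filter * |
    Where-Object { -not $_.UserPrincipalName } |
    ForEach-Object {
        Set-ADUser -Identity $_ -UserPrincipalName ("{0}@{1}" -f $_.SamAccountName, $realm)
    }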

Why do my users get a .ost error message after being given Full Access to a mailbox?

I work in a big company and we have just migrated to Office 365 in a hybrid scenario.
Here is the "stack":
Exchange 2016 Hybrid
ADSync with AADConnect
User mailboxes hosted on Office 365
Users use the Outlook 2016 client (we can't roll out the Office 365 client, because we have over 50,000 users and many custom 32-bit Outlook plugins)
We do this as follows:
Create a new AD user.
Enable-RemoteMailbox samAccountName -RemoteRoutingAddress samAccountName@tenant.mail.onmicrosoft.com -PrimarySmtpAddress address@tenant.com -Shared
(This also turns off the EmailAddressPolicy, which it should do according to our Exchange admins.) Our Exchange admins are also stuck on this problem, which is why I created this post here.
Then I wait and have a look in the ECP admin center. Before the sync happens, the remote routing address is address@tenant.com.
After the first sync (they run every 30 minutes) it's samAccountName@tenant.mail.onmicrosoft.com ==> how it should be.
After another 30 minutes (2nd sync back to AD) it's an X500 address.
When I look it up in PowerShell with get-remotemailbox <UPN> | fl *remote*, the address is samAccountName@tenant.mail.onmicrosoft.com (how it should be).
So it seems it is only displayed wrong in the ECP.
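For what it's worth, a quick way to see everything on-premises Exchange holds for the routing in one place (the mailbox identity is a placeholder) is something like:

# Show the remote routing address plus all proxy addresses (including any X500 entry)
Get-RemoteMailbox "user@tenant.com" |
    Format-List RemoteRoutingAddress, EmailAddressPolicyEnabled, EmailAddresses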
But the huge problem we face is this:
When I give any user in the company Full Access to this shared mailbox, it doesn't get automapped.
After an hour of waiting, I add it manually. When I do this, a .OST error appears.
Error:
"Microsoft Outlook cannot expand the folder. The set of folders cannot
be opened. The file
C:\Users\UserName\AppData\Local\Microsoft\Outlook{username]}.ost"
Restarting Outlook doesn't help either. So our guess is that something is wrong with the ECP and the remote routing address.
Please note that this isn't a client problem. It affects almost every mailbox I create these days.
I had another post about this but with fewer details and without the knowledge of the remote routing address: https://www.reddit.com/r/exchangeserver/comments/eceqm6/automapping_doesnt_work_on_hybrid_setting/
Does anyone have any ideas? I appreciate any kind of help from you guys. If you need any more information, please ask.
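For completeness, here is a hedged sketch of how the Full Access grant and a check of the resulting permission look in Exchange Online PowerShell; the addresses are placeholders, and explicitly forcing -AutoMapping $true is just one thing worth ruling out:

# Sketch only: remove and re-add Full Access with automapping forced on,
# then verify the permission is actually present. Addresses are placeholders.
Connect-ExchangeOnline

Remove-MailboxPermission -Identity "shared@tenant.com" -User "user@tenant.com" `
    -AccessRights FullAccess -Confirm:$false
Add-MailboxPermission -Identity "shared@tenant.com" -User "user@tenant.com" `
    -AccessRights FullAccess -AutoMapping $true

Get-MailboxPermission -Identity "shared@tenant.com" |
    Where-Object { $_.User -like "user@tenant.com" }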
You have a hybrid architecture, OK. You need to use the Office 365 client, because if you use older versions you will need to tweak the registry a bit on each computer, or change the on-premises Autodiscover.
But your big problem is big indeed. Right now, you cannot share mailboxes in O365 correctly. If you do, you may have access to the main mailbox, but if you use an archive you won't have access to it (you only have OWA access).
Regards.

Do I need back-end and database for single page app using Facebook Connect?

I want to create a one-page site that will use Facebook Connect and allow my friends to reserve a spot for an upcoming event. Using their Facebook ID, I would like my friends to pay for (reserve) a spot and then have their Facebook picture shown in the spot they reserved. Sort of like Meetup.com when you RSVP, except it's a one-page site for a one-time event. Can I build this using only front-end technologies, or do I need a back end?
The answer is yes: you will need a back-end system to store the paid registrants, print out a list of paid users to have at the door of the event to make sure they paid, store each user's ID so you can display their picture, and so on.

How best to screen scrape a password protected site on behalf of a 3rd party?

I want to write a program that analyzes your fantasy baseball team and notifies you of recommended actions, possibly multiple times per day. The problem is, you aren't playing fantasy baseball on my site; you're playing on Yahoo, or CBS, or ESPN, etc.
On the majority of these sites, fantasy teams and leagues are not public, so you must be logged in and a member of the league to see the teams in the league.
All that I need is the plain HTML of the team page on each of those sites to be sent to my server, where I can then parse and analyze the file and send user notifications.
The problem is that I need username/password combinations to easily get this data to my server when I need it, and I think there will be a lot of people who wouldn't want to entrust their Yahoo/ESPN/CBS password to me.
I have come up with several possible ways to solve this problem:
The most obvious way is to ask for their credentials for the site on which their team is hosted. Then I could just programmatically log in and request the data I need. I'm guessing a number of people would be comfortable giving me their credentials, and a number of them not so much.
Write a desktop client, which the user then downloads. The client would require their credentials, but it could then basically do exactly the same thing that the server based version would do, log in, request the page, and send the page back to my server. The difference being that their password would never need to leave their desktop. Their computer would need to be on, and this program running for this method to work.
Write browser add-ons that navigate to the page I need, use the cookie that is saved from a previous login to login to the site, and send the page back to my server. This doesn't require my software to ever ask for their password, but if the cookie expires I am hosed, and I don't know much about browser add-ons besides.
I'm sure there are other options, but these are what I've come up with so far.
I have two questions:
1. What are the other possibilities for this type of task?
2. Am I over-estimating people's reluctance to give me their yahoo (for example) password? Is option (1) above the obvious choice?
It was suggested in the comments that I try Yahoo Pipes, and that looked like a promising suggestion, so I explored it a bit. Having now looked at it, I don't think it is an option. So it looks like I'll be going with option 1.
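For what it's worth, here is a minimal sketch of what option 1 could look like on my server; the login URL and form field names are entirely made up and differ per provider, and doing this may well violate the provider's terms of service (see the answer below):

# Sketch of option 1: log in with the user's credentials and fetch the private
# team page. URLs and form field names are hypothetical placeholders.
Invoke-WebRequest -Uri "https://fantasy.example.com/login" `
    -SessionVariable session | Out-Null   # pick up the initial cookies

# Post the user's credentials (collected with their consent) to the login form.
Invoke-WebRequest -Uri "https://fantasy.example.com/login" `
    -WebSession $session -Method Post `
    -Body @{ username = "user@example.com"; password = "their-password" } | Out-Null

# Re-use the authenticated session to download the team page HTML for parsing.
$teamPage = Invoke-WebRequest -Uri "https://fantasy.example.com/league/123/team/4" `
    -WebSession $session
$teamPage.Content | Out-File -FilePath "team.html"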
This is a problem I grappled with a couple of years ago when I wanted to do the same thing. Our site is http://benchcoach.com and the options we were considering were the following:
Originally we considered getting the user's credentials and logging in. We would then log in and scrape their league and team info. The problem there is that, after reading several of the various terms of service, this would definitely be violating them. On top of this, Yahoo! was definitely one of the sites we were considering, and their users have email (where we could get access to sensitive data) and Yahoo! Wallet. In addition, it would be pretty trivial for Yahoo/ESPN/CBS to block our programmatic logins by IP address.
The solution we settled on (not 100% happy with it, but it does seem to work) was asking our users to install a bookmarklet (like Delicious, Digg or Reddit use) which would post the current HTML page to our servers, where we could parse the data and load our database. If they were still logged into their Yahoo/ESPN/CBS account, we would send them directly to the pages; otherwise, those sites would prompt for authentication. Clicking the bookmarklet once more would post the page to our servers.
The pro of this approach was that we never collected anyone's credentials, so any security concern was alleviated. Secondly, it made it impossible for Yahoo/ESPN/CBS to block access to our service, since we never connected directly to their servers; rather, the user's browser posted the page contents to our server.
The problem with this is that it takes two clicks to post a page to our site. For head-to-head leagues we needed 3-4 pages, so it would take the user 6-8 clicks to sync their league to our servers. We're still looking at options for this.
One important note is that I ran into the product manager of the Yahoo Fantasy Football site at a conference a year ago. We talked about how we were getting the Yahoo data, and he confirmed that collecting credentials would violate their TOS and that they might stop us. While I don't think they would have, it would have made it hard to invest time and energy in developing this only to have them block our site and piss off users by closing their accounts.
A potentially more complicated answer could possibly be done with (for example) yahoo pipes.
Hypothetically, you create a pipe which prompts the user for their credentials and provides them with a URL which contains their scraped data. They enter this URL on your site and never have to provide their credentials to you directly. Even better, for the security-conscious, it would be possible to examine what the pipe was actually doing before entering any information.
The downside would be increased complexity (as well as you'd have to write and maintain the pipe). Having said that, you could provide a link directly to the published pipe from your site, to make things as easy as possible.
Option 1 is the obvious choice. People who trust your site will provide the details. There is no other way you can log in to another site while screen scraping.
