Does Snowflake monitor "Snowflake Ideas Pages" and act?

I thought of posting an idea on the Snowflake Ideas page, and as of this posting date there are 587 ideas listed. Of those, 585 are still marked as New Idea, 1 is on the Roadmap, and 1 is Under Construction. The two ideas that show any progress are both about updating something on the community site itself. It doesn't look like the ideas are being actively monitored or acted on, and there is no feedback on the progress of any of them. Disappointing, and I'm not sure it's even worth posting there.

Related

Error message "Data Studio isn't available at the moment" for days

Some of our reports suddenly produce the following error message when trying to open them (from https://datastudio.google.com/)
We're sorry! Data Studio isn't available at the moment. Please try again later.
We have been getting this message for more than 24 hours now. Any idea what the reason might be, and how to regain access to the reports?
While I still don't know what the root cause was, it's very likely something outside our sphere of influence. Looks like someone on Google's side pushed the wrong button, as it affected more users than just us (see, e.g., here and here).
So to fix this, have a look at the Google support forums, and then sit back, relax, and hope that Google is quick to do their thing.

How do I give the user a customized error code/block reason with a custom pam ssh module?

I wrote a PAM module which does a couple of things and has become too large to post any code here. It basically works similarly to pam_abl, but with a couple of additional features such as city/country-based blocking as well as checking against a DNS blacklist.
Now I want to give the user a reason why their login was not successful, something like: login failed because your country is blocked.
I hope you get the idea. Although I did some research, I have not yet found a way to do this in pam_auth. I hope someone can give me a hint and/or point me in the right direction. Thanks in advance.
Edit: For anyone else with a similar problem: pam_info is what you are looking for.
The source code of pam_motd(8) should give you some idea of how to write back to the user.
Actually, there is the function pam_info(3), which does exactly what you want.
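For illustration, here is a minimal sketch of how pam_info(3) might be used inside pam_sm_authenticate to report a block reason. The country_is_blocked helper is hypothetical and only stands in for the module's real checks.

    /* build (Linux-PAM): gcc -fPIC -shared -o pam_example.so pam_example.c -lpam */
    #include <security/pam_modules.h>
    #include <security/pam_ext.h>

    /* Hypothetical check: real code would look up the remote host's country here. */
    static int country_is_blocked(pam_handle_t *pamh)
    {
        (void)pamh;
        return 1;   /* pretend "blocked" for illustration */
    }

    int pam_sm_authenticate(pam_handle_t *pamh, int flags,
                            int argc, const char **argv)
    {
        (void)flags; (void)argc; (void)argv;

        if (country_is_blocked(pamh)) {
            /* pam_info(3) sends a message to the user through the PAM
               conversation, so the client can print it before the failure. */
            pam_info(pamh, "Login denied: your country is blocked.");
            return PAM_AUTH_ERR;
        }
        return PAM_SUCCESS;
    }

    int pam_sm_setcred(pam_handle_t *pamh, int flags, int argc, const char **argv)
    {
        (void)pamh; (void)flags; (void)argc; (void)argv;
        return PAM_SUCCESS;
    }

Note that whether the message actually reaches the SSH client depends on the authentication flow; with OpenSSH it is typically shown for keyboard-interactive (challenge-response) logins.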

What is the MYOB API's queries per second limit?

I'm getting a "Developer Over QPS" response while working with the AccountRight Live API.
I'm going to use a limiter to throttle my requests, and would like to know what the allowed QPS is.
Any ideas?
(I'll get a rough idea through trial and error, but it'd be sweet to know the exact figure.)
George here from the MYOB API Team. That's a pretty close estimate you managed to reach there; the actual limit is 5 calls per second, which is usually quite enough for most people. It is flexible, however, if you reach a stage where you need more than that.
Cheers!
After some trial and error I've found that a request every 175 ms (5.7 QPS) works, whereas one every 170 ms (5.8 QPS) breaks.
If anyone from the MYOB team could confirm this number that'd be great.
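As a rough sketch of the kind of limiter mentioned above, the following C snippet spaces calls at least 200 ms apart to stay at or below the confirmed 5-calls-per-second budget. The throttle() helper and the single-threaded assumption are mine, not anything from the MYOB API or SDK.

    #include <stdint.h>
    #include <time.h>

    /* Minimum gap between calls for a 5 requests-per-second budget. */
    #define MIN_INTERVAL_NS 200000000L   /* 200 ms */

    static int64_t now_ns(void)
    {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return (int64_t)ts.tv_sec * 1000000000L + ts.tv_nsec;
    }

    /* Call this before every API request; it sleeps just long enough to
       keep the request rate at or below 5 per second (not thread-safe). */
    void throttle(void)
    {
        static int64_t last_call_ns = 0;
        int64_t elapsed = now_ns() - last_call_ns;

        if (last_call_ns != 0 && elapsed < MIN_INTERVAL_NS) {
            int64_t wait = MIN_INTERVAL_NS - elapsed;
            struct timespec req = { (time_t)(wait / 1000000000L),
                                    (long)(wait % 1000000000L) };
            nanosleep(&req, NULL);
        }
        last_call_ns = now_ns();
    }

Calling throttle() immediately before each request keeps a safety margin below the 175 ms spacing found by trial and error above.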

Don't get prompt to save - CRM Solution export

While exporting a Solution from CRM, I don't get the prompt to save the exported solution.
I cleared cookies and tried,
upgraded to IE 9 and tried,
and made sure that the popup is not blocked, but these things didn't help.
This is the case with a few other colleagues of mine as well.
Thanks
I usually get a popup right at the bottom of the window. It's not instant, however; CRM has to generate the solution file, so if it's particularly large it can take a minute or two. I would suggest clicking export, finishing the dialog, and then waiting five minutes to see if anything happens (it really needs a loading icon).
If that doesn't work, try the same again with a small solution (one entity).
I know it's too late, but I'm adding the solution here in case someone faces the issue again.
The issue was due to URLScan. It was solved by removing URLScan from the ISAPI filters for the CRM web site.

Fogbugz database schema management

This is a very simple question, and maybe the man himself can provide insight on this :)
Does anyone know the pseudocode behind how Fog Creek does database schema management?
I'm running into an issue and I'm trying to figure out if I'm handling it right. I have a module that runs each time someone spins up their site and examines their database to make sure it has the right changes in place. If changes are missing, the script makes the required changes.
My issue is that I was trying to tie it to the Session_Start portion of the Global.asax, but it seems to be rather flaky at times, and I'm trying to come up with a better approach.
For reference, I'm trying to run a single web application that can respond to any number of hosts, where each host is mapped via a metabase to find out which database it belongs to, and then the necessary connections are made.
You might have more luck asking this on http://fogbugz.stackexchange.com/
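For what it's worth, a common pattern for the kind of startup check described above is to store a schema version in the database and apply, in order, whichever migrations have not yet been run. Below is a minimal sketch of that pattern in C, using SQLite's user_version pragma purely to keep the example self-contained; the MIGRATIONS list and upgrade_schema() are made up for illustration and say nothing about how Fog Creek actually does it.

    #include <stdio.h>
    #include <sqlite3.h>

    /* Hypothetical per-tenant migrations, applied in order when missing. */
    static const char *MIGRATIONS[] = {
        "CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)",
        "ALTER TABLE customers ADD COLUMN email TEXT",
    };
    #define MIGRATION_COUNT (sizeof MIGRATIONS / sizeof MIGRATIONS[0])

    /* Bring one tenant database up to the latest schema version.  SQLite's
       user_version pragma stores how many migrations are already applied. */
    int upgrade_schema(sqlite3 *db)
    {
        sqlite3_stmt *stmt;
        int version = 0;

        if (sqlite3_prepare_v2(db, "PRAGMA user_version", -1, &stmt, NULL) != SQLITE_OK)
            return -1;
        if (sqlite3_step(stmt) == SQLITE_ROW)
            version = sqlite3_column_int(stmt, 0);
        sqlite3_finalize(stmt);

        for (int i = version; i < (int)MIGRATION_COUNT; i++) {
            char pragma[64];
            if (sqlite3_exec(db, MIGRATIONS[i], NULL, NULL, NULL) != SQLITE_OK)
                return -1;
            snprintf(pragma, sizeof pragma, "PRAGMA user_version = %d", i + 1);
            if (sqlite3_exec(db, pragma, NULL, NULL, NULL) != SQLITE_OK)
                return -1;
        }
        return 0;
    }

Running a check like this once per application start, rather than hooking Session_Start, keeps it cheap and sidesteps the flakiness described in the question.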
