Sharing storage among VMs - google-app-engine

I have used AWS, Azure and Google App Engine, but I find the ability to share storage lacking in them (correct me if I am wrong). My basic need is to have multiple VM instances share common storage. My setup is:
I have 2 sets of servers: the 1st set runs a web app that uploads files, and the 2nd set processes those files. None of these services allow you to attach a common disk to multiple VMs. I tried to create an Azure File Service, but the 1st step gives an error. Get-Account does return my account data.
$ctx=New-AzureStorageContext <account name> <account key>
I've got no clue what that <account key> is. I tried using my Azure login password as the account key, but got the following error:
PS C:\> $ctx=New-AzureStorageContext mewtoo **password**
New-AzureStorageContext : Invalid length for a Base-64 char array or string.
At line:1 char:6
+ $ctx=New-AzureStorageContext mewtoo *********
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : CloseError: (:) [New-AzureStorageContext], FormatException
+ FullyQualifiedErrorId : Microsoft.WindowsAzure.Commands.Storage.Common.Cmdlet.NewAzureStorageContext
Obviously the last option would be to upload/download the files using HTTP calls, but the files can be 100 MB to 1 GB+ in size, so downloading each file, processing it, and uploading it back will be time consuming. If better solutions are available, please let me know.
Thanks in advance.

The account key is the access key for your storage account, which you can find in the management portal.
Here's a step-by-step guide to create a new share: http://blogs.technet.com/b/canitpro/archive/2014/09/23/step-by-step-create-a-file-share-in-azure.aspx
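As a sketch of the whole flow, assuming the classic Azure PowerShell module that New-AzureStorageContext comes from (the storage account name "mewtoo" is taken from the question, and the share name "uploads" is made up):

```powershell
# Retrieve the access key instead of using the login password.
# The key is a long Base64 string -- which is why passing a plain
# password produces "Invalid length for a Base-64 char array".
$key = (Get-AzureStorageKey -StorageAccountName "mewtoo").Primary

# Build the storage context from the account name and key:
$ctx = New-AzureStorageContext -StorageAccountName "mewtoo" -StorageAccountKey $key

# Create the file share that both sets of VMs can then mount:
New-AzureStorageShare -Name "uploads" -Context $ctx
```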

Related

How do you resolve an "Access Denied" error when invoking `image_uris.retrieve()` in AWS Sagemaker JumpStart?

I am working in a SageMaker environment that is locked down. For example, my user account is prevented from creating S3 buckets. But, I can successfully run vanilla ML training jobs by passing in role=get_execution_role to an instance of the Estimator class when using an out-of-the-box algorithm such as XGBoost.
Now, I'm trying to use an algorithm (LightGBM) that is only available via the JumpStart feature in SageMaker, but I can't get it to work. When I try to retrieve an image URI via image_uris.retrieve(), it returns the following error:
ClientError: An error occurred (AccessDenied) when calling the GetObject operation: Access Denied.
This makes some sense to me if my user permissions are being used when creating an object. But what I want to do is specify another role - like the one returned from get_execution_role - to perform these tasks.
Is that possible? Is there another work-around available? How can I see which role is being used?
Thanks,
When I encountered this issue, it was caused by permissions on a bucket that had changed.
In the SageMaker Python SDK source code, there is a cache located in an AWS-owned bucket, jumpstart-cache-prod-{region}, with a manifest.json that resolves the ECR path for the image for you.
If you look at the stack trace, it is likely erroring out at the code that looks for that manifest.
One place to look is whether new restrictions have been placed in IAM; at minimum you need a policy that grants read access to the JumpStart (pretrained) model cache.
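As a hedged sketch of such a policy, assuming the cache bucket naming above (the exact ARNs and action list are an assumption; check the official SageMaker JumpStart docs for your region):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "JumpStartCacheRead",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::jumpstart-cache-prod-*",
        "arn:aws:s3:::jumpstart-cache-prod-*/*"
      ]
    }
  ]
}
```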

How do I connect from my cmd to a Snowflake account using SnowSQL

I am trying to connect using snowsql from cmd, but it fails every time.
The reason for this issue is that the account name looks incorrect. It should include the region where the account is deployed (except when it is deployed in the US West (Oregon) region) and should not include "snowflakecomputing.com".
For example: if the actual URL you use in the browser to access the Snowflake UI is abcd.region.snowflakecomputing.com, then the account name to use when connecting from SnowSQL would be abcd.region.
If not, get the exact identifier for your region from the following doc: https://docs.snowflake.com/en/user-guide/admin-account-identifier.html
and try connecting from SnowSQL again.
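The URL-to-account-name mapping above can be sketched as a tiny helper (a sketch only; it does not cover the US West (Oregon) special case, where the region is omitted):

```python
def account_from_url(url: str) -> str:
    """Derive the SnowSQL account name from a Snowflake UI URL.

    Strips the scheme, any trailing path, and the
    snowflakecomputing.com suffix, leaving <account>.<region>.
    """
    host = url.split("://")[-1].split("/")[0]
    suffix = ".snowflakecomputing.com"
    return host[: -len(suffix)] if host.endswith(suffix) else host

print(account_from_url("abcd.region.snowflakecomputing.com"))  # abcd.region
```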

Error with the Zerologon POC on Samba AC DC

I have a school project that requires me to reproduce CVE-2020-1472 (Zerologon) in a local environment.
I am currently trying to test the following GitHub script, but I am facing some issues that I haven't been able to solve.
To summarize, I have:
an Ubuntu 16.04 machine with Samba 4.3.8 as an Active Directory Domain Controller
a Windows Server 2019 machine joined to the domain EXAMPLE.COM that I created with Samba.
I've run several tests and I can successfully reset the password of the "Administrator" account with the first part of the PoC:
./cve-2020-1472-exploit.py -n EXAMPLE-DC -t 1.2.3.4
The problem comes when trying to run impacket to extract some data from the domain:
secretsdump.py -no-pass -just-dc Domain/'DC_NETBIOS_NAME$'@DC_IP_ADDR
It successfully logs in, but then I get this message:
Password:
[*] Dumping Domain Credentials (domain\uid:rid:lmhash:nthash)
[*] Using the DRSUAPI method to get NTDS.DIT secrets
[-] DRSR SessionError: code: 0x20e4 - ERROR_DS_CANT_FIND_EXPECTED_NC - The naming context could not be found.
[*] Something wen't wrong with the DRSUAPI approach. Try again with -use-vss parameter
[*] Cleaning up...
UPDATE 1:
I also tried the -use-vss option, but it also fails after logging in.
[-] RemoteOperations failed: DCERPC Runtime Error: code: 0x5 - rpc_s_access_denied
[*] Searching for NTDS.dit
[-] 'NoneType' object has no attribute 'request'
[*] Cleaning up...
I tried to find information about DRSR SessionError: code: 0x20e4 - ERROR_DS_CANT_FIND_EXPECTED_NC, but I didn't find anything useful on the internet.
I hope someone has already faced this error or has knowledge of Active Directory, because I'm really stuck.
Thanks in advance and have a good week !
UPDATE 2:
I made a post on impacket's GitHub, and it seems the DRSUAPI approach hasn't been tested against a non-Windows AD before, so secretsdump won't work.
Every page mentions Samba as vulnerable, so I wonder whether they really tested the exploit or whether it was just a theoretical assumption.

Connecting to SnowSQL Client using Snowflake Credentials

I have successfully installed SnowSQL client version 1.2.5, but while trying to log into my Snowflake account using my account id, username and password, I am unable to connect and get the following error:
(screenshot: snowsql unable to log in)
This appears to be a networking issue. Have you tried setting the debug logging as directed?
To assist in situations like this, Snowflake has a tool called SnowCD that can help you determine whether your client host is able to access all required network endpoints for your Snowflake account. The documentation is here and the installation is fairly straightforward:
https://docs.snowflake.com/en/user-guide/snowcd.html
I'd recommend trying SnowCD as your first step; the next step would be to review any required proxy settings your organization might have. I'd also double-check your "account name" argument: the URL looks OK to me, but there is a nice writeup on account name construction at this link:
https://docs.snowflake.com/en/user-guide/connecting.html#your-snowflake-account-name
I hope this helps...Rich
THANKS Rich for doing some R&D and sharing proposals. I successfully logged into snowsql by providing my account id up to ".aws". I hope it helps others who have been struggling like me:
https://docs.snowflake.com/en/user-guide/getting-started-tutorial-log-in.html
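As a side note, the working account identifier can also be saved in SnowSQL's configuration file (~/.snowsql/config) so it doesn't have to be retyped on every login; the values below are placeholders:

```ini
[connections]
accountname = abcd.us-east-1.aws
username = jdoe
```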

Read log files on JBoss AS 7

I have an application running on JBoss AS 7 and creating log files in /standalone/log.
For security reasons I am not allowed to browse the JBoss directories.
Is there any built-in application to read these log files from a browser?
NB: I cannot use the admin console either.
No, nothing built in. You can have the admins configure the logging service to put the logs where you can get to them, or you can configure the logger to capture logs and post them to a database or another destination.
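As a sketch of that first option, the admins could repoint the file handler using the JBoss CLI (jboss-cli.sh) so the logs land in a directory you are allowed to read; the target path below is an assumption:

```
# Point the default periodic-rotating-file-handler at a readable path
# (requires a reload/restart to take effect):
/subsystem=logging/periodic-rotating-file-handler=FILE:write-attribute(name=file, value={"path" => "/var/log/myapp/server.log"})
```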
Not yet, but there are some requests for it (one by me, BTW ;-) and it might appear in WildFly 8. Hopefully. (Vote on them if you like.)
WFLY-1048 Allow hooking into logging subsystem through Management API
WFLY-1144 Provide the ability to view server logs through the web interface
WFLY-280 Provide an operation to retrieve the last 10 errors from the log
Until then, I suggest asking the admins to allow access to that one particular log file.
If that doesn't go through, you may declare a dependency of your deployment on the logging service's modules (Dependencies: ... in MANIFEST.MF) and on the log manager in the JVM, unless there's some additional obstacle like a security manager.
