Export content from a Yammer network

How can I export all threads, including attachments, from a Yammer network?
Background
We have used the free version of Yammer for a while, and it has now been decided to move to a paid version. Because of that, I need to back up all posts, images, etc. on our existing network.
So far I have been unable to find a suitable tool to do this, and the export utility is not available for a free instance (which will eventually be closed down).
Please advise. Thanks in advance.

It appears to be hidden on the developer portal, but there is a data export API, which is available to paid networks. You will need API credentials from a verified admin account to call it; normal user accounts are unable to execute the data export endpoint.
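A minimal sketch of calling that endpoint in Python; the URL, the since parameter, and the bearer-token handling are assumptions based on the developer-portal documentation and should be verified there before relying on them:

import requests

# Sketch only: endpoint path, "since" parameter and token scheme are taken
# from the developer-portal docs; the token must belong to a verified admin.
TOKEN = "VERIFIED_ADMIN_OAUTH_TOKEN"  # placeholder
resp = requests.get(
    "https://www.yammer.com/api/v1/export",
    params={"since": "2010-01-01T00:00:00+00:00"},  # export everything since this date
    headers={"Authorization": "Bearer " + TOKEN},
    stream=True,
)
resp.raise_for_status()
with open("yammer_export.zip", "wb") as f:  # the export is delivered as a zip
    for chunk in resp.iter_content(chunk_size=1 << 20):
        f.write(chunk)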

Related

MapsCreatorStorageQuotaExceeded error with Azure Maps

I am using the Azure Maps API for DWG to GeoJSON conversion, and it has started to give the error below:
"400 Bad Request:
[{"error":{
"code":"MapsCreatorStorageQuotaExceeded",
"message":"Storage
used by Maps Creator (. Mb) exceeds storage available (.
Mb)."}}]"
I have deleted conversions, datasets, and mapData resources, but I am still facing the same issue. What else could be consuming the resources here?
Or is there any way I can check where the storage is being consumed? As I am not the administrator, I can't check that in the Admin console; I can only access the APIs using an API key.
You can see storage utilization on the Creator overview page in the Azure Portal.
To learn more about Creator, see Manage Azure Maps Creator.
Also, see the Creator section in Azure Maps pricing for details.
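Since you only have an API key, you could also try listing the uploaded mapData resources and their sizes over the REST API. A minimal sketch, assuming the mapData list endpoint, the "us" geography, and the mapDataList/sizeInBytes response shape; verify all of these against the current Azure Maps documentation:

import requests

SUBSCRIPTION_KEY = "YOUR_AZURE_MAPS_KEY"  # placeholder
# Assumed endpoint and api-version; adjust the geography ("us") to your account.
resp = requests.get(
    "https://us.atlas.microsoft.com/mapData",
    params={"api-version": "2.0", "subscription-key": SUBSCRIPTION_KEY},
)
resp.raise_for_status()
for item in resp.json().get("mapDataList", []):
    # Print each uploaded resource and how much storage it uses.
    print(item.get("udid"), item.get("sizeInBytes"))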

Recover Cloud functions default service account with the undelete POST call

I have used Google Cloud Functions for quite a long time, with no real authentication problems until now.
Today I met this error while deploying a new function:
ERROR: (gcloud.functions.deploy) ResponseError: status=[400], code=[Bad Request], message=[Default service account 'PROJECT-ID@appspot.gserviceaccount.com' doesn't exist. Please recreate this account (for example by disabling and enabling the Cloud Functions API), or specify a different account.]
I tried several things:
disable/enable the GCF API: no service account recovered
gcloud beta app repair (reference here): no default service account recovered
the undelete API POST call
If I understand the current GCP features correctly, the last option is my best bet, but somehow I keep getting a 400 error.
I found my unique ID in my activity log, at the creation of the default service account.
I really can't see where the problem is in the undelete API call, and I would be really thankful if you could help with it.
Thanks to @Maxim, I now know that my problem comes from the fact that the deletion of this service account happened more than 30 days ago. This means it has already been purged from the system and is not recoverable anymore.
In case you meet the same kind of problem, please try out this link:
https://cloud.google.com/iam/docs/creating-managing-service-accounts#undeleting_a_service_account
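For reference, a minimal sketch of the undelete call described at that link, assuming you have the numeric unique ID of the deleted account and an OAuth2 access token obtained separately (for example via gcloud auth print-access-token); it only works within 30 days of deletion:

import requests

UNIQUE_ID = "123456789012345678901"      # numeric unique ID from the activity log (placeholder)
ACCESS_TOKEN = "YOUR_OAUTH2_ACCESS_TOKEN"  # e.g. output of `gcloud auth print-access-token`
resp = requests.post(
    f"https://iam.googleapis.com/v1/projects/-/serviceAccounts/{UNIQUE_ID}:undelete",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
# A 400 response is expected if the account was deleted more than 30 days ago,
# because it has already been purged from the system.
print(resp.status_code, resp.text)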
I see three alternative ways to proceed from here:
Create a new project from scratch to work from.
File a support case via the support center.
Open a private issue by providing your project number in the following component.
I believe reaching out to GCP Support for help is the practical option at this stage, and I recommend you do so, seeing as you've attempted most if not all of the Service Account recovery methods without success.
On a last note, as for the latter option: the contents of the private issue will only be visible to you and to the GCP Support staff (us). If you choose this option, please let me know when it's opened, and I'll start working on it as soon as possible.

Codename One Preferences/Storage permissions

I have developed and published an app in the Google Play Store which only sends simple String requests to a REST API and stores the results in Preferences. The same app was also submitted to the Windows Store for publication; however, it was rejected for the following reason:
The app declares use of the sensitive capability [musicLibrary, picturesLibrary, videosLibrary] without appearing to access the declared capability. Please remove the sensitive capability declaration and re-submit the app.
Upon inspecting the Google Play Store submission, I noticed the same permissions are requested:
This app has access to:
Photos/Media/Files
  read the contents of your USB storage
  modify or delete the contents of your USB storage
Storage
  read the contents of your USB storage
  modify or delete the contents of your USB storage
Other
  receive data from Internet
  view network connections
  full network access
  prevent device from sleeping
So my question is: do Preferences really need these permissions, or can I set some kind of build hint to remove these permission requests, especially for the UWP build? I have also tried setting the android.blockExternalStoragePermission build hint, but the permissions are still requested in the Android build. I have yet to try an iOS build, since I currently don't have an Apple Developer account.
Thank you very much in advance.
Edit #1 (23/10/2018):
Upon further inspection, I found that I had mistakenly uploaded the version that didn't declare android.blockExternalStoragePermission to the Google Play Store, so all is good on the Android version.
Currently I'm not using any cn1libs, and here's the list of all classes imported in my application:
java.util.HashMap
java.util.Map
java.util.Random
com.codename1.components.InfiniteProgress
com.codename1.components.ToastBar
com.codename1.components.ToastBar.Status
com.codename1.io.CharArrayReader
com.codename1.io.JSONParser
com.codename1.io.Log
com.codename1.io.NetworkManager
com.codename1.io.Preferences
com.codename1.io.rest.Response
com.codename1.io.rest.Rest
com.codename1.l10n.L10NManager
com.codename1.ui.Button
com.codename1.ui.Component
com.codename1.ui.Container
com.codename1.ui.Dialog
com.codename1.ui.FontImage
com.codename1.ui.Form
com.codename1.ui.Label
com.codename1.ui.events.ActionEvent
com.codename1.ui.events.ActionListener
com.codename1.ui.layouts.BorderLayout
com.codename1.ui.layouts.FlowLayout
com.codename1.ui.layouts.GridLayout
com.codename1.ui.plaf.Border
com.codename1.ui.plaf.Style
com.codename1.ui.plaf.UIManager
com.codename1.ui.util.Resources
So my original question remains: how do I set the build hints to prevent the same external storage read/write permissions in the Windows and iOS builds?
See the section titled "Android Permissions" here for a list of some APIs that might trigger extra permissions. I suggest extracting the manifest XML and inspecting it. Based on your description, it should include two permissions:
android.permission.WRITE_EXTERNAL_STORAGE - which should have been disabled when you applied android.blockExternalStoragePermission
android.permission.INTERNET - this one you actually need
I'm assuming you have a permission for media access, and then it becomes a question of where it came from:
Did you use a cn1lib that might include a feature that triggers this?
Do you have a feature in the app that isn't active yet?
Once you have the specific name, or the results of this investigation, comment here and I'll revise the answer with more details.
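As for the mechanics of setting a hint: in a Codename One project, build hints are stored in codenameone_settings.properties with a codename1.arg. prefix (they can also be edited through the plugin's Build Hints UI), so the hint from the question would be the line below. This is a sketch of the mechanism only, not a guarantee that it removes the UWP capability declarations:

codename1.arg.android.blockExternalStoragePermission=true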

Exporting a Typo3 site bit by bit

(edit: I'm leaving all the mistaken assumptions in just in case someone else makes the same mistakes)
I have an ancient Typo3 3.8.1 site on a remote server. I don't have access to that server, and the team in charge of maintaining the site doesn't know who to contact to get access to the server. I do have the admin rights on that site, though. (edit: no I don't. oops.)
This is what I see in the (not) admin menu:
I'm not sure if this version supports extensions; I can't find an extension manager anywhere. (because I'm not an admin)
I want to export the site so I can host it on a server on my own domain instead. The problem is that the export file is too large and I can't download it. Will I destroy the directory structure if I export a bunch of pages at a time?
If you have admin access to the backend, you can try to install Quixplorer, a file manager. Using it you can zip the folders in the main directory (typo3, typo3conf, fileadmin, etc.) one by one and download them via the browser.
It's important to download and remove typo3conf.zip from the server as soon as possible, because it contains sensitive data.
Additionally, you can install the phpMyAdmin extension (search the repository) if you don't have another MySQL client.
Edit:
If you can't use Quixplorer, the only way is... to write your own extension and upload it via the Extension Manager; there you'll need to perform primitive file system operations like:
(PHP)
system('zip -r t3c.zip typo3conf/');
Sometimes the server allows more memory and execution time than the T3D export uses by default. So, if you can change PHP files on that server, try editing typo3/sysext/impexp/class.tx_impexp.php: search for ini_set and change those settings. If the server allows it, you can then create bigger t3d files.
And you could try some shell-extensions to get hands on that server:
http://typo3.org/extensions/repository/view/phpshell
http://typo3.org/extensions/repository/view/mw_shell
http://typo3.org/extensions/repository/view/shell
But to answer your initial question: you can create a couple of T3D files and import them again. Just force the uids when you import them, and install all needed extensions first!

How to automate download of weekly export service files

In Salesforce you can schedule up to weekly "backups"/dumps of your data here: Setup > Administration Setup > Data Management > Data Export
If you have a large Salesforce database, there can be a significant number of files to download by hand.
Does anyone have a best practice, tool, batch file, or trick to automate this process or make it a little less manual?
Last time I checked, there was no way to access the backup file status (or actual files) over the API. I suspect they have made this process difficult to automate by design.
I use the Salesforce scheduler to prepare the files on a weekly basis, then I have a scheduled task that runs on a local server which downloads the files. Assuming you have the ability to automate/script some web requests, here are some steps you can use to download the files:
1. Get an active Salesforce session ID/token: use the enterprise API login() SOAP method.
2. Get your organization ID ("org ID"): Setup > Company Profile > Company Information, or use the enterprise API getUserInfo() SOAP call to retrieve it.
3. Send an HTTP GET request to https://{your sf.com instance}.salesforce.com/ui/setup/export/DataExportPage/d?setupid=DataManagementExport with the request cookie set as follows: oid={your org ID}; sid={your session ID};
4. Parse the resulting HTML for instances of <a href="/servlet/servlet.OrgExport?fileName= (the filename begins after fileName=).
5. Plug each file name into this URL to download (and save): https://{your sf.com instance}.salesforce.com/servlet/servlet.OrgExport?fileName={filename}, using the same cookie as in step 3.
This is by no means a best practice, but it gets the job done. It should go without saying that if they change the layout of the page in question, this probably won't work any more. Hope this helps.
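If you don't have SOAP tooling handy, steps 1 and 2 can be done with a single raw call to the SOAP login endpoint. A minimal Python sketch, using the partner API namespace rather than the enterprise one (so no org-specific WSDL is needed); the API version and the crude regex parsing are assumptions to adapt:

import re
import requests

USERNAME = "user@example.com"           # placeholder
PASSWORD_PLUS_TOKEN = "passwordTOKEN"   # password with security token appended

# login() is the same call mentioned above; urn:partner.soap.sforce.com
# avoids needing an org-specific (enterprise) WSDL.
envelope = f"""<?xml version="1.0" encoding="utf-8"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:urn="urn:partner.soap.sforce.com">
  <soapenv:Body>
    <urn:login>
      <urn:username>{USERNAME}</urn:username>
      <urn:password>{PASSWORD_PLUS_TOKEN}</urn:password>
    </urn:login>
  </soapenv:Body>
</soapenv:Envelope>"""

resp = requests.post(
    "https://login.salesforce.com/services/Soap/u/55.0",  # API version is an assumption
    data=envelope,
    headers={"Content-Type": "text/xml; charset=UTF-8", "SOAPAction": '""'},
)
resp.raise_for_status()
# Crude extraction for a sketch; a real script should parse the XML properly.
session_id = re.search(r"<sessionId>([^<]+)</sessionId>", resp.text).group(1)
org_id = re.search(r"<organizationId>([^<]+)</organizationId>", resp.text).group(1)
print(org_id, session_id)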
A script to download the Salesforce backup files is available at https://github.com/carojkov/salesforce-export-downloader/
It's written in Ruby and can run on any platform. The supplied configuration file provides fields for your username, password, and download location.
With a little configuration you can get your downloads going. The script sends email notifications on completion or failure.
It's simple enough to figure out the sequence of steps needed to write your own program if the Ruby solution does not work for you.
I'm Naomi, CMO and co-founder of cloudHQ, so I feel like this is a question I should probably answer. :-)
cloudHQ is a SaaS service that syncs your cloud. In your case, you'd never need to upload your reports as a data export from Salesforce; you'll just always have them backed up in a folder labeled "Salesforce Reports" in whichever service you synchronized Salesforce with: Dropbox, Google Drive, Box, Egnyte, SharePoint, etc.
The service is not free, but there's a free 15-day trial. To date, there's no other service that syncs your Salesforce reports with other cloud storage companies in real time.
Here's where you can try it out: https://cloudhq.net/salesforce
I hope this helps you!
Cheers,
Naomi
Be careful that you know what you're getting in the backup file. The backup is a zip of 65 different CSV files. It's raw data that cannot be used very easily outside of the Salesforce UI.
Our company makes the free DataExportConsole command line tool to fully automate the process. You do the following:
Automate the weekly Data Export with the Salesforce scheduler
Use the Windows Task Scheduler to run the FuseIT.SFDC.DataExportConsole.exe file with the right parameters.
I recently wrote a small PHP utility that uses the Bulk API to download a copy of the sObjects you define via a JSON config file.
It's pretty basic but can easily be expanded to suit your needs.
Force.com Replicator on github.
Adding a Python 3.6 solution. It should work (I haven't tested it, though). Make sure the packages (requests, beautifulsoup4, and simple_salesforce) are installed.
import os
import requests
from datetime import datetime
from bs4 import BeautifulSoup as BS
from simple_salesforce import Salesforce

def login_to_salesforce():
    # Credentials are read from environment variables.
    sf = Salesforce(
        username=os.environ.get('SALESFORCE_USERNAME'),
        password=os.environ.get('SALESFORCE_PASSWORD'),
        security_token=os.environ.get('SALESFORCE_SECURITY_TOKEN')
    )
    return sf

org_id = "SALESFORCE_ORG_ID"  # can be found in Salesforce -> Company Profile
export_page_url = "https://XXXX.my.salesforce.com/ui/setup/export/DataExportPage/d?setupid=DataManagementExport"

sf = login_to_salesforce()
cookie = {'oid': org_id, 'sid': sf.session_id}
export_page = requests.get(export_page_url, cookies=cookie)
export_page = export_page.content.decode()

# Collect the links to the export files from the page HTML.
links = []
parsed_page = BS(export_page, "html.parser")
_path_to_exports = "/servlet/servlet.OrgExport?fileName="
for link in parsed_page.findAll('a'):
    href = link.get('href')
    if href is not None and href.startswith(_path_to_exports):
        links.append(href)
print(links)

if len(links) == 0:
    print("No export files found")
    exit(0)

today = datetime.today().strftime("%Y_%m_%d")
download_location = os.path.join(".", "tmp", today)
os.makedirs(download_location, exist_ok=True)

baseurl = "https://XXXX.my.salesforce.com"  # same instance as export_page_url
for link in links:
    file_url = baseurl + link
    # stream=True avoids loading the whole file into RAM
    downloadfile = requests.get(file_url, cookies=cookie, stream=True)
    filename = downloadfile.headers['Content-Disposition'].split("filename=")[1]
    with open(os.path.join(download_location, filename), 'wb') as f:
        for chunk in downloadfile.iter_content(chunk_size=100 * 1024 * 1024):  # 100 MB chunks
            if chunk:
                f.write(chunk)
I have added a feature in my app to automatically back up the weekly/monthly CSV files to an S3 bucket: https://app.salesforce-compare.com/
Create a connection provider (currently only AWS S3 is supported) and link it to a SF connection (which needs to be created as well).
On the main page you can monitor the progress of the scheduled job and access the files in the bucket.
More info: https://salesforce-compare.com/release-notes/
