Moving BigQuery data to the new European zone - google-app-engine

Following Google's announcement that BigQuery data can now be located in Europe, I'm trying to find out how to start the migration process.
Support replied: "For more information about moving data in the new European zone you will need to contact the Google Cloud Platform technical support team. I invite you to submit your technical questions on Stackoverflow."
So here I am. How do we move our data to Europe?
Thank you for your help.
Best regards,
Alain

Use the BigQuery web UI to either:
- Copy the table into a new dataset
- Share the existing dataset
- Export the table, save it locally, and upload it again to the new dataset
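The export/re-load route can also be scripted. Below is a minimal sketch using the google-cloud-bigquery client library (`pip install google-cloud-bigquery`): export the US table to a Cloud Storage bucket, create a dataset whose location is set to the EU, and load the files back in. Project, dataset, table and bucket names are placeholders, not values from the question.

```python
# Sketch of a cross-region move (US -> EU) with the google-cloud-bigquery
# client library. All names here are placeholders.

def gcs_export_uri(bucket, table_id):
    """GCS wildcard URI the table is exported to and loaded back from."""
    return "gs://%s/%s-*.avro" % (bucket, table_id)

def move_table_to_eu(project, src_dataset, table_id, bucket):
    from google.cloud import bigquery  # third-party dependency

    client = bigquery.Client(project=project)

    # 1. Export the US table to Cloud Storage (Avro keeps the schema).
    extract_job = client.extract_table(
        client.dataset(src_dataset).table(table_id),
        gcs_export_uri(bucket, table_id),
        job_config=bigquery.ExtractJobConfig(destination_format="AVRO"),
    )
    extract_job.result()  # wait for the export to finish

    # 2. Create the destination dataset with its location set to the EU.
    eu_dataset = bigquery.Dataset(client.dataset(src_dataset + "_eu"))
    eu_dataset.location = "EU"
    client.create_dataset(eu_dataset)

    # 3. Load the exported files into the EU dataset.
    load_job = client.load_table_from_uri(
        gcs_export_uri(bucket, table_id),
        eu_dataset.table(table_id),
        job_config=bigquery.LoadJobConfig(source_format="AVRO"),
    )
    load_job.result()
```

The bucket should be reachable from both regions (a multi-regional bucket is the simplest choice).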

Related

Do you need a google workspace account to retrieve a TXT record for a custom google domain name?

I have been attempting to learn web development by building my own website (ever so slowly and painfully), which eventually led me to create a React/Firebase site. I bought a custom domain name from Google Domains, and to verify it with Firebase I need to add a TXT record. From all my searching, it seems I need to buy a Google Workspace membership to retrieve this TXT record. Is this true? I would really like to avoid spending any more money and would love any advice or assistance, thank you!
No Google Workspace subscription is needed: Firebase gives you the TXT record value, and you can add DNS records directly in Google Domains.

Pulling data from sharepoint site analytics

I am looking to pull site usage data from SharePoint, such as daily users, click-through rate, which parts of the site are used the most, which links are used the most, and which documents are opened the most. Is there a way to do this through Excel or another program? I have been looking at Power BI, Excel, Power Query, etc., but I haven't found a way to pull the data from SharePoint analytics itself.
I am looking to pull data from the SharePoint site and display it as a chart, a Pareto chart for example.
1. What the site usage analytics page can currently export:
Site owners can export 90 days of site usage data as an Excel file via the download button in the upper-right corner of the site usage page. The report covers unique viewers, site visits, popular platforms and site traffic. For popular content on the site (news posts, documents and pages) the report covers only the last 7 days.
2. I've also tried getting data from the web in Excel, but it doesn't work. There is currently only one connector between site usage and Power BI.
I tracked down a published post on UserVoice: Export to Excel on Site usage. You can vote and comment on it anytime.
3. You might try using the Office 365 Management Activity API to retrieve and store the data in a database, and then report on it with Power BI. This requires registering an application with Azure AD and granting it permissions to the API.
Reference: Office 365 Management Activity API reference
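For a concrete picture of option 3, here is a minimal stdlib-only sketch of querying the Management Activity API's content endpoint for SharePoint audit records. The tenant ID and access token are placeholders; obtaining the token via Azure AD is assumed to have happened already.

```python
# Sketch of pulling SharePoint audit data from the Office 365 Management
# Activity API. tenant_id and access_token are placeholders.
import json
import urllib.request

API_ROOT = "https://manage.office.com/api/v1.0"

def content_url(tenant_id, content_type):
    """Endpoint that lists available content blobs for a tenant."""
    return ("%s/%s/activity/feed/subscriptions/content?contentType=%s"
            % (API_ROOT, tenant_id, content_type))

def fetch_sharepoint_audit(tenant_id, access_token):
    # Each listed blob has a contentUri pointing at a batch of audit records.
    req = urllib.request.Request(
        content_url(tenant_id, "Audit.SharePoint"),
        headers={"Authorization": "Bearer " + access_token})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

A subscription to the `Audit.SharePoint` content type must be started first (see the API reference above).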

Solution to export subscribers who selected various field opt-outs, through query or other method in salesforce marketing cloud

I am looking for a solution to export subscribers who selected various field opt-outs, through a query or another method in Salesforce Marketing Cloud. For example, some subscribers have opted in to receiving emails from some locations (designated fields) but not all. This export is being run through Alteryx and a master database into a new email service provider. Retaining the opt-outs is crucial. Looking for some help with this migration, as I am not an expert in Salesforce Marketing Cloud. Thank you!
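Once the subscriber rows are exported (e.g. a data extension pulled through Alteryx as CSV), preserving the per-field opt-outs is a matter of carrying those columns through. A small illustrative sketch of mapping each subscriber to the fields they opted out of; the field names `OptOut_Paris` and `OptOut_London` are invented for the example, not real Marketing Cloud fields:

```python
# Hypothetical sketch: map each SubscriberKey to its per-field opt-outs
# from an exported CSV. Column names are made up for illustration.
import csv
import io

def optouts_by_subscriber(rows, optout_fields):
    """Map each SubscriberKey to the list of fields it opted out of."""
    result = {}
    for row in rows:
        opted_out = [f for f in optout_fields
                     if row.get(f, "").lower() in ("true", "1", "yes")]
        result[row["SubscriberKey"]] = opted_out
    return result

sample = io.StringIO(
    "SubscriberKey,OptOut_Paris,OptOut_London\n"
    "abc,true,false\n"
    "def,false,true\n")
mapping = optouts_by_subscriber(
    csv.DictReader(sample), ["OptOut_Paris", "OptOut_London"])
```

The same per-field logic could equally be expressed as a SQL Query Activity in Automation Studio before exporting.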

Developing for Google App Engine and using the datastore

I am just getting started with Google Web Toolkit and Google App Engine and have a quick question. I think I understand how to use the datastore now but I was wondering if there is a way that I can quickly create a "database" with static data from an excel sheet? I just need to add some data for a proof of concept later this week.
I am picturing something similar to a SQL database browser where I can just import the data?
I am developing in Eclipse with the appropriate plugins.
Thanks,
Rob
The easiest way to do this would be to save your spreadsheet as a CSV file, then use the bulkloader to load it into the datastore.
Your best bet is probably to write something to handle uploading it, or to process it on the server.
However, you should also look at the bulk loader. It might be able to save you a little bit of time.
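Whichever route you take, the first step is saving the Excel sheet as CSV and parsing it into per-entity records. A stdlib-only sketch (the `name`/`price` columns are invented for illustration; the resulting dicts could be fed to the bulkloader or to your own upload handler):

```python
# Parse a CSV exported from Excel into per-entity dicts, coercing the
# numeric columns, ready for upload. Column names are placeholders.
import csv
import io

def rows_to_entities(csv_text, int_fields=()):
    entities = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        for f in int_fields:
            row[f] = int(row[f])  # CSV values arrive as strings
        entities.append(row)
    return entities

sample = "name,price\nwidget,10\ngadget,25\n"
entities = rows_to_entities(sample, int_fields=("price",))
```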
Here is the API (Google Documents List API) that "allows client applications to programmatically access and manipulate user data stored with Google Documents".

Is it possible to get a data source URL of Google Spreadsheets for appengine datastore entities?

Is it possible to get a Google Spreadsheets-style data source URL for App Engine datastore entities? I want to use Google Visualization query objects to query my datastore. Or, how can I expose my datastore through a data source URL?
Also, for a Google Visualization-based project, which is better: Google Spreadsheets or the App Engine datastore? Google Spreadsheets has very good query options and works nicely with Google Visualization; one can get a DataTable directly from a data source URL. Doing the same with the App Engine datastore takes a good amount of work. Please share your experience in this area.
There's nothing built in to do this. You'll need to write your own code that returns your data in a format GViz supports.
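The format GViz expects is documented as the Visualization wire protocol: a JSON DataTable (`cols`/`rows`) wrapped in a `google.visualization.Query.setResponse(...)` callback. A minimal sketch of building that response by hand with stdlib json; the column names are placeholders for whatever your datastore entities hold:

```python
# Build a Google Visualization DataTable response by hand.
# Column names/values are placeholders for your own entity properties.
import json

def to_gviz_datatable(columns, rows):
    """columns: [(id, type, label)]; rows: list of value tuples."""
    return {
        "cols": [{"id": c[0], "type": c[1], "label": c[2]} for c in columns],
        "rows": [{"c": [{"v": v} for v in row]} for row in rows],
    }

table = to_gviz_datatable(
    [("name", "string", "Name"), ("count", "number", "Count")],
    [("alpha", 3), ("beta", 7)])

# The handler would return this string as the HTTP response body.
payload = "google.visualization.Query.setResponse(%s)" % json.dumps(
    {"status": "ok", "table": table})
```

Serving `payload` from an App Engine handler gives you a data source URL that GViz query objects can point at, much like a spreadsheet's.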
