YouTube API Resumable Uploader Session Token - WPF

I am working with the YouTube API, trying to upload a video using the resumable uploader. I would really rather not have to ask the user directly for their credentials.
I am able to use AuthSub and get a session token. The problem is I can't seem to use this with the resumable uploader. Is this possible, or is it totally separate? I see that GDataCredentials can take a client token. What is this? If I use the session token, I get back Error = {"The remote server returned an error: (401) Unauthorized."}
Here is my code:
Video newVideo;
var mResumableUploader = new ResumableUploader(10485760);
mResumableUploader.AsyncOperationCompleted += mResumableUploader_AsyncOperationCompleted;
mResumableUploader.AsyncOperationProgress += mResumableUploader_AsyncOperationProgress;
var youTubeAuthenticator = new ClientLoginAuthenticator(AppName, ServiceNames.YouTube, new GDataCredentials(YouTubeToken));
youTubeAuthenticator.DeveloperKey = DevKey;
newVideo = new Video();
newVideo.Title = "video";
newVideo.Tags.Add(new MediaCategory("Entertainment", YouTubeNameTable.CategorySchema));
newVideo.Keywords = "video";
newVideo.Description = "video";
newVideo.YouTubeEntry.Private = false;
newVideo.YouTubeEntry.MediaSource = new MediaFileSource(FilePath, "video/mp4");
var link = new AtomLink("http://uploads.gdata.youtube.com/resumable/feeds/api/users/default/uploads");
link.Rel = ResumableUploader.CreateMediaRelation;
newVideo.YouTubeEntry.Links.Add(link);
mResumableUploader.InsertAsync(youTubeAuthenticator, newVideo.YouTubeEntry, "inserter");

ClientLogin is problematic and will be deprecated soon. Please use OAuth2 and you won't have problems.
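For what it's worth, here is a rough sketch of what that flow can look like with the old GData .NET client (Google.GData.Client). The client ID, client secret, redirect URI and access code are placeholders you would supply from your own Google API console project; treat it as a sketch of the flow, not a drop-in replacement.
// Sketch: OAuth2 with the GData .NET library instead of ClientLogin/AuthSub.
var parameters = new OAuth2Parameters
{
    ClientId = "YOUR_CLIENT_ID",          // placeholder
    ClientSecret = "YOUR_CLIENT_SECRET",  // placeholder
    RedirectUri = "urn:ietf:wg:oauth:2.0:oob",
    Scope = "https://gdata.youtube.com"
};

// 1. Send the user to this URL once and let them grant access.
string authUrl = OAuthUtil.CreateOAuth2AuthorizationUrl(parameters);

// 2. Exchange the code they get back for access/refresh tokens.
parameters.AccessCode = "CODE_RETURNED_AFTER_CONSENT"; // placeholder
OAuthUtil.GetAccessToken(parameters);

// 3. Use the OAuth2 authenticator in place of ClientLoginAuthenticator.
//    (Set the developer key the same way you already do, if your version exposes it here.)
var youTubeAuthenticator = new OAuth2Authenticator(AppName, parameters);
mResumableUploader.InsertAsync(youTubeAuthenticator, newVideo.YouTubeEntry, "inserter");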

Related

Upload Image to Supabase Storage Using Storage APIs

Supabase is wonderful!! I am trying to upload an image to the public bucket using a POST request to <SUPABASE_URL>/storage/v1/object/<BUCKET_NAME>/<IMAGE_NAME>.
The difficulty is that I only have a base64-encoded image string, and I am not able to make a successful request to the above endpoint. I have tried numerous combinations of Content-Type, but no luck.
I am uploading the image from Appsmith, which provides the image in base64 format, and from there I have to hit the above endpoint.
Please help me out here.
I'm glad to find another Supabase fan like me!
I hear your pain. Could you try this technique to convert the base64 string to a Blob object?
const byteCharacters = atob(b64Data);
const byteNumbers = new Array(byteCharacters.length);
for (let i = 0; i < byteCharacters.length; i++) {
  byteNumbers[i] = byteCharacters.charCodeAt(i);
}
const byteArray = new Uint8Array(byteNumbers);
const blob = new Blob([byteArray], {type: contentType});
The blob variable at the end is the variable you can use to upload to Supabase.
Also, have you considered using the supabase-js SDK? It will make your life a lot easier, as it provides better APIs for interacting with Supabase.
You can get the supabase-js package here:
https://www.npmjs.com/package/@supabase/supabase-js
And you can find some sample code here:
https://supabase.io/docs/reference/javascript/storage-from-upload
In your case, you could do something like this to upload your file:
const { data, error } = await supabase
  .storage
  .from('avatars')
  .upload('public/sample.png', blob, {
    cacheControl: '3600',
    upsert: false
  })

New session after download - AngularJS - internal REST call

I have an ng-click implementation that does a $http.post() call to the server, which returns file content, so a file download happens in the browser. Below is the code snippet:
$scope.downloadFile = function(mediaObj) {
    var downloadReq = {
        "cid": $cId,
        "pid": $pId,
        "mid": $mId
    };
    $http.post(<server url>, downloadReq)
        .then(function(response) {
            var downloadUrl = URL.createObjectURL(new Blob([response.data]));
            var a = document.createElement('a');
            a.href = downloadUrl;
            a.target = '_blank';
            a.download = response.headers('Content-Disposition').split(';')[1].trim().split('=')[1];
            document.body.appendChild(a);
            a.click();
            a.remove();
        }, function(response) {
            // unable to download
            $scope.downloadErr = true;
        });
}
The server-side code snippet is like this:
public void handleDownloadRequest(String json, HttpServletRequest request, HttpServletResponse httpServletResponse) {
    ....
    // make rest call to another microservice to get file content
    IOUtils.copyLarge(new ByteArrayInputStream((byte[]) response.getBody()), httpServletResponse.getOutputStream());
    // copy all headers from rest call response to httpServletResponse
    httpServletResponse.flushBuffer();
}
After this, the next call to the server (it need not be a download, any other server call) gets a brand-new session; the old session has been destroyed.
Because of this, the server-side client session state is lost and work on the client gets messed up.
Can anyone help me understand why a new session is created after the download operation? How can I avoid this? Why is the old session being destroyed?
Thanks in advance.
I am answering my own question; maybe it will save someone's time. As part of the handleDownloadRequest() method, I was making a REST call to another microservice to get the file data. The HTTP response of that REST call had a new session ID, which was also getting copied into the httpServletResponse of the handleDownloadRequest() method.
This was propagated to the client, and in turn the client session state was lost.
SOLUTION: I removed the session cookie header from the response while copying over the headers.
Take care of the HTTP response while making internal REST calls...
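For illustration, the header-copying step could look something like this. It is a minimal sketch that assumes the internal call returns a Spring ResponseEntity (the question does not say which client is used); the only point is to skip Set-Cookie:
import java.util.List;
import java.util.Map;

import javax.servlet.http.HttpServletResponse;

import org.springframework.http.HttpHeaders;
import org.springframework.http.ResponseEntity;

public class HeaderCopyUtil {

    // Copy headers from the internal REST call's response, but drop the
    // downstream service's session cookie so it never reaches the browser.
    public static void copyHeaders(ResponseEntity<byte[]> response, HttpServletResponse httpServletResponse) {
        for (Map.Entry<String, List<String>> header : response.getHeaders().entrySet()) {
            if (HttpHeaders.SET_COOKIE.equalsIgnoreCase(header.getKey())) {
                continue; // this is the header that was replacing the browser's own session
            }
            for (String value : header.getValue()) {
                httpServletResponse.addHeader(header.getKey(), value);
            }
        }
    }
}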

Solr 4.3.1 Data-Import command

I'm currently using Solr 4.3.1. I have configured the DataImportHandler (DIH) for my Solr. I would like to do a full import from the command prompt. I know the URL will be something like http://localhost:8983/solr/corename/dataimport?command=full-import&clean=true&commit=true. Is there any way I can do this without using curl?
Thanks
Edit
string Text = "http://localhost:8983/solr/Latest_TextBox/dataimport?command=full-import&clean=true&commit=true";
var wc = new WebClient();
var Import = wc.DownloadString(Text);
I am currently using the above code.
Call it like a normal REST URL, that's it! I am using it in my application for importing and indexing data from my local drive and it works fine. :) Make an HTTP request (HttpURLConnection in Java, or HttpClient in C#) and capture the response to see whether it was successful or not; you don't need any specific API to do that. Here is sample code that makes a GET request correctly in C#. Try your data import handler URL with it, and it may work!
Console.WriteLine("Making API Call...");
using (var client = new HttpClient(new HttpClientHandler { AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate }))
{
client.BaseAddress = new Uri("https://api.stackexchange.com/2.2/");
HttpResponseMessage response = client.GetAsync("answers?order=desc&sort=activity&site=stackoverflow").Result;
response.EnsureSuccessStatusCode();
string result = response.Content.ReadAsStringAsync().Result;
Console.WriteLine("Result: " + result);
}
Console.ReadLine();
}
}
}
You'll have to call the URL in some way - Solr only operates through a REST API. There is no command-line API (the command-line tools available just talk to the API). So use your preferred way to talk to an HTTP endpoint, be that curl, wget, GET or whatever is available for your programming language of choice.
The bundled Solr CLI application does not have any existing command for triggering a full-import as far as I was able to see (it would just talk to the REST API by calling the URL you've already referenced).
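One thing to keep in mind: DIH's full-import runs asynchronously, so a call that only fires the URL (like the WebClient snippet above) returns before indexing has finished. A rough sketch of triggering the import and then polling command=status, assuming the core name and port from the question's URL:
using System;
using System.Net.Http;
using System.Threading;

class DihFullImport
{
    static void Main()
    {
        // Core name and port are assumed from the question's URL.
        var baseUrl = "http://localhost:8983/solr/corename/dataimport";

        using (var client = new HttpClient())
        {
            // Kick off the import (clean + commit, as in the question).
            var start = client.GetAsync(baseUrl + "?command=full-import&clean=true&commit=true").Result;
            start.EnsureSuccessStatusCode();

            // Poll the handler until it stops reporting "busy"; the status
            // response is XML whose status field is "busy" or "idle".
            string status;
            do
            {
                Thread.Sleep(2000);
                status = client.GetStringAsync(baseUrl + "?command=status").Result;
            } while (status.Contains("busy"));

            Console.WriteLine(status);
        }
    }
}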

How to access another user's Google Calendar using the Google PHP API

I need to access another user's Google Calendar free/busy information using the Google PHP API. Can anyone help me with a step-by-step process to implement that? I have already checked https://developers.google.com/google-apps/calendar/v3/reference/freebusy/query#response but got nothing fruitful from it.
My Approach
session_start();
require_once ('libraries/Google/autoload.php');
$client = new Google_Client();
$client->setClientId($client_id);
$client->setClientSecret($client_secret);
$client->setRedirectUri($redirect_uri);
$client->addScope("email");
$client->addScope("profile");
$client->addScope(Google_Service_Calendar::CALENDAR);
$client->addScope(Google_Service_Calendar::CALENDAR_READONLY);
$client->setAccessType('offline');
$service = new Google_Service_Oauth2($client);
if (isset($_GET['code'])) {
    $client->authenticate($_GET['code']);
    $_SESSION['access_token'] = $client->getAccessToken();
    header('Location: ' . filter_var($redirect_uri, FILTER_SANITIZE_URL));
    exit;
}
/************************************************
If we have an access token, we can make
requests, else we generate an authentication URL.
************************************************/
if (isset($_SESSION['access_token']) && $_SESSION['access_token']) {
    $client->setAccessToken($_SESSION['access_token']);
} else {
    $authUrl = $client->createAuthUrl();
}
$client->setApplicationName("My Calendar"); // DON'T THINK THIS MATTERS
$client->setDeveloperKey($api_key); // GET AT DEVELOPERS.GOOGLE.COM
$cal = new Google_Service_Calendar($client);
$calendarId = 'any_other_user_email@gmail.com';
$freebusy_req = new Google_Service_Calendar_FreeBusyRequest();
$freebusy_req->setTimeMin($minTime);
$freebusy_req->setTimeMax($maxTime);
$freebusy_req->setTimeZone($time_zone);
$freebusy_req->setCalendarExpansionMax(10);
$freebusy_req->setGroupExpansionMax(10);
$item = new Google_Service_Calendar_FreeBusyRequestItem();
$item->setId($calendarId);
$freebusy_req->setItems(array($item));
$query = $cal->freebusy->query($freebusy_req);
$response_calendar = $query->getCalendars();
$busy_obj = $response_calendar[$calendarId]->getBusy();
But I am getting an empty result for free/busy.
Just make sure the calendar is public, then try the Google Calendar freebusy method. It will definitely work.
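To see why the result comes back blank, it can also help to inspect the per-calendar errors that the freebusy response carries when a calendar is not visible to you. A rough sketch reusing $cal and $freebusy_req from the question (the getErrors()/getBusy() accessors are the ones the PHP client's FreeBusyCalendar class exposes, as far as I recall):
$query = $cal->freebusy->query($freebusy_req);
foreach ($query->getCalendars() as $id => $entry) {
    $errors = $entry->getErrors();
    if (!empty($errors)) {
        foreach ($errors as $err) {
            // "notFound" usually means the calendar is not public/shared with you.
            echo $id . ': ' . $err->getReason() . "\n";
        }
    } else {
        foreach ($entry->getBusy() as $slot) {
            echo $id . ' busy from ' . $slot->getStart() . ' to ' . $slot->getEnd() . "\n";
        }
    }
}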

Bing Maps Route Service CalculateRoute returns "An error occurred while processing the request."

I've written a Silverlight class to consume the Bing Maps Routing Service. I'm creating an array of Waypoint objects from lat/long data that I have stored in a database, and sending that to the CalculateRoute method of the webservice in order to get a route back, but I am unable to successfully get a route back. The response always contains the error "An error occurred while processing the request." I'm stumped. Any ideas about how I could solve this or at least get a more helpful error/exception out of the service? Here's the method that calls the service:
public void CalculateRoute(Waypoint[] waypoints)
{
    request = new RouteRequest();
    request.Waypoints = new ObservableCollection<Waypoint>();
    for (int idx = 0; idx < waypoints.Length; idx++)
    {
        request.Waypoints.Add(waypoints[idx] as Waypoint);
    }
    request.ExecutionOptions = new ExecutionOptions();
    request.ExecutionOptions.SuppressFaults = true;
    request.Options = new RouteOptions();
    request.Options.Optimization = RouteOptimization.MinimizeTime;
    request.Options.RoutePathType = RoutePathType.Points;
    request.Options.Mode = TravelMode.Walking;
    request.Options.TrafficUsage = TrafficUsage.TrafficBasedRouteAndTime;
    _map.CredentialsProvider.GetCredentials(
        (Credentials credentials) =>
        {
            request.Credentials = credentials;
            RouteClient.CalculateRouteAsync(request);
        });
}
I then have a callback that handles the response, but I have been unable to get a successful response. I've tried making sure the maxBufferSize and maxReceivedMessageSize are set correctly and that timeouts are set correctly, but to no avail. Any help would be much appreciated.
It appears that this line:
request.Options.TrafficUsage = TrafficUsage.TrafficBasedRouteAndTime;
was the culprit. Apparently if you've got that option set and request a route for somewhere that doesn't have traffic data, it dies rather than just ignoring it.
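For reference, here is a sketch of the same options block with the traffic setting simply left out; whether a safe value such as TrafficUsage.None exists depends on your generated service proxy, so omitting the assignment is the conservative fix:
request.Options = new RouteOptions();
request.Options.Optimization = RouteOptimization.MinimizeTime;
request.Options.RoutePathType = RoutePathType.Points;
request.Options.Mode = TravelMode.Walking;
// TrafficUsage is intentionally not set; asking for traffic-based routing in an
// area without traffic data is what produced the generic processing error.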
