Get the audio file using ibm-watson-speech-to-text

I am using IBM Watson speech-to-text and I am getting the text output for the audio.
My question is: is it also possible to get the audio file from it?
Here is my code:
var recognizeMic = require('watson-speech/speech-to-text/recognize-microphone');
var stream = recognizeMic({
  access_token: this.state.watsonAccessToken,
  outputElement: '#output',
});
stream.on('data', (data: any) => {
  this.setState({ recognizedText: data.results[0].alternatives[0].transcript });
});
Can anyone help me with how to get the audio itself?
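One hedged approach (not from the original post): the watson-speech recognize-microphone helper appears to handle the microphone internally and only returns transcripts, so a copy of the audio could be kept by opening the microphone yourself with the standard getUserMedia and MediaRecorder APIs and recording in parallel with the Watson stream. All names below are illustrative, not part of the original code.

// Hedged sketch: record microphone audio in parallel with the Watson transcription.
// Assumes a browser that supports MediaRecorder; the chunk handling is illustrative.
navigator.mediaDevices.getUserMedia({ audio: true }).then((micStream) => {
  const recorder = new MediaRecorder(micStream);
  const chunks: Blob[] = [];
  recorder.ondataavailable = (event) => chunks.push(event.data);
  recorder.onstop = () => {
    // The recorded audio as a single Blob; it can be downloaded or uploaded from here.
    const audioBlob = new Blob(chunks, { type: recorder.mimeType });
    console.log('Recorded audio size:', audioBlob.size);
  };
  recorder.start();
  // Call recorder.stop() when the Watson stream is stopped.
});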

Related

How to get WebRTC video-to-video streaming to decode video?

I have a working WebRTC connection between two people using PeerJS that captures a stream from a user video element and sends it to the other person. The video capture is as follows (working with React in Typescript):
useEffect(() => {
  const srcVideo = document.getElementById('normalVideo');
  const sinkVideo = document.getElementById('sinkVideo');
  if (srcVideo === null) return;
  const setSinkSrc = () => {
    let stream;
    const fps = 0; // Setting to zero means frames are captured only when requestFrame() is called.
    // @ts-ignore
    stream = srcVideo.captureStream(fps);
    // @ts-ignore
    sinkVideo.srcObject = stream;
    setStreamOut(stream);
  };
  srcVideo.addEventListener('loadedmetadata', setSinkSrc);
  return () => {
    srcVideo.removeEventListener('loadedmetadata', setSinkSrc);
  };
}, [vidSrc]);
and sends it out with PeerJS:
mySocket.on('user-connected', (theirPeerID: string, theirSocketID: string) => {
  if (streamOut) {
    const outConnection: Peer.MediaConnection = myPeer.call(theirPeerID, streamOut);
    // This is weird.
    // navigator.mediaDevices.getUserMedia({audio: true, video: true}).then(streamOut => myPeer.call(theirPeerID, streamOut));
  }
});
And the user connects with the stream:
useEffect(() => {
  myPeer.on('call', (hostMediaConnection: Peer.MediaConnection) => {
    const sinkVideo = document.getElementById('sinkVideo');
    hostMediaConnection.answer();
    hostMediaConnection.on('stream', (hostStream: MediaStream) => {
      console.log(hostStream.getAudioTracks().length);
      console.log(hostStream.getVideoTracks().length);
      if (sinkVideo === null || sinkVideo === undefined) {
        console.log("Sink video either undefined or null");
        return;
      }
      // Do not use URL.createObjectURL();
      // @ts-ignore
      sinkVideo.srcObject = hostStream;
      // @ts-ignore
      sinkVideo.addEventListener('loadedmetadata', () => {
        // @ts-ignore
        sinkVideo.play();
      });
    });
    hostMediaConnection.on("error", (err) => {
      console.log(err.type, "%cMediaConnectionError", "color:green;", err);
    });
  });
}, []);
On a local stream, the video works just fine, where the smaller element's data comes from the captured media stream: sinkVideo.srcObject = stream;
However, when this runs, there is just a black screen on the consumer side. Audio is streamed and can be heard, but no video is ever shown. Going into chrome://webrtc-internals, I correctly see two RTC connections: outbound data from the source and inbound data for the sink. Audio is being transmitted, video is being transmitted, video frames are being decoded, and yet nothing appears. The screen is black for the consumer, where it should not be.
So, my question is: where am I going wrong? Video is apparently being sent out, PeerJS makes a successful connection from the source to the sink, and the sink is decoding video. Audio has absolutely no problem at any point getting sent out or being received, decoded, and heard by the sink.
I have a comment above a line of commented-out code.
// This is weird.
// navigator.mediaDevices.getUserMedia({audio: true, video: true}).then(streamOut => myPeer.call(theirPeerID, streamOut));
Because when I send out my audio and video as the media, the stream is processed just fine, and the sink gets the video. This is one of the reasons I know a successful WebRTC connection is being made: when I stream the user's webcam, data is sent.
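A hedged diagnostic sketch (not part of the original post): as the comment in the capture code notes, captureStream(0) only produces frames when requestFrame() is called, so one way to narrow this down is to inspect the outgoing video track right before myPeer.call(). Here streamOut is the variable from the question; everything else is illustrative.

// Hedged diagnostic: check the state of the outgoing video track before calling the peer.
const [videoTrack] = streamOut.getVideoTracks();
console.log(
  'video track readyState:', videoTrack?.readyState, // should be "live"
  'muted:', videoTrack?.muted,                        // a muted track renders as black on the far side
  'enabled:', videoTrack?.enabled
);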

upload image to S3 presigned url using react-native-image-picker and axios

I am trying to get a presigned URL image upload working correctly. Currently the upload succeeds when selecting an image from the iOS simulator; however, when I actually try to view the file, it seems the file is corrupted and will not open as an image. I suspect it has something to do with my FormData, but I am not sure.
export async function receiptUpload(file) {
  const date = new Date();
  const headers = await getAWSHeaders();

  const presignUrl = await request.post(
    urls.fileUpload.presignUpload,
    { file_name: `${date.getTime()}.jpg` },
    { headers }
  )
    .then(res => res.data);

  const formData = new FormData();
  formData.append('file', {
    name: `${date.getTime()}.jpg`,
    uri: file.uri,
    type: file.type
  });

  const fileUpload = presignUrl.presignUrl && await request.put(
    presignUrl.presignUrl,
    formData
  )
    .then(res => res.status === 200);
}
Following other suggested fixes, I have tried changing the file uri like so:
Platform.OS === 'android' ? file.uri : file.uri.replace('file://', '');
however this does not seem to work either.
I did this just recently in my current project, and the following code is a working example for my use case. I didn't need to convert to a blob either; however, I am uploading to AWS S3, so if you are uploading elsewhere that may be the issue.
export const uploadMedia = async (fileData, s3Data, setUploadProgress = () => {}) => {
  let sendData = { ...fileData };
  sendData.data.type = sendData.type;

  let formData = new FormData();
  formData.append('key', s3Data.s3Key);
  formData.append('Content-Type', fileData.type);
  formData.append('AWSAccessKeyId', s3Data.awsAccessKey);
  formData.append('acl', 'public-read');
  formData.append('policy', s3Data.s3Policy);
  formData.append('signature', s3Data.s3Signature);
  formData.append('file', sendData.data);

  return axios({
    method: 'POST',
    url: `https://${s3Data.s3Bucket}.s3.amazonaws.com/`,
    data: formData,
    onUploadProgress: progressEvent => {
      let percentCompleted = Math.floor((progressEvent.loaded * 100) / progressEvent.total);
      setUploadProgress(percentCompleted);
    }
  });
};
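A hedged usage sketch for the helper above: the fileData and s3Data shapes are inferred from how the function reads them, and every value (bucket, key, policy, signature, local URI) is a placeholder, not from the original answer.

// Hypothetical call site: field names mirror what uploadMedia reads; all values are placeholders.
const fileData = {
  type: 'image/jpeg',
  data: { uri: localFileUri, name: 'receipt.jpg', type: 'image/jpeg' },
};
const s3Data = {
  s3Bucket: 'my-bucket',
  s3Key: 'uploads/receipt.jpg',
  awsAccessKey: 'AKIA...',
  s3Policy: policyFromServer,
  s3Signature: signatureFromServer,
};
uploadMedia(fileData, s3Data, (percent) => console.log(`upload ${percent}%`))
  .then((res) => console.log('S3 responded with status', res.status));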
I would first check to see where the issue is occurring. After uploading, can you view the file on whatever storage service you are uploading to? If so, it's something on the React Native side. If it never gets uploaded to that location, you know it's an error in your upload process. That might help you track down the exact location of the error.
I had to do this recently for a project. I believe the data is a base64 string when coming directly from the file input, so the issue is that you are uploading a base64 string, not the image, by simply passing the data field. I had to process it before uploading to the signed URL with the following method.
private dataUriToBlob(dataUri) {
  const binary = atob(dataUri.split(',')[1]);
  const array = [];
  for (let i = 0; i < binary.length; i++) {
    array.push(binary.charCodeAt(i));
  }
  return new Blob([new Uint8Array(array)], { type: 'image/jpeg' });
}
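A hedged sketch of how the converted blob might then be sent to the presigned URL; the presignUrl and dataUri names are placeholders, not part of the answer above.

// Hypothetical upload of the converted blob with a plain PUT to the presigned URL.
const blob = this.dataUriToBlob(dataUri);
await fetch(presignUrl, {
  method: 'PUT',
  headers: { 'Content-Type': 'image/jpeg' },
  body: blob,
});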
This answer fixed it for me: How can I upload image directly on Amazon S3 in React Native?
I had tried uploading with axios and fetch with FormData. The upload went through, but the image file was not readable, even when downloaded to my Mac from the S3 console:
The file "yourfile.jpg" could not be opened. It may be damaged or use a file format that Preview doesn’t recognize.
Only after uploading with XHR and setting the correct Content-Type header did it work. Your signedUrl should be correct as well, which seems to be the case if the upload goes through.
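The answer above does not include its XHR code, so here is a hedged reconstruction of the pattern the linked answer describes; signedUrl, file.uri, and the image type are assumptions, and the { uri, type, name } body relies on React Native's networking layer treating such an object as a file.

// Hedged reconstruction of an XHR PUT to a presigned URL with an explicit Content-Type.
const xhr = new XMLHttpRequest();
xhr.open('PUT', signedUrl);
xhr.setRequestHeader('Content-Type', 'image/jpeg');
xhr.onload = () => console.log('upload finished with status', xhr.status);
xhr.onerror = () => console.log('upload failed');
// React Native accepts a { uri, type, name } object as the request body for local files.
xhr.send({ uri: file.uri, type: 'image/jpeg', name: 'upload.jpg' });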

Upload to Firebase Storage executes but image comes back as a black square. Tried everything

Hey, I've been building out a full-stack Tinder-style app using React Native + Firebase auth/storage/realtime DB.
Everything has been going great so far, but I ran into an issue a few days ago and I don't know what's wrong.
I get back the correct uri of the image and pass it as a parameter to my uploadImage function, which converts it to a blob. It uploads a file to Firebase Storage, but it's not my image. This is what gets uploaded:
Image that is getting uploaded.
Weird things going on in the file description of my 'image'
The first thing I notice is that when I upload the image and look at the description of the supposed image, the size is 600,000 bytes, which is strange because when I upload pictures manually through the Firebase Storage console they are a few megabytes.
The second thing is the image preview is not working.
editAvi = async () => {
  console.log('wtf');
  await Permissions.askAsync(Permissions.CAMERA_ROLL);
  const { cancelled, uri } = await ImagePicker.launchImageLibraryAsync({
    allowsEditing: true,
  });
  if (!cancelled) {
    this.setState({ image: uri });
  }
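  // Note: setState is asynchronous, so the log on the next line may still print the previous value.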
  console.log('The image is ' + this.state.image);
};
uploadImage = async (uri, imageName) => {
  // Create file metadata including the content type
  var metadata = {
    contentType: 'image/jpeg',
  };
  // Points to the root reference
  var storageRef = firebase.storage().ref();
  // Points to 'images'
  const response = await fetch(uri);
  const blob = await response.blob();
  var ref = storageRef.child('images/' + this.state.currentID);
  ref.put(uri, metadata);
  console.log('This is the blob: ' + blob);
}
I've been researching this extensively for two days and have asked about it multiple times in a web development discord I'm in and I still can't fix it.
Please help me fix this! This is one of the last things I need to get this app done. :)
Found this question when I was also searching for an answer. I was able to solve this by following the recommendation from a GitHub issue: https://github.com/expo/expo/issues/2402#issuecomment-443726662
The main idea is to replace
const response = await fetch(uri);
const blob = await response.blob();
var ref = storageRef.child('images/' + this.state.currentID);
ref.put(uri, metadata);
with
const blob = await new Promise((resolve, reject) => {
  const xhr = new XMLHttpRequest();
  xhr.onload = () => {
    resolve(xhr.response);
  };
  xhr.onerror = (e) => {
    reject(new TypeError("Network request failed"));
  };
  xhr.responseType = "blob";
  xhr.open("GET", uri, true);
  xhr.send(null);
});
var ref = storageRef.child('images/' + this.state.currentID);
ref.put(blob, metadata);
fetch is known to have a problem in React Native when using Expo.
Hope this solves it.
Found this question when I was searching around for a solution to the exact same issue. I fixed it after about 12 hours of trial and error by adding 'application/octet-stream;BASE64' as the type when creating the blob (using rn-fetch-blob).
Blob.build(data, { type: 'application/octet-stream;BASE64' });
Not sure if that's the method you're using to create the blob, but if so, using that as the type fixed the issue for me.
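A hedged sketch of how that blob might then be pushed to Firebase Storage, assuming rn-fetch-blob's Blob polyfill and a base64 string from the image picker; the reference path, variable names, and metadata are illustrative, not the answerer's code, and the library's docs describe additional polyfill wiring that may also be required.

// Hedged sketch: build a Blob from base64 data with rn-fetch-blob's polyfill, then upload it.
const RNFetchBlob = require('rn-fetch-blob').default;
const Blob = RNFetchBlob.polyfill.Blob;

Blob.build(base64Data, { type: 'application/octet-stream;BASE64' })
  .then((blob) => firebase.storage().ref().child('images/avatar.jpg').put(blob, { contentType: 'image/jpeg' }))
  .then(() => console.log('upload finished'));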

How to upload audio file to Firebase Storage?

I'm trying to upload an audio file to Firebase Storage in my Ionic2 project.
First I recorded an audio file using the Media plugin (a Cordova plugin), and this file plays well, both from Android storage and via the Media plugin method (this.media.play();).
Second, I need to push the recorded file to Firebase Storage.
This is my code:
let storageRef = firebase.storage().ref();
let metadata = {
  contentType: 'audio/mp3',
};
let filePath = `${this.file.externalDataDirectory}` + `${this.fileName}`;
const voiceRef = storageRef.child(`voices/${this.fileName}`);
var blob = new Blob([filePath], { type: 'audio/mp3' });
voiceRef.put(blob);
After reading the Firebase docs, I understood I can push a blob to Firebase.
The file is successfully pushed to Firebase Storage, but with empty data (95 bytes).
This is a screenshot:
The problem isn't a Firebase issue
My problem was solved by using the File Cordova plugin method readAsDataURL() together with the putString(fileBase64, firebase.storage.StringFormat.DATA_URL) method.
First, I create a file reference:
let filePath = "this.file.externalDataDirectory" + "this.fileName";
Then I transform the file into a base64 string using the readAsDataURL method, which returns a promise containing the file as a base64 string. Then I push the file to Firebase using the putString method, which takes two parameters: the file returned by readAsDataURL and firebase.storage.StringFormat.DATA_URL.
My Final code:
let storageRef = firebase.storage().ref();
let metadata = {
  contentType: 'audio/mp3',
};
let filePath = `${this.file.externalDataDirectory}` + `${this.fileName}`;
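// Note: the resolve and reject calls below are assumed to come from an enclosing Promise
// executor that wraps this upload; that wrapper is not shown in the original answer.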
this.file.readAsDataURL(this.file.externalDataDirectory, this.fileName).then((file) => {
  let voiceRef = storageRef.child(`voices/${this.fileName}`).putString(file, firebase.storage.StringFormat.DATA_URL);
  voiceRef.on(firebase.storage.TaskEvent.STATE_CHANGED, (snapshot) => {
    console.log("uploading");
  }, (e) => {
    reject(e);
    console.log(JSON.stringify(e, null, 2));
  }, () => {
    var downloadURL = voiceRef.snapshot.downloadURL;
    resolve(downloadURL);
  });
});
That's working fine for me.
Thanks.

React-native-android - How to save an image to the Android file system and view in the phone's 'Gallery'

Is it possible to save an image to Android's local file system so it can be viewed from the phone's 'Gallery' and in a folder?
I found the react-native-fs library, but after studying the documentation and working through an example I am still unsure whether it is possible.
Thanks
For anyone having the same problem, here is the solution.
Solution
I am using the File System API from the react-native-fetch-blob library, because I thought it was way better documented and easier to understand than the 'react-native-fs' library.
I request an image from the server, receive a base64 string, and then save it to the Pictures directory in the Android file system.
I save the image like this:
var RNFetchBlob = require('react-native-fetch-blob').default;
const PictureDir = RNFetchBlob.fs.dirs.PictureDir;

getImageAttachment: function(uri_attachment, filename_attachment, mimetype_attachment) {
  return new Promise((RESOLVE, REJECT) => {
    // Fetch attachment
    RNFetchBlob.fetch('GET', config.apiRoot + '/app/' + uri_attachment)
      .then((response) => {
        let base64Str = response.data;
        let imageLocation = PictureDir + '/' + filename_attachment;

        // Save image
        RNFetchBlob.fs.writeFile(imageLocation, base64Str, 'base64');
        console.log("FILE CREATED!!");

        RNFetchBlob.fs.scanFile([{ path: imageLocation, mime: mimetype_attachment }])
          .then(() => {
            console.log("scan file success");
          })
          .catch((err) => {
            console.log("scan file error");
          });
      }).catch((error) => {
        // error handling
        console.log("Error:", error);
      });
},
The following code, which is in the method above, refreshes the Gallery; otherwise the images would not display until the phone is turned off and back on again.
RNFetchBlob.fs.scanFile([{ path: imageLocation, mime: mimetype_attachment }])
  .then(() => {
    console.log("scan file success");
  })
  .catch((err) => {
    console.log("scan file error");
  });
Enjoy!
You can absolutely do this with react-native-fs. There's a PicturesDirectoryPath constant which isn't mentioned in the README for the project; if you save a file into there, it should appear in the Gallery app. If you want it to appear in your own album, just make a new directory in that folder and save the file into there, e.g.
const myAlbumPath = RNFS.PicturesDirectoryPath + '/My Album';
RNFS.mkdir(myAlbumPath)
  .then(/* write/copy/download your image file into myAlbumPath here */);
I don't have full example code anymore, sorry, because I ended up storing images in my app's private cache directory instead. Hope this helps anyway!
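Since the answer above no longer has full example code, here is a hedged reconstruction of that approach with react-native-fs; the source path, album name, and the use of copyFile plus scanFile are assumptions, not the original author's code.

// Hedged sketch: copy an existing image file into a custom album under the Pictures directory,
// then ask Android's media scanner to index it so it shows up in the Gallery.
import RNFS from 'react-native-fs';

async function saveToGallery(sourcePath) {
  const myAlbumPath = RNFS.PicturesDirectoryPath + '/My Album'; // hypothetical album name
  await RNFS.mkdir(myAlbumPath);

  const destPath = myAlbumPath + '/' + sourcePath.split('/').pop();
  await RNFS.copyFile(sourcePath, destPath);

  // scanFile is Android-only in react-native-fs; it makes the Gallery pick up the new file.
  await RNFS.scanFile(destPath);
  return destPath;
}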
