I'm working on a project where one of the requirements is to support foldable devices, and unfortunately I can't find much content on the internet about developing for them in React Native.
Is there a library or API in React Native for detecting whether the device the user is running the application on is a foldable device?
I'm very new to the subject and would like to know if anyone has come up with strategies to solve this "problem". For example, when the foldable device is open the app shows the text "Open", and when it is closed it shows "Closed".
const { width } = Dimensions.get("window")
const [bannerWidth, setBannerWidth] = useState<number>(width)

useEffect(() => {
  // Fires whenever the window dimensions change (rotation, split-screen, fold/unfold)
  const subscription = Dimensions.addEventListener("change", status => {
    console.log(status.window.width)
    setBannerWidth(status.window.width)
  })
  return () => {
    subscription.remove()
  }
}, [])

console.log(bannerWidth)
This listens for window dimension changes, which also fire when a foldable device is opened or closed, and gives you the new width to react to.
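Building on that, here is a minimal sketch of the "Open"/"Closed" idea. There is no dedicated fold API in this approach, only the window width, so the FoldStatus component and the 600-point cutoff below are purely illustrative assumptions you would need to tune per device:

import React, { useEffect, useState } from "react"
import { Dimensions, Text } from "react-native"

// Hypothetical cutoff: treat window widths above this as the unfolded state.
const UNFOLDED_WIDTH = 600

const FoldStatus = () => {
  const [width, setWidth] = useState(Dimensions.get("window").width)

  useEffect(() => {
    // Same "change" subscription as above, reduced to the width we care about
    const subscription = Dimensions.addEventListener("change", ({ window }) => {
      setWidth(window.width)
    })
    return () => subscription.remove()
  }, [])

  return <Text>{width >= UNFOLDED_WIDTH ? "Open" : "Closed"}</Text>
}

export default FoldStatus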
I have a web page with the following event handler:
const onDeleteBankData = async (bankData) => {
  if (!window.confirm('Are you sure you wish to delete this item?')) {
    return;
  }
}
I tested it in Chrome and on an Android device and it works. However, on iPhone it does not work. Is this expected behavior? Is it a browser thing, or something I need to set up in React?
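For illustration only, if window.confirm turns out to be unreliable in one of your target browsers, an alternative is to render the confirmation in React itself. Everything below (BankDataRow, deleteBankData, the markup) is a hypothetical sketch, not your existing code:

import React, { useState } from 'react';

// Hypothetical row component: the delete only happens after an explicit in-page confirmation.
const BankDataRow = ({ bankData, deleteBankData }) => {
  const [pendingDelete, setPendingDelete] = useState(false);

  const onConfirm = async () => {
    await deleteBankData(bankData);
    setPendingDelete(false);
  };

  return (
    <div>
      <button onClick={() => setPendingDelete(true)}>Delete</button>
      {pendingDelete && (
        <span>
          Are you sure you wish to delete this item?
          <button onClick={onConfirm}>Yes</button>
          <button onClick={() => setPendingDelete(false)}>No</button>
        </span>
      )}
    </div>
  );
};

export default BankDataRow;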
I am using GetStream's react-native-activity-feed library in my project: https://github.com/GetStream/react-native-activity-feed
I see there is functionality in the library for following a particular user, but the opposite is not available, i.e. how to unfollow a user.
Please point me in the right direction on how to achieve this in React Native.
Our library doesn't include the logic for following/unfollowing, but it has a button that lets you set it up yourself: the FollowButton you're talking about. You could do something like this:
<FollowButton
  followed={async () => {
    // check if you're following the user
    const following = await your_timeline_feed.following({ filter: ['user:user_42'] })
    if (following) {
      return true;
    }
    return false;
  }} // renders the button as "following"
  clicked={async () => {
    // your logic for following/unfollowing
    await your_timeline_feed.follow('user', 'user_42');
  }}
/>
Read more here:
https://getstream.github.io/react-native-activity-feed/#!/FollowButton
https://getstream.io/docs/following/?language=js
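For the unfollow side the question asks about, the Stream JavaScript client also exposes feed.unfollow alongside feed.follow, so the clicked handler can toggle between the two. A rough sketch building on the snippet above, where your_timeline_feed and user_42 stay placeholders and the results-array check is an assumption about the shape of the following() response:

<FollowButton
  followed={async () => {
    // same check as above: are we already following user_42?
    const response = await your_timeline_feed.following({ filter: ['user:user_42'] });
    return response.results.length > 0;
  }}
  clicked={async () => {
    // toggle: unfollow if we already follow, otherwise follow
    const response = await your_timeline_feed.following({ filter: ['user:user_42'] });
    if (response.results.length > 0) {
      await your_timeline_feed.unfollow('user', 'user_42');
    } else {
      await your_timeline_feed.follow('user', 'user_42');
    }
  }}
/>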
My React Native app receives data (products) from an API as an array of objects, and each object has a link to a picture. There is an option to download all the products for offline viewing (I'm using redux-persist + Realm for that), but the problem is that the pictures themselves are not downloaded, only the links to them.
What would be the best way for me to download the pictures so that I can attach them to the corresponding products?
There are multiple ways to do that. A manual approach is to download all the images as base64, using the addresses that come from the API response. You can use plain JavaScript for the download; here is an example of fetching an image as base64:
const imageLink = 'https://www.google.com/images/branding/googlelogo/2x/googlelogo_color_272x92dp.png';

fetch(imageLink)
  .then(res => res.blob())
  .then(data => {
    const reader = new FileReader();
    reader.readAsDataURL(data);
    reader.onloadend = () => {
      const base64data = reader.result;
      console.log(base64data); // you can store it in your realm
    };
  });
After downloading each image, store it in your Realm local database and read it back from Realm when you need it.
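As an illustration of that step, here is a rough sketch of persisting the downloaded data in Realm. The Product schema and its imageBase64 field are hypothetical, not part of your existing model:

const ProductSchema = {
  name: 'Product',
  primaryKey: 'id',
  properties: {
    id: 'string',
    name: 'string',
    imageBase64: 'string', // the data URL produced by the FileReader above
  },
};

// Save the base64 string alongside the product once the download finishes.
const saveProductImage = (realm, product, base64data) => {
  realm.write(() => {
    // third argument: update the existing object with this id, if any
    realm.create('Product', { ...product, imageBase64: base64data }, true);
  });
};

When rendering offline, the stored data URL can be passed straight to an Image source, e.g. source={{ uri: product.imageBase64 }}.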
There are other ways, such as using image-caching libraries like react-native-cached-image.
I want to upload an image to Azure Blob Storage using React.
I've tried a lot of examples and none of them work.
The one that seemed the best was this one, but I still didn't manage to get it working in React.
What I'm trying right now is to use the createContainerIfNotExists method just to test, and the error is "Cannot read property createBlobServiceWithSas of undefined".
My code is the following:
import AzureStorage from 'azure-storage';

const account = {
  name: 'x',
  sas: 'x',
};

const blobUri = `https://${account.name}.blob.core.windows.net`;
const blobService = AzureStorage.Blob.createBlobServiceWithSas(blobUri, account.sas);

export const createContainer = () => {
  blobService.createContainerIfNotExists('test', (error, container) => {
    if (error) {
      // Handle create container error
    } else {
      console.log(container.name);
    }
  });
};

export default createContainer;
According to my research, because you are developing a React application, you cannot use the createBlockBlobFromBrowserFile method; it can only be used directly in the browser. For more details, please refer to the documentation.
Given that, I suggest you use another method (such as uploadStreamToBlockBlob) to upload the image with the V10 SDK. For more details, please refer to https://learn.microsoft.com/en-us/javascript/api/#azure/storage-blob/?view=azure-node-latest
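As a rough sketch of what the V10 route could look like from a React app running in the browser, using the @azure/storage-blob package's uploadBrowserDataToBlockBlob helper (uploadStreamToBlockBlob is aimed at Node streams); the account name, SAS token, and container name below are placeholders:

import {
  Aborter,
  AnonymousCredential,
  BlockBlobURL,
  ContainerURL,
  ServiceURL,
  StorageURL,
  uploadBrowserDataToBlockBlob,
} from '@azure/storage-blob'; // V10 SDK

// Placeholders: substitute your own account name and SAS token.
const accountName = 'x';
const sas = '?sv=...';

const pipeline = StorageURL.newPipeline(new AnonymousCredential());
const serviceURL = new ServiceURL(`https://${accountName}.blob.core.windows.net${sas}`, pipeline);

// Upload a File/Blob picked in the browser, e.g. from an <input type="file" /> onChange handler.
export const uploadImage = async (file) => {
  const containerURL = ContainerURL.fromServiceURL(serviceURL, 'test');
  const blockBlobURL = BlockBlobURL.fromContainerURL(containerURL, file.name);
  await uploadBrowserDataToBlockBlob(Aborter.none, file, blockBlobURL);
};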
I have a React component with a button whose handler plays an audio file.
const initializeAudio = () => {
  let context = new AudioContext()
  let analyser = context.createAnalyser()
  let audioSrc = context.createMediaElementSource(audioRef.current)
  audioSrc.connect(analyser)
  analyser.connect(context.destination)
  analyser.fftSize = 512
  setAnalyser(analyser)
}

const play = () => {
  if (!audioSet) {
    initializeAudio()
    setAudioSet(true)
    audioRef.current.volume = 0.5
  }
  audioRef.current.play()
  setPlaying(true)
  interval = window.setInterval(turnToTime, 1000)
}
<audio
  data-keepplaying
  ref={audioRef}
  id="audioElement"
  src={audio}
/>
<WaveForm analyser={analyser} />
<button onClick={isPlaying ? pause : play} className="media_button">
  <FontAwesomeIcon
    className="fa-icon"
    icon={isPlaying ? "pause" : "play"}
    style={!isPlaying && { paddingLeft: "0.4rem" }}
  ></FontAwesomeIcon>
</button>
Everything works fine in Firefox and Chrome, but it won't work in Safari on iPhone. Does anyone know why?
I believe my audio is created in response to a click, which is what the new autoplay restrictions require, so why is Safari not working?
From reading your code I would guess that the second line, new AudioContext(), is throwing an error in Safari. Safari still has no official support for the Web Audio API, and the AudioContext is only available as webkitAudioContext.
If initializing the context is your only concern, the following technique might work.
let context = new (window.AudioContext || window.webkitAudioContext)();
However, I would not recommend using it, as it treats Safari's implementation like the one in Firefox or Chrome even though they differ in many aspects. As the author of standardized-audio-context, using this package is of course my preferred way to handle inconsistent browser support for the Web Audio API. :-)
You can use it like this.
import { AudioContext } from 'standardized-audio-context';
let context = new AudioContext();
I hope this helps.