/r/reactnative
A community for learning and developing native mobile applications using React Native by Facebook.
Interested in building web apps using React.js? Check out /r/reactjs!
Getting Started w/React Native
irc.freenode.net #reactnative
Keywords: ios, android, mobile, apps, apple, iphone, ipad
Hey everyone,
I recently upgraded my React Native project from 0.71 to 0.72, and I'm encountering an issue with the react-native-radial-gradient package on Android. After the upgrade, the gradient hides content behind it, and I'm having trouble setting the correct opacity or zIndex. I've tried adjusting the zIndex and opacity values, but the gradient still overlays other components unexpectedly.
Has anyone else faced a similar issue with gradients after upgrading? Are there any workarounds, or is this a known issue with RN 0.72? I'd appreciate any advice or solutions!
Thanks!
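For anyone hitting similar layering problems: on Android, zIndex alone often isn't enough, because sibling render order and elevation also affect stacking. Below is a minimal sketch of a style arrangement worth trying; the component names in the usage comment are hypothetical, not from the original post.

```javascript
// Sketch: render the gradient as an absolutely positioned background
// layer and keep interactive content in a sibling rendered *after* it.
// On Android, elevation also participates in stacking, so it is pinned
// low on the background layer.
const styles = {
  container: {
    flex: 1,
  },
  gradientBackground: {
    // Fill the parent and sit behind siblings.
    position: "absolute",
    top: 0,
    left: 0,
    right: 0,
    bottom: 0,
    zIndex: 0,
    elevation: 0, // Android: keep the gradient from floating above content
  },
  content: {
    zIndex: 1, // render above the gradient
    elevation: 1,
  },
};

// Usage sketch (JSX): the content view comes after the gradient so it
// also wins in the default sibling stacking order:
// <View style={styles.container}>
//   <RadialGradient style={styles.gradientBackground} {...gradientProps} />
//   <View style={styles.content}>{children}</View>
// </View>
```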
Hey everyone, we’re running into an issue: everything works fine on a fresh project, but not on an updated one.
For iOS, we’re getting this error:
[!] The command: 'node --no-warnings --eval require(require.resolve('expo-modules-autolinking', { paths: [require.resolve('expo/package.json')] }))(process.argv.slice(1)) react-native-config --json --platform ios' returned a status code of 1
It seems like autolinking through CocoaPods is failing because it relies on @react-native-community/cli to find React Native native modules. The suggestion is to add it as a dev dependency with:
yarn add -D @react-native-community/cli
Or to check the framework’s documentation.
For Android, we get this error:
A problem occurred evaluating settings 'android'.
ERROR: autolinkLibrariesFromCommand: process node --no-warnings --eval require(require.resolve('expo-modules-autolinking', { paths: [require.resolve('expo/package.json')] }))(process.argv.slice(1)) react-native-config --json --platform android exited with error code: 1
It seems like the autolinking command isn’t working properly in the updated project setup. Does anyone have ideas on how to resolve this?
( https://discord.com/channels/695411232856997968/1302338271048634520 )
I have checked the code that queues the mount when React rendering completes. However, I am not sure when it is dequeued and executed.
In React, when the rendering process is scheduled using unstable_scheduleCallback, the mount queue is cleared at the end, but rendering is scheduled using queueMicrotask.
How does C++ detect when the JavaScript runtime is in an idle state and start the mount process?
Is making a payments app with React Native (like creating a virtual card, or adding one and spending money) safe? Do I still need a native security layer? Also, has anyone tried tap-and-pay with RN? Are there any good libraries?
I'd like to parse an email using React Native.
There are many JavaScript email parsers out there, e.g. https://github.com/netas-ch/eml-parser
Is there a way to import them into my RN project?
Thank you.
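In general, pure-JS parsers work in React Native as long as they avoid Node built-ins like fs or streams; libraries that use those may need polyfills or a different entry point. As a sanity check that plain string parsing is all the JS engine needs, here is a tiny hand-rolled header parser. This is an illustration only, not eml-parser's actual API, and not a full RFC 5322 implementation.

```javascript
// Minimal sketch: parse the headers of a raw .eml string in pure JS.
// This runs in React Native's JS engine because it uses no Node APIs.
function parseEmlHeaders(raw) {
  // Headers end at the first blank line.
  const headerPart = raw.split(/\r?\n\r?\n/)[0];
  // Unfold headers: continuation lines start with whitespace.
  const unfolded = headerPart.replace(/\r?\n[ \t]+/g, " ");
  const headers = {};
  for (const line of unfolded.split(/\r?\n/)) {
    const idx = line.indexOf(":");
    if (idx > 0) {
      headers[line.slice(0, idx).toLowerCase()] = line.slice(idx + 1).trim();
    }
  }
  return headers;
}

const eml = "From: alice@example.com\r\nSubject: Hello\r\n world\r\n\r\nBody text";
const headers = parseEmlHeaders(eml);
// headers.subject === "Hello world" (the folded line is rejoined)
```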
I was given the task last week of upgrading someone else's codebase to Android API 34 (from 33). I don't have a ton of React Native experience, but a bit of Googling seemed to suggest this would be a pretty simple job. Goes to show how little I know.
I tortured plugin and library versions for hours and finally managed to get through a build. But the app is crashing in my emulator with errors referencing receiver elements -- to wit:
com.mlg.perform one of RECEIVER_EXPORTED or RECEIVER_NOT_EXPORTED should be specified when a receiver isn't being registered exclusively for system broadcasts
I've read that API 34 requires some changes to the way receivers are set up as regards exported settings, but as far as I can tell the export attributes for the receivers I've found in the code 'appear' to be properly configured for export -- but then I'm stupid and really don't know what I'm talking about.
Any advice anyone could give would be much appreciated.
*edit -- fixed typos.
Hey, I have recently started learning React Native, and I don't know which options are best for styling, or whether the default StyleSheet is enough.
Thanks,
re TechWithTwin
Overall Goal is to get job as RN mobile dev…
Is it best to start with frontend web React, since there are fewer React Native jobs, or to go straight to mobile with React Native? In terms of how you market yourself (your title on LinkedIn), which gives the highest chance of getting a job as a junior?
Honestly, Zustand feels much easier to use than Redux: it's less code and probably easier to understand.
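Part of why Zustand feels so small is that its core idea fits in a few lines. Below is a sketch of that idea in plain JS, purely as an illustration of the pattern; in a real app you would `import { create } from 'zustand'` rather than roll your own.

```javascript
// Sketch of the core idea behind zustand's create() API. This is an
// illustration, not the real library (which also handles React
// re-renders via a hook, selectors, and more).
function create(initializer) {
  let state;
  const listeners = new Set();
  const setState = (partial) => {
    const next = typeof partial === "function" ? partial(state) : partial;
    state = { ...state, ...next }; // shallow-merge, like zustand
    listeners.forEach((l) => l(state));
  };
  const getState = () => state;
  const subscribe = (l) => {
    listeners.add(l);
    return () => listeners.delete(l); // unsubscribe function
  };
  state = initializer(setState, getState);
  return { getState, setState, subscribe };
}

// A counter store, same shape as a real zustand store:
const useCounter = create((set) => ({
  count: 0,
  increment: () => set((s) => ({ count: s.count + 1 })),
}));

useCounter.getState().increment();
// useCounter.getState().count is now 1
```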
Hey everyone! I'm curious to hear how anyone has managed to add a shadow to a Google marker in the react-native-maps library. From what I've read, the maps library doesn't use the native view system.
Any guidance would be greatly appreciated :)
What’s the best, idiomatic way to create consistent custom headers for screens?
I need two types of headers:
Should these headers be implemented directly in the screen components, or on the <Stack> via screenOptions as custom components? Which approach is best?
I am new to React Native, and I am creating my very first app with Expo. I am trying to apply styles through NativeWind; I carefully installed NativeWind following the official docs, but the styles are not being applied, and now the very first screen of my very first app looks like junk.
Please help me!!
I’m building a boilerplate for react native expo apps to help devs build and launch mobile apps faster.
What features would you want it to have? What struggles have you had in the development and deployment process?
I’m trying to integrate OpenAI’s Assistants API into my React Native app, but I ran into a roadblock – the official Node package isn’t supported in React Native.
Has anyone successfully used OpenAI’s API in a React Native project? I’d love to know how you approached this. Specifically, I’m looking for alternatives or workarounds, maybe a way to make HTTP requests directly to OpenAI’s API without needing the Node package. Will the fetch API work?
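For reference, OpenAI's endpoints are plain HTTPS, so fetch (which React Native ships with) does work. Here is a sketch against the chat completions endpoint; the endpoint path, model name, and response shape are taken from OpenAI's public REST docs and may change, so verify against the current reference. Building the request separately from sending it keeps the shape easy to test.

```javascript
// Sketch: call OpenAI's REST API with plain fetch instead of the Node SDK.
// buildChatRequest constructs the request; askOpenAI sends it.
function buildChatRequest(apiKey, userMessage) {
  return {
    url: "https://api.openai.com/v1/chat/completions",
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`,
      },
      body: JSON.stringify({
        model: "gpt-4o-mini",
        messages: [{ role: "user", content: userMessage }],
      }),
    },
  };
}

async function askOpenAI(apiKey, userMessage) {
  const { url, options } = buildChatRequest(apiKey, userMessage);
  const res = await fetch(url, options); // fetch is built into React Native
  if (!res.ok) throw new Error(`OpenAI error: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}
```

One caveat: shipping an API key inside a mobile app exposes it to extraction, so for production the usual advice is to proxy these calls through your own backend.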
I am developing a mobile application for my final year project. I used React Native, and now I am developing the web application. I've started with React, but I don't really like CSS (never have), and I've seen that another option is to use React Native Web. I know the styles in React Native and CSS are similar (with their differences), but I find React Native styles much easier. What do you recommend: should I continue with React, or build the web application with React Native Web?
Hey all!
I’m currently working with OpenAI’s new RealTime API, which sends back audio chunks called *audio deltas* in base64 format. My goal is to play these audio deltas in my React Native app, but I’m hitting some roadblocks.
Here’s what I’ve tried so far:
The final base64-encoded WAV file plays just fine in a browser (I tested it on [base64.guru](https://base64.guru)), but I can’t get it to play within React Native.
I’ve tried various libraries like **react-native-sound**, **track-player**, and **audio-toolkit**, but none seem to work with these audio chunks in React Native.
Interestingly, on my backend (Express server), I can play the audio chunks without adding any WAV header by simply doing:
const base64AudioChunk = message.delta;
const audioBuffer = Buffer.from(base64AudioChunk, "base64");
But I’m stuck on how to achieve similar results on the frontend in React Native. Has anyone here dealt with audio deltas or real-time audio playback in React Native? If anyone with experience in native audio modules has any pointers, I’d really appreciate your advice!
Thanks a lot for any help!
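Since the chunks apparently play as WAV in a browser, one approach is to decode the base64 deltas, concatenate the raw PCM bytes, and prepend a standard 44-byte WAV header before handing the result to a file-based player. The sketch below assumes 16-bit PCM; OpenAI's Realtime API reportedly defaults to 24 kHz mono, but verify the format your session actually negotiates before relying on these parameters.

```javascript
// Sketch: wrap raw PCM bytes in a 44-byte WAV header so file-based
// players (react-native-sound etc.) can open them.
function pcmToWav(pcmBytes, sampleRate = 24000, channels = 1, bitsPerSample = 16) {
  const blockAlign = (channels * bitsPerSample) / 8;
  const byteRate = sampleRate * blockAlign;
  const buffer = new ArrayBuffer(44 + pcmBytes.length);
  const view = new DataView(buffer);
  const writeStr = (offset, s) => {
    for (let i = 0; i < s.length; i++) view.setUint8(offset + i, s.charCodeAt(i));
  };
  writeStr(0, "RIFF");
  view.setUint32(4, 36 + pcmBytes.length, true); // chunk size, little-endian
  writeStr(8, "WAVE");
  writeStr(12, "fmt ");
  view.setUint32(16, 16, true);           // fmt chunk size
  view.setUint16(20, 1, true);            // audio format 1 = PCM
  view.setUint16(22, channels, true);
  view.setUint32(24, sampleRate, true);
  view.setUint32(28, byteRate, true);
  view.setUint16(32, blockAlign, true);
  view.setUint16(34, bitsPerSample, true);
  writeStr(36, "data");
  view.setUint32(40, pcmBytes.length, true);
  new Uint8Array(buffer, 44).set(pcmBytes);
  return new Uint8Array(buffer);
}
```

From there, re-encoding the result to base64 and writing it to a file (e.g. with expo-file-system) gives you something the usual playback libraries can open as a normal .wav.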
How does the dev build process work, from bundling with Metro to deploying the app on an emulator (presumably with xcrun)? And what changes with the New Architecture, TurboModules, and Hermes? The documentation I found was not really easy to understand…
The website is built using React Native, which seems ridiculous considering React exists for the web. But it gives me the freedom to add new platforms with minimal changes to the code.
Tech stack used :
About the app: It is a travel planning app that allows you to plan and share trips, explore trips shared by other people, and discover and share cool spots (underrated or hidden ones). I built it to counter the influencer effect ruining authentic experiences. Now hardcore travelers can share their experiences directly with the travel community!
link : tripyojan.com
Let me know what you think :)
Edit: it uses the Expo managed workflow, not bare React Native.
Is it still the case that Apple does not allow access to call logs? I guess the same applies to Flutter, right?
I'm trying to make an app that has a fog-of-war feature. I'm using react-native-maps and react-native-skia.
The idea I had was to put a canvas over the maps and as the location changed, it would take away from the canvas permanently.
However, whenever the canvas is over the map view component, I can’t interact with the map. I tried to set pointerEvents to none but it still didn’t work. Any help? I’m not super familiar with these libraries.
Hello everyone, I want to make an animation like this. Can someone give me keywords or suggestions for how to do it? Thanks a lot!
Hey guys!
I’m kinda confused with the whole auth flow topic.
I’m using tanstack query and zustand but I cannot grasp how I can handle my jwt authentication and showing the appropriate Stack.Screens etc.
Does anyone have an example project which is not over-complicated?
I tried to search for good examples but didn’t find a single one which is easy to understand…
I’m using expo with expo router btw.
Hi developers, here’s a general poll about how you feel about CSS (or CSS-like) styling in RN.
Hi guys, below is how I'm handling protected routes and routing on authentication. Everything is working as desired. However, the navigation is choppy at times. Would appreciate some pointers toward what might be the issue. Thanks.
useEffect(() => {
  if (!loading && !reAuthLoading && authCheckComplete) {
    const inAuthenticatedGroup = segments[0] === "(authenticated)";
    const inOnboardingGroup = segments[0] === "(onboarding)";
    const isEmailSignUpPage = segments[0] === "emailSignUp";
    const isLoginPage = segments[0] === "loginPage";

    if (
      !isAuthenticated &&
      segments[0] !== undefined &&
      !isEmailSignUpPage &&
      !isLoginPage
    ) {
      router.replace("emailSignUp");
    } else if (isAuthenticated && !userData && !inOnboardingGroup) {
      router.replace("/(onboarding)/onboardingMenu");
    } else if (isAuthenticated && userData) {
      if (!inAuthenticatedGroup && !inOnboardingGroup) {
        // router.replace("/(onboarding)/proceedToKyc");
        router.replace("/(authenticated)/Dashboard");
      }
    }
  }
}, [isAuthenticated, userData, loading, reAuthLoading, authCheckComplete, segments, router]);

const inAuthenticatedGroup = segments[0] === "(authenticated)";
const inOnboardingGroup = segments[0] === "(onboarding)";
const isEmailSignUpPage = segments[0] === "emailSignUp";
const isLoginPage = segments[0] === "loginPage";

if (
  (loading || reAuthLoading || !authCheckComplete) &&
  !isEmailSignUpPage &&
  !isLoginPage &&
  !inOnboardingGroup
) {
  return <LoadingScreen />;
}

return <Slot />;
}
Hi all,
With the news that CodePush/Microsoft AppCenter was being shut down, we've created a hosted alternative. The platform uses the same underlying open-source library so all of the same functionality is intact without you having to worry about the server configuration and maintenance.
One of the drawbacks of the open-source library was that the AppCenter UI was never released. So, we have rebuilt a similar web-based tool for management. We also have plans to expand on the base functionality to make it even better.
If you are a CodePush user and are looking to continue to use CodePush without interruption, shoot me a PM to join the beta. Currently free and looking for feedback to continue to build on the cool technology!
I have followed all the instructions to install Expo, all of them. But when I scan, from my iPhone, the QR code displayed on my Windows machine, I keep getting the famous "There was a problem running the requested app. manifest should be a valid JSON object" error.
Anyone with tips and tricks on how to get around it?
Hi everyone!
I’m working on an Expo-based cross-platform app, and I’m looking for advice on the best practices for enabling continuous audio recording in the background, including while the device is locked. My primary goal is to keep the app within the Expo managed workflow and avoid the need to eject, if possible.
Currently, I’m using expo-av to handle basic audio recording. Here’s a simplified version of my setup:
import { Audio } from 'expo-av';
const startRecording = async () => {
  await Audio.setAudioModeAsync({
    allowsRecordingIOS: true,
    playsInSilentModeIOS: true,
    staysActiveInBackground: true,
    interruptionModeIOS: Audio.INTERRUPTION_MODE_IOS_DO_NOT_MIX,
    shouldDuckAndroid: true,
    interruptionModeAndroid: Audio.INTERRUPTION_MODE_ANDROID_DO_NOT_MIX,
    playThroughEarpieceAndroid: false,
  });
  // Audio.Recording.createAsync() prepares *and* starts the recording,
  // so calling recording.startAsync() afterwards is unnecessary and will
  // reject because the recording is already in progress.
  const { recording } = await Audio.Recording.createAsync(
    Audio.RecordingOptionsPresets.HIGH_QUALITY
  );
  return recording;
};
From what I’ve gathered, enabling background audio on iOS typically requires setting UIBackgroundModes, while Android needs a Foreground Service. I’m hoping that the Expo SDK might offer features, or that there are maintained packages out there that would work within Expo's environment.
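For the iOS side specifically, the managed workflow lets you declare UIBackgroundModes in app.json without ejecting, since Expo injects infoPlist entries at build time. A sketch of what that might look like (verify the exact keys against the Expo config docs for your SDK version):

```json
{
  "expo": {
    "ios": {
      "infoPlist": {
        "UIBackgroundModes": ["audio"]
      }
    }
  }
}
```

Note this only grants the entitlement; whether recording actually continues while locked still depends on the audio session configuration (staysActiveInBackground, etc.) and passing App Store review for background audio usage.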
Any advice, examples, or resources on how to achieve this functionality would be greatly appreciated! Looking forward to hearing your insights.
Thanks in advance!
I am using the Google Places API in my app. We only use it to find city names (not full place details), once, when the user chooses their location. The usage is really high: even just testing the app, we are getting 13,000 requests per month, which costs us a few hundred dollars. I am a beginner here and need advice on a replacement for the Google Places API, or on whether I'm missing something that adds up to this much usage. Is there any way to watch live requests, to track where the app is sending all these API calls?
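A common cause of runaway Places usage is firing a request on every keystroke or every re-render instead of once per chosen location. A cheap first defense is memoizing lookups in memory, so repeated queries for the same city during a session share one request. The sketch below is illustrative: `fetchCityFromApi` is a placeholder for your actual Places call, not a real function. For "live" visibility, the Google Cloud console's Metrics page also breaks down requests per API method.

```javascript
// Sketch: memoize Places lookups so repeated queries for the same city
// don't hit the API again during a session.
function makeCachedLookup(fetchCityFromApi) {
  const cache = new Map();
  let apiCalls = 0;
  return {
    lookup(query) {
      const key = query.trim().toLowerCase();
      if (!cache.has(key)) {
        apiCalls += 1;
        // Cache the promise itself, so concurrent lookups for the same
        // city share one in-flight request instead of firing duplicates.
        cache.set(key, fetchCityFromApi(key));
      }
      return cache.get(key);
    },
    getApiCallCount: () => apiCalls, // log this in dev to watch request volume
  };
}
```

Pairing this with a debounce on the text input (so the API fires only after the user pauses typing) usually cuts autocomplete-style traffic by an order of magnitude.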