This limitation is enforced by Apple: the app runs as a broadcast extension, which is only allowed a limited amount of system resources. Widgets consume a lot of resources, so they are disabled during screen broadcasting on iOS to reduce the very real risk of the system killing the app.

Does the user expect to see the alerts on the stream while screen broadcasting? While screen capturing, the app is in the background and effectively suspended, so it cannot render anything inside another app.

Were the alerts set to "Show on Preview", with the user expecting to see them while outside our app? Only audio captured through the microphone can end up on the stream.

Android: if you are using headphones and expect to hear the alert sounds on the stream, that will not work; Android does not support capturing internal audio.

Bluetooth microphones and headphones currently have known issues, and we are looking into how to resolve them. If you are not hearing microphone audio on the stream, check the mute button.
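For context, an iOS broadcast extension only ever sees the sample buffers ReplayKit hands it, which is why a suspended app cannot inject alerts into the stream. The sketch below is a minimal, hypothetical `RPBroadcastSampleHandler`; the `MediaUploader` protocol and `uploader` property are stand-ins for whatever transport the real extension uses, not part of any actual SDK.

```swift
import ReplayKit
import CoreMedia

// Hypothetical transport used by the extension to push media upstream.
protocol MediaUploader {
    func sendVideo(_ buffer: CMSampleBuffer)
    func sendAudio(_ buffer: CMSampleBuffer)
}

class SampleHandler: RPBroadcastSampleHandler {
    var uploader: MediaUploader?

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                      with sampleBufferType: RPSampleBufferType) {
        switch sampleBufferType {
        case .video:
            // Captured screen frames -- whatever is on screen, nothing more.
            uploader?.sendVideo(sampleBuffer)
        case .audioMic:
            // Microphone audio: per the note above, this is the audio
            // that actually ends up on the stream.
            uploader?.sendAudio(sampleBuffer)
        case .audioApp:
            // App/system audio buffers; alert sounds that a suspended app
            // never played cannot show up here.
            break
        @unknown default:
            break
        }
    }
}
```

Because the extension runs under tight memory and CPU limits, heavy work in `processSampleBuffer` risks the system terminating the broadcast, which is the same pressure that leads to widgets being disabled during capture.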