QtMultimedia on iOS and tvOS

[[Category:iOS]]
[[Category:tvOS]]
[[Category:QtMultimedia]]


The Qt Multimedia module is supported on iOS and tvOS, but there are some limitations in what is supported, and some additional steps need to be taken to make use of the support. It is not a problem to compile the Qt modules (including Multimedia) statically and link them into your application, but most of QtMultimedia's functionality is provided by plugins. If you are using a static build of Qt, these plugins also need to be statically linked into your application, and to do that you must add a line to your qmake project file.


== Low Latency Audio and Low Level Audio Support ==


To access the low-level audio APIs on iOS and tvOS, you need to link in the CoreAudio plugin:

'''QTPLUGIN += qtaudio_coreaudio'''

This gives you the ability to use the C++ low-level audio APIs (for example QAudioDeviceInfo, QAudioOutput and QAudioInput).
You are also able to use QSoundEffect and the QML component SoundEffect, which let you play back uncompressed WAV files using the low-latency audio system. This provides a convenient way to play short sounds very quickly (like a click sound when you press a button in your UI).
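
As a minimal C++ sketch (assuming a project file with <code>QT += multimedia</code> and a WAV file compiled into the resources as <code>qrc:/sounds/click.wav</code>; the resource path and volume are placeholders), querying the default output device through the low-level API and playing a short sound effect could look like this:

<code>
#include <QGuiApplication>
#include <QAudioDeviceInfo>
#include <QSoundEffect>
#include <QUrl>
#include <QDebug>

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);

    // Low-level API: query the default audio output device.
    const QAudioDeviceInfo device = QAudioDeviceInfo::defaultOutputDevice();
    qDebug() << "Default output device:" << device.deviceName();

    // Low-latency path: play a short, uncompressed WAV file.
    QSoundEffect click;
    click.setSource(QUrl("qrc:/sounds/click.wav")); // placeholder resource path
    click.setVolume(0.8);
    click.play();

    return app.exec();
}
</code>
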
== Audio Engine support (3D Audio) ==


iOS and tvOS should also support the QML-only QtAudioEngine API provided by QtMultimedia. This gives you the ability to simulate 3D environmental sound effects (using OpenAL) inside of QML. If you are using this API in your application, you need to include the audioengine backend by adding the following line to your qmake project file:

'''QTPLUGIN += qtmedia_audioengine'''. From Qt 5.3 onward, you don't need to add the QTPLUGIN line at all, as this is done automatically.


== Camera Capture ==
 
iOS has basic support for capturing images and videos via the built-in camera devices (this is not supported on tvOS due to platform/hardware limitations). To use these APIs, you need to link in the AVFoundation camera backend by adding the following line to your qmake project file:

'''QTPLUGIN += qavfcamera'''. From Qt 5.3 onward, you don't need to add the QTPLUGIN line at all, as this is done automatically.
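
As a rough illustration (not the only way to do it), a widget-based capture sketch in C++ could look like the following, assuming <code>QT += multimedia multimediawidgets widgets</code> in the project file; capturing as soon as the camera becomes active is a simplification for this sketch:

<code>
#include <QApplication>
#include <QCamera>
#include <QCameraImageCapture>
#include <QCameraViewfinder>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    QCamera camera;               // default camera device
    QCameraViewfinder viewfinder; // QWidget-based viewfinder
    camera.setViewfinder(&viewfinder);
    viewfinder.show();

    QCameraImageCapture imageCapture(&camera);

    // Take a still image once the camera has actually started.
    QObject::connect(&camera, &QCamera::statusChanged,
                     [&imageCapture](QCamera::Status status) {
        if (status == QCamera::ActiveStatus)
            imageCapture.capture(); // the backend picks the file location
    });

    camera.start();
    return app.exec();
}
</code>
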


== Multimedia Playback (Videos, Audio) ==


iOS and tvOS also support the playback of both audio and video files or streams. This works by creating a QMediaPlayer (MediaPlayer in QML) object and passing it a media source to play. Right now video playback only works in C++ via QWidget, but audio playback works from both QML and C++. The limitation of not being able to play back video in QML is due to us not being able to render video into an OpenGL texture for Qt Quick, and is something that will have to be resolved in the future (not in 5.2). To enable multimedia playback functionality, you need to include the AVFoundation mediaplayer backend by adding the following line to your qmake project file:

'''QTPLUGIN += qavfmediaplayer'''. From Qt 5.3 onward, you don't need to add the QTPLUGIN line at all, as this is done automatically.
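
For instance, a minimal C++/QWidget playback sketch (again assuming <code>QT += multimedia multimediawidgets widgets</code>; the URL is only a placeholder) could look like this:

<code>
#include <QApplication>
#include <QMediaPlayer>
#include <QVideoWidget>
#include <QUrl>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    QMediaPlayer player;
    QVideoWidget videoWidget; // QWidget-based video surface
    player.setVideoOutput(&videoWidget);
    videoWidget.show();

    // Any source the AVFoundation backend can handle; placeholder URL here.
    player.setMedia(QUrl("http://example.com/clip.mp4"));
    player.play();

    return app.exec();
}
</code>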
