QtMultimedia on iOS and tvOS




Revision as of 19:55, 27 February 2016


The Qt Multimedia module is supported on iOS, but there are some limitations in what is supported, and some additional steps need to be taken to make use of the support. One unusual thing about the iOS port is that everything must be compiled statically, because Apple's policy disallows shared object files from being bundled with applications. It is not a problem to statically compile the Qt modules (including Multimedia) and link them into your application, but most of QtMultimedia's functionality is provided by plugins. These plugins also need to be statically linked into your application, and to do that you must add a line to your qmake project file.
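Taken together, the plugin lines described in the sections below amount to something like the following qmake project file excerpt. This is a sketch: it assumes a pre-Qt 5.3 project that uses all four backends (from 5.3 onwards qmake links the needed plugins automatically).

```
# qmake .pro excerpt (sketch, only needed before Qt 5.3)
QT += multimedia

# Statically link the iOS QtMultimedia backend plugins in use:
QTPLUGIN += qtaudio_coreaudio \
            qtmedia_audioengine \
            qavfcamera \
            qavfmediaplayer
```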

Low Latency Audio and Low Level Audio Support

To access the low-level audio APIs on iOS you need to statically link in the CoreAudio plugin:

QTPLUGIN += qtaudio_coreaudio

This gives you the ability to use the C++ low-level audio APIs (e.g. QAudioDeviceInfo, QAudioOutput, QAudioInput). You are also able to use QSoundEffect and the QML component SoundEffect, which allow you to play back uncompressed WAV files using the low-latency audio system. This provides a convenient way to play short sounds very quickly (like a click sound when you press a button in your UI).
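As a minimal sketch of the low-latency path, here is how QSoundEffect might be used from C++; the resource path "qrc:/sounds/click.wav" is a hypothetical example, not something shipped with Qt:

```cpp
// Sketch: playing a short click sound with the low-latency audio API
// (served by the qtaudio_coreaudio plugin on iOS).
#include <QGuiApplication>
#include <QSoundEffect>
#include <QUrl>

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);

    QSoundEffect effect;
    // Hypothetical uncompressed WAV file bundled as a Qt resource.
    effect.setSource(QUrl("qrc:/sounds/click.wav"));
    effect.setVolume(0.8);
    effect.play();

    return app.exec();
}
```

In a real UI you would typically keep the QSoundEffect object alive for the application's lifetime and call play() from a button handler, since loading the sample has a one-time cost.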

Audio Engine support (3D Audio)

iOS should also support the QML-only QtAudioEngine API provided by QtMultimedia. This gives you the ability to simulate 3D environmental sound effects (using OpenAL) inside of QML. If you are using this API in your iOS application, you need to include the audioengine backend by adding the following line to your qmake project file:

QTPLUGIN += qtmedia_audioengine

From Qt 5.3 onwards, however, you don't need to add the QTPLUGIN line at all, as this is done automatically.
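A rough sketch of the QtAudioEngine QML API follows; the sample file "engine.wav" and the names "engine"/"engineSound" are placeholders, and the exact element set should be checked against the QtAudioEngine documentation for your Qt version:

```qml
// Sketch: 3D positional audio with the QML-only QtAudioEngine API
// (served by the qtmedia_audioengine plugin).
import QtQuick 2.0
import QtAudioEngine 1.0

Item {
    AudioEngine {
        id: audioEngine
        AudioSample { name: "engine"; source: "engine.wav" }
        Sound {
            name: "engineSound"
            PlayVariation { sample: "engine" }
        }
    }
    Component.onCompleted: {
        // Play the sound at a point in 3D space relative to the listener.
        audioEngine.sounds["engineSound"].play(Qt.vector3d(10, 0, 0))
    }
}
```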

Camera Capture

iOS has basic support for capturing images and videos via the built-in camera devices. To use these APIs you need to statically link in the AVFoundation camera backend, which you do by adding the following line to your qmake project file:

QTPLUGIN += qavfcamera

From Qt 5.3 onwards, however, you don't need to add the QTPLUGIN line at all, as this is done automatically.
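A minimal sketch of still-image capture with the Qt 5 camera API; "photo.jpg" is a placeholder path, and a real application would wait for the capture object to report readiness before capturing:

```cpp
// Sketch: still-image capture with the Qt 5 camera API
// (served by the qavfcamera plugin on iOS).
#include <QGuiApplication>
#include <QCamera>
#include <QCameraImageCapture>

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);

    QCamera camera;                        // default camera device
    QCameraImageCapture capture(&camera);

    camera.setCaptureMode(QCamera::CaptureStillImage);
    camera.start();

    // In practice, connect to readyForCaptureChanged() and call
    // capture() only once the camera reports it is ready.
    capture.capture("photo.jpg");          // placeholder output path

    return app.exec();
}
```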

Multimedia Playback (Videos, Audio)

iOS also supports the playback of both audio and video files or streams. This works by creating a QMediaPlayer (MediaPlayer in QML) object and passing it a media source to play. Right now video playback only works in C++ via QWidget, but audio playback works from both QML and C++. The limitation of not being able to play back video in QML is due to us not being able to render video into an OpenGL texture for Qt Quick, and is something that will have to be resolved in the future (not in 5.2). To enable multimedia playback functionality you need to include the AVFoundation mediaplayer backend, which you do by adding the following line to your qmake project file:

QTPLUGIN += qavfmediaplayer

From Qt 5.3 onwards, however, you don't need to add the QTPLUGIN line at all, as this is done automatically.
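As a sketch, audio playback from C++ with QMediaPlayer might look like this; the resource name "qrc:/music.mp3" is a hypothetical placeholder:

```cpp
// Sketch: audio playback with QMediaPlayer
// (served by the qavfmediaplayer plugin on iOS).
#include <QGuiApplication>
#include <QMediaPlayer>
#include <QUrl>

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);

    QMediaPlayer player;
    // Placeholder media source; network stream URLs also work here.
    player.setMedia(QUrl("qrc:/music.mp3"));
    player.setVolume(50);
    player.play();

    return app.exec();
}
```

The same pattern applies from QML by using the MediaPlayer element with a source URL.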