1. Introduction
There are several packages (e.g. assets_audio_player, audioplayers, just_audio) that can handle audio playback in Flutter, but if you want to play audio in the background, you need to do a bit more. I will show you how to do it in this article!
2. Preparation
We need to use audio_service to handle the background task. audio_service does all the work of interfacing with Android, iOS, and other platforms, so you don't need to worry about platform-specific details. However, audio_service isn't an audio player; it's just an interface to the system audio controls, so you still need another audio player package to handle the playback functions.
For this article, I will use just_audio for playback, but you can use other packages as well. We need to add the dependencies below to the pubspec.yaml file:
dependencies:
just_audio: ^0.9.36
audio_service: ^0.18.12
The basic flow is
Flutter UI --> Audio Service --> Audio Handler
so we need to create an audio service and an audio handler!
3. Create the Audio Handler
We can create the audio handler with just_audio first. The handler extends BaseAudioHandler with the QueueHandler mixin:
import 'package:audio_service/audio_service.dart';
import 'package:just_audio/just_audio.dart';

class JustAudioPlayerHandler extends BaseAudioHandler with QueueHandler {
// create the just_audio player and the playlist audio source
final _player = AudioPlayer();
final _playlist = ConcatenatingAudioSource(children: []);
//TODO other logics
//...
}
We need to broadcast all playback state changes as they happen via audio_service's playbackState, so we use _player.playbackEventStream to transform the player events and pipe them into playbackState:
This method transforms a just_audio event into an audio_service state and should be called from the constructor. Every event received from the just_audio player is transformed into an audio_service state so that it can be broadcast to audio_service clients.
PlaybackState _transformEvent(PlaybackEvent event) {
return PlaybackState(
// set up which control buttons appear on the phone's lock screen and notification
controls: [
//MediaControl.skipToPrevious,
MediaControl.rewind,
if (_player.playing) MediaControl.pause else MediaControl.play,
MediaControl.stop,
MediaControl.fastForward,
//MediaControl.skipToNext,
],
// the system actions that are enabled, e.g. seeking from the lock screen
systemActions: const {
MediaAction.seek,
MediaAction.seekForward,
MediaAction.seekBackward,
},
// which of the controls above to show in Android's compact notification view
androidCompactActionIndices: const [0, 1, 3],
// map just_audio's processing state to the audio_service processing state
processingState: const {
ProcessingState.idle: AudioProcessingState.idle,
ProcessingState.loading: AudioProcessingState.loading,
ProcessingState.buffering: AudioProcessingState.buffering,
ProcessingState.ready: AudioProcessingState.ready,
ProcessingState.completed: AudioProcessingState.completed,
}[_player.processingState]!,
playing: _player.playing, // is playing status
updatePosition: _player.position, // the current playing position
bufferedPosition: _player.bufferedPosition, // the buffered position
speed: _player.speed, // player speed
queueIndex: event.currentIndex, // the index of the current item in the queue
);
}
We also need to load an empty playlist when the handler is first created:
Future<void> _loadEmptyPlaylist() async {
try {
await _player.setAudioSource(_playlist);
} catch (e) {
print(e);
}
}
and call these two initialization steps in the constructor:
JustAudioPlayerHandler() {
_player.playbackEventStream.map(_transformEvent).pipe(playbackState);
_loadEmptyPlaylist();
}
Next, implement the base playback methods from the audio service:
@override
Future<void> play() => _player.play();
@override
Future<void> pause() => _player.pause();
@override
Future<void> seek(Duration position) => _player.seek(position);
@override
Future<void> stop() async {
await _player.stop();
await playbackState.firstWhere(
(state) => state.processingState == AudioProcessingState.idle);
}
@override
Future<void> addQueueItems(List<MediaItem> mediaItems) async {
final audioSource = mediaItems.map(_createAudioSource);
_playlist.addAll(audioSource.toList());
final newQueue = queue.value..addAll(mediaItems);
queue.add(newQueue);
}
// add the audio item into playlist and queue before playing it
@override
Future<void> addQueueItem(MediaItem mediaItem) async {
final audioSource = _createAudioSource(mediaItem);
_playlist.add(audioSource);
final newQueue = queue.value..add(mediaItem);
queue.add(newQueue);
}
@override
Future<void> removeQueueItemAt(int index) async {
_playlist.removeAt(index);
final newQueue = queue.value..removeAt(index);
queue.add(newQueue);
}
@override
Future<dynamic> customAction(String name,
[Map<String, dynamic>? extras]) async {
// handle custom actions, e.g. adjusting the volume from the UI
if (name == 'setVolume') {
_player.setVolume(extras!['value']);
}
}
// create the just_audio source from the audio file
// we pass the audio file path via mediaItem.id from the UI
UriAudioSource _createAudioSource(MediaItem mediaItem) {
print('add media item=========${mediaItem.id}');
return AudioSource.uri(
Uri.parse(mediaItem.id),
tag: mediaItem,
);
}
The base audio handler is now done, but you can also create some custom methods based on your requirements. For example, I want to get the audio's total duration, so I create the method below:
Duration? getTotalDuration() => _player.duration;
and a method to set the playback speed:
Future<void> setPlaySpeed(double speed) async {
await _player.setSpeed(speed);
}
and a method to resume a media item from a saved position:
Future<Duration?> resumeMediaItem(
MediaItem item, Duration currentDuration) async {
var index = _getIndex(item);
await _player.seek(currentDuration, index: index);
mediaItem.add(item.copyWith(duration: _player.duration));
await _player.play();
return _player.duration;
}
// get the media item index in the playlist for seeking
// so that we can resume it
int _getIndex(MediaItem item) {
int targetIndex = -1;
final audioSource = _createAudioSource(item);
for (int i = 0; i < _playlist.length; i++) {
final currentItem = _playlist.children[i] as UriAudioSource;
// print('current item');
if (currentItem.uri.path == audioSource.uri.path) {
//print('{audioSource.uri.path} index:i');
targetIndex = i;
break;
}
}
// print('get index:$targetIndex');
return targetIndex;
}
Check whether the item already exists in the playlist; this avoids adding duplicate items to the list:
Future<bool> hasMediaItem(MediaItem item) async {
var index = -1;
if (_playlist.sequence.isNotEmpty) {
index = _getIndex(item);
}
return index >= 0;
}
4. Create the Audio Service
After creating the audio handler, we can create the audio service and expose its methods for the UI to call.
I will use the GetX pattern for this example, so we create a service that extends GetxService:
class AudioPlayerService extends GetxService {
// use the audio handler that we created
late JustAudioPlayerHandler audioPlayerHandler;
// define the MediaItem
late MediaItem _currentItem;
// the current audio duration
Duration _currentDuration = Duration.zero;
}
Initialize the audio handler in the init() method:
void init() async {
audioPlayerHandler = await AudioService.init(
builder: () => JustAudioPlayerHandler(),
config: const AudioServiceConfig(
androidNotificationChannelId: 'com.audioplayer', // a unique notification channel id, typically based on your app's package name
androidNotificationChannelName: 'Flutter Audio Player', // the channel name shown with the background player notification
androidNotificationOngoing: true,
),
);
}
Then wrap the methods we defined in the handler:
Future<void> addQueueItem(MediaItem item) async {
_currentItem = item;
await audioPlayerHandler.addQueueItem(item);
}
Future<bool> hasMediaItem(MediaItem item) async {
return await audioPlayerHandler.hasMediaItem(item);
}
Future<void> setPlaySpeed(double speed) async {
await audioPlayerHandler.setPlaySpeed(speed);
}
void setMediaItem(MediaItem item) {
// setMediaItem is an additional custom method on the handler (not shown above),
// e.g. broadcasting the current item via mediaItem.add
audioPlayerHandler.setMediaItem(item);
}
void setResumeMediaItem(MediaItem? item, Duration currentDuration) {
// print('set resume duration:$currentDuration');
if (item != null) {
_currentItem = item;
}
_currentDuration = currentDuration;
}
Future<void> removeAll() async {
// removeAll is another custom handler method (not shown above) that clears the playlist and queue
await audioPlayerHandler.removeAll();
}
Future<Duration?> getTotalDuration() async {
return audioPlayerHandler.getTotalDuration();
}
Future<Duration?> play() =>
audioPlayerHandler.resumeMediaItem(_currentItem, _currentDuration);
Future<void> pause() => audioPlayerHandler.pause();
Future<void> seek(Duration position) =>
audioPlayerHandler.resumeMediaItem(_currentItem, position);
Future<void> stop() async {
await audioPlayerHandler.stop();
}
Future<void> setVolume(double volume) async {
await audioPlayerHandler.customAction('setVolume', {'value': volume});
}
The audio service is simple; it just forwards calls to the handler.
5. Use it in the UI
Create a song model
To show the song information on the lock screen, we need the model below:
class SongItem {
final int id;
String name;
String artist;
String coverImg;
String album;
String audioFilePath;
SongItem(this.id, this.name, this.artist, this.coverImg, this.album, this.audioFilePath);
}
and pass the model to the audio player service
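For example, a SongItem instance might look like this (all values below are placeholders for illustration only):
// a sample song item with placeholder values
final songItem = SongItem(
  1,
  'My Song',
  'Some Artist',
  '/path/to/cover.jpg',
  'My Album',
  'https://example.com/audio/my_song.mp3',
);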
Using the AudioPlayerService
We can use the service more easily via GetX. Just put the line below in main.dart, and you can use the service anywhere:
Get.find<AudioPlayerService>().init();
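Note that Get.find only works after the service has been registered. A minimal main.dart sketch could look like the following (the Get.put registration and MyApp widget are my assumptions; the article only shows the Get.find call):
void main() {
  WidgetsFlutterBinding.ensureInitialized();
  // register the service with GetX first (assumed setup, not shown in the article)
  Get.put(AudioPlayerService());
  // then initialize the audio handler before running the app
  Get.find<AudioPlayerService>().init();
  runApp(const MyApp());
}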
Then we need to create a MediaItem from the song model:
var mediaItem = MediaItem(
id: songItem.audioFilePath, //pass the audio file path to MediaItem's id
album: songItem.album,
title: songItem.name,
artist: songItem.artist,
artUri: Uri.file(songItem.coverImg),
);
Check whether the song has already been added to the playlist, so we don't add a duplicate item; if it exists and we have a saved position, resume it, otherwise add it to the queue:
//check and don't add the duplicate item
if (await audioPlayerService.hasMediaItem(mediaItem) &&
resumeDuration.inMilliseconds > 0) {
audioPlayerService.setResumeMediaItem(mediaItem, resumeDuration);
currentDuration.value = resumeDuration.inMilliseconds.toDouble();
playingItem(resumeDuration);
} else {
audioPlayerService.addQueueItem(mediaItem);
}
Listen to the position stream to update the progress:
AudioService.position.listen((Duration p) {
currentDuration.value = p.inMilliseconds.toDouble();
currentPosition.value =
    '${p.mmSSFormat} / ${totalDuration.value.mmSSFormat}';
resumeDuration = p;
//TODO:
// Update the progress bar values based on the current duration...
});
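The mmSSFormat getter used above is not part of Dart's Duration; it comes from a custom extension that isn't shown in the article. A minimal sketch of such an extension, assuming it formats the duration as mm:ss, could be:
// hypothetical extension assumed by the snippet above: formats a Duration as mm:ss
extension DurationFormat on Duration {
  String get mmSSFormat {
    final minutes = inMinutes.remainder(60).toString().padLeft(2, '0');
    final seconds = inSeconds.remainder(60).toString().padLeft(2, '0');
    return '$minutes:$seconds';
  }
}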
and create the base control methods for UI buttons
Future<void> play() async {
audioPlayerService.setResumeMediaItem(null, resumeDuration);
var totalDur = await audioPlayerService.play();
//get the total duration
if (totalDur != null) {
totalDuration.value = totalDur;
}
// handle other events...
}
Future<void> seek(Duration d) async {
isPlaying.value = true;
await audioPlayerService.seek(d);
}
Future<void> pause() async {
isPlaying.value = false;
await audioPlayerService.pause();
}
I have only roughly explained how to use it in the UI; there are still many things to do for complete UI logic, but those are not the main parts of this article 🙂
Finally, don't forget to set up the background audio configuration for Android and iOS; you can find the setup instructions in the audio_service documentation.
6. Conclusion
I have just demoed how to play audio in the background with just_audio and audio_service; you can also create the audio handler with your favorite audio package (e.g. assets_audio_player, audioplayers, …). You can also add more controls in the transform event to make the lock screen look better, and customAction is a handy way to handle your custom functions.
OK, please let me know if you have other ideas!
If you enjoyed this article, please follow me here on Medium for more stories about .Net Core, Angular, and other Tech! 🙂