Voice Recorder app in SwiftUI – #2 Playing the audios

Welcome to a new SwiftUI tutorial. In this series, we are building our own dictation app. We will learn how to record audio, how to save audio files, and how to play them back.

In the last part of this tutorial, we dealt with recording and saving audio. In this part, we will learn how to play back the recorded audio. We will also enable the user to delete old recordings.

Preparing the AudioPlayer 🎵

Similar to what we already did with the AudioRecorder, we create our own ObservableObject for the playback functionality. For this purpose, we create a new Swift file called AudioPlayer.

In this file, we import the SwiftUI, Combine and AVFoundation frameworks. Then we create a class called AudioPlayer which conforms to the ObservableObject protocol.

import Foundation
import SwiftUI
import Combine
import AVFoundation

class AudioPlayer: ObservableObject {
    
}

Again, we need a PassthroughSubject to notify observing views about changes, especially about whether an audio file is currently being played.

class AudioPlayer: ObservableObject {
    
    let objectWillChange = PassthroughSubject<AudioPlayer, Never>()
    
}

Accordingly, we implement a variable isPlaying which we set to false by default. Whenever the value of this variable changes, we inform observing views through our objectWillChange subject.

var isPlaying = false {
    didSet {
        objectWillChange.send(self)
    }
}

And for the playback functionality, we need an AVAudioPlayer instance from the AVFoundation framework.

var audioPlayer: AVAudioPlayer!

You can see that the structure of our AudioPlayer is very similar to the AudioRecorder.
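
For reference, here is the AudioPlayer we have assembled so far:

import Foundation
import SwiftUI
import Combine
import AVFoundation

class AudioPlayer: ObservableObject {
    
    // Notifies observing views about changes
    let objectWillChange = PassthroughSubject<AudioPlayer, Never>()
    
    // Whether an audio file is currently being played
    var isPlaying = false {
        didSet {
            objectWillChange.send(self)
        }
    }
    
    // The underlying AVFoundation player instance
    var audioPlayer: AVAudioPlayer!
}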

Now we can insert the play and stop buttons into each RecordingRow.

Updating the RecordingRows ✍️

Each RecordingRow needs its own AudioPlayer for the respective audio recording. To do this, we initialize one separate AudioPlayer instance as an ObservedObject for each RecordingRow.

struct RecordingRow: View {
    
    var audioURL: URL
    
    @ObservedObject var audioPlayer = AudioPlayer()
    
    var body: some View {
        //...
    }
}

If the audioPlayer is not playing, we want to display a play button that allows the user to listen to the recording.

HStack {
    Text("\(audioURL.lastPathComponent)")
    Spacer()
    if audioPlayer.isPlaying == false {
        Button(action: {
            print("Start playing audio")
        }) {
            Image(systemName: "play.circle")
                .imageScale(.large)
        }
    }
}

If audio is currently playing, we instead display a button to stop the playback.

if audioPlayer.isPlaying == false {
    Button(action: {
        print("Start playing audio")
    }) {
        Image(systemName: "play.circle")
            .imageScale(.large)
    }
} else {
    Button(action: {
        print("Stop playing audio")
    }) {
        Image(systemName: "stop.fill")
            .imageScale(.large)
    }
}
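
Putting these pieces together, our RecordingRow now looks like this (the print statements are placeholders that we will replace shortly):

struct RecordingRow: View {
    
    var audioURL: URL
    
    @ObservedObject var audioPlayer = AudioPlayer()
    
    var body: some View {
        HStack {
            Text("\(audioURL.lastPathComponent)")
            Spacer()
            if audioPlayer.isPlaying == false {
                Button(action: {
                    print("Start playing audio")
                }) {
                    Image(systemName: "play.circle")
                        .imageScale(.large)
                }
            } else {
                Button(action: {
                    print("Stop playing audio")
                }) {
                    Image(systemName: "stop.fill")
                        .imageScale(.large)
                }
            }
        }
    }
}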

When you run the app in the simulator, each recording row should now display a play button.

Now we can implement the functions for playing and stopping the audio in our AudioPlayer, which we will call from the buttons of the RecordingRows.

Setting up the playback functionality

In our AudioPlayer we start by adding a function called startPlayback. This function should accept a URL, i.e. a file path for the audio to be played.

func startPlayback(audio: URL) {
    
}

Similar to the recordingSession from the last part of the tutorial, we start by initializing a playbackSession inside this function.

func startPlayback(audio: URL) {
    
    let playbackSession = AVAudioSession.sharedInstance()
    
}

By default, sounds are played through the device’s earpiece. However, we want the audio to be played through the loudspeaker. To achieve this, we have to override the output audio port accordingly.

do {
    try playbackSession.overrideOutputAudioPort(AVAudioSession.PortOverride.speaker)
} catch {
    print("Playing over the device's speakers failed")
}

Now we can start playing the audio at the given file path and inform the observing views. If this fails, we print a corresponding error.

do {
    audioPlayer = try AVAudioPlayer(contentsOf: audio)
    audioPlayer.play()
    isPlaying = true
} catch {
    print("Playback failed.")
}

To stop the playback, we add the following function to our AudioPlayer:

func stopPlayback() {
    audioPlayer.stop()
    isPlaying = false
}

We can now call the two functions from our RecordingRow’s play and stop buttons.

if audioPlayer.isPlaying == false {
    Button(action: {
        self.audioPlayer.startPlayback(audio: self.audioURL)
    }) {
        Image(systemName: "play.circle")
            .imageScale(.large)
    }
} else {
    Button(action: {
        self.audioPlayer.stopPlayback()
    }) {
        Image(systemName: "stop.fill")
            .imageScale(.large)
    }
}

Run the app and tap the play button to listen to your recorded audio! You may have noticed that even after a recording has played to the end, the stop button is still displayed. This is because we have not yet updated our isPlaying variable accordingly.

To be notified when an audio file has finished playing, we need the audioPlayerDidFinishPlaying function. This function is part of the AVAudioPlayerDelegate protocol, which our AudioPlayer has yet to conform to. Hint: To adopt this delegate protocol, the AudioPlayer must also inherit from NSObject.

class AudioPlayer: NSObject, ObservableObject, AVAudioPlayerDelegate {
    
    //...
    
}

When playing an audio file, we need to set the AudioPlayer itself as the delegate of the corresponding AVAudioPlayer instance.

func startPlayback(audio: URL) {
    
    //...
    
    do {
        audioPlayer = try AVAudioPlayer(contentsOf: audio)
        audioPlayer.delegate = self
        audioPlayer.play()
        isPlaying = true
    } catch {
        print("Playback failed.")
    }
}

Now we can add the audioPlayerDidFinishPlaying function to our AudioPlayer. If the audio finished playing successfully, we set the isPlaying property back to false.

func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
    if flag {
        isPlaying = false
    }
}

When we now run the app and play a recording, the AudioPlayer, acting as its own delegate, will call the audioPlayerDidFinishPlaying function after the audio has finished playing. This sets isPlaying back to false, which eventually causes the particular RecordingRow to update itself and display the play button again.
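
For reference, the complete AudioPlayer class now looks like this:

import Foundation
import SwiftUI
import Combine
import AVFoundation

class AudioPlayer: NSObject, ObservableObject, AVAudioPlayerDelegate {
    
    // Notifies observing views, e.g. the RecordingRows, about changes
    let objectWillChange = PassthroughSubject<AudioPlayer, Never>()
    
    // Whether an audio file is currently being played
    var isPlaying = false {
        didSet {
            objectWillChange.send(self)
        }
    }
    
    var audioPlayer: AVAudioPlayer!
    
    func startPlayback(audio: URL) {
        
        let playbackSession = AVAudioSession.sharedInstance()
        
        // Route the sound through the loudspeaker instead of the earpiece
        do {
            try playbackSession.overrideOutputAudioPort(AVAudioSession.PortOverride.speaker)
        } catch {
            print("Playing over the device's speakers failed")
        }
        
        do {
            audioPlayer = try AVAudioPlayer(contentsOf: audio)
            audioPlayer.delegate = self
            audioPlayer.play()
            isPlaying = true
        } catch {
            print("Playback failed.")
        }
    }
    
    func stopPlayback() {
        audioPlayer.stop()
        isPlaying = false
    }
    
    // Called by the AVAudioPlayer when the audio has finished playing
    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
        if flag {
            isPlaying = false
        }
    }
}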

Deleting recordings 🗑

Finally, we would like to allow the user to delete individual recordings. For this purpose, we add the default edit button to the navigation bar of our ContentView.

.navigationBarTitle("Voice recorder")
.navigationBarItems(trailing: EditButton())
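
For orientation, here is a sketch of where these modifiers attach, assuming the ContentView structure from the first part of this tutorial (your layout may differ slightly):

struct ContentView: View {
    
    @ObservedObject var audioRecorder: AudioRecorder
    
    var body: some View {
        NavigationView {
            VStack {
                RecordingsList(audioRecorder: audioRecorder)
                // ... record/stop buttons from the first part ...
            }
            .navigationBarTitle("Voice recorder")
            .navigationBarItems(trailing: EditButton())
        }
    }
}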

This button enables the user to select individual RecordingRows from the RecordingsList that they want to delete. For this to work, the edit button expects us to implement a delete function, which we add to our RecordingsList.

struct RecordingsList: View {
    
    //...
    
    var body: some View {
        //...
    }
    
    func delete(at offsets: IndexSet) {
        
    }
}

The offsets argument represents a set of indices of the recording rows that the user has chosen to delete. From these, we create an array of the file paths of the recordings to be deleted.

func delete(at offsets: IndexSet) {
    
    var urlsToDelete = [URL]()
    for index in offsets {
        urlsToDelete.append(audioRecorder.recordings[index].fileURL)
    }
    
}

We can now add a function to our AudioRecorder that accepts an array of URLs and deletes the corresponding files from the documents folder. When the deletion is completed, we update our recordings array using the fetchRecordings function.

func deleteRecording(urlsToDelete: [URL]) {
    
    for url in urlsToDelete {
        print(url)
        do {
            try FileManager.default.removeItem(at: url)
        } catch {
            print("File could not be deleted!")
        }
    }
    
    fetchRecordings()
}
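
As a reminder, deleteRecording relies on the Recording model and the fetchRecordings function from the first part of this tutorial. Here is a rough sketch of how these might look; the details, such as how the creation date is read, may differ from your own implementation:

struct Recording {
    let fileURL: URL
    let createdAt: Date
}

func fetchRecordings() {
    recordings.removeAll()
    
    // Collect all audio files from the documents directory
    let fileManager = FileManager.default
    let documentDirectory = fileManager.urls(for: .documentDirectory, in: .userDomainMask)[0]
    if let directoryContents = try? fileManager.contentsOfDirectory(at: documentDirectory, includingPropertiesForKeys: [.creationDateKey]) {
        for audio in directoryContents {
            let createdAt = (try? audio.resourceValues(forKeys: [.creationDateKey]).creationDate) ?? Date()
            recordings.append(Recording(fileURL: audio, createdAt: createdAt))
        }
    }
    
    // Sort the recordings by creation date and notify observing views
    recordings.sort(by: { $0.createdAt < $1.createdAt })
    objectWillChange.send(self)
}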

We now call this function from the delete function of our RecordingsList.

func delete(at offsets: IndexSet) {
    
    var urlsToDelete = [URL]()
    for index in offsets {
        urlsToDelete.append(audioRecorder.recordings[index].fileURL)
    }
    audioRecorder.deleteRecording(urlsToDelete: urlsToDelete)
}

Finally, we enable the delete functionality for the rows in our RecordingsList by attaching the onDelete modifier to the ForEach:

List {
    ForEach(audioRecorder.recordings, id: \.createdAt) { recording in
        RecordingRow(audioURL: recording.fileURL)
    }
    .onDelete(perform: delete)
}

Run the app to see if it works. We can now delete a recording either by swiping left on it or by tapping the edit button.

Conclusion 🎊

That’s it, we are finished with our own voice recorder app! We learned how to record and save audio as well as how to play and delete the recordings.

You can look up the complete source code of the app here.

I hope you enjoyed this tutorial! If you want to learn more about SwiftUI, check out our other tutorials! Also make sure you follow us on Instagram and subscribe to our newsletter to not miss any updates, tutorials and tips about SwiftUI and more!
