This project was created to help developers who want to experiment with the Vision framework and with other frameworks that depend on the AVCaptureVideoDataOutputSampleBufferDelegate and AVCaptureAudioDataOutputSampleBufferDelegate protocols.
Currently the example does the following:
- Creates and displays a preview layer that shows camera output in the correct orientation (a rough setup sketch follows this list).
- Has a button to switch between the front and back cameras.
- Can record video with the correct orientation.
- Saves the recorded video to the Photos app (a rough recording sketch also follows this list).
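
To give a sense of the moving parts, here is a minimal sketch of the kind of capture setup this example revolves around: one object acting as both sample-buffer delegates, with video and audio data outputs attached to a shared AVCaptureSession. The `CameraController` name and the device choices are illustrative, not the project's actual types.

```swift
import AVFoundation

// Illustrative sketch only; CameraController is a hypothetical name,
// not a type from this project.
final class CameraController: NSObject,
    AVCaptureVideoDataOutputSampleBufferDelegate,
    AVCaptureAudioDataOutputSampleBufferDelegate {

    let session = AVCaptureSession()
    private let videoOutput = AVCaptureVideoDataOutput()
    private let audioOutput = AVCaptureAudioDataOutput()
    private let sampleQueue = DispatchQueue(label: "camera.sampleBuffers")

    func configure() throws {
        session.beginConfiguration()
        defer { session.commitConfiguration() }

        // Camera and microphone inputs.
        guard
            let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                 for: .video, position: .back),
            let mic = AVCaptureDevice.default(for: .audio)
        else { return }
        let cameraInput = try AVCaptureDeviceInput(device: camera)
        let micInput = try AVCaptureDeviceInput(device: mic)
        if session.canAddInput(cameraInput) { session.addInput(cameraInput) }
        if session.canAddInput(micInput) { session.addInput(micInput) }

        // Route video and audio sample buffers to this object.
        videoOutput.setSampleBufferDelegate(self, queue: sampleQueue)
        audioOutput.setSampleBufferDelegate(self, queue: sampleQueue)
        if session.canAddOutput(videoOutput) { session.addOutput(videoOutput) }
        if session.canAddOutput(audioOutput) { session.addOutput(audioOutput) }
    }

    // Both delegate protocols declare the same callback, so one method
    // serves both; the `output` parameter tells the streams apart.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        if output == videoOutput {
            // Hand the pixel buffer to Vision, an AVAssetWriter, etc.
        } else {
            // Append the audio sample buffer to a writer if recording.
        }
    }
}
```

A preview layer can then be driven by the same session, e.g. `AVCaptureVideoPreviewLayer(session: controller.session)` added to a view's layer, with its connection's orientation adjusted to match the interface orientation.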
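
The recording and saving steps can be sketched in a similar spirit: sample buffers from the delegate callback are appended to an AVAssetWriter, and the finished file is added to the Photos library via PhotoKit. `MovieRecorder`, the codec settings, and the overall layout below are assumptions for illustration; real code would typically use the data outputs' recommended settings and check Photos authorization first.

```swift
import AVFoundation
import Photos

// Illustrative sketch only; MovieRecorder is a hypothetical name.
final class MovieRecorder {
    private let writer: AVAssetWriter
    private let videoInput: AVAssetWriterInput
    private let audioInput: AVAssetWriterInput
    private let outputURL: URL

    init(outputURL: URL) throws {
        self.outputURL = outputURL
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)

        // Explicit settings for brevity; the data outputs'
        // recommendedVideoSettingsForAssetWriter(writingTo:) and
        // recommendedAudioSettingsForAssetWriter(writingTo:) are the usual source.
        videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: 1920,
            AVVideoHeightKey: 1080
        ])
        audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: [
            AVFormatIDKey: kAudioFormatMPEG4AAC,
            AVSampleRateKey: 44_100,
            AVNumberOfChannelsKey: 1
        ])
        videoInput.expectsMediaDataInRealTime = true
        audioInput.expectsMediaDataInRealTime = true
        if writer.canAdd(videoInput) { writer.add(videoInput) }
        if writer.canAdd(audioInput) { writer.add(audioInput) }
    }

    // Call this from captureOutput(_:didOutput:from:) for every buffer.
    func append(_ sampleBuffer: CMSampleBuffer, isVideo: Bool) {
        if writer.status == .unknown {
            writer.startWriting()
            writer.startSession(atSourceTime:
                CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
        }
        guard writer.status == .writing else { return }

        let input = isVideo ? videoInput : audioInput
        if input.isReadyForMoreMediaData {
            input.append(sampleBuffer)
        }
    }

    func finish() {
        videoInput.markAsFinished()
        audioInput.markAsFinished()
        let url = outputURL
        writer.finishWriting {
            // Assumes Photos authorization has already been granted.
            PHPhotoLibrary.shared().performChanges({
                _ = PHAssetChangeRequest.creationRequestForAssetFromVideo(
                    atFileURL: url)
            }, completionHandler: nil)
        }
    }
}
```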
I plan to add a short tutorial to this README in the future.
Please note: this is NOT a production-ready example! If you would like to help move it in that direction, please feel free to open pull requests.
If you like this, notes of encouragement are always welcome.
PS: Much of this code came from this Stack Overflow question: https://stackoverflow.com/questions/51670428/avassetwriter-capturing-video-but-no-audio. Apple's AVCam sample provided insight into many of the issues I encountered. I still need to figure out and update the copyright information.