README.md (108 changes: 100 additions & 8 deletions)
@@ -152,7 +152,7 @@ There are a few different ways to approach filtering an image. The easiest are t
```swift
let testImage = UIImage(named:"WID-small.jpg")!
let toonFilter = SmoothToonFilter()
let filteredImage = testImage.filterWithOperation(toonFilter)
let filteredImage = try! testImage.filterWithOperation(toonFilter)
```

for a more complex pipeline:
@@ -161,7 +161,7 @@
```swift
let testImage = UIImage(named:"WID-small.jpg")!
let toonFilter = SmoothToonFilter()
let luminanceFilter = Luminance()
let filteredImage = testImage.filterWithPipeline{input, output in
let filteredImage = try! testImage.filterWithPipeline{input, output in
input --> toonFilter --> luminanceFilter --> output
}
```
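
Both of these convenience methods now throw, so the try! calls above will crash the app if filtering fails. Below is a minimal sketch (not part of this diff) of the non-crashing form, following the same do/catch pattern the SimpleImageFilter change uses further down in this diff; it assumes the code runs inside a method it can return from early:

```swift
// Sketch only: same pipeline as above, but handling the thrown error
// instead of force-unwrapping with try!.
let filteredImage: UIImage
do {
    filteredImage = try testImage.filterWithPipeline { input, output in
        input --> toonFilter --> luminanceFilter --> output
    }
} catch {
    print("Couldn't filter image with error: \(error)")
    return
}
// filteredImage can now be used as before
```
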
@@ -173,7 +173,7 @@ Both of these convenience methods wrap several operations. To feed a picture int
```swift
let toonFilter = SmoothToonFilter()
let testImage = UIImage(named:"WID-small.jpg")!
let pictureInput = PictureInput(image:testImage)
let pictureInput = try! PictureInput(image:testImage)
let pictureOutput = PictureOutput()
pictureOutput.imageAvailableCallback = {image in
// Do something with image
@@ -186,24 +186,116 @@ In the above, the imageAvailableCallback will be triggered right at the processI

### Filtering and re-encoding a movie ###

To filter an existing movie file, you can write code like the following:
To filter and play back an existing movie file, you can write code like the following:

```swift

do {
let bundleURL = Bundle.main.resourceURL!
let movieURL = URL(string:"sample_iPod.m4v", relativeTo:bundleURL)!
movie = try MovieInput(url:movieURL, playAtActualSpeed:true)
let bundleURL = Bundle.main.resourceURL!
let movieURL = URL(string:"sample_iPod.m4v", relativeTo:bundleURL)!

let audioDecodeSettings = [AVFormatIDKey:kAudioFormatLinearPCM]

movie = try MovieInput(url:movieURL, playAtActualSpeed:true, loop:true, audioSettings:audioDecodeSettings)
speaker = SpeakerOutput()
movie.audioEncodingTarget = speaker

filter = SaturationAdjustment()
movie --> filter --> renderView

movie.start()
speaker.start()
} catch {
fatalError("Could not initialize rendering pipeline: \(error)")
print("Couldn't process movie with error: \(error)")
}
```

where renderView is an instance of RenderView that you've placed somewhere in your view hierarchy. The above loads a movie named "sample_iPod.m4v" from the application's bundle, creates a saturation filter, and directs movie frames to be processed through the saturation filter on their way to the screen. movie.start() initiates playback of the video frames, and speaker.start() begins playing the movie's audio through the SpeakerOutput.
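
For context, here is a minimal sketch (not part of this diff) of a hosting view controller that keeps the pipeline objects alive and puts a RenderView on screen. It assumes the framework module is imported as GPUImage and that RenderView can be created in code like any other UIView; the property names mirror the snippet above:

```swift
import UIKit
import AVFoundation
import GPUImage

class MoviePlayerViewController: UIViewController {
    // Stored properties so the pipeline objects outlive viewDidLoad()
    var movie: MovieInput!
    var speaker: SpeakerOutput!
    var filter: SaturationAdjustment!
    var renderView: RenderView!

    override func viewDidLoad() {
        super.viewDidLoad()
        renderView = RenderView(frame: view.bounds)
        view.addSubview(renderView)
        // ...then build the movie --> filter --> renderView pipeline exactly as shown above
    }
}
```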

To filter an existing movie file and save the result to a new movie file, you can write code like the following:

```swift
let bundleURL = Bundle.main.resourceURL!
// The movie you want to reencode
let movieURL = URL(string:"sample_iPod.m4v", relativeTo:bundleURL)!

let documentsDir = FileManager().urls(for:.documentDirectory, in:.userDomainMask).first!
// The location you want to save the new video
let exportedURL = URL(string:"test.mp4", relativeTo:documentsDir)!

let asset = AVURLAsset(url:movieURL, options:[AVURLAssetPreferPreciseDurationAndTimingKey:NSNumber(value:true)])

guard let videoTrack = asset.tracks(withMediaType:AVMediaType.video).first else { return }
let audioTrack = asset.tracks(withMediaType:AVMediaType.audio).first

// If you would like passthrough audio instead, set both audioDecodingSettings and audioEncodingSettings to nil
let audioDecodingSettings:[String:Any] = [AVFormatIDKey:kAudioFormatLinearPCM] // Noncompressed audio samples

do {
movieInput = try MovieInput(asset:asset, videoComposition:nil, playAtActualSpeed:false, loop:false, audioSettings:audioDecodingSettings)
}
catch {
print("ERROR: Unable to setup MovieInput with error: \(error)")
return
}

try? FileManager().removeItem(at: exportedURL)

let videoEncodingSettings:[String:Any] = [
AVVideoCompressionPropertiesKey: [
AVVideoExpectedSourceFrameRateKey:videoTrack.nominalFrameRate,
AVVideoAverageBitRateKey:videoTrack.estimatedDataRate,
AVVideoProfileLevelKey:AVVideoProfileLevelH264HighAutoLevel,
AVVideoH264EntropyModeKey:AVVideoH264EntropyModeCABAC,
AVVideoAllowFrameReorderingKey:videoTrack.requiresFrameReordering],
AVVideoCodecKey:AVVideoCodecH264]

var acl = AudioChannelLayout()
memset(&acl, 0, MemoryLayout<AudioChannelLayout>.size)
acl.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo
let audioEncodingSettings:[String:Any] = [
AVFormatIDKey:kAudioFormatMPEG4AAC,
AVNumberOfChannelsKey:2,
AVSampleRateKey:AVAudioSession.sharedInstance().sampleRate,
AVChannelLayoutKey:NSData(bytes:&acl, length:MemoryLayout<AudioChannelLayout>.size),
AVEncoderBitRateKey:96000
]

do {
movieOutput = try MovieOutput(URL: exportedURL, size:Size(width:Float(videoTrack.naturalSize.width), height:Float(videoTrack.naturalSize.height)), fileType:.mp4, liveVideo:false, videoSettings:videoEncodingSettings, videoNaturalTimeScale:videoTrack.naturalTimeScale, audioSettings:audioEncodingSettings)
}
catch {
print("ERROR: Unable to setup MovieOutput with error: \(error)")
return
}

filter = SaturationAdjustment()

if(audioTrack != nil) { movieInput.audioEncodingTarget = movieOutput }
movieInput.synchronizedMovieOutput = movieOutput
movieInput --> filter --> movieOutput

movieInput.completion = {
self.movieOutput.finishRecording {
self.movieInput.audioEncodingTarget = nil
self.movieInput.synchronizedMovieOutput = nil
print("Encoding finished")
}
}

movieOutput.startRecording { started, error in
if(!started) {
print("ERROR: MovieOutput unable to start writing with error: \(String(describing: error))")
return
}
self.movieInput.start()
print("Encoding started")
}
```

The above loads a movie named "sample_iPod.m4v" from the application's bundle, creates a saturation filter, and directs movie frames to be processed through the saturation filter on their way to the new file. In addition, it writes the audio to the new file in AAC format.
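
As a quick sanity check (not part of this diff), the exported file can be played back with the stock AVKit player once "Encoding finished" is printed. exportedURL below is the same URL built in the snippet above, and the code assumes it runs inside a view controller:

```swift
import AVKit

// Present the re-encoded movie with the system player for a quick visual check.
let player = AVPlayer(url: exportedURL)
let playerController = AVPlayerViewController()
playerController.player = player
present(playerController, animated: true) {
    player.play()
}
```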

### Writing a custom image processing operation ###

The framework uses a series of protocols to define types that can output images to be processed, take in an image for processing, or do both. These are the ImageSource, ImageConsumer, and ImageProcessingOperation protocols, respectively. Any type can conform to these protocols, but typically classes are used.
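
The rest of this section is collapsed in the diff. As a rough illustration of the idea, custom one-input operations are usually built on BasicOperation and a GLSL fragment shader. The sketch below assumes BasicOperation exposes an initializer taking a fragment shader string and an input count, as the framework's built-in filters do; verify the exact signature against BasicOperation.swift before relying on it:

```swift
// Sketch of a custom color-inverting operation. The shader follows the
// textureCoordinate / inputImageTexture naming convention used by the
// framework's built-in shaders; the initializer shown here is an assumption.
let invertShader = """
varying highp vec2 textureCoordinate;
uniform sampler2D inputImageTexture;

void main() {
    lowp vec4 color = texture2D(inputImageTexture, textureCoordinate);
    gl_FragColor = vec4(vec3(1.0) - color.rgb, color.a);
}
"""

let invertOperation = BasicOperation(fragmentShader: invertShader, numberOfInputs: 1)
// invertOperation can then be wired into a pipeline like any built-in filter:
// pictureInput --> invertOperation --> renderView
```
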
examples/iOS/FilterShowcase/FilterShowcaseSwift/FilterDisplayViewController.swift (2 changes: 1 addition & 1 deletion, mode 100644 → 100755)
@@ -45,7 +45,7 @@ class FilterDisplayViewController: UIViewController, UISplitViewControllerDelega
currentFilterConfiguration.filter.addTarget(view)
case .blend:
videoCamera.addTarget(currentFilterConfiguration.filter)
self.blendImage = PictureInput(imageName:blendImageName)
self.blendImage = try? PictureInput(imageName:blendImageName)
self.blendImage?.addTarget(currentFilterConfiguration.filter)
self.blendImage?.processImage()
currentFilterConfiguration.filter.addTarget(view)
Empty file modified examples/iOS/FilterShowcase/FilterShowcaseSwift/Info.plist (mode 100644 → 100755)
Empty file modified examples/iOS/SimpleImageFilter/SimpleImageFilter/Info.plist (mode 100644 → 100755)
examples/iOS/SimpleImageFilter/SimpleImageFilter/ViewController.swift (17 changes: 15 additions & 2 deletions, mode 100644 → 100755)
@@ -14,7 +14,14 @@ class ViewController: UIViewController {
// Filtering image for saving
let testImage = UIImage(named:"WID-small.jpg")!
let toonFilter = SmoothToonFilter()
let filteredImage = testImage.filterWithOperation(toonFilter)

let filteredImage:UIImage
do {
filteredImage = try testImage.filterWithOperation(toonFilter)
} catch {
print("Couldn't filter image with error: \(error)")
return
}

let pngImage = UIImagePNGRepresentation(filteredImage)!
do {
@@ -25,8 +32,14 @@
print("Couldn't write to file with error: \(error)")
}


// Filtering image for display
picture = PictureInput(image:UIImage(named:"WID-small.jpg")!)
do {
picture = try PictureInput(image:UIImage(named:"WID-small.jpg")!)
} catch {
print("Couldn't create PictureInput with error: \(error)")
return
}
filter = SaturationAdjustment()
picture --> filter --> renderView
picture.processImage()