Listening to this week’s ATP and considering their hopes and dreams for this year’s WWDC got me thinking about the things I wish were a bit better about developing on Apple’s platforms. I present this not as a set of predictions, but as a more formal list of observations and suggestions than my afternoon Twitter complaints.


SnapKit

SnapKit should be the API, really. As someone who went from setting frame, to setting center, to being really intrigued (and ultimately turned off) by Auto Layout when it was released, this is the most natural layout API I’ve ever used. Over the past two years, typing .snp in Xcode has never stopped being novel.

Tell me that this isn’t an improvement:

import UIKit
import SnapKit

let container = UIView()
let box = UIView()
let rightBox = UIView()
container.addSubview(box)
container.addSubview(rightBox)

box.snp.makeConstraints { make in
    // Example constraints, filled in to make the snippet complete
    make.top.left.bottom.equalToSuperview()
    make.right.equalTo(rightBox.snp.left)
}

rightBox.snp.makeConstraints { make in
    make.top.right.bottom.equalToSuperview()
    make.width.equalTo(box)
}

Swift 4 Codables

The acceptance of SE-0167 (building on SE-0166) brings a new protocol in Swift 4 called Codable. This allows for direct mapping of JSON data to Swift types (as well as property list support). While NSKeyedArchiver/NSKeyedUnarchiver have added support for the new protocol, it would be great if this were taken a step further by adding support to CloudKit, where Codable values could be loaded straight into CKRecord. Considering CKRecord’s supported data types, there are a few cases where some work would be involved to add support:

  • [✅] NSString
  • [✅] NSNumber
  • [✅] NSArray
  • [✅] NSDate
  • [❓] NSData
  • [❓] CKReference
  • [❓] CKAsset
  • [✅] CLLocation

I particularly worry about support for NSData and CKReference. Creating many-to-one relationships in JSON is already kind of gross, and building a CloudKit-specific solution outside of the new core JSONEncoder to handle a problem like this for CKRecord seems like a bad idea. It will be interesting to see where this gets adopted around the system frameworks (and how Objective-C interop is handled).
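For the simple cases, the direct mapping already works beautifully today. A minimal sketch — the Track type and JSON payload here are invented for illustration, with Codable conformance synthesized by the compiler:

```swift
import Foundation

// Hypothetical model type; the compiler synthesizes the coding logic.
struct Track: Codable {
    let title: String
    let durationSeconds: Double
}

let json = """
{"title": "Intro", "durationSeconds": 42.5}
""".data(using: .utf8)!

// Decode JSON straight into the Swift type...
let track = try! JSONDecoder().decode(Track.self, from: json)

// ...and encode it back out.
let roundTrip = try! JSONEncoder().encode(track)
print(track.title) // Intro
```

A CKRecord bridge would presumably look a lot like this, which is exactly why a one-off CloudKit encoder would feel redundant.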

Also, it supports ISO 8601 date encodings out of the box, at the encoder level. 👍

let encoder = JSONEncoder()
encoder.dateEncodingStrategy = .iso8601
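A quick sketch of the full round trip — the Event type is invented for illustration, and .iso8601 requires iOS 10/macOS 10.12:

```swift
import Foundation

// Hypothetical type with a Date property
struct Event: Codable {
    let name: String
    let date: Date
}

let encoder = JSONEncoder()
encoder.dateEncodingStrategy = .iso8601

let event = Event(name: "WWDC", date: Date(timeIntervalSince1970: 0))
let data = try! encoder.encode(event)
print(String(data: data, encoding: .utf8)!)
// e.g. {"name":"WWDC","date":"1970-01-01T00:00:00Z"}
```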

Gonna be sweet.

SceneKit “Bug Fixes and General Improvements”

SceneKit is great. So much of it feels like using jMonkeyEngine again, except it isn’t open source and I can’t just jump in and fix bugs. The bugs totally exist, though, and some of the API is kind of weird, so let’s talk about that.

Initializing SCNView with no frame

Derp. This should work. I love using the no-argument initializer on UIViews; it keeps my code clean and serves as a nice informal convention for “this view is under the control of Auto Layout”. I first happened upon this issue in the iOS 9 days; hopefully it gets cleaned up in iOS 11.

import SceneKit
import UIKit

class BestAppEverViewController: UIViewController {
    // MARK: - I fail!
    // let sceneView = SCNView()

    // MARK: - I work!
    let sceneView = SCNView(frame: CGRect(x: 0, y: 0, width: 1, height: 1))
}

OpenGL textures as SCNMaterial contents

SCNMaterial lets you attach a GLKTextureInfo as the material content. Unfortunately, Apple’s only officially supported way of creating a texture info object is the rather limited GLKTextureLoader, which loads from files or in-memory image representations (raw data or CGImage). With no way to simply specify a texture ID, there aren’t many options.

If you’re adventurous, you might subclass GLKTextureInfo with writable properties and notice that it totally works. It might also scare the hell out of you as you eagerly await the iOS update that topples the house of cards.

In most cases, this shouldn’t be an issue. Shader modifiers are a pretty incredible way of having fun with pixels. Unfortunately, that won’t be of much use for video, considering the state of SKVideoNode.
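To give a flavor of why shader modifiers cover most texture tricks: a fragment-stage modifier is just a string attached to the material. This sketch (using a GLSL snippet against the default `_output` structure) inverts whatever color the material would have rendered:

```swift
import SceneKit

// Sketch: invert the material's final color in the fragment stage.
let material = SCNMaterial()
material.shaderModifiers = [
    .fragment: "_output.color.rgb = vec3(1.0) - _output.color.rgb;"
]
```

For anything an image or procedural snippet can express, this sidesteps the GLKTextureInfo dance entirely.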

PhotoKit Asset Status

I love iCloud Photo Library. As much as I truly miss Aperture, it’s made my traditional photo-management nightmare much easier to deal with.

Unfortunately, writing apps against these photo libraries isn’t quite as lovely as actually using the service. Without a way to tell whether a PHAsset is cached on-device, the code for handling these assets (especially video) becomes increasingly complicated the more you try to improve the user experience of accessing them. There are tricks for making an educated guess as to whether the file exists on device, like attempting to load the asset and watching what the progress/completion callbacks do. But this makes it nearly impossible to have nice animations for assets already on device, as there’s always a bit of time between requesting the asset and actually getting data.
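One version of that educated-guess trick can be sketched like so. The API names are real PhotoKit (including the PHImageResultIsInCloudKey info key), but the heuristic itself is unsupported guesswork, which is exactly the problem:

```swift
import Photos
import UIKit

// Heuristic sketch: request a thumbnail with network access disallowed and
// inspect the result to guess whether the asset is cached on-device.
func guessIsLocal(_ asset: PHAsset, completion: @escaping (Bool) -> Void) {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = false  // fail fast instead of downloading

    PHImageManager.default().requestImage(
        for: asset,
        targetSize: CGSize(width: 100, height: 100),
        contentMode: .aspectFit,
        options: options
    ) { image, info in
        // If no image comes back and the in-cloud key is set, the asset
        // (probably) lives only in iCloud.
        let inCloud = info?[PHImageResultIsInCloudKey] as? Bool ?? false
        completion(image != nil && !inCloud)
    }
}
```

A real `isLocallyAvailable` property on PHAsset would make all of this disappear.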

Then there’s the curious case of making resource requests when you actually need the raw data for a video and attempting to use PHAssetResourceProgressHandler. I have yet to see this progress handler return data correctly or consistently. If you’ve hit this and are looking for a fix, by the way: you can concurrently make an AVAsset request to PHCachingImageManager and use its progress handler, PHAssetVideoProgressHandler, which actually works. Thankfully, the progress and completion for the asset request will match what should be happening within the resource request. A fix for this would be super duper as well.


I could go on about shortcomings, irritations, and bugs, but when I put the last few years of iOS development in context, I see a whole picture that’s pretty darn positive. The blemishes on the system frameworks don’t feel any more severe than those in something like the Android SDK, and the nastiest bugs I’ve encountered live in the corner cases. Those are the projects I love, though: the ones where I can take the sealed box and see how hard I can smash it without breaking it. I’m looking forward to WWDC and the chance to talk over some of these things (and the lack of Swift refactoring in Xcode, holy smokes do I want that) in the labs. See you in San Jose!