Netflix, iOS, tvOS, CenturyLink, and a Netgear Nighthawk, Oh My!

There is zero point to this post. I’m only writing it in case someone else runs into this issue. Hopefully they don’t have to spend as many hours as I did to debug this. Maybe someone can also explain why this was an issue in the first place.

For the last six months or so I have been unable to use Netflix on any iOS or tvOS device (version 10.x at the time of this writing). I could sometimes get my list of content to load, but any attempt at playback failed with generic Netflix errors. Even signing into my Netflix account was a chore. If you search the Netflix support site, they offer only the basic, unhelpful advice: have you tried restarting your device or router?

I have a CenturyLink Gigabit fiber line, so hearing that it might be an issue with my connection seemed wrong, especially considering that I use a variety of other streaming services without issue. The other curious thing is that Netflix works fine on my non-Apple devices: my Xbox One, TiVo Bolt, and (gasp) Android phone. Any device with a fruit logo, however, was a no-go.

My home network setup isn’t super complex, but it’s also not a stock setup either.

What I Tried and What Actually Worked

My first thought was that maybe it was some sort of Quality of Service (QoS) issue with the router. I checked, and I didn’t have any QoS settings enabled. Just as an experiment, I tried turning them on. Still no Netflix on iOS or tvOS.

My next experiment involved setting my tvOS device as the DMZ host so it had a straight passthrough from the router to the open Internet. My hope was that whatever was going on in the router that was filtering the Netflix connection would simply be bypassed. Hopes dashed. Still nothing.

My last idea was based on a random guess, but it turned out to actually work: I disabled IPv6 on the Nighthawk. Success!

I have no idea what the issue is with the chemistry between Netflix, Apple’s networking stack, my router, and IPv6. All I know is that I can now watch Black Mirror.

Update: Maybe it has something to do with IPv6 being blocked by Netflix’s war on VPNs?

Advanced iOS Application Architecture

I am mostly retired from the public speaking circuit. I do still make a point to try and do 360iDev in Denver each year, considering the proprietor is a friend and I can sleep in my bed every night of the conference. This year I did a talk that was a departure from my last few years of Auto Layout talks. Instead, I did a walkthrough of how I architect modern iOS apps. If you attended the conference and saw the talk, I hope you enjoyed it. If you didn’t attend, it’s already online and ready to watch.

What Old Is New Again in Auto Layout on iOS 10

The “What’s New in Auto Layout” talk seems to be becoming an annual tradition at WWDC. I didn’t attend in person this year, because I couldn’t get out of San Francisco fast enough on the Friday of the conference, but I recently sat down and watched the video and have been thinking about what Apple is offering this year.

This year’s session was surprising, to say the least. After spending the last three years preaching that autoresizing masks weren’t long for this world, Apple went ahead and made me eat a full serving of crow by “introducing” a new feature in Xcode 8 called “autoresizing masks.” For those new to Mac and iOS development, autoresizing masks are the “old” way of doing things, before Auto Layout was introduced back in OS X Lion. Instead of explicitly setting relationships between your views, you set springs and struts values to assign the resizing and pinning behavior of the view.

Even with Auto Layout, springs and struts never really went away. When the system is unable to determine the layout of your application from your given set of constraints, it plugs in its own implicit constraints, usually tied to springs and struts, to fill in the gaps. And any time you set translatesAutoresizingMaskIntoConstraints to NO on a UIView, you are opting out of letting the system create constraints for you based on its autoresizing mask.

With the new incremental adoption policies in iOS 10 and above, Apple is stepping back from its hard line on Auto Layout adoption and acknowledging that in some scenarios, the old way can actually be better. Specifically, if you just need to set up some basic resizing behavior, opting into letting the system add its default constraints, with a little help from your defined autoresizing masks, is less work than manually creating constraints yourself. In a lot of ways it feels like a pragmatic middle ground between the old way of doing things and the 100% pure Auto Layout approach.

Autoresizing masks are still not without their tradeoffs: if you are dealing with localization, you’re going to want to opt into Auto Layout and all that it has to offer. I just localized an entire app, and it would have been far more hellacious had I stuck with autoresizing masks instead of Auto Layout. But for things like fast prototypes, or the most basic of views with minimal behavior requirements, not having to set up constraints could definitely save you some development time.

A Sane Reference for TargetConditionals

I’ve spent the past couple days cleaning up some code to work on iOS, macOS, and watchOS. One thing I constantly struggle with is keeping the values I need from TargetConditionals.h in my head for when I want to #if branch a segment of Objective-C code for a specific platform.

So . . . I made a table.

Current as of: iOS 9.3, tvOS 9.2, watchOS 2.2, macOS 10.11

Macro | 💻 device | 📱 device | 📱 simulator | ⌚️ device | ⌚️ simulator | 📺 device | 📺 simulator
----- | --------- | --------- | ------------ | --------- | ------------ | --------- | ------------
TARGET_OS_MAC | ☑️ | ☑️ | ☑️ | ☑️ | ☑️ | ☑️ | ☑️
TARGET_OS_IPHONE | 🙅🏻 | ☑️ | ☑️ | ☑️ | ☑️ | ☑️ | ☑️
TARGET_OS_IOS | 🙅🏻 | ☑️ | ☑️ | 🙅🏻 | 🙅🏻 | 🙅🏻 | 🙅🏻
TARGET_OS_WATCH | 🙅🏻 | 🙅🏻 | 🙅🏻 | ☑️ | ☑️ | 🙅🏻 | 🙅🏻
TARGET_OS_TV | 🙅🏻 | 🙅🏻 | 🙅🏻 | 🙅🏻 | 🙅🏻 | ☑️ | ☑️
TARGET_OS_SIMULATOR | 🙅🏻 | 🙅🏻 | ☑️ | 🙅🏻 | ☑️ | 🙅🏻 | ☑️
TARGET_OS_EMBEDDED | 🙅🏻 | ☑️ | 🙅🏻 | ☑️ | 🙅🏻 | ☑️ | 🙅🏻
TARGET_IPHONE_SIMULATOR | 🙅🏻 | 🙅🏻 | ☑️ | 🙅🏻 | ☑️ | 🙅🏻 | ☑️

Hopefully someone else finds this useful.

TED 3.0: Working Effectively with Legacy iOS Code

I don’t talk often about the work I do to keep the lights on at my house, but I have spent a good portion of the last four years working on the TED for iOS app. The 3.0 release is the biggest we’ve undertaken at TED, in both scope and the amount of change it took to bring it to a head. The marquee feature is that the app is now localized in 20 new languages. Users all over the world can now experience TED in their native language, from Arabic to Turkish.

Behind the scenes we have been planning to localize the app for over a year. What took so long?

A History Lesson

TED for iOS 1.0

TED for iOS first arrived on the App Store in October of 2010, released exclusively for the original iPad and designed to run on iOS 3.2. For the last six years we have been building on this foundation as the devices and capabilities of Apple’s platforms have changed. In my time at TED (September 2012 to present) I’ve seen a lot of that change firsthand.

When TED shipped back in 2010 it was written entirely in Objective-C and designed to work on two device sizes: the iPhone 3GS and original iPad. Now we support phones as small as the iPhone 4s up to iPads like the 12.9″ iPad Pro. In that entire time, we have continued to update the same app code base. There has never been a full-on rewrite of the iOS project, as tempting as it may be at times.

Renovating The House You Live In

Early last year, I took over as the lead developer of TED and began to dedicate more time to it than anyone had in previous years (TED has always had at most two part-time developers, and sometimes just one). This allowed us to shift our mindset from maintaining the existing app (keeping the wheels on while occasionally adding a feature) to being more ambitious with our mobile goals. The only problem was that a lot of our code was showing its age. I made a list of high-level goals to modernize the app in small chunks, so that we could continue to ship updates to keep TED HQ happy while I eliminated many of the things that kept me up at night.

The first of those projects was to modularize the app. Since we are a small team with a fairly engaged group of users who upgrade quickly, we jump on the latest version of the iOS SDK faster than most companies. I set our deployment target to iOS 8 and began to break the app down into separate components: instead of a single Xcode project with all our code in it, we broke it into separate dynamic frameworks.

Modularizing the code made it easier to start adding unit tests to pieces of the code that had never sniffed XCTest in their lives. It also improved the workflow: I could compile just TEDData when I was working exclusively with the data model, rather than having to wait for the entire app to compile.

Adopting Adaptivity

When I found out that localizing the app into 20 languages was high on TED’s list of priorities, I made it pretty clear that the foundation the app lived on could not support that in a way that would allow the development team to be successful. Nearly every piece of code in the application was duplicated for both the iPhone and iPad. Up until earlier this year, when you launched the TED app we would fork off to a different set of interface elements depending on whether you were on an iPhone or an iPad. So, for example, our Featured tab had FeaturedViewController_iPhone and FeaturedViewController_iPad with completely separate interface elements. On top of that, none of this interface code had been updated for Auto Layout, size classes, or Storyboards. In fact, most of it hadn’t been touched in years other than the occasional bug fix.

This was by far the most painful part of the app, and it took several months (you can see the gap in our releases last fall) to not only consolidate down to a single UIApplicationDelegate, but also to consolidate and clean up the TED interface as much as possible using size classes, Auto Layout, and Storyboards.

We shipped all of this work in late January as part of TED 2.6. If you used the app, you likely didn’t notice a thing had changed, but that was by design: our metric for success was the user not noticing anything. Internally, however, the entire app had been rebuilt on a new foundation that would enable us to move a lot faster and get closer to localization.

Adopting Swift

As of version 3.0, about 20% of the TED code base is written in Swift. That is up from 0% at this time last year.

The first bits of Swift code we wrote were part of the adaptivity cleanup when I rewrote the entire ‘Surprise Me’ tab in Swift as a test case to see how it would go. You’ve heard the success stories before, so it should come as no surprise that I was a big fan. Not only was the code more concise and easy to read, but there was a lot less of it.

I’m pragmatic when it comes to adopting Swift in TED. If we receive new designs for an existing feature, we go through the process of rewriting the code in Swift. If we encounter some Objective-C code that isn’t a candidate for a rewrite, I take the time to add nullability attributes to it so that it behaves better when we call it from Swift, which makes the whole mixed-code-base experience much more pleasant.

I lose zero sleep over the lack of dynamism in Swift.

Tests and CI

The last component of our maturation as a mobile development team is building out our test suite and running it automatically as part of continuous integration. Every commit now runs the test suite on our CI server to make sure we don’t introduce regressions. Most (not all) new code is also written with automated tests attached. We still have a major gap when it comes to integration and UI testing, but we’ll get there eventually.

One place our test suite is already paying off is localization. There are several edge cases we need to handle when mapping Chinese language codes between iOS and the TED API. We automate all of that with XCTest, which makes the intent of the code easy to understand without having to run the app and step through the debugger.

The Localization Process

This entire modernization process took place off and on from January of 2015 to April of 2016. The actual localization of the app happened in the last four weeks. Since we had done the work of getting the app running on Auto Layout in as many places as possible, exporting and importing our XLIFF files left a minimal amount of interface work when it came to testing the actual interface. The biggest hassle turned out to be adjusting a few buttons to support multiple lines of text for more verbose languages like German.

I was admittedly shocked that this worked as well as it did, but it’s a testament to the work we put in building up to this actual release. 16 months of preparation for 4 weeks of work!

tl;dr

There’s still a ton of work to be done, both internally and externally, to bring the TED app to where I want it to be, but I’m pretty pleased with the work we put in to get to this 3.0 release. I hope you’ll check it out.

TED 3.0

Picking your Swift toolchain from xcodebuild

With Xcode 8, xcodebuild builds with Swift 3 by default. This will be great in about six months, but right now it’s somewhat of a pain. I am using Carthage as my dependency manager, and all of my dependencies are still on Swift 2.x.

If you want to build your Carthage dependencies (or anything from the command line for that matter) using Swift 2.3, add the TOOLCHAINS parameter.

TOOLCHAINS=com.apple.dt.toolchain.Swift_2_3 carthage build

Update: As of Carthage 0.17 you can use a --toolchain option. This doesn’t help with raw xcodebuild obviously.

When Good Apps Crash Unexpectedly

I had a fun Friday evening. I received a Slack notification sent to everyone in our group stating that the iOS app was crashing on launch. This came as a surprise to me, because our most recent release had been out for several weeks at that point. It was also the most stable release, in terms of crash counts, that we’d ever shipped. This was a good point release.

After a few minor heart palpitations, I grabbed a few iOS devices and tried launching the App Store version of the app. Success. No crashes for me. A few other team members launched the app successfully as well. At that point, I made a suggestion straight out of Microsoft’s 1990s playbook: restart your phone. No more crashing on launch.

My theory was correct. A few times in the last few months I’ve seen an app randomly begin crashing on launch with no rhyme or reason. I’d never really tried debugging it because it wasn’t my app crashing, but now I was annoyed. I made a point to check the iOS device console the next time I saw one of my apps exhibiting this weird crash-on-launch behavior.

Fast forward to yesterday, when I wanted to check a few things in the Major League Baseball app. I tapped the MLB icon and . . . crash. Rinse. Repeat. I decided to go round-robin through my apps and see if any others were crashing. It turns out Medium, United Airlines, and Pocket were also crashing. The odds of four major apps all shipping at-launch crashers at once are long enough to make you rich in Vegas. At this point I was pretty confident about what was going on, but I wasn’t at home, so I couldn’t easily plug my phone in and see what was happening. I’d just have to forgo some of my most-used apps, because I wasn’t restarting my phone yet.

Once I got home, I plugged my iPhone into my Mac, popped open the device console in Xcode, and tried launching the failing apps again. Here’s what it spit out:

Apr 30 11:17:55 iPhone kernel[0] <Notice>: AppleFairplayTextCrypterSession::fairplayOpen() failed, error -42028
Apr 30 11:17:55 iPhone assertiond[12347] <Warning>: Unable to obtain a task name port right for pid 14063: (os/kern) failure (5)
Apr 30 11:17:55 iPhone SpringBoard[12335] <Warning>: Unable to register for exec notifications: No such process
Apr 30 11:17:55 iPhone SpringBoard[12335] <Warning>: Unable to obtain a task name port right for pid 14063: (os/kern) failure (5)
Apr 30 11:17:55 iPhone SpringBoard[12335] <Warning>: Unable to obtain a task name port right for <FBApplicationProcess: 0x13facbcf0; com.medium.reader; pid: 14063>
Apr 30 11:17:55 iPhone com.apple.xpc.launchd[1] (UIKitApplication:com.medium.reader[0x7257][14063]) <Notice>: Service exited due to signal: Killed: 9
Apr 30 11:17:55 iPhone SpringBoard[12335] <Warning>: Application 'UIKitApplication:com.medium.reader[0x7257]' exited abnormally via signal.
Apr 30 11:17:56 iPhone SpringBoard[12335] <Warning>: Application '(null)' exited for an unknown reason.
Apr 30 11:17:56 iPhone kernel[0] <Notice>: IOAccessoryManager::configureAllowedFeatures: tristar: revoking mask=0xffff
Apr 30 11:17:56 iPhone iaptransportd[12353] <Warning>: CIapPortAppleIDBus: Auth timer timeout completed on pAIDBPort:0x15cd06f70, portID:01 downstream port
Apr 30 11:17:58 iPhone kernel[0] <Notice>: AppleFairplayTextCrypterSession::fairplayOpen() failed, error -42028
Apr 30 11:17:58 iPhone com.apple.xpc.launchd[1] (UIKitApplication:com.ideashower.ReadItLaterPro[0x33c1][14064]) <Notice>: Service exited due to signal: Killed: 9
Apr 30 11:17:58 iPhone assertiond[12347] <Warning>: Unable to obtain a task name port right for pid 14064: (os/kern) failure (5)
Apr 30 11:17:58 iPhone SpringBoard[12335] <Warning>: Unable to register for exec notifications: No such process
Apr 30 11:17:58 iPhone SpringBoard[12335] <Warning>: Unable to obtain a task name port right for pid 14064: (os/kern) failure (5)
Apr 30 11:17:58 iPhone SpringBoard[12335] <Warning>: Unable to obtain a task name port right for <FBApplicationProcess: 0x13fed4190; com.ideashower.ReadItLaterPro; pid: 14064>
Apr 30 11:17:58 iPhone SpringBoard[12335] <Warning>: Application 'UIKitApplication:com.ideashower.ReadItLaterPro[0x33c1]' exited abnormally via signal.
Apr 30 11:17:59 iPhone SpringBoard[12335] <Warning>: Application '(null)' exited for an unknown reason.
Apr 30 11:18:03 iPhone kernel[0] <Notice>: AppleFairplayTextCrypterSession::fairplayOpen() failed, error -42028
Apr 30 11:18:03 iPhone com.apple.xpc.launchd[1] (UIKitApplication:com.united.UnitedCustomerFacingIPhone[0x620b][14065]) <Notice>: Service exited due to signal: Killed: 9
Apr 30 11:18:03 iPhone assertiond[12347] <Warning>: Unable to obtain a task name port right for pid 14065: (os/kern) failure (5)
Apr 30 11:18:03 iPhone SpringBoard[12335] <Warning>: Unable to register for exec notifications: No such process
Apr 30 11:18:03 iPhone SpringBoard[12335] <Warning>: Unable to obtain a task name port right for pid 14065: (os/kern) failure (5)
Apr 30 11:18:03 iPhone SpringBoard[12335] <Warning>: Unable to obtain a task name port right for <FBApplicationProcess: 0x13fa60dc0; com.united.UnitedCustomerFacingIPhone; pid: 14065>
Apr 30 11:18:03 iPhone SpringBoard[12335] <Warning>: Application 'UIKitApplication:com.united.UnitedCustomerFacingIPhone[0x620b]' exited abnormally via signal.
Apr 30 11:18:03 iPhone SpringBoard[12335] <Warning>: Application '(null)' exited for an unknown reason.
Apr 30 11:18:07 iPhone syslogd[12297] <Notice>: ASL Sender Statistics
Apr 30 11:18:07 iPhone kernel[0] <Notice>: xpcproxy[14066] Container: /private/var/mobile/Containers/Data/Application/C1AEAFEE-6FDD-46F1-BBAB-F3E345D35EB9 (sandbox)
Apr 30 11:18:07 iPhone kernel[0] <Notice>: AppleFairplayTextCrypterSession::fairplayOpen() failed, error -42028
Apr 30 11:18:07 iPhone com.apple.xpc.launchd[1] (UIKitApplication:com.mlb.AtBatUniversal[0x37dc][14066]) <Notice>: Service exited due to signal: Trace/BPT trap: 5
Apr 30 11:18:07 iPhone ReportCrash[14067] <Notice>: Formulating report for corpse[14066] AtBat.Full
Apr 30 11:18:07 iPhone SpringBoard[12335] <Warning>: Application 'UIKitApplication:com.mlb.AtBatUniversal[0x37dc]' crashed.
Apr 30 11:18:07 iPhone ReportCrash[14067] <Warning>: Saved type '109(109_AtBat.Full)' report (3 of max 25) at /var/mobile/Library/Logs/CrashReporter/AtBat.Full-2016-04-30-111807.ips

The big thing that jumps out at me is this line:

Apr 30 11:18:07 iPhone kernel[0] <Notice>: AppleFairplayTextCrypterSession::fairplayOpen() failed, error -42028

FairPlay is the DRM Apple uses for the App Store (and iTunes purchases in general). Something is going wrong with the DRM on these apps, causing them to fail to launch. Confidence inspiring!

This is as far as my investigation has taken me. I don’t really have a theory about what is causing this. My first thought was that it was triggered by an app being updated, but the United app hadn’t been updated in a while and it was still crashing. I am mostly writing this up because when I was trying to defend that this was an iOS issue and not a bug in our code, I didn’t have much of a source to cite other than a few tweets I remembered from other folks running into the same issue. I’d love to know more if you have any theories.

Apple folks, have a radar: rdar://26032481

GoogleCast, GitHub, Git-LFS, and You

My main project these days bundles Google’s GoogleCast / Chromecast framework to allow users to stream video content from their iOS devices to Google’s little streaming sticks. Overall, I like the Chromecast devices. They seem to work a bit more reliably than AirPlay in a lot of ways. What I don’t like, however, is the framework that Google provides for iOS to enable casting support in third-party apps.

We had the framework in our Git repository for the last year without much issue. It was only when we recently tried to update to the latest version (2.10.4 as of this writing) that the issues really started to pop up. Seemingly overnight, the GoogleCast framework ballooned from 17MB to 439MB. That is not a typo. It really did grow 25x in size. I opened an issue, and Google was quick to blame Bitcode, though I’ve never seen Bitcode grow a framework 25x. So our app is now a little bigger because we have this giant framework embedded in our project. No big deal, right?

The problem is that we use GitHub for our version control, and GitHub has a hard limit of 100MB on any file pushed to a repository. If a file is over that 100MB limit, the push is rejected and GitHub suggests moving the file to its Large File Storage system. Git-LFS is an extension to Git that replaces giant files, like a 439MB iOS framework, with a small pointer to a remote storage location better suited to housing such large files (think Amazon S3).
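For context, what LFS actually commits in place of the big file is a tiny text pointer along these lines (the digest and size here are placeholders, not real values):

```
version https://git-lfs.github.com/spec/v1
oid sha256:<64-character SHA-256 digest of the actual file>
size <file size in bytes>
```

The real 439MB blob lives on the LFS storage backend and gets fetched on checkout.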

Easy fix, I thought. I installed the Git-LFS extensions and tried adding the GoogleCast framework to LFS, but was met with an error:

GitHub LFS WTF

Long story short: to get a file on GitHub LFS, it has to have zero history in your repository. Otherwise, the pre-push hook won’t recognize it as an LFS file and you’ll be in a jam. GitHub has a page that explains this and how to get around it. It still took a bit of trial and error for me to get GoogleCast to behave with Git LFS. These are the steps I went through in Terminal:

Analyze the Damage

Using gitk, I was able to see how far back the GoogleCast framework went and how many commits included it. To get gitk working on El Capitan, however, you need to update the copy of Tcl through Homebrew.

brew cask install tcl
gitk --all -- External/GoogleCast.framework

In the output you should see the commits that include the GoogleCast framework and its contents.

Sweep It All Away

Here comes the painful part. You need to go through and remove every reference to the file from your repository history, on every branch and tag. You’ll then need to force-push the changes to your repo, which is always a little scary. There are two ways to do this: git filter-branch or a third-party tool called BFG Repo Cleaner. The BFG is supposed to be faster, but I was unable to get it to work for what I was trying to accomplish, so I went the slow, time-consuming route.

You’ll need to repeat this on every branch you have open in your repository. Grab a drink:

git filter-branch --force --index-filter 'git rm -rf --cached --ignore-unmatch External/GoogleCast.framework' --prune-empty --tag-name-filter cat -- --all

This will go through and remove the GoogleCast framework files from every commit in your history. After you’ve finished cleaning up a branch you can run gitk again with this command to ensure it’s no longer in your history:

gitk -- External/GoogleCast.framework

Clean Up

After this, it’s probably worth running the Git garbage collector.

git gc --aggressive --prune=now

And finally, you’ll want to force push all your changes back to your repository.

git push origin --force --all; git push origin --force --tags

On to LFS

Now that we’ve got a cleaned-up copy of the repository on GitHub, we can finally re-add the GoogleCast framework and make sure it’s under LFS. Go ahead and add a fresh copy of the GoogleCast framework and xcassets files to wherever you were previously storing them. Make sure your app builds at this point, just for safety.

First, we need to install the Git LFS pre-push hook in our repository:

git lfs install

Next, we’ll go ahead and tell LFS about our oversized GoogleCast files.

git lfs track External/GoogleCast.framework/Versions/A/GoogleCast
git lfs track External/GoogleCast.framework/Versions/Current/GoogleCast
git lfs track External/GoogleCast.framework/GoogleCast

You’ll notice I am doing this three times. For some reason known only to Google, they bundle the same 139 MB binary three times in their “framework,” so you have to make sure you hit all three instances of it.
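If the tracking commands took, your .gitattributes should now contain one line per tracked path, along these lines:

```
External/GoogleCast.framework/Versions/A/GoogleCast filter=lfs diff=lfs merge=lfs -text
External/GoogleCast.framework/Versions/Current/GoogleCast filter=lfs diff=lfs merge=lfs -text
External/GoogleCast.framework/GoogleCast filter=lfs diff=lfs merge=lfs -text
```

You can double-check what’s being tracked at any time by running git lfs track with no arguments.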

Commit all your changes, including the additions made to the .gitattributes file to track the new oversized GoogleCast files. You should now be able to push to your master or develop branch and have the oversized files stored in GitHub’s LFS system.

Finally you can go back to getting actual work done.

Rock-Paper-Scissors

I can’t say enough good things about this Android ad that played a few times during the Oscars.

Stephen Wilkes’s Day to Night Photography

Stephen Wilkes is a professional photographer who gave a talk at TED last week on his day-to-night photography project. Wilkes sets up at a location for an entire day or more, capturing the same shot repeatedly. He then heads back into his studio and stitches the moments together into a single photograph that captures the activity of that location over a single day. The photos are incredibly mesmerizing.

I probably will have more to say about my experience at TED. Probably.