Advanced iOS Application Architecture

I am mostly retired from the public speaking circuit. I do still make a point to try and do 360iDev in Denver each year, considering the proprietor is a friend and I can sleep in my bed every night of the conference. This year I did a talk that was a departure from my last few years of Auto Layout talks. Instead, I did a walkthrough of how I architect modern iOS apps. If you attended the conference and saw the talk, I hope you enjoyed it. If you didn’t attend, it’s already online and ready to watch.

What Old Is New Again in Auto Layout on iOS 10

The “What’s New in Auto Layout” talk seems to be becoming an annual tradition at WWDC. I didn’t attend in person this year, because I couldn’t get out of San Francisco fast enough on the Friday of the conference, but I recently sat down and watched the video and have been thinking about what Apple is offering this year.

This year’s session was surprising, to say the least. After spending the last three years preaching that autoresizing masks weren’t long for this world, Apple went ahead and made me eat a full serving of crow by “introducing” a new feature in Xcode 8 called “autoresizing masks.” For those new to Mac and iOS development, autoresizing masks are the “old” way of doing things, from before Auto Layout was introduced back in OS X Lion. Instead of explicitly setting relationships between your views, you set the springs and struts values to assign the resizing and pinning behavior of the view.

Even with Auto Layout, springs and struts never really went away. When the system is unable to determine the layout of your application with your given set of constraints, it plugs in its own set of constraints implicitly, usually tied to springs and struts, to fill in the gaps. And any time you set translatesAutoresizingMaskIntoConstraints to NO on a UIView, you are opting out of letting the system create constraints for you based off its autoresizing mask.

With the new incremental adoption policies in iOS 10 and above, Apple is now walking back its hard line on Auto Layout adoption and instead realizing that in some scenarios, the old way can actually be better. Specifically, if you just need to set up some basic resizing behavior, opting into letting the system add its default constraints at compile time, with a little help from your defined autoresizing masks, is less work than manually creating constraints yourself. In a lot of ways it feels like a pragmatic middle ground between the old way of doing things and the 100% pure Auto Layout approach.

Autoresizing masks are still not without their tradeoffs: if you are dealing with localization, you’re going to want to opt into Auto Layout and all that it has to offer. I just localized an entire app, and it would have been far more hellacious if I had stuck with autoresizing masks instead of Auto Layout. But for things like fast prototypes or the most basic of views with minimal behavior requirements, not having to set up constraints could definitely save you some development time.

A Sane Reference for TargetConditionals

I’ve spent the past couple days cleaning up some code to work on iOS, macOS, and watchOS. One thing I constantly struggle with is keeping the values I need from TargetConditionals.h in my head for when I want to #if branch a segment of Objective-C code for a specific platform.

So . . . I made a table.

Current as of: iOS 9.3, tvOS 9.2, watchOS 2.2, macOS 10.11

Macro | macOS | iOS | iOS Simulator | watchOS | watchOS Simulator | tvOS | tvOS Simulator
TARGET_OS_MAC | ☑️ | ☑️ | ☑️ | ☑️ | ☑️ | ☑️ | ☑️
TARGET_OS_IPHONE | 🙅🏻 | ☑️ | ☑️ | ☑️ | ☑️ | ☑️ | ☑️
TARGET_OS_IOS | 🙅🏻 | ☑️ | ☑️ | 🙅🏻 | 🙅🏻 | 🙅🏻 | 🙅🏻
TARGET_OS_WATCH | 🙅🏻 | 🙅🏻 | 🙅🏻 | ☑️ | ☑️ | 🙅🏻 | 🙅🏻
TARGET_OS_TV | 🙅🏻 | 🙅🏻 | 🙅🏻 | 🙅🏻 | 🙅🏻 | ☑️ | ☑️
TARGET_OS_SIMULATOR | 🙅🏻 | 🙅🏻 | ☑️ | 🙅🏻 | ☑️ | 🙅🏻 | ☑️
TARGET_OS_EMBEDDED | 🙅🏻 | ☑️ | 🙅🏻 | ☑️ | 🙅🏻 | ☑️ | 🙅🏻
TARGET_IPHONE_SIMULATOR | 🙅🏻 | 🙅🏻 | ☑️ | 🙅🏻 | ☑️ | 🙅🏻 | ☑️

Hopefully someone else finds this useful.

TED 3.0: Working Effectively with Legacy iOS Code

I don’t talk often about the work I do to keep the lights on at my house, but I have spent a good portion of the last four years working on the TED for iOS app. The 3.0 release is the biggest release we’ve undertaken at TED in terms of scope and the amount of change involved. The marquee feature is that the app is now localized in 20 new languages. Users all over the world can now experience TED in their native languages, from Arabic to Turkish.

Behind the scenes we have been planning to localize the app for over a year. What took so long?

A History Lesson

TED for iOS 1.0

TED for iOS first arrived on the App Store in October of 2010, released exclusively for the original iPad. TED for iOS 1.0 was designed to run on iOS 3.2. For the last six years we have been building on this foundation as the devices and capabilities of Apple’s platforms have changed. In my time at TED (September 2012 to present), I’ve seen a lot of that change firsthand.

When TED shipped back in 2010 it was written entirely in Objective-C and designed to work on two device sizes: the iPhone 3GS and original iPad. Now we support phones as small as the iPhone 4s up to iPads like the 12.9″ iPad Pro. In that entire time, we have continued to update the same app code base. There has never been a full-on rewrite of the iOS project, as tempting as it may be at times.

Renovating The House You Live In

Early last year, I took over as the lead developer of TED and began to dedicate more time to it than anyone had in previous years (TED has always had at most two part-time developers, and sometimes just one). This allowed us to shift our mindset from maintaining the existing app — keeping the wheels on while occasionally adding features — to being more ambitious with our mobile goals. The only problem was that a lot of our code was showing its age. I made a list of high-level goals for modernizing the app in small chunks, so that we could continue to ship updates to keep TED HQ happy while I eliminated many of the things that kept me up at night.

The first of those projects was to modularize the app. Since we are a small team with a fairly engaged group of users who upgrade quickly, we jump on the latest version of the iOS SDK faster than most companies. I set our minimum deployment target to iOS 8 and began to break the app down into separate components. Instead of a single Xcode project with all our code in it, we broke the app into separate dynamic frameworks.

Modularizing the code made it easier to start adopting unit tests for pieces of the code that had never sniffed XCTest in their lives. It also helped the workflow: I could compile only TEDData when I was working with the data model exclusively, rather than having to wait for the entire app to compile.

Adopting Adaptivity

When I found out that localizing the app into 20 languages was high on the list of priorities for TED, I made it pretty clear that the foundation the app lived on would not be capable of supporting that in a way that would allow the development team to be successful. Nearly every piece of code in the application was duplicated for the iPhone and iPad. Up until earlier this year, when you launched the TED app we would fork off to a different set of interface elements depending on whether you were on an iPhone or an iPad. Our Featured tab, for example, had FeaturedViewController_iPhone and FeaturedViewController_iPad with completely separate interface elements. On top of that, none of this interface code had been updated for Auto Layout, size classes, or Storyboards. In fact, most of it hadn’t been touched in years other than the occasional bug fix.

This was by far the most painful part of the app, and it took several months (you can see the gap in our releases last fall) to not only consolidate down to a single UIApplicationDelegate, but also to consolidate and clean up the TED interface as much as possible using size classes, Auto Layout, and Storyboards.

We shipped all of this work in late January as part of TED 2.6. You likely didn’t notice a thing had changed if you used the app, but that was by design. Our metric for success was the user not noticing anything had changed. Internally, however, the entire app had been built up on a new foundation that would enable us to move a lot faster and move closer to localization.

Adopting Swift

As of version 3.0, about 20% of the TED code base is written in Swift. That is up from 0% at this time last year.

The first bits of Swift code we wrote were part of the adaptivity cleanup when I rewrote the entire ‘Surprise Me’ tab in Swift as a test case to see how it would go. You’ve heard the success stories before, so it should come as no surprise that I was a big fan. Not only was the code more concise and easy to read, but there was a lot less of it.

I’m pragmatic when it comes to adopting Swift in TED. If we receive new designs for an existing feature, we go through the process of rewriting the code in Swift. If we encounter some Objective-C code that isn’t a candidate for a rewrite, I take the time to add nullability attributes to it so that it behaves better when we call it from Swift, which makes the entire mixed code base experience much more pleasant.

I lose zero sleep over the lack of dynamism in Swift.

Tests and CI

The last component of our maturation as a mobile development team is building out our test suite and getting it running automatically as part of continuous integration. Every commit now runs our test suite on our CI server automatically to make sure we don’t introduce any regressions. Most (not all) new code is also being written with automated tests attached to it. We still have a major gap when it comes to integration and UI testing, but we’ll get there eventually.

One example of where our test suite is paying off is localization. There are several edge cases we need to handle when mapping Chinese language codes between iOS and the TED API. We automate all of that using XCTest, which makes it super easy to understand the intent of the code without having to run the app and step through the debugger.

The Localization Process

This entire modernization process took place off and on from January of 2015 to April of 2016. The actual process of localizing the app took place in the last four weeks. Since we had gone through the work of getting the app running on Auto Layout in as many places as possible, exporting and importing our xliff files left us with a minimal amount of interface work when it came time to test the actual interface. The biggest hassle turned out to be adjusting a few buttons to support multiple lines for more verbose languages like German.

I was admittedly shocked that this worked as well as it did, but it’s a testament to the work we put in building up to this actual release. 16 months of preparation for 4 weeks of work!


There’s still a ton of work to be done both internally and externally to bring the TED app to where I want it to be, but I’m pretty pleased with the work that we put in to get to this 3.0 release. I hope you’ll check it out.

TED 3.0

Picking your Swift toolchain from xcodebuild

With Xcode 8, xcodebuild uses Swift 3 to build by default. This will be great in about six months, but right now it’s somewhat of a pain. I am using Carthage as my dependency manager, and all of my dependencies are still using Swift 2.x.

If you want to build your Carthage dependencies (or anything from the command line, for that matter) using Swift 2.3, prefix the command with the TOOLCHAINS variable:

TOOLCHAINS=com.apple.dt.toolchain.Swift_2_3 carthage build

Update: As of Carthage 0.17 you can use a --toolchain option. This doesn’t help with raw xcodebuild, obviously.

When Good Apps Crash Unexpectedly

I had a fun Friday evening. I received a Slack notification, sent to everyone in our group, stating that the iOS app was crashing on launch. This came as a surprise to me, because our most recent release had been out for several weeks at that point. It was also the most stable release, in terms of the number of crashes, that we had ever shipped. This was a good point release.

After a few minor heart palpitations, I grabbed a few iOS devices and tried launching the App Store version of the app. Success. No crashes for me. A few other team members launched the app successfully as well. At that point, I made a suggestion straight out of Microsoft’s 1990s playbook: restart your phone. No more crashing on launch.

My theory was correct. A few times in the last few months, I’ve seen an app randomly begin crashing on launch with no rhyme or reason. I had never really tried debugging it because it wasn’t my app crashing, but now I was annoyed. I made a point to check the iOS device console the next time I saw one of my apps exhibit this weird crash-on-launch behavior.

Fast forward to yesterday, when I wanted to check a few things in the Major League Baseball app. I tapped the MLB icon and . . . crash. Rinse. Repeat. I decided to go round-robin through my apps and see if any others were crashing. It turned out Medium, United Airlines, and Pocket were crashing too. The odds of four of these major apps all shipping at-launch crashers at once are long enough to make you rich in Vegas. At this point I was pretty confident about what was going on, but I wasn’t at home, so I couldn’t easily plug my phone in and see what was happening. I would just have to forgo three of my most used apps, because I wasn’t restarting my phone yet.

Once I got home, I plugged my iPhone into my Mac, popped open the device console in Xcode, and tried launching the failing apps again. Here’s what it spit out:

Apr 30 11:17:55 iPhone kernel[0] <Notice>: AppleFairplayTextCrypterSession::fairplayOpen() failed, error -42028
Apr 30 11:17:55 iPhone assertiond[12347] <Warning>: Unable to obtain a task name port right for pid 14063: (os/kern) failure (5)
Apr 30 11:17:55 iPhone SpringBoard[12335] <Warning>: Unable to register for exec notifications: No such process
Apr 30 11:17:55 iPhone SpringBoard[12335] <Warning>: Unable to obtain a task name port right for pid 14063: (os/kern) failure (5)
Apr 30 11:17:55 iPhone SpringBoard[12335] <Warning>: Unable to obtain a task name port right for <FBApplicationProcess: 0x13facbcf0; com.medium.reader; pid: 14063>
Apr 30 11:17:55 iPhone[1] (UIKitApplication:com.medium.reader[0x7257][14063]) <Notice>: Service exited due to signal: Killed: 9
Apr 30 11:17:55 iPhone SpringBoard[12335] <Warning>: Application 'UIKitApplication:com.medium.reader[0x7257]' exited abnormally via signal.
Apr 30 11:17:56 iPhone SpringBoard[12335] <Warning>: Application '(null)' exited for an unknown reason.
Apr 30 11:17:56 iPhone kernel[0] <Notice>: IOAccessoryManager::configureAllowedFeatures: tristar: revoking mask=0xffff
Apr 30 11:17:56 iPhone iaptransportd[12353] <Warning>: CIapPortAppleIDBus: Auth timer timeout completed on pAIDBPort:0x15cd06f70, portID:01 downstream port
Apr 30 11:17:58 iPhone kernel[0] <Notice>: AppleFairplayTextCrypterSession::fairplayOpen() failed, error -42028
Apr 30 11:17:58 iPhone[1] (UIKitApplication:com.ideashower.ReadItLaterPro[0x33c1][14064]) <Notice>: Service exited due to signal: Killed: 9
Apr 30 11:17:58 iPhone assertiond[12347] <Warning>: Unable to obtain a task name port right for pid 14064: (os/kern) failure (5)
Apr 30 11:17:58 iPhone SpringBoard[12335] <Warning>: Unable to register for exec notifications: No such process
Apr 30 11:17:58 iPhone SpringBoard[12335] <Warning>: Unable to obtain a task name port right for pid 14064: (os/kern) failure (5)
Apr 30 11:17:58 iPhone SpringBoard[12335] <Warning>: Unable to obtain a task name port right for <FBApplicationProcess: 0x13fed4190; com.ideashower.ReadItLaterPro; pid: 14064>
Apr 30 11:17:58 iPhone SpringBoard[12335] <Warning>: Application 'UIKitApplication:com.ideashower.ReadItLaterPro[0x33c1]' exited abnormally via signal.
Apr 30 11:17:59 iPhone SpringBoard[12335] <Warning>: Application '(null)' exited for an unknown reason.
Apr 30 11:18:03 iPhone kernel[0] <Notice>: AppleFairplayTextCrypterSession::fairplayOpen() failed, error -42028
Apr 30 11:18:03 iPhone[1] (UIKitApplication:com.united.UnitedCustomerFacingIPhone[0x620b][14065]) <Notice>: Service exited due to signal: Killed: 9
Apr 30 11:18:03 iPhone assertiond[12347] <Warning>: Unable to obtain a task name port right for pid 14065: (os/kern) failure (5)
Apr 30 11:18:03 iPhone SpringBoard[12335] <Warning>: Unable to register for exec notifications: No such process
Apr 30 11:18:03 iPhone SpringBoard[12335] <Warning>: Unable to obtain a task name port right for pid 14065: (os/kern) failure (5)
Apr 30 11:18:03 iPhone SpringBoard[12335] <Warning>: Unable to obtain a task name port right for <FBApplicationProcess: 0x13fa60dc0; com.united.UnitedCustomerFacingIPhone; pid: 14065>
Apr 30 11:18:03 iPhone SpringBoard[12335] <Warning>: Application 'UIKitApplication:com.united.UnitedCustomerFacingIPhone[0x620b]' exited abnormally via signal.
Apr 30 11:18:03 iPhone SpringBoard[12335] <Warning>: Application '(null)' exited for an unknown reason.
Apr 30 11:18:07 iPhone syslogd[12297] <Notice>: ASL Sender Statistics
Apr 30 11:18:07 iPhone kernel[0] <Notice>: xpcproxy[14066] Container: /private/var/mobile/Containers/Data/Application/C1AEAFEE-6FDD-46F1-BBAB-F3E345D35EB9 (sandbox)
Apr 30 11:18:07 iPhone kernel[0] <Notice>: AppleFairplayTextCrypterSession::fairplayOpen() failed, error -42028
Apr 30 11:18:07 iPhone[1] ([0x37dc][14066]) <Notice>: Service exited due to signal: Trace/BPT trap: 5
Apr 30 11:18:07 iPhone ReportCrash[14067] <Notice>: Formulating report for corpse[14066] AtBat.Full
Apr 30 11:18:07 iPhone SpringBoard[12335] <Warning>: Application '[0x37dc]' crashed.
Apr 30 11:18:07 iPhone ReportCrash[14067] <Warning>: Saved type '109(109_AtBat.Full)' report (3 of max 25) at /var/mobile/Library/Logs/CrashReporter/AtBat.Full-2016-04-30-111807.ips

The big thing that jumps out at me is this line:

Apr 30 11:18:07 iPhone kernel[0] <Notice>: AppleFairplayTextCrypterSession::fairplayOpen() failed, error -42028

FairPlay is the DRM Apple uses for App Store apps (and iTunes purchases in general). Something is going wrong with the DRM on these apps, causing them to fail to launch. Confidence inspiring!

This is as far as my investigation has taken me at this point. I don’t really have a theory about what is causing it. My first thought was that it was triggered by an app being updated, but the United app hadn’t been updated in a while and was crashing all the same. I am mostly writing this up because when I was trying to make the case that this was an iOS issue and not a bug in our code, I didn’t have much of a source to cite other than the few tweets I remembered seeing from other folks running into the same issue. I’d love to know more if you have any theories.

Apple folks, have a radar: rdar://26032481

GoogleCast, GitHub, Git-LFS, and You

My main project these days bundles Google’s GoogleCast / Chromecast framework in it to allow users to stream video content from their iOS devices to Google’s little streaming sticks. Overall, I like the Chromecast devices. They seem to work a bit more reliably than Airplay in a lot of ways. What I don’t like, however, is the framework that Google provides for iOS to enable casting support in third-party apps.

We have had the framework in our git repository for the last year without much issue. It was only recently, when we tried to update to the latest version (2.10.4 as of this writing), that the issues started to really pop up. Seemingly overnight, the GoogleCast framework ballooned from 17MB to 439MB. That is not a typo. It really did grow 25x in size. I opened an issue, and Google was quick to blame Bitcode, though I’ve never seen Bitcode grow a framework 25x. So our app is now a little bigger because we have this giant framework embedded in our project. No big deal, right?

The problem is that we use GitHub for our version control system, and GitHub has a hard limit of 100MB on any single file pushed to a repository. If a file is over that 100MB limit, the push is rejected and GitHub suggests adding the file to its Large File Storage system. Git-LFS is an extension to the Git protocol that replaces giant files, like a 439MB iOS framework, with a small pointer to the actual file on remote storage that is better designed to house such large files (think Amazon S3).
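For the curious, the “pointer” that Git-LFS checks into your repository in place of the real binary is just a tiny text stub like this (the oid hash and size here are illustrative placeholders, not real values):

```
version https://git-lfs.github.com/spec/v1
oid sha256:c2e9f1b6aa…
size 439000000
```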

Easy fix, I thought. I installed the Git-LFS extension and tried adding the GoogleCast framework to LFS, but was met with an error.


Long story short: for a file to go into GitHub LFS, it has to have zero history in your repository. Otherwise, the pre-push hook won’t recognize it as an LFS file and you’ll be in a jam. GitHub has a page that explains this and how to get around it. It still took a bit of trial and error for me to get GoogleCast to behave with Git LFS. These are the steps I went through in Terminal:

Analyze The Damage

Using gitk, I was able to see how far back the GoogleCast framework went and how many commits included it. To get gitk working on El Capitan, however, you need to update the copy of tcl through Homebrew.

brew cask install tcl
gitk --all -- External/GoogleCast.framework

In the output you should see the commits that include the GoogleCast framework and the files inside it.

Sweep It All Away

Here comes the painful part. You’re going to need to go through and remove every reference to the file from your repository history on every branch and tag. You’ll then need to force push the changes to your repo, which is always a little scary. There are two ways to do this: git filter-branch or a third-party tool called BFG Repo Cleaner. The BFG is supposed to be faster, but I was unable to get it to work for what I was trying to accomplish, so I went the slow and time-consuming route.

You’ll need to repeat this on every branch you have open in your repository. Grab a drink:

git filter-branch --force --index-filter 'git rm -rf --cached --ignore-unmatch External/GoogleCast.framework' --prune-empty --tag-name-filter cat -- --all

This will go through and remove the GoogleCast framework files from every commit in your history. After you’ve finished cleaning up a branch you can run gitk again with this command to ensure it’s no longer in your history:

gitk -- External/GoogleCast.framework

Clean Up

After this, it’s probably worth running the Git garbage collector.

git gc --aggressive --prune=now

And finally, you’ll want to force push all your changes back to your repository.

git push origin --force --all; git push origin --force --tags

On to LFS

Now that we’ve got a cleaned-up copy of the repository on GitHub, we can finally re-add the GoogleCast framework to the repo and make sure it’s under LFS. Go ahead and add a fresh copy of the GoogleCast framework and xcassets files to wherever you were previously storing them. Make sure your app builds at this point, just for safety.

First we need to install the Git LFS pre-push hook in our repository:

git lfs install

Next, we’ll go ahead and tell LFS about our oversized GoogleCast files.

git lfs track External/GoogleCast.framework/Versions/A/GoogleCast
git lfs track External/GoogleCast.framework/Versions/Current/GoogleCast
git lfs track External/GoogleCast.framework/GoogleCast

You’ll notice I am doing this three times. For some reason known only to Google, they bundle the same 139 MB binary three times in their “framework,” so you have to make sure you hit all three instances of it.

Commit all your changes, including the new .gitattributes file additions that were made to track the new oversized GoogleCast files. You should now be able to push your changes to your master or develop branch and have the oversized files added to GitHub’s LFS system.
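As a sanity check before committing, the track commands above should have left entries like these in your .gitattributes (this is the standard pattern git lfs track writes, shown here for the paths used above):

```
External/GoogleCast.framework/Versions/A/GoogleCast filter=lfs diff=lfs merge=lfs -text
External/GoogleCast.framework/Versions/Current/GoogleCast filter=lfs diff=lfs merge=lfs -text
External/GoogleCast.framework/GoogleCast filter=lfs diff=lfs merge=lfs -text
```

Running git lfs ls-files after the commit should likewise list all three binaries.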

Finally you can go back to getting actual work done.


I can’t say enough good things about this Android commercial that aired a few times during the Oscars.

Stephen Wilkes’s Day to Night Photography

Stephen Wilkes is a professional photographer who gave a talk at TED last week on his day-to-night photography project. Wilkes sets up at a location for an entire day or more, capturing the same shot repeatedly. He then heads back to his studio and stitches the moments together into a single photograph that captures the activity of that location over a single day. The photos are incredibly mesmerizing.

I probably will have more to say about my experience at TED. Probably.

@Justin Bieber Explains Twitter’s Difficulties

In case you weren’t aware, Twitter is doomed! Like, Apple levels of doomed. User growth has been stalled for years. Advertising isn’t picking up nearly as well as they’d hoped. And the stock price is trading at less than my high school allowance because Wall Street is losing faith. I would personally prefer they regain their faith in Twitter, because I bought the stock at $45. I’d like to recoup that investment.

Ever since Jack came back to save his flailing company, people have been waiting for him to sprinkle pixie dust on Market St and suddenly solve all of Twitter’s problems. Sadly, not even Steve Jobs’s spiritual son can perform this kind of magic in just a few months. There have been a few different product launches since last fall and another round of #ExecutiveShuffle, but the ultimate problems of Twitter still remain. Twitter is too damn hard. Not for you, the fine person reading this article. Not for me certainly. I’ve been on it for a decade this July. But for those other people out there in the world? The “normals”? They don’t get it.

How do I know? I am @justin on Twitter and this has been my mentions tab for at least half a decade.

My Twitter Mentions Nightmare

You’ll notice that most of these tweets aren’t necessarily for me. I am constantly inundated with tweets to “@Justin Bieber”, “@Justin Trudeau”, “@Justin Timberlake”, and maybe once even “@Justin Guarini.” The examples above are just a few random people who were confused while tweeting. It gets worse when someone like TMZ (3.69 million followers) tweets about “@Justin Bieber” instead of “@JustinBieber” and I can’t use Twitter for a few days without wanting to switch back to Pownce. When Bieber himself tweets I am well aware, because my mentions stream blows up even worse than this.

More than annoying, it can also be somewhat sad. I’ve seen teen girls confess their undying love to Bieber. I know when they cry because he wore a green shirt. I’ve even seen people threaten to cut themselves if he doesn’t reply to them. Am I supposed to reply to that tweet? It was sent to me (@justin bieber), and not Bieber (@justinbieber), after all. I ultimately end up spending an inordinate amount of time blocking these accounts to try to keep my mentions at some sort of sane level. At last check, I’ve blocked well over 60,000 accounts. That’s at least 1/3 of Twitter’s 4th quarter monthly active users!

Because of this, I don’t really enjoy Twitter that much anymore. This is also why I don’t use it nearly as much as I used to. It’s bad enough that the culture of Twitter is centered around abuse, actually-ing people, and making it way too easy for dumb people to try to sound smart. Add on top of it that most people don’t understand the product and I have to spend part of my life doing manual labor trying to make the service usable for me. No better way to spend a Saturday night than Twitter Block Button and Chill.

The fundamental tenets of Twitter are obviously broken for most people, and they have been for years. Even through the product’s currently super confusing interface, Twitter’s conversational nature lets me gauge where Bieber is still most popular (South America, Southeast Asia) and how Justin Trudeau is doing up in Canada (pretty great!). But the product does not enable people to successfully talk to their favorite celebrities or #brands. Instead they end up talking to an iOS developer in Colorado who really doesn’t want to hear from them. Why exactly would I want to keep using a product that enables this?

Twitter the company seems aware of this, based on their last shareholder letter.

We are going to fix the broken windows and confusing parts, like the .@name syntax and @reply rules, that we know inhibit usage and drive people away. We’re going to improve the timeline to make sure you see the best Tweets, while preserving the timeliness we are known for. The timeline improvement we announced just this morning has grown usage across the board (including Tweeting and Retweeting). We’re going to improve onboarding flows to make sure you easily find both your contacts and your interests. We’re going to make Tweeting faster while making Tweets more expressive with both text and visual media. We’re going to help people come together around a particular topic, such as our @NBA timelines experiences. Relentlessly refining Twitter will enable more people to get more out of Twitter faster.

Whether this is just lip service to shareholders to try to quell another mass sell-off after a disappointing quarter, or whether something will actually change, remains to be seen. I personally welcome the algorithmic timeline, because I no longer check Twitter more than once a day or so. I don’t want to see every tweet from every person I follow, and since my mentions are usually a dumpster fire, I don’t have much to look for there either.

If @jack is looking for a fuzzy metric to determine whether Twitter is getting easier to use, he can take a peek at my mentions every couple of months and gauge my misery level. The happier I am, the easier Twitter is most likely becoming to use.