WWDC 2017: One more big thing

When watching this year’s keynote at Apple’s Worldwide Developers Conference, I implore you to do one thing: think big.
I don’t mean big as in a new 12.9-inch iPad Pro, or big as in the number of features packed into this year’s annual iOS or macOS software updates. I don’t even mean big news, like the rumored Siri Speaker the company might announce.
No, I mean think big picture. After all, while Apple may be a huge company made up of disparate units, products, and platforms, it has always promulgated the idea that it brings all of those resources to bear on one unified goal. And I think that if you look at the big picture of what Apple ends up announcing next week, you'll come away noticing a couple of major themes in that overall strategy.
Learning in the machine
“Machine learning” is probably one of the biggest buzzwords in the technology industry right now. While we used to focus on the somewhat vague idea of artificial intelligence—that is, computers that can think like people—machine learning is, in essence, the practical application of AI theory. How do we use the strengths of computers to solve problems for people?
Machine learning at work, in your pocket: iOS 10's app suggestions.
Machine learning first came into play in a big way with Apple's release of iOS 9 and its “proactive” features. These features generally lurk beneath the surface, using algorithms to try to predict what users might want when they take certain actions, and then presenting more relevant options more quickly—it's all about trying to derive the intent of what the user is doing.
For example, when you swipe down to the search view from iOS's home screen and see a list of Siri App Suggestions, that list is based on a number of factors: apps you recently used, apps you often use, apps you use at this time of day, apps relevant to other things your phone is currently doing (playing music, for example), your location, and so on.
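To make that concrete, here's a toy Swift sketch of how a ranking like this might combine those signals. Everything in it (the signal names, the weights, the scoring formula) is hypothetical, meant only to illustrate the idea of scoring apps by recency, frequency, time of day, and context; Apple hasn't published how Siri App Suggestions actually works.

```swift
import Foundation

// Hypothetical signals a suggestion engine might track per app.
struct AppSignals {
    let minutesSinceLastUse: Double
    let usesThisWeek: Int
    let usesAtThisHourHistorically: Int
    let matchesCurrentContext: Bool  // e.g. headphones just plugged in, or a familiar location
}

// A made-up scoring function: higher scores surface the app sooner.
func suggestionScore(for signals: AppSignals) -> Double {
    // Recency: decay the contribution as time since last use grows.
    let recency = 1.0 / (1.0 + signals.minutesSinceLastUse / 60.0)
    // Frequency: dampen with a log so heavy use doesn't dominate everything.
    let frequency = log(1.0 + Double(signals.usesThisWeek))
    // Time of day: reward apps historically used at this hour.
    let timeOfDay = log(1.0 + Double(signals.usesAtThisHourHistorically))
    // Context: a flat bonus when the app matches what the device is doing right now.
    let context = signals.matchesCurrentContext ? 1.5 : 0.0
    return 2.0 * recency + frequency + 1.5 * timeOfDay + context
}
```

The interesting part isn't the arithmetic but the shape of the problem: many weak signals, blended into one ordering, that gets better as the system observes more of your behavior.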
That’s not to say that these types of proactive features are perfect, but they improve not only as Apple’s engineers work on them, but also as the algorithms themselves learn more about your behaviors. Proactive features have begun to appear in more places throughout Apple’s software. For example, when you open the Maps app on your iPhone, it offers locations that you’ve recently looked up elsewhere on your device.
So don’t be surprised at all when machine learning is a big part of Apple’s WWDC announcements this year. The company’s invested heavily in the arena, and it’s even had its engineers presenting at AI conferences, an unusually open move for a company that’s all about secrecy. It’s safe to say machine learning is going to keep cropping up in Apple’s technology for the foreseeable future.
Silver services
Apple’s been pushing the financial success of its Services division for a while now. Services are the glue that tie together not only Apple’s software and hardware, but also its disparate hardware devices and platforms, and those links are only going to grow stronger going forward.
It helps that “services” is a really broad category, encompassing everything from the iTunes and App Stores to Apple Pay and iCloud. Not all of them work across every platform that Apple has to offer, and their uses aren’t always obvious.
The biggest opportunity for services in the Apple ecosystem is the continued increase in the number of devices we all have in our lives. From computers to smartphones and tablets to set-top boxes, we have them in our homes, our workplaces, our cars, and on our person at all times. And with this surfeit of devices, people have developed less and less patience for them not working together. If you start watching a video on your smartphone, you ought to be able to easily pick up where you left off on your TV. If you’re listening to a song on your Mac, you should be able to put on your headphones and continue on—as that very first iPod commercial demonstrated.
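That pick-up-where-you-left-off experience is, in fact, what Apple's Handoff feature provides, built on the NSUserActivity API. Here is a minimal Swift sketch of the idea; the activity type string and userInfo keys are invented for illustration, not taken from any actual app.

```swift
import Foundation

// A minimal Handoff sketch using NSUserActivity. The activity type and
// userInfo keys below are hypothetical; a real app would declare its
// activity types under NSUserActivityTypes in its Info.plist.
let activity = NSUserActivity(activityType: "com.example.player.watching")
activity.title = "Watching a video"
activity.userInfo = ["videoID": "abc123", "playbackSeconds": 754.0]
activity.becomeCurrent()  // advertise this activity to the user's other devices

// On the receiving device, the system delivers the activity to the app
// (in UIKit, via application(_:continue:restorationHandler:)), which can
// read userInfo["playbackSeconds"] and resume playback where you left off.
```

The service layer (iCloud and the device-to-device plumbing underneath Handoff) is what makes that tiny bit of code feel like magic to the user.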
Services are key to these features, and they’re only going to become more important as Apple introduces additional devices, such as a putative Siri Speaker. With connected efforts like HomeKit, the need for those devices to be aware of each other is paramount. We’re already getting to the point where we have too many devices to easily manage—the burden needs to be taken off of us. We’re no longer at a point where we can be expected to, say, tag all our photos on separate devices.
And perhaps that’s where these two themes mesh together into one uber-theme. The combination of machine learning and services is poised to make things easier for us, the users. Really, that shouldn’t be a big surprise, as it’s been an overarching theme at Apple since the beginning: technology that works for us, instead of the other way around.