The future of the Finder

The desktop metaphor is practical and useful, but maybe it’s time to move on to a new way to use our computers.

This is the last installment of this column, and as such, I wanted to cover one of the most important features on the Mac: the Finder. This file manager, browser, and user interface layer is the tool that people use to launch applications, work with and manage files and folders, and control pretty much everything their computer does.
The early Mac was revolutionary, bringing the desktop metaphor to everyday computers. It wasn’t the first computer to use this type of interface, but it was the first one that was widely adopted. Instead of controlling a computer by typing lines of text commands, it used the WIMP interface: windows, icons, menus, and pointer. (And even before text commands, computers were controlled by punch cards, tapes, and other ways of inputting commands and data.)
One thing the desktop metaphor does is allow us to organize files any way we want. Unlike tags, where you set keywords for your files—keywords you may or may not recall later—folders let you sort items in whatever way best fits your style of organizing. They’re flexible and extensible, through sub-folders and sub-sub-folders. You could dump all your files in a single folder and use Spotlight to find the ones you want, but you’d quickly find that this is more time-consuming than keeping your files sorted.
While the desktop metaphor is practical and useful, maybe it’s time to move on. Apple has made a variant of this with iOS. It’s got apps and windows, but no files—though files appear in some apps, in lists—and the pointer is your fingers. Perhaps it’s time to start experimenting on the desktop. I would be surprised if Apple doesn’t have teams working on new interfaces, and I would love to see how they imagine this, even if it’s only as a proof of concept.

Minority Report is often cited as an example of a futuristic computer UI.
It’s not that the desktop metaphor is bad; it may even be the optimal way to work with a computer. I’m not sure I want to use something like the computers in the movie Minority Report; I’ve always felt that these gesture-based ideas ignore the fact that working like this all day would lead to shoulder, neck, and arm pain. (And this is probably the main reason that Apple doesn’t release a desktop or laptop computer with a touch screen.)
And we can’t do everything by voice. Even though Nuance’s Dragon is an extraordinary tool for converting your voice into text, and you can use Siri to control some of your Mac’s functions, good luck with that when you work in a crowded office. Augmented reality (AR) seems to be the next big thing, but that’s not a way of interacting with a computer; it’s more a tool for extending the capabilities of mobile devices.
The science-fiction method that’s gaining traction—and showing promise—is the brain-computer interface. But these only provide a way of interacting with a computer’s existing interface; they don’t change the metaphor used to present items on a screen. They replace the P part of WIMP, not the rest.
Perhaps there is no better way. It may be that we’ve found the best way of interacting with complex devices through the WIMP interface. But I would love Apple to surprise us with some new way of working with Macs, even if it’s only an option or an add-on.
“Can’t innovate any more, my ass?” Come on, Apple, show us what you’ve got.