Tuesday, June 28, 2016

Why Neo and Scorpio WON'T be the Worst Thing Ever


In 2005, I wanted an Xbox 360.  Everyone wanted one.  But, like so much new tech, it suffered massive supply shortages, and at $400, the initial price tag was just too high to simply "dive in".  Plus, I still had some great games like God of War II and Shadow of the Colossus that I'd yet to finish on my reliable old Playstation 2.  The more I played them, the more I realized I wasn't a Microsoft guy.  I was a Sony fanboy.  Yes, Microsoft had some great console exclusives like Left 4 Dead, Gears of War, and Mass Effect, but I was content to wait until Sony released its console.  (Sorry, guys, give me Quake II or Unreal Tournament over Halo, any day.)

This was also when the HD Format Wars were shaking out, and while Microsoft had saddled its "next-gen" console with a standard DVD drive and an expensive HD DVD add-on, Sony was going all out by fitting its shiny new living room invader with the barely-emerging Blu-ray player, which promised higher storage capacity and better video quality.  Yes, it was more expensive than the competition, but the Playstation 3 offered a more "complete" console with an HDMI port, an integrated wireless adapter, a native wireless controller, and damn it looked sleek as hell.

Unfortunately, by then, Microsoft had been entrenching itself into homes for almost a full year, and Sony would have a rough go of catching up when the 80GB version of its console was retailing for $600.  And even if you concede that Sony lost the fight for the last console generation, the allegiances pledged and forsaken during this time would define it for years to come.

Buying a console is a hallmark of a new gaming cycle.  Think of it like Christmas, if Christmas came once every seven years or so.  A console wasn't just a toy or some piece of equipment for your entertainment center like a new DVD player that could be swapped out for $50 year-on-year.  A console was an investment.  Whatever you connected to your television in those early days would likely remain connected for the next decade.  The weight of that decision can overwhelm the casual gamer who is coming from outside the hype, and could start wars of rage among those who have already been indoctrinated into the ultra-competitive gamer culture.

I eventually caved a year or so later and got an Xbox 360 anyway, and while I got far more frequent and consistent use out of my Playstation 3, playing the Mass Effect series on my 360 was worth the price of the console alone.

With my loyalties firmly behind Sony, I was all too eager for them to give me an excuse to upgrade to the Playstation 4.  That excuse came in the form of Uncharted 4, and it did not disappoint.  With a lower retail price and beefier specs than the Xbox One, it seemed obvious which console to buy, especially after Microsoft blundered its way through the initial unveiling of its new baby.

Once again, I felt assured that the console I bought back in February would be decorating my entertainment center for the next ten years or so.  As a gamer, I was content.

Then it all came crashing down at E3 2016, when Microsoft announced their mysterious Project Scorpio and Sony confirmed longstanding rumors that they'd been brewing an early successor to the PS4.  Both consoles promised more power, but each company assured its customers that these consoles would not only be backwards-compatible with the existing catalog of current-generation games, but that all games going forward would be available for both the existing consoles and the consoles to come.

That should have been enough to assuage my anger, but I've heard similar promises before.  Anyone who's played MLB: The Show on a Playstation 2 when they could have been playing a much smoother, better-looking version on Playstation 3 can tell you that the promise of every game running on every console is not as reassuring as it may at first appear.

What were these guys thinking?  It's not like the Xbox One and Playstation 4 have grown long in the tooth!  They still average $300-$350 a pop!  That's a far cry from the $100 Xbox 360s or Playstation 3s you can get these days!  Why would they even bother introducing new hardware when this console generation is just starting to hit its stride?

Well, to be honest, there are a few reasons.

Firstly, there's the adoption of 4K.  Ultra-High Definition seemed like an expensive luxury a few years ago, but 4K TVs are finally reaching a price point that makes them attractive to those who couldn't previously afford them.  Companies like Netflix and Amazon have pledged to produce content in 4K, though the real boom for the format will come when broadcast television makes the switch (which is a huge can of worms for a whole other article).  Say what you want about the lack of 4K content (which is keeping me from splurging on a new screen of my own), but those looking for the crispest video experience are expected to flock to 4K in 2016.

Unfortunately, one thing you won't be able to do in 4K is play video games.  As powerful as the latest crop of hardware is, it's barely powerful enough to push 1080p at 60 frames per second.  Quadruple that pixel count to 4K, and you'll be lucky to see anything resembling a playable experience.
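For the curious, here's the back-of-the-envelope math (rough, illustrative numbers only, not official console specs):

    # Rough pixel-throughput comparison: 1080p vs. 4K at 60 frames per second.
    # These are illustrative targets, not measured console benchmarks.
    pixels_1080p = 1920 * 1080      # ~2.07 million pixels per frame
    pixels_4k    = 3840 * 2160      # ~8.29 million pixels per frame
    fps          = 60

    print(pixels_4k / pixels_1080p)     # 4.0 -- 4K really is four times the pixels
    print(pixels_1080p * fps)           # ~124 million pixels per second at 1080p/60
    print(pixels_4k * fps)              # ~498 million pixels per second at 4K/60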

The upcoming Xbox One S may be able to play 4K video, but if Microsoft has its way, it looks like Scorpio is groomed for 4K gaming.

The next elephant in the room is Virtual Reality games, which were prominently featured at E3 this year.  As with 4K, the Xbox One and Playstation 4 are nowhere near powerful enough to deliver a smooth VR experience; and when it comes to VR, smoothness is everything.



So while Microsoft didn't announce its own VR hardware, Sony went ahead and put a price tag on its headset.  And while Sony promises that existing PS4s will in fact be able to use the upcoming Playstation VR, it's unclear whether the system has the "oomph" needed to render two 1080p views at a constant 60-90 frames per second.

That's all well and good, but I was still brimming with frustration.  I'm not what most people would consider a "hardcore" gamer.  Sure, I enjoy video games.  A lot of video games.  And yeah, I like to pimp out my PC to make it the best it can possibly be.  I love beautiful games, and I love them at smooth framerates.  Did Uncharted 4 start to stutter a little bit towards the end when things got hot and heavy?  Sure, but the game itself was never unplayable.  In fact, aside from one instance in which the game froze toward the end of a rage-inducing firefight, the game looked incredible, and I never noticed a perceptible dip in performance until the very end.


It's not a question of whether I'd like a console that could play Uncharted 4 with no hiccups whatsoever.  It's a question of whether I'm willing to pay for a new console, effectively replacing my less-than-a-year-old one, for that privilege.

I also have to ask myself how quickly I'll be dipping into the world of VR, and whether the current crop of promised experiences is worth the investment in a headset that by itself costs as much as (if not more than) the console required to use it.

For me, the answer is no.  Not only can I not afford this upgrade, but there's simply no motivation for me to upgrade right now.  However, coming to that realization helped me understand that I was looking at this all wrong.

As a PC gamer, I'm used to new hardware coming out on a more-or-less annual basis.  When it does, I have to weigh the pros and cons (mostly financial ones) to see if the investment in new hardware is really worth the performance boost.  Most times, the answer is no.

But I'm also a console gamer, and the console gamer in me is used to investing in one machine over seven or so years before needing to drop another $400-$500 on a new one.

See, the language of these industries is different.  PC language says, "If you can afford it, we have it," whereas console language used to promise a one-stop shop: a system that "just works."  New PC hardware meant that elitists could upgrade to the latest and greatest, but new console hardware meant that the industry expected you to upgrade to whatever it was selling if you wanted to stay relevant for the next year.

Then I took a look at my phone.  There's a notification from AT&T that I am eligible for an upgrade.  But I'm staring at my iPhone 6s thinking to myself, "Apple hasn't even come out with a new flagship phone for me to upgrade to!  And even if they did, there's no reason for me to upgrade this phone!  It works fine!"

That's the way we will start thinking about consoles: Sure, there's something better out there right now for those who can afford the upgrade, but mine's working just fine for now.

The difference is, whereas not upgrading from a Playstation 2 to a Playstation 3 meant you couldn't play God of War III, not upgrading from a PS4 to a Playstation Neo doesn't shut you out.  You'll still be able to play all the games that are coming out for this console generation.  Will they look quite as pretty or play quite as smoothly on your current hardware as they will on the new one?  Probably not.  And if that bothers you enough to sink more money into a Neo or a Scorpio, then the option is there.

But for someone like me, who can't afford to upgrade their console or their phone every two years, the promise is that I'm still part of this console generation.  I'm not being left behind.

Now, obviously at some point they will phase out the Playstation 4.  More and more games will become exclusive to the consoles with the hardware capable of presenting them as their creators intended, and then gamers will have to choose whether or not to upgrade.

Hopefully that day is a good five or six years from now.  By then, something newer than Scorpio or Neo will be on the market, and VR will be cheaper than $400.

By then, the idea of having a choice will be more liberating and less intimidating, and we as a gaming culture will see upgrade cycles as a freedom of choice rather than a requirement.

Wednesday, June 15, 2016

WWDC 2016: A Reboot for Apple


Apple's annual summer developers conference was met with mixed reviews by the adoring faithful, but it marked a much-needed return to the "magic" the folks in Cupertino are known for.  With a complete omission of hardware-based hype (no new iPhones, Macs, or iPads), the focus was solely on improving the "Apple experience" across every existing platform, and that is something the company desperately needed.

Just over a week ago, I proposed that Apple's muddled design language has been a sore spot for consumers even as its hardware continued to evolve (albeit at an incremental rate).  Using iTunes as the primary example, I argued that interacting with the same content across iOS, OS X, tvOS, watchOS, and even Windows is drastically, frustratingly different.  At this year's Worldwide Developers Conference, Apple seemed determined to rectify that by not only introducing new features to each of these platforms, but redesigning the interfaces to provide a more cohesive user experience across devices.  Is it perfect?  No.  Will it solve all the problems plaguing the Apple ecosystem?  Probably not.  But it is a welcome step in a more proactive direction.  Rather than bury its head in the sand with some hyperbole about the inherent "magic" of simplicity, Apple is working hard to bring that magic back to the fore, and it showed.

In some ways, this was Apple's most Apple-like press conference in years.  Here are some of the reasons why.

Craig Federighi

As Apple's CEO, Tim Cook is obligated to open and close the show.  But to say he lacks the instant charisma and presence of Steve Jobs is an understatement.  Despite getting a hearty round of applause upon gracing the stage, Cook lacks the understated "cool" factor that Jobs radiated when he was at his best.

There's nothing wrong with Tim Cook.  He's done a fine job as CEO, particularly at a time when Apple is under tremendous scrutiny from its shareholders and the government, and under immense pressure from competitors like Microsoft, Google, and Amazon.  As a presenter, though, Cook feels less like a rock star and more like the boss showing up to the company picnic.  "Hi, everyone.  I'm here.  I'm in charge."

When Cook starts spouting numbers, it's with the boardroom finesse of a seasoned business executive instead of the enthusiasm of someone drinking the Apple Kool-Aid.  This may seem nit-picky, but the truth is that Apple's success has less to do with "groundbreaking" products and more to do with the culture and mystique surrounding the company as a whole.  As an audience, we may love our Macbooks and iPhones, but that enthusiasm wanes after just a year of using them.  Already my iPhone 6s feels standard rather than "premium."  My 3-year-old Macbook Pro is starting to lose its luster in light of thinner, lighter, sleeker machines (even if they're grossly under-powered).  Apple's job isn't just to sell us new products and services every year; it's to re-ignite that "Reality-Distortion Field" Jobs was so famous for: pulling us back into the aura of Apple where everything is "magical" and every new product is a revelation.

Cook is not the man for that.  He does a fine job talking about Android to iOS migration, Apple's green energy initiatives, or educating the next generation of programmers, but he's not the man to be touting the "magical" new products or services we're there to see.

That's where Craig Federighi comes in.

Now, don't get me wrong.  Just watching this iOS 7 reveal video featuring Craig makes me think this guy's been brainwashed by some Apple voodoo shaman, but listen to the difference in tone from when Cook is on stage talking numbers to when Craig is on stage talking about embedding Siri into macOS.

Federighi has what Cook needs: the ability to sound truly passionate about what he's showing you.  Was anything at WWDC truly revolutionary?  No.  But I'm excited about it because Federighi made me believe these were features I not only wanted, but couldn't wait to have!  Even with hiccups in the on-stage demo and a clear fetish for emoji, the promise of more responsive apps in watchOS and a redesigned iOS Music app gives me hope that my experience with the products I use every day is only going to get better.

In contrast, Apple should be keeping Eddy Cue off stage whenever possible.  Cue may be Apple royalty and a genius behind the scenes, but he was not meant to be on camera in front of a live audience.  He swaggers onto the stage in "casual" dress and a smile that's a little too happy to be there.  Whether intentional or not, it's an example of how the Apple hubris can harm when too prominently displayed.  Remember: we're attempting to distort reality here, and Cue takes me right out of it and reminds me that this is nothing but a product demonstration by a tech company.  Cue seemed caught between being over-rehearsed and not having rehearsed enough.  Doing his best to sound excited, he often stumbled over key parts of his presentation, always looking down at the teleprompter even as the products he talked about flashed on the screen behind him.

On the flip side, Federighi comes across like a kid in a candy store dressed like a professional adult.  His smile is the kind that's not just happy to be there, but excited to show you what he has for you.  That excitement permeates the room and the audience.  When he speaks, it's deliberate but never forceful.  It's the tone someone uses when they don't want to openly disagree with you, but will give you just the right information to make you question your outlook.  He's not forcing a product in your face; he's inviting you into his world, and whether your mind realizes it or not, that makes a huge difference not just in our perception of him, but in our perception of the products he's displaying.

Even more so than Sir Jonathan Ive, Federighi carries with him a vestige of Steve Jobs: the power to change our minds, and the childlike wonder that still manages somehow to be in awe of itself.  That's where the Apple "magic" comes from, and it's very reassuring to see its return.

No Gadgets

Even if you're familiar with Apple's retail schedule and knew we wouldn't see a new iPhone until the Fall, you were probably expecting some sort of hardware unveiling at WWDC.  Rumors are still circulating about a refreshed Thunderbolt Display, an Apple Watch 2, and a refreshed Macbook line with current-gen processors and even OLED touchbars.  You probably hoped one of these would show up at WWDC.  None did.

That's a sore point for many Apple fans who get their jollies out of upgrading to the absolute latest and greatest every year, but the lack of gadgets at WWDC made one thing crystal clear to everyone watching and attending the conference:

Apple is committed to its software development community.

Obvious, right?  (It is a developers conference, after all!)  However, the long-reaching effects of this message may not be immediately clear.

This may seem fundamental to the way mobile software is distributed, but we often don't take into account how developers get their software to us.

A great example is the idea of the "free app."  I know plenty of friends who simply refuse to pay for an app.  They'll use the ad-infested free version if it means they don't have to pull out their credit card.  There's something to be said for frugality, but most people don't even recall that on the other end of that app is a developer that sank time, money, and resources into getting it to you.  That $1.99 may seem like a lot to pay, but when you consider that Apple takes a 30% cut of every app purchased, suddenly that $1.99 becomes less than $1.40 in the developer's pocket.  A cup of coffee costs more than that.
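If you want to see that math spelled out, here's a quick sketch (the $1.99 price and 30% cut are the figures mentioned above; everything else is just illustration):

    # What a developer keeps from a $1.99 sale after a 30% store commission.
    price = 1.99
    store_cut = 0.30

    developer_take = price * (1 - store_cut)
    print(round(developer_take, 2))     # 1.39 -- "less than $1.40"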

It's not just Apple doing this.  Virtually every mobile app store follows the model established by iOS.

Developers often feel powerless and beholden to this business model, since there's no other way for them to get their software onto the device short of asking users to jailbreak their phones.  They're expected to just "suck it up".  It may not be a big deal for a game like Angry Birds, which has been downloaded millions of times across multiple platforms, but it is a big deal for the small studio putting out an app or a game for the first time.  Not only is it difficult to get people to pay for an app from an unestablished company, but doing so then means that Apple is still entitled to nearly a third of the revenue just for hosting the app in its store.

Meanwhile, the App Store in OS X is at a crossroads as well.  The difference here is that developers have a work-around: rather than hosting their software through Apple's App Store, they can just direct users to download it from their own website.  If I want a copy of, say, Scrivener, I can download it from the App Store, or I can go to Scrivener's website and download it there.  The latter bypasses Apple completely and ensures that the money I fork over for the software goes directly into the developer's pockets.

Apple is all too aware of this and has decided to rethink its model.  Soon, Apple will take just 15% of purchase revenue (starting with subscriptions that customers keep beyond their first year), effectively cutting its share in half and ensuring that developers receive up to 85% of the money paid for their software.
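Here's what that shift looks like on the same hypothetical $1.99 purchase (a simplified sketch of the split; it ignores exactly which purchase types and time thresholds qualify for the new rate):

    # Developer's share of a $1.99 purchase under the old and new commission rates.
    # Simplified: actual eligibility for the 15% rate depends on Apple's terms.
    price = 1.99

    old_take = price * (1 - 0.30)   # 30% commission
    new_take = price * (1 - 0.15)   # 15% commission

    print(round(old_take, 2))       # 1.39
    print(round(new_take, 2))       # ~1.69 -- roughly 85 cents on every dollar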

That sounds generous of Apple, and it is.  It's very possible Apple will see a significant drop in App Store revenue because of this, and that's a risky situation considering shareholders like to pay attention to things like revenue streams.  What choice does Apple have?  It could simply keep the existing model and force developers to eat the cost of doing business with it.  But that's not exactly prudent on Apple's part.

Why not?  Because, as any anti-Apple zealot will tell you, Android is the most widely used mobile OS in the world.  Now, we can get into the minutiae of the plethora of devices Android runs on (everything from $50 phones to $1,000 tablets), but the truth is that while iOS may be perceived as the standard-bearer for mobile computing, Android is the system more people see every day.  It's the Windows 95 to iOS's Mac OS.  It may not be as flashy, but it's flexible, customizable, and growing more capable every day.

Apple can still play the "But we're Apple" card, but if we're talking sheer numbers, developers who want to get their apps onto the most devices develop them for Android.

By cutting its commission in half, Apple is practically begging developers to keep iOS at the forefront of their development cycle, and that's very good for developers and users alike.  It ensures that developers get the exposure they want on all platforms without sacrificing revenue, and it ensures that iOS users get the best and widest selection of apps available.

A Better Experience

There was another message sent by Apple on Monday, and while it's a positive one, it's distinctly un-Apple-like.

Contrary to what it might have you believe, Apple is very much aware that its software isn't perfect and is committed to improving the software experience across devices.

Right out of the gate, Apple's vice president of technology Kevin Lynch admitted that load times for watchOS apps were poor, to say the least.  An on-screen demonstration showed us waiting several seconds between the time we tapped an app and the time it appeared ready.  To remedy this, Lynch described watchOS 3 as being able to keep certain apps running in the background and constantly refreshing so they would be instantly accessible at a touch.  (There was no word on how this would affect the Watch's battery life.)

Without outright saying, "We know it takes apps forever to load on Apple Watch," and instead using positive language like, "You deserve apps on Apple Watch that respond as seamlessly as apps on your iPhone" (paraphrasing), Apple manages to come out looking like a hero for addressing a problem that's existed since the launch of Apple Watch over a year ago.

The same can be said for the Music app in iOS.  Even though he didn't outright declare what a mess the current Apple Music app is, Eddy Cue was adamant about the ease and beauty of the new user interface.

tvOS gets new features like the ability to automatically download corresponding apps when they are downloaded on your iOS devices (e.g. download the ESPN app on your iPhone, and it appears on your Apple TV).

And of course, OS X--or rather macOS--now gets Siri and a host of other features that go along with that.

Apple probably could've held a press conference on iOS alone (which might have made the duration of its Messages demo a little more bearable), but instead it presented a picture of a cohesive experience and commitment to improvement across all major platforms, and that's something they should be commended for.

Not every problem will be fixed by the Fall roll-out of these updates, but it does show that Apple is at least listening to its customers and trying its best to improve the experience without sacrificing the parts that make it uniquely "Apple."

So, no, you won't be able to brag about being the first of your friends to get the newest iPhone.  And yes, October will most likely bring with it the promise of new Macbooks, iPhones, and maybe even a new Apple Watch, so you have that to look forward to.

But even if you don't buy a new Apple product this year--even if you're still rocking that old 4th-gen iPad from 2012--Apple has promised you that your experience with those products will get better this year.

And that is something to be excited about.

Tuesday, June 14, 2016

Sony Wins E3 (Again)






Even as a Sony fanboy, I was ready to concede to Microsoft following their E3 press conference.  New exclusive games, cross-platform cohesion, and the addition of two new consoles gave the folks from Redmond a huge head start early Monday afternoon.  All the heavy hitters were there, including a new Gears of War and Halo Wars, as well as Sea of Thieves, which had me salivating for the competing console.

In addition to its software, Microsoft beefed up the hardware with the introduction of the Xbox One S: a slimmed-down version of the flagship console that now supports 4K video output for movies and TV shows.  But what about 4K games, you ask?  Well, they took care of that, too, with a brief video package for Project Scorpio: Microsoft's next big leap forward in the console space.  Unlike either of its predecessors from this generation, Scorpio promises to finally deliver the dream of 4K gaming to the console market, as well as support for the burgeoning realm of virtual reality gaming.

The best part?  All consoles would be compatible with all software.  In other words, games for Xbox One would be playable on both Xbox One S and Project Scorpio (Xbox Scorpio?).  As Xbox Chief Phil Spencer put it, "No one gets left behind!"

It was a pretty sweet salvo that I'm sure was intended to take some of the shine off Sony, which has roundly been considered the "winner" of E3 since the launch of the PS4.  And while new hardware was rumored and even confirmed by people from within the company, I don't think anyone could have imagined that Sony would not only win E3, but do so without revealing a Playstation 4.5.

But they did, and here are the five reasons why:

1.)  God of War (4)


The opening moments of a press conference often dictate the tone for the next hour or so.  Generally, these openings feature a video package showcasing the brand and major IPs from first- and third-party developers.  You'll then see the host of the conference emerge to talk about how great their hardware and games are, and then they'll start unveiling some game trailers.

Sony bypassed most of that.  After an extended opening overture by an in-house orchestra, the curtains rose and the lights dimmed to reveal the familiar "Sony Interactive Entertainment Presents" title screen.

Rather than a three-minute glimpse of whatever lay beyond, we were treated to real-time gameplay of a young boy in an ancient village.  As we wondered what franchise we were looking at, the audience collectively gasped and applauded when a hitherto disembodied voice emerged from the shadows to reveal the aged yet iconic Kratos.  What followed was a solid ten minutes of third-person gameplay unlike anything God of War fans have seen before.

It was an astounding way to grab the audience's attention and keep them glued to the stage, and it set the tone for a press conference tailored for Playstation gamers (more on that later).

2.)  Kojima & Reedus: Death Stranding

If returning IPs weren't enough, returning developers were even better.

The reveal of Hideo Kojima was received with the type of fanfare usually reserved for rock stars.  What followed was an enigmatic trailer featuring a naked Norman Reedus in a desolate wasteland, cradling an infant that may or may not really exist.  Little is known about Death Stranding, but the promise of Kojima's genius seeps from every pore and leaves the audience straining for a glimpse of what comes next.

3.)  10/25/16

Chances are, even if you're not good with dates, you'll have a hard time forgetting this one.

The first hint at a new title from Shadow of the Colossus developer Team Ico was dropped as far back as 2008, when the studio confirmed it was working on a new IP for the freshly released Playstation 3.  Another article two years later suggested it was at one point slated for a 2011 release.  Almost five years and an entire console generation later, The Last Guardian re-emerged at E3 last year to deafening applause, and while we were given some actual gameplay to admire, we still had no idea when we'd get our hands on it.

That question was answered yesterday, when the latest trailer revealed an official release date for the long-awaited title.  Unlike many of the games showcased at E3 that were scheduled for release next year, fans were elated to learn they would only have to wait a little over four months to play a game that--at this point--has little choice but to live up to the hype.

4.)  Games.  Just Games.


Unlike every other company at E3, Sony took a minimalist approach to its press conference.  The hour-long showcase featured little talking.  Aside from Hideo Kojima, no developers were featured on stage.  Instead, the focus was on games and gameplay.  Even those demoing the software were hidden backstage, with their output projected onto the screen.

Don't get me wrong: E3 is a tremendous opportunity for gamers to get a glimpse of the people that work so hard to deliver the entertainment they love.  There is no reason developers shouldn't be a prominently featured part of a game-centric expo.

But by cutting down on the chatter, Sony managed to enthrall gamers with one hit after another.  Much of the conference was little more than gameplay and trailers shown back-to-back-to-back.

From God of War to Final Fantasy XV to Horizon: Zero Dawn, Sony was relentless in its focus and execution.  The press event flew by as we were whisked from one title to the next with little more to indicate a transition than a few moments of silence as we held our breath for what lay ahead.

Sure, it felt like one big, hour-long commercial, but it was a commercial you couldn't step away from.

5.)  Playstation VR

No matter which camp you belong to, VR was on display in a big way at E3 2016.  Whether it was Bethesda's promise of a truly immersive Fallout 4, Microsoft's commitment to all-new VR-ready hardware, or Ubisoft transporting us to the final frontier, it's clear that even while virtual reality gaming may be in its infancy, it's not going anywhere anytime soon.

With the Oculus Rift and HTC Vive now available to PC gamers, the question was how Microsoft and Sony (and presumably Nintendo at some point) would bring the VR experience to the console community.  Would living room gamers be willing to pay upwards of $600 for a peripheral and possibly purchase another console to drive it after just investing in new hardware less than four years ago?

Sony answered these questions not only by unveiling a $399 headset (that's $200 cheaper than the Oculus Rift and $400 cheaper than the HTC Vive), but also by assuring the Sony faithful that these headsets would work on the consoles they were gaming on right now.  Sony accomplishes this by including a "processor unit" in the box that plugs into the PS4 to handle some of the extra processing needed to get the expected performance out of the system.  An extra $100 gets you the Playstation Camera and two Playstation Move controllers.

The announcement was similar to when Sony unveiled the Playstation 4 back in 2013: undercutting the Xbox One price point by $100.  I have little doubt Sony is going to be selling these headsets at a loss in the hopes that the games draw people into their ecosystem, but it's an impressive feat nonetheless that makes VR instantly more attractive to casual gamers.

Sony started off strong with this console generation, and judging from its track record over the past three years, it has no intention of letting up anytime soon.

Tuesday, June 7, 2016

Extreme Makeover: iTunes Edition









Think back to your first iPod.  Chances are we all remember something different: some had the physical wheel of clicky buttons, others had the smooth touch interface of the click wheel.  Still others may have had one of the early iPod Touch models.  I remember being truly enraptured by the boundless possibilities of having 20GB of music in my pocket, available to me at a moment's notice with the simple scroll of the responsive click wheel and the glow of that grayscale screen backlit in blue.

I also remember the seamlessness of iTunes: importing my considerable collection of CDs and allowing them to mingle with new music I was purchasing from the store at $0.99 each.  I was hesitant to commit to a purely digital listening experience and clung to buying CDs for a couple of years afterward, but by 2007 I was a full-fledged believer, and I haven't bought a physical CD in years.

iTunes seemed almost as magical as the iPod itself: add your music, plug in your iPod, hit the "Sync" button, and voila!  Your music in your pocket!

As the iPod continued to evolve, so did iTunes.  In addition to music, we were soon able to purchase music videos and movies and listen to podcasts; and all of it could be synced with our beloved iPod.

Then the iPhone and the App Store arrived.  Soon, customers were downloading their entertainment on the go.  No need to "plug in" to iTunes.  You want a new album to listen to?  Go get it!  It's right there in your hand!


To say that iTunes has not aged well in the post-smartphone age is an understatement.  The program continues to be an amalgamation (or abomination) of functions that seem both half-baked and antiquated.  Even the name "iTunes" harkens back to an era when digital music was a novel concept.  It says nothing of the full breadth of content the app attempts to juggle.

The end result is a convoluted mess that tries to balance a plethora of functions, but does none of them particularly well.  Following a redesign that attempted to give iTunes a "simpler" interface, Apple recently returned to the classic "sidebar" after users complained of not being able to navigate the app intuitively.

Speaking of which, has anyone tried to import a CD lately?  Let's put aside the fact that Apple doesn't include an optical drive on any of its computers (save for the aged 13" Macbook Pro that's cleverly hidden at the bottom of its webpage).  Let's just focus on the act of importing a CD.  Upon inserting the disc, iTunes will, with an impressive degree of reliability, search its database to see which disc you've given it.  It automatically fills out track names and will even grab the album artwork.  In most cases, it will immediately ask if you want to import the disc into your library.

Now, if you're like me and a little OCD about your music, you like to have everything conform to a particular standard: most of my music is imported at 192kbps (a nice balance between quality and storage).  Since I hadn't imported a disc in years and had never configured the import settings on the iMac where most of my music is stored, I figured I'd double-check my settings.  So when iTunes asked me if I wanted to import the disc, I said no.  Instead, I went into the Preferences to find out what my import settings were.
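For anyone wondering what that "balance" looks like in raw numbers, here's a rough sketch (approximate sizes for a four-minute track, ignoring container overhead and variable bitrates):

    # Approximate file size of a 4-minute song at a few common bitrates.
    minutes = 4
    for kbps in [128, 192, 256, 320]:
        megabytes = kbps * 1000 / 8 * minutes * 60 / 1_000_000
        print(kbps, "kbps ->", round(megabytes, 1), "MB")
    # 192 kbps works out to roughly 5.8 MB per track: noticeably smaller
    # than 256 or 320 kbps, but still perfectly listenable.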

Confirming the settings was one thing.  Getting back to the disc was quite another.  Even now, I can't recall what I had to click to get back to the "Import" screen.  Suffice it to say there was some Googling involved.  (Keep in mind, this was before Apple restored the sidebar, which clearly shows the disc as a mounted device in iTunes.)

Okay, that's a minor complaint.  Aside from audio enthusiasts who crave pristine, CD-quality sound, I can't imagine there are too many people importing CDs into iTunes these days.  I could go on about the many issues with the Music portion of iTunes (like iCloud Music Library wreaking havoc on users' libraries), but let's see what else goes on in iTunes.

Video.  Sounds simple, right?

iTunes breaks video down into three primary categories: Movies, Music Videos, and TV Shows.  Once purchased from the video section of the iTunes Store, they appear in their respective sections in iTunes.  That's all well and good if your primary viewing experience is in front of your laptop or desktop.  But this is the age of AirPlay and Chromecast.  The idea of sitting in front of a dedicated device to watch anything other than a YouTube video is archaic.

So let's pull that video up on our iPad, shall we?  You'll stumble a bit here if you're a first-time iTunes movie user.  Whereas video is an embedded function in the iTunes desktop app, Videos is its own dedicated app in iOS.  So rather than tapping the Music app (whose icon still bears a resemblance to the desktop iTunes client), you're actually going to tap "Videos", and then navigate to the kind of video you downloaded per the three categories above.

Downloading a podcast?  Same deal.  You can subscribe to and sync your podcasts in iTunes.  You would think the Music app would be where to find them on your iPhone (I mean, it's still audio, at least).  But Podcasts is its own app as well, and unless it's something you use regularly, you'll likely have to remember where you left it when you moved it off the main pane of your Home Screen.




There's little consistency to the design language between iTunes and its iOS counterparts.  What was once seamless and "magical" is now confusing and frustrating.  Finding content in iTunes is vastly different from finding it on your iOS device, and that's not even counting a device like the Apple TV, which, while similar to iOS, speaks yet another design dialect.

What's baffling is that Apple has already established a more consistent design language for its apps, but only in a few instances.

Firstly, there is iBooks.  Granted, iBooks content is still synced via iTunes, but the actual purchasing and reading of iBooks is delegated to a dedicated app on both iOS and OS X.  If I want to download a digital copy of Treasure Island, I open iBooks on my Mac and purchase the book.  With my Apple ID, the purchase is synced over iCloud, and I can pick up my iPad, open the iBooks app, and see that the book I just purchased is ready for me to read.

Furthermore, take the iWork suite of apps: Pages, Numbers, and Keynote.  Each of these apps allows me to create content on my computer, save it to iCloud Drive, and open it on my iPad or iPhone using the corresponding app on each device.

Anyone who's used a '90s-era Macintosh may remember AppleWorks: the Apple equivalent of the ubiquitous Microsoft Office suite.  It had one icon.  Clicking on it presented you with a choice of what you wanted to create: a word processing document, spreadsheet, slideshow, or database.  It was the "iTunes" of Apple office software.  At the time, it was convenient not having to scroll through the hundreds of programs on my Mac to find the one AppleWorks function I wanted.  This philosophy is probably the reason Apple is hesitant to break up iTunes in 2016, but it actively conflicts with the company's overall design.

A couple of years ago, Apple overhauled the look of its desktop operating system to more closely resemble the look and feel of iOS 7.  There's a clear dedication by Apple to develop a uniform language for its interfaces across mobile and desktop platforms.  The Pages icon in OS X is the same icon I click on in iOS.  The same goes for Maps, Notes, Calendar, Mail, Safari, Contacts, and many other native applications, the exception being iTunes.

For people who own smartphones (regardless of platform), iTunes is largely obsolete beyond the scope of media playback.  Virtually all the content I can access in iTunes is accessible directly from any of my devices.  But playback is still important.  If I'm writing, I don't want to have to type on my laptop while relying on my iPhone for something to listen to.

Rather than insisting that iTunes be a catch-all for my digital entertainment, spinning each of those components off into its own app would not only make the design more consistent across platforms, but would also trim down iTunes (preferably renamed simply "Music") and allow programmers to focus on making it a better music player and organizer.  By giving each of these functions its own app, we get both a cleaner workflow and more intuitive, consistent interfaces.  Each piece of content is already synced over iCloud, so there's no reason to force users to open iTunes to ensure the content is shared properly.

With iCloud, Apple promised us a world in which our mobile devices would no longer be tethered to our aging, stationary desktop computing lives.  To fully embrace that future, we need to cut our dependence on legacy apps and services.  At the heart of its "magic", Apple has long contended that "less is more."  In this case, however, "more" may be "less."