Wednesday, May 28, 2014

Apple’s New Taste

As far as we can tell from the outside, Apple has had four people highly influential in setting the company's taste during its golden era: Steve Jobs, Jony Ive, Ron Johnson, and Scott Forstall. Singling out these folks alone is an oversimplification, but they certainly have had outsized influence.

Ive is now the design chief for both hardware and software, but Jobs, Johnson, and Forstall are gone. Ive will certainly continue setting trends and direction, but he can't do it alone for the whole company. Tim Cook is generally known to be a numbers guy and not really a replacement for the taste-making roles that Jobs and Forstall filled.

The hire of Angela Ahrendts along with the Beats acquisition might be Cook's way of injecting some new talent into that area. It feels odd to think about Apple turning over its chief taste makers all at once, but the old guard was around forever. Ive became an Apple employee in 1992, and Jobs and Forstall came in the NeXT deal in 1997. Johnson was the newbie, only coming on in 2000. The positions haven't been open for a while (except Johnson's retail head position, of course).

Jimmy Iovine and Dr. Dre joining Ive and Ahrendts as the company’s top taste executives makes as much sense as anything for the Beats acquisition. Cook’s internal letter talks as much about those two as it does Beats itself, and Beats as a brand is held in much higher regard than any of its products are.

Part of Apple's corporate DNA is having a distinct sense of taste and style. With some of the most important people responsible for putting it there now gone, others have to step in. Iovine and Dre are just the latest two to do so.

Wednesday, May 7, 2014

Monument Valley Is a New Super Mario Galaxy

Monument Valley is a pretty great iOS game that came out not too long ago. If you like impossible objects and M. C. Escher, this is the game for you.

Gameplay-wise, it’s pretty simple. It’s like a point-and-click adventure game, in that you simply tap on where you want Ida to go. If she can get there, she goes; if not, she won’t. You have to step on switches to alter the level architecture, avoid crows that block your path, and use a movable block tower to help you at times.

It reminds me a lot of Super Mario Galaxy in a couple of ways. The obvious one is that Ida can walk on walls and ceilings, just like Mario can in some levels. Some levels in Monument Valley require rotating the stage a full 360°, again like some Galaxy levels.

The other way it reminds me of SMG is that it's not terribly difficult. With few exceptions, Galaxy is not that hard a game for anyone who's not a complete beginner at 3D platformers. It tickles your brain in some novel ways, but once you learn its conventions, it's not overly challenging.

Monument Valley is also fairly easy. The only time I felt stuck to any degree was in an early level where I didn't realize I could tap and drag a piece of the building to open up a new path. Once I learned to recognize what is movable, it flowed easily from there. Some parts felt more like a "click to continue the cutscene" experience than playing through a game.

It's also pretty short. It only has ten levels, some with multiple screens. Combine that with the low difficulty, and the whole thing goes by quickly. The reviewer at Polygon needed three hours to complete it; I didn't time myself, but it took me about an hour, if that. I don't feel bad about the $4 I spent on it because it's beautifully designed and forges an interesting path. It leaves me wanting more, but fortunately, more is on the way.

Super Mario Galaxy left me wanting more too. With that one, it had nothing to do with length. It has 120 level variations to complete, and I did it as both Mario and Luigi (their controls are different, so it’s a somewhat different experience). It almost never upped the difficulty though, and I wanted a challenge. Super Mario Galaxy 2 fulfilled that wish, thankfully.

I don’t know if future installments of Monument Valley will be significantly harder. It seems like the game is more about the design and atmosphere than really being a challenge. And that’s fine! It’s OK for some games to be like that. I just hope the difficulty curve ramps up at some point in the future, even if it’s not the next release.

Monday, March 31, 2014

How I Met Your Mother’s Finale Was Too Late

We finally saw how Ted met the mother of his kids. It was a really nice moment. A nice moment at the end of a very flawed episode of television.

(Spoilers ahead)

The problem with the HIMYM finale is simply that it came too late. If that had been the finale of a hypothetical fifth or sixth season, it might have worked. I realize that it's nearly impossible to turn down CBS when it offers you millions of dollars to continue making a series that peaks at over 10 million viewers a couple of times a year. However, all that extra time ruined the story that Carter Bays and Craig Thomas wanted to tell.

It all goes back to the Ted and Stella storyline from Seasons 3 and 4. Given the show's established narrative structure, themes, conventions, etc., it was clear from that point on that the Mother wasn't going to sneak up on us. If it wasn't 100% obvious already, that plot thread confirmed that the moment of meeting the Mother wasn't going to happen until the series finale. Stella got as far with Ted as anyone was ever going to get without being the Mother.

Given that fact, every plot line involving Ted and a woman was going to feel hollow until we got the flashing neon sign that said, "HERE SHE IS." The remainder of Season 4 after Ted got left at the altar, and even into Season 5, was fine, as we got to see how Ted dealt with overcoming that big blow in his life. Season 5 is also when the Robin-Barney relationship first began, something that injected a lot of new energy into the show.

The opener of Season 6 is when HIMYM first teased Barney and Robin’s wedding. It wouldn’t reveal until that season's finale that it was Barney’s wedding, and it wouldn’t be until far later yet that we learned he was marrying Robin. Nevertheless, that episode is when the show began the end game. It aired on September 20, 2010; tonight was March 31, 2014.

That gap is too wide for the payoff to be satisfying. In Season 5 and really into Season 6, the show basically stopped being about Ted, the ostensible protagonist, and it became about Barney and Robin. It asked the viewers to get invested in Ted’s new crush Zoey, who clearly wasn’t going to be the titular Mother. It had to come up with things for Lily and Marshall to do, as there was no real dramatic tension in their relationship because a thousand flash forwards showed that they never would split up. Their marriage was in just as much mortal peril as Anakin Skywalker was in the Star Wars prequels.

From Season 6 through Season 8, the show basically just put things together in order to break them apart so it could put them back together again. It was marking time, waiting for the last season to arrive so it could finally do the big reveal. Even someone who doesn't overanalyze TV shows would have had the thought at some point: where is this going? Isn't this supposed to be the story of how Ted met the kids' mother? Why is it spending all this time on Barney and Robin? There eventually was an answer—Future Ted was still hung up on Robin—but the show waited until years of frustration had set in to let us know.

So, the finale. It didn’t help its cause that it tried to fit about four episodes’ worth of story into one double-length episode. That made it feel rushed. It also took two storylines that were years in the making—Barney’s transformation from a womanizer into husband material and the Barney-Robin wedding that was the backdrop for every single episode this season—and wiped them away before the second commercial break. Years of buildup gone, just like that.

With Barney, I understand what Bays and Thomas were going for. They wanted him becoming a father to be the thing that finally turned him around. The problem is they had him make too much of that turnaround before getting married. It was an enormous letdown to see him go right back to being his old self, when wanting to be his old self wasn't even why he and Robin split up in the first place.

Of course, the buildup for those things pales in comparison to the buildup of how Ted and the Mother would meet. That event is what the show ostensibly turned on, because it is what Ted's life ostensibly turned on. It turns out that Ted meeting a woman who, as far as any viewer could tell, was absolutely perfect for him, whom he had a long relationship with, and whom he had two children with, was yet another speed bump on the way to him getting with Robin*.

Maybe they could have pulled that off if the series was shorter. Maybe. I don’t know. I do know it couldn't do it after nine years. The buildup for that moment ended up larger than I think the writers ever intended, as it’s evident now that the very title of the series was the first of so very many tongue-in-cheek misdirections. With a shorter run, it might have worked. After this much time, it never had a chance.

I think Bays and Thomas wanted the point when Ted holds up the blue French horn in the last shot to be a moment when the viewers shout, “Finally!” at their TV sets. Instead, that moment happened six episodes earlier in “Sunrise” when Ted let Robin go in a pretty embarrassing CGI sequence. After false start after fake out after aborted run after dead end conversation, we seemed to be past the Ted-Robin thing once and for all.

The show went to that well only to pull out an empty bucket too many times. We were all sick of the will-they-or-won’t-they with those two. The finale had enough to it with seeing the main characters’ developments over the years that it didn’t need one last left turn at the end. With about 12 fewer “either Ted or Robin wants it to work out between them but it’s just not going to happen” sequences, the last moment of the series might have been welcome. But at some point, you stop rooting for either Lucy or Charlie Brown and just want to stick a machete into the football.

Had the writers wanted to, they could have scrapped the planned ending and given us Ted and Tracy living happily ever after. It would have been a bit saccharine, but it wouldn't have been infuriating. We got to know Tracy. She was great. She was just the right person for Ted, more so than Robin ever was.

I doubt Bays and Thomas ever seriously considered going with anything other than the ending they decided on when they conceived the show a decade ago. After all that time and commitment to it, they really couldn't have done anything else. All that time was its enemy, though, and their big ending suffered greatly for it.

*Maybe! We still don’t know if things work out with them!

Sunday, February 16, 2014

Disney’s Frozen Has a Secretly Ominous Ending

I saw Disney's Frozen recently, and it's a really good movie for the most part. It does exist in a bit of a Catch-22, though.

The most interesting character is by far Elsa, the only one who actually goes through a proper character arc. The other people in the movie are largely the same at the end as when they're introduced (one of the trolls even sings that "people don't really change"). It could be a stronger movie if it focused more on Elsa, but it'd be a darker movie for it, and probably too dark to be a children's movie. They also couldn't just go for it and jettison the children's-movie aspect, as so much of the film relies on the viewer not overanalyzing it precisely because it is a children's movie.

Anyway, keeping in mind that this is just a fantasy children’s movie, it’s notable that it’s the most business-focused Disney movie yet. The Duke of Weselton is obsessed with international trade, shopkeeper Oaken gives a quick lesson on supply and demand, and concern for Kristoff’s ice business is a running theme throughout.

Warning: spoilers ahead.

On that note, the ending of the movie is actually pretty ominous from a business perspective.

Elsa's unintentional winter spell in the middle of summer would have disrupted the economy of Arendelle considerably. What few crops there are in the area would have largely died in the deep freeze, some of the livestock could have died from exposure too, with farmers caught off guard, the frozen fjord would be awful for the fishing industry, and the area's logging would certainly be set back a bit. From that alone, Arendelle is probably headed for at least a sharp recession as a result of the movie's events. We know from the Year Without a Summer that winter-like conditions in the summer would be devastating to an early-to-mid-1800s European state like Arendelle.

However, that's not all. Just before the end, Elsa issues a decree that Arendelle will no longer do business with Weselton, its largest trading partner. That's understandable given that the Duke of Weselton sent people to assassinate her, and this preindustrial fantasy land wouldn't have some kind of UN to settle the dispute.

It’s also the last thing the kingdom needs. With the local agriculture and industry severely stunted, Arendelle needs trade now more than ever. Cutting off relations with the kingdom’s largest trading partner will only make the recession that much deeper.

Some of the downturn might get offset by an increase in government spending. When the king shut the castle off from outsiders to protect Elsa early on, he reduced the staff. With Elsa's new open-gate policy, government employment will rise. There also will be more social functions, of which there's been exactly one (the coronation) in the last 10-15 years, which will lead to more spending in the local area. The royal treasury can likely sustain this deficit spending for a while, since it would have built up considerably during the decade-plus of reduced staff and few expenditures. Having a few more castle servants and some fancy parties wouldn't come close to offsetting the full consequences of the week of winter, though.

Elsa would need to act quickly to repair the situation. She would need to send someone, perhaps the regent who ran the kingdom in the three years between her parents’ deaths and her coronation, to find new trading partners. She could also become Europe’s first entrepreneurial monarch. She might be able to bring in tourists from the region’s nobility by doing public demonstrations of her magic powers and setting up tours of her mountain ice palace. She could also travel to nearby kingdoms to create ice art for their special occasions. That money could then go to subsidize the rebuilding of the kingdom’s economy from the damage she unwittingly caused. I don’t know the extent of her powers, which seem considerable, but she might even be able to forestall the year’s coming winter to give her kingdom a chance to produce a few more goods to sell without competition.

Arendelle is probably in a better situation for the long haul with an open and confident monarch ruling it, but in the short run, there will be a struggle to fight off famine. For a seemingly happy movie that is actually the darkest Disney animated feature yet, a seemingly happy ending that is actually quite foreboding is only appropriate.

Addendum

This all assumes that Elsa doesn’t just create automatons from ice and snow to perform all economic tasks. They could work all hours of the day and dramatically expand Arendelle’s economy.

They also would put everyone out of work, creating one of those utopias where everyone can live a life of leisure that philosophers once dreamed of. That would work, to whatever extent it can given that people generally prefer working to doing nothing all day, until Elsa dies. Presumably all of the automatons would then cease to function.

At that point, Arendelle would plunge into a dystopian situation where the entire infrastructure of the economy fell apart all at once. It would be a long, slow slog out of the depression as the populace would have lost all experience with actually running agriculture and industry.

It would be tempting to go the automaton route given the immediate economic crisis coming to the land not long after the credits roll. However, if Elsa allowed her automatons to take over the whole economy, it would be far worse in the long run than accidentally plunging the kingdom into winter ever was.

Sunday, September 8, 2013

Apple Is One Rule Away From Ruling Console Gaming

Apple is very, very close to being able to just about kill off Nintendo's and Sony's gaming console businesses, and perhaps Microsoft's too if the media features of the Xbox One don't work as well as advertised. Only one very Apple-y rule will keep it from doing so.

Let's start with something that leaked a while ago (I'm going off the leak so I don't break the Apple Developer NDA). iOS 7 will support game controllers. Some legit images leaked out a while back, so you can see what they're planning. There are going to be three kinds of controllers. One cradles phone-sized iOS devices and has a limited button set: ABXY, two shoulders, a D-pad, and pause. The next cradles a phone-sized device and adds two analog sticks and two more shoulder buttons. The third kind is standalone (the diagram of which appears to have been inspired by the Wii Classic Controller), and it has the same, larger button set as the second one. The standalone controller image shows that up to four controllers can be used at once.
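To put those two button sets in developer terms: they line up with the two controller profiles in the GameController framework that ships with iOS 7, a basic gamepad and an extended one that adds the sticks and extra shoulder inputs. Here's a minimal Swift sketch of asking a connected controller which profile it exposes (Swift is just my choice of language here, and the print statements and function name are placeholders for real game code):

```swift
import GameController

// Minimal sketch: ask the first connected controller which of the two
// leaked button sets it exposes. `extendedGamepad` is the larger layout
// (ABXY, D-pad, four shoulder inputs, two analog sticks); `gamepad` is
// the limited one. The print statements stand in for real handling.
func describeConnectedController() {
    guard let controller = GCController.controllers().first else {
        print("No controller attached; touch and motion input only.")
        return
    }
    if let pad = controller.extendedGamepad {
        print("Extended profile; left stick x = \(pad.leftThumbstick.xAxis.value)")
    } else if controller.gamepad != nil {
        print("Basic profile: ABXY, D-pad, and two shoulder buttons.")
    }
}
```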

The implications for single-use handheld gaming devices are dire. The Nintendo DS and PlayStation Vita can provide a much wider variety of gaming options than touchscreen phones and tablets can thanks to having buttons. With these cradle controllers, now iOS devices can provide those experiences too on top of everything else they do. Well, they would if not for that rule I mentioned. But that's not all.

Thanks to AirPlay, you will be able to play a traditional controller-based game on iOS while sitting on your couch with the video on the TV. In fact, this setup is like the Wii U, only reversed. The Wii U has a smart box hooked up to the TV with a dumb tablet in your hand, whereas Apple's setup has a smart tablet in your hand that connects to a dumb box hooked up to the TV.

The killer aspect for Apple is pricing. The Wii U, even after its upcoming discount, will go for $299, and it's the least expensive console of the new generation. A lot of people will already have iOS devices, or at least they can justify getting one because they can use it for far more than just games. A person who has an iPhone, iPod Touch, or iPad can buy into Apple's living room gaming setup for a $99 AppleTV and whatever one controller costs. Even if a controller runs $35 or $40 like a traditional console controller, the combined price of roughly $135-$140 is still less than half the Wii U's.

There is an immense advantage to buying into this kind of gaming setup. The hardware in iOS devices gets revised about every year. You won't have to wait six to eight years for Nintendo, Sony, or Microsoft to provide updated specs. Plus, the App Store model makes it far easier for games to get to you and opens the door for a wealth of third-party developers who might never get something onto a Wii U, Xbox, or PlayStation due to those platforms' barriers. And, again, the console part of it would be "free" to someone already committed to buying iDevices every couple of years anyway for their multitude of non-gaming functions.

Now, the red flag. The fact that there are two different button sets is a bit worrisome for fragmentation reasons, but that's not it. It's that Apple has made a rule that says controllers must be optional. An iOS game must be designed for touch and motion first with the controller only being a bonus add-on.

I know why Apple did this. It's to maintain simplicity for the store. It's also to remove a potential support headache. Apple doesn't want people calling it up asking for refunds after they buy a game and find out they have to buy a controller to play it. Having a game in the App Store that requires a controller just wouldn't do at all.
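In practice, the rule pushes games toward a pattern like the sketch below (my own illustration of one way to comply, not anything Apple prescribes beyond "touch first"): the touch path always works, and a connected controller merely mirrors it. The InputRouter class and the jump action are hypothetical names.

```swift
import GameController

// Hypothetical sketch of honoring the "controllers are optional" rule:
// touch input is always live, and a controller merely mirrors it.
final class InputRouter {
    var jump: () -> Void = {}   // wired up by the game elsewhere

    init() {
        // Re-bind whenever a controller connects mid-game.
        NotificationCenter.default.addObserver(
            forName: .GCControllerDidConnect, object: nil, queue: .main
        ) { [weak self] _ in self?.bindController() }
        bindController()
    }

    // Called from the touch layer; this path never goes away.
    func handleTap() { jump() }

    private func bindController() {
        guard let pad = GCController.controllers().first?.extendedGamepad else { return }
        pad.buttonA.pressedChangedHandler = { [weak self] _, _, pressed in
            if pressed { self?.jump() }   // the controller is a bonus, not a requirement
        }
    }
}
```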

It also means that Apple won't kill off the other game console makers as quickly as it could have. Think about some traditional handheld or living room console titles, anywhere from Zelda to Smash Bros. to Madden to Halo. They require a boatload of buttons for a reason. Making a game that functions well both with the limitations of touch input and the freedom of buttons is going to be tough, and the categories of games that require controllers will still not be feasible to provide for iOS.

Apple should know this. It knows well the difference between touch input and bucket-o-buttons input. It's why it keeps iOS and OS X separate. Any gamer can tell you that this rule is a bad idea, and people inside Apple should be able to tell you that too.

As far as the living room goes, this strategy makes total sense for Apple. It can make a limited play for living room gaming while not disrupting its plans for the AppleTV. It doesn't have to turn the AppleTV into a full fledged gaming console on top of everything else; an iDevice, a controller, and AirPlay will cover that use case just fine. It can keep selling $99 hockey pucks to people who have no interest in gaming, which makes far more sense as a living room strategy than Microsoft's apparent gambit of wanting to sell $500 Xbox Ones to people who don't play games.

Between controller support and Sprite Kit in iOS 7 and Mavericks, Apple is making a real effort at competing in games this fall. This one rule that controllers must be optional keeps it from being able to take over everything. Between apps that run on either iPhones or iPads but not both and iBooks Author creations that only work on iPads, Apple already has things in its stores that don't work everywhere. I would have thought that a simple modal dialog box saying something like "This game requires a separate controller. Do you want to buy?" might be enough to allow them to have apps that require controllers, but the powers that be chose not to go that route.

As long as that rule exists, there still is room for dedicated gaming hardware. We'll see how long that rule lasts.

Sunday, April 14, 2013

A Few Good Years Have Passed

Last night, my wife and I watched the 1992 classic A Few Good Men. I was of course familiar with the famous courtroom scenes, but it was actually the first time I had seen it all the way through. My wife hadn't seen it either, but she is in the Navy now, so I figured she'd enjoy it for that reason. Her favorite line actually didn't end up being any of the famous ones. Rather, it was Kevin Pollak's Lieutenant Weinberg wryly stating, "No one likes the whites." This is true; no one she knows likes the Navy's dress white uniforms. It had some inaccuracies that bugged her, though, not the least of which was Tom Cruise's Lieutenant Kaffee treating Demi Moore's Lieutenant Commander Galloway as though he outranked her throughout.

Anyway, I had recorded it off of AMC, and it had little fact boxes popping up at the bottom periodically. It wasn't until one of those boxes appeared some time into it that it really clicked for me why Jack Nicholson's Colonel Jessup was so intense about being on the wall and so forth. He was the commander of the Guantanamo Bay base, and at the time Aaron Sorkin wrote the play on which the movie is based, the Cold War was still going on. Not that Cuba is the United States' friend now or anything, but the implications of the island being Communist were far more important then than now.

I was born three years before Sorkin's play first hit the stage. I can remember old maps from elementary school that said USSR and can recall seeing fallout shelter signage here and there, but I have no recollection of the Cold War and its existential threat to the US. I was four when the Berlin Wall fell; I was six when the Soviet Union dissolved. Even if I had learned about Russian nukes being pointed at my country at the time, I wasn't old enough to really understand the implications.

For my generation, Guantanamo Bay has a very different connotation. It's not an outpost of democracy on the edge of Communist territory; it's a holding cell for War on Terror suspects. The incident that started everything for the plot in A Few Good Men was a Marine shooting a single bullet outward across the fence unprovoked. While that's never something you want to see happen, it probably would be more or less a nonevent these days beyond whatever punishment a Marine gets for unnecessarily discharging a weapon. It wouldn't be an event that could potentially cost lives. Cuba isn't a battlefield anymore. Guantanamo is a very different part of the wall that keeps America safe now.

The climactic scene with Kaffee haranguing Jessup on the stand is still as intense as ever, but it has lost a little something because the film takes for granted that the audience understands its Cold War subtext. I am pretty well versed in history and probably would have put it all together eventually, but it's not something that people in the Millennial generation and beyond will get instinctively. I certainly understand it in an intellectual sense, but I don't feel it viscerally. A young person could make it through the whole thing and think that Jessup is just really cranky because he thinks every member of the military who isn't in an office in D.C. plays a part in guarding the wall that protects the homeland. The latent yet very specific threat of nuclear war is lost in that scenario.

If Hollywood ever decides to remake this film, it will definitely hammer (probably excessively so) on that element during the first couple of acts. The future Jessup will throw around terms like "the Red Menace" to make sure it's clear (crystal, even) that the stakes are tied to the Cold War. For him and his generation, "Cuba" probably primarily conjures feelings surrounding the Cuban Missile Crisis or the Bay of Pigs; for me, it conjures Elian Gonzalez well before any of JFK's incidents down there. It's still a really good movie if you don't have that in the forefront of your mind, but it isn't as good as it can be without that context.

One of the other popup fact boxes said that Rob Reiner had hoped to make A Few Good Men be a timeless movie and that, aside from Cruise's civilian wardrobe, it is. We must add one other caveat besides loud shirts: it's timeless except for its inherent assumption that Guantanamo Bay, Cuba will always have Cold War connotations for its audience. It certainly does not for most anyone younger than 30, and it might not for those older than that anymore either given its prominence in the last decade's news cycle.


Thursday, August 16, 2012

Apple Wants iCloud to Be the World's DVR

The Wall Street Journal has been revealing some details about Apple's plans in the television space. Steve Jobs famously said he thought he had "cracked" the problem of television shortly before he passed away last year, and everyone has been trying to figure out what he meant ever since.

The latest report from the WSJ, if true and I'm interpreting it correctly, likely reveals what Jobs thought was the breakthrough:

The Cupertino, Calif.-based company proposes giving viewers the ability to start any show at any time through a digital-video recorder that would store TV shows on the Internet. Viewers even could start a show minutes after it has begun.

The vision here is pure Apple. The company identified an area of complexity, in this case managing TV recordings, and plans to offer a solution where it simply does it for you. Here, iCloud becomes the world's DVR. There won't be boxes in every individual home making millions of individual recordings of the same programs; there will be one place that "records" the programs (Apple's datacenter), and all of the boxes will stream that copy.

You won't miss a show because you forgot to set up a recording; Apple is recording it for you. You won't miss a show because the DVR filled up; Apple is recording it for you. You won't miss a recording because you're out of free tuners, or because the cable went out, or because a cloud went between you and the satellite. Don't worry. Apple's recording it for you.
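To make the de-duplication concrete, here is a purely illustrative Swift sketch of the idea (every type and name below is invented for the example and reflects no actual Apple API): the service keeps exactly one copy of each program, and each viewer only holds a playback position into it.

```swift
import Foundation

// Illustrative model of a cloud DVR: one stored recording per broadcast,
// shared by every viewer, with per-viewer state limited to a position.
struct Recording {
    let programID: String
    let assetURL: URL          // the single copy in the datacenter
}

struct ViewerSession {
    let viewerID: String
    var position: TimeInterval // how far this viewer has watched
}

final class CloudDVR {
    private var recordings: [String: Recording] = [:]    // keyed by program
    private var sessions: [String: ViewerSession] = [:]  // keyed by viewer

    // "Recording" a program that is already stored is a no-op:
    // millions of requests still mean exactly one copy.
    func record(programID: String, from url: URL) {
        if recordings[programID] == nil {
            recordings[programID] = Recording(programID: programID, assetURL: url)
        }
    }

    // A viewer can join minutes after the show has begun and still start
    // from the top, because the copy exists independently of them.
    func startWatching(viewerID: String, programID: String) -> Recording? {
        guard let recording = recordings[programID] else { return nil }
        sessions[viewerID] = ViewerSession(viewerID: viewerID, position: 0)
        return recording
    }
}
```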

Obvious roadblocks have to be overcome before this vision of the future can come to pass. For one, the WSJ reports that Apple doesn't have a single deal worked out yet with any content providers or cable providers to make this happen legally. For another, this setup requires a completely reliable Internet connection. If the Internet goes out, you not only have no TV anymore (which isn't a given today) but you also can't watch your recordings in the meantime.

Plus, ISPs aren't going to be happy about a system like this because it would put an enormous strain on their networks. They are already playing around with bandwidth caps, and that's without most people getting their TV through the Internet. Perhaps the new H.265 standard will solve this particular issue, but it's not going to be available for anything until "as soon as 2013" (which probably means later than that, given the choice of weasel words here).

This sounds like a really cool way forward. I have my doubts that we'll see anything like it any time soon because content owners, cable providers, and ISPs are some of the worst companies in the world. Of course, Apple worked things out with cell operators, who are just as bad if not worse, so there is some hope out there.