Friday, September 18, 2015

Ad Blockers and the Future of Internet Publishing

Every so often, a debate about ad blocking flares up on the Internet, and the release of iOS 9 with its support for content blockers has reignited it.

I am sympathetic to both sides. I don't block ads, but I do block both Flash and ad trackers. I block Flash because it slows down browsers, and I block the tracking because I don't think it's appropriate for any single entity to know by default what I do when interacting with completely separate entities. It's fine for Google to know what I do on YouTube since it owns YouTube. It's not fine for it to know what I do on every site I visit that runs Google Analytics.

That said, I also make a small amount of money from writing on the Internet, and that money ultimately comes from advertisers. I don't get paid anything close to a living wage for how much time I spend on it, but I'm blessed to be in a position where that doesn't have to matter if I don't want it to. Many people are not.

Right now, proponents of ad blocking list all the abuses of online ad technologies and say, "adapt or die". They might also point to focused (and non-abusive) ad networks like The Deck, or point out that advertisers barely know anything about the effectiveness of their spots on TV or spreads in magazines. Those media couldn't track ad targets the way online advertising can, and that was fine. Maybe just take that attitude online.

Those arguments are fine for a certain set of people whose audiences skew affluent, but they're bad for everyone else in the short to medium term. If you tell advertisers that they will get less targeting, they'll pay lower rates. They already pay next to nothing, so it'd be a financial bloodbath.

It's tough for publishers. The hard fact of the matter is that the supply of content creators far outstrips demand. Internet technology makes publishing content of all kinds easier than it's ever been. A laptop is far cheaper than a printing press. A webcam is far cheaper than a TV studio. The upshot of that fact is that more people want to make a living by writing or making videos or whatever on the Internet than the market can possibly support. Take away the easy avenue of super intrusive ads and some publishers will go away because every other option is really hard.

I know this. I've been writing consistently, year-round on the Internet since mid-2007. By now, I think I've gotten pretty good at it, but "pretty good" isn't good enough to justify me doing it full time. The market has spoken by now. It says I'm not special enough to warrant a full-time gig. I am far from alone.

This is where it gets tough on the publishing side. A lot of new people appear on the Internet every year trying to make it by creating content. Sturgeon's Law says that most of them won't.

But everyone produces a lot of crap when they first start. Everyone who writes a lot says they look back on their early work and cringe because it's so awful compared to where they are now. A ruthless world where only the largest publications survive, and where it's nearly impossible to make money without being part of one of them, means the only people who make it will be those who can afford to write a lot for free to prove themselves to those publications. Only people who are decently well off will be able to break into the business, and that's not an appealing future. I realize it's kind of like that now in a lot of ways, but it has room to get worse.

I don't know what the answer is. Maybe it's micropayments, although I'm not bullish on them. Maybe it's some kind of scheme to essentially pay people to read sites and look at ads, although I'm not bullish on that either. If I did know, I'd be going and doing that instead of writing this essay. The long-term good news for Internet content creators is that the future will have no TV or radio or magazines, only data flowing on the Internet. The ad dollars that go to old media now will go to online media in the future because they'll have to. That'll mean more ad money to go around. It's just that no one knows when that future will arrive, and many creators won't survive financially until then.

I hope there is something between the near privacy-less Internet we have today and the dystopian future without journalism. If it's out there to be found, iOS 9 and content blockers are giving the people searching for it a new sense of urgency.

Saturday, January 10, 2015

Universal’s Harry Potter Areas Put the Rest to Shame

I finally had a chance to go to Universal Orlando for the first time since the Harry Potter areas opened. It highlighted a real problem that the Islands of Adventure park has.

So you know, I grew up in Orlando and even worked at Universal Studios for two summers and a Christmas break during college. I've been to all the Orlando theme parks more times than I can count. I know more about this stuff than a person probably should.

The Harry Potter areas in both Islands and Studios are the most impressively themed areas of a theme park I've ever seen. The new Diagon Alley in Studios is particularly great. Wide lanes for packing in the tourists aside, it really feels like you're walking onto the set from the films. The entrance to the area from the rest of the park is even inconspicuously located in an unmarked brick wall, going along with the bit from the books about Diagon Alley being hidden in London. Everything is just so well done. You could have fun standing just inside the entrance and listening to the cries of surprise and joy when people enter for the first time.

Each Potter area has only one truly new ride. Universal re-themed two old rides in Islands for Hogsmeade—the Dueling Dragons as Dragon Challenge and the Flying Unicorn as Flight of the Hippogriff—but those coasters are exactly the same. Of the two new rides, the Hogwarts ride in Hogsmeade is probably the better one, although its plot is incoherent. The Escape from Gringotts ride in Diagon Alley makes sense, and its queue is the best-themed one in either park, but the ride itself is kind of short.

Aside from the fact that Universal spent almost no money in theming the old Dueling Dragons queue—it used to be the coolest area in the place, but now it's mostly just plain and boring "stone" walls—I don't really have complaints. Everything looks great, the train ride between Hogsmeade and Diagon is a themed ride in and of itself, and the Butterbeer is dangerously good given the astronomic sugar content.

The problem with these Potter areas is that they make anything that came before them look old and tired. Some of that is simple neglect on Universal's part, like the terribly faded pictures on the side of Shrek 4D. Some of it is the passage of time, with Men in Black now mostly looking silly where it once was cool. A lot of it is that the Potter areas are state of the art and had far higher budgets behind them than other attractions (looking at you, Simpsons ride).

The Studios park is mostly fine because it has been getting newer stuff. There’s the Hollywood Rip Ride Rockit roller coaster in the front plus recently opened Transformers and Despicable Me attractions. Its general theme of simply “movies” means it’s easy to rotate things in and out.

Islands is a different story. Hogsmeade is the first major update the park has had since it opened in 1999. The rest is showing its age, and I don’t know how they’re going to proceed other than by replacing large sections entirely.

Marvel Super Hero Island has been a dead end ever since Disney bought Marvel. The major rides there—the Hulk coaster, Spider-Man, Dr. Doom’s Fearfall—do all hold up well. The theming is pretty dated to the late ’90s though, and there’s no way any of the Marvel Cinematic Universe is making its way in. It’ll have to do as-is until it gets replaced some day, but it’s not going to age gracefully until then.

Toon Lagoon is also a dead end, being based on newspaper comics and old cartoons. Children don’t read newspaper comics anymore, and the most famous ones—Peanuts, Garfield, Calvin and Hobbes—aren’t included in the area. It also only has water rides, so if you don’t want to get wet, there’s not much in the area for you to do. Big updates are unlikely because the area is basically composed of cultural relics. In 10-20 years, the name “Popeye” might be better known for fried chicken than for the sailor. It too will have to be replaced entirely instead of upgraded.

Jurassic Park is pretty safe because they keep making more JP movies. Plus, everyone loves dinosaurs. The only big ride there is another water ride, though, so unless you have kids who will look at the kiddie attractions there, people who don’t want to get wet have another entire area to skip. Another ride, perhaps based on the upcoming Jurassic World, would help.

What’s left of the Lost Continent is probably not worth saving. It used to be right there with Marvel as the best area in the park, but Hogsmeade took over its best ride (Dueling Dragons). All that’s left is a Sinbad stunt show and Poseidon’s Fury, a walkthrough show attraction that has always been embarrassingly cheesy. It wouldn’t surprise me if Hogsmeade or even a new Potter area eventually consumed the rest of it. It could even get pincered, with Potter taking some and Seuss Landing taking the rest.

Speaking of, Seuss Landing is fine. Dr. Seuss books are evergreen as a part of children’s entertainment. It could use some sprucing here and there—the Cat in the Hat ride is surprisingly unpleasant—but second to Hogsmeade, it’s in the best shape long term.

There is some conspicuous construction between Toon Lagoon and Jurassic Park, and supposedly that’s going to be a King Kong themed Skull Island. It’ll be a nice nod to people who remember the old Kongfrontation ride from Studios, and apparently it’s based on a new Kong movie that will come out in a couple years. That’s nice and all, but it’ll end up another new thing to make the old attractions look, well, old.

Universal is now spending $500 million per year on its parks, and overhauling Islands of Adventure has to soak up a lot of that money in the coming years. IOA immediately became the most exciting park in Orlando when it opened, but now, everything outside Hogsmeade is just not very thrilling. The place is going to need to look completely different a decade from now to retain its viability.

Wednesday, May 28, 2014

Apple’s New Taste

As far as we on the outside know, Apple has had four people who were highly influential in setting the company's taste during its golden era: Steve Jobs, Jony Ive, Ron Johnson, and Scott Forstall. Singling out these folks alone is an oversimplification, but they certainly have had outsized influence.

Ive is now the design chief for both hardware and software, but Jobs, Johnson, and Forstall are gone. Ive certainly will continue setting trends and direction, but he alone can’t do it for the whole company. Tim Cook is generally known to be a numbers guy and not really a replacement for the taste-making roles that Jobs and Forstall had.

The hire of Angela Ahrendts along with the Beats acquisition might be Cook’s way of injecting some new talent into that whole area. It feels odd to think about Apple turning over its chief taste makers all at once, but the old guys were around forever. Ive became an Apple employee in 1992, and Jobs and Forstall came in the NeXT deal in 1997. Johnson was the newbie, only coming on in 2000. The positions haven’t been open for a while (except Johnson’s retail head position, of course).

Jimmy Iovine and Dr. Dre joining Ive and Ahrendts as the company’s top taste executives makes as much sense as anything for the Beats acquisition. Cook’s internal letter talks as much about those two as it does Beats itself, and Beats as a brand is held in much higher regard than any of its products are.

Part of Apple’s corporate DNA is having a distinct sense of taste and style. With some of the people most responsible for establishing that sense now gone, others have to step in. Iovine and Dre are just the latest two to do so.

Wednesday, May 7, 2014

Monument Valley Is a New Super Mario Galaxy


Monument Valley is a pretty great iOS game that came out not too long ago. If you like impossible objects and M.C. Escher, this is the game for you.

Gameplay-wise, it’s pretty simple. It’s like a point-and-click adventure game, in that you simply tap on where you want Ida to go. If she can get there, she goes; if not, she won’t. You have to step on switches to alter the level architecture, avoid crows that block your path, and use a movable block tower to help you at times.

It reminds me a lot of Super Mario Galaxy in a couple of ways. The obvious one is that Ida can walk on walls and ceilings, just like Mario can in some levels. Some levels in Monument Valley require rotating the stage all 360°, again, like some Galaxy levels.

The other way it reminds me of SMG is that it’s not terribly difficult. With few exceptions, Galaxy is not that hard of a game for anyone who’s not a complete beginner to 3D platform games. It itches your brain in some novel ways, but once you learn its conventions, it’s not overly challenging.

Monument Valley is also fairly easy. The only time I felt stuck to any degree was in an early level where I didn’t realize I could tap and drag a piece of the building to open up a new path. Once I learned to recognize what is movable, it really flowed easily from there. Some parts felt more like a “click to continue” cutscene experience than like playing through a game.

It’s also pretty short, with only ten levels, some spanning multiple screens. Combine that with the low difficulty, and the whole thing goes by quickly. The reviewer at Polygon needed three hours to complete it; I didn't time it, but it took me about one hour, if that. I don’t feel bad about the $4 I spent on it because it’s beautifully designed and forges an interesting path. It leaves me wanting more, but fortunately, more is on the way.

Super Mario Galaxy left me wanting more too. With that one, it had nothing to do with length. It has 120 level variations to complete, and I did it as both Mario and Luigi (their controls are different, so it’s a somewhat different experience). It almost never upped the difficulty though, and I wanted a challenge. Super Mario Galaxy 2 fulfilled that wish, thankfully.

I don’t know if future installments of Monument Valley will be significantly harder. It seems like the game is more about the design and atmosphere than really being a challenge. And that’s fine! It’s OK for some games to be like that. I just hope the difficulty curve ramps up at some point in the future, even if it’s not the next release.

Monday, March 31, 2014

How I Met Your Mother’s Finale Was Too Late

We finally saw how Ted met the mother of his kids. It was a really nice moment. A nice moment at the end of a very flawed episode of television.

(Spoilers ahead)

The problem with the HIMYM finale simply is that it came too late. If it had been the finale of a hypothetical fifth or sixth season, it might have worked. I realize that it’s nearly impossible to turn down CBS when it offers you millions of dollars to continue making a series that peaks at over 10 million viewers a couple of times a year. However, all that extra time ruined the story that Carter Bays and Craig Thomas wanted to tell.

It all goes back to the Ted and Stella storyline in Seasons 3 and 4. Given the show’s established narrative structure, themes, conventions, etc., it was clear from that point on that the Mother wasn’t going to sneak up on us. If it wasn’t 100% obvious already, that plot thread confirmed that the moment of meeting the Mother wasn’t going to happen until the series finale. Stella got as far with Ted as anyone was ever going to without being the Mother.

Given that fact, every plot line involving Ted and a woman was going to feel hollow until we got the flashing neon sign that said, “HERE SHE IS”. The remainder of Season 4 after Ted got left at the altar and even into Season 5 was fine, as we got to see how Ted dealt with overcoming that huge event in his life. Season 5 is also when the Robin-Barney relationship first began, something that injected a lot of new energy into the show.

The opener of Season 6 is when HIMYM first teased Barney and Robin’s wedding. The show wouldn’t reveal until that season's finale that it was Barney’s wedding, and it wouldn’t be until much later still that we learned he was marrying Robin. Nevertheless, that episode is when the show began the end game. It aired on September 20, 2010; tonight was March 31, 2014.

That gap is too wide for the payoff to be satisfying. In Season 5 and really into Season 6, the show basically stopped being about Ted, the ostensible protagonist, and it became about Barney and Robin. It asked the viewers to get invested in Ted’s new crush Zoey, who clearly wasn’t going to be the titular Mother. It had to come up with things for Lily and Marshall to do, as there was no real dramatic tension in their relationship because a thousand flash forwards showed that they never would split up. Their marriage was in just as much mortal peril as Anakin Skywalker was in the Star Wars prequels.

From Season 6 through Season 8, the show basically just put things together in order to break them apart so it could put them back together again. It was marking time, just waiting for the last season to finally do the big reveal. Even someone who doesn’t overanalyze TV shows would have had the thought at some point: where is this going? Isn’t this supposed to be the story of how Ted met the kids’ mother? Why is it spending all this time on Barney and Robin? There eventually was an answer—Future Ted was still hung up on Robin—but the show waited to let us know until years of frustration had set in.

So, the finale. It didn’t help its cause that it tried to fit about four episodes’ worth of story into one double-length episode. That made it feel rushed. It also took two storylines that were years in the making—Barney’s transformation from a womanizer into husband material and the Barney-Robin wedding that was the backdrop for every single episode this season—and wiped them away before the second commercial break. Years of buildup gone, just like that.

With Barney, I understand what Bays and Thomas were going for. They wanted him becoming a father to be the thing that finally turned him around. The problem is they had him make too much of that turnaround before getting married. It was an enormous letdown to see him go right back to being his old self when wanting to be his old self wasn’t even why he and Robin split up in the first place.

Of course, the buildup for those things pales in comparison to the buildup of how Ted and the Mother would meet. That event is ostensibly what the show turned on, because it’s ostensibly what Ted’s life turned on. It turns out that Ted meeting a woman who, as far as any viewer could tell, was absolutely perfect for him, who he had a long relationship with, and who he had two children with, was yet another speed bump on the way to him getting with Robin*.

Maybe they could have pulled that off if the series had been shorter. Maybe. I don’t know. I do know the show couldn't do it after nine years. The buildup for that moment ended up larger than I think the writers ever intended, as it’s evident now that the very title of the series was the first of so very many tongue-in-cheek misdirections. With a shorter run, it might have worked. After this much time, it never had a chance.

I think Bays and Thomas wanted the point when Ted holds up the blue French horn in the last shot to be a moment when the viewers shout, “Finally!” at their TV sets. Instead, that moment happened six episodes earlier in “Sunrise” when Ted let Robin go in a pretty embarrassing CGI sequence. After false start after fake out after aborted run after dead end conversation, we seemed to be past the Ted-Robin thing once and for all.

The show went to that well only to pull out an empty bucket too many times. We were all sick of the will-they-or-won’t-they with those two. The finale had enough going for it in showing the main characters’ development over the years that it didn’t need one last left turn at the end. With about 12 fewer “either Ted or Robin wants it to work out between them but it’s just not going to happen” sequences, the last moment of the series might have been welcome. But at some point, you stop rooting for either Lucy or Charlie Brown and just want to stick a machete into the football.

Had the writers wanted to, they could have scrapped the planned ending and given us Ted and Tracy living happily ever after. It would have been a bit saccharine, but it wouldn’t have been infuriating. We got to know Tracy. She was great. She was just the right person for Ted, more so than Robin ever was.

I doubt Bays and Thomas ever seriously considered going with anything other than the ending they decided on when they conceived the show a decade ago. After all that time and commitment to it, they really couldn’t have done anything else. All that time was their ending’s enemy, though, and it suffered greatly for it.

*Maybe! We still don’t know if things work out with them!

Sunday, February 16, 2014

Disney’s Frozen Has a Secretly Ominous Ending

I saw Disney’s Frozen recently, and it’s a really good movie for the most part. It does sit in a bit of a Catch-22, though.

The most interesting character is by far Elsa, the only one who actually goes through a proper character arc. The other characters are largely the same people at the end as when they’re introduced (one of the trolls even sings that “people don’t really change”). It could be a stronger movie if it focused more on Elsa, but it would be a darker movie for it, and probably too dark to be a children’s movie. They also couldn’t just go for it and jettison the children’s-movie aspect, as so much of the film relies on the viewer not overanalyzing it precisely because it is a children’s movie.

Anyway, keeping in mind that this is just a fantasy children’s movie, it’s notable that it’s the most business-focused Disney movie yet. The Duke of Weselton is obsessed with international trade, shopkeeper Oaken gives a quick lesson on supply and demand, and concern for Kristoff’s ice business is a running theme throughout.

Warning: spoilers ahead.

On that note, the ending of the movie is actually pretty ominous from a business perspective.

Elsa’s unintentional winter spell in the middle of summer would have disrupted the economy of Arendelle considerably. What few crops the area has would have largely died in the deep freeze, some of the livestock could have died from exposure too thanks to farmers being caught off guard, the frozen fjord would be awful for the fishing industry, and the local logging certainly would be set back a bit. From that alone, Arendelle is probably headed for at least a sharp recession as a result of the movie’s events. We know from the Year Without a Summer that winter-like conditions in the summer would be devastating to an early-to-mid-1800s European state like Arendelle.

However, that’s not all. Just before the end, Elsa issues a decree that Arendelle will no longer do business with Weselton, its largest trading partner. That’s understandable given that the Duke of Weselton sent people to assassinate her, and this preindustrial fantasy land wouldn’t have some kind of UN to settle the dispute.

It’s also the last thing the kingdom needs. With the local agriculture and industry severely stunted, Arendelle needs trade now more than ever. Cutting off relations with the kingdom’s largest trading partner will only make the recession that much deeper.

Some of the downturn might get offset by an increase in government spending. When the king shut the castle off from outsiders to protect Elsa early on, he reduced the staff. With Elsa’s new open-gate policy, government employment will rise. There also will be more social functions, of which there has been exactly one (the coronation) in the last 10-15 years, which will lead to more spending in the local area. The royal treasury likely can sustain this deficit spending for a while, since it would have built up considerably during the decade-plus of reduced staff and few expenditures. Having a few more castle servants and some fancy parties wouldn’t come close to offsetting the full consequences of the week of winter, though.

Elsa would need to act quickly to repair the situation. She would need to send someone, perhaps the regent who ran the kingdom in the three years between her parents’ deaths and her coronation, to find new trading partners. She could also become Europe’s first entrepreneurial monarch. She might be able to bring in tourists from the region’s nobility by doing public demonstrations of her magic powers and setting up tours of her mountain ice palace. She could also travel to nearby kingdoms to create ice art for their special occasions. That money could then go to subsidize the rebuilding of the kingdom’s economy from the damage she unwittingly caused. I don’t know the extent of her powers, which seem considerable, but she might even be able to forestall the year’s coming winter to give her kingdom a chance to produce a few more goods to sell without competition.

Arendelle is probably in a better situation for the long haul with an open and confident monarch ruling it, but in the short run, there will be a struggle to fight off famine. For a seemingly happy movie that is actually the darkest Disney animated feature yet, a seemingly happy ending that is actually quite foreboding is only appropriate.

Addendum

This all assumes that Elsa doesn’t just create automatons from ice and snow to perform all economic tasks. They could work all hours of the day and dramatically expand Arendelle’s economy.

They also would put everyone out of work, creating one of those utopias philosophers once dreamed of, where everyone can live a life of leisure. That would work, to whatever extent it can given that people generally prefer working to doing nothing all day, until Elsa dies. Presumably all of the automatons would then cease to function.

At that point, Arendelle would plunge into a dystopian situation where the entire infrastructure of the economy fell apart all at once. It would be a long, slow slog out of the depression as the populace would have lost all experience with actually running agriculture and industry.

It would be tempting to go the automaton route given the immediate economic crisis that is coming to the land not long after the credits roll. However, if Elsa allowed her automatons to take over the whole economy, it would be far worse in the long run than accidentally plunging the kingdom into winter ever was.

Sunday, September 8, 2013

Apple Is One Rule Away From Ruling Console Gaming

Apple is very, very close to being able to just about kill off Nintendo's and Sony's gaming console businesses, and perhaps Microsoft's too if the media features of the Xbox One don't work as well as advertised. Only one very Apple-y rule will keep it from doing so.

Let's start with something that leaked a while ago (I'm going off the leak so I don't break the Apple Developer NDA). iOS 7 will support game controllers. Some legit images leaked out a while back, so you can see what they're planning. There are going to be three kinds of controllers. One cradles phone-sized iOS devices and has a limited button set: ABXY, two shoulders, a D-pad, and pause. The next cradles a phone-sized device and adds two analog sticks and two more shoulder buttons. The third kind is standalone (the diagram of which appears to have been inspired by the Wii Classic Controller), and it has the same, larger button set as the second one. The standalone controller image shows that up to four controllers can be used at once.
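For the curious, here is a rough sketch of what reading those two button sets looks like in code, using the GameController framework's two profiles (GCGamepad for the limited layout, GCExtendedGamepad for the layout with sticks and extra shoulder buttons) as the framework eventually shipped publicly. The wire(controller:) function and the handler bodies are illustrative, not anything Apple specifies.

```swift
import GameController

// Illustrative wiring for a single connected controller.
func wire(controller: GCController) {
    if let extended = controller.extendedGamepad {
        // The larger leaked layout: ABXY, D-pad, four shoulder buttons, two sticks.
        extended.valueChangedHandler = { pad, _ in
            let moveX = pad.leftThumbstick.xAxis.value   // -1.0 ... 1.0
            let jumpPressed = pad.buttonA.isPressed
            _ = (moveX, jumpPressed)                     // feed into the game loop here
        }
    } else if let basic = controller.gamepad {
        // The smaller leaked layout: ABXY, D-pad, two shoulder buttons.
        basic.valueChangedHandler = { pad, _ in
            let movingRight = pad.dpad.right.isPressed
            let jumpPressed = pad.buttonA.isPressed
            _ = (movingRight, jumpPressed)
        }
    }

    // Pause is a physical button on the hardware but surfaces as a single callback.
    controller.controllerPausedHandler = { _ in
        // toggle the game's pause state here
    }
}
```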

The implications for single-use handheld gaming devices are dire. The Nintendo DS and PlayStation Vita can provide a much wider variety of gaming options than touchscreen phones and tablets can thanks to having buttons. With these cradle controllers, now iOS devices can provide those experiences too on top of everything else they do. Well, they would if not for that rule I mentioned. But that's not all.

Thanks to AirPlay, you will be able to play a traditional controller-based game on iOS while sitting on your couch with the video on the TV. In fact, this setup is like the Wii U, only reversed: the Wii U has a smart box hooked up to the TV with a dumb tablet you hold in your hand, whereas Apple's setup has a smart tablet in your hand that connects to a dumb box hooked up to the TV.

The killer aspect for Apple is pricing. The Wii U, even after its upcoming discount, will go for $299, and it's the least expensive console of the new generation. A lot of people will already have iOS devices, or at least they can justify getting one because they can use it for far more than just games. A person who has an iPhone, iPod Touch, or iPad can buy into Apple's living room gaming setup for a $99 AppleTV and whatever one controller costs. Even if it's $35 or $40 like a traditional console controller, the combined price of $134 to $139 is still less than half of the Wii U's $299.

There is an immense advantage to buying into this kind of gaming setup. The hardware on iOS devices gets revised about every year. You won't have to wait six to eight years for Nintendo, Sony, or Microsoft to provide updated specs. Plus, the App Store model makes it far easier for games to get to you and opens up the door for a wealth of third-party developers who might never get something on a Wii U, Xbox, or PlayStation due to those platforms' barriers. And, again, the console part of it would be "free" to someone already committed to buying iDevices every couple of years anyway for their multitude of non-gaming functions.

Now, the red flag. The fact that there are two different button sets is a bit worrisome for fragmentation reasons, but that's not it. It's that Apple has made a rule that says controllers must be optional. An iOS game must be designed for touch and motion first with the controller only being a bonus add-on.
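In practice, that rule pushes every controller-aware game toward the same skeleton: draw touch controls by default, and treat a controller, if one happens to connect, as a bonus layered on top. Here is a minimal sketch under that assumption; the notification names come from the GameController framework, while InputManager, showTouchControls, and hideTouchControls are hypothetical stand-ins for whatever a real game would use.

```swift
import GameController

final class InputManager {
    // Hypothetical hooks into whatever on-screen touch controls the game draws.
    var showTouchControls: () -> Void = {}
    var hideTouchControls: () -> Void = {}

    func start() {
        // Touch first: the game must be fully playable before any controller appears.
        showTouchControls()

        let center = NotificationCenter.default
        center.addObserver(forName: .GCControllerDidConnect,
                           object: nil, queue: .main) { [weak self] note in
            guard let controller = note.object as? GCController else { return }
            // A controller is a bonus: swap the touch overlay out for physical buttons.
            self?.hideTouchControls()
            _ = controller   // hand off to the controller-handling code here
        }
        center.addObserver(forName: .GCControllerDidDisconnect,
                           object: nil, queue: .main) { [weak self] _ in
            // Fall back to touch the moment the controller goes away.
            self?.showTouchControls()
        }
    }
}
```

Nothing in a structure like this lets a game demand a controller; the touch path always has to exist, which is exactly the limitation discussed below.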

I know why Apple did this. It's to maintain simplicity for the store. It's also to remove a potential support headache. Apple doesn't want people calling it up asking for refunds after buying a game only to find out they need a separate controller to play it. Having a game in the App Store that requires a controller just wouldn't do at all.

It also means that Apple won't kill off the other game console makers as quickly as it could have. Think about some traditional handheld or living room console titles, anywhere from Zelda to Smash Bros. to Madden to Halo. They require a boatload of buttons for a reason. Making a game that functions well both with the limitations of touch input and the freedom of buttons is going to be tough, and the categories of games that require controllers will still not be feasible to provide for iOS.

Apple should know this. It knows well the difference between touch input and bucket-o-buttons input. It's why it keeps iOS and OS X separate. Any gamer can tell you that this rule is a bad idea, and people inside Apple should be able to tell you that too.

As far as the living room goes, this strategy makes total sense for Apple. It can make a limited play for living room gaming while not disrupting its plans for the AppleTV. It doesn't have to turn the AppleTV into a full fledged gaming console on top of everything else; an iDevice, a controller, and AirPlay will cover that use case just fine. It can keep selling $99 hockey pucks to people who have no interest in gaming, which makes far more sense as a living room strategy than Microsoft's apparent gambit of wanting to sell $500 Xbox Ones to people who don't play games.

Between controller support and Sprite Kit in iOS 7 and Mavericks, Apple is making a real effort at competing in games this fall. This one rule that controllers must be optional keeps it from being able to take over everything. Between apps that run on either iPhones or iPads but not both and iBooks Author creations that only work on iPads, Apple already has things in its stores that don't work everywhere. I would have thought that a simple modal dialog box saying something like "This game requires a separate controller. Do you want to buy?" might be enough to allow them to have apps that require controllers, but the powers that be chose not to go that route.

As long as that rule exists, there still is room for dedicated gaming hardware. We'll see how long that rule lasts.