Monday, March 31, 2014

How I Met Your Mother’s Finale Was Too Late

We finally saw how Ted met the mother of his kids. It was a really nice moment. A nice moment at the end of a very flawed episode of television.

(Spoilers ahead)

The problem with the HIMYM finale is simple: it came too late. If that had been the finale of a hypothetical fifth or sixth season, it might have worked. I realize that it’s nearly impossible to turn down CBS when it offers you millions of dollars to keep making a series that still peaks at over 10 million viewers a couple of times a year. However, all that extra time ruined the story that Carter Bays and Craig Thomas wanted to tell.

It all goes back to the Ted and Stella storyline from Seasons 3 and 4. Given the show’s established narrative structure, themes, and conventions, it was clear from that point on that the Mother wasn’t going to sneak up on us. If it wasn’t 100% obvious already, that plot thread confirmed that the moment of meeting the Mother wasn’t going to happen until the series finale. Stella got as far with Ted as anyone was ever going to get without being the Mother.

Given that, every plot line involving Ted and a woman was going to feel hollow until we got the flashing neon sign that said, “HERE SHE IS.” The remainder of Season 4 after Ted got left at the altar, and even into Season 5, was fine, as we got to see how Ted dealt with overcoming that blow. Season 5 is also when the Robin-Barney relationship first began, something that injected a lot of new energy into the show.

The opener of Season 6 is when HIMYM first teased Barney and Robin’s wedding. The show wouldn’t reveal until that season’s finale that it was Barney’s wedding, and it would be much later still before we learned he was marrying Robin. Nevertheless, that episode is when the show began its endgame. It aired on September 20, 2010; tonight was March 31, 2014.

That gap is too wide for the payoff to be satisfying. In Season 5 and really into Season 6, the show basically stopped being about Ted, the ostensible protagonist, and became about Barney and Robin. It asked viewers to get invested in Ted’s new crush Zoey, who clearly wasn’t going to be the titular Mother. It had to come up with things for Lily and Marshall to do, as there was no real dramatic tension in their relationship; a thousand flash-forwards showed they would never split up. Their marriage was in just as much mortal peril as Anakin Skywalker was in the Star Wars prequels.

From Season 6 through Season 8, the show basically just put things together in order to break them apart so it could put them back together again. It was marking time, waiting for the last season to arrive so it could finally do the big reveal. Even someone who doesn’t overanalyze TV shows would have wondered at some point: where is this going? Isn’t this supposed to be the story of how Ted met the kids’ mother? Why is it spending all this time on Barney and Robin? There eventually was an answer—Future Ted was still hung up on Robin—but the show waited until years of frustration had set in to let us know.

So, the finale. It didn’t help its cause that it tried to fit about four episodes’ worth of story into one double-length episode. That made it feel rushed. It also took two storylines that were years in the making—Barney’s transformation from a womanizer into husband material and the Barney-Robin wedding that was the backdrop for every single episode this season—and wiped them away before the second commercial break. Years of buildup gone, just like that.

With Barney, I understand what Bays and Thomas were going for. They wanted becoming a father to be the thing that finally turned him around. The problem is that they had him make too much of that turnaround before getting married. It was an enormous letdown to see him go right back to being his old self, especially since wanting to be his old self wasn’t even why he and Robin split up in the first place.

Of course, the buildup for those things pales in comparison to the buildup to how Ted and the Mother would meet. That event is what the show ostensibly turned on, just as it’s what Ted’s life ostensibly turned on. It turns out that Ted meeting a woman who, as far as any viewer could tell, was absolutely perfect for him, whom he had a long relationship with, and whom he had two children with, was just another speed bump on the way to him getting together with Robin*.

Maybe they could have pulled that off with a shorter series. Maybe. I don’t know. I do know they couldn’t pull it off after nine years. The buildup for that moment ended up larger than I think the writers ever intended, as it’s now evident that the very title of the series was the first of so very many tongue-in-cheek misdirections. After this much time, it never had a chance.

I think Bays and Thomas wanted the moment when Ted holds up the blue French horn in the last shot to be the one where viewers shout, “Finally!” at their TV sets. Instead, that moment happened six episodes earlier in “Sunrise,” when Ted let Robin go in a pretty embarrassing CGI sequence. After false start upon fake-out upon aborted run upon dead-end conversation, we seemed to be past the Ted-Robin thing once and for all.

The show went to that well only to pull out an empty bucket too many times. We were all sick of the will-they-or-won’t-they with those two. The finale had enough going on with the main characters’ development over the years that it didn’t need one last left turn at the end. With about a dozen fewer “either Ted or Robin wants it to work out between them but it’s just not going to happen” sequences, the last moment of the series might have been welcome. But at some point, you stop rooting for either Lucy or Charlie Brown and just want to stick a machete into the football.

Had the writers wanted to, they could have scrapped the planned ending and given us Ted and Tracy living happily ever after. It would have been a bit saccharine, but it wouldn’t have been infuriating. We got to know Tracy. She was great. She was just the right person for Ted, more so than Robin ever was.

I doubt Bays and Thomas ever seriously considered going with anything other than the ending they decided on when they conceived the show a decade ago. After all that time and commitment, they really couldn’t have done anything else. All that time was the ending’s enemy, though, and it suffered greatly for it.

*Maybe! We still don’t know if things work out with them!

Sunday, February 16, 2014

Disney’s Frozen Has a Secretly Ominous Ending

I saw Disney’s Frozen recently, and it’s a really good movie for the most part. It is caught in a bit of a Catch-22, though.

The most interesting character is by far Elsa, the only one who actually goes through a proper character arc. The other characters are largely the same people at the end as when they’re introduced (one of the trolls even sings that “people don’t really change”). It could be a stronger movie if it focused more on Elsa, but it would be a darker movie for it, and probably too dark for children. The filmmakers also couldn’t just go for it and jettison the children’s-movie aspect, as so much of the film relies on the viewer not overanalyzing it precisely because it’s a children’s movie.

Anyway, keeping in mind that this is just a fantasy children’s movie, it’s notable that it’s the most business-focused Disney movie yet. The Duke of Weselton is obsessed with international trade, shopkeeper Oaken gives a quick lesson on supply and demand, and concern for Kristoff’s ice business is a running theme throughout.

Warning: spoilers ahead.

On that note, the ending of the movie is actually pretty ominous from a business perspective.

Elsa’s unintentional winter spell in the middle of summer would have disrupted the economy of Arendelle considerably. What few crops there are in the area would have largely died in the deep freeze, some of the livestock could have died from exposure with farmers caught off guard, the frozen fjord would be awful for the fishing industry, and the area’s logging certainly would be set back. From that alone, Arendelle is probably headed for at least a sharp recession as a result of the movie’s events. We know from the Year Without a Summer that winter-like conditions in summer would be devastating to an early-to-mid-1800s European state like Arendelle.

However, that’s not all. Just before the end, Elsa issues a decree that Arendelle will no longer do business with Weselton, its largest trading partner. That’s understandable given that the Duke of Weselton sent people to assassinate her, and this preindustrial fantasy land doesn’t have some kind of UN to settle the dispute.

It’s also the last thing the kingdom needs. With the local agriculture and industry severely stunted, Arendelle needs trade now more than ever. Cutting off relations with the kingdom’s largest trading partner will only make the recession that much deeper.

Some of the downturn might get offset by an increase in government spending. When the king shut the castle off from outsiders to protect Elsa early on, he reduced the staff. With Elsa’s new open-gate policy, government employment will rise. There will also be more social functions, of which there has been exactly one (the coronation) in the last 10-15 years, which will mean more spending in the local area. The royal treasury can likely sustain this deficit spending for a while, since it would have built up considerably during the decade-plus of reduced staff and few expenditures. Still, a few more castle servants and some fancy parties wouldn’t come close to offsetting the full consequences of a week of winter.

Elsa would need to act quickly to repair the situation. She would need to send someone, perhaps the regent who ran the kingdom in the three years between her parents’ deaths and her coronation, to find new trading partners. She could also become Europe’s first entrepreneurial monarch. She might be able to bring in tourists from the region’s nobility by doing public demonstrations of her magic powers and setting up tours of her mountain ice palace. She could also travel to nearby kingdoms to create ice art for their special occasions. That money could then go to subsidize the rebuilding of the kingdom’s economy from the damage she unwittingly caused. I don’t know the extent of her powers, which seem considerable, but she might even be able to forestall the year’s coming winter to give her kingdom a chance to produce a few more goods to sell without competition.

Arendelle is probably in a better situation for the long haul with an open and confident monarch ruling it, but in the short run, there will be a struggle to fight off famine. For a seemingly happy movie that is actually the darkest Disney animated feature yet, a seemingly happy ending that is actually quite foreboding is only appropriate.

Addendum

This all assumes that Elsa doesn’t just create automatons from ice and snow to perform all economic tasks. They could work all hours of the day and dramatically expand Arendelle’s economy.

They would also put everyone out of work, creating one of those utopias philosophers once dreamed of where everyone can live a life of leisure. That would work, to whatever extent it can given that people generally prefer working to doing nothing all day, until Elsa dies. Presumably all of the automatons would then cease to function.

At that point, Arendelle would plunge into a dystopian situation where the entire infrastructure of the economy fell apart all at once. It would be a long, slow slog out of the depression as the populace would have lost all experience with actually running agriculture and industry.

It would be tempting to go the automaton route given the immediate economic crisis coming to the land not long after the credits roll. However, if Elsa allowed her automatons to take over the whole economy, it would be far worse in the long run than accidentally plunging the kingdom into winter ever was.

Sunday, September 8, 2013

Apple Is One Rule Away From Ruling Console Gaming

Apple is very, very close to being able to just about kill off Nintendo's and Sony's gaming console businesses, and perhaps Microsoft's too if the media features of the Xbox One don't work as well as advertised. Only one very Apple-y rule will keep it from doing so.

Let's start with something that leaked a while ago (I'm going off the leak so I don't break the Apple Developer NDA): iOS 7 will support game controllers. Some legitimate images got out, so you can see what Apple is planning. There are going to be three kinds of controllers. One cradles phone-sized iOS devices and has a limited button set: ABXY, two shoulder buttons, a D-pad, and pause. The next also cradles a phone-sized device but adds two analog sticks and two more shoulder buttons. The third is standalone (its diagram appears to have been inspired by the Wii Classic Controller), and it has the same, larger button set as the second. The standalone controller image shows that up to four controllers can be used at once.

The implications for single-use handheld gaming devices are dire. The Nintendo DS and PlayStation Vita can provide a much wider variety of gaming options than touchscreen phones and tablets can, thanks to their buttons. With these cradle controllers, iOS devices can now provide those experiences too, on top of everything else they do. Well, they would if not for that rule I mentioned. But that's not all.

Thanks to AirPlay, you will be able to play a traditional controller-based game on iOS while sitting on your couch with the video on the TV. In fact, this setup is like the Wii U, only reversed. The Wii U has a smart box hooked up to the TV with a dumb tablet you hold in your hand, whereas Apple's setup has a smart tablet in your hand that connects to a dumb box hooked up to the TV.

The killer aspect for Apple is pricing. The Wii U, even after its upcoming discount, will go for $299, and it's the least expensive console of the new generation. A lot of people already have iOS devices, or at least they can justify getting one because they can use it for far more than just games. A person who has an iPhone, iPod Touch, or iPad can buy into Apple's living room gaming setup for a $99 AppleTV and whatever one controller costs. Even if a controller runs $35 or $40 like a traditional console controller, the combined price is still less than half of the Wii U's.
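
As a quick back-of-the-envelope check, here is a minimal sketch of that comparison, with the caveat that the $40 controller figure is just the guess from above, not an announced price:

    // Rough price comparison; the controller price is an assumption, not official.
    let appleTV = 99.0
    let controller = 40.0                  // assumed, per the guess above
    let appleSetup = appleTV + controller  // $139
    let wiiU = 299.0
    print(appleSetup < wiiU / 2)           // prints "true": $139 vs. $149.50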

There is an immense advantage to buying into this kind of gaming setup. The hardware in iOS devices gets revised about every year. You won't have to wait six to eight years for Nintendo, Sony, or Microsoft to provide updated specs. Plus, the App Store model makes it far easier for games to get to you and opens the door to a wealth of third-party developers who might never get something onto a Wii U, Xbox, or PlayStation due to those platforms' barriers to entry. And, again, the console part of it would be "free" to someone already committed to buying iDevices every couple of years for their multitude of non-gaming functions.

Now, the red flag. The fact that there are two different button sets is a bit worrisome for fragmentation reasons, but that's not it. It's that Apple has made a rule that controllers must be optional. An iOS game must be designed for touch and motion first, with a controller as only a bonus add-on.

I know why Apple did this. It's to maintain simplicity for the store, and it's to remove a potential support headache. Apple doesn't want people calling up asking for refunds after they buy a game and find out they have to buy a controller in order to play it. Having a game in the App Store that requires a controller just wouldn't do at all.

It also means that Apple won't kill off the other game console makers as quickly as it could have. Think about some traditional handheld or living room console titles, anywhere from Zelda to Smash Bros. to Madden to Halo. They require a boatload of buttons for a reason. Making a game that functions well both within the limitations of touch input and with the freedom of buttons is going to be tough, and the categories of games that require controllers still won't be feasible on iOS.

Apple should know this. It knows well the difference between touch input and bucket-o-buttons input. It's why it keeps iOS and OS X separate. Any gamer can tell you that this rule is a bad idea, and people inside Apple should be able to tell you that too.

As far as the living room goes, this strategy makes total sense for Apple. It can make a limited play for living room gaming while not disrupting its plans for the AppleTV. It doesn't have to turn the AppleTV into a full fledged gaming console on top of everything else; an iDevice, a controller, and AirPlay will cover that use case just fine. It can keep selling $99 hockey pucks to people who have no interest in gaming, which makes far more sense as a living room strategy than Microsoft's apparent gambit of wanting to sell $500 Xbox Ones to people who don't play games.

Between controller support and Sprite Kit in iOS 7 and Mavericks, Apple is making a real effort at competing in games this fall. This one rule that controllers must be optional keeps it from being able to take over everything. Between apps that run on either iPhones or iPads but not both and iBooks Author creations that only work on iPads, Apple already has things in its stores that don't work everywhere. I would have thought that a simple modal dialog box saying something like "This game requires a separate controller. Do you want to buy?" might be enough to allow apps that require controllers, but the powers that be chose not to go that route.
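
For what it's worth, here's a rough sketch of what that gate could look like on the client side. The requiresController flag and completePurchase callback are hypothetical illustrations, not real App Store or StoreKit APIs, and this is my own speculation about the flow rather than anything Apple has announced:

    import UIKit

    // Hypothetical purchase gate for a controller-required game. The parameter
    // names below are stand-ins; only the UIAlertController usage is real API.
    func confirmPurchase(of gameName: String,
                         requiresController: Bool,
                         from presenter: UIViewController,
                         completePurchase: @escaping () -> Void) {
        // Games that don't need a controller buy through as usual.
        guard requiresController else {
            completePurchase()
            return
        }
        // Warn the buyer once, then let them decide.
        let alert = UIAlertController(
            title: "Controller Required",
            message: "\(gameName) requires a separate controller. Do you want to buy?",
            preferredStyle: .alert)
        alert.addAction(UIAlertAction(title: "Cancel", style: .cancel))
        alert.addAction(UIAlertAction(title: "Buy", style: .default) { _ in
            completePurchase()
        })
        presenter.present(alert, animated: true)
    }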

As long as that rule exists, there still is room for dedicated gaming hardware. We'll see how long that rule lasts.

Sunday, April 14, 2013

A Few Good Years Have Passed

Last night, my wife and I watched the 1992 classic A Few Good Men. I was of course familiar with the famous courtroom scenes, but it was actually the first time I had seen it all the way through. My wife hadn't seen it either, but she is in the Navy now, so I figured she'd enjoy it for that reason. Her favorite line actually didn't end up being any of the famous ones. Rather, it was Kevin Pollak's Lieutenant Weinberg wryly stating, "No one likes the whites." This is true; no one she knows likes the Navy's dress white uniforms. It had some inaccuracies that bugged her, though, not the least of which was Tom Cruise's Lieutenant Kaffee treating Demi Moore's Lieutenant Commander Galloway throughout as though he outranked her.

Anyway, I had recorded it off of AMC, and little fact boxes popped up at the bottom periodically. It wasn't until one of those boxes appeared some way into the film that it really clicked for me why Jack Nicholson's Colonel Jessup was so intense about being on the wall and so forth. He was the commander of the Guantanamo Bay base, and at the time Aaron Sorkin wrote the play on which the movie was based, the Cold War was still going on. Not that Cuba is the United States' friend now or anything, but the implications of the island being Communist were far more important then than they are now.

I was born three years before Sorkin's play first hit the stage. I can remember old maps from elementary school that said USSR and can recall seeing fallout shelter signage here and there, but I have no recollection of the Cold War and its existential threat to the US. I was four when the Berlin Wall fell; I was six when the Soviet Union dissolved. Even if I had learned about Russian nukes being pointed at my country at the time, I wasn't old enough to really understand the implications.

For my generation, Guantanamo Bay has a very different connotation. It's not an outpost of democracy on the edge of Communist territory; it's a holding cell for War on Terror suspects. The incident that set everything in motion in A Few Good Men was a Marine firing a single, unprovoked shot across the fence line. While that's never something you want to see happen, it probably would be more or less a nonevent these days beyond whatever punishment the Marine gets for unnecessarily discharging a weapon. It wouldn't be an event that could potentially cost lives. Cuba isn't a battlefield anymore. Guantanamo is a very different part of the wall that keeps America safe now.

The climactic scene with Kaffee haranguing Jessup on the stand is still as intense as ever, but it has lost a little something because the film takes for granted that the audience understands its Cold War subtext. I am pretty well versed in history and probably would have put it all together eventually, but it's not something that people in the Millennial generation and beyond will get instinctively. I certainly understand it in an intellectual sense, but I don't feel it viscerally. A young person could make it through the whole thing and think that Jessup is just really cranky because he thinks every member of the military who isn't in an office in D.C. plays a part in guarding the wall that protects the homeland. The latent yet very specific threat of nuclear war is lost in that reading.

If Hollywood ever decides to remake this film, it will definitely hammer (probably excessively so) on that element during the first couple of acts. The future Jessup will throw around terms like "the Red Menace" to make sure it's clear (crystal, even) that the stakes are tied to the Cold War. For Jessup and his generation, "Cuba" probably primarily conjures the Cuban Missile Crisis or the Bay of Pigs; for me, it conjures Elian Gonzalez well before any of JFK's incidents down there. The movie is still really good without that context in the forefront of your mind, but it's not as good as it can be.

One of the other popup fact boxes said that Rob Reiner had hoped to make A Few Good Men a timeless movie and that, aside from Cruise's civilian wardrobe, it is. We must add one other caveat besides the loud shirts: it's timeless except for its inherent assumption that Guantanamo Bay, Cuba, will always carry Cold War connotations for its audience. It certainly does not for most anyone younger than 30, and it might not anymore for those older than that either, given its prominence in the last decade's news cycle.


Thursday, August 16, 2012

Apple Wants iCloud to Be the World's DVR

The Wall Street Journal has been revealing some details about Apple's plans in the television space. Steve Jobs famously said he thought he had "cracked" the problem of television shortly before he passed away last year, and everyone has been trying to figure out what he meant ever since.

The latest report from the WSJ, if true and I'm interpreting it correctly, likely reveals what Jobs thought was the breakthrough:

The Cupertino, Calif.-based company proposes giving viewers the ability to start any show at any time through a digital-video recorder that would store TV shows on the Internet. Viewers even could start a show minutes after it has begun.

The vision here is pure Apple. The company identified an area of complexity, in this case managing TV recordings, and plans to offer a solution where it simply does it all for you. Here, iCloud becomes the world's DVR. There won't be boxes in every home making millions of individual recordings of the same programs; there will be one place that "records" the programs (Apple's datacenter), and all of the boxes will stream that copy.

You won't miss a show because you forgot to set up a recording; Apple is recording it for you. You won't miss a show because the DVR filled up; Apple is recording it for you. You won't miss a recording because you're out of free tuners, or because the cable went out, or because a cloud went between you and the satellite. Don't worry. Apple's recording it for you.

Obvious roadblocks have to be overcome before this vision of the future can come to pass. For one, the WSJ reports that Apple doesn't yet have a single deal worked out with any content providers or cable providers to make this happen legally. For another, this setup requires a completely reliable Internet connection. If the Internet goes out, you not only have no TV anymore (not a guaranteed problem today), you can't watch your recordings in the meantime either.

Plus, ISPs aren't going to be happy about a system like this because it would put an enormous strain on their networks. They are already playing around with bandwidth caps, and that's without most people getting their TV through the Internet. Perhaps the new H.265 standard will solve this particular issue, but it's not going to be available for anything until "as soon as 2013" (which probably means later than that, given the choice of weasel words here).

This sounds like a really cool way forward. I have my doubts that we'll see anything like it any time soon because content owners, cable providers, and ISPs are some of the worst companies in the world. Of course, Apple worked things out with cell operators, who are just as bad if not worse, so there is some hope out there.

Monday, August 13, 2012

I Disagree Vigorously In Many Ways

I decided to back App.net because I am deeply distrustful of companies that make money by selling users' personal information. For instance, I've only ever posted one image of myself on Facebook, even though I was an early user back when the service was only open to college students (I've since deleted that photo). I stay logged out of Google except when explicitly using those of its services for which I judge there to be no better alternative. I install anti-tracking extensions in every browser I control.

App.net appeals to me because it promises to be a clone of Twitter, a web service I love dearly, that will not sell my information to advertisers and will treat developers well. Yes, privacy really does matter to me. I'm not a software dev, but I'm enough of a tech nerd that I sympathize with them.

You can imagine, then, the offense I took when I saw this article by Whitney Erin Boesel proclaiming App.net to be a "white flight" from Twitter and Facebook. There's so much to disagree with that it's hard to know where to begin.

Well, let's start with the premise as revealed by the title. Boesel is not suggesting a literal white flight, but rather that techies (who happen to be predominantly white, so you know) are unhappy with Twitter because it's not just for them anymore. Regular people, minorities, and, in general, the others came in and ruined it.

That premise fundamentally misunderstands App.net founder Dalton Caldwell's critique of Twitter that led to the current incarnation of App.net. It's very simple. Twitter was once very friendly to third party developers. In fact, third party developers did as much as anyone to popularize Twitter.

Twitter didn't have a plan to make money back in those days, though, and all it eventually came up with was being just another advertising-supported social network. That was a disappointment to the tech crowd because Twitter could have been much more technically interesting (see Caldwell's piece for the details). Boesel apparently only looked at the pictures and didn't read the text of Caldwell's tweets complaining about targeted trending topics and K-Mart ads. The issue isn't that "OMG it’s the end of the world: K-mart shoppers and people of color found Twitter," as she sardonically put it. It's that those topics and ads are supposed to be customized for him but don't come close. The company that could have changed the way information flows online became not just a targeted ad company, but a bad targeted ad company.

No one begrudges Twitter's need to make money. It couldn't survive forever by burning through VC money. The problem is that the way it has chosen to pursue that money has caused it to do things that offend the techie crowd. Boesel shouldn't be chiding that crowd for feeling hurt after being offended. We really do think and care about this stuff. A lot. And desiring the ability to pay to avoid advertising is hardly confined to people unhappy with Twitter.

And yes, there are some tech-y people who proudly announce themselves as non-Facebook users, as Boesel notes, but there's always a backlash against popular things. With a billion or so users, it's hard to name anything ever that's been as popular as Facebook. The effect can be adapted from one of modernity's great sages: no one uses Facebook anymore; it's too crowded. But App.net's existence is not really about Facebook; no one ascribed any idealistic hopes to Facebook as they did to Twitter, because it's always been clear that Facebook would be ad-supported. App.net is about Twitter pulling the rug out from under third-party devs because of the incentives created by the ad-supported model, plain and simple. Without those incentives, App.net might be able to fulfill some of the technical promise that Twitter used to have.

It's certainly true that people who have a spare $50 to speculatively bet on a fledgling social network will trend towards the more affluent side of things, but if you're going to hammer on that, then you're also targeting nearly everything ever launched from Kickstarter.com.

The $50 yearly membership has to be charged up front because Dalton Caldwell is trying to get this thing off the ground. Starting a new service costs money, and if you’re not going the VC route, then you have to do something like this.

His calling it an "are you serious" fee is a great way of putting it. The kind of people drawn to App.net right now are by definition early adopters, and early adopters tend to be fickle. If they could try it for a month and not like it, they'd be out only $4.17 and gone. They're more likely to stick through early bugs and growing pains (it's still very much an alpha; there's not even a TOS yet) if they've plunked down a more significant sum of money. That's Loss Aversion 101.

It did cross my mind that the upfront charge is going to price out a lot of people who might otherwise be able to join at $4.17 a month instead. Ultimately, I backed it anyway because the upfront charge is probably there for the reasons I just discussed in the previous two paragraphs. I expect to see a monthly, rather than yearly, billing option in the future. If that never comes, then I'll reconsider this part.

Finally, Boesel seems to be offended by the idea that enthusiasts make things for fellow enthusiasts.

In its early days, Twitter appealed largely to tech enthusiasts. As the company cast its nets wider, it made changes to make it more appealing to the masses rather than the enthusiasts. Not surprisingly, that upset the enthusiasts. In App.net, the enthusiasts are attempting to build or back something that will cater to their preferences. She slams the idea of a social network geared towards the desires of tech enthusiasts as a "digital country club", but there's nothing new or controversial in enthusiasts congregating with other enthusiasts.

Automobile enthusiasts have car clubs. Bowling enthusiasts have leagues that are separate from the casual players, and they typically cost more than a year of App.net does. On the Internet, you can find message boards where enthusiasts in just about anything meet and discuss things (and yes, some cost money). If tech enthusiasts want to have a social network by and for them, what's the big deal? Not that I think App.net will always have that narrow a scope, but even if it does, what's the harm? Should model airplane clubs be disbanded because it's expensive to build, fly, and maintain model airplanes?

In a comment on her story, while acknowledging that App.net might be better than Facebook or Twitter, she says:

i think it’s important to ask *to whom* the better thing is being offered, and *to whom* the better thing is inaccessible–as well as whether picking up and moving is the best way to deal with the much larger issues that manifest in what angers us about facebook and twitter.

It's being offered to anyone who can help with the cost of bootstrapping the business. This is almost certainly a temporary phase as the business gets off the ground; a monthly plan will have to be offered for it to get enough users to be sustainable. [Update: Caldwell has acknowledged that a price cut could be coming, but the service must be sustainable first.] Even that will price out some people from this potentially better alternative to Facebook and Twitter, but you'll be hard pressed to find many products where no one is priced out of the best available option. If you contend that the best of everything should be priced so that it is available to all, then what you're describing is not capitalism.

It is inaccessible to those who don't have the money to spare, just like literally anything following the Kickstarter model.

And if people don't leave Facebook or Twitter for something else, what pressure will there be on them to change? Twitter is a private company; you need to be far richer than someone with a spare $50 to have any influence with it. The majority of Facebook's voting shares are owned by Mark Zuckerberg, so he literally has complete control of that company. If you're not some combination of fabulously wealthy, a VC, or a large business with a large advertising budget, you have no hope of making a dent in those companies.

The only options are supporting something that does things differently, thereby exerting pressure through the market, or non-consumption. I chose the former because I think it will yield more results, faster, than silently dropping out will.

Even if App.net does take off, I won't quit Twitter. It's too useful for my main blogging gig. My hope is that App.net will put pressure on Twitter to one day offer an option to pay for an ad-free, non-tracking form of the service. That's a wish upon a star, but I actually did something about it. If you don't work towards the future you want to see, then you can't complain if it never materializes.

Sunday, August 12, 2012

Paul Ryan Is Mostly Unremarkable

Paul Ryan is the pick as Mitt Romney's vice president. Lots of pixels and ink have been devoted to what that means and how Ryan changes the game (or not, as the case may be).

Ultimately, Ryan isn't that remarkable among Republicans. Look over his record.

He voted for George W. Bush's unfunded tax cuts and his unfunded Medicare Part D expansion. He voted in favor of the unfunded war of choice in Iraq. He voted for TARP and the bailouts, practically begging his colleagues to support the former. When Barack Obama took office, Ryan got religion about deficits and eventually put together his famous series of budgets that cut taxes and spending.

In other words, he is a garden variety politician. Spending by his party's leader is a judicious use of our resources that strikes the right balance, while spending by the other party's leader is wasteful and a burden to future generations. Deficits created by his party aren't worth worrying about, but those racked up by the other party are dangerous. Nothing is new under the sun.

Ryan does shine as a communicator, as he's able to state his cases in a clear and often convincing manner. He will do a better job at advancing his party's ideas than Romney does. However, there's nothing in his record that makes Ryan all that special. He's a Republican who mainly just votes the party line.