Monday, August 31, 2009

Sad Tales from the Android App Store

Makes you think twice before abandoning the iPhone, even with the hassles and review process problems, doesn't it?

Click-to-Flash

Just thought this was an apropos time to mention Wolf Rentzsch's wonderful Click-to-Flash, which prevents the Flash plug-in from running in your browser unless you specifically authorize it. It's a tool that every Mac user should have installed (except you Flash developers, of course).

Flash Post Mortem

Okay, this is going to be my last word on Flash for a while. As fun as this has been, I think anything beyond this would start to get counterproductive, and I have a lot of stuff to get done in the next few weeks.

I do appreciate everyone who took the time to point out the flaws they perceived in my argument, and everyone who provided links and information about the Flash Platform with regard to the mobile web. Nothing I saw changed my mind about the future of Flash as a web development tool, but I feel better informed about Adobe's desperate attempts to hold onto the position they've carved out for themselves on the pre-mobile web.

Even if my predictions are completely wrong and Flash manages to survive as a dominant web development tool, it won't change my conviction that it's simply the wrong paradigm for the vast, vast majority of web development tasks. It's a fine tool for interactive presentations, kiosks, and limited cross-platform development. But for the web? It couldn't be a worse tool. Sorry. I know many of you disagree with me on this, but I've used a lot of web development tools and visited a lot of websites over the years on multiple platforms and Flash is just a back-asswards approach to web development. If Adobe changes the fundamental architecture of the platform, then I'm open to re-assessing Flash's worth, but until then, any enhancements or new features are nothing more than turd polishing. The shiniest, most jewel-encrusted turd is still a turd.

But I'm completely unconvinced that Flash is going to survive the growth of the mobile web, at least as the dominant player it has been. Here's the thing that's missing. Adobe has created a consortium to get Flash on to "billions" of mobile devices. Who's not on that list right now? Apple and RIM, maker of the BlackBerry. When you look at the mobile web, either by units or packets transferred, those two have the lion's share of the market. Having a consortium to promote something on the mobile web without those two participating would be like having a consortium to develop standards for the regular web without including Microsoft. If you can't get at least one of them, and probably both, Flash is a dead end for the mobile web. That's just the political reality. Even if you can get RIM, you've still got the single largest smart phone in terms of units, packets transferred and, certainly, mindshare not participating in your "consortium". So, are companies going to want to do their sites one way for the iPhone and another for everyone else? Or are they going to start veering towards the open standards that will work on all devices? I think the slow move toward standards whenever feasible has already begun and eventually Flash will be left to handle the small number of niche jobs that can't be handled using Javascript/HTML/CSS.

On top of that, as long as Apple is in the position they're in with the iPhone not supporting Flash, other phone manufacturers are going to view mobile Flash as "not mission critical", even if they join your consortium and let you put their logo on the consortium's web page. As long as that's true, these companies are going to view Flash as a potential marketing advantage and nothing more. They'll list it as a feature they have that the iPhone doesn't, but they're not going to expend huge resources making sure it's wonderful, because it's already been proven that people will buy smart phones that don't have Flash. These companies will do just enough to be able to put "Flash-enabled" in their ads and that's it. Again, it might not be right, but it is the political reality of how decisions are made at large companies. There's always too much to do and too few hours, and if you can't convince them all that Flash is mission critical, they're not going to pull resources away from other things to get it done.

Now, if Apple and RIM get onboard with Flash, that completely changes the political landscape. It won't change my mind about Flash being a poor choice for most types of web development, but it certainly will change its future viability substantially.

Nested Arrays

I'm staying the heck away from talking about Flash for a while.

In my quest for a good solution for handling table-based detail editing panes, I've been experimenting with using nested arrays to drive the table structure. A nested array is nothing more than an array of arrays (and I'm talking about NSArray instances here). The main or outer array contains one instance of NSArray for each section, and each subarray contains one object for each row in the section it represents. It takes multiple nested arrays to hold the structure of a table, with one nested array holding the labels, another holding the keys, and another holding the controller class that can be used to edit each item. They're paired nested arrays, I guess.

This solution isn't quite as turnkey as the property-list driven solution I was working on earlier, but it's conceptually a lot easier to explain, and it doesn't squirrel away all the code I need to demonstrate into complex, generic classes. It's a hell of a lot better than having large nested switch statements in your controller class.
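
To make that structure concrete, here's a rough sketch of what a pair of those nested arrays might look like for a two-section editing table. The labels and keys here are made up purely for illustration; the important thing is that the two arrays are parallel, so the object at a given index path in one corresponds to the object at the same index path in the other.

// Hypothetical paired nested arrays for a two-section editing table.
// Section 0 has two rows, section 1 has three.
NSArray *labels = [NSArray arrayWithObjects:
                   [NSArray arrayWithObjects:@"First Name", @"Last Name", nil],
                   [NSArray arrayWithObjects:@"Street", @"City", @"State", nil],
                   nil];
NSArray *keys = [NSArray arrayWithObjects:
                 [NSArray arrayWithObjects:@"firstName", @"lastName", nil],
                 [NSArray arrayWithObjects:@"street", @"city", @"state", nil],
                 nil];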

The entire process I'm developing will be shown in More iPhone 3 Development, but here's a category to make it easier to pull information out of a nested array in case you want to experiment on your own. This category adds a method to NSArray that lets you retrieve the right object from the nested array for a given NSIndexPath. Note that this category, like the table views it was created to support, supports only simple index paths that store a row and a section.

NSArray-NestedArrays.h
//
// NSArray-NestedArrays.h
#import <Foundation/Foundation.h>


@interface NSArray(NestedArrays)
/*!
 This method returns an object contained within an array that is
 itself contained within this array. It is intended to allow
 single-step retrieval of objects in the nested array
 using an index path.
 */

- (id)nestedObjectAtIndexPath:(NSIndexPath *)indexPath;
@end



NSArray-NestedArrays.m
//
// NSArray-NestedArrays.m
#import "NSArray-NestedArrays.h"

@implementation NSArray(NestedArrays)
- (id)nestedObjectAtIndexPath:(NSIndexPath *)indexPath {
    NSUInteger row = [indexPath row];
    NSUInteger section = [indexPath section];
    
    // Guard against a section index that's out of range for the outer array.
    if (section >= [self count])
        return nil;
    
    NSArray *subArray = [self objectAtIndex:section];
    
    if (![subArray isKindOfClass:[NSArray class]])
        return nil;
    
    if (row >= [subArray count])
        return nil;
    
    return [subArray objectAtIndex:row];
}

@end
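
And here's a rough idea of how you might call the category from a table view data source method. Don't take this as production code: labelsArray, keysArray, and editedObject are hypothetical properties, just stand-ins for however you hang onto your paired nested arrays and the object being edited.

- (UITableViewCell *)tableView:(UITableView *)tableView
         cellForRowAtIndexPath:(NSIndexPath *)indexPath {
    static NSString *DetailCellIdentifier = @"DetailCellIdentifier";
    UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:DetailCellIdentifier];
    if (cell == nil)
        cell = [[[UITableViewCell alloc] initWithStyle:UITableViewCellStyleValue2
                                       reuseIdentifier:DetailCellIdentifier] autorelease];
    
    // One call per nested array gets everything we need for this row.
    cell.textLabel.text = [self.labelsArray nestedObjectAtIndexPath:indexPath];
    NSString *key = [self.keysArray nestedObjectAtIndexPath:indexPath];
    cell.detailTextLabel.text = [self.editedObject valueForKey:key];
    
    return cell;
}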

Friday, August 28, 2009

Just a Few More Vacation Thoughts

Wow. I really didn't expect this kind of response. I see that my original Flash rant is up to 29 comments. I'm a little scared to read the ones I haven't read when I finally get back from vacation. I really seem to have pissed off a lot of people. That honestly was never my intention.

To some extent, I regret the "Flash Sucks" part of the discussion. Not because it's necessarily untrue, though - it's my honest opinion of Flash on Mac OS X: it's leaky, crash-prone and, well... it sucks. But I regret it because that statement has detracted from the real discussion that's been going on underneath it all. It seems to me that many Flash proponents are trying to write me off as "just another run-of-the-mill Flash Hater" predicting the demise of Flash "yet again". The same old story, nothing to see here. Move along, move along.

But I'm not a run-of-the-mill Flash hater, and I'm not predicting Flash's demise because I hate it. I'm not even predicting Flash's total demise, just the demise of Flash cum general purpose web-development tool. And my predictions are the result of a fairly thorough knowledge of the history of web development and a reasonably educated guess about where things are likely going next.

And you know what? Flash is the completely wrong approach for a general-purpose web development tool. Flash's current approach is anathema to what has become fairly widely recognized as good web development practices. The fact that Flash is fun or easy, or that the tools are great, doesn't change the fact that it's a fundamentally flawed approach to doing web development.

And, believe it or not, I'm going on about this for the opposite motivation from hate. I'm dwelling on this because I don't want the many excellent Flash and Flex developers I've met to get screwed if the walls do come crumbling down. I can tell you stories about programmers I've known who spent a career working in PL360 or COBOL, only to find themselves without a job and with a skillset that was completely obsolete and not marketable. Take a look at this list and tell me how many languages you recognize. Languages go obsolete all the time. Some of the ones that have gone obsolete were bigger stars in their day than Flash is now.

Right now, we have fifteen years of web development evolution to look back upon. It's now a fairly mature space, and we are at a point where we have a pretty good idea of what things have worked and are sustainable and scalable. We also have a pretty good idea of what things have not worked. And though you might disagree with me, when I look at the web technologies that have survived, the Flash plug-in is an anomaly. If the web is headed where it looks like it's headed, it's hard for me to see how Flash can continue to remain as relevant as it has up until now. That's not hate, that's just reality.

You can do so much more with HTML and Javascript today than you could a decade ago, or even a couple of years ago. With the exception of certain types of online games, the vast majority of what Flash is used for on the web can already be done in HTML and Javascript without waiting for HTML5 (combined, of course, with server-side development of some sort). The situations where Flash is truly necessary are dwindling, and Flash's uptake in the area of greatest growth (the mobile web) has been lackluster at best. "Dismal" is probably a better adjective than "lackluster" to describe the mobile Flash uptake.

There might be a silver lining here, but I don't see it. Adobe's done too little too late when it comes to opening up Flash, and has done too little too late in the mobile realm period. Adobe can send out press releases about "billions of Flash-enabled mobile devices" till the cows come home, but it's simply not reality. It's just marketing types playing with semantics. Companies only mince words when they have something to hide. To a consumer, a Flash-enabled mobile device is one that can access most any Flash content that's anywhere on the web just as if they were at home on their computer. There aren't billions of devices that can do that. There aren't any devices that can do that yet.

And think of this. If Flash somehow maintains or increases its prevalence as a web development tool going forward, I'll have been wrong in a couple of blog postings on the Internet. If I'm right, an awful lot of people are going to be finding their skills less and less relevant and less and less marketable.

I've been wrong before. I can live with being wrong. But what if it turns out I am right? What if it turns out that after 26 years of being online1 and keeping abreast of the technology involved in online communications I actually know something about it? What if these rants aren't the result of hatred but are rather just an honest assessment of Flash's current place in a complex and ever-changing technology landscape? If you're a Flash developer, especially if Flash development is your only marketable software development skill, well… just make sure you can live with me being right. I'll continue to be able to put food on the table if I'm wrong about this. Please make sure you can say the same thing.

Just as an aside: I left my last full-time gig (at PeopleSoft) in 2002. I'm not sure if you remember, but 2002 was a pretty crappy economy. Not as bad as today overall, but it was a pretty rough time for programmers and software consultants because the dot com boom and subsequent bust had created a glut of tech workers. I started my first project as an independent consultant only four days after walking out the doors at PeopleSoft (two of which were a weekend). After that project (knock on wood) I've found steady work with very little unplanned downtime. The work hasn't always been in my preferred language or using my preferred tools, but I've always found work quickly because my most marketable skill is that I have a varied toolbox, an understanding of what tools exist outside of my toolbox, and a willingness to expand my toolbox if that's what I need to do to get a job done right. Languages and environments come and go. Don't tie your fortunes too tightly to something that can become obsolete unless you want to risk becoming obsolete yourself.


1 - I got my first modem in 1983, a hand-me-down acoustically coupled modem from a friend's dad who was a college professor. It was replaced less than a year later with a Hayes-compatible 1200 baud model. I had my first internet e-mail address in 1986 (a FidoNet address through a BBS I called using PCPursuit), and I got my first true Internet account in 1988 (a Unix shell account through my university that I accessed using Kermit). I had my first PPP internet account in 1992: a metered account through a company called Holonet.

Thursday, August 27, 2009

You Guys Rock

A lot of you disagree with me on my Flash assessment and your comments have been great. Unfortunately, I really do need to get back to my vacation. Do please feel free to keep knocking down my arguments in the comments, though.

I want to leave with this definition that I tweeted a minute ago, because this is the crux of my point in the last post. I'm really not hoping Flash dies, but I do fear for the hammer developers in the Flash world because there's at least some chance of the platform becoming obsolete. We can argue all night about how great those chances are, but given the current environment, the chance is at the very least greater than zero.

A "hammer developer" is a developer who has only ever learned one language and development framework. With only one tool in their box, every problem looks like a nail. It's the hammer developer that's responsible for sites like Disney's, where even basic things that HTML has been able to handle for years (like navigation bars) are done in Flash.

Hammer developers are the ones who have the hardest time when a technology does become obsolete. Even if you love Flash, don't be a hammer developer. Seriously. Diversify your toolset, even if you love one particular tool. No toolset is guaranteed to be around forever.

And, yes, hammer developers exist in every developer community. I'm only singling the Flash community out because the platform sits in a unique position at this point in time. Flash has a lot of baggage from a different era and I'm not sure Adobe really understands what must be done for Flash to remain viable. Maybe they do, but are you willing to bet your income on it?

Flash is Dead! Long Live Flash!

I am on vacation.

I did not intend to do a blog post while on vacation, but I feel like I need more than 140 characters to explain my recent twitter rants. We're having a quiet night after several days in the Disney parks, so it's a good opportunity to expand on these recent tweets, since the twitter versions, limited to 140 characters, are evoking a lot of anger.

The Spark


While at Disney World's Magic Kingdom, I wanted to look something up on Disney's website. Navigating to one of the Disney.com web pages using my iPhone resulted in an error page. That's right, I thought to myself, Disney uses Flash for almost everything, don't they.

I'm not thrilled about a company doing that, but that's absolutely not the thing that set me off. It was the fact that Disney's explanation for why I couldn't view their page was that I wasn't using a "Standards compliant browser".

[Screenshot of Disney's error page]


Now, say what you will about Mobile Safari, but one accusation you can't reasonably level at it is that it isn't Standards-compliant. It's based on WebKit, and WebKit fully passes the Acid3 test. It's as standards-compliant as any browser, certainly any mobile browser. It's a hell of a lot more "standards compliant" than IE5 (look at the screenshot).

Disney is, basically, putting the blame on the user for their decision to require Flash on their website, and using made-up statistics to make the user sound like an anomaly. A weirdo. C'mon, be cool like 99.9% of our viewers! Yet, they couch it in language that SOUNDS like good customer service (we want to help). Crikey!

Out of Place


Now, I find this odd. For all of their faults, Disney tends to be incredibly good at not insulting their customers. Some of the best customer service I've ever seen has been within the 47 square miles owned by the Disney corporation in Central Florida. Seriously. I'm not exaggerating here at all. Disney castmembers bend over backwards to accommodate their customers and the company has been incredibly progressive over the last thirty years or so in their treatment of customers. When they do screw up, they're usually quick to apologize and do everything possible to rectify the situation.

I had a situation with park security several years back. The details aren't important, but I was accused of doing something illegal (in front of my children and many, many strangers) in a situation where it would have been physically impossible for me (or anyone) to have done the thing I was accused of. The situation was rectified quickly once I asked for management involvement, and I was sent on my way with a sincere-sounding apology. When I got home from vacation, I received a box of gifts for my children with a hand-written apology from a VP. A few days later, I received another apology by phone and was given specific information about changes that would be made to their training to make sure that same situation didn't happen again.

That relatively minor situation, if handled wrong, could easily have ended with me never wanting to do business with Disney again. But, it didn't, because they handled it right. Whether any of the apologies were sincere or not makes absolutely no difference. They acknowledged that they had made a mistake, apologized to me for making it, and then followed up afterwards to see if I needed anything. It wouldn't be reasonable to ask for or expect more than that. And to their benefit, Disney Management did not blame or scapegoat the castmember, either. They recognized it was their responsibility to train their castmembers for foreseeable encounters, and that they had failed.

As a result, I have been back to Disney World several times since then. I've never had another serious situation myself, but I've become acutely aware of how this company handles customer service, and often notice little details about the way the Disney castmembers treat people that probably go unnoticed by most people visiting the parks and hotels. With only rare exceptions, Disney does a phenomenal job interacting with their customers.

Except online.

When it comes to the online world, they seem to act with a much more typical corporate attitude, one that, if they were a person, would probably be labeled "arrogance".

So, anyway, my inability to look up something on Disney's website from my iPhone while standing in a Disney park led to my recent twitter-rant about Flash. It was a somewhat adolescent but very cathartic outburst that can basically be summed up as
  • Flash sucks
  • Flash is not a standard, and
  • Flash sucks
Oddly enough, these comments seemed to hit a nerve with a lot of people. I probably should have expected that - one of the more prevalent backgrounds in the iPhone developer community is that of Flash/Flex developer, but I really didn't expect people to take it quite as personally as they did. The only other place I've seen people take criticism of a language or development environment so personally is when I've dared to criticize .Net or, before .Net existed, Visual Basic.

The next twenty-four hours or so after my twitter-rant saw a fair number of replies, including several ad hominem attacks and many factually incorrect assertions about Flash being a standard, along with a lot of "the iPhone is FAIL cuz it don't support Flash" kinds of statements.

Let me just clarify my comments a bit and explain them in more rational terms here.

Flash is Dead


I hate to break it to you, but Flash, as it currently exists, is dead. Oh, it's not going to die quickly, it's going to die a slow painful death precisely because there has been such a large investment of time and money into using it by so many large corporations like Disney. Flash's roots run way too deep for it to disappear quickly.

Here's the thing, though: Flash is a product of a different generation of computing. It's a product of a world where 90% of the people used one platform, and the bulk of the rest used another. There was Windows, and there was the Mac. And then Linux gained some popularity and became a viable platform, yet for a long time, Linux users couldn't access Flash web sites. Eventually, Linux got Flash too.

But Adobe never lavished the kind of love on the Linux or Mac versions of the Flash plug-in that they did on the Windows version, and less-popular options were SOL because Flash is a proprietary platform.

And now, the world is changing. People are increasingly browsing the web from mobile devices, and unlike the computer world of a decade ago, the mobile computing landscape is not anything like a monoculture or monopoly. There are several viable mobile platforms all competing in that space. We have the iPhone, Blackberry, Palm Pre, Windows Mobile, Android, Symbian and probably others that have slipped my mind. All of these are operating systems currently shipping on phones and all come with browsers. None of them, except a solitary model of Android phone, has Flash.

Do you think Adobe is hard at work writing Flash virtual machines for every possible configuration of hardware and software that exists in the mobile space? Hell, no! They're not even willing to fix the massive memory leaks in the OS X Flash plugin. No, they're going to wait for one or two clear leaders to emerge and then, if they can, and if their MBAs decide there's a sufficient ROI in doing so, they'll develop Flash for those platforms.

If no clear victors emerge, who knows what we'll see. It's unlikely we'll ever see Flash on the iPhone1. Flash on the Hero is painfully slow and clunky. And, Adobe is also unlikely to ever devote the resources necessary to fully develop and maintain versions for even the platforms they do decide to support. They're going to do the least they can do to say that Flash is part of the mobile web, and that's it.

What I mean by Sucks


When I say that Flash sucks, I'm not talking about ActionScript or the developer tools or anything of that nature. I'm not a Flash developer and never have been. Developing for Flash might be better than receiving oral sex for all I know. When I say that Flash sucks, I'm specifically referring to the leaky, crash-prone implementation of Flash available for Mac OS X. I'm selfish; I rant about things that affect me. And Flash definitely affects me. I'm talking about something that is the polar opposite of a "thin client" - Flash is a client that can suck up 60% of processor cycles on a high-end machine to run a recreation of a 40-year-old arcade game. I'm talking about a technology that is more likely to bring up the SBBD (spinning beach ball of death) than any other piece of Mac software. Flash on Windows is tolerable (barely), but even on a fast Mac, it can be a horrible, horrible experience to have even a single Flash item on a web page. Every Safari crash I've had in recent memory was directly caused by the Flash plug-in.

I'm absolutely not saying that Flash developers are bad people. I'm absolutely not saying they're dumb. I had no intention of saying a single thing about Flash developers at all. My intention was just to point out one of the problems of treating Flash as if it were a "de facto standard" and using it as a general-purpose web-development tool. Flash is controlled by a single company. It's got the same Achilles' heel as every other proprietary solution (including several Apple technologies like QuickTime, iTunes, and the iPhone SDK). For web development, that was sort of okay when there were only three operating systems to write for and one of them had most of the market share.

I'm not even saying proprietary is always bad. I love my proprietary iPhone and Mac, and you can pry them from my cold dead fingers. But I wouldn't ever advocate only supporting the iPhone or the Mac on your web site simply because I love them. The whole point of the web is that it's platform agnostic (or, if you buy into Google's point of view, it IS the platform). When you put something on the web, it should be readable by any device that can get on the internet. Your site should use open standards so that the developers for platforms that can't already view your content have the ability to implement that functionality for their users simply by referring to the standard. You should only use a proprietary option like Flash when you have a compelling reason to do so. Implementing a drop-down menu in your navigation bar is NEVER a compelling reason. If something can be done with Javascript and HTML, you'd better have a damn good reason for doing it in Flash (or any other proprietary solution, for that matter) rather than using standards-compliant tools, and that reason had better not be "it's what our developer knows/likes" or "it was convenient".

Degrade Gracefully


Someone responded to my Twitter rant by pointing out the crux of the original problem: when you put something on the web, it should at very least degrade gracefully. This is just common sense. If you detect that a browser can't support some feature you use, don't assume it's because your user is running old software. It's just as likely that they're running newer software that wasn't around when you wrote your detection algorithm. Don't make any assumptions or implications in your error page. Just describe, in as much detail as possible, what the problem is and apologize for the inconvenience. That's all. You don't need to explain or defend your choice. Treat your virtual customers the same way you would treat real ones in person. If you've made a decision that will inconvenience some of your customers, man up and live with it, don't try to place the blame back on your user for your fracking decision.

The Future of Flash


I understand what causes some of these attacks I've received. I really do: panic. When people point out that Flash is not well positioned for the future of the increasingly mobile web, Flash developers feel a rush of panic. They go into defensive mode. They want those statements to be wrong. They don't like the fact that something they like and have invested a lot of time and energy into might be obsolete in a few short years. This isn't anything unique to Flash. I've seen it before many times. Some people manage to hang on by finding niche work (hell, I know full-time COBOL developers), others (sometimes grudgingly) move to other technologies, while others, like the many NeXTSTEP developers out there who got a second chance with Mac OS X and a third chance with the iPhone, get a reprieve.

Hell, I never thought I'd find a way to code in Objective-C for a living. It was a dying language when I started learning it. I didn't learn it because it would make me money2, I learned it because I saw something in it that I thought was right. I learned it because I wanted to be able to write code that was as good as the code that I saw from the NeXT developers.

What will happen with Flash? Hell if I know. My current level of confidence in Adobe is not very high. The management team there has somehow managed to take a customer base who were rabidly loyal and turn them into customers who feel trapped and desperately want an alternative. This has happened in less than a decade. Talk about spending political capital! Somewhere along the line, Adobe stopped being a company that did, first and foremost, what their customers needed, and instead became a company that looked to make the most money they could with the least expenditure. It's a short-term strategy taught in many business schools (including Harvard) using impressive-sounding phrases like "maximizing shareholder value". Yet, it's a strategy that anyone with any common sense (aka not an MBA) knows is completely and utterly moronic. In the long-term, rabidly loyal fans are far better than great salespeople. They're better than good advertising campaigns, slogans, or even Superbowl ads. They're better than product placement in a summer blockbuster.

And you can't buy them for any price.

Any management team that can do what Adobe's has done in the past ten years truly deserves to die. Frankly, I wouldn't bet on them doing the right thing in any particular situation, including this one.

But, that doesn't mean there's no hope. There are many ways that Adobe could save Flash/Flex for the mobile world. One way would be to create something like Google's GWT - an environment where some or all of the code gets translated into HTML and Javascript to be run on the client, leaving to a VM only those tasks that can't reasonably be handled that way.

With the determination to do it, and the willingness to recognize that the world has, indeed, changed, Adobe could future-proof Flash/Flex code. It would be a hell of a first step back to having rabidly loyal fans. As an aside, a Carbon-free, 64-bit clean, GCD-enabled Photoshop would be another big step in that direction.

Learning is Cool


But, even if Adobe continues to be Adobe, all is not lost. You may like a lot of things about Flash/Flex and ActionScript, but learning a new language and new frameworks is very possible. In fact, it's a lot of fun. It's an adventure. The really hard, brain-bending stuff is the conceptual stuff, much of which you've already got worked out from learning to develop in ActionScript. Heck, don't even wait for Flash to die! Cross-training is good for developers, and you should look forward to an opportunity to see how different languages and frameworks have solved the same problems. You'd be surprised at how much you can use from other languages when writing code.

And if you face the prospect of learning a new language with more than a little trepidation, maybe software development is not the right line of work for you. And I mean that very seriously.



1 - If Adobe manages to come up with a 64-bit clean, GCD-enabled Photoshop, then all bets are off.
2 - A joke from a few years back that I heard from Mike Lee jumps to mind here. What's the difference between a Cocoa developer and a large pizza? A large pizza can still feed a family of four.

Friday, August 21, 2009

To Autorotate or Not to Autorotate

Toby Joe Boudreaux has a great post today about one shortcoming of autorotation on the iPhone, which shows up when you try to use an application that supports autorotation while lying down. In that scenario, the phone often doesn't do what you expect it to do and it can be a pain. Toby Joe shows how to allow your users to turn off autorotation from within your app.
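
The general idea - and this is just a sketch of the approach, not Toby Joe's actual code, with a made-up preference key - is to have your view controllers consult a user preference before agreeing to rotate:

- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation {
    // If the user has locked rotation in our settings, stick to portrait.
    if ([[NSUserDefaults standardUserDefaults] boolForKey:@"rotationLocked"])
        return (interfaceOrientation == UIInterfaceOrientationPortrait);
    return YES;
}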

iPhone Sketch Book Mini-Review


Well, only one day after finding out about the iPhone Sketch Book, one showed up in the mail. Very nice. The stencil I ordered a week and a half ago still hasn't shown up, so Kapsoft, the creators of the iPhone Sketch Book, are already getting brownie points for being prompt.

The sketch book itself is quite nice. The cover design is professional looking, although the pink eraser and #2 pencil seemed out of place to me. Personally, those objects tend to make me think of elementary school as opposed to professional design. I probably would have opted for a mechanical pencil myself, but that's a tiny quibble if ever there was one. I know many people do, in fact, use #2 pencils and rubber erasers to do their designs.

The binding is a wire coil, which means the book will lie flat on any page - a nice feature. And because it's a wire coil rather than a plastic comb, you don't have problems with pages coming loose.

The individual pages are nice. I'll be honest with you, I've been a sucker for graph paper since I was a kid. I loved it even before I started playing Dungeons & Dragons in the late seventies, and that hobby really reinforced my feelings for it. It's been years (decades, actually) since I've mapped out a dungeon on graph paper, but I still love graph paper, so I was thrilled to see that the background of each page is light-blue quad graph. And it's not that horrible 1/4" grid stuff, either. The grids are nice and small. And my love of graph paper isn't completely irrational. It gives you both vertical and horizontal guides for drawing shapes and letters, which lets someone with poor handwriting (like, say, me) create something that's not completely horrible looking, which is nice if you're going to show something to a client.

The center of each page is an illustration of an iPhone drawn exactly to scale, which can be good or bad depending on your ability to draw small. There's a status bar at the top and a black line at the bottom; the rest of the space is blank, ready for your designs.

And here's my one real quibble with the notebook. The status bar and the line at the bottom (which I assume is for a toolbar - it's too small for the Dock or for a tab bar) should have been drawn in non-photo blue like the background grid rather than in black. In any given application, a substantial portion of my views are unlikely to use a tool bar, and some applications won't show the status bar. Ideally, there should be light blue guides for the tool bar, tab bar, nav bar, and status bar, but no black inside the iPhone's screen. It's not a huge deal, but I would definitely recommend that version 2.0 remove all black lines inside the screen space and replace them with a light blue or light grey guideline that won't draw the eye if not used.

Other than that, though, the paper is thick and high quality, and the whole package gives off a professional air. I would not have any hesitation in showing a client designs from the sketchbook whatsoever. The book is thinner than the clipboard I currently use, so I plan to put my copy to good use.

Other recommendations I have for future versions of the iPhone Sketch Book: a version with the phone at 1.5 scale so it can work with the stencil, and one with the iPhone illustration only on the right-side pages, leaving the left side as just graph paper so that the developer has more space for notes about the view.

Another Pimp My Code

Wil Shipley has another installment up in his Pimp My Code series of blog postings. Wil's blog is on my very short "must read" list, so I'm heading off to read this one. You should too.

Yes, I know that on a few subject areas (unit testing, use of the oxford semicolon) Wil argues against the widely accepted view, but Wil's been coding Objective-C almost as long as anybody, so it's always worth reading what he has to say, even if people occasionally disagree with some of it.

Thursday, August 20, 2009

Property-List Driven Detail Editing Pane

A while back, I discussed using a property list to drive a table-based detail editing pane. I had intended to use this code in a sample project in More iPhone 3 Development. I still feel strongly that this is a good approach, and when I have some time, I plan to work on it more. However, from a pedagogical standpoint, it wasn't working for me. All of the code I needed to demonstrate was getting moved to generic classes and the main point was getting lost in a confusing mess of classes.

Therefore, since it's not going to be used in the book, I'm releasing the code for people to look at and use (no restrictions or requirements). Please note, however, that this is NOT production quality code, so caveat emptor. If you want to use this in a real application, expect to sink some time into extending and debugging. In the long run, though, I think you'll be happier with it than with maintaining huge sets of nested switch statements.

You can find the project here.

Interesting...

Here's an interesting post on how to get an iPhone-like interface in your WinMo application. It seems like a lot of work to get the functionality you'd get for "free" if you created an iPhone application. My first thought was that I didn't think it was a good sign for the Windows Mobile community that an article like this even exists. It felt like SDK-envy to me.

But then I thought about it some more. As somebody who spent years programming for my non-preferred platform, I really do applaud an effort to work around the limitations of a platform in order to create something that's better, even if it takes a lot more work. I also applaud the willingness to borrow good things from other platforms, as long as the goal isn't to be "as good" as the other platform. Shooting for "as good" never results in greatness, but recognizing that there is a better way can be the first step toward it.

In the long run, maybe it's a really good thing for the WinMo community that you have people who want to create apps with interfaces that are better than the standard WinMo user interface and are willing to put in extra effort to do it. There are definitely signs of life in the WinMo developer community. And, if you think about it, every .Net developer out there is a potential WinMo developer, so with some smart non-Ballmer-esque management decisions, Microsoft could capitalize on that base of developers and the apparent desire of at least some of them to create something really good to catapult WinMo solidly back into the fray.

Alas, the chances of any non-Ballmer-esque management decisions are not particularly good, but one can hope.

Tuesday, August 18, 2009

Nice iPhone Application Idea.

I don't often post about specific non-developer iPhone applications, but this is just too neat not to mention. Very nice, and a great price to boot.

iPhone Sketch Book

I just found out about the iPhone Application Sketch Book, a product aimed firmly at iPhone developers.

It looks like I may get a chance to review one of these, so probably when I get back from vacation I'll post my impressions of the actual product.

It's going to get bonus points if the template matches in size the iPhone Stencil Kit I ordered last week.

(yes, I like doing designs using analog tools, sue me)

Blocks on iPhone

I really do need to get back to writing, but here's just one more thing that came out of last Thursday's meetup. We talked a little about when some of the cool Snow Leopard features like Blocks and Grand Central Dispatch might be coming to the iPhone.

Again, I'm purely speculating, but I think we will see blocks sooner rather than later, maybe in iPhone SDK 3.5. Grand Central Dispatch is harder to predict. Although there is some value in it now, since the iPhone has two processors - a CPU and a GPU - the real power of GCD comes with even more processing cores, and I have no idea if or when we'll get a multi-core iPhone.

But, blocks? Blocks on the iPhone would be cool, and you know what? You don't have to wait.

Sure, by using this, you don't get block support in the Foundation and UIKit classes, but you do get blocks, and you can always create categories to add block-based methods to existing classes, so you might want to check it out.
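
Just to illustrate what I mean by using a category to add a block-based method to an existing class, here's a quick sketch. This assumes your toolchain actually supports blocks, and the method name is my own invention, not something from a shipping SDK:

@interface NSArray(BlockEnumeration)
- (void)enumerateWithBlock:(void (^)(id obj, NSUInteger idx))block;
@end

@implementation NSArray(BlockEnumeration)
- (void)enumerateWithBlock:(void (^)(id obj, NSUInteger idx))block {
    // Call the block once for each object, passing along its index.
    NSUInteger idx = 0;
    for (id obj in self)
        block(obj, idx++);
}
@end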

On iPhone Competitors

At the New York City iPhone Meetup last week, Steve Kochan and I were tasked with talking about the future of the iPhone and the iPhone SDK. I'm not sure either of us were particularly qualified to speak on that. I'm not being modest; I'm not sure anybody outside of One Infinite Loop is qualified to speak about that, and anybody who is qualified is almost certainly not allowed to speak about it. But everybody's got ideas of where things should go and where they might go, so we talked for a little while about some of the things we might see in a hypothetical iPhone SDK 4 or in the next generation of iPhone OS devices.

First of all, I hope nobody took anything I said as gospel. Although I do know a small handful of people inside Apple, none of them are close enough friends that they'd be willing to risk their jobs by giving me secret information, and if any of them did give me secret information, I wouldn't stand up in front of a group of people and spill that secret. I do NOT know what will be coming out when. I don't know if there will be a tablet, or what it will look like. As far as hardware is concerned, I don't know what the future holds at all, and as far as software is concerned, I don't know all that much that hasn't already been made public. I don't know that we'll get garbage collection or, at least, I don't know when we'll get it. I know Apple is capable of delivering a tablet. I highly suspect that they've prototyped several ideas, but Apple prototypes lots of ideas that are never turned into actual products. I know that there are indications of a move toward a completely resolution independent OS, but that's not a secret. Apple has been moving that way for quite some time in the Mac OS, since even before the iPhone OS was forked off.

So, just for the record, anything I said last Thursday night was pure speculation. Do not buy stock based on it, or spread anything I said as if it came from somebody who actually knows something.

And with that out of the way, I noticed that John Gruber has tackled one topic we discussed at the end of the night, which has to do with challengers to the iPhone throne. John's specifically speaking to the Android platform in his posting while we were talking about a handful of platforms, including Palm, RIM, Android, and Microsoft, but a lot of what he says echoes what I said during that discussion. And John has a much, much better track record than I do with regard to making predictions about technology and specifically about Apple, so for that part of the discussion, at least, I feel like I was on fairly solid ground.

Workshop Done

I am back home, at my desk for the first time since last Thursday.

The workshop went pretty well. Being the first time I had taught a class of that length, it was a bit stressful. It ended up being even more stressful than I had anticipated. A combination of a really, really bright group of students and the fact that we provided the exercises in digital rather than printed form meant that exercises that should have taken an hour often took fifteen minutes. Because they were smart students, they absorbed material quickly, and because the instructions were provided digitally, they were able to copy and paste the code portions of the instructions, which is not only faster than having to type in the code, but also has a considerably lower possibility of mistakes. Frankly, we learn a lot more from making mistakes than we do from having things work right the first time.

For the future, I will probably use printed exercises. I think the way most people learn, something that can be copied and pasted bypasses the type of brain processing that moves stuff from short-term memory to long-term memory. Now, I'm no psychologist, so I could be making that up, but I know that in my personal experience, when I've been able to just copy, paste, and then tweak somebody else's code rather than figure it out myself, the next time I needed to do it, I didn't really remember it.

All things considered, though, I felt like it went pretty well and I hope the students felt it was valuable and worth attending. I did run out of steam toward the end of the day Sunday, having only gotten about two hours of sleep each of the two previous nights (thank Dog for caffeine!), but we covered a lot more material than I ever anticipated over the course of three days. I think that the Keynote presentation I have right now is more honestly a week's worth of material for a more typical class working from printed exercises.

Now of course, I'm scrambling to catch up even more than I was before. My inbox has really piled up, and I'm even further behind on the book. On top of that, our annual family vacation is coming up next week, so you probably won't see a long tutorial post for the next two weeks.

Wednesday, August 12, 2009

Boot Camp Imminent

Well, it's now just under 48 hours until I teach my first workshop. I'm a little harried right now. I'm comfortable teaching and speaking in front of groups, but I've never taught anything of this scope before, so I underestimated the prep time involved. I haven't slept much this week, and the situation doesn't look like it'll get much better before Friday.

If you e-mail me, IM, or tweet me in the next few days, please don't be offended if you don't get a response. I will try to catch up after the weekend and will be responding only to urgent e-mails in the meantime. I'm going to be mostly shutting out the world in an effort to get everything done and make sure that I'm satisfied with the workshop materials.

Despite the stress, I'm actually very happy with what I've gotten done so far. I think I'm really starting to understand what makes iPhone programming hard for both new programmers and experienced programmers coming from other languages. I guess we'll find out in a few days if I'm right about that, though.

Monday, August 10, 2009

OpenGL ES Update

I know there's probably one or two people out there who would like to see another entry in the OpenGL ES from the Ground Up series.

The good news is, I started one a while back, right after WWDC. It's a fairly extensive introduction to OpenGL ES 2.0 and shaders.

The bad news is, I have no idea when I'll have time to finish it. I'm behind on writing the next book, plus I'm frantically trying to get ready for the iPhone Boot Camp NYC this weekend (boy, that snuck up on me). I will finish it at some point, though.

My Last Word on Dot Notation

The dot notation discussion has taken far more of my time lately than it was wise for me to spend. I've said pretty much all that I need to say on the topic at this point. I don't care if you use it, just don't tell me I'm wrong for doing so.

Just for the record, I was horribly opposed to Objective-C's dot notation when it first came on the scene. I had programmed in both C++ and Java over the years, but I much preferred Objective-C, and I saw dot notation as being a step backwards and a really bad idea for a number of reasons.

For the book, Dave and I decided we were going to follow Apple's lead when it came to coding style and coding conventions, so I bit the bullet and started using properties for the book exercises. Now that I've used them regularly for about a year and a half, I have completely changed my opinion, and I'm pretty sure a lot of my original issues with dot notation were just rationalizations of my own resistance to change.

Arguing theory has only so much merit. In practice, with a little thought and understanding, properties and dot notation work well and can be used to make code considerably more readable and faster to write. If you haven't spent some real quality time using them - if you are arguing only from language theory - then you need to spend some time with them before you condemn them further.
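
If you haven't used them, here's roughly what I'm talking about. The class and property names are made up just for illustration, but a declared property gives you both of the equivalent forms shown at the bottom:

@interface Person : NSObject {
    NSString *name;
}
@property (nonatomic, copy) NSString *name;
@end

@implementation Person
@synthesize name;
@end

// Elsewhere, in some method:
Person *person = [[Person alloc] init];
person.name = @"Anna";       // dot notation...
[person setName:@"Anna"];    // ...is compiled into the very same message send
[person release];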

Sunday, August 9, 2009

How to Use Dot Notation and Properties

Since we're talking so much about dot notation, I thought I should link to a fabulously awesome blog post by Chris Hanson that tells how you should use dot notation and properties.

Dot Notation Redux: Google's Style Guide

Before I get into this post, let me make a few things absolutely clear. I do not want my intentions misunderstood.
  • When coding for yourself, do what feels right to you. If you don't like dot notation, don't use it, and don't feel like you should apologize for not using it.
  • When coding for a client or employer who has a coding style guide or other published coding conventions, use those, even if they disagree with your personal opinion of the "right" way to code. In a group programming environment, consistency is extremely valuable.
My goal here is not to tell you that you must or should use dot notation, it is only to refute the idea that dot notation shouldn't have been added to the language and that it inherently makes code harder to read.

My illustrious writing partner, Dave Mark, tweeted today about the Google Objective-C Style Guide's argument against using dot notation in Objective-C, which reads as follows:
  1. Dot notation is purely syntactic sugar for standard method calls, whose readability gains are debatable. It just gives you another way to make method calls.

  2. It obscures the type that you are dereferencing. When one sees:
    [foo setBar:1]
    it is immediately clear that you are working with an Objective-C object. When one sees
    foo.bar = 1
    it is not clear if foo is an object, or a struct/union/C++ class.
  3. It allows you to do method calls that look like getters.
    NSString *upperCase = @"foo".uppercaseString;
    which is not only confusing, but difficult to spot in a code review.
  4. It hides method calls.
    bar.value += 10;
    is actually two separate method calls (one to set and one to get) and if your properties are not simple you may find a lot of work being done in a hidden manner.
As you read through these, they sound rather logical, and possibly even compelling. But in reality, they are not logical at all. In fact, the whole argument is basically one series of logical fallacies. Let's look at the specific arguments in order and then put the pieces together at the end.

The First Argument: the Non Argument

Dot notation is purely syntactic sugar for standard method calls, whose readability gains are debatable. It just gives you another way to make method calls.
This first "argument" contains no actual argument against the use of dot notation. The first part of the first sentence is taken almost verbatim from The Objective-C 2.0 Programming Language on Apple's site and is just a restatement (out of context) of how dot notation is implemented.

The second half of the first sentence is an attempt to discount one of the benefits of dot notation by simply dismissing it offhand without evidence or support.

The second sentence is simply an attempt to bolster the arguments that follow by trivializing dot notation as "just" something we can already do. It's sort of like saying that jet engines do not add value over propellers because they're "just" another way to create thrust. Every construct in any programming language that's higher-level than assembly is "just" another way to do something we can already do. This sentence has no semantic or logical value, it's simply here to set a negative tone toward the use of dot notation without actually offering any facts or reasons not to use it. This first "argument" is rhetoric, nothing more.

The Second Argument: the Invalid Assumption

It obscures the type that you are dereferencing.
This argument brings to mind the arguments for Hungarian Notation. The argument for Hungarian Notation is that when you look at a variable, you know right away what it is. By prefixing every variable with a jumble of individual letters, each with its own meaning, you know (in theory) all that there is to know about that variable just by glancing at it.

In reality, you don't see much Hungarian Notation these days. Variables with semantic meaning - those that use words that are recognizable by and have meaning to the brain - work much better. We may not know the variable's specific type, but we know what purpose it serves, which is really more important.

Dot notation doesn't "obscure" the type you are dereferencing unless there's some reason why you would know the type from looking just at that line of code. This argument makes the assumption that we already know and that we should know what the type of foo is. Sure, with bracket notation, we know we're dealing with an object, but we don't know what kind of object it is from looking at this one line of code in a vacuum.

But, when do you ever look at a line of code in a vacuum? You don't. Code has no meaning taken out of context. If it was vital that we know everything about a variable from looking at it out of context, then we'd all be using Hungarian Notation. Yet we're not.

Somewhere, probably not more than a few lines above
        foo.bar = 1
is either a declaration of foo or the start of the method. If you're confused about the type, generally scrolling up a few lines can resolve that confusion. If that doesn't work (because it's an instance or global variable, for example), command-double-clicking on it will take you to its declaration and then you'll know its type.

You can't obscure something that you don't have a reason to know. The amount of information that bracket notation gives us over dot notation is trivial and not enough to make an informed decision about what the code is doing anyway, so you have to consider its context. If it's not your code, you have to look at the preceding code to understand it anyway.

The Third Argument: the Red Herring

It allows you to do method calls that look like getters.
Allows? This argument is that it's bad because it "allows" you to do something? And what it allows you to do is create method calls that look like getters? What are getters? They are a kind of method, right? Am I missing something here?

Any programming language, to be useful, has to allow some kinds of bad code. I doubt it's possible to create a programming language that doesn't "allow" an inexperienced programmer to do all sorts of completely horrible things. I could come up with dozens of examples of ways that Objective-C 1.0 "allows" you to do bad things. This isn't an argument, it's a one-line example of bad code that's being passed off as an argument. It's disingenuous because there's nothing to prevent you from creating methods that look like getters but aren't without dot notation. There's no language-level constraint on that in Objective-C, and no compile-time checks for it regardless of whether dot notation is used. Dot notation changes this in no way whatsoever.

I actually find it hard to believe that an experienced Objective-C programmer would even attempt this argument because, frankly, it sounds like an argument you'd get from a C++ programmer. Objective-C is a permissive language. It's in Objective-C's DNA to let you do things. It's weakly typed and handles at runtime many things that are handled at compile-time in C++ (and all other OO languages based on the Simula object model). These are intentional design decisions. This language is designed to give you a lot of flexibility and puts trust in the developer that you'll use its features appropriately. Objective-C's dot notation doesn't run contrary to that in the slightest. In fact, it's a logical extension of that underlying philosophy. They're faulting dot notation for something that's inherent in Objective-C.

The Fourth Argument: Missing the Point

It hides method calls.
Why yes, yes it does. The sample line of code supporting this "argument"
        bar.value += 10;
will result in exactly the expected behavior if you're using dot notation to represent object state. If the value and setValue: methods are something other than an accessor/mutator pair, then it is true that this line of code could cause unexpected results, but the fault for that lies not with dot notation, but rather with a programmer who made extremely poor method naming choices, essentially lying to the world about their methods by not respecting the naming convention for accessors and mutators. Under this scenario, you'd have exactly the same problem with this line of code that doesn't use dot notation:
        [bar setValue:[bar value] + 10];
In other words, this argument is only a problem when somebody does bad things with their code, and it's just as much of a problem when not using dot notation.
Whoops! It was pointed out in the comments that I sorta missed the point on this one, and that the "problem" is that there are two method calls when someone who didn't understand dot notation might reasonably think there was only one. My response to that is: so what? How is it a problem if the result is correct? The code used to illustrate the problem will achieve the results that you should reasonably expect. After the line of code, value will be correct. The fact that there are two messages sent and not one will not matter in the vast, vast majority of situations. What counts is that the result is correct, and in the example, it would be, assuming the accessor and mutator are correctly implemented. If you're having performance problems and determine through profiling that it's caused by the extra message, then you optimize by implementing a method that increments the value in just one call. It's still a non-issue.
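
And if profiling ever did show that the extra message send mattered, the fix is a one-liner. Something along these lines - the method name and ivar are hypothetical - collapses it into a single call:

// A hypothetical single-call incrementer, for the rare case where the
// get-then-set pair actually shows up in a profile. Assumes an instance
// variable named value backs the property.
- (void)incrementValueBy:(NSInteger)amount {
    value += amount;
}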

Illusory Arguments

The law has an interesting concept called an illusory promise (or condition): a promise that really isn't a promise at all. It's something that looks like a promise, and is couched in the language of a promise, but which simply doesn't bind anyone to anything.

These arguments against dot notation in Google's Objective-C Style Guide are illusory arguments. The first one isn't an argument at all. The second rests on an assumption that is provably untrue (that you can tell what type a variable is just by looking at the line of code that uses it). The remaining two are predicated on a programmer doing something wrong, and both can be demonstrated just as easily without using dot notation.

Google makes the case that dot notation is bad because it can result in confusing code when a developer pays no attention to established naming conventions or makes really poor design choices. But those problems have nothing to do with dot notation. Poorly written code is poorly written code. The simple fact of the matter is, if you're trying to read code like that, nothing is going to help. With or without dot notation, the code will be hard to read because it's bad. The correct solution in that situation is to fire or train the developer who wrote the offending code.

How I Use Dot Notation


There are, however, ways in which dot notation can be used to make code more readable. The way I use it (picked up from Apple's sample code) is to use properties and dot notation to represent object state, and bracket notation when calling methods that represent behavior or trigger actions. In fact, it could be argued that using bracket notation for both state and behavior has at least as much potential for causing confusion as using dot notation does. Take this line of code, for example:
        [foo setIsOn:YES];
Am I setting state or triggering behavior? It could be either. It could be both. To know for sure, I have to check the documentation for the method being called. If, on the other hand, I've used dot notation and properties to separate out state from behavior, it's instantly understood that
        foo.isOn = YES;
is setting state, but
        [foo turnOn];
is triggering behavior (which may, of course, affect state). Yes, this requires consistent design choices, but all good coding does. If you throw that out as a requirement, you can argue that anything is bad.
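Here's what that convention looks like in a class declaration (Switch, isOn, and turnOn are hypothetical, standing in for the foo used above): state is declared as a property and read or set with dot notation, while behavior is an ordinary method called with brackets.
        @interface Switch : NSObject {
            BOOL isOn;
        }
        @property (nonatomic, assign) BOOL isOn;   // state: foo.isOn = YES;
        - (void)turnOn;                            // behavior: [foo turnOn];
        @end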

Arguing against letting people use dot notation because some people will make poor decisions is ignorant. It's basically saying "I don't like it, so you shouldn't use it", and when you state it like that, it sounds kinda silly, doesn't it?

Friday, August 7, 2009

The Dot Notation Controversy

Sorry! It was Joe Conway, not Aaron, who wrote the post. I've corrected the article below. The salient points are unchanged, but my apologies for the mis-attribution.

I knew that some developers didn't particularly care for Objective-C 2.0's dot notation, but I didn't realize how strongly some of them felt about it. Joe Conway of the Big Nerd Ranch has a very strongly worded post about the horrors of dot notation.

I have a lot of respect for Joe and the Big Nerd Ranch, and reading his post, I understand his complaints. He has identified some situations where dot notation can lead to confusion. In practice, however, I don't personally find the use of properties confusing in the slightest, and I think telling people to never, ever, ever use them is a misguided imposition of a personal preference rather than sage advice. If there is anything close to a consensus on dot notation in the Objective-C developer community, it is that dot notation is a positive addition to the language.

Although dot notation and properties are orthogonal in terms of how they are implemented, I tend to think of them as working hand in hand. I always use dot notation for properties and bracket notation the rest of the time. Instead of making my code harder to read, I personally find that it makes it much easier. The only possible point of confusion I've had with this approach comes when accessing struct or union members owned by properties, but in practice it just isn't a meaningful problem for me, and it's offset by the benefit of separating exposure of state from exposure of behavior, something that had no language-level support in Objective-C prior to 2.0 and has no language-level support at all in most OO languages (though C# does, so kudos to Microsoft for that).
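For what it's worth, that struct wrinkle looks like this (assuming myView is a UIView, whose frame property really is a CGRect):
        myView.frame.size.width = 200.0;   // won't compile: the frame property returns
                                           // a copy of the struct, so there's nothing
                                           // to assign to

        CGRect frame = myView.frame;       // instead, copy the struct out,
        frame.size.width = 200.0;          // modify the copy,
        myView.frame = frame;              // and assign the whole struct back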

My only issue with dot notation, if you could call it an issue, is that since it doesn't mean exactly the same thing as it does in many other OO languages like Java or C++, it can be a bit of a stumbling block for experienced programmers who are new to Objective-C (and with the iPhone, there are a lot of those lately). It can be difficult for these people to make the transition because they subconsciously transfer their understanding of dot notation to Objective-C and think they understand something that they don't. The result of that is typically memory management problems that frustrate the hell out of them. But that's not really an issue with the language, it's just a training issue and hardly an insurmountable one. There are many happy, well-adjusted former C++ and Java programmers among the iPhone developer ranks.
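The stumbling block usually shows up as something like this (a sketch assuming a retain property and Objective-C's manual retain/release memory management; name and newName are hypothetical):
        @property (nonatomic, retain) NSString *name;

        self.name = newName;   // sends -setName:, which releases the old value
                               // and retains the new one
        name = newName;        // assigns the instance variable directly: nothing is
                               // retained or released, so the old object leaks and the
                               // new one can be deallocated out from under you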

So, keep in mind that although there are a few well-respected and knowledgeable developers who strongly dislike dot notation, there are many more well-respected and knowledgeable developers who use and like it (and I do too). You should definitely read Joe's post and give his opinion weight. He has been working with and teaching the language for quite some time and he's a really, really smart guy. But keep in mind that what you're reading is just one opinion.

Translations and Xcode

Chris Hanson has a really handy post today. In the localization chapter of Beginning iPhone Development, we mentioned that Apple recommended using the ISO two-letter language codes for your localizations, but that Xcode used the older-style language codes for your development base language. Chris shows how to work around this inconsistency.

Wednesday, August 5, 2009

Multi-Row Delete in 3.0

The networkpx Project Blog has an interesting post on doing multi-row delete under SDK 3.0. It's a good post, even if they do credit the excellent Cocoa with Love blog for "introducing" a technique that I demonstrated three months earlier.

Anyway, the ability to do multi-row delete is now built into UITableView starting with SDK 3.0, meaning you can now implement multi-row delete with just a few lines of code. Yay.

Or, perhaps not.

Unfortunately, using this functionality requires you to return an undocumented UITableViewCellEditingStyle value from tableView:editingStyleForRowAtIndexPath: in order to turn this feature on. To fully utilize the functionality, you have to use and override other undocumented, private methods.
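For the record, the technique being passed around looks roughly like this. This is a sketch of reported, undocumented behavior, not anything Apple documents, so treat it as an assumption that could break (or get your app rejected) at any time:
        // Returning the two documented editing styles OR'd together (an undocumented
        // combination) reportedly switches the row into multi-selection editing mode
        // under SDK 3.0.
        - (UITableViewCellEditingStyle)tableView:(UITableView *)tableView
                   editingStyleForRowAtIndexPath:(NSIndexPath *)indexPath {
            return UITableViewCellEditingStyleDelete | UITableViewCellEditingStyleInsert;
        }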

Technically, doing that in an application submitted to the App Store violates your SDK agreement and is grounds for having your application rejected.

Will your apps be rejected if you do it? Who knows? Maybe yes, maybe no. It's even possible that people using the older, manual technique will get their apps rejected for using private methods even though that technique doesn't use any, similar to the Coverflow debacle from last year.

Now, I'm not going to advise you whether to use this functionality in your apps. It's a risk, and you have to decide how risk tolerant you are. I am going to advise that if you have any desire to use this functionality at all, go now and open a bug report with Apple requesting that they make the multi-row delete functionality available to developers.

Tuesday, August 4, 2009

NinjaWords

John Gruber says pretty much all that needs to be said on this. Not a sign that things are heading in the right direction, I'm afraid.

Monday, August 3, 2009

A Mac App Store

Back before WWDC, one of my long-shot predictions was the creation of a Mac App Store following the same business model as the iPhone App Store. It didn't come true, and now, with the various issues surrounding the App Store, I no longer think it would be a good idea. I've come to the conclusion that I like not having Apple as the gatekeeper for what apps can do.

But it's nearly impossible to argue that the App Store isn't convenient. It has its issues, but it's a great idea, which is why the App Store is flourishing despite all the negative press.

Now, a third party called Bodega is extending the idea to Mac applications, and it's a snazzy little application.

[Screenshot: Bodega's "Featured" storefront view]
In some ways, Bodega outdoes Apple's iTunes Store. The interface is clean and easy to use, and it's filled with lots of little touches reminiscent of the Delicious Generation. For example, when the "Featured" option is selected and you move the window around, the little hanging signs you can see in the picture above swing with gravity and creak as they move. It's an unnecessary, yet completely satisfying little touch.

This project is still in its infancy, but it shows a lot of promise and it's one I'm definitely going to keep an eye on.

Me, the Hypocrite Apparently

Today, I saw somebody tweeting about signing a petition to get DRM off of the Kindle. I tweeted that the best way to "vote" on something like that is with your dollars. If you buy a Kindle and then sign a petition about the DRM on it, you haven't really given Amazon much motivation to stop using DRM. Since you bought it, you obviously didn't care enough about DRM to not purchase it, so if Amazon doesn't stop using DRM… well, they've already got your money and probably will continue getting your money in the future. At least from the point of view of an MBA, such a petition is essentially meaningless; it has about as much significance as the mewling of kittens. Money talks more loudly than petitions.

Now, I consider what I tweeted to be a truism. I wasn't particularly intending to take a stand on DRM by posting it. I was just stating a simple fact: If you feel strongly about DRM (or anything else), you shouldn't buy a product that uses it because your dollars are the only meaningful way of voting when it comes to for-profit corporations. If you want a DRM-free Kindle, you should refuse to buy a Kindle until Amazon stops using DRM.

Generally speaking, I'm not a big fan of DRM, but I'm not much of an ideologue, either. I'm practical about the whole thing. With some products, the benefits outweigh the potential harm of DRM; with others, they don't. With the iPhone, for example, the convenience and benefits outweigh the negatives by a long shot. Sure, I would rather the iPhone didn't have DRM, but I'm not going to stop using it because it does (although it is almost certainly true that I buy less music and fewer movies because they are delivered with DRM). And, yes, the threat of having purchased apps deleted by an un-appealable decision by some unnamed person at Apple is a real concern, but in reality, it's not enough of a threat to make me give up my iPhone. Maybe someday it will be, but today it's not even close.

There are other products, like the Kindle, where, for me, the DRM tips the scales enough in the other direction that I don't buy the product. It doesn't offer enough of a benefit to me personally to offset the threat of, say, having my books deleted behind my back. Obviously, based on how well the Kindle has sold, the value proposition of the Kindle is different for me than it is for a lot of other people.

The long and short of it is that DRM is just one factor among many that I consider when making a purchase decision. Or, in other words, I'm a pretty standard geek consumer.

Anyway… a little while after I tweeted that, I was basically called out as a hypocrite for making that statement because my publisher puts DRM on the electronic versions of my book. The implication being, it would seem, that you can't say anything about DRM unless you give away all of your creative output for free? Yeah, that makes sense.

Frankly, I do not like being called a hypocrite. Over the last two years, I have probably given away the equivalent of at least a thousand hours' worth of my time (and probably much more) writing tutorials and sample code for my blog. Additionally, I spend at least an hour a day helping other people with programming problems over e-mail, IM, and Twitter, none of which I get paid for. Now, I enjoy doing all that, but idiots like this "John Doe" sometimes make me really regret the time I've spent in that way. If I had used all that time doing contracting work or writing an additional book, I would be considerably wealthier right now.

Here's the simple truth of the matter: I have no input into how Apress delivers their eBooks. I have no contractual right to control that part of the process. I wouldn't be tempted to do so anyway, but if I did decide to make some ideological stand on DRM by, say, refusing to finish my current writing obligations unless Apress stopped using DRM, Apress would likely (and rightly) hold me in breach of contract. It's unlikely to the point of being ridiculous that they would stop using DRM as a result of one author being an asshole.

To state it simply, I'm an author, not a publisher, and just one of literally dozens (maybe hundreds) of Apress authors. Heck, I'm not even the only author on the books I work on! I have a great relationship with Apress and I very much like the people I interact with there. But I'm not them. And yet I still have opinions. Fancy that.

iSimulate

One of the difficult aspects of making an iPhone program is marketing it. One thing that helps is a professional quality video of the program in action. This is especially true for games. However, since the iPhone Simulator doesn't have access to the accelerometer, camera, or the full capabilities of the multi-touch screen, it can be difficult to get a professional looking video.

iSimulate (App Store Link) is a great option for developers. It's a standalone program that runs on your iPhone and sends the phone's inputs to the simulator. This means you can run a program in the simulator and control it by touching or tilting your actual phone, which lets you use a screen capture program to record flawless video, even if your program uses features not available in the simulator.

Right now, you can get iSimulate for $3.99, but only for today. Tomorrow, the price rises to $7.99 until August 8, then increases to $15.99 until August 16, when it reaches its full price of $31.99. Although $32 is still a small price to pay for a professional-quality video, if you're interested in the program, you can save yourself some money by buying now.

Sunday, August 2, 2009

This Concerns Me Greatly

Yes, I'm a little behind. I've been gone for a week, so this is probably not news to any of you, but this really upsets me, so I'm writing about it. I try to give Apple the benefit of the doubt when they make decisions that seem unfair or arbitrary, fully cognizant of the fact that I'm not privy to all the factors that went into the decision.

But this... well, if this is true, it would seem to indicate that maybe I've been wrong in giving Apple the benefit of the doubt. That maybe those who have raised a hue and cry over every little Apple decision they didn't like had a more accurate picture of the situation.

If it's true that Apple won't give more than a boilerplate reason for pulling an application that had been on the App Store for four months, and won't tell the developers what the specific conflicts are so they can fix them, then I think there is more than a little cause for concern. Especially troubling is that RiverTurn, since they are unable to support or update their app thanks to Apple's decision to pull it, would like to give refunds, but will have to pay back not just their share of the income, but Apple's as well, even though they are only trying to do what's right after Apple put them in a tough situation. This means they not only lose whatever income they might have made in the last four months, they also have to pay out additional money on top of what they invested to develop and market their application.

The FCC has decided to investigate the situation. I'll be interested to see if anything comes of it. I don't have much faith in the FCC, that's for sure, but maybe they can do something good for a change.

Lately I've been bashing Microsoft a lot for making poor decisions and failing to recognize the reality of their situation. In most respects, Apple has been on a roll, making good decisions and making elegant products that people are clamoring to buy even in a poor economy. But, Apple has to realize that a large part of the success of the iPhone has been the App Store. Given that they've based much of their advertising around that single point, it's clear that they do recognize it.

From the start, third party developers have had to live with an arbitrary review process that potentially meant they could spend lots of time and money and end up completely unable to sell their application for failing to comply with some unwritten rule. That was bad enough, and certainly has had a chilling effect on third party application development. The App Store was so hot, though, that most developers accepted the risk, figuring the potential reward outweighed the risk.

This decision by Apple adds a new aspect that is almost certain to drive away some of the most innovative developers. Not only do we have to worry about whether our apps will be approved by the somewhat arbitrary review process, we now have to worry about having our approved applications removed.

I don't have enough evidence to be able to say I know for sure why Apple did this or why they handled it in this way, but if they don't take steps to fix this and to communicate that they're aware of the problem, it will leave a black stain on their reputation in the eyes of even the most ardent fanboys.

It really concerns me, and I hope Apple fixes it quickly. Apple should, at the very least, pay their own share of the refunds. If the people making the decisions have a soul, though, they should offer to pay the full amount of the refunds and probably do even more. RiverTurn expended time and resources to create a program that any reasonable person would believe complied with the App Store policies and SDK agreement. There are several other apps on the store that haven't been pulled that allow voice communications over wi-fi, including the iPhone Skype app. Heck, Apple's own reviewers must have felt the same way, since they approved the application and it was on sale for four months before somebody said "hey, let's pull this".

Apple, please make this situation right, and then fix whatever internal policies allowed such an injustice to happen.

 