The 7-inch iPad

Tim Bray is predicting a 7" iPad. I'm skeptical. As a former Kindle user and a current iPad user, I can understand the attraction of an option between the iPhone and the iPad: personally, I've often wished my iPad were a bit lighter. However, I'd argue that while reducing weight and reducing size are both reasonable goals, I'd rather keep the size and lose the weight. A paperback-sized device serves me as a reading device, period: once you go smaller than an iPad, there is little advantage over my iPhone when it comes to producing anything, like email. And a single-purpose reading device isn't enough.

Beast of Burden

As a frequent traveler, I'm always looking for ways to reduce my carry-on weight, the number of devices I travel with, and the hassle of getting through airport security. I don't think I can support the one-device model until I can get a phone-sized device with some sort of expandable display (see Earth: Final Conflict, or the 2005 Philips Readius prototype that went from cool roll-away to meh fold-away) - the phone in my pocket is used far too frequently, too casually, to imagine giving it up for an iPad in my pack. So the first device, the phone, is a given.

Can I get away with only two devices? For a long time I lived that model, phone and laptop: yearning for first-class room to open my laptop in the air, seeking outlets in airports like some digital mosquito, accepting that I was going to run out of power on long flights (DigEPlayer, anyone?). In transit, the iPad has largely solved those problems, and the laptop stays in my carry-on 95% of the time on business trips. For personal trips, I've started leaving the laptop at home. I read on flights, manage email when appropriate, read documents, watch movies, play games. I never worry about space or battery or a separate tub in the security line. I'm down to three or even two devices. With a 7" device, I would likely still be schlepping my iPad along for games/photo editing/newsreading/email/web browsing/other. So now I take three devices on personal trips, four on business trips? Not likely.

Precision-Targeted or MIRVs

As a developer, I appreciate the limited set of form factors targeted by iOS. I can fine-tune my app to either or both displays and deliver the best possible user experience. Granted, a Retina-display iPad may throw a resolution curve into the mix, but I can deal with that just as I adapted to the iPhone 4. But supporting essentially two devices is viscerally different from supporting three, or more. The cost, in time and effort, of addressing 3+ devices means I'll start considering ways to cut corners: compromises in the UX, common elements that are "good enough". I encounter the issues, and compromises, faced by web developers wrestling with browser/display compatibility. I run across the fragmentation issues faced by the Android community (the same issues faced, and never really addressed, by the Java community more than 10 years ago). While I think WebKit is an excellent tool under the right circumstances, it is not a universal hammer - yet I'm starting to see it proposed as one, to mitigate platform porting and time to market. In those cases the UX almost invariably ends up being prioritized below production efficiency.

And therein lies the rub: is the experience more important, or is the ease and speed with which I can develop and market an app? If I have to compromise on the experience to get a paperback-sized device, will I, as a user, ultimately love what I see? And the target for UX should be users loving the experience, something about which Apple has been pretty clear.

I'll Take Light

Just to make things interesting: rumors of a thinner, lighter iPad.

Feedback is a gift

Feedback is a gift. Sometimes it's the sort of thing where you wonder if a person is blowing smoke up your nether parts, but often it's a lot harder to digest because it's not what you want to hear and, on some level, you know it's valid. That's when things get interesting. Even as a gift, feedback is usually as stressful for the giver as it is for the receiver. I'd argue it's more difficult for the giver, because they have to take the initiative and deliver the goods. The receiver's job is not only to listen, but to hear. It's another form of the dance I explored in my previous posting on leadership.

I've worked at firms that touted how they embraced feedback. One company sent all executives to a remote resort, one at a time, to go through multi-day workshops. Up in front of a group of people in similar roles at other companies, we received anonymous feedback from our peers and reports, and specific feedback from our supervisors. It was intriguing, at times frightening, but also liberating: there you are, being faced with very real criticism, in a group of total strangers. And you have to deal with it. No prevaricating, no rationalizing. No hiding. Cool. Even better, the group proceeded to work together to help each attendee find tools they could use to improve themselves and the relationships they'd built in their office.

Feedback is normally accompanied by a certain stigma: criticism is bad, isn't it? For something like two decades, I've advocated a working environment where honest feedback is so common that the fangs are pulled, and it becomes just another source of input to help one improve. Artistic people generally have the edge here: they swim in an ocean of criticism, where ideas, concepts, deliverables and even job interviews involve critique sessions in which other artists freely, even aggressively, call out the good and the bad. Everyone survives, and the sun rises tomorrow. For technology groups, the stigma is especially difficult to overcome, as they rarely get close to this level of critique. Code and project reviews generally keep to "the aspects of the project", not "your stuff". So skins are thinner, and the impulse to retreat much closer to the surface. The learning/adaptive/acceptance curve is longer, but the desire is the same: people really do want to know how they're doing, especially from people they respect.

Let's face it: embracing feedback is hard. The natural tendency of management is to place the burden of feedback on their reports: "why aren't you giving us feedback about how we're doing?" "How can we make the company better if you don't tell us things?" "I know I'm busy, but make an appointment with me."  It's distressingly common for management to fail to appreciate power differentials: people in power are intimidating. If you're not one of "them", "they" can make your life miserable, or even end your career. The reality is that feedback requires an open door, a safe space, an environment of trust - and it is the responsibility of leadership to create that space and open that door. I make sure that my team members can approach me with anything - even if it's criticism about the way I'm doing my job. And they know that, if there are issues surrounding them, they'll hear it from me first. And we'll work it out.

Someday a leadership team will get creative, and hold a company-wide meeting where the only slide in the inescapable PowerPoint deck is one that reads:

We're listening.

You need us to actually hear what you're telling us, and you need to feel safe about it. And you need a reason to believe that we're actually going to use your feedback to make ourselves, you, and by extension our company, better.

We're on it.

Which brings us to review time. A stressful period, replete with deep concerns about trust and just plain survival. If I say what I really mean, will I expose myself to repercussions? Will I be labeled a complainer? Are my leaders actually listening? This last concern is, for me, the most dangerous, the most insidious: if the people in a company become convinced that their leadership doesn't listen, that no tangible changes ever come from feedback, then the feedback ends - along with trust in and respect for the people in charge. I put a lot of effort into ensuring that nothing at review time is a surprise. If one of my reports is caught off guard by my evaluation, I've failed, because I haven't sustained the level of communication that (a) keeps them from being blindsided at review time and (b) keeps them supplied with what they need to constantly grow, improve and advance.

A surprise at review time means I haven't been listening.

I've considered a radical solution to this: perhaps the feedback for a manager should be collected after they have delivered their review sessions to all of their reports. Call it the "response model". It involves a certain amount of risk, and objective third parties should be looped in. But if a manager's team members are getting ambushed in their reviews, that manager should be accountable for that.

Then there are times when you find yourself delivering a review that takes the form of something like "You rocked it, you grew, feedback across the board was beyond positive. Awesome job, all year."

If I've done my job, the feedback won't be a surprise to anyone.

What if the web died but nobody noticed?

From Wired:

The Web is Dead. Long Live the Internet — Chris Anderson and Michael Wolff, 17 August 2010

And I didn't even know it was sick. And isn't the Internet something Al Gore invented? Hmmm. In truth, the article rehashes something I've had to explain over and over: the difference between the Internet and the Web. Sort of like the difference between a USGS 15-minute series map and a nice friendly tourist map.

what is the web, anyway?

Tim Berners-Lee, early '90s: "right, if you guys won't listen, then I'll do it myself." I paraphrase, but that's what it boils down to: Tim had An Idea. Nobody would listen to him, so he built it himself. At which point it seemed sort of, well, obvious. What one might refer to as "a V-8 moment". Think chocolate and peanut butter. Or V-8, I suppose.

One original definition of the World Wide Web: "a system of interlinked hypertext documents accessed via the Internet." Somewhat dry, but it gets to the point: the Web is documents. Some may be static, some created on the fly, some assembled from databases or mashups or other you-name-the-API but, as presented to the user, they are forms of a document. Of course, these days a URL may respond with data that isn't at all human-readable - but that's largely a presentation issue, isn't it? Which brings us back to the browser.
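
At the wire level, that document-centric model is easy to see: an HTTP response is just a status line, some headers, and a body - the document itself. A toy sketch in Python (a hand-rolled parser over a canned response, purely for illustration, not a real browser):

```python
# A toy illustration: an HTTP response is a status line, headers,
# and a document body -- the "document" the browser renders.
CANNED_RESPONSE = (
    "HTTP/1.1 200 OK\r\n"
    "Content-Type: text/html\r\n"
    "\r\n"
    "<html><body><h1>Hello, Web</h1></body></html>"
)

def parse_response(raw):
    """Split a raw HTTP response into (status, headers, body)."""
    head, _, body = raw.partition("\r\n\r\n")
    lines = head.split("\r\n")
    status = lines[0]
    headers = dict(line.split(": ", 1) for line in lines[1:])
    return status, headers, body

status, headers, body = parse_response(CANNED_RESPONSE)
print(status)                   # HTTP/1.1 200 OK
print(headers["Content-Type"])  # text/html
print(body)                     # the document itself
```

Whether the Content-Type says text/html or application/json, the exchange is identical; the "document vs. data" distinction lives entirely in how the client chooses to present the body.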

Or, one can define something in terms of what it is not:

  • not networking technology
  • not hardware
  • not IP addresses

Initially, "the Web" was the part of the Internet accessed via a newfangled animal called a "web browser", a specialized piece of software intended to access only servers configured to understand its requests. The "www." prefix to a URL (mainly vestigial now) identified the specific host that ran a web server, since most machines on the Internet at the time had no idea what to do with web traffic, and mod_rewrite didn't exist yet. And domain names were initially free (who knew they'd be worth something?).

Ironically, one of the lasting results of the web is the W3C, the governing body which, 20 years later, is led by none other than Tim Berners-Lee. So we have him to thank for the web, and he has us to thank for a job. You have to appreciate his staying power.

Perhaps my favorite definition of the web is from Douglas Adams: "The World Wide Web is the only thing I know of whose shortened form takes three times longer to say than what it's short for".

when does the web stop being the web?

You can type an IP address directly into a browser's address field: a numeric address in place of the usual domain name. Not a "URL" per se (I'm taking some liberties here with browser destination auto-complete, but bear with me for fun), just a good old-fashioned IP address. Still the web? It's still a document created and delivered to your browser. Is it the web because you're using a browser? Or because the destination responds to web requests?

Does it really, in the end, matter?

when does an app stop being an app?

Traditionally, there has been an unstated dichotomy between "application" developers and "web" developers. Mashups have really confused the issue. So much for the simple browser and straightforward web sites. Now you have data. And applications that can send and receive web requests and data. And different sites collaborating with their data.

Early on in the iPhone world, Steve Jobs told the developer community that they didn't need an SDK, that they didn't need to develop actual iPhone-native apps because they could develop "web applications": web pages custom-engineered to feed data to and from the iPhone's multi-touch interface and display. The development community responded by giving Mr. Jobs the finger.

making sense of all this (and I use that term loosely)

So we all get wrapped up in determining how many of the elephants in the room can fit on the head of a pin. But the really important thing is that an elephant with a pin in its ass is a dangerous thing if it's about to fart, and a blind man will never see it coming.

Leadership and followship

In the original Dune by Frank Herbert, the main character Paul recounts being questioned about leadership:

She asked me to tell her what it is to rule, and I said that one commands. And she said I had some unlearning to do.


I've taught sea kayaking on and off for quite a while, including something called "Leadership Workshops": a series of one-evening seminars followed by the main event, a six-day expedition off the coast of British Columbia, with each day led by a different team of two students. From before sunrise until dinner, that team was responsible for knowing the weather, the route, the tides and currents. It was their job to get everyone up, fed, packed and launched, then guide the group to the evening's destination, ensuring everyone arrived safely and was properly settled.

Each evening, over a group dinner, we would discuss how the day went, providing feedback and context, sharing lessons learned. This was an especially interesting exercise, as each day's newly-minted leaders had no cred with the group, and had no time to earn it - the rest of the group was asked to reserve judgment on the leaders until the dinner debriefing session, which can be a very hard ask. While the trip was designed to surface and discuss leadership challenges (and it certainly did), followship, or lack of it, frequently made the difference between a successful day and a ten-hour rolling conflict.

Project teams can reach this point as well, where everyone can practically read each other's minds, the concept feels equally shared, the goals are clear, the progress exhilarating. A tight team can sense when a strong leader is on a roll - but even a strong leader can be diverted by an intractable team member. We spend a lot of time defining what makes a good leader, and rewarding those who meet those challenges. We spend considerably less time identifying and rewarding those members of a team who, by demonstrating great followship, help create an environment within which the leader can be even more effective and the team more successful.

At one point I decided to learn ballroom dancing, and eventually experienced the conversation that is constant, subtle and sublime when two people are in the groove. But as everyone in a beginner class discovers quickly, you cannot both lead. Our instructors, Walter and Nancyanna, would explain the roles in this way: "The job of the lead is to be clear and consistent. The job of the follow is to respond gracefully and maintain impeccable rhythm."

Catching the next wave

We have a saying in engineering about improving something until it no longer works. Another one goes "if it doesn't fit, force it - and if it breaks, it should've been replaced anyway." Both may apply to Google's recent short circuit. Ars Technica staff have explained their attempts to incorporate Google Wave into their communications portfolio.

I find it interesting that quite a lot of people take a few things for granted:

  • Email is good, but needs to be improved... somehow.
  • Sustaining communications with multiple people/groups over multiple protocols is hard.
  • The solution must be to roll all of our protocols into a single interface.
  • Nobody has yet found a way to successfully integrate all those in-need-of-fixing protocols into a single UX.

So email has become less satisfying, but IM, SMS, MMS, IRC, voicemail and plain old voice aren't scratching the itch well enough. A few years ago, I resisted when my boss required everyone to be on IM. I told him that I thought IM as a mode of business communication was horrible. His response was that people 10 years ago probably felt the same way about email. Touché. But that seems to highlight the problem: rather than evolving a protocol of communication to offer new flexibility and features, we continue to add new protocols. A hundred years ago, telephones had no buttons or even a dial - you turned a crank to alert the operator (when was the last time you spoke with an operator?) to place a call for you. Then we added a dial, then push buttons, then call lists, voicemail, voice dialing, visual voicemail - all extending the 'protocol'. We didn't add five types of telecommunications networks - imagine a bank of five phones on your desk, each used for different sorts of communication - such desks actually existed at one time.

For quite a while, I've felt that, rather than trying to develop a UX that integrates multiple protocols with quite a bit in common - and, in concert, developing ways to collect and curate disparate contact lists - what's needed is an extension of a single protocol to meet modern needs. Standardize that protocol, along with the address book information architecture. The "communications apps":

  • Email.
  • (Visual) voicemail = email with voice "channel".
  • IM = email + presence.
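
One way to read that list: email already has a standard extension mechanism - headers and MIME parts - so the other "apps" can be expressed as ordinary messages. A sketch using Python's standard email library; the X-Channel header is a hypothetical extension of my own, not any real standard:

```python
from email.message import EmailMessage

# Voicemail as "email with a voice channel": a normal message whose
# payload is an audio attachment. X-Channel is a hypothetical
# extension header, invented here for illustration.
msg = EmailMessage()
msg["From"] = "alice@example.com"
msg["To"] = "bob@example.com"
msg["Subject"] = "Voicemail, 0:42"
msg["X-Channel"] = "voice"
msg.set_content("Voicemail from Alice (transcript unavailable)")
msg.add_attachment(b"\x00\x01\x02",  # stand-in for real audio bytes
                   maintype="audio", subtype="mpeg",
                   filename="message.mp3")

# Structurally it's just MIME: a text part plus an audio part.
print(msg.get_content_type())  # multipart/mixed
```

IM = email + presence would similarly ride on a header or two plus delivery semantics; the point is that the envelope doesn't need reinventing, only extending.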

Note that I do not include social apps in this protocol. Nor do I include collaborative editing in the list.

While Facebook, Twitter, LinkedIn, Gowalla, Foursquare and 50 other services are doubtless valuable, there is no standard, and I'd argue there shouldn't be: these sorts of apps - indeed, the medium itself - are evolving so rapidly that conformance with a set of standards would probably inhibit innovation.

Regarding multi-person concurrent document creation, the workflow is so different from messaging that I'm actually surprised that Google decided to shoehorn it into Wave in the first place. Sharing creation of a document by ping-ponging it between editors is rarely satisfying beyond a very small number of edits or editors (say 2). Incorporating that workflow into a messaging client hardly bodes well for either messaging or document collaboration.

That said, the architecture for an address book for communications apps should be designed to be extensible in order to incorporate information and access control parameters for social/other (e.g. content) applications as well as geotagging and provisions for future developments in digital signatures.

So what would the basics be for an ideal communications protocol? For starters:

  • Basic messaging capability (I send you a message; you reply, rinse and repeat). I include in "basic" everything we take for granted these days:
    • Address book integration.
    • Media/document attachments.
    • Group capabilities.
    • Mobile-class stability - this implies store-and-forward: I lose my connection (network drop, close my laptop too quickly, etc.) and you still get my message.
    • Presence - you know when I'm available, and you know when I'm typing (à la IM, as opposed to "see what I type letter by letter").
      • Probably with on/off preferences that can be tuned to a certain degree for recipients/time of day/other, or integrated as part of the ACL.
    • PKI integration - you can verify it's from me, and only you can read it if that's my intention.
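
The store-and-forward and presence requirements interact, which a few lines of code make concrete. A minimal sketch (class and method names are mine, purely illustrative): messages to an offline recipient are held, then flushed when presence flips to available.

```python
from collections import defaultdict

class StoreAndForward:
    """Toy sketch of the store-and-forward requirement: messages
    sent while a recipient is offline are queued, and delivered
    when their presence flips to available."""

    def __init__(self):
        self.online = set()
        self.queued = defaultdict(list)
        self.delivered = defaultdict(list)

    def send(self, recipient, message):
        if recipient in self.online:
            self.delivered[recipient].append(message)
        else:
            self.queued[recipient].append(message)  # hold it

    def set_presence(self, user, available):
        if available:
            self.online.add(user)
            # flush anything that accumulated while offline
            self.delivered[user].extend(self.queued.pop(user, []))
        else:
            self.online.discard(user)

hub = StoreAndForward()
hub.send("bob", "are you there?")  # bob is offline: queued
hub.set_presence("bob", True)      # bob comes online: delivered
print(hub.delivered["bob"])        # ['are you there?']
```

This is the "close my laptop too quickly" case above: the sender never has to care whether the recipient was reachable at the moment of sending.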

And the address book?

  • The usual:
    • Name/phones/emails/IMs/addresses/personal info.
    • Digital signatures/encryption keys etc.
    • Presence preferences.
    • ACL - Access Control Lists.
      • Who gets to see/listen to/download what.
      • Groups.
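
The ACL portion is the interesting part: each field of a contact entry is visible only to certain groups, and a viewer's group membership determines what they see. A toy sketch - the schema and names here are illustrative, not a proposed standard:

```python
# Toy sketch of the ACL idea: each contact entry carries per-field
# group permissions, plus a mapping of viewers to groups.
ADDRESS_BOOK = {
    "alice": {
        "fields": {
            "email": "alice@example.com",
            "mobile": "+1-555-0100",
            "home_address": "123 Main St",
        },
        "acl": {
            "email": {"public"},
            "mobile": {"friends", "family"},
            "home_address": {"family"},
        },
        "groups": {"bob": {"friends"}, "carol": {"family"}},
    }
}

def visible_fields(owner, viewer):
    """Return the owner's fields the viewer is allowed to see."""
    entry = ADDRESS_BOOK[owner]
    # everyone is implicitly in "public"
    viewer_groups = entry["groups"].get(viewer, set()) | {"public"}
    return {name: value
            for name, value in entry["fields"].items()
            if entry["acl"][name] & viewer_groups}

print(visible_fields("alice", "bob"))
# {'email': 'alice@example.com', 'mobile': '+1-555-0100'}
```

A stranger in no groups sees only the "public" fields - exactly the "who gets to see/listen to/download what" behavior the list above calls for.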

Really, it seems to boil down to email + voicemail + presence + a more full-featured address book. Can it really be that simple? Somehow I doubt it - but it seems a worthy direction for exploration.