Entries in apple (8)


The iPad reconsidered

As promised, here are my current thoughts on the iPad - developed from my thoughts of four months ago.

I still find the screen too glossy and hard to read in some circumstances. This has been somewhat mitigated by the onset of autumn as my train journey is usually in the dark or half-light. However, now that I'm used to the iPhone 4's retina display I'm also very distracted by the sight of all those rough edges and pixels. I'm sure they weren't there before.

As I understand it there are some rather large technical challenges to bringing retina display technology to a panel the size of the current iPad. I do wonder if this feeds into some of the rumours of an upcoming 7" iPad (or other smaller form factor).

On the flip side I've found that, with the retina display, the iPhone is now even more useful in some situations where I might otherwise have preferred the iPad - e.g. reading. It's now almost as comfortable to read material on the iPhone, at the higher resolution, as on the iPad - and without those distracting pixels.

With this in mind I think I would welcome a smaller form factor (as an option in a range) - as long as it had higher pixel density (ideally "retina" level).

Aside from the screen, we've certainly seen a lot more apps targeting the iPad - some more successfully than others. We're seeing innovation in this area and it's an exciting time - but I think for the casual observer it's still a bit too early to really benefit. Unless you have something specific in mind I'd advise holding off to see what the next generation - or possibly an upcoming competitor - holds. If you just want an eReader I think the latest Kindles are a good bet - and needn't preclude getting a next gen iPad later.

I think that is the mark of something truly new. We're finding our feet as a community. The hype behind Apple and the iPad seems to be sufficient to keep the momentum going until we reach the next level.

It's also been interesting to see how other vendors have responded. There has been the inevitable glut of cheap knock-off clones, of course. But I think we're starting to see some real contenders. The Android space has really taken off this year. In some ways Android is even ahead - but I think Apple is still leading on innovation at this point. There are fundamentally different philosophies behind Google's approach and Apple's. But you can't deny that Google are following the strategy that Jeff Atwood recently coined as, "Go that way, really fast". Even Blackberry are making credible contributions to the space (or are they?) - focusing, as is their wont, on the business side of things.

Competition is good. Not just because it will encourage Apple to keep innovating, and keep iterating. I genuinely think this area of computing is the one to watch. It's only just getting off the ground. I think Apple will be the thought leaders in the space for a little while yet, but it would be unhealthy for that to remain so in the long term. That was less true of the iPhone, but I still think it's good that competitors are catching up there too (and ahead of schedule). The reason I think Apple are still ahead is that they control the end-to-end user experience and that is, right now, critical. That may not always be the most important factor.

But back to the specifics of the current iPad. Most of the areas I commented on in the previous post were software related. I now have iOS 4.2b2 installed. How has that changed things? Well, I can't talk about specifics, beyond what has already been advertised, as it is still under NDA. But I can say that between Multitasking and Folders alone I can't imagine going back to 3.2. I actually jumped on 4.2b1 as soon as it came out - despite my usual policy of not installing first betas on devices I use in daily life (I have an iPod touch and an older iPhone specifically for testing). Android users will be quick to point out that corresponding features have been available on that platform for some time. Not having used them I can't comment, but I have heard that the experience is less satisfying.

However, despite all this I have to say that I've been using it less than I was even four months ago.


Well, in part, it's a matter of time. My typical daily computing experience is something like this:


Morning commute:
Catch up on Twitter on my iPhone while waiting for the train. Development (or writing blog posts) on my laptop on the train. I sometimes use the iPad for looking up reference material as the internet connection is more stable than my 3 MiFi.

At work:
Work on a PC running Windows. Occasional emails and Twitter checks are catered for by the iPhone. Music supplied by the iPhone.

Evening commute:
Catch up on Twitter on my iPhone while waiting for the train. Development (or writing blog posts) on my laptop on the train.

Evening at home:
Watching an episode of something (currently: Lost) from my Mac Mini (possibly soon to be my Apple TV 2) while having dinner. Occasionally a little more development on my laptop, or downloading iOS betas. Sometimes some reading in bed - either on the iPhone or the iPad.

So there is not a lot of scope for the iPad to make inroads there. Currently the only regular appearance is the late night reading - which the iPhone is often more convenient for anyway. I also didn't mention that I sometimes use the iPad during the day as a photo frame, but it has been disappointing even in that capacity as there is, currently, no way to change the update frequency (iOS 4.2 may or may not address that - I'll say no more) - and no way to create photo albums on the device. If these things are addressed I will probably use it more for that - but that's hardly revolutionary.

Once my Apple TV 2 arrives and AirPlay starts working I may well use the iPad as a source for that too - but the iPhone may serve that role just as well.

So, in summary, I'm making it sound as though the current iPad is much less useful than I had originally hoped. There is some truth in that. I think the option of a smaller form factor and/or a high density display will help there. But I think another obstacle has simply been finding the time to seek out apps that make better use of what the iPad has to offer (this would be true of any tablet device) - and, more importantly, to write my own!

The time for the iPad is, maybe, yet to come, but the revolution has already begun.


The iPad - two ... no, fifteen, weeks in

I've had a draft post in MarsEdit for some time now called "The iPad - two weeks in". Clearly that title is a little outdated now.

What's interesting is that my opinions haven't really changed in that time. So what follows are my unedited thoughts from just over four months ago. I'll follow up with what has changed in the meantime - but that is all due to external factors.

The iPad - two weeks in

I've had my iPad now for two whole weeks. I've not used it as heavily as some in that time, but I think it's long enough to give my initial impression. I've publicly been quite excited about the iPad in principle since it was announced - but now I've been able to taste the proof of the pudding.

After the initial opening, where you find out for yourself how natural using apps like Safari and Maps is on the new device, there's an inevitable awkward period where you realise that it doesn't do anything (yet) that you couldn't already do with your laptop or your phone. For some people this is all they see. That is, of course, missing the point.

First of all I'm used to taking my laptop with me everywhere - and if I don't have that I still have my iPhone. There are not many occasions where I need more than my iPhone, don't have my laptop but would have my iPad. But that's mostly because I'm a developer. If I wasn't using it for coding then most days I would probably leave my laptop at home and just use the iPad on the train.

In time, however, I've started to reach for the iPad first, even if I have the laptop with me. Why? Well, it's smaller for a start. I have a 17" Macbook Pro - which is quite a lot to pull out on the train if I don't really need it. When I'm coding I really appreciate the extra screen real estate - but for just about anything else it's not needed.

I'm also finding it generally a nicer, more natural, experience to interact with apps and content through the touch metaphor - especially Apple's implementation. After three years of iPhone O..., I mean iOS, I still get great satisfaction in working with the inertial scrolling views, for example.

So far this has just been a refinement of an experience I already had - it's not adding anything truly new - and there are some downsides, which I'll come on to. It's worth mentioning here, though, that we're only just getting off the ground with this. I'm very much an early adopter here. It's a little unusual that the hype around the iPhone and iPad has led to such mass adoption already. There are bound to be people who expected more, or are still wondering what you can actually do with these things to make them worth their keep. It will come. It will all come (and, as alluded to in that blog post I linked earlier, I hope to have my own part in that).

So that's the positive and the realistic. What about those negatives that I mentioned?

Well the first is that with the larger display and extra power you really do miss multi-tasking. Of course that's coming soon, to a degree, and that will mitigate most of my concerns here. However I do feel that in some cases it would be nice to have more than one app on screen at a time. I wouldn't want this to be the default way of working - as it is with desktop OSes. But the ability to do this selectively, perhaps with the widget metaphor, would be a nice addition. That said I'm a power user and not everyone would need or be comfortable with this. Even if we never get it, with the service-based multi-tasking that's coming it's going to be a good experience.

On a similar note I'm finding mobile Safari to be much more frustrating than in the iPhone context. Two things stand out. First, the lack of tabs is annoying. While you have a somewhat similar mechanism in the form of the page toggle view, it's not the same, and if you want to do a bit of research it's very limiting. Of course this is entirely a software implementation issue and there's no reason it couldn't be added in a future release (allowing for my next point).

The other issue with Safari, which it also inherits from the iPhone, is that it doesn't seem to do any disk caching. It holds a whole page in memory. If you switch to another page and it runs low on memory it will purge the first from memory and if you then navigate back it has to load the whole page over the air again! I feel this would need to be addressed before tabbed browsing could be offered.
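To illustrate the point, here's a minimal sketch - in Python, and purely illustrative, since mobile Safari's real caching internals aren't public - of how a disk-backed fallback would avoid that over-the-air reload when a page is purged from memory:

```python
# Illustrative sketch (NOT Safari's actual implementation): a page cache
# that falls back to a disk copy before re-fetching over the network.

class PageCache:
    def __init__(self, memory_limit, fetch):
        self.memory = {}           # url -> page; purged under memory pressure
        self.disk = {}             # url -> page; survives memory purges
        self.memory_limit = memory_limit
        self.fetch = fetch         # the expensive over-the-air path

    def load(self, url):
        if url in self.memory:
            return self.memory[url]
        if url in self.disk:       # the step mobile Safari appears to skip
            page = self.disk[url]
        else:
            page = self.fetch(url)
        self._store(url, page)
        return page

    def _store(self, url, page):
        if len(self.memory) >= self.memory_limit:
            # Purge the oldest in-memory page, but keep its disk copy.
            evicted = next(iter(self.memory))
            self.disk[evicted] = self.memory.pop(evicted)
        self.memory[url] = page
        self.disk[url] = page

network_calls = []
cache = PageCache(memory_limit=1,
                  fetch=lambda u: network_calls.append(u) or f"<html>{u}</html>")

cache.load("a")   # network fetch
cache.load("b")   # network fetch; "a" is purged from memory
cache.load("a")   # served from disk - no second fetch over the air
print(network_calls)  # ['a', 'b']
```

With no disk tier, that third load would hit the network again - which is exactly the behaviour I'm seeing.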

Finally - and I think this is the biggest grievance I have with the iPad today - the glossy screen. It's fine in low light conditions (if you turn the brightness right down). But outdoors, especially if the sun is out - or even indoors if the lights are bright - the display is really hard to read from and tires the eyes very quickly. What concerns me most is that Apple seem to be fine with this. Their "solution" is just to crank the brightness up until it overcomes the glare. This almost works. Sometimes even that is not enough - and it certainly doesn't address the eye strain issue, tiring the eyes even more.

Before the announcement back in January the display technology was probably the most talked about aspect of the then-rumoured device. From reading the opinions at the time it sounded like if the iPad launched with a backlit display at all - let alone a glossy one - it would be an instant failure. After the announcement those opinions became a distant minority as everyone else focused on what's great about the device. Sales so far certainly don't seem to be hindered by this weakness. This is a shame because I think it will just give Apple reason to ignore it altogether. I hope I'm wrong. After all, they did do a U-turn over the same issue with the Macbook Pros when they went glossy. I held off getting a new laptop until they finally offered a matt display option again. I'm not so hopeful with the iPad, however, since it's the glass that makes it glossy, and that really needs to be there on a multi-touch display. The glimmer of hope, no pun intended, comes from the iPhone 4, which apparently pioneers a new manufacturing technique for bonding the LCD to the glass, closing the gap between them. I'm hoping this will reduce glare - at least a little - and that this technology will work its way into the next generation of iPad devices.

In summary, there are irritations and weaknesses, but all of these, with the exception of the glossy display, can be fixed with software updates - and I'm confident that some of these fixes will filter through. The display is particularly disappointing, but for many people it's fine - and it's potentially "fixable" in future hardware revisions. An anti-glare screen protector may help too, although I've been reluctant to try one just yet.

Despite these downsides, and the early stage that the eco-system is at in terms of must-have apps, I still find the iPad to be a really great device that currently has no equal. It's not yet for everybody but I really do believe that the trend is that this gap will close.

The one area where I think the iPad will really shine - and we're only seeing embryonic examples so far - is in note capture and consumption. The immediacy of iOS, the natural interaction of multi-touch and the larger display/touch surface of the iPad are, I think, the perfect ingredients for making the capture of notes and ideas directly into digital form more practical and accessible than ever before. This is the direction my app ideas lie in and I'm really excited by the possibilities now on offer.

Remember - the revolution is only just starting.


Welcome to the new decade

What makes a tweet take off? I recently had a tweet go viral and it gave me a fascinating insight into how and why these things spread. In some ways Twitter amplifies existing social epidemiology. In others it is unique.

So what was the tweet? It was a Saturday afternoon - about 4pm here in the UK. I was having a shower (where most of my best ideas are formed) and I was thinking about recent tech news. It struck me that recent events (Oracle suing Google over Java, Google's net neutrality controversy with Verizon), added to other changes (Apple's rise to the top of the mobile, online music and tablet spaces - and even eBooks - and Microsoft's inability to get into - or back into - any of these areas), have resulted in a reversal of some of the positions we have taken for granted over the last ten years or so.

So, as I was drying off, I posted a casual message on Twitter. I had around 200 followers at the time - at least half of whom I know personally. I thought I might even get a couple of retweets.

The tweet read as follows:

Welcome to the new decade: Java is a restricted platform, Google is evil, Apple is a monopoly and Microsoft are the underdogs

That's 125 characters to summarise the juxtapositions I had been pondering. Like most tweets where several thoughts are being conveyed, it took a couple of iterations to prune it enough that it fit into the 140 character limit - while leaving just enough space for the RT.

Just in case.

After that I took my family off to my sister's, where we had been invited for the evening, and didn't think any more of it.

My sister's house is a bit of a 3G (or any G, for that matter) blackspot. If you've seen those adverts for femtocell repeaters that have people hanging out of windows to get a signal then you have an idea of what it's like.

However, even there, my Magical iPhone 4 antenna occasionally picked up a signal and I'd get email in bursts. During the course of our meal I heard a few emails popping in and took a look. My unread emails badge told me I had 50 emails waiting. 50! As it turns out, 50 is the maximum number of email headers the iPhone will download automatically. I don't know how many I already had at that point. But it was a lot.

So what were they? They were notifications from Twitter of new followers.

I managed to get a connection to my brother-in-law's wifi - which has a 128 character hash as a password (!) - and checked in on my twitter account. And there it was. Screen after screen of my tweet retweeted over and over again! A three letter word came to mind. It starts with W and ends with TF!

I couldn't investigate fully until I got home. By that time I found I was #1 in two categories on Reddit - and later found I had also been #1 on Hackernews all night! A little more searching showed the tweet popping up in other places too, mostly blogs. By Monday I heard I'd even had a mention on "This Week in Tech".

By Tuesday (already three whole days later) I was still seeing retweets in my timeline every few minutes, and new followers were trickling through. They seemed to have levelled out at around 900 (some people were already unfollowing) - but then over Tuesday night (UK time - so day/ evening across the US) it picked up again leaving me with 930 by Wednesday morning.

So what happened?

[Image: My Twitter Social Ego Networks]

Tipping Point? Or Life Of Brian?

If you've not read Malcolm Gladwell's Tipping Point you're probably at least familiar with the phrase, or can take a guess at what it means. It's all about the factors that contribute to the adoption or awareness of something taking off - usually by several orders of magnitude. Almost by definition this is not an exact science. If you want to paint a scientific face on it I'd probably paint it with Chaos Theory. In practice the elements that Gladwell covers tend to be more sociological, and usually highly anecdotal.

But a tipping point seems to have been what was reached here - so what does Gladwell have to say about the process?

Connectors, Mavens and Salesmen

Probably the most obvious example of a Tipping Point factor is Gladwell's "Connectors". These are people who have a lot of social connections. They know people. (Even more) people know them and generally trust their opinions. Often these people will be celebrities or have some other form of media presence. Book authors, tech journalists or high profile employees of big name companies are common Connectors in the tech world. There were certainly a number of these in the mix and they would have played a huge amplifying role in the process. I can't imagine my tweet would have "tipped" without them. Some of the names I've seen are: Robert Scoble, Leo Laporte (who had mentioned me on TWiT) and Travis Swicegood (author of "Pragmatic Version Control With GIT", who posted the tweet to Hackernews). With a bit more digging I'm sure I'll turn up more, perhaps even bigger names.

Gladwell also talks about Mavens and Salesmen. In this case I believe the Mavens involved were probably also the Connectors. Salesmen have less of a role in a Twitter epidemic.

Stickiness and tl;dr

Gladwell's concept of the "Stickiness Factor", I believe, translates to the quality of the tweet that led it to take off in the first place. After all it needed enough momentum to reach the connectors.

In retrospect I can have a good stab at what it was about the tweet that gave it Stickiness. I want to emphasise that I can't claim credit for how effective it turned out to be. That was mostly down to luck and the constraints imposed by Twitter itself.

Twitter's famous 140 character limit, while it has other historical reasons, has proved to be one of its most compelling (and sometimes frustrating) "features". It's an oasis in today's crisis of information overload. Anything else is tl;dr.

And yet at the same time we are addicted to content - especially social content. We want more of it, but in smaller amounts. And that's precisely what Twitter gives us. Furthermore we are forced to think about how we can keep within that limit - compressing paragraphs of material into a couple of laser focused sentences. We often surprise ourselves at how much unnecessary waffle we can distill down to the essence of the point we wanted to make.

And that's exactly the process I went through to arrive at the wording in my tweet.

But it wasn't just the fat that was trimmed. There was no room to explain the nuances, resolve the ambiguities, or balance the controversy. Twitter editing is brutal. It has to be left to the reader to add the flesh back to the bone.

Anyone familiar with recent events in the tech world could identify with the sequence of statements - whether they agreed with them or not. But each person also read into them their own interpretation. Some took exception to what they thought I was implying. This was important. If you look at the discussion threads that exploded on Reddit and Hackernews you'll see how many possible interpretations and opinions about each word were represented.

It's a sad, but well known, fact that controversy "sells". Each point I made had enough truth that it could be talked about in serious debate, but was controversial enough that people wanted to do so. The fact that each point was also a reversal of a previous view was the light and amusing packaging for this combustible concoction. Top that off with the easy-to-read meter, practically forced on it by the Twitter limit, and it's hard to imagine a more carefully planned Stickiness assault on the Twitterverse.

Yet it really wasn't planned that way.

Anatomy of a perfect tweet

We've looked at the general Gladwell effects that probably contributed to the tweet "tipping". Considerable research has also gone into more specialised effects within the context of retweeting. What makes the difference between something being retweeted 0-5 times and something that takes off to hundreds, thousands, or more?

Perhaps the most notable Social Media expert in this area is Dan Zarrella. He has broken down vast numbers of statistics and correlations and come up with some key observations. These include things such as the type of words that are most retweetable, sentence structure, time of day, day of week, and many more. Some of the effects are more pronounced than others.

Looking at the timing: Dan suggests the best time of day to be retweeted is in the early evening. According to his graph this peaks at about 5pm. Sure enough, that was almost exactly the time I posted. So does that prove the point? Well, it would be a mistake to look at a sample size of one and draw conclusions. There are also problems with this statistic. Twitter is a global phenomenon. Most people have at least some international members of their network. So time-of-day is all but meaningless. In my case, being based in the UK, I think 5pm worked out well because the U.S. was coming online - and from there it spread throughout the day.

On the flip side, the best day of the week is apparently Thursday. The worst day of the week is Saturday! As I said these are statistical biases - not absolutes. Nonetheless his findings are very interesting and can make you rethink the quality of your tweets.

I was a little surprised that he didn't even mention, at least in the linked article, network effects of the Tipping Point variety (although arguably most of his findings relate to the Stickiness of the content).

A numbers game

As this saga unfolded I became more and more fascinated with it. I wanted to see how many times it had been retweeted, by whom, seen by how many, and who the Connectors were. There are tools online to help, but they are constrained by Twitter limits (for example, only the last 1500 tweets from a search). To get around this I made multiple searches using the since: and until: operators. Unfortunately these only work for dates - not times. On Sunday I had more than 1500 retweets in the final eight hours alone!

So I wasn't able to piece together the whole story, but by throwing in a few estimates for the missing data I arrived at a figure of about 3-4 million impressions (that is, people who would have seen the tweet, given the followers of those that retweeted - allowing for overlaps).
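The estimate itself is simple enough to sketch. All the follower counts below are hypothetical stand-ins - only the method (sum the followers of everyone who retweeted, then discount for audience overlap) mirrors what I did:

```python
# Back-of-envelope impressions estimate. The numbers are made up for
# illustration; only the method reflects the estimate in the post.

def estimate_impressions(retweeter_followers, overlap=0.3):
    """Total potential views, assuming `overlap` of each retweeter's
    audience had already seen the tweet via someone else."""
    raw_reach = sum(retweeter_followers)
    return int(raw_reach * (1 - overlap))

# e.g. one big Connector, a few mid-sized accounts, and a long tail
followers = [354_000, 90_000, 25_000] + [400] * 4000
print(estimate_impressions(followers))  # ~1.45 million for these made-up inputs
```

The overlap factor is the guesswork: follower graphs in a community like this overlap heavily, so the raw sum always overstates the real reach.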

I think the first big Connector in the mix was Robert Scoble, about 3 hours and 30 retweets in. During the rest of Saturday evening it was picked up by five others with more than 10k followers each - including @toptweets, with 354k followers alone! I suspect Sunday had the most Connectors at play. Unfortunately most of Sunday is a bit of a black hole due to those Twitter limits.


Blessed are the cheesemakers

At times it seemed like I'd got trapped in Life Of Brian - only with fewer Romans. When anything plays out on a large enough social network, the thing itself takes on a life of its own - detached from its origins. This was Dawkins' memetics at play in a highly concentrated form. But being so accelerated, even a network as large as Twitter reaches saturation fairly quickly. I was amazed that I was still being retweeted so much by Wednesday, but by Friday it was finally abating - with only about 3-4 retweets an hour.

By next week it will have gone the way of the dinosaurs. My days as a micro-celebrity are numbered.


Why the iPad may never need multi-tasking


Like the iPhone and iPod touch, the iPad does not allow multi-tasking of third-party apps. That is, if you are using one app and want to start another you must close the first app. I wouldn't rule out that changing in a future update, but I'm more inclined to think we won't see it even on the iPad - at least not in a general form. Why?

The way I see it multi-tasking is used for three purposes:

  1. Background apps, such as music players. The iPod app does run in the background, of course, but there is certainly a case for third-party apps such as Spotify or Last.fm to have the same ability.
  2. Notifications. This is at least partly addressed by the Push Notification service that Apple now provides. More on that in a moment.
  3. Fast switching between apps.

I'd say that (3) is probably the number one reason we usually have so many apps open at once. After all we can usually only interact with one at a time. As the time it takes to (save and) quit one app and start (and restore) a new one gets closer to zero our need to have the apps open just so we can switch to them diminishes or disappears. From what I have read and heard the iPad gets us pretty close to the threshold where this happens. By some reports it crosses that threshold. I'll reserve final judgement until I see it with my own hands (mixed metaphor intended).

Going back to reason (2), let me explain what I mean by being only partly addressed so far. Imagine a calendar-type app - such as Event Horizon. You set up events and appointments - which may be days, weeks or months later. Ideally the app would be able to alert you of upcoming events in much the same way that Apple's own calendar app can already do.

One way to achieve that now would be to use the push notification service. This would work but has two problems. First, it's a bit heavyweight. Your event data would need to be synced up to a cloud service, which could then decide when to push a notification back to the device. Cloud services certainly have their place, but bringing them in solely to provide alerts seems out of proportion. Secondly, it means you will not receive the notification if you don't happen to be connected at that time. You will get it when you do get online, but that may be too late! Furthermore, you need to be connected for the alert to sync up to the cloud in the first place. These all seem unjustifiable limitations when it concerns data that you already hold locally on your device.

Therefore I think it reasonable that some sort of Scheduled Notification API should be made available. This could work just like the Push Notification service, allowing badges, messages and alert tones to be "pushed" to the user, with the ability for the user to open the associated app straight away, or defer that until later. The difference would be that the notification would be posted by the app itself for delivery at a specific time in the future and would never leave the device. I don't see any technical challenges to providing such an API.
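To make the idea concrete, here's a rough sketch of how such an API might behave - all the names here are invented for illustration, and whatever Apple might eventually ship would doubtless look different:

```python
# A sketch of the Scheduled Notification idea: the app posts a
# notification for future delivery, and it never leaves the device.
# All class and method names are invented for illustration.

import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class ScheduledNotification:
    fire_time: float                              # when to deliver
    message: str = field(compare=False)           # alert text
    badge: int = field(default=0, compare=False)  # app icon badge

class LocalNotificationCenter:
    """Device-local queue - no cloud round-trip, no connectivity needed."""
    def __init__(self):
        self._queue = []

    def schedule(self, note):
        heapq.heappush(self._queue, note)

    def deliver_due(self, now):
        """Called by the OS clock; returns notifications whose time has come."""
        due = []
        while self._queue and self._queue[0].fire_time <= now:
            due.append(heapq.heappop(self._queue))
        return due

center = LocalNotificationCenter()
center.schedule(ScheduledNotification(fire_time=100.0, message="Dentist at 3pm", badge=1))
center.schedule(ScheduledNotification(fire_time=50.0, message="Stand-up meeting"))

print([n.message for n in center.deliver_due(now=60.0)])   # ['Stand-up meeting']
print([n.message for n in center.deliver_due(now=120.0)])  # ['Dentist at 3pm']
```

The key property is in the last four lines: delivery is driven entirely by the device's own clock, so a calendar app could schedule its alerts at sync time and never touch the network again.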

If a scheduled notification API is made available I think the only significant remaining need for background tasks would be for things like music players. If Apple do allow background tasks in the future I think it will be limited to these types of tasks, and very strictly policed in the App Store approval process. I'll say it again: I don't believe Apple will ever allow general purpose multi-tasking in App Store apps - and I don't believe they need to, with the concessions just described. Note that I am ruling out other scenarios such as video encoders or bit-torrent clients, which I don't think are appropriate apps for running on these devices in the first place. If you can think of anything genuinely useful for a significant number of people that wouldn't be covered by such provisions I'd be interested to know.



What *I'll* be using the iPad for

My last post was a bit of a teaser for the app that I am writing for the iPad. I don't apologise for that. While I did want to pique your curiosity my key point was that there are developers, like me, already out there with plans for iPad applications that simply wouldn't be feasible for any other platform. Just to clarify that: they would be possible for sure - I was planning first a desktop version, then later an iPhone version, of my app - but the experience would always feel like a compromise. I still plan on desktop, iPhone, and even web clients - but they will effectively be auxiliary clients in much the same way that many iPhone apps today are auxiliary to their desktop counterparts.

Does this mean that, before such apps arrive, the iPad is just an empty slab? Perhaps usable only for the very old, the very young, or the technically illiterate? Not at all. The key reason there has been such a backlash against the iPad is that there was so much hype about it before the announcement. It wasn't just that everyone had idyllic expectations of what it would be for them - although that certainly didn't help. Just the fact that there was hype meant that everyone was watching this event as a turning point in the history of computing. The trouble with historical turning points is that they are often only recognisable as such when you have the historical perspective. Put it this way: if the launch of the iPad really does turn out to be the moment of a revolution in computing, when looking back on this in years to come, would it seem reasonable to think that its lack of a front-facing camera or support for Flash was a factor in that status?

I think it's quite likely that some sort of camera facility will arrive at some point - either by way of a future hardware upgrade, or as a peripheral. There are some practical challenges there, but if they can be overcome I suspect they will. As for Flash, that's another story. I'm with the camp that hopes it doesn't get it. There is plenty of good discussion of why that should be the case around already. Whether you agree with that or not, and whether Apple does eventually allow Flash or not, I think is irrelevant to whether this device will change the way we think about computing in the future.

Some great articles and blog posts have been written already about why the iPad really is a revolutionary new device. I think I can summarise them in one sentence:

The iPad takes the tasks we use personal computers and the internet for the most and packages them into a focused, polished, device with an interaction method that gets out of the way.

To add a couple more sentences to that: It does this by removing the need to know about mice and windows and multi-tasking and file systems and cables and hard drives and memory and even, to some extent, physical keyboards! Not that those things are gone forever - just that, for most tasks - tasks that even us technically-savvy power users perform much of the time - they are just a distraction! They are, or were, the means not the end. The iPad gets you to what you were trying to achieve more directly than any device before. If this is not what you wanted that's fine. That doesn't mean this device won't change the way many - perhaps most - people interact with and think about computers forever.

But that hasn't answered the question of what can we use the iPad for right away? Well, even assuming there are no additional game-changing apps available on launch (remember that's still 2-3 months away) there are still some very worthwhile apps bundled with the device, or available on the app store (including Apple's own iWork suite). While none of these are conceptually revolutionary, the ways you interact with them, and when you can use them, may well be. Here are some of the things I can see myself doing that currently would be either impossible or a different quality of experience:

  • Reading on the train. The laptop is not always possible - iPhone too small for sustained reading
  • Reading/ watching video at the gym
  • Browsing maps. Especially when planning future travel or reflecting on earlier trips. Looks like a much better experience than a laptop.
  • Picture frame in my office. I was planning on getting one anyway - this will save me the cost, and it looks better than most, if not all, dedicated devices.
  • Task management. It's been great having GTD apps like Things with me all the time on my iPhone, but they still suck for data entry or big re-orgs - so I usually have to wait until I have access to my laptop - by which time I've often forgotten (which was why I was using them in the first place!). I'm confident that at least one such app will be available on, or shortly after, launch. If not, the existing iPhone versions, in pixel doubled mode, will still be more usable just by dint of a larger keyboard.
  • I'm hoping that the Remote app for controlling iTunes will soon be updated with an iPad UI as that will be an awesome way to control my household media
  • Showing photos. When family visit it's been great that we can browse photos on the TV or a laptop - but this will be a far more natural and intimate experience.
  • Playing vConqr. No seriously - it's going to be great! :-)
  • Of course casual browsing, email, contacts and calendar when I'm away from my laptop - or even instead of using the laptop for such things. That way I keep my workspace a little tidier. Obviously I'll still do serious web-browsing on the laptop (if only because of multiple tabs and copy-and-paste into my other desktop apps).

I should point out that I consider myself very much a power-user of my laptop/desktop computers. I have my laptop (a Macbook Pro) connected to a 30" display and still use Spaces to give me more workspace area! As a developer and technologist this will always be my primary computing experience, but I still see myself using the iPad for more and more tasks too - and not just when I'm on the move. For many people I can see it being their primary, and at some point only, computing device (for now, at least, it seems the iPad still needs to sync against a general purpose computer, but the need for that seems to be decreasing).

I started to write about multi-tasking here too, but realised I had a lot of material that diverges from this post, so I've moved it into its own article, which I'll post a bit later.
