Saturday, October 1, 2011

Fight! Amazon vs Apple for vertical dominance

The Kindle Fire, announced this week, represents Amazon's first earnest attempt to create a competitor to Apple's iPad and its associated media sales. This fight has been shaping up for years, with Amazon leading in value and back-end muscle (EC2) while Apple led the way in design, devices, and commerce with the iTunes Store. The Kindle Fire is the first genuine multi-role device Amazon has branded, capable of handling games, movies, music, and books.

While it's not a toe-to-toe iPad competitor - at 7" it's too small to make that claim - it's a capable tablet that leads the way in price/value at $199. This opens the door to sales vastly in excess of the volume the iPad can produce; the Kindle will sell an estimated 17m units across all models in 2011. The iPad is projected to sell roughly 40m during that time, but I believe the Kindle Fire may approach that next year as consumers recognize they can get much of the value of the iPad, as a casual browsing/gaming platform, in a much less expensive package. Households already owning an iPad may add a Kindle rather than a second iPad. In the long run the Kindle is priced like a game console or an iPod Touch, a far more casual purchase than the $499-minimum iPad.

The biggest difference lies in strategic value. As Jeff Bezos puts it, Amazon believes in creating "premium products and offering them at non-premium prices." Amazon can and does sell the Kindle line near break-even in order to create a long-term channel to its content and services. Apple has artfully played both sides of this game - selling beautifully designed products at a premium price AND earning significant profits on the content it offers through iTunes.

In the broad market there is room for both approaches. Just as Mercedes and Toyota can both prosper in the automotive market, Apple and Amazon can coexist. However, Apple's continued dominance of the tablet market and its associated services has just been called into question.



Wednesday, April 20, 2011

And Kindle Library Lending is announced

In my last post I discussed the landscape around electronic books and libraries. Today Amazon announced Kindle Library Lending... in collaboration with Overdrive, a DRM provider among other things. It'll be fascinating to watch the business landscape develop here.

Tuesday, February 15, 2011

Electronic books and libraries - tenable?

Libraries are changing as digital media becomes more popular. The rise of the electronic book is both the largest opportunity for libraries and the largest threat to their usefulness. For example, as reported in the New York Times, HarperCollins proposes to limit the number of times an electronic book can be checked out before the license expires. Libraries are, naturally, concerned about self-destructing books.

The electronic book has stormed into the world of publishing. In 2009, only about 2% of all books sold were eBooks, far behind sales of trade paperback and hardcover books. By the end of 2010, that number had climbed to nearly 10%, and US consumers spent almost $1B on eBooks. Estimates of the continued adoption rate vary widely, but all agree the trend will continue; one estimate puts 50% of the publishing market on eBooks by 2020. While online book sales and eBook distribution are clearly hurting brick-and-mortar bookstores such as Borders (preparing for Chapter 11 bankruptcy), the path ahead for lending libraries is muddled.

Chief among the challenges - large-scale purchase and lending of eBooks is largely prohibited by the Amazon customer agreement. Authors, publishers, and eBook vendors have yet to agree on a lending model which adequately protects their copyrights and business models. Much like the music industry before it, the publishing industry is struggling with a very rapid change in market behavior. Libraries are caught in the middle.

Libraries provide multiple values to communities, including meeting space, student study space, and access to written, audio, and video content. Digital book content has the potential to let libraries significantly reduce the number of hardcopy books they retain, especially for books in significant demand. Library costs are dominated by staffing, but new book purchases are an important component, representing about 15% of the Palo Alto and Mountain View city library budgets. Mountain View public library, for example, purchased 31,000 new book/media items (over 14,000 titles) in FY2010, at a cost of $770k. Each item in the collection circulates about 5 times per year on average, so this represents over 150,000 media uses.
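As a rough illustration of what that means per use, here is a quick back-of-the-envelope sketch using only the figures quoted above (the five-circulations-per-year average is the library's own number; the cost-per-use result is my arithmetic, not an official statistic):

```python
# Back-of-the-envelope math on the Mountain View figures quoted above:
# annual uses of the new acquisitions, and the implied cost per circulation.
new_items = 31_000     # new book/media items purchased in FY2010
cost_usd = 770_000     # reported acquisition cost
circ_per_year = 5      # average circulations per item per year

uses = new_items * circ_per_year
print(f"~{uses:,} circulations/year, about ${cost_usd / uses:.2f} per use")
# -> ~155,000 circulations/year, about $4.97 per use
```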

Libraries and publishers have been at odds over various issues through the years. In the 1970s, a significant conflict arose over the use of Xerox machines to copy excerpts of books and journals. The Supreme Court eventually ruled on the issue, establishing the bounds of 'fair use' which could be practiced.

Going to the economic root, how do authors thrive in the presence of libraries today? (OK - being an author isn't the easiest path to riches, but leaving that aside.) Consumers explore books at libraries, reading books they probably would not otherwise purchase. Libraries buy significant numbers of books, especially including books which don't sell in high volume. Consumers who do purchase books (rather than borrow them from libraries) may do so for various reasons, including permanence of ownership, time to access, and convenience.

Exploring the reasons for book ownership, each contributes meaningfully. Some people enjoy reading books when the mood strikes, and thus prefer ownership to borrowing. Others want to read a book as soon as it's published, and purchase rather than wait for a library copy to become available. Still others purchase in airports or other 'locations of convenience'. How do these motivations play out in a fully digital era? "Ownership" becomes less important than immediacy and convenience; witness the drop in DVD sales as streaming video becomes more widely available. The movie industry has tiered pricing based on time-from-release: movies are first available in theaters at prices comparable to paperback book prices. Months later, movies become available at lower prices on cable pay-per-view, and finally become yet more broadly available on TV and in streaming form.

The same time-from-release model exists in the publishing world too, for books in high demand, with higher-priced hardcover volumes arriving before lower-cost mass-market paperbacks.

Interestingly, there are authors giving their digital book editions away for free in order to encourage physical book sales. Will this model work in the longer term? As more people read eBooks as their primary media, authors may end up relying on donations and a 'shareware' model of income. Alternately (and in keeping with the video model), older books can be free or nearly free in order to encourage the purchase of the latest new books. This is compatible with the role of libraries as providers of all but the most recently published content. Consumers would still be incented to purchase access to the latest books. In addition, libraries could (and arguably should) continue to buy books for loan. Each eBook license purchased could reasonably be limited to a small number of simultaneous borrowers, especially during the first year after publication. Authors and publishers could fairly set a price per license which varies and drops (even to zero) over time, much as video pricing works today. Libraries might reasonably wait to purchase multiple license copies until the cost of licensing drops.

Experiments in eBook pricing will continue as authors and publishers weigh the impact of electronic distribution on their businesses. Libraries and the largest distributors of eBooks will ideally be active participants in these experiments. Amazon's current lack of 'library-friendliness' opens a door of opportunity for other digital media titans, most notably Google and Apple, to step in. Like the music industry before it, the book industry is poised for change.

Libraries will 'peacefully coexist' with eBooks when publishers, distributors, and authors agree to adopt aligned rules of engagement.

Wednesday, December 29, 2010

New York Times - David Pogue on camera sensor size

David Pogue wrote an illuminating article on digital camera sensor size, discussing how hard it is to figure out the size of a camera's sensor, and why it matters. A very good point. One of the things I like about the camera specifications listed at Digital Photography Review is the inclusion of sensor size in terms of both area and pixel density. (Area is handy because bigger is better, but pixel density is more revealing in my opinion, because it incorporates pixel count. Lower pixel density is better.)

Pogue then goes on to laud several cameras as having larger-than-average sensors for their class, such as the Canon PowerShot S95. They are very good cameras with excellent image quality, but the landscape is more complex than sensor size alone.

For example, the Canon S95 (4x zoom range) has 23MP/sq cm, while the Sony HX5 (a 10x zoom camera) has a pixel density of 37MP/sq cm. Lower pixel density generally translates to lower noise and cleaner images in low light. For reference, a large-sensor SLR like the Canon EOS-1D Mark IV has a pixel density of 3MP/sq cm - and takes incredibly low-noise images in low light. It also weighs several pounds.
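To make the comparison concrete, pixel density is simply pixel count divided by sensor area. A quick sketch reproducing the ballpark figures above; the sensor dimensions are approximate, typical published values rather than official specs:

```python
# Rough pixel-density check: density (MP/cm^2) = megapixels / sensor area.
# Sensor dimensions below are approximate.
CAMERAS = {
    # name: (megapixels, sensor width mm, sensor height mm)
    "Canon S95":           (10.0,  7.4,  5.6),   # 1/1.7" compact sensor
    "Sony HX5":            (10.2,  6.1,  4.6),   # 1/2.4" compact sensor
    "Canon EOS-1D Mark IV": (16.1, 27.9, 18.6),  # APS-H DSLR sensor
}

for name, (mp, w_mm, h_mm) in CAMERAS.items():
    area_cm2 = (w_mm / 10.0) * (h_mm / 10.0)   # convert mm x mm to cm^2
    density = mp / area_cm2                    # megapixels per square cm
    print(f"{name:22s} {area_cm2:5.2f} cm^2   {density:4.1f} MP/cm^2")
```

Running this lands close to the 23, 37, and 3 MP/sq cm figures quoted above, which is all the precision the argument needs.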

So while larger sensors have an advantage, buying a pocket camera on that basis alone leaves out an important dimension of flexibility - the lens zoom (focal length) range. Camera manufacturers use smaller sensors in compact cameras with large zoom ranges because a 'long lens' is roughly proportional in size to the sensor; to keep the camera pocket-sized, they use a smaller sensor. For many users, living without a 10x zoom range greatly limits the shots which can be captured. In particular, travel shots and action shots can be tough with a 4x zoom.

The other part of the landscape which is changing is sensor design. Digital camera sensors have become much more effective at capturing clean photos in low light as the technology has been refined. For example, the latest generation of Sony cameras uses the back-illuminated "Exmor" sensor design, which improves sensitivity by putting the circuit wiring below the light-sensitive layer. Even in low light (for example at ISO 1600), very satisfying photos can be captured relative to prior sensors, and the technology improvement can largely overcome the difference in sensor size between these compact cameras.

Before buying a camera, consider the kind of pictures you'd like to take (including zoom range), the degree of 'automatic' versus 'manual' control you prefer, and compare real photos captured by reviewers and other users. Try the "Comparison Widget" as a quick way to compare photos of different cameras side-by-side (for the cameras already reviewed.)

Monday, December 20, 2010

Loving and hating Microsoft

My wife is a serious Outlook user. For her, it's the nexus of all things organizational. Tasks, calendars, and email all converge in a carefully honed system which keeps our household - and generally at least one major educational organization at a time (PTA, etc.) - afloat. That said, Outlook 2003 had some issues and failings, so I decided to let her try OneNote, an element of Office 2010, as part of a longer-term transition plan.

Mistake #1 - believing the installer works as described. I installed OneNote by first electing to 'install alongside instead of replacing' Office 2003, selecting custom installation, and only picking OneNote, with all other Office apps 'installed on first use'. No errors or warnings! Great.

But Outlook 2003 was simply.... Gone. Nowhere to be found. Word and Excel 2003 were still available, but in fact NO version of Outlook was installed anymore.

Mistake #2 - not doing a full backup of the system directly before Mistake #1. I did have a backup of the Outlook 2003 data file - but it would be some real effort to roll back the system.

Mistake #3 - not verifying feature consistency. Always know ahead of time what the customer values. Some things that look small (like showing start and end times explicitly in appointment details in the calendar's week view) were very important to her and are simply unavailable in Outlook 2010.

After listening to some choice words and tweaking some really hard-to-decipher details (like the way tasks are ordered and displayed), Outlook 2010 now works closely enough to the way Outlook 2003 did to be acceptable. I'm halfway out of the doghouse, and some of the new features of Outlook 2010 (like color-based categorizing) are proving helpful. That said, I've been reminded how important it is to verify critical features when rolling a system out to users, and to ALWAYS have a rollback plan.

I've also been reminded about the dangers of believing the documentation.

Enjoy the holidays....

Thursday, November 11, 2010

The next camera you buy

Have you bought a camera lately? If you bought a cell phone within the last eighteen months, you have almost certainly bought a camera too. Do you love it? Probably not. Even the best of the cell phone cameras (arguably the iPhone 4 camera right now) is far less versatile than a $150 camera from Walmart. The reason isn't cost or complexity - it's lens size. An optical zoom lens of 'interesting' quality is just too big to wedge into a sleek smartphone. We're left with a fixed field of view and mediocre sensitivity. It's a disappointment considering the hype!

It turns out that your average 'good' pocket camera - I've recently been a fan of this Nikon - has about 10 optical 'elements' (lenses) to deliver fairly sharp images through a 10x zoom range. There are two general ways to get lots of versatility with fewer and smaller lens elements. Either you use variable focal elements such as liquid lenses a la Varioptic, or you 'relax' your constraints on preserving nice flat images, allow more 'distortion', and remove it digitally afterward. It's the latter approach which pays greater dividends. By designing the optics and the digital correction together, better end-point images can be achieved. Ultimately, smaller, cheaper, and simpler lenses can achieve results that formerly required exotic and heavy glass. Canon's S95 uses in-camera correction to deliver a significant reduction in distortion over its zoom range. As the compute power to run these algorithms moves downstream, cell phones with powerful optical zoom lenses should appear, with Japan, as usual, the harbinger of broader market growth.
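To give a feel for how the "distort optically, correct digitally" approach works, here is a minimal sketch of removing simple barrel distortion in software. The one-term radial model and the k1 value below are purely illustrative; a real camera pipeline uses per-lens, per-zoom calibration data measured at the factory, plus better interpolation than the nearest-neighbor sampling used here.

```python
import numpy as np

def undistort_radial(img, k1=-0.15):
    """Remove simple radial (barrel) distortion from an HxWx3 uint8 image.

    Uses a one-term radial model, r_distorted = r * (1 + k1 * r^2), applied as
    an inverse mapping: for every pixel of the corrected output, compute where
    it came from in the distorted source, then sample nearest-neighbor.
    """
    h, w = img.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    norm = max(cx, cy)                      # normalize radius to roughly [0, 1]

    yy, xx = np.mgrid[0:h, 0:w]
    x = (xx - cx) / norm
    y = (yy - cy) / norm
    r2 = x * x + y * y

    # Where did each output pixel originate in the distorted source image?
    scale = 1.0 + k1 * r2
    src_x = np.clip(x * scale * norm + cx, 0, w - 1).round().astype(int)
    src_y = np.clip(y * scale * norm + cy, 0, h - 1).round().astype(int)
    return img[src_y, src_x]

if __name__ == "__main__":
    # Synthetic grid pattern: straightening of the lines makes the effect visible.
    test = np.zeros((480, 640, 3), dtype=np.uint8)
    test[::40, :, :] = 255
    test[:, ::40, :] = 255
    corrected = undistort_radial(test, k1=-0.15)
    print(corrected.shape)
```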

Thursday, July 22, 2010

From iPhone intention to Droid X


I had every intention of moving to the iPhone 4. The user interface, camera, and design are all clearly top-notch. Antenna glitch aside, it's a stunning piece of engineering. And then... AT&T happened. I had an old 'AT&T blue' data account and SIM, and the back-end order systems which manage migration have bitten me before. In this case, they busted during the iPhone 4 initial release frenzy. On top of that, AT&T had massive challenges (9 months of incorrect billing) on my business land-line account. Finally, my attempts to use the Nexus One on AT&T, watching every other call never reach the handset, convinced me I was destined to switch carrier.

In a more ideal world I would have chosen between the iPhone 4 and the Droid X on Verizon - but in 2010 AT&T remains the only iPhone 4 carrier and the choice is already made. Droid X it was to be.

The Droid X is a fine phone and a very capable smartphone. I am impressed by the display, the soft keypad, and the range of Android widgets and apps. The camera is the only bit of dirt in the sauce. While billed as 8MP, images look distinctly soft even when taken in bright light. A small 1:1 sample of a document image gives a good idea of what you can expect.


Downscaling the image for reference, the 'true resolution' is closer to 3MP, with the remainder being noise and blur. On the positive side, the images are not mangled by excessive in-camera sharpening, so a certain amount of post-processing (an unsharp mask, for instance) can yield slightly more appealing results.
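For the curious, that post-processing workflow is roughly the following. This is a hedged sketch using Pillow; the filename and the scale, radius, and percent values are placeholders chosen to illustrate the idea, not settings tuned for the Droid X.

```python
from PIL import Image, ImageFilter

def sharpen_soft_capture(path_in, path_out, scale=0.6):
    """Downscale a soft capture toward its 'true' detail level, then unsharp-mask it."""
    img = Image.open(path_in)
    # Downscale: an 8MP frame with ~3MP of real detail loses little here,
    # and the resampling averages away some of the noise.
    new_size = (int(img.width * scale), int(img.height * scale))
    img = img.resize(new_size, Image.LANCZOS)
    # Modest unsharp-mask settings to avoid halos on an already-noisy image.
    img = img.filter(ImageFilter.UnsharpMask(radius=2, percent=120, threshold=3))
    img.save(path_out, quality=92)

if __name__ == "__main__":
    # Hypothetical filenames for illustration.
    sharpen_soft_capture("droid_x_sample.jpg", "droid_x_sample_sharpened.jpg")
```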

This is yet another classic case of wasted pixels though. Had Motorola done their homework and selected a good 5MP module as Apple did, a significantly better camera could have emerged. As it is, I'll be enjoying the data-centric side of this phone while still fondly remembering the camera in my now-venerable Nokia N95.....