Holiday season free giveaway: a must-read for developers

Among my top books for 2011 is this one by Jez Humble and David Farley. I reviewed it here, and it also sparked some discussion about the differences between the various continuous software development/deployment models.

I have a spare copy of this book to give away. All you need to do is comment on this post with a valid email address – this will not be posted or used for any other purpose, but I will use it to request your postal address if you win. Please do not include a URL, as it risks being dumped in the spam bucket!

On January 6th I will select a winner at random. I will post to anywhere in the world.

Update: I have selected a winner. To do so, I used Java’s Random class to generate a number between zero and one less than the number of comments. The number it came up with was 4, so the winner is Ian Smith, the 5th person to comment. Congratulations!

awards 2011: ten key happenings, from Nokia’s burning platform to HP’s nightmare year

2011 felt like a pivotal year in technology. What was pivoting? Well, users are pivoting away from networks and PCs and towards cloud and devices. The obvious loser is Microsoft, which owns PCs and networks but is a distant follower in devices and has mixed prospects in the cloud. Winners include Apple, Google, Amazon, and Android vendors. These trends have been obvious for some time, but in 2011 we saw dramatic evidence of their outcome. As 2011 draws to a close, here is my take on ten happenings, presented as the first ever annual awards.

1. Most dramatic moment award: Nokia’s burning platform and alliance with Microsoft

In February Nokia’s Stephen Elop announced an alliance with Microsoft and a commitment to Windows Phone 7. In October we saw the first results in terms of product: the launch of the Lumia smartphone. It is a lovely phone, though with some launch imperfections such as short battery life. We also saw greatly improved marketing, following the dismal original Windows Phone 7 launch a year earlier. Enough? Early indications are not too good. Simply put, most users want iOS or Android, and the app ecosystem, which Elop stated as a primary reason for adopting Windows Phone, is not there yet. Both companies will need to make some smart moves in 2012 to fix these issues, if that is possible. But how much time does Nokia have?

2. Riskiest technology bet: Microsoft unveils Windows 8

In September 2011 Microsoft showed a preview of Windows 8 to developers at its BUILD conference in California. It represents a change of direction for the company, driven by competition from Apple and Android. On the plus side, the new runtime in Windows 8 is superb and this may prove to be the best mobile platform from a developer and technical perspective, though whether it can succeed in the market as a late entrant alongside iOS and Android is an open question. On the minus side, Windows 8 will not drive upgrades in the same way as Windows 7, since the company has chosen to invest mainly in creating a new platform. I expect much debate about the wisdom of this in 2012.

Incidentally, amidst all the debate about Windows 8 and Microsoft generally, it is worth noting that the other Windows 8, the server product, looks like being Microsoft’s best release for years.

3. Best cloud launch: Office 365

June 2011 saw the launch of Office 365, Microsoft’s hosted collaboration platform based on Exchange and SharePoint. It was not altogether new, since it is essentially an upgrade of the older BPOS suite. Microsoft is more obviously committed to this approach now though, and has built a product with both the features and the price to appeal to a wide range of businesses that want to move to the cloud but prefer the familiarity of Office and Exchange to the browser-based world of Google Apps. Bad news though for Microsoft partners who make lots of money nursing Small Business Server and the like.

4. Most interesting new cross-platform tool: Embarcadero Delphi for Windows, Mac and iOS

Developers, at least those who have still heard of Embarcadero’s rapid application development tool, were amazed by the new Delphi XE2, which lets you develop for the Mac and Apple iOS as well as for Windows. This good news was tempered by the discovery that the tool was seemingly patched together in a bit of a hurry, and that most existing applications would need extensive rewriting. Nevertheless, it is an interesting new entrant in the world of cross-platform mobile tools.

5. Biggest tech surprise: Adobe shifts away from its Flash Platform


This one caught me by surprise. In November Adobe announced a shift in its business model away from Flash and away from enterprise development, in favour of HTML5, digital media and digital marketing. It also stated that Flash for mobile would no longer be developed once existing commitments were completed. The shift is not driven by poor financial results, but rather reflects the company’s belief that this will prove a better direction in the new world of cloud and device. Too soon and too sudden? Maybe 2012 will show the impact.

6. Intriguing new battle award: NVIDIA versus Intel as GPU computing catches on

In 2011 NVIDIA announced a number of wins in the supercomputing world as many of these huge machines adopted GPU computing, and I picked up on something of a war of words with Intel over the merits of what NVIDIA calls heterogeneous computing. Intel is right to be worried, in that NVIDIA sees a future based on its GPUs combined with ARM CPUs. NVIDIA should worry too though, not only as Intel readies its “Knights Corner” MIC (Many Integrated Core) chips, but also as ARM advances its own Mali GPU; there is also strong competition in mobile GPUs from Imagination, whose designs are used by Apple and others. The GPU wars will be interesting to watch in 2012.

7. Things that got worse award: Spotify. Runners up: Twitter, Google search

Sometimes internet services come along that are so good within their niche that they can only get worse. Spotify is an example: a music player that for a while let you play almost anything almost instantly through its simple, intuitive player. It is still pretty good, but Spotify got worse in 2011, with limited plays on free accounts, more intrusive ads, and a sign-up process that now requires a Facebook login. Twitter is another example, with URLs now transformed into shortcuts whether you like it or not, plus annoying promoted posts and recommended follows. Both services are desperately trying to build a viable business model on their popularity, so I have some sympathy. I have less sympathy for Google. I am not sure when it started making all its search results into Google links that record your click before redirecting you, but it is both annoying and slow, and I am having another go with Bing as a result.

8. Biggest threat to innovation: Crazy litigation from Lodsys, Microsoft, Apple

There has always been plenty of litigation in the IT world. Apple vs Microsoft regarding graphical user interfaces in 1994; Sun vs Microsoft regarding Java in 1997; SCO vs IBM regarding UNIX in 2003; and countless others. However, many of us thought that the biggest companies exercised restraint, on the grounds that all have significant patent portfolios and trench warfare over patent breaches helps nobody but lawyers. But what if patent litigation is your business model? The name Lodsys sends a chill down any developer’s spine, since if you have an app that supports in-app purchases you may receive a letter from them, and your best option may be to settle, though others disagree. Along with Lodsys and the like, 2011 also brought Microsoft vs several OEMs over Android, Apple vs Samsung over Android, and much more.

9. Most horrible year award: HP

If any company had an Annus Horribilis it was HP. It invested big in WebOS, acquired with Palm; launched the TouchPad in July 2011; announced in August that it was ceasing WebOS development and considering selling off its Personal Systems Group; and fired its CEO Leo Apotheker in September 2011.

10. Product that deserves better award: Microsoft LightSwitch

On reflection maybe this award should go to Silverlight; but it is all part of the same story. Visual Studio LightSwitch, released in July 2011, is a model-driven development tool that generates Silverlight applications. It is nearly brilliant, and does a great job of making it relatively easy to construct business database applications, locally or on Windows Azure, complete with cross-platform Mac and Windows clients, and without having to write much code. Several things are unfortunate though. First, the usual version 1.0 problems: poor documentation and odd limitations. Second, it generates Silverlight, when Microsoft has made it clear that its future focus is HTML 5. Third, it targets Windows and (with limitations) the Mac, at a time when something that addresses the growing interest in mobile devices would be a great deal more interesting. Typical Microsoft own-goal: Windows Phone 7 runs Silverlight, LightSwitch generates Silverlight, but no, your app will not run on Windows Phone 7. Last year I observed that Microsoft’s track record on modelling in Visual Studio is to embrace in one release and extinguish in the next. History repeats?

An Apple iPad Christmas

The Apple iPad had a stunning Christmas – at least, it did in my part of the world.

A key factor was that EA Games decided to offer a range of classic board games adapted as iPad apps for 69p ($0.90) each. So for less than the cost of a takeaway pizza I downloaded Scrabble, Monopoly, Trivial Pursuit and Risk.


The games are not perfect – Scrabble accepts all sorts of odd words and US spellings, for example – but they are official licensed versions, nicely implemented, and a lot of nostalgic fun, which is the idea after all.


Trivial Pursuit supports in-game purchases for extra questions, so that could work out more expensive eventually, but nobody could complain about the value.

It is not quite the full board game experience, with wine spilt on the pieces, junior tipping over the board in disgust, and game abandoned early because it is time to visit grandma, but the changes are mostly for the better.

One thought: this is another example of how well a tablet substitutes for physical things. A book, a board game, a photo album: the iPad is a better replacement than a PC or laptop, easily passed round, long battery life, no flapping screen, and a more natural user interface.

I am not sure what the economics are of selling games at 69p, but no doubt EA has drawn the graphs. Currently EA’s 69p games occupy four of the “Top Paid iPad Apps” category slots in the UK store.

Of course I am interested in the big picture. Looking at user reviews of Android equivalents like Monopoly I get the impression that there are more bugs, partly because EA has a dedicated iPad version of these games whereas the Android versions are universal across multiple screen sizes, and partly because there are more OS versions and hardware differences to accommodate.

What about other tablets or new entrants to the market like Windows 8 in 2012? Prising users away from their Apple devices will not be easy, though I still think Microsoft has chances if it plays to its strengths in business applications.

Running Windows on an Apple iPad

I love the convenience of the iPad but there are times when I miss Windows apps. It is not just for work; there is nothing on the iPad to rival Jack Bridge, for example.

The solution is to run Windows on the iPad via remote desktop.


Most versions of Windows have remote desktop built-in, though you do need to install a client on the iPad. I have tried several and settled for the moment on Mocha RDP. If you tap the up arrow at bottom right, you get a toolbar which controls the on-screen keyboard, extra keys useful for Windows, and a menu with options including a macro of pre-defined keystrokes. It even works with my cheap iPad keyboard.

The downside of this approach is that Windows needs to be running somewhere on your network. However, Mocha RDP supports Wake-on-LAN, so you can turn the PC on remotely; note that this normally needs to be enabled in the PC’s BIOS.
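For the curious, the Wake-on-LAN mechanism is simple enough to sketch: a “magic packet” of 6 bytes of 0xFF followed by the target PC’s MAC address repeated 16 times, sent as a UDP broadcast. A client like Mocha RDP does this for you; the sketch below shows the equivalent in Python, with a hypothetical MAC and broadcast address:

```python
import socket

def build_magic_packet(mac):
    # A Wake-on-LAN "magic packet": 6 bytes of 0xFF followed by
    # the target's 6-byte MAC address repeated 16 times (102 bytes).
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    if len(mac_bytes) != 6:
        raise ValueError("expected a 6-byte MAC address")
    return b"\xff" * 6 + mac_bytes * 16

def wake(mac, broadcast="192.168.1.255", port=9):
    # Broadcast the packet; the sleeping PC's network adapter,
    # with Wake-on-LAN enabled in the BIOS, listens for its own MAC.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(build_magic_packet(mac), (broadcast, port))

# Example (hypothetical address): wake("00:11:22:33:44:55")
```

The point is that waking the machine needs no agent on the target at all; the network card does the listening, which is why the BIOS setting matters.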

In my case I already run Hyper-V Server, a free download. I have installed Windows 7 in a VM (virtual machine), so it is always available.

The iPad also supports VPN (Virtual Private Network), so given a decent broadband connection I could connect to Windows while out and about. Alternatively there are systems like LogMeIn which do not require a VPN, though you have to install the LogMeIn agent on the target PC.

The general approach makes a lot of sense to me. Technically it is a hybrid thin/thick client approach. An iPad or other tablet is smart and has its own local apps and storage, but does not attempt to provide the full capabilities of a PC or Mac. When you need that, you can log into a remote desktop.

It is another example of how the mobile revolution is making us rethink how we do computing. The thin client concept is nothing new, but it is only now that it is becoming compelling for users as well as administrators, giving them the convenience of a tablet as well as access to rich applications like Microsoft Office.

Microsoft no doubt has its own plans for combining tablets with desktop-as-a-service. I would guess that it involves Windows 8 on ARM; but it will take some effort to tempt users away from their iPads.

Adobe: why the big business shift when financial results look so good?

Adobe released its quarterly and full year results last week; I am catching up with this now after a week in China.

The company is doing well. Revenue is up by 11% year on year and it generated $1.5 billion in cash. It is buying back shares, usually a sign that a company has more money than it knows what to do with.

Here is the comparison with the equivalent quarter last year:

  Segment (revenue, $ millions)    Q4 2010   Q4 2011
  Creative and interactive           404.8     437.2
  Digital Media                      165.9     186.4
  Digital Enterprise                 273.3     342.4
  Omniture                           109.0     131.1
  Print and publishing                55.0      55.1

In other words, all business segments grew – impressive in uncertain economic times. See this earlier post for a rough breakdown of the segments.

A couple of observations. First, Adobe is benefiting from the big trend in IT towards web, cloud and device. Many companies regard apps (as in mobile apps) as vehicles for marketing, and Adobe’s tools are a natural fit, with or without Flash. We are in a more design-centric IT world than was the case a few years back, driven by Apple, SEO (Search Engine Optimisation), and just because we can: technology now performs basic computing functions with ease, so design becomes the key differentiator.

Adobe is nevertheless remarkable in the way it has managed the transition from print to digital. Few companies manage that kind of fundamental shift in their market successfully.

The other point that interests me is why Adobe announced a major change in its business model in November. Digital media and marketing will be the focus, while it winds down its enterprise development platform, as well as moving away from Flash and focusing on HTML5 for delivery.

Unless the announced figures disguise future problems that are only visible from the inside, this move was not driven by bad results. Digital Enterprise, which includes the middleware business, increased revenue by 25% over the same quarter last year.

In 2012 the Digital Enterprise segment is being renamed Digital Marketing Solutions, expressing the company’s intent.

Adobe’s change of direction caught me by surprise, as it was not really flagged at the MAX conference the previous month, though there was evidence of struggle with regard to Flash versus HTML5.

I would describe Adobe’s moves as bold. Taking action ahead of when it becomes inevitable is a good thing, but there are significant risks. Adobe’s platform is all about synergies, and chopping off bits that still have a significant following may have unexpected consequences.

Another curious facet of Adobe’s move is that its normally excellent PR department has done little, as far as I am aware, to brief the press. Major news concerning what will be donated to Apache, or the discontinuation of Flash Catalyst, has emerged from sporadic reports instead. Normally that is a sign of a company under stress, rather than one which is about to deliver excellent results.

I guess this time next year we will have a clearer picture.

Android: good or bad for Java? Oracle claims harm but I am sceptical

Patent blogger Florian Mueller quotes a statement filed by Oracle in its legal dispute with Google over its use of the Java language in Android:

Android’s growth in the mobile device market has been exponential, steadily diminishing Java’s share. For instance, Amazon’s newly-released Kindle Fire tablet is based on Android, while prior versions of the Kindle were Java-based. Android has been gaining in other areas as well, with Android-based set-top boxes and even televisions appearing this year. These are markets where Java has traditionally been strong but is now losing ground to Android. The longer Android is allowed to continue fragmenting the Java ecosystem, the more serious the harm to Java becomes, and the more difficult it is to try to unwind. Oracle suffers harm in the form of lost licensing opportunities for its existing Java platform products, and the enterprise-wide harm from fragmentation of Java, which reduces the ‘write once, run anywhere’ capability that has historically provided Java such great value.

The Kindle is an interesting example. I had not realised that the pre-Fire Kindle runs Java, but Oracle shows it as a case study and indeed, here are the javadocs.

Android infuriates Oracle because it uses the Java language, but has its own virtual machine called Dalvik. Dalvik bytecode is different from Java bytecode.

I have no expertise on the legal position, but while I can see Oracle’s point it is also true that Android has greatly boosted interest in Java development. Although Google has fragmented Java, the fact that the language is the same benefits Oracle insofar as it increases the pool of Java developers who may also be inclined to create Java applications on the server or in other contexts.

The interesting question to ask is where Java would be without Android. On mobile, it would likely be close to death. Apple’s iOS platform is as resistant to Java as it is to Adobe Flash. RIM Blackberry used to be a Java platform, but is moving away:

While we will continue to support our BlackBerry Java developer community as they build for BlackBerry smartphones, after further investigation we decided against supporting BlackBerry Java on BlackBerry BBX. We concluded that the BlackBerry Java experience on the BlackBerry PlayBook platform would ultimately not satisfy us, our development community, or our customers as the platform continues to evolve.

Microsoft has no interest in Java on the Windows Phone OS or in the Windows 8 OS that will likely replace it on devices.

Oracle’s claim is in the context of a legal dispute, and as Mueller observes, the company is happy to show off growing interest in Java in its press releases – though without mentioning the A word.

Of course you can understand why Oracle might want to enjoy the benefit of Java’s Android boost as well as the reward of a legal victory over Google.

PS: interesting that Oracle’s Java press release seems to be served by Microsoft .NET:


On Supercomputers, China’s Tianhe-1A in particular, and why you should think twice before going to see one

I am just back from Beijing courtesy of Nvidia; I attended the GPU Technology conference and also got to see not one but two supercomputers:  Mole-8.5 in Beijing and Tianhe-1A in Tianjin, a coach ride away.

Mole-8.5 is currently at no. 21 and Tianhe-1A at no. 2 on the top 500 list of the world’s fastest supercomputers.

There was a reason Nvidia took journalists along, of course. Both are powered partly by Nvidia Tesla GPUs, and the trip is part of the company’s campaign to convince the world that GPUs are essential for supercomputing, because of their greater efficiency than CPUs. Intel says we should wait for its MIC (Many Integrated Core) chips instead; but Nvidia has a point, and increasing numbers of supercomputers are plugging in thousands of Nvidia GPUs. That does not include the world’s current no. 1, Japan’s K Computer, but it will include the USA’s Titan, currently no. 3, which will add up to 18,000 GPUs in 2012 with plans that may take it to the top spot; we were told that it aims to be twice as fast as the K Computer.

Supercomputers are important. They excel at processing large amounts of data, so typical applications are climate research, biomedical research, simulations of all kinds used for design and engineering, energy modelling, and so on. These efforts are important to the human race, so you will never catch me saying that supercomputers are esoteric and of no interest to most of us.

That said, supercomputers are physically little different from any other datacenter: rows of racks. Here is a bit of Mole-8.5:


and here is a bit of Tianhe-1A:


In some ways Tianhe-1A is more striking from outside.


If you are interested in datacenters, how they are cooled, how they are powered, how they are constructed, then you will enjoy a visit to a supercomputer. Otherwise you may find it disappointing, especially given that you can run an application on a supercomputer without any need to be there physically.

Of course there is still value in going to a supercomputing centre to talk to the people who run it and find out more about how the system is put together. Again though I should warn you that physically a supercomputer is repetitive. They achieve their mighty flop/s (floating point operations per second) counts by having lots and lots of processors (whether CPU or GPU) running in parallel. You can make a supercomputer faster by adding another cupboard with another set of racks with more boards with CPUs


or GPUs


and provided your design is right you will get more flop/s.
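That scaling is simple arithmetic: theoretical peak flop/s is just the per-device peak multiplied out across all the devices. The figures below are purely illustrative and not the specification of any real machine:

```python
def peak_flops(cabinets, devices_per_cabinet, flops_per_device):
    # Theoretical peak flop/s is additive across identical devices,
    # assuming the interconnect and software can keep them all busy --
    # which is why adding another cabinet of racks adds more flop/s.
    return cabinets * devices_per_cabinet * flops_per_device

# Illustrative only: 100 cabinets of 70 devices at 500 gigaflops each
# gives a theoretical peak of 3.5 petaflops.
print(peak_flops(100, 70, 5e11))  # 3.5e+15
```

Sustained performance on real workloads is of course lower than this theoretical peak, which is where the network and software come in.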

Yes there is more to it than that, and points of interest include the speed of the network, which is critical in order to support high performance, as well as the software that manages it. Take a look at the K Computer’s Tofu Interconnect. But the term “supercomputer” is a little misleading: we are talking about a network of nodes rather than a single amazing monolithic machine.

Personally I enjoyed the tours, though the visit to Tianhe-1A was among the more curious visits I have experienced. We visited along with a bunch of Nvidia executives. The execs sat along one side of a conference table, the Chinese hosts along the other side, and they engaged in a diplomatic exercise of being very polite to each other while the journalists milled around the room.


We did get a tour of Tianhe-1A but unfortunately little chance to talk to the people involved, though we did have a short group interview with the project director, Liu Guangming.


He gave us short, guarded but precise answers, speaking through an interpreter. We asked about funding. “The way things work here is different from how it works in the USA,” he said, “The government supports us a lot, the building and infrastructure, all the machines, are all paid for by the government. The government also pays for the operational cost.” Nevertheless, users are charged for their time on Tianhe-1A, but this is to promote efficiency. “If users pay they use the system more efficiently, that is the reason for the charge,” he said. However, the users also get their funding from the government’s research budget.

Downplayed on the slides, but mentioned here, is the fact that the supercomputer was developed by the National University of Defense Technology. Food for thought.

We also asked about the usage of the GPU nodes as opposed to the CPU nodes, having noticed that many of the applications presented in the briefing were CPU-only. “The GPU stage is somewhat experimental,” he said, though he is “seeing increasing use of the GPU, and such a heterogeneous system should be the future of HPC [High Performance Computing].” Some applications do use the GPU and the results have been good. Overall the system has 60-70% sustained utilisation.

Another key topic: might China develop its own GPU? Tianhe-1A already includes 2,048 China-designed “Galaxy FT” CPUs, alongside 14,336 Intel CPUs and 7,168 NVIDIA GPUs.

We already have the technology, said Guangming.

From 2005 to 2007 we designed a chip, a stream processor similar to a GPU. But the peak performance was not that good. We tried AMD GPUs, but they do not have ECC [Error Correcting Code], so that is why we went to NVIDIA. China does have the technology to make GPUs. Also the technology is growing, but what we implement is a commercial decision.

Liu Guangming closed with a short speech.

Many of the people from outside China might think that China’s HPC experienced explosive development last year. But China has been involved in HPC for 20 years. Next, the Chinese government is highly committed to HPC. Third, the economy is growing fast and we see the demand for HPC. These factors have produced the explosive growth you witnessed.

The Tianjin Supercomputer is open and you are welcome to visit.

Adobe discontinues Flash Catalyst, clarifies Flex and Flash Builder futures

Adobe has told a group of Flex developers, invited to San Francisco for a special conciliatory summit following the sudden announcement that Flex is moving to the Apache Foundation, that Flash Catalyst will be discontinued. Developer Fabien Nicollet was there and posts:

CS5.5 version of Catalyst is the latest version of Flash Catalyst. It is compatible with Flex 4.5, but compatibility will not be ensured for future versions.

Flash Builder will also have features removed in future versions. Adobe’s slide talks of:

Removing unpopular and expensive to maintain features: Design View, Data Centric Development (DCD) and Flash Catalyst workflows.

The Monocle profiler, shown at the MAX conference as a sneak peek, “continues as a priority”.

The FalconJS project, to compile Flex to HTML5, will be discontinued, though possibly donated to Apache at a date to be determined.

AIR on Linux will not be given to Apache because it would mean sharing the proprietary Flash Player code. This is bad news in the Apache context.

Nicollet concludes:

Flex still has a bright future for companies who want to build fast and robust applications. Not to mention the people who will have a hard time building complex applications on HTML5, for whom Flex will always be a viable and mature alternative.

That is the optimistic view. What is clear from the summit is that Adobe is greatly reducing its investment. I guess we knew this already; but hearing about how Flash Builder will be cut down, Catalyst discontinued, and so on, will not improve developer confidence.

A lot depends on the progress of the Apache project. My concern here is that since the Flash player, which is the Flex runtime, remains proprietary, this will dampen enthusiasm in the open source community and limit its ability to innovate around Flex.

Review: my bargain iPad Bluetooth keyboard from a Chinese market

During my recent visit to Beijing I went along to the Hong Qiao market. It was quite an experience, with lots of fun gadgets on display, mostly fake but with plenty of good deals to be had.


I did not buy much but could not resist an iPad Bluetooth keyboard. I have been meaning to try one of these for a while. The one I picked is integrated into a “leather” case.


The packaging is well future-proofed:


Of course I had to haggle over the price, and we eventually settled on ¥150, about £15.00 or $24.00.

It comes with a smart 12-page manual, which you will enjoy if you like slightly mangled English, though there are some small differences between the product and the manual. A power LED is described in the manual but seems not to exist. The manual makes a couple of references to Windows and in fact the keyboard does also work with Windows, but there is nothing silly like a Windows key and this really is designed for the iPad.

No manufacturer is named, which is odd as the vendor insisted that it is “original”, though the box does say “Made in China”.

The design is straightforward. The iPad slots in to what becomes the top flap of the case. Open the case, and you can set the iPad into an upright position for typing. The lower flap of the case has a magnetic clasp, which works fine. It is a bit of a nuisance though as it gets a little in the way when you are in typing mode. You cannot fold it back to tuck it out of the way.


I noticed a few blemishes in the case; possibly I had a second-grade example.


But I have not found any technical problems.

The unit is supplied with a micro USB cable for charging. It did not take long to charge, and I think it was already half-charged when I purchased it.

Here is a closer look at the keyboard itself.


Once charged, you turn on the power and pair it to your iPad by pressing the Connect button. I had a little difficulty with this until I discovered that you must press down until you feel a distinct click, then it goes into pairing mode. If you then go into the iPad’s Bluetooth settings you will see the keyboard as an available device. Connect, and you are prompted with a code. Type this code on the keyboard to complete the pairing.

The power switch on the keyboard is impossibly small and fiddly to use. If you know how small a standard micro USB socket is, you will get the scale from this picture:


So can you just leave the keyboard on? The keyboard claims a standby time of 100 days, so maybe that will be OK, though the manual warns:

When you are finished using your keyboard or you will be required after the keyboard to carry, so don’t forget to set aside the keyboard to switch the source OFF, turn off the keyboard’s power to extend battery life.

Note: When you normally using the keyboard, or if you are not using the keyboard and didn’t turn off the power switch, please don’t fold or curly, so you will have been working at the keyboard, it will greatly decrease you using the keyboard.

I think this means that turning it off is recommended.

Now the big question: how is it in use? It is actually pretty good. I can achieve much faster text input with the keyboard than using the on-screen option, and it is great to see your document without a virtual keyboard obscuring half the screen.

The keyboard is the squishy type and claims to be waterproof. In fact:

It is waterproof, dustproof, anti-pollution, anti-acid, waterproof for silicone part

according to the manual, as well as having:

Silence design, it will not affect other people’s rest.

which is good to know.

The keyboard has a US layout, but shift-3 gets me a £ sign and alt-2 a € symbol so I am well covered.

There are a number of handy shortcut keys along the top which cover brightness, on-screen keyboard display, search, iTunes control, and a few other functions. There is a globe key that I have not figured out; it looks as if it should open Safari but it does not. There are also Fn, Control, Alt and Command keys, cursor keys, and Shift keys at left and right. Most of the keyboard shortcuts I have seen listed for iPad keyboards in general seem to work here as well.

Learning keyboard shortcuts is one of the things you need to do in order to get the best from this. For example, press alt+e and then any vowel to get an acute accent, press alt+backtick and then any vowel to get a grave accent, and so on. Finding the right shortcuts is a bit of an adventure and I have more to discover. Not everything is covered; I have not found any way to apply bold from the keyboard in Pages, for example. I would also love to find an equivalent to alt-tab on Windows, which cycles through running apps. There is a Home key which you can double-tap, but then you have to tap the screen to select an app (unless you know better).

I am pleased with the keyboard, though given the defects in the case and irritations like the tiny power switch it is not really a huge bargain. I find it thought-provoking though. Is iPad + keyboard all I need when on the road, or have I just recreated an inferior netbook? The size and weight are not much different.


Unlike some, I do still see value in the netbook, which has a better keyboard, a battery life that is nearly as good (at least it was when new), handy features like USB, ethernet and VGA ports, and the ability to run Microsoft Office and other Windows apps.

I am also finding that while I like the iPad keyboard for typing, the integrated case has a downside. If you just want to use the iPad as a tablet, the keyboard gets in the way. Maybe a freestanding Bluetooth keyboard is better, like the official Apple item, though that means another item kicking around in your bag.

In the end, the concept needs a little more design work. Having a keyboard in the case is a good idea, but it needs to be so slim that it does not bulk up the package much and gets out of the way when not needed. Perhaps some sort of fabric keyboard is the answer.

Incidentally, if you hanker after one of these but cannot get to the Hong Qiao market, try eBay or Amazon for a number of keyboard cases that look similar to me. Look carefully though; I noticed one by “LuvMac” which lacks a right Shift key, causing some complaints. Mine does have a right Shift key; perhaps it is a later revision.

Hmm, I have just realised that the lady on the stall forgot to give me a receipt or warranty …

NVIDIA plans to merge CPU and GPU – eventually

I spoke to Dr Steve Scott, NVIDIA’s CTO for Tesla, at the end of the GPU Technology Conference which has just finished here in Beijing. In the closing session, Scott talked about the future of NVIDIA’s GPU computing chips. NVIDIA releases a new generation of graphics chips every two years:

  • 2008 Tesla
  • 2010 Fermi
  • 2012 Kepler
  • 2014 Maxwell

Yes, it is confusing that the Tesla brand, meaning cards for GPU computing, has persisted even though the Tesla family is now obsolete.

Dr Steve Scott showing off the power efficiency of GPU computing

Scott talked a little about a topic that interests me: the convergence or integration of the GPU and the CPU. The background here is that while the GPU is fast and efficient for parallel number-crunching, it is of course still necessary to have a CPU, and there is a price to pay for the communication between the two. The GPU and the CPU each have their own memory, so data must be copied back and forth, which is an expensive operation.

One solution is for GPU and CPU to share memory, so that a single pointer is valid on both. I asked CEO Jen-Hsun Huang about this, and he did not hold out much hope:

We think that today it is far better to have a wonderful CPU with its own dedicated cache and dedicated memory, and a dedicated GPU with a very fast frame buffer, very fast local memory, that combination is a pretty good model, and then we’ll work towards making the programmer’s view and the programmer’s perspective easier and easier.

Scott on the other hand was more forthcoming about future plans. Kepler, which is expected in the first half of 2012, will bring some changes to the CUDA architecture which will “broaden the applicability of GPU programming, tighten the integration of the CPU and GPU, and enhance programmability,” to quote Scott’s slides. This integration will include some limited sharing of memory between GPU and CPU, he said.

What caught my interest though was when he remarked that at some future date NVIDIA will probably build CPU functionality into the GPU. The form that might take, he said, is that the GPU will have a couple of cores that do the CPU functions. This will likely be an implementation of the ARM CPU.

Note that this is not promised for Kepler nor even for Maxwell but was thrown out as a general statement of direction.

There are a couple of further implications. One is that NVIDIA plans to reduce its dependence on Intel. ARM is a better partner, Scott told me, because its designs can be licensed by anyone. It is not surprising then that Intel’s multi-core evangelist James Reinders was dismissive when I asked him about NVIDIA’s claim that the GPU is far more power-efficient than the CPU. Reinders says that the forthcoming MIC (Many Integrated Core) processors codenamed Knights Corner are a better solution, referring to the:

… substantial advantages that the Intel MIC architecture has over GPGPU solutions that will allow it to have the power efficiency we all want for highly parallel workloads, but able to run an enormous volume of code that will never run on GPGPUs (and every algorithm that can run on GPGPUs will certainly be able to run on a MIC co-processor).

In other words, Intel foresees a future without the need for NVIDIA, at least in terms of general-purpose GPU programming, just as NVIDIA foresees a future without the need for Intel.

Incidentally, Scott told me that he left Cray for NVIDIA because of his belief in the superior power efficiency of GPUs. He also described how the Titan supercomputer operated by the Oak Ridge National Laboratory in the USA will be upgraded from its current CPU-only design to incorporate thousands of NVIDIA GPUs, with the intention of achieving twice the speed of Japan’s K computer, currently the world’s fastest.

This whole debate also has implications for Microsoft and Windows. Huang says he is looking forward to Windows on ARM, which makes sense given NVIDIA’s future plans. That said, the impression I get from Microsoft is that Windows on ARM is not intended to be the same as Windows on x86 save for the change of processor. My impression is that Windows on ARM is Microsoft’s iOS, a locked-down operating system that will be safer for users and more profitable for Microsoft as app sales are channelled through its store. That is all very well, but it suggests that we will still need x86 Windows if only to retain open access to the operating system.

Another interesting question is what will happen to Microsoft Office on ARM. It may be that x86 Windows will still be required for the full features of Office.

This means we cannot assume that Windows on ARM will be an instant hit; much is uncertain.