No native code development on Windows Phone 7 says Microsoft – so what about Flash?

Windows Phone 7 is a managed code platform, we’ve been told at Mix10 in Las Vegas. Development is via Silverlight or XNA; there is no native API.

Of course there is a native API; the question is more about what code is allowed to access it. Still, in the press briefing the spokesman was clear that native code development will not be supported.

What about projects like Adobe’s Flash runtime, which both Microsoft and Adobe have said is planned – or at least, in Microsoft’s case, not blocked – although we already know that Flash will not be available in the first release?

All my spokesman would say is that nothing has been announced about that.

My suspicion is that in reality certain privileged vendors will be able to, in effect, extend the operating system with native code libraries. Adobe could be one of those; so too could a company like Rhomobile, which has a cross-compiler for a variety of mobile platforms. So I doubt that Microsoft has yet given us the full story here.

Update: The latest on this is that Microsoft’s Charlie Kindel says that Adobe will have special native access for Flash, but that no other vendor will have that privilege. This still does not make sense to me. Let’s suppose that Windows Phone 7 is a big success. What justification could Microsoft have for supporting the Flash runtime but not the Java runtime, for example? I suspect that Microsoft is chasing the Flash checkbox to one-up Apple; but if Adobe gets native access, others will no doubt follow.

Microsoft copies Apple with Windows Phone app lock-in?

I’m reading the documentation for Windows Phone development. Here’s what it says:

A set of tools will help the developer to submit and certify their applications for the Windows Phone Marketplace. Applications are submitted in a .XAP file format, which is essentially one compressed file that contains all the files that are needed by the application. Developers can track their submission status and then receive a notification once the certification is complete. After an application is certified, it can then be submitted for publishing on the Windows Phone Marketplace. Developers can set pricing and select the markets in which they wish to publish the application.

Application updates can go through the certification and publishing process again in order to fix bugs, add new functionality, or provide whole new versions.

Windows Phone Marketplace and Billing

The Windows Phone Marketplace provides the one place where developers can make their applications available for purchase by consumers. Both Mobile Operator and credit card billing are supported, making it as easy as possible for consumers to pay for the program.

Note the lock-in words: apps must be certified by Microsoft, and the Windows Phone Marketplace is the “one place” to make applications available for purchase. According to Microsoft’s Charlie Kindel, Microsoft will take 30% of all revenue, and in most cases there is also a fee for registering as a Marketplace vendor.

I understand Microsoft’s Apple-envy; but it is disappointing to find that this new platform is equally locked down, if I’m reading this correctly.

Windows Phone 7 developer story unveiled at Mix10

I’m in Las Vegas for Microsoft’s Mix10 conference, where the developer story for Windows Phone 7 Series is being unveiled. According to the press release, the tooling for Windows Phone 7 looks like this:

  • Visual Studio 2010 Express for Windows Phone (free)
  • Windows Phone 7 Series add-in for Visual Studio 2010 RC
  • XNA Game Studio 4.0
  • Emulator
  • Expression Blend for Win Phone CTP

Essentially, you are meant to use XNA for games on the device, and Silverlight for other kinds of application.

Another part of the announcement describes new services for developers – Microsoft Location Service to provide location information, and a notification service to “push information to the phone, regardless of whether or not an application is running”.

Applications will be marketed through a new Windows Phone Marketplace.

I’ll report more details as they emerge, here and on The Register.

Update: Microsoft has added that Expression Blend 4.0, for Silverlight 4.0 support, will be a free upgrade from Expression Blend 3.0.

Silverlight 4.0 RC, Expression Blend 4.0 beta and the VS 2010 add-in are available for download here: http://silverlight.net/getstarted/silverlight-4-beta/

The MSDN documentation for Windows Phone 7 is here: http://msdn.microsoft.com/en-us/library/ff402531(VS.92).aspx

The two specifications of HTML 5: WHATWG vs W3C

I’m just back from a workshop on HTML 5, led by web standards advocate and CSS expert Molly Holzschlag. It proved an illuminating session, though not quite in the way I had expected. Holzschlag, who works for Opera, was keen to convey the ideology behind HTML 5 rather than giving us a blow-by-blow tour of its features (though she did a little of that). She was also open about its problems, explaining that the spec is in flux and everything may change – “we make it up as we go along” – and talking about the politics as well as the technical aspects.

In her view, Microsoft is now fully on board with IE, and committed to implementing the W3C HTML spec as it evolves. So too are Mozilla and Opera. She is less warm in this respect towards Apple, Google and Adobe, whom she described as the “new Microsoft”, meaning, I think, that their business interests may be detrimental to their work on progressing the standard.

It is surprising to see Google mentioned in this context, since it is the company most obviously concerned with advancing the browser’s capabilities, and thus the power of its web-based platform. Ian Hickson, who is the editor of the HTML 5 specification, works for Google. Still, HTML 5 is a subject full of contradictions. One of its most curious aspects is that there are two HTML 5 specifications, one at WHATWG and one at the W3C, both edited by Hickson.

The history is that at one time the W3C, the official body in charge of the HTML specification, decided to replace HTML with a stricter XML-compliant language called XHTML. Real-world adoption was limited, and WHATWG was set up by a number of browser vendors as a renegade group to work on a new version of HTML outside the W3C. When it became clear that XHTML was not achieving its goals, and that HTML 5 was meeting real needs, the W3C changed direction, stopped working on XHTML, and adopted the WHATWG spec.

At this point you would have thought that WHATWG might have closed itself down, job done. That is not the case though; it continues to work on its own version of the spec. I asked Holzschlag why this is, given that the existence of two HTML 5 specifications seems on the face of it to be destructive:

I think it’s very destructive. It’s very problematic. The WHAT working group is into innovation, and pushing the envelope, where they can’t do that in the W3C. The reason why the W3C’s stuff is important … is because it’s about open standards. The WHAT working group has no validation or validity or standing as an organisation other than its own self-involvement. The W3C is clearly the authority for most of these things.

Holzschlag emphasized that the W3C is a cross-industry body, with every browser maker and other interested parties such as Adobe represented.

Get us all round the table, and once we’ve spilt enough blood [laughs] we get on with the work and that actually goes through a very rigorous process, which a lot of people criticise and I feel it could be streamlined as well, but the bottom line is to ensure that it stays open, and it’s open standards, whereas the WHAT working group can decide any day that they want to close that door. At the W3C that can’t happen. That’s why if you’re really going to commit to anything in HTML 5, go with the W3C specs not with WHAT WG.

It’s a political issue in part, and in part it’s an ego issue. I think that Ian and his mates are great, very bright people but they are not totally mature yet … and I think that there’s a sense of self-importance going on, to be perfectly honest … I’m a little concerned about the monoculture that HTML 5 has created. So that exists and is a known factor. Everything I’ve said is nothing that hasn’t been said before publicly.

Strong words; yet overall Holzschlag conveyed great enthusiasm for HTML 5 and its potential. She said that the mere fact of having all the leading browser vendors on board and talking to one another is of great significance.

But does HTML 5 exist? In some ways it does not; it is work in progress and not implemented consistently across browsers yet. That said, Holzschlag noted that the latest versions of the main browsers already implement significant parts of HTML 5; we will no doubt see more of it in Internet Explorer 9, for example. Even though Hickson said HTML 5 might not be done until 2022, it will be usable long before that.

QCon London 2010 report: fix your code, adopt simplicity, cool .NET things

I’m just back from QCon London, a software development conference with an agile flavour that I enjoy because it is not vendor-specific. Conferences like this are energising; they make you re-examine what you are doing and may kick you into a better place. Here’s what I noticed this year.

Robert C Martin from Object Mentor gave the opening keynote, on software craftsmanship. His point is that code should not just work; it should be good. He is delightfully opinionated. Certification, he says, provides value only to certification bodies. If you want to know whether someone has the skills you want, talk to them.

Martin also came up with a bunch of tips on how to write good code, such as not having more than two arguments to a function, and never a boolean argument. I’ve written these up elsewhere.

Next I looked into the non-relational database track and heard Geir Magnusson explain why he needed Project Voldemort, a distributed key-value storage system, to get his ecommerce site to scale. Non-relational storage, or NoSQL, is a big theme these days; database managers like CouchDB and MongoDB are getting a lot of attention. I would like to have spent more time on this track, but there was too much else on – a problem with QCon.

I therefore headed for the functional programming track, where Don Syme from Microsoft Research gave an inspiring talk on F#, Microsoft’s new functional language. He showed a series of hilarious slides comparing F# code with its equivalent in C#. Here is an example:

[slide: F# code alongside its C# equivalent]

The white panel is the F# code; the rest of the slide is C#.

Seeing a slide like this makes you wonder why we use C# at all, though of course Syme has chosen tasks like asynchronous IO and concurrent programming for which F# is well suited. Syme also observed that F# is ideal for working with immutable data, which is common in internet programming. I grabbed a copy of Programming F# for further reading.
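
The slide itself is not reproduced here, but to give a flavour of the comparison, here is a minimal sketch of my own (not Syme’s actual code) of the kind of asynchronous, parallel download code that F# expresses in a few lines, and which took a great deal more ceremony in the C# of the day. The URLs are just examples.

    open System.IO
    open System.Net
    open Microsoft.FSharp.Control.WebExtensions

    // Fetch a page without blocking a thread; the async workflow keeps the
    // code looking like the plain synchronous version.
    let fetchAsync (url: string) = async {
        let request = WebRequest.Create(url)
        use! response = request.AsyncGetResponse()
        use reader = new StreamReader(response.GetResponseStream())
        return url, reader.ReadToEnd().Length }

    // Run several downloads in parallel and wait for all the results.
    [ "http://www.microsoft.com"; "http://www.bbc.co.uk" ]
    |> List.map fetchAsync
    |> Async.Parallel
    |> Async.RunSynchronously
    |> Array.iter (fun (url, length) -> printfn "%s: %d characters" url length)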

Over on the Architecture track, Andres Kütt spoke on Five Years as a Skype Architect. His main theme: most of a software architect’s job is communication, not poring over diagrams and devising code structures. This is a consistent theme at QCon and in the Agile movement; get the communication right and all else follows. I was also interested in the technical side though. Skype started with SOAP but switched to a REST model for web services. Kütt also told us about the languages Skype uses: PHP for the web site, C or C++ for heavy lifting and peer-to-peer networking; Delphi for the Windows interface; PostgreSQL for the database.

Day two of QCon was even better. I’ve written up Martin Fowler’s talk on the ethics of software development in a separate post. Following that, I heard Canonical’s Simon Wardley speak about cloud computing. Canonical is making a big push for Ubuntu’s cloud package, available both for private deployment and hosted on Amazon’s servers; and attendees at the QCon CloudCamp later on were given a lavish, pointless cardboard box with promotional details. To be fair, Wardley did not talk much about Ubuntu’s cloud solution itself, though he did make the point that open source makes transitions between providers much cheaper.

Wardley’s most striking point, repeated perhaps too many times, is that we have no choice about whether to adopt cloud computing, since we will be too much disadvantaged if we reject it. He says it is now more a management issue than a technical one.

Dan North from ThoughtWorks gave a funny and excellent session on simplicity in architecture. He used pseudo-biblical language to describe the progress of software architecture for distributed systems, finishing with

On the seventh day God created REST

Very good; but his serious point is that the shortest, simplest route to solving a problem is often the best one, and that we constantly make the mistake of using over-generalised solutions which add a counter-productive burden of complexity.

North talked about techniques for lateral thinking – ways of finding solutions from which we are mentally blocked. One is chunking up, which means merging details into bigger ideas, ending up with “what is this thing for anyway?”; another is chunking down, the reverse process, which breaks a problem into blocks small enough to comprehend. A third is to articulate the problem to a colleague, which exercises different parts of the brain and often stimulates a solution – one reason pair programming can be effective.

A common mistake, he said, is to keep using the same old products or systems or architectures because we always have, or because the organisation is already heavily invested in them, meaning that better alternatives do not get considered. He also talked about simple tools: a whiteboard rather than a CASE tool, for example.

Much of North’s talk was a variant of YAGNI – you ain’t gonna need it – an agile principle of not implementing something until/unless you actually need it.

I’d like to put this together with something from later in the day, a talk on cool things in the .NET platform. One of these was Guerrilla SOA, though it is not really specific to .NET. To get the idea, read this blog post by Jim Webber, another member of the ThoughtWorks team (yes, there are a lot of them at QCon). Here are a couple of quotes:

Prior to our first project starting, that client had already undertaken some analysis of their future architecture (which needs scalability of 1 billion transactions per month) using a blue-chip consultancy. The conclusion from that consultancy was to deploy a bus to patch together the existing systems, and everything else would then come together. The upfront cost of the middleware was around £10 million. Not big money in the grand scheme of things, but this £10 million didn’t provide a working solution, it was just the first step in the process that would some day, perhaps, deliver value back to the business, with little empirical data to back up that assertion.

My (small) team … took the time to understand how to incrementally alter the enterprise architecture to release value early, and we proposed doing this using commodity HTTP servers at £0 cost for middleware. Importantly we backed up our architectural approach with numbers: we measured the throughput and latency characteristics of a representative spike (a piece of code used to answer a question) through our high level design, and showed that both HTTP and our chosen Web server were suitable for the volumes of traffic that the system would have to support … We performance tested the solution every single day to ensure that we would always be able to meet the SLAs imposed on us by the business. We were able to do that because we were not tightly coupled to some overarching middleware, and as a consequence we delivered our first service quickly and had great confidence in its ability to handle large loads. With middleware in the mix, we wouldn’t have been so successful at rapidly validating our service’s performance. Our performance testing would have been hampered by intricate installations, licensing, ops and admin, difficulties in starting from a clean state, to name but a few issues … The last I heard a few weeks back, the system as a whole was dealing with several hundred percent more transactions per second than before we started. But what’s particularly interesting, coming back to the cost of people versus cost of middleware argument, is this: we spent nothing on middleware. Instead we spent around £1 million on people, which compares favourably to the £10 million up front gamble originally proposed.

This strikes me as an example of the kind of approach North advocates.
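
As a purely illustrative aside – my own sketch, not anything from Webber’s project – the “commodity HTTP servers at £0 cost for middleware” idea can start from something as small as the HttpListener class that ships with the .NET Framework. The URL prefix and JSON payload here are made up:

    open System.Net
    open System.Text

    // A bare HTTP endpoint: no service bus, no middleware licence, just the
    // framework's own HttpListener answering requests on a URL prefix.
    let listener = new HttpListener()
    listener.Prefixes.Add("http://localhost:8080/orders/")
    listener.Start()

    while true do
        let context = listener.GetContext()   // blocks until a request arrives
        let body = Encoding.UTF8.GetBytes("{ \"status\": \"ok\" }")
        context.Response.ContentType <- "application/json"
        context.Response.OutputStream.Write(body, 0, body.Length)
        context.Response.Close()

Real services need rather more than this, of course; the point is only that a plain HTTP endpoint is cheap to stand up and easy to performance-test from day one.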

You may be wondering what other cool .NET things were presented. The session, called State of the Art .NET, was given by Amanda Laucher and Josh Graham. They offered a dozen items that they consider .NET folk should be using or learning about:

  1. F# (again)
  2. M – modelling/DSL language
  3. Boo – static Python for .NET
  4. NUnit – unit testing. Little regard for Microsoft’s test framework in Team System, which is seen as a wasted and inferior effort.
  5. RhinoMocks – mocking library
  6. Moq – another mocking library
  7. NHibernate – object-relational mapping
  8. Windsor – dependency injection, part of Castle project. Controversial; some attendees thought it too complex.
  9. NVelocity – .NET template engine
  10. Guerrilla SOA – see above
  11. Azure – Microsoft’s cloud platform – surprisingly good thanks to David Cutler’s involvement, we were told
  12. MEF – Managed Extensibility Framework as found in Visual Studio 2010, won high praise from those who have tried it

That was my last session (I missed Friday), though I did attend the first part of CloudCamp, an unconference for cloud early adopters. I am not sure there is much point in these now. The cloud is no longer subversive or the next new thing; all the big enterprise vendors are onto it. Look at the CloudCamp sponsor list if you doubt me. There are of course still plenty of issues to talk about, but maybe not in this format; I stayed for the first hour but it was dull.

For more on QCon you might also want to read back through my Twitter feed or search the entire #qcon tag for what everyone else thought.

Martin Fowler on the ethics of software development – QCon report

Martin Fowler of ThoughtWorks gave what seemed an important session at QCon London, exploring the ethical dimension of software development with a talk called What are we programming for? The room was small, since the organisers figured that a track on IT with a conscience would be a minority interest; but Fowler always attracts a large audience and the result was a predictable crush.

The topic itself has many facets, perhaps awkwardly commingled. Should you walk away if your customer or employer asks you to code the wrong thing – maybe enforcing poor practice, or creating algorithms to fleece your customer’s customers? What about coding for proprietary platforms owned by vendors with questionable business practices? Fowler mentioned that around 50% of ThoughtWorks solutions are built on Microsoft .NET, and that Bill Gates’ efforts to combat malaria were something that helped him make sense of that relationship.

Fowler also echoed some of Robert Martin’s theme from the day before: if you build poor software, you are in effect stealing from the future revenue of your customer.

Finally, he talked about the male/female imbalance in software development, rejecting the idea that it is the consequence of natural differences in aptitude or inclination. Rather, he said, we can’t pretend to have the best people for the job while the imbalance continues. I’d add that since communication is a major theme of Agile methodology, and females seem to have communication skills that males lack, fixing the imbalance could improve the development process in other ways.

Big questions; and Fowler had few answers. When asked for the takeaway, he said the main one is to discuss these issues more.

Penguin’s Apple love-in

An article on paidcontent gives me pause for thought. In it, Penguin Books’ CEO John Makinson talks of plans to publish content on Apple’s forthcoming iPad device.

The iPad represents the first real opportunity to create a paid distribution model that will be attractive to consumers

says Makinson.

This is all to do with the App Store; somehow we are more willing to buy stuff there than to pay for other forms of content on the Internet. Penguin’s conclusion: make books into apps:

So for the time being at least we’ll be creating a lot of our content as applications, for sale on app stores and HTML, rather than in ebooks. The definition of the book itself is up for grabs.

The .epub ebook format is not good enough, apparently; only the full flexibility of a native application will do.

Two things strike me as notable here. One is Makinson’s presumption that the iPad will be a big hit, thanks presumably to Apple’s success with the iPod and iPhone. The tablet format has been a niche market in the past, because it lacks both the convenience of a pocketable mobile and the capability of a keyboard-equipped netbook or laptop.

The second point is that here is a major publisher planning to create single-platform content that can only be sold through Apple and consumed on Apple’s devices.

Makinson does say “for sale on app stores and HTML”. I am not sure quite what he means; but clearly Penguin does not intend to use iPad apps for all its epublishing. Nevertheless, it raises the possibility of some content that is only on Apple, or best on Apple, or earliest on Apple.

If this idea takes hold, the consequence will be to disadvantage users of non-Apple devices. For example, what if you are on a course, and the recommended reading is only available as an Apple application?

I am already experiencing some of this pressure. I was at a conference earlier this week where the organisers provided an iPhone app to help attendees schedule their time:

This year QCon also has an iPhone app allowing you to browse the schedule by track, by time, favourite a track and access the #qcon twitter channel.

This is not a trend that I welcome. In some respects it is worse than having to run Windows for the sake of some particular application, since iPhone apps have to be approved by Apple, and the kind of emulators that have helped us cope with Windows-only requirements do not exist for the iPhone.

I do not have an iPhone; but I am beginning to think that it is a business requirement.

Functional programming, NoSQL themes at QCon London

One reason I enjoy the QCon London software development conference is that it reflects programming trends. Organiser Floyd Marinescu described it as by practitioners for practitioners. In previous years I’ve seen themes like disillusionment with enterprise Java, the rise of Agile methodologies, the trend towards dynamic languages, and the benefits of REST.

So what’s hot this year? A couple of trends are striking. One is functional programming. Don Syme, Principal Researcher at Microsoft Research and co-inventor of F#, gave a lively presentation on functional approaches to parallelism and concurrency. He showed screen after screen of equivalent F# and C# code, illustrating how F# is more concise and expressive, as well as being better suited to concurrent development.

F# is one of the languages included by default in Visual Studio 2010, which should be released shortly.

I asked Syme what sort of problems are not well suited to F#. In his reply he described the state of play in Visual Studio 2010, where you can easily create F# libraries but there is no designer support for user interface code, such as Windows Forms or Windows Presentation Foundation. That is merely a tooling issue though.

Syme’s point is that functional programming, and F# in particular, is ideal for today’s programming challenges, including concurrency and asynchronous code.
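
To give that claim a concrete flavour, here is a minimal sketch of my own (not Syme’s code) of an F# agent – a MailboxProcessor – which processes messages one at a time, so its state needs no explicit locking even when many threads post to it:

    // Messages the agent understands.
    type CounterMessage =
        | Increment
        | Fetch of AsyncReplyChannel<int>

    // The agent owns its count; callers can only post messages, so there is
    // no shared mutable state to protect with locks.
    let counter =
        MailboxProcessor.Start(fun inbox ->
            let rec loop count = async {
                let! msg = inbox.Receive()
                match msg with
                | Increment -> return! loop (count + 1)
                | Fetch reply ->
                    reply.Reply count
                    return! loop count }
            loop 0)

    counter.Post Increment
    counter.Post Increment
    printfn "count = %d" (counter.PostAndReply Fetch)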

If nothing else, he convinced me that every .NET programmer should at least be looking at F# and learning what it can do.

The functional programming track at QCon is not just about F#, of course, though in some ways it seems to be the functional language of the moment.

The other theme that has made a big impression is NoSQL, or what the QCon track calls “Non-relational database managers and web-oriented data”. Geir Magnusson from Gilt Groupe talked about the challenge of running a web site which has extreme peaks in traffic, and where every user needs dynamic data and transaction support, so simple caching does not work. They were unable to get their relational database to scale to handle thousands of transactions a second, and solved the problem with an in-memory non-relational database.
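
To show what the key-value model looks like next to SQL, here is a toy sketch of my own – nothing to do with Gilt’s actual system – in which the entire “query language” is get and put by key (the key and payload are invented):

    open System.Collections.Concurrent
    open System.Text

    // A toy in-memory key-value store: values are opaque blobs looked up by
    // key alone, with no schema, joins or query planner involved.
    let store = ConcurrentDictionary<string, byte[]>()

    // put
    store.["cart:42"] <- Encoding.UTF8.GetBytes("{ \"items\": [101, 205] }")

    // get
    match store.TryGetValue "cart:42" with
    | true, bytes -> printfn "%s" (Encoding.UTF8.GetString bytes)
    | _ -> printfn "not found"

Replication and distribution across machines is where products like Voldemort and CouchDB come in; the point is simply that when every access is by key, partitioning the data across many servers becomes straightforward.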

In another talk, the BBC discussed their use of CouchDB for highly scalable web sites.

The Windows Netbook experience: Toshiba NB300

I’ve just received a Toshiba NB300 Netbook, which looks like it will be useful for blogging and web access during a couple of conferences coming up shortly – up to 11 hours of battery life, great. I am interested in the user experience when starting out with a new machine, so I made a few notes.

I regard this as a critical issue. Microsoft and its OEM partners are up against Apple, a company which pays careful attention to the user experience, from box unwrapping on. Apple charges a premium of course; Windows machines are generally cheaper, and there is an unwritten deal that you put up with a certain amount of foistware and rough edges for the sake of better value overall. On the other hand, if users do not feel good about a product they are unlikely to recommend it to others; Apple has won a fanatical following partly thanks to this attention to detail.

So how was the Toshiba? Better than the Samsung/Vodafone Netbook about which I blogged last month, but still not great.

I switched on and was immediately guided through a registration wizard, being assured that this would activate my warranty. Next I was prompted to activate TEMPRO, a Toshiba service which is meant to send me alerts concerning software updates and so on. I tried to do so, but the activation wizard told me the serial number was invalid, though as far as I can tell it is correct. Next, TEMPRO sent me an alert that my warranty was not registered. You what?

Trying to imagine what a typical user might do, I clicked the Register button just in case. This started up Internet Explorer for the first time. Next, Google popped up a dialog asking me to agree to its privacy policy for the pre-installed Google toolbar. I clicked Disagree and it started uninstalling. In the meantime, IE started its welcome wizard and McAfee started badgering me that I was not fully protected. Here’s my screen a few minutes after first power-on:

[screenshot: the desktop a few minutes after first power-on]

The problem here is that a bunch of different applications want to get you to agree to some terms or set up a subscription, and they are all competing for attention. It is all very predictable, and the end result is ugly. You would think that someone could figure out how to do this in an organised manner.

I took a look at Control Panel. There was a ton of stuff installed, although Toshiba is certainly not the worst when it comes to the bundling game. Pre-installed software included the following:

  • Adobe AIR
  • Amazon.co.uk
  • eBay.co.uk
  • Java 6
  • McAfee Security Center (reboot required on uninstall)
  • Silverlight 3
  • Office Home and Student 2007 trial (reboot required on uninstall)
  • Powerpoint 2007
  • Microsoft Works 9
  • Photo Service powered by myphotobook
  • An amazing number of Toshiba utilities – I counted 24
  • Wild Tangent games
  • Windows Live Essentials

I tried the Office 2007 trial, which asked to install an ActiveX control to check whether Office 2007 was already installed. This seems a clumsy solution, and perplexing for the user. I let it install, then clicked Buy Now, which got me to a web site where I could purchase it for £86.04.

Microsoft Works 9.0 is also installed as a full version, but whereas Office 2007 has an icon on the desktop, Works is hidden away in the Start menu. It might be all you need on a Netbook, except that its default document formats are unhelpful if you need to share documents with others. Works can open Microsoft’s Office 2007 XML formats (.docx, .xlsx) to some extent, but things went a little awry after I uninstalled the Office 2007 trial. Double-clicking a .docx now raises a Save As dialog defaulting to .docm, the macro-enabled Open XML format, which is something to do with the Microsoft Open XML Converter. I can’t imagine why it is doing that. Office 2007 will be going back on shortly.

A Toshiba utility called Web Camera Application has an annoying menu which docks to the side of the screen and pops up when you move the mouse there. Since Microsoft has worked hard on the taskbar area, which is where always-on utilities normally live, I’m not clear why Toshiba thinks this is a good idea. Having said that, the similar effort at the top of the screen which handles the Fn keys (known as Flash Cards) is not so bad: mouse activation is off by default, and it shows at a glance what all these keys do. Fn-F8 disables wireless for flight mode, for example. If you want to get rid of the side menu but not the top one, open it and right-click. Uncheck Auto Run and then click Close. If you then want it back, choose Start – All Programs – Toshiba – Utilities – Web Camera Application.

Toshiba pre-installs a multi-function utility called Toshiba Bulletin Board. It includes a Message Center which raises alerts, some of which link to TEMPRO as mentioned above. This turns out to be a bit of a usability disaster too. Here’s what happens. I get a notification that there are alerts to be read. I open Toshiba Bulletin Board and click a hyperlink to open Message Center. It says TEMPRO has some alerts to read, so I click Open. Now I’m in TEMPRO which apparently was not designed with the short 1024×600 screen in mind. It has lots of stuff in a huge dialog, leaving only 1.5 lines of space for the actual message, with a tiny scroll bar next to it. I’ve encircled the message in the pic below so you can see it:

[screenshot: the TEMPRO window, with the message area circled]

This one is a new software driver. Sounds like something useful, so I click Alert Details. This takes me to a web page called Driver Details. It has a big download icon, but clicking that does nothing. The page says:

To download your chosen file, simply click on the filename below.

Curiously, the “filename” is actually a link to an HTML page.

[screenshot: the Driver Details page]

I click it. Now I’m here:

[screenshot: the download page, with the IE pop-up blocker notification]

The IE pop-up blocker is doing its stuff, and if I’m impatient I can click a link. I wait a few seconds, nothing happens, so I click the link.

Help! Now I’m at some kind of portal with four big buttons and no clue which to click:

[screenshot: a download portal with four large buttons]

I vaguely recall it was a wireless driver so using my knowledge of acronyms I click WLAN Downloads:

[screenshot: the WLAN Downloads page, listing around 25 drivers]

Lovely! Now I have a list of around 25 downloads for various operating systems. All I have to do is decide whether my adapter is Intel, Atheros or Realtek, and which version and operating system I require.

Sorry, Toshiba, this is a bad joke. You’ve installed your special utility supposedly to make it easy to keep your product up-to-date, it takes multiple clicks to get anywhere useful, and it is so hopeless that it cannot even select the right driver automatically.

By the way, there is yet another update utility called Toshiba Service Station that comes with an intimidating agreement saying it will keep your data for seven years. I tried that too when prompted; it said No software updates available. How many update utilities does a little netbook need?

While I’m beating up this machine, let me mention the partitioning. The hard drive is only 250GB, but it is divided into three partitions: a small hidden partition for some clever recovery stuff, then two equally sized partitions, one called Windows and the other Data. There’s a case for having a separate partition for the operating system, though I don’t much like it on a Windows client machine because getting the sizes right is a challenge. However, Toshiba hasn’t really done what the names imply. Everything is on the Windows partition, including the data. In other words, the user’s home directory and documents are on the operating system partition. The only thing on drive D is an irritating directory called HDDRecovery, which includes a readme pleading with you not to delete it.

Drive D may be handy though – I expect I’ll be trying MeeGo on here soon.

Lessons not learned

I like Toshiba machines, I know Windows backwards, and likely this machine will do a great job for me. Nevertheless, I can see that it has all sorts of usability issues, and that these are mostly not Microsoft’s fault but put there by the OEM vendor.

It beats me why there isn’t some kind of usability trial where the prototype is put before a user, who is asked to turn the machine on and, as they say, follow the on-screen directions. The issues are not hard to spot. Toshiba is not a small company; it has the skills and resources to make a machine that offers a pleasing user experience.

It also beats me why resources are devoted to half-baked software like Toshiba Bulletin Board and TEMPRO, which are counter-productive, instead of aiming to integrate seamlessly with the good usability work Microsoft has done in Windows 7.

Why programmers should study Microsoft’s random failure and not trust Google search

The bizarre story of the EU-mandated Windows browser choice screen took an unexpected twist recently when it was noticed that the order of the browsers was not truly random.

IBM’s Rob Weir was not the first to spot the problem, but he did a great job of writing it up, both when it was initially observed and after Microsoft fixed it.

It was an algorithm error, a piece of code that did not return the results the programmer intended.

Unless Microsoft chooses to tell us, there is no way to know how the error happened. However, as Weir and others observe, it may be significant that a Google search for something like JavaScript random sort immediately gets you sample code that has the same error. Further, the error is not immediately obvious, making it particularly dangerous.
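
The widely copied snippet shuffles an array by passing a random comparison function to the built-in sort; sorting algorithms assume a consistent comparator, so the result is biased rather than uniform. The same mistake is easy to reproduce on any platform. Here is a minimal F# sketch of my own showing the anti-pattern alongside a Fisher-Yates shuffle, which is uniform by construction:

    let rng = System.Random()

    // The anti-pattern: "shuffling" by sorting with a random comparator.
    // The sort assumes consistent answers, so some orderings come up far
    // more often than others.
    let badShuffle items =
        items |> List.sortWith (fun _ _ -> rng.Next(-1, 2))

    // The fix: a Fisher-Yates shuffle, which produces every permutation
    // with equal probability.
    let goodShuffle items =
        let arr = List.toArray items
        for i = arr.Length - 1 downto 1 do
            let j = rng.Next(i + 1)
            let tmp = arr.[i]
            arr.[i] <- arr.[j]
            arr.[j] <- tmp
        Array.toList arr

Counting the output of each version over a few thousand runs makes the bias obvious.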

I am sure I am not the only person to turn to Google when confronted with a programming task that requires some research. In general, it is a great resource, and Google’s own algorithms help a little with filtering, so that sites with a better reputation or more inbound links come higher up.

Still, what this case illustrates – though accepting again that we do not know how the error occurred in this instance – is that pasting code from a Google search into your project without fully understanding and testing it does not always work. Subtle bugs like this one, which may go unnoticed for a long time, can have severe consequences. Randomisation is used in security code, for example.

As an aside, there also seems to be some randomness in the appearance of the browser choice screen. It turned up on my laptop, but not on my desktop, although both have IE as the default.

And who would have guessed that the EU would arrange for so many of us to get an ad for something like the GreenBrowser popping up on our desktop? Apparently it is the “best choice of flexible and powerful green web browser”, though since it is based on IE it is less radical a choice than it first seems.
