Migrating from physical to virtual with Hyper-V and disk2vhd

I have a PC on which I did most of my work for several years. It runs Windows XP, and although I copied any critical data off it long ago, I still wheel it out from time to time because it has Visual Studio 6 and Delphi 7 projects with various add-ins installed, and it is easier to use the existing PC than to replicate the environment in a virtual machine.

These old machines are a nuisance though, so I thought I’d try migrating this one to a virtual machine. There are numerous options for this, but I picked Microsoft Hyper-V because I already run several test servers on that platform with success. Having a VM on a server, rather than on the desktop with Virtual PC, VirtualBox or similar, means it is always easily available and can be backed up centrally.

The operation began smoothly. I installed the free Sysinternals utility Disk2vhd, which uses the Volume Shadow Copy service so that it can create a VHD (virtual hard disk) from the very system on which it is running. Next, I moved the VHD to the Hyper-V server and created a new virtual machine set to boot from that drive.
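
Disk2vhd can also be run from the command line if you want to script the capture. If I remember the usage correctly, it is just the drives to include followed by the path for the output VHD, something like this (the exact syntax may vary between versions):

disk2vhd c: d:\vhd\oldpc.vhd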

Windows XP started up first time without any blue screen problems, though it did ask to be reactivated.

I could not activate yet though, because XP could not find a driver for the network card. The solution was to install the Hyper-V integration services, and here things started to go wrong. The integration services setup put up a dialog asking to upgrade the HAL (Hardware Abstraction Layer), a key system DLL.

However, on restart I got the very same dialog.

Fortunately I was not the first to have this problem. I was prepared for some hassle and had my Windows XP SP3 CD ready, so I copied and expanded halaacpi.dll from the CD to my system32 folder and amended boot.ini as suggested:

multi(0)disk(0)rdisk(0)partition(2)\WINDOWS="Disk2vhd Microsoft Windows XP Professional" /FASTDETECT /NOEXECUTE=OPTIN /HAL=halaacpi.dll

I rebooted and this time the integration services installed OK. However, if you do this then I suggest you delete the /HAL=halaacpi.dll argument before rebooting again, as with it in place Windows would not start for me. In fact, you can delete the special Disk2vhd entry in boot.ini completely; it is no longer needed.
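
For reference, with the /HAL switch removed, the Disk2vhd entry in boot.ini ends up looking something like this (partition numbers will vary from system to system):

multi(0)disk(0)rdisk(0)partition(2)\WINDOWS="Disk2vhd Microsoft Windows XP Professional" /FASTDETECT /NOEXECUTE=OPTIN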

After that everything was fine – integration worked, the network came to life and I activated Windows – but performance was poor. To be fair, it was not that good on the original hardware either. Still, I am working on it. I’ve given the virtual machine 1.5GB of RAM and two processors. Removing software made obsolete by the migration, such as the SoundMAX and NVIDIA drivers, seemed to help quite a bit. It is usable, and will improve as I fine-tune the setup.

Overall, the process was easier than I expected and getting at my old developer setup is now much more convenient.

Adobe abandons Project ROME, focuses on apps rather than cloud

Adobe is ceasing investment in Project ROME, a labs project offering a rich design and desktop publishing application built as an Adobe AIR application, running either in the browser or on the desktop with the Flash player as its runtime.

According to the announcement:

Project ROME by Adobe was intended to explore the opportunity and usability of creative tools as software-as-a-service in the education market and beyond. We have received valuable input from the community after a public preview of the software. Following serious evaluation and consideration of customer input and in weighing this product initiative against other projects currently in development, we have made the difficult decision to stop development on Project ROME. Given our priorities, we’re focusing resources on delivering tablet applications, which we believe will have significant impact on creative workflows.

There must be some broken hearts at Adobe, because ROME is a beautiful and capable application that serves, if nothing else, as a demonstration of how capable a Rich Internet Application can be. In fact, I have used it for that purpose: when asked whether a web application could ever deliver a user interface that comes close to the best desktop applications, I showed Project ROME to great effect.

I first saw Project ROME as a “sneak peek” at the Adobe MAX conference in 2009. It had made it past the initial prototype stage and was being worked up as a full release, with a free version for education and a commercial version for the rest of us. Curiously, Adobe says the commercial version will remain available as an unsupported freebie, but the educational offering is being pulled: “we do not want to see pre-release software used in the classroom”.

Why abandon it now? I think we have Apple’s Steve Jobs to thank. AIR applications do not run on the iPad; and when Adobe says it is focusing instead on tablet applications, the iPad will figure prominently in those plans. Still, there are a few other factors:

  • One thing that was not convincing in the briefing I received about Project ROME was the business model. It was going to be subscription-based, but how many in this non-professional target market would subscribe to online desktop publishing, when there are well-established alternatives like Microsoft Publisher?
  • Adobe makes most of its money from selling desktop software, in the Creative Suite package. ROME was always going to be a toy relative to the desktop offerings.
  • The output from ROME is primarily PDF. If ROME had been able to build web pages rather than PDF documents, perhaps it would have made better sense as a cloud application.
  • Adobe did not market the pre-release effectively. I do not recall hearing about it at MAX in October, which surprised me – it may have been mentioned somewhere, but it was not covered in the keynotes despite being a great example of a RIA.
  • The ROME forum shows only modest activity, suggesting that Project ROME had failed to attract the attention Adobe may have hoped for.

It is still worth taking a look at Project ROME; and I guess that some of the ideas may resurface in apps for iPad, Android and other tablets. It will be interesting to see to what extent Adobe itself uses Flash and AIR for the commercial design apps it delivers.

Final reflection: this decision is a tangible example of the ascendancy of mobile apps over web applications – though note that Adobe still has a bunch of web applications at Acrobat.com, including the online word processor once called Buzzword and a spreadsheet application called Tables.

As Microsoft releases new tools for Windows Phone, developers ask: how is it selling?

Microsoft has released Visual Basic for Windows Phone Developer Tools – not a lot to report, I guess, except that what you could already do in C# you can now also do in Visual Basic.

Still, when someone at Microsoft asked me what I thought of the Windows Phone 7 developer platform, I replied that the tools look good for the most part – though I would like to see a native code option, and it seems unfortunate that mobile operators can install native code apps while the rest of us officially cannot – but that the bigger question is the size of the market.

We all know that a strong and large community of developers is critical to the success of a platform – but as I’ve argued before, developers will go where their customers are, rather than selecting a platform based on the available tools and libraries. It is a bit of both of course: the platform has to be capable of running the application, and ease of development is also a factor, but in the end nothing attracts developers more than a healthy market.

Therefore the critical question for developers is how well Windows Phone 7 is selling.

Nobody quite knows, though Tom Warren makes the case for not much more than 126,000, that being the number of users of the Windows Phone Facebook application.

I’m not quite convinced when Warren says:

It’s likely that most users will connect their Facebook account so the statistics could indicate nearly accurate sales figures.

Not everyone loves Facebook; and when I was trying out Windows Phone 7 I found myself reluctant to have it permanently logged in. Even so, I’d agree that well over 50% of users will enable Facebook integration, so it is a useful statistic.

Although that suggests a relatively small number in the context of overall smartphone sales, my perception is that lack of availability is part of the reason, so it is too early to judge the platform’s success. I do not see many Windows Phone 7 devices in the mobile phone shops that I pass in the UK; in fact it is unusual to see the phone at all. I am not sure whether this is mainly because of supply shortages, or because Microsoft and its partners found it difficult to build expectations in the trade that this would be a sought-after device, or both.

Some bits of anecdotal evidence are encouraging for Microsoft. Early adopters seem to like it well enough. Nevertheless, it is a minority player at the moment and that will not change soon.

Developers are therefore faced with a small niche market. Microsoft has done a fair job with the tools; now it needs to get more devices out there, to convince developers that once they have built their applications, there are enough customers to make it worthwhile.

Could Kinect trigger the Xbox 360 RROD (Red Ring of Death)?

On November 10th, launch day in the UK, I received and installed Microsoft’s Kinect motion controller. I wrote up my first impressions here. My Xbox was an Elite, bought to replace a launch 360 that had succumbed to the red ring of death – the means by which the console communicates hardware failure – then been repaired, and then failed again.

I left Kinect attached, though I admit it has not been much used. Two and a half weeks later, it was the turn of the Elite to display three red lights – at just over three years old, so beyond Microsoft’s extended RROD warranty.

It is probably coincidence, though some are theorising that the Kinect, or a system update associated with it, has tipped a proportion of Xboxes into failure:

I have a theory that MS was (and still is) having latency/response issues with the Kinect hardware and used one of these updates to speed something up, possibly XB360 memory speeds or access times, and some of the older 90nm hardware just can’t take it. There are LOTS of people who owned older systems that melted down immediately after the update/Kinect hookup – far to many to be a coincidence and even MS support admitted to me that repair call volumes were extremely high.

It still seems a stretch to me. There are a lot of Xboxes out there, and in the normal course of events some of them will happen to fail at the same moment as, or soon after, installing Kinect. The Kinect also has its own power supply when connected to consoles older than the 360 Slim, which appeared this summer, so it should not be placing any extra load on the older machines.

Nevertheless, I was stuck with a broken Xbox. Fix or replace? The problem is, the 360 is not a reliable design – maybe the new Slim model is better, but while the Elite is an improvement on the original, it is still, I believe, less reliable than most modern electronics. Although I could get the Elite fixed, I doubt I would get another three years of service from it. In any case, the eject button has also become unreliable, and sometimes the DVD tray has to be pushed up with some force to persuade it to work.

Instead, I went out and got a 360 Slim, which has a bigger hard drive and integrated wi-fi, runs more quietly, and needs no supplementary power supply for Kinect.

I whiled away Sunday afternoon transferring games and data from the old hard drive. I still had the hard drive transfer kit which I had used for the Elite, and it worked fine for the Slim, although it took several hours.

There is another complication when you replace your Xbox. The transfer kit moves any games you have purchased from the Live Marketplace, but not the DRM (Digital Rights Management) which protects them. In consequence, they revert to trial versions unless you are signed in with the account under which they were purchased.

The fix is to transfer the content licenses, a process which involves signing into Xbox Live on the web as well as on the new console. It is a two-stage process. First, the new console is authorized as valid for those content licenses. Second, the actual licenses have to be transferred. You are meant to be able to do this second stage from the web, but this did not work for me. I found I had to repeat the download from the Live Marketplace on the 360 itself. When I chose Download Again, the download completed almost instantly, implying that it merely verified what was already downloaded; but in addition it did some DRM magic which enabled the full games for all users of the console.

So … I got less than two years out of the original Xbox 360 (December 2005), and a little over three years from the Elite. Here’s hoping that the third attempt lasts longer.

HTML 5 Canvas: the only plugin you need?

The answer is no, of course. And Canvas is not a plugin. That said, here is an interesting proof of concept blog and video from Alexander Larsson: a GTK3 application running in Firefox without any plugin.

GTK is an open source cross-platform GUI framework written in C but with bindings to other languages including Python and C#.

So how does C native code run in the browser without a plugin? The answer is that the HTML 5 Canvas element, already widely implemented and coming to Internet Explorer in version 9, has a rich drawing API that goes right down to pixel manipulation if you need it. In Larsson’s example, the native code is actually running on a remote server. His code receives the latest image of the application from the server and transmits mouse and keyboard operations back, creating the illusion that the application is running in the browser. The client only needs to know what has changed in the image, so although sending screen images sounds heavyweight, it is amenable to optimisation and compression.

It is the same concept as Windows remote desktop and terminal services, or remote access using VNC, but translated to a browser application that requires no additional client or setup.

There are downsides to this approach. First, it puts a heavy burden on the server, which is executing the application code as well as supplying the images, especially when there are many simultaneous users. Second, there are tricky issues when the user expects the application to interact with the local machine, such as playing sounds, copying to the clipboard or printing: everything the user sees is an image rather than character-by-character text, for example, so copying text out is not straightforward. Third, it is not well suited to graphics that change rapidly, as in a game with fast-paced action.

On the other hand, it solves an immense problem: getting your application running on platforms which do not support the runtime you are using. Native applications, Flash and Silverlight on Apple’s iPad and iPhone, for example. I recall seeing a proof of concept for Flash at an Adobe MAX conference (not the most recent one) as part of the company’s research on how to break into Apple’s walled garden.

It is not as good as a true local application in most cases, but it is better than nothing.

Now, if Microsoft were to do something like this for Silverlight, enabling users to run Silverlight apps on their Apple and Linux devices, I suspect attitudes to the viability of Silverlight in the browser would change considerably.

Microsoft removes Drive Extender from new Windows Home Server, users rebel

Microsoft’s Windows Home Server has a popular feature called Drive Extender [Word docx] which lets you increase storage space simply by adding an internal or external drive – no fussing with drive letters. In addition, Drive Extender has some resilience against drive failure, duplicating files stored in shared folders when more than one drive is available.

Recognising the usefulness of this feature for business users as well as in the home, Microsoft prepared a significantly upgraded Drive Extender for the next version of Windows Home Server, code-named Vail, and for new “Essentials” editions of Small Business Server (SBS) and Storage Server. Anandtech has an explanation of the changes, which were necessary to support business features such as the Encrypting File System.

The new version is more complex though, and it seems Microsoft could not get it working reliably. Rather than delay the new products, Microsoft decided to drop the feature, as announced by product manager Michael Leworthy. Note the rating on the announcement.

Part of the problem is that rather than discuss difficulties in the implementation, Leworthy presented the decision as something to do with the availability of larger drives:

We are also seeing further expansion of hard drive sizes at a fast rate, where 2Tb drives and more are becoming easy accessible to small businesses.  Since customers looking to buy Windows Home Server solutons from OEM’s will now have the ability to include larger drives, this will reduce the need for Drive Extender functionality.

He added that “OEM partners” will implement “storage management and protection solutions”.

Unfortunately this was a key feature of Windows Home Server. The announcement drew comments like this:

My great interest in Vail has just evaporated.  Drive Extender is the great feature of Home Server, and what my personal data storage is based around.  I have loved owning my WHS but unfortunately without DE I will be looking for other products now.

A thread (requires login to WHS beta) on the beta feedback site Microsoft Connect attracted thousands of votes in a couple of days.

One of the concerns is that while Drive Extender 2 may be needed for the business servers, version 1 is fine for home users. Therefore it seems that the attempt to bring the technology to business servers has killed it for both.

The SBS community is less concerned about the issue than home users. For example, Eriq Neale says:

While I can see how the Home Server folks are going to lament the loss of DE from their product, as cool as it is, removing that technology removes a LOT of roadblocks I was expecting for Aurora and Breckenridge, and that’s good news for my business.

though Wayne Small says:

I know that a few of my fellow MVPs were told of this recently and sworn to secrecy under our NDA, and we honestly were dumbstruck as to the fact it had been cancelled.  I can only assume that the powers that be at Microsoft know what they are truly doing by removing this feature.  On the flip side however, it means that any server backup or antivirus product that worked with Windows Server 2008 R2 will now most certainly work with SBS 2011 Essentials without modification!  See – there is a silver lining there somewhere.

What should Microsoft do? I guess it depends on how badly broken Drive Extender 2 is. Perhaps one option would be to keep Drive Extender 1 in Vail, but leave it out of the business servers. Another idea would be to delay the products while Drive Extender 2 is fixed, presuming it can be done in months rather than years.

Or will Microsoft ignore the feedback and ship without Drive Extender at all? Microsoft may be right, in that shipping a server with broken storage management would be a disaster, no matter how much users like the feature.

How will online services impact Microsoft’s partner business?

2010 is the year Microsoft got serious about cloud services. Windows Azure opened for real business in November 2009 – OK, just before 2010 – and CEO Steve Ballmer took to telling the world how Microsoft is “all in” for cloud computing whenever he got up to speak. Office and SharePoint 2010 launched in May 2010, complete with the ability to create and edit Office documents from a web browser. Microsoft also announced Office 365, essentially an upgrade of its existing BPOS suite, offering hosted Exchange, SharePoint and Lync (formerly Office Communicator). And it announced Small Business Server 2011, including an Essentials edition, formerly codenamed “Aurora”, which is little more than Windows Home Server plus Active Directory and points small businesses towards cloud services for email and document collaboration.

I’d guess that Microsoft’s cloud conversion is driven in part by the progress Google, Salesforce.com and others have made in persuading businesses that, in many cases, hosted internet services make more sense than maintaining your own servers and server applications.

But what is the impact on Microsoft partners, who have been kept busy supplying and configuring servers, implementing backup, keeping systems running, and then upgrading them as they become obsolete? On the face of it they have less to do in a hosted world, and although Microsoft offers commission on the sale of online subscriptions, that might not compensate for lost business.

Then again, cloud services offer new opportunities, still need configuring, and look likely to be a source of new business for partners particularly at a time when the majority of businesses have not yet made the transition.

I’m researching a further piece on the subject and would love to hear honest views from partners such as resellers and solution providers about how Microsoft’s online services are affecting partner business now and in the future. Or maybe you think this cloud thing is overdone and it will be business as usual for a while yet. You can contact me by email – tim(at)itwriting.com – or of course comment below.

The Microsoft Azure VM role and why you might not want to use it

I’ve spent the morning talking to Microsoft’s Steve Plank – whose blog you should follow if you have an interest in Azure – about Azure roles and virtual machines, among other things.

Windows Azure applications are deployed to one of three roles, where each role is in fact a Windows Server virtual machine instance. The three roles are the web role for IIS (Internet Information Server) applications, the worker role for general applications, and newly announced at the recent PDC, the VM role, which you can configure any way you like. The normal route to deploying a VM role is to build a VM on your local system and upload it, though in future you will be able to configure and deploy a VM role entirely online.

It’s obvious that the VM role is the most flexible. You will even be able to use 64-bit Windows Server 2003 if necessary. However, there is a critical distinction between the VM role and the other two. With the web and worker roles, Microsoft will patch and update the operating system for you, but with the VM role it is up to you.

That does not sound too bad, but it gets worse. To understand why, you need to think in terms of a golden image for each role, that is stored somewhere safe in Azure and gets deployed to your instance as required.

In the case of the web and worker roles, that golden image is constantly updated as the system gets patched. In addition, Microsoft takes responsibility for backing up the system state of your instance and restoring it if necessary.

In the case of the VM role, the golden image is formed by your upload and only changes if you update it.

The reason this is important is that Azure might at any time replace your running VM (whichever role it is running) with the golden image. For example, if the VM crashes, or the machine hosting it suffers a power failure, then it will be restarted from the golden image.

Now imagine that Windows Server needs an emergency patch because of a newly-discovered security issue. If you use the web or worker role, Microsoft takes responsibility for applying it. If you use the VM role, you have to make sure it is applied not only to the running VM, but also to the golden image. Otherwise, you might apply the patch, and then Azure might replace the VM with the unpatched golden image.

Therefore, to maintain a VM role properly you need to keep a local copy patched and refresh the uploaded golden image with your local copy, as well as updating the running instance. Apparently there is a differential upload, to reduce the upload time.

The same logic applies to any other changes you make to the VM. It is actually more complex than managing VMs in other scenarios, such as the Linux VM on which this blog is hosted.

Another point which all Azure developers must understand is that you cannot safely store data on your Azure instance, whichever role it is running. Microsoft does not guarantee the safety of this data, and it might get zapped if, for example, the VM crashes and gets reverted to the golden image. You must store data in an Azure database or in Azure blob storage instead.
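
As a rough illustration, here is the sort of thing I mean, using the 1.x StorageClient library that ships with the Azure SDK to write data to blob storage rather than to the instance’s local disk. The container and blob names are invented for the example, it uses the local development storage account for simplicity, and the exact API may differ between SDK versions:

    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    class BlobStorageSketch
    {
        static void Main()
        {
            // Use the local development storage emulator for this sketch; a real
            // role would read its storage connection string from configuration.
            CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
            CloudBlobClient client = account.CreateCloudBlobClient();

            // Create the container if it does not already exist.
            CloudBlobContainer container = client.GetContainerReference("appdata");
            container.CreateIfNotExist();

            // Write the data to a blob rather than the instance's local disk,
            // so that it survives the instance being reverted to its base image.
            CloudBlob blob = container.GetBlobReference("settings.txt");
            blob.UploadText("data that must survive a re-image");
        }
    }

Reading the data back is the mirror image, using DownloadText; the point is simply that the data lives in Azure storage rather than on the VM.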

This also limits the extent to which you can customize the web and worker VMs. Microsoft will be allowing full administrative access to the VMs if you require it, but it is no good making extensive changes to an individual instance, since it could get reverted back to the golden image. The guidance is that if manual changes take more than five minutes to do, you are better off using the VM role.

A further implication is that you cannot realistically use an Azure VM role to run Active Directory, since Active Directory does not take kindly to being reverted to an earlier state. Plank says that third parties may come up with solutions that involve persisting Active Directory data to Azure storage.

Although I’ve talked about golden images above, I’m not sure exactly how Azure implements them. However, if I have understood Plank correctly, it is conceptually accurate.

The bottom line is that the best scenario is to live with a standard Azure web or worker role, as configured by you and by Azure when you created it. The VM role is a compromise that carries a significant additional administrative burden.

25 years of Windows: triumph and tragedy

I wrote a (very) short history of Windows for the Register, focusing on the launch of Windows 1.0 25 years ago.

I used Oracle VirtualBox to run Windows 1.0 under emulation, since it more or less works. I found an old floppy with DOS 3.3, because Windows 1.0 does not run on DOS 6.2, the only version offered by MSDN. In the course of my experimentation I discovered that Virtual PC still supports floppy drives but no longer surfaces this in the UI; you have to use a script. Program Manager Ben Armstrong says:

Most users of Windows Virtual PC do not need to use floppy disks with their virtual machines, as general usage of floppy disks has become rarer and rarer.

An odd remark in the context of an application designed for legacy software.

What of Windows itself? Its huge success is a matter of record, but it is hard to review its history without thinking how much better it could have been. Even in version 1.0 you can see the intermingling of applications, data and system files that proved so costly later on. It is also depressing to see how mistakes in the DOS/Windows era went on to infect the NT range.

Another observation. It took Microsoft 8 years to release a replacement for DOS/Windows – Windows NT in 1993 – and another 8 years to bring Windows NT to the mainstream on desktop and server with Windows XP in 2001. It is now 9 years later; will there ever be another ground-up rewrite, or do we just get gradual improvements/bloat from now on?

I don’t count 64-bit Windows as a ground-up rewrite since it is really a port of the 32-bit version.

Finally, lest I be accused of being overly negative, it is also amazing to look at Windows 1.0, implemented in fewer than 100 files in a single directory, and Windows 7/Server 2008 R2, a platform on which you can run your entire business.

What you are saying about the Java crisis

A week or so ago I posted about the Java crisis and what it means for developers. The post attracted attention both here and later on The Guardian web site, where it appeared as a technology blog. It was also picked up by Reddit, prompting a discussion with over 500 posts.

So what are you saying? User LepoldVonRanke takes a pragmatic view:

I’d much rather have Java given a purpose and streamlined from a central authoritative body with a vision, than a community-run egg-laying, wool-growing, milk-giving super cow pig-sheep, that runs into ten directions at the same time, and therefore does not go anywhere. The Java ship needs a captain. Sun never got a good shot at it. There was always someone trying to wrestle control over Java away. With the Oracle bully as Uberfather, maybe Java has a place to go.

which echoes my suggestion that Java might technically be better off under more dictatorial control, unpalatable though that may be. User 9ren is sceptical:

Theoretically, the article is quite right that Java could advance faster under Oracle. It would be more proprietary, and of course more focussed on the kinds of business applications that bring in revenue for Oracle. It would be in Oracle’s interest; and the profit motive might even be a better spur than Sun had.

But – in practice – can they actual execute the engineering challenges?

Although Oracle has acquired many great software engineers (eg. from Sun, BEA Systems, many others), do they retain them? Does their organizational structure support them? And is Oracle known for attracting top engineering talent in general?

In its formation, Oracle had great software engineers (theirs was the very first commercial relational database, a feat many thought impossible). But that was 40 years ago, and now it’s a (very successful) sales-driven company.

There’s an important point from djhworld:

Java is hugely popular in the enterprise world, companies have invested millions and millions of pounds in the Java ecosystem and I don’t see that changing. Many companies still run Java 1.4.2 as their platform because it’s stable enough for them and would cost too much to upgrade.

The real business world goes at its own pace, whereas tech commentators tend to focus on the latest news and try to guess the future. It is a dangerous disconnect. Take no notice of us. Carry on coding.

On Reddit, some users focused on my assertion that the C# language was more advanced than Java. Is it? jeffcox111 comments:

I write in C# and Java professionally and I have to say I prefer C# hands down. Generics are very old news now in .Net. Take a look at type inference, lambdas, anonymous types, and most of all take a look at LINQ. These are all concepts that have been around for 3 years now in .Net and I hate living without them in Java. With .Net 5 on the horizon we are looking forward to better asynchronous calling/waiting and a bunch of other coolness. Java was good, but .Net is better these days.

and I liked this remark on LINQ:

I remember my first experience with LINQ after using C# for my final-year project (a visual web search engine). I asked a C# developer for some help on building a certain data structure and the guy sent me a pseudocode-looking stuff. I thanked him for the help and said that I’d look to find a way to code it and he said "WTF, I just gave you the code".

From there on I’ve never looked back.
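
For anyone who has not used LINQ, here is a small, invented snippet showing the kind of thing the commenters mean – type inference, lambdas, anonymous types and a query over an in-memory collection in a handful of lines (the data is made up for the example):

    using System;
    using System.Linq;

    class LinqSketch
    {
        static void Main()
        {
            // Some invented sample data, held in anonymous types.
            var orders = new[]
            {
                new { Customer = "Adams", Total = 120.0 },
                new { Customer = "Baker", Total = 45.5 },
                new { Customer = "Adams", Total = 80.0 }
            };

            // Group the orders by customer and sum the totals; type inference,
            // lambdas and anonymous types do most of the work.
            var summary = orders
                .GroupBy(o => o.Customer)
                .Select(g => new { Customer = g.Key, Spent = g.Sum(o => o.Total) })
                .OrderByDescending(s => s.Spent);

            foreach (var s in summary)
                Console.WriteLine("{0} spent {1}", s.Customer, s.Spent);
        }
    }

The Java equivalent, at least as the language stood in 2010, is a good deal wordier, which is the point the commenters are making.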

Another discussion point is write once – run anywhere. Has it ever been real? Does it matter?

The company I work for has a large Java "shrinkwrap" app. It runs ok on Windows. It runs like shit on Mac, and it doesn’t run at all on Linux.

write once, run anywhere has always been a utopian pipe dream. And the consequence of this is that we now have yet another layer of crap that separates applications from the hardware.

says tonymt, though annannsi counters:

I’ve worked on a bunch of Java projects running on multiple unix based systems, windows and mac. GUI issues can be a pain to get correct, but its been fine in general. Non-GUI apps are basically there (its rare but I’ve hit bugs in the JVM specific to a particular platform)

Follow the links if you fancy more – I’ll leave the last word to A_Monkey:

I have a Java crisis every time I open eclipse.