All posts by onlyconnect

David Bowie Station to Station gets the super deluxe treatment

In the autumn of 1975, David Bowie was immersed in the alien character of Thomas Newton in Nicolas Roeg’s film The Man Who Fell to Earth. He was also addicted to cocaine, suffering delusions, and, by the accounts of those close to him at the time, seemingly near to breakdown. It’s all a bit hard to take in, considering that during this period of his life he produced what I consider his best work, the album Station to Station – though his flirtation with fascism makes me uncomfortable.

The music is magnificent though; powerful, unsettling, emotional. Stylistically it is an amalgam of the funk of Young Americans and the rock which preceded it; though saying that does no justice to the fact that Bowie had moved on from both.

The title itself is a pun – the track opens with white noise and chuffing train noises, a radio tuning, a train travelling. Bowie is mentally travelling too, too fast for safety. Earl Slick’s guitar is frenetic and urgent. The album is cold in feel, perfectly suited to the stark mostly black and white cover, but humanised by the two softer ballads which conclude each side on the original vinyl release: Wild is the Wind and Word on a Wing.

image

Now Station to Station is getting the super deluxe treatment. In September EMI will release a lavish special edition box which includes 5 CDs, a DVD, three vinyl records, and a pile of memorabilia. How can you get that lot from one album? Here’s how:

CD 1: 2010 transfer of Station To Station from the original stereo analogue master
CD 2: Station To Station 1985 CD master
CD 3: Station To Station single edits five track EP containing Golden Years, TVC15, Stay, Word On A Wing and Station To Station
CDs 4 & 5: Live Nassau Coliseum ’76
DVD containing the following…
Station To Station (original analogue master, 96kHz/24bit LPCM stereo)
Station To Station (new Harry Maslin 5.1 surround sound mix in DTS 96/24 and Dolby Digital)
Station To Station (new Harry Maslin stereo mix, 48kHz/24bit LPCM stereo)
12″ heavyweight vinyl of Station To Station from the original stereo analogue master in replica sleeve
2 x 12″ heavyweight vinyl of Live Nassau Coliseum ’76 in gatefold sleeve
24-page booklet with sleevenotes by Cameron Crowe and chronology by Kevin Cann and also including…
– Previously unpublished Steve Schapiro photo
– Geoff MacCormack photos
– Andrew Kent live Nassau photos
Replica David Bowie On Stage 1976 press kit folder containing the following…
– Replica Nassau ticket from night of the show
– Replica backstage pass
– Replica A4 biog
– Replica band line-up
– 3 x 10×8″ press shots
Replica 1976 Fan Club Folder containing the following…
– Replica fan club membership card
– Fan club certificate
– 2 small collector cards
– 2 A4 photo cards
– Replica 4-page biography
– 2 badges
– 6-panel folded Steve Schapiro photo poster of Bowie kneeling

Some of this deserves a little explanation. Why is the “1985 CD master” included? This was the first CD release, on RCA, and it is sought after by collectors. The reason for the popularity of these early CDs is that in general they sound closer to the original vinyl records. Bowie’s back catalogue has been remastered many times, but all the later CD versions sound quite different, from the over-bright Ryko issues to the noise-reduced later efforts. I guess someone noticed that some fans still seek out the RCA CDs and decided to include it here.

The concert from the Nassau Coliseum was famously bootlegged as The Thin White Duke, though it is to be hoped that the sound quality here will be superior. It is a great concert, and better than any of the other official live material in my opinion.

Very nice; but I find myself rather irritated by this release. Although there will also be a CD release with the remastered Station to Station and the Nassau Coliseum concert, much of the material is unique to the big box – in particular, the high-resolution stereo, the new surround-sound mix, and the new stereo remix. Fans who want to hear these also have to purchase the rest of the box, even though they might not have a record player for the three vinyl records, for example. It’s annoying if, like me, you are mainly interested in the music.

Another disappointment is the absence of any true rarities. Many of us would like to hear the unused soundtrack Bowie created for the Man Who Fell to Earth, for example.

Nevertheless, there’s a lot here to look forward to – if you can live with feeling somewhat exploited as you open your wallet for this super-deluxe, super-expensive box containing material some of which you have most likely bought at least once before.

Station to Station and Live Nassau Coliseum on Amazon

How infectious is the GPL? Battle of words between WordPress and Thesis

Matt Mullenweg, the creator of WordPress, is engaged in a battle of words with the maker of one of its premium themes, Chris Pearson, who runs DIYthemes and offers the Thesis theme on a paid-for basis. I listened to their discussion on Mixergy; it is ill-tempered particularly on Pearson’s side.

The issue boils down to this. WordPress is licensed under the GPL, which provides that if you derive a new work from an existing GPL-licensed work, the GPL applies to your new work as well.

Pearson argues, I think, that his work is not so tightly linked to WordPress that the GPL applies. “Thesis does not inherit anything from WordPress” he says.

Mullenweg says that the way themes interact with WordPress is such that all themes must be GPL. “If you build something on top of it, it should be GPL” he says.

Pearson is refusing to license his theme under the GPL. What is to be done – would Mullenweg go to court to protect the GPL?

“You want us to sue you? That would break my heart.” he says. Then later, “I really hope it doesn’t come to that.” Then, “If people decide the GPL doesn’t apply, it’s a serious step for open source.”

Disclosure: this site runs on WordPress and I regard Mullenweg as one of the heroes of open source. Like the Apache web server (also in action here), WordPress is among the greatest achievements of the open source community.

I have no legal expertise; though I know a little about how WordPress works. Themes link very tightly with WordPress and in most cases are built by modifying an existing GPL theme; but I guess if you could show that Pearson’s work does not do this but merely runs on WordPress, as opposed to modifying it, he may have a case. That’s the argument Michael Wasylik makes here. On the other hand, did Pearson really create his theme without including any tiny bit of GPL code?

Another factor: if you choose to build an extension to a platform like WordPress, it is arguably unwise to do something counter to the strong wishes of its founder. There are ethical as well as legal aspects to this.

It is an important discussion for the open source community.

Dysfunctional Microsoft?

Microsoft watchers have been scrutinising the fascinating Mini-Microsoft post on the Kin smartphone debacle and what it says about the company. If it is even slightly accurate, it is pretty bad; and it must be somewhat accurate since we know that the hopeless Kin launch happened and that the product was killed shortly afterwards. Of course it would have been better to kill the project before rather than after the launch; the negative PR impact has affected the strategically important Windows Phone 7 launch.

Handsome profits from Windows and Office have enabled Microsoft to survive and even prosper despite mistakes like Kin, or the Xbox 360 “red ring of death”, or the Vista reset and related problems – mistakes on a scale that would sink many companies.

I see frequent complaints about excessively bureaucratic management with too many layers, and a tendency towards perplexing, ineffective but expensive advertising campaigns.

There are also questions about CEO Steve Ballmer’s suitability for the task. He nearly indulged in a disastrously over-priced takeover of Yahoo, saved only by the obstinacy of the target company’s leadership. He habitually dismisses the competition, such as Apple’s iPhone, and is proved wrong by the market. He failed to see the importance of cloud computing, and even now that the company is at least partially converted he does not set the right tone on the subject. I watched his keynote at the Worldwide Partner Conference (WPC) where he sounded as if he were trying unsuccessfully to imitate Salesforce CEO Marc Benioff from ten years ago. Microsoft needs to present a nuanced message about its cloud initiative, not someone shouting “oh cloud oh cloud oh cloud”.

Microsoft is also copying its competition as never before. Bing has a few innovations, but is essentially a recognition that Google got it right and an attempt to muscle in with a copy of its business model – search, advertising and data mining. Windows Phone 7 occupies a similar position with respect to Apple’s iPhone and App Store. Windows 8 also seems to borrow ideas from Apple.

Nevertheless, Microsoft is not yet a dying company, and it would be a mistake to base too much analysis of the company on something like comments to Mini-Microsoft’s blog – good though it is – since it is a magnet for disaffected employees.

While Ballmer’s effort at the WPC was poor, he was followed by Bob Muglia, president of server and tools, who was excellent. Windows Azure has come on remarkably since its half-hearted preview at PDC 2008; and Muglia comes over as someone who knows what he is trying to achieve and how he intends to get there. The Azure “Appliance” idea, shipping a pre-baked cloud infrastructure to Enterprise customers, is a clever way to exploit the demand for a cloud application model but on hardware owned by the customer.

The eBay announcement at WPC was also quite a coup. eBay will “incorporate the Windows Azure platform appliance into two of its datacenters” later this year; and while it is not clear exactly how much of eBay will run on Azure, these appliance kits represent significant hardware.

We’ve seen other strong releases from Microsoft – Server 2008 R2, Exchange 2010, SQL Server 2008 R2, SharePoint 2010 (which, whatever you think of SharePoint, is a solid advance on its predecessor), and of course Windows 7, which has done a lot to rescue Microsoft’s performance and reputation after the Vista disappointment.

I also continue to be impressed by Visual Studio 2010, which is a huge release and works pretty well in my experience.

What about Windows Phone 7? With the market focused on iPhone vs Android, clearly it is in a tough market. If there is something slightly wrong with it on launch, instability or some serious hardware or software flaw, it might never recover. Nevertheless, I do not write it off. I think the design effort is intelligent and focused, and that the Silverlight/XNA/.NET development platform along with Visual Studio is an attractive one, especially for Microsoft Platform developers. VP Scott Guthrie describes the latest SDK here. People still switch phones frequently – something I dislike from an environmental point of view, but which works in favour of new entrants to the market. If Windows Phone 7 is a decent device, it can succeed; I’d rate its long-term chances ahead of HP WebOS, for example, and will be keen to try it when it becomes available.


Is there a lot wrong with Microsoft? Yes. Does it need a fresh approach at the very top? Probably. Nevertheless, parts of the company still seem to deliver; and even the Windows Phone 7 team could be among them.

Google App Inventor – another go at visual programming

Google has put App Inventor for Android on Google Labs:

To use App Inventor, you do not need to be a developer. App Inventor requires NO programming knowledge. This is because instead of writing code, you visually design the way the app looks and use blocks to specify the app’s behavior.

Sharon Machlis at Computerworld says it is a breakthrough:

App Inventor has the potential to do for mobile app creation what VisiCalc did for computations — move it out of the exclusive realm of specialists in glassed-in data centers (or, in the case of mobile apps, programmers who can use a conventional SDK) into the hands of power users as well as make it easier for IT departments to create corporate apps.

I’d like to believe this but I do not. It is visual programming; it is interesting; but it is similar to other visual programming tools that we’ve seen in the past. These tools have their place for learning, and there is probably some small sub-section of programming tasks for which they are ideally suited, and some small sub-section of developers for whom they work better than text-based tools, but for most of us textual code is easier and more productive when we are coding the logic rather than the user interface of an application.

I took a look at the Quiz Me tutorial. Here’s a code snippet – it is a click event handler:

image

and here is the complete application. Note the navigator at top right, which would be vital for finding your way around a more complex app:

image

It is often a problem with visual programming tools: scaling an app beyond a few simple blocks introduces difficulties with navigation and project management. Our text-based tools are highly evolved for managing large projects with thousands of lines of code.

What about democratisation of programming through visual tools like this, coding without coding, that will allow domain specialists to develop apps without involving developers? Is visual programming really easier for the non-specialist than textual programming? I’m not convinced. It should be easier to get started: no syntax errors, no language reference to consult. In the end though, is a purple “if” block with jigsaw connections for “test” and “then-do” much easier than typing if (test) {code block}?

It is just a different way of presenting the same thing, but less concise and less flexible. I suspect the domain specialist who can succeed with App Inventor will also succeed with code; and that people who struggle with code-based programming in an accessible language like Basic will also struggle with visual programming.
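To make the comparison concrete, here is roughly what the logic of an App Inventor "Quiz Me"-style click handler looks like as ordinary textual code. Python is used purely for illustration; the names (questions, answers, check_answer) are invented for this sketch, not App Inventor's own:

```python
# A textual sketch of a Quiz Me-style check-answer handler.
# Each purple "if" block with its "test" and "then-do" sockets
# collapses to a single if/else statement.

questions = ["Capital of France?", "2 + 2 = ?"]
answers = ["paris", "4"]

def check_answer(index, user_text):
    """Compare the user's answer to the expected one for this question."""
    if user_text.strip().lower() == answers[index]:
        return "Correct!"
    else:
        return "Try again"

print(check_answer(0, "Paris"))  # prints Correct!
```

A screenful of connected blocks reduces to a dozen lines; whether the jigsaw pieces or the braces are easier for a beginner is exactly the question at issue.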

Where this gets interesting is when you have powerful components that expose a simple interface. A high-level non-specialist programmer can drag a component onto a design palette and do amazing things, because of the smarts that are hidden inside. Components do democratise development. One reason for the success of Microsoft’s development platform is that from Visual Basic through COM and then .NET, the company has always made it easy to use components and fostered a strong third-party market for them. If App Inventor provides a great way to deliver components to high-level developers, it could succeed.

That said, components do not require visual programming. Microsoft has flirted with visual programming – like the abandoned PopFly – but despite using the name “visual” everywhere, Microsoft has never delivered a mainstream visual programming tool.

Small Business Server “Aurora” based on Windows Home Server and will have hooks to the cloud

The most interesting session at TechEd in New Orleans last month was one I could not talk about until today. It concerned the next version of Small Business Server, no date announced yet. The next SBS will come in two editions. SBS 7.0 will be conceptually similar to today’s SBS, but updated to Server 2008 R2, Exchange 2010 and so on.

SBS code-name “Aurora” is the compelling one though. It is based on Windows Home Server (or at least on “Vail”, the next version of WHS), but with Active Directory added. There are no other apps; you are expected to use cloud services.

The reason this matters is Microsoft’s work on federated Active Directory. What this means is that your local SBS simply manages users, computers and file shares, but the same user accounts also work on cloud-hosted services such as Exchange or SharePoint – or any others that support Active Directory federation.

I love this concept; it is exactly the right thing for SMEs who need to run a properly managed Windows network while using hosted email and other cloud services.

image

Questions remain of course. Will services other than Microsoft’s own BPOS or third-party hosted Exchange and SharePoint support SBS federated Active Directory? And will Microsoft and its partners really steer small businesses in this direction, or focus on SBS 7.0?

More details in this article on The Register.

PS This version of SBS is not too far removed from what I asked for in February 2006.

Bare-metal recovery of a Hyper-V virtual machine

Over the weekend I ran some test restores of Microsoft Hyper-V virtual machines. You can restore a Hyper-V host, complete with its VMs, using the same technique as with any Windows server; but my main focus was on a different scenario. Let’s say you have a Server 2008 VM that has been backed up from the guest using Windows Server Backup. In my case, the backup had been made to a VHD mounted for that purpose. Now the server has been stolen and all you have is your backup. How do you restore the VM?

In principle you can do a bare-metal restore in the same way as with a physical machine. Configure the VM as closely as possible to how it was before, attach the backup, boot the VM from the Server 2008 install media, and perform a system recovery.

Unfortunately this doesn’t work if your VM uses VHDs attached to the virtual SCSI controller. The reason is that the recovery console cannot see the SCSI-attached drives. This is possibly related to the Hyper-V limitation that you cannot boot from a virtual SCSI drive.

The workaround I found was first to attach the backup VHD to the virtual IDE controller (not SCSI), so the recovery console can see it; then to do a system recovery of the IDE drives, which will include the C drive; then to shut down the VM (before the restart), mount both the backup and the SCSI-attached VHDs on the host using diskpart, and use wbadmin to restore each individual volume; and finally to detach the VHDs and restart the VM.
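As a rough illustration, the host-side part of that workaround looks something like the following transcript. The VHD paths, drive letters and backup timestamp are placeholders, not the exact values from my setup:

```
rem On the Hyper-V host, after shutting down the VM,
rem mount the backup VHD and the SCSI-attached data VHD:
diskpart
DISKPART> select vdisk file="D:\VMs\backup.vhd"
DISKPART> attach vdisk
DISKPART> select vdisk file="D:\VMs\data-scsi.vhd"
DISKPART> attach vdisk
DISKPART> exit

rem Identify the backup version, then restore each data volume
rem from the mounted backup (F:) to the mounted VHD volume (G:):
wbadmin get versions -backupTarget:F:
wbadmin start recovery -version:07/10/2010-21:00 -itemType:Volume -items:G: -backupTarget:F:
```

Afterwards, detach both VHDs in diskpart and start the VM as described above.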

It worked. One issue I noticed though is that the network adapter in the restored VM was considered different to the one in the original VM, even though I applied the same MAC address. Not a great inconvenience, but it meant fixing networking as the old settings were attached to the NIC that was now missing.

I’ve appended the details to my post on How to backup Small Business Server 2008 on Hyper-V.

Dusty PC keeps on keeping on

It is amazing how well desktop PCs work even when choked by dust. This one was working fine:

image

It is a little hard to see from the picture, but the mass of dust on the right is actually a graphics card. The graphics card has its own fan, which was so dust-choked that the space between the blades was filled.

Running PCs in this state is not a good idea. If your machine is not working properly or overheating, it is worth a look. If it is working fine, perhaps it is worth a look anyway. The best way to clear it is with one of those air duster aerosols.

Changing the motherboard under Windows 7

Today I needed to swap motherboards between a machine running Hyper-V Server 2008 R2 and another running 32-bit Windows 7. No need to go into the reason in detail; it’s to do with some testing I’m doing of Hyper-V backup and restore. The boards were similar, both Intel, though one had a Pentium D processor installed and the other a Core Duo. Anyway, I did the deed and was intrigued to see whether Windows would start in its new hardware.

Hyper-V Server – which is really 64-bit Server Core 2008 R2 – started fine, installed some new drivers, requested a restart, and all was well.

Windows 7 on the other hand did not start. It rebooted itself and offered startup repair, which I accepted. It suggested I try a system restore, which I refused, on the grounds that the problem was not some new corruption, but that I had just changed the motherboard. Next, startup repair went into a lengthy checking procedure, at the end of which it reported failure with an unknown problem possibly related to a configuration change.

That was annoying. Then I remembered the problems Windows has with changing to and from AHCI, a BIOS configuration for Serial ATA. I posted on the subject in the context of Vista. I checked the BIOS, which was set to AHCI, changed it to IDE mode, and Windows started fine. Then I made the registry change for AHCI, shutdown, changed back to AHCI in the BIOS. Again, Windows started fine.
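For reference, the registry change I made is along these lines, expressed as a .reg fragment. This targets Windows 7’s standard Microsoft AHCI driver; systems using Intel’s storage driver would need the equivalent change under the iaStorV service instead:

```
Windows Registry Editor Version 5.00

; Set the Microsoft AHCI driver to load at boot (Start = 0).
; With Start left at its IDE-mode value, Windows bluescreens
; when the BIOS is switched to AHCI.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\msahci]
"Start"=dword:00000000
```

Apply the change while Windows is still booting in IDE mode, shut down, switch the BIOS to AHCI, and restart.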

What puzzles me is why the long-running Windows 7 startup repair sequence does not check for this problem. If the alternative is a complete reinstall of Windows, it could save a lot of time and aggravation.

It is also worth noting that Windows 7 declared itself non-genuine after this operation, though actually it re-activated OK. I guess if you had two machines with OEM versions of Windows 7, for example, and swapped the motherboards, then strictly you would need two new licenses.

Don Syme on F#

I’ve posted a lengthy interview with Don Syme, designer of Microsoft’s functional programming language F#. It covers:

  • The genesis of F#
  • Why it is in Visual Studio 2010
  • How it differs from other ML languages
  • Who should use it
  • What it brings to parallel and asynchronous programming
  • Unit testing F#
  • Future plans for F#
  • Book recommendations

One of the questions is: if I’m a C# or C++ developer, what practical, business-benefit reason is there to look at F#? Worth a read if you’ve wondered about that.

Setting up RemoteApp and secure FTP on Windows

I spent some time setting up RemoteApp and secure FTP for a small business which wanted better remote access without VPN. VPN is problematic for various reasons: it is sometimes blocked by public or hotel wifi providers, it is not suitable for poor connections, performance can be poor, and it means constantly having to think about whether your VPN tunnel is open or not. When I switched from connecting Outlook over VPN to connecting over HTTP, I found the experience better in every way; it is seamless. At least, it would be if it weren’t for the connection settings bug that changes the authentication type by itself on occasion; but I digress.

Enough to say that VPN is not always the best approach to remote access. There’s also SharePoint of course; but there are snags with that as well – it is powerful, but complex to manage, and has annoyances like poor performance when there are a large number of documents in a single folder. In addition, Explorer integration in Windows XP does not always work properly; it seems better in Vista and Windows 7.

FTP on the other hand can simply publish an existing file share to remote users. FTP can be horribly insecure; it is a common reason for usernames and passwords to be passed in plain text over the internet. Fortunately Microsoft now offers an FTP service for IIS 7.0 that can be configured to require SSL for both password exchange and data transmission. I would not consider it otherwise. Note that this is different from the FTP service that ships with the original Server 2008; if you don’t have 2008 R2 you need a separate download.

So how was the setup? Pretty frustrating at the time; though now that it is all working it does not seem so bad. The problem is the number of moving parts, including your network configuration and firewall, Active Directory, IIS, digital certificates, and Windows security.

FTP is problematic anyway, thanks to its use of multiple ports. Another point of confusion is that FTP over SSL (FTPS) is not the same thing as Secure FTP (SFTP); Microsoft offers an FTPS implementation. A third issue is that neither of Microsoft’s FTP clients, Internet Explorer or the FTP command-line client, support FTP over SSL, so you have to use a third-party client like FileZilla. I also discovered that you cannot (easily) run an FTPS client behind an ISA Server firewall, which explained why my early tests failed.
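To illustrate what an explicit FTPS (FTP over SSL) session involves, here is a minimal client sketch using Python’s standard ftplib; the host and credentials are placeholders, and a real deployment would also want certificate verification:

```python
from ftplib import FTP_TLS

def secure_listing(host, user, password):
    """Fetch a directory listing over explicit FTPS.

    The control channel is upgraded with AUTH TLS *before* the
    password is sent, and prot_p() requires TLS on the data
    channel too - the two things plain FTP gets wrong.
    """
    ftps = FTP_TLS(host)
    ftps.auth()                 # secure the control channel first
    ftps.login(user, password)  # credentials now travel encrypted
    ftps.prot_p()               # encrypt the data channel as well
    names = ftps.nlst()
    ftps.quit()
    return names
```

This is the client-side counterpart of the “require SSL for password exchange and data transmission” settings in the IIS FTP service.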

Documentation for the FTP server is reasonable, though you cannot find all the information you need in one place. I also found the configuration perplexing in places. Take this dialog for example:

image

The Data Channel Port Range is disabled with no indication why – the reason is that you set it for the entire IIS server, not for a specific site. But what is the “External IP Address of Firewall”? The wording suggests the public IP address; but the example suggests an internal, private address. I used the private address and it worked.
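For the record, the server-wide data channel range can also be set from the command line with appcmd; the port numbers here are arbitrary examples:

```
rem appcmd lives in %windir%\system32\inetsrv
rem Set the FTP data channel port range for the whole IIS server:
appcmd set config /section:system.ftpServer/firewallSupport /lowDataChannelPort:5000 /highDataChannelPort:5100

rem Restart the FTP service so the new range takes effect:
net stop ftpsvc
net start ftpsvc
```

The same ports then need to be forwarded through the firewall to the FTP server.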

As for RemoteApp, it is a piece of magic that lets you remote the UI of a Windows application, so it runs on the server but appears to be running locally. It is essentially the same thing as remote desktop, but with the desktop part hidden so that you only see the window of the running app. One of the attractions is that it looks more secure, since you can give a semi-trusted remote user access to specified applications only, but this security is largely illusory because under the covers it is still a remote log-in and there are ways to escalate the access to a full desktop. Open a RemoteApp link on a Mac, for example, and you get the full desktop by default, though you can tweak it to show only the application, but with a blank desktop background:

image

Setup is laborious; there’s a step by step guide that covers it well, though note that Terminal Services is now called Remote Desktop Services. I set up TS Gateway, which tunnels the Terminal Server protocol through HTTPS, so you don’t have to open any additional ports in your firewall. I also set up TS Web Access, which lets users navigate to a web page and start apps from a list, rather than having to get hold of a .RDP configuration file or setup application.
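For users who get a .RDP file rather than using TS Web Access, the file is just a plain-text list of settings. The fragment below shows the ones relevant to RemoteApp through TS Gateway; the hostnames and the application are placeholder examples:

```
full address:s:terminalserver.internal.example
gatewayhostname:s:remote.example.com
gatewayusagemethod:i:1
remoteapplicationmode:i:1
remoteapplicationprogram:s:||WordPad
remoteapplicationname:s:WordPad
```

With remoteapplicationmode set to 1 the client shows only the application window; gatewayusagemethod 1 tells it to tunnel the connection through the HTTPS gateway rather than connect directly.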

If you must run a Windows application remotely, RemoteApp is a brilliant solution, though note that you need additional Client Access Licenses for these services. Nevertheless, it is a shame that despite the high level of complexity in the configuration of TS Gateway, involving a Connection Authorization Policy and a Resource Authorization Policy, there is no setting for “only allow users to run these applications, nothing else”. You have to do this separately through Software Restriction Policies – the document Terminal Services from A to Z from Cláudio Rodrigues at WTS.Labs has a good explanation.

I noticed that Rodrigues is not impressed with the complexity of setting up RemoteApp with TS Gateway and so on, on Windows Server 2008 R2:

So years ago (2003/2004) we had all that sorted out: RDP over HTTPS, Published Applications, Resource Based Load Balancing and so on and no kidding, it would not take you more than 30 minutes to get all going. Simple and elegant design. More than that, I would say, smart design.

Today after going through all the stuff required to get RDS Web Access, RDS Gateway and RDS Session Broker up and running I am simply baffled. Stunned. This is for sure the epitome of bad design. I am still banging my head in the wall just thinking about how the setup of all this makes no sense and more than that, what a steep learning curve this will be for anyone that is now on Windows Server 2003 TS.

What amazes me the most is Microsoft had YEARS to watch what others did and learn with their mistakes and then come up with something clean. Smart. Unfortunately that was not the case … Again, I am not debating if the solution at the end works. It does. I am discussing how easy it is to setup, how smart the design is and so on. And in that respect, they simply failed to deliver. I am telling you that based on 15+ years of experience doing nothing else other than TS/RDS/Citrix deployments and starting companies focused on TS/RDS development. I may look stupid indeed but I know some shit about these things.

Simplicity and clean design are key elements on any good piece of software, what someone in Redmond seems to disagree.

My own experience was not that bad, though admittedly I did not look into load balancing for this small setup. I agree though: you have to do a lot of clicking to get this stuff up and running. I am reminded of the question I asked a few months back: Should IT administration be less annoying? I think it should, if only because complexity increases the risk of mistakes, or of taking shortcuts that undermine security.