Future of Web Applications wrap-up: check out the Open Stack

I attended two great sessions at day two of FOWA London on Friday – which I guess makes it a good day. The first was from Tim Bray, about which I posted last week. Bray was alone in suggesting that the current economic climate will change the tech world deeply; it’s speculative, but I’m inclined to agree with him, though not on every detail of his analysis. He says recession will boost open source – it might, though I can also imagine companies cutting back on the number of staff they dedicate to open source projects like Eclipse – a large amount of open source development is done by professionals on company time. I asked Bray how he thought Sun (his company) would fare in the downturn; he said it would be better off than software companies which are not committed to the open source model, but again that’s speculative; what will be tested is the business model, and that is one thing Sun has never been able to explain satisfactorily.

I was amused that Tim Bray sneaked “Flash is bad” into his talk, as well as talking about “the Microsoft disaster” and “the Oracle disaster”.

The other high point was Kathy Sierra doing her piece on passionate users; I guess it is what she always says, but none the worse for that, especially as I had not heard her speak before. I’ll be posting again on this, here or elsewhere. I was sorry that her talk came just before the Diggnation silliness, which meant no chance for questions.

There were disappointments too. Mark Zuckerberg and Dave Morin spoke about Facebook Connect and a few other things; I would have liked some sort of debate about Connect vs Google’s OpenSocial; my observation is that Google is doing a better PR job in persuading us that it supports the “open web” as opposed to some kind of walled garden.

Although I have an intense interest in things like Adobe AIR, Salesforce.com, and Amazon’s web services, the sessions on these were disappointing to me, being high level and mainly marketing. There was probably more detail in the “University” mini-sessions but one cannot attend everything.

Despite its absence from the main stage, there seemed to be a fair amount of interest in Microsoft’s stand and mini-sessions on Silverlight; nevertheless, if the FOWA crowd is wary of Flash it is even more suspicious of anything from Microsoft. FOWA is biased towards open technology; only Apple gets a pass because its laptops and iPhone are well liked.

As for significant clues about, well, the future of web applications, I’d point to David Recordon’s session on the “Open Stack”, about which I posted on Friday. If this, or something like it, wins serious adoption it will have a huge impact.

Many of the sessions have been posted as videos here.

Amazon fails to address interoperability concerns; Flexiscale plans cloud platform

Just attended a session here at FOWA from Amazon’s Jeff Barr and Flexiscale’s Tony Lucas on cloud computing. These vendors have similar offerings (in kind, but not in scale; Flexiscale is tiny by comparison). Lucas had told me he would talk about interoperability between Amazon and Flexiscale but did not do so, nor did Barr mention it.

I took the opportunity to get in some questions at the informal gathering after the session. The context is that Amazon has had serious outages this year, which will not have gone unnoticed by organizations considering its platform; the ability to import and export AMIs (Amazon Machine Images) would help users to implement failover plans. Is either Amazon or Flexiscale considering support for the Open Virtualization Format (OVF), used by VMware?

Neither is doing so. Lucas muttered something about standards driven by commercial agendas; Barr said Amazon would wait and see and did not want to standardise too early; and that customers were not asking for it.

What interested me was the intense interest from other developers who had come up to ask questions, in this topic of interoperability and avoiding lock-in. This makes me wary of Barr’s comment that there is little interest.

In mitigation, Lucas said that his company can already import AMIs, but does not do so because it might breach Amazon’s terms and conditions. Barr pointed out that AMIs are just Linux VMs, so you can easily migrate their contents. Both good points. Nevertheless, it strikes me that VMware’s vCloud offering goes beyond either Amazon or Flexiscale in this respect.

Lucas made a couple of other observations. He said that Google’s BigTable, which sits underneath the App Engine API, is not open source, which makes it impossible to implement App Engine on his platform. He added that Flexiscale was always conceived as a platform offering, not just on-demand virtual servers, and will shortly announce a platform based on a 100% open source stack (aside from the Windows version; it sounds as if both Linux and Windows will be available).

Sun’s Tim Bray declares end of Enterprise Software

In a dramatic session here at FOWA in London, Sun’s Tim Bray tore up his talk and spoke instead on life after the economic crash. While giving a near-apocalyptic prediction of tough times ahead, he picked winners and losers among technologies. Winners: open source, agile development, web applications, cloud computing. Loser: enterprise software:

I do not see much future for enterprise software … you are not going to get any purchase orders

On the subject of cloud computing, he added that he is not sure what model is best – hosted applications, virtual servers on demand such as those from Amazon, or what. The main risk, he said, is lock-in.

Ironically his talk is being followed by one from Salesforce.com, where lock-in is real.

Clearly, and as Bray admitted, the ideas he is recommending are the same ones he would recommend anyway. That doesn’t make them wrong, of course. His dose of reality, despite his pessimism, won applause here.

There are a few more snippets from his talk on my twitter feed.

Future of Web Apps 2008 Day One: Web is DVD, desktop VHS

I’m at London’s dreary ExCeL centre for Carson’s Future of Web Apps conference, just before the opening of day two. Yesterday was a mixed bag: good when speakers talked technical; bad when they descended into marketing. The conference began as a start-up incubator, developers and entrepreneurs getting together to see what’s new and make contacts. It still has some of that flavour, but it has grown beyond that because web apps are a mainstream topic and Carson attracts generally excellent speakers. There is a good crowd here; I’m not sure if every last ticket sold, but it is pretty much packed out, though the dark economic mood is dampening spirits.

Digg’s Kevin Rose spoke briefly about his site’s new recommendation engine, which has been active since July or so. The idea is that Digg learns a user’s profile by examining clicks and votes, using it to customize what the user sees. He spoke about a forthcoming feature, whereby third-party sites will be able to call the Digg recommendation engine to get profile information, which they can then use to customize their own sites.

An interesting idea; though it raises several questions. How does it work – would logging out of Digg be sufficient to disable it? Will users opt-out or opt-in? How much of this kind of customization do we want anyway?
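Digg has not published how its engine works, and the names and data shapes below are mine, purely illustrative; but the general technique Rose described – inferring a weighted topic profile from a user’s clicks and votes, then ranking unseen stories against it – can be sketched in a few lines:

```python
from collections import Counter

def build_profile(clicked_stories):
    # A user's profile is just a weighted bag of the topics
    # attached to the stories they clicked or voted on
    profile = Counter()
    for story in clicked_stories:
        profile.update(story["topics"])
    return profile

def score(story, profile):
    # Rank an unseen story by its topic overlap with the profile
    return sum(profile[t] for t in story["topics"])

clicked = [{"topics": ["apple", "mobile"]}, {"topics": ["mobile", "web"]}]
candidates = [{"id": 1, "topics": ["mobile"]}, {"id": 2, "topics": ["politics"]}]

profile = build_profile(clicked)
best = max(candidates, key=lambda s: score(s, profile))
# best is the mobile story, id 1
```

The questions above still apply to any such scheme: the profile exists whether or not the user can see it, which is why opt-in versus opt-out matters.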

This whole theme of contextualization is a big one here; it ties in closely with social networking, and Google’s OpenSocial API is getting quite a bit of attention.

Blaine Cook (ex-Twitter, now Yahoo; Ruby guy and inventor of OAuth) gave a thought-provoking session on scalability along with Joe Stump from Digg (a PHP guy). They took the line that languages don’t matter – partly a reflection on Twitter’s scaling problems and whether they were Ruby’s fault. Other factors make language efficiency unimportant, they said, such as disk I/O and network speed; and the secret of scaling is multiple and redundant cheap boxes and apps which are segmented so that no one box is a bottleneck. The case was overstated, but the main points strike me as sound.
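A minimal sketch of that segmentation idea: route each key to one of several cheap boxes by a stable hash, so no single machine handles everything. The box names here are invented for illustration:

```python
import hashlib

def shard_for(key, boxes):
    # Use md5 rather than Python's built-in hash() so that
    # placement is stable across processes and restarts
    h = int(hashlib.md5(key.encode()).hexdigest(), 16)
    return boxes[h % len(boxes)]

boxes = ["db1", "db2", "db3", "db4"]
# The same key always lands on the same box
assert shard_for("user:42", boxes) == shard_for("user:42", boxes)
```

The catch with naive modulo placement is that adding a fifth box remaps most keys, which is why larger deployments use consistent hashing instead; but the principle – spread the data so no one box is a bottleneck – is the same.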

I’m wondering how many of the developers here are actually having to deal with these kinds of scalability problems. Many web apps get only light use; the problems for everyday developers are different.

I attended a session entitled "The future of Enterprise Web Apps" by Googler Kevin Marks. It turned out to be a plug for the OpenSocial API; not what I was expecting.

Francisco Tolmasky of 280slides.com evangelised his Objective-J and Cappuccino JavaScript framework, based loosely on Apple’s Cocoa framework. Hmm, bit like SproutCore.

I give Tolmasky credit for the most striking analogy of the day. The Web is DVD, he says, and the desktop is VHS. Adobe’s AIR is a combo player. He is talking about transition, and leaving us in no doubt about what he sees as the future of the desktop.

Best sessions of the day (that I attended) were Blaine Cook on Jabber and its XMPP protocol, and David Recordon from SixApart on the evolving Internet "open stack". In this he includes:

  • OpenID + hCard for identity
  • XRDS-Simple for discovery (http://is.gd/3M53)
  • OAuth for authorization
  • Atom and POCO (Portable Contacts)
  • OpenSocial
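Of these, OAuth is the piece most developers will touch first. As a flavour of what it involves, here is a sketch of OAuth 1.0’s HMAC-SHA1 request signing – the signature base string and signing key construction – written from my reading of the spec, with made-up keys; a real implementation must also handle duplicate parameter names, which a dict glosses over:

```python
import base64
import hashlib
import hmac
from urllib.parse import quote

def pct(s):
    # OAuth percent-encoding: only ALPHA, DIGIT, '-', '.', '_', '~'
    # are left unescaped (note '/' is escaped, unlike quote's default)
    return quote(s, safe="~-._")

def base_string(method, url, params):
    # Encode each pair, sort, then join as key=value with '&'
    pairs = sorted((pct(k), pct(v)) for k, v in params.items())
    norm = "&".join(f"{k}={v}" for k, v in pairs)
    return "&".join([method.upper(), pct(url), pct(norm)])

def sign(method, url, params, consumer_secret, token_secret=""):
    # Signing key is consumer secret and token secret joined by '&'
    key = f"{pct(consumer_secret)}&{pct(token_secret)}"
    text = base_string(method, url, params)
    digest = hmac.new(key.encode(), text.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

params = {"oauth_consumer_key": "key", "oauth_nonce": "abc",
          "oauth_signature_method": "HMAC-SHA1",
          "oauth_timestamp": "1224000000", "oauth_version": "1.0"}
bs = base_string("get", "http://example.com/request", params)
sig = sign("GET", "http://example.com/request", params, "secret")
```

Fiddly, but the point of the open stack is that you implement this once and it works against any provider that follows the spec, rather than once per walled garden.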

I put these two sessions together because they both addressed the "Web as platform" topic that is really the heart of why we are here. Spotting which APIs and protocols will win is tricky; but if consensus is reached on some or all of these, they will impact all web developers and bring new coherence to what we are doing.

I’ll be covering today on Twitter again – see here if you want to follow.

Vista even thinks Control Panel is photos

One of Vista’s most annoying features is the tendency of Explorer to decide, first, that all your documents are music or photos; and second, that if they are, you care more about metadata like “Rating” than humdrum details such as the date of the file.

I had thought that Vista only did this if it found at least one media file in the folder, but today it happened with Control Panel:

Notice how it highlights another user-hostile feature: the name of each applet is in a column too narrow to read, and several applets are indistinguishable from each other because they begin “Microsoft .NET Frame…” or “Internet Information S…”; another triumph of branding over usability.

What I wanted was the Event Viewer; and while I’m in ranting mode, let me add that I much prefer the old NT Event Viewer to the Vista effort. The new one takes ages to populate a clever multi-pane view, which presents too much information in tiny scrolling panels. In practice I use the tree view on the left to select the log I want, subverting the new design by doing exactly what I would have done in the old Event Viewer. Habit possibly; but there are real design problems with the new Event Viewer. Administrators will always choose practical over pretty.

See here for my earlier complaint about Explorer views and a partial remedy. Why wasn’t this fixed in SP1?

Recovering data from a failed hard drive with ZAR

A friend’s computer would no longer boot. The problem turned out to be a failed hard drive. After five years’ service, this 40GB Western Digital is nearly dead. Replacing the drive is cheap; but what about the hundreds of family snapshots, for which no backup exists? Such data falls into an awkward category, of no financial value, not worth huge sums for professional data recovery, but sad to lose.

This drive is free of clicking noises (which are usually a very bad sign) and is recognized by the BIOS. My usual procedure in cases like this is to attach the drive to another working computer, do a backup image copy if possible, and then run utilities like CHKDSK as an attempted repair.

This one wasn’t easy. One problem is that the faulty drive slows down the whole system, presumably as Windows repeatedly queries it for information and gets a delayed response or a timeout. That makes for slow and frustrating work. Initially the drive was completely unreadable. Following several hours of CHKDSK, I could see the file system in Explorer, but directories took several minutes to open. I managed to copy a few files, but most of the images failed to copy; after a long pause Windows would report a file I/O error.

I tried the official Western Digital diagnostic and repair utility. It reported too many bad sectors to continue.

I had a quick look for utilities that might help, and came across ZAR, Zero Assumption Recovery. This is trialware, free for recovering up to 4 folders, or images from a memory card, and inexpensive for the full version. I ran it first in the free image recovery mode. It took 20 hours but recovered 55,000 image files, saved with random names in a single directory. I tried opening some of the JPEGs; some opened, some were corrupt. Still, better than nothing. I paid for the full version, and re-ran the utility. This time it was quicker. I was able to select the NTFS folders I wanted to recover – I chose all of Documents and Settings – and it retained the folders and filenames. After about 7 hours, it recovered most of the data successfully. I have not tested all the images, but the ones I have tried open fine.

There may be better utilities out there, but I was impressed with ZAR; it takes a long time, but since it works unattended that is not a problem.

Finally, a few words of general advice if you have a failing Windows drive containing important data. Disclaimer: this is based on my experience and might not work for you.

  • If you notice the problem when the drive is working, backup what you care about immediately. It may never spin up again.
  • Check the event log – if there are disk errors reported, such as ATAPI or SCSI errors, perhaps the drive is failing. I always replace the drive in these cases; keeping it is not worth the hassle.
  • If backing it up in place is not possible, stop working with the drive. Writing data to it may make things worse. Attach it to a different PC as a spare drive. Back it up as-is if possible, using something like Drive Snapshot.
  • Now, how much do you really care about that data? If it is business critical, just send it to someone like OnTrack. It will cost a fortune, but pretty much anything can be recovered.
  • If the drive won’t spin, or the BIOS won’t recognize it, you are on your own. Homely remedies include sharp taps or a dose of refrigeration; or maybe the skip beckons.
  • If it kinda works, try CHKDSK /R. This can take many hours with a bad drive, but often works well enough to recover data.
  • If that fails, get the drive manufacturer’s diagnostic utility. This will tell you if the drive is physically damaged, or just scrambled. A repair using this utility may also work – but could also make data harder to recover. That’s why you made the image backup.
  • If that fails, try ZAR or one of many other utilities out there. I noticed that OnTrack has some too. There are free ones as well. Good luck.

Prism: official Delphi language comes to Visual Studio

Embarcadero is to release Delphi for .NET as a Visual Studio add-on, called Prism. Marco Cantu has a summary. Note that according to this post, which is based on an announcement statement by product manager Nick Hodges at the SDN conference near Amsterdam, there will be:

full support for the .NET framework 3.5 (WinForms, WPF, Silverlight, ASP.NET, WCF, LINQ) … CodeGear will provide Datasnap 2009 integration and dbExpress for ADO.NET support

It looks as if this will be a full alternative language for .NET developers. Note that many of the language changes, such as generics, in the Win32 version of Delphi 2009 seemed to have .NET compatibility in mind. It makes sense for Embarcadero to use Visual Studio to host .NET development tools, just as it uses Eclipse for Java.

There remains an awkward question. What advantage is there in using Delphi (a version of Pascal) rather than C# for .NET development? If this is aimed only at existing Delphi developers migrating code, it will only ever be a niche.

Not good news for RemObjects Oxygene, which is also an Object Pascal add-on for Visual Studio; but Oxygene has some other tricks like Mono support, for running on Linux, which may sustain it.*

I am trying to clarify a couple of points. To what extent, if at all, will Prism support the .NET version of Delphi’s VCL (Visual Component Library), which would not fit smoothly with the Visual Studio design tools? Even if VCL.NET applications work, you would probably be better off using Delphi’s own IDE for them. Code ported from Win32 Delphi will likely use the VCL, so this is tough to get right. And what is the future of Delphi for .NET in RAD Studio? I will update this post when I know more.

*Comments below suggest that this is in fact Oxygene rebadged; I won’t say more until I’ve got official confirmation.

Google Chrome usage one month on

Om Malik asks about Chrome usage, one month after its release.

On this site this month (only a few days in) Chrome has a 2.5% share, below Opera at 3.2%. Malik reports 5.59%; commenters on his post report figures ranging from 0.36% up to something approaching Malik’s, which seems to be about the maximum.

Small, but even say 2.5% is not that bad for a new, beta web browser. I use it myself some of the time; I like the speed and clean UI.

That said, Chrome usage has declined, after the initial surge of people trying it out. The share now is more meaningful; it will be fascinating to watch its progress. The challenge for Google is to get a buzz going; surely a web browser is a perfect candidate for Web 2.0 marketing.
