Hot news: the Internet is as insecure as ever

I’ve been writing about the Internet for years, and some of my earliest articles were about security problems. I’ve written about why anti-virus software is ineffective, how application insecurities leave web servers open to attack, and why we need authenticated email combined with collective whitelisting in order to solve the problem of spam and virus-laden emails.

What depresses me is that we have made little if any progress over the last decade. Email is broken, but I have to use it for my work. Recently I’ve been bombarded with PDF spam and ecard viruses, which for some reason seem to slip past my junk mail filter. Said filter does a reasonable job and I could not manage without it, but I still get false positives from time to time – genuine messages that get junked and might or might not be spotted when I glance through them. The continuing flow of garbage tells me that anti-virus software is still failing, because all of that garbage comes from other machines that are already infected.

And what about comment spam? Akismet is fantastic; it claims to have caught 43,000 spam comments to this blog since I installed it in October last year. In the early days I used to glance through all of them and occasionally I did find a comment that was incorrectly classified. Now, the volume of spam comments makes that unfeasible, so no doubt some genuine comments are being wrongly junked.

Security is a huge and costly problem. Even when everything is running sweetly, anti-virus and anti-spam software consumes a significant portion of computing resources. Recently I investigated why an older machine with Windows XP was running slowly. It did not take long: Norton anti-virus was grabbing up to 60% of the CPU time. Disabling NAV made the machine responsive again. Nevertheless, the user decided to keep it running. What is the cost to all of us of that accumulated wasted time?

We have become desensitized to security problems because they are so common. I come across people who know they have viruses on their PCs, but continue to run them, because they have stuff to do and would rather put up with a “slow” machine than try to fix it. Other machines are compromised without the awareness of their owners. Those PCs are pumping out viruses and spam for the rest of us, or are part of the vast botnet army which is now an everyday part of the criminal tool chest.

I actually write less about security than I used to, not because the issue is of any less importance, but because it becomes boringly repetitive. Desensitized.

The frustration is that there are things we could do. Email, as I noted above, could be made much better, but it requires collective willpower that we seem to lack. A while back I started authenticating my emails, but ran into problems because some email clients did not like them. Recipients saw an unexpected attachment and thought it might be a virus, or found they could not reply to the email. I had to remember to remove the authentication for certain recipients, and it became too difficult to manage, so I abandoned the experiment. That’s really a shame. Authentication in itself does not prevent spam, but it is an essential starting point.

Do we have to live with this mess for ever? If not, how long will it take until we begin to see improvement?


Salesforce.com + Google + Adobe Flex and AIR = New Internet platform?

I spoke to Adam Gross, vice president of developer marketing at Salesforce.com, about the new Summer 07 release. This includes the first full release of Apex Code, a server-side programming language which lets you customize and extend Salesforce.com applications. Currently Apex Code is only available in the high-end Unlimited Edition.

Gross told me that Apex Code has acquired some interesting new capabilities since we last spoke about its preview release. These include the ability to make outgoing web service calls, a capability which is particularly interesting because it enables mash-ups with third-party web applications that also expose a web services API. Salesforce.com is a big user of SOAP, by the way, in contrast to all the negative press that SOAP seems to get elsewhere. You can develop Apex Code in Eclipse, though debugging is still fairly painful.

The mash-up idea is compelling. One of the key challenges of hosted applications is integration with local applications. Things like complex Outlook connectors are common, along with the ability to export data as Excel or Word documents. That friction can be reduced by moving more of your data online. For example, Gross told me that developers have already devised ways to export Salesforce.com reports directly into Google’s Documents and Spreadsheets. From there, you can easily share the report with colleagues. In fact, the same application could email the link to a list of recipients. That makes more sense than exporting to Excel, attaching it to an email in Outlook, and sending it out again. Note that Google and Salesforce.com announced a strategic alliance in June, but this focused on AdWords, which is less interesting.

Another key Salesforce.com partner is Adobe. Gross tells me that he sees wide take-up for Flex and AIR among Salesforce.com developers. There is a Flex Toolkit which simplifies the development of Flex or AIR applications that call Salesforce.com APIs. “The take up of the Flex Toolkit has been breathtaking,” says Gross. “We are already seeing companies create offline applications written in AIR, even though that product is still in beta.” He also praises the productivity of Flex – apparently some developers use it for that reason alone.

The offline aspect of AIR is vital, as it addresses the most obvious weakness of the Salesforce.com platform. Google Gears could be used for this as well, though I got the impression that Gross sees more take-up for AIR at the moment. He says there will be further announcements on both at the Dreamforce conference in September.

The vast majority of Salesforce.com usage seems still to be CRM, though there is no inherent reason why the platform should not support ERP or other application types. Its strength is the vast amount of pre-built functionality. Concerns include the cost – especially if the company reserves key features like Apex Code for its high-end edition – and the risk of vendor lock-in. See here for a good overview of the Salesforce.com platform.

From BT to VOIP

I have two BT (British Telecommunications) telephone lines, giving me the convenience of two separate numbers. I’ve known for some time that I could achieve greater flexibility at lower cost by using VOIP (Voice Over IP), but changing over was never urgent so I left well alone. That was until this month’s BT bills arrived, featuring a new “payment fee” – an £18.00 annual charge for paying your bill. That’s right, you now pay BT for the privilege of paying. You can avoid the new fee by switching to Direct Debit, but instead I treated it as an incentive to get on with the switch to VOIP. I still need one BT line, for my ADSL connection, but I’ll be scrapping the other and transferring the number to my new provider, Voipfone.

We currently use DECT wireless handsets, so I’ve also purchased a Siemens Gigaset wireless SIP (Session Initiation Protocol) handset and DECT base station, in the hope that I can re-use our existing DECT handsets. The Gigaset is also able to work with an old-style fixed line, so it is ideal for the transition.

Why not Skype? Well, Skype is massively popular and I do use it occasionally, particularly when calling from overseas. Unfortunately it uses a proprietary protocol, whereas SIP is a standard with lots of open source energy behind it, including of course the Asterisk PBX.

It’s going OK. The biggest problem I’ve had is trying to get it working behind ISA Server, Microsoft’s firewall. I’ve had partial success, but only with the X-Lite softphone running on a PC with the ISA Firewall Client installed. I can’t install this client on the Gigaset, so I’ve connected it directly to the ADSL router, bypassing ISA. This is actually a pretty good solution, though if anyone knows how to get this working through ISA I’d be glad to know. Incidentally, call quality is much better on the Gigaset than in X-Lite.

I should save some money, but what’s more important is that VOIP opens up many new possibilities and I’m looking forward to some experimentation.

PS – as chance would have it, Danny Bradbury has a post bemoaning the low quality of some VOIP calls. It’s a fair point, and another good reason to keep at least one POTS (Plain Old Telephone Service) line to hand. Still, my experience so far is that the VOIP phone is fine for everyday use. I’ll let you know how it goes.


CodeGear abandons .NET Windows Forms?

Intriguing blog post here suggests that future versions of Delphi .NET will not support Windows Forms, the most widely used GUI library for .NET. Apparently the Delphi 2006 Windows Forms designer will not appear in Delphi “Highlander”. If you want to do GUI work in Delphi .NET, you will have to use VCL.NET, or else make do without a designer.

At first glance this looks like a mistake. The main problem is third party components. There are plenty around for Windows Forms, few for VCL.NET. OK, you can import a Windows Forms component and use it in VCL.NET, but it’s not ideal.

Then again, what is the future for Delphi .NET? Pretty uncertain, judging by what CEO Jim Douglas told me. If the speculation about Windows Forms in Delphi is correct, then anyone who invested in Delphi Windows Forms development has been left stranded. Might not the same happen to VCL.NET developers? And what are the implications for WPF, a much nicer GUI library than Windows Forms though immature and little-used at the moment? Developers hate this kind of uncertainty.

How to buy market share in search … or not

Microsoft gained remarkable market share in search last month, up from 8.4% to 13.2%. At last, competition for Google and Yahoo. Or is it? It turns out that most (not quite all) of the search gain was thanks to the Live Search Club, an online word game which links to Live Search. Remove its 3 million hits, and the gain is just 0.3%.

It gets worse. The Live Search Club lets you win points by completing games, and then exchange your points for prizes such as a Zune or Windows Vista. Very nice. But some dastardly individuals devised bots that complete the games for you. Result: prizes to sell on eBay. A low trick.

Personally I’m not chuffed with Live Search Club. I completed a game of Chicktionary without using a bot, won 20 points, but when I tried to register, the site had gone offline. Drat. Still, perhaps Microsoft is coming up with some anti-bot measures.

It strikes me that Microsoft is being a little naive here. On the other hand, here I am writing about Live Search. So as a PR effort, I guess it’s working.

Performance expert becomes Visual Studio Chief Architect

Microsoft’s performance specialist Rico Mariani is to be Chief Architect of Visual Studio.

Mariani has earned huge respect for his detailed blog posts on performance issues in .NET. He’s recently posted some fascinating figures on LINQ to SQL performance. From a technical point of view, it looks like Visual Studio architecture is in good hands.

Perhaps this also indicates that Microsoft is giving higher priority to performance. That’s needed. Most of my gripes about Windows Vista are performance related. Take the new Event Viewer as an example, simply because I happened to use it this morning. It takes 20 seconds to open on my system, during which time it displays “reading log” messages. This never happened with the old event viewer, which opens without any delay. The new one is much prettier, but at what cost? These small delays, repeated n times a day, consume a huge amount of expensive admin time.

That said, it’s puzzling to find a performance guy in charge of architecture. Still, Visual Studio is the first link in a chain that leads eventually to Windows, Office, and most third-party Windows apps. More speed everywhere, please.

Audio in Vista: more hell than heaven

Here is a contradiction. On the one hand, Vista audio is said to be much improved over audio in earlier versions of Windows. Certainly this was Microsoft’s intention. Larry Osterman’s 2005 post refers to several goals, including moving audio code out of the kernel to improve reliability, and making Windows a better platform for audio professionals. Osterman also describes the new audio API called WASAPI, which enables low latency, and provides an illustration of how it all fits together. Vista clearly has a much richer audio API than Windows XP. Here is an easy-to-understand overview, full of enthusiasm for its benefits.

Why a contradiction? Well, the actual, real-world experience of audio in Vista is mixed at best. Here is a typical post, complaining of stutters and pops in Vista audio which recall bygone days when PCs were barely up to the task. Surely playing 16-bit audio should be a breeze for today’s PCs?

I’ve had the same experience. I care about high-quality audio, so I installed a high-end Creative card, the X-Fi Elite Pro. I’ve been through all the drivers, from early betas to recent and supposedly production-ready releases. None have worked smoothly. I’ve had problems playing CDs, problems in Audacity where playback stutters or simply stops working, or a strange effect where the right and left channels go out of synch. I’ve had problems in Windows Media Player, where the responsiveness of the play, pause and stop buttons becomes sluggish, or playback fails completely.

I thought this might be primarily a problem with Creative’s drivers. There are certainly howls of anguish on the Creative forums. I also notice that if I switch to the motherboard’s integrated Realtek audio, reliability is greatly increased, though sound quality is worse. There are still occasional problems. Everyday use is fine, but a heavy editing session in Audacity causes glitches.

I decided to go pro. I removed the X-Fi, purchased a Terratec Phase 22, aimed at the pro market, and attached an external DAC. I chose the Terratec because it is a no-frills affair and has a Vista driver, unlike many of the pro audio cards out there. Happy now?

Well, no. The Phase 22 works OK using its internal DAC, but I’m having problems with the SPDIF digital output. If I direct audio specifically to this output, by making it the default device, or selecting it in the preferences of an app like Audacity, it does not work. I can sometimes get it to work temporarily using the Phase 22 control panel, but it fails again as soon as I stop and restart playback. If I direct output to the Phase 22 internal DAC, then SPDIF output works, but it is always re-sampled to 48 kHz. Ideally I want bit-perfect output to the external DAC. For example, I’ve got a 96 kHz FLAC file. If I play this in Vista, it is output at 48 kHz.

In Windows XP, by contrast, it works perfectly. Ripped CDs are output at 44.1 kHz, and my 96 kHz FLAC file is output at 96 kHz.
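As a quick check on where the 48 kHz comes from, here is a minimal C++ sketch – my own illustration written against the documented WASAPI interfaces, not code from Terratec or Microsoft – that asks the default playback device what format the Vista audio engine itself is running at. Error handling is omitted for brevity.

#include <windows.h>
#include <mmdeviceapi.h>
#include <audioclient.h>
#include <stdio.h>

int main() {
    CoInitializeEx(NULL, COINIT_MULTITHREADED);

    // Find the default playback endpoint via the new MMDevice API
    IMMDeviceEnumerator *enumerator = NULL;
    CoCreateInstance(__uuidof(MMDeviceEnumerator), NULL, CLSCTX_ALL,
                     __uuidof(IMMDeviceEnumerator), (void**)&enumerator);
    IMMDevice *device = NULL;
    enumerator->GetDefaultAudioEndpoint(eRender, eConsole, &device);

    // Activate an audio client and ask for the engine's mix format
    IAudioClient *client = NULL;
    device->Activate(__uuidof(IAudioClient), CLSCTX_ALL, NULL, (void**)&client);
    WAVEFORMATEX *mix = NULL;
    client->GetMixFormat(&mix);

    // Every shared-mode stream is converted to this one format before mixing,
    // whatever the sample rate of the file being played
    printf("Engine mix format: %lu Hz, %u-bit, %u channel(s)\n",
           mix->nSamplesPerSec, mix->wBitsPerSample, mix->nChannels);

    CoTaskMemFree(mix);
    client->Release();
    device->Release();
    enumerator->Release();
    CoUninitialize();
    return 0;
}

If this reports 48 kHz – it returns whatever default format is set for the device in the Sound control panel – then anything played through the normal shared path will be converted to 48 kHz on the way out, 96 kHz FLAC or not.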

I also have problems with Steinberg’s Cubase SX. This works well in XP with the Phase 22, or with the internal card on Vista, but it does not work with the Phase 22 in Vista (I’ve not spent a lot of time trying to troubleshoot this). I called Terratec support. The guy didn’t bother trying to analyze the problem; he just said to wait for a new driver.

Digging a little deeper

Maybe some of these problems are specific to my machine or the way it is configured. Maybe, and I look forward to your tips. But here are a few observations.

Pro audio vendors are very late with Vista drivers. I noticed this when looking for a replacement for the X-Fi. M-Audio, for example, has only patchy support, and some drivers are still in beta. E-Mu, Creative’s pro range, is still on beta drivers. Bear in mind that Vista was released to manufacturing in November 2006, and that there were plenty of pre-releases.

Vista drivers, where available, may not be full-featured. Creative is a case in point. Its Vista drivers do not support decoding of Dolby Digital and DTS, DVD-Audio, 6.1 speaker mode, or DirectSound-based EAX effects.

General advice in the Pro community seems to be: stick with XP for the moment. I don’t see many posts from musicians raving about how much better Vista is for their work. I see plenty of posts about problems with audio in Vista.

What’s gone wrong? I don’t have a definitive answer, but can speculate a little. What we do know is that audio in Vista, and multimedia in general, is greatly changed. The links I gave above are just overviews. For a real drill-down, try the lengthy audio processing in Vista thread on the AVSForum, along with Creative’s explanation of audio in Vista. Note that a number of older APIs are now emulated on top of the new WASAPI. Emulation, as everyone knows, often means slow. Note also the two modes in Vista audio: shared and exclusive. As I understand it, in shared mode, Windows will always munge the audio at least a little. In exclusive mode this won’t happen, but according to this post, writing exclusive-mode drivers is exceedingly complex.
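To make the shared/exclusive distinction concrete, here is a short continuation of the earlier C++ sketch – again just an illustration of the documented IAudioClient calls, not production code – showing how an application asks for exclusive-mode access, the only route that keeps the engine’s mixer and sample-rate converter out of the signal path.

// Continuing from the earlier sketch: 'device' is the IMMDevice* for the
// default playback endpoint. Error handling is again omitted.
IAudioClient *client = NULL;
device->Activate(__uuidof(IAudioClient), CLSCTX_ALL, NULL, (void**)&client);

// Describe the stream we want: 96 kHz, 16-bit stereo PCM. Real code would
// typically use WAVE_FORMAT_EXTENSIBLE, especially for 24-bit formats.
WAVEFORMATEX fmt = {0};
fmt.wFormatTag      = WAVE_FORMAT_PCM;
fmt.nChannels       = 2;
fmt.nSamplesPerSec  = 96000;
fmt.wBitsPerSample  = 16;
fmt.nBlockAlign     = fmt.nChannels * fmt.wBitsPerSample / 8;
fmt.nAvgBytesPerSec = fmt.nSamplesPerSec * fmt.nBlockAlign;

// In exclusive mode the driver either accepts this format as-is or refuses it;
// there is no mixing or resampling behind the application's back.
if (client->IsFormatSupported(AUDCLNT_SHAREMODE_EXCLUSIVE, &fmt, NULL) == S_OK) {
    REFERENCE_TIME duration = 1000000; // 100 ms in 100-nanosecond units
    client->Initialize(AUDCLNT_SHAREMODE_EXCLUSIVE, 0,
                       duration, duration, &fmt, NULL);
    // Success means the stream owns the endpoint; other applications go silent.
}
// Pass AUDCLNT_SHAREMODE_SHARED instead and Initialize will take almost any
// PCM format, but everything is converted to the engine's mix format on the way.

That, presumably, is what a bit-perfect player has to do on Vista; and it hints at why this is hard to get right, since in exclusive mode the application and the driver must agree on formats, buffer sizes and timing with no safety net from the system mixer.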

There’s also DRM to think about. Is the notorious protected media path getting in the way of faithful audio reproduction on Vista? Personally I doubt it, but it could be a factor.

Speculations

The bottom line is that Vista audio should be great, but in practice it is problematic for many users. Why? Here are a few possibilities.

1. Vista audio is great, but third-party vendors are a lazy bunch and haven’t bothered to do decent drivers. This is the view of many on the Creative forums, but I don’t buy this entirely. The failure to provide good drivers in a timely manner seems to go right across the industry. I am sure some vendors could have done better but I’m inclined to think there are other factors, such as perhaps…

2. Vista audio is so complex and different that third parties had no chance of writing good drivers in time. This seems at least plausible. I still find it curious. I don’t doubt that the leading vendors of audio add-ons worked closely with Microsoft in the run-up to Vista. Why then is support for the new operating system so limited and late?

3. Microsoft slipped up; audio in Vista does not work properly. It will certainly be interesting to see what effect Vista’s service pack 1 has, when it arrives later this year.

No conclusion

A year from now, we might all be saying Vista’s audio is fantastic. That will be after Vista SP1 and another year of driver development. Alternatively, we may know more clearly why it does not deliver. In the meantime, my own view is that Vista audio is more hell than heaven.


Microsoft sets launch day for Visual Studio 2008, SQL Server 2008, Windows Server 2008

According to a press release just received, Microsoft has set 27 February 2008 for the “global launch” of its 2008 server and developer products:

Today at the Microsoft Worldwide Partner Conference, COO Kevin Turner announced that the company will jointly launch Windows Server 2008, Visual Studio 2008 and SQL Server 2008 in Los Angeles on 27 February, 2008. The event will kick off a “launch wave” of hundreds of events that Microsoft will host worldwide including training, virtual events and extensive online resources.

Windows Server 2008 has the IIS 7.0 web server, the PowerShell command line, and “Server Core”, which lets you install a server without any GUI components. Funny how Windows is getting more like Unix.

SQL Server 2008 has a new FileStream data type (a better blob), spatial and location data types, integrated full-text search, and a bunch of scalability and management improvements.

Visual Studio 2008 is the LINQ (Language Integrated Query) and WPF (Windows Presentation Foundation) release. WPF is already out there, but this release adds full design-time support. There is also ASP.NET AJAX. Visual Studio 2008 goes hand-in-hand with C# 3.0 and VB 9.0. The underlying CLR (Common Language Runtime) is still essentially 2.0, the same as for Visual Studio 2005.

Of course there are a zillion other new features, but I’ve picked out a few highlights.

Will this change our lives? LINQ is exciting, and so is WPF if anyone actually starts to use it, but of course we’ve known about these things for a while. Microsoft’s release cycle for new technology – from first announcement to full release – seems to stretch out for ages. Otherwise, this feels more like consolidation than any sort of new direction.

Fixing the Xbox 360

Microsoft says it will give a retrospective three-year warranty to all owners of Xbox 360 consoles. Here’s a snippet from the press release:

As a result of what Microsoft views as an unacceptable number of repairs to Xbox 360 consoles, the company conducted extensive investigations into potential sources of general hardware failures.  Having identified a number of factors which can cause general hardware failures indicated by three red flashing lights on the console, Microsoft has made improvements to the console and is enhancing its Xbox 360 warranty policy for existing and new customers.

While the whole world knows that the 360 is unreliable, this is perhaps Microsoft’s first public confession. An extended warranty is good; but prospective purchasers may be even more interested in the “improvements to the console” mentioned above. Has Microsoft really found a fix for the design fault(s) that cause the problem?

Another unanswered question concerns the DRM which causes problems for users who return consoles for repair and get back a refurbished unit that used to belong to someone else. This is a common practice in the IT industry, and normally it makes good sense, because you get a replacement quicker. Unfortunately it is a flawed plan with respect to the 360, because purchased downloads are tied to the machine on which they were downloaded. See this thread for the gory details, lots of unhappy customers, and Microsoft’s inconsistent response.

You would think that someone at Microsoft would have realised even before the launch that this was a likely scenario. Of course it is made worse by the high number of returned machines. Surely Microsoft can work out some way to allow customers to re-download the games they own, fully unlocked, to a new machine. Currently the mechanism seems to be: argue with customer service until you get your Microsoft Points refunded, then re-purchase the games. That is a disappointingly crude mechanism. 

Here’s another thing that puzzles me. Let’s presume that the Xbox 360 has a design fault, to do with overheating, that makes premature failure likely. Reasonable, I think. So how long ago did this become apparent to Microsoft? I’d have thought it would be well over a year ago. I recall users complaining about repeated red light incidents in early 2006. Why then did Microsoft continue turning the handle and manufacturing machines with the same flaw for so long?

Still, users will be grateful that Microsoft has had the decency and the resources to admit to the problem and fix at least the hardware side of it for free.

Sun’s ODF converter

I had a quick look at Sun’s Open Document Format (ODF) converter for Microsoft Office.

This is aimed at users of Microsoft Office who want to open and save documents in ODF, an XML document format also used by Open Office and standardised by ISO. Why would you want to do this? The main reasons would be if you worked in an organization that mandates ODF as a standard, or if you need to exchange documents with other organizations which use ODF as standard.

There is a curious twist here. On the face of it, one of the reasons you would send documents in ODF rather than, say, DOC or XLS, is to make it easier for users of Open Office to read your documents. However, this doesn’t necessarily apply, since this ODF converter is actually the same code that Open Office uses to convert to and from Microsoft Office formats, so the recipient is really no better off. The new Office Open XML formats are a different matter but … the converter does not yet support Office 2007 (find out why here), or even Vista, according to the readme.

How good is the converter? On my quick test, pretty good, which I’d expect, given that Open Office is also pretty good in respect of Microsoft Office compatibility. It does not convert macros, but that’s not usually a problem since you rarely want to distribute documents containing macros. I managed to trip it up on one feature – there are probably others, but this is one that I found quickly. I tested the converter on my old Tablet notebook, since this still has Office 2003 installed. I created a new document and added an ink comment – a handwritten annotation written with a Tablet pen. Saved the document to ODT, reopened it, no comment. Not exactly a showstopper, but it illustrates the point that there are compromises if you choose to standardise on a non-native format.

Using the ODF converter in Word is more pleasant than with Microsoft’s ODF converter. It installs as an import/export filter, so that you can simply use Save As. You can even set Word to use it as default. There are also converters for Excel and PowerPoint, but these are not so deeply integrated.

Now for a few gripes. First, why is the download not digitally signed? These days that looks unprofessional, especially coming from a major software vendor.

Second, the converter installs itself in the system tray. What is a document converter doing in the system tray? I think this is ridiculous clutter. The only reason I can think of is to enable automatic or semi-automatic update; but I’d have thought this could be done on starting the add-in.

Third, the dialogs. Save a document as ODT and Word displays a warning.

The readme says of this dialog:

This warning unfortunately cannot be disabled and should be ignored.

I agree it is annoying, but should it be ignored, given that it might in fact be true? If I’d been foolish enough to add lots of ink comments to a document, saved it as ODT, and ignored the dialog as Sun advises, I would be upset to have lost my work.

In fact, the real irritation of this dialog is that you do not know: everything might be fine, or it might not.

Next, you get another dialog: Word’s own security prompt about using an unsigned converter.

Sun’s readme says that this can be disabled, provided you have Word 2003 or XP, by setting a registry key. We are referred to KB 837011. But what does this KB say?

Word 2003 prompts you with the error messages that are mentioned in the “Symptoms” section when it is using file converters that have not been digitally signed. The prompt is expected and it is typical.

Right, so the real problem here is that Sun, again, hasn’t digitally signed its converter. If you read the KB article carefully, Microsoft does not recommend that you disable the warning. It is only intended for use if you have compatibility issues with legacy filters. Not, surely, something just released by Sun Microsystems.

Nevertheless, this is a decent converter. Would I use it? It will be handy for occasional import and export, but I would be most reluctant to use it by default on all my documents. If you are using Microsoft Office, it makes sense to use the Office native formats. If I’m sending a document and need the widest possible compatibility, RTF is good. At this point even .DOC and .XLS are probably more widely compatible than ODF, since they have been de facto standards for so long.
