Unravelling the reasons for Vista audio glitches

Since Vista’s first release I’ve been puzzling over why audio in Vista is prone to glitches, when it is meant to be fundamentally better than it was in Windows XP.

I’ve posted previously on the subject:

Audio in Vista: more hell than heaven

Why does audio glitch in Vista?

Another Pro Musician gives up on Vista audio

I myself suffered from this. When I stuck a CD in my Vista PC back in November 2006, it would not play smoothly. I don’t recall ever having this problem before, even back in Windows 3.1 days.

The Guardian commissioned a piece on the subject which is published today. The research showed multiple reasons for Vista’s audio problems. It’s best to show these as a series of scenarios.

1. Consumer buying new Vista PC with on-board audio

The recommended audio driver type for Vista is called WaveRT. The architecture is better than previous driver models; you can read an official paper on the subject here. If you look at the API, you’ll notice two interfaces, IMiniportWaveRTStream and IMiniportWaveRTStreamNotification. The second interface was added at a late stage in the Vista development cycle. According to Cakewalk’s CTO Noel Borthwick, this was because the original API, which lacked this event notification, was very inefficient. Although Microsoft fixed it, the first on-board audio drivers shipped using the old, inefficient form of WaveRT. Realtek actually lists support for IMiniportWaveRTStreamNotification as one of the fixes in its 1.82 driver update, released in November 2007, a year after Vista was released to manufacturing.

The fact that the on-board audio vendors provided WaveRT drivers at all was an indication of their early support for Vista’s new driver model. Vendors of add-on audio cards didn’t get round to this until much later, or in some cases at all.

2. Consumer with Vista PC and an add-on card

Although WaveRT is the recommended driver type for Vista, older driver types are also supported. At a higher level, the new WASAPI audio API also emulates older APIs like DirectSound and MME. The quickest way to come up with Vista drivers was to use these legacy routes, so drivers for add-on cards were probably running through inefficient compatibility layers.

In both consumer scenarios, this is about applications as well as drivers. Each application chooses which Windows audio API to use, and in many cases that choice means an emulated API.
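To make the application side concrete, here is a minimal sketch of opening Vista’s native WASAPI render path in its event-driven mode, rather than going through an emulated legacy API. It is illustrative only: error handling is omitted, FillSilence is a hypothetical stand-in for real sample generation, and you would need to link against ole32.

    // Sketch: event-driven WASAPI render stream (Vista and later).
    #include <windows.h>
    #include <mmdeviceapi.h>
    #include <audioclient.h>
    #include <cstring>

    // Hypothetical stand-in: a real application would generate samples here.
    static void FillSilence(BYTE *data, UINT32 bytes) { memset(data, 0, bytes); }

    int main() {
        CoInitializeEx(NULL, COINIT_MULTITHREADED);

        IMMDeviceEnumerator *enumerator = NULL;
        CoCreateInstance(__uuidof(MMDeviceEnumerator), NULL, CLSCTX_ALL,
                         __uuidof(IMMDeviceEnumerator), (void**)&enumerator);

        IMMDevice *device = NULL;
        enumerator->GetDefaultAudioEndpoint(eRender, eConsole, &device);

        IAudioClient *client = NULL;
        device->Activate(__uuidof(IAudioClient), CLSCTX_ALL, NULL, (void**)&client);

        WAVEFORMATEX *format = NULL;
        client->GetMixFormat(&format);

        // EVENTCALLBACK requests the pull model: the engine signals an event
        // when it wants more data, instead of the application polling.
        client->Initialize(AUDCLNT_SHAREMODE_SHARED, AUDCLNT_STREAMFLAGS_EVENTCALLBACK,
                           10000000 /* 1 second buffer, in 100ns units */, 0, format, NULL);

        HANDLE wakeUp = CreateEvent(NULL, FALSE, FALSE, NULL);
        client->SetEventHandle(wakeUp);

        IAudioRenderClient *render = NULL;
        client->GetService(__uuidof(IAudioRenderClient), (void**)&render);

        UINT32 bufferFrames = 0;
        client->GetBufferSize(&bufferFrames);
        client->Start();

        for (int i = 0; i < 100; i++) {            // render a few buffers, then stop
            WaitForSingleObject(wakeUp, 2000);     // the engine says: feed me
            UINT32 padding = 0;
            client->GetCurrentPadding(&padding);   // frames still queued for playback
            UINT32 frames = bufferFrames - padding;
            BYTE *data = NULL;
            render->GetBuffer(frames, &data);
            FillSilence(data, frames * format->nBlockAlign);
            render->ReleaseBuffer(frames, 0);
        }

        client->Stop();
        // A real application would Release() the interfaces and CoTaskMemFree(format).
        CoUninitialize();
        return 0;
    }

The point to notice is the pull model: the engine signals the event when it wants more data, which is the same idea the WaveRT notification interface brings to the driver level.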

3. Pro audio user with add-on card

The situation for pro audio users is different again. On-board cards lack necessary features for pro audio. Pro audio applications have long bypassed the Windows audio stack to reduce latency, using either ASIO or WDM kernel streaming. This avoids the problems mentioned in (1) and (2) above, because ASIO and WDM kernel streaming work the same in Vista as in XP. However, even here Vista is less satisfactory than XP, because the OS imposes a greater overhead, and because according to Borthwick there are bugs which only Microsoft can fix. An example is mentioned in this interview in Create Digital Music:

Peter: Some users have reported MIDI performance issues — specifically, jitter — under Vista. How much of an issue is this? What are the factors that cause it?

Noel: Both Cakewalk and Digidesign logged this issue with Microsoft. The root cause of this problem was found to be in the WinMM.DLL and was due to an inefficient check being done on every WinMM API call. It has been addressed in Vista SP1.

The issue itself was pretty severe and impacted MIDI timing on playback and recording. As compared to XP, in Vista we observed timing discrepancies as far out as 150 ticks. You could also run into cases where MIDI events were lost while playing.

Here’s one instance where SP1 definitely improves matters; nevertheless, Borthwick told me that SP1 is not a cure-all and some other bugs remain unresolved.

Myth-busting

Some people think that Vista’s DRM is responsible for audio problems. Nobody I talked to thought that was the case, and it does not apply in the common scenarios described above.

What about moving the audio stack out of the kernel? Probably not an issue, and certainly not in the pro audio case, where things work much as before.

Fixing Vista audio

Vista audio is definitely improving. SP1, improved WaveRT drivers for on-board sound, decent drivers for add-on cards, all are happening. It probably will reach the point where it is better than XP in some circumstances, because there are genuine improvements in the audio stack. If you are reading this and get glitches, check that you really have the latest drivers and updates.

64-bit has the potential to be really good, though driver support is dire right now.

It’s still a sorry tale, and I suspect it has cost Microsoft a lot of momentum in the pro audio world, as well as among consumer users like myself who were surprised and disappointed by glitching audio.

Preventable? Ultimately I feel this is a symptom of Vista being rushed (despite the long delays), thanks to the famous reset. There’s also the question of why the WaveRT API wasn’t done right at an earlier stage, which (if the above analysis is right) could have saved much grief. Finally, it seems that the emulation layers are just too inefficient.


How to speed up Windows Vista: official and unofficial tips

Microsoft has published an article on speeding up Vista, aimed at general users.

It’s not too bad. Here’s the summary:

  • Delete programs you never use
  • Limit how many programs load at startup
  • Defragment your hard drive
  • Clean up your hard disk
  • Run fewer programs at the same time
  • Turn off visual effects
  • Restart regularly
  • Add more memory
  • Check for viruses and spyware
  • Disable services you don’t need

Still, it’s a bit scattergun. I prefer a two-stage approach to improving performance (the same applies to tuning a single application):

  1. Find out what is slow
  2. Speed it up, or leave it out

For example, the benefits of adding memory tail off after a certain point. Task Manager will tell you to what extent RAM is slowing down Vista. Further, adding memory beyond 3GB is pretty much wasted on 32-bit Vista, since the system can only address 4GB, and the BIOS plus devices will use a lot of the 4th GB address space. That said, a system that is critically short of RAM (in other words, constantly swapping out memory to the hard drive) is in my opinion broken and unusable. Adding RAM in such cases delivers huge rewards.
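If you would rather check programmatically than squint at Task Manager, a small sketch along these lines, using the Win32 GlobalMemoryStatusEx call, reports the current memory load and physical RAM figures:

    // Sketch: report memory load, roughly what Task Manager's meter shows.
    #include <windows.h>
    #include <cstdio>

    int main() {
        MEMORYSTATUSEX ms = {};
        ms.dwLength = sizeof(ms);
        if (GlobalMemoryStatusEx(&ms)) {
            printf("Memory load: %lu%%\n", ms.dwMemoryLoad);
            printf("Physical RAM: %llu MB total, %llu MB free\n",
                   (unsigned long long)(ms.ullTotalPhys / (1024 * 1024)),
                   (unsigned long long)(ms.ullAvailPhys / (1024 * 1024)));
        }
        return 0;
    }

A machine that sits near 100% memory load under a normal workload is a good candidate for more RAM; one idling at 40% is not.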

Uninstalling programs gives little performance benefit if they are not running (unless disk space is limited). The aim is to reduce the number of running processes, not entries in the Start menu.
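Task Manager shows the process list, but the same information is available programmatically; here is a rough sketch using the Toolhelp snapshot API:

    // Sketch: count the running processes, the number that actually matters.
    #include <windows.h>
    #include <tlhelp32.h>
    #include <cstdio>

    int main() {
        HANDLE snap = CreateToolhelp32Snapshot(TH32CS_SNAPPROCESS, 0);
        if (snap == INVALID_HANDLE_VALUE) return 1;
        PROCESSENTRY32W pe = {};
        pe.dwSize = sizeof(pe);
        int count = 0;
        if (Process32FirstW(snap, &pe)) {
            do {
                wprintf(L"%6lu  %s\n", pe.th32ProcessID, pe.szExeFile);
                ++count;
            } while (Process32NextW(snap, &pe));
        }
        CloseHandle(snap);
        wprintf(L"%d processes running\n", count);
        return 0;
    }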

Vista defragments your drive regularly, by default. The benefits are often rather small, so it would be equally valid to suggest removing it from the schedule, or scheduling it to run less frequently.

The advice to restart regularly needs examination. Yes, a reboot can fix a sluggish machine. But it shouldn’t be necessary, and I recall that keeping Vista always-on was intended to be a benefit of the OS. Yes, here’s a quote from Power Management in Windows Vista [ppt]:

  • Windows Vista promotes the use of sleep as the default off state

In the right circumstances, Vista can run for ages without any problem. I’ve had Media Center (Vista Ultimate) run for several months without any issues, though this kind of thing is not very green, so that’s another reason to switch off regularly. Still, to my mind “restart regularly” is a symptom of some problem that should be fixed.

Turning off visual effects is reasonable advice, though once again it may not yield much benefit. I tried it on my system and was surprised how little difference it made. Reason: I am running with Aero and a decent-ish graphics card, and hardware acceleration seems to handle the visual effects rather easily. Once again, if it’s not the thing slowing you down, then removing it won’t speed you up. You can test this quite simply, though it is tedious. Try it both ways. Did it make a difference? Measure it if possible.
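If you do want numbers rather than impressions, something like this crude QueryPerformanceCounter stopwatch will do for before-and-after comparisons; the Sleep call is just a placeholder for whatever operation you are timing:

    // Sketch: a crude stopwatch for before-and-after comparisons.
    #include <windows.h>
    #include <cstdio>

    int main() {
        LARGE_INTEGER freq, start, stop;
        QueryPerformanceFrequency(&freq);
        QueryPerformanceCounter(&start);

        Sleep(100);  // placeholder: put the operation you are timing here

        QueryPerformanceCounter(&stop);
        double ms = (stop.QuadPart - start.QuadPart) * 1000.0 / freq.QuadPart;
        printf("Elapsed: %.2f ms\n", ms);
        return 0;
    }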

It really is worth using the built-in tools, like Task Manager and the Reliability and Performance Monitor, to see which processes are grabbing lots of RAM and CPU. One of the good things about Vista is that such tools are easy to find. Click Start, type “reliability”, and click the link.

I’d also like to see mention of some favourite candidates for slowing down Vista:

1. Outlook 2007

2. The indexing service

3. Anti-virus software

4. Windows Defender

Hmmm, at least three of these are from Microsoft. Perhaps they are too embarrassing to mention.

Finally, I suspect disk performance is a big factor in real-world Vista speed, because many applications are very talkative when it comes to disk access. Here’s something to try. Go along to the Sysinternals site and download Process Monitor. This gives a good picture of what the processes on your Vista box are actually up to. Note how many events are occurring, how many of them involve file I/O, and which processes are responsible. You will also discover a large part of the reason why Outlook 2007 is so slow.

PS Another article, also just published, has good coverage of swap files and ReadyBoost.

Living without the mouse – the content is the interface

I meant to link yesterday to John Lam’s post on living without the mouse.

I do so today because it fits nicely with Edward Tufte’s remark that (ideally) the content is the interface.

Lam’s point is that “toolbars just waste screen real estate”. Thus, he’s learned to operate as far as possible with keystrokes, not only because this is quicker (hand does not leave the keyboard), but also because it lets him remove on-screen furniture.

He writes primarily about Visual Studio, but also has a great tip for Office 2007 apps like Word and Excel. Press Ctrl-F1 to toggle the ribbon on or off. No ribbon means a lovely clean workspace.

Web usability has a long way to go

First thing in the morning I often browse through recent blog posts and follow links that look interesting.

I noticed a free Windows 2008 book offer from Microsoft. Might be useful background for my review, I thought – I’ll download it.

I lost count of how many slow, unresponsive pages I had to traverse before getting the book. Yes, I am persistent. I recall having to sign in with Windows Passport (to the same account) twice – once to register for the book, and a second time for something called the E-Learning center – both times clicking past registration forms that I have seen many times before and do not intend to change. The final annoyance is that you cannot right-click and download the PDF; it is a JavaScript link that opens it in the browser. In my case I’ve set Adobe Reader to open outside the browser, which helps, but it is still an irritation.

It would not be so bad if this labyrinth of links were quick to navigate, but it is not. The problem in this case does not appear to be the download of large files (the PDF actually came down quickly once I got there), but rather slow server-side code resulting in web pages that seem to hang.

Next came an irony. Via Jimmy Guterman at O’Reilly I noticed a presentation by Edward Tufte on the Apple iPhone UI. Guterman warned that it was a large QuickTime file that would take “many minutes” to download. I clicked anyway. And waited. It was better than endless link-clicking, but still a poor user experience – no download thermometer, just a web page that seemed completely unresponsive.

I agree with Guterman – the video is worth watching. Key points:

  • The content is the interface – remove “computer administrative debris” like buttons and toolbars
  • Clutter is a failure of design
  • Add detail to clarify

Nevertheless, getting to the video is a lousy experience. The key here is that progress indicators transform the user’s perception of lengthy operations. I don’t just mean a spinning hourglass or the browser’s loading thermometer – we’ve learned that these are unreliable indicators, and that we may wait forever.

Nokia acquires Trolltech

Nokia is to acquire Trolltech, makers of the popular cross-platform Qt GUI API and widget set. Qt (pronounced “cute”) is used by KDE, one of the two most widely used Linux desktops. It is also used in many cross-platform applications.
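For anyone who has not met it, Qt’s appeal is that one small C++ program compiles and runs on Windows, Mac and Linux. A minimal Qt 4 example looks something like this:

    // Sketch: a minimal Qt 4 application; the same source builds on
    // Windows, Mac OS X and Linux.
    #include <QApplication>
    #include <QPushButton>

    int main(int argc, char *argv[]) {
        QApplication app(argc, argv);         // one event loop per application
        QPushButton button("Hello from Qt");
        button.show();                        // a native-looking window on each platform
        return app.exec();
    }

Build it with qmake and the same source produces a native-looking window on each platform.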

The announcement states that Qt will continue to be open source:

We will continue to actively develop Qt and Qtopia. We also want to underline that we will continue to support the open source community by continuing to release these technologies under the GPL

Nokia says:

The acquisition of Trolltech will enable Nokia to accelerate its cross-platform software strategy for mobile devices and desktop applications, and develop its Internet services business. With Trolltech, Nokia and third party developers will be able to develop applications that work in the Internet, across Nokia’s device portfolio and on PCs. Nokia’s software strategy for devices is based on cross-platform development environments, layers of software that run across operating systems, enabling the development of applications across the Nokia device range. Examples of current cross-platform layers are Web runtime, Flash, Java and Open C.

Interesting acquisition. I have great respect for Nokia, but find its platform strategy confusing. It is the major partner in Symbian, the operating system used in its smart phones, but it also uses Linux, for example in its Internet Tablets like the N810. Nokia sponsors an open source development platform called Maemo, which uses GTK+ (the GNOME toolkit), a competitor to Qt. I also met Nokia folk at Adobe Max in Barcelona, talking up its Flash support. The Flash player is included on all high-end Nokia mobiles.

And what’s this about “applications that work in the Internet”?

It is a shame to see another smart, independent company swallowed by a giant, but there could be worse homes for Qt. That said, I’d be nervous about Qt’s support for Windows CE. And what will happen to Qtopia Phone Edition:

Qtopia Phone Edition is a comprehensive application platform and user interface for Linux-based mobile phones.

The press releases say Nokia/Trolltech will continue to invest in Qtopia. Will it become core to Nokia’s own range of devices, or be sidelined? What are the implications for Symbian? Is Nokia worried about too much mobile development going to Flash?

However you spin it, it seems that Linux is ascendant at Nokia.


SQL Server 2008 delayed until third quarter 2008

Microsoft’s Francois Ajenstat says SQL Server 2008 will not be released until Q3 2008. He calls this “roadmap clarification”, Microsoft-speak for what most of us call a delay.

It’s embarrassing for the company, since SQL Server 2008 “launches” on 27th February, less than one month from today. What will actually launch is a feature-complete CTP (Community Tech Preview). Since Q3 can (and often does) mean September, there may be over six months between the launch and the release. It is an extraordinary stretch, especially since Visual Studio 2008, which has been available since late last year, “launches” on the same day.

Still, let’s look on the bright side. First, late releases are better than buggy releases; and the SQL Server team has a good record. Second, at least Microsoft has managed to control the dependencies so that it is possible to stagger the releases without anything breaking (one presumes).

Polarisation

Slashdot takes the IBM line:

At this point nobody has the vaguest idea what OOXML will look like in February, or even whether it will be in any sort of stable condition by the end of March. While we are talking about interoperability, who else do you think is going to provide long term complete support for this already-dead OOXML format that Microsoft Office 2007 uses today? Interoperability means that other applications can process the files fully and not just products from Microsoft. I would even go so far as to go back to those few OOXML files you have already created and create .doc, .ppt, and .xls versions of them for future use, if you want to make sure you can read them and you don’t want to commit yourself to Microsoft’s products for the rest of their lives.

Alexander Falk, CEO of Altova, which makes a popular Windows XML editor:

I see the ISO vote as a non-event. In my opinion, the real-world adoption of OOXML is primarily driven by the ubiquity of Microsoft Office much more than any standards body…In terms of actual customer inquiries regarding need for ODF, we have not seen any interest from our customers…My advice to dev shops is to start working with OOXML as early as possible.


Flex briefing in London tonight

Adobe’s Serge Jespers and James Ward are in London this evening (Thursday 24th Jan 2008) and will be speaking to the Flex User Group:

19:00 General intro (Serge Jespers)
19:15 Flex Builder 3 (Serge Jespers)
19:50 Open source (James Ward)
20:00 Data services (James Ward)
20:35 Q&A (and presentation from the community)
21:00 Drinks (free beer, thanks Adobe!)

It’s a free event and there’s room for more, so I thought I’d mention it here.

Here’s the signup page.

If you go along, I’d be interested to know how you find it and what you think of Flex and AIR. Comment here or email me – tim(at)itwriting.com.


Why Internet Explorer users get the worst of the Web

Microsoft’s Chris Wilson has a post on Compatibility and IE8 which introduces yet another compatibility switch. IE8 will apparently have three modes: Quirks, Standards, and Even More Standard.

Here’s the key paragraph:

… developers of many sites had worked around many of the shortcomings or outright errors in IE6, and now expected IE7 to work just like IE6. Web developers expected us, for example, to maintain our model for how content overflows its box, even in “standards mode,” even though it didn’t follow the specification – because they’d already made their content work with our model. In many cases, these sites would have worked better if they had served IE7 the same content and stylesheets they were serving when visited with a non-IE browser, but they had “fixed their content” for IE. Sites didn’t work, and users experienced problems.

In other words, so many web pages have “If IE, do this” coded into them that pages actually break if IE behaves correctly. Alternative browsers will do a better job, even if IE is equally standards-compliant, because they do not suffer the effects of these workarounds.

Microsoft’s proposed solution is to make the supposed Standards mode a new quirks mode, this time frozen to IE7 compatibility, and to force developers to add a further meta tag to enable the better standards compliance of which IE8 is capable.

It actually goes beyond that. Aaron Gustafson explains the rationale for the new X-UA-Compatible meta tag which enables web developers to specify what browser versions their page supports. The idea seems to be that browsers parse this tag and behave accordingly.
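For illustration, the markup floated in Gustafson’s article looks something like this, though the exact syntax may yet change before IE8 ships:

    <meta http-equiv="X-UA-Compatible" content="IE=8;FF=3;OtherUA=4" />

Here the page declares that it targets IE8, Firefox 3, and version 4 of any other user agent, and each browser is meant to pick its rendering behaviour accordingly.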

This sounds uncomfortable to me. Versioning problems are inherently intractable – think of DLL Hell, or the Windows GetVersion mess – and this could get equally messy. It also imposes a substantial burden on browser developers.

Has Microsoft made the right decision? Trouble is, there is no right decision, only a least-bad decision. Personally I think it is the wrong decision, if only because it perpetuates the problem. It would be better for IE to do the correct thing by default, and to support meta tags that turn on quirks modes of various kinds, or an option in browser preferences, rather than doing the incorrect thing by default.

Still, Wilson makes a case for the decision and has some supporters. Nevertheless, he is getting a rough ride, in part because the IE team has failed to engage with the community – note for example the long silences on the IE blog. Why is Wilson telling us about this decision only now, as opposed to discussing the options more widely before it was set in stone, as I suspect it now is? Even within the Web Standards Project, some of whose members assisted Microsoft, there is tension because it appears that other members were excluded from the discussion.

Another point which I’m sure won’t go unnoticed is that Wilson makes a good case for using alternative browsers. IE users get inferior markup.


Use HTML not Flash, Silverlight or XUL, says W3C working draft

The W3C has posted its working draft for HTML 5.0. Interesting statement here:

1.1.3. Relationship to XUL, Flash, Silverlight, and other proprietary UI languages

This section is non-normative.

This specification is independent of the various proprietary UI languages that various vendors provide. As an open, vendor-neutral language, HTML provides for a solution to the same problems without the risk of vendor lock-in.

Food for thought as you embark on your Flash or Silverlight project. I would have thought XUL is less proprietary, but still.

The contentious part is this:

HTML provides for a solution to the same problems

I doubt that HTML 5.0 can quite match everything you can do in Flash or Silverlight, even with proposed additions such as the canvas and video elements, depending of course on what is meant by “a solution to the same problems”. The other issue is timing: Flash is here now, Silverlight is just about here now, and HTML 5.0 will not be finished for a while yet.
