Category Archives: microsoft

450 fixes in Office 2007 service pack 1

Microsoft has released Office 2007 service pack 1. But what does it fix? If you go to this page you can download a spreadsheet which lists around 450 fixes. The list is a little misleading, since many of the fixes reference pre-existing knowledgebase articles, which I reckon means you may already have them. SP1 is still worth it (presuming it works OK) – there are plenty of other issues mentioned.

Of course I went straight to the Outlook 2007 section, as this is the app I have real problems with. This one will be interesting to some readers of this blog:

  • POP3 sync is sometimes slow.  An issue that contributed to this issue was fixed in SP1.

I believe I have noticed this one too:

  • A large number of items may fail to be indexed.

As to whether Outlook 2007 will perform noticeably better after SP1, I am sceptical but will let you know.

As it happens, the top four search keywords for visitors to this blog who come via search engines, for this month, are as follows:

  1. 2007
  2. outlook
  3. vista
  4. slow

It is similar most months. Hmmm, seems there may be a pattern there.

Wired votes for Zune over iPod

Wired Magazine, home of Cult of Mac, has declared the Zune 2 a better buy than the iPod Classic.

This may prove any number of things. One possibility is that Microsoft has a winner. After all, that is the company’s modus operandi. Windows 1.0, rubbish. Windows 3.0, world-beating.

Then again, perhaps articles with unexpected conclusions just get more links. Like this one.

Not that I care – there is no Zune for the UK.


Live Workspace: can someone explain the offline story?

I showed the Asus Eee PC to a friend the other day. She liked it, but won’t be buying. Why? It doesn’t run Microsoft Office (yet – an official Windows version is planned).

It reminded me how important Office is to Microsoft. No wonder it is fighting so hard in the ODF vs OOXML standards war.

Therefore, if anything can boost Microsoft’s Web 2.0 credentials (and market share), it has to be Office. I’ve not yet been able to try out Office Live Workspace, but it strikes me that Microsoft is doing at least some of the right things. As I understand it, you get seamless integration between Office and web storage, plus some extras like document sharing and real-time collaboration.

I still have a question though, which inevitably is not answered in the FAQ. What’s the offline story? In fact, what happens when you are working on a document at the airport, your wi-fi pass expires, and you hit Save? Maybe a beta tester can answer this. Does Word or Excel prompt for a local copy instead? And if you save such a copy, how do you sync up the changes later?

If there’s a good answer, then this is the kind of thing I might use myself. If there is no good answer, I’ll stick with Subversion. Personally I want both the convenience of online storage and the comfort of local copies, with no-fuss synch between the two.

That said, I may be the only one concerned about this. When I Googled for Live Workspace Offline, the top hit was my own earlier post on the subject.

Microsoft Volta: magic, but is it useful magic?

Microsoft has released an experimental preview of Volta, a new development product with some unusual characteristics:

1. You write your application for a single machine, then split it into multiple tiers with a few declarations:

Volta automatically creates communication, serialization, and remoting code. Developers simply write custom attributes on classes or methods to tell Volta the tier on which to run them.

2. Volta seamlessly translates .NET byte code (MSIL) to Javascript, on an as-needed basis, to achieve cross-platform capability:

When no version of the CLR is available on the client, Volta may translate MSIL into semantically-equivalent Javascript code that can be executed in the browser. In effect, Volta offers a best-effort experience in the browser without any changes to the application.
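The attribute-driven tier splitting described above can be loosely sketched with Python decorators. This is purely an analogy: Volta itself works on .NET custom attributes and MSIL, and every name below (`run_at`, the tier labels, the example functions) is invented for illustration, not taken from Volta’s actual API.

```python
# Hypothetical analogy to Volta's declarative tier splitting, using Python
# decorators in place of .NET custom attributes. All names are invented.

def run_at(tier):
    """Tag a function with the tier it should execute on."""
    def decorate(fn):
        fn.tier = tier
        return fn
    return decorate

@run_at("server")
def validate_order(order):
    # Imagine this needs database access, so it belongs on the server tier.
    return bool(order.get("items"))

@run_at("client")
def render_status(ok):
    # Pure presentation logic can stay on the client tier.
    return "Order accepted" if ok else "Order rejected"

def dispatch(fn, *args):
    # A real tier-splitter would generate the remoting and serialization
    # code here; this stub merely records where the call would be routed.
    return fn.tier, fn(*args)

tier, ok = dispatch(validate_order, {"items": ["book"]})
```

The point of the sketch is the shape of the programming model: the developer writes ordinary single-machine calls, and only the declarations decide which tier each piece runs on, with the plumbing generated by the compiler.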

The reasoning behind this is that single-machine applications are easier to write. Therefore, if the compiler can handle the tough job of distributing an application over multiple tiers, it makes the developer’s job easier. Further, if you can move processing between tiers with just a few declarations, then you can easily explore different scenarios.

Since the input to Volta is MSIL, you can work in Visual Studio using any .NET language.

Visionary breakthrough, or madness? Personally I’m sceptical, though I have had a head start, since this sounds very like what I discussed with Erik Meijer earlier this year, when it was called LINQ 2.0:

Meijer’s idea is programmers should be able to code for the easiest case, which is an application running directly on the client, and be able to transmute it into a cross-platform, multi-tier application with very little change to the code.

What are my reservations? It seems hit-and-miss, not knowing whether your app will be executed by the CLR or as Javascript; while leaving it to a compiler to decide how to make an application multi-tier, bearing in mind issues like state management and optimising data flow, sounds like a recipe for inefficiency and strange bugs.

It seems Microsoft is not sure about it either:

Volta is an experiment that enables Microsoft to explore new ways of developing distributed applications and to continue to innovate in this new generation of software+services. It is not currently a goal to fit Volta into a larger product roadmap. Instead, we want feedback from the community of partners and customers to influence other Live Labs technologies and concepts.

Vista vs XP performance: some informal tests

After posting about the inadequacy of a recent test report I thought it would be interesting to conduct my own informal tests of Vista vs XP performance. I do not run a computer laboratory, but I guess my tests have the benefit of being real-world.

I tested several conditions on three computers. On two of them I was able to test XP 32-bit vs Vista 32-bit. I tried various combinations of Aero on or off, visual effects on or off, and UAC (User Account Control) on or off. I also tried setting Vista to run only basic services, using Msconfig.

The test suite I used was PassMark Performance Test 6.1. You can use this free for 30 days, so anyone can repeat the tests.

Caveat: perfect tests are hard to achieve. A big problem with Vista is that it has all sorts of background services that are meant to maintain your system, perform backups, check for updates, and so on. Further, if you have third-party software installed (and who doesn’t?) then it may also be running background tasks. The main compensation I applied was to wait a few minutes after start-up for disk activity to pretty much cease. I also switched off anti-virus software, and closed the Vista sidebar.

Here are a few highlights. First, XP was fastest on the two dual-boot machines, and that was without any effort optimizing XP for performance. In both cases, the best Vista performance was about 15.5% slower than XP, based on the PassMark performance index. On one machine the worst Vista performance was 28% slower, and on the other 23% slower.

Graphics 2D – Ouch!

The most interesting result was the poor performance of the Graphics 2D tests on Vista. A series of PassMark tests cover drawing lines, rectangles, shapes, font rendering, and common GUI operations like scrolling listboxes, moving windows or filling progress bars. On both dual-boot machines, the best Vista performance was an amazing 70% slower than XP. Put another way, XP was 3 times faster. Since these are common operations, that is worrying.
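The jump from “70% slower” to “3 times faster” is just a reciprocal, and is worth making explicit. Assuming the percentage refers to the PassMark index score (where higher means faster), a quick sketch of the conversion:

```python
# Convert "N percent slower" on a benchmark index (higher = faster) into a
# speed ratio. A score 70% lower leaves 0.3x of the reference score, so the
# reference machine is 1 / 0.3, roughly 3.3x, faster - "3 times" rounded.

def speed_ratio(percent_slower):
    remaining = 1 - percent_slower / 100  # fraction of the faster score kept
    return 1 / remaining

ratio = speed_ratio(70)
```

The same formula explains why percentage gaps are easy to understate: a 50% lower score is only 2x slower, but 70% lower is already more than 3x.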

That said, there are some specific reasons for this poor performance. See here for example:

The Windows Vista graphics system is designed to support a broad range of hardware and usage scenarios to enable new technology while continuing to support existing systems. Existing graphics interfaces, such as GDI, GDI+, and older versions of Direct3D, continue to work on Windows Vista, but are internally remapped where possible. This means that the majority of existing Windows applications will continue to work.

Hmm, re-mapping sounds slow. And here:

GDI primitives like LineTo and Rectangle are now rendered in software rather than video hardware, which greatly simplifies the display drivers.  We don’t think this will impact many real-world applications (usually when a GDI application is render bound its because it’s doing something like gradients that was never hardware accelerated), but if you do see problems please let us know.

See also Greg Schechter’s notes here on GDI-rendered windows.

In a nutshell, Vista is optimized for DirectX rendering at the expense of GDI; yet, as Schechter notes:

Today and for the near future, most applications use and will continue to use GDI to render their content.

This I suspect is a large part of the reason why some tests report that XP is twice as fast as Vista. Automating Office could well hit this slow GDI issue. Note that it is not all 2D graphics that are slow – CustomPC got less alarming results for its 2D graphics tests, which by the looks of it are not GDI-based.

Schechter does not cover the scenario where the DWM (Desktop Window Manager) is turned off, but it looks as if some of the factors which make GDI slow still apply.

What’s most effective at speeding up Vista?

I did some experiments where I compared Vista Basic with Vista Aero, or looked at what happened when UAC is switched off. I got inconsistent results. On an older machine, I found that disabling Aero made a significant difference, maybe an 8% speed-up. On another, more recent machine, Aero was actually faster. So much changes when you use DWM that I guess this is to be expected.

UAC? Not a huge difference in these tests – around 2%.

The biggest influence, on the basis of my imperfect testing, came when using Msconfig to switch off all but basic services and start-up programs. This improved performance by around 10% overall.

In most other tests there were modest differences between Vista and XP. This includes 3D graphics, where Vista actually scored higher than XP on one machine, and CPU, where on one machine there was less than 2.5% difference between best and worst. Vista does come out significantly slower on the PassMark memory test suite, from just over 8% worse to over 20% worse.

Conclusion? First, my informal tests suggest that XP is faster than Vista, but not normally twice as fast. Second, an application written to use DirectX rather than GDI should perform better, other things being equal. WPF (Windows Presentation Foundation) uses DirectX, but unfortunately has its own performance overhead.

If this analysis is right, then Vista is at its worst when rendering the GUI in traditional Windows applications. That will make them feel less snappy, but would not impact the non-visual aspects, such as recalculating a spreadsheet.

Then again, I only tried one test suite, so please don’t take the above too seriously.

I’d be interested in further informed comments.


ParallelFX: concurrency library for .NET

A few links:

Announcement of CTP on Somasegar’s blog

MSDN article on PLINQ

MSDN article on Task Parallel Library

Joe Duffy – concurrent programming blog

Interesting note in Duffy’s blog about PDC 07 (the one that never was):

Note: some might wonder why we released the articles before the CTP was actually online.  When we originally put the articles in the magazine’s pipeline, our intent was that they would in fact line up.  And both were meant to align with PDC’07.  But when PDC was canceled, we also delayed our CTP so that we had more time to make progress on things that would have otherwise been cut.

Concurrency is a big deal; it’s good to see more library support.


Performance testing Vista: misleading reporting of inadequate tests.

Is Vista really half the speed of XP? That is what CNET reports here, in an article which does not bother to link to the source of the tests, as far as I can see.

The tests seem to be based mainly on scripting Office through OLE automation. This is certainly interesting, but it would be more helpful to have a performance breakdown, looking at various aspects of OS performance. Otherwise, how do we know whether this is more to do with, say, OLE performance than Vista performance?

Second, Vista has richer graphics than XP. Think of it like a game which offers different graphics options to suit your hardware. Typically, you can set varying levels of background detail, shadow effects, etc, in order to find the right compromise between appearance and performance. Vista sets this higher than XP by default, so you would expect screen operations to be slower.

Third, Vista’s UAC security imposes a significant performance overhead. Again you would expect that, since it applies additional checks to numerous operations.

You could conduct a more useful test by configuring Vista to work as much as possible like XP. Turn off UAC, turn off Aero, turn off the indexer, turn off visual effects. I would also suggest checking what is running in the background and matching it as closely as possible on the two machines. I would be surprised if the performance difference is so striking after doing these things.

Perhaps you will argue that this is not the real-world experience. In practice, users take the machine as supplied and start using it; they do not open Control Panel and look at Performance Information and Tools, or get rid of all that third-party foistware. It’s a fair point, though on a corporate network admins can set these things on the user’s behalf.

When it comes to real-world performance, I also have gripes about Vista. However, I have little problem with Vista’s overall performance, say when working in Office, aside from Outlook 2007 which is a disgrace. What bugs me is unexpected delays. For example, I click the Start button, then All Programs, then scroll down to a program group and click. Vista just stops. I tried it just now, and it took 12 seconds to open the group I chose. I’ve had to train myself not to keep clicking. What’s going on there?

There are real issues here, but not helped by misleading reporting of inadequate tests.

.NET Framework for Symbian

Red Five Labs has announced its implementation of the .NET Compact Framework for Symbian, the operating system used (most of the time) by Nokia.

It is currently in beta, and implements only version 1.0 of the Compact Framework, but nevertheless it is an intriguing development.

With this, Mono, Moonlight, and Microsoft’s own implementation of Silverlight 1.1 for Mac OS, .NET is getting more interesting for cross-platform development.

.NET history: Smack as well as Cool

Microsoft’s Jason Zander comments on my piece on the early history of ASP.NET:

  • The CLR was actually built out of the COM+ team as an incubation starting in late 1996.  At first we called it the “Component Object Runtime” or COR.  That’s why several of the unmanaged DLL methods and environment variables in the CLR start with the Cor prefix.
  • There were several language projects underway at the start.  The C++ and languages teams both had ideas (Cool was one of them), and in the CLR we wrote Simple Managed C or SMC (pronounced ‘smack’).  We actually wrote the original BCL in SMC.

He says these are corrections though they seem more like supplementary information to me. I don’t have any inside knowledge of this history other than what people who should know say to me (though I do also have my own recollections of what was said publicly). He may be reacting to the idea that the CLR came out of the VB team, which Mark Anders kind-of implied.

One of the reasons I love blogging is that multiple authors can have a crack at getting the facts right. A great personal example is when I asked the question Who invented the wizard; and a good candidate came forward over a year later. If you see something inaccurate or misleadingly incomplete on this site, please do comment or let me know by email.


Microsoft vs Mozilla Javascript wars

My comment is here.

Note this debate is not only about the merits of different versions of Javascript/ECMAScript. It is also about power and responsibility. However you spin it, and however far Adobe and/or Microsoft succeed with Flash/Silverlight/AIR, I think we can agree that the browser has an important role for the foreseeable future. It is also likely (though less certain, I guess) that Internet Explorer will continue to have a large market share. The company does have a responsibility not to hold back the Web, and that surely includes not obstructing the evolution of a high-performance Javascript runtime.

It is disappointing that Microsoft says so little about IE8, presuming it exists. If the company sticks by its undertaking to leave no more than two years between IE releases, we should expect it no later than October 2008, less than one year away. It would help web developers to know more about what will be in it.