All posts by onlyconnect

Wired votes for Zune over iPod

Wired Magazine, home of Cult of Mac, has declared the Zune 2 a better buy than the iPod Classic.

This may prove any number of things. One possibility is that Microsoft has a winner. After all, that is the company’s modus operandi. Windows 1.0, rubbish. Windows 3.0, world-beating.

Then again, perhaps articles with unexpected conclusions just get more links. Like this one.

Not that I care – there is no Zune for the UK.


Live Workspace: can someone explain the offline story?

I showed the Asus Eee PC to a friend the other day. She liked it, but won’t be buying. Why? It doesn’t run Microsoft Office (yet – an official Windows version is planned).

It reminded me how important Office is to Microsoft. No wonder it is fighting so hard in the ODF vs OOXML standards war.

Therefore, if anything can boost Microsoft’s Web 2.0 credentials (and market share), it has to be Office. I’ve not yet been able to try out Office Live Workspace, but it strikes me that Microsoft is doing at least some of the right things. As I understand it, you get seamless integration between Office and web storage, plus some extras like document sharing and real-time collaboration.

I still have a question though, which inevitably is not answered in the FAQ. What’s the offline story? In fact, what happens when you are working on a document at the airport, your wi-fi pass expires, and you hit Save? Maybe a beta tester can answer this. Does Word or Excel prompt for a local copy instead? And if you save such a copy, how do you sync up the changes later?

If there’s a good answer, then this is the kind of thing I might use myself. If there is no good answer, I’ll stick with Subversion. Personally I want both the convenience of online storage and the comfort of local copies, with no-fuss synch between the two.

That said, I may be the only one concerned about this. When I Googled for Live Workspace Offline, the top hit was my own earlier post on the subject.

Microsoft Volta: magic, but is it useful magic?

Microsoft has released an experimental preview of Volta, a new development product with some unusual characteristics:

1. You write your application for a single machine, then split it into multiple tiers with a few declarations:

Volta automatically creates communication, serialization, and remoting code. Developers simply write custom attributes on classes or methods to tell Volta the tier on which to run them.

2. Volta seamlessly translates .NET byte code (MSIL) to Javascript, on an as-needed basis, to achieve cross-platform capability:

When no version of the CLR is available on the client, Volta may translate MSIL into semantically-equivalent Javascript code that can be executed in the browser. In effect, Volta offers a best-effort experience in the browser without any changes to the application.

The reasoning behind this is that single-machine applications are easier to write. Therefore, if the compiler can handle the tough job of distributing an application over multiple tiers, it makes the developer’s job easier. Further, if you can move processing between tiers with just a few declarations, then you can easily explore different scenarios.

Since the input to Volta is MSIL, you can work in Visual Studio using any .NET language.
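Volta’s tier-splitting can be sketched as a toy analogy. The snippet below is not Volta’s API (Volta uses .NET custom attributes on classes or methods, and its compiler generates the remoting code); this is just the shape of the idea in Python, with a decorator standing in for the attribute and a registry standing in for the compiler’s splitting pass. All names here are invented for illustration.

```python
# Toy analogy, not Volta's API: mark functions with the tier they should
# run on, the way Volta marks classes or methods with custom attributes.
# A real tier-splitter would generate communication and serialization
# code; here we simply record the assignments so a "compiler" could act.
TIERS = {}

def run_at(tier):
    """Decorator standing in for a Volta-style tier declaration."""
    def mark(fn):
        TIERS[fn.__name__] = tier
        return fn
    return mark

@run_at("client")
def render_ui():
    return "rendered in the browser"

@run_at("server")
def query_orders():
    return ["order-1", "order-2"]

print(TIERS)
```

The appeal is visible even in the toy version: moving a function between tiers means changing one declaration, not rewriting remoting code.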

Visionary breakthrough, or madness? Personally I’m sceptical, though I have had a head start, since this sounds very like what I discussed with Eric Meijer earlier this year, when it was called LINQ 2.0:

Meijer’s idea is programmers should be able to code for the easiest case, which is an application running directly on the client, and be able to transmute it into a cross-platform, multi-tier application with very little change to the code.

What are my reservations? It seems hit-and-miss, not knowing whether your app will be executed by the CLR or as Javascript; while leaving it to a compiler to decide how to make an application multi-tier, bearing in mind issues like state management and optimising data flow, sounds like a recipe for inefficiency and strange bugs.

It seems Microsoft is not sure about it either:

Volta is an experiment that enables Microsoft to explore new ways of developing distributed applications and to continue to innovate in this new generation of software+services. It is not currently a goal to fit Volta into a larger product roadmap. Instead, we want feedback from the community of partners and customers to influence other Live Labs technologies and concepts.

Mono and C# on an Asus Eee PC

I am having a lot of fun with the Asus Eee PC. In its way, it is a game changer. I wondered if it would run Mono, the open source implementation of Microsoft .NET. The news is partially good:

[Screenshot: running a Mono application from the command line on the Eee PC]

Unfortunately, I’ve not been able to do much more than that so far. I tried compiling a basic forms application, but got a pkg-config error. This may be because of a kernel module called binfmt, which lets you register interpreters for different binary formats. It is normally present in Linux, but seems to be omitted from the Eee kernel. If I am right, then fixing this means figuring out how to recompile the kernel on the Eee. You can still execute Mono applications by running mono, as in the screenshot, but the compiler seems to expect binfmt to work. I am sure someone will figure this out.

Update – getting better – we have GUI:

[Screenshot: a Mono GUI application running on the Eee PC]

Still can’t use -pkg though.

Update

The problem with -pkg is easy to fix. Just install pkg-config 🙂

I’m not clear yet whether the absence of binfmt could cause other problems.

Further update

Everything is working. I can compile and run the Hello World examples here. Note that the Gtk example there does not quit properly, so I suggest you use this modified version.

To get this working, I did the following:

1. Added a xandros repository to /etc/apt/sources.list:

deb http://xnv4.xandros.com/xs2.0/upkg-srv2 etch main contrib non-free

2. Installed mono-gmcs (.NET 2.0 compiler). (I think that is the minimum but I’m not 100% sure)

3. Installed pkg-config

4. Installed gtk-sharp2

I’ve also installed jEdit for editing. It is not in the repository, so I used the jar installer from the jEdit site.

df shows 30% used, not too bad.


CodeRage sessions available for download

You can now download the content from last week’s CodeRage, the virtual developer conference laid on by CodeGear. The downloads use Camtasia and Flash and work well.

A few that I recommend are Ravi Kumar’s session on JBuilder Application Factories from Day 5, and Joe McGlynn on 3rd Rail, an IDE for Ruby on Rails, from Day 3. For Delphi futures (64-bit, generics, concurrent programming, hints about cross-compilation to other operating systems) check out Nick Hodges’ session on Day 1. I’ve not viewed everything, so there are no doubt other excellent sessions.

Nevertheless, I have mixed feelings about this CodeRage. The keynotes were weak, with too much high level waffle about how CodeGear is committed to developers etc etc. The conferencing software was no more than adequate, did not work properly for me on Vista, and did not support Mac or Linux. That may explain why attendee numbers in some sessions were embarrassingly small.

I am struggling to make sense of this. CodeGear claims to have 7.5 million registered users; yet only 2100 registered to attend the free CodeRage, and some of those no doubt never turned up. If that is representative of the level of interest in new CodeGear products, as opposed to legacy users, then that is a worrying sign.

Eee PC vs Origami UMPC: D’Oh

I loved this comment to Kevin Tofel’s post on the Asus Eee PC:

You kinda get the feeling that the Origami team is saying “D’OH” right about now.

So it should. Origami (officially known as Ultra Mobile PC) is an attempt to re-define the ultra-portable market. It uses a touch screen, no keyboard. It is typically more expensive than a traditional laptop. It usually comes with just basic Windows software.

The Asus Eee PC is a bunch of mass-market components thrown together. It is the classic clamshell design. It comes with a bundle of free software that encompasses a large percentage of what people actually do on their portable computers: word processing, spreadsheet, presentation, email, web, music playback. It throws in a webcam for good measure. It is cheaper than almost any laptop on the market.

Result: UMPC has pretty much flopped, while Eee PC is a runaway bestseller and nobody can get enough stock.

This isn’t primarily a Windows versus Linux thing. In fact, the Eee PC is set up to be Windows-friendly, and Open Office is set to save in Microsoft Office formats by default. Further, the Eee PC can run Windows XP; and most of its applications are also available for Windows. An Asus Windows XP Eee PC is planned.

To my mind, the success of Eee PC proves that the Origami team got at least one thing right. There is a market for full-function ultra-portables.

What the Origami team got wrong is, first, that ultra-portables costing more than laptops are never going to be a mass-market proposition; second, that users like keyboards and would rather have a cheaper device than a touch screen; and third, that a bundle of software covering everything you want to do is a great advantage.

As it is, Eee PC is bringing desktop Linux to the mass market. Interesting times.


Fixing Webcam black screen on an Asus Eee PC

I’m reviewing the Asus Eee PC, running Linux. I was not the first user of this machine; I guess it was reviewed by someone else and then the OS was restored before it was sent to me. No problem with that, except that the webcam did not work. If I ran the webcam app I got a black screen, though the webcam LED lit as if it were working. If I ran the diagnostic it did not detect the webcam, but said “Plug in your webcam” and “Failed”.

Here’s the fix. Reboot the machine and press F2 to go into the BIOS settings. Select the Advanced page and then OS Installation. It’s set to “Start”, right? Press Enter, and select Finished. Then F10 for Save and Exit. Webcam now works.

Seems to be a common problem for Eee PC users who have restored the OS.


Vista vs XP performance: some informal tests

After posting about the inadequacy of a recent test report I thought it would be interesting to conduct my own informal tests of Vista vs XP performance. I do not run a computer laboratory, but I guess my tests have the benefit of being real-world.

I tested several conditions on three computers. On two of them I was able to test XP 32-bit vs Vista 32-bit. I tried various combinations of Aero on or off, visual effects on or off, and UAC (User Account Control) on or off. I also tried setting Vista to run only basic services, using Msconfig.

The test suite I used was PassMark Performance Test 6.1. You can use this free for 30 days, so anyone can repeat the tests.

Caveat: perfect tests are hard to achieve. A big problem with Vista is that it has all sorts of background services that are meant to maintain your system, perform backups, check for updates, and so on. Further, if you have third-party software installed (and who doesn’t?) then it may also be running background tasks. The only compensation I applied was to wait a few minutes after start-up for disk activity to pretty much cease. I also switched off anti-virus software, and closed the Vista sidebar.

Here are a few highlights. First, XP was fastest on the two dual-boot machines, and that was without any effort optimizing XP for performance. In both cases, the best Vista performance was about 15.5% slower than XP, based on the PassMark performance index. On one machine the worst Vista performance was 28% slower, and on the other 23% slower.

Graphics 2D – Ouch!

The most interesting result was the poor performance of the Graphics 2D tests on Vista. A series of PassMark tests cover drawing lines, rectangles, shapes, font rendering, and common GUI operations like scrolling listboxes, moving windows or filling progress bars. On both dual-boot machines, the best Vista performance was an amazing 70% slower than XP. Put another way, XP was more than three times as fast. Since these are common operations, that is worrying.

That said, there are some specific reasons for this poor performance. See here for example:

The Windows Vista graphics system is designed to support a broad range of hardware and usage scenarios to enable new technology while continuing to support existing systems. Existing graphics interfaces, such as GDI, GDI+, and older versions of Direct3D, continue to work on Windows Vista, but are internally remapped where possible. This means that the majority of existing Windows applications will continue to work.

Hmm, re-mapping sounds slow. And here:

GDI primitives like LineTo and Rectangle are now rendered in software rather than video hardware, which greatly simplifies the display drivers.  We don’t think this will impact many real-world applications (usually when a GDI application is render bound its because it’s doing something like gradients that was never hardware accelerated), but if you do see problems please let us know.

See also Greg Schechter’s notes here on GDI-rendered windows.

In a nutshell, Vista is optimized for DirectX rendering at the expense of GDI; yet, as Schechter notes:

Today and for the near future, most applications use and will continue to use GDI to render their content.

This I suspect is a large part of the reason why some tests report that XP is twice as fast as Vista. Automating Office could well hit this slow GDI issue. Note that it is not all 2D graphics that are slow – CustomPC got less alarming results for its 2D graphics tests, which by the looks of it are not GDI-based.

Schechter does not cover the scenario where the DWM (Desktop Window Manager) is turned off, but it looks as if some of the factors which make GDI slow still apply.

What’s most effective at speeding up Vista?

I did some experiments where I compared Vista Basic with Vista Aero, or looked at what happened when UAC is switched off. I got inconsistent results. On an older machine, I found that disabling Aero made a significant difference, maybe an 8% speed-up. On another, more recent machine, Aero was actually faster. So much changes when you use DWM that I guess this is to be expected.

UAC? Not a huge difference in these tests – around 2%.

The biggest influence, on the basis of my imperfect testing, came from using Msconfig to switch off all but basic services and start-up programs. This improved performance by around 10% overall.

In most other tests there were modest differences between Vista and XP. This includes 3D graphics, where Vista actually scored higher than XP on one machine, and CPU, where on one machine there was less than 2.5% difference between best and worst. Vista does come out significantly slower on the PassMark memory test suite, from just over 8% worse to over 20% worse.

Conclusion? First, my informal tests suggest that XP is faster than Vista, but not normally twice as fast. Second, an application written to use DirectX rather than GDI should perform better, other things being equal. WPF (Windows Presentation Foundation) uses DirectX, but unfortunately has its own performance overhead.

If this analysis is right, then Vista is at its worst when rendering the GUI in traditional Windows applications. That will make them feel less snappy, but would not affect non-visual work such as recalculating a spreadsheet.

Then again, I only tried one test suite, so please don’t take the above too seriously.

I’d be interested in further informed comments.


Extend SQLite with Delphi functions

I have a couple of open source projects on the go, one of which is a simple Delphi wrapper for SQLite. Lukas Gebauer has now added experimental support for user defined functions. This lets you in effect extend the SQL understood by SQLite to include your own custom functions, written in Delphi.

To try out the feature, download the wrapper and have a look at the file sqlite3udf.pas.
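The wrapper is Delphi, but the underlying mechanism is SQLite’s own user-defined function facility, which other bindings expose too. As a quick illustration of the concept only (the Delphi wrapper’s API differs), here is the same idea via Python’s built-in sqlite3 module:

```python
import sqlite3

# Register a custom function with SQLite, then call it from SQL.
# This uses Python's sqlite3 binding purely to illustrate the concept;
# the Delphi wrapper exposes the same underlying SQLite facility.
def reverse_text(s):
    return s[::-1]

conn = sqlite3.connect(":memory:")
# Arguments: SQL function name, number of arguments, the callable
conn.create_function("reverse", 1, reverse_text)

row = conn.execute("SELECT reverse('SQLite')").fetchone()
print(row[0])  # etiLQS
```

Once registered, the function behaves like any built-in SQL function for the lifetime of that connection.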


ParallelFX: concurrency library for .NET

A few links:

Announcement of CTP on Somasegar’s blog

MSDN article on PLINQ

MSDN article on Task Parallel Library

Joe Duffy – concurrent programming blog

Interesting note in Duffy’s blog about PDC 07 (the one that never was):

Note: some might wonder why we released the articles before the CTP was actually online.  When we originally put the articles in the magazine’s pipeline, our intent was that they would in fact line up.  And both were meant to align with PDC’07.  But when PDC was canceled, we also delayed our CTP so that we had more time to make progress on things that would have otherwise been cut.

Concurrency is a big deal; it’s good to see more library support.
