Microsoft’s Live Maps upgrade is a downgrade in the UK – developers are fuming

Microsoft’s latest upgrade to Live Maps has some interesting new features, including more detailed 3D city views, labels in bird’s eye view, GeoRSS feeds, and more:

As always the changes visible in the user interface only scratch the surface of the dozens of improvements across the application tiers including Geocoding enhancements, browser compatibility (Safari and IE8), parsing improvements, reverse geocoding, printing improvements and tons more. We are also releasing an upgrade of our Map Control to version 6.1 for developers.

Unfortunately for UK developers, there’s another change. In December 2007 Microsoft acquired Multimap, and the latest update now redirects maps.live.com to Multimap if your region setting is UK (or, possibly, elsewhere in Europe). You can see the difference if you force the region to US with a URL argument: http://maps.live.com/?mkt=en-us.

The Multimap maps have fewer features and include advertising. For example, there is no 3D view in Multimap:

Here’s the real Live Maps version (note the extra features):

It looks like the photography is the same; but developers are not happy. Check out the comments on the announcement, for example:

I loved Live maps, but i’m off to Google maps if multimap doesn’t get removed and we go back to the nice clean good interface and all the features we’ve come to love.

Multimap is awful.

or

I never thought I’d have to say this, but for UK users Live Maps is now complete TRASH. Why on Earth did you replace Live Maps with Multimap??? Multimap uses technology that hasn’t changed for years, the maps are awful, the page is full of clutter, and it doesn’t even use smooth zooming – it just reloads the page every time you zoom in because it’s all static image based.

Add to that the fact that the road maps all use images scanned straight out of the PAPER atlas, not the sleek, computer-friendly road map style that Live Maps uses.

It’s a shame, as I’d been meaning to blog about how impressive Live Maps and Virtual Earth looked in some of the demos at Mix08. That said, Microsoft can fix this easily by removing the redirect.

This is an integration issue. If Microsoft-Yahoo becomes a reality, I wonder what other such issues will cause developers to fret.

Thanks to Ian Blackburn for the link.

UPDATE: Live Maps is back. As of yesterday, Multimap is no longer being used even in the UK. Looks like Microsoft listened; I’m impressed.

Google App Engine: how much will you pay for freedom?

Google is offering to host your web apps for free:

You can create an account and publish an application that people can use right away at no charge, and with no obligation. An application on a free account can use up to 500MB of storage and up to 5 million page views a month.

What’s an application? App Engine gives you a runtime for Python apps (only Python code will run) and includes the Django web framework. There is a structured datastore which on the briefest of looks has echoes of Amazon’s SimpleDB and Microsoft’s SQL Server Data Services. Welcome to GQL – the Google Query Language. You can send email through Google’s servers (hmm, hope some work is being done to foil the spammers). You can use Google Accounts as an identity service – this is a big one, since it helps Google to meld your online identity with its services.
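
To make that concrete, here is a rough sketch of the kind of code you would write inside a request handler, based on the Python SDK in the current preview. The Bookmark model and its properties are my own invention for illustration, not anything from Google’s samples.

    from google.appengine.ext import db

    # Hypothetical model for illustration; the property types come from the SDK's db module
    class Bookmark(db.Model):
        url = db.StringProperty(required=True)
        title = db.StringProperty()
        created = db.DateTimeProperty(auto_now_add=True)

    # Put an entity into the structured datastore...
    Bookmark(url="http://www.itwriting.com", title="ITWriting").put()

    # ...and query it back with GQL (a single query returns at most 1000 results)
    recent = db.GqlQuery("SELECT * FROM Bookmark ORDER BY created DESC LIMIT 10")
    titles = [bookmark.title for bookmark in recent]

Despite the SQL-like syntax, the datastore reads more like an object store than a relational database – there are no joins, for example – which is presumably part of how Google makes it scale.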

So what’s the business model? Google says:

During this preview period, only free accounts are available. In the near future, you will be able to purchase additional computing resources at competitive market prices. Free accounts will continue to be available after the preview period.

There are a few clues about what will constitute an “additional computing resource”. Clearly storage is one limit, and free accounts are also limited to 3 applications. There are also references to bandwidth limits, the number of results you can return from a query (1000), and the length of time taken to serve a web request.

Apps communicate through HTTP or HTTPS requests. No talk of SOAP or even XML that I can see, though presumably you can use Python libraries.
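
For what it is worth, a plain HTTP handler in the bundled webapp framework is very short. Here is a minimal sketch based on the preview SDK – the URL mapping and greeting text are mine – which also shows the Google Accounts hook mentioned above:

    from google.appengine.api import users
    from google.appengine.ext import webapp
    from google.appengine.ext.webapp.util import run_wsgi_app

    class MainPage(webapp.RequestHandler):
        def get(self):
            # Google Accounts as the identity service: None if the visitor is not signed in
            user = users.get_current_user()
            self.response.headers['Content-Type'] = 'text/plain'
            if user:
                self.response.out.write('Hello, %s' % user.nickname())
            else:
                self.response.out.write('Sign in at: %s' % users.create_login_url(self.request.uri))

    # Map the root URL to the handler; app.yaml routes requests to this script
    application = webapp.WSGIApplication([('/', MainPage)], debug=True)

    def main():
        run_wsgi_app(application)

    if __name__ == '__main__':
        main()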

Although we talk a lot about the largest applications that need to scale, these are a minority of real-world applications. Many of today’s web applications could run happily for free on Google’s new service, once ported. The economics interest me. Google is offering to subsidise our web infrastructure even further than it does already with GMail, Blogger and iGoogle gadgets. Therefore, if we choose to host our own services we have to pay for the flexibility and control that gives us, as well as having to deal with scalability and security issues that Google would otherwise look after for us. In the light of generous app hosting offers like this, how much are we willing to pay for that freedom?

Amazon, eBay, Facebook: the risk of building your business on a third-party platform

We are seeing web giants flex their muscles. Here are three instances.

Facebook’s frequent platform changes make it tough for small developers to keep up – I blogged about this recently.

Amazon declares that Print on Demand sales on its site must use its own printing system, causing consternation for rivals like Lightning Source.

eBay changes its terms for sellers, removing the option to give negative feedback to scam buyers and increasing final value fees from 5.25% to 8.75% (a 67% increase).

In each case, the losers can fume and complain; but there’s little else they can do, other than withdraw their business. eBay, Facebook and Amazon have the right to do as they want, within the law, with their web sites. Unfortunately, withdrawing your business from the dominant platform in each field (social networking, web retailing, auction sales) is likely to be even more expensive than gritting your teeth and putting up with it – at least, that’s what the big guys are counting on.

The problem: it’s high risk to have a third party control your platform. This is something the music industry has belatedly recognized in respect of Apple’s iTunes.

I expect to see more of this, as the biggest players change focus from buying market share with low prices and free services, to trying to monetize their existing share more effectively.

PS: I realise that Facebook is in nothing like the same position of strength within its market as Amazon or eBay; nevertheless there seems to be a parallel to do with lack of control over your destiny.

A real-world account of Google AdSense – and it doesn’t look good

Advertising is “the economic engine that powers the Web”, according to Microsoft’s Ray Ozzie. Google’s rapid ascendancy, enabled by advertising revenue, is the primary evidence for this. That said, Rick Strahl’s post on Google advertising highlights several problems with Google’s approach. It is about AdSense, the mechanism by which third-party sites (like this one) host Google advertising. When someone clicks an ad, Google gets paid, and an undisclosed percentage of the fee goes to the site owner.

Strahl runs a small business and uses Google both as an advertiser and as a web site owner. He’s puzzled by the stats he’s gathered at both ends. As an advertiser, he says that 30-40% of his hits come from link parking sites, plus another 10% which have no referrer, and reckons that these hits are worthless and in many cases possibly fraudulent. That’s up to 50% of his ad spend wasted. Google tells him there is no way to opt out of link parking sites, other than by excluding specific sites; but since there are thousands of such sites and they change constantly, that is impractical.

At the other end, Strahl sees frequent deductions from the clicks on his own site, presumably on the basis that they are fraudulent or accidental (such as robot clicks). In fact, deductions from his site, which he controls and which has good, genuine content, appear to be far higher than those from the link parking sites which have no real content at all. In other words, Google seems happier to make deductions from what it pays to him, than from what he pays to Google.

He’s also curious about the ad bidding process, which always seems to end up charging him the maximum possible.

It’s possible that he has some of this wrong; but there is no way to audit Google’s figures:

In the end it feels like black magic. Google (and other advertisers as well to be fair) control the process so completely that if there’s any foul play either on Google’s part or for cheating publishers that contest clicks on the other end there’s almost no real way to tell that it’s happening and unless you have the time to keep very close tabs on it there’s no way to follow the money all the way through – on both ends. And who has that kind of time?

I find this unsurprising. The pay-per-click model has always seemed to me far too vulnerable to abuse, especially bearing in mind all those botnets. Who pays for any fraud? Not Google, but Google’s customers, the advertisers.

Some level of click fraud is inevitable, but Google’s willingness to let any old worthless bot-driven link parking site run AdSense ads is a disgrace. This stuff poisons the web, because it provides a financial incentive to post junk.

Advertisers can opt out of AdSense by disabling the “content network” for the ads they place. If enough advertisers do this, Google will take note.

Disclosure, and a personal note: I am an AdSense publisher, though not an advertiser. I also use Blogads, which to my mind has a better business model for advertisers, since they specify exactly which sites they wish to use. In addition, I get to approve each ad, whereas with AdSense I have to take whatever comes. The snag is, Blogads is tiny in comparison to Google, which can seemingly always supply ads for my site.

Ubuntu Hardy Heron – very cool

I had a spare desktop after upgrading my Vista box – at least, I popped my old motherboard in a spare case and added a hard drive. It seemed a good opportunity to try Ubuntu Hardy Heron. Ubuntu has a policy of upgrading its Linux distribution every six months, in April and October, and Hardy Heron is this year’s April release. I tried a late beta, since final release is not until the end of the month. Burned a CD, stuck it in the drive, and installed it.

The install went smoothly. The main hassle with Ubuntu, and most other Linux distros, is that there are a few add-ons which you can’t easily do without, but which are excluded from the main release either for legal reasons or because they are proprietary. For example, I tried to play a DVD, but the Totem movie player said it did not have the right GStreamer plugin. It would be nice if Ubuntu had a one-click install, something like “OK, I give in, give me libdvdcss2, give me Flash, give me Java, and I’ll take the consequences.” I fiddled around with Medibuntu, then realised you can get something close to a one-click install if you install the ubuntu-restricted-extras package. It didn’t actually take too long before I was up and running: DVDs played, YouTube worked, Java worked. I also added the proprietary NVIDIA driver which is needed to enable the Compiz Fusion 3D desktop. That one was easy: Ubuntu prompted me to do it.

The “what’s new” list includes Linux kernel 2.6.24, Firefox 3 (although still in beta), and better virtualization support with KVM. Gnome is updated to 2.22. Think incremental rather than dramatic changes.

Subjectively, Ubuntu performs better on the same hardware than Vista. There is just less waiting around. I had some fun connecting to my Vista desktop using the Terminal Server client. Then I pressed Windows-Tab to cycle between applications (note the cool reflections):

The key factor for Ubuntu is not features, but usability. In this respect, it seems to get better every time I look.

Reality strikes for Blog Friends Facebook app

Just spotted this sad note from the developers of one of the few Facebook apps I’ve enjoyed using, Blog Friends. The app combines blog aggregation with social networking, and does a good job of highlighting interesting posts you might otherwise miss:

Although it appears simple on the surface, Blog Friends is actually an unusually complex and resource-intensive application to maintain and grow …. the way that Blog Friends is currently tied into the Facebook Platform means we have been at the mercy of Facebook’s frequent modifications of their Platform specifications, and that has also been another disabling factor for us.

What is needed is a complete rewrite of Blog Friends, one that makes it properly scaleable and independent of Facebook. As you can imagine, this is a huge undertaking and unfortunately we don’t have the resource or money to do this; we have never inflicted any advertising on you our users, so we haven’t made a penny in revenue from Blog Friends.

We’re shutting down, as of today.

It’s tough to prosper without a sane business model; and it’s tough to survive on some third party’s proprietary platform.

Proprietary platforms love developers (Ballmer’s battle cry, remember), because they add value. They are risky for developers though, because the platform owner can change the rules.

Is Bubble 2.0 going to end the same way as Bubble 1.0?

Microsoft discusses next-gen MSDN … on Facebook

According to this blog post, Microsoft is setting up a buzz group on Facebook to discuss the next generation of its online documentation for developers, the MSDN (Microsoft Developer Network) Library:

We have put a facebook group together to aggregate together the folks who want to work with us to provide feedback, usability and ideas for the next generation of the MSDN Library. We call this project Library 3.0 and we will be organizing events and presentations from this group to bring us together on the project. My goal is build quorum of members over the next months with kickoff’s in late May for the first events.

It’s an interesting place to hold the discussion. Yes, Microsoft has a small stake in Facebook; but it also runs a vast network of technical communities and is doing great business with its SharePoint collaboration platform. So why use Facebook?

Of course, two key areas that need improving in MSDN are collaboration and search, so you could argue that choosing a third-party platform for collaborating on MSDN itself is significant.

Then again, it’s probably more to do with internal red tape. What’s easier: getting corporate agreement on some new developer relations initiative and setting up the infrastructure, or just sticking a new group on Facebook?

If you are interested, the group is here. Currently it has no content or discussion whatsoever. Not a good start; but there’s time…

JBuilder 2008 and Vista’s Program Compatibility Assistant

One of Vista’s annoyances is this dialog, which you may see shortly after installing an application:

As you can see, I got this after installing CodeGear’s new JBuilder. The reason it annoys me is that it doesn’t tell you what “compatibility settings” it has applied. In this case, even if you go to JBuilder.exe in Explorer and view its properties, you will find all the compatibility options unchecked. So what has it done?

Of course I clicked “What settings are applied”. Here’s what it says:

As you can see, this still does not tell you what settings are applied. By the way, Group Policy enables you to disable the Program Compatibility Assistant completely, but does not show the settings for individual applications.

I ran the registry editor, and found this entry:

It looks like the Persisted key tells Vista which applications have already had settings applied, while the Layers key tells Vista what settings to apply. ELEVATECREATEPROCESS lets the application create child processes which require admin rights, though they still raise a UAC prompt.
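
If you want to see these entries without trawling through regedit by hand, a few lines of script will list them. This is just a sketch of my own, using Python’s winreg module; the key path is my assumption about where the Layers values live, so treat it accordingly.

    import winreg

    # Sketch: enumerate the per-user compatibility layers Vista has recorded
    LAYERS = r"Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers"

    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, LAYERS) as key:
        index = 0
        while True:
            try:
                exe_path, flags, _ = winreg.EnumValue(key, index)
            except OSError:
                break  # no more values under this key
            # Expect something like ...\JBuilder.exe -> ELEVATECREATEPROCESS
            print(exe_path, "->", flags)
            index += 1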

I also found this Microsoft article which does a good job of explaining how the Compatibility Assistant works. It appears that JBuilder 2008 tries to run something which requires administrator permissions, but does not use the correct Vista technique for doing so. I soon found out what it is:

It’s running regedit, and exporting some keys that appear to relate to Mozilla’s Gecko Runtime project, for embedding a browser in an application. Unfortunately it does this (twice) every time it runs, which is unlikely to be necessary. You would have thought there would be a better way to use these registry entries than exporting them to a temporary file.

Conclusions? None really; I just wanted to know what this annoying wizard does. A couple of observations though. First, it’s careless of CodeGear to let JBuilder 2008 out like this. It just looks bad, to have your app identified as an old one that needs compatibility help.

Second, if you read Microsoft’s article you’ll notice that among other things Vista “instruments” the CreateProcess API call in order to make this work. There must be a performance impact. I guess Microsoft will say it is a small one; but I guess it also makes its little contribution to Vista’s overall performance issues.

Microsoft’s business model for Silverlight

Pretty vague. As you’d expect. In this excellent interview Microsoft’s developer division VP Scott Guthrie cites three revenue opportunities:

  • Tools and servers
  • Customer engagement leading to ad sales
  • As a platform for other, presumably profitable, apps

I’m most interested in the third of these. By the way, I like Silverlight. Cross-platform .NET has been a personal interest of mine for ever. In 2002 I wrote an introductory article about .NET, and said:

It would do .Net enormous good if it became a credible cross-platform contender, say on Windows, Linux and the Mac. It would do Microsoft enormous good if it could be seen to work with the open source community in the same way as IBM does so successfully.

Six years on, the cross-platform potential in .NET is finally coming together. However, it is as a web-based runtime, rather than as a desktop runtime. That wasn’t quite what I expected back in 2002, but it is no bad thing. If Microsoft is serious about refactoring all its software for cloud services, as Ray Ozzie stated at Mix08, then Silverlight could be a key enabling technology, giving a rich desktop-like experience but in browser-hosted applications.

I was also interested in Guthrie’s comments on open source:

…people in the Linux community are much more likely to trust Novell and, specifically, Miguel [de Icaza] and the Mono Project and feel like, “Okay — if it is open source, I can get access to all the source [code]. You’re telling me that I can snap the source and build it myself if you’re not doing a good job? Okay, that’s interesting.” The higher level libraries that we are distributing — our controls and things like that — those will just work on the Linux version of Silverlight. They can take our source and use them for that.

Microsoft isn’t posting its source for the Silverlight runtime, but it is supporting an official open source implementation. That’s an intriguing distinction versus Flash, which has open source implementations, none of which have taken off. Adobe has open-sourced Flex, but not the Flash runtime. However, note Guthrie’s comment:

We actually deliver the media graphics stack to Novell, so we use the same video pipeline and same media pipeline on the Linux version as on the Windows and Mac versions.

So is that “media graphics stack” open source? I suspect not, but would be glad to be proved wrong. This point might make a difference to Linux distributions that exclude proprietary software by default.

Finally, Guthrie makes some remarks about Adobe AIR and the fact that Silverlight doesn’t have an equivalent cross-platform desktop engine. He says businesses are more interested in a “web-based model”, and observes that the full .NET and WPF stack is already a desktop runtime.

I’m not sure that this is a big deal. It wouldn’t be a huge step to host Silverlight in a cross-platform desktop application, for example by including it in a browser control. At the 2007 Mix, the New York Times folk told me they intended to do this with Times Reader. We are also going to see a number of different approaches to this problem. Mozilla is working on desktop integration for browser apps. Google shows a desktop shortcut in its introductory video for offline access. I recall Adobe’s Kevin Lynch remarking on the psychological barrier to opening a browser application when offline, as being one of the motivations for developing AIR, but there is more than one way to mitigate this.

Microsoft: OOXML has won approval as an ISO/IEC standard

According to Microsoft’s press release, and a document in unofficial circulation, Microsoft’s Office Open XML, an XML format for Microsoft Office, has been approved as an ISO standard.

It’s been an ugly process. That said, I suspect the spec has been significantly improved by all the attention it has received. The spat has exposed the money and politics behind standardization processes. This seems to be a theme of late: see also Blu-ray vs HD DVD, and even the Java Community Process.

I welcome standardization, but dislike the way both sides have put their standards wars ahead of the convenience of users.
