Google, Adobe Flash, and H.264 video

On signing into Google Docs today I saw the following:

image

I clicked Learn more and was directed to this article. The files you can upload and play:

  • WebM files (Vp8 video codec and Vorbis Audio codec)
  • .MPEG4, 3GPP and MOV files – (h264 and mpeg4 video codecs and AAC audio codec)
  • .AVI (many cameras use this format – typically the video codec is MJPEG and audio is PCM)
  • .MPEGPS (MPEG2 video codec and MP2 audio)
  • .WMV
  • .FLV (Adobe – FLV1 video codec, MP3 audio)

And how do you play a video?

Simply click a video file that you’ve uploaded to your Documents List and the video opens in a new page that includes a video player. You will need to have Flash installed for the video player to work.

At the same time, Google says it is removing H.264 support from its Chrome browser:

Though H.264 plays an important role in video, as our goal is to enable open innovation, support for the codec will be removed and our resources directed towards completely open codec technologies.

How do we make sense of this? The implication is that Google is not in fact bothered about H.264, but rather wants to promote Flash for video instead of the HTML 5 <video> element. That is a problem for Apple iOS users who cannot run Flash, and puzzling insofar as you would expect Google to be promoting rather than discouraging HTML 5 adoption.
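Flash-only playback also means a site has to detect whether the plug-in is present at all before embedding its player. Here is a minimal sketch of that kind of check using the standard navigator.mimeTypes API – the fallback handling is my own illustration, not anything Google documents:

```typescript
// Rough check for the Flash plug-in before embedding a Flash-based player.
// Illustrative sketch only: the fallback message is an assumption.
function hasFlash(): boolean {
  const mime = navigator.mimeTypes.namedItem("application/x-shockwave-flash");
  return mime !== null;
}

if (hasFlash()) {
  // Embed the Flash video player here.
} else {
  // iOS, and any other browser without the plug-in, ends up here.
  document.body.textContent = "This video requires Adobe Flash.";
}
```

On iOS there is no plug-in to detect, so a page built this way simply falls back or fails – which is exactly the annoyance in question.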

Possibly the real target is Apple. Flash has become a key selling point for non-Apple mobile devices. By making more use of Flash, Google can make the web more annoying for iOS users and thereby promote Android.

As John Gruber observes, Google has some questions to answer.

Update: Google’s Mike Jazayeri has posted some more background on the decision here.

Visual Studio 2010 nine months on: how good has it proved?

Visual Studio 2010 was released on April 12th 2010. Nine months on, how good has it proved to be?

image

I researched Visual Studio 2010 in depth at the time, and was impressed overall. It was a huge release, partly because the IDE was rebuilt using Windows Presentation Foundation (WPF), and partly because of a large number of new features, including the F# language. Performance was always going to be a concern with the move to a .NET-based IDE, but on my machines I found it satisfactory.

Others have been less pleased with the performance. The comments to Jason Zander’s announcement of the Service Pack 1 beta last month make interesting reading. Here is the negative:

I am a professional .NET developer and I am really upset with VS 2010. It crashes more often than VS 2008. It is slow as hell. It even crashes when debugging. VS 2010 is built with WPF which is causing all these problems.

and here is the positive:

I don’t know what y’all complaining about – VS2010 is blazingly fast… at least on my machine.

I am not sure whether the performance issues depend more on the type of work you are doing, the size of the projects, or some other factor. One issue may be graphics performance, which makes a big difference to WPF but matters far less to Visual Studio 2008 and earlier.

Thinking back to this time last year, I also recall how focused Visual Studio 2010 seemed on .NET, including Silverlight. Later came the announcement of Visual Studio LightSwitch, a RAD database application tool which builds Silverlight clients. It now seems obvious, especially after the PDC (Professional Developers Conference) in November, that the vision of Microsoft’s developer team did not align with that of the Windows team, and that the Windows team won that argument internally.

It is odd, because Silverlight has the potential to solve problems for the company. It is a technology that extends from the desktop to Windows Phone 7, is well suited to app store deployment thanks to the way its apps are isolated, and can potentially run on multiple platforms. Yet with Silverlight 5, promised for release this year, Microsoft is adding more Windows-specific features and allowing more fragmentation between versions: Silverlight on Windows Phone 7 is based on version 3, the Mac version has more limited capabilities than the Windows version, and so on.

Microsoft said at PDC that “HTML 5” is its broad-reach platform. That suggests that what Visual Studio needs is HTML 5 designers and JavaScript libraries that integrate with Microsoft’s server technologies and make it easier to develop HTML applications for multiple form factors, including small devices.

It is a confusing story, and I would love to know whether the subject came up in CEO Steve Ballmer’s recent discussions with Bob Muglia, President of Server and Tools. The outcome of those discussions is that Muglia will be leaving Microsoft in the summer.

We will have to wait, perhaps for Visual Studio 2012, to discover any change in direction. In the meantime, SP1 adds a new help viewer in response to many complaints, as well as a few new features for testing and debugging. There is also a list of bug-fixes, some of which look significant:

and so on. Let me add that while the list looks bad, it is no more than you would expect for a tool of this complexity, and in my own testing Visual Studio 2010 has worked well.

I agree though with some of the commenters who note that Microsoft is slow to react when bugs are reported. It will be more than a year after the initial release when SP1 is finished, though you can use the beta for production code if you dare.

I would be interested in hearing from users of Visual Studio 2010. How are you finding it, or did you try it and go back to Visual Studio 2008? I realise that adoption of a new IDE for production work tends to be slow, because developers are reluctant to switch mid-project.

Google flexes its Chrome browser muscles, removes support for H.264 video – but what about Adobe Flash?

Google has announced that it will remove support for the H.264 video codec in its Chrome browser:

…we are changing Chrome’s HTML5 <video> support to make it consistent with the codecs already supported by the open Chromium project. Specifically, we are supporting the WebM (VP8) and Theora video codecs, and will consider adding support for other high-quality open codecs in the future. Though H.264 plays an important role in video, as our goal is to enable open innovation, support for the codec will be removed and our resources directed towards completely open codec technologies.

The reason given is that Google wishes to support open standards. That sounds good for open standards, but not so good for users who simply want a video to play.

Google’s position contrasts with that of Microsoft and IE9:

In its HTML5 support, IE9 will support playback of H.264 video as well as VP8 video when the user has installed a VP8 codec on Windows

Still, at least IE9 will play VP8 if the codec is installed, whereas Chrome will not play H.264 at all; that makes VP8 look the better option for content providers who want to reach both browsers – which is the outcome Google is hoping for.

I have mixed feelings about this approach, because while it is good for open standards it is bad for compatibility. I am also not sure that it is consistent. Google announced in June that it is integrating Adobe Flash support into the browser; yet Flash is not an open standard.

That also suggests that H.264 video will still play in Chrome, provided it is in a Flash wrapper.
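To make that concrete, here is a minimal sketch of how a site might choose between native playback and a Flash wrapper, using the standard HTML5 canPlayType API. The file names, the player.swf and its flashvars convention are my own assumptions for illustration:

```typescript
// Pick a source for an HTML5 <video> element, falling back to a Flash-based
// player when no supported codec is available natively.
// Sketch only: file names and the Flash player details are assumptions.
const video = document.createElement("video");
video.controls = true;

const h264 = video.canPlayType('video/mp4; codecs="avc1.42E01E, mp4a.40.2"');
const webm = video.canPlayType('video/webm; codecs="vp8, vorbis"');

if (h264 !== "") {
  video.src = "clip.mp4";   // Safari, IE9 – and Chrome before this change
  document.body.appendChild(video);
} else if (webm !== "") {
  video.src = "clip.webm";  // Chrome, Firefox 4, Opera
  document.body.appendChild(video);
} else {
  // No native codec support: hand the same H.264 file to a Flash player.
  const flash = document.createElement("embed");
  flash.setAttribute("type", "application/x-shockwave-flash");
  flash.setAttribute("src", "player.swf");  // assumed Flash video player
  flash.setAttribute("flashvars", "file=" + encodeURIComponent("clip.mp4"));
  document.body.appendChild(flash);
}
```

A page written this way keeps serving H.264 regardless of what Chrome’s <video> element supports – natively where possible, via Flash otherwise – which is exactly the scenario described above.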

Maybe Google is learning from Apple how to deprecate technologies by removing support. Apple refuses to allow Java or Flash on iOS, and has deprecated its own build of Java for OS X. Apple has also stated that these “optional” components may not be used in apps deployed in the Mac App Store, creating a disincentive for developers considering those runtimes.

Apple has not squashed Flash though; and Google may find it equally hard to squash H.264, which is widely supported throughout the industry, has the best tools, and for which hardware is optimised. Apple supports H.264 but is unlikely to support WebM or Theora any time soon. Here’s what Apple CEO Steve Jobs said in April 2010:

To achieve long battery life when playing video, mobile devices must decode the video in hardware; decoding it in software uses too much power. Many of the chips used in modern mobile devices contain a decoder called H.264 – an industry standard that is used in every Blu-ray DVD player and has been adopted by Apple, Google (YouTube), Vimeo, Netflix and many other companies.

Although Flash has recently added support for H.264, the video on almost all Flash websites currently requires an older generation decoder that is not implemented in mobile chips and must be run in software. The difference is striking: on an iPhone, for example, H.264 videos play for up to 10 hours, while videos decoded in software play for less than 5 hours before the battery is fully drained.

When websites re-encode their videos using H.264, they can offer them without using Flash at all. They play perfectly in browsers like Apple’s Safari and Google’s Chrome without any plugins whatsoever, and look great on iPhones, iPods and iPads.

It seems Jobs spoke too soon when he said H.264 would play perfectly in Chrome.

Post updated to add Apple quote

What next for application help and documentation? First thoughts on Adobe’s Technical Communication Suite 3

Adobe has launched Technical Communication Suite 3, which bundles FrameMaker 10, RoboHelp 9, Captivate 5, Photoshop CS5 and Acrobat X. FrameMaker and RoboHelp are Windows-only, so the suite is the same.

I had a short briefing on the product today, which by coincidence came after my bad experience with SharePoint Designer and its help system. Please note: I do not hold Adobe responsible for the shortcomings of Microsoft’s online help, but the experience helped me to put the subject into context. I was trying to figure out how to get SharePoint to display file extensions in document lists. The supplied help looks pretty:

image

but I found it disappointing. I wanted to know, for example, the implications of converting a web part to XSLT, an option on one of the designer’s context menus:

image

Same story when I wanted to know what the @LinkFileName formula was meant to return. And when I looked for a SharePoint formula reference I got one useless result, an article on creating a workflow initiation form.

What we all do in these situations is hit Google. The snag: whereas the built-in help (which is also meant to search Office online) has high authority but returned no results, Google has the opposite problem: many results but little authority. I did eventually find the formula reference I wanted, but finding correct information on the web at large is a matter of luck and judgment.

I found it interesting, therefore, to talk to Adobe about its Technical Communication Suite. How is online help changing? Do we even need it, when people hit Google rather than F1? Maybe it is better just to make sure your help articles and reference material are easy to find on the web, rather than packaging them up and calling them a help document? In which case, we should be thinking in terms of a content management system rather than online help as such.

The answer I guess is “all of these”. The key concept in Adobe Technical Communication Suite is “single-source authoring”, and you can use the same content for web pages as well as for print and traditional packaged online help.

It is still a bit old-school for my taste. For example, you can now include an external content search in RoboHelp documents; but this only lets you add external URLs to the document along with search keywords. It does not let you search the external content itself, restricted to specified web sites, which would be a nice feature.
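What I have in mind is closer to a federated search over the live content of nominated sites, rather than a list of links. As a rough illustration of the idea – this is not a RoboHelp API, and the site names are arbitrary – a help client could at least construct a query restricted to chosen sites using a search engine’s public site: operator:

```typescript
// Illustrative sketch of a site-restricted help search using the public
// "site:" operator. Not a RoboHelp feature; the site list is an assumption.
function siteRestrictedSearchUrl(query: string, sites: string[]): string {
  const restriction = sites.map(s => "site:" + s).join(" OR ");
  return "https://www.google.com/search?q=" +
    encodeURIComponent(query + " " + restriction);
}

// Example: search two documentation sites for a SharePoint formula reference.
console.log(siteRestrictedSearchUrl("@LinkFileName formula", [
  "msdn.microsoft.com",
  "office.microsoft.com",
]));
```

A help client that did something like this would combine the authority of nominated sources with the reach of the web.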

That said, if you use RoboHelp Server 9 – not included with the suite itself – in conjunction with an Adobe AIR help client, you can get user topic rating and commenting, so there is some concession to user-generated content.

There are also plenty of scenarios where you still need blow-by-blow documentation and a reference for an application. In fact, if the SharePoint help mentioned above had provided this, I would have been happy.

This is not a review of the Technical Communication Suite, though I hope to get a look at the actual product shortly. In the meantime, a few points of interest. FrameMaker has considerable feature overlap with InDesign; but Adobe says there is still a place for a desktop publishing tool aimed at long technical documents, with strong support for structured documents, cross-references and indexes. RoboHelp now supports collaboration workflows using Acrobat.com and PDF review. There is also new support in FrameMaker and RoboHelp for ePub, the eBook format for everything but the Amazon Kindle. I asked about Kindle support; the Adobe spokesperson was sniffy about Amazon’s proprietary MOBI format but said it might be added eventually if Amazon do not add ePub compatibility to the Kindle.

How Microsoft SharePoint makes simple things hard

When I was asked how to show file extensions in lists of documents on SharePoint sites I thought it would be a simple change to make. I did a quick Google and found several answers; but some of them involved editing core files that instinctively I thought should be left alone. I took a closer look and worked out the steps.

It turns out that you need SharePoint Designer, plus you have to convert a web part to XSLT and then figure out what to change in the rather complex page that is then generated.

A few observations.

First, I am surprised that Microsoft did not build in some easy way of showing the file extensions in a document library, which seems an obvious thing to want to do. There are hundreds of much more obscure things you can easily show, but not this one.

Second, it is nice that Microsoft has made its SharePoint Designer tool free, but I am not sure that the way it is presented is quite right. It is a techie product but I did not find Help particularly helpful. You know the kind of thing; you are in the Formula Editor, you hit F1, and you get a description of the dialog, when what you want of course is a reference to the formulae.

Third, when I did find the documentation I found it obscure. Here’s the reference for the @LinkFileName formula:

Returns a GUID that represents the icon that is used to create a link to a file in a document library, where the file can be edited by using a menu.

Hmm. I am not sure how many fat SharePoint books you need to read to understand why this particular formula is used as it is in SharePoint, or why String(@LinkFileName) returns the file name with its extension.

Fourth, I discovered that SharePoint deliberately hides the file extension. You can show it by editing the formula that determines the contents of that cell and removing the function that strips the extension off – leaving, in effect, String(@LinkFileName).

Now I know why SharePoint is such good business for specialists.

Bob Muglia leaving Microsoft, CEO Steve Ballmer searching for new cloud leadership

Microsoft has announced that Bob Muglia, President of Server and Tools, is leaving Microsoft.

In his memo, Steve Ballmer says:

Bob Muglia and I have been talking about the overall business and what is needed to accelerate our growth. In this context, I have decided that now is the time to put new leadership in place for STB. This is simply recognition that all businesses go through cycles and need new and different talent to manage through those cycles.

It is always hard to tell from the outside, but in my encounters Muglia has been among the most articulate and confident of Microsoft’s top executives. I have also noticed in my regular look at Microsoft’s financials that the Server and Tools business has performed consistently well for as long as I can remember.

Most recently, Muglia took over the Azure business and seemed to know where he was going with it. He is also responsible for developer tools, and while his remarks about Silverlight at Microsoft’s PDC in November were disappointing to developers on that platform, they showed a clear sense of direction.

In this context, it seems surprising that Ballmer is in search of “new and different talent”. It does sound as if Ballmer and Muglia do not see the future of the cloud business – which is the focus of the memo – in the same way.

The key question: in what way did Ballmer and Muglia’s vision differ? I guess we will get some more clues as today’s news is discussed.

Update: Mary Jo Foley has posted Bob Muglia’s internal email to his team:

Later this year, I’m moving on to new opportunities outside of Microsoft, so I wanted to take a few minutes to share with you what’s important to me in life and leadership.

The foundation of who I am is based on living with integrity. Integrity requires principles, and my primary principle is to focus on doing the right thing, as best I can. The best thing, to the best of my ability, for our customers, our products, our shareholders, and of course, our people.

Just sugar, or did Muglia feel that staying at Microsoft would compromise those principles?

Since the announcement, the reaction across the industry has shown the high regard in which he is held, and bewilderment at why he is being let go. Here’s Redmonk analyst James Governor on Twitter:

Another exit: Microsoft server chief Muglia leaving company normally i say so what but this is TERRIBLE for microsoft

Spotify everywhere: now on Logitech Squeezebox as well as Sonos, Smartphones

Spotify, the music streaming service, has announced a partnership with Logitech to enable subscribers to play music via Squeezebox. Logitech already has a partnership with Napster for a similar service, but Spotify is winning in terms of usability, ubiquity and mind share.

It follows a similar agreement last September between Spotify and Sonos, a Squeezebox rival. Spotify has also announced support for Windows Phone 7, which joins Apple’s iPhone/iPad, Google Android and Nokia Symbian among its supported smartphone platforms.

Spotify is available for free on a PC or Mac, but supported by advertising, making it like a commercial radio station where you choose the music. However only paying subscribers get the benefit of using the service from these other platforms.

In my view streaming is the future of mainstream music distribution, so I see this as significant. Why pay for downloads, when you can choose from a vast catalogue and play what you want when you want?

The main snag with Spotify is that some artists are not available on the service, and some countries (including the USA) cannot get Spotify. Still, if it builds a big enough customer base, the music industry may find it cannot do without the service.

No Java or Adobe AIR apps in Apple’s Mac App Store

Apple’s App Store Review Guidelines appear to forbid Java or Adobe AIR applications from being published in the store:

Apps that use deprecated or optionally installed technologies (e.g., Java, [PowerPC code requiring] Rosetta) will be rejected.

Since Adobe AIR is not shipped by default with OS X, any applications requiring that runtime will not qualify. Java is forbidden because Apple has deprecated its own build of Java; and while it seems supportive of Oracle’s official OpenJDK project for Mac OS X, apparently that support does not extend to allowing Java apps into the store.

Of course it is not only Java and Adobe AIR that are affected, but any apps that need a runtime.

There are many other provisions, most of which seem sensible in order to protect the user’s experience of the App Store. Some of them have the potential to cause controversy:

Apps that duplicate apps already in the App Store may be rejected, particularly if there are many of them. Apps that are not very useful or do not provide any lasting entertainment value may be rejected.

What defines duplication in this context? And how will Apple test whether an app has “lasting entertainment value”? I presume this refers mainly to games.

The situation on Mac OS X is different from that on the iPhone or iPad, since users can easily install apps via other routes. That said, if the App Store catches on, not being included may become a significant disadvantage. Further, it will not surprise me if Apple starts hinting that non-approved apps carry more risk to the user, so that some users decide to avoid anything without this official stamp of approval.

I wonder whether Adobe will do a Flash packager for the Mac, similar to the one it offers for iOS, to get round these restrictions.

Apple’s Mac App Store – and the forgotten Windows Marketplace

Apple launched the Mac App Store yesterday and I had a look this morning. It is only available to users of Mac OS X Snow Leopard, where it comes with the latest system update.

image

It is interesting that Apple has not used iTunes for the App Store, but has developed new client software. Maybe it is coming round to the opinion that iTunes has become bloated; it is only for historical reasons that a music player has become an all-purpose app installer.

The store itself worked well for me. I picked a free app, TextWrangler, and signed in with my Apple ID. The UI showed Installing, then Installed, and I was done.

image

The TextWrangler icon appeared in the Dock so I could start the app easily.

What counts is what I did not have to do – reboot, select from setup options, or deal with perplexing error messages.

Users will also like the common-sense licensing, which lets you download and install a purchased app on any Mac you use, controlled by your App Store log-in. I am not sure what happens if you install your app on your friend’s Mac, then sign out of the App Store. There is some link between the app and your Apple ID, because if you copy the application to another Mac it will ask for your sign-in details when you first run it, but I am not clear whether this is checked on every run to deter piracy.

Most important, there is an attractive range of apps at good prices. In the UK, Angry Birds is £2.99, Pinball HD £1.79, and Apple Pages or Keynote £11.99 each. That is less than typical Apple Store shrink-wrap pricing. The prices for Pages and Keynote make the price Microsoft charges for Office look impossibly expensive. Good for customers; but worrying for independent software vendors who want to make a living.

Developers pay $99.00 per year to join the Mac Developer Program and then 30% commission to Apple on every sale. Of course, like the iPhone App Store, apps are subject to Apple’s approval.

Lest you think it is clever of Apple to invent an app store for the desktop, it is worth noting that the concept is an old one. Linux has delivered free software like this for years, and some distributions have also featured paid app installers integrated into the OS.

So has Microsoft, which has run various incarnations of Windows Marketplace over the years, for mobile and desktop applications. Windows Vista shipped with a built-in app store for both Microsoft and third-party apps. It was on the Start menu:

image

as well as in Control Panel:

image

On November 1st 2008 Microsoft shut down Windows Marketplace and “transitioned” it to a referral site. There was some angst at the time about the closing of the digital locker, which proved insecure against the threat of corporate mind-changing. Microsoft still runs the online Microsoft Store, but this is for Microsoft products only. For example, you can download Microsoft Songsmith for £25.00:

image

Why did Windows Marketplace fail? Well, the user experience was poor, it was insufficiently prominent in the Vista user interface, and setup could be troublesome. Major Windows app vendors figured out that they would be better off drawing potential customers to their own web sites, where they have full control. As is often the case, Microsoft was conflicted over whether it wanted to drive customers to the online store, to partner retailers, or to app vendor sites; and the OEMs would have their say as well when customising Windows for their own PCs.

Another factor is that Windows apps are often not well isolated. Silverlight actually solves this problem – out-of-browser apps are well isolated and secure – but Microsoft does not even ship Silverlight by default with Windows.

The indications are that Microsoft will have another go in Windows 8. Documents leaked last year show an app store. From my post at the time:

There’s a pattern here. Microsoft gets bright idea – Tablet, Windows Marketplace, Passport. Does half-baked implementation which flops. Apple or Google works out how to do it right. Microsoft copies them.

NVIDIA Tegra 2: amazing mobile power that hints at the future of client computing

Smartphone power has made another jump forward with the announcement at CES in Las Vegas of new devices built on NVIDIA’s new Tegra 2 package – a System on a Chip (SoC) that includes a dual-core CPU and a GPU, plus support for HD video encoding and decoding, audio, imaging, USB, PCIe and more:

image

The CPU is the ARM Cortex-A9, which has a RISC (Reduced Instruction Set Computer) architecture and a 32-bit instruction set. It also supports the Thumb-2 instruction set, which adds more compact 16-bit instructions. How is 16-bit an upgrade over 32-bit? Well, 16-bit instructions mean smaller code, even though they are decoded into equivalent 32-bit operations at runtime:

For performance optimised code Thumb-2 technology uses 31 percent less memory to reduce system cost, while providing up to 38 percent higher performance than existing high density code, which can be used to prolong battery-life or to enrich the product feature set.

The GPU is an “ultra low power” (ULP) 8-core GeForce. In essence, the package aims for high performance with low power consumption, exactly what is wanted for mobile computing.

Power is also saved by sophisticated power management. The package combines suspending parts of the system, clock gating, screen management and dynamic adjustment of voltage and frequency. The result is a system which NVIDIA claims is 25 to 50 times more efficient than a typical PC.
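The reasoning behind adjusting voltage and frequency together is worth spelling out. As a first-order approximation – this is the standard CMOS rule of thumb rather than an NVIDIA figure – dynamic power scales as

$$P_{\text{dynamic}} \approx \alpha \, C \, V^2 \, f$$

where $C$ is the switched capacitance, $V$ the supply voltage, $f$ the clock frequency and $\alpha$ the activity factor. Because running at a lower frequency usually allows the voltage to be lowered as well, and power depends on the square of the voltage, cutting frequency and voltage together saves far more energy than cutting frequency alone.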

According to NVIDIA, Tegra 2 enables web browsing up to two times faster than competitors such as the Qualcomm Snapdragon 8250 or Texas Instruments OMAP 3630 – though of course these companies also have new SoCs in preparation.

Tegra 2 is optimised for certain specific software. One example is the OpenGL graphics API. “The job of the GPU is to implement the logical pipeline defined by OpenGL”, I was told at an NVIDIA briefing.

image

I asked whether this meant that Tegra 2 is sub-optimal for Microsoft’s DirectX API; but NVIDIA says the two are sufficiently similar that it makes no difference.
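For a sense of what “the logical pipeline defined by OpenGL” looks like from the programmer’s side, here is a minimal sketch using WebGL, which is based on OpenGL ES 2.0 – a browser API rather than anything Tegra-specific, so treat it purely as an illustration of the vertex-shader, rasteriser and fragment-shader stages:

```typescript
// Minimal WebGL (OpenGL ES 2.0 style) example: one triangle pushed through
// the programmable pipeline - vertex shader, rasteriser, fragment shader.
const canvas = document.createElement("canvas");
document.body.appendChild(canvas);
const gl = canvas.getContext("webgl");
if (!gl) throw new Error("WebGL not available");

// Vertex shader: positions each vertex (geometry stage).
const vsSource = `
  attribute vec2 position;
  void main() { gl_Position = vec4(position, 0.0, 1.0); }`;

// Fragment shader: colours each rasterised pixel.
const fsSource = `
  precision mediump float;
  void main() { gl_FragColor = vec4(0.2, 0.6, 0.9, 1.0); }`;

function compile(type: number, source: string): WebGLShader {
  const shader = gl!.createShader(type)!;
  gl!.shaderSource(shader, source);
  gl!.compileShader(shader);
  return shader;
}

const program = gl.createProgram()!;
gl.attachShader(program, compile(gl.VERTEX_SHADER, vsSource));
gl.attachShader(program, compile(gl.FRAGMENT_SHADER, fsSource));
gl.linkProgram(program);
gl.useProgram(program);

// Upload one triangle and draw it through the pipeline.
gl.bindBuffer(gl.ARRAY_BUFFER, gl.createBuffer());
gl.bufferData(gl.ARRAY_BUFFER,
  new Float32Array([0, 1, -1, -1, 1, -1]), gl.STATIC_DRAW);
const loc = gl.getAttribLocation(program, "position");
gl.enableVertexAttribArray(loc);
gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);
gl.drawArrays(gl.TRIANGLES, 0, 3);
```

Whether the shaders run on an 8-core ULP GeForce or a desktop GPU, the logical stages are the same; that is the pipeline NVIDIA is referring to.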

Nevertheless, Tegra 2 has been designed with Android in mind, not Windows. There are a couple of reasons for this. The main one is that Android has all the momentum in the market; but apart from that, Microsoft partnered with Qualcomm for Windows Phone 7, which runs on Snapdragon, shutting out NVIDIA at the initial launch. NVIDIA is a long-term Microsoft partner and the shift from Windows Mobile to Android has apparently cost NVIDIA a lot of time. The shift took place around 18 months ago, when NVIDIA saw how the market was moving. That shift “cost us a year to a year and a half of products to market”, I was told – a delay which must include changes at every level from hardware optimisation, to designing the kind of package that suits the devices Android vendors want to build, to building up knowledge of Android in order to market effectively to hardware vendors.

Despite this focus, Microsoft demonstrated Windows 8 running on Tegra during Steve Ballmer’s keynote, so this should not be taken to mean that Windows or Windows CE will not run. I still found it interesting to hear this example of how deeply the industry has moved away from Microsoft’s mobile platform.

Microsoft should worry. NVIDIA foresees that “all of your computing needs are ultimately going to be surfaced through your mobile device”. Tegra 2 is a step along the way, since HDMI support is built-in, enabling high resolution displays. If you want to do desktop computing, you sit down at your desk, pop your mobile into a dock, and get on with your work or play using a large screen and a keyboard. It seems plausible to me.

During the press conference at CES we were shown an example of simultaneous, graphically rich gaming on a PC, a PlayStation 3 and a Tegra 2 smartphone.

image

Alongside Android, Tegra 2 is optimised for Adobe Flash. NVIDIA has been given full access to the source of the Flash player in order to deliver hardware acceleration.

image

image

Actual devices

What about actual devices? Two that were shown at CES are the LG Optimus 2X:

image

and the Motorola Atrix 4G:

image

Both sport impressive specifications, though the Guardian’s Charles Arthur, who attended a briefing on the Atrix 4G, expresses some scepticism about whether HD video (which needs a large display) and the full desktop version of Firefox are really necessary on a phone. Apparently the claimed battery life is only 8 hours; some of us might be willing to sacrifice a degree of that capability for longer battery life.

Still, while some manufacturers will get the balance between cost, features, size and battery life wrong, history tells us that we will find good ways to use all this new processing and graphics power, especially if we can get to the point where such a device, combined with cloud computing and a desktop dock, becomes the only client most of us need.

NVIDIA says that over 50 Android/Tegra 2 products are set to be released by mid-2011, in tablet as well as smartphone form factors. I’m guessing that at least some of these will be winners.