Tag Archives: cloud computing

Google’s Digital Garage, hosted by UK City Councils

I have recently moved into a new area and noticed that my (now) local city council was running a Google Digital Garage:

Winchester City Council is very excited to be partnering up with The Digital Garage from Google – a digital skills training platform to assist you in growing your business, career and confidence, online. Furthermore, a Google digital expert is coming to teach you what is needed to gain a competitive advantage in the ever changing digital landscape, so come prepared to learn and ask questions, too.

I went along as a networking opportunity and to learn more about Google’s strategy. The speaker was from Google partner Uplift Digital, “founded by Gori Yahaya, a digital and experiential marketer who had spent years working on behalf of Google, training and empowering thousands of SMEs, entrepreneurs, and young people up and down the country to use digital to grow their businesses and further their careers.”

I am not sure “digital garage” was the right name in this instance, as it was essentially a couple of presentations with not much interaction and no hands-on. The first session had three themes:

  • Understanding search
  • Manage your presence on Google
  • Get started with paid advertising

What we got was pretty much the official Google line on search: make sure your site performs well on mobile as well as desktop, use keywords sensibly, and leave the rest to Google’s algorithms. The second topic was mainly about Google’s local business directory called My Business. Part three introduced paid advertising, mainly covering Google AdWords. No mention of click fraud. Be wary of Facebook advertising, we were told, since advertising on Facebook may actually decrease your organic reach, it is rumoured. Don’t bother advertising on Twitter, said the speaker.


Session two was about other ways to maintain a digital presence, mainly looking at social media, along with a (rather unsatisfactory) introduction to Google Analytics. The idea is to become an online authority in what you do, we were told. Good advice. YouTube is the second most popular search engine, we were told, and we should consider posting videos there. The speaker recommended the iOS app YouTube Director for Business, a free tool which I later discovered is discontinued from 1st December 2017; it is being replaced by Director Onsite which requires you to spend $150 on YouTube advertising in order to post a video.

Overall I thought the speaker did a good job on behalf of Google and there was plenty of common sense in what was presented. It was a Google-centric view of the world, which, considering that the programme is (as far as I can tell) entirely funded by Google, is not surprising.

As you would also expect, the presentation was weak concerning Facebook, Twitter and other social media platforms. Facebook in particular seems to be critically important for many small businesses. One lady in the audience said she did not bother with a web site at all since her Facebook presence was already providing as many orders for her cake-making business as she could cope with.

We got a sanitised view of the online world which in reality is a pretty mucky place in many respects.

IT vendors have always been smart about presenting their marketing as training and it is an effective strategy.

The aspect that I find troubling is that this comes hosted and promoted by a publicly funded city council. Of course an independent presentation, or a session involving multiple companies with different perspectives, would be much preferable; but I imagine the offer of free training and ticking the box for “doing something about digital” is too sweet for hard-pressed councils to resist, even if it means turning a blind eye to Google’s ability to make big profits in the UK while paying little tax.

Google may have learned from Microsoft and its partners who once had great success in providing basic computer training which in reality was all about how to use Microsoft Office, cementing its near-monopoly.

Which Azure Stack is right for you?

I went in search of Azure Stack at Microsoft’s Ignite event. I found a few on display in the Expo. It is now shipping, and the Lenovo guy said they had sold a dozen or so already.

Why Azure Stack? Microsoft’s point is that it lets you run exactly the same application on premises or in its public cloud. The other thing is that although you have some maintenance burden – power, cooling, replacing bits if they break – it is pretty minimal; the configuration is done for you.

I talked to one of the vendors about the impact on VMware, which dominates the market for virtualisation in the datacentre. My sense in the VMware vs Hyper-V debate is that VMware still has an edge, particularly in its management tools, but Hyper-V is solid (aside from a few issues with Cluster Shared Volumes) and a lot less expensive. Azure Stack is Hyper-V of course; and the point the vendor made was that configuring an equivalent private cloud with VMware would be possible but hugely more expensive, not only in license cost but also in the skill needed to set it all up correctly.

So I think this is a smart move from Microsoft.

Why no Dell? They told me it was damaged in transit. Shame.

[Image: Lenovo]

[Image: Cisco]

[Image: HP Enterprise]

Microsoft financials: cloud good, Surface down, and “We had no material phone revenue this quarter”

Microsoft has released its financial results for the third quarter of its financial year. Revenue was up 8% year on year, and operating income up 6%. I’m always interested in the segmentation of the figures so here is a quick table:

Quarter ending March 31st 2017 vs quarter ending March 31st 2016, $millions

| Segment | Revenue | Change | Operating income | Change |
| --- | --- | --- | --- | --- |
| Productivity and Business Processes | 7958 | +1437 | 2783 | -198 |
| Intelligent Cloud | 6763 | +667 | 2181 | +5 |
| More Personal Computing | 8836 | -703 | 2097 | +346 |
| Corporate and Other | -1467 | +158 | -1467 | +158 |

There is a bit more detail in the earnings slide:

[Image: earnings slide]

A few points of note:

Cloud growth remains on track. Office 365 business revenue is up 45% year on year, according to Microsoft. Dynamics 365 revenue is up 81%. Azure revenue is up 93%. Of course these figures are offset by static or declining sales of on-premises licenses, though Microsoft does not spell this out precisely.

Windows is not doing too badly, despite continuing weakness in the PC market. OEM revenue up 5%, which the company attributes to “a higher mix of premium SKUs”. Surface is weak. Revenue is down 26%. Microsoft blames “heightened price competition and product end of lifecycle dynamics.” The truth is that the Surface range is not good value versus the competition. There should be a perfect marriage of hardware and software, given that it is all Microsoft, but instead there have been too many little issues. The likes of HP and Dell do a better job at lower price and with easier upgradeability.

“We had no material phone revenue this quarter” says Microsoft. I remain sad about the killing of Windows Phone, and regard it as a mistake, but that is a done deal.

Xbox is doing OK. Xbox Live revenue growth has offset declining hardware sales.

Search revenue is up 8%. Nobody pays for search, so this is about advertising. Windows 10 drives users to “Cortana” search, and Edge defaults to Bing. Users can easily find their defaults changed inadvertently, which is annoying, but Microsoft has a tough competitor (Google).


A reminder of Microsoft’s segments:

Productivity and Business Processes: Office, both commercial and consumer, including retail sales, volume licenses, Office 365, Exchange, SharePoint, Skype for Business, Skype consumer, OneDrive, Outlook.com. Microsoft Dynamics including Dynamics CRM, Dynamics ERP, both online and on-premises sales.

Intelligent Cloud: Server products not mentioned above, including Windows server, SQL Server, Visual Studio, System Center, as well as Microsoft Azure.

More Personal Computing: What a daft name, more than what? Still, this includes Windows in all its non-server forms, Windows Phone both hardware and licenses, Surface hardware, gaming including Xbox, Xbox Live, and search advertising.

AWS Summit London 2016: no news but strong content, and a little bit of Echo

I attended day two (the developer day) of the Amazon Web Services Summit at the ExCel conference centre in London yesterday. A few quick observations.

It was a big event. I am not sure how many attended but heard “10,000” being muttered. I was there last year as well, and the growth was obvious. The exhibition has spilled out of its space to occupy part of an upper mezzanine floor as well. The main auditorium was packed.


Amazon does not normally announce much news at these events, and this one conformed to the pattern. It is a secretive company when it comes to future plans. The closest thing to news was when AWS UK and Ireland MD Gavin Jackson said that Amazon will go ahead with its UK region despite the referendum on leaving the EU.

CTO Dr Werner Vogels gave a keynote. It was mostly marketing, which disappointed me, since Vogels is a technical guy with lots he could have said about AWS technology; but hey, this was a free event, so what do you expect? That said, the latter part of the keynote was more interesting, when he talked about different models of cloud computing, and I will be writing this up for the Register shortly.

Otherwise this was a good example of a vendor technical conference, with plenty of how-to sessions that would be helpful to anyone getting started with AWS. The sessions I attended were pitched at a fairly high level, even the ones described as “deep dive”, but you could always approach the speaker afterwards with your trickier issues. The event was just as good as some others for which you have to pay a fee.

The sessions I attended on DevOps, containers, microservices, and AWS Lambda (serverless computing) were all packed, with containers perhaps drawing the biggest crowd.

At the end of the day I went to a smaller session on programming for Amazon Echo, the home voice control device which you cannot get in the UK. The speaker refused to be drawn on when we might get it, but I suppose the fact that Amazon ran the session suggests that it will appear in the not too distant future. I found this session thought-provoking. It was all about how to register a keyword with Amazon so that when a user says “Alexa what’s new with [mystuff]” then the mystuff service will be invoked. Amazon’s service will send your service the keywords (defined by you) that it detects in the question or interaction and you send back a response. The trigger word – called the Invocation Name – has to be registered with Amazon and I imagine there could be big competition for valuable ones. It is all rather limited at the moment; you cannot create a commercial service, for example, not even for ordering pizzas. Check out the Alexa Skills Kit for more.
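To make that flow more concrete, here is a minimal sketch of what a Lambda-hosted handler for a hypothetical “mystuff” skill might look like. The intent name and response text are my own illustrations rather than anything Amazon showed, and the JSON field names follow the Alexa Skills Kit request/response format as I understand it, so treat this as a sketch rather than production code:

```typescript
// Hypothetical handler for a skill whose Invocation Name is "mystuff".
// Amazon's service sends a JSON request identifying the detected intent and
// any slot values; the handler replies with text for Alexa to speak back.
interface AlexaRequest {
  request: {
    type: string;                                  // e.g. "IntentRequest"
    intent?: { name: string; slots?: Record<string, { value?: string }> };
  };
}

export const handler = (
  event: AlexaRequest,
  context: { succeed(response: unknown): void }    // Node.js-era Lambda context
) => {
  const intent = event.request.intent?.name ?? "unknown";
  const reply = {
    version: "1.0",
    response: {
      outputSpeech: { type: "PlainText", text: `Here is what is new with ${intent}.` },
      shouldEndSession: true,
    },
  };
  context.succeed(reply);                          // hand the response back to Alexa
};
```

The point is that the heavy lifting – speech recognition and matching against the Invocation Name – happens on Amazon’s side; your service only receives structured JSON and returns structured JSON.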

Presuming commercial usage does come, there are some interesting issues around identity, authentication, and preventing unauthorised or inappropriate use. Echo does allow ordering from Amazon, and you can optionally set a voice PIN, but I would have thought a voice PIN is not much use if you want to stop children ordering stuff, for example, since they will hear it. If you watch your email, you would see the confirming email from Amazon and could quickly cancel if it were a problem. The security here seems weak though; it would be better to have an approval text sent to a mobile, for example, so that there is some real control.

Overall, AWS is still on a roll, and I did not hear a single thing about security concerns or the risks of putting all your eggs in Amazon’s basket. I wonder if fears have gone from being overblown to under-recognised? In the end these considerations are not quantifiable, which makes the risks hard to assess.

I could not help but contrast this AWS event to one I attended on Microsoft Azure last month. AzureCraft benefited from the presence of corporate VP Scott Guthrie but it was a tiny event in comparison to Amazon’s effort. If Microsoft is serious about competing with AWS it needs to rethink its events and put them on directly rather than working through user groups that have a narrow membership (AzureCraft was put on by the UK Azure User Group).

Microsoft Financials: steady, but a turning point as on-premises server business declines

Microsoft has announced its latest financials, and I have made a quick table summarising the year-on-year comparison for the quarter. See the end of this post for what the confusing segment categories represent.

Quarter ending March 31st 2016 vs quarter ending March 31st 2015, $millions

| Segment | Revenue | Change | Operating income | Change |
| --- | --- | --- | --- | --- |
| Productivity and Business Processes | 6522 | +65 | 2994 | -210 |
| Intelligent Cloud | 6096 | +193 | 2188 | -345 |
| More Personal Computing | 9458 | +89 | 1645 | +596 |
| Corporate and Other | -1545 | -1545 | -1544 | -1352 |

A few observations.

Overall the figures are flat. That is not a bad result if you think of Microsoft as a PC company, considering that the PC is in decline; disappointing if you think of Microsoft as a cloud company. The answer is that one is offsetting the other, which is not too bad.

Microsoft says that revenue and income would be up, were it not for currency fluctuations. Of course there is that big hit in “Corporate and other” which is “net revenue deferral related to Windows 10 of $1.6 billion,” according to the earnings statement.

On-premises server business is in retreat. It is not possible to migrate customers to the cloud while at the same time growing on-premises business. That truth finally showed up in Microsoft’s figures. CFO Amy Hood referred to a “larger than expected decline in our transactional on premise server business” in the earnings call.

Margins are not so good in cloud. Selling software licenses is almost all profit, once the software is developed. Not so with cloud, which requires data centres, networking, and ongoing maintenance. “Our company gross margin percentage declined this quarter driven by our accelerating mix of cloud services in our Intelligent Cloud and Productivity and Business Processes segment offset by higher gross margin percentage performance from products within More Personal Computing,” said Hood.

Office 365 continues to grow. CEO Satya Nadella said that “Commercial Office 365 customers surpassed 70 million monthly active users and we grew seats by 57 percent” year on year. This is key to the company’s health. Customers in Office 365 are hooked to the platform and more likely to buy other services such as Dynamics CRM, Enterprise Mobility Services MDM (Mobile Device Management), or applications hosted on Azure. “Dynamics CRM Online seats more than doubled this quarter with over 80 percent of our new CRM customers deploying in the cloud,” said Nadella.

Windows 10 is being taken up. The nagware is working according to Nadella, who said that “The number of Windows 10 devices is twice that of Windows 7 over the same time period since launch.” Nevertheless I still hear a lot of caution out there, with people advising one another to stick with Windows 7. Windows 10 pushes users more strongly to Microsoft services than 7, with Cortana driven by Bing. “Over 35 percent of our search revenue last month came from Windows 10 devices,” said Nadella.

Windows Phone is dying fast. “For phone we expect year over year revenue declines to deepen in Q4 as we work through our Lumia channel position,” said Hood.

Linux is growing. Nadella made a few comments about SQL Server on Linux and Linux on Azure. Why SQL Server on Linux? “We look at that as an expansion opportunity,” he said. Over 20% of VMs on Azure are Linux, he added. Microsoft made Linux “first class” on Azure in order to be able to host an enterprise’s “entire data estate across Windows and Linux.” People don’t move between operating systems, he said, but “now they have a choice around database.”

I’d add that we are now seeing scenarios where Linux is ahead of Windows on Azure. The new Azure Container service is currently Linux only, for example, though a Windows option is planned.

What Microsoft does with Linux in the coming years will be interesting to see. Office on Linux? Microsoft Android?

A reminder of Microsoft’s segments:

Productivity and Business Processes: Office, both commercial and consumer, including retail sales, volume licenses, Office 365, Exchange, SharePoint, Skype for Business, Skype consumer, OneDrive, Outlook.com. Microsoft Dynamics including Dynamics CRM, Dynamics ERP, both online and on-premises sales.

Intelligent Cloud: Server products not mentioned above, including Windows server, SQL Server, Visual Studio, System Center, as well as Microsoft Azure.

More Personal Computing: What a daft name, more than what? Still, this includes Windows in all its non-server forms, Windows Phone both hardware and licenses, Surface hardware, gaming including Xbox, Xbox Live, and search advertising.

Microsoft financials Jan-March 2015

Microsoft has released figures for its third quarter, ending March 31st 2015. Here is my simple summary of the figures showing the segment breakdown:

Quarter ending March 31st 2015 vs quarter ending March 31st 2014, $millions

| Segment | Revenue | Change | Gross margin | Change |
| --- | --- | --- | --- | --- |
| Devices and Consumer Licensing | 3476 | -1121 | 3210 | -807 |
| Computing and Gaming Hardware | 1800 | -72 | 414 | +156 |
| Phone Hardware | 1397 | N/A | -4 | N/A |
| Devices and Consumer Other | 2280 | +456 | 566 | +175 |
| Commercial Licensing | 10036 | -299 | 9975 | -157 |
| Commercial Other | 2760 | +858 | 1144 | +669 |

The figures form a familiar pattern: Windows and shrink-wrap (non-subscription) Office are down, reflecting weak PC sales and the advent of free Windows at the low end, but subscription sales are up and cloud is booming. See the foot of this post for an explanation of Microsoft’s confusing segment breakdown.

Microsoft says that Surface Pro 3 is doing well (revenue of $713 million) and this is reflected in the Devices figures. Commercial cloud (Office 365, Azure and Dynamics) is up 106% year on year.

Cloud aside, it is impressive that server products reported a 12% year on year increase in revenue. This is the kind of business that you would expect to be hit by cloud migration, though I am not sure how Microsoft accounts for things like SQL Server licenses deployed on Azure.

Xbox One is disappointing, bearing in mind the success of the Xbox 360. Microsoft managed to lose out to Sony’s PlayStation 4 with its botched launch and market share will be hard to claw back.

Microsoft reports 8.6 million Lumias sold, the majority being low-end devices. Not too bad for a platform many dismiss, but still treading water and miles behind iOS and Android.

The company remains a huge money-making machine though, and Office 365 is doing well. A few years ago it looked as if cloud and mobile could destroy Microsoft, but so far that is not the case at all, though its business is changing.

Microsoft’s segments summarised

Devices and Consumer Licensing: non-volume and non-subscription licensing of Windows, Office, Windows Phone, and “related patent licensing; and certain other patent licensing revenue” – all those Android royalties?

Computing and Gaming Hardware: the Xbox One and 360, Xbox Live subscriptions, Surface, and Microsoft PC accessories.

Devices and Consumer Other: Resale, including Windows Store, Xbox Live transactions (other than subscriptions), Windows Phone Marketplace; search advertising; display advertising; Office 365 Home Premium subscriptions; Microsoft Studios (games), retail stores.

Commercial Licensing: server products, including Windows Server, Microsoft SQL Server, Visual Studio, System Center, and Windows Embedded; volume licensing of Windows, Office, Exchange, SharePoint, and Lync; Microsoft Dynamics business solutions, excluding Dynamics CRM Online; Skype.

Commercial Other: Enterprise Services, including support and consulting; Office 365 (excluding Office 365 Home Premium), other Microsoft Office online offerings, and Dynamics CRM Online; Windows Azure.

AWS Summit London: cloud growth, understanding Lambda, Machine Learning

I attended the Amazon Web Services (AWS) London Summit. Not much news there, since the big announcements were the week before in San Francisco, but a chance to drill into some of the AWS services and keep up to date with the platform.


The keynote by CTO Werner Vogels was a bit too much relentless promotion for my taste, but I am interested in the idea he put forward that cloud computing will gradually take over from on-premises and that more and more organisations will go “all in” on Amazon’s cloud. He instanced some examples (Netflix, Intuit, Tibco, Splunk) though I am not quite clear whether these companies have 100% of their internal IT systems on AWS, or merely that they run the entirety of their services (their product) on AWS. The general argument is compelling, especially when you consider the number of services now on offer from AWS and the difficulty of replicating them on-premises (I wrote this up briefly on the Reg). I don’t swallow it wholesale though; you have to look at the costs carefully, but even more than security, the loss of control when you base your IT infrastructure on a public cloud provider is a negative factor.

As it happens, the ticket systems for my train into London were down that morning, which meant that purchasers of advance tickets online could not collect their tickets.


The consequences of this outage were not too serious, in that the trains still ran, but of course there were plenty of people travelling without tickets (I was one of them) and ticket checking was much reduced. I am not suggesting that this service runs on AWS (I have no idea) but it did get me thinking about the impact on business when applications fail; and that led me to the question: what are the long-term implications of our IT systems and even our economy becoming increasingly dependent on a (very) small number of companies for their health? It seems to me that the risks are difficult to assess, no matter how much respect we have for the AWS engineers.

I enjoyed the technical sessions more than the keynote. I attended Dean Bryen’s session on AWS Lambda, “Event-driven code in the cloud”, where I discovered that the scope of Lambda is greater than I had previously realised. Lambda lets you write code that runs in response to events, but what is also interesting is that it is a platform as a service offering, where you simply supply the code and AWS runs it for you:

AWS Lambda runs your custom code on a high-availability compute infrastructure and administers all of the compute resources, including server and operating system maintenance, capacity provisioning and automatic scaling, code, and security patches.

This is a different model than running applications in EC2 (Elastic Compute Cloud) VMs or even in Docker containers, which are also VM based. Of course we know that Lambda ultimately runs in VMs as well, but these details are abstracted away and scaling is automatic, which arguably is a better model for cloud computing. Azure Cloud Services or Heroku apps are somewhat like this, but neither is very pure; with Azure Cloud Services you still have to worry about how many VMs you are using, and with Heroku you have to think about dynos (app containers). Google App Engine is another example and autoscales, though you are charged by application instance count so you still have to think in those terms. With Lambda you are charged based on the number of requests, the duration of your code, and the amount of memory allocated, making it perhaps the best abstracted of all these PaaS examples.

But Lambda is just for event-handling, right? Not quite; it now supports synchronous as well as asynchronous event handling and you could create large applications on the service if you chose. It is well suited to services for mobile applications, for example. Java support is on the way, as an alternative to the existing Node.js support. I will be interested to see how this evolves.
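As an aside, the event-driven model is easy to picture in code. Here is a minimal sketch of a handler in the Node.js style Lambda supported at the time (written as TypeScript for clarity); the event shape is an assumption for the example, and in practice you would wire the function to a real event source such as S3 or Kinesis:

```typescript
// Minimal event-driven handler: Lambda invokes it when the configured event
// fires and allocates the compute automatically; there is no server to manage.
export const handler = (
  event: { records?: unknown[] },                 // illustrative event shape
  context: { succeed(result: unknown): void }     // Node.js-era completion callback
) => {
  console.log("Received event:", JSON.stringify(event));
  const processed = (event.records ?? []).length; // do the real work here
  // Charges are based on requests, duration and memory, so the incentive is
  // to keep handlers small and let the platform worry about scaling.
  context.succeed({ processed });
};
```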

I also went along to Carlos Conde’s session on Amazon Machine Learning (one instance in which AWS has trailed Microsoft Azure, which already has a machine learning service). Machine learning is not that easy to explain in simple terms, but I thought Conde did a great job. He showed us a spreadsheet which was a simple database of contacts with fields for age, income, location, job and so on. There was also a Boolean field for whether they had purchased a certain financial product after it had been offered to them. The idea was to feed this spreadsheet to the machine learning service, and then to upload a similar table but of different contacts and without the last field. The job of the service was to predict whether or not each contact listed would purchase the product. The service returned results with this field populated along with a confidence indicator. A simple example with obvious practical benefit, presuming of course that the prediction has reasonable accuracy.
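To illustrate the shape of that exercise, here is a purely hypothetical sketch; the types and the predict stub are my own stand-ins for the managed service, not the real Amazon Machine Learning API:

```typescript
// Illustrative types: training rows carry the known outcome, scoring rows do
// not, and the service returns a predicted label plus a confidence figure.
interface Contact { age: number; income: number; location: string; job: string }
interface TrainingRow extends Contact { purchased: boolean }  // known outcome, used for training
interface Prediction { predictedLabel: boolean; confidence: number }

// Stub standing in for the trained model hosted by the service.
async function predict(modelId: string, record: Contact): Promise<Prediction> {
  // The real service would score the record against the trained model.
  return { predictedLabel: record.income > 40000, confidence: 0.72 };
}

predict("demo-model", { age: 42, income: 55000, location: "London", job: "teacher" })
  .then(p => console.log(`Likely to buy: ${p.predictedLabel} (confidence ${p.confidence})`));
```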

Reserved IPs and other Microsoft Azure annoyances

I have been doing a little work with Microsoft’s Azure platform recently. A common requirement is that you want a VM which is internet-accessible with a custom domain, for which the best solution is to create an A record in your DNS pointing to the IP number of the VM. In order to do this reliably, you need to reserve an IP number for the VM; otherwise Azure may assign a different IP number if you shut it down and later restart it. If you keep it running you can keep the IP number, but this also means you have to pay for the VM continuously.

Azure now offers reserved IP numbers. Useful; but note that you can only link a VM with a reserved IP number when it is created, and to do this you have to create the VM with PowerShell.

What if you want to assign a reserved IP number to an existing VM? One suggestion is that you can capture an image from the VM, and then create a new VM from the image, complete with reserved IP. I went partially down this route but came unstuck because Azure for some reason captured the image into a different region (West Europe) than the region where the VM used to be (North Europe). When I ran the magic PowerShell script, it complained that the image was in the wrong region. I then found a post explaining how to move images between regions, which I did, but the metadata of the moved image was not quite the same and creating a new VM from the image did not work. At this point I realised that it would be easier to recreate the VM from scratch.

Note that when reserved IP numbers were announced in May 2014, program manager Mahesh Thiagarajan said:

The platform doesn’t support reserving the IP address of the existing Cloud Services or Virtual machines. We expect to announce support for this in the near future.

You can debate what is meant by “near future” and whether Microsoft has already failed this expectation.

There is another wrinkle here that I am not clear about. Some Azure VMs have special pricing, such as those with SQL Server pre-installed. The special pricing is substantial, often forming the largest part of the price, since it includes licensing fees. What happens to the special pricing if you fiddle with cloning VMs, creating new VMs with existing VHDs, moving VMs between regions, or the like? If the special pricing is somehow lost, how do you restore it so SQL Server (for example) is still properly licensed? I imagine this would mean a call to support. I have not seen any documentation relating to this in posts like this about moving a virtual machine into a virtual network.

And there’s another thing. If you want your VM to be in a virtual network, you have to do that when you create it as well; it is a similar problem.

While I am in complaining mode, here is another. Creating a VM with PowerShell is easy enough, but you do need to know the image name you are using. This is not shown in the friendly portal GUI:

[Image: the Azure portal GUI]

In order to get the image names, I ran a PowerShell script that exports the available images to a file. I was surprised how many there are: the resulting output has around 13,500 lines and finding what you want is tedious.

Azure is mostly very good in my experience, but I would like to see these annoyances fixed. I would be interested to hear of other things that make the cloud admin or developer’s life harder than it should be.

Quick reflections on Amazon re:Invent, open source, and Amazon Web Services

Last week I was in Las Vegas for my first visit to Amazon’s annual developer conference re:Invent. There were several announcements, the biggest being a new relational database service called RDS Aurora – a drop-in replacement for MySQL but with 3x write performance and 5x read performance as well as resiliency benefits – and EC2 Container Service, for deploying and managing Docker app containers. There is also AWS Lambda, a service which runs code in response to events.

You could read this news anywhere, but the advantage of being in Vegas was to immerse myself in the AWS culture and get to know the company better. Amazon is both distinctive and disruptive, and three things that its retail operation and its web services have in common are large scale, commodity pricing, and customer focus.

Customer focus? Every company I have ever spoken to says it is customer focused, so what is different? Well, part of the press training at Amazon seems to be that when you ask about its future plans, the invariable answer is “what customers demand.” No doubt if you could eavesdrop at an Amazon executive meeting you would find that this is not entirely true, that there are matters of strategy and profitability which come into play, but this is the story the company wants us to hear. It also chimes with that of the retail operation, where customer service is generally excellent; the company would rather risk giving a refund or replacement to an undeserving customer and annoy its suppliers than vice versa. In the context of AWS this means something a bit different, but it does seem to me part of the company culture. “If enough customers keep asking for something, it’s very likely that we will respond to that,” marketing executive Paul Duffy told me.

That said, I would not describe Amazon as an especially open company, which is one reason I was glad to attend re:Invent. I was intrigued for example that Aurora is a drop-in replacement for an open source product, and wondered if it actually uses any of the MySQL code, though it seems unlikely since MySQL’s GPL license would require Amazon to publish its own code if it used any MySQL code; that said, the InnoDB storage engine code at least used to be available under a dual license so it is possible. When I asked Duffy though he said:

We don’t … at that level, that’s why we say it is compatible with MySQL. If you run the MySQL compatibility tool that will all check out. We don’t disclose anything about the inner workings of the service.

This of course touches on the issue of whether Amazon takes more from the open source community than it gives back.

[Image: Senior VP of AWS Andy Jassy]

Someone asked Senior VP of AWS Andy Jassy, “what is your strategy of contributing to the open source ecosystem”, to which he replied:

We contribute to the open source ecosystem for many years. Xen, MySQL space, Linux space, we’re very active contributors, and will continue to do so in future.

That was it, that was the whole answer. Aurora, despite Duffy’s reticence, seems to be a completely new implementation of the MySQL API and builds on its success and popularity; could Amazon do more to share some of its breakthroughs with the open source community from which MySQL came? I think that is arguable; but Amazon is hard to hate since it tends to price so competitively.

Is Amazon worried about competition from Microsoft, Google, IBM or other cloud providers? I heard this question asked on several occasions, and the answer was generally along the lines that AWS is too busy to think about it. Again this is perhaps not the whole story, but it is true that AWS is growing fast and dominates the market to the extent that, say, Azure’s growth does not keep it awake at night. That said, you cannot accuse Amazon of complacency since it is adding new services and features at a high rate; 449 so far in 2014 according to VP and Distinguished Engineer James Hamilton, who also mentioned 99% usage growth in EC2 year on year, over 1,000,000 active customers, and 132% data transfer growth in the S3 storage service.

Cloud thinking

Hamilton’s session on AWS Innovation at Scale was among the most compelling of those I attended. His theme was that cloud computing is not just a bunch of hosted servers and services, but a new model of computing that enables new and better ways to run applications that are fast, resilient and scalable. Aurora is actually an example of this. Amazon has separated the storage engine from the relational engine, he explained, so that only deltas (the bits that have changed) are passed down for storage. The data is replicated 6 times across three Amazon availability zones, making it exceptionally resilient. You could not implement Aurora on-premises; only a cloud provider with huge scale can do it, according to Hamilton.

[Image: Distinguished Engineer James Hamilton]

Hamilton was fascinating on the subject of networking gear – the cards, switches and routers that push bits across the network. Five years ago Amazon decided to build its own, partly because it considered the commercial products to be too expensive. Amazon developed its own custom network protocol stack. It worked out a lot cheaper, he said, since “even the support contract for networking gear was running into 10s of millions of dollars.” The company also found that reliability increased. Why was that? Hamilton quipped about how enterprise networking products evolve:

Enterprise customers give lots of complicated requirements to networking equipment producers who aggregate all these complicated requirements into 10s of billions of lines of code that can’t be maintained and that’s what gets delivered.

Amazon knew its own requirements and built for those alone. “Our gear is more reliable because we took on an easier problem,” he said.

AWS is also in a great position to analyse performance. It runs so much kit that it can see patterns of failure and where the bottlenecks lie. “We love metrics,” he said. There is an analogy with the way the popularity of Google search improves Google search; it is a virtuous circle that is hard for competitors to replicate.

Closing reflections

Like all vendor-specific conferences there was more marketing than I would have liked at re:Invent, but there is no doubting the excellence of the platform and its power to disrupt. There are aspects of public cloud that remain unsettling; things can go wrong and there will be nothing you can do but wait for them to be fixed. The benefits though are so great that it is worth the risk – though I would always advocate having some sort of plan B, whether off-cloud or a backup with another cloud provider, if that is feasible.

How is Microsoft Azure doing? Some stats from Satya Nadella and Scott Guthrie

Microsoft financials are hard to parse these days, with figures broken down into broad categories that reveal little about what is succeeding and what is not.

[Image: CEO Satya Nadella speaks in San Francisco]

At a cloud platform event yesterday in San Francisco, CEO Satya Nadella and VP of cloud and enterprise Scott Guthrie offered some figures. Here is what I gleaned:

  • Projected revenue of $4.4Bn if current trends continue (“run rate”)
  • Annual investment of $4.5Bn
  • Over 10,000 new customers per week
  • 1,200,000 SQL databases
  • Over 30 trillion storage objects
  • 350 million users in Azure Active Directory
  • 19 Azure datacentre regions, up to 600,000 servers in each region


Now, one observation from the above is that Microsoft says it is spending more on Azure than it is earning – not unreasonable at a time of fast growth.

However, I do not know how complete the figures are. Nadella said Office 365 runs on Azure (though this may be only partially true; that certainly used to be the case); but I doubt that all Office 365 revenue is included in the above.

What about SQL Server licensing, for example, does Microsoft count it under SQL Server, or Azure, or both depending which marketing event it is?

If you know the answer to this, I would love to hear.

At the event, Guthrie (I think) made a bold statement. He said that there would be only three vendors in hyper-scale cloud computing: Microsoft, Amazon and Google.

IBM for one would disagree; but there are huge barriers to entry even for industry giants.

I consider Microsoft’s progress extraordinary. Guthrie said that it was just two years ago that he announced the remaking of Azure – this is when things like Azure stateful VMs and the new portal arrived. Prior to that date, Azure stuttered.

Now, here is journalist and open source advocate Matt Asay:

Microsoft used to be evil. Then it was irrelevant. Now it looks like a winner.

He quotes Bill Bennett:

Microsoft has created a cloud computing service that makes creating a server as simple as setting up a Word document

New features are coming apace to Azure, and Guthrie showed this slide of what has been added in the last 12 months:

[Image: slide of Azure features added in the last 12 months]

The synergy of Azure with Visual Studio, Windows Server and IIS is such that it is a natural choice for Microsoft-platform developers hosting web applications, and Azure VMs are useful for experimentation.

Does anything spoil this picture? Well, when I sat down to write what I thought would be a simple application, I ran into familiar problems: half-baked samples, ever-changing APIs and libraries, beta code evangelised by Microsoft folk with little indication of what to do if you would rather not use it in production, and so on.

There is also a risk that as Azure services multiply, working out what to use and when becomes harder, and complexity increases.

Azure also largely means Windows – and yes, I heard yesterday that 20% of Azure VMs run Linux – but if you have standardised on Linux servers and use a Mac or Linux for development, Azure looks to me less attractive than AWS which has more synergy with that approach.

Still, it is a bright spot in Microsoft’s product line and right now I expect its growth to continue.