Generating code for simple SQL Server data access without Entity Framework, works with .NET Core

I realise that Microsoft’s Entity Framework is the most common approach for data access in the .NET world, but I have also always had good results from a simple manual approach using DbConnection, DbCommand and DataReader objects, and I like the fact that I can see and control exactly what SQL gets executed. If you prefer using Entity Framework or another abstraction, that is fine; please stop reading now!

One snag with this more manual approach is that you have to write tedious code to build SQL statements. I figured that someone must have written a utility application to generate this code, but could not find one quickly, so I wrote my own. It supports both C# and Visual Basic. The utility connects to a database and lets you generate a class for each table, along with code for retrieving and saving these objects, ready for modification. Here you can see a generated class:

image

and here is an example of the generated data access code:

image

This is NOT complete code (otherwise I would be perilously close to writing my own ORM) but simply automates creating SQL parameters and SQL statements.

One of my thoughts was that this code should work well with .NET Core. System.Data.SqlClient implements the required classes. Here is my code for retrieving an author object, mostly generated by my utility:

public static ClsAuthor GetAuthor(string authorID)
{
    // Requires the System.Data and System.Data.SqlClient namespaces
    SqlConnection conn = new SqlConnection(ConnectString);
    SqlCommand cmd = new SqlCommand();
    SqlDataReader dr;
    ClsAuthor TheAuthor = new ClsAuthor();
    try
    {
        // Parameterised query, so no SQL injection risk
        cmd.CommandText = "Select * from Authors where au_id = @auid";
        cmd.Parameters.Add("@auid", SqlDbType.Char);
        cmd.Parameters[0].Value = authorID;
        cmd.Connection = conn;
        cmd.Connection.Open();

        dr = cmd.ExecuteReader();

        if (dr.Read())
        {
            // Populate the object from the current row
            TheAuthor.Auid = GetSafeDbString(dr, "au_id");
            TheAuthor.Aulname = GetSafeDbString(dr, "au_lname");
            TheAuthor.Aufname = GetSafeDbString(dr, "au_fname");
            TheAuthor.Phone = GetSafeDbString(dr, "phone");
            TheAuthor.Address = GetSafeDbString(dr, "address");
            TheAuthor.City = GetSafeDbString(dr, "city");
            TheAuthor.State = GetSafeDbString(dr, "state");
            TheAuthor.Zip = GetSafeDbString(dr, "zip");
            TheAuthor.Contract = GetSafeDbBool(dr, "contract");
        }
    }
    finally
    {
        // Closing the connection also closes the reader
        conn.Close();
        conn.Dispose();
    }

    return TheAuthor;
}
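The GetSafeDbString and GetSafeDbBool helpers called above simply wrap null handling; a minimal version might look something like this (my sketch here, rather than the utility’s exact generated output):

static string GetSafeDbString(SqlDataReader dr, string column)
{
    // Return an empty string rather than throwing on a NULL value
    int i = dr.GetOrdinal(column);
    return dr.IsDBNull(i) ? string.Empty : dr.GetString(i);
}

static bool GetSafeDbBool(SqlDataReader dr, string column)
{
    // Treat NULL as false
    int i = dr.GetOrdinal(column);
    return !dr.IsDBNull(i) && dr.GetBoolean(i);
}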

Everything worked perfectly and I soon had a table showing the authors, using ASP.NET MVC.

In order to verify that it really does work with .NET Core, I moved the project to Visual Studio for Mac and ran it there:

image

I may be unusual, but I am reassured that I have a relatively painless way to write a database application for .NET Core without using Entity Framework.

Microsoft updates the .NET stack with .NET Core 2.0 and updated Visual Studio. Should you use it?

Microsoft has released .NET Core 2.0, a major update to its open source, cross-platform version of the .NET runtime and C# language.

New features include an implementation of .NET Standard 2.0 (a way of targeting code to run under multiple .NET platforms) and new platform support, including Debian Stretch, macOS High Sierra and SUSE Linux Enterprise Server 12 SP2. There is preview support for both Linux and Windows on ARM32.

.NET Core 2.0 now supports Visual Basic as well as C# and F#. The version of C# has been bumped to 7.1, including async Main method support, inferred tuple names and default expressions.
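For illustration, here is a minimal console program exercising those three features; the URL fetched is arbitrary:

using System;
using System.Net.Http;
using System.Threading.Tasks;

class Program
{
    // async Main is new in C# 7.1
    static async Task Main()
    {
        using (var client = new HttpClient())
        {
            string page = await client.GetStringAsync("https://www.microsoft.com");
            Console.WriteLine($"Fetched {page.Length} characters");
        }

        // Inferred tuple names: elements take the names of the variables
        var first = "Marjorie"; var last = "Green";
        var author = (first, last);
        Console.WriteLine($"{author.first} {author.last}");

        // Default literal expressions: the type is inferred from the target
        int count = default;   // 0
        string zip = default;  // null
        Console.WriteLine($"{count} {zip == null}");
    }
}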

Microsoft has also released Visual Studio 2017 version 15.3, which is required if you want to use .NET Core 2.0. New Visual Studio features include Azure Stack support, C# 7.1 support, .NET Framework 4.7 support, and other new features and fixes.

I updated Visual Studio and downloaded the new .NET Core 2.0 SDK and was soon up and running.

image

Note the statement that “This product collects usage data”, of which more below.

image

The sample ASP.NET MVC application worked first time.

image

How is .NET Core doing? The whole .NET picture is desperately confusing, and I get the impression that most .NET developers, while they may have paid some attention to what is happening, have concluded that the safe path is to continue with the Windows-only .NET Framework.

At the same time, .NET Core is strategically important to Microsoft. Cross-platform support means that C# has a life on the Mac and on Linux, which is vital to its health considering the popularity of the Mac amongst developers, and of Linux as a deployment platform for web applications. Visual Studio for Mac has also been updated and supports .NET Core 2.0 in the new version.

Another key piece is the container trend. .NET Core is ideal for container deployment, and the only version of .NET supported in Windows Nano Server. If you want to embrace microservices running in containers, while still developing with C#, .NET Core and Nano Server is the optimum solution.

Why not use .NET Core, especially since ASP.NET Core is faster than ASP.NET? In these comparisons, .NET Core comes out substantially faster than the .NET Framework for various algorithms – 600 times faster in one case.
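I have not reproduced those benchmarks, but the general approach is straightforward: compile the identical code once for the .NET Framework and once for .NET Core, and time it. A minimal sketch, with an arbitrary CPU-bound workload standing in for the algorithms in the linked comparisons:

using System;
using System.Diagnostics;

class Program
{
    static void Main()
    {
        var sw = Stopwatch.StartNew();

        // Arbitrary workload; the point is to run the same code on both
        // runtimes and compare elapsed times
        double total = 0;
        for (int i = 1; i <= 100_000_000; i++)
        {
            total += Math.Sqrt(i);
        }

        sw.Stop();
        Console.WriteLine($"{total:F0} computed in {sw.ElapsedMilliseconds}ms");
    }
}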

The main issue is compatibility. .NET Core is a subset of the .NET Framework, and being a relative newcomer, it lacks the same level of third-party support.

Another factor is that there is no support for desktop applications, though some solutions have been devised. Microsoft does have a cross-platform GUI story, in Xamarin Forms, which is now in preview for macOS alongside iOS, Android, Windows and Tizen. If Xamarin used .NET Core that would be a great solution, but it does not (though it does support .NET Standard 2.0).

One of the pieces that most concerns developers is data access. If you use .NET Core you are strongly guided towards Entity Framework Core, a fork of Microsoft’s ORM (Object-Relational Mapping) framework. Someone asked on this page, is EF Core usable? Here’s an answer from one user (11 days ago):

Answering 4 months later but people should know: Definitely not, it is still not usable unless you are doing something very trivial and/or have very small DB.
I don’t understand how it is possible for MS to ship it, act like it’s OK and sparsely here and there provide shallow information about its limitations like in this article without warning clearly and explicitly about the serious issues this “v1 product” has.

Someone may jump in and say no, it is fine; but there are undoubtedly missing pieces and I would suggest caution.

You can also access data using the Connection/Command/DataReader approach which avoids EF, and although this is more work, this is what I would be inclined to do personally since you get the best performance and flexibility. Here is an example for SQL Server.

Who is using .NET Core? Controversially, Microsoft gathers telemetry from your use of the command-line tools, though you can opt out by setting the environment variable DOTNET_CLI_TELEMETRY_OPTOUT to 1. This means we have some data on .NET Core usage, though unfortunately it excludes Visual Studio usage. I downloaded the most recent dataset and imported it into a database. Here are the figures for OS family:

Total rows: 5,036,981

  • Windows: 3,841,922 (76.27%)
  • Linux: 887,750 (17.62%)
  • Mac: 307,309 (6.1%)

image

Given that this excludes Visual Studio users, who are also on Windows, we can conclude that the great majority of .NET Core developers use Windows, and only a small minority use a Mac (I do not know if Visual Studio for Mac usage is included). This is evidence that .NET Core has so far failed in its goal of persuading Mac-using developers to adopt .NET. It does show interest in deploying .NET applications to Linux, which is an obvious win in licensing costs as well as performance.

I would be interested in comments from developers on whether or not they use .NET Core and why.

An overreaching Office 365 integration from Sage

Sage, a software vendor best known for its accounting software, recently introduced an Office 365 integration in its product Sage 50c Accounts (the “c” is for cloud).

The integration offers several features including:

  • Automatic data backup to OneDrive
  • Contact integration so that you can easily see Sage accounts data for contacts in Office 365/Outlook
  • A mobile app that lets you capture receipts with your smartphone and import them
  • Excel reports
  • A Business Performance Dashboard

image

Very good; but how is this implemented? Users get a special Getting Started email which says:

Are you ready to integrate your Microsoft Office 365 account with Sage 50c Accounts? All you need to do is click Get Started and sign in using the administrator account for your Office 365 Business Premium subscription, and we will guide you through accepting terms and conditions, how to sync your data and setup the Sage apps and users

To sign in, you’ll enter your email and password for your administrator account. Your email is formatted as follows: xxx@xxx.onmicrosoft.com. If you have forgotten your Office 365 administrator password, please click here for more information.

You’ll be asked to accept a provider invitation to give us permission to activate the Sage add-ins for your Office 365 account. Easy.

If you know Office 365 you will spot something odd in the above. Sage is asking you not just to install an Office 365 application, but to “accept a provider invitation”.

It is as bad as it sounds. In order to get its integration working, Sage demands that you appoint it as a Cloud Solution Provider (CSP) for your entire Office 365 tenancy. This does not require that you start paying for your tenancy via Sage, as it can be alongside an existing CSP relationship. However it does give Sage complete access to the tenancy including the ability to reset the global administrator password.

While I do not think it is likely that Sage will do anything bad, this is a lot to ask. It means that in the unlikely event that Sage has its systems compromised, your Office 365 data is at risk.

It gets worse. Once you have agreed to hand over the keys to your Office 365 kingdom, you click a “Let’s get started” button in Sage 50c Accounts on your desktop. You have to log in as manager (a local Sage administrator) and then enter the credentials for your Office 365 global administrator. These credentials are then stored by Sage for 90 days and used to perform synchronization. After 90 days, it will demand that the credentials are entered again.

And by the way, you will need an Office 365 Business Premium license for the global administrator, even though it does not normally use that license for day-to-day work.

Why is this bad? First, it is a misuse of the global administrator account. Best practice is that this account is used only for Office 365 administration. It should not be an active user account for email, OneDrive and so on, since this increases the risk of the account being compromised.

Second, end users (such as those in the accounts department) do not normally have knowledge of the global administrator credentials. Therefore to perform this operation, they will need to contact their IT support every 90 days.

Third, the fact that Sage holds these credentials on a user’s PC, albeit I presume encrypted, adds a possible attack vector for your Office 365 tenancy. If the PC were hijacked or infected with malware, a bad guy could start trying to figure out whether there is a way of persuading Sage to do something bad.

Fourth, it is not even wise to enter these credentials on an end-user PC. Perhaps I am being excessively cautious, but it is obvious that an end-user PC that is used for day-to-day work, web browsing and so forth, by someone non-specialist in IT terms, is more vulnerable than an administrator’s PC. If a keylogger were installed, there would be an opportunity to grab the global administrator credentials every 90 days.

Frankly, I do not recommend that businesses use this integration in its current implementation. Nor is it necessary. There are plenty of ways to create Office 365 applications that integrate nicely using the APIs which Microsoft has provided. Maybe there is a feature or two which is difficult to implement without these rights; in this case, the correct thing to do is to badger Microsoft to provide a new API, or perhaps to recognise that the security cost of adding the feature is not worth the value which it adds.
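To give a flavour of the approved route, here is a minimal sketch of calling the Microsoft Graph API with a delegated OAuth token, which is limited to whatever permissions the customer actually consents to. Acquiring the token via an Azure AD app registration is omitted, and the token value is a placeholder:

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class GraphSketch
{
    static async Task Main()
    {
        // Placeholder: in a real application this comes from an Azure AD sign-in
        string accessToken = "<delegated-token>";

        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", accessToken);

            // Reads the signed-in user's contacts; this needs only the
            // Contacts.Read delegated permission, not tenant-wide rights
            string json = await client.GetStringAsync(
                "https://graph.microsoft.com/v1.0/me/contacts");

            Console.WriteLine(json);
        }
    }
}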

My suspicion is that Sage has gone down this path by a process of evolution. It set itself up as an Office 365 CSP (before doing this integration) in order to get some extra business, which is fair enough. Then it started adding value to its Office 365 tenants, making use of what it could do as the customer’s CSP. Then it wanted to extend that to other Office 365 customers, those for whom it was not the CSP, and went down the path of least resistance, “oh, let’s just require that we become their CSP as well.”

Imagine if other third-party vendors went down this route: your specialist business software supplier, your CRM supplier, your marketing software, all demanding total access to and control over your Office 365 setup.

It is overreaching and disappointing that Microsoft CEO Satya Nadella blessed this integration with a quote about “empowering professionals” when the truth is that this is the wrong way to go about it.

The downside of “Windows as a service”: disappearing features (and why I will miss Paint)

Microsoft has posted a list of features that are “removed or deprecated” in the next major update to Windows 10, called the Fall Creators Update.

The two that caught my eye are Paint, a simple graphics editor whose ancestry goes right back to Windows 1.0 in 1985, and System Image Backup, a means of backing up Windows that preserves applications, settings and documents.

I use Paint constantly. It is ideal for cropping screenshots and photos, where you want a quick result with no need for elaborate image processing. It starts in a blink, lets you resize images while preserving aspect ratio, and supports .BMP, .GIF, .JPG, .PNG and .TIF – all the most important formats.

I used Paint to crop the following screen, of the backup feature to be removed.

image

System Image Backup is the most complete backup Windows offers. It copies your system drive so that you can restore it to another hard drive, complete with applications and data. By contrast, the “modern” Windows 10 backup only backs up files and you will need to reinstall and reconfigure the operating system along with any applications if your hard drive fails and you want to get back where you were before. “We recommend that users use full-disk backup solutions from other vendors,” says Microsoft unhelpfully.

If System Image Backup does stop working, take a look at Disk2vhd, which is not entirely dissimilar but copies the drive to a virtual hard drive; or the third-party DriveSnapshot, which can back up and restore entire drives. Or, of course, one of many other backup systems.

The bigger picture here is that when Microsoft pitched the advantages of “Windows as a service”, it neglected to mention that features might be taken away as well as added.

Microsoft Edge browser crashing soon after launch: this time, it’s IBM Trusteer Rapport to blame

A common problem (I am not sure how common, but there are hundreds of reports) with the Edge browser in Windows 10 is that it gets into the habit of opening and then immediately closing, or closing when you try to browse the web.

I was trying to fix a PC with these symptoms. In the event log, an error was logged: “Faulting module name: EMODEL.dll”. Among much useless advice out there, one suggestion has some chance of working: you can reinstall Edge by following a couple of steps, as described in various places. Something like this (though be warned, you will lose ALL your Edge settings, favourites and so on):

  • Delete C:\Users\%username%\AppData\Local\Packages\Microsoft.MicrosoftEdge_8wekyb3d8bbwe (a few files may get left behind)
  • Reboot
  • Run PowerShell, then: Get-AppXPackage -Name Microsoft.MicrosoftEdge | Foreach {Add-AppxPackage -DisableDevelopmentMode -Register "$($_.InstallLocation)\AppXManifest.xml" -Verbose}

However this did not fix the problem – annoying after losing the settings. I was about to give up when I found this thread. The culprit, for some at least, is IBM Trusteer Rapport and its Early Browser Protection feature. I disabled this, rebooted, and Edge now works.

Failing that, you can stop or uninstall Rapport, which should also fix the problem.

Unhealthy Identity Synchronization Notification: a trivial solution (and Microsoft’s useless troubleshooter)

If you use Microsoft’s Azure AD Connect (formerly known as DirSync), you may have received an email like this:

image

It’s bad news: your Active Directory is not syncing with Office 365. “Azure Active Directory did not register a synchronization attempt from the Identity synchronization tool in the last 24 hours.”

I got this after upgrading AD Connect to the latest version, currently 1.1.553.

The email recommends you run a troubleshooting tool on the AD Connect server. I did that. Nothing wrong. I rebooted, it synced once, then I got another warning.

This is only a test system but I still wanted to find out what was wrong. I tweaked the sync configuration, again without fixing the issue.

Finally I found this post. Somehow, AD Connect had configured itself not to sync. You can check the current setting in PowerShell, using Get-ADSyncScheduler:

image

As you can see, SyncCycleEnabled is set to false. The fix is trivial; just type:

Set-ADSyncScheduler -SyncCycleEnabled $true

Well, I am glad to have fixed it, but should not Microsoft’s troubleshooting tool find this simple configuration problem?

Licensing Azure Stack: it’s complicated (and why Azure Stack is the iPad of servers)

Microsoft’s Azure Stack is a pre-configured, cut-down version of Microsoft’s mighty cloud platform, condensed into an appliance-like box that you can install on your own premises.

Azure Stack is not just a new way to buy a bunch of Windows servers. Both the technical model and the business model are different to anything you have seen before from Microsoft. On the technical side, your interaction with Azure Stack is similar to your interaction with Azure. On the business side, you are buying the hardware but renting the software. There is no way, according to the latest pricing and licensing guide, to purchase a perpetual license for the software, as you can for Windows Server. Instead, there are two broad options:

Pay-as-you-use

In this model, you buy software services on Azure Stack in exactly the same way as you do on Azure. The fact that you have bought your own hardware gets you a discount (probably); the guide says “Azure Stack service fees are typically lower than Azure prices”.

  • Base virtual machine: $0.008/vCPU/hour ($6/vCPU/month)
  • Windows Server virtual machine: $0.046/vCPU/hour ($34/vCPU/month)
  • Azure Blob Storage: $0.006/GB/month (no transaction fee)
  • Azure Table and Queue: $0.018/GB/month (no transaction fee)
  • Azure App Service (Web Apps, Mobile Apps, API Apps, Functions): $0.056/vCPU/hour ($42/vCPU/month)

This has the merit of being easy to understand. It gets more complex if you take the additional option of using existing licenses with Azure Stack. “You may use licenses from any channel (EA, SPLA, Open, and others),” says the guide, “as long as you comply with all software licensing and product terms.” That qualification is key; those documents are not simple. Let’s briefly consider Windows Server 2016 Standard, for example. Licensing is per core. To install Windows Server 2016 Standard on a VM, you have to license all the cores in the physical server, even if your VM only has one virtual CPU. The servers in Azure Stack, I presume, have lots of cores. Even when you have done this, you are only allowed to install it on up to two VMs. If you need it on a third VM, you have to license all the cores again: on a 16-core host, for example, running five or six Standard VMs means licensing all 16 cores three times over. Here are the relevant words:

Standard Edition provides rights for up to 2 Operating System Environments or Hyper-V containers when all physical cores in the server are licensed. For each additional 1 or 2 VMs, all the physical cores in the server must be licensed again.

Oh yes, and once you have done that, you need to purchase CALs as well, for every user or device accessing a server. Note too that on Azure Stack you always have to pay the “base virtual machine” cost in addition to any licenses you supply.

This is why the only sane way to license Windows Server 2016 in a virtualized environment is to use the expensive Datacenter edition. Microsoft’s pay-as-you-use pricing will be better for most users.

Capacity model

This is your other option. It is a fixed annual subscription with two variants:

  • App Service, base virtual machines and Azure Storage: $400 per core per year
  • Base virtual machines and Azure Storage only: $144 per core per year

The Capacity Model is only available via an Enterprise Agreement (500 or more users or devices required); and you still have to bring your own licenses for Windows Server, SQL Server and any other licensed software required. Microsoft says it expects the capacity model to be more expensive for most users.

SQL Server

There are two ways to use SQL Server on Azure. You can use a SQL database as a service, or you can deploy your own SQL Server in a VM.

The same is true on Azure Stack; but I am not clear how licensing works if you offer SQL databases as a service. In the absence of any other guidance, it looks as if you will have to bring your own SQL Server license, which will make this expensive. However, it would not surprise me if this ends up as an option in the pay-as-you-use model.

Using free software

It is worth noting that costs for both Azure and Azure Stack come way down if you use free software, such as Linux rather than Windows Server, and MySQL rather than SQL Server. Since Microsoft is making strenuous efforts to make its .NET application development framework cross-platform, that option is worth watching.

Support

You will have to get support for Azure Stack, since it is not meant to be user-serviceable. And you will need two support contracts, one with Microsoft, and one with your hardware provider. The hardware support is whatever you can negotiate with the hardware vendor. Microsoft support will be part of your Premier, Azure or Partner support in most cases.

Implications of Azure Stack

When Microsoft embarked on its Azure project, it made the decision not to use System Center, its suite of tools for managing servers and “private cloud”, but to create a new way to manage servers that is better automated, more scalable, and easier for end-users. Why would you use System Center if you can use Azure Stack? Well, one obvious reason is that with Azure Stack you are ceding a lot of control to Microsoft (and to your hardware supplier), as well as getting pushed down a subscription path for your software licensing. If you can handle that though, it does seem to me that running Azure Stack is going to be a lot easier and more productive than building your own private cloud, for most organizations.

This presumes of course that it works. The big risk with Azure Stack is that it breaks, and your IT administrators will not know how to fix it, because that responsibility has been outsourced to your hardware vendor and to Microsoft. It is possible, therefore, that an Azure Stack problem will be harder to solve than other typical Windows platform failures. A lot will depend on the quality control achieved both by Microsoft, for the software, and by its hardware partners.

Bottom line: this is the iPad of servers. You buy it but don’t really control it, and it is a delight to use provided it works.

No more infrastructure roles for Windows Nano Server, and why I still like Server Core

Microsoft’s General Manager for Windows Server Erin Chapple posted last week about Nano Server (under a meaningless PR-speak headline) to explain that Nano Server, the most stripped-down edition of Windows Server, is being repositioned. When it was introduced, it was presented not only as a lightweight operating system for running within containers, but also for infrastructure roles such as hosting Hyper-V virtual machines, hosting containers, file server, web server and DNS Server (but without AD integration).

In future, Nano Server will be solely for the container role, enabling the base image to shrink by over 50%, according to Chapple. It will no longer be possible to install Nano Server as a standalone operating system on a server or VM.

This change prompted Microsoft MVP and Hyper-V enthusiast Aidan Finn to declare Nano Server all but dead (which I suppose it is from a Hyper-V perspective) and to repeat his belief that GUI installs of Windows Server are best, even on a server used only for Hyper-V hosting.

Finn writes:

Prepare for a return to an old message from Microsoft, “We recommend Server Core for physical infrastructure roles.” See my counter to Nano Server. PowerShell gurus will repeat their cry that the GUI prevents scripting. Would you like some baloney for your sandwich? I will continue to recommend a full GUI installation. Hopefully, the efforts by Microsoft to diminish the full installation will end with this rollback on Nano Server.

Finn’s main argument is that the full GUI makes troubleshooting easier. Server Core also introduces a certain amount of friction as most documentation relating to Windows Server (especially from third parties) presumes you have a GUI and you have to do some work to figure out how to do the same thing on Core.

Nevertheless I like Server Core and use it where possible. The performance overhead of the GUI is small, but running Core does significantly reduce the number of security patches and therefore required reboots. Note that you can run GUI applications on Server Core, if they are written to a subset of the Windows API, so vendors that have taken the trouble to fix their GUI setup applications can support it nicely.

Another advantage of Server Core, in the SMB world where IT policies can be harder to enforce, is that users are not tempted to install other stuff on their Server Core domain controllers or Hyper-V hosts. I guess this is also an advantage of VMware. Users log in once, see the command-line UI, and do not try installing file shares, print managers, accounting software, web browsers (I often see Google Chrome on servers because users cannot cope with IE Enhanced Security Configuration), remote access software and so on.

Only developers now need to pay attention to Nano Server, but that is no reason to give up on Server Core.

OneDrive Files on Demand is back – will users get confused? And how does it look to applications?

Microsoft is restoring a much-requested feature to its OneDrive cloud storage: placeholders, or what is now called Files on Demand.

The issue is that when users have files in cloud storage, they want easy access to them at any time, but downloading everything to local storage may use too much disk space. There are also scenarios where you do not want a local copy, for example for confidential documents, especially if you do not enable Bitlocker encryption.

You can use OneDrive through the web browser, but Windows users expect File Explorer integration, the most natural way of working.

Windows 8.1 introduced placeholders, where OneDrive (then SkyDrive) files appeared in File Explorer but were not actually downloaded until you opened them. It was a popular feature, but Microsoft removed it in Windows 10, saying that users found it confusing. I suppose they might have thought a file was on their PC, boarded a plane, and then discovered they could not work on the document because it was not actually there.

This was a user interface issue, but apparently there were other technical issues, particularly for applications using the Windows file APIs. Perhaps the problems were so intricate that the team did not think they could be fixed in time for the first releases of Windows 10.

Now the feature is back, and I have installed it on the latest Windows Insider build:

image

But could users still be confused? Files in OneDrive now have four possible states:

Hidden. You can still choose not to make all folders visible in File Explorer. In fact, hidden seems to be the default for folders previously not synced to the PC, though you can easily check an option to show them all:

image

Online-only: Files have a cloud icon and are offline until you open them:

image

Locally available:

image

Always available:

image

So what is the difference between “Locally available” and “Always available”? It really is not explained here, but my assumption is that locally available files could automatically revert to online-only if there is pressure on disk space. That could catch you out: you might see that a file was locally available and rely on it, only to find that Windows had reverted it to online-only without you realising.

If you right-click a file in OneDrive you can change its status or share a link. If you want to make a file online-only, you choose Free up space (I think it would be clearer if this option were called Online-only, but this is a preview so it might change).

image

How do online-only files look to applications? I ran up Visual Studio and wrote a utility that iterates through a folder and shows the file name and length:
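The utility itself is trivial; something like this, pointed at your OneDrive folder:

using System;
using System.IO;

class Program
{
    static void Main()
    {
        // The local OneDrive folder; adjust the path as required
        string folder = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.UserProfile),
            "OneDrive");

        foreach (FileInfo file in new DirectoryInfo(folder).GetFiles())
        {
            // For online-only files, Length reports the full size of the
            // file in the cloud, not the placeholder's size on disk
            Console.WriteLine($"{file.Name}: {file.Length}");
        }
    }
}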

image

You will note that the API reported the full size of the file in the cloud, not its size on disk. This is the kind of thing that can cause issues, though if the file size were reported as zero bytes – well, that could cause issues too.

Incidentally, you can also now sort files in File Explorer by Status. I imagine the latest Windows 10 SDK will also have a way to report status so that applications can catch up.

Server shipments decline as customers float towards cloud

Gartner reports that worldwide server shipments declined by 4.2% in the first quarter of 2017.

Not a surprise considering the growth in cloud adoption but there are several points of interest.

One is that although Hewlett Packard Enterprise (HPE) is still ahead in revenue (over $3 billion and a 24% market share), Dell EMC is catching up: still number two with a 19% share, but posting growth of 4.5% versus an 8.7% decline for HPE.

In unit shipments, Dell EMC is now fractionally ahead, with 17.9% market share and growth of 0.5% versus HPE at 16.8% and decline of 16.7%.

Clearly Dell is doing something right where HPE is not, possibly through synergy with its acquisition of storage vendor EMC (announced October 2015, completed September 2016).

The larger picture though is not great for server vendors. Businesses are buying fewer servers, since cloud-hosted servers or services are a good alternative. For example, SMBs who in the past might have run Exchange are tending to migrate to Office 365 or perhaps G Suite (Google Apps). Maybe there is still a local server for Active Directory and file server duties, or maybe just a NAS (Network Attached Storage).

It follows that the big cloud providers are buying more servers, but such is their size that they do not need to buy from Dell or HPE; they can go directly to ODMs (Original Design Manufacturers) and tailor the hardware to their exact needs.

Does that mean you should think twice before buying new servers? Well, it is always a good idea to think twice, but it is worth noting that going cloud is not always the best option. Local servers can be much cheaper than cloud VMs as well as giving you complete control over your environment. Doing the sums is not easy and there are plenty of “it depends”, but it is wrong to assume that cloud is always the right answer.