image

Just ahead of the launch of Oppo Reno 2, here is a look at Oppo Reno 10x Zoom

Oppo will launch the Reno 2 on 16th October, under the heading “Make the world your studio”. Oppo mobiles have been making an impression as an example of high quality technology at a price a bit less than you would pay for a Samsung or a Sony – similar in that respect to Huawei, though currently without the challenge Huawei faces in trying to market Android devices without Google Play services.

Oppo is a brand of BBK Electronics Corp, a Chinese company based in Chang’an, Dongguan. Other BBK Electronics brands include OnePlus and Vivo. If you combine the market share of all these brands, it is in the top four globally.

My first encounter with the Reno brand was in May this year when I attended the launch of the Reno 10x Zoom and the Reno 5G (essentially the 10x Zoom with 5G support) in London. Unfortunately I was not able to borrow a device for review until recently; however I have been using a 10x Zoom for the last couple of weeks and found it pretty interesting.

First impression: this is a large device. It measures 7.72 x 16.2 x 0.93cm and weighs about 215g. The AMOLED screen diagonal is 16.9cm and the resolution 2340 x 1080 pixels.

Second impression: it takes amazing pictures. To me, this is not just a matter of specification. I am not a professional photographer, but I do take thousands of photos for work. Unfortunately I don’t have an iPhone 11 or Samsung Galaxy Note 10 to test against. The mobile I’ve actually been using of late is the Honor 10 AI, a year older and considerably cheaper than the Reno but with a decent camera. I present the snaps below not as a fair comparison but to show how the Reno 10x Zoom compares to a more ordinary smartphone camera.

Here is a random pic of some flowers taken with the Honor 10 AI (left) and the Reno 10x Zoom (right):

image

Not too much in it? Try zooming in on some detail (same pic, cropped):

image

The Reno 10x Zoom also, believe it or not, has a zoom feature. Here is a detail from my snap of an old coin at 4.9x, hand-held, no tripod.

image

There is something curious about this. Despite the name, the Reno has 5x optical zoom, with 10x and more (in fact up to 60x) available through digital processing. You soon learn that the quality is best when using the optical zoom alone; there is a noticeable change when you exceed 5x and not a good one.

The image stabilisation seems excellent.

The UI for this is therefore unfortunate. When you open the camera, a small 1x button appears in the image. Tap it, and it goes to 2x. Tap again for 6x, and again for 10x. If you want other settings you either pinch to zoom, or press and hold on the button, whereupon a scale appears. Since there is a drop-off in quality beyond 5x, it would make more sense for a tap to select 5x rather than 6x.

There are four camera lenses on the Reno. On the rear, a 48MP f/1.7 wide, a 13MP f/2.4 telephoto, and an 8MP f/2.2 ultra-wide. The telephoto lens has a periscope design (like Huawei’s P30 Pro), meaning that the lens extends along the length of the phone internally, using a prism to bend the light, so that the lens can be longer than a thin smartphone normally allows.

image

There is also a small bump (surrounded by green in the pic below) which is a thoughtful feature to protect the lenses if the device is placed on a flat surface.

image

On the front is a 16MP f/2.0 sensor which also gives great results, excellent for selfies or video conferencing. The notable feature here is that it is hinged and, when not in use, slides into the body of the phone. This avoids having a notch. Nice feature.

image

ColorOS and special features

We might wish that vendors would just use stock Android, but they prefer to customize it, probably in the hope that customers, once they have learned a particular flavour of Android, will be reluctant to switch.

The Oppo variant is called ColorOS. One good thing about it is that you can download a manual, which currently runs to 335 pages. It is not specific to the Reno 10x Zoom and some things are wrong (it references a non-existent headphone jack, for example), but it helps if you want to understand the details of the system. You might not otherwise know, for example, that there is a setting which lets you open the camera by drawing an O gesture on the lock screen.

image

How many customers will find and read this manual? My hunch is relatively few. Most people get a new smartphone, transfer their favourite apps, tap around a bit to work out how to set a few things as they want them, and then do not worry.

If you have a 10x, I particularly recommend reading the section on the camera as you will want to understand each feature and how to operate it.

The Reno 10x does have quite a few smart features. Another worth noting is “Auto answer when phone is near ear”. You can also set it to switch automatically from speaker to receiver when you hold the phone to your ear.

Face unlock is supported, but you are not walked through setting it up; you are prompted to enrol a fingerprint though. The fingerprint sensor is under glass on the front – I prefer them on the rear – but there is a nice feature where the fingerprint area glows when you pick up the device. It works, but it is not brilliant in sub-optimal conditions, for example with a damp hand.

The Reno 10x Zoom supports split screen mode via a three-finger gesture. With a large high-resolution screen this may be useful. Here is Microsoft Teams (left) with a web browser (right).

image

Settings – Smart services includes Riding mode, designed for cycling, which will disable all notifications except whitelisted calls.

VOOC (Voltage Open Loop Multistep Constant-current Charging) is Oppo’s fast charging technology.

Dolby Atmos audio is included and there are stereo speakers. Sound from these is nothing special, but sound from the bundled earbuds is excellent.

Quick conclusions

A Reno 10x Zoom is not a cheap smartphone, but it does cost less than the latest flagship devices from Apple or Samsung. If you are like me and need a great camera, it strikes me as a good choice. If you do not care much about the camera, look elsewhere.

Things I especially like:

  • Excellent camera
  • No notch
  • Great audio quality through supplied earbuds
  • Thoughtful design and high quality build

There are a few things against it though:

  • Relatively bulky
  • No wireless charging
  • No headphone jack (less important now that wireless earbuds are common)

Spec summary

OS: Android 9 with ColorOS 6

Screen: AMOLED 6.6″ 2340 x 1080 at 387 ppi

Chipset: Qualcomm Snapdragon 855 SM8150, 8-core Kryo 485, 2.85 GHz

Integrated GPU: Qualcomm Adreno 640

RAM: 8GB

Storage: 256GB

Dual SIM: Yes – 2 x Nano SIM or SIM + Micro SD

NFC: Yes

Sensors: Geomagnetic, Light, Proximity, Accelerometer, Gyro, Laser focus, dual-band GPS

WiFi: 802.11 a/b/g/n/ac, 2.4GHz/5GHz, hotspot support

Bluetooth: 5.0

Connections: USB Type-C with OTG support.

Size and weight: 162 mm x 77.2 mm  x 9.3 mm, 215g

Battery: 4065 mAh. No wireless charging.

Fingerprint sensor: Front, under glass

Face unlock: Yes

Rear camera: 48MP + 8MP + 13MP

Front camera: 16MP

Finding the multi-factor authentication and authenticator options in an Office 365 account

Microsoft has done some good work enabling and promoting multi-factor authentication in Office 365, including use of the Microsoft Authenticator app.

Strangely though, it has made the user settings for this hard to find.

Logically it should be in the My Account – Security and Privacy section, but it is not.

image

Where is it then? The easiest way to find it is here:

https://aka.ms/mfasetup

image

Yamaha’s vinyl revival on display at IFA in Berlin including GT-5000 turntable

At IFA in Berlin, Europe’s biggest consumer electronics show, there is no doubting that the vinyl revival is real.

At times it did feel like going back in time. On the Teac stand there were posters for Led Zeppelin and The Who, records by Deep Purple and the Velvet Underground, and of course lots of turntables.

image

Why all the interest in vinyl? Nostalgia is a factor but there is a little more to it. A record satisfies a psychological urge to collect, to own, to hold a piece of music that you admire, and streaming or downloading does not meet that need.

There is also the sound. At their best, records have an organic realism that digital audio rarely matches. Sometimes that is because of the freedom digital audio gives to mastering engineers to crush all the dynamics out of music in a quest to make everything as LOUD as possible. Other factors are the possibility of euphonic distortion in vinyl playback, or that excessive digital processing damages the purity of the sound. Records also have plenty of drawbacks, including vulnerability to physical damage, dust which collects on the needle, geometric issues which mean that the arm is (most of the time) not exactly parallel to the groove, and the fact that the quality of reproduction drops near the centre of the record, where the groove speed is slower.

Somehow all these annoyances have not prevented vinyl sales from increasing, and audio companies are taking advantage. It is a gift for them, some slight relief from the trend towards smartphones, streaming, earbuds and wireless speakers in place of traditional hi-fi systems.

One of the craziest things I saw at IFA was Crosley’s RDS3, a miniature turntable too small even for a 7” single. It plays one-sided 3” records, of which there are hardly any available to buy. Luckily it is not very expensive, and is typically sold on Record Store Day complete with a collectible 3” record which you can play again and again.

image

Moving from the ridiculous to the sublime, I was also intrigued by Yamaha’s GT-5000. It is a high-end turntable which is not yet in full production. I was told there are only three in existence at the moment, one on the stand at IFA, one in a listening room at IFA, and one at Yamaha’s head office in Japan.

image

Before you ask, price will be around €7000, complete with arm. A lot, but in the world of high-end audio, not completely unaffordable.

There was a Yamaha GT-2000 turntable back in the eighties, the GT standing for “Gigantic and Tremendous”. Yamaha told me that engineers in retirement were consulted on this revived design.

The GT-5000 is part of a recently introduced 5000 series, including amplifier and loudspeakers, which takes a 100% analogue approach. The turntable is belt drive, and features a very heavy two-piece platter. The brass inner platter weighs 2kg and the aluminium outer platter 5.2kg. The high mass of the platter stabilises the rotation. The straight tonearm features a copper-plated aluminium inner tube and a carbon outer tube. The headshell is cut aluminium and is replaceable. You can adjust the speed ±1.5% in 0.1% increments. Output is via balanced XLR terminals or unbalanced RCA. Yamaha does not supply a cartridge but recommends the Ortofon Cadenza Black.

Partnering the GT-5000 are the C-5000 pre-amplifier, the M-5000 100W per channel stereo power amplifier, and the NS-5000 three-way loudspeakers. Both amplifiers have balanced connections and Yamaha has implemented what it calls “floating and balanced technology”:

Floating and balanced power amplifier technology delivers fully balanced amplification, with all amplifier circuitry including the power supply ‘floating’ from the electrical ground … one of the main goals of C-5000 development was to have completely balanced transmission of phono equaliser output, including the MC (moving coil) head amp … balanced transmission is well-known to be less susceptible to external noise, and these qualities are especially dramatic for minute signals between the phono cartridge and pre-amplifier.

In practice I suspect many buyers will partner the GT-5000 with their own choice of amplifier, but I do like the pure analogue approach which Yamaha has adopted. If you are going to pretend that digital audio does not exist you might as well do so consistently (I use Naim amplifiers from the eighties with my own turntable setup).

I did get a brief chance to hear the GT-5000 in the listening room at IFA. I was not familiar with the recording and cannot make meaningful comment except to say that yes, it sounded good, though perhaps slightly bright. I would need longer and to play some of my own familiar records to form a considered opinion.

What I do know is that if you want to play records, it really is worth investing in a high quality turntable, arm and cartridge; and that the pre-amplifier as well is critically important because of the low output, especially from moving coil cartridges.

GT-5000 arm geometry

There is one controversial aspect to the GT-5000: its arm geometry. All tonearms are a compromise. The ideal tonearm has zero friction, perfect rigidity, and parallel tracking at all points – unfortunately impossible to achieve. The GT-5000 has a short, straight arm, whereas most arms have an angled headshell and slightly overhang the centre of the platter. The problem with a short, straight arm is that it deviates further from parallel than a longer arm with an angled headshell, so much so that it may only be suitable for a conical stylus. On the other hand, it does not require any bias adjustment, simplifying the design. With a straight arm it would be geometrically preferable to have a very long arm, but that may tend to resonate more as well as requiring a large plinth. I am inclined to give the GT-5000 the benefit of the doubt; it will be interesting to see detailed listening and performance tests in due course.
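The tracking-error penalty of a straight, unoffset arm can be estimated from standard pivoted-arm geometry. The sketch below compares a hypothetical straight underhung arm with a typical 9-inch offset arm; the dimensions are illustrative assumptions, not Yamaha’s published figures.

```python
import math

def tracking_error_deg(radius_mm, length_mm, pivot_to_spindle_mm, offset_deg=0.0):
    """Angle between cartridge axis and groove tangent for a pivoted arm.

    Uses the triangle formed by spindle, pivot and stylus: the angle between
    the arm and the groove tangent is asin((L^2 + r^2 - D^2) / (2*L*r)),
    from which the headshell offset angle is subtracted.
    """
    ratio = (length_mm ** 2 + radius_mm ** 2 - pivot_to_spindle_mm ** 2) \
            / (2 * length_mm * radius_mm)
    return math.degrees(math.asin(ratio)) - offset_deg

# Illustrative figures (assumptions, not measured specs):
# a 223mm straight arm, underhung by 12mm, with no offset angle
straight = lambda r: tracking_error_deg(r, 223.0, 235.0)
# a typical 9" arm: 233.2mm effective length, 17.5mm overhang, 23.1 deg offset
offset_arm = lambda r: tracking_error_deg(r, 233.2, 215.7, 23.1)

for r in (60, 90, 120, 146):  # groove radii in mm, inner to outer
    print(f"r={r:3d}mm  straight: {straight(r):6.2f} deg   offset: {offset_arm(r):6.2f} deg")
```

On these assumed numbers the straight arm is exactly tangent at only one radius and drifts to well over ten degrees of error at the outer groove, while the offset arm stays within a couple of degrees across the whole side, which is why the straight design may demand a forgiving conical stylus.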

More information on the GT-5000 is here.

Saving documents in Office 365 desktop applications

Those readers who also follow The Register may have noticed that I am writing more for that publication now, though be assured that I will still post here from time to time. My most recent piece is on saving documents in Office, and reflects a longstanding annoyance that in applications like Word and Excel Microsoft mostly bypasses the standard Windows file save dialog in favour of its own Backstage, now supplemented by an additional dialog which the team says will help us “save your files to the cloud more easily.”

image

Admittedly the new dialog is small and neat relative to the cluttered Backstage, but it is not very flexible, and if you use multiple sub-folders to organize your files you will be clicking More save options half the time, defeating the point.

There is also a suspicion that rather than helping us with something most of us do not need help with, Microsoft is trying to promote OneDrive – which it is entitled to do, but it is an annoyance if the software you have paid for is being used as a surreptitious marketing tool.

Microsoft earnings: strong quarter, but Xbox revenue dives

Microsoft has announced its quarterly financial statements, reporting revenue of $33.7 billion, up 12% on the same period last year.

The company stated that Azure revenue is up 64% year on year. The Intelligent Cloud segment, which includes Azure, has overtaken the other two segments and is now the biggest, by a small amount. In addition, Azure gross margin has improved by six percentage points year on year.

Office 365 revenue is up 31% year on year.

Gaming was a black spot, declining 10% year on year – though Xbox Live monthly active users are at a record 65 million. The main problem is a 48% decline in the volume of Xbox consoles sold.

Quarter ending June 30th 2019 vs quarter ending June 30th 2018, $millions

Segment                               Revenue   Change   Operating income   Change
Productivity and Business Processes    11,047   +1,379             4,344     +878
Intelligent Cloud                      11,391   +1,785             4,502     +601
More Personal Computing                11,279     +468             3,559     +547

The segments break down as:

Productivity and Business Processes: Office, Office 365, Dynamics 365 and on-premises Dynamics, LinkedIn

Intelligent Cloud: Server products, Azure cloud services

More Personal Computing: Consumer including Windows, Xbox; Bing search; Surface hardware
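As a sanity check, the headline 12% growth figure can be recomputed from the segment table above:

```python
# Segment figures in $millions: (revenue this quarter, year-on-year change)
segments = {
    "Productivity and Business Processes": (11047, 1379),
    "Intelligent Cloud": (11391, 1785),
    "More Personal Computing": (11279, 468),
}

total_now = sum(rev for rev, _ in segments.values())
total_change = sum(chg for _, chg in segments.values())
print(f"Total revenue: ${total_now / 1000:.1f}bn, "
      f"up {100 * total_change / (total_now - total_change):.0f}% year on year")

# Implied growth rate per segment
for name, (rev, chg) in segments.items():
    print(f"{name}: +{100 * chg / (rev - chg):.1f}%")
```

The segment totals sum to $33.7 billion and imply 12% overall growth, matching the reported figures; Intelligent Cloud is growing the fastest of the three at roughly 19%.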

Not OK Google

Views on privacy vary. Most people either do not think about it, or trust that big tech companies will do no harm with knowledge of your location, who your friends are, what you like to view on the internet and so on.

That trust may be shaken by disturbing revelations last week from Belgian broadcaster VRT. The report states that:

  • Google records speech heard by its Google Assistant or Google Home devices
  • Google passes on a proportion of these recordings to third parties, to assist with transcription. This is done to improve speech recognition
  • Many of these recordings – it is not known exactly how many – are recorded unintentionally, rather than being started with the “OK Google” trigger words. This could be because of some sound that the device incorrectly interprets as “OK Google”, or because of a mis-tap on a smartphone.
  • The recordings are not effectively anonymised. They include addresses, names, business names and so on. Identity is often easy to work out.
  • The recordings are personal. They include medical queries, domestic arguments, even on one occasion “a woman who was in definite distress.”

Google’s response? Its main concern is to prevent future leaks of audio files, rather than with the fact that these recordings should not have been in the hands of third parties in the first place. “We are conducting a full review of our safeguards in this space to prevent misconduct like this from happening again,” says Google’s David Monsees.

Did users consent, somewhere in the miasma and dark UI patterns of “Accept” buttons that now bombard us on the web? Maybe, but I do not think this is what was expected by those users whose identifiable private moments were first recorded and then passed around by Google. They have been let down.

Chromium and Microsoft annoyances : Dynamics CRM issues like broken downloads, Chromium team “won’t fix”

Microsoft Dynamics CRM (which exists in both cloud-hosted and on-premises versions) is not working well with Chromium, the open source browser engine used by Google Chrome.

I discovered one obvious issue using Edge Preview, which is based on Chromium. If you download a file, for example using a Word template, Microsoft Office does not recognise it. It turns out to have single quotes around it. I imagine the quotes are there to allow for document names which include spaces, but it should use double quotes. Chromium (and Chrome) used to work OK with single quotes but now does not. It’s causing quite a bit of grief for CRM users in businesses that have standardised on Chrome.

You can read all the details here. Here’s a user report by Troy Siegert, whose organization frequently downloads files from Dynamics:

This week when the Chrome beta build went mainstream, my 30 users suddenly had Windows 10 unable to determine what to do with the files they were so dutifully downloading and trying to look at. Instead of *Report.pdf* the file was named *’Report.pdf’* and of course Windows 10 has no idea what a *.pdf’* file is or what to do with it, so it started asking users questions for which they weren’t prepared and that they didn’t understand. Some of them got confused and tried to associate .xlsx files with Adobe and then became unhappy when Adobe was throwing up messages about corrupt files.

Google’s Abdul Syed responds:

For any server operators running into this issue, the way to fix for this is to use double quotes around any quoted string in the Content-Disposition header (And, more generally, in any HTTP header).

Translation: fix your stuff, don’t expect us to fix our stuff. And in fact the issue has been marked WontFix (Closed).
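For what it is worth, the server-side fix Syed describes is straightforward. Here is a minimal sketch (the helper name and filenames are illustrative):

```python
# Per RFC 6266, Content-Disposition filename values use the HTTP quoted-string
# form, which is delimited by double quotes. Single quotes have no special
# meaning, so a strict parser treats them as part of the filename itself;
# that is how users ended up with files named 'Report.pdf'.

def content_disposition(filename: str) -> str:
    # Escape backslashes and embedded double quotes per the quoted-string rules
    escaped = filename.replace("\\", "\\\\").replace('"', '\\"')
    return f'attachment; filename="{escaped}"'

print(content_disposition("Annual Report.pdf"))
```

A server emitting the header this way works in Chromium, Firefox and older browsers alike, since double-quoted values have always been the standard form.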

There was actually a bit of a battle about this. The original commit here (Oct 2018) was reverted here (Feb 12 2019) and unreverted here (Feb 19 2019). In other words, the Chromium team knew it broke downloads for Dynamics CRM users but were not willing to compromise.

I am in two minds about this one. Dynamics CRM is sloppy in places, and part of me favours giving Microsoft’s team a kick to make them fix things that should have been fixed years back.

On the other hand, Mozilla Firefox works fine with the CRM single quotes, and you cannot help wondering if Google’s attitude would be different were it a Google application that was affected.

Two Factor Authentication is great–but what if you lose your phone or have your number hijacked?

Account hijack is a worry for anyone. What kind of chaos could someone cause simply by taking over your email or social media account? Or how about spending money on your behalf on Amazon, eBay or other online retailers?

The obvious fraud will not be long-lasting, but there is an aftermath too: changing passwords, and getting back into accounts that have been compromised and had their security information changed.

In the worst cases you might lose access to an account permanently. Organisations like Google, Microsoft, Facebook or eBay are not easy to deal with in cases where your account is thoroughly compromised. They may not be sure whether you are the victim attempting to recover an account, or the imposter attempting to compromise it. Even getting to speak to a human can be challenging, as they rely on automated systems, and when you do, you may not get the answer you want.

The solution is stronger security so that account hijack is less common, but security is never easy. It is a system, and like any system, any change you make can impact other parts. In the old world, the most common approach had three key parts: username, password and email. The username was often the email address, so perhaps make that two key parts. Lose the password, and you can reset it by email.

There are two problems with this approach. First, the password might be stolen or guessed (rather easy considering the massive databases of username/password combinations readily available online). And second, the email is security-critical, and email can be intercepted as it often travels the internet in plain text for at least part of its journey. If you use Office 365, for example, your connection to Office 365 is encrypted, but an email sent to you may still be plain text until it arrives on Microsoft’s servers.

There is therefore a big trend towards two-factor authentication (2FA): something you have as well as something you know. This is not new, and many of us have used little devices that display one-time pass codes to use in addition to a password, such as the RSA SecurID key fob.

image

Another common approach is a card and a card reader. The card readers are all the same, and you use something like a bank card, put in your PIN, and it displays a code. An imposter would need to clone your card, or steal it and know the PIN.

In the EU, everyone is becoming familiar with 2FA thanks to the revised Payment Services Directive (PSD2), which comes into effect in September 2019 and requires Strong Customer Authentication (SCA). Details of what this means in the UK are here. See chapter 20:

Under the PSRs 2017, strong customer authentication means authentication based on the use of two or more independent elements (factors) from the following categories:

• something known only to the payment service user (knowledge)

• something held only by the payment service user (possession)

• something inherent to the payment service user (inherence)

So will we see a lot more card readers and token devices? Maybe not. They offer decent security, but they are expensive, and when users lose them or they wear out or the battery goes, they have to be replaced, which means more admin and expense. Giant companies like security, but they care almost as much about keeping costs down and automating password reset and account recovery.

Instead, the favoured approach is to use your mobile phone. There are several ways to do this, of which the simplest is where you are sent a one-time code by SMS. Another is where you install an app that generates codes, just like the key fob devices, but with support for multiple accounts and no need to clutter up your pocket or bag.

These are not bad solutions – some better than others – but this is a system, remember. It used to be your email address, but now it is your phone and/or your phone number that is critical to your security. All of us need to think carefully about a couple of things:

– if our phone is lost or broken, can we still get our work done?

– if a bad guy steals our phone or hijacks the number (not that difficult in many cases, via a little social engineering), what are the consequences?

Note that the SCA regulations insist that the factors are each independent of the other, but that can be difficult to achieve. There you are with your authenticator app, your password manager, your web browser with saved usernames and passwords, your email account – all on your phone.

Personally I realised recently that I now have about a dozen authenticator accounts on a phone that is quite old and might break; I started going through them and evaluating what would happen if I lost access to the app. Unlike many apps, most authenticator apps (for example those from Google and Microsoft) do not automatically reinstall complete with account data when you get a new phone.

Here are a few observations.

First, SMS codes are relatively easy from a recovery perspective (you just need a new phone with the same number), but not good for security. Simon Thorpe at Authy has a good outline of the issues with it here and concludes:

Essentially SMS is great for finding out your Uber is arriving, or when your restaurant table is ready. But SMS was never designed to provide a secure way for you to login to your online banking account.

Yes, Authy is pitching its alternative solutions, but the issues are real. So avoid SMS codes where you can; though, as Thorpe notes, they are still much stronger security than a password alone.

Second, the authenticator app problem. Each of those accounts is actually a long secret code. So you can back them up by storing the code. However, it is not easy to get the code out of the app unless you hack your phone, for example by getting root access to an Android device.

What you can do though is to use the “manually enter code” option when setting up an account, and copy the code somewhere safe. Yes you are undermining the security, but you can then easily recover the account. Up to you.
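Those codes are the shared secrets from which every one-time code is derived; the scheme is the standard TOTP algorithm (RFC 6238), which can be sketched in a few lines. This is an illustration of the mechanism, not any particular vendor’s implementation:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, digits=6, period=30):
    """Derive a one-time code from a base32 shared secret, per RFC 6238."""
    key = base64.b32decode(secret_b32.upper())
    counter = int(time.time() if for_time is None else for_time) // period
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation
    number = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(number % 10 ** digits).zfill(digits)

# The RFC 6238 test secret, base32-encoded; a real account secret looks similar
rfc_secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(rfc_secret, for_time=59))   # matches the RFC test vector
```

Anyone holding that secret can generate your codes indefinitely, which is both why backing it up lets you recover the account and why the backup must be stored somewhere genuinely safe.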

If you use the (free) Authy app, the accounts do roam between your various devices. This must mean Authy keeps a copy on its cloud service, hopefully suitably encrypted. That makes it arguably a bit less secure, but it is another solution.

Third, check out the recovery process for those accounts where you rely on your authenticator app or smartphone number. In Google’s case, for example, you can access backup codes – they are in the same place where you set up the authenticator account. These will get you back into your account, once for each code. I highly recommend that you do something to cover yourself against the possibility of losing access to your authenticator app, as Google is not easy to deal with in account recovery cases.

A password manager or an encrypted device is a good place to store backup codes, or you may have better ideas.

The important thing is this: a smartphone is an easy thing to lose, so it pays to plan ahead.

Wrestling with Visual Basic 6 (really!) and how knowledge on the internet gets harder to find

I have not done a thing with Visual Basic 6 for years, but a contact of mine has developed a very useful utility in a certain niche (it is for teaching Contract Bridge, though you do not need to know about bridge to follow this post) and his preferred tools are VBA and Visual Basic 6.

Visual Basic 6 is thoroughly obsolete, but apps compiled with it still run fine on Windows 10, and Microsoft probably knows better than to stop them working. The IDE is painful on versions of Windows later than XP, but that is what VMs are for. VBA, which uses essentially the same runtime (though updated and also available in 64-bit), is not really obsolete at all; it is still the macro language of Office, though Microsoft would prefer you to use a modern add-in model that works in the cloud.

Specifically, we thought it would be great to use Bo Haglund’s excellent Double Dummy Solver, which is an open source library and runs cross-platform. So it was just a matter of writing a VB6 wrapper to call this DLL.

I did not set my sights high, I just wanted to call one function which looks like this:

EXTERN_C DLLEXPORT int STDCALL CalcDDtablePBN(
   struct ddTableDealPBN tableDealPBN,
   struct ddTableResults * tablep);

and the ddTableResults struct looks like this:

struct ddTableResults
{
   int resTable[DDS_STRAINS][DDS_HANDS];
};

I also forked the source and created a Visual C++ project for it for a familiar debugging experience.

So the first problem I ran into (before I compiled the DLL for myself) is that VB6 struggles with passing a struct (user-defined type or UDT in VB) ByVal. Maybe there is a way to do it, but this was when I ran up Visual C++ and decided to modify the source to make it more VB friendly, creating a version of the function that has pointer arguments so that you can pass the UDT ByRef.

A trivial change, but at this point I discovered that there is some mystery about __declspec(dllexport) in Visual C++, which is meant to export undecorated functions for use in a DLL but does not always do so. The easy solution is to go back to using a DEF file, and after fiddling for a bit I did that.

Now the head-scratching started as the code seemed to run fine but I got the wrong results. My C++ code was OK and the unit test worked perfectly. Further, VB6 did not crash or report any error. Just that the values in the ddTableResults.resTable array after the function returned were wrong.

Of course I searched for help but it is somewhat hard to find help with VB6 and calling DLLs especially since Microsoft has broken the links or otherwise removed all the helpful documents and MSDN articles that existed 20 years ago when VB6 was hot.

I actually dug out my old copy of Daniel Appleman’s Visual Basic Programmer’s Guide to the Windows API where he assured me that arrays in UDTs should work OK, especially since it is just an array of integers rather than strings.

Eventually I noticed what was happening. When I passed my two-dimensional array to the DLL it worked fine, but in the DLL the indexes were inverted. So all I needed to do was fix this up in my wrapper, and then it all worked. The likely explanation is that VB6 SAFEARRAYs store multi-dimensional arrays in column-major order, whereas C arrays are row-major, so the same block of memory reads as the transpose on the other side.

image
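The effect is easy to reproduce without any COM involved. Here is a small sketch of how one buffer reads as the transpose under the two layout conventions:

```python
n = 3
# What the C side writes: an n x n table laid out row-major, so element (r, c)
# lives at flat index r*n + c.
table = [[r * n + c for c in range(n)] for r in range(n)]
flat = [value for row in table for value in row]   # the raw buffer as C fills it

# VB6 SAFEARRAYs lay multi-dimensional arrays out column-major, so VB reads
# element (r, c) from flat index c*n + r; same memory, transposed view.
seen_by_vb = [[flat[c * n + r] for c in range(n)] for r in range(n)]
transpose = [[table[c][r] for c in range(n)] for r in range(n)]
assert seen_by_vb == transpose   # the indexes come back inverted
```

Swapping the indices in the wrapper, as I did, is exactly the transpose operation that reconciles the two conventions.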

I do not miss VB6 at all and personally I moved on to VB.NET and C# at the earliest opportunity. I do understand however that some people like to stay with what is familiar, and also that legacy software has to be maintained. It would be interesting to know how many VB6 projects are still being actively maintained; my guess is quite a few. Which if you read Bruce McKinney’s well-argued rants here must be somewhat frustrating.

Microsoft’s Pipelines for Azure Kubernetes Service: fixing COPY failed

I like to try new technology when I can, so following the Build conference I decided to deploy a Hello World app to Azure Kubernetes Service (AKS). I made a one-node AKS cluster in no time. I built a .NET Core app in Visual Studio and deployed it to a Linux Docker container, no problem. I pushed the container into ACR (Azure Container Registry), though it turns out I did not really need to do that. The tricky bit is getting the container deployed to the AKS cluster. There is a thing called Dev Spaces, but it does not work in UK South:

image

I was contemplating the necessity of building a Helm chart when I tried a thing called Deployment Center (Preview) in the Azure portal.

Click Add Project and it builds a pipeline in Azure DevOps for you.

image

It worked, in that a pipeline was created, but the pipeline failed when building the container.

COPY failed: stat /var/lib/docker/tmp/docker-builder088029891/AKS-Example/AKS-Example.csproj: no such file or directory

I spent some time puzzling over this error. You can view the exact logs of the build failure and I worked out that it is executing the Dockerfile steps:

COPY ["AKS-Example/AKS-Example.csproj", "AKS-Example/"]
RUN dotnet restore "AKS-Example/AKS-Example.csproj"
COPY . .

This is failing because the code in my repository is not nested like that. I eventually fixed it by amending the lines to:

COPY ["AKS-Example.csproj", "AKS-Example/"]
RUN dotnet restore "AKS-Example/AKS-Example.csproj"
COPY . AKS-Example/

Now the pipeline completed and the container was deployed. I had to look at the Load Balancer Azure had generated for me to find the public IP address, but it worked.

image

Now the Dockerfile has a different path for local development than when deployed, which is annoying. I found I could fix this by changing a step in the Deployment Center wizard:

image

Where it says /AKS-Example in Docker build context, I replaced it with /. Now the build worked with the original Dockerfile.

I also noticed that the Deployment Center (Preview) used a sample YAML template which is linked directly from GitHub and referred confusingly to deploying sampleapp. It worked but felt a bit of a crude solution.

At this point I realised that I was not really using the latest and greatest, which is the pipeline wizard in Azure Devops. So I deleted everything and tried that.

image

This was great, but I could not see an equivalent to the Docker build context setting. And indeed, the new build failed with the same COPY failed error I got originally. Luckily I knew the workaround and was up and running in no time.

This different approach also has a slightly different shape than the Deployment Center pipeline, using Environments in Azure DevOps.

Currently therefore I have two questions:

  • Why does Azure offer both the Deployment Center (Preview) and the multi-stage pipeline which seem to have overlapping functionality?
  • What is the correct way to modify the generated YAML to fix the path issue?

I suppose it would also be good if the path problem were picked up by the wizard in the first place.
