Spectre and Meltdown woes continue as Intel confesses to broken updates

Intel’s Navin Shenoy says the company has asked PC vendors to stop shipping its microcode updates that fix the speculative execution vulnerabilities identified by Google’s Project Zero team:

We recommend that OEMs, cloud service providers, system manufacturers, software vendors and end users stop deployment of current versions, as they may introduce higher than expected reboots and other unpredictable system behavior.

This is a blow to industry efforts to fix this vulnerability, a process involving BIOS updates (to install the microcode) as well as operating system patches.

Intel says it has an “early version of the updated solution”. Given the length of time it takes for PC manufacturers to package and distribute BIOS updates for the many thousands of models affected, it looks like the moment at which the majority of active systems will be patched is now far in the future.

Vendors have not yet completed the rollout of the initial patch, which they are now being asked to withdraw.

The detailed microcode guidance is here. Intel also has a workaround which gives some protection while also preserving system stability:

For those concerned about system stability while we finalize the updated solutions, we are also working with our OEM partners on the option to utilize a previous version of microcode that does not display these issues, but removes the Variant 2 (Spectre) mitigations. This would be delivered via a BIOS update, and would not impact mitigations for Variant 1 (Spectre) and Variant 3 (Meltdown).

I am not sure who out there is not concerned about system stability. That said, public cloud vendors would rather endure almost anything than the possibility of code running in one VM getting unauthorised access to the host or to other VMs.

Right now it feels as if most of the world’s computing devices, from server to smartphone, are simply insecure. Though it should be noted that the bad guys have to get their code to run: trivial if you just need to run up a VM on a public cloud, more challenging if it is a server behind a firewall.

Office 2016 now “built out of one codebase for all platforms” says Microsoft engineer

Microsoft’s Erik Schweibert, principal engineer in the Apple Productivity Experiences group, says that with the release of Office 2016 version 16 for the Mac, the productivity suite is now “for the first time in 20 years, built out of one codebase for all platforms (Windows, Mac, iOS, Android).”

image

This is not the first time I have heard of substantial code-sharing between the various versions of Office, but this claim goes beyond that. Of course there is still platform-specific code, and it is worth reading the Twitter thread for more background.

“The shared code is all C++. Each platform has native code interfacing with the OS (ie, Objective C for Mac and iOS, Java for Android, C/C++ for Windows, etc),” says Schweibert.

Does this mean that there is exact feature parity? No. The mobile versions remain cut-down, and some features remain platform-specific. “We’re not trying to provide uniform ‘lowest common denominator’ support across all platforms so there will always be disparate feature gaps,” he says.

Even the online version of Office shares much of the code. “Web components share some code (backend server is shared C++ compiled code, front end is HTML and script)”, Schweibert says.

There is more news on what is new in Office for the Mac here. The big feature is real-time collaborative editing in Word, Excel and PowerPoint. 

What about 20 years ago? Schweibert is thinking about Word 6 for the Mac in 1994, a terrible release about which you can read more here:

“Shipping a crappy product is a lot like beating your head against the wall.  It really does feel good when you ship a great product as a follow-up, and it really does motivate you to spend some time trying to figure out how not to ship a crappy product again.

Mac Word 6.0 was a crappy product.  And, we spent some time trying to figure out how not to do that again.  In the process, we learned a few things, not the least of which was the meaning of the term “Mac-like.”

Word 6.0 for the Mac was poor for all sorts of reasons, as explained by Rick Schaut in the post above. The performance was poor, and the look and feel was too much like the Windows version – because it was the Windows code, recompiled. “Dialog boxes had "OK" and "Cancel" exactly reversed compared to the way they were in virtually every other Mac application — because that was the convention under Windows,” says one comment.

This is not the case today. Thanks to its lack of a mobile platform, Microsoft has a strong incentive to create excellent cross-platform applications.

There is more about the new cross-platform engineering effort in the video below.

The mysterious microcode: Intel is issuing updates for all its CPUs from the last five years but you might not benefit

The Spectre and Meltdown security holes, found in Intel and to a lesser extent AMD CPUs, are not only among the most serious, but also among the most confusing tech issues that I can recall.

We are all used to the idea of patching to fix security holes, but normally that is all you need to do. Run Windows Update, or on Linux apt-get update, apt-get upgrade, and you are done.

This one is not like that. The reason is that you need to update the firmware; that is, the low-level software that drives the CPU. Intel calls this microcode.

So when Intel CEO Brian Krzanich says:

By Jan. 15, we will have issued updates for at least 90 percent of Intel CPUs introduced in the past five years, with updates for the remainder of these CPUs available by the end of January. We will then focus on issuing updates for older products as prioritized by our customers.

what he means is that Intel has issued new microcode for those CPUs, to mitigate against the newly discovered security holes, related to speculative execution (CPUs getting a performance gain by making calculations ahead of time and throwing them away if you don’t use them).

Intel’s customers are not you and me, the users, but rather the companies that purchase CPUs, which in most cases are the big PC manufacturers together with numerous device manufacturers. My Synology NAS has an Intel CPU, for example.

So if you have a PC or server from Vendor A, then when Intel has new microcode it is available to Vendor A. How it gets to your PC or server which you bought from Vendor A is another matter.

There are several ways this can happen. One is that the manufacturer can issue a BIOS update. This is the normal approach, but it does mean that you have to wait for that update, find it and apply it. Unlike Windows patches, BIOS updates do not come down via Windows Update, but have to be applied via another route, normally a utility supplied by the manufacturer. There are thousands of different PC models, and there is no guarantee that any specific model will receive an updated BIOS, or that all users will find and apply it even if one is released. Your chances are better if your PC is from a big name, rather than from a brand nobody has heard of that you bought from a supermarket or on eBay.

Are there other ways to apply the microcode? Yes. If you are technical you might be able to hack the BIOS but, leaving that aside, some operating systems can apply new microcode on boot. Therefore VMware was able to state:

The ESXi patches for this mitigation will include all available microcode patches at the time of release and the appropriate one will be applied automatically if the system firmware has not already done so.

Linux can do this as well. Such updates are volatile; they have to be re-applied on every boot. But there is little harm in that.
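On Linux you can see the revision the kernel is running in /proc/cpuinfo, and the distribution’s microcode package takes care of the per-boot re-application. A quick sketch (package name as on Debian/Ubuntu; other distributions differ):

```shell
# Show the microcode revision the running kernel has loaded; the field
# appears once per logical CPU, so the first match is enough.
grep -m1 '^microcode' /proc/cpuinfo || echo "microcode field not reported"

# On Debian/Ubuntu, the intel-microcode package re-applies the latest
# Intel microcode early on every boot (the update is volatile, as noted
# above):
#   sudo apt-get install intel-microcode
#
# After a reboot, the kernel log shows whether it took effect:
#   dmesg | grep microcode
```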

What about Windows? Unfortunately there is no supported way to do this. However, there is an experimental VMware utility that will do it:

This Fling is a Windows driver that can be used to update the microcode on a computer system’s central processor(s) (“CPU”). This type of update is most commonly performed by a system’s firmware (“BIOS”). However, if a newer BIOS cannot be obtained from a system vendor then this driver can be a potential substitute.

Check the comments – interest in this utility has jumped following the publicity around Spectre/Meltdown. If working exploits start circulating you can expect that interest to spike further.

This is a techie and unsupported solution though and comes with a health warning. Most users will never find it or use it.

That said, there is no inherent reason why Microsoft could not come up with a similar solution for PCs and servers for which no BIOS update is available, and even deliver it through Windows Update. If users do start to suffer widespread security problems which require Intel’s new microcode, it would not surprise me if something appears. If it does not, large numbers of PCs will remain unprotected.

Windows Mixed Reality: Acer headset review and Microsoft’s (lack of) content problem

Acer kindly loaned me a Windows Mixed Reality headset to review, which I have been trying over the holiday period.

First, an aside. I had a couple of sessions with Windows Mixed Reality before doing this review. One was at IFA in Berlin at the end of August 2017, where the hardware and especially the software was described as late preview. The second was at the Future Decoded event in London, early November. On both occasions, I was guided through a session either by the hardware vendor or by Microsoft. Those sessions were useful for getting a hands-on experience; but an extended review at home has given me a different understanding of the strengths and weaknesses of the product. Readers beware: those rushed “reviews” based on hands-on sessions at vendor events are poor guides to what a product is really like.

A second observation: I wandered into a few computer game shops before Christmas and Windows Mixed Reality hardware was nowhere to be seen. That is partly because PC gaming has hardly any bricks and mortar presence now. Retailers focus on console gaming, where there is still some money to be made before all the software becomes download-only. PC game sales are now mainly Steam-powered, with a little bit of competition from other download stores including GOG and Microsoft’s Windows Store. That Steam and download dominance has many implications, one of which is invisibility on the High Street.

What about those people (and there must be some) who did unwrap a Windows Mixed Reality headset on Christmas morning? Well, unless they knew exactly what they were getting and enjoy being on the bleeding edge I’m guessing they will have been a little perplexed and disappointed. The problem is not the hardware, nor even Microsoft’s implementation of virtual reality. The problem is the lack of great games (or other virtual reality experiences).

This may improve, provided Microsoft sustains enough momentum to make Windows Mixed Reality worth supporting. The key here is the relationship with Steam. Microsoft cheerfully told the press that Steam VR is supported. The reality is that Steam VR support comes via preview software which you get via Steam and which states that it “is not complete and may or may not change further.” It will probably all be fine eventually, but that is not reassuring for early adopters.

image

My experience so far is that native Windows MR apps (from the Microsoft Store) work more smoothly, but the best content is on Steam VR. The current Steam preview does work, though with a few limitations (no haptic feedback) and other issues depending on how much effort the game developers have put into supporting Windows MR.

I tried Windows MR on a well-specified gaming PC: Core i7 with NVIDIA’s superb GTX 1080 GPU. Games in general run super smoothly on this hardware.

Getting started

A Windows Mixed Reality headset has a wired connection to a PC, broken out into an HDMI and a USB 3.0 connection. You need Windows 10 Fall Creators Update installed, after which setup should be a matter of plugging in your headset, whereupon the hardware is detected and a setup wizard starts, downloading additional software as required.

image

In my case it did not go well. Setup started OK but went into a spin, giving me a corrupt screen and never completing. The problem, it turned out, was that my GPU has only one HDMI port, which I was already using for the main display. I had the headset plugged into a DisplayPort socket via an adapter. I switched this around, so that the headset uses the real HDMI port, and the display uses the adapter. Everything then worked perfectly.

The controllers use Bluetooth. I was wary, because in my previous demos the controllers had been problematic, dropping their connection from time to time, but these work fine.

image

They are perhaps a bit bulky, thanks to their illuminated rings which are presumably a key part of the tracking system. They also chew batteries.

The Acer headsets are slightly cheaper than average, but I’ve enjoyed my time with this one. I wear glasses but the headset fits comfortably over them.

A big selling point of the Windows system is that no external tracking sensors are required. This is called inside-out tracking. It is a great feature and makes it easier just to plug in and go. That said, you have to choose between a stationary position, or free movement; and if you choose free movement, you have to set up a virtual boundary so that you do not walk into physical objects while immersed in a VR experience.

image

The boundary is an important feature but also illustrates an inherent issue with full VR immersion: you really are isolated from the real world. Motion sickness and disorientation can also be a problem, the reason being that the images your brain perceives do not match the physical movement your body feels.

Once set up, you are in Microsoft’s virtual house, which serves as a kind of customizable Start menu for your VR experiences.

image

The house is OK though it seems to me over-elaborate for its function, which is to launch games and apps.

I must state at this point that yes, a virtual reality experience is amazing and a new kind of computing. The ability to look all around is extraordinary when you first encounter it, and adds a level of realism which you cannot otherwise achieve. That said, there is some frustration when you discover that the virtual world is not really as extensive as it first appears, just as you get in an adventure game when you find that not all doors open and there are invisible barriers everywhere. I am pretty sure though that a must-have VR game will come along at some point and drive many new sales – though not necessarily for Windows Mixed Reality of course.

I looked for content in the Windows Store. It is slim pickings. There’s Minecraft, which is stunning in VR, until you realise that the controls do not work quite so well as they do in the conventional version. There is Space Pirate, an old-school arcade game which is a lot of fun. There is Arizona Sunshine, which is fine if you like shooting zombies.

I headed over to Steam. The way this works is that you install the Steam app, then launch Windows Mixed Reality, then launch a VR game from your Steam library. You can access the Windows Desktop from within the Windows MR world, though it is not much fun. Although the VR headset offers two 1440 x 1440 displays I found it impossible to keep everything in sharp focus all the time. This does not matter all that much in the context of a VR game or experience, but makes the desktop and desktop applications difficult to use.

I did find lots of goodies in the Steam VR store though. There is Google Earth VR, which is not marked as supporting Windows MR but works. There is also The Lab, a Steam VR demo which does a great job of showing what the platform can do, with several mini-games and other experiences – including a fab archery game called Longbow where you defend your castle from approaching hordes. You can even fire flaming arrows.

image
Asteroids! VR, a short, wordless VR film which is nice to watch once. It’s free though!

Mainstream VR?

Irrespective of who provides the hardware, VR has some issues. Even with inside-out tracking, a Windows Mixed Reality setup is somewhat bulky and makes the wearer look silly. The kit will become lighter, as well as integrating audio. HTC’s Vive Pro, just announced at CES, offers built-in headphones and has a wireless option, using Intel’s WiGig technology.

Even so, there are inherent issues with a fully immersive environment. You are vulnerable in various ways. Having people around wearing earbuds and staring at a screen is bad enough, but VR takes anti-social to another level.

The added expense of creating the content is another issue, though the right tools can do an amazing job of simplifying and accelerating the process.

It is worth noting that VR has been around for a long time. Check out the history here: Virtual Reality arcade machines in 1991, Sega VR Glasses in 1993. Why has this stuff taken so long to take off, and why does it remain in its early stages? It is partly about technology catching up to the point of real usability and affordability, but there is also an open question about how much VR we want and need.

Why patching to protect against Spectre and Meltdown is challenging

The tech world has been buzzing with news of bugs (or design flaws, take your pick) mainly in Intel CPUs, going back many years, which enable malware to access memory in the computer that should be inaccessible.

How do you protect against this risk? The industry has done a poor job in communicating what users (or even system admins) should do.

A key reason why this problem is so serious is that it risks a nightmare scenario for public cloud vendors, or any hosting company. This is where software running in a virtual machine is able to access memory, and potentially introduce malware, in either the host server or other virtual machines running on the same server. The nature of public cloud is that anyone can run up a virtual machine and do what they will, so protecting against this issue is essential. The biggest providers, including AWS, Microsoft and Google, appear to have moved quickly to protect their public cloud platforms. For example:

The majority of Azure infrastructure has already been updated to address this vulnerability. Some aspects of Azure are still being updated and require a reboot of customer VMs for the security update to take effect. Many of you have received notification in recent weeks of a planned maintenance on Azure and have already rebooted your VMs to apply the fix, and no further action by you is required.

With the public disclosure of the security vulnerability today, we are accelerating the planned maintenance timing and will begin automatically rebooting the remaining impacted VMs starting at 3:30pm PST on January 3, 2018. The self-service maintenance window that was available for some customers has now ended, in order to begin this accelerated update.

Note that this fix is at the hypervisor, or host, level. It does not patch your VMs on Azure. So do you also need to patch your VM? Yes, you should; and your client PCs as well. For example, KB4056890 (for Windows Server 2016 and Windows 10 1607), KB4056891 (for Windows 10 1703), or KB4056892 (for Windows 10 1709). This is where it gets complex though, for two main reasons:

1. The update will not be applied unless your antivirus vendor has set a special registry key. The reason is that the update may crash your computer if the antivirus software accesses memory in a certain way, which it may do. So you have to wait for your antivirus vendor to do this, or remove your third-party antivirus and use the built-in Windows Defender.

2. The software patch is not complete protection. You also need to update your BIOS, if an update is available. Whether or not it is available may be uncertain. For example, I am pretty sure that I found the right update for my HP PC, based on the following clues:

– The update was released on December 20 2017

– The description of the update is “Provides improved security”

image
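On the first point, the registry value that Windows Update checks is documented by Microsoft, and if you run no third-party antivirus at all (or your vendor is slow) it can be set by hand from an elevated command prompt. Setting it asserts compatibility on the antivirus vendor’s behalf, so only do this when you are sure no incompatible product is installed:

```shell
reg add "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\QualityCompat" /v "cadca5fe-87d3-4b96-b7fb-a231484277cc" /t REG_DWORD /d 0 /f
```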

So now is the time, if you have not done so already, to go to the support sites for your servers and PCs, or your motherboard vendor if you assembled your own, see if there is a BIOS update, try to figure out if it addresses Spectre and Meltdown, and apply it.

If you cannot find an update, you are not fully protected.

It is not an easy process and realistically many PCs will never be updated, especially older ones.

What is most disappointing is the lack of clarity or alerts from vendors about the problem. I visited the HPE support site yesterday in the hope of finding up-to-date information on HP’s server patches, to find only a maze of twisty little link passages, all alike, none of which led to the information I sought. The only thing you can do is to trace the driver downloads for your server in the hope of finding a BIOS update.

Common sense suggests that PCs and laptops will be a bigger risk than private servers, since unlike public cloud vendors you do not allow anyone out there to run up VMs.

At this point it is hard to tell how big a problem this will be. Best practice though suggests updating all your PCs and servers immediately, as well as checking that your hosting company has done the same. In this particular case, achieving this is challenging.

PS kudos to BleepingComputer for this nice article and links; the kind of practical help that hard-pressed users and admins need.

There is also a great list of fixes and mitigations for various platforms here:

https://github.com/hannob/meltdownspectre-patches

PPS see also Microsoft’s guidance on patching servers here:

https://support.microsoft.com/en-us/help/4072698/windows-server-guidance-to-protect-against-the-speculative-execution

and PCs here:

https://support.microsoft.com/en-us/help/4073119/protect-against-speculative-execution-side-channel-vulnerabilities-in

There is a handy PowerShell module called SpeculationControl which you can install and run to check status. I was able to confirm that the HP BIOS update mentioned above is the right one. Just run PowerShell with admin rights and type:

install-module speculationcontrol

then type

get-speculationcontrolsettings

image

Thanks to @teroalhonen on Twitter for the tip.

Let’s Encrypt: a quiet revolution

Any website that supports SSL (an HTTPS connection) requires a digital certificate. Until relatively recently, obtaining a certificate meant one of two things. You could either generate your own, which works fine in terms of encrypting the traffic, but results in web browser warnings for anyone outside your organisation, because the issuing authority is not trusted. Or you could buy one from a certificate provider such as Symantec (Verisign), Comodo, Geotrust, Digicert or GoDaddy. These certificates vary in price from fairly cheap to very expensive, with the differences being opaque to many users.
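The first option is easy to see with the openssl command-line tool. A self-signed certificate (the CN below is a made-up name, purely for illustration) encrypts traffic perfectly well; browsers object only because the issuer and the subject are the same party, so nobody independent vouches for it:

```shell
# Generate a throwaway private key and self-signed certificate,
# valid for one year, with no passphrase on the key.
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout selfsigned.key -out selfsigned.crt \
  -days 365 -subj "/CN=www.example.internal"

# Subject and issuer are identical -- the mark of a self-signed
# certificate, and the reason browsers show a warning.
openssl x509 -in selfsigned.crt -noout -subject -issuer
```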

Let’s Encrypt is a project of the Internet Security Research Group, a non-profit organisation founded in 2013 and sponsored by firms including Mozilla, Cisco and Google Chrome. Obtaining certificates from Let’s Encrypt is free, and they are trusted by all major web browsers.

image

Last month Let’s Encrypt announced coming support for wildcard certificates as well as giving some stats: 46 million active certificates, and plans to double that in 2018. The post also notes that the latest figures from Firefox telemetry indicate that over 65% of the web is now served using HTTPS.

image
Source: https://letsencrypt.org/stats/

Let’s Encrypt only started issuing certificates in January 2016 so its growth is spectacular.

The reason is simple. Let’s Encrypt is saving the IT industry a huge amount in both money and time. Money, because its certificates are free. Time, because it is all about automation, and once you have the right automated process in place, renewal is automatic.

I have heard it said that Let’s Encrypt certificates are not proper certificates. This is not the case; they are just as trustworthy as those from the other SSL providers, with the caveat that everything is automated. Some types of certificate, such as those for code-signing, have additional verification performed by a human to ensure that they really are being requested by the organisation claimed. No such thing happens with the majority of SSL certificates, for which the process is entirely automated by all the providers and typically requires only that the requester can receive email at the domain for which the certificate is issued. Let’s Encrypt uses other techniques, such as proof that you control the DNS for the domain, or that you are able to write a file to the website it serves. Certificates that require human intervention will likely never be free.

A Let’s Encrypt certificate is only valid for three months, whereas those from commercial providers last at least a year. Despite appearances, this is not a disadvantage. If you automate the process, it is not inconvenient, and a certificate with a shorter life is more secure as it has less time to be compromised.
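Automation in practice means a scheduled run of an ACME client. As a sketch, assuming the widely used certbot client, a cron entry like this is all it takes, because certbot only renews certificates that are close to expiry:

```shell
# /etc/cron.d/certbot (sketch; paths and user vary by distribution).
# Running twice daily is harmless: certbot skips any certificate
# that is not yet within its renewal window.
0 */12 * * * root certbot renew --quiet
```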

The ascendance of Let’s Encrypt is probably regretted both by the commercial certificate providers and by IT companies who make a bit of money from selling and administering certificates.

Let’s Encrypt certificates are issued in plain-text PEM (Privacy Enhanced Mail) format. Does that mean you cannot use them in Windows, which typically uses .cer or .pfx certificates?  No, because it is easy to convert between formats. For example, you can use the openssl utility. Here is what I use on Linux to get a .pfx:

openssl pkcs12 -inkey privkey.pem -in fullchain.pem -export -out yourcert.pfx

If you have a website hosted for you by a third party, can you use Let’s Encrypt? Maybe, but only if the hosting company offers this as a service. They may not be in a hurry to do so, since there is a bit of profit in selling SSL certificates; but on the other hand, a far-sighted ISP might win some business by offering free SSL as part of the service.

Implications of Let’s Encrypt

Let’s Encrypt removes the cost barrier for securing a web site, subject to the caveats mentioned above. At the same time, Google is gradually stepping up warnings in the Chrome browser when you visit unencrypted sites:

Eventually, we plan to show the “Not secure” warning for all HTTP pages, even outside Incognito mode.

Google search is also apparently weighted in favour of encrypted sites, so anyone who cares about their web presence or traffic is or will be using SSL.

Is this a good thing? Given the trivia (or worse) that constitutes most of the web, why bother encrypting it, which is slower and takes more processing power (bad for the planet)? Note also that encrypting the traffic does nothing to protect you from malware, nor does it insulate web developers from security bugs such as SQL injection attacks – which is why I prefer to call SSL sites encrypted rather than secure.

The big benefit though is that it makes it much harder to snoop on web traffic. This is good for privacy, especially if you are browsing the web over public Wi-Fi in cafes, hotels or airports. It would be a mistake though to imagine that if you are browsing the public web using HTTPS that you are really private: the sites you visit are still getting your data, including Facebook, Google and various other advertisers who track your browsing.

In the end it is worth it, if only to counter the number of times passwords are sent over the internet in plain text. Unfortunately people remain willing to send passwords by insecure email so there remains work to do.

New year, new web site

I took the opportunity of the Christmas break to move itwriting.com to a new server.

The old server has worked wonderfully for many years, but in that time a lot of cruft accumulated.

The old WordPress template was also out of date. Today it is necessary not only to have a site that works well on mobile, but also one that is served over SSL. I am taking advantage of Let’s Encrypt to give itwriting.com trusted SSL support.

On the old site I ran three blogs: itwriting.com, aimed at professionals; gadgets.itwriting.com, aimed at consumers; and taggedtalk.com, for when I occasionally wanted to blog about music. Running three blogs is a hassle, and I decided to combine them into one despite the differences in content. The idea is to use WordPress categories to make sense of this, but that too is a work in progress.

The price of this migration is broken links. Content migrated from the old itwriting.com blog is mostly fine, though there are some images linked over http that will need to be fixed. Content from the other two blogs is more problematic and I have some work to do tidying up the images. There is also the old pre-WordPress blog, which is now offline. This was active from 2003 to 2006 and I am undecided about whether to reinstate it.

Apologies then for the disruption but I hope it will be worth it.

Honor 7x: a great value mid-range smartphone spoilt by unexciting design

Honor is Huawei’s youth/consumer smartphone brand and deserves its reputation for putting out smartphones with compelling features for their price. Just released in the UK is the Honor 7x, a mid-range phone whose most striking features are a 5.93" 2160×1080 (18:9) display and dual-lens camera.

I came to respect the Honor brand when I tried the Honor 8, a gorgeous translucent blue device which at the time seemed to provide all the best features of Huawei’s premium phone at a lower price. The Honor 8 is 18 months old now, but still on sale for around £70 more than a 7x (there is also an Honor 9 which I have not tried). The 5.2" 1920 x 1080 screen happens to be the perfect size for my hands.

What about the new 7x though?

The 7x feels solid and well-made, though unexciting in appearance. The smooth rear of the matt metal case is broken only by the fingerprint reader, the dual camera lenses and the flash. On the front, the phone speaker and front-facing camera sit above the screen, while the media speaker, microphone, headset socket and Micro-B USB port are along the bottom edge. The larger than average screen does make for a phone that is less comfortable to hold than a smaller device, but that is the trade-off you make.

The Kirin 659 processor has 8 ARM Cortex-A53 cores: four high-speed 2.38 GHz cores and four power-saving 1.7 GHz cores. The SoC (System on a Chip) also includes an ARM Mali-T830 graphics processing unit. This is a mid-range processor which is fine for everyday use but not a powerhouse. Benchmark performance is around 15% better than Samsung’s Exynos 7 Octa 7870, found in the Galaxy A3, for example.

PCMark came up with a score of 4930, somewhat behind the older Honor 8 at 5799.

image

The screen resolution at 2160 x 1080 is impressive, though I found it a little dull on the default automatic brightness settings.

Music audio quality is great on headphones or high quality earbuds, but poor using the built-in loudspeaker – usually not important, but I have heard much better.

Where this phone shines is in photography. The dual lens is now well proven technology from Huawei/Honor and does make a difference, enabling better focusing and sharper images. If you enable the wide aperture in the camera, you can refocus pictures after the event, a magical feature.

If you swipe from the left in the camera app, you can select between a dozen or so modes, including Photo, Pro Photo, video, panorama, time-lapse, effects and more. Selecting Pro Photo enables controls for metering (determines how the camera calculates the exposure), ISO, shutter speed, exposure compensation (affects brightness) and focus mode.

image

If you swipe from the right, you can access settings including photo resolution (default is 4608 x 3456), storage location, GPS tagging, object tracking and more. There is no option for RAW images though.

image

Along the top of the camera screen are settings for flash, wide aperture, portrait mode, moving picture (records a short video when you take a picture), and front/rear camera enable.

There are also a couple of features aimed at selfies or group photos, where you want to be in the picture. If you enable audio control, the camera will take a picture when you say “Cheese”. If you enable gesture control (only works with the front camera), you can take a picture by raising your hand, triggering a countdown. I tried both features and they work.

How are the actual results though? Here is a snap taken with default settings on the 7x, though I’ve resized the image for this blog:

and here it is again on the Honor 8:

Personally I think the colours are a bit more natural on the Honor 8 but there is not much between them. I was also impressed with the detail when zoomed in. In the hands of an expert you could take excellent pictures with this, and those of us taking quick snaps will be happy too.

Likes, dislikes and conclusion

For 25% of the price of Apple’s latest iPhone, you get a solid and capable device with above average photographic capability and a high resolution display. I also like the fact that the fingerprint reader is on the rear, even though this is against the recent trend. This makes it easy to pick up and unlock the phone with one hand, with no need for face recognition.

Still, while I would be happy to recommend the phone, I do not love it. The design is plain and functional, rather out of keeping with Honor’s “for the brave” slogan. No NFC is a negative, and it is a shame Honor has provided the old micro USB instead of USB C as on the premium models.

These are minor nitpicks though and I cannot fault it for value or essential features.

Specification

OS Android 7
Chipset 8-core Kirin 659 (4 x 2.38 GHz + 4 x 1.7 GHz)
Battery 3340 mAh
Screen 2160 x 1080
Rear camera Dual lens 16MP + 2MP, F/0.95 – F/16 aperture
Front camera 8MP
Connectivity 802.11b/g/n Wi-Fi, Bluetooth 4.1, USB 2.0
Dimensions 156.5mm x 75.3mm x 7.6mm
Weight 165g
Memory 4GB RAM, 64GB storage, microSD up to 256GB
SIM slots Dual TD-LTE/FDD-LTE/WCDMA/GSM SIM, or SIM + microSD
Fingerprint reader Rear
Sensors Proximity, ambient light, compass, gravity
Audio 3.5mm headset jack
Materials Metal unibody design
Price £269.99

Google’s Digital Garage, hosted by UK City Councils

I have recently moved into a new area and noticed that my (now) local city council was running a Google Digital Garage:

Winchester City Council is very excited to be partnering up with The Digital Garage from Google – a digital skills training platform to assist you in growing your business, career and confidence, online. Furthermore, a Google digital expert is coming to teach you what is needed to gain a competitive advantage in the ever changing digital landscape, so come prepared to learn and ask questions, too.

I went along as a networking opportunity and to learn more about Google’s strategy. The speaker was from Google partner Uplift Digital, “founded by Gori Yahaya, a digital and experiential marketer who had spent years working on behalf of Google, training and empowering thousands of SMEs, entrepreneurs, and young people up and down the country to use digital to grow their businesses and further their careers.”

I am not sure “digital garage” was the right name in this instance, as it was essentially a couple of presentations with not much interaction and no hands-on element. The first session had three themes:

  • Understanding search
  • Manage your presence on Google
  • Get started with paid advertising

What we got was pretty much the official Google line on search: make sure your site performs well on mobile as well as desktop, use keywords sensibly, and leave the rest to Google’s algorithms. The second topic was mainly about Google’s local business directory, My Business. Part three introduced paid advertising, mainly covering Google AdWords; there was no mention of click fraud. We were told to be wary of Facebook advertising, since it is rumoured that advertising on Facebook may actually decrease your organic reach. Don’t bother advertising on Twitter, said the speaker.

image

Session two was about other ways to maintain a digital presence, mainly looking at social media, along with a (rather unsatisfactory) introduction to Google Analytics. The idea is to become an online authority in what you do, we were told. Good advice. YouTube is the second most popular search engine, we were told, and we should consider posting videos there. The speaker recommended the iOS app YouTube Director for Business, a free tool which I later discovered is discontinued from 1st December 2017; it is being replaced by Director Onsite which requires you to spend $150 on YouTube advertising in order to post a video.

Overall I thought the speaker did a good job on behalf of Google and there was plenty of common sense in what was presented. It was a Google-centric view of the world, which is not surprising considering that the programme is, as far as I can tell, entirely funded by Google.

As you would also expect, the presentation was weak concerning Facebook, Twitter and other social media platforms. Facebook in particular seems to be critically important for many small businesses. One lady in the audience said she did not bother with a web site at all since her Facebook presence was already providing as many orders for her cake-making business as she could cope with.

We got a sanitised view of the online world which in reality is a pretty mucky place in many respects.

IT vendors have always been smart about presenting their marketing as training and it is an effective strategy.

The aspect that I find troubling is that this comes hosted and promoted by a publicly funded city council. An independent presentation, or a session involving multiple companies with different perspectives, would of course be much preferable; but I imagine the offer of free training, and ticking the box for “doing something about digital”, is too sweet for hard-pressed councils to resist, even if it means turning a blind eye to Google’s ability to make big profits in the UK while paying little tax.

Google may have learned from Microsoft and its partners who once had great success in providing basic computer training which in reality was all about how to use Microsoft Office, cementing its near-monopoly.

The Scalford Hi-Fi show is dead – long live the Kegworth “Europe’s biggest Hi-Fi enthusiasts show”?

It was March 2009 when I took part in an unusual Hi-Fi show, variously known as the Scalford, Wigwam, Wam or Pie Show (Pie Show because Scalford is near Melton Mowbray, home of the Pork Pie, and Pie rhymes with Hi-Fi). Wigwam was and is a Hi-Fi enthusiasts forum, and the idea was to put on a show where the kit on display was not the latest stuff from big brands, but rather the actual systems in use by enthusiasts. Without the normal income from commercial exhibitors, the cost of the hotel booking was met by the entrance fee (£10 as I recall). Exhibitor rooms were free other than a small contribution to public liability insurance. The early shows were run by audio show specialists Chester, who did it, they said, as a community-building exercise.

Scalford Hall is an English country hotel which must once have been a grand country residence. It is beautiful, rambling and impractical, but full of atmosphere.

The show was an extraordinary success. There was a vastly greater variety of gear on show than at commercial shows, ranging from conventional and modern to old and home-made. The exhibitors were enthusiasts who loved to talk about their systems, and the sound achieved was in general rather better than most. A few pictures, not from the first show:

image

image

Personally I had a great time at Scalford and exhibited 8 years in succession (starting with the first). Hmm let me see:

2009: plain Squeezebox, Naim 32.5/Hicap/250 and Kans 

image

2010: Ergo speakers designed by James at another Hi-Fi forum, Pink Fish Media, loaned to me for the event. Same source and amplification.

2011: Active Speakers AVI ADM 9 with BK subwoofer

2012: Linn Kaber loudspeakers with Naim amplification; my least successful room I feel. I thought the Naim amplifier would get the Kabers sounding at their best but the sound was average and I was not sure how to fix it. 

2013: Active Speakers Behringer B3031A. The theme here was how to get a great sound on a small budget, and the Behringer active speakers offer a lot for the price.

2014: Amplifier comparison Naim as above vs Yamaha AS500

This was fascinating; a modern budget amplifier compared to a classic pre-power combination loved by many but also considered coloured. Most thought both sounded great and were not sure which was which.

2015: DSD vs PCM comparison using Teac DSD DAC 

2016: Raspberry Pi system with no separate amplifier

Some of these events have separate write-ups on this blog.

My goal was not to have the best sounding system but to do something interesting and enjoyable.

Enjoyable it was, but also hard work – at first I didn’t bother booking a room for the night as I lived within a 45-minute drive, but I gradually realised that staying over worked better, both for access to the room and for hearing other rooms the night before.

Heaving equipment around is no fun. I didn’t have the heaviest stuff, but even so, amplifiers, subs and speaker stands are hefty enough. Some of my stuff got a bit bashed about too, though scratches rather than real damage.

The earliest events were supposedly run at break-even or thereabouts by Chester. The only commercial presence in the early shows was a record shop in the lobby.

I was personally fine with everything as we were doing something a bit different that would not otherwise be possible.

Gradually more commercial rooms appeared and it became harder and harder to secure good rooms. My room in year 1 was brilliant and sounded great as a result. Many of the rooms though were small hotel bedrooms in an extension rather than the older part of the building, with poor sound insulation. It was hard to get a good sound in these rooms.

I also began (speaking personally) to feel a bit unappreciated: it was the exhibitors who made the event worth going to, yet we paid for the privilege, and if someone managed to make some money (as I believe the organisers did in some years), none of it came to us – not even a free beer or two. After the first couple of shows the organisation passed to the owners of the WigWam forum, which itself changed hands a few times. In 2017 my heart was no longer in it and I did not exhibit.

The trend towards greater commercialism continues, and the WigWam’s current owners now promote the event as “Europe’s Biggest HiFi Enthusiasts Show”. The cost for exhibitors has increased and now starts at £85. I have fond recollections of the show and hope it goes from strength to strength, but last year felt it was no longer for me.

Scalford was a wonderful venue, quirky and romantic; visitors could still be surprised to open a door or ascend a stairway and find a corridor of rooms they had somehow missed. Of course it was also a bit impractical, and the catering was rather ho-hum, but that wasn’t a big deal for me.

The show is now moving to Kegworth, just off the M1 near Nottingham. The move to a hotel handy for the motorway and airport is another step away from the atmosphere and culture of the initial concept.

That said, I have no doubt that it will remain a remarkable and unusual event and hope it continues to be a great success.
