NVIDIA’s GPU in the cloud: will you still want an Xbox or PlayStation?

NVIDIA’s GPU Technology Conference is an unusual event, in part a get-together for academic researchers using HPC, in part a marketing pitch for the company. The focus of the event is GPU computing, in other words using the GPU for purposes other than driving a display, such as running simulations to model climate change or fluid dynamics, or processing huge amounts of data to calculate where best to drill for oil. However, NVIDIA also uses the event to announce its latest GPU innovations, and CEO Jen-Hsun Huang used this morning’s keynote to introduce the company’s GPU in the cloud initiative.

This takes two forms, though both are based on a feature of the new “Kepler” generation of NVIDIA GPUs which allows them to render graphics to a stream rather than to a display. This is, he claimed, the world’s first virtualized GPU.


The first target is enterprise VDI (Virtual Desktop Infrastructure). The idea is that in the era of BYOD (Bring Your Own Device) there is high demand for the ability to run Windows applications on devices of every kind, perhaps especially Apple iPads. This works fine via virtualisation for everyday applications, but what about GPU-intensive applications such as AutoCAD or Adobe Photoshop? Using a Kepler GPU you can run up to 100 virtual desktop instances with GPU acceleration. NVIDIA calls this the VGX Platform.


What actually gets sent to the client is mostly H.264 video, which means most current devices have good support, though of course you still need a remote desktop client.

The second target is game streaming. The key problem here – provided you have enough bandwidth – is minimising the lag between when a player moves or clicks Fire, and when the video responds. NVIDIA has developed software called GeForce GRID, which it will supply along with specially adapted Kepler GPUs to cloud companies such as Gaikai. Using GeForce GRID, lag is reduced, according to NVIDIA, to something close to what you would get from a game console.


We saw a demo of a new Mech shooter game in which one player is using an Asus Transformer Prime, an Android tablet, and the other an LG television which has a streaming client built in. The game is rendered in the cloud but streamed to the clients with low latency.


“This is your game console,” said NVIDIA CEO Jen-Hsun Huang, holding the Ethernet cable that connected the TV to the internet.


The concept is attractive for all sorts of reasons. Users can play games without having to download and install, or connect instantly to a game being played by a friend. Game companies are protected from piracy, because the game code runs in the cloud, not on the device.

NVIDIA does not plan to run its own cloud services, but is working with partners. On the VDI side, Citrix, Microsoft, VMware and Xen were mentioned as partners.


If cloud GPU systems take off, will they cannibalise the market for powerful GPUs in client devices, whether PCs, game consoles or tablets? I put this to Huang in the press Q&A after the keynote, and he denied it, saying that people like designers hate to share their PCs. It was an odd and unsatisfactory answer. After all, if Huang is saying that your games console is now an Ethernet cable, he is also saying that there is no longer any need for game consoles which contain powerful NVIDIA GPUs. The same might apply to professional workstations, with the logic that cloud computing always presents: that shared resources have better utilisation and therefore lower cost.

Review: Kingston 240GB SSDNow V+200 SSD kit

Prices for SSDs (solid-state drives) are falling and capacities are rising, so much so that fitting one now looks eminently sensible if you value performance and can manage with a bit less space than a hard drive offers. Note, though, that you should really run Windows 7, or on the Mac OS X Snow Leopard or later, as these operating systems support TRIM, which improves performance by telling the drive which blocks of data are no longer in use and can be safely erased. On Windows you can check that TRIM is active, as shown below.
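
A quick way to check, as a minimal sketch assuming Python 3 on Windows 7 or later (the same fsutil command can of course be run directly at an administrator command prompt):

    import subprocess

    # Ask Windows whether TRIM ("delete notifications") is enabled.
    # DisableDeleteNotify = 0 means TRIM commands are sent to the SSD.
    result = subprocess.run(
        ["fsutil", "behavior", "query", "DisableDeleteNotify"],
        capture_output=True, text=True,
    )
    print(result.stdout.strip() or result.stderr.strip())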

The primary benefit of SSD is performance, but you also get silent running and lower power consumption.


This Kingston kit is a generous bundle, suitable for converting a laptop or desktop. It includes a USB-powered external disk caddy which assists with the transfer of your existing data, as well as enabling you to continue using your old laptop drive for external storage if you wish. There are also brackets and cables so you can fit the drive into a desktop PC, and a CD containing an Acronis disk-cloning tool.

The recommended method for installation depends on whether you are upgrading a laptop or a desktop. The first step is the same for both and may be the hardest: reduce the size of the data on your existing drive to less than 240GB. Next, if you are on a laptop, you remove the existing drive, install the SSD, fit the existing drive to the caddy and connect it via USB, reboot from the CD into Acronis, and clone the existing drive to the SSD.

If you are on a desktop, your existing 3.5” drive will not fit into the caddy, so you fit the SSD to the caddy, connect it, reboot into Acronis, clone the existing drive to the SSD, and then switch off and replace the existing desktop drive with the SSD using the brackets provided.

For this review I used the former approach, though either should work well. On a three-year-old laptop running 64-bit Windows 7 I was rewarded with a Windows Experience Index for the hard drive of 7.7.


However, this laptop only has SATA 2, whereas the drive supports SATA 3 and would work faster if this were available.

Kingston quotes 480MB/s for sequential writes, and power consumption of 0.565W idle rising to 2.065W for writes.
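
If you fancy a rough sanity check of that sequential write figure, a few lines of Python will do. This is only a sketch: the file name and size are arbitrary, and OS caching means a proper benchmarking tool will give more accurate numbers.

    import os, time

    # Write 1GB sequentially in 4MB blocks and report throughput.
    block = os.urandom(4 * 1024 * 1024)
    path = "testfile.bin"
    start = time.time()
    with open(path, "wb") as f:
        for _ in range(256):        # 256 x 4MB = 1GB
            f.write(block)
        f.flush()
        os.fsync(f.fileno())        # force the data out of the OS cache
    print("%.0f MB/s" % (1024 / (time.time() - start)))
    os.remove(path)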

If you do not need the kit you can get the SSD a little cheaper on its own.

An excellent kit though, and the Acronis cloning solution is cleaner than others I have seen which require software to be installed in Windows.

 

Fast service at Microsoft store in San Jose

I made a brief visit to the Valley Fair mall in San Jose yesterday and took a quick look at the Microsoft and Apple stores.

Personally I like the Microsoft stores. They are probably not the cheapest place to buy a Windows machine, but you do get the Signature install, which, as Microsoft notes:

Many new PCs come filled with lots of trialware and sample software that slows your computer down—removing all that is a pain, so we do it for you!

So much for the famous Windows partner ecosystem, eh! But I reckon this is worth the extra cost for most people.

Now, before looking at the following images, which were just snapped as I passed, note that:

1. It was a quiet Monday afternoon and none of the stores was busy.

2. I guess San Jose is Apple land; certainly I have not seen the Seattle store this quiet.

Nevertheless, there did seem to be a mismatch between the numbers of staff and customers. When I went in I was offered help three times and a free drink once.

The guy in yellow at the front left is protesting about some alleged Microsoft misdemeanour.

[image: the Microsoft store]

The Apple store was not exactly heaving and there were plenty of blue shirts, but a few more customers.

[image: the Apple store]

A vending machine with a difference: this one buys your old phone

I’m visiting San Jose and looked into the Valley Fair shopping mall. I was intrigued to see an inverse vending machine, one that buys your old phone or other gadgets.


You tap the screen to start the sale or get a valuation. Then you pop your old phone into the receptacle and the machine checks it out, using “Advanced machine vision and artificial intelligence” to work out what you have put in. You have to power up your device for a second-stage evaluation, presumably to check whether it actually works. Finally, you get paid in cash or credit, with an option to donate to charity.

The EcoATM machine will take anything, though some devices have zero value in which case you have only the warm glow of satisfaction that comes from recycling.

I was offered up to $104 for my 8GB Apple iPhone 4, though it was a valuation only since the machine was sadly not fully working. Of course you could do better on eBay, but instant cash and no hassle has its attractions.

OK, so what if you grab someone else’s phone, throw it into the machine, and walk away with the cash? The makers claim to have all sorts of anti-theft measures, including video of you doing the deed, though conceptually the idea does seem vulnerable to abuse.

These machines are USA only at the moment, though an international roll-out is planned.

Find out more here, or by watching the video below.

NVIDIA Nsight comes to Eclipse for Mac, Linux GPU programming

NVIDIA has ported its Nsight development tools, previously a plug-in for Visual Studio, to run within the open source Eclipse IDE for use on Mac and Linux.


The Nsight tools include profiling, refactoring, syntax highlighting and auto-completion, as well as a bunch of code samples.

The Windows version for Visual Studio has also been updated, and now supports local GPU debugging as well as new support for DirectX frame debugging and analysis.

Although Eclipse of course runs on Windows, Nsight users there should continue to use the Visual Studio version; NVIDIA is not supporting the Eclipse edition of Nsight on Windows.

The tools are in preview and you can sign up to try them here.

Another significant development is the availability of the CUDA LLVM Compiler. NVIDIA has contributed CUDA compiler code to the open source LLVM project. This means that other languages which compile to LLVM’s intermediate representation can be adapted to support parallel processing on NVIDIA GPUs. The CUDA Compiler SDK will be made available this week at the NVIDIA GPU Technology Conference in San Jose.
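
One example of where this leads is Python’s Numba project, which compiles Python functions through LLVM to PTX for NVIDIA GPUs. A minimal sketch of the idea, assuming the numba and numpy packages and a CUDA-capable GPU; the kernel is purely illustrative:

    import numpy as np
    from numba import cuda

    @cuda.jit
    def scale(v, k):
        i = cuda.grid(1)            # absolute index of this GPU thread
        if i < v.size:
            v[i] *= k

    v = cuda.to_device(np.arange(1024, dtype=np.float32))
    scale[4, 256](v, 2.0)           # launch 4 blocks of 256 threads
    print(v.copy_to_host()[:4])     # [0. 2. 4. 6.]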

Review: Digital Wars by Charles Arthur

Subtitled Apple, Google, Microsoft and the battle for the internet, this is an account by the Guardian’s Technology Editor of the progress of three tech titans between 1998 and the present day. In 1998, Google was just getting started, Apple was at the beginning of its recovery under the returning CEO Steve Jobs, and Microsoft dominated PCs and was busy crushing Netscape.

Here is how the market capitalization of the three changed between 1998 and 2011:

              End 1998         Mid 2011
Apple         $5.4 billion     $346.7 billion
Google        $10 million      $185.1 billion
Microsoft     $344.6 billion   $214.3 billion

This book tells the story behind that dramatic change in fortunes. It is a great read, written in a concise, clear and engaging style, and informed by the author’s close observation of the technology industry over that period.

That said, it is Apple that gets the best quality coverage here, not only because it is the biggest winner, but also because it is the company for which Arthur feels most affinity. When it comes to Microsoft, the book focuses mainly on the company’s big failures in search, digital music and smartphones. These failures are well described, but the question of why Microsoft has performed so badly is never fully articulated, though there is reference to the impact of the antitrust case and an unflattering portrayal of CEO Steve Ballmer. The inner workings of Google are even less visible, and if your main interest is the ascent of Google you should look elsewhere.

Leaving aside Google then, the story of Apple’s success alongside Microsoft’s colossal blunders makes compelling reading. Arthur is perhaps a little unfair to Microsoft, because he skips over some of the company’s better moments, such as the success of Windows 7 and Windows Server, or even the Xbox 360, though I think he would argue that those successes are peripheral to his theme, which is internet and mobile.

The heart of the book is in chapters four, on digital music, and five, on smartphones. The iPod, after all, was the forerunner of the iPhone, and the iPhone was the forerunner of the iPad. Microsoft’s famous ecosystem of third-party hardware partners failed to compete with the iPod, and by the time the company got it mostly right by abandoning its partners and creating the Zune, it was too late.

The smartphone story played out even worse for Microsoft, given that this was a market where it already had a significant presence with Windows Mobile. Arthur describes the launch of the iPhone, and then recounts how Microsoft acquired a great mobile phone team with a company called Danger, and proceeded to destroy it. The Danger/Pink episode shows more than any other how broken Microsoft’s management and mobile strategy was. Danger was acquired in February 2008. There was then, Arthur describes, an internal battle between the Windows Mobile team and the Danger team, won by the Windows Mobile team under Andy Lees, resulting in an 18-month delay while the Danger operating system was rewritten to use Windows CE. By the time the first “Project Pink” phone was delivered it was short on features and no longer wanted by Verizon, the partner operator. The “Kin” phone was on the market for only 48 days.

The Kin story was dysfunctional Microsoft at its worst: a huge waste of money and effort that could have broken a smaller company. Microsoft shrugged it off, showing that its Windows and Office cash cows continue to insulate it against incompetence, probably too much so for its own long-term health.

Finally, the book leaves the reader wondering how the story continues. Arthur gets the significance of the iPad in business:

Cook would reel off statistics about the number of Fortune 500 companies ‘testing or deploying’ iPads, of banks and brokers that were trying it, and of serious apps being written for it. Apple was going, ever so quietly, after the business computing market – the one that had belonged for years to Microsoft.

Since he wrote those words that trend has increased, forming a large part of what is called Bring Your Own Device or the consumerization of IT. Microsoft does have what it hopes is an answer, which is Windows 8, under a team led by the same Steven Sinofsky who made a success of Windows 7. The task is more challenging this time round though: Windows 7 was an improved version of Windows Vista, whereas Windows 8 is a radical new departure, at least in respect of its Metro user interface, which is aimed at the tablet market. If Windows 8 fares as badly against the iPad as PlaysForSure fared against the iPod, then expect further decline in Microsoft’s market value.

 

System Center 2012, Windows 8 and the BYOD revolution

Yesterday I attended a UK Microsoft MMS catch-up session in Manchester, aimed at those who could not make it to Las Vegas last month. The subject was the new System Center 2012, and how it fits with Microsoft’s concept of the private cloud, and its strategy for supporting Bring Your Own Device (BYOD), the proliferation of mobile devices on which users now expect to be able to receive work email and do other work.

The session, I have to say, was on the dry side; but taken on its own terms System Center 2012 looks good. I was particularly interested in how Microsoft defines “private cloud” versus just a bunch of virtual machines (JBVM?). Attendees were told that a private cloud has four characteristics:

  • Pooled resources: an enterprise cloud, not dedicated servers for each department.
  • Self service: users (who might also be admins) can get new server resources on demand.
  • Elasticity: apps that scale on demand.
  • Usage based: could be charge-back, but more often show-back, the ability to report on what resources each user is consuming.

Microsoft’s virtualization platform is based on Hyper-V, which we were assured now accounts for 28% of new server virtual machines, but System Center has some support for VMware and Citrix Xen as well.

System Center now consists of eight major components:

  • Virtual Machine Manager: manage your private cloud
  • Configuration Manager (SCCM): deploy client applications, manage your mobile devices
  • Operations Manager: monitor network and application health
  • Data Protection Manager: backup, not much mentioned
  • Service Manager: Help desk and change management, not much mentioned
  • Orchestrator: a newish product acquired from Opalis in 2009, automates tasks and is critical for self-service
  • App Controller: manage applications on your cloud
  • Endpoint Protection: anti-malware, praised occasionally but not really presented yesterday

I will not bore you by going through this blow by blow, but I do have some observations.

First, in a Microsoft-platform world System Center makes a lot of sense for large organisations that do not want public cloud and want to move to the next stage in managing their servers and clients without radically changing their approach.

Following on from that, System Center meets some of the requirements Microsoft laid out at the start of the session, but not all. In particular, it is weak on elasticity. Microsoft needs something like Amazon’s Elastic Beanstalk, which lets you deploy an application, set a minimum and maximum instance count, and have the platform handle the mechanics of load balancing and scaling up and down on demand (the sketch below shows the kind of decision such a platform automates). You can do it on System Center, we were told, if you write a bunch of scripts to make it work. At some future point Orchestrator will get auto scale-out functionality.
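
To make the elasticity point concrete, here is a toy Python sketch of the scale-out loop such a platform runs for you. Note that get_average_load, add_instance and remove_instance are hypothetical placeholders, not any real System Center or AWS API:

    MIN_INSTANCES, MAX_INSTANCES = 2, 10

    def autoscale(instances, get_average_load, add_instance, remove_instance):
        """One pass of a scale-out/scale-in decision, run periodically."""
        load = get_average_load(instances)   # e.g. average CPU across instances
        if load > 0.75 and len(instances) < MAX_INSTANCES:
            instances.append(add_instance())      # scale out under high load
        elif load < 0.25 and len(instances) > MIN_INSTANCES:
            remove_instance(instances.pop())      # scale in when mostly idle
        return instances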

Second, it seems to me unfortunate that Microsoft has two approaches to cloud management, one in System Center for private cloud, and one in Azure for public cloud. You would expect some differences, of course; but looking at the deployment process for applications on System Center App Controller it seems to be a different model from what you use for Azure.

Third, System Center 2012 has features to support BYOD and enterprise app stores, and my guess is that this is the way forward. Mobile device management in Configuration Manager uses a Configuration Manager Client installed on the device, or where that is not possible, exploits the support for Exchange ActiveSync policies found in many current smartphones, including features like Approved Application List, Require Device Encryption, and remote wipe after a specified number of wrong passwords entered.

The Software Center in Configuration Manager lets users request and install applications using a variety of different mechanisms under the covers, from Windows Installer to scripts and virtualised applications.

Where this gets even more interesting is in the next version of Intune, the cloud-based PC and device management tool. We saw a demonstration of a custom iOS app installed via self-service from Intune onto an iPhone. I presume this feature will also come to Software Center in SCCM, though it is not there yet as far as I am aware.

You can also see this demonstrated in the second MMS keynote here – it is the last demo in the Day 2 keynote.


Intune differs from System Center in that it is not based on Windows domains, though you can apply a limited set of policies. In some respects it is similar to the new self-service portal which Microsoft is bringing out for deploying Metro apps to Windows RT (Windows on ARM) devices, as described here.

This set me thinking. Which machines will be easier to manage in the enterprise, the Windows boxes with their group policy and patch management and complex application installs? Or the BYOD-style devices, including Windows RT, with their secure operating systems, isolated applications, and easy self-service app install and removal?

The latter seems to me the better approach. Of course most corporate apps do not work that way yet, though app virtualisation and desktop virtualisation help, but it seems to me that this is the right direction for corporate IT.

The implication is two-fold. One is that basing your client device strategy around iPads makes considerable sense. This, I imagine, is what Microsoft fears.

The other implication is that Windows RT (which includes Office) plus Metro apps is close to the perfect corporate client. Microsoft VP Steven Sinofsky no doubt gets this, which is why he is driving Metro in Windows 8 despite the fact that the Windows community largely wants an improved Windows 7, not the hybrid Metro and desktop OS that we have in Windows 8.

Windows 8 on x86 will be less suitable, because it perpetuates the security issues in Windows 7, and because users will tend to spend their time in familiar Windows desktop applications, which lack the security and isolation benefits of Metro apps and which will be hard to use on a tablet without a keyboard and mouse.

A little colour returns to Visual Studio 11 – but not much

Microsoft has responded to user feedback by re-introducing colour into the Visual Studio 11 IDE. The top request in the official feedback forum was for more colour in the toolbars and icons.


Now Microsoft’s Monty Hammontree, who is Director of User Experience, Microsoft Developer Tools Division – it is interesting that such a post exists – has blogged about the company’s response:

We’ve taken this feedback and based on what we heard have made a number of changes planned for Visual Studio 11 RC.

That said, developers expecting a return to the relatively colourful icons in Visual Studio 2010 will be disappointed. Hammontree posted the following side by side image:

[image: side by side icon comparison]

This shows Visual Studio 2010 first, then the beta, and then the forthcoming release candidate. Squint carefully and you can see a few new splashes of colour.


You can also see that the word Toolbox is no longer all upper case, another source of complaint.

Hammontree explains that colour has been added to selected icons in order to help distinguish between common actions, differentiate icons within the Solution Explorer, and to reintroduce IntelliSense cues.

Did Microsoft do enough? Some users have welcomed the changes:

You have to appreciate a company that listens to there [sic] users and actually makes changes based off feedback. You guys rock!

while others are doubtful:

with respect, I fear that the changes are token ones and that whoever’s big idea this monochromatic look is, is stubbornly refusing to let go of it in spite of the users overwhelming rejection of it.

or the wittier:

I’m glad you noticed all the feedback about the Beta, when people were upset that you chose the wrong shade of gray.

While the changes are indeed subtle, they are undoubtedly an improvement for those hankering for more colour.

Another issue is that by the time a product hits beta in the Microsoft product cycle, it is in most cases too late to make really major changes. The contentious Metro UI in Windows 8 will be another interesting example.

That said, there are more important things in Visual Studio 11 than the colour scheme, despite the attention the issue has attracted.

A taste of the high end at a bargain price: Behringer B3031A active loudspeakers

I have been taking an interest in active loudspeakers after sampling AVI’s ADM 9.1 speakers, which deliver clear, uncoloured sound in a convenient package with built-in DAC and remote volume control, though they lack bass extension and really need a subwoofer to perform at their best.

The ADM 9.1s are good value considering that you get a complete just-add-source package, but still not exactly a casual purchase at £1250 (May 2012). What about the active monitors at the low end of the music studio market: can they deliver some of the active magic at a lower price?

A quick hunt led me to these Behringer B3031A active monitors, which offer a remarkable specification for the price – around £300 at the time of writing. A ribbon tweeter, 150W of amplification in each speaker, 50Hz to 24kHz frequency response: what can go wrong? I could not resist getting a pair for review, especially as there are surprisingly few reports on these speakers out on the internet, considering that they have been available since 2009.

Note that I am reviewing these as hi-fi speakers even though they are designed for studio use.

[image: the pair of B3031As]

Why so few reviews? It may be because Behringer has a mixed reputation in the pro audio community. The products are popular and good value, but the company is accused of lack of originality in design and poor quality in manufacture. Since the prices undercut most competition there could be some industry in-fighting going on. Behringer undoubtedly aims at the low end of the studio and hobbyist market, and manufactures in large Chinese factories, but I doubt their quality is all that bad given that their largest reseller Thomann offers a 3-year warranty. Still, a cautionary note there.

“They’re heavy”, said the delivery man, and I unpacked the monitors to find a pair of very solid, weighty loudspeakers (15kg each according to the spec). The cabinet is MDF, though the front baffle is some kind of plastic with a metal plate into which the drivers are set. There are two slim vertical ports. There are no grilles, and these will not win prizes for appearance, though they are not too bad. This is about the sound though; look elsewhere if you are after hi-fi as furniture.

Wiring up

The B3031As have two balanced inputs, with XLR or 1/4” jacks. Most hi-fi cables use unbalanced RCA phonos; however you can easily get RCA to jack plug cables from a music equipment store or online. Using a balanced connection is better, if your pre-amplifier offers that option, but I used an unbalanced mono 1/4” jack for each input without any issues. One interesting and cost-effective choice is the new Cambridge Audio DacMagic Plus, around £350 from Richer Sounds in the UK or $600 in the USA, which has balanced outputs and includes a pre-amplifier, though I have not tried this combination.

I tried the B3031As in two configurations. The first was with a Beresford Caiman DAC, which also has a built-in pre-amplifier. The second was with a Naim 32.5 pre-amplifier. Neither of these has balanced outputs. My source is a Logitech Squeezebox Touch. Note that this also has a volume control and built-in DAC, so for the most cost-effective system you could go straight from the Touch to the speakers, though I have not tried that as yet.

The main point is that you must have some sort of pre-amplifier output with a way of adjusting the volume, since the B3031As do not really have a volume control. There is an input level trim control which in effect is the same thing, but this is only designed for setting a convenient level during setup, not for constant use.

In order to use the Caiman I have to set the input trim near its maximum to get a full range of volume from the speakers. The Naim 32.5 has a more powerful output and I can set the input trim to 0dB with very satisfactory results.

Although the sound was good direct from the Caiman, I got better results from the Naim, though obviously this adds greatly to the cost. A full pre-amplifier is also more convenient since you have additional inputs available.

Controls

The back panel of the B3031A has several controls. The on-off switch is conveniently sited on the top. The inputs are slightly less conveniently on the underside, though this does mean that the cables hang vertically which is tidy.

[image: the back panel]

Then there are several additional controls:

Input Trim: Controls input gain from –6dB to +6dB, as mentioned above.

Low Frequency: Cuts the response at 60Hz and below by between 0 and –6dB. The purpose is to integrate smoothly with a subwoofer or, if monitoring, to simulate a small speaker system.

Room Compensation: Cuts the response at 300Hz and below by between 0 and –6dB. The purpose is to reduce excessive bass if the speakers are sited against a wall or in a corner.

High Frequency: Adjusts the response around 8kHz from +2dB to –4dB. The purpose is to tailor the high frequencies to allow for room effects.

Power mode: You can set the power to On, Auto, or Off. This one mystifies me. You do not need Off since you can more easily press the Power switch on the top. The Auto mode is meant to put the speakers into standby when not in use, but in my tests it was a disaster. The speakers would turn off during quiet passages. Admittedly that was with the rather low output from the Caiman DAC; but I suggest NOT using this option.

Mute Low and Mute High: mute the low or high drivers respectively, apparently “for service use”.

Frequency response

Each speaker comes with an individual calibration certificate, which is a nice touch especially at this price point.

[image: individual calibration chart]

I presume the measurement is done in an anechoic chamber; the frequency response in a normal room will be less even. One point interests me though. The certificate shows that the bass response does not begin to drop noticeably until 40Hz, yet the published specification is 50Hz–24kHz. That accords with my listening tests, in that the bass is well extended, and unlike AVI’s ADM 9.1, these speakers work fine without a subwoofer.


Electronics

The amplifier packs are easy to unscrew from the back panel so I took a look, though I do not recommend this as it may invalidate your warranty. Also note that amplifiers can give you an electric shock even after they have been unplugged, thanks to the charge held by capacitors.

[image: inside the amplifier pack]

Note the beefy toroidal transformer.

Listening tests

So how do they sound? In a word, excellent. They display the characteristics you would expect from an active system: exceptional clarity, a somewhat lean sound due to absence of boom, neutral tone, and an honest reproduction of the source which occasionally counts against your enjoyment if it is slightly distorted (play Peaceful Easy Feeling by the Eagles. Hear the distortion? Good, you have an accurate system).

I positioned the speakers on stands well into the room and only a few feet apart. These are more suitable for hi-fi than some monitors because the ribbon tweeters have wide dispersion, which means the sweet spot of good listening positions is larger.

When I first switched on, I thought the bass was a little light. Then I played Stravinsky’s Firebird in the great performance by the Detroit Symphony Orchestra conducted by Antal Dorati. The drum sounded with dramatic effect; it is obvious that these speakers have no problem with bass.

I played Roads by Portishead, a demanding track that begins with a pulsing low-frequency tone that can easily cause speakers to buzz or the sound to break up. The B3031A coped with this as well as I have heard; then Beth Gibbons’ vocals come in with startling clarity, a stunning contrast.

The B3031As coped with ease with Sade’s By Your Side, on which the strong bass can easily overwhelm and distort. You can hear the silky vocals, the pumping percussion, the fingers sliding on the guitar, the ticking cymbals, the swirling organ.

Ashkenazy playing Chopin sounds dynamic and natural. There is no boominess in the lower end nor breakup in the loud passages.

Is there anything these speakers do not do well? A few observations. If you like to rock out to heavy metal, I am not sure that this type of speaker is the best choice, though the B3031A is better than some in this regard. They are just a bit too polite; and maybe a floorstander, with the chest-shaking bass that only it can deliver, is a better choice.

Although the sound is generally excellent, these speakers do not quite have the refinement and limpidity I have heard from active ATCs costing many times more, for example.

Be reasonable though. You can get a pair of these delivered for around £300. What else would sound as good for the money?

Conclusion

My immediate conclusion is that these are a fantastic hi-fi bargain. If you can live with the looks and the Behringer name, you are getting a real taste of the high end for what most audio enthusiasts would regard as a low-budget price.

Admittedly the setup is a little more complex than some, since you need a pre-amplifier of some kind, though there are now DACs around at a reasonable price which have this included.

Specifications

Inputs: Balanced XLR or 1/4” jack.

Input trim: –6dB to +6dB

Tweeter: 2” ribbon

Woofer: 8 3/4” Kevlar

Woofer amplification: 100W RMS, 150W peak at 4 ohms, 0.1% THD

Tweeter amplification: 30W RMS, 75W peak at 6 ohms, 0.1% THD

Crossover frequency: 3.6kHz

Frequency response: Quoted as 50Hz to 24kHz, with no tolerance given

Max SPL: 113dB at 1m per pair

Power consumption: 200W maximum

Dimensions: 400 x 250 x 290mm

Weight: 15kg

Buying the B3031A

If you buy a pair of these, pay special attention to whether the price is for a single speaker or a pair. In the pro music market, monitors are often sold individually, which means that great price must be doubled if you are after a stereo pair. That said, the B3031A is often, but not always, sold in pairs, which usually works out better value. Check the small print carefully!

  

Great sounding recordings

There was a discussion on a music forum about the best-sounding recordings out there.

I am always amused by these discussions because I see stuff picked that is great music (at least to those who pick it) but cannot honestly be described as great-sounding in a technical sense.

Of course the two are hard to separate; and maybe there are albums that sound deliberately “bad” as part of the artistic statement.

Conversely, if the music does not interest you, it is hard to appreciate the sonics.

Here were my picks though: six albums that I know will always sound good.

Kind of Blue by Miles Davis – great presence and realism, interesting bass lines to follow.

 
Carpenters by The Carpenters. Probably influenced by the great voice, but I find this a really natural sounding recording. You can get the CD for pennies at any supermarket here in the UK.

New Blood by Peter Gabriel. A modern recording, just very nicely done. Probably helped by the natural acoustic sound of the orchestra.

The Freewheelin’ Bob Dylan. I like this for its simplicity and realism. If you want a recording where you can close your eyes and imagine a man there singing, this is excellent.

Electric Cafe by Kraftwerk. Great sounding electronica.

Stravinsky: Le Sacre du printemps/L’Oiseau de feu; Detroit Symphony Orchestra, Antal Dorati (Decca). No idea how this ranks in a list of fine-sounding classical recordings, but I like it; it beautifully conveys the drama of the music.

Always interested in hearing about other people’s favourites, from a sonic point of view.