WinFS reborn: SQL Server as a file system

Fascinating interview by Jon Udell with Quentin Clark, who led the cancelled WinFS project at Microsoft.

Clark talks about how technology from WinFS is now emerging as the Entity Framework in ADO.NET (part of .NET 3.5 SP1) and the FileStream column type in SQL Server 2008 – a connection I’d already made at TechEd in Barcelona last year. He also mentions the new HierarchyID column type, which enables fast querying of paths – in effect, rows that contain other rows. He adds that a future version of SQL Server will expose the Win32 API so that it can act as a file system:

In the next release we anticipate putting those two things together, the filesystem piece and the hierarchical ID piece, into a supported namespace. So you’ll be able to type \\machinename\sharename, up pops an Explorer window, drag and drop a file into it, go back to the database, type SELECT *, and suddenly a record appears.
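To make the HierarchyID idea concrete, here is a minimal sketch of the kind of subtree query it enables, written in Python with pyodbc. The Documents table, its hierarchyid path column and the connection details are my own hypothetical examples, not anything described in the interview:

    import pyodbc

    # Hypothetical database and table, for illustration only.
    conn = pyodbc.connect(
        "DRIVER={SQL Server};SERVER=machinename;"
        "DATABASE=FileStore;Trusted_Connection=yes"
    )

    # A hierarchyid value encodes a path such as /1/3/2/. IsDescendantOf()
    # fetches a whole subtree (a "folder" and everything beneath it) in a
    # single indexed query, which is what makes path lookups fast.
    sql = """
    SELECT name, path.ToString() AS path
    FROM   Documents
    WHERE  path.IsDescendantOf(hierarchyid::Parse('/1/')) = 1
    ORDER  BY path
    """

    for name, path in conn.execute(sql):
        print(path, name)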

Put that together with the work Microsoft is doing on synchronization, and you get offline capability too – something more robust than offline files in Vista. Clark says SharePoint will also benefit from SQL Server’s file system features.

Note that Live Mesh does some of this too. I guess SQL Server is in the Live Mesh back end, but it strikes me that Microsoft is at risk of developing too many ways to do the same thing.

The piece of WinFS that shows no sign of returning is the shared data platform, which was meant to enable applications to share data:

… all that stuff is gone. The schemas, and a layer that we internally referred to as base, which was about the enforcement of the schemas, all that stuff we’ve put on the shelf. Because we didn’t need it.

Cenzic web app report highlights security problems

Will we ever get a secure Internet? There’s no cause for optimism in the latest Cenzic report into web app security. A few highlights:

  • 7 out of 10 Web applications analyzed by Cenzic were found vulnerable to Cross-Site Scripting attacks
  • 70% of Internet vulnerabilities are in web applications
  • Firefox has the most reported browser vulnerabilities, at 40%; Internet Explorer accounts for 23%
  • Weak session management, SQL Injection, and poor authentication remain very common problems
  • 33% of all reported vulnerabilities are caused by insecure PHP coding, compared to 1% caused by insecurities in PHP itself (the sketch below illustrates the distinction).
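To illustrate the distinction the last figure draws (insecure application code, rather than a flaw in the language itself), here is a classic injection bug of the sort that report is counting. The example uses Python and SQLite purely for illustration, since the report concerns PHP; the pattern, and the fix, are the same in any language:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice', 'secret')")

    def login_vulnerable(name, password):
        # Insecure application coding: user input is concatenated straight
        # into the SQL statement, so the input  ' OR '1'='1  bypasses the check.
        sql = f"SELECT * FROM users WHERE name = '{name}' AND password = '{password}'"
        return conn.execute(sql).fetchone() is not None

    def login_safe(name, password):
        # Same language, same library: a parameterised query closes the hole.
        sql = "SELECT * FROM users WHERE name = ? AND password = ?"
        return conn.execute(sql, (name, password)).fetchone() is not None

    print(login_vulnerable("alice", "' OR '1'='1"))  # True: injection succeeds
    print(login_safe("alice", "' OR '1'='1"))        # False: rejected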

OK, it’s another report from a security company with an interest in hyping the figures; but I found this one more plausible than some.

The PHP remarks are interesting; it would be good to see equivalent figures for ASP.NET and Java.

Misunderstanding Vista

Microsoft has posted a 9-page document on Five Misunderstood Features in Windows Vista. Apparently these “cause confusion and slow Windows Vista adoption for many folks.” Here they are:

  1. User Account Control
  2. Image Management
  3. Display Driver Model
  4. Windows Search
  5. 64 bit architecture

I thought I did understand User Account Control, but now I’m not so sure. I understand its long-term goal, which is to move Windows to the position enjoyed by Unix-like operating systems, where users run with limited rights. Getting there means fixing applications that needlessly require local administrator rights, but persuading third-party vendors to change their practice is hard. UAC therefore takes a multi-pronged approach: it makes it safer to run as local administrator; it makes it possible to run some applications that used to require admin rights without actually granting those rights; and it is sufficiently annoying that vendors will feel some pressure to fix their next release.

This statement gave me pause:

Enterprises should not run as default in Protected Admin mode, because there are really no benefits—only the pain of prompts. Instead, strive to move users to a Standard User profile.

The highlighting is mine. If there are no benefits, it seems odd that most Vista installations I see are set up in this way. I realise that in this context UAC is not a security boundary. Nevertheless, I figure there are some benefits, in that the user is running most of the time with standard user credentials. If there are no benefits … why does the feature exist?

I’m not sure Image Management is “widely misunderstood”; it mostly matters only to network administrators, whose business it is to understand it. The Windows Display Driver Model … again, not sure; I think it is desktop composition that is misunderstood: people dismiss it as eye candy, when in fact it “fundamentally changes the way applications display pixels on the screen”, as the referenced article explains.

Windows Search is an interesting one. I think it is misunderstood, but not in the way explained by this new paper. People have questions like, “why does it not index all my files?”

What about performance? In my view, this is far and away the primary problem users have with Vista. It is not in any sense a misunderstanding, however Microsoft spins it. It is bewilderment: why does my new machine, which should be fast, spend so much time spinning its little bagel when I want to get on with my work?

Here’s what this document says:

We’ve heard some of you say that Windows Vista runs slower than Windows XP on a given PC. So what’s really happening here? First, we need to avoid comparing apples to oranges – Windows Vista is doing a lot more than Windows XP, and it requires resources to conduct these tasks.

It goes on to say:

On machines configured with the appropriate specifications for their operating system, the speed of most operations and tasks between Windows Vista and Windows XP is virtually on parity. Which is pretty remarkable when you consider one key thing Windows Vista is doing that Windows XP isn’t: indexing for near instantaneous search results for desktop files, even embedded in email messages. The result is users can find information significantly faster (measured in minutes), increasing productivity far in excess of the loss in speed of operations (measured in milliseconds).

Microsoft is off-target here, despite the sleight of hand about “appropriate specifications”. First, search can be a big drain on performance; sorry, not just a few milliseconds. Second, Vista can be dramatically slower than XP, often thanks to poor configuration by OEMs. See Ed Bott’s discussion about fixing a Sony laptop.

There’s recently been discussion about Windows Server 2008, which performs very well, versus Vista, which tends to perform badly. It’s all to do with configuration and disabling unnecessary processes. This is the core of Vista’s problems, not a series of “misunderstandings”.

Update: the document is no longer online. Perhaps it will reappear with amendments?

Further postscript: The Guardian has posted the document here.

Painful Debian / Ubuntu SSL bug

A bug in the Debian-modified version of OpenSSL (also used by Ubuntu) means that cryptographic keys generated on Debian systems for the last couple of years may be insecure. Instead of being well randomized, they are easily guessable.
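Why “easily guessable”? As widely reported, the Debian patch removed almost all of the randomness fed into the OpenSSL pseudo-random number generator, leaving the process ID as the only varying input. The following back-of-envelope sketch uses the commonly cited numbers (which I have not verified independently) to show how small the resulting keyspace is:

    # Rough illustration of why keys from the broken library are brute-forceable.
    # Assumption (as widely reported): the only varying PRNG seed input was the
    # process ID, which on a default Linux kernel runs from 1 to 32767.
    POSSIBLE_PIDS = 32767

    # An attacker pre-generates one key per PID for each common key type and
    # length on each architecture, then simply matches public keys or fingerprints.
    key_variants = ["rsa-1024", "rsa-2048", "dsa-1024"]   # illustrative list
    candidates = POSSIBLE_PIDS * len(key_variants)

    print(f"Candidate keys to pre-compute per architecture: {candidates:,}")
    # Around a hundred thousand keys is trivial to generate and store, hence
    # the published blacklists of known-weak fingerprints, and the advice to
    # regenerate every key created on an affected system.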

More information about the vulnerability is here; how to fix it here.

How much does this matter? The full scope has not emerged yet, but as I understand it, it affects any key generated with the broken library. Buying a certificate from a third-party certificate authority does not in itself protect you: if you generated the certificate request, and therefore the private key, on a system with the broken OpenSSL library, you are still affected (thanks to Nico for the correction below). The certificate authorities themselves are unlikely to have been using the broken version.

This means that a large number of supposedly secure SSH connections or SSL connections to web sites and servers over the last couple of years were actually not very secure at all.

If nothing else, it shows how easy it is to be falsely reassured, to think you are secure when you are not.

It also shows the risks of modifying security code. The problem is not with OpenSSL, but with changes made by a Debian coder who thought he was fixing something when in fact he was breaking it.

This site runs on Debian and I’ve spent some time today checking it for vulnerability and regenerating keys.

Small Business Server 2008: less for more?

The announced prices for SBS 2008 are substantially higher than those for SBS 2003. Client Access Licenses (CALs) for Standard edition users are slightly cheaper than before, but the new CAL for Premium users is remarkably expensive: $189.00, on top of the cost of the client Windows OS itself. In the old scheme, a single SBS CAL covered both Standard and Premium users and cost $97.80.
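Here is a back-of-envelope comparison using the announced CAL prices (the 20-user company is my own illustrative example, and OEM and upgrade pricing are ignored):

    # CAL cost for a hypothetical 20-user business needing Premium features.
    users = 20
    old_cal = 97.80          # SBS 2003: one CAL price covered Standard and Premium
    new_premium_cal = 189.00 # SBS 2008: announced Premium CAL price

    print(f"SBS 2003 CALs for {users} users:         ${users * old_cal:,.2f}")
    print(f"SBS 2008 Premium CALs for {users} users: ${users * new_premium_cal:,.2f}")
    print(f"Increase: ${users * (new_premium_cal - old_cal):,.2f} "
          f"({new_premium_cal / old_cal - 1:.0%} per CAL)")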

How price sensitive is SBS? From what I see, the cost of installing and configuring SBS is usually more than the license cost, presuming a business gets a specialist to do this. In addition, the announced figures do not cover cheaper OEM editions. In other words, probably not very price sensitive.

This still strikes me as a surprising move. SBS 2008 has removed some features, including the ISA Server firewall. Further, SBS has more competition than before, both from Linux and from cloud-based offerings. Is this really the moment to hoist prices? Google will be pleased.

My high-risk blog reader

I posted yesterday about the report from PC Tools saying that Vista is more prone to malware than Windows 2000.

The company kindly sent me its press release on the subject and is promising more information. According to the release, the figures are based on a tool called ThreatFire, available in free and commercial editions, which by default reports threats discovered back to PC Tools for analysis and statistics. ThreatFire is a behavioural tool; that is, it does not rely on signatures of known malware, but detects suspicious behaviour.

I thought I should try this tool on my own machine. I probably count as a high-risk user, since I frequently browse the web and download and run software, sometimes unsigned software. Would ThreatFire find any malware?

It did not take long before ThreatFire raised an alert.

The application in question is my own custom blog reader, a simple .NET app which calls the common feed list API and renders blog posts in the WebBrowser control.

Looks like a false positive to me. Still, I poked around in the dialog. The risk level is supposedly high. The Technical Details link does not tell you any more about what the app did that was suspicious, but identifies the files I can choose to quarantine. The link that says “Learn more about this threat” does a Google search on the file name.

By the way, relying on a random web search to assess potential malware strikes me as poor practice. Here’s what the online help says:

Click the Learn more about this threat link to launch a quick web search on the threat.  In most cases the result of this search provides a clear indication of how to proceed.

Ever tried searching the web for the name of an executable or process? The bad guys and the scammers know we do this, and you will be offered all manner of “security” products, some of which are likely spyware or malware themselves. It is a foolish thing to encourage. Besides, how would a random web search provide “a clear indication of how to proceed”? It’s the wild web, no more, no less.

My blog reader is not very famous, so in this case Google found nothing. I’m puzzled that ThreatFire doesn’t tell you more about the supposedly malicious activity, like what data was sent and where, so that the user would have more chance of judging whether this is really a dangerous app.

I guess the “threat” is now in the PC Tools database, and my machine is counted as a Vista system with malware. I’ll be interested to see what else it finds.

Is Vista more prone to malware than Windows 2000?

So says the research department of PC Tools, apparently.

I was intrigued, having investigated Vista’s security myself, so I went along to the PC Tools site in search of more information. Unfortunately there is no relevant press release in the news section, nor any other detail. I did find an article on Ars Technica that asks the questions I wanted to ask, but finds no answers.

I also registered on the site as press in search of further information, and received my username and password back as a plain text email. Remarkable, for a security company.

I don’t mean to be cynical; I really am interested, but frankly stories like this are worthless without more information. I blogged three years ago about exaggerated claims made by a security company. These companies are unlikely to put out releases saying that we no longer need their products.

My question to these security folk: given that most PC users (that I see) have been scared into using their products, why have we not seen a corresponding reduction in malware infections? It is as if the industry is glad to brag about the failure of its products.

Visual Basic returning to Mac Office

Microsoft will restore VBA to Mac Office:

The Mac BU [Business Unit] announced it is bringing VBA-language support back to the next version of Office for Mac. Sharing information with customers as early as possible continues to be a priority for the Mac BU to allow customers to plan for their software needs. Although the Mac BU increased support in Office 2008 with alternate scripting tools such as Automator and AppleScript — and also worked with MacTech Magazine to create a reference guide, available at http://www.mactech.com/vba-transition-guide — the team recognizes that VBA-language support is important to a select group of customers who rely on sharing macros across platforms. The Mac BU is always working to meet customers’ needs and already is hard at work on the next version of Office for Mac.

There are a couple of ways to take an announcement like this. The positive: the company is listening. The negative: what was it thinking when it cut the feature in the first place?

By the time Mac Office vNext is out of the door, I imagine many potential VBA users will have found other solutions.

The other point of interest: while Microsoft’s Mac BU benefits from Apple’s growing strength, I doubt that is enough to compensate for the Windows sales that the same growth implies Microsoft is losing.

How Outlook 2007 deletes your messages without asking

A puzzled user asked me why his Outlook 2007 archive folders were empty. Investigation led me to this dialog, found at Tools – Options – Other – AutoArchive:

The dialog shown is from my own Outlook; it is set to move old items to an archive folder. Note that the option to move old items, rather than delete them, is selected by default.

However, I was puzzled by the option to Delete expired items (e-mail folders only). What does this mean? In particular, why does it refer to expired items when the rest of the dialog refers to old items? The word expired suggests some kind of non-validity, like an expired subscription, or password, or credit card.

Pressing F1 did not yield anything helpful; but this article explains:

Delete expired items (e-mail folders only)   This option is not selected by default. You can choose to have e-mail messages deleted when their aging period has expired. The default period for your Draft and Inbox items is six months, and three months for your Sent Items, but you can change these periods using the Clean out items older than option.

As I understand it, this means that items are deleted after as little as three months if the option is checked; and expired means exactly the same as old. But that’s OK; it isn’t checked by default.

Or is it? I have certainly never checked that option, and nor has my contact, yet it is checked on all my Outlook installations, and on his. Take a look: is your Outlook set up like this? I’d be interested to know.

The consequence is that old emails simply disappear. The only dialog the user sees is the one asking whether AutoArchive should run. Most people would not imagine that an archive process deletes items: archive means long-term storage. Words like prune or purge imply deletion; archive does not.

Now, I happen to think that archiving in Outlook is a mess anyway. If you have several machines on the go (which is one of the reasons for using Exchange and Outlook), then you usually end up with several archives, buried deep in hidden folders where nobody is likely to find them without help. It is easy to miss these archive files when migrating to a new machine.

Still, I hadn’t realised that Outlook actually deletes old emails without asking – that is, if I am right and this is (incorrectly) the default.

It may seem a small matter; but there are times when finding that old email, sent or received, is critically important. It is another reason why I am fed up with Outlook 2007: its amusingly obscure dialogs, its broken RSS support, and its disgracefully slow performance.

Update: Duncan Smart suggests below that “expired items” refers to emails that have an expiry date set in message options. I must say that makes more sense to me. On the other hand, it isn’t what the help document says, and it doesn’t explain why my contact had no messages in his archive folder until I changed the setting. I’ll try some experiments … [slightly later] … if I archive a folder with File – Archive, it does not delete old messages (good); on the other hand that dialog is different, because you specify the archive date, so it is not a perfect test.

I suspect it is not as bad as I first thought, that the help document is incorrect, and that some other factor must have messed up my contact’s archiving. I hope that is the case.

See also this official help document:

Choosing an option to have items deleted permanently deletes the items automatically when they expire. They are not archived. For instance, if you click Delete expired items (e-mail folders only), this option deletes all messages in all your e-mail folders, such as Inbox, Sent, or Drafts, when they reach the end of their aging periods. The messages are not archived.

So … either Outlook really is deleting messages without asking; or I’m not the only one confused.