Random Thoughts..
Wednesday, March 05, 2003
 
By Robert Lemos

CNET News.com
March 4, 2003, 5:20 AM PT

A viral one-two punch--the Code Red and Nimda worms--convinced Microsoft in mid-2001 that security needed to become its top priority. That decision led directly to the creation of the company's Trustworthy Computing initiative.
Company Chairman Bill Gates laid the groundwork for the program with an ambitious memo in January 2002 to employees, challenging them to improve the privacy and security of Microsoft software. The company subsequently halted much of its product development while about 8,500 developers were trained in secure programming and then reviewed the majority of the Windows code for security errors. Microsoft says the entire effort cost some $100 million.

But hackers continue to find holes in Microsoft's defense. In January, the Slammer worm hit. This time, not only did customers get infected; Microsoft did, too.


Mike Nash, vice president of the security business unit at Microsoft, is the executive responsible for the security component of the Trustworthy Computing push. CNET News.com recently spoke with Nash about the effect of the Slammer worm on the Trustworthy Computing initiative and where Microsoft expects to take its security program in its second year.

Q: Now that Trustworthy Computing is in its second year, where does it go from here?
A: The first year of Trustworthy Computing was about dealing with issues that could correct for and mitigate the primary areas where customers feel pain. At the same time, (it was about) making investments in core infrastructure that would both help mitigate customer pain and be the right investment in the long term. In the second year of Trustworthy Computing, I see it very much as a deepening and broadening of the same set of issues.

Does the Slammer incident invalidate what you did?

At some level, you could argue that Slammer illustrates that. The work we've been doing around patch management and our focus on things around Windows needs to extend out beyond Windows so we have the same capabilities for products like SQL Server.

At the same time, there is a little bit of irony here in the sense that one of the things that the SQL Server folks did...was to not only focus on the Trustworthy Computing process for new products but also focus on going back and applying that process to existing products. So Exchange Server and SQL Server both went back and did security pushes against their existing versions of the product. Ironically, in the case of SQL Server, that updated version--Service Pack 3 for SQL Server--shipped the Monday before the Friday that Slammer was launched.

Then what is the lesson taught by Slammer?
Remember, there are two things that we are talking about: The security patch that we shipped last summer and the service pack that shipped four days before Slammer. I would not have expected people to have applied the service pack, because it was four days before. The evaluation period has to be longer than that. The key lesson of Slammer--maybe it's a re-lesson of Slammer--is our work is not done when the patch is available. Our work is done when the patch is installed on the majority of customers' systems.

So what needs to happen in order for patches to get installed? Do you need to assure customers somehow that the patch won't break their system? Or do you somehow force them to upgrade sooner?

I don't think we force companies to do anything. The thing we need to do is make it easier for customers to install those patches, both when they do it sitting at a system and with automatic techniques like we have for Windows Update and Software Update Service.

We also have to make sure that the critical things around patching which customers are worried about (get taken care of). Does it maintain compatibility with their existing applications? Is it easy to install? Does it have the ability to roll the patch back if you install it and decide you don't like it? A lot of the infrastructure we have built for Windows can now be extended out to things like SQL Server.

Focusing on Windows first was the right thing to do. I think the key thing here is to make it easier to install things like a security patch, so the customer has less work to do. And I want to be really clear: It's our issue. It is our job to make sure that friction goes down.

With Microsoft having problems of its own internally from the Slammer worm, do you think people have started to question Trustworthy Computing?
The most important part of the Trustworthy Computing conversation is that Trustworthy Computing is a journey. We didn't mean to set the expectation a year ago--nor do I think we did set the expectation a year ago--that you could simply flip a switch and Trustworthy Computing would be turned on for everything. I think that we made progress in some areas. We learned some things, and we know we have more work to do this year...Very clearly, patches have to be easier to install. That's an essential part of that.

How do you solve this patching problem? Is it a relationship issue that Microsoft needs to solve or a security issue that your customers need to solve?

There are two, maybe three things that Microsoft needs to do. We need to work with companies to help them build security plans, and patching is an important part of that. We have guidance that we need to do a better job of sharing with our customers. Two, we have to do a better job of creating tools that make it easier to install patches. And the third, which may sound simple but I think it's important, is increasing the overall awareness about the importance of installing patches. People are more aware now than they were a year ago, but clearly we have more work that we at Microsoft need to do to get more people to understand the importance of installing patches.

It's our issue, very clearly.

There are a lot of people out there who understand a lot less about software than Microsoft. And I think the feeling among some of them is a sort of fatalism that, if Microsoft can't protect themselves, then no one can.

The key thing here is that systems protected by any one of the five measures customers could have used against Slammer were not affected by it. I think on some level that provides good evidence that proper patching can be effective.

You have to realize that Microsoft is a special environment where the barrier to installing a piece of Microsoft software is exactly zero. Any employee of the company can basically install any product they want on their system for testing and development purposes. There are customers that have this ability as well. Microsoft just tends to do it more because of the culture and environment of the company.
What we realized here is that in the past, not having a patch installed affected your machine; therefore if it was a test machine, the importance of patching was relatively low. What Slammer taught us is that even if the value of the machine is low, the need to patch could be high because of its ability to impact systems beyond the machine that was or wasn't patched. That was important learning for us internally.

So it's analogous to home users being relatively unimportant systems out there to the Internet. But if they become a zombie for an attack then they actually become more important?

I don't think it's similar to that for the following reasons. To the home user, the continuous operation of their system is valuable. Therefore, the need to protect that home user is essential. That home user is as much a participant in Trustworthy Computing as an enterprise. If that person at home is going to start trusting software and do online trading and that kind of work, the software running on that home user's machine must be trustworthy.
I have a bunch of test machines under my desk. I install the product-of-the-week from Microsoft just to mess around with it and see what it's like. The value of that system underneath my desk is pretty low. I don't care if someone "fdisks" the machine and deletes all the data on it. I do that twice a week. (Editors' note: Fdisk is the command-line program used to partition hard drives; "fdisking" a machine colloquially means wiping it.) While the value of the trustworthiness of that machine on its own is pretty low, its value on the network at Microsoft is relatively high. Even though that machine is not valuable, its ability to hurt other parts of the company was relatively high. Therefore, it needed to be more trustworthy than we thought.

So how do you solve that? Do you put them on a closed network that isn't accessible to the Internet?

One is we increase the awareness of the need of people inside the company to follow the same SD3 law. (Editors' note: SD3 stands for Secure by Design, Secure by Default, Secure in Deployment--three tenets of creating secure products under Trustworthy Computing.) We make sure that people understand the need to ensure that products are patched properly--even if they are only test machines. It means we have to make it easier for people to install the patched version on their desktops.
Second, we need to make sure that some of the things that were turned on in our default configurations on our corporate network are not on by default. The machine running under my desk probably had port 1434 open. (Editors' note: UDP port 1434 is the network port used by the SQL Server Resolution Service, which Slammer exploited to spread.) Third, while we did block port 1434 at the edge of our network, blocking 1434 between buildings on our network is something we did immediately after Slammer--and should have done before.

In the coming year, there will definitely be more attacks. What do you think companies should be doing and how can Microsoft help them do it?

The most important thing is for Microsoft to do the five things that customers have told us to do. Customers told us to address issues in our products before they ship. They want us to continue to review products after they have shipped. Customers tell us that we need to do a better job of responding to externally identified issues with patches that are both high quality and available quickly...Microsoft must make it easier to deploy patches. And...Microsoft must provide other tools in the form of both software and guidance to help customers be secure.

The big lesson around Slammer, in terms of what we have done and what customers should do, is really about having a security strategy for your organization, which should include a patch management strategy.

Sounds as if you feel the need to get more out in front than be reactive.
We have been lucky that issues like Slammer--(which use) exploits against known vulnerabilities--(were found by) others outside of Microsoft...and we had a chance to fix them. We realize that we need to be more proactive to make sure we understand vulnerabilities for things that have not been found externally (and) to make sure we don't have someone who is finding a new vulnerability and then exploiting it. The best way to respond is not to be just reactive but to be proactive.

Who's Funding Free Software?
by chromatic
Feb. 21, 2003
Where would you be if Free Software went away tomorrow? Personally, I'd be sunk -- my daily toolkit includes bash, vim, ssh, Mozilla, AbiWord, OpenOffice.org, and Perl. That doesn't mention the countless underlying tools and libraries that come into play. Without Perl, this publishing system would be gone. Without bind or sendmail or Apache, the Internet would barely exist. Without Linux or BSD (and I apologize for apparently conflating the BSDs with "Free Software", but I'm on a rhetorical roll here, so please bear with me), we'd have no inexpensive community websites, no roll-your-own weblogs. Without MySQL and PostgreSQL and the other free databases, we'd have flat files and....
Without languages like Perl and PHP and Python and Ruby, I'd have very little to write about. Without gcc, even C and C++ would be tricky. Would anyone outside Sun still care about Java if it weren't for the Blackdown guys, or Jakarta, or all of the tools they produce?
Without all of this wonderful software, we'd plunge back into the Dark Ages. (It's tempting to say that the barbarians who sacked Rome represent proprietary software, but that leaves the unenviable metaphor of saying that unchecked expansion left the hackers soft around the edges, and no one wants that!)
Think of the effect on your business now. Maybe you only have one Linux box in the corner serving files, or maybe you've taken the plunge and have a beefy Linux Terminal Server in a closet and run everything on thin clients. Maybe you don't have any free or open source software anywhere -- even if that's the case, you've likely benefitted from the cost pressures that high-quality redistributable software has applied to proprietary vendors. (If you'd previously used OpenView or CDE and now have an X desktop that doesn't hurt your eyes, you've benefitted doubly.) Where would your business be without this software?
Don't worry. It's not going away.
That's not to say things are peachy keen, though. I'm going to discuss the Perl community briefly, because that's the community I know best. These ideas apply much more broadly, though.
Things are tough. Lots of good, smart, hard-working people are unemployed or underemployed. (Writing a book is a big job -- and several good people are working on books, but it can be tough to pay the bills while writing full-time.) That includes some of the tip-top names in the Perl world.
Face it -- if money were no object, wouldn't you hire a Michael Schwern to write tests, a Damian Conway to give training, and a Larry Wall to do research?
If you or your company benefit from Free Software, here's a list of ways you can help ensure the future of Free Software. Focus less on the "free" part and more on the "enlightened self interest" part:
Donate money to the project of your choice.
Sponsor a hacker to add a feature or fix a bug.
Hire a hacker to train your employees.
Donate code.
Report a bug.
Answer a question on a mailing list.
Encourage your employees to contribute to a project of their choice.
Replace a proprietary tool with a free one.
List the free software you use on your website.
List the free software you use in a press release.
Host a user group meeting.
Donate 10% of the money you've saved by using free software to the projects that have saved you money.
Write a thank you note to a project that's saved you time.
There are a lot of companies who've done one or all of these things, and there are a lot more ideas waiting to be discovered. (I'm pretty proud to say that my employer scores very well, and that I've done a fair few of these things myself.) To everyone who's done at least one of these things, congratulations. You've helped ensure that Free Software keeps going.
To everyone else, if you've benefitted from Free Software, what are you waiting for?
chromatic is the Technical Editor of the O'Reilly Network.


Retiring Microsoft exec urges open-source embrace
By Joris Evers, IDG News Service
FEBRUARY 19, 2003

Microsoft Corp. must take an approach that favors and embraces the diversity of open-source software or face oblivion, David Stutz, a departing Microsoft executive, wrote in a farewell letter to the company.
Stutz, a respected technical thinker at Microsoft, sees networked software as the future for computing. Open-source software is already there, while Microsoft still has to move past its PC-centric roots, he wrote.
"If Microsoft is unable to innovate quickly enough, or to adapt to embrace network-based integration, the threat that it faces is the erosion of the economic value of software being caused by the open-source software movement," Stutz wrote in the letter that he posted on his Web site.
"Useful software written above the level of the single device will command high margins for a long time to come. Stop looking over your shoulder and invent something!" he wrote to Microsoft. "If the PC is all that the future holds, then growth prospects are bleak."
Stutz left Microsoft earlier this month. He held several key positions at the software vendor, including chief architect for Visual Basic and, most recently, group program manager for Microsoft's Shared Source program, the company's answer to open source.

Microsoft said it's not uncommon for recently retired employees to write an open letter. They offer "great fodder" for internal discussions, the company said in a statement sent via e-mail.
"David Stutz has been an important contributor to Microsoft's open source thinking and Microsoft agrees with much of the vision Dave [Stutz] has for the future," the company said. However, Microsoft added that it believes that "breakthrough innovations will come mostly from commercial software companies such as Microsoft."
Stutz said he worries that efforts to recover from perceptions of the company as "politically inept" and a focus on being the lowest-cost commodity software producer will empower managers and accountants at Microsoft rather than visionaries.
Microsoft's "denial" when it comes to networked computing is understandable, because the company built its empire on the notion of the PC as the natural point for hardware and application integration. However, "network protocols have turned out to be a far better fit for this middleman role," according to Stutz.
"Microsoft still builds the world's best client software, but the biggest opportunity is no longer the client. It still commands the biggest margin, but networked software will eventually eclipse client-only software," Stutz wrote.
Microsoft products due out later this year, such as Windows Server 2003 and the successor to Office XP, will offer more networked features than the previous versions, the company has said.
The greatest threat to Microsoft is not the Linux operating system, but applications, Stutz said. As the quality of open-source software improves, there will no longer be a need for Microsoft Office's one-size-fits-all suite of applications, he wrote.
"Open-source software is as large and powerful a wave as the Internet was," he wrote. "Microsoft cannot prosper during the open-source wave as an island, with defenses built out of litigation and proprietary protocols."
Steven Milunovich, a vice president at New York-based Merrill Lynch & Co., agreed that Microsoft needs to innovate more. "Microsoft must notch up the innovation component to do well in new areas," he wrote in a report today.
Microsoft has publicly said in the past that open-source software is a threat to its business.

"The localisation of Linux to Indian languages can spark off a revolution that reaches down to the grassroots levels of the country," writes Prof. Venkatesh Hariharan. Read the rest of his informative essay:

Why Linux Makes Sense for India
Falling costs have made computers more affordable to a larger section of India's population. At the same time, the Internet has made the PC a compelling proposition for fulfilling communications, education, entertainment and information needs. Based on these two trends, the market for Information and Communication Technologies (ICT) is likely to take off significantly in India.

Yet, India faces a peculiar problem in that almost all popular operating systems and application packages are available only in English, a language spoken by a mere ten percent of the population. The lack of "Indianized" software is therefore an issue that seriously hampers the growth of the Indian computer industry. For almost 915 million Indians, the lack of Indian language interfaces is one among many issues that hamper their ability to reap the benefits of information technology. This is creating a new class of people who live in what can be called "Information Poverty," even as technology becomes cheaper and cheaper.

At the infrastructure level, the barriers to information access are dropping dramatically with new ISPs coming into India and several players jockeying to provide bandwidth and other back-end services. However, without operating systems, applications and Internet content in Indian languages, key benefits of the digital revolution--e-commerce, low-cost communication through e-mail, access to information databases, telemedicine services etc.--are denied to the Indian masses. Giving Internet access to an Indian who does not know a shred of English is like giving someone the keys to a car when there are no roads to drive on!

One development that can help India out of this deadlock is a national-level, collaborative effort to localise Linux to Indian languages.

Linux is a free operating system that has gained phenomenal popularity in recent times because it allows users to modify it to suit their own needs. Linux is a collaborative effort of thousands of programmers interacting over the Internet and is therefore not owned or controlled by any one company. In this article, we outline the economic and cultural imperatives for the localisation of Linux.

Free operating systems have several advantages for developing countries because most software packages today are developed in the West and then sold in developing countries, where the parameters of affordability are completely different. The Bangladeshi activist Shahidul Alam expresses these differences poetically when he says, "A modem costs more than a cow." The benefits of free software multiply exponentially when we look at large-scale implementations. The Government of Mexico is estimated to have saved close to $125 million that would otherwise have been spent on proprietary systems when it signed up Red Hat to implement Linux in more than 140,000 schools and colleges across Mexico. In India too, large operators like World-Tel (which plans to have a thousand Internet centres in Tamil Nadu, each having between two and 20 PCs) have expressed their intention to go the free software way. The company is negotiating similar deals with several other state governments. Organizations like World-Tel, Internet centres, schools, homes etc. can be expected to be significant users of Indian language operating systems.
The growth of content in platform-independent file formats (HTML, MP3 etc) has also reduced the dependence on a specific operating system, making Linux a viable option.

Apart from these, there are cultural reasons that make Linux attractive. The existing user interface paradigm of files and folders evolved because computers were essentially designed for a Western audience familiar with real-life files and folders. There is no reason to assume that the same paradigm should apply to a trader in Tamil Nadu or a farmer in Madhya Pradesh. The openness of Linux (and other free operating systems like FreeBSD) allows local linguistic groups to customise user interfaces in ways that are far more culturally sensitive than any centrally controlled approach. Linguistic groups that may be considered too small a market by vendors can also take their destiny into their own hands by customising the Linux interface to their own needs. It is therefore clear that Linux is a very attractive long-term solution to India's computing needs.

Localising the user interface of Linux to all the 18 official Indian languages will involve changing the menus and help-text to Indian languages and creating a whole stack of applications and tools (word processors, browsers, spell-checkers etc.) to enable computing in Indian languages.

This is a task that involves both technical and linguistic challenges. For example, should "File" simply be called "File" but written in Indian scripts, because it is now a part of popular usage? Or should we find Indian language equivalents? In some cases that makes little sense. For example, how many people know that the Hindi word for computer is "sanghanak"? Or what is the Hindi equivalent for "Internet"? A very sensitive balance has to be struck between practicality and preserving Indian languages. However, Indian linguistic groups will have to wake up to the fact that their languages will become outdated if they do not become a part of the digital age. In fact, the Internet can be one of the finest means of recording, archiving and propagating Indian culture. Since culture is embedded in language to a significant degree, the ability to compute in one's native language can give Indian culture a significant boost.
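On Linux, this kind of menu and help-text translation is usually handled through message catalogues such as GNU gettext, where any string missing a translation falls back to English. Here is a minimal sketch in Python; the in-memory catalogue and the Hindi renderings are illustrative assumptions, not taken from any real IndLinux catalogue:

```python
import gettext

# A message catalogue maps English source strings to translated ones.
# In a real localisation effort these live in per-language .po/.mo files;
# here we fake a tiny Hindi catalogue in memory for illustration.
hindi_catalogue = {
    "File": "फ़ाइल",   # kept close to popular usage, written in Devanagari
    "Open": "खोलें",
    "Save": "सहेजें",
}

class HindiTranslations(gettext.NullTranslations):
    def gettext(self, message):
        # Fall back to the English string when no translation exists --
        # exactly how gettext behaves with an incomplete catalogue.
        return hindi_catalogue.get(message, message)

_ = HindiTranslations().gettext

print(_("File"))      # translated menu label
print(_("Internet"))  # untranslated terms pass through unchanged
```

This fallback behaviour matters for the "File" versus "sanghanak" debate above: a catalogue can adopt transliterations for some terms, coin equivalents for others, and simply leave the rest in English until the linguistic community settles on a choice.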

However, one of the greatest roadblocks to computing in Indian languages has been the lack of widely accepted standards. If millions of people are able to freely e-mail each other, it is because of a widely accepted standard called ASCII (American Standard Code for Information Interchange). It is sad that in spite of claims that India is a software superpower, we cannot harness IT for the benefit of our own nation's citizens, and the greatest stumbling block is a lack of agreement on standards. Check out ten different Hindi newspapers on the Web to see for yourself. You'll end up downloading and installing ten different fonts that (in most cases) can be used for browsing that one site and nothing else. It is for this reason that Hindi, despite being one of the most widely spoken languages in the world, has a negligible presence on the Web. Informed sources feel that the Unicode standard (which Microsoft has adopted for the upcoming Windows 2000 operating system) will soon become the de facto standard, settling the language standards issue once and for all. If this prediction comes to pass, it will significantly increase the domestic market for hardware, software and services, which is currently restricted to the small fraction of India's population that understands English.
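The difference a shared standard makes can be seen concretely: under Unicode, a Devanagari word is a fixed sequence of code points from the Devanagari block (U+0900 to U+097F), and UTF-8 turns those code points into the same bytes on every platform, so text travels between systems without shipping a site-specific font encoding along with it. A small Python illustration (the sample word is arbitrary):

```python
# Under Unicode, Devanagari text is a fixed sequence of code points from
# the Devanagari block (U+0900-U+097F), independent of any particular font.
word = "हिंदी"  # "Hindi" written in Devanagari

# Inspect the code points; any Unicode-aware system sees the same sequence.
print([hex(ord(ch)) for ch in word])

# UTF-8 produces the same byte sequence on every platform, so the text can
# be exchanged without also shipping the font that produced it.
data = word.encode("utf-8")
assert data.decode("utf-8") == word  # lossless round trip
print(len(word), "code points,", len(data), "bytes in UTF-8")
```

Contrast this with the font-specific encodings described above, where the byte values only make sense with one particular font installed, and the same bytes render as gibberish everywhere else.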

There are several initiatives underway to make this possible. The National Centre for Software Technology has submitted a proposal to the Technology Development in Indian Languages programme of the Government of India. The Indian Institute of Technology, Madras has already started work on localizing Linux to Malayalam and Tamil. My own institute, the Indian Institute of Information Technology, Bangalore has committed resources to the "IndLinux" project and started a collaborative effort to realise this goal. IndLinux has attracted the interest of organizations like FreeOS.com and many individuals located around the world.
In conclusion, it has to be said that the Indianisation of Linux is probably one of the most practical ways of making information technology available to millions and millions of Indians. It is now up to linguistic and technical groups to collaborate and make things happen.

Prof. Venkatesh Hariharan is with the Indian Institute of Information Technology, Bangalore. He can be reached at venky@iiitb.ac.in.
