StratVantage – 02/28/02

A Bad Year for Security Incidents

As I gear up to co-produce CyberCrime Fighter Forum 2002 on March 12th, I return my attention to the subject of security, or the lack thereof.

I was recently asked by an executive if there wasn’t a component of urban myth to all this recent emphasis on CyberCriminals (crackers, script kiddies, virus writers and the like). Were there really that many attacks on systems? Are viruses really the problem the anti-virus vendors make them out to be? Are security breaches really costing business the millions of dollars reported?

These are all good questions. Of course we shouldn’t take all the hype and hysteria on faith. There are many in the industry for whom crying wolf is self-serving. Nevertheless, there are many sources of somewhat objective information on security breaches.

One of the most difficult aspects of CyberCrime to pin down is the amount of actual damage. You’ll find estimates all over the map, from the FBI’s estimate that computer losses run up to $10 billion a year to Computer Economics’ estimate that the worldwide impact of malicious code was $13.2 billion in 2001. Computer Economics stated that the biggest losses were caused by SirCam ($1.15 billion), Code Red (all variants, $2.62 billion), and NIMDA ($635 million).

Estimates are fine, but published reports of actual losses are better. However, most corporations would rather be summoned before Congress than admit to a security problem. Of course, if they can use a security breach to justify bad fiscal performance, like CryptoLogic did, that’s another story. CryptoLogic, a Canadian maker of gambling software, reported a 10 percent drop in fourth-quarter revenue primarily due to a charge taken as the result of a security breach.

So where are these threats coming from? Most people point to CyberCriminals on the Internet, but they may be only a small part of the problem. The FBI and the Computer Security Institute performed a survey on CyberCrime and found that 81 percent of corporate respondents said the most likely source of attack was from inside the company. This confirms the conventional wisdom among security administrators that the biggest problem is your own employees or contractors. And according to an @stake Security research report entitled The Injustice of Insecure Software, 30 percent to 50 percent of the digital risks facing IT infrastructures are due to flaws in commercial and custom software. According to CERT®, security vulnerabilities more than doubled in the last year, from 1,090 holes reported in 2000 to 2,437 in 2001. Likewise, the number of reported incidents more than doubled, from 21,756 documented in 2000 to 52,658 in 2001.

This year is very likely to be worse, according to SecurityFocus co-founder and CEO Arthur Wong, who spoke recently at RSA Conference 2002. According to Wong, around 30 new software vulnerabilities were discovered each week in 2001, a slowdown from the trend of the late ’90s, when the number of new vulnerabilities doubled each year. He expects 2002 to bring a return to the old growth rate, and predicted that 50 new software security holes will be found each week in the coming year.

Michael Vatis, the former director of the National Infrastructure Protection Center (NIPC), agrees, saying, “The rate of growth of our vulnerabilities is exceeding the rate of improvements in security measures.” He’s most worried about CyberAttacks that could bring down ATMs, power grids, and public transportation systems.

If you’d like to get a near real-time picture of attacks worldwide, check out SecurityFocus’ ARIS Predictor. This service shows the actual number of incidents worldwide based on a sample of installations that contribute log information.

Against this rising tide of attack reports stands a contrary statistic: security breaches and hacking attacks have actually decreased since the September 11 terrorist attacks, according to the Federal Computer Incident Response Center (FedCIRC). FedCIRC shows just 15 incidents of intruder activity reported in December 2001, less than a third of the number recorded in December 2000.

Where are all these attacks coming from? It turns out Europe is a virus hotbed, according to a report from mi2g’s Intelligence Unit. The Continent accounts for 57 percent of the world’s malicious code writing activity, with 21 percent originating from Eastern Europe, including Russia. While conventional wisdom may tell us otherwise, North America accounts for only 17 percent of viruses developed, and the Far East only 13 percent. The most prolific virus writers, according to the report, are Zombie, author of the Executable Trash Virus Generator; Benny, of the 29A virus group and author of the .Net Donut virus; Black Baron, author of Smeg; David Smith, author of Melissa; and Chen Ing-Hau, author of CIH.

So the solution for businesses is to stay alert and stay patched. Make sure you’re always running the latest antivirus software and the latest patches on your operating systems and applications. However, Alan Paller, director of research at the SANS Institute, said, “There are certain attacks that nobody can block. . . . If your people aren’t absolutely, all the time on the latest patches, you’re going to get hit.”

So hey, hey, hey! Let’s be careful out there! If you’re in the Twin Cities on March 12, be sure to attend the CyberCrime Fighter Forum 2002 and learn more about how you can be safe.

Briefly Noted

  • Shameless Self-Promotion Dept.: Did I mention CyberCrime Fighter Forum 2002? Also, in conjunction with the new CTOMentor paper, Basic Home Networking Security, we’re running a survey on home networking policies and procedures. The first survey cycle closed yesterday, but you can get in on the second, which will run through March 11. CTOMentor is also offering a two-part white paper on peer-to-peer technology: Peer-to-Peer Computing and Business Networks: More Than Meets the Ear. Part 1, What is P2P?, is available for free on the CTOMentor Web site. Part 2, How Are Businesses Using P2P?, is available for $50.
    CTOMentor
  • International Reach: A note from a reader in Guam prompted me to check out the subscription list and see where in the world SNS is going. There are subscribers in Australia, Canada, Germany, Greece, Guam, India, Italy, Japan, and the UK. Besides noting the obvious country suffixes on some of the email addresses, I used a cool tool called VisualRoute to determine subscribers’ locations. Alert SNS Reader Bob Burkhart let me know about this program. You type in a URL or an email address, and it shows you all the network hops between your computer and the target. That’s not spectacular, but what is nice is that VisualRoute then looks up the DNS records on the final computer and pulls out any location information, which it reports to you (a rough Python sketch of this DNS trick appears at the end of these Briefly Noted items). The bad thing about this software, which is free for trial use, is that it doesn’t clean up after itself completely when you exit. On a Windows 2000 machine it left MsPMSPSv.exe (the Microsoft Digital Rights Manager) and wjview.exe (the Microsoft VM Command Line Interpreter) running after it exited. I recommend using your software firewall (What’s that? You don’t have a software firewall? Get one! And read the new CTOMentor paper on home network security) to grant only one-time Internet access to the various programs VisualRoute uses (including vrping1.exe and vrdns2.exe), just to be safe.
    VisualWare
  • Microsoft As Security Threat: I missed this item from the irreverent UK site, The Register, back in December. They pull no punches in describing Microsoft as a bigger threat to security than Osama Bin Laden. Read the article and see if you agree.
    The Register
  • MessageLabs Says Viruses on the Increase: MessageLabs, which sells a hosted antivirus service for email, reported that it detected one virus per 370 emails in 2001, compared with one in 700 in 2000 and one in 1,400 in 1999. The 2001 total of 1,628,750 infected emails that MessageLabs detected broke out this way:
    • More than 500,000 were infected with the SirCam.A virus
    • 258,242 with BadTrans.B
    • 152,102 with Magistr.A
    • 136,585 with Goner.A
    • 90,473 with Hybris.B.

    ISPreview
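As promised above, here’s roughly what VisualRoute’s final-hop location trick boils down to. The following is a minimal Python sketch, not VisualRoute’s actual code: it resolves a host, reverse-resolves the address, and scans the resulting name for the city or airport codes that server and router hostnames often embed. The hint table and example host are made up for illustration.

    import socket

    # Illustrative only; a real tool would ship a much larger code->city table.
    LOCATION_HINTS = {"msp": "Minneapolis", "lon": "London", "syd": "Sydney"}

    def locate(host: str) -> str:
        """Resolve a host, reverse-resolve its address, and look for location hints."""
        addr = socket.gethostbyname(host)            # forward DNS lookup
        try:
            name, _, _ = socket.gethostbyaddr(addr)  # reverse (PTR) lookup
        except socket.herror:
            return f"{host} -> {addr} (no reverse DNS record)"
        labels = name.lower().split(".")
        hits = [city for code, city in LOCATION_HINTS.items() if code in labels]
        guess = f" (possibly {hits[0]})" if hits else ""
        return f"{host} -> {addr} -> {name}{guess}"

    if __name__ == "__main__":
        print(locate("www.example.com"))

VisualRoute presumably applies similar tricks to every hop it displays, but the reverse lookup on the final machine is where the location report comes from.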

StratVantage – The News 02/20/02

Handhelds in Health Care

Wireless is one of those technology areas that always seems to be impending. Each of the last two years has been “the year of wireless,” according to industry boosters. Pundits and prophets breathlessly report each twist and turn in the story. Yet when wireless nirvana hasn’t arrived, detractors have declared wireless a technology in search of a problem.

Wireless is here; so’s the gear. Get used to it.

Want proof? Take a look at health care, particularly in the hospital setting. Now there’s an information management problem. You’ve got doctors roaming around from room to room, changing orders, taking notes, and making life and death decisions. You’ve got nurses and other medical professionals monitoring patients, administering treatments and medications, and sometimes trying to figure out what the doctor said. Hospitals run on information, and the reliable transmission of information.

It’s critical to make sure that all this information is accurate, timely, and always available. That means most hospitals are in the paper-shuffling business. Medical records departments are awash in it. For example, the RehabCare Group in St. Louis, an outsourced staffing firm with 2,000 therapists working at nearly 500 sites, estimates that its therapists were writing and faxing an average of 3,000 pages of information each week. Many hospitals and clinics spend lots of money on keying services to convert the paper to bits so that the information can be managed.

Some hospitals today are digitizing the information at the source: the doctors and nurses who care for patients. According to the Doctors Say E-Health Delivers study conducted last fall by the Boston Consulting Group and Harris Interactive, 89 percent of physicians use the Internet, 22 percent use electronic medical records to store and track information about their patients, and 11 percent are prescribing drugs electronically. The study further found that doctors were planning to adopt electronic information practices at a rapid rate.

Many of these forward-thinking doctors are going mobile. For example, about 30 doctors at the University of Minnesota have been testing a modular mobile Electronic Medical Records (mEMR™) software program designed by AllScripts Healthcare Solutions. The modular nature of the AllScripts solution allows doctors to start with one module, such as dictation, and progressively add others.

Using the AllScripts TouchWorks™ Dictate system, the Minnesota doctors record patient notes on wireless-enabled Compaq iPAQ Pocket PCs, creating an audio file that is sent wirelessly to medical transcribers through the hospital’s radio frequency network. The software supports dictation templates that can be customized to match hospital forms.

If a doctor strays outside the hospital’s radio network, the data is transmitted automatically as soon as he or she re-enters an area with a wireless transceiver. This is especially helpful since the University of Minnesota’s physicians work in more than 150 clinics around the state. Currently, 38 of the clinics are equipped with wireless equipment to capture data and transfer it to traditional land-based networks. The uploaded information is accessible to doctors and others through a Web site, using their handhelds or office computers.
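What’s described here is the classic store-and-forward pattern: work is queued on the device and uploaded opportunistically whenever coverage returns. Here is a minimal sketch of that pattern in Python; it illustrates the general idea only, not AllScripts’ actual code, and the server name, reachability probe, and upload function are all hypothetical stand-ins.

    import queue
    import socket
    import time

    # Dictation audio waiting to leave the handheld.
    outbox: "queue.Queue[bytes]" = queue.Queue()

    def network_available(host: str = "transcription.example.org", port: int = 443) -> bool:
        """Cheap reachability probe: can we open a connection to the upload server?"""
        try:
            with socket.create_connection((host, port), timeout=2):
                return True
        except OSError:
            return False

    def send_to_transcribers(audio: bytes) -> None:
        """Placeholder for the real upload (e.g., an HTTPS POST to the service)."""
        print(f"uploaded {len(audio)} bytes")

    def record(audio: bytes) -> None:
        """Called when the doctor finishes dictating; never blocks on the network."""
        outbox.put(audio)

    def sync_loop(poll_seconds: float = 5.0) -> None:
        """Background loop: drain the outbox whenever coverage returns."""
        while True:
            while not outbox.empty() and network_available():
                audio = outbox.get()
                try:
                    send_to_transcribers(audio)
                except OSError:
                    outbox.put(audio)  # keep the recording and retry on the next pass
                    break
            time.sleep(poll_seconds)

The point of such a design is that dictation never waits on connectivity: the doctor’s workflow is the same inside the hospital’s radio network or at a rural clinic, and the queue drains itself the moment a transceiver is in range.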

The physicians group plans to introduce the software’s other functions over time, said Todd Carlson, its Chief Operating Officer. After implementing medical transcription, the group will expand to electronic laboratory results, billing, scheduling, patient care and referring physician information.

Security of the data was a natural concern, and one that will become even more important once the HIPAA (Health Insurance Portability and Accountability Act of 1996) regulations come into effect. “We are really afraid of hackers because we’re on a college campus and we’re afraid students will attempt to hack into our wireless system,” said Carlson. “We did a hacking audit with Ernst & Young at an additional cost because we wanted the system to be safe and secure.”

RehabCare Group, meanwhile, is in the process of equipping as many as 1,500 of its workers with Palm Pilots, according to Senior Vice President and CIO Jeff Roggensack. The company developed a custom application that works with Palm handhelds. “It streamlines the data collection process for our therapists working in the field, and eliminates the faxes, data entry, delays and handwriting errors experienced with the paper-based system used previously,” said Roggensack. Currently, the workers synch their Palms with desktop PCs for transmission, although wireless access is planned for the future.

Today, 19 percent of physicians own personal digital assistants, and that number should exceed 40 percent by 2005, according to Fulcrum Analytics. A Forrester survey of 44 medical practice managers, conducted for a report titled Doctors Connect with Handhelds, found that physician practice managers are actually “overexuberant” about the potential of mobile computing devices. If their predictions turn out to be true, 86 percent of practices will be processing prescriptions on handheld computers by 2003, whereas only 11 percent of practices do so today. Forrester predicts the market for mobile physician software, devices, and management will grow from $21.4 million today to $1.6 billion in 2007.

According to Taking the Pulse v 2.0: Physicians and Emerging Information Technologies by Fulcrum Analytics and Deloitte Research, more than half of all physicians who responded to a survey hope to view lab results via their PDAs in the future. Of the 30 percent who report that they currently own a PDA, 84 percent maintain their personal schedules and 67 percent manage their professional scheduling through the device.

So the docs are on the leading edge, and they’re impatient for more wireless applications. They’re not the only ones. Wireless has applications in many industries, according to Summit Strategies analyst Jennifer DiMarzio. DiMarzio suggests considering mobile wireless technology if the location of your workers, or of their next assignment, changes frequently; if timely information improves productivity; or if billing would improve were employees able to record work the instant they finish it.
City Business

Briefly Noted

  • Shameless Self-Promotion Dept.: CTOMentor has published a new paper called Basic Home Networking Security that should be of interest to anyone who wants to access at-work networks from home. The paper covers, in plain language, the types of threats and secure home networking practices, and describes the basic home network security toolkit every home user should have.

    CTOMentor is also offering a two-part white paper on peer-to-peer technology: Peer-to-Peer Computing and Business Networks: More Than Meets the Ear. Part 1, What is P2P?, is available for free on the CTOMentor Web site. Part 2, How Are Businesses Using P2P?, is available for $50.
    CTOMentor

  • Kmart Supply Chain at Fault: Kmart CEO Chuck Conaway blamed many of Kmart’s problems on its supply chain. In September, Kmart wrote off $130 million for supply chain hardware and software and another $65 million for replacing two distribution centers. None of it helped the company avoid bankruptcy court. Nonetheless, Kmart plans to spend $1.7 billion or so on a project to improve the flow of goods to store shelves.
    Internet Week
  • Defacement Tracking Site Owner Steps Aside: The operator of a great resource for keeping tabs on Web site defacements (unauthorized changes to Web pages made by cybercriminals) is calling it quits. The Alldas.de Web site, which archives copies of defaced Web pages, announced that its founder would be retiring and that the site would move to a new domain. Stefan Wagner said that dealing with system administrators who blamed Alldas.de for their defaced sites, denial-of-service attacks launched against his site, and the lack of a social life made him hang up his spurs. The site will move to Alldas.org in early March and will be run by two staffers and volunteers.
    News.com

 

StratVantage – The News – 02/04/02

The Next Internet?

The Internet started out life as a way for major universities and government research centers to communicate and collaborate. Imagine, then, the mixed feelings with which researchers have viewed the Internet’s tremendous explosion since it was commercialized in 1994, after 25 years of relatively slow growth.

On the one hand, there is now more information available on the Internet than anyone thought possible back when the first four Internet nodes went live (UCLA, Stanford Research Institute (SRI), University of California Santa Barbara (UCSB), University of Utah).

On the other hand, the commercial Internet is so busy and growing so rapidly, it’s hard for researchers to get the bandwidth they need for really large projects.

Today’s Internet doesn’t:

  • Provide reliable end-to-end performance
  • Encourage cooperation on new capabilities
  • Allow testing of new technologies
  • Support development of revolutionary applications

The Internet was not designed for the congestion caused by millions of users. It wasn’t designed for multimedia, or even for real-time interaction. Yet these are the characteristics of today’s Internet.

Faced with these limitations to innovation, Internet2 was formed in 1996 as a consortium led by universities working in partnership with industry and government to develop and deploy advanced network applications and technologies.

The consortium’s mission is to accelerate the creation of tomorrow’s Internet. Each university pays $500,000 to $1 million or more a year to gain access to Internet2 and upgrade its campus network.

At a recent seminar sponsored by the University of Minnesota Management Information Systems Research Center, Myron Lowe of the University of Minnesota’s Office of Information Technology described the Internet2 effort and some of the applications being developed on it. According to Lowe, there are now 187 member universities, 70 member corporations, and 28 GigaPoPs (high-speed access points) on Internet2. The backbone network, dubbed Abilene, runs at 2.5 Gbps and covers more than 10,000 miles coast to coast.

One of the important uses of Internet2 is videoconferencing, with several major telepresence initiatives taking advantage of the new network’s bandwidth. One such effort is the Access Grid, which has 81 nodes. The Access Grid supports large-scale distributed meetings, collaborative work sessions, seminars, lectures, tutorials and training and focuses on group-to-group communication rather than individual communication.

Another effort is the Virtual Rooms Videoconferencing System (VRVS), a project of Caltech and the CERN lab in Switzerland (which gave us the World Wide Web). VRVS provides a worldwide videoconferencing service and collaborative environment to the research and education communities over Internet2. The system includes more than 6,150 registered hosts in more than 50 countries and hosts an average of 190 multipoint videoconference and collaborative sessions worldwide each month.

A related Internet2 application is tele-immersion, being developed by the National Tele-immersion Initiative (NTII). This effort, led by VR pioneer Jaron Lanier, aims to enable users at geographically distributed sites to collaborate in real time in a shared, simulated environment as if they were in the same physical room.

Rather than transmitting live images of participants, the technology creates a new environment for them to interact in. In a tele-immersive environment, computers recognize the presence and movements of individuals and objects, track those individuals and images, and then project them into realistic, immersive environments. This allows participants to interact with nonexistent objects, like simulations or models.

Other applications include tele-operation of an electron microscope, real-time 3D brain mapping, interactive courseware by North Dakota State’s WWW Instructional Committee, and the Visible Human, a three-dimensional, computer-generated cybernetic body that can be viewed from any angle, dissected and reassembled by anatomy students, or used as a model to study the growth of cancer cells. Astronomers can also control the famous telescopes atop Mauna Kea in Hawaii from their desktops.

What’s in store for Internet2? First of all, more bandwidth. Lowe said the Abilene backbone will be upgraded to 10 gigabits per second and will employ new multiple-wavelength capabilities by next year. Also in store is competition: European researchers were recently given access to GÉANT, a gigabit research network serving more than 3,000 European academic and research institutions that will eventually operate in 32 countries.

But will the masses ever be let onto Internet2, inevitably forcing the researchers to build Internet3? Not necessarily. Internet2 is based on the same high-speed fiber-optic circuits available to anybody; it’s the technology that runs over them that’s important. Thus it’s likely that, rather than giving Internet users physical access to Internet2, the consortium will migrate the new technologies developed on Internet2 onto the existing Internet. Among these technologies is Internet Protocol version 6 (IPv6), which will be the subject of a future SNS.

For now, Lowe said, one of the true pleasures of Internet2 is the absence of pop-up ads. I doubt that researchers would ever give that up.
New York Times (registration required)

Briefly Noted

  • Shameless Self-Promotion Dept.: StratVantage has launched a new service, CTOMentor™, designed to allow Chief Technology Officers and other technical leaders to get rid of the Guilt Stack, that pile of magazines you’ll get around to reading someday. CTOMentor is a subscription advisory service tailored to customers’ industry and personal information needs. Four times a year CTOMentor provides a four-hour briefing for subscribers and their staffs on the most important emerging technology trends that could affect their businesses. As part of the service, subscribers also get a weekly email newsletter, Just the Right Stuff™, containing links to the Top 10 Must Read articles needed to stay current. These and other CTOMentor services will let you reclaim Your Inbox™.

    As part of its launch, CTOMentor is offering a two-part white paper on peer-to-peer technology: Peer-to-Peer Computing and Business Networks: More Than Meets the Ear. Part 1, What is P2P?, is available for free on the CTOMentor Web site. Part 2, How Are Businesses Using P2P?, is available for $50.
    CTOMentor

  • SSPs Bite the Dust: IDC recently published an analysis of the Storage Service Provider marketplace, which has suffered a rash of company failures. Does this mean the “storage on demand” idea of locating your storage on someone else’s machines at the other end of a network wasn’t viable? As a standalone business model, it may not be. But IDC states that “the original SSP model now is being adopted by the likes of IBM Global Services, EDS, BellSouth, Qwest, AT&T, and other outsourcing and telecommunications firms.” By combining SSP services with a larger package of services such as Web hosting and telephone services, these firms stand a good chance of wringing some profit out of the concept, according to IDC. IDC has historically been quite bullish on the SSP idea, but is now revising downward its original February 2001 forecast of $10 billion worldwide through 2005. Nonetheless, IDC projects a compound annual growth rate (CAGR) of nearly 50 percent from 2001 through 2006 for the total managed storage services market, which includes storage managed at a customer’s own site.

    The company doesn’t say why we should believe this estimate, when the one from a year ago was so inaccurate.
    IDC

  • European 3G Operator Relief: Much has been made of the lead European countries enjoy in the penetration and use of wireless technology. But European wireless operators have faced a crisis of their own making. When various European governments held auctions of wireless spectrum for the development of 3G (Third Generation) wireless services, operators bid the prices into the stratosphere. This saddled many operators with billions of euros of debt or investment, and as a result has hampered their ability to actually develop 3G services (which include high-speed data access, location-based services and, eventually, wireless video). To date, only one small trial 3G network, on the UK’s Isle of Man, has gone operational, serving only 200 devices. So the second-most advanced wireless region (after Japan) is virtually dead in the water. Recently, European regulators have eased license fees and taxes to try to get the industry moving again. France has drastically reduced its license fees and changed future payments to a performance-based royalty scheme. Spain has cut its radio spectrum tax, and the Italian government is considering extending the country’s five 3G licenses from 15 to 20 years.

    Japanese wireless innovator DoCoMo is experiencing its own problems. The company’s sales target is 150,000 3G users by March, but it sold only 11,000 3G handsets in October, when its 3G service, FOMA, went live with regional coverage. DoCoMo launched a popular trial video service, imotion, in November that will run through March. But just a week into the trial, the company had to recall 1,500 NEC N2002 handsets due to a software problem. The glitch destroyed users’ e-mails, Java-based content, call records and some of the handsets’ personalized settings.

    The imotion service offers three types of multimedia files for download: full-motion videos, such as sports clips and trailers; slide shows of still pictures; and pure audio. DoCoMo has signed 28 content providers, including Sony Music and Fuji TV. However, FOMA revenue hasn’t met expectations, as customers are still making cheaper voice calls on their old cell phones. DoCoMo has also launched a Location-Based Service (LBS) called DLP, and has moved to license its technology in the Netherlands and Belgium.
    Third Generation Bulletin, December 2001, volume 3, issue 12

  • New Top Level Domains Go Live: Four of the seven new gTLDs (generic Top Level Domains) have gone live since September. Here’s a table of the go-live dates for the new global domains:

    TLD       Go-live date
    .info     September 23, 2001
    .biz      November 7, 2001
    .name     January 15, 2002
    .coop     January 30, 2002
    .museum   Demonstration: November 14, 2001; full operations: mid-July 2002
    .aero     Not yet clear (Stage 1 registration March-April 2002; ICANN agreement not yet complete)

    Both the .info and .name registries claim they had 500,000 names registered about one month after going live. Hardly a land rush.
    Netcraft

  • Office XP Hates ZoneAlarm: Alert SNS Reader Jeff Ellsworth sends along this complaint: Why would Microsoft release an Office XP Service Pack with a known conflict with a popular personal firewall? I don’t have an answer for him, and neither Microsoft nor ZoneLabs, maker of the ZoneAlarm firewall, does either. It seems that if you don’t totally uninstall ZoneAlarm before applying the Office XP Service Pack, you won’t be able to access the Internet afterwards. Microsoft points users to ZoneLabs, and ZoneLabs, well, they’re kind of silent on the issue. Jeff tried their tech support, which, for free users, only accepts emails and promises a response within five days. He tried upgrading to ZoneAlarm Pro, which gets you an answer in one to two days. Finally, after trolling the newsgroups, he found the instructions for really uninstalling ZoneAlarm, which turned out to be on the ZoneAlarm site as part of a resolution to conflicts with a Windows XP install.

    ZoneLabs’ uninstall program doesn’t really, totally, absolutely uninstall ZoneAlarm. You have to muck about in the registry and also search your disk for possible orphan files. Add to this the fact that, if you use Microsoft’s install-off-the-Net option for the service pack, your computer is unprotected while you download and install it. Unfortunately, this kind of issue is typical. Doesn’t anyone care about the poor user?
    ZoneLabs