Bringing the Power of Knowledge to the Desktop
By ALLEN MONROE
Technology is just the prerequisite. Information technologies have revolutionized the way risk management and insurance professionals obtain, process, and analyze data. Before the concepts of risk management expanded the way enterprises handle insurable and non-insurable risks, the information maintained by insurance departments generally was limited to the kinds of information needed to manage a portfolio of insurance policies. Paper-based files consisted of underwriting information, insurance policy records, and aggregate loss ratio statistics provided by insurers.
With the acceptance of the risk management process, organizations explored alternative methods of financing insurable risks, such as self insurance and captive insurance. Evaluating and implementing these techniques required more detailed data and sophisticated analysis. Risk managers systematically began to organize detailed information on their risk exposures, losses, and risk management programs. Their efforts were aided by increasingly powerful and accessible computer database and spreadsheet software, and through the development of specialized systems designed to track and analyze risk management data.
These systems have helped self insurers and TPAs process claims more efficiently, and provide the means for risk managers to access a comprehensive database that includes incidents, accidents, claims, exposures, underwriting data, and insurance policy records. Such processes as "cost-of-risk allocation," a term coined by Guyon Saunders, founder of Corporate Systems, have become commonplace. Desktop-based relational database systems, accessing information from shared databases on sophisticated "server" computers, are in common use. They make it possible to obtain information from a variety of different insured and self-insured programs.
There is a critical need for consistent standards to avoid fragmentation of information. Because risk-related information comes from many disparate sources, it often is fragmented and not easily consolidated. This is due in part to the multi-jurisdictional regulations pertaining to insurance and self-insurance in each state and country. Other impediments to consolidating a consistent, enterprise-wide risk management database include the tremendous variety of data formats used by insurers, TPAs, and regulators, based on aging, customized systems on mainframe computers.
There is a recognized need for greater consistency in the maintenance of risk management information. A number of industry efforts are underway to develop a broadly-accepted set of standards. In addition to the ANSI X-12 committees that regularly meet to define standards, the major insurance brokers have begun to create a common set of transactional formats as part of their WIN initiative. RIMS is hosting meetings to jump-start the standards-making process. Dorn Technology, a leading provider of Risk Information Systems, has made public its own specification on claims data (at http://dorn.com) as part of an intensive effort to spearhead an industry-wide collaboration toward a common standard.
According to Scott Gilmour, Director of Marketing of Corporate Systems, "the inefficiencies created by the current lack of a consistent standard are costing our industry tens of millions of dollars every year. We need to agree on a common format." When organizations update their risk management information systems, the cost of data conversion often is a significant percentage of the system price tag.
Consistent data consolidation over many years helps transform data into knowledge. Don St. Jacques, Director of Workers Compensation/Managed Care at The Hartford, has begun an ambitious project to analyze the company's entire database of medical bill processing and teleclaim information to examine trends, evaluate lags in claims reporting, and respond to ad hoc inquiries from its clients. Using Corporate Systems' medical bill review system, The Hartford processed over 1.7 million workers' compensation medical bills in 1996 alone. Says St. Jacques, "through medical bill review, we achieved a net cost reduction of $231 million, amounting to 37% of the amounts billed by medical care providers."
Now St. Jacques is focusing on using Corporate Systems' CS Knowledge™ as a window into a replicated database, attempting to transform raw data into useful information by focusing on the "outcomes" of claims. The database contains a "carbon copy" of each transaction from three bill review centers. It resides on a dedicated server handling "gigabytes of data." These efforts previously required the help of programmers to create special reports. According to St. Jacques, "the days of having to print volumes and volumes of reports are going away. As we move to PC-based computing, sometimes we get so technically crazed that we think instantaneous access to every type of information is what we need. But sometimes we're dealing more with trends than with instantaneous responses, and some of the information becomes more valuable with age. We envision that the largest potential of the new system will be to identify how managed care programs are impacting the workers' compensation business. By anticipating trends in medical billing practices, we can adapt our programs and provide better information to our clients, helping them manage their risks."
The next challenges will involve how to receive the initial data in an electronic format. If sufficient security measures are developed, the Internet may afford an easy-to-use, widely accessible way of receiving employer first reports of injury.
The explosion of new applications for risk management data means that the opportunity to benefit from acceptance of standards is not limited to claims or other risk management data. The range of useful applications using the data is proliferating, creating a need for both desktop and server software systems to interface easily with other systems. According to Dave Duden, National Director of Deloitte & Touche's risk management systems practice, "we are seeing a lot of efforts to integrate different systems. As standardized software platforms emerge, there is less of a need to rely on one system to meet every need. We need to get to a more open software environment. Hopefully, the Internet will become that."
The role of risk management in the enterprise is transforming, as an increasing variety of types of information relating to organizations' risk exposures become accessible and manageable from individual desktop computers. For example, USAA, a leading property/casualty insurer based in San Antonio, Texas, combines two major types of risk management, formerly separate specialties, within its internal risk management department. Reporting to the Assistant Vice President of Risk Management is one Director responsible for Insurance Programs and a Director for Contingency Planning and Business Resumption programs. Previously, contingency planning was part of the corporate strategic planning function. Making it part of Risk Management makes it an ongoing, line-management process, as opposed to a specialized staff function.
As these new functions begin to come within the scope of risk management, new technologies used for business resumption and catastrophe management are being introduced as powerful risk management tools. Two of the more interesting developments are GIS (Geographic Information Systems) and GPS (Global Positioning Systems).
Geographic Information Systems applications for risk management are becoming widespread. "Prior to Hurricane Andrew, there was not widespread perception of a need for the use of such new technologies as GIS in performing catastrophe needs assessment," says Allen Rodgers, a GIS specialist with Eagle Information Mapping, in San Antonio, Texas. While at USAA Insurance, Rodgers worked during the introduction of GIS as a tool for performing catastrophe needs assessment, to help answer such questions as:
- How big and how bad will a natural disaster be?
- How will it affect our insurance portfolio?
- Where should we set up catastrophe operations?
- How many adjusters and independents should we assign, and how soon?
One day's difference in making field assignments of loss adjusters amounts to a potential savings of as much as $70,000. Seventy-two hours before a major hurricane makes landfall, USAA runs its catastrophe models, using data input received over the Internet from the U.S. Weather Service's Marine Advisory services. The data includes the projected track of the hurricane, its present coordinates, pressure gradients, and windspeeds. The data is loaded into a Hurricane Model developed by SAIC. The model overlays the storm data with predicted points of impact, and permits a database interface with ZIP codes of the insurer's members (policyholders) from a mainframe computer. The model runs on top of ArcInfo, software developed by ESRI (Environmental Systems Research Institute, based in Redlands, California).
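The overlay step described above — matching policyholder locations against a projected storm track — can be sketched in a few lines of code. This is a simplified illustration, not SAIC's actual model: the track points, ZIP centroids, and the fixed wind radius are all hypothetical, and a real model would use pressure gradients and windfield shapes rather than a single distance cutoff.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in statute miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 3959 * asin(sqrt(a))

# Hypothetical forecast track points and ZIP-code centroids (lat, lon).
track = [(25.8, -80.1), (26.5, -80.8), (27.2, -81.5)]
zip_centroids = {"33101": (25.77, -80.19),
                 "33301": (26.12, -80.14),
                 "32801": (28.54, -81.38)}

def zips_at_risk(track, centroids, wind_radius_miles=60):
    """Flag ZIP codes whose centroid lies within the assumed wind radius
    of any point on the projected track."""
    at_risk = set()
    for zcode, (zlat, zlon) in centroids.items():
        for tlat, tlon in track:
            if haversine_miles(zlat, zlon, tlat, tlon) <= wind_radius_miles:
                at_risk.add(zcode)
                break
    return at_risk
```

Joining the flagged ZIP codes back to the policy master file is what lets the insurer estimate claim counts and size its adjuster deployment before landfall.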
Other applications of GIS at USAA include insurance rating, underwriting, and claims adjusting. "Catastrophes are the Number 1 priority when they're happening," says Traci Tracey, a GIS Analyst for USAA in Dallas. "But on a day-to-day basis, we use GIS for establishing rating territories by overlaying 'geocodes' of policyholders. This allows for a more finely-detailed rating grid and permits overlay of flood plain maps with policyholder locations. This helps after a catastrophe as well, so that loss adjusters can locate the (former) properties affected by an insured loss. After Hurricane Andrew, there were no street signs or landmarks. Even the McDonalds was gone. Residents were evacuated and could not show adjusters where the house used to be. We also find preferred providers for a given accident, based on the member's X-Y (longitude/latitude) coordinates." ArcInfo and ArcView were used by USAA following the Northridge earthquake to produce optimum routing on printed maps, with printed driving directions, for use by claims adjusters in handling property/casualty claims. In the future, this may be commonly done via either wired or wireless Internet access.
"Within a fairly short time," says Allen Rodgers, "insurance companies will begin merging GPS (which identifies a location's exact latitude and longitude), GIS, and Remote Sensing. Remote sensing is particularly useful for tracking down stolen cars. GPS is useful for identifying the exact location of accidents and insured properties. We also expect to be able to sense the composition of roof material, backyard structures, and types of building materials used on an insured structure before a loss occurs. Previously, remote sensing data available from NASA was limited to 10-meter resolution. Now data is available at 1-meter resolution, though it is expensive."
Geocoding, an algorithm for estimating the actual locations of properties from a database, is commonly used for approximations to actual X-Y coordinates. The OMEGA system by Anistics, used by numerous risk managers for claims and exposure data, includes a data field for entry of each location's "geocode."
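One common geocoding technique is address-range interpolation: the house number is placed proportionally along a street segment whose endpoint coordinates and address ranges are known. The sketch below illustrates the idea with a hypothetical street-segment record; it is not the algorithm used by any particular product named above, and real geocoders add side-of-street offsets and handle parity (odd/even numbering).

```python
def interpolate_geocode(house_number, segment):
    """Estimate X-Y coordinates for a street address by linear
    interpolation along a street segment with known address ranges."""
    lo, hi = segment["from_number"], segment["to_number"]
    (x1, y1), (x2, y2) = segment["from_xy"], segment["to_xy"]
    # Fraction of the way along the block (0.0 at the 'from' end).
    t = (house_number - lo) / (hi - lo) if hi != lo else 0.0
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))

# Hypothetical 100-block running west to east at latitude 35.20.
block = {"from_number": 100, "to_number": 198,
         "from_xy": (-97.10, 35.20), "to_xy": (-97.09, 35.20)}
x, y = interpolate_geocode(149, block)  # address midway along the block
```

The resulting coordinates are approximations, which is why systems such as OMEGA store the geocode as its own data field rather than treating it as a surveyed location.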
"Right now, in terms of the ultimate applications of GIS technology for insurance and risk management," according to Traci Tracey, "we are using a stone knife. Our biggest challenge is enabling people whose main job is not GIS but who can use it as a tool. If it's too difficult, they simply will not use it. We hope that our forthcoming 'Intranet' may make GIS more accessible to the occasional user. Even our van pool services group uses it. The ability to visually analyze data allows quicker decisions. Other tools could be used, but people like to see things as real pictures, as opposed to tables and pies."
John Bruno, Senior Meteorologist for SAIC in McLean, VA, uses ESRI's ArcInfo GIS as a tool in designing FEMA's contingency planning and disaster recovery systems. After Hurricane Andrew in 1992, FEMA realized it was inadequately equipped to respond to a disaster of that size. A priority was established to install a better system by the next hurricane season. The purpose of the system was to give FEMA the capability to identify what resources, and how many, would be required to respond to a natural disaster (how many cots, blankets, gallons of fresh water, meals, etc.).
As a result, the "Consequences Assessment Tool Set" (CATS) was operational by the next hurricane season, in time for Hurricane Emily, which hit the outer banks of North Carolina. The system was effective in helping FEMA pre-position resources to aid those affected and made homeless by the disaster. The model is able to produce useful results "in real time" as a hurricane approaches. Storm data is received over a dedicated line from the National Hurricane Center in Miami. The data is automatically extracted and input to the system. The system's modeling capabilities create "damage bands," based on wind speed for various types of structures. It provides information on the likely number of homeless. It estimates special parameters, such as the languages spoken in neighborhoods likely to be affected, by interacting with census data. The GIS technology allows for "spatial estimation" of damages (not just numbers of impacted population but where they are and where disaster recovery resources need to be). It also helps to identify places to set up disaster supplies, such as shopping mall parking lots, and places to take injured people.
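The "damage band" idea can be illustrated with a small classifier: each structure type has wind-speed thresholds at which it crosses into higher damage categories. The threshold values below are invented for illustration — the actual CATS parameters and band definitions are not published in this article.

```python
# Hypothetical wind-speed thresholds (mph) at which each structure type
# enters the "moderate" and "severe" damage bands.
DAMAGE_THRESHOLDS = {
    "mobile_home": (75, 100),
    "wood_frame":  (95, 125),
    "masonry":     (110, 140),
}

def damage_band(structure_type, wind_mph):
    """Assign a structure to a damage band given the modeled local wind speed."""
    moderate, severe = DAMAGE_THRESHOLDS[structure_type]
    if wind_mph >= severe:
        return "severe"
    if wind_mph >= moderate:
        return "moderate"
    return "light"
```

Run per census block against the modeled windfield, a classifier like this yields the spatial damage estimate — not just how many people are affected, but where — that drives the pre-positioning of supplies.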
The CATS system also is used in conjunction with Pacific hurricanes and typhoons and is installed at the Defense Special Weapons Agency. A related flooding model was used for the Mississippi River floods in June/July, 1993. Similar models are used to track and project the trajectory and impacts of chemical releases and nuclear accidents. The software, which can be licensed from SAIC, can be customized for specific applications and databases. It is used by a number of major insurance carriers to develop Probable Maximum Loss estimates. It also helped in planning for contingencies such as the Olympics and the Presidential Inauguration. Other common uses include fighting urban and wildland fires, using satellite imagery to show the likely evolution of a fire if left unfought and overlaying resources such as hospitals. This helps in determining whether to use a particular hospital to aid victims, or whether the nearest hospitals are themselves threatened by the fire. GIS and aerial imagery, transmitted via the Internet, were helpful in fighting the Oakland Fire in 1991.
According to Gity Monroe, GIS Manager for the Stanislaus County, California Department of Public Works, their GIS system was being upgraded when the January flooding struck. "On the first night of the flood, the fire and sheriff's emergency dispatch asked for a map, and we tried to determine the locations of properties being flooded. Using GIS, we also drew evacuation maps and could see what houses were under water. When our system is completed, we will use it to assess the value of lost farmland for completing disaster reports, and also to determine the name and address of each person flooded. GIS will be a big help in preparing an inventory of damage estimates for FEMA disaster aid applications."
Notes Robert Spiva, director of marketing at California CAD Solutions, a 10-year-old GIS consulting group working with both Stanislaus and San Joaquin Counties: "AutoCAD has been useful in tracing the high water line of the January flooding from the Tuolumne River. Each time a levee breaks, new information comes into the emergency centers. That information has to be collated with data we already have and used to update the maps and develop contingency plans. That effort will continue long after the waters recede."
GIS and remote sensing are being used to expedite the handling of disaster assistance applications by FEMA at disaster field offices. Based on the address of an applicant's home, FEMA can determine with high confidence whether the home actually suffered severe damage. FEMA can then immediately cut a check for interim assistance.
Not every use of GIS technology is "above ground." According to Michael Lane, President of Greenbrae Environmental near San Francisco, "Computer technology can greatly assist risk managers caught in this dilemma of evaluating the risks of environmentally-impaired sites. One of the fastest growing environmental service sectors is site visualization. In effect, this is 'underground GIS.' Instead of roads, utility lines, sewers and parcel boundaries, the "layers" consist of datasets such as soil borings, ground water monitor wells, contaminant concentrations, water table position, or top of bedrock. Formidable reports can be broken down into manageable portions, and the relevant results displayed in easily understandable color and three dimensional diagrams."
Lane continues, "The goal of this 'turning data into pictures' approach is to enable risk managers to quickly focus on conditions relevant to their particular situation, ignore the rest, and get on with their business. It may turn out that a deep groundwater problem has no impact on activities at ground level, and this can be conveyed effectively enough to satisfy jittery employees (lenders, insurers etc). Or perhaps a highly contaminated soil area was removed and extensive testing shows no further impacts. Or, perhaps a gasoline spill was cleaned up but an adjacent industrial site has a shallow chlorinated solvent plume headed directly for the subject property. In each case, these new tools provide risk managers with the ability to make decisions based on a real understanding of the issues."
Risk-financing decisions are being assisted with GIS tools. Such companies as Hilton Hotels, Mattel Inc., and clients of reinsurance intermediary E.W. Blanch Co. and Marsh & McLennan utilize catastrophe modeling software and services provided by Menlo Park, California-based Risk Management Solutions. The Interactive Risk Assessment System (IRAS) developed by RMS models the interaction of physical forces and their financial consequences. It integrates the analysis of risk portfolio information specific to an individual company's or insurer's exposures by location against natural-hazard data, such as seismic records. The resulting models can be viewed against constantly updated weather data, and can be used to model the consequences of specific risk financing decisions, such as alternative ways of structuring excess reinsurance placements. These techniques are useful in managing risk portfolios and avoiding excessive accumulations of risk related to any one possible event. RMS has released a free software tool called RiskBrowser™ to reinsurers throughout the world. The purpose is to move catastrophe management from being a specialized function to the desktops of managers who need such information in their daily business decisions.
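The reinsurance-structuring analysis mentioned above rests on a simple mechanic: for each modeled event, an excess-of-loss layer pays the portion of the loss above its attachment point, up to the layer limit. The sketch below shows that calculation with invented event losses and an invented "$40 million excess of $10 million" layer; it is a generic illustration, not the IRAS implementation.

```python
def layer_recovery(gross_loss, attachment, limit):
    """Amount ceded to an excess-of-loss layer: the part of the loss
    above the attachment point, capped at the layer limit."""
    return min(max(gross_loss - attachment, 0.0), limit)

# Hypothetical modeled event losses (in $ millions) and a 40 xs 10 layer.
event_losses = [5.0, 18.0, 62.0]
ceded = [layer_recovery(x, attachment=10.0, limit=40.0) for x in event_losses]
net = [x - c for x, c in zip(event_losses, ceded)]
```

Re-running the modeled event set under alternative attachment points and limits is what lets a risk manager compare reinsurance structures against the same catastrophe scenarios.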
Major insurance carriers are using GIS for viewing environmental sites and "sensitive receptor sites" that would receive the brunt of a disaster or undue effect of a catastrophe, such as schools or hospitals. A new technology known as "SDE" makes GIS applications "distributable," allowing fast, distributed access to very large databases with hundreds of concurrent users. In effect, the new technology brings client/server to GIS. The users of such systems have an easy-to-use graphic user interface, and no specialized GIS software needs to be installed on desktop computers using the system.
The Internet makes it possible to use GIS applications via a standard "web browser." This helps in eliminating or shortening the "learning curve" for non-GIS-specialists, since web browsers also will be in common use for numerous other common applications. As risk managers develop their own "Intranets," map-based and remote sensor data can be accessed, with details on organizations' flood hazards, sensitive receptor sites, hazardous material sites, etc. Using secure access via the Intranet or Internet, underwriters, brokers, and risk managers all will be able to simultaneously view the same graphic information for major locations.
The Web makes it easy! An example of how easy it is to get relevant up-to-date information on the impact of current disaster situations is illustrated by January's West Coast flooding information provided at the www.disasterplan.com Internet web site. Satellite photos of incoming weather, rainfall and snowfall estimates prepared by the National Weather Service, readings from remote sensors of streamflows, and a disaster bulletin board are among the information available simply through a web browser. Similar applications for risks within any organization can be developed as part of risk management Intranets.
Allen Monroe, Founder and CEO of RiskINFO in Larkspur, California, can be reached at (415) 927-8824, or by e-mail at firstname.lastname@example.org
Examples of Geographic Information Systems:
HAI-Maps City Viewer
Copyright 1997 RISK & INSURANCE. 747 Dresher Road, P.O. Box 980, Horsham, PA 19044-0980. Reproduced with permission of David Shadovitz, Editorial Director (215) 784-0910. An abbreviated version of this article first appeared in RISK & INSURANCE, in March, 1997, in an article titled, "Risk's New Desktop."