From Pin Maps to the World Wide Web

By Stu Mathewson | March 5, 2001

Since the 1800s, the insurance industry has used technology to track and control catastrophic exposures. Insurers first used mapping after companies were hit by major fires in cities such as Boston, Chicago and Philadelphia. This sophisticated (for that time) tracking method consisted of underwriters placing pins on a map showing the location of their risks, from which they could restrict the company’s exposure in any block or town. At the time, fire and lightning were the only covered perils available to the insured.

While pin mapping assisted underwriters in controlling concentration when windstorm coverage was first introduced in the 1930s, it was all but abandoned by the 1960s as a time-consuming and expensive process. The U.S. had experienced few severe catastrophes during this period: a major earthquake had not struck in almost 60 years, and damaging hurricanes were scarce.

In the ’70s and ’80s, while computers were revolutionizing every aspect of American industry, the insurance industry was slow to recognize their application to catastrophe management. Natural disasters such as the 1964 Anchorage Earthquake and Hurricane Camille in 1969 were dismissed as random events striking sparsely populated areas. Catastrophe modeling remained largely the domain of a few research pioneers.

Dr. Donald Friedman developed the first hurricane modeling concept in the mid-1970s for Travelers’ internal projections of coastal hurricane losses. While eons ahead of pin maps, Friedman’s model depended on simulations of a small number of historical and hypothetical losses, and was not an accurate gauge of the likelihood of potential losses.

In the 1980s, Karen Clark developed the first catastrophe model to employ a large set of hypothetical scenarios covering a range of potential events. Clark entered into a joint venture with E.W. Blanch Company, which led to the first commercially available hurricane model, “CATALYST”, in 1987. Blanch followed in 1988 with the launch of its earthquake model, “CATALYST Earthquake”. Clark then formed Applied Insurance Research and developed CATMAP for use by reinsurers.

In the next few years, other catastrophe models were introduced to the insurance industry by firms including RMS, AIR (for primary insurers), Tillinghast-Towers Perrin, and EQE. These models garnered industry-wide attention, if not acceptance. While catastrophe models became more refined and their processes became generally understood, they served less as a tool to monitor and control company exposure to natural disasters than as a benchmark for insurers in negotiating reinsurance pricing.

Hurricane Hugo and the Loma Prieta Earthquake in 1989 did little to change industry practice. The industry-wide yawn was understandable: Hugo and Loma Prieta had done little damage to reserves and surplus, and financing of catastrophe losses (through the reinsurance market) continued to be widely available and cheap. Primary carriers continued to see little use for modeling as a way to avoid risk concentration.

Hurricanes Andrew and Iniki in 1992, followed closely by the Northridge Earthquake in 1994, changed the industry and its view of catastrophe modeling forever. Carriers that did not embrace, or at least employ, catastrophe modeling were suddenly at risk.

Almost overnight, a new and highly sophisticated sector of the reinsurance market emerged. The Bermuda Reinsurance Market was created by a group of new companies with an exclusive focus on catastrophe risk. They employed catastrophe modeling not only to price catastrophe reinsurance, but also as an underwriting tool to establish whether or not to deploy their limited capacity. The emergence of the Bermuda market immediately “raised the bar” for carriers with heavy writings in catastrophe-exposed areas.

“The catastrophes of the early 1990s brought catastrophe modeling to the forefront. What many considered a theoretical concept quickly became the new paradigm for insurance companies with books of business in catastrophe prone areas,” said Bill Riker, president and chief operating officer of Renaissance Reinsurance Co., one of many reinsurance companies formed in the aftermath of Hurricane Andrew.

During the same period, rating agencies such as A.M. Best Co. began using catastrophe modeling as a factor in assessing insurance company financial security. High concentrations of catastrophe-exposed business indicated unacceptable financial volatility.

After incurring $15 billion in insured damage during Hurricane Andrew and another $13 billion in the Northridge Earthquake, carriers realized that, in order to secure catastrophe financing and maintain an adequate rating, catastrophe modeling would have to play a key role.

Today, catastrophe modeling is widely accepted and affects every sector of the insurance industry. The development of the Internet has moved catastrophe exposure management to a still higher plane. For the last 10 years, catastrophe modeling has been performed retrospectively: once a carrier had created a book of business, the model would indicate whether the company had done a good job of creating a spread of risk. At best, an underwriter could assess the risk after receiving an application. The Internet now allows companies to model and track exposures and adjust availability on a real-time basis.

This is especially useful for companies that specialize in catastrophe underwriting. ICAT Managers, an MGA formed to serve small-to-medium catastrophe-exposed risks, uses the Internet and RMS modeling to confirm locations and track risk factors such as distance to the coast and soil type for each risk before writing business. Once risks are geo-coded and modeled, ICAT can provide real-time quotes and underwriting decisions.
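To make that kind of check concrete, the sketch below shows a toy coastal-distance screen that a geo-coded submission might pass through before a quote is offered. It is a minimal illustration only: the coastline sample points, the five-mile threshold and the function names are assumptions made for the example, not part of ICAT’s or RMS’s actual systems.

```python
# Illustrative sketch only: a toy coastal-distance screen of the kind a
# real-time underwriting workflow might run after a submission is geo-coded.
# The coastline sample points and the five-mile threshold are hypothetical.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_MILES = 3959.0

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points, in miles."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))

# Hypothetical sample of coastline vertices (latitude, longitude) near the risk.
COASTLINE_POINTS = [(25.76, -80.13), (25.79, -80.12), (25.85, -80.12)]

def distance_to_coast(lat, lon):
    """Distance from the geo-coded risk to the nearest sampled coastline point."""
    return min(haversine_miles(lat, lon, c_lat, c_lon) for c_lat, c_lon in COASTLINE_POINTS)

def passes_coastal_screen(lat, lon, min_miles_from_coast=5.0):
    """True if the risk satisfies the (hypothetical) minimum-distance-to-coast rule."""
    return distance_to_coast(lat, lon) >= min_miles_from_coast

if __name__ == "__main__":
    lat, lon = 25.77, -80.21  # a geo-coded Miami-area location, for illustration
    print(f"{distance_to_coast(lat, lon):.1f} miles from sampled coastline")
    print("Passes coastal-distance screen:", passes_coastal_screen(lat, lon))
```

A production system would of course work from full coastline geometry and modeled hazard data rather than a handful of sample points, but the underlying step is the same: turn an address into coordinates, derive the risk factors, and apply the underwriting rule before the quote goes out.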

“The Internet in combination with catastrophe modeling software provides our industry the ability to model every submission, or an entire book of business, in an instant,” said Jack Graham, chief executive officer of ICAT Managers. “These advances in technology have brought a new level of sophistication to catastrophe risk management.”

In the years to come, technology and the accuracy of catastrophe modeling will continue to improve. For example, satellite GPS (global positioning systems) may further enhance the ability to quickly generate the exact latitude and longitude of any risk, which can then be compared against topographical maps, flood zones and even local building codes to determine eligibility.
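As a rough illustration of that idea, the sketch below checks whether a GPS-derived coordinate falls inside a flood-zone boundary using a standard ray-casting point-in-polygon test. The zone polygon and the test coordinates are made-up values for the example, not real flood-map data.

```python
# Illustrative only: a ray-casting point-in-polygon test of the sort that could
# flag whether a GPS-derived risk location falls inside a flood zone.
# The polygon and test points below are made-up values, not real flood-map data.

def point_in_polygon(lat, lon, polygon):
    """Return True if (lat, lon) lies inside the polygon given as (lat, lon) vertices."""
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        lat_i, lon_i = polygon[i]
        lat_j, lon_j = polygon[j]
        # Count how many polygon edges a ray cast from the point crosses;
        # an odd number of crossings means the point is inside.
        if (lon_i > lon) != (lon_j > lon):
            crossing_lat = (lat_j - lat_i) * (lon - lon_i) / (lon_j - lon_i) + lat_i
            if lat < crossing_lat:
                inside = not inside
        j = i
    return inside

# Hypothetical flood-zone boundary expressed as (latitude, longitude) vertices.
FLOOD_ZONE = [(29.90, -90.10), (29.90, -90.00), (29.98, -90.00), (29.98, -90.10)]

if __name__ == "__main__":
    print(point_in_polygon(29.95, -90.05, FLOOD_ZONE))  # True: inside the zone
    print(point_in_polygon(30.05, -90.05, FLOOD_ZONE))  # False: outside the zone
```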

While no one can verify whether the models are 100 percent accurate, catastrophe modeling has instilled an industry-wide discipline to track catastrophe-exposed writings and prevent over-concentrations of risk. Carriers and agents that understand catastrophe modeling and its implications are well positioned to thrive as the industry moves toward a hardening market.

Mathewson is one of the insurance industry’s pioneer contributors to the development of catastrophe modeling software. He led E.W. Blanch Company’s development of its CATALYST model in the mid-1980s and joined Tillinghast in the early 1990s as a consulting actuary. He was a principal architect of their ToPCat earthquake and windstorm models. He is a trained Fire Protection Engineer and speaks frequently at industry events about natural disasters.
