Intranet Systems Integration

Steve Telleen and Bart Meltzer
Business Communications Review, Volume 27, Number 7, pp. 35-38, July 1997

For most companies, the intranet started as a grassroots experience -- often informal and unsanctioned. This self-starting process has certain advantages, particularly the capturing of a high level of interest and involvement from organizations and individuals outside the formal IS structure.

Simply put, intranets give these "outsiders" more control than they've had with traditional computer applications, and it is the ability of intranets to involve and empower independent decisions and action that makes them compelling. However, as intranets are used to support more business communications, the requirements for integration with other resources -- both intranet and legacy -- become critical.

Integration pressures are not new. In the past we've tried to solve the problem by standardizing on products from a single vendor, but that strategy doesn't always work. Individual departments or divisions sometimes continue to buy outside the standard to meet "special" needs, and there are instances where new products and capabilities have emerged from someone other than the sanctioned vendor. The problem has been exacerbated by the trend toward mergers and partnering, where there is a lot of history and no single control over brands and vendors.

Intranets address these cross-vendor problems better than any previously available set of technologies. Essentially, an intranet is an infrastructure, different from previous infrastructures in that it is based on Internet standards and tools, and focuses on content sharing within a limited and well-defined group. 

While conventional standardization efforts have attempted to force a homogenous approach, Internet standards and tools are based on support for and incorporation of diversity. Content, rather than the process of content creation, is standardized; as long as an intranet tool can read, act on and output Internet-standard content, its internal workings and features are unimportant.

The issue facing enterprise managers is that the arrival of Internet-type standards forces the network to expand beyond vendor-specific, legacy architectures. While this may seem like a daunting task, the payoff is that the foundation of the architecture becomes based on the enterprise's vision and needs rather than a vendor's marketing or technology requirements. This not only means more flexibility, but also the potential for the network owner/operator to differentiate his/her business.

An enterprise architecture involves development at several levels in the organization (see sidebar). The system level of an enterprise architecture is the last to be developed, and provides the transition from the architectural abstractions to specific products and implementations. It documents the services required and how they inter-relate. It does not include specific products or technologies. The required services and standards depend not only on specific circumstances of the business, people and process levels, but also on the legacy environment and the future objectives of the organization. This article will look at approaches to integrating specific legacy situations.

Web-Enabled versus Web-Based Applications

The first step seems obvious, but is often overlooked: defining what you intend to do. Is your intent to take a legacy application and make it available through a web browser -- i.e. web-enable an existing application -- or do you intend to build an intranet-based application that accesses legacy data? This is an important distinction because the requirements and tools are different.

To show the difference between "web-enabled" and "web-based," consider a web site designed to facilitate a customer credit application process. This process can be looked at as a business-to-consumer process or a business-to-business process where Company A wants a credit line from Company B. In its most basic form, the process lets a user come to the web site, fill out a form with credit information, submit the form and receive information about acceptance or denial of the credit line.

In the web-enabled system, the application on the web server does little more than accept the HTTP request (either a GET or POST), parse and package the data into a message and send that message on to the legacy workflow system, formatted to match the legacy system's API (Application Programming Interface). The legacy system processes the message just as it would input from any other source and takes the appropriate action.

The business rules associated with the process are encoded in and managed by the legacy application. In this case, the business rules might involve determining what credit level to offer, or whether to extend credit at all based on the submitted information, credit history and current credit load. The legacy application generates its standard output, which is post-processed into an HTML screen and sent back to the application running on the web server, which then delivers the HTML content to the user's browser.
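As a rough illustration, the following Python sketch shows the web-enabled pattern just described: the program on the web server only parses the form, reformats the data for the legacy API and wraps the legacy reply in HTML. The legacy host, port, message layout and field names are hypothetical stand-ins, not taken from the article or any particular product.

import socket
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs

LEGACY_HOST, LEGACY_PORT = "legacy.example.internal", 7001   # assumed endpoint

def to_legacy_message(fields):
    # Package the parsed form data to match the legacy system's API;
    # here, a pipe-delimited record the hypothetical workflow expects.
    return "CREDITREQ|{0}|{1}|{2}\n".format(
        fields.get("name", [""])[0],
        fields.get("income", [""])[0],
        fields.get("amount", [""])[0]).encode("ascii")

def post_process(legacy_reply):
    # The gateway only wraps the legacy system's standard output in HTML;
    # all business rules stay inside the legacy application.
    return "<html><body><pre>{0}</pre></body></html>".format(legacy_reply)

class CreditGateway(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        fields = parse_qs(self.rfile.read(length).decode("utf-8"))
        with socket.create_connection((LEGACY_HOST, LEGACY_PORT)) as conn:
            conn.sendall(to_legacy_message(fields))
            reply = conn.recv(4096).decode("ascii")
        body = post_process(reply).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8080), CreditGateway).serve_forever()

Note that every business decision still happens inside the legacy application; the gateway merely translates formats in both directions.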

In the web-based system, the application on the web server parses the data in the request. The business rules are developed and managed using a rules-based system that allows the business person at the credit-extending company to enter, change and manage the business rules through their browser. In our example, the business person might change the rules governing how credit levels are set. Based on these rules, the rules-based system takes the request and determines what calls to make to workflow functions (application objects) that are able to interact directly with databases or other legacy application services on the network. When the workflow functions are complete, the application on the web server applies the business rules to the data received, merges the appropriate HTML template with the data, then displays the page to the user's browser via the web server.
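A comparable sketch of the web-based pattern, again in Python and with all rule thresholds, function names and the template invented for illustration, shows the separation just described: the rules live as data a business person could maintain through a browser, the workflow function is what talks to legacy data, and the HTML template belongs to the presentation designers.

# Business rules kept outside the legacy code, as data a domain
# specialist could maintain through a rules-based system.
CREDIT_RULES = [
    {"min_income": 100000, "credit_limit": 50000, "offer_fx": True},
    {"min_income": 40000,  "credit_limit": 10000, "offer_fx": False},
    {"min_income": 0,      "credit_limit": 0,     "offer_fx": False},
]

def lookup_credit_history(customer_id):
    # Workflow function (application object) that would interact directly
    # with a legacy database or service; stubbed out for the sketch.
    return {"current_load": 2000}

def apply_rules(profile):
    # The web application applies the business rules, deciding which
    # workflow functions to call and what to hand to presentation.
    history = lookup_credit_history(profile["customer_id"])
    for rule in CREDIT_RULES:
        if profile["income"] >= rule["min_income"]:
            return {
                "limit": max(rule["credit_limit"] - history["current_load"], 0),
                "offer_fx": rule["offer_fx"] and profile["foreign_markets"],
            }

PAGE_TEMPLATE = "<html><body><p>Approved credit line: {limit}</p>{extra}</body></html>"

def render(decision):
    # Merge the decision data with the HTML template owned by the designers.
    extra = ("<p>Ask about our currency-exchange management services.</p>"
             if decision["offer_fx"] else "")
    return PAGE_TEMPLATE.format(limit=decision["limit"], extra=extra)

if __name__ == "__main__":
    profile = {"customer_id": "A-100", "income": 120000, "foreign_markets": True}
    print(render(apply_rules(profile)))

In this arrangement, changing a threshold or adding a new offer means editing the rules data rather than releasing new legacy code, which is the maintainability argument developed below.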

While web-enabled systems can be a valuable migration tool for quick conversion of legacy systems, over time web-based systems are more desirable, because they are much easier to maintain and extend. Web-based systems also provide better performance, from both the business and computing points of view. This is because web-enabled systems require all updates to be funneled through the engineering group that supports the legacy system, where all of the functionality resides, which produces inefficiencies and inevitable backlogs and bottlenecks. In contrast, web-based systems can be maintained in a distributed fashion on different platforms for different purposes. 

For instance, the business rules that drive the web-based application can be updated or changed by the domain specialists -- the non-IT specialists who develop and are responsible for these rules from the business perspective. Similarly, the look and feel of the site can be updated by the designers responsible for presentation. This leaves the legacy and workflow functions to be updated by the engineering group that historically has been responsible for them.

Furthermore, web-enabled systems are not easily extensible. Say the business wants to offer other services during the credit application process when the customer profile indicates a potential interest. For example, if the credit is being requested from a financial institution, and the requester profile shows that the customer does business in foreign markets, the financial institution may want to offer its new currency-exchange management services in addition to approving the credit line.

In the web-enabled system, the only screens that can be presented to the user are those generated from within the legacy workflow system, and this system has no ability to customize the experience. In contrast, in the web-based system, the business rules are separate from the workflow functions and legacy data. Additional presentation screens can be developed independently, even dynamically, to improve the audience's experience with the site. Even the new services can be integrated more easily, because they are integrated at the business rule level, not as code in a legacy application.

Integrating Legacy Systems

A major benefit of an intranet infrastructure and web-based applications is the ability to host different services on different servers. While neophytes envision their intranet as being hosted on a single web server, experienced practitioners quickly accept, and learn to leverage, the power of diverse web servers on their intranet.

This can be important when integrating legacy systems. Vendors of specific legacy content often sell an integration server and middleware that will accomplish the integration of their applications with the least effort. While they try to sell these servers as being "general purpose," you may be better off using them primarily as integration servers for their legacy services and using other servers for other functions on your intranet. Just like selecting among screwdrivers, hammers and saws for a specific carpentry job, don't be afraid to use different web servers and middleware on the intranet to meet specific requirements. It often is useful to think of each intranet service as a potentially separate web server.

For example, if a unit within the enterprise has an extensive history with Lotus Notes, a Domino server to support the integration is an obvious choice. If another unit shares MS Exchange files extensively, then Microsoft's IIS web server can support that integration. Legacy data in an Oracle database might be most easily served using an Oracle server, and there are even web servers for mainframe MVS and VM operating systems that can integrate bulletin board files on the mainframes with threaded discussion groups on the intranet.

The key point: Build toward your enterprise architecture, not your legacy constraints. Whether you have one legacy system or many, look at these servers as specialized integration servers that can be mixed.
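One way to picture this mixing of specialized servers is a small front-end dispatcher that routes URL prefixes to whichever integration server owns that content. The host names and paths in this Python sketch are assumptions for illustration, not recommendations of specific products.

from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Each intranet service is treated as its own (integration) web server.
SERVICE_MAP = {
    "/notes/":    "http://domino.example.internal",      # Notes/Domino content
    "/exchange/": "http://iis.example.internal",         # Exchange files via IIS
    "/orders/":   "http://oracle-web.example.internal",  # Oracle-served legacy data
}

class Dispatcher(BaseHTTPRequestHandler):
    def do_GET(self):
        for prefix, backend in SERVICE_MAP.items():
            if self.path.startswith(prefix):
                with urlopen(backend + self.path) as upstream:
                    body = upstream.read()
                self.send_response(200)
                self.end_headers()
                self.wfile.write(body)
                return
        self.send_error(404, "No integration server registered for this path")

if __name__ == "__main__":
    HTTPServer(("", 8080), Dispatcher).serve_forever()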

If you are worried about the cost of these multiple web servers, remember that you are already supporting legacy applications on multiple systems, and the web server generally is an inexpensive add-on. It certainly will be less expensive and disruptive than attempting a conversion or writing and maintaining custom integration code. The Internet standards were designed specifically to provide a Rosetta stone for integration among diverse systems, so use them to your advantage. If you build an intranet to support this diversity now, in the future you will find you can migrate, at your own speed, to other solutions without the necessity of massive, synchronized conversions.

Content Integration

Integrating content across legacy systems for delivery through a web-based application is not trivial and requires a sound architectural approach and good development tools. The main issue is to provide a single web-based application interface that accesses multiple, heterogeneous content sources.

The content sources may be structured (e.g. databases), semi-structured (e.g. rules-based or expert systems) or unstructured (e.g. email archives, discussion groups or web pages), but in each case, the access to these sources must be transparent to the user of the web-based application. Another large issue is the condition of the data returned from each legacy system and the normalization and cleansing that must be done in order to make the data suitable for web delivery.

The best way to deal with these issues is to use an architectural approach based on information brokering and agent technology. Generally, a web application built with this architecture will contain HTML page templates, static application content and business logic, but instead of going after legacy system content directly, the web application will talk to an "information broker" whose job it is to cache content from legacy systems as it is delivered by the agent.

Agents connect the information broker to legacy systems and are responsible for finding content in legacy systems, normalizing and cleansing the content and delivering it to the information broker according to the broker's published schema, as well as the rules for re-querying the content sources. Each agent has the knowledge to access specific legacy systems and contains rules for manipulating content before handing it to the broker. The agents are domain specific, with usually at least one agent per content source. As new legacy systems come on line, an agent associated with that legacy system can register with the information broker.

The registration approach makes it easy to bring on new legacy sources without any changes to the web application or the information broker. The only work to be done is to develop the agent associated with the new legacy system (according to standards and guidelines) and have the agent register itself with the broker. This approach also makes it possible for legacy systems to go off line without affecting the way the web-based application functions. When a legacy system goes offline, cached data in the broker may be used to feed the application and there is no change in functionality, though it may be necessary to notify the user that cached content is being used.
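The following Python sketch, with illustrative names throughout, captures the broker/agent pattern as described: agents register for a content domain, normalize what they fetch to the broker's schema, and the broker falls back to its cache when a source is offline.

class InformationBroker:
    def __init__(self):
        self.agents = {}   # content domain -> agent
        self.cache = {}    # (domain, query) -> last content delivered

    def register(self, domain, agent):
        # New legacy sources come on line simply by registering an agent;
        # neither the web application nor the broker needs any change.
        self.agents[domain] = agent

    def get(self, domain, query):
        agent = self.agents.get(domain)
        try:
            content = agent.fetch(query)          # agent normalizes and cleanses
            self.cache[(domain, query)] = content
            return content, False
        except Exception:
            # Source unregistered or offline: serve cached content and flag it.
            return self.cache.get((domain, query)), True


class CreditHistoryAgent:
    # Knows how to reach one specific legacy source and map its records
    # onto the broker's published schema.
    def fetch(self, query):
        raw = {"CUST": query, "LOAD": "2000"}     # stand-in for a real legacy call
        return {"customer": raw["CUST"], "current_load": int(raw["LOAD"])}


if __name__ == "__main__":
    broker = InformationBroker()
    broker.register("credit_history", CreditHistoryAgent())
    content, from_cache = broker.get("credit_history", "A-100")
    print(content, "(cached)" if from_cache else "(live)")

Bringing a new source on line is then just another register() call with a new agent, with no change to the broker or to the page-generation code.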

Conclusion

Intranets can be a powerful communication and integration tool. However, for intranets to reach their full potential, the infrastructure must be predicated on supporting diversity and distributed control. This requires a robust enterprise architecture, and an approach that leverages the new functionality to manage the integration rather than falling back on the centralized or single-vendor approaches of the past.

Once the basic infrastructure and architecture are in place, this technology supports migration in small steps. Begin by looking for servers and tools that, with little effort, will web-enable your legacy systems. When implementing new applications, make them web-based; there are an increasing number of off-the-shelf web-based applications, generally from smaller vendors. For functionality not yet available off-the-shelf, high-level development tools are becoming available. Finally, after web-enabling your legacy systems, you can migrate them to web-based applications more easily, because both systems can run simultaneously, using the same browser interface.



An Enterprise Architecture Baseline

At a high level, creating an enterprise architecture involves working through four levels: business models, people roles, publishing processes and technology systems (application, data or platform):

Business Models:
While the details of the business model will vary with each enterprise, business content can be divided into three basic types: informal, formal and access controlled. These distinctions are important because the roles, processes and, ultimately, system requirements will change depending on which type of information is being supported.

Informal content supports much of the innovation and "work in progress" that takes place. Informal content has the fewest controls and is where all of the business's content is created and refined. It includes personal work or insights that get shared outside formal projects and missions, project or departmental work and communication not intended for the entire enterprise, and draft content that has not yet completed a formal review and acceptance process.

Formal content is sanctioned by a recognized organization that stands behind both the quality of the content and any commitments it makes. Generally, the feature that moves content from informal to formal is an explicit review and acceptance process defined by the organization that stands behind the content. Formal content includes published company benefit programs, press releases, product collateral, product pricing schedules, company policies, company mission and value statements, competitive analyses, company forms, official process descriptions, and many other published standards, views and commitments that the enterprise formally endorses.

Access-controlled content involves additional decisions about who can see the content -- whether the audience is more restrictive or less restrictive than the general intranet population. If the content is restricted to a subset of the intranet, it generally is referred to as the confidential intranet. If the restricted subset includes partner or customer organizations, then we refer to the content as an extranet. If all restrictions are removed (even the intranet restriction), then the content becomes part of the Internet.

Each type of content may be created and managed differently on the intranet. For example, requirements for informal content often are as simple as providing for individually controlled publishing spaces, setting baseline rules that extend existing company policies to intranet publishing, and adding the requirement that each page must contain the owner, a contact address and the date of last update.

Formal content may require someone to fill the publisher's role of determining what kinds of information the organization will provide, as well as the editor's role of lining up and managing the content authors, guiding the content through the formal review cycles and publishing the approved content, often with a logo on it that identifies the content as formal.

People Roles: 
It is important to recognize that organizations and, ultimately, people make decisions about the content's meaning and the organization's desire and ability to live up to commitments. Processes and technologies can support the decision making, but in the end, these are judgment calls made by human beings. The enterprise architecture needs to support these people and their decision processes.

Because the style and circumstances for each person and organization are different, it is important that the architecture provide for diversity -- giving each decision maker the option of selecting processes and tools that meet her needs. Developing a systems infrastructure based on community-owned content standards, rather than specific products, gives decision-makers another set of options. Providing the decision makers with intranet-based threaded discussion and project management services that they can configure and manage themselves is yet another way to provide support.

Publishing Processes:
There are at least four classes of publishing processes that can be applied to intranet content. The first is the everyone-as-publisher class. Here each member of the intranet community is capable of publishing his or her own content on the intranet.

Second is the webmaster class. The name is taken from the historical role of the early webmaster as the individual who took the content, converted it to HTML and placed it in the web directory on the server. In our context it refers to any process where publishing on the intranet requires giving the content to a human gatekeeper who has control over what content is published and when.

The third is the database class. Specific content is published into a database, and the database acts as an automated intermediary with the web server. This process class is common in retail catalog applications and for generating dynamic pages.

Fourth is the object class. This takes its name from the object programming model, which provides application classes that can be modified into sub-classes and ultimately are manifested as specific instances. If a change is made at the class level, it propagates down to all the instances of that class.

In an intranet publishing environment we call the class a page template. When content is placed into the template, it becomes an instance of that template and it functions like an instance of an object class. If the template is changed, the changes propagate to the content pages that were created using that template.
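A few lines of Python, with made-up field names, show why the template behaves like an object class: pages keep a reference to their template, so a change at the template (class) level shows up in every page (instance) the next time it is rendered.

class PageTemplate:
    # The "class": a layout shared by every page built from it.
    def __init__(self, layout):
        self.layout = layout

class Page:
    # An "instance": content bound to a template by reference.
    def __init__(self, template, **content):
        self.template = template
        self.content = content

    def render(self):
        # Rendering always goes through the template, so a template change
        # propagates to every existing page the next time it is served.
        return self.template.layout.format(**self.content)

if __name__ == "__main__":
    corporate = PageTemplate("<h1>{title}</h1>\n<p>{body}</p>")
    page = Page(corporate, title="Travel Policy", body="Book through the portal.")
    print(page.render())
    corporate.layout = "<h1 class='brand'>{title}</h1>\n<p>{body}</p>"  # class-level change
    print(page.render())   # the same page instance now renders with the new layout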

Each of these process classes provides different levels of user enablement and management control. No single process is appropriate for all parts of an enterprise intranet. The everyone-as-publisher process might be appropriate for much of the informal content; the webmaster process might be appropriate for managing content during the review and acceptance process that moves it from informal to formal; the database process might be appropriate when implementing restricted-access content; and the object process might be appropriate where publishing is distributed across divisions, but there is a desire for the format and image of the content to look coordinated. A good enterprise architecture provides managers and domain specialists with the ability to match the appropriate process classes to their specific business requirements.

Technology Systems:
At the architectural level, technology systems refers to the framework that includes the required services and how they are related to each other. The important standards both within the services and between them are documented in the architecture. The most robust and flexible enterprise architectures are based on community-owned standards rather than de facto, proprietary standards. The architecture should not specify products or technologies, no matter how popular, because these change too rapidly to be useful and can lead to dead-ends and costly conversions in the future.



For more information contact: stevet@iorg.com
© Copyright 1997-2000 iorg.com